diff --git a/products/aac-generate-summary.md b/products/aac-generate-summary.md index cad38d54..6842c4e7 100644 --- a/products/aac-generate-summary.md +++ b/products/aac-generate-summary.md @@ -1,6 +1,6 @@ # Generation Summary -**Generated**: 2026-04-19 02:02:50 +**Generated**: 2026-04-26 02:03:06 **Total Duration**: 0m 38s ## Product Crawl Summary @@ -9,28 +9,28 @@ Quick overview for reviewers. See individual product reports for details. | # | Product | Pages | Classified | New | Updated | Deleted | Status | |---|---------|-------|------------|-----|---------|---------|--------| -| 1 | Azure Architecture | 454 | 335 | 0 | 12 | 0 | OK | +| 1 | Azure Architecture | 452 | 336 | 2 | 18 | 4 | OK | ### Totals - **Products Processed**: 1 success, 0 failed -- **Total Pages**: 454 -- **Total Classified**: 335 -- **Total New Pages**: 0 -- **Total Updated Pages**: 12 -- **Total Deleted Pages**: 0 +- **Total Pages**: 452 +- **Total Classified**: 336 +- **Total New Pages**: 2 +- **Total Updated Pages**: 18 +- **Total Deleted Pages**: 4 ### Classification by Type (All Products) | Type | Count | |------|-------| | anti-patterns | 11 | -| architecture-styles | 7 | +| architecture-styles | 8 | | best-practices | 51 | | design-patterns | 50 | -| example-workloads | 73 | +| example-workloads | 74 | | migration-guides | 32 | -| reference-architectures | 52 | +| reference-architectures | 51 | | solution-ideas | 28 | | technology-choices | 31 | diff --git a/products/azure-advisor/azure-advisor.csv b/products/azure-advisor/azure-advisor.csv index d212f0b1..822abe24 100644 --- a/products/azure-advisor/azure-advisor.csv +++ b/products/azure-advisor/azure-advisor.csv @@ -16,9 +16,9 @@ https://learn.microsoft.com/en-us/azure/advisor/advisor-overview,What is Azure A https://learn.microsoft.com/en-us/azure/advisor/advisor-quick-fix,Bulk remediation for recommendations,Quick Fix remediation for Advisor recommendations - Azure Advisor,Use Quick Fix for bulk remediation of Advisor 
recommendations,Perform bulk remediation using Quick Fix in Advisor,TheQuick Fixfeature provides a faster and easier way to remediate a recommendation on multiple resources. TheQuick Fixfeature allows you to used bulk remediations for resources. TheQuick Fixfeature helps you to quickly optimize and scale your subscription with remediation for your resources. Note TheQuick Fixfeature is only available for specific recommendations using the Azure portal.,2025-01-13T23:00:00.000Z,how-to,best-practices,0.6,True,Explains Quick Fix bulk remediation behavior and constraints for specific recommendations; contains product-specific remediation patterns and gotchas.,unchanged https://learn.microsoft.com/en-us/azure/advisor/advisor-recommendations-digest,Digests,Recommendation digest for Azure Advisor - Azure Advisor,Configure periodic Azure Advisor recommendation digests,Get periodic summary for your active recommendations,Learn how to create a recommendation digest for your Advisor recommendations.,2025-03-21T05:33:00.000Z,how-to,configuration,0.65,True,"Covers setting up scheduled digests for recommendations; likely includes specific configuration options (frequency, scope, channels) that are product-specific.",unchanged https://learn.microsoft.com/en-us/azure/advisor/advisor-reference-cost-recommendations,Cost,Cost recommendations - Azure Advisor,Apply Azure Advisor cost recommendations across services,Full list of available cost recommendations in Advisor.,"Azure Advisor helps you optimize and reduce your overall Azure spend by identifying idle and underutilized resources. You can get cost recommendations from theCosttab on the Advisor dashboard. Sign in to theAzure portal. Search for and selectAdvisorfrom any page. 
On theAdvisordashboard, select theCosttab.",2026-02-10T08:00:00.000Z,article,best-practices,0.7,True,"Full list of cost recommendations with service-specific actions; provides concrete, product-specific cost optimization practices.",unchanged -https://learn.microsoft.com/en-us/azure/advisor/advisor-reference-operational-excellence-recommendations,Operational Excellence,Operational excellence recommendations - Azure Advisor,Apply Azure Advisor operational excellence recommendations,Operational excellence recommendations,"Operational excellence recommendations in Azure Advisor can help you with: You can get these recommendations on theOperational Excellencetab of the Advisor dashboard. Sign in to theAzure portal. Search for and selectAdvisorfrom any page. On theAdvisordashboard, select theOperational Excellencetab.",2026-04-14T08:00:00.000Z,article,best-practices,0.68,True,"Page is a catalog of Azure Advisor operational excellence recommendations (DO/DON'T style guidance) that are specific to Azure services and Advisor signals. These are product-specific best-practice rules rather than generic concepts, and represent curated expert guidance on how to configure and operate resources for operational excellence.",updated +https://learn.microsoft.com/en-us/azure/advisor/advisor-reference-operational-excellence-recommendations,Operational Excellence,Operational excellence recommendations - Azure Advisor,Apply Azure Advisor operational excellence recommendations,Operational excellence recommendations,"Operational excellence recommendations in Azure Advisor can help you with: You can get these recommendations on theOperational Excellencetab of the Advisor dashboard. Sign in to theAzure portal. Search for and selectAdvisorfrom any page. 
On theAdvisordashboard, select theOperational Excellencetab.",2026-04-14T08:00:00.000Z,article,best-practices,0.68,True,"Page is a catalog of Azure Advisor operational excellence recommendations (DO/DON'T style guidance) that are specific to Azure services and Advisor signals. These are product-specific best-practice rules rather than generic concepts, and represent curated expert guidance on how to configure and operate resources for operational excellence.",unchanged https://learn.microsoft.com/en-us/azure/advisor/advisor-reference-performance-recommendations,Performance,Performance recommendations - Azure Advisor,Leverage Azure Advisor performance recommendations,Full list of available performance recommendations in Advisor.,"The performance recommendations in Azure Advisor can help improve the speed and responsiveness of your business-critical applications. You can get performance recommendations from Advisor on thePerformancetab of the Advisor dashboard. Sign in to theAzure portal. Search for and selectAdvisorfrom any page. On theAdvisordashboard, select thePerformancetab.",2026-02-24T08:00:00.000Z,article,best-practices,0.65,True,Full list of performance recommendations with specific actions to improve responsiveness; product-specific tuning guidance.,unchanged -https://learn.microsoft.com/en-us/azure/advisor/advisor-reference-reliability-recommendations,Reliability,Reliability recommendations - Azure Advisor,Use Azure Advisor reliability recommendations for resilience,Full list of available reliability recommendations in Advisor.,"Azure Advisor helps you ensure and improve the continuity of your business-critical applications. You can get reliability recommendations on theReliabilitytab on the Advisor dashboard. Sign in to theAzure portal. Search for and selectAdvisorfrom any page. 
On theAdvisordashboard, select theReliabilitytab.",2026-04-14T08:00:00.000Z,article,best-practices,0.7,True,"Page lists concrete Azure Advisor reliability recommendations tied to specific Azure services and configurations. It encodes product-specific guidance on improving reliability (for example, redundancy, configuration changes, and service-specific checks), which goes beyond generic reliability concepts and constitutes expert best-practice rules.",updated +https://learn.microsoft.com/en-us/azure/advisor/advisor-reference-reliability-recommendations,Reliability,Reliability recommendations - Azure Advisor,Use Azure Advisor reliability recommendations for resilience,Full list of available reliability recommendations in Advisor.,"Azure Advisor helps you ensure and improve the continuity of your business-critical applications. You can get reliability recommendations on theReliabilitytab on the Advisor dashboard. Sign in to theAzure portal. Search for and selectAdvisorfrom any page. On theAdvisordashboard, select theReliabilitytab.",2026-04-14T08:00:00.000Z,article,best-practices,0.7,True,"Page lists concrete Azure Advisor reliability recommendations tied to specific Azure services and configurations. It encodes product-specific guidance on improving reliability (for example, redundancy, configuration changes, and service-specific checks), which goes beyond generic reliability concepts and constitutes expert best-practice rules.",unchanged https://learn.microsoft.com/en-us/azure/advisor/advisor-release-notes,What's new?,What's new in Azure Advisor - Azure Advisor,,"Learn about what's new and what's changed in Azure Advisor with information from release notes, videos, and blog posts.","Learn about the latest updates and changes in Azure Advisor with the items in this article. The updates and changes include release notes, videos, blog posts, and other types of information. 
Bookmark this article to stay up to date with the service.",2025-11-10T23:10:00.000Z,reference,,0.1,False,Release notes and change log content; not a stable expert-knowledge skill pattern for the agent.,unchanged https://learn.microsoft.com/en-us/azure/advisor/advisor-resiliency-reviews,Use Azure Advisor resiliency reviews,Azure Advisor resiliency reviews - Azure Advisor,Use Azure Advisor resiliency reviews to improve reliability,Optimize resource resiliency with custom recommendation reviews.,"Azure Advisor Resiliency Reviews help you focus on the most important recommendations to optimize your cloud deployments and improve resiliency. Review recommendations are tailored to the needs of your workload and include custom ones curated by your Microsoft account team using Azure best practices and prioritized automated recommendations. You can find resiliency reviews inAzure Advisor, which serves as your single-entry point for Azure Well Architected Framework (WAF) assessments of industry ",2025-04-22T17:02:00.000Z,how-to,best-practices,0.65,True,Custom resiliency reviews curated by Microsoft account teams with tailored recommendations; encapsulates product-specific resiliency optimization practices.,unchanged https://learn.microsoft.com/en-us/azure/advisor/advisor-score,Advisor score,Advisor score - Azure Advisor,,Use Azure Advisor score to measure optimization progress.,"Learn how to use Azure Advisor score to measure optimization progress. Important The platform updated the logic of the Azure Advisor score to provide you with more accurate results. 
As a result, the more precise assessment increases or decreases your score.",2025-03-20T22:01:00.000Z,concept-article,,0.4,False,"Explains Advisor score and its logic conceptually; no clear indication of numeric thresholds, config parameters, or decision matrices.",unchanged diff --git a/products/azure-advisor/report.md b/products/azure-advisor/report.md index c1cf475c..df6412c1 100644 --- a/products/azure-advisor/report.md +++ b/products/azure-advisor/report.md @@ -44,8 +44,8 @@ confusable_not_for: Not for Azure Cost Management (use azure-cost-management), A ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 2 -- **Unchanged**: 30 +- **Updated Pages**: 0 +- **Unchanged**: 32 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-advisor/azure-advisor.csv` @@ -63,13 +63,6 @@ confusable_not_for: Not for Azure Cost Management (use azure-cost-management), A ## Changes -### Updated Pages - -- [Operational Excellence](https://learn.microsoft.com/en-us/azure/advisor/advisor-reference-operational-excellence-recommendations) - - Updated: 2026-02-10T08:00:00.000Z → 2026-04-14T08:00:00.000Z -- [Reliability](https://learn.microsoft.com/en-us/azure/advisor/advisor-reference-reliability-recommendations) - - Updated: 2026-03-24T08:00:00.000Z → 2026-04-14T08:00:00.000Z - ## Classified Pages | TOC Title | Type | Confidence | Reason | diff --git a/products/azure-aks-edge-essentials/azure-aks-edge-essentials.csv b/products/azure-aks-edge-essentials/azure-aks-edge-essentials.csv index 2b70689e..435d97f6 100644 --- a/products/azure-aks-edge-essentials/azure-aks-edge-essentials.csv +++ b/products/azure-aks-edge-essentials/azure-aks-edge-essentials.csv @@ -24,7 +24,7 @@ https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-connect-to-arc https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-deploy-app,Deploy application,Deploy an application - AKS enabled by Azure Arc,,Describes how to deploy a containerized 
application to a Kubernetes cluster.,This article describes how to deploy a containerized application on your Kubernetes cluster.,2024-05-01T08:00:00.000Z,how-to,,0.3,False,Generic application deployment to Kubernetes; likely uses standard kubectl manifests without AKS Edge–specific configuration matrices.,unchanged https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-deploy-azure-iot,Deploy Azure IoT Operations,Azure IoT Operations with AKS Edge Essentials - AKS enabled by Azure Arc,Deploy Azure IoT Operations on AKS Edge Essentials,Learn how to run the quickstart script that creates an Arc-enabled AKS Edge Essentials Kubernetes cluster that can run Azure IoT Operations.,"Azure Kubernetes Service (AKS) Edge Essentials is one of the supported cluster platforms forAzure IoT Operations. You can use AKS Edge Essentials to create a Microsoft-managed Kubernetes cluster and deploy Azure IoT Operations on it as a workload. This article describes the steps to run a script that creates an AKS Edge Essentials Kubernetes cluster with the required configuration for Azure IoT Operations, and then connects that cluster to Azure Arc. Note Azure IoT Operations supports AKS Edge E",2025-09-25T08:00:00.000Z,how-to,integrations,0.78,True,Describes a script and required cluster configuration to run Azure IoT Operations on AKS Edge; includes product-specific integration parameters and constraints.,unchanged https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-expose-service,Expose Kubernetes services,Expose services with AKS Edge Essentials - AKS enabled by Azure Arc,Expose Kubernetes services on AKS Edge Essentials,Learn how to expose a Kubernetes service with AKS Edge Essentials.,"If you work with Kubernetes applications, you might need to make Kubernetes services accessible to external devices so they can interact with the workloads you've deployed. 
This article explains how to expose Kubernetes services running on an AKS Edge Essentials cluster to external devices. Depending on the networking configuration you used to set up the Kubernetes cluster, there are two different ways to expose the services: Note If you use Kubernetes services, make sure to set up theInit.Servi",2024-07-12T08:00:00.000Z,how-to,configuration,0.7,True,"Explains service exposure methods depending on networking configuration, including specific settings like Init.Service; product-specific configuration guidance.",unchanged -https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-key-manager,Use the Key Manager for Kubernetes extension,Use Key Manager for Kubernetes clusters on AKS Edge Essentials (preview) - AKS enabled by Azure Arc,Use Key Manager to rotate AKS Edge service account keys,Learn how to use the Key Manager for Kubernetes extension to rotate service account keys in Azure Kubernetes Service (AKS) Edge Essentials clusters.,"Important Microsoft is retiring the Key Manager for Kubernetes extension (preview) on April 15, 2026. After this date, the extension will no longer be available for deployment, and no further updates or support will be provided. If you have additional questions, please contact us through theAKS enabled by Azure Arc GitHub repository. TheKubernetes service accountis a a non-human account that provides a unique identity within a Kubernetes cluster. Service account tokens serve important security a",2026-04-15T17:03:00.000Z,how-to,security,0.7,True,"Describes using the Key Manager for Kubernetes extension on AKS Edge Essentials to rotate service account keys, which is a product-specific security/identity operation. 
Likely includes extension configuration, parameters, and rotation behavior that qualify as expert security configuration knowledge.",updated +https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-key-manager,Use the Key Manager for Kubernetes extension,Use Key Manager for Kubernetes clusters on AKS Edge Essentials (preview) - AKS enabled by Azure Arc,Use Key Manager to rotate AKS Edge service account keys,Learn how to use the Key Manager for Kubernetes extension to rotate service account keys in Azure Kubernetes Service (AKS) Edge Essentials clusters.,"Important Microsoft is retiring the Key Manager for Kubernetes extension (preview) on April 15, 2026. After this date, the extension will no longer be available for deployment, and no further updates or support will be provided. If you have additional questions, please contact us through theAKS enabled by Azure Arc GitHub repository. TheKubernetes service accountis a a non-human account that provides a unique identity within a Kubernetes cluster. Service account tokens serve important security a",2026-04-15T17:03:00.000Z,how-to,security,0.7,True,"Describes using the Key Manager for Kubernetes extension on AKS Edge Essentials to rotate service account keys, which is a product-specific security/identity operation. Likely includes extension configuration, parameters, and rotation behavior that qualify as expert security configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-metric-server,Enable metric server,Deploy metrics server on an AKS Edge Essentials cluster - AKS enabled by Azure Arc,Deploy Kubernetes metrics server on AKS Edge Essentials,Learn about the steps to deploy a metrics server on an AKS Edge Essentials cluster.,Themetrics serveris a tool that inspects your containers' resource consumption. 
You can find theYAML filefor the metrics server deployment in the/Samples/Otherfolderin the GitHub repo.,2024-07-11T08:00:00.000Z,how-to,integrations,0.65,True,Integration-focused article deploying metrics server with provided YAML; includes concrete resource configuration tailored to AKS Edge Essentials.,unchanged https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-more-configs,Additional configuration,AKS Edge Essentials configuration and scripts - AKS enabled by Azure Arc,Advanced AKS Edge Essentials configuration and scripts,Additional configuration options for AKS Edge Essentials.,"This article provides alternate ways of connecting to Azure Arc, which can be applicable to clusters that are connected via a proxy.",2024-07-11T08:00:00.000Z,how-to,configuration,0.7,True,"Provides additional configuration options and alternate Arc connection methods, likely with specific parameters and script usage unique to this product.",unchanged https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-multi-nic,Configure multiple NICs,Configure multiple NICs for AKS Edge Essentials - AKS enabled by Azure Arc,Configure multiple NICs for AKS Edge Essentials Linux nodes,Learn how to attach multiple network interfaces to an AKS Edge Essentials Linux virtual machine.,"By default, the AKS Edge Essentials Linux node has a single network interface card (NIC) assigned. However, you can configure the Linux node with multiple network interfaces during the deployment of your node. This functionality can be helpful in numerous scenarios where you might have a networking division or separation into different networks or zones. 
In order to connect an AKS Edge Essentials Linux node to the different networks, you must attach different network interface cards to the Linux",2025-02-25T08:00:00.000Z,how-to,configuration,0.8,True,"Details how to attach and configure multiple NICs for the Linux VM, with specific parameters and constraints unique to AKS Edge Essentials.",unchanged @@ -32,8 +32,8 @@ https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-multi-node-dep https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-offline-install,Offline installation,AKS Edge Essentials offline installation - AKS enabled by Azure Arc,Configure AKS Edge Essentials for offline installation,Learn how to configure your machine for AKS Edge Essentials offline installation.,"AKS Edge Essentials is primarily designed to be installed on an internet-connected machine, since many components are updated regularly. However, with some extra steps, it's possible to deploy AKS Edge Essentials in an offline environment. The following article describes the required configuration needed for an AKS Edge Essentials offline installation:",2024-01-08T22:53:00.000Z,how-to,configuration,0.8,True,"Offline install requires specific repository, package, and network configuration steps unique to AKS Edge Essentials; this is detailed configuration guidance.",unchanged https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-scale-out,Scale out deployment,AKS Edge Essentials scale (preview) - AKS enabled by Azure Arc,,Learn how to scale out your AKS Edge Essentials applications to multiple nodes.,"Now that AKS Edge Essentials is installed on your primary machine, this article describes how you can scale out your cluster to other secondary machines to create a multi-machine deployment. Caution Scaling to additional nodes is an experimental feature. Important AKS Edge Essentials multi-machine deployment is currently in PREVIEW. 
See theSupplemental Terms of Use for Microsoft Azure Previewsfor legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released in",2025-04-04T08:00:00.000Z,how-to,,0.45,False,"Scale-out preview guide; focused on procedure for adding nodes, not on detailed config parameter tables or error-code-based troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-secret-encryption,Enable secret encryption with the KMS plugin,Enable secret encryption on an AKS Edge Essentials cluster - AKS enabled by Azure Arc,Configure KMS-based secret encryption for AKS Edge Essentials,Learn how to enable the KMS provider for AKS Edge Essentials clusters to encrypt secrets.,"Following Kubernetes security best practices, it's recommended that you encrypt the Kubernetes secret store on AKS Edge Essentials clusters. You can perform this encryption by activating theKey Management Service (KMS) provider for AKS Edge Essentials, which enablesencryption at rest for secretsstored in the etcd key-value store. Each secret within the secret store is first encrypted with a unique Data Encryption Key (DEK) generated by the K8s API server. This DEK is temporarily held in memory a",2026-03-10T22:19:00.000Z,how-to,security,0.7,True,"The page describes product-specific steps and configuration details to enable the KMS provider for encrypting Kubernetes secrets on AKS Edge Essentials clusters. 
This is concrete security configuration guidance (encryption at rest for etcd secrets) rather than a conceptual overview, fitting the security sub-skill type.",unchanged -https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-setup-machine,Set up machine,Steps to prepare your machine for AKS Edge Essentials - AKS enabled by Azure Arc,Prepare Windows host machines for AKS Edge Essentials,Learn how to prepare your machines for AKS Edge Essentials clusters.,This article describes how to set up an Azure Kubernetes Service (AKS) Edge Essentials node machine.,2025-09-24T08:00:00.000Z,how-to,configuration,0.7,True,"Machine preparation guide for AKS Edge Essentials nodes typically includes specific OS settings, features, and configuration commands unique to this product.",unchanged +https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-secret-encryption,Validate and troubleshoot secret encryption,Validate and Troubleshoot secret encryption on an AKS Edge Essentials cluster - AKS enabled by Azure Arc,Validate and troubleshoot AKS Edge secret encryption,Learn how to enable the KMS provider for AKS Edge Essentials clusters to encrypt secrets.,"Following Kubernetes security best practices, your secrets stored in the Kubernetes secret store on AKS Edge Essentials clusters are encrypted at rest by default using theKey Management Service (KMS) provider. Each secret within the secret store is first encrypted with a unique Data Encryption Key (DEK) generated by the Kubernetes API server. This DEK is temporarily held in memory and never written to disk. The API server then requests that the KMS provider encrypt the DEK using a Key Encryption",2026-04-22T22:07:00.000Z,how-to,troubleshooting,0.82,True,"The page explicitly focuses on validating and troubleshooting secret encryption with the KMS provider on AKS Edge Essentials. 
Such content typically includes specific error messages, diagnostic steps, and symptom→cause→solution guidance for this product’s encryption behavior, which matches the troubleshooting sub-skill definition and constitutes expert knowledge.",new +https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-setup-machine,Set up machine,Steps to prepare your machine for AKS Edge Essentials - AKS enabled by Azure Arc,Prepare machines and configure nodes for AKS Edge,Learn how to prepare your machines for AKS Edge Essentials clusters.,This article describes how to set up an Azure Kubernetes Service (AKS) Edge Essentials node machine.,2026-04-22T22:07:00.000Z,how-to,configuration,0.7,True,"A machine preparation guide for AKS Edge Essentials is likely to include concrete configuration steps, required settings (BIOS/virtualization flags, Windows features, networking settings), and possibly parameter names or commands specific to this product. These are product-specific configuration details that qualify as expert knowledge under the configuration category.",updated https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-setup-nested-environment,Set up nested virtualization,Nested virtualization environment for AKS Edge Essentials - AKS enabled by Azure Arc,Configure nested virtualization for AKS Edge Essentials,Learn how to prepare your nested virtualization environment for AKS Edge Essentials clusters.,"This article describes how to set up a nested virtualization environment to deploy an Azure Kubernetes Service (AKS) Edge Essentials cluster. Note Deploying AKS Edge Essentials on top of a nested virtualization environment on VMware ESXi is supported. Other nested virtualization deployments are not supported for production scenarios and are limited to developer purposes. This guide assumes you're using the Hyper-V hypervisor. 
We do not support using a non-Microsoft hypervisor, such as KVM.",2024-09-11T08:00:00.000Z,how-to,configuration,0.8,True,"Details supported hypervisors, specific nested virtualization settings, and constraints (e.g., only ESXi supported for production); these are product-specific configuration rules.",unchanged https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-single-node-deployment,Create single machine deployment,Single machine Kubernetes - AKS enabled by Azure Arc,,Learn how to deploy AKS Edge Essentials on a single machine.,"You can deploy AKS Edge Essentials on either a single machine or on multiple machines. In a single machine Kubernetes deployment, both the Kubernetes control node and worker node run on the same machine. This article describes how to create the Kubernetes control node on your machine on a private network.",2025-11-17T23:03:00.000Z,how-to,,0.45,False,"How-to for single-node deployment; mostly procedural deployment steps without broad configuration matrices, limits, or troubleshooting mappings.",unchanged @@ -49,10 +49,10 @@ https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-pricing,AKS Edge Ess https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-quickstart,Quickstart,Quickstart for AKS Edge Essentials - AKS enabled by Azure Arc,,Learn how to bring up an AKS Edge Essentials cluster and connect it to Azure Arc.,This quickstart guide describes how to set up an Azure Kubernetes Service (AKS) Edge Essentials single-machine K3S Linux-only cluster. 
Note,2025-11-17T23:03:00.000Z,quickstart,,0.4,False,"Quickstart tutorial for creating a single-machine cluster; step-by-step but not focused on exhaustive configuration options, limits, or troubleshooting patterns.",unchanged https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-resources-logs,Logs,Logs for AKS Edge Essentials - AKS enabled by Azure Arc,Collect and use AKS Edge Essentials logs for troubleshooting,Gather and troubleshoot AKS Edge Essentials using logs.,"If you experience issues running AKS Edge Essentials IoT Edge in your environment, use this article as a guide for gathering and using logs.",2023-10-26T21:56:00.000Z,end-user-help,troubleshooting,0.8,True,Describes log locations and how to gather logs for AKS Edge Essentials; includes product-specific diagnostic paths and commands.,unchanged https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-software-license-terms,Microsoft Software License Terms,Microsoft Software License Terms for AKS Edge Essentials - AKS enabled by Azure Arc,,Microsoft Software License Terms for AKS Edge Essentials.,"MICROSOFT SOFTWARE LICENSE TERMS MICROSOFT AZURE KUBERNETES SERVICE EDGE ESSENTIALS IF YOU LIVE IN (OR ARE A BUSINESS WITH A PRINCIPAL PLACE OF BUSINESS IN) THE UNITED STATES, PLEASE READ THE ""BINDING ARBITRATION AND CLASS ACTION WAIVER"" SECTION BELOW. IT AFFECTS HOW DISPUTES ARE RESOLVED. These license terms are an agreement between you and Microsoft Corporation (or one of its affiliates). 
They apply to the software named above and any Microsoft services or software updates (except to the exten",2024-07-03T08:00:00.000Z,how-to,,0.2,False,"Legal software license terms, not technical expert knowledge for an AI agent.",unchanged -https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-system-requirements,System requirements and support matrix,AKS Edge Essentials system requirements - AKS enabled by Azure Arc,Check AKS Edge Essentials host system requirements,Requirements and supported versions for AKS Edge Essentials.,This article describes the requirements for the host machine that runs AKS Edge Essentials.,2025-10-24T22:03:00.000Z,limits-and-quotas,limits-quotas,0.78,True,"System requirements pages typically list exact CPU, RAM, disk, OS version, and hardware constraints; these numeric limits and supported versions are expert, product-specific knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-system-requirements,System requirements and support matrix,AKS Edge Essentials system requirements - AKS enabled by Azure Arc,Check AKS Edge Essentials host system requirements,Requirements and supported versions for AKS Edge Essentials.,This article describes the requirements for the host machine that runs AKS Edge Essentials.,2026-04-17T08:00:00.000Z,limits-and-quotas,limits-quotas,0.78,True,"A system requirements page for a specific product typically lists exact hardware, OS, CPU, memory, disk, and virtualization requirements with numeric values and supported versions. 
These are product-specific limits/constraints that change over time and are not reliably known from training, fitting the limits-quotas category best.",updated https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-troubleshoot-overview,Troubleshooting,Troubleshoot common issues in AKS Edge Essentials - AKS enabled by Azure Arc,Troubleshoot common AKS Edge Essentials issues,Learn about common issues and workarounds in AKS Edge Essentials.,This article describes how to find solutions for issues you encounter when using AKS Edge Essentials. Known issues and errors are organized by functional area. You can use the links provided in this article to find solutions and workarounds to resolve them.,2024-12-13T18:01:00.000Z,troubleshooting-general,troubleshooting,0.85,True,"Central troubleshooting hub listing known issues and errors by functional area, with links to symptom → workaround/fix mappings specific to AKS Edge Essentials.",unchanged -https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-whats-new,What's new in AKS Edge Essentials,What's new in AKS Edge Essentials - AKS enabled by Azure Arc,,Learn about what's new in AKS Edge Essentials releases.,"AKS Edge Essentials is a lightweight on-premises Kubernetes implementation of Azure Kubernetes Service (AKS) that automates running containerized applications at scale. 
This article describes the latest features, enhancements, and updates in AKS Edge Essentials releases.",2025-10-24T08:00:00.000Z,overview,,0.3,False,"What's new/release notes; mostly feature descriptions and dates without structured limits, configuration matrices, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-workload-identity,Configure Workload Identity,Configure Workload Identity on an AKS Edge Essentials cluster (preview) - AKS enabled by Azure Arc,Configure workload identity on AKS Edge Essentials,Learn how to configure an AKS Edge Essentials cluster with workload identity.,"Azure Kubernetes Service (AKS) Edge Essentials is an on-premises Kubernetes implementation of Azure Kubernetes Service (AKS) that automates running containerized applications at scale. This article describes how to perform the following tasks: For a conceptual overview of Workload identity federation, seeWorkload identity federation in Azure Arc-enabled Kubernetes (preview). Important These preview features are available on a self-service, opt-in basis. Previews are provided ""as is"" and ""as avai",2026-04-15T17:03:00.000Z,how-to,security,0.76,True,"How-to configuration for workload identity on AKS Edge Essentials/Arc-enabled clusters, likely including Azure AD/Entra app registrations, federated credential setup, Kubernetes service account annotations, and product-specific identity/security settings. This is concrete security configuration rather than conceptual overview.",updated +https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-whats-new,What's new in AKS Edge Essentials,What's new in AKS Edge Essentials - AKS enabled by Azure Arc,,Learn about what's new in AKS Edge Essentials releases.,"AKS Edge Essentials is a lightweight on-premises Kubernetes implementation of Azure Kubernetes Service (AKS) that automates running containerized applications at scale. 
This article describes the latest features, enhancements, and updates in AKS Edge Essentials releases.",2026-04-22T22:07:00.000Z,overview,,0.1,False,"A 'What’s new' page primarily describes release notes and feature changes. While detailed, it usually does not focus on structured limits, configuration matrices, or troubleshooting mappings as defined in the sub-skill types, and is closer to update/marketing-style content than reusable expert knowledge per this taxonomy.",updated
+https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-workload-identity,Configure Workload Identity,Configure Workload Identity on an AKS Edge Essentials cluster (preview) - AKS enabled by Azure Arc,Configure workload identity on AKS Edge Essentials,Learn how to configure an AKS Edge Essentials cluster with workload identity.,"Azure Kubernetes Service (AKS) Edge Essentials is an on-premises Kubernetes implementation of Azure Kubernetes Service (AKS) that automates running containerized applications at scale. This article describes how to perform the following tasks: For a conceptual overview of Workload identity federation, see Workload identity federation in Azure Arc-enabled Kubernetes (preview). Important These preview features are available on a self-service, opt-in basis. Previews are provided ""as is"" and ""as avai",2026-04-15T17:03:00.000Z,how-to,security,0.76,True,"How-to configuration for workload identity on AKS Edge Essentials/Arc-enabled clusters, likely including Azure AD/Entra app registrations, federated credential setup, Kubernetes service account annotations, and product-specific identity/security settings. 
This is concrete security configuration rather than conceptual overview.",unchanged
https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-get-kubelet-logs,Get kubelet logs,Get kubelet logs from cluster nodes - AKS enabled by Azure Arc,Retrieve kubelet logs from AKS Arc nodes,Learn how to get kubelet logs in an Azure Kubernetes Service (AKS) enabled by Arc deployment.,"Applies to: AKS on Azure Local As part of operating a Kubernetes cluster in AKS enabled by Azure Arc, you might need to review logs at some point to troubleshoot a problem. This article describes how to use journalctl to view the kubelet logs on a node.",2025-09-30T17:08:00.000Z,how-to,troubleshooting,0.7,True,Describes using journalctl and node access patterns specific to AKS Arc to obtain kubelet logs for diagnosing issues.,unchanged
https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-hci-ip-address-planning,IP address planning,IP address planning for AKS enabled by Azure Arc - AKS enabled by Azure Arc,Plan IP address capacity for AKS Arc production,"Learn about how to plan for IP addresses and reservation, to deploy AKS Arc in production.","Applies to: AKS on Azure Local IP address planning for AKS enabled by Azure Arc involves designing a network that supports applications, node pools, pod networks, service communication, and external access. This article walks you through some key considerations for effective IP address planning, and minimum number of IP addresses required to deploy AKS in production. 
See the AKS networking concepts and requirements before reading this article.",2025-08-13T08:00:00.000Z,concept-article,limits-quotas,0.7,True,IP planning article explicitly mentions minimum number of IP addresses required; this is numeric capacity guidance and limits.,unchanged
https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-hybrid-preview-uninstall,Uninstall AKS cluster provisioning from Azure preview,Before you begin - uninstall the AKS cluster provisioning preview - AKS enabled by Azure Arc,Uninstall AKS cluster provisioning preview before AKS Arc upgrade,Learn how to uninstall the AKS cluster provisioning from Azure preview.,Applies to: AKS on Windows Server This step is only required if you installed the AKS cluster provisioning from Azure preview. The preview ended with the release of AKS on Azure Local. This article describes the steps to uninstall the preview bits before upgrading to AKS Arc.,2025-04-08T22:02:00.000Z,overview,deployment,0.7,True,Product-specific uninstall requirements and steps tied to preview-to-GA upgrade path; relevant to deployment/upgrade constraints.,unchanged
@@ -78,7 +78,7 @@ https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-whats-new-local,What's ne
https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-windows-server-retirement,AKS on Windows Server retirement,Retirement of AKS architecture on Windows Server - AKS enabled by Azure Arc,Plan for AKS on Windows Server architecture retirement,Learn about the retirement of AKS on Windows Server.,"Azure Kubernetes Service enabled by Azure Arc (AKS Arc) is a managed Kubernetes service that you can use to deploy and manage containerized applications on-premises, in datacenters, or at edge locations. You can use familiar Azure tools to create and manage your Kubernetes clusters. Microsoft remains committed to supporting AKS Arc on Windows Server platforms through March 2028. 
To ensure Azure continues to deliver a secure, reliable, and consistent experience, AKS enabled by Azure Arc architect",2026-02-10T18:05:00.000Z,how-to,decision-making,0.65,True,Retirement guidance with timelines and recommended migration target (AKS on Azure Local); helps decide migration and support windows.,unchanged
https://learn.microsoft.com/en-us/azure/aks/aksarc/aksarc,Commands,az aksarc,Use az aksarc CLI commands for AKS Arc management,,,2025-07-14T22:05:00Z,,integrations,0.7,True,"CLI reference page for az aksarc; contains command parameters, defaults, and constraints, which are integration/configuration details unique to this product.",unchanged
https://learn.microsoft.com/en-us/azure/aks/aksarc/app-availability,Application availability,Application availability in AKS on Windows Server - AKS enabled by Azure Arc,,Learn about application availability on Windows Server.,"Applies to: AKS on Windows Server Azure Kubernetes Service (AKS) on Windows Server offers a fully supported container platform that can run cloud-native applications on the Kubernetes container orchestration platform. The architecture supports running virtualized Windows and Linux workloads. The AKS architecture is built with failover clustering and live migration that is automatically enabled for target (workload) clusters. 
During various disruption events, virtual machines that host customer wo",2025-04-08T22:02:00.000Z,concept-article,,0.4,False,"High-level description of AKS Arc availability behavior; summary doesn’t show concrete thresholds, timeouts, or configuration parameters.",unchanged -https://learn.microsoft.com/en-us/azure/aks/aksarc/arc-gateway-aks-arc,Simplify outbound connectivity,Simplify network configuration for AKS on Azure Local with Azure Arc gateway (preview) - AKS enabled by Azure Arc,,Learn how to enable Arc gateway on AKS on Azure Local clusters to simplify network configuration requirements,"If you use enterprise proxies to manage outbound traffic, Azure Arc gateway can help simplify the process of enabling connectivity. Before using Arc gateway with AKS on Azure Local, ensure you complete theprerequisites for creating AKS clusters on Azure Local. The AKS Arc gateway (currently in preview) lets you: Important AKS Arc gateway is currently in preview. See theSupplemental Terms of Use for Microsoft Azure Previewsfor legal terms that apply to Azure features that are in beta, preview, or",2026-03-20T22:06:00.000Z,how-to,,0.2,False,"The page is about simplifying network configuration for AKS on Azure Local using Azure Arc gateway and appears to be a conceptual/feature explanation or how-to. The summary does not indicate detailed configuration parameter tables, numeric limits, or error-code-based troubleshooting. 
It reads more like preview feature guidance and prerequisites, so it does not clearly meet any expert-knowledge sub-skill type.",unchanged
+https://learn.microsoft.com/en-us/azure/aks/aksarc/arc-gateway-aks-arc,Simplify outbound connectivity,Simplify network configuration requirements with Azure Arc gateway - AKS enabled by Azure Arc,Configure Azure Arc gateway for AKS on Azure Local,Learn how to enable Arc gateway on AKS on Azure Local clusters to simplify network configuration requirements,"If you use enterprise proxies to manage outbound traffic, Azure Arc gateway can help simplify the process of enabling connectivity. Before using Arc gateway with AKS on Azure Local, ensure you complete the prerequisites for creating AKS clusters on Azure Local. The AKS Arc gateway lets you:",2026-04-23T17:08:00.000Z,how-to,configuration,0.68,True,"Page is about enabling and using Arc gateway with AKS on Azure Local to simplify outbound network/proxy configuration. This is product-specific configuration guidance (how to set up Arc gateway in this context), not just conceptual networking theory. While the summary snippet doesn’t show tables, this topic typically includes concrete settings and steps unique to AKS enabled by Azure Arc, which qualifies as expert configuration knowledge rather than generic content.",updated
https://learn.microsoft.com/en-us/azure/aks/aksarc/auto-scale-aks-arc,Use autoscaler,Use auto-scaling in a Kubernetes cluster - AKS enabled by Azure Arc,Configure cluster autoscaler for AKS Arc,Learn how to use Azure CLI for cluster autoscaling.,"Applies to: AKS on Azure Local To keep up with application demands in Kubernetes, you might need to adjust the number of nodes that run your workloads. The cluster autoscaler component watches for pods in your cluster that can't be scheduled because of resource constraints. When the cluster autoscaler detects issues, it scales up the number of nodes in the node pool to meet the application demands. 
It also regularly checks nodes for a lack of running pods and scales down the number of nodes as n",2025-06-09T22:03:00.000Z,how-to,configuration,0.63,True,"Shows how to use Azure CLI to configure autoscaling; likely includes specific flags, min/max node counts, and AKS Arc behaviors.",unchanged
https://learn.microsoft.com/en-us/azure/aks/aksarc/availability-sets,Use availability sets,Availability sets in AKS enabled by Azure Arc - AKS enabled by Azure Arc,,Learn how to enable availability sets in AKS Arc to improve the availability and distribution of your Kubernetes workloads.,"Availability sets are logical groups of VMs that have weak anti-affinity relationships with each other, to ensure that they are spread evenly across the available fault domains in a physical cluster. A fault domain in this context is a physical host or a group of physical hosts. By using availability sets, AKS Arc can improve the availability and distribution of your Kubernetes workloads. Availability sets can avoid scenarios in which a single node failure can cause multiple VMs to go down or bec",2024-11-22T23:01:00.000Z,how-to,,0.4,False,Explains availability sets conceptually; unlikely to contain numeric limits or detailed configuration matrices.,unchanged
https://learn.microsoft.com/en-us/azure/aks/aksarc/azure-hybrid-benefit-22h2,Azure Hybrid Benefit,Azure Hybrid Benefit for AKS on Windows Server - AKS enabled by Azure Arc,,Activate Azure Hybrid Benefit for AKS on Windows Server.,"Applies to: AKS on Windows Server Azure Hybrid Benefit is a program that enables you to significantly reduce the costs of running workloads in the cloud. 
With Azure Hybrid Benefit for AKS on Windows Server, you can maximize the value of your on-premises licenses and modernize your applications at no extra cost.",2025-04-08T22:02:00.000Z,how-to,,0.3,False,"High-level Azure Hybrid Benefit description; summary doesn’t show SKU mappings, percentages, or concrete licensing rules.",unchanged diff --git a/products/azure-aks-edge-essentials/report.md b/products/azure-aks-edge-essentials/report.md index ec9151d1..7bfa111b 100644 --- a/products/azure-aks-edge-essentials/report.md +++ b/products/azure-aks-edge-essentials/report.md @@ -1,21 +1,21 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: security: 'Securing AKS Edge/Arc/hybrid clusters: auth (Entra ID, AD, gMSA, workload identity), RBAC, SSH hardening, cert and key management, and encrypting/securing secrets and container workloads.' configuration: 'Configuring AKS Edge/Arc/hybrid clusters: networking, storage, load - balancers, autoscaling, Arc connectivity, Windows/Linux node settings, monitoring, - upgrades, and offline/host setup.' + balancers, autoscaling, Arc connectivity, monitoring, Windows/Linux nodes, offline/online + updates, and PowerShell-based setup.' decision-making: Guidance on choosing AKS Edge/Arc vs cloud/on-prem, supported versions/add-ons, monitoring, pricing/licensing, support, and planning migrations or retirement of older AKS/Windows Server setups troubleshooting: 'Diagnosing and fixing AKS Edge/Arc cluster issues: installs, upgrades, - networking, storage, security, logs, certificates, known issues, and using tools/PowerShell - for deep troubleshooting.' - limits-quotas: Hardware, storage, IP, and VM requirements plus scale limits, quotas, - and support policies for AKS Edge/Arc and AKS on Azure Local across Windows, VMware, - and release versions + networking, storage, security, certificates, logs, known issues, and VMware/Windows + Server–specific problems.' 
+ limits-quotas: Hardware, storage, IP, and scale limits for AKS Edge/Arc on Azure + Local/VMware/Windows, plus support policies and release changes to plan capacity + and compatibility. integrations: 'Integrating AKS Edge/Arc/hybrid with Azure and on-prem services: REST/CLI/PowerShell management, storage/backup, CSI, networking, IoT/OPC/ONVIF, TPM, AI model deploy, and Key Vault secrets.' @@ -32,13 +32,13 @@ skill_description: Expert knowledge for Azure Kubernetes Service Edge Essentials including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when managing AKS Edge/Arc clusters, Arc connectivity, IoT/OPC/ONVIF - edge workloads, TPM/Key Vault, or AI model deploys, and other Azure Kubernetes Service - Edge Essentials related development tasks. Not for Azure Kubernetes Service (AKS) - (use azure-kubernetes-service), Azure IoT Edge (use azure-iot-edge), Azure Stack - Edge (use azure-stack-edge), Azure Container Apps (use azure-container-apps). + workloads, TPM/AI deployments, or Key Vault secrets, and other Azure Kubernetes + Service Edge Essentials related development tasks. Not for Azure Kubernetes Service + (AKS) (use azure-kubernetes-service), Azure IoT Edge (use azure-iot-edge), Azure + Stack Edge (use azure-stack-edge), Azure Container Apps (use azure-container-apps). use_when: Use when managing AKS Edge/Arc clusters, Arc connectivity, IoT/OPC/ONVIF - edge workloads, TPM/Key Vault, or AI model deploys, and other Azure Kubernetes Service - Edge Essentials related development tasks. + workloads, TPM/AI deployments, or Key Vault secrets, and other Azure Kubernetes + Service Edge Essentials related development tasks. confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes-service), Azure IoT Edge (use azure-iot-edge), Azure Stack Edge (use azure-stack-edge), Azure Container Apps (use azure-container-apps). 
@@ -50,14 +50,14 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes - **Total Pages**: 332 - **Fetched**: 332 - **Fetch Failed**: 0 -- **Classified**: 264 -- **Unclassified**: 68 +- **Classified**: 265 +- **Unclassified**: 67 ### Incremental Update -- **New Pages**: 0 -- **Updated Pages**: 2 -- **Unchanged**: 330 -- **Deleted Pages**: 0 +- **New Pages**: 1 +- **Updated Pages**: 4 +- **Unchanged**: 327 +- **Deleted Pages**: 1 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-aks-edge-essentials/azure-aks-edge-essentials.csv` ## Classification Statistics @@ -66,23 +66,35 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes |------|-------|------------| | architecture-patterns | 3 | 0.9% | | best-practices | 4 | 1.2% | -| configuration | 76 | 22.9% | +| configuration | 77 | 23.2% | | decision-making | 9 | 2.7% | | deployment | 33 | 9.9% | | integrations | 55 | 16.6% | | limits-quotas | 8 | 2.4% | -| security | 30 | 9.0% | -| troubleshooting | 46 | 13.9% | -| *(Unclassified)* | 68 | 20.5% | +| security | 29 | 8.7% | +| troubleshooting | 47 | 14.2% | +| *(Unclassified)* | 67 | 20.2% | ## Changes +### New Pages + +- [Validate and troubleshoot secret encryption](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-secret-encryption) + ### Updated Pages -- [Configure Workload Identity](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-workload-identity) - - Updated: 2025-03-11T22:05:00.000Z → 2026-04-15T17:03:00.000Z -- [Use the Key Manager for Kubernetes extension](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-key-manager) - - Updated: 2025-03-11T22:05:00.000Z → 2026-04-15T17:03:00.000Z +- [Simplify outbound connectivity](https://learn.microsoft.com/en-us/azure/aks/aksarc/arc-gateway-aks-arc) + - Updated: 2026-03-20T22:06:00.000Z → 2026-04-23T17:08:00.000Z +- [System requirements and support 
matrix](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-system-requirements) + - Updated: 2025-10-24T22:03:00.000Z → 2026-04-17T08:00:00.000Z +- [What's new in AKS Edge Essentials](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-whats-new) + - Updated: 2025-10-24T08:00:00.000Z → 2026-04-22T22:07:00.000Z +- [Set up machine](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-setup-machine) + - Updated: 2025-09-24T08:00:00.000Z → 2026-04-22T22:07:00.000Z + +### Deleted Pages + +- ~~Enable secret encryption with the KMS plugin~~ (https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-secret-encryption) ## Classified Pages @@ -102,6 +114,7 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes | [KubeAPIServer unreachable error](https://learn.microsoft.com/en-us/azure/aks/aksarc/kube-api-server-unreachable) | troubleshooting | 0.83 | Targets a specific error message about reaching kube-apiserver/control plane IP, with environment-specific diagnosis and fixes. | | [Control plane configuration validation errors](https://learn.microsoft.com/en-us/azure/aks/aksarc/control-plane-validation-errors) | troubleshooting | 0.82 | Documents specific ControlPlaneConfigurationValidation error codes, meanings, and resolution steps, which are highly product-specific. | | [K8sVersionValidation error](https://learn.microsoft.com/en-us/azure/aks/aksarc/cluster-k8s-version) | troubleshooting | 0.82 | Focuses on a named error code with causes and mitigation steps tied to supported Kubernetes versions in AKS on Azure Local. | +| [Validate and troubleshoot secret encryption](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-secret-encryption) | troubleshooting | 0.82 | The page explicitly focuses on validating and troubleshooting secret encryption with the KMS provider on AKS Edge Essentials. 
Such content typically includes specific error messages, diagnostic steps, and symptom→cause→solution guidance for this product’s encryption behavior, which matches the troubleshooting sub-skill definition and constitutes expert knowledge. | | [Network validation error due to .local domain](https://learn.microsoft.com/en-us/azure/aks/aksarc/network-validation-error-local) | troubleshooting | 0.81 | Addresses a specific error string and its root cause (.local domain) with concrete remediation steps. | | [AKS Arc PowerShell](https://learn.microsoft.com/en-us/azure/aks/aksarc/reference/ps/) | integrations | 0.80 | Reference for AksHci cmdlets with parameter details and behaviors unique to AKS hybrid; effectively an API/config surface. | | [Add-AksEdgeNode](https://learn.microsoft.com/en-us/azure/aks/aksarc/reference/aks-edge-ps/add-aksedgenode) | integrations | 0.80 | Cmdlet reference with parameters and behavior for adding AKS Edge nodes; product-specific API surface. | @@ -156,7 +169,7 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes | [Connect to Arc](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-connect-to-arc) | configuration | 0.78 | Describes Arc connection behavior and requires specific config file section (Arc in config) and possibly parameter values; product-specific configuration knowledge. | | [Deleted AKS Arc cluster still visible on Azure portal](https://learn.microsoft.com/en-us/azure/aks/aksarc/deleted-cluster-visible) | troubleshooting | 0.78 | Explains why a deleted cluster remains as a connectedCluster resource and how to clean it up; symptom→cause→solution. | | [Deploy Azure IoT Operations](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-deploy-azure-iot) | integrations | 0.78 | Describes a script and required cluster configuration to run Azure IoT Operations on AKS Edge; includes product-specific integration parameters and constraints. 
| -| [System requirements and support matrix](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-system-requirements) | limits-quotas | 0.78 | System requirements pages typically list exact CPU, RAM, disk, OS version, and hardware constraints; these numeric limits and supported versions are expert, product-specific knowledge. | +| [System requirements and support matrix](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-system-requirements) | limits-quotas | 0.78 | A system requirements page for a specific product typically lists exact hardware, OS, CPU, memory, disk, and virtualization requirements with numeric values and supported versions. These are product-specific limits/constraints that change over time and are not reliably known from training, fitting the limits-quotas category best. | | [Use diagnostic checker](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-arc-diagnostic-checker) | troubleshooting | 0.78 | Tool-based troubleshooting for cluster creation failures with specific checks, outputs, and remediation steps unique to AKS Arc. | | [Access TPM secrets](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-access-tpm) | integrations | 0.76 | Shows how to enable TPM passthrough and includes sample C# code and configuration to use host TPM from the VM; this is a concrete integration pattern. | | [Configure Workload Identity](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-workload-identity) | security | 0.76 | How-to configuration for workload identity on AKS Edge Essentials/Arc-enabled clusters, likely including Azure AD/Entra app registrations, federated credential setup, Kubernetes service account annotations, and product-specific identity/security settings. This is concrete security configuration rather than conceptual overview. 
| @@ -227,7 +240,6 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes | [Deploy a container image from Azure Container Registry](https://learn.microsoft.com/en-us/azure/aks/aksarc/deploy-azure-container-registry) | integrations | 0.70 | Describes integration between ACR and on-prem AKS; likely includes registry configuration, authentication parameters, and deployment patterns specific to this scenario. | | [Deploy an AI model with the AI toolchain operator](https://learn.microsoft.com/en-us/azure/aks/aksarc/deploy-ai-model) | integrations | 0.70 | Integration with Kubernetes AI toolchain operator; article likely includes CRD fields, extension parameters, and model deployment configs unique to this integration. | | [Disable-AksHciArcConnection](https://learn.microsoft.com/en-us/azure/aks/aksarc/reference/ps/disable-akshciarcconnection) | configuration | 0.70 | Cmdlet reference for disabling Arc connectivity, including product-specific connection configuration behavior. | -| [Enable secret encryption with the KMS plugin](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-secret-encryption) | security | 0.70 | The page describes product-specific steps and configuration details to enable the KMS provider for encrypting Kubernetes secrets on AKS Edge Essentials clusters. This is concrete security configuration guidance (encryption at rest for etcd secrets) rather than a conceptual overview, fitting the security sub-skill type. | | [Enable-AksHciArcConnection](https://learn.microsoft.com/en-us/azure/aks/aksarc/reference/ps/enable-akshciarcconnection) | configuration | 0.70 | Cmdlet reference for enabling Arc connectivity, including configuration parameters unique to AKS hybrid. 
| | [Encrypt etcd secrets](https://learn.microsoft.com/en-us/azure/aks/aksarc/encrypt-secrets) | troubleshooting | 0.70 | Explicitly a monitoring and troubleshooting guide for etcd secret encryption with AKS Arc-specific behaviors and resolution steps. | | [Expose Kubernetes services](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-expose-service) | configuration | 0.70 | Explains service exposure methods depending on networking configuration, including specific settings like Init.Service; product-specific configuration guidance. | @@ -272,7 +284,7 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes | [Retrieve certificate-based admin kubeconfig](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-vmware-retrieve-kubeconfig) | configuration | 0.70 | Describes how to obtain certificate-based admin kubeconfig, which involves product-specific commands, parameters, and paths. | | [Secrets Store CSI Driver configuration](https://learn.microsoft.com/en-us/azure/aks/aksarc/secrets-store-csi-driver) | integrations | 0.70 | Describes concrete configuration for the Secrets Store CSI driver and Azure Key Vault provider on AKS on Windows Server, including product-specific integration steps. | | [Security overview](https://learn.microsoft.com/en-us/azure/aks/aksarc/concepts-security) | security | 0.70 | Security-hardening measures and built-in security features for this product; likely includes specific settings and roles beyond generic security theory. | -| [Set up machine](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-setup-machine) | configuration | 0.70 | Machine preparation guide for AKS Edge Essentials nodes typically includes specific OS settings, features, and configuration commands unique to this product. 
| +| [Set up machine](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-setup-machine) | configuration | 0.70 | A machine preparation guide for AKS Edge Essentials is likely to include concrete configuration steps, required settings (BIOS/virtualization flags, Windows features, networking settings), and possibly parameter names or commands specific to this product. These are product-specific configuration details that qualify as expert knowledge under the configuration category. | | [Set-AksEdgeUpgrade](https://learn.microsoft.com/en-us/azure/aks/aksarc/reference/aks-edge-ps/set-aksedgeupgrade) | configuration | 0.70 | Cmdlet sets whether AKS Edge can upgrade Kubernetes on update; page will document configuration parameters and allowed values. | | [Set-AksHciOffsiteConfig](https://learn.microsoft.com/en-us/azure/aks/aksarc/reference/ps/set-akshcioffsiteconfig) | configuration | 0.70 | Cmdlet reference for offsite/offline configuration with specific parameters and constraints. | | [Start-AksEdgeNode](https://learn.microsoft.com/en-us/azure/aks/aksarc/reference/aks-edge-ps/start-aksedgenode) | deployment | 0.70 | Cmdlet reference for starting node VMs, including asynchronous behavior and endpoint availability details specific to AKS Edge. | @@ -303,6 +315,7 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes | [vnet](https://learn.microsoft.com/en-us/azure/aks/aksarc/vnet) | integrations | 0.70 | CLI vnet reference; contains product-specific networking command parameters and constraints. | | [Upgrade Kubernetes clusters](https://learn.microsoft.com/en-us/azure/aks/aksarc/cluster-upgrade) | deployment | 0.69 | Details rolling upgrade behavior, supported upgrade paths, and commands for AKS Arc cluster upgrades, which are product-specific operational steps. 
| | [Enable Windows node pools](https://learn.microsoft.com/en-us/azure/aks/aksarc/howto-enable-windows-node-pools) | configuration | 0.68 | How-to for enabling a feature tied to specific Azure Local versions and supported Windows images; likely includes concrete CLI flags and configuration values unique to AKS Arc. | +| [Simplify outbound connectivity](https://learn.microsoft.com/en-us/azure/aks/aksarc/arc-gateway-aks-arc) | configuration | 0.68 | Page is about enabling and using Arc gateway with AKS on Azure Local to simplify outbound network/proxy configuration. This is product-specific configuration guidance (how to set up Arc gateway in this context), not just conceptual networking theory. While the summary snippet doesn’t show tables, this topic typically includes concrete settings and steps unique to AKS enabled by Azure Arc, which qualifies as expert configuration knowledge rather than generic content. | | [Supported Kubernetes versions](https://learn.microsoft.com/en-us/azure/aks/aksarc/supported-kubernetes-versions) | decision-making | 0.68 | The page documents the specific Kubernetes versions currently supported for AKS enabled by Azure Arc and explains the support policy and lifecycle. These concrete version lists and lifecycle details are time-sensitive expert knowledge not reliably known from model training and help users decide which Kubernetes version to run and when to upgrade. | | [Monitor Kubernetes audit events](https://learn.microsoft.com/en-us/azure/aks/aksarc/kubernetes-monitor-audit-events) | configuration | 0.67 | Covers creating diagnostic settings to route audit logs as Azure Monitor resource logs; includes product-specific settings and destinations. | | [Disable Windows node pools](https://learn.microsoft.com/en-us/azure/aks/aksarc/disable-windows-nodepool) | configuration | 0.66 | Version-specific configuration steps to disable a feature on Azure Local 2508 and earlier, including commands and flags unique to this environment. 
| @@ -417,11 +430,10 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes | [Storage](https://learn.microsoft.com/en-us/azure/aks/aksarc/concepts-storage) | 0.30 | Conceptual overview of storage options; does not appear to enumerate specific storage class parameters or limits. | | [Troubleshooting overview](https://learn.microsoft.com/en-us/azure/aks/aksarc/troubleshoot-overview) | 0.30 | High-level troubleshooting overview and navigation; does not itself contain specific error codes, commands, or solutions. | | [Use GPUs](https://learn.microsoft.com/en-us/azure/aks/aksarc/deploy-gpu-node-pool) | 0.30 | From the summary, this is a how-to/tutorial style article on deploying GPU-enabled node pools in AKS on Azure Local. It does not clearly indicate the presence of configuration parameter tables, limits/quotas, error-code-based troubleshooting, or other detailed expert-only data. Likely focuses on step-by-step deployment rather than reference-style expert knowledge. | -| [What's new in AKS Edge Essentials](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-whats-new) | 0.30 | What's new/release notes; mostly feature descriptions and dates without structured limits, configuration matrices, or troubleshooting mappings. | | [Azure portal](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-create-clusters-portal) | 0.20 | Portal-based creation tutorial; focuses on steps rather than exhaustive configuration options or limits. | | [Get support](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-help-support) | 0.20 | Support article about how to open support requests; procedural but not technical configuration, limits, or troubleshooting content. | | [Microsoft Software License Terms](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-software-license-terms) | 0.20 | Legal software license terms, not technical expert knowledge for an AI agent. 
| | [Overview](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-overview) | 0.20 | High-level overview of AKS Edge Essentials; primarily conceptual and marketing-style description without detailed configuration, limits, or troubleshooting matrices. | -| [Simplify outbound connectivity](https://learn.microsoft.com/en-us/azure/aks/aksarc/arc-gateway-aks-arc) | 0.20 | The page is about simplifying network configuration for AKS on Azure Local using Azure Arc gateway and appears to be a conceptual/feature explanation or how-to. The summary does not indicate detailed configuration parameter tables, numeric limits, or error-code-based troubleshooting. It reads more like preview feature guidance and prerequisites, so it does not clearly meet any expert-knowledge sub-skill type. | | [What is AKS enabled by Azure Arc?](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-overview) | 0.20 | High-level overview of AKS enabled by Azure Arc and deployment options; no detailed limits, configs, or error mappings. | +| [What's new in AKS Edge Essentials](https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-whats-new) | 0.10 | A 'What’s new' page primarily describes release notes and feature changes. While detailed, it usually does not focus on structured limits, configuration matrices, or troubleshooting mappings as defined in the sub-skill types, and is closer to update/marketing-style content than reusable expert knowledge per this taxonomy. | | [Blogs and announcements](https://learn.microsoft.com/en-us/azure/aks/aksarc/blogs-announcements) | - | Just a list of blogs and announcements; navigation content without technical details. 
| diff --git a/products/azure-api-center/azure-api-center.csv b/products/azure-api-center/azure-api-center.csv index 9c13370c..52e9d96b 100644 --- a/products/azure-api-center/azure-api-center.csv +++ b/products/azure-api-center/azure-api-center.csv @@ -5,7 +5,7 @@ https://learn.microsoft.com/en-us/azure/api-center/build-register-apis-vscode-ex https://learn.microsoft.com/en-us/azure/api-center/design-api-github-copilot-azure,Design and develop APIs - GitHub Copilot for Azure,Develop APIs with GitHub Copilot for Azure - API Center plugin - Azure API Center,,"With AI assistance, API developers can use the Azure API Center plugin for GitHub Copilot for Azure to design and develop compliant APIs.","The API Center plugin forGitHub Copilot for Azureaccelerates design and development of new APIs starting from natural language prompts. With AI assistance available through the API Center plugin combined with the API Center VS Code extension, simply describe your API and quickly generate an OpenAPI spec for API development that complies with your organization's standards. After you generate a compliant spec, you can register the API with yourAPI center.",2026-01-28T23:12:00.000Z,how-to,,0.45,False,Describes using GitHub Copilot for Azure plugin; appears conceptual/flow-based rather than a configuration or integration reference.,unchanged https://learn.microsoft.com/en-us/azure/api-center/discover-apis-vscode-extension,Discover and consume APIs - VS Code extension,Discover APIs - VS Code extension - Azure API Center,,API developers can use the Azure API Center extension for Visual Studio Code to discover APIs in their organization's API center.,"API developers in your organization can discover and consume APIs in yourAPI centerby using the Azure API Center extension for Visual Studio Code. The extension provides the following features: Discover APIs- Browse the APIs in your API center, and view their details and documentation. 
Consume APIs- Generate API SDK clients in their favorite language including JavaScript, TypeScript, .NET, Python, and Java, using the Microsoft Kiota engine that generates SDKs for Microsoft Graph, GitHub, and mor",2026-01-28T23:12:00.000Z,how-to,,0.45,False,Describes discovery and consumption features in VS Code; likely usage-level guidance without deep configuration or limits.,unchanged https://learn.microsoft.com/en-us/azure/api-center/enable-api-analysis-linting,API analysis - self-managed,Perform API Linting and Analysis - Azure API Center,Automate API linting deployment in Azure API Center,Configure linting of API definitions in your API center to analyze compliance of APIs with the organization's API style guide.,"This article explains how to enable API analysis inAzure API Centerto set up a linting engine and triggers. These capabilities analyze your API definitions for adherence to organizational style rules, generating both individual and summary reports. API analysis helps identify and correct common errors and inconsistencies in your API definitions. The following procedures supportautomated deploymentof the linting engine and event subscription in your API center. Use the Azure Developer CLI (azd) f",2026-02-28T08:00:00.000Z,how-to,deployment,0.65,True,"Describes automated deployment of a linting engine and event subscription using Azure Developer CLI, which is product-specific deployment guidance beyond generic tutorials. While the summary is brief, it indicates concrete deployment procedures and configuration for the linting engine and triggers.",unchanged -https://learn.microsoft.com/en-us/azure/api-center/enable-api-center-plugin-marketplace,Enable plugin marketplace,Enable API Center Plugin Marketplace - Azure API Center,,Enable a plugin marketplace endpoint (preview) for your Azure API center. 
Developers can configure it in GitHub Copilot or Claude Code to discover and install plugins from your inventory.,"This article shows how to enable a plugin marketplace endpoint inAzure API Center. The plugin marketplace endpoint uses the API Center data plane API to catalog the AI plugins such as MCP servers and skills available in the API center inventory. After you enable the plugin marketplace, developers can add it to their GitHub Copilot CLI or Claude Code development environment to discover and install plugins from your API center.",2026-04-16T06:12:00.000Z,how-to,,0.2,False,"From the summary, the page is a how-to for enabling a plugin marketplace endpoint and integrating it with tools like GitHub Copilot or Claude Code. It does not clearly indicate the presence of configuration parameter tables, limits, error codes, or other detailed product-specific settings that meet the expert-knowledge criteria for any sub-skill type. It appears to be a feature enablement/tutorial-style article rather than a deep configuration, limits, or troubleshooting reference.",new +https://learn.microsoft.com/en-us/azure/api-center/enable-api-center-plugin-marketplace,Enable plugin marketplace,Enable API Center Plugin Marketplace - Azure API Center,,Enable a plugin marketplace endpoint (preview) for your Azure API center. Developers can configure it in GitHub Copilot or Claude Code to discover and install plugins from your inventory.,"This article shows how to enable a plugin marketplace endpoint inAzure API Center. The plugin marketplace endpoint uses the API Center data plane API to catalog the AI plugins such as MCP servers and skills available in the API center inventory. 
After you enable the plugin marketplace, developers can add it to their GitHub Copilot CLI or Claude Code development environment to discover and install plugins from your API center.",2026-04-16T06:12:00.000Z,how-to,,0.2,False,"From the summary, the page is a how-to for enabling a plugin marketplace endpoint and integrating it with tools like GitHub Copilot or Claude Code. It does not clearly indicate the presence of configuration parameter tables, limits, error codes, or other detailed product-specific settings that meet the expert-knowledge criteria for any sub-skill type. It appears to be a feature enablement/tutorial-style article rather than a deep configuration, limits, or troubleshooting reference.",unchanged https://learn.microsoft.com/en-us/azure/api-center/enable-api-center-portal-vs-code-extension,Enable API Center portal view - VS Code extension,Enable API Center portal view - Azure API Center - VS Code extension - Azure API Center,Control API Center portal access via VS Code extension,Enable enterprise developers to view the enterprise's API Center portal view including API definitions using the Visual Studio Code Extension for Azure API Center.,"This article shows how to provide enterprise developers access to the Azure API Center portal view in the Visual Studio Code extension forAzure API Center. Using the portal view, developers can discover APIs in your Azure API center, view API definitions, and optionally generate API clients when they don't have access to manage the API center itself or add APIs to the inventory. Access to the API Center portal view is managed using Microsoft Entra ID and Azure role-based access control. 
Tip The ",2026-01-28T23:12:00.000Z,how-to,security,0.7,True,Uses Entra ID and RBAC to manage portal view access; includes role and permission configuration specific to this extension.,unchanged https://learn.microsoft.com/en-us/azure/api-center/enable-managed-api-analysis-linting,API analysis - Microsoft managed,Managed API linting and analysis - Azure API Center,Use managed API linting and analysis in Azure API Center,Automatic linting of API definitions in your API center helps you analyze compliance of APIs with the organization's API style guide.,"Your organization's API center includes built-in, Microsoft-managed linting capabilities (preview) to analyze API definitions for adherence to organizational style rules, generating both individual and summary reports. API analysis identifies and helps you correct common errors and inconsistencies in your API definitions. With API analysis: Important If you prefer, you can enable self-managed linting and analysis using a custom Azure function, overriding the built-in capabilities. Disable any functio",2025-04-07T22:02:00.000Z,how-to,configuration,0.7,True,"Describes built-in linting engine behavior, configuration toggles, and how to enable/disable managed vs self-managed analysis.",unchanged https://learn.microsoft.com/en-us/azure/api-center/export-to-copilot-studio,Export API from API Center to Copilot Studio,Export API from API Center as Copilot Studio connector - Azure API Center,Export API Center APIs as Copilot Studio connectors,Learn how to export an API definition from your Azure API center as a custom connector in Microsoft Copilot Studio.,"You can export an API definition from your Azure API center as a connector in Microsoft Copilot Studio (preview). Copilot Studio is a graphical, low-code tool for building agents and agent flows and is part of the Microsoft Power Platform family.
In Microsoft Copilot Studio, citizen developers can use the exported connector as a custom connector toextend the capabilities of AI agents. In this article, you create a custom connector from an OpenAPI definition associated with an API in an API center.",2026-01-28T23:12:00.000Z,how-to,integrations,0.7,True,"Details how to map API Center definitions to Copilot Studio connector configuration, including specific fields and options.",unchanged diff --git a/products/azure-api-center/report.md b/products/azure-api-center/report.md index 92d9532c..89677735 100644 --- a/products/azure-api-center/report.md +++ b/products/azure-api-center/report.md @@ -37,9 +37,9 @@ confusable_not_for: Not for Azure API Management (use azure-api-management), Azu - **Unclassified**: 17 ### Incremental Update -- **New Pages**: 1 +- **New Pages**: 0 - **Updated Pages**: 0 -- **Unchanged**: 35 +- **Unchanged**: 36 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-api-center/azure-api-center.csv` @@ -56,10 +56,6 @@ confusable_not_for: Not for Azure API Management (use azure-api-management), Azu ## Changes -### New Pages - -- [Enable plugin marketplace](https://learn.microsoft.com/en-us/azure/api-center/enable-api-center-plugin-marketplace) - ## Classified Pages | TOC Title | Type | Confidence | Reason | diff --git a/products/azure-api-management/azure-api-management.csv b/products/azure-api-management/azure-api-management.csv index 58b58d5e..d3ddc6bb 100644 --- a/products/azure-api-management/azure-api-management.csv +++ b/products/azure-api-management/azure-api-management.csv @@ -49,13 +49,13 @@ https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-prov https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-send-service-bus,Send messages to Azure Service Bus,How to Send Messages to Azure Service Bus,Send messages to Azure Service Bus from API Management,Learn how to send messages to Azure Service Bus in 
Azure API Management. Service Bus is a messaging service that allows you to decouple applications and services.,"APPLIES TO: Developer | Basic | Standard | Premium This article describes how to send messages from API Management to Azure Service Bus by using policy-based integration. Use API Management to provide a secure and scalable way to send messages to Service Bus. Azure Service Busis a fully managed enterprise messaging service designed to decouple applications and services, enabling reliable cloud messaging between distributed systems. It supports AMQP (Advanced Message Queuing Protocol) for systems",2025-10-01T17:42:00.000Z,how-to,integrations,0.8,True,"Describes policy-based integration with Service Bus, including configuration details unique to this integration.",unchanged https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-setup-delegation,Delegate authentication,How to Delegate User Registration and Product Subscription,Configure delegation for user registration and subscriptions in API Management,Learn how to delegate user registration and product subscription to a third party in the Azure API Management developer portal.,"APPLIES TO: Developer | Basic | Basic v2 | Standard | Standard v2 | Premium | Premium v2 Delegation enables your website to own the user data and perform custom validation for users of the developer portal. 
With delegation, you can handle developer sign-in and sign-up (and related account management operations) and product subscription by using your existing website, instead of the developer portal's built-in functionality.",2025-10-29T05:11:00.000Z,how-to,configuration,0.7,True,Explains how to configure delegation endpoints and settings so an external site handles sign-in/sign-up and product subscription; product-specific configuration details.,unchanged https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-use-azure-monitor,5 - Monitor published APIs,Tutorial - Monitor APIs in Azure API Management,Configure Azure Monitor metrics and logs for API Management,"Learn how to use metrics, alerts, activity logs, and resource logs to monitor your APIs in Azure API Management.","APPLIES TO: All API Management tiers With Azure Monitor, you can visualize, query, route, archive, and take actions on the metrics or logs coming from your Azure API Management service. For an overview of Azure Monitor for API Management, see Monitor API Management. Tip API teams can use this feature in workspaces. Workspaces provide isolated administrative access to APIs and their own API runtime environments. In this tutorial, you learn how to: Note API Management supports a range of additional ",2025-07-09T08:00:00.000Z,tutorial,configuration,0.65,True,"Monitoring tutorial typically includes specific diagnostic settings, categories, and log configuration options for API Management and Azure Monitor.",unchanged -https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-use-managed-service-identity,Use managed identities for Azure resources,Use Managed Identities in Azure API Management,Configure managed identities for Azure API Management,"Learn how to create system-assigned and user-assigned identities in API Management by using the Azure portal, PowerShell, and Resource Manager templates. 
Learn about supported scenarios with managed i","APPLIES TO: All API Management tiers This article shows how to create a managed identity for an Azure API Management instance and how to use it to access other resources. By using a managed identity generated by Microsoft Entra ID, API Management can easily and securely access other resources that are protected by Microsoft Entra, like Azure Key Vault. Azure manages these identities, so you don't have to provision or rotate any secrets. For more information about managed identities, see What are ",2026-04-16T22:31:00.000Z,how-to,security,0.7,True,"Covers product-specific steps and parameters to create and use system-assigned and user-assigned managed identities for API Management to access other Azure resources, including Entra/Key Vault integration details.",updated +https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-use-managed-service-identity,Use managed identities for Azure resources,Use Managed Identities in Azure API Management,Configure managed identities for Azure API Management,"Learn how to create system-assigned and user-assigned identities in API Management by using the Azure portal, PowerShell, and Resource Manager templates. Learn about supported scenarios with managed i","APPLIES TO: All API Management tiers This article shows how to create a managed identity for an Azure API Management instance and how to use it to access other resources. By using a managed identity generated by Microsoft Entra ID, API Management can easily and securely access other resources that are protected by Microsoft Entra, like Azure Key Vault. Azure manages these identities, so you don't have to provision or rotate any secrets. 
For more information about managed identities, see What are ",2026-04-23T06:20:00.000Z,how-to,security,0.7,True,Describes product-specific steps and settings to create and use system-assigned and user-assigned managed identities for API Management to access other Azure resources. This is concrete security/identity configuration (managed identity enablement and usage) rather than conceptual overview.,updated https://learn.microsoft.com/en-us/azure/api-management/api-management-key-concepts,About API Management,Azure API Management - Overview and Key Concepts,,"Introduction to key scenarios, capabilities, and concepts of the Azure API Management service. API Management supports the full API lifecycle.","APPLIES TO: All API Management tiers This article provides an overview of common scenarios and key components of Azure API Management. Azure API Management is a hybrid, multicloud management platform for APIs across all environments. As a platform-as-a-service, API Management supports the complete API lifecycle. Tip If you're already familiar with API Management and ready to start, see these resources:",2025-10-13T08:00:00.000Z,overview,,0.1,False,"High-level overview of Azure API Management concepts and scenarios without numeric limits, configuration tables, or product-specific troubleshooting or security details.",unchanged https://learn.microsoft.com/en-us/azure/api-management/api-management-kubernetes,Manage microservices deployed in Azure Kubernetes Service (AKS),Use API Management with Microservices Deployed in AKS,Use API Management with AKS microservices architectures,Learn about options for using Azure API Management to publish microservices-based architectures that are deployed in AKS as APIs.,"APPLIES TO: All API Management tiers Microservices are perfect for building APIs. You can use Azure Kubernetes Service (AKS) to quickly deploy and operate a microservices-based architecture in the cloud. 
You can then use Azure API Management to publish your microservices as APIs for internal and external consumption. This article describes the options for using API Management to publish AKS microservices-based architectures as APIs. It assumes a basic knowledge of Kubernetes, API Management, and Azure",2025-06-02T17:26:00.000Z,concept-article,architecture-patterns,0.7,True,Describes specific options for publishing AKS microservices via API Management; product-specific architecture patterns and trade-offs for AKS integration.,unchanged https://learn.microsoft.com/en-us/azure/api-management/api-management-log-to-eventhub-sample,Monitor APIs with advanced logging,"Monitor APIs with Azure API Management, Event Hubs, and Moesif - Azure API Management",Log API Management traffic to Event Hubs and Moesif,"Sample application demonstrating the log-to-eventhub policy by connecting Azure API Management, Azure Event Hubs, and Moesif for HTTP logging and monitoring","APPLIES TO: All API Management tiers The API Management service provides many capabilities to enhance the processing of HTTP requests sent to your HTTP API. However, the existence of the requests and responses is transient. The request is made and flows through the API Management service to your backend API. Your API processes the request and a response flows back through to the API consumer. The API Management service keeps some important statistics about the APIs for display in the Azure portal ",2025-10-01T08:00:00.000Z,how-to,integrations,0.75,True,Demonstrates log-to-eventhub policy with concrete configuration for integrating Event Hubs and Moesif; product-specific integration pattern.,unchanged https://learn.microsoft.com/en-us/azure/api-management/api-management-policies,Policy reference index,Azure API Management policy reference,Reference index for Azure API Management policies,Reference index for all Azure API Management policies and settings. 
Policies allow the API publisher to change API behavior through configuration.,"APPLIES TO: All API Management tiers This section provides brief descriptions and links to reference articles for all API Management policies. The API Management gateways that support each policy are indicated. For detailed policy settings and examples, see the linked reference articles. More information about policies: Important Limit call rate by subscription and Set usage quota by subscription have a dependency on the subscription key. A subscription key isn't required when other policies are appl",2025-05-06T08:00:00.000Z,reference,configuration,0.82,True,Central index linking to detailed policy references; describes which gateways support each policy and includes product-specific policy behavior notes.,unchanged https://learn.microsoft.com/en-us/azure/api-management/api-management-policy-expressions,Use policy expressions,Azure API Management policy expressions,Use policy expressions in Azure API Management policies,Learn about policy expressions in Azure API Management. See examples and view other available resources.,APPLIES TO: All API Management tiers This article discusses policy expressions syntax in C# 7. Each expression has access to:,2026-01-15T08:00:00.000Z,reference,integrations,0.6,True,Details syntax and available context objects for policy expressions; effectively an API surface for expressions with product-specific members and behaviors.,unchanged -https://learn.microsoft.com/en-us/azure/api-management/api-management-region-availability,Regional availability,Azure API Management - Availability of v2 tiers and workspace gateways,Check regional availability of API Management v2 tiers,Availability of API Management v2 tiers and workspace gateways in Azure regions. 
This information supplements product availability by region.,"APPLIES TO: Basic v2 | Standard v2 | Premium | Premium v2 API Management v2 tiers and API Management workspace gateways are available in a subset of the regions where the classic tiers are available. For information about the availability of the API Management classic tiers, see Products available by region.",2026-04-14T06:14:00.000Z,concept-article,deployment,0.7,True,"The page describes availability of API Management v2 tiers and workspace gateways by Azure region, supplementing product availability by region. This is deployment-related metadata (which SKUs can be deployed where), effectively a platform/region support matrix, which fits the deployment sub-skill focused on platform/tier support constraints.",updated +https://learn.microsoft.com/en-us/azure/api-management/api-management-region-availability,Regional availability,Azure API Management - Availability of v2 tiers and workspace gateways,Check regional availability of API Management v2 tiers,Availability of API Management v2 tiers and workspace gateways in Azure regions. This information supplements product availability by region.,"APPLIES TO: Basic v2 | Standard v2 | Premium | Premium v2 API Management v2 tiers and API Management workspace gateways are available in a subset of the regions where the classic tiers are available. 
For information about the availability of the API Management classic tiers, see Products available by region.",2026-04-25T17:14:00.000Z,concept-article,deployment,0.7,True,"Page provides a region-by-region availability matrix for API Management v2 tiers and workspace gateways, which is product- and time-specific information needed for deployment planning and not reliably known from model training.",updated https://learn.microsoft.com/en-us/azure/api-management/api-management-revisions,Manage API revisions,Revisions in Azure API Management,,Learn about the concept of revisions in Azure API Management.,"APPLIES TO: All API Management tiers Revisions allow you to make changes to your APIs in a controlled and safe way. When you want to make changes, create a new revision. You can then edit and test your API without disturbing your API consumers. When you're ready, you then make your revision current. At the same time, you can optionally post an entry to the change log, to keep your API consumers up to date with the changes you made. The change log is published to your developer portal. Note The d",2025-09-29T08:00:00.000Z,concept-article,,0.3,False,Conceptual explanation of revisions; no detailed configuration parameters or limits.,unchanged https://learn.microsoft.com/en-us/azure/api-management/api-management-role-based-access-control,Use role-based access control,How to use role-based access control in Azure API Management,Configure RBAC roles for Azure API Management access control,Learn how to use the built-in roles and create custom roles in Azure API Management,"APPLIES TO: All API Management tiers Azure API Management relies on Azure role-based access control (Azure RBAC) to enable fine-grained access management for API Management services and entities including workspaces. This article gives you an overview of the built-in and custom roles in API Management. 
For more information on access management in the Azure portal, see Get started with access management in the Azure portal. Note We recommend that you use the Azure Az PowerShell module to interact ",2025-08-08T08:00:00.000Z,concept-article,security,0.85,True,"Lists built-in and custom RBAC roles and scopes specific to API Management, which are concrete security configuration details not derivable generically.",unchanged https://learn.microsoft.com/en-us/azure/api-management/api-management-sample-cache-by-key,Custom caching,Custom Caching in Azure API Management,Implement custom key-based caching in API Management,Learn how to cache items by key in Azure API Management. You can modify the key by using request headers.,"APPLIES TO: All API Management tiers Azure API Management service has built-in support for HTTP response caching using the resource URL as the key. You can modify the key by using request headers that use the vary-by properties. This technique is useful for caching entire HTTP responses (also known as representations), but sometimes it's useful to just cache a portion of a representation. 
The cache-lookup-value and cache-store-value policies give you the ability to store and retrieve arbitrary pieces of",2025-10-06T22:11:00.000Z,how-to,best-practices,0.75,True,Shows product-specific use of cache-lookup-value and cache-store-value policies with key patterns; concrete coding/config patterns unique to API Management.,unchanged @@ -71,7 +71,7 @@ https://learn.microsoft.com/en-us/azure/api-management/applications,Protect prod https://learn.microsoft.com/en-us/azure/api-management/authentication-authorization-overview,API authentication and authorization options,API Authentication and Authorization - Overview - Azure API Management,,"Learn about authentication and authorization features in Azure API Management to secure access to APIs, including options for OAuth 2.0 authorization.","APPLIES TO: All API Management tiers This article is an introduction to a rich, flexible set of features in API Management that help you secure users' access to managed APIs. API authentication and authorization in API Management involve securing the end-to-end communication of client apps to the API Management gateway and through to backend APIs. In many customer environments, OAuth 2.0 is the preferred API authorization protocol. API Management supports OAuth 2.0 authorization between the clie",2025-09-30T22:16:00.000Z,concept-article,,0.4,False,"An overview of authentication and authorization options; summary suggests conceptual coverage without detailed role names, parameters, or configuration tables.",unchanged https://learn.microsoft.com/en-us/azure/api-management/authentication-basic-policy,authentication-basic,Azure API Management policy reference - authentication-basic,Use authentication-basic policy to secure backend calls in API Management,"Reference for the authentication-basic policy available for use in Azure API Management. 
Provides policy usage, settings, and examples.","APPLIES TO: All API Management tiers Use the authentication-basic policy to authenticate with a backend service using Basic authentication. This policy effectively sets the HTTP Authorization header to the value corresponding to the credentials provided in the policy. Caution Minimize risks of credential exposure when configuring this policy. Microsoft recommends that you use more secure authentication methods if supported by your backend, such as managed identity authentication or credential manager",2025-06-01T11:12:00.000Z,reference,security,0.8,True,"Policy reference with concrete XML syntax, attributes, and examples for basic auth; product-specific security policy configuration.",unchanged https://learn.microsoft.com/en-us/azure/api-management/authentication-certificate-policy,authentication-certificate,Azure API Management policy reference - authentication-certificate,Use authentication-certificate policy for client certificate auth in API Management,"Reference for the authentication-certificate policy available for use in Azure API Management. Provides policy usage, settings, and examples.","APPLIES TO: All API Management tiers Use the authentication-certificate policy to authenticate with a backend service using a client certificate. When the certificate is installed into API Management first, identify it first by its thumbprint or certificate ID (resource name). Caution Minimize risks of credential exposure when configuring this policy. 
Microsoft recommends that you use more secure authentication methods if supported by your backend, such as managed identity authentication or credential m",2024-09-17T11:16:00.000Z,reference,security,0.8,True,"Details policy elements, certificate identification (thumbprint/resource name), and usage; product-specific security configuration.",unchanged -https://learn.microsoft.com/en-us/azure/api-management/authentication-managed-identity-policy,authentication-managed-identity,Azure API Management policy reference - authentication-managed-identity,Configure authentication-managed-identity policy in API Management,"Reference for the authentication-managed-identity policy available for use in Azure API Management. Provides policy usage, settings, and examples.","APPLIES TO: All API Management tiers Use the authentication-managed-identity policy to authenticate with a backend service using the managed identity. This policy essentially uses the managed identity to obtain an access token from Microsoft Entra ID for accessing the specified resource. After successfully obtaining the token, the policy will set the value of the token in the Authorization header using the Bearer scheme. API Management caches the token until it expires. Both system-assigned identity a",2026-04-16T22:31:00.000Z,reference,security,0.78,True,"Policy reference pages for Azure API Management typically list exact policy attributes, allowed values, and behavior details for using managed identities and Microsoft Entra ID tokens.
This includes product-specific security configuration parameters (for example, policy XML elements/attributes, header names, token acquisition behavior, and caching semantics) that go beyond generic knowledge and match the security sub-skill criteria.",updated +https://learn.microsoft.com/en-us/azure/api-management/authentication-managed-identity-policy,authentication-managed-identity,Azure API Management policy reference - authentication-managed-identity,Configure authentication-managed-identity policy in API Management,"Reference for the authentication-managed-identity policy available for use in Azure API Management. Provides policy usage, settings, and examples.","APPLIES TO: All API Management tiers Use the authentication-managed-identity policy to authenticate with a backend service using the managed identity. This policy essentially uses the managed identity to obtain an access token from Microsoft Entra ID for accessing the specified resource. After successfully obtaining the token, the policy will set the value of the token in the Authorization header using the Bearer scheme. API Management caches the token until it expires. Both system-assigned identity a",2026-04-23T06:20:00.000Z,reference,security,0.78,True,"Policy reference page with product-specific security configuration details: exact policy name, attributes, how it obtains tokens from Entra ID, how tokens are cached and added to Authorization headers, and examples for system-assigned vs user-assigned managed identities.
This is concrete, implementation-focused security configuration rather than conceptual guidance.",updated https://learn.microsoft.com/en-us/azure/api-management/automate-portal-deployments,Automate portal deployments,Automate Developer Portal Deployments - Azure API Management,Automate deployment of API Management developer portal content,Learn how to automatically migrate self-hosted developer portal content between two API Management services.,"APPLIES TO: Developer | Basic | Standard | Premium The developer portal in Azure API Management supports programmatic access to content. The developer portal allows you to import data to, or export from, an API Management service through the content management REST API. The REST API access works for both managed and self-hosted developer portals.",2025-10-10T06:02:00.000Z,how-to,deployment,0.7,True,Describes migrating portal content via content management REST API; includes deployment patterns and API details unique to APIM.,unchanged https://learn.microsoft.com/en-us/azure/api-management/automation-manage-api-management,Manage using automation,Manage Azure API Management using Azure Automation,Automate Azure API Management operations with Azure Automation,Learn about how the Azure Automation service can be used to manage Azure API Management.,APPLIES TO: All API Management tiers This guide introduces you to the Azure Automation service and how you can use it to simplify management of Azure API Management.,2025-09-30T22:16:00.000Z,concept-article,deployment,0.6,True,"Describes concrete use of Azure Automation to manage API Management, including product-specific runbook and management patterns beyond generic automation concepts.",unchanged https://learn.microsoft.com/en-us/azure/api-management/azure-ai-foundry-api,Import Microsoft Foundry API,Import a Microsoft Foundry API - Azure API Management,Import Microsoft Foundry AI endpoints into API Management,How to import an API from Microsoft Foundry as a REST API in Azure API
Management.,"APPLIES TO: All API Management tiers You can import AI model endpoints deployed in Microsoft Foundry to your API Management instance as APIs. Use AI gateway policies and other capabilities in API Management to simplify integration, improve observability, and enhance control over the model endpoints. To learn more about managing AI APIs in API Management, see:",2026-03-31T11:59:00.000Z,how-to,integrations,0.65,True,Describes importing Microsoft Foundry AI model endpoints as APIs and using AI gateway policies; this is a product-specific integration pattern likely including endpoint configuration details and API Management-specific settings.,unchanged @@ -108,14 +108,14 @@ https://learn.microsoft.com/en-us/azure/api-management/choose-policy,choose,Azur https://learn.microsoft.com/en-us/azure/api-management/configure-credential-connection,Configure multiple connections,Set up Multiple Connections - Azure API Management,Configure multiple credential connections in API Management,Learn how to set up multiple connections to a configured API credential provider using the portal.,"APPLIES TO: All API Management tiers You can configure multiple connections to a credential provider in your API Management instance. For example, if you configured Microsoft Entra ID as a credential provider, you might need to create multiple connections for different scenarios and users. In this article, you learn how to add a connection to an existing provider by using credential manager in the Azure portal. 
For an overview of credential manager, see About API credentials and credential manage",2025-10-10T06:02:00.000Z,how-to,configuration,0.72,True,Describes setting up multiple connections to a credential provider; likely includes specific UI/config parameters and constraints for APIM credential manager that are product-specific.,unchanged https://learn.microsoft.com/en-us/azure/api-management/configure-custom-domain,Configure a custom domain,Configure custom domain name for Azure API Management instance - Azure API Management,Configure custom domains and certificates for Azure API Management,How to configure a custom domain name and choose certificates for the endpoints of your Azure API Management instance.,"APPLIES TO: All API Management tiers When you create an Azure API Management service instance in the Azure cloud, Azure assigns it an azure-api.net subdomain (for example, apim-service-name.azure-api.net). You can also expose your API Management endpoints using your own custom domain name, such as contoso.com. This article shows you how to map an existing custom DNS name to endpoints exposed by an API Management instance. Important API Management only accepts requests with host header values matching: ",2026-04-03T08:00:00.000Z,how-to,configuration,0.72,True,"The article provides product-specific configuration details for mapping custom DNS names to Azure API Management endpoints, including required host header behavior and certificate selection for different endpoints.
These are concrete, service-specific configuration behaviors and constraints that go beyond generic knowledge, but it does not focus on numeric limits/quotas, troubleshooting, or architecture patterns.",unchanged https://learn.microsoft.com/en-us/azure/api-management/configure-graphql-resolver,Configure a GraphQL resolver,Configure GraphQL Resolver in Azure API Management,Configure GraphQL field resolvers in Azure API Management,Configure a GraphQL resolver in Azure API Management for a field in an object type specified in a GraphQL schema,"APPLIES TO: All API Management tiers Configure a resolver to retrieve or set data for a GraphQL field in an object type specified in a GraphQL schema. The schema must be imported to API Management as a GraphQL API. Note Currently, this feature isn't available in workspaces. Currently, API Management supports resolvers that can access the following data sources:",2025-10-01T08:00:00.000Z,how-to,integrations,0.7,True,Details resolver configuration and supported data sources for GraphQL fields; product-specific integration and configuration pattern.,unchanged -https://learn.microsoft.com/en-us/azure/api-management/configure-service-update-settings,Configure update settings,Configure API Management settings for service updates,Configure API Management automatic service update settings,Learn how to configure settings for applying service updates to your Azure API Management instance. Settings include the upgrade group and the maintenance window.,"APPLIES TO: Developer | Basic | Standard | Premium This article shows you how to configure service update settings (preview) in your API Management instance. Azure periodically applies service updates automatically to API Management instances by using a phased rollout approach. These updates include new features, security enhancements, and reliability improvements.
You can't control exactly when Azure updates each API Management instance, but in select service tiers you can choose an update group fo",2026-04-13T08:00:00.000Z,how-to,configuration,0.7,True,"Describes product-specific settings for service updates (upgrade groups, maintenance windows) with concrete configuration options and constraints for different API Management tiers, which are not generic knowledge.",updated +https://learn.microsoft.com/en-us/azure/api-management/configure-service-update-settings,Configure update settings,Configure API Management settings for service updates,Configure API Management automatic service update settings,Learn how to configure settings for applying service updates to your Azure API Management instance. Settings include the upgrade group and the maintenance window.,"APPLIES TO: Developer | Basic | Standard | Premium This article shows you how to configure service update settings (preview) in your API Management instance. Azure periodically applies service updates automatically to API Management instances by using a phased rollout approach. These updates include new features, security enhancements, and reliability improvements. You can't control exactly when Azure updates each API Management instance, but in select service tiers you can choose an update group fo",2026-04-13T08:00:00.000Z,how-to,configuration,0.7,True,"Describes product-specific settings for service updates (upgrade groups, maintenance windows) with concrete configuration options and constraints for different API Management tiers, which are not generic knowledge.",unchanged
Provides policy usage, settings, and examples.","APPLIES TO: All API Management tiers The cors policy adds cross-origin resource sharing (CORS) support to an operation or an API to allow cross-domain calls from browser-based clients. Note Set the policy's elements and child elements in the order provided in the policy statement. To help you configure this policy, the portal provides a guided, form-based editor. Learn more about how to set or edit API Management policies.",2024-07-23T08:00:00.000Z,reference,configuration,0.8,True,"Details policy elements for allowed origins, methods, headers, etc.; product-specific configuration syntax and behavior.",unchanged https://learn.microsoft.com/en-us/azure/api-management/cosmosdb-data-source-policy,cosmosdb-data-source,Azure API Management policy reference - cosmosdb-data-source,Configure cosmosdb-data-source policy for GraphQL resolvers,"Reference for the cosmosdb-data-source resolver policy available for use in Azure API Management. Provides policy usage, settings, and examples.","APPLIES TO: Developer | Basic | Basic v2 | Standard | Standard v2 | Premium | Premium v2 The cosmosdb-data-source resolver policy resolves data for an object type and field in a GraphQL schema by using a Cosmos DB data source. The schema must be imported to API Management as a GraphQL API. Use the policy to configure a single query request, read request, delete request, or write request and an optional response from the Cosmos DB data source. Note This policy is in preview.
Currently, the policy isn",2024-03-18T08:00:00.000Z,reference,configuration,0.82,True,"Includes tier applicability, preview status, and configuration for query/read/write/delete operations against Cosmos DB, which are detailed product-specific configuration parameters.",unchanged -https://learn.microsoft.com/en-us/azure/api-management/credentials-configure-common-providers,Configure common credential providers,Configure Credential Providers - Azure API Management,Configure credential providers in API Management,Learn how to configure common credential providers in the Azure API Management credential manager. Providers include Microsoft Entra and generic OAuth.,"APPLIES TO: All API Management tiers In this article, you learn about configuring identity providers for managed connections in your Azure API Management instance. Settings for the following common providers are shown: You configure a credential provider in the credential manager in your API Management instance. For a step-by-step example of configuring a Microsoft Entra provider and connection, see Configure credential manager - Microsoft Graph API.",2026-04-15T06:10:00.000Z,how-to,configuration,0.7,True,"Explains configuring Microsoft Entra and generic OAuth providers for managed connections with provider-specific settings and parameters in API Management’s credential manager, which are product-specific configuration details.",updated -https://learn.microsoft.com/en-us/azure/api-management/credentials-how-to-azure-ad,Configure credential manager - Microsoft Graph API,Create Connection to Microsoft Graph API,Create managed Microsoft Graph connections via API Management credential manager,Learn how to create and use a managed connection to a backend Microsoft Graph API using the Azure API Management credential manager.,APPLIES TO: All API Management tiers This article guides you through the steps required to create a managed connection to the Microsoft Graph API within Azure API Management.
Use the Microsoft Entra identity provider to call the Microsoft Graph API. This example uses the authorization code grant type. You learn how to:,2025-12-08T08:00:00.000Z,how-to,integrations,0.8,True,"Provides step-by-step configuration of a managed connection to Microsoft Graph using Entra and authorization code flow, including provider-specific parameters.",unchanged -https://learn.microsoft.com/en-us/azure/api-management/credentials-how-to-github,Configure credential manager - GitHub,Create Connection to GitHub API - Azure API Management,Configure GitHub OAuth connections in API Management,Learn how to create and use a managed connection to a backend GitHub API using the Azure API Management credential manager.,"APPLIES TO: All API Management tiers In this article, you learn how to create a managed connection in API Management and call a GitHub API that requires an OAuth 2.0 token. This example uses the authorization code grant type. You learn how to:",2025-12-08T08:00:00.000Z,how-to,integrations,0.74,True,"How-to for managed GitHub connection with OAuth 2.0 authorization code flow; likely includes APIM credential manager parameters, redirect URIs, and provider-specific config values beyond generic OAuth theory.",unchanged +https://learn.microsoft.com/en-us/azure/api-management/credentials-configure-common-providers,Configure common credential providers,Configure Credential Providers - Azure API Management,Configure credential providers in API Management,Learn how to configure common credential providers in the Azure API Management credential manager. Providers include Microsoft Entra and generic OAuth.,"APPLIES TO: All API Management tiers In this article, you learn about configuring identity providers for managed connections in your Azure API Management instance. Settings for the following common providers are shown: You configure a credential provider in the credential manager in your API Management instance.
For a step-by-step example of configuring a Microsoft Entra provider and connection, see Configure credential manager - Microsoft Graph API.",2026-04-22T06:17:00.000Z,how-to,security,0.7,True,"Covers configuring identity/credential providers (Microsoft Entra, generic OAuth) in the API Management credential manager with provider-specific settings. This is product-specific security/identity configuration rather than generic OAuth theory.",updated +https://learn.microsoft.com/en-us/azure/api-management/credentials-how-to-azure-ad,Configure credential manager - Microsoft Graph API,Create Connection to Microsoft Graph API - Azure API Management,Create managed Microsoft Graph connection from API Management,Learn how to create and use a managed connection to a backend Microsoft Graph API using the Azure API Management credential manager.,"APPLIES TO: All API Management tiers This article guides you through the steps required to create a managed connection to the Microsoft Graph API from Azure API Management. Use the Microsoft Entra identity provider to call the Microsoft Graph API. This example uses the authorization code grant type. Note In this article, you can configure the credential provider by using either a traditional authorization code grant type with a client secret or an authorization code grant type with federated identity",2026-04-22T06:17:00.000Z,how-to,integrations,0.7,True,"Provides a concrete pattern for integrating Azure API Management with Microsoft Graph via a managed connection using Microsoft Entra, including grant type choice and provider configuration.
This is a product-specific integration pattern rather than a generic tutorial.",updated +https://learn.microsoft.com/en-us/azure/api-management/credentials-how-to-github,Configure credential manager - GitHub API,Create Connection to GitHub API - Azure API Management,Configure GitHub OAuth connections in API Management,Learn how to create and use a managed connection to a backend GitHub API using the Azure API Management credential manager.,"APPLIES TO: All API Management tiers In this article, you learn how to create a managed connection in API Management and call a GitHub API that requires an OAuth 2.0 token. This example uses the authorization code grant type. You learn how to:",2025-12-08T08:00:00.000Z,how-to,integrations,0.68,True,"How-to for creating a managed connection from Azure API Management to GitHub using OAuth 2.0 authorization code flow. Likely includes product-specific configuration fields (connection/credential manager settings, callback URLs, scopes, parameter names) and step-by-step integration details that go beyond generic OAuth knowledge, fitting the integrations & coding patterns category.",new https://learn.microsoft.com/en-us/azure/api-management/credentials-how-to-user-delegated,Configure credential manager - user-delegated permissions,Manage Connections for End Users - Azure API Management,Configure user-delegated OAuth connections in API Management,Learn how to configure a connection with user-delegated permissions to a backend OAuth 2.0 API using the Azure API Management credential manager.,"APPLIES TO: All API Management tiers This article guides you through the high-level steps to configure and use a managed connection that grants Microsoft Entra users or groups delegated permissions to a backend OAuth 2.0 API.
Follow these steps for scenarios when a client app (or bot) needs to access backend secured online resources on behalf of an authenticated user (for example, to check emails or place an order).",2025-12-08T23:11:00.000Z,how-to,integrations,0.7,True,Guides configuring managed connections with user-delegated permissions to backend OAuth 2.0 APIs; likely details APIM credential manager settings and Entra/user delegation parameters not generally known.,unchanged -https://learn.microsoft.com/en-us/azure/api-management/credentials-overview,Credential manager overview,About Credential Manager in Azure API Management,,Learn about using credential manager in Azure API Management to create and manage connections to backend SaaS APIs,"APPLIES TO: All API Management tiers To help you manage access to backend APIs, your API Management instance includes a credential manager. Use credential manager to manage, store, and control access to API credentials from your API Management instance. Note Currently, this feature isn't available in workspaces.",2026-04-10T08:00:00.000Z,concept-article,,0.2,False,"High-level overview of Credential Manager; summary suggests conceptual description without detailed configuration parameters, limits, or error mappings.",updated +https://learn.microsoft.com/en-us/azure/api-management/credentials-overview,Credential manager overview,About Credential Manager in Azure API Management,,Learn about using credential manager in Azure API Management to create and manage connections to backend SaaS APIs,"APPLIES TO: All API Management tiers To help you manage access to backend APIs, your API Management instance includes a credential manager. Use credential manager to manage, store, and control access to API credentials from your API Management instance.
Note Currently, this feature isn't available in workspaces.",2026-04-10T08:00:00.000Z,concept-article,,0.2,False,"High-level overview of Credential Manager; summary suggests conceptual description without detailed configuration parameters, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/api-management/credentials-process-flow,Managed connections - process flows,Credential Manager in Azure API Management - Process Flows,Understand credential manager OAuth 2.0 management and runtime flows,Learn about the management and runtime process flows for managing OAuth 2.0 connections using credential manager in Azure API Management,"APPLIES TO: All API Management tiers This article provides details about the process flows for managing OAuth 2.0 connections using credential manager in Azure API Management. The process flows are divided into two parts: management and runtime. For background about credential manager in API Management, see About credential manager and API credentials in API Management.",2025-09-30T22:16:00.000Z,concept-article,configuration,0.7,True,"Details management and runtime process flows for OAuth 2.0 connections in credential manager, which are specific to API Management.",unchanged https://learn.microsoft.com/en-us/azure/api-management/cross-domain-policy,cross-domain,Azure API Management policy reference - cross-domain,Enable cross-domain access with cross-domain policy in API Management,"Reference for the cross-domain policy available for use in Azure API Management. Provides policy usage, settings, and examples.",APPLIES TO: All API Management tiers Use the cross-domain policy to make the API accessible from Adobe Flash and Microsoft Silverlight browser-based clients. Note Set the policy's elements and child elements in the order provided in the policy statement.
Learn more about how to set or edit API Management policies.,2025-06-01T11:12:00.000Z,reference,configuration,0.7,True,Policy reference for enabling Flash/Silverlight cross-domain access with specific configuration elements; product-specific policy configuration.,unchanged https://learn.microsoft.com/en-us/azure/api-management/developer-portal-alternative-processes-self-host,Alternative approaches to self-hosting,Alternatives for self-hosting developer portal - Azure API Management,Choose alternative approaches for self-hosting API portal,Learn about alternative approaches you can use when you self-host a developer portal in Azure API Management.,"APPLIES TO: Developer | Basic | Basic v2 | Standard | Standard v2 | Premium | Premium v2 There are several alternative approaches you can explore when you self-host a developer portal: This article provides information on each of these approaches. If you haven't already done so, set up a local environment for the latest release of the developer portal.",2026-02-06T18:19:00.000Z,how-to,decision-making,0.62,True,"Discusses several alternative approaches for self-hosting the developer portal, providing guidance on options and when to use them; supports implementation decisions.",unchanged @@ -243,7 +243,7 @@ https://learn.microsoft.com/en-us/azure/api-management/send-request-policy,send- https://learn.microsoft.com/en-us/azure/api-management/send-service-bus-message-policy,send-service-bus-message,Azure API Management Policy Reference - send-service-bus-message,Configure send-service-bus-message policy for Azure Service Bus,"Reference for the send-service-bus-message policy available for use in Azure API Management. Provides policy usage, settings, and examples.",APPLIES TO: Developer | Basic | Standard | Premium The send-service-bus-message policy sends a message to an Azure Service Bus queue or topic. The API request can optionally be forwarded to the backend service.
Note Set the policy's elements and child elements in the order provided in the policy statement. Learn more about how to set or edit API Management policies.,2025-10-01T17:42:00.000Z,reference,configuration,0.82,True,"Provides policy configuration for sending messages to Service Bus queues/topics, including tier applicability and options, which are detailed integration/configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/api-management/service-limits,Understand service limits,Understanding Azure API Management Service Limits,Understand and work within Azure API Management service limits,"Learn about service limits in Azure API Management, including their purpose, how they're enforced, and guidelines for managing your service.",Azure API Management enforces various limits on resources such as API operations and other entities. This article explains why these limits exist and how to use the service effectively within these constraints.,2026-02-26T23:12:00.000Z,concept-article,limits-quotas,0.9,True,Dedicated limits article; contains specific numeric limits and guidance on how they’re enforced and managed.,unchanged https://learn.microsoft.com/en-us/azure/api-management/set-backend-service-dapr-policy,set-backend-service (Dapr),Azure API Management policy reference - set-backend-service (Dapr),Configure Dapr set-backend-service policy in API Management,"Reference for the set-backend-service policy available for use in Dapr integration with Azure API Management. Provides policy usage, settings, and examples.","APPLIES TO: Developer | Premium The set-backend-service policy sets the target URL for the current request to http://localhost:3500/v1.0/invoke/{app-id}[.{ns-name}]/method/{method-name}, replacing template parameters with values specified in the policy statement. The policy assumes that Dapr runs in a sidecar container in the same pod as the gateway.
Upon receiving the request, Dapr runtime performs service discovery and actual invocation, including possible protocol translation between HTTP and gR",2025-06-01T11:12:00.000Z,reference,configuration,0.86,True,"Provides the exact Dapr invocation URL template (http://localhost:3500/v1.0/invoke/{app-id}[.{ns-name}]/method/{method-name}) and assumptions about sidecar deployment, which are highly specific configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/api-management/set-backend-service-policy,set-backend-service,Azure API Management policy reference - set-backend-service,Configure set-backend-service policy and backend entities,"Reference for the set-backend-service policy available for use in Azure API Management. Provides policy usage, settings, and examples.",APPLIES TO: All API Management tiers Use the set-backend-service policy to redirect an incoming request to a different backend than the one specified in the API settings for that operation. This policy changes the backend service base URL of the incoming request to a URL or backend specified in the policy. Referencing a backend entity allows you to manage the backend service base URL and other settings in a single place and reuse them across multiple APIs and operations. Also implement load balancing,2024-07-23T08:00:00.000Z,reference,configuration,0.82,True,"Explains how to override backend URLs, reference backend entities, and support load balancing, which are detailed APIM configuration behaviors.",unchanged
Provides policy usage, settings, and examples.",APPLIES TO: All API Management tiers Use the set-backend-service policy to redirect an incoming request to a different backend than the one specified in the API settings for that operation. This policy changes the backend service base URL of the incoming request to a URL or backend specified in the policy. Referencing a backend entity allows you to manage the backend service base URL and other settings in a single place and reuse them across multiple APIs and operations. Also implement load balancing,2026-04-24T06:15:00.000Z,reference,integrations,0.68,True,"Policy reference pages typically include product-specific policy syntax, attributes, and configuration options (for example, how to reference backend entities, allowed attribute names/values, and example configurations) that are unique to Azure API Management. This is expert, implementation-level knowledge that goes beyond generic API gateway concepts and fits best under integrations & coding patterns, since it defines how to programmatically route and configure backend services via policy.",updated https://learn.microsoft.com/en-us/azure/api-management/set-body-policy,set-body,Azure API Management policy reference - set-body,Configure set-body policy for API Management requests,"Reference for the set-body policy available for use in Azure API Management. Provides policy usage, settings, and examples.","APPLIES TO: All API Management tiers Use the set-body policy to set the message body for a request or response. To access the message body you can use the context.Request.Body property or the context.Response.Body, depending on whether the policy is in the inbound or outbound section. Important By default when you access the message body using context.Request.Body or context.Response.Body, the original message body is lost and must be set by returning the body back in the expression.
To preserve the bod",2024-07-23T08:00:00.000Z,reference,configuration,0.8,True,"Explains concrete usage of context.Request.Body/context.Response.Body and how the body is lost unless reset, plus policy XML, which are detailed, product-specific configuration behaviors.",unchanged https://learn.microsoft.com/en-us/azure/api-management/set-edit-policies,Set or edit policies,Set or Edit Azure API Management Policies,Configure and edit Azure API Management policy definitions,Configure policies at different scopes in Azure API Management by using the policy editor in the Azure portal.,APPLIES TO: All API Management tiers This article shows you how to configure policies in your API Management instance by editing policy definitions in the Azure portal. Each policy definition is an XML document that describes a sequence of inbound and outbound statements that run sequentially on an API request and response. The policy editor in the portal provides guided forms for API publishers to add and edit policies in policy definitions. You can also edit the XML directly in the policy code,2026-03-31T22:19:00.000Z,how-to,configuration,0.7,True,"Explains how to configure policies via XML policy definitions and the policy editor. Policy configuration in API Management is highly product-specific and typically includes concrete policy names, XML elements, and allowed settings, which qualifies as expert configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/api-management/set-header-policy,set-header,Azure API Management policy reference - set-header,Configure set-header policy in Azure API Management,"Reference for the set-header policy available for use in Azure API Management. Provides policy usage, settings, and examples.","APPLIES TO: All API Management tiers The set-header policy assigns a value to an existing HTTP response and/or request header or adds a new response and/or request header. Use the policy to insert a list of HTTP headers into an HTTP message.
When placed in an inbound pipeline, this policy sets the HTTP headers for the request being passed to the target service. When placed in an outbound pipeline, this policy sets the HTTP headers for the response being sent to the gateway’s client. Note Set the p",2025-05-27T17:04:00.000Z,reference,configuration,0.78,True,"Provides exact policy elements and attributes for adding or modifying HTTP headers in APIM pipelines, which are specific configuration parameters.",unchanged @@ -272,7 +272,7 @@ https://learn.microsoft.com/en-us/azure/api-management/validate-parameters-polic https://learn.microsoft.com/en-us/azure/api-management/validate-status-code-policy,validate-status-code,Azure API Management policy reference - validate-status-code,Validate HTTP status codes with validate-status-code policy,"Reference for the validate-status-code policy available for use in Azure API Management. Provides policy usage, settings, and examples.","APPLIES TO: All API Management tiers The validate-status-code policy validates the HTTP status codes in responses against the API schema. This policy may be used to prevent leakage of backend errors, which can contain stack traces. Note The maximum size of the API schema that can be used by this validation policy is 4 MB. If the schema exceeds this limit, validation policies will return errors on runtime. To increase it, please contact support. 
Note Set the policy's elements and child elements in t",2024-07-23T08:00:00.000Z,reference,limits-quotas,0.75,True,Policy reference includes a 4 MB maximum API schema size limit and notes on behavior when exceeded; explicit numeric limit plus configuration.,unchanged https://learn.microsoft.com/en-us/azure/api-management/virtual-network-concepts,Networking options,Azure API Management with an Azure virtual network,Choose and configure virtual network options for API Management,Learn about scenarios and requirements to secure inbound or outbound traffic for your API Management instance using an Azure virtual network.,"APPLIES TO: Developer | Basic | Basic v2 | Standard | Standard v2 | Premium | Premium v2 By default your API Management instance is accessed from the internet at a public endpoint, and acts as a gateway to public backends. API Management provides several options to use an Azure virtual network to secure access to your API Management instance and to backend APIs. Available options depend on the service tier of your API Management instance. Choose networking capabilities to meet your organization's 
Note To inject a Premium v2 instance in a virtual network, the requirements and configuration are different. Learn more",2025-07-08T17:24:00.000Z,concept-article,configuration,0.8,True,"Documents specific VNet resource requirements (subnets, address ranges, NSG rules) for Developer and Premium tiers, which are detailed configuration constraints.",unchanged -https://learn.microsoft.com/en-us/azure/api-management/virtual-network-reference,Virtual network configuration,VNet configuration settings,Configure VNet settings for Azure API Management,Reference for network configuration settings when deploying Azure API Management to a virtual network,"APPLIES TO: Developer | Premium This reference provides detailed network configuration settings for an API Management instance deployed (injected) in an Azure virtual network in the external or internal mode. For VNet connectivity options, requirements, and considerations, see Using a virtual network with Azure API Management. Important This reference applies only to API Management instances in the classic tiers deployed in a virtual network. For information about virtual network injection in the v2 ",2026-04-10T08:00:00.000Z,reference,configuration,0.86,True,"The page is a reference for network configuration settings for API Management in a virtual network, which typically includes specific setting names, required/allowed values, and mode-specific parameters (external vs internal). 
This matches the configuration sub-skill definition with product-specific parameters rather than conceptual guidance.",updated +https://learn.microsoft.com/en-us/azure/api-management/virtual-network-reference,Virtual network configuration,VNet configuration settings,Configure VNet settings for Azure API Management,Reference for network configuration settings when deploying Azure API Management to a virtual network,"APPLIES TO: Developer | Premium This reference provides detailed network configuration settings for an API Management instance deployed (injected) in an Azure virtual network in the external or internal mode. For VNet connectivity options, requirements, and considerations, see Using a virtual network with Azure API Management. Important This reference applies only to API Management instances in the classic tiers deployed in a virtual network. For information about virtual network injection in the v2 ",2026-04-10T08:00:00.000Z,reference,configuration,0.86,True,"The page is a reference for network configuration settings for API Management in a virtual network, which typically includes specific setting names, required/allowed values, and mode-specific parameters (external vs internal). This matches the configuration sub-skill definition with product-specific parameters rather than conceptual guidance.",unchanged https://learn.microsoft.com/en-us/azure/api-management/virtual-network-workspaces-resources,Virtual network for workspace gateways,Azure API Management workspace gateways - virtual network requirements,Configure virtual network requirements for API Management workspace gateways,Learn about requirements for network resources when you integrate or inject your API Management workspace gateway in an Azure virtual network.,"APPLIES TO: Premium Network isolation is an optional feature of an API Management workspace gateway. This article provides network resource requirements when you integrate or inject your gateway in an Azure virtual network. 
Some requirements differ depending on the desired inbound and outbound access mode. The following modes are supported: For background about networking options in API Management, see Use a virtual network to secure inbound or outbound traffic for Azure API Management. Note",2025-12-06T06:11:00.000Z,concept-article,configuration,0.75,True,"Provides detailed VNet resource requirements for workspace gateways across different access modes, which are specific configuration constraints.",unchanged https://learn.microsoft.com/en-us/azure/api-management/visual-studio-code-tutorial,10 - Manage APIs in Visual Studio Code,Tutorial: Use VS Code to Import and Manage APIs in Azure API Management,,"Learn how to use the Azure API Management extension for Visual Studio Code to import, test, and manage APIs.","APPLIES TO: Consumption | Developer | Basic | Standard | Premium In this tutorial, you learn how to use the Azure API Management extension for Visual Studio Code for common operations in API Management. You can use the familiar Visual Studio Code environment to import, update, test, and manage APIs. Note Currently, this feature isn't available in workspaces. 
In this article, you learn how to: For an introduction to more API Management features, see Import and publish your first API.",2026-03-27T22:21:00.000Z,tutorial,,0.2,False,"Tutorial for using the API Management VS Code extension; mainly operational walkthrough without detailed configuration parameter tables, limits, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/api-management/vscode-create-service-instance,Create an instance - Visual Studio Code,Quickstart - Create Azure API Management Instance - Visual Studio Code,,Use this quickstart to create an Azure API Management instance with the API Management extension for Visual Studio Code.,"APPLIES TO: Consumption | Developer | Basic | Standard | Premium This quickstart describes the steps to create a new API Management instance using the Azure API Management Extension for Visual Studio Code. After creating an instance, you can use the extension for common management tasks such as importing APIs in your API Management instance. Azure API Management helps organizations publish APIs to external, partner, and internal developers to unlock the potential of their data and services. API Man",2025-10-15T05:40:00.000Z,quickstart,,0.25,False,VS Code extension quickstart; primarily a tooling tutorial without deep configuration or product-specific patterns.,unchanged diff --git a/products/azure-api-management/report.md b/products/azure-api-management/report.md index 9c9ee276..f809852e 100644 --- a/products/azure-api-management/report.md +++ b/products/azure-api-management/report.md @@ -1,24 +1,24 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: - integrations: Patterns and samples for integrating API Management with AI/LLM backends, - OAuth/Graph, logging/monitoring, events, Service Bus, Service Fabric, MCP, and - importing/exporting APIs. 
+ integrations: Patterns and samples for integrating API Management with external + APIs, LLMs, backends, logging/monitoring tools, events, OAuth providers, and exporting/importing + API definitions. limits-quotas: 'Rate, quota, and validation limits in API Management: throttling, per-key quotas, OpenAI/LLM token control, protocol format limits, WebSocket/self-hosted gateway caps, and request/response validation.' - security: 'Securing API Management and gateways: authN/Z (OAuth2, Entra ID, B2C, - JWT, certs, mTLS), RBAC, managed identity, TLS/ciphers, DDoS/Defender, and secure - developer portal access.' + security: 'Securing Azure API Management: authN/authZ (Entra ID, B2C, OAuth2, JWT, + mTLS, basic), certificates, RBAC, managed identity, self-hosted gateway security, + DDoS/Defender, and compliance policies.' decision-making: Guidance for choosing APIM tiers/networking, scaling and cost planning, DevOps/CI/CD, migrations (ARM, portals, workspaces, Amazon API Gateway), and monetization/analytics transitions. troubleshooting: 'Diagnosing and fixing APIM issues: policies and error handling, request tracing/debugging, custom domain/Key Vault cert failures, SNAT timeouts, portal problems, and using Diagnose and Solve.' - deployment: 'Deploying and scaling APIM: multi-region, VNet and zone setups, self-hosted - gateways (AKS/K8s/Docker/Arc), backup/restore, migration, automation, and portal - deployment.' + deployment: 'Deploying and scaling APIM: multi-region, VNet and zone setups, backups/migration, + autoscale, automation, certificates, and self-hosted gateways (AKS, ACA, Docker, + Kubernetes, Arc).' 
best-practices: Best practices for caching, throttling/quotas, self-hosted gateway on Kubernetes, server-sent events, and securing APIs against OWASP API Top 10 in Azure API Management @@ -31,17 +31,17 @@ category_descriptions: skill_description: Expert knowledge for Azure API Management development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. - Use when managing APIM policies, self-hosted gateways, VNet/custom domains, OAuth/Entra - auth, or multi-region setups, and other Azure API Management related development + Use when designing APIM policies, VNet/self-hosted gateways, Entra/OAuth auth, multi-region + setups, or Front Door/App GW routes, and other Azure API Management related development tasks. Not for Azure Application Gateway (use azure-application-gateway), Azure - Front Door (use azure-front-door), Azure Api Center (use azure-api-center), Azure - App Service (use azure-app-service). -use_when: Use when managing APIM policies, self-hosted gateways, VNet/custom domains, - OAuth/Entra auth, or multi-region setups, and other Azure API Management related - development tasks. + Front Door (use azure-front-door), Azure Web Application Firewall (use azure-web-application-firewall), + Azure Logic Apps (use azure-logic-apps). +use_when: Use when designing APIM policies, VNet/self-hosted gateways, Entra/OAuth + auth, multi-region setups, or Front Door/App GW routes, and other Azure API Management + related development tasks. confusable_not_for: Not for Azure Application Gateway (use azure-application-gateway), - Azure Front Door (use azure-front-door), Azure Api Center (use azure-api-center), - Azure App Service (use azure-app-service). + Azure Front Door (use azure-front-door), Azure Web Application Firewall (use azure-web-application-firewall), + Azure Logic Apps (use azure-logic-apps). 
--- # Azure API Management Crawl Report @@ -54,10 +54,10 @@ confusable_not_for: Not for Azure Application Gateway (use azure-application-gat - **Unclassified**: 50 ### Incremental Update -- **New Pages**: 0 -- **Updated Pages**: 7 +- **New Pages**: 1 +- **Updated Pages**: 6 - **Unchanged**: 268 -- **Deleted Pages**: 0 +- **Deleted Pages**: 1 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-api-management/azure-api-management.csv` ## Classification Statistics @@ -66,33 +66,39 @@ confusable_not_for: Not for Azure Application Gateway (use azure-application-gat |------|-------|------------| | architecture-patterns | 3 | 1.1% | | best-practices | 6 | 2.2% | -| configuration | 98 | 35.6% | +| configuration | 96 | 34.9% | | decision-making | 17 | 6.2% | | deployment | 18 | 6.5% | -| integrations | 28 | 10.2% | +| integrations | 29 | 10.5% | | limits-quotas | 15 | 5.5% | -| security | 34 | 12.4% | +| security | 35 | 12.7% | | troubleshooting | 6 | 2.2% | | *(Unclassified)* | 50 | 18.2% | ## Changes +### New Pages + +- [Configure credential manager - GitHub API](https://learn.microsoft.com/en-us/azure/api-management/credentials-how-to-github) + ### Updated Pages -- [Configure update settings](https://learn.microsoft.com/en-us/azure/api-management/configure-service-update-settings) - - Updated: 2025-11-25T23:25:00.000Z → 2026-04-13T08:00:00.000Z - [Use managed identities for Azure resources](https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-use-managed-service-identity) - - Updated: 2025-12-19T18:11:00.000Z → 2026-04-16T22:31:00.000Z -- [Credential manager overview](https://learn.microsoft.com/en-us/azure/api-management/credentials-overview) - - Updated: 2025-09-29T17:17:00.000Z → 2026-04-10T08:00:00.000Z + - Updated: 2026-04-16T22:31:00.000Z → 2026-04-23T06:20:00.000Z - [Configure common credential providers](https://learn.microsoft.com/en-us/azure/api-management/credentials-configure-common-providers) - - Updated: 
2025-12-08T23:11:00.000Z → 2026-04-15T06:10:00.000Z -- [Virtual network configuration](https://learn.microsoft.com/en-us/azure/api-management/virtual-network-reference) - - Updated: 2025-06-17T08:00:00.000Z → 2026-04-10T08:00:00.000Z + - Updated: 2026-04-15T06:10:00.000Z → 2026-04-22T06:17:00.000Z +- [Configure credential manager - Microsoft Graph API](https://learn.microsoft.com/en-us/azure/api-management/credentials-how-to-azure-ad) + - Updated: 2025-12-08T08:00:00.000Z → 2026-04-22T06:17:00.000Z - [Regional availability](https://learn.microsoft.com/en-us/azure/api-management/api-management-region-availability) - - Updated: 2026-03-11T17:32:00.000Z → 2026-04-14T06:14:00.000Z + - Updated: 2026-04-14T06:14:00.000Z → 2026-04-25T17:14:00.000Z - [authentication-managed-identity](https://learn.microsoft.com/en-us/azure/api-management/authentication-managed-identity-policy) - - Updated: 2025-02-16T12:10:00.000Z → 2026-04-16T22:31:00.000Z + - Updated: 2026-04-16T22:31:00.000Z → 2026-04-23T06:20:00.000Z +- [set-backend-service](https://learn.microsoft.com/en-us/azure/api-management/set-backend-service-policy) + - Updated: 2024-07-23T08:00:00.000Z → 2026-04-24T06:15:00.000Z + +### Deleted Pages + +- ~~Configure credential manager - GitHub~~ (https://learn.microsoft.com/en-us/azure/api-management/credentials-how-to-github) ## Classified Pages @@ -128,14 +134,12 @@ confusable_not_for: Not for Azure Application Gateway (use azure-application-gat | [retry](https://learn.microsoft.com/en-us/azure/api-management/retry-policy) | configuration | 0.82 | Describes retry condition, retry count, and restrictions (cannot contain wait), which are concrete configuration parameters and behavior details. | | [send-request](https://learn.microsoft.com/en-us/azure/api-management/send-request-policy) | configuration | 0.82 | Includes configuration for sending outbound HTTP requests with a timeout parameter and policy XML, which are concrete APIM configuration options. 
| | [send-service-bus-message](https://learn.microsoft.com/en-us/azure/api-management/send-service-bus-message-policy) | configuration | 0.82 | Provides policy configuration for sending messages to Service Bus queues/topics, including tier applicability and options, which are detailed integration/configuration parameters. | -| [set-backend-service](https://learn.microsoft.com/en-us/azure/api-management/set-backend-service-policy) | configuration | 0.82 | Explains how to override backend URLs, reference backend entities, and support load balancing, which are detailed APIM configuration behaviors. | | [sql-data-source](https://learn.microsoft.com/en-us/azure/api-management/sql-data-source-policy) | configuration | 0.82 | Documents configuration of T-SQL requests to Azure SQL, tier support (not Consumption), and policy elements, which are specific configuration details. | | [validate-client-certificate](https://learn.microsoft.com/en-us/azure/api-management/validate-client-certificate-policy) | security | 0.82 | Describes validation rules and claims for client certificates; product-specific security policy configuration. | | [Access resource protected by network security perimeter](https://learn.microsoft.com/en-us/azure/api-management/using-network-security-perimeter) | security | 0.80 | Provides step-by-step configuration for network security perimeters, trusted access, and managed identity between API Management and a protected backend. | | [Authenticate and authorize to LLM APIs](https://learn.microsoft.com/en-us/azure/api-management/api-management-authenticate-authorize-ai-apis) | security | 0.80 | Describes concrete auth methods (API key, managed identity, OAuth 2.0) with policy usage; includes product-specific security configuration patterns and likely role/scope details. 
| | [Authenticate with Microsoft Entra ID - client secret](https://learn.microsoft.com/en-us/azure/api-management/self-hosted-gateway-enable-azure-ad) | security | 0.80 | Uses Entra app with client secret/cert; contains app registration details, scopes, and gateway config parameters specific to APIM. | | [Authenticate with Microsoft Entra ID - workload identity](https://learn.microsoft.com/en-us/azure/api-management/self-hosted-gateway-enable-workload-identity) | security | 0.80 | Shows how to use Entra workload identity; includes role assignments, identity configuration, and gateway settings unique to APIM. | -| [Configure credential manager - Microsoft Graph API](https://learn.microsoft.com/en-us/azure/api-management/credentials-how-to-azure-ad) | integrations | 0.80 | Provides step-by-step configuration of a managed connection to Microsoft Graph using Entra and authorization code flow, including provider-specific parameters. | | [Deploy a self-hosted gateway to Azure Kubernetes Service](https://learn.microsoft.com/en-us/azure/api-management/how-to-deploy-self-hosted-gateway-azure-kubernetes-service) | deployment | 0.80 | AKS deployment guide for self-hosted gateway; contains product-specific deployment YAML/Helm values, ports, and resource requirements. | | [Deploy a self-hosted gateway to Kubernetes (YAML)](https://learn.microsoft.com/en-us/azure/api-management/how-to-deploy-self-hosted-gateway-kubernetes) | deployment | 0.80 | Kubernetes YAML deployment article; includes container image, environment variables, and Kubernetes resource settings specific to APIM gateway. | | [Deploy to multiple Azure regions](https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-deploy-multi-region) | deployment | 0.80 | Details Premium multi-region deployment, including configuring regional scale units and availability zones, which are product-specific deployment constraints. 
| @@ -166,7 +170,7 @@ confusable_not_for: Not for Azure Application Gateway (use azure-application-gat | [Email notifications and templates](https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-configure-notifications) | configuration | 0.78 | Contains event types, notification settings, and template configuration specific to API Management; concrete configuration options rather than conceptual overview. | | [Export API as custom connector](https://learn.microsoft.com/en-us/azure/api-management/export-api-power-platform) | integrations | 0.78 | Provides concrete steps and parameters to export APIs as custom connectors for Copilot Studio, Power Apps, and Power Automate; product-specific integration pattern. | | [Manage protocols and ciphers](https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-manage-protocols-ciphers) | security | 0.78 | TLS/cipher management article for APIM; typically lists supported TLS versions, cipher suite names, and configuration switches—product-specific security settings and constraints. | -| [authentication-managed-identity](https://learn.microsoft.com/en-us/azure/api-management/authentication-managed-identity-policy) | security | 0.78 | Policy reference pages for Azure API Management typically list exact policy attributes, allowed values, and behavior details for using managed identities and Microsoft Entra ID tokens. This includes product-specific security configuration parameters (for example, policy XML elements/attributes, header names, token acquisition behavior, and caching semantics) that go beyond generic knowledge and match the security sub-skill criteria. 
| +| [authentication-managed-identity](https://learn.microsoft.com/en-us/azure/api-management/authentication-managed-identity-policy) | security | 0.78 | Policy reference page with product-specific security configuration details: exact policy name, attributes, how it obtains tokens from Entra ID, how tokens are cached and added to Authorization headers, and examples for system-assigned vs user-assigned managed identities. This is concrete, implementation-focused security configuration rather than conceptual guidance. | | [azure-openai-token-limit](https://learn.microsoft.com/en-us/azure/api-management/azure-openai-token-limit-policy) | configuration | 0.78 | Policy reference page with product-specific configuration elements and attributes (for example, rate, quota period, behavior on 429/403) and concrete usage examples. Contains named settings and their allowed values/semantics for this specific policy, which are not generic LLM knowledge. | | [cache-remove-value](https://learn.microsoft.com/en-us/azure/api-management/cache-remove-value-policy) | configuration | 0.78 | Provides policy syntax and behavior for deleting cache entries by key, which is specific to APIM cache configuration. | | [check-header](https://learn.microsoft.com/en-us/azure/api-management/check-header-policy) | configuration | 0.78 | Policy reference pages list policy XML shape, attributes, and allowed values (for headers, status codes, error messages) that are product-specific configuration details rather than generic concepts. | @@ -207,7 +211,6 @@ confusable_not_for: Not for Azure Application Gateway (use azure-application-gat | [Virtual network integration (v2 tiers)](https://learn.microsoft.com/en-us/azure/api-management/integrate-vnet-outbound) | configuration | 0.75 | Explains Standard v2/Premium v2 VNet integration for outbound calls, including connectivity requirements and supported scenarios, which are product-specific settings. 
| | [validate-content](https://learn.microsoft.com/en-us/azure/api-management/validate-content-policy) | limits-quotas | 0.75 | Policy reference lists supported schema formats/content types and includes a 4 MB maximum schema size limit; explicit numeric constraint and behavior. | | [validate-status-code](https://learn.microsoft.com/en-us/azure/api-management/validate-status-code-policy) | limits-quotas | 0.75 | Policy reference includes a 4 MB maximum API schema size limit and notes on behavior when exceeded; explicit numeric limit plus configuration. | -| [Configure credential manager - GitHub](https://learn.microsoft.com/en-us/azure/api-management/credentials-how-to-github) | integrations | 0.74 | How-to for managed GitHub connection with OAuth 2.0 authorization code flow; likely includes APIM credential manager parameters, redirect URIs, and provider-specific config values beyond generic OAuth theory. | | [Mutual certificate authentication for backend](https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-mutual-certificates) | security | 0.74 | The article provides product-specific security configuration details for Azure API Management, including how to manage client certificates, bind them to APIs/backends, and configure mutual TLS. It focuses on concrete configuration steps and settings unique to APIM security rather than generic TLS concepts, which qualifies as expert security knowledge. | | [Self-hosted gateway authentication options](https://learn.microsoft.com/en-us/azure/api-management/self-hosted-gateway-authentication-options) | security | 0.74 | Summarizes gateway authentication modes; references configuration settings for authenticating to cloud APIM endpoint—product-specific security configuration. 
| | [azure-openai-semantic-cache-lookup](https://learn.microsoft.com/en-us/azure/api-management/azure-openai-semantic-cache-lookup-policy) | configuration | 0.74 | The azure-openai-semantic-cache-lookup policy reference includes specific configuration such as similarity score thresholds, vector proximity behavior, and external cache settings, which are detailed product-specific configuration parameters. | @@ -232,7 +235,8 @@ confusable_not_for: Not for Azure Application Gateway (use azure-application-gat | [Call external services from policies](https://learn.microsoft.com/en-us/azure/api-management/api-management-sample-send-request) | integrations | 0.70 | Uses send-request and related policies with specific configuration to integrate external services; product-specific policy integration patterns. | | [Configure Front Door](https://learn.microsoft.com/en-us/azure/api-management/front-door-api-management) | architecture-patterns | 0.70 | Describes a concrete pattern using Front Door for global load balancing, TLS offload, caching, and WAF in front of API Management, with product-specific behavior. | | [Configure a GraphQL resolver](https://learn.microsoft.com/en-us/azure/api-management/configure-graphql-resolver) | integrations | 0.70 | Details resolver configuration and supported data sources for GraphQL fields; product-specific integration and configuration pattern. | -| [Configure common credential providers](https://learn.microsoft.com/en-us/azure/api-management/credentials-configure-common-providers) | configuration | 0.70 | Explains configuring Microsoft Entra and generic OAuth providers for managed connections with provider-specific settings and parameters in API Management’s credential manager, which are product-specific configuration details. 
| +| [Configure common credential providers](https://learn.microsoft.com/en-us/azure/api-management/credentials-configure-common-providers) | security | 0.70 | Covers configuring identity/credential providers (Microsoft Entra, generic OAuth) in the API Management credential manager with provider-specific settings. This is product-specific security/identity configuration rather than generic OAuth theory. | +| [Configure credential manager - Microsoft Graph API](https://learn.microsoft.com/en-us/azure/api-management/credentials-how-to-azure-ad) | integrations | 0.70 | Provides a concrete pattern for integrating Azure API Management with Microsoft Graph via a managed connection using Microsoft Entra, including grant type choice and provider configuration. This is a product-specific integration pattern rather than a generic tutorial. | | [Configure credential manager - user-delegated permissions](https://learn.microsoft.com/en-us/azure/api-management/credentials-how-to-user-delegated) | integrations | 0.70 | Guides configuring managed connections with user-delegated permissions to backend OAuth 2.0 APIs; likely details APIM credential manager settings and Entra/user delegation parameters not generally known. | | [Configure custom domain for self-hosted gateway](https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-configure-custom-domain-gateway) | configuration | 0.70 | Custom DNS/hostname mapping for gateway; likely includes specific configuration steps, certificate requirements, and APIM settings. | | [Configure update settings](https://learn.microsoft.com/en-us/azure/api-management/configure-service-update-settings) | configuration | 0.70 | Describes product-specific settings for service updates (upgrade groups, maintenance windows) with concrete configuration options and constraints for different API Management tiers, which are not generic knowledge. 
| @@ -272,14 +276,14 @@ confusable_not_for: Not for Azure Application Gateway (use azure-application-gat | [Protect your API with Azure AD B2C](https://learn.microsoft.com/en-us/azure/api-management/howto-protect-backend-frontend-azure-ad-b2c) | security | 0.70 | Shows a specific security configuration using AD B2C, Easy Auth, and PKCE SPA flow to protect APIs, which is product- and flow-specific. | | [Protect your API with Microsoft Entra ID](https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-protect-backend-with-aad) | security | 0.70 | Provides concrete configuration steps and parameters to secure APIs with OAuth 2.0 and Entra ID, beyond generic OAuth concepts. | | [Provision a self-hosted gateway](https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-provision-self-hosted-gateway) | deployment | 0.70 | Provisioning gateway resource is a deployment prerequisite; article likely includes resource-level constraints, required settings, and tier applicability for self-hosted gateway. | -| [Regional availability](https://learn.microsoft.com/en-us/azure/api-management/api-management-region-availability) | deployment | 0.70 | The page describes availability of API Management v2 tiers and workspace gateways by Azure region, supplementing product availability by region. This is deployment-related metadata (which SKUs can be deployed where), effectively a platform/region support matrix, which fits the deployment sub-skill focused on platform/tier support constraints. | +| [Regional availability](https://learn.microsoft.com/en-us/azure/api-management/api-management-region-availability) | deployment | 0.70 | Page provides a region-by-region availability matrix for API Management v2 tiers and workspace gateways, which is product- and time-specific information needed for deployment planning and not reliably known from model training. 
| | [Retrieve IP addresses](https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-ip-addresses) | configuration | 0.70 | Explains how to obtain public/private IPs and when they change, enabling precise firewall and routing configuration specific to API Management. | | [Reuse policy configurations](https://learn.microsoft.com/en-us/azure/api-management/policy-fragments) | configuration | 0.70 | Describes how to define and apply reusable policy XML fragments, including limitations; product-specific configuration pattern. | | [Self-hosted gateway support policy](https://learn.microsoft.com/en-us/azure/api-management/self-hosted-gateway-support-policies) | limits-quotas | 0.70 | Support policies article typically lists explicit limitations, supported versions, and constraints (for tiers, environments) that function as product-specific limits/quotas. | | [Send events to Event Grid](https://learn.microsoft.com/en-us/azure/api-management/how-to-event-grid) | integrations | 0.70 | Details specific API Management event types, subscription configuration, and Event Grid integration behavior, which are product-specific integration patterns. | | [Set or edit policies](https://learn.microsoft.com/en-us/azure/api-management/set-edit-policies) | configuration | 0.70 | Explains how to configure policies via XML policy definitions and the policy editor. Policy configuration in API Management is highly product-specific and typically includes concrete policy names, XML elements, and allowed settings, which qualifies as expert configuration knowledge. | | [Upgrade and scale](https://learn.microsoft.com/en-us/azure/api-management/upgrade-and-scale) | decision-making | 0.70 | Covers tier-specific scaling behavior, unit-based capacity guidance, and how to choose scale units/tiers for load, which is concrete decision guidance beyond generic knowledge. 
| -| [Use managed identities for Azure resources](https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-use-managed-service-identity) | security | 0.70 | Covers product-specific steps and parameters to create and use system-assigned and user-assigned managed identities for API Management to access other Azure resources, including Entra/Key Vault integration details. | +| [Use managed identities for Azure resources](https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-use-managed-service-identity) | security | 0.70 | Describes product-specific steps and settings to create and use system-assigned and user-assigned managed identities for API Management to access other Azure resources. This is concrete security/identity configuration (managed identity enablement and usage) rather than conceptual overview. | | [Virtual network changes (March 2023)](https://learn.microsoft.com/en-us/azure/api-management/breaking-changes/rp-source-ip-address-change-mar-2023) | configuration | 0.70 | Describes specific new source IP addresses per region and required network configuration updates, which are concrete, product-specific configuration details. | | [Virtual network changes (September 2023)](https://learn.microsoft.com/en-us/azure/api-management/breaking-changes/rp-source-ip-address-change-sep-2023) | configuration | 0.70 | Provides exact new source IP address for Switzerland North and steps to adjust network configuration, which are concrete configuration details. | | [cross-domain](https://learn.microsoft.com/en-us/azure/api-management/cross-domain-policy) | configuration | 0.70 | Policy reference for enabling Flash/Silverlight cross-domain access with specific configuration elements; product-specific policy configuration. 
| @@ -289,10 +293,12 @@ confusable_not_for: Not for Azure Application Gateway (use azure-application-gat | [validate-headers](https://learn.microsoft.com/en-us/azure/api-management/validate-headers-policy) | limits-quotas | 0.70 | Policy reference includes a specific maximum API schema size limit (4 MB) and behavior when exceeded; this is an explicit numeric limit/constraint. | | [validate-odata-request](https://learn.microsoft.com/en-us/azure/api-management/validate-odata-request-policy) | configuration | 0.70 | Policy reference for validating OData request URLs, headers, and parameters; includes product-specific configuration elements. | | [validate-parameters](https://learn.microsoft.com/en-us/azure/api-management/validate-parameters-policy) | limits-quotas | 0.70 | Includes explicit maximum API schema size (4 MB) and versioning caveats; numeric limit qualifies as limits-quotas plus configuration details. | +| [Configure credential manager - GitHub API](https://learn.microsoft.com/en-us/azure/api-management/credentials-how-to-github) | integrations | 0.68 | How-to for creating a managed connection from Azure API Management to GitHub using OAuth 2.0 authorization code flow. Likely includes product-specific configuration fields (connection/credential manager settings, callback URLs, scopes, parameter names) and step-by-step integration details that go beyond generic OAuth knowledge, fitting the integrations & coding patterns category. | | [Connect privately using private endpoint](https://learn.microsoft.com/en-us/azure/api-management/private-endpoint) | configuration | 0.68 | Page describes how to configure an inbound private endpoint for Azure API Management using Private Link, including product-specific settings and network configuration steps (VNet, private endpoint association, DNS behavior). This is concrete, product-specific configuration guidance rather than a conceptual overview, fitting the configuration sub-skill best. 
| | [Create subscriptions](https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-create-subscriptions) | configuration | 0.68 | Walkthrough for creating subscriptions; includes subscription key handling, scope options, and settings specific to APIM. | | [Metrics retirements (August 2023)](https://learn.microsoft.com/en-us/azure/api-management/breaking-changes/metrics-retirement-aug-2023) | configuration | 0.68 | Details specific metric names being retired and the replacement metric, along with guidance to update monitoring and alert rules—product-specific monitoring configuration knowledge. | | [Monitor API Management](https://learn.microsoft.com/en-us/azure/api-management/monitor-api-management) | configuration | 0.68 | Monitoring article usually lists specific metrics, log categories, and diagnostic settings for APIM in Azure Monitor—product-specific configuration details. | +| [set-backend-service](https://learn.microsoft.com/en-us/azure/api-management/set-backend-service-policy) | integrations | 0.68 | Policy reference pages typically include product-specific policy syntax, attributes, and configuration options (for example, how to reference backend entities, allowed attribute names/values, and example configurations) that are unique to Azure API Management. This is expert, implementation-level knowledge that goes beyond generic API gateway concepts and fits best under integrations & coding patterns, since it defines how to programmatically route and configure backend services via policy. | | [Manage user accounts](https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-create-or-invite-developers) | configuration | 0.66 | User management how-to; likely includes specific user states, operations, and REST API entity details unique to APIM. 
| [Trusted service connectivity retirement (March 2026)](https://learn.microsoft.com/en-us/azure/api-management/breaking-changes/trusted-service-connectivity-retirement-march-2026) | configuration | 0.66 | Details retirement of trusted service connectivity to specific Azure services and instructs using alternative networking options, which is product-specific networking configuration guidance. | | [Workspaces preview breaking changes, part 2 (March 2025)](https://learn.microsoft.com/en-us/azure/api-management/breaking-changes/workspaces-breaking-changes-march-2025) | decision-making | 0.66 | Explains removal of preview workspaces support, affected scenarios, and migration expectations, which is expert migration and decision-making content. | diff --git a/products/azure-app-service/azure-app-service.csv b/products/azure-app-service/azure-app-service.csv index 0e05e963..62096193 100644 --- a/products/azure-app-service/azure-app-service.csv +++ b/products/azure-app-service/azure-app-service.csv @@ -16,40 +16,40 @@ https://learn.microsoft.com/en-us/azure/app-service/app-service-plan-manage,Mana https://learn.microsoft.com/en-us/azure/app-service/app-service-undelete,Restore deleted app,Restore Deleted Apps - Azure App Service,,Restore a deleted app in Azure App Service to another similar app created in the same region.,"If you deleted an app in Azure App Service, you can restore the app and continue to use the existing contents and settings. The process overwrites another target app with the contents and settings of the deleted web app. There are several conditions for restoring a deleted app: This article describes how to restore a deleted web app by following procedures for the Azure portal or Azure PowerShell. 
You can also restore a deleted Azure Functions app.",2026-03-24T02:22:00.000Z,how-to,,0.3,False,"Describes the ability and conditions to restore deleted apps, but from the summary it appears to be a procedural how-to without detailed limits, configuration parameters, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/app-service/app-service-web-app-cloning,Clone app,Clone an App by Using PowerShell - Azure App Service,Clone Azure App Service apps using PowerShell,"Learn how to clone your App Service app to a new app by using PowerShell. Learn about various cloning scenarios, including Traffic Manager integration.","Note We recommend that you use the Azure Az PowerShell module to interact with Azure. To get started, see Install Azure PowerShell. To learn how to migrate to the Az PowerShell module, see Migrate Azure PowerShell from AzureRM to Az. This article explains how you can clone an existing App Service app to create a new app in a different region or in the same region. You can deploy multiple apps across different regions quickly and easily. App cloning is supported in Standard tiers and higher, and in",2025-06-05T05:17:00.000Z,how-to,deployment,0.7,True,"Explains cloning scenarios, supported tiers, and Traffic Manager integration; App Service–specific deployment pattern.",unchanged https://learn.microsoft.com/en-us/azure/app-service/app-service-web-configure-tls-mutual-auth,Configure TLS mutual authentication,Set Up TLS Mutual Authentication - Azure App Service,Configure TLS mutual authentication in Azure App Service,Learn how to set up TLS mutual authentication in Azure App Service to help secure two-way communication between client and server.,"Important Starting July 28, 2025, changes to App Service Managed Certificates (ASMC) will impact how certificates are issued and renewed in certain scenarios. While most customers don’t need to take action, we recommend reviewing our ASMC detailed blog post for more information. 
You can restrict access to your Azure App Service app by enabling various types of authentication for the app. One way to set up authentication is to request a client certificate when the client request is sent by using Tran",2025-05-09T08:00:00.000Z,how-to,security,0.82,True,"Mutual TLS setup includes App Service–specific settings (client cert mode, headers, validation behavior) and configuration parameters that are unique security configuration knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/app-service/app-service-web-tutorial-custom-domain,Connect a domain name,Set Up an Existing Custom Domain Name for Your App - Azure App Service,,Learn how to set up and map an existing custom domain name for your App Service app to improve branding and user access.,"Important As of July 28, 2025, changes to App Service Managed Certificates (ASMC) impact how certificates are issued and renewed in certain scenarios. While most customers don’t need to take action, we recommend reviewing our ASMC detailed blog post for more information. Azure App Service provides a highly scalable, self-patching web hosting service. This guide shows you how to map an existing custom Domain Name System (DNS) name to App Service. To migrate a live site and its DNS domain name to App",2026-04-14T06:14:00.000Z,how-to,,0.2,False,"The page is a step-by-step tutorial for mapping an existing custom domain to an App Service app. 
It appears to be procedural guidance without detailed configuration parameter tables, limits, error-code mappings, or decision matrices, so it does not meet the expert-knowledge criteria for any sub-skill type.",updated +https://learn.microsoft.com/en-us/azure/app-service/app-service-web-tutorial-custom-domain,Connect a domain name,Set Up an Existing Custom Domain Name for Your App - Azure App Service,,Learn how to set up and map an existing custom domain name for your App Service app to improve branding and user access.,"Important As of July 28, 2025, changes to App Service Managed Certificates (ASMC) impact how certificates are issued and renewed in certain scenarios. While most customers don’t need to take action, we recommend reviewing our ASMC detailed blog post for more information. Azure App Service provides a highly scalable, self-patching web hosting service. This guide shows you how to map an existing custom Domain Name System (DNS) name to App Service. To migrate a live site and its DNS domain name to App",2026-04-14T06:14:00.000Z,how-to,,0.2,False,"The page is a step-by-step tutorial for mapping an existing custom domain to an App Service app. It appears to be procedural guidance without detailed configuration parameter tables, limits, error-code mappings, or decision matrices, so it does not meet the expert-knowledge criteria for any sub-skill type.",unchanged https://learn.microsoft.com/en-us/azure/app-service/app-service-web-tutorial-dotnet-sqldatabase,ASP.NET with SQL DB,Tutorial: ASP.NET app with Azure SQL Database - Azure App Service,,Learn how to deploy a data-driven C# ASP.NET app to Azure App Service and connect it to Azure SQL Database.,"Azure App Service provides a highly scalable, self-patching web hosting service. This tutorial shows you how to deploy a data-driven ASP.NET app in App Service and connect it to Azure SQL Database. When you finish the tutorial, you have an ASP.NET app connected to an Azure SQL database running in Azure. 
The following example shows the app interface. In this tutorial, you:",2025-06-26T22:19:00.000Z,tutorial,,0.2,False,Tutorial for ASP.NET app with SQL Database; focuses on deployment steps rather than reusable configuration or limits.,unchanged https://learn.microsoft.com/en-us/azure/app-service/app-service-web-tutorial-rest-api,Deploy a REST API (tutorial),Tutorial: Host a RESTful API with CORS - Azure App Service,,Learn how Azure App Service helps you host your RESTful APIs with CORS support. App Service can host both front-end web apps and back-end APIs.,"Azure App Service provides a highly scalable self-patching web hosting service. In addition, App Service has built-in support for cross-origin resource sharing (CORS) for RESTful APIs. This tutorial shows how to deploy an ASP.NET Core API app to App Service with CORS support. You configure the app by using command-line tools and deploy the app by using Git. In this tutorial, you learn how to: You can complete this tutorial on macOS, Linux, or Windows. If you don't have an Azure account, create afre",2025-09-16T22:10:00.000Z,tutorial,,0.3,False,"Tutorial for hosting a REST API with CORS; focuses on basic deployment and CORS setup, not deep configuration references or troubleshooting matrices.",unchanged https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-ai-foundry-openapi-tool,Authenticate Microsoft Foundry tool calls,Secure OpenAPI tool calls from Foundry Agent Service - Azure App Service,Secure App Service OpenAPI tools for Foundry with Entra auth,"Configure Microsoft Entra authentication to secure Microsoft Foundry tool calls with managed identity, step by step.","This article shows you how to secure your App Service OpenAPI endpoints when they're called by Foundry Agent Service. When you add your App Service app as an OpenAPI tool in Microsoft Foundry, you can configure it to call your APIs anonymously without authentication, which is easier for development and testing. 
However, for production environments, you should use Microsoft Entra authentication with managed identity. This guide walks you through configuring managed identity authentication to enab",2025-12-16T18:11:00.000Z,how-to,security,0.82,True,"Step-by-step configuration of Entra authentication and managed identity for Foundry Agent Service calls, including scopes and settings unique to this integration.",unchanged -https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-api-version,Manage API versions,Manage AuthN/AuthZ API Versions - Azure App Service,Manage App Service authentication API and runtime versions,"Upgrade your App Service authentication API to V2 or pin it to a specific version, if needed.","This article shows you how to customize the API and runtime versions of the built-in authentication and authorization in App Service. There are two versions of the management API for App Service authentication. The V2 version is required for the ""Authentication"" experience in the Azure portal. An app already using the V1 API can upgrade to the V2 version once a few changes have been made. Specifically, secret configuration must be moved to slot-sticky application settings. This can be done automa",2023-10-12T17:01:00.000Z,how-to,configuration,0.78,True,"Contains specific configuration for pinning or upgrading auth API versions, including required changes (slot-sticky settings) and version identifiers.",unchanged +https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-api-version,Manage API versions,Manage Authentication and Authorization API Versions - Azure App Service,Manage App Service authentication API versions,Learn how to upgrade your Azure App Service authentication API to V2 or pin it to a specific version.,"This article describes how to customize the API and runtime versions of the built-in authentication and authorization in App Service. There are two versions of the management API for App Service authentication. 
The V2 version is required for the authentication experience in the Azure portal. An app already using the V1 API can upgrade to the V2 version after a few changes are made. Specifically, secret configuration must be moved to slot-sticky application settings. You can move secret configurat",2026-04-24T17:42:00.000Z,how-to,security,0.8,True,"Explains how to pin or upgrade App Service authentication/authorization API versions, including required changes like moving secrets to slot-sticky app settings. This is product-specific security and configuration behavior tied to auth API versions.",updated https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-customize-sign-in-out,Customize sign-ins/outs,Customize Sign-ins and Sign-outs - Azure App Service,Customize sign-in and sign-out behavior in App Service auth,Use the built-in authentication and authorization in Azure App Service and at the same time customize the sign-in and sign-out behavior.,This article shows you how to customize user sign-ins and sign-outs while using the built-in authentication and authorization in Azure App Service.,2025-04-04T05:11:00.000Z,how-to,security,0.74,True,"Shows how to adjust redirect URLs, routes, and auth settings for sign-in/out flows using Easy Auth, which are App Service–specific security behaviors.",unchanged -https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-file-based,Use file-based configuration,File-Based Configuration of AuthN/AuthZ - Azure App Service,Configure App Service authentication using file-based settings,Configure authentication and authorization in App Service using a configuration file to enable certain preview capabilities.,"With App Service authentication, the authentication settings can be configured with a file. You may need to use file-based configuration to use certain preview capabilities of App Service authentication / authorization before they're exposed via Azure Resource Manager APIs. 
Important Remember that your app payload, and therefore this file, may move between environments, as with slots. It's likely you'd want a different app registration pinned to each slot, and in these cases, you should continue to ",2026-03-10T08:00:00.000Z,how-to,configuration,0.8,True,"Describes file-based configuration for App Service AuthN/AuthZ, including a dedicated config file and settings that travel with the app payload; this implies specific configuration keys/values unique to this feature.",updated +https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-file-based,Use file-based configuration,File-Based Configuration of AuthN/AuthZ - Azure App Service,Configure App Service authentication using file-based settings,Configure authentication and authorization in App Service using a configuration file to enable certain preview capabilities.,"With App Service authentication, the authentication settings can be configured with a file. You may need to use file-based configuration to use certain preview capabilities of App Service authentication / authorization before they're exposed via Azure Resource Manager APIs. Important Remember that your app payload, and therefore this file, may move between environments, as with slots. 
It's likely you'd want a different app registration pinned to each slot, and in these cases, you should continue to ",2026-03-10T08:00:00.000Z,how-to,configuration,0.8,True,"Describes file-based configuration for App Service AuthN/AuthZ, including a dedicated config file and settings that travel with the app payload; this implies specific configuration keys/values unique to this feature.",unchanged https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-mcp,Overview,Configure MCP server authorization - Azure App Service,Configure MCP server authorization in Azure App Service,Learn how to configure Model Context Protocol (MCP) server authorization in Azure App Service and Azure Functions,"App Service Authentication allows you to control access to your Model Context Protocol (MCP) server by requiring MCP clients to authenticate with an identity provider. You can make your app comply with the MCP server authorization specification by following the instructions in this article. Important MCP server authorization defines access to the server, and it doesn't provide granular control to individual MCP tools or other constructs.",2025-11-05T05:32:00.000Z,how-to,security,0.82,True,"Details how to align App Service Authentication with the MCP server authorization spec, including specific settings and flows unique to MCP.",unchanged https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-mcp-server-vscode,Secure calls from Visual Studio Code,Secure MCP servers with Microsoft Entra authentication - Azure App Service,Secure MCP servers on App Service with Entra authentication,Configure Microsoft Entra authentication to secure your MCP server on Azure App Service and access it from Visual Studio Code.,"This article shows you how to secure your Model Context Protocol (MCP) server hosted on Azure App Service using Microsoft Entra authentication. 
By enabling authentication, you ensure that only users authenticated with Microsoft Entra can access your MCP server through Copilot agent mode in Visual Studio Code. For other authentication methods and general MCP server security concepts, see Secure a Model Context Protocol server in Azure App Service.",2025-11-20T06:12:00.000Z,how-to,security,0.8,True,Shows concrete configuration to protect MCP servers with Entra auth and access them from VS Code Copilot agent mode—product- and scenario-specific security setup.,unchanged https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-oauth-tokens,Work with tokens,Work with OAuth Tokens in Authentication and Authorization - Azure App Service,Manage OAuth tokens with App Service authentication,"Learn how to retrieve, refresh, and extend session expiration for OAuth tokens when you use Azure App Service built-in authentication and authorization.",This article shows you how to manage OAuth tokens for built-in authentication and authorization in Azure App Service.,2025-11-28T08:00:00.000Z,how-to,security,0.8,True,"Explains how to retrieve, refresh, and extend OAuth tokens via App Service auth endpoints and settings—detailed token management behavior unique to this feature.",unchanged https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-aad,Use Microsoft Entra,Configure Microsoft Entra Authentication - Azure App Service,Configure Microsoft Entra authentication for App Service,Learn how to configure Microsoft Entra authentication as an identity provider for your App Service or Azure Functions app.,Select another authentication provider to jump to it. 
This article shows you how to configure authentication for Azure App Service or Azure Functions so that your app signs in users with the Microsoft identity platform (Microsoft Entra) as the authentication provider.,2025-03-28T08:00:00.000Z,how-to,security,0.86,True,"Contains specific Entra app registration fields, redirect URIs, client IDs, and App Service auth settings required to wire up Entra ID—detailed security configuration.",unchanged -https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-apple,Use Apple sign-in (preview),Configure Sign in with Apple (Preview) - Azure App Service,Configure Sign in with Apple for App Service authentication,Learn how to configure Sign in with Apple as an identity provider for your App Service or Azure Functions app.,"This article shows you how to configure Azure App Service or Azure Functions to use Sign in with Apple as an authentication provider. To complete the procedure in this article, you must have enrolled in the Apple developer program. To enroll in the Apple developer program, go to developer.apple.com/programs/enroll. Caution Enabling Sign in with Apple will disable management of the App Service Authentication / Authorization feature for your application through some clients, such as the Azure porta",2021-09-23T17:02:00.000Z,how-to,security,0.8,True,"Includes Apple-specific keys, identifiers, and App Service auth settings, plus caveats about portal management—detailed security configuration.",unchanged -https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-facebook,Use Facebook,Configure Facebook Authentication - Azure App Service,Configure Facebook authentication for Azure App Service,Learn how to configure Facebook authentication as an identity provider for your App Service or Azure Functions app.,"This article shows how to configure Azure App Service or Azure Functions to use Facebook as an authentication provider. 
To complete the procedure in this article, you need a Facebook account that has a verified email address and a mobile phone number. To create a new Facebook account, go to facebook.com.",2025-08-15T17:11:00.000Z,how-to,security,0.8,True,"Includes provider-specific keys, callback URLs, and App Service auth settings for Facebook, which are concrete security configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-github,Use GitHub,Configure GitHub Authentication - Azure App Service,Configure GitHub authentication for Azure App Service,Learn how to configure GitHub authentication as an identity provider for your App Service or Azure Functions app.,"This article shows how to configure Azure App Service or Azure Functions to use GitHub as an authentication provider. To complete the procedure in this article, you need a GitHub account. To create a new GitHub account, go to GitHub.",2025-08-15T17:11:00.000Z,how-to,security,0.8,True,Shows how to set up GitHub as an identity provider with specific client ID/secret and redirect URL configuration in App Service.,unchanged +https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-apple,Use Apple sign-in (preview),Configure Sign in with Apple (Preview) - Azure App Service,Configure Sign in with Apple for App Service,Learn how to configure Sign in with Apple as an identity provider for your App Service or Azure Functions app.,"This article shows you how to configure Azure App Service or Azure Functions to use Sign in with Apple as an authentication provider. To complete the procedure in this article, you must enroll in the Apple developer program. To enroll in the Apple developer program, go to developer.apple.com/programs/enroll. 
Caution Enabling Sign in with Apple disables management of the App Service Authentication and Authorization feature for your application through some clients, such as the Azure portal, Azure ",2026-04-24T17:42:00.000Z,how-to,security,0.88,True,"Covers configuring Sign in with Apple as an identity provider, including Apple developer program requirements and a caution about disabling certain App Service auth management paths. These are specific security and auth configuration details.",updated +https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-facebook,Use Facebook,Configure Facebook Authentication - Azure App Service,Configure Facebook as App Service identity provider,Learn how to configure Facebook authentication as an identity provider for your App Service or Azure Functions app.,"This article shows how to configure Azure App Service or Azure Functions to use Facebook as an authentication provider. To complete the procedure in this article, you need a Facebook account that has a verified email address and a mobile phone number. To create a new Facebook account, go to facebook.com.",2026-04-24T18:40:00.000Z,how-to,security,0.86,True,"Shows exact steps and settings to configure Facebook as an auth provider for App Service/Azure Functions, including provider-specific parameters and App Service auth configuration. This is concrete IAM configuration, not just conceptual guidance.",updated +https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-github,Use GitHub,Configure GitHub Authentication - Azure App Service,Configure GitHub authentication for App Service,Learn how to configure GitHub authentication as an identity provider for your App Service or Azure Functions app.,"This article shows how to configure Azure App Service or Azure Functions to use GitHub as an authentication provider. To complete the procedure in this article, you need a GitHub account. 
To create a new GitHub account, go to GitHub.",2026-04-24T17:42:00.000Z,how-to,security,0.86,True,"Provides specific configuration for using GitHub as an identity provider in App Service/Azure Functions, including provider setup and App Service auth settings. These are product-specific security and identity configurations.",updated https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-google,Use Google,Configure Google Authentication - Azure App Service,Configure Google authentication for Azure App Service,Learn how to configure Google authentication as an identity provider for your App Service or Azure Functions app.,"This article shows you how to configure Azure App Service or Azure Functions to use Google as an authentication provider. To complete the procedure, you must have a Google account that has a verified email address. To create a new Google account, go to accounts.google.com.",2025-09-01T17:17:00.000Z,how-to,security,0.8,True,"Provides Google OAuth client configuration and App Service auth settings, including exact fields and URLs needed for secure integration.",unchanged https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-openid-connect,Use OpenID Connect,Configure an OpenID Connect Provider - Azure App Service,Configure custom OpenID Connect providers for App Service,Learn how to configure an OpenID Connect provider as an identity provider for your App Service or Azure Functions app.,This article shows you how to configure Azure App Service or Azure Functions to use a custom authentication provider that adheres to the OpenID Connect (OIDC) specification. OIDC is an industry standard that many identity providers use. You don't need to understand the details of the specification to use an OIDC identity provider for your app. You can configure your app to use one or more OIDC providers. You must give each OIDC provider a unique friendly name in the app configuration. 
Only one pr,2025-09-01T17:17:00.000Z,how-to,security,0.84,True,"Explains how to add OIDC providers with specific metadata endpoints, client IDs, and App Service auth configuration fields—product-specific security setup.",unchanged -https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-twitter,Use X,Configure X Authentication - Azure App Service,Configure X (Twitter) authentication for Azure App Service,Learn how to configure X authentication as an identity provider for your App Service or Azure Functions app.,"This article shows how to configure Azure App Service or Azure Functions to use X as an authentication provider. To complete the procedure in this article, you need an X account that has a verified email address and phone number. To create a new X account, go tox.com.",2025-08-15T17:11:00.000Z,how-to,security,0.8,True,"Contains provider-specific configuration steps and App Service auth settings for X, which are detailed security integration parameters.",unchanged +https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-twitter,Use X,Configure X Authentication - Azure App Service,Configure X (Twitter) authentication for App Service,Learn how to configure X authentication as an identity provider for your App Service or Azure Functions app.,"This article shows you how to configure Azure App Service or Azure Functions to use X as an authentication provider. To complete the procedure in this article, you need an X account that has a verified email address and phone number. To create a new X account, go to x.com.",2026-04-24T17:42:00.000Z,how-to,security,0.86,True,"Details how to configure X/Twitter as an auth provider, including account requirements and App Service/Azure Functions auth settings. 
This is concrete IAM configuration unique to this integration.",updated https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-user-identities,Access user identities,Work with User Identities in AuthN/AuthZ - Azure App Service,Access user identities with App Service authentication,Learn how to access user identities when you use the built-in authentication and authorization in Azure App Service.,This article shows you how to work with user identities when you use built-in authentication and authorization in Azure App Service.,2025-07-03T17:22:00.000Z,how-to,security,0.76,True,"Describes how identity claims are exposed (headers, tokens, environment) and how to use them in code—platform-specific security behavior.",unchanged https://learn.microsoft.com/en-us/azure/app-service/configure-basic-auth-disable,Disable basic auth,Disable Basic Authentication for Deployment - Azure App Service,Disable basic auth for App Service deployments securely,Learn about disabling basic authentication for increased security and how it affects App Service deployments.,"This article discusses how to disable basic username and password authentication for deploying code to Azure App Service apps. The article explains several ways to disable basic authentication, fallback deployment methods if any, and how to monitor basic authentication access attempts. App Service provides basic authentication for FTP and Web Deploy clients to connect using username and password deployment credentials. 
The basic authentication APIs are good for browsing your site's file system, ",2025-06-16T11:11:00.000Z,how-to,security,0.75,True,"Explains concrete ways to disable basic auth for FTP/Web Deploy, fallback deployment methods, and how to monitor access attempts; this is product-specific security configuration beyond generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/app-service/configure-common,Configure common settings,Configure an App Service App - Azure App Service,Configure common settings for Azure App Service apps,"Learn how to configure common settings for an Azure App Service app. You can use the Azure portal, Azure CLI, or Azure PowerShell.","This article explains how to configure common settings for web apps, a mobile back end, or an API app. For Azure Functions, see App settings reference for Azure Functions.",2025-03-27T08:00:00.000Z,how-to,configuration,0.75,True,"Explains concrete App Service app settings (like platform, general settings, connection strings) with specific option names and allowed values—product-specific configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/app-service/configure-connect-to-azure-storage,Mount Azure Storage,Mount Azure Storage as a Local Share - Azure App Service,Mount Azure Storage file shares in App Service,"Learn how to attach custom network share in Azure App Service. Share files between apps, manage static content remotely and access locally.","Azure Storage is Microsoft's cloud storage solution for modern data storage scenarios. Azure Storage offers highly available, massively scalable, durable, and secure storage for data objects in the cloud. This guide shows how to mount Azure Storage Files as a network share in Windows code (noncontainer) in Azure App Service. Azure Storage supports Azure Files Shares and Premium Files Shares for App Service. Azure Storage isn't the default storage for App Service. It's billed separately. 
You can also",2026-02-09T08:00:00.000Z,how-to,configuration,0.8,True,"Describes how to configure Azure Files and Premium Files as network shares for App Service, including mount settings—product-specific configuration.",unchanged -https://learn.microsoft.com/en-us/azure/app-service/configure-custom-container,Configure custom container,Configure a Custom Container - Azure App Service,Configure custom containers for Azure App Service,Learn how to configure a custom container in Azure App Service. This article shows the most common configuration tasks.,This article shows you how to configure a custom container to run on Azure App Service. Learn about key concepts and get instructions for containerization of Windows apps in App Service.,2026-04-08T08:00:00.000Z,how-to,configuration,0.75,True,"Article explicitly about configuring custom containers on App Service and ‘most common configuration tasks’. These pages usually contain container-specific App Service settings (WEBSITES_PORT, startup commands, image settings) and configuration parameters unique to this product, fitting the configuration sub-skill.",updated +https://learn.microsoft.com/en-us/azure/app-service/configure-custom-container,Configure custom container,Configure a Custom Container - Azure App Service,Configure custom containers for Azure App Service,Learn how to configure a custom container in Azure App Service. This article shows the most common configuration tasks.,This article shows you how to configure a custom container to run on Azure App Service. Learn about key concepts and get instructions for containerization of Windows apps in App Service.,2026-04-08T08:00:00.000Z,how-to,configuration,0.75,True,"Article explicitly about configuring custom containers on App Service and ‘most common configuration tasks’. 
These pages usually contain container-specific App Service settings (WEBSITES_PORT, startup commands, image settings) and configuration parameters unique to this product, fitting the configuration sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/app-service/configure-domain-traffic-manager,Use Traffic Manager with a domain,Set up Traffic Manager for Your Domain - Azure App Service,Configure Traffic Manager with App Service custom domains,Discover how to use Azure Traffic Manager with a custom domain to improve app performance and global availability.,"Important Starting July 28, 2025, changes to App Service Managed Certificates (ASMC) will impact how certificates are issued and renewed in certain scenarios. While most customers don’t need to take action, we recommend reviewing our ASMC detailed blog post for more information. When you use Azure Traffic Manager to load balance traffic to Azure App Service, the App Service app can be accessed using .trafficmanager.net. You can assign a custom domain name, such as www.contoso.c",2025-02-27T23:02:00.000Z,how-to,configuration,0.7,True,"Shows how to integrate Traffic Manager endpoints with custom domains for App Service, including DNS and routing configuration—service-specific integration settings.",unchanged
These services are used when you run your app from a deployment package.,2022-03-24T11:04:00.000Z,how-to,security,0.7,True,Describes how to use Storage and Key Vault with run-from-package to encrypt application data at rest; involves product-specific secure configuration patterns.,unchanged +https://learn.microsoft.com/en-us/azure/app-service/configure-encrypt-at-rest-using-cmk,Encrypt site data,Encrypt Your Application Source at Rest - Azure App Service,Encrypt Azure App Service app content with customer-managed keys,Learn how to encrypt your application data in Azure Storage and deploy it as a package file.,Encrypting your web app's application data at rest requires an Azure Storage account and Azure Key Vault. These services are used when you run your app from a deployment package.,2026-04-24T17:42:00.000Z,how-to,security,0.7,True,"Describes how to configure encryption at rest for App Service app content using Azure Storage and Key Vault, including product-specific security configuration steps and parameters for CMK-based encryption when running from a package file.",updated https://learn.microsoft.com/en-us/azure/app-service/configure-error-pages,Configure error pages,Configure error pages on App Service - Azure App Service,Configure custom error pages in Azure App Service,Learn how to configure the custom error pages available on Azure App Service.,"Azure App Service lets you configure specific error pages to present to your web app users instead of the default error pages. This article explains how to configure these custom error pages for your web app. The three types of error code pages available for customization in App Service are 403 Access restrictions, 502 Gateway errors, and 503 Service unavailable. 
This article walks through adding a custom 403 error page to a web app hosted on App Service, and testing it by using an IP restriction.",2026-02-26T06:24:00.000Z,how-to,configuration,0.7,True,"Shows how to configure specific error pages (403, 502, 503) with App Service–specific settings and behaviors, which are concrete configuration details.",unchanged https://learn.microsoft.com/en-us/azure/app-service/configure-gateway-required-vnet-integration,Configure gateway-required integration,Configure gateway-required virtual network integration for your app - Azure App Service,Configure gateway-required VNet integration for App Service,Integrate your app in Azure App Service with Azure virtual networks using gateway-required virtual network integration.,"Important Gateway-required virtual network integration is being retired on March 31, 2027. For more information, see the retirement announcement and Migrate from SSTP to IKEv2 or OpenVPN. We recommend migrating to regional virtual network integration, which mitigates the limitations of gateway-required virtual network integration. Gateway-required virtual network integration supports connecting to a virtual network in another region or to a classic virtual network. Gateway-required virtual network i",2026-01-30T08:00:00.000Z,how-to,configuration,0.7,True,"Describes legacy gateway-based integration, limitations, and setup; detailed product-specific networking configuration.",unchanged https://learn.microsoft.com/en-us/azure/app-service/configure-grpc,gRPC configuration,Configure gRPC on App Service - Azure App Service,Configure gRPC applications on Azure App Service for Linux,Learn how to configure a Google Remote Procedure Call (gRPC) application with Azure App Service on Linux.,"This article explains how to configure your web app for gRPC, a remote procedure call framework that can streamline messages between your client and server over HTTP/2. 
Using the gRPC protocol over HTTP/2 lets you use features like: Support for gRPC is available on Azure App Service for Linux. To use gRPC on your web app, you configure your app by selecting the HTTP version, proxy, and port. For gRPC client and server samples for each supported language, see gRPC on App Service on GitHub.",2026-02-26T06:24:00.000Z,how-to,configuration,0.75,True,"Explains selecting HTTP version, proxy, and port for gRPC; these are concrete App Service configuration parameters and values.",unchanged https://learn.microsoft.com/en-us/azure/app-service/configure-language-dotnet-aspire,Aspire,Configure Aspire apps - Azure App Service,Configure Aspire applications on Azure App Service,"Learn how to configure Aspire apps deployed to Azure App Service, including App Service plan settings, Application Insights, dashboard, and health probes.","This article describes how to configure Aspire apps deployed to Azure App Service. Aspire provides a streamlined, opinionated way to build observable, production-ready cloud-native applications, and App Service integration allows you to customize the underlying Azure infrastructure through code. If you haven't deployed an Aspire app to App Service yet, see the quickstart guide first.",2026-02-02T18:12:00.000Z,how-to,configuration,0.7,True,"Describes configuring Aspire apps including App Service plan settings, Application Insights, dashboards, and health probes—product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/app-service/configure-language-dotnet-framework,ASP.NET,Configure ASP.NET Apps - Azure App Service,Configure ASP.NET apps on Azure App Service,Learn how to configure an ASP.NET app in Azure App Service. This article shows the most common configuration tasks.,"Note For ASP.NET Core, see Configure an ASP.NET Core app for Azure App Service. If your ASP.NET app runs in a custom Windows or Linux container, see Configure a custom container for Azure App Service. 
ASP.NET apps must be deployed to Azure App Service as compiled binaries. The Visual Studio publishing tool builds the solution and then deploys the compiled binaries directly. The App Service deployment engine deploys the code repository first and then compiles the binaries. This guide provides key c",2025-07-07T08:00:00.000Z,concept-article,configuration,0.7,True,"Article is explicitly about configuration of ASP.NET apps on App Service and typically includes product-specific settings, connection strings, and configuration options beyond generic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/app-service/configure-language-dotnetcore,ASP.NET Core,Configure ASP.NET Core apps - Azure App Service,Configure ASP.NET Core apps on Azure App Service,Learn how to configure an ASP.NET Core app in native Windows instances or in a prebuilt Linux container in Azure App Service.,"Note For ASP.NET in .NET Framework, see Configure an ASP.NET app for Azure App Service. If your ASP.NET Core app runs in a custom Windows or Linux container, see Configure a custom container for Azure App Service. ASP.NET Core apps must be deployed to Azure App Service as compiled binaries. The Visual Studio publishing tool builds the solution and then deploys the compiled binaries directly. The App Service deployment engine deploys the code repository first and then compiles the binaries. 
This gu",2025-07-07T08:00:00.000Z,how-to,configuration,0.7,True,"Focuses on configuring ASP.NET Core apps in App Service, likely with specific app settings, environment variables, and platform behaviors unique to the service.",unchanged -https://learn.microsoft.com/en-us/azure/app-service/configure-language-java-apm,APM integration,"Configure APM Platforms for Tomcat, JBoss, or Java SE Apps - Azure App Service",Configure APM for Java apps on Azure App Service,"Learn how to configure APM platforms, such as Application Insights, New Relic, and AppDynamics, for Tomcat, JBoss, or Java SE app on Azure App Service.","This article shows how to connect Java applications deployed on Azure App Service with Azure Monitor Application Insights, NewRelic, and AppDynamics application performance monitoring (APM) platforms. Azure App Service runs Java web applications on a fully managed service in three variants: Note JBoss EAP on App Service now supports ""Bring Your Own License"" (BYOL) billing, this allows customers with existing Red Hat subscriptions to apply those licenses directly to their JBoss EAP deployments on",2025-08-15T17:11:00.000Z,how-to,configuration,0.7,True,"Shows how to connect Java apps to Application Insights, New Relic, and AppDynamics with product-specific configuration parameters.",unchanged -https://learn.microsoft.com/en-us/azure/app-service/configure-language-java-data-sources,Data sources,"Configure Data Sources for Tomcat, JBoss, or Java SE Apps - Azure App Service",Configure Java data sources on Azure App Service,"Learn how to configure data sources for Tomcat, JBoss, or Java SE apps on Azure App Service, including native Windows and Linux container variants.","This article shows how to configure data sources in a Java SE, Tomcat, or JBoss app in App Service. 
Azure App Service runs Java web applications on a fully managed service in three variants: Note JBoss EAP on App Service now supports ""Bring Your Own License"" (BYOL) billing, this allows customers with existing Red Hat subscriptions to apply those licenses directly to their JBoss EAP deployments on Azure App Service.Learn more.",2025-09-25T11:34:00.000Z,how-to,configuration,0.75,True,"Explains configuring data sources for Java SE, Tomcat, and JBoss on App Service, with specific parameters and connection settings.",unchanged +https://learn.microsoft.com/en-us/azure/app-service/configure-language-java-apm,APM integration,"Configure APM Platforms for Tomcat, JBoss, or Java SE Apps - Azure App Service",Integrate Java apps with APM tools on App Service,"Learn how to configure APM platforms, such as Application Insights, New Relic, and AppDynamics, for Tomcat, JBoss, or Java SE app on Azure App Service.","This article shows how to connect Java applications deployed on Azure App Service with Azure Monitor Application Insights, New Relic, and AppDynamics application performance monitoring (APM) platforms. Azure App Service runs Java web applications in three types on a fully managed service: Note JBoss EAP on App Service now supports Bring Your Own License (BYOL) billing. BYOL enables customers who have existing Red Hat subscriptions to apply those licenses directly to their JBoss EAP deployments o",2026-04-24T17:42:00.000Z,how-to,integrations,0.75,True,"Describes connecting Java apps on App Service with Application Insights, New Relic, and AppDynamics. 
This usually involves product-specific agent configuration, environment variables, and parameter names for each APM platform, fitting the integrations & coding patterns category.",updated +https://learn.microsoft.com/en-us/azure/app-service/configure-language-java-data-sources,Data sources,"Configure Data Sources for Tomcat, JBoss, or Java SE Apps - Azure App Service",Configure Java data sources on Azure App Service,"Learn how to configure data sources for Tomcat, JBoss, or Java SE apps on Azure App Service, including native Windows and Linux container variants.","This article shows how to configure data sources in a Java SE, Tomcat, or JBoss app in App Service. Azure App Service runs Java web applications in three types on a fully managed service: Note JBoss EAP on App Service now supports Bring Your Own License (BYOL) billing. BYOL enables customers who have existing Red Hat subscriptions to apply those licenses directly to their JBoss EAP deployments on Azure App Service. For more information, see BYOL Support for JBoss EAP on App Service.",2026-04-24T18:40:00.000Z,how-to,configuration,0.75,True,"Article focuses on configuring data sources for Java SE, Tomcat, and JBoss on App Service. It typically includes product-specific configuration parameters (JDBC URLs, connection pool settings, environment variables) and platform-specific options, which match the configuration sub-skill.",updated
If it's your first time using Azure App Service, you should first read through the Java quickstart. You can find the answers to general questions about using App Service that aren't specific to Java development in the App Service FAQ. Azure App Service runs Java web applications on a fully managed service in three variants: Note JBoss EAP on App Service now supports ""Bring Your Own Licen",2025-08-12T08:00:00.000Z,how-to,configuration,0.75,True,"Covers deployment and runtime configuration for Tomcat, JBoss, and Java SE, including Java versions and logging—product-specific configuration options.",unchanged -https://learn.microsoft.com/en-us/azure/app-service/configure-language-java-security,Security,"Configure Security for Tomcat, JBoss, or Java SE Apps - Azure App Service",Configure security for Java apps on Azure App Service,"Learn how to configure security for Tomcat, JBoss, or Java SE apps on Azure App Service, such as authentication, Key Vault references, and Java key store.","This article shows how to configure Java-specific security settings in App Service. Java applications running in App Service have the same set ofsecurity best practicesas other applications. 
Azure App Service runs Java web applications on a fully managed service in three variants: Note JBoss EAP on App Service now supports ""Bring Your Own License"" (BYOL) billing, this allows customers with existing Red Hat subscriptions to apply those licenses directly to their JBoss EAP deployments on Azure App",2025-08-15T17:11:00.000Z,how-to,security,0.7,True,"Java-specific security configuration including authentication, Key Vault references, and Java keystore settings—product-specific security details.",unchanged +https://learn.microsoft.com/en-us/azure/app-service/configure-language-java-security,Security,"Configure Security for Tomcat, JBoss, or Java SE Apps - Azure App Service",Configure security for Java apps on Azure App Service,"Learn how to configure security for Tomcat, JBoss, or Java SE apps on Azure App Service, such as authentication, Key Vault references, and Java key store.",This article shows how to configure Java-specific security settings in App Service. Java applications that run in App Service have the same set of security best practices as other applications. Azure App Service runs Java web applications in three types on a fully managed service: Note JBoss EAP on App Service now supports Bring Your Own License (BYOL) billing. BYOL enables customers who have existing Red Hat subscriptions to apply those licenses directly to their JBoss EAP deployments on Azure Ap,2026-04-24T17:42:00.000Z,how-to,security,0.8,True,"Covers Java-specific security settings such as authentication integration, Key Vault references, and Java keystore configuration on App Service. 
These are concrete, product-specific security configurations and patterns rather than generic security concepts.",updated https://learn.microsoft.com/en-us/azure/app-service/configure-language-nodejs,Configure,Configure Node.js Apps - Azure App Service,Configure Node.js applications on Azure App Service,"Learn how to configure a Node.js app in the native Windows instances, or in a prebuilt Linux container, in Azure App Service.","Node.js apps must be deployed with all the required npm dependencies. The App Service deployment engine automatically runs npm install --production when you deploy a Git repository or when you deploy a Zip package with build automation enabled. If you deploy your files by using FTP/S, however, you need to upload the required packages manually. This article describes key concepts and provides instructions for Node.js developers who deploy to App Service. If you've never used Azure App Service, complete ",2025-09-08T22:37:00.000Z,how-to,configuration,0.7,True,"Describes how App Service runs Node.js, npm install behavior, and deployment engine details; likely includes specific setting names and behaviors that are product-specific configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/app-service/configure-language-php,Configure,Configure a PHP App - Azure App Service,Configure PHP applications on Azure App Service,Learn how to configure a PHP app in a prebuilt PHP container in Azure App Service. This article shows the most common configuration tasks.,"Warning PHP on Windows reached the end of support in November 2022. PHP is supported only for App Service on Linux. This article is for reference only. This guide shows you how to configure your PHP web apps, mobile back ends, and API apps in Azure App Service. The most common configuration tasks are covered. 
If you're new to App Service, you should first follow the Create a PHP web app in Azure App Service quickstart and the Deploy a PHP, MySQL, and Redis app to Azure App Service tutorial.",2025-09-14T08:00:00.000Z,how-to,configuration,0.7,True,Covers common PHP app configuration tasks on App Service; likely includes specific setting names and behaviors that are product-specific configuration knowledge.,unchanged https://learn.microsoft.com/en-us/azure/app-service/configure-language-python,Configure,Configure Linux Python Apps - Azure App Service,Configure Python apps on Azure App Service Linux,Learn how to configure the Python container in which web apps are run by using both the Azure portal and the Azure CLI.,"This article describes how Azure App Service runs Python apps, how you can migrate existing apps to Azure, and how you can customize the behavior of App Service when you need to. Python apps must be deployed with all the required pip modules. The App Service deployment engine automatically activates a virtual environment and installs dependencies from a requirements.txt, pyproject.toml, or setup.py file when you deploy a Git repository or when you deploy a zip package with build automation enabled. This art",2025-09-08T22:37:00.000Z,quickstart,configuration,0.75,True,"Explains how App Service runs Python, virtual environment activation, and dependency installation from requirements files; likely includes specific configuration behaviors and settings that are product-specific.",unchanged @@ -77,7 +77,7 @@ https://learn.microsoft.com/en-us/azure/app-service/deploy-intelligent-apps-dotn https://learn.microsoft.com/en-us/azure/app-service/deploy-local-git,Use local Git,Deploy From a Local Git Repository - Azure App Service,Deploy from a local Git repository to App Service,Learn how to configure and carry out local Git deployment to Azure App Service.,"One of the simplest ways to deploy code is from your local computer. 
This article shows you how to deploy your app to Azure App Service from a Git repository on your local computer. Note Local Git deployment requires Source Control Manager (SCM) basic authentication, which is less secure than other deployment methods. If basic authentication is disabled, you can't configure local Git deployment in the app's Deployment Center.",2025-06-27T17:46:00.000Z,how-to,deployment,0.75,True,"Covers local Git deployment specifics, including requirement for SCM basic authentication and its security implications—product-specific deployment details.",unchanged https://learn.microsoft.com/en-us/azure/app-service/deploy-run-package,Run from package,Run Your App from a ZIP Package - Azure App Service,Deploy and run Azure App Service apps from ZIP packages,Deploy your app's ZIP package with atomicity. Improve the predictability and reliability of your app's behavior during the ZIP deployment process.,"Note Run from package is not supported for Python apps. When deploying a ZIP file of your Python code, you need to set a flag to enable Azure build automation. The build automation will create the Python virtual environment for your app and install any necessary requirements and package needed. See build automation for more details. Run from package is also not supported for Java apps on Azure App Service. Built‑in Java runtimes (Java SE, Tomcat, and JBoss EAP) require write access to the app dire",2026-04-01T17:25:00.000Z,how-to,deployment,0.8,True,"The article describes product-specific deployment behavior and constraints for 'Run from package', including unsupported runtimes (Python, Java), required flags for Python build automation, and how the platform treats ZIP packages. 
These are concrete deployment constraints and platform behaviors by runtime/plan, matching the deployment sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/app-service/deploy-staging-slots,Create staging environments,Set Up Staging Environments - Azure App Service,Configure deployment slots and staging for App Service,Learn how to deploy apps to a nonproduction slot and automatically swap into production. Increase the reliability and eliminate app downtime from deployments.,"When you deploy your web app, web app on Linux, mobile back end, or API app to Azure App Service, you can use a separate deployment slot instead of the default production slot. This approach is available if you run in the Standard, Premium, or Isolated tier of an App Service plan. Deployment slots are live apps with their own host names. App content and configuration elements can be swapped between two deployment slots, including the production slot. Deploying your application to a nonproduction ",2025-11-28T18:11:00.000Z,how-to,deployment,0.8,True,"Explains deployment slots, swap behavior, and tier requirements (Standard, Premium, Isolated)—App Service–specific deployment patterns and constraints.",unchanged
This is concrete, implementation-focused deployment guidance beyond generic knowledge, but not primarily about limits, configuration matrices, or troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/app-service/deploy-zip,Use ZIP or WAR,Deploy Files - Azure App Service,Deploy ZIP and file packages to Azure App Service,"Learn to deploy app packages, discrete libraries, static files, or startup scripts to Azure App Service.","This article shows you how to deploy your code as a ZIP, WAR, JAR, or EAR package to Azure App Service. It also shows you how to deploy individual files to App Service, separate from your application package.",2026-03-11T08:00:00.000Z,how-to,deployment,0.7,True,"The article describes product-specific deployment behavior for ZIP/WAR/JAR/EAR and individual file deployment to Azure App Service, including how the platform handles these packages and related constraints. This is concrete, implementation-focused deployment guidance beyond generic knowledge, but not primarily about limits, configuration matrices, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/app-service/environment/app-service-app-service-environment-custom-settings,App Service Environment custom settings,Configure custom settings - Azure App Service Environment,Configure ASE-wide custom settings via ARM templates,Configure settings that apply to the entire Azure App Service environment. Learn how to do it with Azure Resource Manager templates.,"Because App Service Environments are isolated to a single customer, there are certain configuration settings that can be applied exclusively to App Service Environments. This article documents the various specific customizations that are available for App Service Environments. If you don't have an App Service Environment, see How to Create an App Service Environment v3. You can store App Service Environment customizations by using an array in the clusterSettings attribute. 
This attribute is found i",2025-11-26T18:11:00.000Z,tutorial,configuration,0.9,True,"Documents ASE-specific customizations stored in clusterSettings, with concrete setting names and values applied at environment scope—fits configuration category with product-specific parameters.",unchanged https://learn.microsoft.com/en-us/azure/app-service/environment/app-service-app-service-environment-geo-distributed-scale,Geo-distributed scale,Geo-Distributed Horizontal Scale - Azure App Service Environment,Design geo-distributed scale with App Service Environments,Learn how to horizontally scale apps by using geo-distribution with Traffic Manager and App Service Environments.,"Application scenarios that require high scale can exceed the compute resource capacity available to a single deployment of an app. Voting applications, sporting events, and televised entertainment events are examples of scenarios that require high scale. You can support high scale requirements by horizontally scaling out apps. To handle extreme load requirements, many app deployments can be made within a single region and across regions. This article provides an overview for creating a distribut",2026-03-13T17:21:00.000Z,concept-article,architecture-patterns,0.68,True,"The article describes concrete, product-specific architecture guidance for horizontally scaling Azure App Service apps across regions using App Service Environments and Traffic Manager. It focuses on when and how to use geo-distribution for high-scale scenarios (for example, events with extreme load) and discusses deployment patterns and trade-offs specific to this service, rather than just conceptual scaling theory. 
While it may not be heavy on numeric limits, it provides expert, pattern-level guidance unique to Azure App Service Environments.",unchanged https://learn.microsoft.com/en-us/azure/app-service/environment/ase-multi-tenant-comparison,Compare with App Service,App Service Environment v3 and App Service Public Multitenant Comparison - Azure App Service Environment,Compare App Service Environment v3 with multitenant App Service,This article provides an overview of the differences between App Service Environment v3 and the public multitenant offering of App Service.,"An App Service Environment is an Azure App Service feature that provides a fully isolated and dedicated environment for running App Service apps securely at high scale. Compared to the public multitenant offering, where the supporting infrastructure is shared with other customers, an App Service Environment provides enhanced security, isolation, and network access control. This article compares the differentiating features of App Service Environment v3 and the public multitenant offering of App ",2025-12-10T18:17:00.000Z,concept-article,decision-making,0.75,True,Comparison of ASE v3 vs public multitenant offering; supports environment selection with feature differences and trade-offs.,unchanged @@ -90,7 +90,7 @@ https://learn.microsoft.com/en-us/azure/app-service/environment/how-to-custom-do https://learn.microsoft.com/en-us/azure/app-service/environment/how-to-upgrade-preference,Upgrade preference for planned maintenance,Set Upgrade Preference for Planned Maintenance - App Service Environment,,Configure the upgrade preference for Azure App Service Environment planned maintenance in the Azure portal or by using the Azure CLI.,"Microsoft regularly updates Azure App Service to provide new features, new runtime versions, performance improvements, and bug fixes. The upgrade process is referred to as planned maintenance. 
Upgrades happen automatically and they're applied progressively through the regions following Advancing Safe Deployment Practices. This article describes how to configure the upgrade preference for an App Service Environment v3. An App Service Environment provides a fully isolated and dedicated environment f",2026-03-17T08:00:00.000Z,how-to,,0.3,False,"Page appears to be a how-to for setting upgrade preference for App Service Environment planned maintenance (portal/CLI). From the summary, it focuses on configuration steps and conceptual explanation of planned maintenance, without clear evidence of detailed configuration parameter tables, limits/quotas, error codes, or decision matrices. Lacking strong indicators of product-specific expert knowledge as defined by the sub-skill types.",unchanged https://learn.microsoft.com/en-us/azure/app-service/environment/integrate-with-application-gateway,Integrate with Application Gateway,Integrate with Azure Application Gateway - Azure App Service Environment,Integrate App Service Environment with Azure Application Gateway,Integrate an app in your internal load balancer (ILB) App Service Environment with Azure Application Gateway in this end-to-end walk-through.,This article describes how to configure Azure Application Gateway to point to an application hosted in an App Service Environment. App Service Environment is a deployment of Azure App Service in the subnet of an organization's Azure virtual network. App Service Environment can be deployed with an external or internal endpoint for app access. A deployment with an internal endpoint is referred to as an internal load balancer (ILB) App Service Environment. 
Application Gateway is a virtual appliance tha,2026-04-06T17:23:00.000Z,how-to,integrations,0.7,True,"End-to-end configuration of Application Gateway with an ILB App Service Environment; likely includes product-specific settings (backend pool, probes, hostnames, ports) and integration parameters beyond generic tutorials.",unchanged https://learn.microsoft.com/en-us/azure/app-service/environment/networking,Networking architecture,App Service Environment networking - Azure App Service Environment,,App Service Environment networking details,"App Service Environment is a single-tenant deployment of Azure App Service that hosts Windows and Linux containers, web apps, API apps, logic apps, and function apps. When you install an App Service Environment, you pick the Azure virtual network that you want it to be deployed in. All of the inbound and outbound application traffic is inside the virtual network you specify. You deploy into a single subnet in your virtual network, and nothing else can be deployed into that subnet.",2026-03-30T22:11:00.000Z,overview,,0.3,False,"Provides a conceptual description of App Service Environment networking and its relationship to a virtual network and subnet, but the summary does not show specific configuration parameters, limits, or security role details.",unchanged -https://learn.microsoft.com/en-us/azure/app-service/environment/overview,About App Service Environments,App Service Environment Overview - Azure App Service Environment,,"Learn about App Service Environments, which are fully isolated and single-tenant App Service deployments that provide high-scale, network-secured hosting.","App Service Environment is an Azure App Service feature that provides a fully isolated and dedicated environment to run App Service apps securely at high scale. Unlike the App Service public multitenant offering that shares supporting infrastructure, an App Service Environment provides dedicated compute for a single customer. 
For more information, see App Service Environment v3 and App Service public multitenant comparison. An App Service Environment provides hosting capabilities for various work",2025-11-11T08:00:00.000Z,overview,,0.1,False,"Conceptual overview of App Service Environment; no detailed quotas, config parameters, or troubleshooting content.",unchanged +https://learn.microsoft.com/en-us/azure/app-service/environment/overview,About App Service Environments,App Service Environment Overview - Azure App Service Environment,,"Learn about App Service Environments, which are fully isolated and single-tenant App Service deployments that provide high-scale, network-secured hosting.","App Service Environment is an Azure App Service feature that provides a fully isolated and dedicated environment to run App Service apps securely at high scale. Unlike the App Service public multitenant offering that shares supporting infrastructure, an App Service Environment provides dedicated compute for a single customer. For more information, see App Service Environment v3 and App Service public multitenant comparison. An App Service Environment provides hosting capabilities for various work",2026-04-20T08:00:00.000Z,overview,,0.2,False,"High-level overview of App Service Environment; summary suggests conceptual description and comparison link, but no explicit limits, configuration tables, or error/decision matrices.",updated https://learn.microsoft.com/en-us/azure/app-service/environment/overview-certificates,Certificates,Certificates in App Service Environment - Azure App Service Environment,Manage certificates and bindings in App Service Environment,Explain the use of certificates in an App Service Environment. Learn how certificate bindings work on the single-tenanted apps in an App Service Environment.,"The App Service Environment is a deployment of the Azure App Service that runs within your Azure virtual network. 
It can be deployed with an internet accessible application endpoint or an application endpoint that is in your virtual network. If you deploy the App Service Environment with an internet accessible endpoint, that deployment is called an External App Service Environment. If you deploy the App Service Environment with an endpoint in your virtual network, that deployment is called an IL",2025-10-31T05:12:00.000Z,overview,configuration,0.7,True,"Covers product-specific certificate usage and binding behavior for single-tenant ASE apps, including concrete configuration details unique to ASE rather than generic TLS concepts.",unchanged https://learn.microsoft.com/en-us/azure/app-service/environment/using,Host an App in an App Service Environment,Host a Web App in an App Service Environment - Azure App Service Environment,Configure and host web apps in App Service Environment,"Create a web app that uses an App Service Environment, and host the isolated app in a virtual network/subnet configuration. Follow procedures in the Azure portal to create the web app, enable encrypti","An App Service Environment is a single-tenant deployment of Azure App Service that integrates with an Azure Virtual Network instance and subnet. In this scenario, you can host your web app in an isolated environment where you're the only user of the system. The apps you deploy are subject to the networking features applied to the virtual network subnet for the App Service Environment. No other features are required for your web apps to access the networking features. When you create the web app ",2026-03-07T06:11:00.000Z,how-to,configuration,0.7,True,"How-to page for creating and configuring a web app inside an App Service Environment, including specific portal options (encryption, diagnostic logging, VNet/subnet usage). 
Contains product-specific configuration steps and settings rather than just conceptual overview.",unchanged https://learn.microsoft.com/en-us/azure/app-service/getting-started,Get started,Getting started with Azure App Service - Azure App Service,,Take the first steps toward working with Azure App Service. Decide on a stack and choose from various actions to get your app running.,Azure App Service is a fully managed platform as a service (PaaS) for hosting web applications.,2025-08-26T05:10:00.000Z,overview,,0.1,False,"Getting started/navigation page; no expert-level configuration, limits, or troubleshooting content.",unchanged
https://learn.microsoft.com/en-us/azure/app-service/ip-address-change-ssl,TLS/SSL address,Prepare for TLS/SSL IP Address Change - Azure App Service,Handle TLS/SSL IP address changes for App Service bindings,Learn how to release an existing TLS/SSL IP address and assign a new one if your TLS/SSL IP address is going to be changed.,"If you received a notification that the Transport Layer Security and Secure Sockets Layer (TLS/SSL) IP address of your Azure App Service app is changing, follow the instructions in this article to release the existing TLS/SSL IP address and assign a new one. Note Service endpoints aren't currently supported when enabling IP based SSL on App Service TLS/SSL bindings.",2025-12-12T23:21:00.000Z,how-to,best-practices,0.7,True,Explains how to release and reassign TLS/SSL IPs and notes unsupported service endpoints; concrete platform-specific behavior and steps.,unchanged -https://learn.microsoft.com/en-us/azure/app-service/manage-automatic-scaling,Scale out automatically,How to Enable Automatic Scaling - Azure App Service,,Learn how to scale automatically in Azure App Service with no configuration.,"Note Automatic scaling is available for all app types: Windows and Linux (deploy as code or container). Automatic scaling isn't supported for deployment slot traffic. Automatic scaling is a scale-out option that automatically handles scaling decisions for your web apps and App Service plans. It's different from Azure autoscale, which lets you define scaling rules based on metrics and schedules. 
With automatic scaling, you can adjust scaling settings to improve performance and reduce cold-start del",2026-04-16T08:00:00.000Z,how-to,,0.3,False,"Primarily explains what automatic scaling is and how to enable it; summary does not indicate detailed limits, configuration tables, or decision matrices—more of a feature/how-to overview.",updated +https://learn.microsoft.com/en-us/azure/app-service/manage-automatic-scaling,Scale out automatically,How to Enable Automatic Scaling - Azure App Service,,Learn how to scale automatically in Azure App Service with no configuration.,"Note Automatic scaling is available for all app types: Windows and Linux (deploy as code or container). Automatic scaling isn't supported for deployment slot traffic. Automatic scaling is a scale-out option that automatically handles scaling decisions for your web apps and App Service plans. It's different from Azure autoscale, which lets you define scaling rules based on metrics and schedules. With automatic scaling, you can adjust scaling settings to improve performance and reduce cold-start del",2026-04-16T08:00:00.000Z,how-to,,0.3,False,"Primarily explains what automatic scaling is and how to enable it; summary does not indicate detailed limits, configuration tables, or decision matrices—more of a feature/how-to overview.",unchanged https://learn.microsoft.com/en-us/azure/app-service/manage-backup,Back up and restore app,Back Up an App - Azure App Service,Back up and restore Azure App Service apps,Learn how to restore backups of your apps or configure custom backups in Azure App Service. Customize backups by including the linked database.,"Important Starting 3/31/2028, Azure App Service custom backups will no longer support backing up linked databases. See Deprecation of linked database backups for more information. In Azure App Service, you can easily restore app backups. You can also make on-demand custom backups or configure scheduled custom backups. 
You can restore a backup by overwriting an existing app or by restoring to a new app or slot. This article shows you how to restore a backup and make custom backups. Back up and restore",2025-11-28T08:00:00.000Z,how-to,configuration,0.7,True,"Shows how to configure on-demand and scheduled backups, including linked database considerations and deprecation; product-specific backup configuration.",unchanged https://learn.microsoft.com/en-us/azure/app-service/manage-create-arc-environment,Enable App Service on Azure Arc,"Set Up Azure Arc for App Service, Functions, and Logic Apps - Azure App Service","Enable App Service, Functions, and Logic Apps on Azure Arc","For your Azure Arc-enabled Kubernetes clusters, learn how to enable App Service apps, function apps, and logic apps.","Important Azure App Service on Arc enabled Kubernetes will be retired on March 31, 2026. From September 30, 2025, customers will no longer be able to install the extension. We request you migrate to other solutions such as Azure Container Apps on Arc enabled Kubernetes, migrating also allows you to take advantage of Logic Apps Hybrid for your integration workloads. 
If you have an Azure Arc-enabled Kubernetes cluster, you can use it to create an App Service enabled custom location and deploy web apps, f",2025-10-07T17:35:00.000Z,how-to,deployment,0.65,True,Describes enabling App Service on Arc-enabled Kubernetes and creating custom locations; involves specific deployment and environment configuration steps.,unchanged https://learn.microsoft.com/en-us/azure/app-service/manage-custom-dns-buy-domain,Buy and configure App Service domain,Buy and configure an App Service domain - Azure App Service,Buy and configure App Service managed domains,Learn how to buy and configure a domain name in Azure App Service to create a personalized web address for your app.,"Important Starting July 28, 2025, changes to App Service Managed Certificates (ASMC) will impact how certificates are issued and renewed in certain scenarios. While most customers don’t need to take action, we recommend reviewing our ASMC detailed blog post for more information. App Service domains are custom domains that are managed directly in Azure. They make it easy to manage custom domains for Azure App Service. This article shows you how to buy an App Service domain and configure an App Service",2025-02-14T08:00:00.000Z,how-to,configuration,0.75,True,"Explains how App Service domains work, how to purchase and configure them—product-specific domain configuration details.",unchanged @@ -149,7 +149,7 @@ https://learn.microsoft.com/en-us/azure/app-service/provision-resource-bicep,Use https://learn.microsoft.com/en-us/azure/app-service/provision-resource-terraform,Use Terraform,Create an App by Using a Terraform Template - Azure App Service,,Follow this quickstart to learn how to create your first app in Azure App Service in seconds by using a Terraform template.,"Get started with Azure App Service by deploying an app to the cloud via Terraform. When you use a free App Service tier, there's no charge to complete this quickstart. 
Terraform allows you to define and create complete infrastructure deployments in Azure. You build Terraform templates in a human-readable format that create and configure Azure resources in a consistent, reproducible manner. This article shows you how to create an app by using Terraform.",2025-06-05T05:17:00.000Z,quickstart,,0.35,False,Quickstart using Terraform to create an app; mostly generic IaC usage rather than detailed App Service–specific configuration tables or troubleshooting.,unchanged https://learn.microsoft.com/en-us/azure/app-service/quickstart-arc,Create app on Azure Arc,Quickstart: Create a web app on Azure Arc - Azure App Service,Deploy a web app to Azure Arc-enabled Kubernetes,Get started with App Service on Azure Arc deploying your first web app.,"Important Azure App Service on Arc enabled Kubernetes will be retired on March 31, 2026. From September 30, 2025, customers will no longer be able to install the extension. We request you migrate to other solutions such as Azure Container Apps on Arc enabled Kubernetes, migrating also allows you to take advantage of Logic Apps Hybrid for your integration workloads. In this quickstart, you create an App Service app to an Azure Arc-enabled Kubernetes cluster (Preview). This scenario supports Linux apps ",2025-11-18T18:43:00.000Z,quickstart,deployment,0.6,True,Quickstart for deploying App Service app to Arc-enabled Kubernetes; includes product-specific deployment commands and constraints.,unchanged https://learn.microsoft.com/en-us/azure/app-service/quickstart-arm-template,Deploy using ARM template,Create an App Service app using an Azure Resource Manager template - Azure App Service,,"Create your first app to Azure App Service in seconds using an Azure Resource Manager template (ARM template), which is one of many ways to deploy to App Service.","Get started with Azure App Service by deploying an app to the cloud using an Azure Resource Manager template (ARM template) and Azure CLI in Cloud Shell. 
A Resource Manager template is a JavaScript Object Notation (JSON) file that defines the infrastructure and configuration for your project. You incur no costs to complete this quickstart because you use a free App Service tier. To complete this quickstart, you need an Azure account with an active subscription. If you don't have an Azure account, yo",2025-09-24T17:11:00.000Z,quickstart,,0.3,False,"Quickstart ARM template deployment; mostly step-by-step tutorial without detailed configuration tables, limits, or product-specific decision guidance.",unchanged -https://learn.microsoft.com/en-us/azure/app-service/quickstart-custom-container,Deploy a custom container,Quickstart: Run a Custom Container on App Service - Azure App Service,,Learn how to run custom containers by using Azure App Service.,"In this quickstart, you learn how to deploy an ASP.NET app in a Windows image to Azure Container Registry from Visual Studio. You run the app in a custom container in Azure App Service. Azure App Service provides predefined application stacks on Windows that run on Internet Information Services (IIS). These preconfigured application stacks lock down the operating system and prevent low-level access. Custom Windows containers don't have these restrictions. Developers can use custom containers to give",2025-11-18T23:12:00.000Z,quickstart,,0.4,False,Quickstart for running a custom container; mostly step-by-step deployment from Visual Studio/ACR without deep configuration matrices.,unchanged +https://learn.microsoft.com/en-us/azure/app-service/quickstart-custom-container,Deploy a custom container,Quickstart: Run a Custom Container on App Service - Azure App Service,,Learn how to run custom containers by using Azure App Service.,"In this quickstart, you learn how to deploy an ASP.NET app in a Windows image to Azure Container Registry from Visual Studio. You run the app in a custom container in Azure App Service. 
Azure App Service provides predefined application stacks on Windows that run on Internet Information Services (IIS). These preconfigured application stacks lock down the operating system and prevent low-level access. Custom Windows containers don't have these restrictions. Developers can use custom containers to give",2026-04-20T08:00:00.000Z,quickstart,,0.3,False,"Quickstart tutorial for running a custom container on App Service; primarily step-by-step deployment from Visual Studio without detailed configuration matrices, limits, or specialized patterns beyond what an LLM is likely to know.",updated https://learn.microsoft.com/en-us/azure/app-service/quickstart-dotnet-aspire,Aspire Quickstart,Quickstart: Deploy an Aspire app - Azure App Service,,Learn how to deploy your first Aspire app to Azure App Service using GitHub Codespaces and Azure Developer CLI.,"In this quickstart, you learn how to create and deploy your first Aspire app to Azure App Service. Azure App Service provides a fully managed platform for hosting web apps with built-in infrastructure maintenance, security patching, and scaling. You can complete this entire quickstart in your browser using GitHub Codespaces, which provides a pre-configured development environment with .NET 10 and Azure Developer CLI already installed. By the end, you have a running Aspire app deployed to Azure App ",2026-02-02T18:12:00.000Z,quickstart,,0.2,False,"Quickstart for Aspire app deployment; primarily step-by-step tutorial, not a configuration reference or limits guide.",unchanged https://learn.microsoft.com/en-us/azure/app-service/quickstart-dotnetcore,Quickstart,Quickstart: Deploy an ASP.NET web app - Azure App Service,,Learn how to run web apps in Azure App Service by deploying your first ASP.NET app.,"In this quickstart, you learn how to create and deploy your first ASP.NET web app to Azure App Service. App Service supports various versions of .NET apps. It provides a highly scalable, self-patching web hosting service. 
ASP.NET web apps are cross-platform and can be hosted on Linux or Windows. When you're finished, you have an Azure resource group that includes an App Service hosting plan and an App Service with a deployed web application. Alternatively, you can deploy an ASP.NET web app as par",2025-12-17T12:14:00.000Z,quickstart,,0.2,False,Quickstart deployment tutorial; focuses on basic deployment steps without detailed product-specific configuration matrices or limits.,unchanged https://learn.microsoft.com/en-us/azure/app-service/quickstart-java,Quickstart,Quickstart: Create a Java app on Azure App Service - Azure App Service,,Deploy a Java app to Azure App Service in minutes by using the Azure Web App Plugin for Maven.,"Azure App Service provides a highly scalable, self-patching web app hosting service. In this quickstart, you use the Maven Plugin for Azure App Service Web Apps to deploy a Java web application to a Linux Tomcat server in Azure App Service. If Maven isn't your preferred development tool, check out similar articles for Java developers:",2025-06-27T17:46:00.000Z,quickstart,,0.2,False,Java quickstart using Maven plugin; basic deployment tutorial without detailed configuration matrices.,unchanged @@ -177,10 +177,10 @@ https://learn.microsoft.com/en-us/azure/app-service/scenario-ai-chatbot-retrieva https://learn.microsoft.com/en-us/azure/app-service/scenario-ai-local-small-language-model,Local small language models,Use local small language models (SLMs) in Azure App Service - Azure App Service,,Deploy a web app with a local small language model (SLM) as a sidecar container to run AI models entirely within your App Service environment. No outbound calls or external AI service dependencies req,"Deploy a web app with a local small language model (SLM) as a sidecar container to run AI models entirely within your App Service environment. No outbound calls or external AI service dependencies required. 
This approach is ideal if you have strict data privacy or compliance requirements, as all AI processing and data remain local to your app. App Service offers high-performance, memory-optimized pricing tiers needed for running SLMs in sidecars.",2026-02-02T18:12:00.000Z,how-to,,0.2,False,Describes using local SLMs and mentions pricing tiers conceptually; no explicit numeric limits or configuration parameter tables in the summary.,unchanged https://learn.microsoft.com/en-us/azure/app-service/scenario-ai-model-context-protocol-server,Model Context Protocol servers,Use App Service as a Model Context Protocol (MCP) server - Azure App Service,Integrate Azure App Service as an MCP server,"Integrate your web app as a Model Context Protocol (MCP) server to extend the capabilities of leading personal AI agents such as GitHub Copilot Chat, Cursor, and Winsurf.","Integrate your web app as a Model Context Protocol (MCP) server to extend the capabilities of leading personal AI agents such as GitHub Copilot Chat, Cursor, and Winsurf. By exposing your app's APIs through MCP, you can supercharge these agents with the unique features and business logic your web app already provides, without major development effort or rearchitecture.",2026-04-10T17:11:00.000Z,how-to,integrations,0.68,True,"The page describes how to expose an Azure App Service web app as a Model Context Protocol (MCP) server for agents like GitHub Copilot Chat and Cursor. This is a product- and protocol-specific integration pattern that likely includes concrete endpoint shapes, protocol parameters, and configuration details unique to MCP and App Service, which go beyond generic SDK usage or conceptual overviews.",unchanged https://learn.microsoft.com/en-us/azure/app-service/scenario-ai-openapi-tool,OpenAPI tools for Foundry agents,Use App Service as an OpenAPI tool in Foundry agent - Azure App Service,,"Empower your existing web apps by exposing their capabilities to Foundry Agent Service using OpenAPI. 
Connect Foundry Agent Service to REST APIs to create powerful, feature-rich agents.","Empower your existing web apps by exposing their capabilities to Foundry Agent Service using OpenAPI. Many web apps already provide REST APIs, making them ideal candidates for integration into agents that can call REST APIs as tools. By connecting Foundry Agent Service to these APIs, you can rapidly create powerful, feature-rich agents with little code.",2026-02-02T18:12:00.000Z,how-to,,0.1,False,Scenario overview for using App Service as an OpenAPI tool; lacks detailed config tables or SDK parameter references.,unchanged -https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-access-microsoft-graph-as-app,to Microsoft Graph with managed identity,Tutorial - .NET Web app accesses Microsoft Graph as the app - Azure App Service,Configure managed identity access to Microsoft Graph from App Service,"In this tutorial, you learn how to access data in Microsoft Graph from a .NET web app by using managed identities.","Learn how to access Microsoft Graph from a web app running on Azure App Service. You want to call Microsoft Graph for the web app. A safe way to give your web app access to data is to use a system-assigned managed identity. A managed identity from Microsoft Entra ID allows App Service to access resources through role-based access control (RBAC), without requiring app credentials. After assigning a managed identity to your web app, Azure takes care of the creation and distribution of a certificate",2026-03-13T08:00:00.000Z,tutorial,security,0.7,True,"Tutorial shows how to access Microsoft Graph using system-assigned managed identity and RBAC. 
Likely includes specific Microsoft Entra app registration settings, permission scopes, and role assignments that are product-specific security configuration details.",updated -https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-access-microsoft-graph-as-user,to Microsoft Graph as user,Tutorial - .NET Web app accesses Microsoft Graph as the user - Azure App Service,Grant delegated Microsoft Graph access for App Service users,"In this tutorial, you learn how to access data in Microsoft Graph for a signed-in user from a .NET web app.","Learn how to access Microsoft Graph from a web app running on Azure App Service. You want to add access to Microsoft Graph from your web app and perform some action as the signed-in user. This section describes how to grant delegated permissions to the web app and get the signed-in user's profile information from Microsoft Entra ID. In this tutorial, you learn how to: If you don't have an Azure account, create a free account before you begin.",2026-03-17T08:00:00.000Z,tutorial,security,0.7,True,"Covers granting delegated permissions to a web app and retrieving signed-in user profile via Microsoft Entra ID. This typically includes concrete permission names, consent configuration, and auth settings, which are product-specific security configuration details.",updated -https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-access-storage,to other Azure services with managed identity,Tutorial - .NET Web app accesses storage by using managed identities - Azure App Service,Secure App Service access to Azure Storage with managed identities,"In this tutorial, you learn how to access Azure Storage for a .NET app by using managed identities.","Learn how to access Azure services, such as Azure Storage, from a web app (not a signed-in user) running on Azure App Service by using managed identities. This tutorial demonstrates connecting to Azure Storage as an example. 
Any service that supports managed identity (B in the following image) can be securely accessed using this tutorial: You want to add secure access to Azure services (Azure Storage, Azure SQL Database, Azure Key Vault, or other services) from your web app. You could use a shared ",2026-03-17T08:00:00.000Z,tutorial,security,0.75,True,"Tutorial focuses on using managed identities from App Service to access Azure Storage and other services. It likely specifies role names (e.g., Storage Blob Data Contributor), scope assignments, and identity configuration steps, which are concrete security and RBAC configuration details.",updated
-https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-authentication-app-service,Enable built-in authentication quickstart,Quickstart - Add app authentication to a web app - Azure App Service,Enable authentication for Azure App Service web apps,Learn how to enable app authentication for a web app running on Azure App Service. Limit access to the web app to users in your organization.,"Learn how to enable authentication for your web app running on Azure App Service and limit access to users in your organization. In this tutorial, you learn how to:",2026-03-17T08:00:00.000Z,tutorial,security,0.65,True,"Quickstart for enabling app authentication and restricting access to organizational users; likely includes specific App Service auth settings and Entra configuration steps, which are product-specific security details.",updated
+https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-access-microsoft-graph-as-app,to Microsoft Graph with managed identity,Tutorial - .NET Web app accesses Microsoft Graph as the app - Azure App Service,Use managed identity to call Microsoft Graph from App Service,"In this tutorial, you learn how to access data in Microsoft Graph from a .NET web app by using managed identities.","Learn how to access Microsoft Graph from a web app running on Azure App Service. 
You want to call Microsoft Graph for the web app. A safe way to give your web app access to data is to use a system-assigned managed identity. A managed identity from Microsoft Entra ID allows App Service to access resources through role-based access control (RBAC), without requiring app credentials. After you assign a managed identity to your web app, Azure takes care of the creation and distribution of a certificat",2026-04-13T22:10:00.000Z,tutorial,security,0.7,True,"Tutorial shows product-specific security configuration: enabling system-assigned managed identity, assigning Microsoft Entra permissions/RBAC, and configuring authentication to Microsoft Graph for App Service. These are concrete security settings and identity patterns beyond generic concepts.",updated
+https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-access-microsoft-graph-as-user,to Microsoft Graph as user,Tutorial - .NET Web app accesses Microsoft Graph as the user - Azure App Service,Grant delegated Microsoft Graph access for App Service users,"In this tutorial, you learn how to access data in Microsoft Graph for a signed-in user from a .NET web app.","Learn how to access Microsoft Graph from a web app running on Azure App Service. You want to add access to Microsoft Graph from your web app and perform some action as the signed-in user. This section describes how to grant delegated permissions to the web app and get the signed-in user's profile information from Microsoft Entra ID. In this tutorial, you learn how to: If you don't have an Azure account, create a free account before you begin.",2026-03-17T08:00:00.000Z,tutorial,security,0.7,True,"Covers granting delegated permissions to a web app and retrieving signed-in user profile via Microsoft Entra ID. 
This typically includes concrete permission names, consent configuration, and auth settings, which are product-specific security configuration details.",unchanged
+https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-access-storage,to other Azure services with managed identity,Tutorial - .NET Web app accesses storage by using managed identities - Azure App Service,Secure App Service access to Azure Storage with managed identities,"In this tutorial, you learn how to access Azure Storage for a .NET app by using managed identities.","Learn how to access Azure services, such as Azure Storage, from a web app (not a signed-in user) running on Azure App Service by using managed identities. This tutorial demonstrates connecting to Azure Storage as an example. Any service that supports managed identity (B in the following image) can be securely accessed using this tutorial: You want to add secure access to Azure services (Azure Storage, Azure SQL Database, Azure Key Vault, or other services) from your web app. You could use a shared ",2026-03-17T08:00:00.000Z,tutorial,security,0.75,True,"Tutorial focuses on using managed identities from App Service to access Azure Storage and other services. It likely specifies role names (e.g., Storage Blob Data Contributor), scope assignments, and identity configuration steps, which are concrete security and RBAC configuration details.",unchanged
+https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-authentication-app-service,Enable built-in authentication quickstart,Quickstart - Add app authentication to a web app - Azure App Service,Enable authentication for Azure App Service web apps,Learn how to enable app authentication for a web app running on Azure App Service. Limit access to the web app to users in your organization.,"Learn how to enable authentication for your web app running on Azure App Service and limit access to users in your organization. 
In this tutorial, you learn how to:",2026-03-17T08:00:00.000Z,tutorial,security,0.65,True,"Quickstart for enabling app authentication and restricting access to organizational users; likely includes specific App Service auth settings and Entra configuration steps, which are product-specific security details.",unchanged https://learn.microsoft.com/en-us/azure/app-service/security-controls-policy,Security controls by Azure Policy reference,Azure Policy Regulatory Compliance controls for Azure App Service - Azure App Service,Use Azure Policy compliance controls for App Service,Lists Azure Policy Regulatory Compliance controls available for Azure App Service. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources.,"Regulatory Compliance in Azure Policy provides Microsoft created and managed initiative definitions, known as built-ins, for the compliance domains and security controls related to different compliance standards. This page lists the compliance domains and security controls for Azure App Service. You can assign the built-ins for a security control individually to help make your Azure resources @@ -212,41 +212,41 @@ https://learn.microsoft.com/en-us/azure/app-service/tutorial-ai-slm-expressjs,Ch
Hosting your own small language model (SLM) offers several advantages:",2025-11-18T08:00:00.000Z,tutorial,,0.3,False,FastAPI chatbot with SLM sidecar; deployment tutorial without config matrices or limits.,unchanged https://learn.microsoft.com/en-us/azure/app-service/tutorial-ai-slm-spring-boot,Chatbot with local SLM,Tutorial: Spring Boot chatbot with SLM extension - Azure App Service,,Learn how to deploy a Spring Boot application integrated with a Phi-4 sidecar extension on Azure App Service.,"This tutorial guides you through deploying a Spring Boot-based chatbot application integrated with the Phi-4 sidecar extension on Azure App Service. By following the steps, you'll learn how to set up a scalable web app, add an AI-powered sidecar for enhanced conversational capabilities, and test the chatbot's functionality. Hosting your own small language model (SLM) offers several advantages:",2025-11-18T08:00:00.000Z,tutorial,,0.3,False,"Spring Boot chatbot with SLM sidecar tutorial; focuses on deployment steps, not detailed config matrices or limits.",unchanged https://learn.microsoft.com/en-us/azure/app-service/tutorial-auth-aad,Connect to another app as user,Tutorial: Authenticate Users End-to-End - Azure App Service,Secure App Service apps end-to-end with built-in auth,"Learn how to use App Service authentication and authorization to secure your App Service apps end-to-end, including access to remote APIs.","Azure App Service provides a highly scalable, self-patching web hosting service. App Service has built-in support for user authentication and authorization. This tutorial shows how to secure your apps with App Service authentication and authorization. It uses an Express.js with views front end. App Service authentication and authorization support all language runtimes. You can learn how to apply it to your preferred language by following this tutorial. 
Azure App Service provides a highly scalabl",2025-07-08T17:24:00.000Z,tutorial,security,0.76,True,"Tutorial includes concrete App Service Authentication/Authorization settings, callback URLs, and configuration values for securing front-end and downstream APIs—platform-specific security configuration.",unchanged
-https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-microsoft-graph-as-app-javascript,to Microsoft Graph with managed identity,Tutorial - JavaScript Web app accesses Microsoft Graph as the app - Azure App Service,Access Microsoft Graph as app using managed identity,"In this tutorial, you learn how to access data in Microsoft Graph from a JavaScript web app by using managed identities.","Learn how to access Microsoft Graph from a web app running on Azure App Service. You want to call Microsoft Graph for the web app. A safe way to give your web app access to data is to use a system-assigned managed identity. A managed identity from Microsoft Entra ID allows App Service to access resources through role-based access control (RBAC), without requiring app credentials. After assigning a managed identity to your web app, Azure takes care of the creation and distribution of a certificate",2023-08-18T11:30:00.000Z,tutorial,security,0.7,True,"Tutorial configures system-assigned managed identity and RBAC for Graph; likely includes specific permission scopes and role assignments, which are product-specific security configuration details.",unchanged
-https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-microsoft-graph-as-user-javascript,to Microsoft Graph as User,Tutorial - Web app accesses Microsoft Graph as the user - Azure App Service,Access Microsoft Graph as user from App Service,"In this tutorial, you learn how to access data in Microsoft Graph for a signed-in user.","Learn how to access Microsoft Graph from a web app running on Azure App Service. 
You want to add access to Microsoft Graph from your web app and perform some action as the signed-in user. This section describes how to grant delegated permissions to the web app and get the signed-in user's profile information from Microsoft Entra ID. In this tutorial, you learn how to: If you don't have an Azure account, create a free account before you begin.",2024-02-08T12:20:00.000Z,tutorial,security,0.7,True,"Covers delegated permissions, consent, and Entra ID configuration for Graph; includes specific permission names and flows that are security configuration details.",unchanged
+https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-microsoft-graph-as-app-javascript,to Microsoft Graph with managed identity,Tutorial - JavaScript Web app accesses Microsoft Graph as the app - Azure App Service,Configure managed identity access to Microsoft Graph from App Service,"In this tutorial, you learn how to access data in Microsoft Graph from a JavaScript web app by using managed identities.","Learn how to access Microsoft Graph from a web app running on Azure App Service. You want to call Microsoft Graph for the web app. A safe way to give your web app access to data is to use a system-assigned managed identity. A managed identity from Microsoft Entra ID allows App Service to access resources through role-based access control (RBAC), without requiring app credentials. After you assign a managed identity to your web app, Azure takes care of the creation and distribution of a certificat",2026-04-24T17:42:00.000Z,tutorial,security,0.7,True,"Tutorial focuses on using system-assigned managed identity and RBAC to access Microsoft Graph. 
This typically includes specific Microsoft Entra app roles/permissions, scope values, and configuration parameters for secure access, which are product-specific security configuration details rather than generic concepts.",updated
+https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-microsoft-graph-as-user-javascript,to Microsoft Graph as User,Tutorial - Web app accesses Microsoft Graph as the user - Azure App Service,Grant delegated Microsoft Graph access for App Service users,"In this tutorial, you learn how to access data in Microsoft Graph for a signed-in user accessing an Azure App Service app.","Learn how to access Microsoft Graph from a web app running on Azure App Service. You want to add access to Microsoft Graph from your web app and perform some action as the signed-in user. This section describes how to grant delegated permissions to the web app and get the signed-in user's profile information from Microsoft Entra ID. In this tutorial, you learn how to: If you don't have an Azure account, create a free account before you begin.",2026-04-24T17:42:00.000Z,tutorial,security,0.7,True,"Covers granting delegated permissions to a web app and obtaining signed-in user profile data via Microsoft Entra ID. 
This usually involves concrete permission names, scopes, and auth configuration steps specific to App Service and Microsoft Graph, fitting security-focused configuration guidance.",updated https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-sql-database-as-user-dotnet,to SQL database as user,Tutorial: Connect a web app to SQL Database on behalf of the user - Azure App Service,Connect App Service to SQL on behalf of signed-in user,Use Microsoft Entra built-in authentication to connect securely to Azure SQL or other Azure services from a .NET web app on behalf of the signed-in user.,"This tutorial shows you how to connect an Azure App Service app to a back-end Azure SQL database by impersonating the signed-in user, also called the on-behalf-of flow. To configure this flow, you enable App Service built-in authentication using the Microsoft Entra identity provider. This connectivity method is more advanced than the managed identity approach in Tutorial: Access data with managed identity, and has the following advantages in enterprise scenarios: In this tutorial, you add Microsoft En",2025-09-01T17:17:00.000Z,tutorial,security,0.7,True,"Implements on-behalf-of flow with built-in authentication; includes detailed Entra configuration, scopes, and connection patterns unique to App Service.",unchanged
-https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-storage-javascript,to other Azure services with managed identity,Tutorial - JavaScript Web app accesses storage by using managed identities - Azure App Service,Secure JavaScript web app access to Azure Storage,"In this tutorial, you learn how to access Azure Storage for a JavaScript app by using managed identities.","Learn how to access Azure services, such as Azure Storage, from a web app (not a signed-in user) running on Azure App Service by using managed identities. This tutorial demonstrates connecting to Azure Storage as an example. 
Any service that supports managed identity (B in the following image) can be securely accessed using this tutorial: You want to add secure access to Azure services (Azure Storage, Azure SQL Database, Azure Key Vault, or other services) from your web app. You could use a shared ",2023-07-31T08:00:00.000Z,tutorial,security,0.7,True,"Shows using managed identities to access Storage and other services; likely includes role names and scope configuration, which are product-specific security settings.",unchanged
-https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-app-graph-javascript,App to app to another Azure service as user,Tutorial: Authenticate users E2E to Azure - Azure App Service,Configure E2E user auth from App Service to Azure services,Learn how to use App Service authentication and authorization to secure your App Service apps end-to-end to a downstream Azure service.,"Learn how to create and configure a backend App service to accept a frontend app's user credential, then exchange that credential for a downstream Azure service. This allows a user to sign in to a frontend App service, pass their credential to a backend App service, then access an Azure service with the same identity. 
In this tutorial, you learn how to:",2023-03-22T00:00:00.000Z,tutorial,security,0.7,True,"Covers specific configuration to pass user tokens from a front-end App Service to a back-end and then to Azure services, including scopes and auth settings unique to App Service.",unchanged
-https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-azure-database,Connect to databases with managed identity,Tutorial: Access Azure databases with managed identity - Azure App Service,Secure database access from App Service with managed identity,"Secure database connectivity (Azure SQL Database, Database for MySQL, and Database for PostgreSQL) with managed identity from .NET, Node.js, Python, and Java apps.","App Service provides a highly scalable, self-patching web hosting service in Azure. It also provides a managed identity for your app, which is a turn-key solution for securing access to Azure databases, including: Note This tutorial doesn't include guidance for Azure Cosmos DB, which supports Microsoft Entra authentication differently. For more information, see the Azure Cosmos DB documentation, such as Use system-assigned managed identities to access Azure Cosmos DB data. 
Managed identities in App S",2026-03-18T08:00:00.000Z,tutorial,security,0.7,True,"Tutorial shows concrete, product-specific steps and configuration for using App Service managed identity with Azure SQL/MySQL/PostgreSQL, including auth configuration details that go beyond generic security concepts.",updated
-https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault,Use .NET,Tutorial: .NET connect to Azure services securely with Key Vault - Azure App Service,Use Key Vault with App Service .NET via MSI,Learn how to secure connectivity to back-end Azure services that don't support managed identity natively from a .NET web app.,"Azure App Service can use managed identities to connect to back-end services without a connection string, which eliminates connection secrets to manage and keeps your back-end connectivity secure in a production environment. For back-end services that don't support managed identities and still require connection secrets, you can use Key Vault to manage connection secrets. This tutorial uses Foundry Tools as an example to show you how it's done in practice. When you're finished, you have an app tha",2025-08-12T22:11:00.000Z,tutorial,integrations,0.7,True,"Tutorial shows product-specific pattern for using App Service managed identity with Azure Key Vault for a backend that doesn’t support MSI. 
It necessarily includes concrete SDK usage, configuration names (e.g., Key Vault URI, secret names, MSI usage), and wiring between App Service and Key Vault, which are integration-focused coding patterns rather than generic concepts.",unchanged
-https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault-javascript,Use JavaScript,Tutorial: JavaScript connect to Azure services securely with Key Vault - Azure App Service,Use Key Vault with App Service JavaScript via MSI,Learn how to secure connectivity to back-end Azure services that don't support managed identity natively from a JavaScript web app,"Azure App Service can use managed identities to connect to back-end services without a connection string, which eliminates connection secrets to manage and keeps your back-end connectivity secure in a production environment. For back-end services that don't support managed identities and still require connection secrets, you can use Key Vault to manage connection secrets. This tutorial uses Foundry Tools as an example to show you how it's done in practice. When you're finished, you have an app tha",2025-11-18T18:43:00.000Z,tutorial,integrations,0.7,True,"Same scenario as index 0 but for JavaScript. 
Contains concrete code and configuration for using App Service managed identity with Key Vault from a JavaScript app, which is a product-specific integration pattern with SDK parameters and configuration details.",unchanged
-https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault-php,Use PHP,Tutorial: PHP connect to Azure services securely with Key Vault - Azure App Service,Use Key Vault with App Service PHP via MSI,Learn how to secure connectivity to back-end Azure services that don't support managed identity natively from a PHP web app,"Azure App Service can use managed identities to connect to back-end services without a connection string, which eliminates connection secrets to manage and keeps your back-end connectivity secure in a production environment. For back-end services that don't support managed identities and still require connection secrets, you can use Key Vault to manage connection secrets. This tutorial uses Foundry Tools as an example to show you how it's done in practice. When you're finished, you have an app tha",2025-08-12T22:11:00.000Z,tutorial,integrations,0.7,True,"Same scenario as index 0 but for PHP. Provides specific integration steps and code to access Key Vault using App Service managed identity, including configuration details unique to this product combination.",unchanged
-https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault-python,Use Python,Tutorial: Python connect to Azure services securely with Key Vault - Azure App Service,Use Key Vault with App Service Python via MSI,Learn how to secure connectivity to back-end Azure services that don't support managed identity natively from a Python web app,"Azure App Service can use managed identities to connect to back-end services without a connection string, which eliminates connection secrets to manage and keeps your back-end connectivity secure in a production environment. 
For back-end services that don't support managed identities and still require connection secrets, you can use Key Vault to manage connection secrets. This tutorial uses Foundry Tools as an example to show you how it's done in practice. When you're finished, you have an app tha",2025-11-18T18:43:00.000Z,tutorial,integrations,0.7,True,"Same scenario as index 0 but for Python. Includes concrete SDK usage and configuration for integrating App Service managed identity with Key Vault, which fits the integrations & coding patterns category.",unchanged
+https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-storage-javascript,to other Azure services with managed identity,Tutorial - JavaScript Web app accesses storage by using managed identities - Azure App Service,Use managed identities to access Azure Storage from App Service,"In this tutorial, you learn how to access Azure Storage for a JavaScript app by using managed identities.","Learn how to access Azure services, such as Azure Storage, from a web app, not a signed-in user, running on Azure App Service by using managed identities. This tutorial demonstrates connecting to Azure Storage as an example. Any service that supports managed identity (B in the following image) can be securely accessed using this tutorial: You want to add secure access to Azure services, including Azure Storage, Azure SQL Database, and Azure Key Vault, from your web app. You could use a shared key,",2026-04-24T17:42:00.000Z,tutorial,security,0.7,True,"Shows how a JavaScript web app on App Service uses managed identities to securely access Azure Storage and other services. 
This generally includes specific role assignments (RBAC roles), identity configuration, and access patterns unique to these services, which are product-specific security configuration details.",updated
+https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-app-graph-javascript,App to app to another Azure service as user,Tutorial: Authenticate users E2E to Azure - Azure App Service,Configure end-to-end user auth to Azure services,Learn how to use App Service authentication and authorization to secure your App Service apps end-to-end to a downstream Azure service.,"Learn how to create and configure a back-end App Service app to accept a front-end app's user credential, then exchange that credential for a downstream Azure service. This approach allows a user to sign in to a front-end App Service app, pass their credential to a back-end App Service, then access an Azure service with the same identity. In this tutorial, you learn how to:",2026-04-24T17:42:00.000Z,tutorial,security,0.78,True,"Describes configuring App Service authentication/authorization so a front-end app passes user credentials to a back-end App Service and then to a downstream Azure service. This is detailed, product-specific identity and auth configuration, mapping identities across services.",updated
+https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-azure-database,Connect to databases with managed identity,Tutorial: Access Azure databases with managed identity - Azure App Service,Secure database access from App Service with managed identity,"Secure database connectivity (Azure SQL Database, Database for MySQL, and Database for PostgreSQL) with managed identity from .NET, Node.js, Python, and Java apps.","App Service provides a highly scalable, self-patching web hosting service in Azure. 
It also provides a managed identity for your app, which is a turn-key solution for securing access to Azure databases, including: Note This tutorial doesn't include guidance for Azure Cosmos DB, which supports Microsoft Entra authentication differently. For more information, see the Azure Cosmos DB documentation, such as Use system-assigned managed identities to access Azure Cosmos DB data. Managed identities in App S",2026-03-18T08:00:00.000Z,tutorial,security,0.7,True,"Tutorial shows concrete, product-specific steps and configuration for using App Service managed identity with Azure SQL/MySQL/PostgreSQL, including auth configuration details that go beyond generic security concepts.",unchanged
+https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault,Use .NET,Tutorial: Secure Foundry Tools Connection from .NET App Service Using Key Vault - Azure App Service,Secure .NET App Service secrets with Key Vault,Learn how to secure connectivity to back-end Azure services that don't support managed identity natively from a .NET web app.,"Azure App Service can use managed identities to connect to back-end services without a connection string. This approach eliminates connection secrets to manage and keeps your back-end connectivity secure in a production environment. When you're finished, you have an app that makes programmatic calls to Foundry Tools without storing any connection secrets in App Service. For back-end services that don't support managed identities and still require connection secrets, you can use Azure Key Vault to m",2026-04-06T08:00:00.000Z,tutorial,security,0.72,True,"Tutorial shows concrete, product-specific configuration for using managed identity and Azure Key Vault from an App Service .NET app, including how to store and reference secrets instead of connection strings. 
This is security-focused configuration (identity, secret storage) with service-specific settings, not just generic concepts.",updated
+https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault-javascript,Use JavaScript,Tutorial: JavaScript connect to Azure services securely with Key Vault - Azure App Service,Secure JavaScript App Service secrets with Key Vault,Learn how to secure connectivity to back-end Azure services that don't support managed identity natively using a JavaScript web app.,"Azure App Service can use managed identities to connect to back-end services without a connection string. This approach eliminates connection secrets to manage and keeps your back-end connectivity secure in a production environment. When you're finished, you have an app that makes programmatic calls to Foundry Tools without storing any connection secrets in App Service. For back-end services that don't support managed identities and still require connection secrets, you can use Azure Key Vault to m",2026-04-24T17:42:00.000Z,tutorial,security,0.72,True,Similar to [0] but for JavaScript; contains App Service + Key Vault integration steps using managed identity and secret configuration details. These are product-specific security and identity configuration patterns beyond generic knowledge.,updated
+https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault-php,Use PHP,Tutorial: PHP Connect to Azure Services Securely with Key Vault - Azure App Service,Secure PHP App Service secrets with Key Vault,Learn how to secure connectivity to back-end Azure services that don't support managed identity natively by using a PHP web app.,"Azure App Service can use managed identities to connect to back-end services without a connection string. This approach eliminates connection secrets to manage and keeps your back-end connectivity secure in a production environment. 
When you're finished, you have an app that makes programmatic calls to Foundry Tools without storing any connection secrets in App Service. For back-end services that don't support managed identities and still require connection secrets, you can use Azure Key Vault to m",2026-04-24T17:42:00.000Z,tutorial,security,0.72,True,"Tutorial for PHP App Service apps using Key Vault via managed identity. Includes concrete configuration of identity, Key Vault access, and secret usage, which are product-specific security settings.",updated
+https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault-python,Use Python,Tutorial: Python connect to Azure services securely with Key Vault - Azure App Service,Secure Python App Service secrets with Key Vault,Learn how to secure connectivity to back-end Azure services that don't support managed identity natively using a Python web app.,"Azure App Service can use managed identities to connect to back-end services without a connection string. This approach eliminates connection secrets to manage and keeps your back-end connectivity secure in a production environment. When you're finished, you have an app that makes programmatic calls to Foundry Tools without storing any connection secrets in App Service. 
For back-end services that don't support managed identities and still require connection secrets, you can use Azure Key Vault to m",2026-04-24T17:42:00.000Z,tutorial,security,0.72,True,Python variant of the Key Vault + App Service managed identity tutorial; contains specific security configuration steps and patterns for this product combination.,updated https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-sql-database,Connect .NET app to SQL database,Securely connect .NET apps to Azure SQL Database using Managed Identity - Azure App Service,Secure SQL access with managed identity in App Service,Learn how your app can use managed identity for secure access to Azure SQL Database and other Azure services without using passwords or secrets.,"Azure App Service provides a highly scalable, self-patching web hosting service in Azure. App Service also provides a managed identity for your app, which is a turnkey solution for securing access to Azure SQL and other Azure services. Managed identities in App Service make your app more secure by eliminating secrets, such as credentials in connection strings. This tutorial shows you how to add managed identity to a sample .NET app that has an Azure SQL backend. After you finish, your app can connect",2025-06-26T22:19:00.000Z,tutorial,security,0.65,True,"Tutorial on using managed identity to access Azure SQL; typically includes specific role assignments, connection string formats, and Entra/SQL configuration steps unique to App Service.",unchanged
This overview recommends the more secure method for connecting.","Your app service might need to connect to other Azure services such as a database, storage, or another app. This overview recommends different methods for connecting and when to use them. Today, the decision for a connectivity approach is closely related to secrets management. The common pattern of using connection secrets in connection strings, such as username and password, secret key, etc. is no longer considered the most secure approach for connectivity. The risk is even higher today because",2026-03-12T08:00:00.000Z,overview,decision-making,0.7,True,Overview focused on when to use different secure connectivity approaches between App Service and other Azure resources; contains product-specific guidance on selecting methods based on security/secret management rather than generic concepts.,updated +https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-overview,Connectivity scenarios overview,Securely Connect to Azure Resources - Azure App Service,Choose secure connectivity methods for App Service,"Shows you how to connect to other Azure services such as a database, storage, or another app. This overview recommends the more secure method for connecting.","Your app service might need to connect to other Azure services such as a database, storage, or another app. This overview recommends different methods for connecting and when to use them. Today, the decision for a connectivity approach is closely related to secrets management. The common pattern of using connection secrets in connection strings, such as username and password, secret key, etc. is no longer considered the most secure approach for connectivity. 
The risk is even higher today because",2026-03-12T08:00:00.000Z,overview,decision-making,0.7,True,Overview focused on when to use different secure connectivity approaches between App Service and other Azure resources; contains product-specific guidance on selecting methods based on security/secret management rather than generic concepts.,unchanged https://learn.microsoft.com/en-us/azure/app-service/tutorial-custom-container,Deploy app with Azure Container Registry,Tutorial: Build and Run a Custom Image in Azure App Service - Azure App Service,,Learn how to migrate custom software to App Service in a custom container. Build a custom image and deploy it to App Service.,"Azure App Service provides predefined application stacks, like ASP.NET or Node.js, on Windows. These application stacks run on IIS. The preconfigured Windows environment locks down the operating system from: For more information, see Operating system functionality on App Service. You can deploy a custom-configured Windows image from Visual Studio to make OS changes that your app needs. Doing so makes it easy to migrate an on-premises app that requires a custom OS and software configuration. This t",2025-09-08T22:37:00.000Z,tutorial,,0.45,False,Tutorial for building and running a custom image; primarily a migration walkthrough rather than a detailed configuration or limits reference.,unchanged https://learn.microsoft.com/en-us/azure/app-service/tutorial-custom-container-sidecar,Configure a sidecar container,Tutorial: Configure a sidecar for a custom container app - Azure App Service,Configure sidecar containers for Linux custom apps in App Service,Add sidecar containers to your custom container in Azure App Service to add or update application services without changing your main container.,"This tutorial shows you how to add an OpenTelemetry collector as a sidecar container to a Linux custom container app in Azure App Service. 
Sidecar containers in App Service let you deploy extra services and features to your Linux apps without tightly coupling them to the built-in or custom main container. The sidecar containers run alongside the main application container in the same App Service plan. You can add up to nine sidecar containers for each Linux app in App Service. For example, you c",2025-07-16T22:08:00.000Z,tutorial,configuration,0.8,True,"Details how to add and configure sidecar containers (e.g., OpenTelemetry collector) with limits like up to nine sidecars; App Service–specific container configuration.",unchanged -https://learn.microsoft.com/en-us/azure/app-service/tutorial-dotnetcore-sqldb-app,ASP.NET Core with SQL DB,Deploy ASP.NET Core and Azure SQL Database app - Azure App Service,,Learn how to deploy an ASP.NET Core web app to Azure App Service and connect to an Azure SQL Database.,"In this tutorial, you learn how to deploy a data-driven ASP.NET Core app to Azure App Service and connect to an Azure SQL Database. You'll also deploy an Azure Cache for Redis to enable the caching code in your application. Azure App Service is a highly scalable, self-patching, web-hosting service that can easily deploy apps on Windows or Linux. Although this tutorial uses an ASP.NET Core 8.0 app, the process is the same for other versions of ASP.NET Core. In this tutorial, you learn how to:",2026-03-19T08:00:00.000Z,tutorial,,0.2,False,"Step-by-step deployment tutorial for an ASP.NET Core app with Azure SQL and Redis. 
It appears focused on walkthrough instructions rather than detailed configuration tables, limits, error codes, or product-specific best-practice guidance with quantified impact.",updated -https://learn.microsoft.com/en-us/azure/app-service/tutorial-java-jboss-mysql-app,JBoss with MySQL,Tutorial: Linux Java app with JBoss and MySQL - Azure App Service,,"Learn how to get a data-driven Linux JBoss app working in Azure App Service, with connection to a MySQL running in Azure.","This tutorial shows how to build, configure, and deploy a secure JBoss application in Azure App Service that connects to a MySQL database (usingAzure Database for MySQL). Azure App Service is a highly scalable, self-patching, web-hosting service that can easily deploy apps on Windows or Linux. When you're finished, you'll have a JBoss app running onAzure App Service on Linux. Note JBoss EAP on App Service now supports ""Bring Your Own License"" (BYOL) billing, this allows customers with existing R",2025-06-02T08:00:00.000Z,tutorial,,0.2,False,JBoss + MySQL deployment tutorial; procedural guidance without detailed configuration tables or limits.,unchanged +https://learn.microsoft.com/en-us/azure/app-service/tutorial-dotnetcore-sqldb-app,ASP.NET Core with SQL DB,Deploy ASP.NET Core and Azure SQL Database app - Azure App Service,,Learn how to deploy an ASP.NET Core web app to Azure App Service and connect to an Azure SQL Database.,"In this tutorial, you learn how to deploy a data-driven ASP.NET Core app to Azure App Service and connect to an Azure SQL Database. You'll also deploy an Azure Cache for Redis to enable the caching code in your application. Azure App Service is a highly scalable, self-patching, web-hosting service that can easily deploy apps on Windows or Linux. Although this tutorial uses an ASP.NET Core 8.0 app, the process is the same for other versions of ASP.NET Core. 
In this tutorial, you learn how to:",2026-03-19T08:00:00.000Z,tutorial,,0.2,False,"Step-by-step deployment tutorial for an ASP.NET Core app with Azure SQL and Redis. It appears focused on walkthrough instructions rather than detailed configuration tables, limits, error codes, or product-specific best-practice guidance with quantified impact.",unchanged -https://learn.microsoft.com/en-us/azure/app-service/tutorial-java-jboss-mysql-app,JBoss with MySQL,Tutorial: Linux Java app with JBoss and MySQL - Azure App Service,,"Learn how to get a data-driven Linux JBoss app working in Azure App Service, with connection to a MySQL running in Azure.","This tutorial shows how to build, configure, and deploy a secure JBoss application in Azure App Service that connects to a MySQL database (usingAzure Database for MySQL). Azure App Service is a highly scalable, self-patching, web-hosting service that can easily deploy apps on Windows or Linux. When you're finished, you'll have a JBoss app running onAzure App Service on Linux. Note JBoss EAP on App Service now supports ""Bring Your Own License"" (BYOL) billing, this allows customers with existing R",2025-06-02T08:00:00.000Z,tutorial,,0.2,False,JBoss + MySQL deployment tutorial; procedural guidance without detailed configuration tables or limits.,unchanged +https://learn.microsoft.com/en-us/azure/app-service/tutorial-java-jboss-mysql-app,JBoss with MySQL,Tutorial: Linux Java app with JBoss and MySQL - Azure App Service,,"Learn how to get a data-driven Linux JBoss app working in Azure App Service, with connection to a MySQL running in Azure.","This tutorial shows how to build, configure, and deploy a secure JBoss application in Azure App Service that connects to a MySQL database using Azure Database for MySQL. Azure App Service is a highly scalable, self-patching, web-hosting service that can easily deploy apps on Windows or Linux. When you finish this tutorial, you have a JBoss app that runs on Azure App Service on Linux. Note JBoss Enterprise Application Platform (JBoss EAP) on App Service now supports ""Bring Your Own License"" (BYOL) ",2026-03-25T08:00:00.000Z,tutorial,,0.3,False,"Primarily a step-by-step tutorial for deploying a Java JBoss app with MySQL on App Service. It likely shows example settings and code, but not in the form of comprehensive configuration tables, limits, or product-specific best-practice guidance with quantified impact. 
No clear evidence of expert-only limits, quotas, or specialized troubleshooting content.",updated https://learn.microsoft.com/en-us/azure/app-service/tutorial-java-quarkus-postgresql-app,Quarkus with PostgreSQL,Tutorial: Linux Java app with Quarkus and PostgreSQL - Azure App Service,,"Learn how to get a data-driven Linux Quarkus app working in Azure App Service, with connection to a PostgreSQL running in Azure.","In this tutorial, you deploy a data-driven Quarkus web application to Azure App Service with the Azure Database for PostgreSQL relational database service. Azure App Service supports Java Standard Edition (Java SE) in a Windows or Linux server environment. In this tutorial, you learn how to:",2025-06-03T17:04:00.000Z,tutorial,,0.2,False,"Step-by-step tutorial deploying Quarkus + PostgreSQL; no config tables, limits, or product-specific error mappings.",unchanged https://learn.microsoft.com/en-us/azure/app-service/tutorial-java-spring-cosmosdb,Spring Boot with MongoDB,Tutorial: Linux Java app with MongoDB - Azure App Service,,"Learn how to get a data-driven Linux Java app working in Azure App Service, with connection to a MongoDB running in Azure Cosmos DB.","In this tutorial, you learn how to build, configure, and deploy a secure Spring Boot application in Azure App Service that connects to a MongoDB database in Azure (actually, a Cosmos DB database with MongoDB API). When you're finished, you'll have a Java SE application running on Azure App Service on Linux. 
In this tutorial, you learn how to:",2025-04-17T08:00:00.000Z,tutorial,,0.3,False,"Tutorial for Spring Boot app with Cosmos DB (Mongo API); primarily procedural, not a configuration or limits reference.",unchanged https://learn.microsoft.com/en-us/azure/app-service/tutorial-java-tomcat-connect-managed-identity-postgresql-database,Java Tomcat to Postgres,Tutorial: Access data with managed identity in Java - Azure App Service,,"Secure Azure Database for PostgreSQL connectivity with managed identity from a sample Java Tomcat app, and apply it to other Azure services.","Azure App Service provides a highly scalable, self-patching web hosting service in Azure. It also provides a managed identity for your app, which is a turn-key solution for securing access to Azure Database for PostgreSQL and other Azure services. Managed identities in App Service make your app more secure by eliminating secrets from your app, such as credentials in the environment variables. In this tutorial, you learn how to: If you don't have an Azure account, create a free account before you begin.",2026-04-10T11:20:00.000Z,tutorial,,0.2,False,"Tutorial-style walkthrough for using managed identity from a Java Tomcat app to access Azure Database for PostgreSQL. It focuses on step-by-step setup and conceptual security benefits, without detailed configuration parameter tables, specific RBAC role lists, error-code-based troubleshooting, or numeric limits/quotas. Content is primarily instructional, not a reference of expert-only details.",unchanged https://learn.microsoft.com/en-us/azure/app-service/tutorial-java-tomcat-mysql-app,Tomcat with MySQL,Tutorial: Linux Java app with Tomcat and MySQL - Azure App Service,,"Learn how to get a data-driven Linux Tomcat app working in Azure App Service, with connection to a MySQL running in Azure.","This tutorial shows how to build, configure, and deploy a secure Tomcat application in Azure App Service that connects to a MySQL database (using Azure Database for MySQL). 
Azure App Service is a highly scalable, self-patching, web-hosting service that can easily deploy apps on Windows or Linux. When you're finished, you have a Tomcat app running on Azure App Service on Linux. In this tutorial, you learn how to:",2025-04-17T08:00:00.000Z,tutorial,,0.2,False,"Tomcat + MySQL deployment tutorial; mostly walkthrough, no detailed limits, config matrices, or troubleshooting content.",unchanged -https://learn.microsoft.com/en-us/azure/app-service/tutorial-multi-region-app,Deploy a multi-region app (tutorial),Tutorial: Create Multi-Region App - Azure App Service,,Build a multi-region app on Azure App Service that can be used for high availability and fault tolerance.,"High availability and fault tolerance are key components of a well-architected solution. A robust configuration includes an emergency plan for unexpected failures, to reduce downtime and keep systems running automatically. When you deploy an app to the cloud, you choose a region in that cloud for the app infrastructure base. If you deploy an app to a single region only, and that region becomes unavailable, the app is also unavailable. The lack of availability might be unacceptable under the term",2026-04-13T22:10:00.000Z,tutorial,,0.3,False,"Tutorial-style multi-region deployment walkthrough; likely step-by-step instructions without detailed limits tables, decision matrices, or product-specific configuration parameter catalogs. Primarily architecture concept and how-to, not expert reference content as defined.",updated -https://learn.microsoft.com/en-us/azure/app-service/tutorial-networking-isolate-vnet,Isolate network traffic (tutorial),Tutorial: Isolate back-end communication with Virtual Network integration - Azure App Service,,Connections from App Service to back-end services are routed through shared network infrastructure with other apps and subscriptions. 
Learn how to isolate traffic by using Virtual Network integration.,"In this article you will configure an App Service app with secure, network-isolated communication to backend services. The example scenario used is inTutorial: Secure Cognitive Service connection from App Service using Key Vault. When you're finished, you have an App Service app that accesses both Key Vault and Azure AI services through anAzure virtual network, and no other traffic is allowed to access those back-end resources. All traffic will be isolated within your virtual network usingvirtua",2026-02-05T08:00:00.000Z,tutorial,,0.4,False,Networking isolation tutorial; likely procedural with some VNet integration steps but not a comprehensive configuration reference with parameter tables.,unchanged +https://learn.microsoft.com/en-us/azure/app-service/tutorial-multi-region-app,Deploy a multi-region app (tutorial),Tutorial: Create Multi-Region App - Azure App Service,,Build a multi-region app on Azure App Service that can be used for high availability and fault tolerance.,"High availability and fault tolerance are key components of a well-architected solution. A robust configuration includes an emergency plan for unexpected failures, to reduce downtime and keep systems running automatically. When you deploy an app to the cloud, you choose a region in that cloud for the app infrastructure base. If you deploy an app to a single region only, and that region becomes unavailable, the app is also unavailable. The lack of availability might be unacceptable under the term",2026-04-13T22:10:00.000Z,tutorial,,0.3,False,"Tutorial-style multi-region deployment walkthrough; likely step-by-step instructions without detailed limits tables, decision matrices, or product-specific configuration parameter catalogs. 
Primarily architecture concept and how-to, not expert reference content as defined.",unchanged +https://learn.microsoft.com/en-us/azure/app-service/tutorial-networking-isolate-vnet,Isolate network traffic (tutorial),Tutorial: Isolate Back-End Communication with Virtual Network Integration - Azure App Service,Isolate Azure App Service traffic with VNet integration,Connections from App Service to back-end services are routed through shared network infrastructure with other apps and subscriptions. Learn how to isolate traffic by using virtual network integration.,"In this article, you configure an App Service app with secure, network-isolated communication to back-end services. The example scenario used is in Tutorial: Secure Cognitive Service connection from App Service using Key Vault. When you finish, you have an App Service app that accesses both Key Vault and Foundry Tools through an Azure virtual network. No other traffic is allowed to access those back-end resources. All traffic will be isolated within your virtual network via virtual network integrat",2026-04-24T17:42:00.000Z,tutorial,security,0.65,True,"Tutorial shows concrete, product-specific steps and settings to securely route App Service outbound traffic through a virtual network so only VNet-isolated access to Key Vault and back-end services is allowed. This is security-focused configuration (network isolation, VNet integration) with App Service–specific behavior rather than a generic networking overview.",updated https://learn.microsoft.com/en-us/azure/app-service/tutorial-nodejs-mongodb-app,with MongoDB,Deploy a Node.js + MongoDB app to Azure - Azure App Service,,Learn how to deploy a Node.js app using Express.js and a MongoDB database using Azure App Service in Linux.,"This tutorial shows how to create a secure Node.js app in Azure App Service that's connected to an Azure Cosmos DB for MongoDB database. 
Azure App Service provides a highly scalable, self-patching web hosting service using the Linux operating system. When you're finished, you have an Express.js app running on Azure App Service on Linux. In this tutorial, you learn how to:",2025-12-10T08:00:00.000Z,tutorial,,0.25,False,"Node.js + MongoDB tutorial; focuses on building and deploying an app, not on structured config, limits, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/app-service/tutorial-php-mysql-app,Connect with MySQL,Tutorial: PHP app with MySQL and Redis - Azure App Service,,"Learn how to get a PHP app working in Azure, with connection to a MySQL database and a Redis cache in Azure. Laravel is used in the tutorial.","This tutorial shows how to create a secure PHP app in Azure App Service that connects to a MySQL database using Azure Database for MySQL Flexible Server. You also deploy an Azure Cache for Redis to enable the caching code in your application. Azure App Service is a highly scalable, self-patching, web-hosting service that can easily deploy apps on Windows or Linux. When you're finished, you have a Laravel app running on Azure App Service on Linux.",2025-11-19T23:11:00.000Z,tutorial,,0.25,False,"PHP + MySQL + Redis tutorial; deployment walkthrough, not a structured configuration or limits reference.",unchanged https://learn.microsoft.com/en-us/azure/app-service/tutorial-python-postgresql-app-django,using Django,Tutorial: Deploy a Python Django web app with PostgreSQL - Azure App Service,,Create a Python Django web app with a PostgreSQL database and deploy it to Azure. The app is hosted on Azure App Service on Linux.,"In this tutorial, you deploy a data-driven Python web app to Azure App Service that uses the Azure Database for PostgreSQL relational database service. Azure App Service supports Python in a Linux server environment. This article uses Django. Alternatives include Flask or the FastAPI tutorial. 
In this tutorial, you learn how to:",2025-12-10T18:17:00.000Z,tutorial,,0.25,False,"Django + PostgreSQL tutorial; similar to other app tutorials, mainly procedural.",unchanged https://learn.microsoft.com/en-us/azure/app-service/tutorial-python-postgresql-app-fastapi,using FastAPI,Tutorial: Deploy a Python FastAPI Web App with PostgreSQL - Azure App Service,,Create a FastAPI web app with a PostgreSQL database and deploy it to Azure. The tutorial uses the FastAPI framework and the app is hosted on Azure App Service on Linux.,"In this tutorial, you deploy a data-driven Python web app (FastAPI) to Azure App Service with the Azure Database for PostgreSQL relational database service. Azure App Service supports Python in a Linux server environment. If you want, see the Flask tutorial or the Django tutorial instead.",2025-12-18T23:12:00.000Z,tutorial,,0.25,False,"FastAPI + PostgreSQL tutorial; app deployment steps, not a configuration or limits reference.",unchanged https://learn.microsoft.com/en-us/azure/app-service/tutorial-python-postgresql-app-flask,using Flask,Tutorial: Deploy a Python Flask web app with PostgreSQL - Azure App Service,,Create a Python Flask web app with a PostgreSQL database and deploy it to Azure. The tutorial uses the Flask framework and Azure App Service on Linux.,"In this tutorial, you deploy a data-driven Python web app to Azure App Service with the Azure Database for PostgreSQL relational database service. Azure App Service supports Python in a Linux server environment. This article uses a Flask app. Alternatives include Django or the FastAPI tutorial. 
In this tutorial, you learn how to:",2026-01-13T18:33:00.000Z,tutorial,,0.25,False,"Flask + PostgreSQL tutorial; deployment walkthrough without structured config tables, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/app-service/tutorial-secure-domain-certificate,Domain and cert quickstart,Secure App with a Custom Domain and Certificate - Azure App Service,Secure App Service apps with custom domains and certificates,Learn how to secure your brand with App Service by using a custom domain and enabling App Service managed certificate.,"Important Starting July 28, 2025, changes to App Service Managed Certificates (ASMC) will impact how certificates are issued and renewed in certain scenarios. While most customers don’t need to take action, we recommend reviewing our ASMC detailed blog post for more information. The default domain name that comes with your app might not represent your brand the way you want. In this tutorial, you configure App Service with a www domain you own, such as www.contoso.com, and secure the custom domain with",2025-07-18T08:00:00.000Z,tutorial,security,0.75,True,"Tutorial on configuring custom domains and App Service managed certificates, including TLS/SSL bindings—product-specific security configuration.",unchanged -https://learn.microsoft.com/en-us/azure/app-service/tutorial-secure-ntier-app,Deploy an N-tier app (tutorial),Tutorial: Create Secure N-tier Web App - Azure App Service,Secure N-tier Azure App Service with private networking,"Securely deploy your N-tier web app to Azure App Service. Create a virtual network and subnets, private DNS zones, and private endpoints, and configure virtual network integration.","Many applications have more than a single component. For example, you might have a frontend that's publicly accessible and connects to a backend API or web app. 
The backend resources might connect to a database, storage account, key vault, another virtual machine, or a combination of these resources. This architecture is the foundation of an N-tier application. It's important that applications like this are architected to protect backend resources to the greatest extent possible. This tutorial d",2026-04-15T22:11:00.000Z,tutorial,security,0.7,True,"Tutorial for securing an N-tier web app using virtual networks, subnets, private DNS zones, and private endpoints. This typically includes concrete Azure security configurations (VNet integration settings, subnet usage, private endpoint setup) that are product-specific and go beyond generic security concepts.",updated +https://learn.microsoft.com/en-us/azure/app-service/tutorial-secure-ntier-app,Deploy an N-tier app (tutorial),Tutorial: Create Secure N-tier Web App - Azure App Service,Secure N-tier Azure App Service with private networking,"Securely deploy your N-tier web app to Azure App Service. Create a virtual network and subnets, private DNS zones, and private endpoints, and configure virtual network integration.","Many applications have more than a single component. For example, you might have a frontend that's publicly accessible and connects to a backend API or web app. The backend resources might connect to a database, storage account, key vault, another virtual machine, or a combination of these resources. This architecture is the foundation of an N-tier application. It's important that applications like this are architected to protect backend resources to the greatest extent possible. This tutorial d",2026-04-15T22:11:00.000Z,tutorial,security,0.7,True,"Tutorial for securing an N-tier web app using virtual networks, subnets, private DNS zones, and private endpoints. 
This typically includes concrete Azure security configurations (VNet integration settings, subnet usage, private endpoint setup) that are product-specific and go beyond generic security concepts.",unchanged https://learn.microsoft.com/en-us/azure/app-service/tutorial-send-email,Connect to Logic Apps,Tutorial: Integrate with Azure Logic Apps to send email - Azure App Service,,Learn how to create an Azure Logic apps resource to send email and invoke other business processes from your App Service app.,"In this tutorial, you learn how to integrate your App Service app with your business processes by using Azure Logic Apps. You create a logic app that sends email via Gmail from your Azure App Service app. There are other ways to send emails from a web app, such as using Simple Mail Transfer Protocol (SMTP) configuration in your language framework. However, Logic Apps provides a simple configuration interface for many business integrations without adding complexity to your code. You can use the st",2025-09-02T17:11:00.000Z,tutorial,,0.35,False,Step-by-step tutorial for Logic Apps email integration; mostly generic workflow without deep configuration tables or product-specific constraints.,unchanged https://learn.microsoft.com/en-us/azure/app-service/tutorial-sidecar,Deploy sidecar container,Tutorial: Configure a sidecar container - Azure App Service,Configure sidecar containers for Linux apps on App Service,Add sidecar containers to your Linux app in Azure App Service. Add or update services to your application without changing your application code.,"In this tutorial, you add an OpenTelemetry collector as a sidecar container to a Linux (bring-your-own-code) app in Azure App Service. For custom containers, see Tutorial: Configure a sidecar container for custom container in Azure App Service. If you don't have an Azure account, create a free account before you begin. 
Sidecar containers in App Service let you deploy extra services and features to your Linux apps without tightly coupling them to the built-in or custom main container. The sidecar co",2025-07-16T22:08:00.000Z,tutorial,configuration,0.7,True,"Shows how to add sidecar containers with App Service–specific configuration steps and constraints, including OpenTelemetry collector setup.",unchanged https://learn.microsoft.com/en-us/azure/app-service/tutorial-troubleshoot-monitor,Azure Monitor tutorial,Tutorial: Troubleshoot with Azure Monitor - Azure App Service,Troubleshoot App Service apps with Azure Monitor,Learn how Azure Monitor and Log Analytics help you monitor your App Service web app. Azure Monitor maximizes the availability by delivering a comprehensive solution for monitoring your environments.,"This tutorial shows how to troubleshoot an Azure App Service app by using Azure Monitor. The sample app includes code meant to exhaust memory and cause HTTP 500 errors, so you can diagnose and fix the problem by using Azure Monitor. When you're finished, you have a sample app running on App Service on Linux integrated with Azure Monitor. Azure Monitor maximizes the availability and performance of your applications and services by delivering a comprehensive solution for collecting, analyzing, and ",2025-12-15T18:25:00.000Z,tutorial,troubleshooting,0.8,True,Tutorial focused on diagnosing HTTP 500 and memory issues using Azure Monitor and Log Analytics—symptom-to-diagnosis troubleshooting patterns specific to App Service.,unchanged https://learn.microsoft.com/en-us/azure/app-service/tutorial-webjobs,WebJobs Tutorial,Build a scheduled WebJob using your preferred language - Azure App Service,,WebJobs on App Service enable you to automate repetitive tasks on your app. Learn how to create scheduled WebJobs in Azure App Service.,WebJobs is a feature of Azure App Service that enables you to run a program or script in the same instance as a web app. 
All App Service plans support WebJobs at no extra cost. This tutorial walks you through creating a scheduled (triggered) WebJob using your preferred development stack.,2025-11-03T17:15:00.000Z,tutorial,,0.2,False,Tutorial for building a scheduled WebJob; focuses on basic usage rather than exhaustive configuration or product-specific edge cases.,unchanged https://learn.microsoft.com/en-us/azure/app-service/web-sites-monitor,About quotas & alerts,Azure App Service Quotas and Metrics - Azure App Service,Understand quotas and metrics for Azure App Service,Learn how to monitor apps in Azure App Service by using the Azure portal. Understand the quotas and metrics that are reported.,"Azure App Service provides built-in monitoring functionality for web, mobile, and API apps in the Azure portal. In the portal, you can review quotas and metrics for an app and App Service plan. You can set up alerts and autoscaling rules based on metrics.",2025-03-31T17:03:00.000Z,concept-article,limits-quotas,0.8,True,Article on quotas and metrics; typically includes specific numeric limits and how they’re reported—service-specific limits and monitoring details.,unchanged -https://learn.microsoft.com/en-us/azure/app-service/web-sites-traffic-manager,Integrate with Traffic Manager,Control Traffic with Traffic Manager - Azure App Service,Apply Traffic Manager best practices with Azure App Service,Find best practices for configuring Azure Traffic Manager when you integrate it with Azure App Service.,"Note This article provides summary information for Microsoft Azure Traffic Manager as it relates to Azure App Service. More information about Azure Traffic Manager itself can be found by visiting the links at the end of this article. You can use Azure Traffic Manager to control how requests from web clients are distributed to apps in Azure App Service. 
When App Service endpoints are added to an Azure Traffic Manager profile, Azure Traffic Manager keeps track of the status of your App Service app",2025-08-15T22:10:00.000Z,concept-article,best-practices,0.7,True,Explicitly framed as best practices for configuring Traffic Manager with App Service; includes product-specific recommendations and gotchas.,unchanged +https://learn.microsoft.com/en-us/azure/app-service/web-sites-traffic-manager,Integrate with Traffic Manager,Control Traffic with Traffic Manager - Azure App Service,Configure Azure Traffic Manager with App Service endpoints,Find best practices for configuring Azure Traffic Manager when you integrate it with Azure App Service.,"Note This article provides information about Azure Traffic Manager as it relates to Azure App Service. For more information about Traffic Manager, select the link at the end of this article. You can use Traffic Manager to control how requests from web clients are distributed to apps in App Service. When App Service endpoints are added to a Traffic Manager profile, Traffic Manager keeps track of the status of your App Service apps (running, stopped, or deleted) so that it can determine which of t",2026-04-24T17:42:00.000Z,concept-article,best-practices,0.7,True,"Article is explicitly about best practices for using Traffic Manager with App Service, including product-specific recommendations on endpoint configuration, health checks, and routing behavior that go beyond generic load-balancing concepts.",updated https://learn.microsoft.com/en-us/azure/app-service/webjobs-create,How to create WebJobs,How-to Run Background Tasks with WebJobs - Azure App Service,,Learn how to use WebJobs to run background tasks in Azure App Service. Choose from various script formats and run them with CRON expressions.,This article explains how to deploy WebJobs by using the Azure portal to upload an executable or script. 
WebJobs is a feature of Azure App Service that allows you to run a program or script in the same instance as a web app. All App Service plans support WebJobs. There's no extra cost to use WebJobs.,2025-05-01T22:41:00.000Z,concept-article,,0.3,False,How-to for running background tasks with WebJobs; likely procedural without detailed parameter tables or error mappings.,unchanged https://learn.microsoft.com/en-us/azure/app-service/webjobs-dotnet-deploy-vs,Develop WebJobs using VS,Develop and Deploy WebJobs Using Visual Studio - Azure App Service,,"Learn how to develop Azure WebJobs in Visual Studio and deploy them to Azure App Service, including creating a scheduled task.","This article explains how to use Visual Studio to deploy a console app project to a web app in Azure App Service as an Azure WebJob. For information about how to deploy WebJobs by using the Azure portal, see Run background tasks with WebJobs in Azure App Service. You can choose to develop a WebJob that runs as either a .NET Core app or a .NET Framework app. Version 3.x of the Azure WebJobs SDK lets you develop WebJobs that run as either .NET Core apps or .NET Framework apps, while version 2.x supports onl",2026-02-10T08:00:00.000Z,how-to,,0.2,False,"Step-by-step Visual Studio deployment/tutorial for WebJobs; no configuration tables, limits, quotas, or product-specific error/diagnostic details.",unchanged https://learn.microsoft.com/en-us/azure/app-service/webjobs-execution,How WebJobs work,How WebJobs run in Azure App Service - Azure App Service,Configure WebJobs execution behavior with Kudu settings,"Learn how the Kudu engine discovers, triggers, and manages WebJobs in Azure App Service. Azure WebJobs can run background tasks in your app.","Azure WebJobs allow you to run background tasks in your App Service app, without needing separate infrastructure. The Kudu engine discovers and manages these tasks. The Kudu engine is the built-in App Service deployment and runtime management service.
Kudu handles WebJob execution, file system access, diagnostics, and log collection behind the scenes. This article explains how WebJobs are discovered, how the runtime decides what to run, and how you can configure behavior using the optional settin",2025-12-09T23:13:00.000Z,article,configuration,0.7,True,Explains how Kudu discovers and runs WebJobs and mentions optional settings; this typically includes specific setting names and behaviors unique to WebJobs.,unchanged diff --git a/products/azure-app-service/report.md b/products/azure-app-service/report.md index 11087e47..e29cf6d5 100644 --- a/products/azure-app-service/report.md +++ b/products/azure-app-service/report.md @@ -1,24 +1,23 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: decision-making: Guidance on choosing App Service tiers, plans, auth and networking, plus planning cost, TLS, domains, and migrations (Windows↔Linux, .NET, VNet, Docker Compose, Arc). - best-practices: Best practices for deploying, securing, routing, and maintaining - App Service apps, including handling IP/TLS changes, Traffic Manager, and minimizing - downtime during maintenance/restarts - configuration: 'Configuring App Service apps: runtime and language settings, containers, - networking/VNet, domains/certs, storage, security/auth, monitoring, backups, and - environment variables.' - security: 'Securing App Service apps: auth (Entra, social, OIDC, MCP), TLS/certs, - IP/VNet/firewall, managed identities/Graph/SQL/Storage access, and end‑to‑end - network and data protection.' + best-practices: Best practices for deploying and securing App Service apps, handling + inbound/outbound and TLS IP changes, and using Traffic Manager for resilient, + geo-distributed endpoints + configuration: 'Configuring App Service apps: app settings, networking/VNet, storage, + containers/sidecars, auth, certificates/domains, language runtimes, health/monitoring, + backup/restore, and ASE setup.'
+ security: 'Securing App Service apps: auth (Entra, social, OIDC, MCP), managed identities/Graph/Storage/SQL, + TLS/certs, IP/VNet/firewall, Key Vault secrets, and end-to-end network isolation.' deployment: 'Deploying and scaling App Service apps: CI/CD (GitHub Actions, Azure Pipelines, CLI/PowerShell), ZIP/FTP/Git deploy, custom containers, slots, ASE/Arc, scaling, DNS and credentials.' - integrations: Patterns for integrating App Service with TLS/SSL, Application Gateway, - Azure OpenAI chatbots, Key Vault via MSI, managed identity DB access, and WebJobs - event-driven bindings. + integrations: Patterns for integrating App Service apps with APM, TLS/SSL certs, + Application Gateway, MCP, Azure OpenAI chatbots (Node/Flask), and event-driven + jobs via WebJobs bindings. architecture-patterns: 'Architectural guidance for App Service: ASE geo-distribution, outbound traffic via NAT Gateway, and recommended Azure services/patterns for building scalable, secure apps.' @@ -31,17 +30,16 @@ category_descriptions: skill_description: Expert knowledge for Azure App Service development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when - choosing App Service plans/tiers, configuring TLS/domains, deploying via slots/CI-CD, - or securing with Entra/managed identity, and other Azure App Service related development - tasks. Not for Azure Functions (use azure-functions), Azure Spring Apps (use azure-spring-apps), - Azure Static Web Apps (use azure-static-web-apps), Azure Kubernetes Service (AKS) - (use azure-kubernetes-service). -use_when: Use when choosing App Service plans/tiers, configuring TLS/domains, deploying - via slots/CI-CD, or securing with Entra/managed identity, and other Azure App Service - related development tasks. 
-confusable_not_for: Not for Azure Functions (use azure-functions), Azure Spring Apps - (use azure-spring-apps), Azure Static Web Apps (use azure-static-web-apps), Azure - Kubernetes Service (AKS) (use azure-kubernetes-service). + choosing App Service plans/ASE, configuring VNet/inbound/outbound, auth/TLS, slots/CI-CD, + or custom containers, and other Azure App Service related development tasks. Not + for Azure Functions (use azure-functions), Azure Container Apps (use azure-container-apps), + Azure Spring Apps (use azure-spring-apps), Azure Static Web Apps (use azure-static-web-apps). +use_when: Use when choosing App Service plans/ASE, configuring VNet/inbound/outbound, + auth/TLS, slots/CI-CD, or custom containers, and other Azure App Service related + development tasks. +confusable_not_for: Not for Azure Functions (use azure-functions), Azure Container + Apps (use azure-container-apps), Azure Spring Apps (use azure-spring-apps), Azure + Static Web Apps (use azure-static-web-apps). --- # Azure App Service Crawl Report @@ -50,13 +48,13 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Spring - **Total Pages**: 248 - **Fetched**: 248 - **Fetch Failed**: 0 -- **Classified**: 158 -- **Unclassified**: 90 +- **Classified**: 159 +- **Unclassified**: 89 ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 14 -- **Unchanged**: 234 +- **Updated Pages**: 23 +- **Unchanged**: 225 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-app-service/azure-app-service.csv` @@ -66,47 +64,60 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Spring |------|-------|------------| | architecture-patterns | 1 | 0.4% | | best-practices | 7 | 2.8% | -| configuration | 52 | 21.0% | +| configuration | 50 | 20.2% | | decision-making | 17 | 6.9% | | deployment | 23 | 9.3% | -| integrations | 10 | 4.0% | +| integrations | 7 | 2.8% | | limits-quotas | 1 | 0.4% | -| security | 44 | 17.7% | +| 
security | 50 | 20.2% | | troubleshooting | 3 | 1.2% | -| *(Unclassified)* | 90 | 36.3% | +| *(Unclassified)* | 89 | 35.9% | ## Changes ### Updated Pages -- [Connectivity scenarios overview](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-overview) - - Updated: 2024-07-08T17:06:00.000Z → 2026-03-12T08:00:00.000Z -- [Connect to databases with managed identity](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-azure-database) - - Updated: 2024-09-30T08:00:00.000Z → 2026-03-18T08:00:00.000Z -- [Scale out automatically](https://learn.microsoft.com/en-us/azure/app-service/manage-automatic-scaling) - - Updated: 2026-01-10T06:10:00.000Z → 2026-04-16T08:00:00.000Z -- [Enable built-in authentication quickstart](https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-authentication-app-service) - - Updated: 2025-06-04T05:14:00.000Z → 2026-03-17T08:00:00.000Z -- [Use file-based configuration](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-file-based) - - Updated: 2022-02-02T18:10:00.000Z → 2026-03-10T08:00:00.000Z -- [Use ZIP or WAR](https://learn.microsoft.com/en-us/azure/app-service/deploy-zip) - - Updated: 2025-06-04T17:02:00.000Z → 2026-03-11T08:00:00.000Z -- [Connect a domain name](https://learn.microsoft.com/en-us/azure/app-service/app-service-web-tutorial-custom-domain) - - Updated: 2025-02-14T08:00:00.000Z → 2026-04-14T06:14:00.000Z -- [ASP.NET Core with SQL DB](https://learn.microsoft.com/en-us/azure/app-service/tutorial-dotnetcore-sqldb-app) - - Updated: 2025-06-30T08:00:00.000Z → 2026-03-19T08:00:00.000Z +- [Use .NET](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault) + - Updated: 2025-08-12T22:11:00.000Z → 2026-04-06T08:00:00.000Z +- [Use JavaScript](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault-javascript) + - Updated: 2025-11-18T18:43:00.000Z → 2026-04-24T17:42:00.000Z +- [Use 
PHP](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault-php) + - Updated: 2025-08-12T22:11:00.000Z → 2026-04-24T17:42:00.000Z +- [Use Python](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault-python) + - Updated: 2025-11-18T18:43:00.000Z → 2026-04-24T17:42:00.000Z +- [App to app to another Azure service as user](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-app-graph-javascript) + - Updated: 2023-03-22T00:00:00.000Z → 2026-04-24T17:42:00.000Z +- [Use Facebook](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-facebook) + - Updated: 2025-08-15T17:11:00.000Z → 2026-04-24T18:40:00.000Z +- [Use GitHub](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-github) + - Updated: 2025-08-15T17:11:00.000Z → 2026-04-24T17:42:00.000Z +- [Use X](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-twitter) + - Updated: 2025-08-15T17:11:00.000Z → 2026-04-24T17:42:00.000Z +- [Use Apple sign-in (preview)](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-apple) + - Updated: 2021-09-23T17:02:00.000Z → 2026-04-24T17:42:00.000Z +- [Manage API versions](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-api-version) + - Updated: 2023-10-12T17:01:00.000Z → 2026-04-24T17:42:00.000Z +- [About App Service Environments](https://learn.microsoft.com/en-us/azure/app-service/environment/overview) + - Updated: 2025-11-11T08:00:00.000Z → 2026-04-20T08:00:00.000Z - [to Microsoft Graph with managed identity](https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-access-microsoft-graph-as-app) - - Updated: 2023-08-18T11:30:00.000Z → 2026-03-13T08:00:00.000Z -- [to Microsoft Graph as user](https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-access-microsoft-graph-as-user) - - Updated: 2025-11-29T12:17:00.000Z → 
2026-03-17T08:00:00.000Z -- [to other Azure services with managed identity](https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-access-storage) - - Updated: 2023-07-31T08:00:00.000Z → 2026-03-17T08:00:00.000Z -- [Deploy a multi-region app (tutorial)](https://learn.microsoft.com/en-us/azure/app-service/tutorial-multi-region-app) - - Updated: 2023-03-16T00:00:00.000Z → 2026-04-13T22:10:00.000Z -- [Deploy an N-tier app (tutorial)](https://learn.microsoft.com/en-us/azure/app-service/tutorial-secure-ntier-app) - - Updated: 2023-03-16T00:00:00.000Z → 2026-04-15T22:11:00.000Z -- [Configure custom container](https://learn.microsoft.com/en-us/azure/app-service/configure-custom-container) - - Updated: 2025-11-18T23:12:00.000Z → 2026-04-08T08:00:00.000Z + - Updated: 2026-03-13T08:00:00.000Z → 2026-04-13T22:10:00.000Z +- [Data sources](https://learn.microsoft.com/en-us/azure/app-service/configure-language-java-data-sources) + - Updated: 2025-09-25T11:34:00.000Z → 2026-04-24T18:40:00.000Z +- [APM integration](https://learn.microsoft.com/en-us/azure/app-service/configure-language-java-apm) + - Updated: 2025-08-15T17:11:00.000Z → 2026-04-24T17:42:00.000Z +- [Security](https://learn.microsoft.com/en-us/azure/app-service/configure-language-java-security) + - Updated: 2025-08-15T17:11:00.000Z → 2026-04-24T17:42:00.000Z +- [JBoss with MySQL](https://learn.microsoft.com/en-us/azure/app-service/tutorial-java-jboss-mysql-app) + - Updated: 2025-06-02T08:00:00.000Z → 2026-03-25T08:00:00.000Z +- [to Microsoft Graph with managed identity](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-microsoft-graph-as-app-javascript) + - Updated: 2023-08-18T11:30:00.000Z → 2026-04-24T17:42:00.000Z +- [to Microsoft Graph as User](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-microsoft-graph-as-user-javascript) + - Updated: 2024-02-08T12:20:00.000Z → 2026-04-24T17:42:00.000Z +- [to other Azure services with managed 
identity](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-storage-javascript) + - Updated: 2023-07-31T08:00:00.000Z → 2026-04-24T17:42:00.000Z +- [Isolate network traffic (tutorial)](https://learn.microsoft.com/en-us/azure/app-service/tutorial-networking-isolate-vnet) + - Updated: 2026-02-05T08:00:00.000Z → 2026-04-24T17:42:00.000Z +- *...and 3 more* ## Classified Pages @@ -115,7 +126,11 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Spring | [App settings reference](https://learn.microsoft.com/en-us/azure/app-service/reference-app-settings) | configuration | 0.95 | Same as index 0: detailed reference of environment variables and app settings, including which are customizable—clear configuration reference content. | | [Configure WordPress](https://learn.microsoft.com/en-us/azure/app-service/reference-app-settings) | configuration | 0.95 | Reference page listing specific environment variable and app setting names, which ones are configurable, and their behaviors—product-specific configuration details not inferable from general knowledge. | | [App Service Environment custom settings](https://learn.microsoft.com/en-us/azure/app-service/environment/app-service-app-service-environment-custom-settings) | configuration | 0.90 | Documents ASE-specific customizations stored in clusterSettings, with concrete setting names and values applied at environment scope—fits configuration category with product-specific parameters. | +| [Use Apple sign-in (preview)](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-apple) | security | 0.88 | Covers configuring Sign in with Apple as an identity provider, including Apple developer program requirements and a caution about disabling certain App Service auth management paths. These are specific security and auth configuration details. 
| +| [Use Facebook](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-facebook) | security | 0.86 | Shows exact steps and settings to configure Facebook as an auth provider for App Service/Azure Functions, including provider-specific parameters and App Service auth configuration. This is concrete IAM configuration, not just conceptual guidance. | +| [Use GitHub](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-github) | security | 0.86 | Provides specific configuration for using GitHub as an identity provider in App Service/Azure Functions, including provider setup and App Service auth settings. These are product-specific security and identity configurations. | | [Use Microsoft Entra](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-aad) | security | 0.86 | Contains specific Entra app registration fields, redirect URIs, client IDs, and App Service auth settings required to wire up Entra ID—detailed security configuration. | +| [Use X](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-twitter) | security | 0.86 | Details how to configure X/Twitter as an auth provider, including account requirements and App Service/Azure Functions auth settings. This is concrete IAM configuration unique to this integration. | | [Use secrets from Key Vault](https://learn.microsoft.com/en-us/azure/app-service/app-service-key-vault-references) | configuration | 0.86 | Contains exact app setting/connection string syntax for Key Vault references, supported formats, resolution behavior, and platform-specific constraints—detailed configuration parameters unique to App Service. 
| | [App Service Environment v3 network settings](https://learn.microsoft.com/en-us/azure/app-service/environment/configure-network-settings) | configuration | 0.85 | Focuses on concrete networking settings (FTP access, private endpoints, remote debugging) with CLI/ARM/portal configuration. Contains specific setting names and patterns unique to App Service Environment networking, matching configuration sub-skill. | | [Use OpenID Connect](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-openid-connect) | security | 0.84 | Explains how to add OIDC providers with specific metadata endpoints, client IDs, and App Service auth configuration fields—product-specific security setup. | @@ -135,6 +150,7 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Spring | [Deploy app behind private endpoint](https://learn.microsoft.com/en-us/azure/app-service/overview-private-endpoint) | configuration | 0.80 | Explains how to connect via Private Link, IP allocation, and traffic flow; product-specific private endpoint configuration. | | [Deployment best practices](https://learn.microsoft.com/en-us/azure/app-service/deploy-best-practices) | best-practices | 0.80 | Explicitly a best-practices article with App Service–specific DOs/DON’Ts, language-specific recommendations, and caveats that go beyond generic deployment advice. | | [Identity scenarios](https://learn.microsoft.com/en-us/azure/app-service/identity-scenarios) | decision-making | 0.80 | Provides scenario-based recommendations, pros/cons, and guidance on when to use each auth solution for App Service apps and APIs—explicit decision-making content. | +| [Manage API versions](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-api-version) | security | 0.80 | Explains how to pin or upgrade App Service authentication/authorization API versions, including required changes like moving secrets to slot-sticky app settings. 
This is product-specific security and configuration behavior tied to auth API versions. | | [Managed identity overview](https://learn.microsoft.com/en-us/azure/app-service/overview-managed-identity) | security | 0.80 | Includes concrete steps, portal/ARM settings, token acquisition endpoints, and scopes for App Service managed identities—detailed security configuration beyond generic MI concepts. | | [Monitoring data reference](https://learn.microsoft.com/en-us/azure/app-service/monitor-app-service-reference) | configuration | 0.80 | Reference article listing metrics, logs, and schema for App Service monitoring—detailed product-specific monitoring configuration/data model. | | [Mount Azure Storage](https://learn.microsoft.com/en-us/azure/app-service/configure-connect-to-azure-storage) | configuration | 0.80 | Describes how to configure Azure Files and Premium Files as network shares for App Service, including mount settings—product-specific configuration. | @@ -143,21 +159,19 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Spring | [Run from package](https://learn.microsoft.com/en-us/azure/app-service/deploy-run-package) | deployment | 0.80 | The article describes product-specific deployment behavior and constraints for 'Run from package', including unsupported runtimes (Python, Java), required flags for Python build automation, and how the platform treats ZIP packages. These are concrete deployment constraints and platform behaviors by runtime/plan, matching the deployment sub-skill. | | [Secure a custom domain with HTTPS](https://learn.microsoft.com/en-us/azure/app-service/configure-ssl-bindings) | security | 0.80 | Shows how to bind certificates to custom domains and enable HTTPS—product-specific security configuration steps and constraints. 
| | [Secure calls from Visual Studio Code](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-mcp-server-vscode) | security | 0.80 | Shows concrete configuration to protect MCP servers with Entra auth and access them from VS Code Copilot agent mode—product- and scenario-specific security setup. | +| [Security](https://learn.microsoft.com/en-us/azure/app-service/configure-language-java-security) | security | 0.80 | Covers Java-specific security settings such as authentication integration, Key Vault references, and Java keystore configuration on App Service. These are concrete, product-specific security configurations and patterns rather than generic security concepts. | | [Security controls by Azure Policy reference](https://learn.microsoft.com/en-us/azure/app-service/security-controls-policy) | security | 0.80 | Lists built-in Azure Policy definitions for App Service with specific control names and scopes; this is detailed security/compliance configuration data. | | [Stream diagnostic logs](https://learn.microsoft.com/en-us/azure/app-service/troubleshoot-diagnostic-logs) | configuration | 0.80 | Explains how to enable diagnostic logs, what categories exist, and how to access them—product-specific logging configuration and locations. | -| [Use Apple sign-in (preview)](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-apple) | security | 0.80 | Includes Apple-specific keys, identifiers, and App Service auth settings, plus caveats about portal management—detailed security configuration. | -| [Use Facebook](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-facebook) | security | 0.80 | Includes provider-specific keys, callback URLs, and App Service auth settings for Facebook, which are concrete security configuration details. 
| -| [Use GitHub](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-github) | security | 0.80 | Shows how to set up GitHub as an identity provider with specific client ID/secret and redirect URL configuration in App Service. | | [Use Google](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-google) | security | 0.80 | Provides Google OAuth client configuration and App Service auth settings, including exact fields and URLs needed for secure integration. | -| [Use X](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-twitter) | security | 0.80 | Contains provider-specific configuration steps and App Service auth settings for X, which are detailed security integration parameters. | | [Use file-based configuration](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-file-based) | configuration | 0.80 | Describes file-based configuration for App Service AuthN/AuthZ, including a dedicated config file and settings that travel with the app payload; this implies specific configuration keys/values unique to this feature. | | [VNet integration overview](https://learn.microsoft.com/en-us/azure/app-service/overview-vnet-integration) | configuration | 0.80 | Details App Service VNet integration variations and setup; includes product-specific configuration behaviors and constraints. | | [Work with tokens](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-oauth-tokens) | security | 0.80 | Explains how to retrieve, refresh, and extend OAuth tokens via App Service auth endpoints and settings—detailed token management behavior unique to this feature. 
| | [Add and manage TLS/SSL certificates](https://learn.microsoft.com/en-us/azure/app-service/configure-ssl-certificate) | security | 0.78 | Certificate install/manage article typically includes product-specific bindings, supported certificate types, SNI vs IP-based SSL behavior, and portal/CLI configuration fields that are unique to App Service security configuration. | -| [Manage API versions](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-api-version) | configuration | 0.78 | Contains specific configuration for pinning or upgrading auth API versions, including required changes (slot-sticky settings) and version identifiers. | +| [App to app to another Azure service as user](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-app-graph-javascript) | security | 0.78 | Describes configuring App Service authentication/authorization so a front-end app passes user credentials to a back-end App Service and then to a downstream Azure service. This is detailed, product-specific identity and auth configuration, mapping identities across services. | | [Use settings from App Configuration](https://learn.microsoft.com/en-us/azure/app-service/app-service-configuration-references) | configuration | 0.78 | This page documents the exact syntax and behavior of App Configuration references in App Service/Functions (for example, @Microsoft.AppConfiguration(...) patterns, supported reference types, resolution behavior, and platform-specific settings). These are product-specific configuration details and parameter formats that go beyond generic knowledge and fit the configuration sub-skill. | | [Access user identities](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-user-identities) | security | 0.76 | Describes how identity claims are exposed (headers, tokens, environment) and how to use them in code—platform-specific security behavior. 
| | [Connect to another app as user](https://learn.microsoft.com/en-us/azure/app-service/tutorial-auth-aad) | security | 0.76 | Tutorial includes concrete App Service Authentication/Authorization settings, callback URLs, and configuration values for securing front-end and downstream APIs—platform-specific security configuration. | +| [APM integration](https://learn.microsoft.com/en-us/azure/app-service/configure-language-java-apm) | integrations | 0.75 | Describes connecting Java apps on App Service with Application Insights, New Relic, and AppDynamics. This usually involves product-specific agent configuration, environment variables, and parameter names for each APM platform, fitting the integrations & coding patterns category. | | [About GitHub Actions for containers](https://learn.microsoft.com/en-us/azure/app-service/deploy-container-github-action) | deployment | 0.75 | Provides YAML workflow structure and parameters for deploying containers to App Service; product-specific CI/CD deployment pattern. | | [Buy and configure App Service domain](https://learn.microsoft.com/en-us/azure/app-service/manage-custom-dns-buy-domain) | configuration | 0.75 | Explains how App Service domains work, how to purchase and configure them—product-specific domain configuration details. | | [CI/CD to custom container](https://learn.microsoft.com/en-us/azure/app-service/deploy-ci-cd-custom-container) | deployment | 0.75 | Explains CI/CD configuration from ACR or Docker Hub to App Service; includes product-specific deployment settings and constraints. | @@ -165,7 +179,7 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Spring | [Configure](https://learn.microsoft.com/en-us/azure/app-service/configure-language-python) | configuration | 0.75 | Explains how App Service runs Python, virtual environment activation, and dependency installation from requirements files; likely includes specific configuration behaviors and settings that are product-specific. 
| | [Configure common settings](https://learn.microsoft.com/en-us/azure/app-service/configure-common) | configuration | 0.75 | Explains concrete App Service app settings (like platform, general settings, connection strings) with specific option names and allowed values—product-specific configuration knowledge. | | [Configure custom container](https://learn.microsoft.com/en-us/azure/app-service/configure-custom-container) | configuration | 0.75 | Article explicitly about configuring custom containers on App Service and ‘most common configuration tasks’. These pages usually contain container-specific App Service settings (WEBSITES_PORT, startup commands, image settings) and configuration parameters unique to this product, fitting the configuration sub-skill. | -| [Data sources](https://learn.microsoft.com/en-us/azure/app-service/configure-language-java-data-sources) | configuration | 0.75 | Explains configuring data sources for Java SE, Tomcat, and JBoss on App Service, with specific parameters and connection settings. | +| [Data sources](https://learn.microsoft.com/en-us/azure/app-service/configure-language-java-data-sources) | configuration | 0.75 | Article focuses on configuring data sources for Java SE, Tomcat, and JBoss on App Service. It typically includes product-specific configuration parameters (JDBC URLs, connection pool settings, environment variables) and platform-specific options, which match the configuration sub-skill. | | [Deployment and runtime](https://learn.microsoft.com/en-us/azure/app-service/configure-language-java-deploy-run) | configuration | 0.75 | Covers deployment and runtime configuration for Tomcat, JBoss, and Java SE, including Java versions and logging—product-specific configuration options. 
| | [Disable basic auth](https://learn.microsoft.com/en-us/azure/app-service/configure-basic-auth-disable) | security | 0.75 | Explains concrete ways to disable basic auth for FTP/Web Deploy, fallback deployment methods, and how to monitor access attempts; this is product-specific security configuration beyond generic concepts. | | [Domain and cert quickstart](https://learn.microsoft.com/en-us/azure/app-service/tutorial-secure-domain-certificate) | security | 0.75 | Tutorial on configuring custom domains and App Service managed certificates, including TLS/SSL bindings—product-specific security configuration. | @@ -180,7 +194,10 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Spring | [to other Azure services with managed identity](https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-access-storage) | security | 0.75 | Tutorial focuses on using managed identities from App Service to access Azure Storage and other services. It likely specifies role names (e.g., Storage Blob Data Contributor), scope assignments, and identity configuration steps, which are concrete security and RBAC configuration details. | | [Connect by using agent identity](https://learn.microsoft.com/en-us/azure/app-service/overview-agent-identity) | security | 0.74 | Describes preview agent identity platform configuration, including specific settings and C# token acquisition patterns that are unique to this feature. | | [Customize sign-ins/outs](https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-customize-sign-in-out) | security | 0.74 | Shows how to adjust redirect URLs, routes, and auth settings for sign-in/out flows using Easy Auth, which are App Service–specific security behaviors. 
-| [APM integration](https://learn.microsoft.com/en-us/azure/app-service/configure-language-java-apm) | configuration | 0.70 | Shows how to connect Java apps to Application Insights, New Relic, and AppDynamics with product-specific configuration parameters. | +| [Use .NET](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault) | security | 0.72 | Tutorial shows concrete, product-specific configuration for using managed identity and Azure Key Vault from an App Service .NET app, including how to store and reference secrets instead of connection strings. This is security-focused configuration (identity, secret storage) with service-specific settings, not just generic concepts. | +| [Use JavaScript](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault-javascript) | security | 0.72 | Same scenario as the .NET tutorial above, but for JavaScript; contains App Service + Key Vault integration steps using managed identity and secret configuration details. These are product-specific security and identity configuration patterns beyond generic knowledge. | +| [Use PHP](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault-php) | security | 0.72 | Tutorial for PHP App Service apps using Key Vault via managed identity. Includes concrete configuration of identity, Key Vault access, and secret usage, which are product-specific security settings. | +| [Use Python](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault-python) | security | 0.72 | Python variant of the Key Vault + App Service managed identity tutorial; contains specific security configuration steps and patterns for this product combination. 
| | [ASP.NET](https://learn.microsoft.com/en-us/azure/app-service/configure-language-dotnet-framework) | configuration | 0.70 | Article is explicitly about configuration of ASP.NET apps on App Service and typically includes product-specific settings, connection strings, and configuration options beyond generic knowledge. | | [ASP.NET Core](https://learn.microsoft.com/en-us/azure/app-service/configure-language-dotnetcore) | configuration | 0.70 | Focuses on configuring ASP.NET Core apps in App Service, likely with specific app settings, environment variables, and platform behaviors unique to the service. | | [About high density hosting](https://learn.microsoft.com/en-us/azure/app-service/manage-scale-per-app) | deployment | 0.70 | Explains per-app scaling behavior, plan-level settings, and PowerShell/ARM configuration for App Service, which are specific deployment patterns and constraints. | @@ -189,7 +206,6 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Spring | [Access restriction overview](https://learn.microsoft.com/en-us/azure/app-service/overview-access-restrictions) | security | 0.70 | Describes access restriction behavior, interaction with private endpoints, and default exposure; product-specific security configuration guidance. | | [App Service Managed Certificate July 2025 Changes](https://learn.microsoft.com/en-us/azure/app-service/app-service-managed-certificate-changes-july-2025) | security | 0.70 | Details ASMC requirements, exceptions, and validation steps tied to DigiCert changes; this is product-specific certificate/security configuration guidance. | | [App Service plans overview](https://learn.microsoft.com/en-us/azure/app-service/overview-hosting-plans) | decision-making | 0.70 | Explains how plans work, billing, and scaling; used to choose SKUs and capacity, which is App Service–specific decision guidance. 
| -| [App to app to another Azure service as user](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-app-graph-javascript) | security | 0.70 | Covers specific configuration to pass user tokens from a front-end App Service to a back-end and then to Azure services, including scopes and auth settings unique to App Service. | | [Aspire](https://learn.microsoft.com/en-us/azure/app-service/configure-language-dotnet-aspire) | configuration | 0.70 | Describes configuring Aspire apps including App Service plan settings, Application Insights, dashboards, and health probes—product-specific configuration details. | | [Back up and restore app](https://learn.microsoft.com/en-us/azure/app-service/manage-backup) | configuration | 0.70 | Shows how to configure on-demand and scheduled backups, including linked database considerations and deprecation; product-specific backup configuration. | | [Built-in authentication overview](https://learn.microsoft.com/en-us/azure/app-service/overview-authentication-authorization) | security | 0.70 | This article describes App Service/Functions built-in auth (Easy Auth) with product-specific security configuration: identity providers, callback URLs, and likely concrete settings and role/permission implications. It focuses on how to configure secure authentication/authorization for these services, matching the security sub-skill. | @@ -216,36 +232,31 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Spring | [Deploy continuously](https://learn.microsoft.com/en-us/azure/app-service/deploy-continuous-deployment) | deployment | 0.70 | Explains App Service–specific continuous deployment integration with various repos and build pipelines, including supported mechanisms and behaviors. 
| | [Deploy sidecar container](https://learn.microsoft.com/en-us/azure/app-service/tutorial-sidecar) | configuration | 0.70 | Shows how to add sidecar containers with App Service–specific configuration steps and constraints, including OpenTelemetry collector setup. | | [Enable virtual network integration](https://learn.microsoft.com/en-us/azure/app-service/configure-vnet-integration-enable) | configuration | 0.70 | The page goes beyond a simple tutorial and includes product-specific configuration details for enabling VNet integration via portal, Azure CLI, and PowerShell, including specific setting names and parameters unique to App Service VNet integration. While it is largely procedural, the presence of concrete configuration commands and options makes it most aligned with the configuration sub-skill. | -| [Encrypt site data](https://learn.microsoft.com/en-us/azure/app-service/configure-encrypt-at-rest-using-cmk) | security | 0.70 | Describes how to use Storage and Key Vault with run-from-package to encrypt application data at rest; involves product-specific secure configuration patterns. | +| [Encrypt site data](https://learn.microsoft.com/en-us/azure/app-service/configure-encrypt-at-rest-using-cmk) | security | 0.70 | Describes how to configure encryption at rest for App Service app content using Azure Storage and Key Vault, including product-specific security configuration steps and parameters for CMK-based encryption when running from a package file. | | [Host an App in an App Service Environment](https://learn.microsoft.com/en-us/azure/app-service/environment/using) | configuration | 0.70 | How-to page for creating and configuring a web app inside an App Service Environment, including specific portal options (encryption, diagnostic logging, VNet/subnet usage). Contains product-specific configuration steps and settings rather than just conceptual overview. 
| | [How WebJobs work](https://learn.microsoft.com/en-us/azure/app-service/webjobs-execution) | configuration | 0.70 | Explains how Kudu discovers and runs WebJobs and mentions optional settings; this typically includes specific setting names and behaviors unique to WebJobs. | | [Inbound and outbound IPs](https://learn.microsoft.com/en-us/azure/app-service/overview-inbound-outbound-ips) | configuration | 0.70 | Explains when IPs change and how to find them; includes product-specific behavior and configuration patterns for IP management. | | [Integrate with Application Gateway](https://learn.microsoft.com/en-us/azure/app-service/environment/integrate-with-application-gateway) | integrations | 0.70 | End-to-end configuration of Application Gateway with an ILB App Service Environment; likely includes product-specific settings (backend pool, probes, hostnames, ports) and integration parameters beyond generic tutorials. | | [Integrate with NAT gateway](https://learn.microsoft.com/en-us/azure/app-service/overview-nat-gateway-integration) | configuration | 0.70 | Describes product-specific configuration of NAT Gateway with App Service and virtual networks, including which subnets/apps can be associated and how outbound traffic is routed. This is concrete configuration guidance beyond generic networking concepts. | -| [Integrate with Traffic Manager](https://learn.microsoft.com/en-us/azure/app-service/web-sites-traffic-manager) | best-practices | 0.70 | Explicitly framed as best practices for configuring Traffic Manager with App Service; includes product-specific recommendations and gotchas. | +| [Integrate with Traffic Manager](https://learn.microsoft.com/en-us/azure/app-service/web-sites-traffic-manager) | best-practices | 0.70 | Article is explicitly about best practices for using Traffic Manager with App Service, including product-specific recommendations on endpoint configuration, health checks, and routing behavior that go beyond generic load-balancing concepts. 
| | [Language support policy](https://learn.microsoft.com/en-us/azure/app-service/operating-system-functionality) | configuration | 0.70 | Describes exact file, network, and registry access plus diagnostics available to apps; these are platform-specific behavioral details. | | [Migrate Python Windows apps to Linux](https://learn.microsoft.com/en-us/azure/app-service/app-service-migration-windows-linux) | decision-making | 0.70 | Covers key considerations and dependency checks when moving from Windows to Linux; supports OS/runtime migration decisions. | | [Minimum TLS version](https://learn.microsoft.com/en-us/azure/app-service/tls-minimum-version) | security | 0.70 | The page provides concrete guidance on configuring minimum TLS versions (for example, where to set TLS 1.2+, how it applies to App Service, Functions, and Logic Apps Standard, and platform-specific behaviors). These are product-specific security configuration settings, fitting the security sub-skill. | | [Overview of TLS/SSL in App Service](https://learn.microsoft.com/en-us/azure/app-service/overview-tls) | security | 0.70 | Page focuses on TLS/SSL behavior in Azure App Service, including supported TLS versions, certificate handling, bindings, and mutual authentication. These are product-specific security details (e.g., which TLS versions are supported and how certificates are managed in App Service) that go beyond generic TLS concepts and qualify as expert security configuration knowledge. | -| [Security](https://learn.microsoft.com/en-us/azure/app-service/configure-language-java-security) | security | 0.70 | Java-specific security configuration including authentication, Key Vault references, and Java keystore settings—product-specific security details. 
| | [Security overview](https://learn.microsoft.com/en-us/azure/app-service/overview-security) | best-practices | 0.70 | Article explicitly focuses on security best practices for App Service, likely including concrete recommendations (networking, identity, configuration) specific to this platform. | | [TLS/SSL address](https://learn.microsoft.com/en-us/azure/app-service/ip-address-change-ssl) | best-practices | 0.70 | Explains how to release and reassign TLS/SSL IPs and notes unsupported service endpoints; concrete platform-specific behavior and steps. | -| [Use .NET](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault) | integrations | 0.70 | Tutorial shows product-specific pattern for using App Service managed identity with Azure Key Vault for a backend that doesn’t support MSI. It necessarily includes concrete SDK usage, configuration names (e.g., Key Vault URI, secret names, MSI usage), and wiring between App Service and Key Vault, which are integration-focused coding patterns rather than generic concepts. | | [Use Azure Pipelines](https://learn.microsoft.com/en-us/azure/app-service/deploy-azure-pipelines) | deployment | 0.70 | Describes how to configure Azure Pipelines specifically for App Service deployments with YAML/classic pipelines—product-specific CI/CD deployment patterns. | | [Use GitHub Actions](https://learn.microsoft.com/en-us/azure/app-service/deploy-github-actions) | deployment | 0.70 | Provides App Service–specific GitHub Actions deployment configuration and tasks, which are concrete deployment integration patterns. | -| [Use JavaScript](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault-javascript) | integrations | 0.70 | Same scenario as index 0 but for JavaScript. Contains concrete code and configuration for using App Service managed identity with Key Vault from a JavaScript app, which is a product-specific integration pattern with SDK parameters and configuration details. 
| -| [Use PHP](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault-php) | integrations | 0.70 | Same scenario as index 0 but for PHP. Provides specific integration steps and code to access Key Vault using App Service managed identity, including configuration details unique to this product combination. | -| [Use Python](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault-python) | integrations | 0.70 | Same scenario as index 0 but for Python. Includes concrete SDK usage and configuration for integrating App Service managed identity with Key Vault, which fits the integrations & coding patterns category. | | [Use TLS/SSL certificates in app code](https://learn.microsoft.com/en-us/azure/app-service/configure-ssl-certificate-in-code) | integrations | 0.70 | Explains how app code accesses certificates from App Service (store locations, environment paths, thumbprint usage), which are product-specific coding patterns and configuration details. | | [Use Traffic Manager with a domain](https://learn.microsoft.com/en-us/azure/app-service/configure-domain-traffic-manager) | configuration | 0.70 | Shows how to integrate Traffic Manager endpoints with custom domains for App Service, including DNS and routing configuration—service-specific integration settings. | | [Use ZIP or WAR](https://learn.microsoft.com/en-us/azure/app-service/deploy-zip) | deployment | 0.70 | The article describes product-specific deployment behavior for ZIP/WAR/JAR/EAR and individual file deployment to Azure App Service, including how the platform handles these packages and related constraints. This is concrete, implementation-focused deployment guidance beyond generic knowledge, but not primarily about limits, configuration matrices, or troubleshooting. 
| | [WordPress FAQ](https://learn.microsoft.com/en-us/azure/app-service/wordpress-faq) | troubleshooting | 0.70 | The FAQ contains product-specific troubleshooting guidance for WordPress on Azure App Service, including concrete behaviors and resolutions (for example, how scaling affects WordPress, backup/restore nuances, plugin/theme compatibility, and platform-specific constraints). These are symptom→cause→solution style answers unique to this hosting environment, which an LLM is unlikely to infer from general WordPress or App Service knowledge. | -| [to Microsoft Graph as User](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-microsoft-graph-as-user-javascript) | security | 0.70 | Covers delegated permissions, consent, and Entra ID configuration for Graph; includes specific permission names and flows that are security configuration details. | +| [to Microsoft Graph as User](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-microsoft-graph-as-user-javascript) | security | 0.70 | Covers granting delegated permissions to a web app and obtaining signed-in user profile data via Microsoft Entra ID. This usually involves concrete permission names, scopes, and auth configuration steps specific to App Service and Microsoft Graph, fitting security-focused configuration guidance. | | [to Microsoft Graph as user](https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-access-microsoft-graph-as-user) | security | 0.70 | Covers granting delegated permissions to a web app and retrieving signed-in user profile via Microsoft Entra ID. This typically includes concrete permission names, consent configuration, and auth settings, which are product-specific security configuration details. 
| -| [to Microsoft Graph with managed identity](https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-access-microsoft-graph-as-app) | security | 0.70 | Tutorial shows how to access Microsoft Graph using system-assigned managed identity and RBAC. Likely includes specific Microsoft Entra app registration settings, permission scopes, and role assignments that are product-specific security configuration details. | -| [to Microsoft Graph with managed identity](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-microsoft-graph-as-app-javascript) | security | 0.70 | Tutorial configures system-assigned managed identity and RBAC for Graph; likely includes specific permission scopes and role assignments, which are product-specific security configuration details. | +| [to Microsoft Graph with managed identity](https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-access-microsoft-graph-as-app) | security | 0.70 | Tutorial shows product-specific security configuration: enabling system-assigned managed identity, assigning Microsoft Entra permissions/RBAC, and configuring authentication to Microsoft Graph for App Service. These are concrete security settings and identity patterns beyond generic concepts. | +| [to Microsoft Graph with managed identity](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-microsoft-graph-as-app-javascript) | security | 0.70 | Tutorial focuses on using system-assigned managed identity and RBAC to access Microsoft Graph. This typically includes specific Microsoft Entra app roles/permissions, scope values, and configuration parameters for secure access, which are product-specific security configuration details rather than generic concepts. 
| | [to SQL database as user](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-sql-database-as-user-dotnet) | security | 0.70 | Implements on-behalf-of flow with built-in authentication; includes detailed Entra configuration, scopes, and connection patterns unique to App Service. | -| [to other Azure services with managed identity](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-storage-javascript) | security | 0.70 | Shows using managed identities to access Storage and other services; likely includes role names and scope configuration, which are product-specific security settings. | +| [to other Azure services with managed identity](https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-storage-javascript) | security | 0.70 | Shows how a JavaScript web app on App Service uses managed identities to securely access Azure Storage and other services. This generally includes specific role assignments (RBAC roles), identity configuration, and access patterns unique to these services, which are product-specific security configuration details. | | [Geo-distributed scale](https://learn.microsoft.com/en-us/azure/app-service/environment/app-service-app-service-environment-geo-distributed-scale) | architecture-patterns | 0.68 | The article describes concrete, product-specific architecture guidance for horizontally scaling Azure App Service apps across regions using App Service Environments and Traffic Manager. It focuses on when and how to use geo-distribution for high-scale scenarios (for example, events with extreme load) and discusses deployment patterns and trade-offs specific to this service, rather than just conceptual scaling theory. While it may not be heavy on numeric limits, it provides expert, pattern-level guidance unique to Azure App Service Environments. 
| | [Integrate with Application Gateway](https://learn.microsoft.com/en-us/azure/app-service/overview-app-gateway-integration) | decision-making | 0.68 | The page goes beyond a conceptual overview and provides product-specific guidance on when and how to integrate Azure Application Gateway with Azure App Service using private endpoints, service endpoints, and App Service Environments. It discusses scenario-based considerations (internal vs external ASE, SCM site access restrictions) that help users decide between integration approaches. While it may not contain many numeric limits, it does include concrete, product-specific configuration and trade-off guidance that fits the decision-making category better than the others. | | [Model Context Protocol servers](https://learn.microsoft.com/en-us/azure/app-service/scenario-ai-model-context-protocol-server) | integrations | 0.68 | The page describes how to expose an Azure App Service web app as a Model Context Protocol (MCP) server for agents like GitHub Copilot Chat and Cursor. This is a product- and protocol-specific integration pattern that likely includes concrete endpoint shapes, protocol parameters, and configuration details unique to MCP and App Service, which go beyond generic SDK usage or conceptual overviews. | @@ -256,6 +267,7 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Spring | [Enable App Service on Azure Arc](https://learn.microsoft.com/en-us/azure/app-service/manage-create-arc-environment) | deployment | 0.65 | Describes enabling App Service on Arc-enabled Kubernetes and creating custom locations; involves specific deployment and environment configuration steps. 
| | [Enable built-in authentication quickstart](https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-authentication-app-service) | security | 0.65 | Quickstart for enabling app authentication and restricting access to organizational users; likely includes specific App Service auth settings and Entra configuration steps, which are product-specific security details. | | [Inbound IP address](https://learn.microsoft.com/en-us/azure/app-service/ip-address-change-inbound) | best-practices | 0.65 | Provides concrete steps to handle inbound IP changes; operational guidance specific to App Service networking behavior. | +| [Isolate network traffic (tutorial)](https://learn.microsoft.com/en-us/azure/app-service/tutorial-networking-isolate-vnet) | security | 0.65 | Tutorial shows concrete, product-specific steps and settings to securely route App Service outbound traffic through a virtual network so that Key Vault and back-end services accept traffic only from within the virtual network. This is security-focused configuration (network isolation, VNet integration) with App Service–specific behavior rather than a generic networking overview. | | [Migrate an active domain](https://learn.microsoft.com/en-us/azure/app-service/manage-custom-dns-migrate-domain) | deployment | 0.65 | Describes how to migrate a live domain with no downtime, including sequencing and DNS changes—deployment/migration-specific expert guidance. | | [Migrate from multi-container](https://learn.microsoft.com/en-us/azure/app-service/migrate-sidecar-multi-container-apps) | decision-making | 0.65 | Migration guidance between Docker Compose and sidecars with concrete strategies and considerations qualifies as product-specific decision and migration guidance. | | [Outbound IP address](https://learn.microsoft.com/en-us/azure/app-service/ip-address-change-outbound) | best-practices | 0.65 | Gives specific instructions for handling outbound IP changes; product-specific operational guidance. 
| @@ -278,8 +290,6 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Spring | [Deploy app with Azure Container Registry](https://learn.microsoft.com/en-us/azure/app-service/tutorial-custom-container) | 0.45 | Tutorial for building and running a custom image; primarily a migration walkthrough rather than a detailed configuration or limits reference. | | [About OS and runtime patching](https://learn.microsoft.com/en-us/azure/app-service/overview-patch-os-runtime) | 0.40 | OS/runtime patching overview; mostly conceptual explanation of how App Service updates and how to get version info, not detailed configuration or limits. | | [Azure Arc hosting overview](https://learn.microsoft.com/en-us/azure/app-service/overview-arc-integration) | 0.40 | Introduction to App Service on Azure Arc with retirement notice; mostly overview and migration suggestion, not detailed configs or comparisons. | -| [Deploy a custom container](https://learn.microsoft.com/en-us/azure/app-service/quickstart-custom-container) | 0.40 | Quickstart for running a custom container; mostly step-by-step deployment from Visual Studio/ACR without deep configuration matrices. | -| [Isolate network traffic (tutorial)](https://learn.microsoft.com/en-us/azure/app-service/tutorial-networking-isolate-vnet) | 0.40 | Networking isolation tutorial; likely procedural with some VNet integration steps but not a comprehensive configuration reference with parameter tables. | | [Migrate to Managed Instance](https://learn.microsoft.com/en-us/azure/app-service/quickstart-managed-instance) | 0.40 | Quickstart for Managed Instance; while it notes preview regions and plan limitations, summary doesn’t show detailed limit tables or configuration references. 
| | [Add MCP server to app](https://learn.microsoft.com/en-us/azure/app-service/tutorial-ai-model-context-protocol-server-java) | 0.35 | MCP server tutorial for Spring Boot; shows how to expose capabilities but not organized as integration reference with parameter tables. | | [Add MCP server to app](https://learn.microsoft.com/en-us/azure/app-service/tutorial-ai-model-context-protocol-server-node) | 0.35 | MCP server tutorial for Node.js; shows how to wire MCP but not as a structured integration reference with parameter tables. | @@ -302,11 +312,13 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Spring | [Chatbot with local SLM](https://learn.microsoft.com/en-us/azure/app-service/tutorial-ai-slm-fastapi) | 0.30 | FastAPI chatbot with SLM sidecar; deployment tutorial without config matrices or limits. | | [Chatbot with local SLM](https://learn.microsoft.com/en-us/azure/app-service/tutorial-ai-slm-spring-boot) | 0.30 | Spring Boot chatbot with SLM sidecar tutorial; focuses on deployment steps, not detailed config matrices or limits. | | [Deploy a REST API (tutorial)](https://learn.microsoft.com/en-us/azure/app-service/app-service-web-tutorial-rest-api) | 0.30 | Tutorial for hosting a REST API with CORS; focuses on basic deployment and CORS setup, not deep configuration references or troubleshooting matrices. | +| [Deploy a custom container](https://learn.microsoft.com/en-us/azure/app-service/quickstart-custom-container) | 0.30 | Quickstart tutorial for running a custom container on App Service; primarily step-by-step deployment from Visual Studio without detailed configuration matrices, limits, or specialized patterns beyond what an LLM is likely to know. 
| | [Deploy a multi-region app (tutorial)](https://learn.microsoft.com/en-us/azure/app-service/tutorial-multi-region-app) | 0.30 | Tutorial-style multi-region deployment walkthrough; likely step-by-step instructions without detailed limits tables, decision matrices, or product-specific configuration parameter catalogs. Primarily architecture concept and how-to, not expert reference content as defined. | | [Deploy using ARM template](https://learn.microsoft.com/en-us/azure/app-service/quickstart-arm-template) | 0.30 | Quickstart ARM template deployment; mostly step-by-step tutorial without detailed configuration tables, limits, or product-specific decision guidance. | | [Discover .NET](https://learn.microsoft.com/en-us/azure/app-service/app-service-migration-discover-net) | 0.30 | Describes discovery setup using Azure Migrate; more procedural and conceptual, not focused on configuration matrices or decision criteria. | | [Foundry agent calling web app](https://learn.microsoft.com/en-us/azure/app-service/tutorial-ai-integrate-azure-ai-agent-dotnet) | 0.30 | Tutorial for integrating with Foundry Agent Service via OpenAPI; primarily step-by-step integration, not a configuration catalog. | | [How to create WebJobs](https://learn.microsoft.com/en-us/azure/app-service/webjobs-create) | 0.30 | How-to for running background tasks with WebJobs; likely procedural without detailed parameter tables or error mappings. | +| [JBoss with MySQL](https://learn.microsoft.com/en-us/azure/app-service/tutorial-java-jboss-mysql-app) | 0.30 | Primarily a step-by-step tutorial for deploying a Java JBoss app with MySQL on App Service. It likely shows example settings and code, but not in the form of comprehensive configuration tables, limits, or product-specific best-practice guidance with quantified impact. No clear evidence of expert-only limits, quotas, or specialized troubleshooting content. 
| | [Java SE](https://learn.microsoft.com/en-us/azure/app-service/app-service-java-migration) | 0.30 | High-level description of Java migration tools; no detailed decision matrices, limits, or config references indicated. | | [Kudu service](https://learn.microsoft.com/en-us/azure/app-service/resources-kudu) | 0.30 | High-level overview of Kudu service; likely descriptive without detailed configuration tables or troubleshooting mappings. | | [Manage App Service plan](https://learn.microsoft.com/en-us/azure/app-service/app-service-plan-manage) | 0.30 | High-level management article (create, move, scale, delete App Service plans) that is likely procedural without detailed configuration tables, limits, or product-specific parameters. | @@ -331,6 +343,7 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Spring | [with MongoDB](https://learn.microsoft.com/en-us/azure/app-service/tutorial-nodejs-mongodb-app) | 0.25 | Node.js + MongoDB tutorial; focuses on building and deploying an app, not on structured config, limits, or troubleshooting. | | [ASP.NET Core with SQL DB](https://learn.microsoft.com/en-us/azure/app-service/tutorial-dotnetcore-sqldb-app) | 0.20 | Step-by-step deployment tutorial for an ASP.NET Core app with Azure SQL and Redis. It appears focused on walkthrough instructions rather than detailed configuration tables, limits, error codes, or product-specific best-practice guidance with quantified impact. | | [ASP.NET with SQL DB](https://learn.microsoft.com/en-us/azure/app-service/app-service-web-tutorial-dotnet-sqldatabase) | 0.20 | Tutorial for ASP.NET app with SQL Database; focuses on deployment steps rather than reusable configuration or limits. 
| +| [About App Service Environments](https://learn.microsoft.com/en-us/azure/app-service/environment/overview) | 0.20 | High-level overview of App Service Environment; summary suggests conceptual description and comparison link, but no explicit limits, configuration tables, or error/decision matrices. | | [Aspire Quickstart](https://learn.microsoft.com/en-us/azure/app-service/quickstart-dotnet-aspire) | 0.20 | Quickstart for Aspire app deployment; primarily step-by-step tutorial, not a configuration reference or limits guide. | | [Azure Policy built-ins reference](https://learn.microsoft.com/en-us/azure/app-service/policy-reference) | 0.20 | Primarily an index of built-in policy definitions with links out; the expert details live in the linked definitions, not on this page itself. | | [Bicep](https://learn.microsoft.com/en-us/azure/app-service/samples-bicep) | 0.20 | Page is a catalog of Bicep sample links for App Service without exposing underlying configuration tables, limits, or detailed parameters; it primarily serves as navigation to samples rather than containing expert-only reference data. | @@ -338,7 +351,6 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Spring | [Deploy with Azure Pipelines](https://learn.microsoft.com/en-us/azure/app-service/deploy-container-azure-pipelines) | 0.20 | Primarily a step-by-step CI/CD tutorial for deploying a Windows container app with Azure Pipelines. It does not emphasize product-specific limits, configuration matrices, or deployment constraints by tier/plan; instead it focuses on defining a YAML pipeline and basic deployment flow, which are patterns an LLM generally knows from training. | | [Develop WebJobs using VS](https://learn.microsoft.com/en-us/azure/app-service/webjobs-dotnet-deploy-vs) | 0.20 | Step-by-step Visual Studio deployment/tutorial for WebJobs; no configuration tables, limits, quotas, or product-specific error/diagnostic details. 
| | [Get started with WebJobs SDK](https://learn.microsoft.com/en-us/azure/app-service/webjobs-sdk-get-started) | 0.20 | Introductory tutorial for WebJobs SDK with basic queue-trigger example; focuses on how to build and deploy, not on limits, configuration matrices, or troubleshooting mappings. | -| [JBoss with MySQL](https://learn.microsoft.com/en-us/azure/app-service/tutorial-java-jboss-mysql-app) | 0.20 | JBoss + MySQL deployment tutorial; procedural guidance without detailed configuration tables or limits. | | [Java Tomcat to Postgres](https://learn.microsoft.com/en-us/azure/app-service/tutorial-java-tomcat-connect-managed-identity-postgresql-database) | 0.20 | Tutorial-style walkthrough for using managed identity from a Java Tomcat app to access Azure Database for PostgreSQL. It focuses on step-by-step setup and conceptual security benefits, without detailed configuration parameter tables, specific RBAC role lists, error-code-based troubleshooting, or numeric limits/quotas. Content is primarily instructional, not a reference of expert-only details. | | [Local small language models](https://learn.microsoft.com/en-us/azure/app-service/scenario-ai-local-small-language-model) | 0.20 | Describes using local SLMs and mentions pricing tiers conceptually; no explicit numeric limits or configuration parameter tables in the summary. | | [Monitor App Service](https://learn.microsoft.com/en-us/azure/app-service/monitor-app-service) | 0.20 | High-level overview of monitoring options for Azure App Service and Azure Monitor without detailed limits, configuration parameter tables, error-code-based troubleshooting, or decision matrices. Content is primarily conceptual guidance on what monitoring features exist rather than product-specific expert details. 
| @@ -355,7 +367,6 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Spring | [WebJobs Quickstart](https://learn.microsoft.com/en-us/azure/app-service/quickstart-webjobs) | 0.20 | Quickstart tutorial for creating a scheduled WebJob; mostly step-by-step example, not a catalog of configs or expert-only details. | | [WebJobs Tutorial](https://learn.microsoft.com/en-us/azure/app-service/tutorial-webjobs) | 0.20 | Tutorial for building a scheduled WebJob; focuses on basic usage rather than exhaustive configuration or product-specific edge cases. | | [About App Service](https://learn.microsoft.com/en-us/azure/app-service/overview) | 0.10 | High-level product overview of Azure App Service without detailed limits, configuration tables, or error mappings. | -| [About App Service Environments](https://learn.microsoft.com/en-us/azure/app-service/environment/overview) | 0.10 | Conceptual overview of App Service Environment; no detailed quotas, config parameters, or troubleshooting content. | | [Agentic web applications](https://learn.microsoft.com/en-us/azure/app-service/scenario-ai-agentic-web-apps) | 0.10 | Scenario overview for agentic web apps; no specific configuration parameters, limits, or error-resolution content. | | [Chatbots and RAG applications](https://learn.microsoft.com/en-us/azure/app-service/scenario-ai-chatbot-retrieval-augmented-generation) | 0.10 | Scenario/tutorial entry page for chatbots and RAG; summary indicates conceptual guidance, not detailed expert configuration or limits. | | [Deploy](https://learn.microsoft.com/en-us/azure/app-service/quickstart-wordpress) | 0.10 | Quickstart tutorial for deploying WordPress on Azure App Service and Azure Database for MySQL; focuses on step-by-step setup rather than limits, configuration matrices, error codes, or product-specific best-practice tables. No detailed quotas, RBAC roles, config parameter tables, or troubleshooting mappings are indicated. 
| diff --git a/products/azure-application-gateway/azure-application-gateway.csv b/products/azure-application-gateway/azure-application-gateway.csv index 5f45454f..ee8204ef 100644 --- a/products/azure-application-gateway/azure-application-gateway.csv +++ b/products/azure-application-gateway/azure-application-gateway.csv @@ -44,15 +44,15 @@ https://learn.microsoft.com/en-us/azure/application-gateway/features,Application https://learn.microsoft.com/en-us/azure/application-gateway/fips,FIPS 140 support on V2,FIPS 140 on Azure Application Gateway,Enable FIPS 140-compliant mode on Application Gateway v2,Learn how to enable FIPS mode for Azure Application Gateway V2 SKU.,"Application Gateway V2 SKUs can run in a FIPS (Federal Information Processing Standard) 140 approved mode of operation, which is commonly referred to as ""FIPS mode."" With FIPS mode, Application Gateway supports cryptographic modules and data encryption. The FIPS mode calls a FIPS 140-2 validated cryptographic module that ensures FIPS-compliant algorithms for encryption, hashing, and signing when enabled.",2025-12-02T12:09:00.000Z,concept-article,security,0.75,True,Explains how to run v2 SKUs in FIPS 140-approved mode and which cryptographic modules are used; product-specific security compliance configuration.,unchanged https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/alb-controller-backend-health-metrics,Backend Health and Metrics,ALB Controller - Backend Health and Metrics,Use ALB Controller backend health and metrics for troubleshooting,Identify and troubleshoot issues using ALB Controller's backend health & metrics endpoints for Application Gateway for Containers.,"Understanding backend health of your Kubernetes services and pods is crucial in identifying issues and assistance in troubleshooting. To help facilitate visibility into backend health, ALB Controller exposes backend health and metrics endpoints in all ALB Controller deployments. 
ALB Controller's backend health exposes three different experiences: ALB Controller's metric endpoint exposes both metrics and summary of backend health. This endpoint enables exposure to Prometheus. Access to these endp",2024-06-04T05:50:00.000Z,concept-article,troubleshooting,0.75,True,"Documents specific health and metrics endpoints, their schemas, and how to use them to diagnose backend issues in AGC.",unchanged https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/alb-controller-helm-chart,ALB Controller Helm Chart,ALB Controller Helm Chart,Configure ALB Controller Helm chart for Application Gateway for Containers,This article documents the latest helm chart for Application Gateway for Containers' ALB Controller.,A Helm chart to install the ALB Controller on Kubernetes. The following parameters are supported for configuration during installation:,2025-10-24T17:26:00.000Z,release-notes,configuration,0.85,True,"Helm chart documentation lists chart values with names, types, and defaults (e.g., controller settings, image tags, feature flags), which are explicit configuration parameters.",unchanged -https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/alb-controller-release-notes,ALB Controller release notes,Release notes for ALB Controller,,This article lists updates made to the Application Gateway for Containers ALB Controller.,This article provides details about changes to the ALB Controller for Application Gateway for Containers. The ALB Controller is a Kubernetes deployment that orchestrates configuration and deployment of Application Gateway for Containers. It uses both ARM and configuration APIs to propagate configuration to the Application Gateway for Containers Azure deployment. Each release of ALB Controller has a documented helm chart version and supported Kubernetes cluster version. 
Instructions for new or ex,2026-02-09T06:11:00.000Z,release-notes,,0.4,False,"Release notes list version changes and compatibility but generally do not provide stable, reusable expert configuration, limits, or troubleshooting mappings; they are more historical than skill-focused.",unchanged -https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/api-specification-kubernetes,API Specification,Application Gateway for Containers API Specification for Kubernetes,Use Kubernetes API specification for Application Gateway for Containers,This article provides documentation for Application Gateway for Containers' API specification for Kubernetes.,,2025-11-03T22:11:00.000Z,concept-article,configuration,0.9,True,"API specification for Kubernetes CRDs will include full schema: field names, types, allowed values, and defaults, which are core expert configuration details for this product.",unchanged +https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/alb-controller-release-notes,ALB Controller release notes,Release notes for ALB Controller,,This article lists updates made to the Application Gateway for Containers ALB Controller.,This article provides details about changes to the ALB Controller for Application Gateway for Containers. The ALB Controller is a Kubernetes deployment that orchestrates configuration and deployment of Application Gateway for Containers. It uses both Azure Resource Manager and configuration APIs to propagate configuration to the Application Gateway for Containers Azure deployment. Each release of ALB Controller has a documented helm chart version and supported Kubernetes cluster version. Instruc,2026-04-24T11:19:00.000Z,release-notes,,0.2,False,"Release notes primarily describe version history, bug fixes, and feature additions. 
They usually lack structured limits, configuration tables, or decision matrices as defined in the sub-skills, and are time-sensitive rather than reusable expert knowledge for an AI skills system.",updated +https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/api-specification-kubernetes,API Specification,Application Gateway for Containers API Specification for Kubernetes,Use Application Gateway for Containers Kubernetes API spec,This article provides documentation for Application Gateway for Containers' API specification for Kubernetes.,,2026-04-22T08:00:00.000Z,concept-article,configuration,0.78,True,"An API specification page for a Kubernetes CRD typically lists resource schemas, field names, allowed values, and defaults for Application Gateway for Containers objects. These are product-specific configuration parameters that an LLM is unlikely to know from training, fitting the configuration sub-skill.",updated https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/application-gateway-for-containers-components,Application Gateway for Containers components,Application Gateway for Containers components,Configure components of Application Gateway for Containers,This article provides information about how Application Gateway for Containers accepts incoming requests and routes them to a backend target.,"This article provides detailed descriptions and requirements for components of Application Gateway for Containers. It explains how Application Gateway for Containers accepts incoming requests and routes them to a backend target. For a general overview of Application Gateway for Containers, see What is Application Gateway for Containers.
The AKS add-on for Application Gateway for Containers provides a managed deployment experience by AKS for the ALB Controller, eliminating the need to manually dep",2026-03-25T16:54:00.000Z,concept-article,configuration,0.68,True,"The page describes detailed, product-specific components and how requests are routed, including concrete requirements and behaviors for Application Gateway for Containers objects (listeners, routes, backends, etc.). This is configuration-focused expert knowledge about how to set up and wire these components together, beyond generic concepts. It does not primarily present limits, troubleshooting, or decision matrices, but rather concrete configuration semantics.",unchanged https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/application-gateway-for-containers-metrics,Metrics,Azure Monitor metrics for Application Gateway for Containers,Use Azure Monitor metrics with Application Gateway for Containers,Learn how to use metrics to monitor performance of Application Gateway for Containers,"Application Gateway for Containers publishes data points to Azure Monitor for the performance of your Application Gateway for Containers and backend instances. These data points are called metrics, and are numerical values in an ordered set of time-series data. Metrics describe some aspect of your application gateway at a particular time. If there are requests flowing through the Application Gateway, it measures and sends its metrics in 60-second intervals.
If there are no requests flowing through",2025-07-21T22:12:00.000Z,concept-article,configuration,0.75,True,"Defines AGC metric names, intervals, and meanings, and how to configure monitoring; this is detailed metric configuration reference.",unchanged https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/container-networking,Container networking,Container networking with Azure Application Gateway for Containers,Choose container networking for Application Gateway for Containers,Learn how Azure Application Gateway for Containers works with different container networking interfaces.,"Kubernetes uses Container Networking Interface (CNI) plugins to manage networking in Kubernetes clusters. CNIs are responsible for assigning IP addresses to pods, network routing between pods, Kubernetes Service routing, and more. Azure Kubernetes Service (AKS) uses two main networking models: overlay network and flat network. Overlay networks: Flat networks: When choosing a networking model, consider the use cases for each CNI plugin and the type of network model it uses: When provisioning Applicat",2026-02-18T18:42:00.000Z,concept-article,decision-making,0.7,True,"Explains overlay vs flat networking models, CNI plugin implications, and how they affect provisioning AGC, guiding technology selection.",unchanged https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/custom-health-probe,Custom health probe,Custom health probe for Azure Application Gateway for Containers,Configure custom health probes for Application Gateway for Containers,Learn how to configure a custom health probe for Azure Application Gateway for Containers.,"Application Gateway for Containers monitors the health of all backend targets by default. As backend targets become healthy or unhealthy, Application Gateway for Containers only distributes traffic to healthy endpoints.
In addition to using default health probe monitoring, you can also customize the health probe to suit your application's requirements. This article discusses both default and custom health probes. The order and logic of health probing is as follows: The following properties make ",2024-10-28T08:00:00.000Z,concept-article,configuration,0.85,True,"Details probe properties, default behavior, and allowed values for custom health probes, which are specific configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/diagnostics,Diagnostic Logs,Diagnostic logs for Application Gateway for Containers,Enable and use diagnostic logs for Application Gateway for Containers,Learn how to enable access logs for Application Gateway for Containers,"Learn how to troubleshoot common problems in Application Gateway for Containers. You can monitor Azure Application Gateway for Containers resources in the following ways: Logs: Logs allow for performance, access, and other data to be saved or consumed from a resource for monitoring purposes. Metrics: Application Gateway for Containers has several metrics that help you verify your system is performing as expected.",2025-07-21T22:12:00.000Z,concept-article,configuration,0.7,True,"Explains log categories, how to enable them, and how to use logs and metrics for troubleshooting AGC; these are product-specific monitoring configurations.",unchanged https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/ecdsa-rsa-certificates,ECDSA and RSA certificates,ECDSA and RSA Certificates for Azure Application Gateway for Containers,Configure ECDSA and RSA TLS certificates on Application Gateway for Containers,Learn how to configure a listener with both ECDSA and RSA certificates for Azure Application Gateway for Containers.,"Cryptography is vital in ensuring privacy, integrity, and security of data as it is transmitted between a client and server on a network. 
Two widely adopted cryptographic algorithms for asymmetric encryption are Rivest-Shamir-Adleman (RSA) and Elliptic Curve Digital Signature Algorithm (ECDSA).",2024-05-09T22:10:00.000Z,concept-article,security,0.75,True,"Details how to attach and use ECDSA and RSA certificates on listeners, including security-related configuration parameters.",unchanged -https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/faq,Frequently Asked Questions (FAQ),Frequently asked questions about Azure Application Gateway for Containers,,Find answers to frequently asked questions about Azure Application Gateway for Containers.,The following are common questions asked about Azure Application Gateway for Containers.,2025-07-17T22:14:00Z,concept-article,,0.5,False,"FAQ page; likely mixes conceptual and minor specifics but not organized as deep reference, configuration tables, or troubleshooting flows.",unchanged +https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/faq,Frequently Asked Questions (FAQ),Frequently asked questions about Azure Application Gateway for Containers,Troubleshoot common issues in Application Gateway for Containers,Find answers to frequently asked questions about Azure Application Gateway for Containers.,The following are common questions asked about Azure Application Gateway for Containers.,2026-04-24T11:19:00Z,concept-article,troubleshooting,0.6,True,"FAQ pages for Azure services typically include specific behaviors, constraints, and resolutions for common issues; this likely maps symptoms and questions to product-specific explanations and fixes, aligning best with troubleshooting.",updated https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/grpc,gRPC,gRPC with Application Gateway for Containers,Enable and configure gRPC support on Application Gateway for Containers,Learn how to configure Application Gateway for Containers with support for 
gRPC.,,2024-09-17T22:04:00.000Z,how-to,configuration,0.7,True,"Describes specific settings and routing configuration required for gRPC over AGC, which are product-specific integration/config details.",unchanged https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/how-to-backend-mtls-gateway-api,Backend MTLS,Backend MTLS with Application Gateway for Containers - Gateway API,Configure backend mTLS for Application Gateway for Containers,Learn how to configure Application Gateway for Containers with support for backend MTLS authentication.,This document helps set up an example application that uses the following resources from Gateway API. Steps are provided to:,2024-11-05T08:00:00.000Z,how-to,security,0.7,True,"How-to for backend mutual TLS with Gateway API will include product-specific TLS/auth configuration fields (certificate references, secret names, annotations, and required values) that are expert, security-focused settings rather than generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/how-to-cert-manager-lets-encrypt-gateway-api,Managed certificates with cert-manager,Cert-manager and Let's Encrypt with Application Gateway for Containers - Gateway API,Use cert-manager and Let’s Encrypt with Application Gateway for Containers,Learn how to configure Application Gateway for Containers with certificates managed by CNCF project cert-manager.,"This guide demonstrates how to use cert-manager to automatically issue and renew SSL/TLS certificates to one or more frontends of your Azure Application Gateway for Containers deployment. We use the Gateway API to configure the necessary resources. For the purposes of this example, we have cert-manager configure certificates issued from Let's Encrypt to demonstrate an end-to-end deployment, where Application Gateway for Containers is providing TLS offloading. 
For certificates to be issued by Let",2026-02-20T08:00:00.000Z,how-to,security,0.7,True,"Describes exact integration settings between cert-manager, Let’s Encrypt, and Application Gateway for Containers (issuer kinds, annotations, secret names, TLS sections), which are detailed security and certificate configuration parameters.",unchanged @@ -74,17 +74,18 @@ https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/how-t https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/how-to-url-rewrite-ingress-api,URL rewrite,URL Rewrite for Azure Application Gateway for Containers - Ingress API,Configure URL rewrite using Ingress API for Application Gateway for Containers,Learn how to rewrite URLs in Ingress API for Application Gateway for Containers.,"Application Gateway for Containers allows you to rewrite the URL of a client request, including the requests' hostname and/or path. When Application Gateway for Containers initiates the request to the backend target, the request contains the newly rewritten URL to initiate the request.",2026-02-09T06:11:00.000Z,how-to,configuration,0.7,True,"Includes concrete Ingress annotations or fields for URL rewrite behavior, representing detailed configuration parameters unique to this gateway.",unchanged https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/how-to-waf-gateway-api,Web Application Firewall (WAF),Azure Web Application Firewall on Application Gateway for Containers - Gateway API,Test and configure Web Application Firewall on Application Gateway for Containers,This article provides an example scenario for testing Azure Web Application Firewall on Application Gateway for Containers.,This article helps you set up an example application that uses resources from the Gateway API. 
The article provides steps to:,2025-07-31T22:14:00.000Z,how-to,security,0.7,True,"WAF example scenario will include specific WAF policy settings, rule IDs/modes, and configuration fields for this product, which are detailed security configurations.",unchanged https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/how-to-websockets-gateway-api,WebSockets,WebSocket protocol and Azure Application Gateway for Containers - Gateway API,Configure WebSocket support in Application Gateway for Containers,Learn how to send a WebSocket request to a backend target with Application Gateway for Containers.,Application Gateway for Containers allows you to leverage WebSocket protocol to connect to backend targets with Application Gateway for Containers.,2026-02-09T06:11:00.000Z,conceptual,configuration,0.6,True,"WebSocket usage with this gateway will involve specific annotations or configuration fields to enable/route WebSocket traffic, which are product-specific configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/migrate-from-agic-to-agc,Migrate from AGIC to Application Gateway for Containers,Migration Overview - Move Application Gateway Ingress Controller (AGIC) services to Application Gateway for Containers,Plan migration from AGIC to Application Gateway for Containers,Learn how to migrate services from AGIC to Application Gateway for Containers.,"Application Gateway Ingress Controller (AGIC) and Application Gateway for Containers are two solutions that enable application load balancing for Azure Kubernetes Service (AKS) services. In 2018, Application Gateway started Application Gateway Ingress Controller, which translated Kubernetes Ingress configuration to Application Gateway configuration. 
Over time, Kubernetes has pushed the requirements of scale, performance, and introduced a successor API to Ingress called Gateway API; which has lea",2025-11-10T08:00:00.000Z,concept-article,decision-making,0.8,True,"Compares AGIC vs AGC, explains when and how to migrate, and likely includes scenario-based guidance and migration considerations.",unchanged -https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/overview,Application Gateway for Containers,What is Application Gateway for Containers?,,"Overview of Azure Application Load Balancer Application Gateway for Containers features, resources, architecture, and implementation. Learn how Application Gateway for Containers works and how to use ","Application Gateway for Containers is an application layer (layer 7) load balancing and dynamic traffic management product for workloads running in a Kubernetes cluster. It extends Azure's Application Load Balancing portfolio and is a new offering under the Application Gateway product family. Application Gateway for Containers is the evolution of the Application Gateway Ingress Controller (AGIC), a Kubernetes application that enables Azure Kubernetes Service (AKS) customers to use Azure's native Appl",2026-02-07T08:00:00.000Z,overview,,0.3,False,"Conceptual overview of Application Gateway for Containers and architecture; no detailed limits, configuration parameter tables, or error mappings.",unchanged +https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/load-balancing-strategies,Load balancing strategies,Load balancing strategies in Application Gateway for Containers,Choose load balancing strategies in Application Gateway for Containers,Learn about different load balancing strategies to help build resilient and performant workloads.,"In today's dynamic and high-demand environments, efficient load balancing is crucial for maintaining the performance and reliability of applications.
Application Gateway for Containers offers multiple load balancing algorithms to cater to different scenarios and requirements. Understanding and choosing the right load balancing algorithm can significantly impact the overall user experience, resource utilization, and system resilience. Different load balancing algorithms provide unique benefits an",2026-04-24T11:19:00.000Z,concept-article,decision-making,0.65,True,"Describes multiple load balancing algorithms and when to use each for performance and resilience; this is product-specific strategy selection guidance, aligning with decision-making rather than generic concepts.",new +https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/migrate-from-agic-to-agc,Migrate from AGIC to Application Gateway for Containers,Migration Overview - Move Application Gateway Ingress Controller (AGIC) services to Application Gateway for Containers,Decide and plan migration from AGIC to Application Gateway for Containers,Learn how to migrate services from AGIC to Application Gateway for Containers.,"Application Gateway Ingress Controller (AGIC) and Application Gateway for Containers are two solutions that enable application load balancing for Azure Kubernetes Service (AKS) services. In 2018, Application Gateway started Application Gateway Ingress Controller, which translated Kubernetes Ingress configuration to Application Gateway configuration. 
Over time, Kubernetes has pushed the requirements of scale, performance, and introduced a successor API to Ingress called Gateway API; which has lea",2026-04-24T11:19:00.000Z,concept-article,decision-making,0.7,True,"Migration overview between AGIC and Application Gateway for Containers; likely includes scenario-based guidance, comparison of capabilities, and when/how to move, which fits decision-making and contains product-specific migration considerations.",updated +https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/overview,Application Gateway for Containers,What is Application Gateway for Containers?,,"Overview of Azure Application Load Balancer Application Gateway for Containers features, resources, architecture, and implementation. Learn how Application Gateway for Containers works and how to use ","Application Gateway for Containers is an application layer (layer 7) load balancing and dynamic traffic management product for workloads running in a Kubernetes cluster. It extends Azure's Application Load Balancing portfolio and is a new offering under the Application Gateway product family. Application Gateway for Containers is the evolution of the Application Gateway Ingress Controller (AGIC), a Kubernetes application that enables Azure Kubernetes Service (AKS) customers to use Azure's native Appl",2026-04-24T11:19:00.000Z,overview,,0.2,False,"Page is an overview of Application Gateway for Containers features and architecture without clear indication of numeric limits, configuration tables, error codes, or decision matrices.
It appears to be conceptual/introductory rather than containing product-specific expert details.",updated https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/prometheus-grafana,Prometheus and Grafana Configuration,Configure Application Gateway for Containers for Prometheus and Grafana,Integrate App Gateway for Containers with Prometheus and Grafana,Configure Application Gateway for Containers metrics to be sent to Prometheus and displayed on Grafana.,"Establishing monitoring for Application Gateway for Containers is a crucial part of successful operations. Monitoring allows you to visualize how traffic is controlled, providing actionable insights that help optimize performance and troubleshoot issues promptly. Monitoring also enhances security by providing valuable insights during investigations, which helps ensure that your gateway remains secure and resilient against threats.",2026-03-10T22:11:00.000Z,how-to,integrations,0.7,True,"Configuration-focused integration article for sending metrics to Prometheus and visualizing in Grafana. Likely includes product-specific metric endpoints, scrape configurations, and parameter values unique to Application Gateway for Containers, fitting the integrations & coding patterns category.",unchanged https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/quickstart-create-application-gateway-for-containers-byo-deployment,Create Application Gateway for Containers - bring your own deployment,Quickstart: Create Application Gateway for Containers - bring your own deployment,,"In this quickstart, you learn how to provision and manage the Application Gateway for Containers Azure resources independent from Kubernetes configuration.","This guide assumes you're following the bring your own deployment strategy, where ALB Controller references the Application Gateway for Containers resources precreated in Azure.
It's assumed that resource lifecycles are managed in Azure, independent from what is defined within Kubernetes.",2024-08-12T22:07:00.000Z,quickstart,,0.4,False,Quickstart for bring-your-own deployment; likely high-level steps rather than detailed configuration matrices.,unchanged https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/quickstart-create-application-gateway-for-containers-managed-by-alb-controller,Create Application Gateway for Containers - managed by ALB Controller,Quickstart: Create Application Gateway for Containers managed by ALB Controller,,"In this quickstart, you learn how to provision the Application Gateway for Containers resources via Kubernetes definition.","This guide assumes you're following the managed by ALB controller deployment strategy, where all the Application Gateway for Containers resources are managed by ALB controller. Lifecycle is determined by the resources defined in Kubernetes. ALB Controller creates the Application Gateway for Containers resource when an ApplicationLoadBalancer custom resource is defined on the cluster. The Application Gateway for Containers lifecycle is based on the lifecycle of the custom resource.",2024-02-27T18:07:00.000Z,quickstart,,0.4,False,"Quickstart for ALB-managed deployment; mostly tutorial content, not deep reference or troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/quickstart-deploy-application-gateway-for-containers-alb-controller-addon,Deploy Application Gateway for Containers ALB Controller - Add-on,Quickstart: Deploy Application Gateway for Containers ALB Controller - AKS Add-on,,"In this quickstart, you learn how to provision the Application Gateway for Containers ALB Controller in an AKS cluster using the AKS add-on.",The ALB Controller is responsible for translating Gateway API and Ingress API configuration within Kubernetes to load balancing rules within Application Gateway for Containers.
The following guide walks through the steps needed to enable both the AKS managed Gateway API and Application Gateway for Containers ALB Controller add-ons on a new or existing Azure Kubernetes Service (AKS) cluster.,2026-02-09T08:00:00.000Z,quickstart,,0.4,False,"Quickstart deployment walkthrough for ALB Controller add-on; primarily step-by-step, not a constraints matrix or deep config reference.",unchanged -https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/quickstart-deploy-application-gateway-for-containers-alb-controller-helm,Deploy Application Gateway for Containers ALB Controller - Helm,Quickstart: Deploy Application Gateway for Containers ALB Controller - Helm,,"In this quickstart, you learn how to provision the Application Gateway for Containers ALB Controller in an AKS cluster.",The ALB Controller is responsible for translating Gateway API and Ingress API configuration within Kubernetes to load balancing rules within Application Gateway for Containers. The following guide walks through the steps needed to provision an ALB Controller into a new or existing Azure Kubernetes Service (AKS) cluster.,2026-02-09T06:11:00.000Z,quickstart,,0.4,False,Quickstart for deploying ALB Controller via Helm; mostly procedural without extensive parameter tables or constraints.,unchanged +https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/quickstart-deploy-application-gateway-for-containers-alb-controller-addon,Deploy Application Gateway for Containers ALB Controller - Add-on,Quickstart: Deploy Application Gateway for Containers ALB Controller - AKS Add-on,,"In this quickstart, you learn how to provision the Application Gateway for Containers ALB Controller in an AKS cluster using the AKS add-on.",The ALB Controller is responsible for translating Gateway API and Ingress API configuration within Kubernetes to load balancing rules within Application Gateway for Containers. 
The following guide walks through the steps needed to enable both the AKS managed Gateway API and Application Gateway for Containers ALB Controller add-ons on a new or existing Azure Kubernetes Service (AKS) cluster.,2026-04-24T11:19:00.000Z,quickstart,,0.3,False,"Quickstart for enabling the ALB Controller add-on on AKS; primarily step-by-step provisioning with generic commands and no detailed configuration tables, limits, error codes, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/quickstart-deploy-application-gateway-for-containers-alb-controller-helm,Deploy Application Gateway for Containers ALB Controller - Helm,Quickstart: Deploy Application Gateway for Containers ALB Controller - Helm,,"In this quickstart, you learn how to provision the Application Gateway for Containers ALB Controller in an AKS cluster.",The ALB Controller is responsible for translating Gateway API and Ingress API configuration within Kubernetes to load balancing rules within Application Gateway for Containers. The following guide walks through the steps needed to provision an ALB Controller into a new or existing Azure Kubernetes Service (AKS) cluster.,2026-04-22T08:00:00.000Z,quickstart,,0.3,False,"Quickstart for deploying ALB Controller via Helm; focuses on basic install commands rather than detailed configuration options, limits, troubleshooting mappings, or best-practice guidance.",updated
This offers better elasticity to your application and eliminates the need to guess capacity or manually manage instance counts. Autoscaling also maximizes cost savings by not requiring the gateway to constantly run at peak-provisioned capacity for the expected maximum traffic load.,2026-03-12T22:17:00.000Z,concept-article,,0.2,False,"High-level description of autoscaling and zone redundancy; likely conceptual without detailed numeric thresholds, configuration tables, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/server-sent-events,Server-sent events,Server-sent events and Application Gateway for Containers,Configure server-sent events with Application Gateway for Containers,Learn how server-sent events interact with Azure Application Gateway for Containers.,"Server-sent events (SSEs) provide a useful mechanism to enable servers to push real-time updates to web clients over a single HTTP connection. Unlike WebSockets, which allow bidirectional communication, SSEs are unidirectional: the server sends data to the client without expecting any responses. Applications using server-sent events can be found across several industries, such as medical (waiting area status boards), finance (displaying a stock ticker), aviation (flight status), and meteorology ",2025-05-08T17:04:00.000Z,concept-article,configuration,0.65,True,"Explains AGC behavior with SSE and any required configuration or constraints, which are specific to this product.",unchanged https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/service-mesh-helm-chart,ALB Service Mesh Helm Chart,ALB Service Mesh Helm Chart,Configure ALB Service Mesh Helm chart for Application Gateway for Containers,This article documents the latest Helm chart for Application Gateway for Containers' ALB Service Mesh Extension.,A Helm chart to install ALB Controller Service Mesh Extension on Kubernetes. 
This chart should only be installed after installation of ALB Controller (Add-on or Helm).,2025-11-18T18:43:00.000Z,release-notes,configuration,0.85,True,"Service Mesh Helm chart docs enumerate values and defaults for installing the extension, which are detailed configuration options specific to this product.",unchanged -https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/service-mesh-integration,Service mesh integration,Service mesh integration with Application Gateway for Containers,Integrate Application Gateway for Containers with Istio service mesh,Learn how to integrate Application Gateway for Containers with Istio service mesh for secure ingress traffic.,"Service meshes handle east-west traffic between services and often use mutual TLS (mTLS) for security. When services need external access (north-south traffic), an ingress gateway simplifies connectivity. Application Gateway for Containers provides a service mesh extension that automates certificate lifecycle management and reduces configuration complexity. While defining mTLS in ingress is possible, definition of mTLS configuration can become redundant and difficult to manage during certificate",2026-02-23T08:00:00.000Z,how-to,integrations,0.8,True,"Provides concrete configuration patterns, certificate handling, and mTLS integration steps between AGC and Istio, including specific YAML and parameters.",unchanged +https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/service-mesh-integration,Service mesh integration,Service mesh integration with Application Gateway for Containers,Integrate Istio service mesh with Application Gateway for Containers,Learn how to integrate Application Gateway for Containers with Istio service mesh for secure ingress traffic.,"Service meshes handle east-west traffic between services and often use mutual TLS (mTLS) for security. When services need external access (north-south traffic), an ingress gateway simplifies connectivity. 
Application Gateway for Containers provides a service mesh extension that automates certificate lifecycle management and reduces configuration complexity. While defining mTLS in ingress is possible, definition of mTLS configuration can become redundant and difficult to manage during certificate",2026-04-24T11:19:00.000Z,how-to,integrations,0.65,True,"Covers integration of Application Gateway for Containers with Istio service mesh for secure ingress; likely includes product-specific configuration patterns and parameters for the service mesh extension, fitting integrations & coding patterns.",updated https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/session-affinity,Session affinity,Session affinity overview for Azure Application Gateway for Containers,Configure session affinity for Application Gateway for Containers,Learn how to configure session affinity for Azure Application Gateway for Containers.,"Session affinity, also known as session persistence or sticky sessions, is a technique used in load balancing to ensure a client's requests are always sent to the same server. This is important for applications that store user data in session variables or in a local cache on a particular server (commonly referred to as a stateful application). With session affinity, Application Gateway for Containers presents a cookie in the Set-Cookie header of the first response. 
If the client presents the cookie i",2025-07-02T08:00:00.000Z,concept-article,configuration,0.8,True,"Explains cookie behavior, configuration options, and constraints for sticky sessions in AGC, which are product-specific settings.",unchanged https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/siem-integration-with-sentinel,SIEM integration with Sentinel,Configure Application Gateway for Containers for SIEM integration with Microsoft Sentinel and Microsoft Defender,Integrate Application Gateway for Containers logs with Microsoft Sentinel and Defender,Configure Application Gateway for Containers for SIEM integration with Microsoft Sentinel,"By integrating Application Gateway for Containers (AGC) with Microsoft Sentinel, you centralize and streamline security data collection across your environment. By using this quickstart guide, you can quickly configure Microsoft Sentinel to ingest AGC access logs, enabling precise monitoring, threat detection, and investigation based on real traffic signals. By adding the solution from the Content Hub and configuring the appropriate data connector, AGC access logs begin flowing into Microsoft Se",2026-02-20T18:12:00.000Z,how-to,integrations,0.8,True,"Describes specific data connectors, configuration steps, and log types for SIEM integration between AGC and Sentinel/Defender.",unchanged https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/tls-policy,TLS policy,TLS policy overview for Azure Application Gateway for Containers,Configure TLS policy for Application Gateway for Containers,Learn how to configure TLS policy for Azure Application Gateway for Containers.,"You can use Azure Application Gateway for Containers to control TLS ciphers to meet compliance and security goals of the organization. TLS policy includes definition of the TLS protocol version, cipher suites, and order in which ciphers are preferred during a TLS handshake. 
Application Gateway for Containers currently offers two predefined policies to choose from.",2025-07-24T22:10:00.000Z,concept-article,security,0.8,True,"Lists predefined TLS policies, protocol versions, and cipher suites, and how to select them to meet compliance requirements.",unchanged diff --git a/products/azure-application-gateway/report.md b/products/azure-application-gateway/report.md index 1f0d0e2e..508138bb 100644 --- a/products/azure-application-gateway/report.md +++ b/products/azure-application-gateway/report.md @@ -1,9 +1,9 @@ --- -generated_at: '2026-04-12' +generated_at: '2026-04-26' category_descriptions: - configuration: 'Configuring and monitoring Azure Application Gateway and Application - Gateway for Containers: listeners, routing, probes, health, headers/URL rewrite, - session affinity, diagnostics, and AKS/AGIC integration.' + configuration: 'Configuring Application Gateway and Application Gateway for Containers: + listeners, routing, probes, health, headers/URL rewrite, WebSockets/gRPC, networking, + monitoring, and AKS/AGIC integration.' limits-quotas: Autoscaling and zone redundancy settings, gateway capacity and configuration limits, and guidance for migrating from Application Gateway v1 to v2. security: TLS/SSL, certificates, mTLS, WAF, DDoS, HSTS, and secure access patterns @@ -12,45 +12,46 @@ category_descriptions: deployment: Guides for deploying and migrating Application Gateway (v1→v2, IPv6, mTLS), configuring autoscale, and setting up/upgrading AGIC with AKS using portal, ARM, PowerShell, and Helm. - troubleshooting: 'Diagnosing and fixing Application Gateway runtime issues: backend - health, 502s, certificates/Key Vault, listeners, session affinity, mTLS, redirects, - AKS/ALB/containers, and HTTP response codes.' 
- decision-making: Guidance on choosing networking and pricing for Application Gateway, - and planning migrations (AGIC to containers, v1 retirement, classic VMs to ARM) - integrations: 'Patterns for integrating App Gateway for Containers with monitoring, - security, and scaling: Prometheus/Grafana, Istio, Sentinel/Defender, and autoscaling - AKS pods via gateway metrics.' + troubleshooting: Diagnosing and fixing Application Gateway for Containers issues + using ALB Controller backend health, metrics, and common error/behavior troubleshooting + steps. + decision-making: Guidance on choosing networking and load-balancing for App Gateway + for Containers, estimating pricing, and planning migrations (AGIC, v1 retirement, + classic VMs to ARM). + integrations: Integrating App Gateway for Containers with Prometheus/Grafana, Istio, + Sentinel/Defender, and using its metrics to autoscale AKS pods and build observability/security + pipelines. best-practices: 'Guidance on designing Application Gateway for very high traffic: sizing, autoscaling, performance tuning, capacity planning, and configuration patterns to handle large loads reliably.' skill_description: Expert knowledge for Azure Application Gateway development including troubleshooting, best practices, decision making, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring listeners/routing, - WAF/TLS, AGIC/AKS integration, autoscale/zone redundancy, or v1→v2 migration, and - other Azure Application Gateway related development tasks. Not for Azure Front Door - (use azure-front-door), Azure Load Balancer (use azure-load-balancer), Azure Traffic - Manager (use azure-traffic-manager), Azure Firewall (use azure-firewall). -use_when: Use when configuring listeners/routing, WAF/TLS, AGIC/AKS integration, autoscale/zone - redundancy, or v1→v2 migration, and other Azure Application Gateway related development - tasks. 
-confusable_not_for: Not for Azure Front Door (use azure-front-door), Azure Load Balancer - (use azure-load-balancer), Azure Traffic Manager (use azure-traffic-manager), Azure - Firewall (use azure-firewall). + WAF/TLS, autoscale/zone redundancy, AGIC with AKS, or App Gateway for Containers, + and other Azure Application Gateway related development tasks. Not for Azure Load + Balancer (use azure-load-balancer), Azure Front Door (use azure-front-door), Azure + Traffic Manager (use azure-traffic-manager), Azure Virtual Network (use azure-virtual-network). +use_when: Use when configuring listeners/routing, WAF/TLS, autoscale/zone redundancy, + AGIC with AKS, or App Gateway for Containers, and other Azure Application Gateway + related development tasks. +confusable_not_for: Not for Azure Load Balancer (use azure-load-balancer), Azure Front + Door (use azure-front-door), Azure Traffic Manager (use azure-traffic-manager), + Azure Virtual Network (use azure-virtual-network). --- # Azure Application Gateway Crawl Report ## Summary -- **Total Pages**: 174 -- **Fetched**: 174 +- **Total Pages**: 175 +- **Fetched**: 175 - **Fetch Failed**: 0 -- **Classified**: 125 -- **Unclassified**: 49 +- **Classified**: 127 +- **Unclassified**: 48 ### Incremental Update -- **New Pages**: 0 -- **Updated Pages**: 0 -- **Unchanged**: 174 +- **New Pages**: 1 +- **Updated Pages**: 8 +- **Unchanged**: 166 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-application-gateway/azure-application-gateway.csv` @@ -59,22 +60,44 @@ confusable_not_for: Not for Azure Front Door (use azure-front-door), Azure Load | Type | Count | Percentage | |------|-------|------------| | best-practices | 1 | 0.6% | -| configuration | 61 | 35.1% | -| decision-making | 6 | 3.4% | +| configuration | 61 | 34.9% | +| decision-making | 7 | 4.0% | | deployment | 12 | 6.9% | | integrations | 4 | 2.3% | | limits-quotas | 1 | 0.6% | -| security | 38 | 21.8% | -| troubleshooting | 2 | 
1.1% | -| *(Unclassified)* | 49 | 28.2% | +| security | 38 | 21.7% | +| troubleshooting | 3 | 1.7% | +| *(Unclassified)* | 48 | 27.4% | ## Changes +### New Pages + +- [Load balancing strategies](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/load-balancing-strategies) + +### Updated Pages + +- [Application Gateway for Containers](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/overview) + - Updated: 2026-02-07T08:00:00.000Z → 2026-04-24T11:19:00.000Z +- [Deploy Application Gateway for Containers ALB Controller - Add-on](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/quickstart-deploy-application-gateway-for-containers-alb-controller-addon) + - Updated: 2026-02-09T08:00:00.000Z → 2026-04-24T11:19:00.000Z +- [Deploy Application Gateway for Containers ALB Controller - Helm](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/quickstart-deploy-application-gateway-for-containers-alb-controller-helm) + - Updated: 2026-02-09T06:11:00.000Z → 2026-04-22T08:00:00.000Z +- [Migrate from AGIC to Application Gateway for Containers](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/migrate-from-agic-to-agc) + - Updated: 2025-11-10T08:00:00.000Z → 2026-04-24T11:19:00.000Z +- [Service mesh integration](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/service-mesh-integration) + - Updated: 2026-02-23T08:00:00.000Z → 2026-04-24T11:19:00.000Z +- [Frequently Asked Questions (FAQ)](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/faq) + - Updated: 2025-07-17T22:14:00Z → 2026-04-24T11:19:00Z +- [API Specification](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/api-specification-kubernetes) + - Updated: 2025-11-03T22:11:00.000Z → 2026-04-22T08:00:00.000Z +- [ALB Controller release 
notes](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/alb-controller-release-notes) + - Updated: 2026-02-09T06:11:00.000Z → 2026-04-24T11:19:00.000Z + ## Classified Pages | TOC Title | Type | Confidence | Reason | |-----------|------|------------|--------| -| [API Specification](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/api-specification-kubernetes) | configuration | 0.90 | API specification for Kubernetes CRDs will include full schema: field names, types, allowed values, and defaults, which are core expert configuration details for this product. | | [Ingress for AKS annotations](https://learn.microsoft.com/en-us/azure/application-gateway/ingress-controller-annotations) | configuration | 0.90 | Lists concrete annotation keys, allowed values, and their effects on Application Gateway; this is a configuration reference unique to AGIC. | | [ALB Controller Helm Chart](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/alb-controller-helm-chart) | configuration | 0.85 | Helm chart documentation lists chart values with names, types, and defaults (e.g., controller settings, image tags, feature flags), which are explicit configuration parameters. | | [ALB Service Mesh Helm Chart](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/service-mesh-helm-chart) | configuration | 0.85 | Service Mesh Helm chart docs enumerate values and defaults for installing the extension, which are detailed configuration options specific to this product. 
| @@ -87,14 +110,13 @@ confusable_not_for: Not for Azure Front Door (use azure-front-door), Azure Load | [Configure Key Vault - PowerShell](https://learn.microsoft.com/en-us/azure/application-gateway/configure-keyvault-ps) | security | 0.80 | Describes integrating Key Vault with Application Gateway v2 for TLS certificates using PowerShell; includes SKU restriction and specific Key Vault and gateway parameters, which are product-specific security configuration details. | | [JSON Web Token (JWT) Configuration](https://learn.microsoft.com/en-us/azure/application-gateway/json-web-token-overview) | security | 0.80 | Explains how Application Gateway validates JWTs from Microsoft Entra ID and enforces auth policies; includes product-specific security configuration and token handling behavior. | | [Listener specific SSL policy](https://learn.microsoft.com/en-us/azure/application-gateway/application-gateway-configure-listener-specific-ssl-policy) | security | 0.80 | Explains configuring different SSL policies per listener and default vs override behavior; these are product-specific security configuration patterns. | -| [Migrate from AGIC to Application Gateway for Containers](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/migrate-from-agic-to-agc) | decision-making | 0.80 | Compares AGIC vs AGC, explains when and how to migrate, and likely includes scenario-based guidance and migration considerations. | | [Monitoring data reference](https://learn.microsoft.com/en-us/azure/application-gateway/monitor-application-gateway-reference) | configuration | 0.80 | Contains detailed reference for metrics, logs, and diagnostic settings (names, dimensions, categories) specific to Application Gateway monitoring. 
| | [SIEM integration with Sentinel](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/siem-integration-with-sentinel) | integrations | 0.80 | Describes specific data connectors, configuration steps, and log types for SIEM integration between AGC and Sentinel/Defender. | -| [Service mesh integration](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/service-mesh-integration) | integrations | 0.80 | Provides concrete configuration patterns, certificate handling, and mTLS integration steps between AGC and Istio, including specific YAML and parameters. | | [Session affinity](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/session-affinity) | configuration | 0.80 | Explains cookie behavior, configuration options, and constraints for sticky sessions in AGC, which are product-specific settings. | | [TLS policy](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/tls-policy) | security | 0.80 | Lists predefined TLS policies, protocol versions, and cipher suites, and how to select them to meet compliance requirements. | | [Troubleshoot](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/troubleshooting-guide) | troubleshooting | 0.80 | A troubleshooting guide will map specific symptoms and error messages to causes and resolutions, likely including product-specific error codes and diagnostic steps. | | [Using Key Vault](https://learn.microsoft.com/en-us/azure/application-gateway/key-vault-certs) | security | 0.80 | Describes integration with Key Vault for server certificates and includes a concrete TLS 1.2 enforcement date; product-specific security and certificate management configuration. 
| +| [API Specification](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/api-specification-kubernetes) | configuration | 0.78 | An API specification page for a Kubernetes CRD typically lists resource schemas, field names, allowed values, and defaults for Application Gateway for Containers objects. These are product-specific configuration parameters that an LLM is unlikely to know from training, fitting the configuration sub-skill. | | [Add secure flag for cookies](https://learn.microsoft.com/en-us/azure/application-gateway/application-gateway-secure-flag-session-affinity) | security | 0.78 | Shows how to configure Secure and HttpOnly flags on ApplicationGatewayAffinity cookie via rewrite set; product-specific security configuration with concrete setting names. | | [Certificates for the backend](https://learn.microsoft.com/en-us/azure/application-gateway/certificates-for-backend-authentication) | security | 0.78 | Details how to convert TLS certificates into authentication/trusted root certificates and explains v1 vs v2 requirements; this is nuanced, product-specific TLS security behavior. | | [TLS 1.0 and 1.1 retirement](https://learn.microsoft.com/en-us/azure/application-gateway/application-gateway-tls-version-retirement) | security | 0.78 | Gives a concrete Azure-wide retirement date (31 August 2025) and product-specific guidance on handling TLS 1.0/1.1 deprecation for frontend and backend connections, which is time-sensitive expert configuration/operations knowledge. | @@ -133,6 +155,7 @@ confusable_not_for: Not for Azure Front Door (use azure-front-door), Azure Load | [Listeners](https://learn.microsoft.com/en-us/azure/application-gateway/configuration-listeners) | configuration | 0.70 | Details listener configuration (ports, protocols, host, IP, HTTP/2, WebSockets) with product-specific options and constraints, which are configuration expert knowledge. 
| | [Managed certificates with cert-manager](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/how-to-cert-manager-lets-encrypt-gateway-api) | security | 0.70 | Describes exact integration settings between cert-manager, Let’s Encrypt, and Application Gateway for Containers (issuer kinds, annotations, secret names, TLS sections), which are detailed security and certificate configuration parameters. | | [Managed certificates with cert-manager](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/how-to-cert-manager-lets-encrypt-ingress-api) | security | 0.70 | Ingress-based cert-manager integration includes specific annotations, issuer references, and TLS configuration fields that are product- and integration-specific security settings. | +| [Migrate from AGIC to Application Gateway for Containers](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/migrate-from-agic-to-agc) | decision-making | 0.70 | Migration overview between AGIC and Application Gateway for Containers; likely includes scenario-based guidance, comparison of capabilities, and when/how to move, which fits decision-making and contains product-specific migration considerations. | | [Migrate from v1 to v2](https://learn.microsoft.com/en-us/azure/application-gateway/migrate-v1-v2) | deployment | 0.70 | Provides concrete, product-specific migration steps using Azure PowerShell scripts, including configuration and traffic migration stages and script variants (enhanced vs legacy). This is specialized deployment/migration guidance beyond generic knowledge. | | [Mutual authentication](https://learn.microsoft.com/en-us/azure/application-gateway/mutual-authentication-overview) | security | 0.70 | Describes mutual authentication behavior and recommends TLS 1.2; involves product-specific security configuration for client certificate validation. 
| | [Mutual authentication - Template](https://learn.microsoft.com/en-us/azure/application-gateway/mutual-authentication-arm-template) | deployment | 0.70 | Quickstart for deploying Application Gateway with mTLS passthrough using a specific API version (2025-03-01); includes template schema and deployment-specific constraints, which are product-specific deployment details. | @@ -171,6 +194,7 @@ confusable_not_for: Not for Azure Front Door (use azure-front-door), Azure Load | [End-to-End TLS](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/how-to-end-to-end-tls-ingress-api) | security | 0.65 | Ingress API variant of end-to-end TLS will document Ingress-specific TLS fields, secret wiring, and product-specific options, which are concrete security configuration parameters. | | [HTTP settings](https://learn.microsoft.com/en-us/azure/application-gateway/configuration-http-settings) | configuration | 0.65 | Backend Settings configuration is product-specific, including parameters for backend connections and associations with routing rules. | | [Header rewrite](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/how-to-header-rewrite-ingress-api) | configuration | 0.65 | Shows exact Ingress annotations/fields and values for header rewrite, which are detailed configuration parameters unique to this controller. | +| [Load balancing strategies](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/load-balancing-strategies) | decision-making | 0.65 | Describes multiple load balancing algorithms and when to use each for performance and resilience; this is product-specific strategy selection guidance, aligning with decision-making rather than generic concepts. 
| | [Metrics](https://learn.microsoft.com/en-us/azure/application-gateway/application-gateway-metrics) | configuration | 0.65 | Lists concrete metric names, dimensions, and meanings plus how to configure alerts; these metric definitions are product-specific configuration/monitoring reference. | | [Migration FAQ](https://learn.microsoft.com/en-us/azure/application-gateway/retirement-faq) | decision-making | 0.65 | FAQ contains product-specific retirement timelines and what happens after retirement, guiding when and how to migrate from V1 to V2. This supports migration and planning decisions tied to concrete dates and behaviors. | | [Multi-site hosting](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/how-to-multiple-site-hosting-gateway-api) | configuration | 0.65 | Multi-site hosting with Gateway API requires specific host/path routing configuration fields and examples that are unique to this product’s CRDs and not generic knowledge. | @@ -181,6 +205,7 @@ confusable_not_for: Not for Azure Front Door (use azure-front-door), Azure Load | [SSL Offloading](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/how-to-ssl-offloading-ingress-api) | security | 0.65 | Documents Ingress TLS sections, secret references, and any gateway-specific options for SSL offload, which are detailed security configuration parameters. | | [SSL certificate management](https://learn.microsoft.com/en-us/azure/application-gateway/ssl-certificate-management) | security | 0.65 | Focuses on listener certificate management for TLS termination; product-specific security and certificate lifecycle configuration. | | [Server-sent events](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/server-sent-events) | configuration | 0.65 | Explains AGC behavior with SSE and any required configuration or constraints, which are specific to this product. 
| +| [Service mesh integration](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/service-mesh-integration) | integrations | 0.65 | Covers integration of Application Gateway for Containers with Istio service mesh for secure ingress; likely includes product-specific configuration patterns and parameters for the service mesh extension, fitting integrations & coding patterns. | | [Traffic Splitting / Weighted Round Robin](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/how-to-traffic-splitting-gateway-api) | configuration | 0.65 | Traffic splitting/weighted round robin requires specifying weight fields and backend references in Gateway API resources, which are detailed configuration parameters unique to this product. | | [Understanding pricing](https://learn.microsoft.com/en-us/azure/application-gateway/understanding-pricing) | decision-making | 0.65 | Describes billing process for v1 and v2 SKUs; typically includes cost components and SKU differences to inform pricing-related decisions. | | [WebSockets](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/websockets) | configuration | 0.65 | Covers how AGC handles WebSockets and any necessary configuration or limitations, which are product-specific. | @@ -195,6 +220,7 @@ confusable_not_for: Not for Azure Front Door (use azure-front-door), Azure Load | [SSL termination - Azure CLI](https://learn.microsoft.com/en-us/azure/application-gateway/tutorial-ssl-cli) | security | 0.62 | Similar to the PowerShell article but using CLI; includes concrete flags/parameters for binding certificates to listeners, which are product-specific security settings. 
| | [SSL termination - PowerShell](https://learn.microsoft.com/en-us/azure/application-gateway/tutorial-ssl-powershell) | security | 0.62 | Shows how to attach certificates for TLS termination via PowerShell; likely includes certificate parameter names, listener settings, and bindings that are product-specific security configuration details. | | [Configure Private Link](https://learn.microsoft.com/en-us/azure/application-gateway/private-link-configure) | configuration | 0.60 | How-to article for configuring Application Gateway Private Link via portal, PowerShell, or CLI, likely including specific parameters and commands unique to this feature, which constitutes product-specific configuration knowledge. | +| [Frequently Asked Questions (FAQ)](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/faq) | troubleshooting | 0.60 | FAQ pages for Azure services typically include specific behaviors, constraints, and resolutions for common issues; this likely maps symptoms and questions to product-specific explanations and fixes, aligning best with troubleshooting. | | [Multi-site hosting](https://learn.microsoft.com/en-us/azure/application-gateway/multiple-site-overview) | configuration | 0.60 | Describes multi-site hosting with rule priority and evaluation order plus conditions/limitations; these are product-specific configuration behaviors. | | [Redirection](https://learn.microsoft.com/en-us/azure/application-gateway/redirect-overview) | configuration | 0.60 | Describes redirect capability between listeners and to external sites, including HTTP-to-HTTPS scenarios; involves product-specific rule configuration. | | [URL routing](https://learn.microsoft.com/en-us/azure/application-gateway/url-route-overview) | configuration | 0.60 | Covers UrlPathMap and PathBasedRouting rule configuration with concrete routing behavior, which is product-specific configuration knowledge. 
| @@ -204,24 +230,23 @@ confusable_not_for: Not for Azure Front Door (use azure-front-door), Azure Load | TOC Title | Confidence | Reason | |-----------|------------|--------| -| [Frequently Asked Questions (FAQ)](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/faq) | 0.50 | FAQ page; likely mixes conceptual and minor specifics but not organized as deep reference, configuration tables, or troubleshooting flows. | -| [ALB Controller release notes](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/alb-controller-release-notes) | 0.40 | Release notes list version changes and compatibility but generally do not provide stable, reusable expert configuration, limits, or troubleshooting mappings; they are more historical than skill-focused. | | [Azure portal](https://learn.microsoft.com/en-us/azure/application-gateway/how-to-tcp-tls-proxy) | 0.40 | How-to for configuring TCP/TLS proxy with SQL backend; likely includes some config, but summary suggests a basic walkthrough rather than comprehensive parameter tables or unique troubleshooting. | | [Create Application Gateway for Containers - bring your own deployment](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/quickstart-create-application-gateway-for-containers-byo-deployment) | 0.40 | Quickstart for bring-your-own deployment; likely high-level steps rather than detailed configuration matrices. | | [Create Application Gateway for Containers - managed by ALB Controller](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/quickstart-create-application-gateway-for-containers-managed-by-alb-controller) | 0.40 | Quickstart for ALB-managed deployment; mostly tutorial content, not deep reference or troubleshooting. 
| -| [Deploy Application Gateway for Containers ALB Controller - Add-on](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/quickstart-deploy-application-gateway-for-containers-alb-controller-addon) | 0.40 | Quickstart deployment walkthrough for ALB Controller add-on; primarily step-by-step, not a constraints matrix or deep config reference. | -| [Deploy Application Gateway for Containers ALB Controller - Helm](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/quickstart-deploy-application-gateway-for-containers-alb-controller-helm) | 0.40 | Quickstart for deploying ALB Controller via Helm; mostly procedural without extensive parameter tables or constraints. | | [WebSocket support](https://learn.microsoft.com/en-us/azure/application-gateway/application-gateway-websocket) | 0.40 | Explains WebSocket support conceptually and notes no user-configurable settings; lacks detailed configuration or limits. | | [Overview](https://learn.microsoft.com/en-us/azure/application-gateway/tcp-tls-proxy-overview) | 0.35 | Overview of TCP/TLS proxy capability; likely conceptual without detailed parameter tables or limits in the summary provided. | | [Secure with SSL](https://learn.microsoft.com/en-us/azure/application-gateway/create-ssl-portal) | 0.35 | Tutorial for configuring TLS termination via portal; step-by-step example rather than a comprehensive configuration reference. | | [Application Gateway components](https://learn.microsoft.com/en-us/azure/application-gateway/application-gateway-components) | 0.30 | Describes components (listeners, backend pools, etc.) at a conceptual level; no detailed parameter tables or numeric constraints. | | [Application Gateway features](https://learn.microsoft.com/en-us/azure/application-gateway/features) | 0.30 | Feature overview and positioning; does not provide numeric limits, configuration parameter tables, or detailed decision matrices. 
| -| [Application Gateway for Containers](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/overview) | 0.30 | Conceptual overview of Application Gateway for Containers and architecture; no detailed limits, configuration parameter tables, or error mappings. | | [Azure portal](https://learn.microsoft.com/en-us/azure/application-gateway/ipv6-application-gateway-portal) | 0.30 | Primarily a step-by-step portal tutorial for creating an Application Gateway with IPv6; it notes capability constraints (no upgrade from IPv4-only, no IPv6 backends) but does not provide detailed limits/quotas, configuration parameter tables, error codes, or other structured expert knowledge as defined by the sub-skill types. | +| [Deploy Application Gateway for Containers ALB Controller - Add-on](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/quickstart-deploy-application-gateway-for-containers-alb-controller-addon) | 0.30 | Quickstart for enabling the ALB Controller add-on on AKS; primarily step-by-step provisioning with generic commands and no detailed configuration tables, limits, error codes, or decision matrices. | +| [Deploy Application Gateway for Containers ALB Controller - Helm](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/quickstart-deploy-application-gateway-for-containers-alb-controller-helm) | 0.30 | Quickstart for deploying ALB Controller via Helm; focuses on basic install commands rather than detailed configuration options, limits, troubleshooting mappings, or best-practice guidance. | | [FAQ](https://learn.microsoft.com/en-us/azure/application-gateway/application-gateway-faq) | 0.30 | FAQ page likely mixes various topics but description/summary do not indicate presence of specific limits, configuration tables, or error-code-based troubleshooting; treated as general Q&A rather than structured expert-knowledge content. 
| | [How Application Gateway works](https://learn.microsoft.com/en-us/azure/application-gateway/how-application-gateway-works) | 0.30 | Explains request flow conceptually; lacks specific configuration values, limits, or troubleshooting mappings. | | [Redirect web traffic using Azure PowerShell](https://learn.microsoft.com/en-us/azure/application-gateway/tutorial-url-redirect-powershell) | 0.22 | PowerShell tutorial for URL path-based redirection; despite mentioning 'production-ready', summary indicates a standard routing tutorial without explicit config matrices or limits. | +| [ALB Controller release notes](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/alb-controller-release-notes) | 0.20 | Release notes primarily describe version history, bug fixes, and feature additions. They usually lack structured limits, configuration tables, or decision matrices as defined in the sub-skills, and are time-sensitive rather than reusable expert knowledge for an AI skills system. | | [About v1 retirement](https://learn.microsoft.com/en-us/azure/application-gateway/v1-retirement) | 0.20 | High-level retirement announcement and dates; no technical limits, configuration parameters, or detailed migration decision matrices. | +| [Application Gateway for Containers](https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/overview) | 0.20 | Page is an overview of Application Gateway for Containers features and architecture without clear indication of numeric limits, configuration tables, error codes, or decision matrices. It appears to be conceptual/introductory rather than containing product-specific expert details. | | [Application Gateway overview](https://learn.microsoft.com/en-us/azure/application-gateway/overview) | 0.20 | High-level product overview of Azure Application Gateway without detailed limits, configuration tables, or product-specific troubleshooting or security settings. 
| [Application Gateway v2](https://learn.microsoft.com/en-us/azure/application-gateway/overview-v2) | 0.20 | High-level overview of Application Gateway v2 and deprecation notice for v1; no detailed limits, configuration parameters, error codes, or decision matrices with quantified trade-offs. | | [Azure CLI](https://learn.microsoft.com/en-us/azure/application-gateway/tutorial-multiple-sites-cli) | 0.20 | CLI tutorial for hosting multiple sites; mostly procedural without expert-level configuration details. | diff --git a/products/azure-arc/azure-arc.csv b/products/azure-arc/azure-arc.csv index 9de5918a..35108955 100644 --- a/products/azure-arc/azure-arc.csv +++ b/products/azure-arc/azure-arc.csv @@ -185,11 +185,11 @@ See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms tha https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/agent-upgrade,Upgrade agents,Upgrade Azure Arc-enabled Kubernetes agents - Azure Arc,Manage Azure Arc-enabled Kubernetes agent upgrades and support policy,Control agent upgrades for Azure Arc-enabled Kubernetes,"Azure Arc-enabled Kubernetes provides both automatic and manual upgrade capabilities for its agents so that agents are upgraded to the latest version.
If you disable automatic upgrade and instead rely on manual upgrade, a version support policy applies for Arc agents and the underlying Kubernetes clusters.",2025-01-07T08:00:00.000Z,how-to,deployment,0.7,True,Agent upgrade article covers automatic vs manual upgrades and version support policy—deployment/upgrade constraints and requirements specific to Arc agents.,unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/arc-gateway-simplify-networking,Simplify network connectivity,Simplify network configuration requirements with Azure Arc gateway - Azure Arc,Use Azure Arc gateway to simplify Kubernetes networking,The Azure Arc gateway lets you onboard Kubernetes clusters to Azure Arc with access to a reduced number of endpoints.,"If your environment requires restricting outbound traffic by using an enterprise proxy or firewall, the Azure Arc gateway can help simplify the process of enabling connectivity. By using the Azure Arc gateway, you can:",2026-02-17T12:03:00.000Z,how-to,configuration,0.7,True,"Network simplification via Arc gateway; likely lists required endpoints, proxy settings, and gateway configuration parameters unique to this feature.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/azure-rbac,Use Azure RBAC for authorization checks,Azure RBAC on Azure Arc-enabled Kubernetes clusters - Azure Arc,Configure Azure RBAC for Arc-enabled Kubernetes authorization,Use Azure RBAC for authorization checks on Azure Arc-enabled Kubernetes clusters.,"You can use Microsoft Entra ID and Azure role-based access control (Azure RBAC) to control authorization checks on your Azure Arc-enabled Kubernetes cluster. Azure role assignments let you granularly control which users can read, write, and delete Kubernetes objects such as deployment, pod, and service. Kubernetes ClusterRoleBinding and RoleBinding object types help to define authorization in Kubernetes natively.
For a conceptual overview of this feature, see Azure RBAC on Azure Arc-enabled Kubernetes.",2026-02-03T06:08:00.000Z,how-to,security,0.85,True,"Covers Azure RBAC on Arc-enabled Kubernetes; such pages list specific built-in roles, scopes, and permission mappings between Azure RBAC and Kubernetes RBAC.",unchanged -https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/cert-manager-deploy,Deploy extension,Deploy cert-manager for Arc-enabled Kubernetes (preview) - Azure Arc,Deploy and configure cert-manager for Arc-enabled Kubernetes,Learn how to deploy the cert-manager for Azure Arc-enabled Kubernetes (preview) extension or upgrade from open source cert-manager and trust-manager.,"This article shows how to deploy the cert-manager for Arc-enabled Kubernetes (preview) extension in your Arc-connected clusters. Steps for a new deployment are included, along with requirements for migrating from a cluster that has open source cert-manager and trust-manager. Important Cert-manager for Azure Arc-enabled Kubernetes clusters is currently in public preview.
See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, pre",2026-04-14T06:03:00.000Z,how-to,configuration,0.65,True,"Deployment article for the cert-manager extension and migration from OSS cert-manager/trust-manager is likely to include product-specific configuration steps, parameter names, and required settings unique to this extension rather than generic Kubernetes deployment guidance.",new -https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/cert-manager-egress,Egress support,Egress support for cert-manager for Azure Arc-enabled Kubernetes (preview) - Azure Arc,Configure egress TLS trust with cert-manager on Arc-enabled Kubernetes,Learn how to secure external egress traffic when using cert-manager for Arc-enabled Kubernetes clusters.,"Cert-manager for Arc-enabled Kubernetes provides a cluster‑level mechanism for managing certificates and distributing trusted CA material to workloads running inside an Arc‑enabled Kubernetes cluster. In many scenarios, Kubernetes workloads must initiate outbound (egress) connections to systems that live outside the cluster boundary, such as factory devices, on‑premises services, or enterprise infrastructure.
These external systems often use TLS certificates issued by private or enterprise Certi",2026-04-14T06:03:00.000Z,how-to,configuration,0.6,True,"Egress support for distributing trusted CA material usually involves specific Kubernetes resources, fields, and configuration patterns for workloads to consume trust bundles, which qualifies as product-specific configuration details.",new -https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/cert-manager-ingress,Ingress support,Ingress support for cert-manager for Azure Arc-enabled Kubernetes (preview) - Azure Arc,Configure ingress TLS with cert-manager on Arc-enabled Kubernetes,Learn how to secure external ingress traffic when using cert-manager for Arc-enabled Kubernetes clusters.,"You need ingress TLS support if your Kubernetes deployment accepts traffic securely from outside the cluster (and you're working within a zero-trust security model). Cert-manager for Arc-enabled Kubernetes can observe ingress and gateway resources, create a certificate resource, store the resulting key pair in a Kubernetes secret, and renew it before expiration. The ingress or gateway controller uses that secret to terminate TLS and route traffic to your services. 
This article describes how to s",2026-04-14T06:03:00.000Z,how-to,configuration,0.6,True,"Ingress TLS support article will typically show concrete Kubernetes resource specs, annotations, and configuration fields for integrating cert-manager with ingress/gateway controllers, which are product-specific configuration patterns.",new -https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/cert-manager-monitor-troubleshoot,Monitor and troubleshoot,Monitor and troubleshoot cert-manager for Arc-enabled Kubernetes (preview) - Azure Arc,Monitor and troubleshoot cert-manager for Arc-enabled Kubernetes,Learn how to monitor the health of the cert-manager for Azure Arc-enabled Kubernetes (preview) extension and troubleshoot problems.,"This article explains how to monitor extension status, check the health of cert-manager and trust-manager, and troubleshoot problems related to cert-manager for Azure Arc-enabled Kubernetes (preview).",2026-04-14T06:03:00.000Z,how-to,troubleshooting,0.8,True,"Explicitly a monitoring and troubleshooting guide; such pages typically map extension/Pod states and cert-manager conditions to causes and resolutions, and may include specific error messages or diagnostic commands.",new -https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/cert-manager-overview,Overview,Cert-manager for Arc-enabled Kubernetes (preview) - Azure Arc,,"The cert-manager for Azure Arc-enabled Kubernetes (preview) extension provides a unified, automated solution for managing TLS certificates and trust bundles in Arc-connected clusters.","The cert-manager for Azure Arc-enabled Kubernetes (preview) extension provides a unified, automated solution for managing TLS certificates and trust bundles in Arc-connected clusters. It simplifies the process of issuing, renewing, and managing certificates across hybrid and multicloud environments, providing secure communication and compliance with organizational policies. 
Important Cert-manager for Azure Arc-enabled Kubernetes clusters is currently in public preview. See the Supplemental Terms ",2026-04-14T06:03:00.000Z,overview,,0.2,False,"High-level overview of cert-manager for Arc-enabled Kubernetes; describes capabilities and scenarios but no concrete limits, configuration tables, error codes, or product-specific decision matrices.",new +https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/cert-manager-deploy,Deploy extension,Deploy cert-manager for Arc-enabled Kubernetes (preview) - Azure Arc,Deploy and configure cert-manager for Arc-enabled Kubernetes,Learn how to deploy the cert-manager for Azure Arc-enabled Kubernetes (preview) extension or upgrade from open source cert-manager and trust-manager.,"This article shows how to deploy the cert-manager for Arc-enabled Kubernetes (preview) extension in your Arc-connected clusters. Steps for a new deployment are included, along with requirements for migrating from a cluster that has open source cert-manager and trust-manager. Important Cert-manager for Azure Arc-enabled Kubernetes clusters is currently in public preview.
See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, pre",2026-04-14T06:03:00.000Z,how-to,configuration,0.65,True,"Deployment article for the cert-manager extension and migration from OSS cert-manager/trust-manager is likely to include product-specific configuration steps, parameter names, and required settings unique to this extension rather than generic Kubernetes deployment guidance.",unchanged +https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/cert-manager-egress,Egress support,Egress support for cert-manager for Azure Arc-enabled Kubernetes (preview) - Azure Arc,Configure egress TLS trust with cert-manager on Arc-enabled Kubernetes,Learn how to secure external egress traffic when using cert-manager for Arc-enabled Kubernetes clusters.,"Cert-manager for Arc-enabled Kubernetes provides a cluster‑level mechanism for managing certificates and distributing trusted CA material to workloads running inside an Arc‑enabled Kubernetes cluster. In many scenarios, Kubernetes workloads must initiate outbound (egress) connections to systems that live outside the cluster boundary, such as factory devices, on‑premises services, or enterprise infrastructure.
These external systems often use TLS certificates issued by private or enterprise Certi",2026-04-14T06:03:00.000Z,how-to,configuration,0.6,True,"Egress support for distributing trusted CA material usually involves specific Kubernetes resources, fields, and configuration patterns for workloads to consume trust bundles, which qualifies as product-specific configuration details.",unchanged +https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/cert-manager-ingress,Ingress support,Ingress support for cert-manager for Azure Arc-enabled Kubernetes (preview) - Azure Arc,Configure ingress TLS with cert-manager on Arc-enabled Kubernetes,Learn how to secure external ingress traffic when using cert-manager for Arc-enabled Kubernetes clusters.,"You need ingress TLS support if your Kubernetes deployment accepts traffic securely from outside the cluster (and you're working within a zero-trust security model). Cert-manager for Arc-enabled Kubernetes can observe ingress and gateway resources, create a certificate resource, store the resulting key pair in a Kubernetes secret, and renew it before expiration. The ingress or gateway controller uses that secret to terminate TLS and route traffic to your services. 
This article describes how to s",2026-04-14T06:03:00.000Z,how-to,configuration,0.6,True,"Ingress TLS support article will typically show concrete Kubernetes resource specs, annotations, and configuration fields for integrating cert-manager with ingress/gateway controllers, which are product-specific configuration patterns.",unchanged +https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/cert-manager-monitor-troubleshoot,Monitor and troubleshoot,Monitor and troubleshoot cert-manager for Arc-enabled Kubernetes (preview) - Azure Arc,Monitor and troubleshoot cert-manager for Arc-enabled Kubernetes,Learn how to monitor the health of the cert-manager for Azure Arc-enabled Kubernetes (preview) extension and troubleshoot problems.,"This article explains how to monitor extension status, check the health of cert-manager and trust-manager, and troubleshoot problems related to cert-manager for Azure Arc-enabled Kubernetes (preview).",2026-04-14T06:03:00.000Z,how-to,troubleshooting,0.8,True,"Explicitly a monitoring and troubleshooting guide; such pages typically map extension/Pod states and cert-manager conditions to causes and resolutions, and may include specific error messages or diagnostic commands.",unchanged +https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/cert-manager-overview,Overview,Cert-manager for Arc-enabled Kubernetes (preview) - Azure Arc,,"The cert-manager for Azure Arc-enabled Kubernetes (preview) extension provides a unified, automated solution for managing TLS certificates and trust bundles in Arc-connected clusters.","The cert-manager for Azure Arc-enabled Kubernetes (preview) extension provides a unified, automated solution for managing TLS certificates and trust bundles in Arc-connected clusters. It simplifies the process of issuing, renewing, and managing certificates across hybrid and multicloud environments, providing secure communication and compliance with organizational policies. 
Important Cert-manager for Azure Arc-enabled Kubernetes clusters is currently in public preview. See the Supplemental Terms ",2026-04-14T06:03:00.000Z,overview,,0.2,False,"High-level overview of cert-manager for Arc-enabled Kubernetes; describes capabilities and scenarios but no concrete limits, configuration tables, error codes, or product-specific decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/cluster-connect,Securely connect to cluster from anywhere,Use cluster connect to securely connect to Azure Arc-enabled Kubernetes clusters. - Azure Arc,Configure cluster connect for Arc-enabled Kubernetes,"With cluster connect, you can securely connect to Azure Arc-enabled Kubernetes clusters from anywhere without requiring any inbound port to be enabled on the firewall.","With cluster connect, you can securely connect to Azure Arc-enabled Kubernetes clusters from anywhere without requiring any inbound port to be enabled on the firewall. Access to the API server of the Azure Arc-enabled Kubernetes cluster enables the following scenarios: Before you begin, review the conceptual overview of the cluster connect feature.",2026-01-17T06:02:00.000Z,how-to,configuration,0.7,True,"How-to for securely connecting to Arc-enabled clusters; likely includes specific CLI parameters, connection modes, and config values unique to cluster connect.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/conceptual-agent-overview,Agent overview,Azure Arc-enabled Kubernetes agent overview - Azure Arc,,Learn about the Azure Arc agents deployed on the Kubernetes clusters when connecting them to Azure Arc.,"Azure Arc-enabled Kubernetes provides a centralized, consistent control plane to manage policy, governance, and security across Kubernetes clusters in different environments. You deploy Azure Arc agents on Kubernetes clusters when you connect them to Azure Arc.
This article provides an overview of these agents, along with high-level steps for deploying them to your cluster and moving connected resources across Azure regions.",2026-03-30T08:00:00.000Z,concept-article,,0.1,False,"Agent overview is conceptual and high-level, explaining what the agents are and general deployment steps; no indication of specific limits, config matrices, or detailed troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/conceptual-azure-rbac,Azure RBAC integration,Azure RBAC on Azure Arc-enabled Kubernetes - Azure Arc,Apply Azure RBAC authorization to Arc-enabled Kubernetes clusters,This article provides a conceptual overview of the Azure RBAC capability on Azure Arc-enabled Kubernetes.,"Microsoft Entra ID and Azure role-based access control (Azure RBAC) let you control authorization checks on your Azure Arc-enabled Kubernetes cluster. Using Azure RBAC with your cluster gives you the benefits of Azure role assignments, such as activity logs showing changes made by users to your Azure resource.",2025-06-20T05:15:00.000Z,concept-article,security,0.8,True,"Azure RBAC article will list specific role names, scopes, and how they map to Kubernetes permissions—directly matching security sub-skill criteria.",unchanged @@ -199,7 +199,7 @@ https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/conceptual-custom-l https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/conceptual-data-exchange,Data exchange between cluster and Azure,Data exchanged between Azure Arc-enabled Kubernetes cluster and Azure - Azure Arc,Understand data exchange patterns between Arc-enabled Kubernetes and Azure,"The scenarios enabled by Azure Arc-enabled Kubernetes involve exchange of desired state configurations, metadata, and other scenario specific operational data.","Azure Arc-enabled Kubernetes scenarios involve exchange of desired state configurations, metadata, and other scenario specific operational data between the Azure
Arc-enabled Kubernetes cluster environment and Azure service. For all types of data, the Azure Arc agents initiate outbound communication to Azure services, so they require only egress access to endpoints listed in the network prerequisites. Enabling inbound ports on firewall isn't required for the Azure Arc agents. The following tabl",2024-10-13T11:09:00.000Z,concept-article,security,0.7,True,"Describes specific data types exchanged, directions, and endpoints; includes a table of data categories and flows, which are product-specific security/compliance-relevant details.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/conceptual-extensions,Cluster extensions,Cluster extensions in Azure Arc-enabled Kubernetes - Azure Arc,,Get a conceptual overview of the Azure Arc-enabled Kubernetes cluster extensions capability.,"This article describes how you can use Azure Arc-enabled Kubernetes cluster extensions via Helm Charts to manage your Kubernetes applications. The cluster extensions feature in Azure Arc-enabled Kubernetes has all the building blocks you need to define, install, and upgrade even the most complex Kubernetes applications. The cluster extension feature builds on the packaging components of Helm.
With extensions, you use an Azure Resource Manager-driven experience for installation and lifecycle manage",2026-03-07T06:06:00.000Z,concept-article,,0.1,False,"Described as a conceptual overview of cluster extensions; focuses on what extensions are and how they relate to Helm, not on concrete limits, configuration parameter tables, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/conceptual-gitops-argocd,Application deployment with GitOps (Argo CD),Application deployments with GitOps (Argo CD) - Azure Arc,,This article provides a conceptual overview of GitOps with Argo CD for use in Azure Arc-enabled Kubernetes and Azure Kubernetes Service (AKS) clusters.,"When you use Argo CD for GitOps on Azure Kubernetes Service (AKS) and Azure Arc-enabled Kubernetes clusters, your Git repository becomes the source of truth for the desired state of your applications and cluster configurations. This approach offers several benefits:",2026-03-24T06:03:00.000Z,concept-article,,0.1,False,"Conceptual overview of GitOps with Argo CD; focuses on benefits and high-level behavior, not on product-specific limits, configuration parameters, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/conceptual-gitops-flux2,Application deployment with GitOps (Flux),Application deployments with GitOps (Flux v2) - Azure Arc,,This article provides a conceptual overview of GitOps with Flux v2 for use in Azure Arc-enabled Kubernetes and Azure Kubernetes Service (AKS) clusters.,"Azure provides an automated application deployments capability using GitOps that works with Azure Kubernetes Service (AKS) and Azure Arc-enabled Kubernetes clusters. The key benefits provided by adopting GitOps for deploying applications to Kubernetes clusters include: With GitOps, you declare the desired state of your Kubernetes clusters in files in Git repositories.
The Git repositories may contain the following files: Because these files are stored in a Git repository, they're versioned, and ",2025-10-08T08:00:00.000Z,concept-article,,0.1,False,"Explicitly described as a conceptual overview of GitOps with Flux v2. No evidence of numeric limits, configuration parameter tables, or troubleshooting details.",new +https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/conceptual-gitops-flux2,Application deployment with GitOps (Flux),Application deployments with GitOps (Flux v2) - Azure Arc,,This article provides a conceptual overview of GitOps with Flux v2 for use in Azure Arc-enabled Kubernetes and Azure Kubernetes Service (AKS) clusters.,"Azure provides an automated application deployments capability using GitOps that works with Azure Kubernetes Service (AKS) and Azure Arc-enabled Kubernetes clusters. The key benefits provided by adopting GitOps for deploying applications to Kubernetes clusters include: With GitOps, you declare the desired state of your Kubernetes clusters in files in Git repositories. The Git repositories may contain the following files: Because these files are stored in a Git repository, they're versioned, and ",2025-10-08T08:00:00.000Z,concept-article,,0.1,False,"Explicitly described as a conceptual overview of GitOps with Flux v2. No evidence of numeric limits, configuration parameter tables, or troubleshooting details.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/conceptual-gitops-flux2,Configurations and GitOps (Flux v1),Application deployments with GitOps (Flux v2) - Azure Arc,,This article provides a conceptual overview of GitOps in Azure for use in Azure Arc-enabled Kubernetes and Azure Kubernetes Service (AKS) clusters.,"Azure provides an automated application deployments capability using GitOps that works with Azure Kubernetes Service (AKS) and Azure Arc-enabled Kubernetes clusters. 
The key benefits provided by adopting GitOps for deploying applications to Kubernetes clusters include: With GitOps, you declare the desired state of your Kubernetes clusters in files in Git repositories. The Git repositories may contain the following files: Because these files are stored in a Git repository, they're versioned, and ",2025-10-08T08:00:00.000Z,concept-article,,0.3,False,Duplicate conceptual GitOps overview; same reasoning as index 28.,unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/conceptual-gitops-flux2-ci-cd,CI/CD workflow using GitOps,CI/CD Workflow using GitOps (Flux v2) - Azure Arc-enabled Kubernetes - Azure Arc,,This article provides a conceptual overview of a CI/CD workflow using GitOps.,"Modern Kubernetes deployments contain multiple applications, clusters, and environments. With GitOps, you can manage these complex setups more easily, tracking the desired state of the Kubernetes environments declaratively with Git. Using common Git tooling to declare cluster state, you can increase accountability, facilitate fault investigation, and enable automation to manage environments. 
This article describes how GitOps fits into the full application change lifecycle using Azure Arc, Azure ",2025-04-22T08:00:00.000Z,concept-article,,0.3,False,CI/CD workflow article is conceptual lifecycle guidance; description doesn’t show concrete parameter tables or numeric thresholds.,unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/conceptual-inner-loop-gitops,Inner loop developer experience,Inner loop developer experience for teams adopting GitOps - Azure Arc,,Learn how an established inner loop can enhance developer productivity and help in a seamless transition for teams adopting GitOps.,This article describes how an established inner loop dev framework can enhance developer productivity for teams adopting GitOps with Arc-enabled Kubernetes or Azure Kubernetes Service (AKS) clusters.,2025-01-24T05:33:00.000Z,concept-article,,0.2,False,Inner loop developer experience is process/practice oriented; unlikely to contain product-specific configuration or limits.,unchanged @@ -215,11 +215,11 @@ https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/custom-locations,Cr https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/deploy-marketplace,Deploy Marketplace applications,Deploy and manage applications from Microsoft Marketplace on Azure Arc-enabled Kubernetes clusters - Azure Arc,,Learn how to discover Kubernetes applications in Microsoft Marketplace and deploy them on your Arc-enabled Kubernetes clusters.,"Microsoft Marketplace is an online store that contains thousands of IT software applications and services built by industry-leading technology companies. In Microsoft Marketplace, you can find, try, buy, and deploy the software and services you need to build new solutions and manage your cloud infrastructure. The catalog includes solutions for different industries and technical areas, free trials, and consulting services from Microsoft partners. 
Included among these solutions are Kubernetes appli",2026-03-24T22:15:00.000Z,how-to,,0.3,False,"Appears to be a how-to/tutorial for deploying Marketplace applications to Arc-enabled Kubernetes, without indication of detailed configuration tables, limits, or product-specific error mappings.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/diagnose-connection-issues,Connection issues,Diagnose connection issues for Azure Arc-enabled Kubernetes clusters - Azure Arc,Diagnose Azure Arc-enabled Kubernetes connection problems,Learn how to resolve common issues when connecting Kubernetes clusters to Azure Arc.,"If you have trouble connecting a cluster to Azure Arc, the problem is likely one of the issues listed in this article. Two flowcharts provide guided help: one if you're not using a proxy server, and one that applies if your network connection uses a proxy server. Tip These flowcharts apply whether you're using Azure CLI or Azure PowerShell to connect your cluster. However, some of the steps require the use of Azure CLI. If you didn't already install Azure CLI, be sure to do so before you begin.",2026-03-30T22:11:00.000Z,how-to,troubleshooting,0.85,True,"Focuses on diagnosing connection issues with guided flowcharts; likely includes specific checks, commands, and conditions unique to Arc connectivity, fitting symptom-to-solution troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/extensions,Deploy and manage cluster extensions,Deploy and manage an Azure Arc-enabled Kubernetes cluster extension - Azure Arc,Configure and manage Azure Arc Kubernetes extensions,Learn how to create and manage an extension instance on an Azure Arc-enabled Kubernetes cluster.,"You can use extensions in Azure Arc-enabled Kubernetes clusters to enable Azure services and scenarios. This article describes how to create extension instances and set required and optional parameters, including options for updates and configurations. 
You also learn how to view, list, update, and delete extension instances. Before you begin, read the overview of Azure Arc-enabled Kubernetes cluster extensions and review the list of currently available extensions. The steps in this article work for",2026-03-05T23:14:00.000Z,how-to,configuration,0.7,True,"Article focuses on creating and managing extension instances with required and optional parameters, including options for updates and configurations. This implies product-specific configuration parameters and values for Azure Arc-enabled Kubernetes extensions, fitting the configuration sub-skill. It is not just a conceptual overview but a parameter-driven how-to.",unchanged -https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/extensions-release,Available extensions,Available extensions for Azure Arc-enabled Kubernetes clusters - Azure Arc,,See a list of extensions that are currently available for Azure Arc-enabled Kubernetes clusters. View Flux extension release notes.,"Cluster extensions for Azure Arc-enabled Kubernetes provide an Azure Resource Manager-based experience to install and manage lifecycles for different Azure capabilities in your cluster. You can deploy extensions to your clusters to support different scenarios and to improve cluster management. The following extensions are currently available to use with Azure Arc-enabled Kubernetes clusters. With one exception, all the extensions that are described in this article are cluster-scoped. Azure API Manag",2026-04-17T17:16:00.000Z,how-to,,0.3,False,"Appears to be a catalog/list of available Azure Arc-enabled Kubernetes extensions and brief descriptions. 
No indication of configuration tables, limits, error codes, or detailed parameters; primarily discovery/overview content.",updated +https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/extensions-release,Available extensions,Available extensions for Azure Arc-enabled Kubernetes clusters - Azure Arc,,See a list of extensions that are currently available for Azure Arc-enabled Kubernetes clusters. View Flux extension release notes.,"Cluster extensions for Azure Arc-enabled Kubernetes provide an Azure Resource Manager-based experience to install and manage lifecycles for different Azure capabilities in your cluster. You can deploy extensions to your clusters to support different scenarios and to improve cluster management. The following extensions are currently available to use with Azure Arc-enabled Kubernetes clusters. With one exception, all the extensions that are described in this article are cluster-scoped. Azure API Manag",2026-04-17T17:16:00.000Z,how-to,,0.3,False,"Appears to be a catalog/list of available Azure Arc-enabled Kubernetes extensions and brief descriptions. No indication of configuration tables, limits, error codes, or detailed parameters; primarily discovery/overview content.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/extensions-troubleshooting,Extension issues,Troubleshoot extension issues for Azure Arc-enabled Kubernetes clusters - Azure Arc,Troubleshoot Arc-enabled Kubernetes extension failures,Learn how to resolve common problems with Azure Arc-enabled Kubernetes cluster extensions.,"This article describes troubleshooting tips for common problems related to cluster extensions like GitOps (Flux v2) in Azure or Open Service Mesh (OSM). 
For help with troubleshooting Azure Arc-enabled Kubernetes problems in general, see Troubleshoot Azure Arc-enabled Kubernetes issues.",2025-12-02T23:14:00.000Z,how-to,troubleshooting,0.9,True,"Troubleshooting for cluster extensions; will list extension provisioning states, error messages, and corrective actions.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/faq,Frequently Asked Questions,Azure Arc-enabled Kubernetes and GitOps frequently asked questions - Azure Arc,Resolve common issues for Arc-enabled Kubernetes and GitOps,This article contains a list of frequently asked questions related to Azure Arc-enabled Kubernetes and Azure GitOps.,This article addresses frequently asked questions about Azure Arc-enabled Kubernetes and GitOps.,2025-03-04T23:01:00.000Z,concept-article,troubleshooting,0.7,True,"FAQ for Arc-enabled Kubernetes and GitOps likely includes specific error messages, behaviors, and resolutions, mapping symptoms to causes and fixes.",unchanged -https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/flux-gitops-release-notes,GitOps (Flux) Release notes,What's new for Flux v(GitOps) in Azure Arc enabled Kubernetes - Azure Arc,,The release notes identify important updates and improvements in the microsoft.flux extension and important notices about changes.,"The Flux (GitOps) extension is updated on an ongoing basis. This article provides information about the most recent releases of the extension. Important To ensure continued compatibility and avoid disruptions, update your sources to remove references to deprecated APIs and ensure that clusters are running the latest version of the extension. The most recent version of the Flux (GitOps) extension and the two previous versions (N-2) are supported. We generally recommend that you use the most recent ve",2026-04-17T17:16:00.000Z,release-notes,,0.4,False,"Release notes for the Flux (GitOps) extension. 
While they may contain version-specific changes, they typically lack stable, reusable expert patterns like limits, configuration matrices, or troubleshooting mappings required by the defined sub-skill types.",new -https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/gitops-flux2-parameters,GitOps (Flux) parameters,GitOps (Flux v2) supported parameters - Azure Arc,Configure GitOps (Flux v2) parameters on Azure Kubernetes,Understand the supported parameters for GitOps (Flux v2) in Azure for use in Azure Arc-enabled Kubernetes and Azure Kubernetes Service (AKS) clusters.,"Azure provides an automated application deployments capability using GitOps that works with Azure Kubernetes Service (AKS) and Azure Arc-enabled Kubernetes clusters. GitOps with Flux v2 lets you use your Git repository as the source of truth for cluster configuration and application deployment. For more information, see Application deployments with GitOps (Flux v2) and Tutorial: Deploy applications using GitOps with Flux v2. GitOps on Azure Arc-enabled Kubernetes or Azure Kubernetes Service uses Flu",2024-09-19T17:07:00.000Z,concept-article,configuration,0.8,True,"Described as documenting supported parameters for GitOps (Flux v2) in Azure Arc-enabled Kubernetes and AKS. This implies parameter names, allowed values, and possibly defaults—matching the configuration sub-skill definition with product-specific settings.",new +https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/flux-gitops-release-notes,GitOps (Flux) Release notes,What's new for Flux v(GitOps) in Azure Arc enabled Kubernetes - Azure Arc,,The release notes identify important updates and improvements in the microsoft.flux extension and important notices about changes.,"The Flux (GitOps) extension is updated on an ongoing basis. This article provides information about the most recent releases of the extension. 
Important To ensure continued compatibility and avoid disruptions, update your sources to remove references to deprecated APIs and ensure that clusters are running the latest version of the extension. The most recent version of the Flux (GitOps) extension and the two previous versions (N-2) are supported. We generally recommend that you use the most recent ve",2026-04-17T17:16:00.000Z,release-notes,,0.4,False,"Release notes for the Flux (GitOps) extension. While they may contain version-specific changes, they typically lack stable, reusable expert patterns like limits, configuration matrices, or troubleshooting mappings required by the defined sub-skill types.",unchanged +https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/gitops-flux2-parameters,GitOps (Flux) parameters,GitOps (Flux v2) supported parameters - Azure Arc,Configure GitOps (Flux v2) parameters on Azure Kubernetes,Understand the supported parameters for GitOps (Flux v2) in Azure for use in Azure Arc-enabled Kubernetes and Azure Kubernetes Service (AKS) clusters.,"Azure provides an automated application deployments capability using GitOps that works with Azure Kubernetes Service (AKS) and Azure Arc-enabled Kubernetes clusters. GitOps with Flux v2 lets you use your Git repository as the source of truth for cluster configuration and application deployment. For more information, see Application deployments with GitOps (Flux v2) and Tutorial: Deploy applications using GitOps with Flux v2. GitOps on Azure Arc-enabled Kubernetes or Azure Kubernetes Service uses Flu",2024-09-19T17:07:00.000Z,concept-article,configuration,0.8,True,"Described as documenting supported parameters for GitOps (Flux v2) in Azure Arc-enabled Kubernetes and AKS. 
This implies parameter names, allowed values, and possibly defaults—matching the configuration sub-skill definition with product-specific settings.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/identity-access-overview,Identity and access overview,Azure Arc-enabled Kubernetes identity and access overview - Azure Arc,Configure identity and access options for Arc-enabled Kubernetes,Understand identity and access options for Arc-enabled Kubernetes clusters.,"You can authenticate, authorize, and control access to your Azure Arc-enabled Kubernetes clusters. This topic provides an overview of the options for doing so with your Arc-enabled Kubernetes clusters. This image shows the ways that these different options can be used: You can also use both cluster connect and Azure RBAC together if that is most appropriate for your needs.",2024-09-19T17:07:00.000Z,concept-article,security,0.65,True,"Identity and access overview for Arc-enabled Kubernetes will reference specific auth modes and Azure RBAC integration patterns; while conceptual, it’s directly about security configuration choices.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/kubernetes-resource-view,View resources in Azure portal,Access Kubernetes resources from Azure portal - Azure Arc,Use Azure portal Kubernetes resource view for Arc-enabled clusters,Learn how to interact with Kubernetes resources to manage an Azure Arc-enabled Kubernetes cluster from the Azure portal.,"The Azure portal includes a Kubernetes resource view for easy access to the Kubernetes resources in your Azure Arc-enabled Kubernetes cluster. Viewing Kubernetes resources from the Azure portal reduces context switching between the Azure portal and the kubectl command-line tool, streamlining the experience for viewing and editing your Kubernetes resources. 
The resource viewer currently includes multiple resource types, including deployments, pods, and replica sets.",2025-02-19T23:02:00.000Z,how-to,configuration,0.6,True,Describes how to interact with Kubernetes resources via portal; likely includes resource types supported and specific UI-driven configuration capabilities unique to this integration.,unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/managed-extensions,Version-managed extensions,Version-managed extensions (preview) for Arc-enabled Kubernetes - Azure Arc,Use version-managed extensions on Arc-enabled Kubernetes,Version-managed extensions (preview) for Arc-enabled Kubernetes adds efficiency by helping your extensions work better together.,"Version-managed extensions (preview) simplifies the process of building, deploying, and maintaining applications on Arc-enabled Kubernetes clusters. Applications can use version-managed extensions to assure compatibility, reconcile interdependencies, and remove the complexity of relying on bring-your-own open-source solutions. Version-managed extensions are provided in bundled sets that are validated for interoperability against a set of supported Arc-enabled Kubernetes platforms, providing a st",2025-06-24T22:35:00.000Z,overview,configuration,0.7,True,"Describes how bundled, version-managed extensions are configured and constrained; includes product-specific extension set details and compatibility behavior.",unchanged @@ -232,9 +232,9 @@ services, see Azure Policy built-in definitions. 
The name of each built-in policy the link in the Version column to view the source on the Azure Policy GitHub repo.",2024-10-08T08:00:00.000Z,reference,security,0.85,True,"Policy reference; lists specific policy definition names, effects, and scopes for Arc-enabled Kubernetes, which are security/governance configurations.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/private-link,Use Private Link Scope,Use private connectivity for Azure Arc-enabled Kubernetes clusters with private link (preview) - Azure Arc,Configure Private Link connectivity for Arc-enabled Kubernetes clusters,"With Azure Arc, you can use a Private Link Scope model to allow multiple Kubernetes clusters to use a single private endpoint.","Azure Private Link allows you to securely link Azure services to your virtual network using private endpoints. This means you can connect your on-premises Kubernetes clusters with Azure Arc and send all traffic over an Azure ExpressRoute or site-to-site VPN connection instead of using public networks. In Azure Arc, you can use a Private Link Scope model to allow multiple Kubernetes clusters to communicate with their Azure Arc resources using a single private endpoint. This document covers when to",2025-02-25T08:00:00.000Z,how-to,security,0.75,True,"Private Link usage involves specific private endpoint, scope, and networking configuration; also security-focused on private connectivity patterns.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/quickstart-connect-cluster,Connect a cluster to Azure Arc,Quickstart: Connect an existing Kubernetes cluster to Azure Arc - Azure Arc,,"In this quickstart, you learn how to connect an Azure Arc-enabled Kubernetes cluster.","Get started with Azure Arc-enabled Kubernetes by using Azure CLI or Azure PowerShell to connect an existing Kubernetes cluster to Azure Arc. For a conceptual look at connecting clusters to Azure Arc, see Azure Arc-enabled Kubernetes agent overview. 
To try things out in a sample/practice experience, visit the Azure Arc Jumpstart.",2026-03-02T08:00:00.000Z,quickstart,,0.2,False,"Quickstart for connecting a Kubernetes cluster to Azure Arc; primarily step-by-step onboarding using CLI/PowerShell without detailed configuration tables, limits, error-code mappings, or product-specific decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/release-notes,What's new with Arc-enabled Kubernetes,What's new with Azure Arc-enabled Kubernetes - Azure Arc,,Learn about the latest releases of Arc-enabled Kubernetes.,"Azure Arc-enabled Kubernetes is updated on an ongoing basis. To stay up to date with the most recent developments, this article provides you with information about recent releases of the Azure Arc-enabled Kubernetes agents. When any of the Arc-enabled Kubernetes agents are updated, all of the agents in the azure-arc namespace are incremented with a new version number, so that the version numbers are consistent across agents. When a new version is released, all of the agents are upgraded together to",2026-03-24T17:13:00.000Z,concept-article,,0.3,False,"Release notes typically list version changes and high-level feature updates; the summary does not indicate presence of numeric limits, config tables, error-code mappings, or other structured expert details as defined by the sub-skill types.",unchanged +https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/release-notes,What's new with Arc-enabled Kubernetes,What's new with Azure Arc-enabled Kubernetes - Azure Arc,,Learn about the latest releases of Arc-enabled Kubernetes.,"Azure Arc-enabled Kubernetes is updated on an ongoing basis. To stay up to date with the most recent developments, this article provides you with information about recent releases of the Azure Arc-enabled Kubernetes agents. 
When any of the Arc-enabled Kubernetes agents are updated, all of the agents in the azure-arc namespace are incremented with a new version number, so that the version numbers are consistent across agents. When a new version is released, all of the agents are upgraded together to",2026-04-23T08:00:00.000Z,concept-article,,0.2,False,"Release notes primarily list version changes and high-level updates for Azure Arc-enabled Kubernetes agents. While they may contain some specific behaviors or fixes, they are not organized as limits, configuration references, troubleshooting guides, or other reusable expert patterns defined in the sub-skill types. Content is more about change history than structured expert knowledge per the given categories.",updated https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/resource-graph-samples,Azure Resource Graph queries,Azure Resource Graph sample queries for Azure Arc-enabled Kubernetes - Azure Arc,Run Azure Resource Graph queries for Arc-enabled Kubernetes,Sample Azure Resource Graph queries for Azure Arc-enabled Kubernetes showing use of resource types and tables to access Azure Arc-enabled Kubernetes related resources and properties.,"Azure Resource Graph is an Azure service that lets you query at scale, helping you effectively govern your environment. Queries are created using Kusto Query Language (KQL). For more information, see Understanding the Azure Resource Graph query language. This page provides a list of sample Azure Resource Graph queries for Azure Arc-enabled Kubernetes. You can run these queries through Azure PowerShell or Azure CLI, or in the Azure portal using the Resource Graph Explorer. 
Feel free to modify the q",2025-05-19T17:03:00.000Z,sample,integrations,0.7,True,"Sample KQL queries reference specific resource types, tables, and properties unique to Arc-enabled Kubernetes, which are integration/query patterns with Azure Resource Graph.",unchanged -https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/secret-store-extension,Get started,Use the Azure Key Vault Secret Store extension to sync secrets to the Kubernetes secret store for offline access in Azure Arc-enabled Kubernetes clusters - Azure Arc,Configure Azure Key Vault Secret Store extension on Arc Kubernetes,"The Azure Key Vault Secret Store extension for Kubernetes (""SSE"") automatically synchronizes secrets from an Azure Key Vault to a Kubernetes cluster for offline access.","The Azure Key Vault Secret Store extension for Kubernetes (SSE) automatically synchronizes secrets from an Azure Key Vault to an Azure Arc-enabled Kubernetes cluster for offline access. This means you can use Azure Key Vault to store, maintain, and rotate your secrets, even when running your Kubernetes cluster in a semi-disconnected state. 
Synchronized secrets are stored in the cluster secret store, making them available as Kubernetes secrets to be used in all the usual ways: mounted as data volumes,",2025-11-18T19:04:00.000Z,how-to,configuration,0.9,True,"Explains how SSE syncs secrets and how to configure it; such pages include extension settings, sync modes, and Kubernetes secret mapping parameters.",unchanged +https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/secret-store-extension,Get started,Use the Azure Key Vault Secret Store extension to sync secrets to the Kubernetes secret store for offline access in Azure Arc-enabled Kubernetes clusters - Azure Arc,Configure Azure Key Vault Secret Store extension for Arc Kubernetes,"The Azure Key Vault Secret Store extension for Kubernetes (""SSE"") automatically synchronizes secrets from an Azure Key Vault to a Kubernetes cluster for offline access.","The Azure Key Vault Secret Store extension for Kubernetes (SSE) automatically synchronizes secrets from an Azure Key Vault to an Azure Arc-enabled Kubernetes cluster for offline access. This means you can use Azure Key Vault to store, maintain, and rotate your secrets, even when running your Kubernetes cluster in a semi-disconnected state. Synchronized secrets are stored in the cluster secret store, making them available as Kubernetes secrets to be used in all the usual ways: mounted as data volumes,",2026-04-22T11:03:00.000Z,how-to,configuration,0.74,True,"The page describes a specific Azure Arc-enabled Kubernetes extension that syncs Key Vault secrets into the Kubernetes secret store. 
Such extension docs typically include product-specific configuration options (CRD fields, Helm values, parameters like key vault name, tenant, sync intervals, secret mapping formats) and their allowed values, which qualify as expert configuration knowledge beyond generic Kubernetes or Key Vault usage.",updated https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/secret-store-extension-reference,Reference,Reference for Azure Key Vault Secret Store Extension - Azure Arc,Configure Azure Key Vault Secret Store Extension on Arc Kubernetes,"A reference guide for the Azure Key Vault Secret Store Extension, documenting the possibilities allowed in each of SSE's configuration resources.","The SSE can be configured in four places: Configuration settings provided to Arc infrastructure when creating or updating the extension, AKVSync resources, SecretSync Kubernetes resources, and SecretProviderClass Kubernetes resources.",2026-03-26T17:15:00.000Z,reference,configuration,0.8,True,"Described as a reference guide documenting what is allowed in each of the extension’s configuration resources, which implies detailed configuration parameters and options specific to this product.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/secret-store-extension-release-notes,Release notes,What's new for AKV Secret Store extension - Azure Arc,,The release notes identify important updates and improvements in the Azure Key Vault Secret Store extension.,Important updates and improvements to the Azure Key Vault Secret Store extension are listed here.,2026-03-03T12:03:00.000Z,release-notes,,0.2,False,"Release notes typically list version changes, fixes, and new features but not the structured limits, configuration matrices, error-code mappings, or decision tables required by the defined sub-skill types. 
The description indicates a changelog-style page without detailed quotas, config parameter tables, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/secret-store-extension-troubleshooting,Troubleshooting,Troubleshooting issues with the Secret Store extension - Azure Arc,Troubleshoot Azure Key Vault Secret Store extension issues,This guide helps you to efficiently diagnose and resolve issues with the Azure Key Vault Secret Store extension.,"The Secret Store Extension (SSE) creates a Kubernetes deployment that contains a pod with two containers: the controller, which manages storage of secrets in the cluster, handles scheduling and configurations, and the provider, which accesses secrets from Azure Key Vault (AKV). SSE can be directly configured via SecretSync and SecretProviderClass resources, and it can also be configured via consolidated AKVSync resources, which aim to be easier for humans to understand. In addition to configuration pa",2025-11-18T19:04:00.000Z,troubleshooting,troubleshooting,0.95,True,"Explicit troubleshooting guide; will map specific SSE symptoms, error messages, and misconfigurations to resolutions and diagnostic steps.",unchanged @@ -251,10 +251,10 @@ https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/validation-program, https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/workload-identity,Use workload identity,Deploy and configure workload identity federation in Azure Arc-enabled Kubernetes - Azure Arc,Deploy workload identity federation on Arc-enabled Kubernetes,Workload identity federation can be used with Azure Arc-enabled Kubernetes clusters.,"You can enable the workload identity feature on an Azure Arc-enabled Kubernetes cluster by using Azure CLI. The process follows these high-level steps: For an overview of this feature, see Workload identity federation in Azure Arc-enabled Kubernetes. 
Tip This article describes the steps required to deploy and configure workload identity on an Arc-enabled Kubernetes cluster. To learn how to enable workload identity on other types of clusters, see the following articles:",2025-11-18T19:04:00.000Z,how-to,security,0.8,True,"Step-by-step deployment of workload identity; typically includes Entra app registration settings, token audience/issuer values, and Kubernetes annotations specific to this feature.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/workload-management,Explore multi-cluster workload management,Explore workload management in a multi-cluster environment with GitOps - Azure Arc,,Explore typical use-cases that Platform and Application teams face on a daily basis working with Kubernetes workloads in a multi-cluster environment.,"Enterprise organizations, developing cloud native applications, face challenges to deploy, configure and promote a great variety of applications and services across multiple Kubernetes clusters at scale. This environment may include Azure Kubernetes Service (AKS) clusters, clusters running on other public cloud providers, or clusters in on-premises data centers that are connected to Azure through the Azure Arc. 
Refer to the conceptual article exploring the business process, challenges and solution",2026-03-06T08:00:00.000Z,how-to,,0.2,False,"Described as exploring typical use cases and challenges for multi-cluster Kubernetes workload management with GitOps; appears to be conceptual/overview guidance without mention of concrete limits, configuration parameter tables, error codes, or decision matrices with quantified trade-offs.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/multicloud-connector/add-public-cloud,Add a public cloud,Add a public cloud with the multicloud connector in the Azure portal - Azure Arc,,Learn how to add an AWS or GCP cloud by using the multicloud connector enabled by Azure Arc.,"The multicloud connector enabled by Azure Arc lets you connect non-Azure public cloud resources to Azure by using the Azure portal. Currently, the multicloud connector provides support for connecting resources from these public clouds:",2025-11-18T16:11:00.000Z,how-to,,0.4,False,Portal-based onboarding tutorial; mostly step-by-step UI instructions without reusable configuration tables or limits.,unchanged -https://learn.microsoft.com/en-us/azure/azure-arc/multicloud-connector/onboard-multicloud-vms-arc,Onboard VMs to Azure Arc,Onboard VMs to Azure Arc through the multicloud connector - Azure Arc,Onboard multicloud VMs to Azure Arc with the connector,Learn how to enable the Arc onboarding solution with the multicloud connector enabled by Azure Arc.,"The Arc onboarding solution of the multicloud connector autodiscovers VMs in a connected public cloud, then installs the Azure Connected Machine agent to onboard the VMs to Azure Arc. This simplified experience lets you use Azure management services, such as Azure Monitor, providing a centralized way to manage Azure, AWS, and GCP VMs together. 
Currently, the multicloud connector provides support for connecting resources from these public clouds: You can enable the Arc onboarding solution when you connec",2026-04-17T22:09:00.000Z,how-to,deployment,0.6,True,"Onboarding solution for VMs via the multicloud connector is a deployment/onboarding pattern that typically includes connector-specific requirements, supported platforms, and constraints for bringing non-Azure VMs under Arc management.",updated -https://learn.microsoft.com/en-us/azure/azure-arc/multicloud-connector/overview,Multicloud connector enabled by Azure Arc >,What is Multicloud connector enabled by Azure Arc? - Azure Arc,,"The multicloud connector lets you connect non-Azure public cloud resources to Azure, providing a centralized source for management and governance.","Multicloud connector enabled by Azure Arc lets you connect non-Azure public cloud resources to Azure, providing a centralized source for management and governance. Currently, the multicloud connector provides support for connecting resources from these public clouds: The multicloud connector supports these solutions: For more information about how the multicloud connector works, including prerequisites, see Add a public cloud with the multicloud connector in the Azure portal. The multicloud conne",2026-04-13T08:00:00.000Z,overview,,0.2,False,"Page is an overview of Azure Arc Multicloud connector capabilities and supported clouds/solutions, without detailed limits, configuration parameters, error codes, or decision matrices. 
It’s primarily conceptual/introductory and does not contain the kind of specific expert-only details required by any sub-skill type.",updated -https://learn.microsoft.com/en-us/azure/azure-arc/multicloud-connector/resource-representation,Resource representation in Azure,Multicloud connector resource representation in Azure - Azure Arc,,Understand how AWS and GCP resources are represented in Azure after they're added through the multicloud connector enabled by Azure Arc.,"The multicloud connector enabled by Azure Arc lets you connect non-Azure public cloud resources to Azure, providing a centralized source for management and governance. Currently, the multicloud connector provides support for connecting resources from these public clouds: This article describes how AWS resources from a connected public cloud are represented in your Azure environment.",2026-04-17T22:09:00.000Z,how-to,,0.3,False,"Describes how AWS/GCP resources are represented in Azure conceptually; likely a mapping/overview without detailed configuration tables, limits, or troubleshooting content.",updated -https://learn.microsoft.com/en-us/azure/azure-arc/multicloud-connector/troubleshoot-multicloud-connector,Troubleshooting,Troubleshoot multicloud connector enabled by Azure Arc - Azure Arc,Troubleshoot Azure Arc multicloud connector issues,Troubleshoot the multicloud connector enabled by Azure Arc,This document provides guidance on troubleshooting issues for your Multicloud connector and solutions.,2026-04-15T22:10:00.000Z,how-to,troubleshooting,0.85,True,"Dedicated troubleshooting article for the multicloud connector; such content normally includes specific error states, causes, and remediation steps unique to this connector.",updated +https://learn.microsoft.com/en-us/azure/azure-arc/multicloud-connector/onboard-multicloud-vms-arc,Onboard VMs to Azure Arc,Onboard VMs to Azure Arc through the multicloud connector - Azure Arc,Onboard multicloud VMs to Azure Arc with the connector,Learn how to 
enable the Arc onboarding solution with the multicloud connector enabled by Azure Arc.,"The Arc onboarding solution of the multicloud connector autodiscovers VMs in a connected public cloud, then installs the Azure Connected Machine agent to onboard the VMs to Azure Arc. This simplified experience lets you use Azure management services, such as Azure Monitor, providing a centralized way to manage Azure, AWS, and GCP VMs together. Currently, the multicloud connector provides support for connecting resources from these public clouds: You can enable the Arc onboarding solution when you connec",2026-04-17T22:09:00.000Z,how-to,deployment,0.6,True,"Onboarding solution for VMs via the multicloud connector is a deployment/onboarding pattern that typically includes connector-specific requirements, supported platforms, and constraints for bringing non-Azure VMs under Arc management.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/multicloud-connector/overview,Multicloud connector enabled by Azure Arc >,What is Multicloud connector enabled by Azure Arc? - Azure Arc,,"The multicloud connector lets you connect non-Azure public cloud resources to Azure, providing a centralized source for management and governance.","Multicloud connector enabled by Azure Arc lets you connect non-Azure public cloud resources to Azure, providing a centralized source for management and governance. Currently, the multicloud connector provides support for connecting resources from these public clouds: The multicloud connector supports these solutions: For more information about how the multicloud connector works, including prerequisites, see Add a public cloud with the multicloud connector in the Azure portal. The multicloud conne",2026-04-13T08:00:00.000Z,overview,,0.2,False,"Page is an overview of Azure Arc Multicloud connector capabilities and supported clouds/solutions, without detailed limits, configuration parameters, error codes, or decision matrices. 
It’s primarily conceptual/introductory and does not contain the kind of specific expert-only details required by any sub-skill type.",unchanged +https://learn.microsoft.com/en-us/azure/azure-arc/multicloud-connector/resource-representation,Resource representation in Azure,Multicloud connector resource representation in Azure - Azure Arc,,Understand how AWS and GCP resources are represented in Azure after they're added through the multicloud connector enabled by Azure Arc.,"The multicloud connector enabled by Azure Arc lets you connect non-Azure public cloud resources to Azure, providing a centralized source for management and governance. Currently, the multicloud connector provides support for connecting resources from these public clouds: This article describes how AWS resources from a connected public cloud are represented in your Azure environment.",2026-04-17T22:09:00.000Z,how-to,,0.3,False,"Describes how AWS/GCP resources are represented in Azure conceptually; likely a mapping/overview without detailed configuration tables, limits, or troubleshooting content.",unchanged +https://learn.microsoft.com/en-us/azure/azure-arc/multicloud-connector/troubleshoot-multicloud-connector,Troubleshooting,Troubleshoot multicloud connector enabled by Azure Arc - Azure Arc,Troubleshoot Azure Arc multicloud connector issues,Troubleshoot the multicloud connector enabled by Azure Arc,This document provides guidance on troubleshooting issues for your Multicloud connector and solutions.,2026-04-15T22:10:00.000Z,how-to,troubleshooting,0.85,True,"Dedicated troubleshooting article for the multicloud connector; such content normally includes specific error states, causes, and remediation steps unique to this connector.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/multicloud-connector/view-multicloud-inventory,View multicloud inventory,View multicloud inventory with the multicloud connector enabled by Azure Arc - Azure Arc,,View multicloud inventory with the multicloud 
connector enabled by Azure Arc,"The Inventory solution of the multicloud connector shows an up-to-date view of your resources from other public clouds in Azure, providing you with a single place to see all your cloud resources. Currently, the multicloud connector provides support for connecting resources from these public clouds: After you enable the Inventory solution, metadata from the assets in the source cloud is included with the asset representations in Azure. You can also apply Azure tags or Azure policies to these resource",2025-10-22T08:00:00.000Z,how-to,,0.4,False,Inventory viewing article; likely UI-driven with minimal deep configuration or troubleshooting content.,unchanged https://learn.microsoft.com/en-us/azure/azure-arc/network-requirements-consolidated,Network requirements,Azure Arc network requirements - Azure Arc,"Configure Azure Arc network endpoints, ports, and protocols","A consolidated list of network requirements for Azure Arc features and Azure Arc-enabled services. Lists endpoints, ports, and protocols.","This article lists the endpoints, ports, and protocols required for Azure Arc-enabled services and features. Generally, connectivity requirements include these principles: To use a proxy, verify that the agents and the machine performing the onboarding process meet the network requirements in this article. Tip For the Azure public cloud, you can reduce the number of required endpoints by using the Azure Arc gateway for Arc-enabled servers or Arc-enabled Kubernetes.",2026-04-01T08:00:00.000Z,reference,configuration,0.8,True,"Consolidated list of Azure Arc network requirements with specific endpoints, ports, and protocols. 
This is product-specific configuration data (hostnames, ports, protocols) that functions as a settings reference, matching the configuration sub-skill definition.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/overview,About Azure Arc,Azure Arc overview - Azure Arc,,Learn about what Azure Arc is and how it helps customers enable management and governance of their hybrid resources with other Azure services and features.,"Today, companies struggle to control and govern increasingly complex environments that extend across data centers, multiple clouds, and edge. Each environment and cloud possesses its own set of management tools, and new DevOps and ITOps operational models can be hard to implement across resources. Azure Arc simplifies governance and management by delivering a consistent multicloud and on-premises management platform. Azure Arc provides a centralized, unified way to: To download architecture diag",2025-08-26T17:08:00.000Z,overview,,0.2,False,"High-level Azure Arc overview; no concrete limits, configs, or error details.",unchanged @@ -270,7 +270,7 @@ https://learn.microsoft.com/en-us/azure/azure-arc/resource-bridge/troubleshoot-r https://learn.microsoft.com/en-us/azure/azure-arc/resource-bridge/upgrade,Upgrade,Upgrade Arc resource bridge - Azure Arc,Upgrade Azure Arc resource bridge safely,Learn how to upgrade Arc resource bridge using either cloud-managed upgrade or manual upgrade.,"This article describes how Arc resource bridge is upgraded, and the two ways upgrade can be performed: cloud-managed upgrade or manual upgrade. 
Currently, some private cloud providers differ in how they handle Arc resource bridge upgrades.",2025-12-30T23:07:00.000Z,how-to,deployment,0.7,True,Details cloud-managed vs manual upgrade flows and provider-specific differences; product-specific deployment/upgrade behavior.,unchanged https://learn.microsoft.com/en-us/azure/azure-arc/resource-graph-samples,Azure Resource Graph queries,Azure Resource Graph sample queries for Azure Arc - Azure Arc,,Sample Azure Resource Graph queries for Azure Arc showing use of resource types and tables to access Azure Arc related resources and properties.,"Azure Resource Graph is an Azure service that lets you query at scale, helping you effectively govern your environment. Queries are created using Kusto Query Language (KQL). For more information, see Understanding the Azure Resource Graph query language. This page provides a list of sample Azure Resource Graph queries for Azure Arc. You can run these queries through Azure PowerShell or Azure CLI, or in the Azure portal using the Resource Graph Explorer. Feel free to modify the queries to suit your",2025-05-19T17:03:00.000Z,sample,,0.4,False,Sample KQL queries are useful but not configuration/limits/troubleshooting; more tutorial-like usage examples.,unchanged https://learn.microsoft.com/en-us/azure/azure-arc/servers/agent-overview,Connected Machine agent overview,Overview of the Azure Connected Machine agent - Azure Arc,,"This article provides a detailed overview of the Azure Connected Machine agent, which supports monitoring virtual machines hosted in hybrid environments.","The Azure Connected Machine agent lets you manage Windows and Linux machines hosted outside of Azure, on your corporate network or other cloud providers. Warning Only Connected Machine agent versions within the last one year are officially supported by the product group. 
All customers should update to an agent version within this window or enable automatic agent upgrades (preview).",2025-08-12T08:00:00.000Z,overview,,0.5,False,"Agent overview; mostly conceptual plus support window note, not a detailed configuration or troubleshooting reference.",unchanged -https://learn.microsoft.com/en-us/azure/azure-arc/servers/agent-release-notes,What's new with Connected Machine agent?,What's new with Azure Connected Machine agent - Azure Arc,,"This article has release notes for Azure Connected Machine agent. For many of the summarized issues, there are links to more details.","Warning Only Connected Machine agent versions released within the last year are officially supported by the product group. All customers should update to an agent version within this window or enable automatic agent upgrades (preview). Microsoft recommends staying up to date with the latest agent version whenever possible. The Azure Connected Machine agent receives improvements on an ongoing basis. To stay up to date with the most recent developments, this article provides you with information ab",2026-04-14T06:03:00.000Z,overview,,0.3,False,"Release notes summarize agent versions and changes; while detailed, they are version history rather than reusable expert knowledge patterns like limits, configuration matrices, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/azure-arc/servers/agent-release-notes,What's new with Connected Machine agent?,What's new with Azure Connected Machine agent - Azure Arc,,"This article has release notes for Azure Connected Machine agent. For many of the summarized issues, there are links to more details.","Warning Only Connected Machine agent versions released within the last year are officially supported by the product group. All customers should update to an agent version within this window or enable automatic agent upgrades (preview). 
Microsoft recommends staying up to date with the latest agent version whenever possible. The Azure Connected Machine agent receives improvements on an ongoing basis. To stay up to date with the most recent developments, this article provides you with information ab",2026-04-14T06:03:00.000Z,overview,,0.3,False,"Release notes summarize agent versions and changes; while detailed, they are version history rather than reusable expert knowledge patterns like limits, configuration matrices, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/servers/agent-release-notes-archive,Archive for Connected Machine agent release notes,Archive for What's new with Azure Connected Machine agent - Azure Arc,,Release notes for Azure Connected Machine agent versions older than six months,"Caution This article references CentOS, a Linux distribution that is End Of Life (EOL) status. Please consider your use and planning accordingly. For more information, see the CentOS End Of Life guidance. This article contains information about older releases of the Connected Machine agent. The primary What's new in Azure Connected Machine agent? article contains updates for the last six months. 
Microsoft recommends staying up to date with the latest agent version whenever possible.",2026-03-24T08:00:00.000Z,overview,,0.4,False,"Archive of older release notes; historical change log rather than reusable expert knowledge for skills such as configuration, limits, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/servers/api-extended-security-updates,Programmatically manage Extended Security Updates licenses,Programmatically deploy and manage Azure Arc Extended Security Updates licenses - Azure Arc,Use Azure Arc WS2012 ESU ARM APIs programmatically,Learn how to programmatically deploy and manage Azure Arc Extended Security Updates licenses for Windows Server 2012.,"This article provides instructions to programmatically provision and manage Windows Server 2012 and Windows Server 2012 R2 Extended Security Updates lifecycle operations through the Azure Arc WS2012 ESU ARM APIs. For each of the API commands explained in this article, be sure to enter accurate parameter information for location, state, edition, type, and processors depending on your particular scenario. Note You'll need to create a service principal to use the Azure API to manage ESUs. See Connec",2024-10-03T21:58:00.000Z,concept-article,integrations,0.7,True,"Describes specific ARM API operations and parameters (location, state, edition, type, processors) for ESU lifecycle; this is product-specific API integration guidance with concrete parameter usage.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/servers/arc-gateway,Simplify network configuration requirements,Simplify Network Configuration Requirements with Azure Arc Gateway - Azure Arc,Configure Azure Arc gateway network endpoints and usage,Learn how to simplify network configuration requirements with Azure Arc gateway.,"If you use enterprise proxies to manage outbound traffic, Azure Arc gateway lets you onboard infrastructure to Azure Arc by using only seven endpoints. 
With Azure Arc gateway, you can:",2026-04-01T08:00:00.000Z,how-to,configuration,0.7,True,"The Arc gateway article is about simplifying network configuration and explicitly mentions reducing endpoints to seven. Such content typically includes the exact list of required endpoints/URLs and possibly ports, which are concrete configuration parameters unique to this product scenario, matching the configuration sub-skill.",unchanged @@ -437,30 +437,30 @@ https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/clean-u https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/configure,Configure your solutions,Configure your Solutions with Workload Orchestration Portal - Azure Arc,,Learn how to use the workload orchestration portal to configure your solutions and publish values for deployment.,"The Configure tab in the workload orchestration portal provides a comprehensive view of your solutions at the factory and line hierarchy levels. This article describes how to use the workload orchestration portal to add configuration values and publish values for deployment for your solutions. If you want information about other tabs in the workload orchestration portal, seeMonitor your solutionsandDeploy your solutions.",2025-12-23T18:04:00.000Z,how-to,,0.45,False,Describes using the Configure tab to add/publish values; more of a UI workflow guide than a detailed configuration schema or limits document.,unchanged https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/configuring-schema,Configuration schema,Configuration Schemas for Workload Orchestration - Azure Arc,Define configuration schemas for Arc workload orchestration,Learn the rules on how to create configuration schemas for workload orchestration.,"The configuration schema is a YAML file that defines the structure and rules for the configuration of a solution template. 
It specifies the properties, types, and validation rules for the configuration values that users can provide when deploying the solution template. A configuration template can refer to a single schema or none. The schema is used to validate the configuration values provided by users and ensure that they conform to the expected format and constraints. It also provides a way t",2025-06-26T22:17:00.000Z,concept-article,configuration,0.85,True,"Details YAML schema structure, property types, and validation rules for solution templates; includes product-specific schema fields and constraints.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/configuring-template,Configuration template,Configuration Templates for workload orchestration - Azure Arc,Author configuration templates for Arc workload orchestration,Learn how to create configuration templates for workload orchestration using the templating language.,"A configuration template can refer to a single schema or none. Schema can be referred with its full path, that is, subscription, resource group name, schema name and version, or just with schema name and version. If subscription and resource group name aren't provided, then these details are taken from the request for solution or target creation. 
The building blocks of a templating language help you manage configurations dynamically using different expressions: This article describes different e",2025-05-19T15:26:00.000Z,concept-article,configuration,0.8,True,"Describes templating language building blocks and expressions, including how schemas are referenced; contains product-specific template syntax and configuration constructs.",unchanged -https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/delete-resources,Delete resources,Delete Resources in Workload Orchestration - Azure Arc,,Learn how to delete the resources created with workload orchestration,"This article details how to delete or uninstall any workload orchestration resources in Azure, and its cascading impact on other dependent resources, if any. You can delete resources such as targets, solution templates, schema, and configuration templates. IT users can delete resources using the Azure CLI. For more information about workload orchestration, see Prepare your environment for workload orchestration.",2026-04-12T11:04:00.000Z,how-to,,0.3,False,"Appears to be a procedural how-to for deleting workload orchestration resources via CLI, without indication of product-specific limits, configuration matrices, or detailed settings tables. Likely standard delete commands and cascading behavior that an LLM can infer from general Azure patterns.",updated
For more information about workload orchestration, see Prepare your environment for workload orchestration.",2026-04-12T11:04:00.000Z,how-to,,0.3,False,"Appears to be a procedural how-to for deleting workload orchestration resources via CLI, without indication of product-specific limits, configuration matrices, or detailed settings tables. Likely standard delete commands and cascading behavior that an LLM can infer from general Azure patterns.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/deploy,Deploy your solutions,Deploy your Solutions with Workload Orchestration Portal - Azure Arc,,"Learn to use workload orchestration portal to deploy your applications, and also to delete, stop, roll back, and retry solutions.","The Deploy tab in the workload orchestration portal displays the targets and the solutions applicable to the targets. It shows targets created at multiple hierarchical levels such as factory and line. This article describes how to use the workload orchestration portal to deploy, delete, stop, roll back, and retry solutions. If you want information about other tabs in the workload orchestration portal, see Monitor your solutions and Configure your solutions.",2025-12-23T18:04:00.000Z,how-to,,0.4,False,"Task-focused portal how-to for deploying/rolling back solutions; likely step-by-step UI usage without configuration tables, limits, or product-specific patterns beyond generic deployment flows.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/diagnose-problems,Diagnose edge-related errors,Diagnostics of Edge-Related Logs and Errors in Workload Orchestration - Azure Arc,Diagnose Azure Arc workload orchestration logs and errors,"Learn how to diagnose workload orchestration logs and errors, audit and diagnostic logs, and collect container logs or Kubernetes events.","This article describes how to diagnose workload orchestration logs and errors. 
It covers the different types of logs that can be collected or generated, how to enable workload orchestration audit and diagnostic logs, and how to collect container logs or Kubernetes events.",2025-06-09T05:17:00.000Z,how-to,troubleshooting,0.7,True,"Diagnostics article for logs and errors; likely includes specific log types, how to enable audit/diagnostic logs, and commands/locations for container logs and Kubernetes events, matching troubleshooting criteria.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/external-validation,Enable external validation with Event Grid,External Validation for Workload Orchestration - Azure Arc,Configure Event Grid external validation for Arc workloads,Learn how to use Event Grid for workload orchestration external validation and how to create a solution template with external validation enabled.,"External validation allows you to validate the solution template using an external service, such as an Azure Function or a webhook. The external validation service receives events from the workload orchestration service and can perform custom validation logic. 
This article describes how to set up an Event Grid subscription for workload orchestration and how to create and publish a solution template with external validation enabled.",2025-05-26T22:00:00.000Z,how-to,integrations,0.7,True,"Describes setting up Event Grid subscription and enabling external validation in solution templates; likely includes event types, endpoint configuration, and template parameters, fitting integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/external-validation-payload,External validation payloads,External Validation Payload - Azure Arc,Use external validation payload schema for Arc solutions,This document provides details on the Event Grid payload for external validation of solution versions in workload orchestration.,"This document provides details on the Event Grid payload for external validation of solution versions in workload orchestration. It includes information on the Event Grid message format, field descriptions, and API endpoints for getting solution version resources and updating external validation status. For more information, see External validation for workload orchestration.",2025-05-19T15:26:00.000Z,reference,integrations,0.8,True,"Details Event Grid payload format, field descriptions, and API endpoints; this is product-specific API/parameter reference information, matching integrations criteria.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/how-to-stage,Stage a solution before deployment,Staging Resources Before Deployment - Azure Arc,,Learn how to stage resources before deployment in Azure Arc Workload Orchestration.,"Staging is a predeployment step that allows you to validate and download resources before they are deployed to the edge cluster. This process helps ensure that all configurations, images, and dependencies are correctly set up, ensuring reliable deployments within limited maintenance windows. 
Some common scenarios where staging is beneficial are:",2026-03-02T12:03:00.000Z,how-to,,0.3,False,"From the summary, this is a procedural 'how to stage resources' guide without evidence of detailed configuration tables, limits, or product-specific parameters. It appears to describe the concept and scenarios for staging rather than expert-only numeric limits, configuration matrices, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/initial-setup-configuration,Set up workload orchestration,Set Up Workload Orchestration - Azure Arc,,"Learn how to configure resources, author solutions, and manage deployments for Azure Arc workload orchestration.","IT admins and developers are responsible for setting up and configuring workload orchestration. This includes configuring the resources, authoring solutions, and managing deployments. This article describes the steps to configure application specific resources such as workload orchestration resources, author solutions, and manage deployments. It also provides information about application and solution versioning, different ways to author configurations, and different solution authoring scenarios",2026-04-12T11:04:00.000Z,install-set-up-deploy,,0.3,False,"Describes steps and concepts for setup and configuration at a high level; summary does not indicate detailed parameter tables, limits, or product-specific troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/initial-setup-configuration,Set up workload orchestration,Set Up Workload Orchestration - Azure Arc,,"Learn how to configure resources, author solutions, and manage deployments for Azure Arc workload orchestration.","IT admins and developers are responsible for setting up and configuring workload orchestration. This includes configuring the resources, authoring solutions, and managing deployments. 
This article describes the steps to configure application specific resources such as workload orchestration resources, author solutions, and manage deployments. It also provides information about application and solution versioning, different ways to author configurations, and different solution authoring scenarios",2026-04-12T11:04:00.000Z,install-set-up-deploy,,0.3,False,"Describes steps and concepts for setup and configuration at a high level; summary does not indicate detailed parameter tables, limits, or product-specific troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/initial-setup-environment,Prepare the environment,Prepare the Environment for Workload Orchestration - Azure Arc,Prepare environment and resources for Arc workload orchestration,Learn how to set up the environment for workload orchestration. This procedure is done by IT admins.,"IT admins are responsible for the initial setup of workload orchestration, which includes creating a custom location, downloading the workload orchestration extension, and installing the required components. The IT admin also needs to set up the Azure resources for workload orchestration, including the Azure Kubernetes Service (AKS) cluster, site, and site address. This article describes how to prepare the environment for workload orchestration. 
The following steps are shared across all Azure re",2025-12-23T18:04:00.000Z,install-set-up-deploy,configuration,0.65,True,"Describes creating custom location, installing extensions, and setting up AKS/site resources; likely includes specific resource types, parameter names, and required settings unique to workload orchestration.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/known-issues,Known issues,Known Issues for Workload Orchestration - Azure Arc,Resolve known Azure Arc workload orchestration issues,"This article provides a list of known issues for workload orchestration in Azure Arc, including workarounds.",This article provides a list of known issues for workload orchestration in Azure Arc. It includes workarounds and solutions for each issue. Known issues are updated as new issues are discovered and resolved.,2025-06-26T22:17:00.000Z,troubleshooting-known-issue,troubleshooting,0.7,True,"Known issues list with workarounds/solutions; effectively symptom-to-workaround mappings, which are product-specific troubleshooting knowledge.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/migration-script,Migration script for GA,Migrate Existing Target Resources to General Availability - Azure Arc,Run migration script to upgrade Arc targets to GA,Learn how to migrate existing target resources in Azure Arc workload orchestration to the general availability (GA) version.,"If you used workload orchestration in preview, you need to run a migration script to update the existing target resources to the general availability (GA) version. 
For more information about the GA release, see Release Notes for Workload Orchestration.",2025-06-26T22:17:00.000Z,reference,deployment,0.65,True,"Migration script for preview-to-GA targets; likely includes specific script parameters, required versions, and sequencing, representing product-specific deployment/migration requirements.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/monitor,Monitor your solutions,Monitor your Solutions with Workload Orchestration Portal - Azure Arc,,"Learn how to navigate the workload orchestration portal to monitor your solutions, including their status, capabilities, and deployment details.","Workload orchestration portal provides a user-friendly interface for monitoring your solutions. In the Monitor tab you can view your solutions, including their status, capabilities, and deployment details. This article describes how to use the workload orchestration portal to monitor your solutions. If you want information about other tabs in the workload orchestration portal, see Configure your solutions and Deploy your solutions.",2025-10-27T22:11:00.000Z,how-to,,0.3,False,"Monitoring tab navigation/how-to; appears to describe viewing status and details, not specific diagnostic commands, error codes, or configuration parameters.",unchanged -https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/onboarding-scripts,Onboard using scripts,Onboarding Scripts for Workload Orchestration - Azure Arc,,The onboarding scripts are designed to help you set up the necessary infrastructure and resources for workload orchestration in Azure Arc.,"The onboarding scripts are designed to help you set up the necessary infrastructure and resources for workload orchestration in Azure Arc. 
The scripts automate the process of creating a Kubernetes cluster, deploying on the cluster, creating custom location and site, installing the workload orchestration CLI extension and other resources necessary to deploy your 1st application. The scripts are available in 3 variants - PowerShell, Python and Bash, all of which are functionally equivalent. Tip If",2026-04-12T11:04:00.000Z,install-set-up-deploy,,0.4,False,"Onboarding scripts article appears to be a procedural/tutorial guide for creating clusters and resources; summary does not show detailed configuration tables, limits, or error mappings.",updated -https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/overview,Overview,What Is Workload Orchestration? - Azure Arc,,Workload orchestration is a cross-platform orchestrator for managing edge workloads using an Azure control plane.,"Workload orchestration for Azure Arc is a cloud‑native, cross‑platform service that enables centralized deployment, configuration, and lifecycle management of application workloads across distributed edge environments. It helps organizations deploy applications consistently across multiple fleets with site‑specific configurations, and natively supports Kubernetes workloads.",2026-04-12T11:04:00.000Z,overview,,0.1,False,"High-level conceptual overview of Azure Arc workload orchestration; no concrete limits, configs, error codes, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/onboarding-scripts,Onboard using scripts,Onboarding Scripts for Workload Orchestration - Azure Arc,,The onboarding scripts are designed to help you set up the necessary infrastructure and resources for workload orchestration in Azure Arc.,"The onboarding scripts are designed to help you set up the necessary infrastructure and resources for workload orchestration in Azure Arc. 
The scripts automate the process of creating a Kubernetes cluster, deploying on the cluster, creating custom location and site, installing the workload orchestration CLI extension and other resources necessary to deploy your 1st application. The scripts are available in 3 variants - PowerShell, Python and Bash, all of which are functionally equivalent. Tip If",2026-04-12T11:04:00.000Z,install-set-up-deploy,,0.4,False,"Onboarding scripts article appears to be a procedural/tutorial guide for creating clusters and resources; summary does not show detailed configuration tables, limits, or error mappings.",unchanged +https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/overview,Overview,What Is Workload Orchestration? - Azure Arc,,Workload orchestration is a cross-platform orchestrator for managing edge workloads using an Azure control plane.,"Workload orchestration for Azure Arc is a cloud‑native, cross‑platform service that enables centralized deployment, configuration, and lifecycle management of application workloads across distributed edge environments. It helps organizations deploy applications consistently across multiple fleets with site‑specific configurations, and natively supports Kubernetes workloads.",2026-04-12T11:04:00.000Z,overview,,0.1,False,"High-level conceptual overview of Azure Arc workload orchestration; no concrete limits, configs, error codes, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/portal-user-guide,User guide,User Guide for the Workload Orchestration Portal - Azure Arc,,"Learn how to navigate the workload orchestration portal to monitor, configure, and deploy solutions without writing code.","This guide is designed for low code/non-code users who want to manage solutions in the workload orchestration portal. The workload orchestration portal provides a user-friendly interface with three main tabs on the left side: Monitor, Configure, and Deploy. 
Each tab has its own set of features and functionalities that help you manage your solutions effectively.",2025-12-23T18:04:00.000Z,how-to,,0.35,False,"Portal user guide for navigation and basic operations; UI-focused, not a configuration parameter or troubleshooting reference.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/quickstart-solution-multiple-instances-k8s,Create multiple instances of a solution in the same K8s,Create a Solution with Multiple Instances with Workload Orchestration - Azure Arc,,Learn how to create a solution with multiple instances in the same Kubernetes namespace using workload orchestration via CLI.,"In this article, you create a solution with multiple instances in the same Kubernetes namespace using workload orchestration via CLI.",2025-06-26T22:17:00.000Z,how-to,,0.45,False,"Tutorial for multiple instances in a namespace; primarily example-driven CLI steps, not a structured configuration or troubleshooting reference.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/quickstart-solution-multiple-shared-adapter-dependency,Create a solution with multiple shared adapter dependencies,Create a Solution with Multiple Dependencies with Workload Orchestration - Azure Arc,,Learn how to create a solution with multiple shared adapter dependencies using workload orchestration via CLI.,"In this tutorial, you create a solution with multiple shared adapter dependencies using workload orchestration via CLI. You will create a Factory Sensor Anomaly Detector (FSAD) solution that depends on a Shared Sync Adapter (SSA) solution. The FSAD solution is deployed on a child target, while the SSA solution is deployed on a parent target. 
The FSAD solution uses the SSA solution to synchronize data between devices and servers.",2025-05-19T15:26:00.000Z,tutorial,,0.45,False,"Tutorial for multiple shared adapter dependencies; focused on example workflow, not on generalized configuration or limits documentation.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/quickstart-solution-shared-adapter-dependency,Create a solution with shared adapter dependency,Create a Solution with Shared Adapter Dependency with Workload Orchestration - Azure Arc,,Learn how to create a solution with shared adapter dependency using workload orchestration via CLI.,"In this tutorial, you use the workload orchestration via CLI to create a Factory Sensor Anomaly Detector (FSAD) solution which is dependent on a Shared Sync Adapter (SSA) solution. A shared sync adapter is a component used in various solutions to manage data synchronization between devices and servers. The FSAD solution is deployed on a child target, while the SSA solution is deployed on a parent target. The FSAD solution uses the SSA solution to synchronize data between devices and servers.",2025-06-26T22:17:00.000Z,tutorial,,0.45,False,Tutorial creating a solution with shared adapter dependency; scenario-based CLI usage rather than a parameter reference or troubleshooting guide.,unchanged -https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/quickstart-solution-with-common-configuration,Create solution with common configurations,Create a Basic Solution with Common Configurations with Workload Orchestration - Azure Arc,,Learn how to create a basic solution with common configurations using the workload orchestration via CLI.,"In this quickstart, you create a basic solution with common configurations using the workload orchestration via CLI. The solution is a Helm chart that contains the application and its dependencies. 
The common configurations are used to define the configurable attributes at each hierarchical level that can be used for a particular solution.",2026-04-12T11:04:00.000Z,quickstart,,0.2,False,"Quickstart for creating a solution with common configurations; summary suggests example usage of Helm and configuration hierarchy, not detailed config reference, limits, or decision matrices.",updated -https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/quickstart-solution-without-common-configuration,Create a basic solution,Create a Basic Solution with Workload Orchestration - Azure Arc,,Learn how to create a basic solution without common configurations using the workload orchestration via CLI.,"In this quickstart, you create a basic solution using the workload orchestration via CLI.",2026-04-12T11:04:00.000Z,quickstart,,0.2,False,"Quickstart for creating a basic solution via CLI; typical step-by-step tutorial without indication of expert-only limits, configuration matrices, or troubleshooting content.",updated +https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/quickstart-solution-with-common-configuration,Create solution with common configurations,Create a Basic Solution with Common Configurations with Workload Orchestration - Azure Arc,,Learn how to create a basic solution with common configurations using the workload orchestration via CLI.,"In this quickstart, you create a basic solution with common configurations using the workload orchestration via CLI. The solution is a Helm chart that contains the application and its dependencies. 
The common configurations are used to define the configurable attributes at each hierarchical level that can be used for a particular solution.",2026-04-12T11:04:00.000Z,quickstart,,0.2,False,"Quickstart for creating a solution with common configurations; summary suggests example usage of Helm and configuration hierarchy, not detailed config reference, limits, or decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/quickstart-solution-without-common-configuration,Create a basic solution,Create a Basic Solution with Workload Orchestration - Azure Arc,,Learn how to create a basic solution without common configurations using the workload orchestration via CLI.,"In this quickstart, you create a basic solution using the workload orchestration via CLI.",2026-04-12T11:04:00.000Z,quickstart,,0.2,False,"Quickstart for creating a basic solution via CLI; typical step-by-step tutorial without indication of expert-only limits, configuration matrices, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/quickstart-upgrade-shared-application,Upgrade a shared application,Upgrade a Shared Application with Workload Orchestration - Azure Arc,,Learn how to upgrade a shared application using workload orchestration via CLI.,"In this tutorial, you upgrade a shared Factory Sensor Anomaly Detector (FSAD) solution and deploy a dependent Shared Sync Adapter (SSA) solution using workload orchestration and the Azure CLI. The FSAD solution is deployed on a child target, while the SSA solution is deployed on a parent target. The FSAD solution uses the SSA solution to synchronize data between devices and servers. 
For more information about the FSAD and SSA solutions, see Quickstart: Create a solution with multiple shared adapt",2025-05-19T15:26:00.000Z,how-to,,0.45,False,Upgrade tutorial for shared application; scenario-based instructions without clear evidence of generalized configuration tables or limits.,unchanged https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/rbac-guide,RBAC guide,Role Based Access Control (RBAC) Guide for Workload Orchestration - Azure Arc,Set up RBAC for Arc workload orchestration resources,Learn how to set up Role-Based Access Control (RBAC) when using workload orchestration.,"This article describes how you can set up RBAC when using workload orchestration. Standard Azure RBAC is compatible with workload orchestration since all objects are ARM-based. You can use either the Azure Standard Built-in Roles (Owner, Reader, and Contributor) or a Custom Defined Role to support various combinations of actions suitable for their organization. For more information, see Azure Role-Based Access Control (RBAC).",2025-12-23T18:04:00.000Z,overview,security,0.7,True,"RBAC guide for workload orchestration; discusses using built-in and custom roles with ARM-based objects, implying product-specific action sets and scopes.",unchanged -https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/release-notes,Release notes,Release Notes for Workload Orchestration - Azure Arc,,Release notes for Workload Orchestration.,"This article provides the latest and past release notes for workload orchestration in Azure Arc. It includes new features, improvements, and bug fixes.",2026-04-12T11:04:00.000Z,release-notes,,0.2,False,"Release notes usually list features, fixes, and changes but not the structured limits, configuration parameters, or troubleshooting mappings required by the defined sub-skill types. 
They are time-bound change logs rather than reusable expert configuration or diagnostic knowledge.",updated +https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/release-notes,Release notes,Release Notes for Workload Orchestration - Azure Arc,,Release notes for Workload Orchestration.,"This article provides the latest and past release notes for workload orchestration in Azure Arc. It includes new features, improvements, and bug fixes.",2026-04-12T11:04:00.000Z,release-notes,,0.2,False,"Release notes usually list features, fixes, and changes but not the structured limits, configuration parameters, or troubleshooting mappings required by the defined sub-skill types. They are time-bound change logs rather than reusable expert configuration or diagnostic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/service-group,Use service groups,Service Groups for Workload Orchestration - Azure Arc,Configure service groups for Arc workload orchestration,Learn about service groups and how to configure them in workload orchestration.,"Service groups are a new resource type in Azure Resource Manager (ARM) that help you organize related resources, like resource groups, subscriptions, and management groups, under one service, application, or workload. This article explains how to create a service group and configure it to use it with workload orchestration. For more information, see RBAC for service groups. Note Service groups are in preview and require allowlisting of user subscriptions for access. 
You need to fill this form to ge",2026-01-13T06:05:00.000Z,how-to,configuration,0.7,True,Explains creating and configuring service groups as ARM resources; likely includes resource properties and configuration fields unique to service groups.,unchanged -https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/troubleshooting,Troubleshooting guide,Troubleshooting for Workload Orchestration - Azure Arc,Diagnose and fix Azure Arc workload orchestration issues,Learn how to troubleshoot issues with Workload Orchestration.,This article provides troubleshooting guidance for common issues encountered when using workload orchestration in Azure Arc.,2026-04-12T11:04:00.000Z,troubleshooting-general,troubleshooting,0.8,True,"Explicitly described as troubleshooting guidance for common issues. Such pages typically map specific symptoms and error messages to causes and resolutions, often including product-specific commands or logs, which qualifies as expert troubleshooting knowledge.",updated +https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/troubleshooting,Troubleshooting guide,Troubleshooting for Workload Orchestration - Azure Arc,Diagnose and fix Azure Arc workload orchestration issues,Learn how to troubleshoot issues with Workload Orchestration.,This article provides troubleshooting guidance for common issues encountered when using workload orchestration in Azure Arc.,2026-04-12T11:04:00.000Z,troubleshooting-general,troubleshooting,0.8,True,"Explicitly described as troubleshooting guidance for common issues. 
Such pages typically map specific symptoms and error messages to causes and resolutions, often including product-specific commands or logs, which qualifies as expert troubleshooting knowledge.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/tutorial-service-group-scenario-1,Create a solution with a leaf target,Solution with a Leaf Target - Azure Arc,,Learn how to create and configure a solution with a leaf target in a four-level service group hierarchy using workload orchestration.,"In this tutorial, you create and configure a target at the line level, which is the leaf level in a four-level service group hierarchy. You use service groups to orchestrate workloads across different levels of the hierarchy. For more information, see Service groups at different hierarchy levels in workload orchestration. Note Service groups are in preview and require allowlisting of user subscriptions for access. You need to fill this form to get access to service groups.",2025-12-23T18:04:00.000Z,tutorial,,0.4,False,"Tutorial for leaf target in hierarchy; example scenario using service groups, not a general configuration or decision matrix reference.",unchanged https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/tutorial-service-group-scenario-2,Create a solution with a non-leaf target,Solution with a Non-Leaf Target - Azure Arc,,Learn how to create and configure a solution with a non-leaf target in a four-level service group hierarchy.,"In this tutorial, you create and configure a target at the city level, which is the third level in a four-level service group hierarchy. You use service groups to orchestrate workloads across different levels of the hierarchy. For more information, see Service groups at different hierarchy levels in workload orchestration. Note Service groups are in preview and require allowlisting of user subscriptions for access. 
You need to fill this form to get access to service groups.",2025-12-23T18:04:00.000Z,tutorial,,0.4,False,Tutorial for non-leaf target; similar scenario-based guidance without broad configuration or troubleshooting mappings.,unchanged https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/tutorial-service-group-scenario-3,Create a solution with shared dependencies at different levels,Solution with Multiple Shared Dependencies at Different Hierarchy Levels - Azure Arc,,Learn how to create a solution with multiple shared dependencies at different hierarchy levels in Azure Arc-enabled Kubernetes.,"In this tutorial, you create a solution with multiple shared dependencies at different levels in a four-level hierarchy organization. You use service groups to orchestrate workloads across different levels of the hierarchy. For more information, see Service groups at different hierarchy levels in workload orchestration. Note Service groups are in preview and require allowlisting of user subscriptions for access. You need to fill this form to get access to service groups.",2025-06-05T05:30:00.000Z,tutorial,,0.4,False,"Tutorial for multiple shared dependencies; focused on a specific example hierarchy, not on reusable configuration reference material.",unchanged diff --git a/products/azure-arc/report.md b/products/azure-arc/report.md index 52fe02eb..dce7bed0 100644 --- a/products/azure-arc/report.md +++ b/products/azure-arc/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: security: 'Securing Azure Arc: identity, RBAC, AD/Entra auth, keytabs, TDE, certificates, network/Private Link, policies, and hardening for Kubernetes, servers, SQL MI, @@ -13,9 +13,9 @@ category_descriptions: limits-quotas: Limits, quotas, versions, and requirements for Arc-enabled Kubernetes, Edge RAG, Arc data services, resource bridge, and billing/ESU behavior for connected machines and Windows Server. 
- configuration: 'Configuring Azure Arc infrastructure and data services: storage, - networking, GitOps, monitoring, security, Edge RAG, agents/extensions, and lifecycle - tasks for Arc-enabled Kubernetes, servers, and data.' + configuration: 'Configuring Azure Arc infrastructure and services: Kubernetes, data + services, Edge RAG, storage, networking, GitOps, monitoring, VM/agent settings, + extensions, and portal/CLI management.' architecture-patterns: 'Patterns for Arc data/compute design: container storage data flow, Arc Edge Volumes, HA/DR for Arc SQL MI and failover groups, and advanced Edge RAG data parsing.' @@ -33,14 +33,14 @@ skill_description: Expert knowledge for Azure Arc development including troubles security, configuration, integrations & coding patterns, and deployment. Use when managing Arc-enabled Kubernetes, servers/VMs, SQL MI, Edge RAG, resource bridge, or Arc data services, and other Azure Arc related development tasks. Not for Azure - Kubernetes Service (AKS) (use azure-kubernetes-service), Azure Virtual Machines - (use azure-virtual-machines), Azure Stack Edge (use azure-stack-edge), Azure VMware - Solution (use azure-vmware-solution). + Kubernetes Service (AKS) (use azure-kubernetes-service), Azure Stack Edge (use azure-stack-edge), + Azure Virtual Machines (use azure-virtual-machines), Azure Virtual Network (use + azure-virtual-network). use_when: Use when managing Arc-enabled Kubernetes, servers/VMs, SQL MI, Edge RAG, resource bridge, or Arc data services, and other Azure Arc related development tasks. confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes-service), - Azure Virtual Machines (use azure-virtual-machines), Azure Stack Edge (use azure-stack-edge), - Azure VMware Solution (use azure-vmware-solution). + Azure Stack Edge (use azure-stack-edge), Azure Virtual Machines (use azure-virtual-machines), + Azure Virtual Network (use azure-virtual-network). 
--- # Azure Arc Crawl Report @@ -53,10 +53,10 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes - **Unclassified**: 152 ### Incremental Update -- **New Pages**: 8 -- **Updated Pages**: 14 -- **Unchanged**: 401 -- **Deleted Pages**: 2 +- **New Pages**: 0 +- **Updated Pages**: 2 +- **Unchanged**: 421 +- **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-arc/azure-arc.csv` ## Classification Statistics @@ -76,52 +76,12 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes ## Changes -### New Pages - -- [Overview](https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/cert-manager-overview) -- [Deploy extension](https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/cert-manager-deploy) -- [Ingress support](https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/cert-manager-ingress) -- [Egress support](https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/cert-manager-egress) -- [Monitor and troubleshoot](https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/cert-manager-monitor-troubleshoot) -- [Application deployment with GitOps (Flux)](https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/conceptual-gitops-flux2) -- [GitOps (Flux) Release notes](https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/flux-gitops-release-notes) -- [GitOps (Flux) parameters](https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/gitops-flux2-parameters) - ### Updated Pages -- [Multicloud connector enabled by Azure Arc >](https://learn.microsoft.com/en-us/azure/azure-arc/multicloud-connector/overview) - - Updated: 2025-11-20T18:10:00.000Z → 2026-04-13T08:00:00.000Z -- [Resource representation in Azure](https://learn.microsoft.com/en-us/azure/azure-arc/multicloud-connector/resource-representation) - - Updated: 2025-11-18T16:11:00.000Z → 2026-04-17T22:09:00.000Z -- [Onboard VMs to Azure 
Arc](https://learn.microsoft.com/en-us/azure/azure-arc/multicloud-connector/onboard-multicloud-vms-arc) - - Updated: 2025-11-18T16:11:00.000Z → 2026-04-17T22:09:00.000Z -- [Troubleshooting](https://learn.microsoft.com/en-us/azure/azure-arc/multicloud-connector/troubleshoot-multicloud-connector) - - Updated: 2025-11-18T16:11:00.000Z → 2026-04-15T22:10:00.000Z -- [What's new with Connected Machine agent?](https://learn.microsoft.com/en-us/azure/azure-arc/servers/agent-release-notes) - - Updated: 2026-03-25T06:03:00.000Z → 2026-04-14T06:03:00.000Z -- [Available extensions](https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/extensions-release) - - Updated: 2026-04-01T08:00:00.000Z → 2026-04-17T17:16:00.000Z -- [Delete resources](https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/delete-resources) - - Updated: 2026-03-02T12:03:00.000Z → 2026-04-12T11:04:00.000Z -- [Troubleshooting guide](https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/troubleshooting) - - Updated: 2025-06-26T22:17:00.000Z → 2026-04-12T11:04:00.000Z -- [Release notes](https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/release-notes) - - Updated: 2026-03-02T12:03:00.000Z → 2026-04-12T11:04:00.000Z -- [Overview](https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/overview) - - Updated: 2025-07-01T22:12:00.000Z → 2026-04-12T11:04:00.000Z -- [Set up workload orchestration](https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/initial-setup-configuration) - - Updated: 2025-09-05T17:07:00.000Z → 2026-04-12T11:04:00.000Z -- [Onboard using scripts](https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/onboarding-scripts) - - Updated: 2025-07-01T22:12:00.000Z → 2026-04-12T11:04:00.000Z -- [Create a basic solution](https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/quickstart-solution-without-common-configuration) - - Updated: 2025-12-23T18:04:00.000Z → 
2026-04-12T11:04:00.000Z -- [Create solution with common configurations](https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/quickstart-solution-with-common-configuration) - - Updated: 2025-12-23T18:04:00.000Z → 2026-04-12T11:04:00.000Z - -### Deleted Pages - -- ~~Application deployment with GitOps (Flux v2)~~ (https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/conceptual-gitops-flux2) -- ~~GitOps (Flux v2) parameters~~ (https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/gitops-flux2-parameters) +- [Get started](https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/secret-store-extension) + - Updated: 2025-11-18T19:04:00.000Z → 2026-04-22T11:03:00.000Z +- [What's new with Arc-enabled Kubernetes](https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/release-notes) + - Updated: 2026-03-24T17:13:00.000Z → 2026-04-23T08:00:00.000Z ## Classified Pages @@ -131,7 +91,6 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes | [Troubleshooting](https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/secret-store-extension-troubleshooting) | troubleshooting | 0.95 | Explicit troubleshooting guide; will map specific SSE symptoms, error messages, and misconfigurations to resolutions and diagnostic steps. | | [Agent connection issues](https://learn.microsoft.com/en-us/azure/azure-arc/servers/troubleshoot-agent-onboard) | troubleshooting | 0.90 | Troubleshooting guide for agent connectivity with Arc-enabled servers; will map connection symptoms to causes and fixes, including product-specific commands and log locations. | | [Extension issues](https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/extensions-troubleshooting) | troubleshooting | 0.90 | Troubleshooting for cluster extensions; will list extension provisioning states, error messages, and corrective actions. 
| -| [Get started](https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/secret-store-extension) | configuration | 0.90 | Explains how SSE syncs secrets and how to configure it; such pages include extension settings, sync modes, and Kubernetes secret mapping parameters. | | [SCVMM-specific deployment errors](https://learn.microsoft.com/en-us/azure/azure-arc/system-center-virtual-machine-manager/troubleshoot-scvmm) | troubleshooting | 0.90 | Explicit troubleshooting article for deployment errors; likely organized by error codes/messages and their resolutions, which is core troubleshooting content. | | [Troubleshoot Extended Security Updates](https://learn.microsoft.com/en-us/azure/azure-arc/servers/troubleshoot-extended-security-updates) | troubleshooting | 0.90 | Explicit troubleshooting article for ESU enablement with symptom-based guidance around licensing, enrollment, provider registration, and patch delivery; will contain specific causes and resolutions unique to ESU via Arc. | | [Troubleshoot SSH access to Azure Arc-enabled servers](https://learn.microsoft.com/en-us/azure/azure-arc/servers/ssh-arc-troubleshoot) | troubleshooting | 0.90 | Explicit troubleshooting article with Arc SSH-specific symptoms, causes, and resolutions, likely including error messages and diagnostic steps. | @@ -235,6 +194,7 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes | [arcdata](https://learn.microsoft.com/en-us/azure/azure-arc/data/about-arcdata-extension) | integrations | 0.75 | Command reference for az arcdata; contains parameter names, allowed values, and behavior unique to this extension. | | [azcmagent genkey](https://learn.microsoft.com/en-us/azure/azure-arc/servers/azcmagent-genkey) | configuration | 0.75 | Command reference for generating key pairs for asynchronous onboarding to Arc-enabled VM offerings; includes specific usage patterns unique to this product. 
| | [azcmagent logs](https://learn.microsoft.com/en-us/azure/azure-arc/servers/azcmagent-logs) | configuration | 0.75 | Command reference for collecting agent and extension logs into a ZIP; includes product-specific log collection behavior and options. | +| [Get started](https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/secret-store-extension) | configuration | 0.74 | The page describes a specific Azure Arc-enabled Kubernetes extension that syncs Key Vault secrets into the Kubernetes secret store. Such extension docs typically include product-specific configuration options (CRD fields, Helm values, parameters like key vault name, tenant, sync intervals, secret mapping formats) and their allowed values, which qualify as expert configuration knowledge beyond generic Kubernetes or Key Vault usage. | | [Upload logs](https://learn.microsoft.com/en-us/azure/azure-arc/data/upload-logs) | configuration | 0.74 | Log export and upload procedure; includes product-specific log categories, commands, and integration behavior with Azure Monitor. | | [Upload metrics](https://learn.microsoft.com/en-us/azure/azure-arc/data/upload-metrics) | configuration | 0.74 | Metrics export and upload guide; includes product-specific metric sets, commands, and Azure Monitor integration details. | | [Connectivity modes and requirements](https://learn.microsoft.com/en-us/azure/azure-arc/data/connectivity) | decision-making | 0.72 | Explains connectivity modes and requirements; typically includes comparison of modes, prerequisites, and trade-offs to help select the right connectivity option. 
| @@ -507,7 +467,6 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes | [Set up workload orchestration](https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/initial-setup-configuration) | 0.30 | Describes steps and concepts for setup and configuration at a high level; summary does not indicate detailed parameter tables, limits, or product-specific troubleshooting. | | [Stage a solution before deployment](https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/how-to-stage) | 0.30 | From the summary, this is a procedural 'how to stage resources' guide without evidence of detailed configuration tables, limits, or product-specific parameters. It appears to describe the concept and scenarios for staging rather than expert-only numeric limits, configuration matrices, or error mappings. | | [View alerts](https://learn.microsoft.com/en-us/azure/azure-arc/site-manager/how-to-view-alerts) | 0.30 | Viewing alerts is primarily a UI/status walkthrough; it doesn’t emphasize configuration parameters or troubleshooting mappings. | -| [What's new with Arc-enabled Kubernetes](https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/release-notes) | 0.30 | Release notes typically list version changes and high-level feature updates; the summary does not indicate presence of numeric limits, config tables, error-code mappings, or other structured expert details as defined by the sub-skill types. | | [What's new with Connected Machine agent?](https://learn.microsoft.com/en-us/azure/azure-arc/servers/agent-release-notes) | 0.30 | Release notes summarize agent versions and changes; while detailed, they are version history rather than reusable expert knowledge patterns like limits, configuration matrices, or troubleshooting mappings. 
| | [Workflows and features](https://learn.microsoft.com/en-us/azure/azure-arc/workload-orchestration/workflow-features) | 0.30 | Overview of workflows and features; conceptual description of authoring/deploying/managing solutions without explicit config tables or limits. | | [Azure Arc-enabled System Center Virtual Machine Manager >](https://learn.microsoft.com/en-us/azure/azure-arc/system-center-virtual-machine-manager/overview) | 0.25 | Overview of Arc-enabled SCVMM; no explicit expert-level configuration or limits mentioned. | @@ -541,6 +500,7 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes | [View connectivity status](https://learn.microsoft.com/en-us/azure/azure-arc/site-manager/how-to-view-connectivity-status) | 0.20 | Viewing connectivity status is a UI usage guide; it doesn’t expose numeric limits, config matrices, or detailed error mappings. | | [View update status](https://learn.microsoft.com/en-us/azure/azure-arc/site-manager/how-to-view-update-status) | 0.20 | Viewing update status is a basic monitoring/portal guide without expert-only configuration or troubleshooting content. | | [What is Edge RAG?](https://learn.microsoft.com/en-us/azure/azure-arc/edge-rag/overview) | 0.20 | High-level overview of Edge RAG; summary doesn’t show detailed configs, limits, or troubleshooting content. | +| [What's new with Arc-enabled Kubernetes](https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/release-notes) | 0.20 | Release notes primarily list version changes and high-level updates for Azure Arc-enabled Kubernetes agents. While they may contain some specific behaviors or fixes, they are not organized as limits, configuration references, troubleshooting guides, or other reusable expert patterns defined in the sub-skill types. Content is more about change history than structured expert knowledge per the given categories. 
| | [What's new?](https://learn.microsoft.com/en-us/azure/azure-arc/container-storage/whats-new) | 0.20 | What's new page referencing blogs and case studies; not a technical reference for limits/configs/troubleshooting. | | [Windows Server Management](https://learn.microsoft.com/en-us/azure/azure-arc/servers/windows-server-management-overview) | 0.20 | High-level overview of Windows Server Management enabled by Azure Arc and its benefits; no detailed limits, configuration parameters, error codes, or decision matrices. | | [Agent overview](https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/conceptual-agent-overview) | 0.10 | Agent overview is conceptual and high-level, explaining what the agents are and general deployment steps; no indication of specific limits, config matrices, or detailed troubleshooting content. | diff --git a/products/azure-architecture/azure-architecture.csv b/products/azure-architecture/azure-architecture.csv index 2afd1b20..e7c0f821 100644 --- a/products/azure-architecture/azure-architecture.csv +++ b/products/azure-architecture/azure-architecture.csv @@ -39,7 +39,6 @@ https://learn.microsoft.com/en-us/azure/architecture/ai-ml/idea/orchestrate-mach https://learn.microsoft.com/en-us/azure/architecture/ai-ml/idea/unlock-insights-from-conversational-data,Unlock insights from conversational data,Build a Conversation Knowledge Mining Solution by using Foundry Tools - Azure Architecture Center,Design conversation analytics with Foundry Tools,Learn how to design a scalable conversation analytics system that extracts insights from conversational data by using Foundry Tools and Microsoft Foundry.,"Solution ideas This article describes a solution idea. Your cloud architect can use this guidance to help visualize the major components for a typical implementation of this architecture. Use this article as a starting point to design a well-architected solution that aligns with your workload's specific requirements. 
This architecture describes a conversation knowledge mining solution that extracts actionable insights from large volumes of conversational data, like from a call center. The soluti",2026-02-19T18:33:00Z,solution-idea,solution-ideas,0.85,True,Explicitly labeled as a solution idea with conceptual architecture diagram and scenario-specific guidance for conversation knowledge mining.,unchanged https://learn.microsoft.com/en-us/azure/architecture/ai-ml/openai/architecture/call-center-openai-analytics,Extract and analyze call center data,"Extract and Analyze Call Center Data by Using Azure OpenAI in Foundry Models, Speech Services, and Language Services - Azure Architecture Center",Analyze call center conversations with Azure OpenAI,Learn how to extract insights from customer conversations at a call center by using Foundry Tools and Azure OpenAI in Foundry Models.,"This article describes how to extract insights from customer conversations at a call center by using Foundry Tools and Azure OpenAI in Foundry Models. Use these services to improve your customer interactions and satisfaction by analyzing call intent and sentiment, extracting key entities, and summarizing call content.",2026-02-19T18:33:00Z,concept-article,example-workloads,0.7,True,"Call center analytics workload using Foundry Tools and Azure OpenAI; detailed, domain-specific implementation rather than a generic solution idea.",unchanged https://learn.microsoft.com/en-us/azure/architecture/analytics/analytics-get-started,Get started,Analytics architecture design - Azure Architecture Center,,"Get an overview of Azure analytics technologies, guidance offerings, solution ideas, and reference architectures.","With the exponential growth in data, organizations rely on the limitless compute, storage, and analytical power of Azure to scale, stream, predict, and see their data. 
Analytics solutions turn volumes of data into useful business intelligence (BI), such as reports and visualizations, and inventive artificial intelligence (AI), such as forecasts based on machine learning. Whether your organization is just starting to evaluate cloud-based analytics tools or is looking to expand your current implem",2026-03-27T05:03:00.000Z,concept-article,,0.2,False,"Overview of Azure analytics technologies and related guidance. It’s a high-level entry point without a specific pattern structure, solution idea diagram, or detailed implementation/deployment specifics, so it doesn’t meet any sub-skill type’s expert-knowledge criteria.",unchanged -https://learn.microsoft.com/en-us/azure/architecture/analytics/architecture/fabric-deployment-patterns,Microsoft Fabric deployment patterns,Deployment Patterns for Microsoft Fabric - Azure Architecture Center,,Learn about common deployment scenarios for Microsoft Fabric.,"This article describes four deployment patterns that you can choose from when you deploy Microsoft Fabric. Learn about considerations, recommendations, and potential nonreversible decisions for each deployment pattern. The following design areas are outlined for each Fabric deployment pattern:",2025-10-30T05:03:00Z,concept-article,,0.45,False,"Describes deployment patterns for Microsoft Fabric with considerations and recommendations, but this is more of a conceptual deployment-pattern guide; it does not clearly fall into the defined sub-skill types (no patterns/ URL, no best-practices/ marker, not an example-scenario or reference-architectures path).",unchanged https://learn.microsoft.com/en-us/azure/architecture/antipatterns/,Overview,Performance testing and antipatterns - Azure Architecture Center,Identify and remediate common Azure performance antipatterns,Build scalability solutions for common stressors by learning about performance antipatterns. 
These are common practices that are likely to cause scalability problems when an application is under press,"Performance antipatterns, much like design patterns, are common defective processes and implementations within organizations. These are common practices that are likely to cause scalability problems when an application is under pressure. Awareness of these practices can help simplify communication of high-level concepts among software practitioners, and knowledge of antipatterns can be helpful when reviewing code or diagnosing performance issues. Here's a common scenario: An application behaves ",2026-02-03T06:02:00.000Z,design-pattern,anti-patterns,0.9,True,"Antipatterns/ index describing performance antipatterns, their impact, and remediation approaches; focuses on what not to do.",unchanged https://learn.microsoft.com/en-us/azure/architecture/antipatterns/busy-database/,Busy Database,Busy Database antipattern - Azure Architecture Center,Avoid the Busy Database performance antipattern,"Understand the Busy Database antipattern, which can cause performance and scalability problems by offloading processing to a database server.","Offloading processing to a database server can cause it to spend a significant proportion of time running code, rather than responding to requests to store and retrieve data.",2022-12-16T18:32:00.000Z,design-pattern,anti-patterns,0.95,True,"Named antipattern page under antipatterns/ with symptoms, causes, and remediation steps for offloading too much processing to the database.",unchanged https://learn.microsoft.com/en-us/azure/architecture/antipatterns/busy-front-end/,Busy Front End,Busy Front End antipattern - Azure Architecture Center,Detect and fix the Busy Front End antipattern,Asynchronous work on a large number of background threads can starve other foreground tasks of resources.,"Performing asynchronous work on a large number of background threads can starve other concurrent foreground tasks of resources, decreasing response 
times to unacceptable levels.",2026-02-03T06:02:00.000Z,design-pattern,anti-patterns,0.95,True,"Named antipattern under antipatterns/ describing symptoms of excessive background threads and how to remediate, clearly an anti-pattern guide.",unchanged @@ -59,7 +58,7 @@ https://learn.microsoft.com/en-us/azure/architecture/aws-professional/databases, https://learn.microsoft.com/en-us/azure/architecture/aws-professional/eks-to-aks/,AKS for Amazon EKS professionals,AKS for Amazon EKS Professionals - Azure Architecture Center,Map Amazon EKS concepts and configs to Azure AKS,"Learn about the AKS managed solution, configurations, best practices, and similarities and differences compared to Amazon EKS.","This series of articles helps professionals who are familiar with Amazon Elastic Kubernetes Service (EKS) understandAzure Kubernetes Service (AKS). The series highlights key similarities and differences between these two managed Kubernetes solutions. The articles compare AKS with Amazon EKS in the following Kubernetes design areas: For greenfield AKS implementations, seeBaseline architecture for an AKS cluster. AKS isn't the only way to run containers in Azure, and Amazon EKS is only one of the ",2026-04-08T17:34:00.000Z,concept-article,migration-guides,0.86,True,"The page is part of the aws-professional path and explicitly targets Amazon EKS professionals learning AKS. It compares AKS and EKS across Kubernetes design areas, which implies detailed service mapping, configuration differences, and migration considerations between AWS and Azure. 
This aligns with the migration-guides definition (cross-cloud comparison, service mapping, and guidance for moving from one platform to another), and goes beyond generic conceptual content.",unchanged https://learn.microsoft.com/en-us/azure/architecture/aws-professional/eks-to-aks/cost-management,Cost management and optimization,Cost Management for Kubernetes - Azure Architecture Center,Manage and optimize AKS costs versus Amazon EKS,"Understand Kubernetes cluster and workload costs, learn how to optimize and govern costs, and compare AKS and Amazon EKS options.",This article explains pricing and cost management in Azure Kubernetes Service (AKS) compared to Amazon Elastic Kubernetes Service (EKS). It describes how to optimize costs and implement cost governance solutions for your AKS cluster. Note This article is part of aseries of articlesthat helps professionals who are familiar withAmazon EKSunderstandAzure Kubernetes Service (AKS).,2025-04-08T17:31:00.000Z,concept-article,migration-guides,0.78,True,"Explains AKS pricing, cost optimization, and governance in comparison with EKS. Cross-cloud cost management and mapping is migration-focused expert guidance.",unchanged https://learn.microsoft.com/en-us/azure/architecture/aws-professional/eks-to-aks/governance,Cluster governance,Governance Options for a Kubernetes Cluster - Azure Architecture Center,Apply governance to AKS clusters versus EKS,"Understand governance options for a Kubernetes cluster, and compare Amazon EKS and Azure Kubernetes Service (AKS) governance options.","Governance refers to an organization's ability to enforce and validate rules to help guarantee compliance with corporate standards. Governance helps organizations mitigate risks, comply with corporate standards and external regulations, and minimize interruption to adoption or innovation. Governance includes planning initiatives, setting strategic priorities, and using mechanisms and processes to control applications and resources. 
For Kubernetes clusters in a cloud environment, governance means",2025-04-08T17:31:00.000Z,concept-article,migration-guides,0.78,True,Describes governance mechanisms for Kubernetes clusters and compares EKS and AKS governance options. Cross-cloud governance mapping is expert migration guidance.,unchanged -https://learn.microsoft.com/en-us/azure/architecture/aws-professional/eks-to-aks/migrate,Migrate EKS workloads to AKS,Migrate from Amazon Elastic Kubernetes Service to Azure Kubernetes Service - Azure Architecture Center,Migrate Kubernetes workloads from EKS to AKS,Learn how to migrate stateless and stateful workloads from Amazon Elastic Kubernetes Service (EKS) to Azure Kubernetes Service (AKS).,"This article describes strategies to migrate typical stateless and stateful workloads from Amazon Elastic Kubernetes Service (EKS) to Azure Kubernetes Service (AKS). Note This guide focuses on the migration process and considerations for moving workloads from EKS to AKS. If you want to learn more about AKS configurations, best practices, and similarities and differences compared to Amazon EKS, start withAKS for Amazon EKS professionals.",2026-04-14T05:03:00.000Z,concept-article,migration-guides,0.86,True,"URL path contains aws-professional/eks-to-aks/migrate, and the page focuses on concrete migration strategies and considerations for moving stateless and stateful workloads from Amazon EKS to Azure AKS. 
This is a cross-cloud migration guide with platform-specific steps and gotchas that go beyond generic conceptual content, matching the migration-guides criteria.",updated +https://learn.microsoft.com/en-us/azure/architecture/aws-professional/eks-to-aks/migrate,Migrate EKS workloads to AKS,Migrate from Amazon Elastic Kubernetes Service to Azure Kubernetes Service - Azure Architecture Center,Migrate Kubernetes workloads from EKS to AKS,Learn how to migrate stateless and stateful workloads from Amazon Elastic Kubernetes Service (EKS) to Azure Kubernetes Service (AKS).,"This article describes strategies to migrate typical stateless and stateful workloads from Amazon Elastic Kubernetes Service (EKS) to Azure Kubernetes Service (AKS). Note This guide focuses on the migration process and considerations for moving workloads from EKS to AKS. If you want to learn more about AKS configurations, best practices, and similarities and differences compared to Amazon EKS, start withAKS for Amazon EKS professionals.",2026-04-14T05:03:00.000Z,concept-article,migration-guides,0.86,True,"URL path contains aws-professional/eks-to-aks/migrate, and the page focuses on concrete migration strategies and considerations for moving stateless and stateful workloads from Amazon EKS to Azure AKS. This is a cross-cloud migration guide with platform-specific steps and gotchas that go beyond generic conceptual content, matching the migration-guides criteria.",unchanged https://learn.microsoft.com/en-us/azure/architecture/aws-professional/eks-to-aks/monitoring,Cluster monitoring and logging,Kubernetes Monitoring and Logging - Azure Architecture Center,Compare and configure AKS vs EKS monitoring and logging,"Understand monitoring and logging for an Azure Kubernetes Service (AKS) cluster and workloads, and compare Amazon EKS and AKS monitoring and logging.",This article compares Azure Kubernetes Service (AKS) monitoring and Amazon Elastic Kubernetes Service (EKS) monitoring. 
It describes options that you can use to monitor and manage the logs of an AKS cluster and its workloads. Note This article is part of aseries of articlesthat helps professionals who are familiar withAmazon EKSunderstandAzure Kubernetes Service (AKS).,2025-04-08T17:31:00.000Z,concept-article,migration-guides,0.78,True,"In the aws-professional/eks-to-aks path, focused on monitoring/logging differences and concrete options for AKS versus EKS. Contains platform-specific guidance and comparisons that go beyond generic Kubernetes knowledge, fitting a migration/transition guide for AWS professionals.",unchanged https://learn.microsoft.com/en-us/azure/architecture/aws-professional/eks-to-aks/node-pools,Agent node management,Manage Kubernetes Nodes and Node Pools - Azure Architecture Center,Manage AKS nodes and node pools versus EKS,"Understand Kubernetes nodes and node pools, how to handle Azure Kubernetes Service (AKS) nodes and node pools, and node pool options for Amazon EKS and AKS.","Kubernetes architecture consists of two layers: thecontrol planeand at least onenode in a node pool. This article describes and compares how Amazon Elastic Kubernetes Service (EKS) and Azure Kubernetes Service (AKS) manage agent nodes and worker nodes. Note This article is part of aseries of articlesthat helps professionals who are familiar withAmazon EKSunderstandAzure Kubernetes Service (AKS). In both Amazon EKS and AKS, the cloud platform provides and manages the control plane layer, and the ",2025-04-08T17:31:00.000Z,concept-article,migration-guides,0.78,True,"Details how EKS and AKS manage agent/worker nodes and node pools, with Azure-specific options. 
This comparative, platform-mapping content is characteristic of migration guides.",unchanged https://learn.microsoft.com/en-us/azure/architecture/aws-professional/eks-to-aks/private-clusters,Network topologies and security,Enhance Network Access Security to Kubernetes - Azure Architecture Center,Secure AKS API access compared to Amazon EKS,"Understand networking options to help securely access the Kubernetes API server, and compare options in Amazon EKS and Azure Kubernetes Service (AKS).",This article compares networking modes for Azure Kubernetes Service (AKS) and Amazon Elastic Kubernetes Service (EKS). It describes how to improve connection security to the managed API server of an AKS cluster. It also includes options to restrict public network access. Note This article is part of aseries of articlesthat helps professionals who are familiar withAmazon EKSunderstandAzure Kubernetes Service (AKS).,2025-04-08T17:31:00.000Z,concept-article,migration-guides,0.78,True,"In the aws-professional/eks-to-aks path, compares private cluster and networking options between EKS and AKS with concrete security configuration guidance. This is expert, migration-oriented content for AWS users moving to Azure.",unchanged @@ -85,7 +84,7 @@ https://learn.microsoft.com/en-us/azure/architecture/best-practices/message-enco https://learn.microsoft.com/en-us/azure/architecture/best-practices/monitoring,Monitoring and diagnostics,Monitoring and diagnostics guidance - Azure Architecture Center,Design monitoring and diagnostics for Azure applications,"Learn how to track how users use your distributed applications and services, trace resource utilization, and monitor the health and performance.","Distributed applications and services running in the cloud are, by their nature, complex pieces of software that comprise many moving parts. 
In a production environment, it's important to be able to track the way in which users use your system, trace resource utilization, and generally monitor the health and performance of your system. You can use this information as a diagnostic aid to detect and correct issues, and also to help spot potential problems and prevent them from occurring.",2026-03-28T05:02:00Z,best-practice,best-practices,0.9,True,"Monitoring and diagnostics guidance under best-practices with specific recommendations on tracking usage, resource utilization, and health for distributed apps. Provides actionable implementation guidance and cross-cutting best practices rather than just conceptual overview, matching the best-practices category.",unchanged https://learn.microsoft.com/en-us/azure/architecture/best-practices/transient-faults,Transient fault handling,Transient Fault Handling - Azure Architecture Center,Handle transient faults in cloud-based applications,"Learn how to handle transient faults that the loss of network connectivity, temporary unavailability, or timeouts can cause.","All applications that communicate with remote services and resources must detect and recover from transient faults. This requirement is especially true for applications that run in the cloud. Because of the nature of the cloud environment and connectivity over the internet, your application is likely to encounter transient faults more often. Transient faults include the momentary loss of network connectivity to components and services, the temporary unavailability of a service, and timeouts that",2026-02-26T18:33:00.000Z,best-practice,best-practices,0.9,True,"Provides concrete strategies (retries, backoff, etc.) 
for transient fault handling with prescriptive guidance, matching best-practices.",unchanged https://learn.microsoft.com/en-us/azure/architecture/browse/,Browse all Architectures,Browse Azure Architectures - Azure Architecture Center,,"Find architecture diagrams and technology descriptions for reference architectures, real world examples of cloud architectures, and solution ideas for common workloads on Azure.","Find architecture diagrams and technology descriptions for reference architectures, real world examples of cloud architectures, and solution ideas for common workloads on Azure.",2025-11-05T18:33:00Z,browse-hub,,0.2,False,"Browse/navigation hub listing many architectures; no detailed implementation, configuration, or deployment guidance on this page itself.",unchanged -https://learn.microsoft.com/en-us/azure/architecture/changelog,What's new,What's New in Azure Architecture Center - Azure Architecture Center,,New and updated articles in Azure Architecture Center,"The Azure Architecture Center (AAC) helps you design, build, and operate solutions on Azure. Learn about the cloud architectural styles and design patterns. Use the technology choices and guides to decide the services that are right for your solution. The guidance is based on all aspects of building for the cloud, such as reliability, security, cost optimization, operations, and performance. 
The following new and updated articles have recently been published in the Azure Architecture Center.",2026-04-14T05:03:00.000Z,whats-new,,0.0,False,"Changelog/navigation page listing new and updated Azure Architecture Center articles; does not itself contain architectural guidance, patterns, or implementation details.",updated +https://learn.microsoft.com/en-us/azure/architecture/changelog,What's new,What's New in Azure Architecture Center - Azure Architecture Center,,New and updated articles in Azure Architecture Center,"The Azure Architecture Center (AAC) helps you design, build, and operate solutions on Azure. Learn about the cloud architectural styles and design patterns. Use the technology choices and guides to decide the services that are right for your solution. The guidance is based on all aspects of building for the cloud, such as reliability, security, cost optimization, operations, and performance. The following new and updated articles have recently been published in the Azure Architecture Center.",2026-04-24T05:04:00.000Z,whats-new,,0.0,False,Changelog/navigation page listing new and updated Azure Architecture Center articles; does not itself contain detailed architectural guidance or implementation specifics.,updated https://learn.microsoft.com/en-us/azure/architecture/containers/container-get-started,Get started,Container architecture design - Azure Architecture Center,,"Get an overview of Azure container technologies, guidance offerings, solution ideas, and reference architectures.","Containers have become the standard for packaging and deploying modern applications. Azure provides a comprehensive set of container services that range from fully managed Kubernetes clusters to serverless container platforms. Whether you're modernizing existing applications, building cloud-native microservices, or running stateful workloads, Azure container services offer the flexibility, portability, and scalability your organization needs. 
Choosing the right container platform depends on your",2026-03-27T05:03:00.000Z,concept-article,,0.2,False,"High-level overview of Azure container technologies and related guidance; no detailed deployment configurations, sizing guidance, or concrete implementation specifics that go beyond general conceptual knowledge.",unchanged https://learn.microsoft.com/en-us/azure/architecture/data-guide/ai-services/image-video-processing,Image and video processing,Azure AI Video Processing Guide - Azure Architecture Center,Choose Azure AI services for image and video processing,Learn about Foundry Tools for video processing and video generation. Understand and compare each service's capabilities and use cases.,"Foundry Toolshelps developers and organizations create AI-based, advanced, production-ready applications that align with responsible AI practices by using out-of-the-box, prebuilt, and customizable APIs and models. This article describes video and image processing capabilities in Tools, such as visual analysis and generation of images, object detection, image classification, and facial recognition. The suite includes the following services: Azure OpenAI in Foundry Modelsprovides access to the fo",2026-03-24T17:35:00.000Z,concept-article,technology-choices,0.86,True,"Compares multiple Azure AI/Foundry services for image and video analysis and generation, describing capabilities and use cases to help pick the right service. 
This is a decision/selection guide across services, matching technology-choices and containing concrete, scenario-based comparison knowledge.",unchanged https://learn.microsoft.com/en-us/azure/architecture/data-guide/ai-services/speech-recognition-generation,Speech recognition and generation,Choose an Azure Speech Recognition and Generation Technology - Azure Architecture Center,Select Azure speech recognition and generation services,"Learn about speech recognition and generation capabilities in Foundry Tools, including speech-to-text, text-to-speech, speech translation, and avatar creation.","Foundry Toolshelps developers and organizations create AI-based, advanced, production-ready applications that align with responsible AI practices by using out-of-the-box, prebuilt, and customizable APIs and models. This article describes speech-to-text (STT) and text-to-speech (TTS) capabilities in Tools. You can transcribe speech to text with high accuracy, produce natural-sounding TTS voices, translate spoken audio, and conduct live AI voice conversations. Create custom voices, add specific wo",2026-03-24T17:35:00.000Z,concept-article,technology-choices,0.86,True,"Describes and contrasts speech-to-text, text-to-speech, speech translation, and related capabilities in Foundry Tools, with guidance on when to use each. 
This is a service selection guide (comparison and decision criteria) under the AI services data guide, fitting technology-choices with expert, product-specific details.",unchanged @@ -97,15 +96,16 @@ https://learn.microsoft.com/en-us/azure/architecture/data-guide/disaster-recover https://learn.microsoft.com/en-us/azure/architecture/data-guide/relational-data/etl,ETL guide,"Extract, transform, load (ETL) - Azure Architecture Center",,"Learn about extract, transform, load (ETL) and extract, load, transform (ELT) data transformation pipelines, and how to use control flows and data flows.","Organizations commonly need to gather data from multiple sources in various formats and move it to one or more data stores. The destination might not be the same type of data store as the source, and the data often needs to be shaped, cleaned, or transformed before loading. Various tools, services, and processes help address these challenges. Regardless of the approach, you need to coordinate the work and apply data transformations within the data pipeline. The following sections highlight the c",2025-10-30T05:03:00Z,concept-article,,0.3,False,"Describes ETL/ELT concepts and pipeline components; while it may mention tools and services, it is a general data-pipeline overview rather than a detailed reference architecture, pattern, or best-practices document with explicit DOs/DON’Ts.",unchanged https://learn.microsoft.com/en-us/azure/architecture/data-guide/relational-data/online-analytical-processing,OLAP solutions,Online Analytical Processing - Azure Architecture Center,,Learn about online analytical processing solutions to organize large databases and support complex analysis without affecting transactional systems.,"Online analytical processing (OLAP) is a technology that organizes large business databases to perform complex calculations and trend analysis. This method enables intricate queries without disrupting transactional systems. 
Business transactions and records are stored in databases known asonline transaction processing (OLTP) databases, which are optimized for individual record entries. These databases hold valuable information, but they're not designed for analysis, so data retrieval is time-con",2025-04-22T19:38:00.000Z,concept-article,,0.2,False,"Explains OLAP concepts and how they relate to OLTP; primarily conceptual data-architecture guidance without Azure-specific deployment details, sizing, or concrete implementation steps.",unchanged https://learn.microsoft.com/en-us/azure/architecture/data-guide/relational-data/online-transaction-processing,OLTP solutions,Online Transaction Processing (OLTP) - Azure Architecture Center,,"Learn about atomicity, consistency, and other features of online transaction processing (OLTP), which manages transactional data while supporting querying.","The management of transactional data by using computer systems is referred to as online transaction processing (OLTP). OLTP systems record business interactions as they occur in the day-to-day operation of the organization, and support querying of this data to make inferences.",2026-02-07T06:02:00.000Z,concept-article,,0.2,False,"Conceptual explanation of OLTP characteristics (atomicity, consistency, etc.); lacks Azure-specific expert configuration guidance or pattern/anti-pattern structure.",unchanged -https://learn.microsoft.com/en-us/azure/architecture/data-guide/scenarios/data-lake,Data lakes,What is a data lake? - Azure Architecture Center,,"Learn about the advantages of using data lake storage repositories, which can hold terabytes and petabytes of data in its native, raw format.","A data lake is a storage repository that holds a large amount of data in its native, raw format. Data lake stores are designed to scale cost-effectively to terabytes and petabytes data, making them suitable for handling massive and diverse datasets. 
The data typically comes from multiple diverse sources and can include structured data (like relational tables), semi-structured data (like JSON, XML, or logs), and unstructured data (like images, audio, or video). A data lake helps you store everyth",2025-09-16T17:40:00.000Z,concept-article,,0.2,False,"Defines what a data lake is and its advantages; high-level scenario description without detailed Azure service configurations, deployment guidance, or pattern-style structure.",unchanged +https://learn.microsoft.com/en-us/azure/architecture/data-guide/scenarios/data-lake,Data lakes,What Is a Data Lake? - Azure Architecture Center,,"Learn about the advantages of using data lake storage repositories, which can store terabytes and petabytes of data in its native, raw format.","A data lake is a storage repository that holds large volumes of data in its native, raw format. Data lakes scale cost effectively to handle terabytes and petabytes of data, which makes them suitable for handling massive and diverse datasets. The data typically comes from many different sources and can include structured data like relational tables, semistructured data like JSON, XML, or log files, and unstructured data like images, audio, or video. 
Data lakes store all data types in their origin",2026-04-21T05:03:00.000Z,concept-article,,0.1,False,"Conceptual explanation of data lakes and their characteristics; does not match any specified sub-skill URL patterns (no reference-architectures/, solution-ideas/, patterns/, technology-choices/, architecture-styles/, best-practices/, antipatterns/, example-scenario/, industries/, migration/) and lacks detailed deployment guidance, patterns structure, or comparison tables.",updated https://learn.microsoft.com/en-us/azure/architecture/data-guide/scenarios/data-transfer,Data transfer options,Choose a Data Transfer Technology - Azure Architecture Center,Select Azure data transfer tools and services,"Learn about data transfer options like the Azure Import/Export service, Azure Data Box, Azure Data Factory, Fabric Data Factory, and command-line and graphical interface tools.","This article describes several options that you can use to transfer data to and from Azure, depending on your needs.",2025-11-25T06:04:00.000Z,concept-article,technology-choices,0.7,True,"Compares several Azure data transfer options (Data Box, Import/Export, Data Factory, etc.) and when to use each, functioning as a technology-choices guide.",unchanged https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/ai-services,AI services,Choose an Azure AI Technology - Azure Architecture Center,Select appropriate Azure AI services for applications,Learn about AI services that you can use in AI applications and data flows. Choose the appropriate service for your use case.,"Foundry Tools provides a suite of data science tools, models, and inferencing capabilities that support a range of functions. Most AI services require little to no AI expertise, so students, small-business owners, startups, and large enterprises can easily use them. Instead of building custom solutions, we recommend that you use these services to embed intelligent functionality into your workloads. 
In many cases, prebuilt models and software as a service (SaaS) solutions provide the necessary cap",2026-03-24T17:35:00.000Z,concept-article,technology-choices,0.86,True,"Page is under data-guide/technology-choices and compares multiple Azure AI services with guidance on which to use for specific AI application and data-flow scenarios. It provides selection criteria and trade-offs between services, which is specialized decision knowledge beyond generic model training data.",unchanged https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/analysis-visualizations-reporting,Analytics and reporting,Choose a Data Analytics and Reporting Technology in Azure - Azure Architecture Center,Choose Azure technologies for analytics and reporting,Evaluate big data analytics technology options for Azure. Use key selection criteria and a capability matrix to help you choose a data analytics technology.,The goal of most big data solutions is to provide insights into the data through analysis and reporting. Analysis and reporting can include preconfigured reports and visualizations or interactive data exploration.,2025-05-01T18:23:00.000Z,concept-article,technology-choices,0.95,True,"Under /technology-choices/, compares Azure analytics/reporting services with selection criteria and capability matrix, providing decision guidance.",unchanged -https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/analytical-data-stores,Analytical data stores,Choose an Analytical Data Store in Azure - Azure Architecture Center,Select the right analytical data store in Azure,Evaluate analytical data store options for big data by reviewing key selection criteria and a capability matrix to compare database models and features.,"In abig dataarchitecture, there's often a need for an analytical data store that serves processed data in a structured format that can be queried by using analytical tools. 
Analytical data stores that support querying of both hot-path and cold-path data are collectively referred to as theserving layer, ordata serving storage. The serving layer handles processed data from both the hot path and the cold path. In theLambda architecture, the serving layer is subdivided into two layers. Thespeed serv",2026-04-15T08:00:00.000Z,concept-article,technology-choices,0.86,True,"The page is a decision guide for choosing among Azure analytical data store options. It includes selection criteria and a capability matrix comparing multiple services and database models, which matches the technology-choices pattern (comparison tables, selection criteria). This is expert knowledge beyond generic LLM training because it encodes up-to-date, Azure-specific service comparisons and guidance.",updated +https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/analytical-data-stores,Analytical data stores,Choose an Analytical Data Store in Azure - Azure Architecture Center,Select the right analytical data store in Azure,Evaluate analytical data store options for big data by reviewing key selection criteria and a capability matrix to compare database models and features.,"In a big data architecture, there's often a need for an analytical data store that serves processed data in a structured format that can be queried by using analytical tools. Analytical data stores that support querying of both hot-path and cold-path data are collectively referred to as the serving layer, or data serving storage. The serving layer handles processed data from both the hot path and the cold path. In the Lambda architecture, the serving layer is subdivided into two layers. The speed serv",2026-04-15T08:00:00.000Z,concept-article,technology-choices,0.86,True,"The page is a decision guide for choosing among Azure analytical data store options. 
It includes selection criteria and a capability matrix comparing multiple services and database models, which matches the technology-choices pattern (comparison tables, selection criteria). This is expert knowledge beyond generic LLM training because it encodes up-to-date, Azure-specific service comparisons and guidance.",unchanged https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/batch-processing,Batch processing,Choose a batch processing technology - Azure Architecture Center,Select Azure batch processing technologies for big data,"Compare technology choices for big data batch processing in Azure, including key selection criteria and a capability matrix.",Big data solutions often consist of discrete batch processing tasks that contribute to the overall data processing solution. You can use batch processing for workloads that don't require immediate access to insights. Batch processing can complement real-time processing requirements. You can also use batch processing to balance complexity and reduce cost for your overall implementation. 
The fundamental requirement of batch processing engines is to scale out computations to handle a large volume o,2025-12-11T06:03:00.000Z,concept-article,technology-choices,0.95,True,"Under /technology-choices/, compares batch processing options with key selection criteria and capability matrix to guide service choice.",unchanged https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/data-storage,Big data storage,Choose a data storage technology - Azure Architecture Center,Select big data storage technologies in Azure,"Compare big data storage technology options in Azure, including key selection criteria and a capability matrix.","This article compares options for data storage for big data solutions—specifically, data storage for bulk data ingestion and batch processing, as opposed to analytical data stores or real-time streaming ingestion.",2024-10-04T17:31:00.000Z,concept-article,technology-choices,0.9,True,"Compares big data storage options with key selection criteria and capability matrix, fitting technology-choices.",unchanged https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/fabric-analytical-data-stores,Choose an analytical data store in Microsoft Fabric,Choose an Analytical Data Store in Microsoft Fabric - Azure Architecture Center,Choose an analytical data store in Microsoft Fabric,"Evaluate analytical data store options in Microsoft Fabric based on data volumes, types, compute engine, ingestion, transformation, and query patterns.","Analytical data stores are essential for storing, processing, and serving data to support various analytical workloads. Microsoft Fabric is a unified data platform that provides several analytical data stores as software as a service (SaaS). Each data store provides distinct capabilities to address different analytical requirements. 
Selecting the right analytical data store depends on factors such as data volume, data type, compute engine, ingestion and transformation patterns, query needs, access",2025-10-15T05:04:00.000Z,concept-article,technology-choices,0.95,True,"Under /technology-choices/, evaluates multiple Fabric analytical data stores with selection factors and trade-offs, matching technology-choices.",unchanged -https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/natural-language-processing,Natural language processing,Natural Language Processing Technology - Azure Architecture Center,Choose Azure natural language processing technologies,"Choose a natural language processing service for sentiment analysis, topic and language detection, key phrase extraction, and document categorization.","Natural language processing has many applications, such as sentiment analysis, topic detection, language detection, key phrase extraction, and document categorization. Specifically, you can use natural language processing to: As technology advances, you can use natural language processing to categorize and analyze text data. You can also use it to enhance interpretable AI functions across diverse domains. 
The integration of language models significantly enhances the capabilities of natural langu",2026-02-19T06:03:00.000Z,concept-article,technology-choices,0.75,True,"Provides comparison and selection guidance for NLP services (sentiment, key phrases, etc.), aligning with technology-choices.",unchanged +https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/fabric-deployment-patterns,Choose a Microsoft Fabric deployment pattern,Choose a Microsoft Fabric Deployment Pattern - Azure Architecture Center,Select the right Microsoft Fabric deployment pattern,"Evaluate four Microsoft Fabric deployment patterns to structure capacities, workspaces, and items based on your requirements.","When you deploy Microsoft Fabric, you need to decide how to structure capacities, workspaces, and items across your organization. The right deployment pattern depends on your requirements for governance, security, performance isolation, and cost management. This guide helps architects and platform teams evaluate four deployment patterns and understand the considerations and trade-offs for each one.",2026-04-20T17:34:00.000Z,concept-article,technology-choices,0.78,True,"The page is a decision guide for choosing among multiple Microsoft Fabric deployment patterns. It discusses trade-offs and considerations (governance, security, performance isolation, cost management) and lives under a technology-choices path. 
It provides expert, scenario-specific selection guidance rather than just conceptual overview, matching the technology-choices category.",new +https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/natural-language-processing,Natural language processing,Choose a Natural Language Processing Technology - Azure Architecture Center,Select Azure services for natural language processing workloads,"Choose a natural language processing technology for sentiment analysis, topic and language detection, key phrase extraction, and document categorization.","Natural language processing encompasses techniques that analyze, understand, and generate human language from text data. Azure provides managed API-driven services and distributed open-source frameworks that address natural language processing workloads that range from sentiment analysis and entity recognition to document classification and text summarization. This guide helps you evaluate and choose from the primary natural language processing options on Azure so that you can match the right te",2026-04-21T17:34:00.000Z,concept-article,technology-choices,0.86,True,"The page is under data-guide/technology-choices and compares multiple Azure NLP options (managed APIs vs frameworks) with selection criteria for scenarios like sentiment analysis, entity recognition, and document classification. 
It provides decision guidance and trade-offs between services, which is specific, expert decision knowledge rather than a generic overview.",updated https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/pipeline-orchestration-data-movement,Pipeline orchestration,Choose a Data Pipeline Orchestration Technology - Azure Architecture Center,Choose Azure data pipeline orchestration services,"Choose an Azure data pipeline orchestration technology to automate pipeline orchestration, control flow, and data movement workflows.","Most big data solutions consist of repeated data processing operations, encapsulated in workflows. A pipeline orchestrator helps automate these workflows. It can schedule jobs, run workflows, and coordinate dependencies among tasks.",2025-11-08T06:02:00.000Z,concept-article,technology-choices,0.9,True,Data-guide technology-choices article comparing orchestration technologies with selection criteria and capability matrix.,unchanged https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/search-options,Search data store,Choose a search data store - Azure Architecture Center,Choose an Azure search data store technology,Learn about the capabilities of search data stores in Azure and the key criteria for choosing one that best matches your needs.,"This article compares technology choices for search data stores in Azure. A search data store is used to create and store specialized indexes for performing searches on free-form text. The text that is indexed might reside in a separate data store, such as blob storage. An application submits a query to the search data store, and the result is a list of matching documents. 
For more information about this scenario, see Processing free-form text for search.",2025-03-20T17:31:00.000Z,concept-article,technology-choices,0.85,True,"Explicit comparison of Azure search data store options with criteria, under technology-choices/, matching decision-guide behavior.",unchanged https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/stream-processing,Stream processing,Choose a Stream Processing Technology - Azure Architecture Center,Compare Azure real-time stream processing services,"Compare options for real-time message stream processing in Azure, with key selection criteria and a capability matrix.",This article compares technology choices for real-time stream processing in Azure.,2025-11-14T06:02:00.000Z,concept-article,technology-choices,0.9,True,"Explicitly compares stream processing options with selection criteria and capability matrix, under technology-choices/.",unchanged @@ -136,24 +136,24 @@ https://learn.microsoft.com/en-us/azure/architecture/example-scenario/apps/measu https://learn.microsoft.com/en-us/azure/architecture/example-scenario/apps/sap-production,SAP deployment using an Oracle database,SAP deployment in Azure using an Oracle database - Azure Architecture Center,,"Learn proven practices for running SAP on Oracle in Azure, with high availability.","This reference architecture shows a set of proven practices for running a high-availability SAP NetWeaver with Oracle Database on Azure. The architecture principles are OS-agnostic, however, unless otherwise specified, the architecture is assumed to be based on Linux. The first diagram shows a reference architecture for SAP on Oracle in Azure. We recommend that you deploy across two availability zones. Download a Visio file of this architecture and related architectures. 
Note To deploy this refere",2025-04-17T17:31:00Z,example-scenario,,0.0,False,Parse error: Expecting value: line 1 column 1 (char 0),unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/apps/scalable-apps-performance-modeling-site-reliability,Scalable cloud applications and site reliability engineering (SRE),Scalable Cloud Applications and SRE - Azure Architecture Center,Apply SRE principles to build scalable Azure API platforms,Build scalable cloud applications by using performance modeling and other principles and practices of site reliability engineering (SRE).,"The success of your cloud solution depends on its reliability. Reliability is the probability that the system functions as expected, under specified conditions, within a specified time. Site reliability engineering (SRE) is a set of principles and practices for creating scalable and highly reliable software systems. SRE is a standard approach for designing digital services to support reliability goals in your workload. 
This article demonstrates how to apply SRE principles to a scalable API platf",2026-03-28T05:02:00Z,example-scenario,example-workloads,0.78,True,"URL contains example-scenario/apps and the article demonstrates applying SRE principles to a scalable API platform, implying a concrete workload architecture and implementation details beyond generic SRE theory, which aligns with example-workloads.",unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/azure-virtual-desktop/azure-virtual-desktop-multi-region-bcdr,Multiregion BCDR for Azure Virtual Desktop,Multiregion BCDR for Azure Virtual Desktop - Azure Architecture Center,Implement multiregion BCDR for Azure Virtual Desktop,Learn about the possible options and scenarios so that you can design and implement an effective multiregion BCDR strategy for Azure Virtual Desktop.,"This article provides implementation-level architecture and configuration guidance for deploying Azure Virtual Desktop with multiregion business continuity and disaster recovery (BCDR). It describes how FSLogix stores user profiles in virtual hard disk (VHD) containers and how cloud cache replicates profiles across regions. It also outlines BCDR model options, including active-active, active-passive, and personal host pools that use Azure Site Recovery, along with FSLogix cloud cache configuration",2026-03-25T05:05:00.000Z,concept-article,example-workloads,0.9,True,"URL contains example-scenario/, and the summary explicitly states implementation-level architecture and configuration guidance for multiregion BCDR with Azure Virtual Desktop, FSLogix, Cloud Cache, and Azure Site Recovery. 
This is a detailed, scenario-specific implementation guide with concrete configuration options, fitting example-workloads.",unchanged -https://learn.microsoft.com/en-us/azure/architecture/example-scenario/certificate-lifecycle/,Certificate lifecycle management on Azure,Certificate Lifecycle Management on Azure - Azure Architecture Center,Automate certificate lifecycle management with nonintegrated CAs on Azure,Learn how to create an infrastructure and workflow to automate the renewal process for certificates issued by a nonintegrated certification authority (CA).,"In cybersecurity, setting up automatic certificate renewal is important to maintaining a secure and reliable environment. Failure to update or renew certificates in a timely manner exposes systems to vulnerabilities. Potentially vulnerable areas include: Azure Key Vault supportsautomatic certificate renewalissued by an integrated certification authority (CA) such asDigiCertorGlobalSign. For a nonintegrated CA, amanualapproach is required. This article bridges the gap by providing an automatic re",2025-09-04T17:35:00Z,example-scenario,example-workloads,0.86,True,example-scenario/certificate-lifecycle URL; detailed workflow and infrastructure for automating renewals with nonintegrated CAs using Key Vault and other Azure components.,unchanged +https://learn.microsoft.com/en-us/azure/architecture/example-scenario/certificate-lifecycle/,Certificate life cycle management on Azure,Certificate Life Cycle Management on Azure - Azure Architecture Center,Implement automated certificate lifecycle management on Azure,Learn how to create an infrastructure and workflow to automate the renewal process for certificates issued by a nonintegrated certification authority (CA).,"Organizations that use an internal or nonintegrated certificate authority (CA) often rely on manual processes to renew Transport Layer Security (TLS) and Secure Sockets Layer (SSL) certificates. 
Manual renewal can lead to expired certificates that cause service outages, like when a web server certificate expires unnoticed and disrupts customer-facing applications. Azure Key Vault supports automatic certificate renewal for integrated CAs like DigiCert or GlobalSign, but nonintegrated CAs require amanu",2026-04-22T17:33:00Z,example-scenario,example-workloads,0.78,True,"URL contains example-scenario/, and the page describes a concrete implementation for automating TLS/SSL certificate renewal with a nonintegrated CA using Azure services. It goes beyond a high-level solution idea by detailing workflow, components, and operational considerations for a specific use case (certificate lifecycle), matching the example-workloads criteria.",new https://learn.microsoft.com/en-us/azure/architecture/example-scenario/data/data-warehouse,Data warehousing and analytics,Data warehousing and analytics - Azure Architecture Center,Build a unified Azure data warehouse and analytics pipeline,This example demonstrates a data pipeline that integrates large amounts of data from multiple sources into a unified analytics platform in Azure.,"This example scenario demonstrates a data pipeline that integrates large amounts of data from multiple sources into a unified analytics platform in Azure. 
This specific scenario is based on a sales and marketing solution, but the design patterns are relevant for many industries requiring advanced analytics of large datasets such as e-commerce, retail, and healthcare.",2025-09-04T17:35:00Z,example-scenario,example-workloads,0.9,True,"Under /example-scenario/data/, provides a detailed data warehousing and analytics pipeline scenario with specific services and flows.",unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/data/esri-arcgis-azure-virtual-desktop,Esri ArcGIS on Azure Virtual Desktop,Esri ArcGIS Platform on Azure Virtual Desktop - Azure Architecture Center,Implement Esri ArcGIS Pro on Azure Virtual Desktop,View a reference architecture that shows how to deploy ArcGIS Pro in Azure Virtual Desktop to support the hyperscale of Azure.,This architecture describes how to deploy Esri ArcGIS Pro in Azure Virtual Desktop to support the hyperscale of Azure. The architecture also includes back-end components like ArcGIS Enterprise to build a complete system on Azure. ArcGIS® is a trademark of its company. No endorsement is implied by the use of this mark.,2026-03-18T17:35:00Z,example-scenario,example-workloads,0.78,True,"URL contains example-scenario/, and the page describes a concrete implementation of Esri ArcGIS Pro and ArcGIS Enterprise on Azure Virtual Desktop with a full system architecture. 
Example scenarios in the Architecture Center typically include detailed component choices, configuration considerations, and deployment guidance for a specific industry/use case (GIS/VDI), which goes beyond generic knowledge and into expert, scenario-specific implementation details.",unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/data/greenfield-lakehouse-fabric,Greenfield lakehouse on Microsoft Fabric,Greenfield Lakehouse on Microsoft Fabric - Azure Architecture Center,Implement a greenfield Microsoft Fabric lakehouse platform,"Learn about a greenfield solution for creating a robust, scalable data platform by using the lakehouse design paradigm on Microsoft Fabric.","This example workload describes a greenfield solution for creating a scalable data platform by using Microsoft Fabric and the lakehouse design paradigm. Fabric is a platform that integrates data storage, processing, and analytics. A greenfield lakehouse provides a clean start for designing an efficient, future-proof data ecosystem.",2026-02-20T06:03:00Z,reference-architecture,example-workloads,0.86,True,"URL contains example-scenario/ and describes a concrete greenfield lakehouse implementation on Fabric with platform-specific architecture and implementation details, fitting the example-workloads definition.",unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/data/real-time-lakehouse-data-processing,Near real-time lakehouse processing,Use Azure Synapse Analytics for Near Real-Time Lakehouse Data Processing - Azure Architecture Center,Implement near real-time lakehouse processing with Synapse,"Use Azure Event Hubs, Azure Synapse Analytics, and Azure Data Lake Storage to create an end-to-end, near real-time data lakehouse data processing solution.","Data-driven enterprises need to keep their back-end and analytics systems in near real-time sync with customer-facing applications. 
The effects of transactions, updates, and changes must reflect accurately through end-to-end processes, related applications, and online transaction processing (OLTP) systems. The tolerable latency for changes in OLTP applications to reflect in the downstream systems that use the data might only be a few minutes. This article describes an end-to-end solution for nea",2025-09-22T17:34:00Z,example-scenario,example-workloads,0.9,True,"Under /example-scenario/data/, details an end-to-end near real-time lakehouse solution using Event Hubs, Synapse, and Data Lake Storage.",unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/data/small-medium-data-warehouse,Modern data warehouse for small business,Modern Data Warehouses for Small or Medium-Sized Businesses - Azure Architecture Center,Modernize SMB data warehouses with Fabric and Azure SQL,"Learn how to use Microsoft Fabric, Azure SQL Database, Azure SQL Managed Instance, and Azure Data Lake Storage to modernize legacy and on-premises data.","This article describes ways that small or medium-sized businesses can migrate and modernize legacy data stores within their current budgets and skill set. It shows how to progressively explore big data tools and capabilities. These data warehousing solutions integrate with Azure Machine Learning, Foundry Tools, Microsoft Power Platform, Dynamics 365, and other Microsoft technologies. 
These solutions provide an initial entry point to Microsoft Fabric, which is a managed software as a service (Saa",2026-01-13T18:33:00Z,example-scenario,example-workloads,0.9,True,"Under /example-scenario/data/, focuses on SMB-specific modernization paths with concrete services and progressive implementation details.",unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/dataplate2e/data-platform-end-to-end,Analytics end to end,Analytics End-to-End with Microsoft Fabric - Azure Architecture Center,Implement end-to-end analytics with Microsoft Fabric,Learn how to use Microsoft data services to build a modern analytics platform capable of handling the most common data challenges in an organization.,"The solution in this article combines a range of Microsoft services that ingest, store, process, enrich, and serve data and insights from different sources. These sources include structured, semistructured, unstructured, and streaming formats.",2025-12-17T18:43:00Z,example-scenario,example-workloads,0.9,True,"URL under /example-scenario/ and describes a concrete end-to-end analytics implementation using specific Microsoft services, fitting example-workloads.",unchanged -https://learn.microsoft.com/en-us/azure/architecture/example-scenario/devops/automated-api-deployments-apiops,Automate API deployments with APIOps,Automated API deployments using APIOps - Azure Architecture Center,Implement automated APIOps deployments with API Management,"Use APIOps with an Azure API Management instance to build and deploy APIs. This solution provides self-service tools, auditing, policy enforcement, and early feedback.","APIOps is a methodology that applies the concepts of GitOps andDevOpsto API deployment. Like DevOps,APIOpshelps team members easily make changes and deploy them in an iterative and automated way. 
This architecture demonstrates how you can improve the entire API lifecycle and API quality by using APIOps.",2026-04-15T05:05:00Z,example-scenario,example-workloads,0.7,True,"URL contains example-scenario/devops/, indicating an example workload. The page describes a concrete APIOps implementation for API Management with lifecycle improvements, which goes beyond a high-level solution idea and aligns with detailed, scenario-specific implementation guidance.",updated -https://learn.microsoft.com/en-us/azure/architecture/example-scenario/devops/manage-microsoft-365-tenant-configuration-microsoft365dsc-devops,Manage Microsoft 365 with DevOps,Manage Microsoft 365 Tenant Configuration with Azure DevOps - Azure Architecture Center,Manage Microsoft 365 tenant configuration using Azure DevOps,Learn how to manage Microsoft 365 tenant configuration by using Microsoft365DSC and Azure DevOps.,This article describes a solution that tracks changes that service administrators make and adds an approval process to deployments to Microsoft 365 tenants. It can help you prevent untracked changes to Microsoft 365 tenants and prevent configuration drift between multiple Microsoft 365 tenants.,2026-04-15T05:05:00Z,example-scenario,example-workloads,0.7,True,"URL contains example-scenario/devops/, and the article details a specific implementation (Microsoft365DSC plus Azure DevOps) to manage Microsoft 365 tenant configuration and approvals, matching the example-workloads definition as a concrete, scenario-focused architecture.",updated +https://learn.microsoft.com/en-us/azure/architecture/example-scenario/devops/automated-api-deployments-apiops,Automate API deployments with APIOps,Automated API deployments using APIOps - Azure Architecture Center,Implement automated APIOps deployments with API Management,"Use APIOps with an Azure API Management instance to build and deploy APIs. 
This solution provides self-service tools, auditing, policy enforcement, and early feedback.","APIOps is a methodology that applies the concepts of GitOps and DevOps to API deployment. Like DevOps, APIOps helps team members easily make changes and deploy them in an iterative and automated way. This architecture demonstrates how you can improve the entire API lifecycle and API quality by using APIOps.",2026-04-15T05:05:00Z,example-scenario,example-workloads,0.7,True,"URL contains example-scenario/devops/, indicating an example workload. The page describes a concrete APIOps implementation for API Management with lifecycle improvements, which goes beyond a high-level solution idea and aligns with detailed, scenario-specific implementation guidance.",unchanged +https://learn.microsoft.com/en-us/azure/architecture/example-scenario/devops/manage-microsoft-365-tenant-configuration-microsoft365dsc-devops,Manage Microsoft 365 with DevOps,Manage Microsoft 365 Tenant Configuration with Azure DevOps - Azure Architecture Center,Manage Microsoft 365 tenant configuration using Azure DevOps,Learn how to manage Microsoft 365 tenant configuration by using Microsoft365DSC and Azure DevOps.,This article describes a solution that tracks changes that service administrators make and adds an approval process to deployments to Microsoft 365 tenants. 
It can help you prevent untracked changes to Microsoft 365 tenants and prevent configuration drift between multiple Microsoft 365 tenants.,2026-04-15T05:05:00Z,example-scenario,example-workloads,0.7,True,"URL contains example-scenario/devops/, and the article details a specific implementation (Microsoft365DSC plus Azure DevOps) to manage Microsoft 365 tenant configuration and approvals, matching the example-workloads definition as a concrete, scenario-focused architecture.",unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/file-storage/enterprise-file-shares-disaster-recovery,Enterprise file shares with DR,Enterprise file shares with disaster recovery - Azure Architecture Center,Implement resilient Azure NetApp Files shares with disaster recovery,Learn how to implement resilient NetApp file shares. Failure of the primary Azure region causes automatic failover to the secondary Azure region.,"This architecture provides file shares that fail over automatically to a backup region in case of failure. The failover is transparent to the clients and applications that access the shares. The shares can be used for applications and virtual desktops that must be resilient to disruption, whether planned or unplanned. Azure NetApp Files provides the file shares. Its cross-region replication capability replicates the shares from the primary region to the secondary. 
Distributed File System (DFS) N",2025-04-17T17:31:00Z,example-scenario,example-workloads,0.88,True,"example-scenario/file-storage URL; detailed DR architecture using ANF cross-region replication, DFS-N, and client failover behavior for enterprise file shares.",unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/file-storage/moodle-azure-netapp-files,Moodle deployment with ANF,Moodle Deployment with Azure NetApp Files - Azure Architecture Center,Deploy scalable Moodle on Azure with NetApp Files,"Deploy Moodle by using Azure NetApp Files for a resilient solution that provides high-throughput, low-latency access to scalable shared storage.","Moodle is an open-source learning management system that requires high-throughput, low-latency access to storage. Many Moodle deployments require easy scalability to adapt to growing demand. This article explains how you can deploy Moodle by using Azure services on Azure Virtual Machine Scale Sets and store user-accessible learning data files in Azure NetApp Files. This article describes a zonal deployment for high availability and cross-zone replication and also gives examples of a single-zone ",2025-10-10T17:33:00Z,example-scenario,example-workloads,0.9,True,"example-scenario/file-storage URL; concrete deployment guidance for Moodle using VM Scale Sets, ANF, zonal deployment, and replication options.",unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/file-storage/oracle-azure-netapp-files,Oracle Database with Azure NetApp Files,Oracle Database with Azure NetApp Files - Azure Architecture Center,Run Oracle Database on Azure VMs with Azure NetApp Files,"Implement a high-bandwidth, low-latency solution for Oracle Database workloads. Use Azure NetApp Files for enterprise-scale performance and reduced costs.","The most demanding Oracle Database workloads require substantial I/O capacity. They also need low-latency access to storage. 
This document describes a scalable, high-bandwidth, low-latency solution for running Oracle Database workloads on Azure virtual machines (VMs) with shared file access via the network file system (NFS) protocol. The architecture uses Azure NetApp Files, a first-party Azure shared file-storage service.",2025-04-21T17:30:00Z,example-scenario,example-workloads,0.9,True,"example-scenario/file-storage URL; detailed architecture for high-bandwidth, low-latency Oracle workloads using ANF and NFS on Azure VMs.",unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/file-storage/sql-server-azure-netapp-files,SQL Server on VMs with Azure NetApp Files,SQL Server on Azure Virtual Machines With Azure NetApp Files - Azure Architecture Center,Migrate SQL Server to Azure VMs using Azure NetApp Files,"Learn how to implement a high-bandwidth, low-latency solution for SQL Server workloads. Use Azure NetApp Files for enterprise-scale performance and reduced costs.","This article provides guidance about how to migrate SQL Server workloads to Azure Virtual Machines by using Azure NetApp Files. Azure NetApp Files is an enterprise-class file storage service that delivers high-performance, low-latency, and scalable storage via the Server Message Block (SMB) protocol. These capabilities make it well-suited for online transaction processing (OLTP) applications. 
This migration strategy enables cost-effective deployment while preserving enterprise-grade performance ",2025-10-03T05:02:00Z,example-scenario,example-workloads,0.9,True,example-scenario/file-storage URL; migration-focused architecture for SQL Server OLTP workloads on Azure VMs with ANF SMB shares and performance considerations.,unchanged -https://learn.microsoft.com/en-us/azure/architecture/example-scenario/forensics/,Computer forensics,Computer Forensics Chain of Custody in Azure - Azure Architecture Center,Implement computer forensics chain of custody on Azure,Learn how to create an infrastructure and workflow to help ensure a valid digital evidence chain of custody (CoC) for computer forensics in Azure.,"This article outlines an infrastructure and workflow process designed to help teams provide digital evidence that demonstrates a valid chain of custody in response to legal requests. This article describes how to maintain a valid chain of custody throughout the stages of evidence acquisition, preservation, and access. Note This article is based on the theoretical and practical knowledge of the authors. Before you use it for legal purposes, validate its applicability with your legal department.",2026-04-14T05:03:00Z,example-scenario,example-workloads,0.78,True,"URL contains example-scenario/, and the article describes a concrete, detailed implementation of an Azure-based infrastructure and workflow for digital evidence chain of custody across acquisition, preservation, and access. 
It goes beyond a high-level solution idea by providing scenario-specific technical guidance for a specialized use case (computer forensics/legal evidence handling), but is not a generic reference architecture with full production sizing tables.",updated +https://learn.microsoft.com/en-us/azure/architecture/example-scenario/forensics/,Computer forensics,Computer Forensics Chain of Custody in Azure - Azure Architecture Center,Implement computer forensics chain of custody on Azure,Learn how to create an infrastructure and workflow to help ensure a valid digital evidence chain of custody (CoC) for computer forensics in Azure.,"This article outlines an infrastructure and workflow process designed to help teams provide digital evidence that demonstrates a valid chain of custody in response to legal requests. This article describes how to maintain a valid chain of custody throughout the stages of evidence acquisition, preservation, and access. Note This article is based on the theoretical and practical knowledge of the authors. Before you use it for legal purposes, validate its applicability with your legal department.",2026-04-14T05:03:00Z,example-scenario,example-workloads,0.78,True,"URL contains example-scenario/, and the article describes a concrete, detailed implementation of an Azure-based infrastructure and workflow for digital evidence chain of custody across acquisition, preservation, and access. 
It goes beyond a high-level solution idea by providing scenario-specific technical guidance for a specialized use case (computer forensics/legal evidence handling), but is not a generic reference architecture with full production sizing tables.",unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/gateway/application-gateway-before-azure-firewall,Zero Trust network for web applications,Implement a Zero Trust Network for Web Applications by Using Azure Firewall and Azure Application Gateway - Azure Architecture Center,Implement Zero Trust network for web apps with Azure Firewall,Protect web apps by implementing Zero Trust security in virtual networks. Use Azure Firewall and Azure Application Gateway in networks to maximize protection.,"This article describes how to implement Zero Trust security for web apps to enable inspection and end-to-end encryption. The Zero Trust model includes many other concepts, such as continuous identity verification and minimizing the size of the implicit trust areas. This article focuses on the encryption and inspection component of a Zero Trust architecture for inbound traffic from the public internet. 
For more information about other aspects of deploying your application securely, such as authen",2025-07-17T05:04:00.000Z,concept-article,example-workloads,0.86,True,example-scenario/gateway URL; detailed Zero Trust inbound traffic architecture using Firewall and App Gateway with encryption/inspection specifics.,unchanged
https://learn.microsoft.com/en-us/azure/architecture/example-scenario/gateway/firewall-application-gateway,Virtual Network security options,Azure Firewall and Application Gateway for Virtual Networks - Azure Architecture Center,Secure virtual networks with Azure Firewall and Application Gateway,Learn about options and best practices for how to use Azure Firewall and Azure Application Gateway security in virtual networks.,"To help secure Azure application workloads, use protective measures such as authentication and encryption in the applications themselves. You can add security layers to the virtual networks that host the applications. These security layers help protect the application's inbound flows from unintended use. They also limit outbound flows to the internet to only those endpoints that your application requires. This article describes Azure Virtual Network security services like Azure DDoS Protection, Az",2026-03-18T08:00:00.000Z,concept-article,example-workloads,0.7,True,"URL contains example-scenario/, and the description mentions options and best practices for combining Azure Firewall, Application Gateway, DDoS Protection, and other virtual network security services. 
Example-scenario pages typically include concrete architectures, configuration guidance, and scenario-specific recommendations that go beyond generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/gitops-aks/gitops-blueprint-aks,GitOps for AKS,GitOps for Azure Kubernetes Service - Azure Architecture Center,Operate AKS clusters using GitOps with Flux and Argo CD,Learn techniques for using GitOps principles to operate and manage an Azure Kubernetes Services (AKS) cluster. The solutions use Flux v2 and Argo CD.,"GitOps is a set of principles for operating and managing a software system. This article describes techniques for using GitOps principles to operate and manage an Azure Kubernetes Services (AKS) cluster. The Flux, Argo CD, OPA Gatekeeper, Kubernetes, and Git logos are trademarks of their respective companies. No endorsement is implied by the use of these marks.",2025-11-01T05:02:00Z,example-scenario,example-workloads,0.75,True,Example scenario under /example-scenario/gitops-aks/ with detailed GitOps techniques and tooling configuration for AKS; scenario-specific implementation rather than generic pattern.,unchanged
-https://learn.microsoft.com/en-us/azure/architecture/example-scenario/hybrid/aks-baseline,AKS on Azure Local baseline architecture,Azure Kubernetes Service (AKS) Baseline Architecture for AKS on Azure Local - Azure Architecture Center,Apply AKS baseline architecture on Azure Local,Learn how to design and implement a baseline architecture for Microsoft Azure Kubernetes Service (AKS) that runs on Azure Local.,"This scenario illustrates how to design and implement a baseline architecture for Microsoft Azure Kubernetes Service (AKS) that runs on Azure Local. This article includes recommendations for networking, security, identity, management, and monitoring of the cluster based on an organization's business requirements. 
Important The information in this article applies to AKS on Azure Local and AKS on Windows Server. The most recent version of AKS runs on the Azure Stack HCI, version 23H2 operating syst",2025-09-04T17:35:00Z,example-scenario,example-workloads,0.84,True,"Example-scenario for AKS on Azure Local with detailed recommendations for networking, security, identity, management, and monitoring in hybrid context.",unchanged
+https://learn.microsoft.com/en-us/azure/architecture/example-scenario/hybrid/aks-baseline,AKS on Azure Local baseline architecture,Azure Kubernetes Service (AKS) Baseline Architecture for AKS on Azure Local - Azure Architecture Center,Implement AKS baseline architecture on Azure Local,Learn how to design and implement a baseline architecture for Microsoft Azure Kubernetes Service (AKS) that runs on Azure Local.,"This scenario illustrates how to design and implement a baseline architecture for Microsoft Azure Kubernetes Service (AKS) that runs on Azure Local. This article includes recommendations for networking, security, identity, management, and monitoring of the cluster based on an organization's business requirements. Important The information in this article applies to AKS on Azure Local and AKS on Windows Server. The most recent version of AKS runs on the Azure Stack HCI, version 23H2 operating syst",2026-04-23T17:32:00Z,example-scenario,example-workloads,0.78,True,"URL contains example-scenario/, and the page describes a baseline AKS architecture with concrete recommendations for networking, security, identity, management, and monitoring on Azure Local/Stack. 
This is a detailed, scenario-specific implementation rather than a generic overview, matching example-workloads.",updated https://learn.microsoft.com/en-us/azure/architecture/example-scenario/hybrid/aks-hybrid-azure-local,Deploy apps with AKS enabled by Azure Arc on Azure Local,Deploy and Operate Apps with AKS Enabled by Azure Arc on Azure Local - Azure Architecture Center,Build GitOps pipelines for AKS on Azure Local with Arc,Build containerized app deployment pipelines for AKS on Azure Local with Azure Arc and GitOps. Learn about Flux automation for hybrid Kubernetes clusters.,"This article provides recommendations for building an app deployment pipeline for containerized apps on Azure Kubernetes Service (AKS) enabled by Azure Arc. The apps can run on Azure Local. The guidance is specifically for deployments that use Azure Arc and GitOps. Important The information in this article applies to AKS on Azure Local, version 23H2 (latest version).",2025-12-18T18:34:00Z,example-scenario,example-workloads,0.84,True,"Scenario-specific guidance for app deployment pipelines using AKS, Azure Arc, and Flux GitOps with concrete hybrid Kubernetes implementation details.",unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/hybrid/azure-files-on-premises-authentication,Azure files secured by AD DS,Azure Files Accessed from On-Premises and Secured by AD DS in a Private Network - Azure Architecture Center,Provide on-premises access to Azure Files with AD DS,Learn how to provide on-premises access to Azure Files with security provided by on-premises Windows Server Active Directory Domain Services (AD DS).,This architecture demonstrates one way to provide file shares in the cloud to on-premises users and applications that access files on Windows Server through a private endpoint.,2025-12-18T18:34:00Z,example-scenario,example-workloads,0.8,True,Example-scenario for securing Azure Files via on-premises AD DS over private endpoints with detailed hybrid identity 
and networking configuration.,unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/hybrid/secure-hybrid-messaging-client,Enhanced-security hybrid: client access,Enhanced-security hybrid messaging for client access - Azure Architecture Center,Secure hybrid messaging client access with MFA,Enhance your security in a client access scenario by using Microsoft Entra multifactor authentication.,"The article shows how to implement multifactor authentication for Outlook desktop clients that access Microsoft Exchange. There are four architectures that correspond to four different possibilities for the Microsoft Exchange that has the user mailbox: Note In the diagrams, black dashed lines show basic interactions between local Active Directory, Microsoft Entra Connect, Microsoft Entra ID, AD FS, and Web Application Proxy components. You can learn about these interactions in Hybrid identity req",2025-12-06T06:04:00Z,example-scenario,example-workloads,0.8,True,Example-scenario with multiple concrete architectures for Outlook desktop access to Exchange in hybrid setups using Entra MFA and identity components.,unchanged @@ -180,7 +180,6 @@ https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/
https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/mainframe-replication-precisely-connect,Mainframe data replication with Connect,Replicate Mainframe Data by Using Precisely Connect - Azure Architecture Center,Replicate mainframe data to Azure using Precisely Connect,Learn how to use Precisely Connect to replicate mainframe and midrange data to Azure in real time with change data capture.,This article describes how to use Precisely Connect to migrate mainframe and midrange systems to Azure. Precisely Connect provides real-time data replication from legacy systems to Azure by using change data capture (CDC) technology. 
This solution provides data consistency between on-premises mainframe environments and Azure while minimizing the effect on source system performance. The architecture supports various mainframe and midrange data sources and replicates data to Azure targets like Azu,2025-11-21T18:42:00Z,example-scenario,example-workloads,0.9,True,"URL contains example-scenario/mainframe and the article focuses on a specific product (Precisely Connect) for real-time CDC-based replication to Azure targets, providing a concrete workload implementation, which fits example-workloads.",unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/migrate-aix-workloads-to-azure-with-skytap,Migrate AIX workloads with Skytap,Migrate AIX workloads to Azure with Skytap - Azure Architecture Center,,This example illustrates a migration of AIX logical partitions (LPARs) to Skytap on Azure.,Skytap on Azure simplifies cloud migration for applications that run on IBM Power Systems. This example illustrates a migration of AIX logical partitions (LPARs) to Skytap on Azure and is based on best practices from recent customer experiences. A web app on Microsoft Azure gives users a modern interface for the resources running in LPARs on Skytap on Azure.,2025-09-20T05:02:00Z,example-scenario,,0.0,False,Parse error: Expecting value: line 1 column 1 (char 0),unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/migrate-ibm-i-series-to-azure-with-skytap,Migrate IBM i series to Azure with Skytap,Migrate IBM i series to Azure with Skytap - Azure Architecture Center,,This example architecture shows how to use the native IBM i backup and recovery services with Microsoft Azure components.,"This example architecture shows how to use the native IBM i backup and recovery services with Microsoft Azure components to quickly migrate IBM i workloads to Skytap on Azure. 
This native IBM Power9 infrastructure is hosted in an Azure datacenter, minimizing the latency between traditional workloads and those running natively on Azure. You get the reliability and reach of Azure, the flexibility to deploy and scale IBM i logical partitions (LPARs) on demand, plus full backup and recovery services",2025-09-20T05:02:00Z,example-scenario,,0.0,False,Parse error: Expecting value: line 1 column 1 (char 0),unchanged
-https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/migrate-unisys-dorado-mainframe-apps-with-astadia-micro-focus,Unisys Dorado migration,Unisys Dorado mainframe migration to Azure with Astadia and Micro Focus - Azure Architecture Center,,"Migrate Unisys Dorado mainframe systems with Astadia and Micro Focus products. Move to Azure without rewriting code, switching data models, or updating screens.","This solution migrates Unisys Dorado mainframe systems to Azure with Astadia and Micro Focus products, without rewriting code, switching data models, or updating screens.",2025-08-08T05:07:00Z,example-scenario,,0.0,False,Parse error: Expecting value: line 1 column 1 (char 0),unchanged
https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/modernize-mainframe-data-to-azure,Modernize mainframe midrange data,Modernize Mainframe and Midrange Data - Azure Architecture Center,Modernize IBM mainframe and midrange data on Azure,Learn how to modernize IBM mainframe and midrange data. Use a data-first approach to migrate the data to scalable and more secure cloud storage on Azure.,"Apache®, Spark, and the flame logo are either registered trademarks or trademarks of the Apache Software Foundation in the United States and/or other countries. No endorsement by The Apache Software Foundation is implied by the use of these marks. This article describes an end-to-end modernization plan for mainframe and midrange data sources. 
Modernization helps improve scalability and performance for your mission-critical workloads.",2025-09-20T05:02:00Z,example-scenario,example-workloads,0.84,True,"URL contains example-scenario/mainframe and the article describes an end-to-end modernization plan for mainframe and midrange data sources using Azure data services, which is a detailed scenario implementation fitting example-workloads.",unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/move-archive-data-mainframes,Move mainframe archive data to Azure,Move archive data from mainframe systems to Azure - Azure Architecture Center,Move archive data from mainframes to Azure storage,Learn about a reference architecture that shows how to move data from mainframe and midrange systems to Azure.,"This reference architecture shows how to move data from mainframe and midrange systems to Azure. In this architecture, archived data is serviced and used only in the mainframe system. Azure is used only as a storage medium.",2025-09-20T05:02:00Z,example-scenario,reference-architectures,0.8,True,Described as a reference architecture for moving archived mainframe/midrange data to Azure; includes specific Azure storage usage and data flow for this scenario.,unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/process-batch-transactions,Batch transaction processing,High-volume batch transaction processing - Azure Architecture Center,,Use Azure Kubernetes Service (AKS) and Azure Service Bus to implement high-volume batch transaction processing.,"The architecture uses AKS to implement compute clusters of the applications that process high-volume batches of transactions. The applications receive the transactions in messages from Service Bus topics or queues. The topics and queues can be at Azure datacenters in different geographic regions, and multiple AKS clusters can read input from them. 
Note This architecture suits a type of batch transaction processing that, on IBM mainframes, is often implemented by using the IBM MQ family of messag",2025-09-20T05:02:00Z,example-scenario,,0.0,False,Parse error: Expecting value: line 1 column 1 (char 0),unchanged
@@ -193,7 +192,7 @@ https://learn.microsoft.com/en-us/azure/architecture/example-scenario/manufactur
https://learn.microsoft.com/en-us/azure/architecture/example-scenario/manufacturing/teamcenter-plm-netapp-files,Teamcenter with Azure NetApp Files,Use Teamcenter PLM with Azure NetApp Files - Azure Architecture Center,,Learn how to use Azure NetApp Files as a storage solution for Siemens Teamcenter product lifecycle management (PLM).,The article demonstrates how to use Azure NetApp Files as a storage solution for Siemens Teamcenter product lifecycle management (PLM). It assumes familiarity with the Siemens Teamcenter baseline architecture on Azure. The Teamcenter baseline architecture helps understand the importance of selecting the right storage solution(s). It also underscores the value Azure NetApp Files provides in a Siemens Teamcenter PLM deployment on Azure.,2025-10-30T05:03:00Z,concept-article,,0.0,False,Parse error: Expecting value: line 1 column 1 (char 0),unchanged
https://learn.microsoft.com/en-us/azure/architecture/example-scenario/monitoring/monitoring-observable-systems-media,Real-time monitoring and observable systems for media,Build Real-Time Monitoring and Observable Systems for Media - Azure Architecture Center,Build real-time monitoring and observability for media telemetry,"Learn how to build scalable real-time monitoring architecture for media streaming by using Azure Event Hubs, Fabric eventstreams, and Data Activator.",This architecture describes a solution that provides near real-time monitoring and observability of systems and user device telemetry data. 
It focuses on a use case for the media industry.,2025-11-11T06:09:00Z,example-scenario,example-workloads,0.86,True,"example-scenario/monitoring URL; media-industry-specific telemetry and monitoring solution using Event Hubs, Fabric eventstreams, and Data Activator with detailed architecture and implementation steps.",unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/quantum/quantum-computing-integration-with-classical-apps,Quantum computing integration with classical apps,Quantum computing integration with classical apps - Azure Architecture Center,,Learn how to use Azure Quantum to implement hybrid quantum applications,"Classical computing is increasingly challenged with today's most complex compute problems - even at the scale of our most powerful supercomputers. Quantum computers hold the promise to dramatically extend our compute capabilities. By exploiting the properties of quantum physics to perform computations, they provide exponential speedups for certain types of problems. For example, quantum computers do exceptionally well with problems that require calculating a large number of possible combinations",2025-04-21T17:30:00Z,example-scenario,,0.0,False,Parse error: Expecting value: line 1 column 1 (char 0),unchanged -https://learn.microsoft.com/en-us/azure/architecture/example-scenario/security/virtual-machine-compliance,Manage virtual machine compliance,Manage virtual machine compliance - Azure Architecture Center,,Manage virtual machine compliance without impairing DevOps practices. Use Azure VM Image Builder and Azure Compute Gallery to minimize risk from system images.,This article describes how to manage virtual machine compliance without impairing DevOps practices. 
Use Azure VM Image Builder and Azure Compute Gallery to minimize risk from system images.,2025-04-17T17:31:00Z,example-scenario,,0.0,False,Parse error: Expecting value: line 1 column 1 (char 0),unchanged +https://learn.microsoft.com/en-us/azure/architecture/example-scenario/security/virtual-machine-compliance,Manage virtual machine compliance,Manage Virtual Machine Compliance - Azure Architecture Center,Implement compliant Azure VM images with DevOps,Manage virtual machine compliance without disrupting DevOps practices. Use Azure VM Image Builder and Azure Compute Gallery to minimize risk from system images.,This article describes how to manage virtual machine (VM) compliance without disrupting DevOps practices. Use Azure VM Image Builder and Azure Compute Gallery to minimize risk from system images. The solution consists of the gold image publishing process and the VM compliance tracking process.,2026-04-22T17:33:00Z,example-scenario,example-workloads,0.78,True,"The page is under an example-scenario path and describes a concrete implementation for managing VM compliance using Azure VM Image Builder and Azure Compute Gallery. It focuses on a specific operational use case (gold image publishing and VM compliance tracking) with practical, scenario-driven guidance that goes beyond high-level concepts, matching the example-workloads category more than solution-ideas or reference-architectures.",updated https://learn.microsoft.com/en-us/azure/architecture/example-scenario/serverless/microservices-with-container-apps,Microservices with Container Apps,Deploy Microservices to Azure Container Apps - Azure Architecture Center,Replatform Kubernetes microservices to Azure Container Apps,Learn how to replatform and deploy existing Kubernetes microservice applications workloads to run in Azure Container Apps.,"This scenario shows an existing workload originally designed for Kubernetes, which is replatformed to run in Azure Container Apps. 
Container Apps supports brownfield workloads where teams want to simplify infrastructure and container orchestration. The example workload is a containerized microservices application. It reuses the same workload defined inMicroservices architecture on Azure Kubernetes Service (AKS). This architecture rehosts the workload in Container Apps as its application platform",2025-11-26T18:33:00Z,example-scenario,example-workloads,0.85,True,Example scenario under /example-scenario/serverless/ showing detailed replatforming of an existing AKS microservices workload to Container Apps; includes concrete implementation details.,unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/serverless/microservices-with-container-apps-dapr,Microservices with Dapr and KEDA,Deploy Microservices with Azure Container Apps and Dapr - Azure Architecture Center,Build Dapr-based microservices on Azure Container Apps,Build a serverless microservices solution on Azure Container Apps by using Distributed Application Runtime (Dapr) and Kubernetes event-driven autoscaling (KEDA).,This article describes a solution for running an order management system that has 10 microservices on Azure Container Apps. The solution also uses microservices best practices through Distributed Application Runtime (Dapr) and event-driven scaling with Kubernetes event-driven autoscaling (KEDA). Dapr and Traefik are trademarks of their respective companies. 
No endorsement is implied by the use of these marks.,2025-10-10T17:33:00Z,example-scenario,example-workloads,0.85,True,"Example scenario for an order management system with 10 microservices using Container Apps, Dapr, and KEDA; detailed, scenario-specific architecture and configuration.",unchanged https://learn.microsoft.com/en-us/azure/architecture/example-scenario/unix-migration/migrate-aix-azure-linux,AIX UNIX to Azure Linux migration,AIX UNIX on-premises to Azure Linux migration - Azure Architecture Center,,"Migrate an on-premises IBM AIX system and web application to a highly available, secure RedHat Enterprise Linux solution in Azure.",This solution describes a migration from an IBM AIX Unix platform to Red Hat Enterprise Linux (RHEL) in Azure. The real-world example was a Health and Human Services application for a large customer. Low transaction time and latency were important requirements for both the legacy and the Azure systems. A key functionality is storing customer information in a database that links into a network file store containing related graphical images. Azure addresses this need with Azure NetApp Files.,2025-09-17T05:02:00Z,example-scenario,,0.0,False,Parse error: Expecting value: line 1 column 1 (char 0),unchanged
@@ -213,7 +212,7 @@ https://learn.microsoft.com/en-us/azure/architecture/guide/architecture-styles/m
https://learn.microsoft.com/en-us/azure/architecture/guide/architecture-styles/n-tier,N-tier application,N-tier Architecture Style - Azure Architecture Center,Implement N-tier architecture style on Azure,"Learn about the benefits, challenges, and best practices for N-tier architectures on Azure, which separates an app into logical layers and physical tiers.","An N-tier architecture divides an application into logical layers and physical tiers. The diagram starts with a client, then goes to a web application firewall (WAF), and then to the web tier. 
From the web tier, one flow goes to middle tier 1 and another flow goes to messaging and then middle tier 2. From middle tier 1, one flow goes to remote service and another flow goes to the data tier. From middle tier 2, the flow goes to the data tier. From the data tier, both flows reverse through cache, the",2025-09-19T05:03:00.000Z,concept-article,architecture-styles,0.95,True,"Describes N-tier as an architecture style with benefits, challenges, and best practices; URL under guide/architecture-styles/, matching architecture-styles.",unchanged https://learn.microsoft.com/en-us/azure/architecture/guide/architecture-styles/web-queue-worker,Web-queue-worker,Web-Queue-Worker Architecture Style - Azure Architecture Center,Use Web-Queue-Worker architecture style on Azure,"Learn about the Web-Queue-Worker pattern. Discover benefits, challenges, and implementation by using Azure App Service, Azure Functions, and message queues.","This architecture has two core components. A web front end handles client requests, and a worker does resource-intensive tasks, long-running workflows, or batch jobs. The web front end communicates with the worker through a message queue. On the left, a client connects to an identity provider and a web front end. The web front end connects to a remote service, a database, and a queue. The database points back to the web front end and goes through a cache. 
The queue connects to a worker, which points ",2026-01-30T18:35:00.000Z,concept-article,architecture-styles,0.9,True,"Although it references a pattern, the page is under guide/architecture-styles/ and describes Web-Queue-Worker as an architecture style with benefits, challenges, and implementation guidance, fitting architecture-styles per URL rule.",unchanged https://learn.microsoft.com/en-us/azure/architecture/guide/aws/aws-azure-security-solutions,Microsoft security for AWS,Microsoft Security Solutions for AWS - Azure Architecture Center,Secure AWS environments with Microsoft security solutions,Learn how Microsoft security solutions can help secure and protect Amazon Web Services (AWS) account access and environments.,"This guide describes how to use Microsoft security solutions to help secure Amazon Web Services (AWS) by applying defense-in-depth controls that align with Zero Trust principles. These solutions help ensure consistent visibility, enforcement, and protection across your AWS environment. The following diagram summarizes how AWS environments can benefit from Microsoft security components. The image shows a three‑column diagram where arrows connect Microsoft Azure services on the left, Benefits in t",2026-01-14T18:35:00.000Z,concept-article,migration-guides,0.7,True,"aws-azure-security-solutions guide maps Microsoft security services to AWS scenarios, providing concrete cross-cloud security implementation guidance. This is expert, multi-cloud adoption/migration content.",unchanged -https://learn.microsoft.com/en-us/azure/architecture/guide/azure-sandbox/azure-sandbox,Azure Sandbox,Azure Sandbox - Azure Architecture Center,Deploy Terraform-based Azure Sandbox environments,"Accelerate your Azure project by using a fully functional sandbox environment that includes virtual networks, virtual machines, and databases.","Azure Sandbox is a Terraform-based project designed to simplify the deployment of sandbox environments in Azure. 
It provides a modular and reusable framework for implementing foundational infrastructure, which can accelerate the development of innovative new solutions in Azure. The Azure Sandbox is composed of modular components that you can deploy individually or together, depending on your needs. Because a fully provisioned environment might be costly, you can manage expenses by stopping or de",2025-09-13T05:04:00.000Z,concept-article,reference-architectures,0.7,True,Describes a Terraform-based project for deploying modular sandbox environments with specific components and cost controls; effectively a deployable architecture with concrete implementation details.,unchanged +https://learn.microsoft.com/en-us/azure/architecture/guide/azure-sandbox/azure-sandbox,Azure Sandbox,Azure Sandbox - Azure Architecture Center,,"Accelerate your Azure project by using a fully functional sandbox environment that includes virtual networks, virtual machines, and databases.","Azure Sandbox is a Terraform-based project designed to simplify the deployment of sandbox environments in Azure. It provides a modular and reusable framework for implementing foundational infrastructure, which can accelerate the development of innovative new solutions in Azure. The Azure Sandbox is composed of modular components that you can deploy individually or together, depending on your needs. Because a fully provisioned environment might be costly, you can manage expenses by stopping or de",2026-04-24T17:33:00.000Z,concept-article,,0.3,False,"Terraform-based sandbox deployment guidance, but described at a high level as a project/framework. 
From the summary it lacks the detailed architecture diagrams, SKU-level configuration, or production sizing/deployment specifics required for reference-architectures or example-workloads classifications.",updated https://learn.microsoft.com/en-us/azure/architecture/guide/choose-azure-container-service,Container options,Choose an Azure Container Service - Azure Architecture Center,Select the right Azure container hosting service,Understand how to evaluate which Azure container service is best suited to your specific workload scenarios and requirements.,"Azure provides a range of container hosting services that are designed to accommodate various workloads, architectures, and business requirements. This container service selection guide can help your workload team understand which Azure container service is best suited to your workload scenarios and requirements. Note In this guide, workload refers to a collection of application resources that support a business goal or the implementation of a business process. A workload uses multiple services, l",2025-06-27T05:05:00.000Z,concept-article,technology-choices,0.9,True,"Guide under /guide/choose- with comparison of Azure container services, selection criteria, and trade-offs, which are service-specific decision details beyond generic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/architecture/guide/compute/high-performance-computing,High-performance computing,High-Performance Computing (HPC) on Azure - Azure Architecture Center,,"Learn about high-performance computing (HPC) on Azure, which uses many CPU or GPU-based computers to solve complex mathematical tasks.",,2026-03-12T17:42:00.000Z,reference-architecture,,0.2,False,"HPC on Azure guide pages are typically conceptual overviews of compute options, workloads, and general guidance rather than a specific reference architecture, pattern, or detailed implementation with SKUs, deployment repos, or stepwise configuration. 
URL path (/guide/compute/) does not match any required sub-skill patterns (reference-architectures, solution-ideas, patterns, technology-choices, architecture-styles, best-practices, antipatterns, example-scenario, migration).",unchanged https://learn.microsoft.com/en-us/azure/architecture/guide/container-service-general-considerations,Considerations,Architectural Considerations for Choosing an Azure Container Service - Azure Architecture Center,Choose the right Azure container service for workloads,Get a quick overview of common feature-level considerations that can help you choose an Azure container service. Part two of a series.,"This article describes how to choose an Azure container service. It provides an overview of feature-level considerations that are common and critical for some workloads. It can help you make decisions to ensure that your workload meets requirements for reliability, security, cost optimization, operational excellence, and performance efficiency. Note This article is part two in a series. 
We strongly recommend that you read part one first to get context for these architectural considerations.",2025-06-27T05:05:00.000Z,concept-article,technology-choices,0.8,True,"Guide about choosing an Azure container service with feature-level considerations; functions as a decision guide between multiple Azure container options, which is specialized selection knowledge.",unchanged @@ -229,9 +228,9 @@ https://learn.microsoft.com/en-us/azure/architecture/guide/design-principles/par https://learn.microsoft.com/en-us/azure/architecture/guide/design-principles/redundancy,Make all things redundant,Make all things redundant - Azure Architecture Center,,"Use these recommendations to avoid having single points of failure, by building redundancy into your application.",,2024-10-11T17:31:00.000Z,concept-article,,0.5,False,"Redundancy design principle with recommendations, but under guide/design-principles/ rather than best-practices/; not a pattern, style, or technology-choice comparison.",unchanged https://learn.microsoft.com/en-us/azure/architecture/guide/design-principles/scale-out,Design to scale out,Design to scale out - Azure Architecture Center,,"Use these recommendations to design your applications for horizontal scaling, which is the ability to use as much capacity as the application needs.",2024-09-23T17:31:00.000Z,concept-article,,0.5,False,Scale-out design principle; similar to other design-principles pages and not mapped to any of the defined sub-skill URL categories.,unchanged https://learn.microsoft.com/en-us/azure/architecture/guide/design-principles/self-healing,Design for self-healing,Design for Self-Healing - Azure Architecture Center,,"Learn how to design self-healing Azure applications that detect, respond to, and recover from failures to ensure high availability and resilience.","Failures are inevitable in a distributed system. Hardware can fail. The network can have transient failures. 
Entire services, datacenters, or even Azure regions rarely experience a disruption, but your workload architecture must account for those outages. Address resiliency and recovery early in your workload design. Design an application that's self-healing when failures occur. Use the following approach:",2025-08-23T05:04:00.000Z,concept-article,,0.5,False,"Self-healing design principle includes recommendations but URL is guide/design-principles/, not best-practices/; does not match any defined category’s URL and structural requirements.",unchanged -https://learn.microsoft.com/en-us/azure/architecture/guide/devops/deployment-scripts-property-check,Use deployment scripts to check resource properties,Use deployment scripts to check resource properties - Azure Architecture Center,,Use Bicep and a deployment script to pause a deployment in order to wait for a resource property to return a specific value.,"This article describes how to use Bicep and a deployment script to pause a deployment in order to wait for a resource property to return a specific value. You can use this technique to ensure that a deployment succeeds when the resource that's being deployed reports to Resource Manager that it's ready but the underlying resources aren't. When this happens, the resource that's being deployed isn't yet ready to interact with the remainder of the deployment, so a pause is required. 
This article use",2026-04-15T05:05:00Z,concept-article,,0.4,False,"Describes a specific deployment technique using Bicep and deployment scripts, but it is a focused how-to rather than an architecture, pattern, or cross-cutting best-practices article; URL path is guide/devops/ and doesn’t match any defined sub-skill paths, so it doesn’t fit the available categories.",updated +https://learn.microsoft.com/en-us/azure/architecture/guide/devops/deployment-scripts-property-check,Use deployment scripts to check resource properties,Use deployment scripts to check resource properties - Azure Architecture Center,,Use Bicep and a deployment script to pause a deployment in order to wait for a resource property to return a specific value.,"This article describes how to use Bicep and a deployment script to pause a deployment in order to wait for a resource property to return a specific value. You can use this technique to ensure that a deployment succeeds when the resource that's being deployed reports to Resource Manager that it's ready but the underlying resources aren't. When this happens, the resource that's being deployed isn't yet ready to interact with the remainder of the deployment, so a pause is required. This article use",2026-04-15T05:05:00Z,concept-article,,0.4,False,"Describes a specific deployment technique using Bicep and deployment scripts, but it is a focused how-to rather than an architecture, pattern, or cross-cutting best-practices article; URL path is guide/devops/ and doesn’t match any defined sub-skill paths, so it doesn’t fit the available categories.",unchanged https://learn.microsoft.com/en-us/azure/architecture/guide/devops/devops-start-here,Get started,DevOps architecture design - Azure Architecture Center,,"Learn about DevOps and how to implement DevOps solutions on Azure by using services such as Azure DevOps, Azure Pipelines, Azure Monitor, and Azure DevTest Labs.","The term DevOps derives from development and operations. 
It refers to the integration of development, quality assurance, and IT operations into a unified culture and set of processes for delivering software. For an overview of DevOps, see What is DevOps?. DevOps includes these activities and operations:",2025-10-15T05:04:00.000Z,concept-article,,0.2,False,DevOps overview and navigation page; high-level description of DevOps activities rather than detailed implementation guidance.,unchanged -https://learn.microsoft.com/en-us/azure/architecture/guide/devsecops/devsecops-on-aks,DevSecOps on AKS,DevSecOps on Azure Kubernetes Service (AKS) - Azure Architecture Center,,Learn how DevSecOps helps you incorporate security best practices from the start of software development.,"DevSecOps, also called Secure DevOps, builds on the practice of DevOps by incorporating security at different stages of a traditional DevOps lifecycle. Some of the benefits of building security in DevOps practices include: When DevSecOps is applied to Azure Kubernetes Service (AKS), different organization roles can have different considerations for implementing security. 
Examples of these different organization roles are: This article is broken out into different DevOps lifecycle stages with con",2026-04-15T05:05:00Z,concept-article,,0.4,False,"DevSecOps on AKS is lifecycle- and role-oriented security guidance without concrete SKUs, deployment scripts, or production-ready architecture implementation details; it reads as best-practice style content but URL doesn’t match best-practices/ and lacks the explicit DO/DON'T structure required for that category.",updated +https://learn.microsoft.com/en-us/azure/architecture/guide/devsecops/devsecops-on-aks,DevSecOps on AKS,DevSecOps on Azure Kubernetes Service (AKS) - Azure Architecture Center,,Learn how DevSecOps helps you incorporate security best practices from the start of software development and throughout the software deployment life cycle (SDLC).,"DevSecOps, also called Secure DevOps, builds on the practice of DevOps by incorporating security at different stages of a traditional DevOps life cycle. Build security into DevOps practices to: Make your applications and systems more secure, provide visibility into security threats, and prevent vulnerabilities from reaching deployed environments. Increase security awareness among your development and operation teams. Incorporate automated security processes into your software development life cyc",2026-04-24T17:33:00.000Z,concept-article,,0.2,False,"DevSecOps on AKS page appears to be lifecycle and practice-oriented, focusing on concepts and general security integration into DevOps. 
The summary does not indicate detailed architecture diagrams, concrete configuration tables, or implementation specifics that would qualify as expert knowledge under any listed sub-skill type.",updated https://learn.microsoft.com/en-us/azure/architecture/guide/hadoop/apache-kafka-migration,Apache Kafka migration to Azure,Apache Kafka Migration to Azure - Azure Architecture Center,Migrate Apache Kafka workloads to Azure services,Learn about Apache Kafka and about ways to migrate a Kafka implementation to Azure by using Azure services like HDInsight and Event Hubs.,"Apache Kafka is a highly scalable and fault tolerant distributed messaging system that implements a publish-subscribe architecture. It's used as an ingestion layer in real-time streaming scenarios, such as Internet of Things and real-time log monitoring systems. It's also used increasingly as the immutable append-only data store in Kappa architectures. Apache®, Apache Spark®, Apache Hadoop®, Apache HBase, Apache Storm®, Apache Sqoop®, Apache Kafka®, and the flame logo are either registered trademarks o",2025-10-30T05:03:00Z,concept-article,migration-guides,0.8,True,"Explicit migration guide for Kafka to Azure using HDInsight and Event Hubs, including service mapping and migration considerations.",unchanged https://learn.microsoft.com/en-us/azure/architecture/guide/iot/machine-learning-inference-iot-edge,Enable ML inference on IoT Edge,Enable machine learning inference on an Azure IoT Edge device - Azure Architecture Center,Enable ML inference on Azure IoT Edge devices,"Enable machine learning inference on an IoT Edge device. Scenarios include image classification, object detection, and body, face, and gesture analysis.","AI on the edge is one of the most popular edge scenarios. Implementations of this scenario include image classification, object detection, body, face, and gesture analysis, and image manipulation. This architecture guide describes how to use Azure IoT Edge to support these scenarios. 
You can improve AI accuracy by updating the AI model, but in some scenarios the edge device network environment isn't good. For example, in the wind power and oil industries, equipment might be located in the desert",2026-01-07T18:32:00Z,concept-article,example-workloads,0.84,True,"IoT guide with concrete architecture and implementation details for running ML models on IoT Edge, including deployment patterns and edge constraints beyond generic conceptual info.",unchanged https://learn.microsoft.com/en-us/azure/architecture/guide/iot/scale-iot-solution-azure,Scale your IoT Hub solutions,Scale Out an Azure IoT Hub-based Solution to Support Millions of Devices - Azure Architecture Center,Scale Azure IoT Hub solutions to millions of devices,"Learn how to scale out your Azure IoT Hub-based solution by using a scale-out pattern model, such as low-touch or zero-touch, to support millions of devices.",This article describes how to scale an Internet of Things (IoT) solution by using a scale-out pattern on the Azure IoT Hub platform. The scale-out pattern addresses scaling challenges by adding instances to a deployment instead of increasing instance size. This implementation guidance shows you how to scale an IoT solution that supports millions of devices and considers Azure service and subscription limits. 
The article outlines low-touch and zero-touch deployment models of the scale-out pattern,2025-10-12T08:00:00.000Z,concept-article,best-practices,0.7,True,"Guide article with specific implementation guidance and recommendations (low-touch/zero-touch scale-out patterns, service limits, deployment models) for large-scale IoT Hub solutions.",unchanged @@ -258,11 +257,10 @@ https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/hy https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/load-balancing-overview,Load balancing options,Load Balancing Options - Azure Architecture Center,Select Azure load balancing services for applications,Learn about Azure load balancing services and considerations to select one for distributing traffic across multiple computing resources.,"The term load balancing refers to the distribution of processing across multiple computing resources. You load balance to optimize resource usage, maximize throughput, minimize response time, and avoid overloading any single resource. Load balancing can also improve availability by sharing a workload across redundant computing resources. Azure provides various load balancing services that you can use to distribute your workloads across multiple computing resources. 
These services include Azure API",2025-11-12T18:34:00.000Z,concept-article,technology-choices,0.85,True,"Load-balancing-overview under technology-choices/ that compares Azure load balancing services and when to use each, including trade-offs.",unchanged https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/messaging,Asynchronous messaging,Asynchronous Messaging Options - Azure Architecture Center,Choose Azure asynchronous messaging services,"Learn about asynchronous messaging options in Azure, including the different types of messages and the entities that participate in a messaging infrastructure.","This article describes the different types of messages and the entities that participate in a messaging infrastructure. Based on message type requirements, Azure provides messaging services that include Azure Service Bus messaging, Azure Event Grid, and Azure Event Hubs. For more information, see Compare messaging services. At an architectural level, a producer entity creates a message datagram to distribute information. Consumer entities become aware of the information and act accordingly. The produ",2026-02-18T06:02:00.000Z,concept-article,technology-choices,0.85,True,"Describes messaging types and compares Azure Service Bus, Event Grid, and Event Hubs with guidance on when to use each, matching technology-choices.",unchanged https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/microservices-assessment,Microservices assessment and readiness,Microservices Assessment and Readiness - Azure Architecture Center,,Learn about the considerations to keep in mind when you move to a microservices architecture. Use this guide to assess your application's readiness.,"A microservices architecture can provide many benefits for your applications, including agility, scalability, and high availability. This architecture can also present challenges. 
When you build microservices-based applications or transform existing applications into a microservices architecture, you need to analyze and assess your current situation to identify areas that need improvement. This article describes some considerations to keep in mind when you move to a microservices architecture. Y",2025-07-30T05:05:00.000Z,concept-article,,0.4,False,"Microservices assessment/readiness is a decision/readiness guide, but not a technology comparison between Azure services nor an architecture style; does not match defined categories.",unchanged
-https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/service-for-java-comparison,Compare Java application hosting options,Compare Java application hosting options on Azure - Azure Architecture Center,Compare Java application hosting options on Azure,"Explore recommended strategies for hosting Java applications to Azure, including platform types and supportability.","Azure offers many options for teams to build and deploy Java applications. This article covers mainstream scenarios for Java on Azure and provides high-level planning suggestions and considerations. Apache®, Apache Kafka, Apache Struts, Apache Tomcat, and the flame logo are either registered trademarks or trademarks of the Apache Software Foundation in the United States and/or other countries. 
No endorsement by The Apache Software Foundation is implied by the use of these marks.",2026-02-11T06:05:00Z,concept-article,technology-choices,0.8,True,"Explicitly compares multiple Azure hosting options for Java with planning considerations, acting as a technology-choices decision guide.",unchanged https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/storage-options,Storage options,Choose an Azure storage service - Azure Architecture Center,Select the right Azure storage service,Use this guide to decide which Azure storage service best suits your application.,"Storage capabilities are critical for supporting workloads and services that are hosted in the cloud. As you prepare for your cloud adoption, review this information to plan for your storage needs.",2026-03-05T06:02:00.000Z,concept-article,technology-choices,0.9,True,"Technology-choices guide (URL contains guide/technology-choices) that compares multiple Azure storage services with selection criteria and trade-offs. Contains expert, up-to-date comparison details that go beyond generic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/technology-choices-overview,Overview,Technology Choices for Azure Solutions - Azure Architecture Center,Navigate Azure technology choice decision guides,"Learn where to find Azure technology guidance across compute, data, analytics, networking, and AI by using curated comparison and choice resources.","When you design an Azure solution, you must select the right technologies for your workload across compute, data, networking, and other areas. 
The following resources include comparison matrices, flowcharts, and decision trees to help you evaluate your options and find the best match for your requirements.",2026-02-18T06:02:00.000Z,concept-article,technology-choices,0.9,True,"Overview of technology choice resources with comparison matrices, flowcharts, and decision trees; URL under guide/technology-choices/, matching technology-choices.",unchanged https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/vector-search,Vector search,Choose an Azure Service for Vector Search - Azure Architecture Center,Select an Azure service for vector search,Learn how to use this information to decide which Azure service for vector search best suits your application.,"Vector search is a method of finding information stored in a database in the shape of vectors. Vectors are groups of numbers that represent features or characteristics of media, such as text or images. Vectors are a significant advancement over traditional keyword-based search methods. They provide faster, more accurate results by capturing and comparing semantic relationships within the information. Azure provides multiple ways to store and search vectorized data. This article helps architects ",2026-02-18T06:02:00.000Z,concept-article,technology-choices,0.95,True,"Container/guide/technology-choices/ article that compares multiple Azure vector search options and when to use each, including scenario-specific guidance.",unchanged -https://learn.microsoft.com/en-us/azure/architecture/guide/testing/load-testing/load-testing-with-custom-plugins,Azure Load Testing for IoT Hub,Azure Load Testing with Custom Plugins for Event Hubs and IoT Hub to Simulate Device Behaviors - Azure Architecture Center,Load test Event Hubs and IoT Hub with custom JMeter plugins,Learn how to design KPIs and develop a dashboard for Azure Load Testing with custom JMeter plugins to simulate device behaviors.,"Solution ideas This article describes a solution idea. 
Your cloud architect can use this guidance to help visualize the major components for a typical implementation of this architecture. Use this article as a starting point to design a well-architected solution that aligns with your workload's specific requirements. This solution provides guidance for how to use Azure Load Testing, a service that lets you run Apache JMeter scripts and custom plugins to simulate user and device behaviors. This s",2025-03-20T17:31:00.000Z,solution-idea,solution-ideas,0.9,True,"Explicitly labeled solution idea; provides architecture using Azure Load Testing, JMeter plugins, KPIs, and dashboards to simulate device behaviors—scenario-specific but not full reference architecture.",unchanged +https://learn.microsoft.com/en-us/azure/architecture/guide/testing/load-testing/load-testing-with-custom-plugins,Azure Load Testing for IoT Hub,Azure Load Testing with Custom Plugins for Event Hubs and IoT Hub to Simulate Device Behaviors - Azure Architecture Center,Simulate device behaviors with Azure Load Testing and custom plugins,Learn how to design KPIs and develop a dashboard for Azure Load Testing with custom JMeter plugins to simulate device behaviors.,"Solution ideas This article describes a solution idea. Your cloud architect can use this guidance to help visualize the major components for a typical implementation of this architecture. Use this article as a starting point to design a well-architected solution that aligns with your workload's specific requirements. This solution provides guidance for how to use Azure Load Testing, a service that lets you run Apache JMeter scripts and custom plugins to simulate user and device behaviors. This s",2026-04-21T08:00:00.000Z,solution-idea,solution-ideas,0.9,True,"The page is explicitly labeled as a solution idea and describes a conceptual architecture using Azure Load Testing, JMeter, and custom plugins to simulate device behaviors. 
It provides a high-level blueprint with major components, less detailed than a reference architecture, matching solution-ideas.",updated https://learn.microsoft.com/en-us/azure/architecture/guide/testing/mission-critical-deployment-testing,Continuous validation,Continuous validation with Azure Load Testing and Azure Chaos Studio - Azure Architecture Center,Run continuous validation with Load Testing and Chaos Studio,Guide on performing automated continuous validation in production-like environments with Azure Load Testing and Chaos Studio.,"As cloud-native applications and services become more complex, deploying changes and new releases for them can be challenging. Outages are frequently caused by faulty deployments or releases. But errors can also occur after deployment, when an application starts receiving real traffic, especially in complex workloads that run in highly distributed multitenant cloud environments and that are maintained by multiple development teams. These environments require more resiliency measures, like retry l",2025-09-22T17:34:00.000Z,concept-article,best-practices,0.7,True,Testing guide with concrete recommendations (how to use Azure Load Testing and Chaos Studio) and DO-style guidance for resilient deployments.,unchanged https://learn.microsoft.com/en-us/azure/architecture/high-availability/traffic-manager-application-gateway,Multiregion load balancing,Multiregion Load Balancing - Azure Architecture Center,Implement multiregion load balancing with Traffic Manager and Application Gateway,Learn how to build an Azure system that serves web and non-web workloads with resilient multitier applications in multiple Azure regions.,"This architecture is for global, internet-facing applications that use HTTP(S) and non-HTTP(S) protocols. It features Domain Name System (DNS)-based global load balancing, two forms of regional load balancing, and global virtual network peering to create a reliable architecture. 
Availability zones provide resiliency within each region, and multiregion deployment provides recoverability from a regional outage. Azure Web Application Firewall and Azure Firewall inspect traffic at multiple layers.",2026-03-11T17:36:00Z,reference-architecture,reference-architectures,0.78,True,"This page describes a production-ready multiregion architecture using Azure Traffic Manager, Application Gateway, and other services for high availability. It includes a detailed architecture for global, internet-facing workloads, discusses DNS-based global load balancing plus regional load balancing, availability zones, multiregion deployment, and security layers (WAF, Azure Firewall). As a high-availability architecture in the Architecture Center, it typically provides concrete deployment and configuration guidance, satisfying the reference-architectures criteria more than solution-ideas or general best practices, even though the URL does not contain 'reference-architectures/'.",unchanged https://learn.microsoft.com/en-us/azure/architecture/hybrid/arc-hybrid-kubernetes,Manage Kubernetes by using Azure Arc,Manage and Deploy Kubernetes in Azure Arc - Azure Architecture Center,Manage hybrid Kubernetes clusters with Azure Arc,"Learn how Azure Arc extends Kubernetes cluster management and configuration across customer data centers, edge locations, and multiple cloud environments.","This reference architecture describes how Azure Arc extends Kubernetes cluster management and configuration across customer datacenters, edge locations, and multiple cloud environments.",2025-09-17T05:02:00Z,reference-architecture,example-workloads,0.7,True,Hybrid reference architecture for Arc-enabled Kubernetes across datacenters and clouds with concrete management and configuration guidance.,unchanged @@ -287,11 +285,11 @@ https://learn.microsoft.com/en-us/azure/architecture/landing-zones/landing-zone- 
https://learn.microsoft.com/en-us/azure/architecture/landing-zones/subscription-vending,Subscription vending implementation,Subscription Vending Implementation Guidance - Azure Architecture Center,,Learn how to implement subscription vending to standardize the process for automatic subscription creation and deploy workloads faster.,"This article provides implementation guidance for subscription vending automation. Subscription vending standardizes the process for requesting, deploying, and governing subscriptions so that application teams can deploy their workloads faster. The architecture diagram shows subscription vending organization with a management group hierarchy at the top and subscription automation workflow at the bottom. At the top, a tenant root group points to a Contoso management group, which branches into thr",2026-01-12T18:32:00.000Z,concept-article,,0.4,False,"Subscription vending implementation guidance is detailed but is a governance/automation guide, not one of the specified architecture or pattern categories.",unchanged https://learn.microsoft.com/en-us/azure/architecture/mainframe/virtualization-of-unisys-clearpath-forward-os-2200-enterprise-server-on-azure,Unisys ClearPath Forward OS 2200 enterprise server virtualization on Azure,Unisys ClearPath Forward OS 2200 Enterprise Server Virtualization on Azure - Azure Architecture Center,,Learn how to use virtualization technologies from Microsoft partner Unisys with an existing Unisys ClearPath Forward (CPF) Dorado enterprise server.,"This article describes how to use virtualization technologies from Unisys, a Microsoft partner, with an existing Unisys ClearPath Forward (CPF) Dorado enterprise server. Use this approach to quickly move your system to Azure without having to rewrite your application code or redesign your database. Existing code is maintained in its original form. 
The application screens, user interactions, and data structures behind the scenes stay the same, which eliminates the need to retrain your users.",2025-04-14T17:30:00Z,example-scenario,,0.0,False,Parse error: Expecting value: line 1 column 1 (char 0),unchanged https://learn.microsoft.com/en-us/azure/architecture/microservices/ci-cd,CI/CD for microservices,CI/CD for microservices - Azure Architecture Center,,"Learn about continuous integration and continuous delivery for microservices, including challenges and recommended approaches.","Faster release cycles are one of the major advantages of microservices architectures. But without a good CI/CD process, you won't achieve the agility that microservices promise. This article describes the challenges and recommends some approaches to the problem.",2025-10-30T05:03:00Z,concept-article,,0.4,False,"CI/CD for microservices is implementation guidance but not tagged as best-practices/ and not a pattern, style, or technology-choice comparison per definitions.",unchanged -https://learn.microsoft.com/en-us/azure/architecture/microservices/ci-cd-kubernetes,CI/CD for microservices on Kubernetes,Microservices CI/CD pipeline on Kubernetes with Azure DevOps and Helm - Azure Architecture Center,,Learn about building a continuous integration and continuous delivery (CI/CD) pipeline for deploying microservices to Azure Kubernetes Service (AKS) using Azure DevOps and Helm.,"It can be challenging to create a reliable continuous integration/continuous delivery (CI/CD) process for a microservices architecture. Individual teams must be able to release services quickly and reliably, without disrupting other teams or destabilizing the application as a whole. This article describes an example CI/CD pipeline for deploying microservices to Azure Kubernetes Service (AKS). Every team and project is different, so don't take this article as a set of hard-and-fast rules. 
Instead",2026-04-15T05:05:00Z,concept-article,,0.4,False,"The page describes an example CI/CD pipeline for microservices on AKS using Azure DevOps and Helm. It is scenario-specific implementation guidance, but it does not match any of the defined sub-skill URL patterns (no reference-architectures/, solution-ideas/, patterns/, technology-choices/, architecture-styles/, best-practices/, antipatterns/, example-scenario/, industries/, or migration/). It appears as a detailed how-to/implementation article rather than one of the specified architecture-center skill types, so it is not classified under the given categories.",updated -https://learn.microsoft.com/en-us/azure/architecture/microservices/design/,Introduction,Design a Microservices Architecture - Azure Architecture Center,,Learn how to design and build a microservices architecture on Azure by following a reference implementation that illustrates best practices.,"Microservices are a popular architectural style for building cloud applications that remain resilient, scale efficiently, deploy independently, and evolve rapidly. To deliver real value, microservices require a different approach to design and application development. This set of articles explores how to build a microservices architecture on Azure. 
It includes the following guidance: Compute options for microservices: Evaluate Azure compute services for microservices, including Azure Kubernetes ",2025-09-26T05:04:00.000Z,concept-article,,0.3,False,"Landing page for microservices design series; mostly navigational/overview, not a specific pattern, style, or comparison guide.",unchanged +https://learn.microsoft.com/en-us/azure/architecture/microservices/ci-cd-kubernetes,CI/CD for microservices on Kubernetes,Microservices CI/CD pipeline on Kubernetes with Azure DevOps and Helm - Azure Architecture Center,,Learn about building a continuous integration and continuous delivery (CI/CD) pipeline for deploying microservices to Azure Kubernetes Service (AKS) using Azure DevOps and Helm.,"It can be challenging to create a reliable continuous integration/continuous delivery (CI/CD) process for a microservices architecture. Individual teams must be able to release services quickly and reliably, without disrupting other teams or destabilizing the application as a whole. This article describes an example CI/CD pipeline for deploying microservices to Azure Kubernetes Service (AKS). Every team and project is different, so don't take this article as a set of hard-and-fast rules. Instead",2026-04-15T05:05:00Z,concept-article,,0.4,False,"The page describes an example CI/CD pipeline for microservices on AKS using Azure DevOps and Helm. It is scenario-specific implementation guidance, but it does not match any of the defined sub-skill URL patterns (no reference-architectures/, solution-ideas/, patterns/, technology-choices/, architecture-styles/, best-practices/, antipatterns/, example-scenario/, industries/, or migration/). 
It appears as a detailed how-to/implementation article rather than one of the specified architecture-center skill types, so it is not classified under the given categories.",unchanged +https://learn.microsoft.com/en-us/azure/architecture/microservices/design/,Introduction,Design a Microservices Architecture - Azure Architecture Center,Design and implement microservices architecture on Azure,Learn how to design and build a microservices architecture on Azure by following a reference implementation that illustrates best practices.,"Microservices are a popular architectural style for building cloud applications that remain resilient, scale efficiently, deploy independently, and evolve rapidly. To deliver real value, microservices require a different approach to design and application development. This set of articles explores how to build a microservices architecture on Azure. It includes the following guidance: Compute options for microservices: Compare Azure compute platforms for microservices, including Azure Kubernetes ",2026-04-24T17:33:00.000Z,concept-article,architecture-styles,0.7,True,"Describes how to design and build a microservices architecture as a style, with detailed Azure-focused guidance and best practices. URL is under microservices/design rather than patterns or solution-ideas, and it provides actionable implementation details for the microservices architecture style.",updated https://learn.microsoft.com/en-us/azure/architecture/microservices/design/api-design,API design,API Design - Azure Architecture Center,,"Learn about API design for microservices, including REST and RPC trade-offs, versioning strategies, and mapping REST to domain-driven design.","A microservices architecture requires good API design because all data exchange between services occurs either through messages or API calls. Efficient APIs help prevent chatty input/output (I/O). 
Independent teams design services, so you must define API semantics and versioning schemes clearly to avoid breaking other services when you update a service. The diagram shows a flow between a client and three back-end services through a central scheduler service. An arrow labeled HTTP POST points from",2025-11-20T18:35:00.000Z,concept-article,,0.4,False,API design for microservices is detailed guidance but not under best-practices/ and not structured as a named pattern or technology-choice comparison.,unchanged -https://learn.microsoft.com/en-us/azure/architecture/microservices/design/compute-options,Choose a compute option,Choose a Compute Option for Microservices - Azure Architecture Center,Choose Azure compute options for microservices,"Learn about compute options, the hosting models for the computing resources that your application runs on, like service orchestrator and serverless architectures.","The term compute refers to the hosting model for the computing resources that your application runs on. This article provides prescriptive guidance to help you choose a compute platform for microservices. Your microservice compute platform selection might depend on more nuanced requirements. For a microservices architecture, the following approaches are popular: Although these options aren't the only ones, they're both proven approaches to building microservices. 
An application might include both ",2026-02-11T18:33:00.000Z,concept-article,technology-choices,0.85,True,"Explicitly about choosing between compute platforms for microservices; compares options and selection criteria; URL under guide/technology-choices/, matching technology-choices.",unchanged -https://learn.microsoft.com/en-us/azure/architecture/microservices/design/compute-options,Microservices compute options,Choose a Compute Option for Microservices - Azure Architecture Center,Choose compute platform options for microservices,"Learn about compute options, the hosting models for the computing resources that your application runs on, like service orchestrator and serverless architectures.","The term compute refers to the hosting model for the computing resources that your application runs on. This article provides prescriptive guidance to help you choose a compute platform for microservices. Your microservice compute platform selection might depend on more nuanced requirements. For a microservices architecture, the following approaches are popular: Although these options aren't the only ones, they're both proven approaches to building microservices. An application might include both ",2026-02-11T18:33:00.000Z,concept-article,technology-choices,0.9,True,"Same as index 17: compares compute options for microservices with selection criteria; URL under guide/technology-choices/, matching technology-choices.",unchanged +https://learn.microsoft.com/en-us/azure/architecture/microservices/design/compute-options,Choose a compute option,Choose a Compute Option for Microservices - Azure Architecture Center,Choose Azure compute platform for microservices workloads,"Choose an Azure compute platform for a microservices architecture. Compare options based on interservice communication, independent scaling, and deployability.","This article helps you choose an Azure compute platform for a workload built on a microservices architecture. 
A microservices architecture consists of small, independently deployable services that communicate over the network. Your compute platform needs to support independent scaling, independent deployment, and reliable interservice communication across many services. For decision factors that apply to any workload, like team skills, networking, and portability, see Choose an Azure compute serv",2026-04-24T17:33:00.000Z,concept-article,technology-choices,0.85,True,"Decision guide comparing Azure compute options for microservices, including selection criteria and trade-offs. Although URL does not contain technology-choices/, the content clearly focuses on choosing between multiple Azure compute services for a specific scenario.",updated +https://learn.microsoft.com/en-us/azure/architecture/microservices/design/compute-options,Microservices compute options,Choose a Compute Option for Microservices - Azure Architecture Center,Choose Azure compute platform for microservices workloads,"Choose an Azure compute platform for a microservices architecture. Compare options based on interservice communication, independent scaling, and deployability.","This article helps you choose an Azure compute platform for a workload built on a microservices architecture. A microservices architecture consists of small, independently deployable services that communicate over the network. Your compute platform needs to support independent scaling, independent deployment, and reliable interservice communication across many services. 
For decision factors that apply to any workload, like team skills, networking, and portability, see Choose an Azure compute serv",2026-04-24T17:33:00.000Z,concept-article,technology-choices,0.85,True,Duplicate of index 2: decision guide comparing Azure compute options for microservices with scenario-specific selection criteria and trade-offs.,updated https://learn.microsoft.com/en-us/azure/architecture/microservices/design/data-considerations,Data considerations,Data Considerations for Microservices - Azure Architecture Center,,Learn about managing data in a microservices architecture. Data integrity and data consistency pose critical challenges for microservices.,"This article describes considerations for managing data in a microservices architecture. Each microservice manages its own data, so data integrity and data consistency pose critical challenges. Two services shouldn't share a data store. Each service manages its own private data store, and other services can't access it directly. This rule prevents unintentional coupling between services, which happens when services share the same underlying data schemas. If the data schema changes, the change mu",2025-11-21T18:42:00.000Z,concept-article,,0.4,False,"Data considerations for microservices is conceptual/implementation guidance; does not match any of the specific categorized types (no patterns/, best-practices/, or technology-choices/ markers).",unchanged https://learn.microsoft.com/en-us/azure/architecture/microservices/design/gateway,API gateways,API gateways - Azure Architecture Center,,An API gateway sits between clients and services and acts as a reverse proxy. Learn how to choose an API gateway technology for a microservice.,"In a microservices architecture, a client might interact with more than one front-end service. Given this fact, how does a client know what endpoints to call? What happens when new services are introduced, or existing services are refactored? 
How do services handle SSL termination, mutual TLS, authentication, and other concerns? An API gateway can help to address these challenges. Download a Visio file of this architecture.",2026-01-24T06:02:00Z,concept-article,,0.4,False,"API gateway article explains concept and considerations; not a patterns/ article, not a solution-idea or reference architecture, and not a technology-choice comparison across Azure services.",unchanged https://learn.microsoft.com/en-us/azure/architecture/microservices/design/interservice-communication,Interservice communication,Interservice communication in microservices - Azure Architecture Center,,Learn about the tradeoffs between asynchronous messaging versus synchronous APIs for communication between microservices and some challenges in communication.,"Communication between microservices must be efficient and robust. With lots of small services interacting to complete a single business activity, this can be a challenge. In this article, we look at the tradeoffs between asynchronous messaging versus synchronous APIs. Then we look at some of the challenges in designing resilient interservice communication.",2025-10-30T05:03:00Z,concept-article,,0.4,False,"Interservice communication trade-offs article; design guidance but not a named pattern under patterns/, nor a technology-choice comparison between Azure services.",unchanged @@ -302,7 +300,7 @@ https://learn.microsoft.com/en-us/azure/architecture/microservices/model/microse https://learn.microsoft.com/en-us/azure/architecture/microservices/model/tactical-domain-driven-design,Use tactical DDD to design microservices,Use Tactical DDD to Design Microservices - Azure Architecture Center,,"Use domain-driven design in a microservices architecture to identify the entity and aggregate patterns, which help identify natural boundaries for the services.","Domain-driven design (DDD) rejects a single unified model for the entire system. 
It instead encourages you to divide the system into bounded contexts that each has its own model. The domain analysis article covers the strategic phase of DDD. This article continues those steps and applies tactical DDD patterns to define domain models more precisely within their bounded contexts. In a microservices architecture, where each bounded context is a microservice candidate, the entity and aggregate pattern",2026-02-26T18:33:00.000Z,concept-article,,0.4,False,"Tactical DDD for microservices is design guidance; does not follow the design-patterns URL structure or pattern sections, and is not another listed category.",unchanged https://learn.microsoft.com/en-us/azure/architecture/networking/,Networking,Networking architecture design - Azure Architecture Center,,"Learn about sample architectures, solutions, and guides that can help you explore the various networking services in Azure.","This article provides information about sample architectures, solutions, and guides that can help you explore networking in Azure. Designing and implementing Azure networking capabilities is a critical part of your cloud solution. You'll need to make networking design decisions to properly support your workloads and services. Azure provides a wide range of networking tools and capabilities. These are just some of the key networking services available in Azure: For information about more Azure ne",2026-03-11T17:36:00.000Z,concept-article,,0.2,False,"High-level landing/overview page for Azure networking architectures and resources; does not appear to contain detailed deployment guidance, configuration tables, or pattern-style sections. 
It mainly links to sample architectures and guides rather than providing expert implementation details itself.",unchanged https://learn.microsoft.com/en-us/azure/architecture/networking/architecture/azure-dns-private-resolver,Azure DNS Private Resolver,Azure DNS Private Resolver - Azure Architecture Center,,Learn how to use Azure DNS Private Resolver to simplify hybrid recursive Domain Name System (DNS) resolution for on-premises and Azure workloads.,This article describes a solution for using Azure DNS Private Resolver to simplify hybrid recursive Domain Name System (DNS) resolution. You can use DNS Private Resolver for on-premises workloads and Azure workloads. DNS Private Resolver simplifies private DNS resolution from on-premises to Azure private DNS and from Azure private DNS to on-premises.,2026-03-23T17:34:00Z,example-scenario,,0.3,False,"Azure DNS Private Resolver article describes a specific networking solution and likely includes configuration guidance, but it is a service-specific architecture/feature guide under /networking/architecture/ rather than any of the defined sub-skill types (no patterns/, solution-ideas/, reference-architectures/, best-practices/, antipatterns/, technology-choices/, architecture-styles/, example-scenario/, or migration/ in the path). It does not match the required URL patterns or structural cues for the listed categories.",unchanged -https://learn.microsoft.com/en-us/azure/architecture/networking/architecture/hub-spoke,Hub-spoke network topology in Azure,Hub-Spoke Network Topology in Azure - Azure Architecture Center,Implement a hub-spoke network topology in Azure,"Learn about hub-spoke topology in Azure, where the spoke virtual networks connect to the hub virtual network and can connect to each other.","This reference architecture implements a hub-spoke network pattern with customer-managed hub infrastructure components. 
The hub-spoke network pattern, also known as hub and spoke, is the network topology that the Cloud Adoption Framework for Azure recommends. See Define an Azure network topology to understand why this topology is considered a best practice for many organizations. For a Microsoft-managed hub infrastructure solution, see Hub-spoke network topology with Azure Virtual WAN.",2026-04-14T17:32:00Z,reference-architecture,reference-architectures,0.78,True,"Describes a concrete hub-spoke network reference architecture with Azure networking components, implementation details, and production-oriented guidance; it is not just a conceptual style but a deployable topology recommended by the Cloud Adoption Framework.",updated +https://learn.microsoft.com/en-us/azure/architecture/networking/architecture/hub-spoke,Hub-spoke network topology in Azure,Hub-Spoke Network Topology in Azure - Azure Architecture Center,Implement a hub-spoke network topology in Azure,"Learn about hub-spoke topology in Azure, where the spoke virtual networks connect to the hub virtual network and can connect to each other.","This reference architecture implements a hub-spoke network pattern with customer-managed hub infrastructure components. The hub-spoke network pattern, also known as hub and spoke, is the network topology that the Cloud Adoption Framework for Azure recommends. See Define an Azure network topology to understand why this topology is considered a best practice for many organizations. 
For a Microsoft-managed hub infrastructure solution, see Hub-spoke network topology with Azure Virtual WAN.",2026-04-14T17:32:00Z,reference-architecture,reference-architectures,0.78,True,"Describes a concrete hub-spoke network reference architecture with Azure networking components, implementation details, and production-oriented guidance; it is not just a conceptual style but a deployable topology recommended by the Cloud Adoption Framework.",unchanged https://learn.microsoft.com/en-us/azure/architecture/networking/architecture/hub-spoke-virtual-wan-architecture,Hub-spoke topology with Virtual WAN,Hub-Spoke Network Topology That Uses Azure Virtual WAN - Azure Architecture Center,Hub-spoke network topology using Azure Virtual WAN,Learn how to implement a hub-spoke network topology by using Azure Virtual WAN. Create a hub virtual network and spoke virtual networks that peer with the hub.,This hub-spoke architecture provides an alternate solution to the hub-spoke network topology architecture and the secure hybrid network architecture. The hub is a virtual network in Azure that serves as a central point of connectivity to your on-premises network. The spokes are virtual networks that peer with the hub and help isolate workloads. Traffic flows between the on-premises datacenters and the hub through an Azure ExpressRoute or Azure VPN Gateway connection. 
This approach replaces traditional ,2025-06-27T05:05:00Z,example-scenario,reference-architectures,0.82,True,"Detailed architecture using Virtual WAN, ExpressRoute/VPN, and peering; alternative to other reference topologies with deployment guidance.",unchanged https://learn.microsoft.com/en-us/azure/architecture/networking/architecture/massive-scale-azure-architecture,Massive-scale Virtual WAN architecture design,Massive-scale VWAN architecture design - Azure Architecture Center,Design massive-scale Azure Virtual WAN with multi-hub,Learn about the architecture for a massive-scale Azure Virtual WAN deployment that has multiple hubs per region.,"This example workload shows an Azure Virtual WAN deployment with multiple hubs per region. To improve availability and scalability, each hub peers to geographically dispersed, redundant Azure ExpressRoute circuits. This architecture is for exceptionally large and critical workloads. It supports business units and applications that reside on spoke virtual networks. The spoke virtual networks often have security requirements for internet-to-spoke or spoke-to-spoke connectivity.",2025-10-30T05:03:00Z,concept-article,example-workloads,0.78,True,"Example workload for exceptionally large deployments with multiple hubs per region and redundant ExpressRoute; deep, scenario-specific network design.",unchanged https://learn.microsoft.com/en-us/azure/architecture/networking/architecture/performance-security-optimized-vwan,Virtual WAN optimized for requirements,Virtual WAN architecture optimized for department-specific requirements - Azure Architecture Center,Virtual WAN design for mixed security and performance needs,Learn how to design a single network that has varying security and performance requirements.,"A global manufacturing company provided the architecture that this article describes. The company's operational technology and information technology departments are highly integrated, demanding a single internal network. 
But the environments have drastically different security and performance requirements. Because of the sensitive nature of the company's operations, all traffic needs to be firewall protected by a network virtual appliance (NVA) the customer hosts on their own virtual machines a",2025-04-21T17:30:00Z,example-scenario,example-workloads,0.76,True,"Based on a real manufacturing company scenario with OT/IT requirements and NVA firewalls; detailed, department-specific architecture.",unchanged @@ -310,7 +308,7 @@ https://learn.microsoft.com/en-us/azure/architecture/networking/architecture/tru https://learn.microsoft.com/en-us/azure/architecture/networking/guide/cross-tenant-secure-access-private-endpoints,Cross-tenant secure access to apps,Cross-tenant Secure Access to Apps by Using Private Endpoints - Azure Architecture Center,,Restrict inbound traffic to a web app or function app. Use private endpoints in Azure to give consumer tenants secure access to provider tenant apps.,"Like most Azure platform as a service (PaaS) solutions, Azure web apps and function apps are publicly reachable over the internet by default. You can restrict inbound traffic to Azure web apps and function apps by using private endpoints. Private endpoints provide clients in your private network with secure access to your app over Azure Private Link. Private endpoints use IP addresses from the Azure virtual network address space. Network traffic between the client and the app traverses the virtu",2026-04-07T05:03:00.000Z,concept-article,,0.0,False,"Networking implementation guide for cross-tenant private endpoint access; URL path is /networking/guide/ and does not match any of the specified skill-type paths (reference-architectures, solution-ideas, patterns, technology-choices, architecture-styles, best-practices, antipatterns, example-scenario, industries, migration). 
While it likely contains detailed configuration guidance, it does not fit any defined sub-skill category.",unchanged https://learn.microsoft.com/en-us/azure/architecture/networking/guide/internet-protocol-version-4-exhaustion,Prevent IPv4 exhaustion,Prevent IPv4 Exhaustion in Azure - Azure Architecture Center,Prevent IPv4 address exhaustion in large Azure networks,Minimize private address space consumption when you build large networks in Azure. This article presents two methods to create more IPv4 address space.,"Corporate networks typically use address spaces from the private IPv4 address ranges defined by Request for Comments (RFC) 1918, like 10.0.0.0/8, 172.16.0.0/12, and 192.168.0.0/16. In on-premises environments, these ranges provide enough IP addresses to meet the requirements of even the largest networks. As a result, many organizations develop address management practices that prioritize simple routing configurations and agile processes for IP address allocation. They often don't prioritize efficien",2025-06-17T05:03:00.000Z,concept-article,best-practices,0.74,True,Provides concrete strategies and methods to minimize private address consumption; actionable network design guidance.,unchanged https://learn.microsoft.com/en-us/azure/architecture/networking/guide/ipv6-architecture,IPv6 hub-spoke network topology,IPv6 hub-spoke network topology - Azure Architecture Center,Add IPv6 support to Azure hub-spoke networks,"Learn how to transition a hub-and-spoke network topology in Azure so it supports IPv6, which creates a dual-stack network.","This article describes how to transition an IPv4 hub-and-spoke network topology to IPv6. It presents the hub-and-spoke network topology as a starting point and describes the steps you can take to implement IPv6 support. In a hub-and-spoke network, the hub virtual network is a central point of connectivity for the spoke virtual networks. 
The spoke virtual networks connect to the hub and can provide isolation for application resources. For more information, see Transitioning to IPv6.",2025-10-30T05:03:00Z,concept-article,best-practices,0.68,True,Stepwise guidance for transitioning an existing hub-spoke topology to dual-stack IPv4/IPv6; includes Azure-specific configuration details.,unchanged -https://learn.microsoft.com/en-us/azure/architecture/networking/guide/ipv6-ip-planning,Conceptual planning for IPv6 networking,Conceptual planning for IPv6 networking - Azure Architecture Center,,Learn strategies for transitioning an IPv4 network environment on Azure to IPv6. IPv6 provides a larger pool of internet addresses to accommodate growth.,"This guide outlines strategies for transitioning an IPv4 network environment in Azure to IPv6. This transition is necessary as the number of internet-connected devices expands and IPv4 addresses near exhaustion. The IPv6 protocol provides a larger pool of internet addresses to accommodate future growth and offers enhanced security features (native IPsec), flow labeling, and simplified network configurations. This article helps you understand IPv6, acquire IPv6 addresses, and transition to IPv6.",2026-04-14T05:03:00.000Z,concept-article,,0.2,False,"Primarily a conceptual planning guide for IPv6 transition strategies without a specific deployable architecture, pattern structure, or detailed implementation artifacts that meet any sub-skill type criteria.",updated +https://learn.microsoft.com/en-us/azure/architecture/networking/guide/ipv6-ip-planning,Conceptual planning for IPv6 networking,Conceptual planning for IPv6 networking - Azure Architecture Center,,Learn strategies for transitioning an IPv4 network environment on Azure to IPv6. IPv6 provides a larger pool of internet addresses to accommodate growth.,"This guide outlines strategies for transitioning an IPv4 network environment in Azure to IPv6. 
This transition is necessary as the number of internet-connected devices expands and IPv4 addresses near exhaustion. The IPv6 protocol provides a larger pool of internet addresses to accommodate future growth and offers enhanced security features (native IPsec), flow labeling, and simplified network configurations. This article helps you understand IPv6, acquire IPv6 addresses, and transition to IPv6.",2026-04-14T05:03:00.000Z,concept-article,,0.2,False,"Primarily a conceptual planning guide for IPv6 transition strategies without a specific deployable architecture, pattern structure, or detailed implementation artifacts that meet any sub-skill type criteria.",unchanged https://learn.microsoft.com/en-us/azure/architecture/networking/guide/network-virtual-appliance-high-availability,Deploy highly available NVAs,Deploy Highly Available NVAs - Azure Architecture Center,,Learn how to deploy network virtual appliances in high availability architectures for ingress and egress traffic in Azure.,"This article describes common ways to deploy a set of network virtual appliances (NVAs) for high availability in Azure. An NVA typically controls the flow of traffic between network segments that have different security levels. For example, you might use an NVA between a perimeter network virtual network and the public internet, or to connect external locations to Azure via a VPN or software-defined WAN (SD-WAN) appliances. This article assumes that you have a basic understanding of Azure networ",2026-04-07T05:03:00.000Z,concept-article,,0.4,False,"The page describes common ways to deploy NVAs for high availability and assumes basic networking knowledge, but the URL path is 'networking/guide/' rather than any of the specified classification paths (reference-architectures, solution-ideas, patterns, technology-choices, architecture-styles, best-practices, antipatterns, example-scenario, migration). 
It appears to be a general guidance article rather than a formal reference architecture, pattern, or best-practices page as defined by the categories, so it does not clearly fit any sub-skill type under the given rules.",unchanged https://learn.microsoft.com/en-us/azure/architecture/networking/guide/private-link-hub-spoke-network,Private Link in hub-and-spoke network,Azure Private Link in a Hub-and-Spoke Network - Azure Architecture Center,Use Azure Private Link in hub-spoke networks,Use Azure Private Link to access PaaS services from a hub-and-spoke network. Connect to these services by using a private IP address from your virtual network.,"This article describes how to use Azure Private Link in a hub-and-spoke network topology. The target audience includes network architects and cloud solution architects. This guide outlines how to use Azure private endpoints to privately access platform as a service (PaaS) resources. This guide doesn't cover virtual network integration, service endpoints, and other solutions for connecting infrastructure as a service (IaaS) components to Azure PaaS resources. For more information about these solu",2025-07-23T17:34:00.000Z,concept-article,best-practices,0.68,True,Guide for using private endpoints to access PaaS from hub-spoke topology; includes prescriptive configuration and design considerations.,unchanged https://learn.microsoft.com/en-us/azure/architecture/networking/guide/private-link-virtual-wan-dns-guide,Guide overview,Guide to Private Link and DNS in Azure Virtual WAN - Azure Architecture Center,Configure DNS for Private Link in Azure Virtual WAN,"Learn how to implement DNS to support private endpoints in a Virtual WAN network. In these scenarios, every Virtual WAN hub has a firewall that has DNS proxy enabled.","Azure Private Link makes it possible for clients to access Azure platform as a service (PaaS) services directly from private virtual networks without using public IP addressing. 
For each service, you configure a private endpoint that uses a private IP address from your network. Clients can then use the private endpoint to connect privately to the service. Clients continue to use the fully qualified domain name (FQDN) of a service to connect to it. You configure DNS in your network to resolve the",2025-12-10T06:03:00Z,concept-article,best-practices,0.7,True,Detailed DNS design and configuration guidance for private endpoints in Virtual WAN; highly specific implementation knowledge.,unchanged @@ -339,7 +337,7 @@ https://learn.microsoft.com/en-us/azure/architecture/patterns/cache-aside,Cache- https://learn.microsoft.com/en-us/azure/architecture/patterns/choreography,Choreography,Choreography Pattern - Azure Architecture Center,Implement the Choreography pattern for workflows,"Learn how to set up services to decide when and how to process a business operation, instead of depending on a central orchestrator.","The Choreography pattern decentralizes workflow logic and distributes responsibilities to other components within a system. Instead of depending on a central orchestrator, services decide when and how to process a business operation.",2026-04-02T17:32:00.000Z,design-pattern,design-patterns,0.9,True,"URL is under /patterns/, describes the Choreography pattern with decentralized workflow logic, including context, solution, and usage guidance typical of Azure design pattern docs.",unchanged https://learn.microsoft.com/en-us/azure/architecture/patterns/circuit-breaker,Circuit Breaker,Circuit Breaker Pattern - Azure Architecture Center,Use the Circuit Breaker pattern for resilient calls,Learn how to handle faults that might take varying amounts of time to fix when applications connect to a remote service or resource.,The Circuit Breaker pattern helps handle faults that might take varying amounts of time to recover from when an application connects to a remote service or resource. 
A circuit breaker temporarily blocks access to a faulty service after it detects failures. This action prevents repeated unsuccessful attempts so that the system can recover effectively. This pattern can improve the stability and resiliency of an application.,2025-03-21T17:30:00.000Z,design-pattern,design-patterns,0.98,True,"patterns/circuit-breaker is a canonical design pattern page with context, solution, and trade-offs for handling remote service faults—expert pattern guidance.",unchanged https://learn.microsoft.com/en-us/azure/architecture/patterns/claim-check,Claim Check,Claim-Check pattern - Azure Architecture Center,Apply the Claim-Check pattern for large messages,"Examine the Claim-Check pattern, which splits a large message into a claim check and a payload to avoid overwhelming a message bus.","The Claim-Check pattern allows workloads to transfer payloads without storing the payload in a messaging system. The pattern stores the payload in an external data store and uses a ""claim check"" to retrieve the payload. The claim check is a unique, obscure token or key. To retrieve the payload, applications need to present the claim-check token to the external data store.",2025-12-09T06:03:00Z,design-pattern,design-patterns,0.9,True,patterns/claim-check describes the Claim-Check pattern with explanation of claim tokens and external payload storage—named pattern with implementation considerations.,unchanged -https://learn.microsoft.com/en-us/azure/architecture/patterns/compensating-transaction,Compensating Transaction,Compensating Transaction pattern - Azure Architecture Center,Use the Compensating Transaction pattern for rollback,Use the Compensating Transaction pattern to undo work when a step of an eventually consistent operation fails.,"When you use an eventually consistent operation that consists of a series of steps, the Compensating Transaction pattern can be useful. 
Specifically, if one or more of the steps fail, you can use the Compensating Transaction pattern to undo the work that the steps performed. Typically, you find operations that follow the eventual consistency model in cloud-hosted applications that implement complex business processes and workflows.",2025-12-09T06:03:00Z,design-pattern,design-patterns,0.9,True,"patterns/compensating-transaction describes a named pattern for undoing work in eventually consistent operations, with context and solution—design-patterns.",unchanged +https://learn.microsoft.com/en-us/azure/architecture/patterns/compensating-transaction,Compensating Transaction,Compensating Transaction Pattern - Azure Architecture Center,Apply the Compensating Transaction pattern in distributed workflows,Use the Compensating Transaction pattern to undo work when a step of an eventually consistent operation fails in distributed systems.,Use this pattern to undo work when one or more steps fail in an eventually consistent operation. Cloud-hosted applications that implement complex business processes and workflows commonly use operations that follow the eventual consistency model.,2026-04-20T17:34:00.000Z,design-pattern,design-patterns,0.95,True,"The page is under the patterns/ path and describes a named design pattern (Compensating Transaction) with concrete implementation guidance for eventually consistent operations. 
Azure Architecture Center pattern pages typically include structured sections like context/problem, solution, and considerations, which go beyond generic concepts and provide reusable, expert implementation knowledge for distributed systems.",updated https://learn.microsoft.com/en-us/azure/architecture/patterns/competing-consumers,Competing Consumers,Competing Consumers Pattern - Azure Architecture Center,Use the Competing Consumers pattern for scaling,Learn how the Competing Consumers pattern supports multiple concurrent consumers processing messages received on the same messaging channel.,"Enable multiple concurrent consumers to process messages received on the same messaging channel. With multiple concurrent consumers, a system can process multiple messages at the same time to optimize throughput, improve scalability and availability, and balance the workload.",2026-04-03T17:33:00.000Z,design-pattern,design-patterns,0.95,True,"URL is under /patterns/, explains the named Competing Consumers pattern with problem/solution, when to use, and implementation considerations for concurrent message processing.",unchanged https://learn.microsoft.com/en-us/azure/architecture/patterns/compute-resource-consolidation,Compute Resource Consolidation,Compute Resource Consolidation Pattern - Azure Architecture Center,Consolidate compute with the Resource Consolidation pattern,Use the Compute Resource Consolidation design pattern to consolidate multiple tasks or operations into one computational unit.,Consolidate multiple tasks or operations into one computational unit. 
This pattern can increase compute resource utilization and reduce the costs and management overhead associated with compute processing in cloud-hosted applications.,2026-04-08T08:00:00.000Z,design-pattern,design-patterns,0.9,True,"URL is under /patterns/, presents the Compute Resource Consolidation pattern with context, solution, and trade-offs for optimizing compute utilization and cost.",unchanged https://learn.microsoft.com/en-us/azure/architecture/patterns/cqrs,CQRS,CQRS Pattern - Azure Architecture Center,Apply the CQRS pattern to separate reads and writes,Learn how to segregate operations that read data from operations that update data by using the Command Query Responsibility Segregation (CQRS) pattern.,"Command Query Responsibility Segregation (CQRS) is a design pattern that segregates read and write operations for a data store into separate data models. This approach allows each model to be optimized independently and can improve the performance, scalability, and security of an application.",2025-02-21T18:32:00.000Z,design-pattern,design-patterns,0.98,True,"patterns/cqrs is a canonical CQRS design pattern article with context, solution, and trade-offs—expert pattern knowledge.",unchanged @@ -395,7 +393,7 @@ https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/con https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/data/stream-processing-databricks,Stream processing with Azure Databricks,Stream Processing with Databricks - Azure Architecture Center,Deploy a Databricks-based stream processing pipeline,"Learn how to create a four-stage, end-to-end stream processing pipeline in Azure by using Azure Databricks.","This reference architecture shows an end-to-end stream processing pipeline. The four stages of this pipeline include ingest, process, store, and analyze and report. 
For this reference architecture, the pipeline ingests data from two sources, performs a join on related records from each stream, enriches the result, and calculates an average in real time. The results are then stored for further analysis.",2026-01-06T18:34:00Z,reference-architecture,reference-architectures,0.95,True,"URL under /reference-architectures/data/, shows a production-ready stream processing architecture with Azure Databricks and detailed pipeline stages.",unchanged https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/data/stream-processing-stream-analytics,Stream processing with Stream Analytics,Stream processing with Stream Analytics - Azure Architecture Center,Implement stream processing with Azure Stream Analytics,"This reference architecture shows an end-to-end stream processing pipeline, which ingests data, correlates records, and calculates a rolling average.","This reference architecture shows an end-to-end stream processing pipeline. The pipeline ingests data from two sources, correlates records in the two streams, and calculates a rolling average across a time window. The results are stored for further analysis.",2025-06-13T05:16:00Z,reference-architecture,reference-architectures,0.95,True,"Under /reference-architectures/data/, provides an end-to-end stream processing architecture using Stream Analytics with detailed stages and components.",unchanged https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/dmz/secure-vnet-dmz,Implement a secure hybrid network,Implement a secure hybrid network - Azure Architecture Center,Implement secure hybrid DMZ network in Azure,Learn how to deploy a secure hybrid network that extends an on-premises network to Azure by implementing a perimeter network called a DMZ.,"This reference architecture shows a secure hybrid network that extends an on-premises network to Azure. 
The architecture implements a perimeter network, also called a DMZ, between the on-premises network and an Azure virtual network. All inbound and outbound traffic passes through Azure Firewall.",2025-06-13T05:16:00Z,reference-architecture,reference-architectures,0.9,True,"DMZ reference architecture with Azure Firewall and hybrid connectivity, including detailed network layout and security configuration guidance.",unchanged -https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/enterprise-integration/basic-enterprise-integration,Basic enterprise integration on Azure,Basic enterprise integration on Azure - Azure Architecture Center,Deploy basic enterprise integration with Logic Apps and APIM,Recommended architecture for implementing a simple enterprise integration pattern using Azure Logic Apps and Azure API Management.,"This reference architecture uses Azure Integration Services to orchestrate calls to enterprise backend systems. The backend systems can include software as a service (SaaS) systems, Azure services, and existing web services in your enterprise.",2025-06-13T05:16:00Z,reference-architecture,reference-architectures,0.92,True,"Under reference-architectures/enterprise-integration; describes concrete Azure services, integration flows, and implementation details for a production-ready basic enterprise integration pattern.",unchanged +https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/enterprise-integration/basic-enterprise-integration,Basic enterprise integration on Azure,Basic Enterprise Integration on Azure - Azure Architecture Center,Deploy basic enterprise integration with Logic Apps and API Management,Learn how to implement a basic enterprise integration architecture on Azure by using Azure Logic Apps and Azure API Management.,"This reference architecture uses Azure Integration Services to orchestrate calls to enterprise back-end systems. 
Back-end systems can include software as a service (SaaS) systems, Azure services, or existing web services in your enterprise. Integration Services encompasses several components, including Azure Logic Apps, Azure API Management, Azure Service Bus, Azure Event Grid, Azure Functions, and Azure Data Factory. This basic architecture uses only Logic Apps and API Management.",2026-04-20T17:34:00Z,reference-architecture,reference-architectures,0.9,True,"URL contains reference-architectures/, and the page describes a production-ready enterprise integration architecture using Azure Integration Services (Logic Apps, API Management). Reference architectures in this section typically include detailed diagrams, service choices, and deployment recommendations, matching the reference-architectures criteria.",updated https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/hybrid-networking/,Connect an on-premises network to Azure,Connect an on-premises network to Azure - Azure Architecture Center,Choose connectivity options for on-premises to Azure VNets,Learn about the options for connecting an on-premises network to an Azure Virtual Network (VNet) by comparing reference architectures for each option.,"This article compares three options for connecting an on-premises network to an Azure Virtual Network (VNet). 
For each option, a more detailed reference architecture is available.",2025-06-13T05:16:00Z,reference-architecture,technology-choices,0.78,True,Reference-architectures index that compares multiple hybrid networking options; focuses on choosing between architectures rather than one implementation.,unchanged https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/hybrid-networking/expressroute-vpn-failover,Connect on-premises with ExpressRoute,Connect an On-Premises Network to Azure using ExpressRoute - Azure Architecture Center,Connect on-premises networks to Azure with ExpressRoute and VPN failover,Learn how to connect an on-premises network to an Azure virtual network by using ExpressRoute with a virtual private network (VPN) gateway failover.,"This reference architecture shows how to connect an on-premises network to an Azure virtual network by using Azure ExpressRoute, with a site-to-site virtual private network (VPN) as a failover connection.",2025-06-13T05:16:00Z,reference-architecture,reference-architectures,0.9,True,"Reference-architectures URL with detailed hybrid network design using ExpressRoute plus VPN failover, including production-ready configuration guidance.",unchanged https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/hybrid-networking/troubleshoot-vpn,Troubleshoot a hybrid VPN connection,Troubleshoot a hybrid VPN connection - Azure Architecture Center,,Learn tips for how to troubleshoot a VPN gateway connection between an on-premises network and Azure.,"This article gives some tips for troubleshooting a VPN gateway connection between an on-premises network and Azure. 
For general information on troubleshooting common VPN-related errors, see Troubleshooting common VPN related errors.",2025-10-30T05:03:00Z,concept-article,,0.4,False,"Troubleshooting tips for VPN gateway; service-specific troubleshooting rather than architecture, patterns, or cross-cutting best practices.",unchanged
It's suitable for a small-scale production environment of SAP BW/4HANA on Azure, where high availability is a priority.",2025-04-17T17:31:00Z,reference-architecture,,0.0,False,Parse error: Expecting value: line 1 column 1 (char 0),unchanged https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/sap/run-sap-hana-for-linux-virtual-machines,SAP HANA scale-up on Linux,SAP HANA for Linux VMs in Scale-up Systems - Azure Architecture Center,,"Proven practices for running SAP HANA in a high-availability, scale-up environment that supports disaster recovery on Azure.","This reference architecture shows a set of proven practices for running SAP HANA in a highly available, scale-up environment that supports disaster recovery on Azure. This implementation focuses on the database layer only.",2025-09-04T05:03:00Z,reference-architecture,,0.0,False,Parse error: Expecting value: line 1 column 1 (char 0),unchanged -https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/event-hubs-functions,Overview,Event Hubs with Azure Functions - Azure Architecture Center,Architect and optimize Event Hubs with Azure Functions,"Learn how to architect, develop, and deploy efficient and scalable code that runs on Azure Functions and responds to Event Hubs events.","Solutions that use Azure Event Hubs together with Azure Functions benefit from a serverless architecture that is scalable, cost-effective, and capable of processing large volumes of data in near real time. Although these services are commonly used together, there are many features, settings, and intricacies that add complexity to their relationship. 
This article provides guidance on how to effectively take advantage of this integration by highlighting key considerations and techniques for performa",2025-10-30T05:03:00Z,concept-article,best-practices,0.8,True,"Serverless integration guidance with detailed recommendations, settings, and techniques for using Event Hubs and Functions together; cross-cutting best practices for this pairing.",unchanged -https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/observability,Observability,Monitor Azure Functions and Event Hubs - Azure Architecture Center,Monitor Event Hubs and Azure Functions with Application Insights,Learn how to monitor an Azure Functions topology that uses Event Hubs.,"Monitoring provides insights into the behavior and health of your systems, and helps build a holistic view of the environment, historic trends, correlate diverse factors, and measure changes in performance, consumption, or error rate. Azure Functions offers built-in integration with Application Insights. From Application Insights, you can get information such as the number of function app instances or request and dependency telemetry of a function. 
When you work with Functions and Event Hubs, App",2025-10-30T05:03:00Z,concept-article,best-practices,0.84,True,Observability guidance with specific monitoring configurations and telemetry patterns for Functions and Event Hubs topologies.,unchanged -https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/performance-scale,Performance and scale,Performance and scale guidance for Event Hubs with Azure Functions - Azure Architecture Center,Optimize performance and scale for Event Hubs-triggered Functions,Learn how to plan for and deploy more efficient and scalable code that runs on Azure Functions and responds to Event Hubs events.,This article provides guidance for optimizing scalability and performance when you use Azure Event Hubs and Azure Functions together in your applications.,2025-10-30T05:03:00Z,concept-article,best-practices,0.86,True,"Focused on performance and scalability tuning for Event Hubs with Functions, including concrete configuration and deployment recommendations.",unchanged -https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/resilient-design,Resilient design,Resilient design guidance for Event Hubs and Functions - Azure Architecture Center,Design resilient Event Hubs and Azure Functions solutions,Learn how to develop resilient and scalable code that runs on Azure Functions and responds to Event Hubs events.,"Error handling, designing for idempotency and managing retry behavior are a few of the critical measures you can take to ensure Event Hubs triggered functions are resilient and capable of handling large volumes of data. This article covers these crucial concepts and makes recommendations for serverless event-streaming solutions. Azure provides three main messaging services that can be used with Azure Functions to support a wide range of unique, event-driven scenarios. 
Because of its partitioned ",2025-08-21T17:34:00.000Z,concept-article,best-practices,0.86,True,"Provides actionable guidance on error handling, idempotency, and retries for Event Hubs-triggered Functions—clear DOs/DON'Ts and implementation details.",unchanged -https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/security,Security,Secure Azure Functions with Event Hubs - Azure Architecture Center,Secure Azure Functions integrated with Event Hubs,Learn how to securely develop and deploy efficient and scalable code that runs on Azure Functions and responds to Event Hubs events.,"When configuring access to resources in Azure, you should apply fine-grained control over permissions to resources. Access to these resources should be based on need to know and least privilege security principles to make sure that clients can only perform the limited set of actions granted to them.",2025-12-08T18:34:00.000Z,concept-article,best-practices,0.84,True,"Security-focused implementation guidance for Functions and Event Hubs, applying least privilege and fine-grained access with concrete configuration advice.",unchanged +https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/event-hubs-functions,Overview,Event Hubs with Azure Functions - Azure Architecture Center,Architect Event Hubs integrations with Azure Functions,"Learn how to architect, develop, and deploy efficient and scalable code that runs on Azure Functions and responds to Event Hubs events.","Solutions that use Azure Event Hubs together with Azure Functions benefit from a serverless architecture that is scalable, cost-effective, and capable of processing large volumes of data in near real time. Although these services are commonly used together, there are many features, settings, and intricacies that add complexity to their relationship. 
This article provides guidance on how to effectively take advantage of this integration by highlighting key considerations and techniques for performa",2026-04-23T17:32:00Z,concept-article,best-practices,0.76,True,"Provides detailed guidance on architecting, developing, and deploying Event Hubs-triggered Azure Functions, including specific settings and integration nuances. This is cross-cutting implementation guidance with concrete recommendations (DOs/techniques), fitting best-practices rather than a pattern or solution idea.",updated +https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/observability,Observability,Monitor Azure Functions and Event Hubs - Azure Architecture Center,Monitor Event Hubs and Azure Functions with Application Insights,Learn how to monitor an Azure Functions topology that uses Event Hubs.,"Monitoring provides insights into the behavior and health of your systems, and helps build a holistic view of the environment, historic trends, correlate diverse factors, and measure changes in performance, consumption, or error rate. Azure Functions offers built-in integration with Application Insights. From Application Insights, you can get information such as the number of function app instances or request and dependency telemetry of a function. When you work with Functions and Event Hubs, App",2026-04-23T17:32:00Z,concept-article,best-practices,0.8,True,"Explains how to monitor a Functions + Event Hubs topology using Application Insights, including what telemetry to use and how to interpret it. 
This is concrete monitoring/observability implementation guidance, a classic cross-cutting best-practices scenario.",updated +https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/performance-scale,Performance and scale,Performance and scale guidance for Event Hubs with Azure Functions - Azure Architecture Center,Optimize performance and scale for Event Hubs Functions,Learn how to plan for and deploy more efficient and scalable code that runs on Azure Functions and responds to Event Hubs events.,This article provides guidance for optimizing scalability and performance when you use Azure Event Hubs and Azure Functions together in your applications.,2026-04-23T17:32:00Z,concept-article,best-practices,0.86,True,"Focused on performance and scalability tuning for Event Hubs with Azure Functions, including concrete optimization guidance and likely specific configuration recommendations. This is actionable implementation guidance (DOs) across performance/scale, matching best-practices.",updated +https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/resilient-design,Resilient design,Resilient design guidance for Event Hubs and Functions - Azure Architecture Center,Design resilient Event Hubs-triggered Azure Functions,Learn how to develop resilient and scalable code that runs on Azure Functions and responds to Event Hubs events.,"Error handling, designing for idempotency and managing retry behavior are a few of the critical measures you can take to ensure Event Hubs triggered functions are resilient and capable of handling large volumes of data. This article covers these crucial concepts and makes recommendations for serverless event-streaming solutions. Azure provides three main messaging services that can be used with Azure Functions to support a wide range of unique, event-driven scenarios. 
Because of its partitioned ",2026-04-22T08:00:00.000Z,concept-article,best-practices,0.84,True,"Covers error handling, idempotency, and retry behavior with specific recommendations for resilient Event Hubs-triggered Functions. This is cross-cutting resilience guidance with concrete implementation advice, aligning with best-practices.",updated +https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/security,Security,Secure Azure Functions with Event Hubs - Azure Architecture Center,Secure Azure Functions that consume Event Hubs,Learn how to securely develop and deploy efficient and scalable code that runs on Azure Functions and responds to Event Hubs events.,"When configuring access to resources in Azure, you should apply fine-grained control over permissions to resources. Access to these resources should be based on need to know and least privilege security principles to make sure that clients can only perform the limited set of actions granted to them.",2026-04-22T08:00:00.000Z,concept-article,best-practices,0.82,True,"Provides detailed security guidance (least privilege, fine-grained access) for Functions and Event Hubs integration, with specific configuration practices. This is security-focused implementation guidance with DO/DO-NOT style recommendations, fitting best-practices.",updated
Use this article as a starting point to design a well-architected solution that aligns with your workload's specific requirements. Cloud architects, data engineers, and retail solution strategists can use this architecture to build a scalable, secure, and intelligent real-time analytics solution by using Azure ",2026-04-02T17:32:00Z,solution-idea,solution-ideas,0.9,True,"URL path includes /solution-ideas/articles/. The page presents a solution idea with an architecture diagram, explains how Azure Service Bus and Microsoft Fabric integrate for near real-time analytics, and offers scenario-specific guidance for enterprise-scale implementations. It’s less detailed than a full reference architecture and fits the solution-ideas category with practical expert guidance.",unchanged https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/azure-databricks-modern-analytics-architecture,Modern analytics with Azure Databricks,Create a Modern Analytics Architecture by Using Azure Databricks - Azure Architecture Center,Design a modern analytics architecture with Azure Databricks,"Learn how to create a modern analytics architecture by using Azure Databricks and Data Lake Storage. Unify data, analytics, and AI workloads at any scale.",Solution ideas This article describes a solution idea. Your cloud architect can use this guidance to help visualize the major components for a typical implementation of this architecture. Use this article as a starting point to design a well-architected solution that aligns with your workload's specific requirements. This solution outlines the key principles and components of modern data architectures. Azure Databricks is central to the solution. This platform integrates directly with other serv,2026-04-03T17:33:00Z,solution-idea,solution-ideas,0.9,True,"URL path includes /solution-ideas/articles/. 
The article describes a solution idea with a conceptual architecture diagram, shows how multiple Azure services (Databricks, Data Lake Storage, etc.) work together, and provides scenario-specific guidance without deep deployment code or sizing tables. This matches the solution-ideas criteria and contains concrete architectural know-how.",unchanged https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/azure-security-build-first-layer-defense,First layer of defense: Azure security,Building the First Layer of Defense with Azure Security Services - Azure Architecture Center,Build first security layer with core Azure security services,Secure your IT environment by using Azure security services and Azure Security Benchmark. This article is part of a series.,Solution ideas This article describes a solution idea. Your cloud architect can use this guidance to help visualize the major components for a typical implementation of this architecture. Use this article as a starting point to design a well-architected solution that aligns with your workload's specific requirements. You can use various Azure services to create a complete IT infrastructure for your organization. Azure also provides security services that help you protect that infrastructure. By ,2025-12-18T18:34:00Z,solution-idea,solution-ideas,0.9,True,solution-ideas/articles URL; scenario-specific guidance on composing Azure security services into a foundational defense layer.,unchanged https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/compute-get-started,Get started,Get Started with Compute Architecture Design - Azure Architecture Center,,"Learn about Azure compute technologies, from virtual machines to serverless functions. Review guidance, solution ideas, and reference architectures.","Compute resources form the foundation of cloud workloads. 
Azure provides a wide range of compute options to meet diverse requirements, from virtual machines (VMs) that give you full control of the operating system to fully managed serverless functions that scale automatically. Azure compute services support workload migration, cloud-native application development, and high-performance computing (HPC) simulations. They also provide flexibility and scaling capabilities to improve workload performa",2026-03-12T17:42:00.000Z,concept-article,,0.2,False,"Although under /solution-ideas/, this page is a broad overview of Azure compute options and links to other guidance. It is primarily conceptual and navigational, not a specific architecture with detailed implementation or scenario-specific design; it lacks the depth of expert deployment or configuration knowledge required for classification.",unchanged https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/data-streaming-scenario,Data streaming with AKS,Data streaming with AKS - Azure Architecture Center,Stream and process real-time data with AKS and Kafka,"Use AKS to easily ingest and process a real-time data stream, with millions of data points collected via sensors.","Solution ideas This article describes a solution idea. Your cloud architect can use this guidance to help visualize the major components for a typical implementation of this architecture. Use this article as a starting point to design a well-architected solution that aligns with your workload's specific requirements. This article presents a solution for using Azure Kubernetes Service (AKS) to quickly process and analyze a large volume of streaming data from devices. 
Apache®, Apache Kafka, and Apac",2025-10-10T17:33:00Z,solution-idea,solution-ideas,0.85,True,Solution idea under /solution-ideas/ describing AKS-based data streaming architecture; conceptual architecture diagram and service interactions tailored to streaming scenario.,unchanged -https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/devsecops-infrastructure-as-code,Solution idea: DevSecOps for infrastructure as code,DevSecOps for Infrastructure as Code (IaC) - Azure Architecture Center,Design a DevSecOps IaC pipeline with GitHub and Azure,Learn how to use DevSecOps for IaC to more securely and efficiently deploy cloud infrastructure into a new Azure landing zone.,"Solution ideas This article describes a solution idea. Your cloud architect can use this guidance to help visualize the major components for a typical implementation of this architecture. Use this article as a starting point to design a well-architected solution that aligns with your workload's specific requirements. This solution idea illustrates the DevSecOps pipeline that uses GitHub for infrastructure as code (IaC). 
It also describes how to govern the workflow for operational excellence, sec",2026-04-15T05:05:00Z,solution-idea,solution-ideas,0.95,True,"URL path includes solution-ideas/articles/, and the page is explicitly labeled as a solution idea with a conceptual architecture diagram for a DevSecOps IaC pipeline using GitHub and Azure landing zones; it is higher-level than a reference architecture and matches the solution-ideas criteria.",updated +https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/devsecops-infrastructure-as-code,Solution idea: DevSecOps for infrastructure as code,DevSecOps for Infrastructure as Code (IaC) - Azure Architecture Center,Design a DevSecOps IaC pipeline with GitHub and Azure,Learn how to use DevSecOps for IaC to more securely and efficiently deploy cloud infrastructure into a new Azure landing zone.,"Solution ideas This article describes a solution idea. Your cloud architect can use this guidance to help visualize the major components for a typical implementation of this architecture. Use this article as a starting point to design a well-architected solution that aligns with your workload's specific requirements. This solution idea illustrates the DevSecOps pipeline that uses GitHub for infrastructure as code (IaC). 
It also describes how to govern the workflow for operational excellence, sec",2026-04-15T05:05:00Z,solution-idea,solution-ideas,0.95,True,"URL path includes solution-ideas/articles/, and the page is explicitly labeled as a solution idea with a conceptual architecture diagram for a DevSecOps IaC pipeline using GitHub and Azure landing zones; it is higher-level than a reference architecture and matches the solution-ideas criteria.",unchanged https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/ingest-etl-stream-with-adb,"Ingestion, ETL, and stream processing",Build ETL pipelines with Azure Databricks and Delta Lake - Azure Architecture Center,Build batch and streaming ETL with Databricks and Delta Lake,Create ETL pipelines for batch and streaming data with Azure Databricks to simplify data lake ingestion at any scale.,"Solution ideas This article describes a solution idea. Your cloud architect can use this guidance to help visualize the major components for a typical implementation of this architecture. Use this article as a starting point to design a well-architected solution that aligns with your workload's specific requirements. Your organization needs to ingest data of any format, size, and speed into the cloud in a consistent way. 
The solution in this article meets that need with an architecture that impl",2025-09-15T17:33:00Z,solution-idea,solution-ideas,0.95,True,"URL under /solution-ideas/articles/, explicitly labeled as a solution idea with conceptual architecture diagram and scenario-specific guidance for ETL pipelines.",unchanged https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/iot-azure-data-explorer,IoT Hub analytics with Azure Data Explorer,IoT analytics with Azure Data Explorer and Azure IoT Hub - Azure Architecture Center,Analyze IoT telemetry with Azure Data Explorer and IoT Hub,"Use Azure Data Explorer with Azure IoT Hub for near real-time IoT telemetry analytics on fast-flowing, high-volume streaming data from a wide range of IoT devices.","Solution ideas This article describes a solution idea. Your cloud architect can use this guidance to help visualize the major components for a typical implementation of this architecture. Use this article as a starting point to design a well-architected solution that aligns with your workload's specific requirements. This solution idea describes how Azure Data Explorer provides near real-time analytics for fast flowing, high volume streaming data from internet of things (IoT) devices and sensors",2026-02-13T18:32:00Z,solution-idea,solution-ideas,0.94,True,"solution-ideas/articles URL; conceptual architecture showing IoT Hub, ADX, and related services for near real-time IoT analytics with scenario-specific guidance.",unchanged https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/mainframe-azure-file-replication,Mainframe file replication on Azure,Mainframe File Replication and Sync on Azure - Azure Architecture Center,Replicate and sync mainframe files to Azure storage,"Learn about several options for moving, converting, transforming, and storing mainframe and midrange file system data on-premises and in Azure.","Solution ideas This article describes a solution idea. 
Your cloud architect can use this guidance to help visualize the major components for a typical implementation of this architecture. Use this article as a starting point to design a well-architected solution that aligns with your workload's specific requirements. When you migrate an on-premises mainframe or midrange application to Azure, data transfer is a key consideration. Several modernization scenarios require you to replicate files to A",2025-09-15T17:33:00Z,solution-idea,solution-ideas,0.9,True,solution-ideas/articles URL; describes options and architectures for moving and transforming mainframe/midrange file data to Azure in modernization scenarios.,unchanged diff --git a/products/azure-architecture/report.md b/products/azure-architecture/report.md index b8662637..00c74bdb 100644 --- a/products/azure-architecture/report.md +++ b/products/azure-architecture/report.md @@ -1,58 +1,58 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: - example-workloads: End-to-end reference architectures and patterns for real-world - Azure workloads, covering data/AI, mainframe migration, networking, security, - hybrid/Arc, AKS, Fabric, VDI, and app modernization. - reference-architectures: 'End-to-end Azure solution blueprints for mission-critical, - hybrid, and multiregion workloads: networking, security, data/ML, AKS, AVD, integration, - and DR-ready reference architectures.' - design-patterns: 'Design and integration patterns for building resilient, scalable, - secure Azure apps: messaging, gateways, multitenancy, caching, transactions, auth, - monitoring, and legacy modernization.' - technology-choices: Guides for choosing the right Azure/Fabric services (AI, data, - storage, compute, networking, containers, messaging) by comparing options for - specific workload and architecture needs. 
- best-practices: 'Best-practice patterns for Azure apps: API design, autoscale, caching, - DR/HA, networking/IPv4-6, AKS, IoT, SAP, Event Hubs, App Service, DevSecOps, and - GenAI/RAG operations.' - solution-ideas: End-to-end solution patterns for AI, analytics, security, DevSecOps, - SAP, and data platforms on Azure, including reference architectures and implementation - guidance. + example-workloads: 'End-to-end reference architectures for real-world Azure workloads: + data/AI pipelines, mainframe migration, AKS and networking, security/Zero Trust, + hybrid/Arc, VDI, and enterprise app deployments.' + reference-architectures: 'End-to-end Azure solution blueprints: mission-critical, + hybrid, and multiregion architectures for data, networking, security, AKS, AVD, + MLOps, integration, and global-scale app delivery.' + design-patterns: 'Patterns for designing resilient, scalable Azure apps: messaging, + gateways, multitenancy, data partitioning, caching, workflows, auth, monitoring, + and integration with legacy systems.' + technology-choices: Guides for choosing the right Azure/Fabric services (compute, + storage, data, AI/ML, analytics, networking, messaging) for specific workloads + and architectural requirements. + best-practices: 'Best-practice patterns for Azure apps: API design, autoscaling, + caching, DR, networking, AKS ops, IoT, SAP, Event Hubs/Functions, multitenant/RAG, + and secure, resilient cloud architectures.' + solution-ideas: End-to-end solution patterns for AI, analytics, IoT, security, SAP, + and data platforms on Azure, including architecture, services to use, and reference + implementations. anti-patterns: Diagnosing and fixing common Azure performance and scalability anti-patterns (busy DB/front end, chatty I/O, no caching, retry storms, noisy neighbors, sync I/O, monolithic persistence). 
migration-guides: Guides for migrating from AWS, GCP, on-prem Oracle, Kafka, and Kubernetes/EKS to Azure, including service mapping, architecture, governance, security, cost, and app modernization patterns. - architecture-styles: 'Azure app architecture patterns: when and how to use Big Compute, - Big Data, event-driven, microservices, N-tier, and Web-Queue-Worker styles, with - design guidance and tradeoffs.' + architecture-styles: Guidance on choosing and designing Azure app architectures + (microservices, N‑tier, event-driven, web‑queue‑worker, big data, big compute) + and when/how to apply each style. skill_description: Expert guidance for designing Azure solutions using Azure Architecture. Covers reference architectures, solution ideas, design patterns, technology choices, architecture styles, best practices, anti-patterns, example workloads, and migration - guides. Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregion - DR, SAP/IoT platforms, or GenAI/RAG workloads, and other Azure Architecture related - development tasks. -use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregion - DR, SAP/IoT platforms, or GenAI/RAG workloads, and other Azure Architecture related - development tasks. + guides. Use when designing AKS, data/AI pipelines, Zero Trust networking, hybrid/Arc + setups, or multiregion apps on Azure, and other Azure Architecture related development + tasks. +use_when: Use when designing AKS, data/AI pipelines, Zero Trust networking, hybrid/Arc + setups, or multiregion apps on Azure, and other Azure Architecture related development + tasks. 
--- # Azure Architecture Crawl Report ## Summary -- **Total Pages**: 454 -- **Fetched**: 454 +- **Total Pages**: 452 +- **Fetched**: 452 - **Fetch Failed**: 0 -- **Classified**: 335 -- **Unclassified**: 119 +- **Classified**: 336 +- **Unclassified**: 116 ### Incremental Update -- **New Pages**: 0 -- **Updated Pages**: 12 -- **Unchanged**: 442 -- **Deleted Pages**: 0 +- **New Pages**: 2 +- **Updated Pages**: 18 +- **Unchanged**: 432 +- **Deleted Pages**: 4 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-architecture/azure-architecture.csv` ## Classification Statistics @@ -60,44 +60,68 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio | Type | Count | Percentage | |------|-------|------------| | anti-patterns | 11 | 2.4% | -| architecture-styles | 7 | 1.5% | -| best-practices | 51 | 11.2% | -| design-patterns | 50 | 11.0% | -| example-workloads | 73 | 16.1% | -| migration-guides | 32 | 7.0% | -| reference-architectures | 52 | 11.5% | +| architecture-styles | 8 | 1.8% | +| best-practices | 51 | 11.3% | +| design-patterns | 50 | 11.1% | +| example-workloads | 74 | 16.4% | +| migration-guides | 32 | 7.1% | +| reference-architectures | 51 | 11.3% | | solution-ideas | 28 | 6.2% | -| technology-choices | 31 | 6.8% | -| *(Unclassified)* | 119 | 26.2% | +| technology-choices | 31 | 6.9% | +| *(Unclassified)* | 116 | 25.7% | ## Changes +### New Pages + +- [Choose a Microsoft Fabric deployment pattern](https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/fabric-deployment-patterns) +- [Certificate life cycle management on Azure](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/certificate-lifecycle/) + ### Updated Pages -- [Analytical data stores](https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/analytical-data-stores) - - Updated: 2025-09-12T17:34:00.000Z → 2026-04-15T08:00:00.000Z -- [Migrate EKS workloads to 
AKS](https://learn.microsoft.com/en-us/azure/architecture/aws-professional/eks-to-aks/migrate) - - Updated: 2025-04-08T17:31:00.000Z → 2026-04-14T05:03:00.000Z - [What's new](https://learn.microsoft.com/en-us/azure/architecture/changelog) - - Updated: 2026-04-08T17:34:00.000Z → 2026-04-14T05:03:00.000Z -- [CI/CD for microservices on Kubernetes](https://learn.microsoft.com/en-us/azure/architecture/microservices/ci-cd-kubernetes) - - Updated: 2025-10-30T05:03:00Z → 2026-04-15T05:05:00Z + - Updated: 2026-04-14T05:03:00.000Z → 2026-04-24T05:04:00.000Z +- [Introduction](https://learn.microsoft.com/en-us/azure/architecture/microservices/design/) + - Updated: 2025-09-26T05:04:00.000Z → 2026-04-24T17:33:00.000Z +- [Choose a compute option](https://learn.microsoft.com/en-us/azure/architecture/microservices/design/compute-options) + - Updated: 2026-02-11T18:33:00.000Z → 2026-04-24T17:33:00.000Z +- [Microservices compute options](https://learn.microsoft.com/en-us/azure/architecture/microservices/design/compute-options) + - Updated: 2026-02-11T18:33:00.000Z → 2026-04-24T17:33:00.000Z +- [Natural language processing](https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/natural-language-processing) + - Updated: 2026-02-19T06:03:00.000Z → 2026-04-21T17:34:00.000Z +- [Compensating Transaction](https://learn.microsoft.com/en-us/azure/architecture/patterns/compensating-transaction) + - Updated: 2025-12-09T06:03:00Z → 2026-04-20T17:34:00.000Z +- [Azure Sandbox](https://learn.microsoft.com/en-us/azure/architecture/guide/azure-sandbox/azure-sandbox) + - Updated: 2025-09-13T05:04:00.000Z → 2026-04-24T17:33:00.000Z - [DevSecOps on AKS](https://learn.microsoft.com/en-us/azure/architecture/guide/devsecops/devsecops-on-aks) - - Updated: 2025-10-30T05:03:00Z → 2026-04-15T05:05:00Z -- [Use deployment scripts to check resource properties](https://learn.microsoft.com/en-us/azure/architecture/guide/devops/deployment-scripts-property-check) - - Updated: 
2025-10-30T05:03:00Z → 2026-04-15T05:05:00Z -- [Automate API deployments with APIOps](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/devops/automated-api-deployments-apiops) - - Updated: 2026-02-26T06:03:00Z → 2026-04-15T05:05:00Z -- [Manage Microsoft 365 with DevOps](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/devops/manage-microsoft-365-tenant-configuration-microsoft365dsc-devops) - - Updated: 2025-04-17T17:31:00Z → 2026-04-15T05:05:00Z -- [Solution idea: DevSecOps for infrastructure as code](https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/devsecops-infrastructure-as-code) - - Updated: 2025-09-15T17:33:00Z → 2026-04-15T05:05:00Z -- [Hub-spoke network topology in Azure](https://learn.microsoft.com/en-us/azure/architecture/networking/architecture/hub-spoke) - - Updated: 2025-06-13T05:16:00Z → 2026-04-14T17:32:00Z -- [Conceptual planning for IPv6 networking](https://learn.microsoft.com/en-us/azure/architecture/networking/guide/ipv6-ip-planning) - - Updated: 2025-01-17T08:00:00.000Z → 2026-04-14T05:03:00.000Z -- [Computer forensics](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/forensics/) - - Updated: 2025-09-04T17:35:00Z → 2026-04-14T05:03:00Z + - Updated: 2026-04-15T05:05:00Z → 2026-04-24T17:33:00.000Z +- [Manage virtual machine compliance](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/security/virtual-machine-compliance) + - Updated: 2025-04-17T17:31:00Z → 2026-04-22T17:33:00Z +- [Data lakes](https://learn.microsoft.com/en-us/azure/architecture/data-guide/scenarios/data-lake) + - Updated: 2025-09-16T17:40:00.000Z → 2026-04-21T05:03:00.000Z +- [AKS on Azure Local baseline architecture](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/hybrid/aks-baseline) + - Updated: 2025-09-04T17:35:00Z → 2026-04-23T17:32:00Z +- 
[Overview](https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/event-hubs-functions) + - Updated: 2025-10-30T05:03:00Z → 2026-04-23T17:32:00Z +- [Performance and scale](https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/performance-scale) + - Updated: 2025-10-30T05:03:00Z → 2026-04-23T17:32:00Z +- [Resilient design](https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/resilient-design) + - Updated: 2025-08-21T17:34:00.000Z → 2026-04-22T08:00:00.000Z +- [Security](https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/security) + - Updated: 2025-12-08T18:34:00.000Z → 2026-04-22T08:00:00.000Z +- [Observability](https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/observability) + - Updated: 2025-10-30T05:03:00Z → 2026-04-23T17:32:00Z +- [Basic enterprise integration on Azure](https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/enterprise-integration/basic-enterprise-integration) + - Updated: 2025-06-13T05:16:00Z → 2026-04-20T17:34:00Z +- [Azure Load Testing for IoT Hub](https://learn.microsoft.com/en-us/azure/architecture/guide/testing/load-testing/load-testing-with-custom-plugins) + - Updated: 2025-03-20T17:31:00.000Z → 2026-04-21T08:00:00.000Z + +### Deleted Pages + +- ~~Microsoft Fabric deployment patterns~~ (https://learn.microsoft.com/en-us/azure/architecture/analytics/architecture/fabric-deployment-patterns) +- ~~Certificate lifecycle management on Azure~~ (https://learn.microsoft.com/en-us/azure/architecture/example-scenario/certificate-lifecycle/) +- ~~Unisys Dorado migration~~ (https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/migrate-unisys-dorado-mainframe-apps-with-astadia-micro-focus) +- ~~Compare Java application hosting options~~ (https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/service-for-java-comparison) ## 
Classified Pages @@ -122,6 +146,7 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio | [Cache-Aside](https://learn.microsoft.com/en-us/azure/architecture/patterns/cache-aside) | design-patterns | 0.95 | patterns/cache-aside describes the Cache-Aside pattern with problem/solution and implementation details—classic design pattern content. | | [Chatty I/O](https://learn.microsoft.com/en-us/azure/architecture/antipatterns/chatty-io/) | anti-patterns | 0.95 | Antipattern article with named issue, symptoms, and mitigation steps under /antipatterns/. | | [Choose an analytical data store in Microsoft Fabric](https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/fabric-analytical-data-stores) | technology-choices | 0.95 | Under /technology-choices/, evaluates multiple Fabric analytical data stores with selection factors and trade-offs, matching technology-choices. | +| [Compensating Transaction](https://learn.microsoft.com/en-us/azure/architecture/patterns/compensating-transaction) | design-patterns | 0.95 | The page is under the patterns/ path and describes a named design pattern (Compensating Transaction) with concrete implementation guidance for eventually consistent operations. Azure Architecture Center pattern pages typically include structured sections like context/problem, solution, and considerations, which go beyond generic concepts and provide reusable, expert implementation knowledge for distributed systems. | | [Competing Consumers](https://learn.microsoft.com/en-us/azure/architecture/patterns/competing-consumers) | design-patterns | 0.95 | URL is under /patterns/, explains the named Competing Consumers pattern with problem/solution, when to use, and implementation considerations for concurrent message processing. 
| | [Cross-region resiliency for SQL TDE with Azure Key Vault Managed HSM](https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/secure-sql-managed-instance-managed-hardware-security-module) | solution-ideas | 0.95 | URL path includes solution-ideas/articles; the page is explicitly labeled a solution idea and presents a conceptual architecture for SQL Managed Instance with Azure Key Vault Managed HSM for TDE keys, matching the solution-ideas criteria. | | [Extraneous Fetching](https://learn.microsoft.com/en-us/azure/architecture/antipatterns/extraneous-fetching/) | anti-patterns | 0.95 | Describes a named antipattern, its impact, and concrete remediation guidance under /antipatterns/. | @@ -147,7 +172,6 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio | [IoT Hub analytics with Azure Data Explorer](https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/iot-azure-data-explorer) | solution-ideas | 0.94 | solution-ideas/articles URL; conceptual architecture showing IoT Hub, ADX, and related services for near real-time IoT analytics with scenario-specific guidance. | | [Microsoft Sentinel automated responses](https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/microsoft-sentinel-automated-response) | solution-ideas | 0.93 | solution-ideas/articles URL; describes architecture and usage of Sentinel playbooks for automated incident response in enterprise environments. | | [Multilayered protection for Azure VMs](https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/multilayered-protection-azure-vm) | solution-ideas | 0.93 | solution-ideas/articles URL; conceptual architecture for layered VM protection using Azure and Entra ID security services with scenario-specific recommendations. 
| -| [Basic enterprise integration on Azure](https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/enterprise-integration/basic-enterprise-integration) | reference-architectures | 0.92 | Under reference-architectures/enterprise-integration; describes concrete Azure services, integration flows, and implementation details for a production-ready basic enterprise integration pattern. | | [AKS baseline for multi-region clusters](https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/containers/aks-multi-region/aks-multi-cluster) | reference-architectures | 0.90 | Reference architecture under /reference-architectures/containers/aks-multi-region/ describing active/active multiregion AKS with concrete configuration and HA recommendations. | | [API implementation](https://learn.microsoft.com/en-us/azure/architecture/best-practices/api-implementation) | best-practices | 0.90 | Best-practices article with actionable implementation guidance for web APIs, including deployment and environment considerations. | | [Advanced microservices on AKS](https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/containers/aks-microservices/aks-microservices-advanced) | reference-architectures | 0.90 | Advanced reference architecture building on AKS baseline; covers network policies, autoscaling, distributed tracing with specific AKS settings, clearly production-ready implementation guidance. | @@ -155,16 +179,16 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio | [Analyze MongoDB Atlas data](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/analytics/sync-mongodb-atlas-fabric-analytics) | example-workloads | 0.90 | URL contains example-scenario/analytics and the article details how to integrate MongoDB Atlas with Fabric using open mirroring for low-latency ingestion, which is a specific technical implementation scenario matching example-workloads. 
| | [Autoscaling](https://learn.microsoft.com/en-us/azure/architecture/best-practices/auto-scaling) | best-practices | 0.90 | Focused best-practices article (URL contains best-practices/auto-scaling) with specific DOs/DON’Ts and implementation guidance for autoscaling in Azure environments, which is detailed operational expertise rather than high-level concepts. | | [Azure Data Factory mission critical architecture](https://learn.microsoft.com/en-us/azure/architecture/databases/architecture/azure-data-factory-mission-critical) | reference-architectures | 0.90 | URL contains mission-critical; provides specific changes and recommendations for mission-critical workloads aligned with CAF guidance. | -| [Azure Load Testing for IoT Hub](https://learn.microsoft.com/en-us/azure/architecture/guide/testing/load-testing/load-testing-with-custom-plugins) | solution-ideas | 0.90 | Explicitly labeled solution idea; provides architecture using Azure Load Testing, JMeter plugins, KPIs, and dashboards to simulate device behaviors—scenario-specific but not full reference architecture. | +| [Azure Load Testing for IoT Hub](https://learn.microsoft.com/en-us/azure/architecture/guide/testing/load-testing/load-testing-with-custom-plugins) | solution-ideas | 0.90 | The page is explicitly labeled as a solution idea and describes a conceptual architecture using Azure Load Testing, JMeter, and custom plugins to simulate device behaviors. It provides a high-level blueprint with major components, less detailed than a reference architecture, matching solution-ideas. | | [Azure Local baseline](https://learn.microsoft.com/en-us/azure/architecture/hybrid/azure-local-baseline) | reference-architectures | 0.90 | Baseline reference architecture URL with detailed cluster design, component breakdown, and production-ready recommendations for Azure Local infrastructure. 
|
| [Backends for Frontends](https://learn.microsoft.com/en-us/azure/architecture/patterns/backends-for-frontends) | design-patterns | 0.90 | patterns/backends-for-frontends describes the Backends for Frontends pattern with context, solution, and usage guidance—canonical design pattern content. |
| [Background jobs](https://learn.microsoft.com/en-us/azure/architecture/best-practices/background-jobs) | best-practices | 0.90 | Located under best-practices and provides concrete DOs/DON’Ts and implementation guidance for background jobs (hosting options, reliability, scaling, event-driven vs scheduled). This is cross-cutting architectural guidance with actionable recommendations, fitting best-practices and containing detailed, Azure-specific implementation advice. |
+| [Basic enterprise integration on Azure](https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/enterprise-integration/basic-enterprise-integration) | reference-architectures | 0.90 | URL contains reference-architectures/, and the page describes a production-ready enterprise integration architecture using Azure Integration Services (Logic Apps, API Management). Reference architectures in this section typically include detailed diagrams, service choices, and deployment recommendations, matching the reference-architectures criteria. |
| [Big data storage](https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/data-storage) | technology-choices | 0.90 | Compares big data storage options with key selection criteria and capability matrix, fitting technology-choices. |
| [Build workloads with Azure Spot Virtual Machines](https://learn.microsoft.com/en-us/azure/architecture/guide/spot/spot-eviction) | best-practices | 0.90 | Explicitly described as best practices for building reliable workloads with Spot VMs, including actionable recommendations and trade-offs. |
| [Caching](https://learn.microsoft.com/en-us/azure/architecture/best-practices/caching) | best-practices | 0.90 | Under best-practices and gives specific recommendations on when and how to cache, likely including patterns, pitfalls, and configuration considerations. It offers actionable DOs/DON’Ts and concrete examples for improving performance and scalability, which is expert implementation guidance across scenarios. |
| [Choreography](https://learn.microsoft.com/en-us/azure/architecture/patterns/choreography) | design-patterns | 0.90 | URL is under /patterns/, describes the Choreography pattern with decentralized workflow logic, including context, solution, and usage guidance typical of Azure design pattern docs. |
| [Claim Check](https://learn.microsoft.com/en-us/azure/architecture/patterns/claim-check) | design-patterns | 0.90 | patterns/claim-check describes the Claim-Check pattern with explanation of claim tokens and external payload storage—named pattern with implementation considerations. |
-| [Compensating Transaction](https://learn.microsoft.com/en-us/azure/architecture/patterns/compensating-transaction) | design-patterns | 0.90 | patterns/compensating-transaction describes a named pattern for undoing work in eventually consistent operations, with context and solution—design-patterns. |
| [Compute](https://learn.microsoft.com/en-us/azure/architecture/aws-professional/compute) | migration-guides | 0.90 | URL contains 'aws-professional/', and the page compares Azure and AWS compute services. It provides service mapping and detailed comparison to support migration/transition decisions, matching the migration-guides definition and containing expert, platform-specific knowledge. |
| [Compute Resource Consolidation](https://learn.microsoft.com/en-us/azure/architecture/patterns/compute-resource-consolidation) | design-patterns | 0.90 | URL is under /patterns/, presents the Compute Resource Consolidation pattern with context, solution, and trade-offs for optimizing compute utilization and cost. |
| [Connect on-premises with ExpressRoute](https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/hybrid-networking/expressroute-vpn-failover) | reference-architectures | 0.90 | Reference-architectures URL with detailed hybrid network design using ExpressRoute plus VPN failover, including production-ready configuration guidance. |
@@ -187,7 +211,6 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio
| [Mainframe file replication on Azure](https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/mainframe-azure-file-replication) | solution-ideas | 0.90 | solution-ideas/articles URL; describes options and architectures for moving and transforming mainframe/midrange file data to Azure in modernization scenarios. |
| [Map threats to your IT environment](https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/map-threats-it-environment) | solution-ideas | 0.90 | solution-ideas/articles URL; provides a structured approach and architecture for mapping threats to an IT environment using Azure security tooling. |
| [Microservices architecture on AKS](https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/containers/aks-microservices/aks-microservices) | reference-architectures | 0.90 | Reference architecture under /reference-architectures/containers/aks-microservices; describes concrete AKS configuration, networking choice (Azure CNI with Cilium), and production-focused guidance. |
-| [Microservices compute options](https://learn.microsoft.com/en-us/azure/architecture/microservices/design/compute-options) | technology-choices | 0.90 | Same as index 17: compares compute options for microservices with selection criteria; URL under guide/technology-choices/, matching technology-choices. |
| [Modern analytics with Azure Databricks](https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/azure-databricks-modern-analytics-architecture) | solution-ideas | 0.90 | URL path includes /solution-ideas/articles/. The article describes a solution idea with a conceptual architecture diagram, shows how multiple Azure services (Databricks, Data Lake Storage, etc.) work together, and provides scenario-specific guidance without deep deployment code or sizing tables. This matches the solution-ideas criteria and contains concrete architectural know-how. |
| [Modern data warehouse for small business](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/data/small-medium-data-warehouse) | example-workloads | 0.90 | Under /example-scenario/data/, focuses on SMB-specific modernization paths with concrete services and progressive implementation details. |
| [Monitoring and diagnostics](https://learn.microsoft.com/en-us/azure/architecture/best-practices/monitoring) | best-practices | 0.90 | Monitoring and diagnostics guidance under best-practices with specific recommendations on tracking usage, resource utilization, and health for distributed apps. Provides actionable implementation guidance and cross-cutting best practices rather than just conceptual overview, matching the best-practices category. |
@@ -239,17 +262,16 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio
| [Azure Local storage switchless](https://learn.microsoft.com/en-us/azure/architecture/hybrid/azure-local-switchless) | reference-architectures | 0.86 | Builds on Azure Local baseline reference architecture with specific design changes for storage switchless deployments and deployment-focused guidance. |
| [Baseline Microsoft Foundry chat reference architecture](https://learn.microsoft.com/en-us/azure/architecture/ai-ml/architecture/baseline-microsoft-foundry-chat) | reference-architectures | 0.86 | URL path includes 'architecture/baseline', indicating a baseline reference architecture. The description mentions building network-secured, highly available, zone-redundant enterprise chat applications using specific Azure services (Azure App Service, OpenAI GPT models, Microsoft Agent Framework). Baseline reference architectures in this section typically include detailed architecture diagrams, concrete service choices, security and availability configurations, and deployment guidance, which constitute expert, implementation-level knowledge beyond generic LLM training. |
| [Baseline web application with zone redundancy](https://learn.microsoft.com/en-us/azure/architecture/web-apps/app-service/architectures/baseline-zone-redundant) | reference-architectures | 0.86 | Baseline reference architecture with Application Gateway, WAF, Private Link, and PaaS integration; production-ready with detailed network/security design. |
-| [Certificate lifecycle management on Azure](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/certificate-lifecycle/) | example-workloads | 0.86 | example-scenario/certificate-lifecycle URL; detailed workflow and infrastructure for automating renewals with nonintegrated CAs using Key Vault and other Azure components. |
| [Extend on-premises AD FS to Azure](https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/identity/adfs) | reference-architectures | 0.86 | Reference-architectures URL with secure hybrid network and AD FS federation design, including Azure components and trust configuration. |
| [Greenfield lakehouse on Microsoft Fabric](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/data/greenfield-lakehouse-fabric) | example-workloads | 0.86 | URL contains example-scenario/ and describes a concrete greenfield lakehouse implementation on Fabric with platform-specific architecture and implementation details, fitting the example-workloads definition. |
| [Host name preservation](https://learn.microsoft.com/en-us/azure/architecture/best-practices/host-name-preservation) | best-practices | 0.86 | The page is under best-practices/, gives concrete DO/DON'T guidance for preserving the original HTTP host name when using reverse proxies with Azure services, and discusses specific implementation behaviors (cookies, redirects, authentication issues). This is actionable, service-specific guidance that goes beyond generic concepts and fits the best-practices category. |
| [Image and video processing](https://learn.microsoft.com/en-us/azure/architecture/data-guide/ai-services/image-video-processing) | technology-choices | 0.86 | Compares multiple Azure AI/Foundry services for image and video analysis and generation, describing capabilities and use cases to help pick the right service. This is a decision/selection guide across services, matching technology-choices and containing concrete, scenario-based comparison knowledge. |
| [Migrate EKS workloads to AKS](https://learn.microsoft.com/en-us/azure/architecture/aws-professional/eks-to-aks/migrate) | migration-guides | 0.86 | URL path contains aws-professional/eks-to-aks/migrate, and the page focuses on concrete migration strategies and considerations for moving stateless and stateful workloads from Amazon EKS to Azure AKS. This is a cross-cloud migration guide with platform-specific steps and gotchas that go beyond generic conceptual content, matching the migration-guides criteria. |
+| [Natural language processing](https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/natural-language-processing) | technology-choices | 0.86 | The page is under data-guide/technology-choices and compares multiple Azure NLP options (managed APIs vs frameworks) with selection criteria for scenarios like sentiment analysis, entity recognition, and document classification. It provides decision guidance and trade-offs between services, which is specific, expert decision knowledge rather than a generic overview. |
| [On-premises AD domains with Microsoft Entra ID](https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/identity/azure-ad) | reference-architectures | 0.86 | Identity reference architecture for hybrid integration of AD and Entra ID with best-practice topology and synchronization guidance. |
-| [Performance and scale](https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/performance-scale) | best-practices | 0.86 | Focused on performance and scalability tuning for Event Hubs with Functions, including concrete configuration and deployment recommendations. |
+| [Performance and scale](https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/performance-scale) | best-practices | 0.86 | Focused on performance and scalability tuning for Event Hubs with Azure Functions, including concrete optimization guidance and likely specific configuration recommendations. This is actionable implementation guidance (DOs) across performance/scale, matching best-practices. |
| [Real-time monitoring and observable systems for media](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/monitoring/monitoring-observable-systems-media) | example-workloads | 0.86 | example-scenario/monitoring URL; media-industry-specific telemetry and monitoring solution using Event Hubs, Fabric eventstreams, and Data Activator with detailed architecture and implementation steps. |
| [Rehost IMS DC and IMS DB](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/rehost-ims-raincode-imsql) | example-workloads | 0.86 | URL contains example-scenario/mainframe and the article explains how to rehost IMS workloads on Azure using Raincode IMSql without code changes, a concrete migration implementation scenario matching example-workloads. |
-| [Resilient design](https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/resilient-design) | best-practices | 0.86 | Provides actionable guidance on error handling, idempotency, and retries for Event Hubs-triggered Functions—clear DOs/DON'Ts and implementation details. |
| [Speech recognition and generation](https://learn.microsoft.com/en-us/azure/architecture/data-guide/ai-services/speech-recognition-generation) | technology-choices | 0.86 | Describes and contrasts speech-to-text, text-to-speech, speech translation, and related capabilities in Foundry Tools, with guidance on when to use each. This is a service selection guide (comparison and decision criteria) under the AI services data guide, fitting technology-choices with expert, product-specific details. |
| [Targeted language processing](https://learn.microsoft.com/en-us/azure/architecture/data-guide/ai-services/targeted-language-processing) | technology-choices | 0.86 | Focused decision guide for targeted language processing in Foundry Tools (Language, text analytics, translation, document extraction). It compares capabilities and use cases of multiple services and helps select the right one, matching the technology-choices pattern and containing concrete, up-to-date service-specific guidance. |
| [Teamcenter baseline architecture](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/manufacturing/teamcenter-baseline) | example-workloads | 0.86 | URL contains example-scenario/ and 'baseline', indicating a detailed implementation for a specific industry workload (manufacturing PLM). These example-scenario pages typically include concrete Azure service choices, topology, and configuration guidance for running Teamcenter in production-like environments, which goes beyond generic concepts and constitutes expert, scenario-specific knowledge. |
@@ -261,7 +283,7 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio
| [Automate SAP workloads by using SUSE on Azure](https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/sap-workload-automation-suse) | solution-ideas | 0.85 | Solution idea under /solution-ideas/ describing SUSE SAP automation architecture; shows how Azure and SUSE components work together for SAP workloads with scenario-specific guidance. |
| [Azure hybrid options](https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/hybrid-considerations) | technology-choices | 0.85 | Under technology-choices/, compares Azure hybrid solutions and when to use each, including concrete criteria for selecting among Azure hybrid services. |
| [Build a multiple-agent workflow automation solution by using Semantic Kernel](https://learn.microsoft.com/en-us/azure/architecture/ai-ml/idea/multiple-agent-workflow-automation) | solution-ideas | 0.85 | Labeled as a solution idea; describes multi-agent automation architecture using Microsoft Agent Framework and Azure Container Apps. |
-| [Choose a compute option](https://learn.microsoft.com/en-us/azure/architecture/microservices/design/compute-options) | technology-choices | 0.85 | Explicitly about choosing between compute platforms for microservices; compares options and selection criteria; URL under guide/technology-choices/, matching technology-choices. |
+| [Choose a compute option](https://learn.microsoft.com/en-us/azure/architecture/microservices/design/compute-options) | technology-choices | 0.85 | Decision guide comparing Azure compute options for microservices, including selection criteria and trade-offs. Although URL does not contain technology-choices/, the content clearly focuses on choosing between multiple Azure compute services for a specific scenario. |
| [Content Delivery Network](https://learn.microsoft.com/en-us/azure/architecture/best-practices/cdn) | best-practices | 0.85 | CDN guidance with concrete recommendations on when and how to use CDNs in Azure, matching best-practices. |
| [Customer order forecasting](https://learn.microsoft.com/en-us/azure/architecture/ai-ml/idea/next-order-forecasting) | solution-ideas | 0.85 | Solution idea URL and labeling; provides scenario-specific architecture for SKU-level demand forecasting using Azure ML services. |
| [Data and AI](https://learn.microsoft.com/en-us/azure/architecture/aws-professional/data-ai) | migration-guides | 0.85 | Provides service mapping and differences for data and AI between AWS and Azure; supports migration planning. |
@@ -274,6 +296,7 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio
| [Identity and access management](https://learn.microsoft.com/en-us/azure/architecture/aws-professional/eks-to-aks/workload-identity) | migration-guides | 0.85 | Deep comparison of workload identity and access between EKS and AKS, guiding migration and design decisions. |
| [Image classification](https://learn.microsoft.com/en-us/azure/architecture/ai-ml/idea/intelligent-apps-image-processing) | solution-ideas | 0.85 | Marked as a solution idea; shows how multiple Azure services (Computer Vision, Functions, etc.) work together for image processing. |
| [Load balancing options](https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/load-balancing-overview) | technology-choices | 0.85 | Load-balancing-overview under technology-choices/ that compares Azure load balancing services and when to use each, including trade-offs. |
+| [Microservices compute options](https://learn.microsoft.com/en-us/azure/architecture/microservices/design/compute-options) | technology-choices | 0.85 | Duplicate of index 2: decision guide comparing Azure compute options for microservices with scenario-specific selection criteria and trade-offs. |
| [Microservices with Container Apps](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/serverless/microservices-with-container-apps) | example-workloads | 0.85 | Example scenario under /example-scenario/serverless/ showing detailed replatforming of an existing AKS microservices workload to Container Apps; includes concrete implementation details. |
| [Microservices with Dapr and KEDA](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/serverless/microservices-with-container-apps-dapr) | example-workloads | 0.85 | Example scenario for an order management system with 10 microservices using Container Apps, Dapr, and KEDA; detailed, scenario-specific architecture and configuration. |
| [Migrate an Oracle database to Azure](https://learn.microsoft.com/en-us/azure/architecture/databases/idea/topic-migrate-oracle-azure) | migration-guides | 0.85 | Explicit migration guidance from on-premises Oracle to Azure VMs or OD@A, with considerations and recommendations; matches migration-guides criteria. |
@@ -285,26 +308,24 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio
| [Secure AKS workloads with Azure Front Door](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/aks-front-door/aks-front-door) | example-workloads | 0.85 | Under /example-scenario/aks-front-door/ with detailed architecture using Front Door, WAF, Private Link, and NGINX; scenario-specific implementation for securing AKS workloads. |
| [Unlock insights from conversational data](https://learn.microsoft.com/en-us/azure/architecture/ai-ml/idea/unlock-insights-from-conversational-data) | solution-ideas | 0.85 | Explicitly labeled as a solution idea with conceptual architecture diagram and scenario-specific guidance for conversation knowledge mining. |
| [Use Azure Kubernetes Service to host GPU-based workloads](https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/containers/aks-gpu/gpu-aks) | reference-architectures | 0.85 | Under /reference-architectures/containers/aks-gpu/ with guidance on choosing GPU SKUs and running training/inference; includes concrete configuration and sizing recommendations. |
-| [AKS on Azure Local baseline architecture](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/hybrid/aks-baseline) | example-workloads | 0.84 | Example-scenario for AKS on Azure Local with detailed recommendations for networking, security, identity, management, and monitoring in hybrid context. |
| [Deploy MongoDB Atlas on Azure](https://learn.microsoft.com/en-us/azure/architecture/databases/architecture/mongodb-atlas-baseline) | reference-architectures | 0.84 | Baseline deployment architecture with secure private connectivity, single/multi-region options, and example implementation link; production-ready guidance. |
| [Deploy apps with AKS enabled by Azure Arc on Azure Local](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/hybrid/aks-hybrid-azure-local) | example-workloads | 0.84 | Scenario-specific guidance for app deployment pipelines using AKS, Azure Arc, and Flux GitOps with concrete hybrid Kubernetes implementation details. |
| [Enable ML inference on IoT Edge](https://learn.microsoft.com/en-us/azure/architecture/guide/iot/machine-learning-inference-iot-edge) | example-workloads | 0.84 | IoT guide with concrete architecture and implementation details for running ML models on IoT Edge, including deployment patterns and edge constraints beyond generic conceptual info. |
| [High-availability deployment](https://learn.microsoft.com/en-us/azure/architecture/web-apps/app-service-environment/architectures/app-service-environment-high-availability-deployment) | reference-architectures | 0.84 | Describes zone-redundant ASE deployment for improved resiliency; production-focused reference architecture with availability considerations. |
| [Migrate cloud workloads across security tenants](https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/migrate-cloud-workloads-across-security-tenants) | solution-ideas | 0.84 | Explicit solution-ideas article describing architecture and strategy for migrating workloads across security tenants during business transformations. |
| [Modernize mainframe midrange data](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/modernize-mainframe-data-to-azure) | example-workloads | 0.84 | URL contains example-scenario/mainframe and the article describes an end-to-end modernization plan for mainframe and midrange data sources using Azure data services, which is a detailed scenario implementation fitting example-workloads. |
-| [Observability](https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/observability) | best-practices | 0.84 | Observability guidance with specific monitoring configurations and telemetry patterns for Functions and Event Hubs topologies. |
-| [Security](https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/security) | best-practices | 0.84 | Security-focused implementation guidance for Functions and Event Hubs, applying least privilege and fine-grained access with concrete configuration advice. |
+| [Resilient design](https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/resilient-design) | best-practices | 0.84 | Covers error handling, idempotency, and retry behavior with specific recommendations for resilient Event Hubs-triggered Functions. This is cross-cutting resilience guidance with concrete implementation advice, aligning with best-practices. |
| [Standard deployment](https://learn.microsoft.com/en-us/azure/architecture/web-apps/app-service-environment/architectures/app-service-environment-standard-deployment) | reference-architectures | 0.84 | Recommended architecture for enterprise workloads on ASE v3 with security best practices; detailed, deployable reference design. |
| [Automotive test data analytics](https://learn.microsoft.com/en-us/azure/architecture/industries/automotive/automotive-telemetry-analytics) | example-workloads | 0.83 | industries/automotive URL; detailed, industry-specific analytics workload for automotive test fleets with concrete Azure data services and flow, beyond generic analytics concepts. |
| [Hub-spoke topology with Virtual WAN](https://learn.microsoft.com/en-us/azure/architecture/networking/architecture/hub-spoke-virtual-wan-architecture) | reference-architectures | 0.82 | Detailed architecture using Virtual WAN, ExpressRoute/VPN, and peering; alternative to other reference topologies with deployment guidance. |
| [Overview](https://learn.microsoft.com/en-us/azure/architecture/gcp-professional/) | migration-guides | 0.82 | gcp-professional root article explaining Azure accounts, platform, and services to GCP experts. It’s explicitly cross-cloud onboarding/migration guidance with platform mapping and differences. |
| [Secure research for regulated data](https://learn.microsoft.com/en-us/azure/architecture/ai-ml/architecture/secure-compute-for-research) | example-workloads | 0.82 | AI/ML architecture article with detailed secure compute environment for researchers, including network isolation, identity, and data protection specifics. |
+| [Security](https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/security) | best-practices | 0.82 | Provides detailed security guidance (least privilege, fine-grained access) for Functions and Event Hubs integration, with specific configuration practices. This is security-focused implementation guidance with DO/DO-NOT style recommendations, fitting best-practices. |
| [Apache Kafka migration to Azure](https://learn.microsoft.com/en-us/azure/architecture/guide/hadoop/apache-kafka-migration) | migration-guides | 0.80 | Explicit migration guide for Kafka to Azure using HDInsight and Event Hubs, including service mapping and migration considerations. |
| [Azure files secured by AD DS](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/hybrid/azure-files-on-premises-authentication) | example-workloads | 0.80 | Example-scenario for securing Azure Files via on-premises AD DS over private endpoints with detailed hybrid identity and networking configuration. |
| [Baseline Microsoft Foundry chat architecture in an Azure landing zone](https://learn.microsoft.com/en-us/azure/architecture/ai-ml/architecture/baseline-microsoft-foundry-landing-zone) | reference-architectures | 0.80 | Builds on the baseline Foundry chat reference architecture and describes deployment in an Azure landing zone; production-focused architecture with detailed Azure resource organization. |
| [Blue-green deployment of AKS clusters](https://learn.microsoft.com/en-us/azure/architecture/guide/aks/blue-green-deployment-for-aks) | best-practices | 0.80 | Guide under /guide/aks/ describing design and implementation of blue-green deployments for AKS using Azure managed services; includes concrete architecture and operational steps. |
| [CI/CD for AKS apps via Azure Pipelines](https://learn.microsoft.com/en-us/azure/architecture/guide/aks/aks-cicd-azure-pipelines) | best-practices | 0.80 | Describes a CI/CD baseline architecture focused on AKS with Azure Pipelines; includes concrete pipeline stages, deployment flows, and configuration details. |
-| [Compare Java application hosting options](https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/service-for-java-comparison) | technology-choices | 0.80 | Explicitly compares multiple Azure hosting options for Java with planning considerations, acting as a technology-choices decision guide. |
| [Considerations](https://learn.microsoft.com/en-us/azure/architecture/guide/container-service-general-considerations) | technology-choices | 0.80 | Guide about choosing an Azure container service with feature-level considerations; functions as a decision guide between multiple Azure container options, which is specialized selection knowledge. |
| [Data partitioning strategies (by service)](https://learn.microsoft.com/en-us/azure/architecture/best-practices/data-partitioning-strategies) | best-practices | 0.80 | Provides concrete strategies for partitioning data across specific Azure services, building on general partitioning best practices. |
| [Deploy AD DS in an Azure virtual network](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/identity/adds-extend-domain) | example-workloads | 0.80 | Example-scenario for deploying domain controllers in Azure to extend an on-premises AD domain with concrete hybrid authentication design. |
@@ -322,8 +343,8 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio
| [Messaging](https://learn.microsoft.com/en-us/azure/architecture/aws-professional/messaging) | migration-guides | 0.80 | Messaging service comparison and mapping (SES, SQS, etc.) to Azure equivalents for migration scenarios. |
| [Mission-critical architecture design](https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/containers/aks-mission-critical/mission-critical-intro) | reference-architectures | 0.80 | Part of /reference-architectures/ aks-mission-critical series with production-focused guidance and deployment methodology. |
| [Move mainframe archive data to Azure](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/move-archive-data-mainframes) | reference-architectures | 0.80 | Described as a reference architecture for moving archived mainframe/midrange data to Azure; includes specific Azure storage usage and data flow for this scenario. |
+| [Observability](https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/observability) | best-practices | 0.80 | Explains how to monitor a Functions + Event Hubs topology using Application Insights, including what telemetry to use and how to interpret it. This is concrete monitoring/observability implementation guidance, a classic cross-cutting best-practices scenario. |
| [Operational procedures](https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/containers/aks-mission-critical/mission-critical-operations) | reference-architectures | 0.80 | Operational procedures and organizational alignment guidance specific to mission-critical reference architecture stamps. |
-| [Overview](https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/event-hubs-functions) | best-practices | 0.80 | Serverless integration guidance with detailed recommendations, settings, and techniques for using Event Hubs and Functions together; cross-cutting best practices for this pairing. |
| [Prepare to choose a data store in Azure](https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/data-stores-getting-started) | technology-choices | 0.80 | Decision guide for selecting among Azure data store services (URL contains guide/technology-choices). Provides structured selection criteria across functional, performance, cost, and security dimensions, which is expert decision knowledge. |
| [Reengineer mainframe batch apps](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/reengineer-mainframe-batch-apps-azure) | example-workloads | 0.80 | URL contains example-scenario/mainframe and the content describes a reference architecture for re-engineering z/OS batch workloads using Azure services; although called a reference architecture, it is scoped to a specific workload scenario, aligning best with example-workloads. |
| [Regions and zones](https://learn.microsoft.com/en-us/azure/architecture/aws-professional/regions-zones) | migration-guides | 0.80 | Compares regional and zonal constructs between AWS and Azure to guide resilient migration designs. |
@@ -336,9 +357,12 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio
| [Update Windows VMs in Azure](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/wsus/) | example-workloads | 0.80 | example-scenario/wsus URL; provides concrete network, DMZ, and WSUS configuration guidance for updating VMs in locked-down VNets, including Azure Firewall considerations. |
| [Use Azure Red Hat OpenShift in the financial services industry](https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/containers/aro/azure-redhat-openshift-financial-services-workloads) | reference-architectures | 0.80 | Under /reference-architectures/containers/aro/ with landing zone architecture for FSI; includes regulatory, security, and hybrid-cloud configuration specifics for production environments. |
| [Use the Transactional Outbox pattern](https://learn.microsoft.com/en-us/azure/architecture/databases/guide/transactional-out-box-cosmos) | design-patterns | 0.80 | Named pattern with Azure-specific implementation using Cosmos DB transactional batches, change feed, and Service Bus; includes reliability and idempotency guidance. |
+| [AKS on Azure Local baseline architecture](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/hybrid/aks-baseline) | example-workloads | 0.78 | URL contains example-scenario/, and the page describes a baseline AKS architecture with concrete recommendations for networking, security, identity, management, and monitoring on Azure Local/Stack. This is a detailed, scenario-specific implementation rather than a generic overview, matching example-workloads. |
| [API Management landing zone architecture](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/integration/app-gateway-internal-api-management-function) | example-workloads | 0.78 | Example-scenario URL with detailed enterprise API management architecture using Application Gateway, internal APIM, Functions, CI/CD, and network/security configuration guidance for production-like environments. |
| [Agent node management](https://learn.microsoft.com/en-us/azure/architecture/aws-professional/eks-to-aks/node-pools) | migration-guides | 0.78 | Details how EKS and AKS manage agent/worker nodes and node pools, with Azure-specific options. This comparative, platform-mapping content is characteristic of migration guides. |
| [Azure enterprise cloud file share](https://learn.microsoft.com/en-us/azure/architecture/hybrid/azure-files-private) | reference-architectures | 0.78 | Explicitly described as a reference architecture for an enterprise-level cloud file sharing solution using Azure Files, Azure File Sync, private DNS, and private endpoints. Provides a concrete Azure service composition and deployment-focused guidance for a production scenario, which constitutes expert implementation knowledge beyond generic conceptual content and aligns with the reference-architectures sub-skill. |
+| [Certificate life cycle management on Azure](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/certificate-lifecycle/) | example-workloads | 0.78 | URL contains example-scenario/, and the page describes a concrete implementation for automating TLS/SSL certificate renewal with a nonintegrated CA using Azure services. It goes beyond a high-level solution idea by detailing workflow, components, and operational considerations for a specific use case (certificate lifecycle), matching the example-workloads criteria. |
+| [Choose a Microsoft Fabric deployment pattern](https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/fabric-deployment-patterns) | technology-choices | 0.78 | The page is a decision guide for choosing among multiple Microsoft Fabric deployment patterns. It discusses trade-offs and considerations (governance, security, performance isolation, cost management) and lives under a technology-choices path. It provides expert, scenario-specific selection guidance rather than just conceptual overview, matching the technology-choices category. |
| [Cluster governance](https://learn.microsoft.com/en-us/azure/architecture/aws-professional/eks-to-aks/governance) | migration-guides | 0.78 | Describes governance mechanisms for Kubernetes clusters and compares EKS and AKS governance options. Cross-cloud governance mapping is expert migration guidance. |
| [Cluster monitoring and logging](https://learn.microsoft.com/en-us/azure/architecture/aws-professional/eks-to-aks/monitoring) | migration-guides | 0.78 | In the aws-professional/eks-to-aks path, focused on monitoring/logging differences and concrete options for AKS versus EKS. Contains platform-specific guidance and comparisons that go beyond generic Kubernetes knowledge, fitting a migration/transition guide for AWS professionals. |
| [Computer forensics](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/forensics/) | example-workloads | 0.78 | URL contains example-scenario/, and the article describes a concrete, detailed implementation of an Azure-based infrastructure and workflow for digital evidence chain of custody across acquisition, preservation, and access. It goes beyond a high-level solution idea by providing scenario-specific technical guidance for a specialized use case (computer forensics/legal evidence handling), but is not a generic reference architecture with full production sizing tables. |
@@ -350,6 +374,7 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio
| [Esri ArcGIS on Azure Virtual Desktop](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/data/esri-arcgis-azure-virtual-desktop) | example-workloads | 0.78 | URL contains example-scenario/, and the page describes a concrete implementation of Esri ArcGIS Pro and ArcGIS Enterprise on Azure Virtual Desktop with a full system architecture. Example scenarios in the Architecture Center typically include detailed component choices, configuration considerations, and deployment guidance for a specific industry/use case (GIS/VDI), which goes beyond generic knowledge and into expert, scenario-specific implementation details. |
| [Hub-spoke network topology in Azure](https://learn.microsoft.com/en-us/azure/architecture/networking/architecture/hub-spoke) | reference-architectures | 0.78 | Describes a concrete hub-spoke network reference architecture with Azure networking components, implementation details, and production-oriented guidance; it is not just a conceptual style but a deployable topology recommended by the Cloud Adoption Framework.
| | [Inbound and outbound internet connections for SAP on Azure](https://learn.microsoft.com/en-us/azure/architecture/guide/sap/sap-internet-inbound-outbound) | best-practices | 0.78 | The article offers proven practices for securing inbound and outbound internet connectivity for SAP on Azure infrastructure. It focuses on actionable security and networking guidance (do/don’t style recommendations) for a specific workload scenario, which aligns with the best-practices sub-skill type rather than patterns, solution ideas, or reference architectures. | +| [Manage virtual machine compliance](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/security/virtual-machine-compliance) | example-workloads | 0.78 | The page is under an example-scenario path and describes a concrete implementation for managing VM compliance using Azure VM Image Builder and Azure Compute Gallery. It focuses on a specific operational use case (gold image publishing and VM compliance tracking) with practical, scenario-driven guidance that goes beyond high-level concepts, matching the example-workloads category more than solution-ideas or reference-architectures. | | [Massive-scale Virtual WAN architecture design](https://learn.microsoft.com/en-us/azure/architecture/networking/architecture/massive-scale-azure-architecture) | example-workloads | 0.78 | Example workload for exceptionally large deployments with multiple hubs per region and redundant ExpressRoute; deep, scenario-specific network design. | | [Migrate an Oracle database to OD@A Exadata Database Service](https://learn.microsoft.com/en-us/azure/architecture/databases/idea/migrate-oracle-odaa-exadata) | migration-guides | 0.78 | Prescriptive migration using Oracle ZDM from on-premises Exadata to Oracle Database@Azure; contains tool-specific and Azure-networking guidance. 
| | [Migrate an Oracle database to an Azure virtual machine](https://learn.microsoft.com/en-us/azure/architecture/databases/idea/migrate-oracle-azure-iaas) | migration-guides | 0.78 | Step-by-step migration scenario using Oracle Data Guard and Azure networking; includes concrete topology and operational guidance beyond generic concepts. | @@ -365,12 +390,12 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio | [Storage options](https://learn.microsoft.com/en-us/azure/architecture/aws-professional/eks-to-aks/storage) | migration-guides | 0.78 | Compares storage capabilities and workload data options between EKS and AKS, with Azure-specific storage choices. This is expert, cross-cloud migration guidance under aws-professional/eks-to-aks. | | [Hybrid availability and monitoring](https://learn.microsoft.com/en-us/azure/architecture/hybrid/hybrid-perf-monitoring) | example-workloads | 0.76 | Hybrid monitoring reference architecture with concrete use of Azure Monitor for VMs across Azure, on-premises, and other clouds. | | [IoT Hub private file upload to Azure Storage](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/iot/iot-private-file-upload) | example-workloads | 0.76 | example-scenario/iot URL; though marked as solution idea, it gives concrete implementation for private file upload via IoT Hub to Storage behind firewall and custom domain—detailed scenario implementation. | +| [Overview](https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/event-hubs-functions) | best-practices | 0.76 | Provides detailed guidance on architecting, developing, and deploying Event Hubs-triggered Azure Functions, including specific settings and integration nuances. This is cross-cutting implementation guidance with concrete recommendations (DOs/techniques), fitting best-practices rather than a pattern or solution idea. 
| | [Securing access to multitenant web apps from on prem](https://learn.microsoft.com/en-us/azure/architecture/web-apps/guides/networking/access-multitenant-web-app-from-on-premises) | best-practices | 0.76 | Step-by-step guidance for private connectivity via Private Link and VNet integration; detailed networking and security configuration. | | [Virtual WAN optimized for requirements](https://learn.microsoft.com/en-us/azure/architecture/networking/architecture/performance-security-optimized-vwan) | example-workloads | 0.76 | Based on a real manufacturing company scenario with OT/IT requirements and NVA firewalls; detailed, department-specific architecture. | | [Access an AKS API server](https://learn.microsoft.com/en-us/azure/architecture/guide/security/access-azure-kubernetes-service-cluster-api-server) | best-practices | 0.75 | Security-focused guide on AKS API server access options (Bastion, ExpressRoute, Cloud Shell) with deployment considerations; provides concrete, expert configuration guidance. | | [GitOps for AKS](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/gitops-aks/gitops-blueprint-aks) | example-workloads | 0.75 | Example scenario under /example-scenario/gitops-aks/ with detailed GitOps techniques and tooling configuration for AKS; scenario-specific implementation rather than generic pattern. | | [Measure Azure app sustainability](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/apps/measure-azure-app-sustainability-sci-score) | example-workloads | 0.75 | Example scenario under /example-scenario/ with a concrete sustainability model, data inputs, and SCI-based measurement approach. | -| [Natural language processing](https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/natural-language-processing) | technology-choices | 0.75 | Provides comparison and selection guidance for NLP services (sentiment, key phrases, etc.), aligning with technology-choices. 
| | [Troubleshoot networking](https://learn.microsoft.com/en-us/azure/architecture/operator-guides/aks/troubleshoot-network-aks) | best-practices | 0.75 | Network troubleshooting guide with specific steps, scenarios, and checks; detailed operational content beyond generic advice. | | [Azure Virtual Desktop for Azure Local](https://learn.microsoft.com/en-us/azure/architecture/hybrid/azure-local-workload-virtual-desktop) | reference-architectures | 0.74 | Hybrid architecture article under /architecture/hybrid/ with workload-specific design considerations for Azure Virtual Desktop on Azure Local. It builds on the Azure Local baseline reference architecture, focuses on concrete deployment and configuration choices (compute, storage, networking, security) for a production-ready setup, and provides implementation guidance rather than just conceptual overview, matching the reference-architectures criteria. | | [Move IoT Hub solutions to production](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/iot/iot-move-to-production) | best-practices | 0.74 | example-scenario/iot URL but content focuses on concrete production-readiness practices (deployment stamps, transient fault handling, provisioning models) rather than just conceptual design. | @@ -388,7 +413,6 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio | [Automate API deployments with APIOps](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/devops/automated-api-deployments-apiops) | example-workloads | 0.70 | URL contains example-scenario/devops/, indicating an example workload. The page describes a concrete APIOps implementation for API Management with lifecycle improvements, which goes beyond a high-level solution idea and aligns with detailed, scenario-specific implementation guidance. 
| | [Automate PDF form processing](https://learn.microsoft.com/en-us/azure/architecture/ai-ml/architecture/automate-pdf-forms-processing) | example-workloads | 0.70 | Describes a specific automated PDF processing workload using multiple Azure services with implementation-level architecture; fits example-workloads better than high-level solution idea. | | [Automate document classification](https://learn.microsoft.com/en-us/azure/architecture/ai-ml/architecture/automate-document-classification-durable-functions) | example-workloads | 0.70 | ai-ml/architecture article with a concrete workload (document classification pipeline) and detailed service interactions; scenario-specific implementation but not a generic pattern or solution idea. | -| [Azure Sandbox](https://learn.microsoft.com/en-us/azure/architecture/guide/azure-sandbox/azure-sandbox) | reference-architectures | 0.70 | Describes a Terraform-based project for deploying modular sandbox environments with specific components and cost controls; effectively a deployable architecture with concrete implementation details. | | [Content delivery](https://learn.microsoft.com/en-us/azure/architecture/guide/networking/global-web-applications/mission-critical-content-delivery) | reference-architectures | 0.70 | Provides concrete CDN-based architecture using Azure Front Door and multi-CDN strategies for resilient content delivery. | | [Continuous validation](https://learn.microsoft.com/en-us/azure/architecture/guide/testing/mission-critical-deployment-testing) | best-practices | 0.70 | Testing guide with concrete recommendations (how to use Azure Load Testing and Chaos Studio) and DO-style guidance for resilient deployments. 
| | [Data obfuscation with Delphix](https://learn.microsoft.com/en-us/azure/architecture/databases/guide/data-obfuscation-with-delphix-in-azure-data-factory) | example-workloads | 0.70 | Scenario-specific architecture integrating Delphix CC with Azure Data Factory ETL and Synapse; detailed workflow for compliance-focused masking. | @@ -398,6 +422,7 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio | [Extract and analyze call center data](https://learn.microsoft.com/en-us/azure/architecture/ai-ml/openai/architecture/call-center-openai-analytics) | example-workloads | 0.70 | Call center analytics workload using Foundry Tools and Azure OpenAI; detailed, domain-specific implementation rather than a generic solution idea. | | [Global HTTP ingress](https://learn.microsoft.com/en-us/azure/architecture/guide/networking/global-web-applications/mission-critical-global-http-ingress) | reference-architectures | 0.70 | Details combining multiple Azure ingress and security services for highly available global HTTP entry, beyond a simple conceptual diagram. | | [Guide overview](https://learn.microsoft.com/en-us/azure/architecture/networking/guide/private-link-virtual-wan-dns-guide) | best-practices | 0.70 | Detailed DNS design and configuration guidance for private endpoints in Virtual WAN; highly specific implementation knowledge. | +| [Introduction](https://learn.microsoft.com/en-us/azure/architecture/microservices/design/) | architecture-styles | 0.70 | Describes how to design and build a microservices architecture as a style, with detailed Azure-focused guidance and best practices. URL is under microservices/design rather than patterns or solution-ideas, and it provides actionable implementation details for the microservices architecture style. 
| [Introduction](https://learn.microsoft.com/en-us/azure/architecture/operator-guides/aks/day-2-operations-guide) | best-practices | 0.70 | Operator guide for triage, patching, upgrading, and troubleshooting AKS; contains operational runbook-style guidance and practices beyond conceptual overview. | | [Java](https://learn.microsoft.com/en-us/azure/architecture/web-apps/guides/enterprise-app-patterns/reliable-web-app/java/guidance) | migration-guides | 0.70 | This Java-focused article provides detailed, prescriptive guidance for replatforming Java web apps to Azure using the Reliable Web App pattern, including architecture, code, and configuration specifics. This is expert migration knowledge rather than a high-level overview or generic pattern description, and it is not a reference architecture or solution idea. | | [Kubernetes at the edge](https://learn.microsoft.com/en-us/azure/architecture/operator-guides/aks/choose-kubernetes-edge-compute-option) | technology-choices | 0.70 | Compares multiple Kubernetes-at-edge options with operational cost, configuration, and flexibility trade-offs, providing concrete decision guidance among Azure offerings. | @@ -458,7 +483,6 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio | [Chunking phase](https://learn.microsoft.com/en-us/azure/architecture/ai-ml/guide/rag/rag-chunking-phase) | 0.45 | Chunking phase guide is detailed technique guidance but not a design pattern (no patterns/ URL or pattern sections) and not an architecture or solution idea; falls outside the defined types. | | [Embedding phase](https://learn.microsoft.com/en-us/azure/architecture/ai-ml/guide/rag/rag-generate-embeddings) | 0.45 | Embeddings phase article explains concepts and choices but is not a technology-choices comparison across Azure services nor a formal pattern or architecture.
| | [Information-retrieval phase](https://learn.microsoft.com/en-us/azure/architecture/ai-ml/guide/rag/rag-information-retrieval) | 0.45 | Information-retrieval phase focuses on search configuration concepts; lacks the formal pattern structure or multi-service architecture classification required by the defined types. | -| [Microsoft Fabric deployment patterns](https://learn.microsoft.com/en-us/azure/architecture/analytics/architecture/fabric-deployment-patterns) | 0.45 | Describes deployment patterns for Microsoft Fabric with considerations and recommendations, but this is more of a conceptual deployment-pattern guide; it does not clearly fall into the defined sub-skill types (no patterns/ URL, no best-practices/ marker, not an example-scenario or reference-architectures path). | | [API design](https://learn.microsoft.com/en-us/azure/architecture/microservices/design/api-design) | 0.40 | API design for microservices is detailed guidance but not under best-practices/ and not structured as a named pattern or technology-choice comparison. | | [API gateways](https://learn.microsoft.com/en-us/azure/architecture/microservices/design/gateway) | 0.40 | API gateway article explains concept and considerations; not a patterns/ article, not a solution-idea or reference architecture, and not a technology-choice comparison across Azure services. | | [Azure Virtual Desktop design guide](https://learn.microsoft.com/en-us/azure/architecture/landing-zones/azure-virtual-desktop/design-guide) | 0.40 | Landing zone design guide is broad guidance without concrete SKUs, deployment repos, or production-ready configuration details; more conceptual than a deployable reference architecture. 
| @@ -474,7 +498,6 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio | [Deployment options](https://learn.microsoft.com/en-us/azure/architecture/landing-zones/landing-zone-deploy) | 0.40 | Landing zone deployment options article is prescriptive but is a methodology/overview; not a specific reference architecture, pattern, or comparison per defined categories. | | [Design for evolution](https://learn.microsoft.com/en-us/azure/architecture/guide/design-principles/design-for-evolution) | 0.40 | Design for change principle; similar to other design-principles pages and not mapped to any of the specified sub-skill categories. | | [Design for operations](https://learn.microsoft.com/en-us/azure/architecture/guide/design-principles/design-for-operations) | 0.40 | Design for operations is conceptual/role guidance; not a pattern, style, best-practices/ article, or technology-choice comparison. | -| [DevSecOps on AKS](https://learn.microsoft.com/en-us/azure/architecture/guide/devsecops/devsecops-on-aks) | 0.40 | DevSecOps on AKS is lifecycle- and role-oriented security guidance without concrete SKUs, deployment scripts, or production-ready architecture implementation details; it reads as best-practice style content but URL doesn’t match best-practices/ and lacks the explicit DO/DON'T structure required for that category. | | [Get started](https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/containers/aks-start-here) | 0.40 | Although under reference-architectures/, this page is a planning and journey overview for adopting AKS, not a concrete deployable reference architecture. The summary indicates foundational concepts and organizational alignment rather than specific SKUs, configuration tables, or deployment repos. It lacks the detailed production-ready architecture, sizing guidance, and implementation specifics required for expert-knowledge classification. 
| | [Getting started](https://learn.microsoft.com/en-us/azure/architecture/ai-ml/guide/rag/rag-solution-design-and-evaluation-guide) | 0.40 | RAG solution design guide is methodology-focused and conceptual; doesn’t map to any defined sub-skill type like patterns/, solution-ideas/, or reference-architectures/. | | [High availability for multitier AKS apps](https://learn.microsoft.com/en-us/azure/architecture/guide/aks/aks-high-availability) | 0.40 | This is a high-availability guidance article for multitier AKS applications, likely containing checklists and conceptual HA mechanisms. It does not match any defined sub-skill URL patterns (reference-architectures/, solution-ideas/, patterns/, technology-choices/, architecture-styles/, best-practices/, antipatterns/, example-scenario/, migration/). Without clear evidence of detailed, non-obvious configuration values or production-ready implementation specifics, it is treated as general guidance rather than expert knowledge for this classification task. | @@ -496,10 +519,10 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio | [Use tactical DDD to design microservices](https://learn.microsoft.com/en-us/azure/architecture/microservices/model/tactical-domain-driven-design) | 0.40 | Tactical DDD for microservices is design guidance; does not follow the design-patterns URL structure or pattern sections, and is not another listed category. | | [MLOps maturity model](https://learn.microsoft.com/en-us/azure/architecture/ai-ml/guide/mlops-maturity-model) | 0.35 | MLOps maturity model is a conceptual framework for assessing maturity, not an implementation guide, architecture, or pattern per the defined categories. 
| | [Azure DNS Private Resolver](https://learn.microsoft.com/en-us/azure/architecture/networking/architecture/azure-dns-private-resolver) | 0.30 | Azure DNS Private Resolver article describes a specific networking solution and likely includes configuration guidance, but it is a service-specific architecture/feature guide under /networking/architecture/ rather than any of the defined sub-skill types (no patterns/, solution-ideas/, reference-architectures/, best-practices/, antipatterns/, technology-choices/, architecture-styles/, example-scenario/, or migration/ in the path). It does not match the required URL patterns or structural cues for the listed categories. | +| [Azure Sandbox](https://learn.microsoft.com/en-us/azure/architecture/guide/azure-sandbox/azure-sandbox) | 0.30 | Terraform-based sandbox deployment guidance, but described at a high level as a project/framework. From the summary it lacks the detailed architecture diagrams, SKU-level configuration, or production sizing/deployment specifics required for reference-architectures or example-workloads classifications. | | [Big data](https://learn.microsoft.com/en-us/azure/architecture/databases/guide/big-data-architectures) | 0.30 | Explains big data architectures conceptually (ingestion, processing, analysis) and when they apply; does not appear to include production-ready architecture diagrams with SKUs, GitHub deployments, or explicit DO/DON’T best-practice lists. | | [Design patterns for microservices](https://learn.microsoft.com/en-us/azure/architecture/microservices/design/patterns) | 0.30 | High-level overview page that lists or introduces multiple microservices design patterns rather than fully describing a single named pattern with the required 'Context and problem', 'Solution', and 'When to use' structure; functions more as a conceptual index than a detailed pattern article. 
| | [ETL guide](https://learn.microsoft.com/en-us/azure/architecture/data-guide/relational-data/etl) | 0.30 | Describes ETL/ELT concepts and pipeline components; while it may mention tools and services, it is a general data-pipeline overview rather than a detailed reference architecture, pattern, or best-practices document with explicit DOs/DON’Ts. | -| [Introduction](https://learn.microsoft.com/en-us/azure/architecture/microservices/design/) | 0.30 | Landing page for microservices design series; mostly navigational/overview, not a specific pattern, style, or comparison guide. | | [Overview](https://learn.microsoft.com/en-us/azure/architecture/guide/design-principles/) | 0.30 | Design principles overview; conceptual foundation, not a specific best-practices/ article, pattern, style, or technology-choice comparison. | | [Overview](https://learn.microsoft.com/en-us/azure/architecture/guide/responsible-innovation/) | 0.30 | High-level responsible engineering overview; conceptual and ethical guidance without detailed technical implementation specifics. | | [Understand potential harms](https://learn.microsoft.com/en-us/azure/architecture/guide/responsible-innovation/harms-modeling/) | 0.30 | Harms modeling foundations are conceptual risk-identification practices, not deep Azure implementation details. | @@ -509,7 +532,7 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio | [Architecture for SaaS and multitenancy](https://learn.microsoft.com/en-us/azure/architecture/guide/saas-multitenant-solution-architecture/) | 0.20 | Introductory SaaS and multitenant architecture overview; primarily conceptual and navigational. | | [Browse all Architectures](https://learn.microsoft.com/en-us/azure/architecture/browse/) | 0.20 | Browse/navigation hub listing many architectures; no detailed implementation, configuration, or deployment guidance on this page itself. 
| | [Conceptual planning for IPv6 networking](https://learn.microsoft.com/en-us/azure/architecture/networking/guide/ipv6-ip-planning) | 0.20 | Primarily a conceptual planning guide for IPv6 transition strategies without a specific deployable architecture, pattern structure, or detailed implementation artifacts that meet any sub-skill type criteria. | -| [Data lakes](https://learn.microsoft.com/en-us/azure/architecture/data-guide/scenarios/data-lake) | 0.20 | Defines what a data lake is and its advantages; high-level scenario description without detailed Azure service configurations, deployment guidance, or pattern-style structure. | +| [DevSecOps on AKS](https://learn.microsoft.com/en-us/azure/architecture/guide/devsecops/devsecops-on-aks) | 0.20 | DevSecOps on AKS page appears to be lifecycle and practice-oriented, focusing on concepts and general security integration into DevOps. The summary does not indicate detailed architecture diagrams, concrete configuration tables, or implementation specifics that would qualify as expert knowledge under any listed sub-skill type. | | [Get started](https://learn.microsoft.com/en-us/azure/architecture/analytics/analytics-get-started) | 0.20 | Overview of Azure analytics technologies and related guidance. It’s a high-level entry point without a specific pattern structure, solution idea diagram, or detailed implementation/deployment specifics, so it doesn’t meet any sub-skill type’s expert-knowledge criteria. | | [Get started](https://learn.microsoft.com/en-us/azure/architecture/containers/container-get-started) | 0.20 | High-level overview of Azure container technologies and related guidance; no detailed deployment configurations, sizing guidance, or concrete implementation specifics that go beyond general conceptual knowledge. 
| | [Get started](https://learn.microsoft.com/en-us/azure/architecture/databases/database-get-started) | 0.20 | Introductory database architecture design overview and technology choices summary; conceptual and navigational in nature without deep implementation details or comparison tables that meet the technology-choices criteria. | @@ -527,6 +550,7 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio | [OLTP solutions](https://learn.microsoft.com/en-us/azure/architecture/data-guide/relational-data/online-transaction-processing) | 0.20 | Conceptual explanation of OLTP characteristics (atomicity, consistency, etc.); lacks Azure-specific expert configuration guidance or pattern/anti-pattern structure. | | [Web applications](https://learn.microsoft.com/en-us/azure/architecture/web-apps/) | 0.20 | High-level overview and navigation hub for web architectures; lacks detailed deployable architecture, patterns, or configuration specifics. | | [Architecture icons](https://learn.microsoft.com/en-us/azure/architecture/icons/) | 0.10 | Icon download and diagramming assets; no architectural guidance, patterns, or deployment details. | +| [Data lakes](https://learn.microsoft.com/en-us/azure/architecture/data-guide/scenarios/data-lake) | 0.10 | Conceptual explanation of data lakes and their characteristics; does not match any specified sub-skill URL patterns (no reference-architectures/, solution-ideas/, patterns/, technology-choices/, architecture-styles/, best-practices/, antipatterns/, example-scenario/, industries/, migration/) and lacks detailed deployment guidance, patterns structure, or comparison tables. 
| | [AIX UNIX to Azure Linux migration](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/unix-migration/migrate-aix-azure-linux) | - | Parse error: Expecting value: line 1 column 1 (char 0) | | [Batch transaction processing](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/process-batch-transactions) | - | Parse error: Expecting value: line 1 column 1 (char 0) | | [Cross-tenant secure access to apps](https://learn.microsoft.com/en-us/azure/architecture/networking/guide/cross-tenant-secure-access-private-endpoints) | - | Networking implementation guide for cross-tenant private endpoint access; URL path is /networking/guide/ and does not match any of the specified skill-type paths (reference-architectures, solution-ideas, patterns, technology-choices, architecture-styles, best-practices, antipatterns, example-scenario, industries, migration). While it likely contains detailed configuration guidance, it does not fit any defined sub-skill category. 
| @@ -537,7 +561,6 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio | [IBM z/OS migration with Avanade AMT](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/avanade-amt-zos-migration) | - | Parse error: Expecting value: line 1 column 1 (char 0) | | [IBM z/OS online transaction processing](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/ibm-zos-online-transaction-processing-azure) | - | Parse error: Expecting value: line 1 column 1 (char 0) | | [Integrate IBM MQs with Azure](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/integrate-ibm-message-queues-azure) | - | Parse error: Expecting value: line 1 column 1 (char 0) | -| [Manage virtual machine compliance](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/security/virtual-machine-compliance) | - | Parse error: Expecting value: line 1 column 1 (char 0) | | [Migrate AIX workloads with Skytap](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/migrate-aix-workloads-to-azure-with-skytap) | - | Parse error: Expecting value: line 1 column 1 (char 0) | | [Migrate HP-UX workloads](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/hp-ux-stromasys-charon-par) | - | Parse error: Expecting value: line 1 column 1 (char 0) | | [Migrate IBM i series to Azure with Skytap](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/migrate-ibm-i-series-to-azure-with-skytap) | - | Parse error: Expecting value: line 1 column 1 (char 0) | @@ -556,9 +579,8 @@ use_when: Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregio | [Teamcenter with Azure NetApp Files](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/manufacturing/teamcenter-plm-netapp-files) | - | Parse error: Expecting value: line 1 column 1 (char 0) | | [Unisys ClearPath Forward OS 2200 enterprise server 
virtualization on Azure](https://learn.microsoft.com/en-us/azure/architecture/mainframe/virtualization-of-unisys-clearpath-forward-os-2200-enterprise-server-on-azure) | - | Parse error: Expecting value: line 1 column 1 (char 0) | | [Unisys ClearPath MCP virtualization on Azure](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/unisys-clearpath-forward-mainframe-rehost) | - | Parse error: Expecting value: line 1 column 1 (char 0) | -| [Unisys Dorado migration](https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/migrate-unisys-dorado-mainframe-apps-with-astadia-micro-focus) | - | Parse error: Expecting value: line 1 column 1 (char 0) | | [Unisys mainframe migration with Avanade AMT](https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/migration/unisys-mainframe-migration) | - | Parse error: Expecting value: line 1 column 1 (char 0) | | [Use the Azure Governance Visualizer](https://learn.microsoft.com/en-us/azure/architecture/landing-zones/azure-governance-visualizer-accelerator) | - | Deployment guidance for Azure Governance Visualizer is a tool-specific implementation article under landing-zones/, not one of the defined architecture sub-skill types (no patterns/, solution-ideas/, reference-architectures/, best-practices/, etc. in path). It focuses on automating runs and outputs rather than reusable cloud architecture, patterns, or technology choices. 
| | [VM baseline](https://learn.microsoft.com/en-us/azure/architecture/virtual-machines/baseline) | - | Parse error: Expecting value: line 1 column 1 (char 0) | | [VM baseline in an Azure landing zone](https://learn.microsoft.com/en-us/azure/architecture/virtual-machines/baseline-landing-zone) | - | Parse error: Expecting value: line 1 column 1 (char 0) | -| [What's new](https://learn.microsoft.com/en-us/azure/architecture/changelog) | - | Changelog/navigation page listing new and updated Azure Architecture Center articles; does not itself contain architectural guidance, patterns, or implementation details. | +| [What's new](https://learn.microsoft.com/en-us/azure/architecture/changelog) | - | Changelog/navigation page listing new and updated Azure Architecture Center articles; does not itself contain detailed architectural guidance or implementation specifics. | diff --git a/products/azure-artifacts/azure-artifacts.csv b/products/azure-artifacts/azure-artifacts.csv index 7a3d43e9..a9d1913e 100644 --- a/products/azure-artifacts/azure-artifacts.csv +++ b/products/azure-artifacts/azure-artifacts.csv @@ -29,7 +29,7 @@ https://learn.microsoft.com/en-us/azure/devops/artifacts/how-to/search-upstream? https://learn.microsoft.com/en-us/azure/devops/artifacts/how-to/set-up-upstream-sources?view=azure-devops,Set up upstream sources,Set up upstream sources for your feed - Azure Artifacts,Configure upstream sources for Azure Artifacts feeds,Learn how to set up external feeds and public registries as upstream sources for your Azure Artifacts feed.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 With Azure Artifacts upstream sources, you can simplify package management by using a single feed for both the packages you publish and those you consume from external feeds and public registries like npmjs.com and NuGet.org. When upstream sources are enabled, Azure Artifacts automatically saves a copy of any package installed to your feed.
However, you must be a collaborator or higher to install packages from upstream sources. ",2026-02-13T02:04:00.000Z,how-to,configuration,0.7,True,"How-to for setting up upstream sources; typically includes feed settings, allowed values, and UI/REST configuration options specific to Azure Artifacts.",unchanged https://learn.microsoft.com/en-us/azure/devops/artifacts/how-to/upstream-internal-feed?view=azure-devops,Upstream from internal feeds,Set up an internal feed as an upstream source - Azure Artifacts,,Learn how to set up an internal feed as an upstream source in Azure Artifacts.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Azure Artifacts upstream sources simplify package management by enabling developers to store packages from multiple sources in a single feed. When a package is installed for the first time from an upstream source, Azure Artifacts automatically saves a copy to your feed to ensure continued access, even if the upstream source becomes temporarily unavailable. This tutorial walks you through how to set up an internal feed from th",2026-03-31T20:43:00.000Z,tutorial,,0.3,False,"Primarily a how-to/tutorial for configuring an internal feed as an upstream source in Azure Artifacts; it does not emphasize numeric limits, detailed configuration parameter tables, error-code-based troubleshooting, or decision matrices. 
The content is procedural rather than expert reference material.",unchanged -https://learn.microsoft.com/en-us/azure/devops/artifacts/maven/google-maven?view=azure-devops,Use packages from Google Maven Repository,Use packages from Google Maven Repository upstream source - Azure Artifacts,Use Google Maven Repository as Azure Artifacts upstream,How to consume packages from Google Maven Repository upstream source,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 With Azure Artifacts, developers can enable upstream sources to consume packages from different public registries such as Google Maven Repository. Once enabled, Azure Artifacts will automatically save a copy of any package installed from the upstream. Additionally, Azure Artifacts supports other Maven upstream sources such as Maven Central, Gradle Plugins, and JitPack. In this article, you learn how to:",2026-02-17T18:05:00.000Z,tutorial,integrations,0.7,True,Shows how to configure Google Maven as an upstream source; includes Azure feed configuration and Maven/Gradle repository entries unique to this integration.,unchanged +https://learn.microsoft.com/en-us/azure/devops/artifacts/maven/google-maven?view=azure-devops,Use packages from Google Maven Repository,Use packages from Google Maven Repository upstream source - Azure Artifacts,Consume Google Maven packages via Azure Artifacts upstream,Learn how to consume packages from Google Maven Repository upstream source in Azure Artifacts feeds.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 With Azure Artifacts, developers can enable upstream sources to consume packages from public registries such as the Google Maven Repository. When an upstream source is enabled, Azure Artifacts automatically saves a copy of any package installed to the feed by users with Feed and Upstream Reader (Collaborator) permissions or higher.
Azure Artifacts also supports other Maven upstream sources, including Maven Central, Gradle Plugi",2026-04-20T17:05:00.000Z,tutorial,integrations,0.68,True,"Page describes product-specific integration details for using Google Maven Repository as an upstream source in Azure Artifacts feeds, including required permissions (Feed and Upstream Reader/Collaborator) and Azure DevOps-specific configuration steps and behaviors that go beyond generic Maven usage.",updated https://learn.microsoft.com/en-us/azure/devops/artifacts/maven/gradle-plugins?view=azure-devops,Use packages from Gradle Plugins,Gradle Plugins upstream source - Azure Artifacts,Add Gradle Plugins repository as Azure Artifacts upstream,How to add Gradle Plugins upstream source,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 With Azure Artifacts feeds, you can enable upstream sources to include packages from different public registries such as Gradle Plugins. Once upstream sources are enabled on your feed, Azure Artifacts will save a copy of any package you install from upstream. Azure Artifacts also supports other Maven upstream sources such as Maven Central, Google Maven Repository, and JitPack. Note Organization-scoped feeds cannot be converted",2025-12-19T16:56:00.000Z,how-to,integrations,0.7,True,Documents how to add Gradle Plugins as an upstream source; likely includes specific repository URLs and feed configuration options.,unchanged
This article will walk you through setting up your Maven project, connecting to your Azure Artifacts feed, and restoring your Maven packages.",2025-08-15T18:39:00.000Z,how-to,integrations,0.7,True,"How-to page for connecting Maven to Azure Artifacts feeds; typically includes product-specific repository URLs, settings.xml snippets, and authentication parameters that are configuration/integration details rather than generic Maven knowledge.",unchanged https://learn.microsoft.com/en-us/azure/devops/artifacts/maven/jitpack-upstream?view=azure-devops,Use packages from JitPack,Jitpack upstream source - Azure Artifacts,Configure JitPack as an Azure Artifacts upstream source,How to add JitPack upstream source,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 With Azure Artifacts, you can consume packages from different public registries such as Maven Central and Google Maven Repository and JitPack. Once you enable upstream sources, Azure Artifacts will save a copy of any package you install from upstream.",2025-12-19T16:56:00.000Z,how-to,integrations,0.7,True,Integration-focused article for JitPack; includes Azure Artifacts feed settings and Maven/Gradle configuration details.,unchanged diff --git a/products/azure-artifacts/report.md b/products/azure-artifacts/report.md index 51e1f3f4..6a4f8aad 100644 --- a/products/azure-artifacts/report.md +++ b/products/azure-artifacts/report.md @@ -1,12 +1,12 @@ --- -generated_at: '2026-04-12' +generated_at: '2026-04-26' category_descriptions: limits-quotas: Storage quotas, free allocation, and per-package size/count limits in Azure Artifacts, plus how to monitor, manage, and publish packages within those limits. - integrations: How to connect build tools and CLIs (Cargo, Maven, Gradle, npm, NuGet, - Python, PowerShell, Universal) to Azure Artifacts feeds, publish/restore packages, - and use upstream sources. 
+ integrations: How to connect build tools (NuGet, npm, Maven, Gradle, Cargo, Python, + PowerShell) to Azure Artifacts, publish/restore packages, use upstream sources, + and debug with symbols. best-practices: Guidance on Azure Artifacts package management best practices, configuring and using upstream sources, and safely restoring packages from external feeds. security: 'Securing Azure Artifacts feeds: configuring permissions, protecting upstream @@ -20,15 +20,15 @@ category_descriptions: Azure Artifacts feeds, including workflow setup, authentication, and CI/CD integration. skill_description: Expert knowledge for Azure Artifacts development including best practices, decision making, limits & quotas, security, configuration, integrations - & coding patterns, and deployment. Use when managing feeds, upstream sources, package - publishing/restore, GitHub Actions CI/CD, or npm/NuGet config, and other Azure Artifacts + & coding patterns, and deployment. Use when managing feeds, upstream sources, views/promotion, + retention, GitHub Actions CI/CD, or package tool auth, and other Azure Artifacts related development tasks. Not for Azure DevOps (use azure-devops), Azure Pipelines - (use azure-pipelines), Azure Repos (use azure-repos), Azure Boards (use azure-boards). -use_when: Use when managing feeds, upstream sources, package publishing/restore, GitHub - Actions CI/CD, or npm/NuGet config, and other Azure Artifacts related development + (use azure-pipelines), Azure Repos (use azure-repos), Azure Test Plans (use azure-test-plans). +use_when: Use when managing feeds, upstream sources, views/promotion, retention, GitHub + Actions CI/CD, or package tool auth, and other Azure Artifacts related development tasks. confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Pipelines (use - azure-pipelines), Azure Repos (use azure-repos), Azure Boards (use azure-boards). + azure-pipelines), Azure Repos (use azure-repos), Azure Test Plans (use azure-test-plans). 
--- # Azure Artifacts Crawl Report @@ -42,8 +42,8 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Pipelines (us ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 0 -- **Unchanged**: 73 +- **Updated Pages**: 1 +- **Unchanged**: 72 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-artifacts/azure-artifacts.csv` @@ -62,6 +62,11 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Pipelines (us ## Changes +### Updated Pages + +- [Use packages from Google Maven Repository](https://learn.microsoft.com/en-us/azure/devops/artifacts/maven/google-maven?view=azure-devops) + - Updated: 2026-02-17T18:05:00.000Z → 2026-04-20T17:05:00.000Z + ## Classified Pages | TOC Title | Type | Confidence | Reason | @@ -89,12 +94,12 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Pipelines (us | [Restore Maven packages](https://learn.microsoft.com/en-us/azure/devops/artifacts/maven/install?view=azure-devops) | integrations | 0.70 | How-to page for connecting Maven to Azure Artifacts feeds; typically includes product-specific repository URLs, settings.xml snippets, and authentication parameters that are configuration/integration details rather than generic Maven knowledge. | | [Set up upstream sources](https://learn.microsoft.com/en-us/azure/devops/artifacts/how-to/set-up-upstream-sources?view=azure-devops) | configuration | 0.70 | How-to for setting up upstream sources; typically includes feed settings, allowed values, and UI/REST configuration options specific to Azure Artifacts. | | [Use packages from Crates.io](https://learn.microsoft.com/en-us/azure/devops/artifacts/cargo/cargo-upstream-source?view=azure-devops) | integrations | 0.70 | Guides configuring Cargo to use Crates.io through Azure Artifacts; includes registry configuration and Azure-specific endpoints. 
| -| [Use packages from Google Maven Repository](https://learn.microsoft.com/en-us/azure/devops/artifacts/maven/google-maven?view=azure-devops) | integrations | 0.70 | Shows how to configure Google Maven as an upstream source; includes Azure feed configuration and Maven/Gradle repository entries unique to this integration. | | [Use packages from Gradle Plugins](https://learn.microsoft.com/en-us/azure/devops/artifacts/maven/gradle-plugins?view=azure-devops) | integrations | 0.70 | Documents how to add Gradle Plugins as an upstream source; likely includes specific repository URLs and feed configuration options. | | [Use packages from JitPack](https://learn.microsoft.com/en-us/azure/devops/artifacts/maven/jitpack-upstream?view=azure-devops) | integrations | 0.70 | Integration-focused article for JitPack; includes Azure Artifacts feed settings and Maven/Gradle configuration details. | | [Use packages from Maven Central](https://learn.microsoft.com/en-us/azure/devops/artifacts/maven/upstream-sources?view=azure-devops) | integrations | 0.70 | Covers Maven settings for using Maven Central through Azure Artifacts; includes repository IDs, URLs, and snapshot behavior specific to the service. | | [Use packages from NuGet.org](https://learn.microsoft.com/en-us/azure/devops/artifacts/nuget/upstream-sources?view=azure-devops) | integrations | 0.70 | Shows how to configure projects/CLI to pull from nuget.org through Azure Artifacts; includes NuGet.config entries, source URLs, and Azure-specific parameters. | | [Use packages from the npm registry](https://learn.microsoft.com/en-us/azure/devops/artifacts/npm/upstream-sources?view=azure-devops) | integrations | 0.70 | Documents npmrc configuration, registry URLs, and scope handling for Azure Artifacts; these are product-specific integration details. 
| +| [Use packages from Google Maven Repository](https://learn.microsoft.com/en-us/azure/devops/artifacts/maven/google-maven?view=azure-devops) | integrations | 0.68 | Page describes product-specific integration details for using Google Maven Repository as an upstream source in Azure Artifacts feeds, including required permissions (Feed and Upstream Reader/Collaborator) and Azure DevOps-specific configuration steps and behaviors that go beyond generic Maven usage. | | [Cargo](https://learn.microsoft.com/en-us/azure/devops/artifacts/get-started-cargo?view=azure-devops) | integrations | 0.65 | Get-started article that configures Cargo to use Azure Artifacts; includes registry configuration and Azure-specific URLs. | | [Debug with Visual Studio](https://learn.microsoft.com/en-us/azure/devops/artifacts/symbols/debug-with-symbols-visual-studio?view=azure-devops) | integrations | 0.65 | Shows how to configure Visual Studio symbol settings to use Azure Artifacts symbol server; includes server URLs and configuration options unique to this integration. | | [Debug with WinDbg](https://learn.microsoft.com/en-us/azure/devops/artifacts/symbols/debug-with-symbols-windbg?view=azure-devops) | integrations | 0.65 | Explains configuring WinDbg to consume symbols from Azure Artifacts; includes .sympath settings and server URLs specific to Azure Artifacts. 
| diff --git a/products/azure-automation/azure-automation.csv b/products/azure-automation/azure-automation.csv index bec4d68c..707cfc8a 100644 --- a/products/azure-automation/azure-automation.csv +++ b/products/azure-automation/azure-automation.csv @@ -1,5 +1,5 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type -https://learn.microsoft.com/en-us/azure/automation/add-user-assigned-identity,Using user-assigned managed identity,Using a user-assigned managed identity for an Azure Automation account,Configure user-assigned managed identity for Azure Automation,This article describes how to set up a user-assigned managed identity for Azure Automation accounts.,"This article shows you how to add a user-assigned managed identity for an Azure Automation account and how to use it to access other resources. For more information on how managed identities work with Azure Automation, see Managed identities. Note It is not possible to use a User Assigned Managed Identity on a Hybrid Runbook Worker when a Managed Identity (either System or User assigned) has been created for the Automation Account. If Managed Identity has not been assigned to the Automation Accou",2025-11-17T08:00:00.000Z,how-to,security,0.9,True,"Details adding and using user-assigned managed identities, including constraints with Hybrid Runbook Worker, which is product-specific security behavior.",unchanged +https://learn.microsoft.com/en-us/azure/automation/add-user-assigned-identity,Using user-assigned managed identity,Using a user-assigned managed identity for an Azure Automation account,Configure user-assigned managed identity for Azure Automation,This article describes how to set up a user-assigned managed identity for Azure Automation accounts.,"This article shows you how to add a user-assigned managed identity for an Azure Automation account and how to use it to access other resources.
For more information on how managed identities work with Azure Automation, see Managed identities. Note It is not possible to use a User Assigned Managed Identity on a Hybrid Runbook Worker when a Managed Identity (either System or User assigned) has been created for the Automation Account. If Managed Identity has not been assigned to the Automation Accou",2026-04-14T08:00:00.000Z,how-to,security,0.8,True,"Describes setting up and using a user-assigned managed identity for Automation accounts, including a specific limitation with Hybrid Runbook Worker; this is product-specific identity/security configuration and behavior.",updated https://learn.microsoft.com/en-us/azure/automation/automation-alert-metric,Monitor runbooks with metric alert,Monitor Azure Automation runbooks with metric alerts,Configure metric alerts for Azure Automation runbooks,This article describes how to set up a metric alert based on runbook completion status.,"In this article, you learn how to create a metric alert based on runbook completion status.",2023-01-18T18:06:00.000Z,how-to,configuration,0.65,True,"Describes setting up metric alerts based on runbook completion; likely includes metric names, dimensions, and threshold settings unique to Automation.",unchanged https://learn.microsoft.com/en-us/azure/automation/automation-availability-zones,Availability zones,Availability zones support for Azure Automation,Use availability zones for resilient Azure Automation deployments,This article provides an overview of Azure availability zones and regions for Azure Automation,"Azure availability zones provide improved resiliency and high availability to a service instance in a specific Azure region. Azure Automation now supports availability zones to provide improved resiliency, reliability, and high availability to the service, runbooks, and other automation assets. Azure availability zones are a high-availability offering that protects your applications and data from data center failures.
Availability zones are unique physical locations within an Azure region and ea",2025-11-17T08:00:00.000Z,conceptual,deployment,0.6,True,"Describes availability zone support and regional behavior for Automation, relevant to deployment and high-availability planning.",unchanged https://learn.microsoft.com/en-us/azure/automation/automation-child-runbooks,Create modular runbooks,Create modular runbooks in Azure Automation,Design modular parent-child runbooks in Azure Automation,This article explains how to create a runbook that another runbook calls.,"It's a good practice in Azure Automation to write reusable, modular runbooks with a discrete function that other runbooks call. A parent runbook often calls one or more child runbooks to perform required functionality. There are two ways to call a child runbook: inline or through a cmdlet. The following table summarizes the differences to help you decide which way is better for your scenarios.",2025-11-17T08:00:00.000Z,how-to,best-practices,0.8,True,Provides a comparison table of inline vs cmdlet-based child runbook calls and guidance on when to use each; clear product-specific best practices.,unchanged @@ -12,9 +12,9 @@ https://learn.microsoft.com/en-us/azure/automation/automation-disaster-recovery, https://learn.microsoft.com/en-us/azure/automation/automation-dsc-cd-chocolatey,Set up continuous deployment with Chocolatey,Set up Azure Automation continuous deployment with Chocolatey,Set up continuous deployment with DSC and Chocolatey,This article tells how to set up continuous deployment with State Configuration and the Chocolatey package manager.,"Note Azure Automation State Configuration will be retired on September 30, 2027, please transition to Azure Machine Configuration by that date. For more information, see the blog post announcement. The Azure Machine Configuration service combines features of DSC Extension, Azure Automation State Configuration, and the most commonly requested features from customer feedback.
Azure Machine Configuration also includes hybrid machine support through Arc-enabled servers. Important The Add, Compose configura",2025-11-17T08:00:00.000Z,how-to,deployment,0.65,True,Shows continuous deployment pipeline using State Configuration and Chocolatey; likely includes concrete DSC resource usage and deployment patterns specific to Azure Automation.,unchanged -https://learn.microsoft.com/en-us/azure/automation/automation-dsc-compile,Compile DSC configurations,Compile DSC configurations in Azure Automation State Configuration,Compile DSC configurations in Azure Automation,This article tells how to compile Desired State Configuration (DSC) configurations for Azure Automation.,"Note Azure Automation State Configuration will be retired on September 30, 2027, please transition to Azure Machine Configuration by that date. For more information, see the blog post announcement. The Azure Machine Configuration service combines features of DSC Extension, Azure +https://learn.microsoft.com/en-us/azure/automation/automation-dsc-compile,Compile DSC configurations,Compile DSC configurations in Azure Automation State Configuration,Compile DSC configurations in Azure Automation State Configuration,This article tells how to compile Desired State Configuration (DSC) configurations for Azure Automation.,"Note Azure Automation State Configuration will be retired on September 30, 2027, please transition to Azure Machine Configuration by that date. For more information, see the blog post announcement. The Azure Machine Configuration service combines features of DSC Extension, Azure Automation State Configuration, and the most commonly requested features from customer feedback. -Azure Machine Configuration also includes hybrid machine support through Arc-enabled servers.
Important The Add, Compose configura",2025-11-17T08:00:00.000Z,how-to,configuration,0.7,True,"Covers compilation process; likely includes specific compilation options, job parameters, and Automation behaviors not generally known.",unchanged +Azure Machine Configuration also includes hybrid machine support through Arc-enabled servers. Important The Add, Compose configura",2026-04-15T08:00:00.000Z,how-to,configuration,0.7,True,"Compilation article is likely to document specific compilation options, configuration properties, and possibly limits around compilation jobs, providing detailed configuration parameters and behaviors unique to Azure Automation State Configuration.",updated https://learn.microsoft.com/en-us/azure/automation/automation-dsc-config-data-at-scale,Configure data at scale,Configure data at scale for Azure Automation State Configuration,Configure DSC data at scale in Azure Automation,This article tells how to configure data at scale for Azure Automation State Configuration.,"Applies to:✔️ Windows PowerShell 5.1 Note Azure Automation State Configuration will be retired on September 30, 2027, please transition to Azure Machine Configuration by that date. For more information, see the blog post announcement. The Azure Machine Configuration service combines features of DSC Extension, Azure Automation State Configuration, and the most commonly requested features from customer feedback. Azure Machine Configuration also includes hybrid machine support through Arc-enabled server",2025-11-17T08:00:00.000Z,how-to,configuration,0.65,True,"Focuses on configuring data at scale for State Configuration, likely including specific configuration patterns and structures.",unchanged @@ -32,37 +32,37 @@ Automation State Configuration, and the most commonly requested features from cu Azure Machine Configuration also includes hybrid machine support through Arc-enabled servers.
Important The Add, Compose configura",2025-11-17T08:00:00.000Z,how-to,configuration,0.7,True,"Integration with Azure Monitor Logs; likely includes workspace settings, diagnostic configuration, and table names unique to this integration.",unchanged https://learn.microsoft.com/en-us/azure/automation/automation-dsc-extension-history,Work with State Configuration extension version history,Work with Azure Desired State Configuration extension version history,Use Azure DSC extension version history for configuration,This article shares version history information for the Desired State Configuration (DSC) extension in Azure.,"Note Azure Automation State Configuration will be retired on September 30, 2027, please transition to Azure Machine Configuration by that date. For more information, see the blog post announcement. The Azure Machine Configuration service combines features of DSC Extension, Azure Automation State Configuration, and the most commonly requested features from customer feedback. -Azure Machine Configuration also includes hybrid machine support through Arc-enabled servers. Important The Add, Compose configura",2026-03-05T08:00:00.000Z,release-notes,configuration,0.7,True,"Version history pages typically list extension versions with specific behaviors, configuration changes, and sometimes parameter defaults unique to the product, which qualify as expert configuration knowledge not inferable from general training.",updated +Azure Machine Configuration also includes hybrid machine support through Arc-enabled servers.
Important TheAdd,Compose configura",2026-04-24T08:00:00.000Z,release-notes,configuration,0.7,True,"Version history for the DSC extension typically lists specific extension versions, behaviors, and changes that affect how you configure and manage the extension; this is detailed, product-specific configuration/versioning knowledge not captured by generic training.",updated https://learn.microsoft.com/en-us/azure/automation/automation-dsc-getting-started,Get started with State Configuration,Get started with Azure Automation State Configuration,Perform common Azure Automation State Configuration tasks,This article tells how to do the most common tasks in Azure Automation State Configuration.,"Note Azure Automation State Configuration will be retired on September 30, 2027, please transition toAzure Machine Configurationby that date. For more information, see theblog postannouncement. The Azure Machine Configuration service combines features of DSC Extension, Azure Automation State Configuration, and the most commonly requested features from customer feedback. -Azure Machine Configuration also includes hybrid machine support throughArc-enabled servers. Important TheAdd,Compose configura",2025-11-17T08:00:00.000Z,how-to,configuration,0.6,True,"Getting-started guide for State Configuration typically includes concrete steps, parameters, and configuration examples specific to this service.",unchanged -https://learn.microsoft.com/en-us/azure/automation/automation-dsc-onboarding,Enable State Configuration,Enable Azure Automation State Configuration,Onboard machines to Azure Automation State Configuration,This article tells how to set up machines for management with Azure Automation State Configuration.,"Note Azure Automation State Configuration will be retired on September 30, 2027, please transition toAzure Machine Configurationby that date. For more information, see theblog postannouncement. 
The Azure Machine Configuration service combines features of DSC Extension, Azure +Azure Machine Configuration also includes hybrid machine support throughArc-enabled servers. Important TheAdd,Compose configura",2026-04-15T08:00:00.000Z,how-to,configuration,0.65,True,"Getting-started article for State Configuration typically walks through concrete steps and settings (registering nodes, assigning configurations, checking compliance) with specific property names and options, which constitutes product-specific configuration knowledge beyond generic DSC concepts.",updated +https://learn.microsoft.com/en-us/azure/automation/automation-dsc-onboarding,Enable State Configuration,Enable Azure Automation State Configuration,Enable and onboard machines to Azure Automation State Configuration,This article tells how to set up machines for management with Azure Automation State Configuration.,"Note Azure Automation State Configuration will be retired on September 30, 2027, please transition toAzure Machine Configurationby that date. For more information, see theblog postannouncement. The Azure Machine Configuration service combines features of DSC Extension, Azure Automation State Configuration, and the most commonly requested features from customer feedback. -Azure Machine Configuration also includes hybrid machine support throughArc-enabled servers. Important TheAdd,Compose configura",2025-04-01T08:00:00.000Z,how-to,configuration,0.7,True,"Describes enabling machines for management; likely includes registration commands, configuration modes, and endpoint settings unique to this feature.",unchanged +Azure Machine Configuration also includes hybrid machine support throughArc-enabled servers. 
Important The Add, Compose configura",2026-04-15T08:00:00.000Z,how-to,configuration,0.78,True,"Onboarding/enablement article for State Configuration will include exact onboarding commands, registration parameters, and configuration options for machines, which are product-specific configuration details (e.g., registration keys, URIs, flags) that qualify as expert configuration knowledge.",updated https://learn.microsoft.com/en-us/azure/automation/automation-dsc-overview,Overview,Azure Automation State Configuration overview,,This article provides an overview of Azure Automation State Configuration.,"Note Azure Automation State Configuration will be retired on September 30, 2027, please transition to Azure Machine Configuration by that date. For more information, see the blog post announcement. The Azure Machine Configuration service combines features of DSC Extension, Azure Automation State Configuration, and the most commonly requested features from customer feedback. -Azure Machine Configuration also includes hybrid machine support through Arc-enabled servers. Important The Add, Compose configura",2026-03-05T08:00:00.000Z,overview,,0.2,False,"Overview of Azure Automation State Configuration and retirement notice; no detailed limits, configuration tables, error codes, or product-specific decision matrices.",updated +Azure Machine Configuration also includes hybrid machine support through Arc-enabled servers.
Important The Add, Compose configura",2026-04-24T08:00:00.000Z,overview,,0.2,False,"Overview of Azure Automation State Configuration; described as an overview and retirement notice, likely conceptual and service-level explanation without detailed parameters, limits, or troubleshooting mappings.",updated https://learn.microsoft.com/en-us/azure/automation/automation-dsc-remediate,Remediate noncompliant State Configuration servers,Remediate noncompliant Azure Automation State Configuration servers,Remediate noncompliant servers with State Configuration,This article tells how to reapply configurations on demand to servers that are no longer compliant.,"Note Azure Automation State Configuration will be retired on September 30, 2027, please transition to Azure Machine Configuration by that date. For more information, see the blog post announcement. The Azure Machine Configuration service combines features of DSC Extension, Azure Automation State Configuration, and the most commonly requested features from customer feedback. Azure Machine Configuration also includes hybrid machine support through Arc-enabled servers. Important The Add, Compose configura",2025-11-17T08:00:00.000Z,conceptual,configuration,0.6,True,Describes on-demand reapplication of configurations; likely includes commands and settings for triggering remediation unique to Automation DSC.,unchanged https://learn.microsoft.com/en-us/azure/automation/automation-edit-textual-runbook,Edit textual runbooks,Edit textual runbooks in Azure Automation,Use the Azure Automation textual editor for PowerShell runbooks,This article tells how to use the Azure Automation textual editor to work with PowerShell and PowerShell Workflow runbooks.,"You can use the textual editor in Azure Automation to edit PowerShell runbooks and PowerShell Workflow runbooks. This editor has the typical features of other code editors, such as IntelliSense.
It also uses color coding with additional special features to assist you in accessing resources common to runbooks. The textual editor includes a feature to insert code for cmdlets, assets, and child runbooks into a runbook. Instead of typing in the code yourself, you can select from a list of available res",2025-11-17T08:00:00.000Z,how-to,configuration,0.65,True,Describes editor features like inserting cmdlets/assets/child runbooks and how they map to Automation resources; product-specific authoring configuration.,unchanged https://learn.microsoft.com/en-us/azure/automation/automation-faq,FAQ,Azure Automation FAQ,,This article gives answers to frequently asked questions about Azure Automation.,"This Microsoft FAQ is a list of commonly asked questions about Azure Automation. If you have any other questions about its capabilities, go to the discussion forum and post your questions. When a question is frequently asked, we add it to this article so that it's found quickly and easily.",2024-08-20T08:00:00.000Z,faq,,0.3,False,"FAQ description is generic; no indication of specific error codes, limits, or config parameters in the summary.",unchanged https://learn.microsoft.com/en-us/azure/automation/automation-graphical-authoring-intro,Edit Graphical runbooks,Author graphical runbooks in Azure Automation,Author and configure graphical runbooks in Azure Automation,This article tells how to author a graphical runbook without working with code.,"Important Azure Automation Run as accounts, including Classic Run as accounts have retired on 30 September 2023 and replaced with Managed Identities. You would no longer be able to create or renew Run as accounts through the Azure portal. For more information, see migrating from an existing Run As accounts to managed identity. All runbooks in Azure Automation are Windows PowerShell workflows.
Graphical runbooks and graphical PowerShell Workflow runbooks generate PowerShell code that the Automation ",2025-11-17T08:00:00.000Z,overview,configuration,0.65,True,Explains how graphical runbooks generate PowerShell code and how to configure them; product-specific authoring behavior and options.,unchanged -https://learn.microsoft.com/en-us/azure/automation/automation-hrw-run-runbooks,Run runbooks on Hybrid Runbook Worker,Run Azure Automation Runbooks on a Hybrid Runbook Worker,,This article describes how to run runbooks on machines in your local datacenter or other cloud provider with the Hybrid Runbook Worker.,"Important Note Azure Automation Run As Account has retired on September 30, 2023 and is replaced with Managed Identities. Follow the guidelines on how to start migrating your runbooks to use managed identities. For more information, see migrating from an existing Run As accounts to managed identity. Runbooks that run on a Hybrid Runbook Worker typically manage resources on the local computer or against resources in the local environment where the worker is deployed. Runbooks in Azure Automation typi",2025-07-29T08:00:00.000Z,how-to,,0.4,False,"Describes how to run runbooks on Hybrid Runbook Worker but appears more conceptual/usage-focused without clear evidence of detailed error codes, limits, or config tables.",unchanged +https://learn.microsoft.com/en-us/azure/automation/automation-hrw-run-runbooks,Run runbooks on Hybrid Runbook Worker,Run Azure Automation Runbooks on a Hybrid Runbook Worker,Run and manage runbooks on Hybrid Runbook Workers,This article describes how to run runbooks on machines in your local datacenter or other cloud provider with the Hybrid Runbook Worker.,"Important Runbooks that run on a Hybrid Runbook Worker typically manage resources on the local computer or against resources in the local environment where the worker is deployed. Runbooks in Azure Automation typically manage resources in the Azure cloud.
Even though they're used differently, runbooks that run in Azure Automation and runbooks that run on a Hybrid Runbook Worker are identical in structure. When you author a runbook to run on a Hybrid Runbook Worker, you should edit and test the run",2026-04-15T08:00:00.000Z,how-to,best-practices,0.62,True,"How-to article for running runbooks on Hybrid Runbook Worker machines is likely to include product-specific guidance (where to run what, worker group behavior, gotchas around local vs cloud resources, and configuration nuances) that go beyond generic concepts. This aligns best with product-specific best practices rather than pure configuration tables.",updated https://learn.microsoft.com/en-us/azure/automation/automation-hybrid-runbook-worker,Hybrid Runbook Worker overview,Azure Automation Hybrid Runbook Worker Overview,Configure and use Azure Automation Hybrid Runbook Worker,Know about Hybrid Runbook Worker. How to install and run the runbooks on machines in your local datacenter or cloud provider.,Important Runbooks in Azure Automation might not have access to resources in other clouds or in your on-premises environment because they run on the Azure cloud platform. You can use the Hybrid Runbook Worker feature of Azure Automation to run runbooks directly on the machine hosting the role and against resources in the environment to manage those local resources. Runbooks are stored and managed in Azure Automation and then delivered to one or more assigned machines. 
Azure Automation provides n,2025-07-08T17:24:00.000Z,overview,configuration,0.6,True,Hybrid Runbook Worker overview includes installation and environment-specific configuration details unique to this feature.,unchanged -https://learn.microsoft.com/en-us/azure/automation/automation-limits-quotas,Manage automation limits and quotas,Manage Azure Automation subscription limits and quotas,View and manage Azure Automation limits and quotas,This article provides information on how to view automation limits and request for quota increase or decrease.,This article provides steps to view the current limits and request for quota increase or decrease.,2025-01-28T12:10:00.000Z,how-to,limits-quotas,0.9,True,Explicitly about Automation subscription limits and quotas and how to request changes; expected to list concrete numeric limits and quota behaviors.,unchanged +https://learn.microsoft.com/en-us/azure/automation/automation-limits-quotas,Manage automation limits and quotas,Manage Azure Automation subscription limits and quotas,View and manage Azure Automation limits and quotas,This article provides information on how to view automation limits and request for quota increase or decrease.,This article provides steps to view the current limits and request for quota increase or decrease.,2026-04-15T08:00:00.000Z,how-to,limits-quotas,0.9,True,A dedicated limits/quotas article; almost certainly contains specific numeric limits and quota-increase procedures that are not generally known from training.,updated https://learn.microsoft.com/en-us/azure/automation/automation-linux-hrw-install,Deploy agent-based Linux worker,Deploy an agent-based Linux Hybrid Runbook Worker in Automation,Deploy Linux Hybrid Runbook Worker agent,This article tells how to install an agent-based Hybrid Runbook Worker to run runbooks on Linux-based machines in your local datacenter or cloud environment.,"Important You can use the user Hybrid Runbook Worker feature of Azure Automation to run runbooks directly on 
the Azure or non-Azure machine, including servers registered with Azure Arc-enabled servers. From the machine or server that's hosting the role, you can run runbooks directly on it and against resources in the environment to manage those local resources. For the Linux Hybrid Runbook Worker see Deploy an extension-based Windows or Linux User Hybrid Runbook Worker in Automation After you success",2025-06-12T17:02:00.000Z,how-to,deployment,0.65,True,Step-by-step installation for agent-based Linux Hybrid Runbook Worker with product-specific commands/requirements; contains concrete deployment details beyond generic knowledge.,unchanged https://learn.microsoft.com/en-us/azure/automation/automation-manage-send-joblogs-log-analytics,Forward Azure Automation diagnostic logs to Azure Monitor,Forward Azure Automation job data to Azure Monitor logs,Forward Azure Automation job logs to Azure Monitor,This article tells how to send job status and runbook job streams to Azure Monitor logs.,"Azure Automation can send runbook job status and job streams to your Log Analytics workspace. This process does not involve workspace linking and is completely independent and allows you to perform simple investigations. Job logs and job streams are visible in the Azure portal, or with PowerShell for individual jobs.
With Azure Monitor logs for your Automation account, you can: Using Azure Monitor logs, you can consolidate logs from different resources in the same workspace where it can be analy",2025-11-17T08:00:00.000Z,how-to,integrations,0.75,True,Describes configuration to send job status and streams to Log Analytics workspaces; includes workspace and diagnostic settings specific to Automation–Monitor integration.,unchanged -https://learn.microsoft.com/en-us/azure/automation/automation-managing-data,Management of Azure Automation data,Azure Automation data security,Understand and configure Azure Automation data security,This article helps you learn how Azure Automation protects your privacy and secures your data.,This article contains several topics explaining how data is protected and secured in an Azure Automation environment.,2026-03-05T08:00:00.000Z,overview,security,0.7,True,"Article is specifically about how Azure Automation protects and secures data. Such pages typically include product-specific security mechanisms, data handling behaviors, and possibly RBAC or identity details unique to the service, which qualify as expert security configuration knowledge.",updated +https://learn.microsoft.com/en-us/azure/automation/automation-managing-data,Management of Azure Automation data,Azure Automation data security,Understand and configure Azure Automation data security,This article helps you learn how Azure Automation protects your privacy and secures your data.,This article contains several topics explaining how data is protected and secured in an Azure Automation environment.,2026-04-24T08:00:00.000Z,overview,security,0.7,True,"Focused on how Azure Automation protects and secures data; likely includes product-specific security mechanisms, data handling, and possibly identity/permission details beyond generic security concepts.",updated https://learn.microsoft.com/en-us/azure/automation/automation-network-configuration,Automation network configuration details,Azure 
Automation network configuration details,Configure network requirements for Azure Automation components,"This article provides details of network information required by Azure Automation State Configuration, Azure Automation Hybrid Runbook Worker, Update Management, and Change Tracking and Inventory","This page provides networking details that are required for Hybrid Runbook Worker and State Configuration, and for Update Management and Change Tracking and Inventory.",2024-11-29T08:00:00.000Z,overview,configuration,0.9,True,"Provides detailed network information (endpoints, ports, URLs) required by Automation features, which is product-specific configuration.",unchanged https://learn.microsoft.com/en-us/azure/automation/automation-orchestrator-migration,Migrate from Orchestrator to Azure Automation (Beta),Migrate from Orchestrator to Azure Automation (Beta),Migrate System Center Orchestrator runbooks to Azure Automation,This article tells how to migrate runbooks and integration packs from Orchestrator to Azure Automation.,"Runbooks in System Center 2012 - Orchestrator are based on activities from integration packs that are written specifically for Orchestrator, while runbooks in Azure Automation are based on Windows PowerShell. Graphical runbooks in Azure Automation have a similar appearance to Orchestrator runbooks, with their activities representing PowerShell cmdlets, child runbooks, and assets.
In addition to converting runbooks themselves, you must convert the integration packs with the activities that the runboo",2025-11-17T08:00:00.000Z,how-to,decision-making,0.7,True,"Migration guidance from Orchestrator to Automation, including how to convert integration packs and runbooks; supports technology selection and migration decisions.",unchanged https://learn.microsoft.com/en-us/azure/automation/automation-powershell-workflow,Learn PowerShell Workflow,Learn PowerShell Workflow for Azure Automation,,This article teaches you the differences between PowerShell Workflow and PowerShell and concepts applicable to Automation runbooks.,"Runbooks in Azure Automation are implemented as Windows PowerShell workflows, Windows PowerShell scripts that use Windows Workflow Foundation. A workflow is a sequence of programmed, connected steps that perform long-running tasks or require the coordination of multiple steps across multiple devices or managed nodes. While a workflow is written with Windows PowerShell syntax and launched by Windows PowerShell, it is processed by Windows Workflow Foundation. The benefits of a workflow over a norm",2025-11-17T08:00:00.000Z,overview,,0.45,False,Conceptual explanation of PowerShell Workflow vs PowerShell for Automation runbooks; more educational than product-specific configuration or troubleshooting.,unchanged https://learn.microsoft.com/en-us/azure/automation/automation-role-based-access-control,Manage role permissions and security,Manage role permissions and security in Azure Automation,Assign Azure RBAC roles and permissions for Automation accounts,"This article describes how to use Azure role-based access control (Azure RBAC), which enables access management and role permissions for Azure resources.","Azure role-based access control (Azure RBAC) enables access management for Azure resources. 
Using Azure RBAC, you can segregate duties within your team and grant only the amount of access to users, groups, and applications that they need to perform their jobs. You can grant role-based access to users using the Azure portal, Azure Command-Line tools, or Azure Management APIs.",2025-11-17T08:00:00.000Z,how-to,security,0.8,True,"Describes Automation-specific RBAC usage; likely lists roles, scopes, and permission mappings for Automation resources.",unchanged https://learn.microsoft.com/en-us/azure/automation/automation-runbook-authoring,Azure Automation extension for Visual Studio Code,Runbook authoring using VS Code in Azure Automation,Author and manage Automation runbooks using VS Code,This article provides an overview of authoring runbooks in Azure Automation using Visual Studio Code.,"This article explains the Visual Studio Code extension that you can use to create and manage runbooks. Azure Automation provides a new extension for VS Code to create and manage runbooks. Using this extension, you can perform all runbook management operations such as creating and editing runbooks, triggering a job, tracking recent jobs output, linking a schedule, asset management, and local debugging.",2025-11-17T08:00:00.000Z,overview,configuration,0.6,True,"Describes VS Code extension capabilities and operations for runbook management, including product-specific configuration of tooling.",unchanged -https://learn.microsoft.com/en-us/azure/automation/automation-runbook-execution,Runbook execution overview,Runbook execution in Azure Automation,Design resilient Azure Automation runbook execution behavior,This article provides an overview of the processing of runbooks in Azure Automation.,"Process automation in Azure Automation allows you to create and manage PowerShell, PowerShell Workflow, and graphical runbooks. For details, see Azure Automation runbooks. Automation executes your runbooks based on the logic defined inside them.
If a runbook is interrupted, it restarts at the beginning. This behavior requires you to write runbooks that support being restarted if transient issues occur. Starting a runbook in Azure Automation creates a job, which is a single execution instance of t",2025-11-17T08:00:00.000Z,overview,best-practices,0.6,True,"Discusses runbook execution behavior and restart semantics, requiring product-specific guidance on how to author runbooks to handle interruptions.",unchanged -https://learn.microsoft.com/en-us/azure/automation/automation-runbook-gallery,Use existing runbooks and modules,Use Azure Automation runbooks and modules in PowerShell Gallery,,This article tells how to use runbooks and modules from Microsoft GitHub repos and the PowerShell Gallery.,"Rather than creating your own runbooks and modules in Azure Automation, you can access scenarios that have already been built by Microsoft and the community. You can get Azure-related PowerShell and Python runbooks from the Runbook Gallery in the Azure portal, and modules and runbooks (which may or may not be specific to Azure) from the PowerShell Gallery. You can also contribute to the community by sharing scenarios that you develop. Note The TechNet Script Center is retiring. All of the runbooks fr",2025-11-17T08:00:00.000Z,how-to,,0.5,False,"Describes using Runbook Gallery and PowerShell Gallery; largely discovery and high-level usage, not deep configuration parameters or limits.",unchanged +https://learn.microsoft.com/en-us/azure/automation/automation-runbook-execution,Runbook execution overview,Runbook execution in Azure Automation,,This article provides an overview of the processing of runbooks in Azure Automation.,"Process automation in Azure Automation allows you to create and manage PowerShell, PowerShell Workflow, and graphical runbooks. For details, see Azure Automation runbooks. Automation executes your runbooks based on the logic defined inside them. If a runbook is interrupted, it restarts at the beginning.
This behavior requires you to write runbooks that support being restarted if transient issues occur. Starting a runbook in Azure Automation creates a job, which is a single execution instance of t",2026-04-15T08:00:00.000Z,overview,,0.4,False,"Overview of runbook execution behavior; mostly conceptual description of jobs and restarts without specific numeric limits, configuration parameters, or troubleshooting codes.",updated +https://learn.microsoft.com/en-us/azure/automation/automation-runbook-gallery,Use existing runbooks and modules,Use Azure Automation runbooks and modules in PowerShell Gallery,,This article tells how to use runbooks and modules from Microsoft GitHub repos and the PowerShell Gallery.,"Rather than creating your own runbooks and modules in Azure Automation, you can access scenarios that have already been built by Microsoft and the community. You can get Azure-related PowerShell and Python runbooks from the Runbook Gallery in the Azure portal, and modules and runbooks (which may or may not be specific to Azure) from the PowerShell Gallery. You can also contribute to the community by sharing scenarios that you develop. Note The TechNet Script Center is retiring. All of the runbooks fr",2026-04-15T08:00:00.000Z,how-to,,0.4,False,"Focuses on using gallery content and sharing scenarios; likely a usage/tutorial article without detailed config tables, limits, or troubleshooting mappings.",updated https://learn.microsoft.com/en-us/azure/automation/automation-runbook-graphical-error-handling,Handle errors in graphical runbooks,Handle errors in Azure Automation graphical runbooks,Implement error handling in Azure Automation graphical runbooks,This article tells how to implement error handling logic in graphical runbooks.,"A key design principle to consider for your Azure Automation graphical runbook is the identification of issues that the runbook might experience during execution.
These issues can include success, expected error states, and unexpected error conditions. Often, if there is a non-terminating error that occurs with a runbook activity, Windows PowerShell handles the activity by processing any activity that follows, regardless of the error. The error is likely to generate an exception, but the next ac",2025-11-17T08:00:00.000Z,overview,best-practices,0.8,True,"Provides concrete patterns for handling success, expected, and unexpected errors in graphical runbooks; product-specific error-handling design guidance.",unchanged -https://learn.microsoft.com/en-us/azure/automation/automation-runbook-output-and-messages,Configure runbook output,Configure runbook output and message streams,Configure output and message streams in Azure Automation runbooks,This article tells how to implement error handling logic and describes output and message streams in Azure Automation runbooks.,Most Azure Automation runbooks have some form of output. This output can be an error message to the user or a complex object intended to be used with another runbook. Windows PowerShell provides multiple streams to send output from a script or workflow. Azure Automation works with each of these streams differently. You should follow best practices for using the streams when you're creating a runbook.
The following table briefly describes each stream with its behavior in the Azure portal for publis,2025-11-17T08:00:00.000Z,overview,best-practices,0.8,True,Explains how Automation handles different PowerShell streams and prescribes best practices for error handling and output; product-specific behavior and guidance.,unchanged +https://learn.microsoft.com/en-us/azure/automation/automation-runbook-output-and-messages,Configure runbook output,Configure runbook output and message streams,Configure output and message streams in Azure Automation runbooks,This article tells how to implement error handling logic and describes output and message streams in Azure Automation runbooks.,Most Azure Automation runbooks have some form of output. This output can be an error message to the user or a complex object intended to be used with another runbook. Windows PowerShell provides multiple streams to send output from a script or workflow. Azure Automation works with each of these streams differently. You should follow best practices for using the streams when you're creating a runbook. The following table briefly describes each stream with its behavior in the Azure portal for publis,2026-04-15T08:00:00.000Z,overview,best-practices,0.85,True,"Describes how each PowerShell stream behaves in Azure Automation plus recommended patterns for error handling and output; these are concrete, product-specific DO/DON'T guidelines.",updated https://learn.microsoft.com/en-us/azure/automation/automation-runbook-types,Automation runbook types,Azure Automation Runbook Types,Choose appropriate Azure Automation runbook types,This article describes the types of runbooks that you can use in Azure Automation and considerations for determining which type to use.,"The Azure Automation Process Automation feature supports several types of runbooks, as defined in the following table. To learn about the process automation environment, see Runbook execution in Azure Automation.
Note Azure Automation will follow the support lifecycle of PowerShell and Python language versions in accordance with the timelines published by the parent products, PowerShell and Python, respectively. We recommend that you use runbooks with supported language versions. Take into account t",2025-07-15T08:00:00.000Z,overview,decision-making,0.6,True,"Compares runbook types and considerations for choosing among them, which is product-specific decision guidance.",unchanged https://learn.microsoft.com/en-us/azure/automation/automation-scenario-aws-deployment,Deploy AWS VM with Automation runbook,Deploy an Amazon Web Services VM with an Azure Automation runbook,Provision AWS virtual machines using Azure Automation runbooks,This article tells how to automate the creation of an Amazon Web Services VM.,"In this article, you learn how you can leverage Azure Automation to provision a virtual machine in your Amazon Web Service (AWS) subscription and give that VM a specific name - which AWS refers to as “tagging” the VM.",2025-11-17T08:00:00.000Z,overview,integrations,0.7,True,"Automates AWS VM creation; likely includes AWS PowerShell/SDK parameters, tagging patterns, and cross-cloud configuration details unique to this scenario.",unchanged https://learn.microsoft.com/en-us/azure/automation/automation-scenario-using-watcher-task,Track updated files with watcher task,Track updated files with an Azure Automation watcher task,Create watcher tasks to track file updates in Automation,This article tells how to create a watcher task in the Azure Automation account to watch for new files created in a folder.,"Azure Automation uses a watcher task to look for events and trigger actions with PowerShell runbooks. The watcher task contains two parts, the watcher and the action. A watcher runbook runs at an interval defined in the watcher task, and outputs data to an action runbook. Note Watcher tasks are not supported in Microsoft Azure operated by 21Vianet.
Important Starting in May 2020, using Azure Logic Apps is the recommended and supported way to monitor for events, schedule recurring tasks, and trig",2025-11-17T08:00:00.000Z,how-to,configuration,0.6,True,"Watcher task scenario; likely includes watcher configuration intervals, runbook bindings, and limitations specific to Azure Automation watcher tasks.",unchanged @@ -72,35 +72,35 @@ https://learn.microsoft.com/en-us/azure/automation/automation-security-overview, https://learn.microsoft.com/en-us/azure/automation/automation-send-email,Send an email from a runbook,Send an email from an Azure Automation runbook,Send email from Azure Automation runbook using SendGrid,This article tells how to send an email from within a runbook.,"You can send an email from a runbook with SendGrid using PowerShell. If you don't have an Azure subscription, create a free account before you begin.",2025-11-17T08:00:00.000Z,how-to,integrations,0.7,True,"Shows how to integrate SendGrid with Automation runbooks; likely includes API keys, connection parameters, and PowerShell patterns specific to this integration.",unchanged https://learn.microsoft.com/en-us/azure/automation/automation-services,What are the various Automation services in Azure?,Automation services in Azure - overview,,This article tells what the Automation services in Azure are and how to compare and use them to automate the lifecycle of infrastructure and applications.,"This article explains various Automation services offered in the Azure environment. These services can automate business and operational processes and solve integration problems among multiple services, systems, and processes. Automation services can define input, action, activity to be performed, conditions, error handling, and output generation. Using these services, you can run various activities on a schedule or do a manual demand-based execution.
Each service has its unique advantages and t,2024-08-12T11:22:00.000Z,overview,,0.2,False,"Conceptual comparison of automation services; no detailed decision matrices, limits, or config tables evident from summary.",unchanged https://learn.microsoft.com/en-us/azure/automation/automation-subscription-limits-faq,Automation limits and quotas,Azure Automation subscription limits and quotas,Azure Automation subscription limits and quotas reference,This article provides automation subscription and service limits and includes answers to frequently asked questions.,This article provides an overview of the default quotas or limits offered to different resources in Azure Automation.,2025-03-10T08:00:00.000Z,faq,limits-quotas,0.95,True,"Explicitly described as providing default quotas and limits for Automation resources, including numeric values and constraints.",unchanged -https://learn.microsoft.com/en-us/azure/automation/automation-update-azure-modules,Update Azure PowerShell modules,Update Azure PowerShell modules in Azure Automation,Update and manage Azure PowerShell modules in Automation accounts,This article tells how to update common Azure PowerShell modules provided by default in Azure Automation.,"The most common PowerShell modules are provided by default in each Automation account. See Default modules. As the Azure team updates the Azure modules regularly, changes can occur with the included cmdlets. These changes, for example, renaming a parameter or deprecating a cmdlet entirely, can negatively affect your runbooks. Note You can't delete global modules, which are modules that Automation provides out of the box.
Important New Runtime environment experience allows you to manage modules an",2025-11-17T08:00:00.000Z,how-to,configuration,0.7,True,"Explains how default/global modules are updated and managed, including runtime environment behavior; product-specific module configuration details.",unchanged +https://learn.microsoft.com/en-us/azure/automation/automation-update-azure-modules,Update Azure PowerShell modules,Update Azure PowerShell modules in Azure Automation,Update built-in Azure PowerShell modules in Automation accounts,This article tells how to update common Azure PowerShell modules provided by default in Azure Automation.,"The most common PowerShell modules are provided by default in each Automation account. See Default modules. As the Azure team updates the Azure modules regularly, changes can occur with the included cmdlets. These changes, for example, renaming a parameter or deprecating a cmdlet entirely, can negatively affect your runbooks. Note You can't delete global modules, which are modules that Automation provides out of the box. Important New Runtime environment experience allows you to manage modules an",2026-04-20T11:11:00.000Z,how-to,configuration,0.7,True,"Describes how default/global modules are managed, updated, and constrained in Automation, including runtime environment behavior; this is product-specific configuration knowledge.",updated https://learn.microsoft.com/en-us/azure/automation/automation-use-azure-ad,Configure authentication with Microsoft Entra ID,Use Microsoft Entra ID in Azure Automation to authenticate to Azure,Configure Microsoft Entra ID authentication for Azure Automation,This article tells how to use Microsoft Entra ID within Azure Automation as the provider for authentication to Azure.,"The Microsoft Entra ID service enables a number of administrative tasks, such as user management, domain management, and single sign-on configuration.
This article describes how to use Microsoft Entra ID within Azure Automation as the provider for authentication to Azure.",2025-07-15T11:09:00.000Z,how-to,security,0.7,True,"Covers using Entra ID as authentication provider for Azure Automation; likely includes specific app registrations, scopes, and auth configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/automation/automation-webhooks,Start a runbook from a webhook,Start an Azure Automation Runbook from a Webhook,Trigger Azure Automation runbooks via webhooks from external services,This article tells how to use a webhook to start a runbook in Azure Automation from an HTTP call.,"A webhook allows an external service to start a particular runbook in Azure Automation through a single HTTP request. External services include Azure DevOps Services, GitHub, Azure Monitor logs, and custom applications. Such a service can use a webhook to start a runbook without implementing the full Azure Automation API. You can compare webhooks to other methods of starting a runbook inStarting a runbook in Azure Automation. 
To understand client requirements for TLS 1.2 or higher with webhooks,",2025-07-15T11:09:00.000Z,how-to,integrations,0.8,True,"Describes webhook URL, headers, payload, and TLS requirements for starting runbooks from services like GitHub and Azure DevOps; integration-focused with concrete parameters.",unchanged -https://learn.microsoft.com/en-us/azure/automation/automation-windows-hrw-install,Deploy agent-based Windows worker,Deploy an agent-based Windows Hybrid Runbook Worker in Automation,Deploy agent-based Windows Hybrid Runbook Workers in Azure Automation,This article tells how to deploy an agent-based Hybrid Runbook Worker that you can use to run runbooks on Windows-based machines in your local datacenter or cloud environment.,"Important You can use the user Hybrid Runbook Worker feature of Azure Automation to run runbooks directly on an Azure or non-Azure machine, including servers registered withAzure Arc-enabled servers. From the machine or server that's hosting the role, you can run runbooks directly against it and against resources in the environment to manage those local resources. Azure Automation stores and manages runbooks and then delivers them to one or more chosen machines. For more information, seeDeploy a",2025-11-17T08:00:00.000Z,how-to,deployment,0.75,True,"Details deployment of agent-based Hybrid Runbook Workers on Windows, including requirements and constraints; product-specific deployment configuration.",unchanged +https://learn.microsoft.com/en-us/azure/automation/automation-webhooks,Start a runbook from a webhook,Start an Azure Automation Runbook from a Webhook,Trigger Azure Automation runbooks using webhooks,This article tells how to use a webhook to start a runbook in Azure Automation from an HTTP call.,"A webhook allows an external service to start a particular runbook in Azure Automation through a single HTTP request. External services include Azure DevOps Services, GitHub, Azure Monitor logs, and custom applications. 
Such a service can use a webhook to start a runbook without implementing the full Azure Automation API. You can compare webhooks to other methods of starting a runbook in Starting a runbook in Azure Automation. To understand client requirements for TLS 1.2 or higher with webhooks,",2026-04-15T08:00:00.000Z,how-to,integrations,0.7,True,"Webhook article for Automation typically includes request schema, required headers, URL format, and TLS requirements; product-specific integration parameters and constraints.",updated
+https://learn.microsoft.com/en-us/azure/automation/automation-windows-hrw-install,Deploy agent-based Windows worker,Deploy an agent-based Windows Hybrid Runbook Worker in Automation,Deploy agent-based Windows Hybrid Runbook Workers in Azure Automation,This article tells how to deploy an agent-based Hybrid Runbook Worker that you can use to run runbooks on Windows-based machines in your local datacenter or cloud environment.,"Important You can use the user Hybrid Runbook Worker feature of Azure Automation to run runbooks directly on an Azure or non-Azure machine, including servers registered with Azure Arc-enabled servers. From the machine or server that's hosting the role, you can run runbooks directly against it and against resources in the environment to manage those local resources. Azure Automation stores and manages runbooks and then delivers them to one or more chosen machines. 
For more information, see Deploy a",2026-04-20T11:11:00.000Z,how-to,deployment,0.7,True,Details how to deploy the agent-based Hybrid Runbook Worker role on Windows machines with Automation-specific requirements; this is a deployment-focused article for a particular worker type.,updated
https://learn.microsoft.com/en-us/azure/automation/change-tracking/guidance-migration-log-analytics-monitoring-agent,Migration from Log Analytics to Azure Monitoring Agent version,Migration guidance from Change Tracking and inventory using Log Analytics to Azure Monitoring Agent,Migrate Change Tracking from Log Analytics agent to AMA,An overview on how to migrate from Change Tracking and inventory using Log Analytics to Azure Monitoring Agent.,"Applies to:✔️ Windows VMs ✔️ Linux VMs ✔️ Azure Arc-enabled servers. This article provides guidance to move from Change Tracking and Inventory using Log Analytics (LA) version to the Azure Monitoring Agent (AMA) version. Using the Azure portal, you can migrate from Change Tracking & Inventory with LA agent to Change Tracking & Inventory with AMA and there are two ways to do this migration: Additionally, you can use a script to migrate all Virtual Machines and Arc-enabled non-Azure machines at a ",2025-11-17T08:00:00.000Z,how-to,decision-making,0.65,True,"Migration guidance between LA and AMA; likely includes comparison of behaviors, supported scenarios, and recommended migration paths, aiding technology selection and transition.",unchanged
https://learn.microsoft.com/en-us/azure/automation/compose-configurationwithcompositeresources,Compose DSC configurations,Compose DSC configurations,Compose DSC configurations using composite resources,This article tells how to compose configurations using composite resources in Azure Automation State Configuration.,"Note Azure Automation State Configuration will be retired on September 30, 2027, please transition to Azure Machine Configuration by that date. For more information, see the blog post announcement. 
The Azure Machine Configuration service combines features of DSC Extension, Azure Automation State Configuration, and the most commonly requested features from customer feedback. Azure Machine Configuration also includes hybrid machine support through Arc-enabled servers. Important TheAdd,Compose configura",2025-11-17T08:00:00.000Z,how-to,configuration,0.7,True,"Explains composing configurations; likely includes DSC syntax, resource structure, and Automation-specific handling of composite resources.",unchanged
-https://learn.microsoft.com/en-us/azure/automation/context-switching,Context switching in Azure Automation,Context switching in Azure Automation,Avoid Azure Automation runbook issues from context switching,This article explains context switching and how to avoid runbook issues.,"Context switching is when the context in one process changes the context in a different process. An Azure context is a set of information that defines the target of Azure PowerShell cmdlets. The context consists of the following properties: When an account signs on that can access several subscriptions, any of those subscriptions may be added to the user's context. To guarantee the correct subscription, you must declare it when connecting. For example, useAdd-AzAccount -Credential $Cred -subscri",2025-11-17T08:00:00.000Z,overview,best-practices,0.7,True,"Explains Azure context behavior and prescribes specific patterns (e.g., explicit subscription selection) to avoid subtle runbook bugs; product-specific gotchas and recommendations.",unchanged
+https://learn.microsoft.com/en-us/azure/automation/context-switching,Context switching in Azure Automation,Context switching in Azure Automation,Avoid context switching issues in Azure Automation runbooks,This article explains context switching and how to avoid runbook issues.,"Context switching is when the context in one process changes the context in a different process. 
An Azure context is a set of information that defines the target of Azure PowerShell cmdlets. The context consists of the following properties: When an account signs on that can access several subscriptions, any of those subscriptions may be added to the user's context. To guarantee the correct subscription, you must declare it when connecting. For example, use Add-AzAccount -Credential $Cred -subscri",2026-04-20T11:11:00.000Z,overview,best-practices,0.7,True,Explains Azure context behavior and how to explicitly set subscription/account in runbooks; contains product-specific PowerShell usage patterns and gotchas unique to Azure Automation.,updated
https://learn.microsoft.com/en-us/azure/automation/delete-account,Manage Automation account,Manage your Azure Automation account,,This article tells how to delete your Automation account across the different configuration scenarios and restore a deleted Automation account,"After you enable an Azure Automation account to help automate IT or business process, or enable its other features to support operations management of your Azure and non-Azure machines, you may decide to stop using the Automation account. If you have enabled features that depend on integration with an Azure Monitor Log Analytics workspace, there are more steps required to complete this action. 
This article tells you how to completely remove your Automation account through the Azure portal, using",2025-05-08T08:00:00.000Z,how-to,,0.45,False,Describes deleting/restoring Automation accounts and related resources; mostly procedural without detailed configuration tables or error-code-based troubleshooting.,unchanged
https://learn.microsoft.com/en-us/azure/automation/delete-run-as-account,Delete Run As account,Delete an Azure Automation Run As account,,This article tells how to delete a Run As account with PowerShell or from the Azure portal.,"Important Azure Automation Run as accounts, including Classic Run as accounts have retired on 30 September 2023 and replaced with Managed Identities. You would no longer be able to create or renew Run as accounts through the Azure portal. For more information, see migrating from an existing Run As accounts to managed identity. Run As accounts in Azure Automation provide authentication for managing resources on the Azure Resource Manager or Azure Classic deployment model using Automation runbooks an",2025-04-11T08:00:00.000Z,how-to,,0.4,False,Describes retirement and deletion of Run As accounts and migration to managed identities; primarily lifecycle and portal/PowerShell steps without deep config parameters or error mappings.,unchanged
https://learn.microsoft.com/en-us/azure/automation/disable-local-authentication,Disable local authentication,Disable local authentication in Azure Automation,Disable local authentication for Azure Automation securely,This article describes disabling local authentication in Azure Automation.,"Important When you disable local authentication, it impacts starting a runbook using a webhook, source control auto sync, and Automation Desired State Configuration. For more information, see the available alternatives. Azure Automation provides Microsoft Entra authentication support for all Automation service public endpoints. 
This critical security enhancement removes certificate dependencies and gives organizations control to disable local authentication methods. This feature provides you with",2026-04-02T11:11:00.000Z,how-to,security,0.72,True,"The page is focused on a concrete security configuration change (disabling local authentication) for Azure Automation, including its specific impact on features like webhooks, source control auto sync, and DSC, and the move to Microsoft Entra-based auth. This is product-specific security guidance with concrete implications and configuration behavior, fitting the security sub-skill. It is not just conceptual; it describes how this setting affects particular mechanisms and what alternatives to use.",unchanged -https://learn.microsoft.com/en-us/azure/automation/disable-managed-identity-for-automation,Disable system-assigned managed identity,Disable system-assigned managed identity for Azure Automation account,Disable system-assigned managed identity on Automation accounts,This article explains how to disable a system-assigned managed identity for an Azure Automation account.,"You can disable a system-assigned managed identity in Azure Automation by using the Azure portal, or using REST API.",2025-11-17T08:00:00.000Z,how-to,security,0.8,True,"Explains disabling system-assigned managed identity via portal or REST, which is identity/security configuration specific to Automation.",unchanged +https://learn.microsoft.com/en-us/azure/automation/disable-managed-identity-for-automation,Disable system-assigned managed identity,Disable system-assigned managed identity for Azure Automation account,Disable system-assigned managed identity in Azure Automation,This article explains how to disable a system-assigned managed identity for an Azure Automation account.,"You can disable a system-assigned managed identity in Azure Automation by using the Azure portal, or using REST API.",2026-04-20T11:11:00.000Z,how-to,security,0.7,True,Explains how to disable a 
system-assigned managed identity for an Automation account via portal and REST; involves product-specific identity configuration steps and parameters.,updated
https://learn.microsoft.com/en-us/azure/automation/enable-managed-identity-for-automation,Using system-assigned managed identity,Using a system-assigned managed identity for an Azure Automation account,Enable system-assigned managed identity for Azure Automation,This article describes how to set up managed identity for Azure Automation accounts.,"This article shows you how to enable a system-assigned managed identity for an Azure Automation account and how to use it to access other resources. For more information on how managed identities work with Azure Automation, see Managed identities. If you don't have an Azure subscription, create a free account before you begin.",2025-01-20T08:00:00.000Z,how-to,security,0.9,True,"Shows how to configure system-assigned managed identity and use it to access resources, a core security/identity configuration.",unchanged
https://learn.microsoft.com/en-us/azure/automation/enforce-job-execution-hybrid-worker,Use Azure Policy to enforce job execution,Enforce job execution on Azure Automation Hybrid Runbook Worker,Enforce Hybrid Runbook Worker job execution via policy,This article tells how to use a custom Azure Policy definition to enforce job execution on an Azure Automation Hybrid Runbook Worker.,"Important Starting a runbook on a Hybrid Runbook Worker uses a Run on option that allows you to specify the name of a Hybrid Runbook Worker group when initiating from the Azure portal, with the Azure PowerShell, or REST API. When a group is specified, one of the workers in that group retrieves and runs the runbook. If your runbook does not specify this option, Azure Automation runs the runbook in the Azure sandbox. 
Anyone in your organization who is a member of the Automation Job Operator or higher ",2025-11-17T08:00:00.000Z,overview,configuration,0.65,True,"Uses a custom Azure Policy definition to force jobs onto Hybrid Runbook Workers; likely includes specific policy JSON, parameters, and settings unique to this feature.",unchanged
-https://learn.microsoft.com/en-us/azure/automation/extension-based-hybrid-runbook-worker-install,Deploy extension-based worker,Deploy an extension-based Windows or Linux User Hybrid Runbook Worker in Azure Automation,Deploy extension-based Hybrid Runbook Workers for Windows and Linux,This article provides information about deploying the extension-based User Hybrid Runbook Worker to run runbooks on Windows or Linux machines in your on-premises datacenter or other cloud environment.,"The extension-based onboarding is only forUserHybrid Runbook Workers. This article describes how to: deploy a user Hybrid Runbook Worker on a Windows or Linux machine, remove the worker, and remove a Hybrid Runbook Worker group. ForSystemHybrid Runbook Worker onboarding, seeDeploy an agent-based Windows Hybrid Runbook Worker in AutomationorDeploy an agent-based Linux Hybrid Runbook Worker in Automation. 
You can use the user Hybrid Runbook Worker feature of Azure Automation to run runbooks direct",2025-05-12T08:00:00.000Z,how-to,deployment,0.8,True,Product-specific deployment method (extension-based) with steps and constraints for Windows/Linux machines and worker groups; distinct deployment pattern.,unchanged
+https://learn.microsoft.com/en-us/azure/automation/extension-based-hybrid-runbook-worker-install,Deploy extension-based worker,Deploy an extension-based Windows or Linux User Hybrid Runbook Worker in Azure Automation,Deploy extension-based Hybrid Runbook Workers on Windows and Linux,This article provides information about deploying the extension-based User Hybrid Runbook Worker to run runbooks on Windows or Linux machines in your on-premises datacenter or other cloud environment.,"The extension-based onboarding is only for User Hybrid Runbook Workers. This article describes how to: deploy a user Hybrid Runbook Worker on a Windows or Linux machine, remove the worker, and remove a Hybrid Runbook Worker group. For System Hybrid Runbook Worker onboarding, see Deploy an agent-based Windows Hybrid Runbook Worker in Automation or Deploy an agent-based Linux Hybrid Runbook Worker in Automation. You can use the user Hybrid Runbook Worker feature of Azure Automation to run runbooks direct",2026-04-15T08:00:00.000Z,how-to,deployment,0.75,True,Covers deploying extension-based user Hybrid Runbook Workers with platform-specific requirements and steps; this is a product-specific deployment pattern for on-premises/other clouds.,updated
https://learn.microsoft.com/en-us/azure/automation/graphical-runbook-sdk,Work with the Graphical runbook SDK,Use the Azure Automation graphical runbook SDK (preview),Use the Azure Automation graphical runbook SDK,This article tells how to use the Azure Automation graphical runbook SDK (preview).,Graphical runbooks help manage the complexities of the underlying Windows PowerShell or PowerShell Workflow code. 
The Microsoft Azure Automation graphical authoring SDK enables developers to create and edit graphical runbooks for use with Azure Automation. This article describes basic steps in creating a graphical runbook from your code.,2025-11-17T08:00:00.000Z,how-to,integrations,0.65,True,SDK-focused article for programmatically creating/editing graphical runbooks; likely includes API/SDK parameters and structures unique to this product.,unchanged
https://learn.microsoft.com/en-us/azure/automation/how-to/automation-region-dns-records,Manage DNS records used by Automation,Azure Datacenter DNS records used by Azure Automation,Configure Azure Automation regional DNS records for firewalled networks,This article provides the DNS records required by Azure Automation features when restricting communication to a specific Azure region hosting that Automation account.,"The Azure Automation service uses many Domain Name System (DNS) records for features to connect to the service. If you have an Automation account configured for a specific region, you can restrict communication to that regional datacenter. You might need to know these records to allow the following Automation features to work when it's hosted behind a firewall: Note Linux Hybrid Runbook Worker registration will fail with the new records unless it is version 1.6.10.2 or higher. You must upgrade to ",2025-11-17T08:00:00.000Z,overview,configuration,0.85,True,Provides concrete DNS record lists per region and feature for Automation accounts behind firewalls; highly specific configuration data not inferable from general knowledge.,unchanged
-https://learn.microsoft.com/en-us/azure/automation/how-to/move-account,Move Automation account to another subscription,Move your Azure Automation account to another subscription,,This article tells how to move your Automation account to another subscription.,"Azure Automation allows you to move some resources to a new resource group or subscription. 
You can move resources through the Azure portal, PowerShell, the Azure CLI, or the REST API. To learn more about the process, seeMove resources to a new resource group or subscription. The Automation account is one of the resources that you can move. In this article, you'll learn to move Automation accounts to another resource or subscription. The high-level steps for moving your Automation account are:",2025-11-17T08:00:00.000Z,how-to,,0.45,False,How-to for moving Automation accounts between subscriptions/resource groups; reuses generic Azure move-resource process without deep product-specific limits or configs.,unchanged
+https://learn.microsoft.com/en-us/azure/automation/how-to/move-account,Move Automation account to another subscription,Move your Azure Automation account to another subscription,,This article tells how to move your Automation account to another subscription.,"Azure Automation allows you to move some resources to a new resource group or subscription. You can move resources through the Azure portal, PowerShell, the Azure CLI, or the REST API. To learn more about the process, see Move resources to a new resource group or subscription. The Automation account is one of the resources that you can move. In this article, you'll learn to move Automation accounts to another resource or subscription. 
The high-level steps for moving your Automation account are:",2026-04-20T11:11:00.000Z,how-to,,0.3,False,"Primarily a how-to move resource guide; likely step-by-step portal/CLI instructions without detailed limits, config tables, or error-code-based troubleshooting.",updated
https://learn.microsoft.com/en-us/azure/automation/how-to/private-link-security,Connect privately to Automation account,Use Azure Private Link to securely connect networks to Azure Automation,Secure Azure Automation access with Private Link and private endpoints,Use Azure Private Link to securely connect networks to Azure Automation,"Azure Private Endpoint is a network interface that connects you privately and securely to a service powered by Azure Private Link. Private Endpoint uses a private IP address from your VNet, effectively bringing the Automation service into your VNet. Network traffic between the machines on the VNet and the Automation account traverses over the VNet and a private link on the Microsoft backbone network, eliminating exposure from the public internet. For example, you have a VNet where you have disab",2024-09-10T08:00:00.000Z,overview,security,0.75,True,"Details using Private Link/private endpoints for Automation, including network/security configuration specifics unique to this service.",unchanged
https://learn.microsoft.com/en-us/azure/automation/how-to/runbook-authoring-extension-for-vscode,Use Automation extension for Visual Studio Code,Azure Automation extension for Visual Studio Code,,Learn how to use the Azure Automation extension for Visual Studio Code to author runbooks.,"This article explains the Visual Studio Code extension that you can use to create and manage runbooks. 
You can perform all runbook management operations such as creating runbooks, editing runbooks, triggering a job, tracking recent job outputs, linking a schedule, asset management and local debugging.",2023-01-18T18:06:00.000Z,how-to,,0.3,False,"Explains using VS Code extension to author runbooks; mostly tooling workflow, not configuration matrices, limits, or product-specific troubleshooting.",unchanged
https://learn.microsoft.com/en-us/azure/automation/learn/automation-tutorial-runbook-textual,Create a PowerShell Workflow runbook,Tutorial - Create a PowerShell Workflow runbook in Azure Automation,,"This tutorial teaches you to create, test, and publish a PowerShell Workflow runbook.","This tutorial walks you through the creation of a PowerShell Workflow runbook in Azure Automation. PowerShell Workflow runbooks are text runbooks based on Windows PowerShell Workflow. You can create and edit the code of the runbook using the text editor in the Azure portal. Note This article is applicable only for PowerShell 5.1. PowerShell 7+ versions do not support Workflows, and outdated runbooks cannot be updated. We recommend you to use PowerShell 7.2 textual runbooks for advanced features su",2025-11-17T08:00:00.000Z,tutorial,,0.3,False,"Tutorial on creating a PowerShell Workflow runbook; mostly authoring steps, not configuration matrices or limits.",unchanged
https://learn.microsoft.com/en-us/azure/automation/learn/automation-tutorial-runbook-textual-python-3,Create a Python 3 runbook,Create a Python 3.8 runbook in Azure Automation,,"This article teaches you to create, test, and publish a simple Python 3.8 runbook in your Azure Automation account.",This tutorial walks you through the creation of a Python 3.8 runbook in Azure Automation. 
Python runbooks compile under Python 2.7 and 3.8. You can directly edit the code of the runbook using the text editor in the Azure portal.,2025-11-17T08:00:00.000Z,tutorial,,0.3,False,Tutorial for Python runbook creation; basic how-to without expert-level configuration or limits.,unchanged
-https://learn.microsoft.com/en-us/azure/automation/learn/powershell-runbook-managed-identity,Create PowerShell runbook using managed identity,Create PowerShell Runbook Using Managed Identity in Azure Automation,Use managed identity in Azure Automation PowerShell runbooks,"In this tutorial, you learn how to use managed identities with a PowerShell runbook in Azure Automation.","This tutorial walks you through creating aPowerShell runbookin Azure Automation that uses amanaged identityto interact with resources. PowerShell runbooks are based on Windows PowerShell. A managed identity from Microsoft Entra ID allows your runbook to easily access other Microsoft Entra protected resources. In this tutorial, you learn how to: If you don't have an Azure subscription, create afree accountbefore you begin.",2025-06-27T22:11:00.000Z,tutorial,security,0.7,True,"Shows how a runbook uses a managed identity to access Entra-protected resources, including product-specific auth patterns and permissions.",unchanged
+https://learn.microsoft.com/en-us/azure/automation/learn/powershell-runbook-managed-identity,Create PowerShell runbook using managed identity,Create PowerShell Runbook Using Managed Identity in Azure Automation,,"In this tutorial, you learn how to use managed identities with a PowerShell runbook in Azure Automation.","This tutorial walks you through creating a PowerShell runbook in Azure Automation that uses a managed identity to interact with resources. PowerShell runbooks are based on Windows PowerShell. A managed identity from Microsoft Entra ID allows your runbook to easily access other Microsoft Entra protected resources. 
In this tutorial, you learn how to: If you don't have an Azure subscription, create a free account before you begin.",2026-04-20T11:11:00.000Z,tutorial,,0.3,False,"Tutorial on creating a PowerShell runbook with managed identity; step-by-step usage but no detailed configuration tables, RBAC role lists, or product-specific error mappings.",updated
https://learn.microsoft.com/en-us/azure/automation/manage-office-365,Manage Office 365 services,Manage Office 365 services using Azure Automation,Manage Office 365 services with Azure Automation,This article tells how to use Azure Automation to manage Office 365 subscription services.,"You can use Azure Automation for management of Office 365 subscription services, for products such as Microsoft Word and Microsoft Outlook. Interactions with Office 365 are enabled by Microsoft Entra ID. See Use Microsoft Entra ID in Azure Automation to authenticate to Azure.",2024-09-15T08:00:00.000Z,how-to,integrations,0.65,True,"Covers Automation integration with Office 365 via Entra ID; likely includes connection setup, permissions, and cmdlet usage specific to this integration.",unchanged
https://learn.microsoft.com/en-us/azure/automation/manage-runbooks,Manage runbooks,Manage runbooks in Azure Automation,Manage Azure Automation runbooks with recommended design patterns,This article tells how to manage runbooks in Azure Automation.,You can add a runbook to Azure Automation by either creating a new one or importing an existing one from a file or the Runbook Gallery. This article provides information for managing a runbook and recommended patterns and best practices with runbook design. 
You can find all the details of accessing community runbooks and modules in Runbook and module galleries for Azure Automation.,2025-11-17T08:00:00.000Z,how-to,best-practices,0.7,True,"Includes recommended patterns and best practices for runbook design specific to Azure Automation, beyond generic scripting advice.",unchanged
https://learn.microsoft.com/en-us/azure/automation/manage-runtime-environment,Runtime environment scenarios,Manage Runtime Environment and Associated Runbooks in Azure Automation,Configure runtime environments and associated runbooks in Azure Automation,This article tells how to manage runbooks in a Runtime environment and associated runbooks in Azure Automation,This article provides information on how to create Runtime Environment and perform various operations through Azure portal and REST API.,2025-07-09T08:00:00.000Z,how-to,configuration,0.7,True,Covers creation and management of runtime environments via portal and REST; likely includes environment properties and constraints specific to Automation.,unchanged
https://learn.microsoft.com/en-us/azure/automation/manage-sql-server-in-automation,Manage databases in Azure SQL database using Azure Automation,Manage databases in Azure SQL databases using Azure Automation,Manage Azure SQL databases using Automation managed identity,This article explains how to use an Azure SQL Server database using a system-assigned managed identity in Azure Automation.,"This article describes the procedure to connect and manage databases in Azure SQL database using Azure Automation's system-assigned managed identity. With Azure Automation, you can manage databases in Azure SQL Database by using the latest Az PowerShell cmdlets that are available in Azure Az PowerShell. Azure Automation has these Azure Az PowerShell cmdlets available out of the box, so that you can perform all the SQL database management tasks within the service. 
You can also pair these cmdlets in A",2024-09-15T08:00:00.000Z,how-to,integrations,0.7,True,"Describes connecting Automation to Azure SQL via system-assigned identity and Az PowerShell cmdlets, a product-specific integration pattern.",unchanged -https://learn.microsoft.com/en-us/azure/automation/migrate-existing-agent-based-hybrid-worker-to-extension-based-workers,Migrate existing Agent-based Hybrid Workers to Extension-based Workers,Migrate existing agent-based hybrid workers to extension-based-workers in Azure Automation,Migrate Azure Automation hybrid workers to extension-based,This article provides information on how to migrate an existing agent-based hybrid worker to extension based workers.,Important Azure Automation Agent-based User Hybrid Runbook Worker (Windows and Linux) has retired on31 August 2024and is no longer supported. Follow the guidelines in this article on how to migrate from an existing Agent-based User Hybrid Runbook Workers to Extension-based Hybrid Workers. This article describes the benefits of Extension-based User Hybrid Runbook Worker and how to migrate existing Agent-based User Hybrid Runbook Workers to Extension-based Hybrid Workers. There are two Hybrid Runb,2026-03-05T08:00:00.000Z,how-to,deployment,0.68,True,"The article gives concrete, product-specific migration guidance from agent-based to extension-based hybrid workers, including sequencing, prerequisites, and platform-specific steps that affect how production workers are deployed and operated. 
This is migration/deployment-focused expert knowledge rather than a generic overview, but it does not center on limits, security roles, or configuration parameter tables.",updated +https://learn.microsoft.com/en-us/azure/automation/migrate-existing-agent-based-hybrid-worker-to-extension-based-workers,Migrate existing Agent-based Hybrid Workers to Extension-based Workers,Migrate existing agent-based hybrid workers to extension-based-workers in Azure Automation,Migrate agent-based Hybrid Runbook Workers to extension-based workers,This article provides information on how to migrate an existing agent-based hybrid worker to extension-based workers.,Important Azure Automation Agent-based User Hybrid Runbook Worker (Windows and Linux) has retired on 31 August 2024 and is no longer supported. Follow the guidelines in this article on how to migrate from existing Agent-based User Hybrid Runbook Workers to Extension-based Hybrid Workers. This article describes the benefits of Extension-based User Hybrid Runbook Worker and how to migrate existing Agent-based User Hybrid Runbook Workers to Extension-based Hybrid Workers. There are two Hybrid Runb,2026-04-24T08:00:00.000Z,how-to,decision-making,0.7,True,"Provides migration guidance between agent-based and extension-based workers, including retirement dates and benefits; helps decide migration approach and path with product-specific constraints.",updated https://learn.microsoft.com/en-us/azure/automation/migrate-run-as-accounts-managed-identity,Migrate Run As account to managed identity,Migrate from a Run As account to Managed identities,Plan and execute migration from Run As to managed identities,This article describes how to migrate from a Run As account to managed identities in Azure Automation.,"Important Azure Automation Run as accounts, including Classic Run as accounts, have retired on 30 September 2023 and have been replaced with Managed Identities.
You can no longer create or renew Run as accounts through the Azure portal. For more information about migration cadence and the support timeline for Run As account creation and certificate renewal, see the frequently asked questions. Run As accounts in Azure Automation provide authentication for managing resources deployed through Azure ",2025-11-17T08:00:00.000Z,how-to,decision-making,0.65,True,"Migration-focused article with cadence, support timelines, and concrete guidance on when/how to move from Run As accounts to managed identities; fits decision-making around authentication model migration.",unchanged https://learn.microsoft.com/en-us/azure/automation/overview,What is Azure Automation?,Azure Automation overview,,This article tells what Azure Automation is and how to use it to automate the lifecycle of infrastructure and applications.,"Automation is needed in three broad areas of cloud operations: Azure Automation delivers a cloud-based automation service that supports consistent management across your Azure and non-Azure environments. It includes process automation, configuration management, shared capabilities, and heterogeneous features. There are several Azure services that can deliver the above requirements, where each service includes a set of capabilities and serves a role as a programmable platform to build cloud solut",2025-04-24T08:00:00.000Z,overview,,0.2,False,"High-level overview of Azure Automation capabilities without concrete limits, configs, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/automation/policy-reference,Azure Policy built-ins,Built-in policy definitions for Azure Automation,Use built-in Azure Policy definitions for Automation,Lists Azure Policy built-in policy definitions for Azure Automation. 
These built-in policy definitions provide common approaches to managing your Azure resources.,"This page is an index of Azure Policy built-in policy @@ -128,7 +128,7 @@ the built-ins for a security control individually to help make your Azure resources compliant with the specific standard. The title of each built-in policy definition links to the policy definition in ",2024-05-15T08:00:00.000Z,sample,security,0.8,True,"Lists specific built-in policy definitions and controls for Automation, which are product-specific security/compliance configurations.",unchanged https://learn.microsoft.com/en-us/azure/automation/shared-resources/certificates,Manage certificates,Manage certificates in Azure Automation,Securely manage certificates for Azure Automation runbooks and DSC,This article tells how to work with certificates for access by runbooks and DSC configurations.,"Azure Automation stores certificates securely for access by runbooks and DSC configurations, by using the Get-AzAutomationCertificate cmdlet for Azure Resource Manager resources. Secure certificate storage allows you to create runbooks and DSC configurations that use certificates for authentication, or add them to Azure or third-party resources. Note Secure assets in Azure Automation include credentials, certificates, connections, and encrypted variables. These assets are encrypted and stored in A",2026-01-22T18:12:00.000Z,how-to,security,0.75,True,Details secure storage/encryption of certificate assets and how to access them via specific cmdlets; product-specific security configuration patterns.,unchanged https://learn.microsoft.com/en-us/azure/automation/shared-resources/credentials,Manage credentials,Manage credentials in Azure Automation,Create and use credential assets securely in Azure Automation,This article tells how to create credential assets and use them in a runbook or DSC configuration.,"An Automation credential asset holds an object that contains security credentials, such as a user name and a password. 
Runbooks and DSC configurations use cmdlets that accept a PSCredential object for authentication. Alternatively, they can extract the user name and password of the PSCredential object to provide to some application or service requiring authentication. Note Secure assets in Azure Automation include credentials, certificates, connections, and encrypted variables. These assets are encr",2025-11-17T08:00:00.000Z,how-to,security,0.75,True,"Focuses on secure credential asset handling, encryption, and usage in runbooks/DSC; product-specific security configuration.",unchanged
-https://learn.microsoft.com/en-us/azure/automation/shared-resources/modules,Manage modules in Azure Automation,Manage modules in Azure Automation,Plan migration from AzureRM to Az modules in Azure Automation,This article tells how to use PowerShell modules to enable cmdlets in runbooks and DSC resources in DSC configurations.,"Note StartingFebruary 1, 2025, Azure Automation willdiscontinuethe execution of all the runbooks that use AzureRM modules. StartingNovember 1, 2024, you won't be able to create new runbooks using AzureRM modules. The AzureRM PowerShell module has been officially deprecated as ofFebruary 29, 2024. We recommend you to migrate from the AzureRM module to the Az PowerShell module to ensure continued support and updates. 
While the AzureRM module may still work, it is no longer maintained or supported,",2025-05-27T08:00:00.000Z,how-to,decision-making,0.7,True,Contains deprecation timelines and guidance on moving from AzureRM to Az modules; supports decisions about module/runtime strategy and migration timing.,unchanged
+https://learn.microsoft.com/en-us/azure/automation/shared-resources/modules,Manage modules in Azure Automation,Manage modules in Azure Automation,Manage and migrate PowerShell modules in Azure Automation,This article tells how to use PowerShell modules to enable cmdlets in runbooks and DSC resources in DSC configurations.,"Note Starting February 1, 2025, Azure Automation will discontinue the execution of all the runbooks that use AzureRM modules. Starting November 1, 2024, you won't be able to create new runbooks using AzureRM modules. The AzureRM PowerShell module has been officially deprecated as of February 29, 2024. We recommend that you migrate from the AzureRM module to the Az PowerShell module to ensure continued support and updates. While the AzureRM module may still work, it is no longer maintained or supported,",2026-04-15T08:00:00.000Z,how-to,best-practices,0.7,True,"Includes deprecation timelines, module usage guidance, and migration specifics (AzureRM to Az) that are product- and date-specific; these are concrete recommendations and edge cases for this service.",updated https://learn.microsoft.com/en-us/azure/automation/shared-resources/schedules,Manage schedules,Manage schedules in Azure Automation,Configure schedules for Azure Automation runbooks,This article tells how to create and work with a schedule in Azure Automation.,"To schedule a runbook in Azure Automation to start at a specified time, you link it to one or more schedules. A schedule can be configured to either run once or on a recurring hourly or daily schedule for runbooks in the Azure portal. 
You can also schedule them for weekly, monthly, specific days of the week or days of the month, or a particular day of the month. A runbook can be linked to multiple schedules, and a schedule can have multiple runbooks linked to it. Note Azure Automation supports D",2025-11-17T08:00:00.000Z,how-to,configuration,0.7,True,"Details supported schedule types (once, hourly, daily, weekly, monthly, specific days) and how they behave; concrete configuration options for scheduling.",unchanged https://learn.microsoft.com/en-us/azure/automation/shared-resources/variables,Manage variables,Manage variables in Azure Automation,Define and use variable assets in Azure Automation,This article tells how to work with variables in runbooks and DSC configurations.,"Variable assets are values that are available to all runbooks and DSC configurations in your Automation account. You can manage them from the Azure portal, from PowerShell, within a runbook, or in a DSC configuration. Automation variables are useful for the following scenarios: Sharing a value among multiple runbooks or DSC configurations. Sharing a value among multiple jobs from the same runbook or DSC configuration. Managing a value used by runbooks or DSC configurations from the portal or fro",2025-11-17T08:00:00.000Z,overview,configuration,0.7,True,Describes Automation variable asset behavior and management via portal/PowerShell/runbooks; includes specific patterns for sharing values across jobs and runbooks.,unchanged https://learn.microsoft.com/en-us/azure/automation/source-control-integration,Use source control integration,Use Source Control Integration in Azure Automation,Configure Azure Automation source control integration,This article tells you how to synchronize Azure Automation source control with other repositories.,"Source control integration in Azure Automation supports single-direction synchronization from your source control repository. 
Source control allows you to keep your runbooks in your Automation account up to date with scripts in your GitHub or Azure DevOps source control repository. This feature makes it easy to promote code that has been tested in your development environment to your production Automation account. Source control integration lets you easily collaborate with your team, track chang",2025-07-02T17:10:00.000Z,how-to,configuration,0.7,True,"Covers one-way sync setup with GitHub/Azure DevOps; likely includes specific configuration fields, connection settings, and schedules unique to Automation source control.",unchanged @@ -139,17 +139,17 @@
https://learn.microsoft.com/en-us/azure/automation/troubleshoot/collect-data-microsoft-azure-automation-case,Data to Collect when Opening a Case for Microsoft Azure Automation,Data to collect when opening a case for Microsoft Azure Automation,Collect diagnostic data for Azure Automation support cases,This article describes the information to gather before opening a case for Azure Automation with Microsoft Azure Support.,"This article describes the information that you should gather before you open a case for Azure Automation with Microsoft Azure Support. This information isn't required to open a case; however, it helps the support team resolve the problem quickly. 
Note For more information, refer to Knowledge Base article 4034605 - How to capture Azure Automation-scripted diagnostics.",2022-10-25T11:27:00.000Z,troubleshooting,troubleshooting,0.7,True,"Describes what logs and data to gather; likely includes specific cmdlets, log locations, and configuration exports unique to Automation support.",unchanged https://learn.microsoft.com/en-us/azure/automation/troubleshoot/desired-state-configuration,Troubleshoot State Configuration Issues,Troubleshoot Azure Automation State Configuration issues,Troubleshoot Azure Automation State Configuration problems,This article tells how to troubleshoot and resolve Azure Automation State Configuration issues.,"Note Azure Automation State Configuration will be retired on September 30, 2027, please transition toAzure Machine Configurationby that date. For more information, see theblog postannouncement. The Azure Machine Configuration service combines features of DSC Extension, Azure Automation State Configuration, and the most commonly requested features from customer feedback.
-Azure Machine Configuration also includes hybrid machine support throughArc-enabled servers. Important TheAdd,Compose configura",2024-10-22T08:00:00.000Z,troubleshooting,troubleshooting,0.8,True,"Troubleshooting State Configuration; likely includes DSC error codes, compliance issues, and remediation steps specific to Azure Automation DSC.",unchanged
-https://learn.microsoft.com/en-us/azure/automation/troubleshoot/extension-based-hybrid-runbook-worker,Troubleshoot Extension-based Hybrid Runbook Worker Issues,Troubleshoot extension-based Hybrid Runbook Worker issues in Azure Automation,Troubleshoot extension-based Hybrid Runbook Worker issues,This article tells how to troubleshoot and resolve issues that arise with Azure Automation extension-based Hybrid Runbook Workers.,"This article provides information on troubleshooting and resolving issues with Azure Automation extension-based Hybrid Runbook Workers. 
For troubleshooting agent-based workers, seeTroubleshoot agent-based Hybrid Runbook Worker issues in Automation. For general information, seeHybrid Runbook Worker overview.",2025-05-08T08:00:00.000Z,troubleshooting,troubleshooting,0.85,True,"Focused on extension-based workers; likely includes extension status codes, logs, and resolution steps specific to this worker type.",unchanged -https://learn.microsoft.com/en-us/azure/automation/troubleshoot/hybrid-runbook-worker,Troubleshoot Agent-based Hybrid Runbook Worker Issues,Troubleshoot agent-based Hybrid Runbook Worker issues in Azure Automation,Troubleshoot agent-based Hybrid Runbook Worker issues,This article tells how to troubleshoot and resolve issues that arise with Azure Automation agent-based Hybrid Runbook Workers.,"Important This article provides information on troubleshooting and resolving issues with Azure Automation agent-based Hybrid Runbook Workers. For troubleshooting extension-based workers, seeTroubleshoot extension-based Hybrid Runbook Worker issues in Automation. For general information, seeHybrid Runbook Worker overview.",2023-09-17T08:00:00.000Z,troubleshooting,troubleshooting,0.85,True,"Agent-based worker troubleshooting; expected to list agent logs, error codes, and configuration fixes unique to Hybrid Runbook Workers.",unchanged -https://learn.microsoft.com/en-us/azure/automation/troubleshoot/managed-identity,Troubleshoot Managed Identity Issues,Troubleshoot Azure Automation managed identity issues,Troubleshoot managed identity issues in Azure Automation,This article tells how to troubleshoot and resolve issues when using a managed identity with an Automation account.,"This article discusses solutions to problems that you might encounter when you use a managed identity with your Automation account. 
For general information about using managed identity with Automation accounts, seeAzure Automation account authentication overview.",2024-04-23T11:15:00.000Z,troubleshooting,troubleshooting,0.85,True,"Explicit troubleshooting guide; expected to list specific error messages, causes, and resolutions for Automation managed identities.",unchanged -https://learn.microsoft.com/en-us/azure/automation/troubleshoot/runbooks,Troubleshoot Runbook Issues,Troubleshoot Azure Automation runbook issues,Troubleshoot Azure Automation runbook execution issues,This article tells how to troubleshoot and resolve issues with Azure Automation runbooks.,"This article describes runbook issues that might occur and how to resolve them. For general information, seeRunbook execution in Azure Automation.",2024-05-09T08:00:00.000Z,troubleshooting,troubleshooting,0.85,True,"Runbook troubleshooting article; expected to contain specific error messages, timeouts, and diagnostic steps unique to Automation runbooks.",unchanged -https://learn.microsoft.com/en-us/azure/automation/troubleshoot/shared-resources,Troubleshoot Shared Resources Issues,Troubleshoot Azure Automation shared resource issues,Troubleshoot Azure Automation shared resource problems,This article tells how to troubleshoot and resolve issues with Azure Automation shared resources.,This article discusses issues that might arise when you're usingshared resourcesin Azure Automation.,2023-10-03T22:21:00.000Z,troubleshooting,troubleshooting,0.8,True,Troubleshooting shared resources; likely includes symptom → cause → fix mappings and possibly error codes unique to Automation shared resources.,unchanged -https://learn.microsoft.com/en-us/azure/automation/tutorial-configure-servers-desired-state,Configure servers to a desired state and manage drift,Configure machines to a desired state in Azure Automation,,This article tells how to configure machines to a desired state using Azure Automation State Configuration.,"Note Azure Automation State 
Configuration will be retired on September 30, 2027, please transition toAzure Machine Configurationby that date. For more information, see theblog postannouncement. The Azure Machine Configuration service combines features of DSC Extension, Azure +Azure Machine Configuration also includes hybrid machine support through Arc-enabled servers. Important The Add,Compose configura",2026-04-15T08:00:00.000Z,troubleshooting,troubleshooting,0.85,True,"Troubleshooting article for State Configuration will map specific compliance, reporting, or configuration application issues to causes and fixes, including error codes and diagnostic steps, which is specialized troubleshooting content.",updated +https://learn.microsoft.com/en-us/azure/automation/troubleshoot/extension-based-hybrid-runbook-worker,Troubleshoot Extension-based Hybrid Runbook Worker Issues,Troubleshoot extension-based Hybrid Runbook Worker issues in Azure Automation,Troubleshoot extension-based Hybrid Runbook Worker issues,This article tells how to troubleshoot and resolve issues that arise with Azure Automation extension-based Hybrid Runbook Workers.,"This article provides information on troubleshooting and resolving issues with Azure Automation extension-based Hybrid Runbook Workers. For troubleshooting agent-based workers, see Troubleshoot agent-based Hybrid Runbook Worker issues in Automation. 
For general information, see Hybrid Runbook Worker overview.",2026-04-15T08:00:00.000Z,troubleshooting,troubleshooting,0.9,True,"Troubleshooting guide for extension-based Hybrid Runbook Workers will document specific failure modes, logs, error codes, and remediation steps unique to this worker type, fitting the troubleshooting category strongly.",updated +https://learn.microsoft.com/en-us/azure/automation/troubleshoot/hybrid-runbook-worker,Troubleshoot Agent-based Hybrid Runbook Worker Issues,Troubleshoot agent-based Hybrid Runbook Worker issues in Azure Automation,Troubleshoot agent-based Hybrid Runbook Worker issues,This article tells how to troubleshoot and resolve issues that arise with Azure Automation agent-based Hybrid Runbook Workers.,"Important This article provides information on troubleshooting and resolving issues with Azure Automation agent-based Hybrid Runbook Workers. For troubleshooting extension-based workers, see Troubleshoot extension-based Hybrid Runbook Worker issues in Automation. For general information, see Hybrid Runbook Worker overview.",2026-04-15T08:00:00.000Z,troubleshooting,troubleshooting,0.9,True,"Similar to index 10 but for agent-based workers; focuses on diagnosing and resolving concrete issues with agent-based Hybrid Runbook Workers, including specific errors and log locations, which is expert troubleshooting knowledge.",updated +https://learn.microsoft.com/en-us/azure/automation/troubleshoot/managed-identity,Troubleshoot Managed Identity Issues,Troubleshoot Azure Automation managed identity issues,Troubleshoot managed identity issues in Azure Automation accounts,This article tells how to troubleshoot and resolve issues when using a managed identity with an Automation account.,"This article discusses solutions to problems that you might encounter when you use a managed identity with your Automation account. 
For general information about using managed identity with Automation accounts, see Azure Automation account authentication overview.",2026-04-20T11:11:00.000Z,troubleshooting,troubleshooting,0.9,True,"Explicit troubleshooting article for managed identity issues; will contain specific error messages/codes, causes, and resolution steps (e.g., RBAC assignments, identity states) that map symptoms to solutions, which is classic troubleshooting expert knowledge.",updated +https://learn.microsoft.com/en-us/azure/automation/troubleshoot/runbooks,Troubleshoot Runbook Issues,Troubleshoot Azure Automation runbook issues,Diagnose and fix Azure Automation runbook execution issues,This article tells how to troubleshoot and resolve issues with Azure Automation runbooks.,"This article describes runbook issues that might occur and how to resolve them. For general information, see Runbook execution in Azure Automation.",2026-04-20T11:11:00.000Z,troubleshooting,troubleshooting,0.9,True,"Runbook troubleshooting article is explicitly about issues and resolutions; it will include specific error messages, job status meanings, and corrective actions, which are product-specific troubleshooting patterns.",updated +https://learn.microsoft.com/en-us/azure/automation/troubleshoot/shared-resources,Troubleshoot Shared Resources Issues,Troubleshoot Azure Automation shared resource issues,Troubleshoot Azure Automation shared resource problems,This article tells how to troubleshoot and resolve issues with Azure Automation shared resources.,This article discusses issues that might arise when you're using shared resources in Azure Automation.,2026-04-20T11:11:00.000Z,troubleshooting,troubleshooting,0.88,True,"Troubleshooting guide for shared resources will enumerate concrete issues, error conditions, and their resolutions (e.g., module import failures, credential asset issues), providing symptom→cause→solution mappings specific to Azure Automation.",updated 
+https://learn.microsoft.com/en-us/azure/automation/tutorial-configure-servers-desired-state,Configure servers to a desired state and manage drift,Configure machines to a desired state in Azure Automation,Configure machines to desired state with Azure Automation,This article tells how to configure machines to a desired state using Azure Automation State Configuration.,"Note Azure Automation State Configuration will be retired on September 30, 2027; please transition to Azure Machine Configuration by that date. For more information, see the blog post announcement. The Azure Machine Configuration service combines features of DSC Extension, Azure Automation State Configuration, and the most commonly requested features from customer feedback.
-Azure Machine Configuration also includes hybrid machine support throughArc-enabled servers. Important TheAdd,Compose configura",2026-04-14T11:11:00.000Z,tutorial,,0.3,False,"Tutorial-style guidance for configuring machines to a desired state; likely step-by-step instructions without structured limits, configuration parameter tables, or troubleshooting mappings.",updated
+Azure Machine Configuration also includes hybrid machine support through Arc-enabled servers. Important The Add,Compose configura",2026-04-24T08:00:00.000Z,tutorial,configuration,0.7,True,"Tutorial for configuring machines to a desired state using Azure Automation State Configuration will include concrete configuration steps, parameter names, and settings for assigning and monitoring configurations, which are product-specific configuration patterns rather than just conceptual guidance.",updated https://learn.microsoft.com/en-us/azure/automation/whats-new,What's new?,What's New in Azure Automation,,Significant updates to Azure Automation updated each month.,"Azure Automation receives improvements on an ongoing basis. 
To stay up to date with the most recent developments, this article provides you with information about: This page is updated monthly, so revisit it regularly. If you're looking for items older than six months, you can find them in Archive for What's new in Azure Automation.",2026-03-10T08:00:00.000Z,overview,,0.1,False,"Monthly 'what's new' changelog/updates page; primarily release notes and high-level feature announcements without structured limits, configuration tables, error mappings, or decision matrices that match the defined expert-knowledge sub-skill types.",unchanged
-https://learn.microsoft.com/en-us/azure/automation/whats-new-archive,Archive for What's new,Archive for What's new in Azure Automation,,"The What's new release notes in the Overview section of this content set contain six months of activity. Thereafter, the items are removed from the main article and put into this article.","Caution This article references CentOS, a Linux distribution that has reached the End Of Life (EOL) status. Please consider your use and planning accordingly. For more information, see theCentOS End Of Life guidance. The primaryWhat's new in Azure Automation?article contains updates for the last six months, while this article contains all the older information. What's new in Azure Automation? provides you with information about:",2026-03-05T08:00:00.000Z,overview,,0.1,False,"Archive of release notes and high-level change log information; does not focus on concrete limits, configuration parameters, or other structured expert details as defined by the sub-skill types.",updated
+https://learn.microsoft.com/en-us/azure/automation/whats-new-archive,Archive for What's new,Archive for What's new in Azure Automation,,"The What's new release notes in the Overview section of this content set contain six months of activity. 
Thereafter, the items are removed from the main article and put into this article.","Caution This article references CentOS, a Linux distribution that has reached the End Of Life (EOL) status. Please consider your use and planning accordingly. For more information, see the CentOS End Of Life guidance. The primary What's new in Azure Automation? article contains updates for the last six months, while this article contains all the older information. What's new in Azure Automation? provides you with information about:",2026-04-24T08:00:00.000Z,overview,,0.1,False,"Archive of release notes and updates; primarily timeline and marketing/overview style content without concrete limits, configs, or troubleshooting mappings.",updated https://learn.microsoft.com/en-us/azure/azure-change-tracking-inventory/change-tracking-inventory-support-matrix,Supported regions,Azure Change Tracking and Inventory Support Matrix,Support matrix for Change Tracking and Inventory with AMA,Get a summary of support settings and limitations for enabling Azure Change Tracking and Inventory and tracking changes.,"Azure Change Tracking and Inventory monitors changes and provides inventory logs for servers across Azure, on-premises, and other cloud environments. This article summarizes support settings and limitations when you enable Change Tracking and Inventory and track changes. 
It also provides information about the supported regions and mappings for Change Tracking and Inventory by using the Azure Monitor Agent (AMA).",2026-01-10T06:10:00.000Z,overview,limits-quotas,0.8,True,"Explicitly a support matrix and limitations page; likely includes region support tables, OS/version support, and specific constraints that qualify as limits/quotas.",unchanged https://learn.microsoft.com/en-us/azure/azure-change-tracking-inventory/enable-change-tracking-at-scale-machines-blade,At scale using Azure portal - Machines blade (New),Enable Change Tracking and Inventory at Scale with the Azure Portal - Machines Pane,Enable Change Tracking and Inventory at scale via Machines pane,This article describes how to enable Azure Change Tracking and Inventory at scale for Windows and Linux VMs by using the Machines pane in the Azure portal.,Applies to:✔️ Windows VMs ✔️ Linux VMs ✔️ Windows Registry ✔️ Windows Files ✔️ Linux Files ✔️ Windows Software This article provides procedures that show how you can enable Azure Change Tracking and Inventory at scale by using the Azure portal Machines pane.,2026-01-10T06:10:00.000Z,how-to,configuration,0.65,True,"Portal-based scale enablement; likely includes specific configuration options, data sources, and parameter values for enabling the feature on many VMs.",unchanged https://learn.microsoft.com/en-us/azure/azure-change-tracking-inventory/enable-change-tracking-at-scale-policy,At scale using Azure Policy,Enable Change Tracking and Inventory at Scale for Azure VMs by Using Azure Policy,Enable Change Tracking and Inventory at scale with Azure Policy,"Learn how to use Azure Policy to enable Azure Change Tracking and Inventory at scale for Windows and Linux VMs, including Azure Arc-enabled VMs and Azure Virtual Machine Scale Sets.",Applies to:✔️ Windows VMs ✔️ Linux VMs ✔️ Windows Registry ✔️ Windows Files ✔️ Linux Files ✔️ Windows Software This article provides detailed procedures on how to enable Azure Change Tracking and 
Inventory at scale by using Azure Policy.,2026-01-10T06:10:00.000Z,how-to,configuration,0.7,True,"Uses Azure Policy for scale enablement; likely includes policy definitions, parameters, and assignment settings specific to this scenario.",unchanged diff --git a/products/azure-automation/report.md b/products/azure-automation/report.md index 4ab052ac..92a84fa8 100644 --- a/products/azure-automation/report.md +++ b/products/azure-automation/report.md @@ -1,43 +1,42 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: security: 'Securing Automation accounts: identities, RBAC, auth methods, encryption, - certificates/credentials, private endpoints, and policy/compliance for secure - runbooks and access.' - configuration: Configuring and managing Azure Automation runbooks, DSC/State Configuration, - Hybrid Runbook Workers, alerts, schedules, source control, policies, packages, - and deployment/runtime settings. - deployment: Guides for deploying resilient Automation accounts, setting up DR and - continuous deployment, and installing/migrating Windows/Linux Hybrid Runbook Workers - and agents. - best-practices: Best practices for structuring, chaining, and managing runbooks, - handling errors and output streams, ensuring resilient execution, and avoiding - context-switching issues in Azure Automation. - integrations: Integrating Automation runbooks with Azure/AWS/Office 365/SQL, authenticating - via identities/webhooks, deploying ARM, sending logs to Monitor, and emailing - via SendGrid - limits-quotas: 'Limits, quotas, and version/support details for Azure Automation: - DSC extension changes, Automation resource limits, subscription quotas, and Change - Tracking/Inventory support with AMA.' - decision-making: Guidance on choosing Azure Automation runbook types and planning - migrations (Orchestrator, Log Analytics agent, Hybrid workers, Run As accounts, - AzureRM→Az, and agent-to-extension changes). 
- troubleshooting: 'Diagnosing and fixing Azure Automation issues: DSC/State Configuration, - Hybrid Runbook Workers (agent/extension), managed identities, runbook failures, + certificates/credentials, Private Link, Azure Policy, and Terraform-based secure + provisioning.' + configuration: 'Configuring Azure Automation runbooks and State Configuration: DSC + compilation and remediation, Hybrid Runbook Workers, schedules, alerts, policies, + networking, packages, source control, and deployment.' + deployment: Guides for deploying resilient Automation accounts, setting up DR, and + installing/configuring Hybrid Runbook Workers (Windows/Linux, agent- and extension-based) + and DSC/Chocolatey CD. + best-practices: 'Designing robust runbooks: modular parent-child patterns, Hybrid + Runbook Workers, error handling, output/log streams, context switching issues, + and managing/migrating PowerShell modules.' + integrations: Patterns for integrating runbooks with AWS, ARM templates, webhooks, + email (SendGrid), Azure Monitor, Office 365, Azure SQL, and using the graphical + runbook SDK and managed identity. + limits-quotas: Azure Automation limits per subscription/account, how to view/manage + quotas, and AMA-based Change Tracking & Inventory support/scale constraints. + decision-making: 'Guidance on choosing runbook types and planning migrations: Orchestrator + to Automation, Log Analytics to AMA, agent to extension workers, and Run As accounts + to managed identities.' + troubleshooting: 'Diagnosing and fixing Azure Automation issues: runbook failures, + DSC problems, Hybrid Runbook Worker (agent/extension), managed identity/auth errors, shared resources, and collecting support diagnostics.' skill_description: Expert knowledge for Azure Automation development including troubleshooting, best practices, decision making, limits & quotas, security, configuration, integrations - & coding patterns, and deployment. 
Use when building runbooks, DSC/State Configuration, - Hybrid Runbook Workers, webhooks/identities auth, or Azure/AWS/SQL integrations, - and other Azure Automation related development tasks. Not for Azure Functions (use - azure-functions), Azure Logic Apps (use azure-logic-apps), Azure Scheduler (use - azure-scheduler), Azure Update Manager (use azure-update-manager). -use_when: Use when building runbooks, DSC/State Configuration, Hybrid Runbook Workers, - webhooks/identities auth, or Azure/AWS/SQL integrations, and other Azure Automation - related development tasks. + & coding patterns, and deployment. Use when building runbooks, Hybrid Runbook Workers, + DSC/State Configuration, webhooks/integrations, or managed identity auth, and other + Azure Automation related development tasks. Not for Azure Functions (use azure-functions), + Azure Logic Apps (use azure-logic-apps), Azure Scheduler (use azure-scheduler), + Azure DevOps (use azure-devops). +use_when: Use when building runbooks, Hybrid Runbook Workers, DSC/State Configuration, + webhooks/integrations, or managed identity auth, and other Azure Automation related + development tasks. confusable_not_for: Not for Azure Functions (use azure-functions), Azure Logic Apps - (use azure-logic-apps), Azure Scheduler (use azure-scheduler), Azure Update Manager - (use azure-update-manager). + (use azure-logic-apps), Azure Scheduler (use azure-scheduler), Azure DevOps (use + azure-devops). 
--- # Azure Automation Crawl Report @@ -51,8 +50,8 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Logic A ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 6 -- **Unchanged**: 109 +- **Updated Pages**: 30 +- **Unchanged**: 85 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-automation/azure-automation.csv` @@ -60,13 +59,13 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Logic A | Type | Count | Percentage | |------|-------|------------| -| best-practices | 6 | 5.2% | -| configuration | 37 | 32.2% | +| best-practices | 7 | 6.1% | +| configuration | 38 | 33.0% | | decision-making | 5 | 4.3% | -| deployment | 7 | 6.1% | +| deployment | 6 | 5.2% | | integrations | 9 | 7.8% | | limits-quotas | 3 | 2.6% | -| security | 17 | 14.8% | +| security | 16 | 13.9% | | troubleshooting | 7 | 6.1% | | *(Unclassified)* | 24 | 20.9% | @@ -74,18 +73,47 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Logic A ### Updated Pages +- [Run runbooks on Hybrid Runbook Worker](https://learn.microsoft.com/en-us/azure/automation/automation-hrw-run-runbooks) + - Updated: 2025-07-29T08:00:00.000Z → 2026-04-15T08:00:00.000Z - [Overview](https://learn.microsoft.com/en-us/azure/automation/automation-dsc-overview) - - Updated: 2025-11-17T08:00:00.000Z → 2026-03-05T08:00:00.000Z + - Updated: 2026-03-05T08:00:00.000Z → 2026-04-24T08:00:00.000Z +- [Get started with State Configuration](https://learn.microsoft.com/en-us/azure/automation/automation-dsc-getting-started) + - Updated: 2025-11-17T08:00:00.000Z → 2026-04-15T08:00:00.000Z +- [Enable State Configuration](https://learn.microsoft.com/en-us/azure/automation/automation-dsc-onboarding) + - Updated: 2025-04-01T08:00:00.000Z → 2026-04-15T08:00:00.000Z - [Configure servers to a desired state and manage drift](https://learn.microsoft.com/en-us/azure/automation/tutorial-configure-servers-desired-state) - - Updated: 
2025-11-17T08:00:00.000Z → 2026-04-14T11:11:00.000Z + - Updated: 2026-04-14T11:11:00.000Z → 2026-04-24T08:00:00.000Z +- [Compile DSC configurations](https://learn.microsoft.com/en-us/azure/automation/automation-dsc-compile) + - Updated: 2025-11-17T08:00:00.000Z → 2026-04-15T08:00:00.000Z - [Work with State Configuration extension version history](https://learn.microsoft.com/en-us/azure/automation/automation-dsc-extension-history) - - Updated: 2025-11-17T08:00:00.000Z → 2026-03-05T08:00:00.000Z -- [Migrate existing Agent-based Hybrid Workers to Extension-based Workers](https://learn.microsoft.com/en-us/azure/automation/migrate-existing-agent-based-hybrid-worker-to-extension-based-workers) - - Updated: 2025-04-07T08:00:00.000Z → 2026-03-05T08:00:00.000Z + - Updated: 2026-03-05T08:00:00.000Z → 2026-04-24T08:00:00.000Z +- [Troubleshoot Managed Identity Issues](https://learn.microsoft.com/en-us/azure/automation/troubleshoot/managed-identity) + - Updated: 2024-04-23T11:15:00.000Z → 2026-04-20T11:11:00.000Z +- [Troubleshoot Shared Resources Issues](https://learn.microsoft.com/en-us/azure/automation/troubleshoot/shared-resources) + - Updated: 2023-10-03T22:21:00.000Z → 2026-04-20T11:11:00.000Z +- [Troubleshoot Runbook Issues](https://learn.microsoft.com/en-us/azure/automation/troubleshoot/runbooks) + - Updated: 2024-05-09T08:00:00.000Z → 2026-04-20T11:11:00.000Z +- [Troubleshoot Extension-based Hybrid Runbook Worker Issues](https://learn.microsoft.com/en-us/azure/automation/troubleshoot/extension-based-hybrid-runbook-worker) + - Updated: 2025-05-08T08:00:00.000Z → 2026-04-15T08:00:00.000Z +- [Troubleshoot Agent-based Hybrid Runbook Worker Issues](https://learn.microsoft.com/en-us/azure/automation/troubleshoot/hybrid-runbook-worker) + - Updated: 2023-09-17T08:00:00.000Z → 2026-04-15T08:00:00.000Z +- [Troubleshoot State Configuration Issues](https://learn.microsoft.com/en-us/azure/automation/troubleshoot/desired-state-configuration) + - Updated: 2024-10-22T08:00:00.000Z → 
2026-04-15T08:00:00.000Z - [Archive for What's new](https://learn.microsoft.com/en-us/azure/automation/whats-new-archive) - - Updated: 2023-08-01T08:00:00.000Z → 2026-03-05T08:00:00.000Z + - Updated: 2026-03-05T08:00:00.000Z → 2026-04-24T08:00:00.000Z +- [Create PowerShell runbook using managed identity](https://learn.microsoft.com/en-us/azure/automation/learn/powershell-runbook-managed-identity) + - Updated: 2025-06-27T22:11:00.000Z → 2026-04-20T11:11:00.000Z +- [Runbook execution overview](https://learn.microsoft.com/en-us/azure/automation/automation-runbook-execution) + - Updated: 2025-11-17T08:00:00.000Z → 2026-04-15T08:00:00.000Z - [Management of Azure Automation data](https://learn.microsoft.com/en-us/azure/automation/automation-managing-data) - - Updated: 2025-11-17T08:00:00.000Z → 2026-03-05T08:00:00.000Z + - Updated: 2026-03-05T08:00:00.000Z → 2026-04-24T08:00:00.000Z +- [Using user-assigned managed identity](https://learn.microsoft.com/en-us/azure/automation/add-user-assigned-identity) + - Updated: 2025-11-17T08:00:00.000Z → 2026-04-14T08:00:00.000Z +- [Disable system-assigned managed identity](https://learn.microsoft.com/en-us/azure/automation/disable-managed-identity-for-automation) + - Updated: 2025-11-17T08:00:00.000Z → 2026-04-20T11:11:00.000Z +- [Move Automation account to another subscription](https://learn.microsoft.com/en-us/azure/automation/how-to/move-account) + - Updated: 2025-11-17T08:00:00.000Z → 2026-04-20T11:11:00.000Z +- *...and 10 more* ## Classified Pages @@ -94,30 +122,28 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Logic A | [Automation limits and quotas](https://learn.microsoft.com/en-us/azure/automation/automation-subscription-limits-faq) | limits-quotas | 0.95 | Explicitly described as providing default quotas and limits for Automation resources, including numeric values and constraints. 
| | [Best practices for security in Azure Automation](https://learn.microsoft.com/en-us/azure/automation/automation-security-guidelines) | security | 0.95 | Explicitly a security best-practices article with detailed guidance on securing Automation accounts, Hybrid workers, identities, and networks. | | [Automation network configuration details](https://learn.microsoft.com/en-us/azure/automation/automation-network-configuration) | configuration | 0.90 | Provides detailed network information (endpoints, ports, URLs) required by Automation features, which is product-specific configuration. | -| [Manage automation limits and quotas](https://learn.microsoft.com/en-us/azure/automation/automation-limits-quotas) | limits-quotas | 0.90 | Explicitly about Automation subscription limits and quotas and how to request changes; expected to list concrete numeric limits and quota behaviors. | +| [Manage automation limits and quotas](https://learn.microsoft.com/en-us/azure/automation/automation-limits-quotas) | limits-quotas | 0.90 | A dedicated limits/quotas article; almost certainly contains specific numeric limits and quota-increase procedures that are not generally known from training. | +| [Troubleshoot Agent-based Hybrid Runbook Worker Issues](https://learn.microsoft.com/en-us/azure/automation/troubleshoot/hybrid-runbook-worker) | troubleshooting | 0.90 | Similar to index 10 but for agent-based workers; focuses on diagnosing and resolving concrete issues with agent-based Hybrid Runbook Workers, including specific errors and log locations, which is expert troubleshooting knowledge. 
| +| [Troubleshoot Extension-based Hybrid Runbook Worker Issues](https://learn.microsoft.com/en-us/azure/automation/troubleshoot/extension-based-hybrid-runbook-worker) | troubleshooting | 0.90 | Troubleshooting guide for extension-based Hybrid Runbook Workers will document specific failure modes, logs, error codes, and remediation steps unique to this worker type, fitting the troubleshooting category strongly. | +| [Troubleshoot Managed Identity Issues](https://learn.microsoft.com/en-us/azure/automation/troubleshoot/managed-identity) | troubleshooting | 0.90 | Explicit troubleshooting article for managed identity issues; will contain specific error messages/codes, causes, and resolution steps (e.g., RBAC assignments, identity states) that map symptoms to solutions, which is classic troubleshooting expert knowledge. | +| [Troubleshoot Runbook Issues](https://learn.microsoft.com/en-us/azure/automation/troubleshoot/runbooks) | troubleshooting | 0.90 | Runbook troubleshooting article is explicitly about issues and resolutions; it will include specific error messages, job status meanings, and corrective actions, which are product-specific troubleshooting patterns. | | [Using system-assigned managed identity](https://learn.microsoft.com/en-us/azure/automation/enable-managed-identity-for-automation) | security | 0.90 | Shows how to configure system-assigned managed identity and use it to access resources, a core security/identity configuration. | -| [Using user-assigned managed identity](https://learn.microsoft.com/en-us/azure/automation/add-user-assigned-identity) | security | 0.90 | Details adding and using user-assigned managed identities, including constraints with Hybrid Runbook Worker, which is product-specific security behavior. 
| +| [Troubleshoot Shared Resources Issues](https://learn.microsoft.com/en-us/azure/automation/troubleshoot/shared-resources) | troubleshooting | 0.88 | Troubleshooting guide for shared resources will enumerate concrete issues, error conditions, and their resolutions (e.g., module import failures, credential asset issues), providing symptom→cause→solution mappings specific to Azure Automation. | +| [Configure runbook output](https://learn.microsoft.com/en-us/azure/automation/automation-runbook-output-and-messages) | best-practices | 0.85 | Describes how each PowerShell stream behaves in Azure Automation plus recommended patterns for error handling and output; these are concrete, product-specific DO/DON'T guidelines. | | [Encryption of secure assets](https://learn.microsoft.com/en-us/azure/automation/automation-secure-asset-encryption) | security | 0.85 | Details encryption models and configuration for credentials, certificates, and scripts, which is product-specific security configuration. | | [Manage DNS records used by Automation](https://learn.microsoft.com/en-us/azure/automation/how-to/automation-region-dns-records) | configuration | 0.85 | Provides concrete DNS record lists per region and feature for Automation accounts behind firewalls; highly specific configuration data not inferable from general knowledge. | -| [Troubleshoot Agent-based Hybrid Runbook Worker Issues](https://learn.microsoft.com/en-us/azure/automation/troubleshoot/hybrid-runbook-worker) | troubleshooting | 0.85 | Agent-based worker troubleshooting; expected to list agent logs, error codes, and configuration fixes unique to Hybrid Runbook Workers. | -| [Troubleshoot Extension-based Hybrid Runbook Worker Issues](https://learn.microsoft.com/en-us/azure/automation/troubleshoot/extension-based-hybrid-runbook-worker) | troubleshooting | 0.85 | Focused on extension-based workers; likely includes extension status codes, logs, and resolution steps specific to this worker type. 
| -| [Troubleshoot Managed Identity Issues](https://learn.microsoft.com/en-us/azure/automation/troubleshoot/managed-identity) | troubleshooting | 0.85 | Explicit troubleshooting guide; expected to list specific error messages, causes, and resolutions for Automation managed identities. | -| [Troubleshoot Runbook Issues](https://learn.microsoft.com/en-us/azure/automation/troubleshoot/runbooks) | troubleshooting | 0.85 | Runbook troubleshooting article; expected to contain specific error messages, timeouts, and diagnostic steps unique to Automation runbooks. | +| [Troubleshoot State Configuration Issues](https://learn.microsoft.com/en-us/azure/automation/troubleshoot/desired-state-configuration) | troubleshooting | 0.85 | Troubleshooting article for State Configuration will map specific compliance, reporting, or configuration application issues to causes and fixes, including error codes and diagnostic steps, which is specialized troubleshooting content. | | [Configure runbook input parameters](https://learn.microsoft.com/en-us/azure/automation/runbook-input-parameters) | configuration | 0.80 | Details parameter types, behaviors, and configuration for different runbook types; specific to Automation’s parameter handling. | -| [Configure runbook output](https://learn.microsoft.com/en-us/azure/automation/automation-runbook-output-and-messages) | best-practices | 0.80 | Explains how Automation handles different PowerShell streams and prescribes best practices for error handling and output; product-specific behavior and guidance. | | [Create modular runbooks](https://learn.microsoft.com/en-us/azure/automation/automation-child-runbooks) | best-practices | 0.80 | Provides a comparison table of inline vs cmdlet-based child runbook calls and guidance on when to use each; clear product-specific best practices. 
| -| [Deploy extension-based worker](https://learn.microsoft.com/en-us/azure/automation/extension-based-hybrid-runbook-worker-install) | deployment | 0.80 | Product-specific deployment method (extension-based) with steps and constraints for Windows/Linux machines and worker groups; distinct deployment pattern. | -| [Disable system-assigned managed identity](https://learn.microsoft.com/en-us/azure/automation/disable-managed-identity-for-automation) | security | 0.80 | Explains disabling system-assigned managed identity via portal or REST, which is identity/security configuration specific to Automation. | | [Disaster recovery for Automation accounts](https://learn.microsoft.com/en-us/azure/automation/automation-disaster-recovery) | deployment | 0.80 | Provides DR strategy for Automation accounts and dependent resources, including product-specific deployment and recovery patterns. | | [Handle errors in graphical runbooks](https://learn.microsoft.com/en-us/azure/automation/automation-runbook-graphical-error-handling) | best-practices | 0.80 | Provides concrete patterns for handling success, expected, and unexpected errors in graphical runbooks; product-specific error-handling design guidance. | | [Manage role permissions and security](https://learn.microsoft.com/en-us/azure/automation/automation-role-based-access-control) | security | 0.80 | Describes Automation-specific RBAC usage; likely lists roles, scopes, and permission mappings for Automation resources. | | [Security controls by Azure Policy](https://learn.microsoft.com/en-us/azure/automation/security-controls-policy) | security | 0.80 | Lists specific built-in policy definitions and controls for Automation, which are product-specific security/compliance configurations. 
| -| [Start a runbook from a webhook](https://learn.microsoft.com/en-us/azure/automation/automation-webhooks) | integrations | 0.80 | Describes webhook URL, headers, payload, and TLS requirements for starting runbooks from services like GitHub and Azure DevOps; integration-focused with concrete parameters. | | [Supported regions](https://learn.microsoft.com/en-us/azure/azure-change-tracking-inventory/change-tracking-inventory-support-matrix) | limits-quotas | 0.80 | Explicitly a support matrix and limitations page; likely includes region support tables, OS/version support, and specific constraints that qualify as limits/quotas. | -| [Troubleshoot Shared Resources Issues](https://learn.microsoft.com/en-us/azure/automation/troubleshoot/shared-resources) | troubleshooting | 0.80 | Troubleshooting shared resources; likely includes symptom → cause → fix mappings and possibly error codes unique to Automation shared resources. | -| [Troubleshoot State Configuration Issues](https://learn.microsoft.com/en-us/azure/automation/troubleshoot/desired-state-configuration) | troubleshooting | 0.80 | Troubleshooting State Configuration; likely includes DSC error codes, compliance issues, and remediation steps specific to Azure Automation DSC. | +| [Using user-assigned managed identity](https://learn.microsoft.com/en-us/azure/automation/add-user-assigned-identity) | security | 0.80 | Describes setting up and using a user-assigned managed identity for Automation accounts, including a specific limitation with Hybrid Runbook Worker; this is product-specific identity/security configuration and behavior. 
| +| [Enable State Configuration](https://learn.microsoft.com/en-us/azure/automation/automation-dsc-onboarding) | configuration | 0.78 | Onboarding/enablement article for State Configuration will include exact onboarding commands, registration parameters, and configuration options for machines, which are product-specific configuration details (e.g., registration keys, URIs, flags) that qualify as expert configuration knowledge. | | [Connect privately to Automation account](https://learn.microsoft.com/en-us/azure/automation/how-to/private-link-security) | security | 0.75 | Details using Private Link/private endpoints for Automation, including network/security configuration specifics unique to this service. | -| [Deploy agent-based Windows worker](https://learn.microsoft.com/en-us/azure/automation/automation-windows-hrw-install) | deployment | 0.75 | Details deployment of agent-based Hybrid Runbook Workers on Windows, including requirements and constraints; product-specific deployment configuration. | +| [Deploy extension-based worker](https://learn.microsoft.com/en-us/azure/automation/extension-based-hybrid-runbook-worker-install) | deployment | 0.75 | Covers deploying extension-based user Hybrid Runbook Workers with platform-specific requirements and steps; this is a product-specific deployment pattern for on-premises/other clouds. | | [Forward Azure Automation diagnostic logs to Azure Monitor](https://learn.microsoft.com/en-us/azure/automation/automation-manage-send-joblogs-log-analytics) | integrations | 0.75 | Describes configuration to send job status and streams to Log Analytics workspaces; includes workspace and diagnostic settings specific to Automation–Monitor integration. | | [Manage Python 3 packages](https://learn.microsoft.com/en-us/azure/automation/python-3-packages) | configuration | 0.75 | Details Python 3 package handling, including version support notes and requirements for Hybrid Runbook Workers; product-specific configuration nuances. 
| | [Manage certificates](https://learn.microsoft.com/en-us/azure/automation/shared-resources/certificates) | security | 0.75 | Details secure storage/encryption of certificate assets and how to access them via specific cmdlets; product-specific security configuration patterns. | @@ -126,35 +152,37 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Logic A | [At scale using Azure Policy](https://learn.microsoft.com/en-us/azure/azure-change-tracking-inventory/enable-change-tracking-at-scale-policy) | configuration | 0.70 | Uses Azure Policy for scale enablement; likely includes policy definitions, parameters, and assignment settings specific to this scenario. | | [Automation account authentication overview](https://learn.microsoft.com/en-us/azure/automation/automation-security-overview) | security | 0.70 | Authentication overview for Automation accounts with supported scenarios and permission scopes is product-specific security guidance. | | [Azure Policy built-ins](https://learn.microsoft.com/en-us/azure/automation/policy-reference) | configuration | 0.70 | Index of built-in policy definitions; includes policy names, versions, and links to definitions, which are concrete configuration artifacts for managing Automation resources. | -| [Compile DSC configurations](https://learn.microsoft.com/en-us/azure/automation/automation-dsc-compile) | configuration | 0.70 | Covers compilation process; likely includes specific compilation options, job parameters, and Automation behaviors not generally known. | +| [Compile DSC configurations](https://learn.microsoft.com/en-us/azure/automation/automation-dsc-compile) | configuration | 0.70 | Compilation article is likely to document specific compilation options, configuration properties, and possibly limits around compilation jobs, providing detailed configuration parameters and behaviors unique to Azure Automation State Configuration. 
| | [Compose DSC configurations](https://learn.microsoft.com/en-us/azure/automation/compose-configurationwithcompositeresources) | configuration | 0.70 | Explains composing configurations; likely includes DSC syntax, resource structure, and Automation-specific handling of composite resources. | | [Configure authentication with Amazon Web Services](https://learn.microsoft.com/en-us/azure/automation/automation-config-aws-account) | integrations | 0.70 | Product-specific integration between Azure Automation and AWS; likely includes connection/credential parameters and patterns unique to this cross-cloud scenario. | | [Configure authentication with Microsoft Entra ID](https://learn.microsoft.com/en-us/azure/automation/automation-use-azure-ad) | security | 0.70 | Covers using Entra ID as authentication provider for Azure Automation; likely includes specific app registrations, scopes, and auth configuration details. | -| [Context switching in Azure Automation](https://learn.microsoft.com/en-us/azure/automation/context-switching) | best-practices | 0.70 | Explains Azure context behavior and prescribes specific patterns (e.g., explicit subscription selection) to avoid subtle runbook bugs; product-specific gotchas and recommendations. | +| [Configure servers to a desired state and manage drift](https://learn.microsoft.com/en-us/azure/automation/tutorial-configure-servers-desired-state) | configuration | 0.70 | Tutorial for configuring machines to a desired state using Azure Automation State Configuration will include concrete configuration steps, parameter names, and settings for assigning and monitoring configurations, which are product-specific configuration patterns rather than just conceptual guidance. 
| +| [Context switching in Azure Automation](https://learn.microsoft.com/en-us/azure/automation/context-switching) | best-practices | 0.70 | Explains Azure context behavior and how to explicitly set subscription/account in runbooks; contains product-specific PowerShell usage patterns and gotchas unique to Azure Automation. | | [Create Automation account - Resource Manager template](https://learn.microsoft.com/en-us/azure/automation/quickstart-create-automation-account-template) | configuration | 0.70 | ARM template article includes JSON parameters and default values, which are explicit configuration options for Automation accounts. | -| [Create PowerShell runbook using managed identity](https://learn.microsoft.com/en-us/azure/automation/learn/powershell-runbook-managed-identity) | security | 0.70 | Shows how a runbook uses a managed identity to access Entra-protected resources, including product-specific auth patterns and permissions. | | [Data to Collect when Opening a Case for Microsoft Azure Automation](https://learn.microsoft.com/en-us/azure/automation/troubleshoot/collect-data-microsoft-azure-automation-case) | troubleshooting | 0.70 | Describes what logs and data to gather; likely includes specific cmdlets, log locations, and configuration exports unique to Automation support. | | [Deploy AWS VM with Automation runbook](https://learn.microsoft.com/en-us/azure/automation/automation-scenario-aws-deployment) | integrations | 0.70 | Automates AWS VM creation; likely includes AWS PowerShell/SDK parameters, tagging patterns, and cross-cloud configuration details unique to this scenario. | | [Deploy Resource Manager template with runbook](https://learn.microsoft.com/en-us/azure/automation/automation-deploy-template-runbook) | integrations | 0.70 | Describes deploying ARM templates stored in Azure Storage; likely includes storage access patterns, template deployment cmdlets, and parameter handling specific to Automation. 
| -| [Enable State Configuration](https://learn.microsoft.com/en-us/azure/automation/automation-dsc-onboarding) | configuration | 0.70 | Describes enabling machines for management; likely includes registration commands, configuration modes, and endpoint settings unique to this feature. | +| [Deploy agent-based Windows worker](https://learn.microsoft.com/en-us/azure/automation/automation-windows-hrw-install) | deployment | 0.70 | Details how to deploy the agent-based Hybrid Runbook Worker role on Windows machines with Automation-specific requirements; this is a deployment-focused article for a particular worker type. | +| [Disable system-assigned managed identity](https://learn.microsoft.com/en-us/azure/automation/disable-managed-identity-for-automation) | security | 0.70 | Explains how to disable a system-assigned managed identity for an Automation account via portal and REST; involves product-specific identity configuration steps and parameters. | | [Enable managed identities - Azure portal](https://learn.microsoft.com/en-us/azure/automation/quickstarts/enable-managed-identity) | security | 0.70 | Covers enabling managed identities for Automation accounts, which involves specific identity/security configuration steps and parameters. | | [Integrate with Azure Monitor Logs](https://learn.microsoft.com/en-us/azure/automation/automation-dsc-diagnostics) | configuration | 0.70 | Integration with Azure Monitor Logs; likely includes workspace settings, diagnostic configuration, and table names unique to this integration. | | [Manage Python 2 packages](https://learn.microsoft.com/en-us/azure/automation/python-packages) | configuration | 0.70 | Product-specific instructions for handling Python 2 packages in sandbox and Hybrid Runbook Workers, including any required paths or environment behavior. 
| | [Manage connections](https://learn.microsoft.com/en-us/azure/automation/automation-connections) | configuration | 0.70 | Describes Automation connection asset schema and usage; likely includes property names and constraints for external service connections. | | [Manage databases in Azure SQL database using Azure Automation](https://learn.microsoft.com/en-us/azure/automation/manage-sql-server-in-automation) | integrations | 0.70 | Describes connecting Automation to Azure SQL via system-assigned identity and Az PowerShell cmdlets, a product-specific integration pattern. | -| [Manage modules in Azure Automation](https://learn.microsoft.com/en-us/azure/automation/shared-resources/modules) | decision-making | 0.70 | Contains deprecation timelines and guidance on moving from AzureRM to Az modules; supports decisions about module/runtime strategy and migration timing. | +| [Manage modules in Azure Automation](https://learn.microsoft.com/en-us/azure/automation/shared-resources/modules) | best-practices | 0.70 | Includes deprecation timelines, module usage guidance, and migration specifics (AzureRM to Az) that are product- and date-specific; these are concrete recommendations and edge cases for this service. | | [Manage runbooks](https://learn.microsoft.com/en-us/azure/automation/manage-runbooks) | best-practices | 0.70 | Includes recommended patterns and best practices for runbook design specific to Azure Automation, beyond generic scripting advice. | | [Manage schedules](https://learn.microsoft.com/en-us/azure/automation/shared-resources/schedules) | configuration | 0.70 | Details supported schedule types (once, hourly, daily, weekly, monthly, specific days) and how they behave; concrete configuration options for scheduling. 
| | [Manage variables](https://learn.microsoft.com/en-us/azure/automation/shared-resources/variables) | configuration | 0.70 | Describes Automation variable asset behavior and management via portal/PowerShell/runbooks; includes specific patterns for sharing values across jobs and runbooks. | -| [Management of Azure Automation data](https://learn.microsoft.com/en-us/azure/automation/automation-managing-data) | security | 0.70 | Article is specifically about how Azure Automation protects and secures data. Such pages typically include product-specific security mechanisms, data handling behaviors, and possibly RBAC or identity details unique to the service, which qualify as expert security configuration knowledge. | +| [Management of Azure Automation data](https://learn.microsoft.com/en-us/azure/automation/automation-managing-data) | security | 0.70 | Focused on how Azure Automation protects and secures data; likely includes product-specific security mechanisms, data handling, and possibly identity/permission details beyond generic security concepts. | +| [Migrate existing Agent-based Hybrid Workers to Extension-based Workers](https://learn.microsoft.com/en-us/azure/automation/migrate-existing-agent-based-hybrid-worker-to-extension-based-workers) | decision-making | 0.70 | Provides migration guidance between agent-based and extension-based workers, including retirement dates and benefits; helps decide migration approach and path with product-specific constraints. | | [Migrate from Orchestrator to Azure Automation (Beta)](https://learn.microsoft.com/en-us/azure/automation/automation-orchestrator-migration) | decision-making | 0.70 | Migration guidance from Orchestrator to Automation, including how to convert integration packs and runbooks; supports technology selection and migration decisions. 
| | [Runtime environment scenarios](https://learn.microsoft.com/en-us/azure/automation/manage-runtime-environment) | configuration | 0.70 | Covers creation and management of runtime environments via portal and REST; likely includes environment properties and constraints specific to Automation. | | [Send an email from a runbook](https://learn.microsoft.com/en-us/azure/automation/automation-send-email) | integrations | 0.70 | Shows how to integrate SendGrid with Automation runbooks; likely includes API keys, connection parameters, and PowerShell patterns specific to this integration. | | [Start a runbook](https://learn.microsoft.com/en-us/azure/automation/start-runbooks) | configuration | 0.70 | Includes a comparison table of start methods and lifecycle behavior; product-specific configuration of how runbooks are triggered. | -| [Update Azure PowerShell modules](https://learn.microsoft.com/en-us/azure/automation/automation-update-azure-modules) | configuration | 0.70 | Explains how default/global modules are updated and managed, including runtime environment behavior; product-specific module configuration details. | +| [Start a runbook from a webhook](https://learn.microsoft.com/en-us/azure/automation/automation-webhooks) | integrations | 0.70 | Webhook article for Automation typically includes request schema, required headers, URL format, and TLS requirements—product-specific integration parameters and constraints. | +| [Update Azure PowerShell modules](https://learn.microsoft.com/en-us/azure/automation/automation-update-azure-modules) | configuration | 0.70 | Describes how default/global modules are managed, updated, and constrained in Automation, including runtime environment behavior; this is product-specific configuration knowledge. 
| | [Use source control integration](https://learn.microsoft.com/en-us/azure/automation/source-control-integration) | configuration | 0.70 | Covers one-way sync setup with GitHub/Azure DevOps; likely includes specific configuration fields, connection settings, and schedules unique to Automation source control. | -| [Work with State Configuration extension version history](https://learn.microsoft.com/en-us/azure/automation/automation-dsc-extension-history) | configuration | 0.70 | Version history pages typically list extension versions with specific behaviors, configuration changes, and sometimes parameter defaults unique to the product, which qualify as expert configuration knowledge not inferable from general training. | -| [Migrate existing Agent-based Hybrid Workers to Extension-based Workers](https://learn.microsoft.com/en-us/azure/automation/migrate-existing-agent-based-hybrid-worker-to-extension-based-workers) | deployment | 0.68 | The article gives concrete, product-specific migration guidance from agent-based to extension-based hybrid workers, including sequencing, prerequisites, and platform-specific steps that affect how production workers are deployed and operated. This is migration/deployment-focused expert knowledge rather than a generic overview, but it does not center on limits, security roles, or configuration parameter tables. | +| [Work with State Configuration extension version history](https://learn.microsoft.com/en-us/azure/automation/automation-dsc-extension-history) | configuration | 0.70 | Version history for the DSC extension typically lists specific extension versions, behaviors, and changes that affect how you configure and manage the extension; this is detailed, product-specific configuration/versioning knowledge not captured by generic training. 
| | [At scale using Azure portal - Machines blade (New)](https://learn.microsoft.com/en-us/azure/azure-change-tracking-inventory/enable-change-tracking-at-scale-machines-blade) | configuration | 0.65 | Portal-based scale enablement; likely includes specific configuration options, data sources, and parameter values for enabling the feature on many VMs. | | [Configure data at scale](https://learn.microsoft.com/en-us/azure/automation/automation-dsc-config-data-at-scale) | configuration | 0.65 | Focuses on configuring data at scale for State Configuration, likely including specific configuration patterns and structures. | | [Configure data based on STIG](https://learn.microsoft.com/en-us/azure/automation/automation-dsc-configuration-based-on-stig) | configuration | 0.65 | STIG-based configuration data for State Configuration implies detailed configuration structures and parameters unique to this feature. | @@ -164,6 +192,7 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Logic A | [Deploy agent-based Linux worker](https://learn.microsoft.com/en-us/azure/automation/automation-linux-hrw-install) | deployment | 0.65 | Step-by-step installation for agent-based Linux Hybrid Runbook Worker with product-specific commands/requirements; contains concrete deployment details beyond generic knowledge. | | [Edit Graphical runbooks](https://learn.microsoft.com/en-us/azure/automation/automation-graphical-authoring-intro) | configuration | 0.65 | Explains how graphical runbooks generate PowerShell code and how to configure them; product-specific authoring behavior and options. | | [Edit textual runbooks](https://learn.microsoft.com/en-us/azure/automation/automation-edit-textual-runbook) | configuration | 0.65 | Describes editor features like inserting cmdlets/assets/child runbooks and how they map to Automation resources; product-specific authoring configuration. 
| +| [Get started with State Configuration](https://learn.microsoft.com/en-us/azure/automation/automation-dsc-getting-started) | configuration | 0.65 | Getting-started article for State Configuration typically walks through concrete steps and settings (registering nodes, assigning configurations, checking compliance) with specific property names and options, which constitutes product-specific configuration knowledge beyond generic DSC concepts. | | [Manage Office 365 services](https://learn.microsoft.com/en-us/azure/automation/manage-office-365) | integrations | 0.65 | Covers Automation integration with Office 365 via Entra ID; likely includes connection setup, permissions, and cmdlet usage specific to this integration. | | [Migrate Run As account to managed identity](https://learn.microsoft.com/en-us/azure/automation/migrate-run-as-accounts-managed-identity) | decision-making | 0.65 | Migration-focused article with cadence, support timelines, and concrete guidance on when/how to move from Run As accounts to managed identities; fits decision-making around authentication model migration. | | [Migration from Log Analytics to Azure Monitoring Agent version](https://learn.microsoft.com/en-us/azure/automation/change-tracking/guidance-migration-log-analytics-monitoring-agent) | decision-making | 0.65 | Migration guidance between LA and AMA; likely includes comparison of behaviors, supported scenarios, and recommended migration paths, aiding technology selection and transition. | @@ -173,41 +202,40 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure Logic A | [Trigger runbook from Azure alert](https://learn.microsoft.com/en-us/azure/automation/automation-create-alert-triggered-runbook) | configuration | 0.65 | Shows wiring alerts to runbooks via action groups; likely includes specific action group settings and runbook linkage parameters unique to this integration. 
| | [Use Azure Policy to enforce job execution](https://learn.microsoft.com/en-us/azure/automation/enforce-job-execution-hybrid-worker) | configuration | 0.65 | Uses a custom Azure Policy definition to force jobs onto Hybrid Runbook Workers; likely includes specific policy JSON, parameters, and settings unique to this feature. | | [Work with the Graphical runbook SDK](https://learn.microsoft.com/en-us/azure/automation/graphical-runbook-sdk) | integrations | 0.65 | SDK-focused article for programmatically creating/editing graphical runbooks; likely includes API/SDK parameters and structures unique to this product. | +| [Run runbooks on Hybrid Runbook Worker](https://learn.microsoft.com/en-us/azure/automation/automation-hrw-run-runbooks) | best-practices | 0.62 | How-to article for running runbooks on Hybrid Runbook Worker machines is likely to include product-specific guidance (where to run what, worker group behavior, gotchas around local vs cloud resources, and configuration nuances) that goes beyond generic concepts. This aligns best with product-specific best practices rather than pure configuration tables. | | [Automation Runtime environment](https://learn.microsoft.com/en-us/azure/automation/runtime-environment-overview) | configuration | 0.60 | Runtime environment overview likely details supported versions and environment settings, which are configuration-specific. | | [Automation runbook types](https://learn.microsoft.com/en-us/azure/automation/automation-runbook-types) | decision-making | 0.60 | Compares runbook types and considerations for choosing among them, which is product-specific decision guidance. | | [Availability zones](https://learn.microsoft.com/en-us/azure/automation/automation-availability-zones) | deployment | 0.60 | Describes availability zone support and regional behavior for Automation, relevant to deployment and high-availability planning. 
| | [Azure Automation extension for Visual Studio Code](https://learn.microsoft.com/en-us/azure/automation/automation-runbook-authoring) | configuration | 0.60 | Describes VS Code extension capabilities and operations for runbook management, including product-specific configuration of tooling. | -| [Get started with State Configuration](https://learn.microsoft.com/en-us/azure/automation/automation-dsc-getting-started) | configuration | 0.60 | Getting-started guide for State Configuration typically includes concrete steps, parameters, and configuration examples specific to this service. | | [Hybrid Runbook Worker overview](https://learn.microsoft.com/en-us/azure/automation/automation-hybrid-runbook-worker) | configuration | 0.60 | Hybrid Runbook Worker overview includes installation and environment-specific configuration details unique to this feature. | | [Remediate noncompliant State Configuration servers](https://learn.microsoft.com/en-us/azure/automation/automation-dsc-remediate) | configuration | 0.60 | Describes on-demand reapplication of configurations; likely includes commands and settings for triggering remediation unique to Automation DSC. | -| [Runbook execution overview](https://learn.microsoft.com/en-us/azure/automation/automation-runbook-execution) | best-practices | 0.60 | Discusses runbook execution behavior and restart semantics, requiring product-specific guidance on how to author runbooks to handle interruptions. | | [Track updated files with watcher task](https://learn.microsoft.com/en-us/azure/automation/automation-scenario-using-watcher-task) | configuration | 0.60 | Watcher task scenario; likely includes watcher configuration intervals, runbook bindings, and limitations specific to Azure Automation watcher tasks. 
| ## Unclassified Pages | TOC Title | Confidence | Reason | |-----------|------------|--------| -| [Use existing runbooks and modules](https://learn.microsoft.com/en-us/azure/automation/automation-runbook-gallery) | 0.50 | Describes using Runbook Gallery and PowerShell Gallery; largely discovery and high-level usage, not deep configuration parameters or limits. | | [Learn PowerShell Workflow](https://learn.microsoft.com/en-us/azure/automation/automation-powershell-workflow) | 0.45 | Conceptual explanation of PowerShell Workflow vs PowerShell for Automation runbooks; more educational than product-specific configuration or troubleshooting. | | [Manage Automation account](https://learn.microsoft.com/en-us/azure/automation/delete-account) | 0.45 | Describes deleting/restoring Automation accounts and related resources; mostly procedural without detailed configuration tables or error-code-based troubleshooting. | -| [Move Automation account to another subscription](https://learn.microsoft.com/en-us/azure/automation/how-to/move-account) | 0.45 | How-to for moving Automation accounts between subscriptions/resource groups; reuses generic Azure move-resource process without deep product-specific limits or configs. | | [Delete Run As account](https://learn.microsoft.com/en-us/azure/automation/delete-run-as-account) | 0.40 | Describes retirement and deletion of Run As accounts and migration to managed identities; primarily lifecycle and portal/PowerShell steps without deep config parameters or error mappings. | | [Remove user-assigned managed identity](https://learn.microsoft.com/en-us/azure/automation/remove-user-assigned-identity) | 0.40 | Procedural how-to for removing a user-assigned managed identity via portal/PowerShell/ARM; no detailed configuration tables, limits, or product-specific edge cases. 
| | [Run Azure CLI commands in PowerShell 7.4 runbooks](https://learn.microsoft.com/en-us/azure/automation/quickstart-cli-support-powershell-runbook-runtime-environment) | 0.40 | Shows that Azure CLI 2.64.0 is default in PowerShell 7.4 runtime, but no broader config matrices or constraints; mostly a how-to. | -| [Run runbooks on Hybrid Runbook Worker](https://learn.microsoft.com/en-us/azure/automation/automation-hrw-run-runbooks) | 0.40 | Describes how to run runbooks on Hybrid Runbook Worker but appears more conceptual/usage-focused without clear evidence of detailed error codes, limits, or config tables. | +| [Runbook execution overview](https://learn.microsoft.com/en-us/azure/automation/automation-runbook-execution) | 0.40 | Overview of runbook execution behavior; mostly conceptual description of jobs and restarts without specific numeric limits, configuration parameters, or troubleshooting codes. | | [Update runbook from PowerShell 7.1 to PowerShell 7.4](https://learn.microsoft.com/en-us/azure/automation/quickstart-update-runbook-in-runtime-environment) | 0.40 | Upgrade runbook runtime version tutorial; summary doesn’t show detailed config tables or limits, more of a procedural guide. | +| [Use existing runbooks and modules](https://learn.microsoft.com/en-us/azure/automation/automation-runbook-gallery) | 0.40 | Focuses on using gallery content and sharing scenarios; likely a usage/tutorial article without detailed config tables, limits, or troubleshooting mappings. | | [Create Automation account - Azure portal](https://learn.microsoft.com/en-us/azure/automation/automation-create-standalone-account) | 0.35 | How-to for creating a standalone Automation account; mostly portal steps without detailed configuration matrices. 
| | [Enable Desired State Configuration for a machine](https://learn.microsoft.com/en-us/azure/automation/quickstarts/dsc-configuration) | 0.35 | Quickstart for configuring a VM with DSC; likely procedural without detailed config option tables or limits in the summary. | | [Install Hybrid Worker extension - Azure portal](https://learn.microsoft.com/en-us/azure/automation/quickstarts/install-hybrid-worker-extension) | 0.35 | Quickstart for installing Hybrid Worker extension; appears as a basic tutorial without detailed configuration parameter tables. | | [About Change tracking and inventory](https://learn.microsoft.com/en-us/azure/azure-change-tracking-inventory/overview-monitoring-agent) | 0.30 | Overview of Change Tracking and Inventory with AMA; summary suggests high-level features and benefits without detailed configuration tables or limits. | -| [Configure servers to a desired state and manage drift](https://learn.microsoft.com/en-us/azure/automation/tutorial-configure-servers-desired-state) | 0.30 | Tutorial-style guidance for configuring machines to a desired state; likely step-by-step instructions without structured limits, configuration parameter tables, or troubleshooting mappings. | +| [Create PowerShell runbook using managed identity](https://learn.microsoft.com/en-us/azure/automation/learn/powershell-runbook-managed-identity) | 0.30 | Tutorial on creating a PowerShell runbook with managed identity; step-by-step usage but no detailed configuration tables, RBAC role lists, or product-specific error mappings. | | [Create a PowerShell Workflow runbook](https://learn.microsoft.com/en-us/azure/automation/learn/automation-tutorial-runbook-textual) | 0.30 | Tutorial on creating a PowerShell Workflow runbook; mostly authoring steps, not configuration matrices or limits. 
| | [Create a Python 3 runbook](https://learn.microsoft.com/en-us/azure/automation/learn/automation-tutorial-runbook-textual-python-3) | 0.30 | Tutorial for Python runbook creation; basic how-to without expert-level configuration or limits. | | [FAQ](https://learn.microsoft.com/en-us/azure/automation/automation-faq) | 0.30 | FAQ description is generic; no indication of specific error codes, limits, or config parameters in the summary. | +| [Move Automation account to another subscription](https://learn.microsoft.com/en-us/azure/automation/how-to/move-account) | 0.30 | Primarily a how-to move resource guide; likely step-by-step portal/CLI instructions without detailed limits, config tables, or error-code-based troubleshooting. | | [Use Automation extension for Visual Studio Code](https://learn.microsoft.com/en-us/azure/automation/how-to/runbook-authoring-extension-for-vscode) | 0.30 | Explains using VS Code extension to author runbooks; mostly tooling workflow, not configuration matrices, limits, or product-specific troubleshooting. | | [Create Automation account - Azure portal](https://learn.microsoft.com/en-us/azure/automation/quickstarts/create-azure-automation-account-portal) | 0.20 | Quickstart for creating an Automation account via portal; step-by-step tutorial without config matrices or limits. | -| [Overview](https://learn.microsoft.com/en-us/azure/automation/automation-dsc-overview) | 0.20 | Overview of Azure Automation State Configuration and retirement notice; no detailed limits, configuration tables, error codes, or product-specific decision matrices. | +| [Overview](https://learn.microsoft.com/en-us/azure/automation/automation-dsc-overview) | 0.20 | Overview of Azure Automation State Configuration; described as an overview and retirement notice, likely conceptual and service-level explanation without detailed parameters, limits, or troubleshooting mappings. 
| | [What are the various Automation services in Azure?](https://learn.microsoft.com/en-us/azure/automation/automation-services) | 0.20 | Conceptual comparison of automation services; no detailed decision matrices, limits, or config tables evident from summary. | | [What is Azure Automation?](https://learn.microsoft.com/en-us/azure/automation/overview) | 0.20 | High-level overview of Azure Automation capabilities without concrete limits, configs, or error mappings. | -| [Archive for What's new](https://learn.microsoft.com/en-us/azure/automation/whats-new-archive) | 0.10 | Archive of release notes and high-level change log information; does not focus on concrete limits, configuration parameters, or other structured expert details as defined by the sub-skill types. | +| [Archive for What's new](https://learn.microsoft.com/en-us/azure/automation/whats-new-archive) | 0.10 | Archive of release notes and updates; primarily timeline and marketing/overview style content without concrete limits, configs, or troubleshooting mappings. | | [What's new?](https://learn.microsoft.com/en-us/azure/automation/whats-new) | 0.10 | Monthly 'what's new' changelog/updates page; primarily release notes and high-level feature announcements without structured limits, configuration tables, error mappings, or decision matrices that match the defined expert-knowledge sub-skill types. 
diff --git a/products/azure-backup/azure-backup.csv b/products/azure-backup/azure-backup.csv index 4e4ff284..2aa563f4 100644 --- a/products/azure-backup/azure-backup.csv +++ b/products/azure-backup/azure-backup.csv @@ -31,7 +31,7 @@ https://learn.microsoft.com/en-us/azure/backup/azure-kubernetes-service-cluster- https://learn.microsoft.com/en-us/azure/backup/azure-kubernetes-service-cluster-backup-concept,Prerequisites,Azure Kubernetes Service (AKS) backup using Azure Backup prerequisites - Azure Backup,Meet prerequisites and configure access for AKS backup,This article explains the prerequisites for Azure Kubernetes Service (AKS) backup.,"This article describes the prerequisites for Azure Kubernetes Service (AKS) backup. Azure Backup now allows you to back up AKS clusters (cluster resources and persistent volumes attached to the cluster) using a backup extension, which must be installed in the cluster. Backup vault communicates with the cluster via this Backup Extension to perform backup and restore operations. Based on the least privileged security model, a Backup vault must have Trusted Access enabled to communicate with the AKS ",2025-08-12T08:00:00.000Z,overview,configuration,0.7,True,"Prerequisites for AKS backup include enabling Trusted Access, installing a backup extension, and likely specific permission scopes and settings. 
These are concrete configuration requirements unique to AKS backup.",unchanged https://learn.microsoft.com/en-us/azure/backup/azure-kubernetes-service-cluster-backup-policy,Audit and Enforce backup operations for Azure Kubernetes Service clusters using Azure policy,Audit and enforce backup operations for Azure Kubernetes Service clusters via Azure Backup using Azure Policy - Azure Backup,Enforce AKS backup compliance using Azure Policy,Learn how to use Azure Policy to audit and enforce backup operations for all Azure Kubernetes Service clusters created in a given scope,"This article describes how Azure Backup uses built-in Azure Policy definitions to automate auditing and enforcement of backup configurations for Azure Kubernetes Service (AKS) clusters, ensuring compliance with organizational data protection standards. As a Backup and Compliance admin, choose the policy that best fits your team's structure and resource organization to manage AKS cluster backups effectively.",2025-09-18T11:11:00.000Z,how-to,security,0.7,True,Uses specific built-in Azure Policy definitions for AKS backup enforcement; policy names and scopes are product-specific security/compliance configuration.,unchanged https://learn.microsoft.com/en-us/azure/backup/azure-kubernetes-service-cluster-backup-support-matrix,Support matrix,Azure Kubernetes Service (AKS) backup support matrix - Azure Backup,Check AKS backup region support and limitations,This article provides a summary of support settings and limitations of Azure Kubernetes Service (AKS) backup.,"You can use Azure Backup to help protect Azure Kubernetes Service (AKS). This article summarizes region availability, supported scenarios, and limitations.",2026-03-17T11:12:00.000Z,reference,limits-quotas,0.86,True,"A support matrix for AKS backup will list region availability, supported scenarios, and explicit limitations in tabular form. 
These are concrete, product-specific constraints and limits that qualify as expert knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/backup/azure-kubernetes-service-cluster-backup-using-cli,Backup,Back up Azure Kubernetes Service (AKS) using Azure CLI - Azure Backup,Configure AKS backups with Azure Backup via CLI,This article explains how to back up Azure Kubernetes Service (AKS) using Azure CLI.,"This article describes how to configure and back up Azure Kubernetes Service (AKS) using Azure CLI. You can also back up AKS using Azure PowerShell. Azure Backup now allows you to back up AKS clusters (cluster resources and persistent volumes attached to the cluster) using a backup extension, which must be installed in the cluster. Backup vault communicates with the cluster via this Backup Extension to perform backup and restore operations.",2026-04-15T22:11:00.000Z,how-to,configuration,0.68,True,"The page is a CLI-focused how-to for configuring AKS backups using Azure Backup and a backup extension. Such articles typically include product-specific CLI commands, required parameters, resource names, and configuration options (for the backup vault, backup policy, backup extension, and AKS cluster integration). These concrete configuration details and parameter values are expert knowledge beyond generic LLM training. It is not primarily about limits, troubleshooting, or deployment matrices, but about how to configure the backup integration and policies.",updated +https://learn.microsoft.com/en-us/azure/backup/azure-kubernetes-service-cluster-backup-using-cli,Backup,Back up Azure Kubernetes Service (AKS) using Azure CLI - Azure Backup,Configure AKS backups with Azure Backup via CLI,This article explains how to back up Azure Kubernetes Service (AKS) using Azure CLI.,"This article describes how to configure and back up Azure Kubernetes Service (AKS) using Azure CLI. You can also back up AKS using Azure PowerShell. 
Azure Backup now allows you to back up AKS clusters (cluster resources and persistent volumes attached to the cluster) using a backup extension, which must be installed in the cluster. Backup vault communicates with the cluster via this Backup Extension to perform backup and restore operations.",2026-04-15T22:11:00.000Z,how-to,configuration,0.68,True,"The page is a CLI-focused how-to for configuring AKS backups using Azure Backup and a backup extension. Such articles typically include product-specific CLI commands, required parameters, resource names, and configuration options (for the backup vault, backup policy, backup extension, and AKS cluster integration). These concrete configuration details and parameter values are expert knowledge beyond generic LLM training. It is not primarily about limits, troubleshooting, or deployment matrices, but about how to configure the backup integration and policies.",unchanged https://learn.microsoft.com/en-us/azure/backup/azure-kubernetes-service-cluster-backup-using-powershell,Backup,Back up Azure Kubernetes Service (AKS) using Azure PowerShell - Azure Backup,,This article explains how to back up Azure Kubernetes Service (AKS) using PowerShell.,"This article describes how to configure and back up Azure Kubernetes Service (AKS) using Azure PowerShell. Azure Backup now allows you to back up AKS clusters (cluster resources and persistent volumes attached to the cluster) using a backup extension, which must be installed in the cluster. 
Backup vault communicates with the cluster via this Backup Extension to perform backup and restore operations.",2025-12-15T08:00:00.000Z,how-to,,0.4,False,"PowerShell-based backup how-to for AKS; mainly command walkthrough without configuration matrices, limits, or security/diagnostic details.",unchanged https://learn.microsoft.com/en-us/azure/backup/azure-kubernetes-service-cluster-manage-backups,Manage,Manage Azure Kubernetes Service (AKS) backups using Azure Backup - Azure Backup,Configure AKS backup extension and trusted access,This article explains how to manage Azure Kubernetes Service (AKS) backups using Azure Backup.,"This article describes how to register resource providers on your subscriptions for using Backup Extension and Trusted Access. Also, it provides you with the Azure CLI commands to manage them. Azure Backup now allows you to back up AKS clusters (cluster resources and persistent volumes attached to the cluster) using a backup extension, which must be installed in the cluster. AKS cluster requires Trusted Access enabled with Backup vault, so that the vault can communicate with the Backup Extension",2026-03-20T08:00:00.000Z,how-to,configuration,0.7,True,"Provides concrete Azure CLI commands and resource provider/Trusted Access setup details specific to enabling and managing AKS backups with Azure Backup, including product-specific configuration steps and parameters rather than just conceptual guidance.",unchanged https://learn.microsoft.com/en-us/azure/backup/azure-kubernetes-service-cluster-restore,Restore,Restore Azure Kubernetes Service (AKS) using Azure Backup. - Azure Backup,Restore AKS clusters and volumes using Azure Backup,This article explains how to restore backed-up Azure Kubernetes Service (AKS) using Azure Backup.,"This article describes how to restore backed-up Azure Kubernetes Service (AKS). You can also restore AKS cluster using Azure PowerShell. 
Azure Backup now allows you to back up AKS clusters (cluster resources and persistent volumes attached to the cluster) using a backup extension, which must be installed in the cluster. Backup vault communicates with the cluster via this Backup Extension to perform backup and restore operations.",2026-03-17T11:12:00.000Z,how-to,configuration,0.7,True,"Restore-focused how-to for AKS with Azure Backup; likely documents restore configuration options, parameters, and constraints specific to AKS backup extension and Backup vault.",unchanged @@ -40,9 +40,9 @@ https://learn.microsoft.com/en-us/azure/backup/azure-kubernetes-service-cluster- https://learn.microsoft.com/en-us/azure/backup/azure-policy-configure-diagnostics,Configure vault diagnostics settings at scale,Configure Vault Diagnostics settings at scale - Azure Backup,Configure Azure Backup vault diagnostics via Azure Policy,Configure Log Analytics Diagnostics settings for all vaults in a given scope using Azure Policy,"This article describes how Azure Backup integrates with Log Analytics for reporting. Each vault requires a diagnostics setting to send data to Log Analytics. To avoid manual setup for every vault, Azure Backup offers a built-in Azure Policy that automatically applies Log Analytics diagnostics settings across subscriptions or resource groups. The following sections explain how to use this policy.",2026-03-18T08:00:00.000Z,how-to,configuration,0.7,True,"Describes configuring Log Analytics diagnostics for Azure Backup vaults using a built-in Azure Policy. 
This typically includes specific policy definitions, parameter names, and required settings for diagnostics, which are product-specific configuration details rather than generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/backup/back-up-azure-database-postgresql-flex-backup-cli,Backup,Back up Azure Database for PostgreSQL - Flexible Server using Azure CLI - Azure Backup,Back up PostgreSQL Flexible Server using Azure CLI,Learn how to back up Azure Database for PostgreSQL - Flexible Server using Azure CLI.,This article describes how to back up Azure Database for PostgreSQL - Flexible Server using Azure CLI.,2026-01-19T18:12:00.000Z,how-to,integrations,0.7,True,CLI article with specific commands and parameters to configure backups for PostgreSQL Flexible Server; product-specific integration details.,unchanged https://learn.microsoft.com/en-us/azure/backup/back-up-azure-database-postgresql-flex-backup-powershell,Backup,Back up Azure Database for PostgreSQL - Flexible Server using Azure PowerShell - Azure Backup,Back up PostgreSQL Flexible Server using PowerShell,Learn how to back up Azure Database for PostgreSQL - Flexible Server using Azure PowerShell.,This article describes how to back up Azure Database for PostgreSQL - Flexible Server using Azure PowerShell. 
Learn more about the supported scenarios and limitations for Azure Database for PostgreSQL - flexible server backup.,2026-01-19T18:12:00.000Z,how-to,integrations,0.7,True,PowerShell article with cmdlets and parameters for Flexible Server backups; includes supported scenarios and limitations specific to this backup type.,unchanged -https://learn.microsoft.com/en-us/azure/backup/back-up-azure-stack-hyperconverged-infrastructure-virtual-machines,Back up Azure Local virtual machines,Back up Azure Local virtual machines with MABS - Azure Backup,,This article contains the procedures to back up and recover virtual machines using Microsoft Azure Backup Server (MABS).,"This article describes how to back up virtual machines running on Azure Local, versions 23H2 and 22H2, using Microsoft Azure Backup Server (MABS).",2025-10-15T05:40:00.000Z,how-to,,0.4,False,Backup of Azure Local VMs with MABS; appears to be a scenario-specific how-to without detailed config matrices or limits.,unchanged +https://learn.microsoft.com/en-us/azure/backup/back-up-azure-stack-hyperconverged-infrastructure-virtual-machines,Back up Azure Local virtual machines,Back up Azure Local virtual machines with MABS - Azure Backup,,This article contains the procedures to back up and recover virtual machines using Microsoft Azure Backup Server (MABS).,This article describes how to back up virtual machines running on Azure Local using Microsoft Azure Backup Server (MABS).,2026-04-24T08:00:00.000Z,how-to,,0.2,False,"Described as a general procedure article for backing up Azure Local VMs with MABS; summary does not indicate specific limits, error codes, configuration tables, or product-specific gotchas. 
Likely a step-by-step tutorial rather than expert reference content.",updated https://learn.microsoft.com/en-us/azure/backup/back-up-file-data,Back up file data with MABS,Back up file data with MABS - Azure Backup,,You can back up file data on server and client computers with MABS.,"This article describes how to use Microsoft Azure Backup Server (MABS) to back up and recover file data on both server and client computers. You learn how to configure protection groups, set up backup schedules, and perform data recovery to ensure your critical files are protected and available when needed.",2025-09-19T17:12:00.000Z,how-to,,0.4,False,"How-to backup guide for files with MABS; summary shows procedural steps but no explicit config tables, limits, or product-specific error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/backup/back-up-hyper-v-virtual-machines-mabs,Back up Hyper-V virtual machines,Back up Hyper-V virtual machines with MABS - Azure Backup,,This article contains the procedures for backing up and recovery of virtual machines using Microsoft Azure Backup Server (MABS).,"This article describes how to back up and restore Hyper-V virtual machines using Microsoft Azure Backup Server (MABS). Note When the trim is performed within the guest OS, the tracking of incremental blocks is reset, resulting in a full backup. The trim within the guest OS releases unused blocks of the virtual disk (VHDX) and optimizes the disk size. However, this reduces the size of the VHDX and changes the SequenceNumber of the tracked incremental blocks, resulting in a full backup size. 
Unles",2025-07-22T08:00:00.000Z,how-to,,0.45,False,"Hyper-V VM backup/restore with MABS; includes a note about trim behavior but overall is a procedural backup/restore guide, not a structured best-practices or config reference.",unchanged +https://learn.microsoft.com/en-us/azure/backup/back-up-hyper-v-virtual-machines-mabs,Back up Hyper-V virtual machines,Back up Hyper-V virtual machines with MABS - Azure Backup,Back up and restore Hyper-V VMs with MABS,This article contains the procedures for backing up and recovery of virtual machines using Microsoft Azure Backup Server (MABS).,"This article describes how to back up and restore Hyper-V virtual machines using Microsoft Azure Backup Server (MABS). Note When the trim is performed within the guest OS, the tracking of incremental blocks is reset, resulting in a full backup. The trim within the guest OS releases unused blocks of the virtual disk (VHDX) and optimizes the disk size. However, this reduces the size of the VHDX and changes the SequenceNumber of the tracked incremental blocks, resulting in a full backup size. Unles",2026-04-24T08:00:00.000Z,how-to,best-practices,0.68,True,"Procedural article with product-specific behavioral details (for example, guest OS trim resetting incremental block tracking and forcing full backups). These are concrete, non-obvious gotchas unique to MABS + Hyper-V backup behavior, fitting best-practices rather than generic how-to.",updated https://learn.microsoft.com/en-us/azure/backup/back-up-managed-disks-tutorial,Tutorial to back up Azure Managed Disks,Tutorial - Back up Azure Managed Disks using Azure Backup - Azure Backup,,"In this tutorial, learn how to back up Azure Managed Disks from the Azure portal.","This tutorial describes how to back up Azure Managed Disk from the Azure portal. Azure Disk Backup is a native, cloud-based backup solution that protects your data in managed disks. 
It's a simple, secure, and cost-effective solution that enables you to configure protection for managed disks. It ensures that you can recover your data in a disaster scenario.",2026-01-29T18:11:00.000Z,tutorial,,0.1,False,"Tutorial for backing up managed disks from the portal; primarily step-by-step guidance without indication of detailed numeric limits, configuration matrices, or error-code-based troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-afs-cli,Backup,Back up Azure Files with Azure CLI - Azure Backup,,Learn how to use Azure CLI to back up Azure Files in the Recovery Services vault,"The Azure CLI provides a command-line experience for managing Azure resources. It's a great tool for building custom automation to use Azure resources. This article details how to back up Azure Files with Azure CLI. You can also perform these steps via Azure PowerShell or the Azure portal. By the end of this tutorial, you'll learn how to perform the operations below with Azure CLI:",2026-02-17T08:00:00.000Z,how-to,,0.4,False,"CLI how-to tutorial for backing up Azure Files; likely step-by-step commands without config matrices, limits, or specialized troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-architecture,Azure Backup architecture,Architecture Overview - Azure Backup,,"Provides an overview of the architecture, components, and processes used by the Azure Backup service.","You can use the Azure Backup service to back up data to the Microsoft Azure cloud platform. 
This article summarizes Azure Backup architecture, components, and processes.",2025-11-18T08:00:00.000Z,overview,,0.1,False,"High-level architecture overview of Azure Backup components and flows; no detailed limits, configs, or error mappings.",unchanged @@ -50,13 +50,13 @@ https://learn.microsoft.com/en-us/azure/backup/backup-azure-about-mars,Overview, https://learn.microsoft.com/en-us/azure/backup/backup-azure-afs-automation,Backup,Back up Azure files using PowerShell - Azure Backup,,Learn how to use Azure PowerShell to back up Azure Files through an Azure Backup Recovery Services vault.,"This article describes how to use Azure PowerShell to back up Azure Files through an Azure Backup Recovery Services vault. You can also back up Azure Files using Azure portal, CLI, and REST API.",2026-02-17T08:00:00.000Z,how-to,,0.3,False,PowerShell backup how-to; primarily procedural without detailed configuration parameter tables or limits.,unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-alternate-dpm-server,Recover data from Azure Backup Server,Recover data from an Azure Backup Server by using Azure Backup - Azure Backup,Recover Azure Backup Server data from any vault-registered server,Recover the data you've protected to a Recovery Services vault from any Azure Backup Server registered to that vault.,"This article describes how to recover data from Azure Backup Server. Note When the trim is performed within the guest OS, the tracking of incremental blocks is reset, resulting in a full backup. The trim within the guest OS releases unused blocks of the virtual disk (VHDX) and optimizes the disk size. However, this reduces the size of the VHDX and changes the SequenceNumber of the tracked incremental blocks, resulting in a full backup size. 
Unless the purpose is to improve the efficiency of stor",2025-07-16T08:00:00.000Z,how-to,best-practices,0.7,True,Describes cross-server recovery behavior and a specific TRIM-related edge case where guest OS TRIM resets incremental block tracking and forces full backups; this is a product-specific gotcha not generally known.,unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-alternate-dpm-server-troubleshoot,Data recovery from Microsoft Azure Backup Server,Troubleshoot data recovery from Microsoft Azure Backup Server by using Azure Backup - Azure Backup,Troubleshoot data recovery from Microsoft Azure Backup Server,Learn how to troubleshoot data recovery from Microsoft Azure Backup Server.,This article provides recommendations for troubleshooting common errors encountered during data recovery from Microsoft Azure Backup Server (MABS).,2026-01-08T08:00:00.000Z,troubleshooting,troubleshooting,0.88,True,MABS data recovery–specific errors and recommended troubleshooting actions.,unchanged -https://learn.microsoft.com/en-us/azure/backup/backup-azure-arm-restore-vms,Restore Azure VMs in the portal,Restore VMs by using the Azure portal using Azure Backup - Azure Backup,Restore Azure VMs from Recovery Services vaults using portal,"Restore an Azure virtual machine from a recovery point by using the Azure portal, including the Cross Region Restore feature.",This article describes how to restore Azure VM data from the recovery points stored inAzure BackupRecovery Services vaults.,2026-01-23T08:00:00.000Z,how-to,configuration,0.7,True,"Portal-based restore steps, including Cross Region Restore options and parameters, are product-specific configuration flows.",unchanged +https://learn.microsoft.com/en-us/azure/backup/backup-azure-arm-restore-vms,Restore Azure VMs in the portal,Restore VMs by using the Azure portal using Azure Backup - Azure Backup,,"Restore an Azure virtual machine from a recovery point by using the Azure portal, including the Cross Region 
Restore feature.",This article describes how to restore Azure VM data from the recovery points stored in Azure Backup Recovery Services vaults.,2026-04-24T11:19:00.000Z,how-to,,0.2,False,"Article on restoring VMs from recovery points; describes restore operations and features like Cross Region Restore but is primarily procedural, lacking detailed limits, configuration parameter tables, or structured troubleshooting content.",updated https://learn.microsoft.com/en-us/azure/backup/backup-azure-arm-userestapi-backupazurevms,Back up Azure VMs,Back up Azure VMs using REST API in Azure Backup - Azure Backup,Configure and run VM backups via REST API,"In this article, learn how to configure, initiate, and manage backup operations of Azure VM Backup using REST API.","This article describes how to manage backups for an Azure VM using Azure Backup via REST API. Configure protection for the first time for a previously unprotected Azure VM, trigger an on-demand backup for a protected Azure VM and modify backup properties of a backed-up VM via REST API as explained here. To protect an Azure VM using the Azure portal, see this article. See the create vault and create policy REST API tutorials for creating new vaults and policies. Let's assume you want to protect a",2026-02-20T08:00:00.000Z,how-to,integrations,0.75,True,"Details REST endpoints and request bodies for configuring, triggering, and modifying VM backups; API contract is product-specific.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-arm-userestapi-createorupdatepolicy,Create and update backup policy,Create backup policies via REST API in Azure Backup - Azure Backup,Create Azure Backup policies using REST API,"In this article, you'll learn how to create and manage backup policies (schedule and retention) using REST API.","This article describes how to create policies for the backup of Azure VM, SQL database in Azure VM, SAP HANA database in Azure VM, and Azure Files. 
Learn more about creating or modifying a backup policy for an Azure Recovery Services vault by using REST API.",2026-02-16T08:00:00.000Z,how-to,integrations,0.75,True,Defines REST schema for backup policy schedule and retention for multiple workloads; includes policy JSON structure and parameters unique to Azure Backup.,unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-arm-userestapi-createorupdatevault,Create Recovery Services vault,Create Recovery Services vaults using REST API for Azure Backup - Azure Backup,Create Recovery Services vaults using Backup REST API,"In this article, learn how to manage backup and restore operations of Azure VM Backup using REST API.","This article describes how to create an Azure Recovery Services vault using REST API. To create the vault using the Azure portal, see this article. A Recovery Services vault is a storage entity in Azure that houses data. The data is typically copies of data, or configuration information for virtual machines (VMs), workloads, servers, or workstations. You can use Recovery Services vaults to hold backup data for various Azure services such as IaaS VMs (Linux or Windows) and SQL Server in Azure VMs. Re",2025-12-17T12:14:00.000Z,how-to,integrations,0.7,True,Describes specific REST endpoints and payloads for vault creation; includes API parameters unique to Recovery Services vaults.,unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-arm-userestapi-managejobs,Manage Azure Backup jobs,Manage the backup jobs using REST API in Azure Backup - Azure Backup,Track Azure Backup jobs using REST API,"In this article, learn how to track and manage the backup and restore jobs of Azure Backup using REST API.","This article describes how to fetch, track, and monitor Azure Backup job status by using REST APIs. 
You learn how to identify job IDs from backup and restore operations, retrieve job details to track progress, and access extended information for completed jobs, such as task-level status, backup metrics, and protected entity details. Azure Backup runs these jobs in the background for operations like backup, restore, and disable backup, and REST API endpoints provide end-to-end visibility into job",2026-02-20T18:12:00.000Z,how-to,integrations,0.75,True,Documents REST endpoints and response schemas for job status and metrics; includes job ID handling and fields unique to Azure Backup.,unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-arm-userestapi-restoreazurevms,Restore Azure VMs,Restore Azure VMs using REST API - Azure Backup,Restore Azure VMs and disks using REST API,"In this article, learn how to manage to restore operations of Azure Virtual Machine Backup using REST API.","This article describes how to restore Azure Virtual Machines or individual disks from a recovery point by using REST API. It covers selecting recovery points, triggering restore operations, and performing cross-region restores.",2026-02-20T08:00:00.000Z,how-to,integrations,0.75,True,"Provides REST operations, parameters, and flows for VM and disk restore including cross-region; these are Azure Backup–specific API patterns.",unchanged -https://learn.microsoft.com/en-us/azure/backup/backup-azure-arm-vms-prepare,Set up a vault and enable backup for Azure VMs,Back Up Azure VMs in a Recovery Services Vault - Azure Backup,,This article describes how to back up Azure VMs in a Recovery Services vault by using Azure Backup.,"This article describes how to back up Azure virtual machines (VMs) in a Recovery Services vault by usingAzure Backup. In this article, you learn how to: Note This article describes how to set up a vault and select VMs to back up. It's useful if you want to back up multiple VMs. 
Alternatively, you canback up a single Azure VMdirectly from the VM settings.",2026-03-02T12:12:00.000Z,how-to,,0.3,False,"Primarily a how-to guide for setting up a Recovery Services vault and selecting VMs for backup. Based on the description, it focuses on procedural steps rather than detailed configuration tables, limits, error codes, or product-specific best-practice gotchas. Lacks clear evidence of numeric limits, config parameter matrices, or troubleshooting mappings required for expert-knowledge classification.",unchanged +https://learn.microsoft.com/en-us/azure/backup/backup-azure-arm-vms-prepare,Set up a vault and enable backup for Azure VMs,Back Up Azure VMs in a Recovery Services Vault - Azure Backup,,This article describes how to back up Azure VMs in a Recovery Services vault by using Azure Backup.,"This article describes how to back up Azure virtual machines (VMs) in a Recovery Services vault by using Azure Backup. In this article, you learn how to: Note This article describes how to set up a vault and select VMs to back up. It's useful if you want to back up multiple VMs. Alternatively, you can back up a single Azure VM directly from the VM settings.",2026-03-25T08:00:00.000Z,how-to,,0.2,False,"Article on backing up Azure VMs in a Recovery Services vault; mainly procedural guidance for setup and selection of VMs, without explicit numeric limits, configuration tables, or decision matrices.",updated https://learn.microsoft.com/en-us/azure/backup/backup-azure-auto-enable-backup,Auto-Enable Backup on VM Creation using Azure Policy,Audit and enforce backup during VM creation automatically using Azure Policy - Azure Backup,Auto-enable VM backups using Azure Policy,Learn how to use Azure Policy to autoenable backup for all VMs created in a given scope.,"This article describes how Backup or Compliance Admins can ensure that all business-critical machines have appropriate backup and retention policies. 
Azure Backup offers a variety of built-in policies through Azure Policy to help you automatically configure backup for your Azure Virtual Machines (VMs). Based on the structure of your backup teams and the organization of your resources, you can choose the most suitable policy from the following options to ensure effective and consistent backup manag",2026-01-30T08:00:00.000Z,how-to,configuration,0.8,True,"Describes built-in Azure Policy definitions for enforcing backup on VMs; includes specific policy names, parameters, and scopes—product-specific configuration.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-backup-cloud-as-tape,Replace your tape library,Replace your tape infrastructure by using Azure Backup - Azure Backup,,Learn how Azure Backup provides tape-like semantics that enable you to back up and restore data in Azure,"This article describes how to enable backup and retention policies. If you use tapes for long-term retention, Azure Backup offers a powerful alternative with this feature, which you need to enable in the Azure Backup service. If you use System Center DPM, update to at least DPM 2012 R2 UR5 before using DPM with Azure Backup.",2025-12-12T12:11:00.000Z,how-to,,0.45,False,"Describes using Azure Backup as tape replacement and enabling policies; summary doesn’t show numeric limits, config parameter tables, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-backup-exchange-server,Use DPM to back up Exchange server,Back up an Exchange server via System Center DPM - Azure Backup,Configure DPM to back up Exchange to Azure safely,Learn how to back up an Exchange server to Azure Backup using System Center 2012 R2 DPM,"This article describes how to configure a System Center 2012 R2 Data Protection Manager (DPM) server to back up a Microsoft Exchange server to Azure Backup. 
Note When the trim is performed within the guest OS, the tracking of incremental blocks is reset, resulting in a full backup. The trim within the guest OS releases unused blocks of the virtual disk (VHDX) and optimizes the disk size. However, this reduces the size of the VHDX and changes the SequenceNumber of the tracked incremental blocks, ",2025-07-15T08:00:00.000Z,how-to,best-practices,0.7,True,Exchange backup via DPM with the TRIM note about incremental block tracking being reset and causing full backups; this is a concrete product-specific behavior and gotcha.,unchanged @@ -100,12 +100,12 @@ https://learn.microsoft.com/en-us/azure/backup/backup-azure-dpm-introduction,Pre https://learn.microsoft.com/en-us/azure/backup/backup-azure-encrypted-vm-troubleshoot,Encrypted Azure VM backup,Troubleshoot encrypted Azure VM backup errors - Azure Backup,Diagnose and fix encrypted Azure VM backup errors,Describes how to troubleshoot common errors that might occur when you use Azure Backup to back up an encrypted VM.,You can troubleshoot common errors encountered while using Azure Backup service to back up encrypted Azure virtual machines with the steps listed below:,2025-03-27T08:00:00.000Z,troubleshooting,troubleshooting,0.95,True,Troubleshooting guide with specific Azure Backup error codes/messages for encrypted VMs and mapped resolutions.,unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-enhanced-soft-delete-configure-manage,Manage soft deleted items,Configure and manage soft delete for Azure Backup - Azure Backup,Configure and manage soft delete for Azure Backup,This article describes how to configure and manage soft delete for Azure Backup.,"This article describes how to configure and use soft delete to protect your data and recover backups, if they're deleted.",2025-11-10T18:13:00.000Z,how-to,security,0.7,True,"Provides concrete configuration steps and behavior for soft delete on Azure Backup vaults, including how deleted backups are protected 
and recovered.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-exchange-mabs,Exchange,Back up Exchange server with Azure Backup Server - Azure Backup,,Learn how to back up an Exchange server to Azure Backup using Azure Backup Server,This article describes how to configure Microsoft Azure Backup Server (MABS) to back up a Microsoft Exchange server to Azure.,2025-03-25T08:00:00.000Z,how-to,,0.4,False,Exchange backup configuration with MABS; likely a scenario how-to without detailed parameter tables or limits.,unchanged -https://learn.microsoft.com/en-us/azure/backup/backup-azure-file-folder-backup-faq,FAQ-MARS agent,Microsoft Azure Recovery Services (MARS) Agent – FAQ - Azure Backup,,Addresses common questions about backing up files and folders with Azure Backup.,This article answers common questions about backing up data with the Microsoft Azure Recovery Services (MARS) Agent in theAzure Backupservice.,2025-07-18T17:15:00Z,faq,,0.45,False,"MARS Agent FAQ about backing up files/folders. Without explicit mention of error codes, config tables, or limits, it’s treated as general FAQ rather than expert troubleshooting or configuration content.",unchanged +https://learn.microsoft.com/en-us/azure/backup/backup-azure-file-folder-backup-faq,FAQ-MARS agent,Microsoft Azure Recovery Services (MARS) Agent – FAQ - Azure Backup,MARS agent backup limits and behavioral constraints,Addresses common questions about backing up files and folders with Azure Backup.,This article answers common questions about backing up data with the Microsoft Azure Recovery Services (MARS) Agent in the Azure Backup service.,2026-04-23T17:12:00Z,faq,limits-quotas,0.7,True,"FAQ pages for Azure Backup MARS agent typically document concrete product limits (max backup size, number of items, retention ranges, concurrency, supported OS versions) and time-related behaviors (scheduling, throttling). 
These are numeric, product-specific constraints that qualify as limits-quotas.",updated https://learn.microsoft.com/en-us/azure/backup/backup-azure-file-share-rest-api,Backup,Back up Azure Files with REST API - Azure Backup,Configure Azure Files backup via Azure Backup REST API,Learn how to use REST API to back up Azure Files in the Recovery Services vault,"This article describes how to back up Azure Files using Azure Backup via REST API. You can also back up Azure Files using Azure portal, CLI, and Azure PowerShell. This article assumes you've already created a Recovery Services vault and policy for configuring backup for your File Share. If you haven’t, refer to the create vault and create policy REST API tutorials for creating new vaults and policies. For this article, we'll use the following resources: RecoveryServicesVault:azurefilesvault Policy:s",2026-02-20T08:00:00.000Z,how-to,integrations,0.65,True,"REST API article that defines specific resource names and likely includes request/response schemas and parameters unique to Azure Backup, fitting integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-files,Configure backup,Back up Azure Files in the Azure portal - Azure Backup,Configure Azure Files backups in Recovery Services vault,Learn how to use the Azure portal to back up Azure Files in the Recovery Services vault,"This article describes how to back up Azure Files from the Azure portal. You can also back up Azure Files using CLI, Azure PowerShell, and REST API. Azure Files backup is a native cloud solution that protects your data and eliminates on-premises maintenance overheads. Azure Backup seamlessly integrates with Azure File Sync, centralizing your file share data and backups. 
The simple, reliable, and secure solution allows you to protect your enterprise file shares using snapshot and vaulted backups, ensuring",2026-02-17T08:00:00.000Z,how-to,configuration,0.7,True,"Portal configuration for Azure Files backup includes selecting snapshot/vaulted backups, policies, and integration with File Sync—product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-files-faq,FAQ-Back up Azure Files,Back up Azure Files FAQ - Azure Backup,,"In this article, discover answers to common questions about how to protect your Azure file shares with the Azure Backup service.","This article answers common questions about backing up Azure Files. In some of the answers, there are links to the articles that have comprehensive information. You can also post questions about the Azure Backup service in the Microsoft Q&A question page for discussion. To learn about the supported Azure Files backup and restore scenarios, region availability, and limitations, see the support matrix. To quickly scan the sections in this article, use the links to the right, under In this article.",2026-02-17T12:16:00Z,faq,,0.45,False,"Azure Files backup FAQ. The summary points to a separate support matrix for detailed scenarios/limits, suggesting this page itself is higher-level Q&A.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-files-policy-automation,Audit and Enforce backup for Azure Files using Azure policy,Audit and enforce backup for Azure Files using Azure Policy - Azure Backup,Audit and enforce Azure Files backup using Azure Policy,Learn how to use Azure Policy to audit and enforce backups for all Azure Files instances created in a given scope.,"This article describes how Azure Backup uses built-in Azure Policy definitions to automate auditing and enforcement of backup configurations for Azure Files, ensuring compliance with organizational data protection standards. 
Based on the structure of your backup teams and the organization of your resources, you can choose the most suitable policy from the following options to ensure effective and consistent backup management.",2025-06-05T11:21:00.000Z,how-to,configuration,0.8,True,Uses built-in policy definitions for Azure Files backup; contains policy-specific configuration and parameters for enforcement and auditing.,unchanged -https://learn.microsoft.com/en-us/azure/backup/backup-azure-granular-billing,Granular billing,Configure Granular Billing in Azure Backup - Azure Backup,,Azure Backup granular billing preview lets you view backup costs at multiple levels. Improve cost tracking and reporting for your teams today.,"Granular billing in Azure Backup lets you view backup costs at a custom granularity level for improved chargeback and reporting across teams, applications, and cost centers. Important",2026-04-13T22:10:00.000Z,how-to,,0.2,False,"From the summary, the page describes the concept and benefits of granular billing for Azure Backup and how it helps with cost tracking and reporting. There is no indication of numeric limits, configuration parameter tables, error codes, or decision matrices with quantified trade-offs. It appears to be a feature explanation/overview rather than detailed configuration, limits, or troubleshooting content.",new +https://learn.microsoft.com/en-us/azure/backup/backup-azure-granular-billing,Granular billing,Configure Granular Billing in Azure Backup - Azure Backup,,Azure Backup granular billing preview lets you view backup costs at multiple levels. Improve cost tracking and reporting for your teams today.,"Granular billing in Azure Backup lets you view backup costs at a custom granularity level for improved chargeback and reporting across teams, applications, and cost centers. 
Important",2026-04-13T22:10:00.000Z,how-to,,0.2,False,"From the summary, the page describes the concept and benefits of granular billing for Azure Backup and how it helps with cost tracking and reporting. There is no indication of numeric limits, configuration parameter tables, error codes, or decision matrices with quantified trade-offs. It appears to be a feature explanation/overview rather than detailed configuration, limits, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-immutable-vault-concept,Overview,Concept of Immutable Vault for Azure Backup - Azure Backup,Use immutable vaults to protect Azure Backup data,"This article explains the concept of an immutable vault for Azure Backup, and how it helps protect data from malicious actors.","An immutable vault for Azure Backup can help you protect your backup data by blocking any operations that could lead to loss of recovery points. You can lock the immutable vault setting to make it irreversible. You can also use WORM (write once, read many) storage for backups to prevent any malicious actors from disabling immutability and deleting backups.",2025-12-18T18:11:00.000Z,overview,security,0.65,True,"Describes Azure Backup’s immutable vault and WORM behavior, including irreversible lock semantics that are specific to this product’s security model.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-immutable-vault-how-to-manage,Manage,How to manage Azure Backup Immutable vault operations - Azure Backup,Manage immutable vault settings for Azure Backup,This article explains how to manage Azure Backup Immutable vault operations.,"This article describes how to manage Azure Backup Immutable vault operations for Recovery Services vault and Backup vault. Immutable vault can help you protect your backup data by blocking any operations that could lead to loss of recovery points. 
Further, you can lock the Immutable vault setting to enable WORM storage immutability and make it irreversible to prevent any malicious actors from disabling immutability and deleting backups. Note",2025-12-30T08:00:00.000Z,how-to,security,0.7,True,"How-to for managing immutable vault operations on Recovery Services and Backup vaults, with product-specific security settings and operational constraints.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-integrate-microsoft-defender-using-logic-apps,Integrate with Microsoft Defender's ransomware alerts,Integrate Microsoft Defender's ransomware alerts to protect Azure Backup recovery points - Azure Backup,Integrate Defender ransomware alerts with Azure Backup,Learn how to integrate Microsoft Defender for Cloud and Azure Backup using a logic app.,"This article describes how to integrate Microsoft Defender's ransomware alerts to preserve Azure Backup recovery points. Assume there has been a breach on the Virtual Machine that is protected by both Defender and Azure Backup. Defender detects the ransomware, raises an alert which includes details of the activity and suggested recommendations to remediate. 
As soon as a ransomware signal is detected from Defender, ensuring backups are preserved (i.e., paused from expiring) to minimize the data l",2025-03-25T08:00:00.000Z,how-to,integrations,0.75,True,"Shows Logic Apps-based integration between Defender and Backup, including signals and actions to preserve recovery points; product-specific integration pattern.",unchanged @@ -118,7 +118,7 @@ https://learn.microsoft.com/en-us/azure/backup/backup-azure-manage-windows-serve https://learn.microsoft.com/en-us/azure/backup/backup-azure-mars-troubleshoot,Azure Backup agent,Troubleshoot the Azure Backup agent - Azure Backup,Resolve Azure Backup (MARS) agent installation and backup issues,"In this article, learn how to troubleshoot the installation and registration of the Azure Backup agent.","This article describes how to resolve errors you might see during configuration, registration, backup, and restore.",2025-12-01T08:00:00.000Z,troubleshooting,troubleshooting,0.95,True,"Contains concrete MARS agent error codes and stepwise resolutions for install, registration, backup, and restore.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-microsoft-azure-backup,Install Azure Backup Server,Use Azure Backup Server to back up workloads - Azure Backup,,"In this article, learn how to prepare your environment to protect and back up workloads using Microsoft Azure Backup Server (MABS).","Applies To: MABS v4. This article explains how to prepare your environment to back up workloads using Microsoft Azure Backup Server (MABS). With Azure Backup Server, you can protect application workloads, such as Hyper-V VMs, VMware VMs, Azure Local VMs, Microsoft SQL Server, SharePoint Server, Microsoft Exchange, and Windows clients from a single console. 
Note To learn more about backing up VMware servers with Azure Backup Server, see the article,Use Azure Backup Server to back up a VMware serv",2026-01-21T12:16:00.000Z,how-to,,0.4,False,Environment preparation and general how-to for using MABS; summary does not indicate detailed config parameter tables or limits.,unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-monitor-alert-faq,FAQ-Azure Backup Monitoring and Reporting,Monitoring and Reporting FAQ - Azure Backup,,"In this article, discover answers to common questions about the Azure Backup Monitoring Alert and Azure Backup reports.",This article answers common questions about Azure Backup monitoring and reporting.,2025-07-18T17:15:00Z,faq,,0.4,False,"Monitoring and reporting FAQ for Azure Backup. Typically conceptual and usage-focused, not detailed troubleshooting or configuration matrices.",unchanged -https://learn.microsoft.com/en-us/azure/backup/backup-azure-monitor-alerts-notification,Configure notifications,Configure and manage Azure Monitor based alert notifications for Azure Backup - Azure Backup,,Learn how to configure Azure Monitor alert notifications.,This article describes how to configure and manage Azure Monitor based alert notifications for Azure Backup. You can use these alerts to monitor the health of your backup items and take corrective actions when needed.,2026-04-15T11:11:00.000Z,how-to,,0.3,False,"Summary indicates a how-to for configuring Azure Monitor-based alert notifications for Azure Backup, but it reads like a general configuration/tutorial flow without clear evidence of detailed parameter tables, specific config values, or product-unique constraints. 
Likely procedural guidance rather than dense expert configuration or troubleshooting content.",updated +https://learn.microsoft.com/en-us/azure/backup/backup-azure-monitor-alerts-notification,Configure notifications,Configure and manage Azure Monitor based alert notifications for Azure Backup - Azure Backup,,Learn how to configure Azure Monitor alert notifications.,This article describes how to configure and manage Azure Monitor based alert notifications for Azure Backup. You can use these alerts to monitor the health of your backup items and take corrective actions when needed.,2026-04-15T11:11:00.000Z,how-to,,0.3,False,"Summary indicates a how-to for configuring Azure Monitor-based alert notifications for Azure Backup, but it reads like a general configuration/tutorial flow without clear evidence of detailed parameter tables, specific config values, or product-unique constraints. Likely procedural guidance rather than dense expert configuration or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-monitor-troubleshoot,Monitoring and alerts,Troubleshoot monitoring issues for Azure Backup - Azure Backup,Fix Azure Backup monitoring and protection status issues,Learn how to troubleshoot monitoring issues for Azure Backup.,This article provides troubleshooting steps that help you resolve error messages caused during monitor protection operations for Azure Backup.,2025-12-09T12:47:00.000Z,troubleshooting,troubleshooting,0.86,True,Monitoring-related error messages for Azure Backup and their resolutions.,unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-monitoring-alerts,Azure Monitor alerts,Manage Azure Monitor based alerts for Azure Backup - Azure Backup,Configure Azure Monitor alerts for Azure Backup,Learn about the new and improved alerting capabilities via Azure Monitor and the process to configure Azure Monitor.,This article describes how to switch to Azure Monitor based alerts for Azure Backup and
monitor them.,2026-03-19T11:16:00.000Z,how-to,configuration,0.68,True,"The article goes beyond conceptual alerting and includes product-specific configuration details for Azure Backup alerts via Azure Monitor (for example, which alert rules to enable/disable, how to switch from classic alerts, and specific settings in the Azure portal). These are concrete, service-specific configuration steps and options rather than generic monitoring guidance, fitting the configuration sub-skill best. It does not focus on limits, troubleshooting error codes, or architecture decisions.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-monitoring-built-in-monitor,Review backup estate,Monitor Azure Backup protected workloads - Azure Backup,Configure built-in monitoring for Azure Backup workloads,"In this article, learn about the monitoring and notification capabilities for Azure Backup workloads using the Azure portal.","Azure Backup provides multiple backup solutions based on the backup requirement and infrastructure topology (On-premises vs Azure). Any backup user or admin should see what's going on across all solutions and can expect to be notified in important scenarios. Overview of alerts, jobs, security and usage are available by default in theOverviewpane of Resiliency, Recovery Services Vault and Backup Vault. 
This article describes the ways to view and configure monitoring capabilities via Resiliency, R",2026-01-30T18:17:00.000Z,how-to,configuration,0.65,True,"Describes specific monitoring and notification capabilities in the portal (alerts, jobs, security, usage) for Azure Backup; likely includes product-specific monitoring settings and options.",unchanged @@ -135,7 +135,7 @@ https://learn.microsoft.com/en-us/azure/backup/backup-azure-recovery-services-va https://learn.microsoft.com/en-us/azure/backup/backup-azure-reports-data-model,Log Analytics Data Model for Resource Specific Diagnostic Events,Data Model for resource specific Diagnostic Events - Azure Backup,Use resource-specific diagnostic data model for Azure Backup,This data model is in reference to the Resource Specific Mode of sending diagnostic events to Log Analytics (LA).,"Note For creating custom reporting views, it is recommended to usesystem functions on Azure Monitor logsinstead of working with the raw tables listed below.",2024-12-30T08:00:00.000Z,reference,configuration,0.8,True,"Data model reference for resource-specific diagnostic events; includes table/field names and schemas used in Log Analytics, which are detailed configuration/reference data.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-restore-files-from-vm,Recover files from Azure VM backups,Recover files and folders from Azure VM backup - Azure Backup,,"In this article, learn how to recover files and folders from an Azure virtual machine recovery point.","Caution This article references CentOS, a Linux distribution that is End Of Life (EOL) status. Please consider your use and planning accordingly. For more information, see theCentOS End Of Life guidance. Azure Backup provides the capability to restoreAzure virtual machines (VMs) and disksfrom Azure VM backups, also known as recovery points. This article explains how to recover files and folders from an Azure VM backup. 
Restoring files and folders is available only for Azure VMs deployed using th",2026-03-17T08:00:00.000Z,how-to,,0.4,False,"Task-focused restore tutorial for recovering files from VM backups; typically step-by-step UI workflow without configuration tables, limits, or error-code-based troubleshooting, so it doesn’t meet the expert-knowledge criteria.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-restore-key-secret,Restore keys and secret for encrypted VMs,Restore Key Vault key & secret for encrypted Azure VM - Azure Backup,Restore Key Vault keys and secrets for encrypted Azure VMs via Azure Backup,Learn how to restore Key Vault key and secret for encrypted VMs via Azure PowerShell using Azure Backup.,"This article describes how to use Azure VM Backup to restore encrypted Azure Virtual Machines (VMs) when the original key and secret aren't available in the Key Vault. It's also applicable for scenarios where you want to maintain a separate copy of the key (Key Encryption Key) and secret (BitLocker Encryption Key) for the restored VM. Note We recommend that you use the Azure Az PowerShell module to interact with Azure. To get started, seeInstall Azure PowerShell. To learn how to migrate to the A",2025-08-20T11:14:00.000Z,how-to,security,0.8,True,"Includes specific PowerShell cmdlets, parameters, and flows to restore KEK/BEK from backup; this is detailed, product-specific security and integration knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/backup/backup-azure-restore-system-state,Restore Windows Server System State,Restore System State to a Windows Server - Azure Backup,Restore Windows Server system state from Azure Backup,Step-by-step explanation for restoring Windows Server System State from a backup in Azure.,"This article explains how to restore Windows Server System State backups from an Azure Recovery Services vault. 
To restore System State, you must have a System State backup (created using the instructions inBack up System State, and make sure you've installed thelatest version of the Microsoft Azure Recovery Services (MARS) agent. Recovering Windows Server System State data from an Azure Recovery Services vault is a two-step process: Restore System State as files from Azure Backup. When restorin",2025-09-09T08:00:00.000Z,how-to,configuration,0.65,True,"System state restore involves specific restore modes, paths, and post-restore steps that are product-specific configuration/operation details.",unchanged +https://learn.microsoft.com/en-us/azure/backup/backup-azure-restore-system-state,Restore Windows Server System State,Restore System State to a Windows Server - Azure Backup,,Step-by-step explanation for restoring Windows Server System State from a backup in Azure.,"This article explains how to restore Windows Server System State backups from an Azure Recovery Services vault. To restore System State, you must have a System State backup (created using the instructions inBack up System State, and make sure you've installed thelatest version of the Microsoft Azure Recovery Services (MARS) agent. Recovering Windows Server System State data from an Azure Recovery Services vault is a two-step process: Restore System State as files from Azure Backup. When restorin",2026-04-23T08:00:00.000Z,how-to,,0.3,False,"How-to guide for restoring Windows Server System State from Azure Backup; mainly procedural steps. 
No strong indication of detailed configuration matrices, limits, or troubleshooting error-code mappings.",updated https://learn.microsoft.com/en-us/azure/backup/backup-azure-restore-windows-server,Recover files from Azure to Windows Server,Restore files to Windows Server using the MARS Agent - Azure Backup,,"In this article, learn how to restore data stored in Azure to a Windows server or Windows computer with the Microsoft Azure Recovery Services (MARS) Agent.","This article describes how to restore data from a backup vault. To restore data, you use the Recover Data wizard in the Microsoft Azure Recovery Services (MARS) Agent.",2025-06-20T17:02:00.000Z,how-to,,0.45,False,Restore tutorial using the Recover Data wizard; mostly procedural steps without detailed configuration tables or troubleshooting mappings.,unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-sap-hana-database,Database,Back up an SAP HANA database to Azure with Azure Backup - Azure Backup,,"In this article, learn how to back up an SAP HANA database to Azure virtual machines with the Azure Backup service.",This article describes how to back up SAP HANA databases that are running on Azure VMs to an Azure Backup Recovery Services vault. SAP HANA databases are critical workloads that require a low recovery-point objective (RPO) and long-term retention. You can back up SAP HANA databases running on Azure virtual machines (VMs) by usingAzure Backup. You can alsoback up SAP HANA System Replication databases on Azure VMsandback up SAP HANA database instance snapshots on Azure VMs. 
To learn about the supp,2026-02-16T08:00:00.000Z,how-to,,0.35,False,SAP HANA backup on Azure VMs tutorial; summary points to separate support matrix for scenarios/limits; this article is mainly procedural.,unchanged https://learn.microsoft.com/en-us/azure/backup/backup-azure-sap-hana-database-troubleshoot,SAP HANA backup in Azure VMs,Troubleshoot SAP HANA databases back up errors - Azure Backup,Fix SAP HANA database backup errors on Azure VMs,Describes how to troubleshoot common errors that might occur when you use Azure Backup to back up SAP HANA databases.,"This article provides troubleshooting information to back up SAP HANA databases on Azure virtual machines. For more information on the SAP HANA backup scenarios we currently support, seeScenario support.",2026-01-09T08:00:00.000Z,troubleshooting,troubleshooting,0.95,True,SAP HANA-on-Azure backup error codes and resolutions specific to Azure Backup.,unchanged @@ -183,16 +183,16 @@ https://learn.microsoft.com/en-us/azure/backup/backup-during-vm-creation,Enable https://learn.microsoft.com/en-us/azure/backup/backup-encryption,Encryption in Azure Backup,Encryption in Azure Backup - Azure Backup,Understand encryption behavior in Azure Backup,Learn how encryption features in Azure Backup help you protect your backup data and meet the security needs of your business.,"Azure Backup automatically encrypts all your backed-up data while storing in the cloud using Azure Storage encryption, which helps you meet your security and compliance commitments. This data at rest is encrypted using 256-bit AES encryption (one of the strongest block ciphers available that is FIPS 140-2 compliant). Additionally, all your backup data in transit is transferred over HTTPS. It always remains on the Azure backbone network. 
This article describes the levels of encryption in Azure Ba",2024-09-11T08:00:00.000Z,reference,security,0.8,True,"Documents encryption mechanisms (AES-256, at-rest and in-transit) and how they apply to Backup; includes product-specific security behavior.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-instant-restore-capability,Back up and restore Azure VMs with Azure Backup Instant Restore,Azure Instant Restore Capability - Azure Backup,Instant Restore performance limits for Azure VM backups,Learn about Azure Instant Restore availability and FAQs for a VM backup stack in an Azure Resource Manager deployment model.,This article describes the improved backup and restore performance of the Instant Restore capability in Azure Backup.,2026-04-01T06:12:00.000Z,overview,limits-quotas,0.78,True,"The Instant Restore capability article describes improved backup/restore performance and typically documents concrete operational constraints such as how many snapshots are kept, how long snapshots are retained, time windows for instant restore, and I/O or performance characteristics tied to specific configurations. These are numeric, product-specific limits and behaviors that fit the limits-quotas category.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-mabs-add-storage,Add storage,Use Modern Backup Storage with Azure Backup Server - Azure Backup,Configure Modern Backup Storage for Azure Backup Server,Learn about the new features in Azure Backup Server. This article describes how to upgrade your Backup Server installation.,"This article describes how to add storage to Microsoft Azure Backup Server (MABS). Azure Backup Server V2 and later supports Modern Backup Storage that offers storage savings of 50 percent, backups that are three times faster, and more efficient storage. It also offers workload-aware storage. Note To use Modern Backup Storage, you must run Backup Server V2 or later on Windows Server 2016 or later. 
-If you run Backup Server V2 on an earlier version of Windows Server, Azure Backup Server can't take",2026-04-13T08:00:00.000Z,how-to,configuration,0.68,True,"The article describes product-specific configuration details for adding and using Modern Backup Storage in Microsoft Azure Backup Server, including required server versions (MABS V2+ and Windows Server 2016+), feature behavior, and how storage is attached and used. These are concrete, product-specific configuration requirements rather than generic concepts.",updated +If you run Backup Server V2 on an earlier version of Windows Server, Azure Backup Server can't take",2026-04-13T08:00:00.000Z,how-to,configuration,0.68,True,"The article describes product-specific configuration details for adding and using Modern Backup Storage in Microsoft Azure Backup Server, including required server versions (MABS V2+ and Windows Server 2016+), feature behavior, and how storage is attached and used. These are concrete, product-specific configuration requirements rather than generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-mabs-files-applications-azure-stack,Protect files and applications,Back up files in Azure Stack VMs - Azure Backup,,Use Azure Backup to back up and recover Azure Stack files and applications to your Azure Stack environment.,"You can use Azure Backup to protect (or back up) files and applications on Azure Stack. To back up files and applications, install Microsoft Azure Backup Server as a virtual machine running on Azure Stack. You can protect the files on any Azure Stack server in the same virtual network. Once you've installed Azure Backup Server, add Azure disks to increase the local storage available for short-term backup data. Azure Backup Server uses Azure storage for long-term retention. 
Note Though Azure Back",2025-06-27T08:00:00.000Z,how-to,,0.4,False,"How-to for backing up files/apps on Azure Stack VMs; summary mentions adding disks and using Azure storage but no explicit parameter tables, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-mabs-install-azure-stack,Install Azure Backup Server,Install Azure Backup Server on Azure Stack Hub - Azure Backup,,"In this article, learn how to use Azure Backup Server to protect or back up workloads in Azure Stack Hub.","This article describes how to install Azure Backup Server on Azure Stack Hub. Microsoft Azure Backup Server (MABS) protects Infrastructure as a Service (IaaS) workloads, including virtual machines running in Azure Stack Hub. It enables you to manage all workload protection from a single console, streamlining your operations. Note To learn about security capabilities, refer toAzure Backup security features documentation. For more information on the supported workloads, see theAzure Backup Server ",2025-04-30T08:00:00.000Z,how-to,,0.4,False,"Install guide for MABS on Azure Stack Hub; summary suggests step-by-step setup and supported workloads reference, but no explicit config tables or limits in the provided text.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-mabs-protection-matrix,MABS V4 (and later) protection matrix,MABS (Azure Backup Server) V4 protection matrix - Azure Backup,Use MABS v4 protection matrix for supported workloads,"This article provides a support matrix listing all workloads, data types, and installations that Azure Backup Server v4 protects.",This article lists the various servers and workloads that you can protect with Azure Backup Server. The following matrix lists what can be protected with Azure Backup Server. The following table lists the support matrix for MABS v4 (and later): Note Support for the 32-bit protection agent isn't supported with MABS v4 (and later).
See32-Bit protection agent deprecation.,2026-03-09T08:00:00.000Z,reference,deployment,0.7,True,Protection matrix is effectively a support/deployment matrix listing which workloads and installations are supported by Azure Backup Server v4; this is product- and version-specific expert knowledge about what can be protected where.,unchanged -https://learn.microsoft.com/en-us/azure/backup/backup-mabs-release-notes-v3,Release notes MABS,Release notes for Microsoft Azure Backup Server v3 - Azure Backup,Resolve known issues in Microsoft Azure Backup Server v3,This article provides the information about the known issues and workarounds for Microsoft Azure Backup Server (MABS) v3.,This article provides the known issues and workarounds for Microsoft Azure Backup Server (MABS) V3.,2025-07-16T11:12:00.000Z,release-notes,troubleshooting,0.7,True,"Release notes for MABS v3 with known issues and workarounds; these map specific symptoms to causes and resolutions, fitting troubleshooting.",unchanged +https://learn.microsoft.com/en-us/azure/backup/backup-mabs-release-notes-v3,Release notes MABS,Release notes for Microsoft Azure Backup Server v3 - Azure Backup,Troubleshoot known issues in MABS v3,This article provides the information about the known issues and workarounds for Microsoft Azure Backup Server (MABS) v3.,This article provides the known issues and workarounds for Microsoft Azure Backup Server (MABS) V3.,2026-04-24T06:15:00.000Z,release-notes,troubleshooting,0.7,True,"Release notes for MABS v3 with known issues and workarounds usually list specific symptoms, error messages/codes, causes, and resolutions. 
This matches the troubleshooting pattern of symptom → cause → solution with product-specific details.",updated https://learn.microsoft.com/en-us/azure/backup/backup-mabs-sharepoint-azure-stack,Protect SharePoint farm,Back up a SharePoint farm on Azure Stack using Microsoft Azure Backup Server - Azure Backup,,Learn how to back up and restore SharePoint data using Microsoft Azure Backup Server (MABS).,"This article describes how to back up and restore SharePoint data using Microsoft Azure Backup Server (MABS). Microsoft Azure Backup Server (MABS) enables you to back up a SharePoint farm (on Azure Stack) to Microsoft Azure, which gives an experience similar to back up of other data sources. Azure Backup provides flexibility in the backup schedule to create daily, weekly, monthly, or yearly backup points, and gives you retention policy options for various backup points. It also provides the capa",2025-03-25T08:00:00.000Z,how-to,,0.4,False,"SharePoint farm backup/restore on Azure Stack; summary focuses on schedule/retention flexibility conceptually, not on specific numeric limits or config parameters.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-mabs-sql-azure-stack,Protect SQL Server database,Back up SQL Server on Azure Stack using Azure Backup - Azure Backup,,Learn how to configure Microsoft Azure Backup Server (MABS) to protect SQL Server databases on Azure Stack.,This article describes how to configure Microsoft Azure Backup Server (MABS) to protect SQL Server databases on Azure Stack.,2026-02-13T08:00:00.000Z,how-to,,0.3,False,"Very short summary; just states configuring MABS to protect SQL Server on Azure Stack, with no evidence of detailed config tables or limits.",unchanged -https://learn.microsoft.com/en-us/azure/backup/backup-mabs-system-state-and-bmr,Protect system state and bare metal recovery,System state and bare-metal recovery protection for Azure Backup - Azure Backup,,Use Azure Backup Server to back up your system state and provide 
bare-metal recovery (BMR) protection.,This article describes how to back up system state and restore to bare metal by using Azure Backup Server.,2025-12-05T08:00:00.000Z,how-to,,0.4,False,System state and BMR protection how-to; primarily procedural backup/restore steps rather than configuration or limits reference.,unchanged +https://learn.microsoft.com/en-us/azure/backup/backup-mabs-system-state-and-bmr,Protect system state and bare metal recovery,System state and bare-metal recovery protection for Azure Backup - Azure Backup,,Use Azure Backup Server to back up your system state and provide bare-metal recovery (BMR) protection.,This article describes how to back up system state and restore to bare metal by using Azure Backup Server.,2026-04-24T08:00:00.000Z,how-to,,0.25,False,"Article appears to be a procedural guide for backing up system state and performing bare-metal recovery with Azure Backup Server. The summary does not show specific limits, configuration parameter tables, or detailed troubleshooting mappings, so it likely lacks the required expert-knowledge patterns.",updated https://learn.microsoft.com/en-us/azure/backup/backup-mabs-unattended-install,Unattended installation,Silent installation of Azure Backup Server V4 - Azure Backup,Automate silent installation of Azure Backup Server v4,Use a PowerShell script to silently install Azure Backup Server V4. This kind of installation is also called an unattended installation.,"This article describes how to run an unattended installation of Azure Backup Server. 
These steps don't apply if you're installing older version of Azure Backup Server like MABS V1, V2 and V3.",2025-07-14T11:09:00.000Z,how-to,deployment,0.7,True,"Unattended installation via PowerShell script is a deployment pattern specific to MABS, with product-specific requirements and parameters.",unchanged -https://learn.microsoft.com/en-us/azure/backup/backup-mabs-whats-new-mabs,What's New in MABS,What's new in Microsoft Azure Backup Server - Azure Backup,,"Microsoft Azure Backup Server gives you enhanced backup capabilities for protecting VMs, files and folders, workloads, and more.","Microsoft Azure Backup Server gives you enhanced backup capabilities to protect VMs, files and folders, workloads, and more.",2025-07-25T08:00:00.000Z,release-notes,,0.3,False,"“What’s new” overview; typically marketing/feature summary without detailed configuration, limits, or troubleshooting mappings.",unchanged +https://learn.microsoft.com/en-us/azure/backup/backup-mabs-whats-new-mabs,What's New in MABS,What's new in Microsoft Azure Backup Server - Azure Backup,,"Microsoft Azure Backup Server gives you enhanced backup capabilities for protecting VMs, files and folders, workloads, and more.","Microsoft Azure Backup Server gives you enhanced backup capabilities to protect VMs, files and folders, workloads, and more.",2026-04-24T08:00:00.000Z,release-notes,,0.2,False,"“What’s new” feature overview for Microsoft Azure Backup Server; typically release highlights and marketing-style descriptions rather than detailed limits, configuration tables, or troubleshooting mappings.",updated https://learn.microsoft.com/en-us/azure/backup/backup-managed-disks,Backup,Back up Azure Managed Disks - Azure Backup,Configure Azure Managed Disk backups in the portal,Learn how to back up Azure Managed Disks from the Azure portal.,"This article describes how to back upAzure Diskusing the Azure portal. You can also use REST API tocreate a Backup policyandconfigure backupfor Azure Managed Disk. 
To view the supported Azure Disk backup and restore scenarios, region availability, and limitations, see thesupport matrix. For common questions, see thefrequently asked questions. Note If the target disk is attached as a Persistent Volume to an AKS cluster, chooseAzure Backup for AKSover the standalone Disk Backup solution. It enable",2026-02-13T08:00:00.000Z,how-to,configuration,0.65,True,"Portal-based disk backup setup includes selecting vaults, policies, and options; these are product-specific configuration steps and parameters.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-managed-disks-cli,Backup,Back up Azure Managed Disks using Azure CLI - Azure Backup,Back up Azure Managed Disks using Azure CLI,Learn how to back up Azure Managed Disks using Azure CLI.,"This article describes how to back upAzure Managed Diskusing Azure CLI. You can also use REST API tocreate a Backup policyandconfigure backupfor Azure Managed Disk. Important Support for Azure Managed Disks backup and restore via CLI is in preview and available as an extension in Az 2.15.0 version and later. The extension is automatically installed when you run theaz dataprotectioncommands.Learn moreabout extensions. 
Learn about theAzure Disk backup region availability, supported scenarios and l",2025-08-25T08:00:00.000Z,how-to,integrations,0.7,True,"CLI article uses az dataprotection extension with specific commands/parameters and version constraints (Az 2.15.0+), which are product-specific integration details.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-managed-disks-policy,Audit and Enforce backup for Managed Disks using Azure policy,Audit and enforce backup for Managed Disks using Azure Policy - Azure Backup,Audit and enforce Managed Disks backup with Azure Policy,Learn how to use Azure Policy to audit and enforce backups for Azure Managed Disks to ensure compliance and protect business-critical data.,"This article describes how Azure Backup uses built-inAzure Policydefinitions to automate auditing and enforce backup configurations for Azure Managed Disks. These built-in policies ensure compliance with your organization’s retention requirements for business-critical machines. 
As a Backup and Compliance admin, choose the policy that best fits your team’s structure and resource organization to automatically configure backups for Azure Managed Disks.",2025-11-25T18:11:00.000Z,how-to,configuration,0.8,True,"Describes built-in Azure Policy definitions for Managed Disks backup; includes policy names, parameters, and scopes that are specific configuration knowledge.",unchanged @@ -211,14 +211,14 @@ https://learn.microsoft.com/en-us/azure/backup/backup-sql-server-database-from-a https://learn.microsoft.com/en-us/azure/backup/backup-sql-server-on-availability-groups,Back up SQL Server Availability Group,Back up SQL Server always on availability groups - Azure Backup,Back up SQL Server Always On availability groups with Azure Backup,"In this article, learn how to back up SQL Server on availability groups.","Azure Backup offers an end-to-end support for backing up SQL Server always on availability groups (AG) if all nodes are in the same region and subscription as the Recovery Services vault. However, if the AG nodes are spread across regions/subscriptions/on-premises and Azure, there are a few considerations to keep in mind. To view the backup and restore scenarios that we support today, see thesupport matrix. For common questions, see thefrequently asked questions. Note Backup of Basic Availabilit",2026-02-13T08:00:00.000Z,how-to,best-practices,0.65,True,"Covers AG-specific considerations (same region/subscription requirements, unsupported Basic AG, cross-region/on-premises nuances) which are product-specific gotchas and guidance.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-sql-server-vm-from-vm-pane,Back up a SQL Server VM from the VM pane,Back up a SQL Server VM from the VM pane - Azure Backup,,"In this article, learn how to back up SQL Server databases on Azure virtual machines from the VM pane.",This article explains how to back up SQL Server running in Azure VMs with theAzure Backupservice. 
You can back up SQL Server VMs using two methods:,2026-02-13T08:00:00.000Z,how-to,,0.35,False,Basic explanation of two methods to back up SQL Server VMs; largely procedural without deep config or limits.,unchanged https://learn.microsoft.com/en-us/azure/backup/backup-support-automation,Support matrix for automation,Automation in Azure Backup support matrix - Azure Backup,,This article summarizes automation tasks related to Azure Backup support.,You can automate most backup related tasks using programmatic methods in Azure Backup. This article provides information about various scenarios that automation clients support and the corresponding document references.,2026-02-20T08:00:00.000Z,reference,,0.3,False,"Support matrix here appears to be a high-level list of which Azure Backup tasks can be automated and links to other docs, without concrete limits, configuration tables, or error-code-based troubleshooting. It’s more of a capability overview than detailed expert configuration or limits content.",unchanged -https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix,Azure Backup support matrix,Azure Backup support matrix - Azure Backup,Review global Azure Backup support settings and limits,Provides a summary of support settings and limitations for the Azure Backup service.,"You can useAzure Backupto back up data to the Microsoft Azure cloud platform. This article summarizes the general support settings and limitations for Azure Backup scenarios and deployments. 
Other support matrices are available: Note This service supports Azure Lighthouse, which lets service providers sign in to their own tenant to manage subscriptions and resource groups that customers have delegated.",2026-01-29T08:00:00.000Z,reference,limits-quotas,0.85,True,"Support matrix summarizing support settings and limitations across scenarios; typically includes numeric limits, supported versions, and constraints not known generically.",unchanged -https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix-iaas,Support matrix,Support matrix for Azure VM backups - Azure Backup,Support matrix and limitations for Azure VM backups,Get a summary of support settings and limitations for backing up Azure VMs by using the Azure Backup service.,"You can use the Azure Backup service to back up on-premises machines and workloads, along with Azure virtual machines (VMs). This article summarizes support settings and limitations when you back up Azure VMs by using Azure Backup. Other support matrices include:",2026-01-28T08:00:00.000Z,reference,limits-quotas,0.9,True,"Support matrix pages enumerate supported/unsupported configurations and explicit limitations (OS versions, disk types, max sizes, scenarios) in table form; these are detailed constraints and limits that qualify as expert knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix-mabs-dpm,DPM/Azure Backup Server (MABS) support matrix,MABS & System Center DPM support matrix - Azure Backup,Review MABS and DPM backup support and limits,This article summarizes Azure Backup support when you use Microsoft Azure Backup Server (MABS) or System Center DPM to back up on-premises and Azure VM resources.,"This article summarizes support settings and limitations for backing up machines by using Microsoft Azure Backup Server (MABS) or System Center Data Protection Manager (DPM), and Azure Backup.
Azure Backup allows you to back up on-premises machines and workloads, and Azure virtual machines (VMs).",2025-09-11T08:00:00.000Z,reference,limits-quotas,0.85,True,Support matrix for MABS and DPM; summarizes supported scenarios and limitations with detailed tables and constraints.,unchanged +https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix,Azure Backup support matrix,Azure Backup support matrix - Azure Backup,Review Azure Backup support settings and limits,Provides a summary of support settings and limitations for the Azure Backup service.,"You can use Azure Backup to back up data to the Microsoft Azure cloud platform. This article summarizes the general support settings and limitations for Azure Backup scenarios and deployments. Other support matrices are available: Note This service supports Azure Lighthouse, which lets service providers sign in to their own tenant to manage subscriptions and resource groups that customers have delegated.",2026-03-25T08:00:00.000Z,reference,limits-quotas,0.86,True,"A 'support matrix' for Azure Backup scenarios and deployments will list which workloads, OS versions, and configurations are supported along with explicit limitations and constraints. These matrices typically include exact numerical limits (for example, maximum number of items per vault, supported sizes, retention constraints) and scenario-specific support details that are product- and version-specific, qualifying as expert knowledge.
This aligns best with the limits-quotas category because the primary purpose is to enumerate support boundaries and limitations rather than provide conceptual guidance or troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix-iaas,Support matrix,Support matrix for Azure VM backups - Azure Backup,Review Azure VM backup support limits and matrix,Get a summary of support settings and limitations for backing up Azure VMs by using the Azure Backup service.,"You can use the Azure Backup service to back up on-premises machines and workloads, along with Azure virtual machines (VMs). This article summarizes support settings and limitations when you back up Azure VMs by using Azure Backup. Other support matrices include:",2026-04-24T11:19:00.000Z,reference,limits-quotas,0.86,True,"A 'support matrix' for backups typically enumerates detailed support constraints and limitations (for example, supported/unsupported VM types, disk sizes, OS versions, and backup feature limits). These are product-specific limits and constraints that change over time and are not reliably known from training data, fitting the limits-quotas category.",updated +https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix-mabs-dpm,DPM/Azure Backup Server (MABS) support matrix,MABS & System Center DPM support matrix - Azure Backup,Check MABS and DPM Azure Backup support limits,This article summarizes Azure Backup support when you use Microsoft Azure Backup Server (MABS) or System Center DPM to back up on-premises and Azure VM resources.,"This article summarizes support settings and limitations for backing up machines by using Microsoft Azure Backup Server (MABS) or System Center Data Protection Manager (DPM), and Azure Backup. Azure Backup allows you to back up on-premises machines and workloads, and Azure virtual machines (VMs). Note Windows Server 2008, 2008 R2, 2012 and 2012 R2 have reached End of Support (EOS).
Review your usage and plan OS upgrades and migrations accordingly. For more information, see End of support for: Per",2026-04-24T08:00:00.000Z,reference,limits-quotas,0.85,True,"Support matrix pages for MABS/DPM and Azure Backup enumerate supported/unsupported workloads, OS versions, and often concrete limits (max data sizes, protected instances, scenarios) in tabular form. These are product-specific support and limitation details that function as limits/quotas beyond generic knowledge.",updated https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix-mars-agent,Support matrix,Support matrix for the MARS agent - Azure Backup,Check MARS agent backup support and limitations,This article summarizes Azure Backup support when you back up machines that are running the Microsoft Azure Recovery Services (MARS) agent.,"You can use the Azure Backup service to back up on-premises machines and apps and to back up Azure virtual machines (VMs). This article summarizes support settings and limitations when you use the Microsoft Azure Recovery Services (MARS) agent to back up machines. Note Windows Server 2008 and Windows Server 2008 R2 have reached End of Support (EOS). For more information, see End of support for Windows Server 2008 and Windows Server 2008 R2 and Perform in-place upgrade to Windows Server 2016, 2019, 2",2026-03-09T08:00:00.000Z,reference,limits-quotas,0.85,True,"Support matrix for the MARS agent will list supported OS versions, workloads, and explicit limitations and constraints; these are concrete product-specific limits and support boundaries that constitute expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-the-mabs-server,Back up the MABS server,Back up the MABS server - Azure Backup,,Learn how to back up the Microsoft Azure Backup Server (MABS).,"This article describes how to create a robust backup strategy for Microsoft Azure Backup Server (MABS).
It explains the importance of backing up both the MABS server and its database to ensure reliable data recovery and maintain business continuity in case of server failure. Without a proper backup plan, organizations risk losing access to disk-based recovery points and may be forced to manually rebuild the server, which can be time-consuming and disruptive. The article provides an overview of a",2025-09-23T08:00:00.000Z,how-to,,0.4,False,"High-level backup strategy for MABS server and DB; summary doesn’t indicate concrete config values, limits, or product-specific error mappings.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-vault-overview,Overview,Overview of the Backup vaults - Azure Backup,,An overview of Backup vaults.,"This article describes the features of a Backup vault. A Backup vault is a storage entity in Azure that houses backup data for certain newer workloads that Azure Backup supports. You can use Backup vaults to hold backup data for various Azure services, such as Azure Blob, Azure Database for PostgreSQL servers and newer workloads that Azure Backup will support. Backup vaults make it easy to organize your backup data, while minimizing management overhead. Learn about the types of vault supported fo",2026-01-29T08:00:00.000Z,overview,,0.1,False,"Overview of Backup vaults; primarily descriptive without detailed configuration parameters, limits, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/backup/backup-vault-troubleshoot,Backup vault,Troubleshoot Backup vault management operations on Azure Backup - Azure Backup,Troubleshoot Backup vault management errors,This article describes how to troubleshoot common errors that might occur when you manage Backup vault.,"This article provides troubleshooting information to manage Backup vault operations.
For more information on the Backup vault management scenarios we currently support, see Backup vault overview.",2026-01-30T08:00:00.000Z,troubleshooting,troubleshooting,0.8,True,Explicit troubleshooting article for Backup vault operations; expected to map specific error messages/codes to causes and resolutions.,unchanged -https://learn.microsoft.com/en-us/azure/backup/backup-windows-with-mars-agent,Back up Windows Server files and folders,Back up Windows machines by using the MARS agent - Azure Backup,Configure Windows backups using the MARS agent,Use the Microsoft Azure Recovery Services (MARS) agent to back up Windows machines.,This article describes how to back up Windows machines by using the Azure Backup service and the Microsoft Azure Recovery Services (MARS) agent or Azure Backup agent.,2025-06-18T08:00:00.000Z,how-to,configuration,0.7,True,"Describes how to configure backup schedules, retention, and included items in the MARS agent UI; these are concrete product-specific configuration parameters and patterns.",unchanged +https://learn.microsoft.com/en-us/azure/backup/backup-windows-with-mars-agent,Back up Windows Server files and folders,Back up Windows machines by using the MARS agent - Azure Backup,,Use the Microsoft Azure Recovery Services (MARS) agent to back up Windows machines.,This article describes how to back up Windows machines by using the Azure Backup service and the Microsoft Azure Recovery Services (MARS) agent or Azure Backup agent.,2026-04-23T17:12:00.000Z,how-to,,0.3,False,"Step-by-step tutorial for backing up Windows machines with the MARS agent; primarily procedural guidance without clear indication of configuration tables, numeric limits, or product-specific error mappings.
Does not clearly meet any expert-knowledge sub-skill criteria.",updated https://learn.microsoft.com/en-us/azure/backup/blob-backup-configure-manage,Configure and manage,Configure and manage backup for Azure Blobs using Azure Backup - Azure Backup,Configure operational and vaulted backups for Azure Blobs,Learn how to configure and manage operational and vaulted backups for Azure Blobs.,"Azure Backup allows you to configure operational and vaulted backups to protect block blobs in your storage accounts. This article describes how to configure and manage backups on one or more storage accounts using the Azure portal. You can also configure backup using REST API. For more information on the general availability of vaulted backups for Azure Blob Storage and how they enhance data protection with ransomware resilience and long-term retention, see the Microsoft Community Hub blog.",2026-01-30T08:00:00.000Z,how-to,configuration,0.7,True,"Portal configuration of blob backups includes selecting backup types, policies, and retention—product-specific configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/backup/blob-backup-configure-quick,From the Azure portal,Quickstart - Configure vaulted backup for Azure Blobs using Azure Backup - Azure Backup,,"In this quickstart, learn how to configure vaulted backup for Azure Blobs.","This quickstart describes how to create a backup policy and configure vaulted backup for Azure Blobs from the Azure portal. You can also configure backup using REST API. Azure Backup now allows you to configure both operational and vaulted backups to protect block blobs in your storage accounts.
Vaulted backup of blobs is a managed offsite backup solution that stores the backup data in a general v2 storage account, enabling you to protect your backup data against ransomware attacks or source data loss",2025-06-17T17:06:00.000Z,quickstart,,0.35,False,"Quickstart for configuring vaulted backup for Azure Blobs via portal; focuses on creating a policy and enabling backup, not on exhaustive settings or quotas.",unchanged https://learn.microsoft.com/en-us/azure/backup/blob-backup-configure-tutorial,Tutorial to back up Azure Blob,Tutorial - Configure vaulted backup for Azure Blobs using Azure Backup - Azure Backup,,"In this tutorial, learn how to configure vaulted backup for Azure Blobs.","This tutorial describes how to create a backup policy and configure vaulted backup for Azure Blobs from the Azure portal. You can also configure backup using REST API. Azure Backup now allows you to configure both operational and vaulted backups to protect block blobs in your storage accounts. Vaulted backup of blobs is a managed offsite backup solution that stores the backup data in a general v2 storage account, enabling you to protect your backup data against ransomware attacks or source data loss d",2025-06-17T08:00:00.000Z,tutorial,,0.1,False,"Tutorial for configuring vaulted backup for blobs; focuses on how to create a policy and configure backup via portal/REST, not on exhaustive configuration parameter tables or numeric limits.",unchanged
You can use Backup vaults to hold backup data for various Azure services, such as Azure Database for PostgreSQL servers and newer workloads that Azure Backup will support. Backup vaults make it easy to organize your backup data, while minimizing management overhead. Backup vaults are based on the Azure Re",2026-02-05T08:00:00.000Z,how-to,configuration,0.7,True,Managing Backup vaults involves specific settings and operational behaviors unique to Azure Backup’s vault implementation.,unchanged https://learn.microsoft.com/en-us/azure/backup/manage-monitor-sql-database-backup,Manage,Manage and monitor SQL Server DBs on an Azure VM - Azure Backup,Manage and monitor Azure Backup for SQL Server VMs,This article describes how to manage and monitor SQL Server databases that are running on an Azure VM.,"This article describes common tasks for managing and monitoring SQL Server databases that are running on an Azure virtual machine (VM) and that are backed up to an Azure Backup Recovery Services vault by the Azure Backup service using Azure portal. You can also use Azure CLI and REST API to manage SQL database backups. You can monitor jobs and alerts, stop and resume database protection, run backup jobs, and unregister a VM from backups. If you haven't yet configured backups for your SQL Server databa",2026-02-24T18:11:00.000Z,how-to,configuration,0.6,True,"Describes managing protection, jobs, alerts, and unregistering VMs via portal/CLI/REST; likely includes specific operation names and parameters for managing backups.",unchanged https://learn.microsoft.com/en-us/azure/backup/manage-recovery-points,Manage recovery points,Manage recovery points - Azure Backup,,Learn how the Azure Backup service manages recovery points for virtual machines,"This article describes how retention works for virtual machines. Whenever backups happen, recovery points are created from which restore operations can be carried out.
For virtual machines, the initial backup is a full backup and the subsequent backups are incremental backups.",2026-03-09T06:15:00.000Z,how-to,,0.0,False,"The page explains how Azure Backup manages VM recovery points and retention conceptually (full vs incremental backups, restore from recovery points). From the description, it does not clearly expose specific numeric limits, configuration tables, error codes, or detailed configuration parameters that match any sub-skill detection criteria. Without evidence of concrete limits, quotas, or product-specific configuration values, it does not meet the expert-knowledge threshold for any category.",unchanged -https://learn.microsoft.com/en-us/azure/backup/manage-telemetry,Turn on/off telemetry settings,Manage telemetry settings in Microsoft Azure Backup Server (MABS) - Azure Backup,Configure telemetry and diagnostics for Azure Backup Server,This article provides information about how to manage the telemetry settings in MABS.,"Note This feature is applicable for MABS V3 UR2 and later. This article describes how to manage telemetry (Diagnostics and utility data) settings in Microsoft Azure Backup Server (MABS). By default, MABS sends diagnostic and connectivity data to Microsoft. Microsoft uses this data to enhance the quality, security, and reliability of its products and services. Administrators can turn off this feature at any point of time. Learn about the data collected details.",2026-04-13T08:00:00.000Z,how-to,configuration,0.7,True,"Page is about managing telemetry settings for Microsoft Azure Backup Server, which typically includes product-specific options (e.g., enabling/disabling diagnostics, possibly registry/setting names, version applicability like MABS V3 UR2 and later).
This is configuration-focused expert knowledge rather than generic concepts, but not about limits, security roles, or deployment.",updated +https://learn.microsoft.com/en-us/azure/backup/manage-telemetry,Turn on/off telemetry settings,Manage telemetry settings in Microsoft Azure Backup Server (MABS) - Azure Backup,Configure telemetry and diagnostics for Azure Backup Server,This article provides information about how to manage the telemetry settings in MABS.,"Note This feature is applicable for MABS V3 UR2 and later. This article describes how to manage telemetry (Diagnostics and utility data) settings in Microsoft Azure Backup Server (MABS). By default, MABS sends diagnostic and connectivity data to Microsoft. Microsoft uses this data to enhance the quality, security, and reliability of its products and services. Administrators can turn off this feature at any point of time. Learn about the data collected details.",2026-04-13T08:00:00.000Z,how-to,configuration,0.7,True,"Page is about managing telemetry settings for Microsoft Azure Backup Server, which typically includes product-specific options (e.g., enabling/disabling diagnostics, possibly registry/setting names, version applicability like MABS V3 UR2 and later). This is configuration-focused expert knowledge rather than generic concepts, but not about limits, security roles, or deployment.",unchanged https://learn.microsoft.com/en-us/azure/backup/metrics-overview,Metrics,Monitor the health of your backups using Azure Backup Metrics (preview) - Azure Backup,Use Azure Backup metrics and thresholds to monitor backup health,"In this article, learn about the metrics available for Azure Backup to monitor your backup health",Azure Backup provides a set of built-in metrics via Azure Monitor that enable you to monitor the health of your backups. It also allows you to configure alert rules that trigger when the metrics exceed defined thresholds.
Azure Backup offers the following key capabilities: Learn more about Azure Monitor metrics.,2025-11-27T08:00:00.000Z,overview,limits-quotas,0.7,True,"Metrics overview for Azure Backup; such pages define specific metric names and threshold-based alerting guidance, including numeric thresholds for health monitoring.",unchanged https://learn.microsoft.com/en-us/azure/backup/microsoft-azure-backup-server-protection-v3,MABS V3 RTM protection matrix,What Azure Backup Server V3 RTM can back up - Azure Backup,Use MABS v3 RTM protection matrix for supported workloads,"This article provides a protection matrix listing all workloads, data types, and installations that Azure Backup Server V3 RTM protects.","Note Windows Server 2008 and Windows Server 2008 R2 have reached End of Support (EOS). For more information, see End of support for Windows Server 2008 and Windows Server 2008 R2 and Perform in-place upgrade to Windows Server 2016, 2019, 2022, or 2025. Review your usage and plan OS upgrades and migrations accordingly. The following matrix lists what can be protected with Azure Backup Server V3 RTM and earlier versions.",2026-03-09T08:00:00.000Z,reference,deployment,0.7,True,"Lists what can be protected with Azure Backup Server V3 RTM and earlier in matrix form (workloads vs supported environments), which functions as a deployment/support matrix containing expert, version-specific compatibility details.",unchanged https://learn.microsoft.com/en-us/azure/backup/microsoft-azure-backup-server-protection-v3-ur1,MABS V3 UR1 (and later),MABS (Azure Backup Server) V3 UR1 protection matrix - Azure Backup,Use MABS v3 UR1 protection matrix for supported workloads,"This article provides a support matrix listing all workloads, data types, and installations that Azure Backup Server protects.",This article lists the various servers and workloads that you can protect with Azure Backup Server. The following matrix lists what can be protected with Azure Backup Server.
Use the following matrix for MABS v3 UR1 (and later): Workloads – The workload type of technology. Version – Supported MABS version for the workloads. MABS installation – The computer/location where you wish to install MABS. Protection and recovery – List the detailed information about the workloads such as supported storag,2026-03-09T08:00:00.000Z,reference,deployment,0.7,True,"Provides a detailed support/protection matrix by workload, version, installation location, and protection/recovery options for MABS v3 UR1; this is a product-specific deployment/support matrix that an LLM would not reliably know.",unchanged @@ -298,7 +298,7 @@ https://learn.microsoft.com/en-us/azure/backup/quick-backup-postgresql-flexible- https://learn.microsoft.com/en-us/azure/backup/quick-backup-postgresql-flexible-server-terraform,With Terraform,Quickstart - Configure backup for Azure Database for PostgreSQL - Flexible Server using a Terraform template - Azure Backup,,Learn how to configure backup for Azure Database for PostgreSQL - Flexible Server with a Terraform template.,"This quickstart describes how to configure backup for the Azure Database for PostgreSQL - Flexible Server using the Terraform template. Azure Backupallows you to back up your Azure PostgreSQL - Flexible servers using multiple clients, such as Azure portal, PowerShell, CLI, Azure Resource Manager, Bicep, and so on.",2026-01-20T08:00:00.000Z,quickstart,,0.35,False,"Terraform quickstart for PostgreSQL Flexible Server backup; template-based tutorial, not focused on limits or config matrices.",unchanged https://learn.microsoft.com/en-us/azure/backup/quick-backup-vm-bicep-file,Back up a VM - Bicep file,Quickstart - Bicep file VM Backup - Azure Backup,,Learn how to back up your virtual machines with a Bicep file,"Azure Backupallows you to back up your Azure VM using multiple options - such as Azure portal, PowerShell, CLI, Azure Resource Manager, Bicep, and so on. 
This article describes how to back up an Azure VM with an Azure Bicep file and Azure PowerShell. This quickstart focuses on the process of deploying a Bicep file to create a Recovery Services vault. For more information on developing Bicep files, see the Bicep documentation and the template reference. Bicep is a language for declaratively deployin",2025-12-19T08:00:00.000Z,quickstart,,0.3,False,"Bicep quickstart for VM backup; example deployment, not a comprehensive config or limits reference.",unchanged https://learn.microsoft.com/en-us/azure/backup/quick-backup-vm-cli,Back up a VM - CLI,Quickstart - Back up a VM with Azure CLI - Azure Backup,,"In this Quickstart, learn how to create a Recovery Services vault, enable protection on a VM, and create the initial recovery point with Azure CLI.","The Azure CLI is used to create and manage Azure resources from the command line or in scripts. You can protect your data by taking backups at regular intervals. Azure Backup creates recovery points that can be stored in geo-redundant recovery vaults. This article details how to back up a virtual machine (VM) in Azure with the Azure CLI. You can also perform these steps with Azure PowerShell or in the Azure portal. This quickstart enables backup on an existing Azure VM. If you need to create a VM, ",2025-12-19T08:00:00.000Z,quickstart,,0.3,False,"Quickstart using Azure CLI to back up a VM; focuses on example commands, not exhaustive configuration or limits.",unchanged -https://learn.microsoft.com/en-us/azure/backup/quick-backup-vm-portal,Back up a VM - Azure portal,Quickstart - Back up a VM with the Azure portal by using Azure Backup - Azure Backup,,"In this Quickstart, learn how to create a Recovery Services vault, enable protection on an Azure VM, and back up the VM, with the Azure portal.","This quickstart describes how to enable backup on an existing Azure VM by using the Azure portal. If you need to create a VM, you can create a VM with the Azure portal.
Azure backups can be created through the Azure portal. This method provides a browser-based user interface to create and configure Azure backups and all related resources. You can protect your data by taking backups at regular intervals. Azure Backup creates recovery points that can be stored in geo-redundant recovery vaults. This",2026-03-02T12:12:00.000Z,quickstart,,0.2,False,"Quickstart tutorial focused on using the Azure portal to back up a VM; primarily step-by-step UI guidance without detailed limits, configuration parameter tables, error codes, or decision matrices. Does not meet the thresholds for any expert-knowledge sub-skill type.",unchanged +https://learn.microsoft.com/en-us/azure/backup/quick-backup-vm-portal,Back up a VM - Azure portal,Quickstart - Back up a VM with the Azure portal by using Azure Backup - Azure Backup,,"In this Quickstart, learn how to create a Recovery Services vault, enable protection on an Azure VM, and back up the VM, with the Azure portal.","This quickstart describes how to enable backup on an existing Azure VM by using the Azure portal. If you need to create a VM, you can create a VM with the Azure portal. Azure backups can be created through the Azure portal. This method provides a browser-based user interface to create and configure Azure backups and all related resources. You can protect your data by taking backups at regular intervals. Azure Backup creates recovery points that can be stored in geo-redundant recovery vaults.
This",2026-03-26T08:00:00.000Z,quickstart,,0.1,False,"Quickstart for backing up a single VM via the portal; primarily a step-by-step tutorial without detailed limits, configuration matrices, or product-specific troubleshooting content.",updated https://learn.microsoft.com/en-us/azure/backup/quick-backup-vm-powershell,Back up a VM - PowerShell,Quickstart - Back up a VM with PowerShell - Azure Backup,,"In this Quickstart, learn how to back up your Azure virtual machines with the Azure PowerShell module.","TheAzure PowerShell AZmodule is used to create and manage Azure resources from the command line or in scripts. Azure Backupbacks up on-premises machines and apps, and Azure VMs. This article shows you how to back up an Azure VM with the AZ module. Alternatively, you can back up a VM using theAzure CLI, or in theAzure portal. This quickstart enables backup on an existing Azure VM. If you need to create a VM, you cancreate a VM with Azure PowerShell. This quickstart requires the Azure PowerShell A",2025-12-19T08:00:00.000Z,quickstart,,0.3,False,"Quickstart using PowerShell to back up a VM; primarily procedural, not a reference of parameters or quotas.",unchanged https://learn.microsoft.com/en-us/azure/backup/quick-backup-vm-template,Back up a VM - ARM template,Quickstart - Resource Manager template VM Backup - Azure Backup,,Learn how to back up your virtual machines with Azure Resource Manager template,"Azure Backupbacks up on-premises machines and apps, and Azure VMs. This article shows you how to back up an Azure VM with an Azure Resource Manager template (ARM template) and Azure PowerShell. This quickstart focuses on the process of deploying an ARM template to create a Recovery Services vault. For more information on developing ARM templates, see theAzure Resource Manager documentationand thetemplate reference. 
An Azure Resource Manager template is a JavaScript Object Notation (JSON) file that",2025-12-19T08:00:00.000Z,quickstart,,0.3,False,"ARM template quickstart for VM backup; shows one template, not a full configuration matrix or quotas.",unchanged https://learn.microsoft.com/en-us/azure/backup/quick-backup-vm-terraform,Back up a VM - Terraform,Quickstart: Back up a virtual machine in Azure with Terraform - Azure Backup,,"In this quickstart, you learn how to configure Azure Backup to run a backup on demand by creating and configuring an Azure Windows virtual machine, virtual network, subnet, public IP, network security","In this quickstart, you create an Azure Windows virtual machine (VM) and associated resources using Terraform. An Azure Windows VM is a scalable computing resource that Azure provides. It's an on-demand, virtualized Windows server in the Azure cloud. You can use it to deploy, test, and run applications, among other things. In addition to the VM, this code also creates a virtual network, subnet, public IP, network security group, network interface, storage account, Azure Backup recovery services ",2025-04-03T22:05:00.000Z,quickstart,,0.3,False,"Terraform quickstart creating a VM and backup; tutorial-style, not focused on limits or detailed configuration tables.",unchanged
This tutorial describes how to deploy an Azure Backup Recovery Services vault to back up multiple Azure VMs using PowerShell. Before you back up (or protect) a virtual machine, complete the prerequisites to prepare your environment for protecting your VMs. Important This ",2025-04-30T11:13:00.000Z,tutorial,,0.2,False,"PowerShell tutorial for backing up multiple VMs; focuses on basic deployment steps rather than expert configuration, limits, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/backup/tutorial-backup-restore-files-windows-server,Restore files to Windows Server,Tutorial - Recover files from Azure to a Windows Server by using Azure Backup - Azure Backup,Recover Windows Server files from Azure using MARS,Learn how to use the Microsoft Azure Recovery Services Agent (MARS) agent to recover items from Azure to a Windows Server.,This tutorial describes how to use the Microsoft Azure Recovery Services (MARS) agent to recover files from Azure to a Windows Server. Azure Backup allows you to restore individual items from Windows Server backups. You can recover accidentally deleted files seamlessly and instantly.,2025-04-30T11:13:00.000Z,tutorial,configuration,0.6,True,Describes using MARS agent to recover individual items; likely includes product-specific restore configuration steps and options.,unchanged https://learn.microsoft.com/en-us/azure/backup/tutorial-backup-sap-hana-db,Back up SAP HANA databases using the Azure portal,Tutorial - Back up SAP HANA databases in Azure VMs - Azure Backup,,"In this tutorial, learn how to back up SAP HANA databases running on Azure VM to an Azure Backup Recovery Services vault.",This tutorial shows you how to back up SAP HANA databases running on Azure VMs to an Azure Backup Recovery Services vault.
In this article you'll learn how to: Here are all the scenarios that we currently support.,2026-01-29T08:00:00.000Z,tutorial,,0.1,False,"Tutorial for backing up SAP HANA databases in Azure VMs; although it mentions supported scenarios, it is framed as a how-to tutorial rather than a detailed support matrix with explicit numeric limits or configuration tables.",unchanged
-https://learn.microsoft.com/en-us/azure/backup/tutorial-backup-vm-at-scale,Back up multiple Azure VMs,Tutorial - Back Up Multiple Azure Virtual Machines by Using Azure Backup - Azure Backup,,"In this tutorial, learn how to create a Recovery Services vault, define a backup policy, and simultaneously back up multiple virtual machines.","This tutorial describes how to back up multiple virtual machines (VMs) by using the Azure portal. Azure stores backup data in a Recovery Services vault, which is accessible from the Settings menu of most services. This integration simplifies the backup process. However, managing each database or virtual machine separately can be tedious. To streamline backups for multiple virtual machines (for department or location), you can create a backup policy and apply it to the relevant machines. In this tu",2026-01-28T18:20:00.000Z,tutorial,,0.2,False,"Tutorial for backing up multiple VMs via portal; no indication of detailed limits, config parameter tables, or troubleshooting content.",unchanged
+https://learn.microsoft.com/en-us/azure/backup/tutorial-backup-vm-at-scale,Back up multiple Azure VMs,Tutorial - Back Up Multiple Azure Virtual Machines by Using Azure Backup - Azure Backup,,"In this tutorial, learn how to create a Recovery Services vault, define a backup policy, and simultaneously back up multiple virtual machines.","This tutorial describes how to back up multiple virtual machines (VMs) by using the Azure portal. Azure stores backup data in a Recovery Services vault, which is accessible from the Settings menu of most services. 
This integration simplifies the backup process. However, managing each database or virtual machine separately can be tedious. To streamline backups for multiple virtual machines (for department or location), you can create a backup policy and apply it to the relevant machines. In this tu",2026-03-26T08:00:00.000Z,tutorial,,0.1,False,"Tutorial for backing up multiple VMs at scale; focuses on how-to steps and basic policy creation, not on detailed limits, configuration parameter tables, or error-code-based troubleshooting.",updated https://learn.microsoft.com/en-us/azure/backup/tutorial-backup-windows-server-to-azure,Back up Windows Server,Tutorial - Back up Windows Server to Azure - Azure Backup,Configure MARS agent backups for Windows Server to Azure,This tutorial details backing up on-premises Windows Servers to a Recovery Services vault.,"This tutorial describes how to back up on-premises Windows Server to Azure using the Microsoft Azure Recovery Services (MARS) agent. Azure Backup protects Windows Server from corruption, cyberattacks, and disasters. 
This solution uses the lightweight Microsoft Azure Recovery Services (MARS) agent, which is installed on the server to protect files, folders, and system configuration data through Windows Server System State backups.",2025-12-01T18:18:00.000Z,tutorial,configuration,0.6,True,"Uses MARS agent to protect files, folders, and system state; typically involves agent-specific configuration options and schedules that are product-specific.",unchanged https://learn.microsoft.com/en-us/azure/backup/tutorial-configure-backup-aks,Configure item level backup of an AKS cluster,Tutorial: Configure item-level backup for an Azure Kubernetes Service cluster - Azure Backup,Configure AKS item-level backups with Azure Backup,"Learn how to configure backup for an Azure Kubernetes Service (AKS) cluster, and use Azure Backup to back up specific items from the cluster.","This tutorial describes how to configure backup for an Azure Kubernetes Service (AKS) cluster, and then use the Azure Backup configuration to back up specific items in the cluster. You also learn how to use backup hooks in a backup configuration to achieve application-consistent backups for databases that are deployed in an AKS cluster. You can use Azure Backup to back up AKS clusters by using the Backup extension. The extension must be installed in the cluster. 
An AKS cluster backup includes cl",2026-01-09T12:11:00.000Z,tutorial,configuration,0.6,True,Describes configuring backup for AKS clusters including backup hooks for app-consistent backups; implies product-specific configuration options and parameters for hooks and backup extension.,unchanged https://learn.microsoft.com/en-us/azure/backup/tutorial-configure-sap-hana-database-instance-snapshot-backup,Configure SAP HANA database instance snapshot backup,Tutorial - Configure SAP HANA database instance snapshot backup - Azure Backup,Configure SAP HANA instance snapshot backups with Azure CLI,"In this tutorial, learn how to configure the SAP HANA database instance snapshot backup and run an on-demand backup.","This tutorial describes how to configure backup for SAP HANA database instance snapshot and run an on-demand backup using Azure CLI. Azure Backup now performs an SAP HANA storage snapshot-based backup of an entire database instance. Backup combines an Azure Managed Disk full or incremental snapshot with HANA snapshot commands to provide instant HANA backup and restore. 
For more information on the supported scenarios, see the support matrix for SAP HANA.",2025-11-18T18:43:00.000Z,tutorial,configuration,0.65,True,Describes configuring snapshot-based backups combining disk snapshots and HANA commands; involves product-specific backup configuration parameters.,unchanged diff --git a/products/azure-backup/report.md b/products/azure-backup/report.md index d9d01840..e32af290 100644 --- a/products/azure-backup/report.md +++ b/products/azure-backup/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: configuration: 'Configuring Azure Backup and restore: vaults, policies, agents, scripts/APIs, monitoring/alerts, and workload-specific setup for VMs, AKS, Files, @@ -10,37 +10,33 @@ category_descriptions: decision-making: Guidance on estimating Azure Backup costs, checking supported VM image SKUs for backup policies, and deciding how to migrate classic backup alerts to Azure Monitor alerts. - limits-quotas: Backup support matrices, regional availability, limits, retention, - and performance/behavior details for Azure Backup across VMs, databases, disks, - files, blobs, SAP, and monitoring/reporting. - troubleshooting: Diagnosing and fixing Azure Backup errors across VMs, databases - (SQL, PostgreSQL, MySQL, SAP), files, disks, AKS, MARS/MABS/DPM, vault/agent issues, - and slow or failed backups/restores. + limits-quotas: Backup support matrices, regional support, limits, retention, performance, + and behavioral constraints for Azure Backup across VMs, databases, disks, files, + blobs, AKS, SAP, and monitoring. + troubleshooting: Diagnosing and fixing Azure Backup failures and errors across VMs, + disks, databases (SQL, PostgreSQL, MySQL, SAP), files, AKS, MABS/DPM, agents, + vaults, and restore/recovery issues. security: Security, encryption, access control, soft delete, immutability, private endpoints, MUA/Resource Guard, and Azure Policy for protecting and restoring Azure Backup data. 
integrations: End-to-end scripting patterns for configuring, running, monitoring, and restoring Azure Backup across VMs, SQL, PostgreSQL, SAP HANA, Files, Blobs, Disks, and on-premises using CLI, PowerShell, REST, and Logic Apps. - best-practices: Best practices for Azure Backup Server, DPM, SQL (incl. Always On), - and Azure VMs, covering safe configuration, TRIM handling, and product-specific - backup and recovery guidance. + best-practices: Best practices for configuring and optimizing Azure Backup/MABS/DPM + for Hyper-V VMs, SQL (incl. Always On), Exchange, TRIM-aware backups, and safe + restore/recovery across vaults. deployment: 'MABS v3/v4 deployment details: supported workload/protection matrices and how to automate unattended/silent installation of Azure Backup Server v4.' skill_description: Expert knowledge for Azure Backup development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when - backing up Azure VMs, SQL/SAP HANA/PostgreSQL, Files/Blobs/Disks, configuring vaults/policies, - or using CLI/PowerShell/REST, and other Azure Backup related development tasks. - Not for Azure Site Recovery (use azure-site-recovery), Azure Virtual Machines (use - azure-virtual-machines), Azure Blob Storage (use azure-blob-storage), Azure Files - (use azure-files). -use_when: Use when backing up Azure VMs, SQL/SAP HANA/PostgreSQL, Files/Blobs/Disks, - configuring vaults/policies, or using CLI/PowerShell/REST, and other Azure Backup - related development tasks. -confusable_not_for: Not for Azure Site Recovery (use azure-site-recovery), Azure Virtual - Machines (use azure-virtual-machines), Azure Blob Storage (use azure-blob-storage), - Azure Files (use azure-files). + protecting Azure VMs, SQL/SAP HANA/PostgreSQL, Files/Blobs/Disks, AKS, or MABS/DPM + via CLI/PowerShell/REST, and other Azure Backup related development tasks. 
Not for + Azure Site Recovery (use azure-site-recovery). +use_when: Use when protecting Azure VMs, SQL/SAP HANA/PostgreSQL, Files/Blobs/Disks, + AKS, or MABS/DPM via CLI/PowerShell/REST, and other Azure Backup related development + tasks. +confusable_not_for: Not for Azure Site Recovery (use azure-site-recovery). --- # Azure Backup Crawl Report @@ -49,13 +45,13 @@ confusable_not_for: Not for Azure Site Recovery (use azure-site-recovery), Azure - **Total Pages**: 393 - **Fetched**: 393 - **Fetch Failed**: 0 -- **Classified**: 230 -- **Unclassified**: 163 +- **Classified**: 229 +- **Unclassified**: 164 ### Incremental Update -- **New Pages**: 1 -- **Updated Pages**: 4 -- **Unchanged**: 388 +- **New Pages**: 0 +- **Updated Pages**: 15 +- **Unchanged**: 378 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-backup/azure-backup.csv` @@ -64,32 +60,50 @@ confusable_not_for: Not for Azure Site Recovery (use azure-site-recovery), Azure | Type | Count | Percentage | |------|-------|------------| | architecture-patterns | 1 | 0.3% | -| best-practices | 7 | 1.8% | -| configuration | 74 | 18.8% | +| best-practices | 8 | 2.0% | +| configuration | 71 | 18.1% | | decision-making | 3 | 0.8% | | deployment | 4 | 1.0% | | integrations | 52 | 13.2% | -| limits-quotas | 25 | 6.4% | +| limits-quotas | 26 | 6.6% | | security | 37 | 9.4% | | troubleshooting | 27 | 6.9% | -| *(Unclassified)* | 163 | 41.5% | +| *(Unclassified)* | 164 | 41.7% | ## Changes -### New Pages - -- [Granular billing](https://learn.microsoft.com/en-us/azure/backup/backup-azure-granular-billing) - ### Updated Pages -- [Turn on/off telemetry settings](https://learn.microsoft.com/en-us/azure/backup/manage-telemetry) - - Updated: 2025-07-15T08:00:00.000Z → 2026-04-13T08:00:00.000Z -- [Configure notifications](https://learn.microsoft.com/en-us/azure/backup/backup-azure-monitor-alerts-notification) - - Updated: 2025-11-26T08:00:00.000Z → 2026-04-15T11:11:00.000Z -- 
[Backup](https://learn.microsoft.com/en-us/azure/backup/azure-kubernetes-service-cluster-backup-using-cli) - - Updated: 2025-05-29T08:00:00.000Z → 2026-04-15T22:11:00.000Z -- [Add storage](https://learn.microsoft.com/en-us/azure/backup/backup-mabs-add-storage) - - Updated: 2025-07-15T08:00:00.000Z → 2026-04-13T08:00:00.000Z +- [Support matrix](https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix-iaas) + - Updated: 2026-01-28T08:00:00.000Z → 2026-04-24T11:19:00.000Z +- [Back up a VM - Azure portal](https://learn.microsoft.com/en-us/azure/backup/quick-backup-vm-portal) + - Updated: 2026-03-02T12:12:00.000Z → 2026-03-26T08:00:00.000Z +- [Back up multiple Azure VMs](https://learn.microsoft.com/en-us/azure/backup/tutorial-backup-vm-at-scale) + - Updated: 2026-01-28T18:20:00.000Z → 2026-03-26T08:00:00.000Z +- [Set up a vault and enable backup for Azure VMs](https://learn.microsoft.com/en-us/azure/backup/backup-azure-arm-vms-prepare) + - Updated: 2026-03-02T12:12:00.000Z → 2026-03-25T08:00:00.000Z +- [Restore Azure VMs in the portal](https://learn.microsoft.com/en-us/azure/backup/backup-azure-arm-restore-vms) + - Updated: 2026-01-23T08:00:00.000Z → 2026-04-24T11:19:00.000Z +- [Back up Hyper-V virtual machines](https://learn.microsoft.com/en-us/azure/backup/back-up-hyper-v-virtual-machines-mabs) + - Updated: 2025-07-22T08:00:00.000Z → 2026-04-24T08:00:00.000Z +- [Back up Azure Local virtual machines](https://learn.microsoft.com/en-us/azure/backup/back-up-azure-stack-hyperconverged-infrastructure-virtual-machines) + - Updated: 2025-10-15T05:40:00.000Z → 2026-04-24T08:00:00.000Z +- [Protect system state and bare metal recovery](https://learn.microsoft.com/en-us/azure/backup/backup-mabs-system-state-and-bmr) + - Updated: 2025-12-05T08:00:00.000Z → 2026-04-24T08:00:00.000Z +- [FAQ-MARS agent](https://learn.microsoft.com/en-us/azure/backup/backup-azure-file-folder-backup-faq) + - Updated: 2025-07-18T17:15:00Z → 2026-04-23T17:12:00Z +- [Back up Windows Server 
files and folders](https://learn.microsoft.com/en-us/azure/backup/backup-windows-with-mars-agent) + - Updated: 2025-06-18T08:00:00.000Z → 2026-04-23T17:12:00.000Z +- [Restore Windows Server System State](https://learn.microsoft.com/en-us/azure/backup/backup-azure-restore-system-state) + - Updated: 2025-09-09T08:00:00.000Z → 2026-04-23T08:00:00.000Z +- [What's New in MABS](https://learn.microsoft.com/en-us/azure/backup/backup-mabs-whats-new-mabs) + - Updated: 2025-07-25T08:00:00.000Z → 2026-04-24T08:00:00.000Z +- [Release notes MABS](https://learn.microsoft.com/en-us/azure/backup/backup-mabs-release-notes-v3) + - Updated: 2025-07-16T11:12:00.000Z → 2026-04-24T06:15:00.000Z +- [DPM/Azure Backup Server (MABS) support matrix](https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix-mabs-dpm) + - Updated: 2025-09-11T08:00:00.000Z → 2026-04-24T08:00:00.000Z +- [Azure Backup support matrix](https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix) + - Updated: 2026-01-29T08:00:00.000Z → 2026-03-25T08:00:00.000Z ## Classified Pages @@ -115,26 +129,26 @@ confusable_not_for: Not for Azure Site Recovery (use azure-site-recovery), Azure | [SAP ASE database backup in Azure VMs](https://learn.microsoft.com/en-us/azure/backup/troubleshoot-sap-ase-sybase-database-backup) | troubleshooting | 0.90 | SAP ASE-specific error codes and Azure Backup–specific resolutions. | | [Support matrix](https://learn.microsoft.com/en-us/azure/backup/backup-azure-database-postgresql-flex-support-matrix) | limits-quotas | 0.90 | Support matrix summarizing supported regions, scenarios, and limitations will contain concrete numeric and capability limits, which are expert limits/quotas details. 
| | [Support matrix](https://learn.microsoft.com/en-us/azure/backup/backup-azure-database-postgresql-support-matrix) | limits-quotas | 0.90 | Support matrix for PostgreSQL server backup with regions, scenarios, and limitations will include concrete service limits and capability constraints, fitting the limits-quotas category. | -| [Support matrix](https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix-iaas) | limits-quotas | 0.90 | Support matrix pages enumerate supported/unsupported configurations and explicit limitations (OS versions, disk types, max sizes, scenarios) in table form; these are detailed constraints and limits that qualify as expert knowledge. | | [Support matrix](https://learn.microsoft.com/en-us/azure/backup/blob-backup-support-matrix) | limits-quotas | 0.90 | Explicitly described as a support matrix summarizing regional availability, supported scenarios, and limitations for operational and vaulted blob backups. Such matrices typically contain product-specific constraints and numeric limits that fit the limits-quotas category. | | [Data recovery from Microsoft Azure Backup Server](https://learn.microsoft.com/en-us/azure/backup/backup-azure-alternate-dpm-server-troubleshoot) | troubleshooting | 0.88 | MABS data recovery–specific errors and recommended troubleshooting actions. | | [Support matrix](https://learn.microsoft.com/en-us/azure/backup/sap-hana-backup-support-matrix) | limits-quotas | 0.88 | A SAP HANA backup 'support matrix' explicitly documents supported scenarios, OS distributions, and limitations (including a specific minimum log backup frequency of 15 minutes). These are concrete product limits and constraints that qualify as expert knowledge. | | [System Center DPM](https://learn.microsoft.com/en-us/azure/backup/backup-azure-scdpm-troubleshooting) | troubleshooting | 0.88 | DPM-specific issues and fixes when used with Azure Backup, including environment-specific guidance. 
| +| [Azure Backup support matrix](https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix) | limits-quotas | 0.86 | A 'support matrix' for Azure Backup scenarios and deployments will list which workloads, OS versions, and configurations are supported along with explicit limitations and constraints. These matrices typically include exact numerical limits (for example, maximum number of items per vault, supported sizes, retention constraints) and scenario-specific support details that are product- and version-specific, qualifying as expert knowledge. This aligns best with the limits-quotas category because the primary purpose is to enumerate support boundaries and limitations rather than provide conceptual guidance or troubleshooting. | | [FAQ-Azure VM backup](https://learn.microsoft.com/en-us/azure/backup/backup-azure-vm-backup-faq) | limits-quotas | 0.86 | FAQ for Azure VM Backup typically includes precise product behaviors and constraints such as maximum number of VMs per vault, supported backup frequencies, retention constraints, supported disk sizes, and other numeric or tightly specified limits that are not obvious from general knowledge. These are expert, product-specific details that match the limits-quotas category better than generic FAQ or troubleshooting. | | [Monitoring and alerts](https://learn.microsoft.com/en-us/azure/backup/backup-azure-monitor-troubleshoot) | troubleshooting | 0.86 | Monitoring-related error messages for Azure Backup and their resolutions. | | [Support matrix](https://learn.microsoft.com/en-us/azure/backup/azure-data-lake-storage-backup-support-matrix) | limits-quotas | 0.86 | Support matrix for Azure Data Lake Storage vaulted backup, summarizing regional availability, supported scenarios, and limitations. Such pages typically include detailed product-specific constraints and limits, which qualify as expert knowledge under limits-quotas. 
| | [Support matrix](https://learn.microsoft.com/en-us/azure/backup/azure-file-share-support-matrix) | limits-quotas | 0.86 | A support matrix for Azure Files backup typically enumerates supported/unsupported configurations and explicit limitations (for example, supported storage tiers, maximum number of shares, backup frequency constraints). This is product-specific expert knowledge with concrete limits that an LLM is unlikely to know from training. | | [Support matrix](https://learn.microsoft.com/en-us/azure/backup/azure-kubernetes-service-cluster-backup-support-matrix) | limits-quotas | 0.86 | A support matrix for AKS backup will list region availability, supported scenarios, and explicit limitations in tabular form. These are concrete, product-specific constraints and limits that qualify as expert knowledge. | +| [Support matrix](https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix-iaas) | limits-quotas | 0.86 | A 'support matrix' for backups typically enumerates detailed support constraints and limitations (for example, supported/unsupported VM types, disk sizes, OS versions, and backup feature limits). These are product-specific limits and constraints that change over time and are not reliably known from training data, fitting the limits-quotas category. | | [Support matrix](https://learn.microsoft.com/en-us/azure/backup/disk-backup-support-matrix) | limits-quotas | 0.86 | A 'support matrix' page for Azure Disk Backup will list detailed support constraints such as supported regions, disk types, scenarios, and explicit limitations, typically in tables with specific values. These product-specific constraints and limits qualify as expert knowledge under limits-quotas. 
| | [Support matrix](https://learn.microsoft.com/en-us/azure/backup/sql-support-matrix) | limits-quotas | 0.86 | A 'support matrix' for Azure Backup and SQL Server in Azure VMs will list supported/unsupported versions, configurations, and especially concrete limitations (for example, max database sizes, supported storage types, region-specific constraints, and other numeric or matrix-style limits). This is product- and SKU-specific expert knowledge that an LLM would not reliably know from training, and it aligns best with the limits-quotas category. | | [System State](https://learn.microsoft.com/en-us/azure/backup/backup-azure-system-state-troubleshoot) | troubleshooting | 0.86 | System State backup problems on Windows servers with Azure Backup and targeted fixes. | -| [Azure Backup support matrix](https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix) | limits-quotas | 0.85 | Support matrix summarizing support settings and limitations across scenarios; typically includes numeric limits, supported versions, and constraints not known generically. | | [Azure VM](https://learn.microsoft.com/en-us/azure/backup/backup-azure-vms-troubleshoot) | troubleshooting | 0.85 | Troubleshooting guide for VM backup/restore; typically includes error codes, causes, and stepwise resolutions unique to Azure Backup. | | [Azure VM file recovery](https://learn.microsoft.com/en-us/azure/backup/backup-azure-vm-file-recovery-troubleshoot) | troubleshooting | 0.85 | Focused on file recovery problems; likely lists specific symptoms and resolutions for the mount/recovery process unique to Azure Backup. | | [Azure role-based access control](https://learn.microsoft.com/en-us/azure/backup/backup-rbac-rs-vault) | security | 0.85 | Lists Azure Backup–specific RBAC roles and allowed actions; includes scope and operation mappings unique to this product. 
| | [Backup vault](https://learn.microsoft.com/en-us/azure/backup/encryption-at-rest-with-cmk-for-backup-vault) | security | 0.85 | Similar to 30 but for Backup vaults; details CMK configuration, Key Vault integration, and behavior specific to Backup vault encryption. | | [Configure Multi-user authorization using Resource Guard](https://learn.microsoft.com/en-us/azure/backup/multi-user-authorization) | security | 0.85 | Explains MUA model, Resource Guard configuration, and PIM-based approval flows; these are product-specific security patterns and settings. | -| [DPM/Azure Backup Server (MABS) support matrix](https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix-mabs-dpm) | limits-quotas | 0.85 | Support matrix for MABS and DPM; summarizes supported scenarios and limitations with detailed tables and constraints. | +| [DPM/Azure Backup Server (MABS) support matrix](https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix-mabs-dpm) | limits-quotas | 0.85 | Support matrix pages for MABS/DPM and Azure Backup enumerate supported/unsupported workloads, OS versions, and often concrete limits (max data sizes, protected instances, scenarios) in tabular form. These are product-specific support and limitation details that function as limits/quotas beyond generic knowledge. | | [Recovery Services vault](https://learn.microsoft.com/en-us/azure/backup/encryption-at-rest-with-cmk) | security | 0.85 | Explains CMK setup, Key Vault usage, and key hierarchy (DEK/KEK) for Recovery Services vaults; includes specific security configuration parameters. | | [Support matrix](https://learn.microsoft.com/en-us/azure/backup/azure-elastic-san-backup-support-matrix) | limits-quotas | 0.85 | Support matrix articles typically list regions, supported scenarios, and explicit limitations in tables with concrete values; this is expert knowledge about service capabilities and constraints. 
| | [Support matrix](https://learn.microsoft.com/en-us/azure/backup/backup-azure-mysql-flexible-server-support-matrix) | limits-quotas | 0.85 | Support matrix for MySQL Flexible Server retention summarizes supported scenarios and limitations; such matrices typically include explicit constraints and conditions, which are expert limits/quotas knowledge. | @@ -200,7 +214,6 @@ confusable_not_for: Not for Azure Site Recovery (use azure-site-recovery), Azure | [Use Diagnostic Settings with Log Analytics](https://learn.microsoft.com/en-us/azure/backup/backup-azure-diagnostic-events) | configuration | 0.75 | Explains diagnostics settings for Recovery Services and Backup vaults, including event categories and destinations (storage, event hub, Log Analytics); these are product-specific configuration options. | | [Audit and Enforce backup operations for Azure Kubernetes Service clusters using Azure policy](https://learn.microsoft.com/en-us/azure/backup/azure-kubernetes-service-cluster-backup-policy) | security | 0.70 | Uses specific built-in Azure Policy definitions for AKS backup enforcement; policy names and scopes are product-specific security/compliance configuration. | | [Azure Policy built-ins](https://learn.microsoft.com/en-us/azure/backup/policy-reference) | security | 0.70 | Lists Azure Backup–specific built-in policy definitions with names and scopes, which are security/governance configuration artifacts. | -| [Back up Windows Server files and folders](https://learn.microsoft.com/en-us/azure/backup/backup-windows-with-mars-agent) | configuration | 0.70 | Describes how to configure backup schedules, retention, and included items in the MARS agent UI; these are concrete product-specific configuration parameters and patterns. 
| | [Back up an Azure VM from VM settings](https://learn.microsoft.com/en-us/azure/backup/backup-azure-vms-first-look-arm) | configuration | 0.70 | Details the VM-level backup configuration flow, including policy and vault selection, which is product-specific. | | [Backup](https://learn.microsoft.com/en-us/azure/backup/azure-kubernetes-service-cluster-backup) | configuration | 0.70 | How-to article for configuring AKS backups via portal/PowerShell and Backup extension; likely includes product-specific settings (Backup vault, extension parameters, policy options) beyond generic backup concepts. | | [Backup](https://learn.microsoft.com/en-us/azure/backup/back-up-azure-database-postgresql-flex-backup-cli) | integrations | 0.70 | CLI article with specific commands and parameters to configure backups for PostgreSQL Flexible Server; product-specific integration details. | @@ -224,6 +237,7 @@ confusable_not_for: Not for Azure Site Recovery (use azure-site-recovery), Azure | [Delete](https://learn.microsoft.com/en-us/azure/backup/backup-azure-delete-vault) | configuration | 0.70 | Describes exact dependencies that must be removed and the sequence to delete a vault; these behaviors are specific to Azure Backup. | | [Disable Soft delete for File Share](https://learn.microsoft.com/en-us/azure/backup/scripts/disable-soft-delete-for-file-shares) | integrations | 0.70 | Shows ARM API-based script with specific resource/parameter usage to change soft delete settings for file shares. | | [Email Report](https://learn.microsoft.com/en-us/azure/backup/backup-reports-email) | integrations | 0.70 | Describes deploying a Logic App that queries Log Analytics workspaces for backup data; involves integration-specific parameters (workspace selection, query, schedule) between Backup, LA, and Logic Apps. 
| +| [FAQ-MARS agent](https://learn.microsoft.com/en-us/azure/backup/backup-azure-file-folder-backup-faq) | limits-quotas | 0.70 | FAQ pages for Azure Backup MARS agent typically document concrete product limits (max backup size, number of items, retention ranges, concurrency, supported OS versions) and time-related behaviors (scheduling, throttling). These are numeric, product-specific constraints that qualify as limits-quotas. | | [FAQ-Soft Delete](https://learn.microsoft.com/en-us/azure/backup/soft-delete-azure-backup-faq) | security | 0.70 | Soft delete is a concrete security feature; FAQs typically include exact behavior, retention windows, and required settings/permissions. These are product-specific security details that qualify as expert knowledge. | | [Find Vault for Storage Account PowerShell Script](https://learn.microsoft.com/en-us/azure/backup/scripts/backup-powershell-script-find-recovery-services-vault) | integrations | 0.70 | Provides a concrete PowerShell script using Azure Backup/Recovery Services cmdlets and parameters to locate the associated vault. | | [Install MARS agent](https://learn.microsoft.com/en-us/azure/backup/install-mars-agent) | configuration | 0.70 | MARS agent installation docs typically include product-specific configuration options (proxy, cache location, vault credentials) and sometimes registry/settings details, which qualify as configuration expert knowledge. | @@ -255,7 +269,7 @@ confusable_not_for: Not for Azure Site Recovery (use azure-site-recovery), Azure | [Pricing calculator](https://learn.microsoft.com/en-us/azure/backup/azure-backup-pricing) | decision-making | 0.70 | Guides cost estimation and budgeting for Azure Backup with workload-specific pricing considerations, supporting service/tier selection decisions. 
| | [Quickstart - Enable Multi-user authorization (MUA)](https://learn.microsoft.com/en-us/azure/backup/enable-multi-user-authorization-quickstart) | security | 0.70 | Quickstart for configuring MUA on Recovery Services and Backup vaults; contains concrete security configuration steps and settings specific to Azure Backup. | | [Recover data from Azure Backup Server](https://learn.microsoft.com/en-us/azure/backup/backup-azure-alternate-dpm-server) | best-practices | 0.70 | Describes cross-server recovery behavior and a specific TRIM-related edge case where guest OS TRIM resets incremental block tracking and forces full backups; this is a product-specific gotcha not generally known. | -| [Release notes MABS](https://learn.microsoft.com/en-us/azure/backup/backup-mabs-release-notes-v3) | troubleshooting | 0.70 | Release notes for MABS v3 with known issues and workarounds; these map specific symptoms to causes and resolutions, fitting troubleshooting. | +| [Release notes MABS](https://learn.microsoft.com/en-us/azure/backup/backup-mabs-release-notes-v3) | troubleshooting | 0.70 | Release notes for MABS v3 with known issues and workarounds usually list specific symptoms, error messages/codes, causes, and resolutions. This matches the troubleshooting pattern of symptom → cause → solution with product-specific details. | | [Resource Manager and Bicep files](https://learn.microsoft.com/en-us/azure/backup/backup-rm-template-samples) | configuration | 0.70 | Indexes ARM/Bicep templates for Recovery Services and Backup vaults; underlying resource types and properties (Microsoft.RecoveryServices, Microsoft.DataProtection) are configuration references. | | [Restore](https://learn.microsoft.com/en-us/azure/backup/azure-kubernetes-service-cluster-restore) | configuration | 0.70 | Restore-focused how-to for AKS with Azure Backup; likely documents restore configuration options, parameters, and constraints specific to AKS backup extension and Backup vault. 
| | [Restore](https://learn.microsoft.com/en-us/azure/backup/backup-azure-database-postgresql-flex-restore-cli) | integrations | 0.70 | Shows CLI commands and parameters for restoring Flexible Server backups; includes ALR-only behavior and example vault/resource group usage. | @@ -267,7 +281,6 @@ confusable_not_for: Not for Azure Site Recovery (use azure-site-recovery), Azure | [Restore](https://learn.microsoft.com/en-us/azure/backup/restore-managed-disks-cli) | integrations | 0.70 | Describes az dataprotection restore commands, preview extension requirements, and OLR limitations—concrete CLI integration behavior. | | [Restore](https://learn.microsoft.com/en-us/azure/backup/restore-postgresql-database-cli) | integrations | 0.70 | Provides CLI commands and parameters for restoring PostgreSQL backups; includes PaaS-specific ALR-only behavior. | | [Restore](https://learn.microsoft.com/en-us/azure/backup/restore-postgresql-database-ps) | integrations | 0.70 | Shows PowerShell cmdlets and parameters for restoring PostgreSQL backups; includes PaaS-specific behavior (no OLR, ALR only) which is product-specific. | -| [Restore Azure VMs in the portal](https://learn.microsoft.com/en-us/azure/backup/backup-azure-arm-restore-vms) | configuration | 0.70 | Portal-based restore steps, including Cross Region Restore options and parameters, are product-specific configuration flows. | | [Supported VM SKUs for Azure Policy](https://learn.microsoft.com/en-us/azure/backup/backup-azure-policy-supported-skus) | decision-making | 0.70 | Lists which VM publisher/offer/SKU combinations are supported by built-in Azure Backup policies; this is a detailed compatibility matrix used to decide which VMs can be auto-protected, fitting decision-making based on SKU support. 
| | [Transport Layer Security](https://learn.microsoft.com/en-us/azure/backup/transport-layer-security) | security | 0.70 | Article describes enabling TLS 1.2 for Azure Backup; typically includes registry/settings, minimum protocol versions, and product-specific configuration steps, which qualify as concrete security configuration details. | | [Turn on/off telemetry settings](https://learn.microsoft.com/en-us/azure/backup/manage-telemetry) | configuration | 0.70 | Page is about managing telemetry settings for Microsoft Azure Backup Server, which typically includes product-specific options (e.g., enabling/disabling diagnostics, possibly registry/setting names, version applicability like MABS V3 UR2 and later). This is configuration-focused expert knowledge rather than generic concepts, but not about limits, security roles, or deployment. | @@ -279,6 +292,7 @@ confusable_not_for: Not for Azure Site Recovery (use azure-site-recovery), Azure | [Using pre-backup and post-backup scripts](https://learn.microsoft.com/en-us/azure/backup/pre-backup-post-backup-scripts) | configuration | 0.70 | Describes ScriptingConfig.xml and how to specify scripts; involves specific config file and settings unique to MABS backup scripting. | | [Add storage](https://learn.microsoft.com/en-us/azure/backup/backup-mabs-add-storage) | configuration | 0.68 | The article describes product-specific configuration details for adding and using Modern Backup Storage in Microsoft Azure Backup Server, including required server versions (MABS V2+ and Windows Server 2016+), feature behavior, and how storage is attached and used. These are concrete, product-specific configuration requirements rather than generic concepts. 
| | [Azure Monitor alerts](https://learn.microsoft.com/en-us/azure/backup/backup-azure-monitoring-alerts) | configuration | 0.68 | The article goes beyond conceptual alerting and includes product-specific configuration details for Azure Backup alerts via Azure Monitor (for example, which alert rules to enable/disable, how to switch from classic alerts, and specific settings in the Azure portal). These are concrete, service-specific configuration steps and options rather than generic monitoring guidance, fitting the configuration sub-skill best. It does not focus on limits, troubleshooting error codes, or architecture decisions. | +| [Back up Hyper-V virtual machines](https://learn.microsoft.com/en-us/azure/backup/back-up-hyper-v-virtual-machines-mabs) | best-practices | 0.68 | Procedural article with product-specific behavioral details (for example, guest OS TRIM resetting incremental block tracking and forcing full backups). These are concrete, non-obvious gotchas unique to MABS + Hyper-V backup behavior, fitting best-practices rather than generic how-to. | | [Backup](https://learn.microsoft.com/en-us/azure/backup/azure-kubernetes-service-cluster-backup-using-cli) | configuration | 0.68 | The page is a CLI-focused how-to for configuring AKS backups using Azure Backup and a backup extension. Such articles typically include product-specific CLI commands, required parameters, resource names, and configuration options (for the backup vault, backup policy, backup extension, and AKS cluster integration). These concrete configuration details and parameter values are expert knowledge beyond generic LLM training. It is not primarily about limits, troubleshooting, or deployment matrices, but about how to configure the backup integration and policies. 
| | [HANA System Replication database](https://learn.microsoft.com/en-us/azure/backup/sap-hana-database-with-hana-system-replication-backup) | configuration | 0.68 | The article describes product-specific steps and settings to back up SAP HANA databases with HANA System Replication on Azure VMs using Azure Backup. It includes Azure Backup–specific configuration flows (e.g., how to register/protect primary vs secondary, switching protection from standalone to HSR, vault and workload configuration) that are unique to this integration scenario and not generic tutorial content. While it’s procedural, the value lies in the concrete, product-specific configuration behavior for HSR-enabled HANA on Azure. | | [Active Directory domain controllers backup/restore](https://learn.microsoft.com/en-us/azure/backup/active-directory-backup-restore) | configuration | 0.65 | The article describes how to back up and restore Active Directory domain controllers using Azure Backup, including recommended procedures. Such how-to guidance for a specific workload usually includes product-specific configuration steps (backup policies, protection settings, restore options) and workload-specific nuances that constitute expert knowledge beyond generic backup or AD restore concepts. | @@ -308,7 +322,6 @@ confusable_not_for: Not for Azure Site Recovery (use azure-site-recovery), Azure | [Re-register MABS using public access](https://learn.microsoft.com/en-us/azure/backup/register-public-access-vault-backup-server) | security | 0.65 | Guides switching from private endpoints to public access; involves security-related configuration of vault connectivity. | | [Restore](https://learn.microsoft.com/en-us/azure/backup/restore-azure-file-share-rest-api) | integrations | 0.65 | REST API-based restore article; expected to list operation names, request bodies, and parameters specific to Azure Backup, which are product-specific integration details. 
| | [Restore](https://learn.microsoft.com/en-us/azure/backup/restore-managed-disks-ps) | integrations | 0.65 | Uses Azure PowerShell cmdlets and parameters specific to disk restore scenarios, including constraints like no OLR support—product-specific integration behavior. | -| [Restore Windows Server System State](https://learn.microsoft.com/en-us/azure/backup/backup-azure-restore-system-state) | configuration | 0.65 | System state restore involves specific restore modes, paths, and post-restore steps that are product-specific configuration/operation details. | | [Review backup estate](https://learn.microsoft.com/en-us/azure/backup/backup-azure-monitoring-built-in-monitor) | configuration | 0.65 | Describes specific monitoring and notification capabilities in the portal (alerts, jobs, security, usage) for Azure Backup; likely includes product-specific monitoring settings and options. | | [Upgrade MARS agent](https://learn.microsoft.com/en-us/azure/backup/upgrade-mars-agent) | configuration | 0.65 | Upgrade guidance for MARS agent usually includes version-specific requirements, supported OS versions, and configuration considerations, which are product-specific configuration details. | | [Agentless multi-disk crash-consistent VM backup](https://learn.microsoft.com/en-us/azure/backup/backup-azure-vms-agentless-multi-disk-crash-consistent-overview) | configuration | 0.64 | Describes Enhanced VM backup policy and configuring consistency type (application-consistent vs crash-consistent) and retry behavior. These are specific configuration options and policy settings unique to Azure Backup for VMs. 
| @@ -330,14 +343,12 @@ confusable_not_for: Not for Azure Site Recovery (use azure-site-recovery), Azure | TOC Title | Confidence | Reason | |-----------|------------|--------| -| [Back up Hyper-V virtual machines](https://learn.microsoft.com/en-us/azure/backup/back-up-hyper-v-virtual-machines-mabs) | 0.45 | Hyper-V VM backup/restore with MABS; includes a note about trim behavior but overall is a procedural backup/restore guide, not a structured best-practices or config reference. | | [Backup](https://learn.microsoft.com/en-us/azure/backup/backup-azure-database-postgresql) | 0.45 | Portal-based backup configuration tutorial; references supported configurations and limitations but those details are in separate pages. | | [Backup](https://learn.microsoft.com/en-us/azure/backup/backup-azure-mysql-flexible-server) | 0.45 | Tutorial for backing up MySQL Flexible Server; preview is paused and summary doesn’t show detailed config tables or limits. | | [Configure backup](https://learn.microsoft.com/en-us/azure/backup/backup-azure-database-postgresql-flex) | 0.45 | Portal configuration tutorial for PostgreSQL Flexible Server backups; summary doesn’t show detailed config tables or limits. | | [FAQ-Azure Disk Backup](https://learn.microsoft.com/en-us/azure/backup/disk-backup-faq) | 0.45 | Azure Disk Backup FAQ. Summary indicates general questions and points to a support matrix for detailed limits, so this page is not the primary expert-knowledge source. | | [FAQ-Back up Azure Files](https://learn.microsoft.com/en-us/azure/backup/backup-azure-files-faq) | 0.45 | Azure Files backup FAQ. The summary points to a separate support matrix for detailed scenarios/limits, suggesting this page itself is higher-level Q&A. | | [FAQ-Back up SQL Server databases on Azure VMs](https://learn.microsoft.com/en-us/azure/backup/faq-backup-sql-server) | 0.45 | SQL Server on Azure VM backup FAQ. 
It points to a support matrix for detailed support scenarios, so this page is likely higher-level Q&A without deep numeric/config details. | -| [FAQ-MARS agent](https://learn.microsoft.com/en-us/azure/backup/backup-azure-file-folder-backup-faq) | 0.45 | MARS Agent FAQ about backing up files/folders. Without explicit mention of error codes, config tables, or limits, it’s treated as general FAQ rather than expert troubleshooting or configuration content. | | [Manage](https://learn.microsoft.com/en-us/azure/backup/backup-azure-database-postgresql-flex-manage) | 0.45 | Management tutorial for PostgreSQL Flexible Server backups; summary doesn’t indicate detailed configuration parameter tables. | | [Recover files from Azure to Windows Server](https://learn.microsoft.com/en-us/azure/backup/backup-azure-restore-windows-server) | 0.45 | Restore tutorial using the Recover Data wizard; mostly procedural steps without detailed configuration tables or troubleshooting mappings. | | [Replace your tape library](https://learn.microsoft.com/en-us/azure/backup/backup-azure-backup-cloud-as-tape) | 0.45 | Describes using Azure Backup as tape replacement and enabling policies; summary doesn’t show numeric limits, config parameter tables, or troubleshooting mappings. | @@ -351,7 +362,6 @@ confusable_not_for: Not for Azure Site Recovery (use azure-site-recovery), Azure | [Restore all files in a volume](https://learn.microsoft.com/en-us/azure/backup/restore-all-files-volume-mars) | 0.45 | How-to for restoring all files in a volume; likely a step-by-step wizard walkthrough without expert-level configuration or limits. | | [SQL Server](https://learn.microsoft.com/en-us/azure/backup/backup-azure-sql-mabs) | 0.45 | SQL Server backup with MABS; scenario configuration article, but summary does not clearly indicate detailed configuration parameter tables or limits. 
| | [Use DPM to back up a SharePoint farm](https://learn.microsoft.com/en-us/azure/backup/backup-azure-backup-sharepoint) | 0.45 | SharePoint farm backup to Azure with DPM; summary is conceptual and procedural without explicit config values, limits, or error-code mappings. | -| [Back up Azure Local virtual machines](https://learn.microsoft.com/en-us/azure/backup/back-up-azure-stack-hyperconverged-infrastructure-virtual-machines) | 0.40 | Backup of Azure Local VMs with MABS; appears to be a scenario-specific how-to without detailed config matrices or limits. | | [Back up file data with MABS](https://learn.microsoft.com/en-us/azure/backup/back-up-file-data) | 0.40 | How-to backup guide for files with MABS; summary shows procedural steps but no explicit config tables, limits, or product-specific error mappings. | | [Back up multiple SQL Server VMs from the vault](https://learn.microsoft.com/en-us/azure/backup/backup-sql-server-database-azure-vms) | 0.40 | Step-by-step portal tutorial for backing up SQL Server VMs; lacks detailed config tables, limits, or troubleshooting mappings. | | [Back up the MABS server](https://learn.microsoft.com/en-us/azure/backup/backup-the-mabs-server) | 0.40 | High-level backup strategy for MABS server and DB; summary doesn’t indicate concrete config values, limits, or product-specific error mappings. | @@ -374,7 +384,6 @@ confusable_not_for: Not for Azure Site Recovery (use azure-site-recovery), Azure | [Overview](https://learn.microsoft.com/en-us/azure/backup/backup-azure-sql-database) | 0.40 | High-level explanation of backing up SQL Server to Azure Backup; detailed matrices and FAQs are in separate linked pages. | | [Protect SharePoint farm](https://learn.microsoft.com/en-us/azure/backup/backup-mabs-sharepoint-azure-stack) | 0.40 | SharePoint farm backup/restore on Azure Stack; summary focuses on schedule/retention flexibility conceptually, not on specific numeric limits or config parameters. 
| | [Protect files and applications](https://learn.microsoft.com/en-us/azure/backup/backup-mabs-files-applications-azure-stack) | 0.40 | How-to for backing up files/apps on Azure Stack VMs; summary mentions adding disks and using Azure storage but no explicit parameter tables, limits, or error mappings. | -| [Protect system state and bare metal recovery](https://learn.microsoft.com/en-us/azure/backup/backup-mabs-system-state-and-bmr) | 0.40 | System state and BMR protection how-to; primarily procedural backup/restore steps rather than configuration or limits reference. | | [Recover files from Azure VM backups](https://learn.microsoft.com/en-us/azure/backup/backup-azure-restore-files-from-vm) | 0.40 | Task-focused restore tutorial for recovering files from VM backups; typically step-by-step UI workflow without configuration tables, limits, or error-code-based troubleshooting, so it doesn’t meet the expert-knowledge criteria. | | [Restore](https://learn.microsoft.com/en-us/azure/backup/azure-kubernetes-service-cluster-restore-using-cli) | 0.40 | CLI-based restore tutorial for AKS; focuses on how to run restores rather than expert configuration, limits, or error diagnostics. | | [Restore](https://learn.microsoft.com/en-us/azure/backup/azure-kubernetes-service-cluster-restore-using-powershell) | 0.40 | PowerShell restore tutorial for AKS; describes OLR/ALR options but likely as procedural steps, not as decision matrices or config tables. | @@ -416,6 +425,7 @@ confusable_not_for: Not for Azure Site Recovery (use azure-site-recovery), Azure | [About Azure VM restore](https://learn.microsoft.com/en-us/azure/backup/about-azure-vm-restore) | 0.30 | Describes restore options and scenarios for Azure VM backups at a conceptual/process level. The summary does not indicate presence of numeric limits, configuration parameter tables, error-code-based troubleshooting, or decision matrices with quantified trade-offs. 
Appears to be an overview of restore behaviors rather than expert configuration or troubleshooting content. | | [Azure VMs](https://learn.microsoft.com/en-us/azure/backup/backup-azure-vms-automation) | 0.30 | Primarily a PowerShell how-to for VM backup/restore; no config tables, limits, or product-specific edge-case guidance beyond generic automation steps. | | [Back up Azure Files volumes in AKS clusters](https://learn.microsoft.com/en-us/azure/backup/tutorial-backup-aks-azure-files) | 0.30 | Tutorial for configuring backup and restore of Azure Files volumes in AKS; mainly procedural guidance without detailed limits, configuration matrices, or troubleshooting mappings. | +| [Back up Windows Server files and folders](https://learn.microsoft.com/en-us/azure/backup/backup-windows-with-mars-agent) | 0.30 | Step-by-step tutorial for backing up Windows machines with the MARS agent; primarily procedural guidance without clear indication of configuration tables, numeric limits, or product-specific error mappings. Does not clearly meet any expert-knowledge sub-skill criteria. | | [Back up a VM - ARM template](https://learn.microsoft.com/en-us/azure/backup/quick-backup-vm-template) | 0.30 | ARM template quickstart for VM backup; shows one template, not a full configuration matrix or quotas. | | [Back up a VM - Bicep file](https://learn.microsoft.com/en-us/azure/backup/quick-backup-vm-bicep-file) | 0.30 | Bicep quickstart for VM backup; example deployment, not a comprehensive config or limits reference. | | [Back up a VM - CLI](https://learn.microsoft.com/en-us/azure/backup/quick-backup-vm-cli) | 0.30 | Quickstart using Azure CLI to back up a VM; focuses on example commands, not exhaustive configuration or limits. 
| @@ -443,15 +453,14 @@ confusable_not_for: Not for Azure Site Recovery (use azure-site-recovery), Azure | [Quickstart to run preregistration script for the database backup](https://learn.microsoft.com/en-us/azure/backup/sap-ase-database-backup-run-preregistration-quickstart) | 0.30 | Quickstart for running a preregistration script in Cloud Shell; mainly procedural steps. While it mentions authentication and network validation, the summary does not indicate detailed configuration parameter tables or security role definitions. | | [Restore](https://learn.microsoft.com/en-us/azure/backup/azure-elastic-san-backup-restore) | 0.30 | Portal restore how-to; mentions separate page for supported scenarios and limitations; this page itself is likely procedural. | | [Restore SAP HANA databases using CLI](https://learn.microsoft.com/en-us/azure/backup/tutorial-sap-hana-restore-cli) | 0.30 | Step-by-step restore tutorial using Azure CLI for SAP HANA on Azure VMs; likely mostly procedural commands without configuration matrices, limits, or product-specific error mappings. | +| [Restore Windows Server System State](https://learn.microsoft.com/en-us/azure/backup/backup-azure-restore-system-state) | 0.30 | How-to guide for restoring Windows Server System State from Azure Backup; mainly procedural steps. No strong indication of detailed configuration matrices, limits, or troubleshooting error-code mappings. | | [Restore the entire SAP HANA system to snapshot restore point](https://learn.microsoft.com/en-us/azure/backup/quick-sap-hana-database-instance-restore) | 0.30 | Quickstart walkthrough for restoring SAP HANA via portal; primarily step-by-step UI actions without detailed limits, config tables, or error mappings. | | [SQL in Azure VM backups](https://learn.microsoft.com/en-us/azure/backup/backup-azure-sql-automation) | 0.30 | PowerShell tutorial for SQL in Azure VM backup/restore; focused on task flow, not on limits, config matrices, or error-code troubleshooting. 
| -| [Set up a vault and enable backup for Azure VMs](https://learn.microsoft.com/en-us/azure/backup/backup-azure-arm-vms-prepare) | 0.30 | Primarily a how-to guide for setting up a Recovery Services vault and selecting VMs for backup. Based on the description, it focuses on procedural steps rather than detailed configuration tables, limits, error codes, or product-specific best-practice gotchas. Lacks clear evidence of numeric limits, config parameter matrices, or troubleshooting mappings required for expert-knowledge classification. | | [Support matrix for automation](https://learn.microsoft.com/en-us/azure/backup/backup-support-automation) | 0.30 | Support matrix here appears to be a high-level list of which Azure Backup tasks can be automated and links to other docs, without concrete limits, configuration tables, or error-code-based troubleshooting. It’s more of a capability overview than detailed expert configuration or limits content. | | [Tutorial - Back up Azure Data Lake Storage](https://learn.microsoft.com/en-us/azure/backup/azure-data-lake-storage-backup-tutorial) | 0.30 | Step-by-step tutorial for backing up Azure Data Lake Storage via portal; no indication of detailed limits, configuration matrices, or product-specific error mappings. | | [Tutorial - Back up Azure Files using Azure portal](https://learn.microsoft.com/en-us/azure/backup/tutorial-backup-azure-files-vault-tier-portal) | 0.30 | Portal tutorial for configuring Azure Files backup; primarily procedural without detailed configuration parameter tables, limits, or error-code-based troubleshooting. | | [Tutorial for SAP ASE database on Azure VMs backup](https://learn.microsoft.com/en-us/azure/backup/sap-ase-database-backup-tutorial) | 0.30 | Tutorial for backing up SAP ASE using Resiliency; appears focused on walkthrough steps. The reference to supported configurations is likely a pointer to another matrix article rather than containing the matrix itself. 
| | [Using the Azure portal](https://learn.microsoft.com/en-us/azure/backup/quick-cross-region-restore) | 0.30 | Quickstart for cross-region restore via portal; shows how to enable and use feature, not detailed limits or decision matrices. | -| [What's New in MABS](https://learn.microsoft.com/en-us/azure/backup/backup-mabs-whats-new-mabs) | 0.30 | “What’s new” overview; typically marketing/feature summary without detailed configuration, limits, or troubleshooting mappings. | | [Windows Server using the MARS agent](https://learn.microsoft.com/en-us/azure/backup/backup-client-automation) | 0.30 | PowerShell how-to for MARS/Windows Server backup; likely step-by-step commands rather than structured configuration or troubleshooting reference. | | [With Azure CLI](https://learn.microsoft.com/en-us/azure/backup/quick-kubernetes-backup-cli) | 0.30 | AKS vaulted backup via CLI quickstart; procedural CLI usage, not a catalog of configuration options or quotas. | | [With Azure PowerShell](https://learn.microsoft.com/en-us/azure/backup/quick-secondary-region-restore-postgresql-powershell) | 0.30 | Quickstart for cross-region restore with PowerShell; example commands, not a full troubleshooting or configuration reference. | @@ -463,23 +472,28 @@ confusable_not_for: Not for Azure Site Recovery (use azure-site-recovery), Azure | [Back up the database](https://learn.microsoft.com/en-us/azure/backup/tutorial-postgresql-backup) | 0.25 | Tutorial for backing up PostgreSQL server; standard portal walkthrough without detailed limits or configuration parameter tables. | | [Backup](https://learn.microsoft.com/en-us/azure/backup/tutorial-create-first-backup-azure-database-postgresql-flex) | 0.25 | Tutorial for backing up PostgreSQL Flexible Server; basic configuration steps without detailed limits or config tables. 
| | [Configure and run Cross Region Restore](https://learn.microsoft.com/en-us/azure/backup/tutorial-cross-region-restore) | 0.25 | Cross-region restore tutorial; describes enabling and running CRR but no explicit numeric limits, decision matrices, or troubleshooting mappings. | +| [Protect system state and bare metal recovery](https://learn.microsoft.com/en-us/azure/backup/backup-mabs-system-state-and-bmr) | 0.25 | Article appears to be a procedural guide for backing up system state and performing bare-metal recovery with Azure Backup Server. The summary does not show specific limits, configuration parameter tables, or detailed troubleshooting mappings, so it likely lacks the required expert-knowledge patterns. | | [Restore](https://learn.microsoft.com/en-us/azure/backup/tutorial-restore-postgresql-flex) | 0.25 | Tutorial for restoring PostgreSQL Flexible Server; restore workflow only, no expert-level limits, configuration matrices, or troubleshooting. | | [Archived release notes](https://learn.microsoft.com/en-us/azure/backup/backup-release-notes-archived) | 0.20 | Archive of release notes; high-level listing of past features without detailed technical limits, configs, or troubleshooting mappings in this index page. | +| [Back up Azure Local virtual machines](https://learn.microsoft.com/en-us/azure/backup/back-up-azure-stack-hyperconverged-infrastructure-virtual-machines) | 0.20 | Described as a general procedure article for backing up Azure Local VMs with MABS; summary does not indicate specific limits, error codes, configuration tables, or product-specific gotchas. Likely a step-by-step tutorial rather than expert reference content. | | [Back up Azure VMs with PowerShell](https://learn.microsoft.com/en-us/azure/backup/tutorial-backup-azure-vm) | 0.20 | PowerShell tutorial for backing up multiple VMs; focuses on basic deployment steps rather than expert configuration, limits, or troubleshooting. 
| | [Back up Windows Server System State](https://learn.microsoft.com/en-us/azure/backup/backup-azure-system-state) | 0.20 | Tutorial-style walkthrough for backing up Windows Server system state to Azure Backup; based on the description it focuses on basic how-to steps without exposing detailed limits, configuration parameter tables, error-code mappings, or other product-specific expert references. | -| [Back up a VM - Azure portal](https://learn.microsoft.com/en-us/azure/backup/quick-backup-vm-portal) | 0.20 | Quickstart tutorial focused on using the Azure portal to back up a VM; primarily step-by-step UI guidance without detailed limits, configuration parameter tables, error codes, or decision matrices. Does not meet the thresholds for any expert-knowledge sub-skill type. | -| [Back up multiple Azure VMs](https://learn.microsoft.com/en-us/azure/backup/tutorial-backup-vm-at-scale) | 0.20 | Tutorial for backing up multiple VMs via portal; no indication of detailed limits, config parameter tables, or troubleshooting content. | | [FAQ-Back up SAP HANA databases on Azure VMs](https://learn.microsoft.com/en-us/azure/backup/sap-hana-faq-backup-azure-vm) | 0.20 | This is an FAQ page about backing up SAP HANA databases on Azure VMs. From the description and summary, it primarily answers common questions and defers detailed support/limitations to a separate support matrix page. It likely contains high-level guidance and clarifications rather than structured numeric limits, configuration tables, or error-code-based troubleshooting content, so it does not clearly meet any expert-knowledge sub-skill criteria. | | [Granular billing](https://learn.microsoft.com/en-us/azure/backup/backup-azure-granular-billing) | 0.20 | From the summary, the page describes the concept and benefits of granular billing for Azure Backup and how it helps with cost tracking and reporting. 
There is no indication of numeric limits, configuration parameter tables, error codes, or decision matrices with quantified trade-offs. It appears to be a feature explanation/overview rather than detailed configuration, limits, or troubleshooting content. | | [Overview](https://learn.microsoft.com/en-us/azure/backup/azure-file-share-backup-overview) | 0.20 | High-level overview of Azure Files backup capabilities and benefits; lacks numeric limits, configuration tables, or troubleshooting details. | | [Overview](https://learn.microsoft.com/en-us/azure/backup/sap-ase-database-about) | 0.20 | High-level 'about' article describing SAP ASE backup on Azure VMs; summary suggests conceptual overview of capabilities, not detailed limits, configuration tables, or decision matrices. | +| [Restore Azure VMs in the portal](https://learn.microsoft.com/en-us/azure/backup/backup-azure-arm-restore-vms) | 0.20 | Article on restoring VMs from recovery points; describes restore operations and features like Cross Region Restore but is primarily procedural, lacking detailed limits, configuration parameter tables, or structured troubleshooting content. | | [Restore a VM with Azure CLI](https://learn.microsoft.com/en-us/azure/backup/tutorial-restore-disk) | 0.20 | Tutorial on restoring a VM with Azure CLI is primarily step-by-step procedural guidance. It may show example commands and parameters, but not organized configuration matrices, limits, or product-specific error mappings. It reads as a how-to tutorial rather than a reference of expert-only constraints or configurations. | | [Restore individual files](https://learn.microsoft.com/en-us/azure/backup/tutorial-restore-files) | 0.20 | File-level restore tutorial; high-level steps without product-specific limits, configuration tables, or troubleshooting flows. 
| +| [Set up a vault and enable backup for Azure VMs](https://learn.microsoft.com/en-us/azure/backup/backup-azure-arm-vms-prepare) | 0.20 | Article on backing up Azure VMs in a Recovery Services vault; mainly procedural guidance for setup and selection of VMs, without explicit numeric limits, configuration tables, or decision matrices. | +| [What's New in MABS](https://learn.microsoft.com/en-us/azure/backup/backup-mabs-whats-new-mabs) | 0.20 | “What’s new” feature overview for Microsoft Azure Backup Server; typically release highlights and marketing-style descriptions rather than detailed limits, configuration tables, or troubleshooting mappings. | | [Azure Backup architecture](https://learn.microsoft.com/en-us/azure/backup/backup-architecture) | 0.10 | High-level architecture overview of Azure Backup components and flows; no detailed limits, configs, or error mappings. | | [Azure Backup terminology](https://learn.microsoft.com/en-us/azure/backup/azure-backup-glossary) | 0.10 | A glossary defines terminology and concepts but does not typically contain limits, configuration parameters, error codes, or prescriptive patterns; it is reference/overview content rather than expert operational knowledge. | | [Back up HANA System Replication database](https://learn.microsoft.com/en-us/azure/backup/quick-backup-hana-cli) | 0.10 | Quickstart for backing up SAP HANA with Azure CLI; focuses on how to run commands and create a vault, not on detailed limits, configuration matrices, or troubleshooting mappings. | | [Back up SAP HANA databases using CLI](https://learn.microsoft.com/en-us/azure/backup/tutorial-sap-hana-backup-cli) | 0.10 | Tutorial for SAP HANA backup using Azure CLI; primarily shows CLI usage and backup steps, without extensive configuration parameter tables, numeric limits, or structured troubleshooting content. 
| | [Back up SAP HANA databases using the Azure portal](https://learn.microsoft.com/en-us/azure/backup/tutorial-backup-sap-hana-db) | 0.10 | Tutorial for backing up SAP HANA databases in Azure VMs; although it mentions supported scenarios, it is framed as a how-to tutorial rather than a detailed support matrix with explicit numeric limits or configuration tables. | +| [Back up a VM - Azure portal](https://learn.microsoft.com/en-us/azure/backup/quick-backup-vm-portal) | 0.10 | Quickstart for backing up a single VM via the portal; primarily a step-by-step tutorial without detailed limits, configuration matrices, or product-specific troubleshooting content. | +| [Back up multiple Azure VMs](https://learn.microsoft.com/en-us/azure/backup/tutorial-backup-vm-at-scale) | 0.10 | Tutorial for backing up multiple VMs at scale; focuses on how-to steps and basic policy creation, not on detailed limits, configuration parameter tables, or error-code-based troubleshooting. | | [Overview](https://learn.microsoft.com/en-us/azure/backup/azure-data-lake-storage-backup-overview) | 0.10 | Overview of Azure Data Lake Storage vaulted backup; high-level explanation of capabilities and benefits without clear indication of numeric limits, configuration matrices, or decision criteria. | | [Overview](https://learn.microsoft.com/en-us/azure/backup/backup-azure-recovery-services-vault-overview) | 0.10 | Conceptual overview of Recovery Services vaults; lacks concrete configuration tables, limits, or troubleshooting content. | | [Overview](https://learn.microsoft.com/en-us/azure/backup/backup-center-overview) | 0.10 | Navigation/overview page for Backup center capabilities; no specific limits, configs, or troubleshooting mappings indicated. 
| diff --git a/products/azure-baremetal-infrastructure/azure-baremetal-infrastructure.csv b/products/azure-baremetal-infrastructure/azure-baremetal-infrastructure.csv index 7430406d..03e9411a 100644 --- a/products/azure-baremetal-infrastructure/azure-baremetal-infrastructure.csv +++ b/products/azure-baremetal-infrastructure/azure-baremetal-infrastructure.csv @@ -6,4 +6,4 @@ https://learn.microsoft.com/en-us/azure/baremetal-infrastructure/workloads/nc2-o https://learn.microsoft.com/en-us/azure/baremetal-infrastructure/workloads/nc2-on-azure/architecture,Architecture,Architecture of BareMetal Infrastructure for NC2 on Azure - Azure Baremetal Infrastructure,Understand NC2 on Azure BareMetal architecture options,Learn about the architecture of several configurations of BareMetal Infrastructure for NC2 on Azure.,"NC2 provides Nutanix-based private clouds in Azure. The private cloud hardware and software deployments are fully integrated and automated in Azure. Deploy and manage the private cloud through the Azure portal, CLI, or PowerShell. A private cloud includes clusters with: Private clouds are installed and managed within an Azure subscription. The number of private clouds within a subscription is scalable. The following diagram describes the architectural components of the NC2 on Azure. 
Each NC2 on ",2025-04-23T22:04:00.000Z,reference,architecture-patterns,0.65,True,"Describes architectural components and several configurations of NC2 on Azure; likely includes product-specific topology patterns and when to use particular configurations, which are not generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/baremetal-infrastructure/workloads/nc2-on-azure/available-regions-skus,Available Regions and SKUs,Available Regions and SKUs for Nutanix Cloud Clusters on Azure - Azure Baremetal Infrastructure,Select regions and SKUs for Nutanix Cloud Clusters on Azure,Learn about the available regions and SKUs for Nutanix Cloud Clusters on Azure.,Nutanix Cloud Clusters on Azure are offered in three different Ready Node or SKU types and many Azure regions.,2026-04-07T22:12:00.000Z,reference,decision-making,0.74,True,"Page lists which Azure regions support Nutanix Cloud Clusters and which specific Ready Node/SKU types are available per region. This is product- and SKU-specific availability data that changes over time and isn't inferable from general training. 
It directly supports deployment and capacity planning decisions (which region/SKU combinations can be used), fitting the decision-making category better than limits-quotas since it focuses on availability matrices rather than numeric resource limits.",unchanged https://learn.microsoft.com/en-us/azure/baremetal-infrastructure/workloads/nc2-on-azure/faq,FAQ,FAQ - Azure Baremetal Infrastructure,,Questions frequently asked about NC2 on Azure,This article addresses questions most frequently asked about NC2 on Azure.,2025-07-30T22:07:00.000Z,faq,,0.4,False,"FAQ format; summary does not indicate presence of specific error codes, configuration tables, or numeric limits—likely general Q&A rather than deep technical reference.",unchanged -https://learn.microsoft.com/en-us/azure/baremetal-infrastructure/workloads/nc2-on-azure/get-started,Get started,Get started - Azure Baremetal Infrastructure,,"Learn how to sign up, set up, and use Nutanix Cloud Clusters on Azure.",Learn how to get started with Nutanix Cloud Clusters (NC2) on Azure. You can also sign up for a free trial of NC2 on Azure.,2026-04-13T17:16:00.000Z,get-started,,0.2,False,"Page is a getting-started guide for Nutanix Cloud Clusters on Azure, focused on sign-up and initial setup. It does not expose detailed limits, configuration parameter tables, error-code-based troubleshooting, or other product-specific expert details as defined by the sub-skill types.",updated +https://learn.microsoft.com/en-us/azure/baremetal-infrastructure/workloads/nc2-on-azure/get-started,Get started,Get started - Azure Baremetal Infrastructure,,"Learn how to sign up, set up, and use Nutanix Cloud Clusters on Azure.",Learn how to get started with Nutanix Cloud Clusters (NC2) on Azure. You can also sign up for a free trial of NC2 on Azure.,2026-04-13T17:16:00.000Z,get-started,,0.2,False,"Page is a getting-started guide for Nutanix Cloud Clusters on Azure, focused on sign-up and initial setup. 
It does not expose detailed limits, configuration parameter tables, error-code-based troubleshooting, or other product-specific expert details as defined by the sub-skill types.",unchanged diff --git a/products/azure-baremetal-infrastructure/report.md b/products/azure-baremetal-infrastructure/report.md index e40a7298..c4d29a46 100644 --- a/products/azure-baremetal-infrastructure/report.md +++ b/products/azure-baremetal-infrastructure/report.md @@ -32,8 +32,8 @@ confusable_not_for: Not for Azure Large Instances (use azure-large-instances), A ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 1 -- **Unchanged**: 7 +- **Updated Pages**: 0 +- **Unchanged**: 8 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-baremetal-infrastructure/azure-baremetal-infrastructure.csv` @@ -47,11 +47,6 @@ confusable_not_for: Not for Azure Large Instances (use azure-large-instances), A ## Changes -### Updated Pages - -- [Get started](https://learn.microsoft.com/en-us/azure/baremetal-infrastructure/workloads/nc2-on-azure/get-started) - - Updated: 2025-07-31T17:19:00.000Z → 2026-04-13T17:16:00.000Z - ## Classified Pages | TOC Title | Type | Confidence | Reason | diff --git a/products/azure-bastion/report.md b/products/azure-bastion/report.md index 63e850a5..312d51f3 100644 --- a/products/azure-bastion/report.md +++ b/products/azure-bastion/report.md @@ -48,7 +48,7 @@ confusable_not_for: Not for Azure Virtual Network (use azure-virtual-network), A - **New Pages**: 0 - **Updated Pages**: 0 - **Unchanged**: 40 -- **Deleted Pages**: 1 +- **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-bastion/azure-bastion.csv` ## Classification Statistics @@ -66,10 +66,6 @@ confusable_not_for: Not for Azure Virtual Network (use azure-virtual-network), A ## Changes -### Deleted Pages - -- ~~Troubleshoot~~ (https://learn.microsoft.com/en-us/troubleshoot/azure/bastion/welcome-azure-bastion) - ## Classified Pages | 
TOC Title | Type | Confidence | Reason | diff --git a/products/azure-batch/azure-batch.csv b/products/azure-batch/azure-batch.csv index 6dd44ea0..974ec642 100644 --- a/products/azure-batch/azure-batch.csv +++ b/products/azure-batch/azure-batch.csv @@ -51,9 +51,9 @@ and encryption settings. However, you may need to update pool properties as your over time or if a VM image reaches end-of-life. Some, but not all, of these pool properties can be patched or updated to accommodate these situations. This article provides information about updateable ",2026-01-06T06:10:00.000Z,how-to,configuration,0.7,True,"Details which Batch pool properties are mutable vs immutable and how to patch them, which is product-specific configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/batch/batch-pool-vm-sizes,Supported VM sizes,Choose VM sizes and images for pools - Azure Batch,Choose Azure Batch VM sizes and images,How to choose from the available VM sizes and OS versions for compute nodes in Azure Batch pools,"When you select a node size for an Azure Batch pool, you can choose from almost all the VM sizes available in Azure. 
Azure offers a range of sizes for Linux and Windows VMs for different workloads.",2026-01-05T08:00:00.000Z,concept-article,decision-making,0.7,True,Guidance on selecting VM sizes and OS images for Batch pools; likely includes decision criteria and trade-offs for different workloads.,unchanged -https://learn.microsoft.com/en-us/azure/batch/batch-pools-to-simplified-compute-node-communication-model-migration-guide,Classic compute node communication model,Migrate Azure Batch pools to the simplified compute node communication model - Azure Batch,Migrate Batch pools to simplified node communication,Learn how to migrate Azure Batch pools to the simplified compute node communication model and plan for feature end of support.,"To improve security, simplify the user experience, and enable key future improvements, Azure Batch will retire the classic +https://learn.microsoft.com/en-us/azure/batch/batch-pools-to-simplified-compute-node-communication-model-migration-guide,Classic compute node communication model,Migrate Azure Batch pools to the simplified compute node communication model - Azure Batch,Plan and execute migration to Azure Batch simplified node communication,Learn how to migrate Azure Batch pools to the simplified compute node communication model and plan for feature end of support.,"To improve security, simplify the user experience, and enable key future improvements, Azure Batch will retire the classic compute node communication model on March 31, 2026.
Learn how to migrate your Batch pools to using the simplified compute -node communication model.",2025-04-02T08:00:00.000Z,how-to,decision-making,0.8,True,Migration guide for moving to the simplified communication model with explicit retirement date; provides product-specific migration considerations.,unchanged +node communication model.",2026-04-22T17:34:00.000Z,how-to,decision-making,0.68,True,"The migration guide is time-bound to a specific retirement date and provides product-specific guidance on when and how to move from the classic to the simplified compute node communication model. It includes concrete migration considerations, feature support differences, and recommendations for planning and executing the transition, which helps users decide migration timing and approach between two service models. This goes beyond conceptual overview and constitutes expert, product-specific decision guidance.",updated https://learn.microsoft.com/en-us/azure/batch/batch-powershell-cmdlets-get-started,Use Azure PowerShell,Get started with PowerShell - Azure Batch,Manage Azure Batch resources using PowerShell cmdlets,A quick introduction to the Azure PowerShell cmdlets you can use to manage Batch resources.,"With the Azure Batch PowerShell cmdlets, you can perform and script many common Batch tasks. This is a quick introduction to the cmdlets you can use to manage your Batch accounts and work with your Batch resources such as pools, jobs, and tasks. For a complete list of Batch cmdlets and detailed cmdlet syntax, see the Azure Batch cmdlet reference.
We recommend that you update your Azure PowerShell modules frequently to take advantage of service updates and enhancements.",2025-04-02T08:00:00.000Z,how-to,integrations,0.6,True,"Introduces Batch-specific PowerShell cmdlets and how to use them to manage pools, jobs, and tasks, including command names and usage patterns unique to Batch.",unchanged https://learn.microsoft.com/en-us/azure/batch/batch-quota-limit,Quotas and limits,Service quotas and limits - Azure Batch,Review Azure Batch service quotas and limits,"Learn about default Azure Batch quotas, limits, and constraints. Also learn how to request quota increases.","As with other Azure services, there are limits on certain resources associated with Azure Batch. For example, if your pool doesn't reach your target number of compute nodes, you might have reached the core quota limit for your Batch account. Many limits are default quotas, which Azure applies at the subscription or account level. Keep these quotas in mind as you design and scale up your Batch workloads. You can run multiple Batch workloads in a single Batch account. Or, you can distribute your w",2024-06-05T08:00:00.000Z,concept-article,limits-quotas,0.95,True,"Dedicated quotas and limits page; contains specific numeric limits, quotas, and constraints for Batch resources and guidance on quota increases.",unchanged https://learn.microsoft.com/en-us/azure/batch/batch-rendering-architectures,Rendering architectures,Azure rendering reference architectures - Azure Batch,Reference architectures for bursting render farms to Azure Batch,Architectures for using Azure Batch and other Azure services to extend an on-premises render farm by bursting to the cloud,"This article shows high-level architecture diagrams for scenarios to extend, or ""burst"", an on-premises render farm to Azure. 
The examples show different options for Azure compute, networking, and storage services.",2025-02-07T08:00:00.000Z,how-to,architecture-patterns,0.7,True,"Provides specific reference architectures for extending on-premises render farms to Azure, including choices of compute, networking, and storage services, which are architecture patterns.",unchanged @@ -63,7 +63,7 @@ https://learn.microsoft.com/en-us/azure/batch/batch-rendering-storage-data-movem https://learn.microsoft.com/en-us/azure/batch/batch-role-based-access-control,Role-based access control for Azure Batch service,Role-based access control for Azure Batch service - Azure Batch,Configure Azure RBAC roles for Azure Batch,Learn how to use Azure role-based access control for managing individual access to Azure Batch account.,"Azure Batch Service supports a set of built-in Azure roles that provide different levels of permissions to Azure Batch account. By using Azure role-based access control (Azure RBAC), an authorization system for managing individual access to Azure resources, you could assign specific permissions to users, service principals, or other identities that need to interact with your Batch account. You can also assign custom roles with custom, fine-grained permissions that adapt your specific use scenario. N",2025-08-12T11:11:00.000Z,how-to,security,0.9,True,"Describes Batch-specific built-in roles and permissions, including role names and scopes for managing Batch accounts.",unchanged https://learn.microsoft.com/en-us/azure/batch/batch-service-workflow-features,Batch service workflow and resources,Batch service workflow and resources - Azure Batch,,Learn about the features of the Batch service and its high-level workflow from a development standpoint.,"In this overview of the core components of the Azure Batch service, we discuss the high-level workflow that Batch developers can use to build large-scale parallel compute solutions, along with the primary service resources that are used.
Whether you're developing a distributed computational application or service that issues direct REST API calls or you're using another one of the Batch SDKs, you'll use many of the resources and features discussed here. Tip For a higher-level introduction to the Ba",2025-04-02T08:00:00.000Z,concept-article,,0.3,False,"Describes Batch workflow and resources at a high level; conceptual overview without detailed limits, configs, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/batch/batch-sig-images,Create a pool with Azure Compute Gallery,Use the Azure Compute Gallery to create a custom image pool - Azure Batch,Use Azure Compute Gallery images for Batch pools,Custom image pools are an efficient way to configure compute nodes to run your Batch workloads.,"When you create an Azure Batch pool using the Virtual Machine Configuration, you specify a VM image that provides the operating system for each compute node in the pool. You can create a pool of virtual machines either with a supported Azure Marketplace image or create a custom image with an Azure Compute Gallery image.",2025-07-01T08:00:00.000Z,concept-article,configuration,0.7,True,Explains how to configure Batch pools to use Compute Gallery images with product-specific image reference settings.,unchanged -https://learn.microsoft.com/en-us/azure/batch/batch-spot-vms,Use Azure Spot VMs,Run Batch workloads on cost-effective Spot VMs - Azure Batch,Decide when to run Azure Batch on Spot VMs,Learn how to provision Spot VMs to reduce the cost of Azure Batch workloads.,"Azure Batch offers Spot virtual machines (VMs) to reduce the cost of Batch workloads. Spot VMs make new types of Batch workloads possible by enabling a large amount of compute power to be used for a low cost. Spot VMs take advantage of surplus capacity in Azure. The amount of surplus capacity that is available varies depending on factors such as VM family, VM size, region, and time of day.
When you specify Spot VMs in your pools, Azure Batch can use this surplus, when available. The tradeoff for",2026-04-14T22:21:00.000Z,how-to,decision-making,0.68,True,"The page discusses trade-offs of using Spot VMs for Azure Batch, including when workloads are suitable or not due to eviction risk and surplus capacity variability. It provides product-specific guidance on choosing Spot vs regular VMs for different workload characteristics, which aligns with decision-making criteria, even though it may not include strict numeric limits.",updated +https://learn.microsoft.com/en-us/azure/batch/batch-spot-vms,Use Azure Spot VMs,Run Batch workloads on cost-effective Spot VMs - Azure Batch,Decide when to run Azure Batch on Spot VMs,Learn how to provision Spot VMs to reduce the cost of Azure Batch workloads.,"Azure Batch offers Spot virtual machines (VMs) to reduce the cost of Batch workloads. Spot VMs make new types of Batch workloads possible by enabling a large amount of compute power to be used for a low cost. Spot VMs take advantage of surplus capacity in Azure. The amount of surplus capacity that is available varies depending on factors such as VM family, VM size, region, and time of day. When you specify Spot VMs in your pools, Azure Batch can use this surplus, when available. The tradeoff for",2026-04-14T22:21:00.000Z,how-to,decision-making,0.68,True,"The page discusses trade-offs of using Spot VMs for Azure Batch, including when workloads are suitable or not due to eviction risk and surplus capacity variability. It provides product-specific guidance on choosing Spot vs regular VMs for different workload characteristics, which aligns with decision-making criteria, even though it may not include strict numeric limits.",unchanged https://learn.microsoft.com/en-us/azure/batch/batch-task-complete-event,Task complete event,Azure Batch task complete event - Azure Batch,Understand Azure Batch task complete diagnostic event,"Reference for Batch task complete event. 
This event is emitted once a task is completed, regardless of the exit code.","This event is emitted once a task is completed, regardless of the exit code. This event can be used to determine the duration of a task, where the task ran, and whether it was retried. The following example shows the body of a task complete event.",2026-02-05T23:11:00.000Z,reference,configuration,0.7,True,"Reference for the task complete event, including how to derive duration, node, and retry info from the payload, which is expert monitoring detail.",unchanged https://learn.microsoft.com/en-us/azure/batch/batch-task-dependencies,Task dependencies,Create task dependencies to run tasks - Azure Batch,Configure task dependencies for Azure Batch jobs,Create tasks that depend on the completion of other tasks for processing MapReduce style and similar big data workloads in Azure Batch.,"With Batch task dependencies, you create tasks that are scheduled for execution on compute nodes after the completion of one or more parent tasks. For example, you can create a job that renders each frame of a 3D movie with separate, parallel tasks. The final task merges the rendered frames into the complete movie only after all frames have been successfully rendered. In other words, the final task is dependent on the previous parent tasks. Some scenarios where task dependencies are useful inclu",2025-07-01T08:00:00.000Z,how-to,configuration,0.7,True,Describes Batch-specific dependency configuration for tasks (parent/child relationships) to support MapReduce and similar workflows.,unchanged https://learn.microsoft.com/en-us/azure/batch/batch-task-fail-event,Task fail event,Azure Batch task fail event - Azure Batch,Understand Azure Batch task fail diagnostic event,Reference for Batch task fail event. This event is emitted in addition to a task complete event and can be used to detect when a task fails.,This event is emitted when a task completes with a failure. 
Currently all nonzero exit codes are considered failures. This event is emitted in addition to a task complete event and can be used to detect when a task fails. The following example shows the body of a task fail event.,2026-02-05T23:11:00.000Z,reference,configuration,0.7,True,"Documents the task fail event, its relationship to task complete, and failure semantics (nonzero exit codes), which are Batch-specific diagnostic details.",unchanged @@ -137,8 +137,8 @@ https://learn.microsoft.com/en-us/azure/batch/security-controls-policy,Security page lists the compliance domains and security controls for Azure Batch. You can assign the built-ins for a security control individually to help make your Azure resources compliant with the specific standard. The title of each built-in policy definition links to the policy definition in the A",2024-04-29T08:00:00.000Z,sample,security,0.7,True,Lists specific Azure Policy built-in definitions and compliance controls for Azure Batch; security/compliance configuration details unique to the service.,unchanged -https://learn.microsoft.com/en-us/azure/batch/simplified-compute-node-communication,Use simplified compute node communication,Use simplified compute node communication - Azure Batch,Enable simplified compute node communication in Azure Batch,Learn about the simplified compute node communication mode in the Azure Batch service and how to enable it.,"An Azure Batch pool contains one or more compute nodes that execute user-specified workloads in the form of Batch tasks. To enable Batch functionality and Batch pool infrastructure management, compute nodes must communicate with the Azure Batch service. Batch supports two types of communication modes: This article describes the simplified communication mode and the associated network configuration requirements.
Tip Information in this document pertaining to networking resources and rules such as N",2026-01-12T08:00:00.000Z,how-to,configuration,0.7,True,"Describes Batch-specific communication modes and required network configuration (NSGs, rules) for simplified mode.",unchanged -https://learn.microsoft.com/en-us/azure/batch/simplified-node-communication-pool-no-public-ip,Create a simplified node communication pool without public IP addresses,Create a simplified node communication pool without public IP addresses - Azure Batch,Create simplified communication Batch pools without public IPs,Learn how to create an Azure Batch simplified node communication pool without public IP addresses.,"Note This replaces the previous preview version of Azure Batch pool without public IP addresses. This new version requires using simplified compute node communication. Important Support for pools without public IP addresses in Azure Batch is currently available for select regions. When you create an Azure Batch pool, you can provision the virtual machine (VM) configuration pool without a public IP address. This article explains how to set up a Batch pool without public IP addresses.",2026-01-12T08:00:00.000Z,how-to,configuration,0.8,True,"Describes Batch-specific requirements (simplified communication, region support) and configuration for pools without public IP addresses.",unchanged +https://learn.microsoft.com/en-us/azure/batch/simplified-compute-node-communication,Use simplified compute node communication,Use simplified compute node communication - Azure Batch,Configure simplified compute node communication in Azure Batch,Learn about the simplified compute node communication mode in the Azure Batch service and how to enable it.,"An Azure Batch pool contains one or more compute nodes that execute user-specified workloads in the form of Batch tasks. To enable Batch functionality and Batch pool infrastructure management, compute nodes must communicate with the Azure Batch service.
Batch supports two types of communication modes: This article describes the simplified communication mode and the associated network configuration requirements. Warning The classic compute node communication mode will be retired on 31 March 2026 and re",2026-04-22T17:34:00.000Z,how-to,configuration,0.7,True,"The article describes how to enable the simplified compute node communication mode and its specific network configuration requirements. It includes product-specific settings (communication modes, required network configuration, and retirement date of classic mode) that are not generic knowledge. This aligns best with configuration, as it focuses on how to configure Batch pools and networking for this mode rather than general concepts or limits.",updated +https://learn.microsoft.com/en-us/azure/batch/simplified-node-communication-pool-no-public-ip,Create a simplified node communication pool without public IP addresses,Create a simplified node communication pool without public IP addresses - Azure Batch,Configure Azure Batch pools without public IP addresses,Learn how to create an Azure Batch simplified node communication pool without public IP addresses.,"Note This replaces the previous preview version of Azure Batch pool without public IP addresses. This new version requires using simplified compute node communication. Important Support for pools without public IP addresses in Azure Batch is currently available for select regions. When you create an Azure Batch pool, you can provision the virtual machine (VM) configuration pool without a public IP address. This article explains how to set up a Batch pool without public IP addresses.",2026-04-21T08:00:00.000Z,how-to,configuration,0.7,True,"The page explains how to create a Batch pool using simplified node communication without public IPs, including region-specific availability and required configuration steps.
It is focused on concrete, product-specific configuration of pool networking rather than conceptual guidance, so it fits the configuration sub-skill type.",updated https://learn.microsoft.com/en-us/azure/batch/tutorial-batch-functions,OCR with Batch and Functions,Tutorial - Trigger a Batch job using Azure Functions - Azure Batch,,Learn how to apply OCR to scanned documents as they're added to a storage blob by using Azure Function Apps.,"In this tutorial, you learn how to trigger a Batch job using Azure Functions. This article walks through an example that takes documents added to an Azure Storage blob container applies optical character recognition (OCR) by using Azure Batch. To streamline the OCR processing, this example configures an Azure function that runs a Batch OCR job each time a file is added to the blob container. You learn how to:",2023-05-04T17:07:00.000Z,tutorial,,0.3,False,Tutorial integrating Batch with Azure Functions; integration example but not a parameter/setting reference or troubleshooting guide.,unchanged https://learn.microsoft.com/en-us/azure/batch/tutorial-parallel-dotnet,Parallel file processing - .NET,Tutorial: Run a parallel workload using the .NET API - Azure Batch,,Learn how to transcode media files in parallel using ffmpeg in Azure Batch with the Batch .NET client library.,"Use Azure Batch to run large-scale parallel and high-performance computing (HPC) batch jobs efficiently in Azure. This tutorial walks through a C# example of running a parallel workload using Batch. You learn a common Batch application workflow and how to interact programmatically with Batch and Storage resources. In this tutorial, you convert MP4 media files to MP3 format, in parallel, by using the ffmpeg open-source tool.
If you don't have an Azure account, create a free account before you begin.",2025-04-02T08:00:00.000Z,tutorial,,0.2,False,"Tutorial for parallel workload with .NET; scenario walkthrough without formal best-practices, limits, or troubleshooting tables.",unchanged https://learn.microsoft.com/en-us/azure/batch/tutorial-parallel-python,Parallel file processing - Python,Tutorial: Run a parallel workload using the Python API - Azure Batch,,Learn how to process media files in parallel using ffmpeg in Azure Batch with the Batch Python client library.,"Use Azure Batch to run large-scale parallel and high-performance computing (HPC) batch jobs efficiently in Azure. This tutorial walks through a Python example of running a parallel workload using Batch. You learn a common Batch application workflow and how to interact programmatically with Batch and Storage resources. In this tutorial, you convert MP4 media files to MP3 format, in parallel, by using the ffmpeg open-source tool. If you don't have an Azure account, create a free account before you beg",2024-03-01T08:00:00.000Z,tutorial,,0.2,False,"Tutorial for parallel workload with Python; similar to .NET tutorial, focused on example rather than reference-style expert details.",unchanged diff --git a/products/azure-batch/report.md b/products/azure-batch/report.md index 3c6e2456..94560eea 100644 --- a/products/azure-batch/report.md +++ b/products/azure-batch/report.md @@ -1,20 +1,20 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: security: 'Securing Batch accounts and pools: auth with Entra ID/managed identities, keys and CMK encryption, RBAC and policy, private endpoints/network perimeters, Key Vault access, and certificate/key rotation.'
configuration: Configuring Batch pools, tasks, networking, containers, autoscale, - OS/images, filesystems, and setting up monitoring, diagnostics events, alerts, - and metrics/logs + monitoring/diagnostics, and storage mounts, plus reference for events, metrics, + and management via SDK. deployment: Deploying Azure Batch workloads using Azure Pipelines and CLI templates, including end-to-end job setup, automation, and integration into CI/CD workflows. integrations: 'Using Azure Batch programmatically and via CLI/PowerShell: SDK patterns (JavaScript, .NET, Linux workloads), storing task output in Storage, and adding telemetry with Application Insights.' decision-making: Guidance on choosing VM sizes/images, using Spot VMs, managing - Batch costs, and deciding/migrating between custom image options, Compute Gallery, - and simplified node communication. + Batch costs, and planning/migrating custom image pools and node communication + features. best-practices: Performance, scaling, scheduling, security, and data/output best practices for designing, monitoring, and optimizing large or specialized Azure Batch workloads (MPI, rendering, high task counts). @@ -30,15 +30,17 @@ category_descriptions: skill_description: Expert knowledge for Azure Batch development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when - configuring Batch pools/tasks, autoscale, Spot VMs, CI/CD job deployment, or render/MPI - workloads, and other Azure Batch related development tasks. Not for Azure HDInsight - (use azure-hdinsight), Azure Databricks (use azure-databricks), Azure Kubernetes - Service (AKS) (use azure-kubernetes-service), Azure Virtual Machines (use azure-virtual-machines). -use_when: Use when configuring Batch pools/tasks, autoscale, Spot VMs, CI/CD job deployment, - or render/MPI workloads, and other Azure Batch related development tasks. 
+ configuring Batch pools/tasks, autoscale, private networking, CI/CD deployment, + or SDK-based job orchestration, and other Azure Batch related development tasks. + Not for Azure HDInsight (use azure-hdinsight), Azure Databricks (use azure-databricks), + Azure Machine Learning (use azure-machine-learning), Azure Virtual Machines (use + azure-virtual-machines). +use_when: Use when configuring Batch pools/tasks, autoscale, private networking, CI/CD + deployment, or SDK-based job orchestration, and other Azure Batch related development + tasks. confusable_not_for: Not for Azure HDInsight (use azure-hdinsight), Azure Databricks - (use azure-databricks), Azure Kubernetes Service (AKS) (use azure-kubernetes-service), - Azure Virtual Machines (use azure-virtual-machines). + (use azure-databricks), Azure Machine Learning (use azure-machine-learning), Azure + Virtual Machines (use azure-virtual-machines). --- # Azure Batch Crawl Report @@ -52,9 +54,9 @@ confusable_not_for: Not for Azure HDInsight (use azure-hdinsight), Azure Databri ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 1 -- **Unchanged**: 112 -- **Deleted Pages**: 1 +- **Updated Pages**: 3 +- **Unchanged**: 110 +- **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-batch/azure-batch.csv` ## Classification Statistics @@ -76,12 +78,12 @@ confusable_not_for: Not for Azure HDInsight (use azure-hdinsight), Azure Databri ### Updated Pages -- [Use Azure Spot VMs](https://learn.microsoft.com/en-us/azure/batch/batch-spot-vms) - - Updated: 2026-02-24T18:11:00.000Z → 2026-04-14T22:21:00.000Z - -### Deleted Pages - -- ~~Low-priority virtual machines~~ (https://learn.microsoft.com/en-us/azure/batch/low-priority-vms-retirement-migration-guide) +- [Use simplified compute node communication](https://learn.microsoft.com/en-us/azure/batch/simplified-compute-node-communication) + - Updated: 2026-01-12T08:00:00.000Z → 2026-04-22T17:34:00.000Z +- [Create a simplified node 
communication pool without public IP addresses](https://learn.microsoft.com/en-us/azure/batch/simplified-node-communication-pool-no-public-ip) + - Updated: 2026-01-12T08:00:00.000Z → 2026-04-21T08:00:00.000Z +- [Classic compute node communication model](https://learn.microsoft.com/en-us/azure/batch/batch-pools-to-simplified-compute-node-communication-model-migration-guide) + - Updated: 2025-04-02T08:00:00.000Z → 2026-04-22T17:34:00.000Z ## Classified Pages @@ -93,11 +95,9 @@ confusable_not_for: Not for Azure HDInsight (use azure-hdinsight), Azure Databri | [Best practices](https://learn.microsoft.com/en-us/azure/batch/best-practices) | best-practices | 0.85 | Explicit best-practices article with product-specific tips to enhance performance and avoid design pitfalls in Batch solutions. | | [Configure public network access with Batch accounts](https://learn.microsoft.com/en-us/azure/batch/public-network-access) | security | 0.85 | Security/network configuration article with specific behavior (e.g., max 200 IP rules per endpoint) and guidance on IP network rules and Private Link. | | [Checking for pool and node errors](https://learn.microsoft.com/en-us/azure/batch/batch-pool-node-error-checking) | troubleshooting | 0.80 | Organized around detecting and avoiding background pool/node failures with Batch-specific error patterns and mitigation steps. | -| [Classic compute node communication model](https://learn.microsoft.com/en-us/azure/batch/batch-pools-to-simplified-compute-node-communication-model-migration-guide) | decision-making | 0.80 | Migration guide for moving to the simplified communication model with explicit retirement date; provides product-specific migration considerations. | | [Configure Container Data Isolation Task](https://learn.microsoft.com/en-us/azure/batch/batch-container-isolation-task) | configuration | 0.80 | Explains Batch-specific isolation settings to control which AZ_BATCH_NODE_ROOT_DIR data paths are mounted into containers. 
| | [Configure customer-managed keys](https://learn.microsoft.com/en-us/azure/batch/batch-customer-managed-key) | security | 0.80 | Provides concrete configuration guidance for using Key Vault and managed identities with Batch customer-managed keys, including required identity types and key setup. | | [Configure managed identities](https://learn.microsoft.com/en-us/azure/batch/managed-identity-pools) | security | 0.80 | Details Batch-specific Identity property usage to attach user-assigned managed identities and obtain tokens for Azure resources. | -| [Create a simplified node communication pool without public IP addresses](https://learn.microsoft.com/en-us/azure/batch/simplified-node-communication-pool-no-public-ip) | configuration | 0.80 | Describes Batch-specific requirements (simplified communication, region support) and configuration for pools without public IP addresses. | | [Enable certificate rotation](https://learn.microsoft.com/en-us/azure/batch/automatic-certificate-rotation) | security | 0.80 | Explains how to use user-assigned managed identities and Key Vault to automatically renew certificates in Batch pools. | | [Microsoft Entra ID with Batch Management](https://learn.microsoft.com/en-us/azure/batch/batch-aad-auth-management) | security | 0.80 | Provides concrete guidance for authenticating Batch Management operations via MSAL/Entra ID with product-specific scopes and flows. | | [Microsoft Entra ID with Batch service](https://learn.microsoft.com/en-us/azure/batch/batch-aad-auth) | security | 0.80 | Details Batch-specific Entra ID auth patterns (integrated vs service principal) and how they apply to Batch service APIs. 
| @@ -120,6 +120,7 @@ confusable_not_for: Not for Azure HDInsight (use azure-hdinsight), Azure Databri | [Create a CI/CD pipeline for Batch](https://learn.microsoft.com/en-us/azure/batch/batch-ci-cd) | deployment | 0.70 | Provides product-specific CI/CD patterns using Azure Pipelines and ARM templates to deploy Batch-based HPC environments. | | [Create a pool with Azure Compute Gallery](https://learn.microsoft.com/en-us/azure/batch/batch-sig-images) | configuration | 0.70 | Explains how to configure Batch pools to use Compute Gallery images with product-specific image reference settings. | | [Create a pool with a managed image resource](https://learn.microsoft.com/en-us/azure/batch/batch-custom-images) | decision-making | 0.70 | Covers managed image vs Compute Gallery usage, API version constraints, and retirement timelines, guiding migration and image selection decisions. | +| [Create a simplified node communication pool without public IP addresses](https://learn.microsoft.com/en-us/azure/batch/simplified-node-communication-pool-no-public-ip) | configuration | 0.70 | The page explains how to create a Batch pool using simplified node communication without public IPs, including region-specific availability and required configuration steps. It is focused on concrete, product-specific configuration of pool networking rather than conceptual guidance, so it fits the configuration sub-skill type. | | [Create resource files](https://learn.microsoft.com/en-us/azure/batch/resource-files) | configuration | 0.70 | Explains Batch-specific resource file configuration from various sources and how they are placed on VMs for different task types. | | [Job preparation and completion tasks](https://learn.microsoft.com/en-us/azure/batch/batch-job-prep-release) | configuration | 0.70 | Describes Batch-specific job-level prep and release task configuration to manage data movement and cleanup on nodes. 
| | [MPI](https://learn.microsoft.com/en-us/azure/batch/batch-mpi) | best-practices | 0.70 | Describes how to configure multi-instance tasks for MPI applications using Batch .NET, including Batch-specific task settings and coordination patterns. | @@ -148,8 +149,9 @@ confusable_not_for: Not for Azure HDInsight (use azure-hdinsight), Azure Databri | [Update pool properties](https://learn.microsoft.com/en-us/azure/batch/batch-pool-update-properties) | configuration | 0.70 | Details which Batch pool properties are mutable vs immutable and how to patch them, which is product-specific configuration knowledge. | | [Use extensions with pools](https://learn.microsoft.com/en-us/azure/batch/create-pool-extensions) | configuration | 0.70 | Describes how to select, configure, and monitor VM extensions on Batch nodes with product-specific extension handling. | | [Use private endpoints with Batch accounts](https://learn.microsoft.com/en-us/azure/batch/private-connectivity) | security | 0.70 | Describes product-specific network security configuration for Batch using Private Link and private endpoints, including required subnet and access behavior details. | -| [Use simplified compute node communication](https://learn.microsoft.com/en-us/azure/batch/simplified-compute-node-communication) | configuration | 0.70 | Describes Batch-specific communication modes and required network configuration (NSGs, rules) for simplified mode. | +| [Use simplified compute node communication](https://learn.microsoft.com/en-us/azure/batch/simplified-compute-node-communication) | configuration | 0.70 | The article describes how to enable the simplified compute node communication mode and its specific network configuration requirements. It includes product-specific settings (communication modes, required network configuration, and retirement date of classic mode) that are not generic knowledge. 
This aligns best with configuration, as it focuses on how to configure Batch pools and networking for this mode rather than general concepts or limits. | | [User accounts for running tasks](https://learn.microsoft.com/en-us/azure/batch/batch-user-accounts) | configuration | 0.70 | Describes concrete Batch-specific user account types and how to configure them for tasks, including properties/flags that control elevation and isolation. These are product-specific configuration details beyond generic OS accounts. | +| [Classic compute node communication model](https://learn.microsoft.com/en-us/azure/batch/batch-pools-to-simplified-compute-node-communication-model-migration-guide) | decision-making | 0.68 | The migration guide is time-bound to a specific retirement date and provides product-specific guidance on when and how to move from the classic to the simplified compute node communication model. It includes concrete migration considerations, feature support differences, and recommendations for planning and executing the transition, which helps users decide migration timing and approach between two service models. This goes beyond conceptual overview and constitutes expert, product-specific decision guidance. | | [Use Azure Spot VMs](https://learn.microsoft.com/en-us/azure/batch/batch-spot-vms) | decision-making | 0.68 | The page discusses trade-offs of using Spot VMs for Azure Batch, including when workloads are suitable or not due to eviction risk and surplus capacity variability. It provides product-specific guidance on choosing Spot vs regular VMs for different workload characteristics, which aligns with decision-making criteria, even though it may not include strict numeric limits. | | [Azure Policy built-ins](https://learn.microsoft.com/en-us/azure/batch/policy-reference) | security | 0.65 | Indexes Batch-specific Azure Policy built-in definitions, which are concrete governance/security controls and policy names unique to this service. 
| | [Batch analytics](https://learn.microsoft.com/en-us/azure/batch/batch-analytics) | configuration | 0.65 | Provides reference information for Batch analytics events and alerts, including event schemas and categories, which are service-specific monitoring/diagnostic details. | diff --git a/products/azure-blob-storage/azure-blob-storage.csv b/products/azure-blob-storage/azure-blob-storage.csv index 473cc446..6d30a405 100644 --- a/products/azure-blob-storage/azure-blob-storage.csv +++ b/products/azure-blob-storage/azure-blob-storage.csv @@ -28,7 +28,7 @@ https://learn.microsoft.com/en-us/azure/storage-actions/storage-tasks/storage-ta https://learn.microsoft.com/en-us/azure/storage-discovery/create-workspace,Create a Storage Discovery workspace,Create and manage an Azure Storage Discovery Workspace - Azure Storage Discovery,Configure Azure Storage Discovery workspace scopes and reporting,Learn how to create an Azure Storage Discovery Workspace.,"To deploy the Azure Storage Discovery service, you need to create a Discovery workspace resource in one of your resource groups. With this resource, you define which storage resources you want to cover across your Microsoft Entra tenant and how you want to segment reporting for them. The workspace offers prebuilt reports in the Azure portal that you can use to retrieve the insights you need about your storage resources. 
Follow the steps in this article to create an Azure Storage Discovery worksp",2025-10-10T22:10:00.000Z,how-to,configuration,0.7,True,Describes workspace resource and how to define covered storage resources and segmentation; these are product-specific configuration options.,unchanged https://learn.microsoft.com/en-us/azure/storage-discovery/create-workspace-bicep,Bicep template for creating a workspace,Bicep template for creating an Azure Storage Discovery workspace - Azure Storage Discovery,Deploy a Storage Discovery workspace using Bicep,Learn how to create an Azure Storage Discovery workspace using a bicep template.,"This quickstart shows you how to use a Bicep file to deploy a Storage Discovery workspace in Azure. Bicep is a domain-specific language (DSL) that uses declarative syntax to deploy Azure resources. It provides concise syntax, reliable type safety, and support for code reuse. Bicep offers the best authoring experience for your infrastructure-as-code solutions in Azure.",2025-10-16T05:16:00.000Z,quickstart-bicep,configuration,0.7,True,"Bicep template quickstart that likely includes parameter names and allowed values for the workspace resource, which are configuration details.",unchanged https://learn.microsoft.com/en-us/azure/storage-discovery/deployment-planning,Planning for deployment,Planning for an Azure Storage Discovery deployment - Azure Storage Discovery,Apply deployment best practices for Azure Storage Discovery,Considerations and best-practices for deploying the Azure Storage Discovery service,"Before you continue, be sure to get an overview of the Storage Discovery service and the value it can provide to you.",2025-10-10T22:10:00.000Z,overview,best-practices,0.65,True,Deployment planning article with considerations and best practices specific to Storage Discovery deployment.,unchanged -https://learn.microsoft.com/en-us/azure/storage-discovery/egress-insights,Understand egress insights,Understand egress insights in Azure Storage Discovery - 
Azure Storage Discovery,,Learn how to use Azure Storage Discovery to gain visibility into egress patterns across your Azure Blob Storage estate.,"Understanding the data egress - the transfer of data out of your storage accounts - can reveal cost and safety insights. Understanding where egress occurs, how it trends over time, and which storage accounts contribute the most helps you make informed decisions about cost optimization and architecture improvements. Azure Storage Discovery surfaces egress insights as part of the Activity report in your Discovery workspace. These insights give you enterprise-wide visibility into egress patterns with",2026-04-14T11:11:00.000Z,conceptual,,0.2,False,"Page appears to describe how to view and interpret egress insights and patterns in Azure Storage Discovery, but the summary does not indicate presence of numeric limits, configuration parameter tables, error-code-based troubleshooting, or concrete decision matrices. It reads as conceptual/analytical guidance about understanding egress rather than product-specific expert details that match any sub-skill type.",new +https://learn.microsoft.com/en-us/azure/storage-discovery/egress-insights,Understand egress insights,Understand egress insights in Azure Storage Discovery - Azure Storage Discovery,,Learn how to use Azure Storage Discovery to gain visibility into egress patterns across your Azure Blob Storage estate.,"Understanding the data egress - the transfer of data out of your storage accounts - can reveal cost and safety insights. Understanding where egress occurs, how it trends over time, and which storage accounts contribute the most helps you make informed decisions about cost optimization and architecture improvements. Azure Storage Discovery surfaces egress insights as part of the Activity report in your Discovery workspace. 
These insights give you enterprise-wide visibility into egress patterns with",2026-04-14T11:11:00.000Z,conceptual,,0.2,False,"Page appears to describe how to view and interpret egress insights and patterns in Azure Storage Discovery, but the summary does not indicate presence of numeric limits, configuration parameter tables, error-code-based troubleshooting, or concrete decision matrices. It reads as conceptual/analytical guidance about understanding egress rather than product-specific expert details that match any sub-skill type.",unchanged https://learn.microsoft.com/en-us/azure/storage-discovery/frequently-asked-questions,Frequently Asked Questions,Frequently asked questions for the Azure Storage Discovery service,Review Azure Storage Discovery FAQs and service limits,Find answers for frequently asked questions about Azure Storage Discovery.,"In this article, learn about frequently asked questions and answers for the Azure Storage Discovery service. Insights aggregation often completes within a few hours but can also take more than a day. Activity, Security, and Consumption reports show insights only for Standard pricing plan and not for Free plan. Verify your workspace's pricing plan and upgrade if needed. Discovery workspace has a default limit of 10 scopes per workspace. Support team may be contacted with a request to increase thi",2025-11-21T06:02:00.000Z,release-notes,limits-quotas,0.75,True,FAQ includes concrete behaviors and at least one explicit numeric limit (default 10 scopes per workspace) and plan-specific feature availability.,unchanged https://learn.microsoft.com/en-us/azure/storage-discovery/get-started-reports,Get Started on Storage Discovery reports,Understand the common structure of Azure Storage Discovery reports. 
- Azure Storage Discovery,,Learn how insights are presented in Azure Storage Discovery reports.,"Azure Storage Discovery reports organize insights into distinct categories that help you understand and manage your Azure Blob Storage estate: Capacity, Activity, and Errors reports use interactive charts to present data in a unified format, making it easier to explore trends, derive insights, and take action to optimize your storage estate. Storage Discovery reports include top-level filters that help you focus on specific aspects of your storage estate. These filters apply across all reports and ",2026-04-01T22:41:00.000Z,overview,,0.2,False,"Page describes the structure and use of Azure Storage Discovery reports (capacity, activity, errors) and top-level filters. It appears to be a conceptual/UX overview of how insights are presented, without specific limits, configuration parameter tables, error-code mappings, or other product-specific numeric thresholds or settings.",unchanged https://learn.microsoft.com/en-us/azure/storage-discovery/get-storage-estate-insights,Deploy Storage Discovery and access insights,Get started with Azure Storage Discovery to unlock insights about your Azure Storage estate,,Learn how to deploy a Storage Discovery workspace resource and access insights about a portion of the Azure Storage estate that's interesting to you.,"This article walks you through the basic steps for using Azure Storage Discovery you can take to unlock deep insights about your Azure Storage estate. 
In this tutorial, you: If you don't have an Azure subscription, create a free account before you begin.",2025-10-10T06:02:00.000Z,tutorial,,0.35,False,"Getting-started tutorial; focuses on basic deployment and usage, not deep configuration or limits.",unchanged @@ -41,7 +41,7 @@ https://learn.microsoft.com/en-us/azure/storage-discovery/resource-move,Move res https://learn.microsoft.com/en-us/azure/storage-mover/agent-deploy,Deploy a storage mover agent,How to deploy an Azure Storage Mover agent.,Deploy and configure Azure Storage Mover agents,Learn how to deploy an Azure Mover agent,"The Azure Storage Mover service utilizes agents to perform the migration jobs you configure in the service. An agent is a virtual machine-based migration appliance that runs on a virtualization host. Ideally, your virtualization host is located as near as possible to the source storage to be migrated. Storage Mover can support multiple agents. Because an agent is essentially a migration appliance, you interact with it through an agent-local administrative shell. The shell limits the operations y",2025-05-29T08:00:00.000Z,how-to,configuration,0.7,True,"Agent deployment for a VM-based appliance typically includes specific VM sizing, OS, ports, and agent-local shell commands; these are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/storage-mover/agent-deploy,Unregister an agent,How to deploy an Azure Storage Mover agent.,Deploy and configure Azure Storage Mover agents,Learn how to deploy an Azure Mover agent,"The Azure Storage Mover service utilizes agents to perform the migration jobs you configure in the service. An agent is a virtual machine-based migration appliance that runs on a virtualization host. Ideally, your virtualization host is located as near as possible to the source storage to be migrated. Storage Mover can support multiple agents. 
Because an agent is essentially a migration appliance, you interact with it through an agent-local administrative shell. The shell limits the operations y",2025-05-29T08:00:00.000Z,how-to,configuration,0.7,True,"Duplicate of index 8; agent deployment details are product-specific configuration (VM requirements, shell commands, connectivity).",unchanged https://learn.microsoft.com/en-us/azure/storage-mover/agent-register,Register an agent,How to register an Azure Storage Mover agent,Securely register Azure Storage Mover agents,Learn about agent VM registration to run your migration jobs.,"The Azure Storage Mover service utilizes agents that carry out the migration jobs you configure in the service. The agent is a virtual machine-based appliance that you run on a virtualization host, close to the source storage. You need to register an agent to create a trust relationship with your Storage Mover resource. This trust enables your agent to securely receive migration jobs and report progress. Agent registration can occur over either the public or private endpoint of your Storage Move",2025-04-09T22:03:00.000Z,how-to,security,0.65,True,"Agent registration establishes trust with the Storage Mover resource and can occur over public or private endpoints; likely includes identity, trust, and endpoint configuration details specific to this service.",unchanged -https://learn.microsoft.com/en-us/azure/storage-mover/azure-to-azure-migration,Transfer from Azure Blob to Blob,Get started with azureblobcontainer-to-azureblobcontainer migration in Azure Storage Mover,,The Blob-to-Blob Migration feature in Azure Storage mover allows you to securely transfer data from a Blob container in Azure Storage to another Blob container in any storage account in a global Azure,"The Azure-to-Azure Migration feature in Azure Storage Mover allows you to securely transfer large datasets between Blob containers across different Azure storage accounts and regions. 
This article guides you through the complete process of configuring Storage Mover to migrate your data between two Blob containers. The process consists of configuring endpoints, and creating and running a migration job.",2026-04-16T06:12:00.000Z,quickstart,,0.2,False,"The page is a step-by-step getting-started guide for configuring an Azure Storage Mover Blob-to-Blob migration job. It describes creating endpoints and running a migration, but based on the summary it does not emphasize specific limits/quotas, configuration parameter tables, error-code-based troubleshooting, or other detailed product-specific settings that meet the expert-knowledge criteria. It appears to be primarily procedural tutorial content rather than a reference for limits, configuration, or troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/storage-mover/azure-to-azure-migration,Transfer from Azure Blob to Blob,Get started with azureblobcontainer-to-azureblobcontainer migration in Azure Storage Mover,,The Blob-to-Blob Migration feature in Azure Storage mover allows you to securely transfer data from a Blob container in Azure Storage to another Blob container in any storage account in a global Azure,"The Azure-to-Azure Migration feature in Azure Storage Mover allows you to securely transfer large datasets between Blob containers across different Azure storage accounts and regions. This article guides you through the complete process of configuring Storage Mover to migrate your data between two Blob containers. The process consists of configuring endpoints, and creating and running a migration job.",2026-04-16T06:12:00.000Z,quickstart,,0.2,False,"The page is a step-by-step getting-started guide for configuring an Azure Storage Mover Blob-to-Blob migration job. 
It describes creating endpoints and running a migration, but based on the summary it does not emphasize specific limits/quotas, configuration parameter tables, error-code-based troubleshooting, or other detailed product-specific settings that meet the expert-knowledge criteria. It appears to be primarily procedural tutorial content rather than a reference for limits, configuration, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/storage-mover/bandwidth-management,Configure bandwidth limits,How to schedule bandwidth limitations of a Storage Mover agent,Configure bandwidth schedules for Storage Mover agents,Learn how to set a bandwidth schedule that limits the use of the WAN link for a Storage Mover agent,"In this article, you learn how to set bandwidth management schedules for your Storage Mover agents. When migrating your files and folders to Azure, you need to carefully consider the upload bandwidth you want to make available to each of your Storage Mover agents. Other workloads may also depend on having sufficient bandwidth available. To make your Storage Mover agents a good neighbor to other workloads in your network, you can schedule limits for each agent.",2024-07-10T08:00:00.000Z,how-to,configuration,0.75,True,"Bandwidth management schedules imply specific configuration options (time windows, bandwidth caps per agent); these are product-specific settings and ranges.",unchanged https://learn.microsoft.com/en-us/azure/storage-mover/billing,Migration costs,Understanding Azure Storage Mover billing,Estimate and understand Azure Storage Mover migration costs,Understanding Azure Storage Mover billing,Azure Storage Mover facilitates the migration of unstructured data into Azure. 
This article provides insight into the categories of costs that may apply to your migration scenarios.,2023-04-07T00:00:00.000Z,conceptual,decision-making,0.6,True,"Billing article that explains cost categories and how different migration scenarios affect charges, supporting cost-related decision-making.",unchanged https://learn.microsoft.com/en-us/azure/storage-mover/cloud-to-cloud-migration,Migrate from Amazon S3 to Azure,Get started with cloud-to-cloud migration in Azure Storage Mover,Configure S3 to Azure Blob cloud-to-cloud migration,"The Cloud-to-Cloud Migration feature in Azure Storage mover allows you to securely transfer data from Amazon Simple Storage Service (Amazon S3) to Azure Blob Storage, utilizing Azure Arc for AWS (Amaz","The Cloud-to-Cloud Migration feature in Azure Storage Mover allows you to securely transfer data from Amazon Simple Storage Service (Amazon S3) to Azure Blob Storage. The feature utilizes Azure Arc multicloud connectors for AWS (Amazon Web Services) to simplify authentication and resource management capabilities to resources outside of the Azure cloud. These capabilities and resources can include on-premises servers, multicloud environments, and edge computing devices. 
For more details on Azure ",2025-11-05T12:23:00.000Z,quickstart,integrations,0.7,True,"Covers Cloud-to-Cloud Migration using Azure Arc multicloud connectors for AWS; likely includes product-specific parameters, authentication settings, and integration patterns between S3, Arc, and Storage Mover.",unchanged @@ -65,8 +65,8 @@ https://learn.microsoft.com/en-us/azure/storage-mover/storage-mover-create,Creat https://learn.microsoft.com/en-us/azure/storage-mover/troubleshooting,Collect a support bundle,"Create, retrieve, and view the support bundle from an Azure Storage Mover agent",Generate and analyze Azure Storage Mover support bundles,"Learn how to create, retrieve, and view the support bundle from the Azure Storage Mover Agent.","Your organization's migration project utilizes the Azure Storage Mover to do the bulk of the migration-specific work. An unexpected issue within one of the components has the potential to bring the migration to a standstill. Storage Mover agents are capable of generating a support bundle to help resolve such issues. This article helps you through the process of creating the support bundle on the agent, retrieving the compressed log bundle, and accessing the logs it contains. This article assumes",2023-08-08T11:20:00.000Z,how-to,troubleshooting,0.7,True,"Explains how to create, retrieve, and view support bundles and logs for issue resolution; contains product-specific diagnostic procedures and log locations.",unchanged https://learn.microsoft.com/en-us/azure/storage/blobs/access-tiers-best-practices,Best practices,Best practices for using blob access tiers - Azure Storage,Best practices for choosing and managing blob access tiers,Learn about best practice guidelines that help you use access tiers to optimize performance and reduce costs.,"This article provides best practice guidelines that help you use access tiers to optimize performance and reduce costs. 
To learn more about access tiers, see Access tiers for blob data.",2025-11-18T17:01:00.000Z,concept-article,best-practices,0.65,True,"Explicit best-practices article; likely includes concrete tier-selection patterns, lifecycle rules, and product-specific gotchas around rehydration and costs.",unchanged https://learn.microsoft.com/en-us/azure/storage/blobs/access-tiers-online-manage,Change a blob's access tier,Set a blob's access tier - Azure Storage,,"Learn how to specify a blob's access tier when you upload it, or how to change the access tier for an existing blob.","You can set a blob's access tier in any of the following ways: By setting the account default access tier setting for the storage account. Blobs in the account inherit this access tier unless you explicitly override the setting for an individual blob. By explicitly setting a blob's tier on upload. You can create a blob in the hot, cool, cold, or archive tier. By changing an existing blob's tier with a Set Blob Tier operation. Typically, you would use this operation to move from a hotter tier to ",2025-11-18T17:01:00.000Z,how-to,,0.4,False,How-to for setting a blob’s access tier; likely procedural without comprehensive configuration tables or numeric ranges beyond what’s in general docs.,unchanged -https://learn.microsoft.com/en-us/azure/storage/blobs/access-tiers-overview,Access tiers,Access tiers for blob data - Azure Storage,,"Azure storage offers different access tiers so that you can store your blob data in the most cost-effective manner based on how it's being used. Learn about the hot, cool, cold, and archive access tie","Data stored in the cloud grows at an exponential pace. To manage costs for your expanding storage needs, it can be helpful to organize your data based on how frequently it will be accessed and how long it will be retained. Azure storage offers different access tiers so that you can store your blob data in the most cost-effective manner based on how it's being used. 
Azure Storage access tiers include: Azure storage capacity limits are set at the account level, rather than according to access tier",2026-04-13T22:10:00.000Z,concept-article,,0.2,False,"Primarily an overview of access tiers (hot, cool, cold, archive) and cost/usage concepts; while it may mention that capacity limits are at the account level, it does not focus on detailed numeric limits, configuration parameters, or decision matrices with quantified thresholds.",updated -https://learn.microsoft.com/en-us/azure/storage/blobs/access-tiers-smart,Smart tiering,Optimize Azure Blob Storage costs with smart tier - Azure Storage,Optimize Blob Storage costs with smart tier thresholds,"Optimize your Azure Blob Storage costs with smart tier, automatically moving data between access tiers based on usage patterns.","Smart tier automatically moves your data between the hot, cool, and cold access tiers based on usage patterns, optimizing your costs for these access tiers without setting up supplementary rulesets or policies. Smart tier is ideal for storing data on standard online tiers when access patterns are unclear and you don’t want to manage transitions. By default, new data is stored in the hot tier. Any object that isn't accessed for 30 days is moved to the cool tier; after 90 days of inactivity, it transition",2026-04-13T22:10:00.000Z,how-to,limits-quotas,0.7,True,"Describes smart tier behavior with specific inactivity thresholds (for example, 30 and 90 days) that trigger automatic movement between hot, cool, and cold tiers; these are concrete numeric rules governing the feature’s operation, fitting the limits/quotas category.",updated +https://learn.microsoft.com/en-us/azure/storage/blobs/access-tiers-overview,Access tiers,Access tiers for blob data - Azure Storage,,"Azure storage offers different access tiers so that you can store your blob data in the most cost-effective manner based on how it's being used. 
Learn about the hot, cool, cold, and archive access tie","Data stored in the cloud grows at an exponential pace. To manage costs for your expanding storage needs, it can be helpful to organize your data based on how frequently it will be accessed and how long it will be retained. Azure storage offers different access tiers so that you can store your blob data in the most cost-effective manner based on how it's being used. Azure Storage access tiers include: Azure storage capacity limits are set at the account level, rather than according to access tier",2026-04-13T22:10:00.000Z,concept-article,,0.2,False,"Primarily an overview of access tiers (hot, cool, cold, archive) and cost/usage concepts; while it may mention that capacity limits are at the account level, it does not focus on detailed numeric limits, configuration parameters, or decision matrices with quantified thresholds.",unchanged
+https://learn.microsoft.com/en-us/azure/storage/blobs/access-tiers-smart,Smart tiering,Optimize Azure Blob Storage costs with smart tier - Azure Storage,Optimize Blob Storage costs with smart tier thresholds,"Optimize your Azure Blob Storage costs with smart tier, automatically moving data between access tiers based on usage patterns.","Smart tier automatically moves your data between the hot, cool, and cold access tiers based on usage patterns, optimizing your costs for these access tiers without setting up supplementary rulesets or policies. Smart tier is ideal for storing data on standard online tiers when access patterns are unclear and you don’t want to manage transitions. By default, new data is stored in the hot tier. 
Any object that isn't accessed for 30 days is moved to the cool tier; after 90 days of inactivity, it transition",2026-04-13T22:10:00.000Z,how-to,limits-quotas,0.7,True,"Describes smart tier behavior with specific inactivity thresholds (for example, 30 and 90 days) that trigger automatic movement between hot, cool, and cold tiers; these are concrete numeric rules governing the feature’s operation, fitting the limits/quotas category.",unchanged https://learn.microsoft.com/en-us/azure/storage/blobs/anonymous-read-access-configure,Configure anonymous access,Configure anonymous read access for containers and blobs - Azure Storage,Configure anonymous read access for Blob containers,Learn how to allow or disallow anonymous access to blob data for the storage account. Set the container's anonymous access setting to make containers and blobs available for anonymous access.,"Azure Storage supports optional anonymous read access for containers and blobs. By default, anonymous access to your data is never permitted. Unless you explicitly enable anonymous access, all requests to a container and its blobs must be authorized. When you configure a container's access level setting to permit anonymous access, clients can read data in that container without authorizing the request. 
Warning When a container is configured for anonymous access, any client can read data in that ",2025-03-04T08:00:00.000Z,how-to,security,0.8,True,"Details account and container-level settings that control anonymous access, including access level options; product-specific security configuration.",unchanged https://learn.microsoft.com/en-us/azure/storage/blobs/anonymous-read-access-overview,Overview,Overview of remediating anonymous read access for blob data - Azure Storage,Remediate anonymous read access to Azure Blob data,Learn how to remediate anonymous read access to blob data for both Azure Resource Manager and classic storage accounts.,"Azure Storage supports optional anonymous read access for containers and blobs. By default, anonymous access to your data is never permitted. Unless you explicitly enable anonymous access, all requests to a container and its blobs must be authorized. We recommend that you disable anonymous access for all of your storage accounts. This article provides an overview of how to remediate anonymous access for your storage accounts. Warning Anonymous access presents a security risk. We recommend that y",2025-03-17T22:05:00.000Z,how-to,security,0.7,True,Gives concrete remediation steps and configuration paths for disabling anonymous blob access across account types; product-specific security hardening guidance.,unchanged https://learn.microsoft.com/en-us/azure/storage/blobs/anonymous-read-access-prevent,Remediate anonymous access,Remediate anonymous read access to blob data (Azure Resource Manager deployments) - Azure Storage,Prevent anonymous read access for ARM-based Blob accounts,Learn how to analyze current anonymous requests against a storage account and how to prevent anonymous access for the entire storage account or for an individual container.,"Azure Blob Storage supports optional anonymous read access to containers and blobs. However, anonymous access may present a security risk. 
We recommend that you disable anonymous access for optimal security. Disallowing anonymous access helps to prevent data breaches caused by undesired anonymous access. By default, anonymous access to your blob data is always prohibited. The default configuration for an Azure Resource Manager storage account prohibits users from configuring anonymous access to ",2025-03-04T08:00:00.000Z,how-to,security,0.82,True,Explains how to analyze anonymous requests and configure account/container settings to block them; includes specific security configuration behaviors for ARM accounts.,unchanged @@ -210,7 +210,7 @@ https://learn.microsoft.com/en-us/azure/storage/blobs/sas-service-create-dotnet, https://learn.microsoft.com/en-us/azure/storage/blobs/sas-service-create-java,Authorize using a service SAS,Create a service SAS for a container or blob with Java - Azure Storage,Create service SAS for containers and blobs in Java,Learn how to create a service shared access signature (SAS) for a container or blob using the Azure Blob Storage client library for Java.,"A shared access signature (SAS) enables you to grant limited access to containers and blobs in your storage account. When you create a SAS, you specify its constraints, including which Azure Storage resources a client is allowed to access, what permissions they have on those resources, and how long the SAS is valid. Every SAS is signed with a key. You can sign a SAS in one of two ways: Note A user delegation SAS offers superior security to a SAS that is signed with the storage account key. 
Micro",2024-09-06T22:05:00.000Z,how-to,security,0.7,True,"Provides Java client patterns and parameters for service SAS creation, including permissions and scope configuration.",unchanged https://learn.microsoft.com/en-us/azure/storage/blobs/sas-service-create-javascript,Service SAS,Create a service SAS for a container or blob with JavaScript - Azure Storage,Create service SAS for blobs with JavaScript,Learn how to create a service shared access signature (SAS) for a container or blob using the Azure Blob Storage client library for JavaScript.,"A shared access signature (SAS) enables you to grant limited access to containers and blobs in your storage account. When you create a SAS, you specify its constraints, including which Azure Storage resources a client is allowed to access, what permissions they have on those resources, and how long the SAS is valid. Every SAS is signed with a key. You can sign a SAS in one of two ways: Note A user delegation SAS offers superior security to a SAS that is signed with the storage account key. Micro",2024-08-05T08:00:00.000Z,how-to,security,0.7,True,"Provides JavaScript client patterns and parameters for service SAS creation, including permissions and constraints.",unchanged https://learn.microsoft.com/en-us/azure/storage/blobs/sas-service-create-python,Authorize using a service SAS,Create a service SAS for a container or blob with Python - Azure Storage,Create service SAS for Azure blobs using Python,Learn how to create a service shared access signature (SAS) for a container or blob using the Azure Blob Storage client library for Python.,"A shared access signature (SAS) enables you to grant limited access to containers and blobs in your storage account. When you create a SAS, you specify its constraints, including which Azure Storage resources a client is allowed to access, what permissions they have on those resources, and how long the SAS is valid. Every SAS is signed with a key. 
You can sign a SAS in one of two ways: Note A user delegation SAS offers superior security to a SAS that is signed with the storage account key. Micro",2024-09-06T22:05:00.000Z,how-to,security,0.8,True,"Shows Python client methods and parameter sets for service SAS (resources, permissions, validity), which are detailed security configuration patterns.",unchanged -https://learn.microsoft.com/en-us/azure/storage/blobs/scalability-targets,Blob storage,Scalability and performance targets for Blob storage - Azure Storage,Azure Blob Storage scalability and performance limits,Learn about scalability and performance targets for Blob storage.,"This reference details scalability and performance targets for Azure Storage. The scalability and performance targets listed here are high-end targets, but are achievable. In all cases, the request rate and bandwidth achieved by your storage account depends upon the size of objects stored, the access patterns utilized, and the type of workload your application performs. Make sure to test your service to determine whether its performance meets your requirements. If possible, avoid sudden spikes i",2025-06-26T22:19:00.000Z,concept-article,limits-quotas,0.95,True,"This reference page documents concrete scalability and performance targets (request rates, bandwidth, object counts) for Azure Blob Storage accounts, including specific numeric limits and constraints that are not inferable from general knowledge.",updated +https://learn.microsoft.com/en-us/azure/storage/blobs/scalability-targets,Blob storage,Scalability and performance targets for Blob storage - Azure Storage,Azure Blob Storage scalability and performance limits,Learn about scalability and performance targets for Blob storage.,"This reference details scalability and performance targets for Azure Storage. The scalability and performance targets listed here are high-end targets, but are achievable. 
In all cases, the request rate and bandwidth achieved by your storage account depends upon the size of objects stored, the access patterns utilized, and the type of workload your application performs. Make sure to test your service to determine whether its performance meets your requirements. If possible, avoid sudden spikes i",2025-06-26T22:19:00.000Z,concept-article,limits-quotas,0.95,True,"This reference page documents concrete scalability and performance targets (request rates, bandwidth, object counts) for Azure Blob Storage accounts, including specific numeric limits and constraints that are not inferable from general knowledge.",unchanged https://learn.microsoft.com/en-us/azure/storage/blobs/scalability-targets-premium-block-blobs,Premium block blob storage accounts,Scalability targets for premium block blob storage accounts - Azure Storage,Premium block blob storage scalability limits,"Learn about premium-performance block blob storage accounts. Block blob storage accounts are optimized for applications that use smaller, kilobyte-range objects.","A premium-performance block blob storage account is optimized for applications that use smaller, kilobyte-range objects. It's ideal for applications that require high transaction rates or consistent low-latency storage. Premium performance block blob storage is designed to scale with your applications. 
If your scenario requires that you deploy application(s) that require hundreds of thousands of requests per second or petabytes of storage capacity, contact Microsoft by submitting a support reque",2025-05-21T17:04:00.000Z,concept-article,limits-quotas,0.95,True,"Premium block blob scalability targets include specific maximum IOPS, throughput, capacity, and request rate numbers per account and per blob, which are numeric limits not inferable from general training.",unchanged https://learn.microsoft.com/en-us/azure/storage/blobs/scalability-targets-premium-page-blobs,Premium page blob storage accounts,Scalability targets for premium page blob storage accounts - Azure Storage,Premium page blob storage scalability limits,A premium performance page blob storage account is optimized for read/write operations. This type of storage account backs an unmanaged disk for an Azure virtual machine.,"This reference details scalability and performance targets for Azure Storage. The scalability and performance targets listed here are high-end targets, but are achievable. In all cases, the request rate and bandwidth achieved by your storage account depends upon the size of objects stored, the access patterns utilized, and the type of workload your application performs. Make sure to test your service to determine whether its performance meets your requirements. 
If possible, avoid sudden spikes i",2023-04-03T00:00:00.000Z,concept-article,limits-quotas,0.95,True,"Premium page blob scalability targets document exact IOPS, throughput, and capacity limits per disk/account in tabular form, matching limits-quotas criteria.",unchanged https://learn.microsoft.com/en-us/azure/storage/blobs/secure-file-transfer-protocol-host-keys,Host keys (SFTP) support,Host keys for SFTP support for Azure Blob Storage - Azure Storage,Validate SFTP host keys for Azure Blob Storage connections,Find a list of valid host keys when using an SFTP client to connect with Azure Blob Storage.,"This article contains a list of valid host keys used to connect to Azure Blob Storage from SFTP clients. Blob storage now supports the SSH File Transfer Protocol (SFTP). This support provides the ability to securely connect to Blob Storage via an SFTP endpoint, allowing you to leverage SFTP for file access, file transfer, as well as file management. For more information, seeSSH File Transfer Protocol (SFTP) support for Azure Blob Storage. When you connect to Blob Storage by using an SFTP client,",2026-02-02T18:12:00.000Z,reference,security,0.85,True,Contains a list of valid SSH host key fingerprints/types for Blob Storage SFTP endpoints. These are product-specific security parameters used to verify server identity.,unchanged @@ -399,13 +399,14 @@ https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-p https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/backup-archive-disaster-recovery/veeam/veeam-solution-guide,Veeam getting started guide,Azure Data Protection with Veeam - Azure Storage,Apply Veeam best practices with Azure Blob backups,"This article provides information for using Azure Blob storage with Veeam solutions, including details on how to get started and best practices.","Azure Blob Storage is compatible with many Veeam products, offering a cost-effective solution for data retention and recovery. 
Due to the wide range of features available in Azure, no single Veeam product supports all of them. To understand which features each product supports, refer to Veeam’s article on using object storage with Veeam Products. This article provides a comprehensive overview of Veeam’s backup and data protection solutions for Microsoft Azure environments. It assists IT professio",2025-09-07T08:00:00.000Z,concept-article,best-practices,0.65,True,Described as providing best practices and product-specific guidance for using Veeam with Azure Blob Storage; likely includes concrete recommendations and gotchas unique to this integration.,unchanged https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/container-solutions/partner-overview,Overview of validated partners,Storage container solution partners - Azure Storage,,List of Microsoft partner companies that build customer solutions for containers with Azure Storage,"This article highlights Microsoft partner solutions that enable automation, data protection, and storage management of container-based solutions at scale. Note While Azure Storage works closely with our partners, support for any partner solution is provided by the partner, not Azure Support. You will need to open a case with the partner's support organization and then, if necessary, open a case with Azure Support to troubleshoot Azure infrastructure events that may be connected to the issue.",2025-10-03T11:10:00.000Z,concept-article,,0.1,False,Container solution partner overview; primarily a list of partners without deep technical configuration or troubleshooting content.,unchanged https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/data-management/atempo-quick-start-guide,Atempo getting started guide,Migrate data to Azure with Atempo Miria,Configure Atempo Miria to migrate data into Azure Storage,Getting started guide to implement Atempo Miria infrastructure with Azure Storage. 
This article helps you integrate the Atempo Miria Infrastructure with Azure storage.,This document will provide assistance in getting started with configuring Atempo Miria to migrate data to Azure Storage.,2023-05-04T08:00:00.000Z,quickstart,configuration,0.65,True,"Getting started guide for configuring Atempo Miria with Azure Storage; likely includes specific connection parameters, options, and operational settings.",unchanged
-https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/data-management/azure-file-migration-program-solutions,Azure Storage Migration Program Solutions,Azure Storage Migration Program Details,,Overview of the Azure Storage Migration Program and how to use it,"In ourMigration tools comparisonpage, we highlighted cases where an Independent Software Vendor (ISV) partner solution is required or well-suited to meet specific source - target storage combinations. In this page, we share more details on the ISVs enrolled in our Storage Migration Program (SMP), how to engage with them, and answer common questions about the program.",2025-03-20T17:02:00.000Z,concept-article,,0.3,False,Program overview and engagement details for Storage Migration Program; more about process and partners than technical decision matrices or configuration.,unchanged
+https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/data-management/azure-file-migration-program-solutions,Azure Storage Migration Program Solutions,Azure Storage Migration Program Details,,Overview of the Azure Storage Migration Program and how to use it,"In our Migration tools comparison page, we highlighted cases where an Independent Software Vendor (ISV) partner solution is required or well-suited to meet specific source - target storage combinations. 
In this page, we share more details on the ISVs enrolled in our Storage Migration Program (SMP), how to engage with them, and answer common questions about the program.",2026-04-23T11:10:00.000Z,concept-article,,0.1,False,"Program overview for Azure Storage Migration Program and ISV engagement; does not expose detailed technical limits, configuration tables, troubleshooting mappings, or quantified decision criteria.",updated https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/data-management/cirrus-data-migration-guide,Cirrus Data getting started guide,Migrate your block data to Azure with Cirrus Data - Azure Storage,Configure Cirrus Migrate Cloud for Azure block data,Learn how Cirrus Migrate Cloud enables disk migration from an existing storage system or cloud to Azure. The original system operates during migration.,"Cirrus Migrate Cloud (CMC) enables disk migration from an existing storage system or cloud to Azure. Migration proceeds while the original system is still in operation. This article presents the methodology to successfully configure and execute the migration. The solution uses distributed Migration Agents that run on every host. The agents allow direct Host-to-Host connections. Each Host-to-Host migration is independent, which makes the solution infinitely scalable. There are no central bottlene",2022-06-13T17:03:00.000Z,how-to,configuration,0.7,True,"Described as a methodology to configure and execute migration using Cirrus Migration Agents and host-to-host connections. 
A migration guide for a specific validated partner typically includes product-specific settings, parameters, and step ordering that go beyond generic knowledge, fitting configuration best.",unchanged https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/data-management/dobimigrate-quick-start-guide,DobiMigrate getting started guide,Migrate your file data to Azure with Datadobi DobiMigrate - Azure Storage,Set up Datadobi DobiMigrate for Azure storage migration,"Provides getting started guide to implement Datadobi DobiMigrate, and migrate your data to Azure Files, Azure NetApp Files, or ISV NAS solution","This article helps you integrate the Datadobi DobiMigrate infrastructure with Azure storage. It includes prerequisites, considerations, implementation, and operational guidance. DobiMigrate enables file and object migrations between storage platforms. It migrates data from on-premises to Azure quickly, easily, and cost effectively.",2021-06-10T17:07:00.000Z,concept-article,configuration,0.7,True,"Getting started/quick start guide for integrating DobiMigrate with Azure storage; expected to include concrete configuration steps, parameters, and operational guidance specific to this tool.",unchanged https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/data-management/komprise-quick-start-guide,Komprise getting started guide,Analyze and migrate your file data to Azure with Komprise Intelligent Data Manager - Azure Storage,Configure Komprise Intelligent Data Manager for Azure migration,"Getting started guide to implement Komprise Intelligent Data Manager. This guide shows how to analyze your file infrastructure, and migrates your data to Azure Files, Azure NetApp Files, Azure Blob St","This article describes using Komprise Intelligent Data Management to identify and place the right data in the right Azure Storage Service. Moving data can be intimidating. 
There are often numerous challenges, beginning with identifying what to move, matching data value to proper storage class, then moving it promptly all while minimizing end-user impacts. Komprise makes it easy to move your data to Azure storage services like Azure Files, Azure NetApp Files, Azure Blob Storage or other ISV NAS ",2023-08-22T16:02:00.000Z,quickstart,configuration,0.7,True,Guide for analyzing and migrating data with Komprise to multiple Azure storage services; likely contains product-specific configuration flows and settings for tiering and migration.,unchanged https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/data-management/komprise-tiering-guide,Komprise tiering guide,Optimize file storage with Komprise Intelligent Tiering for Azure - Azure Storage,Implement Komprise Intelligent Tiering to Azure Blob,Getting started guide to implement Komprise Tiering. This guide shows how to move your data to Azure Blob storage using Komprise's Intelligent Tiering,"Businesses and public sector enterprises often retain file data for decades as it contains business value such as customer insights, genomic patterns, machine learning training data, and compliance information. Because of its large volume, variety and velocity of its growth, file data can become expensive to store, backup and manage. 
Most IT organizations spend 30% to 50% of their budgets on file data storage and backups, as shown in the 2023 Komprise State of Unstructured Data Management Report.",2023-10-31T05:41:00.000Z,quickstart,configuration,0.7,True,"Tiering guide focused on moving data to Azure Blob using Komprise; expected to detail tiering policies, thresholds, and configuration options unique to this integration.",unchanged https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/data-management/migration-tools-comparison,Migration tools comparison,Azure Storage migration tools comparison - Unstructured data,Choose Azure unstructured data migration tools,Basic functionality and comparison between tools used for migration of unstructured data,"The following comparison matrix shows basic functionality of different tools that can be used for migration of unstructured data. Tip Azure File Sync can be utilized for migrating data to Azure Files, even if you don't intend to use a hybrid solution for on-premises caching or syncing. This migration process is efficient and causes no downtime. To use Azure File Sync as a migration tool, simply deploy it and, after the migration is finished, remove the server endpoint. 
Ideally Azure File Sync woul",2025-07-02T08:00:00.000Z,concept-article,decision-making,0.8,True,Explicitly a comparison matrix of migration tools with criteria; supports tool selection decisions and likely includes capability differences and scenario-based recommendations.,unchanged -https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/data-management/partner-overview,Overview of validated partners,"Storage data governance, management, and migration partners - Azure Storage",,"List of Microsoft partner companies that build customer solutions for data governance, management, and migration with Azure Storage","This article highlights Microsoft partner companies integrated with Azure Storage that can improve your overall data management capabilities. These partner solutions can support storage assessment and reporting, platform-agnostic migration, replication, cloud tiering, or data governance. Note While Azure Storage works closely with our partners, support for any partner solution is provided by the partner, not Azure Support. You will need to open a case with the partner's support organization and ",2025-10-03T11:10:00.000Z,concept-article,,0.1,False,"Data governance/management/migration partner overview; mostly descriptive listing, not detailed technical guidance.",unchanged +https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/data-management/netapp-data-migrator-guide,NetApp Data Migrator getting started guide,Migrate data to Azure NetApp Files with NetApp Data Migrator,Deploy and operate NetApp Data Migrator to Azure NetApp Files,Migrate data to Azure NetApp Files with NetApp Data Migrator,"This article helps you deploy and use NetApp Data Migrator (NDM) to migrate unstructured file data to Azure NetApp Files (ANF). 
It outlines prerequisites, Azure-specific deployment steps, file-server configuration, migration planning, execution, and operational guidance for successful end-to-end data movement into ANF. NDM provides an intuitive UI and API-driven workflow that enables you to discover, analyze, and migrate NFS or SMB datasets from on-premises or third-party file servers into Azure",2026-04-23T11:10:00.000Z,concept-article,deployment,0.68,True,"The guide contains product-specific deployment and operational steps for NetApp Data Migrator into Azure NetApp Files, including Azure-specific deployment details, prerequisites, and migration execution/operations that go beyond generic migration concepts. It focuses on end-to-end data movement into ANF with concrete, tool-specific guidance, which fits best under deployment compared to other categories.",new +https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/data-management/partner-overview,Overview of validated partners,"Storage data governance, management, and migration partners - Azure Storage",,"List of Microsoft partner companies that build customer solutions for data governance, management, and migration with Azure Storage","This article highlights Microsoft partner companies integrated with Azure Storage that can improve your overall data management capabilities. These partner solutions can support storage assessment and reporting, platform-agnostic migration, replication, cloud tiering, or data governance. Note While Azure Storage works closely with our partners, support for any partner solution is provided by the partner, not Azure Support. 
You will need to open a case with the partner's support organization and ",2026-04-23T11:10:00.000Z,concept-article,,0.1,False,"Partner overview and marketing-style listing of data governance, management, and migration partners; no product-specific limits, configuration parameters, error codes, or decision matrices.",updated https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/data-management/storagex-quick-start-guide,StorageX getting started guide,Analyze and migrate your file data to Azure with Data Dynamics StorageX - Azure Storage,Deploy Data Dynamics StorageX for Azure file migrations,"Getting started guide to implement Data Dynamics StorageX. Guide shows how to migrate your data to Azure Files, Azure NetApp Files, Azure Blob Storage, or any available ISV NAS solution","This article helps you deploy Data Dynamics StorageX in Microsoft Azure. We introduce key concepts around how StorageX works, deployment prerequisites, installation process, and how-tos for operational guidance. For more in-depth information, visitData Dynamics Customer Portal. Data Dynamics StorageX is a Unified Unstructured Data Management platform that allows analyzing, managing, and moving data across heterogeneous storage environments. Basic capabilities are: You can find more functionaliti",2021-10-03T23:13:00.000Z,concept-article,configuration,0.7,True,"Deployment and operational guidance for StorageX in Azure; expected to include detailed configuration steps, prerequisites, and product-specific options for analyzing and moving data.",unchanged https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/primary-secondary-storage/isv-file-services,ISV file services,Considerations for running ISV file services in Azure - Azure Storage,Choose ISV file services options in Azure,Basic guidance for different ISV options on running file services in Azure,"Azure offers various options for storing file data. 
Azure native storage services, both first party and native ISV services, are: There are several articles that describe the differences and recommendations on selecting the native file service. You can learn more: Many independent software vendor (ISV) solutions can provide file services in Azure. This article addresses two topics: A full list of verified ISV solutions is available within the Azure Storage partners for primary and secondary storage",2025-07-31T11:10:00.000Z,concept-article,decision-making,0.65,True,"Article is explicitly about considerations and recommendations for running ISV file services versus native options. This is selection guidance between different approaches and services, which aligns with decision-making, and is likely to include scenario-based recommendations beyond generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/primary-secondary-storage/nasuni-deployment-guide,Nasuni getting started guide,Nasuni configuration guide for Microsoft Azure - Azure Storage,Deploy and configure Nasuni with Azure Blob Storage,Deployment guide for Nasuni and Azure Blob Storage,"Nasuni uses cost-effective Azure Blob object and an intelligent caching architecture to deliver high performance SMB and NFS file shares across multiple Azure regions and on-premises locations. With effortless scalability, up-to-the-minute recovery points, instant recoveries, real-time ransomware detection, zero-latency edge performance, remote/hybrid worker support, and more, Nasuni with Azure Blob is the enterprise-class solution for moving traditional file server and NAS workloads into the cl",2023-11-03T05:36:00.000Z,concept-article,deployment,0.7,True,"Labeled as a deployment guide for Nasuni and Azure Blob Storage. 
Such validated-partner deployment guides typically contain product-specific deployment steps, supported patterns, and constraints for using Nasuni with Azure, which qualifies as expert deployment/configuration knowledge.",unchanged diff --git a/products/azure-blob-storage/report.md b/products/azure-blob-storage/report.md index 8ccae88f..57222180 100644 --- a/products/azure-blob-storage/report.md +++ b/products/azure-blob-storage/report.md @@ -1,9 +1,9 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: - deployment: 'Guides for deploying and configuring Blob Storage: static website hosting - (CDN, GitHub Actions, Terraform), feature support, Data Lake enablement, and hybrid/migration - tools (Data Box, WANdisco, Nasuni, Tiger Bridge).' + deployment: 'Guides for deploying and integrating Blob Storage: static website hosting + (CDN, GitHub Actions, Terraform), feature enablement, and data migration from + HDFS/Hadoop and third‑party storage tools.' decision-making: 'Cost and architecture decisions for Blob Storage: pricing, cost estimation/optimization, tiers, protocols (NFS/SFTP/BlobFuse), data protection, connectivity, and migration/upgrade tool choices.' @@ -27,32 +27,32 @@ category_descriptions: containers/blobs, leases, tiers, tags, and events.' skill_description: Expert knowledge for Azure Blob Storage development including troubleshooting, best practices, decision making, limits & quotas, security, configuration, integrations - & coding patterns, and deployment. Use when using Data Lake features, NFS/SFTP/BlobFuse, - static website hosting, SAS/RBAC auth, or SDK-based blob operations, and other Azure - Blob Storage related development tasks. Not for Azure Files (use azure-files), Azure - Table Storage (use azure-table-storage), Azure Queue Storage (use azure-queue-storage), + & coding patterns, and deployment. 
Use when using Data Lake/NFS/SFTP, static website + hosting/CDN, SAS/RBAC auth, lifecycle/immutability, or SDK/BlobFuse APIs, and other + Azure Blob Storage related development tasks. Not for Azure Files (use azure-files), + Azure Queue Storage (use azure-queue-storage), Azure Table Storage (use azure-table-storage), Azure NetApp Files (use azure-netapp-files). -use_when: Use when using Data Lake features, NFS/SFTP/BlobFuse, static website hosting, - SAS/RBAC auth, or SDK-based blob operations, and other Azure Blob Storage related - development tasks. -confusable_not_for: Not for Azure Files (use azure-files), Azure Table Storage (use - azure-table-storage), Azure Queue Storage (use azure-queue-storage), Azure NetApp +use_when: Use when using Data Lake/NFS/SFTP, static website hosting/CDN, SAS/RBAC + auth, lifecycle/immutability, or SDK/BlobFuse APIs, and other Azure Blob Storage + related development tasks. +confusable_not_for: Not for Azure Files (use azure-files), Azure Queue Storage (use + azure-queue-storage), Azure Table Storage (use azure-table-storage), Azure NetApp Files (use azure-netapp-files). 
--- # Azure Blob Storage Crawl Report ## Summary -- **Total Pages**: 408 -- **Fetched**: 408 +- **Total Pages**: 409 +- **Fetched**: 409 - **Fetch Failed**: 0 -- **Classified**: 335 +- **Classified**: 336 - **Unclassified**: 73 ### Incremental Update - **New Pages**: 1 -- **Updated Pages**: 4 -- **Unchanged**: 403 +- **Updated Pages**: 2 +- **Unchanged**: 406 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-blob-storage/azure-blob-storage.csv` @@ -60,32 +60,28 @@ confusable_not_for: Not for Azure Files (use azure-files), Azure Table Storage ( | Type | Count | Percentage | |------|-------|------------| -| best-practices | 30 | 7.4% | +| best-practices | 30 | 7.3% | | configuration | 63 | 15.4% | | decision-making | 22 | 5.4% | -| deployment | 11 | 2.7% | -| integrations | 132 | 32.4% | +| deployment | 12 | 2.9% | +| integrations | 132 | 32.3% | | limits-quotas | 17 | 4.2% | | security | 54 | 13.2% | | troubleshooting | 6 | 1.5% | -| *(Unclassified)* | 73 | 17.9% | +| *(Unclassified)* | 73 | 17.8% | ## Changes ### New Pages -- [Understand egress insights](https://learn.microsoft.com/en-us/azure/storage-discovery/egress-insights) +- [NetApp Data Migrator getting started guide](https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/data-management/netapp-data-migrator-guide) ### Updated Pages -- [Transfer from Azure Blob to Blob](https://learn.microsoft.com/en-us/azure/storage-mover/azure-to-azure-migration) - - Updated: 2025-12-18T18:11:00.000Z → 2026-04-16T06:12:00.000Z -- [Blob storage](https://learn.microsoft.com/en-us/azure/storage/blobs/scalability-targets) - - Updated: 2023-09-12T11:19:00.000Z → 2025-06-26T22:19:00.000Z -- [Access tiers](https://learn.microsoft.com/en-us/azure/storage/blobs/access-tiers-overview) - - Updated: 2025-12-02T08:00:00.000Z → 2026-04-13T22:10:00.000Z -- [Smart tiering](https://learn.microsoft.com/en-us/azure/storage/blobs/access-tiers-smart) - - Updated: 
2026-04-07T22:12:00.000Z → 2026-04-13T22:10:00.000Z +- [Overview of validated partners](https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/data-management/partner-overview) + - Updated: 2025-10-03T11:10:00.000Z → 2026-04-23T11:10:00.000Z +- [Azure Storage Migration Program Solutions](https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/data-management/azure-file-migration-program-solutions) + - Updated: 2025-03-20T17:02:00.000Z → 2026-04-23T11:10:00.000Z ## Classified Pages @@ -352,6 +348,7 @@ confusable_not_for: Not for Azure Files (use azure-files), Azure Table Storage ( | [User delegation SAS](https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-create-user-delegation-sas-javascript) | security | 0.70 | Shows JavaScript APIs and required permissions to obtain user delegation keys and sign SAS tokens; security-focused configuration. | | [Version-level policies](https://learn.microsoft.com/en-us/azure/storage/blobs/immutable-version-level-worm-policies) | configuration | 0.70 | Describes version-scoped WORM policies at account/container/version levels; product-specific configuration semantics. | | [Versioning](https://learn.microsoft.com/en-us/azure/storage/blobs/versioning-overview) | configuration | 0.70 | Explains how Blob versioning maintains previous versions and how they can be accessed; product-specific feature semantics. | +| [NetApp Data Migrator getting started guide](https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/data-management/netapp-data-migrator-guide) | deployment | 0.68 | The guide contains product-specific deployment and operational steps for NetApp Data Migrator into Azure NetApp Files, including Azure-specific deployment details, prerequisites, and migration execution/operations that go beyond generic migration concepts. 
It focuses on end-to-end data movement into ANF with concrete, tool-specific guidance, which fits best under deployment compared to other categories. | | [Snapshots](https://learn.microsoft.com/en-us/azure/storage/blobs/snapshots-overview) | configuration | 0.68 | Describes snapshot semantics and billing specifics for Blob Storage; product-specific feature behavior beyond generic snapshot concepts. | | [.NET](https://learn.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-directory-file-acl-dotnet) | integrations | 0.65 | .NET client library methods and options for directory/file operations in HNS accounts are integration/coding patterns specific to this product. | | [Append data to an append blob](https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-append) | integrations | 0.65 | Shows how to construct and use append blobs via .NET APIs, including specific methods and usage constraints unique to this SDK. | @@ -459,7 +456,6 @@ confusable_not_for: Not for Azure Files (use azure-files), Azure Table Storage ( | [Deploy Storage Discovery and access insights](https://learn.microsoft.com/en-us/azure/storage-discovery/get-storage-estate-insights) | 0.35 | Getting-started tutorial; focuses on basic deployment and usage, not deep configuration or limits. | | [ARM](https://learn.microsoft.com/en-us/azure/storage-actions/storage-tasks/storage-task-quickstart-arm) | 0.30 | ARM template quickstart; shows one deployment example rather than full configuration reference. | | [Azure CLI](https://learn.microsoft.com/en-us/azure/storage-actions/storage-tasks/storage-task-quickstart-cli) | 0.30 | Quickstart using Azure CLI; focuses on basic usage, not exhaustive configuration or limits. 
| -| [Azure Storage Migration Program Solutions](https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/data-management/azure-file-migration-program-solutions) | 0.30 | Program overview and engagement details for Storage Migration Program; more about process and partners than technical decision matrices or configuration. | | [Azure portal](https://learn.microsoft.com/en-us/azure/storage-actions/storage-tasks/storage-task-quickstart-portal) | 0.30 | Quickstart using Azure portal; likely step-by-step tutorial without deep config tables or limits. | | [Bicep](https://learn.microsoft.com/en-us/azure/storage-actions/storage-tasks/storage-task-quickstart-bicep) | 0.30 | Bicep quickstart; demonstrates a template, but not a comprehensive config or limits reference. | | [BlobFuse2 help](https://learn.microsoft.com/en-us/azure/storage/blobs/blobfuse2-commands-help) | 0.30 | Command reference for 'blobfuse2 help'; generic CLI usage, no limits, configs, or troubleshooting mappings. | @@ -496,11 +492,12 @@ confusable_not_for: Not for Azure Files (use azure-files), Azure Table Storage ( | [Understand egress insights](https://learn.microsoft.com/en-us/azure/storage-discovery/egress-insights) | 0.20 | Page appears to describe how to view and interpret egress insights and patterns in Azure Storage Discovery, but the summary does not indicate presence of numeric limits, configuration parameter tables, error-code-based troubleshooting, or concrete decision matrices. It reads as conceptual/analytical guidance about understanding egress rather than product-specific expert details that match any sub-skill type. | | [What is Azure Storage Actions?](https://learn.microsoft.com/en-us/azure/storage-actions/overview) | 0.20 | High-level service overview without product-specific limits, configs, or error diagnostics. 
| | [What is Azure Storage Discovery?](https://learn.microsoft.com/en-us/azure/storage-discovery/overview) | 0.20 | Service overview for Storage Discovery; high-level description without detailed limits, configs, or troubleshooting. | +| [Azure Storage Migration Program Solutions](https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/data-management/azure-file-migration-program-solutions) | 0.10 | Program overview for Azure Storage Migration Program and ISV engagement; does not expose detailed technical limits, configuration tables, troubleshooting mappings, or quantified decision criteria. | | [Introduction to Blob Storage](https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blobs-introduction) | 0.10 | Introductory overview of Blob Storage; no expert-level numeric limits, configs, or troubleshooting content. | | [Overview of validated partners](https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/analytics/partner-overview) | 0.10 | Partner overview and listing for analytics integrations; no detailed configuration parameters, limits, or troubleshooting content. | | [Overview of validated partners](https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/backup-archive-disaster-recovery/partner-overview) | 0.10 | High-level overview of archive/backup/BCDR partners; lacks concrete settings, limits, or error-resolution details. | | [Overview of validated partners](https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/container-solutions/partner-overview) | 0.10 | Container solution partner overview; primarily a list of partners without deep technical configuration or troubleshooting content. 
| -| [Overview of validated partners](https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/data-management/partner-overview) | 0.10 | Data governance/management/migration partner overview; mostly descriptive listing, not detailed technical guidance. | +| [Overview of validated partners](https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/data-management/partner-overview) | 0.10 | Partner overview and marketing-style listing of data governance, management, and migration partners; no product-specific limits, configuration parameters, error codes, or decision matrices. | | [Overview of validated partners](https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/primary-secondary-storage/partner-overview) | 0.10 | Partner overview listing NAS/SAN solution vendors and high-level positioning. No indication of numeric limits, configuration parameters, decision matrices, or troubleshooting details. | | [Use with other Azure services](https://learn.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-integrate-with-services-tutorials) | 0.10 | Navigation page listing other tutorials; no technical details, limits, configuration parameters, or troubleshooting content. | | [What is Azure Blob Storage?](https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blobs-overview) | 0.10 | High-level overview of Azure Blob Storage; no specific limits, configs, or error details. 
| diff --git a/products/azure-boards/azure-boards.csv b/products/azure-boards/azure-boards.csv index 4e367368..06afa7a5 100644 --- a/products/azure-boards/azure-boards.csv +++ b/products/azure-boards/azure-boards.csv @@ -17,14 +17,14 @@ https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/manage-issues-imp https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/manage-work-items?view=azure-devops,Manage work items,Manage work items effectively - Azure Boards,,"Optimize work item management in Azure Boards. Learn to create, update, link, track, and organize work items with best practices for team collaboration.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Visual Studio 2019 | Visual Studio 2022 Work items are the foundation of project management in Azure Boards, enabling teams to track and organize all types of work including user stories, product backlog items, tasks, test cases, and bugs. Effective work item management helps teams: This comprehensive guide covers essential work item management capabilities in Azure DevOps. Tip You canuse AI to help with this tasklater in thi",2026-03-02T14:06:00.000Z,concept-article,,0.35,False,"General guide to managing work items effectively; mostly conceptual and procedural, any best practices are high-level without product-specific numeric or config details.",unchanged https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/move-change-type?view=azure-devops,"Change work item type, move work items",Move work items and change the work item type in Azure Boards - Azure Boards,,Learn how to change the work item type or bulk move work items to another project in Azure Boards.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Sometimes, work items get created with the wrong type or assigned to an incorrect project. You can correct these issues by updating individual work items or bulk modifying multiple items. 
You can also remove irrelevant work items from your backlog or Taskboard. Tip You can use AI to help with this task later in this article, or see Enable AI assistance with Azure DevOps MCP Server to get started. To change the type of multiple wo",2026-03-04T02:02:00.000Z,how-to,,0.3,False,"How-to guide for moving/changing work items; lacks numeric limits, config parameter tables, or detailed troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/office/bulk-add-modify-work-items-excel?view=azure-devops,Bulk add or modify (Excel),Bulk Modify Azure Boards Work Items with Excel - Azure Boards,,"Use the Microsoft Excel plugin in Azure DevOps to bulk add or modify Azure Boards work items, such as tasks, bugs, backlog items, or issues.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Important The Azure DevOps Office integration add-in is no longer supported and might not function with current versions of Office or browsers. Microsoft doesn't provide updates or fixes for this add-in. For bulk work item operations, use the CSV import/export functionality, which is the recommended and supported approach. This article shows how you can save time with Microsoft Excel when you need to add or modify many work it",2025-09-29T22:02:00.000Z,tutorial,,0.4,False,"Describes using Excel add-in for bulk work item edits; procedural guidance without detailed config tables, limits, or error mappings.",unchanged
-https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/office/faqs?view=azure-devops,Excel FAQs,Work in Excel connected to Azure Boards FAQs - Azure Boards,Resolve common Excel and Azure Boards integration questions,Learn answers to frequently asked questions about Excel and Azure Boards.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Important The Azure DevOps Office integration add-in is no longer supported and might not function with current versions of Office or browsers. 
Microsoft doesn't provide updates or fixes for this add-in. For bulk work item operations, use the CSV import/export functionality, which is the recommended and supported approach. Find answers to common questions about using Microsoft Excel to add or modify work items in Azure DevOps.",2026-02-13T22:02:00Z,faq,troubleshooting,0.6,True,"FAQ focused on Excel–Azure Boards usage; likely includes specific issues and resolutions tied to this integration, which are product-specific troubleshooting knowledge.",unchanged
+https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/office/faqs?view=azure-devops,Excel FAQs,Excel and Azure Boards FAQs - Azure Boards,,Learn answers to frequently asked questions about Excel and Azure Boards.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Important The Azure DevOps Office integration add-in is no longer supported and might not function with current versions of Office or browsers. Microsoft doesn't provide updates or fixes for this add-in. For bulk work item operations, use the CSV import/export functionality, which is the recommended and supported approach. 
Find answers to common questions about using Microsoft Excel to add or modify work items in Azure DevOps.",2026-04-22T21:02:00Z,faq,,0.3,False,"Excel and Azure Boards FAQ focuses on support status and general guidance (use CSV import/export instead of Office add-in); no indication of numeric limits, config tables, or error-code-based troubleshooting.",updated https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/office/tfs-office-integration-issues?view=azure-devops,Office integration issues,Resolve common Azure DevOps Office integration issues - Azure Boards,Troubleshoot Azure DevOps Office integration issues,Learn how to resolve common integration issues that occur with Azure DevOps Office integrations.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Azure DevOps integrates with Microsoft Office applications, primarily Excel and Project, to enable bulk editing and management of work items. This integration relies on the Azure DevOps Office Integration add-in, which adds a Team ribbon to your Office applications. Important The Azure DevOps Office integration add-in is no longer supported and might not function with current versions of Office or browsers. 
Microsoft doesn't pr",2026-02-28T08:00:00.000Z,troubleshooting,troubleshooting,0.8,True,"Explicitly about resolving common Office integration issues; such pages typically list specific error messages, causes, and resolutions for the Azure DevOps Office Integration add-in, which are product-specific troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/office/tfs-office-integration-issues?view=azure-devops,Resolve Excel data conflicts,Resolve common Azure DevOps Office integration issues - Azure Boards,Troubleshoot Azure DevOps Office integration issues,Learn how to resolve common integration issues that occur with Azure DevOps Office integrations.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Azure DevOps integrates with Microsoft Office applications, primarily Excel and Project, to enable bulk editing and management of work items. This integration relies on the Azure DevOps Office Integration add-in, which adds a Team ribbon to your Office applications. Important The Azure DevOps Office integration add-in is no longer supported and might not function with current versions of Office or browsers. 
Microsoft doesn't pr",2026-02-28T08:00:00.000Z,troubleshooting,troubleshooting,0.8,True,Same URL and description as index 17; contains product-specific error diagnosis and resolution steps for the Office integration add-in.,unchanged https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/office/tfs-office-integration-issues?view=azure-devops,Resolve Excel data validation errors,Resolve common Azure DevOps Office integration issues - Azure Boards,Troubleshoot Azure DevOps Office integration issues,Learn how to resolve common integration issues that occur with Azure DevOps Office integrations.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Azure DevOps integrates with Microsoft Office applications, primarily Excel and Project, to enable bulk editing and management of work items. This integration relies on the Azure DevOps Office Integration add-in, which adds a Team ribbon to your Office applications. Important The Azure DevOps Office integration add-in is no longer supported and might not function with current versions of Office or browsers. Microsoft doesn't pr",2026-02-28T08:00:00.000Z,troubleshooting,troubleshooting,0.8,True,Duplicate of index 17; troubleshooting content with symptom → cause → solution mappings for Azure DevOps Office integration.,unchanged https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/office/track-work?view=azure-devops,Connect to an Office client,Connect Azure Boards to an Office client to track your work - Azure Boards,Connect Azure Boards work tracking with Excel,Learn how to connect to Excel to integrate and track your work in Azure DevOps.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Important The Azure DevOps Office integration add-in is no longer supported and might not function with current versions of Office or browsers. Microsoft doesn't provide updates or fixes for this add-in. 
For bulk work item operations, use the CSV import/export functionality, which is the recommended and supported approach. To support your work tracking efforts, you can use Microsoft Excel. You can either work in online mode, w",2025-12-01T22:03:00.000Z,how-to,integrations,0.6,True,"Explains connecting Excel to Azure Boards including online/offline modes; contains product-specific integration behavior and constraints (e.g., CSV import/export recommendation).",unchanged https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/organize-backlog?view=azure-devops,Organize your backlog (map or reparent),Tutorial: Organize Your Product Backlog in Azure Boards - Azure Boards,,Learn how to map or parent backlog items to features. Then learn how to map features to epics in Azure Boards.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 In this tutorial, you learn how to organize your backlog in Azure Boards. After you add features or epics to your portfolio backlog, you can map backlog items. You can add and group items into a hierarchy. Also, drill up or down in the hierarchy, reorder and reparent items, and filter hierarchical views. Tip You can use AI to help with this task later in this article, or see Enable AI assistance with Azure DevOps MCP Server to ge",2026-03-04T02:02:00.000Z,tutorial,,0.2,False,"Tutorial on organizing backlog hierarchy; no product-specific limits, configuration parameter tables, or decision matrices.",unchanged
-https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/remove-delete-work-items?view=azure-devops,"Remove, delete, or restore","Remove, delete, and restore work items in Azure Boards - Azure Boards",,"Learn how to remove, delete, or restore work items in Azure Boards to manage backlogs and boards more efficiently.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Work items can stay forever in your work tracking data store. You never have to delete them. 
However, you might want to set up a work item management process for one of the following actions: Tip You can use AI to help with this task later in this article, or see Enable AI assistance with Azure DevOps MCP Server to get started. Note To move a work item from one project to another, or to change the work item type, see Move work ite",2026-04-17T17:04:00.000Z,how-to,,0.2,False,"Page is a how-to guide for removing, deleting, and restoring work items in Azure Boards. It appears to focus on UI steps and process guidance without numeric limits, configuration parameter tables, error-code-based troubleshooting, or other product-specific expert details as defined in the sub-skill types.",updated
-https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/resolve-backlog-reorder-issues?view=azure-devops,Fix reordering and nesting issues,"Resolve nest, display, and reorder issues for work items - Azure Boards",Fix Azure Boards backlog nesting and reorder errors,Learn how to fix reordering and nesting issues for work items in Azure Boards. Resolve error messages and maintain a natural hierarchy for your backlog.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 When you reorder, nest, and display work items, Azure Boards expects a natural hierarchy. The natural hierarchy breaks when you create same-category or same-type links between work items. For example, parent to child links that are bug to bug or user story to user story or requirements category to task category. 
Use this article to address error messages when you add links that aren't in the natural hierarchy.",2026-02-28T08:00:00.000Z,troubleshooting,troubleshooting,0.75,True,Specifically about resolving reordering/nesting issues and error messages when hierarchy rules are violated; maps particular symptoms and messages to causes and corrective actions.,unchanged
+https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/remove-delete-work-items?view=azure-devops,"Remove, delete, or restore","Remove, delete, and restore work items in Azure Boards - Azure Boards",,"Learn how to remove, delete, or restore work items in Azure Boards to manage backlogs and boards more efficiently.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Work items can stay forever in your work tracking data store. You never have to delete them. However, you might want to set up a work item management process for one of the following actions: Tip You can use AI to help with this task later in this article, or see Enable AI assistance with Azure DevOps MCP Server to get started. Note To move a work item from one project to another, or to change the work item type, see Move work ite",2026-04-17T17:04:00.000Z,how-to,,0.2,False,"Page is a how-to guide for removing, deleting, and restoring work items in Azure Boards. It appears to focus on UI steps and process guidance without numeric limits, configuration parameter tables, error-code-based troubleshooting, or other product-specific expert details as defined in the sub-skill types.",unchanged
+https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/resolve-backlog-reorder-issues?view=azure-devops,Fix reordering and nesting issues,"Resolve nest, display, and reorder issues for work items - Azure Boards",Troubleshoot Azure Boards backlog nesting and reordering,Learn how to fix reordering and nesting issues for work items in Azure Boards. 
Resolve error messages and maintain a natural hierarchy for your backlog.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 When you reorder, nest, and display work items, Azure Boards expects a natural hierarchy. The natural hierarchy breaks when you create same-category or same-type links between work items. For example, parent-to-child links that are bug to bug or user story to user story. Use this article to address error messages when you add links that aren't in the natural hierarchy.",2026-04-22T21:02:00.000Z,troubleshooting,troubleshooting,0.7,True,Article explicitly focuses on resolving specific reordering and nesting issues and error messages when hierarchy rules are violated; this is symptom → cause → resolution guidance unique to Azure Boards behavior.,updated https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/set-column-options?view=azure-devops,Change column options,Add or remove columns within a work item list in Azure Boards - Azure Boards,,Manage columns in Azure Boards work item lists to display and organize fields that matter most. Discover step-by-step guidance and boost productivity.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Visual Studio 2019 | Visual Studio 2022 Each column in a work item list corresponds to a work item field. Add or remove columns to show the fields you need, or drag a column to reorder it. Your column settings persist per page and apply only to your views. 
The following table shows which column actions are available in each list view: Action Backlogs Sprint backlogs Queries Work items Add or remove a column field Yes Yes Yes ",2026-03-27T21:05:00.000Z,how-to,,0.2,False,"UI/how-to guidance for adding/removing/reordering columns in work item lists; no numeric limits, config tables, error codes, or product-specific patterns beyond generic usage.",unchanged https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/work-item-template?view=azure-devops,Use work item templates,Update Work Items with Templates in Azure Boards - Azure Boards,,"Add and manage Azure DevOps work item templates, update work items, and prepopulate work item fields in the web portal and Visual Studio.",Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Visual Studio 2022 | Visual Studio 2019 | Visual Studio 2017 | Visual Studio 2015 Work item templates help you quickly create work items with prepopulated values for your commonly used fields. You can use work item templates to create work items or make bulk updates to several work items. This article describes how you can add and manage work item templates from the web portal or from Visual Studio 2015 or earlier versions. F,2025-11-12T22:03:00.000Z,how-to,,0.35,False,"Work item templates usage and management; mostly feature explanation and steps, no numeric limits, config tables, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/devops/boards/best-practices-agile-project-management?view=azure-devops,Best practices for Agile project management,Best practices for Agile product management - Azure Boards,Apply Azure Boards agile product management practices,Get started with this guide for product managers who are new to Azure Boards to plan and track your products.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This guide helps product managers get started with Azure Boards. 
It summarizes practical recommendations for configuring teams, planning work, and using Boards, Backlogs, Iterations, and Delivery Plans to deliver value predictably. Note If your team follows Kanban or Scrum specifically, see About Boards and Kanban or the Scrum tutorials. Most recommendations apply to both Azure DevOps Services (cloud) and Azure DevOps Server (on",2025-11-07T02:02:00.000Z,best-practice,best-practices,0.65,True,"Provides concrete DO/DON'T guidance for configuring teams, planning work, and using Boards/Backlogs/Iterations in Azure Boards; product-specific usage recommendations rather than generic agile theory.",unchanged
Some features, such ",2026-01-30T18:03:00.000Z,concept-article,configuration,0.8,True,"Targets administrators configuring area/iteration paths, WITs, workflows, and board behavior; likely includes specific setting names and options, matching configuration sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/devops/boards/extensions/dependency-tracker?view=azure-devops,Dependency Tracker,Use the Dependency Tracker extension - Azure DevOps,,"Manage dependencies in Azure DevOps with the Dependency Tracker extension. Streamline collaboration, monitor progress, and keep your projects on track.","Azure DevOps Services Note UseDelivery Plansto track dependencies instead of Dependency Tracker. The Dependency Tracker extension isn't a supported feature of Azure Boards and no product team supports it. For questions, suggestions, or problems you have when using the extension, visit theMarketplace for Azure DevOps, Dependency Tracker extensionpage. The Dependency Tracker extension is only available on Azure DevOps Services. The Dependency Tracker extension helps you manage dependencies across ",2026-03-27T21:05:00.000Z,how-to,,0.3,False,"Describes usage and support status of the Dependency Tracker extension with a high-level note to use Delivery Plans instead. No detailed configuration parameters, limits, error codes, or decision matrices are evident from the summary.",unchanged https://learn.microsoft.com/en-us/azure/devops/boards/extensions/migrate-integrate?view=azure-devops,Migrate and integrate,About migrating and integrating work tracking data - Azure Boards,Select migration and integration options for Azure Boards,"Learn how you can migrate or integrate work tracking data from other software applications to Azure Boards, plus available extensions.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 You can migrate work tracking data into Azure Boards and integrate Azure Boards with many non-Microsoft tools. 
This article gives an overview of migration options, common scenarios, and extensions that help with migration and integration. Tip Browse Azure Boards extensions in the Visual Studio Marketplace to customize and extend your boards experience. See the ""Extensions for Azure Boards"" section later in this article.",2025-10-27T22:02:00.000Z,overview,decision-making,0.6,True,"Overview of migration options and scenarios; helps choose between approaches and extensions for moving/integrating work tracking data, which is a decision-making aid.",unchanged
-https://learn.microsoft.com/en-us/azure/devops/boards/faqs?view=azure-devops,FAQs,Azure Boards FAQs - Azure Boards,,Answers to frequently asked questions about Azure Boards.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Find answers to frequently asked questions about using Azure Boards. For FAQs specific to queries or using Microsoft Excel to add or modify work items, see Query FAQs and FAQs: Work in Excel connected to Azure Boards. View features on our roadmap on the Features Timeline. To request a feature or up‑vote one, go to our Developer Community page.",2025-12-19T16:00:00Z,faq,,0.3,False,"General Azure Boards FAQ; mostly high-level Q&A and links, not focused on detailed error codes or configuration matrices.",unchanged
-https://learn.microsoft.com/en-us/azure/devops/boards/faqs?view=azure-devops,Work item form caching,Azure Boards FAQs - Azure Boards,,Answers to frequently asked questions about Azure Boards.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Find answers to frequently asked questions about using Azure Boards. For FAQs specific to queries or using Microsoft Excel to add or modify work items, see Query FAQs and FAQs: Work in Excel connected to Azure Boards. View features on our roadmap on the Features Timeline. 
To request a feature or up‑vote one, go to our Developer Community page.",2025-12-19T16:00:00Z,faq,,0.2,False,"General FAQ page; summary does not indicate specific limits, error codes, configuration tables, or other detailed expert-only data. Likely high-level Q&A and navigation to other docs.",unchanged
+https://learn.microsoft.com/en-us/azure/devops/boards/faqs?view=azure-devops,FAQs,Azure Boards FAQs - Azure Boards,,Answers to frequently asked questions about Azure Boards.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Find answers to frequently asked questions about using Azure Boards. For FAQs specific to queries or using Microsoft Excel to add or modify work items, see Query FAQs and FAQs: Work in Excel connected to Azure Boards. View features on our roadmap on the Features Timeline. To request a feature or up‑vote one, go to our Developer Community page.",2026-04-22T21:02:00Z,faq,,0.2,False,"General Azure Boards FAQ; summary points to other pages for specifics and roadmap; likely high-level usage questions rather than detailed limits, configuration, or diagnostic content.",updated
+https://learn.microsoft.com/en-us/azure/devops/boards/faqs?view=azure-devops,Work item form caching,Azure Boards FAQs - Azure Boards,,Answers to frequently asked questions about Azure Boards.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Find answers to frequently asked questions about using Azure Boards. For FAQs specific to queries or using Microsoft Excel to add or modify work items, see Query FAQs and FAQs: Work in Excel connected to Azure Boards. View features on our roadmap on the Features Timeline. 
To request a feature or up‑vote one, go to our Developer Community page.",2026-04-22T21:02:00Z,faq,,0.0,False,"FAQ page appears to provide general usage answers and links to other resources; no clear indication of detailed limits, configuration tables, error-code mappings, or other product-specific expert data per the defined categories.",updated
https://learn.microsoft.com/en-us/azure/devops/boards/get-started/permissions-access-boards?view=azure-devops,Default permissions & access for Boards,Default permissions and access for Azure Boards - Azure Boards,Understand default permissions and access in Azure Boards,Learn about default permissions and access levels in Azure Boards.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 As a project member, your ability to use Azure Boards features depends on the access level and security group assigned to you. Users with the Basic access level or higher get full access to Boards features. Users assigned the Stakeholder access level get limited, targeted access: they can view and modify many work items but can't use some Board features. Built-in security groups—Readers, Contributors, and Project Administrator",2025-10-27T22:02:00.000Z,overview,security,0.85,True,"Details access levels (Basic, Stakeholder) and built-in security groups (Readers, Contributors, Project Administrators) with their capabilities; matches security sub-skill with specific role names and scopes.",unchanged
https://learn.microsoft.com/en-us/azure/devops/boards/get-started/plan-track-work?view=azure-devops,Plan and track work,Plan and track work in Azure Boards - Azure Boards,,"Learn how to plan and track work by using Azure Boards using the Agile, Basic, Scrum, or Capability Maturity Model Integration (CMMI) processes.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Use Azure Boards to plan and track work with the Agile, Basic, Scrum, or Capability Maturity Model Integration (CMMI) processes. 
For more information about process choices, see About processes and process templates. Tip You can use AI to help with this task later in this article, or see Enable AI assistance with Azure DevOps MCP Server to get started. The Agile process uses user stories, tasks, bugs, features, and epics to plan an",2026-03-04T02:02:00.000Z,how-to,,0.2,False,"General guidance on planning and tracking work in Azure Boards using different processes; appears conceptual/how-to without product-specific limits, configuration tables, or error mappings.",unchanged
https://learn.microsoft.com/en-us/azure/devops/boards/get-started/sign-up-invite-teammates?view=azure-devops,Sign up for Azure Boards,Sign up for Azure Boards - Azure Boards,,Learn how to sign up for free for Azure Boards.,"Azure DevOps Services Sign up for Azure Boards to plan, track, and discuss work across your teams. For a short overview, see What is Azure Boards?. This quickstart shows the simplest path to get started and invite teammates. What you learn: Quick steps: To sign up for all Azure DevOps Services, see Sign up, sign in to Azure DevOps.",2026-03-04T02:02:00.000Z,quickstart,,0.0,False,"Quickstart for signing up and inviting teammates to Azure Boards; no limits, configuration tables, error codes, or other product-specific expert details.",unchanged
@@ -50,7 +50,7 @@ https://learn.microsoft.com/en-us/azure/devops/boards/github/?view=azure-devops,
https://learn.microsoft.com/en-us/azure/devops/boards/github/configure-status-badges?view=azure-devops,Add status badges,Add status badges for your GitHub repo - Azure Boards,Add Azure Boards status badges to GitHub repos,Learn how to add and configure your board badge status so it appears on your GitHub repo.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Add Markdown syntax to a GitHub repo README.md file to display your board status as a badge. Copy the syntax from your board settings and paste it into the README. 
This syntax works for both GitHub.com and GitHub Enterprise Server connections. For GitHub Enterprise Server, your server must be network accessible to Azure DevOps Services.",2026-03-27T21:05:00.000Z,how-to,integrations,0.6,True,Covers product-specific integration between Azure Boards and GitHub via Markdown badge syntax and connectivity requirements for GitHub Enterprise Server. Contains concrete integration behavior and configuration context rather than generic tutorial content.,unchanged
https://learn.microsoft.com/en-us/azure/devops/boards/github/connect-on-premises-to-github?view=azure-devops-server,Connect Azure Boards (on-premises),Connect an on-premises project to a GitHub Enterprise Server - Azure DevOps Server,Configure on-premises Azure DevOps with GitHub Enterprise,Learn how to configure one or more GitHub repositories to integrate with an Azure Boards on-premises project.,"Azure DevOps Server | Azure DevOps Server 2022 When you connect your Azure DevOps Server project to your GitHub repositories, you support linking between GitHub commits and pull requests to work items. You can use GitHub for software development while using Azure Boards to plan and track your work. Note Azure DevOps Server supports integration with GitHub.com and GitHub Enterprise Server repositories. 
To connect from Azure DevOps Services, see Connect Azure Boards to GitHub.",2025-07-29T19:13:00.000Z,how-to,configuration,0.65,True,On-premises integration with GitHub Enterprise Server usually requires specific configuration parameters and constraints unique to this product combination.,unchanged
https://learn.microsoft.com/en-us/azure/devops/boards/github/connect-to-github?view=azure-devops,Connect Azure Boards (cloud),Connect an Azure Boards or Azure DevOps project to a GitHub repository - Azure Boards,Integrate Azure Boards projects with GitHub repositories,Configure one or more GitHub repositories to integrate with Azure Boards.,"Azure DevOps Services Connect your Azure Boards project to GitHub.com repositories so that commits and pull requests automatically link to work items. This integration lets you plan and track work in Azure Boards while your team develops in GitHub. After you connect, you can: Note Azure Boards supports integration with both GitHub.com and GitHub Enterprise Server. To connect from an on-premises Azure DevOps Server, see Connect Azure DevOps Server to GitHub Enterprise Server.",2026-03-27T21:05:00.000Z,how-to,integrations,0.65,True,"Describes a specific Azure Boards–GitHub integration, including supported endpoints (GitHub.com and GitHub Enterprise Server) and product-specific connection behavior. 
While largely procedural, it encodes concrete integration behavior that is product-specific and not just generic SDK usage.",unchanged
-https://learn.microsoft.com/en-us/azure/devops/boards/github/features-timeline?view=azure-devops,Roadmap and features timeline,Features timeline and roadmap - Azure Boards,,Learn about upcoming and recently released features for Azure Boards integration with GitHub.,,2026-01-22T22:02:00.000Z,overview,,0.2,False,"Features timeline/roadmap is release/marketing information, not technical expert knowledge for skills.",unchanged
+https://learn.microsoft.com/en-us/azure/devops/boards/github/features-timeline?view=azure-devops,Roadmap and features timeline,Features timeline and roadmap - Azure Boards,,Learn about upcoming and recently released features for Azure Boards integration with GitHub.,,2026-04-22T17:59:00.000Z,overview,,0.1,False,"Features timeline/roadmap content is primarily release and marketing-style information, not detailed limits, configuration parameters, or troubleshooting mappings.",updated
https://learn.microsoft.com/en-us/azure/devops/boards/github/install-github-app?view=azure-devops,Install the Azure Boards app,Install the Azure Boards App for GitHub - Azure Boards,Configure Azure Boards GitHub app connections,"Use this quickstart to configure the Azure Boards app to connect one or more GitHub repositories to Azure Boards, and change GitHub repo access.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 In this quickstart, you install the Azure Boards app for GitHub to connect Azure Boards to your GitHub repositories. When you connect Azure Boards projects with GitHub.com repositories, you support linking between GitHub commits and pull requests to work items. You can use GitHub for software development while using Azure Boards to plan and track your work. 
After you install the Azure Boards app for GitHub on your GitHub acco",2025-09-10T18:07:00.000Z,quickstart,configuration,0.65,True,"Quickstart for installing/configuring Azure Boards app for GitHub; likely includes specific configuration options (scopes, repo access) and settings unique to this integration.",unchanged
https://learn.microsoft.com/en-us/azure/devops/boards/github/link-to-from-github?view=azure-devops,Link to GitHub artifacts,"Link GitHub commits, PRs, branches, and issues to work items - Azure Boards",Integrate Azure Boards work items with GitHub artifacts,"Learn how to link work items to GitHub commits, pull requests, branches, and issues, and automatically transition work item states in Azure Boards.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 In this article, learn how to link work items to GitHub commits, pull requests, branches, and builds after connecting your Azure Boards project with a GitHub repository. You can use the #mention syntax for commits and branches, use !mentions to reference GitHub pull requests from work item discussions, or add a GitHub commit, pull request, or branch link directly from the Azure Boards work item. Note GitHub integration support:",2026-02-27T18:03:00.000Z,quickstart,integrations,0.7,True,"Details Azure Boards–GitHub linking patterns (#mention, !mention) and behavior; product-specific integration semantics beyond generic SDK usage.",unchanged
https://learn.microsoft.com/en-us/azure/devops/boards/github/work-item-integration-github-copilot?view=azure-devops,Use GitHub Copilot,Use GitHub Copilot with Azure Boards - Azure Boards,Integrate GitHub Copilot with Azure Boards work items,Learn how to use GitHub Copilot with work items to automatically create pull requests and track coding progress in Azure DevOps.,"Azure DevOps Services Note This feature is in Private Preview. Access is limited and functionality might change before general availability. 
Azure Boards integrates with GitHub Copilot to streamline your development workflow. You can use work items directly with Copilot, which automatically creates branches, implements code changes, and generates draft pull requests while keeping your work item updated with progress. This integration allows you to: Important This integration requires GitHub repo",2025-11-21T23:34:00.000Z,how-to,integrations,0.75,True,"Describes a specific integration between Azure Boards and GitHub Copilot, including requirements (GitHub repo, private preview) and likely configuration parameters unique to this integration.",unchanged
@@ -61,7 +61,7 @@ https://learn.microsoft.com/en-us/azure/devops/boards/plans/?view=azure-devops,"
https://learn.microsoft.com/en-us/azure/devops/boards/plans/add-edit-delivery-plan?view=azure-devops,Add or edit a Delivery Plan,Add or edit a Delivery Plan in Azure Boards - Azure Boards,,Learn how to add or edit a Delivery Plan in Azure Boards.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Delivery Plans provide a highly interactive calendar view of multiple team backlogs. This article shows how to add and edit a plan. For the use cases, benefits, and interactions you can perform, see Review team Delivery Plans. Tip You can use AI to help with this task later in this article, or see Enable AI assistance with Azure DevOps MCP Server to get started. Note This article describes how to add or edit Delivery Plans 2.0, wh",2026-03-02T14:06:00.000Z,how-to,,0.25,False,"How to add/edit a delivery plan; procedural content without evidence of expert-only limits, configuration tables, or troubleshooting mappings.",unchanged
https://learn.microsoft.com/en-us/azure/devops/boards/plans/agile-culture?view=azure-devops,Agile culture,Elements of Agile culture - Azure Boards,,Create an Agile culture of autonomous teams and an aligned enterprise. 
Create the culture by using Agile tools when working in Azure Boards and Azure DevOps.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 As your team grows, you want your tools to grow with it. If you're an enterprise adopting Agile methodologies, you want your Agile tools to support your enterprise's business goals. However, successfully scaling Agile requires you to address both the culture and tools within your organization. Note New to Agile? For more information, see Agile Culture and Scaling Agile to Large Teams.",2025-11-03T23:38:00.000Z,best-practice,,0.1,False,Discusses agile culture and scaling; conceptual guidance without product-specific numeric thresholds or configs.,unchanged
https://learn.microsoft.com/en-us/azure/devops/boards/plans/configure-hierarchical-teams?view=azure-devops,Configure a hierarchy of teams,Configure hierarchical teams - Azure Boards,,Learn how to configure teams to support hierarchical portfolio backlogs for tracking progress across teams in Azure Boards.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This article explains how to configure a hierarchy of teams that supports tailored backlog views for management and feature teams. A hierarchical team structure helps ensure that your organization remains agile, focused, and aligned with its strategic objectives. Tip You can use AI to help with this task later in this article, or see Enable AI assistance with Azure DevOps MCP Server to get started. 
Teams can use customized backlo",2026-03-02T14:06:00.000Z,tutorial,,0.25,False,Configuring hierarchical teams; appears as step-by-step UI configuration without detailed parameter tables or limits.,unchanged
-https://learn.microsoft.com/en-us/azure/devops/boards/plans/faqs?view=azure-devops,Delivery Plans FAQs,Frequently asked questions (FAQs) about Delivery Plans - Azure Boards,,Learn how to get answers to common questions about working with Delivery Plans.,"Azure Boards Delivery Plans provides a calendar view of multiple backlogs, teams, and team backlogs from different projects. It replaces the Delivery Plans marketplace extension and is available for Azure DevOps Services and Azure DevOps Server 2022 and later. The following answers address common questions about Delivery Plans.",2025-09-29T22:02:00Z,faq,,0.3,False,FAQ for Delivery Plans; likely general usage Q&A without detailed error-code mappings or numeric constraints.,unchanged
+https://learn.microsoft.com/en-us/azure/devops/boards/plans/faqs?view=azure-devops,Delivery Plans FAQs,Delivery Plans FAQs - Azure Boards,,Learn how to get answers to common questions about working with Delivery Plans.,"Azure Boards Delivery Plans provides a calendar view of multiple backlogs, teams, and team backlogs from different projects. It replaces the Delivery Plans marketplace extension and is available for Azure DevOps Services and Azure DevOps Server 2022 and later. 
The following answers address common questions about Delivery Plans.",2026-04-22T21:02:00Z,faq,,0.2,False,"FAQ about Delivery Plans usage and availability; summary indicates general Q&A and feature scope, not numeric limits, configuration tables, or error-code-based troubleshooting.",updated
https://learn.microsoft.com/en-us/azure/devops/boards/plans/portfolio-management?view=azure-devops,Portfolio management,Manage product and portfolio backlogs - Azure Boards,,Learn how to work with a hierarchical team structure to manage product and portfolio backlogs and track progress across teams.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Portfolio backlogs let product owners track the work of multiple agile feature teams, monitor progress across projects, and manage risks and dependencies. Product owners create their vision and roadmap for each release and define high-level goals as Epics or Features. Feature teams break down the Epics or Features into Stories for prioritization and development. This structure gives each feature team its own backlog for plann",2026-03-02T14:06:00.000Z,how-to,,0.25,False,"Portfolio backlog management overview/how-to; no indication of numeric limits, config parameter tables, or decision matrices.",unchanged
https://learn.microsoft.com/en-us/azure/devops/boards/plans/practices-that-scale?view=azure-devops,Practices that scale,Implement Agile practices that scale for working in Azure Boards and Azure DevOps - Azure Boards,Apply scalable Agile practices in Azure Boards,Learn about scaling Agile recommended practices for working in Azure Boards and Azure DevOps.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Enterprise organizations adopt Agile practices for many reasons. Prime among these reasons include: As your organization grows, you want to scale your practices to remain agile and meet changing goals. 
To do that, consider these two guiding principles:",2025-11-03T23:38:00.000Z,best-practice,best-practices,0.6,True,Described as recommended practices for scaling Agile in Azure Boards/Azure DevOps; likely includes product-specific guidance and patterns beyond generic Agile advice.,unchanged
https://learn.microsoft.com/en-us/azure/devops/boards/plans/review-team-plans?view=azure-devops,Review team Delivery Plans,Use team delivery plans - Azure Boards,,"Learn how to review and use delivery plans in Azure Boards to track and interact with multiple team deliverables, schedules, and dependencies.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Delivery plans in Azure Boards let you visualize and review the work items that your teams plan to deliver. A delivery plan shows selected teams' scheduled work items by sprint or iteration path in a calendar view. Tip You can use AI to help with this task later in this article, or see Enable AI assistance with Azure DevOps MCP Server to get started. You can use Delivery Plans to review multiple backlogs and teams across your Azu",2026-03-02T14:06:00.000Z,tutorial,,0.25,False,"Using delivery plans to review work; summary suggests conceptual and procedural guidance, not limits, config matrices, or troubleshooting mappings.",unchanged
@@ -81,7 +81,7 @@ https://learn.microsoft.com/en-us/azure/devops/boards/queries/planning-ranking-p
https://learn.microsoft.com/en-us/azure/devops/boards/queries/query-by-area-iteration-path?view=azure-devops,Query by area or iteration,Query By Area Or Iteration Path - Azure Boards,,Learn how to query for work items based on their area or iteration path in Azure Boards.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Area Path and Iteration Path fields appear on all work item forms for every work item type. You define these paths for your project—area paths and iteration paths—and then select the ones you want to associate with a team. 
Tip You can use AI to help with this task later in this article, or see Enable AI assistance with Azure DevOps MCP Server to get started. To understand how to work with area and iteration paths, see About teams an",2026-03-04T02:02:00.000Z,example-scenario,,0.2,False,"How-to guidance for querying by area/iteration paths; no numeric limits, config tables, error codes, or product-specific thresholds. Primarily conceptual and UI/query usage instructions.",unchanged
https://learn.microsoft.com/en-us/azure/devops/boards/queries/query-by-date-or-current-iteration?view=azure-devops,Query by date or current sprint,Query by date or current iteration in Azure Boards - Azure Boards,,"Learn how to query for work items based on a date, a team's current iteration, or a sliding window of sprints in Azure Boards.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This article shows how to list work items by creation, change, resolution, or closed dates. It also shows how to use date macros, such as @Today, and iteration macros for team sprints. For iteration path fundamentals and client or macro restrictions, see Query by area or iteration path. Tip You can use AI to help with this task later in this article, or see Enable AI assistance with Azure DevOps MCP Server to get started.",2026-03-04T02:02:00.000Z,example-scenario,,0.2,False,"Shows how to query by dates and iteration macros like @Today and current iteration; does not expose numeric limits, quotas, or detailed configuration tables. 
Mostly general usage patterns.",unchanged
https://learn.microsoft.com/en-us/azure/devops/boards/queries/query-by-workflow-changes?view=azure-devops,Query by assignment or workflow changes,"Query by account, user, workflow, or board changes - Azure Boards",,"Learn how to list work items based on changes made to their assignment, state, or board column or swimlane in Azure Boards.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Efficiently tracking assignment and workflow changes in your work items is essential for maintaining project visibility and ensuring smooth progress. This article shows how to create queries that monitor these changes, enabling better management and oversight of your team's work. Tip You can use AI to help with this task later in this article, or see Enable AI assistance with Azure DevOps MCP Server to get started.",2026-03-04T02:02:00.000Z,example-scenario,,0.2,False,"How-to for creating Azure Boards queries by workflow/assignment changes; no numeric limits, config tables, error-code mappings, or product-specific thresholds. Primarily procedural guidance, not expert reference data.",unchanged
-https://learn.microsoft.com/en-us/azure/devops/boards/queries/query-faqs?view=azure-devops,Query FAQs,Frequently Asked Questions (FAQs) about Queries in Azure Boards and Azure DevOps - Azure Boards,,Get answers to common questions about working with work item queries in Azure DevOps.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 The Query Editor finds work items by using clauses that combine fields, operators, and values. Save queries and share them with teammates. 
The following answers cover common questions about work item queries.",2025-09-29T22:02:00Z,faq,,0.35,False,"FAQ about queries; mostly clarifications of behavior, but not structured as symptom→solution troubleshooting with error codes.",unchanged
+https://learn.microsoft.com/en-us/azure/devops/boards/queries/query-faqs?view=azure-devops,Query FAQs,Azure Boards query FAQs - Azure Boards,Troubleshoot Azure Boards work item query issues,Get answers to common questions about working with work item queries in Azure DevOps.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 The Query Editor finds work items by using clauses that combine fields, operators, and values. Save queries and share them with teammates. The following answers cover common questions about work item queries.",2026-04-22T21:02:00Z,faq,troubleshooting,0.7,True,"FAQ content for Azure Boards queries typically includes specific error messages, edge-case behaviors, and product-specific gotchas (for example, why certain fields can't be queried, how query limits behave, and how particular operators work). These are symptom → cause → solution style answers unique to Azure DevOps Boards, which an LLM is unlikely to know in detail from training.",updated
https://learn.microsoft.com/en-us/azure/devops/boards/queries/query-field-value?view=azure-devops,Query by field comparison,Query by field values - Azure Boards,,Learn how to create a query by filtering on field values that are compared to other field values in Azure Boards and Azure DevOps.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Use comparison field operators when you want to filter work items by comparing one field's value to another field's value. 
Common uses include: Tip You can use AI to help with this task later in this article, or see Enable AI assistance with Azure DevOps MCP Server to get started.",2026-03-04T02:02:00.000Z,example-scenario,,0.2,False,"Covers using comparison operators between fields in queries; no numeric limits, config tables, or decision matrices with quantified trade-offs. Standard query usage guidance.",unchanged
https://learn.microsoft.com/en-us/azure/devops/boards/queries/query-index-quick-ref?view=azure-devops,Query quick reference,"Use an index to query examples, tasks, operators, and macros - Azure Boards",,"Learn how to use an index to query operators, macros, and sample queries that are used to list work items for Azure Boards and Azure DevOps.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Use this index to quickly access example queries and information on opening, defining, and working with queries. To learn how to use the Query Editor, see Define a query. If you find that your queries take too long to return results, see Define a query/Best practices.",2025-10-27T17:53:00.000Z,overview,,0.3,False,"Index/quick reference to query examples; navigation aid, not a concentrated expert-knowledge page.",unchanged
https://learn.microsoft.com/en-us/azure/devops/boards/queries/query-numeric?view=azure-devops,Query by a numeric field,"Query by numeric fields based on effort, schedules, and story points - Azure Boards",,"Track work by creating queries based on effort, story points, schedules, or time tracking fields in Azure Boards and Azure DevOps.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Learn how to query by numeric fields such as effort, schedule estimates, story points, or time-tracking fields in Azure Boards and Azure DevOps. Tip You can use AI to help with this task later in this article, or see Enable AI assistance with Azure DevOps MCP Server to get started. 
Common numeric fields track effort for requirements or estimated, remaining, and completed work for tasks. Use queries to list the work items you care",2026-03-04T02:02:00.000Z,example-scenario,,0.2,False,"Shows how to query by numeric fields like effort and story points; does not include quotas, timeouts, or product-specific configuration parameters with ranges. General usage instructions.",unchanged
diff --git a/products/azure-boards/report.md b/products/azure-boards/report.md
index f880ff20..e14c619c 100644
--- a/products/azure-boards/report.md
+++ b/products/azure-boards/report.md
@@ -1,10 +1,11 @@
---
-generated_at: '2026-04-19'
+generated_at: '2026-04-26'
category_descriptions:
  limits-quotas: Managing Azure Boards limits for test artifacts and work item attachments,
    including size/quantity constraints and how to restore deleted test-related items.
-  troubleshooting: Diagnosing and fixing Azure Boards + Excel/Office integration issues
-    (sync, add-in, connection, mapping) and resolving backlog nesting/reordering errors.
+  troubleshooting: Diagnosing and fixing Azure Boards issues with Office integration,
+    backlog hierarchy/reordering problems, and work item query errors or unexpected
+    results.
  integrations: Connecting Azure Boards to external tools (Excel, GitHub, Copilot,
    Slack, Teams) and using WIQL, badges, and integrations to sync work items and
    development artifacts.
@@ -20,12 +21,13 @@ category_descriptions:
    access, and setting access controls and policies for teams and users.'
  skill_description: Expert knowledge for Azure Boards development including troubleshooting,
    best practices, decision making, limits & quotas, security, configuration, and integrations
-    & coding patterns. 
    Use when managing work items, backlogs/boards, WIQL queries,
+    GitHub/Office integration, or Azure Boards security, and other Azure Boards related
    development tasks. Not for Azure DevOps (use azure-devops), Azure Test Plans (use azure-test-plans), Azure Pipelines (use azure-pipelines), Azure Repos (use azure-repos).
-use_when: Use when managing work items, boards/backlogs, WIQL queries, Excel/Office
-  sync, or GitHub/Teams integrations, and other Azure Boards related development tasks.
+use_when: Use when managing work items, backlogs/boards, WIQL queries, GitHub/Office
+  integration, or Azure Boards security, and other Azure Boards related development
+  tasks.
confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Test Plans (use azure-test-plans), Azure Pipelines (use azure-pipelines), Azure Repos (use azure-repos).
---
@@ -41,8 +43,8 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Test Plans (u
### Incremental Update
- **New Pages**: 0
-- **Updated Pages**: 1
-- **Unchanged**: 125
+- **Updated Pages**: 7
+- **Unchanged**: 119
- **Deleted Pages**: 0
- **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-boards/azure-boards.csv`
@@ -63,8 +65,20 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Test Plans (u
### Updated Pages
-- [Remove, delete, or restore](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/remove-delete-work-items?view=azure-devops)
-  - Updated: 2026-03-04T02:02:00.000Z → 2026-04-17T17:04:00.000Z
+- [Work item form caching](https://learn.microsoft.com/en-us/azure/devops/boards/faqs?view=azure-devops)
+  - Updated: 2025-12-19T16:00:00Z → 2026-04-22T21:02:00Z
+- [Query FAQs](https://learn.microsoft.com/en-us/azure/devops/boards/queries/query-faqs?view=azure-devops)
+  - Updated: 2025-09-29T22:02:00Z → 2026-04-22T21:02:00Z
+- [Delivery Plans FAQs](https://learn.microsoft.com/en-us/azure/devops/boards/plans/faqs?view=azure-devops)
+  - Updated: 2025-09-29T22:02:00Z → 2026-04-22T21:02:00Z
+- [Roadmap and features timeline](https://learn.microsoft.com/en-us/azure/devops/boards/github/features-timeline?view=azure-devops)
+  - Updated: 2026-01-22T22:02:00.000Z → 2026-04-22T17:59:00.000Z
+- [Excel FAQs](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/office/faqs?view=azure-devops)
+  - Updated: 2026-02-13T22:02:00Z → 2026-04-22T21:02:00Z
+- [FAQs](https://learn.microsoft.com/en-us/azure/devops/boards/faqs?view=azure-devops)
+  - Updated: 2025-12-19T16:00:00Z → 2026-04-22T21:02:00Z
+- [Fix reordering and nesting issues](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/resolve-backlog-reorder-issues?view=azure-devops)
+  - Updated: 2026-02-28T08:00:00.000Z → 2026-04-22T21:02:00.000Z
## Classified Pages
@@ -77,7 +91,6 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Test Plans (u
| [Resolve Excel data conflicts](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/office/tfs-office-integration-issues?view=azure-devops) | troubleshooting | 0.80 | Same URL and description as index 17; contains product-specific error diagnosis and resolution steps for the Office integration add-in. |
| [Resolve Excel data validation errors](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/office/tfs-office-integration-issues?view=azure-devops) | troubleshooting | 0.80 | Duplicate of index 17; troubleshooting content with symptom → cause → solution mappings for Azure DevOps Office integration. |
| [Code review & feedback fields](https://learn.microsoft.com/en-us/azure/devops/boards/work-items/guidance/guidance-code-review-feedback-field-reference?view=azure-devops) | configuration | 0.75 | Lists specific fields for code review and feedback with descriptions, data types, and recommended uses. This is a structured reference of configuration-like field metadata and usage patterns unique to Azure Boards.
| -| [Fix reordering and nesting issues](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/resolve-backlog-reorder-issues?view=azure-devops) | troubleshooting | 0.75 | Specifically about resolving reordering/nesting issues and error messages when hierarchy rules are violated; maps particular symptoms and messages to causes and corrective actions. | | [Use GitHub Copilot](https://learn.microsoft.com/en-us/azure/devops/boards/github/work-item-integration-github-copilot?view=azure-devops) | integrations | 0.75 | Describes a specific integration between Azure Boards and GitHub Copilot, including requirements (GitHub repo, private preview) and likely configuration parameters unique to this integration. | | [Work item fields](https://learn.microsoft.com/en-us/azure/devops/boards/work-items/work-item-fields?view=azure-devops) | configuration | 0.74 | The page is a detailed reference of Azure DevOps work item fields and their attributes, including system-defined field names, data types, behaviors, and how they can be modified. This is product-specific configuration knowledge (field identifiers, attributes, and allowed modifications) that an LLM is unlikely to fully know from training. It aligns best with the configuration sub-skill because it documents concrete field settings and attributes rather than conceptual usage or limits. | | [About default processes](https://learn.microsoft.com/en-us/azure/devops/boards/work-items/guidance/choose-process?view=azure-devops) | decision-making | 0.70 | Explicitly guides selection among Agile/Basic/Scrum/CMMI processes; provides scenario-based guidance for choosing the most suitable process, which is product-specific decision support. 
| @@ -85,7 +98,9 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Test Plans (u | [Azure Boards with Teams](https://learn.microsoft.com/en-us/azure/devops/boards/integrations/boards-teams?view=azure-devops) | integrations | 0.70 | Covers Azure Boards/Azure DevOps apps for Teams; includes product-specific integration behaviors and configuration steps beyond generic Teams usage. | | [Bugs, issues, & risks fields](https://learn.microsoft.com/en-us/azure/devops/boards/work-items/guidance/cmmi/guidance-bugs-issues-risks-field-reference-cmmi?view=azure-devops) | configuration | 0.70 | Described as listing fields that track bugs, issues, and risks for the CMMI process template. This is a product-specific field reference (names, meanings, likely types) that functions as configuration metadata. | | [Delete test artifacts](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/delete-test-artifacts?view=azure-devops) | limits-quotas | 0.70 | Includes a specific soft-delete retention period of 14 days for test plans and suites, which is a concrete numeric limit not generally known. | +| [Fix reordering and nesting issues](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/resolve-backlog-reorder-issues?view=azure-devops) | troubleshooting | 0.70 | Article explicitly focuses on resolving specific reordering and nesting issues and error messages when hierarchy rules are violated; this is symptom → cause → resolution guidance unique to Azure Boards behavior. | | [Link to GitHub artifacts](https://learn.microsoft.com/en-us/azure/devops/boards/github/link-to-from-github?view=azure-devops) | integrations | 0.70 | Details Azure Boards–GitHub linking patterns (#mention, !mention) and behavior; product-specific integration semantics beyond generic SDK usage. 
| +| [Query FAQs](https://learn.microsoft.com/en-us/azure/devops/boards/queries/query-faqs?view=azure-devops) | troubleshooting | 0.70 | FAQ content for Azure Boards queries typically includes specific error messages, edge-case behaviors, and product-specific gotchas (for example, why certain fields can't be queried, how query limits behave, and how particular operators work). These are symptom → cause → solution style answers unique to Azure DevOps Boards, which an LLM is unlikely to know in detail from training. | | [Query fields, operators & macros](https://learn.microsoft.com/en-us/azure/devops/boards/queries/query-operators-variables?view=azure-devops) | configuration | 0.70 | Reference for field data types, operators, and macros including version-specific applicability; effectively a configuration surface for the query engine with parameter-like options that are product-specific. | | [Secure your Azure Boards](https://learn.microsoft.com/en-us/azure/devops/boards/secure-your-azure-boards?view=azure-devops) | security | 0.70 | Focuses on security concepts and access controls for Azure Boards; likely includes specific RBAC roles/permissions and configuration guidance unique to the product. | | [Set query permissions (Security)](https://learn.microsoft.com/en-us/azure/devops/boards/queries/set-query-permissions?view=azure-devops) | security | 0.70 | Describes setting permissions on queries and folders; such pages typically list specific permission names/RBAC actions and scopes unique to Azure DevOps, which qualifies as product-specific security/authorization configuration. 
| @@ -101,7 +116,6 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Test Plans (u | [About boards](https://learn.microsoft.com/en-us/azure/devops/boards/boards/kanban-overview?view=azure-devops) | best-practices | 0.60 | Covers Kanban board basics plus best practices in Azure Boards; likely includes product-specific recommendations (e.g., how to set up columns, WIP usage) beyond generic Kanban theory. | | [Add status badges](https://learn.microsoft.com/en-us/azure/devops/boards/github/configure-status-badges?view=azure-devops) | integrations | 0.60 | Covers product-specific integration between Azure Boards and GitHub via Markdown badge syntax and connectivity requirements for GitHub Enterprise Server. Contains concrete integration behavior and configuration context rather than generic tutorial content. | | [Connect to an Office client](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/office/track-work?view=azure-devops) | integrations | 0.60 | Explains connecting Excel to Azure Boards including online/offline modes; contains product-specific integration behavior and constraints (e.g., CSV import/export recommendation). | -| [Excel FAQs](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/office/faqs?view=azure-devops) | troubleshooting | 0.60 | FAQ focused on Excel–Azure Boards usage; likely includes specific issues and resolutions tied to this integration, which are product-specific troubleshooting knowledge. | | [Migrate and integrate](https://learn.microsoft.com/en-us/azure/devops/boards/extensions/migrate-integrate?view=azure-devops) | decision-making | 0.60 | Overview of migration options and scenarios; helps choose between approaches and extensions for moving/integrating work tracking data, which is a decision-making aid. 
| | [Practices that scale](https://learn.microsoft.com/en-us/azure/devops/boards/plans/practices-that-scale?view=azure-devops) | best-practices | 0.60 | Described as recommended practices for scaling Agile in Azure Boards/Azure DevOps; likely includes product-specific guidance and patterns beyond generic Agile advice. | @@ -124,7 +138,6 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Test Plans (u | [Link work items to objects](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/add-link?view=azure-devops) | 0.35 | Describes linking work items to other objects; focuses on relationships and traceability, not on configuration tables, quotas, or error handling. | | [Manage work items](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/manage-work-items?view=azure-devops) | 0.35 | General guide to managing work items effectively; mostly conceptual and procedural, any best practices are high-level without product-specific numeric or config details. | | [Move work items from one team to another](https://learn.microsoft.com/en-us/azure/devops/boards/work-items/move-work-items?view=azure-devops) | 0.35 | Explains moving work items between teams by changing Area Paths; procedural guidance without detailed configuration matrices, limits, or troubleshooting. | -| [Query FAQs](https://learn.microsoft.com/en-us/azure/devops/boards/queries/query-faqs?view=azure-devops) | 0.35 | FAQ about queries; mostly clarifications of behavior, but not structured as symptom→solution troubleshooting with error codes. | | [Query by build & test integration fields](https://learn.microsoft.com/en-us/azure/devops/boards/queries/build-test-integration?view=azure-devops) | 0.35 | Build/test integration queries article; summary mentions sample queries and tips but not SDK parameter tables, limits, or error-code mappings. 
| | [Query by picklist value](https://learn.microsoft.com/en-us/azure/devops/boards/queries/planning-ranking-priorities?view=azure-devops) | 0.35 | Explains queries based on rank/priority; no indication of numeric limits, configuration parameter tables, or decision matrices with thresholds. | | [Scrum process guidance](https://learn.microsoft.com/en-us/azure/devops/boards/work-items/guidance/scrum-process?view=azure-devops) | 0.35 | Describes Scrum process artifacts and queries; usage-focused without clear evidence of expert-only configuration or limits. | @@ -136,10 +149,9 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Test Plans (u | [Bulk modify (web)](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/bulk-modify-work-items?view=azure-devops) | 0.30 | Task-focused how-to for bulk editing work items; no product-specific limits, configuration tables, or error-code-based troubleshooting. | | [Change work item type, move work items](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/move-change-type?view=azure-devops) | 0.30 | How-to guide for moving/changing work items; lacks numeric limits, config parameter tables, or detailed troubleshooting mappings. | | [Customize a Taskboard](https://learn.microsoft.com/en-us/azure/devops/boards/sprints/customize-taskboard?view=azure-devops) | 0.30 | Covers customizing sprint taskboard cards and columns; likely UI-driven customization without structured config tables, numeric ranges, or advanced patterns. | -| [Delivery Plans FAQs](https://learn.microsoft.com/en-us/azure/devops/boards/plans/faqs?view=azure-devops) | 0.30 | FAQ for Delivery Plans; likely general usage Q&A without detailed error-code mappings or numeric constraints. 
| | [Dependency Tracker](https://learn.microsoft.com/en-us/azure/devops/boards/extensions/dependency-tracker?view=azure-devops) | 0.30 | Describes usage and support status of the Dependency Tracker extension with a high-level note to use Delivery Plans instead. No detailed configuration parameters, limits, error codes, or decision matrices are evident from the summary. | | [End of sprint activities](https://learn.microsoft.com/en-us/azure/devops/boards/sprints/end-sprint-activities?view=azure-devops) | 0.30 | End-of-sprint hygiene and process steps; does not include numeric thresholds, configuration tables, or detailed troubleshooting content. | -| [FAQs](https://learn.microsoft.com/en-us/azure/devops/boards/faqs?view=azure-devops) | 0.30 | General Azure Boards FAQ; mostly high-level Q&A and links, not focused on detailed error codes or configuration matrices. | +| [Excel FAQs](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/office/faqs?view=azure-devops) | 0.30 | Excel and Azure Boards FAQ focuses on support status and general guidance (use CSV import/export instead of Office add-in); no indication of numeric limits, config tables, or error-code-based troubleshooting. | | [Filter your board](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/filter-backlogs-boards-plans?view=azure-devops) | 0.30 | Explains filtering backlogs, boards, and plans; describes UI-driven filtering, not product-specific configuration parameters or quotas. | | [Follow work items and PRs](https://learn.microsoft.com/en-us/azure/devops/boards/work-items/follow-work-items?view=azure-devops) | 0.30 | Explains following work items and PRs for notifications; usage-level feature description, no detailed config parameters or quotas. 
| | [GitHub integration](https://learn.microsoft.com/en-us/azure/devops/boards/github/?view=azure-devops) | 0.30 | Overview of Azure Boards–GitHub integration; mostly conceptual and high-level without detailed config parameter tables in the summary. | @@ -173,6 +185,8 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Test Plans (u | [Create your backlog](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/create-your-backlog?view=azure-devops) | 0.20 | Backlog creation and management overview/tutorial; mainly conceptual Scrum/Azure Boards usage without expert-only numeric or config details. | | [Customize cards](https://learn.microsoft.com/en-us/azure/devops/boards/boards/customize-cards?view=azure-devops) | 0.20 | Explains customizing board cards and visual styling; standard how-to content without detailed configuration tables or product-specific constraints. | | [Define features & epics](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/define-features-epics?view=azure-devops) | 0.20 | Describes using features and epics to organize backlogs; largely conceptual and UI-driven without product-specific numeric thresholds or configs. | +| [Delivery Plans FAQs](https://learn.microsoft.com/en-us/azure/devops/boards/plans/faqs?view=azure-devops) | 0.20 | FAQ about Delivery Plans usage and availability; summary indicates general Q&A and feature scope, not numeric limits, configuration tables, or error-code-based troubleshooting. | +| [FAQs](https://learn.microsoft.com/en-us/azure/devops/boards/faqs?view=azure-devops) | 0.20 | General Azure Boards FAQ; summary points to other pages for specifics and roadmap; likely high-level usage questions rather than detailed limits, configuration, or diagnostic content. 
| | [Manage bugs](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/manage-bugs?view=azure-devops) | 0.20 | Content is about defining and managing bugs as a work item type in Azure Boards; appears to be process guidance and feature overview without product-specific numeric limits, configuration parameter tables, or error-code-based troubleshooting. | | [Manage columns](https://learn.microsoft.com/en-us/azure/devops/boards/boards/add-columns?view=azure-devops) | 0.20 | Describes managing columns on a board (add/edit/map); appears as UI/usage guidance without expert-only limits, configs, or error codes. | | [Manage issues](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/manage-issues-impediments?view=azure-devops) | 0.20 | Describes how to track and manage impediments/problems in Azure Boards; focuses on work item usage and concepts, not on detailed configuration options, limits, or troubleshooting mappings. | @@ -189,15 +203,15 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Test Plans (u | [Query history and discussion](https://learn.microsoft.com/en-us/azure/devops/boards/queries/history-and-auditing?view=azure-devops) | 0.20 | Describes querying work item history for auditing; lacks error codes, numeric thresholds, or detailed configuration options. Primarily conceptual and procedural. | | [Quick reference - concepts and tasks](https://learn.microsoft.com/en-us/azure/devops/boards/work-items/quick-ref?view=azure-devops) | 0.20 | Quick reference/navigation to concepts and tasks; summary doesn’t indicate detailed limits, configs, or troubleshooting mappings. | | [Remove, delete, or restore](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/remove-delete-work-items?view=azure-devops) | 0.20 | Page is a how-to guide for removing, deleting, and restoring work items in Azure Boards. 
It appears to focus on UI steps and process guidance without numeric limits, configuration parameter tables, error-code-based troubleshooting, or other product-specific expert details as defined in the sub-skill types. | -| [Roadmap and features timeline](https://learn.microsoft.com/en-us/azure/devops/boards/github/features-timeline?view=azure-devops) | 0.20 | Features timeline/roadmap is release/marketing information, not technical expert knowledge for skills. | | [Set WIP limits](https://learn.microsoft.com/en-us/azure/devops/boards/boards/wip-limits?view=azure-devops) | 0.20 | Page explains what WIP limits are and how to use them conceptually in Azure Boards; no evidence of product-specific numeric limits, quotas, or configuration tables with exact values. | | [Set up your backlogs & boards](https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/create-your-backlog?view=azure-devops) | 0.20 | Covers creating and managing a product backlog; general process and UI usage without specific limits, configuration ranges, or troubleshooting details. | | [Track dependencies](https://learn.microsoft.com/en-us/azure/devops/boards/plans/track-dependencies?view=azure-devops) | 0.20 | Describes visual dependency tracking in Delivery Plans; no evidence of numeric thresholds, decision matrices, or configuration parameter tables. Primarily UI and workflow guidance. | -| [Work item form caching](https://learn.microsoft.com/en-us/azure/devops/boards/faqs?view=azure-devops) | 0.20 | General FAQ page; summary does not indicate specific limits, error codes, configuration tables, or other detailed expert-only data. Likely high-level Q&A and navigation to other docs. | | [Scrum concepts](https://learn.microsoft.com/en-us/azure/devops/boards/sprints/scrum-key-concepts?view=azure-devops) | 0.15 | Glossary of Scrum and sprint terms; conceptual definitions, not expert configuration or troubleshooting. | | [5. 
Share your sprint plan](https://learn.microsoft.com/en-us/azure/devops/boards/sprints/share-plan?view=azure-devops) | 0.10 | Simple instructions for sharing a sprint plan via URL/email/print; no expert-only configuration, limits, or decision matrices. | | [Agile culture](https://learn.microsoft.com/en-us/azure/devops/boards/plans/agile-culture?view=azure-devops) | 0.10 | Discusses agile culture and scaling; conceptual guidance without product-specific numeric thresholds or configs. | | [Agile glossary](https://learn.microsoft.com/en-us/azure/devops/boards/work-items/agile-glossary?view=azure-devops) | 0.10 | Agile glossary of terms is conceptual terminology, not configuration, limits, error codes, or other expert-only technical details. | +| [Roadmap and features timeline](https://learn.microsoft.com/en-us/azure/devops/boards/github/features-timeline?view=azure-devops) | 0.10 | Features timeline/roadmap content is primarily release and marketing-style information, not detailed limits, configuration parameters, or troubleshooting mappings. | | [What is Azure Boards?](https://learn.microsoft.com/en-us/azure/devops/boards/get-started/what-is-azure-boards?view=azure-devops) | 0.10 | High-level overview of Azure Boards and its benefits without product-specific limits, configs, or decision matrices. | | [CMMI background](https://learn.microsoft.com/en-us/azure/devops/boards/work-items/guidance/cmmi/guidance-background-to-cmmi?view=azure-devops) | 0.05 | Background notes on CMMI with references to external books; conceptual context, not product-specific expert configuration or limits. | | [Sign up for Azure Boards](https://learn.microsoft.com/en-us/azure/devops/boards/get-started/sign-up-invite-teammates?view=azure-devops) | - | Quickstart for signing up and inviting teammates to Azure Boards; no limits, configuration tables, error codes, or other product-specific expert details. 
| +| [Work item form caching](https://learn.microsoft.com/en-us/azure/devops/boards/faqs?view=azure-devops) | - | FAQ page appears to provide general usage answers and links to other resources; no clear indication of detailed limits, configuration tables, error-code mappings, or other product-specific expert data per the defined categories. | diff --git a/products/azure-chaos-studio/azure-chaos-studio.csv b/products/azure-chaos-studio/azure-chaos-studio.csv index f83b5929..4de2d5b2 100644 --- a/products/azure-chaos-studio/azure-chaos-studio.csv +++ b/products/azure-chaos-studio/azure-chaos-studio.csv @@ -45,7 +45,7 @@ https://learn.microsoft.com/en-us/azure/chaos-studio/chaos-studio-tutorial-servi https://learn.microsoft.com/en-us/azure/chaos-studio/chaos-studio-tutorial-service-direct-portal,Portal,Create an experiment using a service-direct fault with Chaos Studio - Azure Chaos Studio,,Create an experiment that uses a service-direct fault with Azure Chaos Studio to fail over an Azure Cosmos DB instance.,"You can use a chaos experiment to verify that your application is resilient to failures by causing those failures in a controlled environment. In this article, you cause a multi-read, single-write Azure Cosmos DB failover by using a chaos experiment and Azure Chaos Studio. Running this experiment can help you defend against data loss when a failover event occurs. You can use these same steps to set up and run an experiment for any service-direct fault. 
A service-direct fault runs directly against ",2024-10-14T08:00:00.000Z,how-to,,0.35,False,Tutorial for a Cosmos DB service-direct fault; primarily step-by-step usage without broad configuration or error reference.,unchanged https://learn.microsoft.com/en-us/azure/chaos-studio/chaos-studio-versions,Chaos Mesh version compatibility,Azure Chaos Studio compatibility - Azure Chaos Studio,Check OS and tool compatibility for Chaos Studio,Understand the compatibility of Azure Chaos Studio with specific operating systems and tools.,The following reference shows relevant version support and compatibility within Chaos Studio.,2025-10-24T05:09:00.000Z,overview,configuration,0.7,True,"Version support and compatibility reference typically lists specific OS versions, agent versions, and tool support matrices that are detailed, product-specific configuration/compatibility data.",unchanged https://learn.microsoft.com/en-us/azure/chaos-studio/experiment-examples,Experiment examples,Azure CLI & Azure portal example experiments - Azure Chaos Studio,Create Chaos Studio experiments via CLI and portal,See real examples of creating various chaos experiments in the Azure portal and CLI.,"This article provides examples for creating experiments from your command line (CLI) and Azure portal parameter examples for various experiments. You can copy and paste the following commands into the CLI or Azure portal, and edit them for your specific resources. 
Here's an example of where you would copy and paste the Azure portal parameter into: To save one of the ""experiment.json"" examples shown below, simply type nano experiment.json into your Cloud Shell, copy and paste any of the below exper",2024-09-11T16:50:00.000Z,reference,integrations,0.7,True,"Provides concrete CLI and JSON examples for experiment creation; likely includes request schema, parameter names, and values specific to Chaos Studio’s API and portal integration.",unchanged -https://learn.microsoft.com/en-us/azure/chaos-studio/sample-policy-targets,Azure policy definitions,Azure Policy samples for adding resources to Chaos Studio - Azure Chaos Studio,Use Azure Policy to onboard resources to Chaos Studio,Sample Azure policies to add resources to Azure Chaos Studio by using targets and capabilities.,"This article includes sample Azure Policy definitions that create targets and capabilities for a specific resource type. You can automatically add resources to Azure Chaos Studio. First, you deploy these samples as custom policy definitions. Then you assign the policy to a scope. In these samples, we add service-direct targets and capabilities for each supported resource type by using targets and capabilities. Note Each of these policies differs slightly, and you should consult the documentation of the Re",2024-10-14T08:00:00.000Z,sample,configuration,0.8,True,"Contains Azure Policy definitions that create Chaos Studio targets/capabilities, including policy rule structure and resource details specific to Chaos Studio.",unchanged +https://learn.microsoft.com/en-us/azure/chaos-studio/sample-policy-targets,Azure policy definitions,Azure Policy samples for adding resources to Chaos Studio - Azure Chaos Studio,Use Azure Policy to add Chaos Studio targets,Sample Azure policies to add resources to Azure Chaos Studio by using targets and capabilities.,"This article includes sample Azure Policy definitions that create targets and capabilities for a specific resource type. 
You can automatically add resources to Azure Chaos Studio. First, you deploy these samples as custom policy definitions. Then you assign the policy to a scope. In these samples, we add service-direct targets and capabilities for each supported resource type by using targets and capabilities. Note Each of these policies differs slightly, and you should consult the documentation of the Re",2026-04-20T17:15:00.000Z,sample,configuration,0.7,True,"Page provides concrete Azure Policy definition samples for configuring Chaos Studio targets and capabilities per resource type. These include specific policy rule structures, parameter names, and configuration patterns that are product-specific and not just conceptual, fitting the configuration sub-skill.",updated https://learn.microsoft.com/en-us/azure/chaos-studio/sample-template-experiment,Experiments,Azure Resource Manager template samples for chaos experiments - Azure Chaos Studio,Define Chaos Studio experiments using ARM templates,Sample Azure Resource Manager templates to create Azure Chaos Studio experiments.,This article includes sample Azure Resource Manager templates (ARM templates) to create a chaos experiment in Azure Chaos Studio. 
Each sample includes a template file and a parameters file with sample values to provide to the template.,2024-10-14T08:00:00.000Z,sample,configuration,0.85,True,"ARM samples show exact schema and properties for experiment resources, including actions/selectors; this is product-specific configuration detail.",unchanged https://learn.microsoft.com/en-us/azure/chaos-studio/sample-template-targets,Targets and capabilities,Resource Manager template samples for targets and capabilities in Chaos Studio - Azure Chaos Studio,Deploy Chaos Studio targets and capabilities via ARM,Sample Azure Resource Manager (ARM) templates to add resources to Azure Chaos Studio by using targets and capabilities.,This article includes sample Azure Resource Manager templates (ARM templates) to create targets and capabilities to add a resource to Azure Chaos Studio. Each sample includes a template file and a parameters file with sample values to provide to the template.,2024-10-14T08:00:00.000Z,sample,configuration,0.85,True,"ARM template samples expose specific resource types, properties, and parameter structures for Chaos Studio targets/capabilities—detailed configuration reference.",unchanged https://learn.microsoft.com/en-us/azure/chaos-studio/troubleshooting,Troubleshoot common issues,Troubleshoot common Azure Chaos Studio problems - Azure Chaos Studio,Troubleshoot common Azure Chaos Studio issues,Learn to troubleshoot common problems when you use Azure Chaos Studio.,"As you use Azure Chaos Studio, you might occasionally encounter some problems. 
This article explains common problems and troubleshooting steps.",2025-06-26T22:16:00.000Z,troubleshooting,troubleshooting,0.9,True,"Explicit troubleshooting article; likely organized by specific errors and symptoms with causes and resolutions unique to Chaos Studio (for example, permission issues, target onboarding failures).",unchanged diff --git a/products/azure-chaos-studio/report.md b/products/azure-chaos-studio/report.md index f491a035..a4ae04fb 100644 --- a/products/azure-chaos-studio/report.md +++ b/products/azure-chaos-studio/report.md @@ -1,12 +1,12 @@ --- -generated_at: '2026-03-16' +generated_at: '2026-04-26' category_descriptions: security: 'Securing Chaos Studio: identities, roles, permissions, CMK encryption, network/IP controls, Private Link, VNet injection, AKS auth, and safely controlling experiment targets/capabilities.' - configuration: 'Configuring Chaos Studio: ARM/Bicep experiment definitions, deploying - agents/targets, parameters, Azure Monitor/Workbook integration, OS/tool compatibility, - and onboarding via Azure Policy' + configuration: 'Configuring Chaos Studio: deploy agents/targets via ARM/Bicep/Policy, + define experiments and parameters, set up monitoring with Azure Monitor/Workbooks, + and verify OS/tool compatibility.' troubleshooting: Diagnosing and fixing Chaos Studio and Chaos Agent issues, including installation/health problems, VM agent status checks, known errors, and common experiment or connectivity failures. @@ -17,12 +17,13 @@ category_descriptions: experiments into automated workflows skill_description: Expert knowledge for Chaos Studio development including troubleshooting, limits & quotas, security, configuration, and integrations & coding patterns. Use - when defining ARM/Bicep experiments, deploying Chaos Agents, using CLI/REST, or - integrating with Azure Monitor, and other Chaos Studio related development tasks. 
- Not for Azure Monitor (use azure-monitor), Azure Resiliency (use azure-resiliency), + when securing Chaos Studio targets, deploying agents via ARM/Policy, configuring + experiments, or using CLI/REST APIs, and other Chaos Studio related development + tasks. Not for Azure Monitor (use azure-monitor), Azure Resiliency (use azure-resiliency), Azure Reliability (use azure-reliability), Azure Site Recovery (use azure-site-recovery). -use_when: Use when defining ARM/Bicep experiments, deploying Chaos Agents, using CLI/REST, - or integrating with Azure Monitor, and other Chaos Studio related development tasks. +use_when: Use when securing Chaos Studio targets, deploying agents via ARM/Policy, + configuring experiments, or using CLI/REST APIs, and other Chaos Studio related + development tasks. confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Resiliency (use azure-resiliency), Azure Reliability (use azure-reliability), Azure Site Recovery (use azure-site-recovery). @@ -39,8 +40,8 @@ confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Resiliency ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 0 -- **Unchanged**: 51 +- **Updated Pages**: 1 +- **Unchanged**: 50 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-chaos-studio/azure-chaos-studio.csv` @@ -57,6 +58,11 @@ confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Resiliency ## Changes +### Updated Pages + +- [Azure policy definitions](https://learn.microsoft.com/en-us/azure/chaos-studio/sample-policy-targets) + - Updated: 2024-10-14T08:00:00.000Z → 2026-04-20T17:15:00.000Z + ## Classified Pages | TOC Title | Type | Confidence | Reason | @@ -74,7 +80,6 @@ confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Resiliency | [Targets and capabilities](https://learn.microsoft.com/en-us/azure/chaos-studio/sample-template-targets) | configuration | 0.85 | ARM template samples expose specific resource 
types, properties, and parameter structures for Chaos Studio targets/capabilities—detailed configuration reference. | | [Accessing container image details](https://learn.microsoft.com/en-us/azure/chaos-studio/azure-container-instance-details) | security | 0.80 | Provides exact container image details from MCR used as a bastion for private networking; used in security reviews and allowlisting—product-specific security artifact. | | [Authorize Chaos Studio IP addresses for an AKS cluster](https://learn.microsoft.com/en-us/azure/chaos-studio/chaos-studio-aks-ip-ranges) | security | 0.80 | Explains how to allow Chaos Studio IP addresses to reach AKS, likely including specific IP ranges and network rule configuration—product-specific security/networking. | -| [Azure policy definitions](https://learn.microsoft.com/en-us/azure/chaos-studio/sample-policy-targets) | configuration | 0.80 | Contains Azure Policy definitions that create Chaos Studio targets/capabilities, including policy rule structure and resource details specific to Chaos Studio. | | [Bicep](https://learn.microsoft.com/en-us/azure/chaos-studio/chaos-studio-bicep) | configuration | 0.80 | Bicep sample defines Chaos Studio resources and parameters; includes specific property names and structures unique to the service. | | [Configure an experiment using customer-managed keys (CMK)](https://learn.microsoft.com/en-us/azure/chaos-studio/chaos-studio-configure-customer-managed-keys) | security | 0.80 | Covers CMK setup with user-assigned managed identities and Azure Blob Storage; this typically includes specific role assignments, key vault/identity configuration details, and access scope requirements. | | [Emit telemetry to App Insights](https://learn.microsoft.com/en-us/azure/chaos-studio/chaos-studio-set-up-app-insights) | integrations | 0.80 | Shows how to configure Chaos Studio agent-based experiments to emit specific telemetry events to Application Insights—product-specific integration settings. 
| @@ -84,6 +89,7 @@ confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Resiliency | [Use Microsoft Entra authentication with Chaos Mesh](https://learn.microsoft.com/en-us/azure/chaos-studio/chaos-studio-aks-authentication) | security | 0.80 | Describes supported authentication methods between Chaos Studio and AKS using Microsoft Entra, including auth flows and permissions specific to this integration. | | [Verify agent status](https://learn.microsoft.com/en-us/azure/chaos-studio/chaos-agent-verify-status) | troubleshooting | 0.80 | Explains agent status states and how to troubleshoot when not running correctly—symptom to cause/solution guidance specific to Chaos Agent. | | [Concepts](https://learn.microsoft.com/en-us/azure/chaos-studio/chaos-agent-concepts) | security | 0.75 | Deep dive into agent behavior, network access requirements, identities, and security considerations—product-specific security and connectivity configuration. | +| [Azure policy definitions](https://learn.microsoft.com/en-us/azure/chaos-studio/sample-policy-targets) | configuration | 0.70 | Page provides concrete Azure Policy definition samples for configuring Chaos Studio targets and capabilities per resource type. These include specific policy rule structures, parameter names, and configuration patterns that are product-specific and not just conceptual, fitting the configuration sub-skill. | | [Chaos Mesh version compatibility](https://learn.microsoft.com/en-us/azure/chaos-studio/chaos-studio-versions) | configuration | 0.70 | Version support and compatibility reference typically lists specific OS versions, agent versions, and tool support matrices that are detailed, product-specific configuration/compatibility data. 
| | [Experiment examples](https://learn.microsoft.com/en-us/azure/chaos-studio/experiment-examples) | integrations | 0.70 | Provides concrete CLI and JSON examples for experiment creation; likely includes request schema, parameter names, and values specific to Chaos Studio’s API and portal integration. | | [Limitations and known issues](https://learn.microsoft.com/en-us/azure/chaos-studio/chaos-studio-limitations) | limits-quotas | 0.70 | Limitations and known issues pages usually enumerate concrete constraints (unsupported scenarios, resource types, or behaviors) that are highly product-specific and not general knowledge. | diff --git a/products/azure-cloud-adoption-framework/azure-cloud-adoption-framework.csv b/products/azure-cloud-adoption-framework/azure-cloud-adoption-framework.csv index e8b91e73..81a0ed7d 100644 --- a/products/azure-cloud-adoption-framework/azure-cloud-adoption-framework.csv +++ b/products/azure-cloud-adoption-framework/azure-cloud-adoption-framework.csv @@ -109,10 +109,10 @@ https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/enterpris https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/enterprise-scale/testing-approach,Testing approach for Azure landing zones,Testing approach for Azure landing zones - Cloud Adoption Framework,Test Azure landing zone deployments and policies,Testing approach for Azure landing zones,"Note: This article only applies to Microsoft Azure and not to any other Microsoft Cloud offerings such as Microsoft 365 or Microsoft Dynamics 365. Some organizations might want to test their Azure landing zones platform deployment's Azure Policy definitions and assignments, role-based access control (RBAC) custom roles and assignments, and so on. The tests can be completed via automation by using Azure Resource Manager templates (ARM templates), Terraform, Bicep, or manually via the Azure portal. 
T",2025-04-02T17:02:00.000Z,concept-article,readiness,0.8,True,"Provides a structured testing approach for landing zone deployments, including policy definitions, RBAC, and automation via ARM/Terraform/Bicep—expert validation and readiness practice.",unchanged https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/enterprise-scale/transition,Transition to the Azure landing zone reference architecture,Transition an existing Azure environment to the Azure landing zone reference architecture - Cloud Adoption Framework,Transition existing Azure environments to landing zones,Learn how to onboard existing environments and move resources to the Azure landing zone reference architecture.,"Many organizations have an existing Azure footprint, one or more subscriptions, and potentially an existing management group structure. Depending on their business requirements and scenarios, they might have Azure resources deployed, such as Azure VPN Gateway or Azure ExpressRoute for hybrid connectivity. This article provides recommendations to help your organization navigate changes based on your existing Azure environment that's transitioning into the Azure landing zone reference architecture",2025-12-19T21:03:00.000Z,concept-article,readiness,0.84,True,"Gives detailed recommendations for onboarding existing subscriptions, management groups, and connectivity into the Azure landing zone reference architecture—expert environment transition and landing zone design.",unchanged https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/,What is an Azure landing zone?,What is an Azure landing zone? - Cloud Adoption Framework,Understand Azure landing zones as environment foundation,Learn how a landing zone provides the basic building block of any cloud adoption environment.,"An Azure landing zone is the standardized and recommended approach for all organizations utilizing Azure. 
It provides a consistent way to set up and manage your Azure environment at scale. It ensures consistency across your organization by aligning with key requirements for security, compliance, and operational efficiency through platform and application landing zones. They provide a well-architected foundation aligned with core design principles across eight design areas.",2025-12-15T13:02:00.000Z,concept-article,readiness,0.85,True,"Defines Azure landing zones, platform vs application landing zones, and their role in secure, governed environments; core landing zone readiness knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/align-approach-duplicate-brownfield-audit-only,Transition by duplicating a landing zone management group,Scenario- Transition an environment by duplicating a landing zone management group - Cloud Adoption Framework,Use duplicate management groups to audit landing zone alignment,Learn about an approach to transition to the Azure landing zone reference architecture by duplicating a landing zone management group.,"This article describes an example approach that transitions an environment to the Azure landing zone reference architecture by duplicating the landing zone management group with policies in audit-only mode. With this approach, you can quickly access the new desired target architecture and then assess the application or workload subscriptions for compliance. 
This approach eliminates the risk of affecting the application teams because the policies are in audit-only mode.",2025-12-19T21:03:00.000Z,concept-article,readiness,0.82,True,"Describes a specific transition approach using duplicated management groups with audit-only policies to assess compliance—expert, nuanced landing zone migration pattern.",unchanged +https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/align-approach-duplicate-brownfield-audit-only,Transition by duplicating a landing zone management group,Scenario- Transition an environment by duplicating a landing zone management group - Cloud Adoption Framework,Duplicate landing zone management group in audit-only mode,Learn about an approach to transition to the Azure landing zone reference architecture by duplicating a landing zone management group.,"This article describes an example approach that transitions an environment to the Azure landing zone reference architecture by duplicating the landing zone management group with policies in audit-only mode. With this approach, you can quickly access the new desired target architecture and then assess the application or workload subscriptions for compliance. This approach eliminates the risk of affecting the application teams because the policies are in audit-only mode.",2026-04-23T18:04:00.000Z,concept-article,readiness,0.84,True,"Describes a specific technical approach to transition to the Azure landing zone reference architecture by duplicating a landing zone management group with Azure Policy set to audit-only, then assessing workload subscriptions for compliance. 
This is concrete landing zone and management group implementation guidance, clearly in the readiness domain and contains expert, scenario-driven details not captured by general training data.",updated https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/align-scenario-multiple-management-groups,Transition management groups,Scenario- Transition management groups to the Azure landing zone reference architecture - Cloud Adoption Framework,Transition existing management groups to landing zone hierarchy,Learn how to transition existing Azure environments comprised of single or few management groups into the Azure landing zone reference architecture.,"This article describes considerations and instructions to migrate and transition your existing environment into the conceptual architecture of Azure landing zones. This scenario covers single or multiple management groups. In this scenario, it's assumed that the customer already uses Azure. They have a management group hierarchy with multiple subscriptions that host a few applications or services within the platform. 
But their current implementation limits their scalability and growth related to",2025-12-19T21:03:00.000Z,concept-article,readiness,0.84,True,Provides prescriptive guidance for reworking management group hierarchies and subscriptions into the landing zone reference architecture—deep readiness and platform structure expertise.,unchanged https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/align-scenario-regional-org,Transition a regional organization environment,Scenario- Transition a regional organization environment to the Azure landing zone reference architecture - Cloud Adoption Framework,Align regional dev/test/prod structures to landing zones,Learn how to transition existing regional Azure environments comprised of multiple management groups into the Azure landing zone reference architecture.,"This article describes considerations and instructions to migrate and transition your Azure environment into the Azure landing zone reference architecture. This scenario covers a regional organization with management groups that are separated into development, testing, and production (dev/test/prod) environments. In this scenario, the customer has a large footprint on Azure. They have a management group hierarchy that's organized by dev/test/prod environments and then by region. 
Their current im",2025-12-19T21:03:00.000Z,concept-article,readiness,0.84,True,"Covers a complex regional organization scenario with dev/test/prod management groups and how to transition to the landing zone architecture—expert multi-region, multi-environment readiness guidance.",unchanged -https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/align-scenario-single-subscription,Transition a single subscription with no management groups,Scenario - Transition a single subscription with no management groups to the Azure landing zone reference architecture - Cloud Adoption Framework,Migrate a single-subscription Azure environment to landing zones,Learn how to transition existing Azure environments with a single subscription with no management groups into the Azure landing zone reference architecture.,"This article describes considerations and instructions to migrate and transition your Azure environment into the Azure landing zone reference architecture. This scenario covers a single subscription with no management groups. In this scenario, the customer already uses Azure and already hosts a few applications or services within the platform. But their current implementation limits their scalability and growth related to their cloud-first strategy. 
As part of this expansion, they plan to migrate ",2025-12-19T21:03:00.000Z,concept-article,readiness,0.84,True,Scenario article with concrete steps and considerations for moving from a single subscription with no management groups into the landing zone architecture—expert subscription and management group redesign.,unchanged +https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/align-scenario-single-subscription,Transition a single subscription with no management groups,Scenario - Transition a single subscription with no management groups to the Azure landing zone reference architecture - Cloud Adoption Framework,Migrate single-subscription Azure environments to landing zone,Learn how to transition existing Azure environments with a single subscription with no management groups into the Azure landing zone reference architecture.,"This article describes considerations and instructions to migrate and transition your Azure environment into the Azure landing zone reference architecture. This scenario covers a single subscription with no management groups. In this scenario, the customer already uses Azure and already hosts a few applications or services within the platform. But their current implementation limits their scalability and growth related to their cloud-first strategy. As part of this expansion, they plan to migrate ",2026-04-23T18:04:00.000Z,concept-article,readiness,0.86,True,"Provides detailed, scenario-specific guidance for transitioning an existing single-subscription Azure environment with no management groups into the Azure landing zone reference architecture, including concrete landing zone alignment considerations and migration steps. 
This is deep implementation guidance about landing zone architecture and environment preparation, which fits the readiness category and goes beyond generic conceptual content.",updated https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/design-area/azure-ad-define,Define a Microsoft Entra tenant,Define Microsoft Entra tenants - Cloud Adoption Framework,Define and configure Microsoft Entra tenants for Azure,Understand how to set up Microsoft Entra tenants.,"A Microsoft Entra tenant provides identity and access management, which is an important part of your security posture. A Microsoft Entra tenant ensures that authenticated and authorized users only access the resources to which they have permissions. Microsoft Entra ID provides these services to applications and services deployed in and outside of Azure (such as on-premises or third-party cloud providers). Microsoft Entra ID is also used by software as a service (SaaS) applications such as Micros",2025-12-19T21:03:00.000Z,concept-article,readiness,0.85,True,Guides how to set up Entra tenants for identity and access as part of landing zone design; core identity foundation for readiness.,unchanged https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/design-area/azure-billing-cloud-solution-provider,Cloud solution provider,Cloud Solution Provider service - Cloud Adoption Framework,Understand CSP service agreements in Azure landing zones,Learn how to understand Cloud Solution Provider (CSP) service agreements and Microsoft Entra tenants.,The Cloud Solution Provider (CSP) service gives Microsoft partners access to Microsoft cloud services within one platform. It supports partners to: Azure in CSP is an Azure plan with various subscriptions that are hosted by the partner's Microsoft Partner Agreement (MPA). The MPA is similar to the Microsoft customer agreement. Both are hosted on the modern commerce platform and use a simplified purchase agreement. 
Important: The partner CSP completely manages an MPA.,2025-12-19T21:03:00.000Z,concept-article,readiness,0.75,True,"Describes CSP-based Azure plans, partner-managed MPAs, and their implications for tenant and subscription setup; environment and commercial design area.",unchanged https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/design-area/azure-billing-enterprise-agreement,Enterprise Agreement,Azure Enterprise Agreement enrollment design area guidance - Cloud Adoption Framework,Design Enterprise Agreement enrollment for Azure landing zones,Understand the Enterprise Agreement enrollments and Microsoft Entra tenants design area.,"Enterprise Agreement enrollment represents the commercial relationship between Microsoft and how your organization uses Azure. It provides billing foundation for your subscriptions and how your digital estate is administered. The Microsoft Cost Management blade in the Azure portal helps you to manage your Enterprise Agreement enrollment. An enrollment often represents an organization's hierarchy, including departments, accounts, and subscriptions. This hierarchy represents cost centers within an",2025-12-19T21:03:00.000Z,concept-article,readiness,0.8,True,"Provides detailed guidance on EA enrollment hierarchy, cost centers, and administration as a design area for landing zones; foundational environment setup.",unchanged @@ -401,7 +401,7 @@ https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/scenarios/sap/s https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/scenarios/sap/sap-lza-identify-sap-data-sources,Identify SAP data sources,Identify SAP data sources - Cloud Adoption Framework,Identify SAP data sources for Azure integration,Learn how to identify SAP applications and connectors to integrate SAP data with Azure data services.,"This article is part of the ""SAP extend and innovate data: Best practices"" article series. 
Digital transformation requires a seamless combination of intelligence derived from data across business operations to meet the business objectives of an enterprise. Enterprises use SAP applications as enterprise resource planning systems (ERP), line of business (LOB) SaaS applications, enterprise data warehouses, business intelligence, or integration platforms. The siloed data in SAP systems can be harnes",2023-03-21T00:00:00.000Z,concept-article,adoption-patterns,0.8,True,Concrete guidance on discovering SAP apps and connectors to integrate SAP data with Azure data services—scenario-specific adoption pattern.,unchanged https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/scenarios/sap/sap-lza-security-operations,Security operations,Security operations for SAP on Azure - Cloud Adoption Framework,Implement security operations for SAP on Azure,Learn how to implement a security operation for SAP in Microsoft Cloud to ensure your organization's sensitive data and applications are protected.,"This article is part of the ""SAP extend and innovate security: Best practices"" article series. This article describes best practices for security operations to secure your SAP environment in Azure. 
Implement a comprehensive security operation for SAP in Microsoft Cloud to ensure your organization's sensitive data and applications are protected from cyber threats.",2026-01-16T21:04:00.000Z,concept-article,security,0.86,True,"Best practices for running a security operations function for SAP in Microsoft Cloud, including threat monitoring and incident response patterns.",unchanged https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/scenarios/sap/sap-power-platform-architecture-workflow,SAP and Power Platform architecture workflow,SAP and Power Platform Architecture Workflow - Cloud Adoption Framework,Implement SAP and Power Platform integration workflow,View an architecture workflow for integrations of SAP and Microsoft Power Platform and learn about security considerations.,This article is part of the SAP and Power Platform article series:,2024-11-05T18:01:00.000Z,concept-article,scenarios,0.88,True,"Provides an architecture workflow for SAP–Power Platform integrations plus security considerations. This is detailed, workload-specific architecture and integration guidance for a particular scenario (SAP + Power Platform), aligning directly with CAF scenarios and containing expert implementation knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/scenarios/sap/sap-power-platform-extend-landing-zone,Extend an SAP landing zone to support Power Platform,Extend an SAP Landing Zone to Support Power Platform - Cloud Adoption Framework,Extend SAP landing zones to integrate with Power Platform,Learn how to implement the integration of SAP with Microsoft Power Platform by extending your SAP landing zone.,"When you integrate SAP systems with Microsoft Power Platform, the actions you take depend on your use cases and the connector that you use. Each connector has unique technical requirements that you need to address. 
This article outlines integration options and provides links to implementation guides that can help you establish the connections and technical setup required for each scenario. Based on your organization's needs and use cases, you can select and implement the appropriate components t",2026-04-16T22:04:00.000Z,concept-article,scenarios,0.86,True,"Scenario-specific CAF guidance for integrating SAP with Microsoft Power Platform by extending an SAP landing zone. It discusses connector-specific technical requirements, integration options, and links to implementation guides for different SAP–Power Platform use cases. This is workload-specific (SAP) landing zone extension and integration guidance, matching the 'scenarios' sub-skill and containing detailed, implementation-focused knowledge not generally known to LLMs.",updated +https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/scenarios/sap/sap-power-platform-extend-landing-zone,Extend an SAP landing zone to support Power Platform,Extend an SAP Landing Zone to Support Power Platform - Cloud Adoption Framework,Extend SAP landing zones to integrate with Power Platform,Learn how to implement the integration of SAP with Microsoft Power Platform by extending your SAP landing zone.,"When you integrate SAP systems with Microsoft Power Platform, the actions you take depend on your use cases and the connector that you use. Each connector has unique technical requirements that you need to address. This article outlines integration options and provides links to implementation guides that can help you establish the connections and technical setup required for each scenario. Based on your organization's needs and use cases, you can select and implement the appropriate components t",2026-04-16T22:04:00.000Z,concept-article,scenarios,0.86,True,"Scenario-specific CAF guidance for integrating SAP with Microsoft Power Platform by extending an SAP landing zone. 
It discusses connector-specific technical requirements, integration options, and links to implementation guides for different SAP–Power Platform use cases. This is workload-specific (SAP) landing zone extension and integration guidance, matching the 'scenarios' sub-skill and containing detailed, implementation-focused knowledge not generally known to LLMs.",unchanged https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/scenarios/sap/sap-power-platform-fundamental,Extend SAP with Power Platform,SAP and Power Platform Fundamentals - Cloud Adoption Framework,Apply SAP and Power Platform integration fundamentals,Learn how to extend SAP with Microsoft Power Platform to create better end-to-end business solutions.,This article is part of the SAP and Power Platform article series:,2024-11-05T18:01:00.000Z,concept-article,scenarios,0.7,True,"Part of a CAF scenario series focused on extending SAP with Power Platform. While high-level in name, these fundamentals typically include concrete integration patterns, connector usage considerations, and SAP-specific guidance for Power Platform, which is workload-specific CAF adaptation and thus scenarios.",unchanged https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/scenarios/sap/strategy,SAP,Strategy for SAP adoption on Azure - Cloud Adoption Framework,Define strategy for SAP adoption on Azure,"Explore why organizations move SAP workloads to the cloud and how it drives innovation, reduces costs, and ensures flexibility. 
Start your cloud journey today.","This article helps you understand why SAP cloud adoption matters, what motivates organizations to move SAP workloads to the cloud, and how to measure progress throughout your migration.",2026-01-26T16:07:00.000Z,concept-article,scenarios,0.75,True,"SAP-specific CAF scenario content explaining motivations, drivers, and progress measurement for SAP cloud adoption, tailored to this workload rather than generic strategy.",unchanged https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/scenarios/sap/strategy,SAP adoption,Strategy for SAP adoption on Azure - Cloud Adoption Framework,Define cloud strategy for SAP workloads on Azure,"Explore why organizations move SAP workloads to the cloud and how it drives innovation, reduces costs, and ensures flexibility. Start your cloud journey today.","This article helps you understand why SAP cloud adoption matters, what motivates organizations to move SAP workloads to the cloud, and how to measure progress throughout your migration.",2026-01-26T16:07:00.000Z,concept-article,strategy,0.78,True,"Discusses motivations, business drivers, and progress measurement for SAP cloud adoption, mapping SAP move to business outcomes.",unchanged @@ -419,7 +419,7 @@ https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/strategy/define https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/strategy/inform/cost-efficiency,Cost efficiency,Cost efficiency considerations for your cloud adoption strategy - Cloud Adoption Framework,Incorporate cost efficiency into cloud strategy,Learn how to optimize costs that are associated with cloud services and ensure that your organization gets the best value for its investment.,"Cost efficiency in your cloud adoption means that you effectively manage and optimize costs associated with cloud service usage. 
Cost efficiency involves strategic decisions to ensure your organization gets the best value for your investment and minimizes unnecessary expenses. Cloud adoption helps drive business growth, expand market share, and increase revenue. Implement cost efficiency to optimize your cloud adoption journey.",2026-03-02T23:08:00.000Z,concept-article,strategy,0.7,True,"Gives specific cost-efficiency considerations and decision guidance as part of cloud adoption strategy, tying financial outcomes to strategic choices. This aligns with strategy (business drivers, value, and cost outcomes) rather than governance or operations.",unchanged https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/strategy/inform/resiliency,Resiliency,Resiliency Considerations for Your Cloud Strategy - Cloud Adoption Framework,Plan resiliency as part of cloud strategy,Learn about the importance of resiliency and how to plan for the unexpected in your cloud adoption strategy.,"Resiliency is the ability of your infrastructure to maintain functionality and availability despite disruptions or failures. It's a cornerstone of any successful cloud adoption strategy. Design your cloud infrastructure with resiliency in mind to minimize the impact of disruptions. Doing so helps you maintain continuity and reliability in your business operations. Consider that the more tightly integrated your business is with your technology, the more important the resiliency of that technology",2026-03-02T23:08:00.000Z,concept-article,strategy,0.68,True,"Focuses on resiliency as a strategic consideration in cloud adoption, linking business continuity and technology dependency to strategic planning. 
It provides structured guidance on incorporating resiliency into the overall cloud strategy.",unchanged https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/strategy/inform/security,Security,Security Considerations for Your Cloud Strategy - Cloud Adoption Framework,Integrate security considerations into cloud strategy,Learn how to design your cloud infrastructure with security in mind to protect your data and applications from unauthorized access and data breaches.,"You need a well-designed security strategy for a successful cloud adoption. If your organization uses traditional on-premises environments, you should evaluate your cloud expertise and specifically focus on cloud security. To manage security in the cloud, you might need to significantly change your security team structure and overall security approach. Potential changes to your organization might introduce stress and conflict. For a successful cloud adoption, ensure that your management teams pr",2026-03-02T23:08:00.000Z,concept-article,strategy,0.66,True,"Covers how to design a security approach as part of cloud adoption strategy, including organizational changes and management alignment. It is framed as strategic security alignment rather than deep security operations or technical configuration, so it best fits strategy.",unchanged -https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/strategy/inform/sustainability,Sustainability,Sustainability Considerations in Your Cloud Strategy - Cloud Adoption Framework,Incorporate sustainability objectives into cloud strategy,Learn how to incorporate sustainability into your cloud strategy. Use Azure to drive your sustainability efforts.,"Sustainability is an increasingly important performance indicator for organizations. As sustainability becomes a crucial part of organizations' operations across industries, it adds pressure from new stakeholders, challenges existing profit pools, and creates opportunities for new profit pools. 
Organizations must respond effectively with new types of solutions and technology-enabled approaches. This transformation is good for business. Research from multiple sources indicates that sustainability",2026-04-13T18:03:00.000Z,concept-article,strategy,0.7,True,"The page is in the CAF Strategy/Inform section and provides concrete guidance on integrating sustainability into cloud strategy, mapping sustainability drivers to business outcomes and cloud usage. This aligns with the strategy sub-skill (business drivers, outcome definition, stakeholder considerations). It goes beyond generic marketing by giving prescriptive considerations and frameworks for including sustainability in cloud strategy, which qualifies as expert knowledge for the CAF domain.",updated +https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/strategy/inform/sustainability,Sustainability,Sustainability Considerations in Your Cloud Strategy - Cloud Adoption Framework,Incorporate sustainability objectives into cloud strategy,Learn how to incorporate sustainability into your cloud strategy. Use Azure to drive your sustainability efforts.,"Sustainability is an increasingly important performance indicator for organizations. As sustainability becomes a crucial part of organizations' operations across industries, it adds pressure from new stakeholders, challenges existing profit pools, and creates opportunities for new profit pools. Organizations must respond effectively with new types of solutions and technology-enabled approaches. This transformation is good for business. Research from multiple sources indicates that sustainability",2026-04-13T18:03:00.000Z,concept-article,strategy,0.7,True,"The page is in the CAF Strategy/Inform section and provides concrete guidance on integrating sustainability into cloud strategy, mapping sustainability drivers to business outcomes and cloud usage. 
This aligns with the strategy sub-skill (business drivers, outcome definition, stakeholder considerations). It goes beyond generic marketing by giving prescriptive considerations and frameworks for including sustainability in cloud strategy, which qualifies as expert knowledge for the CAF domain.",unchanged https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/strategy/motivations,"2. Determine motivations, mission and objectives","Determine Your Motivations, Mission, and Objectives - Cloud Adoption Framework","Define motivations, mission, and objectives for cloud","Learn how to use the Cloud Adoption Framework for Azure to understand the motivations, mission, and objectives behind cloud migration that can help produce more successful business outcomes.","The purpose of this article is to help you understand and define motivations, mission, and objectives for your cloud adoption strategy. These are crucial for IT decision-makers and executives because they help ensure alignment with strategic business objectives, maximize return on investment, and facilitate informed decision making.",2026-02-10T21:02:00.000Z,concept-article,strategy,0.8,True,"Gives structured guidance to elicit and formalize motivations and objectives for cloud adoption, directly supporting business driver identification and outcome mapping.",unchanged https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/strategy/prepare-organizational-alignment,4. Prepare your organization,Prepare for organizational alignment - Cloud Adoption Framework,Prepare organizational alignment for cloud adoption,"Learn how to prepare your organization for cloud adoption by aligning leadership, strategies, and operating models.","Organizational alignment is important to ensure a collective buy-in to the strategies you're planning to execute on. 
To ensure you have the required support throughout your cloud adoption journey, consider getting a few key stakeholders involved early on, and expand your leadership buy-in as you iterate on your strategy execution.",2025-02-03T18:03:00.000Z,concept-article,organization,0.7,True,"Focuses on leadership buy-in, stakeholder involvement, and operating model alignment; this is organizational change and alignment guidance rather than pure strategy or planning.",unchanged https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/whats-new,What's new,What's New in Microsoft's Cloud Adoption Framework - Cloud Adoption Framework,,Learn about recent updates to Microsoft's Cloud Adoption Framework.,"Microsoft's Cloud Adoption Framework is continuously updated with new guidance, enhanced content, and refined recommendations based on customer experiences and evolving cloud practices. This page helps you stay informed about the latest updates, new resources, deprecated content, and where to find specific information as the framework evolves.",2026-04-09T18:03:00.000Z,concept-article,,0.1,False,"Page is a change log/updates overview for the Cloud Adoption Framework, summarizing new and deprecated content and navigation pointers. 
It does not itself contain deep implementation guidance, methodologies, or detailed frameworks that match any sub-skill detection hints.",unchanged diff --git a/products/azure-cloud-adoption-framework/report.md b/products/azure-cloud-adoption-framework/report.md index 70ea3204..c19f5633 100644 --- a/products/azure-cloud-adoption-framework/report.md +++ b/products/azure-cloud-adoption-framework/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: scenarios: End-to-end adoption and landing zone guidance for specific Azure scenarios (AI agents, AKS, RHEL, ARO, AIS, AVD/Citrix, AVS, Oracle, SAP, analytics/data @@ -23,8 +23,8 @@ category_descriptions: ops, data/analytics, security, cost, DevOps, AI agents) needed to run and scale Azure cloud adoption. readiness: 'Designing and operating Azure landing zones: network topology, identity, - subscriptions, governance, automation/DevOps, multitenancy, and workload‑specific - patterns (SAP, AVS, analytics, Oracle, etc.).' + subscriptions, governance, automation/DevOps, and workload-specific setups (AVS, + SAP, analytics, Oracle, App Service, Container Apps).' security: Security design and governance for Azure landing zones, including Zero Trust, IAM, encryption, DevOps, AKS, analytics, SAP/Oracle, Arc, and ongoing security operations. @@ -32,11 +32,11 @@ skill_description: Expert guidance for planning and executing cloud adoption usi Azure Cloud Adoption Framework. Covers strategy, planning, readiness & landing zones, adoption patterns, governance, security, operations & management, organization & teams, and adoption scenarios. Use when designing Azure landing zones, AVS/VMware, - SAP/Oracle, AKS, or AI/analytics workloads, and other Azure Cloud Adoption Framework - related development tasks. + SAP/Oracle, AKS, or analytics/data mesh deployments, and other Azure Cloud Adoption + Framework related development tasks. 
use_when: Use when designing Azure landing zones, AVS/VMware, SAP/Oracle, AKS, or - AI/analytics workloads, and other Azure Cloud Adoption Framework related development - tasks. + analytics/data mesh deployments, and other Azure Cloud Adoption Framework related + development tasks. --- # Azure Cloud Adoption Framework Crawl Report @@ -74,10 +74,10 @@ use_when: Use when designing Azure landing zones, AVS/VMware, SAP/Oracle, AKS, o ### Updated Pages -- [Sustainability](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/strategy/inform/sustainability) - - Updated: 2025-02-03T18:03:00.000Z → 2026-04-13T18:03:00.000Z -- [Extend an SAP landing zone to support Power Platform](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/scenarios/sap/sap-power-platform-extend-landing-zone) - - Updated: 2024-11-05T18:01:00.000Z → 2026-04-16T22:04:00.000Z +- [Transition a single subscription with no management groups](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/align-scenario-single-subscription) + - Updated: 2025-12-19T21:03:00.000Z → 2026-04-23T18:04:00.000Z +- [Transition by duplicating a landing zone management group](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/align-approach-duplicate-brownfield-audit-only) + - Updated: 2025-12-19T21:03:00.000Z → 2026-04-23T18:04:00.000Z ## Classified Pages @@ -183,6 +183,7 @@ use_when: Use when designing Azure landing zones, AVS/VMware, SAP/Oracle, AKS, o | [Team topologies](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/considerations/devops-teams-topologies) | organization | 0.86 | Gives prescriptive guidance on team structures, platform vs application teams, and how to balance control and autonomy—clear organizational operating model and RACI-style content. 
| | [Technical planning](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/scenarios/azure-vmware/plan) | planning | 0.86 | Provides templates and concrete guidance to include Azure VMware Solution in the overall adoption plan, including backlog creation and skills planning—clear planning methodology. | | [Traditional Azure networking topology](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/azure-best-practices/traditional-azure-networking-topology) | readiness | 0.86 | Describes detailed design considerations and recommendations for traditional Azure network topologies, including diagrams and connectivity patterns, which are landing zone network design (readiness). | +| [Transition a single subscription with no management groups](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/align-scenario-single-subscription) | readiness | 0.86 | Provides detailed, scenario-specific guidance for transitioning an existing single-subscription Azure environment with no management groups into the Azure landing zone reference architecture, including concrete landing zone alignment considerations and migration steps. This is deep implementation guidance about landing zone architecture and environment preparation, which fits the readiness category and goes beyond generic conceptual content. | | [3. Monitor Azure](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/manage/monitor) | operations | 0.85 | Contains detailed guidance on planning, configuring, and optimizing monitoring across Azure, other clouds, on-premises, and edge, including concrete monitoring patterns and integration practices that are implementation-specific. | | [4. 
Manage agents](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ai-agents/integrate-manage-operate) | operations | 0.85 | Covers lifecycle management, integration into workflows, and operational excellence for AI agents, including security, compliance, and cost efficiency—operations and management focus. | | [Assess workloads](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/plan/assess-workloads-for-cloud-migration) | planning | 0.85 | Describes a structured assessment phase to collect architecture, performance, security, and dependency data before migration; fits migration planning and assessment. | @@ -248,7 +249,7 @@ use_when: Use when designing Azure landing zones, AVS/VMware, SAP/Oracle, AKS, o | [Single-region design that doesn't have Global Reach](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/scenarios/azure-vmware/single-region-virtual-wan-without-global-reach) | readiness | 0.84 | Network topology and traffic flow guidance for AVS with secure Virtual WAN but no Global Reach, including design trade-offs—implementation-level landing zone content. | | [Single-region design that has Global Reach](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/scenarios/azure-vmware/single-region-virtual-wan-global-reach) | readiness | 0.84 | Provides best-practice topology and traffic flow recommendations for AVS with secure Virtual WAN and ExpressRoute Global Reach in a single region. | | [Transition a regional organization environment](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/align-scenario-regional-org) | readiness | 0.84 | Covers a complex regional organization scenario with dev/test/prod management groups and how to transition to the landing zone architecture—expert multi-region, multi-environment readiness guidance. 
| -| [Transition a single subscription with no management groups](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/align-scenario-single-subscription) | readiness | 0.84 | Scenario article with concrete steps and considerations for moving from a single subscription with no management groups into the landing zone architecture—expert subscription and management group redesign. | +| [Transition by duplicating a landing zone management group](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/align-approach-duplicate-brownfield-audit-only) | readiness | 0.84 | Describes a specific technical approach to transition to the Azure landing zone reference architecture by duplicating a landing zone management group with Azure Policy set to audit-only, then assessing workload subscriptions for compliance. This is concrete landing zone and management group implementation guidance, clearly in the readiness domain and contains expert, scenario-driven details not captured by general training data. | | [Transition management groups](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/align-scenario-multiple-management-groups) | readiness | 0.84 | Provides prescriptive guidance for reworking management group hierarchies and subscriptions into the landing zone reference architecture—deep readiness and platform structure expertise. | | [Transition to the Azure landing zone reference architecture](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/enterprise-scale/transition) | readiness | 0.84 | Gives detailed recommendations for onboarding existing subscriptions, management groups, and connectivity into the Azure landing zone reference architecture—expert environment transition and landing zone design. 
| | [Governance and security](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ai-agents/governance-security-across-organization) | governance | 0.83 | Provides layered governance and security framework for AI agents, including data governance, compliance, and policy formation mapped to CAF governance concepts—specialized governance guidance. | @@ -277,7 +278,6 @@ use_when: Use when designing Azure landing zones, AVS/VMware, SAP/Oracle, AKS, o | [SAP data integration with Azure - Security](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/scenarios/sap/sap-lza-data-integration-security) | security | 0.82 | Provides multi-layer security architecture and component-level considerations for SAP data integration using Azure services—deep security posture guidance. | | [Scale with multiple subscriptions](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/azure-setup-guide/initial-subscriptions) | readiness | 0.82 | Gives prescriptive guidance on initial subscription creation and scaling patterns as the environment grows—core subscription design and readiness expertise. | | [Security governance and compliance](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/scenarios/azure-virtual-desktop/eslz-security-governance-and-compliance) | scenarios | 0.82 | Contains detailed, scenario-specific design considerations and recommended security, governance, and compliance controls for Azure Virtual Desktop landing zones. This is a workload-specific CAF adaptation rather than generic governance or security methodology, so it fits scenarios. 
| -| [Transition by duplicating a landing zone management group](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/align-approach-duplicate-brownfield-audit-only) | readiness | 0.82 | Describes a specific transition approach using duplicated management groups with audit-only policies to assess compliance—expert, nuanced landing zone migration pattern. | | [Update custom policies](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/design-area/update-custom-policies) | governance | 0.82 | Provides high-level but concrete process for updating already-deployed custom policies and initiatives, including tooling-specific paths (Terraform, Bicep), which is expert policy-governance lifecycle guidance. | | [1. Assess your cloud adoption strategy](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/strategy/assessment) | strategy | 0.80 | Describes a concrete assessment process for an existing cloud adoption strategy, including how to interpret maturity, identify gaps, and align with business goals. This is expert strategy-assessment methodology, not just conceptual overview or planning detail. | | [1. Define operating model](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/plan/prepare-organization-for-cloud) | planning | 0.80 | Describes converting strategy into a cloud adoption plan, including choosing operations approaches and distributing responsibilities; fits adoption planning guidance. 
| diff --git a/products/azure-cloud-services/azure-cloud-services.csv b/products/azure-cloud-services/azure-cloud-services.csv index 7dc96a41..b6523bd9 100644 --- a/products/azure-cloud-services/azure-cloud-services.csv +++ b/products/azure-cloud-services/azure-cloud-services.csv @@ -3,9 +3,9 @@ https://learn.microsoft.com/en-us/azure/cloud-services-extended-support/availabl https://learn.microsoft.com/en-us/azure/cloud-services-extended-support/certificates-and-key-vault,Certificates,Store and use certificates in Azure Cloud Services (extended support),Securely store and use certificates with Key Vault in Cloud Services,Processes for storing and using certificates in Azure Cloud Services (extended support),"Important As of March 31, 2025, cloud Services (extended support) is deprecated and will be fully retired on March 31, 2027.Learn moreabout this deprecation andhow to migrate. Key Vault is used to store certificates that are associated to Cloud Services (extended support). Key Vaults can be created through theAzure portalandPowerShell. Add the certificates to Key Vault, then reference the certificate thumbprints in Service Configuration file. You also need to enable Key Vault for appropriate per",2025-04-11T11:11:00.000Z,how-to,security,0.8,True,"Describes using Key Vault for certificates, referencing thumbprints in .cscfg, and enabling Key Vault permissions; product-specific security configuration.",unchanged https://learn.microsoft.com/en-us/azure/cloud-services-extended-support/cloud-services-guestos-family-1-retirement,Family 1 retirement notice,Guest OS family 1 retirement notice,Guest OS Family 1 retirement dates and deployment impact,Provides information about when the Azure Guest OS Family 1 retirement happened and how to determine if its retirement affects you.,"Important As of March 31, 2025, cloud Services (extended support) is deprecated and will be fully retired on March 31, 2027.Learn moreabout this deprecation andhow to migrate. 
The retirement of OS Family 1 was first announced on June 1, 2013. Sept 2, 2014: The Azure Guest operating system (Guest OS) Family 1.x, which is based on the Windows Server 2008 operating system, was officially retired. All attempts to deploy new services or upgrade existing services using Family 1 fail with an error messag",2025-04-08T11:16:00.000Z,concept-article,limits-quotas,0.7,True,Provides specific retirement dates and behavior (deploy/upgrade failures) for OS Family 1; time-based constraints on what can be deployed.,unchanged https://learn.microsoft.com/en-us/azure/cloud-services-extended-support/cloud-services-guestos-family-2-3-4-retirement,"Family 2, 3, 4 retirement notice","Guest OS family 2, 3, and 4 retirement notice",Guest OS Families 2–4 retirement timelines and impact,"Information about when the Azure Guest OS Family 2, 3, and 4 retirement happened and how to determine if their retirement affects you.","Important As of March 31, 2025, cloud Services (extended support) is deprecated and will be fully retired on March 31, 2027. Learn more about this deprecation and how to migrate. The retirement of Azure Guest OS Families 2, 3, and 4 was announced in July 2024, with the following end-of-life dates: If you have questions, visit the Microsoft question page for Cloud Services or contact Azure support.",2025-04-08T11:16:00.000Z,concept-article,limits-quotas,0.7,True,"Lists announced retirement dates for OS Families 2, 3, and 4; these are concrete time-based constraints affecting deployments.",unchanged
-https://learn.microsoft.com/en-us/azure/cloud-services-extended-support/cloud-services-guestos-microsoft-security-response-center-releases,Guest OS patches,List of updates applied to the Azure Guest OS,,This article lists the Microsoft Security Response Center updates applied to different Azure Guest OS. 
See if an update applies to your Guest OS.,"Important As of March 31, 2025, cloud Services (extended support) is deprecated and will be fully retired on March 31, 2027. Learn more about this deprecation and how to migrate. The following tables show the Microsoft Security Response Center (MSRC) updates applied to the Azure Guest OS. Search this article to determine if a particular update applies to your Guest OS. Updates always carry forward for the particular family they were introduced in.",2026-04-14T08:00:00.000Z,concept-article,,0.0,False,"Primarily a listing of MSRC updates applied to Azure Guest OS images; it’s version/update history, not limits, configuration parameters, error codes, or decision matrices. No numeric limits, quotas, config tables, or troubleshooting mappings that fit the defined sub-skill types.",updated +https://learn.microsoft.com/en-us/azure/cloud-services-extended-support/cloud-services-guestos-microsoft-security-response-center-releases,Guest OS patches,List of updates applied to the Azure Guest OS,,This article lists the Microsoft Security Response Center updates applied to different Azure Guest OS. See if an update applies to your Guest OS.,"Important As of March 31, 2025, cloud Services (extended support) is deprecated and will be fully retired on March 31, 2027. Learn more about this deprecation and how to migrate. The following tables show the Microsoft Security Response Center (MSRC) updates applied to the Azure Guest OS. Search this article to determine if a particular update applies to your Guest OS. Updates always carry forward for the particular family they were introduced in.",2026-04-14T08:00:00.000Z,concept-article,,0.0,False,"Primarily a listing of MSRC updates applied to Azure Guest OS images; it’s version/update history, not limits, configuration parameters, error codes, or decision matrices. 
No numeric limits, quotas, config tables, or troubleshooting mappings that fit the defined sub-skill types.",unchanged https://learn.microsoft.com/en-us/azure/cloud-services-extended-support/cloud-services-guestos-retirement-policy,Retirement policy,Supportability and retirement policy guide for Azure Guest OS,Understand support and retirement policy for Azure Guest OS,Provides information about what Microsoft supports regarding the Azure Guest OS used by Cloud Services.,"Important As of March 31, 2025, cloud Services (extended support) is deprecated and will be fully retired on March 31, 2027. Learn more about this deprecation and how to migrate. The information on this page relates to the Azure Guest operating system (Guest OS) for Cloud Services worker and web roles (PaaS). It doesn't apply to Virtual Machines (IaaS). Microsoft has a published support policy for the Guest OS. This page describes how the policy is implemented. The policy is: At times, more than two",2025-05-09T08:00:00.000Z,concept-article,security,0.65,True,Details supportability and retirement policy implementation for Guest OS families; includes policy thresholds and constraints relevant to secure operations.,unchanged
-https://learn.microsoft.com/en-us/azure/cloud-services-extended-support/cloud-services-guestos-update-matrix,Guest OS release news,Learn about the latest Azure Guest OS Releases,Plan Azure Cloud Services Guest OS upgrade path,The latest release news and SDK compatibility for Azure Cloud Services Guest OS.,"Important As of March 31, 2025, cloud Services (extended support) is deprecated and will be fully retired on March 31, 2027. Learn more about this deprecation and how to migrate. Provides you with up-to-date information about the latest Azure Guest OS releases for Cloud Services. This information helps you plan your upgrade path before a Guest OS is disabled. 
If you configure your roles to use automatic Guest OS updates as described in Azure Guest OS Update Settings, it isn't vital that you read this ",2026-04-14T08:00:00.000Z,concept-article,decision-making,0.7,True,"The page provides a detailed Guest OS update matrix and SDK compatibility information for Azure Cloud Services (extended support), including which Guest OS versions are current, deprecated, or disabled and their corresponding SDK support. This is product- and version-specific data that changes over time and isn't inferable from general training. It directly supports upgrade and migration decisions based on specific OS/SDK combinations, fitting the decision-making category better than others.",updated +https://learn.microsoft.com/en-us/azure/cloud-services-extended-support/cloud-services-guestos-update-matrix,Guest OS release news,Learn about the latest Azure Guest OS Releases,Plan Azure Cloud Services Guest OS upgrade path,The latest release news and SDK compatibility for Azure Cloud Services Guest OS.,"Important As of March 31, 2025, cloud Services (extended support) is deprecated and will be fully retired on March 31, 2027. Learn more about this deprecation and how to migrate. Provides you with up-to-date information about the latest Azure Guest OS releases for Cloud Services. This information helps you plan your upgrade path before a Guest OS is disabled. If you configure your roles to use automatic Guest OS updates as described in Azure Guest OS Update Settings, it isn't vital that you read this ",2026-04-14T08:00:00.000Z,concept-article,decision-making,0.7,True,"The page provides a detailed Guest OS update matrix and SDK compatibility information for Azure Cloud Services (extended support), including which Guest OS versions are current, deprecated, or disabled and their corresponding SDK support. This is product- and version-specific data that changes over time and isn't inferable from general training. 
It directly supports upgrade and migration decisions based on specific OS/SDK combinations, fitting the decision-making category better than others.",unchanged https://learn.microsoft.com/en-us/azure/cloud-services-extended-support/cloud-services-model-and-package,Config files and packaging,What is the Azure Cloud Service (extended support) model and package,"Understand Cloud Services model, config, and package files","Describes the cloud service (extended support) model (.csdef, .cscfg) and package (.cspkg) in Azure","Important As of March 31, 2025, cloud Services (extended support) is deprecated and will be fully retired on March 31, 2027. Learn more about this deprecation and how to migrate. A cloud service is created from three components, the service definition (.csdef), the service config (.cscfg), and a service package (.cspkg). Both the ServiceDefinition.csdef and ServiceConfig.cscfg files are XML-based and describe the structure of the cloud service and its configuration. We collectively call these files the mo",2024-07-24T08:00:00.000Z,concept-article,configuration,0.7,True,"Describes .csdef, .cscfg, and .cspkg structure; these schema/model details are product-specific configuration knowledge beyond generic LLM training.",unchanged https://learn.microsoft.com/en-us/azure/cloud-services-extended-support/configure-scaling,Configure scaling,Configure scaling for Azure Cloud Services (extended support),Configure autoscaling rules for Cloud Services deployments,How to enable scaling options for Azure Cloud Services (extended support),"Important As of March 31, 2025, cloud Services (extended support) is deprecated and will be fully retired on March 31, 2027. Learn more about this deprecation and how to migrate. Conditions can be configured to enable Cloud Services (extended support) deployments to scale in and out. These conditions can be based on CPU usage, disk load, and network load. 
Consider the following information when configuring scaling of your Cloud Service deployments:",2025-04-08T11:16:00.000Z,how-to,best-practices,0.7,True,"Provides considerations and conditions for scaling based on CPU, disk, and network; includes product-specific guidance and likely recommended thresholds.",unchanged https://learn.microsoft.com/en-us/azure/cloud-services-extended-support/deploy-portal,Deploy the cloud service - Portal,Deploy Azure Cloud Services (extended support) - Azure portal,,Deploy Azure Cloud Services (extended support) by using the Azure portal.,"Important As of March 31, 2025, cloud Services (extended support) is deprecated and will be fully retired on March 31, 2027. Learn more about this deprecation and how to migrate. This article shows you how to use the Azure portal to create an Azure Cloud Services (extended support) deployment.",2025-04-08T11:16:00.000Z,quickstart,,0.3,False,Portal deployment walkthrough; primarily step-by-step UI instructions without deployment matrices or tier-specific constraints.,unchanged diff --git a/products/azure-cloud-services/report.md b/products/azure-cloud-services/report.md index 7306c961..71c6265b 100644 --- a/products/azure-cloud-services/report.md +++ b/products/azure-cloud-services/report.md @@ -46,8 +46,8 @@ confusable_not_for: Not for Azure Networking (use azure-networking), Azure Virtu ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 2 -- **Unchanged**: 43 +- **Updated Pages**: 0 +- **Unchanged**: 45 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-cloud-services/azure-cloud-services.csv` @@ -66,13 +66,6 @@ confusable_not_for: Not for Azure Networking (use azure-networking), Azure Virtu ## Changes -### Updated Pages - -- [Guest OS release news](https://learn.microsoft.com/en-us/azure/cloud-services-extended-support/cloud-services-guestos-update-matrix) - - Updated: 2026-02-25T08:00:00.000Z → 2026-04-14T08:00:00.000Z -- [Guest OS 
patches](https://learn.microsoft.com/en-us/azure/cloud-services-extended-support/cloud-services-guestos-microsoft-security-response-center-releases) - - Updated: 2026-02-26T08:00:00.000Z → 2026-04-14T08:00:00.000Z - ## Classified Pages | TOC Title | Type | Confidence | Reason | diff --git a/products/azure-cognitive-search/azure-cognitive-search.csv b/products/azure-cognitive-search/azure-cognitive-search.csv index d2ddb3c4..e8a950b7 100644 --- a/products/azure-cognitive-search/azure-cognitive-search.csv +++ b/products/azure-cognitive-search/azure-cognitive-search.csv @@ -1,24 +1,24 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type -https://learn.microsoft.com/en-us/azure/search/agentic-knowledge-source-how-to-blob,Azure blob,Create a Blob Knowledge Source for Agentic Retrieval - Azure AI Search,Configure blob knowledge sources for agentic retrieval,"A blob knowledge source specifies a blob container that you want to read from. It also includes models and properties for creating an indexer, data source, skillset, and index used for agentic retriev","Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. 
Use a blob knowledge source to index and query Azure blob content in an agentic retrieval pipeline. Knowledge sources are created independently, referenced in a knowledge base, and used a",2026-04-15T06:03:00.000Z,how-to,configuration,0.7,True,"Describes models and properties for creating indexer, data source, skillset, and index from Blob; this typically includes parameter names, allowed values, and JSON schemas unique to Azure AI Search agentic retrieval.",updated -https://learn.microsoft.com/en-us/azure/search/agentic-knowledge-source-how-to-onelake,OneLake,Create an Indexed OneLake Knowledge Source for Agentic Retrieval - Azure AI Search,Set up indexed OneLake knowledge sources in Azure AI Search,"Learn how to create an indexed OneLake knowledge source in Azure AI Search. An indexed OneLake knowledge source specifies a lakehouse, models, and properties that create an enrichment pipeline for age","Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. 
Use an indexed OneLake knowledge source to index and query Microsoft OneLake files in an agentic retrieval pipeline. Knowledge sources are created independently, referenced in a knowledge",2026-04-15T06:03:00.000Z,how-to,configuration,0.7,True,Covers specifying a lakehouse and enrichment pipeline models/properties; likely contains detailed configuration fields and structures specific to OneLake integration as a knowledge source.,updated -https://learn.microsoft.com/en-us/azure/search/agentic-knowledge-source-how-to-search-index,Search index,Create a Search Index Knowledge Source - Azure AI Search,Configure search index knowledge sources for agentic retrieval,"Learn how to create a search index knowledge source, which specifies an index used by a knowledge base for agentic retrieval workloads.","Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. 
A search index knowledge source specifies a connection to an Azure AI Search index that provides searchable content in an agentic retrieval pipeline. Knowledge sources are created indepe",2026-04-15T06:03:00.000Z,how-to,configuration,0.68,True,"How-to page for creating a search index knowledge source; likely includes specific request schemas, property names, and configuration parameters for Azure AI Search knowledge bases, which are product-specific configuration details beyond generic LLM knowledge.",updated -https://learn.microsoft.com/en-us/azure/search/agentic-knowledge-source-how-to-sharepoint-indexed,SharePoint,Create a SharePoint (Indexed) Knowledge Source - Azure AI Search,Configure indexed SharePoint knowledge sources for Azure AI Search,"Learn how to create an indexed SharePoint knowledge source, which ingests content from SharePoint sites into a searchable index on Azure AI Search.","Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. 
Use an indexed SharePoint knowledge source to index and query SharePoint content in an agentic retrieval pipeline. Knowledge sources are created independently, referenced in a knowledge b",2026-04-15T06:03:00.000Z,how-to,configuration,0.7,True,"Ingests SharePoint content into a searchable index; such pages usually define data source, indexer, and index configuration parameters and schemas that are product-specific.",updated -https://learn.microsoft.com/en-us/azure/search/agentic-knowledge-source-how-to-sharepoint-remote,SharePoint,Create a SharePoint (Remote) Knowledge Source - Azure AI Search,Configure remote SharePoint knowledge sources via Copilot Retrieval API,"Learn how to create a remote SharePoint knowledge source, which tells an agentic retrieval engine in Azure AI Search to query SharePoint sites directly.","Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. A remote SharePoint knowledge source uses the Copilot Retrieval API to query textual content directly from SharePoint in Microsoft 365. No search index or connection string is needed. 
On",2026-04-15T06:03:00.000Z,how-to,configuration,0.7,True,"Describes using Copilot Retrieval API to query SharePoint directly; likely includes specific API parameters, request shapes, and configuration options unique to this remote knowledge source type.",updated -https://learn.microsoft.com/en-us/azure/search/agentic-knowledge-source-how-to-web,Web,Create Web Knowledge Source for Agentic Retrieval - Azure AI Search,Create and configure Web Knowledge Sources using Bing grounding,Learn how to create a Web Knowledge Source resource for agentic retrieval workloads in Azure AI Search.,"Important Web Knowledge Source, which uses Grounding with Bing Search and/or Grounding with Bing Custom Search, is a First Party Consumption Service governed by the Grounding with Bing terms of use and the Microsoft Privacy Statement. The Microsoft Data Protection Addendum doesn't apply to data sent to Web Knowledge Source. When Customer uses Web Knowledge Source, Customer Data flows outside the Azure compliance and Geo boundary. This also means use of Web Knowledge Source waives all elevated Governmen",2026-04-15T06:03:00.000Z,how-to,configuration,0.72,True,"Covers Web Knowledge Source using Grounding with Bing/Bing Custom Search; beyond legal notes, such pages typically define resource configuration fields, options, and constraints specific to this feature.",updated +https://learn.microsoft.com/en-us/azure/search/agentic-knowledge-source-how-to-blob,Azure blob,Create a Blob Knowledge Source for Agentic Retrieval - Azure AI Search,Configure blob knowledge sources for agentic retrieval,"A blob knowledge source specifies a blob container that you want to read from. It also includes models and properties for creating an indexer, data source, skillset, and index used for agentic retriev","Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. 
Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Use a blob knowledge source to index and query Azure blob content in an agentic retrieval pipeline. Knowledge sources are created independently, referenced in a knowledge base, and used a",2026-04-15T06:03:00.000Z,how-to,configuration,0.7,True,"Describes models and properties for creating indexer, data source, skillset, and index from Blob; this typically includes parameter names, allowed values, and JSON schemas unique to Azure AI Search agentic retrieval.",unchanged +https://learn.microsoft.com/en-us/azure/search/agentic-knowledge-source-how-to-onelake,OneLake,Create an Indexed OneLake Knowledge Source for Agentic Retrieval - Azure AI Search,Set up indexed OneLake knowledge sources in Azure AI Search,"Learn how to create an indexed OneLake knowledge source in Azure AI Search. An indexed OneLake knowledge source specifies a lakehouse, models, and properties that create an enrichment pipeline for age","Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. 
Use an indexed OneLake knowledge source to index and query Microsoft OneLake files in an agentic retrieval pipeline. Knowledge sources are created independently, referenced in a knowledge",2026-04-15T06:03:00.000Z,how-to,configuration,0.7,True,Covers specifying a lakehouse and enrichment pipeline models/properties; likely contains detailed configuration fields and structures specific to OneLake integration as a knowledge source.,unchanged +https://learn.microsoft.com/en-us/azure/search/agentic-knowledge-source-how-to-search-index,Search index,Create a Search Index Knowledge Source - Azure AI Search,Configure search index knowledge sources for agentic retrieval,"Learn how to create a search index knowledge source, which specifies an index used by a knowledge base for agentic retrieval workloads.","Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. 
A search index knowledge source specifies a connection to an Azure AI Search index that provides searchable content in an agentic retrieval pipeline. Knowledge sources are created indepe",2026-04-15T06:03:00.000Z,how-to,configuration,0.68,True,"How-to page for creating a search index knowledge source; likely includes specific request schemas, property names, and configuration parameters for Azure AI Search knowledge bases, which are product-specific configuration details beyond generic LLM knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/search/agentic-knowledge-source-how-to-sharepoint-indexed,SharePoint,Create a SharePoint (Indexed) Knowledge Source - Azure AI Search,Configure indexed SharePoint knowledge sources for Azure AI Search,"Learn how to create an indexed SharePoint knowledge source, which ingests content from SharePoint sites into a searchable index on Azure AI Search.","Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. 
Use an indexed SharePoint knowledge source to index and query SharePoint content in an agentic retrieval pipeline. Knowledge sources are created independently, referenced in a knowledge b",2026-04-15T06:03:00.000Z,how-to,configuration,0.7,True,"Ingests SharePoint content into a searchable index; such pages usually define data source, indexer, and index configuration parameters and schemas that are product-specific.",unchanged +https://learn.microsoft.com/en-us/azure/search/agentic-knowledge-source-how-to-sharepoint-remote,SharePoint,Create a SharePoint (Remote) Knowledge Source - Azure AI Search,Configure remote SharePoint knowledge sources via Copilot Retrieval API,"Learn how to create a remote SharePoint knowledge source, which tells an agentic retrieval engine in Azure AI Search to query SharePoint sites directly.","Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. A remote SharePoint knowledge source uses the Copilot Retrieval API to query textual content directly from SharePoint in Microsoft 365. No search index or connection string is needed. 
On",2026-04-15T06:03:00.000Z,how-to,configuration,0.7,True,"Describes using Copilot Retrieval API to query SharePoint directly; likely includes specific API parameters, request shapes, and configuration options unique to this remote knowledge source type.",unchanged +https://learn.microsoft.com/en-us/azure/search/agentic-knowledge-source-how-to-web,Web,Create Web Knowledge Source for Agentic Retrieval - Azure AI Search,Create and configure Web Knowledge Sources using Bing grounding,Learn how to create a Web Knowledge Source resource for agentic retrieval workloads in Azure AI Search.,"Important Web Knowledge Source, which uses Grounding with Bing Search and/or Grounding with Bing Custom Search, is a First Party Consumption Service governed by the Grounding with Bing terms of use and the Microsoft Privacy Statement. The Microsoft Data Protection Addendum doesn't apply to data sent to Web Knowledge Source. When Customer uses Web Knowledge Source, Customer Data flows outside the Azure compliance and Geo boundary. This also means use of Web Knowledge Source waives all elevated Governmen",2026-04-15T06:03:00.000Z,how-to,configuration,0.72,True,"Covers Web Knowledge Source using Grounding with Bing/Bing Custom Search; beyond legal notes, such pages typically define resource configuration fields, options, and constraints specific to this feature.",unchanged https://learn.microsoft.com/en-us/azure/search/agentic-knowledge-source-how-to-web-manage,Manage web access,Enable or Disable Access to Web Knowledge Source - Azure AI Search,Enable or disable Web Knowledge Source in Azure subscription,"Learn how to enable or disable the use of Web Knowledge Source in your Azure subscription. 
By default, access is enabled, but you can disable or re-enable access using the Azure CLI.","Important Web Knowledge Source, which uses Grounding with Bing Search and/or Grounding with Bing Custom Search, is a First Party Consumption Service governed by the Grounding with Bing terms of use and the Microsoft Privacy Statement. The Microsoft Data Protection Addendum doesn't apply to data sent to Web Knowledge Source. When Customer uses Web Knowledge Source, Customer Data flows outside the Azure compliance and Geo boundary. This also means use of Web Knowledge Source waives all elevated Governmen",2026-03-25T08:00:00.000Z,how-to,configuration,0.7,True,"Describes enabling/disabling via Azure CLI; likely lists exact CLI commands, flags, and subscription-level settings that are product-specific configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/search/agentic-knowledge-source-overview,What is a knowledge source?,What is a Knowledge Source? - Azure AI Search,Define knowledge source objects for agentic retrieval,Learn about the knowledge source object used for agentic retrieval workloads in Azure AI Search.,"Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. A knowledge source specifies the content used for agentic retrieval. 
It either encapsulates a search index populated by external data, or it's a direct connection to a remote target ",2026-02-02T08:00:00.000Z,concept-article,configuration,0.7,True,"Describes the schema and properties of a knowledge source object, including allowed types and fields; these are concrete configuration details for the service.",unchanged https://learn.microsoft.com/en-us/azure/search/agentic-retrieval-how-to-answer-synthesis,Enable answer synthesis,Enable Answer Synthesis - Azure AI Search,Enable answer synthesis in Azure AI Search knowledge bases,"Learn how to enable answer synthesis in a knowledge base or retrieve request in Azure AI Search. At query time, the knowledge base uses your deployed LLM to produce natural-language answers with citat","Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. By default, a knowledge base in Azure AI Search performs data extraction, which returns raw grounding chunks from your knowledge sources. Data extraction is useful for retrieving specif",2026-03-25T08:00:00.000Z,how-to,configuration,0.65,True,"Explains how to turn on answer synthesis and configure behavior; likely includes request parameters, configuration fields, and options for citations and extraction vs synthesis.",unchanged https://learn.microsoft.com/en-us/azure/search/agentic-retrieval-how-to-create-index,Create an index for agentic retrieval,Create an Index for Agentic Retrieval - Azure AI Search,Define Azure AI Search index for agentic retrieval,Create an index that has fields and configurations that work for agentic retrieval workloads in Azure AI Search.,"Note This feature is currently in public preview. 
This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. In Azure AI Search, agentic retrieval uses context and user questions to generate a range of subqueries that can execute against your content in a knowledge source. A knowledge source",2026-03-25T08:00:00.000Z,how-to,configuration,0.65,True,"Describes fields and configurations for agentic retrieval indexes; likely includes concrete index schema patterns, field attributes, and configuration values specific to this workload.",unchanged -https://learn.microsoft.com/en-us/azure/search/agentic-retrieval-how-to-create-knowledge-base,Create a knowledge base,Create a Knowledge Base - Azure AI Search,Define and configure knowledge bases for agentic retrieval,Learn how to create a knowledge base for agentic retrieval workloads in Azure AI Search.,"Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. In Azure AI Search, a knowledge base is a top-level object that orchestrates agentic retrieval. 
It defines which knowledge sources to query and the default behavior for retrieval operat",2026-04-15T06:03:00.000Z,how-to,configuration,0.7,True,"Explains how to create a knowledge base object that orchestrates agentic retrieval; likely includes JSON schema, property names, and allowed values for knowledge base configuration.",updated -https://learn.microsoft.com/en-us/azure/search/agentic-retrieval-how-to-create-pipeline,Build an end-to-end retrieval solution,Tutorial: Build an Agentic Retrieval Solution - Azure AI Search,,Build an agentic retrieval solution that connects Azure AI Search to Foundry Agent Service via MCP. Follow this tutorial to create a knowledge base and agent.,"Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Learn how to create an intelligent, MCP-enabled solution that integrates Azure AI Search with Foundry Agent Service for agentic retrieval. You can use this architecture for conversati",2026-04-15T08:00:00.000Z,tutorial,,0.4,False,"Tutorial for building an end-to-end agentic retrieval solution; likely step-by-step and architectural, but not primarily a catalog of configuration options, limits, or error mappings as required by the expert-knowledge categories.",updated +https://learn.microsoft.com/en-us/azure/search/agentic-retrieval-how-to-create-knowledge-base,Create a knowledge base,Create a Knowledge Base - Azure AI Search,Define and configure knowledge bases for agentic retrieval,Learn how to create a knowledge base for agentic retrieval workloads in Azure AI Search.,"Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. 
Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. In Azure AI Search, a knowledge base is a top-level object that orchestrates agentic retrieval. It defines which knowledge sources to query and the default behavior for retrieval operat",2026-04-15T06:03:00.000Z,how-to,configuration,0.7,True,"Explains how to create a knowledge base object that orchestrates agentic retrieval; likely includes JSON schema, property names, and allowed values for knowledge base configuration.",unchanged +https://learn.microsoft.com/en-us/azure/search/agentic-retrieval-how-to-create-pipeline,Build an end-to-end retrieval solution,Tutorial: Build an Agentic Retrieval Solution - Azure AI Search,,Build an agentic retrieval solution that connects Azure AI Search to Foundry Agent Service via MCP. Follow this tutorial to create a knowledge base and agent.,"Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Learn how to create an intelligent, MCP-enabled solution that integrates Azure AI Search with Foundry Agent Service for agentic retrieval. 
You can use this architecture for conversati",2026-04-15T08:00:00.000Z,tutorial,,0.4,False,"Tutorial for building an end-to-end agentic retrieval solution; likely step-by-step and architectural, but not primarily a catalog of configuration options, limits, or error mappings as required by the expert-knowledge categories.",unchanged https://learn.microsoft.com/en-us/azure/search/agentic-retrieval-how-to-migrate,Migrate agentic retrieval code,Migrate Agentic Retrieval Code - Azure AI Search,Migrate Azure AI Search agentic retrieval code between REST API versions,Learn how to migrate your agentic retrieval code to the latest REST API version. This article focuses on breaking changes and backwards compatibility.,"Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. If you wrote agentic retrieval code using an early preview REST API, this article explains when and how to migrate to a newer version. It also describes breaking and nonbreaking change",2026-03-25T08:00:00.000Z,how-to,decision-making,0.6,True,"Focuses on breaking vs nonbreaking changes and when/how to migrate; likely includes version-specific behavior differences and guidance on choosing migration paths, which is decision-oriented expert knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/search/agentic-retrieval-how-to-retrieve,Query a knowledge base (APIs or MCP),Query Knowledge Base via APIs or MCP - Azure AI Search,"Query Azure AI Search knowledge bases via REST, SDKs, and MCP","Learn how to query a knowledge base using the retrieve action or MCP endpoint in Azure AI Search using REST APIs, Azure SDKs, or any MCP-compatible client.","Note This feature is currently in public preview. 
This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. In an agentic retrieval pipeline, the retrieve action invokes parallel query processing from a knowledge base. You can call the retrieve action directly using the Search Service REST A",2026-04-16T11:04:00.000Z,how-to,integrations,0.7,True,"Focuses on calling the retrieve action and MCP endpoint; such API pages usually list request/response parameters, action names, and endpoint-specific options that are product-specific integration details.",updated +https://learn.microsoft.com/en-us/azure/search/agentic-retrieval-how-to-retrieve,Query a knowledge base (APIs or MCP),Query Knowledge Base via APIs or MCP - Azure AI Search,"Query Azure AI Search knowledge bases via REST, SDKs, and MCP","Learn how to query a knowledge base using the retrieve action or MCP endpoint in Azure AI Search using REST APIs, Azure SDKs, or any MCP-compatible client.","Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. In an agentic retrieval pipeline, the retrieve action invokes parallel query processing from a knowledge base. 
You can call the retrieve action directly using the Search Service REST A",2026-04-16T11:04:00.000Z,how-to,integrations,0.7,True,"Focuses on calling the retrieve action and MCP endpoint; such API pages usually list request/response parameters, action names, and endpoint-specific options that are product-specific integration details.",unchanged https://learn.microsoft.com/en-us/azure/search/agentic-retrieval-how-to-set-retrieval-reasoning-effort,Set retrieval reasoning effort,Set the Retrieval Reasoning Effort - Azure AI Search,Set retrievalReasoningEffort for agentic retrieval workloads,Learn how to set the level of LLM processing for agentic retrieval in Azure AI Search.,"Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. In agentic retrieval, you can specify the level of large language model (LLM) processing for query planning and answer formulation. Use the retrievalReasoningEffort property to set LLM",2026-02-12T23:10:00.000Z,how-to,configuration,0.78,True,Describes the retrievalReasoningEffort property and its allowed values to control LLM processing; concrete configuration parameter unique to this feature.,unchanged https://learn.microsoft.com/en-us/azure/search/agentic-retrieval-overview,What is agentic retrieval?,Agentic Retrieval Overview - Azure AI Search,,"Learn about agentic retrieval in Azure AI Search, a pipeline that uses LLMs to decompose complex queries into subqueries for better RAG and agent workflows.","Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. 
For more information, see Supplemental Terms of Use for Microsoft Azure Previews. In Azure AI Search, agentic retrieval is a multi-query pipeline designed for complex questions posed by users or agents in chat and copilot apps. It's intended for retrieval-augmented g",2026-03-11T08:00:00.000Z,concept-article,,0.2,False,"High-level conceptual overview of agentic retrieval in Azure AI Search; summary indicates architecture description and intended use but no evidence of numeric limits, configuration tables, error codes, or decision matrices with quantified trade-offs.",unchanged https://learn.microsoft.com/en-us/azure/search/chat-completion-skill-example-usage,Add AI-generated content (GenAI Prompt skill),Use language models for content generation in an ingestion pipeline - Azure AI Search,Use Chat Completion skill for image captioning in Azure AI Search,Use language models to caption your images and facilitate an image search through your data.,"In this article, learn how to generate captions using AI enrichment and a skillset. Images often contain useful information that's relevant in search scenarios. You can vectorize images to represent visual content in your search index. Or, you can use AI enrichment and skillsets to create and extract searchable text from images. The Chat Completion skill (preview) can generate a description of each image in your data source, and the indexer pushes that description into a search index. To view the desc",2026-02-28T06:12:00.000Z,how-to,integrations,0.65,True,"Describes concrete usage of the Chat Completion skill within an Azure AI Search ingestion pipeline, including product-specific skill configuration and how outputs are mapped into the search index. 
This is an integration/coding pattern between Azure AI Search indexers, AI enrichment, and language models, beyond generic LLM knowledge.",unchanged https://learn.microsoft.com/en-us/azure/search/cognitive-search-aml-skill,Azure Machine Learning (AML),Custom AML Skill in Skillsets - Azure AI Search,Configure AML custom skill in Azure AI Search,Learn how to extend the capabilities of Azure AI Search skillsets with Microsoft Foundry or Azure Machine Learning models.,"Important Support for indexer connections to the model catalog is in public preview under supplemental terms of use. Preview REST APIs support this capability. Use the AML skill to extend AI enrichment with a deployed base embedding model from the Microsoft Foundry model catalog or a custom Azure Machine Learning (AML) model. Your data is processed in the Geo where your model is deployed. You specify the AML skill in a skillset, which then integrates your deployed model into an AI enrichment pipeline. Th",2025-12-11T23:00:00.000Z,concept-article,integrations,0.7,True,"Describes the AML skill as a callable component in skillsets, including model catalog/AML endpoint integration and JSON contract details that are product-specific integration patterns.",unchanged https://learn.microsoft.com/en-us/azure/search/cognitive-search-attach-cognitive-services,Attach a billable resource,Attach Resource to Skillset for Billing - Azure AI Search,Attach Foundry resource and understand AI enrichment quotas,Learn how to attach a Microsoft Foundry resource to an AI enrichment pipeline for billing purposes in Azure AI Search.,"If you're using built-in skills for AI enrichment in Azure AI Search, you can enrich a small number of documents for free, up to 20 transactions per index per day. For larger or more frequent workloads, you should attach a billable Microsoft Foundry resource to your skillset. 
Azure AI Search uses dedicated, internally hosted resources to execute built-in skills backed by Foundry Tools and requires a Foundry resource solely for billing purposes. The exception is the Azure Content Understanding skill,",2025-11-18T15:37:00.000Z,how-to,limits-quotas,0.82,True,Specifies free quota (20 transactions per index per day) and when to attach a billable Foundry resource; explicit numeric limits and billing behavior.,unchanged -https://learn.microsoft.com/en-us/azure/search/cognitive-search-common-errors-warnings,Troubleshoot errors and warnings,Indexer errors and warnings - Azure AI Search,Resolve common Azure AI Search indexer errors and warnings,This article provides information and solutions to common errors and warnings you might encounter during AI enrichment in Azure AI Search.,"This article provides information and solutions to common errors and warnings you might encounter during indexing and AI enrichment in Azure AI Search. Indexing stops when the error count exceeds 'maxFailedItems'. If you want indexers to ignore these errors (and skip over ""failed documents""), consider updating the maxFailedItems and maxFailedItemsPerBatch as described here. Note Each failed document along with its document key (when available) will show up as an error in the indexer execution status. 
",2025-10-14T08:00:00.000Z,error-reference,troubleshooting,0.86,True,"Maps specific indexer error/warning messages and codes to causes and solutions, including use of maxFailedItems and related settings.",unchanged +https://learn.microsoft.com/en-us/azure/search/cognitive-search-common-errors-warnings,Troubleshoot errors and warnings,Indexer Errors and Warnings - Azure AI Search,Troubleshoot Azure AI Search indexer errors and warnings,This article provides information and solutions to common errors and warnings you might encounter during AI enrichment in Azure AI Search.,"This article provides information and solutions to common errors and warnings you might encounter during indexing and AI enrichment in Azure AI Search. Indexing stops when the error count exceeds 'maxFailedItems'. If you want indexers to ignore these errors (and skip over ""failed documents""), consider updating the maxFailedItems and maxFailedItemsPerBatch as described here. Note Each failed document along with its document key (when available) will show up as an error in the indexer execution status. ",2026-04-23T08:00:00.000Z,error-reference,troubleshooting,0.86,True,"Article is explicitly about common indexer errors and warnings in Azure AI Search, including parameters like maxFailedItems and maxFailedItemsPerBatch, and maps symptoms (errors/warnings) to causes and resolutions. 
This is product-specific troubleshooting content with configuration-related error handling guidance.",updated https://learn.microsoft.com/en-us/azure/search/cognitive-search-concept-annotations-syntax,Reference a skill output,Reference enriched nodes during skillset execution - Azure AI Search,Use annotation syntax to reference enriched nodes in skillsets,Explains the annotation syntax and how to reference inputs and outputs of a skillset in an AI enrichment pipeline in Azure AI Search.,"During skillset execution, the engine builds an in-memoryenrichment treethat captures each enrichment, such as recognized entities or translated text. In this article, learn how to reference an enrichment node in the enrichment tree so that you can pass output to downstream skills or specify an output field mapping for a search index field. This article uses examples to illustrate various scenarios. For the full syntax, seeSkill context and input annotation language.",2025-05-27T08:00:00.000Z,concept-article,configuration,0.82,True,"Details the enrichment tree, annotation language, and path syntax for referencing skill inputs/outputs; highly product-specific configuration language.",unchanged https://learn.microsoft.com/en-us/azure/search/cognitive-search-concept-image-scenarios,Process image files,Extract text from images by using AI enrichment - Azure AI Search,,"Use Optical Character Recognition (OCR) and image analysis to extract text, layout, captions, and tags from image files in Azure AI Search pipelines.","Images often contain useful information that's relevant in search scenarios. Azure AI Search doesn't query image content in real time, but you can extract information about an image during indexing and make that content searchable. To represent image content in a search index, you can use these approaches: You can also create acustom skillto invoke any external image processing that you want to provide. 
This article focuses on image analysis and OCR, custom skills that provide external processin",2026-02-27T08:00:00.000Z,how-to,,0.3,False,"Describes how to use OCR and image analysis in Azure AI Search pipelines at a conceptual and scenario level. It does not appear to provide detailed configuration tables, limits, or error-code-based troubleshooting that would qualify as expert knowledge under the given categories.",unchanged https://learn.microsoft.com/en-us/azure/search/cognitive-search-concept-intro,What is AI enrichment?,AI enrichment concepts - Azure AI Search,,"Content extraction, natural language processing (NLP), and image processing can create searchable content in Azure AI Search indexes.","In Azure AI Search,AI enrichmentrefers to integration withFoundry Toolsto process content that isn't searchable in its raw form. Through enrichment, analysis and inference are used to create searchable content and structure where none previously existed. Because Azure AI Search is used for text and vector queries, the purpose of AI enrichment is to improve the utility of your content in search-related scenarios. Raw content must be text or images (you can't enrich vectors), but the output of an ",2025-11-18T15:37:00.000Z,concept-article,,0.3,False,"Conceptual overview of AI enrichment capabilities and purpose; lacks detailed configuration tables, limits, or error mappings.",unchanged @@ -41,7 +41,7 @@ https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-content-un https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-custom-entity-lookup,Custom Entity Lookup,Custom Entity Lookup skill - Azure AI Search,Configure Custom Entity Lookup skill parameters,Extract different custom entities from text in an Azure AI Search enrichment pipeline.,"TheCustom Entity Lookupskill is used to detect or recognize entities that you define. During skillset execution, the skill looks for text from a custom, user-defined list of words and phrases. 
The skill uses this list to label any matching entities found within source documents. The skill also supports a degree of fuzzy matching that can be applied to find matches that are similar but not exact. Note This skill isn't bound to a Foundry Tools API but requires a Foundry Tools key to allow more tha",2026-01-07T08:00:00.000Z,reference,configuration,0.7,True,"Custom Entity Lookup skill docs typically define skill JSON schema, inputs/outputs, and parameters (including fuzzy matching options) that are specific configuration details for this product.",unchanged https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-deprecated,Deprecated skills,Deprecated Cognitive Skills - Azure AI Search,Migrate from deprecated Azure AI Search skills,This page contains a list of cognitive skills that are considered deprecated and won't be supported moving forward.,"This document describes cognitive skills that are considered deprecated (retired). Use the following guide for the contents: If you're using theMicrosoft.Skills.Text.EntityRecognitionSkill(Entity Recognition cognitive skill (v2)), this article helps you upgrade your skillset to use theMicrosoft.Skills.Text.V3.EntityRecognitionSkillwhich is generally available and introduces new features. If you're using theMicrosoft.Skills.Text.SentimentSkill(Sentiment cognitive skill (v2)), this article helps y",2026-01-07T08:00:00.000Z,reference,decision-making,0.7,True,"Provides upgrade guidance from deprecated skills to v3 equivalents, including which replacements to choose and how to migrate, which is product-specific decision and migration guidance.",unchanged https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-document-extraction,Document Extraction,Document Extraction cognitive skill - Azure AI Search,Control Document Extraction behavior in skillsets,Extracts content from a file within the enrichment pipeline.,"TheDocument Extractionskill extracts content from a file in theenrichment pipeline. 
By default, content extraction or retrieval is built into the enrichment pipeline. However, by using the Document Extraction skill, you can control how parameters are set, and how extracted content is named in the enrichment tree. Forvectorandmultimodal search, Document Extraction combined with theText Split skillis more affordable than otherdata chunking approaches. TheMultimodal tutorialdemonstrates this scenar",2026-01-07T08:00:00.000Z,reference,configuration,0.7,True,"Document Extraction skill docs describe parameters controlling extraction behavior and naming in the enrichment tree, which are concrete configuration options for this product.",unchanged -https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-document-intelligence-layout,Document Layout,Document Layout Skill - Azure AI Search,Configure Document Layout skill for Azure AI Search enrichment,Analyze a document to extract regions of interest and their inter-relationships to produce a syntactical representation (markdown format) in an enrichment pipeline in Azure AI Search.,"TheDocument Layoutskill uses thelayout modelfrom Azure Document Intelligence in Foundry Tools to analyze a document, detect its structure and characteristics, and produce a syntactical representation in Markdown or text format. This skill supports text and image extraction, the latter of which includes location metadata that preserves image position within a document. Image proximity to related content is beneficial in retrieval-augmented generation (RAG) andmultimodal searchscenarios. 
For trans",2026-01-20T18:13:00.000Z,reference,configuration,0.82,True,"Describes how the Document Layout skill works, supported formats, and configuration options; product-specific skill configuration.",unchanged +https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-document-intelligence-layout,Document Layout,Document Layout Skill - Azure AI Search,Use Document Layout skill in Azure AI Search enrichments,Analyze a document to extract regions of interest and their inter-relationships to produce a syntactical representation (markdown format) in an enrichment pipeline in Azure AI Search.,"TheDocument Layoutskill uses thelayout modelfrom Azure Document Intelligence in Foundry Tools to analyze a document, detect its structure and characteristics, and produce a syntactical representation in Markdown or text format. This skill supports text and image extraction, the latter of which includes location metadata that preserves image position within a document. Image proximity to related content is beneficial in retrieval-augmented generation (RAG) andmultimodal searchscenarios. For trans",2026-04-22T08:00:00.000Z,reference,integrations,0.68,True,"The page describes a specific cognitive skill (Document Layout) within Azure AI Search, including how it uses the Azure Document Intelligence layout model and how it outputs markdown/text with location metadata for RAG and multimodal search. 
These are product-specific integration details between Azure AI Search and Document Intelligence that go beyond generic knowledge, fitting the integrations & coding patterns category best.",updated https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-entity-linking-v3,Entity Linking (v3),Entity Linking cognitive skill (v3) - Azure AI Search,Configure Entity Linking v3 skill in Azure AI Search,Extract different linked entities from text in an enrichment pipeline in Azure AI Search.,TheEntity Linkingskill (v3) returns a list of recognized entities with links to articles in a well-known knowledge base (Wikipedia). Note This skill is bound to theEntity Linkingmachine learning models inAzure Vision in Foundry Tools. It requires abillable resourcefor transactions that exceed 20 documents per indexer per day. Execution of built-in skills is charged at the existingFoundry Tools Standard price.,2026-01-07T08:00:00.000Z,reference,configuration,0.8,True,"Skill reference including binding to Foundry Tools, billing requirements, and parameters; detailed configuration for this enrichment skill.",unchanged https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-entity-recognition,Entity Recognition (v2),Entity Recognition cognitive skill (v2) - Azure AI Search,Configure Entity Recognition skill v2 in skillsets,Extract different types of entities from text in an enrichment pipeline in Azure AI Search.,"TheEntity Recognitionskill (v2) extracts entities of different types from text. This skill uses the machine learning models provided byText Analyticsin Foundry Tools. Important The Entity Recognition skill (v2) (Microsoft.Skills.Text.EntityRecognitionSkill) is now discontinued and replaced byMicrosoft.Skills.Text.V3.EntityRecognitionSkill. Follow the recommendations inDeprecated skillsto migrate to a supported skill. 
Note As you expand scope by increasing the frequency of processing, adding more",2026-01-07T08:00:00.000Z,reference,configuration,0.65,True,"Covers the v2 Entity Recognition skill, including supported entity types and skill JSON configuration tied to Text Analytics, which are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-entity-recognition-v3,Entity Recognition (v3),Entity Recognition cognitive skill (v3) - Azure AI Search,Configure Entity Recognition v3 skill in Azure AI Search,Extract different types of entities using the machine learning models of Azure Language in Foundry Tools in an AI enrichment pipeline in Azure AI Search.,"TheEntity Recognitionskill (v3) extracts entities of different types from text. These entities fall under 14 distinct categories, ranging from people and organizations to URLs and phone numbers. This skill uses theNamed Entity Recognitionmachine learning models provided byAzure Language in Foundry Tools. Note This skill is bound to Foundry Tools and requiresa billable resourcefor transactions that exceed 20 documents per indexer per day. Execution of built-in skills is charged at the existingFou",2026-01-07T08:00:00.000Z,reference,configuration,0.8,True,"Documents entity categories, model binding, and billing/resource requirements; product-specific skill configuration details.",unchanged @@ -83,7 +83,7 @@ https://learn.microsoft.com/en-us/azure/search/knowledge-store-projection-shape, https://learn.microsoft.com/en-us/azure/search/knowledge-store-projections-examples,Define a projection,Define projections - Azure AI Search,Define knowledge store projections in Azure AI Search,"Learn how to define table, object, and file projections in a knowledge store by reviewing syntax and examples.","Note Knowledge storesare secondary storage that exists in Azure Storage and contain the outputs of Azure AI Search skillsets. 
They're separate from knowledge sources and knowledge bases, which are used inagentic retrievalworkflows. Projectionsare the component of aknowledge store definitionthat determines how AI enriched content is stored in Azure Storage. Projections determine the type, quantity, and composition of the data structures containing your content. In this article, learn the syntax f",2025-10-21T08:00:00.000Z,concept-article,integrations,0.65,True,"Covers syntax and examples for table, object, and file projections; this is product-specific configuration of how enrichment outputs integrate with Azure Storage, with concrete schema and parameter usage.",unchanged https://learn.microsoft.com/en-us/azure/search/monitor-azure-cognitive-search,Monitor data,Monitor Azure AI Search,Set up monitoring for Azure AI Search with Azure Monitor,Start here to learn how to monitor Azure AI Search.,"This article describes: Note If you're already familiar with this service and/or Azure Monitor and just want to know how to analyze monitoring data, see theAnalyzesection near the end of this article. When you have critical applications and business processes that rely on Azure resources, you need to monitor and get alerts for your system. The Azure Monitor service collects and aggregates metrics and logs from every component of your system. Azure Monitor provides you with a view of availability",2025-07-25T08:00:00.000Z,concept-article,configuration,0.65,True,"Describes how Azure AI Search integrates with Azure Monitor, including which metrics/logs are emitted and how to configure collection. 
Product-specific monitoring configuration.",unchanged https://learn.microsoft.com/en-us/azure/search/monitor-azure-cognitive-search-data-reference,Monitoring data reference,Monitoring data reference for Azure AI Search,Reference monitoring metrics and logs for Azure AI Search,This article contains important reference material you need when you monitor Azure AI Search.,This article contains all the monitoring reference information for this service. SeeMonitor Azure AI Searchfor details on the data you can collect for Azure AI Search and how to use it.,2025-07-25T08:00:00.000Z,concept-article,configuration,0.7,True,"Monitoring data reference typically enumerates specific metrics, dimensions, and log categories for this product, which are configuration/telemetry details not derivable from general knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/search/multimodal-search-overview,What is multimodal search?,Multimodal Search Concepts and Guidance - Azure AI Search,,"Learn what multimodal search is, how Azure AI Search supports it for text and image content, and where to find detailed concepts, tutorials, and samples.","Multimodal search refers to the ability to ingest, understand, and retrieve information across multiple content types, including text, images, video, and audio. In Azure AI Search, multimodal search natively supports the ingestion of documents containing text and images and the retrieval of their content, enabling you to perform searches that combine both modalities. Building a robust multimodal pipeline typically involves: Extracting inline images and page text from documents. Describing images",2026-03-25T08:00:00.000Z,concept-article,,0.1,False,"The multimodal search overview focuses on concepts (what multimodal search is, typical pipeline steps, and high-level guidance). 
It does not appear to contain product-specific configuration parameters, limits, or troubleshooting mappings, so it does not qualify as expert knowledge under the defined sub-skill types.",unchanged +https://learn.microsoft.com/en-us/azure/search/multimodal-search-overview,What is multimodal search?,Multimodal Search Concepts and Guidance - Azure AI Search,,"Learn what multimodal search is, how Azure AI Search supports it for text and image content, and where to find detailed concepts, tutorials, and samples.","Multimodal search refers to the ability to ingest, understand, and retrieve information across multiple content types, including text, images, video, and audio. In Azure AI Search, multimodal search natively supports the ingestion of documents containing text and images and the retrieval of their content, enabling you to perform searches that combine both modalities. Building a robust multimodal pipeline typically involves: Extracting inline images and page text from documents. Describing images",2026-04-20T08:00:00.000Z,concept-article,,0.2,False,"Page is a conceptual overview of multimodal search in Azure AI Search, describing what it is and high-level pipeline steps. It does not include numeric limits, configuration tables, error codes, or product-specific decision matrices or best-practice details.",updated https://learn.microsoft.com/en-us/azure/search/policy-reference,Azure Policy built-ins,Built-in Policy Definitions - Azure AI Search,Use built-in Azure Policy definitions for Azure AI Search,Lists Azure Policy built-in policy definitions for Azure AI Search. These built-in policy definitions provide common approaches to managing your Azure resources.,"This page is an index ofAzure Policybuilt-in policy definitions for Azure AI Search. For additional Azure Policy built-ins for other services, seeAzure Policy built-in definitions. The name of each built-in policy definition links to the policy definition in the Azure portal. 
Use the link in theVersioncolumn to view the source on theAzure Policy GitHub repo.",2026-03-25T08:00:00.000Z,concept-article,security,0.7,True,"Lists concrete built-in Azure Policy definitions specific to Azure AI Search, including their exact policy names and links to full definitions. These are product-specific security/governance configurations (policy assignments and effects) that go beyond generic knowledge and map directly to Azure RBAC/compliance controls.",unchanged https://learn.microsoft.com/en-us/azure/search/query-lucene-syntax,Full Lucene query syntax,Lucene Query Syntax - Azure AI Search,Use Lucene query syntax with Azure AI Search,"Reference for the full Lucene query syntax, as used in Azure AI Search for wildcard, fuzzy search, RegEx, and other advanced query constructs.","When creating queries in Azure AI Search, you can opt for the fullLucene Query Parsersyntax for specialized query forms: wildcard, fuzzy search, proximity search, regular expressions. Much of the Lucene Query Parser syntax isimplemented intact in Azure AI Search, except forrange searches, which are constructed through$filterexpressions. 
To use full Lucene syntax, set the queryType tofulland pass in a query expression patterned for wildcard, fuzzy search, or one of the other query forms supported",2026-02-19T08:00:00.000Z,concept-article,integrations,0.7,True,Documents exactly which Lucene constructs are supported/modified in this service and how to configure queryType and expressions—product-specific query API behavior.,unchanged https://learn.microsoft.com/en-us/azure/search/query-odata-filter-orderby-syntax,Overview,OData Language Overview - Azure AI Search,Construct OData expressions for Azure AI Search queries,"OData language overview for filters, select, and order-by for Azure AI Search keyword search.","This article provides an overview of the OData expression language used in$filter,$order-by, and$selectexpressions for keyword search in Azure AI Search over numeric and string (nonvector) fields. The language is presented ""bottom-up"" starting with the most basic elements. The OData expressions that you can construct in a query request range from simple to highly complex, but they all share common elements. Shared elements include: Once you understand these common concepts, you can continue with",2026-02-19T08:00:00.000Z,concept-article,integrations,0.72,True,"Service-specific OData language overview for this API; defines supported constructs and how they map into query parameters, which is integration-focused syntax knowledge.",unchanged @@ -116,7 +116,7 @@ https://learn.microsoft.com/en-us/azure/search/search-dotnet-sdk-migration-versi https://learn.microsoft.com/en-us/azure/search/search-explorer,Query with Search Explorer,Quickstart: Search Explorer Query Tool - Azure AI Search,,"Search explorer is a query tool in the Azure portal that sends query requests to a search index in Azure AI Search. 
Use it to learn syntax, test query expressions, or inspect a search document.","In this quickstart, you learn how to useSearch explorer, a built-in query tool in the Azure portal for running queries against an Azure AI Search index. Use this tool to test a query or filter expression or to confirm whether content exists in the index. This quickstart uses an existing index to demonstrate Search explorer.",2025-12-04T08:00:00.000Z,quickstart,,0.25,False,Quickstart for using Search Explorer UI; mostly procedural steps without deep config matrices or error mappings.,unchanged https://learn.microsoft.com/en-us/azure/search/search-faceted-navigation,Add faceted navigation,Add facets to a query - Azure AI Search,,Add faceted navigation for self-directed navigation in applications that integrate with Azure AI Search.,"Faceted navigation is used for self-directed filtering on query results in a search app, where your application offers form controls for scoping search to groups of documents (for example, categories or brands), and Azure AI Search provides the data structures and filters to back the experience. In this article, learn the steps for returning a faceted navigation structure in Azure AI Search. Once you're familiar with basic concepts and clients, continue toFacet examplesfor syntax about various u",2025-11-05T08:00:00.000Z,how-to,,0.35,False,How-to for adding facets; examples of query parameters but not a comprehensive configuration reference or troubleshooting guide.,unchanged https://learn.microsoft.com/en-us/azure/search/search-faceted-navigation-examples,Sample facets,Faceted navigation examples - Azure AI Search,,"Examples that demonstrate query syntax for facet hierarchies, distinct counts, facet aggregations, and facet filters.","This section extendsfaceted navigation configurationwith examples that demonstrate basic usage and other scenarios. Facetable fields are defined in an index, but facet parameters and expressions are defined in query requests. 
If you have an index with facetable fields, you can try new preview features likefacet hierarchies,facet aggregations, andfacet filterson existing indexes.",2025-11-18T15:37:00.000Z,how-to,,0.45,False,Facet examples article; mostly syntax samples rather than structured config tables or decision/troubleshooting content.,unchanged -https://learn.microsoft.com/en-us/azure/search/search-faq-frequently-asked-questions,FAQ,Azure AI Search FAQ - Azure AI Search,Review Azure AI Search limits and behaviors from FAQ,"Get answers to common questions about Azure AI Search, a cloud-hosted search service on Microsoft Azure.",Find answers to commonly asked questions about Azure AI Search.,2026-04-15T22:09:00Z,faq,limits-quotas,0.78,True,"The FAQ includes product-specific expert details such as the maximum number of indexes, documents, and fields per service, shard/partition limits, and other numeric constraints that are not derivable from general knowledge. These are concrete service limits and quotas, matching the limits-quotas criteria.",updated +https://learn.microsoft.com/en-us/azure/search/search-faq-frequently-asked-questions,FAQ,Azure AI Search FAQ - Azure AI Search,Review Azure AI Search limits and behaviors from FAQ,"Get answers to common questions about Azure AI Search, a cloud-hosted search service on Microsoft Azure.",Find answers to commonly asked questions about Azure AI Search.,2026-04-15T22:09:00Z,faq,limits-quotas,0.78,True,"The FAQ includes product-specific expert details such as the maximum number of indexes, documents, and fields per service, shard/partition limits, and other numeric constraints that are not derivable from general knowledge. 
These are concrete service limits and quotas, matching the limits-quotas criteria.",unchanged https://learn.microsoft.com/en-us/azure/search/search-features-list,Features,Feature descriptions - Azure AI Search,,Explore the feature categories of Azure AI Search.,"Azure AI Search provides information retrieval and uses optional AI integration to extract more value from text and vector content. The following table summarizes features by category. There's feature parity in all Azure public, private, and sovereign clouds, but some features aren't supported inspecific regionsorspecific tiers. Note Looking for preview features? See thepreview features list.",2026-01-26T23:15:00.000Z,concept-article,,0.3,False,"Feature list/overview; describes capabilities and categories but not detailed configuration tables, limits, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/search/search-file-storage-integration,Files,Azure Files indexer (preview) - Azure AI Search,Index Azure Files shares with Azure AI Search (preview),Set up an Azure Files indexer to automate indexing of file shares in Azure AI Search.,"Important Azure Files indexer is currently in public preview underSupplemental Terms of Use. Use apreview REST APIto create the indexer data source. In this article, learn how to configure anindexerthat imports content from Azure Files and makes it searchable in Azure AI Search. Inputs to the indexer are your files in a single share. Output is a search index with searchable content and metadata stored in individual fields. 
To configure and run the indexer, you can use:",2026-01-23T08:00:00.000Z,how-to,integrations,0.75,True,Preview Azure Files indexer requires specific preview REST APIs and configuration steps unique to this integration.,unchanged https://learn.microsoft.com/en-us/azure/search/search-filters,What is a filter?,Text query filters - Azure AI Search,,Apply filter criteria to include or exclude content before text query execution in Azure AI Search.,"Afilterprovides value-based criteria for including or excluding content before query execution for keyword search, or before or after query execution for vector search. Filters are applied to nonvector fields, but can be used in vector search if documents include nonvector fields. For example, for indexes organized around chunked content, you might have parent-level fields or metadata fields that can be filtered. This article explains filtering for keyword search. For more information about vect",2025-03-11T08:00:00.000Z,concept-article,,0.35,False,"Explains filter usage conceptually; likely examples but no numeric thresholds, config tables, or error mappings.",unchanged @@ -154,7 +154,7 @@ https://learn.microsoft.com/en-us/azure/search/search-how-to-index-cosmosdb-sql, https://learn.microsoft.com/en-us/azure/search/search-how-to-index-logic-apps,Create a workflow,Connect to Azure Logic Apps - Azure AI Search,Use Azure Logic Apps workflows for automated indexing,Use an Azure Logic Apps workflow for automated indexing in Azure AI Search.,"In Azure AI Search, you can use theImport data (new)wizardin the Azure portal to create a logic app workflow that indexes and vectorizes your content. This capability is equivalent to anindexerand data source that generates an indexing pipeline and creates searchable content. After you create a workflow in the wizard, you can manage the workflow in Azure Logic Apps alongside your other workflows. 
Behind the scenes, the wizard follows a workflow template that pulls in (ingests) content from a sou",2026-01-23T08:00:00.000Z,how-to,integrations,0.7,True,"Describes Logic Apps-based ingestion pipeline template, including how it maps to indexer-like behavior and configuration in the Import data wizard.",unchanged https://learn.microsoft.com/en-us/azure/search/search-how-to-index-mysql,Azure DB for MySQL,Azure DB for MySQL (preview) - Azure AI Search,Index Azure Database for MySQL with Azure AI Search (preview),Learn how to set up a search indexer to index data stored in Azure Database for MySQL for full text search in Azure AI Search.,"Important MySQL support is currently in public preview underSupplemental Terms of Use. We recommend the latest preview API. There is currently no portal support. In this article, learn how to configure anindexerthat imports content from Azure Database for MySQL and makes it searchable in Azure AI Search. Inputs to the indexer are rows from a single table or view. Output is a search index with searchable content in individual fields. This article supplementsCreate an indexerwith information that'",2025-10-09T05:03:00.000Z,how-to,integrations,0.8,True,"MySQL indexer preview requires specific preview APIs and configuration for tables/views, which is detailed integration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/search/search-how-to-index-onelake-files,OneLake,OneLake indexer - Azure AI Search,Configure OneLake files indexer for Azure AI Search,Set up a OneLake indexer to automate indexing of content and metadata from Microsoft OneLake files and shortcuts.,"In this article, learn how to configure a OneLake files indexer for extracting searchable data and metadata data from alakehouseon top ofMicrosoft OneLake. 
To configure and run the indexer, you can use: This article uses the REST APIs to illustrate each step.",2026-02-17T08:00:00.000Z,how-to,integrations,0.7,True,"Product-specific configuration of a OneLake indexer, including REST-based setup and parameters for extracting data and metadata from lakehouses.",unchanged -https://learn.microsoft.com/en-us/azure/search/search-how-to-index-sharepoint-online,SharePoint in Microsoft 365,SharePoint in Microsoft 365 Indexer - Azure AI Search,Configure SharePoint Online indexer for Azure AI Search,Set up a SharePoint in Microsoft 365 indexer to automate indexing of document library content in Azure AI Search.,"Important The SharePoint in Microsoft 365 indexer is in public preview. It's offered ""as-is"" underSupplemental Terms of Useand supported on a best-effort basis only. Preview features aren't recommended for production workloads and aren't guaranteed to become generally available. Before you proceed, review theknown limitations. Fill out this formto register for the preview. All requests are approved automatically. After you fill out the form, use apreview REST APIto index your content. This artic",2026-04-16T08:00:00.000Z,how-to,integrations,0.68,True,"The article describes a preview SharePoint Online indexer with product-specific REST API usage and configuration details unique to Azure AI Search and SharePoint integration. It goes beyond a simple tutorial by covering how to set up and use the indexer with Azure Search, which is specialized integration knowledge not generally known from training.",updated +https://learn.microsoft.com/en-us/azure/search/search-how-to-index-sharepoint-online,SharePoint in Microsoft 365,SharePoint in Microsoft 365 Indexer - Azure AI Search,Configure SharePoint Online indexer for Azure AI Search,Set up a SharePoint in Microsoft 365 indexer to automate indexing of document library content in Azure AI Search.,"Important The SharePoint in Microsoft 365 indexer is in public preview. 
It's offered ""as-is"" underSupplemental Terms of Useand supported on a best-effort basis only. Preview features aren't recommended for production workloads and aren't guaranteed to become generally available. Before you proceed, review theknown limitations. Fill out this formto register for the preview. All requests are approved automatically. After you fill out the form, use apreview REST APIto index your content. This artic",2026-04-20T08:00:00.000Z,how-to,configuration,0.68,True,"How-to article for setting up the SharePoint in Microsoft 365 indexer, which typically includes product-specific configuration parameters (indexer definition fields, data source settings, authentication details, and preview REST API specifics). These are concrete, service-specific configuration details rather than conceptual guidance.",updated https://learn.microsoft.com/en-us/azure/search/search-how-to-index-sql-database,Azure SQL Databases,Azure SQL indexer - Azure AI Search,Configure Azure SQL indexer for Azure AI Search,Set up a search indexer to index tables in Azure SQL Database for vector and full text search in Azure AI Search.,"In this article, learn how to configure anindexerthat imports content from Azure SQL Database or an Azure SQL managed instance and makes it searchable in Azure AI Search. This article supplementsCreate an indexerwith information that's specific to Azure SQL. It uses the Azure portal and REST APIs to demonstrate a three-part workflow common to all indexers: create a data source, create an index, create an indexer. Data extraction occurs when you submit the Create Indexer request. 
This article als",2025-11-21T08:00:00.000Z,how-to,integrations,0.72,True,"How-to for configuring Azure SQL/MI as a data source and indexer, including REST schema and product-specific parameters for indexers and data sources.",unchanged https://learn.microsoft.com/en-us/azure/search/search-how-to-index-sql-managed-instance,Azure SQL Managed Instances,Indexer connection to SQL Managed Instances - Azure AI Search,Secure indexer connections to Azure SQL Managed Instance,Enable public endpoint to allow connections to SQL Managed Instances from an indexer on Azure AI Search.,"Indexers in Azure AI Search connect to external data sources over a public endpoint. If you're setting up an Azure SQL indexer for a connection to a SQL managed instance, follow the steps in this article to ensure the public endpoint is set up correctly. Alternatively, for private connections, create a shared private link instead. Note Always Encrypted columns are not currently supported by Azure AI Search indexers.",2025-07-11T08:00:00.000Z,how-to,security,0.68,True,Covers enabling public endpoint and networking specifics for SQL MI so Azure AI Search indexers can connect; product-specific connectivity and security configuration.,unchanged https://learn.microsoft.com/en-us/azure/search/search-how-to-index-sql-managed-instance-with-managed-identity,SQL Managed Instance,Connect to Azure SQL Managed Instance Using a Managed Identity - Azure AI Search,Connect Azure SQL Managed Instance with managed identity,Learn how to set up an Azure AI Search indexer connection to an Azure SQL Managed Instance using a managed identity.,"This article describes how to set up an Azure AI Search indexer connection to SQL Managed Instance using a managed identity instead of providing credentials in the connection string. You can use a system-assigned managed identity or a user-assigned managed identity (preview). 
Managed identities are Microsoft Entra logins and require Azure role assignments to access data in SQL Managed Instance. Before you learn more about this feature, you should understand what an indexer is and how to set up an ",2026-01-21T08:00:00.000Z,how-to,security,0.78,True,"Covers configuring Azure AI Search indexers to SQL Managed Instance using managed identities, including required roles and connection setup specifics that are product- and feature-specific.",unchanged @@ -162,7 +162,7 @@ https://learn.microsoft.com/en-us/azure/search/search-how-to-index-sql-server,Az https://learn.microsoft.com/en-us/azure/search/search-how-to-integrated-vectorization,Use integrated vectorization,Integrated Vectorization Using REST APIs - Azure AI Search,Use integrated vectorization with Azure AI Search REST APIs,Learn how to use skills to automate data chunking and vectorization during indexing and query execution.,"In this article, you learn how to use a skillset to chunk and vectorize content from a supported data source. The skillset calls the Text Split skill or Document Layout skill for chunking and an embedding skill that's attached to a supported embedding model for chunk vectorization. You also learn how to store the chunked and vectorized content in a vector index. This article describes the end-to-end workflow for integrated vectorization using REST. 
For portal-based instructions, see Quickstart: Vectorize t",2026-01-16T08:00:00.000Z,how-to,integrations,0.75,True,"Describes skillset configuration, embedding skills, and REST parameters for integrated vectorization—detailed integration and config patterns.",unchanged https://learn.microsoft.com/en-us/azure/search/search-how-to-large-index,Import large data sets,Index large data sets for full text search - Azure AI Search,Optimize large-scale indexing in Azure AI Search,"Learn about strategies for large data indexing or computationally intensive indexing through batch mode, resourcing, and scheduled, parallel, and distributed indexing.","If you need to index large or complex data sets in your search solution, this article explores strategies to accommodate long-running processes on Azure AI Search. These strategies assume familiarity with the two basic approaches for importing data: pushing data into an index, or pulling in data from a supported data source using a search indexer. If your scenario involves computationally intensive AI enrichment, then indexers are required, given the skillset dependency on indexers. This article comple",2025-10-06T08:00:00.000Z,concept-article,best-practices,0.7,True,"Provides strategies for batch, scheduled, parallel, and distributed indexing, including product-specific performance considerations.",unchanged https://learn.microsoft.com/en-us/azure/search/search-how-to-load-search-index,Import data,Load an index - Azure AI Search,Load and refresh data into Azure AI Search indexes,"Import and refresh data in a search index using the Azure portal, REST APIs, or an Azure SDK.","This article explains how to import documents into a predefined search index using REST APIs, Azure SDKs, or the Azure portal. 
Tip For the fastest path to loading data, use the Import data wizard in the Azure portal, which creates an index and loads it in one workflow.",2026-01-20T23:09:00.000Z,how-to,configuration,0.6,True,"Explains document import via REST, SDKs, and portal with product-specific operations and payload structures.",unchanged -https://learn.microsoft.com/en-us/azure/search/search-how-to-manage-index,Manage an index,Index Management - Azure AI Search,,"Learn how to manage indexes in Azure AI Search. Operations include viewing all indexes on your search service, checking index-specific statistics and definitions, and deleting indexes.","After youcreate an index, you can use theAzure portalto access its statistics and definition or remove it from your search service. After youcreate an index, you can use theAzure AI Search REST APIsto access its statistics and definition or remove it from your search service. After youcreate an index, you can use the Azure SDK for.NET,Java,JavaScript, orPythonto access its statistics and definition or remove it from your search service. This article describes how to manage an index without affec",2026-04-15T06:03:00.000Z,how-to,,0.3,False,"Page appears to describe how to view and manage Azure AI Search indexes via portal, REST, and SDKs, but the summary does not indicate presence of specific limits, quotas, configuration parameter tables, error-code-based troubleshooting, or other product-specific expert details. It reads more like a how-to/tutorial for basic index management operations rather than reference-style expert knowledge.",updated +https://learn.microsoft.com/en-us/azure/search/search-how-to-manage-index,Manage an index,Index Management - Azure AI Search,,"Learn how to manage indexes in Azure AI Search. 
Operations include viewing all indexes on your search service, checking index-specific statistics and definitions, and deleting indexes.","After you create an index, you can use the Azure portal to access its statistics and definition or remove it from your search service. After you create an index, you can use the Azure AI Search REST APIs to access its statistics and definition or remove it from your search service. After you create an index, you can use the Azure SDK for .NET, Java, JavaScript, or Python to access its statistics and definition or remove it from your search service. This article describes how to manage an index without affec",2026-04-15T06:03:00.000Z,how-to,,0.3,False,"Page appears to describe how to view and manage Azure AI Search indexes via portal, REST, and SDKs, but the summary does not indicate presence of specific limits, quotas, configuration parameter tables, error-code-based troubleshooting, or other product-specific expert details. It reads more like a how-to/tutorial for basic index management operations rather than reference-style expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/search/search-how-to-managed-identities,Configure a managed identity,Configure a Managed Identity - Azure AI Search,Configure managed identities for Azure AI Search outbound connections,"Create a managed identity for your search service, and use Microsoft Entra authentication and role-based access controls for connections to other cloud services.","You can use Microsoft Entra ID security principals and role assignments for outbound connections from Azure AI Search to other Azure resources providing data, applied AI, or vectorization during indexing or queries. To use roles on an outbound connection, first configure your search service to use either a system-assigned or user-assigned managed identity as the security principal for your search service in a Microsoft Entra tenant. 
After you have a managed identity, you can assign roles for autho",2026-03-25T08:00:00.000Z,how-to,security,0.9,True,"Describes using system- and user-assigned managed identities, plus role assignments for outbound connections; includes specific identity configuration steps and role usage patterns.",unchanged https://learn.microsoft.com/en-us/azure/search/search-how-to-semantic-chunking,Chunk and vectorize by document layout,Chunk and Vectorize by Document Layout - Azure AI Search,Chunk and vectorize content by document layout,"Chunk textual content by headings and semantically coherent fragments, generate embeddings, and send the results to a searchable index.","Text data chunking strategies play a key role in optimizing RAG responses and performance. By using the Document Layout skill, you can chunk content based on document structure, capturing headings and chunking the content body based on semantic coherence, such as paragraphs and sentences. Chunks are processed independently. Because LLMs work with multiple chunks, when those chunks are of higher quality and semantically coherent, the overall relevance of the query is improved. The Document Layout s",2026-01-21T23:12:00.000Z,how-to,best-practices,0.7,True,"Describes using the Document Layout skill with specific chunking strategies for RAG, which are product-specific optimization patterns.",unchanged https://learn.microsoft.com/en-us/azure/search/search-how-to-upgrade,Upgrade a service,Service Upgrade in the Azure Portal - Azure AI Search,Upgrade Azure AI Search service capacity in portal,Learn how to upgrade your existing Azure AI Search service to high-capacity storage and processors in your region.,"An upgrade brings older search services to the capabilities of new services created in the same region. Specifically, it upgrades the computing power of the underlying service. This one-time operation doesn't introduce breaking changes to your application, and you shouldn't need to change any code. 
For eligible services, an upgrade increases the partition storage and vector index size on the same pricing tier at no extra cost. This article describes how to upgrade your service in the Azure portal. Alt",2026-03-25T08:00:00.000Z,how-to,decision-making,0.7,True,"Upgrade article describes when and how to move to higher-capacity storage and processors, including which services are eligible and what changes (partition storage, vector index size) occur; this supports migration/upgrade decisions with product-specific details.",unchanged @@ -182,12 +182,12 @@ https://learn.microsoft.com/en-us/azure/search/search-indexer-access-control-lis https://learn.microsoft.com/en-us/azure/search/search-indexer-field-mappings,Define field mappings,Map fields in indexers - Azure AI Search,Configure field mappings for Azure AI Search indexers,Configure field mappings in an indexer to account for differences in field names and data representations.,This article explains how to set explicit field mappings that establish the data path between source fields in a supported data source and target fields in a search index.,2025-09-23T08:00:00.000Z,how-to,configuration,0.7,True,"Field mapping syntax and behavior (source-to-target, transformations) are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/search/search-indexer-how-to-access-private-sql,Connect to a SQL managed instance private endpoint,Connect to SQL Managed Instance - Azure AI Search,Connect Azure AI Search indexers to SQL Managed Instance privately,Configure an indexer connection to access content in an Azure SQL Managed instance that's protected through a private endpoint.,"This article explains how to configure an indexer in Azure AI Search for a private connection to a SQL managed instance that runs within a virtual network. The private connection is through a shared private link and Azure Private Link. 
On a private connection to a managed instance, the fully qualified domain name (FQDN) of the instance must include the DNS Zone. Currently, only the Azure AI Search Management REST API provides a dnsZonePrefix parameter for accepting the DNS zone specification. Althoug",2026-03-17T08:00:00.000Z,how-to,security,0.7,True,"Describes configuring an indexer private connection to Azure SQL Managed Instance via shared private link and Private Link, including requirement that the FQDN include the DNS zone and use of a specific dnsZonePrefix parameter in the Management REST API. These are product-specific secure connectivity and parameter details, best aligned with security.",unchanged https://learn.microsoft.com/en-us/azure/search/search-indexer-howto-access-ip-restricted,Connect through a firewall,Connect Through Firewalls - Azure AI Search,Configure Azure AI Search indexer access through IP firewalls,Configure IP firewall rules to allow data access by an Azure AI Search indexer.,"Azure AI Search makes external, outbound calls during indexer processing for content and skills, and for agentic retrieval requests that include calls to large language models (LLMs). If the target Azure resource uses IP firewall rules to filter incoming calls, you must create an inbound rule in your firewall that admits requests from Azure AI Search. This article explains how to find the IP address of your search service and configure an inbound IP rule on an Azure Storage account. While specif",2026-03-18T22:23:00.000Z,how-to,security,0.7,True,"Page is about configuring IP firewall rules so Azure AI Search indexers can reach IP-restricted resources. It likely includes product-specific details such as how to obtain the search service outbound IPs and how to apply them to Azure Storage firewall settings. 
These are concrete, service-specific security configuration steps (network access control) rather than generic concepts.",unchanged -https://learn.microsoft.com/en-us/azure/search/search-indexer-howto-access-private,Connect through a shared private link,Connect Through a Shared Private Link - Azure AI Search,Configure shared private link indexer connections in Azure AI Search,Configure indexer connections to access content from other Azure resources that are protected through a shared private link.,"This article explains how to configure ashared private linkfor private, outbound connections from Azure AI Search to an Azure resource running in a virtual network. With a shared private link, the search service connects to a virtual network IP address instead of a public endpoint. Shared private link is a premium feature billed by usage. For details, seeAzure Private Link pricing. Note If you're setting up a private indexer connection to a SQL Managed Instance, seeCreate a shared private link f",2026-04-10T08:00:00.000Z,how-to,security,0.7,True,"How-to for configuring shared private link from Azure AI Search indexers to private resources. Contains product-specific networking and security configuration details (private link usage, connection behavior to VNet IPs, premium feature constraints) that go beyond generic concepts, fitting the security/secure connectivity configuration category.",unchanged +https://learn.microsoft.com/en-us/azure/search/search-indexer-howto-access-private,Connect through a shared private link,Connect Through a Shared Private Link - Azure AI Search,Configure shared private link connections for Azure AI Search indexers,Configure indexer connections to access content from other Azure resources that are protected through a shared private link.,"This article explains how to configure a shared private link for private, outbound connections from Azure AI Search to an Azure resource running in a virtual network. 
With a shared private link, the search service connects to a virtual network IP address instead of a public endpoint. Shared private link is a premium feature billed by usage. For details, see Azure Private Link pricing. Note If you're setting up a private indexer connection to a SQL Managed Instance, see Create a shared private link f",2026-04-22T08:00:00.000Z,how-to,configuration,0.72,True,"The article provides product-specific configuration steps and parameters for setting up shared private link connections from Azure AI Search indexers to private endpoints in virtual networks. It includes detailed settings and resource-type-specific configuration that go beyond generic networking concepts, fitting the configuration sub-skill best.",updated https://learn.microsoft.com/en-us/azure/search/search-indexer-howto-access-trusted-service-exception,Connect as a trusted service,Connect as Trusted Service - Azure AI Search,Configure trusted service exception for Azure AI Search indexers,Learn how to enable secure data access to Azure Storage from an indexer in Azure AI Search.,"In Azure AI Search, indexers that access Azure blobs can use the trusted service exception to securely access blobs. This mechanism offers customers who are unable to grant indexer access using IP firewall rules a simple, secure, and free alternative for accessing data in storage accounts. Note If Azure Storage is behind a firewall and in the same region as Azure AI Search, you won't be able to create an inbound rule that admits requests from your search service. The solution for this scenario is fo",2026-03-26T22:23:00.000Z,how-to,security,0.78,True,"How-to page for securely connecting Azure AI Search indexers to Azure Storage via the trusted service exception. Likely includes specific Azure Storage network settings, required configuration steps, and constraints for when the search service and storage account are in the same region and behind firewalls. 
These are product-specific security configuration details.",unchanged https://learn.microsoft.com/en-us/azure/search/search-indexer-overview,What is an indexer?,Indexer overview - Azure AI Search,,"Crawl Azure SQL Database, SQL Managed Instance, Azure Cosmos DB, or Azure storage to extract searchable data and populate an Azure AI Search index.","An indexer in Azure AI Search is a crawler that extracts textual data from cloud data sources and populates a search index using field-to-field mappings between source data and a search index. This approach is sometimes referred to as a 'pull model' because the search service pulls data in without you having to write any code that adds data to an index. Indexers also drive skillset execution and AI enrichment, where you can configure skills to integrate extra processing of content en route to an in",2025-06-23T08:00:00.000Z,concept-article,,0.4,False,"Indexer overview is conceptual (what indexers are, pull model) without detailed config tables or error/limit specifics.",unchanged https://learn.microsoft.com/en-us/azure/search/search-indexer-securing-resources,Secure access to external data,Indexer access to protected resources - Azure AI Search,Secure indexer access to network-protected resources in Azure AI Search,Learn important concepts and requirements related to network-level security options for outbound requests made by indexers in Azure AI Search.,"If your Azure resources are deployed in an Azure virtual network, this concept article explains how a search indexer can access content that's protected by network security. It describes the outbound traffic patterns and indexer execution environments. It also covers the network protections supported by Azure AI Search and factors that might influence your security strategy. 
Finally, because Azure Storage is used for both data access and persistent storage, this article also covers network consi",2025-05-12T08:00:00.000Z,concept-article,security,0.7,True,"Describes outbound traffic patterns, supported network protections, and constraints for indexers accessing VNet-protected resources; product-specific security and networking behavior.",unchanged https://learn.microsoft.com/en-us/azure/search/search-indexer-sensitivity-labels,Pull Purview sensitivity labels into an index,Use Indexers to Ingest Microsoft Purview Sensitivity Labels - Azure AI Search,Configure indexers to ingest Microsoft Purview sensitivity labels,Learn how to configure Azure AI Search indexers to ingest Microsoft Purview sensitivity labels from supported data sources for document-level security enforcement.,"Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Azure AI Search now supports automatic extraction of Microsoft Purview sensitivity labels at document-level during indexing, with label-based access control enforced at query time. Ava",2026-03-19T22:16:00.000Z,how-to,security,0.78,True,"How-to for extracting Purview sensitivity labels at index time. 
Likely includes specific indexer configuration properties, supported data sources, label field mappings, and enforcement behavior, which are product-specific security settings.",unchanged -https://learn.microsoft.com/en-us/azure/search/search-indexer-sharepoint-access-control-lists,Pull SharePoint ACL permissions into an index,Use a SharePoint Indexer to Ingest Permission Metadata - Azure AI Search,Configure SharePoint ACL ingestion with Azure AI Search indexers,Learn how to configure Azure AI Search indexers for ingesting Access Control Lists (ACLs) from SharePoint in Microsoft 365 files.,"Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. This article explains how to ingest an access control list (ACL) alongside other content from SharePoint in Microsoft 365 using an Azure AI Search indexer. Permissions from SharePoin",2026-04-26T08:00:00.000Z,how-to,configuration,0.7,True,"The article describes how to configure Azure AI Search indexers to ingest SharePoint Access Control Lists, including product-specific configuration fields and behaviors for permission metadata. This is detailed, product-specific configuration guidance rather than a conceptual overview.",updated +https://learn.microsoft.com/en-us/azure/search/search-indexer-sharepoint-access-control-lists,Pull SharePoint ACL permissions into an index,Use a SharePoint Indexer to Ingest Permission Metadata - Azure AI Search,Configure SharePoint ACL ingestion with Azure AI Search indexers,Learn how to configure Azure AI Search indexers for ingesting Access Control Lists (ACLs) from SharePoint in Microsoft 365 files.,"Note This feature is currently in public preview. 
This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. This article explains how to ingest an access control list (ACL) alongside other content from SharePoint in Microsoft 365 using an Azure AI Search indexer. Permissions from SharePoin",2026-04-26T08:00:00.000Z,how-to,configuration,0.7,True,"The article describes how to configure Azure AI Search indexers to ingest SharePoint Access Control Lists, including product-specific configuration fields and behaviors for permission metadata. This is detailed, product-specific configuration guidance rather than a conceptual overview.",unchanged https://learn.microsoft.com/en-us/azure/search/search-indexer-troubleshooting,Troubleshoot an indexer,Indexer troubleshooting - Azure AI Search,Troubleshoot Azure AI Search indexer issues without errors,Provides indexer problem and resolution guidance for cases when no error messages are returned from the search service.,"Occasionally, indexers run into problems that don't produce errors or that occur on other Azure services, such as during authentication or when connecting. This article focuses on troubleshooting indexer problems when there are no messages to guide you. It also provides troubleshooting for errors that come from non-search resources used during indexing. 
Note If you have an Azure AI Search error to investigate, see Troubleshooting common indexer errors and warnings instead.",2025-10-23T08:00:00.000Z,troubleshooting-general,troubleshooting,0.82,True,"Symptom-based guidance for indexer problems, including product-specific diagnostics and steps when no error messages are returned.",unchanged https://learn.microsoft.com/en-us/azure/search/search-indexer-tutorial,Index Azure SQL Database,C# Tutorial: Index Azure SQL Data - Azure AI Search,,"In this C# tutorial, you connect to Azure SQL Database, extract searchable data, and load it into an Azure AI Search index.","Learn how to configure an indexer to extract searchable data from Azure SQL Database and send it to a search index in Azure AI Search. In this tutorial, you use C# and the Azure SDK for .NET to:",2026-02-27T08:00:00.000Z,tutorial,,0.2,False,"Primarily a step-by-step C# tutorial for creating an indexer from Azure SQL to Azure AI Search. It focuses on workflow and example code rather than enumerating product-specific configuration matrices, limits, or error mappings; it lacks the structured expert details required by the defined sub-skill types.",unchanged https://learn.microsoft.com/en-us/azure/search/search-language-support,Design a multilingual index,Multi-Language Indexing for Non-English Search Queries - Azure AI Search,,Create an index that supports multi-language content and then create queries scoped to that content.,"If you have strings in multiple languages, you can use vector search to represent multilingual content mathematically, which is the more modern approach. Alternatively, if you aren't using vectors, you can attach language analyzers that analyze strings using linguistic rules of a specific language during indexing and query execution. With a language analyzer, you get better handling of diacritics, character variants, punctuation, and word root forms. 
Azure AI Search supports Microsoft and Lucene ana",2026-03-18T22:23:00.000Z,how-to,,0.3,False,"Appears to describe language analyzers and multilingual indexing conceptually. Summary does not indicate detailed configuration tables, numeric thresholds, or product-specific limits/quotas, nor concrete best-practice gotchas; more of a capability/feature overview.",unchanged @@ -230,6 +230,7 @@ https://learn.microsoft.com/en-us/azure/search/search-query-sensitivity-labels,Q https://learn.microsoft.com/en-us/azure/search/search-query-simple-examples,Sample queries (simple syntax),Examples of simple syntax - Azure AI Search,,"Explore query examples that demonstrate the simple syntax for full text search, filter search, and geo search against an Azure AI Search index.","In Azure AI Search, the simple query syntax invokes the default query parser for full text search. The parser is fast and handles common scenarios, including full text search, filtered and faceted search, and prefix search. This article uses examples to illustrate simple syntax usage in a Search Documents (REST API) request. Note An alternative query syntax is Lucene, which supports more complex query structures, such as fuzzy and wildcard search. For more information, see Examples of full Lucene sear",2025-04-14T08:00:00.000Z,concept-article,,0.3,False,Collection of simple query syntax examples; useful but not expert-only configuration or decision content.,unchanged https://learn.microsoft.com/en-us/azure/search/search-query-troubleshoot-collection-filters,Troubleshoot a collection filter,Troubleshooting OData collection filters - Azure AI Search,Troubleshoot OData collection filter errors in Azure AI Search,Learn approaches for resolving OData collection filter errors in Azure AI Search queries.,"To filter on collection fields in Azure AI Search, you can use the any and all operators together with lambda expressions. A lambda expression is a subfilter that is applied to each element of a collection. 
Not every feature of filter expressions is available inside a lambda expression. Which features are available differs depending on the data type of the collection field that you want to filter. This can result in an error if you try to use a feature in a lambda expression that isn't supported in that",2025-05-29T08:00:00.000Z,concept-article,troubleshooting,0.8,True,"Explicit troubleshooting article for collection filters; likely maps specific error messages/conditions to causes and fixes, which is product-specific diagnostic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/search/search-query-understand-collection-filters,What is a collection filter?,OData collection filters - Azure AI Search,,"Learn the mechanics of how OData collection filters work in Azure AI Search queries, including limitations and behaviors unique to collections.","This article provides background for developers who are writing advanced filters with complex lambda expressions. The article explains why the rules for collection filters exist by exploring how Azure AI Search executes these filters. When you build a filter on collection fields in Azure AI Search, you can use the any and all operators together with lambda expressions. Lambda expressions are Boolean expressions that refer to a range variable. 
In filters that use a lambda expression, the any and all operators",2025-05-29T08:00:00.000Z,concept-article,,0.45,False,"Deep explanation of OData collection filters; while it covers behaviors and limitations, it's more conceptual mechanics than structured troubleshooting or config tables.",unchanged +https://learn.microsoft.com/en-us/azure/search/search-region-capacity,Handling regional capacity constraints,How to handle regional capacity constraints in Azure AI Search - Azure AI Search,Choose alternative region for Azure AI Search capacity,Learn how to handle a regional capacity constraint that affects your Azure AI Search service.,This article helps you decide what to do when your preferred Azure AI Search region is unavailable due to capacity constraints. It also provides evaluation criteria for selecting an alternative region.,2026-04-23T06:10:00.000Z,concept-article,decision-making,0.72,True,"The article provides concrete, product-specific guidance for what to do when a preferred Azure AI Search region has capacity constraints, including evaluation criteria for selecting an alternative region. This is explicit decision-making content (how to choose another region under specific conditions) rather than a generic overview, but it does not focus on numeric limits/quotas or configuration parameters.",new https://learn.microsoft.com/en-us/azure/search/search-region-support,Supported regions,Supported Regions - Azure AI Search,Check Azure AI Search regional feature availability,Learn about the regions that offer Azure AI Search and the features available in each region.,This article identifies the cloud regions in which Azure AI Search is available. 
It also lists which premium features are available in each region.,2026-03-25T08:00:00.000Z,concept-article,deployment,0.65,True,"Region support pages enumerate which features are available in which regions, effectively a support matrix of capabilities by region; this is deployment-relevant platform support information that is highly specific and not generally known.",unchanged https://learn.microsoft.com/en-us/azure/search/search-relevance-overview,What is relevance?,Relevance - Azure AI Search,,Describe strategies for producing relevant results in Azure AI Search and explain how the scoring and ranking algorithms work and how to use them together.,"The true measure of relevance is how well a retrieved set of results meets your customer and user information needs. In this article, learn about:",2025-12-08T23:11:00.000Z,concept-article,,0.35,False,"Relevance overview and strategies; conceptual explanation of scoring and ranking, not specific numeric thresholds or configs.",unchanged https://learn.microsoft.com/en-us/azure/search/search-security-api-keys,Authenticate with keys,Connect Using API Keys - Azure AI Search,Manage Azure AI Search admin and query API keys,Learn how to manage and use admin and query API keys for authenticating requests to an Azure AI Search service endpoint.,"Azure AI Search supports both identity-based and key-based authentication (default) for connections to your search service. A request made to a search service endpoint is accepted if both the request and the API key are valid and if the search service is configured to allow API keys on a request. Important When you create a search service, key-based authentication is the default, but it's not the most secure option. 
We recommend that you replace it with role-based access.",2026-03-26T22:23:00.000Z,how-to,security,0.85,True,"Details key-based authentication, admin vs query keys, and configuration of key usage; includes product-specific authentication behavior and recommendations beyond generic API key usage.",unchanged @@ -265,7 +266,7 @@ https://learn.microsoft.com/en-us/azure/search/semantic-how-to-enable-scoring-pr https://learn.microsoft.com/en-us/azure/search/semantic-how-to-query-request,Add semantic ranking to queries,Add semantic ranking - Azure AI Search,Invoke semantic ranking in Azure AI Search queries,Set a semantic query type to attach the deep learning models of semantic ranker.,"You can apply semantic ranking to text queries, hybrid queries, and vector queries if your search documents contain string fields and the vector query has a text representation in the search document. This article explains how to invoke the semantic ranker on queries. It assumes you're using the most recent stable or preview APIs. For help with older versions, see Migrate semantic ranking code.",2025-11-06T08:00:00.000Z,how-to,integrations,0.68,True,"Shows how to call semantic ranker via query parameters and request bodies for different query types; includes API parameter names and usage patterns, fitting integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/search/semantic-how-to-query-rewrite,Rewrite queries with semantic ranker,Rewrite queries with semantic ranker in Azure AI Search - Azure AI Search,Use semantic query rewriting in Azure AI Search,Learn how to rewrite queries with semantic ranker in Azure AI Search,"Note This feature is currently in public preview. This preview is provided without a service-level agreement and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. 
Query rewriting is the process of transforming a user's query into a more effective one, adding more terms and refining search results. The search service sends the search query (or ",2025-11-21T08:00:00.000Z,how-to,integrations,0.7,True,"Describes query rewriting with semantic ranker, including request/response fields and parameters in preview APIs; these are concrete API-level integration details.",unchanged https://learn.microsoft.com/en-us/azure/search/semantic-search-overview,What is semantic ranking?,Semantic ranking - Azure AI Search,,Learn how Azure AI Search uses deep learning semantic ranking models from Bing to make search results more intuitive.,"In Azure AI Search, semantic ranker is a feature that measurably improves search relevance by using Microsoft's language understanding models to rerank search results. Semantic ranker is also built into agentic retrieval. This article is a high-level introduction to help you understand the behaviors and benefits of semantic ranker. Semantic ranker is a premium feature that's billed by usage, but you can use it for free subject to service limits for the free tier. We recommend this article for backgro",2026-01-16T06:04:00.000Z,concept-article,,0.25,False,High-level overview of semantic ranker behavior and benefits; mostly conceptual and billing notes without detailed configuration parameters or limits.,unchanged -https://learn.microsoft.com/en-us/azure/search/service-configure-firewall,Configure network access,Configure Network Access - Azure AI Search,Configure IP firewall and trusted service access for Azure AI Search,Restrict inbound network access to Azure AI Search with IP firewall rules and trusted service exceptions.,"This article explains how to restrict inbound network access to a search service's public endpoint. You can configure IP firewall rules to allow access only from specific IP addresses, address ranges, or subnets. You can also enable exceptions for trusted Azure services. 
Firewall rules control which clients can send requests (queries, indexing, management operations) to your search service. They don't affect outbound connections from the search service to external resources. For outbound security, s",2026-04-15T22:09:00.000Z,how-to,security,0.78,True,"Network access configuration article; typically includes specific firewall rule settings, trusted service options, and possibly example configurations that define product-specific security behavior and parameters.",updated +https://learn.microsoft.com/en-us/azure/search/service-configure-firewall,Configure network access,Configure Network Access - Azure AI Search,Configure IP firewall and trusted service access for Azure AI Search,Restrict inbound network access to Azure AI Search with IP firewall rules and trusted service exceptions.,"This article explains how to restrict inbound network access to a search service's public endpoint. You can configure IP firewall rules to allow access only from specific IP addresses, address ranges, or subnets. You can also enable exceptions for trusted Azure services. Firewall rules control which clients can send requests (queries, indexing, management operations) to your search service. They don't affect outbound connections from the search service to external resources. 
For outbound security, s",2026-04-15T22:09:00.000Z,how-to,security,0.78,True,"Network access configuration article; typically includes specific firewall rule settings, trusted service options, and possibly example configurations that define product-specific security behavior and parameters.",unchanged https://learn.microsoft.com/en-us/azure/search/service-create-private-endpoint,Create a private endpoint,Create a private endpoint for a secure connection - Azure AI Search,Create private endpoints for secure Azure AI Search access,Set up a private endpoint in a virtual network for a secure client connection to an Azure AI Search service.,This article explains how to configure a private connection to Azure AI Search so that it admits requests from clients in a virtual network instead of over a public internet connection.,2026-01-23T23:20:00.000Z,how-to,security,0.8,True,Details how to set up private endpoints and required settings for secure VNet connectivity; concrete security configuration for this service.,unchanged https://learn.microsoft.com/en-us/azure/search/speller-how-to-add,Add spell check,Add spell check to queries - Azure AI Search,,"Attach spelling correction to the query pipeline, to fix typos on query terms before executing the query.","Important Spell correction is in public preview under supplemental terms of use. It's available through the Azure portal, preview REST APIs, and beta versions of Azure SDK libraries. You can improve recall by spell-correcting words in a query before they reach the search engine. 
The speller parameter is supported for all text (non-vector) query types.",2025-08-27T08:00:00.000Z,how-to,,0.4,False,"How-to for adding spell check; may mention preview status but not focused on limits, config tables, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/search/troubleshoot-shared-private-link-resources,Troubleshoot a private connection,Troubleshoot shared private link resources - Azure AI Search,Troubleshoot shared private link resource issues in Azure AI Search,Troubleshooting guide for common problems when managing shared private link resources in Azure AI Search.,"A shared private link allows Azure AI Search to make secure outbound connections over a private endpoint when accessing customer resources in a virtual network. This article can help you resolve errors that might occur. Creating a shared private link is a search service control plane operation. You can create a shared private link using either the Azure portal or a Management REST API. During provisioning, the state of the request is Updating. After the operation completes successfully, status is Suc",2025-06-04T08:00:00.000Z,troubleshooting-general,troubleshooting,0.86,True,"Organized around resolving errors when managing shared private link resources, including provisioning states and likely error codes; symptom → cause → solution guidance.",unchanged diff --git a/products/azure-cognitive-search/report.md b/products/azure-cognitive-search/report.md index 866bfbed..01697ce6 100644 --- a/products/azure-cognitive-search/report.md +++ b/products/azure-cognitive-search/report.md @@ -1,20 +1,20 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: - configuration: 'Configuring Azure AI Search: data sources, indexes, analyzers, skillsets, - enrichment, vector/semantic settings, knowledge bases, monitoring, and relevance/answer - tuning.' 
- decision-making: Planning Azure AI Search capacity, SKUs, and costs, and upgrading/migrating - REST APIs, skills, SDKs, and service tiers for agentic retrieval and .NET apps + configuration: 'Configuring Azure AI Search: data sources, index schemas, analyzers, + enrichment skillsets, vector/semantic settings, knowledge bases, monitoring, and + relevance/ranking behavior' + decision-making: Guidance on Azure AI Search capacity, SKUs, regions, cost planning, + and upgrading/migrating REST APIs, skills, SDKs, and service resources. integrations: Integrating Azure AI Search with data sources, SDKs, vectorizers, - and custom skills; building knowledge stores; and crafting queries/filters (OData, - Lucene, semantic, vector, agentic). + and custom skills; building queries (Lucene/OData), enrichments, knowledge stores, + and agentic/semantic retrieval. limits-quotas: 'Limits, quotas, and behaviors for Azure AI Search: service and index caps by tier, vector/index size limits, indexer scheduling windows, enrichment quotas, and related FAQs.' - troubleshooting: Diagnosing and fixing Azure AI Search indexer and skillset issues, - including debug sessions, OData filter errors, private link problems, and storage/metrics - discrepancies. + troubleshooting: Diagnosing and fixing Azure AI Search indexer and skillset issues + (errors, warnings, no-error failures), OData filter problems, private link issues, + and storage/metrics discrepancies. best-practices: Design, scaling, and performance tuning of Azure AI Search indexing/querying, including enrichment pipelines, chunking/vectorization, data modeling, concurrency-safe updates, and vector optimization. @@ -29,31 +29,31 @@ category_descriptions: skill_description: Expert knowledge for Azure AI Search development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. 
Use when - designing indexes/skillsets, vector/semantic search, RAG queries, indexers, or secure - multi-tenant setups, and other Azure AI Search related development tasks. Not for - Azure Cosmos DB (use azure-cosmos-db), Azure Data Explorer (use azure-data-explorer), - Azure Synapse Analytics (use azure-synapse-analytics), Azure Data Factory (use azure-data-factory). -use_when: Use when designing indexes/skillsets, vector/semantic search, RAG queries, - indexers, or secure multi-tenant setups, and other Azure AI Search related development - tasks. -confusable_not_for: Not for Azure Cosmos DB (use azure-cosmos-db), Azure Data Explorer - (use azure-data-explorer), Azure Synapse Analytics (use azure-synapse-analytics), - Azure Data Factory (use azure-data-factory). + designing indexes, skillsets, vector/semantic search, indexers, or RAG/agentic retrieval + with Azure AI Search, and other Azure AI Search related development tasks. Not for + Azure Cosmos DB (use azure-cosmos-db), Azure SQL Database (use azure-sql-database), + Azure Table Storage (use azure-table-storage), Azure Open Datasets (use azure-open-datasets). +use_when: Use when designing indexes, skillsets, vector/semantic search, indexers, + or RAG/agentic retrieval with Azure AI Search, and other Azure AI Search related + development tasks. +confusable_not_for: Not for Azure Cosmos DB (use azure-cosmos-db), Azure SQL Database + (use azure-sql-database), Azure Table Storage (use azure-table-storage), Azure Open + Datasets (use azure-open-datasets). 
--- # Azure AI Search Crawl Report ## Summary -- **Total Pages**: 299 -- **Fetched**: 299 +- **Total Pages**: 300 +- **Fetched**: 300 - **Fetch Failed**: 0 -- **Classified**: 226 +- **Classified**: 227 - **Unclassified**: 73 ### Incremental Update -- **New Pages**: 0 -- **Updated Pages**: 14 -- **Unchanged**: 285 +- **New Pages**: 1 +- **Updated Pages**: 5 +- **Unchanged**: 294 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-cognitive-search/azure-cognitive-search.csv` @@ -63,47 +63,33 @@ confusable_not_for: Not for Azure Cosmos DB (use azure-cosmos-db), Azure Data Ex |------|-------|------------| | architecture-patterns | 3 | 1.0% | | best-practices | 17 | 5.7% | -| configuration | 88 | 29.4% | -| decision-making | 9 | 3.0% | +| configuration | 89 | 29.7% | +| decision-making | 10 | 3.3% | | deployment | 5 | 1.7% | -| integrations | 55 | 18.4% | +| integrations | 55 | 18.3% | | limits-quotas | 6 | 2.0% | -| security | 35 | 11.7% | +| security | 34 | 11.3% | | troubleshooting | 8 | 2.7% | -| *(Unclassified)* | 73 | 24.4% | +| *(Unclassified)* | 73 | 24.3% | ## Changes +### New Pages + +- [Handling regional capacity constraints](https://learn.microsoft.com/en-us/azure/search/search-region-capacity) + ### Updated Pages -- [Manage an index](https://learn.microsoft.com/en-us/azure/search/search-how-to-manage-index) - - Updated: 2025-07-03T22:08:00.000Z → 2026-04-15T06:03:00.000Z +- [What is multimodal search?](https://learn.microsoft.com/en-us/azure/search/multimodal-search-overview) + - Updated: 2026-03-25T08:00:00.000Z → 2026-04-20T08:00:00.000Z - [SharePoint in Microsoft 365](https://learn.microsoft.com/en-us/azure/search/search-how-to-index-sharepoint-online) - - Updated: 2026-03-24T08:00:00.000Z → 2026-04-16T08:00:00.000Z -- [FAQ](https://learn.microsoft.com/en-us/azure/search/search-faq-frequently-asked-questions) - - Updated: 2026-03-26T17:13:00Z → 2026-04-15T22:09:00Z -- [Pull SharePoint ACL permissions into an 
index](https://learn.microsoft.com/en-us/azure/search/search-indexer-sharepoint-access-control-lists) - - Updated: 2026-03-25T08:00:00.000Z → 2026-04-26T08:00:00.000Z -- [Search index](https://learn.microsoft.com/en-us/azure/search/agentic-knowledge-source-how-to-search-index) - - Updated: 2026-03-25T08:00:00.000Z → 2026-04-15T06:03:00.000Z -- [Azure blob](https://learn.microsoft.com/en-us/azure/search/agentic-knowledge-source-how-to-blob) - - Updated: 2026-03-26T17:13:00.000Z → 2026-04-15T06:03:00.000Z -- [OneLake](https://learn.microsoft.com/en-us/azure/search/agentic-knowledge-source-how-to-onelake) - - Updated: 2026-03-26T17:13:00.000Z → 2026-04-15T06:03:00.000Z -- [SharePoint](https://learn.microsoft.com/en-us/azure/search/agentic-knowledge-source-how-to-sharepoint-indexed) - - Updated: 2026-03-26T17:13:00.000Z → 2026-04-15T06:03:00.000Z -- [SharePoint](https://learn.microsoft.com/en-us/azure/search/agentic-knowledge-source-how-to-sharepoint-remote) - - Updated: 2026-03-26T17:13:00.000Z → 2026-04-15T06:03:00.000Z -- [Web](https://learn.microsoft.com/en-us/azure/search/agentic-knowledge-source-how-to-web) - - Updated: 2026-03-25T08:00:00.000Z → 2026-04-15T06:03:00.000Z -- [Create a knowledge base](https://learn.microsoft.com/en-us/azure/search/agentic-retrieval-how-to-create-knowledge-base) - - Updated: 2026-03-04T23:12:00.000Z → 2026-04-15T06:03:00.000Z -- [Query a knowledge base (APIs or MCP)](https://learn.microsoft.com/en-us/azure/search/agentic-retrieval-how-to-retrieve) - - Updated: 2026-04-09T17:17:00.000Z → 2026-04-16T11:04:00.000Z -- [Build an end-to-end retrieval solution](https://learn.microsoft.com/en-us/azure/search/agentic-retrieval-how-to-create-pipeline) - - Updated: 2026-04-01T08:00:00.000Z → 2026-04-15T08:00:00.000Z -- [Configure network access](https://learn.microsoft.com/en-us/azure/search/service-configure-firewall) - - Updated: 2026-03-18T22:23:00.000Z → 2026-04-15T22:09:00.000Z + - Updated: 2026-04-16T08:00:00.000Z → 
2026-04-20T08:00:00.000Z +- [Troubleshoot errors and warnings](https://learn.microsoft.com/en-us/azure/search/cognitive-search-common-errors-warnings) + - Updated: 2025-10-14T08:00:00.000Z → 2026-04-23T08:00:00.000Z +- [Connect through a shared private link](https://learn.microsoft.com/en-us/azure/search/search-indexer-howto-access-private) + - Updated: 2026-04-10T08:00:00.000Z → 2026-04-22T08:00:00.000Z +- [Document Layout](https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-document-intelligence-layout) + - Updated: 2026-01-20T18:13:00.000Z → 2026-04-22T08:00:00.000Z ## Classified Pages @@ -115,7 +101,7 @@ confusable_not_for: Not for Azure Cosmos DB (use azure-cosmos-db), Azure Data Ex | [Configure cross-tenant CMK](https://learn.microsoft.com/en-us/azure/search/search-security-managed-encryption-cross-tenant) | security | 0.86 | Explains using a multitenant Entra app to use a Key Vault key from another tenant. Contains tenant, app, and permission configuration specific to this scenario. | | [Enable role-based access](https://learn.microsoft.com/en-us/azure/search/search-security-enable-roles) | security | 0.86 | Describes enabling RBAC, including which roles apply and how to switch from key-based auth; includes specific role scopes and configuration flags. | | [Troubleshoot a private connection](https://learn.microsoft.com/en-us/azure/search/troubleshoot-shared-private-link-resources) | troubleshooting | 0.86 | Organized around resolving errors when managing shared private link resources, including provisioning states and likely error codes; symptom → cause → solution guidance. | -| [Troubleshoot errors and warnings](https://learn.microsoft.com/en-us/azure/search/cognitive-search-common-errors-warnings) | troubleshooting | 0.86 | Maps specific indexer error/warning messages and codes to causes and solutions, including use of maxFailedItems and related settings. 
| +| [Troubleshoot errors and warnings](https://learn.microsoft.com/en-us/azure/search/cognitive-search-common-errors-warnings) | troubleshooting | 0.86 | Article is explicitly about common indexer errors and warnings in Azure AI Search, including parameters like maxFailedItems and maxFailedItemsPerBatch, and maps symptoms (errors/warnings) to causes and resolutions. This is product-specific troubleshooting content with configuration-related error handling guidance. | | [Annotation reference language](https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-annotation-language) | configuration | 0.85 | Reference for the annotation expression language used in skillsets; defines syntax and path rules that are unique configuration semantics. | | [Assign roles (users)](https://learn.microsoft.com/en-us/azure/search/search-security-rbac) | security | 0.85 | Page is specifically about connecting using Azure roles and managing permissions via Microsoft Entra ID. It will enumerate built-in role names, scopes, and how they map to admin/developer/query permissions, which is product-specific security configuration expert knowledge. | | [Authenticate with keys](https://learn.microsoft.com/en-us/azure/search/search-security-api-keys) | security | 0.85 | Details key-based authentication, admin vs query keys, and configuration of key usage; includes product-specific authentication behavior and recommendations beyond generic API key usage. | @@ -130,7 +116,6 @@ confusable_not_for: Not for Azure Cosmos DB (use azure-cosmos-db), Azure Data Ex | [Azure Content Understanding](https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-content-understanding) | configuration | 0.82 | Skill reference for Azure Content Understanding, including how it analyzes and chunks documents and outputs; product-specific configuration and behavior. 
| | [Azure Vision multimodal embeddings](https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-vision-vectorize) | configuration | 0.82 | Skill reference with parameters, preview API version, billing behavior, and usage constraints; detailed configuration for a specific built-in skill. | | [Configure semantic ranker](https://learn.microsoft.com/en-us/azure/search/semantic-how-to-configure) | configuration | 0.82 | Explains adding semantic configurations to an index, including specific JSON schema fields and allowed values; this is detailed product configuration. | -| [Document Layout](https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-document-intelligence-layout) | configuration | 0.82 | Describes how the Document Layout skill works, supported formats, and configuration options; product-specific skill configuration. | | [Image Analysis](https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-image-analysis) | configuration | 0.82 | Reference for Image Analysis skill including supported image requirements and options; detailed configuration and constraints for this skill. | | [Map to index fields](https://learn.microsoft.com/en-us/azure/search/cognitive-search-output-field-mapping) | configuration | 0.82 | Explains indexer outputFieldMappings schema and how to map in-memory enrichment paths to index fields; includes parameter names and behavior. | | [OCR](https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-ocr) | configuration | 0.82 | Skill reference mapping to specific Azure Vision APIs and language support; includes product-specific behavior and constraints. 
| @@ -227,7 +212,9 @@ confusable_not_for: Not for Azure Cosmos DB (use azure-cosmos-db), Azure Data Ex | [Index one-to-many](https://learn.microsoft.com/en-us/azure/search/search-how-to-index-azure-blob-one-to-many) | configuration | 0.74 | Details specific parsingMode values (delimitedText, jsonArray, jsonLines, markdown/oneToMany) and how they affect document projection; product-specific config. | | [Azure SQL Databases](https://learn.microsoft.com/en-us/azure/search/search-how-to-index-sql-database) | integrations | 0.72 | How-to for configuring Azure SQL/MI as a data source and indexer, including REST schema and product-specific parameters for indexers and data sources. | | [Azure SQL Server VMs](https://learn.microsoft.com/en-us/azure/search/search-how-to-index-sql-server) | security | 0.72 | Page gives product-specific steps for securing an Azure AI Search indexer connection to SQL Server on an Azure VM, including use of public internet endpoints, certificates for the SQL Server FQDN, and firewall configuration. These are concrete, service-specific security configuration details rather than generic concepts. | +| [Connect through a shared private link](https://learn.microsoft.com/en-us/azure/search/search-indexer-howto-access-private) | configuration | 0.72 | The article provides product-specific configuration steps and parameters for setting up shared private link connections from Azure AI Search indexers to private endpoints in virtual networks. It includes detailed settings and resource-type-specific configuration that go beyond generic networking concepts, fitting the configuration sub-skill best. | | [Custom Web API](https://learn.microsoft.com/en-us/azure/search/vector-search-vectorizer-custom-web-api) | integrations | 0.72 | Page describes the Custom Web API vectorizer, including the required JSON payload structure and how it is referenced in index definitions and vector profiles. 
This is product-specific integration detail (how Azure AI Search calls an external embedding service at query time) that goes beyond generic knowledge and fits the integrations & coding patterns category. | +| [Handling regional capacity constraints](https://learn.microsoft.com/en-us/azure/search/search-region-capacity) | decision-making | 0.72 | The article provides concrete, product-specific guidance for what to do when a preferred Azure AI Search region has capacity constraints, including evaluation criteria for selecting an alternative region. This is explicit decision-making content (how to choose another region under specific conditions) rather than a generic overview, but it does not focus on numeric limits/quotas or configuration parameters. | | [Index JSON in Azure blobs](https://learn.microsoft.com/en-us/azure/search/search-semi-structured-data) | integrations | 0.72 | REST-based tutorial for indexing nested JSON arrays and semi-structured data; includes product-specific indexer and field mapping configuration. | | [Index Markdown in Azure blobs](https://learn.microsoft.com/en-us/azure/search/search-markdown-data-tutorial) | integrations | 0.72 | Shows REST configuration for Markdown-aware indexers, including parsing modes and field mappings; preview behavior is product-specific. | | [Overview](https://learn.microsoft.com/en-us/azure/search/query-odata-filter-orderby-syntax) | integrations | 0.72 | Service-specific OData language overview for this API; defines supported constructs and how they map into query parameters, which is integration-focused syntax knowledge. 
| @@ -251,7 +238,6 @@ confusable_not_for: Not for Azure Cosmos DB (use azure-cosmos-db), Azure Data Ex | [Configure BM25 ranking](https://learn.microsoft.com/en-us/azure/search/index-ranking-similarity) | configuration | 0.70 | How-to page for enabling BM25 on services, likely includes specific service version conditions, configuration flags, and request parameters unique to Azure AI Search rather than just conceptual ranking theory. | | [Configure a search service](https://learn.microsoft.com/en-us/azure/search/search-manage) | security | 0.70 | Day-one configuration checklist covering RBAC, managed identities, and network security with product-specific settings. | | [Connect through a firewall](https://learn.microsoft.com/en-us/azure/search/search-indexer-howto-access-ip-restricted) | security | 0.70 | Page is about configuring IP firewall rules so Azure AI Search indexers can reach IP-restricted resources. It likely includes product-specific details such as how to obtain the search service outbound IPs and how to apply them to Azure Storage firewall settings. These are concrete, service-specific security configuration steps (network access control) rather than generic concepts. | -| [Connect through a shared private link](https://learn.microsoft.com/en-us/azure/search/search-indexer-howto-access-private) | security | 0.70 | How-to for configuring shared private link from Azure AI Search indexers to private resources. Contains product-specific networking and security configuration details (private link usage, connection behavior to VNet IPs, premium feature constraints) that go beyond generic concepts, fitting the security/secure connectivity configuration category. 
| | [Connect to a SQL managed instance private endpoint](https://learn.microsoft.com/en-us/azure/search/search-indexer-how-to-access-private-sql) | security | 0.70 | Describes configuring an indexer private connection to Azure SQL Managed Instance via shared private link and Private Link, including requirement that the FQDN include the DNS zone and use of a specific dnsZonePrefix parameter in the Management REST API. These are product-specific secure connectivity and parameter details, best aligned with security. | | [Create a hybrid query](https://learn.microsoft.com/en-us/azure/search/hybrid-search-how-to-query) | configuration | 0.70 | How-to for constructing hybrid queries with RRF; includes specific query parameters and configuration patterns unique to Azure AI Search hybrid search. | | [Create a knowledge base](https://learn.microsoft.com/en-us/azure/search/agentic-retrieval-how-to-create-knowledge-base) | configuration | 0.70 | Explains how to create a knowledge base object that orchestrates agentic retrieval; likely includes JSON schema, property names, and allowed values for knowledge base configuration. | @@ -304,11 +290,12 @@ confusable_not_for: Not for Azure Cosmos DB (use azure-cosmos-db), Azure Data Ex | [Add semantic ranking to queries](https://learn.microsoft.com/en-us/azure/search/semantic-how-to-query-request) | integrations | 0.68 | Shows how to call semantic ranker via query parameters and request bodies for different query types; includes API parameter names and usage patterns, fitting integrations & coding patterns. | | [Azure SQL Managed Instances](https://learn.microsoft.com/en-us/azure/search/search-how-to-index-sql-managed-instance) | security | 0.68 | Covers enabling public endpoint and networking specifics for SQL MI so Azure AI Search indexers can connect; product-specific connectivity and security configuration. 
| | [Create an indexer](https://learn.microsoft.com/en-us/azure/search/search-how-to-create-indexers) | configuration | 0.68 | Indexer creation docs for Azure AI Search typically include product-specific configuration parameters (indexer object properties, schedule settings, data source connection settings, field mappings, and options) with concrete names and allowed values. This goes beyond a conceptual overview and provides detailed configuration knowledge unique to Azure AI Search indexers, fitting the configuration sub-skill best. | +| [Document Layout](https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-document-intelligence-layout) | integrations | 0.68 | The page describes a specific cognitive skill (Document Layout) within Azure AI Search, including how it uses the Azure Document Intelligence layout model and how it outputs markdown/text with location metadata for RAG and multimodal search. These are product-specific integration details between Azure AI Search and Document Intelligence that go beyond generic knowledge, fitting the integrations & coding patterns category best. | | [Explore the code](https://learn.microsoft.com/en-us/azure/search/tutorial-csharp-search-query-integration) | integrations | 0.68 | Cheat sheet of essential .NET SDK integration steps and query patterns for Azure AI Search, with concrete code usage beyond generic SDK knowledge. | | [Sample custom skill (Bing Entity Search)](https://learn.microsoft.com/en-us/azure/search/cognitive-search-create-custom-skill-example) | integrations | 0.68 | The page shows a concrete, product-specific pattern for wrapping the Bing Entity Search API in an Azure Function that implements the Azure AI Search custom skill interface. It includes request/response schema details and code-level integration specifics that go beyond generic tutorial content, fitting the integrations & coding patterns category. 
| | [Search index](https://learn.microsoft.com/en-us/azure/search/agentic-knowledge-source-how-to-search-index) | configuration | 0.68 | How-to page for creating a search index knowledge source; likely includes specific request schemas, property names, and configuration parameters for Azure AI Search knowledge bases, which are product-specific configuration details beyond generic LLM knowledge. | | [Security controls by Azure Policy](https://learn.microsoft.com/en-us/azure/search/security-controls-policy) | security | 0.68 | Page lists Azure Policy Regulatory Compliance controls specific to Azure AI Search, including at least one built-in policy definition that can be used in policy assignments. These are product-specific security/compliance configurations (policy definitions and how they apply to the service), which fits the security category. | -| [SharePoint in Microsoft 365](https://learn.microsoft.com/en-us/azure/search/search-how-to-index-sharepoint-online) | integrations | 0.68 | The article describes a preview SharePoint Online indexer with product-specific REST API usage and configuration details unique to Azure AI Search and SharePoint integration. It goes beyond a simple tutorial by covering how to set up and use the indexer with Azure Search, which is specialized integration knowledge not generally known from training. | +| [SharePoint in Microsoft 365](https://learn.microsoft.com/en-us/azure/search/search-how-to-index-sharepoint-online) | configuration | 0.68 | How-to article for setting up the SharePoint in Microsoft 365 indexer, which typically includes product-specific configuration parameters (indexer definition fields, data source settings, authentication details, and preview REST API specifics). These are concrete, service-specific configuration details rather than conceptual guidance. 
| | [Add AI-generated content (GenAI Prompt skill)](https://learn.microsoft.com/en-us/azure/search/chat-completion-skill-example-usage) | integrations | 0.65 | Describes concrete usage of the Chat Completion skill within an Azure AI Search ingestion pipeline, including product-specific skill configuration and how outputs are mapped into the search index. This is an integration/coding pattern between Azure AI Search indexers, AI enrichment, and language models, beyond generic LLM knowledge. | | [Create a knowledge store](https://learn.microsoft.com/en-us/azure/search/knowledge-store-create-rest) | integrations | 0.65 | Describes creating a knowledge store using REST, tied to indexers and skillsets and specifying Azure Storage as the output. This implies detailed REST payload structure and parameters specific to Azure AI Search–Azure Storage integration, matching the integrations & coding patterns category. | | [Create a search index](https://learn.microsoft.com/en-us/azure/search/search-how-to-create-search-index) | configuration | 0.65 | Covers schema definition and index creation via portal, REST, or SDK with product-specific field and index settings. | @@ -404,6 +391,7 @@ confusable_not_for: Not for Azure Cosmos DB (use azure-cosmos-db), Azure Data Ex | [What is a knowledge store?](https://learn.microsoft.com/en-us/azure/search/knowledge-store-concept-intro) | 0.20 | Conceptual introduction to knowledge stores and their role as secondary storage for AI-enriched content. The summary indicates high-level explanation without concrete configuration parameters, limits, or troubleshooting details. | | [What is agentic retrieval?](https://learn.microsoft.com/en-us/azure/search/agentic-retrieval-overview) | 0.20 | High-level conceptual overview of agentic retrieval in Azure AI Search; summary indicates architecture description and intended use but no evidence of numeric limits, configuration tables, error codes, or decision matrices with quantified trade-offs. 
| | [What is full-text search?](https://learn.microsoft.com/en-us/azure/search/search-lucene-query-architecture) | 0.20 | Conceptual description of full text search architecture; no specific limits, configs, or decision matrices. | +| [What is multimodal search?](https://learn.microsoft.com/en-us/azure/search/multimodal-search-overview) | 0.20 | Page is a conceptual overview of multimodal search in Azure AI Search, describing what it is and high-level pipeline steps. It does not include numeric limits, configuration tables, error codes, or product-specific decision matrices or best-practice details. | | [What's Azure AI Search?](https://learn.microsoft.com/en-us/azure/search/search-what-is-azure-search) | 0.20 | High-level introduction and use cases for Azure AI Search without product-specific limits, configs, or detailed patterns. | | [Portal](https://learn.microsoft.com/en-us/azure/search/search-get-started-portal) | 0.10 | Quickstart for full-text search via portal wizard is a step-by-step tutorial; it describes how to click through UI and run a sample, without detailed configuration tables, limits, or error mappings. | | [Portal](https://learn.microsoft.com/en-us/azure/search/search-get-started-portal-image-search) | 0.10 | Quickstart for multimodal search is a how-to walkthrough; it explains using the wizard with sample data, not detailed expert configuration, limits, or decision matrices. | @@ -411,5 +399,4 @@ confusable_not_for: Not for Azure Cosmos DB (use azure-cosmos-db), Azure Data Ex | [Portal](https://learn.microsoft.com/en-us/azure/search/search-get-started-skillset) | 0.10 | Quickstart for creating a skillset in the portal is a basic tutorial; it introduces skills and shows wizard usage, but does not typically include detailed parameter tables, quotas, or troubleshooting mappings. 
| | [Training](https://learn.microsoft.com/en-us/azure/search/resource-training) | 0.10 | Navigation/overview page linking to training modules; contains no product-specific configuration, limits, or troubleshooting details. | | [What is document-level security?](https://learn.microsoft.com/en-us/azure/search/search-document-level-access-overview) | 0.10 | Described as a conceptual overview of document-level permissions. Likely explains concepts and scenarios without detailed configuration parameters, role names, or error mappings. | -| [What is multimodal search?](https://learn.microsoft.com/en-us/azure/search/multimodal-search-overview) | 0.10 | The multimodal search overview focuses on concepts (what multimodal search is, typical pipeline steps, and high-level guidance). It does not appear to contain product-specific configuration parameters, limits, or troubleshooting mappings, so it does not qualify as expert knowledge under the defined sub-skill types. | | [What's new](https://learn.microsoft.com/en-us/azure/search/whats-new) | 0.10 | Release notes and what's-new overview; does not focus on structured limits, configuration matrices, troubleshooting mappings, or other expert-knowledge patterns defined in the sub-skill types. 
diff --git a/products/azure-container-apps/azure-container-apps.csv b/products/azure-container-apps/azure-container-apps.csv index 56e0ac9c..62602a46 100644 --- a/products/azure-container-apps/azure-container-apps.csv +++ b/products/azure-container-apps/azure-container-apps.csv @@ -8,7 +8,7 @@ https://learn.microsoft.com/en-us/azure/container-apps/authentication-entra,Micr https://learn.microsoft.com/en-us/azure/container-apps/authentication-facebook,Facebook,Enable authentication and authorization in Azure Container Apps with Facebook,Enable Facebook authentication in Azure Container Apps,Learn to use the built-in Facebook authentication provider in Azure Container Apps.,"This article explains how to configure Azure Container Apps to use Facebook as an authentication provider. To follow the procedure in this article, you need a Facebook account with a verified email address and a mobile phone number. To create a new Facebook account, go to facebook.com.",2025-05-29T08:00:00.000Z,how-to,security,0.85,True,Shows how to configure Facebook as an auth provider with Container Apps–specific settings and callback configuration.,unchanged https://learn.microsoft.com/en-us/azure/container-apps/authentication-github,GitHub,Enable authentication and authorization in Azure Container Apps with GitHub,Configure GitHub authentication for Azure Container Apps,Learn to use the built-in GitHub authentication provider in Azure Container Apps.,"This article shows how to configure Azure Container Apps to use GitHub as an authentication provider. To complete the procedure in this article, you need a GitHub account. To create a new GitHub account, go to GitHub.",2026-03-23T08:00:00.000Z,how-to,security,0.78,True,"Step-by-step configuration of GitHub as an auth provider for Azure Container Apps.
This necessarily includes provider-specific settings, callback URLs, and Azure auth configuration fields, which are product-specific security and identity configuration details.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/authentication-google,Google,Enable authentication and authorization in Azure Container Apps with Google,Configure Google authentication for Azure Container Apps,Learn to use the built-in Google authentication provider in Azure Container Apps.,"This article shows you how to configure Azure Container Apps to use Google as an authentication provider. To complete the following procedure, you must have a Google account that has a verified email address. To create a new Google account, go to accounts.google.com.",2026-03-27T17:43:00.000Z,how-to,security,0.7,True,"Provider-specific auth configuration for Container Apps (Google) is typically documented with concrete settings such as callback URLs, client ID/secret fields, and platform-specific parameters. These are product- and provider-specific security configuration details that qualify as expert knowledge beyond generic OAuth concepts.",unchanged -https://learn.microsoft.com/en-us/azure/container-apps/authentication-openid,Custom OpenID Connect,Enable authentication and authorization in Azure Container Apps with a Custom OpenID Connect provider,Configure custom OpenID Connect auth for Container Apps,Learn to use the built-in Custom OpenID Connect authentication provider in Azure Container Apps.,"This article shows you how to configure Azure Container Apps to use a custom authentication provider that adheres to the OpenID Connect specification. OpenID Connect (OIDC) is an industry standard widely adopted by many identity providers (IDPs). You don't need to understand the details of the specification in order to configure your app to use an adherent IDP. You can configure your app to use one or more OIDC providers.
Each must be given a unique alphanumeric name in the configuration, and onl",2026-04-14T17:11:00.000Z,how-to,security,0.78,True,"Page describes concrete configuration of Azure Container Apps built-in auth with a custom OpenID Connect provider, including provider naming rules and app configuration details. This is product-specific authentication configuration rather than generic OIDC theory, fitting the security category.",updated +https://learn.microsoft.com/en-us/azure/container-apps/authentication-openid,Custom OpenID Connect,Enable authentication and authorization in Azure Container Apps with a Custom OpenID Connect provider,Configure custom OpenID Connect auth for Container Apps,Learn to use the built-in Custom OpenID Connect authentication provider in Azure Container Apps.,"This article shows you how to configure Azure Container Apps to use a custom authentication provider that adheres to the OpenID Connect specification. OpenID Connect (OIDC) is an industry standard widely adopted by many identity providers (IDPs). You don't need to understand the details of the specification in order to configure your app to use an adherent IDP. You can configure your app to use one or more OIDC providers. Each must be given a unique alphanumeric name in the configuration, and onl",2026-04-14T17:11:00.000Z,how-to,security,0.78,True,"Page describes concrete configuration of Azure Container Apps built-in auth with a custom OpenID Connect provider, including provider naming rules and app configuration details.
This is product-specific authentication configuration rather than generic OIDC theory, fitting the security category.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/authentication-twitter,X,Enable authentication and authorization in Azure Container Apps with X,Configure X (Twitter) authentication for Azure Container Apps,Learn to use the built-in X authentication provider in Azure Container Apps.,"This article shows how to configure Azure Container Apps to use X as an authentication provider. To complete the procedure in this article, you need an X account that has a verified email address and phone number. To create a new X account, go to x.com.",2026-03-27T17:43:00.000Z,how-to,security,0.7,True,"X/Twitter auth integration for Container Apps generally includes specific configuration fields, redirect URLs, and provider options unique to this product and provider. These are concrete, product-specific security configuration steps rather than generic OAuth guidance.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/azure-arc-create-container-app,2 - Create container app,Tutorial: Create a container app on Azure Arc,,Get started with Azure Container Apps on Azure Arc-enabled Kubernetes deploying your first app.,"In this tutorial, you create a Container app to an Azure Arc-enabled Kubernetes cluster and learn to:",2025-05-02T08:00:00.000Z,tutorial,,0.3,False,"Tutorial for creating a container app on Azure Arc-enabled Kubernetes; likely step-by-step deployment/usage instructions without detailed config tables, limits, or error-code-based troubleshooting.
Does not clearly match any expert-knowledge sub-skill type from the summary.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/azure-arc-enable-cluster,1 - Set up Azure Arc-enabled Kubernetes clusters,Tutorial: Enable Azure Container Apps on Azure Arc-enabled Kubernetes,Enable Azure Container Apps on Arc-enabled Kubernetes clusters,Tutorial: learn how to set up Azure Container Apps in your Azure Arc-enabled Kubernetes clusters.,"With Azure Arc-enabled Kubernetes clusters, you can create a Container Apps enabled custom location in your on-premises or cloud Kubernetes cluster to deploy your Azure Container Apps applications as you would any other region. This tutorial shows how to enable Azure Container Apps on an Azure Arc–enabled Kubernetes cluster. In this tutorial, you:",2025-09-10T22:40:00.000Z,tutorial,deployment,0.7,True,"Tutorial to enable Container Apps on Arc will include specific deployment requirements, supported cluster types/versions, and configuration constraints unique to this deployment model.",unchanged @@ -35,10 +35,10 @@ https://learn.microsoft.com/en-us/azure/container-apps/cors,CORS,Configure CORS https://learn.microsoft.com/en-us/azure/container-apps/custom-domains-certificates,Set up custom domain with existing certificate,Custom Domain Names and Certificates in Azure Container Apps,Manage custom domains and TLS certificates in Container Apps,Learn how to manage custom domain names and certificates in Azure Container Apps.,"Azure Container Apps allows you to bind one or more custom domains to a container app.
Note If you configure a custom environment DNS (Domain Name System) suffix, you can't add a custom domain that contains this suffix to your container app.",2026-02-26T08:00:00.000Z,how-to,security,0.7,True,Details managing custom domains and certificates with platform-specific constraints (DNS suffix interactions) for secure ingress.,unchanged https://learn.microsoft.com/en-us/azure/container-apps/custom-domains-managed-certificates,Custom domain with a free certificate,Custom Domain Names and Free Managed Certificates in Container Apps,Configure custom domains and managed certificates in Container Apps,Learn how to configure custom domain names and managed certificates in Azure Container Apps.,"Azure Container Apps allows you to bind one or more custom domains to a container app. You can automatically configure a free managed certificate for your custom domain when your container app is publicly accessible from the DigiCert IP addresses. If you want to set up a custom domain that uses your own certificate, see Custom domain names and bring your own certificates in Azure Container Apps. Note If you configure a custom environment DNS suffix, you can't add a custom domain that contains this ",2026-02-20T18:12:00.000Z,how-to,security,0.75,True,"Covers binding custom domains and automatic managed certificates with constraints (e.g., DigiCert IP requirement, DNS suffix caveat); product-specific security configuration.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/custom-virtual-networks,Virtual network configuration,Configuring virtual networks Azure Container Apps environments,Configure custom virtual networks for Container Apps,Learn how to configure virtual networks in Azure Container Apps.,"A virtual network creates a secure boundary around your Azure Container Apps environment. By default, environments are created with a VNet that is automatically generated.
However, using an existing VNet provides more Azure networking features such as integration with Application Gateway, Network Security Groups, and communication with resources behind private endpoints. This configuration is important for enterprise customers who need to isolate internal, mission-critical applications from the p",2025-07-01T22:25:00.000Z,concept-article,configuration,0.8,True,"Explains using existing VNets, integration with Application Gateway, NSGs, and private endpoints for Container Apps, which is detailed networking configuration.",unchanged -https://learn.microsoft.com/en-us/azure/container-apps/dapr-authentication-token,Enable token authentication for Dapr requests,Enable Token Authentication for Dapr Requests,Use Dapr APP_API_TOKEN for Container Apps requests,Learn how to enable token authentication for Dapr requests to your container app in Azure Container Apps.,"When Dapr is enabled for your application in Azure Container Apps, it injects the environment variable APP_API_TOKEN into your app's container. Dapr includes the same token in all requests sent to your app, as either: The token is randomly generated and unique per each app and app revision. It can also change at any time. Your application should read the token from the APP_API_TOKEN environment variable when it starts up to ensure that it's using the correct token. You can use this token to authentica",2026-04-16T17:19:00.000Z,how-to,security,0.82,True,"Page documents the APP_API_TOKEN environment variable injected by Dapr in Azure Container Apps, how the token is generated, and how to use it to authenticate incoming Dapr requests.
This is a product-specific authentication mechanism with concrete variable names and behavior, fitting the security category.",updated -https://learn.microsoft.com/en-us/azure/container-apps/dapr-component-connect-services,Connect to Azure or partner services,Connect to other Azure or partner services via Dapr components,Secure Dapr component connections to Azure services,Learn more about connecting Dapr components with Azure and non-Microsoft services.,"Securely establish connections to Azure and non-Microsoft services for Dapr components by using managed identity or Azure Key Vault secret stores. Before getting started, learn more about the offered support for Dapr components.",2026-03-27T08:00:00.000Z,concept-article,security,0.7,True,"Page covers how to securely connect Dapr components to Azure and partner services using managed identity and Azure Key Vault secret stores. This involves product-specific identity and secret configuration, aligning with the security category.",updated +https://learn.microsoft.com/en-us/azure/container-apps/dapr-authentication-token,Enable token authentication for Dapr requests,Enable Token Authentication for Dapr Requests,Use Dapr APP_API_TOKEN for Container Apps requests,Learn how to enable token authentication for Dapr requests to your container app in Azure Container Apps.,"When Dapr is enabled for your application in Azure Container Apps, it injects the environment variable APP_API_TOKEN into your app's container. Dapr includes the same token in all requests sent to your app, as either: The token is randomly generated and unique per each app and app revision. It can also change at any time. Your application should read the token from the APP_API_TOKEN environment variable when it starts up to ensure that it's using the correct token.
You can use this token to authentica",2026-04-16T17:19:00.000Z,how-to,security,0.82,True,"Page documents the APP_API_TOKEN environment variable injected by Dapr in Azure Container Apps, how the token is generated, and how to use it to authenticate incoming Dapr requests. This is a product-specific authentication mechanism with concrete variable names and behavior, fitting the security category.",unchanged +https://learn.microsoft.com/en-us/azure/container-apps/dapr-component-connect-services,Connect to Azure or partner services,Connect to other Azure or partner services via Dapr components,Secure Dapr component connections to Azure services,Learn more about connecting Dapr components with Azure and non-Microsoft services.,"Securely establish connections to Azure and non-Microsoft services for Dapr components by using managed identity or Azure Key Vault secret stores. Before getting started, learn more about the offered support for Dapr components.",2026-03-27T08:00:00.000Z,concept-article,security,0.7,True,"Page covers how to securely connect Dapr components to Azure and partner services using managed identity and Azure Key Vault secret stores. This involves product-specific identity and secret configuration, aligning with the security category.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/dapr-component-connection,Connect to Azure services via Azure portal,Connect to Azure Services via Dapr Components in the Azure Portal,Create Dapr components for Azure services in portal,Learn how to easily create Dapr components by using Azure Container Apps in the Azure portal.,"You can easily connect Dapr APIs to backing Azure services by using a combination of Service Connector and Dapr. This feature creates Dapr components on your behalf with valid metadata and authenticated identity to access the Azure service.
In this guide, you connect the Dapr publish and subscribe (pub/sub) API to an Azure Service Bus:",2026-02-12T23:11:00.000Z,how-to,integrations,0.76,True,"Shows how Service Connector and Dapr work together to create components with valid metadata and identities (example: Service Bus pub/sub), which is concrete integration configuration.",unchanged -https://learn.microsoft.com/en-us/azure/container-apps/dapr-component-resiliency,Dapr component resiliency,Dapr Component Resiliency (Preview) - Azure Container Apps,Configure Dapr component resiliency in Container Apps,Learn how to make your Dapr components resilient in Azure Container Apps.,"Resiliency policies proactively prevent, detect, and recover from your container app failures. In this article, you learn how to apply resiliency policies for applications that use Dapr to integrate with different cloud services. Such as, state stores, pub/sub message brokers, secret stores, and more. You can configure resiliency policies like retries, timeouts, and circuit breakers for the following outbound and inbound operation directions by using a Dapr component: The following screenshot sh",2026-03-27T08:00:00.000Z,concept-article,configuration,0.7,True,"Page explains how to configure resiliency policies (retries, timeouts, circuit breakers) for Dapr components in Azure Container Apps. It focuses on product-specific configuration of resiliency settings rather than generic resiliency concepts, matching the configuration category.",updated +https://learn.microsoft.com/en-us/azure/container-apps/dapr-component-resiliency,Dapr component resiliency,Dapr Component Resiliency (Preview) - Azure Container Apps,Configure Dapr component resiliency in Container Apps,Learn how to make your Dapr components resilient in Azure Container Apps.,"Resiliency policies proactively prevent, detect, and recover from your container app failures. 
In this article, you learn how to apply resiliency policies for applications that use Dapr to integrate with different cloud services. Such as, state stores, pub/sub message brokers, secret stores, and more. You can configure resiliency policies like retries, timeouts, and circuit breakers for the following outbound and inbound operation directions by using a Dapr component: The following screenshot sh",2026-03-27T08:00:00.000Z,concept-article,configuration,0.7,True,"Page explains how to configure resiliency policies (retries, timeouts, circuit breakers) for Dapr components in Azure Container Apps. It focuses on product-specific configuration of resiliency settings rather than generic resiliency concepts, matching the configuration category.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/dapr-components,Components overview,Dapr Components in Azure Container Apps,Configure Dapr components in Container Apps environments,Learn more about how Dapr components work on your Azure Container App service to develop applications.,"Dapr uses a modular design where functionality is delivered as a component. The use of Dapr components is optional and dictated exclusively by the needs of your application.
Dapr components in container apps are environment-level resources that: In this guide, you learn how to configure Dapr components for your Azure Container Apps services.",2026-02-18T23:18:00.000Z,concept-article,configuration,0.78,True,"Explains environment-level Dapr components, their lifecycle, and configuration in Container Apps, which is detailed component configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/dapr-functions-extension,Deploy using the Dapr extension for Azure Functions,Deploy the Dapr Extension for Azure Functions - Azure Container Apps,Use Azure Functions Dapr extension in Container Apps,Learn how to use and deploy the Azure Functions with Dapr extension in your Dapr-enabled container apps.,"The Dapr extension for Azure Functions allows you to easily interact with the Dapr APIs from an Azure Function by using triggers and bindings. In this guide, you learn how to:",2026-02-18T23:18:00.000Z,how-to,integrations,0.76,True,"Shows how to integrate Azure Functions with Dapr APIs via triggers/bindings in Container Apps, involving specific binding/trigger configuration parameters unique to this integration.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/dapr-keda-scaling,Scale Dapr apps with KEDA using Bicep,Scale Dapr Applications with KEDA Scalers Using Bicep - Azure Container Apps,Scale Dapr apps with KEDA scalers using Bicep,Find out how you can use a KEDA scaler to scale a container app and its Dapr sidecar in Azure Container Apps based on the message count in a service bus topic.,"Azure Container Apps automatically scales applications to zero when there's no incoming HTTP traffic. However, you can also use other scaling triggers for apps that have non-HTTP traffic. For example, apps that use the Distributed Application Runtime (Dapr) publish and subscribe (pub/sub) and bindings building block APIs can benefit from event-driven scaling.
In these scenarios, you can use Kubernetes event-driven autoscaling (KEDA) scalers to scale your application and its Dapr sidecar out and in, ",2025-12-10T23:18:00.000Z,concept-article,configuration,0.7,True,"Describes using KEDA scalers with Dapr apps in Container Apps, including scaling behavior for sidecars and message-count triggers, which is product-specific scaling configuration.",unchanged @@ -62,9 +62,10 @@ https://learn.microsoft.com/en-us/azure/container-apps/functions-container-apps, https://learn.microsoft.com/en-us/azure/container-apps/functions-keda-mappings,KEDA scaling mappings reference,Azure Functions KEDA scaling mappings on Container Apps,Map Functions triggers to KEDA scaling in Container Apps,Learn how Azure Functions trigger parameters map to KEDA scaling parameters for autoscaling in Azure Container Apps.,"When you deploy Azure Functions on Azure Container Apps, the platform automatically translates your Functions trigger parameters into KEDA scaling configurations. This translation ensures that your Functions scale appropriately based on the incoming workload from various event sources.",2026-03-12T05:52:00.000Z,reference,configuration,0.8,True,"Explains how Azure Functions trigger parameters map to KEDA scaling parameters when running on Azure Container Apps. This is a product-specific mapping of configuration parameters (Functions trigger settings to KEDA scaler settings), which is expert configuration knowledge not obvious from general training data. Fits configuration because it defines how specific parameters translate and should be set for autoscaling.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/functions-manage,Manage a Functions app,Manage your Azure Functions in Container Apps,Manage Azure Functions instances in Container Apps via CLI,Learn to manage functions in Container Apps,"You can manage your deployed functions within Azure Container Apps by using the Azure CLI.
The following commands help you list, inspect, and interact with the functions running in your containerized environment. Note When dealing with multirevision scenarios, add the --revision parameter to your command to target a specific revision.",2026-01-13T23:18:00.000Z,how-to,configuration,0.7,True,"Shows CLI commands and parameters to list and interact with functions, including revision targeting; these are product-specific management/configuration commands.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/functions-overview,Functions,Azure Functions on Azure Container Apps overview,,Learn how Azure Functions works with autoscaling rules in Azure Container Apps.,"Azure Functions on Azure Container Apps offers a fully managed serverless hosting environment that brings together the event-driven capabilities of Azure Functions with the robust features of Container Apps. This integration includes advanced capabilities such as Kubernetes-based orchestration, built-in autoscaling powered by KEDA (Kubernetes-based Event Driven Autoscaling), Dapr (Distributed Application Runtime) integration, GPU workload support, sidecar support, virtual network (VNet) connecti",2026-03-31T08:00:00.000Z,how-to,,0.2,False,"Overview of Azure Functions on Container Apps and autoscaling; summary suggests conceptual integration description without specific limits, configuration tables, or error mappings.",unchanged +https://learn.microsoft.com/en-us/azure/container-apps/functions-secrets-tutorial,Manage secrets for Azure Functions on Azure Container Apps,Use secrets with Azure Functions in Azure Container Apps,Secure secrets and host keys for Functions in Container Apps,Learn how to store app-level secrets and manage Functions host keys for Azure Functions apps running in Azure Container Apps.,"Azure Functions running in Azure Container Apps involves two distinct secrets scenarios that you configure differently: App-level secrets are values your function code consumes at
runtime, such as database connection strings or API keys. Store these secrets as Container Apps secrets, either directly or through Azure Key Vault references. Functions host keys are authentication keys (master, host, and function keys) that the Functions runtime uses to secure HTTP-triggered endpoints. Configure a stor",2026-04-20T22:11:00.000Z,how-to,security,0.7,True,"Page focuses on handling app-level secrets and Functions host keys when running Azure Functions inside Azure Container Apps. It describes product-specific security configuration patterns (using Container Apps secrets, Key Vault references, and Functions host key storage/management) that are unique to this integration scenario, going beyond generic concepts.",new https://learn.microsoft.com/en-us/azure/container-apps/functions-unified-platform,Run event-driven and batch workloads,Run event-driven and batch workloads with Azure Functions on Azure Container Apps,,"Learn how Azure Functions on Azure Container Apps enables you to run event-driven, batch, and API workloads with the flexibility of containers and the scalability of serverless computing.","Azure Functions on Azure Container Apps combines the productivity of Function-as-a-Service (FaaS) with the flexibility of containerized hosting. This integration allows you to deploy event-driven functions as continuous services while maintaining the ability to handle finite workloads with definitive start and end times. 
This platform uses a rich set of triggers and bindings and incorporates advanced Azure Container Apps features, enabling it to execute virtually any containerized workload.",2025-11-21T06:02:00.000Z,overview,,0.4,False,High-level overview of Functions on Container Apps for event-driven and batch workloads; summary doesn’t show detailed configuration or limits.,unchanged https://learn.microsoft.com/en-us/azure/container-apps/functions-usage,Azure portal,Create an Azure Functions app with auto scaling rules on Azure Container Apps,,Learn to create an Azure Functions app preconfigured with auto scaling rules in Azure Container Apps.,"This article shows you how to create an Azure Functions app in Azure Container Apps, complete with preconfigured autoscaling rules.",2026-01-13T23:18:00.000Z,how-to,,0.3,False,"Shows how to create an Azure Functions app with autoscaling rules, but described as a how-to; summary doesn’t indicate detailed KEDA rule tables or numeric thresholds.",unchanged -https://learn.microsoft.com/en-us/azure/container-apps/get-started,Command line,Quickstart: Deploy your first container app with containerapp up,,Deploy your first application to Azure Container Apps using the Azure CLI containerapp up command.,"The Azure Container Apps service enables you to run microservices and containerized applications on a serverless platform. With Container Apps, you enjoy the benefits of running containers while you leave behind the concerns of manually configuring cloud infrastructure and complex container orchestrators. 
In this quickstart, you create and deploy your first container app using the az containerapp up command.",2026-04-14T17:11:00.000Z,quickstart,,0.2,False,"Quickstart tutorial focused on first deployment using az containerapp up; likely step-by-step commands without detailed config tables, limits, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/container-apps/get-started,Command line,Quickstart: Deploy your first container app with containerapp up,,Deploy your first application to Azure Container Apps using the Azure CLI containerapp up command.,"The Azure Container Apps service enables you to run microservices and containerized applications on a serverless platform. With Container Apps, you enjoy the benefits of running containers while you leave behind the concerns of manually configuring cloud infrastructure and complex container orchestrators. In this quickstart, you create and deploy your first container app using the az containerapp up command.",2026-04-14T17:11:00.000Z,quickstart,,0.2,False,"Quickstart tutorial focused on first deployment using az containerapp up; likely step-by-step commands without detailed config tables, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/github-actions,GitHub Actions,Publish revisions with GitHub Actions in Azure Container Apps,Automate Container Apps revisions with GitHub Actions,Learn to automatically create new revisions in Azure Container Apps using a GitHub Actions workflow.,"Azure Container Apps allows you to use GitHub Actions to publish revisions to your container app. As commits are pushed to your GitHub repository, a workflow is triggered which updates the container image in the container registry. Azure Container Apps creates a new revision based on the updated container image. The GitHub Actions workflow triggers when you commit to a specific branch in your repository. 
When creating the workflow, you decide which branch triggers the workflow. This article shows ",2025-05-19T17:08:00.000Z,how-to,deployment,0.75,True,Describes GitHub Actions workflow that updates images and triggers new revisions; includes product-specific CI/CD workflow configuration.,unchanged https://learn.microsoft.com/en-us/azure/container-apps/github-actions-cli,GitHub Actions with Azure CLI,Generate GitHub Actions workflow with Azure CLI in Azure Container Apps,Generate Container Apps GitHub Actions via Azure CLI,Learn to automatically create GitHub Actions workflow in Azure Container Apps,"Azure Container Apps allows you to use GitHub Actions to publish revisions to your container app. As commits are pushed to your GitHub repository, a GitHub Actions workflow is triggered which updates the container image in the container registry. Once the container is updated in the registry, Azure Container Apps creates a new revision based on the updated container image. The GitHub Actions workflow runs when there are commits to a specific branch in your repository. You choose which branch trigger",2025-02-03T08:00:00.000Z,how-to,deployment,0.75,True,Focuses on generating a GitHub Actions workflow tailored to Container Apps using CLI; contains specific workflow and deployment configuration.,unchanged https://learn.microsoft.com/en-us/azure/container-apps/gpu-serverless-overview,Serverless GPUs,Using serverless GPUs in Azure Container Apps,,Learn how to use GPUs with apps and jobs in Azure Container Apps.,"Azure Container Apps provides access to GPUs on demand without you having to manage the underlying infrastructure. As a serverless feature, you pay only for GPUs in use. When enabled, the number of GPUs used for your app rises and falls to meet the load demands of your application. 
Serverless GPUs enable you to seamlessly run your workloads with automatic scaling, optimized cold start, per-second billing with scale down to zero when not in use, and reduced operational overhead. Serverless GPUs a",2025-11-18T17:01:00.000Z,how-to,,0.35,False,"Overview of serverless GPUs; describes benefits and behavior but summary doesn’t show numeric limits, config tables, or decision matrices.",unchanged @@ -83,13 +84,13 @@ https://learn.microsoft.com/en-us/azure/container-apps/java-admin-eureka-integra https://learn.microsoft.com/en-us/azure/container-apps/java-application-performance-management-config,Configure application performance management (APM) Java agent with init-container,Configure the APM Java agent with Init Containers - Azure Container Apps,Configure Java APM agent with init containers in Container Apps,Learn how to configure the Application Performance Management (APM) Java agent with init containers in Azure Container Apps.,"In this tutorial, you configure the Application Performance Management (APM) Java agent with init containers in Azure Container Apps. APM helps power observability for your container apps. You can package the APM plugin in the same image or Dockerfile with your app, but it binds together the management concerns, like release and Common Vulnerabilities and Exposures (CVE) mitigation. 
Rather than binding the concerns together, you can use the Java agent and init containers in Azure Container Apps ",2025-02-03T08:00:00.000Z,tutorial,configuration,0.7,True,"Details init container and Java agent configuration, including container definitions, env vars, and arguments specific to Container Apps’ APM pattern.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/java-build-environment-variables,Build environment variables (preview),Build environment variables for Java in Azure Container Apps,Configure Java build environment variables in Container Apps,Learn about Java image build from source code via environment variables.,"Azure Container Apps uses Buildpacks to automatically create a container image that you can use to deploy your source code directly to the cloud. To take control of your build configuration, use environment variables to customize parts of your build like the JDK, Maven, and Tomcat. The following article shows you how to configure environment variables to help you take control over builds that automatically create a container for you.",2025-11-07T23:24:00.000Z,conceptual,configuration,0.8,True,"Explicitly about environment variables controlling buildpacks (JDK, Maven, Tomcat) with names and allowed values—clear configuration reference content.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/java-component-logs,Query managed component logs,Observability of managed Java components in Azure Container Apps,Access logs for managed Java components in Container Apps,Learn how to retrieve logs of managed Java components in Azure Container Apps.,"Java components include built-in observability features that can give you a holistic view of Java component health throughout its lifecycle. 
In this tutorial, you learn how to query log messages generated by a Java component.",2025-06-16T08:00:00.000Z,how-to,configuration,0.65,True,"Explains how to query component logs, including log categories, queries, and locations specific to managed Java components on this platform.",unchanged -https://learn.microsoft.com/en-us/azure/container-apps/java-config-server,Connect to Config Server for Spring,Connect to a managed Config Server for Spring in Azure Container Apps,Connect Azure Container Apps to managed Config Server for Spring,Learn how to connect a Config Server for Spring to your container app.,"Config Server for Spring provides a centralized location to make configuration data available to multiple applications. In this article, you learn to connect an app hosted in Azure Container Apps to a Java Config Server for Spring instance. The Config Server for Spring Java component uses a GitHub repository as the source for configuration settings. Configuration values are made available to your container app via a binding between the component and your container app. As values change in the co",2026-03-30T08:00:00.000Z,tutorial,integrations,0.7,True,"The article describes binding a Container App to a Config Server for Spring component, using GitHub as a backing store and exposing configuration via bindings. This requires product-specific connection and binding parameters, making it an integrations-focused page.",updated +https://learn.microsoft.com/en-us/azure/container-apps/java-config-server,Connect to Config Server for Spring,Connect to a managed Config Server for Spring in Azure Container Apps,Connect Azure Container Apps to managed Config Server for Spring,Learn how to connect a Config Server for Spring to your container app.,"Config Server for Spring provides a centralized location to make configuration data available to multiple applications. 
In this article, you learn to connect an app hosted in Azure Container Apps to a Java Config Server for Spring instance. The Config Server for Spring Java component uses a GitHub repository as the source for configuration settings. Configuration values are made available to your container app via a binding between the component and your container app. As values change in the co",2026-03-30T08:00:00.000Z,tutorial,integrations,0.7,True,"The article describes binding a Container App to a Config Server for Spring component, using GitHub as a backing store and exposing configuration via bindings. This requires product-specific connection and binding parameters, making it an integrations-focused page.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/java-containers-intro,Introduction to containers,Introduction to Containers for Java Applications,,Learn the basics of using containers for Java applications.,"Containers provide a consistent, portable environment for your Java applications across development, testing, and production stages. This article introduces containerization concepts for Java applications and guides you through creating, debugging, optimizing, and deploying containerized Java applications to Azure Container Apps. 
In this article, you learn essential containerization concepts for Java developers and the following skills: By containerizing your Java applications, you get consisten",2025-04-24T05:12:00.000Z,conceptual,,0.1,False,Intro to containers for Java is conceptual and generic; not specific to Container Apps configuration or limits.,unchanged https://learn.microsoft.com/en-us/azure/container-apps/java-dynamic-log-level,Set dynamic logger level,Set dynamic logger level to troubleshoot Java applications in Azure Container Apps (preview),Use dynamic log levels to troubleshoot Java apps on Container Apps,Learn how to use dynamic logger level settings to debug your Java applications running on Azure Container Apps.,"Azure Container Apps platform offers a built-in diagnostics tool exclusively for Java developers to help them debug and troubleshoot their Java applications running on Azure Container Apps more easily and efficiently. One of the key features is a dynamic logger level change, which allows you to access log details that are hidden by default. When enabled, log information is collected without code modifications or forcing you to restart your app when changing log levels. Before getting started, yo",2025-05-29T22:04:00.000Z,how-to,troubleshooting,0.7,True,Focuses on a diagnostics tool for Java with dynamic logger level changes; includes platform-specific commands/settings and symptom-to-diagnosis logging patterns.,unchanged -https://learn.microsoft.com/en-us/azure/container-apps/java-eureka-server,Connect to Eureka Server for Spring,Connect to a managed Eureka Server for Spring in Azure Container Apps,Integrate Azure Container Apps with managed Eureka Server for Spring,Learn how to use a managed Eureka Server for Spring in Azure Container Apps.,"Eureka Server for Spring is a service registry that allows microservices to register themselves and discover other services. Eureka Server for Spring is available as an Azure Container Apps component. 
You can bind your container app to Eureka Server for Spring for automatic registration with the Eureka server. In this tutorial, you learn how to: Important This tutorial uses services that can affect your Azure bill. If you decide to follow along, make sure that you delete the resources featured i",2026-03-30T08:00:00.000Z,tutorial,integrations,0.7,True,"Connecting Container Apps to a managed Eureka Server for Spring involves binding configuration, connection parameters, and possibly SDK or Spring configuration properties specific to this Azure component. This is a concrete integration pattern between Container Apps and Eureka, fitting integrations.",updated +https://learn.microsoft.com/en-us/azure/container-apps/java-eureka-server,Connect to Eureka Server for Spring,Connect to a managed Eureka Server for Spring in Azure Container Apps,Integrate Azure Container Apps with managed Eureka Server for Spring,Learn how to use a managed Eureka Server for Spring in Azure Container Apps.,"Eureka Server for Spring is a service registry that allows microservices to register themselves and discover other services. Eureka Server for Spring is available as an Azure Container Apps component. You can bind your container app to Eureka Server for Spring for automatic registration with the Eureka server. In this tutorial, you learn how to: Important This tutorial uses services that can affect your Azure bill. If you decide to follow along, make sure that you delete the resources featured i",2026-03-30T08:00:00.000Z,tutorial,integrations,0.7,True,"Connecting Container Apps to a managed Eureka Server for Spring involves binding configuration, connection parameters, and possibly SDK or Spring configuration properties specific to this Azure component. 
This is a concrete integration pattern between Container Apps and Eureka, fitting integrations.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/java-eureka-server-highly-available,Create a highly available Eureka server component cluster,Tutorial: Create a Highly Available Eureka Server Component Cluster in Azure Container Apps,Design a highly available Eureka cluster on Container Apps,Find out how to create a highly available Eureka service in Azure Container Apps. Go through steps for linking Eureka server instances to form a cluster.,"In this tutorial, you find out how to create a Eureka service that's designed to remain operational in the face of failures and high demand. Building a highly available Eureka service helps ensure the service registry that you use for Azure Container Apps is always available to clients regardless of demand. Achieving high availability status for Eureka includes linking multiple Eureka server instances together so they form a cluster. The cluster provides resources so that if one Eureka server fa",2025-11-21T06:02:00.000Z,tutorial,architecture-patterns,0.7,True,Focuses on HA design for Eureka using multiple instances and clustering; product-specific pattern for resilience and scaling of the service registry.,unchanged -https://learn.microsoft.com/en-us/azure/container-apps/java-feature-switch,Turn on Java features,How to turn on Java features in Azure Container Apps,Enable Java-optimized features in Azure Container Apps,How to turn on Java features to use Java-optimized support in Azure Container Apps.,"This guide provides step-by-step instructions for enabling key Java features in Azure Container Apps. 
By activating these features, you can optimize your Java applications for performance, monitoring, and ease of development.",2026-03-30T08:00:00.000Z,how-to,configuration,0.65,True,"Turning on Java features in Container Apps generally requires setting specific configuration flags, annotations, or environment variables for Java runtimes, monitoring, and performance tuning. These are product-specific configuration steps rather than generic Java guidance, so it aligns with configuration.",updated -https://learn.microsoft.com/en-us/azure/container-apps/java-gateway-for-spring,Connect to Gateway for Spring,Connect to a managed Gateway for Spring in Azure Container Apps (preview),Use managed Gateway for Spring with Azure Container Apps,Learn how to connect a Gateway for Spring to your container app.,"Gateway for Spring offers an efficient and powerful way to route, manage, and handle API requests as part of a microservices architecture. It serves as an API Gateway that routes external requests to different services, adding various capabilities such as filtering, load balancing, and more. In this article, you learn how to create a gateway that directs requests to your container apps. In this tutorial, you learn to: Important This tutorial uses services that can affect your Azure bill. If you ",2026-03-31T08:00:00.000Z,tutorial,integrations,0.7,True,"Creating and wiring a Gateway for Spring to route requests to Container Apps is an integration scenario with product-specific configuration (gateway definitions, routing to apps, bindings). 
This is a concrete integration and coding pattern between Container Apps and Gateway for Spring, so it fits integrations.",updated +https://learn.microsoft.com/en-us/azure/container-apps/java-feature-switch,Turn on Java features,How to turn on Java features in Azure Container Apps,Enable Java-optimized features in Azure Container Apps,How to turn on Java features to use Java-optimized support in Azure Container Apps.,"This guide provides step-by-step instructions for enabling key Java features in Azure Container Apps. By activating these features, you can optimize your Java applications for performance, monitoring, and ease of development.",2026-03-30T08:00:00.000Z,how-to,configuration,0.65,True,"Turning on Java features in Container Apps generally requires setting specific configuration flags, annotations, or environment variables for Java runtimes, monitoring, and performance tuning. These are product-specific configuration steps rather than generic Java guidance, so it aligns with configuration.",unchanged +https://learn.microsoft.com/en-us/azure/container-apps/java-gateway-for-spring,Connect to Gateway for Spring,Connect to a managed Gateway for Spring in Azure Container Apps (preview),Use managed Gateway for Spring with Azure Container Apps,Learn how to connect a Gateway for Spring to your container app.,"Gateway for Spring offers an efficient and powerful way to route, manage, and handle API requests as part of a microservices architecture. It serves as an API Gateway that routes external requests to different services, adding various capabilities such as filtering, load balancing, and more. In this article, you learn how to create a gateway that directs requests to your container apps. In this tutorial, you learn to: Important This tutorial uses services that can affect your Azure bill. 
If you ",2026-03-31T08:00:00.000Z,tutorial,integrations,0.7,True,"Creating and wiring a Gateway for Spring to route requests to Container Apps is an integration scenario with product-specific configuration (gateway definitions, routing to apps, bindings). This is a concrete integration and coding pattern between Container Apps and Gateway for Spring, so it fits integrations.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/java-get-started,Artifacts (preview),Launch Your First Java Application in Azure Container Apps using a WAR or JAR File,,Shows how to deploy a Java project in Azure Container Apps using a WAR or JAR file.,"This article shows you how to deploy the Spring PetClinic sample application to Azure Container Apps using a web application archive (WAR) file or a Java Archive (JAR) file. There are several options available for deploying Java applications, including the following options: By the end of this tutorial, you deploy a web application that you can manage through the Azure portal. The following screenshot shows the home page of the PetClinic application deployed to Azure Container Apps:",2025-03-06T05:32:00.000Z,quickstart,,0.3,False,"Tutorial deploying WAR/JAR; focuses on deployment steps, not detailed configuration options or expert-only limits.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/java-get-started-dockerfile,Dockerfile,Launch Your First Java application in Azure Container Apps Using a Dockerfile,,Learn how to deploy a Java project in Azure Container Apps using a Dockerfile.,"This article shows you how to deploy the Spring PetClinic sample application to Azure Container Apps using a Dockerfile. There are several options available for deploying Java applications, including the following options: By the end of this tutorial, you deploy a web application that you can manage through the Azure portal. 
The following screenshot shows the home page of the PetClinic application deployed to Azure Container Apps:",2025-03-06T05:32:00.000Z,quickstart,,0.3,False,Step-by-step tutorial deploying a sample via Dockerfile; likely uses generic commands without deep config matrices or limits.,unchanged https://learn.microsoft.com/en-us/azure/container-apps/java-get-started-github-repository,GitHub repository,Launch your First Java Application in Azure Container Apps Using a GitHub Repository,,Learn how to deploy a Java project in Azure Container Apps using a GitHub Repository.,"This article shows you how to deploy the Spring PetClinic sample application to Azure Container Apps using a GitHub repository. There are several options available for deploying Java applications, including the following options: By the end of this tutorial, you deploy a web application that you can manage through the Azure portal. The following screenshot shows the home page of the PetClinic application deployed to Azure Container Apps:",2025-03-06T05:32:00.000Z,quickstart,,0.3,False,Tutorial deploying via GitHub repo; mostly workflow steps rather than configuration reference or quotas.,unchanged @@ -98,7 +99,7 @@ https://learn.microsoft.com/en-us/azure/container-apps/java-metrics,Java metrics https://learn.microsoft.com/en-us/azure/container-apps/java-metrics-scale-with-keda,Scale with Java metrics,Tutorial: Scale a container app with Java metrics,,Scale a container app with Java metrics.,"Azure Container Apps manages automatic horizontal scaling through a set of declarative scaling rules. You can create your own scale rules with customized event sources. In this tutorial, you add a custom scale rule to scale your container app with Java metrics and observe how your application scales.",2026-03-25T08:00:00.000Z,tutorial,,0.35,False,"Tutorial on adding a custom KEDA scale rule using Java metrics. 
The summary suggests a how-to flow rather than a catalog of configuration parameters, limits, or diagnostic mappings; likely general tutorial content rather than expert reference-style knowledge.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/java-metrics-with-grafana,Build a Java metrics dashboard with Azure Managed Grafana,Tutorial: Build a Java metrics dashboard with Azure Managed Grafana,,Learn to build a Java metrics dashboard with Azure Managed Grafana.,"In this tutorial, you learn how to set up a metrics dashboard using Azure Managed Grafana to monitor Java applications running in Azure Container Apps. Grafana is a popular tool for centralized metrics visualization and monitoring in the observability industry. Azure Managed Grafana is a fully managed Azure service that allows you to deploy and manage Grafana dashboards with seamless Azure integration. You can use Azure Managed Grafana to visualize Java metrics exposed by Azure Container Apps or",2026-03-27T17:43:00.000Z,tutorial,,0.3,False,"Tutorial on building a Java metrics dashboard with Azure Managed Grafana. From the summary, it appears to be a step-by-step example, not a reference of configuration options, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/java-microservice-get-started,Launch Java microservice apps,Launch a Java Microservice Application with Managed Java Components - Azure Container Apps,,Learn how to deploy a Java microservice project in Azure Container Apps with managed Java components.,"This article explains how to deploy an application in Azure Container Apps that uses Java components to handle configuration management, service discovery, and health and metrics. The sample application shown in this example is the Java PetClinic, which uses the microservice architecture pattern. 
The following diagram depicts the architecture of the PetClinic application on Azure Container Apps: Diagram of an Azure Container Apps environment showing the architecture of four microservices-based a",2026-04-01T06:12:00.000Z,quickstart,,0.2,False,"Tutorial-style deployment of a Java microservice (PetClinic) on Container Apps. From the summary, it appears to be architectural and tutorial guidance without detailed configuration tables, limits, or product-specific diagnostic/security settings.",unchanged -https://learn.microsoft.com/en-us/azure/container-apps/java-overview,Overview,Java on Azure Container Apps overview,,Learn about the tools and resources needed to run Java applications on Azure Container Apps.,"Azure Container Apps can run any containerized Java application in the cloud while giving flexible options for how you deploy your applications. When you use Container Apps for your containerized Java applications, you get: Cost effective scaling: When you use the Consumption plan, your Java apps can scale to zero. Scaling in when there's little demand for your app automatically drives costs down for your projects. Deployment options: Azure Container Apps integrates with Buildpacks, which allows y",2024-11-19T08:00:00.000Z,conceptual,,0.2,False,Java overview content describing benefits and options; lacks detailed configuration parameters or quotas.,unchanged +https://learn.microsoft.com/en-us/azure/container-apps/java-overview,Overview,Java on Azure Container Apps overview,,Learn about the tools and resources needed to run Java applications on Azure Container Apps and about advantages of using Container Apps for Java applications.,"Azure Container Apps can run any containerized Java application in the cloud while giving flexible options for how you deploy your applications. When you use Container Apps for your containerized Java applications, you get: Cost effective scaling: When you use the Consumption plan, your Java apps can scale to zero. 
Scaling in when there's little demand for your app automatically drives costs down for your projects. Deployment options: Azure Container Apps integrates with Buildpacks, which allows y",2026-04-24T22:19:00.000Z,overview,,0.1,False,"High-level overview of running Java on Azure Container Apps with benefits and general deployment options; no specific limits, configuration tables, error codes, or decision matrices that meet the expert-knowledge criteria.",updated https://learn.microsoft.com/en-us/azure/container-apps/java-petclinic-ai-overview,Overview,Introduction to the Java PetClinic AI Sample in Azure Container Apps,Understand AI-enabled PetClinic architecture on Container Apps,Explains the architecture of AI applications deployed to Azure Container Apps.,"The Spring PetClinic sample is a classic reference application that demonstrates the use of Spring Boot with Java. This tutorial features an AI-enhanced version built on Azure Container Apps that extends the traditional PetClinic management system with modern AI capabilities. The application you build in this tutorial is an AI chat assistant that uses Retrieval Augmented Generation (RAG). To connect to Azure OpenAI Service, the application uses Spring AI SDKs to support the web application. For ",2025-02-12T23:02:00.000Z,concept-article,architecture-patterns,0.65,True,Explains a specific RAG-based AI architecture on Container Apps with Azure OpenAI and Spring AI; product-specific pattern and component interactions beyond generic RAG concepts.,unchanged https://learn.microsoft.com/en-us/azure/container-apps/java-petclinic-ai-tutorial,Deploy the PetClinic AI sample,Deploy an AI-Enabled Instance of the Spring PetClinic on Azure Container Apps,,Use the azd automation tool to deploy a sample AI application to Azure Container Apps.,"In this article, you learn how to use Azure OpenAI Service and Azure Container Apps to create a natural language interface for the Spring PetClinic sample application. 
For information on the architectural details of this application, see Java PetClinic AI sample in Container Apps overview.",2025-02-12T23:02:00.000Z,tutorial,,0.4,False,"Tutorial using azd to deploy the AI sample; mainly procedural, with architecture details deferred to another article.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/javascript-overview,Overview,JavaScript on Azure Container Apps overview,,Learn about the tools and resources needed to run JavaScript applications on Azure Container Apps.,Azure Container Apps can run any containerized JavaScript application in the cloud while giving flexible options for how you deploy your applications.,2025-02-25T23:02:00.000Z,concept-article,,0.2,False,High-level JavaScript overview; mostly conceptual and marketing-style description of capabilities.,unchanged @@ -113,7 +114,7 @@ https://learn.microsoft.com/en-us/azure/container-apps/logging,Application loggi https://learn.microsoft.com/en-us/azure/container-apps/manage-secrets,Manage secrets,Manage secrets in Azure Container Apps,Manage and use secrets in Azure Container Apps,Learn to store and consume sensitive configuration values in Azure Container Apps.,"Azure Container Apps allows your application to securely store sensitive configuration values. Once secrets are defined at the application level, secured values are available to revisions in your container apps. Additionally, you can reference secured values inside scale rules. For information on using secrets with Dapr, refer to Dapr integration. An updated or deleted secret doesn't automatically affect existing revisions in your app. When a secret is updated or deleted, you can respond to chang",2026-04-03T17:16:00.000Z,how-to,security,0.65,True,"Secret management in Container Apps involves product-specific behaviors (app-level secrets, how revisions see updates, use in scale rules, interaction with Dapr). 
These are concrete, platform-specific security/configuration details and edge cases that go beyond generic secret management concepts.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/managed-identity,Use managed identities,Managed identities in Azure Container Apps,Use managed identities with Azure Container Apps,Using managed identities in Container Apps,"A managed identity from Microsoft Entra ID allows your container app to access other Microsoft Entra protected resources. For more about managed identities in Microsoft Entra ID, see Managed identities for Azure resources. Your container app can be granted two types of identities:",2025-06-03T22:06:00.000Z,how-to,security,0.78,True,"Covers Container Apps–specific use of system- and user-assigned managed identities, including how identities are attached and used to access Entra-protected resources, which is product-specific IAM configuration.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/managed-identity-image-pull,Azure Container Registry image pull with managed identity,Azure Container Apps image pull from Azure Container Registry with managed identity,Configure ACR image pulls via managed identity,Set up Azure Container Apps to authenticate Azure Container Registry image pulls with managed identity,"You can pull images from private repositories in Microsoft Azure Container Registry using managed identities for authentication to avoid the use of administrative credentials. You can use a user-assigned or system-assigned managed identity to authenticate with Azure Container Registry. Container Apps checks for a new version of the image whenever a container is started. In Docker or Kubernetes terminology, Container Apps sets each container's image pull policy to always.
This article describes ho",2025-02-05T18:02:00.000Z,how-to,security,0.76,True,"Explains Container Apps image pull behavior (always pull policy) and how to wire managed identities to ACR pulls, which is a product-specific security/auth configuration detail.",unchanged -https://learn.microsoft.com/en-us/azure/container-apps/mcp-authentication,Authentication,Secure MCP servers on Azure Container Apps,Secure MCP servers on Azure Container Apps with Entra ID,Learn how to authenticate and authorize MCP servers on Azure Container Apps using Microsoft Entra ID or API key authentication.,This article explains how to authenticate and secure MCP servers running on Azure Container Apps. The approach differs depending on whether you host astandalone container appor use theplatform-managed MCP serverin dynamic sessions.,2026-04-13T22:10:00.000Z,how-to,security,0.7,True,"Article is specifically about authenticating and authorizing MCP servers using Microsoft Entra ID or API keys; likely includes concrete auth flows, scopes, and role/permission details unique to Container Apps MCP scenarios.",updated +https://learn.microsoft.com/en-us/azure/container-apps/mcp-authentication,Authentication,Secure MCP servers on Azure Container Apps,Secure MCP servers on Azure Container Apps with Entra ID,Learn how to authenticate and authorize MCP servers on Azure Container Apps using Microsoft Entra ID or API key authentication.,This article explains how to authenticate and secure MCP servers running on Azure Container Apps. 
The approach differs depending on whether you host a standalone container app or use the platform-managed MCP server in dynamic sessions.,2026-04-13T22:10:00.000Z,how-to,security,0.7,True,"Article is specifically about authenticating and authorizing MCP servers using Microsoft Entra ID or API keys; likely includes concrete auth flows, scopes, and role/permission details unique to Container Apps MCP scenarios.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/mcp-choosing-azure-service,Choose a hosting option,Choose an Azure service for your MCP server,Select Azure hosting service for MCP servers,"Compare Azure Container Apps, App Service, Azure Functions, and Azure Kubernetes Service (AKS) for hosting MCP servers and pick the best fit for your workload.","You can use several Azure services to host Model Context Protocol (MCP) servers. This guide helps you choose the right service based on your workload requirements, team expertise, and operational needs. Note This article compares multiple Azure services to help you make a hosting decision. For Container Apps-specific details, see MCP servers on Azure Container Apps.",2026-03-27T17:43:00.000Z,product-comparison,decision-making,0.8,True,"Explicitly compares Azure Container Apps, App Service, Functions, and AKS for hosting MCP servers to help choose the right service based on workload, expertise, and operational needs. This is service-selection guidance with scenario-based recommendations, fitting decision-making.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/mcp-overview,Overview,Host MCP servers on Azure Container Apps,,Learn how to host MCP servers on Azure Container Apps as standalone container apps or with dynamic sessions.,"Model Context Protocol (MCP) is an open standard that connects AI applications to external data sources and tools.
By using MCP, AI clients like GitHub Copilot can discover and invoke capabilities you expose, turning your APIs, databases, and business logic into tools an AI agent can use through natural language. Azure Container Apps supports two hosting models for MCP servers:",2026-02-27T06:11:00.000Z,conceptual,,0.35,False,"Overview of hosting MCP servers and models; summary doesn’t show detailed configs, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/mcp-troubleshooting,Troubleshooting,Troubleshoot MCP servers on Azure Container Apps,Troubleshoot MCP server issues on Azure Container Apps,"Diagnose and fix common issues with MCP servers on Azure Container Apps, including connection, protocol, and authentication problems.","This article covers common problems you might encounter when deploying and consuming MCP servers on Azure Container Apps. The problems are organized by category and apply to both standalone container apps and platform-managed dynamic sessions, unless noted otherwise.",2026-02-27T06:11:00.000Z,troubleshooting,troubleshooting,0.9,True,"Explicit troubleshooting article for MCP servers, organized around common problems and resolutions, likely including error messages and diagnostic steps.",unchanged @@ -136,7 +137,7 @@ https://learn.microsoft.com/en-us/azure/container-apps/mtls,Use mTLS,Use mTLS in https://learn.microsoft.com/en-us/azure/container-apps/networking,Overview,Networking in Azure Container Apps environment,Understand networking model for Container Apps environments,Learn about virtual networks in Azure Container Apps.,"Azure Container Apps operate in the context of an environment, which runs its own virtual network.
As you create an environment, there are a few key considerations that inform the networking capabilities of your container apps:",2025-07-01T22:25:00.000Z,conceptual,configuration,0.7,True,"Explains environment-level virtual network behavior and key networking considerations specific to Container Apps, which informs how to configure networking for apps.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/observability,Overview,Observability in Azure Container Apps,,Monitor your running app in Azure Container Apps,"Azure Container Apps provides several built-in observability features that together give you a holistic view of your container app’s health throughout its application lifecycle. These features help you monitor and diagnose the state of your app to improve performance and respond to trends and critical problems. These features include: Note While not a built-in feature, Azure Monitor Application Insights is a powerful tool to monitor your web and background applications. Although Container Apps doe",2025-11-06T08:00:00.000Z,concept-article,,0.4,False,"Observability overview listing features; lacks detailed config tables, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/opentelemetry-agents,OpenTelemetry agents,Collect and read OpenTelemetry data in Azure Container Apps,Configure OpenTelemetry data agents in Azure Container Apps,Learn to record and query data collected using OpenTelemetry in Azure Container Apps (preview).,"Using an OpenTelemetry data agent with your Azure Container Apps environment, you can choose to send observability data in an OpenTelemetry format by: Piping data from an agent into a desired endpoint. Destination options include Azure Monitor Application Insights, Datadog, and any OpenTelemetry Protocol (OTLP)-compatible endpoint. Easily changing destination endpoints without having to reconfigure how they emit data, and without having to manually run an OpenTelemetry agent.
This article shows yo",2026-03-31T08:00:00.000Z,how-to,configuration,0.65,True,"Shows how to use an OpenTelemetry data agent with Azure Container Apps and route data to various endpoints. This typically involves product-specific configuration options (agent settings, destination endpoints, formats) that are not generic and align with configuration/integration patterns.",unchanged -https://learn.microsoft.com/en-us/azure/container-apps/overview,About Azure Container Apps,Azure Container Apps overview,,Learn about common scenarios and uses for Azure Container Apps.,"Azure Container Apps is a serverless platform that allows you to maintain less infrastructure and save costs while running containerized applications. Instead of worrying about server configuration, container orchestration, and deployment details, Container Apps provides all the up-to-date server resources required to keep your applications stable and secure. Common uses of Azure Container Apps include: Additionally, applications built on Azure Container Apps can dynamically scale based on the f",2026-03-31T08:00:00.000Z,overview,,0.1,False,"High-level overview of Azure Container Apps scenarios and benefits; no specific limits, configs, error codes, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/container-apps/overview,About Azure Container Apps,Azure Container Apps overview,,Learn about common scenarios and uses for Azure Container Apps.,"Azure Container Apps is a serverless platform that allows you to maintain less infrastructure and save costs while running containerized applications. Instead of worrying about server configuration, container orchestration, and deployment details, Container Apps provides all the up-to-date server resources required to keep your applications stable and secure. 
Common uses of Azure Container Apps include: Additionally, applications built on Azure Container Apps can dynamically scale based on the f",2026-03-31T08:00:00.000Z,overview,,0.1,False,"High-level overview of Azure Container Apps scenarios and benefits; no specific limits, configs, error codes, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/planned-maintenance,Planned maintenance,Azure Container Apps planned maintenance,Configure planned maintenance windows for Container Apps,Configure system-level planned maintenance in Azure Container Apps,"Azure Container Apps is a fully managed service where platform and infrastructure updates are regularly and automatically applied to both components and environments. The Container Apps update system is designed to minimize the effect on performance of your apps during updates. By defining maintenance windows, you can designate the most advantageous times for your application. Defining a maintenance window allows you to decide the range of time when noncritical updates are applied to your Contai",2025-09-23T22:10:00.000Z,how-to,configuration,0.7,True,"Covers defining maintenance windows and how updates are applied; likely includes specific settings/parameters for maintenance configuration, fitting configuration.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/plans,Plans,Azure Container Apps Plan Types,Compare Azure Container Apps plan types,Compare the plans that are available in Azure Container Apps.,Azure Container Apps features two plan types:,2026-01-29T23:12:00.000Z,concept-article,decision-making,0.7,True,Explicitly compares available plan types; such plan/SKU comparison content is decision-making guidance.,unchanged https://learn.microsoft.com/en-us/azure/container-apps/policy-reference,Azure policy definitions,Built-in policy definitions for Azure Container Apps,Use built-in Azure Policy definitions for Container Apps,Lists Azure Policy built-in policy 
definitions for Azure Container Apps. These built-in policy definitions provide common approaches to managing your Azure resources.,"This page is an index of Azure Policy built-in policy @@ -149,6 +150,7 @@ https://learn.microsoft.com/en-us/azure/container-apps/quickstart-portal,Azure p https://learn.microsoft.com/en-us/azure/container-apps/quickstart-repo-to-cloud,Code repository,Quickstart: Build and deploy from a repository to Azure Container Apps,,Build your container app from a code repository and deploy in Azure Container Apps using az containerapp up.,"This article demonstrates how to build and deploy a microservice to Azure Container Apps from a GitHub repository using the programming language of your choice. In this quickstart, you create a sample microservice, which represents a backend web API service that returns a static collection of music albums. This sample application is available in two versions. One version includes a container, where the source contains a Dockerfile. The other version has no Dockerfile. Select the version that bes",2025-02-05T18:02:00.000Z,quickstart,,0.25,False,Quickstart for repo-to-cloud deployment; tutorial-style without detailed settings tables or tier comparisons.,unchanged https://learn.microsoft.com/en-us/azure/container-apps/quota-requests,Make quota requests,Request quota changes for Azure Container Apps,Request quota increases for Azure Container Apps,Learn about how and where to submit a quota request for Azure Container Apps.,"Azure Container Apps has default quotas and limits that apply to your resources. As your application needs grow, you might need to increase these limits. This article explains how to request quota changes for Azure Container Apps through integrated and manual request processes.
Follow these procedures when you need to expand your resource capacity beyond the default limits.",2025-09-19T05:15:00.000Z,how-to,limits-quotas,0.7,True,"Explains how to change default limits; while process-focused, it depends on and references specific quota values and scopes from the platform’s quota system.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/quotas,Quotas,Quotas for Azure Container Apps,Review quotas and limits for Azure Container Apps,Learn about quotas for Azure Container Apps.,"Azure Container Apps assigns different quota types to different scopes. In addition to the subscription scope, quotas also apply to region, environment, and application scopes. All quota requests are initiated using Azure Quota Management System (QMS), which features two options for making quota requests. Note Azure Container Apps is a production grade service designed for at-scale workloads. Making a quota request that escalates to the support team isn't out of the norm, but part of the process",2025-10-31T11:10:00.000Z,conceptual,limits-quotas,0.9,True,"Explicit quotas article; will list numeric limits by scope (subscription, region, environment, app) and possibly tables—canonical limits & quotas reference.",unchanged +https://learn.microsoft.com/en-us/azure/container-apps/relocate-region,Relocate to another region,Relocate Azure Container Apps to another region,Relocate Azure Container Apps across regions,Learn how to relocate an Azure Container Apps workload to a different Azure region by redeploying the managed environment and container apps.,"This article describes how to relocate an Azure Container Apps workload to another Azure region by recreating the managed environment and container apps in the target region. The process covers exporting application configuration, redeploying resources, validating the deployment, and cleaning up resources in the source region. Important Azure Container Apps doesn't support in-place region migration. 
You recreate resources in the target region - existing resources aren't moved. This article covers ",2026-04-22T06:17:00.000Z,how-to,deployment,0.68,True,"Describes a product-specific, non-obvious deployment/migration pattern for Azure Container Apps between regions, including the fact that in-place region migration is not supported and that workloads must be recreated in the target region. This is concrete, service-specific deployment/migration guidance rather than a generic tutorial.",new https://learn.microsoft.com/en-us/azure/container-apps/revisions,Revisions,Update and deploy changes in Azure Container Apps,,Learn how to use revisions to make changes in Azure Container Apps.,"Change management can be challenging as you develop containerized applications in the cloud. Ultimately, you need the support to track changes, ensure uptime, and have mechanisms to handle smooth rollbacks. Change management in Azure Container Apps is powered by revisions, which are a snapshot of each version of your container app. Key characteristics of revisions include: Immutable: Once established, a revision remains unchangeable. Versioned: Revisions act as a record of the container app's ve",2025-10-27T08:00:00.000Z,concept-article,,0.4,False,Explains revisions conceptually and their characteristics; summary doesn’t show concrete config parameters or decision matrices.,unchanged https://learn.microsoft.com/en-us/azure/container-apps/revisions-manage,Revision management,Manage revisions in Azure Container Apps,Configure and manage Container Apps revisions,Manage revisions in Azure Container Apps,"Azure Container Apps allows your container app to support multiple revisions. With this feature, you can activate and deactivate revisions, and control the amount of traffic sent to each revision. To learn more about revisions, see Revisions in Azure Container Apps. A revision is created when you first deploy your application.
New revisions are created when you update your application with revision-scope changes. You can also update your container app based on a specific revision. This article descri",2025-02-03T08:00:00.000Z,conceptual,configuration,0.7,True,"Explains how to activate/deactivate revisions, route traffic, and update by revision; likely includes specific configuration fields and commands for revision management.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/rule-based-routing,Use rule-based routing,Use rule-based routing in Azure Container Apps,Configure rule-based HTTP routing in Container Apps,Learn how to use rule-based routing in Azure Container Apps.,"In this article, you learn how to use rule-based routing with Azure Container Apps. With rule-based routing, you create a fully qualified domain name (FQDN) on your container apps environment. You then use rules to route requests for this FQDN to different container apps, depending on the path of each request.",2026-01-19T06:14:00.000Z,tutorial,configuration,0.7,True,"Routing article will include route rule schema, host/path matching, and configuration fields for environment-level FQDN routing—product-specific config details beyond generic L7 routing concepts.",unchanged @@ -161,15 +163,15 @@ https://learn.microsoft.com/en-us/azure/container-apps/serverless-gpu-nim,Deploy https://learn.microsoft.com/en-us/azure/container-apps/service-connector,.NET app with Blob Storage,Connect a container app to a cloud service with Service Connector,Connect Container Apps to Azure services with Service Connector,Learn to connect a container app to an Azure service using the Azure portal or the CLI.,"Azure Container Apps allows you to use Service Connector to connect to cloud services in just a few steps. Service Connector manages the configuration of the network settings and connection information between different services. To view all supported services, learn more about Service Connector.
In this article, you learn to connect a container app to Azure Blob Storage. Important Support for Service Connector (preview) on Azure Container Apps ends on March 30, 2026. After that date, new service",2025-09-14T22:14:00.000Z,how-to,integrations,0.75,True,"Shows how to connect to Blob Storage via Service Connector; this feature has product-specific connection configuration and parameters, fitting integrations.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/service-discovery-resiliency,Service discovery resiliency,Service discovery resiliency (preview) - Azure Container Apps,Configure service discovery resiliency policies,Learn how to apply container app to container app resiliency when using the application's service name in Azure Container Apps.,"With Azure Container Apps resiliency, you can proactively prevent, detect, and recover from service request failures by using simple resiliency policies. In this article, you learn how to configure Azure Container Apps resiliency policies when initiating requests by using Azure Container Apps service discovery. Note Currently, you can't apply resiliency policies to requests made by using the Dapr Service Invocation API. Each request to a container app enforces policies. 
You can tailor policies t",2025-11-07T08:00:00.000Z,conceptual,configuration,0.7,True,"Describes Container Apps–specific resiliency policy configuration (per-request policies, supported directions, and limitations like no Dapr service invocation support), which are detailed configuration behaviors.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/session-pool,Session pools,Use session pools in Azure Container Apps,,Learn to use and manage session pools in Azure Container Apps.,Session pools provide subsecond session allocation times and manage the lifecycle of each session.,2026-04-06T22:10:00.000Z,how-to,,0.3,False,"Summary suggests a conceptual/usage explanation of session pools (subsecond allocation and lifecycle management) without clear indication of numeric limits, configuration parameter tables, or troubleshooting/error mappings. Likely a usage/overview page rather than expert-knowledge content as defined by the sub-skill types.",unchanged -https://learn.microsoft.com/en-us/azure/container-apps/sessions,Overview,Dynamic sessions in Azure Container Apps,,Learn about dynamic sessions in Azure Container Apps.,"Azure Container Apps dynamic sessions provide fast access to secure sandboxed environments that are ideal for running code or applications that require strong isolation from other workloads. Dynamic sessions offer prewarmed environments throughsession poolsthat start the container in milliseconds, scale on demand, and maintain strong isolation. 
This makes them ideal for interactive workloads, running LLM generated scripts, and secure execution of custom code.",2026-04-14T22:21:00.000Z,concept-article,,0.3,False,"Conceptual description of dynamic sessions and their use cases; summary does not indicate presence of numeric limits, config parameter tables, or troubleshooting/error code mappings.",updated +https://learn.microsoft.com/en-us/azure/container-apps/sessions,Overview,Dynamic sessions in Azure Container Apps,,Learn about dynamic sessions in Azure Container Apps.,"Azure Container Apps dynamic sessions provide fast access to secure sandboxed environments that are ideal for running code or applications that require strong isolation from other workloads. Dynamic sessions offer prewarmed environments through session pools that start the container in milliseconds, scale on demand, and maintain strong isolation. This makes them ideal for interactive workloads, running LLM generated scripts, and secure execution of custom code.",2026-04-14T22:21:00.000Z,concept-article,,0.3,False,"Conceptual description of dynamic sessions and their use cases; summary does not indicate presence of numeric limits, config parameter tables, or troubleshooting/error code mappings.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/sessions-code-interpreter,Code interpreter sessions,Serverless Code Interpreter Sessions in Azure Container Apps,,Learn how to run a serverless code interpreter session in Azure Container Apps.,Azure Container Apps dynamic sessions provide fast and scalable access to a code interpreter.
Each code interpreter session is fully isolated by a Hyper-V boundary and is designed to run untrusted code.,2026-03-13T22:12:00.000Z,how-to,,0.3,False,"Describes serverless code interpreter sessions and isolation model; appears to be a usage/tutorial-style article without explicit limits, configuration matrices, or detailed troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/container-apps/sessions-custom-container,Custom container sessions,Custom container sessions in Azure Container Apps,,Learn to run custom container session in Azure Container Apps.,"In addition to the built-in code interpreter that Azure Container Apps dynamic sessions provide, you can also use custom containers to define your own session sandboxes. Note This article applies only to custom container session pools. Unless noted, features described here aren't available for code interpreter session pools.",2026-03-13T22:12:00.000Z,conceptual,,0.3,False,"Explains using custom containers for dynamic sessions and notes scope limitations, but summary does not show concrete configuration tables, numeric limits, or error-resolution content.",unchanged -https://learn.microsoft.com/en-us/azure/container-apps/sessions-tutorial-autogen,AutoGen,Tutorial: Use code interpreter sessions in AutoGen with Azure Container Apps,Integrate AutoGen code interpreter sessions with Azure Container Apps,Learn to use code interpreter sessions in AutoGen on Azure Container Apps.,"AutoGenis a framework for developing large language model (LLM) applications using multiple agents that converse with each other to solve tasks. Agents built with AutoGen can operate in various modes that employ combinations of LLMs, human inputs, and tools. One important type of tool for AutoGen agents is code executors. They enable agents to perform complex tasks by writing and executing code. 
By integrating Azure Container Apps dynamic sessions with AutoGen, you give the agent acode interpret",2026-03-31T08:00:00.000Z,tutorial,integrations,0.65,True,"Tutorial on wiring AutoGen agents to Azure Container Apps dynamic sessions as a code interpreter; likely includes product-specific configuration parameters, endpoint settings, and SDK usage patterns for this integration.",updated +https://learn.microsoft.com/en-us/azure/container-apps/sessions-custom-container,Custom container sessions,Custom container sessions in Azure Container Apps,Configure custom container sessions in Azure Container Apps,"Learn about custom container sessions in Azure Container Apps, including container probes for session pools.","In addition to the built-in code interpreter that Azure Container Apps dynamic sessions provide, you can also use custom containers to define your own session sandboxes. Note This article applies only to custom container session pools. Unless otherwise noted, features described here aren't available for code interpreter session pools.",2026-04-24T22:19:00.000Z,concept-article,configuration,0.7,True,"The page describes how to configure custom container-based session pools in Azure Container Apps, including product-specific settings like container probes for session pools and behavior differences from built-in code interpreter sessions. These are configuration details unique to this feature rather than generic concepts.",updated
+https://learn.microsoft.com/en-us/azure/container-apps/sessions-tutorial-autogen,AutoGen,Tutorial: Use code interpreter sessions in AutoGen with Azure Container Apps,Integrate AutoGen code interpreter sessions with Azure Container Apps,Learn to use code interpreter sessions in AutoGen on Azure Container Apps.,"AutoGen is a framework for developing large language model (LLM) applications using multiple agents that converse with each other to solve tasks.
Agents built with AutoGen can operate in various modes that employ combinations of LLMs, human inputs, and tools. One important type of tool for AutoGen agents is code executors. They enable agents to perform complex tasks by writing and executing code. By integrating Azure Container Apps dynamic sessions with AutoGen, you give the agent a code interpret",2026-03-31T08:00:00.000Z,tutorial,integrations,0.65,True,"Tutorial on wiring AutoGen agents to Azure Container Apps dynamic sessions as a code interpreter; likely includes product-specific configuration parameters, endpoint settings, and SDK usage patterns for this integration.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/sessions-tutorial-langchain,LangChain,Tutorial: Use code interpreter sessions in LangChain with Azure Container Apps,Use Azure Container Apps code interpreter sessions with LangChain,Learn to use code interpreter sessions in LangChain on Azure Container Apps.,"LangChain is a framework designed to simplify the creation of applications using large language models (LLMs). When you build an AI agent with LangChain, an LLM interprets user input and generates a response. The AI agent often struggles when it needs to perform mathematical and symbolic reasoning to produce a response. By integrating Azure Container Apps dynamic sessions with LangChain, you give the agent a code interpreter to use to perform specialized tasks.
In this tutorial, you learn how to ru",2024-10-11T08:00:00.000Z,tutorial,integrations,0.7,True,Integration tutorial between LangChain agents and dynamic sessions; expected to show concrete configuration and code patterns unique to this integration.,unchanged https://learn.microsoft.com/en-us/azure/container-apps/sessions-tutorial-llamaindex,LlamaIndex,Tutorial: Use code interpreter sessions in LlamaIndex with Azure Container Apps,,Learn to use code interpreter sessions in LlamaIndex on Azure Container Apps.,"LlamaIndex is a powerful framework for building context-augmented language model (LLM) applications. When you build an AI agent with LlamaIndex, an LLM interprets user input and generates a response. The AI agent often struggles when it needs to perform mathematical and symbolic reasoning to produce a response. By integrating Azure Container Apps dynamic sessions with LlamaIndex, you give the agent a code interpreter to use to perform specialized tasks. In this tutorial, you learn how to run a Llam",2026-03-26T08:00:00.000Z,tutorial,,0.3,False,"Tutorial on using code interpreter sessions in LlamaIndex with Azure Container Apps. Likely step-by-step guidance without detailed configuration tables, limits, or product-specific error mappings; primarily instructional, not expert reference content as defined.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/sessions-tutorial-nodejs,JavaScript code interpreter,Tutorial: Run JavaScript code in a code interpreter session in Azure Container Apps,Execute JavaScript via Azure Container Apps dynamic sessions,Learn to use code interpreter sessions to run JavaScript code in Azure Container Apps.,This tutorial demonstrates how to execute JavaScript code in Azure Container Apps dynamic sessions using an HTTP API.
In this tutorial you:,2025-05-19T15:23:00.000Z,tutorial,integrations,0.7,True,Tutorial for running JavaScript through an HTTP API backed by dynamic sessions; includes concrete API usage and configuration patterns.,unchanged https://learn.microsoft.com/en-us/azure/container-apps/sessions-tutorial-python-mcp,Python code interpreter with MCP,Tutorial: Use MCP with dynamic sessions (Python),,Learn how to create a Python session pool with the platform-managed MCP server enabled and execute Python code remotely using Azure Container Apps dynamic sessions.,"Important The platform-managed MCP server for dynamic sessions is in preview. The API version 2025-02-02-preview and mcpServerSettings properties are subject to change. This tutorial shows how to create a session pool with the platform-managed MCP server enabled, connect to it, and execute Python code remotely. Unlike the standalone MCP server tutorials, you don't write or deploy MCP server code. The platform provides built-in tools for Python session pools: In this tutorial, you:",2026-03-13T22:12:00.000Z,tutorial,,0.3,False,"Tutorial-style walkthrough for using platform-managed MCP with dynamic sessions in Azure Container Apps. From the summary, it focuses on how to create a session pool and execute Python code remotely, without exposing configuration tables, limits, quotas, or product-specific error mappings. Lacks the structured expert details required for any sub-skill category.",unchanged -https://learn.microsoft.com/en-us/azure/container-apps/sessions-tutorial-semantic-kernel,Semantic Kernel,Tutorial: Use code interpreter sessions in Semantic Kernel with Azure Container Apps,Use Semantic Kernel code interpreter sessions on Azure Container Apps,Learn to use code interpreter sessions in Semantic Kernel on Azure Container Apps.,"Semantic Kernelis an open-source AI framework created by Microsoft for .NET, Python, and Java developers working with Large Language Models (LLMs).
When you build an AI agent with Semantic Kernel, an LLM interprets user input and generates a response. The AI agent often struggles when it needs to perform mathematical and symbolic reasoning to produce a response. By integrating Azure Container Apps dynamic sessions with Semantic Kernel, you give the agent a code interpreter to use to perform specia",2026-03-31T08:00:00.000Z,tutorial,integrations,0.65,True,"Shows how to integrate Semantic Kernel agents with Azure Container Apps dynamic sessions; expected to contain concrete configuration values, API/SDK parameters, and wiring patterns unique to this integration.",updated
+https://learn.microsoft.com/en-us/azure/container-apps/sessions-tutorial-semantic-kernel,Semantic Kernel,Tutorial: Use code interpreter sessions in Semantic Kernel with Azure Container Apps,Use Semantic Kernel code interpreter sessions on Azure Container Apps,Learn to use code interpreter sessions in Semantic Kernel on Azure Container Apps.,"Semantic Kernel is an open-source AI framework created by Microsoft for .NET, Python, and Java developers working with Large Language Models (LLMs). When you build an AI agent with Semantic Kernel, an LLM interprets user input and generates a response. The AI agent often struggles when it needs to perform mathematical and symbolic reasoning to produce a response. 
By integrating Azure Container Apps dynamic sessions with Semantic Kernel, you give the agent a code interpreter to use to perform specia",2026-03-31T08:00:00.000Z,tutorial,integrations,0.65,True,"Shows how to integrate Semantic Kernel agents with Azure Container Apps dynamic sessions; expected to contain concrete configuration values, API/SDK parameters, and wiring patterns unique to this integration.",unchanged
https://learn.microsoft.com/en-us/azure/container-apps/sessions-tutorial-shell,Shell session,Tutorial: Execute shell commands in a session pool using Azure Container Apps (preview),Run shell commands using Azure Container Apps session pools,Learn to use dynamic sessions to run shell commands in Azure Container Apps.,This tutorial demonstrates how to deploy a shell session pool in Azure Container Apps and execute shell commands using the Dynamic Sessions API and ARM templates. In this tutorial you:,2025-11-18T17:01:00.000Z,tutorial,integrations,0.7,True,Demonstrates deploying a shell session pool and using the Dynamic Sessions API; includes specific API calls and ARM template parameters.,unchanged
https://learn.microsoft.com/en-us/azure/container-apps/sessions-tutorial-shell-mcp,Shell session with MCP,Tutorial: Use MCP with dynamic sessions (shell),Use platform-managed MCP with dynamic shell sessions in Azure Container Apps,Learn how to create a shell session pool with the platform-managed MCP server enabled and execute shell commands remotely using Azure Container Apps dynamic sessions.,"Important The platform-managed MCP server for dynamic sessions is in preview. The API version 2025-02-02-preview and mcpServerSettings properties are subject to change. This tutorial shows how to create a shell session pool with the platform-managed MCP server enabled, connect to it, and execute shell commands remotely - both from the CLI and from GitHub Copilot Chat in VS Code. Unlike the standalone MCP server tutorials, you don't write or deploy MCP server code. 
The platform provides built-in tools ",2026-02-27T06:11:00.000Z,tutorial,integrations,0.75,True,"Shows how to enable the platform-managed MCP server on shell session pools and connect from CLI and Copilot; includes preview API version, mcpServerSettings, and other integration-specific parameters.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/sessions-usage,Usage,Use dynamic sessions in Azure Container Apps,,Learn how to use dynamic sessions in Azure Container Apps.,"Azure Container Apps dynamic sessions offer isolated, secure contexts when you need to run code or applications separately from other workloads. Sessions run inside a session pool which provides immediate access to new and existing sessions. These sessions are ideal for scenarios where user-generated input needs to be processed in a controlled manner or when integrating third-party services that require executing code in an isolated environment. You don't need to deploy a container app resource to u",2026-04-06T22:10:00.000Z,how-to,,0.3,False,"Appears to be a how-to/tutorial style page on using dynamic sessions, without evidence of configuration tables, limits, error codes, or decision matrices. The summary focuses on what sessions are good for and that you don't need to deploy a container app resource, which is useful but not expert-level configuration, limits, or troubleshooting content per the defined categories.",unchanged @@ -181,13 +183,13 @@ https://learn.microsoft.com/en-us/azure/container-apps/storage-mounts-azure-file https://learn.microsoft.com/en-us/azure/container-apps/structure,Overview,Compute and billing structures in Azure Container Apps,Choose Container Apps compute and billing structures,Learn how compute and networking features and billing methods are structured in Azure Container Apps.,"In Azure Container Apps, the environment and plan type you use determines the functionality and billing methods associated with your application. 
This article explains the relationship between plans and workload profiles, and why to consider selecting one over another.",2025-05-21T08:00:00.000Z,conceptual,decision-making,0.7,True,"Explains relationship between plans and workload profiles and when to select each; this is SKU/plan selection guidance with scenario-based recommendations, fitting decision-making.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/token-store,Token store,Enable an authentication token store in Azure Container Apps,Configure token store-backed auth for Container Apps,Learn to secure authentication tokens independent of your application.,"Azure Container Apps authentication supports a feature called token store. A token store is a repository of tokens that are associated with the users of your web apps and APIs. You enable a token store by configuring your container app with an Azure Blob Storage container. Your application code sometimes needs to access data from these providers on the user's behalf, such as: You typically need to write code to collect, store, and refresh tokens in your application. With a token store, you canre",2025-06-30T17:11:00.000Z,how-to,security,0.72,True,"Describes product-specific token store behavior and configuration using Azure Blob Storage, including how tokens are stored and accessed independently of app code, which is detailed implementation guidance rather than generic auth theory.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/traffic-splitting,Configure traffic-splitting,Traffic splitting in Azure Container Apps,Configure traffic splitting across Container Apps revisions,Send a portion of an app's traffic to different revisions in Azure Container Apps.,"By default, when ingress is enabled, all traffic is routed to the latest deployed revision. When you enable multiple revision mode in your container app, you can split incoming traffic between active revisions. 
Traffic splitting is useful for testing updates to your container app. You can use traffic splitting to gradually phase in a new revision in blue-green deployments or in A/B testing. Traffic splitting is based on the weight (percentage) of traffic that is routed to each revision. The combined",2025-06-16T08:00:00.000Z,how-to,configuration,0.74,True,"Describes revision-based traffic weights, constraints (weights sum to 100%), and behavior in multiple revision mode, which is specific routing configuration.",unchanged
-https://learn.microsoft.com/en-us/azure/container-apps/troubleshoot-container-create-failures,Container exits,Troubleshoot Container Exit Failures in Azure Container Apps,Diagnose container exit failures in Azure Container Apps,Learn to troubleshoot container exit events in Azure Container Apps.,"Container exit events indicate that a container is stopped or exited. These exit events can significantly affect the availability, stability, and performance of your container app. The underlying issues that trigger these events can potentially lead to downtime or degraded service. Each event is recorded to provide insights into the container's lifecycle, helping you diagnose issues related to the container's execution. When a container exits, it could exit with a nonzero exit code (indicating f",2026-04-14T22:21:00.000Z,how-to,troubleshooting,0.9,True,"Page is organized around specific container exit symptoms and how to resolve them, likely including Azure Container Apps–specific error messages, causes, and remediation steps. 
This is product-specific troubleshooting guidance rather than generic debugging.",new +https://learn.microsoft.com/en-us/azure/container-apps/troubleshoot-container-create-failures,Container exits,Troubleshoot Container Exit Failures in Azure Container Apps,Diagnose container exit failures in Azure Container Apps,Learn to troubleshoot container exit events in Azure Container Apps.,"Container exit events indicate that a container is stopped or exited. These exit events can significantly affect the availability, stability, and performance of your container app. The underlying issues that trigger these events can potentially lead to downtime or degraded service. Each event is recorded to provide insights into the container's lifecycle, helping you diagnose issues related to the container's execution. When a container exits, it could exit with a nonzero exit code (indicating f",2026-04-14T22:21:00.000Z,how-to,troubleshooting,0.9,True,"Page is organized around specific container exit symptoms and how to resolve them, likely including Azure Container Apps–specific error messages, causes, and remediation steps. This is product-specific troubleshooting guidance rather than generic debugging.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/troubleshoot-container-start-failures,Container start,Troubleshoot start failures in Azure Container Apps,Troubleshoot container start failures in Azure Container Apps,Learn to troubleshoot start failures in Azure Container Apps,Deploying a containerized application to Azure Container Apps for the first time can sometimes result in the app failing to start. 
Use this guide to figure out how to diagnose and fix common issues when your container doesn't start.,2025-03-31T23:30:00.000Z,how-to,troubleshooting,0.8,True,Guides diagnosis of containers that fail to start with Container Apps–specific checks and remedies.,unchanged -https://learn.microsoft.com/en-us/azure/container-apps/troubleshoot-health-probe-failures,Health probe,Troubleshoot Health Probe Failures in Azure Container Apps,Troubleshoot health probe failures in Azure Container Apps,Learn to troubleshoot health probe failures in Azure Container Apps.,"Health probe failures in Azure Container Apps indicate that the container app doesn't pass the required health checks and could be considered unhealthy or unready. Health probes are a great way to determine application health. Specifically, health probes help work around performance issues related to timeouts during container startup, deadlocks when running the container, and serving traffic when the container isn't ready to accept traffic.",2026-04-14T22:21:00.000Z,how-to,troubleshooting,0.9,True,"Focuses on health probe failure scenarios with concrete symptoms and resolutions for Azure Container Apps, mapping probe failures to causes (timeouts, readiness issues) and fixes. This matches the troubleshooting pattern of symptom → cause → solution.",updated +https://learn.microsoft.com/en-us/azure/container-apps/troubleshoot-health-probe-failures,Health probe,Troubleshoot Health Probe Failures in Azure Container Apps,Troubleshoot health probe failures in Azure Container Apps,Learn to troubleshoot health probe failures in Azure Container Apps.,"Health probe failures in Azure Container Apps indicate that the container app doesn't pass the required health checks and could be considered unhealthy or unready. Health probes are a great way to determine application health. 
Specifically, health probes help work around performance issues related to timeouts during container startup, deadlocks when running the container, and serving traffic when the container isn't ready to accept traffic.",2026-04-14T22:21:00.000Z,how-to,troubleshooting,0.9,True,"Focuses on health probe failure scenarios with concrete symptoms and resolutions for Azure Container Apps, mapping probe failures to causes (timeouts, readiness issues) and fixes. This matches the troubleshooting pattern of symptom → cause → solution.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/troubleshoot-image-pull-failures,Image pull,Troubleshoot image pull failures in Azure Container Apps,Fix container image pull failures in Azure Container Apps,Learn to troubleshoot image pull failures in Azure Container Apps,"If Azure can't pull the container image from the specified registry, the container can't be created or executed, leading to a failure in launching the application. An image pull failure is when the Azure platform is unable to download (described as a ""pull"") the container image required by the application.",2026-04-03T17:16:00.000Z,how-to,troubleshooting,0.92,True,"Dedicated to diagnosing and resolving image pull failures, a specific class of errors. Azure Container Apps–specific causes (registry auth, configuration, networking) and their resolutions constitute symptom→cause→solution mappings beyond generic container knowledge.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/troubleshoot-open-container-initiative-errors,Open Container Initiative (OCI),Troubleshooting Open Container Initiative (OCI) errors in Azure Container Apps,Resolve OCI runtime errors in Azure Container Apps,Learn to troubleshoot Open Container Initiative (OCI) errors in Azure Container Apps,"An Open Container Initiative (OCI) runtime error is a failure in the container runtime during the execution or creation of a container. 
Failures can occur at any point in the container lifecycle, such as pulling the image, creating the container, or starting the container. A container create failure happens when the container fails to initialize, pull the image, or run properly. An OCI runtime error directly contributes to this failure as it often occurs during the container creation phase. For ",2026-03-24T08:00:00.000Z,how-to,troubleshooting,0.92,True,"Focused on Open Container Initiative runtime errors in Azure Container Apps, mapping specific failure points (image pull, container create/start) to causes and resolutions. Likely includes concrete error messages and Azure-specific remediation steps, which qualify as expert troubleshooting knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/container-apps/troubleshoot-storage-mount-failures,Storage mount,Troubleshoot Storage Mount Failures in Azure Container Apps,Fix storage mount failures in Azure Container Apps,Learn to troubleshoot storage mount failures in Azure Container Apps.,"Azure Container Apps allows containers to interact with external storage systems, such as Azure Files, Azure Blob Storage, or Azure Managed Disks, for persisting data. These storage resources are mounted to the container's file system during container startup. Storage mount failures occur when your app's container is unable to successfully mount or access required storage resources, such as volumes or file systems.",2026-04-14T22:21:00.000Z,how-to,troubleshooting,0.9,True,"Covers storage mount failure scenarios for Azure Files/Blob/Managed Disks in Container Apps, with product-specific error conditions and corrective actions. 
This is expert troubleshooting content tied to Azure Container Apps storage integration.",updated -https://learn.microsoft.com/en-us/azure/container-apps/troubleshoot-target-port-settings,Target port,Troubleshoot Target Port Settings in Azure Container Apps,Resolve target port misconfiguration in Azure Container Apps,Learn to troubleshoot target port settings in Azure Container Apps.,"Incorrect target port settings in a container app prevent incoming requests from reaching the correct port where the container listens for traffic. This port mismatch stops the container app from routing external traffic to the right application inside the container. Such misconfiguration can cause app downtime or delays in serving requests, reducing service availability. Additionally, if the app scales and the target port is misconfigured, new instances might not function correctly, which can n",2026-04-14T22:21:00.000Z,how-to,troubleshooting,0.85,True,"Describes how incorrect target port settings break traffic routing and scaling, and provides concrete diagnosis and fixes for Azure Container Apps port configuration. This is structured troubleshooting guidance for a specific misconfiguration scenario.",updated +https://learn.microsoft.com/en-us/azure/container-apps/troubleshoot-storage-mount-failures,Storage mount,Troubleshoot Storage Mount Failures in Azure Container Apps,Fix storage mount failures in Azure Container Apps,Learn to troubleshoot storage mount failures in Azure Container Apps.,"Azure Container Apps allows containers to interact with external storage systems, such as Azure Files, Azure Blob Storage, or Azure Managed Disks, for persisting data. These storage resources are mounted to the container's file system during container startup. 
Storage mount failures occur when your app's container is unable to successfully mount or access required storage resources, such as volumes or file systems.",2026-04-14T22:21:00.000Z,how-to,troubleshooting,0.9,True,"Covers storage mount failure scenarios for Azure Files/Blob/Managed Disks in Container Apps, with product-specific error conditions and corrective actions. This is expert troubleshooting content tied to Azure Container Apps storage integration.",unchanged +https://learn.microsoft.com/en-us/azure/container-apps/troubleshoot-target-port-settings,Target port,Troubleshoot Target Port Settings in Azure Container Apps,Resolve target port misconfiguration in Azure Container Apps,Learn to troubleshoot target port settings in Azure Container Apps.,"Incorrect target port settings in a container app prevent incoming requests from reaching the correct port where the container listens for traffic. This port mismatch stops the container app from routing external traffic to the right application inside the container. Such misconfiguration can cause app downtime or delays in serving requests, reducing service availability. Additionally, if the app scales and the target port is misconfigured, new instances might not function correctly, which can n",2026-04-14T22:21:00.000Z,how-to,troubleshooting,0.85,True,"Describes how incorrect target port settings break traffic routing and scaling, and provides concrete diagnosis and fixes for Azure Container Apps port configuration. This is structured troubleshooting guidance for a specific misconfiguration scenario.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/troubleshooting,Overview,Troubleshooting in Azure Container Apps,Diagnose and fix Azure Container Apps runtime issues,Learn to troubleshoot an Azure Container Apps application.,Reviewing Azure Container Apps logs and configuration settings can reveal underlying issues if your container app isn't behaving correctly. 
Use the following guide to help you locate and view details about your container app.,2026-03-31T08:00:00.000Z,how-to,troubleshooting,0.86,True,"Troubleshooting guide for Azure Container Apps that describes how to use specific logs, diagnostics, and configuration checks to identify and resolve issues. Organized around symptoms and platform-specific diagnostic steps, which are product-specific and not just generic debugging advice.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/tutorial-ci-cd-runners-jobs,Deploy self-hosted CI/CD runners with jobs,Tutorial: Run GitHub Actions runners and Azure Pipelines agents with Azure Container Apps jobs,Run self-hosted CI/CD runners with Container Apps jobs,Learn to create self-hosted CI/CD runners and agents with jobs in Azure Container Apps,"GitHub Actions and Azure Pipelines let you run CI/CD workflows with self-hosted runners and agents. You can run self-hosted runners and agents by using event-driven Azure Container Apps jobs. Self-hosted runners are useful when you need to run workflows that require access to local resources or tools that aren't available to a cloud-hosted runner. For example, a self-hosted runner in a Container Apps job allows your workflow to access resources inside the job's virtual network that a cloud-hosted",2025-11-24T23:17:00.000Z,conceptual,deployment,0.7,True,Shows how to host GitHub Actions runners and Azure Pipelines agents as jobs; includes product-specific job configuration for CI/CD scenarios.,unchanged https://learn.microsoft.com/en-us/azure/container-apps/tutorial-code-to-cloud,Deploy a backend microservice app,Tutorial: Build and deploy your app to Azure Container Apps,,Build and deploy your app to Azure Container Apps with az containerapp create command.,This article demonstrates how to build and deploy a microservice to Azure Container Apps from a source repository using your preferred programming language. 
This is the first tutorial in the series of articles that walk you through how to use core capabilities within Azure Container Apps. The first step is to create a back-end web API service that returns a static collection of music albums. Note You can also build and deploy this app using the az containerapp up by following the instructions in t,2025-02-03T08:00:00.000Z,tutorial,,0.4,False,Tutorial for building and deploying a sample microservice; summary doesn’t show advanced configuration matrices or limits.,unchanged @@ -204,7 +206,7 @@ https://learn.microsoft.com/en-us/azure/container-apps/tutorial-scaling,Scale a https://learn.microsoft.com/en-us/azure/container-apps/tutorial-update-from-code,Update a container app deployed from code,Tutorial: Update a container app deployed from source code,,Update and deploy your source code to Azure Container Apps.,"This article demonstrates how to update the container app you created in the previous article, Build and deploy your source code to Azure Container Apps. If you haven't completed the steps in the previous article, stop here and return to this article once all steps are done. In this tutorial you:",2025-02-05T23:02:00.000Z,tutorial,,0.3,False,Tutorial for updating an app; appears to be a continuation of basic deployment steps without advanced config or limits.,unchanged https://learn.microsoft.com/en-us/azure/container-apps/use-azure-firewall,Managing outbound connections with Azure Firewall,Use Azure Firewall with Azure Container Apps,Secure Azure Container Apps egress with Azure Firewall,Learn about securing Azure Container Apps using Azure Firewall.,"This article tells you how to integrate your Azure Container Apps environment with Azure Firewall using user defined routes (UDR). Using UDR, you can control how traffic is routed within your virtual network. 
You can route all outbound traffic from your container apps through Azure Firewall, which provides a central point for monitoring traffic and applying security policies. This setup helps protect your container apps from potential threats. It also helps in meeting compliance requirements by ",2026-04-06T22:10:00.000Z,overview,security,0.7,True,"The article describes product-specific steps to integrate Azure Container Apps with Azure Firewall using user-defined routes, including concrete configuration of routing and firewall usage to secure outbound traffic. This is security-focused configuration guidance with Azure-specific constructs (UDR, virtual network routing, central firewall), which qualifies as expert knowledge beyond generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/user-defined-routes,Enable User Defined Routes (UDR),Container Apps outbound traffic control with Azure Firewall,Control Container Apps outbound traffic via Azure Firewall,"Use Azure Firewall to route outbound traffic from Container Apps to the internet, private IP addresses, and Azure services.","Note This feature is only supported for the default workload profiles environment type. The legacy Consumption-only environment type does not support user defined routes. This article shows you how to use user defined routes (UDR) with Azure Firewall to lock down outbound traffic from your Container Apps to back-end Azure resources or other network resources. Azure creates a default route table for your virtual networks on create. 
By implementing a user-defined route table, you can control how tra",2025-04-03T08:00:00.000Z,article,security,0.82,True,"Details using UDRs with Azure Firewall for outbound traffic control, including environment-type support constraints and route table behavior, which is specific security/networking configuration.",unchanged
-https://learn.microsoft.com/en-us/azure/container-apps/vnet-custom,Use a custom virtual network,Integrate a virtual network with an Azure Container Apps environment,Configure VNet integration for Azure Container Apps environments,Learn how to integrate a virtual network with an Azure Container Apps environment.,The following example shows you how to create a Container Apps environment in an existing virtual network (VNet). Begin by signing in to the Azure portal.,2026-04-02T08:00:00.000Z,how-to,configuration,0.7,True,"VNet integration for Container Apps typically involves product-specific network configuration parameters (subnet requirements, address ranges, delegated subnets, inbound/outbound access behaviors). These are concrete configuration details unique to Azure Container Apps networking rather than generic concepts, so it best fits configuration.",updated
+https://learn.microsoft.com/en-us/azure/container-apps/vnet-custom,Use a custom virtual network,Integrate a virtual network with an Azure Container Apps environment,Configure VNet integration for Azure Container Apps environments,Learn how to integrate a virtual network with an Azure Container Apps environment.,The following example shows you how to create a Container Apps environment in an existing virtual network (VNet). Begin by signing in to the Azure portal.,2026-04-02T08:00:00.000Z,how-to,configuration,0.7,True,"VNet integration for Container Apps typically involves product-specific network configuration parameters (subnet requirements, address ranges, delegated subnets, inbound/outbound access behaviors). 
These are concrete configuration details unique to Azure Container Apps networking rather than generic concepts, so it best fits configuration.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/waf-app-gateway,Configure WAF Application Gateway,Protect Azure Container Apps with Application Gateway and Web Application Firewall (WAF),Protect Container Apps with Application Gateway WAF,Learn how to protect Azure Container Apps with Application Gateway Web Application Firewall (WAF),"When you host your apps or microservices in Azure Container Apps, you might not want to publish them directly to the internet. Instead, you might want to expose them through a reverse proxy. A reverse proxy is a service that sits in front of one or more services. It intercepts and directs incoming traffic to the right destination. With reverse proxies, you can place services in front of your apps that support cross-cutting functionality, including: This article shows how to protect your containe",2026-03-27T08:00:00.000Z,how-to,security,0.65,True,"Describes using Application Gateway as a reverse proxy with Web Application Firewall in front of Azure Container Apps. 
This is a product-specific security configuration/integration pattern for securing ingress, fitting the security sub-skill (and not just a conceptual overview).",unchanged https://learn.microsoft.com/en-us/azure/container-apps/whats-new,What's new,What's new in Azure Container Apps - Azure Container Apps,,Learn more about what's new in Azure Container Apps.,This article lists significant updates and new features available in Azure Container Apps.,2025-05-22T08:00:00.000Z,whats-new,,0.1,False,"What's new page is a changelog/announcements; not a stable expert-knowledge reference for configuration, limits, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/container-apps/workflows-overview,Workflow,Workflow in Azure Container Apps,,Learn about your workflow options for Azure Container Apps.,"Note The term workflow often has multiple meanings. In the context of Durable Functions, you might see workflows referred to as orchestrations. To avoid any confusion with container orchestrations, this article uses the term workflow instead. A workflow is a multi-step operation that usually occurs in a specific order or involves long-running tasks. 
Real-world scenarios requiring workflows include: Events like temporary infrastructure failures or dependency downtime can often interrupt workflow ex",2026-02-12T23:11:00.000Z,overview,,0.3,False,"High-level overview of workflow options and concepts in Container Apps; no concrete limits, configs, or error mappings.",unchanged diff --git a/products/azure-container-apps/report.md b/products/azure-container-apps/report.md index 5e135794..b6c66413 100644 --- a/products/azure-container-apps/report.md +++ b/products/azure-container-apps/report.md @@ -1,15 +1,15 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: configuration: 'Configuring Container Apps runtime: networking, ingress, scaling, revisions, Dapr, Java features, logging/monitoring, storage mounts, workload profiles, - and routing/traffic controls.' - security: 'Securing Container Apps: auth (Entra, social, OIDC, mTLS, certs), secrets/MI, - network controls (VNet, private endpoints, firewall, WAF), TLS/DNS, and security - best practices/policies.' - deployment: 'Deploying and automating Container Apps: CI/CD with GitHub Actions/Azure - Pipelines, Docker Compose migration, self-hosted runners, and Arc-enabled Kubernetes - integration.' + and maintenance/alerts.' + security: 'Auth, network, and data protection for Container Apps: identity providers, + mTLS/client certs, CORS, domains/TLS, secrets, managed identity, private endpoints, + firewalls, and security best practices.' + deployment: 'Guides for deploying and automating Container Apps: CI/CD with GitHub + Actions/Azure Pipelines, Docker Compose, Arc-enabled Kubernetes, cross-region + moves, and self-hosted CI runners.' decision-making: Guidance on choosing Container Apps plans, compute, GPUs, and hosting options, plus cost modeling and migration paths from Functions, Heroku, Java/Spring/Tomcat, and other Azure services. 
@@ -31,33 +31,33 @@ category_descriptions: skill_description: Expert knowledge for Azure Container Apps development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. - Use when configuring ingress/scale, securing with Entra/OIDC, wiring Dapr/Spring, - or deploying via GitHub Actions, and other Azure Container Apps related development - tasks. Not for Azure Kubernetes Service (AKS) (use azure-kubernetes-service), Azure - Container Instances (use azure-container-instances), Azure App Service (use azure-app-service), - Azure Functions (use azure-functions). -use_when: Use when configuring ingress/scale, securing with Entra/OIDC, wiring Dapr/Spring, - or deploying via GitHub Actions, and other Azure Container Apps related development - tasks. + Use when configuring ingress/scale, Dapr, managed identity, CI/CD pipelines, or + Java microservices on Azure Container Apps, and other Azure Container Apps related + development tasks. Not for Azure Kubernetes Service (AKS) (use azure-kubernetes-service), + Azure App Service (use azure-app-service), Azure Functions (use azure-functions), + Azure Spring Apps (use azure-spring-apps). +use_when: Use when configuring ingress/scale, Dapr, managed identity, CI/CD pipelines, + or Java microservices on Azure Container Apps, and other Azure Container Apps related + development tasks. confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes-service), - Azure Container Instances (use azure-container-instances), Azure App Service (use - azure-app-service), Azure Functions (use azure-functions). + Azure App Service (use azure-app-service), Azure Functions (use azure-functions), + Azure Spring Apps (use azure-spring-apps). 
--- # Azure Container Apps Crawl Report ## Summary -- **Total Pages**: 209 -- **Fetched**: 209 +- **Total Pages**: 211 +- **Fetched**: 211 - **Fetch Failed**: 0 -- **Classified**: 131 -- **Unclassified**: 78 +- **Classified**: 134 +- **Unclassified**: 77 ### Incremental Update -- **New Pages**: 1 -- **Updated Pages**: 18 -- **Unchanged**: 190 -- **Deleted Pages**: 1 +- **New Pages**: 2 +- **Updated Pages**: 2 +- **Unchanged**: 207 +- **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-container-apps/azure-container-apps.csv` ## Classification Statistics @@ -66,63 +66,28 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes |------|-------|------------| | architecture-patterns | 3 | 1.4% | | best-practices | 3 | 1.4% | -| configuration | 39 | 18.7% | -| decision-making | 16 | 7.7% | -| deployment | 6 | 2.9% | -| integrations | 19 | 9.1% | -| limits-quotas | 2 | 1.0% | -| security | 32 | 15.3% | -| troubleshooting | 11 | 5.3% | -| *(Unclassified)* | 78 | 37.3% | +| configuration | 40 | 19.0% | +| decision-making | 16 | 7.6% | +| deployment | 7 | 3.3% | +| integrations | 19 | 9.0% | +| limits-quotas | 2 | 0.9% | +| security | 33 | 15.6% | +| troubleshooting | 11 | 5.2% | +| *(Unclassified)* | 77 | 36.5% | ## Changes ### New Pages -- [Container exits](https://learn.microsoft.com/en-us/azure/container-apps/troubleshoot-container-create-failures) +- [Relocate to another region](https://learn.microsoft.com/en-us/azure/container-apps/relocate-region) +- [Manage secrets for Azure Functions on Azure Container Apps](https://learn.microsoft.com/en-us/azure/container-apps/functions-secrets-tutorial) ### Updated Pages -- [About Azure Container Apps](https://learn.microsoft.com/en-us/azure/container-apps/overview) - - Updated: 2025-07-15T17:10:00.000Z → 2026-03-31T08:00:00.000Z -- [Command line](https://learn.microsoft.com/en-us/azure/container-apps/get-started) - - Updated: 2025-02-05T18:02:00.000Z → 
2026-04-14T17:11:00.000Z -- [Authentication](https://learn.microsoft.com/en-us/azure/container-apps/mcp-authentication) - - Updated: 2026-02-27T06:11:00.000Z → 2026-04-13T22:10:00.000Z -- [Overview](https://learn.microsoft.com/en-us/azure/container-apps/sessions) - - Updated: 2026-04-06T22:10:00.000Z → 2026-04-14T22:21:00.000Z -- [AutoGen](https://learn.microsoft.com/en-us/azure/container-apps/sessions-tutorial-autogen) - - Updated: 2024-10-11T08:00:00.000Z → 2026-03-31T08:00:00.000Z -- [Semantic Kernel](https://learn.microsoft.com/en-us/azure/container-apps/sessions-tutorial-semantic-kernel) - - Updated: 2024-10-11T08:00:00.000Z → 2026-03-31T08:00:00.000Z -- [Health probe](https://learn.microsoft.com/en-us/azure/container-apps/troubleshoot-health-probe-failures) - - Updated: 2025-01-27T23:07:00.000Z → 2026-04-14T22:21:00.000Z -- [Storage mount](https://learn.microsoft.com/en-us/azure/container-apps/troubleshoot-storage-mount-failures) - - Updated: 2025-01-27T23:07:00.000Z → 2026-04-14T22:21:00.000Z -- [Target port](https://learn.microsoft.com/en-us/azure/container-apps/troubleshoot-target-port-settings) - - Updated: 2025-01-27T23:07:00.000Z → 2026-04-14T22:21:00.000Z -- [Custom OpenID Connect](https://learn.microsoft.com/en-us/azure/container-apps/authentication-openid) - - Updated: 2024-10-16T17:09:00.000Z → 2026-04-14T17:11:00.000Z -- [Dapr component resiliency](https://learn.microsoft.com/en-us/azure/container-apps/dapr-component-resiliency) - - Updated: 2026-02-18T23:18:00.000Z → 2026-03-27T08:00:00.000Z -- [Connect to Azure or partner services](https://learn.microsoft.com/en-us/azure/container-apps/dapr-component-connect-services) - - Updated: 2026-02-12T23:11:00.000Z → 2026-03-27T08:00:00.000Z -- [Enable token authentication for Dapr requests](https://learn.microsoft.com/en-us/azure/container-apps/dapr-authentication-token) - - Updated: 2026-02-12T23:11:00.000Z → 2026-04-16T17:19:00.000Z -- [Use a custom virtual 
network](https://learn.microsoft.com/en-us/azure/container-apps/vnet-custom) - - Updated: 2025-02-03T08:00:00.000Z → 2026-04-02T08:00:00.000Z -- [Turn on Java features](https://learn.microsoft.com/en-us/azure/container-apps/java-feature-switch) - - Updated: 2024-10-14T11:21:00.000Z → 2026-03-30T08:00:00.000Z -- [Connect to Eureka Server for Spring](https://learn.microsoft.com/en-us/azure/container-apps/java-eureka-server) - - Updated: 2024-11-19T18:02:00.000Z → 2026-03-30T08:00:00.000Z -- [Connect to Config Server for Spring](https://learn.microsoft.com/en-us/azure/container-apps/java-config-server) - - Updated: 2024-11-19T18:02:00.000Z → 2026-03-30T08:00:00.000Z -- [Connect to Gateway for Spring](https://learn.microsoft.com/en-us/azure/container-apps/java-gateway-for-spring) - - Updated: 2024-11-19T18:02:00.000Z → 2026-03-31T08:00:00.000Z - -### Deleted Pages - -- ~~Container create~~ (https://learn.microsoft.com/en-us/azure/container-apps/troubleshoot-container-create-failures) +- [Custom container sessions](https://learn.microsoft.com/en-us/azure/container-apps/sessions-custom-container) + - Updated: 2026-03-13T22:12:00.000Z → 2026-04-24T22:19:00.000Z +- [Overview](https://learn.microsoft.com/en-us/azure/container-apps/java-overview) + - Updated: 2024-11-19T08:00:00.000Z → 2026-04-24T22:19:00.000Z ## Classified Pages @@ -195,6 +160,7 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes | [Consumption-only environment type](https://learn.microsoft.com/en-us/azure/container-apps/environment-type-consumption-only) | decision-making | 0.70 | Describes features and billing considerations for a specific legacy environment type and when to use it versus newer options, which is environment/tier selection guidance. 
| | [Container debug console](https://learn.microsoft.com/en-us/azure/container-apps/container-debug-console) | troubleshooting | 0.70 | Debug console is a troubleshooting tool; article explains when and how to use it, including platform-specific behavior like one debug container per app. | | [Create a highly available Eureka server component cluster](https://learn.microsoft.com/en-us/azure/container-apps/java-eureka-server-highly-available) | architecture-patterns | 0.70 | Focuses on HA design for Eureka using multiple instances and clustering; product-specific pattern for resilience and scaling of the service registry. | +| [Custom container sessions](https://learn.microsoft.com/en-us/azure/container-apps/sessions-custom-container) | configuration | 0.70 | The page describes how to configure custom container-based session pools in Azure Container Apps, including product-specific settings like container probes for session pools and behavior differences from built-in code interpreter sessions. These are configuration details unique to this feature rather than generic concepts. | | [Dapr component resiliency](https://learn.microsoft.com/en-us/azure/container-apps/dapr-component-resiliency) | configuration | 0.70 | Page explains how to configure resiliency policies (retries, timeouts, circuit breakers) for Dapr components in Azure Container Apps. It focuses on product-specific configuration of resiliency settings rather than generic resiliency concepts, matching the configuration category. | | [Deploy a frontend microservice app](https://learn.microsoft.com/en-us/azure/container-apps/communicate-between-microservices) | integrations | 0.70 | Tutorial on communication between microservices using FQDNs and ingress modes; contains product-specific networking behavior and patterns. 
| | [Deploy self-hosted CI/CD runners with jobs](https://learn.microsoft.com/en-us/azure/container-apps/tutorial-ci-cd-runners-jobs) | deployment | 0.70 | Shows how to host GitHub Actions runners and Azure Pipelines agents as jobs; includes product-specific job configuration for CI/CD scenarios. | @@ -209,6 +175,7 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes | [Log monitoring](https://learn.microsoft.com/en-us/azure/container-apps/log-monitoring) | configuration | 0.70 | Covers Log Analytics workspace integration and specific log tables/queries for Container Apps, which are product-specific configuration/usage details. | | [Make quota requests](https://learn.microsoft.com/en-us/azure/container-apps/quota-requests) | limits-quotas | 0.70 | Explains how to change default limits; while process-focused, it depends on and references specific quota values and scopes from the platform’s quota system. | | [Manage a Functions app](https://learn.microsoft.com/en-us/azure/container-apps/functions-manage) | configuration | 0.70 | Shows CLI commands and parameters to list and interact with functions, including revision targeting; these are product-specific management/configuration commands. | +| [Manage secrets for Azure Functions on Azure Container Apps](https://learn.microsoft.com/en-us/azure/container-apps/functions-secrets-tutorial) | security | 0.70 | Page focuses on handling app-level secrets and Functions host keys when running Azure Functions inside Azure Container Apps. It describes product-specific security configuration patterns (using Container Apps secrets, Key Vault references, and Functions host key storage/management) that are unique to this integration scenario, going beyond generic concepts. 
| | [Managing outbound connections with Azure Firewall](https://learn.microsoft.com/en-us/azure/container-apps/use-azure-firewall) | security | 0.70 | The article describes product-specific steps to integrate Azure Container Apps with Azure Firewall using user-defined routes, including concrete configuration of routing and firewall usage to secure outbound traffic. This is security-focused configuration guidance with Azure-specific constructs (UDR, virtual network routing, central firewall), which qualifies as expert knowledge beyond generic concepts. | | [Migrate Spring Boot applications](https://learn.microsoft.com/en-us/azure/container-apps/migrate-spring-boot) | decision-making | 0.70 | Migration guide for Spring Boot to Azure Container Apps that covers pre-migration assessment and post-migration optimization. This typically includes Azure-specific considerations, trade-offs, and recommendations for how/when to migrate, which are decision-making and migration-path guidance beyond generic concepts. | | [Migrate Spring Cloud applications](https://learn.microsoft.com/en-us/azure/container-apps/migrate-spring-cloud) | decision-making | 0.70 | Explains what to be aware of when migrating Spring Cloud applications to Azure Container Apps, implying Azure- and Spring-Cloud-specific considerations and trade-offs. This is migration decision guidance rather than a simple tutorial, fitting decision-making with product-specific nuances. | @@ -239,6 +206,7 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes | [Communicate between multiple apps](https://learn.microsoft.com/en-us/azure/container-apps/connect-apps) | integrations | 0.68 | The page describes product-specific communication patterns between container apps in the same Azure Container Apps environment, including use of FQDNs, app names, Dapr service invocation, and custom domains. 
These are concrete, platform-specific integration patterns and routing behaviors that go beyond generic container networking knowledge, fitting the integrations sub-skill best. | | [Migrate Java with GitHub Copilot app modernization](https://learn.microsoft.com/en-us/azure/container-apps/migrate-java-github-copilot-app-modernization) | decision-making | 0.68 | Describes using GitHub Copilot app modernization with Azure Container Apps, including supported Java versions and frameworks and how to approach upgrades and migration. This is specialized migration and tool-selection guidance with product-specific constraints (e.g., supported Java versions). | | [Private endpoints and DNS](https://learn.microsoft.com/en-us/azure/container-apps/private-endpoints-with-dns) | security | 0.68 | The page is focused on configuring private endpoints and DNS for Azure Container Apps environments, which is a product-specific security and network access configuration scenario. It likely includes concrete settings such as required DNS zone names, record patterns, and environment-specific configuration steps for private endpoints and virtual networks. This aligns best with the security sub-skill, as it covers secure connectivity and private access configuration rather than generic networking concepts. | +| [Relocate to another region](https://learn.microsoft.com/en-us/azure/container-apps/relocate-region) | deployment | 0.68 | Describes a product-specific, non-obvious deployment/migration pattern for Azure Container Apps between regions, including the fact that in-place region migration is not supported and that workloads must be recreated in the target region. This is concrete, service-specific deployment/migration guidance rather than a generic tutorial. 
| | [Securing a custom VNET with an NSG](https://learn.microsoft.com/en-us/azure/container-apps/firewall-integration) | security | 0.68 | Page focuses on concrete network security configuration for Azure Container Apps environments, including NSGs, user-defined routes, and firewall-secured outbound traffic. These are product-specific security settings and patterns that go beyond generic concepts, matching the security sub-skill criteria. | | [Blue/Green deployment](https://learn.microsoft.com/en-us/azure/container-apps/blue-green-deployment) | best-practices | 0.66 | Applies blue-green deployment strategy specifically to Container Apps, including how to route traffic between environments/revisions, which is actionable product-specific practice beyond generic theory. | | [Alerts](https://learn.microsoft.com/en-us/azure/container-apps/alerts) | configuration | 0.65 | Shows how to configure metric and log-based alerts for Container Apps; includes product-specific alert rule configuration steps. | @@ -304,7 +272,6 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes | [Build a Java metrics dashboard with Azure Managed Grafana](https://learn.microsoft.com/en-us/azure/container-apps/java-metrics-with-grafana) | 0.30 | Tutorial on building a Java metrics dashboard with Azure Managed Grafana. From the summary, it appears to be a step-by-step example, not a reference of configuration options, limits, or troubleshooting mappings. | | [Code interpreter sessions](https://learn.microsoft.com/en-us/azure/container-apps/sessions-code-interpreter) | 0.30 | Describes serverless code interpreter sessions and isolation model; appears to be a usage/tutorial-style article without explicit limits, configuration matrices, or detailed troubleshooting mappings. 
| | [Containers](https://learn.microsoft.com/en-us/azure/container-apps/containers) | 0.30 | General overview of containers and jobs in Container Apps; summary doesn’t indicate detailed config tables or limits. | -| [Custom container sessions](https://learn.microsoft.com/en-us/azure/container-apps/sessions-custom-container) | 0.30 | Explains using custom containers for dynamic sessions and notes scope limitations, but summary does not show concrete configuration tables, numeric limits, or error-resolution content. | | [Deploy using ARM or Bicep](https://learn.microsoft.com/en-us/azure/container-apps/microservices-dapr-azure-resource-manager) | 0.30 | Quickstart using ARM/Bicep; focused on example deployment rather than comprehensive configuration reference or limits. | | [Deploy using Azure CLI](https://learn.microsoft.com/en-us/azure/container-apps/microservices-dapr) | 0.30 | Quickstart deployment tutorial; mainly step-by-step usage of CLI without emphasis on reusable configuration matrices or expert-only details. | | [Dockerfile](https://learn.microsoft.com/en-us/azure/container-apps/java-get-started-dockerfile) | 0.30 | Step-by-step tutorial deploying a sample via Dockerfile; likely uses generic commands without deep config matrices or limits. | @@ -333,11 +300,11 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes | [Modernize .NET & Java apps](https://learn.microsoft.com/en-us/azure/container-apps/modernize-ai) | 0.20 | High-level modernization and Copilot tooling overview; no indication of concrete limits, configs, or error mappings. | | [Overview](https://learn.microsoft.com/en-us/azure/container-apps/ai-integration) | 0.20 | High-level overview of AI workloads on Azure Container Apps; summary suggests examples and scenarios but no indication of numeric limits, config tables, error codes, or decision matrices. 
| | [Overview](https://learn.microsoft.com/en-us/azure/container-apps/dotnet-overview) | 0.20 | High-level .NET overview for Container Apps; primarily conceptual guidance and positioning without detailed config tables or limits. | -| [Overview](https://learn.microsoft.com/en-us/azure/container-apps/java-overview) | 0.20 | Java overview content describing benefits and options; lacks detailed configuration parameters or quotas. | | [Overview](https://learn.microsoft.com/en-us/azure/container-apps/javascript-overview) | 0.20 | High-level JavaScript overview; mostly conceptual and marketing-style description of capabilities. | | [About Azure Container Apps](https://learn.microsoft.com/en-us/azure/container-apps/overview) | 0.10 | High-level overview of Azure Container Apps scenarios and benefits; no specific limits, configs, error codes, or decision matrices. | | [Introduction to containers](https://learn.microsoft.com/en-us/azure/container-apps/java-containers-intro) | 0.10 | Intro to containers for Java is conceptual and generic; not specific to Container Apps configuration or limits. | | [Introduction to containers](https://learn.microsoft.com/en-us/azure/container-apps/start-containers) | 0.10 | Introductory explanation of containers and motivations; no product-specific limits, configs, or troubleshooting content. | +| [Overview](https://learn.microsoft.com/en-us/azure/container-apps/java-overview) | 0.10 | High-level overview of running Java on Azure Container Apps with benefits and general deployment options; no specific limits, configuration tables, error codes, or decision matrices that meet the expert-knowledge criteria. | | [Samples](https://learn.microsoft.com/en-us/azure/container-apps/samples) | 0.10 | Index page listing sample scenarios; primarily navigation to other content. No indication of limits, configuration tables, troubleshooting mappings, or decision matrices. 
| | [Use serverless containers](https://learn.microsoft.com/en-us/azure/container-apps/start-serverless-containers) | 0.10 | Conceptual overview of serverless containers and scaling; lacks specific configuration parameters or numeric thresholds. | | [What's new](https://learn.microsoft.com/en-us/azure/container-apps/whats-new) | 0.10 | What's new page is a changelog/announcements; not a stable expert-knowledge reference for configuration, limits, or troubleshooting. | diff --git a/products/azure-container-registry/azure-container-registry.csv b/products/azure-container-registry/azure-container-registry.csv index e38f527d..9878fd35 100644 --- a/products/azure-container-registry/azure-container-registry.csv +++ b/products/azure-container-registry/azure-container-registry.csv @@ -1,8 +1,9 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type https://learn.microsoft.com/en-us/azure/container-registry/allow-access-trusted-services,Allow access by trusted services,Access Network-Restricted Registry By Trusted Azure Service - Azure Container Registry,Allow trusted Azure services to access network-restricted ACR,Enable a trusted Azure service instance to securely access a network-restricted container registry to pull or push images,"With Azure Container Registry, you can allow select trusted Azure services to access a registry that's configured with network access rules. When you allow trusted services, a trusted service instance can securely bypass the registry's network rules and perform operations such as pulling or pushing images. This article explains how to enable and use trusted services with a network-restricted Azure container registry. 
Use the Azure Cloud Shell or a local installation of the Azure CLI to run the c",2025-03-20T22:01:00.000Z,how-to,security,0.8,True,Explains enabling trusted services bypass for network rules; product-specific security configuration and behavior.,unchanged https://learn.microsoft.com/en-us/azure/container-registry/anonymous-pull-access,Enable unauthenticated anonymous pull access,Enable Unauthenticated Anonymous Pull Access in Azure Container Registry - Azure Container Registry,Enable anonymous pull access for Azure Container Registry,Optionally enable unauthenticated anonymous pull access to make content in your Azure container registry publicly available,"By default, access to pull or push content from an Azure container registry is only available to authenticated users. Enabling anonymous (unauthenticated) pull access makes all registry content publicly available for read (pull) actions. Use anonymous pull access in scenarios that don't require user authentication, such as distributing public container images. Anonymous pull access is available in the Standard and Premium service tiers. To configure anonymous pull access, update a registry using th",2025-05-04T05:21:00.000Z,how-to,security,0.8,True,"Describes enabling unauthenticated pull, tier constraints (Standard/Premium), and security implications; product-specific security setting.",unchanged +https://learn.microsoft.com/en-us/azure/container-registry/artifact-cache-acr-to-acr-cli,Enable artifact cache from another ACR registry,Enable artifact cache to cache artifacts from another Azure Container Registry - Azure Container Registry,Configure ACR-to-ACR artifact caching with managed identity,Learn how to cache container images from one Azure Container Registry in another using managed identity authentication with Azure CLI and Bicep.,"In this article, you will learn how to enable the artifact cache feature in your Azure Container Registry (ACR) to cache images from another Azure Container Registry.
The downstream registry authenticates to the upstream registry using a managed identity, so you don't need to store credentials in Azure Key Vault. In addition to the prerequisites listed here, you need an Azure account with an active subscription. Create an account for free.",2026-04-20T22:10:00.000Z,how-to,integrations,0.68,True,"The article describes a product-specific integration pattern between two Azure Container Registries using managed identity, Azure CLI, and Bicep. It includes concrete configuration steps and parameters (identity-based auth between upstream and downstream registries) that are specific to Azure Container Registry’s artifact cache feature, which go beyond generic container registry knowledge.",new https://learn.microsoft.com/en-us/azure/container-registry/artifact-cache-cli,Enable artifact cache - CLI,Enable artifact cache in your Azure Container Registry with Azure CLI - Azure Container Registry,Configure artifact cache in ACR using Azure CLI,"Learn how to use Azure CLI to cache container images in Azure Container Registry, improving performance and efficiency.","In this article, you learn how to use Azure CLI to enable the artifact cache feature in your Azure Container Registry (ACR) with or without authentication.
In addition to the prerequisites listed here, you need an Azure account with an active subscription. Create an account for free.",2025-05-12T22:00:00.000Z,how-to,configuration,0.7,True,CLI-focused article for enabling artifact cache with/without auth; likely includes specific parameters and allowed values for configuration.,unchanged -https://learn.microsoft.com/en-us/azure/container-registry/artifact-cache-overview,Overview,Optimize image pulls with artifact cache in Azure Container Registry - Azure Container Registry,,"Artifact cache is a feature that allows you to cache container images in Azure Container Registry, improving performance and efficiency.",The artifact cache feature of Azure Container Registry lets you cache container images in both public and private repositories. Artifact cache enables faster and more reliable pull operations through Azure Container Registry (ACR). It uses features like geo-replication and availability zone support for higher availability and faster image pulls. You can access cached registries over private networks to align with your firewall configurations and compliance standards. Artifact cache addresses the c,2026-04-14T22:12:00.000Z,concept-article,,0.2,False,"The page is an overview of Azure Container Registry artifact cache, describing benefits like faster pulls, geo-replication, and private access. The summary does not indicate presence of numeric limits, configuration parameter tables, error codes, or decision matrices. 
It appears to be conceptual/feature overview rather than detailed configuration, limits, or troubleshooting content.",updated +https://learn.microsoft.com/en-us/azure/container-registry/artifact-cache-overview,Overview,Optimize image pulls with artifact cache in Azure Container Registry - Azure Container Registry,,"Artifact cache is a feature that allows you to cache container images in Azure Container Registry, improving performance and efficiency.",The artifact cache feature of Azure Container Registry lets you cache container images in both public and private repositories. Artifact cache enables faster and more reliable pull operations through Azure Container Registry (ACR). It uses features like geo-replication and availability zone support for higher availability and faster image pulls. You can access cached registries over private networks to align with your firewall configurations and compliance standards. Artifact cache addresses the c,2026-04-14T22:12:00.000Z,concept-article,,0.2,False,"The page is an overview of Azure Container Registry artifact cache, describing benefits like faster pulls, geo-replication, and private access. The summary does not indicate presence of numeric limits, configuration parameter tables, error codes, or decision matrices. It appears to be conceptual/feature overview rather than detailed configuration, limits, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/container-registry/artifact-cache-portal,Enable artifact cache - Portal,Enable artifact cache in your Azure Container Registry with Azure portal - Azure Container Registry,,"Learn how to use the Azure portal to cache container images in Azure Container Registry, improving performance and efficiency.","In this article, you learn how to use the Azure portal to enable the artifact cache feature in your Azure Container Registry (ACR).
In addition to the prerequisites listed here, you need an Azure account with an active subscription. Create an account for free.",2025-05-12T22:00:00.000Z,how-to,,0.4,False,"Portal how-to for enabling artifact cache; likely step-by-step UI instructions without detailed config tables, limits, or product-specific edge-case guidance.",unchanged https://learn.microsoft.com/en-us/azure/container-registry/authenticate-aks-cross-tenant,Cross-tenant authentication from AKS,Cross-Tenant Authentication from AKS to ACR - Azure Container Registry,Configure cross-tenant AKS authentication to Azure Container Registry,Configure an AKS cluster's service principal with permissions to access your Azure container registry in a different Microsoft Entra tenant.,"If you have an Azure Kubernetes Service (AKS) cluster in one Microsoft Entra tenant, and an Azure container registry in a different tenant, you can configure cross-tenant authentication to enable the AKS cluster to pull images from the container registry. This article walks through the steps to enable cross-tenant authentication by using the AKS service principal credential to pull from the container registry. In this article, we refer to the tenant containing the AKS cluster as Tenant A, and the",2026-01-07T23:12:00.000Z,how-to,security,0.8,True,"Details cross-tenant auth setup using AKS service principal; includes tenant-specific configuration and permissions, a security configuration scenario.",unchanged https://learn.microsoft.com/en-us/azure/container-registry/authenticate-kubernetes-options,Kubernetes authentication scenarios,Kubernetes Authentication Scenarios for ACR - Azure Container Registry,Select Kubernetes authentication options for Azure Container Registry,Overview of options and scenarios to authenticate to an Azure container registry from a Kubernetes cluster to pull container images,"You can use an Azure container registry as a source of container images for Kubernetes. 
This setup can include clusters you manage, managed clusters hosted in Azure Kubernetes Service (AKS) or other clouds, and ""local"" Kubernetes configurations such as minikube and kind. To pull images to your Kubernetes cluster from an Azure container registry, you need to establish an authentication and authorization mechanism. Depending on your cluster environment, choose one of the following methods:",2026-01-07T23:12:00.000Z,concept-article,decision-making,0.7,True,"Compares multiple auth mechanisms for different Kubernetes environments and scenarios; helps choose between options with scenario-based guidance, fitting decision-making.",unchanged diff --git a/products/azure-container-registry/report.md b/products/azure-container-registry/report.md index f74f5b3e..4d3c8d93 100644 --- a/products/azure-container-registry/report.md +++ b/products/azure-container-registry/report.md @@ -1,9 +1,12 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: security: 'Securing Azure Container Registry: auth methods, RBAC/ABAC and tokens, network isolation (firewall, Private Link, IP/service tags), data exfiltration controls, image signing/verification, and policy/compliance.' + integrations: Patterns and code samples for integrating ACR with AKS/ACI, GitHub + Actions, ORAS, Helm, Notation signing, webhooks, artifact caching, transfers, + and image import/build workflows. configuration: 'Configuring ACR behavior: caching, purge/retention/soft delete, delete locks, tasks (YAML, timers, multi-step, patching, agent pools), webhooks, wildcard cache rules, and monitoring.' @@ -11,9 +14,6 @@ category_descriptions: and migrating image signing from Docker Content Trust to Notary Project. best-practices: 'Best practices for ACR operations: managing public image dependencies, safe image deletion and storage cleanup, and robust image tagging/versioning strategies.' 
- integrations: How to integrate ACR with ACI, AKS, Helm, ORAS, Buildpacks, ACR Transfer, - GitHub Actions, Notation, Key Vault, and webhooks for image access, builds, signing, - and automation troubleshooting: 'Diagnosing and fixing ACR issues: health checks, error codes, login/auth, network and performance problems, transfer/streaming/cache failures, logs, Arc/connected registry, and CMK encryption.' @@ -29,12 +29,12 @@ skill_description: Expert knowledge for Azure Container Registry development inc troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when using ACR Tasks, geo-replication, Private Link, connected registries, or - image signing with Notation, and other Azure Container Registry related development + image signing/verification, and other Azure Container Registry related development tasks. Not for Azure Container Apps (use azure-container-apps), Azure Container Instances (use azure-container-instances), Azure Kubernetes Service (AKS) (use azure-kubernetes-service), Azure Red Hat OpenShift (use azure-redhat-openshift). use_when: Use when using ACR Tasks, geo-replication, Private Link, connected registries, - or image signing with Notation, and other Azure Container Registry related development + or image signing/verification, and other Azure Container Registry related development tasks. 
confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azure Container Instances (use azure-container-instances), Azure Kubernetes Service (AKS) @@ -44,16 +44,16 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu ## Summary -- **Total Pages**: 122 -- **Fetched**: 122 +- **Total Pages**: 123 +- **Fetched**: 123 - **Fetch Failed**: 0 -- **Classified**: 88 +- **Classified**: 89 - **Unclassified**: 34 ### Incremental Update -- **New Pages**: 0 -- **Updated Pages**: 1 -- **Unchanged**: 121 +- **New Pages**: 1 +- **Updated Pages**: 0 +- **Unchanged**: 122 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-container-registry/azure-container-registry.csv` @@ -63,21 +63,20 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu |------|-------|------------| | architecture-patterns | 2 | 1.6% | | best-practices | 4 | 3.3% | -| configuration | 14 | 11.5% | -| decision-making | 3 | 2.5% | -| deployment | 3 | 2.5% | -| integrations | 12 | 9.8% | +| configuration | 14 | 11.4% | +| decision-making | 3 | 2.4% | +| deployment | 3 | 2.4% | +| integrations | 13 | 10.6% | | limits-quotas | 2 | 1.6% | -| security | 36 | 29.5% | +| security | 36 | 29.3% | | troubleshooting | 12 | 9.8% | -| *(Unclassified)* | 34 | 27.9% | +| *(Unclassified)* | 34 | 27.6% | ## Changes -### Updated Pages +### New Pages -- [Overview](https://learn.microsoft.com/en-us/azure/container-registry/artifact-cache-overview) - - Updated: 2025-12-01T23:09:00.000Z → 2026-04-14T22:12:00.000Z +- [Enable artifact cache from another ACR registry](https://learn.microsoft.com/en-us/azure/container-registry/artifact-cache-acr-to-acr-cli) ## Classified Pages @@ -153,6 +152,7 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Use ACR webhooks](https://learn.microsoft.com/en-us/azure/container-registry/container-registry-webhook) | integrations | 0.70 | Webhooks require 
specific payload formats, event types, and configuration parameters unique to ACR; this is an integration pattern with external services. | | [Validate container image signatures in AKS with Ratify and Azure Policy](https://learn.microsoft.com/en-us/azure/container-registry/container-registry-tutorial-verify-with-ratify-aks) | security | 0.70 | Shows concrete configuration of Ratify and Azure Policy on AKS clusters, including policy setup and verification behavior specific to ACR/AKS, which is product-specific security configuration. | | [Wildcard support](https://learn.microsoft.com/en-us/azure/container-registry/wildcards-artifact-cache) | configuration | 0.70 | Describes supported wildcard patterns for artifact cache rules, mapping target to source repositories. This is product-specific configuration behavior (which wildcard syntaxes are supported and how they apply), fitting configuration. | +| [Enable artifact cache from another ACR registry](https://learn.microsoft.com/en-us/azure/container-registry/artifact-cache-acr-to-acr-cli) | integrations | 0.68 | The article describes a product-specific integration pattern between two Azure Container Registries using managed identity, Azure CLI, and Bicep. It includes concrete configuration steps and parameters (identity-based auth between upstream and downstream registries) that are specific to Azure Container Registry’s artifact cache feature, which go beyond generic container registry knowledge. | | [ACR Transfer with Az CLI](https://learn.microsoft.com/en-us/azure/container-registry/container-registry-transfer-cli) | integrations | 0.65 | CLI-focused guide for ACR Transfer using the acrtransfer extension; likely documents specific CLI commands and parameters unique to this feature, matching integrations & coding patterns. 
| | [About connected registry](https://learn.microsoft.com/en-us/azure/container-registry/intro-connected-registry) | architecture-patterns | 0.65 | Describes connected registry feature, scenarios, and when to use on-prem/remote replicas; includes ACR-specific architectural pattern. | | [Build image with Buildpacks (preview)](https://learn.microsoft.com/en-us/azure/container-registry/container-registry-tasks-pack-build) | integrations | 0.65 | Shows product-specific CLI command (az acr pack build) and its parameters to integrate Cloud Native Buildpacks with ACR, which is an integration/coding pattern. | diff --git a/products/azure-copilot/azure-copilot.csv b/products/azure-copilot/azure-copilot.csv index 7289de94..69d4aac6 100644 --- a/products/azure-copilot/azure-copilot.csv +++ b/products/azure-copilot/azure-copilot.csv @@ -25,7 +25,7 @@ https://learn.microsoft.com/en-us/azure/copilot/get-monitoring-information,Get m https://learn.microsoft.com/en-us/azure/copilot/improve-storage-accounts,Manage and migrate storage accounts,Manage and migrate storage accounts using Azure Copilot,Improve and migrate Azure storage accounts with Copilot,Learn how Azure Copilot can improve the security posture and data resiliency of storage accounts and help with storage migration solutions,"Azure Copilot can provide contextual and dynamic responses to harden the security posture and enhance data resiliency of storage accounts. When migrating data to Azure, you can get help finding the right solution. Azure Copilot can also help you troubleshoot and resolve common Azure File Sync issues related to your stored data. Responses are dynamic and based on your specific storage account and settings.
Based on your prompts, Azure Copilot provides specific recommendations to improve your storage ",2025-11-19T08:00:00.000Z,how-to,security,0.7,True,"Focuses on hardening security posture, data resiliency, and migration solutions for storage accounts; includes product-specific security and migration guidance.",unchanged https://learn.microsoft.com/en-us/azure/copilot/manage-access,Manage access,Manage access to Azure Copilot,Manage user access and authorization for Azure Copilot,Learn how administrators can manage user access to Azure Copilot.,"By default, Azure Copilot is available to all users in a tenant. However, Global Administrators can manage access to Azure Copilot for their organization. Access can also be optionally granted to specific Microsoft Entra users or groups. If Azure Copilot is not available for a user, they'll see an unauthorized message when they select the Copilot button in the Azure portal. Note In some cases, your tenant may not have access to Azure Copilot by default.
Global Administrators can enable access by fol",2025-11-18T16:11:00.000Z,how-to,security,0.85,True,"Describes how Global Administrators manage access, including role names and enablement steps; this is product-specific identity and access management guidance.",unchanged https://learn.microsoft.com/en-us/azure/copilot/manage-agents-preview,Manage preview access,Manage access to Agents (preview) in Azure Copilot,Control tenant access to Azure Copilot agents preview,Tenant administrators can control which Azure Copilot preview features are available to users in their Azure tenant.,"To manage access to Agents (preview) in Azure Copilot, an administrator can request or remove access at the tenant level.",2025-11-18T16:11:00.000Z,overview,security,0.65,True,"Tenant-level access management for Agents (preview) is admin/security configuration; likely includes specific Entra roles or steps unique to this feature, fitting security configuration guidance.",unchanged -https://learn.microsoft.com/en-us/azure/copilot/migration-agent,Migration,Migration agent capabilities in Agents (preview) in Azure Copilot,,Agents (preview) in Azure Copilot lets you get help with migration tasks using intelligent agent capabilities.,"Agents (preview) in Azure Copilot intelligently surfaces the right agent to help with your tasks. When you ask Azure Copilot for help with migration tasks, you can get help with planning, assessing, strategizing, and moving workloads to Azure. Azure Copilot works to understand your migration goals, such as faster migration, modernization to PaaS, and regional targets, then creates artifacts such as business cases and assessments using your existing context.
You can also get help setting up landing zone",2026-04-15T22:10:00.000Z,concept-article,,0.2,False,"Summary describes high-level capabilities of Azure Copilot migration agents (planning, assessing, strategizing, moving workloads, creating business cases and assessments, setting up landing zones) without exposing concrete limits, configuration parameters, error codes, or decision matrices. Content appears conceptual/marketing rather than detailed technical guidance, so no sub-skill type applies.",updated +https://learn.microsoft.com/en-us/azure/copilot/migration-agent,Migration,Migration agent capabilities in Agents (preview) in Azure Copilot,,Agents (preview) in Azure Copilot lets you get help with migration tasks using intelligent agent capabilities.,"Agents (preview) in Azure Copilot intelligently surfaces the right agent to help with your tasks. When you ask Azure Copilot for help with migration tasks, you can get help with planning, assessing, strategizing, and moving workloads to Azure. Azure Copilot works to understand your migration goals, such as faster migration, modernization to PaaS, and regional targets, then creates artifacts such as business cases and assessments using your existing context. You can also get help setting up landing zone",2026-04-15T22:10:00.000Z,concept-article,,0.2,False,"Summary describes high-level capabilities of Azure Copilot migration agents (planning, assessing, strategizing, moving workloads, creating business cases and assessments, setting up landing zones) without exposing concrete limits, configuration parameters, error codes, or decision matrices.
Content appears conceptual/marketing rather than detailed technical guidance, so no sub-skill type applies.",unchanged https://learn.microsoft.com/en-us/azure/copilot/observability-agent,Observability,Observability agent capabilities in Agents (preview) in Azure Copilot,,Agents (preview) in Azure Copilot helps investigate Azure Monitor alerts and provide observability using intelligent agent capabilities.,"Agents (preview) in Azure Copilot intelligently surfaces the right agent to help with your tasks. When you ask Azure Copilot for help with investigating Azure Monitor alerts, it creates an Azure Monitor issue and starts an investigation. If no default Azure Monitor Workspace is configured for the subscription (or one isn't passed as context in the prompt), Azure Copilot attempts to configure one for you. Azure Copilot investigates the alert and provides a summary of the issue and its findings, incl",2026-01-14T06:03:00.000Z,concept-article,,0.3,False,"Explains observability agent behavior at a high level; no specific error codes, configuration tables, or limits.",unchanged
Azure Copilot provides detailed optimization recommendations and helps you unde",2025-11-18T16:11:00.000Z,concept-article,,0.3,False,"Optimization agent article is scenario/benefit focused; lacks concrete quotas, config parameters, or detailed troubleshooting flows.",unchanged https://learn.microsoft.com/en-us/azure/copilot/overview,Overview,Azure Copilot Overview,,Azure Copilot is an AI-powered tool to help you do more with Azure.,"Azure Copilot is an AI-powered tool to help you do more with Azure. With Azure Copilot, you can gain new insights, discover more benefits of the cloud, and orchestrate across both cloud and edge. Copilot leverages Large Language Models (LLMs), the Azure control plane, and insights about your Azure environment to help you work more efficiently. Azure Copilot can help you navigate the hundreds of services and thousands of resource types that Azure offers. It unifies knowledge and data across hundr",2025-11-18T16:11:00.000Z,overview,,0.1,False,"High-level marketing/overview of Azure Copilot capabilities without concrete limits, configs, or error details.",unchanged diff --git a/products/azure-copilot/report.md b/products/azure-copilot/report.md index e0bfad09..9cf96bdf 100644 --- a/products/azure-copilot/report.md +++ b/products/azure-copilot/report.md @@ -44,8 +44,8 @@ confusable_not_for: Not for Azure AI services (use microsoft-foundry-tools), Azu ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 1 -- **Unchanged**: 38 +- **Updated Pages**: 0 +- **Unchanged**: 39 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-copilot/azure-copilot.csv` @@ -63,11 +63,6 @@ confusable_not_for: Not for Azure AI services (use microsoft-foundry-tools), Azu ## Changes -### Updated Pages - -- [Migration](https://learn.microsoft.com/en-us/azure/copilot/migration-agent) - - Updated: 2026-03-12T22:17:00.000Z → 2026-04-15T22:10:00.000Z - ## Classified Pages | TOC Title | Type | Confidence | Reason | diff 
--git a/products/azure-cosmos-db/azure-cosmos-db.csv b/products/azure-cosmos-db/azure-cosmos-db.csv index 81c4fe0e..b3bf6c24 100644 --- a/products/azure-cosmos-db/azure-cosmos-db.csv +++ b/products/azure-cosmos-db/azure-cosmos-db.csv @@ -18,8 +18,8 @@ https://learn.microsoft.com/en-us/azure/cosmos-db/autoscale-faq,Autoscale FAQ,Fr https://learn.microsoft.com/en-us/azure/cosmos-db/benchmarking-framework,Benchmarking framework,Measure performance with a benchmarking framework - Azure Cosmos DB,Benchmark Azure Cosmos DB for NoSQL with YCSB,"Use YCSB to benchmark Azure Cosmos DB for NoSQL with recipes to measure read, write, scan, and update performance.","There are more choices, now than ever, on the type of database to use with your data workload. One of the key factors to picking a database is the performance of the database or service, but benchmarking performance can be cumbersome and error-prone. The benchmarking framework for Azure Databases simplifies the process of measuring performance with popular open-source benchmarking tools with low-friction recipes that implement common best practices. In Azure Cosmos DB for NoSQL, the framework impl",2025-12-19T18:19:00.000Z,how-to,best-practices,0.7,True,"A benchmarking framework article usually includes concrete configuration recipes, workload parameters, and tuning guidance for accurate measurements, which are product-specific best practices for performance testing.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/best-practice-dotnet,Best practices for .NET SDK,Best Practices for .NET SDK V3 - Azure Cosmos DB,Best practices for Azure Cosmos DB .NET SDK v3,Learn the best practices for using the Azure Cosmos DB .NET SDK v3.,"This article walks through the best practices for using the Azure Cosmos DB .NET SDK. By following these practices, you can help improve your latency, availability, and boost overall performance.
Watch the following video to learn more about using the .NET SDK from an Azure Cosmos DB engineer!",2025-12-19T18:19:00.000Z,how-to,best-practices,0.85,True,"Explicit best-practices article with SDK-specific recommendations, configuration values, and gotchas (e.g., client reuse, connection modes) that are unique to Cosmos DB .NET v3.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/best-practice-java,Best practices for Java SDK,Best Practices for Java SDK V4 - Azure Cosmos DB,Best practices for Azure Cosmos DB Java SDK v4,Learn the best practices for using the Azure Cosmos DB Java SDK v4,"This article walks through the best practices for using the Azure Cosmos DB Java SDK. Following these practices, will help improve your latency, availability, and boost overall performance.",2025-12-19T18:19:00.000Z,how-to,best-practices,0.84,True,"SDK-specific best-practices article with concrete recommendations and patterns (e.g., async usage, connection modes, retries) that are unique to Cosmos DB Java v4.",unchanged -https://learn.microsoft.com/en-us/azure/cosmos-db/best-practice-python,Best practices for Python SDK,Best practices for Python SDK - Azure Cosmos DB,Optimize Azure Cosmos DB Python SDK performance,Review a list of best practices for using the Azure Cosmos DB Python SDK in a performant manner.,"This guide includes best practices for solutions built using the latest version of the Python SDK for Azure Cosmos DB for NoSQL. The best practices included here helps improve latency, improve availability, and boost overall performance for your solutions.",2026-04-16T17:12:00.000Z,best-practice,best-practices,0.86,True,"The page is explicitly a best practices guide for the Azure Cosmos DB Python SDK, focused on improving latency, availability, and performance. 
Such guidance typically includes product-specific recommendations (for example, preferred client reuse patterns, connection modes, retry configurations, and request options) that are unique to Cosmos DB’s Python SDK behavior and not just generic Python or database advice. This aligns with the best-practices criteria: concrete DO/DON'T guidance and SDK-specific configuration/code patterns.",updated -https://learn.microsoft.com/en-us/azure/cosmos-db/best-practices-javascript,Best practices for JavaScript SDK,Best practices for JavaScript SDK - Azure Cosmos DB,Apply performance best practices for Cosmos DB JavaScript SDK,Review a list of best practices for using the Azure Cosmos DB JavaScript SDK in a performant manner.,"This guide includes best practices for solutions built using the latest version of the JavaScript SDK for Azure Cosmos DB for NoSQL. The best practices included here helps improve latency, improve availability, and boost overall performance for your solutions.",2025-12-19T18:19:00.000Z,best-practice,best-practices,0.86,True,"Explicitly a best practices guide for the JavaScript SDK; such pages usually contain concrete recommendations (e.g., enabling bulk mode, tuning connection options, handling throttling) and SDK-specific patterns that qualify as product-specific best practices.",unchanged +https://learn.microsoft.com/en-us/azure/cosmos-db/best-practice-python,Best practices for Python SDK,Best practices for Python SDK - Azure Cosmos DB,Optimize Azure Cosmos DB Python SDK performance,Review a list of best practices for using the Azure Cosmos DB Python SDK in a performant manner.,"This guide includes best practices for solutions built using the latest version of the Python SDK for Azure Cosmos DB for NoSQL. 
The best practices included here helps improve latency, improve availability, and boost overall performance for your solutions.",2026-04-16T17:12:00.000Z,best-practice,best-practices,0.86,True,"The page is explicitly a best practices guide for the Azure Cosmos DB Python SDK, focused on improving latency, availability, and performance. Such guidance typically includes product-specific recommendations (for example, preferred client reuse patterns, connection modes, retry configurations, and request options) that are unique to Cosmos DB’s Python SDK behavior and not just generic Python or database advice. This aligns with the best-practices criteria: concrete DO/DON'T guidance and SDK-specific configuration/code patterns.",unchanged +https://learn.microsoft.com/en-us/azure/cosmos-db/best-practices-javascript,Best practices for JavaScript SDK,Best practices for JavaScript SDK - Azure Cosmos DB,Optimize Azure Cosmos DB JavaScript SDK usage,Review a list of best practices for using the Azure Cosmos DB JavaScript SDK in a performant manner.,"This guide includes best practices for solutions built using the latest version of the JavaScript SDK for Azure Cosmos DB for NoSQL. The best practices included here helps improve latency, improve availability, and boost overall performance for your solutions.",2026-04-21T17:11:00.000Z,best-practice,best-practices,0.86,True,"The page is explicitly a best practices guide for the Cosmos DB JavaScript SDK, focused on improving latency, availability, and performance. 
Such SDK-specific recommendations typically include concrete patterns (e.g., connection handling, retry configuration, throughput usage) and product-specific gotchas that go beyond generic programming advice, fitting the best-practices sub-skill.",updated https://learn.microsoft.com/en-us/azure/cosmos-db/bicep-samples,Bicep syntax templates,Bicep Samples - Azure Cosmos DB,Deploy and configure Azure Cosmos DB with Bicep,Use Bicep to create and configure Azure Cosmos DB.,"This article shows Bicep samples for API for NoSQL accounts. You can also find Bicep samples for Cassandra, Gremlin, MongoDB, and Table APIs.",2025-12-19T18:19:00.000Z,sample,configuration,0.7,True,"Bicep sample collections for Cosmos DB typically include resource property names, allowed values, and configuration patterns (throughput, consistency, regions) that are product-specific and not just conceptual. These are concrete configuration definitions rather than generic how-to steps.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/bulk-executor-dotnet,Bulk executor - .NET library,Use Bulk Executor .NET Library in for Bulk Import and Update Operations - Azure Cosmos DB,Use .NET bulk executor library for Cosmos DB,Learn how to bulk import and update the Azure Cosmos DB documents using the bulk executor .NET library.,"Note The bulk executor library described in this article is maintained for applications that use the .NET SDK 2.x version. For new applications, you can use the bulk support that's directly available with the .NET SDK version 3.x, and it doesn't require any external library. If you currently use the bulk executor library and plan to migrate to bulk support on the newer SDK, use the steps in the Migration guide to migrate your application.
This tutorial provides instructions on how to use the bulk exe",2025-12-19T18:19:00.000Z,how-to,integrations,0.72,True,"How-to for bulk executor .NET library will include API usage patterns, method signatures, and configuration options (batch sizes, retries) specific to this library, which are integration/coding details.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/bulk-executor-java,Bulk executor - Java library,Use Bulk Executor Java Library in to Perform Bulk Import and Update Operations - Azure Cosmos DB,Use Java bulk support in Cosmos DB SDK v4,Bulk import and update Azure Cosmos DB documents using bulk executor Java library,"This tutorial provides instructions on performing bulk operations in the Azure Cosmos DB Java V4 SDK. This version of the SDK comes with the bulk executor library built-in. If you're using an older version of Java SDK, it's recommended to migrate to the latest version. Azure Cosmos DB Java V4 SDK is the current recommended solution for Java bulk support. Currently, the bulk executor library is supported only by Azure Cosmos DB for NoSQL and API for Gremlin accounts.
To learn about using bulk execu",2025-12-19T18:19:00.000Z,how-to,integrations,0.7,True,"Java bulk operations guide includes SDK-specific APIs, configuration options, and constraints for bulk import/update, which are detailed integration patterns.",unchanged @@ -76,7 +76,7 @@ https://learn.microsoft.com/en-us/azure/cosmos-db/cassandra/spark-hdinsight,Conn https://learn.microsoft.com/en-us/azure/cosmos-db/cassandra/spark-read-operation,Read data,Read API for Cassandra table data using Spark - Azure Cosmos DB for Apache Cassandra,Read Cosmos DB Cassandra table data using Spark,This article describes how to read data from API for Cassandra tables in Azure Cosmos DB.,"APPLIES TO: Cassandra Important Are you looking for a database solution for high-scale scenarios with a 99.999% availability service level agreement (SLA), instant autoscale, and automatic failover across multiple regions? Consider Azure Cosmos DB for NoSQL. Are you looking to migrate an existing Apache Cassandra application? Consider Azure Managed Instance for Apache Cassandra. This article describes how to read data stored in Azure Cosmos DB for Apache Cassandra from Spark.",2026-02-27T18:12:00.000Z,how-to,integrations,0.65,True,"Reading data via Spark connector involves product-specific options (consistency, partitioning, predicates) that are integration-focused expert details.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/cassandra/spark-table-copy-operations,Copy table data,Table copy operations on Azure Cosmos DB for Apache Cassandra from Spark - Azure Cosmos DB for Apache Cassandra,Copy tables in Cosmos DB Cassandra using Spark,This article details how to copy data between tables in Azure Cosmos DB for Apache Cassandra,"APPLIES TO: Cassandra Important Are you looking for a database solution for high-scale scenarios with a 99.999% availability service level agreement (SLA), instant autoscale, and automatic failover across multiple regions? Consider Azure Cosmos DB for NoSQL.
Are you looking to migrate an existing Apache Cassandra application? Consider Azure Managed Instance for Apache Cassandra. This article describes how to copy data between tables in Azure Cosmos DB for Apache Cassandra from Spark. The commands des",2026-02-27T18:12:00.000Z,how-to,integrations,0.6,True,Table copy operations via Spark connector involve specific configuration and performance considerations unique to Cosmos DB Cassandra integration.,unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/cassandra/spark-upsert-operations,Upsert data,Upsert data into Azure Cosmos DB for Apache Cassandra from Spark - Azure Cosmos DB for Apache Cassandra,Upsert data into Cosmos DB Cassandra from Spark,This article details how to upsert into tables in Azure Cosmos DB for Apache Cassandra from Spark,"APPLIES TO: Cassandra Important Are you looking for a database solution for high-scale scenarios with a 99.999% availability service level agreement (SLA), instant autoscale, and automatic failover across multiple regions? Consider Azure Cosmos DB for NoSQL. Are you looking to migrate an existing Apache Cassandra application? Consider Azure Managed Instance for Apache Cassandra.
This article describes how to upsert data into Azure Cosmos DB for Apache Cassandra from Spark.",2026-02-27T18:12:00.000Z,how-to,integrations,0.65,True,Upsert semantics and connector options for Cosmos DB Cassandra via Spark are product-specific integration patterns and may include configuration flags and constraints.,unchanged -https://learn.microsoft.com/en-us/azure/cosmos-db/cassandra/support,Wire protocol support,Apache Cassandra features supported by Azure Cosmos DB for Apache Cassandra - Azure Cosmos DB for Apache Cassandra,,Learn about the Apache Cassandra feature support in Azure Cosmos DB for Apache Cassandra and the benefits of the Apache Cassandra APIs.,"APPLIES TO: Cassandra Important Are you looking for a database solution for high-scale scenarios with a 99.999% availability service level agreement (SLA), instant autoscale, and automatic failover across multiple regions? Consider Azure Cosmos DB for NoSQL. Are you looking to migrate an existing Apache Cassandra application? Consider Azure Managed Instance for Apache Cassandra. Azure Cosmos DB is Microsoft's globally distributed multi-model database service. You can communicate with the Azure Cosmos",2026-04-16T17:12:00.000Z,overview,,0.2,False,"Page appears to be a feature-support/overview comparison between Apache Cassandra and Azure Cosmos DB for Cassandra API, likely listing which features are supported rather than detailed limits, configuration parameters, or troubleshooting mappings.
No clear evidence of numeric limits, config tables, error codes, or decision matrices from the provided summary.",updated +https://learn.microsoft.com/en-us/azure/cosmos-db/cassandra/support,Wire protocol support,Apache Cassandra features supported by Azure Cosmos DB for Apache Cassandra - Azure Cosmos DB for Apache Cassandra,,Learn about the Apache Cassandra feature support in Azure Cosmos DB for Apache Cassandra and the benefits of the Apache Cassandra APIs.,"APPLIES TO: Cassandra Important Are you looking for a database solution for high-scale scenarios with a 99.999% availability service level agreement (SLA), instant autoscale, and automatic failover across multiple regions? Consider Azure Cosmos DB for NoSQL. Are you looking to migrate an existing Apache Cassandra application? Consider Azure Managed Instance for Apache Cassandra. Azure Cosmos DB is Microsoft's globally distributed multi-model database service. You can communicate with the Azure Cosmos",2026-04-16T17:12:00.000Z,overview,,0.2,False,"Page appears to be a feature-support/overview comparison between Apache Cassandra and Azure Cosmos DB for Cassandra API, likely listing which features are supported rather than detailed limits, configuration parameters, or troubleshooting mappings. No clear evidence of numeric limits, config tables, error codes, or decision matrices from the provided summary.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/cassandra/templates-samples,JSON syntax templates,Resource Manager templates for Azure Cosmos DB for Apache Cassandra - Azure Cosmos DB for Apache Cassandra,Deploy Cosmos DB Cassandra resources with ARM templates,Use Azure Resource Manager templates to create and configure Azure Cosmos DB for Apache Cassandra.,"APPLIES TO: Cassandra Important Are you looking for a database solution for high-scale scenarios with a 99.999% availability service level agreement (SLA), instant autoscale, and automatic failover across multiple regions?
Consider Azure Cosmos DB for NoSQL. Are you looking to migrate an existing Apache Cassandra application? Consider Azure Managed Instance for Apache Cassandra. In this article, you learn how to use Azure Resource Manager templates to help deploy and manage your Azure Cosmos DB accou",2026-02-27T18:12:00.000Z,how-to,deployment,0.6,True,"Resource Manager templates article includes schema, parameter names, and allowed values for deploying Cosmos DB Cassandra accounts and resources.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/cassandra/tokens,Tokens and the Token function,Tokens and the Token Function in Azure Cosmos DB for Apache Cassandra - Azure Cosmos DB for Apache Cassandra,Use tokens and token() function in Cosmos DB Cassandra API,Tokens and the Token Function in Azure Cosmos DB for Apache Cassandra.,"APPLIES TO: Cassandra Important Are you looking for a database solution for high-scale scenarios with a 99.999% availability service level agreement (SLA), instant autoscale, and automatic failover across multiple regions? Consider Azure Cosmos DB for NoSQL. Are you looking to migrate an existing Apache Cassandra application? Consider Azure Managed Instance for Apache Cassandra.
This article discusses tokens and token function in Azure Cosmos DB for Apache Cassandra and clarifies the difference betwe",2026-02-27T18:12:00.000Z,overview,configuration,0.6,True,"Token behavior in Cosmos DB’s Cassandra API is product-specific; article likely details partitioning, token ranges, and usage differences vs open-source Cassandra.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/cassandra/troubleshoot-common-issues,Troubleshoot common issues,Troubleshoot common errors in the Azure Cosmos DB for Apache Cassandra - Azure Cosmos DB for Apache Cassandra,Troubleshoot common Cosmos DB Cassandra API errors,This article discusses common issues in the Azure Cosmos DB for Apache Cassandra and how to troubleshoot them.,"APPLIES TO: Cassandra Important Are you looking for a database solution for high-scale scenarios with a 99.999% availability service level agreement (SLA), instant autoscale, and automatic failover across multiple regions? Consider Azure Cosmos DB for NoSQL. Are you looking to migrate an existing Apache Cassandra application? Consider Azure Managed Instance for Apache Cassandra. The API for Cassandra in Azure Cosmos DB is a compatibility layer that provides wire protocol support for the open-source Apach",2026-02-27T18:12:00.000Z,troubleshooting,troubleshooting,0.9,True,"Explicit troubleshooting article; these typically list specific error codes/messages, causes, and resolutions unique to Cosmos DB Cassandra API.",unchanged @@ -121,13 +121,11 @@ https://learn.microsoft.com/en-us/azure/cosmos-db/disaster-recovery-guidance,Dis https://learn.microsoft.com/en-us/azure/cosmos-db/distribute-data-globally,Global distribution overview,Distribute Data Globally - Azure Cosmos DB,,"Learn about planet-scale geo-replication, multi-region writes, failover, and data recovery using global databases from Azure Cosmos DB, a globally distributed, multi-model database service.","Today's applications are required to be highly responsive and always online.
To achieve low latency and high availability, instances of these applications need to be deployed in datacenters that are close to their users. These applications are typically deployed in multiple datacenters and are called globally distributed. Globally distributed applications need a globally distributed database that can transparently replicate the data anywhere in the world to enable the applications to operate on ",2025-07-08T22:09:00.000Z,concept-article,,0.3,False,"High-level explanation of global distribution concepts (geo-replication, multi-region writes, failover) without detailed limits, configuration tables, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/distribute-throughput-across-partitions-faq,Distribute throughput across partitions FAQ,Frequently Asked Questions on Distributing Throughput Across Partitions in (Preview) - Azure Cosmos DB,FAQ on Cosmos DB throughput redistribution limits,Frequently asked questions about distributing throughput across partitions in Azure Cosmos DB,The throughput redistribution feature of Azure Cosmos DB gives you the ability to redistribute your provisioned throughput across physical partitions. This article answers commonly asked questions about Azure Cosmos DB throughput redistribution across partitions.,2025-12-19T18:19:00Z,faq,limits-quotas,0.7,True,"FAQ for throughput redistribution will enumerate constraints, supported scenarios, and numeric limits for the preview feature, which are expert, product-specific limits.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/distributed-nosql,Understand distributed NoSQL,Understand Distributed NoSQL Databases - Azure Cosmos DB,,Learn about distributed NoSQL databases and how you can use them together with your cloud-native global-scale applications with flexible data schemas.,"Azure Cosmos DB is a globally distributed database platform for both NoSQL and relational databases of any scale. 
This article explores distributed NoSQL databases in the context of Azure Cosmos DB’s various NoSQL API options. For more information about other data storage options in Azure, seeUnderstand data store models.",2025-12-15T23:09:00.000Z,overview,,0.1,False,"Conceptual explanation of distributed NoSQL databases; no indication of product-specific limits, configs, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/cosmos-db/dynamic-data-masking,Dynamic data masking (DDM),Dynamic Data Masking (DDM) (Preview) - Azure Cosmos DB,Configure Dynamic Data Masking in Cosmos DB,Learn how to configure Dynamic Data Masking (DDM) in Azure Cosmos DB to protect sensitive data like personal data and protected health information with policy-based security features.,"This article explains how to configure Dynamic Data Masking on your Azure Cosmos DB account. Important Dynamic Data Masking is in public preview. -This feature is provided without a service level agreement. -For more information, seeSupplemental Terms of Use for Microsoft Azure Previews.",2025-12-19T18:19:00.000Z,feature-guide,security,0.7,True,"DDM configuration article will include product-specific masking policy options, field-level settings, and constraints that qualify as expert security configuration knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/cosmos-db/dynamic-data-masking,Dynamic data masking (DDM),Dynamic Data Masking (DDM) - Azure Cosmos DB,Configure Dynamic Data Masking in Azure Cosmos DB,Learn how to configure Dynamic Data Masking (DDM) in Azure Cosmos DB to protect sensitive data like personal data and protected health information with policy-based security features.,This article explains how to configure Dynamic Data Masking on your Azure Cosmos DB account.,2026-04-20T22:08:00.000Z,feature-guide,security,0.8,True,"Describes how to configure Dynamic Data Masking, a product-specific security feature; likely includes concrete configuration steps, policy definitions, and 
parameter values for masking sensitive data in Cosmos DB.",updated https://learn.microsoft.com/en-us/azure/cosmos-db/dynamo-to-cosmos,Migrate - DynamoDB application to API for NoSQL,Migrate Your Application From Amazon DynamoDB - Azure Cosmos DB,Migrate .NET applications from Amazon DynamoDB to Azure Cosmos DB,Learn how to migrate your .NET application from Amazon DynamoDB to Azure Cosmos DB.,"Azure Cosmos DB is a scalable, globally distributed, fully managed database. It provides guaranteed low-latency access to your data. This article describes how to migrate your .NET application from Amazon DynamoDB to Azure Cosmos DB with minimal code changes. To learn more about Azure Cosmos DB, see the overview article.",2025-12-19T18:19:00.000Z,how-to,integrations,0.7,True,"Describes mapping DynamoDB SDK usage to Cosmos DB .NET SDK, including API differences and configuration changes, which are product-specific integration patterns.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/dynamodb-data-migration-cosmos-db,Migrate - DynamoDB data to API for NoSQL,Migrate Data From DynamoDB - Azure Cosmos DB,Choose and implement data migration from DynamoDB to Cosmos DB,"Learn about data migration options from DynamoDB to Azure Cosmos DB, and walk through an offline migration process by using Azure services.","This article focuses on data migration from Amazon DynamoDB to Azure Cosmos DB for NoSQL. Before you begin, it's important to understand the difference between data migration and application migration: Depending on your requirements, you can migrate data in parallel with migrating an application. But data migration is often a prerequisite to application migration. 
To learn more about application migration, see Migrate your application from Amazon DynamoDB to Azure Cosmos DB.",2025-12-19T18:19:00.000Z,how-to,decision-making,0.7,True,"Focuses on data migration options and an offline migration process using Azure services; includes when to choose each option and process steps, which is migration decision-making.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/emulator,Emulator,Emulator (Docker/local) - Azure Cosmos DB,Configure and use Azure Cosmos DB local emulator,Use the Azure Cosmos DB local or docker-based emulator to test your applications against multiple API endpoints.,"The Azure Cosmos DB emulator provides a local environment that emulates the Azure Cosmos DB service designed for development purposes. Using the emulator, you can develop and test your application locally, without creating an Azure subscription or incurring any service costs. When you're satisfied with how your application is working with the emulator, you can transition to using an Azure Cosmos DB account with minimal friction. Important We do not recommend the use of the emulator for productio",2024-11-19T13:29:00.000Z,concept-article,configuration,0.7,True,"Emulator docs typically list ports, connection strings, feature support, and configuration flags, which are concrete configuration details specific to the emulator environment.",unchanged
-https://learn.microsoft.com/en-us/azure/cosmos-db/emulator-linux,Linux emulator (preview),Linux-based emulator - vNext (preview) - Azure Cosmos DB,Run Azure Cosmos DB Linux-based emulator container,Use the Azure Cosmos DB Linux-based emulator to test your applications against API for NoSQL endpoints.,"The next generation of the Azure Cosmos DB emulator is entirely Linux-based and is available as a Docker container. It supports running on a wide variety of processors and operating systems. 
Important This version of the emulator only supports the API for NoSQL ingateway mode, with a select subset of features. For more information, seefeature support.",2025-11-18T18:29:00.000Z,how-to,configuration,0.7,True,"Linux-based emulator article will include container image names, environment variables, ports, and supported modes, which are configuration parameters.",unchanged +https://learn.microsoft.com/en-us/azure/cosmos-db/emulator-linux,Linux emulator (preview),Linux-based emulator - vNext (preview) - Azure Cosmos DB,,Use the Azure Cosmos DB Linux-based emulator to test your applications against API for NoSQL endpoints.,"The next generation of the Azure Cosmos DB emulator is entirely Linux-based and is available as a Docker container. It supports running on a wide variety of processors and operating systems. Important This version of the emulator only supports the API for NoSQL ingateway mode, with a select subset of features. For more information, seefeature support.",2026-04-21T17:11:00.000Z,how-to,,0.3,False,"The summary describes a conceptual/overview page for the Linux-based Cosmos DB emulator (what it is, that it runs as a Docker container, and that it supports a subset of features). There is no indication of detailed configuration tables, limits, error codes, or other expert-level specifics in the provided description.",updated https://learn.microsoft.com/en-us/azure/cosmos-db/emulator-release-notes,Release notes,Release Notes for the Windows (Local) Emulator - Azure Cosmos DB,,View the release notes for the latest version and previous versions of the Azure Cosmos DB Windows (local) emulator.,The Azure Cosmos DB emulator is updated at a regular cadence with release notes provided in this article. 
Download latest version (2.14.2),2025-06-24T08:00:00.000Z,release-notes,,0.3,False,"Release notes are version history and change logs, not structured skills content like limits, configuration matrices, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/emulator-windows-arguments,Windows command-line arguments,Windows emulator command-line and PowerShell reference - Azure Cosmos DB,Control Cosmos DB Windows emulator via CLI and PowerShell,Manage the Azure Cosmos DB emulator with the command line or PowerShell and change the configuration of the emulator.,"The Azure Cosmos DB emulator provides a local environment that emulates the Azure Cosmos DB service for local development purposes. After installing the emulator, you can control the emulator with command line and PowerShell commands. This article describes how to use the command-line and PowerShell commands to start and stop the emulator, configure options, and perform other operations. You have to run the commands from the installation location. Important This article only includes command-lin",2025-03-10T08:00:00.000Z,reference,configuration,0.9,True,"Command-line and PowerShell reference includes specific command names, arguments, and configuration options (e.g., /Port, /Key, /EnableMongoDbEndpoint), which are explicit configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/estimate-ru-with-capacity-planner,Estimate RU/s with capacity planner,Estimate Costs using Capacity Planner - API for NoSQL - Azure Cosmos DB,Estimate Cosmos DB RU/s and cost with capacity planner,Learn how to use Azure Cosmos DB capacity planner to estimate the throughput and cost required when using Azure Cosmos DB for NoSQL.,"Note If you're planning a data migration to Azure Cosmos DB and all that you know is the number of vCores and servers in your existing sharded and replicated database cluster, read about estimating request units using vCores or vCPUs. 
To optimize cost and performance, it's essential to configure your Azure Cosmos DB databases and containers with the right amount of provisioned throughput, or Request Units (RU/s). This article describes how to use the Azure Cosmos DB capacity planner to estimate the ",2025-12-19T18:19:00.000Z,how-to,decision-making,0.8,True,"Guides use of the capacity planner to estimate RU/s and cost, including planner inputs/outputs and how to size resources; directly supports capacity planning and tier selection decisions.",unchanged @@ -183,7 +181,7 @@ https://learn.microsoft.com/en-us/azure/cosmos-db/gremlin/solution-supply-chain- https://learn.microsoft.com/en-us/azure/cosmos-db/gremlin/support,Compatibility,Azure Cosmos DB for Gremlin support and compatibility with TinkerPop features - Azure Cosmos DB for Apache Gremlin,Use TinkerPop Gremlin features with Cosmos DB,Learn about the Gremlin language from Apache TinkerPop. Learn which features and steps are available in Azure Cosmos DB and the TinkerPop Graph engine compatibility differences.,"Important Are you looking for a database solution for high-scale scenarios with a 99.999% availability service level agreement (SLA), instant autoscale, and automatic failover across multiple regions? Consider Azure Cosmos DB for NoSQL. Are you looking to implement an online analytical processing (OLAP) graph or migrate an existing Apache Gremlin application? Consider Graph in Microsoft Fabric. Azure Cosmos DB supports Apache TinkerPop's graph traversal language, known as Gremlin. 
You can use the Greml",2025-10-28T22:14:00.000Z,overview,integrations,0.7,True,"Describes compatibility between Azure Cosmos DB for Gremlin and Apache TinkerPop, including which steps/features are supported; this is product-specific API/feature mapping that LLMs are unlikely to know in detail and fits integration/compatibility patterns.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/hierarchical-partition-keys,Hierarchical partition keys overview,Hierarchical Partition Keys - Azure Cosmos DB,Use hierarchical partition keys in Cosmos DB,"Learn about subpartitioning in Azure Cosmos DB, how to use the feature, and how to manage logical partitions.","Azure Cosmos DB distributes your data across logical and physical partitions based on your partition keys to support horizontal scaling. By using hierarchical partition keys (also called subpartitioning), you can configure up to a three-level hierarchy for your partition keys to further optimize data distribution and for a higher level of scaling. If you use synthetic keys today, have scenarios in which partition keys can exceed 20 GB of data, or would like to ensure that each tenant's document ma",2026-02-02T08:00:00.000Z,concept-article,best-practices,0.7,True,"Describes subpartitioning with up to three levels and when to use it (e.g., >20 GB per key, tenant isolation); includes product-specific guidance and constraints.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/hierarchical-partition-keys-faq,Hierarchical partition keys FAQ,Frequently Asked Questions on Hierarchical Partition Keys - Azure Cosmos DB,,Frequently asked questions about subpartitioning in Azure Cosmos DB,"Hierarchical partition keys, or subpartitioning, allows you to configure up to a three-level hierarchy for your partition keys to further optimize data distribution and enable higher scale. 
This article answers commonly asked questions about Azure Cosmos DB Hierarchical partition keys.",2026-03-12T06:17:00Z,faq,,0.3,False,"FAQ about hierarchical partition keys appears primarily conceptual and explanatory (what they are, how they behave). The description does not indicate presence of numeric limits, configuration parameter tables, error codes, or decision matrices with quantified trade-offs. Without evidence of concrete limits, config values, or troubleshooting mappings, it does not meet the expert-knowledge criteria for any sub-skill type.",unchanged -https://learn.microsoft.com/en-us/azure/cosmos-db/hierarchical-partition-keys-unlimited-scale,Unlimited scale with hierarchical partition keys,Unlimited logical partition storage with hierarchical partition keys - Azure Cosmos DB,Use hierarchical partition keys to bypass 20-GB limit,Learn how to use hierarchical partition keys with a unique last level to scale beyond the 20-GB logical partition limit in Azure Cosmos DB.,"This document explains how hierarchical partition keys (HPK) can help you model high-cardinality data. To avoid hitting the 20 GB logical partition size limit, use a multi-level partition key path that ends with a unique identifier, such as /id. In Azure Cosmos DB, items are grouped into logical partitions based on the value of the partition key. A single logical partition has a maximum storage size of 20 GB. For more information, seelogical partitions overview. 
If your data model causes too man",2026-04-17T22:09:00.000Z,concept-article,limits-quotas,0.78,True,"Contains the exact 20-GB logical partition storage limit and prescriptive guidance on using hierarchical partition keys with a unique last level (for example /id) to avoid hitting that limit, which is product-specific quota knowledge.",new +https://learn.microsoft.com/en-us/azure/cosmos-db/hierarchical-partition-keys-unlimited-scale,Unlimited scale with hierarchical partition keys,Unlimited logical partition storage with hierarchical partition keys - Azure Cosmos DB,Use hierarchical partition keys to bypass 20-GB limit,Learn how to use hierarchical partition keys with a unique last level to scale beyond the 20-GB logical partition limit in Azure Cosmos DB.,"This document explains how hierarchical partition keys (HPK) can help you model high-cardinality data. To avoid hitting the 20 GB logical partition size limit, use a multi-level partition key path that ends with a unique identifier, such as /id. In Azure Cosmos DB, items are grouped into logical partitions based on the value of the partition key. A single logical partition has a maximum storage size of 20 GB. For more information, seelogical partitions overview. If your data model causes too man",2026-04-17T22:09:00.000Z,concept-article,limits-quotas,0.78,True,"Contains the exact 20-GB logical partition storage limit and prescriptive guidance on using hierarchical partition keys with a unique last level (for example /id) to avoid hitting that limit, which is product-specific quota knowledge.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-add-assign-user-roles,Assign users and roles,Add and assign user roles - Azure Cosmos DB,Add and assign Cosmos DB RBAC user roles,Learn how to add and assign user roles for Azure Cosmos DB for NoSQL accounts. 
Follow step-by-step instructions to configure role-based access control and secure your database resources.,"Warning Resource Owner Password Credential (ROPC) flow isn't recommended for production Azure Cosmos DB for NoSQL workloads as it requires handling credentials directly, which poses security risks. For more secure authentication, use role-based access control with Microsoft Entra ID. For more information, see role-based access control and Microsoft Entra ID authentication in Azure Cosmos DB for NoSQL. Azure Cosmos DB for NoSQL allows you to assign roles to control access to your database resource",2025-12-19T18:19:00.000Z,how-to,security,0.75,True,"Covers assigning roles for Cosmos DB for NoSQL; includes specific role names, scopes, and security recommendations (e.g., avoid ROPC), fitting security configuration.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-alert-on-logical-partition-key-storage-size,Create alert on logical partition key size,Create alerts to monitor if storage for a logical partition key is approaching 20 GB - Azure Cosmos DB,Alert when Cosmos DB logical partitions near 20 GB limit,Learn how to set up alerts for Azure Cosmos DB using Log Analytics,"Azure Cosmos DB enforces a maximum logical partition key size of 20 GB. For example, if you have a container/collection partitioned by UserId, the data within the ""Alice"" logical partition can store up to 20 GB of data. You can use alerts to monitor if you have any logical partition keys that are approaching the 20-GB logical partition limit. Alerts can send you a notification in the form of an email or execute an action, such as an Azure Function or Logic App, when the condition is triggered. 
In",2026-03-12T06:17:00.000Z,how-to,limits-quotas,0.86,True,"Page states the exact enforced maximum logical partition key size of 20 GB for Azure Cosmos DB and shows how to monitor when a logical partition approaches this numeric limit, which is a concrete quota value not reliably known from training.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-always-encrypted,Use Always Encrypted,Use client-side encryption with Always Encrypted - Azure Cosmos DB,Use Always Encrypted client-side encryption in Cosmos DB,Learn how to use client-side encryption with Always Encrypted for Azure Cosmos DB,"Important A breaking change has been introduced with the 1.0 release of our encryption packages. If you created data encryption keys and encryption-enabled containers with prior versions, you'll need to re-create your databases and containers after migrating your client code to 1.0 packages. Always Encrypted is a feature designed to protect sensitive data, such as credit card numbers or national/regional identification numbers (for example, U.S. social security numbers), stored in Azure Cosmos D",2025-05-08T08:00:00.000Z,how-to,security,0.8,True,"Always Encrypted setup involves SDK configuration, key store parameters, and version-specific behaviors; these are product-specific security configuration details.",unchanged @@ -233,7 +231,7 @@ https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-javascript-query-items, https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-javascript-vector-index-query,JavaScript,Index and Query Vector Data in JavaScript - Azure Cosmos DB,Index and query vector data in Cosmos DB with JavaScript,Add vector data in Azure Cosmos DB for NoSQL and then query the data efficiently in your JavaScript application.,"This article explains how to create vector data, index the data, and then query the data in a container. Before you use vector indexing and search, you must first enable vector search in Azure Cosmos DB for NoSQL. 
After you set up the Azure Cosmos DB container for vector search, you create a vector embedding policy. Next, you add vector indexes to the container indexing policy. Then you create a container with vector indexes and a vector embedding policy. Finally, you perform a vector search on ",2025-12-19T18:19:00.000Z,how-to,integrations,0.7,True,"JavaScript SDK guide for vector indexing/search; contains API calls and configuration options (embedding policy, index config) specific to Cosmos DB’s JS SDK.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-manage-conflicts,Configure conflict resolution policies,Manage Conflicts Between Regions - Azure Cosmos DB,Manage multi-region conflict resolution in Cosmos DB,Learn how to manage conflicts in Azure Cosmos DB by creating the last-writer-wins or a custom conflict resolution policy,"With multi-region writes, when multiple clients write to the same item, conflicts might occur. When a conflict occurs, you can resolve the conflict by using different conflict resolution policies. This article describes how to manage conflict resolution policies. Tip Conflict resolution policy can only be specified at container creation time and can't be modified after container creation.",2025-12-19T18:19:00.000Z,how-to,configuration,0.7,True,"Explains how to configure last-writer-wins or custom conflict resolution at container creation, including constraints (only at creation time) and policy settings—specific configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-manage-consistency,Manage consistency levels,Manage Consistency - Azure Cosmos DB,Configure and override Cosmos DB consistency levels,"Learn how to configure and manage consistency levels in Azure Cosmos DB using Azure portal, .NET SDK, Java SDK, and various other SDKs.","This article explains how to manage consistency levels in Azure Cosmos DB. 
You learn how to configure the default consistency level, override the default consistency, manually manage session tokens, and understand the Probabilistically Bounded Staleness (PBS) metric. As you change your account level consistency, ensure you redeploy your applications and make any necessary code modifications to apply these changes. Note We recommend that you use the Azure Az PowerShell module to interact with Azu",2025-12-19T18:19:00.000Z,how-to,configuration,0.7,True,"Shows how to configure default and per-request consistency, manage session tokens, and interpret PBS metrics using specific SDK and portal settings—product-specific configuration knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-manage-database-account,Manage an Azure Cosmos DB account,Manage by Using the Azure Portal - Azure Cosmos DB,Manage Cosmos DB accounts and understand control plane limits,"Learn how to manage Azure Cosmos DB resources by using the Azure portal, PowerShell, CLI, and Azure Resource Manager templates.","This article describes how to manage various tasks on an Azure Cosmos DB account by using the Azure portal. Azure Cosmos DB can also be managed with other Azure management clients, includingAzure PowerShell,Azure CLI,Azure Resource Manager templates,Bicep, andTerraform. Tip The management API for Azure Cosmos DB orcontrol planeisn't designed for high-request volumes like the rest of the service. 
To learn more, seeControl Plane Service Limits",2026-02-02T18:14:00.000Z,how-to,limits-quotas,0.6,True,"Management article that explicitly references control plane service limits; full content typically includes specific request rate limits and constraints for management operations—numeric, product-specific limits.",unchanged +https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-manage-database-account,Manage an Azure Cosmos DB account,Manage by Using the Azure Portal - Azure Cosmos DB,,"Learn how to manage Azure Cosmos DB resources by using the Azure portal, PowerShell, CLI, and Azure Resource Manager templates.","This article describes how to manage various tasks on an Azure Cosmos DB account by using the Azure portal. Azure Cosmos DB can also be managed with other Azure management clients, includingAzure PowerShell,Azure CLI,Azure Resource Manager templates,Bicep, andTerraform. Tip The management API for Azure Cosmos DB orcontrol planeisn't designed for high-request volumes like the rest of the service. To learn more, seeControl Plane Service Limits",2026-04-21T17:11:00.000Z,how-to,,0.2,False,"Page is a general management/how-to guide for Azure Cosmos DB accounts via portal/CLI/ARM. The summary and URL indicate step-by-step management operations, not detailed limits, configuration parameter tables, error-code-based troubleshooting, or decision matrices. 
The only limits reference is a link to separate control plane service limits, so this page itself does not contain the expert numerical or configuration details required by any sub-skill type.",updated https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-manage-indexing-policy,Manage indexing policies,Manage Indexing Policies - Azure Cosmos DB,Manage Cosmos DB indexing policies via SDKs,"Learn how to manage indexing policies, include or exclude a property from indexing, how to define indexing using different Azure Cosmos DB SDKs.","In Azure Cosmos DB, data is indexed following indexing policies that are defined for each container. The default indexing policy for newly created containers enforces range indexes for any string or number. You can override this policy with your own custom indexing policy. Note The method of updating indexing policies described in this article only applies to Azure Cosmos DB for NoSQL. Learn about indexing in Azure Cosmos DB for MongoDB and Secondary indexing in Azure Cosmos DB for Apache Cassandra.",2025-12-19T18:19:00.000Z,how-to,configuration,0.8,True,"Shows how to include/exclude paths and update indexing policies using specific SDK methods and policy fields, which are concrete configuration instructions.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-migrate-desktop-tool,Desktop data migration tool,Migrate Data Using the Data Migration Tool - Azure Cosmos DB,Migrate data to Azure Cosmos DB using Data Migration Tool,"Use the Azure Cosmos DB Data Migration Tool to migrate data from JSON, MongoDB, SQL Server, and many other databases and file formats to Azure Cosmos DB.",The Azure Cosmos DB Data Migration Tool is an open-source command-line application to import or export data from Azure Cosmos DB. 
The tool is built on an extension model for source and sink objects to migrate data.,2026-01-21T23:12:00.000Z,how-to,integrations,0.7,True,"Data Migration Tool reference typically includes command-line options, connection parameters, and source/sink configuration settings, which are integration patterns for moving data.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-migrate-from-bulk-executor-library,Migrate from bulk executor to .NET SDK,Migrate From the Bulk Executor Library to the Bulk Support in .NET V3 SDK - Azure Cosmos DB,Migrate from .NET bulk executor to SDK v3 bulk,Learn how to migrate your application from using the bulk executor library to the bulk support in Azure Cosmos DB SDK V3,This article describes the required steps to migrate an existing application's code that uses the .NET bulk executor library to the bulk support feature in the latest version of the .NET SDK.,2025-12-19T18:19:00.000Z,how-to,decision-making,0.68,True,"Migration guide describes how to move from external bulk library to built-in bulk support, including mapping of APIs and behaviors, which is concrete migration and version-selection decision guidance.",unchanged @@ -244,7 +242,7 @@ https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-multi-master,Configure https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-provision-autoscale-throughput,Use autoscale throughput - database or container,Provision Autoscale Throughput - Azure Cosmos DB,Enable and configure autoscale throughput in Cosmos DB,"Learn how to provision autoscale throughput at the container and database level in Azure Cosmos DB for NoSQL using Azure portal, CLI, PowerShell, and various other SDKs.","This article explains how to enable autoscale throughput on a database or container (collection, graph, or table) in Azure Cosmos DB for NoSQL. You can enable autoscale on a single container, or provision autoscale throughput on a database and share it among all the containers in the database. 
If you're using a different API, see API for MongoDB, API for Cassandra, or API for Gremlin.",2025-12-19T18:19:00.000Z,how-to,configuration,0.75,True,Explains enabling autoscale at database/container level with portal/CLI/SDK; includes autoscale-specific configuration parameters and allowed values unique to Cosmos DB.,unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-provision-container-throughput,Use standard (manual) throughput - container,Provision Container Throughput - Azure Cosmos DB,Provision container-level throughput in Cosmos DB,"Learn how to provision throughput at the container level in Azure Cosmos DB for NoSQL using Azure portal, CLI, PowerShell and various other SDKs.","This article explains how to provision standard (manual) throughput on a container in Azure Cosmos DB for NoSQL. You can provision throughput on a single container, or provision throughput on a database and share it among the containers within the database. You can provision throughput on a container using Azure portal, Azure CLI, or Azure Cosmos DB SDKs. If you are using a different API, see API for MongoDB, API for Cassandra, API for Gremlin articles to provision the throughput.",2025-12-19T18:19:00.000Z,how-to,configuration,0.7,True,"How-to article for provisioning throughput via portal/CLI/SDKs; typically includes specific parameters, request bodies, and options for configuring RU/s at container scope.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-provision-database-throughput,Use standard (manual) throughput - database,Provision Database Throughput - Azure Cosmos DB,Provision database-level throughput in Cosmos DB,"Learn how to provision throughput at the database level in Azure Cosmos DB for NoSQL using Azure portal, CLI, PowerShell and various other SDKs.","This article explains how to provision standard (manual) throughput on a database in Azure Cosmos DB for NoSQL. 
You can provision throughput for a singlecontainer, or for a database and share the throughput among the containers within it. To learn when to use container level and database level throughput, see theUse cases for provisioning throughput on containers and databasesarticle. You can provision database level throughput by using the Azure portal or Azure Cosmos DB SDKs. If you are using ",2025-12-19T18:19:00.000Z,how-to,configuration,0.7,True,"Covers configuring throughput at database scope using portal and SDKs, with product-specific options and parameters for RU/s sharing across containers.",unchanged -https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-python-create-container,Create a container,Create a Container Using Python - Azure Cosmos DB,,Learn how to create a container in your Azure Cosmos DB for NoSQL database using the Python SDK.,"Containers in Azure Cosmos DB store sets of items. Before you can create, query, or manage items, you must first create a container.",2026-04-15T17:08:00.000Z,how-to,,0.2,False,"Page is a how-to/tutorial for creating a Cosmos DB container with Python. It likely shows basic SDK usage and example code, but not product-specific limits, configuration tables, error-code mappings, or other structured expert details as defined (no quotas, decision matrices, RBAC tables, or config parameter catalogs).",updated +https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-python-create-container,Create a container,Create a Container Using Python - Azure Cosmos DB,,Learn how to create a container in your Azure Cosmos DB for NoSQL database using the Python SDK.,"Containers in Azure Cosmos DB store sets of items. Before you can create, query, or manage items, you must first create a container.",2026-04-15T17:08:00.000Z,how-to,,0.2,False,"Page is a how-to/tutorial for creating a Cosmos DB container with Python. 
It likely shows basic SDK usage and example code, but not product-specific limits, configuration tables, error-code mappings, or other structured expert details as defined (no quotas, decision matrices, RBAC tables, or config parameter catalogs).",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-python-create-database,Create a database,Create a Database Using Python - Azure Cosmos DB,Create Cosmos DB databases with Python SDK,Learn how to create a database in your Azure Cosmos DB for NoSQL account using the Python SDK.,"Databases in Azure Cosmos DB are units of management for one or more containers. Before you can create or manage containers, you must first create a database.",2025-12-19T18:19:00.000Z,how-to,integrations,0.65,True,Shows Python SDK database creation methods and options that are specific to Cosmos DB.,unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-python-get-started,Get started,Get started using Python - Azure Cosmos DB,Connect to Cosmos DB for NoSQL using Python SDK,Get started developing a Python application that works with Azure Cosmos DB for NoSQL. This article helps you learn how to set up a project and configure access to an Azure Cosmos DB for NoSQL endpoin,"This article explains how to connect to Azure Cosmos DB for NoSQL using the Python SDK. After connecting, perform operations on databases, containers, and items. 
Package (PyPi)|API reference|Library source code|Give feedback",2025-12-19T18:19:00.000Z,how-to,integrations,0.7,True,"Includes PyPI package usage, client configuration, and connection patterns specific to the Cosmos DB Python SDK.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-python-vector-index-query,Python,Index and Query Vector Data in Python - Azure Cosmos DB,Index and query vector data in Cosmos DB with Python,Add vector data in Azure Cosmos DB for NoSQL and then query the data efficiently in your Python application.,"This article explains how to create vector data, index the data, and then query the data in a container. Before you use vector indexing and search, you must first enable vector search in Azure Cosmos DB for NoSQL. After you set up the Azure Cosmos DB container for vector search, you create a vector embedding policy. Next, you add vector indexes to the container indexing policy. Then you create a container with vector indexes and a vector embedding policy. Finally, you perform a vector search on ",2025-12-19T18:19:00.000Z,how-to,integrations,0.7,True,Python SDK usage for vector indexing/search; includes concrete SDK method signatures and parameters unique to Cosmos DB vector support.,unchanged @@ -416,7 +414,7 @@ https://learn.microsoft.com/en-us/azure/cosmos-db/periodic-backup-update-storage https://learn.microsoft.com/en-us/azure/cosmos-db/policy,Azure Policy support,Use Azure Policy to Implement Governance and Controls for Resources - Azure Cosmos DB,Apply Azure Policy governance to Cosmos DB resources,Learn how to use Azure Policy to implement governance and controls for Azure Cosmos DB resources.,"Azure Policyhelps to enforce organizational governance standards, assess resource compliance, and implement automatic remediation. Common use cases include security, cost management, and configuration consistency. Azure Policy provides built-in policy definitions. 
You can create custom policy definitions for scenarios that are not addressed by the built-in policy definitions. See the Azure Policy documentation for more details. Important Azure Policy is enforced at the resource provider level for ",2025-12-19T18:19:00.000Z,how-to,security,0.7,True,Shows how to use Azure Policy for Cosmos DB; will reference specific policy definitions and configuration relevant to security/compliance for this service.,unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/policy-reference,Azure Policy built-ins,Built-in Policy Definitions - Azure Cosmos DB,Use built-in Azure Policy definitions for Cosmos DB,Lists Azure Policy built-in policy definitions for Azure Cosmos DB. These built-in policy definitions provide common approaches to managing your Azure resources.,"This page is an index of Azure Policy built-in policy definitions for Azure Cosmos DB. For additional Azure Policy built-ins for other services, see Azure Policy built-in definitions. The name of each built-in policy definition links to the policy definition in the Azure portal. Use
-the link in the Version column to view the source on the Azure Policy GitHub repo.",2024-08-14T17:44:00.000Z,reference,security,0.75,True,Lists specific Azure Policy definitions for Cosmos DB with names and enforcement scopes; these are product-specific security/compliance configurations.,unchanged
+the link in the Version column to view the source on the Azure Policy GitHub repo.",2025-12-19T18:19:00.000Z,reference,security,0.7,True,"Lists specific built-in Azure Policy definitions for Azure Cosmos DB, including exact policy names and links to their definitions. 
These are product-specific security and governance configurations (policy definitions and effects) that an LLM is unlikely to know exhaustively from training, fitting the security sub-skill focused on RBAC/policy and configuration-level security controls.",updated https://learn.microsoft.com/en-us/azure/cosmos-db/postgresql/,PostgreSQL,Azure Cosmos DB for PostgreSQL documentation - Azure Cosmos DB for PostgreSQL,,Azure Cosmos DB for PostgreSQL is a managed service for PostgreSQL extended with the Citus open source superpower of *distributed tables*.,Azure Cosmos DB for PostgreSQL is a managed service for PostgreSQL extended with the Citus open source superpower of *distributed tables*.,2025-10-30T05:05:00Z,landing-page,,0.2,False,Landing/overview page for Cosmos DB for PostgreSQL; high-level description without detailed configuration or limits.,unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/postgresql/concepts-authentication,Microsoft Entra ID and Postgres authentication,PostgreSQL and Microsoft Entra ID authentication - Azure Cosmos DB for PostgreSQL,Configure PostgreSQL and Entra ID authentication,Learn about the concepts of native PostgreSQL and Microsoft Entra ID authentication with Azure Cosmos DB for PostgreSQL,"Important Azure Cosmos DB for PostgreSQL is on a retirement path and no longer recommended for new projects. Instead, use one of these two services: For PostgreSQL workloads: use the Elastic Clusters feature of Azure Database For PostgreSQL to use the horizontal scale-out and distributed PostgreSQL features contained within the open source Citus extension. 
For NoSQL workloads, use Azure Cosmos DB for NoSQL for a distributed database solution that includes a 99.999% availability service level agreement (",2025-12-15T18:19:00.000Z,concept-article,security,0.8,True,"Covers concrete authentication modes, mappings between Entra ID and PostgreSQL roles, and service-specific auth behaviors and parameters, which are detailed security configurations.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/postgresql/concepts-availability-zones,Availability zones,Availability zone (AZ) outage resiliency – Azure Cosmos DB for PostgreSQL - Azure Cosmos DB for PostgreSQL,Design for AZ outage resiliency in Cosmos DB PostgreSQL,Disaster recovery using Azure availability zones (AZ) concepts,"Important Azure Cosmos DB for PostgreSQL is on a retirement path and no longer recommended for new projects. Instead, use one of these two services: For PostgreSQL workloads: use the Elastic Clusters feature of Azure Database For PostgreSQL to use the horizontal scale-out and distributed PostgreSQL features contained within the open source Citus extension. 
For NoSQL workloads, use Azure Cosmos DB for NoSQL for a distributed database solution that includes a 99.999% availability service level agreement (",2025-12-15T18:19:00.000Z,concept-article,architecture-patterns,0.65,True,"Covers how availability zones are used by this service and how to architect for zone failures, including product-specific deployment patterns and constraints.",unchanged @@ -535,10 +533,10 @@ https://learn.microsoft.com/en-us/azure/cosmos-db/quickstart-template-bicep,Use https://learn.microsoft.com/en-us/azure/cosmos-db/quickstart-template-json,Use an Azure Resource Manager template,Quickstart - Create a Database and Container Using Azure Resource Manager Template - Azure Cosmos DB,Provision Cosmos DB database and container with ARM templates,Quickstart showing how to create an Azure Cosmos DB database and a container by using an Azure Resource Manager template,"Azure Cosmos DB is Microsoft’s fast NoSQL database with open APIs for any scale. You can use Azure Cosmos DB to quickly create and query key/value databases, document databases, and graph databases. Without a credit card or an Azure subscription, you can set up a free Try Azure Cosmos DB account. This quickstart focuses on the process of deploying an Azure Resource Manager template (ARM template) to create an Azure Cosmos DB database and a container within that database. You can later store data ",2025-12-19T18:19:00.000Z,quickstart,configuration,0.65,True,"ARM template quickstart exposes Cosmos DB-specific resource types, properties, and settings, which are explicit configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/quickstart-terraform,Use a Terraform template,Quickstart - Create a Database and Container Using Terraform - Azure Cosmos DB,Provision Cosmos DB database and container using Terraform,Quickstart showing how to create an Azure Cosmos DB database and a container using Terraform,"Azure Cosmos DB is Microsoft’s fast NoSQL database with open APIs for any scale. 
You can use Azure Cosmos DB to quickly create and query key/value databases, document databases, and graph databases. Without a credit card or an Azure subscription, you can set up a free Try Azure Cosmos DB account. This quickstart focuses on the process of deploying via Terraform to create an Azure Cosmos database and a container within that database. You can later store data in this container.",2026-01-20T18:11:00.000Z,quickstart,configuration,0.6,True,"Terraform quickstart uses Cosmos DB-specific resource blocks and arguments, which are concrete configuration options for this service.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/quickstart-vector-store-dotnet,Query a vector store with a .NET app,Quickstart - Azure Cosmos DB vector search with .NET - Azure Cosmos DB,,Learn how to use vector search in Azure Cosmos DB with .NET. Store and query vector data efficiently in your applications.,"Learn to use vector search in Azure Cosmos DB to store and query vector data efficiently. This quickstart provides a guided tour of key vector search techniques using a .NET sample app on GitHub. The app uses a sample hotel dataset in a JSON file with calculated vectors from the text-embedding-3-small model. The hotel data includes hotel names, locations, descriptions, and vector embeddings.",2026-03-12T06:17:00.000Z,quickstart-sdk,,0.2,False,"Quickstart/tutorial for using vector search with .NET; based on the description it focuses on example usage and a sample app, without indicating specific limits, configuration tables, error-code mappings, or other product-specific expert details.",unchanged
-https://learn.microsoft.com/en-us/azure/cosmos-db/quickstart-vector-store-go,Query a vector store with a Go app,Quickstart - Azure Cosmos DB vector search with Go - Azure Cosmos DB,Implement Azure Cosmos DB vector search with Go,Use this quickstart to implement vector search in Azure Cosmos DB with Go. 
Store and query hotel data with embeddings.,"Use vector search in Azure Cosmos DB with the Go client library. Store and query vector data efficiently in your applications. This quickstart uses a sample hotel dataset in a JSON file with vectors from the text-embedding-3-small model. The dataset includes hotel names, locations, descriptions, and vector embeddings. Find the sample code with resource provisioning on GitHub.",2026-04-14T11:03:00.000Z,quickstart-sdk,integrations,0.65,True,"Quickstart includes concrete Go SDK usage and product-specific parameters for vector search (indexing, query options, embedding handling). This is code-focused integration with Azure Cosmos DB’s vector store rather than generic Go or vector concepts.",new
-https://learn.microsoft.com/en-us/azure/cosmos-db/quickstart-vector-store-java,Query a vector store with a Java app,Quickstart - Azure Cosmos DB vector search with Java - Azure Cosmos DB,Implement Azure Cosmos DB vector search with Java,Use this quickstart to implement vector search in Azure Cosmos DB with Java. Store and query hotel data with embeddings.,"Use vector search in Azure Cosmos DB with the Java client library. Store and query vector data efficiently in your applications. This quickstart uses a sample hotel dataset in a JSON file with vectors from the text-embedding-3-small model. The dataset includes hotel names, locations, descriptions, and vector embeddings. Find the sample code with resource provisioning on GitHub.",2026-04-14T11:03:00.000Z,quickstart-sdk,integrations,0.65,True,"Quickstart demonstrates Java SDK calls and configuration for Cosmos DB vector search, including how to persist and query vectors. 
These are concrete integration patterns tied to this service.",new
+https://learn.microsoft.com/en-us/azure/cosmos-db/quickstart-vector-store-go,Query a vector store with a Go app,Quickstart - Azure Cosmos DB vector search with Go - Azure Cosmos DB,Implement Azure Cosmos DB vector search with Go,Use this quickstart to implement vector search in Azure Cosmos DB with Go. Store and query hotel data with embeddings.,"Use vector search in Azure Cosmos DB with the Go client library. Store and query vector data efficiently in your applications. This quickstart uses a sample hotel dataset in a JSON file with vectors from the text-embedding-3-small model. The dataset includes hotel names, locations, descriptions, and vector embeddings. Find the sample code with resource provisioning on GitHub.",2026-04-14T11:03:00.000Z,quickstart-sdk,integrations,0.65,True,"Quickstart includes concrete Go SDK usage and product-specific parameters for vector search (indexing, query options, embedding handling). This is code-focused integration with Azure Cosmos DB’s vector store rather than generic Go or vector concepts.",unchanged
+https://learn.microsoft.com/en-us/azure/cosmos-db/quickstart-vector-store-java,Query a vector store with a Java app,Quickstart - Azure Cosmos DB vector search with Java - Azure Cosmos DB,Implement Azure Cosmos DB vector search with Java,Use this quickstart to implement vector search in Azure Cosmos DB with Java. Store and query hotel data with embeddings.,"Use vector search in Azure Cosmos DB with the Java client library. Store and query vector data efficiently in your applications. This quickstart uses a sample hotel dataset in a JSON file with vectors from the text-embedding-3-small model. The dataset includes hotel names, locations, descriptions, and vector embeddings. 
Find the sample code with resource provisioning on GitHub.",2026-04-14T11:03:00.000Z,quickstart-sdk,integrations,0.65,True,"Quickstart demonstrates Java SDK calls and configuration for Cosmos DB vector search, including how to persist and query vectors. These are concrete integration patterns tied to this service.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/quickstart-vector-store-nodejs,Query a vector store with a Node.js app,Quickstart - Vector Search with Node.js - Azure Cosmos DB,Implement vector search in Cosmos DB with Node.js,Use this quickstart to implement vector search in Azure Cosmos DB with Node.js. Store and query hotel data with embeddings using the text-embedding-3-small model.,"Use vector search in Azure Cosmos DB with the Node.js client library. Store and query vector data efficiently in your applications. This quickstart uses a sample hotel dataset in a JSON file with vectors from the text-embedding-3-small model. The dataset includes hotel names, locations, descriptions, and vector embeddings. Find the sample code with resource provisioning on GitHub.",2026-02-12T18:33:00.000Z,quickstart-sdk,integrations,0.7,True,"Vector search quickstart with Node.js uses Cosmos DB’s vector APIs and parameters, providing concrete integration patterns for vector operations.",unchanged
-https://learn.microsoft.com/en-us/azure/cosmos-db/quickstart-vector-store-python,Query a vector store with a Python app,Quickstart - Vector Search with Python - Azure Cosmos DB,Implement Azure Cosmos DB vector search with Python,Use this quickstart to implement vector search in Azure Cosmos DB with Python using the text-embedding-3-small model. Store and query hotel data with embeddings.,"Use vector search in Azure Cosmos DB with the Python client library. Store and query vector data efficiently in your applications. This quickstart uses a sample hotel dataset in a JSON file with vectors from the text-embedding-3-small model. 
The dataset includes hotel names, locations, descriptions, and vector embeddings. Find the sample code with resource provisioning on GitHub.",2026-04-13T22:09:00.000Z,quickstart-sdk,integrations,0.65,True,"Quickstart shows Python client library patterns and parameters specific to Cosmos DB vector search, including how to store/query embeddings. This is product-specific integration code, not just conceptual guidance.",new
+https://learn.microsoft.com/en-us/azure/cosmos-db/quickstart-vector-store-python,Query a vector store with a Python app,Quickstart - Vector Search with Python - Azure Cosmos DB,Implement Azure Cosmos DB vector search with Python,Use this quickstart to implement vector search in Azure Cosmos DB with Python using the text-embedding-3-small model. Store and query hotel data with embeddings.,"Use vector search in Azure Cosmos DB with the Python client library. Store and query vector data efficiently in your applications. This quickstart uses a sample hotel dataset in a JSON file with vectors from the text-embedding-3-small model. The dataset includes hotel names, locations, descriptions, and vector embeddings. Find the sample code with resource provisioning on GitHub.",2026-04-13T22:09:00.000Z,quickstart-sdk,integrations,0.65,True,"Quickstart shows Python client library patterns and parameters specific to Cosmos DB vector search, including how to store/query embeddings. This is product-specific integration code, not just conceptual guidance.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/read-change-feed,Reading change feed,Read Change Feed - Azure Cosmos DB,Use push and pull models to read Cosmos DB change feed,Learn how to work with the change feed in Azure Cosmos DB by using either a push model or a pull model.,"You can work with the Azure Cosmos DB change feed by using either a push model or a pull model. With a push model, the change feed processor pushes work to a client that has business logic for processing this work. 
However, the complexity in checking for work and storing state for the last processed work is handled within the change feed processor. With a pull model, the client has to pull the work from the server. The client, in this case, not only has business logic for processing work but also fo",2025-12-19T18:19:00.000Z,how-to,integrations,0.7,True,Describes push vs pull consumption with processor vs manual state management; includes SDK usage patterns and configuration unique to Cosmos DB change feed.,unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/reference-data-plane-security,Data plane actions,Data plane security reference - Azure Cosmos DB,Reference for Cosmos DB data plane RBAC roles,Learn about data plane actions and built-in roles for role-based access control in Azure Cosmos DB for NoSQL. See which permissions are available and how to use them.,Azure Cosmos DB for NoSQL exposes a unique set of data actions and roles within its native role-based access control implementation. This article includes a list of those actions and roles with descriptions on what permissions are granted for each resource. Warning Azure Cosmos DB for NoSQL's native role-based access control doesn't support the notDataActions property. Any action that isn't specified as an allowed dataAction is excluded automatically.,2025-12-19T18:19:00.000Z,reference,security,0.9,True,"Explicit reference of data actions and built-in roles; contains detailed role names, permissions, and constraints, which are core security configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/reference-data-plane-security,Data plane built-in roles,Data plane security reference - Azure Cosmos DB,Reference for Cosmos DB data plane RBAC roles,Learn about data plane actions and built-in roles for role-based access control in Azure Cosmos DB for NoSQL. 
See which permissions are available and how to use them.,Azure Cosmos DB for NoSQL exposes a unique set of data actions and roles within its native role-based access control implementation. This article includes a list of those actions and roles with descriptions on what permissions are granted for each resource. Warning Azure Cosmos DB for NoSQL's native role-based access control doesn't support the notDataActions property. Any action that isn't specified as an allowed dataAction is excluded automatically.,2025-12-19T18:19:00.000Z,reference,security,0.9,True,"Duplicate of index 26; same detailed RBAC data actions and built-in roles, which are expert security configuration details.",unchanged @@ -581,7 +579,7 @@ https://learn.microsoft.com/en-us/azure/cosmos-db/sdk-java-v4,Java SDK v4,Java S https://learn.microsoft.com/en-us/azure/cosmos-db/sdk-nodejs,Node.js,"SQL Node.Js API, SDK and Resources - Azure Cosmos DB",,"Learn all about the SQL Node.js API and SDK including release dates, retirement dates, and changes made between each version of the Azure Cosmos DB Node.js SDK.",,2025-12-19T18:19:00.000Z,reference,,0.3,False,"Node.js SDK release notes/resources; index-style page with versions and dates, not deep config or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/sdk-observability,SDK observability,SDK Observability - Azure Cosmos DB,Configure Cosmos DB SDK observability with OpenTelemetry,Learn how to monitor requests made using the Azure Cosmos DB SDKs with OpenTelemetry and Application Insights.,"The Azure Cosmos DB .NET and Java SDKs support distributed tracing to help you monitor your applications. Tracing the flow of requests is helpful in debugging, analyzing latency and performance, and gathering diagnostics. 
Instrument tracing for your applications using OpenTelemetry, which is vendor-neutral and has a set of semantic conventions to ensure a standardized data format regardless of your chosen exporter, or use the Application Insights SDK or Azure Monitor OpenTelemetry Distro.",2025-12-19T18:19:00.000Z,how-to,configuration,0.7,True,"Observability docs include specific tracing configuration (exporter settings, resource attributes, instrumentation options) and SDK parameters for telemetry, which are concrete configuration details.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/sdk-python,Python,"SQL Python API, SDK and Resources - Azure Cosmos DB",,"Learn all about the SQL Python API and SDK including release dates, retirement dates, and changes made between each version of the Azure Cosmos DB Python SDK.",Important,2026-02-03T23:09:00.000Z,reference,,0.3,False,Python SDK release notes/resources; summary is minimal and marked 'Important' but not clearly exposing detailed config or troubleshooting mappings.,unchanged
-https://learn.microsoft.com/en-us/azure/cosmos-db/security,Security overview,Secure your account - Azure Cosmos DB,,Review the fundamentals of securing Azure Cosmos DB for NoSQL from the perspective of data and networking security.,"Azure Cosmos DB for NoSQL is a globally distributed, multi-model database service designed for mission-critical applications. While Azure Cosmos DB provides built-in security features to protect your data, it's essential to follow best practices to further enhance the security of your account, data, and networking configurations. 
This article provides guidance on how to best secure your Azure Cosmos DB for NoSQL deployment.",2025-12-19T18:19:00.000Z,best-practice,,0.4,False,"High-level security fundamentals and shared-responsibility overview; likely conceptual guidance without concrete RBAC roles, config parameters, or numeric constraints.",unchanged +https://learn.microsoft.com/en-us/azure/cosmos-db/security,Security overview,Secure your account - Azure Cosmos DB,Secure Azure Cosmos DB for NoSQL accounts,Review the fundamentals of securing Azure Cosmos DB for NoSQL from the perspective of data and networking security.,"Azure Cosmos DB for NoSQL is a globally distributed, multi-model database service designed for mission-critical applications. While Azure Cosmos DB provides built-in security features to protect your data, it's essential to follow best practices to further enhance the security of your account, data, and networking configurations. This article provides guidance on how to best secure your Azure Cosmos DB for NoSQL deployment.",2026-04-21T17:11:00.000Z,best-practice,security,0.7,True,"Security-focused article for Cosmos DB that goes beyond concepts and provides concrete, product-specific guidance on securing accounts, data, and networking; likely includes specific features (e.g., RBAC roles, network restrictions, keys) and their recommended configurations.",updated https://learn.microsoft.com/en-us/azure/cosmos-db/security-considerations,Considerations and guidance,Security considerations - Azure Cosmos DB,,Review considerations and best practices for securing your Azure Cosmos DB account including features that Microsoft implements for your account's security.,"Data security is a shared responsibility between you, the customer, and your database provider. Depending on the database provider you choose, the amount of responsibility you carry can vary. 
If you choose an on-premises solution, you need to provide everything from endpoint protection to physical security of your hardware, which is no easy task. If you choose a platform as a service (PaaS) cloud database provider, such as Azure Cosmos DB, your area of concern shrinks considerably. For more info",2024-10-03T11:07:00.000Z,concept-article,,0.4,False,General security considerations and shared responsibility; likely conceptual best practices without detailed product-specific parameters or roles.,unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/security-controls-policy,Security controls by Azure Policy,Azure Policy Regulatory Compliance Controls - Azure Cosmos DB,Review Cosmos DB Azure Policy compliance controls,Lists Azure Policy Regulatory Compliance controls available for Azure Cosmos DB. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources.,"Regulatory Compliance in Azure Policy provides Microsoft created and managed initiative definitions, known as built-ins, for the compliance domains and security controls related to different compliance standards. This page lists the compliance domains and security controls for Azure Cosmos DB. You can assign
@@ -673,11 +671,13 @@ https://learn.microsoft.com/en-us/azure/cosmos-db/tutorial-query,Query data,"Tut https://learn.microsoft.com/en-us/azure/cosmos-db/tutorial-setup-ci-cd,Set up CI/CD with Azure Pipelines,Set up CI/CD Pipeline With Emulator Build Task - Azure Cosmos DB,Set up Azure DevOps CI/CD with Cosmos DB emulator,Tutorial on how to set up build and release workflow in Azure DevOps using the Azure Cosmos DB emulator build task,"Note Due to the full removal of Windows 2016 hosted runners on April 1st, 2022, this method of using the Azure Cosmos DB emulator with build task in Azure DevOps is no longer supported. We are actively working on alternative solutions. 
Meanwhile, you can follow the below instructions to leverage the Azure Cosmos DB emulator which comes pre-installed when using the ""windows-2019"" agent type. The Azure Cosmos DB Emulator provides a local environment that emulates the Azure Cosmos DB service for de",2025-12-19T18:19:00.000Z,how-to,deployment,0.66,True,"CI/CD tutorial with emulator build task likely includes agent type requirements, supported runners, and product-specific constraints for using the emulator in pipelines, which are deployment-specific details.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/tutorial-spark-connector,Spark 3.x online transaction processing (OLTP) connector,Tutorial - Connect Using Spark - Azure Cosmos DB,,Connect to Azure Cosmos DB for NoSQL by using the Spark 3 OLTP connector. Use the connector to query data in your API for a NoSQL account.,"In this tutorial, you use the Azure Cosmos DB Spark connector to read or write data from an Azure Cosmos DB for NoSQL account. This tutorial uses Azure Databricks and a Jupyter notebook to illustrate how to integrate with the API for NoSQL from Spark. This tutorial focuses on Python and Scala, although you can use any language or interface supported by Spark. In this tutorial, you learn how to:",2025-12-19T18:19:00.000Z,tutorial,,0.4,False,Tutorial for Spark connector usage; likely step-by-step without comprehensive parameter tables or product-specific limits; more how-to than deep configuration or best-practices.,unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/tutorial-springboot-azure-kubernetes-service,Integrate with Azure Kubernetes Service and Spring Boot,Spring Boot Application with Azure Kubernetes Service - Azure Cosmos DB,,Learn how to deploy a Spring Boot application to Azure Kubernetes Service and perform CRUD operations on data in Azure Cosmos DB for NoSQL.,"Note For Spring Boot applications, we recommend using Azure Spring Apps. 
However, you can still use Azure Kubernetes Service as a destination. SeeJava Workload Destination Guidancefor advice. In this tutorial, you set up and deploy a Spring Boot application that exposes REST APIs to perform CRUD operations on data in Azure Cosmos DB (API for NoSQL account). You package the application as Docker image, push it to Azure Container Registry, deploy to Azure Kubernetes Service and test the applicatio",2025-12-19T18:19:00.000Z,how-to,,0.4,False,"Tutorial for deploying a Spring Boot app to AKS with Cosmos DB; primarily step-by-step guidance without configuration matrices, limits, or deep troubleshooting content.",unchanged +https://learn.microsoft.com/en-us/azure/cosmos-db/understand-request-unit-consumption,Understanding request unit consumption,Understanding Request Units Consumption - Azure Cosmos DB,Estimate and interpret Azure Cosmos DB RU consumption,This article explains how request units are consumed in Azure Cosmos DB with some examples.,"Azure Cosmos DB uses request units (RUs) as a normalized measure of the resources required to execute database operations. Instead of managing provisioning of resources such as CPU, memory, and I/O independently, RUs provide a simple and consistent way to understand how different operations consume resources. Each operation consumes Request Units reflecting the work performed by the service to execute the request with a focus on trying to ensure predictability. +This article explains what influen",2026-04-21T17:11:00.000Z,concept-article,limits-quotas,0.7,True,"A detailed RU consumption article for Cosmos DB typically includes concrete, product-specific RU cost examples (for example, RU charges for point reads vs. queries, item size impacts, indexing effects). 
These are effectively numeric resource limits/constraints unique to the service and not generally knowable from training data, fitting the limits-quotas category better than conceptual guidance.",new https://learn.microsoft.com/en-us/azure/cosmos-db/unique-keys,Unique key constraints,Use Unique Keys - Azure Cosmos DB,Define and use unique key policies in Cosmos DB,Learn how to define and use unique keys for an Azure Cosmos DB database. This article also describes how unique keys add a layer of data integrity.,"Unique keys add a layer of data integrity to an Azure Cosmos DB container. You create a unique key policy when you create an Azure Cosmos DB container. With unique keys, you make sure that one or more values within a logical partition is unique. You also can guarantee uniqueness per partition key. After you create a container with a unique key policy, the creation of a new item or an update of an existing item that results in a duplicate within a logical partition is prevented, as specified by th",2025-07-25T08:00:00.000Z,concept-article,configuration,0.65,True,Describes how unique key policies work per logical partition/partition key and their constraints at container creation—specific configuration and behavior details.,unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/use-cases,Common use cases,Common Use Cases and Scenarios - Azure Cosmos DB,,"Learn about the top five use cases for Azure Cosmos DB: user generated content, event logging, catalog data, user preferences data, and Internet of Things (IoT).","This article provides an overview of several common use cases for Azure Cosmos DB. The recommendations in this article serve as a starting point as you develop your application with Azure Cosmos DB. 
After reading this article, you'll be able to answer the following questions:",2025-05-30T08:00:00.000Z,solution-overview,,0.2,False,"Overview of common use cases; mostly conceptual guidance without specific numeric thresholds, configuration parameters, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/use-metrics,Monitor and debug using metrics,Monitor and Debug With Insights - Azure Cosmos DB,Use Cosmos DB metrics and insights to debug issues,Learn how to use metrics and insights in Azure Cosmos DB to debug common issues and monitor the database.,"Azure Cosmos DB provides insights for throughput, storage, consistency, availability, and latency. The Azure portal provides an aggregated view of these metrics. You can also view Azure Cosmos DB metrics from Azure Monitor API. The dimension values for the metrics such as container name are case-insensitive. Therefore, you need to use case-insensitive comparison when doing string comparisons on these dimension values. To learn how to view metrics from Azure Monitor, see Monitor Azure Cosmos DB. T",2024-11-11T23:01:00.000Z,how-to,troubleshooting,0.65,True,"Shows how to use specific metrics (throughput, storage, consistency, availability, latency) and notes case-insensitive dimension behavior, which is a product-specific gotcha useful for troubleshooting queries and monitoring.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/vector-database,Vector databases,Integrated Vector Database - Azure Cosmos DB,Use Cosmos DB as an integrated vector database,Review how to use Azure Cosmos DB as a vector database in numerous domains and situations across analytical and generative AI.,"Tip For the latest vector database and RAG pattern app samples, visit Azure Cosmos DB Samples Gallery. 
Vector databases are used in numerous domains and situations across analytical and generative AI, including natural language processing, video and image recognition, recommendation system, and search, among others. In 2023, a notable trend in software was the integration of AI enhancements, often achieved by incorporating specialized standalone vector databases into existing tech stacks. This ar",2025-10-18T05:04:00.000Z,concept-article,architecture-patterns,0.6,True,Discusses using Cosmos DB as a vector database across domains and RAG patterns; likely includes product-specific architectural guidance for when and how to use it.,unchanged -https://learn.microsoft.com/en-us/azure/cosmos-db/vector-search,Vector indexing and search,Integrated Vector Store - Azure Cosmos DB,,Enhance AI-based applications using the integrated vector store functionality in Azure Cosmos DB for NoSQL,"Azure Cosmos DB for NoSQL now offers efficient vector indexing and search. This feature is designed to handle multi-modal, high-dimensional vectors, enabling efficient and accurate vector search at any scale. You can now store vectors directly in the documents alongside your data. Each document in your database can contain not only traditional schema-free data, but also multi-modal high-dimensional vectors as other properties of the documents. This colocation of data and vectors allows for effic",2026-04-15T22:09:00.000Z,concept-article,,0.2,False,"Page appears to be a conceptual/feature overview of integrated vector store in Azure Cosmos DB, describing capabilities and benefits without detailed limits, configuration tables, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/cosmos-db/vector-search,Vector indexing and search,Integrated Vector Store - Azure Cosmos DB,,Enhance AI-based applications using the integrated vector store functionality in Azure Cosmos DB for NoSQL,"Azure Cosmos DB for NoSQL now offers efficient vector indexing and search. 
This feature is designed to handle multi-modal, high-dimensional vectors, enabling efficient and accurate vector search at any scale. You can now store vectors directly in the documents alongside your data. Each document in your database can contain not only traditional schema-free data, but also multi-modal high-dimensional vectors as other properties of the documents. This colocation of data and vectors allows for effic",2026-04-15T22:09:00.000Z,concept-article,,0.2,False,"Page appears to be a conceptual/feature overview of integrated vector store in Azure Cosmos DB, describing capabilities and benefits without detailed limits, configuration tables, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/vercel-integration,Vercel,Vercel Integration - Azure Cosmos DB,Connect Vercel applications to Azure Cosmos DB,Integrate web applications using the Vercel platform with Azure Cosmos DB for NOSQL or MongoDB as a data source.,Vercel offers a user-friendly and robust platform for web application development and deployment. This new integration improves productivity as developers can now easily create Vercel applications with a backend database already configured. This integration helps developers transform their creative ideas into reality in real-time.,2025-09-03T17:13:00.000Z,how-to,integrations,0.7,True,"Vercel integration doc will contain configuration steps, environment variables, and connection parameters specific to wiring Vercel projects to Cosmos DB, matching integrations.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/visual-studio-code-extension,Use Visual Studio Code extension,Use Visual Studio Code to Connect and Manage Resources - Azure Cosmos DB,,Learn how to connect to Azure Cosmos DB for NoSQL by using Visual Studio Code.,"Visual Studio Code is a versatile code editor for Linux, macOS, and Windows, supporting numerous extensions. 
This quickstart shows you how to connect to Azure Cosmos DB for NoSQL and Azure Cosmos DB for MongoDB using Visual Studio Code. It covers performing core database operations, including querying, inserting, updating, and deleting data.",2025-11-18T18:29:00.000Z,how-to,,0.3,False,"Quickstart for connecting via VS Code and doing CRUD; primarily tutorial content without detailed configuration parameters, limits, or troubleshooting matrices.",unchanged https://learn.microsoft.com/en-us/azure/cosmos-db/whitepapers,Whitepapers,Conceptual whitepapers - Azure Cosmos DB,,"This list of conceptual whitepapers describes various Azure Cosmos DB service, development, and data concepts in depth.",Whitepapers allow you to explore Azure Cosmos DB concepts at a deeper level. This article provides you with a list of available whitepapers for Azure Cosmos DB.,2025-12-19T18:19:00.000Z,concept-article,,0.1,False,"This is a navigation list of whitepapers, not the whitepapers themselves; it does not contain detailed technical content or parameters.",unchanged @@ -704,9 +704,9 @@ https://learn.microsoft.com/en-us/azure/documentdb/compare-mongodb-atlas,Compare https://learn.microsoft.com/en-us/azure/documentdb/compatibility-features,MongoDB feature compatibility,MongoDB Feature Compatibility - Azure DocumentDB,Review MongoDB feature compatibility limits in DocumentDB,"Discover MongoDB feature compatibility in Azure DocumentDB. Learn supported aggregation stages, commands, and features. Optimize your workloads today.","Azure DocumentDB fully implements the MongoDB wire protocol for feature compatibility, allowing you to run nearly all MongoDB workloads without any application changes. This native Azure service offers optimized performance, lower total cost of ownership (TCO), and built-in AI capabilities, empowering modern, data-driven applications with ease. The tables in this article outline MongoDB features that are unsupported or limited in Azure DocumentDB. 
As a fully managed PaaS solution, Azure Document",2025-11-18T18:29:00.000Z,concept-article,limits-quotas,0.7,True,"Compatibility article includes tables of unsupported/limited MongoDB features in Azure DocumentDB, effectively defining product-specific functional limits and constraints.",unchanged https://learn.microsoft.com/en-us/azure/documentdb/compatibility-query-language,MongoDB query language compatibility,MongoDB Query Language (MQL) Compatibility - Azure DocumentDB,Check MQL compatibility across MongoDB versions in DocumentDB,"Learn about MongoDB Query Language (MQL) compatibility in Azure DocumentDB, including supported operators, commands, and features across versions 5.0-8.0.","Azure DocumentDB provides comprehensive MongoDB Query Language (MQL) compatibility, combining MongoDB's familiar features with Azure's enterprise capabilities. This article provides a version-wise overview of MQL compatibility and feature support across versions 5.0-8.0, including operators, commands, indexes, and the MongoDB wire protocol. Applications can run without code changes, using the same client drivers, SDKs, and tools. Users benefit from Azure's scalability, security, and integration ",2025-11-18T18:29:00.000Z,concept-article,limits-quotas,0.7,True,"Provides version-wise tables of supported/unsupported MQL operators and commands for MongoDB 5.0–8.0, which are detailed capability limits per version.",unchanged https://learn.microsoft.com/en-us/azure/documentdb/compute-storage,Burstable tier,Compute and Storage Configurations - Azure DocumentDB,Understand compute and storage sizing for Azure DocumentDB clusters,Supported compute and storage configurations for Azure DocumentDB clusters,"Azure DocumentDB compute resources are provided as vCores, which represent the logical CPU of the underlying hardware. The storage size for provisioning refers to the capacity available to the shards in your cluster. 
The storage is used for database files, temporary files, transaction logs, and the -database server logs. You can select the compute and storage settings independently. The selected compute and storage values apply to each shard in the cluster.",2026-04-17T22:09:00.000Z,limits-and-quotas,limits-quotas,0.65,True,"Duplicate of index 2 with the same URL and summary. For the same reasons, it likely contains tables of supported compute and storage configurations with numeric limits, fitting the limits-quotas category.",updated +database server logs. You can select the compute and storage settings independently. The selected compute and storage values apply to each shard in the cluster.",2026-04-17T22:09:00.000Z,limits-and-quotas,limits-quotas,0.65,True,"Duplicate of index 2 with the same URL and summary. For the same reasons, it likely contains tables of supported compute and storage configurations with numeric limits, fitting the limits-quotas category.",unchanged https://learn.microsoft.com/en-us/azure/documentdb/compute-storage,Compute and storage,Compute and Storage Configurations - Azure DocumentDB,Understand compute and storage sizing for Azure DocumentDB clusters,Supported compute and storage configurations for Azure DocumentDB clusters,"Azure DocumentDB compute resources are provided as vCores, which represent the logical CPU of the underlying hardware. The storage size for provisioning refers to the capacity available to the shards in your cluster. The storage is used for database files, temporary files, transaction logs, and the -database server logs. You can select the compute and storage settings independently. The selected compute and storage values apply to each shard in the cluster.",2026-04-17T22:09:00.000Z,limits-and-quotas,limits-quotas,0.65,True,"Page covers supported compute and storage configurations, with vCores and storage capacity per shard. 
Such configuration pages typically include tables of supported sizes, ranges, and possibly per-shard limits, which are product-specific numeric constraints that qualify as limits-quotas.",updated +database server logs. You can select the compute and storage settings independently. The selected compute and storage values apply to each shard in the cluster.",2026-04-17T22:09:00.000Z,limits-and-quotas,limits-quotas,0.65,True,"Page covers supported compute and storage configurations, with vCores and storage capacity per shard. Such configuration pages typically include tables of supported sizes, ranges, and possibly per-shard limits, which are product-specific numeric constraints that qualify as limits-quotas.",unchanged https://learn.microsoft.com/en-us/azure/documentdb/cross-region-replication,Cross-region replication,Cross-Region Replication Best Practices - Disaster Recovery - Azure DocumentDB,Apply cross-region replication and DR best practices in Azure DocumentDB,Learn how to configure Azure DocumentDB cross-region replication for disaster recovery and read scalability. Includes best practices for replica cluster promotion and connection strings.,This article discusses cross-region disaster recovery (DR) for Azure DocumentDB. It also covers read capabilities of the replica clusters in the same or other Azure regions for read operations scalability. The replication feature allows you to replicate data from a cluster to a read-only cluster in another or the same Azure region. Replicas are updated with asynchronous replication technology. You can have one cluster replica in another region of choice for the primary Azure DocumentDB cluster. 
,2025-11-18T18:29:00.000Z,best-practice,best-practices,0.8,True,"Explicit best-practices article for DR and replication, likely with concrete recommendations (replica counts, promotion patterns, connection string usage) specific to DocumentDB.",unchanged https://learn.microsoft.com/en-us/azure/documentdb/data-api,Data API,Data API for Azure DocumentDB - Azure DocumentDB,Use the Azure DocumentDB Data API over HTTPS,Explains how to interact with your data over HTTPS with simple RESTful endpoints.,"The Data API for Azure DocumentDB is an https interface that allows developers to access and interact with their data without needing a database driver. It simplifies data operations by enabling control plane and aggregation operations through HTTP requests. This API is ideal for web applications, providing secure and scalable access to databases.",2025-11-18T18:29:00.000Z,how-to,integrations,0.7,True,"Describes RESTful HTTPS interface for data/control operations; likely includes endpoint formats, parameters, and constraints unique to this API.",unchanged https://learn.microsoft.com/en-us/azure/documentdb/database-encryption-at-rest,Data encryption,Encryption at rest in Azure DocumentDB - Azure DocumentDB,,Learn about encryption of data in Azure DocumentDB databases using service-managed and customer-managed encryption keys.,"All the data managed by an Azure DocumentDB is always encrypted at rest. That data includes all system and user databases, temporary files, logs, and backups.",2025-11-18T18:29:00.000Z,concept-article,,0.4,False,"Conceptual overview of encryption at rest; no clear indication of specific configuration parameters, roles, or error codes.",unchanged @@ -721,7 +721,7 @@ of every shard in a cluster. If a shard becomes unresponsive for any reason, Azu switches incoming connections from the failed shard to its standby. When failover happens, promoted shards always have fresh data through synchronous replication. 
All primary shards in a cluster are provisioned into one availability zone (AZ) for better latency between the shards. The standby shards are provisioned i",2025-11-18T18:29:00.000Z,concept-article,architecture-patterns,0.65,True,"Explains HA internals such as standby replicas, synchronous replication, and AZ placement; product-specific HA architecture guidance.",unchanged https://learn.microsoft.com/en-us/azure/documentdb/high-availability-replication-best-practices,Best practices for high availability and disaster recovery,High availability (HA) and cross-region replication best practices - Azure DocumentDB,Implement HA and cross-region replication best practices in DocumentDB,Learn about best practices for high availability (HA) and cross-region replication in Azure DocumentDB.,"Ensuring high availability and enabling cross-region replication are essential for mission-critical applications using Azure DocumentDB. This document outlines best practices for configuring and managing high availability (HA) and cross-region replication. Follow guidance in this document to achieve optimal performance, resilience, and disaster recovery capabilities in Azure DocumentDB.",2025-11-18T18:29:00.000Z,concept-article,best-practices,0.8,True,Best-practices guidance for configuring HA and replication with product-specific recommendations and gotchas.,unchanged
-https://learn.microsoft.com/en-us/azure/documentdb/high-performance-storage,High performance storage (Premium SSD v2),Premium SSD v2 Disks - High Performance Storage - Azure DocumentDB,Configure Premium SSD v2 performance limits for Azure DocumentDB,Learn how to use Premium SSD v2 high performance storage in Azure DocumentDB for higher IOPS and bandwidth.,"Azure DocumentDB uses Premium SSD v2 disks to deliver significantly higher performance for I/O-intensive workloads by de-coupling storage capacity from IOPS and bandwidth settings. 
With Premium SSD v2 storage on Azure DocumentDB, the maximum configurable IOPS and bandwidth settings are available by default regardless of the storage capacity configured for the cluster. The IOPS and bandwidth capacity of the Compute tier determines the achievable IOPS and bandwidth in the storage layer without the n",2026-04-17T22:09:00.000Z,feature-guide,limits-quotas,0.7,True,"Page is about Premium SSD v2 high-performance storage and explicitly mentions maximum configurable IOPS and bandwidth settings being available by default and determined by the compute tier. This implies product-specific numeric IOPS/bandwidth limits and constraints that an LLM would not know from training, fitting the limits-quotas category.",updated
+https://learn.microsoft.com/en-us/azure/documentdb/high-performance-storage,High performance storage (Premium SSD v2),Premium SSD v2 Disks - High Performance Storage - Azure DocumentDB,Configure Premium SSD v2 performance limits for Azure DocumentDB,Learn how to use Premium SSD v2 high performance storage in Azure DocumentDB for higher IOPS and bandwidth.,"Azure DocumentDB uses Premium SSD v2 disks to deliver significantly higher performance for I/O-intensive workloads by de-coupling storage capacity from IOPS and bandwidth settings. With Premium SSD v2 storage on Azure DocumentDB, the maximum configurable IOPS and bandwidth settings are available by default regardless of the storage capacity configured for the cluster. The IOPS and bandwidth capacity of the Compute tier determines the achievable IOPS and bandwidth in the storage layer without the n",2026-04-17T22:09:00.000Z,feature-guide,limits-quotas,0.7,True,"Page is about Premium SSD v2 high-performance storage and explicitly mentions maximum configurable IOPS and bandwidth settings being available by default and determined by the compute tier. 
This implies product-specific numeric IOPS/bandwidth limits and constraints that an LLM would not know from training, fitting the limits-quotas category.",unchanged https://learn.microsoft.com/en-us/azure/documentdb/how-to-assess-plan-migration-readiness,Premigration assessment,Assess for readiness and plan migration - Azure DocumentDB,Assess MongoDB workloads and plan migration to Azure DocumentDB,Assess an existing MongoDB installation to determine if it's suitable for migration to Azure DocumentDB.,Carry out up-front planning tasks and make critical decisions before migrating your data to Azure DocumentDB. These decisions make your migration process run smoothly.,2025-11-18T18:29:00.000Z,how-to,decision-making,0.65,True,"Focused on up-front planning and critical decisions for migration readiness; provides structured guidance on when and how to migrate, fitting technology selection/migration decision-making.",unchanged https://learn.microsoft.com/en-us/azure/documentdb/how-to-build-dotnet-console-app,.NET,Build a .NET console app - Azure DocumentDB,Build a .NET console app integrating with Azure DocumentDB,Connect to an Azure DocumentDB cluster by using a .NET console application in your preferred developer language.,"This guide demonstrates how to build a .NET console application to connect to an Azure DocumentDB cluster. 
You set up your development environment, use theAzure.Identitylibrary from the Azure SDK for .NET to authenticate, and interact with the database to create, query, and update documents.",2025-11-18T18:29:00.000Z,how-to,integrations,0.75,True,Uses Azure.Identity and DocumentDB-specific connection/auth patterns; includes SDK usage and parameters unique to this product integration.,unchanged https://learn.microsoft.com/en-us/azure/documentdb/how-to-build-go-console-app,Go,Build a Go console app - Azure DocumentDB,,Connect to an Azure DocumentDB cluster by using a Go console application in your preferred developer language.,"This guide explains how to build a Go console application to connect to an Azure DocumentDB cluster. You set up your development environment, use theazidentitypackage from the Azure SDK for Go to authenticate, and perform common operations on documents in the database.",2025-11-18T18:29:00.000Z,how-to,,0.3,False,"Go console app tutorial; basic environment setup and operations, not a configuration reference or best-practices guide.",unchanged @@ -801,7 +801,7 @@ https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$out,$o https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$redact,$redact,$redact - Azure DocumentDB,Filter document fields by access with $redact in DocumentDB,Filters the content of the documents based on access rights.,The$redactstage in aggregation pipeline is used to filter fields of the documents in a collection dynamically based on access rights or other conditions. 
It processes each document in the pipeline and removes or retains fields based on the specified logic.,2025-11-18T18:29:00.000Z,language-reference,security,0.7,True,"Documents $redact stage behavior for access-based field filtering, a product-specific security-related aggregation feature.",unchanged https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$replacewith,$replaceWith,$replaceWith - Azure DocumentDB,Transform documents with $replaceWith in DocumentDB,The $replaceWith operator in Azure DocumentDB returns a document after replacing a document with the specified document,The $replaceWith aggregation stage operator is used to replace the input document with the specified document. The $replaceWith operator transforms documents from one structure to another or replaces them entirely with new fields and values.,2025-11-18T18:29:00.000Z,language-reference,integrations,0.75,True,"Explains $replaceWith semantics for replacing document structures in the pipeline, which is specific aggregation API behavior.",unchanged https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$sample,$sample,$sample - Azure DocumentDB,Randomly sample documents with $sample in DocumentDB,The $sample operator in Azure DocumentDB returns a randomly selected number of documents,"The $sample stage is used in aggregation pipelines to randomly select a specified number of documents from a collection. 
The $sample command is useful during testing, data analysis, and generating random subsets of data for machine learning.",2025-11-18T18:29:00.000Z,language-reference,integrations,0.75,True,"Documents the $sample stage for random selection of documents, a concrete aggregation feature of this service.",unchanged
-https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$set,$set,$set - Azure DocumentDB,Set or update fields with $set in DocumentDB,The $set operator in Azure DocumentDB updates or creates a new field with a specified value,The $set operator updates an existing field or creates a new field with the specified value if it does not exist. One or more fields listed are updated or created. The dot notation is used to update or create nested objects.,2025-11-18T18:29:00.000Z,language-reference,integrations,0.75,True,"Describes $set operator behavior including dot-notation for nested objects, which is detailed, product-specific update/aggregation semantics.",unchanged
+https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$set,$set,$set - Azure DocumentDB,Use the $set update operator in Azure DocumentDB,The $set operator in Azure DocumentDB updates or creates a new field with a specified value,The $set operator in Azure DocumentDB updates an existing field or creates a new field with the specified value if it does not exist. One or more fields listed are updated or created. The dot notation is used to update or create nested objects.,2026-04-24T06:03:00.000Z,language-reference,integrations,0.7,True,"Describes a product-specific update operator ($set) with precise behavior and syntax (including dot notation for nested objects), which is detailed API semantics rather than generic knowledge. 
This fits integrations/coding patterns as it defines how to use a specific operator in queries/updates.",updated https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$skip,$skip,$skip - Azure DocumentDB,Implement pagination with $skip in DocumentDB pipelines,The $skip stage in the aggregation pipeline is used to skip a specified number of documents from the input and pass the remaining documents to the next stage in the pipeline.,The $skip stage in the aggregation pipeline is used to skip a specified number of documents from the input and pass the remaining documents to the next stage in the pipeline. The stage is useful for implementing pagination in queries and for controlling the subset of documents that subsequent stages in the pipeline operate on.,2025-11-18T18:29:00.000Z,language-reference,integrations,0.75,True,"Explains $skip stage behavior for skipping N documents in aggregation pipelines, which is specific to this query framework.",unchanged https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$sort,$sort,$sort - Azure DocumentDB,,The $sort stage in the aggregation pipeline is used to order the documents in the pipeline by a specified field or fields.,"The $sort stage in the aggregation pipeline is used to order the documents in the pipeline by a specified field or fields. 
This stage helps you sort data, like arranging sales by amount or events by date.",2025-11-18T18:29:00.000Z,language-reference,,0.2,False,"Operator reference for $sort with basic behavior description; no limits, config tables, error codes, or product-specific thresholds.",unchanged https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$sortbycount,$sortByCount,$sortByCount - Azure DocumentDB,,The $sortByCount stage in the aggregation pipeline is used to group documents by a specified expression and then sort the count of documents in each group in descending order.,The $sortByCount stage in the aggregation pipeline is used to group documents by a specified expression and then sort the count of documents in each group in descending order. The $sortByCount stage is useful for quickly identifying the most common values within a dataset.,2025-11-18T18:29:00.000Z,language-reference,,0.2,False,"Describes $sortByCount semantics conceptually; lacks quotas, configuration parameters, or troubleshooting mappings.",unchanged
@@ -866,7 +866,7 @@
https://learn.microsoft.com/en-us/azure/documentdb/operators/bitwise-update/$bit
https://learn.microsoft.com/en-us/azure/documentdb/operators/bitwise/$bitand,$bitAnd,$bitAnd - Azure DocumentDB,Apply $bitAnd bitwise operations in Azure DocumentDB,The $bitAnd operator performs a bitwise AND operation on integer values and returns the result as an integer.,"The $bitAnd operator performs a bitwise AND operation on integer values. It compares each bit of the first operand to the corresponding bit of the second operand. If both bits are 1, the corresponding result bit is set to 1. 
Otherwise, the corresponding result bit is set to 0.",2025-11-18T18:29:00.000Z,language-reference,integrations,0.65,True,"Documents the $bitAnd operator semantics for integer fields in DocumentDB, which is specific to this database’s query/update language.",unchanged https://learn.microsoft.com/en-us/azure/documentdb/operators/bitwise/$bitnot,$bitNot,$bitNot - Azure DocumentDB,Use $bitNot bitwise operator in Azure DocumentDB,The $bitNot operator performs a bitwise NOT operation on integer values and returns the result as an integer.,"The $bitNot operator performs a bitwise NOT operation on integer values. It inverts all the bits of the operand, turning 1s into 0s and 0s into 1s. The result is the bitwise complement of the input value.",2025-11-18T18:29:00.000Z,language-reference,integrations,0.65,True,"Explains how $bitNot inverts bits on integer values in DocumentDB, a product-specific operator definition and usage pattern.",unchanged https://learn.microsoft.com/en-us/azure/documentdb/operators/bitwise/$bitor,$bitOr,$bitOr - Azure DocumentDB,Use $bitOr bitwise operator in Azure DocumentDB,The $bitOr operator performs a bitwise OR operation on integer values and returns the result as an integer.,"The $bitOr operator performs a bitwise OR operation on integer values. It compares each bit of the first operand to the corresponding bit of the second operand. If either bit is 1, the corresponding result bit is set to 1. If both bits are 0, the corresponding result bit is set to 0.",2025-11-18T18:29:00.000Z,language-reference,integrations,0.65,True,"Describes the $bitOr operator behavior for integer fields, which is concrete API semantics for DocumentDB’s bitwise operations.",unchanged
-https://learn.microsoft.com/en-us/azure/documentdb/operators/comparison-query/$cmp,$cmp,$cmp - Azure DocumentDB,Compare values using $cmp in Azure DocumentDB,The $cmp operator compares two values,"The $cmp operator compares two specified values. 
The $cmp operator returns -1 if the first value is less than the second, 0 if the two values are equal and 1 if the first value is greater than the second.",2025-11-18T18:29:00.000Z,language-reference,integrations,0.65,True,"Documents the $cmp comparison operator and its return values (-1, 0, 1), which is precise operator semantics for this database.",unchanged
+https://learn.microsoft.com/en-us/azure/documentdb/operators/comparison-query/$cmp,$cmp,$cmp - Azure DocumentDB,,The $cmp operator in Azure DocumentDB compares two values,"The $cmp operator compares two specified values. The $cmp operator returns -1 if the first value is less than the second, 0 if the two values are equal and 1 if the first value is greater than the second.",2026-04-23T08:00:00.000Z,language-reference,,0.3,False,"Describes basic comparison operator behavior (-1, 0, 1) without product-specific limits, configuration, or nuanced patterns; this is generic programming knowledge rather than expert, product-unique guidance.",updated https://learn.microsoft.com/en-us/azure/documentdb/operators/comparison-query/$eq,$eq,$eq - Azure DocumentDB,Filter equal values with $eq in Azure DocumentDB,The $eq query operator compares the value of a field to a specified value,"The $eq operator is used to match documents where the value of a field is equal to a specified value. 
The $eq operator filters documents based on exact matches on query predicates to retrieve documents with specific values, objects and arrays.",2025-11-18T18:29:00.000Z,language-reference,integrations,0.6,True,"Describes the $eq operator behavior for matching field values, which is concrete query operator usage in DocumentDB.",unchanged https://learn.microsoft.com/en-us/azure/documentdb/operators/comparison-query/$gt,$gt,$gt - Azure DocumentDB,Query greater-than values with $gt in Azure DocumentDB,The $gt query operator retrieves documents where the value of a field is greater than a specified value,The$gtoperator retrieves documents where the value of a field is greater than a specified value. The$gtoperator queries numerical and date values to filter records that exceed a specified threshold.,2025-11-18T18:29:00.000Z,language-reference,integrations,0.6,True,Explains the $gt comparison operator semantics for numeric and date fields in DocumentDB queries.,unchanged https://learn.microsoft.com/en-us/azure/documentdb/operators/comparison-query/$gte,$gte,$gte - Azure DocumentDB,Query minimum thresholds with $gte in Azure DocumentDB,The $gte operator retrieves documents where the value of a field is greater than or equal to a specified value,The$gteoperator retrieves documents where the value of a field is greater than or equal to a specified value. 
The$gteoperator retrieves documents that meet a minimum threshold for the value of a field.,2025-11-18T18:29:00.000Z,language-reference,integrations,0.6,True,"Documents the $gte operator behavior for greater-than-or-equal comparisons in DocumentDB, which is specific query syntax.",unchanged @@ -986,7 +986,7 @@ https://learn.microsoft.com/en-us/azure/documentdb/quickstart-terraform,Deploy a https://learn.microsoft.com/en-us/azure/documentdb/rag,RAG with Langchain & OpenAI,Retrieval-Augmented Generation with LangChain and OpenAI - Azure DocumentDB,"Implement RAG with Azure DocumentDB, LangChain, and OpenAI","Enhance AI-based applications using retrieval-augmented generation (RAG) with Azure DocumentDB, LangChain, and OpenAI.","In the fast-evolving realm of generative AI, large language models (LLMs) like GPT have transformed natural language processing. However, an emerging trend in AI is the use of vector stores, which play a pivotal role in enhancing AI applications. This tutorial explores how to use Azure DocumentDB, LangChain, and OpenAI to implement retrieval-augmented generation (RAG) for superior AI performance, alongside discussing LLMs and their limitations. We explore the rapidly adopted paradigm of RAG, and",2025-11-18T18:29:00.000Z,how-to,architecture-patterns,0.65,True,"RAG tutorial provides a concrete architecture for combining DocumentDB as a vector store with LangChain and OpenAI, including retrieval and generation flow patterns.",unchanged https://learn.microsoft.com/en-us/azure/documentdb/regional-availability,Regional availability,Region Availability - Azure DocumentDB,Plan Azure DocumentDB deployments by region availability,Discover Azure DocumentDB region availability across 40+ Azure regions worldwide. Deploy close to users to reduce latency and meet compliance requirements.,"Azure DocumentDB offers region availability across 40+ Azure regions worldwide. 
This global reach helps you deploy applications close to your users, reducing latency, meeting data residency requirements, and improving business continuity. Azure DocumentDB is available in the following clouds and their corresponding regions:",2025-11-20T08:00:00.000Z,concept-article,deployment,0.65,True,Region availability listing across clouds/regions is a deployment planning matrix specific to this service.,unchanged https://learn.microsoft.com/en-us/azure/documentdb/release-notes,Release notes,Service release notes - Azure DocumentDB,,"Explore Azure DocumentDB release notes with feature updates, engine enhancements, and infrastructure improvements grouped by date. Stay current with the latest capabilities.","Azure DocumentDB continuously evolves with new features, performance improvements, and infrastructure enhancements. This article provides a comprehensive history of feature releases, engine updates, and service improvements for Azure DocumentDB. Each release includes details about new capabilities, query operator enhancements, and infrastructure changes to help you stay current with the latest developments. Note Items tagged as[Preview]require a support request to enable them on your cluster.",2025-11-20T08:00:00.000Z,release-notes,,0.4,False,"Release notes summary; without the full content, we can’t confirm presence of structured limits, config tables, or error mappings required by the categories.",unchanged -https://learn.microsoft.com/en-us/azure/documentdb/scalability-overview,Scalability overview,Scalability overview - Azure DocumentDB,,Cost and performance advantages of scalability for Azure DocumentDB,"Azure DocumentDB offers the ability to scale clusters both vertically and horizontally. 
While the Compute cluster tier and Storage disk functionally depend on each other, the scalability and cost of compute and storage are decoupled.",2026-04-17T08:00:00.000Z,concept-article,,0.1,False,"Described as a scalability overview focusing on cost and performance advantages and conceptual decoupling of compute and storage. No indication of specific numeric limits, configuration parameters, or decision matrices; primarily conceptual content.",updated +https://learn.microsoft.com/en-us/azure/documentdb/scalability-overview,Scalability overview,Scalability overview - Azure DocumentDB,,Cost and performance advantages of scalability for Azure DocumentDB,"Azure DocumentDB offers the ability to scale clusters both vertically and horizontally. While the Compute cluster tier and Storage disk functionally depend on each other, the scalability and cost of compute and storage are decoupled.",2026-04-17T08:00:00.000Z,concept-article,,0.1,False,"Described as a scalability overview focusing on cost and performance advantages and conceptual decoupling of compute and storage. No indication of specific numeric limits, configuration parameters, or decision matrices; primarily conceptual content.",unchanged https://learn.microsoft.com/en-us/azure/documentdb/secondary-users,Create secondary users,Read and read/write privileges with secondary native users - Azure DocumentDB,Manage secondary native users and privileges in Azure DocumentDB,Learn how to create and configure secondary native users with read and read/write privileges in Azure DocumentDB. Discover how to delegate access securely and get started today.,"Azure DocumentDB supports secondary nativeDocumentDBusers with specialized read-write and read-only roles, enabling secure delegation of data access. The built-in administrative account, created during cluster provisioning, has full privileges, including user management. 
Secondary users are automatically replicated to cluster replicas, but user management must be performed on the primary cluster.",2025-11-18T18:29:00.000Z,feature-guide,security,0.8,True,Describes built-in admin account and secondary user roles (read/read-write) with product-specific access control behavior.,unchanged https://learn.microsoft.com/en-us/azure/documentdb/security,Security overview,Secure your cluster - Azure DocumentDB,Secure Azure DocumentDB clusters with network and data controls,Learn how to secure Azure DocumentDB clusters with best practices for data and network protection. Strengthen security and prevent breaches.,"Azure DocumentDB is a fully managed NoSQL database service designed for high-performance, mission-critical applications. Securing your Azure DocumentDB cluster is essential to protect your data and network. This article explains best practices and key features to help you prevent, detect, and respond to database breaches.",2025-11-18T18:29:00.000Z,best-practice,security,0.7,True,Security-focused article with best practices for data and network protection; likely includes product-specific security settings and recommendations beyond generic concepts.,unchanged https://learn.microsoft.com/en-us/azure/documentdb/solutions-finance,Financial services and technology,Financial Services and Technology Solutions - Azure DocumentDB,,"Build secure financial solutions for banking, insurance, and fintech with Azure DocumentDB. Modernize legacy systems and improve customer experiences.","Azure DocumentDB lets financial institutions build secure, scalable, and modern solutions for banking, insurance, and fintech. Its globally distributed architecture and flexible data model support mission-critical workloads and regulatory compliance. 
Financial organizations use Azure DocumentDB to modernize legacy systems, improve customer experiences, and accelerate innovation while maintaining high security and reliability standards.",2025-11-18T18:29:00.000Z,solution-overview,,0.2,False,Financial services solution overview; primarily marketing and conceptual benefits without concrete technical decision matrices or configs.,unchanged diff --git a/products/azure-cosmos-db/report.md b/products/azure-cosmos-db/report.md index 9cb8f8be..f0608bc6 100644 --- a/products/azure-cosmos-db/report.md +++ b/products/azure-cosmos-db/report.md @@ -1,12 +1,12 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: - integrations: SDK patterns, bulk ops, change feed, vector search, and integrations - (Kafka, Spark, Functions, BI, AI agents) for Cosmos DB APIs (NoSQL, Mongo, Cassandra, - PostgreSQL, Gremlin, DocumentDB). + integrations: SDK patterns, bulk ops, change feed, vector search, and integration + guides for Cosmos DB APIs (NoSQL, Mongo, Cassandra, PostgreSQL, Gremlin, DocumentDB) + plus Kafka, Spark, BI, and migration tools. security: 'Securing Cosmos DB and related services: identity/RBAC, keys and encryption, network isolation (VNet, Private Link, firewalls), TLS, auditing, policies, and - data‑level protections.' + workload-specific security (NoSQL, Mongo, PostgreSQL, Cassandra, DocumentDB).' architecture-patterns: 'Architectural patterns for Cosmos DB and PostgreSQL: multitenancy, sharding, HA/DR, change feed, HTAP, real-time analytics, and AI/LLM agents, memory, vectors, and semantic caching.' @@ -16,12 +16,12 @@ category_descriptions: configuration: 'Configuring Cosmos DB and related services: throughput, indexing, TTL, backup/restore, global distribution, monitoring, emulators, SDK tuning, and deployment via Bicep/ARM/Terraform across all APIs.' 
- best-practices: Performance, scaling, cost, and resiliency best practices for Cosmos
- DB (all APIs/SDKs), including partitioning, indexing, throughput, benchmarking,
- DR, and tuning PostgreSQL/Cassandra workloads
- limits-quotas: 'Limits, quotas, and behaviors for Cosmos DB and DocumentDB: RU/throughput,
- autoscale, burst, backups, partitions, indexing, APIs (Core, Cassandra, Mongo,
- Table, Gremlin), and PostgreSQL cluster sizing.'
+ best-practices: Performance, scaling, partitioning, indexing, cost optimization,
+ SDK tuning, and HA/DR best practices for Cosmos DB (NoSQL, MongoDB, Cassandra,
+ PostgreSQL) and legacy DocumentDB.
+ limits-quotas: 'Limits, quotas, and behaviors for Cosmos DB and DocumentDB: RUs,
+ autoscale, burst, backup/PITR, partitions, indexing, APIs (Mongo/Cassandra/Table/Gremlin),
+ fleets, free tier, and PostgreSQL cluster sizing.'
troubleshooting: 'Diagnosing and fixing Cosmos DB issues: SDK errors, timeouts, 4xx/5xx codes, performance/RU analysis, metrics/log queries, CMK/backup problems, and API-specific (Mongo/Cassandra/Gremlin/Postgres) troubleshooting.'
@@ -31,32 +31,31 @@ category_descriptions:
skill_description: Expert knowledge for Azure Cosmos DB development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when
- using Cosmos DB NoSQL/Mongo/Cassandra APIs, change feed, vector search, global distribution,
- or HTAP/analytics workloads, and other Azure Cosmos DB related development tasks.
+ using Cosmos DB NoSQL/Mongo/Cassandra/Postgres APIs, change feed, vector search,
+ global distribution, or HTAP, and other Azure Cosmos DB related development tasks.
Not for Azure Table Storage (use azure-table-storage), Azure SQL Database (use azure-sql-database),
- Azure Database for MySQL (use azure-database-mysql), Azure Database for PostgreSQL
- (use azure-database-postgresql).
-use_when: Use when using Cosmos DB NoSQL/Mongo/Cassandra APIs, change feed, vector
- search, global distribution, or HTAP/analytics workloads, and other Azure Cosmos
- DB related development tasks.
+ Azure Data Explorer (use azure-data-explorer), Azure Synapse Analytics (use azure-synapse-analytics).
+use_when: Use when using Cosmos DB NoSQL/Mongo/Cassandra/Postgres APIs, change feed,
+ vector search, global distribution, or HTAP, and other Azure Cosmos DB related development
+ tasks.
confusable_not_for: Not for Azure Table Storage (use azure-table-storage), Azure SQL
- Database (use azure-sql-database), Azure Database for MySQL (use azure-database-mysql),
- Azure Database for PostgreSQL (use azure-database-postgresql).
+ Database (use azure-sql-database), Azure Data Explorer (use azure-data-explorer),
+ Azure Synapse Analytics (use azure-synapse-analytics).
---
# Azure Cosmos DB Crawl Report
## Summary
-- **Total Pages**: 987
-- **Fetched**: 987
+- **Total Pages**: 988
+- **Fetched**: 988
- **Fetch Failed**: 0
-- **Classified**: 772
-- **Unclassified**: 215
+- **Classified**: 771
+- **Unclassified**: 217
### Incremental Update
-- **New Pages**: 4
+- **New Pages**: 1
- **Updated Pages**: 8
-- **Unchanged**: 975
+- **Unchanged**: 979
- **Deleted Pages**: 0
- **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-cosmos-db/azure-cosmos-db.csv`
@@ -64,44 +63,41 @@ confusable_not_for: Not for Azure Table Storage (use azure-table-storage), Azure
| Type | Count | Percentage |
|------|-------|------------|
-| architecture-patterns | 39 | 4.0% |
+| architecture-patterns | 39 | 3.9% |
| best-practices | 56 | 5.7% |
-| configuration | 120 | 12.2% |
+| configuration | 119 | 12.0% |
| decision-making | 51 | 5.2% |
| deployment | 22 | 2.2% |
-| integrations | 329 | 33.3% |
-| limits-quotas | 41 | 4.2% |
-| security | 63 | 6.4% |
+| integrations | 328 | 33.2% |
+| limits-quotas | 41 | 4.1% |
+| security | 64 | 6.5% |
| troubleshooting | 51 | 5.2% |
-| *(Unclassified)* | 215 | 21.8% |
+| *(Unclassified)* | 217 | 22.0% |
## Changes
### New Pages
-- [Unlimited scale with hierarchical partition keys](https://learn.microsoft.com/en-us/azure/cosmos-db/hierarchical-partition-keys-unlimited-scale)
-- [Query a vector store with a Go app](https://learn.microsoft.com/en-us/azure/cosmos-db/quickstart-vector-store-go)
-- [Query a vector store with a Python app](https://learn.microsoft.com/en-us/azure/cosmos-db/quickstart-vector-store-python)
-- [Query a vector store with a Java app](https://learn.microsoft.com/en-us/azure/cosmos-db/quickstart-vector-store-java)
+- [Understanding request unit consumption](https://learn.microsoft.com/en-us/azure/cosmos-db/understand-request-unit-consumption)
### Updated Pages
-- [Create a container](https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-python-create-container)
- - Updated: 2025-12-19T18:19:00.000Z → 2026-04-15T17:08:00.000Z
-- [Vector indexing and search](https://learn.microsoft.com/en-us/azure/cosmos-db/vector-search)
- - Updated: 2025-12-19T18:19:00.000Z → 2026-04-15T22:09:00.000Z
-- [Best practices for Python SDK](https://learn.microsoft.com/en-us/azure/cosmos-db/best-practice-python)
- - Updated: 2026-01-20T18:11:00.000Z → 2026-04-16T17:12:00.000Z
-- [High performance storage (Premium SSD v2)](https://learn.microsoft.com/en-us/azure/documentdb/high-performance-storage)
- - Updated: 2025-11-21T23:09:00.000Z → 2026-04-17T22:09:00.000Z
-- [Scalability overview](https://learn.microsoft.com/en-us/azure/documentdb/scalability-overview)
- - Updated: 2025-11-18T18:29:00.000Z → 2026-04-17T08:00:00.000Z
-- [Compute and storage](https://learn.microsoft.com/en-us/azure/documentdb/compute-storage)
- - Updated: 2025-11-18T18:29:00.000Z → 2026-04-17T22:09:00.000Z
-- [Burstable tier](https://learn.microsoft.com/en-us/azure/documentdb/compute-storage)
- - Updated: 2025-11-18T18:29:00.000Z → 2026-04-17T22:09:00.000Z
-- [Wire protocol support](https://learn.microsoft.com/en-us/azure/cosmos-db/cassandra/support)
- - Updated: 2025-06-06T22:01:00.000Z → 2026-04-16T17:12:00.000Z
+- [Security overview](https://learn.microsoft.com/en-us/azure/cosmos-db/security)
+ - Updated: 2025-12-19T18:19:00.000Z → 2026-04-21T17:11:00.000Z
+- [Dynamic data masking (DDM)](https://learn.microsoft.com/en-us/azure/cosmos-db/dynamic-data-masking)
+ - Updated: 2025-12-19T18:19:00.000Z → 2026-04-20T22:08:00.000Z
+- [Manage an Azure Cosmos DB account](https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-manage-database-account)
+ - Updated: 2026-02-02T18:14:00.000Z → 2026-04-21T17:11:00.000Z
+- [Best practices for JavaScript SDK](https://learn.microsoft.com/en-us/azure/cosmos-db/best-practices-javascript)
+ - Updated: 2025-12-19T18:19:00.000Z → 2026-04-21T17:11:00.000Z
+- [Linux emulator (preview)](https://learn.microsoft.com/en-us/azure/cosmos-db/emulator-linux)
+ - Updated: 2025-11-18T18:29:00.000Z → 2026-04-21T17:11:00.000Z
+- [Azure Policy built-ins](https://learn.microsoft.com/en-us/azure/cosmos-db/policy-reference)
+ - Updated: 2024-08-14T17:44:00.000Z → 2025-12-19T18:19:00.000Z
+- [$cmp](https://learn.microsoft.com/en-us/azure/documentdb/operators/comparison-query/$cmp)
+ - Updated: 2025-11-18T18:29:00.000Z → 2026-04-23T08:00:00.000Z
+- [$set](https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$set)
+ - Updated: 2025-11-18T18:29:00.000Z → 2026-04-24T06:03:00.000Z
## Classified Pages
@@ -143,7 +139,7 @@ confusable_not_for: Not for Azure Table Storage (use azure-table-storage), Azure
| [Troubleshoot unauthorized](https://learn.microsoft.com/en-us/azure/cosmos-db/troubleshoot-unauthorized) | troubleshooting | 0.90 | Maps 401 unauthorized errors to causes like MAC signature mismatch and key issues, with concrete resolution steps; product-specific troubleshooting. |
| [Windows command-line arguments](https://learn.microsoft.com/en-us/azure/cosmos-db/emulator-windows-arguments) | configuration | 0.90 | Command-line and PowerShell reference includes specific command names, arguments, and configuration options (e.g., /Port, /Key, /EnableMongoDbEndpoint), which are explicit configuration parameters. |
| [Cost-effective reads and writes](https://learn.microsoft.com/en-us/azure/cosmos-db/key-value-store-cost) | limits-quotas | 0.88 | This page explicitly describes RU charges for simple reads/writes; such docs contain specific RU numbers per operation and item size, which are numeric limits/quotas that LLMs won’t reliably know. |
-| [Best practices for JavaScript SDK](https://learn.microsoft.com/en-us/azure/cosmos-db/best-practices-javascript) | best-practices | 0.86 | Explicitly a best practices guide for the JavaScript SDK; such pages usually contain concrete recommendations (e.g., enabling bulk mode, tuning connection options, handling throttling) and SDK-specific patterns that qualify as product-specific best practices. |
+| [Best practices for JavaScript SDK](https://learn.microsoft.com/en-us/azure/cosmos-db/best-practices-javascript) | best-practices | 0.86 | The page is explicitly a best practices guide for the Cosmos DB JavaScript SDK, focused on improving latency, availability, and performance. Such SDK-specific recommendations typically include concrete patterns (e.g., connection handling, retry configuration, throughput usage) and product-specific gotchas that go beyond generic programming advice, fitting the best-practices sub-skill. |
| [Best practices for Python SDK](https://learn.microsoft.com/en-us/azure/cosmos-db/best-practice-python) | best-practices | 0.86 | The page is explicitly a best practices guide for the Azure Cosmos DB Python SDK, focused on improving latency, availability, and performance. Such guidance typically includes product-specific recommendations (for example, preferred client reuse patterns, connection modes, retry configurations, and request options) that are unique to Cosmos DB’s Python SDK behavior and not just generic Python or database advice. This aligns with the best-practices criteria: concrete DO/DON'T guidance and SDK-specific configuration/code patterns. |
| [Configure the integrated cache](https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-configure-integrated-cache) | configuration | 0.86 | A 'how to configure' article for integrated cache will list specific configuration parameters (cache mode, TTL, gateway node size/count, connection strings) and stepwise settings, matching configuration criteria. |
| [Create alert on logical partition key size](https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-alert-on-logical-partition-key-storage-size) | limits-quotas | 0.86 | Page states the exact enforced maximum logical partition key size of 20 GB for Azure Cosmos DB and shows how to monitor when a logical partition approaches this numeric limit, which is a concrete quota value not reliably known from training. |
@@ -236,6 +232,7 @@ confusable_not_for: Not for Azure Table Storage (use azure-table-storage), Azure
| [Differences between API for Apache Cassandra and Apache Cassandra](https://learn.microsoft.com/en-us/azure/cosmos-db/cassandra/adoption) | best-practices | 0.80 | Explicitly described as best practices for adapting from Apache Cassandra; such content includes product-specific recommendations, differences, and edge cases when moving workloads. |
| [Disable key-based authentication](https://learn.microsoft.com/en-us/azure/cosmos-db/table/how-to-connect-role-based-access-control) | security | 0.80 | Same RBAC + Entra ID how-to as index 2; includes product-specific role assignments and scopes. |
| [Distribute throughput across partitions overview](https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-redistribute-throughput-across-partitions) | best-practices | 0.80 | Describes how to rebalance RU/s across physical partitions, including a max of 10,000 RU/s per partition and guidance for hot partitions and splits—product-specific behavior and gotchas. |
+| [Dynamic data masking (DDM)](https://learn.microsoft.com/en-us/azure/cosmos-db/dynamic-data-masking) | security | 0.80 | Describes how to configure Dynamic Data Masking, a product-specific security feature; likely includes concrete configuration steps, policy definitions, and parameter values for masking sensitive data in Cosmos DB. |
| [Estimate RU/s with capacity planner](https://learn.microsoft.com/en-us/azure/cosmos-db/estimate-ru-with-capacity-planner) | decision-making | 0.80 | Guides use of the capacity planner to estimate RU/s and cost, including planner inputs/outputs and how to size resources; directly supports capacity planning and tier selection decisions. |
| [Estimate RU/s with capacity planner](https://learn.microsoft.com/en-us/azure/cosmos-db/mongodb/estimate-ru-capacity-planner) | decision-making | 0.80 | Capacity planner guidance typically includes RU/s estimates, cost calculations, and workload-based recommendations, helping choose throughput levels with quantified trade-offs. |
| [Get started](https://learn.microsoft.com/en-us/azure/cosmos-db/mongodb/how-to-dotnet-get-started) | integrations | 0.80 | Shows .NET driver configuration, connection strings, and options specific to Cosmos DB for MongoDB. |
@@ -302,14 +299,12 @@ confusable_not_for: Not for Azure Table Storage (use azure-table-storage), Azure
| [$indexStats](https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$indexstats) | integrations | 0.75 | Explains the $indexStats stage returning per-index usage metrics, a diagnostic API unique to this service. |
| [$replaceWith](https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$replacewith) | integrations | 0.75 | Explains $replaceWith semantics for replacing document structures in the pipeline, which is specific aggregation API behavior. |
| [$sample](https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$sample) | integrations | 0.75 | Documents the $sample stage for random selection of documents, a concrete aggregation feature of this service. |
-| [$set](https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$set) | integrations | 0.75 | Describes $set operator behavior including dot-notation for nested objects, which is detailed, product-specific update/aggregation semantics. |
| [$skip](https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$skip) | integrations | 0.75 | Explains $skip stage behavior for skipping N documents in aggregation pipelines, which is specific to this query framework. |
| [.NET](https://learn.microsoft.com/en-us/azure/documentdb/how-to-build-dotnet-console-app) | integrations | 0.75 | Uses Azure.Identity and DocumentDB-specific connection/auth patterns; includes SDK usage and parameters unique to this product integration. |
| [Access Azure Key Vault with managed identity](https://learn.microsoft.com/en-us/azure/cosmos-db/access-key-vault-managed-identity) | security | 0.75 | Describes configuring managed identity and Key Vault access policies for Cosmos DB; includes specific role/permission settings and identity usage. |
| [Assign users and roles](https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-add-assign-user-roles) | security | 0.75 | Covers assigning roles for Cosmos DB for NoSQL; includes specific role names, scopes, and security recommendations (e.g., avoid ROPC), fitting security configuration. |
| [Audit logging](https://learn.microsoft.com/en-us/azure/cosmos-db/postgresql/how-to-enable-audit) | security | 0.75 | Shows how to enable pgAudit in this managed service, including specific parameters and limitations, which are product-specific security/audit configurations. |
| [Azure Data Factory (ADF)](https://learn.microsoft.com/en-us/azure/cosmos-db/postgresql/howto-ingest-azure-data-factory) | integrations | 0.75 | Step-by-step ADF integration with this service, including linked service settings, sink configuration, and constraints unique to Cosmos DB for PostgreSQL. |
-| [Azure Policy built-ins](https://learn.microsoft.com/en-us/azure/cosmos-db/policy-reference) | security | 0.75 | Lists specific Azure Policy definitions for Cosmos DB with names and enforcement scopes; these are product-specific security/compliance configurations. |
| [Azure Stream Analytics (ASA)](https://learn.microsoft.com/en-us/azure/cosmos-db/postgresql/howto-ingest-azure-stream-analytics) | integrations | 0.75 | Details Stream Analytics output configuration, connection parameters, and patterns specific to writing into this distributed PostgreSQL service. |
| [Basics of pgvector](https://learn.microsoft.com/en-us/azure/cosmos-db/postgresql/howto-use-pgvector) | integrations | 0.75 | Using pgvector here involves extension-specific SQL commands, configuration parameters, and possibly index options unique to this product’s pgvector support, which are integration/coding patterns. |
| [Burst capacity FAQ](https://learn.microsoft.com/en-us/azure/cosmos-db/burst-capacity-faq) | limits-quotas | 0.75 | Burst capacity FAQ will detail constraints, eligibility, and numeric behaviors (e.g., accumulation rules per partition), which are product-specific limits. |
@@ -425,6 +420,7 @@ confusable_not_for: Not for Azure Table Storage (use azure-table-storage), Azure
| [$redact](https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$redact) | security | 0.70 | Documents $redact stage behavior for access-based field filtering, a product-specific security-related aggregation feature. |
| [$regex](https://learn.microsoft.com/en-us/azure/documentdb/operators/evaluation-query/$regex) | integrations | 0.70 | Provides details on a specific query operator and its role in regex-based filtering, which is API-specific. |
| [$second](https://learn.microsoft.com/en-us/azure/documentdb/operators/date-expression/$second) | integrations | 0.70 | Describes a concrete operator and its return range (0–59), which is specific to the query API. |
+| [$set](https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$set) | integrations | 0.70 | Describes a product-specific update operator ($set) with precise behavior and syntax (including dot notation for nested objects), which is detailed API semantics rather than generic knowledge. This fits integrations/coding patterns as it defines how to use a specific operator in queries/updates. |
| [$setField](https://learn.microsoft.com/en-us/azure/documentdb/operators/object-expression/$setfield) | integrations | 0.70 | Explains $setField operator behavior for adding, updating, and removing embedded fields in Azure DocumentDB aggregation, which is specific to this product’s query engine. |
| [$shift](https://learn.microsoft.com/en-us/azure/documentdb/operators/window-operators/$shift) | integrations | 0.70 | Explains $shift window operator behavior within partitions and sorted windows in Azure DocumentDB; detailed, product-specific window function semantics. |
| [$stdDevSamp](https://learn.microsoft.com/en-us/azure/documentdb/operators/accumulators/$stddevsamp) | integrations | 0.70 | Documents the $stddevsamp operator, including when to use it vs $stddevpop, which is detailed, product-specific aggregation behavior. |
@@ -442,6 +438,7 @@ confusable_not_for: Not for Azure Table Storage (use azure-table-storage), Azure
| [Async Java sample with change feed](https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-java-change-feed) | integrations | 0.70 | End-to-end Java sample using change feed processor; includes concrete SDK usage and configuration patterns specific to Cosmos DB. |
| [Autoscale FAQ](https://learn.microsoft.com/en-us/azure/cosmos-db/autoscale-faq) | limits-quotas | 0.70 | Autoscale FAQ pages usually list concrete RU/s ranges, scaling intervals, and other numeric constraints and edge cases, which are product-specific limits not inferable from general knowledge. |
| [Azure App Service](https://learn.microsoft.com/en-us/azure/cosmos-db/create-website) | deployment | 0.70 | Shows a no-touch deployment using an ARM template; includes resource definitions and wiring of connection settings, which are product-specific deployment configuration patterns. |
+| [Azure Policy built-ins](https://learn.microsoft.com/en-us/azure/cosmos-db/policy-reference) | security | 0.70 | Lists specific built-in Azure Policy definitions for Azure Cosmos DB, including exact policy names and links to their definitions. These are product-specific security and governance configurations (policy definitions and effects) that an LLM is unlikely to know exhaustively from training, fitting the security sub-skill focused on RBAC/policy and configuration-level security controls. |
| [Azure Policy support](https://learn.microsoft.com/en-us/azure/cosmos-db/policy) | security | 0.70 | Shows how to use Azure Policy for Cosmos DB; will reference specific policy definitions and configuration relevant to security/compliance for this service. |
| [Azure Site Reliability Engineering (SRE) Agent](https://learn.microsoft.com/en-us/azure/cosmos-db/site-reliability-engineering-agent) | troubleshooting | 0.70 | SRE Agent is an AI-powered diagnostic tool; documentation will map symptoms to diagnostics and recommendations, which is product-specific troubleshooting guidance. |
| [Back up and restore](https://learn.microsoft.com/en-us/azure/cosmos-db/postgresql/concepts-backup) | deployment | 0.70 | Describes backup frequency, retention behavior, and restore capabilities specific to this service, which are critical operational/deployment details not generally known. |
@@ -499,7 +496,6 @@ confusable_not_for: Not for Azure Table Storage (use azure-table-storage), Azure
| [Distribute and modify tables](https://learn.microsoft.com/en-us/azure/cosmos-db/postgresql/howto-modify-distributed-tables) | configuration | 0.70 | Documents specific SQL commands, options, and patterns for creating and altering distributed tables in Azure Cosmos DB for PostgreSQL, including product-specific syntax and behaviors. |
| [Distribute reads globally](https://learn.microsoft.com/en-us/azure/cosmos-db/mongodb/readpreference-global-distribution) | best-practices | 0.70 | Describes how to use MongoDB read preference with globally distributed Cosmos DB; likely includes concrete patterns and recommended settings for latency/consistency trade-offs. |
| [Distribute throughput across partitions FAQ](https://learn.microsoft.com/en-us/azure/cosmos-db/distribute-throughput-across-partitions-faq) | limits-quotas | 0.70 | FAQ for throughput redistribution will enumerate constraints, supported scenarios, and numeric limits for the preview feature, which are expert, product-specific limits. |
-| [Dynamic data masking (DDM)](https://learn.microsoft.com/en-us/azure/cosmos-db/dynamic-data-masking) | security | 0.70 | DDM configuration article will include product-specific masking policy options, field-level settings, and constraints that qualify as expert security configuration knowledge. |
| [Emulator](https://learn.microsoft.com/en-us/azure/cosmos-db/emulator) | configuration | 0.70 | Emulator docs typically list ports, connection strings, feature support, and configuration flags, which are concrete configuration details specific to the emulator environment. |
| [Enable LDAP authentication](https://learn.microsoft.com/en-us/azure/managed-instance-apache-cassandra/ldap) | security | 0.70 | LDAP auth configuration is security-focused and product-specific; likely includes config parameters, supported versions, and preview constraints. |
| [Enable fleet analytics](https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-enable-fleet-analytics) | configuration | 0.70 | Enablement guide will specify workspace settings, connection parameters, and configuration steps to turn on analytics, matching configuration. |
@@ -531,7 +527,6 @@ confusable_not_for: Not for Azure Table Storage (use azure-table-storage), Azure
| [Java](https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-java-vector-index-query) | integrations | 0.70 | Java SDK guide for vector indexing/search; provides product-specific API usage and configuration for vector embedding policies and indexes. |
| [JavaScript](https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-javascript-vector-index-query) | integrations | 0.70 | JavaScript SDK guide for vector indexing/search; contains API calls and configuration options (embedding policy, index config) specific to Cosmos DB’s JS SDK. |
| [Limit total account throughput](https://learn.microsoft.com/en-us/azure/cosmos-db/limit-total-account-throughput) | configuration | 0.70 | Describes how to cap total provisioned throughput; will include specific configuration options/parameters and behavior unique to Cosmos DB billing and control. |
-| [Linux emulator (preview)](https://learn.microsoft.com/en-us/azure/cosmos-db/emulator-linux) | configuration | 0.70 | Linux-based emulator article will include container image names, environment variables, ports, and supported modes, which are configuration parameters. |
| [Manage agent memories](https://learn.microsoft.com/en-us/azure/cosmos-db/gen-ai/agentic-memories) | architecture-patterns | 0.70 | Article explicitly covers patterns, data models, and query strategies for agent memory, including strengths and limitations—this is product-specific architecture guidance. |
| [Manage consistency levels](https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-manage-consistency) | configuration | 0.70 | Shows how to configure default and per-request consistency, manage session tokens, and interpret PBS metrics using specific SDK and portal settings—product-specific configuration knowledge. |
| [Manage replication](https://learn.microsoft.com/en-us/azure/documentdb/how-to-cluster-replica) | configuration | 0.70 | Quick guide to enable/disable replication and promote replica clusters; likely includes specific configuration options and parameters. |
@@ -594,6 +589,7 @@ confusable_not_for: Not for Azure Table Storage (use azure-table-storage), Azure
| [Schedule maintenance](https://learn.microsoft.com/en-us/azure/cosmos-db/postgresql/howto-maintenance) | deployment | 0.70 | Shows concrete maintenance window settings and options in the portal for this service, which affect deployment and operations planning.
| | [Search using Lucene Index](https://learn.microsoft.com/en-us/azure/managed-instance-apache-cassandra/search-lucene-index) | integrations | 0.70 | Describes using Stratio Cassandra Lucene Index plugin; likely includes plugin configuration, CQL patterns, and settings unique to this integration. | | [Security](https://learn.microsoft.com/en-us/azure/cosmos-db/gremlin/security) | security | 0.70 | Security-focused article; typically includes Cosmos-specific networking, keys, RBAC roles, and configuration parameters for Gremlin accounts, which are product-specific security settings. | +| [Security overview](https://learn.microsoft.com/en-us/azure/cosmos-db/security) | security | 0.70 | Security-focused article for Cosmos DB that goes beyond concepts and provides concrete, product-specific guidance on securing accounts, data, and networking; likely includes specific features (e.g., RBAC roles, network restrictions, keys) and their recommended configurations. | | [Security overview](https://learn.microsoft.com/en-us/azure/documentdb/security) | security | 0.70 | Security-focused article with best practices for data and network protection; likely includes product-specific security settings and recommendations beyond generic concepts. | | [Self-serve minimum transport-layer security (TLS) version enforcement](https://learn.microsoft.com/en-us/azure/cosmos-db/self-serve-minimum-tls-enforcement) | security | 0.70 | Describes self-service API and account-level TLS settings; includes specific configuration parameters and allowed TLS versions, which are security configuration details. | | [Serverless Performance](https://learn.microsoft.com/en-us/azure/cosmos-db/serverless-performance) | limits-quotas | 0.70 | Describes performance behavior and maximum capacity of serverless containers, including how capacity is determined by data size; this is product-specific performance/limit information. 
| @@ -617,6 +613,7 @@ confusable_not_for: Not for Azure Table Storage (use azure-table-storage), Azure | [Throughput control](https://learn.microsoft.com/en-us/azure/cosmos-db/throughput-control-spark) | best-practices | 0.70 | Throughput control feature docs describe concrete configuration patterns (global control groups, RU limits, SDK-specific options) and edge cases for bulk data movement, which are product-specific best practices. | | [Time to live](https://learn.microsoft.com/en-us/azure/cosmos-db/time-to-live) | configuration | 0.70 | Explains TTL behavior, container vs item-level settings, and that TTL is configured in seconds with automatic deletion semantics—product-specific configuration semantics. | | [Tune query performance with Index Advisor](https://learn.microsoft.com/en-us/azure/documentdb/index-advisor) | best-practices | 0.70 | Performance tuning assistant with concrete index recommendations and explanations; product-specific optimization patterns. | +| [Understanding request unit consumption](https://learn.microsoft.com/en-us/azure/cosmos-db/understand-request-unit-consumption) | limits-quotas | 0.70 | A detailed RU consumption article for Cosmos DB typically includes concrete, product-specific RU cost examples (for example, RU charges for point reads vs. queries, item size impacts, indexing effects). These are effectively numeric resource limits/constraints unique to the service and not generally knowable from training data, fitting the limits-quotas category better than conceptual guidance. | | [Updating data](https://learn.microsoft.com/en-us/azure/cosmos-db/mongodb/tutorial-update) | integrations | 0.70 | Covers update operations using Mongo commands against Cosmos DB; likely includes concrete command syntax and behavior specific to Cosmos’ Mongo API implementation. 
| | [Upgrade](https://learn.microsoft.com/en-us/azure/cosmos-db/postgresql/concepts-upgrade) | deployment | 0.70 | Covers types of upgrades (PostgreSQL, Citus, platform) and precautions specific to this managed service, including operational constraints and recommended upgrade patterns. | | [Upgrade API version](https://learn.microsoft.com/en-us/azure/cosmos-db/mongodb/upgrade-version) | decision-making | 0.70 | Describes how to upgrade MongoDB wire protocol version for existing accounts; includes migration/upgrade path considerations and likely version-specific guidance. | @@ -658,7 +655,6 @@ confusable_not_for: Not for Azure Table Storage (use azure-table-storage), Azure | [$bitOr](https://learn.microsoft.com/en-us/azure/documentdb/operators/bitwise/$bitor) | integrations | 0.65 | Describes the $bitOr operator behavior for integer fields, which is concrete API semantics for DocumentDB’s bitwise operations. | | [$bottom](https://learn.microsoft.com/en-us/azure/documentdb/operators/accumulators/$bottom) | integrations | 0.65 | $bottom operator reference; describes sorting fields and behavior, part of the product’s aggregation API surface. | | [$bottomN](https://learn.microsoft.com/en-us/azure/documentdb/operators/accumulators/$bottomn) | integrations | 0.65 | $bottomN operator reference; documents parameters (N, sort fields) and behavior specific to DocumentDB’s aggregation implementation. | -| [$cmp](https://learn.microsoft.com/en-us/azure/documentdb/operators/comparison-query/$cmp) | integrations | 0.65 | Documents the $cmp comparison operator and its return values (-1, 0, 1), which is precise operator semantics for this database. | | [$cond](https://learn.microsoft.com/en-us/azure/documentdb/operators/conditional-expression/$cond) | integrations | 0.65 | Documents the $cond conditional expression operator and its ternary-like behavior in aggregation pipelines, which is specialized aggregation syntax. 
| | [$count](https://learn.microsoft.com/en-us/azure/documentdb/operators/accumulators/$count) | integrations | 0.65 | $count operator reference; explains usage in aggregation/grouping, which is part of the service’s operator semantics. | | [$dateAdd](https://learn.microsoft.com/en-us/azure/documentdb/operators/date-expression/$dateadd) | integrations | 0.65 | Documents the $dateAdd date expression operator and its supported units, which is specific to DocumentDB’s aggregation framework. | @@ -850,7 +846,6 @@ confusable_not_for: Not for Azure Table Storage (use azure-table-storage), Azure | [How to - Reverse extract, transform, & load (ETL)](https://learn.microsoft.com/en-us/azure/cosmos-db/reverse-extract-transform-load) | architecture-patterns | 0.60 | Describes reverse ETL from data lakes/warehouses back into operational systems, including Cosmos DB, which is an architecture pattern for data movement and real-time analytics integration. | | [JSON syntax templates](https://learn.microsoft.com/en-us/azure/cosmos-db/cassandra/templates-samples) | deployment | 0.60 | Resource Manager templates article includes schema, parameter names, and allowed values for deploying Cosmos DB Cassandra accounts and resources. | | [Java](https://learn.microsoft.com/en-us/azure/cosmos-db/cassandra/quickstart-java) | integrations | 0.60 | Java quickstart will contain driver configuration, endpoint, and SSL/auth details specific to Cosmos DB’s Cassandra API. | -| [Manage an Azure Cosmos DB account](https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-manage-database-account) | limits-quotas | 0.60 | Management article that explicitly references control plane service limits; full content typically includes specific request rate limits and constraints for management operations—numeric, product-specific limits. 
| | [Mapping Cassandra consistency levels](https://learn.microsoft.com/en-us/azure/cosmos-db/cassandra/consistency-mapping) | decision-making | 0.60 | Consistency mapping between Apache Cassandra and Cosmos DB Cassandra API is product-specific and used to decide which consistency level to choose. These mappings and their behavioral implications are expert decision knowledge unique to this integration. | | [Migrate - Apache Cassandra to API for Apache Cassandra using Databricks](https://learn.microsoft.com/en-us/azure/cosmos-db/cassandra/migrate-data-databricks) | decision-making | 0.60 | Migration via Databricks Spark includes product-specific steps, considerations, and possibly performance/cost trade-offs for moving data, which is expert migration decision guidance. | | [Migrate data](https://learn.microsoft.com/en-us/azure/cosmos-db/cassandra/migrate-data) | decision-making | 0.60 | Migration tutorial likely covers tool choices, throughput settings, and trade-offs for moving data into Cosmos DB Cassandra, guiding migration decisions. | @@ -935,7 +930,6 @@ confusable_not_for: Not for Azure Table Storage (use azure-table-storage), Azure | [Request data restore](https://learn.microsoft.com/en-us/azure/cosmos-db/periodic-backup-request-data-restore) | 0.40 | Focuses on support process for restore requests; summary does not show concrete technical parameters, limits, or configuration details. | | [Retrieval-augmented generation (RAG)](https://learn.microsoft.com/en-us/azure/cosmos-db/gen-ai/rag) | 0.40 | RAG overview in Cosmos DB context; appears conceptual without concrete thresholds, limits, or configuration tables. | | [Run queries](https://learn.microsoft.com/en-us/azure/cosmos-db/postgresql/quickstart-run-queries) | 0.40 | Quickstart for running queries; likely standard SQL examples without product-specific limits, configs, or troubleshooting mappings. 
| -| [Security overview](https://learn.microsoft.com/en-us/azure/cosmos-db/security) | 0.40 | High-level security fundamentals and shared-responsibility overview; likely conceptual guidance without concrete RBAC roles, config parameters, or numeric constraints. | | [Solution accelerators](https://learn.microsoft.com/en-us/azure/cosmos-db/solutions) | 0.40 | High-level solution accelerators and AI scenarios; more marketing/overview than detailed best practices with quantified impact or config specifics. | | [Spark 3.x online transaction processing (OLTP) connector](https://learn.microsoft.com/en-us/azure/cosmos-db/tutorial-spark-connector) | 0.40 | Tutorial for Spark connector usage; likely step-by-step without comprehensive parameter tables or product-specific limits; more how-to than deep configuration or best-practices. | | [Stored procedures, triggers, and user-defined functions (UDFs)](https://learn.microsoft.com/en-us/azure/cosmos-db/stored-procedures-triggers-udfs) | 0.40 | Primarily conceptual introduction to stored procedures, triggers, and UDFs; likely lacks detailed configuration tables, limits, or error mappings required for expert classification. | @@ -943,6 +937,7 @@ confusable_not_for: Not for Azure Table Storage (use azure-table-storage), Azure | [Node.js and React app](https://learn.microsoft.com/en-us/azure/cosmos-db/mongodb/tutorial-develop-react) | 0.35 | Video-based tutorial series overview; primarily app-building guidance, not detailed product configuration or troubleshooting reference. | | [Part 6 - Perform CRUD operations](https://learn.microsoft.com/en-us/azure/cosmos-db/mongodb/tutorial-develop-nodejs-part-6) | 0.35 | Tutorial part 6 adding CRUD functions; mostly application-level code, not Cosmos-specific configuration tables or limits. 
| | [$addToSet](https://learn.microsoft.com/en-us/azure/documentdb/operators/array-update/$addtoset) | 0.30 | $addToSet behavior (uniqueness) is standard; no Azure-specific configuration, limits, or troubleshooting mappings. | +| [$cmp](https://learn.microsoft.com/en-us/azure/documentdb/operators/comparison-query/$cmp) | 0.30 | Describes basic comparison operator behavior (-1, 0, 1) without product-specific limits, configuration, or nuanced patterns; this is generic programming knowledge rather than expert, product-unique guidance. | | [$each](https://learn.microsoft.com/en-us/azure/documentdb/operators/array-update/$each) | 0.30 | $each usage within $addToSet/$push is generic update semantics; no config tables or limits. | | [$indexOfArray](https://learn.microsoft.com/en-us/azure/documentdb/operators/array-expression/$indexofarray) | 0.30 | $indexOfArray description includes -1 for not found, but this is generic behavior, not product-specific expert knowledge or config. | | [$isArray](https://learn.microsoft.com/en-us/azure/documentdb/operators/array-expression/$isarray) | 0.30 | $isArray returns true/false; generic type-check behavior without product-specific settings or limits. | @@ -971,6 +966,7 @@ confusable_not_for: Not for Azure Table Storage (use azure-table-storage), Azure | [Java](https://learn.microsoft.com/en-us/azure/documentdb/how-to-build-java-console-app) | 0.30 | Java console app tutorial; standard connection and CRUD operations, no detailed limits, config matrices, or troubleshooting mappings. | | [Java SDK v4](https://learn.microsoft.com/en-us/azure/cosmos-db/sdk-java-v4) | 0.30 | Java SDK v4 release notes/resources; focused on versions and high-level performance notes, not detailed config or troubleshooting content. 
| | [Java bulk executor library v2 (legacy)](https://learn.microsoft.com/en-us/azure/cosmos-db/sdk-java-bulk-executor-v2) | 0.30 | Bulk Executor Java SDK release notes/resources; mainly lifecycle and migration guidance, not detailed configuration or error mappings. | +| [Linux emulator (preview)](https://learn.microsoft.com/en-us/azure/cosmos-db/emulator-linux) | 0.30 | The summary describes a conceptual/overview page for the Linux-based Cosmos DB emulator (what it is, that it runs as a Docker container, and that it supports a subset of features). There is no indication of detailed configuration tables, limits, error codes, or other expert-level specifics in the provided description. | | [Management operations](https://learn.microsoft.com/en-us/azure/managed-instance-apache-cassandra/management-operations) | 0.30 | Management responsibilities and supported operations; likely conceptual and role-based without detailed config tables, limits, or error mappings. | | [Migrate using Visual Studio Code extension](https://learn.microsoft.com/en-us/azure/documentdb/how-to-migrate-vs-code-extension) | 0.30 | Tutorial-style page on using the Azure DocumentDB Migration Extension in VS Code to run MongoDB-to-DocumentDB migrations. Based on the summary, it focuses on step-by-step usage and benefits (no extra infrastructure, secure connectivity, zero-cost usage) rather than detailed configuration tables, limits, or error-code-based troubleshooting. Lacks clear evidence of the structured expert knowledge required by the defined sub-skill types. | | [Model document data](https://learn.microsoft.com/en-us/azure/cosmos-db/modeling-data) | 0.30 | Conceptual data modeling guidance; likely general patterns and considerations rather than product-specific numeric limits or configuration parameters. 
| @@ -1061,6 +1057,7 @@ confusable_not_for: Not for Azure Table Storage (use azure-table-storage), Azure | [Healthcare](https://learn.microsoft.com/en-us/azure/documentdb/solutions-healthcare) | 0.20 | Healthcare solution overview; focuses on use cases and benefits, not on detailed configuration, limits, or troubleshooting. | | [IoT and manufacturing](https://learn.microsoft.com/en-us/azure/documentdb/solutions-iot) | 0.20 | IoT and manufacturing solutions overview; primarily conceptual and marketing, without explicit expert configuration or troubleshooting details. | | [Logistics, supply chain, retail, and e-commerce](https://learn.microsoft.com/en-us/azure/documentdb/solutions-retail) | 0.20 | Industry solution marketing/overview for retail and supply chain; no detailed limits, configs, or troubleshooting content indicated. | +| [Manage an Azure Cosmos DB account](https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-manage-database-account) | 0.20 | Page is a general management/how-to guide for Azure Cosmos DB accounts via portal/CLI/ARM. The summary and URL indicate step-by-step management operations, not detailed limits, configuration parameter tables, error-code-based troubleshooting, or decision matrices. The only limits reference is a link to separate control plane service limits, so this page itself does not contain the expert numerical or configuration details required by any sub-skill type. | | [Manage data using CQLSH](https://learn.microsoft.com/en-us/azure/cosmos-db/cassandra/manage-data-cqlsh) | 0.20 | Quickstart for using CQLSH is primarily a step-by-step tutorial; summary does not indicate detailed config tables, limits, or troubleshooting content. | | [MongoDB](https://learn.microsoft.com/en-us/azure/cosmos-db/mongodb/) | 0.20 | Landing/overview page for Cosmos DB for MongoDB; primarily conceptual and marketing, without detailed limits, configuration tables, or troubleshooting mappings. 
| [Monitor Azure Cosmos DB](https://learn.microsoft.com/en-us/azure/cosmos-db/monitor) | 0.20 | Overview of monitoring with Azure Monitor; appears conceptual and navigational without specific metrics tables or configuration parameters in the summary. | diff --git a/products/azure-cost-management/azure-cost-management.csv b/products/azure-cost-management/azure-cost-management.csv index f965f544..252009b6 100644 --- a/products/azure-cost-management/azure-cost-management.csv +++ b/products/azure-cost-management/azure-cost-management.csv @@ -54,7 +54,7 @@ https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/billing-s https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/cancel-azure-subscription,Cancel and delete subscription,Cancel and delete your Azure subscription - Microsoft Cost Management,Cancel and permanently delete Azure subscriptions safely,"Describes how to cancel or delete your Azure subscription, like the Free Trial subscription.","You can cancel your Azure subscription in the Azure portal if you no longer need it. Although not required, Microsoft recommends that you take the following actions before you cancel your subscription: Instead of canceling a subscription, you can remove all of its resources to prevent unwanted charges. 
Note After you cancel your last Azure subscription, you can delete it after all required conditions are met.",2026-01-25T08:00:00.000Z,conceptual,configuration,0.64,True,Provides specific pre-cancellation actions and conditions required before deletion; these are product-specific operational configuration steps around subscription lifecycle.,unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/change-azure-account-profile,Edit billing contact information,Change contact information for an Azure billing account - Microsoft Cost Management,,Describes how to change the contact information of your Azure billing account,"This article helps you update contact information for a billing account in the Azure portal. The instructions to update the contact information vary by the billing account type. To learn more about billing accounts and identify your billing account type, see View billing accounts in Azure portal. An Azure billing account is separate from your Azure user account and Microsoft account. If you want to update your Microsoft Entra user profile information, only a user administrator can make the changes. ",2026-02-27T18:31:00.000Z,how-to,,0.35,False,Explains how to change billing contact information; straightforward portal operations without expert-level limits or configuration matrices.,unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/change-credit-card,"Add, update, or delete payment method","Add, update, or delete a payment method - Microsoft Cost Management",,"This article describes how to add, update, or delete a payment method for an Azure subscription.","This article applies to customers who signed up for Azure online by using a credit card. In the Azure portal, you can change your default payment method to a new credit card and update your credit card details. You can also delete a payment method that you use to pay for an Azure subscription. 
To make these changes, you need these credentials: The supported payment methods for Azure are credit card, debit card, and wire transfer. Azure doesn't support virtual or prepaid cards. To see a complete ",2026-03-12T08:00:00.000Z,how-to,,0.2,False,"Procedural portal how-to for managing payment methods; no product-specific limits, configuration tables, error codes, or quantified guidance that meet any sub-skill criteria.",unchanged -https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/change-of-channel-partner,Initiate a Change of Channel Partner (COCP) request,Initiate a change of channel partner request - Microsoft Cost Management,Initiate and manage change of channel partner for EA,This article shows enterprise customers how to initiate a request to change channel partners via the Azure portal.,"Azure customers with an Enterprise Agreement can now initiate a change of channel partner (COCP) request through the Azure portal. This change moves the ability to initiate the COCP process away from partners and enables the customers to start the process instead. When a customer initiates a COCP, the new chosen partner receives a notification via email. The partner can either accept or decline the request. When the partner accepts, the Azure customer is notified and given the effective date whe",2026-01-30T18:17:00.000Z,article,configuration,0.62,True,"Describes COCP request flow, notifications, and effective dates; includes EA-specific operational rules for changing partners.",unchanged +https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/change-of-channel-partner,Initiate a Change of Channel Partner (COCP) request,Initiate a change of channel partner request - Microsoft Cost Management,,This article shows enterprise customers how to initiate a request to change channel partners via the Azure portal.,"Azure customers with an Enterprise Agreement can now initiate a change of channel partner (COCP) request through the Azure portal. 
This change moves the ability to initiate the COCP process away from partners and enables the customers to start the process instead. When a customer initiates a COCP, the new chosen partner receives a notification via email. The partner can either accept or decline the request. When the partner accepts, the Azure customer is notified and given the effective date whe",2026-04-20T08:00:00.000Z,article,,0.2,False,"Explains how to initiate a change of channel partner request; appears to be procedural guidance without product-specific limits, configuration parameters, or troubleshooting mappings.",updated https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/check-free-service-usage,Check usage,Monitor and track Azure free service usage - Microsoft Cost Management,Monitor Azure free service usage against quotas,Learn how to check free service usage in the Azure portal. There's no charge for services included in a free account unless you go over the service limits.,"You're not charged for services included for free with your Azure free account, unless you exceed the limits of the services. To remain within the limits, you can use the Azure portal to track the free service usage. Usage only appears in the Azure portal after you start using free resources, so if you haven't used any resources, usage isn't shown. Note that usage and status don't immediately appear. 
It's delayed for one to two days after you use a resource.",2025-12-29T08:00:00.000Z,conceptual,limits-quotas,0.65,True,"Guides tracking of free service usage to avoid exceeding limits; includes timing details (1–2 day delay) and service-specific free quotas, which are numeric constraints.",unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/classic-administrator-retire,Prepare for classic administrator roles retirement,Prepare for Azure classic administrator roles retirement - Microsoft Cost Management,,Learn about the retirement of Azure classic administrator roles and how to transition them to Azure role-based access control (RBAC) roles.,"Azure classic administrator roles retired on August 31, 2024. If your organization has any active Co-Administrator or Service Administrator roles, you need to transition them to Azure role-based access control (RBAC) roles by then. Azure Service Manager and all Azure classic resources also retire on that date.",2025-12-29T08:00:00.000Z,conceptual,,0.3,False,"Primarily an announcement/transition overview for classic administrator role retirement; likely conceptual with high-level guidance, not detailed RBAC role tables or config parameters.",unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/cloud-subscription,What is a cloud subscription,What is a cloud subscription? - Microsoft Cost Management,,"Learn about cloud subscriptions, how they help manage Microsoft products and services, and the benefits of organizing resources with multiple subscriptions.","A cloud subscription is a way to manage the products and services that you buy from Microsoft. A cloud subscription is created when you acquire various Azure resources and other products including Microsoft Azure Consumption Commitment, Enterprise Support, and Microsoft AI Cloud Partner Program. 
Over time, cloud subscription scope will expand to include more Microsoft products and services including device or user license offers like Microsoft 365. Key aspects of a cloud subscription:",2025-12-29T23:14:00.000Z,concept-article,,0.1,False,"Conceptual explanation of what a cloud subscription is and its benefits; no detailed numeric limits, configs, or troubleshooting mappings.",unchanged @@ -65,19 +65,19 @@ https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/create-fr https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/create-subscription,Create an MCA subscription,Create a Microsoft Customer Agreement subscription - Azure Cost Management + Billing,,Learn how to add a new Microsoft Customer Agreement subscription in the Azure portal. See information about billing account forms and view other available resources.,"This article helps you create a Microsoft Customer Agreement subscription for yourself or for someone else in your current Microsoft Entra directory/tenant. You may want another subscription to avoid hitting subscription quota limits, to create separate environments for security, or to isolate data for compliance reasons. If you want to create a Microsoft Customer Agreement subscription in a different Microsoft Entra tenant, see Create an MCA subscription request. If you want to create subscription",2026-02-11T18:17:00.000Z,conceptual,,0.4,False,Step-by-step portal tutorial to create an MCA subscription; mostly procedural UI guidance without detailed configuration parameter tables or numeric limits.,unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/create-subscription-request,Create an MCA subscription request,Create a Microsoft Customer Agreement subscription request - Azure Cost Management + Billing,,Learn how to create an Azure subscription request in the Azure portal. 
See information about billing account forms and view other available resources.,"This article helps you create a Microsoft Customer Agreement subscription for someone else that's in a different Microsoft Entra directory/tenant. After the request is created, the recipient accepts the subscription request. You might want another subscription to avoid hitting subscription quota limits, to create separate environments for security, or to isolate data for compliance reasons. If you instead want to create a subscription for yourself or for someone else in your current Microsoft Entr",2026-01-06T08:00:00.000Z,conceptual,,0.4,False,Tutorial for creating an MCA subscription request in another tenant; likely UI-driven without deep configuration or numeric constraints.,unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/data-transfer-fees,Data transfer fees,Azure data transfer fees - Microsoft Cost Management,Understand Azure data transfer fee rules in Europe,Describes how fees are applied to a data transfer,"Azure now offers at-cost transfer of data for customers and CSP partners in Europe transferring data via the internet from Azure to another data processing service provider. This applies to scenarios where multiple services of different providers are used in parallel, in an interoperable manner. 
Use the following steps to submit a request if you're transferring data in this manner.",2025-09-14T22:14:00.000Z,conceptual,limits-quotas,0.7,True,"Describes at-cost data transfer scenarios and eligibility; likely includes specific conditions and possibly quantitative fee rules for cross-provider transfers, which are pricing/limit details not generally known.",unchanged -https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/direct-ea-administration,EA billing administration,EA Billing administration on the Azure portal - Microsoft Cost Management,Perform common EA billing administration tasks in Azure portal,This article explains the common tasks that an enterprise administrator accomplishes in the Azure portal.,"This article explains the common tasks that an Enterprise Agreement (EA) administrator accomplishes in theAzure portal. A direct enterprise agreement is signed between Microsoft and an enterprise agreement customer. Conversely, an indirect EA is one where a customer signs an agreement with a Microsoft partner. 
This article is applicable for both direct and indirect EA customers.",2025-09-23T08:00:00.000Z,conceptual,configuration,0.66,True,"Explains concrete EA billing operations (for example, managing accounts, departments, cost centers) in the portal; contains EA-specific operational configuration steps.",unchanged -https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/direct-ea-azure-usage-charges-invoices,View and download EA usage details,View your Azure usage summary details and download reports for EA enrollments - Microsoft Cost Management,,"This article explains how enterprise administrators of direct and indirect Enterprise Agreement (EA) enrollments can view a summary of their usage data, Azure Prepayment consumed, and charges associat","This article explains how partner administrators of indirect enrollments and enterprise administrators of direct and indirect Enterprise Agreement (EA) enrollments can view a summary of their usage data, Azure Prepayment consumed, and charges associated with other usage in the Azure portal. Charges are presented at the summary level across all accounts and subscriptions of the enrollment. Check out theEA admin manage consumption and invoicesvideo. It's part of theEnterprise Customer Billing Expe",2025-09-22T08:00:00.000Z,how-to,,0.4,False,Explains how EA admins view usage summaries and reports; largely portal navigation and role requirements without deep product-specific configs or limits.,unchanged -https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/direct-ea-billing-invoice-documents,Direct EA billing invoice documents,Direct EA billing invoice documents - Microsoft Cost Management,Understand timing of direct EA invoice document availability,Learn how to understand the invoice files associated with your direct enterprise agreement.,"This article helps you understand the invoice files associated with your direct enterprise agreement. 
For information about downloading the files, seeView your Azure usage summary details and download reports for direct EA enrollments. All documents are available between the 12th- 15th day of each month. However, when an invoice is unusually large, availability might be delayed.",2025-09-23T08:00:00.000Z,conceptual,limits-quotas,0.7,True,Specifies that documents are available between specific days of the month and notes delays for unusually large invoices; these are concrete timing constraints and conditions that qualify as limits/quotas.,unchanged
+https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/direct-ea-administration,EA billing administration,EA Billing administration on the Azure portal - Microsoft Cost Management,,This article explains the common tasks that an enterprise administrator accomplishes in the Azure portal.,"This article explains the common tasks that an Enterprise Agreement (EA) administrator accomplishes in the Azure portal. A direct enterprise agreement is signed between Microsoft and an enterprise agreement customer. Conversely, an indirect EA is one where a customer signs an agreement with a Microsoft partner. 
This article is applicable for both direct and indirect EA customers.",2026-04-21T08:00:00.000Z,how-to,,0.3,False,"Covers common EA billing admin tasks in the portal; likely step-by-step UI/process content without detailed configuration tables, limits, or security role definitions beyond what is in the dedicated roles article.",updated
+https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/direct-ea-azure-usage-charges-invoices,View and download EA usage details,View your Azure usage summary details and download reports for EA enrollments - Microsoft Cost Management,,"This article explains how enterprise administrators of direct and indirect Enterprise Agreement (EA) enrollments can view a summary of their usage data, Azure Prepayment consumed, and charges associat","This article explains how partner administrators of indirect enrollments and enterprise administrators of direct and indirect Enterprise Agreement (EA) enrollments can view a summary of their usage data, Azure Prepayment consumed, and charges associated with other usage in the Azure portal. Charges are presented at the summary level across all accounts and subscriptions of the enrollment. Check out the EA admin manage consumption and invoices video. It's part of the Enterprise Customer Billing Expe",2026-04-20T08:00:00.000Z,how-to,,0.4,False,"Primarily a how-to for viewing usage summaries and downloading reports in the portal. It’s focused on navigation and UI steps for EA admins rather than exposing configuration tables, limits, or decision matrices. 
No strong indication of product-specific parameters, quotas, or error codes.",updated
+https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/direct-ea-billing-invoice-documents,Direct EA billing invoice documents,Direct EA billing invoice documents - Microsoft Cost Management,,Learn how to understand the invoice files associated with your direct enterprise agreement.,"This article helps you understand the invoice files associated with your direct enterprise agreement. For information about downloading the files, see View your Azure usage summary details and download reports for direct EA enrollments. All documents are available between the 12th–15th day of each month. However, when an invoice is unusually large, availability might be delayed.",2026-04-20T08:00:00.000Z,reference,,0.3,False,"Describes timing and nature of invoice document availability for direct EA; while it mentions a general 12th–15th day window, it lacks detailed limits, configuration options, or structured decision/troubleshooting content required for any sub-skill type.",updated
https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/discover-cloud-footprint,Discover cloud footprint FAQ,Discover your Microsoft cloud footprint FAQ - Microsoft Cost Management,,This article helps to answer frequently asked questions that customers have about their Microsoft cloud footprint.,"This article helps to answer frequently asked questions that customers have about their Microsoft cloud footprint. Your cloud footprint commonly includes but isn’t limited to, legal entities, billing accounts, and billing profiles, tenants, subscriptions, and so on. 
This article provides links to other articles that you can use to determine your entire Microsoft cloud footprint.",2025-12-29T08:00:00.000Z,concept-article,,0.3,False,"Cloud footprint FAQ mainly links to other articles and explains concepts; not focused on limits, configs, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/download-azure-invoice-daily-usage-date,Download your invoice,Download Azure billing invoice - Microsoft Cost Management,,Describes how to download or view your Azure billing invoice.,"For most subscriptions, you can download your invoice from theAzure portalor have it sent in email. Azure EA customers can download their organization's invoices using the information atDownload or view your Azure billing invoice. Only certain roles have permission to get billing invoice, like the Account Administrator or Enterprise Administrator. To learn more about getting access to billing information, seeManage access to Azure billing using roles. 
If you have a Microsoft Customer Agreement, ",2026-01-14T08:00:00.000Z,how-to,,0.35,False,"Another invoice download/how-to article with role prerequisites; no deep technical configuration, limits, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-azure-marketplace,Azure Marketplace,Azure Marketplace - Microsoft Cost Management,,Describes how EA customers can use Azure Marketplace.,This article explains how EA customers and partners can view marketplace charges and enable Azure Marketplace purchases.,2025-09-23T08:00:00.000Z,conceptual,,0.45,False,Explains how EA customers view marketplace charges and enable purchases; mostly portal steps and conceptual billing behavior without detailed limits or configs.,unchanged -https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-billing-administration-partners,Azure portal administration for partners,EA billing administration for partners in the Azure portal - Microsoft Cost Management,Manage indirect EA billing as a partner in Azure portal,This article explains the common tasks that a partner administrator accomplishes in the Azure portal to manage indirect enterprise agreements.,This article explains the common tasks that a partner administrator accomplishes in the Azure portalhttps://portal.azure.comto manage indirect EAs. An indirect EA is one where a customer signs an agreement with a Microsoft partner. The partner administrator manages their indirect EAs on behalf of their customers. 
You can watch theEA Billing administration in the Azure portal for Partnersseries of videos on YouTube.,2025-09-24T08:00:00.000Z,conceptual,configuration,0.66,True,"Partner-focused EA billing administration; includes specific tasks and settings for managing indirect EA customers, which are product-specific operational details.",unchanged -https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-direct-portal-get-started,Get started with EA billing in the Azure portal,Get started with your Enterprise Agreement billing account - Microsoft Cost Management,,This article explains how Azure Enterprise Agreement (Azure EA) customers can use the Azure portal to manage their billing.,"Note On February 15, 2024, theEA portalretired. It's now read only. All EA customers and partners use Cost Management + Billing in the Azure portal to manage their enrollments. This article helps direct and indirect Azure Enterprise Agreement (Azure EA) customers with their billing administration on theAzure portal. Get basic information about: We have several videos that walk you through getting started with the Azure portal for Enterprise Agreements. 
Check out the series atEnterprise Customer ",2025-09-23T08:00:00.000Z,conceptual,,0.35,False,"Explains how EA customers use the Azure portal for billing; primarily procedural onboarding, not detailed configuration or limits.",unchanged -https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-portal-agreements,Azure EA agreements and amendments,Azure EA agreements and amendments - Microsoft Cost Management,,"The article describes how Azure EA agreements and amendments might affect your access, use, and payments for Azure services.","The article describes how Azure EA agreements and amendments might affect your access, use, and payments for Azure services.",2025-10-13T08:00:00.000Z,conceptual,,0.3,False,"Agreement/amendment effects on EA access and payments are contractual and procedural, not detailed technical limits, configs, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-portal-vm-reservations,Azure EA VM reserved instances,Azure EA VM reserved instances - Microsoft Cost Management,Use EA VM reservations to optimize Azure costs,This article summaries how Azure reservations for VM reserved instances can help you save your money with your enterprise enrollment.,"This article summaries how Azure reservations for VM reserved instances can help you save your money with your enterprise enrollment. 
For more information about reservations, seeWhat are Azure Reservations?",2025-10-13T08:00:00.000Z,conceptual,decision-making,0.6,True,"Describes how EA VM reserved instances affect savings and enrollment usage; while cost-focused, it provides EA-specific guidance on when and how to use reservations for cost decisions.",unchanged
+https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-azure-marketplace,Azure Marketplace,Azure Marketplace - Microsoft Cost Management,Manage Azure Marketplace usage under EA enrollments,Describes how EA customers can use Azure Marketplace.,This article explains how EA customers and partners can view marketplace charges and enable Azure Marketplace purchases.,2026-04-20T08:00:00.000Z,how-to,decision-making,0.7,True,"Explains how EA customers and partners can view and enable Marketplace charges, which usually involves EA-specific rules, scopes, and behaviors for Marketplace billing. These details guide decisions on enabling Marketplace purchases and understanding their cost impact within EA, which is specialized decision-making content.",updated
+https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-billing-administration-partners,Azure portal administration for partners,EA billing administration for partners in the Azure portal - Microsoft Cost Management,,This article explains the common tasks that a partner administrator accomplishes in the Azure portal to manage indirect enterprise agreements.,This article explains the common tasks that a partner administrator accomplishes in the Azure portal https://portal.azure.com to manage indirect EAs. An indirect EA is one where a customer signs an agreement with a Microsoft partner. The partner administrator manages their indirect EAs on behalf of their customers. 
You can watch the EA Billing administration in the Azure portal for Partners series of videos on YouTube.,2026-04-20T08:00:00.000Z,how-to,,0.3,False,"Describes partner EA billing administration tasks; appears to be procedural and navigational without explicit limits, configuration parameter tables, or detailed security role definitions.",updated
+https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-direct-portal-get-started,Get started with EA billing in the Azure portal,Get started with your Enterprise Agreement billing account - Microsoft Cost Management,,This article explains how Azure Enterprise Agreement (Azure EA) customers can use the Azure portal to manage their billing.,"Note On February 15, 2024, the EA portal retired. It's now read only. All EA customers and partners use Cost Management + Billing in the Azure portal to manage their enrollments. This article helps direct and indirect Azure Enterprise Agreement (Azure EA) customers with their billing administration on the Azure portal. Get basic information about: We have several videos that walk you through getting started with the Azure portal for Enterprise Agreements. 
Check out the series at Enterprise Customer ",2026-04-21T08:00:00.000Z,get-started,,0.2,False,"Getting-started administration article for EA billing in the Azure portal; primarily navigation and conceptual management guidance, no detailed limits, configuration tables, or decision matrices.",updated
+https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-portal-agreements,Azure EA agreements and amendments,Azure EA agreements and amendments - Microsoft Cost Management,Understand how Azure EA agreements affect billing,"The article describes how Azure EA agreements and amendments might affect your access, use, and payments for Azure services.","The article describes how Azure EA agreements and amendments might affect your access, use, and payments for Azure services.",2026-04-20T08:00:00.000Z,concept-article,decision-making,0.7,True,"EA agreements and amendments documentation typically contains detailed, contract-specific rules about access, usage, and payment behavior that are not inferable from general training data. These rules guide customers on how agreement structure impacts billing and access, which is decision-focused (how changes to agreements affect what you can or should do). While not limits/quotas, it is specialized guidance for interpreting EA constructs and making enrollment-related decisions.",updated
+https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-portal-vm-reservations,Azure EA VM reserved instances,Azure EA VM reserved instances - Microsoft Cost Management,Use Azure EA VM reservations to optimize costs,This article summarizes how Azure reservations for VM reserved instances can help you save your money with your enterprise enrollment.,"This article summarizes how Azure reservations for VM reserved instances can help you save your money with your enterprise enrollment. 
For more information about reservations, see What are Azure Reservations?",2026-04-20T08:00:00.000Z,concept-article,decision-making,0.65,True,"Guidance on EA VM reserved instances for enterprise enrollments typically includes EA-specific behaviors, eligibility, and cost implications that inform whether and how to purchase reservations under EA. This is specialized cost-optimization and enrollment-specific decision guidance rather than generic reservations info.",updated
https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-pricing,View and download your organization's pricing,View and download your organization's Azure pricing - Microsoft Cost Management,,Learn how to view and download pricing or estimate costs with your organization's pricing.,"Azure customers with an Azure Enterprise Agreement (EA), Microsoft Customer Agreement (MCA), or Microsoft Partner Agreement (MPA) can view and download their pricing in the Azure portal.Learn how to check your billing account type. Price sheets are available for download covering up to 13 months prior.",2025-07-31T08:00:00.000Z,conceptual,,0.4,False,Shows how to view/download organization pricing; mostly portal navigation and account-type distinctions without detailed decision matrices or limits.,unchanged
-https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-pricing-overview,Azure EA pricing,Azure EA pricing - Microsoft Cost Management,Interpret Azure EA pricing and usage calculations,This article provides a pricing overview for Azure enterprise customers.,This article provides a pricing overview for Azure enterprise customers. It also provides details on how usage is calculated. 
It answers many frequently asked questions about charges for various Azure services in an Enterprise Agreement where the calculations are more complex.,2025-09-23T08:00:00.000Z,conceptual,decision-making,0.65,True,EA pricing overview and detailed usage calculation rules for various services are contract- and platform-specific and help decide cost expectations; this is specialized billing decision guidance not derivable from generic knowledge.,unchanged -https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-transfers,Transfer EA accounts,Transfer Azure Enterprise enrollment accounts and subscriptions - Microsoft Cost Management,Transfer Azure Enterprise enrollment accounts and subscriptions,Describes how Azure Enterprise enrollment accounts and subscriptions are transferred.,This article provides an overview of enterprise transfers.,2025-09-23T08:00:00.000Z,conceptual,configuration,0.6,True,"Overview of enterprise transfers; even if high-level, it covers EA-specific transfer rules and constraints that are not generic knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-pricing-overview,Azure EA pricing,Azure EA pricing - Microsoft Cost Management,Interpret Azure EA pricing and usage calculations,This article provides a pricing overview for Azure enterprise customers.,This article provides a pricing overview for Azure enterprise customers. It also provides details on how usage is calculated. It answers many frequently asked questions about charges for various Azure services in an Enterprise Agreement where the calculations are more complex.,2026-04-21T08:00:00.000Z,concept-article,decision-making,0.8,True,"EA pricing overview for enterprise customers usually includes detailed, contract-specific pricing rules, how usage is calculated, and FAQs about complex charge scenarios. 
These are expert billing rules not derivable from generic knowledge and help customers decide and reason about costs under EA, fitting decision-making around pricing and usage behavior.",updated +https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-transfers,Transfer EA accounts,Transfer Azure Enterprise enrollment accounts and subscriptions - Microsoft Cost Management,,Describes how Azure Enterprise enrollment accounts and subscriptions are transferred.,This article provides an overview of enterprise transfers.,2026-04-21T08:00:00.000Z,concept-article,,0.2,False,"Describes process/overview for transferring EA accounts and subscriptions; no detailed limits, configuration tables, error codes, or decision matrices with quantified criteria.",updated https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-understand-pricesheet,Enterprise agreement price sheet terms,Terms in your Enterprise Agreement price sheet - Azure - Microsoft Cost Management,,Learn how to read and understand your usage and bill for an Enterprise Agreement.,"This article applies to an Azure billing account for an Enterprise Agreement (EA).Check if you have access to Enterprise Agreement. Depending on the policies set for your organization by the Enterprise Admin, only certain administrative roles provide access to your organization's EA pricing information. For more information, seeUnderstand Azure Enterprise Agreement administrative roles in Azure. 
For billing periods January 2023 onwards, a new version of the Azure price sheet is available for dow",2026-01-14T08:00:00.000Z,conceptual,,0.45,False,"Describes terms in EA price sheets and access roles; terminology and role-based access, but not detailed numeric limits or configuration parameter tables.",unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/elevate-access-global-admin,Elevate access,Elevate access to manage billing accounts - Microsoft Cost Management,Elevate Global Administrator access to Azure billing accounts,Describes how to elevate access for a Global Administrator to manage billing accounts using the Azure portal or REST API.,"As a Global Administrator in Microsoft Entra ID, you might not have access to all billing accounts in your directory. This article describes the ways that you can elevate your access to all billing accounts. Elevating your access to manage all billing accounts gives you the ability to view and manage cost and billing for your accounts. You can view invoices, charges, products that are purchased, and the users that have access to the billing accounts. If you want to elevate your access to manage ",2026-01-14T08:00:00.000Z,how-to,security,0.8,True,"Describes how a Global Administrator can elevate access to all billing accounts via portal or REST; includes security role behavior and elevation configuration, fitting security.",unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/enable-marketplace-purchases,Enable Azure Marketplace purchases,Enable Marketplace Purchases in Azure - Microsoft Cost Management,Configure Azure Marketplace and private offer purchase policies,This article shows you how to enable Marketplace private offer purchases.,"You can use Microsoft Marketplace in the Azure portal to buy non-Microsoft software to use with Azure. To use Marketplace, you need to set up and configure Marketplace policy settings. 
Then, you assign the required user access permissions to billing accounts and subscriptions. This article explains how to set up and enable Marketplace purchases, and specifically how to enable Marketplace private offer purchases. To enable Marketplace private offer purchase, first:",2026-01-30T18:17:00.000Z,conceptual,configuration,0.76,True,"Shows how to set Marketplace policy settings and assign permissions; likely includes specific policy names, allowed values, and RBAC roles, which are configuration and security details.",unchanged @@ -122,13 +122,13 @@ https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/subscript https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/supported-payment-methods,Supported payment methods,Supported payment methods - Microsoft Cost Management,,"Learn about the payment methods that are supported for Azure subscriptions, based on your country or region.","This article helps determine you which credit and debit cards are supported for Azure subscriptions, based on your country or region. To pay for your Azure subscription, you can use a credit or debit card if you have a Microsoft Customer Agreement (MCA) or Microsoft Online Services Program (MOSP) account. The following cards are supported for Azure subscriptions, based on your country or region. For more information about payment methods, seeAdd, update, or remove a payment method. 
Note Prepaid ",2026-03-12T08:00:00.000Z,reference,,0.3,False,"Lists supported payment methods by region but primarily as descriptive/eligibility info; lacks detailed numeric limits, configuration parameters, or decision matrices required for any sub-skill type.",unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/switch-azure-offer,Switch subscription offer,Change Azure subscription offer - Microsoft Cost Management,Choose and switch between Azure subscription offers,Learn about how to change your Azure subscription and switch to a different offer.,"As a customer with apay-as-you-go subscription, you can switch your Azure subscription to another offer in the Azure portal. For example, you can use this feature to take advantage of themonthly credits for Visual Studio subscribers. If you have an expired Visual Studio subscription, you can switch to apay-as-you-go subscription. Just want to upgrade from Free Trial?Seeupgrade your subscription.",2025-12-29T08:00:00.000Z,how-to,decision-making,0.63,True,"Guidance on changing from pay-as-you-go to other offers (for example, Visual Studio credits, Free Trial upgrade); likely includes offer-specific eligibility and constraints to help decide which offer to select.",unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/transfer-subscriptions-subscribers-csp,Transfer CSP subscriptions,Transfer Azure subscriptions between subscribers and Cloud Solution Providers - Microsoft Cost Management,,Learn how you can transfer Azure subscriptions between subscribers and Cloud Solution Providers (CSPs).,This article provides high-level steps used to transfer Azure subscriptions to and from CSP partners and their customers. This information is intended for the Azure subscriber to help them coordinate with their partner. Information that Microsoft partners use for the transfer process is documented atTransfer subscriptions under an Azure plan from one partner to another. 
Download or export cost and billing information that you want to keep before you start a transfer request. Billing and utilizat,2026-03-30T17:14:00.000Z,how-to,,0.3,False,"Provides high-level steps to transfer subscriptions between subscribers and CSPs without clear indication of detailed matrices, constraints, or configuration parameters; appears more like a procedural overview than expert decision or configuration reference.",unchanged -https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/understand-ea-roles,EA roles in Azure,Understand admin roles for Enterprise Agreements in Azure - Microsoft Cost Management,Understand and assign Azure Enterprise Agreement admin roles,"Learn about the administrative roles available to manage Azure Enterprise Agreements (EA), including permissions and how to assign them.","To help manage your organization's usage and spend, Azure customers with an Enterprise Agreement can assign the following six distinct administrative roles. ¹ The Bill-To contact on the direct EA contract gets this role. ² The Bill-To contact can't be added or changed in the Azure portal. It gets added to the EA enrollment based on the user who is set up as the Bill-To contact on agreement level. 
To change the Bill-To contact, a request needs to be made through a partner/software advisor to the ",2025-09-23T08:00:00.000Z,concept-article,security,0.82,True,Lists six EA admin roles with their permissions and assignment rules; these are specific RBAC-like role definitions and governance details.,unchanged +https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/understand-ea-roles,EA roles in Azure,Understand admin roles for Enterprise Agreements in Azure - Microsoft Cost Management,Manage Azure EA administrative roles and permissions,"Learn about the administrative roles available to manage Azure Enterprise Agreements (EA), including permissions and how to assign them.","To help manage your organization's usage and spend, Azure customers with an Enterprise Agreement can assign the following six distinct administrative roles. ¹ The Bill-To contact on the direct EA contract gets this role. ² The Bill-To contact can't be added or changed in the Azure portal. It gets added to the EA enrollment based on the user who is set up as the Bill-To contact on agreement level. To change the Bill-To contact, a request needs to be made through a partner/software advisor to the ",2026-04-21T08:00:00.000Z,concept-article,security,0.7,True,"Defines specific EA admin roles and their permissions, including who can assign them and constraints around the Bill-To contact; this is product-specific RBAC/permission information that qualifies as security-focused expert knowledge.",updated https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/understand-mca-roles,Microsoft Customer Agreements roles in Azure,Billing roles for Microsoft Customer Agreements - Azure - Microsoft Cost Management,Use MCA billing roles for access control,Learn about billing roles for billing accounts in Azure for Microsoft Customer Agreements.,"To manage your billing account for a Microsoft Customer Agreement, use the roles described in the following sections. 
These roles are in addition to the built-in roles Azure has to control access to resources. For more information, seeAzure built-in roles. This article applies to a billing account for a Microsoft Customer Agreement.Check if you have access to a Microsoft Customer Agreement. Watch theManage access to your MCA billing accountvideo to learn how you can control access to your Micros",2026-02-10T08:00:00.000Z,how-to,security,0.85,True,Defines specific billing roles for Microsoft Customer Agreements and how they relate to Azure built-in roles; detailed RBAC-like role definitions are security configuration.,unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/upgrade-azure-subscription,Upgrade free account,Upgrade your Azure account - Microsoft Cost Management,Choose and perform Azure account upgrades,Learn how to upgrade your Azure free or Azure for Students Starter account. See additional information about Azure support plans.,"You can upgrade yourAzure free accounttopay-as-you-go ratesin the Azure portal. If you have anAzure for Students Starter accountand are eligible for anAzure free account, you can upgrade to it to anAzure free account. You get $200 Azure credit in your billing currency and 12 months of free services on upgrade. If you don't qualify for a free account, you can upgrade topay-as-you-go rateswith asupport request. If you have anAzure for Studentsaccount, you can upgrade topay-as-you-go rates. Note If",2026-01-14T08:00:00.000Z,conceptual,decision-making,0.65,True,"Explains upgrade paths between free, Students, and pay-as-you-go, including when you get $200 credit and 12 months of free services; supports concrete billing-plan selection decisions.",unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/view-all-accounts,View billing accounts,View your billing accounts in Azure portal - Microsoft Cost Management,,"Learn how to view your billing accounts in the Azure portal. 
See scope information for Enterprise, Microsoft Customer, and Microsoft Partner Agreements.","A billing account is created when you sign up to use Azure. You use your billing account to manage your invoices, payments, and track costs. You can have access to multiple billing accounts. For example, you signed up for Azure for your personal projects. You could also have access through your organization's Enterprise Agreement or Microsoft Customer Agreement. For each of these scenarios, you would have a separate billing account. The Azure portal supports the following types of billing accounts",2026-01-20T08:00:00.000Z,conceptual,,0.3,False,"Shows how to view billing accounts and scopes; mostly navigation and conceptual explanation of account types, not detailed configuration or limits.",unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/view-payment-history,View payment history,View payment history - Microsoft Cost Management,,This article describes how to view your payment history for a Microsoft Customer Agreement.,The article explains how you can view your payment history in the Azure portal. This article applies to customers who have the following Azure account types: A Microsoft Customer Agreement purchased directly through Azure.com. A Microsoft Customer Agreement purchased through a Microsoft representative. A Microsoft Customer Agreement purchased through a Microsoft partner.,2026-03-08T08:00:00.000Z,how-to,,0.1,False,"Simple guidance on viewing payment history in the Azure portal; no specific limits, configuration parameters, or technical troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/withholding-tax-credit-india,Request Withholding Tax credit - India,Request a credit for Withholding Tax on your account (India customers) - Azure - Microsoft Cost Management,,Learn how to request a credit on your account for Withholding Tax you paid. 
This article only applies to customers in India.,"Customers in India receive Web Direct (Azure and Microsoft 365) invoices billed by Microsoft Regional Sales Pte Ltd. Singapore (MRS) and make cross-border payments to Singapore to settle the invoice. If you withheld taxes when remitting the payment, this article explains the process for claiming a credit for the Withholding Tax (WHT) in your account with MRS.",2025-11-04T08:00:00.000Z,conceptual,,0.4,False,"Explains process for requesting withholding tax credit in India; procedural finance/tax workflow rather than technical limits, configuration, or troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/cost-management-billing/microsoft-customer-agreement/checklist-microsoft-customer-agreement-billing-migration,Checklist for billing migration,Checklist Microsoft Customer Agreement Billing Migration - Microsoft Cost Management,Prepare subscriptions for Microsoft Customer Agreement billing migration,This guide helps customers who sign a Microsoft Customer Agreement prepare their existing subscriptions for a billing migration.,"This article provides early guidance so customers can understand migration impact, confirm readiness, and prepare the required billing configuration for a smooth transition to the Microsoft Customer Agreement (MCA).",2026-04-15T11:11:00.000Z,article,decision-making,0.68,True,"The checklist provides concrete, prescriptive guidance for assessing readiness and planning a migration from existing billing to Microsoft Customer Agreement. It focuses on what to verify and configure before migration, helping users decide how and when to transition subscriptions and billing constructs. 
While it doesn't emphasize numeric limits, it is migration- and decision-oriented rather than conceptual marketing content.",new +https://learn.microsoft.com/en-us/azure/cost-management-billing/microsoft-customer-agreement/checklist-microsoft-customer-agreement-billing-migration,Checklist for billing migration,Checklist Microsoft Customer Agreement Billing Migration - Microsoft Cost Management,Prepare subscriptions for Microsoft Customer Agreement billing migration,This guide helps customers who sign a Microsoft Customer Agreement prepare their existing subscriptions for a billing migration.,"This article provides early guidance so customers can understand migration impact, confirm readiness, and prepare the required billing configuration for a smooth transition to the Microsoft Customer Agreement (MCA).",2026-04-15T11:11:00.000Z,article,decision-making,0.68,True,"The checklist provides concrete, prescriptive guidance for assessing readiness and planning a migration from existing billing to Microsoft Customer Agreement. It focuses on what to verify and configure before migration, helping users decide how and when to transition subscriptions and billing constructs. While it doesn't emphasize numeric limits, it is migration- and decision-oriented rather than conceptual marketing content.",unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/microsoft-customer-agreement/manage-tenants,Manage tenants in your agreement,Manage tenants in your Microsoft Customer Agreement billing account - Azure - Microsoft Cost Management,Manage tenants and secure billing access under MCA,The article helps you understand and manage tenants associated with your Microsoft Customer Agreement billing account.,"The article helps you understand and manage tenants associated with your Microsoft Customer Agreement billing account. 
Use the information to manage tenants, transfer subscriptions, and administer billing ownership while you ensure secure access to your billing environment.",2025-09-09T22:10:00.000Z,conceptual,security,0.7,True,"Focuses on managing tenants, transferring subscriptions, and administering billing ownership while ensuring secure access. Likely includes billing roles/permissions and tenant-scoped access patterns, which are product-specific security/identity configurations.",unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/microsoft-customer-agreement/microsoft-customer-agreement-faq,FAQ,Microsoft Customer Agreement FAQ - Azure - Cost Management + Billing,,This article provides answers to frequently asked questions about the Microsoft Customer Agreement.,,2025-05-27T22:04:00Z,faq,,0.3,False,"FAQ format about MCA; typically high-level Q&A without detailed configuration parameters, limits, or error-code-based troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/microsoft-customer-agreement/microsoft-customer-agreement-get-started,Checklist after signing your agreement,Key next steps after accepting your Microsoft Customer Agreement - Azure - Microsoft Cost Management,,This article helps you get started as you begin to manage Azure billing and subscriptions under your new Microsoft Customer Agreement.,This article helps you get started as you begin to manage Azure billing and subscriptions under your new Microsoft Customer Agreement. 
Some of the benefits under the agreement include:,2025-01-07T08:00:00.000Z,conceptual,,0.4,False,"High-level 'getting started' after accepting MCA, likely benefits and basic steps without detailed configuration tables or quantified trade-offs.",unchanged @@ -251,7 +251,7 @@ https://learn.microsoft.com/en-us/azure/cost-management-billing/troubleshoot-sub https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/billing-meter-id-updates,Billing meter ID updates,Azure billing meter ID updates - Microsoft Cost Management,Assess impact of Azure billing meter ID updates,Learn about how Azure billing meter ID updates might affect your Azure consumption and billing.,"On March 1, 2024, some Azure billing meters were updated for improved meter ID metadata. More updates are underway. A billing meter is used to determine the cost of using a specific service or resource in Azure. It helps calculate the amount you're charged based on the quantity of the resource consumed. The billing meter varies depending on the type of service or resource used. The meter ID updates result in having only individual unique meters. A unique meter means that every Azure service, res",2025-04-28T08:00:00.000Z,conceptual,decision-making,0.6,True,Details how meter ID metadata changes and unique meter behavior affect consumption and billing; specialized billing rules that guide how to interpret charges over time.,unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/billing-meter-location,Shared billing meter regions,Shared billing meter regions - Microsoft Cost Management,Understand shared Azure billing meter regions,"Learn about shared billing meter regions in Azure, how they affect cost calculations, and the difference between billing meter regions and resource locations.",An Azure billing meter is a collection of information in the Azure billing system that helps define the cost of using a specific service or resource in Azure. 
It helps determine the amount you're charged based on the quantity of the resource consumed. The billing meter varies depending on the type of service or resource that you use. Some billing meters might apply to only one Azure region or location. Other billing meters are shared across multiple or all Azure regions. This difference means that,2025-05-27T08:00:00.000Z,concept-article,decision-making,0.65,True,Explains how shared billing meter regions differ from resource locations and how that affects cost calculations; this is Azure-specific billing behavior used for cost planning decisions.,unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/download-azure-daily-usage,View and download your Azure usage and charges,View and download Azure usage and charges - Microsoft Cost Management,,"Learn how to download or view your Azure daily usage and charges, and see other available resources.","You can download a daily breakdown of your Azure usage and charges in the Azure portal. Only certain roles have permission to get Azure usage information, like the Account Administrator or Enterprise Administrator. To learn more about getting access to billing information, see Manage access to Azure billing using roles. If you have a Microsoft Customer Agreement (MCA), you must be a billing profile Owner, Contributor, Reader, or Invoice manager to view your Azure usage and charges. If you have a ",2026-02-06T18:19:00.000Z,how-to,,0.45,False,Describes how to download daily usage and which roles can view it; role mapping is simple and not a detailed configuration or troubleshooting matrix.,unchanged -https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/download-azure-invoice,View and download your Azure invoice,View and download your Azure invoice - Microsoft Cost Management,,Learn how to view and download your Azure invoice. 
You can download your invoice in the Azure portal or get it sent in an email.,"You can download your invoice in the Azure portal or get it sent in email. Invoices are sent to the person set to receive invoices for the enrollment. If you're an Azure customer with an Enterprise Agreement (EA customer), only an EA administrator can download and view your organization's invoice. Direct EA administrators can Download or view their Azure billing invoice. Indirect EA administrators can use the information at Azure Enterprise enrollment invoices to download their invoice.",2026-04-01T08:00:00.000Z,how-to,,0.1,False,"Procedural guidance for viewing/downloading Azure invoices; no numeric limits, configuration tables, error codes, or decision matrices. Primarily portal navigation and role-based access notes, which are not detailed RBAC or security configuration.",updated +https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/download-azure-invoice,View and download your Azure invoice,View and download your Azure invoice - Microsoft Cost Management,,Learn how to view and download your Azure invoice. You can download your invoice in the Azure portal or get it sent in an email.,"You can download your invoice in the Azure portal or get it sent in email. Invoices are sent to the person set to receive invoices for the enrollment. If you're an Azure customer with an Enterprise Agreement (EA customer), only an EA administrator can download and view your organization's invoice. Direct EA administrators can Download or view their Azure billing invoice. Indirect EA administrators can use the information at Azure Enterprise enrollment invoices to download their invoice.",2026-04-01T08:00:00.000Z,how-to,,0.1,False,"Procedural guidance for viewing/downloading Azure invoices; no numeric limits, configuration tables, error codes, or decision matrices. 
Primarily portal navigation and role-based access notes, which are not detailed RBAC or security configuration.",unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/keep-billing-accounts-active,Keep your billing account active,Keep your Microsoft business billing account active - Microsoft Cost Management,Manage Azure billing account dormancy and retention,Learn about billing account dormancy and how you can keep and maintain an active billing account.,"If a billing account is unused for a certain amount of time, it's classified as inactive. An inactive billing account can increase potential security risks to that account and the resources it contains. To reduce this risk, Microsoft takes measures to secure, protect, and ultimately delete inactive billing accounts, tenants, and subscriptions within them. This article applies to the following agreements:",2026-02-05T08:00:00.000Z,concept-article,limits-quotas,0.7,True,"Discusses when accounts become inactive and subsequent protection/deletion measures; likely includes specific inactivity time thresholds and lifecycle timings, which are numeric policy limits not generally known.",unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/mca-download-tax-document,View and download tax documents,View tax documents for your Azure invoice - Microsoft Cost Management,,Learn how to view and download tax receipts for your billing profile.,"You can download tax documents for your Azure invoice if you have access to invoices in the Azure portal. Only certain roles have access to invoices, such as the Account Administrator. If you have a Microsoft Customer Agreement, you must be a Billing profile Owner, Contributor, Reader, or Invoice manager to download invoices and tax documents. 
If you have a Microsoft Partner Agreement, you must have the Admin Agent or billing admin role in the partner organization. Check your billing account type to s",2025-11-03T08:00:00.000Z,concept-article,,0.35,False,Explains how to view/download tax documents and required roles; procedural billing portal usage without expert-level technical detail.,unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/mca-overview,Billing accounts for Microsoft Customer Agreement,Get started with Microsoft Customer Agreement billing account - Azure - Microsoft Cost Management,,"Learn about your Microsoft Customer Agreement billing account, including billing profiles and invoice payment methods.","A billing account is created when you sign up to use Azure. You use your billing account to manage invoices, payments, and track costs. You can have access to multiple billing accounts. For example, you signed up for Azure for your personal projects. You could also have access to Azure through your organization's Enterprise Agreement or Microsoft Customer Agreement. For each of these scenarios, you would have a separate billing account. This article applies to a billing account for a Microsoft C",2025-11-18T08:00:00.000Z,conceptual,,0.35,False,"Get-started overview for MCA billing accounts; focuses on concepts like billing profiles and payment methods, not detailed limits or configuration matrices.",unchanged
You must be a billing profile owner, contributor, or invoice manager to pay in the portal.","This article applies to: If you're unsure of your billing account type, see Check the type of your account later in this article. There are two ways to pay your bill for Azure. You can pay with the default payment method of your billing profile, or you can make a one-time payment with the Pay now option. If you signed up for Azure through a Microsoft representative, your default payment method is always set to wire transfer. Automatic credit card payment isn't an option if you signed up for Azure th",2025-10-21T08:00:00.000Z,how-to,configuration,0.6,True,"Details how to pay bills via default payment method or one-time payment, with constraints like wire transfer-only scenarios; these are product-specific billing configuration rules.",unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/plan-manage-costs,Plan to manage Azure costs,Plan to manage Azure costs - Microsoft Cost Management,Plan and implement Azure cost management practices,Learn how to plan to manage Azure costs and use cost-tracking and management features for your Azure account.,"This article guides you in planning how to better manage your Azure costs. 
When you sign up for Azure, there are several steps you can take to better understand your spending: If you need to cancel your Azure subscription, see Cancel your Azure subscription.,2025-05-21T22:04:00.000Z,conceptual,best-practices,0.6,True,Guides planning to manage Azure costs with concrete steps and use of specific cost management features; product-specific cost governance recommendations qualify as best practices even if not heavily numeric.,unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/review-customer-agreement-bill,Review your Microsoft Customer Agreement bill,Review your Microsoft Customer Agreement bill - Azure - Microsoft Cost Management,,Learn how to review your bill and resource usage and to verify charges for your Microsoft Customer Agreement invoice.,"You can review the charges on your invoice by analyzing the individual transactions. In the billing account for a Microsoft Customer Agreement, an invoice is generated every month for each billing profile. The invoice includes all charges for consumption-based subscriptions from the previous month. You can view your invoices in the Azure portal and compare the charges to the usage detail file. This tutorial applies only to Azure customers with a Microsoft Customer Agreement. In this tutorial, yo",2025-11-04T08:00:00.000Z,tutorial,,0.3,False,"Guides reviewing Microsoft Customer Agreement bills; primarily procedural and conceptual without detailed limits, configs, or error mappings.",unchanged
Your organization's credit includes your Azure Prepayment (previously called monetary commitment). Azure Prepayment is the amount your organization paid upfront for usage of Azure services. You can add Azure Prepayment funds to your Enterprise Agreement by contacting your Microsoft account manager or reseller. This tutorial applies only to ,2025-09-22T08:00:00.000Z,tutorial,,0.3,False,"Explains how to read an EA bill; focused on understanding invoice and credit usage, not on numeric limits, config parameters, or troubleshooting codes.",unchanged +https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/review-enterprise-agreement-bill,Review your Enterprise Agreement bill,Review your Azure Enterprise Agreement bill - Microsoft Cost Management,,Learn how to read and understand your usage and bill for Azure Enterprise Agreements.,Azure customers with an Enterprise Agreement receive an invoice when they exceed the organization's credit or use services that aren't covered by the credit. Your organization's credit includes your Azure Prepayment (previously called monetary commitment). Azure Prepayment is the amount your organization paid upfront for usage of Azure services. You can add Azure Prepayment funds to your Enterprise Agreement by contacting your Microsoft account manager or reseller. 
This tutorial applies only to ,2026-04-20T08:00:00.000Z,tutorial,,0.2,False,"Billing explanation for Enterprise Agreement usage and credits; appears to be conceptual guidance on reading bills without product-specific limits, configuration parameters, or error-code-based troubleshooting.",updated https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/review-individual-bill,Review your individual account bill,Review your individual Azure subscription bill - Microsoft Cost Management,,"Learn how to understand your bill and resource usage and to verify charges for your individual Azure subscription, including pay-as-you-go.","This article helps you understand and review the bill for your pay-as-you-go or Visual Studio Azure subscription, including pay-as-you-go and Visual Studio. For each billing period, you normally receive an invoice in email. The invoice is a representation of your Azure bill. The same cost information on the invoice is available in the Azure portal. In this tutorial, you compare your invoice with the detailed daily usage file and with cost analysis in the Azure portal. This tutorial applies only ",2025-11-11T08:00:00.000Z,tutorial,,0.3,False,"Tutorial on reviewing an individual bill; mostly portal navigation and conceptual explanation of invoice vs usage file, not configuration, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/review-partner-agreement-bill,Review your Microsoft Partner Agreement bill,Review your Microsoft Partner Agreement invoice - Azure - Microsoft Cost Management,,Learn how to review your bill and resource usage and to verify charges for your Microsoft Partner Agreement invoice.,"In the billing account for a Microsoft Partner Agreement, an invoice is generated every month for each billing profile. The invoice includes all customer charges from the previous month. 
You can understand the charges on your invoice by analyzing the individual transactions in the Azure portal. You can also view your invoices in the Azure portal and compare the charges to the usage detail file. For more information, see how to download invoices from the Azure portal. This tutorial applies only to",2025-11-10T08:00:00.000Z,tutorial,,0.3,False,"Describes how to review Partner Agreement invoices; focuses on navigation and understanding charges, not on expert-level limits or configuration tables.",unchanged https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/understand-azure-marketplace-charges,View external Azure service charges,Understand your Azure external service charges - Microsoft Cost Management,,"Learn about billing of external services, formerly known as Marketplace, charges in Azure.","Third-party software vendors in the Azure Marketplace publish external services. For example, SendGrid is an external service that you can purchase in Azure, but Microsoft doesn't publish it. Some Microsoft products are sold through Azure Marketplace, too.",2025-11-10T08:00:00.000Z,concept-article,,0.3,False,"Conceptual explanation of external/Marketplace service charges; lacks specific limits, configs, or troubleshooting mappings.",unchanged
- decision-making: Deciding how to allocate and analyze Azure costs, choose subscriptions - and billing models, and optimize spend with reservations, savings plans, Hybrid - Benefit, and limited-time discounts. + decision-making: Guidance for choosing cost views, billing models, reservations, + savings plans, and allocation strategies, plus EA→MCA transitions and product-specific + discount/chargeback decisions. integrations: APIs, scripts, and Power BI for cost/billing analysis, plus programmatic creation and management of EA/MCA/MPA subscriptions and reservations across tenants. configuration: 'Configuring Azure Cost Management: budgets, alerts, tags, exports, @@ -29,14 +29,14 @@ category_descriptions: skill_description: Expert knowledge for Azure Cost Management development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. - Use when managing budgets/alerts, exports, Cost Management APIs, reservations/savings - plans, or Hybrid Benefit, and other Azure Cost Management related development tasks. + Use when managing budgets/alerts, tags/exports, Cost Management APIs, reservations/savings + plans, or EA→MCA billing, and other Azure Cost Management related development tasks. Not for Azure Advisor (use azure-advisor), Azure Monitor (use azure-monitor), Azure - Quotas (use azure-quotas), Azure Portal (use azure-portal). -use_when: Use when managing budgets/alerts, exports, Cost Management APIs, reservations/savings - plans, or Hybrid Benefit, and other Azure Cost Management related development tasks. + Quotas (use azure-quotas), Azure Impact Reporting (use azure-impact-reporting). +use_when: Use when managing budgets/alerts, tags/exports, Cost Management APIs, reservations/savings + plans, or EA→MCA billing, and other Azure Cost Management related development tasks. 
confusable_not_for: Not for Azure Advisor (use azure-advisor), Azure Monitor (use - azure-monitor), Azure Quotas (use azure-quotas), Azure Portal (use azure-portal). + azure-monitor), Azure Quotas (use azure-quotas), Azure Impact Reporting (use azure-impact-reporting). --- # Azure Cost Management Crawl Report @@ -45,13 +45,13 @@ confusable_not_for: Not for Azure Advisor (use azure-advisor), Azure Monitor (us - **Total Pages**: 267 - **Fetched**: 267 - **Fetch Failed**: 0 -- **Classified**: 184 -- **Unclassified**: 83 +- **Classified**: 181 +- **Unclassified**: 86 ### Incremental Update -- **New Pages**: 1 -- **Updated Pages**: 1 -- **Unchanged**: 265 +- **New Pages**: 0 +- **Updated Pages**: 13 +- **Unchanged**: 254 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-cost-management/azure-cost-management.csv` @@ -61,25 +61,45 @@ confusable_not_for: Not for Azure Advisor (use azure-advisor), Azure Monitor (us |------|-------|------------| | architecture-patterns | 1 | 0.4% | | best-practices | 7 | 2.6% | -| configuration | 38 | 14.2% | -| decision-making | 68 | 25.5% | +| configuration | 34 | 12.7% | +| decision-making | 70 | 26.2% | | deployment | 1 | 0.4% | | integrations | 13 | 4.9% | -| limits-quotas | 12 | 4.5% | +| limits-quotas | 11 | 4.1% | | security | 18 | 6.7% | | troubleshooting | 26 | 9.7% | -| *(Unclassified)* | 83 | 31.1% | +| *(Unclassified)* | 86 | 32.2% | ## Changes -### New Pages - -- [Checklist for billing migration](https://learn.microsoft.com/en-us/azure/cost-management-billing/microsoft-customer-agreement/checklist-microsoft-customer-agreement-billing-migration) - ### Updated Pages -- [View and download your Azure invoice](https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/download-azure-invoice) - - Updated: 2025-11-17T08:00:00.000Z → 2026-04-01T08:00:00.000Z +- [Azure EA agreements and 
amendments](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-portal-agreements)
+  - Updated: 2025-10-13T08:00:00.000Z → 2026-04-20T08:00:00.000Z
+- [View and download EA usage details](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/direct-ea-azure-usage-charges-invoices)
+  - Updated: 2025-09-22T08:00:00.000Z → 2026-04-20T08:00:00.000Z
+- [Azure EA pricing](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-pricing-overview)
+  - Updated: 2025-09-23T08:00:00.000Z → 2026-04-21T08:00:00.000Z
+- [Azure EA VM reserved instances](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-portal-vm-reservations)
+  - Updated: 2025-10-13T08:00:00.000Z → 2026-04-20T08:00:00.000Z
+- [Azure Marketplace](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-azure-marketplace)
+  - Updated: 2025-09-23T08:00:00.000Z → 2026-04-20T08:00:00.000Z
+- [Review your Enterprise Agreement bill](https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/review-enterprise-agreement-bill)
+  - Updated: 2025-09-22T08:00:00.000Z → 2026-04-20T08:00:00.000Z
+- [Get started with EA billing in the Azure portal](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-direct-portal-get-started)
+  - Updated: 2025-09-23T08:00:00.000Z → 2026-04-21T08:00:00.000Z
+- [Direct EA billing invoice documents](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/direct-ea-billing-invoice-documents)
+  - Updated: 2025-09-23T08:00:00.000Z → 2026-04-20T08:00:00.000Z
+- [Transfer EA accounts](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-transfers)
+  - Updated: 2025-09-23T08:00:00.000Z → 2026-04-21T08:00:00.000Z
+- [Initiate a Change of Channel Partner (COCP) request](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/change-of-channel-partner)
+  - Updated: 2026-01-30T18:17:00.000Z → 2026-04-20T08:00:00.000Z
+- [EA roles in Azure](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/understand-ea-roles)
+  - Updated: 2025-09-23T08:00:00.000Z → 2026-04-21T08:00:00.000Z
+- [EA billing administration](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/direct-ea-administration)
+  - Updated: 2025-09-23T08:00:00.000Z → 2026-04-21T08:00:00.000Z
+- [Azure portal administration for partners](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-billing-administration-partners)
+  - Updated: 2025-09-24T08:00:00.000Z → 2026-04-20T08:00:00.000Z

 ## Classified Pages

@@ -92,8 +112,8 @@ confusable_not_for: Not for Azure Advisor (use azure-advisor), Azure Monitor (us
 | [Grant RBAC access to Azure Reservations using PowerShell](https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/manage-reservations-rbac-powershell) | security | 0.85 | Shows how to assign specific RBAC roles via Azure PowerShell for reservations; includes role names and scope usage, which is expert security configuration. |
 | [Microsoft Customer Agreements roles in Azure](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/understand-mca-roles) | security | 0.85 | Defines specific billing roles for Microsoft Customer Agreements and how they relate to Azure built-in roles; detailed RBAC-like role definitions are security configuration. |
 | [Subscription access](https://learn.microsoft.com/en-us/azure/cost-management-billing/microsoft-customer-agreement/troubleshoot-subscription-access) | troubleshooting | 0.85 | Explicit troubleshooting article for subscription access after signing MCA, likely mapping common symptoms to causes and resolutions, possibly with specific error messages or role issues. |
-| [EA roles in Azure](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/understand-ea-roles) | security | 0.82 | Lists six EA admin roles with their permissions and assignment rules; these are specific RBAC-like role definitions and governance details. |
 | [Not available due to conflict error](https://learn.microsoft.com/en-us/azure/cost-management-billing/troubleshoot-subscription/troubleshoot-not-available-conflict) | troubleshooting | 0.82 | The article focuses on a specific Azure portal error message ('Not available due to conflict') when selecting a management group for reservations or savings plans and provides solutions. It is organized around a concrete error symptom and its resolution paths, which are unique to Azure Cost Management and reservation management behavior, fitting the troubleshooting category. |
+| [Azure EA pricing](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-pricing-overview) | decision-making | 0.80 | EA pricing overview for enterprise customers usually includes detailed, contract-specific pricing rules, how usage is calculated, and FAQs about complex charge scenarios. These are expert billing rules not derivable from generic knowledge and help customers decide and reason about costs under EA, fitting decision-making around pricing and usage behavior. |
 | [Can't create VM - Contact reseller](https://learn.microsoft.com/en-us/azure/cost-management-billing/troubleshoot-billing/cannot-create-vm) | troubleshooting | 0.80 | KB-style article for a specific symptom (cannot create VM as EA user) with concrete causes and resolutions; qualifies as troubleshooting with product-specific behavior. |
 | [Change administrator](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/add-change-subscription-administrator) | security | 0.80 | Describes adding/changing subscription administrators using Azure RBAC; likely enumerates built-in roles and scopes, which are concrete security configuration details. |
 | [Create EA subscriptions programmatically](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/programmatically-create-subscription-enterprise-agreement) | integrations | 0.80 | Provides concrete API versions, request schemas, CLI/PowerShell commands, and ARM template parameters for EA subscription creation; these are detailed integration patterns with configuration parameters. |
@@ -142,6 +162,8 @@ confusable_not_for: Not for Azure Advisor (use azure-advisor), Azure Monitor (us
 | [Analyze Azure costs with the Power BI template app](https://learn.microsoft.com/en-us/azure/cost-management-billing/costs/analyze-cost-data-azure-cost-management-power-bi-template-app) | integrations | 0.70 | Covers installing and using a specific Power BI app with constraints (EA-only, unsupported clouds) and likely connection parameters; fits integrations & coding patterns. |
 | [App Service](https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/prepay-app-service) | decision-making | 0.70 | Describes product- and SKU-specific reservation options and savings behavior for App Service Premium v3 and Isolated v2, used for cost decisions. |
 | [Azure Cosmos DB](https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/understand-cosmosdb-reservation-charges) | decision-making | 0.70 | Explains that reservations cover provisioned RU/s only and exclude storage, networking, and predefined containers, which is critical for cost modeling and purchase decisions. |
+| [Azure EA agreements and amendments](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-portal-agreements) | decision-making | 0.70 | EA agreements and amendments documentation typically contains detailed, contract-specific rules about access, usage, and payment behavior that are not inferable from general training data. These rules guide customers on how agreement structure impacts billing and access, which is decision-focused (how changes to agreements affect what you can or should do). While not limits/quotas, it is specialized guidance for interpreting EA constructs and making enrollment-related decisions. |
+| [Azure Marketplace](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-azure-marketplace) | decision-making | 0.70 | Explains how EA customers and partners can view and enable Marketplace charges, which usually involves EA-specific rules, scopes, and behaviors for Marketplace billing. These details guide decisions on enabling Marketplace purchases and understanding their cost impact within EA, which is specialized decision-making content. |
 | [Azure SQL Database](https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/understand-reservation-charges) | decision-making | 0.70 | Specifies which SQL Database components (compute, replicas) are covered and how discounts apply hourly, plus interaction with Azure Hybrid Benefit, enabling informed reservation decisions. |
 | [Azure SQL Managed Instance](https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/understand-reservation-charges-sql-managed-instance) | decision-making | 0.70 | Details which Managed Instance costs are covered (compute, replicas) and how they interact with Azure Hybrid Benefit, informing capacity and licensing decisions. |
 | [Azure account sign-up issues](https://learn.microsoft.com/en-us/azure/cost-management-billing/troubleshoot-subscription/troubleshoot-azure-sign-up) | troubleshooting | 0.70 | Walks through sign-up flow and common issues at each step; likely includes specific error messages and their resolutions, fitting troubleshooting criteria. |
@@ -157,7 +179,7 @@ confusable_not_for: Not for Azure Advisor (use azure-advisor), Azure Monitor (us
 | [Data transfer fees](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/data-transfer-fees) | limits-quotas | 0.70 | Describes at-cost data transfer scenarios and eligibility; likely includes specific conditions and possibly quantitative fee rules for cross-provider transfers, which are pricing/limit details not generally known. |
 | [Databricks](https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/prepay-databricks-reserved-capacity) | decision-making | 0.70 | Explains DBCU prepurchase behavior, term options, and how usage is deducted, which is product-specific cost optimization guidance. |
 | [Declined card](https://learn.microsoft.com/en-us/azure/cost-management-billing/troubleshoot-billing/troubleshoot-declined-card) | troubleshooting | 0.70 | Troubleshooting guide for declined cards at Azure sign-up or during subscription use; organized by error scenarios and resolutions, likely including specific error messages and causes, fitting the troubleshooting criteria. |
-| [Direct EA billing invoice documents](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/direct-ea-billing-invoice-documents) | limits-quotas | 0.70 | Specifies that documents are available between specific days of the month and notes delays for unusually large invoices; these are concrete timing constraints and conditions that qualify as limits/quotas. |
+| [EA roles in Azure](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/understand-ea-roles) | security | 0.70 | Defines specific EA admin roles and their permissions, including who can assign them and constraints around the Bill-To contact; this is product-specific RBAC/permission information that qualifies as security-focused expert knowledge. |
 | [Exchange and refund reservations](https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/exchange-and-refund-azure-reservations) | decision-making | 0.70 | Describes allowed exchange/refund scenarios within product families and constraints, which are specific policy and decision rules for managing reservations. |
 | [Export cost data with a Storage SAS key](https://learn.microsoft.com/en-us/azure/cost-management-billing/costs/export-cost-data-storage-account-sas-key) | configuration | 0.70 | Explains how partners export cost data to a storage account in another tenant using a SAS key. Involves product-specific configuration of SAS-based exports and billing account roles. |
 | [Find a reservation purchaser from Azure logs](https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/find-reservation-purchaser-from-logs) | troubleshooting | 0.70 | Uses Azure Monitor/directory logs and specific fields (email IDs) to diagnose who made a purchase, which is a product-specific troubleshooting pattern. |
@@ -210,15 +232,13 @@ confusable_not_for: Not for Azure Advisor (use azure-advisor), Azure Monitor (us
 | [Checklist for billing migration](https://learn.microsoft.com/en-us/azure/cost-management-billing/microsoft-customer-agreement/checklist-microsoft-customer-agreement-billing-migration) | decision-making | 0.68 | The checklist provides concrete, prescriptive guidance for assessing readiness and planning a migration from existing billing to Microsoft Customer Agreement. It focuses on what to verify and configure before migration, helping users decide how and when to transition subscriptions and billing constructs. While it doesn't emphasize numeric limits, it is migration- and decision-oriented rather than conceptual marketing content. |
 | [Markup - China](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/markup-china) | configuration | 0.68 | The article describes how to configure markup rules specifically for Microsoft Azure operated by 21Vianet, including product applicability constraints (first-party vs third-party, marketplace, seat-based) and how markup is reflected in pricing and cost experiences. This is product- and region-specific configuration behavior that is unlikely to be fully known from generic training data, and it focuses on concrete configuration of a Cost Management feature rather than conceptual billing overviews. |
 | [Transfer subscription to EA](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/mosp-ea-transfer) | configuration | 0.68 | Describes detailed steps and constraints to transfer specific offer types (for example, MS-AZR-0003P) to EA without downtime; these are product-specific operational procedures. |
-| [Azure portal administration for partners](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-billing-administration-partners) | configuration | 0.66 | Partner-focused EA billing administration; includes specific tasks and settings for managing indirect EA customers, which are product-specific operational details. |
-| [EA billing administration](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/direct-ea-administration) | configuration | 0.66 | Explains concrete EA billing operations (for example, managing accounts, departments, cost centers) in the portal; contains EA-specific operational configuration steps. |
 | [App Service](https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/reservation-discount-app-service) | decision-making | 0.65 | Explains how reservations map specifically to App Service Premium v3 and Isolated v2 instances and prerequisites like App Service Environment, informing purchase and deployment choices. |
 | [App Service - JBoss EAP Integrated Support](https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/prepay-jboss-eap-integrated-support-app-service) | decision-making | 0.65 | Covers reservation applicability and savings behavior for JBoss EAP Integrated Support on App Service, which is product-specific purchase guidance. |
 | [Apply billing tags](https://learn.microsoft.com/en-us/azure/cost-management-billing/costs/billing-tags) | configuration | 0.65 | Describes billing tags on MCA entities (billing profiles, invoice sections) and how to apply them in Cost Management. This is a product-specific tagging mechanism and configuration, beyond generic tagging concepts. |
 | [Avoid charges](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/avoid-charges-free-account) | limits-quotas | 0.65 | Describes eligibility for $200 credit for 30 days and references limited quantities of free services for 12 months, with conditions under which charges occur. This is product-specific numeric limit/entitlement behavior for the Azure free account. |
 | [Avoid unused subscriptions](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/avoid-unused-subscriptions) | security | 0.65 | Guidance to avoid automatic blocking/deletion of inactive subscriptions; likely includes concrete steps and product-specific recommendations (for example, required actions or settings) to mitigate security risk, fitting security-focused best-practice configuration. |
 | [Azure Cache for Redis](https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/understand-azure-cache-for-redis-reservation-charges) | decision-making | 0.65 | Clarifies exactly which Redis costs are covered (compute only, premium tier only) and which are not, guiding reservation purchasing and cost planning. |
-| [Azure EA pricing](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-pricing-overview) | decision-making | 0.65 | EA pricing overview and detailed usage calculation rules for various services are contract- and platform-specific and help decide cost expectations; this is specialized billing decision guidance not derivable from generic knowledge. |
+| [Azure EA VM reserved instances](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-portal-vm-reservations) | decision-making | 0.65 | Guidance on EA VM reserved instances for enterprise enrollments typically includes EA-specific behaviors, eligibility, and cost implications that inform whether and how to purchase reservations under EA. This is specialized cost-optimization and enrollment-specific decision guidance rather than generic reservations info. |
 | [Azure SQL Edge](https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/discount-sql-edge) | decision-making | 0.65 | Clarifies that reservations apply to future SQL Edge deployments on edge devices and exclude software, storage, and networking, guiding whether and how to reserve. |
 | [Azure SQL Edge](https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/prepay-sql-edge) | decision-making | 0.65 | Details how SQL Edge reservations apply only to device usage and how discounts are scoped, which is specific purchase/usage behavior. |
 | [Billing Automation Scenarios](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/cost-management-automation-scenarios) | integrations | 0.65 | Maps concrete billing/cost scenarios to specific APIs and their capabilities; contains product-specific API usage patterns and parameters, fitting integrations & coding patterns. |
@@ -253,9 +273,7 @@ confusable_not_for: Not for Azure Advisor (use azure-advisor), Azure Monitor (us
 | [EA tasks in Microsoft Customer Agreement](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/mca-enterprise-operations) | configuration | 0.64 | Describes how EA tasks translate to MCA billing account structure and operations; includes specific mappings and procedures unique to EA-to-MCA transitions. |
 | [Switch subscription offer](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/switch-azure-offer) | decision-making | 0.63 | Guidance on changing from pay-as-you-go to other offers (for example, Visual Studio credits, Free Trial upgrade); likely includes offer-specific eligibility and constraints to help decide which offer to select. |
 | [Transfer subscriptions between partners](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/azure-plan-subscription-transfer-partners) | configuration | 0.63 | Explains pre- and post-transfer considerations and steps for moving subscriptions under an Azure plan from one partner to another; contains product-specific transfer behavior and requirements. |
-| [Initiate a Change of Channel Partner (COCP) request](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/change-of-channel-partner) | configuration | 0.62 | Describes COCP request flow, notifications, and effective dates; includes EA-specific operational rules for changing partners. |
 | [Create subscriptions programmatically](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/programmatically-create-subscription) | decision-making | 0.61 | Explains options and requirements for programmatic subscription creation across agreement types and API versions; helps decide which API path to use based on agreement and constraints. |
-| [Azure EA VM reserved instances](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-portal-vm-reservations) | decision-making | 0.60 | Describes how EA VM reserved instances affect savings and enrollment usage; while cost-focused, it provides EA-specific guidance on when and how to use reservations for cost decisions. |
 | [Billing meter ID updates](https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/billing-meter-id-updates) | decision-making | 0.60 | Details how meter ID metadata changes and unique meter behavior affect consumption and billing; specialized billing rules that guide how to interpret charges over time. |
 | [Charge back savings plan costs](https://learn.microsoft.com/en-us/azure/cost-management-billing/savings-plan/charge-back-costs) | decision-making | 0.60 | Describes how to use amortized cost data and effective prices for internal chargeback. This involves product-specific cost interpretation and allocation guidance, which helps make financial/chargeback decisions based on Azure billing data. |
 | [Customize views](https://learn.microsoft.com/en-us/azure/cost-management-billing/costs/customize-cost-analysis-views) | configuration | 0.60 | Details how to customize views to understand charges and investigate changes; likely includes specific view settings and options, fitting configuration of the Cost Analysis tool. |
@@ -265,7 +283,6 @@ confusable_not_for: Not for Azure Advisor (use azure-advisor), Azure Monitor (us
 | [Retrieve large datasets with exports](https://learn.microsoft.com/en-us/azure/cost-management-billing/costs/ingest-azure-usage-at-scale) | deployment | 0.60 | Focuses on exporting large unaggregated cost datasets at scale and when to prefer exports over the Cost Details API. Contains product-specific guidance on export-based retrieval patterns, which is close to deployment/operationalization of exports. |
 | [SQL Server HADR vs Azure Hybrid Benefit](https://learn.microsoft.com/en-us/azure/cost-management-billing/scope-level/sql-server-hadr-licenses) | decision-making | 0.60 | Explains coexistence of SQL Server HADR Software Assurance benefit with centrally managed Azure Hybrid Benefit, including how DR replicas are treated for licensing. This is specialized guidance for deciding how to configure DR replicas without consuming licenses. |
 | [Set up Microsoft Customer Agreement billing account](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/mca-setup-account) | decision-making | 0.60 | Covers how billing changes when moving from Enterprise Agreement to Microsoft Customer Agreement and the steps to set up the new billing account; this is specialized guidance for choosing and transitioning billing models, aligning with decision-making and migration considerations. |
-| [Transfer EA accounts](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-transfers) | configuration | 0.60 | Overview of enterprise transfers; even if high-level, it covers EA-specific transfer rules and constraints that are not generic knowledge. |
 | [Usage data for individual subscriptions](https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/understand-reserved-instance-usage) | configuration | 0.60 | Provides product-specific guidance on using ReservationId and portal usage files to understand how discounts are applied, effectively configuring and reading billing data. |
 | [Use built-in views](https://learn.microsoft.com/en-us/azure/cost-management-billing/costs/cost-analysis-built-in-views) | decision-making | 0.60 | Explains when to use which built-in view and what insights each provides; this is product-specific decision guidance for selecting analysis patterns. |
 | [Use cost analysis for common tasks](https://learn.microsoft.com/en-us/azure/cost-management-billing/costs/cost-analysis-common-uses) | best-practices | 0.60 | Shows how to achieve common analysis tasks; provides concrete, product-specific patterns for answering typical cost questions, aligning with actionable best practices. |
@@ -278,7 +295,6 @@ confusable_not_for: Not for Azure Advisor (use azure-advisor), Azure Monitor (us
 | [View Kubernetes costs](https://learn.microsoft.com/en-us/azure/cost-management-billing/costs/view-kubernetes-costs) | 0.55 | Shows how to view AKS costs; likely focused on using specific views but not on deep configuration tables, limits, or troubleshooting. |
 | [Cost Management for partners](https://learn.microsoft.com/en-us/azure/cost-management-billing/costs/get-started-partners) | 0.50 | Get-started guide for partners; mainly describes how partners use features and enable access, not detailed configs, limits, or troubleshooting. |
 | [Understand and work with scopes](https://learn.microsoft.com/en-us/azure/cost-management-billing/costs/understand-work-scopes) | 0.50 | Conceptual explanation of scopes and permissions; while product-specific, it’s more conceptual/role-based than configuration tables or numeric limits. |
-| [Azure Marketplace](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-azure-marketplace) | 0.45 | Explains how EA customers view marketplace charges and enable purchases; mostly portal steps and conceptual billing behavior without detailed limits or configs. |
 | [Enterprise agreement price sheet terms](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-understand-pricesheet) | 0.45 | Describes terms in EA price sheets and access roles; terminology and role-based access, but not detailed numeric limits or configuration parameter tables. |
 | [Microsoft Customer Agreement invoice terms](https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/mca-understand-your-invoice) | 0.45 | Describes how to read MCA invoices and key terms; largely conceptual and explanatory without structured config tables or numeric limits. |
 | [Microsoft Customer Agreement price sheet terms](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/mca-understand-pricesheet) | 0.45 | Explains terms in the MCA price sheet; mostly terminology and access instructions, not configuration parameters or numeric service limits. |
@@ -302,7 +318,7 @@ confusable_not_for: Not for Azure Advisor (use azure-advisor), Azure Monitor (us
 | [Partner FAQ](https://learn.microsoft.com/en-us/azure/cost-management-billing/partner-faq) | 0.40 | Partner FAQ is general Q&A; description doesn’t indicate detailed limits, configuration tables, or error-code-based troubleshooting. |
 | [Request Withholding Tax credit - India](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/withholding-tax-credit-india) | 0.40 | Explains process for requesting withholding tax credit in India; procedural finance/tax workflow rather than technical limits, configuration, or troubleshooting. |
 | [Tutorial - Export data](https://learn.microsoft.com/en-us/azure/cost-management-billing/costs/tutorial-improved-exports) | 0.40 | Tutorial-style walkthrough for creating exports; likely focuses on step-by-step UI usage without detailed configuration tables, limits, or advanced patterns. |
-| [View and download EA usage details](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/direct-ea-azure-usage-charges-invoices) | 0.40 | Explains how EA admins view usage summaries and reports; largely portal navigation and role requirements without deep product-specific configs or limits. |
+| [View and download EA usage details](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/direct-ea-azure-usage-charges-invoices) | 0.40 | Primarily a how-to for viewing usage summaries and downloading reports in the portal. It’s focused on navigation and UI steps for EA admins rather than exposing configuration tables, limits, or decision matrices. No strong indication of product-specific parameters, quotas, or error codes. |
 | [View and download your organization's pricing](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-pricing) | 0.40 | Shows how to view/download organization pricing; mostly portal navigation and account-type distinctions without detailed decision matrices or limits. |
 | [View reservation transactions](https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/view-purchase-refunds) | 0.40 | Describes ways to view transactions (portal, Power BI, REST) at a conceptual level; no strong indication of detailed APIs or parameters. |
 | [View reservation utilization](https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/reservation-utilization) | 0.40 | Primarily explains how to view utilization in portal/Power BI; likely step-by-step UI usage without deep product-specific parameters or codes. |
@@ -311,17 +327,18 @@ confusable_not_for: Not for Azure Advisor (use azure-advisor), Azure Monitor (us
 | [Billing accounts for Microsoft Customer Agreement](https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/mca-overview) | 0.35 | Get-started overview for MCA billing accounts; focuses on concepts like billing profiles and payment methods, not detailed limits or configuration matrices. |
 | [Download your invoice](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/download-azure-invoice-daily-usage-date) | 0.35 | Another invoice download/how-to article with role prerequisites; no deep technical configuration, limits, or troubleshooting mappings. |
 | [Edit billing contact information](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/change-azure-account-profile) | 0.35 | Explains how to change billing contact information; straightforward portal operations without expert-level limits or configuration matrices. |
-| [Get started with EA billing in the Azure portal](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-direct-portal-get-started) | 0.35 | Explains how EA customers use the Azure portal for billing; primarily procedural onboarding, not detailed configuration or limits. |
 | [Subscription states](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/subscription-states) | 0.35 | Describes subscription lifecycle states conceptually (active, disabled, deleted) and their effects; not focused on numeric limits, configuration parameters, or error-code-based troubleshooting. |
 | [Track your Conditional Azure Credit Offer](https://learn.microsoft.com/en-us/azure/cost-management-billing/benefits/caco/track-conditional-credit-offer) | 0.35 | Explains CACO concept and milestone behavior at a high level; summary does not expose specific numeric thresholds, configuration parameters, or decision matrices. |
 | [Update tax details](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/manage-tax-information) | 0.35 | Describes updating tax details and addresses; procedural billing profile management, not technical configuration or limits. |
 | [View and download tax documents](https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/mca-download-tax-document) | 0.35 | Explains how to view/download tax documents and required roles; procedural billing portal usage without expert-level technical detail. |
-| [Azure EA agreements and amendments](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-portal-agreements) | 0.30 | Agreement/amendment effects on EA access and payments are contractual and procedural, not detailed technical limits, configs, or troubleshooting mappings. |
+| [Azure portal administration for partners](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-billing-administration-partners) | 0.30 | Describes partner EA billing administration tasks; appears to be procedural and navigational without explicit limits, configuration parameter tables, or detailed security role definitions. |
 | [Cancellation policy](https://learn.microsoft.com/en-us/azure/cost-management-billing/savings-plan/cancel-savings-plan) | 0.30 | High-level policy statement that savings plan purchases are final and non-cancellable; no detailed matrices, codes, or configuration values. |
 | [Change Entra Directory](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/subscription-change-directory) | 0.30 | Workflow/how-to for changing an Azure subscription’s Entra directory; no indication of numeric limits, configuration parameter tables, error-code-based troubleshooting, or decision matrices. Primarily procedural guidance rather than expert reference details. |
 | [Changes to your updated pay-as-you-go billing account](https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/mosp-new-customer-experience) | 0.30 | Describes updated billing experience and capabilities; more of a feature overview and walkthrough than detailed configuration or limits content. |
 | [Create EA support request](https://learn.microsoft.com/en-us/azure/cost-management-billing/troubleshoot-billing/how-to-create-azure-support-request-ea) | 0.30 | How-to for creating EA support requests; mostly portal navigation and process, no product-specific limits, configs, or detailed troubleshooting mappings. |
+| [Direct EA billing invoice documents](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/direct-ea-billing-invoice-documents) | 0.30 | Describes timing and nature of invoice document availability for direct EA; while it mentions a general 12th–15th day window, it lacks detailed limits, configuration options, or structured decision/troubleshooting content required for any sub-skill type. |
 | [Discover cloud footprint FAQ](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/discover-cloud-footprint) | 0.30 | Cloud footprint FAQ mainly links to other articles and explains concepts; not focused on limits, configs, or troubleshooting mappings. |
+| [EA billing administration](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/direct-ea-administration) | 0.30 | Covers common EA billing admin tasks in the portal; likely step-by-step UI/process content without detailed configuration tables, limits, or security role definitions beyond what is in the dedicated roles article. |
 | [Estimate costs with the Azure pricing calculator](https://learn.microsoft.com/en-us/azure/cost-management-billing/costs/pricing-calculator) | 0.30 | How-to for using the pricing calculator; mostly UI usage and planning guidance without numeric service limits, configs, or troubleshooting mappings. |
 | [FAQ](https://learn.microsoft.com/en-us/azure/cost-management-billing/microsoft-customer-agreement/microsoft-customer-agreement-faq) | 0.30 | FAQ format about MCA; typically high-level Q&A without detailed configuration parameters, limits, or error-code-based troubleshooting. |
 | [Filter and view subscriptions](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/filter-view-subscriptions) | 0.30 | Explains how to filter and view subscriptions in the portal; mostly UI navigation and basic filtering, not deep configuration or troubleshooting. |
@@ -331,7 +348,6 @@ confusable_not_for: Not for Azure Advisor (use azure-advisor), Azure Monitor (us
 | [Manage a Microsoft discount resource](https://learn.microsoft.com/en-us/azure/cost-management-billing/benefits/discounts/manage-azure-discount) | 0.30 | Describes discount resource metadata and behavior conceptually; lacks specific numeric thresholds, configuration parameters, or troubleshooting mappings. |
 | [Manage savings plan](https://learn.microsoft.com/en-us/azure/cost-management-billing/savings-plan/manage-savings-plan) | 0.30 | Primarily a how-to/manage tutorial (change scope, split plan, optimize use) without clear indication of detailed configuration tables, limits, or decision matrices. Lacks strong signals of expert-only configuration or decision logic from the summary. |
 | [Prepare for classic administrator roles retirement](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/classic-administrator-retire) | 0.30 | Primarily an announcement/transition overview for classic administrator role retirement; likely conceptual with high-level guidance, not detailed RBAC role tables or config parameters. |
-| [Review your Enterprise Agreement bill](https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/review-enterprise-agreement-bill) | 0.30 | Explains how to read an EA bill; focused on understanding invoice and credit usage, not on numeric limits, config parameters, or troubleshooting codes. |
 | [Review your Microsoft Customer Agreement bill](https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/review-customer-agreement-bill) | 0.30 | Guides reviewing Microsoft Customer Agreement bills; primarily procedural and conceptual without detailed limits, configs, or error mappings. |
 | [Review your Microsoft Partner Agreement bill](https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/review-partner-agreement-bill) | 0.30 | Describes how to review Partner Agreement invoices; focuses on navigation and understanding charges, not on expert-level limits or configuration tables. |
 | [Review your individual account bill](https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/review-individual-bill) | 0.30 | Tutorial on reviewing an individual bill; mostly portal navigation and conceptual explanation of invoice vs usage file, not configuration, limits, or error mappings. |
@@ -345,8 +361,12 @@ confusable_not_for: Not for Azure Advisor (use azure-advisor), Azure Monitor (us
 | [Find tenant ID and primary domain name](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/find-tenant-id-domain) | 0.25 | Simple how-to for finding tenant ID and primary domain; generic portal navigation without detailed configuration tables or limits. |
 | [Add, update, or delete payment method](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/change-credit-card) | 0.20 | Procedural portal how-to for managing payment methods; no product-specific limits, configuration tables, error codes, or quantified guidance that meet any sub-skill criteria. |
 | [Enable preview features](https://learn.microsoft.com/en-us/azure/cost-management-billing/costs/enable-preview-features-cost-management-labs) | 0.20 | Explains how to explore preview features; mostly portal navigation and feature list, not deep configuration or troubleshooting content. |
+| [Get started with EA billing in the Azure portal](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-direct-portal-get-started) | 0.20 | Getting-started administration article for EA billing in the Azure portal; primarily navigation and conceptual management guidance, no detailed limits, configuration tables, or decision matrices. |
+| [Initiate a Change of Channel Partner (COCP) request](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/change-of-channel-partner) | 0.20 | Explains how to initiate a change of channel partner request; appears to be procedural guidance without product-specific limits, configuration parameters, or troubleshooting mappings. |
 | [Microsoft Entra ID Free](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/microsoft-entra-id-free) | 0.20 | High-level description of Microsoft Entra ID Free as part of billing; no concrete limits, configuration parameters, roles list, or other detailed expert data. |
 | [Resolve pay-as-you-go past due balance](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/resolve-past-due-balance) | 0.20 | Primarily a procedural billing/how-to article about resolving past-due balances; no product-specific limits, configuration parameters, error-code mappings, or other expert-only technical details.
| +| [Review your Enterprise Agreement bill](https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/review-enterprise-agreement-bill) | 0.20 | Billing explanation for Enterprise Agreement usage and credits; appears to be conceptual guidance on reading bills without product-specific limits, configuration parameters, or error-code-based troubleshooting. | +| [Transfer EA accounts](https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-transfers) | 0.20 | Describes process/overview for transferring EA accounts and subscriptions; no detailed limits, configuration tables, error codes, or decision matrices with quantified criteria. | | [Understand the billing and tenant relationship](https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/understand-billing-tenant-relationship) | 0.20 | Conceptual explanation of billing–tenant relationships; no detailed configuration parameters, limits, or troubleshooting mappings. | | [What are savings plans?](https://learn.microsoft.com/en-us/azure/cost-management-billing/savings-plan/savings-plan-overview) | 0.20 | High-level billing and discount overview for savings plans without specific numeric limits, configuration parameters, error codes, or decision matrices with quantified criteria. Primarily conceptual description of what savings plans are and how they work. | | [Billing accounts for Microsoft Partner Agreement](https://learn.microsoft.com/en-us/azure/cost-management-billing/understand/mpa-overview) | 0.10 | Overview of Microsoft Partner Agreement billing accounts; primarily conceptual and procedural without product-specific limits, configuration tables, or decision matrices. 
| diff --git a/products/azure-data-explorer/azure-data-explorer.csv b/products/azure-data-explorer/azure-data-explorer.csv index ba9ff5f5..4f5ce090 100644 --- a/products/azure-data-explorer/azure-data-explorer.csv +++ b/products/azure-data-explorer/azure-data-explorer.csv @@ -2,13 +2,13 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_sk https://learn.microsoft.com/en-us/azure/data-explorer/add-cluster-connection,Add a cluster connection,Add a cluster connection in the Azure Data Explorer web UI - Azure Data Explorer,,Learn how to add cluster connections for multiple user accounts or Microsoft Entra directories in the Azure Data Explorer web UI.,"This article explains how to add a cluster connection in the Azure Data Explorer web UI. The Azure Data Explorer web UI allows you to seamlessly manage clusters from various user accounts or Microsoft Entra directories. Follow the optional steps in this article to set up connections with alternative credentials. Once connected, you can switch between clusters associated with different credentials within a unified interface, without a need to repeatedly sign in and sign out or switch directories.",2025-01-21T18:08:00.000Z,reference,,0.3,False,How to add cluster connections in web UI; likely step-by-step UI instructions rather than parameter tables.,unchanged https://learn.microsoft.com/en-us/azure/data-explorer/add-cluster-principal,Add cluster principal,Add cluster principals for Azure Data Explorer - Azure Data Explorer,Programmatically add Azure Data Explorer cluster principals,"In this article, you learn how to add cluster principals for Azure Data Explorer.","Azure Data Explorer is a fast and highly scalable data exploration service for log and telemetry data. In this article, you'll learn how to add cluster principals for Azure Data Explorer by using C#, Python, or an Azure Resource Manager (ARM) template. 
For code samples based on previous SDK versions, see the archived article.",2023-10-13T11:04:00.000Z,how-to,security,0.7,True,"Covers adding principals via C#, Python, and ARM. This is security/identity configuration with API-level details for managing access.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/add-database-principal,Add database principal,Add database principals for Azure Data Explorer - Azure Data Explorer,Programmatically add Azure Data Explorer database principals,"In this article, you learn how to add database principals for Azure Data Explorer.","Azure Data Explorer is a fast and highly scalable data exploration service for log and telemetry data. In this article, you'll learn how to add database principals for Azure Data Explorer by using C#, Python, or an Azure Resource Manager (ARM) template.",2023-10-13T11:04:00.000Z,how-to,security,0.7,True,Similar to cluster principals but at database scope; involves security roles and access configuration via SDKs/ARM.,unchanged -https://learn.microsoft.com/en-us/azure/data-explorer/add-query-visualization,Add a query visualization,Add a Query Visualization in the Web UI - Azure Data Explorer,,Learn how to add a query visualization in the Azure Data Explorer web UI.,"In this article, you learn how to create and customize visuals from query results by using the UI, such as the one found in Azure Data Explorer Dashboards. You can further manipulate these visuals and pin them in a dashboard. You don't need to rerun the query to add or modify these visuals. This feature is especially useful for heavy queries. For a full list of available visuals, see Visualization. For visuals that are only available in the web UI or dashboards, see Dashboard-specific visuals.",2026-04-13T17:02:00.000Z,how-to,,0.2,False,"How-to UI guide for creating visualizations from query results; no limits, config tables, error codes, or product-specific thresholds. 
Primarily step-by-step usage, not expert reference content.",updated +https://learn.microsoft.com/en-us/azure/data-explorer/add-query-visualization,Add a query visualization,Add a Query Visualization in the Web UI - Azure Data Explorer,,Learn how to add a query visualization in the Azure Data Explorer web UI.,"In this article, you learn how to create and customize visuals from query results by using the UI, such as the one found in Azure Data Explorer Dashboards. You can further manipulate these visuals and pin them in a dashboard. You don't need to rerun the query to add or modify these visuals. This feature is especially useful for heavy queries. For a full list of available visuals, see Visualization. For visuals that are only available in the web UI or dashboards, see Dashboard-specific visuals.",2026-04-13T17:02:00.000Z,how-to,,0.2,False,"How-to UI guide for creating visualizations from query results; no limits, config tables, error codes, or product-specific thresholds. Primarily step-by-step usage, not expert reference content.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/apache-log4j2-connector,Apache Log4J 2,Ingest data in Azure Data Explorer using the Apache log4J 2 connector - Azure Data Explorer,,Learn how to use the Apache log4J 2 connector in Azure Data Explorer.,"Log4J is a popular logging framework for Java applications maintained by the Apache Foundation. Log4J allows developers to control which log statements are output with arbitrary granularity based on the logger's name, logger level, and message pattern. Apache Log4J 2 is an upgrade to Log4J, with significant improvements over the preceding Log4j 1.x. Log4J 2 provides many of the improvements available in Logback, while fixing some inherent problems in Logback's architecture. 
The Apache log4J 2 sink",2024-02-18T12:04:00.000Z,how-to,,0.5,False,Apache log4J 2 sink usage; summary is conceptual and does not show detailed parameter tables or quotas.,unchanged https://learn.microsoft.com/en-us/azure/data-explorer/auto-stop-clusters,Automatic stop of inactive clusters,Automatic stop of inactive clusters in Azure Data Explorer - Azure Data Explorer,Understand automatic stop behavior for inactive clusters,"Learn when your cluster is due to stop running using the Automatic stop feature, and how to enable/disable the Automatic stop.",Azure Data Explorer clusters that are inactive for a specified time interval stop automatically. Inactivity means clusters with no data ingestion or queries in the past 5 days. The interval is fixed at 5 days and can't be changed. Cluster behavior doesn't resume automatically. Restart the cluster manually. Note The following cluster types aren't stopped automatically:,2026-01-08T12:04:00.000Z,how-to,limits-quotas,0.85,True,"Defines inactivity as no ingestion or queries for 5 days and states the interval is fixed and non-configurable, plus exceptions. This is a concrete time-based limit/behavior unique to the service.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/automated-deploy-overview,Automated provisioning,Automated provisioning in Azure Data Explorer - Azure Data Explorer,Automate provisioning of Azure Data Explorer environments,This article provides an overview of the different mechanisms for automating the provisioning of Azure Data Explorer environments.,"Automated provisioning is a process for quickly deploying and configuring the resources you need to run your Azure Data Explorer cluster. It's a critical part of a DevOps or DataOps workflow. The provisioning process doesn't require you to manually configure the cluster, doesn't require human intervention, and is easy to set up. 
You might use automated provisioning to deploy a preconfigured cluster with data, as part of a continuous integration and continuous delivery (CI/CD) pipeline. Some of th",2023-06-19T17:02:00.000Z,how-to,deployment,0.6,True,Overview of automated provisioning mechanisms tied to DevOps/DataOps; describes product-specific deployment options and constraints.,unchanged https://learn.microsoft.com/en-us/azure/data-explorer/azure-advisor,Use Advisor recommendations to optimize your cluster,Use Azure Advisor recommendations to optimize your Azure Data Explorer cluster - Azure Data Explorer,,This article describes Azure Advisor recommendations used to optimize your Azure Data Explorer cluster,"Azure Advisor analyzes Azure Data Explorer cluster configurations and usage telemetry, and offers personalized, actionable recommendations to help you optimize your cluster.",2026-01-08T12:04:00.000Z,how-to,,0.45,False,"Describes Azure Advisor recommendations conceptually; summary doesn’t show concrete thresholds, limits, or config matrices.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/azure-data-explorer-dashboard-share,Share dashboards,Share Azure Data Explorer dashboards - Azure Data Explorer,,Learn how to share Azure Data Explorer dashboards,"A dashboard is a collection of tiles, optionally organized in pages, where each tile has an underlying query and a visual representation. For more information on creating dashboards, see Visualize data with Azure Data Explorer dashboards. In this document, you'll learn how to grant permissions and manage permissions to share a dashboard with other users. 
Important To access the dashboard, a dashboard viewer needs the following: In general, dashboards are shared in two steps: Grant permissions, an",2025-08-28T11:06:00.000Z,how-to,,0.5,False,Covers sharing dashboards and required permissions at a high level; summary doesn’t show specific RBAC role names or scopes.,unchanged -https://learn.microsoft.com/en-us/azure/data-explorer/azure-data-explorer-dashboards,Use dashboards,Visualize data with the Azure Data Explorer dashboard - Azure Data Explorer,,Learn how to visualize data with the Azure Data Explorer dashboard,"Azure Data Explorer is a fast and highly scalable data exploration service for log and telemetry data. Explore your data from end-to-end in the Azure Data Explorer web application, starting with data ingestion, running queries, and ultimately building dashboards. A dashboard is a collection of tiles, optionally organized in pages, where each tile has an underlying query and a visual representation. Using the web UI, you can natively export Kusto Query Language (KQL) queries to a dashboard as visua",2025-08-28T11:06:00.000Z,how-to,,0.35,False,"Dashboard article appears to be a how-to for building dashboards; mostly workflow/UI, not detailed configuration matrices or error/limit references.",unchanged +https://learn.microsoft.com/en-us/azure/data-explorer/azure-data-explorer-dashboards,Use dashboards,Visualize data with the Azure Data Explorer dashboard - Azure Data Explorer,,Learn how to visualize data with the Azure Data Explorer dashboard,"Azure Data Explorer is a fast and highly scalable data exploration service for log and telemetry data. Explore your data from end-to-end in the Azure Data Explorer web application, starting with data ingestion, running queries, and ultimately building dashboards. A dashboard is a collection of tiles, optionally organized in pages, where each tile has an underlying query and a visual representation. 
Using the web UI, you can natively export Kusto Query Language (KQL) queries to a dashboard as visua",2026-04-20T08:00:00.000Z,how-to,,0.2,False,"Primarily a how-to/overview for creating and using Azure Data Explorer dashboards; no indication of numeric limits, configuration parameter tables, error-code-based troubleshooting, or decision matrices. Content is likely general UI and query export steps rather than expert-only reference details.",updated https://learn.microsoft.com/en-us/azure/data-explorer/azure-powershell,Use Kusto cmdlets in Azure PowerShell,Use Kusto cmdlets in Azure PowerShell - Azure Data Explorer,Manage ADX with Azure PowerShell Kusto cmdlets,This article describes how to use Kusto cmdlets in Azure PowerShell.,"PowerShell scripts can use Azure PowerShell Az.Kusto cmdlets to run management commands. The steps in this article aren't required if you're running commands in Azure Cloud Shell. If you're running the CLI locally, follow these steps to set up your environment.",2024-09-04T16:57:00.000Z,reference,integrations,0.65,True,"Cmdlet article documents specific Az.Kusto cmdlets, parameters, and usage patterns, which are API/SDK-level integration details for ADX management.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/base-query,Create a dashboard base query,Create a dashboard base query - Azure Data Explorer,,Learn how to create a base query for an Azure Data Explorer dashboard.,"Base queries are reusable query snippets that can be used as building blocks for all dashboard query components such as parameters, tiles, and other base queries. A single base query is managed in the context of a specific dashboard. Each dashboard can have one or more base queries. Base queries make query maintenance in dashboards easier, as queries can be managed in a central location. To interactively explore sample dashboards, see Quickstart: Visualize sample data dashboards. 
Note Base query ",2026-01-12T08:00:00.000Z,how-to,,0.3,False,"Base queries as reusable snippets is a usage pattern, but article is procedural without numeric thresholds or config matrices.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/business-continuity-create-solution,Create solutions,Create business continuity and disaster recovery solutions with Azure Data Explorer - Azure Data Explorer,Design ADX regional DR and replication solutions,This article describes how to create business continuity and disaster recovery solutions with Azure Data Explorer,"This article details how you can prepare for an Azure regional outage by replicating your Azure Data Explorer resources, management, and ingestion in different Azure regions. An example of data ingestion with Azure Event Hubs is given. Cost optimization is also discussed for different architecture configurations. For a more in-depth look at architecture considerations and recovery solutions, see the business continuity overview.",2022-03-07T00:00:00.000Z,how-to,architecture-patterns,0.7,True,"Describes concrete DR solutions, replication between regions, and cost-optimized configurations; these are ADX-specific architectural patterns and trade-offs.",unchanged @@ -27,7 +27,7 @@ https://learn.microsoft.com/en-us/azure/data-explorer/create-cluster-and-databas https://learn.microsoft.com/en-us/azure/data-explorer/create-cluster-database,Create a cluster and database using SDKs,Create an Azure Data Explorer cluster and database - Azure Data Explorer,Programmatically configure Azure Data Explorer clusters,Learn how to create an Azure Data Explorer cluster and database.,"Azure Data Explorer is a fast, fully managed data analytics service for real-time analysis on large volumes of data streaming from applications, websites, IoT devices, and more. To use Azure Data Explorer, you first create a cluster, and create one or more databases in that cluster. 
Then, you can ingest (load) data into a database and run queries against it. In this article, you'll learn how to create a cluster and a database using either C#, Python, Go, the Azure CLI, PowerShell, or an Azure Re",2025-09-28T08:00:00.000Z,how-to,configuration,0.6,True,Shows concrete API/CLI/SDK parameters for cluster and database creation that are product-specific configuration details.,unchanged https://learn.microsoft.com/en-us/azure/data-explorer/create-event-grid-connection,Create an Event Grid data connection,Create an Event Grid data connection - Azure Data Explorer,Apply Event Grid ingestion file size limits in Azure Data Explorer,Learn how to create an Event Grid data connection to ingest data into Azure Data Explorer.,"In this article, you learn how to ingest blobs from your storage account into Azure Data Explorer using an Event Grid data connection. You'll create an Event Grid data connection that sets an Azure Event Grid subscription. The Event Grid subscription routes events from your storage account to Azure Data Explorer via an Azure Event Hubs. Note Ingestion supports a maximum file size of 6 GB. The recommendation is to ingest files between 100 MB and 1 GB. To learn how to create the connection using the",2026-02-08T08:00:00.000Z,how-to,limits-quotas,0.8,True,"Explicitly states maximum file size (6 GB) and recommended range (100 MB–1 GB) for ingestion, which are concrete product limits.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/create-event-grid-connection-sdk,Create an Event Grid data connection with SDKs,Create an Event Grid data connection with SDKs - Azure Data Explorer,,"In this article, you learn how to ingest data into Azure Data Explorer from Event Grid using the Kusto SDKs.","In this article, you learn how to ingest blobs from your storage account into Azure Data Explorer using an Event Grid data connection. You'll create an Event Grid data connection that sets an Azure Event Grid subscription. 
The Event Grid subscription routes events from your storage account to Azure Data Explorer via an Azure Event Hubs. To learn how to create the connection in the Azure portal or with an ARM template, see Create an Event Grid data connection. For general information about ingesting",2023-07-16T23:16:00.000Z,how-to,,0.45,False,SDK-based Event Grid connection how-to; summary does not expose specific parameter tables or limits.,unchanged -https://learn.microsoft.com/en-us/azure/data-explorer/create-event-hubs-connection,Create an Event Hubs data connection,Create an Event Hubs Data Connection - Azure Data Explorer,,Learn how to ingest data from Event Hubs into Azure Data Explorer.,"Azure Data Explorer offers ingestion from Event Hubs, a big data streaming platform and event ingestion service. Event Hubs can process millions of events per second in near real time. In this article, you connect to an event hub and ingest data into Azure Data Explorer. For an overview on ingesting from Event Hubs, see Azure Event Hubs data connection. To learn how to create the connection using the Kusto software developer kits (SDKs), see Create an Event Hubs data connection with SDKs. For code ",2026-01-12T08:00:00.000Z,how-to,,0.3,False,"Describes how to create an Event Hubs data connection and ingest data into Azure Data Explorer. The summary suggests a connection/ingestion walkthrough rather than configuration reference tables, limits, or troubleshooting matrices. Lacks clear evidence of expert-level numeric limits, parameter ranges, or error-code-based diagnosis.",updated +https://learn.microsoft.com/en-us/azure/data-explorer/create-event-hubs-connection,Create an Event Hubs data connection,Create an Event Hubs Data Connection - Azure Data Explorer,,Learn how to ingest data from Event Hubs into Azure Data Explorer.,"Azure Data Explorer offers ingestion from Event Hubs, a big data streaming platform and event ingestion service. 
Event Hubs can process millions of events per second in near real time. In this article, you connect to an event hub and ingest data into Azure Data Explorer. For an overview on ingesting from Event Hubs, see Azure Event Hubs data connection. To learn how to create the connection using the Kusto software developer kits (SDKs), see Create an Event Hubs data connection with SDKs. For code ",2026-01-12T08:00:00.000Z,how-to,,0.3,False,"Describes how to create an Event Hubs data connection and ingest data into Azure Data Explorer. The summary suggests a connection/ingestion walkthrough rather than configuration reference tables, limits, or troubleshooting matrices. Lacks clear evidence of expert-level numeric limits, parameter ranges, or error-code-based diagnosis.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/create-event-hubs-connection-sdk,Create an Event Hubs data connection with SDKs,Create an Event Hubs data connection with SDKs - Azure Data Explorer,,"In this article, you learn how to ingest data into Azure Data Explorer from Event Hubs using SDKs.","Azure Data Explorer offers ingestion from Event Hubs, a big data streaming platform and event ingestion service. Event Hubs can process millions of events per second in near real time. In this article, you connect to an event hub and ingest data into Azure Data Explorer. For an overview on ingesting from Event Hubs, see Azure Event Hubs data connection. 
To learn how to create the connection in the Azure Data Explorer web UI, Azure portal, or with an ARM template, see Create an Event Hubs data conne",2023-07-16T23:16:00.000Z,how-to,,0.45,False,SDK-based Event Hubs connection tutorial; summary lacks explicit config tables or limits.,unchanged https://learn.microsoft.com/en-us/azure/data-explorer/create-iot-hub-connection,Create an IoT Hub data connection,Create an IoT Hub data connection - Azure Data Explorer,,"In this article, you learn how to ingest data into Azure Data Explorer from IoT Hub.","This article shows you how to ingest data into Azure Data Explorer from IoT Hub, a big data streaming platform and IoT ingestion service. To learn how to create the connection in the Azure portal or with an ARM template, see Create an IoT data connection. For general information about ingesting into Azure Data Explorer from IoT Hub, see Connect to IoT Hub. Note Only events enqueued after you create the data connection are ingested. For code samples based on previous SDK versions, see the archived a",2026-02-08T23:02:00.000Z,how-to,,0.45,False,"IoT Hub connection how-to; note about only post-connection events is conceptual, no detailed limits or configs.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/create-iot-hub-connection-sdk,Create an IoT Hub data connection with SDKs,Create an IoT Hub data connection with SDKs - Azure Data Explorer,,"In this article, you learn how to ingest data into Azure Data Explorer from IoT Hub using SDKs.","This article shows you how to ingest data into Azure Data Explorer from IoT Hub, a big data streaming platform, and IoT ingestion service. To learn how to create the connection using the Kusto SDKs, see Create an IoT Hub data connection with SDKs. For general information about ingesting into Azure Data Explorer from IoT Hub, see Connect to IoT Hub. Note Only events enqueued after you create the data connection are ingested. 
For code samples based on previous SDK versions, see the archived article.",2025-08-26T22:21:00.000Z,how-to,,0.45,False,SDK-based IoT Hub connection tutorial; summary lacks explicit configuration parameter tables or limits.,unchanged @@ -45,7 +45,7 @@ https://learn.microsoft.com/en-us/azure/data-explorer/data-factory-command-activ https://learn.microsoft.com/en-us/azure/data-explorer/data-factory-integration,Integration with Data Factory,Azure Data Explorer integration with Azure Data Factory - Azure Data Explorer,,"In this article, integrate Azure Data Explorer with Azure Data Factory to use the copy, lookup, and command activities.",Azure Data Factory (ADF) is a cloud-based data integration service that allows you to integrate different data stores and perform activities on the data. ADF allows you to create data-driven workflows for orchestrating and automating data movement and data transformation. Azure Data Explorer is one of the supported data stores in Azure Data Factory.,2025-09-02T08:00:00.000Z,how-to,,0.35,False,Integration overview with Azure Data Factory; no detailed parameter tables or limits in summary.,unchanged https://learn.microsoft.com/en-us/azure/data-explorer/data-factory-load-data,Copy data using Azure Data Factory,Copy data from Azure Data Factory to Azure Data Explorer - Azure Data Explorer,,"In this article, you learn how to ingest (load) data into Azure Data Explorer by using the Azure Data Factory copy tool.","Important This connector can be used in Real-Time Intelligence in Microsoft Fabric. Use the instructions in this article with the following exceptions: Azure Data Explorer is a fast, fully managed, data-analytics service. It offers real-time analysis on large volumes of data that stream from many sources, such as applications, websites, and IoT devices. 
With Azure Data Explorer, you can iteratively explore data and identify patterns and anomalies to improve products, enhance customer experiences, ",2023-01-26T00:00:00.000Z,how-to,,0.4,False,ADF copy tool ingestion how-to; summary is generic and does not expose expert-only configuration details.,unchanged https://learn.microsoft.com/en-us/azure/data-explorer/data-factory-template,Bulk copy data using Azure Data Factory template,Copy in bulk from a database to Azure Data Explorer by using the Azure Data Factory template - Azure Data Explorer,,"In this article, you learn to use an Azure Data Factory template to copy in bulk from a database to Azure Data Explorer","Azure Data Explorer is a fast, fully managed, data-analytics service. It offers real-time analysis on large volumes of data that stream from many sources, such as applications, websites, and IoT devices. To copy data from a database in Oracle Server, Netezza, Teradata, or SQL Server to Azure Data Explorer, you have to load huge amounts of data from multiple tables. Usually, the data has to be partitioned in each table so that you can load rows with multiple threads in parallel from a single tabl",2023-11-19T12:02:00.000Z,how-to,,0.45,False,Bulk copy template article; summary mentions partitioning but no explicit numeric thresholds or config tables.,unchanged -https://learn.microsoft.com/en-us/azure/data-explorer/data-lake-query-data,Query data in Azure Data Lake,Query Data in Azure Data Lake Using Azure Data Explorer - Azure Data Explorer,,Learn how to query data in Azure Data Lake using Azure Data Explorer.,"Azure Data Lake Storage is a highly scalable and cost-effective data lake solution for big data analytics. It combines the power of a high-performance file system with massive scale and economy to help you reduce your time to insight. Data Lake Storage Gen2 extends Azure Blob Storage capabilities and is optimized for analytics workloads. 
Azure Data Explorer integrates with Azure Blob Storage and Azure Data Lake Storage (Gen1 and Gen2), providing fast, cached, and indexed access to data stored in",2026-04-12T08:00:00.000Z,how-to,,0.2,False,"Integration overview of querying Azure Data Lake via Azure Data Explorer, but summary indicates high-level description of capabilities without detailed config parameters, limits, or error mappings.",updated +https://learn.microsoft.com/en-us/azure/data-explorer/data-lake-query-data,Query data in Azure Data Lake,Query Data in Azure Data Lake Using Azure Data Explorer - Azure Data Explorer,,Learn how to query data in Azure Data Lake using Azure Data Explorer.,"Azure Data Lake Storage is a highly scalable and cost-effective data lake solution for big data analytics. It combines the power of a high-performance file system with massive scale and economy to help you reduce your time to insight. Data Lake Storage Gen2 extends Azure Blob Storage capabilities and is optimized for analytics workloads. Azure Data Explorer integrates with Azure Blob Storage and Azure Data Lake Storage (Gen1 and Gen2), providing fast, cached, and indexed access to data stored in",2026-04-12T08:00:00.000Z,how-to,,0.2,False,"Integration overview of querying Azure Data Lake via Azure Data Explorer, but summary indicates high-level description of capabilities without detailed config parameters, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/data-profile,Access the data profile of a table,Access the data profile of a table - Azure Data Explorer,,Learn how to access the data profile of a table in the Azure Data Explorer web UI.,The data profile feature in the Azure Data Explorer web UI allows you to quickly gain insights into the data within your tables. It features a time chart illustrating data distribution according to a specified datetime field and presents each column of the table along with essential related statistics. 
This article explains how to access and understand the data profile of a table.,2024-06-13T16:55:00.000Z,how-to,,0.25,False,Explains data profile feature usage; mostly UI guidance without detailed config matrices or error mappings.,unchanged https://learn.microsoft.com/en-us/azure/data-explorer/data-purge-portal,Purge data - Azure portal,Use data purge to delete personal data from a device or service in Azure Data Explorer - Azure Data Explorer,Purge personal data from Azure Data Explorer,Learn about how to purge (delete) data from Azure Data Explorer using data purge.,"Note This article provides steps about how to delete personal data from the device or service and can be used to support your obligations under the GDPR. For general information about GDPR, see theGDPR section of the Microsoft Trust Centerand theGDPR section of the Service Trust portal. Azure Data Explorer supports the ability to delete individual records. Data deletion through the.purgecommand protects personal data and shouldn't be used in other scenarios. It isn't designed to support frequent",2023-02-01T00:00:00.000Z,how-to,security,0.7,True,"Data purge/.purge command usage is tightly tied to GDPR-related deletion semantics and ADX-specific behavior; likely includes command syntax, constraints, and scenarios that are product-specific security/compliance knowledge.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/data-share,Use Azure Data Share,Use Azure Data Share to share data with Azure Data Explorer - Azure Data Explorer,,Learn about how to share your data with Azure Data Explorer and Azure Data Share.,"You can share data in many traditional ways, such as through file shares, FTP, email, and APIs. These methods require both parties to build and maintain a data pipeline that moves data between teams and organizations. With Azure Data Explorer, you can easily and securely share your data with people in your company or external partners. 
Sharing occurs in near-real-time, with no need to build or maintain a data pipeline. All database changes, including schema and data, on the provider side are ins",2025-11-03T08:00:00.000Z,how-to,,0.45,False,"Data Share integration article summary is high-level; no explicit mention of connector configuration parameters, limits, or code-level patterns.",unchanged @@ -66,7 +66,7 @@ https://learn.microsoft.com/en-us/azure/data-explorer/flow,Power Automate connec https://learn.microsoft.com/en-us/azure/data-explorer/flow-usage,Power Automate connector usage examples,Usage examples for Azure Data Explorer connector to Power Automate - Azure Data Explorer,Automate ADX queries with Power Automate examples,Learn some common usage examples for Azure Data Explorer connector to Power Automate.,"The Azure Data Explorer Power Automate (previously Microsoft flow) connector allows Azure Data Explorer to use the flow capabilities ofMicrosoft Power Automate. You can run Kusto queries and commands automatically, as part of a scheduled or triggered task. This article includes several common Power Automate connector usage examples. For more information, seeAzure Data Explorer Power Automate connector.",2026-02-25T18:13:00.000Z,how-to,integrations,0.65,True,"Usage examples for the connector provide concrete flow patterns, parameter usage, and ADX-specific query/command integration scenarios, which are coding/integration patterns.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/fluent-bit,Fluent Bit,Ingest data with Fluent Bit into Azure Data Explorer - Azure Data Explorer,,Learn how to ingest (load) data into Azure Data Explorer from Fluent Bit.,"Fluent Bitis an open-source agent that collects logs, metrics, and traces from various sources. It allows you to filter, modify, and aggregate event data before sending it to storage. This article guides you through the process of using Fluent Bit to send data to your KQL database. 
This article shows how to ingest data with Fluent Bit. For a complete list of data connectors, seeData connectors overview.",2026-02-02T08:00:00.000Z,how-to,,0.45,False,"Fluent Bit ingestion article; summary indicates generic pipeline steps, no explicit limits or parameter matrices.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/follower,Follower databases,Use follower database feature to attach databases in Azure Data Explorer - Azure Data Explorer,Use follower databases for cross-cluster ADX access,Learn about how to attach databases in Azure Data Explorer using the follower database feature.,"Thefollower databasefeature allows you to attach a database located in a different cluster to your Azure Data Explorer cluster. Thefollower databaseis attached inread-onlymode, making it possible to view the data and run queries on the data that was ingested into theleader database. The follower database synchronizes changes in the leader databases. Because of the synchronization, there's a data lag of a few seconds to a few minutes in data availability. The length of the time lag depends on the",2025-12-11T08:00:00.000Z,how-to,architecture-patterns,0.7,True,"Follower database feature involves leader/follower synchronization, read-only semantics, and lag characteristics; this is an ADX-specific architectural pattern with concrete behavior details.",unchanged -https://learn.microsoft.com/en-us/azure/data-explorer/get-data-amazon-s3,Get data from Amazon S3,Get Data from Amazon S3 Into Azure Data Explorer - Azure Data Explorer,,Learn how to get data from Amazon S3 into Azure Data Explorer.,"Data ingestion is the process of loading data from one or more sources into a table in Azure Data Explorer. After ingestion, the data is available for query. In this article, you learn how to get data from Amazon S3 into either a new or existing table. For more information on Amazon S3, seeWhat is Amazon S3? 
For general information on data ingestion, seeAzure Data Explorer data ingestion overview.",2026-04-13T17:02:00.000Z,how-to,,0.3,False,"Appears to be a step-by-step ingestion tutorial for pulling data from Amazon S3 into Azure Data Explorer. The summary suggests procedural guidance rather than detailed configuration tables, limits, or product-specific error mappings. No indication of numeric limits, config parameter matrices, or error-code-based troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/data-explorer/get-data-amazon-s3,Get data from Amazon S3,Get Data from Amazon S3 Into Azure Data Explorer - Azure Data Explorer,,Learn how to get data from Amazon S3 into Azure Data Explorer.,"Data ingestion is the process of loading data from one or more sources into a table in Azure Data Explorer. After ingestion, the data is available for query. In this article, you learn how to get data from Amazon S3 into either a new or existing table. For more information on Amazon S3, seeWhat is Amazon S3? For general information on data ingestion, seeAzure Data Explorer data ingestion overview.",2026-04-13T17:02:00.000Z,how-to,,0.3,False,"Appears to be a step-by-step ingestion tutorial for pulling data from Amazon S3 into Azure Data Explorer. The summary suggests procedural guidance rather than detailed configuration tables, limits, or product-specific error mappings. No indication of numeric limits, config parameter matrices, or error-code-based troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/get-data-file,Get data from local file,Get Data From a File - Azure Data Explorer,,Learn how to get data from a local file in Azure Data Explorer.,"Data ingestion is the process of loading data from one or more sources into a table in Azure Data Explorer. Once ingested, the data is available for query. In this article, you learn how to get data from a local file into either a new or existing table. 
For general information on data ingestion, seeAzure Data Explorer data ingestion overview.",2026-02-12T08:00:00.000Z,how-to,,0.3,False,"Wizard-style how-to for local file ingestion; no config tables, limits, or product-specific edge cases.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/get-data-storage,Get data from Azure storage,Get Data from Azure Storage - Azure Data Explorer,,Learn how to get data from Azure storage in Azure Data Explorer.,"Data ingestion is the process used to load data from one or more sources into a table in Azure Data Explorer. Once ingested, the data becomes available for query. In this article, you learn how to get data from Azure storage (ADLS Gen2 container, blob container, or individual blobs) into either a new or existing table. For general information on data ingestion, seeAzure Data Explorer data ingestion overview. Warning The Get Data Wizard doesn't support ingestion from Azure Storage throughprivate ",2026-01-05T18:02:00.000Z,how-to,,0.35,False,Basic Get Data wizard usage for Azure Storage; warning about private endpoints but no detailed config or limits.,unchanged https://learn.microsoft.com/en-us/azure/data-explorer/go-ingest-data,Go SDK,Ingest data with Azure Data Explorer Go SDK - Azure Data Explorer,,"In this article, you learn how to ingest (load) data into Azure Data Explorer using Go SDK.","Azure Data Explorer is a fast and highly scalable data exploration service for log and telemetry data. It provides aGo SDK client libraryfor interacting with the Azure Data Explorer service. You can use theGo SDKto ingest, control, and query data in Azure Data Explorer clusters. In this article, you first create a table and data mapping in a test cluster. 
You then queue an ingestion to the cluster using the Go SDK and validate the results.",2023-07-02T23:25:00.000Z,how-to,,0.4,False,Go SDK ingestion how-to; summary shows generic steps without expert-only configuration or limits.,unchanged @@ -84,7 +84,7 @@ https://learn.microsoft.com/en-us/azure/data-explorer/ingest-data-event-hub-over https://learn.microsoft.com/en-us/azure/data-explorer/ingest-data-flink,Apache Flink,Ingest data with Apache Flink into Azure Data Explorer - Azure Data Explorer,,Learn how to ingest data with Apache Flink into Azure Data Explorer.,"Apache Flinkis a framework and distributed processing engine for stateful computations over unbounded and bounded data streams. The Flink connector is anopen source projectthat can run on any Flink cluster. It implements data sink for moving data from a Flink cluster. Using the connector to Apache Flink, you can build fast and scalable applications targeting data driven scenarios, for example, machine learning (ML), Extract-Transform-Load (ETL), and Log Analytics. In this article, you learn how ",2024-02-15T11:11:00.000Z,how-to,,0.5,False,"Flink connector how-to; summary indicates generic usage, no explicit parameter tables or limits visible.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/ingest-data-fluentd,Fluentd,Ingest log data with Fluentd into Azure Data Explorer - Azure Data Explorer,,Learn how to ingest log data with Fluentd into Azure Data Explorer.,"Log ingestion is the process of collecting, transforming, and preparing log data from applications, servers, containers, and cloud services so you can store, analyze, and monitor it. Logs capture information such as errors, warnings, usage patterns, and system performance. Reliable log ingestion ensures that operational and security data is available in near real-time for troubleshooting and insights. 
This article explains how to send logs from Fluentd to Azure Data Explorer (Kusto), including i",2026-01-07T12:03:00.000Z,how-to,,0.5,False,Fluentd ingestion article; summary mentions configuration but does not reveal specific parameter tables or quotas.,unchanged -https://learn.microsoft.com/en-us/azure/data-explorer/ingest-data-historical,Ingest historical data,Ingest Historical Data Into Azure Data Explorer - Azure Data Explorer,,Learn how to use LightIngest to ingest historical or ad hoc data ingestion into Azure Data Explorer.,"A common scenario when onboarding to Azure Data Explorer is ingesting historical data, sometimes called backfill. The process involves ingesting data from an existing storage system into a table, which is a collection ofextents. Ingest historical data by using thecreationTime ingestion propertyto set the creation time of extents to the time the data wascreated. Using the creation time as the ingestion partitioning criterion can age your data in accordance with yourcacheandretentionpolicies, and ",2026-04-12T08:00:00.000Z,how-to,,0.4,False,"Focuses on ingesting historical data using LightIngest and the creationTime ingestion property. While it likely includes some product-specific usage guidance, the summary does not indicate detailed configuration tables, numeric thresholds, or error-code mappings that meet the expert-knowledge criteria for the defined sub-skills.",updated +https://learn.microsoft.com/en-us/azure/data-explorer/ingest-data-historical,Ingest historical data,Ingest Historical Data Into Azure Data Explorer - Azure Data Explorer,,Learn how to use LightIngest to ingest historical or ad hoc data ingestion into Azure Data Explorer.,"A common scenario when onboarding to Azure Data Explorer is ingesting historical data, sometimes called backfill. The process involves ingesting data from an existing storage system into a table, which is a collection ofextents. 
Ingest historical data by using thecreationTime ingestion propertyto set the creation time of extents to the time the data wascreated. Using the creation time as the ingestion partitioning criterion can age your data in accordance with yourcacheandretentionpolicies, and ",2026-04-12T08:00:00.000Z,how-to,,0.4,False,"Focuses on ingesting historical data using LightIngest and the creationTime ingestion property. While it likely includes some product-specific usage guidance, the summary does not indicate detailed configuration tables, numeric thresholds, or error-code mappings that meet the expert-knowledge criteria for the defined sub-skills.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/ingest-data-iot-hub-overview,IoT Hub data connection,Ingest from IoT Hub - Azure Data Explorer,,This article describes Ingest from IoT Hub in Azure Data Explorer.,"Azure IoT Hubis a managed service, hosted in the cloud, that acts as a central message hub for bi-directional communication between your IoT application and the devices it manages. Azure Data Explorer offers continuous ingestion from customer-managed IoT Hubs, using itsEvent Hubs compatible built in endpoint of device-to-cloud messages. The IoT ingestion pipeline goes through several steps. First, you create an IoT Hub and register a device to it. 
You then create a target table in Azure Data Exp",2026-02-23T08:00:00.000Z,how-to,,0.45,False,IoT Hub ingestion overview; pipeline description without numeric thresholds or detailed configuration references.,unchanged https://learn.microsoft.com/en-us/azure/data-explorer/ingest-data-kafka,Apache Kafka,Ingest data From Kafka Into Azure Data Explorer - Azure Data Explorer,,"In this article, you learn how to ingest (load) data into Azure Data Explorer from Kafka.","Apache Kafkais a distributed streaming platform for building real-time streaming data pipelines that reliably move data between systems or applications.Kafka Connectis a tool for scalable and reliable streaming of data between Apache Kafka and other data systems. TheKusto Kafka Sinkserves as the connector from Kafka and doesn't require using code. Download the sink connector jar from theGit repoorConfluent Connector Hub. This article shows how to ingest data with Kafka, using a self-contained Do",2026-02-12T08:00:00.000Z,how-to,,0.5,False,Kafka connector article; summary mentions Docker-based example but no explicit limits or configuration matrices.,unchanged https://learn.microsoft.com/en-us/azure/data-explorer/ingest-data-logstash,Logstash,Ingest data from Logstash to Azure Data Explorer - Azure Data Explorer,,"In this article, you learn how to ingest (load) data into Azure Data Explorer from Logstash","Important This connector can be used inReal-Time Intelligencein Microsoft Fabric. Use the instructions in this article with the following exceptions: Logstashis an open source, server-side data processing pipeline that ingests data from many sources simultaneously, transforms the data, and then sends the data to your favorite ""stash"". In this article, you'll send that data to Azure Data Explorer, which is a fast and highly scalable data exploration service for log and telemetry data. 
You'll init",2023-10-19T11:05:00.000Z,how-to,,0.5,False,Logstash ingestion article; summary is tutorial-style without explicit limits or detailed configuration references.,unchanged @@ -125,7 +125,7 @@ https://learn.microsoft.com/en-us/azure/data-explorer/managed-identities-overvie https://learn.microsoft.com/en-us/azure/data-explorer/migrate-cluster-to-multiple-availability-zone,Migrate cluster to support availability zones,Migrate your cluster to support multiple availability zones (preview) - Azure Data Explorer,Migrate Azure Data Explorer clusters to availability zones,This guide teaches you how to migrate your cluster to support multiple availability zones.,"Many Azure regions provide availability zones, which are separated groups of datacenters within a region. Availability zones are close enough to have low-latency connections to other availability zones. They're connected by a high-performance network with a round-trip latency of less than 2 ms. However, availability zones are far enough apart to reduce the likelihood that more than one will be affected by local outages or weather. Availability zones have independent power, cooling, and networkin",2025-04-20T11:07:00.000Z,how-to,deployment,0.6,True,"Migration guide for enabling multi-AZ support. Such docs usually include region/zone support constraints and specific migration steps, fitting deployment/migration patterns.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/migrate-elasticsearch-to-azure-data-explorer,Migrate from Elasticsearch,Elasticsearch to Azure Data Explorer: Migration guide - Azure Data Explorer,Migrate Elasticsearch workloads to Azure Data Explorer,"This guide teaches you to migrate your Elasticsearch data to Azure Data Explorer, by using Logstash.","In this guide, you learnhow to migrateyour Elasticsearch data to Azure Data Explorer, by usingLogstash. 
In this guide, the data to be migrated is in an Elasticsearch index namedvehiclethat has the following data schema:",2023-07-09T23:03:00.000Z,how-to,decision-making,0.7,True,Migration guide with concrete steps and mappings from Elasticsearch schema and tooling (Logstash) to Azure Data Explorer.,unchanged https://learn.microsoft.com/en-us/azure/data-explorer/monitor-data-explorer,Monitor Azure Data Explorer,Monitor Azure Data Explorer - Azure Data Explorer,,"Learn how to monitor Azure Data Explorer using Azure Monitor, including data collection, analysis, and alerting.","Azure Monitor collects and aggregates metrics and logs from your system to monitor availability, performance, and resilience, and notify you of issues affecting your system. You can use the Azure portal, PowerShell, Azure CLI, REST API, or client libraries to set up and view monitoring data. Different metrics and logs are available for different resource types. This article describes the types of monitoring data you can collect for this service and ways to analyze that data.",2026-02-25T18:13:00.000Z,how-to,,0.4,False,"High-level monitoring overview describing data types and tools; summary doesn’t indicate detailed metric tables, thresholds, or configuration parameters.",unchanged -https://learn.microsoft.com/en-us/azure/data-explorer/monitor-data-explorer-reference,Monitoring data reference,Monitoring data reference for Azure Data Explorer - Azure Data Explorer,Reference metrics and logs for Azure Data Explorer monitoring,This article contains important reference material you need when you monitor Azure Data Explorer by using Azure Monitor.,This article contains all the monitoring reference information for this service. 
SeeMonitor Azure Data Explorerfor details on the data you can collect for Azure Data Explorer and how to use it.,2026-04-16T08:00:00.000Z,reference,configuration,0.7,True,"Monitoring reference pages for Azure services typically list all available metrics, dimensions, log categories, and their exact names and semantics for Azure Monitor. These are product-specific configuration and reference details (metric names, log table names, dimensions, and how to use them) that an LLM is unlikely to know reliably from training. This fits the configuration sub-skill because it is a structured reference for monitoring-related settings and signals rather than conceptual guidance.",updated +https://learn.microsoft.com/en-us/azure/data-explorer/monitor-data-explorer-reference,Monitoring data reference,Monitoring data reference for Azure Data Explorer - Azure Data Explorer,Reference metrics and logs for Azure Data Explorer monitoring,This article contains important reference material you need when you monitor Azure Data Explorer by using Azure Monitor.,This article contains all the monitoring reference information for this service. SeeMonitor Azure Data Explorerfor details on the data you can collect for Azure Data Explorer and how to use it.,2026-04-16T08:00:00.000Z,reference,configuration,0.7,True,"Monitoring reference pages for Azure services typically list all available metrics, dimensions, log categories, and their exact names and semantics for Azure Monitor. These are product-specific configuration and reference details (metric names, log table names, dimensions, and how to use them) that an LLM is unlikely to know reliably from training. 
This fits the configuration sub-skill because it is a structured reference for monitoring-related settings and signals rather than conceptual guidance.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/monitor-queued-ingestion,Monitor queued ingestion with metrics,Monitor queued ingestion in Azure Data Explorer - Azure Data Explorer,Monitor queued ingestion metrics in ADX,Learn how to use Azure Data Explorer metrics to monitor queued ingestion to Azure Data Explorer in Azure portal.,"In thequeued ingestionprocess, Azure Data Explorer optimizesdata ingestionfor high throughput by batching incoming small chunks of data into batches based on a configurableingestion batching policy. The batching policy allows you to set the trigger conditions for sealing a batch (data size, number of blobs, or time passed). These batches are then optimally ingested for fast query results. In this article, you will learn how to use metrics to monitor queued ingestion to Azure Data Explorer in Azu",2023-11-02T11:07:00.000Z,how-to,best-practices,0.65,True,"Monitoring queued ingestion with batching policy implies ADX-specific metrics, thresholds, and interpretations for ingestion health, which are concrete operational best practices.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/monitor-with-resource-health,Use resource health to monitor cluster health,Monitor Azure Data Explorer using Resource Health - Azure Data Explorer,Use Resource Health to diagnose ADX issues,Use Azure Resource Health to monitor Azure Data Explorer resources.,Resource Healthfor Azure Data Explorer informs you of the health of your Azure Data Explorer resource and provides actionable recommendations to troubleshoot problems. Resource health is updated every 1-2 minutes and reports the current and past health of your resources. 
Resource Health determines the health of your Azure Data Explorer resource by examining various health status checks such as:,2023-03-14T00:00:00.000Z,how-to,troubleshooting,0.65,True,"Resource Health article for ADX likely maps health states and checks to causes and recommended actions, providing symptom→cause→solution guidance specific to this service.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/multi-tenant,Multi-tenant solutions,Comparing different multi-tenant solutions with Azure Data Explorer - Azure Data Explorer,Choose Azure Data Explorer multi-tenant architecture,Learn about the different ways to architect a multi-tenant solution in Azure Data Explorer.,"The concept of multi-tenancy in Azure Data Explorer refers to serving different tenants and storing their data in a single cluster. Atenantcan represent a customer, a group of users, or any classifications of users where data needs to be segregated along the tenants' boundaries. You can also have multi-level multi-tenancy scenario, such as multiple applications that each have multiple tenants. This scenario isn't covered by this article but similar principles apply. An important factor is the wa",2023-07-27T11:08:00.000Z,how-to,architecture-patterns,0.7,True,Compares concrete multi-tenant patterns specific to Azure Data Explorer and when to use each; architecture decision guidance.,unchanged @@ -196,7 +196,7 @@ https://learn.microsoft.com/en-us/azure/data-explorer/web-customize-settings,Cus https://learn.microsoft.com/en-us/azure/data-explorer/web-query-data,Query sample data,Quickstart: Learn How to Query Sample Data in the Azure Data Explorer Web UI - Azure Data Explorer,,"In this quickstart, you learn how to query and share sample data in the Azure Data Explorer web UI.","In this Quickstart, you'll learn how to query data in the stand-alone Azure Data Explorer web UI. 
Azure Data Explorer provides a web experience that enables you to connect to your Azure Data Explorer clusters and write, run, and shareKusto Query Language (KQL)commands and queries. The web experience is available in the Azure portal and as a stand-alone web application, theAzure Data Explorer web UI. In the Azure Data Explorer web UI, the query editor provides suggestions and warnings as you writ",2025-06-05T22:04:00.000Z,quickstart,,0.3,False,"Quickstart for querying sample data; tutorial-style, not a catalog of expert configuration or limits.",unchanged -https://learn.microsoft.com/en-us/azure/data-explorer/web-results-grid,Explore the results grid,Azure Data Explorer Web UI Results Grid - Azure Data Explorer,,Learn how to work with the results grid in the Azure Data Explorer web UI.,"In this guide, you learn how to work with query results in theAzure Data Explorer web UIby using the results grid. By using the results grid, you can customize and manipulate your results, and enhance the efficiency and effectiveness of your data analysis. To learn how to run queries, seeQuickstart: Query data in the Azure Data Explorer web UI.",2026-04-13T17:02:00.000Z,how-to,,0.2,False,"Describes how to work with the results grid in the web UI; lacks numeric limits, configuration parameter tables, or troubleshooting mappings. General feature-usage instructions rather than expert reference.",updated +https://learn.microsoft.com/en-us/azure/data-explorer/web-results-grid,Explore the results grid,Azure Data Explorer Web UI Results Grid - Azure Data Explorer,,Learn how to work with the results grid in the Azure Data Explorer web UI.,"In this guide, you learn how to work with query results in theAzure Data Explorer web UIby using the results grid. By using the results grid, you can customize and manipulate your results, and enhance the efficiency and effectiveness of your data analysis. 
To learn how to run queries, seeQuickstart: Query data in the Azure Data Explorer web UI.",2026-04-13T17:02:00.000Z,how-to,,0.2,False,"Describes how to work with the results grid in the web UI; lacks numeric limits, configuration parameter tables, or troubleshooting mappings. General feature-usage instructions rather than expert reference.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/web-share-queries,Share queries,Share Queries From Azure Data Explorer Web UI - Azure Data Explorer,,"Share queries from the Azure Data Explorer web UI: learn to copy links, export results, pin to dashboards, or open live in Excel.","This article shows you how to useAzure Data Explorer web UIto: Note To learn how to run queries, seeQuickstart: Query data in the Azure Data Explorer web UI.",2025-08-25T22:02:00.000Z,how-to,,0.25,False,"Shows how to share queries and export results; mostly UI and workflow, not expert configuration or limits.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/web-sync,Access the web UI anywhere,Access your Azure Data Explorer web UI data anywhere - Azure Data Explorer,Configure cloud sync for Azure Data Explorer web UI profiles,This guide teaches you how to access your Azure Data Explorer Web UI Data from anywhere using syncing.,"This article describes how to sync your Azure Data Explorer web UI profile to the cloud, enabling a consistent experience across devices and browsers. When you sync, your browser's settings, tabs, and connections are stored in the cloud, making them accessible through theAzure Data Explorer web UIfrom anywhere. 
Synced data is now associated with your account rather than the specific machine, enabling a consistent experience across devices.",2026-01-05T12:03:00.000Z,how-to,configuration,0.64,True,"Explains how profile sync stores settings, tabs, and connections; likely includes specific toggles and account-scoped configuration options unique to this product.",unchanged https://learn.microsoft.com/en-us/azure/data-explorer/web-ui-kql,Write KQL queries,Write Kusto Query Language Queries in the Azure Data Explorer Web UI - Azure Data Explorer,,"In this article, you learn how to write Kusto Query Language (KQL) queries in the Azure Data Explorer web UI.","TheAzure Data Explorer web UIquery editor offers various features to help you writeKusto Query Language (KQL)queries. Some of these features include built-in KQL Intellisense and autocomplete, inline documentation, and quick fix pop-ups. In this article, you learn what you should know when writing KQL queries in the web UI.",2026-02-12T08:00:00.000Z,how-to,,0.35,False,Describes KQL editor features like IntelliSense and quick fixes; mostly conceptual usage guidance without detailed parameter tables.,unchanged diff --git a/products/azure-data-explorer/report.md b/products/azure-data-explorer/report.md index d96f776a..02743796 100644 --- a/products/azure-data-explorer/report.md +++ b/products/azure-data-explorer/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: security: 'Configuring ADX security: auth/RBAC, managed identities, encryption/CMK, network isolation (private endpoints, outbound/public access), policies, locks, @@ -54,8 +54,8 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 7 -- **Unchanged**: 190 +- **Updated Pages**: 1 +- **Unchanged**: 196 - **Deleted Pages**: 0 - **Compared With**: 
`/home/vsts/work/1/s/Agent-Skills/products/azure-data-explorer/azure-data-explorer.csv` @@ -78,20 +78,8 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics ### Updated Pages -- [Get data from Amazon S3](https://learn.microsoft.com/en-us/azure/data-explorer/get-data-amazon-s3) - - Updated: 2025-10-06T22:02:00.000Z → 2026-04-13T17:02:00.000Z -- [Ingest historical data](https://learn.microsoft.com/en-us/azure/data-explorer/ingest-data-historical) - - Updated: 2025-11-03T08:00:00.000Z → 2026-04-12T08:00:00.000Z -- [Create an Event Hubs data connection](https://learn.microsoft.com/en-us/azure/data-explorer/create-event-hubs-connection) - - Updated: 2025-08-26T08:00:00.000Z → 2026-01-12T08:00:00.000Z -- [Add a query visualization](https://learn.microsoft.com/en-us/azure/data-explorer/add-query-visualization) - - Updated: 2023-10-13T11:04:00.000Z → 2026-04-13T17:02:00.000Z -- [Explore the results grid](https://learn.microsoft.com/en-us/azure/data-explorer/web-results-grid) - - Updated: 2023-05-28T08:00:00.000Z → 2026-04-13T17:02:00.000Z -- [Query data in Azure Data Lake](https://learn.microsoft.com/en-us/azure/data-explorer/data-lake-query-data) - - Updated: 2025-06-11T22:03:00.000Z → 2026-04-12T08:00:00.000Z -- [Monitoring data reference](https://learn.microsoft.com/en-us/azure/data-explorer/monitor-data-explorer-reference) - - Updated: 2025-01-02T18:01:00.000Z → 2026-04-16T08:00:00.000Z +- [Use dashboards](https://learn.microsoft.com/en-us/azure/data-explorer/azure-data-explorer-dashboards) + - Updated: 2025-08-28T11:06:00.000Z → 2026-04-20T08:00:00.000Z ## Classified Pages @@ -260,7 +248,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Ingest and query monitoring data](https://learn.microsoft.com/en-us/azure/data-explorer/ingest-data-no-code) | 0.35 | No-code ingestion tutorial; focuses on steps rather than detailed configuration matrices or limits. 
| | [Integration with Data Factory](https://learn.microsoft.com/en-us/azure/data-explorer/data-factory-integration) | 0.35 | Integration overview with Azure Data Factory; no detailed parameter tables or limits in summary. | | [Sample app generator](https://learn.microsoft.com/en-us/azure/data-explorer/sample-app-generator-wizard) | 0.35 | Sample app generator wizard is a tooling overview; summary doesn’t indicate detailed configuration tables or non-obvious constraints, more of a guided example generator. | -| [Use dashboards](https://learn.microsoft.com/en-us/azure/data-explorer/azure-data-explorer-dashboards) | 0.35 | Dashboard article appears to be a how-to for building dashboards; mostly workflow/UI, not detailed configuration matrices or error/limit references. | | [Write KQL queries](https://learn.microsoft.com/en-us/azure/data-explorer/web-ui-kql) | 0.35 | Describes KQL editor features like IntelliSense and quick fixes; mostly conceptual usage guidance without detailed parameter tables. | | [Add a cluster connection](https://learn.microsoft.com/en-us/azure/data-explorer/add-cluster-connection) | 0.30 | How to add cluster connections in web UI; likely step-by-step UI instructions rather than parameter tables. | | [Apply conditional formatting](https://learn.microsoft.com/en-us/azure/data-explorer/dashboard-conditional-formatting) | 0.30 | Explains conditional formatting rules conceptually; no product-specific limits, config parameter tables, or error codes. | @@ -296,6 +283,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Query data in the web UI overview](https://learn.microsoft.com/en-us/azure/data-explorer/web-ui-query-overview) | 0.20 | UI overview of query page; describes features but not deep configuration tables or limits. 
| | [Query integrations overview](https://learn.microsoft.com/en-us/azure/data-explorer/integrate-query-overview) | 0.20 | High-level overview of query integrations; mainly navigation and conceptual listing of tools. | | [Solution architectures](https://learn.microsoft.com/en-us/azure/data-explorer/solution-architectures) | 0.20 | Points to Azure Architecture Center; no product-specific decision matrices or quantified thresholds in this summary. | +| [Use dashboards](https://learn.microsoft.com/en-us/azure/data-explorer/azure-data-explorer-dashboards) | 0.20 | Primarily a how-to/overview for creating and using Azure Data Explorer dashboards; no indication of numeric limits, configuration parameter tables, error-code-based troubleshooting, or decision matrices. Content is likely general UI and query export steps rather than expert-only reference details. | | [Visualize sample data dashboards](https://learn.microsoft.com/en-us/azure/data-explorer/web-ui-samples-dashboards) | 0.20 | Quickstart for sample dashboards; primarily step-by-step usage, no deep config, limits, or troubleshooting mappings. | | [What is Azure Data Explorer?](https://learn.microsoft.com/en-us/azure/data-explorer/data-explorer-overview) | 0.10 | High-level product overview without concrete limits, configs, or error details. | | [What's new](https://learn.microsoft.com/en-us/azure/data-explorer/whats-new) | 0.10 | Documentation change log; no technical limits, configs, or troubleshooting content. 
| diff --git a/products/azure-data-factory/azure-data-factory.csv b/products/azure-data-factory/azure-data-factory.csv index 94fa4078..0ad3f406 100644 --- a/products/azure-data-factory/azure-data-factory.csv +++ b/products/azure-data-factory/azure-data-factory.csv @@ -113,7 +113,7 @@ https://learn.microsoft.com/en-us/azure/data-factory/connector-netezza,Netezza,C https://learn.microsoft.com/en-us/azure/data-factory/connector-odata,OData,Copy data from OData sources - Azure Data Factory & Azure Synapse,Configure Azure Data Factory OData connector,Learn how to copy data from OData sources to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.,"APPLIES TO: Azure Data Factory, Azure Synapse Analytics Tip Data Factory in Microsoft Fabric is the next generation of Azure Data Factory, with a simpler architecture, built-in AI, and new features. If you're new to data integration, start with Fabric Data Factory. Existing ADF workloads can upgrade to Fabric to access new capabilities across data science, real-time analytics, and reporting. This article outlines how to use Copy Activity in an Azure Data Factory or Synapse Analytics pipeline to copy",2026-03-25T22:12:00.000Z,how-to,integrations,0.8,True,"OData connector documentation for ADF includes property tables (serviceUri, authenticationType, pagination rules, query options) and supported OData versions/endpoints.
These are concrete integration settings unique to this connector.",unchanged https://learn.microsoft.com/en-us/azure/data-factory/connector-odbc,ODBC,Copy data from and to ODBC data stores - Azure Data Factory & Azure Synapse,Use ODBC connector in Azure Data Factory,Learn how to copy data from and to ODBC data stores by using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.,"APPLIES TO: Azure Data Factory, Azure Synapse Analytics Tip Try out Data Factory in Microsoft Fabric, an all-in-one analytics solution for enterprises. Microsoft Fabric covers everything from data movement to data science, real-time analytics, business intelligence, and reporting. Learn how to start a new trial for free! This article outlines how to use the Copy Activity in Azure Data Factory to copy data from and to an ODBC data store. It builds on the copy activity overview article that presents a gener",2024-09-26T08:00:00.000Z,conceptual,integrations,0.88,True,"ODBC connector page describes driver configuration, connection string formats, and dataset/linked service properties specific to ADF’s ODBC implementation, including parameter names and constraints.",unchanged https://learn.microsoft.com/en-us/azure/data-factory/connector-office-365,Microsoft 365,Copy and transform data from Microsoft 365 (Office 365) - Azure Data Factory & Azure Synapse,Copy and transform Microsoft 365 data with Data Factory,Learn how to copy and transform data from Microsoft 365 (Office 365) to supported sink data stores by using copy and mapping data flow activity in an Azure Data Factory or Synapse Analytics pipeline.,"APPLIES TO: Azure Data Factory, Azure Synapse Analytics Tip Data Factory in Microsoft Fabric is the next generation of Azure Data Factory, with a simpler architecture, built-in AI, and new features. If you're new to data integration, start with Fabric Data Factory.
Existing ADF workloads can upgrade to Fabric to access new capabilities across data science, real-time analytics, and reporting. Azure Data Factory and Synapse Analytics pipelines integrate with Microsoft Graph data connect, allowing you t",2026-03-25T22:12:00.000Z,how-to,integrations,0.88,True,"Microsoft 365 connector article describes integration with Microsoft Graph data connect, including tenant/app configuration, consent scopes, and connector-specific dataset properties, which are expert integration details.",unchanged -https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle,Oracle,Copy data to and from Oracle - Azure Data Factory & Azure Synapse,Configure Azure Data Factory Oracle connector,"Learn how to copy data from supported source stores to an Oracle database, or from Oracle to supported sink stores, using Data Factory or Azure Synapse Analytics pipelines.","APPLIES TO: Azure Data Factory, Azure Synapse Analytics Tip Try out Data Factory in Microsoft Fabric, an all-in-one analytics solution for enterprises. Microsoft Fabric covers everything from data movement to data science, real-time analytics, business intelligence, and reporting. Learn how to start a new trial for free! This article outlines how to use the copy activity in Azure Data Factory to copy data from and to an Oracle database. It builds on the copy activity overview. Important The Oracle connec",2026-04-02T08:00:00.000Z,how-to,integrations,0.78,True,"Connector page for Oracle in Data Factory/Synapse typically includes detailed connector-specific settings (linked service properties, dataset schema mapping, supported data types, query options, partitioning settings, and behavior flags) with parameter names and allowed values.
These are product-specific integration details and configuration parameters that go beyond generic knowledge of Oracle or Data Factory, fitting the integrations sub-skill.",unchanged +https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle,Oracle,Copy data to and from Oracle - Azure Data Factory & Azure Synapse,Integrate Azure Data Factory with Oracle databases,"Learn how to copy data from supported source stores to an Oracle database, or from Oracle to supported sink stores, using Data Factory or Azure Synapse Analytics pipelines.","APPLIES TO: Azure Data Factory, Azure Synapse Analytics Tip Data Factory in Microsoft Fabric is the next generation of Azure Data Factory, with a simpler architecture, built-in AI, and new features. If you're new to data integration, start with Fabric Data Factory. Existing ADF workloads can upgrade to Fabric to access new capabilities across data science, real-time analytics, and reporting. This article outlines how to use the copy activity in Azure Data Factory to copy data from and to an Oracle d",2026-04-09T08:00:00.000Z,how-to,integrations,0.74,True,"Connector pages for specific sources/sinks (Oracle) typically include product-specific connection properties, supported data types, and configuration parameters (such as connection string options, partitioning settings, and performance-related flags) that are unique to the Azure Data Factory Oracle connector.
These are concrete integration details rather than just conceptual guidance, fitting the integrations sub-skill.",updated https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle-cloud-storage,Oracle Cloud Storage,Copy data from Oracle Cloud Storage - Azure Data Factory & Azure Synapse,Connect Azure Data Factory to Oracle Cloud Storage,Learn about how to copy data from Oracle Cloud Storage to supported sink data stores using an Azure Data Factory or Synapse Analytics pipeline.,"APPLIES TO: Azure Data Factory, Azure Synapse Analytics Tip Try out Data Factory in Microsoft Fabric, an all-in-one analytics solution for enterprises. Microsoft Fabric covers everything from data movement to data science, real-time analytics, business intelligence, and reporting. Learn how to start a new trial for free! This article outlines how to copy data from Oracle Cloud Storage. To learn more, read the introductory articles for Azure Data Factory and Synapse Analytics.",2023-10-20T08:00:00.000Z,conceptual,integrations,0.8,True,"Describes how to configure Oracle Cloud Storage as a source, including endpoint formats, auth parameters, and dataset settings unique to this connector.",unchanged https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle-eloqua,Oracle Eloqua,Copy data from Oracle Eloqua (Preview) - Azure Data Factory & Azure Synapse,Connect Azure Data Factory to Oracle Eloqua,Learn how to copy data from Oracle Eloqua to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.,"APPLIES TO: Azure Data Factory, Azure Synapse Analytics Tip Data Factory in Microsoft Fabric is the next generation of Azure Data Factory, with a simpler architecture, built-in AI, and new features. If you're new to data integration, start with Fabric Data Factory. Existing ADF workloads can upgrade to Fabric to access new capabilities across data science, real-time analytics, and reporting. Important This connector is at End of Support stage.
You are recommended to migrate to ODBC connector by install",2026-03-25T22:12:00.000Z,how-to,integrations,0.76,True,"Oracle Eloqua connector (preview/end-of-support) pages typically list connector-specific linked service and dataset properties, supported objects, and authentication parameters, which are detailed integration settings not derivable from generic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle-responsys,Oracle Responsys,Copy data from Oracle Responsys (Preview) - Azure Data Factory & Azure Synapse,Connect Azure Data Factory to Oracle Responsys,Learn how to copy data from Oracle Responsys to supported sink data stores using a copy activity in an Azure Data Factory or Synapse Analytics pipeline.,"APPLIES TO: Azure Data Factory, Azure Synapse Analytics Tip Data Factory in Microsoft Fabric is the next generation of Azure Data Factory, with a simpler architecture, built-in AI, and new features. If you're new to data integration, start with Fabric Data Factory. Existing ADF workloads can upgrade to Fabric to access new capabilities across data science, real-time analytics, and reporting. Important This connector is at End of Support stage.
You are recommended to migrate to ODBC connector by install",2026-03-25T22:12:00.000Z,how-to,integrations,0.76,True,"Responsys connector docs usually define specific configuration properties (endpoint URLs, auth settings, object names) and supported operations, forming product-specific integration patterns rather than conceptual content.",unchanged @@ -358,7 +358,7 @@ https://learn.microsoft.com/en-us/azure/data-factory/how-to-run-self-hosted-inte https://learn.microsoft.com/en-us/azure/data-factory/how-to-schedule-azure-ssis-integration-runtime,Schedule Azure-SSIS integration runtime,Schedule an Azure-SSIS integration runtime - Azure Data Factory,Schedule start and stop of Azure-SSIS integration runtime,This article describes how to schedule starting and stopping an Azure-SSIS integration runtime by using Azure Data Factory.,"APPLIES TO: Azure Data Factory, Azure Synapse Analytics Tip Try out Data Factory in Microsoft Fabric, an all-in-one analytics solution for enterprises. Microsoft Fabric covers everything from data movement to data science, real-time analytics, business intelligence, and reporting. Learn how to start a new trial for free!
This article describes how to schedule the starting and stopping of an Azure-SQL Server Integration Services (SSIS) integration runtime (IR) by using Azure Data Factory and Azure Synaps",2024-01-05T08:00:00.000Z,conceptual,configuration,0.7,True,"Scheduling Azure-SSIS IR uses specific pipeline triggers, activity configurations, and possibly time-based settings that are concrete configuration patterns for this service.",unchanged https://learn.microsoft.com/en-us/azure/data-factory/how-to-send-email,Monitor pipelines with email notifications,How to send email - Azure Data Factory & Azure Synapse,Send email notifications from ADF and Synapse pipelines,Learn how to send an email with an Azure Data Factory or Azure Synapse pipeline.,"APPLIES TO: Azure Data Factory, Azure Synapse Analytics Tip Try out Data Factory in Microsoft Fabric, an all-in-one analytics solution for enterprises. Microsoft Fabric covers everything from data movement to data science, real-time analytics, business intelligence, and reporting. Learn how to start a new trial for free! It's often necessary to send notifications during or after execution of a pipeline.
Notification provides proactive alerting and reduces the need for reactive monitoring to discover iss",2025-11-02T05:12:00.000Z,tutorial,integrations,0.7,True,"Provides concrete patterns and configuration (e.g., Logic Apps, Webhook, or Office 365 connectors) with parameter details for integrating email into pipelines.",unchanged https://learn.microsoft.com/en-us/azure/data-factory/how-to-sqldb-to-cosmosdb,Azure SQL DB to Azure Cosmos DB,Migrate Azure SQL Database tables to Azure Cosmos DB with Azure Data Factory - Azure Data Factory,Migrate Azure SQL schemas to Azure Cosmos DB with ADF,Take an existing normalized database schema from Azure SQL Database and migrate to an Azure Cosmos DB denormalized container with Azure Data Factory.,"This guide explains how to take an existing normalized database schema in Azure SQL Database and convert it into an Azure Cosmos DB denormalized schema for loading into Azure Cosmos DB. SQL schemas are typically modeled using third normal form, resulting in normalized schemas that provide high levels of data integrity and fewer duplicate data values. Queries can join entities together across tables for reading. Azure Cosmos DB is optimized for super-quick transactions and querying within a colle",2024-10-04T11:22:00.000Z,how-to,architecture-patterns,0.7,True,"Explains denormalization patterns, schema transformation, and pipeline design for SQL→Cosmos DB migrations, including trade-offs and product-specific mapping.",unchanged -https://learn.microsoft.com/en-us/azure/data-factory/how-to-upgrade-your-azure-data-factory-pipelines-to-fabric-data-factory,Upgrade Azure Data Factory pipelines to Fabric,Upgrade your Azure Data Factory pipelines to Fabric - Azure Data Factory,Assess and upgrade Azure Data Factory pipelines to Fabric,Learn how to assess and upgrade your Azure Data Factory pipelines to Fabric Data Factory.,"Your Azure Data Factory pipelines already power critical workflows. 
In this article, you learn how to bring them into Fabric to unlock a more integrated, analytics-ready experience. This built-in migration experience helps you modernize your existing Azure Data Factory workloads in a few simple clicks. The migration experience helps you: This assessment-first approach helps ensure migrations are intentional, transparent, and incremental. You can upgrade pipelines at your own pace and validate re",2026-03-04T23:27:00.000Z,how-to,decision-making,0.7,True,"The page describes an assessment-first, built-in migration experience from Azure Data Factory to Fabric Data Factory, including guidance on assessing pipeline readiness, understanding compatibility, and upgrading at your own pace. This is migration/modernization decision guidance rather than just a conceptual overview, fitting the decision-making sub-skill.",unchanged +https://learn.microsoft.com/en-us/azure/data-factory/how-to-upgrade-your-azure-data-factory-pipelines-to-fabric-data-factory,Upgrade Azure Data Factory pipelines to Fabric,Upgrade your Azure Data Factory pipelines to Fabric - Azure Data Factory,Assess and upgrade Azure Data Factory pipelines to Fabric,Learn how to assess and upgrade your Azure Data Factory pipelines to Fabric Data Factory.,"Your Azure Data Factory pipelines already power critical workflows. In this article, you learn how to bring them into Fabric to unlock a more integrated, analytics-ready experience. This built-in migration experience helps you modernize your existing Azure Data Factory workloads in a few simple clicks. The migration experience helps you: This assessment-first approach helps ensure migrations are intentional, transparent, and incremental. You can upgrade pipelines at your own pace and validate re",2026-04-21T17:17:00.000Z,how-to,decision-making,0.68,True,"The page focuses on assessing and upgrading Azure Data Factory pipelines to Fabric Data Factory, describing an assessment-first, incremental migration experience. 
This is migration/upgrade decision guidance (when and how to move existing workloads, how to validate, and how to pace the migration), which fits the decision-making category. It goes beyond a conceptual overview by providing product-specific migration flow and considerations, but does not emphasize limits, configuration tables, or troubleshooting details.",updated https://learn.microsoft.com/en-us/azure/data-factory/how-to-upgrade-your-azure-synapse-analytics-pipelines-to-fabric-data-factory,Upgrade Azure Synapse Analytics pipelines to Fabric,Modernize your Azure Synapse Analytics pipelines with Fabric - Azure Data Factory,Modernize Azure Synapse pipelines to Fabric Data Factory,Learn how to assess and upgrade your Azure Synapse Analytics pipelines to Fabric Data Factory.,"Modernizing your workflows in Microsoft Fabric often starts with bringing your existing Azure Synapse Analytics pipelines forward. The migration experience helps you assess pipeline readiness, understand compatibility gaps, and migrate supported pipelines into a Fabric workspace—so you can move in a controlled, low-risk way.",2026-03-18T22:20:00.000Z,how-to,decision-making,0.7,True,"The page focuses on assessing and upgrading Azure Synapse Analytics pipelines to Fabric Data Factory, including readiness assessment and compatibility gaps to enable controlled, low-risk migration. 
This is specific migration decision guidance between services, aligning with the decision-making sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/data-factory/how-to-use-azure-key-vault-secrets-pipeline-activities,Use Azure Key Vault secrets in pipeline activities,Use Azure Key Vault secrets in pipeline activities - Azure Data Factory,Use Azure Key Vault secrets in ADF pipeline activities,Learn how to fetch stored credentials from Azure Key Vault and use them during data factory pipeline runs.,"APPLIES TO: Azure Data Factory, Azure Synapse Analytics Tip Data Factory in Microsoft Fabric is the next generation of Azure Data Factory, with a simpler architecture, built-in AI, and new features. If you're new to data integration, start with Fabric Data Factory. Existing ADF workloads can upgrade to Fabric to access new capabilities across data science, real-time analytics, and reporting. You can store credentials or secret values in an Azure Key Vault and use them during pipeline execution to pa",2026-03-25T22:12:00.000Z,how-to,security,0.85,True,"This page explains how to reference Key Vault secrets in activity definitions, with concrete JSON fields, reference syntax, and permission requirements, which are detailed security and configuration patterns unique to ADF.",unchanged https://learn.microsoft.com/en-us/azure/data-factory/how-to-use-sql-managed-instance-with-ir,Use Azure SQL Managed Instance with Azure-SSIS IR,Use Azure SQL Managed Instance with Azure-SQL Server Integration Services (SSIS) in Azure Data Factory - Azure Data Factory,Use Azure SQL Managed Instance with Azure-SSIS IR,Learn how to use Azure SQL Managed Instance with SQL Server Integration Services (SSIS) in Azure Data Factory.,"APPLIES TO: Azure Data Factory, Azure Synapse Analytics Tip Try out Data Factory in Microsoft Fabric, an all-in-one analytics solution for enterprises. Microsoft Fabric covers everything from data movement to data science, real-time analytics, business intelligence, and
reporting. Learn how to start a new trial for free! You can now move your SQL Server Integration Services (SSIS) projects, packages, and workloads to the Azure cloud. Deploy, run, and manage SSIS projects and packages on Azure SQL Databa",2024-10-03T08:00:00.000Z,conceptual,architecture-patterns,0.65,True,"Covers detailed, product-specific patterns for hosting SSIS catalog on SQL Managed Instance with Azure-SSIS IR, including configuration choices and migration considerations.",unchanged diff --git a/products/azure-data-factory/report.md b/products/azure-data-factory/report.md index e24c7ed2..025f28cd 100644 --- a/products/azure-data-factory/report.md +++ b/products/azure-data-factory/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-12' +generated_at: '2026-04-26' category_descriptions: best-practices: 'Performance, DataOps, and reliability best practices for ADF: tuning data flows/copy, optimizing sources/sinks/IR, handling schema drift/errors, and @@ -13,9 +13,9 @@ category_descriptions: security: Securing Data Factory with identity, encryption, Key Vault, and Azure Policy, plus network controls like VNets, Private Link, firewalls, private endpoints, and secure runtimes (Azure-SSIS, self-hosted). - integrations: Connector how-tos, patterns, and activities for copying/transforming - data between ADF and many services (databases, SaaS apps, files, Fabric, Databricks, - SSIS, ML, Synapse) using data flows and SDKs. + integrations: Connector how-tos, patterns, and activities for integrating ADF with + dozens of data sources, running external compute (Databricks, Synapse, SSIS, ML), + and building mapping data flow transformations. troubleshooting: 'Diagnosing and fixing ADF issues: connector/format errors, copy & data flow performance, pipelines/triggers, SHIR/SSIS IR problems, security/access, and interpreting failure logs.'
@@ -30,14 +30,12 @@ category_descriptions: skill_description: Expert knowledge for Azure Data Factory development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when - designing ADF pipelines, Mapping Data Flows, SHIR/SSIS IR, secure VNet/Private Link - setups, or CI/CD deployments, and other Azure Data Factory related development tasks. - Not for Azure Synapse Analytics (use azure-synapse-analytics), Azure Databricks - (use azure-databricks), Azure Stream Analytics (use azure-stream-analytics), Azure - Data Explorer (use azure-data-explorer). -use_when: Use when designing ADF pipelines, Mapping Data Flows, SHIR/SSIS IR, secure - VNet/Private Link setups, or CI/CD deployments, and other Azure Data Factory related - development tasks. + building ADF pipelines, mapping data flows, SHIR/SSIS IR, VNet/Private Link, or + CI/CD deployments, and other Azure Data Factory related development tasks. Not for + Azure Synapse Analytics (use azure-synapse-analytics), Azure Databricks (use azure-databricks), + Azure Stream Analytics (use azure-stream-analytics), Azure Data Explorer (use azure-data-explorer). +use_when: Use when building ADF pipelines, mapping data flows, SHIR/SSIS IR, VNet/Private + Link, or CI/CD deployments, and other Azure Data Factory related development tasks. confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics), Azure Databricks (use azure-databricks), Azure Stream Analytics (use azure-stream-analytics), Azure Data Explorer (use azure-data-explorer). 
@@ -54,8 +52,8 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 0 -- **Unchanged**: 505 +- **Updated Pages**: 2 +- **Unchanged**: 503 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-data-factory/azure-data-factory.csv` @@ -76,6 +74,13 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics ## Changes +### Updated Pages + +- [Upgrade Azure Data Factory pipelines to Fabric](https://learn.microsoft.com/en-us/azure/data-factory/how-to-upgrade-your-azure-data-factory-pipelines-to-fabric-data-factory) + - Updated: 2026-03-04T23:27:00.000Z → 2026-04-21T17:17:00.000Z +- [Oracle](https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle) + - Updated: 2026-04-02T08:00:00.000Z → 2026-04-09T08:00:00.000Z + ## Classified Pages | TOC Title | Type | Confidence | Reason | @@ -293,7 +298,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Jira](https://learn.microsoft.com/en-us/azure/data-factory/connector-jira) | integrations | 0.78 | Jira connector pages for Data Factory typically include connector-specific integration details such as supported authentication types, dataset and linked service properties, REST endpoint patterns, query parameters, and property tables with allowed values and defaults. These are product-specific configuration and API parameters for integrating ADF/Synapse with Jira, which qualify as integrations-focused expert knowledge rather than a generic tutorial. | | [MongoDB Atlas](https://learn.microsoft.com/en-us/azure/data-factory/connector-mongodb-atlas) | integrations | 0.78 | Connector pages for ADF typically include detailed linked service, dataset, and copy activity property tables (host, port, authenticationType, connectionString, query, etc.) 
with allowed values and defaults that are specific to this product’s integration with MongoDB Atlas. These are product-specific API/config parameters rather than generic concepts. | | [ORC format](https://learn.microsoft.com/en-us/azure/data-factory/format-orc) | configuration | 0.78 | ORC format support pages for ADF/Synapse typically list dataset and copy activity settings (compressionCodec, stripeSize, rowIndexStride, etc.) with allowed values and defaults, which are explicit configuration parameters rather than generic ORC theory. | -| [Oracle](https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle) | integrations | 0.78 | Connector page for Oracle in Data Factory/Synapse typically includes detailed connector-specific settings (linked service properties, dataset schema mapping, supported data types, query options, partitioning settings, and behavior flags) with parameter names and allowed values. These are product-specific integration details and configuration parameters that go beyond generic knowledge of Oracle or Data Factory, fitting the integrations sub-skill. | | [Presto](https://learn.microsoft.com/en-us/azure/data-factory/connector-presto) | integrations | 0.78 | Presto connector pages provide property tables for linked services/datasets (host, port, catalog, schema, SSL, query) and supported capabilities, which are concrete integration parameters unique to this connector. | | [QuickBooks Online](https://learn.microsoft.com/en-us/azure/data-factory/connector-quickbooks) | integrations | 0.78 | QuickBooks Online connector documentation typically lists connection/auth properties, supported objects, and query/filter parameters in tables, which are product-specific integration details. 
| | [Salesforce Marketing Cloud](https://learn.microsoft.com/en-us/azure/data-factory/connector-salesforce-marketing-cloud) | integrations | 0.78 | Salesforce Marketing Cloud connector pages include connector-specific configuration properties, supported entities, and limitations, which are detailed integration parameters unique to this product. | @@ -330,6 +334,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Surrogate key](https://learn.microsoft.com/en-us/azure/data-factory/data-flow-surrogate-key) | integrations | 0.75 | Surrogate key transformation documentation describes how keys are generated, configuration fields (start value, increment, partition behavior), and constraints that are specific to ADF/Synapse data flows, fitting integration/coding patterns. | | [Window](https://learn.microsoft.com/en-us/azure/data-factory/data-flow-window) | integrations | 0.75 | Window transformation documentation includes product-specific parameters (window type, frame, partitioning, ordering) and expression usage unique to ADF/Synapse data flows, which are detailed integration/coding patterns. | | [Azure Policy built-ins](https://learn.microsoft.com/en-us/azure/data-factory/policy-reference) | security | 0.74 | Lists Azure Policy built-in definitions for Data Factory; these include specific policy names, effects, and scopes that control security/compliance posture, which are product-specific security configuration artifacts. | +| [Oracle](https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle) | integrations | 0.74 | Connector pages for specific sources/sinks (Oracle) typically include product-specific connection properties, supported data types, and configuration parameters (such as connection string options, partitioning settings, and performance-related flags) that are unique to the Azure Data Factory Oracle connector. 
These are concrete integration details rather than just conceptual guidance, fitting the integrations sub-skill. | | [Quickbase](https://learn.microsoft.com/en-us/azure/data-factory/connector-quickbase) | integrations | 0.74 | Quickbase transformation connector docs describe dataset and source/sink properties, supported operations, and connector-specific behaviors in data flows, which are detailed integration and configuration patterns. | | [Smartsheet](https://learn.microsoft.com/en-us/azure/data-factory/connector-smartsheet) | integrations | 0.74 | Smartsheet transformation connector docs describe dataset/source properties, supported sheet/row operations, and connector-specific behaviors in data flows, which are concrete integration and configuration details. | | [Access a secured Microsoft Purview account](https://learn.microsoft.com/en-us/azure/data-factory/how-to-access-secured-purview-account) | security | 0.70 | Covers firewall, private endpoint, and network configuration details for Purview–ADF integration, including security-specific settings and scopes. | @@ -423,7 +428,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Union](https://learn.microsoft.com/en-us/azure/data-factory/data-flow-union) | integrations | 0.70 | Union transformation pages typically document how streams are combined, column resolution rules, and configuration parameters unique to ADF/Synapse data flows, which are product-specific integration details. | | [Unpivot](https://learn.microsoft.com/en-us/azure/data-factory/data-flow-unpivot) | integrations | 0.70 | Unpivot transformation docs usually list configuration fields (input columns, attribute/value columns) and behavior specific to ADF/Synapse data flows, representing concrete integration and transformation patterns. 
| | [Until activity](https://learn.microsoft.com/en-us/azure/data-factory/control-flow-until-activity) | configuration | 0.70 | Documents properties (expression, activities, timeout, delay) and loop behavior; includes timeout semantics and configuration values specific to this activity. | -| [Upgrade Azure Data Factory pipelines to Fabric](https://learn.microsoft.com/en-us/azure/data-factory/how-to-upgrade-your-azure-data-factory-pipelines-to-fabric-data-factory) | decision-making | 0.70 | The page describes an assessment-first, built-in migration experience from Azure Data Factory to Fabric Data Factory, including guidance on assessing pipeline readiness, understanding compatibility, and upgrading at your own pace. This is migration/modernization decision guidance rather than just a conceptual overview, fitting the decision-making sub-skill. | | [Upgrade Azure Synapse Analytics pipelines to Fabric](https://learn.microsoft.com/en-us/azure/data-factory/how-to-upgrade-your-azure-synapse-analytics-pipelines-to-fabric-data-factory) | decision-making | 0.70 | The page focuses on assessing and upgrading Azure Synapse Analytics pipelines to Fabric Data Factory, including readiness assessment and compatibility gaps to enable controlled, low-risk migration. This is specific migration decision guidance between services, aligning with the decision-making sub-skill. | | [Use custom parameters with a Resource Manager template](https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-delivery-resource-manager-custom-parameters) | configuration | 0.70 | Page is about using custom parameters in ARM templates for Azure Data Factory CI/CD. These articles typically list specific ARM parameter names, JSON schema, and how they map to Data Factory properties, which are product-specific configuration details rather than generic ARM concepts. 
| | [Using Azure Data Factory UI](https://learn.microsoft.com/en-us/azure/data-factory/join-azure-ssis-integration-runtime-virtual-network-ui) | security | 0.70 | Joining IR to a VNet via portal involves selecting specific subnets and understanding required network/security settings, which are concrete security-related configuration steps. | @@ -433,6 +437,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Wait activity](https://learn.microsoft.com/en-us/azure/data-factory/control-flow-wait-activity) | configuration | 0.70 | Provides Wait activity property (waitTimeInSeconds) and behavior; specific configuration parameter and its effect on pipeline execution. | | [Webhook activity](https://learn.microsoft.com/en-us/azure/data-factory/control-flow-webhook-activity) | integrations | 0.70 | Documents Webhook activity properties (url, method, headers, body, timeout, reportStatusOnCallBack) and callback contract; detailed integration and configuration parameters. | | [Window functions](https://learn.microsoft.com/en-us/azure/data-factory/data-flow-window-functions) | integrations | 0.70 | Documents window functions and their parameters in ADF/Synapse mapping data flows; function set and syntax are product-specific integration APIs. | +| [Upgrade Azure Data Factory pipelines to Fabric](https://learn.microsoft.com/en-us/azure/data-factory/how-to-upgrade-your-azure-data-factory-pipelines-to-fabric-data-factory) | decision-making | 0.68 | The page focuses on assessing and upgrading Azure Data Factory pipelines to Fabric Data Factory, describing an assessment-first, incremental migration experience. This is migration/upgrade decision guidance (when and how to move existing workloads, how to validate, and how to pace the migration), which fits the decision-making category. 
It goes beyond a conceptual overview by providing product-specific migration flow and considerations, but does not emphasize limits, configuration tables, or troubleshooting details. | | [Access SQL Managed Instance](https://learn.microsoft.com/en-us/azure/data-factory/tutorial-managed-virtual-network-sql-managed-instance) | security | 0.65 | Describes setting up Private Link Service/private endpoints between Data Factory managed VNet and SQL Managed Instance; includes product-specific secure connectivity configuration beyond generic concepts. | | [Access on premises SQL Server](https://learn.microsoft.com/en-us/azure/data-factory/tutorial-managed-virtual-network-on-premise-sql-server) | security | 0.65 | Tutorial for Private Link Service and private endpoints from a managed VNet to on-prem SQL; contains product-specific network/security configuration steps and required resources, which are detailed security integration patterns. | | [Aggregate](https://learn.microsoft.com/en-us/azure/data-factory/data-flow-aggregate) | configuration | 0.65 | Describes Aggregate transformation settings (group by, aggregations, windowing options) with UI fields and expression patterns; product-specific configuration surface. 
| diff --git a/products/azure-database-mysql/azure-database-mysql.csv b/products/azure-database-mysql/azure-database-mysql.csv index d8af68d6..8df9d0dd 100644 --- a/products/azure-database-mysql/azure-database-mysql.csv +++ b/products/azure-database-mysql/azure-database-mysql.csv @@ -1,5 +1,5 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type -https://learn.microsoft.com/en-us/azure/mysql/concepts-version-policy,Version support policy,Version Support Policy,Plan for Azure Database for MySQL version lifecycle,Describes the policy around MySQL major and minor versions in Azure Database for MySQL,"Azure Database for MySQL provides a fully managed database service powered by the MySQL community edition, enabling developers to build and scale applications efficiently. This article outlines the version support policy for Azure Database for MySQL, detailing the lifecycle management, including version availability, updates, and end-of-support timelines. Customers can ensure their applications remain secure, performant, and aligned with the latest MySQL innovations while minimizing disruption d",2026-03-27T22:08:00.000Z,concept-article,decision-making,0.7,True,"A version support policy page typically includes concrete lifecycle details such as specific major/minor versions supported, GA and end-of-support dates, and upgrade timelines. 
These are time-bound, product-specific facts that an LLM will not reliably know from training and are used to decide when to upgrade or which version to choose, fitting decision-making.",unchanged +https://learn.microsoft.com/en-us/azure/mysql/concepts-version-policy,Version support policy,Version Support Policy,Plan MySQL version lifecycle on Azure Database,Describes the policy around MySQL major and minor versions in Azure Database for MySQL,"Azure Database for MySQL provides a fully managed database service powered by the MySQL community edition, enabling developers to build and scale applications efficiently. This article outlines the version support policy for Azure Database for MySQL, detailing the lifecycle management, including version availability, updates, and end-of-support timelines. Customers can ensure their applications remain secure, performant, and aligned with the latest MySQL innovations while minimizing disruption d",2026-04-16T08:00:00.000Z,concept-article,decision-making,0.68,True,"A version support policy page for Azure Database for MySQL typically includes concrete lifecycle details (which MySQL major/minor versions are supported, GA/retirement dates, and upgrade windows). These are product-specific, time-bound details that an LLM won't reliably know from training and are used to decide when to upgrade or which version to choose, fitting decision-making better than other categories.",updated https://learn.microsoft.com/en-us/azure/mysql/flexible-server/azure-pipelines-deploy-database-task,Azure Pipelines,Azure Pipelines Task - Azure Database for MySQL,Configure Azure Pipelines task for Azure MySQL deployments,Enable an Azure Database for MySQL - Flexible Server CLI task for use with Azure Pipelines.,"You can automatically deploy your database updates to Azure Database for MySQL Flexible Server after every successful build with Azure Pipelines.
You can use the Azure CLI task to update the database with either a SQL file or an inline SQL script. You can run this task on cross-platform agents running on Linux, macOS, or Windows operating systems.",2025-07-22T05:04:00.000Z,how-to,deployment,0.7,True,"Describes a specific Azure Pipelines task using Azure CLI to deploy MySQL updates, including task parameters and supported agents—this is CI/CD deployment configuration specific to the product.",unchanged https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concept-monitor-best-practices,Monitoring best practices,Monitoring Best Practices - Azure Database for MySQL,Implement monitoring best practices for MySQL flexible server,This article describes the best practices to monitor Azure Database for MySQL - Flexible Server.,"Learn about the best practices for monitoring your database operations. These practices help ensure that performance stays strong as your data size grows. As we add new capabilities to the platform, we continue to refine the best practices detailed in this section.",2025-08-20T22:07:00.000Z,best-practice,best-practices,0.7,True,Describes concrete monitoring practices and likely specific metrics/thresholds for Azure Database for MySQL Flexible Server.,unchanged https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concept-operation-excellence-best-practices,Operational best practices,Operational Best Practices - Azure Database for MySQL,Apply operational best practices for Azure Database for MySQL,This article describes the best practices to operate your Azure Database for MySQL - Flexible Server database on Azure.,"Learn about the best practices for working with Azure Database for MySQL Flexible Server. 
As we add new capabilities to the platform, we continue to focus on refining the best practices detailed in this section.",2025-11-25T08:00:00.000Z,best-practice,best-practices,0.7,True,"Explicitly described as operational best practices for Azure Database for MySQL Flexible Server; likely includes product-specific recommendations, configuration guidance, and gotchas beyond generic MySQL advice.",unchanged @@ -19,7 +19,7 @@ https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concepts-connectio https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concepts-data-in-replication,Data-in replication,Data-In Replication - Azure Database for MySQL,Design data-in replication into MySQL Flexible Server,Learn about using Data-in replication to synchronize from an external server into an Azure Database for MySQL - Flexible Server instance.,"Data-in replication allows you to synchronize data from an external MySQL server into an Azure Database for MySQL Flexible Server instance. The external server can be on-premises, in virtual machines, Azure Database for MySQL Single Server, or a database service hosted by other cloud providers. Data-in replication is based on the binary log (binlog) file position or GTID-based replication. To learn more about binlog replication, see the MySQL Replication documentation.
Note Configuring Data-in replication for ",2026-01-07T12:03:00.000Z,how-to,architecture-patterns,0.68,True,"Covers Azure-specific replication pattern from external MySQL into Flexible Server, with configuration details and constraints.",unchanged https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concepts-data-out-replication,Data-out replication,Data-Out Replication - Azure Database for MySQL,Design data-out replication from MySQL Flexible Server,Learn about the concepts of data-out replication out of Azure Database for MySQL - Flexible Server to another MySQL server.,"Data-out replication allows you to synchronize data out of an Azure Database for MySQL Flexible Server instance to another MySQL server using MySQL native replication. The MySQL server (replica) can be on-premises, in virtual machines, or a database service hosted by other cloud providers. While Replicate data into Azure Database for MySQL - Flexible Server helps to move data into an Azure Database for MySQL Flexible Server instance (replica), Data-out replication would allow you to transfer data ",2025-01-21T23:00:00.000Z,concept-article,architecture-patterns,0.68,True,Describes replication from Flexible Server to external MySQL; includes pattern-specific configuration and limitations.,unchanged https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concepts-error-logs,Error logs,Error Logs - Azure Database for MySQL - Flexible Server - Azure Database for MySQL,Use MySQL Flexible Server error logs for troubleshooting,This article describes the error logs feature for Azure Database for MySQL - Flexible Server.,"In Azure Database for MySQL - Flexible Server, the error log is available to users to configure and access. Error logs in MySQL gather diagnostic messages during server startup and shutdown, and while the server is running, information that can help proactive troubleshooting.
For more information about the MySQL error log, see the Error log section in the MySQL documentation. Under the preview phase, error logs are available under Server logs only; error logs can't be emitted to Azure Diagnostic logs. In ",2024-12-02T23:02:00.000Z,concept-article,troubleshooting,0.7,True,Error logs article describes how to access and interpret logs for this service; includes product-specific log locations and behaviors.,unchanged -https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concepts-high-availability,High availability,Zone-Redundant High-Availability (HA) - Azure Database for MySQL,,Get a conceptual overview of zone-redundant high-availability in Azure Database for MySQL.,"By using Azure Database for MySQL Flexible Server, you can configure high availability with automatic failover. This solution ensures that failures never cause loss of committed data and that the database isn't a single point of failure in your software architecture. When you configure high availability, Flexible Server automatically provisions and manages a standby Hyper-V replica. You pay for the provisioned compute and storage for both the primary and secondary replicas. Two high-availability",2026-03-30T22:08:00.000Z,concept-article,,0.2,False,"Described as a conceptual overview of high availability for Azure Database for MySQL Flexible Server. Summary focuses on what HA is, automatic failover, and general behavior, without exposing concrete limits, configuration tables, error codes, or decision matrices.
Does not meet any sub-skill detection criteria for expert, configuration-level knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concepts-high-availability,High availability,Zone-Redundant High-Availability (HA) - Azure Database for MySQL,,Get a conceptual overview of zone-redundant high-availability in Azure Database for MySQL.,"By using Azure Database for MySQL Flexible Server, you can configure high availability with automatic failover. This solution ensures that failures never cause loss of committed data and that the database isn't a single point of failure in your software architecture. When you configure high availability, Flexible Server automatically provisions and manages a standby Hyper-V replica. You pay for the provisioned compute and storage for both the primary and secondary replicas. Two high-availability",2026-04-22T08:00:00.000Z,concept-article,,0.2,False,"The page is described as a conceptual overview of zone-redundant high availability for Azure Database for MySQL Flexible Server. From the summary, it explains what HA is, automatic failover, and the existence of primary/standby replicas, but does not indicate specific numeric limits, configuration parameter tables, error codes, or decision matrices. It reads as architecture/feature explanation rather than detailed expert configuration, limits, or troubleshooting content.",updated https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concepts-high-availability-faq,High availability FAQ,High-Availability (HA) - FAQ - Azure Database for MySQL,High availability FAQ and choices for MySQL Flexible Server,FAQ related to high availability in Azure Database for MySQL - Flexible Server.,"High availability is a key feature of Azure Database for MySQL, designed to minimize downtime and ensure your applications remain accessible even during planned maintenance or unexpected outages. 
This article addresses common questions about high availability (HA) options, billing, failover processes, performance impacts, and best practices to help you make informed decisions for your MySQL workloads on Azure.",2025-12-22T12:26:00.000Z,concept-article,decision-making,0.66,True,"FAQ on HA options, billing, performance impact, and best practices; helps choose HA configuration with product-specific details.",unchanged https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concepts-limitations,Limitations,Limitations in Azure Database for MySQL - Flexible Server - Azure Database for MySQL,Service limitations for Azure MySQL Flexible Server,"This article describes limitations in Azure Database for MySQL - Flexible Server, such as the number of connection and storage engine options.","This article describes limitations in Azure Database for MySQL - Flexible Server.General limitationsin the MySQL database engine also apply. If you want to learn about resource limitations (compute, memory, storage), see thearticle about compute and storage.",2025-11-10T12:03:00.000Z,concept-article,limits-quotas,0.9,True,"Explicitly about limitations; such pages typically list numeric limits (connections, engines, features) that are product- and tier-specific.",unchanged https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concepts-maintenance,Service maintenance,Scheduled Maintenance - Azure Database for MySQL,Plan for scheduled maintenance on MySQL Flexible Server,This article describes the scheduled maintenance feature in Azure Database for MySQL.,"Azure Database for MySQL performs periodic maintenance to help keep your managed database secure, stable, and up to date. During maintenance, the server gets new features, updates, and patches. Important Avoid all server operations (modifications, configuration changes, starting/stopping) during Azure Database for MySQL maintenance. 
Engaging in these activities can lead to unpredictable outcomes that might affect server performance and stability. Wait until maintenance concludes before you condu",2025-11-25T08:00:00.000Z,concept-article,deployment,0.65,True,"Scheduled maintenance behavior (windows, duration, impact) is product-specific operational detail that affects deployment and operations planning.",unchanged @@ -112,7 +112,7 @@ https://learn.microsoft.com/en-us/azure/mysql/flexible-server/release-notes/janu https://learn.microsoft.com/en-us/azure/mysql/flexible-server/release-notes/july-2025,July 2025,Release Notes for Azure Database for MySQL Flexible Server - July 2025 - Azure Database for MySQL,,Learn about the release notes for Azure Database for MySQL Flexible Server July 2025.,"We're excited to announce the July 2025 version of Azure Database for MySQL Flexible Server. Starting July 1, 2025, all new servers will automatically be onboarded to this latest version. Existing servers are upgraded during their next scheduled maintenance. If you prefer to upgrade your servers earlier, you can enroll in our Virtual Canary Program by following this link. This new version introduces a range of new features and enhancements, resolves known issues, and includes essential security p",2025-07-15T05:14:00.000Z,release-notes,,0.3,False,"Release notes summary; similar to other notes, high-level without explicit limits or config references in the snippet.",unchanged https://learn.microsoft.com/en-us/azure/mysql/flexible-server/release-notes/june-2024,June 2024,Release Notes for Azure Database for MySQL Flexible Server - June 2024 - Azure Database for MySQL,,Learn about the release notes for Azure Database for MySQL Flexible Server June 2024.,"We're pleased to announce the June 2024 maintenance for Azure Database for MySQL Flexible Server. In this maintenance update, we're addressing some availability issues that have been affecting a subset of our servers.
While most servers remain unaffected, a small portion experiences maintenance activities to enhance their performance and stability. We appreciate your understanding and patience as we work to improve our service.",2024-11-27T08:00:00.000Z,release-notes,,0.2,False,Maintenance note focused on availability; lacks detailed technical mappings in the provided text.,unchanged https://learn.microsoft.com/en-us/azure/mysql/flexible-server/release-notes/march-2025,March 2025,Release Notes for Azure Database for MySQL - March 2025 - Azure Database for MySQL,,Learn about the release notes for Azure Database for MySQL Flexible Server March 2025.,"We're excited to announce the March 2025 version of Azure Database for MySQL. All new servers are automatically onboarded to the latest version. Existing servers are upgraded during their next scheduled maintenance. If you prefer to upgrade your servers earlier, you can enroll in our Virtual Canary Program by visiting Scheduled maintenance in Azure Database for MySQL. This new version introduces a range of new features and enhancements, resolves known issues, and includes essential security patch",2025-04-23T17:01:00.000Z,release-notes,,0.3,False,"Release notes summary; no explicit expert-level configuration, limits, or troubleshooting mappings in the snippet.",unchanged -https://learn.microsoft.com/en-us/azure/mysql/flexible-server/release-notes/march-2026,March 2026,Release Notes for Azure Database for MySQL Flexible Server - March 2026 - Azure Database for MySQL,,Learn about the release notes for Azure Database for MySQL Flexible Server March 2026.,"The March 2026 version of Azure Database for MySQL Flexible Server is now available. Starting March 31, 2026, all new servers will automatically use this version. Existing servers upgrade during their next scheduled maintenance. To upgrade your servers earlier, enroll in the Virtual Canary Program.
This version introduces enhancements, resolves known problems, and includes important security patches to ensure optimal performance and security.",2026-04-15T22:09:00.000Z,release-notes,,0.2,False,"Release notes summary mentions a new version, upgrade timing, and general improvements but no visible detailed limits, configuration parameters, error codes, or decision matrices. Without the full changelog content (e.g., specific fixes, config values, or error mappings), it does not clearly expose expert-knowledge patterns defined in the sub-skill types.",updated +https://learn.microsoft.com/en-us/azure/mysql/flexible-server/release-notes/march-2026,March 2026,Release Notes for Azure Database for MySQL Flexible Server - March 2026 - Azure Database for MySQL,,Learn about the release notes for Azure Database for MySQL Flexible Server March 2026.,"The March 2026 version of Azure Database for MySQL Flexible Server is now available. Starting March 31, 2026, all new servers will automatically use this version. Existing servers upgrade during their next scheduled maintenance. To upgrade your servers earlier, enroll in the Virtual Canary Program. This version introduces enhancements, resolves known problems, and includes important security patches to ensure optimal performance and security.",2026-04-15T22:09:00.000Z,release-notes,,0.2,False,"Release notes summary mentions a new version, upgrade timing, and general improvements but no visible detailed limits, configuration parameters, error codes, or decision matrices.
Without the full changelog content (e.g., specific fixes, config values, or error mappings), it does not clearly expose expert-knowledge patterns defined in the sub-skill types.",unchanged https://learn.microsoft.com/en-us/azure/mysql/flexible-server/release-notes/may-2024,May 2024,Release Notes for Azure Database for MySQL Flexible Server - May 2024 - Azure Database for MySQL,,Learn about the release notes for Azure Database for MySQL Flexible Server May 2024.,"We're pleased to announce the May 2024 maintenance for Azure Database for MySQL Flexible Server. This maintenance incorporates several new features and improvements, along with known issue fixes and security patches.",2024-12-02T23:02:00.000Z,release-notes,,0.2,False,Maintenance summary; no explicit expert-level configuration or limits content shown.,unchanged https://learn.microsoft.com/en-us/azure/mysql/flexible-server/release-notes/may-2025,May 2025,Release Notes for Azure Database for MySQL Flexible Server - May 2025 - Azure Database for MySQL,,Learn about the release notes for Azure Database for MySQL Flexible Server May 2025.,"We're excited to announce the May 2025 version of Azure Database for MySQL Flexible Server. Starting May 1, 2025, all new servers will automatically be onboarded to this latest version. Existing servers are upgraded during their next scheduled maintenance. If you prefer to upgrade your servers earlier, you can enroll in our Virtual Canary Program by following this link.
This new version introduces a range of new features and enhancements, resolves known issues, and includes essential security pat",2025-06-17T22:06:00.000Z,release-notes,,0.3,False,Release notes summary; only general mention of features and fixes in the provided text.,unchanged https://learn.microsoft.com/en-us/azure/mysql/flexible-server/release-notes/october-2024,October 2024,Release Notes for Azure Database for MySQL Flexible Server - October 2024 - Azure Database for MySQL,,Learn about the release notes for Azure Database for MySQL Flexible Server October 2024.,"We're pleased to announce the October 2024 maintenance for Azure Database for MySQL Flexible Server. This maintenance incorporates several new features and improvements, along with known issue fixes and security patches.",2024-12-02T23:02:00.000Z,release-notes,,0.25,False,"Maintenance release note summary; no detailed limits, configs, or error codes in the provided text.",unchanged diff --git a/products/azure-database-mysql/report.md b/products/azure-database-mysql/report.md index fbcde819..86e45b51 100644 --- a/products/azure-database-mysql/report.md +++ b/products/azure-database-mysql/report.md @@ -1,9 +1,9 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: - decision-making: 'Guidance on MySQL Flexible Server planning: version lifecycle, - HA/BCDR, performance features, sizing and pricing, and assessing, choosing methods - for, and executing migrations to Azure.' + decision-making: Guidance on MySQL version planning, HA/BCDR choices, sizing and + tiers, performance features, and end-to-end migration/upgrade strategies to Azure + Database for MySQL. deployment: 'Automating MySQL Flexible Server deployments and maintenance: CI/CD with Azure Pipelines/GitHub Actions, backups/geo-restore, major version upgrades, and scheduled/automated management tasks.'
@@ -31,17 +31,17 @@ category_descriptions: skill_description: Expert knowledge for Azure Database for MySQL development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. - Use when planning MySQL Flexible Server HA/BCDR, CI/CD deployments, VNet/Private - Link, read replicas, or AKS connectivity, and other Azure Database for MySQL related - development tasks. Not for Azure Database for MariaDB (use azure-database-mariadb), + Use when deploying MySQL Flexible Server, configuring VNet/Private Link, HA/read + replicas, backups/restore, or AKS connectivity, and other Azure Database for MySQL + related development tasks. Not for Azure Database for MariaDB (use azure-database-mariadb), Azure Database for PostgreSQL (use azure-database-postgresql), Azure SQL Database - (use azure-sql-database), Azure SQL Managed Instance (use azure-sql-managed-instance). -use_when: Use when planning MySQL Flexible Server HA/BCDR, CI/CD deployments, VNet/Private - Link, read replicas, or AKS connectivity, and other Azure Database for MySQL related - development tasks. + (use azure-sql-database), SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines). +use_when: Use when deploying MySQL Flexible Server, configuring VNet/Private Link, + HA/read replicas, backups/restore, or AKS connectivity, and other Azure Database + for MySQL related development tasks. confusable_not_for: Not for Azure Database for MariaDB (use azure-database-mariadb), Azure Database for PostgreSQL (use azure-database-postgresql), Azure SQL Database - (use azure-sql-database), Azure SQL Managed Instance (use azure-sql-managed-instance). + (use azure-sql-database), SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines). 
---
# Azure Database for MySQL Crawl Report
@@ -55,8 +55,8 @@ confusable_not_for: Not for Azure Database for MariaDB (use azure-database-maria

### Incremental Update

- **New Pages**: 0
-- **Updated Pages**: 1
-- **Unchanged**: 178
+- **Updated Pages**: 2
+- **Unchanged**: 177
- **Deleted Pages**: 0
- **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-database-mysql/azure-database-mysql.csv`
@@ -79,8 +79,10 @@ confusable_not_for: Not for Azure Database for MariaDB (use azure-database-maria

### Updated Pages

-- [March 2026](https://learn.microsoft.com/en-us/azure/mysql/flexible-server/release-notes/march-2026)
-  - Updated: 2026-04-09T22:12:00.000Z → 2026-04-15T22:09:00.000Z
+- [Version support policy](https://learn.microsoft.com/en-us/azure/mysql/concepts-version-policy)
+  - Updated: 2026-03-27T22:08:00.000Z → 2026-04-16T08:00:00.000Z
+- [High availability](https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concepts-high-availability)
+  - Updated: 2026-03-30T22:08:00.000Z → 2026-04-22T08:00:00.000Z

## Classified Pages

@@ -151,9 +153,9 @@ confusable_not_for: Not for Azure Database for MariaDB (use azure-database-maria
| [Service tiers](https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concepts-service-tiers-storage) | decision-making | 0.70 | Service tiers article typically includes vCore/memory/storage characteristics per tier and guidance on when to choose Burstable, General Purpose, or Memory-Optimized, which is SKU/tier selection decision-making. |
| [Troubleshooting best practices](https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concepts-troubleshooting-best-practices) | best-practices | 0.70 | Offers specific recommendations to keep databases running smoothly and design schemas for performance, tailored to Azure Database for MySQL Flexible Server. |
| [Tune performance using the sys_schema](https://learn.microsoft.com/en-us/azure/mysql/flexible-server/how-to-troubleshoot-sys-schema) | best-practices | 0.70 | Provides concrete queries and patterns using sys_schema to find performance problems and maintain databases, which are actionable, product-specific practices. |
-| [Version support policy](https://learn.microsoft.com/en-us/azure/mysql/concepts-version-policy) | decision-making | 0.70 | A version support policy page typically includes concrete lifecycle details such as specific major/minor versions supported, GA and end-of-support dates, and upgrade timelines. These are time-bound, product-specific facts that an LLM will not reliably know from training and are used to decide when to upgrade or which version to choose, fitting decision-making. |
| [Data-in replication](https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concepts-data-in-replication) | architecture-patterns | 0.68 | Covers Azure-specific replication pattern from external MySQL into Flexible Server, with configuration details and constraints. |
| [Data-out replication](https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concepts-data-out-replication) | architecture-patterns | 0.68 | Describes replication from Flexible Server to external MySQL; includes pattern-specific configuration and limitations. |
+| [Version support policy](https://learn.microsoft.com/en-us/azure/mysql/concepts-version-policy) | decision-making | 0.68 | A version support policy page for Azure Database for MySQL typically includes concrete lifecycle details (which MySQL major/minor versions are supported, GA/retirement dates, and upgrade windows). These are product-specific, time-bound details that an LLM won't reliably know from training and are used to decide when to upgrade or which version to choose, fitting decision-making better than other categories. |
| [High availability FAQ](https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concepts-high-availability-faq) | decision-making | 0.66 | FAQ on HA options, billing, performance impact, and best practices; helps choose HA configuration with product-specific details. |
| [Alerts](https://learn.microsoft.com/en-us/azure/mysql/flexible-server/how-to-alert-on-metric) | configuration | 0.65 | Describes configuring metric alerts with thresholds and stateful behavior; product-specific monitoring configuration. |
| [Assessment](https://learn.microsoft.com/en-us/azure/mysql/migrate/mysql-on-premises-azure-db/03-assessment) | decision-making | 0.65 | Assessment phase for migration typically includes concrete criteria (workload characteristics, sizing, compatibility) and guidance to decide readiness and target options; this is migration decision-making content. |
@@ -257,7 +259,7 @@ confusable_not_for: Not for Azure Database for MariaDB (use azure-database-maria
| [Azure portal](https://learn.microsoft.com/en-us/azure/mysql/flexible-server/quickstart-create-server-portal) | 0.20 | Portal quickstart for server creation; mostly step-by-step UI instructions without deep config matrices. |
| [Backup & restore concepts](https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concepts-backup-restore) | 0.20 | Characterized as concepts of backup and restore for Azure Database for MySQL Flexible Server. Summary discusses automatic backups, storage redundancy, and point-in-time restore at a high level, but does not show specific retention numbers, configuration parameters, or troubleshooting details. No qualifying limits, configuration tables, or decision guidance are evident. |
| [Fabric mirroring](https://learn.microsoft.com/en-us/azure/mysql/integration/fabric-mirroring-mysql) | 0.20 | From the summary, this page is a conceptual overview of mirroring Azure Database for MySQL into Microsoft Fabric/OneLake and its benefits (BI, AI, data engineering). It does not indicate the presence of specific limits, configuration tables, error codes, or decision matrices. Without evidence of concrete parameters, quotas, or troubleshooting mappings, it does not meet the expert-knowledge criteria for any sub-skill type. |
-| [High availability](https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concepts-high-availability) | 0.20 | Described as a conceptual overview of high availability for Azure Database for MySQL Flexible Server. Summary focuses on what HA is, automatic failover, and general behavior, without exposing concrete limits, configuration tables, error codes, or decision matrices. Does not meet any sub-skill detection criteria for expert, configuration-level knowledge. |
+| [High availability](https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concepts-high-availability) | 0.20 | The page is described as a conceptual overview of zone-redundant high availability for Azure Database for MySQL Flexible Server. From the summary, it explains what HA is, automatic failover, and the existence of primary/standby replicas, but does not indicate specific numeric limits, configuration parameter tables, error codes, or decision matrices. It reads as architecture/feature explanation rather than detailed expert configuration, limits, or troubleshooting content. |
| [June 2024](https://learn.microsoft.com/en-us/azure/mysql/flexible-server/release-notes/june-2024) | 0.20 | Maintenance note focused on availability; lacks detailed technical mappings in the provided text. |
| [March 2026](https://learn.microsoft.com/en-us/azure/mysql/flexible-server/release-notes/march-2026) | 0.20 | Release notes summary mentions a new version, upgrade timing, and general improvements but no visible detailed limits, configuration parameters, error codes, or decision matrices. Without the full changelog content (e.g., specific fixes, config values, or error mappings), it does not clearly expose expert-knowledge patterns defined in the sub-skill types. |
| [May 2024](https://learn.microsoft.com/en-us/azure/mysql/flexible-server/release-notes/may-2024) | 0.20 | Maintenance summary; no explicit expert-level configuration or limits content shown. |
diff --git a/products/azure-database-postgresql/azure-database-postgresql.csv b/products/azure-database-postgresql/azure-database-postgresql.csv
index 1439b50e..2cbae21c 100644
--- a/products/azure-database-postgresql/azure-database-postgresql.csv
+++ b/products/azure-database-postgresql/azure-database-postgresql.csv
@@ -37,7 +37,7 @@ https://learn.microsoft.com/en-us/azure/postgresql/configure-maintain/azure-pipe
https://learn.microsoft.com/en-us/azure/postgresql/configure-maintain/concepts-azure-advisor-recommendations,Azure Advisor recommendations,Azure Advisor - Azure Database for PostgreSQL,,Learn about Azure Advisor recommendations for your Azure Database for PostgreSQL flexible server instance.,Learn about how Azure Advisor is applied to an Azure Database for PostgreSQL flexible server instance and get answers to common questions.,2025-12-22T23:02:00.000Z,concept-article,,0.3,False,"Advisor recommendations overview and FAQ; no concrete numeric limits, config tables, or error-code-based troubleshooting.",unchanged
https://learn.microsoft.com/en-us/azure/postgresql/configure-maintain/concepts-limits,Limits,Limits in Azure Database for PostgreSQL - Azure Database for PostgreSQL,Review capacity and functional limits for Azure PostgreSQL,"This article describes limits in Azure Database for PostgreSQL flexible server instances, such as the number of connections and storage engine options.","The following sections describe capacity and functional limits for Azure Database for PostgreSQL flexible server instances. If you'd like to learn about resource (compute, memory, storage) tiers, see thecomputeandstoragearticles.",2026-01-20T08:00:00.000Z,concept-article,limits-quotas,0.95,True,"Dedicated limits article; contains specific numeric limits (connections, storage, etc.) and capacity constraints unique to Azure PostgreSQL flexible server.",unchanged
https://learn.microsoft.com/en-us/azure/postgresql/configure-maintain/concepts-logical,Logical replication and logical decoding,Logical replication and logical decoding - Azure Database for PostgreSQL,,Learn about using logical replication and logical decoding in Azure Database for PostgreSQL flexible server instances.,An Azure Database for PostgreSQL flexible server instance supports the following logical data extraction and replication methodologies: Logical replication Logical decodingwhich is implemented bydecodingthe content of write-ahead log (WAL).,2025-12-22T23:02:00.000Z,concept-article,,0.4,False,"Conceptual explanation of logical replication and decoding; summary does not indicate detailed configuration tables, limits, or product-specific thresholds.",unchanged
-https://learn.microsoft.com/en-us/azure/postgresql/configure-maintain/concepts-maintenance,Scheduled maintenance,Scheduled Maintenance - Azure Database for PostgreSQL,Configure and plan scheduled maintenance for Azure PostgreSQL,This article describes the scheduled maintenance feature in your Azure Database for PostgreSQL flexible server instances.,"Your Azure Database for PostgreSQL flexible server instance periodically performs maintenance operations to help keep your managed database secure, stable, and up to date. During maintenance, the server gets new features, updates, and patches. Important Avoid all server operations (modifications, configuration changes, starting/stopping the server) during Azure Database for PostgreSQL flexible server instance maintenance. Engaging in these activities can lead to unpredictable outcomes and possib",2026-01-20T08:00:00.000Z,concept-article,configuration,0.7,True,"Explains maintenance behavior and constraints (e.g., avoid operations during maintenance); includes Azure-specific operational guidance and possibly scheduling parameters.",unchanged
+https://learn.microsoft.com/en-us/azure/postgresql/configure-maintain/concepts-maintenance,Scheduled maintenance,Scheduled Maintenance - Azure Database for PostgreSQL,,This article describes the scheduled maintenance feature in your Azure Database for PostgreSQL flexible server instances.,"Your Azure Database for PostgreSQL flexible server instance periodically performs maintenance operations to help keep your managed database secure, stable, and up to date. During maintenance, the server gets new features, updates, and patches. Important Avoid all server operations (modifications, configuration changes, starting/stopping the server) during Azure Database for PostgreSQL flexible server instance maintenance. Engaging in these activities can lead to unpredictable outcomes and possib",2026-04-22T17:15:00.000Z,concept-article,,0.3,False,"Primarily a conceptual/behavioral description of scheduled maintenance for Azure Database for PostgreSQL flexible server. The summary indicates guidance like avoiding operations during maintenance but does not clearly indicate presence of numeric limits, configuration parameter tables, error codes, or decision matrices. Without evidence of specific values, ranges, or product-specific configuration tables, it does not meet the expert-knowledge criteria for any sub-skill type.",updated
https://learn.microsoft.com/en-us/azure/postgresql/configure-maintain/concepts-major-version-upgrade,Major version upgrade,Major Version Upgrades - Azure Database for PostgreSQL,Plan and execute PostgreSQL major version upgrades,Learn how to use Azure Database for PostgreSQL to do in-place major version upgrades of PostgreSQL on a server.,"Your Azure Database for PostgreSQL flexible server instance supports PostgreSQL versions 18, 17, 16, 15, 14, 13, 12, 11. The Postgres community releases a new major version that contains new features about once a year. Additionally, each major version receives periodic bug fixes in the form of minor releases. Minor version upgrades include changes that are backward compatible with existing applications. An Azure Database for PostgreSQL flexible server instance periodically updates the minor vers",2026-04-09T22:12:00.000Z,concept-article,decision-making,0.68,True,"The page is about in-place major version upgrades for Azure Database for PostgreSQL flexible server, including which PostgreSQL major versions are supported and how/when to upgrade. This is product-specific upgrade and lifecycle guidance that helps decide when and how to move between versions, which falls under decision-making. It goes beyond generic concepts by tying to Azure’s supported versions and upgrade behavior, which an LLM is unlikely to know in detail from training.",unchanged
https://learn.microsoft.com/en-us/azure/postgresql/configure-maintain/concepts-reserved-pricing,Prepay for reserved capacity,Prepay for compute resources with reserved capacity - Azure Database for PostgreSQL,Decide on reserved capacity purchases for Azure PostgreSQL,Learn about reserved compute pricing and how to purchase Azure Database for PostgreSQL reserved capacity.,"Azure Database for PostgreSQL helps you save money by prepaying for compute resources, compared to pay-as-you-go prices. With reserved capacity, you make an upfront commitment on Azure Database for PostgreSQL for a one-year or three-year period. This commitment gives you a significant discount on the compute costs. To purchase Azure Database for PostgreSQL reserved capacity, you need to specify the Azure region, deployment type, performance tier, and term.",2025-12-22T23:02:00.000Z,how-to,decision-making,0.7,True,"Explains reserved capacity with term options and required inputs (region, tier, term); helps choose pricing model and capacity commitment for workloads.",unchanged
https://learn.microsoft.com/en-us/azure/postgresql/configure-maintain/concepts-servers,Servers concepts,Server Concepts for Flexible Server - Azure Database for PostgreSQL,Apply server configuration concepts for Azure PostgreSQL flexible server,This article provides considerations and guidelines for configuring and managing Azure Database for PostgreSQL flexible server instances.,This article provides considerations and guidelines for working with an Azure Database for PostgreSQL flexible server instance.,2025-12-22T23:02:00.000Z,concept-article,configuration,0.6,True,Provides considerations and guidelines for configuring/managing servers; likely includes Azure-specific configuration options and operational constraints.,unchanged
@@ -107,10 +107,10 @@ You might also want to refer to the officialREADMEof the project.",2025-12-22T23
https://learn.microsoft.com/en-us/azure/postgresql/extensions/how-to-view-installed-extensions,View installed extensions,View installed extensions - Azure Database for PostgreSQL,View installed PostgreSQL extensions and versions on Azure,This article describes how to view installed extensions in an Azure Database for PostgreSQL flexible server instance.,"You might want to view the extensions that are installed in an Azure Database for PostgreSQL flexible server instance, and their corresponding versions.",2025-10-01T05:05:00.000Z,how-to,configuration,0.65,True,Explains how to list installed extensions and their versions; involves specific queries or portal/CLI parameters tied to Azure PostgreSQL.,unchanged
https://learn.microsoft.com/en-us/azure/postgresql/extensions/quickstart-azure-storage-extension,Quickstart examples,Quickstart Examples for Azure Storage Extension - Azure Database for PostgreSQL,Use Azure Storage extension examples for PostgreSQL,Learn how to use the Azure Storage extension in Azure Database for PostgreSQL to import and export data.,Following is a list of examples to help you learn how to use the Azure Storage extension.,2025-12-22T23:02:00.000Z,reference,integrations,0.65,True,Quickstart examples for Azure Storage extension; likely show concrete function calls and parameter usage for integrating PostgreSQL with Azure Storage.,unchanged
https://learn.microsoft.com/en-us/azure/postgresql/extensions/reference-azure-storage-extension,Reference,Function Reference for Azure Storage Extension - Azure Database for PostgreSQL,Reference for Azure Storage extension functions in PostgreSQL,Learn everything about the functions provided by the Azure Storage extension in Azure Database for PostgreSQL.,"Following is the list of functions provided by the Azure Storage extension: Function that allows adding a storage account, and its associated access key, to the list of storage accounts that theazure_storageextension can access. If a previous invocation of this function already added the reference to this storage account, it doesn't add a new entry but instead updates the access key of the existing entry. Note This function doesn't validate if the referred account name exists or if it's accessib",2025-12-22T23:02:00.000Z,reference,integrations,0.9,True,"Function reference with detailed signatures, parameters, and behavior for azure_storage; this is a product-specific integration API surface.",unchanged
-https://learn.microsoft.com/en-us/azure/postgresql/high-availability/concepts-high-availability,High availability,High Availability in Azure Database for PostgreSQL - Azure Database for PostgreSQL,,This article describes high availability on an Azure Database for PostgreSQL flexible server instance.,"Azure Database for PostgreSQL supports high availability by provisioning physically separated primary and standby replicas. This high availability model is designed to ensure that committed data is never lost during failures. In a high availability (HA) setup, data is synchronously committed to both the primary and standby servers. The model is designed so that the database doesn't become a single point of failure in your software architecture. By default in most regions, your standby replica is",2026-04-17T08:00:00.000Z,how-to,,0.3,False,"Content appears to be a conceptual explanation of high availability behavior for Azure Database for PostgreSQL flexible server (primary/standby model, synchronous commit, avoiding single point of failure). The summary does not indicate presence of specific numeric limits, configuration parameter tables, error codes, or decision matrices with thresholds. It reads as an architecture/behavior overview rather than detailed, product-specific expert guidance with concrete values.",updated
+https://learn.microsoft.com/en-us/azure/postgresql/high-availability/concepts-high-availability,High availability,High Availability in Azure Database for PostgreSQL - Azure Database for PostgreSQL,,This article describes high availability on an Azure Database for PostgreSQL flexible server instance.,"Azure Database for PostgreSQL supports high availability by provisioning physically separated primary and standby replicas. This high availability model is designed to ensure that committed data is never lost during failures. In a high availability (HA) setup, data is synchronously committed to both the primary and standby servers. The model is designed so that the database doesn't become a single point of failure in your software architecture. By default in most regions, your standby replica is",2026-04-17T08:00:00.000Z,how-to,,0.3,False,"Content appears to be a conceptual explanation of high availability behavior for Azure Database for PostgreSQL flexible server (primary/standby model, synchronous commit, avoiding single point of failure). The summary does not indicate presence of specific numeric limits, configuration parameter tables, error codes, or decision matrices with thresholds. It reads as an architecture/behavior overview rather than detailed, product-specific expert guidance with concrete values.",unchanged
https://learn.microsoft.com/en-us/azure/postgresql/high-availability/how-to-configure-high-availability,Configure high availability,Configure High Availability - Azure Database for PostgreSQL,Configure high availability for Azure PostgreSQL flexible server,This article describes how to configure and operate high availability on an Azure Database for PostgreSQL flexible server instance.,"This article describes how to enable or disable high availability (HA) on your Azure Database for PostgreSQL flexible server instance by using the Azure portal or the Azure CLI. The information applies whether you're using instances in the same zone or using a zone-redundant deployment model. The high-availability feature deploys physically separate primary and standby replicas. You can provision the replicas within the same availability zone or in different zones, depending on the deployment mo",2026-01-14T18:46:00.000Z,how-to,configuration,0.7,True,"Details HA configuration options (same-zone vs zone-redundant, replica settings) and likely specific parameters in portal/CLI, which are product-specific configuration settings.",unchanged
https://learn.microsoft.com/en-us/azure/postgresql/high-availability/how-to-monitor-high-availability,Monitor high availability,High Availability (HA) Health Status Monitoring - Azure Database for PostgreSQL,Interpret HA health states for Azure PostgreSQL servers,This article describes how to monitor the health of HA-enabled instances for Azure Database for PostgreSQL flexible server using Azure Resource Health.,"Azure Database for PostgreSQL flexible server includes a High Availability (HA) Health Status Monitoring feature, which uses Azure's Resource Health Check (RHC) framework. This service provides continuous insights into the health of HA-enabled instances, notifying you of events that might affect connectivity and availability. The following details each health state and associated scenarios to help you troubleshoot and maintain HA stability.",2025-12-22T23:02:00.000Z,how-to,troubleshooting,0.7,True,"Describes HA Health Status Monitoring with distinct health states and associated scenarios to help troubleshoot and maintain HA stability. This implies symptom → state → cause guidance specific to Azure Resource Health for PostgreSQL, which is product-specific troubleshooting knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/postgresql/integration/concepts-fabric-mirroring,Fabric mirroring,Mirroring in Microsoft Fabric - Azure Database for PostgreSQL,,Learn about Mirroring in Microsoft Fabric for Azure Database for PostgreSQL flexible server instances.,"Mirroring in Fabric(now generally available) provides an easy experience to avoid complex ETL (Extract Transform Load) and integrate your existing Azure Database for PostgreSQL estate with the rest of your data in Microsoft Fabric. You can continuously replicate your existing Azure Database for PostgreSQL directly into Fabric OneLake. Inside Fabric, you can unlock powerful business intelligence, artificial intelligence, Data Engineering, Data Science, and data sharing scenarios. Important Newly ",2025-12-22T23:02:00.000Z,concept-article,,0.4,False,"Conceptual overview of Fabric mirroring; summary emphasizes benefits and scenarios, not configuration parameters, limits, or decision matrices.",unchanged
+https://learn.microsoft.com/en-us/azure/postgresql/integration/concepts-fabric-mirroring,Fabric mirroring,Mirroring in Microsoft Fabric - Azure Database for PostgreSQL,,Learn about Mirroring in Microsoft Fabric for Azure Database for PostgreSQL flexible server instances.,"Mirroring in Fabric(now generally available) provides an easy experience to avoid complex ETL (Extract Transform Load) and integrate your existing Azure Database for PostgreSQL estate with the rest of your data in Microsoft Fabric. You can continuously replicate your existing Azure Database for PostgreSQL directly into Fabric OneLake. Inside Fabric, you can unlock powerful business intelligence, artificial intelligence, Data Engineering, Data Science, and data sharing scenarios. Important Newly ",2026-04-22T08:00:00.000Z,concept-article,,0.2,False,"Appears to be a conceptual overview of Fabric mirroring for Azure Database for PostgreSQL, focused on benefits and high-level behavior. No clear indication of numeric limits, configuration parameter tables, error-code-based troubleshooting, or decision matrices from the summary.",updated
https://learn.microsoft.com/en-us/azure/postgresql/integration/connect-with-power-bi-desktop,Power BI,Connect with Power BI - Azure Database for PostgreSQL,,This article shows how to build Power BI reports from data on your Azure Database for PostgreSQL flexible server instance.,"In this Quickstart, you learn how to connect to an Azure Database for PostgreSQL flexible server instance with Power BI Desktop. With Power BI Desktop, you can visually explore your data through a free-form drag-and-drop canvas, a broad range of modern data visualizations, and an easy-to-use report authoring experience. You can import directly from the tables or import from a SELECT query. This article applies to Power BI Desktop only. Currently, Power Query online or Power BI Service isnot supp",2025-12-22T23:02:00.000Z,quickstart,,0.3,False,"Basic connection quickstart between Power BI Desktop and PostgreSQL; no indication of detailed connector parameters, limits, or troubleshooting mappings.",unchanged
https://learn.microsoft.com/en-us/azure/postgresql/integration/create-automation-tasks,Manage Azure resources with automation tasks,Stop/start automation tasks - Azure Database for PostgreSQL,,This article describes how to stop/start an Azure Database for PostgreSQL flexible server instance by using automation tasks.,"Note If a free trial is available, you can find more information about ithere. You can create automation tasks for your Azure Database for PostgreSQL flexible server instance to start or stop the server on a predefined schedule. Set the Interval and Frequency values on the task's Configure tab to automatically start or stop the server a specific number of times every day, week, or month. The automation task continues to work until you delete or disable the task. You can also set up automation ta",2025-12-22T23:02:00.000Z,quickstart,,0.4,False,"Quickstart-style how-to for creating automation tasks; summary shows only basic scheduling concepts (Interval/Frequency) without detailed configuration tables, limits, or product-specific edge cases.",unchanged
https://learn.microsoft.com/en-us/azure/postgresql/integration/how-to-connect-data-factory,Connect to Azure Data Factory and Azure Synapse Analytics,How to Connect from Azure Data Factory to Azure Database for PostgreSQL - Azure Database for PostgreSQL,Configure Azure Data Factory connector for PostgreSQL,Guide on how to connect an Azure Database for PostgreSQL flexible server instance from Azure Data Factory using the Azure Database for PostgreSQL connector,"Important The Azure Database for PostgreSQL version 2.0 provides an improved native Azure Database for PostgreSQL support. If you use the Azure Database for PostgreSQL version 1.0 in your solution, you should upgrade your Azure Database for PostgreSQL linked service at your earliest convenience.",2025-12-22T23:02:00.000Z,how-to,integrations,0.7,True,"Connector-focused article for Azure Data Factory with versioned connector guidance; such pages typically include linked service property names, required parameters, and version-specific behavior that are product-specific integration details.",unchanged
@@ -146,11 +146,11 @@ https://learn.microsoft.com/en-us/azure/postgresql/migrate/migration-service/tut
https://learn.microsoft.com/en-us/azure/postgresql/migrate/migration-service/tutorial-migration-service-iaas-online,From an Azure VM or an on-premises PostgreSQL server,"Migrate Online, from an Azure VM or an On-Premises PostgreSQL to Azure Database for PostgreSQL, Using the Migration Service in Azure - Azure Database for PostgreSQL",,"Learn to migrate, seamlessly and in online mode, from an Azure VM or an on-premises PostgreSQL to Azure Database for PostgreSQL, using the migration service in Azure.",This article guides you in migrating a PostgreSQL instance from your on-premises or Azure virtual machines (VMs) to Azure Database for PostgreSQL flexible server in online mode. The migration service in Azure Database for PostgreSQL is a fully managed service integrated into the Azure portal and Azure CLI. It's designed to simplify your migration journey to the Azure Database for PostgreSQL flexible server.,2025-07-28T05:31:00.000Z,tutorial,,0.4,False,Online migration from VM/on-prem tutorial; likely focused on steps rather than detailed config parameter catalogs or error-code mappings.,unchanged
https://learn.microsoft.com/en-us/azure/postgresql/migrate/migration-service/tutorial-migration-service-rds-offline,From Amazon RDS,"Migrate Offline, from an Amazon RDS for PostgreSQL to Azure Database for PostgreSQL, Using the Migration Service in Azure - Azure Database for PostgreSQL",,"Learn to migrate, seamlessly and in offline mode, from an Amazon RDS for PostgreSQL to Azure Database for PostgreSQL, using the migration service in Azure.",This article guides you in migrating an Amazon RDS for PostgreSQL instance to Azure Database for PostgreSQL flexible server in offline mode. The migration service in Azure Database for PostgreSQL is a fully managed service integrated into the Azure portal and Azure CLI. It's designed to simplify your migration journey to the Azure Database for PostgreSQL flexible server.,2025-07-28T05:31:00.000Z,tutorial,,0.4,False,Offline migration from Amazon RDS tutorial; primarily walkthrough content rather than configuration matrices or troubleshooting catalogs.,unchanged
https://learn.microsoft.com/en-us/azure/postgresql/migrate/migration-service/tutorial-migration-service-rds-online,From Amazon RDS,"Migrate Online, from an Amazon RDS for PostgreSQL to Azure Database for PostgreSQL, Using the Migration Service in Azure - Azure Database for PostgreSQL",,"Learn to migrate, seamlessly and in online mode, from an Amazon RDS for PostgreSQL to Azure Database for PostgreSQL, using the migration service in Azure.",This article guides you in migrating a PostgreSQL instance from your on-premises or Azure virtual machines (VMs) to Azure Database for PostgreSQL flexible server in online mode. The migration service in Azure Database for PostgreSQL is a fully managed service integrated into the Azure portal and Azure CLI. It's designed to simplify your migration journey to the Azure Database for PostgreSQL flexible server.,2025-07-28T05:31:00.000Z,tutorial,,0.35,False,Online migration from Amazon RDS tutorial; appears similar in nature to other how-to migration guides.,unchanged
-https://learn.microsoft.com/en-us/azure/postgresql/migrate/oracle-application-conversions/app-conversions-best-practices,Application conversion best practices,Oracle to PostgreSQL Application Conversion: Best Practices - Azure Database for PostgreSQL,Apply best practices for Oracle-to-PostgreSQL app conversion,Best practices and recommendations for optimal Oracle to PostgreSQL application conversion using Visual Studio Code PostgreSQL extension.,This article provides best practices and recommendations to ensure optimal results when using the Oracle to Azure Database for PostgreSQL application conversion feature in Visual Studio Code.,2026-04-16T11:05:00.000Z,concept-article,best-practices,0.7,True,Explicitly labeled as best practices for a specific migration feature; likely includes concrete DO/DON'T guidance and product-specific recommendations for using the VS Code PostgreSQL extension during conversion.,new
-https://learn.microsoft.com/en-us/azure/postgresql/migrate/oracle-application-conversions/app-conversions-limitations,Application conversion limitations,Oracle to PostgreSQL Application Conversion: Limitations - Azure Database for PostgreSQL,,"Known limitations, unsupported objects, and constraints when using the Oracle to PostgreSQL application conversion feature in Visual Studio Code.",This article outlines the known limitations and constraints when using the Oracle to PostgreSQL application conversion feature in Visual Studio Code.,2026-04-16T11:05:00.000Z,concept-article,,0.3,False,"Describes limitations and unsupported objects conceptually; summary does not indicate numeric limits, quotas, or configuration ranges required for limits-quotas classification.",new
-https://learn.microsoft.com/en-us/azure/postgresql/migrate/oracle-application-conversions/app-conversions-overview,Application conversion overview,What Is Oracle to PostgreSQL Application Conversion? - Azure Database for PostgreSQL,,Learn how to convert application code interacting with Oracle database schemas to PostgreSQL using the Visual Studio Code PostgreSQL extension with AI-powered transformation.,The Oracle to Azure Database for PostgreSQL application conversion feature helps you migrate existing Oracle-based applications to work with Azure Database for PostgreSQL by automatically converting Oracle-specific database interaction code into PostgreSQL-compatible code. This feature is designed to accelerate application modernization and reduce manual rewrite effort during Oracle-to-PostgreSQL migrations. The application conversion capability is available through the Visual Studio Code Postgr,2026-04-16T11:05:00.000Z,concept-article,,0.2,False,"High-level overview of the Oracle-to-PostgreSQL application conversion feature; no indication of numeric limits, configuration tables, error codes, or product-specific best-practice details.",new
-https://learn.microsoft.com/en-us/azure/postgresql/migrate/oracle-application-conversions/app-conversions-reports,Application conversion reports,Oracle to PostgreSQL Application Conversion: Reports - Azure Database for PostgreSQL,,Understanding the summary reports generated during Oracle to PostgreSQL application conversion using Visual Studio Code PostgreSQL extension.,The Oracle to Azure Database for PostgreSQL Application Conversion feature generates a comprehensive report during the conversion process to help you validate and understand your application code conversion.,2026-04-16T11:05:00.000Z,concept-article,,0.2,False,"Explains understanding of conversion reports; likely descriptive of report contents rather than listing configuration parameters, limits, or troubleshooting mappings.",new
-https://learn.microsoft.com/en-us/azure/postgresql/migrate/oracle-application-conversions/app-conversions-tutorial,Application conversion tutorial,Oracle to PostgreSQL Application Conversion: Tutorial - Azure Database for PostgreSQL,,Step-by-step tutorial for converting application code interacting with Oracle database schemas to PostgreSQL using the Visual Studio Code PostgreSQL extension.,"This tutorial guides you through converting Oracle client application code to Azure Database for PostgreSQL using the Visual Studio Code PostgreSQL extension with GitHub Copilot Agent Mode to automate and validate code transformation. It covers setting up your environment, importing your application codebase, running the conversion process, and reviewing the generated PostgreSQL-compatible code. Before you begin, ensure you have completed a schema conversion project for optimal results. Here's w",2026-04-16T11:05:00.000Z,tutorial,,0.2,False,"Step-by-step tutorial for using the VS Code extension; likely procedural guidance without detailed configuration matrices, limits, or troubleshooting mappings.",new
+https://learn.microsoft.com/en-us/azure/postgresql/migrate/oracle-application-conversions/app-conversions-best-practices,Application conversion best practices,Oracle to PostgreSQL Application Conversion: Best Practices - Azure Database for PostgreSQL,Apply best practices for Oracle-to-PostgreSQL app conversion,Best practices and recommendations for optimal Oracle to PostgreSQL application conversion using Visual Studio Code PostgreSQL extension.,This article provides best practices and recommendations to ensure optimal results when using the Oracle to Azure Database for PostgreSQL application conversion feature in Visual Studio Code.,2026-04-16T11:05:00.000Z,concept-article,best-practices,0.7,True,Explicitly labeled as best practices for a specific migration feature; likely includes concrete DO/DON'T guidance and product-specific recommendations for using the VS Code PostgreSQL extension during conversion.,unchanged
+https://learn.microsoft.com/en-us/azure/postgresql/migrate/oracle-application-conversions/app-conversions-limitations,Application conversion limitations,Oracle to PostgreSQL Application Conversion: Limitations - Azure Database for PostgreSQL,,"Known limitations, unsupported objects, and constraints when using the Oracle to PostgreSQL application conversion feature in Visual Studio Code.",This article outlines the known limitations and constraints when using the Oracle to PostgreSQL application conversion feature in Visual Studio Code.,2026-04-16T11:05:00.000Z,concept-article,,0.3,False,"Describes limitations and unsupported objects conceptually; summary does not indicate numeric limits, quotas, or configuration ranges required for limits-quotas classification.",unchanged
+https://learn.microsoft.com/en-us/azure/postgresql/migrate/oracle-application-conversions/app-conversions-overview,Application conversion overview,What Is Oracle to PostgreSQL Application Conversion? - Azure Database for PostgreSQL,,Learn how to convert application code interacting with Oracle database schemas to PostgreSQL using the Visual Studio Code PostgreSQL extension with AI-powered transformation.,The Oracle to Azure Database for PostgreSQL application conversion feature helps you migrate existing Oracle-based applications to work with Azure Database for PostgreSQL by automatically converting Oracle-specific database interaction code into PostgreSQL-compatible code. This feature is designed to accelerate application modernization and reduce manual rewrite effort during Oracle-to-PostgreSQL migrations.
The application conversion capability is available through the Visual Studio Code Postgr,2026-04-16T11:05:00.000Z,concept-article,,0.2,False,"High-level overview of the Oracle-to-PostgreSQL application conversion feature; no indication of numeric limits, configuration tables, error codes, or product-specific best-practice details.",unchanged +https://learn.microsoft.com/en-us/azure/postgresql/migrate/oracle-application-conversions/app-conversions-reports,Application conversion reports,Oracle to PostgreSQL Application Conversion: Reports - Azure Database for PostgreSQL,,Understanding the summary reports generated during Oracle to PostgreSQL application conversion using Visual Studio Code PostgreSQL extension.,The Oracle to Azure Database for PostgreSQL Application Conversion feature generates a comprehensive report during the conversion process to help you validate and understand your application code conversion.,2026-04-16T11:05:00.000Z,concept-article,,0.2,False,"Explains understanding of conversion reports; likely descriptive of report contents rather than listing configuration parameters, limits, or troubleshooting mappings.",unchanged +https://learn.microsoft.com/en-us/azure/postgresql/migrate/oracle-application-conversions/app-conversions-tutorial,Application conversion tutorial,Oracle to PostgreSQL Application Conversion: Tutorial - Azure Database for PostgreSQL,,Step-by-step tutorial for converting application code interacting with Oracle database schemas to PostgreSQL using the Visual Studio Code PostgreSQL extension.,"This tutorial guides you through converting Oracle client application code to Azure Database for PostgreSQL using the Visual Studio Code PostgreSQL extension with GitHub Copilot Agent Mode to automate and validate code transformation. It covers setting up your environment, importing your application codebase, running the conversion process, and reviewing the generated PostgreSQL-compatible code. 
Before you begin, ensure you have completed a schema conversion project for optimal results. Here's w",2026-04-16T11:05:00.000Z,tutorial,,0.2,False,"Step-by-step tutorial for using the VS Code extension; likely procedural guidance without detailed configuration matrices, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/postgresql/migrate/oracle-schema-conversions/schema-conversions-app-quickstart,Application Conversion Quickstart,Oracle to Azure Database for PostgreSQL Application Conversion: Quickstart - Azure Database for PostgreSQL,,Step-by-step quickstart for converting Oracle client application code to PostgreSQL using the Visual Studio PostgreSQL extension with Azure OpenAI integration.,"This quickstart walks you through converting Oracle client application code to Azure Database for PostgreSQL by using the Application Conversion feature in the Oracle to Azure Database for PostgreSQL migration tooling, available in the Visual Studio Code PostgreSQL extension. 
You learn how to: While it's not required to perform a database schema conversion beforehand, we strongly recommend completing a schema migration first. If you already converted your Oracle schema to PostgreSQL, the application conv",2025-11-18T18:29:00.000Z,quickstart,,0.45,False,Quickstart for application conversion; mostly step-by-step usage rather than deep configuration or troubleshooting reference.,unchanged https://learn.microsoft.com/en-us/azure/postgresql/migrate/oracle-schema-conversions/schema-conversions-best-practices,Schema conversion best practices,Oracle to PostgreSQL Schema Conversion: Best Practices - Azure Database for PostgreSQL,Apply best practices for Oracle-to-PostgreSQL schema conversion,Best practices and recommendations for optimal Oracle to PostgreSQL schema conversion using Visual Studio Code PostgreSQL extension with Azure OpenAI integration.,This article provides best practices and recommendations to ensure optimal results when using the Oracle to Azure Database for PostgreSQL schema conversion feature in Visual Studio Code.,2025-11-18T18:29:00.000Z,concept-article,best-practices,0.75,True,"Explicit best-practices article for a specific schema conversion tool; likely includes concrete recommendations, ordering, and tool-specific gotchas.",unchanged https://learn.microsoft.com/en-us/azure/postgresql/migrate/oracle-schema-conversions/schema-conversions-limitations,Schema conversion limitations,Oracle to PostgreSQL Schema Conversion - Limitations - Azure Database for PostgreSQL,Understand limitations of Oracle-to-PostgreSQL schema conversion tool,"Known limitations, unsupported objects, and constraints when using the Oracle to PostgreSQL schema conversion feature in Visual Studio Code with Azure OpenAI integration.","This preview summarizes known limitations, unsupported objects, and migration considerations when using the Oracle → PostgreSQL schema conversion feature in Visual Studio Code. 
This article outlines the known limitations and constraints when using the Oracle to PostgreSQL schema conversion feature in Visual Studio Code.",2025-11-18T18:29:00.000Z,concept-article,limits-quotas,0.7,True,"Limitations article for a specific feature; typically enumerates unsupported objects and constraints, which are expert-only capability limits.",unchanged @@ -164,7 +164,7 @@ performance automatically and help prevent problems. Intelligent tuning continuo
status and dynamically adapts the database to your workload. This feature comprises two automatic tuning functions: You can enable intelligent tuning by using the Azure portal or the Azure CLI.",2025-12-22T23:02:00.000Z,concept-article,,0.45,False,"High-level overview of intelligent tuning; summary mentions enabling via portal/CLI but not detailed parameters, limits, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/postgresql/monitor/concepts-logging,Logs,Logs - Azure Database for PostgreSQL,Configure and access server logs in Azure PostgreSQL,"Describes logging configuration, storage, and analysis in Azure Database for PostgreSQL.","Azure Database for PostgreSQL allows you to configure and access Postgres' standard logs. The logs can be used to identify, troubleshoot, and repair configuration errors and suboptimal performance. Logging information you can configure and access includes errors, query information, autovacuum records, connections, and checkpoints. (Access to transaction logs isn't available). Audit logging is made available through a Postgres extension, pgaudit. 
To learn more, visit the auditing concepts article.",2026-01-20T08:00:00.000Z,how-to,configuration,0.7,True,"Logging concepts page for a specific service usually lists log categories, configuration parameters, and how to enable/route them, which are product-specific configuration details.",unchanged
-https://learn.microsoft.com/en-us/azure/postgresql/monitor/concepts-monitoring,Monitor and tune overview,Monitoring and Metrics - Azure Database for PostgreSQL,,Review the monitoring and metrics features in an Azure Database for PostgreSQL flexible server instance.,Monitoring data about your servers helps you troubleshoot and optimize for your workload. Your Azure Database for PostgreSQL flexible server instance provides various monitoring options to give you insight into how your server is performing.,2026-04-17T08:00:00.000Z,concept-article,,0.3,False,"Conceptual overview of monitoring and metrics for Azure Database for PostgreSQL; summary does not indicate detailed configuration tables, limits, or error-code-based troubleshooting.",updated
+https://learn.microsoft.com/en-us/azure/postgresql/monitor/concepts-monitoring,Monitor and tune overview,Monitoring and Metrics - Azure Database for PostgreSQL,,Review the monitoring and metrics features in an Azure Database for PostgreSQL flexible server instance.,Monitoring data about your servers helps you troubleshoot and optimize for your workload. Your Azure Database for PostgreSQL flexible server instance provides various monitoring options to give you insight into how your server is performing.,2026-04-23T08:00:00.000Z,concept-article,,0.2,False,"Described as a general overview of monitoring and metrics options for Azure Database for PostgreSQL flexible server. 
Summary suggests conceptual guidance rather than detailed configuration tables, limits, or troubleshooting mappings.",updated https://learn.microsoft.com/en-us/azure/postgresql/monitor/concepts-query-performance-insight,Query Performance Insight,Query Performance Insight - Azure Database for PostgreSQL,,This article describes the Query Performance Insight feature in an Azure Database for PostgreSQL flexible server instance.,Query Performance Insight provides intelligent query analysis for databases in an Azure Database for PostgreSQL flexible server instance. It helps identify the top resource consuming and long-running queries in your workload. This helps you find the queries to optimize to improve overall workload performance and efficiently use the resource that you're paying for. Query Performance Insight helps you spend less time troubleshooting database performance by providing:,2025-12-22T23:02:00.000Z,how-to,,0.4,False,"Feature overview of Query Performance Insight; summary lists capabilities but not detailed configuration parameters, thresholds, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/postgresql/monitor/concepts-query-store,Query store overview,Query store - Azure Database for PostgreSQL,,This article describes the query store feature for Azure Database for PostgreSQL flexible server instance.,"Query store is a feature in an Azure Database for PostgreSQL flexible server instance that provides a way to track query performance over time. Query store simplifies the troubleshooting of performance issues by helping you quickly find the longest running and most resource-intensive queries. Query store automatically captures a history of queries and runtime statistics, and retains them for your review. It slices the data by time so that you can see temporal usage patterns. 
Data for all users, ",2025-12-22T23:02:00.000Z,how-to,,0.45,False,"Conceptual description of query store; summary focuses on what it does, not on specific configuration parameters, limits, or best-practice values.",unchanged https://learn.microsoft.com/en-us/azure/postgresql/monitor/concepts-query-store-best-practices,Query store best practices,Best practices for query store - Azure Database for PostgreSQL,Apply query store best practices in PostgreSQL,This article describes best practices for query store in an Azure Database for PostgreSQL flexible server instance.,This article outlines best practices for using query store in an Azure Database for PostgreSQL flexible server instance.,2025-12-22T23:02:00.000Z,best-practice,best-practices,0.75,True,"Explicit best-practices article for query store; these typically include product-specific recommendations (e.g., parameter settings, retention values, workload-specific guidance) that go beyond generic theory.",unchanged @@ -177,10 +177,11 @@ https://learn.microsoft.com/en-us/azure/postgresql/monitor/how-to-configure-serv https://learn.microsoft.com/en-us/azure/postgresql/monitor/how-to-enable-intelligent-performance-cli,Azure CLI,Configure intelligent tuning - Azure CLI - Azure Database for PostgreSQL,Manage intelligent tuning settings with Azure CLI,This article describes how to configure intelligent tuning in an Azure Database for PostgreSQL flexible server instance by using the Azure CLI.,"You can verify and update the intelligent tuning configuration for an Azure Database for PostgreSQL flexible server instance by using the Azure CLI. 
To learn more about intelligent tuning, see the overview.",2025-12-22T23:02:00.000Z,how-to,configuration,0.75,True,"CLI-based configuration article; will list CLI commands and parameter names/values for intelligent tuning, which are detailed configuration options.",unchanged https://learn.microsoft.com/en-us/azure/postgresql/monitor/how-to-enable-intelligent-performance-portal,Azure portal,Configure intelligent tuning - portal - Azure Database for PostgreSQL,Configure intelligent tuning via Azure portal,This article describes how to configure intelligent tuning in an Azure Database for PostgreSQL flexible server instance through the Azure portal.,"This article provides a step-by-step procedure to configure intelligent tuning in an Azure Database for PostgreSQL flexible server instance by using the Azure portal. To learn more about intelligent tuning, see the overview. Important Autovacuum tuning is currently supported for the General Purpose and Memory Optimized server compute tiers that have four or more vCores. 
The Burstable server compute tier isn't supported.",2025-12-22T23:02:00.000Z,how-to,configuration,0.8,True,"Portal configuration article; includes supported tiers (e.g., General Purpose/Memory Optimized with ≥4 vCores, Burstable unsupported) and likely specific toggles/parameters, which are concrete configuration and tier constraints.",unchanged https://learn.microsoft.com/en-us/azure/postgresql/monitor/how-to-get-apply-recommendations-from-autonomous-tuning,Use recommendations,"How to Query, Interpret and Apply Recommendations Produced by Autonomous Tuning in Azure Database for PostgreSQL - Azure Database for PostgreSQL",Interpret and apply autonomous tuning recommendations,"This article describes how to query, interpret, and apply the recommendations produced by autonomous tuning feature in an Azure Database for PostgreSQL flexible server instance.","Autonomous tuning persists the recommendations that it produces in a set of tables located under the intelligentperformance schema in the azure_sys database. These recommendations can be read using the Autonomous tuning page in Azure portal, or using the Azure CLI az postgres flexible-server autonomous-tuning list-table-recommendations and az postgres flexible-server autonomous-tuning list-index-recommendations commands. 
However, neither of those two methods reveals the text of the queries for which the recom",2026-01-29T23:08:00.000Z,how-to,troubleshooting,0.8,True,"Describes how to query and interpret recommendations, including specific schema/table names and Azure CLI commands; maps outputs to actions, which is product-specific troubleshooting/diagnostic guidance.",unchanged
+https://learn.microsoft.com/en-us/azure/postgresql/monitor/how-to-use-dashboards-with-grafana,Use Azure Monitor dashboards with Grafana,Visualize PostgreSQL metrics and logs with dashboards with Grafana - Azure Database for PostgreSQL,,Learn how to use dashboards with Grafana in the Azure portal to monitor Azure Database for PostgreSQL metrics and logs.,"Dashboards with Grafana are now natively integrated into the Azure portal for Azure Database for PostgreSQL. This experience enables you to visualize key metrics and logs in a unified, interactive dashboard—without needing to deploy or manage a separate Grafana instance. With just a few selects, you can explore Azure PostgreSQL server metrics, correlate them with log entries by timestamp, and build visual insights into performance and availability.",2026-04-24T17:20:00.000Z,how-to,,0.3,False,"Tutorial-style description of using Grafana dashboards in the Azure portal to visualize PostgreSQL metrics and logs. 
From the summary, it focuses on how to use the integrated experience, not on product-specific configuration matrices, limits, or error-code troubleshooting.",new https://learn.microsoft.com/en-us/azure/postgresql/network/concepts-networking-private,Networking with private access (virtual network Integration),Networking Overview with Private Access (Virtual Network) - Azure Database for PostgreSQL,,Learn about connectivity and networking options for your Azure Database for PostgreSQL flexible server instance with private access (virtual network).,"This article describes connectivity and networking concepts for Azure Database for PostgreSQL flexible server instances. When you create an Azure Database for PostgreSQL flexible server instance, you must choose one of the following networking options: This document describes the private access (virtual network integration) networking option.",2026-02-12T08:00:00.000Z,concept-article,,0.35,False,"Private access networking overview; high-level explanation of VNet integration, not a configuration reference.",unchanged https://learn.microsoft.com/en-us/azure/postgresql/network/concepts-networking-private-link,Networking with Private Endpoint,Networking Overview with Private Link Connectivity - Azure Database for PostgreSQL,,Learn about connectivity and networking options for an Azure Database for PostgreSQL flexible server instance with Azure Private Link.,"Azure Private Link allows you to create private endpoints for your Azure Database for PostgreSQL flexible server instance to bring it inside your virtual network. This functionality is a recommended alternative to the networking capabilities provided by virtual network integration. With Private Link, traffic between your virtual network and the service traverses the Microsoft backbone network. Exposing your service to the public internet is no longer necessary. 
You can create your own private lin",2025-12-22T23:02:00.000Z,concept-article,,0.35,False,"Private Link overview; conceptual description of private endpoints, no explicit config matrices or limits in summary.",unchanged https://learn.microsoft.com/en-us/azure/postgresql/network/concepts-networking-public,Networking with public access (allowed IP addresses),Networking Overview with Public Access (Allowed IP Addresses) - Azure Database for PostgreSQL,,Learn about connectivity and networking with public access for an Azure Database for PostgreSQL flexible server instance.,"This article describes connectivity and networking concepts for Azure Database for PostgreSQL flexible server instances. When you create an Azure Database for PostgreSQL flexible server instance, you must choose one of the following networking options: The following characteristics apply whether you choose to use the private access or the public access option: Note Because the Azure Database for PostgreSQL service is a managed database service, users aren't provided host or operating system acce",2025-12-22T23:02:00.000Z,concept-article,,0.35,False,Networking overview with public access; conceptual networking description without detailed config tables or limits in the summary.,unchanged
-https://learn.microsoft.com/en-us/azure/postgresql/network/how-to-migrate-vnet-private-endpoint-capable-server,How to migrate vnet private endpoint capable server,Migrate from VNet to a Private Endpoint Capable Network Configuration - Azure Database for PostgreSQL,,Learn how to migrate Azure Database for PostgreSQL from a VNet deployment to a network configuration that supports private endpoint connectivity.,"Azure Database for PostgreSQL deployed with private access uses virtual network injection to place the server in a delegated subnet within your virtual network. You can migrate these servers to a network configuration that supports private endpoints. 
Private endpoints provide secure connectivity through Azure Private Link, so you can access your server over a private IP address within your virtual network. This migration replaces VNet-injected networking with private endpoint support, giving you mor",2026-04-17T17:16:00.000Z,how-to,,0.2,False,"Primarily a how-to migration guide from VNet-injected to private endpoint networking; based on the summary, it does not clearly expose configuration tables, specific parameter values, limits, or error-code-based troubleshooting that meet the expert-knowledge criteria.",updated
+https://learn.microsoft.com/en-us/azure/postgresql/network/how-to-migrate-vnet-private-endpoint-capable-server,How to migrate vnet private endpoint capable server,Migrate from VNet to a Private Endpoint Capable Network Configuration - Azure Database for PostgreSQL,Migrate Azure PostgreSQL from VNet injection to private endpoints,Learn how to migrate Azure Database for PostgreSQL from a VNet deployment to a network configuration that supports private endpoint connectivity.,"Azure Database for PostgreSQL deployed with private access uses virtual network injection to place the server in a delegated subnet within your virtual network. You can migrate these servers to a network configuration that supports private endpoints. Private endpoints provide secure connectivity through Azure Private Link, so you can access your server over a private IP address within your virtual network. This migration replaces VNet-injected networking with private endpoint support, giving you mor",2026-04-22T17:15:00.000Z,how-to,deployment,0.7,True,"The page describes a specific, product-focused migration path from VNet-injected Azure Database for PostgreSQL servers to a private endpoint–capable network configuration. 
This is concrete deployment/migration guidance (how to change networking model for an existing production service), including ordered steps and platform-specific constraints, which fits the deployment/migration sub-skill better than generic concepts. It goes beyond conceptual networking overviews and provides actionable, service-specific migration instructions.",updated https://learn.microsoft.com/en-us/azure/postgresql/network/how-to-networking-servers-deployed-public-access-add-firewall-rules,Add firewall rules,Add firewall rules - Azure Database for PostgreSQL,,This article describes how to add firewall rules to an Azure Database for PostgreSQL flexible server.,"With public access enabled, you can configure firewall rules to allow connections originating from specific IP addresses, or from any Azure service. Using the Azure portal: Select your Azure Database for PostgreSQL flexible server. In the resource menu, select Networking. If you want to create a firewall rule to allow connections originating from the public IP address of the client machine that you're using to connect to navigate the portal, select Add current client IP address (###.###.###.###). A",2025-12-22T23:02:00.000Z,how-to,,0.3,False,"Add firewall rules via portal; summary shows basic steps and UI actions, no parameter reference tables.",unchanged https://learn.microsoft.com/en-us/azure/postgresql/network/how-to-networking-servers-deployed-public-access-add-private-endpoint,Add private endpoint connections,Add private endpoint connections - Azure Database for PostgreSQL,,This article describes how to add private endpoint connections to an Azure Database for PostgreSQL flexible server.,"Azure Database for PostgreSQL flexible server is an Azure Private Link service. This means that you can create private endpoints so that your client applications can connect privately and securely to your Azure Database for PostgreSQL flexible server. 
A private endpoint to your Azure Database for PostgreSQL flexible server is a network interface that you can inject in a subnet of an Azure virtual network. Any host or service that can route network traffic to that subnet is able to communicate ",2025-12-22T23:02:00.000Z,how-to,,0.35,False,"Add private endpoint connections; likely step-by-step Private Link setup, not a full configuration reference with parameter tables.",unchanged https://learn.microsoft.com/en-us/azure/postgresql/network/how-to-networking-servers-deployed-public-access-approve-private-endpoint,Approve private endpoint connections,Approve private endpoint connections - Azure Database for PostgreSQL,,This article describes how to approve private endpoint connections to an Azure Database for PostgreSQL flexible server.,"Azure Database for PostgreSQL flexible server is an Azure Private Link service. This means that you can create private endpoints so that your client applications can connect privately and securely to your Azure Database for PostgreSQL flexible server. A private endpoint to your Azure Database for PostgreSQL flexible server is a network interface that you can inject in a subnet of an Azure virtual network. Any host or service that can route network traffic to that subnet is able to communicate ",2025-12-22T23:02:00.000Z,how-to,,0.25,False,"Approve private endpoint connections; workflow-focused, not configuration matrices or quotas.",unchanged @@ -191,7 +192,7 @@ https://learn.microsoft.com/en-us/azure/postgresql/network/how-to-networking-ser
https://learn.microsoft.com/en-us/azure/postgresql/network/how-to-networking-servers-deployed-public-access-reject-private-endpoint,Reject private endpoint connections,Reject private endpoint connections - Azure Database for PostgreSQL,,This article describes how to reject private endpoint connections to an Azure Database for PostgreSQL flexible server.,"Azure Database for PostgreSQL flexible server is an Azure Private Link service. 
This means that you can create private endpoints so that your client applications can connect privately and securely to your Azure Database for PostgreSQL flexible server. A private endpoint to your Azure Database for PostgreSQL flexible server is a network interface that you can inject in a subnet of an Azure virtual network. Any host or service that can route network traffic to that subnet is able to communicate ",2025-12-22T23:02:00.000Z,how-to,,0.25,False,"Reject private endpoint connections; similar to approve article, mainly procedural.",unchanged https://learn.microsoft.com/en-us/azure/postgresql/network/how-to-networking-servers-deployed-vent-integration-change-private-dns-zone,Change private DNS zone,Change private DNS zone - Azure Database for PostgreSQL,,This article describes how to change the private DNS zone of your Azure Database for PostgreSQL flexible server.,"Azure Private DNS provides a reliable and secure DNS service for your virtual network. Azure Private DNS manages and resolves domain names in the virtual network, without the need to configure a custom DNS solution. When you deploy an Azure Database for PostgreSQL flexible server with Networking with private access (VNET Integration) mode, you're required to provide the private DNS zone, which is mandatory to be able to do DNS resolution. For new Azure Database for PostgreSQL flexible server cre",2025-12-22T23:02:00.000Z,how-to,,0.35,False,Change private DNS zone; summary mentions requirement but not detailed parameter tables or limits.,unchanged https://learn.microsoft.com/en-us/azure/postgresql/overview,What is Azure Database for PostgreSQL?,What Is Azure Database for PostgreSQL? - Azure Database for PostgreSQL,,Provides an overview of Azure Database for PostgreSQL.,"This article provides an overview of Azure Database for PostgreSQL, helping you get acquainted with its key features and core concepts. 
Azure Database for PostgreSQL is a fully managed database service that gives you granular control and flexibility over database management functions and configuration settings. The service provides flexibility and server configuration customizations based on your requirements. The architecture lets you collocate the database engine with the client tier for lower",2026-01-09T08:00:00.000Z,overview,,0.1,False,"High-level product overview without concrete limits, configs, or error mappings.",unchanged
-https://learn.microsoft.com/en-us/azure/postgresql/read-replica/concepts-read-replicas,Overview,Read replicas - Azure Database for PostgreSQL,Use read replicas in Azure PostgreSQL flexible server,This article describes the read replica feature usage for an Azure Database for PostgreSQL flexible server instance.,"The read replica feature allows you to replicate data from an Azure Database for PostgreSQL flexible server instance to a read-only replica. Replicas are updated asynchronously with the PostgreSQL engine's native physical replication technology. Streaming replication by using replication slots is the default operation mode. When necessary, file-based log shipping is used to catch up. You can replicate from the primary server to up to five replicas. 
Replicas are new servers you manage similar to re",2026-04-06T08:00:00.000Z,how-to,limits-quotas,0.68,True,Contains product-specific expert details such as the maximum number of replicas per primary (up to five replicas) and behavior of replication modes (streaming vs file-based log shipping) that represent concrete limits/constraints not derivable from generic PostgreSQL knowledge.,updated
+https://learn.microsoft.com/en-us/azure/postgresql/read-replica/concepts-read-replicas,Overview,Read replicas - Azure Database for PostgreSQL,Use read replicas in Azure PostgreSQL flexible server,This article describes the read replica feature usage for an Azure Database for PostgreSQL flexible server instance.,"The read replica feature allows you to replicate data from an Azure Database for PostgreSQL flexible server instance to a read-only replica. Replicas are updated asynchronously with the PostgreSQL engine's native physical replication technology. Streaming replication by using replication slots is the default operation mode. When necessary, file-based log shipping is used to catch up. You can replicate from the primary server to up to five replicas. Replicas are new servers you manage similar to re",2026-04-06T08:00:00.000Z,how-to,limits-quotas,0.68,True,Contains product-specific expert details such as the maximum number of replicas per primary (up to five replicas) and behavior of replication modes (streaming vs file-based log shipping) that represent concrete limits/constraints not derivable from generic PostgreSQL knowledge.,unchanged https://learn.microsoft.com/en-us/azure/postgresql/read-replica/concepts-read-replicas-geo,Geo-Replication,Geo-replication - Azure Database for PostgreSQL,Plan geo-replication for Azure PostgreSQL flexible server,This article describes the Geo-replication in an Azure Database for PostgreSQL flexible server instance.,"A read replica can be created in the same region as the primary server or in a different geographical region. 
Geo-replication can be helpful for scenarios like disaster recovery planning or bringing data closer to your users. You can have a primary server in any Azure Database for PostgreSQL flexible server service region. A primary server can also have replicas in any global region of Azure that supports Azure Database for PostgreSQL flexible server instances. Additionally, we also support speci",2025-12-22T23:02:00.000Z,concept-article,decision-making,0.7,True,"Describes which regions can host primaries and replicas, supported global regions, and special region pairs; helps decide geo-replication topology based on region support and DR needs.",unchanged https://learn.microsoft.com/en-us/azure/postgresql/read-replica/concepts-read-replicas-promote,Promote read replicas,Promote read replicas - Azure Database for PostgreSQL,,This article describes the promote action for read replica feature in an Azure Database for PostgreSQL flexible server instance.,"Promote refers to the process where a replica is commanded to end its replica mode and transition into full read-write operations. Important Promote operation isn't automatic. If a failure occurs on the primary server, the system doesn't switch to the read replica independently. A user action is always required for the promote operation. Promotion of replicas can be done in two distinct manners: Promote to primary server This action elevates a replica to the role of the primary server. 
In the pr",2026-02-16T08:00:00.000Z,concept-article,,0.4,False,"Conceptual description of promoting replicas; summary shows no numeric limits, error codes, or config tables.",unchanged https://learn.microsoft.com/en-us/azure/postgresql/read-replica/concepts-read-replicas-virtual-endpoints,Virtual endpoints,Virtual endpoints - Azure Database for PostgreSQL,,This article describes the virtual endpoints for read replica feature in an Azure Database for PostgreSQL flexible server instance.,"Virtual Endpoints are read-write and read-only listener endpoints that remain consistent irrespective of the current role of the Azure Database for PostgreSQL flexible server instance. This means you don't have to update your application's connection string after performing the promote to primary server action, as the endpoints will automatically point to the correct instance following a role change. All operations involving virtual endpoints, whether adding, editing, or removing, are performed i",2025-12-22T23:02:00.000Z,concept-article,,0.3,False,"Overview of virtual endpoints behavior; no indication of config parameter tables, limits, or decision matrices.",unchanged @@ -204,10 +205,11 @@ https://learn.microsoft.com/en-us/azure/postgresql/read-replica/how-to-show-virt https://learn.microsoft.com/en-us/azure/postgresql/read-replica/how-to-switch-over-replica-to-primary,Switch over read replica to primary,Switch over read replica to primary - Azure Database for PostgreSQL,,This article describes how to switch over a read replica so that it becomes the primary.,This article provides step-by-step instructions to switch over a read replica of an Azure Database for PostgreSQL flexible server so that it becomes the new primary server of the replication set.,2025-12-22T23:02:00.000Z,how-to,,0.3,False,"Switchover how-to; summary suggests procedural guidance only, no numeric thresholds or config tables.",unchanged 
https://learn.microsoft.com/en-us/azure/postgresql/read-replica/how-to-update-virtual-endpoints,Update virtual endpoints,Update virtual endpoints - Azure Database for PostgreSQL,,This article describes how to update virtual endpoints for an Azure Database for PostgreSQL flexible server.,This article provides step-by-step instructions to update virtual endpoints associated to an Azure Database for PostgreSQL flexible server.,2025-12-22T23:02:00.000Z,how-to,,0.2,False,Update virtual endpoints tutorial; no evidence of expert-only configuration matrices or constraints.,unchanged https://learn.microsoft.com/en-us/azure/postgresql/release-notes-maintenance/2025-october,October 2025,Release Notes for Azure Database for PostgreSQL Maintenance - October 2025 - Azure Database for PostgreSQL,,Learn about the maintenance release notes for Azure Database for PostgreSQL Server October 2025.,"The October 2025 version of Azure Database for PostgreSQL is now available. Starting October 24, 2025, all new servers automatically use this version. Existing servers upgrade during their next scheduled maintenance. This version introduces new features and enhancements, resolves known issues, and includes important security patches to ensure optimal performance and security.",2026-02-06T06:06:00.000Z,release-notes,,0.4,False,Similar to other maintenance release notes; primarily change log without structured expert patterns per the defined categories.,unchanged +https://learn.microsoft.com/en-us/azure/postgresql/release-notes-maintenance/2026-april,April 2026,Release Notes for Azure Database for PostgreSQL Maintenance - April 2026 - Azure Database for PostgreSQL,,Learn about the maintenance release notes for Azure Database for PostgreSQL April 2026.,"We're excited to announce the April 2026 version of Azure Database for PostgreSQL. Beginning April 22, 2026, the service automatically onboards all new servers to this latest version. 
The service upgrades existing servers during their next scheduled maintenance. This new version introduces a range of new features and enhancements, resolves known issues, and includes essential security patches to ensure optimal performance and security.",2026-04-23T17:24:00.000Z,release-notes,,0.3,False,"High-level maintenance release announcement; summary does not expose concrete limits, configuration tables, error codes, or decision matrices.",new https://learn.microsoft.com/en-us/azure/postgresql/release-notes-maintenance/2026-january,January 2026,Release Notes for Azure Database for PostgreSQL Maintenance - January 2026 - Azure Database for PostgreSQL,,Learn about the maintenance release notes for Azure Database for PostgreSQL January 2026.,"We're excited to announce the January 2026 version of Azure Database for PostgreSQL. Starting January 20, 2026, the service automatically onboards all new servers to this latest version. The service upgrades existing servers during their next scheduled maintenance. This new version introduces a range of new features and enhancements, resolves known issues, and includes essential security patches to ensure optimal performance and security.",2026-03-09T22:16:00.000Z,release-notes,,0.3,False,"Similar to the March 2026 notes, the January 2026 maintenance release page summary focuses on announcing a new service version and its rollout date, without evidence of detailed error mappings, configuration parameters, or limits. 
This makes it more of an overview/announcement than a page containing the structured expert knowledge required for any sub-skill type.",unchanged https://learn.microsoft.com/en-us/azure/postgresql/release-notes-maintenance/2026-march,March 2026,Release Notes for Azure Database for PostgreSQL Maintenance - March 2026 - Azure Database for PostgreSQL,,Learn about the maintenance release notes for Azure Database for PostgreSQL March 2026.,"We're excited to announce the March 2026 version of Azure Database for PostgreSQL. Starting March 3, 2026, the service automatically onboards all new servers to this latest version. The service upgrades existing servers during their next scheduled maintenance. This new version introduces a range of new features and enhancements, resolves known issues, and includes essential security patches to ensure optimal performance and security.",2026-03-04T23:13:00.000Z,release-notes,,0.3,False,"The summary describes a high-level maintenance release announcement (new features, enhancements, security patches, and automatic onboarding date). Without detailed tables of fixes, error codes, configuration changes, or limits, it is primarily release marketing/overview content rather than structured expert knowledge as defined by the sub-skill types.",unchanged -https://learn.microsoft.com/en-us/azure/postgresql/release-notes-maintenance/release-notes-maintenance-index,Maintenance release notes index,Maintenance release notes index for Azure Database for PostgreSQL - Azure Database for PostgreSQL,Understand PostgreSQL maintenance release rollout behavior,Learn about the maintenance release notes for Azure Database for PostgreSQL.,"Welcome to the centralized index of maintenance release notes for Azure Database for PostgreSQL. This page provides a year-by-year overview of all published maintenance release notes, helping you stay informed about platform updates, feature enhancements, and bug fixes. 
Note Each release note typically applies to servers created after a specific date, which the document clearly mentions. However, due to Azure’s region-by-region deployment model, the actual rollout can span up to a month globally. R",2026-02-06T06:06:00.000Z,release-notes,deployment,0.6,True,Mentions region-by-region deployment model and rollout timing; full page likely details maintenance behavior and constraints relevant to deployment/upgrade planning.,unchanged -https://learn.microsoft.com/en-us/azure/postgresql/release-notes/release-notes,Release notes,Release Notes for Azure Database for PostgreSQL - Azure Database for PostgreSQL,,"Release notes for Azure Database for PostgreSQL, including feature additions, engine versions support, extensions, and other announcements.","This article highlights the latest updates and enhancements for the Azure Database for PostgreSQL service, including new feature releases, supported engine versions, available extensions, and other important announcements.",2026-04-10T08:00:00.000Z,concept-article,,0.3,False,"Release notes list feature additions, engine version support, and extensions over time but are primarily chronological announcements rather than structured limits, configuration matrices, troubleshooting mappings, or decision frameworks. They typically lack the organized parameter tables, error-code mappings, or quota details required by the defined sub-skill types.",updated +https://learn.microsoft.com/en-us/azure/postgresql/release-notes-maintenance/release-notes-maintenance-index,Maintenance release notes index,Maintenance release notes index for Azure Database for PostgreSQL - Azure Database for PostgreSQL,,Learn about the maintenance release notes for Azure Database for PostgreSQL.,"Welcome to the centralized index of maintenance release notes for Azure Database for PostgreSQL. 
This page provides a year-by-year overview of all published maintenance release notes, helping you stay informed about platform updates, feature enhancements, and bug fixes. Note Each release note typically applies to servers created after a specific date, which the document clearly mentions. However, due to Azure’s region-by-region deployment model, the actual rollout can span up to a month globally. R",2026-04-22T17:15:00.000Z,release-notes,,0.2,False,"Index/overview of maintenance release notes without specific technical details, limits, configuration parameters, or troubleshooting content.",updated +https://learn.microsoft.com/en-us/azure/postgresql/release-notes/release-notes,Release notes,Release Notes for Azure Database for PostgreSQL - Azure Database for PostgreSQL,,"Release notes for Azure Database for PostgreSQL, including feature additions, engine versions support, extensions, and other announcements.","This article highlights the latest updates and enhancements for the Azure Database for PostgreSQL service, including new feature releases, supported engine versions, available extensions, and other important announcements.",2026-04-10T08:00:00.000Z,concept-article,,0.3,False,"Release notes list feature additions, engine version support, and extensions over time but are primarily chronological announcements rather than structured limits, configuration matrices, troubleshooting mappings, or decision frameworks. 
They typically lack the organized parameter tables, error-code mappings, or quota details required by the defined sub-skill types.",unchanged https://learn.microsoft.com/en-us/azure/postgresql/release-notes/release-notes-api,API release notes,API release notes - Azure Database for PostgreSQL,,API release notes for Azure Database for PostgreSQL.,"This page provides the latest news and updates regarding the recommended API versions to be used. The API versions that are not listed here might be supported, but will be retired soon. The documentation for the latest Stable API version is available here.",2025-12-22T23:02:00.000Z,overview,,0.3,False,"API release notes page is a news/update list; not organized as limits, configs, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/postgresql/release-notes/release-notes-cli,CLI release notes,CLI release notes - Azure Database for PostgreSQL,,Azure CLI release notes for Azure Database for PostgreSQL.,This page provides the latest news and updates regarding the Azure Database for PostgreSQL additions for Azure CLI.,2025-12-22T23:02:00.000Z,overview,,0.3,False,"CLI release notes are change logs; they don’t match the structured best-practices, troubleshooting, or configuration patterns required.",unchanged https://learn.microsoft.com/en-us/azure/postgresql/samples/sample-change-server-configuration,Change server configurations,Azure CLI Script - Change Server Configurations - Azure Database for PostgreSQL,List and change PostgreSQL server configuration via CLI,This sample CLI script lists all available server configuration options and updates the value of one of the options.,"This sample CLI script lists all available configuration parameters and their allowable values for Azure Database for PostgreSQL flexible server, and sets the log_retention_days to a value that is other than the default one. 
If you don't have an Azure account, create a free account before you begin.",2025-12-22T23:02:00.000Z,how-to,configuration,0.8,True,"CLI script that lists configuration parameters and allowable values, and changes log_retention_days; directly exposes configuration parameter names, defaults, and ranges.",unchanged diff --git a/products/azure-database-postgresql/report.md b/products/azure-database-postgresql/report.md index b8b1f27d..1c895d33 100644 --- a/products/azure-database-postgresql/report.md +++ b/products/azure-database-postgresql/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: best-practices: 'Performance, migration, and security best practices for Azure PostgreSQL: tuning queries, extensions, pooling, bulk load, stats, partitioning, pgvector, @@ -19,8 +19,8 @@ category_descriptions: decision-making: Guidance on choosing compute tiers, versions, and hosting, planning upgrades and geo-replication, and sizing, validating, and reserving capacity for Azure Database for PostgreSQL. - deployment: Deploying and upgrading Azure PostgreSQL (CI/CD, Bicep, AKS/Django, Web Apps), managing networking (VNet, private endpoints), maintenance behavior, + deployment: CI/CD deployment to Azure PostgreSQL, infrastructure-as-code (Bicep), + app deployment patterns (AKS/Django, Web Apps + VNet), upgrades, network migrations, and point-in-time restore. configuration: 'Configuring Azure Database for PostgreSQL servers: parameters, extensions, logging, monitoring, HA, networking, maintenance, scaling, vector search, and @@ -31,13 +31,13 @@ category_descriptions: skill_description: Expert knowledge for Azure Database for PostgreSQL development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, - and deployment. 
Use when using pgvector search, Azure AI/OpenAI embeddings, VNet/private - endpoints, geo-replication, or CI/CD deploys, and other Azure Database for PostgreSQL + and deployment. Use when using pgvector, Azure AI/OpenAI embeddings, VNet/TLS security, + sharding/elastic scale, or PITR/backup, and other Azure Database for PostgreSQL related development tasks. Not for Azure SQL Database (use azure-sql-database), Azure SQL Managed Instance (use azure-sql-managed-instance), SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines), Azure Cosmos DB (use azure-cosmos-db). -use_when: Use when using pgvector search, Azure AI/OpenAI embeddings, VNet/private - endpoints, geo-replication, or CI/CD deploys, and other Azure Database for PostgreSQL +use_when: Use when using pgvector, Azure AI/OpenAI embeddings, VNet/TLS security, + sharding/elastic scale, or PITR/backup, and other Azure Database for PostgreSQL related development tasks. confusable_not_for: Not for Azure SQL Database (use azure-sql-database), Azure SQL Managed Instance (use azure-sql-managed-instance), SQL Server on Azure Virtual Machines @@ -47,16 +47,16 @@ confusable_not_for: Not for Azure SQL Database (use azure-sql-database), Azure S ## Summary -- **Total Pages**: 322 -- **Fetched**: 322 +- **Total Pages**: 324 +- **Fetched**: 324 - **Fetch Failed**: 0 -- **Classified**: 216 -- **Unclassified**: 106 +- **Classified**: 215 +- **Unclassified**: 109 ### Incremental Update -- **New Pages**: 5 +- **New Pages**: 2 - **Updated Pages**: 5 -- **Unchanged**: 312 +- **Unchanged**: 317 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-database-postgresql/azure-database-postgresql.csv` @@ -65,38 +65,35 @@ confusable_not_for: Not for Azure SQL Database (use azure-sql-database), Azure S | Type | Count | Percentage | |------|-------|------------| | architecture-patterns | 7 | 2.2% | -| best-practices | 15 | 4.7% | -| configuration | 85 | 26.4% | +| best-practices | 15 | 
4.6% | +| configuration | 84 | 25.9% | | decision-making | 11 | 3.4% | | deployment | 8 | 2.5% | -| integrations | 26 | 8.1% | -| limits-quotas | 16 | 5.0% | -| security | 28 | 8.7% | +| integrations | 26 | 8.0% | +| limits-quotas | 16 | 4.9% | +| security | 28 | 8.6% | | troubleshooting | 20 | 6.2% | -| *(Unclassified)* | 106 | 32.9% | +| *(Unclassified)* | 109 | 33.6% | ## Changes ### New Pages -- [Application conversion overview](https://learn.microsoft.com/en-us/azure/postgresql/migrate/oracle-application-conversions/app-conversions-overview) -- [Application conversion tutorial](https://learn.microsoft.com/en-us/azure/postgresql/migrate/oracle-application-conversions/app-conversions-tutorial) -- [Application conversion best practices](https://learn.microsoft.com/en-us/azure/postgresql/migrate/oracle-application-conversions/app-conversions-best-practices) -- [Application conversion limitations](https://learn.microsoft.com/en-us/azure/postgresql/migrate/oracle-application-conversions/app-conversions-limitations) -- [Application conversion reports](https://learn.microsoft.com/en-us/azure/postgresql/migrate/oracle-application-conversions/app-conversions-reports) +- [April 2026](https://learn.microsoft.com/en-us/azure/postgresql/release-notes-maintenance/2026-april) +- [Use Azure Monitor dashboards with Grafana](https://learn.microsoft.com/en-us/azure/postgresql/monitor/how-to-use-dashboards-with-grafana) ### Updated Pages -- [Overview](https://learn.microsoft.com/en-us/azure/postgresql/read-replica/concepts-read-replicas) - - Updated: 2025-12-22T23:02:00.000Z → 2026-04-06T08:00:00.000Z - [How to migrate vnet private endpoint capable server](https://learn.microsoft.com/en-us/azure/postgresql/network/how-to-migrate-vnet-private-endpoint-capable-server) - - Updated: 2026-04-07T22:11:00.000Z → 2026-04-17T17:16:00.000Z -- [Release notes](https://learn.microsoft.com/en-us/azure/postgresql/release-notes/release-notes) - - Updated: 2026-03-30T08:00:00.000Z → 
2026-04-10T08:00:00.000Z -- [High availability](https://learn.microsoft.com/en-us/azure/postgresql/high-availability/concepts-high-availability) - - Updated: 2026-04-09T08:00:00.000Z → 2026-04-17T08:00:00.000Z + - Updated: 2026-04-17T17:16:00.000Z → 2026-04-22T17:15:00.000Z +- [Scheduled maintenance](https://learn.microsoft.com/en-us/azure/postgresql/configure-maintain/concepts-maintenance) + - Updated: 2026-01-20T08:00:00.000Z → 2026-04-22T17:15:00.000Z +- [Maintenance release notes index](https://learn.microsoft.com/en-us/azure/postgresql/release-notes-maintenance/release-notes-maintenance-index) + - Updated: 2026-02-06T06:06:00.000Z → 2026-04-22T17:15:00.000Z +- [Fabric mirroring](https://learn.microsoft.com/en-us/azure/postgresql/integration/concepts-fabric-mirroring) + - Updated: 2025-12-22T23:02:00.000Z → 2026-04-22T08:00:00.000Z - [Monitor and tune overview](https://learn.microsoft.com/en-us/azure/postgresql/monitor/concepts-monitoring) - - Updated: 2026-04-08T08:00:00.000Z → 2026-04-17T08:00:00.000Z + - Updated: 2026-04-17T08:00:00.000Z → 2026-04-23T08:00:00.000Z ## Classified Pages @@ -206,6 +203,7 @@ confusable_not_for: Not for Azure SQL Database (use azure-sql-database), Azure S | [Firewall rules](https://learn.microsoft.com/en-us/azure/postgresql/security/security-firewall-rules) | security | 0.70 | Firewall rules article for a specific service typically includes server-level rule names, IP range formats, and product-specific behavior (e.g., allow Azure services toggle), which are concrete security configuration details. | | [Full vacuum using pg_repack extension](https://learn.microsoft.com/en-us/azure/postgresql/troubleshoot/how-to-perform-fullvacuum-pg-repack) | best-practices | 0.70 | How-to optimization guide for pg_repack on Azure Database for PostgreSQL flexible server with product-specific operational guidance and tuning steps, beyond generic PostgreSQL theory. 
| | [Geo-Replication](https://learn.microsoft.com/en-us/azure/postgresql/read-replica/concepts-read-replicas-geo) | decision-making | 0.70 | Describes which regions can host primaries and replicas, supported global regions, and special region pairs; helps decide geo-replication topology based on region support and DR needs. | +| [How to migrate vnet private endpoint capable server](https://learn.microsoft.com/en-us/azure/postgresql/network/how-to-migrate-vnet-private-endpoint-capable-server) | deployment | 0.70 | The page describes a specific, product-focused migration path from VNet-injected Azure Database for PostgreSQL servers to a private endpoint–capable network configuration. This is concrete deployment/migration guidance (how to change networking model for an existing production service), including ordered steps and platform-specific constraints, which fits the deployment/migration sub-skill better than generic concepts. It goes beyond conceptual networking overviews and provides actionable, service-specific migration instructions. | | [How to set up the network](https://learn.microsoft.com/en-us/azure/postgresql/migrate/migration-service/how-to-network-setup-migration-service) | configuration | 0.70 | Networking scenarios article mentions a summarizing table; likely includes scenario-specific configuration requirements (VNet, private endpoints, ports) that are concrete configuration patterns. | | [Identity](https://learn.microsoft.com/en-us/azure/postgresql/security/security-managed-identity-overview) | security | 0.70 | Explains managed identities usage with Azure PostgreSQL; likely includes role assignments, scopes, and connection configuration for secure access. 
| | [In-database embeddings with azure_local_ai](https://learn.microsoft.com/en-us/azure/postgresql/extensions/azure-local-ai) | configuration | 0.70 | Describes using azure_local_ai extension, including model registration and embedding configuration; product-specific configuration despite retirement notice. | @@ -250,7 +248,6 @@ confusable_not_for: Not for Azure SQL Database (use azure-sql-database), Azure S | [Scalable vector indexing using pg_diskann](https://learn.microsoft.com/en-us/azure/postgresql/extensions/how-to-use-pgdiskann) | configuration | 0.70 | Covers enabling pg_diskann and using DiskANN; includes extension-specific configuration and usage patterns. | | [Scale a server](https://learn.microsoft.com/en-us/azure/postgresql/samples/sample-scale-server-up-or-down) | configuration | 0.70 | CLI script for scaling compute and storage; includes product-specific constraints (storage can only scale up) and parameter usage, which are configuration details and behavioral limits. | | [Scale storage performance](https://learn.microsoft.com/en-us/azure/postgresql/scale/how-to-scale-storage-performance) | limits-quotas | 0.70 | Scaling storage performance for Premium SSD and SSD v2 typically references specific performance tiers, IOPS, and throughput limits tied to disk size, which are numeric quotas. | -| [Scheduled maintenance](https://learn.microsoft.com/en-us/azure/postgresql/configure-maintain/concepts-maintenance) | configuration | 0.70 | Explains maintenance behavior and constraints (e.g., avoid operations during maintenance); includes Azure-specific operational guidance and possibly scheduling parameters. | | [Schema conversion limitations](https://learn.microsoft.com/en-us/azure/postgresql/migrate/oracle-schema-conversions/schema-conversions-limitations) | limits-quotas | 0.70 | Limitations article for a specific feature; typically enumerates unsupported objects and constraints, which are expert-only capability limits. 
| | [Script activity](https://learn.microsoft.com/en-us/azure/postgresql/integration/how-to-data-factory-script-activity-azure) | integrations | 0.70 | Script activity integration article; typically documents activity configuration fields, command types, and constraints specific to the PostgreSQL connector, which are detailed integration patterns. | | [Secure connectivity with SSL and TLS](https://learn.microsoft.com/en-us/azure/postgresql/security/security-tls) | security | 0.70 | Page is product-specific security guidance for Azure Database for PostgreSQL flexible server, detailing which TLS protocol versions are supported/required and how secure connectivity must be configured. This is concrete, service-specific security configuration information (TLS 1.2/1.3 only, SSL deprecated) rather than a generic conceptual overview, fitting the security sub-skill. | @@ -306,7 +303,6 @@ confusable_not_for: Not for Azure SQL Database (use azure-sql-database), Azure S | [Design a real-time dashboard](https://learn.microsoft.com/en-us/azure/postgresql/configure-maintain/tutorial-real-time-dashboard) | architecture-patterns | 0.60 | Shows how to parallelize dashboard queries using elastic clusters; contains pattern-level guidance on distributing queries and data for real-time analytics on this service. | | [Go](https://learn.microsoft.com/en-us/azure/postgresql/connectivity/connect-go) | integrations | 0.60 | Go sample demonstrates driver configuration and connection details specific to Azure PostgreSQL. | | [Java](https://learn.microsoft.com/en-us/azure/postgresql/connectivity/connect-java) | integrations | 0.60 | Shows JDBC connection patterns and authentication options tailored to Azure Database for PostgreSQL. 
| -| [Maintenance release notes index](https://learn.microsoft.com/en-us/azure/postgresql/release-notes-maintenance/release-notes-maintenance-index) | deployment | 0.60 | Mentions region-by-region deployment model and rollout timing; full page likely details maintenance behavior and constraints relevant to deployment/upgrade planning. | | [PHP](https://learn.microsoft.com/en-us/azure/postgresql/connectivity/connect-php) | integrations | 0.60 | PHP connection examples and SQL usage against Azure PostgreSQL are concrete integration patterns. | | [PostgreSQL as a vector store](https://learn.microsoft.com/en-us/azure/postgresql/azure-ai/generative-ai-semantic-search) | architecture-patterns | 0.60 | Hands-on semantic search tutorial likely includes concrete schema, query, and indexing patterns specific to this integration. | | [Premigration validation](https://learn.microsoft.com/en-us/azure/postgresql/migrate/migration-service/concepts-premigration-migration-service) | decision-making | 0.60 | Premigration validation rules help decide readiness and highlight blockers; likely includes rule lists and criteria that guide migration decisions and readiness assessment. | @@ -340,7 +336,6 @@ confusable_not_for: Not for Azure SQL Database (use azure-sql-database), Azure S | [Stop compute of a server](https://learn.microsoft.com/en-us/azure/postgresql/configure-maintain/how-to-stop-server) | 0.45 | Basic operational guide to stop compute; similar to start article, mostly procedural without deep configuration or troubleshooting content. | | [Copy activity](https://learn.microsoft.com/en-us/azure/postgresql/configure-maintain/how-to-data-factory-copy-activity-fabric) | 0.40 | Step-by-step copy activity creation in Fabric; summary does not indicate detailed parameter tables or constraints beyond generic support for copy/bulk insert/upsert. 
| | [Delete a server](https://learn.microsoft.com/en-us/azure/postgresql/configure-maintain/how-to-delete-server) | 0.40 | Step-by-step deletion instructions; operational but not focused on configuration parameters, limits, or troubleshooting mappings. | -| [Fabric mirroring](https://learn.microsoft.com/en-us/azure/postgresql/integration/concepts-fabric-mirroring) | 0.40 | Conceptual overview of Fabric mirroring; summary emphasizes benefits and scenarios, not configuration parameters, limits, or decision matrices. | | [From Amazon Aurora](https://learn.microsoft.com/en-us/azure/postgresql/migrate/migration-service/tutorial-migration-service-aurora-offline) | 0.40 | Offline migration from Amazon Aurora tutorial; likely similar procedural guidance without extensive expert-only tables or error mappings. | | [From Amazon RDS](https://learn.microsoft.com/en-us/azure/postgresql/migrate/migration-service/tutorial-migration-service-rds-offline) | 0.40 | Offline migration from Amazon RDS tutorial; primarily walkthrough content rather than configuration matrices or troubleshooting catalogs. | | [From Google Cloud SQL for PostgreSQL](https://learn.microsoft.com/en-us/azure/postgresql/migrate/migration-service/tutorial-migration-service-cloud-sql-offline) | 0.40 | Offline migration from Google Cloud SQL tutorial; appears to be a step-by-step guide rather than deep config or troubleshooting reference. | @@ -373,6 +368,7 @@ confusable_not_for: Not for Azure SQL Database (use azure-sql-database), Azure S | [API release notes](https://learn.microsoft.com/en-us/azure/postgresql/release-notes/release-notes-api) | 0.30 | API release notes page is a news/update list; not organized as limits, configs, or decision matrices. | | [Add firewall rules](https://learn.microsoft.com/en-us/azure/postgresql/network/how-to-networking-servers-deployed-public-access-add-firewall-rules) | 0.30 | Add firewall rules via portal; summary shows basic steps and UI actions, no parameter reference tables. 
| | [Application conversion limitations](https://learn.microsoft.com/en-us/azure/postgresql/migrate/oracle-application-conversions/app-conversions-limitations) | 0.30 | Describes limitations and unsupported objects conceptually; summary does not indicate numeric limits, quotas, or configuration ranges required for limits-quotas classification. | +| [April 2026](https://learn.microsoft.com/en-us/azure/postgresql/release-notes-maintenance/2026-april) | 0.30 | High-level maintenance release announcement; summary does not expose concrete limits, configuration tables, error codes, or decision matrices. | | [Azure Advisor recommendations](https://learn.microsoft.com/en-us/azure/postgresql/configure-maintain/concepts-azure-advisor-recommendations) | 0.30 | Advisor recommendations overview and FAQ; no concrete numeric limits, config tables, or error-code-based troubleshooting. | | [CLI release notes](https://learn.microsoft.com/en-us/azure/postgresql/release-notes/release-notes-cli) | 0.30 | CLI release notes are change logs; they don’t match the structured best-practices, troubleshooting, or configuration patterns required. | | [Confidential computing](https://learn.microsoft.com/en-us/azure/postgresql/security/security-confidential-computing) | 0.30 | High-level description of confidential computing options; likely conceptual without concrete RBAC roles, config parameters, or numeric constraints. | @@ -385,7 +381,6 @@ confusable_not_for: Not for Azure SQL Database (use azure-sql-database), Azure S | [January 2026](https://learn.microsoft.com/en-us/azure/postgresql/release-notes-maintenance/2026-january) | 0.30 | Similar to the March 2026 notes, the January 2026 maintenance release page summary focuses on announcing a new service version and its rollout date, without evidence of detailed error mappings, configuration parameters, or limits. This makes it more of an overview/announcement than a page containing the structured expert knowledge required for any sub-skill type. 
|
 | [List all backups](https://learn.microsoft.com/en-us/azure/postgresql/backup-restore/how-to-list-all-backups) | 0.30 | Lists backups via portal/CLI; step-by-step usage, not configuration reference or limits documentation. |
 | [March 2026](https://learn.microsoft.com/en-us/azure/postgresql/release-notes-maintenance/2026-march) | 0.30 | The summary describes a high-level maintenance release announcement (new features, enhancements, security patches, and automatic onboarding date). Without detailed tables of fixes, error codes, configuration changes, or limits, it is primarily release marketing/overview content rather than structured expert knowledge as defined by the sub-skill types. |
-| [Monitor and tune overview](https://learn.microsoft.com/en-us/azure/postgresql/monitor/concepts-monitoring) | 0.30 | Conceptual overview of monitoring and metrics for Azure Database for PostgreSQL; summary does not indicate detailed configuration tables, limits, or error-code-based troubleshooting. |
 | [Overview](https://learn.microsoft.com/en-us/azure/postgresql/extensions/concepts-extensions) | 0.30 | Conceptual overview of extensions and modules; no indication of detailed configuration parameters, limits, or troubleshooting mappings. |
 | [Overview](https://learn.microsoft.com/en-us/azure/postgresql/migrate/migration-service/overview-migration-service-postgresql) | 0.30 | Overview of migration service and options; appears conceptual/marketing without detailed limits, configs, or decision matrices. |
 | [Overview Apache AGE for PostgreSQL](https://learn.microsoft.com/en-us/azure/postgresql/azure-ai/generative-ai-age-overview) | 0.30 | Overview of Apache AGE capabilities; conceptual description of graph features without clear indication of detailed configuration, limits, or troubleshooting content. 
|
@@ -395,8 +390,10 @@ confusable_not_for: Not for Azure SQL Database (use azure-sql-database), Azure S
 | [Reset local administrator password](https://learn.microsoft.com/en-us/azure/postgresql/security/security-reset-admin-password) | 0.30 | Reset admin password how-to; simple procedural steps, not deep security configuration or role mapping. |
 | [Restore to custom restore point](https://learn.microsoft.com/en-us/azure/postgresql/backup-restore/how-to-restore-custom-restore-point) | 0.30 | How-to restore to a custom restore point; procedural content without detailed numeric constraints or config parameter reference. |
 | [Restore to latest restore point](https://learn.microsoft.com/en-us/azure/postgresql/backup-restore/how-to-restore-latest-restore-point) | 0.30 | How-to restore to latest restore point; step-by-step instructions without expert-only configuration or limits detail. |
+| [Scheduled maintenance](https://learn.microsoft.com/en-us/azure/postgresql/configure-maintain/concepts-maintenance) | 0.30 | Primarily a conceptual/behavioral description of scheduled maintenance for Azure Database for PostgreSQL flexible server. The summary indicates guidance like avoiding operations during maintenance but does not clearly indicate presence of numeric limits, configuration parameter tables, error codes, or decision matrices. Without evidence of specific values, ranges, or product-specific configuration tables, it does not meet the expert-knowledge criteria for any sub-skill type. |
 | [Switch over read replica to primary](https://learn.microsoft.com/en-us/azure/postgresql/read-replica/how-to-switch-over-replica-to-primary) | 0.30 | Switchover how-to; summary suggests procedural guidance only, no numeric thresholds or config tables. 
|
 | [Troubleshooting guides](https://learn.microsoft.com/en-us/azure/postgresql/troubleshoot/concepts-troubleshooting-guides) | 0.30 | High-level description of troubleshooting guides integrated in the portal; appears to be a conceptual/navigation page without specific error codes or detailed mappings. |
+| [Use Azure Monitor dashboards with Grafana](https://learn.microsoft.com/en-us/azure/postgresql/monitor/how-to-use-dashboards-with-grafana) | 0.30 | Tutorial-style description on using Grafana dashboards in the Azure portal to visualize PostgreSQL metrics and logs. From the summary, it focuses on how to use the integrated experience, not on product-specific configuration matrices, limits, or error-code troubleshooting. |
 | [Virtual endpoints](https://learn.microsoft.com/en-us/azure/postgresql/read-replica/concepts-read-replicas-virtual-endpoints) | 0.30 | Overview of virtual endpoints behavior; no indication of config parameter tables, limits, or decision matrices. |
 | [Approve private endpoint connections](https://learn.microsoft.com/en-us/azure/postgresql/network/how-to-networking-servers-deployed-public-access-approve-private-endpoint) | 0.25 | Approve private endpoint connections; workflow-focused, not configuration matrices or quotas. |
 | [Delete firewall rules](https://learn.microsoft.com/en-us/azure/postgresql/network/how-to-networking-servers-deployed-public-access-delete-firewall-rules) | 0.25 | Delete firewall rules tutorial; procedural content without expert-only configuration details. |
@@ -413,10 +410,12 @@ confusable_not_for: Not for Azure SQL Database (use azure-sql-database), Azure S
 | [Create virtual endpoints](https://learn.microsoft.com/en-us/azure/postgresql/read-replica/how-to-create-virtual-endpoints) | 0.20 | How-to create virtual endpoints; appears to be UI/CLI steps without detailed config tables or limits. 
|
 | [Delete a read replica](https://learn.microsoft.com/en-us/azure/postgresql/read-replica/how-to-delete-read-replica) | 0.20 | Delete read replica tutorial; operational, not configuration or troubleshooting reference. |
 | [Delete virtual endpoints](https://learn.microsoft.com/en-us/azure/postgresql/read-replica/how-to-delete-virtual-endpoints) | 0.20 | Delete virtual endpoints tutorial; operational steps, not configuration reference or limits. |
+| [Fabric mirroring](https://learn.microsoft.com/en-us/azure/postgresql/integration/concepts-fabric-mirroring) | 0.20 | Appears to be a conceptual overview of Fabric mirroring for Azure Database for PostgreSQL, focused on benefits and high-level behavior. No clear indication of numeric limits, configuration parameter tables, error-code-based troubleshooting, or decision matrices from the summary. |
 | [Geo-disaster recovery](https://learn.microsoft.com/en-us/azure/postgresql/backup-restore/concepts-geo-disaster-recovery) | 0.20 | Conceptual overview of geo-disaster recovery and resiliency; summary does not indicate specific numeric limits, configuration parameters, or detailed decision matrices. Primarily high-level behavior and concepts. |
-| [How to migrate vnet private endpoint capable server](https://learn.microsoft.com/en-us/azure/postgresql/network/how-to-migrate-vnet-private-endpoint-capable-server) | 0.20 | Primarily a how-to migration guide from VNet-injected to private endpoint networking; based on the summary, it does not clearly expose configuration tables, specific parameter values, limits, or error-code-based troubleshooting that meet the expert-knowledge criteria. |
+| [Maintenance release notes index](https://learn.microsoft.com/en-us/azure/postgresql/release-notes-maintenance/release-notes-maintenance-index) | 0.20 | Index/overview of maintenance release notes without specific technical details, limits, configuration parameters, or troubleshooting content. 
|
 | [Migrate from Premium SSD to Premium SSD v2 using PITR](https://learn.microsoft.com/en-us/azure/postgresql/compute-storage/concepts-storage-migrate-ssd-to-ssd-v2) | 0.20 | Step-by-step migration via point-in-time restore is procedural tutorial content; the summary suggests no detailed limits tables, config matrices, or error-code-based troubleshooting. It doesn’t clearly map to any expert-knowledge sub-skill type. |
 | [Migrate from Premium SSD to Premium SSD v2 using replication](https://learn.microsoft.com/en-us/azure/postgresql/compute-storage/concepts-storage-replicate-ssd-to-ssd-v2) | 0.20 | Step-by-step migration using replicas is a how-to guide. The summary doesn’t indicate presence of numeric limits, config parameter tables, or decision matrices; it appears as general migration procedure rather than expert-knowledge reference. |
+| [Monitor and tune overview](https://learn.microsoft.com/en-us/azure/postgresql/monitor/concepts-monitoring) | 0.20 | Described as a general overview of monitoring and metrics options for Azure Database for PostgreSQL flexible server. Summary suggests conceptual guidance rather than detailed configuration tables, limits, or troubleshooting mappings. |
 | [Oracle to PostgreSQL schema conversion](https://learn.microsoft.com/en-us/azure/postgresql/migrate/oracle-schema-conversions/schema-conversions-overview) | 0.20 | High-level overview of Oracle-to-PostgreSQL schema conversion using the VS Code PostgreSQL extension and Azure OpenAI. The summary indicates conceptual description of the feature and workflow, but does not mention concrete limits, configuration parameter tables, error codes, or product-specific decision matrices. |
 | [Overview](https://learn.microsoft.com/en-us/azure/postgresql/developer/vs-code-extension/vs-code-overview) | 0.20 | Overview of VS Code extension; largely conceptual and feature description without detailed config tables or error mappings. 
|
 | [Partners](https://learn.microsoft.com/en-us/azure/postgresql/migrate/partners-migration-postgresql) | 0.20 | Lists third-party migration partners and tools; primarily marketing/partner directory content without detailed technical parameters, limits, or configuration specifics. |
diff --git a/products/azure-databricks/azure-databricks.csv b/products/azure-databricks/azure-databricks.csv
index c00c35d7..ef8918d3 100644
--- a/products/azure-databricks/azure-databricks.csv
+++ b/products/azure-databricks/azure-databricks.csv
@@ -1,103 +1,103 @@
 url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type
 https://learn.microsoft.com/en-us/azure/databricks/,Azure Databricks Documentation,Azure Databricks documentation,,"Learn Azure Databricks, a unified analytics platform for data analysts, data engineers, data scientists, and machine learning engineers.","Learn Azure Databricks, a unified analytics platform for data analysts, data engineers, data scientists, and machine learning engineers.",2025-12-02T19:20:00Z,landing-page,,0.0,False,"High-level product documentation landing page; no detailed limits, configs, or troubleshooting content.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/admin/,Administration topics,Administration - Azure Databricks,,"Manage your Databricks account, workspaces, users, security, compute resources, and monitor usage across your organization.","Databricks provides comprehensive administration capabilities to manage your account, workspaces, identities, and resources across your organization. Use these tools to configure account settings, deploy workspaces, manage users and groups, control compute resources, and monitor usage.",2026-04-07T08:00:00.000Z,concept-article,,0.2,False,"High-level administration landing page describing areas like account, workspaces, users, security, and monitoring. 
No indication of specific limits, configuration parameter tables, error codes, or decision matrices; appears to be conceptual/navigation content.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/admin/access-control/tokens,Monitor and manage access to personal access tokens,Monitor and revoke personal access tokens - Azure Databricks,Monitor and revoke Azure Databricks personal access tokens,Manage access authentication for Azure Databricks REST API clients using personal access tokens (PATs).,"To authenticate to the Azure DatabricksREST API, a user can create a personal access token (PAT) and use it in their REST API request. A user can also create a service principal and use it with a personal access token to call Azure Databricks REST APIs in their CI/CD tools and automation. This article explains how Azure Databricks admins can manage personal access tokens in their workspace. To create a personal access token, seeAuthenticate with Azure Databricks personal access tokens (legacy).",2026-01-06T08:00:00.000Z,concept-article,security,0.7,True,"Details admin workflows and controls for listing, monitoring, and revoking PATs in Databricks, including product-specific token management behavior.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/,Account management overview,Manage your Azure Databricks account - Azure Databricks,Configure Azure Databricks account-level settings,"Learn how to manage account-level Azure Databricks configurations, including user management, workspace creation, and diagnostic logging, and learn when to use the Azure Databricks account console or ","This article describes the settings available to account admins on theaccount console. Azure Databricks accounts are managed both through the Azure Databricksaccount consoleand the Azure Portal. 
In the account console, account admins manageUnity Catalog metastores,users and groups, and various account-level settings including feature enablement, email preferences, language settings, and account naming. The Azure Portal is where users with the Azure Contributor or Owner role can create workspaces",2026-01-09T08:00:00.000Z,concept-article,configuration,0.7,True,"Describes specific account console settings (Unity Catalog metastores, feature enablement, email/language/account naming) and when to use account console vs Azure portal. These are concrete configuration options and scopes unique to Databricks.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/account,Manage your subscription,Manage your subscription - Azure Databricks,Manage and change Azure Databricks subscription plans,Learn how to manage your Azure Databricks account subscription. Including deleting the account.,"This article explains how to upgrade, downgrade, or cancel your Azure Databricks subscription.",2025-12-09T08:00:00.000Z,concept-article,decision-making,0.65,True,"Explains how to upgrade, downgrade, or cancel subscriptions. Such content typically includes SKU/plan options, constraints, and implications of each choice, which are decision-making details for subscription management.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/audit-log-delivery,Configure diagnostic log delivery,Configure diagnostic log delivery - Azure Databricks,Configure Azure Databricks diagnostic audit log delivery securely,Learn how to configure audit log delivery.,"Note Databricks recommends using the audit log system table (system.access.audit) to access your account's audit logs. SeeAudit log system table reference. This article describes how to enable diagnostic log delivery for your Azure Databricks workspaces. Note Diagnostic logs require thePremium plan. 
Log in to the Azure portal as an Owner, Contributor, or as a user with a custom role with theMicrosoft.Databricks/workspaces/assignWorkspaceAdmin/actionpermission for the Azure Databricks workspace. ",2026-03-18T08:00:00.000Z,concept-article,security,0.7,True,"Describes how to enable audit log delivery with specific Azure roles and permissions (e.g., required actions on Microsoft.Databricks/workspaces), which are concrete security configuration details.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/audit-logs,Diagnostic logs reference guide,Diagnostic log reference - Azure Databricks,Interpret Azure Databricks diagnostic audit logs,Learn which services and events are recorded in the audit logs.,Note This feature requires thePremium plan. This article provides you with a comprehensive reference of audit log services and events. The availability of these services depends on how you access the logs: Note Azure Databricks retains a copy of audit logs for up to 1 year for security and fraud analysis purposes.,2026-04-16T08:00:00.000Z,concept-article,configuration,0.7,True,"Provides a comprehensive reference of audit log services and events, which is product-specific expert knowledge (event names, services, and log structure) used for configuring and consuming logs.",updated
-https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/budgets,Create and monitor budgets,Create and monitor budgets - Azure Databricks,Configure and monitor Azure Databricks account budgets,Learn how to set up and monitor budgets in your Azure Databricks account.,"This article explains how to use budgets to track spending in your Azure Databricks account. Budgets enable you to monitor usage across your account. You can set up budgets to either track account-wide spending, or apply filters to track the spending of specific teams, projects, or workspaces. 
Budgets are measured in USD and use the list price of the SKU, including platform add-ons. Important Budgets improve your ability to monitor usage. They do not stop usage or charges from happening. Your fi",2026-04-15T08:00:00.000Z,concept-article,configuration,0.6,True,"Details how budgets work (USD, SKU list price, platform add-ons) and how they are applied and monitored. These are product-specific configuration semantics for budgets that an LLM is unlikely to know exactly.",updated
-https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/custom-url,Enable your custom URL,Enable your custom URL - Azure Databricks,Enable and configure custom URL for Databricks account,"Learn how to enable a custom URL for your Databricks account to provide a single, branded entry point for all users and workspaces.","A custom URL gives your Azure Databricks account a single, branded entry point (for example:acme.databricks.com) that replaces the collection of per-workspace URLs that account users navigate today. After enabling a custom URL, users can log in to their Azure Databricks account and move freely between workspaces and Databricks One without reauthenticating.",2026-03-31T23:28:00.000Z,concept-article,configuration,0.7,True,"Custom URL setup involves specific configuration fields, DNS/host requirements, and behavior across workspaces. These are concrete configuration parameters and behaviors unique to Databricks accounts.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/legacy-features,Disable legacy features,Disable access to legacy features in new workspaces - Azure Databricks,Configure disabling legacy features in new Databricks workspaces,Learn how to remove access to legacy features across your Databricks workspaces.,"Note This setting is not available for accounts created after December 19, 2025. Accounts created after this date won't have access to legacy features by default. 
This page describes how to use theDisable legacy featuresaccount setting so new workspaces in your account are provisioned without access to legacy features. This setting disables the following legacy features in all new workspaces in your account: Workspace admins can use workspace-level settings to disable legacy features in existing",2026-03-19T18:35:00.000Z,concept-article,configuration,0.7,True,"Describes an account setting that controls legacy feature availability, including date-based behavior and which features are affected. This is product-specific configuration behavior not derivable from generic knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/no-isolation-shared,"Enable admin protection for ""No Isolation Shared"" clusters on your account",Enable admin protection for no isolation shared clusters in your account - Azure Databricks,Configure admin protection for no-isolation shared clusters,Learn how to enable admin protection for “No isolation shared” clusters on your account.,This page describes how account admins can use an account-level setting to prevent internal credentials from being automatically generated for Azure Databricks workspace admins on no isolation shared clusters. Note Workspace admins can use theEnforce User Isolation settingto disable the use of no isolation shared clusters in a workspace. Account admins can disable no isolation shared clusters in all new workspaces using theDisable legacy featuressetting. 
The admin protection for No Isolation Sha,2025-10-07T08:00:00.000Z,concept-article,security,0.7,True,"Describes a specific account-level security setting that controls automatic internal credential generation for workspace admins on no isolation shared clusters, including product-specific behavior and constraints that are not generic knowledge.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/admin/,Administration topics,Administration - Azure Databricks,,"Manage your Databricks account, workspaces, users, security, compute resources, and monitor usage across your organization.","Databricks provides comprehensive administration capabilities to manage your account, workspaces, identities, and resources across your organization. Use these tools to configure account settings, deploy workspaces, manage users and groups, control compute resources, and monitor usage.",2026-04-22T08:00:00.000Z,landing-page,,0.3,False,"High-level administration landing page that aggregates topics; no indication of detailed limits, configs, or error mappings.",updated
+https://learn.microsoft.com/en-us/azure/databricks/admin/access-control/tokens,Monitor and manage access to personal access tokens,Monitor and revoke personal access tokens - Azure Databricks,Monitor and revoke Azure Databricks personal access tokens,Manage access authentication for Azure Databricks REST API clients using personal access tokens (PATs).,"To authenticate to the Azure DatabricksREST API, a user can create a personal access token (PAT) and use it in their REST API request. A user can also create a service principal and use it with a personal access token to call Azure Databricks REST APIs in their CI/CD tools and automation. This article explains how Azure Databricks admins can manage personal access tokens in their workspace. 
To create a personal access token, seeAuthenticate with Azure Databricks personal access tokens (legacy).",2026-04-22T08:00:00.000Z,how-to,security,0.75,True,"Explains admin workflows and controls for listing and revoking PATs in a workspace—detailed, product-specific security configuration/operations.",updated
+https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/,Account management overview,Manage your Azure Databricks account - Azure Databricks,Configure Azure Databricks account-level settings,"Learn how to manage account-level Azure Databricks configurations, including user management, workspace creation, and diagnostic logging, and learn when to use the Azure Databricks account console or ","This article describes the settings available to account admins on theaccount console. Azure Databricks accounts are managed both through the Azure Databricksaccount consoleand the Azure Portal. In the account console, account admins manageUnity Catalog metastores,users and groups, and various account-level settings including feature enablement, email preferences, language settings, and account naming. The Azure Portal is where users with the Azure Contributor or Owner role can create workspaces",2026-04-22T08:00:00.000Z,landing-page,configuration,0.7,True,"Describes concrete account-level settings (Unity Catalog metastores, feature enablement, email and language settings) and when to use account console vs Azure portal, which are product-specific configuration details.",updated
+https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/account,Manage your subscription,Manage your subscription - Azure Databricks,Manage and change Azure Databricks subscriptions,Learn how to manage your Azure Databricks account subscription. 
Including deleting the account.,"This article explains how to upgrade, downgrade, or cancel your Azure Databricks subscription.",2026-04-22T08:00:00.000Z,how-to,decision-making,0.65,True,"Explains how to upgrade, downgrade, or cancel subscriptions; likely includes SKU/tier implications and concrete steps, providing product-specific decision and configuration guidance.",updated
+https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/audit-log-delivery,Configure diagnostic log delivery,Configure diagnostic log delivery - Azure Databricks,Configure Azure Databricks diagnostic log delivery,Learn how to configure audit log delivery.,"Note Databricks recommends using the audit log system table (system.access.audit) to access your account's audit logs. SeeAudit log system table reference. This article describes how to enable diagnostic log delivery for your Azure Databricks workspaces. Note Diagnostic logs require thePremium plan. Log in to the Azure portal as an Owner, Contributor, or as a user with a custom role with theMicrosoft.Databricks/workspaces/assignWorkspaceAdmin/actionpermission for the Azure Databricks workspace. ",2026-04-22T08:00:00.000Z,how-to,configuration,0.65,True,Contains specific configuration steps and required Azure permissions (including exact action name) for enabling log delivery—product-specific configuration details.,updated
+https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/audit-logs,Diagnostic logs reference guide,Diagnostic log reference - Azure Databricks,Reference Azure Databricks diagnostic log services and events,Learn which services and events are recorded in the audit logs.,Note This feature requires thePremium plan. This article provides you with a comprehensive reference of audit log services and events. 
The availability of these services depends on how you access the logs: Note Azure Databricks retains a copy of audit logs for up to 1 year for security and fraud analysis purposes.,2026-04-22T08:00:00.000Z,reference,configuration,0.7,True,"Provides a comprehensive reference of audit/diagnostic log services and event types; this is detailed mapping of services to events, akin to configuration metadata.",updated
+https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/budgets,Create and monitor budgets,Create and monitor budgets - Azure Databricks,Create and monitor Azure Databricks spending budgets,Learn how to set up and monitor budgets in your Azure Databricks account.,"This article explains how to use budgets to track spending in your Azure Databricks account. Budgets enable you to monitor usage across your account. You can set up budgets to either track account-wide spending, or apply filters to track the spending of specific teams, projects, or workspaces. Budgets are measured in USD and use the list price of the SKU, including platform add-ons. Important Budgets improve your ability to monitor usage. They do not stop usage or charges from happening. Your fi",2026-04-22T21:32:00.000Z,how-to,decision-making,0.7,True,"Details how budgets are defined (USD, list price, filters) and how to apply them to teams/projects; this is concrete guidance for planning and monitoring spend decisions.",updated
+https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/custom-url,Enable your custom URL,Enable your custom URL - Azure Databricks,Configure a custom URL for Databricks account access,"Learn how to enable a custom URL for your Databricks account to provide a single, branded entry point for all users and workspaces.","A custom URL gives your Azure Databricks account a single, branded entry point (for example:acme.databricks.com) that replaces the collection of per-workspace URLs that account users navigate today. 
After enabling a custom URL, users can log in to their Azure Databricks account and move freely between workspaces and Databricks One without reauthenticating.",2026-04-22T08:00:00.000Z,how-to,configuration,0.7,True,Covers enabling a custom branded URL for the account; this typically involves specific configuration steps and parameters unique to Azure Databricks.,updated
+https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/legacy-features,Disable legacy features,Disable access to legacy features in new workspaces - Azure Databricks,Disable legacy features in new Databricks workspaces,Learn how to remove access to legacy features across your Databricks workspaces.,"Note This setting is not available for accounts created after December 19, 2025. Accounts created after this date won't have access to legacy features by default. This page describes how to use theDisable legacy featuresaccount setting so new workspaces in your account are provisioned without access to legacy features. 
This setting disables the following legacy features in all new workspaces in your account: Workspace admins can use workspace-level settings to disable legacy features in existing",2026-04-22T08:00:00.000Z,how-to,configuration,0.75,True,"Describes an account-level setting that disables specific legacy features for new workspaces and includes a date-based availability rule, which is product-specific configuration behavior.",updated
+https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/no-isolation-shared,"Enable admin protection for ""No Isolation Shared"" clusters on your account",Enable admin protection for no isolation shared clusters in your account - Azure Databricks,Secure no isolation shared clusters with admin protection,Learn how to enable admin protection for “No isolation shared” clusters on your account.,This page describes how account admins can use an account-level setting to prevent internal credentials from being automatically generated for Azure Databricks workspace admins on no isolation shared clusters. Note Workspace admins can use theEnforce User Isolation settingto disable the use of no isolation shared clusters in a workspace. Account admins can disable no isolation shared clusters in all new workspaces using theDisable legacy featuressetting. The admin protection for No Isolation Sha,2026-04-22T08:00:00.000Z,how-to,security,0.8,True,"Details an account-level setting to prevent automatic internal credential generation for admins on no isolation shared clusters and references related isolation settings, which are product-specific security controls.",updated
 https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/serverless-quotas,Serverless quotas,Serverless compute quotas - Azure Databricks,Review Azure Databricks serverless compute quotas,Learn about quotas for serverless compute version of the Azure Databricks platform architecture.,"Serverless quotas are a safety measure forserverless compute. 
Quotas are enforced on serverless compute for notebooks, jobs, Lakeflow Spark Declarative Pipelines, and forserverless SQL warehouses. Quotas are measured in Databricks Units (DBUs) per hour. A DBU is a normalized unit of processing power on the Databricks platform used for measurement and pricing purposes. Seethe pricing page.",2025-11-12T19:26:00.000Z,concept-article,limits-quotas,0.9,True,Explicitly about quotas for serverless compute; such pages typically list DBU/hour limits and possibly per-workspace or per-feature caps that are not knowable from training data.,unchanged
-https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/standard-tier,End of life for Standard tier workspaces on Azure Databricks,End of life for Standard tier workspaces on Azure Databricks - Azure Databricks,Plan migration from Databricks Standard to Premium tier,Information on the deprecation of Databricks Standard tier workspaces and instructions for upgrading to a higher tier.,"Databricks has announced the end of life for Standard tier workspaces on Azure Databricks. Customers have until October 1, 2026, to upgrade existing workspaces to the Premium tier. Any remaining Standard tier workspaces on October 1, 2026, will be automatically upgraded to the Premium tier. There is no impact to existing workspaces until Oct 1, 2026. Refer to thepricing pageto assess how this upgrade might impact your bill. 
Starting April 1, 2026, all new workspaces must be created on the Premiu",2026-03-10T23:23:00.000Z,concept-article,decision-making,0.7,True,Contains concrete deprecation and upgrade deadlines (specific dates) and guidance on upgrading tiers; this is decision-focused information (when to move tiers) that changes over time and is not inferable from training alone.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/usage,Usage dashboards,Usage dashboards - Azure Databricks,,Learn how to access usage dashboards to monitor your Azure Databricks account spend.,"This article explains how to import pre-built usage dashboards to your workspaces to monitor account- and workspace-level usage. To access the Usage page, as an account admin go to the account console and click the Usage icon.",2026-03-31T23:28:00.000Z,concept-article,,0.3,False,High-level explanation of importing and using pre-built usage dashboards; likely a navigation/overview page without detailed parameter tables or product-specific limits.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/usage-detail-tags,Monitor usage using tags,Use tags to attribute and track usage - Azure Databricks,,"Learn how to add custom tags to resources to monitor cost and accurately attribute Azure Databricks usage to your organization's business units and teams (for example, for chargebacks).","This article explains how to use tags to attribute compute usage to specific workspaces, teams, projects, or users to support cost tracking and budgeting. There are two types of tags: Warning Tag data is stored as plain text and may be replicated globally. Do not use tag names, values, or descriptors that could compromise the security of your resources.
For example, do not use tag names, values or descriptors that contain personal or sensitive information.",2026-04-15T08:00:00.000Z,concept-article,,0.3,False,"Explains how to use tags for cost attribution conceptually; summary does not indicate detailed parameter tables, limits, or product-specific error codes.",updated -https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/verbose-logs,Enable verbose logs,Enable verbose audit logs - Azure Databricks,Enable and configure verbose audit logs in Databricks,Learn how to enable verbose audit logs in your workspace.,"Verbose audit logs are additional records generated whenever a query or command is run in your workspace. These logs record the text of each command or query. By default, these logs are not enabled in workspaces. To enable or disable verbose audit logs, do the following: When you enable or disable verbose logging, an auditable event is emitted in the category workspace with action workspaceConfKeys. The workspaceConfKeys request parameter is enableVerboseAuditLogs.
The request parameter workspaceConfVa",2026-03-11T18:04:00.000Z,concept-article,configuration,0.8,True,"Documents specific workspace configuration keys (enableVerboseAuditLogs, workspaceConfKeys) and emitted events, which are Databricks-specific configuration parameters.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/admin-concepts,Administration overview,Databricks administration overview - Azure Databricks,,"Core concepts and foundational information about Databricks administration privileges, responsibilities, and admin types.",This article provides an introduction to Azure Databricks admin privileges and responsibilities.,2026-04-15T08:00:00.000Z,concept-article,,0.1,False,"High-level overview of Azure Databricks admin concepts and roles; no specific limits, configuration parameters, error codes, or decision matrices with quantified criteria.",updated -https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/automatic-cluster-update,Automatic cluster update,Automatic cluster update - Azure Databricks,,Learn how to schedule automatic updates of compute resources.,"Automatic cluster update ensures that all the clusters in a workspace are periodically updated to the latest host OS image and security updates. Account admins can schedule the maintenance window frequency, start date, and start time. If there are no updates for images for running compute resources, by default they are not restarted but you can configure the feature to force restart during the maintenance window.
This applies to all compute resources that run in the classic compute plane: cluster",2026-02-18T08:00:00.000Z,concept-article,,0.4,False,"Describes what automatic cluster update does and how to schedule it, but summary does not indicate specific configuration parameter tables, limits, or product-unique gotchas; likely a procedural/how-to page rather than expert configuration reference.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/personal-compute,Manage the Personal Compute cluster policy,Manage the Personal Compute policy - Azure Databricks,Configure and enforce Personal Compute policy in Azure Databricks,Learn how the personal compute default policy works and how to enable it.,"Personal Compute is an Azure Databricks-managed policy available, by default, on all Azure Databricks workspaces. Granting users access to this policy enables them to create single-machine compute resources in Azure Databricks for their individual use. Users can create the personal compute resource quickly using shortcuts in either a notebook or the Compute page. With the Personal Compute policy, administrators have a simple solution for providing limited compute creation privileges. This removes ",2025-10-07T08:00:00.000Z,concept-article,configuration,0.7,True,"Explains how the Personal Compute default policy works and how to enable/manage it, including constraints on compute creation; these are specific configuration behaviors for this product feature.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/policies,Create and manage compute policies,Create and manage compute policies - Azure Databricks,,Learn how to use policies that restrict cluster creation capabilities for users and user groups according to a predefined set of rules.,"This article explains how to create and manage policies in your workspace. For information on writing policy definitions, see Compute policy reference.
Note Policies require the Premium plan.",2026-04-10T08:00:00.000Z,concept-article,,0.2,False,"High-level guidance on creating and managing compute policies; the summary does not indicate detailed parameter tables, limits, or product-specific gotchas. It appears more like conceptual/administrative how-to content than expert reference.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/policy-definition,Compute policy reference,Compute policy reference - Azure Databricks,Define Azure Databricks compute policy attributes,Learn about the available attributes you can use when defining a compute policy.,"Note This page uses JSON examples of policy definitions. You can define a policy using JSON, or use the compute policy UI to configure the policy definitions using dropdown menus and other UI elements. This page is a reference for compute policy definitions, including a list of available policy attributes and limitation types. There are also sample policies you can reference for common use cases.",2026-03-17T08:00:00.000Z,concept-article,configuration,0.9,True,"Explicitly a 'compute policy reference' listing available policy attributes and limitation types, with JSON examples and sample policies. This is a configuration reference with specific attribute names, allowed values, and example configurations that are product-specific expert knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/policy-families,Default policies and policy families,Default policies and policy families - Azure Databricks,Use Databricks default compute policy families effectively,Learn how to use the default compute policy families to create compute and new compute policies.,"Note Policies require the Premium plan. This article describes the default compute policies available in your workspace and explains how to use them to create custom policies. In your workspace, four default policies are designed for four different use cases.
The policies are the following: Each policy includes rules that enforce best practices for its specific use case. Workspace admins can override or add rules to these policies. Additionally, you can create new custom policies using these defa",2026-01-23T23:40:00.000Z,concept-article,best-practices,0.7,True,"Describes default policies for specific use cases and notes that each enforces best practices; includes concrete policy behaviors and how to customize them, which are product-specific best practices.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/standard-tier,End of life for Standard tier workspaces on Azure Databricks,End of life for Standard tier workspaces on Azure Databricks - Azure Databricks,Plan for Databricks Standard tier end of life,Information on the deprecation of Databricks Standard tier workspaces and instructions for upgrading to a higher tier.,"Databricks has announced the end of life for Standard tier workspaces on Azure Databricks. Customers have until October 1, 2026, to upgrade existing workspaces to the Premium tier. Any remaining Standard tier workspaces on October 1, 2026, will be automatically upgraded to the Premium tier. There is no impact to existing workspaces until Oct 1, 2026. Refer to the pricing page to assess how this upgrade might impact your bill.
Starting April 1, 2026, all new workspaces must be created on the Premiu",2026-04-22T08:00:00.000Z,upgrade-and-migration-article,decision-making,0.85,True,"Provides concrete dates and required actions for upgrading from Standard to Premium tier, with billing impact considerations, which is explicit SKU/tier decision and migration guidance.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/usage,Usage dashboards,Usage dashboards - Azure Databricks,Import and use Azure Databricks usage dashboards,Learn how to access usage dashboards to monitor your Azure Databricks account spend.,"This article explains how to import pre-built usage dashboards to your workspaces to monitor account- and workspace-level usage. To access the Usage page, as an account admin go to the account console and click the Usage icon.",2026-04-22T08:00:00.000Z,how-to,configuration,0.6,True,Explains how to access and import specific pre-built dashboards from the account console—product-specific configuration/usage steps.,updated +https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/usage-detail-tags,Monitor usage using tags,Use tags to attribute and track usage - Azure Databricks,Tag Azure Databricks resources for cost attribution,"Learn how to add custom tags to resources to monitor cost and accurately attribute Azure Databricks usage to your organization's business units and teams (for example, for chargebacks).","This article explains how to use tags to attribute compute usage to specific workspaces, teams, projects, or users to support cost tracking and budgeting. There are two types of tags: Warning Tag data is stored as plain text and may be replicated globally. Do not use tag names, values, or descriptors that could compromise the security of your resources.
For example, do not use tag names, values or descriptors that contain personal or sensitive information.",2026-04-22T08:00:00.000Z,how-to,best-practices,0.65,True,Gives Databricks-specific guidance on using tags for cost tracking and includes security warnings about tag content; these are product-specific DO/DON’T recommendations.,updated +https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/verbose-logs,Enable verbose logs,Enable verbose audit logs - Azure Databricks,Enable and manage verbose audit logs in Databricks,Learn how to enable verbose audit logs in your workspace.,"Verbose audit logs are additional records generated whenever a query or command is run in your workspace. These logs record the text of each command or query. By default, these logs are not enabled in workspaces. To enable or disable verbose audit logs, do the following: When you enable or disable verbose logging, an auditable event is emitted in the category workspace with action workspaceConfKeys. The workspaceConfKeys request parameter is enableVerboseAuditLogs.
The request parameter workspaceConfVa",2026-04-22T08:00:00.000Z,how-to,configuration,0.7,True,Documents the exact workspace configuration keys and parameters used to toggle verbose audit logs—concrete configuration settings.,updated +https://learn.microsoft.com/en-us/azure/databricks/admin/admin-concepts,Administration overview,Databricks administration overview - Azure Databricks,,"Core concepts and foundational information about Databricks administration privileges, responsibilities, and admin types.",This article provides an introduction to Azure Databricks admin privileges and responsibilities.,2026-04-22T08:00:00.000Z,concept-article,,0.2,False,"Conceptual overview of admin roles and responsibilities without product-specific numeric limits, configs, or troubleshooting content.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/automatic-cluster-update,Automatic cluster update,Automatic cluster update - Azure Databricks,Configure automatic cluster updates in Databricks,Learn how to schedule automatic updates of compute resources.,"Automatic cluster update ensures that all the clusters in a workspace are periodically updated to the latest host OS image and security updates. Account admins can schedule the maintenance window frequency, start date, and start time. If there are no updates for images for running compute resources, by default they are not restarted but you can configure the feature to force restart during the maintenance window.
This applies to all compute resources that run in the classic compute plane: cluster",2026-04-22T08:00:00.000Z,how-to,configuration,0.7,True,"Explains scheduling maintenance windows, frequency, and restart behavior for automatic cluster updates, which are concrete configuration options for Databricks compute.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/personal-compute,Manage the Personal Compute cluster policy,Manage the Personal Compute policy - Azure Databricks,Use and configure the Personal Compute policy,Learn how the personal compute default policy works and how to enable it.,"Personal Compute is an Azure Databricks-managed policy available, by default, on all Azure Databricks workspaces. Granting users access to this policy enables them to create single-machine compute resources in Azure Databricks for their individual use. Users can create the personal compute resource quickly using shortcuts in either a notebook or the Compute page. With the Personal Compute policy, administrators have a simple solution for providing limited compute creation privileges. This removes ",2026-04-22T08:00:00.000Z,how-to,decision-making,0.65,True,"Explains how the Personal Compute policy works, when to grant it, and how it limits compute creation—guidance for choosing this policy for certain scenarios.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/policies,Create and manage compute policies,Create and manage compute policies - Azure Databricks,Create and manage Databricks compute policies,Learn how to use policies that restrict cluster creation capabilities for users and user groups according to a predefined set of rules.,"This article explains how to create and manage policies in your workspace. For information on writing policy definitions, see Compute policy reference.
Note Policies require the Premium plan.",2026-04-22T08:00:00.000Z,how-to,configuration,0.6,True,"Explains how to configure compute policies in the workspace, tying into a reference of policy attributes; this is concrete product configuration behavior.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/policy-definition,Compute policy reference,Compute policy reference - Azure Databricks,Define Databricks compute policies with JSON attributes,Learn about the available attributes you can use when defining a compute policy.,"Note This page uses JSON examples of policy definitions. You can define a policy using JSON, or use the compute policy UI to configure the policy definitions using dropdown menus and other UI elements. This page is a reference for compute policy definitions, including a list of available policy attributes and limitation types. There are also sample policies you can reference for common use cases.",2026-04-22T08:00:00.000Z,reference,configuration,0.9,True,"Lists all available policy attributes, limitation types, and sample JSON policy definitions—classic configuration reference with parameter names and allowed values.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/policy-families,Default policies and policy families,Default policies and policy families - Azure Databricks,Apply default Databricks compute policy families,Learn how to use the default compute policy families to create compute and new compute policies.,"Note Policies require the Premium plan. This article describes the default compute policies available in your workspace and explains how to use them to create custom policies. In your workspace, four default policies are designed for four different use cases. The policies are the following: Each policy includes rules that enforce best practices for its specific use case. Workspace admins can override or add rules to these policies.
Additionally, you can create new custom policies using these defa",2026-04-22T08:00:00.000Z,reference,best-practices,0.7,True,Describes default policies for specific use cases and the rules they enforce as best practices; includes product-specific recommendations for when to use each policy.,updated https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/web-terminal,Enable web terminal,Enable the web terminal - Azure Databricks,Enable and manage the Azure Databricks web terminal,Learn how to enable and manage the Azure Databricks web terminal.,"To use the web terminal on your clusters, a workspace admin must enable the web terminal as follows: Note After you enable the feature in the admin settings, you may need to wait a few minutes for the configuration value to propagate to all clusters. You also need to refresh your page for changes to take effect in your local session.",2024-05-03T21:32:00.000Z,concept-article,configuration,0.7,True,Describes enabling web terminal via admin settings and propagation behavior; involves specific workspace/cluster configuration toggles.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/disaster-recovery,Disaster recovery overview,Disaster recovery - Azure Databricks,Plan disaster recovery architecture for Azure Databricks,Learn about disaster recovery planning with Azure Databricks.,"A clear disaster recovery pattern is critical for a cloud-native data analytics platform such as Azure Databricks. It's critical that your data teams can use the Azure Databricks platform even in the rare case of a regional service-wide cloud-service provider outage, whether caused by a regional disaster like a hurricane or earthquake, or another source.
Azure Databricks is often a core part of an overall data ecosystem that includes many services, including upstream data ingestion services (bat",2026-03-18T08:00:00.000Z,concept-article,architecture-patterns,0.65,True,Covers Databricks-specific disaster recovery patterns and how the service behaves in regional outages as part of a broader data ecosystem; this is product-specific DR architecture guidance beyond generic DR concepts.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/governed-tags/,Governed tags,Governed tags - Azure Databricks,Understand governed tags for policy enforcement in Databricks,"Learn how to create governed tags, which automatically enforce policies that restrict the ability to create, manage, and use tags.","This page provides an overview of governed tags in Azure Databricks. To create and manage governed tags, see Create and manage governed tags. To apply tags, see Apply tags to Unity Catalog securable objects. Warning Tag data is stored as plain text and may be replicated globally. Do not use tag names, values, or descriptors that could compromise the security of your resources. For example, do not use tag names, values or descriptors that contain personal or sensitive information.",2026-04-02T22:35:00.000Z,concept-article,security,0.65,True,"Governed tags automatically enforce policies; the page includes security-related warnings and describes how tags affect permissions, which is product-specific governance and security behavior.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/governed-tags/manage-governed-tags,Create and manage governed tags,Create and manage governed tags - Azure Databricks,,"Learn how to create and manage governed tags, which restrict how tags can be created, assigned, and used.","This page explains how to create and manage governed tags across your account. For an overview of governed tags, see Governed tags. Warning Tag data is stored as plain text and may be replicated globally.
Do not use tag names, values, or descriptors that could compromise the security of your resources. For example, do not use tag names, values or descriptors that contain personal or sensitive information.",2026-04-13T21:03:00.000Z,concept-article,,0.2,False,"The page describes how to create and manage governed tags and includes a general security warning about tag data being stored as plain text and replicated globally. From the summary, it appears to be a procedural/how-to page without detailed RBAC roles, configuration parameter tables, or other expert-only specifics. It does not clearly match limits, configuration, troubleshooting, or other expert categories based on the provided excerpt.",updated -https://learn.microsoft.com/en-us/azure/databricks/admin/governed-tags/manage-permissions,Manage governed tag permissions,Manage permissions on governed tags - Azure Databricks,Manage permissions and access control on governed tags,"Learn how to manage permissions that restrict the ability to create, manage, and use governed tags.","This page explains how to grant permissions on governed tags. For an overview of governed tags, see Governed tags.",2026-04-01T08:00:00.000Z,concept-article,security,0.75,True,Focuses on granting permissions on governed tags; includes specific permission names and scopes that define how tag-based governance is secured.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/sql/,SQL warehouse admin settings,SQL warehouse admin settings - Azure Databricks,Configure SQL warehouse admin settings and access controls,Configure workspace level permissions and settings for SQL warehouses.,"This article explains the SQL warehouse settings and access controls available to workspace admins. Databricks recommends retaining the default settings for all workspace-level configurations for SQL warehouses.
These settings assume that workspace admins are responsible for creating and configuring all SQL warehouses and that you use Unity Catalog for data governance. Workspace administrators can configure the following settings for an Azure Databricks workspace: Note By default, all users have",2026-03-24T22:15:00.000Z,concept-article,configuration,0.8,True,"Details workspace-level configuration options and access controls for SQL warehouses, including default behaviors and Unity Catalog assumptions, which are Databricks-specific configuration parameters.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/sql/data-access-configuration,Data access configuration,Data access configurations - Azure Databricks,Configure SQL warehouse data access properties in Databricks,Learn how workspace admins configure access to external data objects for SQL warehouses.,Important These instructions apply to legacy data access patterns. Databricks recommends using Unity Catalog external locations for data access. See Connect to cloud object storage using Unity Catalog. This article describes how to manage data access properties for SQL warehouses in a workspace. Important Changing these settings restarts all running SQL warehouses.,2026-04-14T08:00:00.000Z,concept-article,security,0.7,True,"Data access configuration for SQL warehouses is security-related and usually includes specific properties, flags, and access modes that control external data object access; these are product-specific IAM/data access settings beyond generic knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/disaster-recovery,Disaster recovery overview,Disaster recovery - Azure Databricks,Design disaster recovery patterns for Azure Databricks,Learn about disaster recovery planning with Azure Databricks.,"A clear disaster recovery pattern is critical for a cloud-native data analytics platform such as Azure Databricks.
It's critical that your data teams can use the Azure Databricks platform even in the rare case of a regional service-wide cloud-service provider outage, whether caused by a regional disaster like a hurricane or earthquake, or another source. Azure Databricks is often a core part of an overall data ecosystem that includes many services, including upstream data ingestion services (bat",2026-04-22T08:00:00.000Z,best-practice,architecture-patterns,0.65,True,"Focuses on DR planning for Databricks within a broader data ecosystem; likely includes Databricks-specific DR patterns and guidance on when to use them, which is architecture and pattern guidance specific to the service.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/governed-tags/,Governed tags,Governed tags - Azure Databricks,,"Learn how to create governed tags, which automatically enforce policies that restrict the ability to create, manage, and use tags.","This page provides an overview of governed tags in Azure Databricks. To create and manage governed tags, see Create and manage governed tags. To apply tags, see Apply tags to Unity Catalog securable objects. Warning Tag data is stored as plain text and may be replicated globally. Do not use tag names, values, or descriptors that could compromise the security of your resources.
For example, do not use tag names, values or descriptors that contain personal or sensitive information.",2026-04-22T08:00:00.000Z,overview,,0.2,False,"Described as an overview of governed tags with a general warning about tag data; likely conceptual governance explanation without detailed permissions, config tables, or numeric limits.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/governed-tags/manage-governed-tags,Create and manage governed tags,Create and manage governed tags - Azure Databricks,Create and manage governed tags in Unity Catalog,"Learn how to create and manage governed tags, which restrict how tags can be created, assigned, and used.","This page explains how to create and manage governed tags across your account. For an overview of governed tags, see Governed tags. Warning Tag data is stored as plain text and may be replicated globally. Do not use tag names, values, or descriptors that could compromise the security of your resources. For example, do not use tag names, values or descriptors that contain personal or sensitive information.",2026-04-23T17:47:00.000Z,how-to,security,0.7,True,"Creating and managing governed tags in Databricks is tightly tied to access control and policy enforcement; such a page typically documents specific tag-related controls and constraints unique to the product, which falls under product-specific security/governance configuration rather than generic tagging concepts.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/governed-tags/manage-permissions,Manage governed tag permissions,Manage permissions on governed tags - Azure Databricks,Manage permissions for governed tags in Databricks,"Learn how to manage permissions that restrict the ability to create, manage, and use governed tags.","This page explains how to grant permissions on governed tags.
For an overview of governed tags, see Governed tags.",2026-04-23T17:47:00.000Z,how-to,security,0.9,True,"A page focused on managing permissions for governed tags will contain product-specific RBAC roles, privileges, and permission scopes for tag creation and usage, which is expert security knowledge per the criteria.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/sql/,SQL warehouse admin settings,SQL warehouse admin settings - Azure Databricks,Configure SQL warehouse admin settings in Databricks,Configure workspace level permissions and settings for SQL warehouses.,"This article explains the SQL warehouse settings and access controls available to workspace admins. Databricks recommends retaining the default settings for all workspace-level configurations for SQL warehouses. These settings assume that workspace admins are responsible for creating and configuring all SQL warehouses and that you use Unity Catalog for data governance. Workspace administrators can configure the following settings for an Azure Databricks workspace: Note By default, all users have",2026-04-23T08:00:00.000Z,landing-page,configuration,0.82,True,"Covers workspace-level SQL warehouse settings and access controls with product-specific options and defaults, fitting configuration.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/sql/data-access-configuration,Data access configuration,Data access configurations - Azure Databricks,Configure legacy data access for Databricks SQL warehouses,Learn how workspace admins configure access to external data objects for SQL warehouses.,Important These instructions apply to legacy data access patterns. Databricks recommends using Unity Catalog external locations for data access. See Connect to cloud object storage using Unity Catalog. This article describes how to manage data access properties for SQL warehouses in a workspace.
Important Changing these settings restarts all running SQL warehouses.,2026-04-22T08:00:00.000Z,how-to,configuration,0.8,True,"Describes workspace data access properties for SQL warehouses and their impact (restarts), which are detailed configuration options.",updated https://learn.microsoft.com/en-us/azure/databricks/admin/sql/serverless,Enable serverless SQL warehouses,Set up serverless SQL warehouses - Azure Databricks,Set up and manage serverless SQL warehouses,Learn about serverless SQL warehouses and how to manage them.,"This page explains how to set up serverless SQL warehouses for your workspace. Important Serverless SQL warehouses are enabled by default. Review this page to check that your workspace meets the necessary requirements. To make changes as described on this page, you must have Contributor or Owner permissions on the Azure workspace resource. Note Serverless SQL warehouses do not have public IP addresses. For more architectural information, see High-level architecture.",2026-03-17T08:00:00.000Z,how-to,configuration,0.75,True,"Explains requirements and setup for serverless SQL warehouses, including that they are enabled by default and have no public IP addresses—service-specific configuration and behavior.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/,System tables overview,System tables reference - Azure Databricks,Use Azure Databricks system tables for monitoring,"Learn about Azure Databricks system tables for monitoring costs, auditing activity, tracking compute, and observing data and AI workloads.","System tables provide an Azure Databricks-hosted analytical store of your account's operational data.
Use them to monitor costs, audit security events, track compute and job performance, and observe data and AI workloads.",2026-04-16T08:00:00.000Z,concept-article,configuration,0.65,True,"A system tables reference for monitoring costs, auditing, and tracking workloads typically documents table names, schemas, and possibly configuration needed to enable or query them, which are detailed product-specific structures not known generically.",updated -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/assistant,Genie Code system table,Genie Code system table reference - Azure Databricks,Monitor Genie Code usage with assistant events system table,Learn how to use the Genie Code system table to monitor adoption and usage.,"Important This system table is inPublic Preview. This page includes a schema reference of the Genie Code events system table and an example that uses the data in a dashboard. Each row in this system table represents a message sent by a user in either Genie Code window or the in-cell Assistant. For more information about the Genie Code, seeGenie Code. Table path: This system table is located atsystem.access.assistant_events.",2026-03-31T08:00:00.000Z,concept-article,configuration,0.8,True,"Documents the Genie Code events system table, including its path and schema; this is detailed, product-specific system table configuration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/audit-logs,Audit log system table,Audit log system table reference - Azure Databricks,Use the Databricks audit log system table schema,Learn to use the audit log system table to gain insights and observability in your Azure Databricks account.,"Important This system table is inPublic Preview. This article outlines the audit log table schema and has sample queries you can use with the audit log system table to answer common account activity questions. For information on audit log events, seeDiagnostic log reference. 
Table path: This system table is located atsystem.access.audit.",2026-01-17T05:47:00.000Z,concept-article,configuration,0.8,True,"Reference page for audit log system table schema and sample queries; contains column names, types, and usage patterns that are Databricks-specific configuration/usage details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/billing,Billable usage system table,Billable usage system table reference - Azure Databricks,Query billable usage system table in Azure Databricks,Learn how the billable usage system table works so you can query and monitor usage in your Azure Databricks account.,"This article provides an overview of the billable usage system table, including the schema and example queries. With system tables, your account's billable usage data is centralized and routed to all regions, so you can view your account's global usage from whichever region your workspace is in. For information on using this table to monitor costs and sample queries, seeMonitor costs using system tables. Table path: This system table is located atsystem.billing.usage.",2026-04-15T08:00:00.000Z,concept-article,configuration,0.7,True,"Reference for the billable usage system table (system.billing.usage) will include the exact table path, schema, column names, and example queries, which are concrete product-specific data structures and usage patterns that qualify as expert configuration/usage knowledge.",updated -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/clean-rooms,Clean room system table,Clean room events system table reference - Azure Databricks,Track clean room activity with events system table,A reference guide for using the clean room system table.,"Important This system table is inPublic Preview. The clean room events table records actions taken by you or your collaborators on clean rooms in your account. This table includes regional data from across your account. 
For more information on clean rooms, seeWhat is Azure Databricks Clean Rooms?. Table path: This system table is located atsystem.access.clean_room_events.",2025-10-07T08:00:00.000Z,concept-article,configuration,0.8,True,"Reference for the clean room events system table, including table path and schema; these are specific configuration/metadata details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/compute,Compute system tables,Compute system tables reference - Azure Databricks,Reference Databricks compute system tables for monitoring,Learn how to use the available compute system tables to monitor cluster changes and available node types.,"This article provides you with a reference guide for the compute system tables. You can use these tables to monitor the activity and metrics of non-serverless all-purpose compute, jobs compute, and Lakeflow Spark Declarative Pipelines compute in your account. The compute tables include:",2026-04-07T08:00:00.000Z,concept-article,configuration,0.7,True,"Provides a reference for compute system tables, including specific table names and schema used to monitor clusters and node types. These are concrete, product-specific structures and fields that an LLM wouldn’t reliably know without the doc.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/data-classification,Data classification system tables,Data classification system table reference - Azure Databricks,Use data classification system table for sensitive data detection,Learn to read and query the data classification results system table to understand detected sensitive data across your metastore.,Important This feature is inPublic Preview. This page outlines the data classification results table schema and includes sample queries. The table stores detections for sensitive data classes at the column level across enabled catalogs in your metastore. 
Table path:system.data_classification.results,2026-01-22T13:21:00.000Z,concept-article,configuration,0.75,True,Data classification results table schema and usage are Databricks-specific configuration details.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/data-quality-monitoring,Data quality monitoring system tables reference,Data quality monitoring results system table reference - Azure Databricks,Use data quality monitoring results system table in Databricks,"Learn to read and query the data quality monitoring results system table to understand table health, incidents, and downstream impact across your metastore.","Important This feature is inPublic Preview. This page outlines the data quality monitoring results system table schema and includes sample queries. The table stores results of freshness and completeness checks, as well as downstream impact and root cause analysis, across all tables enabled for data quality monitoring in your metastore. Table path:system.data_quality_monitoring.table_results Only account admins can access this table, and they must grant access to others as needed. The system tabl",2026-03-30T08:00:00.000Z,concept-article,configuration,0.8,True,"Provides table path, schema, and access rules for the data quality monitoring results system table; this is detailed, product-specific configuration information.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/jobs,Jobs system tables,Jobs system table reference - Azure Databricks,Use Databricks jobs system tables for monitoring,Learn how to use the lakeflow system tables to monitor jobs in your account.,"Note Thelakeflowschema was previously known asworkflow. The content of both schemas is identical. This article is a reference for thelakeflowsystem tables, which record job activity in your account. These tables include records from all workspaces in your account deployed in the same cloud region. 
To see records from another region, you must view the tables from a workspace deployed in that region.",2026-03-16T08:00:00.000Z,concept-article,configuration,0.75,True,"A 'system table reference' for jobs is inherently a schema/config reference: it will list table/column names, data types, and semantics for the lakeflow (workflow) system tables. That structure is product-specific expert knowledge not derivable from general training and fits the configuration category (reference of fields and how they are used).",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/jobs-cost,Monitor job costs,Monitor job costs & performance with system tables - Azure Databricks,Use Databricks system tables to monitor job costs,Learn how to use the lakeflow system tables and billing system tables to monitor the cost of jobs in your account.,This article provides examples of how to use system tables to monitor the cost and performance of Lakeflow Jobs and Pipelines in your account. These queries only calculate costs for jobs run on jobs compute and serverless compute. Jobs run on SQL warehouses and all-purpose compute are not billed as jobs and are thus excluded from cost attribution. Note These queries won't return records from workspaces outside your current workspace's cloud region. To monitor job costs from workspaces outside yo,2026-02-23T08:00:00.000Z,concept-article,configuration,0.7,True,Provides concrete example queries and references to specific Lakeflow and billing system tables/fields for cost and performance monitoring. 
These table/column details and usage patterns are product-specific.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/lineage,Data lineage system tables,Lineage system tables reference - Azure Databricks,Use Databricks lineage system tables for data tracking,Learn how view and analyze the table lineage and column lineage system tables.,"This page includes a reference for the two lineage system tables. These system tables build on Unity Catalog'sdata lineage feature, allowing you to programmatically query lineage data to fuel decision making and reports. To access the tables, the schemas must be enabled in yoursystemcatalog. For more information, seeEnable system tables. Note Both lineage tables represent a subset of all read/write events, as it is not always possible to capture lineage. Records are only emitted when lineage can",2026-04-09T08:00:00.000Z,concept-article,configuration,0.7,True,"A reference for lineage system tables, including table names, schemas, and how lineage records are emitted. These are product-specific table structures and behaviors that qualify as configuration/reference details beyond generic knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/marketplace,Marketplace system tables,Marketplace system tables reference - Azure Databricks,Analyze Databricks Marketplace data with system tables,Learn how to analyze Databricks Marketplace data using system tables.,Important This system table is inPublic Preview. This page provides a reference on how to use system tables to help monitor yourDatabricks Marketplaceselling process. Marketplace system tables live in thesystem.marketplaceschema. This schema is enabled automatically when a Marketplace listing is created in your metastore. The schema includes the following tables: Note You can use the Provider Analytics Dashboard to monitor recipient usage metrics. 
The dashboard pulls data from the Marketplace sy,2025-10-07T08:00:00.000Z,concept-article,configuration,0.8,True,"Documents specific Marketplace system tables, their schema, and how they are enabled and used; these table structures and paths are product-specific configuration knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/materialization,Shared materialization history system table reference,Delta Sharing materialization history system table reference - Azure Databricks,Use Delta Sharing materialization history system table,Learn how to analyze shared materialized data history using system tables.,"The shared materialized data history table represents data materializations created from view sharing, materialized views, and streaming tables using Delta Sharing. It contains information on where the data came from, the securable being materialized, and when the materialization was created. For more information about shared materializations, seeAdd views to a shareandRead shared views. Table path: This system table is located atsystem.sharing.materialization_history.",2025-10-07T08:00:00.000Z,concept-article,configuration,0.8,True,"Documents the shared materialized data history system table, including table path and schema; this is detailed configuration/metadata for a specific Databricks feature.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/mlflow,MLflow system tables,MLflow system tables reference - Azure Databricks,Query MLflow experiment metadata via Databricks system tables,Learn to use the MLflow system tables to query experiment run metadata in your Azure Databricks account.,"Important The MLflow system tables are inPublic Preview. Themlflowsystem tables capture experiment metadata managed within theMLflow tracking service. These tables allow privileged users to take advantage of Databricks lakehouse tooling on their MLflow data across all workspaces within the region. 
You can use the tables to build custom AI/BI dashboards, set up SQL alerts, or perform large-scale analytical queries. Through themlflowsystem tables, users can answer questions like: Note Themlflowsys",2026-03-26T08:00:00.000Z,concept-article,configuration,0.8,True,"Describes MLflow system tables, their schema, and usage for cross-workspace analytics; these are concrete configuration/metadata details unique to Databricks.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/model-serving-cost,Monitor model serving costs,Monitor model serving costs - Azure Databricks,Analyze Databricks model serving costs with system tables,Learn how to use billing system tables to monitor the cost of Mosaic AI Model Serving endpoints in your account.,This article provides examples of how to usesystem tablesto monitor the cost of Mosaic AI Model Serving endpoints in your Azure Databricks account.,2026-02-23T08:00:00.000Z,concept-article,configuration,0.65,True,"Gives concrete examples of using billing system tables for Mosaic AI Model Serving endpoints, including specific table/field usage. These telemetry schema details are expert, product-specific knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/network,Network access system table,Network access events system table reference - Azure Databricks,Analyze network access denials with Databricks system tables,A reference guide for using the network access system table.,"Important This system table is inPublic Preview. The network access events tables record events where network access is denied. 
Each row represents an individual event, such as a blocked outbound request to an external domain or a blocked inbound request from a restricted IP.",2026-03-18T08:00:00.000Z,concept-article,configuration,0.75,True,Describes network access events system tables that record blocked network events; table structures and usage are product-specific configuration details.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/predictive-optimization,Predictive optimization system table,Predictive optimization system table reference - Azure Databricks,Use predictive optimization system table in Azure Databricks,Learn to read and query the predictive optimization operation history system table to gain insight into how predictive optimization operates in your workspaces.,"Important This system table is inPublic Preview. Note To have access to this table, your region must support predictive optimization. SeeAzure Databricks regions. This article outlines the predictive optimization operation history table schema and provides sample queries. Predictive optimization optimizes your data layout for peak performance and cost efficiency. The system table tracks the operation history of this feature. For information on predictive optimization, seePredictive optimization ",2026-02-12T08:00:00.000Z,concept-article,configuration,0.8,True,"Outlines the predictive optimization operation history table schema and table path, plus sample queries; these are concrete configuration details for a specific feature.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/,System tables overview,System tables reference - Azure Databricks,Use Databricks system tables for operational monitoring,"Learn about Azure Databricks system tables for monitoring costs, auditing activity, tracking compute, and observing data and AI workloads.","System tables provide an Azure Databricks-hosted analytical store of your account's operational data. 
Use them to monitor costs, audit security events, track compute and job performance, and observe data and AI workloads.",2026-04-22T08:00:00.000Z,landing-page,configuration,0.7,True,Reference for system tables schema and usage is product-specific configuration/metadata needed for monitoring and analytics.,updated +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/assistant,Genie Code system table,Genie Code system table reference - Azure Databricks,Monitor Genie Code usage with assistant events table,Learn how to use the Genie Code system table to monitor adoption and usage.,"Important This system table is in Public Preview. This page includes a schema reference of the Genie Code events system table and an example that uses the data in a dashboard. Each row in this system table represents a message sent by a user in either the Genie Code window or the in-cell Assistant. For more information about the Genie Code, see Genie Code. Table path: This system table is located at system.access.assistant_events.",2026-04-22T08:00:00.000Z,reference,configuration,0.75,True,"Gives the exact system table path and schema for assistant events plus example queries, which is detailed, product-specific metadata/configuration.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/audit-logs,Audit log system table,Audit log system table reference - Azure Databricks,Use audit log system table for account activity,Learn to use the audit log system table to gain insights and observability in your Azure Databricks account.,"Important This system table is in Public Preview. This article outlines the audit log table schema and has sample queries you can use with the audit log system table to answer common account activity questions. For information on audit log events, see Diagnostic log reference. 
Table path: This system table is located at system.access.audit.",2026-04-22T08:00:00.000Z,reference,configuration,0.8,True,"Provides the exact system table path, schema, and sample queries for audit logs, which is detailed product metadata/configuration rather than conceptual guidance.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/billing,Billable usage system table,Billable usage system table reference - Azure Databricks,Query billable usage system table in Databricks,Learn how the billable usage system table works so you can query and monitor usage in your Azure Databricks account.,"This article provides an overview of the billable usage system table, including the schema and example queries. With system tables, your account's billable usage data is centralized and routed to all regions, so you can view your account's global usage from whichever region your workspace is in. For information on using this table to monitor costs and sample queries, see Monitor costs using system tables. Table path: This system table is located at system.billing.usage.",2026-04-22T08:00:00.000Z,reference,configuration,0.78,True,"Provides table path, schema, and example queries for system.billing.usage, which is detailed product-specific configuration/metadata.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/clean-rooms,Clean room system table,Clean room events system table reference - Azure Databricks,Track clean room events via system table,A reference guide for using the clean room system table.,"Important This system table is in Public Preview. The clean room events table records actions taken by you or your collaborators on clean rooms in your account. This table includes regional data from across your account. For more information on clean rooms, see What is Azure Databricks Clean Rooms?. 
Table path: This system table is located at system.access.clean_room_events.",2026-04-22T08:00:00.000Z,reference,configuration,0.7,True,"Defines the clean room events system table path and schema, which is detailed system metadata/configuration not derivable from general knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/compute,Compute system tables,Compute system tables reference - Azure Databricks,Use compute system tables to monitor Databricks clusters,Learn how to use the available compute system tables to monitor cluster changes and available node types.,"This article provides you with a reference guide for the compute system tables. You can use these tables to monitor the activity and metrics of non-serverless all-purpose compute, jobs compute, and Lakeflow Spark Declarative Pipelines compute in your account. The compute tables include:",2026-04-22T08:00:00.000Z,reference,configuration,0.76,True,Reference for compute system tables and their fields is specific to Databricks’ internal monitoring schema.,updated +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/data-classification,Data classification system tables,Data classification system table reference - Azure Databricks,Use data classification results system table,Learn to read and query the data classification results system table to understand detected sensitive data across your metastore.,Important This feature is in Public Preview. This page outlines the data classification results table schema and includes sample queries. The table stores detections for sensitive data classes at the column level across enabled catalogs in your metastore. 
Table path: system.data_classification.results,2026-04-22T08:00:00.000Z,reference,configuration,0.8,True,Outlines the data classification results table schema and sample queries; this is precise table configuration and usage guidance.,updated +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/data-quality-monitoring,Data quality monitoring system tables reference,Data quality monitoring results system table reference - Azure Databricks,Query data quality monitoring results system table,"Learn to read and query the data quality monitoring results system table to understand table health, incidents, and downstream impact across your metastore.","Important This feature is in Public Preview. This page outlines the data quality monitoring results system table schema and includes sample queries. The table stores results of freshness and completeness checks, as well as downstream impact and root cause analysis, across all tables enabled for data quality monitoring in your metastore. Table path: system.data_quality_monitoring.table_results Only account admins can access this table, and they must grant access to others as needed. The system tabl",2026-04-22T08:00:00.000Z,reference,configuration,0.8,True,"Provides schema and usage for the data quality monitoring results table, including specific fields for freshness, completeness, and impact—expert configuration details.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/jobs,Jobs system tables,Jobs system table reference - Azure Databricks,Use lakeflow jobs system tables to track Databricks jobs,Learn how to use the lakeflow system tables to monitor jobs in your account.,"Note The lakeflow schema was previously known as workflow. The content of both schemas is identical. This article is a reference for the lakeflow system tables, which record job activity in your account. These tables include records from all workspaces in your account deployed in the same cloud region. 
To see records from another region, you must view the tables from a workspace deployed in that region.",2026-04-22T08:00:00.000Z,reference,configuration,0.76,True,Reference for lakeflow (workflow) system tables and their regional behavior is specific operational metadata/configuration.,updated +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/jobs-cost,Monitor job costs,Monitor job costs & performance with system tables - Azure Databricks,Analyze Lakeflow job costs and performance with system tables,Learn how to use the lakeflow system tables and billing system tables to monitor the cost of jobs in your account.,This article provides examples of how to use system tables to monitor the cost and performance of Lakeflow Jobs and Pipelines in your account. These queries only calculate costs for jobs run on jobs compute and serverless compute. Jobs run on SQL warehouses and all-purpose compute are not billed as jobs and are thus excluded from cost attribution. Note These queries won't return records from workspaces outside your current workspace's cloud region. To monitor job costs from workspaces outside yo,2026-04-22T08:00:00.000Z,how-to,configuration,0.7,True,Provides concrete example queries and explains which system tables/fields to use for job cost attribution—detailed product-specific usage of system tables.,updated +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/lineage,Data lineage system tables,Lineage system tables reference - Azure Databricks,Query table and column lineage system tables,Learn how to view and analyze the table lineage and column lineage system tables.,"This page includes a reference for the two lineage system tables. These system tables build on Unity Catalog's data lineage feature, allowing you to programmatically query lineage data to fuel decision making and reports. To access the tables, the schemas must be enabled in your system catalog. For more information, see Enable system tables. 
Note Both lineage tables represent a subset of all read/write events, as it is not always possible to capture lineage. Records are only emitted when lineage can",2026-04-22T08:00:00.000Z,reference,configuration,0.75,True,Reference for lineage system tables with schema and usage patterns; this is specific structural/configuration knowledge about Databricks system tables.,updated +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/marketplace,Marketplace system tables,Marketplace system tables reference - Azure Databricks,Use Marketplace system tables for provider analytics,Learn how to analyze Databricks Marketplace data using system tables.,Important This system table is in Public Preview. This page provides a reference on how to use system tables to help monitor your Databricks Marketplace selling process. Marketplace system tables live in the system.marketplace schema. This schema is enabled automatically when a Marketplace listing is created in your metastore. The schema includes the following tables: Note You can use the Provider Analytics Dashboard to monitor recipient usage metrics. The dashboard pulls data from the Marketplace sy,2026-04-22T08:00:00.000Z,reference,configuration,0.75,True,"Explains the system.marketplace schema and its tables, including how they’re used for analytics—expert table configuration information.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/materialization,Shared materialization history system table reference,Delta Sharing materialization history system table reference - Azure Databricks,Analyze Delta Sharing materialization history table,Learn how to analyze shared materialized data history using system tables.,"The shared materialized data history table represents data materializations created from view sharing, materialized views, and streaming tables using Delta Sharing. 
It contains information on where the data came from, the securable being materialized, and when the materialization was created. For more information about shared materializations, see Add views to a share and Read shared views. Table path: This system table is located at system.sharing.materialization_history.",2026-04-22T08:00:00.000Z,reference,configuration,0.75,True,"Describes the materialization history system table path and fields, which is detailed product-specific metadata/configuration.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/mlflow,MLflow system tables,MLflow system tables reference - Azure Databricks,Work with MLflow experiment system tables in Databricks,Learn to use the MLflow system tables to query experiment run metadata in your Azure Databricks account.,"Important The MLflow system tables are in Public Preview. The mlflow system tables capture experiment metadata managed within the MLflow tracking service. These tables allow privileged users to take advantage of Databricks lakehouse tooling on their MLflow data across all workspaces within the region. You can use the tables to build custom AI/BI dashboards, set up SQL alerts, or perform large-scale analytical queries. 
Through the mlflow system tables, users can answer questions like: Note The mlflow sys",2026-04-22T08:00:00.000Z,reference,configuration,0.8,True,"Documents MLflow system table schemas and how to query them, which is detailed Databricks-specific configuration/metadata.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/model-serving-cost,Monitor model serving costs,Monitor model serving costs - Azure Databricks,Track Mosaic AI Model Serving costs with system tables,Learn how to use billing system tables to monitor the cost of Mosaic AI Model Serving endpoints in your account.,This article provides examples of how to use system tables to monitor the cost of Mosaic AI Model Serving endpoints in your Azure Databricks account.,2026-04-22T08:00:00.000Z,how-to,configuration,0.65,True,Gives specific guidance on which billing/system tables and fields to query for model serving endpoints—expert configuration/usage details.,updated +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/network,Network access system table,Network access events system table reference - Azure Databricks,Analyze network access events via system table,A reference guide for using the network access system table.,"Important This system table is in Public Preview. The network access events tables record events where network access is denied. 
Each row represents an individual event, such as a blocked outbound request to an external domain or a blocked inbound request from a restricted IP.",2026-04-22T08:00:00.000Z,reference,configuration,0.75,True,"Describes the network access events system table, including schema and how each row maps to blocked requests—specific table configuration/metadata details.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/predictive-optimization,Predictive optimization system table,Predictive optimization system table reference - Azure Databricks,Inspect predictive optimization operation history table,Learn to read and query the predictive optimization operation history system table to gain insight into how predictive optimization operates in your workspaces.,"Important This system table is in Public Preview. Note To have access to this table, your region must support predictive optimization. See Azure Databricks regions. This article outlines the predictive optimization operation history table schema and provides sample queries. Predictive optimization optimizes your data layout for peak performance and cost efficiency. The system table tracks the operation history of this feature. For information on predictive optimization, see Predictive optimization ",2026-04-22T08:00:00.000Z,reference,configuration,0.8,True,Reference for the predictive optimization system table with schema and sample queries; this is specific configuration/metadata knowledge.,updated https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/pricing,Pricing system table,Pricing system table reference - Azure Databricks,Use Databricks pricing system table for SKU price history,Learn how the billable usage system table works so you can query and monitor usage in your Azure Databricks account.,"This article provides you with an overview of the pricing system table, including the schema and example queries. 
The pricing table gives you access to a historical log of SKU pricing. A record gets added each time there is a change to a SKU price. These logs can help you perform cost analysis and monitor pricing changes. Table path: This system table is located atsystem.billing.list_prices.",2025-02-05T19:42:00.000Z,concept-article,configuration,0.8,True,Describes pricing system table schema and usage; table path and columns are Databricks-specific configuration/metadata details.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/query-history,Query history system table,Query history system table reference - Azure Databricks,Query Azure Databricks query history system table,A reference guide for using the query history system table.,"Important This system table is inPublic Preview. This article includes information on the query history system table, including an outline of the table's schema. Table path: This system table is located atsystem.query.history.",2026-04-14T08:00:00.000Z,concept-article,configuration,0.7,True,"Reference page for a specific system table, including its schema and column details. This is product-specific configuration/metadata that an LLM is unlikely to know exactly from training and is used to configure queries and analytics over system tables.",updated -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/serverless-billing,Monitor serverless costs,Monitor the cost of serverless compute - Azure Databricks,Analyze Databricks serverless compute costs via system tables,Learn how to analyze the billable usage system to monitor the cost of serverless compute.,"This article explains how to use the billable usage system table to monitor the cost of your serverless compute usage. You can monitor the usage of serverless compute for notebooks and jobs by querying the billable usage system table (system.billing.usage), which includes user and workload attributes related to serverless compute costs. 
The applicable fields include: Note Because of our distributed architecture, you might see multiple records associated with a given serverless compute workload i",2026-04-15T08:00:00.000Z,concept-article,configuration,0.65,True,"Explains how to query the billable usage system table for serverless compute, including specific fields and their meanings. This is product-specific table/field configuration knowledge.",updated -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/warehouse-events,SQL warehouse events system table,Warehouse events system table reference - Azure Databricks,Use warehouse events system table for SQL warehouse monitoring,Learn how to analyze the warehouse events system table to monitor your SQL warehouses.,"Important This system table is inPublic Preview. In this article, you learn how to use the warehouse events system table to monitor and manage the SQL warehouses in your workspaces. This table records a row for every time a warehouse starts, stops, runs, and scales up and down. You can use the sample queries in this article with alerts to keep you informed of changes to your warehouses. Table path: This system table is located atsystem.compute.warehouse_events.",2025-10-07T08:00:00.000Z,concept-article,configuration,0.8,True,"Describes a specific system table (path, schema, and usage patterns) for warehouse events; these details are configuration/metadata unique to Azure Databricks.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/warehouses,SQL warehouses system table,Warehouses system table reference - Azure Databricks,Analyze SQL warehouses via Databricks warehouses system table,Learn how to analyze the warehouses system table to monitor your SQL warehouses.,"Important This system table is inPublic Preview. In this article, you learn how to use the warehouses system table to monitor and manage the SQL warehouses in your workspaces. 
Each row is a snapshot of the SQL warehouse properties at that moment. A new snapshot is created when the properties change. The warehouses system table is located atsystem.compute.warehouses. For example queries that can track SQL warehouse activity, seeExample queries for monitoring SQL warehouse activity.",2026-03-12T23:05:00.000Z,concept-article,configuration,0.75,True,Reference for warehouses system table including schema and usage; table path and columns are product-specific configuration details.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/workspaces,Workspaces system table,Workspaces system table reference - Azure Databricks,Monitor Azure Databricks workspaces with system table reference,Learn how to use the workspaces system table to monitor workspaces in your account.,"Important This system table is inPublic Preview. This page explains how to use the workspaces system table to monitor workspaces in your Azure Databricks account. Each row in the table represents the latest known state of an active workspace in your account, including metadata and lifecycle status. This table is most useful when joined with other system tables. You can use it to get aggregate statistics on reliability, performance, and cost across workspaces in your account. Note The table only ",2025-10-07T08:00:00.000Z,concept-article,configuration,0.8,True,"Explains the workspaces system table, including its structure and how to join with other tables; these table-level details are product-specific configuration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/zerobus-ingest,Zerobus system tables reference,Zerobus Ingest system table reference - Azure Databricks,Monitor Zerobus Ingest activity using Databricks system tables,Learn how to use the Zerobus system tables to monitor jobs in your account.,"This article is a reference for thezerobussystem tables, which track Zerobus Ingest activity in your workspace. 
These tables include your account records from all workspaces in your same region. To see records from another region, you must view the tables from a workspace deployed in that region.",2025-12-22T23:55:00.000Z,concept-article,configuration,0.7,True,Zerobus system tables reference with schema and regional behavior; these are product-specific configuration/telemetry details.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/usage/,Cost management tools on Databricks,Cost management tools on Databricks - Azure Databricks,,This article provides information about the cost controls and cost monitoring features available on Databricks.,The articles in this section outline the available cost controls and cost monitoring features available on Databricks.,2026-04-16T17:44:00.000Z,concept-article,,0.2,False,"Section overview of cost management tools; appears to be navigational/summary content without detailed configuration, limits, or troubleshooting data.",updated -https://learn.microsoft.com/en-us/azure/databricks/admin/usage/budget-policies,Attribute usage using serverless usage policies,Attribute usage with serverless usage policies - Azure Databricks,Configure Databricks serverless usage policies for tagging,Learn how to use serverless usage policies to enforce cost attribution tags for serverless compute.,"Important This feature is inPublic Preview. This article explains how to use serverless usage policies to add cost attribution tags on serverless compute workloads. Serverless usage policies consist of tags that are applied to any serverless compute activity incurred by a user assigned to the policy. The tags are logged in your billing records, allowing you to attribute serverless usage to specific budgets. For more on creating budgets, seeCreate and monitor budgets.",2026-04-16T17:44:00.000Z,concept-article,configuration,0.6,True,"Describes serverless usage policies and how tags are applied and logged in billing records. 
This is product-specific configuration behavior for policies and tagging, beyond generic tagging concepts.",new -https://learn.microsoft.com/en-us/azure/databricks/admin/usage/default-storage,Monitor default storage costs,Monitor default storage costs - Azure Databricks,Query Databricks billable usage for storage cost tracking,Learn how to use the billable usage system table to monitor and track default storage costs in Azure Databricks.,This article explains how to use the billable usage system table to monitor the cost of your default storage usage.,2026-01-08T23:31:00.000Z,concept-article,integrations,0.65,True,Describes concrete use of the billable usage system table with product-specific fields and example queries to monitor default storage costs.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/usage/system-tables,Analyze the billing logs,Monitor costs using system tables - Azure Databricks,Use Databricks billable usage system table for costs,This article provides information about how to use the billable usage system table to monitor Databricks costs.,This article explains how you can use thesystem.billing.usagetable on its own or joined with other system tables to get a picture of your account's Azure Databricks usage. The following feature-specific articles are also available:,2026-04-15T08:00:00.000Z,concept-article,configuration,0.65,True,"Describes the billable usage system table and how to join it with other system tables. 
This is detailed, product-specific schema/usage information that functions as configuration/metadata for cost monitoring.",updated -https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/,Users and groups overview,"Manage users, service principals, and groups - Azure Databricks","Manage users, groups, and service principals in Databricks","Learn how to manage users, groups, and service principals in Azure Databricks.","Databricks provides centralized identity management for users, groups, and service principals across your account and workspaces. Identity management in Azure Databricks enables you to control who can access your workspaces, data, and compute resources, with flexible options for syncing identities from your identity provider. For an opinionated perspective on how to best configure identity in Azure Databricks, seeIdentity best practices. To manage access for users, service principals, and groups",2026-02-02T08:00:00.000Z,concept-article,security,0.7,True,"Covers centralized identity management across account and workspaces, including syncing from identity providers and controlling access to workspaces, data, and compute—product-specific IAM behavior.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/automatic-identity-management,Sync users and groups automatically from MS Entra ID,Sync users and groups automatically from Microsoft Entra ID - Azure Databricks,Configure automatic identity sync from Entra ID to Databricks,"Learn how to configure Azure Databricks to add users, service principals, and groups from Microsoft Entra ID to Azure Databricks.","This page describes how Azure Databricks automatically syncs users, service principals, and groups from Microsoft Entra ID using automatic identity management.",2026-02-25T19:11:00.000Z,concept-article,configuration,0.75,True,"Automatic identity management setup requires specific Databricks and Entra ID configuration steps and options, which are 
product-specific configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/best-practices,Best practices,Identity best practices - Azure Databricks,Apply identity best practices and migrate to federation,"Learn how to migrate to identity federation, which enables you to manage all of your users, groups, and service principals in the Azure Databricks account.","This page provides an opinionated perspective on how to best configure identity in Azure Databricks. It includes a guide on how to migrate to identity federation, which enables you to manage all of your users, groups, and service principals in the Azure Databricks account. For an overview of the Azure Databricks identity model, seeAzure Databricks identities. For information on how to securely access Azure Databricks APIs, seeManage personal access token permissions.",2026-01-09T08:00:00.000Z,concept-article,best-practices,0.85,True,"Provides an opinionated, Databricks-specific guide for configuring identity and migrating to identity federation, including concrete recommendations and patterns that go beyond generic IAM advice.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/groups,Groups,Groups - Azure Databricks,,"Learn how admins create and manage Azure Databricks groups. Groups make it easier to assign access to workspaces, data, and other securable objects.","This page gives an overview of groups in Azure Databricks. For how to manage groups, seeManage groups. Groups simplify identity management by making it easier to assign access to workspaces, data, and other securable objects. All Databricks identities can be assigned as members of groups. Note This page assumes your workspace has identity federation enabled, which is the default for most workspaces. 
For information about legacy workspaces without identity federation, seeLegacy workspaces without",2026-01-05T08:00:00.000Z,concept-article,,0.2,False,"Described as an overview of groups; likely conceptual without detailed role names, config parameters, or limits.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/manage-groups,Manage groups,Manage groups - Azure Databricks,Create and manage Azure Databricks groups for access control,"Learn how admins create and manage Azure Databricks groups. Groups make it easier to assign access to workspaces, data, and other securable objects.","This page explains how to manage groups for your Azure Databricks account and workspaces. For an overview of groups, seeGroups. Note This page assumes your workspace has identity federation enabled, which is the default for most workspaces. For information about legacy workspaces without identity federation, seeLegacy workspaces without identity federation.",2026-02-06T08:00:00.000Z,concept-article,security,0.65,True,"Provides product-specific guidance on managing groups in Azure Databricks for securing workspaces and data, including how groups map to permissions and identity federation behavior.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/manage-service-principals,Manage service principals,Manage service principals - Azure Databricks,Manage Azure Databricks service principals and access,"Learn how to manage service principals for your Azure Databricks account and workspaces. A service principal is an identity that you create in Azure Databricks for use with automated tools, jobs, and ","This page explains how to manage service principals for your Azure Databricks account and workspaces. For an overview of service principals, seeService principals. Note This page assumes your workspace has identity federation enabled, which is the default for most workspaces. 
For information about legacy workspaces without identity federation, seeLegacy workspaces without identity federation.",2026-01-24T08:00:00.000Z,concept-article,security,0.7,True,"Describes concrete management operations for Databricks service principals at account and workspace scope, including permissions and access control patterns that are specific to the product.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/scim/,SCIM provisioning overview,Sync users and groups from Microsoft Entra ID using SCIM - Azure Databricks,Configure SCIM-based user and group provisioning to Databricks,Learn how to provision users to Azure Databricks using SCIM-enabled IdPs.,"This article describes how to configure your identity provider (IdP) and Azure Databricks to provision users and groups to Azure Databricks usingSCIM, or System for Cross-domain Identity Management, an open standard that allows you to automate user provisioning. You can also sync users and groups from Microsoft Entra ID using automatic identity management. Automatic identity management does not require you to configure an application in Microsoft Entra ID. It also supports syncing Microsoft Entr",2025-12-17T08:00:00.000Z,concept-article,configuration,0.8,True,"SCIM provisioning setup includes endpoint URLs, token scopes, and IdP app settings specific to Databricks, matching configuration sub-skill criteria.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/scim/aad,Configure SCIM provisioning for MS Entra ID,Configure SCIM provisioning using Microsoft Entra ID (Azure Active Directory) - Azure Databricks,Configure Microsoft Entra SCIM provisioning to Azure Databricks,Learn how to provision users to Azure Databricks using Microsoft Microsoft Entra ID.,"This article describes how to set up provisioning to the Azure Databricks account using Microsoft Entra ID. 
You can also sync users and groups from Microsoft Entra ID using automatic identity management. Automatic identity management does not require you to configure an application in Microsoft Entra ID. It also supports syncing Microsoft Entra ID service principals and nested groups to Azure Databricks, which is not supported using SCIM provisioning. Automatic identity management is enabled by ",2025-10-20T08:00:00.000Z,concept-article,security,0.8,True,"Contains detailed, product-specific configuration steps and parameters for SCIM provisioning from Entra ID to Databricks, including app configuration and sync behavior unique to this integration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/service-principals,Service principals overview,Service principals - Azure Databricks,Use Azure Databricks service principals for secure automation,"Learn about using service principals for your Azure Databricks account and workspaces. A service principal is an identity that you create in Azure Databricks for use with automated tools, jobs, and ap","A service principal is a specialized identity in Azure Databricks designed for automation and programmatic access. Service principals provide secure, API-only access to Azure Databricks resources for automated tools, scripts, and CI/CD platforms, without relying on individual user credentials. For how to manage service principals, seeManage service principals. Note This page assumes your workspace has identity federation enabled, which is the default for most workspaces. For information about le",2026-03-16T08:00:00.000Z,concept-article,security,0.7,True,"Covers product-specific identity model for service principals in Azure Databricks, including how they interact with workspaces and identity federation. 
Contains concrete, platform-specific security/identity behavior beyond generic concepts.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/users,Users,Manage users - Azure Databricks,"Add, update, and remove Databricks users securely","Learn how to manage users in Azure Databricks workspaces as an account admin or workspace admin. You can add, update, or remove users.","This article explains how to add, update, and remove Azure Databricks users. For an overview of the Azure Databricks identity model, seeAzure Databricks identities. To manage access for users, seeAuthentication and access control. Note This page assumes your workspace has identity federation enabled, which is the default for most workspaces. For information about legacy workspaces without identity federation, seeLegacy workspaces without identity federation.",2026-01-09T08:00:00.000Z,concept-article,security,0.75,True,"Explains how admins manage users in identity-federated workspaces, tied to the Databricks identity model and access control, which is specific IAM configuration/operation knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/workspace-local-groups,Manage workspace-local groups (legacy),Manage workspace-local groups (legacy) - Azure Databricks,Manage legacy workspace-local groups in Databricks,,"This article explains how admins create and manage legacy workspace-local groups. 
For an overview of account groups, the primary groups in Azure Databricks seeGroups.",2025-12-17T08:00:00.000Z,concept-article,configuration,0.65,True,Legacy workspace-local groups behavior is Databricks-specific; page likely documents exact configuration flows and constraints unique to these legacy groups.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/query-history,Query history system table,Query history system table reference - Azure Databricks,Query query-history system table in Databricks,A reference guide for using the query history system table.,"Important This system table is inPublic Preview. This article includes information on the query history system table, including an outline of the table's schema. Table path: This system table is located atsystem.query.history.",2026-04-22T08:00:00.000Z,reference,configuration,0.78,True,"Provides table path and schema outline for system.query.history, which is detailed product-specific system table configuration.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/serverless-billing,Monitor serverless costs,Monitor the cost of serverless compute - Azure Databricks,Monitor serverless compute costs via billing table,Learn how to analyze the billable usage system to monitor the cost of serverless compute.,"This article explains how to use the billable usage system table to monitor the cost of your serverless compute usage. You can monitor the usage of serverless compute for notebooks and jobs by querying the billable usage system table (system.billing.usage), which includes user and workload attributes related to serverless compute costs. 
The applicable fields include: Note Because of our distributed architecture, you might see multiple records associated with a given serverless compute workload i",2026-04-22T08:00:00.000Z,how-to,configuration,0.7,True,Shows which fields in system.billing.usage correspond to serverless compute and how to query them—specific configuration/field-level knowledge.,updated +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/warehouse-events,SQL warehouse events system table,Warehouse events system table reference - Azure Databricks,Monitor SQL warehouse events with system tables,Learn how to analyze the warehouse events system table to monitor your SQL warehouses.,"Important This system table is in Public Preview. In this article, you learn how to use the warehouse events system table to monitor and manage the SQL warehouses in your workspaces. This table records a row for every time a warehouse starts, stops, runs, and scales up and down. You can use the sample queries in this article with alerts to keep you informed of changes to your warehouses. Table path: This system table is located at system.compute.warehouse_events.",2026-04-22T08:00:00.000Z,reference,configuration,0.78,True,"Warehouse events table path and event semantics (start, stop, scale) are Databricks-specific system table configuration details.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/warehouses,SQL warehouses system table,Warehouses system table reference - Azure Databricks,Analyze SQL warehouses using system.compute.warehouses table,Learn how to analyze the warehouses system table to monitor your SQL warehouses.,"Important This system table is in Public Preview. In this article, you learn how to use the warehouses system table to monitor and manage the SQL warehouses in your workspaces. Each row is a snapshot of the SQL warehouse properties at that moment. A new snapshot is created when the properties change. 
The warehouses system table is located at system.compute.warehouses. For example queries that can track SQL warehouse activity, see Example queries for monitoring SQL warehouse activity.",2026-04-22T08:00:00.000Z,reference,configuration,0.76,True,"Describes warehouses system table path and snapshot semantics, which are product-specific configuration/metadata details.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/workspaces,Workspaces system table,Workspaces system table reference - Azure Databricks,Inspect workspace metadata using system table,Learn how to use the workspaces system table to monitor workspaces in your account.,"Important This system table is in Public Preview. This page explains how to use the workspaces system table to monitor workspaces in your Azure Databricks account. Each row in the table represents the latest known state of an active workspace in your account, including metadata and lifecycle status. This table is most useful when joined with other system tables. You can use it to get aggregate statistics on reliability, performance, and cost across workspaces in your account. Note The table only ",2026-04-22T08:00:00.000Z,reference,configuration,0.7,True,"Explains the workspaces system table, its columns and lifecycle status fields, and how to join with other tables—expert schema/configuration information.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/zerobus-ingest,Zerobus system tables reference,Zerobus Ingest system table reference - Azure Databricks,Query Zerobus Ingest system table schema and fields,Learn how to use the Zerobus system tables to monitor jobs in your account.,"This article is a reference for the zerobus system tables, which track Zerobus Ingest activity in your workspace. These tables include your account records from all workspaces in the same region. 
To see records from another region, you must view the tables from a workspace deployed in that region.",2026-04-22T08:00:00.000Z,reference,configuration,0.7,True,"System table reference pages enumerate table paths, column names, types, and semantics plus example queries—product-specific configuration/metadata details that function like a schema/config reference and are not generally known to LLMs.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/usage/,Cost management tools on Databricks,Cost management tools on Databricks - Azure Databricks,,This article provides information about the cost controls and cost monitoring features available on Databricks.,The articles in this section outline the cost controls and cost monitoring features available on Databricks.,2026-04-22T08:00:00.000Z,landing-page,,0.3,False,"High-level navigation/overview of cost management tools without detailed configuration tables, limits, or error mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/usage/budget-policies,Attribute usage using serverless usage policies,Attribute usage with serverless usage policies - Azure Databricks,Configure serverless usage policies for cost attribution,Learn how to use serverless usage policies to enforce cost attribution tags for serverless compute.,"Important This feature is in Public Preview. This article explains how to use serverless usage policies to add cost attribution tags on serverless compute workloads. Serverless usage policies consist of tags that are applied to any serverless compute activity incurred by a user assigned to the policy. The tags are logged in your billing records, allowing you to attribute serverless usage to specific budgets. 
For more on creating budgets, see Create and monitor budgets.",2026-04-22T08:00:00.000Z,how-to,configuration,0.7,True,"Describes serverless usage policies, their tag configuration, and how they affect billing records—detailed configuration behavior unique to Databricks.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/usage/default-storage,Monitor default storage costs,Monitor default storage costs - Azure Databricks,Monitor default storage costs using billable usage table,Learn how to use the billable usage system table to monitor and track default storage costs in Azure Databricks.,This article explains how to use the billable usage system table to monitor the cost of your default storage usage.,2026-04-22T08:00:00.000Z,how-to,configuration,0.65,True,"Explains how to use the billable usage system table to isolate default storage costs, including relevant fields—product-specific configuration/usage.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/usage/system-tables,Analyze the billing logs,Monitor costs using system tables - Azure Databricks,Query billable usage system table for Databricks costs,This article provides information about how to use the billable usage system table to monitor Databricks costs.,This article explains how you can use the system.billing.usage table on its own or joined with other system tables to get a picture of your account's Azure Databricks usage. 
The following feature-specific articles are also available:,2026-04-22T08:00:00.000Z,how-to,configuration,0.7,True,"Explains how to use the system.billing.usage table, including joins and fields, to monitor costs—detailed system table configuration/usage.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/,Users and groups overview,"Manage users, service principals, and groups - Azure Databricks","Manage users, groups, and service principals in Databricks","Learn how to manage users, groups, and service principals in Azure Databricks.","Databricks provides centralized identity management for users, groups, and service principals across your account and workspaces. Identity management in Azure Databricks enables you to control who can access your workspaces, data, and compute resources, with flexible options for syncing identities from your identity provider. For an opinionated perspective on how to best configure identity in Azure Databricks, see Identity best practices. 
To manage access for users, service principals, and groups",2026-04-22T08:00:00.000Z,landing-page,security,0.7,True,Centralized identity management and access control across account/workspaces is product-specific IAM behavior and configuration.,updated +https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/automatic-identity-management,Sync users and groups automatically from MS Entra ID,Sync users and groups automatically from Microsoft Entra ID - Azure Databricks,Configure automatic identity sync from Entra ID to Databricks,"Learn how to configure Azure Databricks to add users, service principals, and groups from Microsoft Entra ID to Azure Databricks.","This page describes how Azure Databricks automatically syncs users, service principals, and groups from Microsoft Entra ID using automatic identity management.",2026-04-22T08:00:00.000Z,feature-guide,security,0.76,True,"Details automatic syncing of users, groups, and service principals from Entra ID, which is a product-specific identity integration/configuration.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/best-practices,Best practices,Identity best practices - Azure Databricks,Apply identity best practices and migrate to federation in Databricks,"Learn how to migrate to identity federation, which enables you to manage all of your users, groups, and service principals in the Azure Databricks account.","This page provides an opinionated perspective on how to best configure identity in Azure Databricks. It includes a guide on how to migrate to identity federation, which enables you to manage all of your users, groups, and service principals in the Azure Databricks account. For an overview of the Azure Databricks identity model, see Azure Databricks identities. 
For information on how to securely access Azure Databricks APIs, see Manage personal access token permissions.",2026-04-24T08:00:00.000Z,best-practice,best-practices,0.82,True,"Opinionated, product-specific guidance on configuring identity and migrating to identity federation qualifies as best practices.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/groups,Groups,Groups - Azure Databricks,Understand and use groups for Databricks access control,"Learn how admins create and manage Azure Databricks groups. Groups make it easier to assign access to workspaces, data, and other securable objects.","This page gives an overview of groups in Azure Databricks. For how to manage groups, see Manage groups. Groups simplify identity management by making it easier to assign access to workspaces, data, and other securable objects. All Databricks identities can be assigned as members of groups. Note This page assumes your workspace has identity federation enabled, which is the default for most workspaces. For information about legacy workspaces without identity federation, see Legacy workspaces without",2026-04-24T08:00:00.000Z,concept-article,security,0.65,True,Explains Databricks groups as securable-object access constructs; this is product-specific identity and access model information.,updated +https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/manage-groups,Manage groups,Manage groups - Azure Databricks,Create and manage groups in Azure Databricks,"Learn how admins create and manage Azure Databricks groups. Groups make it easier to assign access to workspaces, data, and other securable objects.","This page explains how to manage groups for your Azure Databricks account and workspaces. For an overview of groups, see Groups. Note This page assumes your workspace has identity federation enabled, which is the default for most workspaces. 
For information about legacy workspaces without identity federation, see Legacy workspaces without identity federation.",2026-04-24T08:00:00.000Z,how-to,security,0.7,True,"Management of groups for access to workspaces, data, and other securables is concrete IAM/security configuration.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/manage-service-principals,Manage service principals,Manage service principals - Azure Databricks,Manage service principals in Azure Databricks,"Learn how to manage service principals for your Azure Databricks account and workspaces. A service principal is an identity that you create in Azure Databricks for use with automated tools, jobs, and ","This page explains how to manage service principals for your Azure Databricks account and workspaces. For an overview of service principals, see Service principals. Note This page assumes your workspace has identity federation enabled, which is the default for most workspaces. For information about legacy workspaces without identity federation, see Legacy workspaces without identity federation.",2026-04-22T08:00:00.000Z,how-to,security,0.78,True,"Covers management of service principals at account and workspace scope, which is detailed IAM/security configuration.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/scim/,SCIM provisioning overview,Sync users and groups from Microsoft Entra ID using SCIM - Azure Databricks,Configure SCIM-based user and group provisioning to Databricks,Learn how to provision users to Azure Databricks using SCIM-enabled IdPs.,"This article describes how to configure your identity provider (IdP) and Azure Databricks to provision users and groups to Azure Databricks using SCIM, or System for Cross-domain Identity Management, an open standard that allows you to automate user provisioning. You can also sync users and groups from Microsoft Entra ID using automatic identity management. 
Automatic identity management does not require you to configure an application in Microsoft Entra ID. It also supports syncing Microsoft Entr",2026-04-22T08:00:00.000Z,concept-article,security,0.78,True,SCIM provisioning setup between IdPs and Databricks is a concrete identity/security configuration with product-specific steps and parameters.,updated +https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/scim/aad,Configure SCIM provisioning for MS Entra ID,Configure SCIM provisioning using Microsoft Entra ID (Azure Active Directory) - Azure Databricks,Set up SCIM provisioning from Microsoft Entra ID to Databricks,Learn how to provision users to Azure Databricks using Microsoft Entra ID.,"This article describes how to set up provisioning to the Azure Databricks account using Microsoft Entra ID. You can also sync users and groups from Microsoft Entra ID using automatic identity management. Automatic identity management does not require you to configure an application in Microsoft Entra ID. It also supports syncing Microsoft Entra ID service principals and nested groups to Azure Databricks, which is not supported using SCIM provisioning. Automatic identity management is enabled by ",2026-04-22T08:00:00.000Z,how-to,security,0.8,True,"Provides detailed configuration for Entra ID SCIM provisioning to Databricks, including capabilities vs automatic identity management; clearly IAM/security configuration.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/service-principals,Service principals overview,Service principals - Azure Databricks,Use service principals for secure automation in Databricks,"Learn about using service principals for your Azure Databricks account and workspaces. A service principal is an identity that you create in Azure Databricks for use with automated tools, jobs, and ap","A service principal is a specialized identity in Azure Databricks designed for automation and programmatic access. 
Service principals provide secure, API-only access to Azure Databricks resources for automated tools, scripts, and CI/CD platforms, without relying on individual user credentials. For how to manage service principals, see Manage service principals. Note This page assumes your workspace has identity federation enabled, which is the default for most workspaces. For information about le",2026-04-22T08:00:00.000Z,concept-article,security,0.78,True,Defines Databricks service principals and their security role in API-only access; this is product-specific identity/security behavior.,updated +https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/users,Users,Manage users - Azure Databricks,Manage Azure Databricks workspace users,"Learn how to manage users in Azure Databricks workspaces as an account admin or workspace admin. You can add, update, or remove users.","This article explains how to add, update, and remove Azure Databricks users. For an overview of the Azure Databricks identity model, see Azure Databricks identities. To manage access for users, see Authentication and access control. Note This page assumes your workspace has identity federation enabled, which is the default for most workspaces. For information about legacy workspaces without identity federation, see Legacy workspaces without identity federation.",2026-04-22T08:00:00.000Z,how-to,security,0.7,True,"Procedures and constraints for adding, updating, and removing users in identity-federated workspaces are specific IAM/security operations.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/workspace-local-groups,Manage workspace-local groups (legacy),Manage workspace-local groups (legacy) - Azure Databricks,Manage legacy workspace-local groups in Databricks,,"This article explains how admins create and manage legacy workspace-local groups. 
For an overview of account groups, the primary groups in Azure Databricks, see Groups.",2026-04-22T08:00:00.000Z,how-to,security,0.68,True,"Describes legacy workspace-local group management, which is specific to Databricks’ IAM model and not generic knowledge.",updated https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/,Manage workspace settings overview,Manage your workspace - Azure Databricks,,"Learn how to manage Azure Databricks workspace behavior, including storage purges, security headers, and notebook options.",The articles in this section explain the purpose and function of notable workspace settings. Azure Databricks workspace administrators can access the workspace settings by clicking their username in the top bar of the Databricks workspace and selecting Settings.,2025-09-29T22:49:00.000Z,concept-article,,0.2,False,Section overview for workspace settings; navigation-style content without specific settings tables in the summary.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/appearance,Workspace appearance settings,Manage workspace appearance settings - Azure Databricks,Configure workspace appearance settings in Databricks,"Learn how to manage workspace appearance settings. 
Workspace admins can manage settings related to the appearance of your workspace, including themes, date and time format, and language.","As a workspace admin, you can manage various settings related to the appearance of your workspace, including themes, date and time format, and language.",2026-03-31T23:28:00.000Z,concept-article,configuration,0.65,True,"Covers concrete workspace settings (themes, date/time format, language) and how admins manage them; these are specific configuration options rather than generic UI concepts.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/base-environment,Manage serverless base environments,Manage serverless workspace base environments - Azure Databricks,Configure serverless workspace base environments in Databricks,Configure and manage workspace base environments for serverless notebooks and jobs in your workspace.,Important This feature is in Public Preview. This page explains how to create and manage serverless base environments across a workspace.,2026-04-13T21:03:00.000Z,concept-article,configuration,0.7,True,"Admin-focused page on creating and managing serverless base environments is likely to enumerate specific workspace settings, parameter names, and allowed values for serverless notebooks and jobs, which are product-specific configuration details rather than generic concepts.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/appearance,Workspace appearance settings,Manage workspace appearance settings - Azure Databricks,,"Learn how to manage workspace appearance settings. Workspace admins can manage settings related to the appearance of your workspace, including the workspace label, themes, date and time format, and la","As a workspace admin, you can manage various settings related to the appearance of your workspace, including the workspace label, themes, date and time format, and language. 
Account admins can also set the workspace label from the account console.",2026-04-23T17:47:00.000Z,how-to,,0.2,False,"Workspace appearance (labels, themes, date/time, language) is basic UI configuration without detailed parameters or expert-only behavior.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/base-environment,Manage serverless base environments,Manage workspace base environments - Azure Databricks,Manage workspace base environments for serverless notebooks,Configure and manage workspace base environments for serverless notebooks and jobs in your workspace.,"This page explains how to create and manage workspace base environments across a workspace. Workspace base environments let workspace admins create and manage pre-built, cached environments for serverless notebooks.",2026-04-24T08:00:00.000Z,how-to,configuration,0.75,True,"Explains creating and managing pre-built cached base environments for serverless notebooks and jobs, which involves specific configuration options unique to Databricks.",updated https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/dbfs-browser,Manage access to DBFS browser,Manage the DBFS file browser - Azure Databricks,Manage DBFS visual file browser access in Databricks,Learn how to enable and disable the ability to browse data in the Databricks File System using the visual browser interface.,"As a workspace admin user, you can manage your users' ability to browse data in the Databricks File System (DBFS) using the visual browser interface. 
This setting does not control programmatic access to the Databricks File System, for example through the DBFS command-line interface.",2024-05-17T08:00:00.000Z,concept-article,configuration,0.7,True,"Admin setting controlling DBFS visual browser; distinct from programmatic access, so configuration is product-specific.",unchanged https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/default-access-mode,Default compute access mode,Default access mode for jobs compute - Azure Databricks,Set default access mode for Databricks jobs compute,Learn how to set a default access mode for compute resources in your workspace.,This article describes how to set a default access mode for jobs compute in a workspace. This setting applies to compute resources created using the Jobs API where the access mode isn't defined in the cluster definition (configured using the job_clusters.new_cluster.data_security_mode property). This situation is more prevalent in legacy jobs that were authored before access mode was introduced. To define a default access mode: Important Changing this setting updates the access mode for all jobs c,2025-05-09T19:58:00.000Z,concept-article,configuration,0.8,True,Covers a workspace setting that controls default data_security_mode for jobs; this is a product-specific configuration with concrete behavior changes.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/default-python-packages,Configure default Python package repositories,Configure default Python package repositories - Azure Databricks,Configure default Python package repositories in Databricks,Learn how workspace admins can configure default Python package repos for their workspace.,"Important Configuring default Python package repositories for Lakeflow Spark Declarative Pipelines is in Public Preview. Workspace admins can control access to this feature from the Previews page. 
Workspace admins can configure private or authenticated package repositories within workspaces as the default pip configuration for notebooks, jobs, and Lakeflow Spark Declarative Pipelines. If a workspace is configured with a default Python package repo, users in the workspace will be able to install pac",2026-03-24T22:15:00.000Z,concept-article,configuration,0.8,True,"Describes workspace-level default pip configuration for notebooks, jobs, and Lakeflow Spark Declarative Pipelines, including how to set private/authenticated repos—product-specific configuration parameters and behavior.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/deletion-vectors,Auto-enable deletion vectors,Auto-enable deletion vectors - Azure Databricks,Auto-enable deletion vectors for new Delta tables,Learn how to configure options for creating new tables with deletion vectors.,Use this workspace setting to configure whether new Delta tables are created with deletion vectors enabled by default. This setting applies to all tables created using Databricks SQL warehouses and clusters running Databricks Runtime 14.0 and above. Databricks recommends using deletion vectors for all tables except for those used in workloads with incompatible Databricks Runtime versions or external Delta clients. See Deletion vectors in Databricks. 
Tables with deletion vectors enabled automatica,2025-10-07T08:00:00.000Z,concept-article,configuration,0.75,True,"Provides a workspace setting that controls default creation of Delta tables with deletion vectors, including version constraints (SQL warehouses and DBR 14.0+), which is a product-specific configuration detail.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/default-python-packages,Configure default Python package repositories,Configure default Python package repositories - Azure Databricks,Configure default Python package repositories in Databricks,Learn how workspace admins can configure default Python package repos for their workspace.,"Important Configuring default Python package repositories for Lakeflow Spark Declarative Pipelines is in Public Preview. Workspace admins can control access to this feature from the Previews page. Workspace admins can configure private or authenticated package repositories within workspaces as the default pip configuration for notebooks, jobs, and Lakeflow Spark Declarative Pipelines. If a workspace is configured with a default Python package repo, users in the workspace will be able to install pac",2026-04-22T08:00:00.000Z,how-to,configuration,0.78,True,"Admin setting for default pip configuration and authenticated/private repos is product-specific configuration with concrete options and behavior, beyond generic knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/deletion-vectors,Auto-enable deletion vectors,Auto-enable deletion vectors - Azure Databricks,Configure default deletion vector behavior for Delta tables,Learn how to configure options for creating new tables with deletion vectors.,Use this workspace setting to configure whether new Delta tables are created with deletion vectors enabled by default. This setting applies to all tables created using Databricks SQL warehouses and clusters running Databricks Runtime 14.0 and above. 
Databricks recommends using deletion vectors for all tables except for those used in workloads with incompatible Databricks Runtime versions or external Delta clients. See Deletion vectors in Databricks. Tables with deletion vectors enabled automatica,2026-04-22T08:00:00.000Z,how-to,configuration,0.7,True,Workspace setting controlling default creation of Delta tables with deletion vectors is a Databricks-specific configuration option with nuanced applicability.,updated https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/disable-upload-data-ui,Manage access to file upload interface,Disable the upload data UI - Azure Databricks,Disable the upload data UI in Azure Databricks,Learn how to enable and disable the ability to upload local data files using the user interface.,"As a workspace admin, you can manage your users' ability to upload data using the upload data UI. This setting enables or prevents users from uploading data securely to a Delta table or Databricks File System (DBFS) directly from the homepage, the Data tab, or the File menu in a notebook. This setting does not control programmatic access to the Databricks File System, for example through the DBFS command-line interface.",2024-08-09T08:00:00.000Z,concept-article,configuration,0.7,True,Workspace setting that controls UI-based data uploads; product-specific configuration with clear effect on capabilities.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/email,Workspace email settings,Manage workspace email settings - Azure Databricks,Configure email notification settings for Databricks workspaces,Learn how to manage your workspace's email settings and when users get email notifications for certain events.,"As a workspace admin user, you can configure when users get emails for certain events. To manage the email settings: You can also update the notification destinations from this page. 
For more information on notification destinations, see Manage notification destinations.",2026-02-18T22:10:00.000Z,concept-article,configuration,0.75,True,Describes when users get emails and how to configure notification destinations; involves specific workspace settings and options.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/email,Workspace email settings,Manage workspace email settings - Azure Databricks,Configure workspace email notification settings in Databricks,Learn how to manage your workspace's email settings and when users get email notifications for certain events.,"As a workspace admin user, you can configure when users get emails for certain events. To manage the email settings: You can also update the notification destinations from this page. For more information on notification destinations, see Manage notification destinations.",2026-04-22T08:00:00.000Z,how-to,configuration,0.65,True,"Defines specific workspace-level email notification settings and events, which are product-specific configuration options.",updated https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/enforce-user-isolation,Enforce user isolation cluster types on a workspace,Enforce user isolation cluster types on a workspace - Azure Databricks,Enforce user isolation cluster types in Databricks workspaces,Learn how to enable Enforce User Isolation on a workspace.,There is a workspace setting called Enforce User Isolation that admins can enable to prevent creating or starting a cluster with the “No isolation shared” cluster access type or an equivalent legacy cluster type. Attempting to start or restart a cluster of those types will fail. Important There is a related account-level admin protection setting that affects these cluster types. 
To enable Enforce User Isolation on a workspace: Currently-running clusters are unaffected by changes to this setting unti,2024-05-03T21:32:00.000Z,concept-article,security,0.85,True,Security-focused workspace setting that blocks certain cluster access types; includes specific cluster type names and behavior when enabled.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/manage-previews,Manage Databricks Previews,Manage Azure Databricks previews - Azure Databricks,Manage Azure Databricks preview feature settings,Learn how to manage Databricks Previews in your organization's workspaces.,This page tells you how to manage Azure Databricks previews in your account or workspace.,2025-10-07T08:00:00.000Z,concept-article,configuration,0.7,True,"Explains how to enable/disable Databricks preview features at account/workspace scope, involving specific preview management settings unique to the platform.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/manage-previews,Manage Databricks Previews,Manage Azure Databricks previews - Azure Databricks,,Learn how to manage Databricks Previews in your organization's workspaces.,This page tells you how to manage Azure Databricks previews in your account or workspace.,2026-04-22T08:00:00.000Z,how-to,,0.2,False,Managing previews is mostly procedural/admin UI guidance without detailed config tables or numeric constraints.,updated https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/notebook-results,Configure notebook result storage location,Configure notebook result storage location - Azure Databricks,Configure storage location for Databricks notebook results,Learn how to manage where notebook results get stored for your workspace.,"Your organization's privacy requirements may require that you store all interactive notebook results in the workspace storage account in your cloud account, rather than the Databricks-managed control plane 
default location where some notebook command results are stored. Notebook command output is stored differently depending on how you run the notebook. By default, when you run a notebook interactively by clicking Run in the notebook: When you run a notebook as a job, by scheduling it or by clicki",2024-10-04T08:00:00.000Z,concept-article,configuration,0.8,True,Details how notebook results are stored by mode and how to redirect them to workspace storage; product-specific configuration and behavior.,unchanged https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/notebooks,Manage access to notebook features,Manage access to notebook features - Azure Databricks,Manage user access to Databricks notebook features,Learn how to enable and disable the ability to download notebook results and the ability to version notebooks in Git.,"As a workspace admin user, you can manage your users' access to notebook features as follows:",2024-08-07T08:00:00.000Z,concept-article,configuration,0.7,True,Workspace admin controls for downloading results and Git versioning; involves specific feature toggles and permissions.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/notification-destinations,Manage notification destinations,Manage notification destinations - Azure Databricks,Configure Azure Databricks notification destinations via webhooks,Learn how to configure notification destinations for your workspace.,"This page teaches you how to create and configure notification destinations for your workspace. System notifications are messages that tell you when your workflow experiences a run event (start, success, and failure). By default, notifications are sent to user email addresses, but admins can configure alternate notification destinations using webhooks. This allows you to build event-driven integrations with Azure Databricks. 
Admins can also configure notification destinations to receive access r",2026-04-17T18:03:00.000Z,concept-article,configuration,0.7,True,"Describes how admins configure notification destinations and webhooks; such pages typically include concrete setting names, payload formats, and configuration parameters for system notifications, which are product-specific configuration details.",updated -https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/restrict-workspace-admins,Restrict workspace admins,Restrict workspace admins - Azure Databricks,Configure RestrictWorkspaceAdmins for Databricks workspaces,Learn how to restrict workspace admins from changing ownership of objects in their workspace to other users or service principals.,The RestrictWorkspaceAdmins setting lets account admins restrict workspace admin permissions. It has two independent fields: Changing one field does not affect the other.,2026-04-09T08:00:00.000Z,concept-article,security,0.7,True,"Describes a specific admin control (RestrictWorkspaceAdmins) with concrete, product-specific behavior and fields that govern workspace admin permissions. 
This is detailed configuration/security behavior that isn’t obvious from general knowledge and maps to security/permission settings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/storage,Purge workspace storage,Purge workspace storage - Azure Databricks,Purge Azure Databricks workspace storage securely,"Learn how to permanently purge workspace storage in Azure Databricks, such as deleted notebook cells, entire notebooks, experiments, or cluster logs.","Your organization's privacy requirements may require that you occasionally purge deleted objects like notebook cells, entire notebooks, experiments, or cluster logs.",2025-11-21T08:00:00.000Z,concept-article,configuration,0.7,True,"Explains how to permanently purge deleted notebook cells, notebooks, experiments, and cluster logs, including Databricks-specific storage behavior and controls that go beyond generic deletion.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/,Workspace creation overview,Create a workspace - Azure Databricks,,Overview of articles concerning workspace creation and management.,This article is an overview of your options for creating and managing workspaces.,2026-03-27T08:00:00.000Z,concept-article,,0.2,False,"High-level overview of workspace creation and management options without detailed settings, limits, or decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/notification-destinations,Manage notification destinations,Manage notification destinations - Azure Databricks,Configure notification destinations and webhooks in Databricks,Learn how to configure notification destinations for your workspace.,"This page teaches you how to create and configure notification destinations for your workspace. System notifications are messages that tell you when your workflow experiences a run event (start, success, and failure). 
By default, notifications are sent to user email addresses, but admins can configure alternate notification destinations using webhooks. This allows you to build event-driven integrations with Azure Databricks. Admins can also configure notification destinations to receive access r",2026-04-22T08:00:00.000Z,how-to,configuration,0.8,True,"Describes system notification destinations, including webhook-based configuration for workflow events and access requests; these are concrete product-specific settings.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/restrict-workspace-admins,Restrict workspace admins,Restrict workspace admins - Azure Databricks,Restrict workspace admin permissions in Azure Databricks,Learn how to restrict workspace admins from changing ownership of objects in their workspace to other users or service principals.,The RestrictWorkspaceAdmins setting lets account admins restrict workspace admin permissions. It has two independent fields: Changing one field does not affect the other.,2026-04-22T08:00:00.000Z,reference,security,0.78,True,Describes the RestrictWorkspaceAdmins setting with distinct fields controlling ownership changes; this is a product-specific security/RBAC configuration.,updated +https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/storage,Purge workspace storage,Purge workspace storage - Azure Databricks,Purge Azure Databricks workspace storage securely,"Learn how to permanently purge workspace storage in Azure Databricks, such as deleted notebook cells, entire notebooks, experiments, or cluster logs.","Your organization's privacy requirements may require that you occasionally purge deleted objects like notebook cells, entire notebooks, experiments, or cluster logs.",2026-04-22T08:00:00.000Z,how-to,configuration,0.7,True,"Details how to permanently purge deleted notebooks, cells, experiments, and logs; this is product-specific configuration/operations behavior.",updated 
+https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/,Workspace creation overview,Create a workspace - Azure Databricks,,Overview of articles concerning workspace creation and management.,This article is an overview of your options for creating and managing workspaces.,2026-04-22T08:00:00.000Z,landing-page,,0.25,False,"Overview/landing page for workspace creation and management options without detailed settings, limits, or troubleshooting mappings.",updated https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/arm-template,Create workspace using an ARM template,Deploy a workspace with an ARM template - Azure Databricks,,Learn how to deploy and review an Azure Databricks deployment using an ARM template.,"This article explains how to create an Azure Databricks workspace using an ARM template. An ARM template is a JavaScript Object Notation (JSON) file that defines the infrastructure and configuration for your project. The template uses declarative syntax, which lets you state what you intend to deploy without having to write the sequence of programming commands to create it. If your environment meets the prerequisites and you're familiar with using ARM templates, select the Deploy to Azure button. 
Th",2024-08-26T08:00:00.000Z,concept-article,,0.35,False,How-to for deploying with ARM templates; likely generic template usage without detailed Databricks-specific configuration tables in the summary.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/azure-cli,Create workspace using the Azure CLI,Deploy a workspace with the Azure CLI - Azure Databricks,,Learn how to create an Azure Databricks deployment using the Azure CLI.,This article explains how to create an Azure Databricks deployment using the Azure CLI.,2025-10-22T22:05:00.000Z,concept-article,,0.3,False,CLI deployment how-to; summary does not mention product-specific deployment constraints or matrices.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/azure-cli,Create workspace using the Azure CLI,Deploy a workspace with the Azure CLI - Azure Databricks,Deploy an Azure Databricks workspace with Azure CLI,Learn how to create an Azure Databricks deployment using the Azure CLI.,This article explains how to create an Azure Databricks deployment using the Azure CLI.,2026-04-22T08:00:00.000Z,how-to,deployment,0.7,True,"Explains how to create a Databricks deployment using Azure CLI, which involves product-specific deployment commands and parameters.",updated https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/bicep,Create workspace using Bicep,Deploy a workspace with Bicep - Azure Databricks,,Learn how to deploy and review an Azure Databricks deployment using Bicep.,"This article explains how to create an Azure Databricks workspace using Bicep. Bicep is a domain-specific language (DSL) that uses declarative syntax to deploy Azure resources. It provides concise syntax, reliable type safety, and support for code reuse. 
Bicep offers the best authoring experience for your infrastructure-as-code solutions in Azure.",2024-08-26T08:00:00.000Z,concept-article,,0.35,False,Bicep deployment how-to; appears to be generic IaC usage rather than a detailed configuration reference with Databricks-specific parameters.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/create-workspace,Create workspace using Azure Portal,Deploy a workspace using the Azure Portal - Azure Databricks,,Learn how to create and manage Azure Databricks workspaces.,This page describes how to deploy a workspace using the Azure Portal. These instructions can be used to deploy a serverless workspace (Public Preview) or a classic workspace.,2026-04-15T08:00:00.000Z,concept-article,,0.2,False,"Step-by-step deployment via Azure Portal without tier matrices, constraints, or detailed configuration parameter tables; primarily a basic how-to guide rather than expert configuration or deployment reference.",updated -https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/delete-workspace,Delete a workspace,Delete a workspace - Azure Databricks,,Learn how to delete an Azure Databricks workspace.,"This page shows you how to delete a workspace in your Azure Databricks account. Note When you delete an Azure Databricks workspace, most compute resources such as VMs and disks are automatically cleaned up. However, the DBFS storage account and access connector in the managed resource group are retained unless you force delete them. See Force delete the workspace catalog. 
To delete an Azure Databricks workspace:",2026-03-12T18:41:00.000Z,concept-article,,0.4,False,"Describes workspace deletion and notes about retained resources; summary does not show detailed configuration options or limits, more of a procedural guide.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/powershell,Create workspace using PowerShell,Deploy a workspace with PowerShell - Azure Databricks,,Learn how to deploy and review an Azure Databricks deployment using Powershell.,"This article explains how to create an Azure Databricks workspace using Powershell. If you choose to use PowerShell locally, this article requires that you install the Az PowerShell module and connect to your Azure account using theConnect-AzAccountcmdlet. For more information about installing the Az PowerShell module, seeInstall Azure PowerShell. To connect to your Azure account with a user account or service principal, seeAuthenticate with Azure PowerShell. Note If you want to create an Azure ",2025-10-22T22:05:00.000Z,concept-article,,0.35,False,"PowerShell deployment how-to; summary focuses on prerequisites and basic commands, not on Databricks-specific deployment constraints or matrices.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/serverless-workspaces,Serverless workspaces overview,Create a serverless workspace - Azure Databricks,Decide when and how to use serverless workspaces,"An overview of Databricks serverless workspaces, including benefits and limitations.","A serverless workspace is a workspace deployed in your Azure Databricks account that comes preconfigured with serverless compute and default storage, providing a fully managed, enterprise-ready SaaS experience. Serverless workspaces are ideal for many use cases. They can be a production-level environment for organizations that exclusively use serverless compute, or a composable, short-lived environment for internal training and testing. 
Additionally, account admins without permission to provisio",2026-03-10T08:00:00.000Z,concept-article,decision-making,0.65,True,"Explains benefits and limitations of serverless workspaces and when they are ideal (production-only serverless, training, testing, limited permissions), providing Databricks-specific decision guidance rather than just a conceptual overview.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/serverless-workspaces-best-practices,Best practices for serverless workspaces,Best practices for serverless workspaces - Azure Databricks,Apply best practices for Azure Databricks serverless workspaces,Learn about best practices for serverless workspaces.,"This page lists important best practices for creating, managing, and securing serverless workspaces. These guidelines focus on use cases, cost management, security, and platform requirements.",2026-04-15T08:00:00.000Z,concept-article,best-practices,0.7,True,"Focused on concrete DO/DON'T guidance for serverless workspaces, including product-specific recommendations around cost management, security, and platform requirements that go beyond generic advice.",updated -https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/workspace-access,Databricks access to customer workspaces,Workspace access for Azure Databricks personnel - Azure Databricks,Control Azure Databricks personnel access to workspaces,Learn about how Azure Databricks can access workspaces and how to configure this ability.,"Depending on how your workspace access setting is configured, Azure Databricks personnel might temporarily log in to your workspace using the UI or API in order to investigate an outage, security event, or to support your deployment. 
When Azure Databricks personnel log in to your workspace, the following security measures are enforced:",2025-10-07T08:00:00.000Z,concept-article,security,0.8,True,"Describes how Databricks personnel can access workspaces for support and what security measures are enforced, including workspace access settings and behavior unique to the service.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/create-workspace,Create workspace using Azure Portal,Deploy a workspace using the Azure Portal - Azure Databricks,Deploy Databricks workspaces using the Azure portal,Learn how to create and manage Azure Databricks workspaces.,This page describes how to deploy a workspace using the Azure Portal. These instructions can be used to deploy a serverless workspace (Public Preview) or a classic workspace.,2026-04-22T08:00:00.000Z,how-to,deployment,0.7,True,"Provides product-specific deployment steps for classic and serverless workspaces via the Azure portal, including workspace-type-specific deployment behavior.",updated
+https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/delete-workspace,Delete a workspace,Delete a workspace - Azure Databricks,Delete Azure Databricks workspaces and managed resources,Learn how to delete an Azure Databricks workspace.,"This page shows you how to delete a workspace in your Azure Databricks account. Note When you delete an Azure Databricks workspace, most compute resources such as VMs and disks are automatically cleaned up. However, the DBFS storage account and access connector in the managed resource group are retained unless you force delete them. See Force delete the workspace catalog. 
To delete an Azure Databricks workspace:",2026-04-22T08:00:00.000Z,how-to,deployment,0.65,True,"Covers workspace deletion behavior, including which resources are automatically cleaned up and when to force delete DBFS storage and access connectors, which are product-specific lifecycle/deployment details.",updated
+https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/powershell,Create workspace using PowerShell,Deploy a workspace with PowerShell - Azure Databricks,Deploy an Azure Databricks workspace using PowerShell,Learn how to deploy and review an Azure Databricks deployment using PowerShell.,"This article explains how to create an Azure Databricks workspace using PowerShell. If you choose to use PowerShell locally, this article requires that you install the Az PowerShell module and connect to your Azure account using the Connect-AzAccount cmdlet. For more information about installing the Az PowerShell module, see Install Azure PowerShell. To connect to your Azure account with a user account or service principal, see Authenticate with Azure PowerShell. Note If you want to create an Azure ",2026-04-22T08:00:00.000Z,how-to,deployment,0.7,True,"Describes deploying and reviewing a Databricks workspace with PowerShell, including required modules and commands, which are product-specific deployment details.",updated
+https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/serverless-workspaces,Serverless workspaces overview,Create a serverless workspace - Azure Databricks,Evaluate and use Databricks serverless workspaces,"An overview of Databricks serverless workspaces, including benefits and limitations.","A serverless workspace is a workspace deployed in your Azure Databricks account that comes preconfigured with serverless compute and default storage, providing a fully managed, enterprise-ready SaaS experience. Serverless workspaces are ideal for many use cases. 
They can be a production-level environment for organizations that exclusively use serverless compute, or a composable, short-lived environment for internal training and testing. Additionally, account admins without permission to provisio",2026-04-22T08:00:00.000Z,concept-article,decision-making,0.7,True,"Describes benefits, limitations, and use cases for serverless workspaces versus other options, providing product-specific guidance on when to choose this workspace type.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/serverless-workspaces-best-practices,Best practices for serverless workspaces,Best practices for serverless workspaces - Azure Databricks,Apply best practices for Databricks serverless workspaces,Learn about best practices for serverless workspaces.,"This page lists important best practices for creating, managing, and securing serverless workspaces. These guidelines focus on use cases, cost management, security, and platform requirements.",2026-04-22T08:00:00.000Z,best-practice,best-practices,0.85,True,"Explicitly lists best practices for creating, managing, securing, and controlling costs in serverless workspaces, which are product-specific DO/DON'T recommendations.",updated +https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/workspace-access,Databricks access to customer workspaces,Workspace access for Azure Databricks personnel - Azure Databricks,Configure Azure Databricks personnel access to workspaces,Learn about how Azure Databricks can access workspaces and how to configure this ability.,"Depending on how your workspace access setting is configured, Azure Databricks personnel might temporarily log in to your workspace using the UI or API in order to investigate an outage, security event, or to support your deployment. 
When Azure Databricks personnel log in to your workspace, the following security measures are enforced:",2026-04-22T08:00:00.000Z,concept-article,security,0.8,True,Explains how Databricks personnel can access workspaces and the enforced security measures; this is specific security configuration and behavior.,updated https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/workspace-storage-redundancy,Change workspace storage redundancy,Change workspace storage redundancy options - Azure Databricks,Change Azure Databricks workspace storage redundancy settings,learn how to change the Azure Databricks workspace storage account redundancy setting(s).,"When creating a workspace, the default redundancy for the managed storage account (Databricks filesystem (DBFS) root) is set as Geo-redundant storage (GRS). In this article, you will learn how to change the Azure Databricks workspace storage (DBFS) account redundancy settings. Important Starting on March 6, 2023, new Azure Databricks workspaces use Azure Data Lake Storage storage accounts for the DBFS root. Workspaces created before March 6, 2023 used Blob Storage.",2025-05-09T19:58:00.000Z,concept-article,configuration,0.75,True,Explains how to change DBFS storage redundancy and notes date-based behavior changes; involves specific configuration options and platform behavior.,unchanged https://learn.microsoft.com/en-us/azure/databricks/agent-skills/,AI agent skills,Agent skills for AI coding assistants - Azure Databricks,,Discover and install agent skills that help AI coding assistants perform Azure Databricks development tasks more effectively.,"Agent skills are task-specific instruction files that AI coding assistants like Claude and GitHub Copilot can load to perform Azure Databricks development tasks. Skills package domain-specific knowledge, best practices, and workflows into a format optimized for AI consumption. 
To learn how to extend Genie Code in the Azure Databricks workspace, seeExtend Genie Code with agent skills. Skills follow the openAgent Skillsstandard. Each skill is a Markdown file with front-matter metadata that describ",2026-03-31T23:28:00.000Z,concept-article,,0.3,False,"Conceptual description of agent skills format and usage; no indication of detailed config tables, limits, or error codes.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ai-bi/,Databricks AI/BI,Databricks AI/BI - Azure Databricks,,"Databricks AI/BI provides self-service data analysis with AI-powered dashboards, conversational Genie spaces, and seamless platform integration.","Databricks AI/BI is a business intelligence solution that uses compound AI to enhance data analysis with self-service insights, governance, and performance. Explore the tools and features available for creating dashboards, building Genie spaces, defining business semantics, and managing AI/BI resources.",2026-04-09T22:01:00.000Z,landing-page,,0.1,False,"High-level product/feature overview of Databricks AI/BI; no detailed limits, configuration tables, error codes, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ai-bi/admin/,Admin guide,AI/BI administration guide - Azure Databricks,Configure AI/BI access controls in Azure Databricks,Learn about available settings and controls for sharing and accessing AI/BI products.,This page describes the account and workspace-level administrative controls that can be applied to AI/BI products.,2026-04-16T08:00:00.000Z,overview,security,0.7,True,"An administration guide for AI/BI products is likely to enumerate specific workspace/account-level settings, RBAC roles, and access control options unique to Databricks AI/BI (for example, which roles can share artifacts, manage AI features, or control data access). 
These are product-specific security/administration configurations rather than generic concepts.",updated +https://learn.microsoft.com/en-us/azure/databricks/ai-bi/admin/,Admin guide,AI/BI administration guide - Azure Databricks,Configure AI/BI access controls in Azure Databricks,Learn about available settings and controls for sharing and accessing AI/BI products.,This page describes the account and workspace-level administrative controls that can be applied to AI/BI products.,2026-04-16T08:00:00.000Z,overview,security,0.7,True,"An administration guide for AI/BI products is likely to enumerate specific workspace/account-level settings, RBAC roles, and access control options unique to Databricks AI/BI (for example, which roles can share artifacts, manage AI features, or control data access). These are product-specific security/administration configurations rather than generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ai-bi/admin/audit,Monitor usage,Monitor Genie spaces usage with audit logs and alerts - Azure Databricks,Monitor Genie space activity with audit log queries,Learn how to use audit logs to monitor Genie space activity.,"Important This feature is inPublic Preview. Warning Due to an outage, some dashboard audit logs might be missing for the period October 16, 2025 – November 19, 2025. This page has sample queries that workspace admins can use to monitor activity associated with Genie spaces. All queries access the audit logs table, which is a system table that stores records for all audit events from workspaces in your region. SeeMonitor account activity with system tables. 
For a comprehensive reference of availa",2026-02-25T19:11:00.000Z,how-to,troubleshooting,0.7,True,Provides sample queries against system audit tables to monitor Genie spaces; these are product-specific diagnostic patterns (symptom-to-query-to-detection) that go beyond generic logging concepts.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ai-bi/admin/embed,Manage embedding,Manage dashboard and Genie space embedding - Azure Databricks,Configure embedding controls for dashboards and Genie,Learn how workspace admins can manage embedding options for AI/BI dashboards and Genie spaces.,This page shows how workspace admins can manage embedding options for AI/BI dashboards and Genie spaces.,2026-04-09T17:57:00.000Z,how-to,security,0.65,True,"Workspace admin page for managing embedding options is likely to include security-related settings, scopes, and possibly role-based controls specific to embedding AI/BI assets.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ai-bi/admin/slack-subscriptions,Configure Slack notifications,Configure Slack notifications - Azure Databricks,Configure Slack notifications for AI/BI dashboard subscriptions,Learn how to configure Slack notifications for AI/BI dashboard subscriptions.,"AI/BI dashboards support sending scheduled snapshots to Slack channels. This allows teams to receive dashboard updates directly in their Slack workspace. This page explains how to create a Slack app and configure a Slack channel as a notification destination. Dashboard editors can then add this notification destination as a subscriber to scheduled dashboards. For information about adding a Slack channel as a subscriber, seeSubscribe a Slack or Microsoft Teams channel. 
Slack subscriptions deliver",2026-02-23T08:00:00.000Z,how-to,configuration,0.7,True,Describes creating a Slack app and configuring channels as notification destinations; involves specific configuration steps and parameters unique to this integration.,unchanged @@ -107,21 +107,21 @@ https://learn.microsoft.com/en-us/azure/databricks/ai-bi/admin/use-apis,API tool https://learn.microsoft.com/en-us/azure/databricks/ai-bi/concepts,AI/BI concepts,AI/BI concepts - Azure Databricks,,"Learn about Databricks AI/BI concepts, compound AI systems, and how AI/BI integrates with the Databricks Data Intelligence Platform.","Databricks AI/BI is a new type of business intelligence product designed to provide a deep understanding of your data's semantics, enabling self-service data analysis for everyone in your organization. AI/BI is built on acompound AI systemthat draws insights from the full lifecycle of your data across the Azure Databricks platform, including ETL pipelines, lineage, and other queries.",2026-04-06T18:30:00.000Z,concept-article,,0.1,False,"Conceptual explanation of AI/BI and compound AI; lacks product-specific numeric limits, config parameters, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ai-bi/consumers/,Consumer access,What is consumer access? - Azure Databricks,Manage consumer access and permissions in Databricks One,Learn about the consumer role and how to manage users.,"Consumer access provides a streamlined, read-only experience for business users to discover and consume dashboards, Genie spaces, and Databricks Apps shared with them using Databricks One. This access type is designed for nontechnical users who require insights from analytics products without the complexity of full workspace functionality. 
This page explains how workspace admins can grant consumer access, outlines the capabilities and restrictions for users with this entitlement, describes relev",2026-03-19T22:07:00.000Z,concept-article,security,0.7,True,"Describes consumer role capabilities, restrictions, and how admins grant access; this is identity/entitlement configuration with product-specific roles and permissions.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ai-bi/release-notes/,AI/BI release notes,AI/BI release notes - Azure Databricks,,Learn about what's new in Databricks AI/BI releases.,"This page lists new features and updates for the AI/BI product line. AI/BI is a business intelligence product that includes dashboards for visualization and reporting, plus Genie for conversational analytics.",2026-01-21T08:00:00.000Z,landing-page,,0.3,False,"Top-level AI/BI release notes index; summary indicates a list of features and updates, not structured limits, configuration, or troubleshooting content.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ai-bi/release-notes/2024,Release notes 2024,AI/BI release notes 2024 - Azure Databricks,,2024 release notes for AI/BI features and improvements.,The following AI/BI features and updates were released from January through December 2024.,2026-03-31T08:00:00.000Z,release-notes,,0.3,False,"2024 AI/BI release notes summary indicates a chronological list of improvements, not structured expert-knowledge content per the defined categories.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ai-bi/release-notes/2025,Release notes 2025,AI/BI release notes 2025 - Azure Databricks,,2025 release notes for AI/BI features and improvements.,The following AI/BI features and improvements were released in 2025. Note Releases are staged. 
Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-09T08:00:00.000Z,release-notes,,0.3,False,"AI/BI 2025 release notes index; similar to 2026, appears to be a chronological feature list without clear indication of structured expert details that fit the defined sub-skill types.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ai-bi/release-notes/2026,Release notes 2026,AI/BI release notes 2026 - Azure Databricks,,2026 release notes for AI/BI features and improvements.,The following AI/BI features and improvements were released in 2026. Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-16T08:00:00.000Z,release-notes,,0.2,False,"Release notes typically list feature changes and dates but not the specific limits, configuration tables, error-code mappings, or decision matrices required by the defined sub-skill types. The description does not indicate presence of numeric limits, configuration parameters, or troubleshooting details.",updated -https://learn.microsoft.com/en-us/azure/databricks/ai-bi/tools,Partner tools,Business intelligence tools - Azure Databricks,,Learn about built-in and third-party business intelligence tools that complement and integrate with Azure Databricks.,This article highlights business intelligence tools that integrate with Azure Databricks.,2026-03-10T23:23:00.000Z,overview,,0.1,False,High-level overview of BI tools that integrate with Databricks; likely marketing/integrations overview without detailed config tables or parameters.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/,Govern LLMs and MCPs (AI Gateway),AI Gateway - Azure Databricks,,"Learn how AI Gateway governs and monitors LLM endpoints, MCP servers, coding agents, and model serving endpoints.","Important This page covers the new AI Gateway (visible in the left nav of the UI), which is currently 
inBeta. Account admins can enable access to this feature in the account consolePreviewspage. SeeManage Azure Databricks previews. For details on the previous version of AI Gateway, seeAI Gateway for serving endpoints. AI Gateway is the Databricks central AI governance layer for LLM endpoints, MCP servers, and coding agents. Use AI Gateway to analyze usage, configure permissions, enforce guardrai",2026-04-15T22:08:00.000Z,landing-page,,0.3,False,"Overview of AI Gateway beta and its role; summary does not show concrete limits, configuration parameter tables, or error mappings—primarily conceptual governance description.",new -https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/coding-agent-integration-beta,Coding agent integration,Integrate with coding agents - Azure Databricks,,"Learn how to integrate coding agents like Cursor, Gemini CLI, and Codex CLI with AI Gateway.","Important This feature is inBeta. Account admins can control access to this feature from the account consolePreviewspage. SeeManage Azure Databricks previews. With the Azure Databricks coding agent integration, you can manage access and usage for coding agents like Cursor, Gemini CLI, and Codex CLI. Built on AI Gateway, it provides rate limiting, usage tracking, and inference tables for your coding tools.",2026-04-07T17:44:00.000Z,integration,,0.3,False,"Described as an integration overview for coding agents using AI Gateway, likely focused on conceptual integration and basic setup rather than detailed config parameter tables or error-code troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/configure-ai-gateway-endpoints,Configure endpoints,Configure AI Gateway on model serving endpoints - Azure Databricks,,Learn how to configure AI Gateway on a model serving endpoint.,"Important A new AI Gateway experience is available in Beta. The new AI Gateway is the enterprise control plane for governing LLM endpoints and coding agents with enhanced features. 
SeeAI Gateway (Beta). In this article, you learn how to configureAI Gatewayon a model serving endpoint.",2026-03-05T08:00:00.000Z,how-to,,0.5,False,How-to configure AI Gateway on endpoints; likely has some settings but summary does not clearly indicate full configuration tables with defaults/ranges.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/configure-endpoints-beta,Configure endpoints,Configure AI Gateway endpoints - Azure Databricks,Configure AI Gateway endpoints in the beta experience,Learn how to configure AI Gateway endpoints in the Beta experience.,Important This feature is inBeta. Account admins can control access to this feature from the account consolePreviewspage. SeeManage Azure Databricks previews. This page describes how to configureAI Gateway (Beta)endpoints.,2026-04-15T22:08:00.000Z,how-to,configuration,0.7,True,"Explicitly about configuring AI Gateway endpoints; such pages typically include endpoint configuration parameters, allowed values, and defaults, matching the configuration sub-skill.",updated -https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/inference-tables,Inference tables,Monitor served models using AI Gateway-enabled inference tables - Azure Databricks,,"Learn what AI Gateway-enabled inference tables are, what they log and how to enable them using the UI.",Important A new AI Gateway experience is available in Beta. The new AI Gateway is the enterprise control plane for governing LLM endpoints and coding agents with enhanced features. SeeAI Gateway (Beta). Important The legacy inference table experience for custom model serving endpoints will be deprecated soon. SeeMigrating to AI Gateway inference tables. This article describes AI Gateway-enabled inference tables for monitoring served models. 
The inference table automatically captures incoming req,2026-02-27T08:00:00.000Z,feature-guide,,0.5,False,Describes AI Gateway-enabled inference tables; may include schema details but summary does not show explicit config ranges or quotas.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/inference-tables-beta,Inference tables,Monitor models using inference tables - Azure Databricks,,Learn how to use AI Gateway inference tables to monitor requests and responses for AI Gateway endpoints.,Important This feature is inBeta. Account admins can control access to this feature from the account consolePreviewspage. SeeManage Azure Databricks previews. This page describes how to use inference tables to monitorAI Gateway (Beta)endpoints.,2026-03-02T08:00:00.000Z,feature-guide,,0.4,False,Explains inference tables conceptually; no explicit evidence of configuration parameter ranges or quotas.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/overview-beta,LLMs,AI Gateway for LLM endpoints - Azure Databricks,Govern LLM endpoints with AI Gateway permissions and limits,"Learn about AI Gateway LLM governance features, including permissions, usage tracking, rate limits, and coding agent integration.","Important This page covers the new AI Gateway (visible in the left nav of the UI), which is currently inBeta. Account admins can enable access to this feature in the account consolePreviewspage. SeeManage Azure Databricks previews. For details on the previous version of AI Gateway, seeAI Gateway for serving endpoints.",2026-04-15T22:08:00.000Z,overview,security,0.65,True,"Described as covering permissions, usage tracking, rate limits, and coding agent integration. 
Likely includes product-specific governance and permission settings, fitting security (with some limits-quotas aspects).",new -https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/overview-serving-endpoints,Model serving endpoints (previous),AI Gateway for serving endpoints - Azure Databricks,,This article provides an overview of AI Gateway for serving endpoints and its supported features.,"Important A new AI Gateway experience is available in Beta. The new AI Gateway is the enterprise control plane for governing LLM endpoints and coding agents with enhanced features. SeeAI Gateway for LLM endpoints. This page describes AI Gateway for serving endpoints, which governs and monitors access to supported generative AI models and their associated model-serving endpoints.",2026-02-27T08:00:00.000Z,overview,,0.3,False,"Described as an overview of AI Gateway for serving endpoints and supported features. Sounds like conceptual/feature overview without explicit limits, config tables, or error mappings.",new -https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/query-endpoints-beta,Query endpoints,Query AI Gateway endpoints - Azure Databricks,,Learn how to query AI Gateway endpoints using unified and native APIs.,Important This feature is inBeta. Account admins can control access to this feature from the account consolePreviewspage. SeeManage Azure Databricks previews. This page describes how to queryAI Gateway (Beta)endpoints using supported APIs.,2026-03-27T08:00:00.000Z,how-to,,0.4,False,Describes how to query AI Gateway endpoints; summary does not indicate detailed parameter tables or quotas.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/rate-limits-beta,Rate limits,Configure rate limits for AI Gateway endpoints - Azure Databricks,Configure AI Gateway endpoint rate limits in Databricks,Learn how to configure rate limits for AI Gateway endpoints to manage capacity and costs.,Important This feature is inBeta. 
Account admins can control access to this feature from the account consolePreviewspage. SeeManage Azure Databricks previews. This page describes how to configure rate limits forAI Gateway (Beta)endpoints. Rate limits allow you to enforce consumption limits on an endpoint to manage capacity and costs.,2026-04-09T17:57:00.000Z,how-to,limits-quotas,0.8,True,"A page specifically about configuring rate limits for AI Gateway endpoints is likely to include concrete numeric limits (requests per minute/hour, token caps), configuration ranges, and possibly tier-specific constraints, which qualify as expert limits/quotas knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/usage-tracking-beta,Usage tracking,Monitor usage for AI Gateway endpoints - Azure Databricks,,Learn how to monitor usage for AI Gateway endpoints using the usage tracking system table.,"Important This feature is inBeta. Account admins can control access to this feature from the account consolePreviewspage. SeeManage Azure Databricks previews. This page describes how to monitor usage forAI Gateway (Beta)endpoints using the usage tracking system table. The usage tracking table automatically captures request and response details for an endpoint, logging essential metrics like token usage and latency. 
You can use the data in this table to monitor usage, track costs, and gain insigh",2026-03-02T08:00:00.000Z,how-to,,0.4,False,Monitoring usage via system table; likely describes schema but summary does not show detailed config ranges or limits.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/ai-bi/release-notes/2024,Release notes 2024,AI/BI release notes 2024 - Azure Databricks,,2024 release notes for AI/BI features and improvements.,The following AI/BI features and updates were released from January through December 2024.,2026-04-22T08:00:00.000Z,release-notes,,0.2,False,"2024 AI/BI release notes summarize features and updates; no indication of structured limits, configuration parameters, or decision matrices that meet expert-knowledge criteria.",updated +https://learn.microsoft.com/en-us/azure/databricks/ai-bi/release-notes/2025,Release notes 2025,AI/BI release notes 2025 - Azure Databricks,,2025 release notes for AI/BI features and improvements.,The following AI/BI features and improvements were released in 2025. Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-20T08:00:00.000Z,release-notes,,0.2,False,"Page is a yearly release-notes overview for AI/BI; description does not indicate presence of numeric limits, config tables, or error-code-based troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/databricks/ai-bi/release-notes/2026,Release notes 2026,AI/BI release notes 2026 - Azure Databricks,,2026 release notes for AI/BI features and improvements.,The following AI/BI features and improvements were released in 2026. Note Releases are staged. 
Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-22T08:00:00.000Z,release-notes,,0.2,False,"Release notes typically list new features and fixes but not structured limits, configs, or troubleshooting mappings as defined; summary provides no evidence of detailed expert-knowledge content.",updated
+https://learn.microsoft.com/en-us/azure/databricks/ai-bi/tools,Partner tools,Business intelligence tools - Azure Databricks,,Learn about built-in and third-party business intelligence tools that complement and integrate with Azure Databricks.,This article highlights business intelligence tools that integrate with Azure Databricks.,2026-04-20T08:00:00.000Z,overview,,0.2,False,"High-level description of BI tools that integrate with Azure Databricks; likely a catalog/overview without detailed configuration tables, limits, or decision matrices.",updated
+https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/,AI governance (Unity AI Gateway),Unity AI Gateway - Azure Databricks,,"Learn how Unity AI Gateway governs and monitors LLM endpoints, MCP servers, coding agents, and model serving endpoints.","Important This page covers the new AI Gateway (visible in the left nav of the UI), which is currently in Beta. Account admins can enable access to this feature in the account console Previews page. See Manage Azure Databricks previews. For details on the previous version of Unity AI Gateway, see Unity AI Gateway for serving endpoints. Unity AI Gateway is the Databricks central AI governance layer for LLM endpoints, MCP servers, and coding agents. 
Use Unity AI Gateway to analyze usage, configure permi",2026-04-21T22:28:00.000Z,landing-page,,0.3,False,"Overview of Unity AI Gateway capabilities; summary indicates conceptual governance description without concrete limits, config tables, or decision matrices.",new
+https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/coding-agent-integration-beta,Coding agent integration,Integrate with coding agents - Azure Databricks,Integrate coding agents with Unity AI Gateway for governance,"Learn how to integrate coding agents like Cursor, Gemini CLI, and Codex CLI with Unity AI Gateway.","Important This feature is in Beta. Account admins can control access to this feature from the account console Previews page. See Manage Azure Databricks previews. With the Azure Databricks coding agent integration, you can manage access and usage for coding agents like Cursor, Gemini CLI, and Codex CLI. Built on Unity AI Gateway, it provides rate limiting, usage tracking, and inference tables for your coding tools.",2026-04-21T22:28:00.000Z,integration,integrations,0.7,True,"Integration with specific coding agents (Cursor, Gemini CLI, Codex CLI) will require product-specific configuration parameters, endpoint URLs, and possibly token or header settings unique to Databricks’ AI Gateway, fitting the integrations & coding patterns category.",updated
+https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/configure-ai-gateway-endpoints,Configure endpoints,Configure Unity AI Gateway on model serving endpoints - Azure Databricks,Configure Unity AI Gateway on Databricks model serving endpoints,Learn how to configure Unity AI Gateway on a model serving endpoint.,"Important A new Unity AI Gateway experience is available in Beta. The new Unity AI Gateway is the enterprise control plane for governing LLM endpoints and coding agents with enhanced features. See Unity AI Gateway for LLM endpoints. 
In this article, you learn how to configure Unity AI Gateway on a model serving endpoint.",2026-04-21T08:00:00.000Z,how-to,configuration,0.75,True,"A how-to configuration article for enabling AI Gateway on serving endpoints will include specific setting names, UI/CLI parameters, and possibly JSON configuration snippets unique to Databricks.",updated
+https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/configure-endpoints-beta,Configure endpoints,Configure Unity AI Gateway endpoints - Azure Databricks,Configure Unity AI Gateway endpoints in Databricks,Learn how to configure Unity AI Gateway endpoints in the Beta experience.,Important This feature is in Beta. Account admins can control access to this feature from the account console Previews page. See Manage Azure Databricks previews. This page describes how to configure Unity AI Gateway endpoints.,2026-04-21T22:28:00.000Z,how-to,configuration,0.65,True,"Page is specifically about configuring AI Gateway endpoints; likely includes endpoint setting names, allowed values, and possibly tables of configuration parameters unique to this product.",updated
+https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/inference-tables,Inference tables,Monitor served models using Unity AI Gateway-enabled inference tables - Azure Databricks,Enable and use Unity AI Gateway inference tables for served models,"Learn what Unity AI Gateway-enabled inference tables are, what they log and how to enable them using the UI.",Important A new Unity AI Gateway experience is available in Beta. The new Unity AI Gateway is the enterprise control plane for governing LLM endpoints and coding agents with enhanced features. See Unity AI Gateway for LLM endpoints. Important The legacy inference table experience for custom model serving endpoints will be deprecated soon. See Migrating to AI Gateway inference tables. This article describes Unity AI Gateway-enabled inference tables for monitoring served models. 
The inference table ,2026-04-21T08:00:00.000Z,feature-guide,configuration,0.8,True,"Describes what inference tables log and how to enable them; this will include table schemas, column names, and configuration steps specific to Databricks AI Gateway, which are expert configuration details.",updated
+https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/inference-tables-beta,Inference tables,Monitor models using inference tables - Azure Databricks,Configure Unity AI Gateway inference tables for endpoint monitoring,Learn how to use Unity AI Gateway inference tables to monitor requests and responses for Unity AI Gateway endpoints.,Important This feature is in Beta. Account admins can control access to this feature from the account console Previews page. See Manage Azure Databricks previews. This page describes how to use inference tables to monitor Unity AI Gateway endpoints.,2026-04-21T22:28:00.000Z,feature-guide,configuration,0.7,True,"Inference tables for Unity AI Gateway endpoints will document specific table schemas, column names, and enablement settings, which are product-specific configuration details beyond generic monitoring concepts.",updated
+https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/overview-beta,LLMs,Unity AI Gateway for LLM endpoints - Azure Databricks,,"Learn about Unity AI Gateway LLM governance features, including permissions, usage tracking, rate limits, and coding agent integration.","Important This page covers the new AI Gateway (visible in the left nav of the UI), which is currently in Beta. Account admins can enable access to this feature in the account console Previews page. See Manage Azure Databricks previews. 
For details on the previous version of Unity AI Gateway, see Unity AI Gateway for serving endpoints.",2026-04-21T22:28:00.000Z,overview,,0.3,False,"Overview of AI Gateway for LLM endpoints mentioning permissions, usage tracking, and rate limits, but summary does not show specific numeric limits or detailed RBAC role definitions.",updated
+https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/overview-serving-endpoints,Model serving endpoints (previous),Unity AI Gateway for serving endpoints - Azure Databricks,,This article provides an overview of Unity AI Gateway for serving endpoints and its supported features.,"Important A new Unity AI Gateway experience is available in Beta. The new Unity AI Gateway is the enterprise control plane for governing LLM endpoints and coding agents with enhanced features. See Unity AI Gateway for LLM endpoints. This page describes Unity AI Gateway for serving endpoints, which governs and monitors access to supported generative AI models and their associated model-serving endpoints.",2026-04-21T22:28:00.000Z,overview,,0.2,False,"Described as an overview of Unity AI Gateway for serving endpoints and supported features; overviews typically lack detailed numeric limits, configuration tables, or error mappings.",updated
+https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/query-endpoints-beta,Query endpoints,Query Unity AI Gateway endpoints - Azure Databricks,Query Unity AI Gateway endpoints via unified APIs,Learn how to query Unity AI Gateway endpoints using unified and native APIs.,Important This feature is in Beta. Account admins can control access to this feature from the account console Previews page. See Manage Azure Databricks previews. 
This page describes how to query Unity AI Gateway endpoints using supported APIs.,2026-04-21T22:28:00.000Z,how-to,integrations,0.65,True,"Describes querying AI Gateway endpoints using unified and native APIs; likely includes request/response parameters, API-specific options, and product-specific integration patterns.",updated
+https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/rate-limits-beta,Rate limits,Configure rate limits for Unity AI Gateway endpoints - Azure Databricks,Set and manage Unity AI Gateway endpoint rate limits,Learn how to configure rate limits for Unity AI Gateway endpoints to manage capacity and costs.,Important This feature is in Beta. Account admins can control access to this feature from the account console Previews page. See Manage Azure Databricks previews. This page describes how to configure rate limits for Unity AI Gateway endpoints. Rate limits allow you to enforce consumption limits on an endpoint to manage capacity and costs.,2026-04-21T22:28:00.000Z,how-to,limits-quotas,0.9,True,"A page dedicated to configuring rate limits for Unity AI Gateway endpoints will list concrete numeric limits (requests per second/minute, token caps, default and maximum values) and possibly tier-based tables, which are exactly the kind of expert numeric constraints described under limits-quotas.",updated
+https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/usage-tracking-beta,Usage tracking,Monitor usage for Unity AI Gateway endpoints - Azure Databricks,Configure and query Unity AI Gateway usage tracking tables,Learn how to monitor usage for Unity AI Gateway endpoints using the usage tracking system table.,"Important This feature is in Beta. Account admins can control access to this feature from the account console Previews page. See Manage Azure Databricks previews. This page describes how to monitor usage for Unity AI Gateway endpoints using the usage tracking system table. 
The usage tracking table automatically captures request and response details for an endpoint, logging essential metrics like token usage and latency. You can use the data in this table to monitor usage, track costs, and gain insight",2026-04-21T22:28:00.000Z,how-to,configuration,0.7,True,"Usage tracking system table for Unity AI Gateway endpoints will include concrete schema details (column names for token counts, latency, request/response metadata) and example queries specific to Databricks system tables, which are product-specific configuration/telemetry details not inferable from general knowledge.",updated https://learn.microsoft.com/en-us/azure/databricks/archive/,Archived docs,Azure Databricks documentation archive - Azure Databricks,,"The docs in this archive have been retired and might not be updated. The products, services, or technologies mentioned in this content are no longer supported.","Important This documentation has been retired and might not be updated. The products, services, or technologies mentioned in this content are no longer supported. In this archive, you can find earlier versions of documentation for Azure Databricks products, features, APIs, and workflows.",2026-04-10T18:06:00.000Z,archived,,0.1,False,Archive landing page describing retired documentation; no specific technical details itself.,unchanged https://learn.microsoft.com/en-us/azure/databricks/archive/azure-admin/conditional-access,Conditional access,Conditional access - Azure Databricks,Enable Microsoft Entra conditional access for Databricks,"Learn how to enable Microsoft Entra ID conditional access for Azure Databricks, to allow administrators to control where and when users are permitted to sign in to Azure Databricks.","Important This documentation has been retired and might not be updated. The products, services, or technologies mentioned in this content are not officially endorsed or tested by Databricks. 
Azure Databricks supports Microsoft Entra ID conditional access, which allows administrators to control where and when users are permitted to sign in to Azure Databricks. For example, conditional access policies can restrict sign-in to your corporate network or can require multi-factor authentication. For mo",2026-01-21T08:00:00.000Z,archived,security,0.7,True,"The page describes enabling conditional access for Azure Databricks, which involves identity and access configuration. Such content typically includes specific policy settings and scopes, fitting the security sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/databricks/archive/azure/aqs,Streaming with AQS,Azure Blob storage file source with Azure Queue Storage (legacy) - Azure Databricks,Use legacy ABS-AQS streaming connector in Databricks,Learn how to use the legacy AQS connector for Azure Storage (Queues and Blobs) as a source for streaming data in Azure Databricks.,"Important This documentation has been retired and might not be updated. The products, services, or technologies mentioned in this content are no longer supported. See What is Auto Loader?. The ABS-AQS connector provides an optimized file source that uses Azure Queue Storage (AQS) to find new files written to an Azure Blob storage (ABS) container without repeatedly listing all of the files. 
This provides two advantages: Note The ABS-AQS source deletes messages from the AQS queue as it consumes eve",2026-02-11T08:00:00.000Z,archived,integrations,0.75,True,"Describes the ABS-AQS connector as an optimized file source using Azure Queue Storage to detect new files in Blob Storage; legacy connector docs generally include connector-specific options and behaviors (for example, message deletion semantics) that are expert integration details.",unchanged @@ -129,7 +129,7 @@ https://learn.microsoft.com/en-us/azure/databricks/archive/azure/cosmosdb,Azure https://learn.microsoft.com/en-us/azure/databricks/archive/azure/stream-synapse,Structured Streaming writes to Azure Synapse,Structured Streaming writes to Azure Synapse - Azure Databricks,Stream data from Databricks to Azure Synapse with Structured Streaming,Learn how to use Apache Spark Structured Streaming to write data to Azure Synapse Analytics using Azure Databricks.,Important This documentation has been retired and might not be updated. The Azure Synapse connector offers efficient and scalable Structured Streaming write support for Azure Synapse that provides consistent user experience with batch writes and uses COPY for large data transfers between an Azure Databricks cluster and Azure Synapse instance. Structured Streaming support between Azure Databricks and Synapse provides simple semantics for configuring incremental ETL jobs. 
The model used to load data,2024-12-17T08:00:00.000Z,archived,integrations,0.7,True,"Describes Structured Streaming writes to Synapse using a specific connector and COPY semantics, a Databricks–Synapse integration pattern.",unchanged https://learn.microsoft.com/en-us/azure/databricks/archive/azure/synapse-polybase,Synapse with Polybase,Connecting Azure Databricks and Azure Synapse with PolyBase (legacy) - Azure Databricks,Configure legacy PolyBase connector for Databricks,View configuration information for the legacy PolyBase connector for Azure Databricks and Synapse.,"Important This documentation has been retired and might not be updated. The products, services, or technologies mentioned in this content are no longer supported. See Query data in Azure Synapse Analytics. Databricks recommends using the default COPY functionality with Azure Data Lake Storage for connections to Azure Synapse. This article includes legacy documentation around PolyBase and blob storage. Azure Synapse Analytics (formerly SQL Data Warehouse) is a cloud-based enterprise data warehouse th",2026-02-11T08:00:00.000Z,archived,integrations,0.7,True,Described as including configuration information for the legacy PolyBase connector between Azure Databricks and Synapse; such connector docs typically list connector-specific options and parameters that qualify as product-specific integration details.,unchanged https://learn.microsoft.com/en-us/azure/databricks/archive/compute/cluster-ui-preview,Clusters UI changes and cluster access modes,Clusters UI changes and cluster access modes - Azure Databricks,Understand Databricks cluster UI changes and access modes,"Learn about the new updates available in the Clusters UI for access mode, cluster mode, and credential passthrough.","Warning This article has been archived and may no longer reflect the current state of the product. For information about cluster configuration, see Compute configuration reference. 
A new clusters user interface is available with the following changes: For more details on configuring compute for Unity Catalog, see Access modes. Important By default, the clusters editor uses the New UI. To toggle between the New UI and the Legacy UI, click the Preview button at the top of the page and click the settin",2024-05-03T08:00:00.000Z,archived,configuration,0.7,True,"Explains new vs legacy cluster UI and access mode configuration, including product-specific compute configuration behavior.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/archive/compute/configure,Create cluster UI (legacy),Configure compute (legacy) - Azure Databricks,Configure legacy Azure Databricks cluster settings,"Learn how to configure Azure Databricks clusters, including cluster mode, runtime, instance types, size, pools, autoscaling preferences, termination schedule, Apache Spark options, custom tags, log de","Note These are instructions for the legacy create cluster UI, and are included only for historical accuracy. All customers should be using the updated create cluster UI. This article explains the configuration options available when you create and edit Azure Databricks clusters. It focuses on creating and editing clusters using the UI. For other methods, see the Databricks CLI, the Clusters API, and Databricks Terraform provider. 
For help deciding what combination of configuration options suits your",2026-04-15T08:00:00.000Z,archived,configuration,0.8,True,"Details many cluster configuration options (modes, autoscaling, termination, Spark options, tags, logging) with specific setting names and behaviors for the legacy UI, which are product-specific configuration details.",updated
+https://learn.microsoft.com/en-us/azure/databricks/archive/compute/configure,Create cluster UI (legacy),Configure compute (legacy) - Azure Databricks,Configure legacy Azure Databricks cluster settings,"Learn how to configure Azure Databricks clusters, including cluster mode, runtime, instance types, size, pools, autoscaling preferences, termination schedule, Apache Spark options, custom tags, log de","Note These are instructions for the legacy create cluster UI, and are included only for historical accuracy. All customers should be using the updated create cluster UI. This article explains the configuration options available when you create and edit Azure Databricks clusters. It focuses on creating and editing clusters using the UI. For other methods, see the Databricks CLI, the Clusters API, and Databricks Terraform provider. For help deciding what combination of configuration options suits your",2026-04-15T08:00:00.000Z,archived,configuration,0.8,True,"Details many cluster configuration options (modes, autoscaling, termination, Spark options, tags, logging) with specific setting names and behaviors for the legacy UI, which are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/archive/compute/libraries-init-scripts,Install a library with an init script,Install a library with an init script (legacy) - Azure Databricks,Migrate Databricks library installs from init scripts,Learn how to install libraries with init scripts,"Note Databricks recommends defining libraries using cluster policies to provide custom configurations for compute environments. 
See Add libraries to a policy. Migrate all libraries installed with init scripts to use this pattern, because installing libraries using init scripts can lead to unexpected errors, such as ""Module not found"" failures during job execution. Important This documentation has been retired and might not be updated. Databricks no longer recommends using init scripts to install ",2025-04-17T18:19:00.000Z,archived,best-practices,0.65,True,"Contains Databricks-specific guidance on why installing libraries via init scripts causes issues (for example, 'Module not found' during jobs) and recommends alternative patterns, which are concrete product-specific gotchas.",unchanged https://learn.microsoft.com/en-us/azure/databricks/archive/compute/policies-best-practices,Best practices for cluster policies,Best practices: Compute policies - Azure Databricks,Apply best practices for Databricks compute policies,Learn best practices when using Azure Databricks compute policies.,"Warning This article has been archived and may no longer reflect the current state of the product. For information about compute policies, see Create and manage compute policies. Azure Databricks compute policies provide administrators control over the creation of compute resources in an Azure Databricks workspace. 
Effective use of compute policies allows administrators to: Combined with effective onboarding, approval, and chargeback processes, compute policies can be a foundational component in Az",2026-01-24T08:00:00.000Z,archived,best-practices,0.7,True,Explicitly a best-practices article for compute policies; these typically include concrete DO/DON'T guidance and product-specific recommendations for controlling compute creation and governance.,unchanged https://learn.microsoft.com/en-us/azure/databricks/archive/connectors/amazon-redshift,Query Amazon Redshift using Databricks,Query Amazon Redshift using Azure Databricks - Azure Databricks,Use Azure Databricks connector for Amazon Redshift,Learn how to read and write data to Amazon Redshift on Azure Databricks.,"You can read and write tables from Amazon Redshift with Azure Databricks. Important The legacy query federation documentation has been retired and might not be updated. The configurations mentioned in this content are not officially endorsed or tested by Databricks. If Lakehouse Federation supports your source database, Databricks recommends using that instead. The Databricks Redshift data source uses Amazon S3 to efficiently transfer data in and out of Redshift and uses JDBC to automatically trig",2026-02-11T08:00:00.000Z,archived,integrations,0.72,True,"Describes Databricks Redshift data source using S3 and JDBC, which usually involves specific connector options, parameter names, and configuration patterns unique to this integration. 
That aligns with integrations & coding patterns rather than generic concepts.",unchanged @@ -158,19 +158,19 @@ https://learn.microsoft.com/en-us/azure/databricks/archive/connectors/synapse-an https://learn.microsoft.com/en-us/azure/databricks/archive/credential-passthrough/,Credential passthrough (legacy),Credential passthrough (legacy) - Azure Databricks,Configure legacy credential passthrough security in Databricks,Learn how to use credential passthrough to enable secure access to storage.,Important This documentation has been retired and might not be updated. Credential passthrough is deprecated starting with Databricks Runtime 15.0 and will be removed in future Databricks Runtime versions. Databricks recommends that you upgrade to Unity Catalog. Unity Catalog simplifies security and governance of your data by providing a central place to administer and audit data access across multiple workspaces in your account. See What is Unity Catalog?. Credential passthrough allows you to au,2024-08-01T21:10:00.000Z,archived,security,0.7,True,"Describes how credential passthrough works, its deprecation, and how it secures storage access in Databricks with product-specific security behavior and configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/archive/credential-passthrough/adls-passthrough,Authenticate to ADLS using Microsoft Entra ID (formerly Azure AD) credentials,Access Azure Data Lake Storage using Microsoft Entra ID credential passthrough (legacy) - Azure Databricks,Secure ADLS access with Entra ID passthrough in Databricks,Learn how to use passthrough authentication to read and write data to Azure Data Lake Storage using Azure Databricks.,Important This documentation has been retired and might not be updated. Credential passthrough is deprecated starting with Databricks Runtime 15.0 and will be removed in future Databricks Runtime versions. Databricks recommends that you upgrade to Unity Catalog. 
Unity Catalog simplifies security and governance of your data by providing a central place to administer and audit data access across multiple workspaces in your account. See What is Unity Catalog?. For heightened security and governance ,2026-02-11T08:00:00.000Z,archived,security,0.75,True,"Covers using Microsoft Entra ID credential passthrough to access ADLS from Databricks; such docs normally include authentication configuration details and security-specific settings, which are product-specific security knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/,Legacy Databricks CLI overview,Legacy Databricks CLI - Azure Databricks,Install and configure legacy Databricks CLI,Learn how to install and configure your environment to run the Databricks command-line interface (Databricks CLI).,"Important This documentation has been retired and might not be updated. Databricks recommends that you use Databricks CLI version 0.205 or above instead of the legacy Databricks CLI version 0.18 or below. Databricks CLI version 0.18 or below is not supported by Databricks. For information about Databricks CLI versions 0.205 and above, see What is the Databricks CLI?. To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. The leg",2026-01-21T08:00:00.000Z,archived,configuration,0.8,True,"Explains how to install and configure the legacy Databricks CLI; CLI docs usually list command options, environment variables, and config parameters, which are detailed configuration knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/cluster-policies-cli,Cluster policies,Cluster Policies CLI (legacy) - Azure Databricks,Use legacy Databricks Cluster Policies CLI commands,Learn how to use the Databricks cluster-policies command-line interface.,"Important This documentation has been retired and might not be updated. 
This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use newer Databricks CLI version 0.205 or above instead. See Databricks CLI. To find your version of the Databricks CLI, run databricks -v. To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. Only workspace admin users can create, edit, and delete poli",2026-04-14T21:45:00.000Z,archived,configuration,0.8,True,"Documents product-specific CLI subcommands, flags, and usage patterns for managing cluster policies; these are concrete configuration interfaces unique to Databricks.",updated
+https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/cluster-policies-cli,Cluster policies,Cluster Policies CLI (legacy) - Azure Databricks,Use legacy Databricks Cluster Policies CLI commands,Learn how to use the Databricks cluster-policies command-line interface.,"Important This documentation has been retired and might not be updated. This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use newer Databricks CLI version 0.205 or above instead. See Databricks CLI. To find your version of the Databricks CLI, run databricks -v. To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. 
Only workspace admin users can create, edit, and delete poli",2026-04-14T21:45:00.000Z,archived,configuration,0.8,True,"Documents product-specific CLI subcommands, flags, and usage patterns for managing cluster policies; these are concrete configuration interfaces unique to Databricks.",unchanged https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/clusters-cli,Clusters,Clusters CLI (legacy) - Azure Databricks,Use legacy Databricks clusters CLI commands,Learn how to use the Databricks clusters command-line interface.,"Important This documentation has been retired and might not be updated. This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use newer Databricks CLI version 0.205 or above instead. See What is the Databricks CLI?. To find your version of the Databricks CLI, run databricks -v. To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. You run Databricks clusters CLI subcommands by ",2024-03-01T08:00:00.000Z,archived,integrations,0.8,True,"Documents specific clusters CLI subcommands, parameters, and usage patterns for Databricks CLI ≤0.18, which are product-specific API/CLI integration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/dbfs-cli,DBFS,DBFS CLI (legacy) - Azure Databricks,Use legacy Databricks DBFS CLI commands,Learn how to use the Databricks DBFS command-line interface.,"Important This documentation has been retired and might not be updated. This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use newer Databricks CLI version 0.205 or above instead. See What is the Databricks CLI?. To find your version of the Databricks CLI, run databricks -v. To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. 
You run Databricks DBFS CLI subcommands appendi",2026-01-21T08:00:00.000Z,archived,configuration,0.8,True,"Describes running DBFS CLI subcommands; such pages typically enumerate command names, flags, and behaviors, which are product-specific configuration/command reference details.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/dlt-cli,DLT,Lakeflow Spark Declarative Pipelines CLI (legacy) - Azure Databricks,Use legacy Lakeflow Spark Declarative Pipelines CLI,Learn how to use the Databricks Lakeflow Spark Declarative Pipelines command-line interface.,"Important This documentation has been retired and might not be updated. This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use newer Databricks CLI version 0.205 or above instead. See Databricks CLI. To find your version of the Databricks CLI, run databricks -v. To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. You run Databricks Lakeflow Spark Declarative Pipelines CLI ",2026-04-14T21:45:00.000Z,archived,configuration,0.8,True,"Provides Databricks-specific CLI commands, parameters, and usage for Lakeflow Spark Declarative Pipelines, which are concrete configuration/invocation details not generally known.",updated
-https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/groups-cli,Groups,Groups CLI (legacy) - Azure Databricks,Manage Databricks groups with legacy CLI,Learn how to use the Databricks groups command-line interface.,"Important This documentation has been retired and might not be updated. This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use newer Databricks CLI version 0.205 or above instead. See Databricks CLI. To find your version of the Databricks CLI, run databricks -v. 
To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. Note You run Databricks groups CLI subcommands by appending ",2026-04-14T21:45:00.000Z,archived,configuration,0.8,True,"Describes Databricks groups CLI subcommands and syntax, including how to manage groups via CLI; these are product-specific configuration/management commands.",updated
-https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/instance-pools-cli,Instance pools,Instance Pools CLI (legacy) - Azure Databricks,Manage Databricks instance pools with legacy CLI,Learn how to use the Databricks pools command-line interface.,"Important This documentation has been retired and might not be updated. This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use newer Databricks CLI version 0.205 or above instead. See Databricks CLI. To find your version of the Databricks CLI, run databricks -v. To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. Note The pools CLI requires Databricks CLI 0.9.0 or above. Y",2026-04-14T21:45:00.000Z,archived,configuration,0.8,True,"Covers Databricks pools CLI usage, including required CLI version and specific subcommands/parameters for configuring instance pools, which are detailed configuration interfaces.",updated
+https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/dlt-cli,DLT,Lakeflow Spark Declarative Pipelines CLI (legacy) - Azure Databricks,Use legacy Lakeflow Spark Declarative Pipelines CLI,Learn how to use the Databricks Lakeflow Spark Declarative Pipelines command-line interface.,"Important This documentation has been retired and might not be updated. This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use newer Databricks CLI version 0.205 or above instead. See Databricks CLI. 
To find your version of the Databricks CLI, run databricks -v. To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. You run Databricks Lakeflow Spark Declarative Pipelines CLI ",2026-04-14T21:45:00.000Z,archived,configuration,0.8,True,"Provides Databricks-specific CLI commands, parameters, and usage for Lakeflow Spark Declarative Pipelines, which are concrete configuration/invocation details not generally known.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/groups-cli,Groups,Groups CLI (legacy) - Azure Databricks,Manage Databricks groups with legacy CLI,Learn how to use the Databricks groups command-line interface.,"Important This documentation has been retired and might not be updated. This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use newer Databricks CLI version 0.205 or above instead. See Databricks CLI. To find your version of the Databricks CLI, run databricks -v. To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. Note You run Databricks groups CLI subcommands by appending ",2026-04-14T21:45:00.000Z,archived,configuration,0.8,True,"Describes Databricks groups CLI subcommands and syntax, including how to manage groups via CLI; these are product-specific configuration/management commands.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/instance-pools-cli,Instance pools,Instance Pools CLI (legacy) - Azure Databricks,Manage Databricks instance pools with legacy CLI,Learn how to use the Databricks pools command-line interface.,"Important This documentation has been retired and might not be updated. This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use newer Databricks CLI version 0.205 or above instead. See Databricks CLI. 
To find your version of the Databricks CLI, run databricks -v. To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. Note The pools CLI requires Databricks CLI 0.9.0 or above. Y",2026-04-14T21:45:00.000Z,archived,configuration,0.8,True,"Covers Databricks pools CLI usage, including required CLI version and specific subcommands/parameters for configuring instance pools, which are detailed configuration interfaces.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/jobs-cli,Jobs,Jobs CLI (legacy) - Azure Databricks,Run and manage Databricks jobs with legacy CLI,Learn how to use the Databricks jobs command-line interface.,"Important This documentation has been retired and might not be updated. This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use newer Databricks CLI version 0.205 or above instead. See What is the Databricks CLI?. To find your version of the Databricks CLI, run databricks -v. To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. You run Databricks jobs CLI subcommands by appe",2024-12-13T08:00:00.000Z,archived,integrations,0.8,True,Jobs CLI page documents job-related CLI subcommands and parameters unique to Databricks.,unchanged
-https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/libraries-cli,Libraries,Libraries CLI (legacy) - Azure Databricks,Manage Databricks libraries with legacy CLI,Learn how to use the Databricks libraries command-line interface.,"Important This documentation has been retired and might not be updated. This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use newer Databricks CLI version 0.205 or above instead. See Databricks CLI. To find your version of the Databricks CLI, run databricks -v. 
To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. You run Databricks libraries CLI subcommands by appending th",2026-04-14T21:45:00.000Z,archived,configuration,0.8,True,"Lists Databricks libraries CLI subcommands and their usage, providing concrete command names and options that are specific to this product’s configuration surface.",updated
+https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/libraries-cli,Libraries,Libraries CLI (legacy) - Azure Databricks,Manage Databricks libraries with legacy CLI,Learn how to use the Databricks libraries command-line interface.,"Important This documentation has been retired and might not be updated. This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use newer Databricks CLI version 0.205 or above instead. See Databricks CLI. To find your version of the Databricks CLI, run databricks -v. To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. You run Databricks libraries CLI subcommands by appending th",2026-04-14T21:45:00.000Z,archived,configuration,0.8,True,"Lists Databricks libraries CLI subcommands and their usage, providing concrete command names and options that are specific to this product’s configuration surface.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/repos-cli,Repos,Repos CLI (legacy) - Azure Databricks,Work with Databricks repos via the legacy CLI,Learn how to use the Databricks repos command-line interface.,"Important This documentation has been retired and might not be updated. This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use newer Databricks CLI version 0.205 or above instead. See What is the Databricks CLI?. To find your version of the Databricks CLI, run databricks -v. 
To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. Note The Repos CLI requires Databricks CLI 0.15",2024-03-18T08:00:00.000Z,archived,integrations,0.8,True,"Repos CLI page describes commands and required CLI versions for interacting with Databricks repos, a product-specific integration.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/runs-cli,Runs,Runs CLI (legacy) - Azure Databricks,Manage Databricks job runs with the legacy CLI,Learn how to use the Databricks runs command-line interface.,"Important This documentation has been retired and might not be updated. This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use newer Databricks CLI version 0.205 or above instead. See What is the Databricks CLI?. To find your version of the Databricks CLI, run databricks -v. To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. You run Databricks job runs CLI subcommands by ",2024-12-13T08:00:00.000Z,archived,integrations,0.8,True,"Runs CLI documentation includes detailed subcommands and usage for job runs, which are Databricks-specific API/CLI patterns.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/secrets-cli,Secrets,Secrets CLI (legacy) - Azure Databricks,Manage Databricks secrets with the legacy CLI,Learn how to use the Databricks secrets command-line interface.,"Important This documentation has been retired and might not be updated. This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use newer Databricks CLI version 0.205 or above instead. See What is the Databricks CLI?. To find your version of the Databricks CLI, run databricks -v. To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. 
You run Databricks secrets CLI subcommands by a",2024-11-21T08:00:00.000Z,archived,security,0.8,True,"Secrets CLI page documents commands for secret scopes and values, which are product-specific security configuration operations.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/stack-cli,Stack,Stack CLI (legacy) - Azure Databricks,Deploy Databricks stacks using the legacy Stack CLI,Learn how to use the Databricks stack command-line interface.,"Important This documentation has been retired and might not be updated. This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use newer Databricks CLI version 0.205 or above instead. See What is the Databricks CLI?. To find your version of the Databricks CLI, run databricks -v. To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. Databricks CLI versions 0.205 and above do not ",2024-03-01T08:00:00.000Z,archived,deployment,0.7,True,Stack CLI is used to deploy and manage stacks of Databricks resources; the page documents product-specific deployment commands and constraints.,unchanged
-https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/tokens-cli,Tokens,Tokens CLI (legacy) - Azure Databricks,Manage Databricks personal access tokens with legacy CLI,Learn how to use the Databricks tokens command-line interface.,"Important This documentation has been retired and might not be updated. This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use newer Databricks CLI version 0.205 or above instead. See Databricks CLI. To find your version of the Databricks CLI, run databricks -v. To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. 
You run Databricks Tokens CLI subcommands by appending them ",2026-04-14T21:45:00.000Z,archived,security,0.7,True,"Documents Databricks Tokens CLI subcommands and usage for creating and managing tokens, which are product-specific security/auth configuration operations.",updated
+https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/tokens-cli,Tokens,Tokens CLI (legacy) - Azure Databricks,Manage Databricks personal access tokens with legacy CLI,Learn how to use the Databricks tokens command-line interface.,"Important This documentation has been retired and might not be updated. This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use newer Databricks CLI version 0.205 or above instead. See Databricks CLI. To find your version of the Databricks CLI, run databricks -v. To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. You run Databricks Tokens CLI subcommands by appending them ",2026-04-14T21:45:00.000Z,archived,security,0.7,True,"Documents Databricks Tokens CLI subcommands and usage for creating and managing tokens, which are product-specific security/auth configuration operations.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/unity-catalog-cli,Unity Catalog,Unity Catalog CLI (legacy) - Azure Databricks,Manage Unity Catalog with legacy Databricks CLI,Learn how to use the Azure Databricks Unity Catalog command-line interface (CLI).,"Important This documentation has been retired and might not be updated. This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use newer Databricks CLI version 0.205 or above instead. See What is the Databricks CLI?. To find your version of the Databricks CLI, run databricks -v. To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. 
Note The Unity Catalog CLI is Experimental. The ",2026-03-26T08:00:00.000Z,archived,configuration,0.8,True,"Describes the Unity Catalog CLI, including that it is experimental; CLI docs typically provide specific command and option references, which are detailed configuration/management knowledge.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/workspace-cli,Workspace,Workspace CLI (legacy) - Azure Databricks,Manage Databricks workspace objects with legacy CLI,Learn how to use the Databricks workspace command-line interface.,"Important This documentation has been retired and might not be updated. This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use newer Databricks CLI version 0.205 or above instead. See What is the Databricks CLI?. To find your version of the Databricks CLI, run databricks -v. To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration. You run Databricks workspace CLI subcommands by",2024-03-18T08:00:00.000Z,archived,integrations,0.8,True,"Workspace CLI documentation provides commands and parameters for managing workspace items, which are Databricks-specific integration details.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/dbutils-library,dbutils.library,Library utility (dbutils.library) (legacy) - Azure Databricks,Use legacy dbutils.library utilities in Databricks,"Understand and learn how to use Databricks Utilities to work with object storage, to chain and parameterize notebooks, and to work with secrets.","Note dbutils.library.install and dbutils.library.installPyPI APIs are removed in Databricks Runtime 11.0 and above. Most library utility commands are deprecated. Most library utilities are not available on Databricks Runtime ML. For information on dbutils.library.restartPython, see Restart the Python process on Azure Databricks. 
This documentation has been retired and might not be updated. The products, services, or technologies mentioned in this content are no longer supported. Databricks strongly r",2024-04-03T08:00:00.000Z,archived,configuration,0.7,True,"Describes specific dbutils.library APIs, their deprecation, and runtime availability, which are detailed product-specific configuration and coding patterns.",unchanged
@@ -250,13 +250,13 @@ https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-vie
https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/advanced-techniques,Advanced techniques,Advanced techniques for metric views - Azure Databricks,Implement advanced metric view calculations in Databricks,Learn advanced techniques for metric views including window measures and composability.,"Advanced techniques for metric views enable sophisticated calculations such as moving averages, period-over-period changes, running totals, and complex derived KPIs while maintaining consistency and reusability across your semantic layer. This page explains how to use window measures for time-series analysis and composability for building complex metrics from simpler measures. This page assumes familiarity with basic metric view modeling concepts. See Model metric views. 
Note The examples on this",2026-04-09T22:01:00.000Z,concept-article,integrations,0.65,True,"Describes product-specific syntax and patterns (window measures, composability) for Databricks metric views, including concrete YAML/code expressions and behaviors that are unique to this feature, fitting integrations & coding patterns more than generic concepts.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/basic-modeling,Model metric views,Model metric views - Azure Databricks,Model business data with Unity Catalog metric views,Learn how to model data with Unity Catalog metric views.,"Metric views create a semantic layer for your data, transforming tables and views into standardized business metrics. They define what to measure, how to aggregate it, and how to segment it. Metric views ensure that every user across the organization reports the same value for the same Key Performance Indicator (KPI), eliminating inconsistent reporting and enabling flexible analysis across any dimensions. For a full example with joins, dimensions, measures, and agent metadata, see Tutorial: Build",2026-04-09T22:01:00.000Z,concept-article,best-practices,0.6,True,"Focuses on how to model data with metric views; likely includes concrete modeling patterns and product-specific recommendations for measures, dimensions, and KPIs beyond generic modeling theory.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/create-edit,Create and edit metric views,Create and edit metric views - Azure Databricks,Create and edit Unity Catalog metric views,Learn how to create and edit metric views using the Catalog Explorer UI or SQL.,"This page explains how to create and edit metric views using the Catalog Explorer UI or SQL. The Catalog Explorer UI includes a low-code editor and a YAML editor. The low-code UI is a good starting point if you prefer not to write SQL. 
For a more complex example, see Tutorial: Build a complete metric view with joins.",2026-04-09T22:01:00.000Z,how-to,configuration,0.65,True,"Covers creating/editing metric views via UI and SQL; likely includes specific SQL syntax, YAML schema, and field options that constitute product-specific configuration knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/feature-availability,Feature availability,Metric view feature availability - Azure Databricks,Check Databricks metric view feature runtime requirements,Minimum Databricks Runtime and YAML specification version requirements for metric view features.,Metric views were introduced in Databricks Runtime 16.4. The following table lists minimum Databricks Runtime version requirements for specific features added in later versions.,2026-04-17T08:00:00.000Z,feature-availability,limits-quotas,0.8,True,"The page describes minimum Databricks Runtime and YAML specification versions required for specific metric view features, likely in a table mapping features to exact runtime versions. These are precise version constraints (a form of limits/requirements) that an LLM would not reliably know from training.",updated
+https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/feature-availability,Feature availability,Metric view feature availability - Azure Databricks,Check Databricks metric view feature runtime requirements,Minimum Databricks Runtime and YAML specification version requirements for metric view features.,Metric views were introduced in Databricks Runtime 16.4. The following table lists minimum Databricks Runtime version requirements for specific features added in later versions.,2026-04-17T08:00:00.000Z,feature-availability,limits-quotas,0.8,True,"The page describes minimum Databricks Runtime and YAML specification versions required for specific metric view features, likely in a table mapping features to exact runtime versions. 
These are precise version constraints (a form of limits/requirements) that an LLM would not reliably know from training.",unchanged https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/level-of-detail,Use level of detail expressions,Use level of detail (LOD) expressions in metric views - Azure Databricks,Use LOD expressions in Databricks metric views,Use level of detail expressions in metric views to control aggregation granularity independently of the dimensions in your query.,Level of detail (LOD) expressions let you specify the granularity at which aggregations are calculated independently of the dimensions in your query. This page explains how to use LOD expressions in metric views.,2026-04-09T22:01:00.000Z,concept-article,integrations,0.7,True,"Covers Databricks-specific level-of-detail expression syntax and behavior within metric views, with concrete expression patterns and semantics that are unique to this product feature, aligning with integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/manage,Manage metric views,Manage metric views - Azure Databricks,Manage access and lifecycle for metric views,"Learn how to manage metric views, control access, and use Catalog Explorer.","Metric views are registered to Unity Catalog and can be managed using Catalog Explorer. 
Users with at least SELECT permission can view metric view details, and owners can control access, enable collaborative editing, and manage the lifecycle of metric views through the UI or SQL commands.",2026-04-09T22:01:00.000Z,concept-article,security,0.7,True,"Describes controlling access, permissions (e.g., SELECT), ownership, and lifecycle management; this is product-specific security and governance configuration for Unity Catalog metric views.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/materialization,Materialization,Materialization for metric views - Azure Databricks,Design and use materialization for Databricks metric views,Learn how to use materialization to accelerate metric view queries with automatic incremental updates and intelligent query rewrite.,"Important This feature is Experimental. Materialization for metric views accelerates queries by using materialized views to pre-compute aggregations. Lakeflow Spark Declarative Pipelines orchestrates user-defined materialized views for a given metric view, and at query time, the query optimizer intelligently routes queries to the best materialized view using automatic aggregate-aware query matching (query rewriting). 
This approach provides automatic incremental updates without requiring you to de",2026-04-09T22:01:00.000Z,feature-guide,architecture-patterns,0.6,True,"Explains how Databricks metric view materialization works with Lakeflow Spark Declarative Pipelines and aggregate-aware query rewriting, including product-specific behavior and when to use materialized views for performance, which is closer to an architecture/design pattern for this feature.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/query,Query metric views,Query metric views - Azure Databricks,,Learn how to query metric views and consume them in downstream tools.,"You can query metric views like standard views from any SQL editor attached to a SQL warehouse or compute resource running a supported runtime. Metric views support flexible grouping and filtering, so you can analyze measures across any combination of dimensions at runtime without pre-computing every aggregation. The queries on this page demonstrate common query patterns.",2026-04-17T21:49:00.000Z,concept-article,,0.2,False,"The page focuses on how to query metric views and shows common query patterns. This is closer to tutorial/query examples than to configuration, limits, or troubleshooting content with product-specific parameters or error codes. It does not clearly meet any expert-knowledge sub-skill criteria.",updated +https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/query,Query metric views,Query metric views - Azure Databricks,Query Azure Databricks metric views from downstream tools,Learn how to query metric views and consume them in downstream tools.,"You can query metric views like standard views from any SQL editor attached to a SQL warehouse or compute resource running a supported runtime. Metric views support flexible grouping and filtering, so you can analyze measures across any combination of dimensions at runtime without pre-computing every aggregation. 
The queries on this page demonstrate common query patterns.",2026-04-22T08:00:00.000Z,concept-article,integrations,0.62,True,Shows concrete SQL query patterns and how to consume metric views from external tools. This is a product-specific querying/integration pattern rather than generic SQL usage.,updated
https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/tpch-example,Tutorial: Build a complete metric view with joins,Tutorial: Build a complete metric view with joins - Azure Databricks,,"Build a complete sales analytics metric view with joins, dimensions, and measures using the TPC-H dataset for real-world analysis.","This tutorial walks you through building a comprehensive sales analytics metric view using the TPC-H dataset. By the end, you'll have a metric view that: If you're new to metric views, start with Create and edit metric views to learn the basics. This tutorial extends that foundation with real-world complexity.",2026-04-09T22:01:00.000Z,tutorial,,0.3,False,"Tutorial-style walkthrough for building a metric view with joins; primarily step-by-step example usage without configuration tables, limits, or product-specific decision matrices.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/yaml-reference,YAML syntax reference,Metric view YAML syntax reference - Azure Databricks,Reference YAML configuration for Databricks metric views,Learn the syntax and expressions supported in a YAML metric view definition.,"This page explains the complete YAML grammar for metric views. Metric view definitions follow standard YAML notation syntax. For minimum runtime and YAML specification version requirements for each feature, see Metric view feature availability. 
See YAML Specification 1.2.2 documentation to learn more about YAML specifications.",2026-04-10T08:00:00.000Z,reference,configuration,0.85,True,"Provides the full YAML grammar and supported expressions for metric view definitions, including field names and allowed structures, which are product-specific configuration details not inferable from general knowledge.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/yaml-reference,YAML syntax reference,Metric view YAML syntax reference - Azure Databricks,Use YAML syntax to define Azure Databricks metric views,Learn the syntax and expressions supported in a YAML metric view definition.,"This page explains the complete YAML grammar for metric views. Metric view definitions follow standard YAML notation syntax. For minimum runtime and YAML specification version requirements for each feature, see Metric view feature availability. See YAML Specification 1.2.2 documentation to learn more about YAML specifications.",2026-04-23T17:47:00.000Z,reference,configuration,0.78,True,"A full YAML grammar reference for metric views with supported expressions and feature availability is a configuration reference: specific field names, allowed values, and syntax that an LLM would not reliably infer from training.",updated
https://learn.microsoft.com/en-us/azure/databricks/catalog-explorer/,What is Catalog Explorer?,What is Catalog Explorer? - Azure Databricks,,Use Catalog Explorer to discover and manage data and AI assets on Azure Databricks.,"Databricks Catalog Explorer provides a UI to explore and manage data, schemas (databases), tables, models, functions, and other AI assets. 
To open Catalog Explorer, click Catalog in the sidebar.",2026-03-10T23:23:00.000Z,overview,,0.28,False,Introductory description of Catalog Explorer UI; lacks detailed configuration parameters or expert-only behavior.,unchanged
https://learn.microsoft.com/en-us/azure/databricks/catalog-explorer/entity-relationship-diagram,View entity relationships,View the Entity Relationship Diagram - Azure Databricks,Use Catalog Explorer Entity Relationship Diagram in Databricks,View primary key and foreign key relationships between tables as a graph in the Entity Relationship Diagram in Catalog Explorer.,"This article describes how to access the Entity Relationship Diagram (ERD) in Catalog Explorer. The ERD displays the primary key and foreign key relationships between tables in a graph, providing a clear and intuitive representation of how data entities connect. You can access the ERD from Catalog Explorer when viewing any table that contains a foreign key constraint. For more information about primary key and foreign key constraints, see Constraints on Azure Databricks. To access the ERD, do the",2026-02-23T08:00:00.000Z,how-to,configuration,0.63,True,"Describes how to access and interpret the ERD in Catalog Explorer, including conditions (tables with foreign key constraints) and UI/feature behavior that are specific to Databricks.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/catalogs/,Overview,What are catalogs in Azure Databricks? - Azure Databricks,,Learn about catalogs in Azure Databricks and how they work in Unity Catalog.,"A catalog is the primary unit of data organization in the Azure Databricks Unity Catalog data governance model. This article gives an overview of catalogs in Unity Catalog and how best to use them. Catalogs are the first layer in Unity Catalog's three-level namespace (catalog.schema.table-etc). They contain schemas, which in turn can contain tables, views, volumes, models, and functions. 
Catalogs are registered in a Unity Catalog metastore in your Azure Databricks account.",2026-03-26T08:00:00.000Z,overview,,0.35,False,Overview of catalogs in Unity Catalog and how they fit into the namespace; mostly conceptual guidance without detailed configuration tables or security role matrices.,unchanged @@ -271,40 +271,41 @@ https://learn.microsoft.com/en-us/azure/databricks/cheat-sheet/bi-serving-sql-se https://learn.microsoft.com/en-us/azure/databricks/cheat-sheet/compute,Compute creation cheat sheet,Compute creation cheat sheet - Azure Databricks,Apply Azure Databricks compute creation best practices,This article provides an overview of Azure Databricks compute creation best practices.,"This article aims to provide clear and opinionated guidance for compute creation. By using the right compute types for your workflow, you can improve performance and save on costs.",2026-04-06T21:50:00.000Z,best-practice,best-practices,0.7,True,"Aimed at clear, opinionated guidance for compute creation to optimize performance and cost; Databricks-specific recommendations for choosing compute types qualify as product-specific best practices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/cheat-sheet/jobs,Production job scheduling cheat sheet,Production job scheduling cheat sheet - Azure Databricks,Implement Azure Databricks production job scheduling best practices,This article provides an overview of Azure Databricks production job best practices.,"This article aims to provide clear and opinionated guidance for production job scheduling. 
Using best practices can help reduce costs, improve performance, and tighten security.",2026-04-06T21:50:00.000Z,best-practice,best-practices,0.75,True,"Provides opinionated guidance for production job scheduling with Databricks-specific patterns to reduce cost, improve performance, and tighten security, fitting product-focused best practices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/cheat-sheet/power-bi,Power BI cheat sheet,Power BI cheat sheet - Azure Databricks,Best practices for Power BI dashboards on Databricks,This page provides an overview of best practices when creating Power BI dashboards using Databricks data.,"This page provides clear and opinionated guidance for efficiently managing your data in Power BI and Azure Databricks to optimize query performance and create efficient dashboards. For a set of practical quickstarts demonstrating reference implementations of some of the best practices for using Power BI on Azure Databricks, see this repository.",2026-02-17T23:30:00.000Z,best-practice,best-practices,0.84,True,"Explicit best-practices cheat sheet with opinionated, Databricks-specific guidance to optimize query performance and dashboard efficiency.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/clean-rooms/,Clean rooms overview,What is Azure Databricks Clean Rooms? 
- Azure Databricks,,"Learn how to use Clean Rooms, a Databricks feature that provides a secure and privacy-protecting environment where multiple parties can work together on sensitive enterprise data without direct access","This page introduces Clean Rooms, an Azure Databricks feature that uses Delta Sharing and serverless compute to provide a secure and privacy-protecting environment where multiple parties can work together on sensitive enterprise data without direct access to each other's data.",2025-10-27T08:00:00.000Z,concept-article,,0.35,False,Introductory overview of Clean Rooms; primarily conceptual without detailed configuration parameters or limits.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/clean-rooms/,Clean rooms overview,What is Azure Databricks Clean Rooms? - Azure Databricks,,"Learn how to use Clean Rooms, a Databricks feature that provides a secure and privacy-protecting environment where multiple parties can work together on sensitive enterprise data without direct access","This page introduces Clean Rooms, an Azure Databricks feature that uses Delta Sharing and serverless compute to provide a secure and privacy-protecting environment where multiple parties can work together on sensitive enterprise data without direct access to each other's data.",2026-04-21T08:00:00.000Z,concept-article,,0.2,False,"High-level feature introduction for Databricks Clean Rooms without detailed limits, configuration parameters, or troubleshooting mappings.",updated https://learn.microsoft.com/en-us/azure/databricks/clean-rooms/clean-room-collaborator,Work with clean rooms as a collaborator,Work with Azure Databricks clean rooms as an invited collaborator - Azure Databricks,Collaborate as an invited Clean Rooms user,Learn how to join a Databricks clean room created by another collaborator and work in that clean room. 
Databricks Clean Rooms provides a secure and privacy-protecting environment where multiple partie,"This page describes how to join and work in an Azure Databricks clean room as an invited collaborator. A clean room can have ten total collaborators. Databricks Clean Rooms provides a secure and privacy-protecting environment where multiple parties can work together on sensitive enterprise data without direct access to each other's data. For more information, see What is Azure Databricks Clean Rooms?. When another collaborator creates a clean room so that you can work with them on a shared data p",2025-12-18T08:00:00.000Z,concept-article,limits-quotas,0.7,True,States a specific numeric limit: a clean room can have ten total collaborators; this is an explicit quota plus collaborator workflow details.,unchanged https://learn.microsoft.com/en-us/azure/databricks/clean-rooms/clean-room-notebook,Run notebooks in clean rooms,Run notebooks in clean rooms - Azure Databricks,Run notebooks securely in clean rooms,Learn how to run notebooks in a clean room. Clean Rooms are a Databricks feature that provides a secure and privacy-protecting environment where multiple parties can work together on sensitive enterpr,"This page describes how to run notebooks in clean rooms. Notebooks are the interface that collaborators use to run data analysis in collaboration. 
To learn how to add a notebook to a clean room, see Create clean rooms.",2025-12-18T08:00:00.000Z,concept-article,configuration,0.6,True,"Describes how notebooks operate inside clean rooms; likely includes execution constraints, permissions, and environment behaviors unique to Clean Rooms.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/clean-rooms/create-clean-room,Create a clean room,Create clean rooms - Azure Databricks,Create Azure Databricks clean rooms for secure collaboration,Learn how to create a clean room in Databricks to provide a secure and privacy-protecting environment where multiple parties can share sensitive enterprise data and collaborate without direct access t,"This page describes how to create a Databricks clean room using the UI. A clean room is a secure environment for collaborative data analysis. Key features and limitations: To use the REST API, see Create a clean room.",2026-01-24T08:00:00.000Z,concept-article,configuration,0.7,True,Describes creating clean rooms via UI and notes key features and limitations; includes product-specific configuration steps and constraints.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/clean-rooms/create-clean-room,Create a clean room,Create clean rooms - Azure Databricks,,Learn how to create a clean room in Databricks to provide a secure and privacy-protecting environment where multiple parties can share sensitive enterprise data and collaborate without direct access t,"This page describes how to create a Databricks clean room using the UI. A clean room is a secure environment for collaborative data analysis. 
Key features and limitations: To use the REST API, see Create a clean room.",2026-04-21T08:00:00.000Z,concept-article,,0.3,False,"UI-based how-to for creating a clean room; appears procedural without detailed config tables, limits, or product-specific best-practice guidance.",updated https://learn.microsoft.com/en-us/azure/databricks/clean-rooms/manage-clean-room,Manage clean rooms,Manage clean rooms - Azure Databricks,Manage and monitor Azure Databricks clean rooms,"Learn how to update, monitor, and delete clean rooms. Clean rooms are a Databricks feature that provides a secure and privacy-protecting environment where multiple parties can work together on sensiti","This page describes how to manage clean rooms, including how to: These tasks can be performed by all collaborators in a clean room.",2026-01-24T08:00:00.000Z,concept-article,configuration,0.7,True,"Explains how collaborators update, monitor, and delete clean rooms; contains product-specific management operations and behaviors.",unchanged https://learn.microsoft.com/en-us/azure/databricks/clean-rooms/output-tables,Work with output tables,Create and work with output tables in Databricks Clean Rooms - Azure Databricks,Create and use Clean Rooms output tables,"Learn how to create Clean Rooms notebooks that share output tables, and learn how to access output tables as a collaborator who runs such notebooks in a clean room. Clean Rooms are a Databricks featur","This page introduces output tables, which are temporary read-only tables generated by a notebook run and shared to the notebook runner's Unity Catalog metastore. 
This article describes how to use a notebook to create output tables and how runners can read these output tables in their Unity Catalog metastore.",2025-12-18T08:00:00.000Z,concept-article,configuration,0.7,True,Defines output tables as temporary read-only tables and explains how to create and access them; includes product-specific table behavior and lifecycle semantics.,unchanged https://learn.microsoft.com/en-us/azure/databricks/comments/,Overview,Add comments to data and AI assets - Azure Databricks,Add and manage comments on Unity Catalog assets,Learn how to add comments to Azure Databricks securable objects to assist with data discoverability.,"This article introduces comments for data and AI assets and explains how to add them. Comments can help you and other users find and manage the data and AI assets you need. Comments provide a metadata field for annotating your securable objects. You can add comments to any securable object in Unity Catalog, such as catalogs, schemas, tables, volumes, AI models, and more. You can also add comments to table columns. After adding comments to a securable object, any user with BROWSE privileges can vie",2026-03-03T08:00:00.000Z,how-to,configuration,0.6,True,Shows how to set and view comments on securable objects with specific commands and permission behavior; product-specific metadata configuration.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/comments/ai-comments,Add AI-generated comments,Add AI-generated comments to Unity Catalog objects - Azure Databricks,Use AI-generated comments for Unity Catalog objects,Learn how to add AI-generated comments to Unity Catalog objects and table columns in Catalog Explorer.,"This article introduces AI-generated Unity Catalog object and table column comments (also known as AI-generated documentation), explains how they work, and shows how to add and edit them. Important Saving comments triggers an ALTER SQL command, which can disrupt Azure Databricks pipelines and jobs. 
For details about the AI behind AI-generated comments, see Databricks AI assistive features trust and safety.",2026-02-26T08:00:00.000Z,how-to,configuration,0.7,True,"Describes specific behavior (ALTER commands triggered, impact on pipelines) and concrete steps to configure AI-generated comments.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/comments/ai-comments,Add AI-generated comments,Add AI-generated comments to Unity Catalog objects - Azure Databricks,,Learn how to add AI-generated comments to Unity Catalog objects and table columns in Catalog Explorer.,"This page introduces AI-generated Unity Catalog object and table column comments (also known as AI-generated documentation), explains how they work, and shows how to add and edit them. Important Saving comments triggers an ALTER SQL command, which can disrupt Azure Databricks pipelines and jobs. For details about the AI behind AI-generated comments, see Databricks AI assistive features trust and safety.",2026-04-23T17:47:00.000Z,how-to,,0.3,False,"Page is primarily a feature overview and how-to for AI-generated comments in Unity Catalog. The summary does not indicate presence of numeric limits, configuration parameter tables, error-code-based troubleshooting, or other product-specific expert details as defined by the sub-skill types.",updated https://learn.microsoft.com/en-us/azure/databricks/compute/,Compute overview,Compute - Azure Databricks,,Learn about the types of Azure Databricks compute available in your workspace.,"Azure Databricks compute refers to the selection of computing resources available on Azure Databricks to run your data engineering, data science, and analytics workloads. Choose from serverless compute for on-demand scaling, classic compute for customizable resources, or SQL warehouses for optimized analytics. 
You can view and manage compute resources in the Compute section of your workspace:",2026-04-07T08:00:00.000Z,concept-article,,0.1,False,"High-level overview of compute types (serverless, classic, SQL warehouses) and where to view/manage them; no indication of numeric limits, configuration tables, error codes, or detailed decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/compute/choose-compute,Choose a compute type for your workload,Compute selection recommendations - Azure Databricks,Choose appropriate Azure Databricks compute types,Learn which compute type to use for your workload.,"This page explains how to choose the right compute type for your workload. Depending on your permissions, you might have the option to choose or create a number of different types of compute resources. For guidance on configuring classic compute resources, see Compute configuration recommendations.",2026-04-14T08:00:00.000Z,concept-article,decision-making,0.7,True,"Page is explicitly about compute selection recommendations and likely includes workload-based guidance and trade-offs between Databricks compute options, which is product-specific decision-making beyond generic concepts.",updated +https://learn.microsoft.com/en-us/azure/databricks/compute/choose-compute,Choose a compute type for your workload,Compute selection recommendations - Azure Databricks,Choose appropriate Azure Databricks compute types,Learn which compute type to use for your workload.,"This page explains how to choose the right compute type for your workload. Depending on your permissions, you might have the option to choose or create a number of different types of compute resources. 
For guidance on configuring classic compute resources, see Compute configuration recommendations.",2026-04-14T08:00:00.000Z,concept-article,decision-making,0.7,True,"Page is explicitly about compute selection recommendations and likely includes workload-based guidance and trade-offs between Databricks compute options, which is product-specific decision-making beyond generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/compute/cluster-config-best-practices,Configuration recommendations,Compute configuration recommendations - Azure Databricks,Apply Databricks compute configuration best practices,Learn best practices when selecting and configuring Azure Databricks compute.,"This article includes recommendations and best practices related to compute configuration. If your workload is supported, Databricks recommends using serverless compute rather than configuring your own compute resource. Serverless compute is the simplest and most reliable compute option. It requires no configuration, is always available, and scales according to your workload. Serverless compute is a compute option for notebooks, jobs, and Lakeflow Spark Declarative Pipelines. See Connect to serve",2025-10-07T08:00:00.000Z,concept-article,best-practices,0.78,True,Explicitly a recommendations/best practices article for compute configuration; likely includes concrete product-specific guidance and gotchas beyond generic advice.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/compute/cluster-metrics,Monitor compute,View compute metrics - Azure Databricks,,Learn how to access and read compute metrics in the Databricks UI.,"This article explains how to use the native compute metrics tool in the Azure Databricks UI to gather key hardware and Spark metrics. The metrics UI is available for all-purpose and jobs compute. Metrics are available in almost real-time with a normal delay of less than one minute. 
Metrics are stored in Azure Databricks-managed storage, not in the customer's storage. Serverless compute for notebooks and jobs uses query insights instead of the metrics UI. For more information on serverless comput",2026-03-16T08:00:00.000Z,concept-article,,0.45,False,Explains how to view compute metrics and notes approximate delay; mostly UI usage and conceptual metrics info without detailed config or limits.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/compute/clusters-manage,Manage classic compute,Manage classic compute - Azure Databricks,,"Learn how to manage Azure Databricks compute, including displaying, editing, starting, terminating, deleting, controlling access, and monitoring performance and logs.","This article describes how to manage Azure Databricks compute, including displaying, editing, starting, terminating, deleting, controlling access, and monitoring performance and logs. You can also use the Clusters API to manage compute programmatically.",2025-10-07T08:00:00.000Z,concept-article,,0.4,False,"Management article (start, stop, edit, monitor) appears procedural; summary does not indicate detailed config tables, limits, or error-code mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/compute/configure,Compute configuration,Compute configuration reference - Azure Databricks,Use Databricks compute configuration settings effectively,Learn about the compute configuration settings available in Databricks.,"Note The organization of this article assumes you are using the simple form compute UI. For an overview of the simple form updates, see Use the simple form to manage compute. This article explains the configuration settings available when creating a new all-purpose or job compute resource. Most users create compute resources using their assigned policies, which limits the configurable settings. 
If you don't see a particular setting in your UI, it's because the policy you've selected does not allo",2026-04-15T08:00:00.000Z,concept-article,configuration,0.9,True,"This is a configuration reference for Databricks compute, detailing available settings when creating all-purpose or job compute resources and how policies affect visibility of those settings. It clearly maps to configuration, with product-specific options and constraints.",updated +https://learn.microsoft.com/en-us/azure/databricks/compute/cluster-metrics,Monitor compute,View compute metrics - Azure Databricks,,Learn how to access and read compute metrics in the Databricks UI.,"This article explains how to use the native compute metrics tool in the Azure Databricks UI to gather key hardware and Spark metrics. The metrics UI is available for all-purpose and jobs compute. Metrics are available in almost real-time with a normal delay of less than one minute. Metrics are stored in Azure Databricks-managed storage, not in the customer's storage. Serverless compute for notebooks and jobs uses query insights instead of the metrics UI. For more information on serverless comput",2026-04-24T18:05:00.000Z,concept-article,,0.4,False,"Explains how to view compute metrics in the UI and notes storage location and delay; appears to be a usage guide without detailed configuration parameters, limits, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/compute/clusters-manage,Manage classic compute,Manage classic compute - Azure Databricks,,"Learn how to manage Azure Databricks compute, including displaying, editing, starting, terminating, deleting, controlling access, and monitoring performance and logs.","This article describes how to manage Azure Databricks compute, including displaying, editing, starting, terminating, deleting, controlling access, and monitoring performance and logs. 
You can also use the Clusters API to manage compute programmatically.",2026-04-20T08:00:00.000Z,concept-article,,0.2,False,"Described as a how-to for managing compute (start, stop, edit, monitor); likely procedural UI/API usage without detailed configuration tables, limits, or error-code-based troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/databricks/compute/configure,Compute configuration,Compute configuration reference - Azure Databricks,Configure Azure Databricks compute settings and options,Learn about the compute configuration settings available in Databricks.,"Note The organization of this article assumes you are using the simple form compute UI. For an overview of the simple form updates, see Use the simple form to manage compute. This article explains the configuration settings available when creating a new all-purpose or job compute resource. Most users create compute resources using their assigned policies, which limits the configurable settings. If you don't see a particular setting in your UI, it's because the policy you've selected does not allo",2026-04-22T08:00:00.000Z,concept-article,configuration,0.86,True,A reference for all-purpose and job compute configuration settings; Databricks-specific parameters and options constitute expert configuration knowledge beyond generic cluster setup.,updated https://learn.microsoft.com/en-us/azure/databricks/compute/custom-containers,Use a custom image as your compute environment,Customize containers with Databricks Container Service - Azure Databricks,Configure custom Docker containers for Databricks compute,"Learn how to customize Azure Databricks compute using Docker images, for full control of library customization, environment lockdown, and CI/CD integration.","Databricks Container Services lets you specify a Docker image when you create compute. Some example use cases include: You can also use Docker images to create custom deep learning environments on compute with GPU devices. 
For additional information about using GPU compute with Databricks Container Services, see Databricks Container Services on GPU compute. For tasks to be executed each time the container starts, use an init script.",2026-02-11T08:00:00.000Z,concept-article,configuration,0.7,True,Covers customizing compute with Docker images and init scripts; such content typically includes image configuration parameters and environment setup details specific to Databricks Container Services.,unchanged https://learn.microsoft.com/en-us/azure/databricks/compute/dedicated-limitations,Limitations,Dedicated compute requirements and limitations - Azure Databricks,Understand dedicated compute requirements and limitations,Learn about limitations and requirements for dedicated compute access mode.,"This page outlines requirements and limitations for dedicated compute. Most dedicated compute limitations are runtime-dependent, as feature support has been added over time. Important Init scripts and libraries have different support across access modes and Databricks Runtime versions. See Where can init scripts be installed? and Compute-scoped libraries. Dedicated compute assigned to a group has additional limitations. 
See Group access limitations.",2026-03-31T23:28:00.000Z,concept-article,limits-quotas,0.76,True,"Requirements and limitations page for dedicated compute; typically lists runtime-dependent feature support and constraints, which are expert, product-specific limits.",unchanged https://learn.microsoft.com/en-us/azure/databricks/compute/dedicated-overview,Dedicated compute overview,Dedicated compute overview - Azure Databricks,,Learn about dedicated compute access mode for enhanced security and fine-grained access control.,This page provides an overview of dedicated compute access mode.,2025-10-07T08:00:00.000Z,concept-article,,0.2,False,Dedicated compute overview; primarily conceptual description of access mode and benefits.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/compute/flexible-node-type-instances,Instance type compatibility reference,Flexible node type compatibility reference - Azure Databricks,Reference compatible instance groups for flexible nodes,A reference page of instance type compatibility groups for flexible node types.,"This reference lists the instance type compatibility groups for flexible node types. When you configure a compute resource with flexible node types, Azure Databricks can automatically fall back to compatible instance types if your preferred type is unavailable. Compatible instance types within a group share the same vCPU count, memory (within 100-110%), local disk configuration, CPU architecture, and OS image support. 
For more information about flexible node types, see Improve compute launch reli",2026-03-09T18:23:00.000Z,concept-article,configuration,0.9,True,"A reference list of instance type compatibility groups; effectively a configuration matrix of which instance types can be substituted, which is detailed expert configuration knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/compute/events-api-updates,Cluster events API pagination changes,Cluster events API pagination changes - Azure Databricks,Migrate to token-based Databricks cluster events pagination,Learn about the upcoming deprecation of pagination fields in the `cluster_events` API endpoint and how to migrate to the new token-based pagination.,"To improve API performance, Azure Databricks is updating how pagination works in the Cluster events API. On July 30, 2026, Azure Databricks will deprecate the limit, offset, total_count, and next_page fields in the /api/2.1/clusters/events API endpoint. These fields will be replaced with new token-based pagination fields to improve API performance.",2026-04-22T17:34:00.000Z,concept-article,integrations,0.65,True,"Describes a product-specific API behavior change with concrete field names being deprecated (limit, offset, total_count, next_page) and their replacement with token-based pagination fields on a specific date. This is detailed, time-bound API contract knowledge that an LLM is unlikely to infer from training data and is directly relevant to how clients must integrate with the Databricks cluster_events API.",new +https://learn.microsoft.com/en-us/azure/databricks/compute/flexible-node-type-instances,Instance type compatibility reference,Flexible node type compatibility reference - Azure Databricks,Select compatible instance types for flexible nodes,A reference page of instance type compatibility groups for flexible node types.,"This reference lists the instance type compatibility groups for flexible node types. 
When you configure a compute resource with flexible node types, Azure Databricks can automatically fall back to compatible instance types if your preferred type is unavailable. Compatible instance types within a group share the same vCPU count, memory (within 100-110%), local disk configuration, CPU architecture, and OS image support. For more information about flexible node types, see Improve compute launch reli",2026-04-21T08:00:00.000Z,concept-article,decision-making,0.7,True,"Provides a reference list of instance type compatibility groups for flexible node types; this is detailed, product-specific guidance for choosing fallback instances, which supports deployment/selection decisions.",updated https://learn.microsoft.com/en-us/azure/databricks/compute/flexible-node-types,Flexible node types overview,Improve compute launch reliability using flexible node types - Azure Databricks,Use flexible node types for reliable Databricks compute,Use flexible node types to improve classic compute launch reliability and reduce compute costs through intelligent instance type fallback.,"Classic compute resources in Azure Databricks use flexible node types, which allows your compute resource to fall back to alternative, compatible instance types when your specified instance type is unavailable. This behavior improves compute launch reliability by reducing capacity failures (stockout errors) during compute launches. 
For spot instances with fallback, flexible node types can attempt to acquire instances multiple times across different instance types before falling back to on-demand",2026-03-09T18:23:00.000Z,concept-article,best-practices,0.7,True,"Describes behavior of flexible node types, fallback logic, and stockout handling; these are product-specific operational recommendations and patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/compute/gpu,Create GPU-enabled compute,GPU-enabled compute - Azure Databricks,Decide when and how to use GPU Databricks compute,"Learn about GPU-enabled Azure Databricks compute, when to use them, what they require, and how to create them.",Note Some GPU-enabled instance types are in Beta and are marked as such in the drop-down list when you select the driver and worker types during compute creation.,2026-02-03T08:00:00.000Z,concept-article,decision-making,0.65,True,"Covers when to use GPU-enabled compute, requirements, and how to create them; this is service-specific selection guidance between GPU and non-GPU compute.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/compute/group-access,Assign dedicated compute to a group,Dedicated compute group access - Azure Databricks,Configure dedicated group access for Databricks compute,Learn how to create and use compute resources assigned to a group.,"This article explains how to create a compute resource assigned to a group using the Dedicated access mode. Dedicated group access mode allows users to get the operational efficiency of a standard access mode cluster while also securely supporting languages and workloads that are not supported by standard access mode, such as Databricks Runtime for ML, RDD APIs, and R.",2026-04-16T08:00:00.000Z,concept-article,security,0.68,True,"The page explains how to create compute resources assigned to a group using Dedicated access mode, which is about access control and secure support for certain workloads. 
This involves product-specific access configuration and maps best to security (identity/access management).",updated +https://learn.microsoft.com/en-us/azure/databricks/compute/group-access,Assign dedicated compute to a group,Dedicated compute group access - Azure Databricks,Configure dedicated group access for Databricks compute,Learn how to create and use compute resources assigned to a group.,"This article explains how to create a compute resource assigned to a group using the Dedicated access mode. Dedicated group access mode allows users to get the operational efficiency of a standard access mode cluster while also securely supporting languages and workloads that are not supported by standard access mode, such as Databricks Runtime for ML, RDD APIs, and R.",2026-04-16T08:00:00.000Z,concept-article,security,0.68,True,"The page explains how to create compute resources assigned to a group using Dedicated access mode, which is about access control and secure support for certain workloads. This involves product-specific access configuration and maps best to security (identity/access management).",unchanged https://learn.microsoft.com/en-us/azure/databricks/compute/lakeguard,How Lakeguard enforces user isolation,How does Databricks enforce user isolation? - Azure Databricks,Understand Databricks Lakeguard user isolation model,Learn how Unity Catalog and Lakeguard enforce data governance and user isolation on Databricks.,This page explains how Azure Databricks uses Lakeguard to enforce user isolation in shared compute environments and fine-grained access control in dedicated compute.,2025-11-06T23:11:00.000Z,concept-article,security,0.78,True,Details how Unity Catalog and Lakeguard enforce user isolation and fine-grained access control in different compute modes; includes product-specific security behavior and configuration concepts.,unchanged https://learn.microsoft.com/en-us/azure/databricks/compute/photon,How Photon improves query performance,What is Photon? 
- Azure Databricks,,"Learn about Photon, the Azure Databricks native vectorized query engine that runs SQL workloads faster and reduces your total cost per workload.","This article explains the benefits of running your workloads on the Photon query engine. Photon is a high-performance Azure Databricks-native vectorized query engine that runs your SQL workloads and DataFrame API calls faster to reduce your total cost per workload. Photon is compatible with Apache Spark APIs, so it works with your existing code.",2025-01-14T08:00:00.000Z,concept-article,,0.35,False,"High-level explanation of Photon benefits and compatibility; lacks detailed configuration parameters, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/compute/pool-best-practices,Pool best practices,Pool best practices - Azure Databricks,Apply best practices for Databricks pools,Learn best practices when configuring and using Azure Databricks pools.,"This article explains what pools are, and how you can best configure them. For information on creating a pool, see Pool configuration reference. Note If your workload supports serverless compute, Databricks recommends using serverless compute instead of pools to take advantage of always-on, scalable compute. See Connect to serverless compute.",2024-12-18T22:16:00.000Z,concept-article,best-practices,0.8,True,Explicit best-practices article on how to best configure and use pools; contains product-specific recommendations and patterns.,unchanged https://learn.microsoft.com/en-us/azure/databricks/compute/pool-index,Instance pools overview,Connect to pools - Azure Databricks,Decide when and how to use Azure Databricks pools,Learn what Databricks pools are and how to use them.,"Note If your workload supports serverless compute, Databricks recommends using serverless compute instead of pools to take advantage of always-on, scalable compute. See Connect to serverless compute. 
Azure Databricks pools are a set of idle, ready-to-use instances. When cluster nodes are created using the idle instances, cluster start and auto-scaling times are reduced. If the pool has no idle instances, the pool expands by allocating a new instance from the instance provider in order to accommod",2026-04-08T19:25:00.000Z,concept-article,decision-making,0.65,True,"Explains what pools are and when to use them, including a recommendation to prefer serverless compute when supported; this is product-specific guidance that helps choose between pools and serverless for different workloads, fitting decision-making around compute options.",unchanged https://learn.microsoft.com/en-us/azure/databricks/compute/pools,Pool configuration reference,Pool configuration reference - Azure Databricks,Configure Databricks instance pools in the UI,"Learn how to create an Azure Databricks pool in the UI, including the available configuration options for new pools.","This article describes the available settings when creating a pool using the UI. To learn how to use the Databricks CLI to create a pool, see Databricks CLI commands. To learn how to use the REST API to create a pool, see the Instance Pools API. Note If your workload supports serverless compute, Databricks recommends using serverless compute instead of pools to take advantage of always-on, scalable compute. 
See Connect to serverless compute.",2025-02-11T08:00:00.000Z,concept-article,configuration,0.85,True,"Pool configuration reference describing available settings for new pools; likely includes parameter names, allowed values, and defaults.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/,Serverless compute overview,Connect to serverless compute - Azure Databricks,,"Learn about serverless compute for notebooks, workflows, and Lakeflow Declarative Pipelines in Databricks.","This page explains how to connect to and use serverless compute for notebooks, workflows, and Lakeflow Spark Declarative Pipelines in Azure Databricks.",2026-03-19T18:35:00.000Z,concept-article,,0.3,False,Explains how to connect to serverless compute; likely a usage/tutorial page rather than configuration reference with parameter tables.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/best-practices,Best practices for serverless compute,Best practices for serverless compute - Azure Databricks,Apply serverless compute best practices in Azure Databricks,"Best practices for serverless compute in Databricks, including performance mode selection, dependency management, networking, and cost monitoring.","Follow these recommendations to maximize productivity, reduce costs, and improve reliability when using serverless compute for notebooks, jobs, and pipelines on Azure Databricks.",2026-04-16T17:44:00.000Z,concept-article,best-practices,0.78,True,"The page is explicitly a best-practices guide for Databricks serverless compute, with product-specific recommendations (for example, performance mode selection, dependency handling, networking, and cost monitoring) that go beyond generic advice and are tied to this specific service’s behavior.",updated -https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/dependencies,Configure serverless environment,Configure the serverless environment - Azure Databricks,Configure environments 
and policies for Databricks serverless notebooks,"Learn how to manage the environment, dependencies, memory size, and serverless usage policy in your serverless notebooks.","This page explains how to use a serverless notebook's Environment side panel to configure dependencies, serverless usage policies, memory, and base environment. This panel provides a single place to manage the notebook's serverless settings. Settings configured in this panel only apply when the notebook is connected to serverless compute. To expand the Environment side panel, click on the button to the right of the notebook.",2026-04-15T08:00:00.000Z,concept-article,configuration,0.86,True,"The page describes a configuration side panel for serverless notebooks, including settings for dependencies, memory size, base environment, and usage policies. These are concrete configuration options unique to Databricks serverless, matching the configuration sub-skill.",updated -https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/limitations,Serverless compute limitations,Serverless compute limitations - Azure Databricks,Understand feature and usage limitations of Databricks serverless compute,Learn about the current limitations of serverless compute for notebooks and jobs.,This article explains the current limitations of serverless compute for notebooks and jobs. It starts with an overview of the most important considerations and then provides a comprehensive reference list of limitations.,2026-04-15T08:00:00.000Z,concept-article,limits-quotas,0.74,True,"This is a reference list of current limitations for serverless compute. While the summary doesn’t show numbers, such limitation pages typically enumerate concrete constraints (unsupported features, behavioral limits) that are product-specific and not conceptual. 
Among the available types, it best aligns with limits-quotas as a limitations reference.",updated -https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/migration,Migrate from classic to serverless,Migrate from classic compute to serverless compute - Azure Databricks,Plan migration from classic to serverless Databricks compute,"Learn how to migrate workloads from classic compute to serverless compute, including prerequisites, code changes, and a phased migration plan.","Migrate your workloads from classic compute to serverless compute. Serverless compute handles provisioning, scaling, runtime upgrades, and optimization automatically. Most classic workloads can migrate with minimal or no code changes. This page focuses on those workloads. Some features, such as df.cache, are not yet supported on serverless, but will not require code changes once available. Certain workloads that depend on R or Scala notebooks require classic compute and will not be able to migrat",2026-04-15T08:00:00.000Z,concept-article,decision-making,0.7,True,"The article focuses on migrating from classic to serverless compute, including which workloads can or cannot migrate and feature gaps. This is product-specific decision guidance about when to use serverless vs. classic and how to phase migration, fitting the decision-making category.",new +https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/,Serverless compute overview,Connect to serverless compute - Azure Databricks,,"Learn about serverless compute for notebooks, workflows, and Lakeflow Declarative Pipelines in Databricks.","This page explains how to connect to and use serverless compute for notebooks, workflows, and Lakeflow Spark Declarative Pipelines in Azure Databricks.",2026-04-23T08:00:00.000Z,concept-article,,0.3,False,"Page appears to describe how to connect and use Azure Databricks serverless compute for notebooks, workflows, and Lakeflow pipelines. 
Based on the description, it is likely a conceptual/how-to connection guide without detailed limits tables, configuration parameter matrices, error-code-based troubleshooting, or decision matrices. Lacking clear evidence of numeric limits, config tables, or product-specific troubleshooting, it does not meet the expert-knowledge criteria for any sub-skill type.",updated +https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/best-practices,Best practices for serverless compute,Best practices for serverless compute - Azure Databricks,Apply serverless compute best practices in Azure Databricks,"Best practices for serverless compute in Databricks, including performance mode selection, dependency management, networking, and cost monitoring.","Follow these recommendations to maximize productivity, reduce costs, and improve reliability when using serverless compute for notebooks, jobs, and pipelines on Azure Databricks.",2026-04-16T17:44:00.000Z,concept-article,best-practices,0.78,True,"The page is explicitly a best-practices guide for Databricks serverless compute, with product-specific recommendations (for example, performance mode selection, dependency handling, networking, and cost monitoring) that go beyond generic advice and are tied to this specific service’s behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/dependencies,Configure serverless environment,Configure the serverless environment - Azure Databricks,Configure environments and policies for Databricks serverless notebooks,"Learn how to manage the environment, dependencies, memory size, and serverless usage policy in your serverless notebooks.","This page explains how to use a serverless notebook's Environment side panel to configure dependencies, serverless usage policies, memory, and base environment. This panel provides a single place to manage the notebook's serverless settings. 
Settings configured in this panel only apply when the notebook is connected to serverless compute. To expand the Environment side panel, click on the button to the right of the notebook.",2026-04-15T08:00:00.000Z,concept-article,configuration,0.86,True,"The page describes a configuration side panel for serverless notebooks, including settings for dependencies, memory size, base environment, and usage policies. These are concrete configuration options unique to Databricks serverless, matching the configuration sub-skill.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/limitations,Serverless compute limitations,Serverless compute limitations - Azure Databricks,Understand feature and usage limitations of Databricks serverless compute,Learn about the current limitations of serverless compute for notebooks and jobs.,This article explains the current limitations of serverless compute for notebooks and jobs. It starts with an overview of the most important considerations and then provides a comprehensive reference list of limitations.,2026-04-15T08:00:00.000Z,concept-article,limits-quotas,0.74,True,"This is a reference list of current limitations for serverless compute. While the summary doesn’t show numbers, such limitation pages typically enumerate concrete constraints (unsupported features, behavioral limits) that are product-specific and not conceptual. Among the available types, it best aligns with limits-quotas as a limitations reference.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/migration,Migrate from classic to serverless,Migrate from classic compute to serverless compute - Azure Databricks,Plan migration from classic to serverless Databricks compute,"Learn how to migrate workloads from classic compute to serverless compute, including prerequisites, code changes, and a phased migration plan.","Migrate your workloads from classic compute to serverless compute. 
Serverless compute handles provisioning, scaling, runtime upgrades, and optimization automatically. Most classic workloads can migrate with minimal or no code changes. This page focuses on those workloads. Some features, such as df.cache, are not yet supported on serverless, but will not require code changes once available. Certain workloads that depend on R or Scala notebooks require classic compute and will not be able to migrat",2026-04-15T08:00:00.000Z,concept-article,decision-making,0.7,True,"The article focuses on migrating from classic to serverless compute, including which workloads can or cannot migrate and feature gaps. This is product-specific decision guidance about when to use serverless vs. classic and how to phase migration, fitting the decision-making category.",unchanged https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/notebooks,Serverless compute for notebooks,Serverless compute for notebooks - Azure Databricks,,Learn how to use your workspace's serverless interactive compute resource.,"This article explains how to use serverless compute for notebooks. For information on using serverless compute for jobs, see Run your Lakeflow Jobs with serverless compute for workflows. For pricing information about using serverless compute in notebooks, see Databricks pricing.",2026-03-25T08:00:00.000Z,concept-article,,0.3,False,How-to article for using serverless compute for notebooks; appears procedural without detailed config matrices or limits.,unchanged https://learn.microsoft.com/en-us/azure/databricks/compute/simple-form,Use the simple compute form,Use the simple form to manage compute - Azure Databricks,,Learn how to use the simple form to create compute resources in the Databricks UI.,This article explains how to access and use the simplified compute UI to create and edit all-purpose and job compute. 
You can switch between the simple form and the legacy compute form at the top of the create compute UI using the Simple form toggle.,2025-12-04T20:41:00.000Z,concept-article,,0.3,False,Explains how to use the simple compute UI; mostly procedural UI steps without deep configuration reference tables or limits.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/compute/single-user-fgac,Fine-grained access control,Fine-grained access control on dedicated compute - Azure Databricks,Use fine-grained access control on dedicated compute,"Learn how serverless compute provides data filtering with dedicated compute, enabling users to use dedicated compute to query tables that use row filters and column masks, dynamic views, materialized ","Fine-grained access control allows you to restrict access to specific data using views, row filters, and column masks. This page explains how serverless compute is used to enforce fine-grained access controls on dedicated compute resources. Note Dedicated compute is all-purpose or jobs compute configured with Dedicated access mode (formerly single user access mode). See Access modes.",2026-03-10T08:00:00.000Z,concept-article,security,0.7,True,"Explains how serverless and dedicated compute enforce row filters, column masks, and dynamic views; this is product-specific access control behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/compute/single-user-fgac,Fine-grained access control,Fine-grained access control on dedicated compute - Azure Databricks,Use fine-grained access control on dedicated compute,"Learn how serverless compute provides data filtering with dedicated compute, enabling users to use dedicated compute to query tables that use row filters and column masks, dynamic views, materialized ","Fine-grained access control allows you to restrict access to specific data using views, row filters, and column masks. 
This page explains how serverless compute is used to enforce fine-grained access controls on dedicated compute resources. Note Dedicated compute is all-purpose or jobs compute configured with Dedicated access mode (formerly single user access mode). See Access modes.",2026-04-21T08:00:00.000Z,concept-article,security,0.7,True,"Explains how serverless compute enforces fine-grained access controls (row filters, column masks, dynamic views) on dedicated compute; this is Databricks-specific security/authorization behavior.",updated https://learn.microsoft.com/en-us/azure/databricks/compute/sql-warehouse/,SQL warehouses overview,Connect to a SQL warehouse - Azure Databricks,,"Learn about using SQL warehouses, formerly called SQL endpoints, for data warehousing on Azure Databricks.","A SQL warehouse is a compute resource that lets you query and explore data on Azure Databricks. Most users have access to SQL warehouses configured by administrators. For information on serverless compute plane architecture, see Serverless compute plane. Databricks recommends using serverless SQL warehouses when available.",2026-01-08T08:00:00.000Z,concept-article,,0.3,False,"Primarily a conceptual/usage overview of SQL warehouses and how to connect; no detailed limits, config tables, or error mappings that meet the expert-knowledge criteria.",unchanged https://learn.microsoft.com/en-us/azure/databricks/compute/sql-warehouse/bi-workload-settings,SQL warehouse settings for BI workloads,SQL warehouse settings for BI workloads - Azure Databricks,Tune Databricks SQL warehouses for BI workloads,Learn how to configure SQL warehouses for optimal business intelligence workload performance.,"Business intelligence workloads have distinct characteristics that require specific SQL warehouse configuration considerations. 
This page provides guidance on analyzing your BI workload requirements and configuring SQL warehouses to deliver optimal performance, cost-efficiency, and reliability.",2026-02-17T23:30:00.000Z,concept-article,best-practices,0.76,True,"Provides workload-specific configuration recommendations for BI, including concrete sizing, concurrency, and performance/cost trade-offs for Databricks SQL warehouses.",unchanged https://learn.microsoft.com/en-us/azure/databricks/compute/sql-warehouse/create,Create a SQL warehouse,Create a SQL warehouse - Azure Databricks,Configure and manage Databricks SQL warehouses,"Learn about SQL warehouse requirements, how to configure and manage SQL warehouses using the Azure Databricks UI, and advanced configuration options.","Workspace admins and sufficiently privileged users can configure and manage SQL warehouses. This article outlines how to create, edit, and monitor existing SQL warehouses. You can also create SQL warehouses using the SQL warehouse API, or Terraform. Databricks recommends using serverless SQL warehouses when available. Note Most users cannot create SQL warehouses, but can restart any SQL warehouse they can connect to. 
See Connect to a SQL warehouse.",2026-03-10T08:00:00.000Z,concept-article,configuration,0.7,True,"Describes requirements and advanced configuration options for SQL warehouses, likely including specific settings and parameters in the UI/API, which is product-specific configuration knowledge beyond generic deployment steps.",unchanged @@ -320,19 +321,19 @@ https://learn.microsoft.com/en-us/azure/databricks/compute/troubleshooting/debug https://learn.microsoft.com/en-us/azure/databricks/compute/troubleshooting/query-watchdog,Handling large queries in interactive workflows,Handling large queries in interactive workflows - Azure Databricks,Control large interactive queries with Query Watchdog,Learn how to use Query Watchdog to throttle large queries in Azure Databricks.,"A challenge with interactive data workflows is handling large queries. This includes queries that generate too many output rows, fetch many external partitions, or compute on extremely large data sets. These queries can be extremely slow, saturate compute resources, and make it difficult for others to share the same compute. 
Query Watchdog is a process that prevents queries from monopolizing compute resources by examining the most common causes of large queries and terminating queries that pass ",2024-03-15T00:30:00.000Z,concept-article,best-practices,0.72,True,Troubleshooting/mitigation guidance for large queries using Databricks-specific Query Watchdog settings and behaviors; contains product-specific thresholds and actions for throttling and terminating problematic queries.,unchanged https://learn.microsoft.com/en-us/azure/databricks/compute/use-compute,Classic compute overview,Classic compute overview - Azure Databricks,,Learn how users can access and create compute to run their queries and workloads.,This page explains how to access and create classic compute in your Azure Databricks workspace.,2026-03-25T08:00:00.000Z,concept-article,,0.3,False,"High-level overview of classic compute access and creation; no detailed limits, configs, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/compute/web-terminal,Use the web terminal,Run shell commands in Azure Databricks web terminal - Azure Databricks,,"Azure Databricks web terminal provides an interactive way to run shell commands and use editors, such as Vim or Emacs.","The Azure Databricks web terminal provides a convenient and highly interactive way to run shell commands in a command-line interface (CLI), including Databricks CLI commands, to take actions on Databricks objects programmatically. It's especially useful for advanced use cases, such as batch operations on multiple files, which existing user interfaces (UIs) might not fully support. Multiple users can use the web terminal on one compute. 
You can use the web terminal to do the following:",2026-03-16T08:00:00.000Z,concept-article,,0.4,False,Describes web terminal usage and capabilities; appears as a feature overview/tutorial without deep configuration tables or error mappings.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/connect/,Overview,Connect to data sources and external services - Azure Databricks,Configure Azure Databricks connections to external data,"Learn how to connect your Azure Databricks workspace to storage, external data systems, and external cloud services.","This page provides recommendations for administrators and power users who are configuring connections between Azure Databricks and external data sources and services. You can connect your Azure Databricks account to data sources such as cloud object storage, relational database management systems, streaming data services, and enterprise platforms such as CRMs. You can also connect your Azure Databricks account to non-storage external services.",2026-04-17T18:03:00.000Z,overview,integrations,0.68,True,"Page targets admins configuring connections from Azure Databricks to external storage and services; it typically includes product-specific connection patterns, endpoint/credential configuration details, and service-specific guidance that go beyond generic SDK usage, fitting integrations & coding patterns.",updated +https://learn.microsoft.com/en-us/azure/databricks/connect/,Overview,Connect to data sources and external services - Azure Databricks,Configure Azure Databricks connections to external data,"Learn how to connect your Azure Databricks workspace to storage, external data systems, and external cloud services.","This page provides recommendations for administrators and power users who are configuring connections between Azure Databricks and external data sources and services. 
You can connect your Azure Databricks account to data sources such as cloud object storage, relational database management systems, streaming data services, and enterprise platforms such as CRMs. You can also connect your Azure Databricks account to non-storage external services.",2026-04-17T18:03:00.000Z,overview,integrations,0.68,True,"Page targets admins configuring connections from Azure Databricks to external storage and services; it typically includes product-specific connection patterns, endpoint/credential configuration details, and service-specific guidance that go beyond generic SDK usage, fitting integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/connect/jdbc-connection,JDBC connection,JDBC connection - Azure Databricks,Configure JDBC Unity Catalog connections to external databases,"Learn how to connect to external databases using JDBC with Unity Catalog connections, including driver installation, query options, and migration guidance.","Note This feature is a workspace-level beta on Databricks Runtime 17.3 and above. To enable this feature in your workspace, see Manage workspace-level previews. Azure Databricks supports connecting to external databases using JDBC. You can use a JDBC Unity Catalog connection to read and write to a data source with the Spark Data Source API or Azure Databricks Remote Query SQL API. 
The JDBC connection is a securable object in Unity Catalog that specifies the JDBC driver, the URL path, and credentials",2026-03-31T08:00:00.000Z,how-to,integrations,0.8,True,"Details JDBC connection objects, driver specification, URL, credentials, and usage via Spark and Remote Query SQL API; includes beta/runtime constraints and migration guidance.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/connect/managed-ingestion,Connect to managed ingestion sources (Lakeflow Connect),Connect to managed ingestion sources - Azure Databricks,Create Lakeflow Connect connections for managed ingestion,Learn how to create connections in Catalog Explorer that store authentication details for Lakeflow Connect managed ingestion sources. Any user with `USE CONNECTION` privilege on the connection can the,Learn how to create connections in Catalog Explorer that store authentication details for Lakeflow Connect managed ingestion sources. Any user with the USE CONNECTION privilege on the connection can then create managed ingestion pipelines from sources like Salesforce and SQL Server. An admin user must complete the steps in this article if the users who will create pipelines are non-admin users or will use programmatic interfaces. These interfaces require that users specify an existing connection w,2026-04-02T08:00:00.000Z,how-to,security,0.7,True,Describes creating connection objects that store authentication details and USE CONNECTION privileges; product-specific IAM and connection configuration.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/kafka/,Kafka,Connect to Apache Kafka - Azure Databricks,Connect Azure Databricks streaming to Apache Kafka,This article describes how you can use Apache Kafka as either a source or a sink when running Structured Streaming workloads on Azure Databricks.,"This article describes how you can use Apache Kafka as either a source or a sink when running Structured Streaming workloads on Azure Databricks. 
For more information about Kafka, see the Apache Kafka documentation.",2026-04-15T22:08:00.000Z,how-to,integrations,0.7,True,"Kafka as source/sink for Structured Streaming generally requires product-specific options (e.g., Kafka configs, security settings) and code patterns, fitting integrations & coding patterns.",updated +https://learn.microsoft.com/en-us/azure/databricks/connect/managed-ingestion,Connect to managed ingestion sources (Lakeflow Connect),Connect to managed ingestion sources - Azure Databricks,Configure connections for Lakeflow managed ingestion sources,Learn how to create connections in Catalog Explorer that store authentication details for Lakeflow Connect managed ingestion sources. Any user with `USE CONNECTION` privilege on the connection can the,Learn how to create connections in Catalog Explorer that store authentication details for Lakeflow Connect managed ingestion sources. Any user with the USE CONNECTION privilege on the connection can then create managed ingestion pipelines from sources like Salesforce and SQL Server. An admin user must complete the steps in this article if the users who will create pipelines are non-admin users or will use programmatic interfaces. These interfaces require that users specify an existing connection w,2026-04-23T08:00:00.000Z,how-to,configuration,0.65,True,"Explains how to create and use connection objects in Catalog Explorer for Lakeflow Connect managed ingestion, including USE CONNECTION privileges and authentication storage. 
This is product-specific configuration of connections and permissions, likely with concrete settings and roles, fitting the configuration sub-skill.",updated +https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/kafka/,Kafka,Connect to Apache Kafka - Azure Databricks,Connect Azure Databricks streaming to Apache Kafka,This article describes how you can use Apache Kafka as either a source or a sink when running Structured Streaming workloads on Azure Databricks.,"This article describes how you can use Apache Kafka as either a source or a sink when running Structured Streaming workloads on Azure Databricks. For more information about Kafka, see the Apache Kafka documentation.",2026-04-15T22:08:00.000Z,how-to,integrations,0.7,True,"Kafka as source/sink for Structured Streaming generally requires product-specific options (e.g., Kafka configs, security settings) and code patterns, fitting integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/kafka/authentication,Authentication,Authentication - Azure Databricks,Configure Kafka authentication on Azure Databricks,This article describes how you can authenticate to Apache Kafka on Azure Databricks.,The Azure Databricks Kafka connector supports multiple authentication methods for connecting to Kafka. This article covers some of the most common authentication methods on Databricks. 
The full list of supported authentication methods can be found in the Kafka documentation.,2026-04-10T08:00:00.000Z,how-to,security,0.75,True,"Authentication article for Kafka connector; expected to list supported auth methods, security protocol settings, and configuration parameters specific to Databricks Kafka integration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/kafka/faq,FAQ,FAQ - Azure Databricks,Troubleshoot Apache Kafka usage on Databricks,Find answers to commonly asked questions about Apache Kafka on Azure Databricks.,Commonly asked questions about using Kafka with Azure Databricks.,2026-04-15T22:08:00.000Z,faq,troubleshooting,0.7,True,"Kafka FAQ for Databricks typically addresses concrete issues, error messages, and behavior clarifications specific to this integration, aligning with troubleshooting-style expert knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/kafka/faq,FAQ,FAQ - Azure Databricks,Troubleshoot Apache Kafka usage on Databricks,Find answers to commonly asked questions about Apache Kafka on Azure Databricks.,Commonly asked questions about using Kafka with Azure Databricks.,2026-04-15T22:08:00.000Z,faq,troubleshooting,0.7,True,"Kafka FAQ for Databricks typically addresses concrete issues, error messages, and behavior clarifications specific to this integration, aligning with troubleshooting-style expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/kafka/options,Options,Options - Azure Databricks,Configure Kafka connector options in Azure Databricks,Reference documentation for Apache Kafka connector options on Azure Databricks.,"This page describes configuration options for reading from and writing to Apache Kafka using Structured Streaming on Azure Databricks. The Azure Databricks Kafka connector is built on top of the Apache Spark Kafka connector and supports all standard Kafka configuration options. 
Any option prefixed with kafka. is passed through directly to the underlying Kafka client. For example, .option(""kafka.max.poll.records"", ""500"") sets the Kafka consumer's max.poll.records property. See the Kafka configuration do",2026-03-12T18:41:00.000Z,reference,configuration,0.95,True,"Reference for Kafka connector options necessarily includes option names, allowed values, and behavior; these are detailed configuration parameters unique to Databricks’ Kafka integration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/pub-sub,Pub/Sub,Subscribe to Google Pub/Sub - Azure Databricks,Integrate Google Pub/Sub with Azure Databricks streaming,This article describes how you can subscribe to Google Pub/Sub with Structured Streaming on Azure Databricks.,"Azure Databricks provides a built-in connector to subscribe to Google Pub/Sub in Databricks Runtime 13.3 LTS and above. This connector provides exactly-once processing semantics for records from the subscriber. Note Pub/Sub might publish duplicate records, and records might arrive to the subscriber out of order. You should write Azure Databricks code to handle duplicate and out-of-order records.",2025-02-14T08:00:00.000Z,how-to,integrations,0.7,True,"Describes a built-in Pub/Sub connector; such pages typically include connector options, required parameters, and semantics (exactly-once, ordering) that are product-specific integration details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/pub-sub,Pub/Sub,Subscribe to Google Pub/Sub - Azure Databricks,Subscribe to Google Pub/Sub with Databricks Structured Streaming,"Learn how to subscribe to Google Pub/Sub with Structured Streaming on Azure Databricks, including authentication, schema details, and configuration options.","Use the built-in connector to subscribe to Google Pub/Sub. This connector provides exactly-once processing semantics for records from the subscriber. 
Note Pub/Sub might publish duplicate records, or records might arrive to the subscriber out of order. Write code to handle duplicate and out-of-order records.",2026-04-21T18:07:00.000Z,how-to,integrations,0.8,True,"Describes a built-in Pub/Sub connector with exactly-once semantics, authentication, schema, and configuration options; includes product-specific options and caveats (duplicates, out-of-order) that form an integration and coding pattern.",updated
https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/pulsar,Pulsar,Stream from Apache Pulsar - Azure Databricks,Configure Apache Pulsar streaming source on Azure Databricks,This article describes how you can configure a streaming read from Apache Pulsar on Azure Databricks.,"Important This feature is in Public Preview. In Databricks Runtime 14.1 and above, you can use Structured Streaming to stream data from Apache Pulsar on Azure Databricks. Structured Streaming provides exactly-once processing semantics for data read from Pulsar sources.",2024-03-01T08:00:00.000Z,how-to,integrations,0.7,True,Pulsar streaming connector configuration is an integration pattern; will include connector options and parameters specific to Databricks’ Pulsar support.,unchanged
-https://learn.microsoft.com/en-us/azure/databricks/connect/uc-connections,Unity Catalog connections,Unity Catalog connections - Azure Databricks,Secure Unity Catalog connections to external systems,"Learn about Unity Catalog connections, the securable objects that store endpoint and credential information for accessing external systems from Azure Databricks.",A connection is a securable object in Unity Catalog that stores the endpoint and credentials needed to access an external system. It lives directly under the metastore in the Unity Catalog object hierarchy.
A connection bundles together: Connections are distinct from storage credentials (for cloud object storage) and service credentials (for non-storage cloud services).,2026-04-17T18:03:00.000Z,overview,security,0.7,True,"Unity Catalog connections are securable objects with endpoint and credential definitions; documentation for these objects generally includes permission scopes, object-level security semantics, and how they differ from storage/service credentials, which is product-specific IAM/security configuration.",new
+https://learn.microsoft.com/en-us/azure/databricks/connect/uc-connections,Unity Catalog connections,Unity Catalog connections - Azure Databricks,Secure Unity Catalog connections to external systems,"Learn about Unity Catalog connections, the securable objects that store endpoint and credential information for accessing external systems from Azure Databricks.",A connection is a securable object in Unity Catalog that stores the endpoint and credentials needed to access an external system. It lives directly under the metastore in the Unity Catalog object hierarchy.
A connection bundles together: Connections are distinct from storage credentials (for cloud object storage) and service credentials (for non-storage cloud services).,2026-04-17T18:03:00.000Z,overview,security,0.7,True,"Unity Catalog connections are securable objects with endpoint and credential definitions; documentation for these objects generally includes permission scopes, object-level security semantics, and how they differ from storage/service credentials, which is product-specific IAM/security configuration.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-services/,Overview,Connect to external cloud services using Unity Catalog - Azure Databricks,Govern external cloud service access with Unity Catalog,Learn how to use Unity Catalog to govern access to cloud services from Azure Databricks.,"This page explains how Unity Catalog governs access to non-storage cloud services and provides links to pages that explain how to enable access to cloud services from Azure Databricks. To enable and govern access to external services: This is preferable to passing credentials directly in your code, attaching credentials directly to a compute resource, or using Databricks secrets, because it allows you to restrict and track access using Unity Catalog privilege management and audit logging.
Servic",2026-01-20T08:00:00.000Z,overview,security,0.7,True,Explains how to enable and govern access to non-storage cloud services using Unity Catalog privileges and audit logging; product-specific security configuration.,unchanged https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-services/manage-service-credentials,Manage service credentials,Manage service credentials - Azure Databricks,Manage Unity Catalog service credentials and permissions,"This article describes administrative tasks for Unity Catalog-governed service credentials, which are securable objects that let you govern access to external cloud services in Azure Databricks.","This article describes how to list, view, update, grant permissions on, and delete service credentials, which are Unity Catalog securable objects that let you govern access to external cloud services. See also:",2026-01-20T08:00:00.000Z,how-to,security,0.7,True,"Covers listing, updating, granting permissions on, and deleting service credentials; product-specific security administration patterns.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-services/service-credentials,Create service credentials,Create service credentials - Azure Databricks,Create Unity Catalog service credentials for cloud services,Learn how to use Unity Catalog to create service credentials that manage connections to external cloud services from Azure Databricks.,"This article describes how to create a service credential object in Unity Catalog that lets you govern access from Azure Databricks to external cloud services. A service credential in Unity Catalog encapsulates a long-term cloud credential that grants access to such services. Service credentials are not intended for governing access to cloud storage that is used as a Unity Catalog managed storage location or external storage location. For those use cases, use a storage credential. 
See Connect to ",2026-01-24T08:00:00.000Z,how-to,security,0.75,True,"Defines service credential objects, their scope, and how they encapsulate long-term cloud credentials; product-specific security object configuration.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-services/service-credentials,Create service credentials,Create service credentials - Azure Databricks,Create Unity Catalog service credentials for external services,Learn how to use Unity Catalog to create service credentials that manage connections to external cloud services from Azure Databricks.,"This article describes how to create a service credential object in Unity Catalog that lets you govern access from Azure Databricks to external cloud services. A service credential in Unity Catalog encapsulates a long-term cloud credential that grants access to such services. Service credentials are not intended for governing access to cloud storage that is used as a Unity Catalog managed storage location or external storage location. For those use cases, use a storage credential. See Connect to ",2026-04-23T08:00:00.000Z,how-to,security,0.7,True,"Describes how to create and govern service credential objects in Unity Catalog for external cloud services.
This is identity/access configuration specific to Azure Databricks and Unity Catalog, likely including object types, permissions, and scope details that qualify as product-specific security configuration.",updated https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-services/use-service-credentials,Use service credentials to access services,Use Unity Catalog service credentials to connect to external cloud services - Azure Databricks,Use Unity Catalog service credentials to call cloud APIs,Learn how to use Unity Catalog service credentials to connect to external cloud services from Azure Databricks.,This article describes how to use a service credential in Unity Catalog to connect to external cloud services. A service credential object in Unity Catalog encapsulates a long-term cloud credential that provides access to an external cloud service that users need to connect to from Azure Databricks. See also:,2026-01-20T08:00:00.000Z,how-to,integrations,0.7,True,Shows how to use service credential objects when connecting to external cloud services; includes concrete usage patterns and parameters for integrations.,unchanged https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/,Overview,Connect to cloud object storage using Unity Catalog - Azure Databricks,Configure Unity Catalog access to cloud object storage,Learn how Unity Catalog uses cloud object storage and how to access cloud storage from Azure Databricks.,"This article gives an overview of the cloud storage connections that are required to work with data using Unity Catalog, along with information about how Unity Catalog governs access to cloud storage and external cloud services.",2026-01-20T08:00:00.000Z,concept-article,security,0.7,True,Explains required storage connections and how Unity Catalog governs access; includes product-specific security and access configuration patterns.,unchanged 
https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/azure-managed-identities,Access storage using managed identities,Use Azure managed identities in Unity Catalog to access storage - Azure Databricks,Use Azure managed identities for Unity Catalog storage,Learn how to use Azure managed identities to connect to Azure Databricks Unity Catalog metastore root storage and other external storage accounts.,This page describes how to use Azure managed identities for connecting to storage containers on behalf of Unity Catalog users.,2026-02-11T08:00:00.000Z,how-to,security,0.8,True,"Details how to configure managed identities for metastore root and external storage; includes identity types, scopes, and Databricks-specific steps.",unchanged
@@ -340,41 +341,41 @@ https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-s
https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/external-locations-dbfs-root,Configure external locations for DBFS root,Connect to a DBFS root external location (legacy) - Azure Databricks,Connect DBFS root as a Unity Catalog external location,Learn how to configure an external location in Unity Catalog to govern access to your DBFS root storage location.,"This page describes how to connect to a Databricks File System (DBFS) root storage external location. After you connect, you can govern access to objects in DBFS root storage using Unity Catalog. Although Databricks recommends against storing data in DBFS root storage, your workspace might do so because of legacy practices. For example, your workspace-local, legacy Azure Databricks Hive metastore might have stored data in the DBFS root.
Follow this guide to connect to the DBFS root by first cre",2026-03-10T08:00:00.000Z,how-to,configuration,0.75,True,"Provides detailed steps and constraints for governing DBFS root via external locations, including legacy-specific behavior.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/manage-external-locations,Manage external locations,Manage external locations - Azure Databricks,Administer Unity Catalog external locations and permissions,This article describes administration tasks for Unity Catalog external locations on Azure Databricks.,"This page describes how to list, view, update, grant permissions on, enable file events for, and delete external locations. Note Databricks recommends governing file access using volumes. See What are Unity Catalog volumes?.",2026-04-09T08:00:00.000Z,how-to,security,0.75,True,"Describes listing, updating, granting permissions, enabling file events, and deleting external locations. Involves product-specific administrative and permission operations on external locations, aligning with security/identity configuration details.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/manage-storage-credentials,Manage storage credentials,Manage storage credentials - Azure Databricks,Administer Unity Catalog storage credentials and permissions,This article describes administrative tasks for Unity Catalog storage credentials on Azure Databricks.,"This page describes how to list, view, update, grant permissions on, and delete storage credentials. Databricks recommends that you grant only CREATE EXTERNAL LOCATION and no other privileges on storage credentials. This page describes how to manage storage credentials using Catalog Explorer and SQL commands.
For information about using the Databricks CLI or Terraform instead, see the Databricks Terraform documentation and What is the Databricks CLI?.",2026-03-10T23:23:00.000Z,how-to,security,0.8,True,"Covers listing, updating, and permission recommendations (e.g., grant only CREATE EXTERNAL LOCATION) for storage credentials, which are detailed security practices.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/managed-storage,Configure managed storage,Specify a managed storage location in Unity Catalog - Azure Databricks,Configure Unity Catalog managed storage locations,"Learn how to associate managed storage locations with a Unity Catalog metastore, catalog, or schema and how these locations are used for managed volumes and tables.","A managed storage location specifies a location in cloud object storage for storing data for managed tables and managed volumes. You can associate a managed storage location with a metastore, catalog, or schema. Managed storage locations at lower levels in the hierarchy override storage locations defined at higher levels when managed tables or managed volumes are created.
Databricks recommends that you assign managed storage at the catalog level for logical data isolation, with metastore-level a",2026-04-17T18:03:00.000Z,concept-article,configuration,0.7,True,"Describes how to associate managed storage locations at metastore, catalog, and schema levels and how precedence works; this is product-specific configuration behavior (hierarchy overrides, recommended assignment level) that an LLM would not reliably infer without the docs.",updated +https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/managed-storage,Configure managed storage,Specify a managed storage location in Unity Catalog - Azure Databricks,Configure Unity Catalog managed storage locations,"Learn how to associate managed storage locations with a Unity Catalog metastore, catalog, or schema and how these locations are used for managed volumes and tables.","A managed storage location specifies a location in cloud object storage for storing data for managed tables and managed volumes. You can associate a managed storage location with a metastore, catalog, or schema. Managed storage locations at lower levels in the hierarchy override storage locations defined at higher levels when managed tables or managed volumes are created. 
Databricks recommends that you assign managed storage at the catalog level for logical data isolation, with metastore-level a",2026-04-17T18:03:00.000Z,concept-article,configuration,0.7,True,"Describes how to associate managed storage locations at metastore, catalog, and schema levels and how precedence works; this is product-specific configuration behavior (hierarchy overrides, recommended assignment level) that an LLM would not reliably infer without the docs.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/storage-credentials,Configure storage credentials for ADLS2,Create a storage credential for connecting to Azure Data Lake Storage - Azure Databricks,Create storage credentials for Azure Data Lake Storage,Learn how to create a storage credential in Azure Databricks to connect to Azure Data Lake Storage.,"This page describes how to create storage credentials in Unity Catalog to connect to Azure Data Lake Storage. For information about other cloud storage options supported by Unity Catalog, see Cloud storage options supported by Unity Catalog. A storage credential contains a long-term cloud credential that provides access to cloud storage.
You reference storage credentials, along with the cloud storage path, when you create external locations in Unity Catalog to govern access to external storage.",2026-03-10T08:00:00.000Z,how-to,security,0.8,True,"Describes how to configure storage credentials, including required fields and their use in external locations, which are product-specific security configs.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/storage-credentials-r2,Configure storage credentials for Cloudflare R2,Create a storage credential for connecting to Cloudflare R2 - Azure Databricks,Create storage credentials for Cloudflare R2 in Unity Catalog,Learn how to create a storage credential in Azure Databricks to connect to Cloudflare R2 buckets.,"This article describes how to create a storage credential in Unity Catalog to connect to Cloudflare R2. Cloudflare R2 object storage incurs no egress fees. Replicating or migrating data that you share to R2 enables you to share data across clouds and regions without incurring egress fees. A storage credential contains a long-term cloud credential that provides access to cloud storage. You reference storage credentials, along with the cloud storage path, when you create external locations in Unity Ca",2026-03-09T18:23:00.000Z,how-to,security,0.8,True,"Provides concrete configuration steps and parameters for R2 storage credentials and their use in external locations, which are product-specific.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/storage-credentials-s3,Configure storage credentials for AWS S3,Create a storage credential for connecting to AWS S3 (read-only) - Azure Databricks,Create read-only AWS S3 storage credentials in Unity Catalog,Learn how to create a storage credential in Azure Databricks to connect to AWS S3.,"This article describes how to create storage credentials in Unity Catalog to connect to AWS S3.
Support for S3 in Azure Databricks is read-only. A storage credential contains a long-term cloud credential with access to cloud storage. You reference a storage credential and the cloud storage path when you create external locations in Unity Catalog to govern access to external storage. For more information about storage credentials and external locations, see Connect to cloud object storage using Unity ",2026-03-10T08:00:00.000Z,how-to,security,0.8,True,"Documents how to configure S3 storage credentials with read-only support and how they integrate with external locations, a product-specific security setup.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/dashboards/,Dashboards,Dashboards - Azure Databricks,,Learn how to share insights with your team using AI/BI dashboards.,"You can use dashboards to build data visualizations and share reports with your team. AI/BI dashboards feature AI-assisted authoring, an enhanced visualization library, and a streamlined configuration experience so that you can quickly transform data into sharable insights. When published, your dashboards can be shared with anyone registered to your Azure Databricks account, even if they don't have access to the workspace.",2026-03-16T17:36:00.000Z,landing-page,,0.2,False,"Overview of Databricks dashboards and sharing capabilities; lacks numeric limits, detailed configuration tables, troubleshooting mappings, or other product-specific expert details.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/dashboards/,Dashboards,Dashboards - Azure Databricks,,Learn how to share insights with your team using AI/BI dashboards.,"You can use dashboards to build data visualizations and share reports with your team. AI/BI dashboards feature AI-assisted authoring, an enhanced visualization library, and a streamlined configuration experience so that you can quickly transform data into sharable insights.
When published, your dashboards can be shared with anyone registered to your Azure Databricks account, even if they don't have access to the workspace.",2026-04-22T17:34:00.000Z,landing-page,,0.2,False,"High-level overview of Databricks dashboards and sharing; no numeric limits, configuration tables, error codes, or product-specific decision matrices.",updated https://learn.microsoft.com/en-us/azure/databricks/dashboards/automate/,Developer workflows,Dashboard developer workflows - Azure Databricks,,"Learn how to automate dashboard workflows, transfer dashboards across workspaces, and manage dashboard versions with Git.","AI/BI dashboards support programmatic and DevOps-oriented workflows for managing dashboards at scale. You can manage dashboards as code using Declarative Automation Bundles and REST APIs, transfer dashboards across workspaces using import and export, and apply source control using Databricks Git folders.",2026-04-09T08:00:00.000Z,landing-page,,0.3,False,"Automation and DevOps workflows for dashboards are likely procedural/how-to content without detailed config tables, limits, or product-specific error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dashboards/automate/git-support,Source control,Version control dashboards with Git - Azure Databricks,,"Learn how to use Databricks Git folders to version control AI/BI dashboards, collaborate with team members, and streamline deployment across workspaces.",This page explains how to use Databricks Git folders for version control and collaborative dashboard development. It also describes how to implement CI/CD processes to develop and deploy dashboards across different workspaces.,2026-03-16T17:36:00.000Z,how-to,,0.3,False,"Explains how to use Databricks Git folders for version control and CI/CD of dashboards. 
The summary suggests a procedural/tutorial focus without specific configuration matrices, limits, or error-code troubleshooting content required for expert classification.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dashboards/automate/import-export,Import and export,"Export, import, or replace a dashboard - Azure Databricks",Export and import Databricks dashboards across workspaces,"Learn how to export, import, and replace dashboards across workspaces.","You can export and import dashboards as files to facilitate the sharing of editable dashboards across different workspaces. To transfer a dashboard to a different workspace, export it as a file and then import it into the new workspace. You can also replace dashboard files in place. That means that when you edit a dashboard file directly, you can upload that file to the original workspace and overwrite the existing file while maintaining the existing sharing settings. The following sections expl",2026-02-23T23:41:00.000Z,how-to,deployment,0.7,True,"Describes export/import/replace mechanics and constraints for dashboard files between workspaces, which are deployment-focused product details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dashboards/caching,Caching and performance,Dataset optimization and caching - Azure Databricks,,Learn about performance optimizations and caching strategies for dashboards.,"AI/BI dashboards are valuable data analysis and decision-making tools, and efficient load times can significantly improve the user experience. 
This article explains how caching and dataset optimizations make dashboards more performant and efficient.",2026-01-21T08:00:00.000Z,concept-article,,0.4,False,"Discusses caching and dataset optimization conceptually; summary does not show concrete numeric thresholds, config tables, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dashboards/concepts,Concepts,Dashboard concepts - Azure Databricks,,"Learn about AI/BI dashboards, a data visualization platform for building interactive reports and sharing insights across your organization.","AI/BI dashboards are an interactive data visualization platform that allows you to transform data into actionable insights and share them across your organization. Dashboards feature AI-assisted authoring, an enhanced visualization library, and a streamlined configuration experience to help you quickly build and publish reports. When published, your dashboards can be shared with anyone registered to your Azure Databricks account, even if they don't have access to the workspace.",2026-03-18T08:00:00.000Z,concept-article,,0.25,False,"Conceptual explanation of dashboard capabilities and usage; does not expose detailed configuration parameters, limits, or product-specific troubleshooting.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dashboards/concepts,Concepts,Dashboard concepts - Azure Databricks,,"Learn about AI/BI dashboards, a data visualization platform for building interactive reports and sharing insights across your organization.","AI/BI dashboards are an interactive data visualization platform that allows you to transform data into actionable insights and share them across your organization. Dashboards feature AI-assisted authoring, an enhanced visualization library, and a streamlined configuration experience to help you quickly build and publish reports. 
When published, your dashboards can be shared with anyone registered to your Azure Databricks account, even if they don't have access to the workspace.",2026-04-21T08:00:00.000Z,concept-article,,0.2,False,"Conceptual description of AI/BI dashboard capabilities; lacks concrete limits, configuration parameters, or troubleshooting mappings.",updated
https://learn.microsoft.com/en-us/azure/databricks/dashboards/genie-spaces,Dashboards with Genie spaces,Genie spaces with dashboards - Azure Databricks,,"Learn how to configure a companion Genie space for an AI/BI dashboard, allowing users to explore data with natural language.","Published dashboards include a Genie space by default to allow business users to explore data using natural language. A Genie space is a no-code interface that provides an Ask Genie button on published dashboards, allowing viewers to chat with the data instead of relying only on predefined visualizations. The Ask Genie button is available to viewers both in the workspace and when dashboards are embedded using basic embedding. See What is a Genie space.
When you publish your dashboard, the Enable Genie t",2026-04-01T08:00:00.000Z,how-to,,0.2,False,"Describes Genie spaces with dashboards and natural language exploration; appears to be feature usage/overview, not deep config or limits.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/dashboards/limits,Dashboard limits,Dashboard limits - Azure Databricks,Review Azure Databricks AI/BI dashboard limits,Learn about limits for AI/BI dashboards.,"This article describes the limits that apply to AI/BI dashboards, including pages, datasets, widgets, visualizations, and subscriptions.",2026-04-16T17:44:00.000Z,reference,limits-quotas,0.95,True,"Explicitly described as listing limits for pages, datasets, widgets, visualizations, and subscriptions; this will contain product-specific numeric limits and quotas.",updated
-https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/,Author dashboards,Author dashboards - Azure Databricks,,"Learn the basics of creating, viewing, organizing, and managing dashboards.","This page covers the fundamental operations for working with dashboards, including creating, viewing, organizing, and deleting dashboards. You can also create a dashboard using the Dashboard Authoring Agent, see Use Genie Code for dashboard authoring. For a tutorial on creating a dashboard, see Create a dashboard.",2026-02-23T23:41:00.000Z,how-to,,0.2,False,"Covers basic authoring operations (create, view, organize); primarily conceptual/how-to UI usage without deep config or limits.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/dashboard-agent,Genie Code for dashboard authoring,Use Genie Code for dashboard authoring - Azure Databricks,,Learn how to use Genie Code for dashboard authoring to build AI/BI dashboards. The agent can create multi-step dashboard workflows from a single prompt.,"This page introduces Genie Code for dashboard authoring, an AI data agent available by selecting Agent mode in Genie Code.
Designed specifically for AI/BI dashboards, it creates visualizations, configures layouts, and builds complete dashboards—all from a single prompt.",2026-04-16T17:44:00.000Z,feature-guide,,0.3,False,"Feature and workflow overview for Genie Code dashboard authoring; no indication of numeric limits, config tables, error codes, or product-specific parameters.",updated
+https://learn.microsoft.com/en-us/azure/databricks/dashboards/limits,Dashboard limits,Dashboard limits - Azure Databricks,Review Azure Databricks AI/BI dashboard limits,Learn about limits for AI/BI dashboards.,"This article describes the limits that apply to AI/BI dashboards, including pages, datasets, widgets, visualizations, and subscriptions.",2026-04-16T17:44:00.000Z,reference,limits-quotas,0.95,True,"Explicitly described as listing limits for pages, datasets, widgets, visualizations, and subscriptions; this will contain product-specific numeric limits and quotas.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/,Author dashboards,Author dashboards - Azure Databricks,,"Learn the basics of creating, viewing, organizing, and managing dashboards.","This page covers the fundamental operations for working with dashboards, including creating, viewing, organizing, and deleting dashboards. You can also create a dashboard using the Dashboard Authoring Agent, see Use Genie Code for dashboard authoring. For a tutorial on creating a dashboard, see Create a dashboard.",2026-04-22T17:34:00.000Z,how-to,,0.3,False,"Basic how-to for creating and organizing dashboards; appears procedural without detailed config tables, limits, or error-resolution content.",updated
+https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/dashboard-agent,Genie Code for dashboard authoring,Use Genie Code for dashboard authoring - Azure Databricks,,Learn how to use Genie Code for dashboard authoring to build AI/BI dashboards.
The agent can create multi-step dashboard workflows from a single prompt.,"This page introduces Genie Code for dashboard authoring, an AI data agent available by selecting Agent mode in Genie Code. Designed specifically for AI/BI dashboards, it creates visualizations, configures layouts, and builds complete dashboards—all from a single prompt.",2026-04-16T17:44:00.000Z,feature-guide,,0.3,False,"Feature and workflow overview for Genie Code dashboard authoring; no indication of numeric limits, config tables, error codes, or product-specific parameters.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/data-modeling/custom-calculations/,Custom calculations,What are custom calculations? - Azure Databricks,,Learn how to create and use custom calculations with AI/BI dashboards.,Custom calculations let you define dynamic metrics and transformations without modifying dataset queries. This page explains how to use custom calculations in AI/BI dashboards.,2026-04-09T08:00:00.000Z,concept-article,,0.35,False,"Explains custom calculations in dashboards; likely focuses on UI usage and formula concepts, not on product-specific limits, config ranges, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/data-modeling/custom-calculations/function-reference,Function reference,Custom calculation function reference - Azure Databricks,Reference functions for Databricks dashboard custom calculations,Complete reference of supported functions for custom calculations in AI/BI dashboards.,"This page provides a complete reference of all supported functions for custom calculations in AI/BI dashboards. 
For information about how to use custom calculations, seeWhat are custom calculations?.",2026-02-23T23:41:00.000Z,reference,configuration,0.8,True,"A complete function reference is expert knowledge: lists function names, signatures, argument types, and behavior specific to this product.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/data-modeling/custom-calculations/level-of-detail,Level of detail expressions,Level of detail (LOD) expressions - Azure Databricks,,Use level of detail expressions to control aggregation granularity independently of the grouping in your visualizations.,Level of detail (LOD) expressions enable you to specify a granularity at which aggregations are calculated that is different than the level of detail of your visualizations. This page explains how to use LOD expressions in AI/BI dashboards.,2026-04-09T08:00:00.000Z,concept-article,,0.35,False,"Covers level of detail (LOD) expressions conceptually and how to use them in visualizations; does not indicate presence of numeric limits, config tables, or error-code-based troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/data-modeling/datasets,Datasets,Create and manage dashboard datasets - Azure Databricks,,Learn how to create and manage dashboard datasets.,This article explains how to create and manage dashboard datasets using the dataset editor in an AI/BI dashboard.,2026-04-09T08:00:00.000Z,how-to,,0.3,False,How-to article on creating and managing dashboard datasets via the editor; likely procedural UI steps rather than structured configuration references or expert-only constraints.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/data-modeling/local-metric-views,Dashboard local metric views,Dashboard local metric views - Azure Databricks,,"Create and use metric views scoped to an AI/BI dashboard, without publishing to Unity Catalog first.","Important This feature is inPublic 
Preview. Dashboard local metric views enable you to define dimensions, measures, and join relationships directly in an AI/BI dashboard using a visual interface. Unlike Unity Catalog metric views, dashboard local metric views are scoped to the dashboard and do not require publishing to Unity Catalog. When your metrics are ready for broader use, you can export them to Unity Catalog as governed metric views.",2026-04-09T17:57:00.000Z,how-to,,0.3,False,"Describes dashboard-local metric views and their scope; mainly conceptual and workflow-focused, without numeric limits, config parameter tables, or decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/data-modeling/datasets,Datasets,Create and manage dashboard datasets - Azure Databricks,,Learn how to create and manage dashboard datasets.,This article explains how to create and manage dashboard datasets using the dataset editor in an AI/BI dashboard.,2026-04-22T08:00:00.000Z,how-to,,0.3,False,"Explains creating and managing dashboard datasets using the editor; summary suggests step-by-step usage, not detailed configuration matrices or limits.",updated +https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/data-modeling/local-metric-views,Local metric views,Local metric views - Azure Databricks,,"Create and use metric views scoped to an AI/BI dashboard, without publishing to Unity Catalog first.","Important This feature is in Public Preview. Local metric views enable you to define dimensions, measures, and join relationships directly in an AI/BI dashboard using a visual interface. Local metric views retain all of the benefits of semantic objects, such as accurate metric aggregations regardless of groupings and query-time transformation rather than pre-processing, while keeping the semantic logic contained within a single dashboard.
They provide a location to house semantic logic that doesn",2026-04-23T17:47:00.000Z,how-to,,0.3,False,"Describes local metric views conceptually and their benefits; no indication of numeric thresholds, config tables, or security/decision matrices.",new https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/filter-types,Filter types,Dashboard filter types - Azure Databricks,,Learn about the types of filters that you can apply on AI/BI dashboards.,Filters allow dashboard viewers to interact with a dashboard and dynamically adjust the data displayed according to their needs. This article outlines the types of filters that you can configure on an AI/BI dashboard and has example configurations for filters.,2026-02-24T23:25:00.000Z,reference,,0.4,False,Overview of filter types with example configurations; more tutorial-like than a detailed config or limits reference.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/filters/,Filters and drill through,Use dashboard filters - Azure Databricks,,Learn how to use filter widgets on dashboards to focus on selected dataset fields and parameters.,Filters limit the data presented in dashboard visualizations so that users can focus on data that meets specific criteria. This page explains available filter types and how to work with them.,2026-03-18T08:00:00.000Z,how-to,,0.3,False,"Describes how to use dashboard filter widgets conceptually and via UI steps; no product-specific limits, config tables, or error mappings.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/filters/,Filters and drill through,Use dashboard filters - Azure Databricks,,Learn how to use filter widgets on dashboards to focus on selected dataset fields and parameters.,Filters limit the data presented in dashboard visualizations so that users can focus on data that meets specific criteria. 
This page explains available filter types and how to work with them.,2026-04-21T08:00:00.000Z,how-to,,0.3,False,"Explains filter types and how to work with them; summary does not indicate numeric limits, parameter tables, or error/diagnostic mappings.",updated https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/filters/field-filters,Filter on fields,Filter on fields - Azure Databricks,,Learn how to filter dashboards using dataset fields.,"Field filters narrow data by specific dataset fields. For example, a field filter might limit data to a particular date range based on a date field in a dataset. Field filters can be connected to one or more datasets, allowing dynamic changes to available filter values based on selections. To connect a filter to fields from more than one dataset, add multipleFields, up to one per dataset. The filter applies to all visualizations built on the selected datasets. Selecting a value for one filter dy",2026-02-23T23:41:00.000Z,how-to,,0.5,False,Field filter behavior and multi-dataset connections; likely conceptual and UI-based without structured configuration tables.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/filters/parameters,Parameters,Work with dashboard parameters - Azure Databricks,,Make dashboards interactive using parameters. Enable viewers to input specific values into dataset queries at runtime.,"This page explains how to use parameters on AI/BI dashboards. If you want to learn about field filters instead, seeFilter on fields. AI/BI dashboard parameters let you substitute different values into dataset queries at runtime. This allows you to filter data by criteria such as dates and product categories before data is aggregated in a SQL query, leading to more efficient querying and precise analysis. 
Parameters can be used with filter widgets to make dashboards interactive or with visualizat",2026-03-25T08:00:00.000Z,how-to,,0.3,False,Explains dashboard parameters and interactivity; appears to be usage/tutorial content without detailed configuration tables or limits.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/filters/parameters,Parameters,Work with dashboard parameters - Azure Databricks,,Make dashboards interactive using parameters. Enable viewers to input specific values into dataset queries at runtime.,"This page explains how to use parameters on AI/BI dashboards. If you want to learn about field filters instead, see Filter on fields. AI/BI dashboard parameters let you substitute different values into dataset queries at runtime. This allows you to filter data by criteria such as dates and product categories before data is aggregated in a SQL query, leading to more efficient querying and precise analysis. Parameters can be used with filter widgets to make dashboards interactive or with visualizat",2026-04-22T17:34:00.000Z,how-to,,0.4,False,"Describes using parameters to make dashboards interactive; focuses on runtime substitution and filtering, but no explicit config tables or limits are indicated.",updated
Note Dashboard settings are only accessible from draft dashboards. This page explains how to customize dashboard settings and themes to match your o",2026-03-11T18:04:00.000Z,reference,configuration,0.65,True,Dashboard-level theme and locale settings are product-specific configuration; page likely enumerates setting names and allowed values.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/visualizations/,Visualizations,Dashboard visualizations - Azure Databricks,,Learn how to configure charts for AI/BI dashboards.,This page explains how to use and customize visualization widgets on AI/BI dashboards.,2026-04-17T18:03:00.000Z,how-to,,0.3,False,Explains how to configure dashboard visualizations; appears to be UI/tutorial style without detailed config parameter tables or numeric constraints.,updated +https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/visualizations/,Visualizations,Dashboard visualizations - Azure Databricks,,Learn how to configure charts for AI/BI dashboards.,This page explains how to use and customize visualization widgets on AI/BI dashboards.,2026-04-21T08:00:00.000Z,how-to,,0.3,False,"Covers how to use and customize visualization widgets; likely UI-driven instructions without expert-only limits, quotas, or config reference tables.",updated https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/visualizations/maps,Map visualizations,Map visualizations - Azure Databricks,Configure map visualizations in Databricks AI/BI dashboards,Learn about map visualization configuration options for AI/BI dashboards,The map visualizations display results on a geographic map. 
The query result set must include the appropriate geographic data:,2026-04-02T22:35:00.000Z,reference,configuration,0.6,True,"Focuses on map visualization configuration options and required geographic data; such pages typically enumerate specific settings and allowed values for map types, layers, and data fields, which is product-specific configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/visualizations/tables,Table visualizations,Table and pivot table visualizations - Azure Databricks,,Learn about table and pivot table visualization configuration options in AI/BI dashboards,"You can use table and pivot table visualizations to present data in a structured format with customizable elements such as custom and conditional formatting, hyperlinks, and images. This page explains how to control data presentation in table and pivot table visualizations. For examples of creating pivot visualizations, seePivot visualization.",2026-03-03T08:00:00.000Z,reference,,0.4,False,"Covers table and pivot visualization configuration; primarily UI-driven formatting options, not expert-only configuration or limits.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/visualizations/text-widgets,Text widgets,Text widgets - Azure Databricks,,"Learn how to use text widgets to add formatted text, links, and images to your dashboards.","Text widgets let you add formatted text, links, and images to your dashboards. Drag a text widget onto the canvas and double-click it to start editing. Use the formatting toolbar to style content in the following ways: You can also use markdown syntax to edit text. Click thekebab menu >Show markdownto view the text widget in markdown syntax. For details about basic markdown syntax, seethe Markdown guide. The following example demonstrates the formatting capabilities available in text widgets. 
Yo",2026-02-23T23:41:00.000Z,how-to,,0.3,False,"Text widget usage and markdown formatting; generic and UI-focused, not product-specific expert configuration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/visualizations/types,Visualization types,AI/BI dashboard visualization types - Azure Databricks,,"Learn about the types of visualizations available in AI/BI dashboards, including area, bar, line, maps, sankey, and more. For notebook and SQL visualizations, see notebook visualization types.","This page outlines the types of visualizations available to use in AI/BI dashboards and shows you how to create an example of each visualization type. For instructions on building a dashboard, seeCreate a dashboard. You can use natural language to prompt Genie Code to create bar, line, point map, scatter, pie, and counter charts. SeeUse Genie Code for dashboard authoring. Important This page covers visualizations forAI/BI dashboards. For visualizations in Azure Databricks notebooks and the SQL e",2026-03-09T08:00:00.000Z,reference,,0.4,False,"Catalog of visualization types and examples; mostly conceptual/how-to, not deep config or limits.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/visualizations/types,Visualization types,AI/BI dashboard visualization types - Azure Databricks,,"Learn about the types of visualizations available in AI/BI dashboards, including area, bar, line, maps, sankey, and more. For notebook and SQL visualizations, see notebook visualization types.","This page outlines the types of visualizations available to use in AI/BI dashboards and shows you how to create an example of each visualization type. For instructions on building a dashboard, seeCreate a dashboard. You can use natural language to prompt Genie Code to create bar, line, point map, scatter, pie, and counter charts. SeeUse Genie Code for dashboard authoring. Important This page covers visualizations forAI/BI dashboards. 
For visualizations in Azure Databricks notebooks and the SQL e",2026-04-21T18:07:00.000Z,reference,,0.3,False,"Lists visualization types and examples; appears as feature usage documentation rather than detailed configuration, limits, or troubleshooting guidance.",updated https://learn.microsoft.com/en-us/azure/databricks/dashboards/monitor-usage,Govern and audit,Monitor dashboard usage - Azure Databricks,Query Databricks audit logs to monitor dashboard usage,Learn how to monitor dashboard usage with audit logs and system tables.,"Important This feature is inPublic Preview. This page has sample queries that admins can use to monitor activity associated with dashboards. All queries access the audit logs table, which is a system table that stores records for all audit events from workspaces in your region. Account admins have access to system tables by default. To grant access to other users, seeGrant access to system tables. SeeMonitor account activity with system tables. For a comprehensive reference of available audit lo",2026-02-23T23:41:00.000Z,how-to,configuration,0.7,True,"Provides concrete sample SQL against system tables and references specific audit log schemas and fields, which are expert configuration/usage details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dashboards/share/embedding/,Embed dashboards,Embed a dashboard - Azure Databricks,Embed Azure Databricks dashboards in external apps,Learn how to embed an AI/BI dashboard into an external application.,This page shows how to embed an AI/BI dashboard in an external website or application.,2026-03-12T08:00:00.000Z,landing-page,integrations,0.7,True,"Embedding dashboards requires specific embed URLs, parameters, and integration patterns unique to Databricks.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dashboards/share/embedding/basic,Basic embedding,Basic dashboard embedding - Azure Databricks,Implement basic iframe embedding for Databricks dashboards,Learn about basic 
embedding to control access and manage permissions for embedded dashboards directly from an external application.,This page shows how to embed a dashboard as an iframe in an external application. Viewers access the dashboard using their Azure Databricks credentials.,2026-02-23T23:41:00.000Z,how-to,integrations,0.7,True,Describes iframe-based embedding with Databricks-specific URLs and access behavior; this is a concrete integration pattern.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/dashboards/share/embedding/external-embed,Embedding for external users,What is embedding for external users? - Azure Databricks,Securely embed Databricks dashboards for external users,Learn how to implement embedding for external users to control access and manage permissions for embedded dashboards from an external application.,"This page describes how embedding for external users works, how to configure your Azure Databricks workspace for secure sharing of embedded dashboards, and how to use sample applications to get started. Embedding for external users uses a service principal and scoped access tokens to authenticate and authorize access to embedded dashboards. This approach lets you share dashboards with viewers outside of your organization, such as partners and customers, without provisioning Azure Databricks acco",2026-03-12T08:00:00.000Z,how-to,security,0.8,True,Details use of service principals and scoped access tokens for external embedding; includes product-specific auth and permission configuration.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/dashboards/share/schedule-subscribe,Schedules and subscriptions,Manage scheduled dashboard updates and subscriptions - Azure Databricks,,Learn how to manage scheduled dashboard updates and subscriptions for published AI/BI dashboards.,"Scheduling dashboard updates ensures that your dashboards display up-to-date information and improves performance for end users. 
Subscriptions keep stakeholders informed by sending dashboard snapshots to email, Slack, or Microsoft Teams. This page explains how to set up and manage subscriptions.",2026-04-16T17:44:00.000Z,how-to,,0.4,False,"Describes scheduling updates and subscriptions conceptually; summary does not indicate specific numeric schedules, quotas, or config tables.",updated -https://learn.microsoft.com/en-us/azure/databricks/dashboards/share/share,Publishing,Share a dashboard - Azure Databricks,,Learn how to publish and share insights with your team using dashboards.,This article explains how to publish a dashboard and share it with users in your workspace or account.,2026-04-13T08:00:00.000Z,how-to,,0.3,False,"Covers how to publish and share dashboards; likely procedural sharing steps without limits, error mappings, or detailed configuration matrices.",updated +https://learn.microsoft.com/en-us/azure/databricks/dashboards/share/embedding/external-embed,Embedding for external users,What is embedding for external users? - Azure Databricks,Securely configure external dashboard embedding in Azure Databricks,Learn how to implement embedding for external users to control access and manage permissions for embedded dashboards from an external application.,"This page describes how embedding for external users works, how to configure your Azure Databricks workspace for secure sharing of embedded dashboards, and how to use sample applications to get started. Embedding for external users uses a service principal and scoped access tokens to authenticate and authorize access to embedded dashboards. This approach lets you share dashboards with viewers outside of your organization, such as partners and customers, without provisioning Azure Databricks acco",2026-04-21T08:00:00.000Z,how-to,security,0.68,True,"Describes product-specific security configuration for embedding dashboards for external users, including use of service principals, scoped access tokens, and workspace settings. 
This is concrete IAM/auth configuration unique to Azure Databricks embedding rather than a generic overview.",updated +https://learn.microsoft.com/en-us/azure/databricks/dashboards/share/schedule-subscribe,Schedules and subscriptions,Manage scheduled dashboard updates and subscriptions - Azure Databricks,,Learn how to manage scheduled dashboard updates and subscriptions for published AI/BI dashboards.,"Scheduling dashboard updates ensures that your dashboards display up-to-date information and improves performance for end users. Subscriptions keep stakeholders informed by sending dashboard snapshots to email, Slack, or Microsoft Teams. This page explains how to set up and manage subscriptions.",2026-04-16T17:44:00.000Z,how-to,,0.4,False,"Describes scheduling updates and subscriptions conceptually; summary does not indicate specific numeric schedules, quotas, or config tables.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dashboards/share/share,Publishing,Share a dashboard - Azure Databricks,,Learn how to publish and share insights with your team using dashboards.,This article explains how to publish a dashboard and share it with users in your workspace or account.,2026-04-21T08:00:00.000Z,how-to,,0.2,False,"How-to for publishing and sharing dashboards; likely basic sharing steps without detailed RBAC role lists, security scopes, or other expert-only details.",updated https://learn.microsoft.com/en-us/azure/databricks/dashboards/tutorials/,Tutorials,Dashboard tutorials - Azure Databricks,,Learn how to build and manage AI/BI dashboards using the Azure Databricks UI and REST API.,Follow along with tutorials designed to teach you build and manage AI/BI dashboards.,2026-04-10T18:06:00.000Z,landing-page,,0.1,False,"Tutorials index page; primarily navigation to how-to guides without consolidated limits, configs, or decision criteria.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dashboards/tutorials/create-dashboard,Create a 
dashboard,Create a dashboard - Azure Databricks,,Create and share AI/BI dashboards on Azure Databricks using the UI. Define the dataset using a SQL query or by choosing a Unity Catalog table.,"Learn how to use the AI/BI dashboard UI to create and share insights. For information about dashboard features, seeDashboards. The steps in this tutorial demonstrate how to build and share the following dashboard: You can also create a dashboard using the Dashboard Authoring Agent, seeUse Genie Code for dashboard authoring.",2026-03-10T23:23:00.000Z,tutorial,,0.35,False,Step-by-step UI tutorial for creating a dashboard; mostly basic usage without detailed configuration matrices or platform-specific constraints.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dashboards/tutorials/dashboard-crud-api,Manage dashboards with AI/BI APIs,Use dashboard APIs to create and manage dashboards - Azure Databricks,Use Databricks dashboard REST APIs for management,"Use the Dashboard API to manage dashboards, access, and permissions.","The Databricks REST API includes management tools specifically for managing AI/BI dashboards. This page demonstrates how to use API tools to create and manage dashboards. To do these tasks using the UI, seeAuthor dashboards. Note AI/BI dashboards were previously known as Lakeview dashboards. The Lakeview API still retains that name.",2026-04-09T08:00:00.000Z,tutorial,integrations,0.65,True,"Describes Databricks REST API for dashboard CRUD and permissions. 
Likely includes endpoint names, request/response parameters, and product-specific API behavior, which fits integrations & coding patterns.",unchanged @@ -384,12 +385,12 @@ https://learn.microsoft.com/en-us/azure/databricks/dashboards/tutorials/query-ba https://learn.microsoft.com/en-us/azure/databricks/dashboards/tutorials/workspace-dashboard-api,Manage dashboards with Workspace APIs,Manage dashboards with Workspace APIs - Azure Databricks,Manage AI/BI dashboards via Workspace and Lakeview APIs,Use the Databricks REST APIs to create and manage dashboards.,"This tutorial demonstrates how to manage dashboards using the Lakeview API and Workspace API. Each step includes a sample request and response, and explanations about how to use the API tools and properties together. Each step can be referenced on its own. Following all steps in order guides you through a complete workflow. Note This workflow calls the Workspace API to retrieve an AI/BI dashboard as a generic workspace object. AI/BI dashboards were previously known as Lakeview dashboards. The La",2026-02-23T08:00:00.000Z,tutorial,integrations,0.7,True,"Provides concrete REST API request/response examples and property usage for Workspace and Lakeview APIs, including specific fields and patterns for managing dashboards, which is product-specific integration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/data-engineering/,Overview,Data engineering with Databricks - Azure Databricks,,Learn about engineering data pipelines with Databricks.,"Databricks provides Lakeflow, an end-to-end data engineering solution that empowers data engineers, software developers, SQL developers, analysts, and data scientists to deliver high-quality data for downstream analytics, AI, and operational applications. 
Lakeflow is a unified solution for ingestion, transformation, and orchestration of your data, and includes Lakeflow Connect, Lakeflow Spark Declarative Pipelines, and Lakeflow Jobs.",2026-01-23T08:00:00.000Z,concept-article,,0.1,False,"High-level overview of Lakeflow and data engineering; no detailed settings, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/data-engineering/batch-vs-streaming,Batch vs. streaming,Batch vs. streaming data processing in Azure Databricks - Azure Databricks,,Learn about the differences between batch and streaming data processing on the Azure Databricks platform.,"This article describes the key differences between batch and streaming, two different data processing semantics used for data engineering workloads, including ingestion, transformation, and real-time processing. Streaming is commonly associated with low-latency and continuous processing from message buses, such as Apache Kafka. However, in Azure Databricks it has a more expansive definition. The underlying engine of Lakeflow Spark Declarative Pipelines (Apache Spark and Structured Streaming) has",2026-01-23T08:00:00.000Z,concept-article,,0.1,False,Conceptual explanation of batch vs streaming semantics; no concrete product-specific thresholds or configs.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/data-engineering/best-practices,Overview,Data engineering best practices - Azure Databricks,,Learn about data engineering best practices in Databricks.,"The following articles provide best practices for data engineering in Azure Databricks. 
For links to other best practices articles, includingCI/CD workflows best practices, seeBest practices articles.",2026-03-06T19:12:00.000Z,concept-article,,0.3,False,"High-level pointer page that links to other best practices; summary does not show concrete product-specific configs, numbers, or patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/data-engineering/best-practices,Overview,Data engineering best practices - Azure Databricks,,Learn about data engineering best practices in Databricks.,"The following articles provide best practices for data engineering in Azure Databricks. For links to other best practices articles, including CI/CD workflows best practices, see Best practice.",2026-04-21T22:28:00.000Z,concept-article,,0.1,False,"This is a hub/overview page that links to other best practices articles and does not itself contain concrete product-specific recommendations, numeric thresholds, or configuration details. It is primarily navigational and conceptual.",updated
This page describes both patterns and demonstrates how to implement them in Lakeflow Spark Declarative Pipelines.",2026-03-04T19:15:00.000Z,best-practice,architecture-patterns,0.7,True,"Describes how to realize fan-in/fan-out specifically in Lakeflow Spark Declarative Pipelines, including Databricks-specific constructs and wiring patterns, making it a product-specific architecture pattern guide.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/data-engineering/fan-in-fan-out,Fan-in and fan-out architecture,Fan-in and fan-out architecture in Lakeflow Spark Declarative Pipelines - Azure Databricks,Implement fan-in and fan-out in Lakeflow pipelines,Learn how to implement fan-in and fan-out architecture patterns in Lakeflow Spark Declarative Pipelines.,"Fan-in and fan-out are common patterns in modern data engineering for building scalable, reliable pipelines. This page describes both patterns and demonstrates how to implement them in Lakeflow Spark Declarative Pipelines.",2026-04-21T08:00:00.000Z,best-practice,architecture-patterns,0.7,True,"Describes concrete fan-in/fan-out implementation patterns specifically for Lakeflow Spark Declarative Pipelines, including how to structure and orchestrate stages. This is product-specific architecture guidance beyond generic fan-in/fan-out concepts.",updated https://learn.microsoft.com/en-us/azure/databricks/data-engineering/observability-best-practices,Observability,"Observability in Azure Databricks for jobs, Lakeflow Spark Declarative Pipelines, and Lakeflow Connect - Azure Databricks",Apply observability best practices for Databricks jobs and pipelines,"Learn about Observability in Azure Databricks for jobs, Lakeflow Spark Declarative Pipelines, and Lakeflow Connect","Monitoring your streaming applications' performance, cost, and health is essential to building reliable, efficient ETL pipelines. 
Azure Databricks provides a rich set of observability features across Jobs, Lakeflow Spark Declarative Pipelines, and Lakeflow Connect to help diagnose bottlenecks, optimize performance, and manage resource usage and costs. This article outlines best practices in the following areas:",2026-01-23T08:00:00.000Z,concept-article,best-practices,0.7,True,"Details Databricks-specific observability features and recommended configurations for monitoring jobs, Lakeflow pipelines, and Connect, including product-specific metrics and tooling usage.",unchanged https://learn.microsoft.com/en-us/azure/databricks/data-engineering/procedural-vs-declarative,Procedural vs. declarative,Procedural vs. declarative data processing in Azure Databricks - Azure Databricks,,Learn about the differences between procedural and declarative data processing on Azure Databricks.,"This article covers the differences between procedural and declarative programming and their usage in Databricks. Procedural and declarative programming are two fundamental programming paradigms in computer science. Each represents a different approach to structuring and executing instructions. When designing data pipelines, engineers must choose between procedural and declarative data processing models. This decision impacts workflow complexity, maintainability, and efficiency. This page explai",2026-01-23T08:00:00.000Z,concept-article,,0.1,False,Conceptual comparison of procedural vs declarative processing; no Databricks-specific configs or limits.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/data-engineering/schema-evolution,Schema evolution,Schema evolution in Azure Databricks - Azure Databricks,,Learn how schemas evolve in Azure Databricks datasets and how to get the results you want when they do.,"Schema evolutionrefers to a system's ability to adapt to changes in the structure of data over time. 
These changes are common when working with semi-structured data, event streams, or third-party sources where new fields are added, data types shift, or nested structures evolve. Common changes include: Supporting schema evolution is critical for building resilient, long-running pipelines that can accommodate changing data without frequent manual updates.",2026-04-01T08:00:00.000Z,concept-article,,0.3,False,"Conceptual explanation of schema evolution and its importance; summary does not indicate product-specific configuration tables, limits, or error mappings.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/data-engineering/schema-evolution,Schema evolution,Schema evolution in Azure Databricks - Azure Databricks,,Learn how schemas evolve in Azure Databricks data sets and how to get the results you want when they do.,"Schema evolution refers to a system's ability to adapt to changes in the structure of data over time. These changes are common when working with semi-structured data, event streams, or third-party sources where new fields are added, data types shift, or nested structures evolve. 
Common changes include: Supporting schema evolution is critical for building resilient, long-running pipelines that can accommodate changing data without frequent manual updates.",2026-04-21T08:00:00.000Z,concept-article,,0.2,False,"Primarily conceptual explanation of schema evolution and its importance; summary does not indicate product-specific configuration values, limits, or detailed patterns beyond general concepts.",updated https://learn.microsoft.com/en-us/azure/databricks/data-engineering/tables-views,Tables and views,Tables and views in Azure Databricks - Azure Databricks,,"Learn about the differences between tables, views, streaming tables, and materialized views in Azure Databricks.","This article gives an overview of tables, views, streaming tables, and materialized views in Azure Databricks.",2026-01-23T08:00:00.000Z,concept-article,,0.1,False,Overview of tables and views; lacks detailed configuration tables or numeric constraints.,unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/,Overview,Data governance with Azure Databricks - Azure Databricks,,"Learn about data governance in Azure Databricks, with a focus on Unity Catalog.","Data governance is a framework of policies, processes, roles, and technical controls that ensures your organization's data is secure, trustworthy, and used responsibly throughout its lifecycle. Effective data governance enables you to maintain data quality, protect sensitive information, meet regulatory requirements, and maximize the value of your data assets. Key components of data governance include: This page focuses on the governance of data using Unity Catalog in Azure Databricks. 
Related se",2026-03-12T08:00:00.000Z,overview,,0.2,False,High-level overview of data governance concepts and Unity Catalog; primarily conceptual framework rather than detailed configuration or troubleshooting content.,unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/table-acls/,Legacy table access control overview,Hive metastore table access control (legacy) - Azure Databricks,Configure Hive metastore table ACLs in Databricks,Learn how to use table access control to grant and deny access to your data.,"Each Azure Databricks workspace deploys with a built-in Hive metastore as a managed service. An instance of the metastore deploys to each cluster and securely accesses metadata from a central per-workspace repository. By default, a cluster allows all users to access all data managed by the workspace's built-in Hive metastore unless table access control is enabled for that cluster. Table access control lets you programmatically grant and revoke access to objects in your workspace's Hive metastore",2025-05-09T19:58:00.000Z,overview,security,0.75,True,"Describes table access control for the built-in Hive metastore, including how to programmatically grant/revoke access; this involves product-specific security/privilege configuration.",unchanged @@ -404,13 +405,13 @@ https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/access-control,Access control in Unity Catalog,Access control in Unity Catalog - Azure Databricks,Understand and configure access control in Unity Catalog,"Learn how access control works in Unity Catalog, including privileges, ABAC policies, object ownership, and data-level restrictions.","This page has an overview of access control in Unity Catalog, including privileges, policies, and data-level controls.",2026-03-26T08:00:00.000Z,concept-article,security,0.7,True,"Overview of privileges, ABAC policies, and data-level 
restrictions; likely enumerates specific privilege names and policy constructs that define the product’s security model.",unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/access-control/permissions-concepts,Permissions concepts,Unity Catalog permissions model concepts - Azure Databricks,Learn Unity Catalog permissions model and inheritance,"Learn how the Unity Catalog permissions model works, including the object hierarchy, privileges, ownership, usage privileges, and inheritance.","This page explains core concepts of the Unity Catalog permissions model, including the object model, privileges, ownership, and inheritance. For a general reference of all Unity Catalog privileges, see Unity Catalog privileges reference. For instructions on granting and revoking privileges, see Show, grant, and revoke privileges.",2026-03-31T23:28:00.000Z,concept-article,security,0.75,True,"Describes object hierarchy, privileges, ownership, and inheritance; contains detailed, product-specific permission semantics beyond generic RBAC concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/access-control/privileges-reference,Privileges reference,Unity Catalog privileges reference - Azure Databricks,Reference Unity Catalog privileges and applicable objects,Learn about each privilege in Unity Catalog.,"This page is a reference for Unity Catalog privileges and the securable objects they apply to. For detailed descriptions of each securable object type, see Unity Catalog securable objects reference. To learn how to grant privileges in Unity Catalog, see Show, grant, and revoke privileges. Note This page refers to the Unity Catalog privileges and inheritance model in Privilege Model version 1.0. 
If you created your Unity Catalog metastore during the public preview (before August 25, 2022), you migh",2026-03-31T23:28:00.000Z,reference,security,0.8,True,A privilege reference page listing each privilege and applicable securable objects is inherently detailed security configuration knowledge with specific role/privilege names.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/ai-governance,AI governance,AI governance - Azure Databricks,,Learn how Databricks governs AI assets and AI traffic using Unity Catalog and AI Gateway.,"AI governance extends the data governance capabilities of Unity Catalog to AI resources, applying the same access control, lineage, and audit model that protects your data assets to your AI assets and AI traffic.",2026-04-15T22:08:00.000Z,overview,,0.2,False,"High-level AI governance overview; lacks concrete RBAC role lists, config parameters, or detailed decision matrices.",new -https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/best-practices,Unity Catalog best practices,Unity Catalog best practices - Azure Databricks,Apply Unity Catalog data governance best practices,Learn best practices for setting up data governance and data isolation in Azure Databricks using Unity Catalog.,"This document provides recommendations for using Unity Catalog to meet your data governance needs most effectively. For an introduction to data governance on Azure Databricks, see Data governance with Azure Databricks. For an introduction to Unity Catalog, see What is Unity Catalog?.",2026-04-09T08:00:00.000Z,best-practice,best-practices,0.7,True,"The page is explicitly a best practices guide for using Unity Catalog to meet data governance needs and data isolation on Azure Databricks. Such guidance is typically product-specific, including concrete recommendations on how to structure catalogs, schemas, permissions, and isolation patterns that go beyond generic governance theory. 
This aligns with the best-practices sub-skill type.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/ai-governance,AI governance,AI governance - Azure Databricks,,Learn how Databricks governs AI assets and AI traffic using Unity Catalog and AI Gateway.,"AI governance extends the data governance capabilities of Unity Catalog to AI resources, applying the same access control, lineage, and audit model that protects your data assets to your AI assets and AI traffic.",2026-04-15T22:08:00.000Z,overview,,0.2,False,"High-level AI governance overview; lacks concrete RBAC role lists, config parameters, or detailed decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/best-practices,Unity Catalog best practices,Unity Catalog best practices - Azure Databricks,Apply Unity Catalog governance best practices in Azure Databricks,Learn best practices for setting up data governance and data isolation in Azure Databricks using Unity Catalog.,"This document provides recommendations for using Unity Catalog to meet your data governance needs most effectively. For an introduction to data governance on Azure Databricks, see Data governance with Azure Databricks. For an introduction to Unity Catalog, see What is Unity Catalog?.",2026-04-20T08:00:00.000Z,best-practice,best-practices,0.78,True,"The page explicitly focuses on best practices for configuring Unity Catalog for data governance and isolation. Unity Catalog is product-specific, and such guidance typically includes concrete recommendations (for example, how to structure catalogs/schemas, how to separate environments, how to assign permissions) and gotchas unique to Unity Catalog on Azure Databricks. 
This aligns with the best-practices category as product-specific DO/DON'T guidance rather than generic theory.",updated https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/certify-deprecate-data,Flag certified and deprecated data,Flag data as certified or deprecated - Azure Databricks,Apply certified and deprecated tags in Unity Catalog,Learn how to apply certification and deprecation system tags to Unity Catalog securable objects to indicate trust or lifecycle status.,This page shows how to apply system tags to objects to mark them as certified or deprecated.,2026-03-09T08:00:00.000Z,how-to,configuration,0.7,True,"Describes how to apply specific system tags to Unity Catalog securable objects, including tag names and usage semantics, which are product-specific configuration operations.",unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/create-metastore,Create a Unity Catalog metastore,Create a Unity Catalog metastore - Azure Databricks,Create and link Unity Catalog metastores in Databricks,Learn how to create a Unity Catalog metastore for your Azure Databricks account and link it to workspaces.,"This page shows how to create a Unity Catalog metastore and link it to workspaces. Important For workspaces that were enabled for Unity Catalog automatically, the instructions in this page are unnecessary. Databricks began to enable new workspaces for Unity Catalog automatically on November 9, 2023, with a rollout proceeding gradually across accounts. You must follow the instructions in this page only if you have a workspace and don't already have a metastore in your workspace region. 
To determi",2025-06-17T21:13:00.000Z,how-to,configuration,0.8,True,"Shows how to create a metastore and link it to workspaces; this involves specific configuration fields, region constraints, and admin options that are detailed product configuration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-classification,Data classification,Data Classification - Azure Databricks,,Learn how to automatically classify and tag sensitive data in your catalog.,"Important This feature is in Public Preview. This page describes how to use Databricks Data Classification in Unity Catalog to automatically classify and tag sensitive data in your catalog. Data catalogs can have a vast amount of data, often containing known and unknown sensitive data. It is critical for data teams to understand what kind of sensitive data exists in each table so that they can both govern and democratize access to this data. To address this problem, Databricks Data Classification",2026-04-09T08:00:00.000Z,how-to,,0.3,False,"The summary indicates a feature overview of Databricks Data Classification and its purpose (automatically classify and tag sensitive data). It does not clearly indicate the presence of configuration tables, role definitions, or other detailed parameters. It appears more conceptual/feature-usage oriented than a parameterized configuration or security reference, so it is not classified as expert knowledge under the given types.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-classification,Data classification,Data Classification - Azure Databricks,Configure automatic data classification in Unity Catalog,Learn how to automatically classify and tag sensitive data in your catalog.,"This page describes how to use Databricks Data Classification in Unity Catalog to automatically classify and tag sensitive data in your catalog. 
Data catalogs can have a vast amount of data, often containing known and unknown sensitive data. It is critical for data teams to understand what kind of sensitive data exists in each table so that they can both govern and democratize access to this data. To address this problem, Databricks Data Classification uses an AI agent to automatically classify ",2026-04-19T08:00:00.000Z,how-to,configuration,0.7,True,"Explains how to use Databricks Data Classification with Unity Catalog; this usually involves specific configuration steps, options, and possibly classifier settings unique to the service, which counts as expert configuration knowledge beyond generic data classification theory.",updated https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-classification-tags,Supported classification tags,Supported classification tags - Azure Databricks,Use supported Databricks data classification tags,"Reference list of classification tags supported by Databricks Data Classification, organized by global tags, regional tags, and compliance frameworks.","Important This feature is in Public Preview. Data Classification detects sensitive data and assigns system tags to matching columns. For a mapping of tags to compliance frameworks, see Compliance framework reference. For a full list of tags organized by global tags (which apply across all cloud regions) and regional tags (which detect data specific to certain countries or regions), see Global tags and Regional tags.",2026-03-16T08:00:00.000Z,reference,configuration,0.65,True,"Reference list of supported system classification tags, organized by global, regional, and compliance frameworks. 
The exact tag names and their organization are product-specific configuration/metadata details that an LLM would not reliably know from training.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-lineage,Overview of data lineage,View data lineage using Unity Catalog - Azure Databricks,View and analyze data lineage with Unity Catalog,Learn how to use Unity Catalog to view and analyze data lineage.,This page describes how to visualize data lineage using Catalog Explorer and the data lineage system tables.,2026-03-24T08:00:00.000Z,how-to,configuration,0.6,True,"Describes how to use Catalog Explorer and lineage system tables; likely includes specific table names, UI options, and query patterns that are product-specific configuration/usage details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-lineage,Overview of data lineage,View data lineage using Unity Catalog - Azure Databricks,View and configure Unity Catalog data lineage,Learn how to use Unity Catalog to view and analyze data lineage.,This page describes how to visualize data lineage using Catalog Explorer and the data lineage system tables.,2026-04-22T21:32:00.000Z,how-to,configuration,0.7,True,"Describes how to use Catalog Explorer and lineage system tables; such a page typically includes concrete UI options, lineage query patterns, and table/column names for system tables, which are product-specific configuration/usage details rather than generic concepts.",updated https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/,Data quality monitoring,Data quality monitoring - Azure Databricks,,"Learn about data quality monitoring, formerly known as Lakehouse Monitoring. Anomaly detection at scale (catalog/schema level) and data profiling at table-level.","Data quality monitoring helps you ensure the quality of all of your data assets in Unity Catalog. 
Data quality monitoring includes the following capabilities: Data profiling was formerly known as Lakehouse Monitoring. To draw useful insights from your data, you must have confidence in the quality of your data. Anomaly detection monitors enabled tables for freshness and completeness. Freshness refers to how recently a table has been updated. Anomaly detection analyzes the history of commits to a tabl",2026-03-12T23:05:00.000Z,overview,,0.45,False,High-level overview of data quality monitoring capabilities; summary suggests conceptual description of anomaly detection and profiling rather than detailed config tables or limits.,unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/anomaly-detection/,Detect anomalies at scale,Anomaly detection - Azure Databricks,,Learn how to automatically monitor freshness and completeness of your tables based on historical data.,"Important This feature is in Public Preview. This page describes what anomaly detection is, what it monitors, and how to use it. Important Anomaly detection uses default storage to store scan results in the system.data_quality_monitoring.table_results system table. You are not billed for this storage.",2026-04-09T08:00:00.000Z,overview,,0.35,False,"The anomaly detection page describes what anomaly detection is, what it monitors, and how to use it, plus a billing note about default storage. From the summary, it looks like a feature explanation/usage guide rather than a limits table, configuration reference, or troubleshooting guide with specific error codes or parameters. 
Lacking clear evidence of detailed numeric thresholds or config tables, it is not classified as expert knowledge here.",unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/anomaly-detection/alerts,Set alerts,Alerts for anomaly detection - Azure Databricks,,Learn how to view and create alerts for anomaly detection in Databricks.,"This page describes how to set up and view alerts for anomaly detection. For an introduction to anomaly detection, see Anomaly detection. Alerts notify you when anomaly detection identifies a data quality issue, such as a stale table or an unexpected drop in row count. You can work with alerts in the following ways:",2026-03-20T17:45:00.000Z,how-to,,0.3,False,"High-level description of setting up and viewing alerts; summary does not indicate specific configuration parameters, limits, or error mappings.",unchanged
@@ -420,26 +421,26 @@
 https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/data-profiling/create-monitor-ui,Create a profile using the Databricks UI,Create a profile using the Databricks UI - Azure Databricks,Create and configure data profiles in Databricks UI,Learn how to create a data profile using the Databricks UI. Learn about the types of profile available and the parameters for each type.,"This article demonstrates how to create a data profile using the Databricks UI. You can also use the API. To access the Databricks UI, do the following: In the workspace left sidebar, click to open Catalog Explorer. Navigate to the table you want to profile. Click the Quality tab. If anomaly detection is not enabled for this schema, click Enable. If anomaly detection is enabled for this schema, click Configure. In the Data Quality Monitoring dialog, in the Data profiling field, click Configure. 
In the dialog, select",2026-03-12T23:05:00.000Z,how-to,configuration,0.7,True,"Walks through configuring data profiling monitors in the UI, including selectable profile types and parameters for each type, which are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/data-profiling/custom-metrics,Use custom metrics,Use custom metrics with data profiling - Azure Databricks,Define and use custom metrics in Databricks data profiling,Learn how to define custom metrics for data profiling. Custom metrics are useful to capture business logic that is not reflected in the built-in metrics.,"This page describes how to create a custom metric in data profiling. In addition to the analysis and drift statistics that are automatically calculated, you can create custom metrics. For example, you might want to track a weighted mean that captures some aspect of business logic or use a custom model quality score. You can also create custom drift metrics that track changes to the values in the primary table (compared to the baseline or the previous time window). For more details on how to use ",2026-03-12T23:05:00.000Z,how-to,configuration,0.7,True,"Explains how to define custom and drift metrics for profiling, including how they plug into the profiling system and metric tables, which are Databricks-specific configuration patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/data-profiling/expense,View expenses,View data quality monitoring expenses - Azure Databricks,Query Databricks data quality monitoring expenses,Learn how to view data quality monitoring expenses and distinguish anomaly detection and data profiling.,"To check data quality monitoring expenses, query the system table system.billing.usage. 
For more information on querying billing records, see Billable usage system table reference.",2026-03-12T23:05:00.000Z,how-to,configuration,0.65,True,"Shows how to query the system.billing.usage system table to distinguish anomaly detection vs profiling costs, which is a product-specific configuration/usage of billing tables.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/data-profiling/fairness-bias,Monitor fairness and bias for classification models,Monitor fairness and bias for classification models - Azure Databricks,,Learn how to monitor fairness and bias in predictions from classification models using data profiling in Unity Catalog.,"With data profiling, you can monitor the predictions of a classification model to see if the model performs similarly on data associated with different groups. For example, you can investigate whether a loan-default classifier generates the same false-positive rate for applicants from different demographics.",2026-04-17T18:03:00.000Z,how-to,,0.3,False,"Fairness/bias monitoring description is conceptual; no product-specific limits, configs, error codes, or decision matrices with quantified thresholds.",updated +https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/data-profiling/fairness-bias,Monitor fairness and bias for classification models,Monitor fairness and bias for classification models - Azure Databricks,,Learn how to monitor fairness and bias in predictions from classification models using data profiling in Unity Catalog.,"With data profiling, you can monitor the predictions of a classification model to see if the model performs similarly on data associated with different groups. 
For example, you can investigate whether a loan-default classifier generates the same false-positive rate for applicants from different demographics.",2026-04-17T18:03:00.000Z,how-to,,0.3,False,"Fairness/bias monitoring description is conceptual; no product-specific limits, configs, error codes, or decision matrices with quantified thresholds.",unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/data-profiling/monitor-alerts,Monitor alerts,Profile alerts - Azure Databricks,Configure Databricks SQL alerts for profile metrics,Learn how to create an alert based on data profiling.,"This page describes how to create a Databricks SQL alert based on a metric from a profile metrics table. Some common uses for profile alerts include: Profile alerts are created and used the same way as other Databricks SQL alerts. You create a Databricks SQL query on the profile metrics table or drift metrics table. You then create a Databricks SQL alert for this query. You can configure the alert to evaluate the query at a desired frequency, and send a notification if the alert is triggered. By d",2026-03-12T23:05:00.000Z,how-to,configuration,0.7,True,Shows how to build SQL queries on profile and drift metric tables and configure alert frequency and notifications; uses specific system tables and alert settings that are product-specific.,unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/data-profiling/monitor-dashboard,Use the data profiling dashboard,Data profiling dashboard - Azure Databricks,,Learn about the dashboard that is automatically created and updated by a data profile.,"This page describes the dashboard that is automatically created when a profile is run. For an introduction to data profiling, see Data profiling. When a profile runs, it creates a dashboard that displays key metrics computed by the profile. 
The visualizations included in the default dashboard configuration depend on the profile type, and the different metrics are organized into sections. The left side of the dashboard shows lists of the metrics and statistics included in the tables and charts. Th",2026-03-12T23:05:00.000Z,reference,,0.4,False,"Describes what the automatically created dashboard shows but appears more descriptive than configuration-focused; no clear parameter tables, limits, or decision matrices indicated in the summary.",unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/data-profiling/monitor-output,Profile metric tables,Data profiling metric tables - Azure Databricks,Understand Databricks data profiling metric tables,Learn about the metric tables and dashboard created by data profiling.,"This page describes the metric tables created by data profiling. For information about the dashboard created by a profile, see Data profiling dashboard. When a profile runs on a Databricks table, it creates or updates two metric tables: a profile metrics table and a drift metrics table.",2026-03-12T23:05:00.000Z,reference,configuration,0.65,True,"Explains the specific schema/structure and purpose of the profile metrics and drift metrics tables created by data profiling, which are product-specific configuration/usage details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/disable-hms,Disable the legacy Hive metastore,Disable access to the Hive metastore used by your Azure Databricks workspace - Azure Databricks,Disable direct Hive metastore access in Databricks workspaces,"Learn how to disable direct access to Hive metastores from your workspace, ensuring that all access and governance is provided by Unity Catalog.","This page describes how to disable direct access to the legacy Hive metastore that is used by your Azure Databricks workspace, whether the workspace-local Hive metastore or an 
external Hive metastore. When you've completed your Unity Catalog migration or federated your Hive metastore as a foreign catalog that is governed by Unity Catalog, you can use a simple workspace admin setting to prevent users from bypassing Unity Catalog and accessing tables registered in the Hive metastore. Data in Hive ",2026-01-20T08:00:00.000Z,how-to,configuration,0.7,True,"Explains a specific workspace admin setting to block Hive metastore access; this is a concrete configuration option with defined behavior, part of product-specific settings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/enable-workspaces,Enable a workspace for Unity Catalog,Enable a workspace for Unity Catalog - Azure Databricks,Configure Azure Databricks workspaces for Unity Catalog,Learn how to enable an Azure Databricks workspace for Unity Catalog by assigning a Unity Catalog metastore.,"This article explains how to enable a workspace for Unity Catalog by assigning a Unity Catalog metastore. This article applies only when you are upgrading an existing non-Unity Catalog workspace to use Unity Catalog. Important On November 9, 2023, Databricks started to enable new workspaces for Unity Catalog automatically, with a rollout proceeding gradually. If your workspace was enabled for Unity Catalog automatically, this article does not apply to you. To determine if your workspace is alrea",2026-03-17T08:00:00.000Z,how-to,configuration,0.7,True,"Describes how to enable an existing workspace for Unity Catalog by assigning a metastore, including product-specific configuration steps and conditions (e.g., automatic enablement after a certain date). 
This is concrete configuration guidance unique to Azure Databricks and Unity Catalog.",unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/external-lineage,Bring your own lineage,Bring your own data lineage - Azure Databricks,Ingest external data lineage into Unity Catalog,Learn how to add external lineage metadata to Unity Catalog so that workloads that run outside of Azure Databricks are captured in Unity Catalog data lineage.,"This page describes how to update data lineage to include external assets and workflows that are run outside of Azure Databricks. Unity Catalog automatically captures runtime data lineage across queries that are run on Azure Databricks. However, you might have workloads that run outside of Azure Databricks (for example, first mile ETL or last mile BI). Unity Catalog lets you add external lineage metadata to augment the Azure Databricks data lineage it captures automatically, giving you an end-to",2026-03-09T08:00:00.000Z,how-to,configuration,0.7,True,"Explains how to add external lineage metadata, including specific APIs or table structures for lineage updates; these are product-specific configuration/integration patterns.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/filters-and-masks/,Row and column filters,Row filters and column masks - Azure Databricks,Secure data with Unity Catalog row filters and column masks on Azure Databricks,Learn how to govern your data at the row and column level using row filters and column masks.,This page provides guidance for using row filters and column masks to filter sensitive data in your tables.,2026-04-03T22:07:00.000Z,concept-article,security,0.8,True,Provides concrete guidance on defining and applying row filters and column masks in Unity Catalog to protect sensitive data. 
This includes product-specific security policy constructs and behaviors.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/filters-and-masks/,Row and column filters,Row filters and column masks - Azure Databricks,Secure data with row filters and column masks in Unity Catalog,Learn how to govern your data at the row and column level using row filters and column masks.,This page provides guidance for using row filters and column masks to filter sensitive data in your tables.,2026-04-23T08:00:00.000Z,concept-article,security,0.7,True,"Provides guidance on using row filters and column masks to protect sensitive data at row/column level. This is product-specific data governance and security configuration, likely including specific SQL constructs and Unity Catalog behaviors.",updated https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/filters-and-masks/manually-apply,Add row filters and column masks,Manually apply row filters and column masks - Azure Databricks,Manually apply Unity Catalog row filters and column masks using mapping tables,Apply row filters and column masks using mapping tables.,"This page provides guidance and examples for using row filters, column masks, and mapping tables to filter sensitive data in your tables. These features require Unity Catalog. If you're looking for a centralized tag-based approach to filtering and masking, see Unity Catalog attribute-based access control (ABAC). ABAC enables you to manage policies using governed tags and apply them consistently across many tables.",2026-04-03T22:07:00.000Z,how-to,security,0.8,True,"Shows detailed, product-specific patterns for implementing row filters and column masks with mapping tables, including how they integrate with ABAC. 
These are concrete security configuration and policy application patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/get-started,Set up and manage Unity Catalog,Get started with Unity Catalog - Azure Databricks,,"Learn how to get started with Unity Catalog. This page is intended for Azure Databricks admins and power users, but it is also useful for end users.","This article explains how to get started with Unity Catalog to manage data in your Azure Databricks workspace. It is intended primarily for workspace admins who are using Unity Catalog for the first time. To set up Unity Catalog using the Databricks Terraform provider, see Automate Unity Catalog setup using Terraform. By the end of this article you will have: You can also review other introductory articles: Note If you want to upgrade an existing non-Unity-Catalog workspace to Unity Catalog, you ",2026-01-24T08:00:00.000Z,get-started,,0.4,False,‘Get started’ article is likely a procedural setup guide without dense configuration parameter tables or decision matrices; mostly onboarding steps.,unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/hive-metastore,Work with Hive metastore alongside Unity Catalog,Work with the legacy Hive metastore alongside Unity Catalog - Azure Databricks,Configure legacy Hive metastore alongside Unity Catalog,Learn how to work with the Azure Databricks workspace-level Hive metastore alongside the Unity Catalog metastore without using Hive metastore federation.,"This article explains one approach to continuing to use the legacy per-workspace Hive metastore when your Azure Databricks workspace is enabled for Unity Catalog. If your workspace was in service before it was enabled for Unity Catalog, it likely has a Hive metastore that contains data that you might want to continue to use. This article describes how to continue to work with tables that are registered in a Hive metastore. 
Important The per-workspace Hive metastore is a legacy feature, and the i",2026-03-26T08:00:00.000Z,how-to,configuration,0.65,True,"Explains a specific, non-obvious configuration pattern for using the legacy per-workspace Hive metastore with Unity Catalog. This is product-specific setup behavior that goes beyond generic concepts, fitting configuration/interop guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/jobs-update,Update jobs,Update jobs when you upgrade legacy workspaces to Unity Catalog - Azure Databricks,Update Databricks jobs after upgrading to Unity Catalog,Learn how to update jobs when you upgrade legacy Azure Databricks workspaces to Unity Catalog.,"When you upgrade legacy workspaces to Unity Catalog, you might need to update existing jobs to reference upgraded tables and filepaths. The main table on this page lists typical scenarios and suggestions for updating your jobs. Scenarios that require code examples link to the Detailed scenarios section. For a demo of updating jobs to Unity Catalog, see Upgrading a Job to Unity Catalog.",2026-03-31T23:28:00.000Z,reference,configuration,0.6,True,"Main table of scenarios and code changes for jobs when upgrading; contains concrete path and table-reference patterns unique to Unity Catalog migration, fitting configuration/coding adjustments.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-metastore,Manage metastores,Manage Unity Catalog metastores - Azure Databricks,Manage Unity Catalog metastore lifecycle and behavior,"Learn how to update, delete, and manage the behavior of Unity Catalog metastores in your Azure Databricks account.","This article shows how to update, delete, and manage the behavior of Unity Catalog metastores in your Azure Databricks account. 
To learn about Unity Catalog metastores and how to create them, see Create a Unity Catalog metastore.",2026-01-30T08:00:00.000Z,how-to,configuration,0.7,True,"Shows how to update, delete, and manage metastore behavior, which involves specific configuration options and admin controls not generally known.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-metastore,Manage metastores,Manage Unity Catalog metastores - Azure Databricks,Configure and manage Unity Catalog metastores,"Learn how to update, delete, and manage the behavior of Unity Catalog metastores in your Azure Databricks account.","This article shows how to update, delete, and manage the behavior of Unity Catalog metastores in your Azure Databricks account. To learn about Unity Catalog metastores and how to create them, see Create a Unity Catalog metastore.",2026-04-23T08:00:00.000Z,how-to,configuration,0.7,True,"A management page for Unity Catalog metastores typically includes product-specific settings (e.g., default catalog, region behavior, assignment rules) and concrete configuration options unique to Databricks Unity Catalog, which qualify as expert configuration knowledge rather than just conceptual overview.",updated https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/,About privilege management in Unity Catalog,Manage privileges in Unity Catalog - Azure Databricks,Manage Unity Catalog privileges and data access,"Learn to manage privileges in Unity Catalog, including managing metastore administrators, object ownership, and access to data.","This page explains how to control access to data and other objects in Unity Catalog. 
To learn about how this model differs from access control in the Hive metastore, see Work with the legacy Hive metastore alongside Unity Catalog.",2026-01-20T08:00:00.000Z,how-to,security,0.75,True,"Explains how to control access, manage metastore admins, and object ownership; includes product-specific privilege names and role behaviors that are detailed security configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/access-request-destinations,Manage access request destinations,Manage access requests - Azure Databricks,Configure and manage Unity Catalog access requests,Learn how to manage access requests on securable objects in Unity Catalog.,Important This feature is in Public Preview. The Request access feature allows users to request privileges for securable objects in Unity Catalog. This page explains how to configure access request destinations as an administrator. These destinations determine where access requests are sent when users request access to data objects.,2026-02-18T22:10:00.000Z,how-to,security,0.7,True,Explains how to configure destinations for access requests; involves product-specific security workflow settings and permission flows.,unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/admin-privileges,Admin privileges in Unity Catalog,Admin privileges in Unity Catalog - Azure Databricks,Understand admin privileges for Unity Catalog management,"Learn about the Unity Catalog metastore management privileges, the metastore admin role, and the Unity Catalog management privileges owned by account and workspace admins.","This article describes privileges that Azure Databricks account admins, workspace admins, and metastore admins have for managing Unity Catalog. 
Note If your workspace was enabled for Unity Catalog automatically, workspace admins have default privileges on the attached metastore and the workspace catalog, if a workspace catalog was provisioned. See Workspace admin privileges when workspaces are enabled for Unity Catalog automatically.",2026-03-27T08:00:00.000Z,reference,security,0.8,True,"Describes privileges of account, workspace, and metastore admins; includes specific role names and their scopes, which are product-specific security details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/allowlist,Allowlist libraries and init scripts on standard compute,Allowlist libraries and init scripts on compute with standard access mode (formerly shared access mode) - Azure Databricks,Configure Unity Catalog allowlist for Databricks compute,"Learn to use the allowlist object to allow init scripts, JAR files, and Maven coordinates on compute with standard access mode.","In Databricks Runtime 13.3 LTS and above, the allowlist in Unity Catalog controls which libraries and init scripts can run on standard access mode compute. This allows users to leverage these artifacts on compute configured with standard access mode. By default, the allowlist is empty. You cannot disable this feature. To modify the allowlist, you must have the MANAGE ALLOWLIST privilege. See MANAGE ALLOWLIST. You can add a directory or file to the allowlist even if it hasn't been created yet. 
See Work",2026-03-31T08:00:00.000Z,how-to,security,0.82,True,"Describes allowlist object, default behavior, and MANAGE ALLOWLIST privilege; this is product-specific security/privilege configuration.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/allowlist,Allowlist libraries and init scripts on standard compute,Allowlist libraries and init scripts on compute with standard access mode (formerly shared access mode) - Azure Databricks,Manage Unity Catalog allowlist for libraries and scripts,"Learn to use the allowlist object to allow init scripts, JAR files, and Maven coordinates on compute with standard access mode.","In Databricks Runtime 13.3 LTS and above, the allowlist in Unity Catalog controls which libraries and init scripts can run on standard access mode compute. This allows users to leverage these artifacts on compute configured with standard access mode. By default, the allowlist is empty. You cannot disable this feature. To modify the allowlist, you must have the MANAGE ALLOWLIST privilege. See MANAGE ALLOWLIST. You can add a directory or file to the allowlist even if it hasn't been created yet. See Work",2026-04-17T08:00:00.000Z,how-to,security,0.78,True,"Covers the Unity Catalog allowlist object, default behavior, and MANAGE ALLOWLIST privilege; this is product-specific security/permission configuration for controlling which artifacts can run on standard access mode compute.",updated https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/ownership,Object ownership,Manage Unity Catalog object ownership - Azure Databricks,Manage and transfer Unity Catalog object ownership,Learn how to view and transfer object ownership for Unity Catalog securable objects.,"Each securable object in Unity Catalog has an owner. The owner can be any principal: a user, service principal, or account group. The principal that creates an object becomes its initial owner. 
An object's owner has all privileges on the object, such as SELECT and MODIFY on a table, in addition to the permission to grant privileges to other principals. An object's owner has the ability to drop the object.",2026-04-09T08:00:00.000Z,how-to,security,0.8,True,"Describes Databricks-specific ownership rules, capabilities, and effects on privileges for Unity Catalog securable objects, which are detailed IAM semantics unique to this product.",unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/upgrade-privilege-model,Upgrade to privilege inheritance,Upgrade to privilege inheritance - Azure Databricks,Upgrade Unity Catalog to privilege inheritance model,Learn how to upgrade to the latest privilege model in Unity Catalog.,"If you created your Unity Catalog metastore during the public preview (before August 25, 2022), you can upgrade to Privilege Model version 1.0 to take advantage of privilege inheritance. Existing workloads will continue to operate as-is until you upgrade your privilege model. Databricks recommends upgrading to Privilege Model version 1.0 to get the benefits of privilege inheritance and new features.",2026-04-10T08:00:00.000Z,how-to,security,0.7,True,"Explains how to move to Unity Catalog Privilege Model 1.0, including behavior changes around privilege inheritance and upgrade implications, which are product-specific security/authorization details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/managed-versus-external,Managed versus external,Managed versus external assets in Unity Catalog - Azure Databricks,Choose managed vs external assets in Unity Catalog,"Learn about managed and external tables and volumes in Unity Catalog, including the key differences and when to use each.","Every securable object that you register in Unity Catalog is centrally governed. 
This means that Unity Catalog manages the object's metadata, allowing it to control all aspects of governance including access, auditing, and lineage. However, for data assets like tables and volumes, Unity Catalog can also control the storage location and lifecycle of the underlying data files in your cloud account, which includes how they are organized, optimized, and when they are deleted. This distinction is what ",2026-04-14T17:47:00.000Z,concept-article,decision-making,0.7,True,"The page focuses on when to use managed versus external tables and volumes in Unity Catalog and describes trade-offs around storage location, lifecycle, and governance. This is product-specific decision guidance about selecting between options rather than a generic concept, fitting the decision-making sub-skill. No numeric limits or config tables are indicated, so other categories are less appropriate.",new +https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/managed-versus-external,Managed versus external,Managed versus external assets in Unity Catalog - Azure Databricks,Choose managed vs external assets in Unity Catalog,"Learn about managed and external tables and volumes in Unity Catalog, including the key differences and when to use each.","Every securable object that you register in Unity Catalog is centrally governed. This means that Unity Catalog manages the object's metadata, allowing it to control all aspects of governance including access, auditing, and lineage. However, for data assets like tables and volumes, Unity Catalog can also control the storage location and lifecycle of the underlying data files in your cloud account, which includes how they are organized, optimized, and when they are deleted. 
This distinction is what ",2026-04-14T17:47:00.000Z,concept-article,decision-making,0.7,True,"The page focuses on when to use managed versus external tables and volumes in Unity Catalog and describes trade-offs around storage location, lifecycle, and governance. This is product-specific decision guidance about selecting between options rather than a generic concept, fitting the decision-making sub-skill. No numeric limits or config tables are indicated, so other categories are less appropriate.",unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/migrate,Upgrade Hive tables and views to Unity Catalog,Upgrade Hive tables and views to Unity Catalog - Azure Databricks,Upgrade Hive metastore tables and views to Unity Catalog,Learn how to upgrade tables and views in your Azure Databricks workspace-local Hive metastore to Unity Catalog.,"This article describes how to upgrade tables and views registered in your existing workspace-local Hive metastore to Unity Catalog. Note As an alternative to the table migration processes described in this article, you can use Hive metastore federation to create a catalog in Unity Catalog that mirrors your Hive metastore. See Hive metastore federation: enable Unity Catalog to govern tables registered in a Hive metastore. You can upgrade a Hive table either to a managed table or external table in Unit",2026-03-26T08:00:00.000Z,how-to,configuration,0.65,True,"Describes concrete processes to upgrade tables to managed or external Unity Catalog tables; likely includes specific commands, options, and constraints that are product-specific configuration patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/securable-objects,Securable objects,Unity Catalog securable objects reference - Azure Databricks,Reference Unity Catalog securable objects and permissions,Learn about securable objects in Unity Catalog.,"This page describes all securable objects in Unity Catalog. 
A securable object is an object defined in Unity Catalog on which privileges can be granted to a principal (user, service principal, or group).",2026-04-09T08:00:00.000Z,reference,security,0.8,True,"Lists all Unity Catalog securable object types and how privileges apply to them, including product-specific object categories and permission semantics that are not generic to databases.",unchanged https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/ucx,Migrate to Unity Catalog using UCX,Use the UCX utilities to upgrade your workspace to Unity Catalog - Azure Databricks,Use UCX utilities to automate Unity Catalog workspace upgrade,"Learn about UCX, the Databricks Labs project that helps you upgrade your non-Unity-Catalog workspace to Unity Catalog.","This article introduces UCX, a Databricks Labs project that provides tools to help you upgrade your non-Unity-Catalog workspace to Unity Catalog. Note UCX, like all projects in the databrickslabs GitHub account, is provided for your exploration only, and is not formally supported by Databricks with service-level agreements (SLAs). It is provided as-is. We make no guarantees of any kind. Do not submit a Databricks support ticket relating to issues that arise from the use of this project. 
Instead, ",2026-01-24T08:00:00.000Z,how-to,configuration,0.6,True,"Introduces UCX tools for upgrading workspaces; such tooling docs typically include specific commands, flags, and configuration parameters unique to this migration utility.",unchanged @@ -458,67 +459,72 @@ https://learn.microsoft.com/en-us/azure/databricks/dbfs/disable-dbfs-root-mounts https://learn.microsoft.com/en-us/azure/databricks/dbfs/mounts,Mounting object storage,Mounting cloud object storage on Azure Databricks - Azure Databricks,Migrate from DBFS mounts to Unity Catalog external locations,"Learn how to view, create, and manage tables and databases in Azure Databricks.","Important DBFS mounts are a deprecated pattern and not recommended by Databricks. New accounts are provisioned without access to these features. Databricks recommends using Unity Catalog external locations instead. Azure Databricks enables users to mount cloud object storage to the Databricks File System (DBFS) to simplify data access patterns for users that are unfamiliar with cloud concepts. Mounted data does not work with Unity Catalog, and Databricks recommends migrating away from using mounts",2026-02-23T08:00:00.000Z,how-to,best-practices,0.65,True,"Covers a deprecated pattern (DBFS mounts) and recommends migration to Unity Catalog external locations with Databricks-specific guidance on when and why to avoid mounts. This is actionable, product-specific best-practice content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dbfs/root-locations,Default locations,What are the root directories? - Azure Databricks,,Learn about special and default locations on the DBFS root storage location.,Important The DBFS root is deprecated. New accounts are provisioned without access to this feature. Databricks recommends using Unity Catalog volumes and workspace files instead. Azure Databricks historically used directories in the workspace root directory for common storage locations. Most of these locations are deprecated. 
/Volumes provides an alias for path-based access to data in Unity Catalog volumes. See What are Unity Catalog volumes?.,2026-02-23T08:00:00.000Z,concept-article,,0.3,False,"Primarily explains what the root directories are and mentions deprecation; appears to be conceptual/overview of locations without detailed configuration tables, limits, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dbfs/unity-catalog,Best practices: DBFS and Unity Catalog,Best practices for DBFS and Unity Catalog - Azure Databricks,Apply DBFS and Unity Catalog usage best practices,Learn how to securely use DBFS in Unity Catalog enabled Azure Databricks workspaces.,"Important Both DBFS root and DBFS mounts are deprecated and not recommended by Databricks. New accounts are provisioned without access to these features. Databricks recommends using Unity Catalog volumes, external locations, or workspace files instead. Unity Catalog introduces a number of new configurations and concepts that approach data governance entirely differently than DBFS. This article outlines several best practices around working with Unity Catalog external locations and DBFS. Databricks r",2026-02-23T08:00:00.000Z,best-practice,best-practices,0.7,True,"Described as outlining several best practices around working with Unity Catalog external locations and DBFS, focused on secure and recommended usage patterns specific to Azure Databricks. This is product-specific guidance rather than generic concepts.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/,Delta Sharing overview,What is Delta Sharing? 
- Azure Databricks,,Learn how to use Delta Sharing for secure data and AI asset sharing with users outside your organization or on different metastores within your Azure Databricks account.,"This page introduces Delta Sharing in Azure Databricks, the secure data sharing platform that lets you share data and AI assets in Azure Databricks with users outside your organization, regardless of whether they use Azure Databricks. Delta Sharing is also the basis for Databricks Marketplace, an open forum for exchanging data products, and Clean Rooms, a secure and privacy-protecting environment where multiple parties can work together on sensitive enterprise data. Delta Sharing is also available",2026-04-09T08:00:00.000Z,overview,,0.2,False,"Introductory overview of Delta Sharing; no detailed limits, configs, error codes, or decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/,Delta Sharing overview,What is Delta Sharing? - Azure Databricks,,Learn how to use Delta Sharing for secure data and AI asset sharing with users outside your organization or on different metastores within your Azure Databricks account.,"This page introduces Delta Sharing in Azure Databricks, the secure data sharing platform that lets you share data and AI assets in Azure Databricks with users outside your organization, regardless of whether they use Azure Databricks. Delta Sharing is also the basis for Databricks Marketplace, an open forum for exchanging data products, and Clean Rooms, a secure and privacy-protecting environment where multiple parties can work together on sensitive enterprise data. 
Delta Sharing is also available",2026-04-21T08:00:00.000Z,overview,,0.2,False,"High-level introduction to Delta Sharing and related concepts (Marketplace, Clean Rooms) without concrete limits, configuration parameters, or detailed procedures.",updated https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/access-list,Use IP access lists to restrict access,Restrict Delta Sharing recipient access using IP access lists (open sharing) - Azure Databricks,Secure Delta Sharing open access with IP allow lists,Learn about how to use Delta Sharing IP access lists as an additional way to control recipient access to shared data.,"This article describes how data providers can assign IP access lists to control recipient access to shared data. If you, as a data provider, are using the open Delta Sharing protocol, you can limit a recipient to a restricted set of IP addresses when they access data that you share. This list is independent of Workspace IP access lists. Only allow lists are supported. The IP access list affects the following: Each recipient supports a maximum of 100 IP/CIDR values, where one CIDR counts as a singl",2026-03-17T08:00:00.000Z,how-to,security,0.85,True,"Defines IP access list behavior, including maximum of 100 IP/CIDR values per recipient and interaction with workspace IP lists; this is precise, product-specific security configuration with numeric limits.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/audit-logs,Audit data sharing,Audit and monitor data sharing - Azure Databricks,Audit and monitor Delta Sharing activity with Databricks logs,Learn how to use audit logs to view Delta Sharing events in Azure Databricks.,This article describes how data providers and recipients can use audit logs to monitor Delta Sharing events. Provider audit logs record actions taken by the provider and actions taken by recipients on the provider's shared data. 
Recipient audit logs record events related to the accessing of shares and the management of provider objects.,2026-01-20T08:00:00.000Z,how-to,troubleshooting,0.7,True,"Describes specific audit log events for providers and recipients; likely maps event types and fields to actions, which is product-specific diagnostic information.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/audit-logs,Audit data sharing,Audit and monitor data sharing - Azure Databricks,Monitor Delta Sharing activity with audit logs,Learn how to use audit logs to view Delta Sharing events in Azure Databricks.,This article describes how data providers and recipients can use audit logs to monitor Delta Sharing events. Provider audit logs record actions taken by the provider and actions taken by recipients on the provider's shared data. Recipient audit logs record events related to the accessing of shares and the management of provider objects.,2026-04-22T17:34:00.000Z,how-to,security,0.65,True,Covers how providers and recipients use audit logs to monitor Delta Sharing events. Audit log event types and their meanings are product-specific security and compliance details that qualify as expert knowledge.,updated https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/create-recipient,Create and manage recipients (Databricks-to-Databricks),Create and manage data recipients for Delta Sharing (Databricks-to-Databricks sharing) - Azure Databricks,Create and manage Delta Sharing recipients in Databricks,"Learn how to use Azure Databricks to create Delta Sharing recipients who are on other Databricks workspaces and grant those recipients access to securely shared data, update, and delete them.","This page explains how to create and manage recipients in Delta Sharing, when the recipients are on a Databricks workspace that is enabled for Unity Catalog. A recipient is the named object that represents the identity of a user or group of users who consume shared data. 
The way you create recipients differs depending on whether or not your recipient has access to a Databricks workspace that is enabled for Unity Catalog: Recipients with access to a Unity Catalog-enabled Databricks workspace: You ",2026-03-17T08:00:00.000Z,how-to,configuration,0.8,True,"Explains recipient objects, including different flows depending on Unity Catalog access; involves product-specific object properties and configuration steps for recipients.",unchanged https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/create-recipient-oidc-fed,Create and manage recipients (OIDC),Use Open ID Connect (OIDC) federation to enable authentication to Delta Sharing shares (open sharing) - Azure Databricks,Configure OIDC federation for Delta Sharing open access,Learn how to use OIDC federation to enable non-Databricks recipients to authenticate to Azure Databricks to access Delta Sharing shares.,"This page explains how data providers in Azure Databricks can federate authentication to an identity provider (IdP) to govern access to Delta Sharing shares created in Azure Databricks. This authentication flow uses OIDC federation, allowing JSON Web Tokens (JWTs) issued by the recipient's IdP as short-lived OAuth tokens authenticated by Azure Databricks. 
This Databricks-to-open sharing authentication method is designed for recipients who do not have access to a Unity Catalog-enabled Databricks wo",2026-03-17T08:00:00.000Z,how-to,security,0.85,True,"Explains how to federate authentication via OIDC, including JWT/OAuth token handling and IdP integration; contains product-specific security configuration parameters and flows.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/create-recipient-token,Create and manage recipients (tokens),Create a recipient object for non-Databricks users using bearer tokens (open sharing) - Azure Databricks,Secure Delta Sharing access for external users with bearer tokens,Learn how to use Azure Databricks to create Delta Sharing recipients who do not have access to a Unity Catalog-enabled Databricks workspace and grant those recipients access to securely shared data.,"This article describes how to create Delta Sharing recipients who do not have access to a Unity Catalog-enabled Databricks workspace and grant those recipients access to securely shared data using bearer tokens. This authentication flow, along with the OIDC token federation authentication flow, is called open sharing. Here's how it works: As a data provider, you create the recipient object in your Unity Catalog metastore. 
When you create the recipient object, you select the bearer token method, A",2026-03-17T08:00:00.000Z,how-to,security,0.8,True,"Details creating recipient objects using bearer tokens, token generation, rotation, and scope; this is product-specific authentication and access configuration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/create-share,Create and manage shares,Create and manage shares for Delta Sharing - Azure Databricks,Create and manage Unity Catalog Delta Sharing shares,"Learn how to use Azure Databricks to create and manage Delta Sharing shares, the objects that represent data to be shared securely with users outside your organization.","This page explains how to create and manage shares for Delta Sharing. A share is a securable object in Unity Catalog that you use for sharing the following data assets with one or more recipients: If you share an entire schema (database), the recipient can access all of the tables, streaming tables, views, materialized views, models, and volumes in the schema at the moment you share it, along with any data and AI assets added to the schema in the future. 
A share can contain data and AI assets fr",2026-04-10T08:00:00.000Z,how-to,configuration,0.7,True,Explains how to define and manage share objects in Unity Catalog; likely includes specific commands/parameters for configuring shares and asset inclusion behavior.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/create-recipient-token,Create and manage recipients (tokens),Create a recipient object for non-Databricks users using bearer tokens (open sharing) - Azure Databricks,Configure Delta Sharing bearer-token recipients for open sharing,Learn how to use Azure Databricks to create Delta Sharing recipients who do not have access to a Unity Catalog-enabled Databricks workspace and grant those recipients access to securely shared data.,"This article describes how to create Delta Sharing recipients who do not have access to a Unity Catalog-enabled Databricks workspace and grant those recipients access to securely shared data using bearer tokens. This authentication flow, along with the OIDC token federation authentication flow, is called open sharing. Here's how it works: As a data provider, you create the recipient object in your Unity Catalog metastore. When you create the recipient object, you select the bearer token method, A",2026-04-23T08:00:00.000Z,how-to,security,0.7,True,"Describes creating recipient objects and configuring bearer-token based authentication for non-Databricks users. 
This is product-specific security configuration (recipient object, bearer token method, token handling) that goes beyond generic security concepts.",updated +https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/create-share,Create and manage shares,Create and manage shares for Delta Sharing - Azure Databricks,,"Learn how to use Azure Databricks to create and manage Delta Sharing shares, the objects that represent data to be shared securely with users outside your organization.","This page explains how to create and manage shares for Delta Sharing. A share is a securable object in Unity Catalog that you use for sharing the following data assets with one or more recipients: If you share an entire schema (database), the recipient can access all of the tables, streaming tables, views, materialized views, models, and volumes in the schema at the moment you share it, along with any data and AI assets added to the schema in the future. A share can contain data and AI assets fr",2026-04-23T08:00:00.000Z,how-to,,0.4,False,"Explains what shares are and how they conceptually work (what assets can be included, behavior when sharing schemas). Summary does not indicate detailed configuration tables, limits, or troubleshooting content.",updated https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/grant-access,Grant access to shares,Manage access to Delta Sharing data shares (for providers) - Azure Databricks,Manage provider-side access control for Delta Sharing,Learn how to use Azure Databricks to grant data recipients access to securely shared data in Delta Sharing.,"This article explains how to grant a data recipient access to a Delta Sharing share. It also explains how to view, update, and revoke access.",2026-03-18T15:56:00.000Z,how-to,security,0.65,True,"Describes how providers grant, view, update, and revoke access to Delta Sharing shares. 
This typically involves product-specific permission models, roles, or access control constructs in Unity Catalog, which qualify as security configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/manage-egress,Manage egress costs (for providers),Monitor and manage Delta Sharing egress costs (for providers) - Azure Databricks,Optimize Delta Sharing to reduce cloud egress costs,"Learn how to manage Delta Sharing egress costs using DEEP CLONE, change data feed (CDF), and Cloudflare R2 storage.","This page describes tools that you can use to monitor and manage cloud vendor egress costs when you share data and AI assets using Delta Sharing. Unlike other data sharing platforms, Delta Sharing does not require data replication. This model has many advantages, but it means that your cloud vendor may charge data egress fees when you share data across clouds or regions. If you use Delta Sharing to share data and AI assets within a region, you incur no egress cost. To monitor and manage egress c",2026-01-20T08:00:00.000Z,how-to,best-practices,0.75,True,"Provides concrete recommendations (DEEP CLONE, CDF, Cloudflare R2) and when to use them to manage egress; these are product-specific cost-optimization practices.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/manage-provider,Manage providers (for recipients),Manage Delta Sharing providers (for data recipients) - Azure Databricks,Manage Delta Sharing provider objects in Unity Catalog,Learn how data recipients can view data provider information and change the provider name in their Azure Databricks user interfaces.,"This article describes how to use Unity Catalog to get information about data providers who are sharing data with you using Delta Sharing. It also describes what a provider object is and when you might need to create a provider object in your Unity Catalog metastore, a task that most recipients should never need to do. 
Important Data recipients must have access to a Databricks workspace that is enabled for Unity Catalog to use the functionality described in this article. This article does not appl",2026-03-18T15:56:00.000Z,how-to,security,0.6,True,"Explains provider objects and how recipients view and manage provider information in Unity Catalog. This is tied to access control and identity of data providers, which is a product-specific security/authorization construct.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/read-data-databricks,Read shared data (Databricks-to-Databricks),Read data shared using Databricks-to-Databricks Delta Sharing (for recipients) - Azure Databricks,Read Databricks-to-Databricks Delta Sharing data,"Learn how to read data and notebooks that have been shared with you using the Databricks-to-Databricks Delta Sharing protocol, in which Databricks manages a secure connection and data sharing without ","This page describes how to read data shared with you using the Databricks-to-Databricks Delta Sharing protocol, where Databricks manages a secure connection for data sharing. Unlike the Delta Sharing open sharing protocol, the Databricks-to-Databricks protocol does not require a credential file (token-based security). 
Databricks-to-Databricks sharing requires that you, as the recipient, meet both of the following requirements: If either requirement is not met, see Read data shared using Delta Sharing o",2026-04-10T08:00:00.000Z,how-to,integrations,0.7,True,How-to for consuming Databricks-to-Databricks shares with product-specific connection behavior and security model; likely includes concrete options/parameters for accessing shared data from Databricks workspaces.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/read-data-open,Read shared data (tokens),Read data shared using Delta Sharing open sharing with bearer tokens (for recipients) - Azure Databricks,Consume Delta Sharing open shares with bearer tokens,"Learn how to read data that has been shared with you using the Delta Sharing open sharing protocol, which uses a credential file (token-based security) for access.","This page describes how to read data shared with you using the Delta Sharing open sharing protocol with bearer tokens. It includes instructions for reading shared data using the following tools: In this open sharing model, you use a credential file, shared with a member of your team by the data provider, to gain secure read access to shared data. Access persists as long as the credential is valid and the provider continues to share the data. Providers manage credential expiration and rotation. 
Upd",2026-04-10T08:00:00.000Z,how-to,integrations,0.75,True,Describes reading Delta Sharing open shares using credential files and multiple tools; likely includes connector/SDK parameters and token-based configuration details unique to Delta Sharing.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/manage-provider,Manage providers (for recipients),Manage Delta Sharing providers for data recipients - Azure Databricks,,Learn how data recipients can view data provider information and change the provider name in their Azure Databricks user interfaces.,"This page describes how to use Unity Catalog to get information about data providers who are sharing data with you using Delta Sharing. It also describes what a provider object is and when you might need to create a provider object in your Unity Catalog metastore, a task that most recipients should never need to do. Important Data recipients must have access to a Databricks workspace that is enabled for Unity Catalog to use the functionalities described. This page does not apply to recipients who ",2026-04-24T18:05:00.000Z,how-to,,0.45,False,"Describes how recipients view provider information and manage provider objects in the UI; appears to be basic usage/navigation without detailed configuration parameters, limits, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/read-data-databricks,Read shared data (Databricks-to-Databricks),Read data shared using Databricks-to-Databricks Delta Sharing (for recipients) - Azure Databricks,,"Learn how to read data and notebooks that have been shared with you using the Databricks-to-Databricks Delta Sharing protocol, in which Databricks manages a secure connection and data sharing without ","This page describes how to read data shared with you using the Databricks-to-Databricks Delta Sharing protocol, where Databricks manages a secure connection for data sharing. 
Unlike the Delta Sharing open sharing protocol, the Databricks-to-Databricks protocol does not require a credential file (token-based security). Databricks-to-Databricks sharing requires that you, as the recipient, meet both of the following requirements: If either requirement is not met, see Read data shared using Delta Sharing o",2026-04-22T08:00:00.000Z,how-to,,0.4,False,Describes how recipients read shared data using Databricks-to-Databricks protocol; appears to be a usage/tutorial style page without detailed configuration matrices or troubleshooting content.,updated +https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/read-data-open,Read shared data (tokens),Read data shared using Delta Sharing open sharing with bearer tokens - Azure Databricks,Access Delta Sharing open-shared data using bearer tokens,"Learn how to read data that has been shared with you using the Delta Sharing open sharing protocol, which uses a credential file (token-based security) for access.","This page describes how to read data shared with you using the Delta Sharing open sharing protocol with bearer tokens. It includes instructions for reading shared data using the following tools: In this open sharing model, you use a credential file, shared with a member of your team by the data provider, to gain secure read access to shared data. Access persists as long as the credential is valid and the provider continues to share the data. Providers manage credential expiration and rotation. Upd",2026-04-23T08:00:00.000Z,how-to,security,0.65,True,"Explains how to use credential files and bearer tokens to securely read shared data, including token-based access behavior and rotation managed by providers. 
This is product-specific authentication/authorization usage rather than generic tutorial content.",updated https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/recipient,Access data shared with you,Access data shared with you using Delta Sharing (for recipients) - Azure Databricks,Access Delta Sharing data as a recipient in Databricks,This page shows how to access data that has been shared with you using Delta Sharing.,"This page explains how to access data that has been shared with you using Delta Sharing. Delta Sharing supports two models: Databricks-to-Databricks sharing, for Azure Databricks workspace users with Unity Catalog, and open sharing, for any recipient using any tool.",2026-03-17T08:00:00.000Z,concept-article,configuration,0.7,True,Recipient-focused guide for accessing shared data across both sharing models; includes concrete steps and catalog/SQL usage specific to Delta Sharing.,unchanged https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/sap-bdc/,SAP BDC connector for Databricks,Share data between SAP Business Data Cloud (BDC) and Azure Databricks - Azure Databricks,Integrate SAP BDC with Azure Databricks via Delta Sharing,Learn how to share between Azure Databricks and SAP Business Data Cloud (BDC).,"This page introduces the SAP Business Data Cloud (BDC) Connector for Azure Databricks, which allows you to share data from SAP BDC to Azure Databricks and from Azure Databricks to SAP BDC using Delta Sharing. Note If you elect to use the SAP BDC Connector to share SAP BDC data to Azure Databricks, Databricks may disclose certain usage and operations information to SAP relating to your use of that data, including identifying your organization in connection with such information. 
See Usage data shared",2026-04-06T21:50:00.000Z,overview,integrations,0.7,True,Covers SAP BDC connector behavior between SAP BDC and Databricks; integration-specific details about how data is shared between the two systems.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/sap-bdc/create-connection,Create an SAP BDC connection,Create and manage the SAP Business Data Cloud (BDC) connector - Azure Databricks,Configure SAP BDC connector for Delta Sharing,Learn how to set up an SAP Business Data Cloud (BDC) connection on Delta Sharing for Delta Sharing.,This page explains how to set up an SAP Business Data Cloud (BDC) connection on Azure Databricks for Delta Sharing. This connection is necessary for sharing with and receiving shares from an SAP BDC account.,2026-04-13T08:00:00.000Z,how-to,configuration,0.7,True,"Page is about creating and managing an SAP Business Data Cloud connection in Azure Databricks Delta Sharing. This typically includes product-specific connection parameters, options, and settings (such as connection properties, authentication settings, and management operations), which qualify as configuration details beyond generic tutorial content.",updated -https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/sap-bdc/semantic-metadata,SAP BDC semantic metadata,SAP BDC semantic metadata - Azure Databricks,,"Learn about the semantic metadata that syncs from SAP Business Data Cloud (BDC) into Unity Catalog, including comments, keys, and governance tags.","This page describes the semantic metadata that automatically syncs from SAP Business Data Cloud (BDC) into Unity Catalog for mounted SAP BDC shares. SAP table and column names can be difficult to read. For all mounted SAP BDC shares, semantic metadata is automatically ingested into Unity Catalog at the table level when a table is accessed, making the data more understandable and discoverable. Any changes made in SAP BDC are reflected in Unity Catalog. 
SAP BDC is the source of truth for semantic ",2026-04-14T21:45:00.000Z,concept-article,,0.2,False,"Describes what semantic metadata syncs from SAP BDC into Unity Catalog and how it behaves conceptually. Based on the summary, it focuses on behavior and concepts (comments, keys, governance tags, source of truth) rather than concrete configuration parameters, limits, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/sap-bdc/create-connection,Create an SAP BDC connection,Create and manage the SAP Business Data Cloud (BDC) connector - Azure Databricks,Configure SAP BDC connector for Delta Sharing,Learn how to set up an SAP Business Data Cloud (BDC) connection on Azure Databricks for Delta Sharing.,This page explains how to set up an SAP Business Data Cloud (BDC) connection on Azure Databricks for Delta Sharing. This connection is necessary for sharing with and receiving shares from an SAP BDC account.,2026-04-13T08:00:00.000Z,how-to,configuration,0.7,True,"Page is about creating and managing an SAP Business Data Cloud connection in Azure Databricks Delta Sharing. This typically includes product-specific connection parameters, options, and settings (such as connection properties, authentication settings, and management operations), which qualify as configuration details beyond generic tutorial content.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/sap-bdc/semantic-metadata,SAP BDC semantic metadata,SAP BDC semantic metadata - Azure Databricks,,"Learn about the semantic metadata that syncs from SAP Business Data Cloud (BDC) into Unity Catalog, including comments, keys, and governance tags.","This page describes the semantic metadata that automatically syncs from SAP Business Data Cloud (BDC) into Unity Catalog for mounted SAP BDC shares. SAP table and column names can be difficult to read. 
For all mounted SAP BDC shares, semantic metadata is automatically ingested into Unity Catalog at the table level when a table is accessed, making the data more understandable and discoverable. Any changes made in SAP BDC are reflected in Unity Catalog. SAP BDC is the source of truth for semantic ",2026-04-23T17:47:00.000Z,concept-article,,0.3,False,"Describes semantic metadata syncing from SAP BDC into Unity Catalog at a conceptual level; no indication of configuration parameters, limits, or detailed mappings that would qualify as expert configuration or integration patterns.",updated https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/sap-bdc/share-to-sap,Grant SAP BDC recipients access to shares,Grant SAP Business Data Cloud (BDC) recipients access to Delta Sharing data shares - Azure Databricks,Configure Delta Sharing access for SAP BDC recipients,Learn how to set up an SAP Business Data Cloud (BDC) connection on Azure Databricks for Delta Sharing.,This page explains how to share data assets with your SAP Business Data Cloud (BDC) recipient.,2026-03-18T15:56:00.000Z,how-to,integrations,0.7,True,"How-to for setting up SAP Business Data Cloud as a Delta Sharing recipient. Likely includes product-specific connection parameters, configuration steps, and required settings unique to the Databricks–SAP BDC integration, which go beyond generic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/set-up,Set up Delta Sharing for your account (for providers),Set up Delta Sharing for your account (for providers) - Azure Databricks,Configure Delta Sharing for Azure Databricks providers,Learn how to set up Delta Sharing in your Azure Databricks account to enable you to share data securely with other organizations.,"This page describes how to set up Delta Sharing on Azure Databricks for data providers (organizations that want to use Delta Sharing to share data securely). 
If you are a data recipient (an organization that receives data that is shared using Delta Sharing), see Read data shared using Databricks-to-Databricks Delta Sharing (for recipients). Important Delta Sharing requires a Unity Catalog-enabled workspace. You can create one Unity Catalog-enabled workspace for share management. In some accounts, ne",2026-03-26T08:00:00.000Z,install-set-up-deploy,configuration,0.75,True,"Describes how to enable and set up Delta Sharing at the account level; likely includes specific settings (Unity Catalog workspace requirement, metastore selection, enablement flags) that are product-specific configuration knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/share-data-databricks,Overview of D2D sharing,Share data using the Delta Sharing Databricks-to-Databricks protocol (for providers) - Azure Databricks,Configure Databricks-to-Databricks Delta Sharing for providers,"Learn how to share data securely with any Databricks user, regardless of account or cloud host, using Databricks-to-Databricks Delta Sharing and Unity Catalog.","This article gives an overview of how to use Databricks-to-Databricks Delta Sharing to share data securely with any Databricks user, regardless of account or cloud host, as long as that user has access to a workspace enabled for Unity Catalog. 
Note If you are a data recipient (a user or group of users with whom Databricks data is being shared), see Access data shared with you using Delta Sharing (for recipients).",2026-01-20T08:00:00.000Z,concept-article,configuration,0.7,True,Covers how to share data via the Databricks-to-Databricks protocol; typically includes share/recipient configuration steps and parameters unique to this feature.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/share-data-open,Overview of open sharing,Share data using the Delta Sharing open sharing protocol (for providers) - Azure Databricks,Configure open Delta Sharing for external data recipients,"Learn how to share data securely with users outside your Azure Databricks workspace or account using the Delta Sharing open sharing protocol, which lets you share with any user, regardless of whether ","This page gives an overview of how providers can use the Delta Sharing open sharing protocol to share data from your Unity Catalog-enabled Azure Databricks workspace with any user on any computing platform, anywhere. 
If you are a data recipient (a user or group of users with whom data is being shared), see instead Access data shared with you using Delta Sharing (for recipients).",2026-01-20T08:00:00.000Z,concept-article,configuration,0.7,True,"Provider-focused overview of open sharing; typically includes how to configure shares and endpoints for non-Databricks tools, with product-specific parameters.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/share-data-databricks,Overview of D2D sharing,Share data using the Delta Sharing Databricks-to-Databricks protocol (for providers) - Azure Databricks,,"Learn how to share data securely with any Databricks user, regardless of account or cloud host, using Databricks-to-Databricks Delta Sharing and Unity Catalog.","This article gives an overview of how to use Databricks-to-Databricks Delta Sharing to share data securely with any Databricks user, regardless of account or cloud host, as long as that user has access to a workspace enabled for Unity Catalog. 
Note If you are a data recipient (a user or group of users with whom Databricks data is being shared), see Access data shared with you using Delta Sharing (for recipients).",2026-04-21T08:00:00.000Z,concept-article,,0.4,False,"Overview of Databricks-to-Databricks Delta Sharing for providers; primarily conceptual workflow and requirements, not detailed configuration tables, error codes, or quantified trade-offs.",updated +https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/share-data-open,Overview of open sharing,Share data using the Delta Sharing open sharing protocol (for providers) - Azure Databricks,,"Learn how to share data securely with users outside your Azure Databricks workspace or account using the Delta Sharing open sharing protocol, which lets you share with any user, regardless of whether ","This page gives an overview of how providers can use the Delta Sharing open sharing protocol to share data from your Unity Catalog-enabled Azure Databricks workspace with any user on any computing platform, anywhere. 
If you are a data recipient (a user or group of users with whom data is being shared), see instead Access data shared with you using Delta Sharing (for recipients).",2026-04-22T17:34:00.000Z,concept-article,,0.4,False,"Provider-focused overview of open sharing protocol; summary suggests conceptual guidance and workflow, not product-specific limits, configuration tables, or error mappings.",updated https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/sharing-over-oidc-m2m,Read shared data (OIDC M2M),Receive Delta Sharing shares using a Python client and Open ID Connect (OIDC) federation in a machine-to-machine flow (open sharing) - Azure Databricks,Use Python M2M OIDC federation to access Delta Sharing,Learn how to use machine-to-machine (M2M) OIDC federation to enable non-Databricks recipient Python client apps to access Databricks-provided Delta Sharing shares,"This page describes how data recipients can use a Python client registered in their own identity provider (IdP) to establish access to Delta Sharing shares created in Databricks. This ""machine-to-machine"" (M2M) OAuth Client Credentials grant flow is typically used in scenarios where an application, such as a nightly job running on a virtual machine, accesses data autonomously. 
This authentication flow uses OIDC federation, allowing JSON Web Tokens (JWTs) issued by the recipient's IdP to be used ",2026-01-24T08:00:00.000Z,how-to,integrations,0.8,True,"Covers machine-to-machine OAuth client credentials flow for Python clients; includes client registration, token acquisition, and Delta Sharing-specific parameters.",unchanged https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/sharing-over-oidc-u2m,Read shared data (OIDC U2M),Receive Delta Sharing shares using Open ID Connect (OIDC) federation in a user-to-machine flow (open sharing) - Azure Databricks,Use U2M OIDC federation clients with Delta Sharing,Learn how to use user-to-machine (U2M) OIDC federation to enable non-Databricks recipients to access Databricks-provided Delta Sharing shares,"This page describes how data recipients can use a 'user-to-machine' (U2M) application (e.g., Power BI) to establish access to Delta Sharing shares created in Azure Databricks using Open ID Connect (OIDC) federation. The ""user-to-machine"" (U2M) authentication flow uses OIDC federation, allowing JSON Web Tokens (JWTs) issued by the recipient's IdP to be used as short-lived OAuth tokens that are authenticated by Azure Databricks. 
This Databricks-to-open sharing authentication method is designed for re",2026-01-24T08:00:00.000Z,how-to,integrations,0.8,True,"Describes user-to-machine OIDC flows for tools like Power BI; includes client configuration, scopes, and token usage specific to this integration pattern.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/troubleshooting,Troubleshoot common sharing errors,Troubleshoot common sharing issues in Delta Sharing - Azure Databricks,Troubleshoot common Delta Sharing access errors,Learn about the most common Delta Sharing errors and how to fix them.,The following sections describe common errors that might occur when you try to access data in a share.,2026-04-08T22:17:00.000Z,troubleshooting-general,troubleshooting,0.9,True,Explicit troubleshooting article; expected to map specific Delta Sharing error messages or codes to causes and resolutions.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/troubleshooting,Troubleshoot common sharing errors,Troubleshoot common sharing issues in Delta Sharing - Azure Databricks,Troubleshoot common Azure Databricks Delta Sharing errors,Learn about the most common Delta Sharing errors and how to fix them.,The following sections describe common errors that might occur when you try to access data in a share.,2026-04-21T08:00:00.000Z,troubleshooting-general,troubleshooting,0.9,True,"Explicitly a troubleshooting page describing common errors when accessing shared data. It will contain specific error messages/codes and their resolutions, which are product-specific symptom→cause→solution mappings.",updated https://learn.microsoft.com/en-us/azure/databricks/delta/,Overview,What is Delta Lake in Azure Databricks? - Azure Databricks,,Learn about the Delta Lake storage protocol used to power the Databricks lakehouse.,"Delta Lake is the optimized storage layer that provides the foundation for tables in a lakehouse on Databricks. 
Delta Lake is open source software that extends Parquet data files with a file-based transaction log for ACID transactions and scalable metadata handling. Delta Lake is fully compatible with Apache Spark APIs, and was developed for tight integration with Structured Streaming, allowing you to easily use a single copy of data for both batch and streaming operations and providing incremental ",2026-03-31T08:00:00.000Z,overview,,0.3,False,"Conceptual overview of Delta Lake; no indication of detailed limits, configs, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/delta/best-practices,Best practices,Best practices: Delta Lake - Azure Databricks,Apply Delta Lake best practices on Azure Databricks,Best practices and recommendations for using Delta Lake on Azure Databricks.,This article describes best practices when using Delta Lake.,2026-04-02T08:00:00.000Z,best-practice,best-practices,0.9,True,"Explicit DO/DON’T guidance for Delta Lake (file sizing, OPTIMIZE/VACUUM usage, schema evolution, partitioning, concurrency) with Databricks-specific recommendations and gotchas that are not generic Spark knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/delta/checkpoint-v2,Checkpoint V2,Checkpoint V2 - Azure Databricks,Enable and use Delta Lake Checkpoint V2 in Databricks,Enable checkpoint V2 on Delta Lake to support more concurrent writers and reduce write conflicts on large or frequently updated tables.,"Checkpoint V2 allows Delta Lake to support more concurrent writers and reduces write conflicts on large or frequently updated tables. Delta Lake periodically writes checkpoints that record the state of the transaction log. Checkpoints speed up query planning by allowing Delta Lake to reconstruct table state without replaying the full transaction log. You can read and write tables with checkpoint V2 in Databricks Runtime 13.3 LTS and above. For the open-source protocol specification, see checkpoin",2026-04-17T18:03:00.000Z,feature-guide,configuration,0.7,True,"Describes enabling Checkpoint V2 with runtime/version requirements; likely includes table properties or configuration parameters to turn on the feature and constraints around usage, which are product-specific configuration details.",new +https://learn.microsoft.com/en-us/azure/databricks/delta/catalog-commits,Catalog commits,Catalog commits - Azure Databricks,Configure catalog commits for Unity Catalog Delta tables,Enable catalog commits on Unity Catalog tables to coordinate transactions at the catalog level on Azure Databricks.,"This page explains how to enable catalog commits, a Delta table feature that shifts commit coordination from the file system to Unity Catalog, making the catalog the single source of truth for table state.",2026-04-14T17:47:00.000Z,article,configuration,0.7,True,"How-to page for enabling catalog commits; likely includes specific table properties, configuration flags, and allowed values for turning on catalog-level commit coordination, which fits configuration details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/delta/checkpoint-v2,Checkpoint V2,Checkpoint V2 - 
Azure Databricks,Enable and use Delta Lake Checkpoint V2 in Databricks,Enable checkpoint V2 on Delta Lake to support more concurrent writers and reduce write conflicts on large or frequently updated tables.,"Checkpoint V2 allows Delta Lake to support more concurrent writers and reduces write conflicts on large or frequently updated tables. Delta Lake periodically writes checkpoints that record the state of the transaction log. Checkpoints speed up query planning by allowing Delta Lake to reconstruct table state without replaying the full transaction log. You can read and write tables with checkpoint V2 in Databricks Runtime 13.3 LTS and above. For the open-source protocol specification, see checkpoin",2026-04-17T18:03:00.000Z,feature-guide,configuration,0.7,True,"Describes enabling Checkpoint V2 with runtime/version requirements; likely includes table properties or configuration parameters to turn on the feature and constraints around usage, which are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/delta/clone,Clone,Clone a table on Azure Databricks - Azure Databricks,"Clone Delta, Parquet, and Iceberg tables on Databricks",Learn how to create a copy of a table on Azure Databricks at a specific version using the Clone command. Clones can be either deep or shallow.,"Create a copy of an existing table on Azure Databricks at a specific version using the clone command. Clones can be either deep or shallow. Azure Databricks also supports cloning Parquet and Apache Iceberg tables. See Incrementally clone Parquet and Apache Iceberg tables to Delta Lake. For details on using clone with Unity Catalog, see Shallow clone for Unity Catalog tables. Note Databricks recommends using Delta Sharing to provide read-only access to tables across different organizations. 
See What i",2026-03-18T08:00:00.000Z,feature-guide,configuration,0.65,True,"Details clone command behavior (deep vs shallow), versioned cloning, and cross-format cloning; these are product-specific DDL semantics.",unchanged https://learn.microsoft.com/en-us/azure/databricks/delta/clone-unity-catalog,Shallow clone,Shallow clone for Unity Catalog tables - Azure Databricks,Use shallow clone with Unity Catalog tables,Learn how to use shallow clone with Unity Catalog tables on Azure Databricks to create new tables without copying underlying data files.,"Important This feature is in Public Preview. Important Shallow clone support differs for Unity Catalog managed and external tables. For managed tables use Databricks Runtime 13.3 and above, and for external tables use Databricks Runtime 14.2 and above. You can only clone Unity Catalog managed tables to Unity Catalog managed tables and Unity Catalog external tables to Unity Catalog external tables. VACUUM behavior differs between managed and external tables. See Use VACUUM with Unity Catalog shallow cl",2026-04-02T22:35:00.000Z,feature-guide,configuration,0.75,True,"Documents runtime requirements, managed vs external behavior, and VACUUM behavior differences for shallow clones—detailed product-specific configuration and behavior.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/delta/clustering,Liquid clustering,Use liquid clustering for tables - Azure Databricks,Optimize Databricks tables with liquid clustering,Use liquid clustering to simplify data layout decisions and optimize query performance without partitioning.,"Liquid clustering is a data layout optimization technique that replaces table partitioning and ZORDER. It simplifies table management and optimizes query performance by automatically organizing data based on clustering keys. Unlike traditional partitioning, you can redefine clustering keys without rewriting existing data. This allows your data layout to evolve alongside changing analytic needs. 
Liquid clustering applies to both streaming tables and materialized views. Important Liquid clustering ",2026-04-16T08:00:00.000Z,feature-guide,best-practices,0.7,True,"Explains how to use liquid clustering instead of partitioning/ZORDER; likely includes Databricks-specific configuration syntax, recommended patterns, and trade-offs for clustering keys, which are concrete product-specific best practices.",updated +https://learn.microsoft.com/en-us/azure/databricks/delta/clustering,Liquid clustering,Use liquid clustering for tables - Azure Databricks,Optimize Delta tables with liquid clustering on Azure Databricks,Use liquid clustering to simplify data layout decisions and optimize query performance without partitioning.,"Liquid clustering is a data layout optimization technique that replaces table partitioning and ZORDER. It simplifies table management and optimizes query performance by automatically organizing data based on clustering keys. Unlike traditional partitioning, you can redefine clustering keys without rewriting existing data. This allows your data layout to evolve alongside changing analytic needs. Liquid clustering applies to both streaming tables and materialized views. Important Liquid clustering ",2026-04-24T08:00:00.000Z,feature-guide,best-practices,0.7,True,"Explains how and when to use liquid clustering instead of partitioning/ZORDER, including product-specific behavior (redefining clustering keys without rewrite, applicability to streaming tables and materialized views). 
These are Databricks-specific optimization patterns and gotchas.",updated https://learn.microsoft.com/en-us/azure/databricks/delta/column-mapping,Column mapping,Rename and drop columns with Delta Lake column mapping - Azure Databricks,Manage Delta Lake columns with column mapping on Azure Databricks,This page describes how Delta Lake column mapping enables metadata-only changes to mark columns as deleted or renamed without rewriting data files.,"This page describes how Delta Lake column mapping enables metadata-only changes to mark columns as deleted or renamed without rewriting data files. Azure Databricks supports column mapping for Delta Lake tables. Column mapping enables metadata-only changes to mark columns as deleted or renamed without rewriting data files. Column mapping also allows you to use characters not allowed by Parquet in column names, such as spaces. This enables you to directly ingest CSV or JSON data into Delta withou",2026-03-24T01:32:00.000Z,how-to,configuration,0.7,True,"Details how to enable and use column mapping to rename/drop columns and support special characters without rewriting data files. Includes specific table properties/behaviors unique to Delta on Databricks, which are configuration-focused expert details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/delta/custom-metadata,User-defined metadata,Enrich tables with custom metadata - Azure Databricks,,Add custom metadata to tables to enrich data discovery on Azure Databricks.,Databricks recommends always providing comments for tables and columns in tables. You can generate these comments using AI. See Add AI-generated comments to Unity Catalog objects. Unity Catalog also provides the ability to tag data. See Apply tags to Unity Catalog securable objects. 
Log messages for individual commits to tables in a field in the transaction log.,2026-03-18T18:06:00.000Z,how-to,,0.2,False,"Describes adding comments and tags as custom metadata for Delta/Unity Catalog tables but appears to be conceptual/how-to without product-specific limits, config parameter tables, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/delta/data-skipping,Data skipping,Data skipping - Azure Databricks,Tune Azure Databricks data skipping with stats and Z-order,"Enhance data skipping with stats columns, Z-order, and optimize on Azure Databricks.","Note In Databricks Runtime 13.3 and above, Databricks recommends using liquid clustering for table layout. Clustering is not compatible with Z-ordering. See Use liquid clustering for tables. Data skipping information is collected automatically when you write data into a table. Azure Databricks takes advantage of this information (minimum and maximum values, null counts, and total records per file) at query time to provide faster queries. You must have statistics collected for columns that are use",2026-03-06T08:00:00.000Z,concept-article,best-practices,0.65,True,"Contains Databricks-specific recommendations on how to leverage data skipping (stats columns, Z-order, OPTIMIZE) and when they are effective, including interactions with liquid clustering. These are concrete, product-specific performance practices rather than generic concepts.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/delta/deletion-vectors,Deletion vectors,Deletion vectors in Databricks - Azure Databricks,Use deletion vectors to accelerate Delta table modifications on Azure Databricks,Learn how Azure Databricks leverages deletion vectors to accelerate deletes and updates to data stored in tables.,"Deletion vectors are a storage optimization feature that accelerates modifications to tables. By default, deleting a single row requires rewriting the entire Parquet file containing that record. 
Deletion vectors avoid this overhead. When deletion vectors are enabled, DELETE, UPDATE, and MERGE operations mark rows as modified without rewriting the Parquet file. Reads then resolve the current table state by applying the modifications recorded in deletion vectors. Databricks recommends using Databricks",2026-03-09T08:00:00.000Z,feature-guide,best-practices,0.7,True,"Explains how deletion vectors change DELETE/UPDATE/MERGE behavior, when they are recommended, and their interaction with storage and performance. These are Databricks-specific optimization practices and behavioral gotchas.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/delta/deletion-vectors,Deletion vectors,Deletion vectors in Databricks - Azure Databricks,Use deletion vectors to accelerate Delta table updates,Learn how to use deletion vectors in Azure Databricks to accelerate deletes and updates to data stored in tables.,"Deletion vectors are a storage optimization feature that accelerates modifications to tables. By default, deleting a single row requires rewriting the entire Parquet file containing that record. Deletion vectors avoid this overhead. When deletion vectors are enabled, DELETE, UPDATE, and MERGE operations mark rows as modified without rewriting the Parquet file. Reads then resolve the current table state by applying the modifications recorded in deletion vectors. Databricks recommends using Databricks",2026-04-21T22:28:00.000Z,feature-guide,best-practices,0.7,True,"Describes Databricks-specific deletion vector behavior (marking rows instead of rewriting Parquet files) and recommendations on when to enable them. 
This is a product-specific performance and storage optimization pattern, not generic SQL knowledge.",updated https://learn.microsoft.com/en-us/azure/databricks/delta/delta-change-data-feed,Change data feed,Use Delta Lake change data feed on Azure Databricks - Azure Databricks,Configure and use Delta Lake change data feed on Azure Databricks,Learn how to get row-level change information from Delta tables using the Delta Lake change data feed.,"Change data feed allows Azure Databricks to track row-level changes between versions of a Delta table. When enabled on a Delta table, the runtime records change events for all the data written into the table. This includes the row data along with metadata indicating whether the specified row was inserted, deleted, or updated. You can use change data feed to power common data use cases including: Important Change data feed works in tandem with table history to provide change information. Because cl",2026-03-06T08:00:00.000Z,feature-guide,configuration,0.7,True,"Describes how to enable and query change data feed, including table properties and options tied to Delta table history. These are product-specific configuration knobs and usage patterns for CDF.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/delta/drop-feature,Drop table features,Drop a Delta Lake table feature and downgrade table protocol - Azure Databricks,Drop Delta table features and downgrade protocol versions,Learn how to drop table features in Delta Lake to downgrade reader and writer protocol requirements and resolve compatibility issues.,"This article describes how to drop Delta Lake table features and downgrade protocol versions. This functionality is available in Databricks Runtime 16.3 and above. Not all Delta table features can be dropped. See What Delta table features can be dropped?. You should only use DROP FEATURE to support compatibility with earlier Databricks Runtime versions, Delta Sharing, or external Delta Lake reader or writer clients. 
Note Legacy support for DROP FEATURE is available starting in Databricks Runtime 14.3",2026-04-16T08:00:00.000Z,how-to,configuration,0.7,True,"Explains how to use DROP FEATURE and protocol downgrades; likely includes specific feature names, protocol version constraints, and commands, which are detailed configuration and compatibility settings unique to Delta Lake on Databricks.",updated +https://learn.microsoft.com/en-us/azure/databricks/delta/drop-feature,Drop table features,Drop a Delta Lake table feature and downgrade table protocol - Azure Databricks,Drop Delta table features and downgrade protocol versions,Learn how to drop table features in Delta Lake to downgrade reader and writer protocol requirements and resolve compatibility issues.,"This article describes how to drop Delta Lake table features and downgrade protocol versions. This functionality is available in Databricks Runtime 16.3 and above. Not all Delta table features can be dropped. See What Delta table features can be dropped?. You should only use DROP FEATURE to support compatibility with earlier Databricks Runtime versions, Delta Sharing, or external Delta Lake reader or writer clients. Note Legacy support for DROP FEATURE is available starting in Databricks Runtime 14.3",2026-04-16T08:00:00.000Z,how-to,configuration,0.7,True,"Explains how to use DROP FEATURE and protocol downgrades; likely includes specific feature names, protocol version constraints, and commands, which are detailed configuration and compatibility settings unique to Delta Lake on Databricks.",unchanged https://learn.microsoft.com/en-us/azure/databricks/delta/drop-table,Drop or replace,Drop or replace a table - Azure Databricks,Drop and replace Delta and Unity Catalog tables,Learn about dropping and replacing managed and external tables and Unity Catalog tables.,Azure Databricks supports SQL standard DDL commands for dropping and replacing tables registered with either Unity Catalog or the Hive metastore. 
This article provides examples of dropping and replacing tables and recommendations for syntax depending on your configured environment and desired outcome.,2026-03-06T08:00:00.000Z,how-to,configuration,0.65,True,Provides environment-specific DDL syntax recommendations and behavior for dropping/replacing tables across Unity Catalog and Hive metastore—product-specific configuration semantics.,unchanged https://learn.microsoft.com/en-us/azure/databricks/delta/feature-compatibility,Feature compatibility,Delta Lake feature compatibility and protocols - Azure Databricks,Choose Delta Lake protocol versions and feature sets,Compare supported features and upgrade Delta Lake protocol versions on Azure Databricks.,"This article provides an overview of Delta Lake protocols, table features, and compatibility with Delta Lake clients for reads and writes. The transaction log for a Delta table contains protocol versioning information. See Review table details with describe detail.",2026-04-07T08:00:00.000Z,reference,decision-making,0.65,True,"Compares protocol versions and feature compatibility across clients; used to decide which protocol/features to enable based on reader/writer support, a product-specific compatibility and selection guide.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/delta/generated-columns,Generated columns,Delta Lake generated columns - Azure Databricks,Define and use Delta Lake generated columns on Azure Databricks,Automatically generate column values using user-specified functions in Delta Lake on Azure Databricks.,"Important This feature is in Public Preview. Delta Lake supports generated columns which are a special type of column whose values are automatically generated based on a user-specified function over other columns in the Delta table. When you write to a table with generated columns and you do not explicitly provide values for them, Delta Lake automatically computes the values. 
For example, you can automatically generate a date column (for partitioning the table by date) from the timestamp column; ",2026-03-06T08:00:00.000Z,feature-guide,configuration,0.7,True,"Describes how to declare generated columns, their constraints, and behaviors (e.g., automatic computation, partitioning use). These are product-specific configuration semantics for table schemas.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/delta/generated-columns,Generated columns,Delta Lake generated columns - Azure Databricks,Define and manage Delta Lake generated columns on Azure Databricks,Automatically generate column values using user-specified functions in Delta Lake on Azure Databricks.,"Important This feature is in Public Preview. Delta Lake supports generated columns which are a special type of column whose values are automatically generated based on a user-specified function over other columns in the Delta table. When you write to a table with generated columns and you do not explicitly provide values for them, Delta Lake automatically computes the values. For example, you can automatically generate a date column (for partitioning the table by date) from the timestamp column; ",2026-04-23T21:54:00.000Z,feature-guide,configuration,0.7,True,"Details how to declare generated columns in Delta Lake, including syntax and behavior when values are omitted, and use for partitioning. These are specific configuration/DDL patterns unique to Delta Lake on Databricks.",updated https://learn.microsoft.com/en-us/azure/databricks/delta/history,History and data retention,Work with table history - Azure Databricks,Work with Delta table history and time travel on Azure Databricks,Review and navigate table versions using table history and time travel commands.,"Each operation that modifies a table creates a new table version. Use history information to audit operations, roll back a table, or query a table at a specific point in time using time travel. 
Note Databricks doesn't recommend using table history as a long-term backup solution for data archival. Use only the past 7 days for time travel operations unless you have set both data and log retention configurations to a larger value.",2026-03-06T08:00:00.000Z,feature-guide,configuration,0.7,True,"Explains how table history and time travel work, including retention-related recommendations (e.g., only use past 7 days unless retention configs are increased) and commands to navigate versions. These are product-specific behavioral and configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/delta/merge,Merge data,Upsert into a Delta Lake table using merge - Azure Databricks,Use MERGE to upsert into Delta tables on Databricks,Upsert data into a Delta Lake table using merge on Azure Databricks.,"You can upsert data from a source table, view, or DataFrame into a target Delta table by using the MERGE SQL operation. Delta Lake supports inserts, updates, and deletes in MERGE, and it supports extended syntax beyond the SQL standards to facilitate advanced use cases. Suppose you have a source table named people10mupdates or a source path at /tmp/delta/people-10m-updates that contains new data for a target table named people10m or a target path at /tmp/delta/people-10m. Some of these new records may alr",2026-03-06T08:00:00.000Z,how-to,integrations,0.65,True,"Covers Delta Lake MERGE syntax and extended, non-standard capabilities specific to Databricks/Delta Lake, including code patterns and behavior beyond ANSI SQL. This is product-specific coding pattern knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/delta/optimize,Optimize,Optimize data file layout - Azure Databricks,Optimize Delta table file layout on Databricks,Learn to compact small data files and improve data layout for enhanced query performance with optimize.,"Predictive optimization automatically runs OPTIMIZE on Unity Catalog managed tables. 
Databricks recommends enabling predictive optimization for all Unity Catalog managed tables to simplify data maintenance and reduce storage costs. See Predictive optimization for Unity Catalog managed tables. The OPTIMIZE command rewrites data files to improve data layout for tables. For tables with liquid clustering enabled, OPTIMIZE rewrites data files to group data by liquid clustering keys. For tables with partitio",2026-03-31T08:00:00.000Z,how-to,best-practices,0.7,True,"Explains OPTIMIZE behavior, interaction with liquid clustering and partitioning, and recommends enabling predictive optimization—product-specific performance best practices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/delta/row-tracking,Row tracking,Row tracking in Databricks - Azure Databricks,Enable and use row tracking for Delta and Iceberg tables,Learn how row tracking enables tracking how rows change across table versions.,"Row tracking allows Azure Databricks to track row-level lineage in a table. Some incremental updates for materialized views require this feature. All Apache Iceberg v3 tables include row tracking. See Use Apache Iceberg v3 features. For Delta Lake tables, you must explicitly enable row tracking. Important Row tracking is available in Databricks Runtime 14.1 and above. Row tracking is a table feature and uses a higher table writer protocol than some clients. 
Table protocol versions can't be downgr",2026-04-10T18:06:00.000Z,how-to,configuration,0.75,True,"Describes enabling row tracking, its runtime requirements, and protocol implications; includes specific feature flags/DDL and version constraints unique to Databricks.",unchanged https://learn.microsoft.com/en-us/azure/databricks/delta/s3-limitations,Delta Lake limitations on S3,Delta Lake limitations on S3 - Azure Databricks,Handle Delta Lake limitations on Amazon S3,Learn limitations and edge cases when working with Delta Lake on AWS S3.,"This article details some of the limitations you might encounter while working with data stored in S3 with Delta Lake on Azure Databricks. The eventually consistent model used in Amazon S3 can lead to potential problems when multiple systems or clusters modify data in the same table simultaneously. Azure Databricks and Delta Lake support multi-cluster writes by default, meaning that queries writing to a table from multiple clusters at the same time won't corrupt the table. For Delta tables store",2026-03-06T08:00:00.000Z,reference,best-practices,0.75,True,"Documents S3-specific limitations and edge cases for Delta Lake, including behavior under eventual consistency and multi-cluster writes—product- and platform-specific gotchas.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/delta/selective-overwrite,Selective overwrite,Selectively overwrite data with Delta Lake - Azure Databricks,Apply selective overwrite patterns in Delta Lake,Use replaceWhere and dynamic partition overwrites for selective overwrites with Delta Lake.,"Delta Lake has the following distinct options for selective overwrites: For most operations, Databricks recommends using replaceWhere to specify which data to overwrite. 
Important If data has been accidentally overwritten, you can use restore to undo the change.",2026-04-13T21:52:00.000Z,how-to,best-practices,0.7,True,"Provides concrete recommendations such as preferring replaceWhere for most operations and using restore if data is accidentally overwritten. These are product-specific DO/DON’T operational patterns for Delta Lake, fitting best-practices.",updated +https://learn.microsoft.com/en-us/azure/databricks/delta/selective-overwrite,Selective overwrite,Selectively overwrite data with Delta Lake - Azure Databricks,Apply selective overwrite patterns with Delta Lake,Use replaceWhere and dynamic partition overwrites for selective overwrites with Delta Lake.,"Delta Lake has the following distinct options for selective overwrites: For most operations, Databricks recommends using replaceWhere to specify which data to overwrite. Important If data has been accidentally overwritten, you can use restore to undo the change.",2026-04-20T08:00:00.000Z,how-to,best-practices,0.65,True,"Covers specific Delta Lake options (replaceWhere, dynamic partition overwrite) with guidance on when to use each and a recommendation to prefer replaceWhere. This is product-specific DO/DON'T guidance and patterns rather than generic theory.",updated https://learn.microsoft.com/en-us/azure/databricks/delta/table-details,View table details,Review table details with describe detail - Azure Databricks,Inspect Azure Databricks Delta table metadata with DESCRIBE DETAIL,"View table details, configurations, and metadata with the describe detail command.","You can retrieve detailed information about a table (for example, number of files, data size) using DESCRIBE DETAIL. For Spark SQL syntax details, see DESCRIBE DETAIL. 
See the Delta Lake API documentation for Scala/Java/Python syntax details.",2026-03-06T08:00:00.000Z,how-to,configuration,0.7,True,"DESCRIBE DETAIL exposes many table-level configuration fields and metadata (e.g., properties, protocol versions, stats) that are product-specific. This is effectively a reference for what details are returned and how to interpret them, which is configuration-focused expert knowledge beyond generic SQL DESCRIBE behavior.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/delta/table-properties,Table properties reference,Table properties reference - Azure Databricks,Reference table properties for Delta and Iceberg on Databricks,Reference list for table properties in Azure Databricks.,"Delta Lake and Apache Iceberg use table properties to control table behavior and features. These properties might have specific meanings and affect behaviors when set. Note All operations that set or update table properties conflict with other concurrent write operations, causing them to fail. Databricks recommends you modify a table property only when there are no concurrent write operations on the table.",2026-04-16T08:00:00.000Z,reference,configuration,0.9,True,"Explicitly a reference list of table properties that control behavior. Such pages typically include property names, allowed values, and effects—product-specific configuration details that qualify as expert knowledge under the configuration category.",updated +https://learn.microsoft.com/en-us/azure/databricks/delta/table-properties,Table properties reference,Table properties reference - Azure Databricks,Reference table properties for Delta and Iceberg on Databricks,Reference list for table properties in Azure Databricks.,"Delta Lake and Apache Iceberg use table properties to control table behavior and features. These properties might have specific meanings and affect behaviors when set. 
Note All operations that set or update table properties conflict with other concurrent write operations, causing them to fail. Databricks recommends you modify a table property only when there are no concurrent write operations on the table.",2026-04-16T08:00:00.000Z,reference,configuration,0.9,True,"Explicitly a reference list of table properties that control behavior. Such pages typically include property names, allowed values, and effects—product-specific configuration details that qualify as expert knowledge under the configuration category.",unchanged https://learn.microsoft.com/en-us/azure/databricks/delta/tune-file-size,Tune file size,Control data file size - Azure Databricks,Control Delta table data file size on Azure Databricks,Control target file size manually or configure file size autotuning.,"Note The manual tuning recommendations in this article do not apply to Unity Catalog managed tables, which use automatic file size tuning. For new tables, use Unity Catalog managed tables with default settings. In Databricks Runtime 13.3 and above, Databricks recommends using clustering for table layout. See Use liquid clustering for tables. Databricks recommends using predictive optimization to automatically run OPTIMIZE and VACUUM for tables. See Predictive optimization for Unity Catalog managed tab",2026-04-03T17:47:00.000Z,best-practice,best-practices,0.7,True,"Provides Databricks-specific recommendations for manual vs automatic file size tuning, including when Unity Catalog managed tables override manual tuning and how file size interacts with clustering and predictive optimization. 
These are concrete, product-specific tuning practices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/delta/tutorial,Tutorial,Tutorial: Create and manage Delta Lake tables - Azure Databricks,,"Create, upsert, read, write, update, delete, display history, query using time travel, optimize, liquid clustering, and clean up operations for Delta Lake tables.","This tutorial demonstrates common Delta table operations using sample data. Delta Lake is the optimized storage layer that provides the foundation for tables on Databricks. Unless otherwise specified, all tables on Databricks are Delta tables.",2026-03-31T08:00:00.000Z,tutorial,,0.35,False,Tutorial on common Delta table operations; primarily step-by-step usage rather than exhaustive configuration tables or limits.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/delta/type-widening,Type widening,Type widening - Azure Databricks,Configure Delta Lake type widening in Databricks,Learn how to widen column data types in Delta tables without rewriting data files using type widening in Databricks Runtime 15.4 LTS and above.,Tables with type widening enabled allow you to change column data types to a wider type without rewriting underlying data files. You can either change column types manually or use schema evolution to evolve column types. Important Type widening is available in Databricks Runtime 15.4 LTS and above. Tables with type widening enabled can only be read in Databricks Runtime 15.4 LTS and above. Type widening requires Delta Lake. All Unity Catalog managed tables use Delta Lake by default.,2026-04-13T08:00:00.000Z,feature-guide,configuration,0.7,True,"Describes a Databricks Runtime–specific feature (type widening) with product-version constraints and behavior details (only works on Runtime 15.4 LTS+ and requires Delta Lake). 
While the summary doesn’t show full parameter tables, this is a product-specific configuration/behavior toggle rather than a generic concept, so it fits configuration better than other categories.",updated +https://learn.microsoft.com/en-us/azure/databricks/delta/type-widening,Type widening,Type widening - Azure Databricks,Configure Delta Lake type widening in Databricks,Learn how to widen column data types in Delta tables without rewriting data files using type widening in Databricks Runtime 15.4 LTS and above.,Tables with type widening enabled allow you to change column data types to a wider type without rewriting underlying data files. You can either change column types manually or use schema evolution to evolve column types. Important Type widening is available in Databricks Runtime 15.4 LTS and above. Tables with type widening enabled can only be read in Databricks Runtime 15.4 LTS and above. Type widening requires Delta Lake. All Unity Catalog managed tables use Delta Lake by default.,2026-04-13T08:00:00.000Z,feature-guide,configuration,0.7,True,"Describes a Databricks Runtime–specific feature (type widening) with product-version constraints and behavior details (only works on Runtime 15.4 LTS+ and requires Delta Lake). While the summary doesn’t show full parameter tables, this is a product-specific configuration/behavior toggle rather than a generic concept, so it fits configuration better than other categories.",unchanged https://learn.microsoft.com/en-us/azure/databricks/delta/uniform,Universal Format (UniForm),Read Delta tables with Iceberg clients - Azure Databricks,Configure Delta tables for Iceberg client reads (UniForm) on Azure Databricks,"Configure Delta tables to be read as Iceberg tables, a functionality formerly known as Universal Format (UniForm).",This article provides details for enabling Iceberg reads on tables stored with Delta Lake in Azure Databricks. This feature requires Databricks Runtime 14.3 LTS or above. 
Note This functionality was previously called Delta Lake Universal Format (UniForm). You can configure an external connection to have Unity Catalog act as an Iceberg catalog. See Access Azure Databricks tables from Apache Iceberg clients.,2026-03-23T08:00:00.000Z,feature-guide,configuration,0.75,True,"Gives specific steps and properties to enable Iceberg reads on Delta tables and configure Unity Catalog as an Iceberg catalog. These are concrete, product-specific configuration patterns for interoperability.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/delta/update-schema,Update schema,Update table schema - Azure Databricks,,"Manual or automatic table schema updates to add, rename, or drop columns.","Tables support schema evolution, allowing modifications to table structure as data requirements change. The following types of changes are supported: Make these changes explicitly using DDL or implicitly using DML. Important Schema updates conflict with all concurrent write operations. Databricks recommends coordinating schema changes to avoid write conflicts. Updating a table schema terminates any streams reading from that table. To continue processing, restart the stream using the methods desc",2026-04-13T21:03:00.000Z,how-to,,0.3,False,"High-level description of schema evolution capabilities and coordination concerns; the summary does not indicate specific configuration parameters, numeric thresholds, or error codes. Likely more of a conceptual/how-to page than a reference with expert-only details.",updated -https://learn.microsoft.com/en-us/azure/databricks/delta/vacuum,Vacuum,Remove unused data files with vacuum - Azure Databricks,Vacuum Delta tables safely and efficiently on Databricks,Remove stale data files to reduce storage costs with vacuum command.,Predictive optimization automatically runs VACUUM on Unity Catalog managed tables. 
Databricks recommends enabling predictive optimizations for all Unity Catalog managed tables to simplify data maintenance and reduce storage costs. See Predictive optimization for Unity Catalog managed tables. Remove data files no longer referenced by a table that are older than the retention threshold by running the VACUUM command on the table. Running VACUUM regularly is important for cost and compliance because of the,2026-04-03T08:00:00.000Z,feature-guide,best-practices,0.7,True,"VACUUM behavior on Delta/Unity Catalog is product-specific; page typically includes retention thresholds, recommended settings, and gotchas (e.g., data loss risks, minimum retention), which are concrete Databricks-specific maintenance recommendations.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/delta/update-schema,Update schema,Update table schema - Azure Databricks,,"Manual or automatic table schema updates to add, rename, or drop columns.","Tables support schema evolution, allowing modifications to table structure as data requirements change. The following types of changes are supported: Make these changes explicitly using DDL or implicitly using DML. Important Schema updates conflict with all concurrent write operations. Databricks recommends coordinating schema changes to avoid write conflicts. Updating a table schema terminates any streams reading from that table. To continue processing, restart the stream using the methods desc",2026-04-13T21:03:00.000Z,how-to,,0.3,False,"High-level description of schema evolution capabilities and coordination concerns; the summary does not indicate specific configuration parameters, numeric thresholds, or error codes. 
Likely more of a conceptual/how-to page than a reference with expert-only details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/delta/vacuum,Vacuum,Remove unused data files with vacuum - Azure Databricks,Vacuum Delta tables safely and efficiently in Azure Databricks,Remove stale data files to reduce storage costs with vacuum command.,Predictive optimization automatically runs VACUUM on Unity Catalog managed tables. Databricks recommends enabling predictive optimizations for all Unity Catalog managed tables to simplify data maintenance and reduce storage costs. See Predictive optimization for Unity Catalog managed tables. Remove data files no longer referenced by a table that are older than the retention threshold by running the VACUUM command on the table. Running VACUUM regularly is important for cost and compliance because of the,2026-04-21T08:00:00.000Z,feature-guide,best-practices,0.7,True,"Covers using VACUUM with retention thresholds and recommendations (predictive optimization, regular scheduling) that are specific to Databricks Delta behavior and compliance/cost trade-offs. This is actionable product-specific maintenance guidance rather than generic SQL knowledge.",updated https://learn.microsoft.com/en-us/azure/databricks/delta/variant,Variant,Variant type support - Azure Databricks,Store semi-structured data with VARIANT type,Learn about using the variant type for semi-structured data on Azure Databricks.,"Important This feature is in Public Preview. The VARIANT data type stores semi-structured data. For examples on working with VARIANT, see Query variant data. 
You must use Databricks Runtime 15.3 or above to read and write tables with variant support enabled.",2026-02-09T20:20:00.000Z,feature-guide,configuration,0.65,True,"Documents Databricks VARIANT type behavior, enabling/disabling it on tables, and runtime constraints, which are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/delta/variant-shredding,Variant shredding,Optimize performance on the VARIANT data with shredding - Azure Databricks,Optimize VARIANT data performance with shredding on Azure Databricks,Learn how to optimize performance on the VARIANT data with shredding with Azure Databricks.,"Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page describes how to use shredding to optimize query performance on tables with semi-structured data in VARIANT columns. See VARIANT type, Variant type support, and Query variant data.",2026-03-24T22:15:00.000Z,feature-guide,best-practices,0.7,True,"Describes how to apply shredding to VARIANT columns, including Databricks-specific behavior and recommended usage patterns to improve query performance on semi-structured data. These are concrete optimization practices.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/designer/,Overview,Lakeflow Designer - Azure Databricks,,Learn how to use Lakeflow Designer to visually build data transformation workflows on Azure Databricks with a drag-and-drop canvas.,"Important This feature is in Public Preview. Lakeflow Designer is a visual, no-code, AI-native experience for data preparation and analytics. Built directly in Azure Databricks, it lets users prepare and explore data through a drag-and-drop canvas and natural language. Designer workflows are backed by production-ready code and fully governed by Unity Catalog. 
This enables workflows to move smoothly from prototype to production, with no need to reverse-engineer or rebuild.",2026-04-22T21:32:00.000Z,landing-page,,0.2,False,"High-level description of Lakeflow Designer as a visual, no-code experience; appears to be overview/marketing without detailed limits, config tables, or troubleshooting content.",new +https://learn.microsoft.com/en-us/azure/databricks/designer/build-transformation,Create a Visual data prep,How to create a Visual data prep in Lakeflow Designer - Azure Databricks,,Learn how to create and run a Visual data prep in Lakeflow Designer on Azure Databricks.,"Important This feature is in Public Preview. Lakeflow Designer lets you build data transformation workflows on a visual, drag-and-drop canvas. This page explains how to create a Visual data prep — from adding a data source and chaining operators to previewing results and writing to Unity Catalog. To build a Visual data prep:",2026-04-22T21:32:00.000Z,how-to,,0.3,False,"Step-by-step guidance on creating a Visual data prep; tutorial-style content without explicit limits, configuration reference tables, or troubleshooting mappings.",new +https://learn.microsoft.com/en-us/azure/databricks/designer/built-in-operators,Built-in operators,Built-in operators in Lakeflow Designer - Azure Databricks,,"Reference for the built-in operators available in Lakeflow Designer on Azure Databricks, including configuration options and usage.","Important This feature is in Public Preview. Lakeflow Designer includes built-in operators for common data preparation and transformation tasks. Open the operator menu in the side panel on the left to browse operators by category, or use Search for an operator... at the top of the panel. 
To open an operator's configuration pane after you add it to the canvas, double-click it or hold the pointer over it and click (Edit operator).",2026-04-22T21:32:00.000Z,reference,,0.4,False,"Reference for built-in operators with configuration options, but summary does not indicate structured parameter tables with defaults/ranges or other expert-only details; likely general usage descriptions.",new +https://learn.microsoft.com/en-us/azure/databricks/designer/ingest-data,Ingest data,Ingest data into Lakeflow Designer - Azure Databricks,,"Learn how to bring data into Lakeflow Designer on Azure Databricks, including local files, cloud storage, Google Drive, and SharePoint.","Important This feature is in Public Preview. This page describes the available options for bringing data into a Visual data prep in Lakeflow Designer. Designer can work with any data accessible through Azure Databricks. All data ingestion in Designer starts with the Source operator. When you open a Source operator's configuration pane, you have the following options.",2026-04-22T21:32:00.000Z,how-to,,0.3,False,"Describes options for ingesting data (local files, cloud storage, etc.) into Designer; appears to be procedural guidance rather than detailed configuration reference or limits/quotas.",new +https://learn.microsoft.com/en-us/azure/databricks/designer/what-is-lakeflow-designer,What is Lakeflow Designer?,What is Lakeflow Designer? - Azure Databricks,,"Learn what Lakeflow Designer is, how it works, and the key concepts behind building visual data transformation workflows on Azure Databricks.","Important This feature is in Public Preview. Lakeflow Designer provides a visual canvas for analysts to perform data analytics, preparation, and basic automation. In Designer, you create Visual data preps, each made up of a sequence of operators (such as filter, join, and transform) arranged as a DAG to produce a result. All transformations are backed by code, which supports moving workflows to production. 
With Lakeflow Designer, you can: In the image above, you can see:",2026-04-22T21:32:00.000Z,concept-article,,0.2,False,"Conceptual 'what is' page explaining Lakeflow Designer and key concepts; no indication of numeric limits, configuration parameter tables, or decision matrices.",new https://learn.microsoft.com/en-us/azure/databricks/dev-tools/,Overview,Local development tools - Azure Databricks,Choose local development tools for Azure Databricks,Learn about the tools you can use to develop applications that integrate with Azure Databricks.,"Databricks provides an ecosystem of tools to help you develop applications and solutions that integrate with Azure Databricks and programmatically manage Databricks resources and data. This page provides recommendations for the best tools for common developer scenarios. For a complete overview of developer tools, see Develop on Databricks.",2026-03-16T17:36:00.000Z,landing-page,decision-making,0.65,True,"Described as providing recommendations for the best tools for common developer scenarios, which is explicit decision guidance about which Databricks-related dev tools to use when. That is product-specific decision-making content (tool selection by scenario) beyond a generic overview.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/,Authentication overview,Authorize access to Azure Databricks resources - Azure Databricks,Configure authorization for Azure Databricks APIs and CLI,Learn how to authorize access to Databricks resources through the Databricks CLI or APIs.,"This page explains how to authorize access to Azure Databricks resources using the Databricks CLI and REST APIs. It describes different authorization methods, when to use them, and how to configure authentication for your use case. To access Azure Databricks resources with the CLI or REST APIs, you must authenticate using an Azure Databricks account with appropriate permissions. 
Your Azure Databricks administrator or a user with administrator privileges configures the account.",2026-04-03T08:00:00.000Z,concept-article,security,0.75,True,"Explains concrete authorization methods, when to use each, and how to configure authentication for Databricks CLI and REST APIs. Contains product-specific auth flows and permission requirements.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/aad-token-manual,Manual token generation,Get Microsoft Entra ID tokens manually - Azure Databricks,Manually obtain Microsoft Entra tokens for Databricks REST APIs,Learn how to manually generate Microsoft Entra ID access tokens for users and service principals to authenticate with Databricks REST APIs.,"This page describes how to manually generate Microsoft Entra ID access tokens for users and service principals to authenticate with Azure Databricks REST APIs. Manual token generation is an advanced technique. Important Databricks doesn't recommend manually creating Microsoft Entra ID tokens. They expire within one hour and require manual replacement. Instead, use tools or SDKs with unified authentication to handle token management automatically. Use Azure Databricks managed service principals for m",2026-01-24T08:00:00.000Z,how-to,security,0.8,True,"Explains manual generation of Entra ID access tokens for Databricks, including token scopes, lifetimes, and usage. 
Contains advanced, product-specific security/auth details.",unchanged @@ -533,14 +539,14 @@ https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/config-profile https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/env-vars,Environment variables and fields reference,Environment variables and fields for unified authentication - Azure Databricks,Reference for Databricks unified authentication environment variables,Complete reference for environment variables and fields used with Databricks unified authentication across all tools and SDKs.,"This reference lists environment variables and configuration fields for Databricks unified authentication. They work consistently across the Databricks CLI, Terraform provider, and SDKs for Python, Java, and Go. Use this reference to set up authentication or troubleshoot authentication issues. Each entry includes:",2026-01-16T08:00:00.000Z,reference,configuration,0.9,True,Explicit reference listing environment variables and configuration fields with names and semantics for unified authentication. 
This is a configuration table-style reference unique to Databricks.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-federation,Overview,Authenticate access to Azure Databricks using OAuth token federation - Azure Databricks,,Learn about OAuth token federation authentication and authorization for Azure Databricks.,This page provides overview information about OAuth token federation for accessing Azure Databricks account and workspace resources using tokens from your identity provider.,2026-01-29T08:00:00.000Z,overview,,0.4,False,Described as overview information about OAuth token federation; likely conceptual without detailed config tables or role mappings.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-federation-exchange,Authenticate with an identity provider token,Authenticate with an identity provider token - Azure Databricks,Authenticate to Databricks using federated IdP tokens,Learn how to securely call a Databricks REST API using a token from a federated identity provider.,"This page explains how to authenticate to Azure Databricks using a token issued by your organization’s identity provider. Azure Databricks supports OAuth 2.0 Token Exchange to let you exchange a federated identity token for a Databricks OAuth token. With token federation, the Databricks CLI, SDKs, and other tools can automatically handle this exchange and manage access tokens for you. 
The lifetime of each access token is derived from the lifetime of the federated token that you provide, which is m",2026-02-25T08:00:00.000Z,how-to,security,0.75,True,"Explains OAuth 2.0 token exchange with Databricks, including token lifetimes derived from IdP tokens and API usage patterns, which are product-specific security details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-federation-policy,Configure a federation policy,Configure a federation policy - Azure Databricks,Configure OAuth token federation policy for Azure Databricks,Learn how to create and configure an OAuth token federation policy for Azure Databricks.,"Databricks OAuth token federation enables you to securely access Databricks APIs using tokens from your identity provider (IdP). To enable OAuth token federation, you must configure a federation policy, either as Databricks account-wide or for workloads. This page describes how to create and configure an OAuth token federation policy.",2026-04-10T08:00:00.000Z,how-to,security,0.78,True,"Page describes how to create and configure an OAuth token federation policy, which is an authentication/authorization feature. It likely includes specific policy fields, allowed values, scopes, and configuration parameters unique to Databricks OAuth federation, which qualify as product-specific security configuration details.",updated +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-federation-policy,Configure a federation policy,Configure a federation policy - Azure Databricks,Configure OAuth federation policy for Azure Databricks,Learn how to create and configure an OAuth token federation policy for Azure Databricks.,"Databricks OAuth token federation enables you to securely access Databricks APIs using tokens from your identity provider (IdP). To enable OAuth token federation, you must configure a federation policy, either as Databricks account-wide or for workloads. 
This page describes how to create and configure an OAuth token federation policy.",2026-04-21T08:00:00.000Z,how-to,security,0.78,True,"Page is about configuring an OAuth token federation policy, which is a security/identity configuration topic. It likely includes concrete policy parameters (policy names, scopes, audiences, issuer/subject filters, claim mappings, and how they apply account-wide vs workload-specific), which are product-specific security settings rather than generic OAuth theory.",updated https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-federation-provider,Overview,Enable workload identity federation in CI/CD - Azure Databricks,Enable Databricks workload identity federation in CI/CD,"Learn how to enable workload identity federation, also known as OpenID Connect (OIDC), in your CI/CD pipeline in Databricks.","Databricks OAuth token federation, also known as OpenID Connect (OIDC), allows your automated workloads running outside of Databricks to securely access Databricks APIs without the need for Databricks secrets. See Authenticate access to Azure Databricks using OAuth token federation. With workload identity federation, your workload authenticates to Databricks as a service principal in your Databricks account using workload identity tokens issued by the automation environment. Important Databricks ",2026-01-16T08:00:00.000Z,overview,security,0.8,True,"Covers configuring OIDC-based workload identity federation for Databricks APIs, including how tokens are exchanged and how service principals are used. 
Product-specific security/auth configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-m2m,Overview,Authorize service principal access to Azure Databricks with OAuth - Azure Databricks,Authorize Databricks service principals with OAuth M2M,Learn how to set up OAuth authentication and authorization for Databricks on your cloud account with a Databricks service principal.,"This page explains how to authorize access to Azure Databricks resources from unattended processes, such as automated CLI commands or REST API calls made from scripts or applications. Azure Databricks uses OAuth 2.0 as the preferred protocol for service principal authorization and authentication outside of the UI. Unified client authentication automates token generation and refresh. When a service principal signs in and is granted consent, OAuth issues an access token for the CLI, SDK, or other to",2026-03-06T08:00:00.000Z,how-to,security,0.8,True,Describes OAuth 2.0 machine-to-machine setup for Databricks service principals with product-specific auth flows and configuration.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-u2m,OAuth authentication as a user,Authorize user access to Azure Databricks with OAuth - Azure Databricks,Authorize Databricks user access with OAuth,"Learn how to set up OAuth user account authorization for Azure Databricks APIs, SDKs, and tools.","This page explains how to authorize user access to Azure Databricks resources when using the Databricks CLI or Azure Databricks REST APIs. Azure Databricks uses OAuth 2.0 as the preferred protocol for user authorization and authentication outside of the UI. Unified client authentication automates token generation and refresh. After a user signs in and grants consent, OAuth issues an access token for the CLI, SDK, or other tool to use on the user’s behalf. 
Each access token is valid for one hour, a",2026-01-29T08:00:00.000Z,how-to,security,0.8,True,"Details OAuth 2.0 user authorization for Databricks APIs/SDKs, including token lifetimes (one hour) and unified auth behavior, matching security configuration criteria.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/pat,Personal access tokens (legacy),Authenticate with Azure Databricks personal access tokens (legacy) - Azure Databricks,Use Azure Databricks personal access tokens with limits,Learn how to set up Azure Databricks authentication by using Azure Databricks personal access tokens (PATs).,"Azure Databricks personal access tokens (PATs) let you authenticate to resources and APIs at the workspace level. You can store them in environment variables or Azure Databricks configuration profiles. Each PAT is valid for only one workspace, and a user can create up to 600 PATs per workspace. Azure Databricks automatically revokes PATs that haven’t been used for 90 days. Important Where possible, Databricks recommends using OAuth instead of PATs for user account authentication because OAuth pro",2026-04-14T08:00:00.000Z,how-to,limits-quotas,0.86,True,"Contains explicit numeric limits and lifecycle rules for PATs (for example, up to 600 PATs per user per workspace and automatic revocation after 90 days of inactivity). These are concrete quotas and time-based constraints that an LLM would not reliably know from training, matching the limits-quotas category.",updated +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-u2m,OAuth authentication as a user,Authorize user access to Azure Databricks with OAuth - Azure Databricks,Authorize Azure Databricks user access with OAuth,"Learn how to set up OAuth user account authorization for Azure Databricks APIs, SDKs, and tools.","This page explains how to authorize user access to Azure Databricks resources when using the Databricks CLI or Azure Databricks REST APIs. 
Azure Databricks uses OAuth 2.0 as the preferred protocol for user authorization and authentication outside of the UI. Unified client authentication automates token generation and refresh. After a user signs in and grants consent, OAuth issues an access token for the CLI, SDK, or other tool to use on the user’s behalf. Each access token is valid for one hour, a",2026-04-23T08:00:00.000Z,how-to,security,0.76,True,"Page covers setting up OAuth 2.0 user authorization for Databricks APIs/CLI/SDKs, including token lifetimes (access token valid for one hour) and likely specific auth configuration details (scopes, redirect URIs, client types, unified client auth behavior). These are product-specific authentication settings and token behaviors, fitting the security sub-skill.",updated +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/pat,Personal access tokens (legacy),Authenticate with Azure Databricks personal access tokens (legacy) - Azure Databricks,Use Azure Databricks personal access tokens with limits,Learn how to set up Azure Databricks authentication by using Azure Databricks personal access tokens (PATs).,"Azure Databricks personal access tokens (PATs) let you authenticate to resources and APIs at the workspace level. You can store them in environment variables or Azure Databricks configuration profiles. Each PAT is valid for only one workspace, and a user can create up to 600 PATs per workspace. Azure Databricks automatically revokes PATs that haven’t been used for 90 days. Important Where possible, Databricks recommends using OAuth instead of PATs for user account authentication because OAuth pro",2026-04-14T08:00:00.000Z,how-to,limits-quotas,0.86,True,"Contains explicit numeric limits and lifecycle rules for PATs (for example, up to 600 PATs per user per workspace and automatic revocation after 90 days of inactivity). 
These are concrete quotas and time-based constraints that an LLM would not reliably know from training, matching the limits-quotas category.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/provider-azure-devops,Azure DevOps Pipelines,Enable workload identity federation for Azure DevOps Pipelines - Azure Databricks,Configure Databricks workload identity federation for Azure DevOps,Learn how to enable OAuth token federation for your Databricks CI/CD flows that use Azure DevOps Pipelines.,"Databricks OAuth token federation, also known as OpenID Connect (OIDC), allows your automated workloads running outside of Databricks to securely access Databricks without the need for Databricks secrets. See Authenticate access to Azure Databricks using OAuth token federation. To enable workload identity federation for Azure DevOps Pipelines: After you enable workload identity federation, the Databricks SDKs and the Databricks CLI automatically fetch workload identity tokens from Azure DevOps Pi",2026-01-16T08:00:00.000Z,how-to,security,0.8,True,Shows exact steps and settings to enable OAuth token federation between Azure DevOps Pipelines and Databricks. Contains provider-specific claims/config fields and security wiring.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/provider-circleci,CircleCI,Enable workload identity federation for CircleCI - Azure Databricks,Configure Databricks workload identity federation for CircleCI,Learn how to enable OAuth token federation for your Databricks CI/CD flows that use CircleCI.,"Databricks OAuth token federation, also known as OpenID Connect (OIDC), allows your automated workloads running outside of Databricks to securely access Databricks without the need for Databricks secrets. See Authenticate access to Azure Databricks using OAuth token federation. 
To enable workload identity federation for CircleCI: After you enable workload identity federation, the Databricks SDKs and the Databricks CLI automatically fetch workload identity tokens from CircleCI and exchange them fo",2026-01-16T08:00:00.000Z,how-to,security,0.8,True,Explains CircleCI-specific OIDC token configuration and Databricks federation setup. Product-specific security integration details and parameters.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/provider-github,GitHub Actions,Enable workload identity federation for GitHub Actions - Azure Databricks,Configure Databricks workload identity federation for GitHub Actions,Learn how to enable OAuth token federation for your Databricks CI/CD flows that use GitHub Actions.,"Databricks OAuth token federation, also known as OpenID Connect (OIDC), allows your automated workloads running outside of Databricks to securely access Databricks without the need for Databricks secrets. See Authenticate access to Azure Databricks using OAuth token federation. To enable workload identity federation for GitHub Actions: After you enable workload identity federation, the Databricks SDKs and the Databricks CLI automatically fetch workload identity tokens from GitHub and exchange the",2026-01-16T08:00:00.000Z,how-to,security,0.8,True,"Provides GitHub-specific configuration (OIDC settings, trust relationships) to federate identities to Databricks. 
Product- and platform-specific security integration details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/provider-github,GitHub Actions,Enable workload identity federation for GitHub Actions - Azure Databricks,Enable Databricks workload identity federation with GitHub Actions,Learn how to enable OAuth token federation for your Databricks CI/CD flows that use GitHub Actions.,"Databricks OAuth token federation, also known as OpenID Connect (OIDC), allows your automated workloads running outside of Databricks to securely access Databricks without Databricks secrets. See Authenticate access to Azure Databricks using OAuth token federation. To enable workload identity federation for GitHub Actions: After you enable workload identity federation, the Databricks SDKs and the Databricks CLI automatically fetch workload identity tokens from GitHub and exchange them for Databri",2026-04-21T22:28:00.000Z,how-to,integrations,0.7,True,"Page describes enabling OAuth/OIDC workload identity federation specifically for GitHub Actions CI/CD flows. This is an integration pattern between Azure Databricks and GitHub Actions and likely includes GitHub workflow configuration (OIDC provider, audience, subject, environment variables) and Databricks-side settings, which are concrete, product-specific integration parameters.",updated https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/provider-gitlab,GitLab CI/CD,Enable workload identity federation for GitLab CI/CD - Azure Databricks,Configure Databricks workload identity federation for GitLab CI/CD,Learn how to enable OAuth token federation for your Databricks CI/CD flows that use GitLab CI/CD.,"Databricks OAuth token federation, also known as OpenID Connect (OIDC), allows your automated workloads running outside of Databricks to securely access Databricks without the need for Databricks secrets. See Authenticate access to Azure Databricks using OAuth token federation. 
To enable workload identity federation for GitLab CI/CD: After you enable workload identity federation, the Databricks SDKs and the Databricks CLI automatically fetch workload identity tokens from GitLab CI/CD and exchange",2026-01-16T08:00:00.000Z,how-to,security,0.8,True,Describes GitLab CI/CD OIDC configuration and Databricks-side setup for token federation. Contains concrete security parameters and flows unique to this integration.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/provider-other,Other providers,"Enable workload identity federation for Terraform Cloud, Bitbucket Pipelines, or Jenkins - Azure Databricks",Enable Databricks workload identity federation for Terraform Cloud and others,"Learn how to enable OAuth token federation, also known as OIDC, for your Databricks CI/CD flows that use Terraform Cloud, Bitbucket Pipelines, or Jenkins.","Databricks OAuth token federation, also known as OpenID Connect (OIDC), allows your automated workloads running outside of Databricks to securely access Databricks without the need for Databricks secrets. See Authenticate access to Azure Databricks using OAuth token federation. To enable workload identity federation for Terraform Cloud, Atlassian Bitbucket Pipelines, or Jenkins: After you enable workload identity federation, the Databricks SDKs and the Databricks CLI automatically fetch workload ",2026-01-16T08:00:00.000Z,how-to,security,0.8,True,"Shows how to configure OIDC federation for Terraform Cloud, Bitbucket Pipelines, and Jenkins with Databricks. 
Contains concrete security configuration fields and flows.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/service-principals,Service principals for CI/CD,Service principals for CI/CD - Azure Databricks,Use Databricks service principals for CI/CD automation,Learn how to use service principals with CI/CD for Azure Databricks projects.,"This page describes how to use service principals for CI/CD with Azure Databricks. A service principal is an identity created for use with automated tools and applications, including: As a security best practice, Databricks recommends using a service principal and its token instead of your Azure Databricks user or your Databricks personal access token for your workspace user to give CI/CD platforms access to Azure Databricks resources. Some benefits to this approach include the following: To give ",2026-01-24T08:00:00.000Z,how-to,security,0.75,True,"Describes using service principals and tokens for CI/CD access to Databricks, including security best practices and specific identity configuration patterns.",unchanged @@ -561,12 +567,12 @@ https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/manual-bund https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/migrate-resources,Migrate existing resources,Migrate existing resources to a bundle - Azure Databricks,Generate bundle config from existing Databricks resources,"Learn how to generate bundle resource configuration for an existing resource, such as a job or pipeline, add it to a bundle and bind to the existing resource in the workspace.","When building your bundle, you may want to include Databricks resources that already exist and are fully configured in the remote workspace. You can use the Databricks CLI bundle generate command to quickly autogenerate configuration in your bundle for existing apps, dashboards, jobs, and pipelines. See databricks bundle generate. 
Configuration that you can copy and manually paste into bundle resource configuration files is available in the Databricks UI for some resources, such as jobs and pipelin",2026-03-16T08:00:00.000Z,upgrade-and-migration-article,configuration,0.7,True,Describes using `databricks bundle generate` and UI-exported configuration to bind existing jobs/pipelines/apps; involves concrete configuration structures and fields.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/mlops-stacks,MLOps Stacks tutorial,Declarative Automation Bundles for MLOps Stacks - Azure Databricks,Apply MLOps Stack best practices with bundles,Learn about how to use Declarative Automation Bundles to work with MLOps Stacks.,"You can use Declarative Automation Bundles, the Databricks CLI, and the Databricks MLOps Stack repository on GitHub to create MLOps Stacks. An MLOps Stack is an MLOps project on Azure Databricks that follows production best practices out of the box. See What are Declarative Automation Bundles?. This shows how to create, deploy, and run an MLOps Stacks bundle project.",2026-03-16T08:00:00.000Z,tutorial,best-practices,0.6,True,Describes MLOps Stacks that follow production best practices out of the box and how to create/deploy them with bundles; likely includes Databricks-specific patterns and recommendations.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/overrides,Dynamic setting overrides,Override with target settings - Azure Databricks,Override bundle settings with target-specific configuration,Learn how to override or join top-level settings with target settings in Declarative Automation Bundles.,"This page describes how to override or join top-level settings with target settings in Declarative Automation Bundles. 
For information about bundle target settings, see targets.",2026-03-16T08:00:00.000Z,concept-article,configuration,0.8,True,Explains how to override or join top-level settings with target settings; involves specific configuration patterns and keys for bundles.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/permissions,Resource permissions,Set permissions for resources in Declarative Automation Bundles - Azure Databricks,Set permissions for Azure Databricks Declarative Automation Bundles,Learn how to set permissions for resources in Declarative Automation Bundles.,"This article describes how to set permissions for resources in Declarative Automation Bundles. For information about resources supported in bundles, see Declarative Automation Bundles resources. In Azure Databricks bundle configuration files, you can define permissions at the top level to apply to all resources defined in the bundle, or you can define permissions to apply to specific resources. Note Permissions cannot overlap. In other words, permissions for a user, group, or service principal can",2026-04-15T22:08:00.000Z,how-to,security,0.78,True,"The article explains how to define permissions for resources in bundle configuration files, including rules such as non-overlapping permissions for users, groups, and service principals. This is product-specific IAM/permissions configuration, fitting the security category.",updated +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/permissions,Resource permissions,Set permissions for resources in Declarative Automation Bundles - Azure Databricks,Set permissions for Azure Databricks Declarative Automation Bundles,Learn how to set permissions for resources in Declarative Automation Bundles.,"This article describes how to set permissions for resources in Declarative Automation Bundles. For information about resources supported in bundles, see Declarative Automation Bundles resources. 
In Azure Databricks bundle configuration files, you can define permissions at the top level to apply to all resources defined in the bundle, or you can define permissions to apply to specific resources. Note Permissions cannot overlap. In other words, permissions for a user, group, or service principal can",2026-04-15T22:08:00.000Z,how-to,security,0.78,True,"The article explains how to define permissions for resources in bundle configuration files, including rules such as non-overlapping permissions for users, groups, and service principals. This is product-specific IAM/permissions configuration, fitting the security category.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/pipelines-tutorial,Bundle with pipelines tutorial,Develop pipelines with Declarative Automation Bundles - Azure Databricks,,Complete a hands-on tutorial that demonstrates how to use Declarative Automation Bundles to work with Lakeflow Spark Declarative Pipelines.,"Declarative Automation Bundles (formerly known as Databricks Asset Bundles) enable you to programmatically validate, deploy, and run Azure Databricks resources such as Lakeflow Spark Declarative Pipelines. See What are Declarative Automation Bundles?. This page describes how to create a bundle to programmatically manage a pipeline. See Lakeflow Spark Declarative Pipelines. The bundle is created using the Databricks CLI pipelines init command, which defines an ETL pipeline and job to run it. 
You then",2026-03-16T08:00:00.000Z,tutorial,,0.3,False,Tutorial for using bundles with pipelines; procedural guidance rather than detailed configuration matrices or limits.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/python-wheel,Python wheel builds tutorial,Build a Python wheel file using Declarative Automation Bundles - Azure Databricks,Configure bundles to build and deploy Python wheels,Learn how to build and deploy Python wheel files in Declarative Automation Bundles.,"This page describes how to build, deploy, and run a Python wheel file using Declarative Automation Bundles. See What are Declarative Automation Bundles?. For an example configuration that builds a JAR and uploads it to Unity Catalog, see Bundle that uploads a JAR file to Unity Catalog.",2026-03-16T08:00:00.000Z,tutorial,configuration,0.65,True,"Describes how to declare and configure Python wheel build/deploy in bundles, likely including specific YAML keys and parameter values unique to Declarative Automation Bundles.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/python/,Configuration in Python,Bundle configuration in Python - Azure Databricks,Define and modify Databricks bundle resources using Python,Learn how to work with Python support for Declarative Automation Bundles,Python support for Declarative Automation Bundles extends Declarative Automation Bundles with additional capabilities that apply during bundle deployment so that you can: Define resources in Python code. These definitions can coexist with resources defined in YAML. Dynamically create resources during bundle deployment using metadata. See Create resources using metadata. Modify resources defined in YAML or Python during bundle deployment. See Modify resources defined in YAML or Python. 
Tip You can als,2026-03-16T17:36:00.000Z,tutorial,configuration,0.8,True,Python support for bundles includes APIs and patterns for defining resources in code and modifying YAML; these are detailed configuration/programmatic patterns specific to bundles.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/reference,Bundle configuration reference,Configuration reference - Azure Databricks,Reference for databricks.yml bundle configuration keys,Configuration reference for databricks.yml,"This article provides reference for keys supported by Declarative Automation Bundles (formerly known as Databricks Asset Bundles) configuration (YAML). See What are Declarative Automation Bundles?. For complete bundle examples, see Bundle configuration examples and the bundle-examples GitHub repository.",2026-04-02T08:00:00.000Z,reference,configuration,0.95,True,"Explicit configuration reference for all supported YAML keys, likely with parameter names, types, and allowed values—core expert configuration knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/resources,Bundle resources,Declarative Automation Bundles resources - Azure Databricks,Configure Azure Databricks Declarative Automation Bundles resources,Learn about resources supported by Declarative Automation Bundles and how to configure them.,"Declarative Automation Bundles (formerly known as Databricks Asset Bundles) allows you to specify information about the Azure Databricks resources used by the bundle in the resources mapping in the bundle configuration. See resources reference. This page provides configuration reference for all supported resource types for bundles and provides details and an example for each supported type. For additional examples, see Bundle configuration examples. 
The JSON schema for bundles that is used to valida",2026-04-15T08:00:00.000Z,reference,configuration,0.86,True,"The page is a configuration reference for all supported resource types in Declarative Automation Bundles, describing how to specify resources in the bundle configuration. This implies detailed resource-specific configuration parameters and schema-level options unique to Databricks bundles, matching the configuration category.",updated +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/resources,Bundle resources,Declarative Automation Bundles resources - Azure Databricks,Configure Azure Databricks Declarative Automation Bundles resources,Learn about resources supported by Declarative Automation Bundles and how to configure them.,"Declarative Automation Bundles (formerly known as Databricks Asset Bundles) allows you to specify information about the Azure Databricks resources used by the bundle in the resources mapping in the bundle configuration. See resources reference. This page provides configuration reference for all supported resource types for bundles and provides details and an example for each supported type. For additional examples, see Bundle configuration examples. The JSON schema for bundles that is used to valida",2026-04-15T08:00:00.000Z,reference,configuration,0.86,True,"The page is a configuration reference for all supported resource types in Declarative Automation Bundles, describing how to specify resources in the bundle configuration. 
This implies detailed resource-specific configuration parameters and schema-level options unique to Databricks bundles, matching the configuration category.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/run-as,Bundle run identity,Specify a run identity for a Declarative Automation Bundles workflow - Azure Databricks,Specify run identities for bundle workflows,Learn how to specify the identity to use to run a bundle workflow.,"This article describes how to use the run_as setting to specify the identity to use when running Declarative Automation Bundles workflows. The run_as setting can be configured as a top-level mapping to apply to resources, or within a target deployment mapping in a bundle configuration file. It can be set to a user_name or a service_principal_name. (Non-admins can only set this field to their own email.) This setting provides the ability to separate the identity used to deploy a bundle job or pipeline from",2026-03-16T08:00:00.000Z,how-to,security,0.85,True,"Details the `run_as` setting, allowed values (`user_name`, `service_principal_name`), and constraints (non-admin restrictions); product-specific IAM configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/scala-jar,Scala JAR build tutorial,Build a Scala JAR using Declarative Automation Bundles - Azure Databricks,Configure bundles to build and deploy Scala JARs,Learn how to build and deploy a Scala JAR to serverless compute using Declarative Automation Bundles.,"This article describes how to build, deploy, and run a Scala JAR with Declarative Automation Bundles. For information about bundles, see What are Declarative Automation Bundles?. 
For example configuration that builds a Java JAR and uploads it to Unity Catalog, see Bundle that uploads a JAR file to Unity Catalog.",2026-03-16T08:00:00.000Z,tutorial,configuration,0.65,True,"Explains bundle configuration for Scala JAR build and deployment, with product-specific YAML keys and settings for serverless compute.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/settings,Overview,Declarative Automation Bundles configuration - Azure Databricks,Understand Databricks bundle configuration file syntax,Learn about Declarative Automation Bundles configuration file syntax. Bundles enable programmatic management of Azure Databricks workflows.,"This article describes the syntax for bundle configuration files, which define Declarative Automation Bundles (formerly known as Databricks Asset Bundles). See What are Declarative Automation Bundles?. To create and work with bundles, see Develop Declarative Automation Bundles. For bundle configuration reference, see Configuration reference.",2026-03-16T17:36:00.000Z,reference,configuration,0.85,True,"Explicitly about configuration file syntax; will contain keys, allowed values, and structural rules for databricks.yml, which are expert configuration details.",unchanged @@ -580,15 +586,15 @@ https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/workspace,O https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/workspace-author,Author bundles in the workspace,Author bundles in the workspace - Azure Databricks,Author and edit bundle configurations in the Databricks workspace,Learn how to author Declarative Automation Bundles in the workspace.,"Declarative Automation Bundles can be created and modified directly in the workspace. For requirements for using bundles in the workspace, see Declarative Automation Bundles in the workspace requirements. 
For more information about bundles, see What are Declarative Automation Bundles?.",2026-03-16T17:36:00.000Z,how-to,configuration,0.65,True,Focuses on authoring bundles in workspace; includes specific UI-driven configuration flows and constraints for bundle files and targets.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/workspace-deploy,Deploy a bundle from the workspace,Deploy bundles and run workflows from the workspace - Azure Databricks,Deploy Databricks bundles and run workflows from the workspace,Learn how to deploy and run Declarative Automation Bundles from the workspace.,"Assets that are part of your Declarative Automation Bundles can be created and modified from a local development environment or the workspace, but in order for the changes to be synchronized with the corresponding Databricks resources, bundles must be deployed. Bundles have unique identities in a workspace, so regardless of whether a bundle is deployed from a local machine or the workspace, bundle assets are not duplicated. For requirements for using bundles in the workspace, see Declarative Auto",2026-03-16T17:36:00.000Z,how-to,deployment,0.7,True,"Covers deployment behavior, unique bundle identities, and synchronization; these are product-specific deployment semantics and constraints.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/workspace-tutorial,Tutorial: Create and deploy a bundle in the workspace,Tutorial: Create and deploy a bundle in the workspace - Azure Databricks,Create and deploy bundles from the Databricks workspace,Learn how to create and deploy Declarative Automation Bundles in the workspace.,"To help you get started using Declarative Automation Bundles in the workspace, this tutorial walks you through creating a bundle with a job, deploying it, and running the job in the bundle - all from the workspace. 
For requirements for using bundles in the workspace, see Declarative Automation Bundles in the workspace requirements. For more information about bundles, see What are Declarative Automation Bundles?.",2026-03-16T17:36:00.000Z,tutorial,configuration,0.7,True,Tutorial for workspace bundle creation and deployment; will show concrete configuration steps and requirements specific to workspace-based bundles.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/,Overview,CI/CD on Azure Databricks - Azure Databricks,Implement CI/CD pipelines for Azure Databricks,Learn how to use continuous integration and continuous delivery (CI/CD) systems with Databricks.,"Continuous integration and continuous delivery (CI/CD) refers to the process of developing and delivering software in short, frequent cycles through the use of automation pipelines. CI/CD is common in software development, and is becoming increasingly necessary in data engineering and data science. By automating the building, testing, and deployment of code, development teams deliver releases more reliably than with manual processes. Databricks provides tools for developing CI/CD pipelines that ",2026-04-16T08:00:00.000Z,overview,deployment,0.63,True,"CI/CD on Azure Databricks documentation typically includes Databricks-specific deployment patterns, supported methods, and constraints for automating builds/tests/deployments. 
These are product-specific deployment details and patterns, beyond generic CI/CD concepts, fitting the deployment sub-skill.",updated +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/,Overview,CI/CD on Azure Databricks - Azure Databricks,Implement CI/CD pipelines for Azure Databricks,Learn how to use continuous integration and continuous delivery (CI/CD) systems with Databricks.,"Continuous integration and continuous delivery (CI/CD) refers to the process of developing and delivering software in short, frequent cycles through the use of automation pipelines. CI/CD is common in software development, and is becoming increasingly necessary in data engineering and data science. By automating the building, testing, and deployment of code, development teams deliver releases more reliably than with manual processes. Databricks provides tools for developing CI/CD pipelines that ",2026-04-16T08:00:00.000Z,overview,deployment,0.63,True,"CI/CD on Azure Databricks documentation typically includes Databricks-specific deployment patterns, supported methods, and constraints for automating builds/tests/deployments. These are product-specific deployment details and patterns, beyond generic CI/CD concepts, fitting the deployment sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/azure-devops,Azure DevOps,Continuous integration and delivery on Azure Databricks using Azure DevOps - Azure Databricks,Set up Azure DevOps CI/CD pipelines for Databricks,Learn how to use Azure DevOps to enable CI/CD for Azure Databricks projects.,"Note This article covers Azure DevOps, which is developed by a third party. To contact the provider, see Azure DevOps Services support. This article guides you through configuring Azure DevOps to work with Azure Databricks. 
Specifically, you will configure a continuous integration and delivery (CI/CD) workflow to connect to a Git repository, run jobs using Azure Pipelines to build and unit test a Python wheel (*.whl), and deploy it for use in Databricks notebooks. For an overview of CI/CD with Azur",2026-01-30T19:03:00.000Z,how-to,deployment,0.65,True,"Configures Azure DevOps pipelines to build, test, and deploy Python wheels to Databricks; includes Databricks-specific CI/CD pipeline configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/best-practices,Best practices,Best practices and recommended CI/CD workflows on Databricks - Azure Databricks,Apply Databricks-recommended CI/CD workflows and patterns,Learn about CI/CD best practices and CI/CD workflows recommended by Databricks.,"CI/CD (Continuous Integration and Continuous Delivery) has become a cornerstone of modern data engineering and analytics, as it ensures that code changes are integrated, tested, and deployed rapidly and reliably. Databricks recognizes that you may have diverse CI/CD requirements shaped by your organizational preferences, existing workflows, and specific technology environment, and provides a flexible framework that supports various CI/CD options. 
This page describes best practices to help you de",2026-03-16T17:36:00.000Z,best-practice,best-practices,0.7,True,"Explicitly a best-practices page; Databricks CI/CD guidance typically includes product-specific workflow patterns, recommended configurations, and gotchas beyond generic CI/CD theory.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/best-practices,Best practices,Best practices and recommended CI/CD workflows on Databricks - Azure Databricks,Apply CI/CD best practices on Azure Databricks,Learn about CI/CD best practices and CI/CD workflows recommended by Databricks.,"CI/CD (Continuous Integration and Continuous Delivery) has become a cornerstone of modern data engineering and analytics, as it ensures that code changes are integrated, tested, and deployed rapidly and reliably. Databricks recognizes that you may have diverse CI/CD requirements shaped by your organizational preferences, existing workflows, and specific technology environment, and provides a flexible framework that supports various CI/CD options. This page describes best practices to help you de",2026-04-21T08:00:00.000Z,best-practice,best-practices,0.7,True,"Page explicitly focuses on CI/CD best practices and recommended workflows specific to Databricks. It likely includes concrete, product-specific recommendations and patterns for structuring repos, jobs, and deployment flows on Databricks, which fits the best-practices category rather than generic CI/CD concepts.",updated https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/github,GitHub Actions,GitHub Actions - Azure Databricks,Use Databricks GitHub Actions for CI/CD pipelines,Learn how to use GitHub Actions developed for Azure Databricks in your CI/CD workflows.,"Important This feature is in Public Preview. GitHub Actions trigger runs of your CI/CD flows from your GitHub repositories and allow you to automate your build, test, and deployment CI/CD pipeline. 
This page provides information about the GitHub Actions developed by Databricks and examples for common use cases. For information about other CI/CD features and best practices on Databricks, see CI/CD on Azure Databricks and Best practices and recommended CI/CD workflows on Databricks.",2026-03-16T08:00:00.000Z,how-to,deployment,0.7,True,"Describes Databricks-authored GitHub Actions with examples for CI/CD. These actions have specific inputs/outputs and constraints, representing deployment-focused integration patterns unique to Databricks.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/jenkins,Jenkins,CI/CD with Jenkins on Azure Databricks - Azure Databricks,,"Learn how to set up a CI/CD pipeline on Azure Databricks using Jenkins, an open source automation server.","Note This article covers Jenkins, which is developed by a third party. To contact the provider, see Jenkins Help. This article describes how to use the Jenkins automation server with Azure Databricks to implement a CI/CD development workflow. For an overview of CI/CD with Azure Databricks, see CI/CD on Azure Databricks. For best practices, see Best practices and recommended CI/CD workflows on Databricks.",2026-03-16T17:36:00.000Z,how-to,,0.3,False,"Summary indicates a general how-to for using Jenkins with Azure Databricks CI/CD. It reads as a workflow/tutorial article without explicit mention of product-specific limits, configuration parameter tables, or deployment matrices. Likely mostly step-by-step guidance that an LLM can approximate from generic Jenkins + Databricks patterns.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/,Overview,Databricks CLI - Azure Databricks,,"Learn about the Databricks CLI, a command-line interface utility that enables you to work with Azure Databricks.","Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. 
Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The Databricks CLI (command-line interface) allows you to interact with the Azure Databricks platform from your local terminal or automation scripts. You can also run Databricks CLI commands from within a Databricks workspace using web terminal. See Run shell commands",2026-04-14T08:00:00.000Z,overview,,0.3,False,"High-level description of the Databricks CLI and its usage. The summary does not indicate detailed configuration tables, limits, or error-code-based troubleshooting. It appears to be an overview/introduction rather than expert configuration or troubleshooting content.",updated -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/authentication,Authenticate,Authentication for the Databricks CLI - Azure Databricks,Configure authentication for the Databricks CLI,Learn how to set up authentication between the Databricks CLI and your Azure Databricks accounts and workspaces.,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. This article explains how to set up authentication between the Databricks CLI and your Azure Databricks accounts and workspaces. It assumes you already installed the Databricks CLI. See Install or update the Databricks CLI. Before you run Databricks CLI commands, yo",2026-04-14T08:00:00.000Z,concept-article,security,0.8,True,"Explains how to set up authentication between the Databricks CLI and Databricks accounts/workspaces. 
This typically includes specific auth methods, environment variables, profile settings, and token/OAuth configuration parameters unique to the Databricks CLI, which fits the security category as product-specific authentication configuration.",updated +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/,Overview,Databricks CLI - Azure Databricks,,"Learn about the Databricks CLI, a command-line interface utility that enables you to work with Azure Databricks.","Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The Databricks CLI (command-line interface) allows you to interact with the Azure Databricks platform from your local terminal or automation scripts. You can also run Databricks CLI commands from within a Databricks workspace using web terminal. See Run shell commands",2026-04-14T08:00:00.000Z,overview,,0.3,False,"High-level description of the Databricks CLI and its usage. The summary does not indicate detailed configuration tables, limits, or error-code-based troubleshooting. It appears to be an overview/introduction rather than expert configuration or troubleshooting content.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/authentication,Authenticate,Authentication for the Databricks CLI - Azure Databricks,Configure authentication for the Databricks CLI,Learn how to set up authentication between the Databricks CLI and your Azure Databricks accounts and workspaces.,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. This article explains how to set up authentication between the Databricks CLI and your Azure Databricks accounts and workspaces. 
It assumes you already installed the Databricks CLI. See Install or update the Databricks CLI. Before you run Databricks CLI commands, yo",2026-04-14T08:00:00.000Z,concept-article,security,0.8,True,"Explains how to set up authentication between the Databricks CLI and Databricks accounts/workspaces. This typically includes specific auth methods, environment variables, profile settings, and token/OAuth configuration parameters unique to the Databricks CLI, which fits the security category as product-specific authentication configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/bundle-commands,bundle,bundle command group - Azure Databricks,Use Databricks CLI bundle commands for deployments,Learn how to use the Databricks CLI to deploy and to run jobs and Lakeflow Spark Declarative Pipelines.,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The bundle command group within the Databricks CLI contains commands for managing Declarative Automation Bundles. Declarative Automation Bundles let you express projects as code and programmatically validate, deploy, and run Azure Databricks workflows such as Azure Dat",2026-03-16T08:00:00.000Z,reference,integrations,0.78,True,"CLI reference pages typically enumerate command names, flags, and parameter behaviors that are product-specific and not inferable from general knowledge. 
This page is a command-group reference for 'bundle', so it likely lists options, flags, and usage patterns for deploying and running jobs and Lakeflow pipelines, which fits integrations & coding patterns.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/commands,Overview,Databricks CLI commands - Azure Databricks,Use Azure Databricks CLI command groups and options,Get information about available command groups and commands for the Databricks CLI. Command groups contain sets of related CLI commands.,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. This article provides information about available Databricks CLI commands. This information supplements the command line help. For more information about installing and using the Databricks CLI, see Install or update the Databricks CLI and What is the Databricks CLI?.",2026-04-09T08:00:00.000Z,reference,configuration,0.7,True,"The page is a detailed reference of Databricks CLI commands for versions 0.205+ and includes product-specific command names, parameters, and usage patterns that go beyond generic CLI knowledge. This is configuration-oriented expert knowledge about how to control Azure Databricks via its CLI, which an LLM is unlikely to fully know from training.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/commands,Overview,Databricks CLI commands - Azure Databricks,Use Azure Databricks CLI command groups and options,Get information about available command groups and commands for the Databricks CLI. Command groups contain sets of related CLI commands.,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. 
Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. This article provides information about available Databricks CLI commands. This information supplements the command line help. For more information about installing and using the Databricks CLI, see Install or update the Databricks CLI and Databricks CLI. The Databric",2026-04-24T23:15:00.000Z,reference,configuration,0.74,True,"A Databricks CLI command reference typically lists product-specific commands, subcommands, and parameters that go beyond generic CLI usage. These command names, flags, and their exact behaviors are expert, product-specific configuration/invocation details that an LLM is unlikely to fully know from training. The page is not just a tutorial; it is a catalog of available commands and options, which best fits the configuration category.",updated https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/databricks-cli-from-azure-cloud-shell,Databricks CLI from Azure Cloud Shell,Use Databricks CLI from Azure Cloud Shell - Azure Databricks,Run Databricks CLI from Azure Cloud Shell,Learn how to use the Databricks CLI from Azure Cloud Shell to perform operations on Azure Databricks.,Learn how to use the Databricks CLI from Azure Cloud Shell to perform operations on Databricks.,2026-01-19T08:00:00.000Z,how-to,integrations,0.7,True,"Describes how to configure and use the Databricks CLI specifically within Azure Cloud Shell, including environment variables/commands that are integration-specific.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/install,Install or update,Install or update the Databricks CLI - Azure Databricks,Install and configure the Databricks CLI,Learn how to install the Databricks CLI. The Databricks CLI is a command-line tool that works with Azure Databricks.,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. 
Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. This article describes how to install or update the Databricks CLI. See What is the Databricks CLI?. To configure authentication for the Databricks CLI, see Authentication for the Databricks CLI.",2026-01-24T08:00:00.000Z,install-set-up-deploy,configuration,0.65,True,"Installation/update article that also points to authentication configuration. Typically includes platform-specific install commands and possibly environment variables or paths, which are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/migrate,Migrate,Databricks CLI migration - Azure Databricks,Migrate from legacy to new Databricks CLI,Learn how to migrate from the legacy version of the Databricks CLI to the new version of the Databricks CLI.,"This article describes how to migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above. Databricks CLI versions 0.205 and above are in Public Preview. For brevity, this article refers to Databricks CLI versions 0.18 and below as the “legacy” CLI, and Databricks CLI versions 0.205 and above as the “new” CLI. For more information about the legacy and new CLIs, see:",2026-04-09T08:00:00.000Z,upgrade-and-migration-article,decision-making,0.68,True,"The page is focused on migrating between specific Databricks CLI versions (0.18 and below to 0.205+), which is a product-specific migration/upgrade decision scenario. It likely includes concrete guidance on when and how to move, differences between versions, and steps/considerations for migration. This aligns best with the decision-making category’s migration/upgrade path criterion. 
It is not just a conceptual overview; it contains version-specific, product-specific instructions that an LLM would not reliably know from training.",unchanged @@ -627,13 +633,14 @@ https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/accou
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/account-workspaces-commands,account workspaces,account workspaces command group - Azure Databricks,Manage Databricks workspaces at account level via CLI,Learn how to use the Databricks CLI account workspaces commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The account workspaces command group within the Databricks CLI contains commands to manage workspaces for your account. A Databricks workspace is an environment for accessing all of your Databricks assets. The workspace organizes objects (notebooks, libraries, and expe",2026-02-19T19:35:00.000Z,reference,configuration,0.7,True,Workspaces command group; workspace lifecycle operations with product-specific parameters and constraints.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/alerts-commands,alerts,alerts command group - Azure Databricks,Use Databricks CLI alerts command group,Learn how to use the Databricks CLI alerts commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The alerts command group within the Databricks CLI contains commands to perform get, create, update, and delete operations on alerts. 
An alert is a Databricks SQL object that periodically runs a query, evaluates a condition of its result, and notifies one or more users",2026-01-19T08:00:00.000Z,reference,integrations,0.7,True,"CLI reference pages typically list command names, required/optional parameters, and flags specific to the Databricks alerts API. These are product-specific API/SDK-style parameters and options that qualify as integration patterns rather than generic tutorial content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/alerts-legacy-commands,alerts-legacy (deprecated),alerts-legacy command group (deprecated) - Azure Databricks,Use deprecated Databricks legacy alerts CLI commands,Learn how to use the Databricks CLI alerts-legacy commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. Warning These commands are deprecated. See Update to the latest Databricks SQL API version. To perform operations on new Databricks SQL alerts, see alerts command group. The alerts-legacy command group within the Databricks CLI allows you to perform get, create, update, a",2026-02-06T22:46:00.000Z,reference,configuration,0.65,True,Alerts-legacy command group; deprecated but still defines specific command behavior and object schema for older alerts.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/alerts-v2-commands,alerts-v2,alerts-v2 command group - Azure Databricks,Use Databricks CLI alerts-v2 commands for SQL alerts,Learn how to use the Databricks CLI alerts-v2 commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. 
The alerts-v2 command group within the Databricks CLI allows you to manage SQL alerts. An alert periodically runs a query, evaluates a condition of its result, and notifies one or more users and/or notification destinations if the condition was met.",2026-04-24T23:15:00.000Z,reference,integrations,0.78,True,"CLI reference pages typically list command names, flags, parameters, and usage patterns that are product-specific and not inferable from general knowledge. Even though the summary is brief, the alerts-v2 command group reference will enumerate concrete commands and options for managing SQL alerts, which fits the integrations & coding patterns category.",new https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/api-commands,api,api command group - Azure Databricks,Call Databricks REST APIs with CLI api command,Learn how to use the Databricks CLI api commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The api command group within the Databricks CLI enables you to call any available Databricks REST API. You should run the api command only for advanced scenarios, such as preview releases of specific Databricks REST APIs for which the Databricks CLI does not already wrap",2026-01-19T08:00:00.000Z,reference,integrations,0.7,True,"The api command reference will enumerate CLI parameters, request structure, and usage patterns for invoking arbitrary Databricks REST endpoints. 
These are concrete, product-specific integration details (parameters, flags, usage) that fit the integrations sub-skill.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/apps-commands,apps,apps command group - Azure Databricks,Manage Databricks Apps using CLI commands,Learn how to use the Databricks CLI to work with apps,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The apps command group within the Databricks CLI allows you to manage apps. Apps run directly on a customer's Databricks instance, integrate with their data, use and extend Databricks services, and enable users to interact through single sign-on. See Databricks Apps.",2026-02-19T19:35:00.000Z,reference,integrations,0.7,True,Apps command group; product-specific app lifecycle operations and parameters for apps running on Databricks.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/apps-commands,apps,apps command group - Azure Databricks,Manage Databricks apps with CLI apps commands,Learn how to use the Databricks CLI to work with apps,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The apps command group within the Databricks CLI allows you to manage apps. Apps run directly on a customer's Databricks instance, integrate with their data, use and extend Databricks services, and enable users to interact through single sign-on. See Databricks Apps.",2026-04-24T23:15:00.000Z,reference,integrations,0.78,True,"As a CLI reference for the apps command group, this page will contain specific commands, arguments, and behaviors for managing Databricks apps. 
These are product-specific integration patterns for programmatic control, matching the integrations category.",updated https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/artifact-allowlists-commands,artifact-allowlists,artifact-allowlists command group - Azure Databricks,Manage Unity Catalog artifact allowlists via CLI,Learn how to use the Databricks CLI artifact-allowlists commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The artifact-allowlists command group within the Databricks CLI allows you to manage artifact allowlists in Unity Catalog. In Databricks Runtime 13.3 and above, you can add libraries and init scripts to the allowlist in Unity Catalog so that users can leverage these ar",2026-01-19T08:00:00.000Z,reference,integrations,0.7,True,"This command group reference will describe specific CLI commands and parameters for managing artifact allowlists in Unity Catalog, including command syntax and options. These are product-specific configuration/integration parameters, matching the integrations category.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/auth-commands,auth,auth command group - Azure Databricks,Configure Databricks CLI authentication with auth commands,Learn how to set up the Databricks CLI to use Databricks authentication with OAuth.,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. 
The auth command group within the Databricks CLI contains authentication related commands, including the following: Tip To get information about the current Databricks CLI user, run databricks current-user me.",2026-01-19T08:00:00.000Z,reference,integrations,0.7,True,"Auth command reference pages document concrete CLI commands, flags, and authentication parameters (e.g., profiles, tokens, OAuth flows) unique to the Databricks CLI. These are detailed integration/configuration parameters rather than high-level concepts, fitting integrations.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/auth-commands,auth,auth command group - Azure Databricks,Configure Databricks CLI authentication with auth commands,Learn how to set up the Databricks CLI to use Databricks authentication with OAuth.,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The auth command group within the Databricks CLI contains authentication related commands, including the following: Tip To get information about the current Databricks CLI user, run databricks current-user me.",2026-04-24T18:05:00.000Z,reference,security,0.8,True,"The auth command group reference will document concrete authentication-related commands, flags, and possibly OAuth configuration parameters specific to Databricks CLI. Because it focuses on authentication setup and identity, it best fits the security category and contains expert, product-specific details.",updated https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/catalogs-commands,catalogs,catalogs command group - Azure Databricks,Manage Unity Catalog catalogs with Databricks CLI,Learn how to use the Databricks CLI catalogs commands,"Note This information applies to Databricks CLI versions 0.205 and above. 
The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The catalogs command group within the Databricks CLI allows you to manage catalogs in Unity Catalog. A catalog is the first layer of Unity Catalog's three-level namespace. It's used to organize your data assets. See What is Unity Catalog?.",2026-01-19T08:00:00.000Z,reference,integrations,0.78,True,"As a CLI command-group reference for 'catalogs', it will contain specific commands, arguments, and behaviors for managing Unity Catalog catalogs. These are concrete API/CLI parameters unique to Databricks, matching integrations.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/clean-room-asset-revisions-commands,clean-room-asset-revisions,clean-room-asset-revisions command group - Azure Databricks,Manage clean room asset revisions via CLI,Learn how to use the Databricks CLI clean-room-asset-revisions commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The clean-room-asset-revisions command group within the Databricks CLI contains commands to manage asset revisions in clean rooms. 
These revisions denote new versions of uploaded assets (for example, notebooks) in the clean room.",2026-02-19T19:35:00.000Z,reference,configuration,0.73,True,"Command group reference for clean-room-asset-revisions will enumerate commands and parameters unique to Databricks, qualifying as expert configuration knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/clean-room-assets-commands,clean-room-assets,clean-room-assets command group - Azure Databricks,Manage clean room assets via Databricks CLI,Learn how to use the Databricks CLI to work with clean room assets,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The clean-room-assets command group within the Databricks CLI allows you to manage clean room assets. Clean room assets are data and objects such as tables, volumes, and notebooks, that are shared with the clean room. Supported asset types include FOREIGN_TABLE, NOTEBOOK",2026-01-19T08:00:00.000Z,reference,integrations,0.78,True,"The page documents 'clean-room-assets' CLI commands, including supported asset types and command parameters. Such detailed CLI options and object types are product-specific integration details.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/clean-room-assets-commands,clean-room-assets,clean-room-assets command group - Azure Databricks,Manage Databricks clean room assets via CLI,Learn how to use the Databricks CLI to work with clean room assets,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The clean-room-assets command group within the Databricks CLI allows you to manage clean room assets. 
Clean room assets are data and objects such as tables, volumes, and notebooks, that are shared with the clean room. Supported asset types include FOREIGN_TABLE, NOTEBOOK",2026-04-24T23:15:00.000Z,reference,integrations,0.76,True,"This CLI reference for clean-room-assets will list specific commands and parameters for managing shared assets like FOREIGN_TABLE and NOTEBOOK. These are concrete API/CLI patterns unique to Databricks, aligning with integrations & coding patterns.",updated https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/clean-room-auto-approval-rules-commands,clean-room-auto-approval-rules,clean-room-auto-approval-rules command group - Azure Databricks,Configure clean room auto-approval rules via CLI,Learn how to use the Databricks CLI clean-room-auto-approval-rules commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The clean-room-auto-approval-rules command group within the Databricks CLI contains commands to manage auto-approval rules for clean rooms. Clean room auto-approval rules automatically create an approval on your behalf when an asset (for example, a notebook) meeting sp",2026-02-19T19:35:00.000Z,reference,configuration,0.73,True,"Contains specific commands and flags for auto-approval rules, representing concrete configuration options rather than general concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/clean-room-task-runs-commands,clean-room-task-runs,clean-room-task-runs command group - Azure Databricks,Control clean room task runs with Databricks CLI,Learn how to use the Databricks CLI to work with clean room task runs,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. 
Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The clean-room-task-runs command group within the Databricks CLI allows you to manage the executions of notebooks in a clean room.",2026-01-19T08:00:00.000Z,reference,integrations,0.76,True,"Reference for 'clean-room-task-runs' commands will list specific commands, flags, and usage for managing notebook executions in clean rooms, which are concrete integration/CLI patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/clean-rooms-commands,clean-rooms,clean-rooms command group - Azure Databricks,Administer clean rooms using Databricks CLI,Learn how to use the Databricks CLI to work with clean rooms,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The clean-rooms command group within the Databricks CLI allows you to manage clean rooms. A clean room uses Delta Sharing and serverless compute to provide a secure and privacy-protecting environment where multiple parties can work together on sensitive enterprise data",2026-01-19T08:00:00.000Z,reference,integrations,0.76,True,"The 'clean-rooms' command group reference will enumerate commands and parameters for creating and managing clean rooms, which are detailed, product-specific CLI integration points.",unchanged @@ -649,28 +656,33 @@ https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/consu https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/consumer-providers-commands,consumer-providers,consumer-providers command group - Azure Databricks,Use Databricks CLI consumer-providers Marketplace commands,Learn how to use the Databricks CLI consumer-providers commands,"Note This information applies to Databricks CLI versions 0.205 and above. 
The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The consumer-providers command group within the Databricks CLI allows you to interact with providers in the Databricks Marketplace. Providers are the entities that publish listings to the Marketplace. See Become a Databricks Marketplace provider.",2026-04-06T08:00:00.000Z,reference,integrations,0.7,True,"CLI reference pages typically list product-specific commands, flags, and parameters (including required/optional options and their meanings) for interacting with Databricks Marketplace providers. These command/parameter details and exact names are expert, product-specific knowledge that an LLM is unlikely to infer from general training. This aligns best with the integrations sub-skill, which covers SDK/API/CLI parameter references and configuration details for connecting to services.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/credentials-commands,credentials,credentials command group - Azure Databricks,Manage Unity Catalog credentials with Databricks CLI,Learn how to use the Databricks CLI credentials commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The credentials command group within the Databricks CLI allows you to manage credentials for accessing services on your cloud tenant. Each credential is subject to Unity Catalog access-control policies that control which users and groups can access the credential. A cr",2026-01-19T08:00:00.000Z,reference,security,0.78,True,"This page covers CLI commands for managing credentials subject to Unity Catalog access-control policies. 
It likely includes specific fields, scopes, and commands tied to security and IAM.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/current-user-commands,current-user,current-user command group - Azure Databricks,Retrieve current user information via Databricks CLI,Learn how to use the Databricks CLI current-user commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The current-user command group within the Databricks CLI allows you to retrieve information about the currently authenticated user or service principal.",2026-01-19T08:00:00.000Z,reference,integrations,0.7,True,"The 'current-user' command group will specify exact commands and output structure for querying the authenticated user or service principal, which are product-specific CLI details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/data-quality-commands,data-quality,data-quality command group - Azure Databricks,Manage Unity Catalog data quality via Databricks CLI,Learn how to use the Databricks CLI data-quality commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. 
The data-quality command group within the Databricks CLI contains commands to manage the data quality of Unity Catalog objects.",2026-02-19T19:35:00.000Z,reference,configuration,0.7,True,"Data-quality command group will list specific operations and parameters for data quality management, which are expert configuration patterns.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/data-classification-commands,data-classification,data-classification command group - Azure Databricks,Control Unity Catalog data classification with CLI,Learn how to use the Databricks CLI data-classification commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The data-classification command group within the Databricks CLI allows you to manage data classification for Unity Catalog catalogs. Data classification automatically identifies and tags sensitive data (PII) in Unity Catalog tables. Each catalog can have at most one co",2026-04-24T23:15:00.000Z,reference,integrations,0.74,True,"The data-classification command group reference will provide exact CLI commands and options to manage data classification for Unity Catalog catalogs, including constraints like one coordinator per catalog. These are product-specific command patterns, fitting integrations.",new
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/data-quality-commands,data-quality,data-quality command group - Azure Databricks,Use Databricks CLI data-quality commands for Unity Catalog,Learn how to use the Databricks CLI data-quality commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. 
The data-quality command group within the Databricks CLI contains commands to manage the data quality of Unity Catalog objects. See Data quality monitoring.",2026-04-24T08:00:00.000Z,reference,integrations,0.74,True,"This page documents the data-quality CLI commands for managing data quality monitoring on Unity Catalog objects. It will enumerate concrete commands and parameters, which are product-specific integration details rather than conceptual guidance.",updated https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/data-sources-commands,data-sources (deprecated),data-sources command group (deprecated) - Azure Databricks,Query Databricks SQL data sources using CLI,Learn how to use the Databricks CLI data-sources commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. Warning These commands are deprecated. See Update to the latest Databricks SQL API version. The data-sources command group within the Databricks CLI allows you to get information about SQL warehouses for creating query objects.",2026-02-06T22:46:00.000Z,reference,configuration,0.69,True,"Data-sources command group exposes specific commands and options for SQL warehouse information, which are configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/database-commands,database,database command group - Azure Databricks,Administer Databricks database instances with CLI,Learn how to use the Databricks CLI database commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The database command group within the Databricks CLI contains commands to manage database instances. 
Database instances manage storage and compute resources and provide the endpoints that users connect to. See What is a database instance?.",2026-02-03T08:00:00.000Z,reference,configuration,0.7,True,Database instance management commands and their arguments are detailed configuration interfaces unique to Databricks.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/entity-tag-assignments-commands,entity-tag-assignments,entity-tag-assignments command group - Azure Databricks,Assign and manage Unity Catalog entity tags via CLI,Learn how to use the Databricks CLI entity-tag-assignments commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The entity-tag-assignments command group within the Databricks CLI contains commands to create, update, delete, and list tag assignments across Unity Catalog entities. Tags are attributes that include keys and optional values that you can use to organize and categorize",2026-02-19T19:35:00.000Z,reference,configuration,0.7,True,"Entity-tag-assignments commands define specific operations and parameters for tagging, which are product-specific configuration details.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/experiments-commands,experiments,experiments command group - Azure Databricks,Manage MLflow experiments using Databricks CLI,Learn how to use the Databricks CLI experiments commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The experiments command group within the Databricks CLI allows you to create, edit, delete, and manage experiments in MLflow. 
See Organize training runs with MLflow experiments.",2026-01-19T08:00:00.000Z,reference,integrations,0.78,True,"The 'experiments' command group reference will include specific commands and flags for creating and managing MLflow experiments, which are detailed CLI integration patterns.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/environments-commands,environments,environments command group - Azure Databricks,Manage Databricks environments with CLI environments commands,Learn how to use the Databricks CLI environments commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The environments command group within the Databricks CLI allows you to manage environment resources. The Environments API provides management capabilities for different types of environments including workspace-level base environments that define the environment versio",2026-04-24T23:15:00.000Z,reference,integrations,0.76,True,"The environments command group reference will list specific commands and options for managing environment resources and versions. These CLI details are expert, product-specific integration patterns between tooling and the Databricks Environments API.",new
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/experiments-commands,experiments,experiments command group - Azure Databricks,Manage MLflow experiments using Databricks CLI experiments commands,Learn how to use the Databricks CLI experiments commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. 
The experiments command group within the Databricks CLI allows you to create, edit, delete, and manage experiments in MLflow. See Organize training runs with MLflow experiments.",2026-04-24T23:15:00.000Z,reference,integrations,0.76,True,This experiments command group reference will contain concrete CLI commands and parameters for creating and managing MLflow experiments in Databricks. These are detailed integration patterns between the CLI and MLflow/Databricks services.,updated https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/external-lineage-commands,external-lineage,external-lineage command group - Azure Databricks,Define external lineage relationships via Databricks CLI,Learn how to use the Databricks CLI external-lineage commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The external-lineage command group within the Databricks CLI contains commands to define and manage lineage relationships between Databricks objects and external systems.",2026-02-19T19:35:00.000Z,reference,configuration,0.7,True,"External-lineage commands expose specific parameters for lineage relationships, which are expert configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/external-locations-commands,external-locations,external-locations command group - Azure Databricks,Configure Unity Catalog external locations with CLI,Learn how to use the Databricks CLI external-locations commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions.
The external-locations command group within the Databricks CLI contains commands to create and manage external locations for Unity Catalog. See What are Unity Catalog volumes?.",2026-03-03T08:00:00.000Z,reference,configuration,0.71,True,"External-locations command group lists commands and options for location configuration, which are detailed product-specific settings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/external-metadata-commands,external-metadata,external-metadata command group - Azure Databricks,Register external metadata in Unity Catalog via CLI,Learn how to use the Databricks CLI external-metadata commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The external-metadata command group within the Databricks CLI contains commands to register and manage metadata about external systems within Unity Catalog.",2026-02-19T19:35:00.000Z,reference,configuration,0.7,True,External-metadata commands and their parameters are specific configuration interfaces for metadata registration.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/feature-engineering-commands,feature-engineering,feature-engineering command group - Azure Databricks,Manage Databricks feature store via CLI,Learn how to use the Databricks CLI feature-engineering commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions.
The feature-engineering command group within the Databricks CLI allows you to manage features in your Databricks feature store.",2026-01-19T08:00:00.000Z,reference,integrations,0.76,True,"The 'feature-engineering' command group manages features in the feature store, with concrete commands and parameters that represent product-specific integration behavior.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/fs-commands,fs,fs command group - Azure Databricks,Use Databricks CLI fs commands for DBFS and volumes,Learn how to use the Databricks CLI fs commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The fs command group within the Databricks CLI allows you to perform file system operations on volumes in Unity Catalog and the Databricks File System (DBFS). fs commands require volume paths to begin with dbfs:/Volumes and require directory and file paths in DBFS to begin w",2026-01-19T08:00:00.000Z,reference,integrations,0.8,True,"The 'fs' command group reference will detail file system operations, required path prefixes (dbfs:/Volumes, dbfs:/), and command syntax, which are specific integration/configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/functions-commands,functions,functions command group - Azure Databricks,Manage Unity Catalog user-defined functions via CLI,Learn how to use the Databricks CLI functions commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The functions command group within the Databricks CLI allows you to manage user-defined functions (UDFs) in Unity Catalog.
The function implementation can be any SQL expression or query, and it can be invoked wherever a table reference is allowed in a query. In Unity Cat",2026-01-19T08:00:00.000Z,reference,integrations,0.78,True,"The 'functions' command group reference will list commands and parameters for managing UDFs, including how implementations are specified, which are product-specific CLI integration details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/genie-commands,genie,genie command group - Azure Databricks,Use Databricks CLI genie commands for Genie spaces,Learn how to use the Databricks CLI genie commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The genie command group within the Databricks CLI contains commands for Genie. See What is a Genie space.",2026-01-19T08:00:00.000Z,reference,integrations,0.7,True,"The 'genie' command group contains commands for Genie; this implies specific CLI operations and parameters unique to Genie spaces, fitting integrations.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/genie-commands,genie,genie command group - Azure Databricks,Use Databricks CLI genie commands for Genie spaces,Learn how to use the Databricks CLI genie commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The genie command group within the Databricks CLI contains commands for Genie. See What is a Genie space.",2026-04-24T23:15:00.000Z,reference,integrations,0.78,True,"CLI reference pages list product-specific commands, flags, and parameters for the genie command group.
These are concrete API/CLI integration details (names, options, behaviors) that qualify as expert knowledge beyond generic concepts.",updated https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/git-credentials-commands,git-credentials,git-credentials command group - Azure Databricks,Configure Git credentials for Databricks via CLI,Learn how to use the Databricks CLI git credentials commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The git-credentials command group within the Databricks CLI allows you to register personal access tokens for Databricks to do Git operations on behalf of the user. See Get access tokens from Git provider.",2026-01-20T08:00:00.000Z,reference,integrations,0.8,True,"The 'git-credentials' command group reference will specify commands and parameters for registering PATs for Git operations, which are concrete integration and configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/global-init-scripts-commands,global-init-scripts,global-init-scripts command group - Azure Databricks,Configure global init scripts with Databricks CLI,Learn how to use the Databricks CLI global init scripts commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The global-init-scripts command group within the Databricks CLI enables workspace administrators to configure global initialization scripts for their workspace. These scripts run on every node in every cluster in the workspace.
See Global init scripts.",2026-01-19T08:00:00.000Z,reference,configuration,0.78,True,"This page documents CLI commands for global init scripts, including how to define and manage scripts that run on every node. These are specific configuration commands and parameters.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/grants-commands,grants,grants command group - Azure Databricks,Manage Unity Catalog grants using Databricks CLI,Learn how to use the Databricks CLI grants commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The grants command group within the Databricks CLI allows you to manage grants. In Unity Catalog, data is secure by default. Initially, users have no access to data in a metastore. Access can be granted by either a metastore admin, the owner of an object, or the owner ",2026-03-06T08:00:00.000Z,reference,security,0.76,True,"Grants are core to access control; the CLI reference will list specific grant/revoke commands and permission scopes, fitting the security sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/groups-commands,groups,groups command group - Azure Databricks,Manage Databricks workspace groups via CLI,Learn how to use the Databricks CLI groups commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The groups command group within the Databricks CLI allows you to manage groups in the Databricks workspace. Groups simplify identity management, making it easier to assign access to Databricks workspace, data, and other securable objects.
See Groups.",2026-01-19T08:00:00.000Z,reference,security,0.78,True,"The 'groups' command group reference will include commands and parameters for managing groups used in access control, which is security/identity management specific to Databricks.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/groups-v2-commands,groups-v2,groups-v2 command group - Azure Databricks,Manage Databricks groups with CLI groups-v2 commands,Learn how to use the Databricks CLI groups-v2 commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The groups-v2 command group within the Databricks CLI allows you to manage groups in the Databricks workspace. Groups simplify identity management, making it easier to assign access to Databricks workspace, data, and other securable objects. It is best practice to assi",2026-04-24T23:15:00.000Z,reference,integrations,0.8,True,"Describes Databricks CLI groups-v2 commands with specific operations and parameters for managing workspace groups. This is product-specific CLI/API surface, fitting integrations & coding patterns.",new https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/instance-pools-commands,instance-pools,instance-pools command group - Azure Databricks,Manage Databricks instance pools using CLI,Learn how to use the Databricks CLI instance pools commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The instance-pools command group within the Databricks CLI enables you to create, edit, delete and list instance pools by using ready-to-use cloud instances which reduces cluster start and auto-scaling times.
See Connect to pools.",2026-01-19T08:00:00.000Z,reference,integrations,0.76,True,"The 'instance-pools' command group will list commands and flags for creating and managing pools, including parameters for cloud instances, which are product-specific integration/configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/instance-profiles-commands,instance-profiles,instance-profiles command group - Azure Databricks,Manage Databricks instance profiles via CLI,Learn how to use the Databricks CLI instance profiles,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The instance-profiles command group within the Databricks CLI allows admins to add, list, and remove instance profiles that users can launch clusters with. Regular users can list the instance profiles available to them.",2026-01-19T08:00:00.000Z,reference,security,0.74,True,"Instance profiles are tied to how clusters assume cloud identities; the CLI reference will include commands and parameters for adding/removing profiles, which are IAM/security-related configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/ip-access-lists-commands,ip-access-lists,ip-access-lists command group - Azure Databricks,Configure IP access lists with Databricks CLI,Learn how to use the Databricks CLI ip-access-lists commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The ip-access-lists command group within the Databricks CLI contains commands that enable admins to configure IP access lists.
See Manage IP access lists",2026-01-19T08:00:00.000Z,reference,security,0.8,True,"The 'ip-access-lists' command group reference will specify commands, fields, and allowed values for configuring IP access lists, which are concrete security configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/jobs-commands,jobs,jobs command group - Azure Databricks,Create and manage Databricks jobs via CLI,Learn how to use the Databricks CLI jobs commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The jobs command group within the Databricks CLI allows you to create, edit, and delete jobs. See Lakeflow Jobs.",2026-01-19T08:00:00.000Z,reference,integrations,0.78,True,"The 'jobs' command group reference will enumerate commands and parameters for job lifecycle management, including job definitions and runs, which are detailed CLI integration patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/knowledge-assistants-commands,knowledge-assistants,knowledge-assistants command group - Azure Databricks,Manage Knowledge Assistants using Databricks CLI commands,Learn how to use the Databricks CLI knowledge-assistants commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions.
The knowledge-assistants command group within the Databricks CLI allows you to manage Knowledge Assistants and related resources.",2026-04-24T23:15:00.000Z,reference,integrations,0.8,True,"Provides detailed CLI commands and options for the knowledge-assistants command group, exposing product-specific parameters and behaviors that are expert integration details.",new https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/labs-commands,labs,labs command group - Azure Databricks,Use Databricks CLI labs commands for experimental apps,Learn how to use the Databricks CLI to work with Databricks Labs applications.,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The labs command group within the Databricks CLI contains commands to manage available experimental Databricks Labs applications. For more information about available Databricks Labs applications, see the following:",2026-01-19T08:00:00.000Z,reference,integrations,0.78,True,"CLI reference pages enumerate product-specific command names, flags, and behaviors for the labs command group, which are not inferable from general knowledge and qualify as expert integration/config details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/lakeview-commands,lakeview,lakeview command group - Azure Databricks,Manage Lakeview dashboards with Databricks CLI,Learn how to use the Databricks CLI lakeview commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The lakeview command group within the Databricks CLI contains commands for specific management operations for Lakeview dashboards.
See Dashboards.",2026-01-19T08:00:00.000Z,reference,integrations,0.78,True,"Provides detailed CLI command syntax and parameters for the lakeview command group, which are product-specific API/SDK-style details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/lakeview-embedded-commands,lakeview-embedded,lakeview-embedded command group - Azure Databricks,Embed Lakeview dashboards via Databricks CLI lakeview-embedded,Learn how to use the Databricks CLI lakeview-embedded commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The lakeview-embedded command group within the Databricks CLI provides token-based Lakeview APIs for embedding dashboards in external applications.",2026-04-24T23:15:00.000Z,reference,integrations,0.8,True,"Covers token-based Lakeview embedded APIs through CLI commands, including specific command names and parameters for embedding dashboards, which are concrete integration patterns.",new https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/libraries-commands,libraries,libraries command group - Azure Databricks,Manage Databricks libraries via CLI commands,Learn how to use the Databricks CLI libraries commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The libraries command group within the Databricks CLI allows you to install and uninstall libraries and get the status of libraries on a cluster.
See Install libraries.",2026-01-19T08:00:00.000Z,reference,integrations,0.82,True,"Documents exact CLI commands and options for installing, uninstalling, and checking status of libraries on clusters; these are concrete, product-specific command interfaces.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/metastores-commands,metastores,metastores command group - Azure Databricks,Administer Unity Catalog metastores using Databricks CLI,Learn how to use the Databricks CLI metastores commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The metastores command group within the Databricks CLI allows you to manage metastores. A metastore is the top-level container of objects in Unity Catalog. It stores data assets (tables and views) and the permissions that govern access to them. See Metastore.",2026-03-17T08:00:00.000Z,reference,integrations,0.8,True,"Contains specific CLI operations and parameters for managing metastores, which are detailed integration/command patterns unique to Databricks.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/model-registry-commands,model-registry,model-registry command group - Azure Databricks,Use CLI model-registry commands for workspace models,Learn how to use the Databricks CLI model registry commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. Note This API reference documents APIs for the Workspace Model Registry. Databricks recommends using Models in Unity Catalog instead. Models in Unity Catalog provides centralized model governance, cross-workspace access, lineage, and deployment.
Workspace Model Regis",2026-01-19T08:00:00.000Z,reference,integrations,0.8,True,"CLI reference for model-registry includes concrete command names and flags that represent expert, product-specific integration details.",unchanged @@ -679,10 +691,11 @@ https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/notif https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/online-tables-commands,online-tables,online-tables command group - Azure Databricks,Create and manage online tables using CLI,Learn how to use the Databricks CLI online-tables commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The online-tables command group within the Databricks CLI allows you to create online tables, which provide lower latency and higher QPS access to data from Delta tables.",2026-01-19T08:00:00.000Z,reference,integrations,0.8,True,"Provides exact online-tables CLI commands and arguments, representing concrete product-specific integration patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/permissions-commands,permissions,permissions command group - Azure Databricks,Manage Databricks object permissions with CLI commands,Learn how to use the Databricks CLI permissions commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions.
The permissions command group within the Databricks CLI allows you to manage access control for various users on different objects, including the following: For the mapping of the required permissions for specific actions or abilities and other important information, s",2026-01-19T08:00:00.000Z,reference,security,0.83,True,"permissions CLI group is about access control; the reference will list specific permission targets, operations, and possibly role mappings, which are product-specific security configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/pipelines-commands,pipelines,pipelines command group - Azure Databricks,Control Databricks pipelines with CLI commands,Learn how to use the Databricks CLI pipelines commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The pipelines command group within the Databricks CLI contains two sets of functionality. The first set allows you to manage a pipeline project and its workflow. The second set allows you to create, edit, delete, start, and view details about pipeline objects in Databr",2026-01-30T19:03:00.000Z,reference,integrations,0.7,True,pipelines command group reference defines CLI operations and parameters for pipeline projects and objects.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/policies-commands,policies,policies command group - Azure Databricks,Manage Unity Catalog ABAC policies with Databricks CLI,Learn how to use the Databricks CLI policies commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions.
The policies command group within the Databricks CLI allows you to manage Attribute-Based Access Control (ABAC) policies in Unity Catalog. ABAC provides high leverage governance for enforcing compliance policies. With ABAC policies, access is controlled in a hierarchic",2026-04-24T23:15:00.000Z,reference,integrations,0.78,True,"Documents the policies command group with specific CLI operations and arguments for ABAC policies. These are product-specific command and parameter details, matching integrations criteria.",new https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/policy-compliance-for-clusters-commands,policy-compliance-for-clusters,policy-compliance-for-clusters command group - Azure Databricks,View cluster policy compliance using Databricks CLI,Learn how to use the Databricks CLI policy-compliance-for-clusters commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The policy-compliance-for-clusters command group within the Databricks CLI contains commands to view and manage the policy compliance status of clusters in your workspace.",2026-02-19T19:35:00.000Z,reference,configuration,0.65,True,policy-compliance-for-clusters commands expose workspace-specific configuration/compliance state via CLI with defined parameters.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/policy-compliance-for-jobs-commands,policy-compliance-for-jobs,policy-compliance-for-jobs command group - Azure Databricks,Check job policy compliance with Databricks CLI,Learn how to use the Databricks CLI policy-compliance-for-jobs commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview.
Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The policy-compliance-for-jobs command group within the Databricks CLI contains commands to view and manage the policy compliance status of jobs in your workspace.",2026-02-19T19:35:00.000Z,reference,configuration,0.65,True,policy-compliance-for-jobs CLI reference provides concrete commands and options to inspect/manage job compliance configuration.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/policy-families-commands,policy-families,policy-families command group - Azure Databricks,View Databricks cluster policy families via CLI,Learn how to use the Databricks CLI policy families commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The policy-families command group within the Databricks CLI allows you to view available policy families. A policy family contains a policy definition providing best practices for configuring clusters for a particular use case. Databricks manages and provides policy fa",2026-01-19T08:00:00.000Z,reference,best-practices,0.68,True,policy-families CLI commands expose Databricks-managed best-practice cluster configurations; the reference includes concrete identifiers and usage patterns tied to those best practices.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/postgres-commands,postgres,postgres command group - Azure Databricks,Manage Lakebase Postgres resources with Databricks CLI,Learn how to use the Databricks CLI postgres commands,"Note This information applies to Databricks CLI versions 0.285 and above. The Databricks CLI is in Public Preview.
Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The postgres command group within the Databricks CLI contains commands to manage Lakebase Postgres resources including projects, branches, and endpoints. The Postgres API provides access to a Postgres database via REST API or direct SQL.",2026-02-04T19:43:00.000Z,reference,integrations,0.75,True,"postgres command group reference includes commands and parameters for projects, branches, and endpoints, a detailed integration surface.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/postgres-commands,postgres,postgres command group - Azure Databricks,Manage Lakebase Postgres resources using Databricks CLI,Learn how to use the Databricks CLI postgres commands,"Note This information applies to Databricks CLI versions 0.285 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The postgres command group within the Databricks CLI contains commands to manage Lakebase Postgres resources including projects, branches, and endpoints. The Postgres API provides access to a Postgres database via REST API or direct SQL.",2026-04-24T23:15:00.000Z,reference,integrations,0.82,True,"The postgres command group reference exposes concrete commands, flags, and resource management patterns for Lakebase Postgres via CLI, which are detailed integration/API usage specifics.",updated https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/provider-exchange-filters-commands,provider-exchange-filters,provider-exchange-filters command group - Azure Databricks,Manage marketplace exchange filters with Databricks CLI,Learn how to use the Databricks CLI provider-exchange-filters commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview.
Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The provider-exchange-filters command group within the Databricks CLI contains commands to manage marketplace exchange filters. Marketplace exchange filters curate which groups can access an exchange. See Create and manage private exchanges in Databricks Marketplace.",2026-01-19T08:00:00.000Z,reference,integrations,0.79,True,"provider-exchange-filters CLI reference defines specific commands and parameters for marketplace filters, which are detailed integration endpoints.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/provider-exchanges-commands,provider-exchanges,provider-exchanges command group - Azure Databricks,Administer Databricks marketplace exchanges via CLI,Learn how to use the Databricks CLI provider-exchanges commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The provider-exchanges command group within the Databricks CLI allows you to manage marketplace exchanges. Marketplace exchanges allow providers to share their listings with a curated set of customers. See Create and manage private exchanges in Databricks Marketplace.",2026-01-19T08:00:00.000Z,reference,integrations,0.79,True,provider-exchanges CLI commands and their options are product-specific integration details for marketplace management.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/provider-files-commands,provider-files,provider-files command group - Azure Databricks,Manage Databricks marketplace files using CLI,Learn how to use the Databricks CLI provider-files commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. 
Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The provider-files command group within the Databricks CLI contains commands to manage files in the marketplace. Marketplace offers a set of file APIs for various purposes such as preview notebooks and provider icons. See What is Databricks Marketplace?.",2026-01-19T08:00:00.000Z,reference,integrations,0.79,True,"provider-files CLI group exposes file APIs (e.g., for preview notebooks, icons) with concrete command syntax and parameters.",unchanged @@ -691,8 +704,9 @@ https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/provi https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/provider-provider-analytics-dashboards-commands,provider-provider-analytics-dashboards,provider-provider-analytics-dashboards command group - Azure Databricks,Manage provider analytics dashboards via Databricks CLI,Learn how to use the Databricks CLI provider-provider-analytics-dashboards commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The provider-provider-analytics-dashboards command group within the Databricks CLI contains commands to manage templated analytics solutions for providers in Databricks Marketplace. 
See What is Databricks Marketplace?.",2026-01-19T08:00:00.000Z,reference,integrations,0.76,True,"provider-provider-analytics-dashboards CLI group defines commands for templated analytics solutions, with concrete syntax and options.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/provider-providers-commands,provider-providers,provider-providers command group - Azure Databricks,Administer Databricks marketplace providers using CLI,Learn how to use the Databricks CLI provider-providers commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The provider-providers command group within the Databricks CLI contains commands to manage providers in Databricks Marketplace. Providers are entities that manage assets in Marketplace. See What is Databricks Marketplace?.",2026-01-19T08:00:00.000Z,reference,integrations,0.76,True,"provider-providers CLI reference contains specific commands and parameters for provider entities, which are expert integration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/providers-commands,providers,providers command group - Azure Databricks,Manage Delta Sharing providers with Databricks CLI,Learn how to use the Databricks CLI providers commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The providers command group within the Databricks CLI contains commands to manage Delta Sharing providers. A data provider represents the organization in the real world that shares the data. 
See What is Delta Sharing?.",2026-01-19T08:00:00.000Z,reference,integrations,0.8,True,"providers CLI group for Delta Sharing exposes concrete command names, flags, and behaviors that are product-specific integration patterns.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/psql-command,psql,psql command - Azure Databricks,Connect to Databricks Postgres instances with psql CLI,Learn how to use the Databricks CLI psql command to connect to database instances,"Note This information applies to Databricks CLI versions 0.285 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The psql command within the Databricks CLI allows you to connect to a specified database instance using a PostgreSQL client.",2026-02-06T20:21:00.000Z,reference,integrations,0.8,True,"psql command reference includes connection parameters, flags, and usage patterns specific to Databricks-managed Postgres.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/quality-monitors-commands,quality-monitors,quality-monitors command group - Azure Databricks,Configure Databricks quality monitors via CLI,Learn how to use the Databricks CLI quality-monitors commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The quality-monitors command group within the Databricks CLI contains commands to create, edit, and delete quality monitors. A monitor computes and monitors data or model quality metrics for a table over time. 
It generates metrics tables and a dashboard that you can us",2026-03-12T08:00:00.000Z,reference,integrations,0.8,True,"quality-monitors CLI commands and options for creating/editing/deleting monitors are detailed, product-specific interfaces.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/psql-command,psql,psql command - Azure Databricks,Connect to Lakebase Postgres with Databricks CLI psql,Learn how to use the Databricks CLI psql command to connect to database instances,"Note This information applies to Databricks CLI versions 0.285 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The psql command within the Databricks CLI allows you to connect to a Lakebase Postgres database using a PostgreSQL client. It supports both Lakebase Provisioned instances and Lakebase Autoscaling projects.",2026-04-24T23:15:00.000Z,reference,integrations,0.85,True,"Describes the psql CLI command, including how it connects to Lakebase Postgres instances and supported modes. This is a product-specific integration command with concrete parameters and behavior.",updated +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/quality-monitor-v2-commands,quality-monitor-v2,quality-monitor-v2 command group - Azure Databricks,Use Databricks CLI quality-monitor-v2 commands for data quality,Learn how to use the Databricks CLI quality-monitor-v2 commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. Warning These commands are deprecated. Use the data-quality commands instead. See data-quality command group. 
The quality-monitor-v2 command group within the Databricks CLI allows you to manage data quality monitors on Unity Catalog objects.",2026-04-24T23:15:00.000Z,reference,integrations,0.78,True,"Even though deprecated, the quality-monitor-v2 command group reference contains specific CLI commands and options for managing data quality monitors, which are detailed integration patterns.",new +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/quality-monitors-commands,quality-monitors,quality-monitors command group - Azure Databricks,Use Databricks CLI quality-monitors command group,Learn how to use the Databricks CLI quality-monitors commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. Warning These commands are deprecated. Use the data-quality commands instead. See data-quality command group. The quality-monitors command group within the Databricks CLI contains commands to create, edit, and delete quality monitors. A monitor computes and monitors data o",2026-04-24T08:00:00.000Z,reference,integrations,0.78,True,"CLI reference pages enumerate product-specific commands, flags, and parameters for the quality-monitors command group, which are not generally known from training and map directly to SDK/API usage patterns.",updated https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/queries-commands,queries,queries command group - Azure Databricks,Create and manage Databricks SQL queries with CLI,Learn how to use the Databricks CLI queries commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. 
The queries command group within the Databricks CLI allows you to perform get, create, update, and delete operations on queries. A query is a Databricks SQL object that includes the target SQL warehouse, query text, name, description, tags, and parameters. See Access an",2026-01-19T08:00:00.000Z,reference,integrations,0.82,True,"queries CLI reference documents exact commands and parameters for CRUD operations on queries, which are specific integration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/queries-legacy-commands,queries-legacy (deprecated),queries-legacy command group (deprecated) - Azure Databricks,Use deprecated queries-legacy commands in CLI,Learn how to use the Databricks CLI queries-legacy commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. Warning These commands are deprecated. See Update to the latest Databricks SQL API version. To perform operations on new Databricks SQL queries, see queries command group. The queries-legacy command group within the Databricks CLI allows you to perform get, create, update",2026-02-06T22:46:00.000Z,reference,integrations,0.7,True,queries-legacy command group still documents specific CLI syntax and parameters for older SQL query APIs.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/query-history-commands,query-history,query-history command group - Azure Databricks,Access Databricks SQL query history using CLI,Learn how to use the Databricks CLI query-history commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. 
The query-history command group within the Databricks CLI contains commands for storing and retrieving the list of queries run against SQL endpoints and serverless compute.",2026-01-19T08:00:00.000Z,reference,integrations,0.78,True,"query-history CLI commands for storing and retrieving query history are concrete, product-specific command interfaces.",unchanged @@ -705,86 +719,91 @@ https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/resou https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/rfa-commands,rfa,rfa command group - Azure Databricks,Handle Unity Catalog access requests with rfa CLI,Learn how to use the Databricks CLI rfa commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The rfa (Request for Access) command group within the Databricks CLI contains commands to enable users to request access for Unity Catalog securables. These commands provide a standardized way for securable owners (or users with MANAGE privileges) to manage access requ",2026-02-19T19:35:00.000Z,reference,security,0.7,True,"rfa command group manages access requests for securables, involving permissions and MANAGE privileges, a security-focused workflow.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/schemas-commands,schemas,schemas command group - Azure Databricks,Manage Unity Catalog schemas with Databricks CLI,Learn how to use the Databricks CLI schemas commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The schemas command group within the Databricks CLI contains commands to manage schemas in Unity Catalog. 
A schema is the second layer of Unity Catalog's three-level namespace. A schema organizes tables, views, and functions. See What are schemas in Azure Databricks?.",2026-01-19T08:00:00.000Z,reference,integrations,0.79,True,"schemas CLI reference provides exact commands and parameters for schema lifecycle operations, which are product-specific integration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/secrets-commands,secrets,secrets command group - Azure Databricks,Manage Databricks secrets and scopes via CLI,Learn how to use the Databricks CLI secrets commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The secrets command group within the Databricks CLI allows you to manage secrets, secret scopes, and access permissions. Sometimes accessing data requires that you authenticate to external data sources through JDBC. Instead of directly entering your credentials into a ",2026-01-19T08:00:00.000Z,reference,security,0.84,True,"secrets CLI group covers secrets, scopes, and access permissions; the reference will list specific commands and permission-related parameters, which are product-specific security configuration details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/service-principal-secrets-proxy-commands,service-principal-secrets-proxy,service-principal-secrets-proxy command group - Azure Databricks,Manage service principal secrets via Databricks CLI,Learn how to use the Databricks CLI service-principal-secrets-proxy commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. 
The service-principal-secrets-proxy command group within the Databricks CLI allows you to manage service principal secrets at the workspace level. To use these commands, the service principal must first be added to the current workspace. You can use the generated secre",2026-04-24T23:15:00.000Z,reference,integrations,0.78,True,"Describes Databricks CLI service-principal-secrets-proxy commands with concrete command names and options, representing product-specific integration patterns for managing secrets.",new https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/service-principals-commands,service-principals,service-principals command group - Azure Databricks,Administer Databricks service principals using CLI,Learn how to use the Databricks CLI service principals commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The service-principals command group within the Databricks CLI allows you to manage service principals in your Databricks workspace. See Service principals for CI/CD.",2026-01-19T08:00:00.000Z,reference,security,0.78,True,"service-principals CLI commands manage identities used for CI/CD; these are concrete, product-specific IAM configuration interfaces.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/serving-endpoints-commands,serving-endpoints,serving-endpoints command group - Azure Databricks,Manage Databricks model serving endpoints with CLI,Learn how to use the Databricks CLI serving endpoints commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. 
The serving-endpoints command group within the Databricks CLI allows you to create, update, and delete model serving endpoints. See Manage model serving endpoints.",2026-01-19T08:00:00.000Z,reference,integrations,0.8,True,"serving-endpoints CLI reference defines exact commands and parameters for creating, updating, and deleting serving endpoints, which are detailed integration patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/service-principals-v2-commands,service-principals-v2,service-principals-v2 command group - Azure Databricks,Manage Databricks service principals with CLI v2,Learn how to use the Databricks CLI service-principals-v2 commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The service-principals-v2 command group within the Databricks CLI allows you to manage service principal identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms. Databricks recommends creating service principals to run prod",2026-04-24T23:15:00.000Z,reference,integrations,0.78,True,"Provides detailed CLI command group reference (service-principals-v2) with specific commands/parameters for identity automation, which are product-specific API/SDK-style integration details.",new +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/serving-endpoints-commands,serving-endpoints,serving-endpoints command group - Azure Databricks,Control Databricks serving endpoints using CLI,Learn how to use the Databricks CLI serving endpoints commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. 
The serving-endpoints command group within the Databricks CLI allows you to create, update, and delete model serving endpoints. See Manage model serving endpoints.",2026-04-24T23:15:00.000Z,reference,integrations,0.78,True,"Covers the serving-endpoints CLI command group with concrete command syntax and options for managing model serving endpoints, which are product-specific integration commands.",updated https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/settings-commands,settings,settings command group - Azure Databricks,Configure Databricks workspace settings via CLI,Learn how to use the Databricks CLI settings commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The settings command group within the Databricks CLI contains commands to manage workspace-level settings, which control various features and policies that apply across the entire workspace. See Manage your workspace.",2026-01-19T08:00:00.000Z,reference,configuration,0.8,True,"settings CLI group manages workspace-level settings; the reference will list specific setting names, allowed values, and commands, which are configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/shares-commands,shares,shares command group - Azure Databricks,Manage Unity Catalog shares using Databricks CLI,Learn how to use the Databricks CLI to work with shares,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The shares command group within the Databricks CLI allows you to manage shares in Unity Catalog. A share is a container instantiated with shares create. 
Once created, you can iteratively register a collection of existing data assets defined within the metastore using shar",2026-01-19T08:00:00.000Z,reference,integrations,0.8,True,"shares CLI reference documents concrete commands and parameters for share lifecycle and asset registration, which are product-specific integration details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/shares-commands,shares,shares command group - Azure Databricks,Manage Unity Catalog shares with Databricks CLI,Learn how to use the Databricks CLI to work with shares,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The shares command group within the Databricks CLI allows you to manage shares in Unity Catalog. A share is a container instantiated with shares create. Once created, you can iteratively register a collection of existing data assets defined within the metastore using shar",2026-04-24T08:00:00.000Z,reference,integrations,0.78,True,"Documents the shares CLI command group with specific commands and parameters for Unity Catalog shares, representing detailed integration/configuration patterns unique to Databricks.",updated https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/ssh-commands,ssh,ssh command group - Azure Databricks,Use Databricks CLI ssh commands for remote development,Learn how to use the Databricks CLI ssh commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. Important Databricks Remote Development is in Beta. The ssh command group within the Databricks CLI allows you to set up and establish SSH connections to Databricks compute. 
See Databricks Remote Development.",2026-03-06T19:12:00.000Z,reference,integrations,0.74,True,"CLI reference page for ssh command group will list command names, flags, and parameter behaviors specific to Databricks Remote Development, which are product-specific API/CLI details not generally known from training.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/storage-credentials-commands,storage-credentials,storage-credentials command group - Azure Databricks,Use Databricks CLI storage-credentials commands,Learn how to use the Databricks CLI storage-credentials commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The storage-credentials command group within the Databricks CLI contains commands to manage storage credentials in Unity Catalog. A storage credential represents an authentication and authorization mechanism for accessing data stored on your cloud tenant. Each storage ",2026-01-19T08:00:00.000Z,reference,integrations,0.78,True,"CLI reference pages typically list command names, required/optional parameters, and flags specific to the Databricks CLI and Unity Catalog storage credentials. These are product-specific API/SDK-style parameters and options that qualify as integration/coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/sync-commands,sync,sync command - Azure Databricks,Sync local files with Databricks CLI sync command,Learn how to use the Databricks CLI to perform one-way synchronization of file changes from a local directory to a folder in your remote Azure Databricks workspace.,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. 
Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The sync command group within the Databricks CLI enables one-way synchronization of local code and file changes in a directory on your local development machine to a folder in your remote Azure Databricks workspace. Note",2026-02-08T08:00:00.000Z,reference,integrations,0.78,True,"The sync command reference will include CLI parameters (paths, filters, flags) and behavior details for one-way synchronization, which are product-specific command/parameter patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/system-schemas-commands,system-schemas,system-schemas command group - Azure Databricks,Manage system schemas with Databricks CLI commands,Learn how to use the Databricks CLI system-schemas commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The system-schemas command group within the Databricks CLI contains commands to manage schemas in the system catalog. A system schema may contain information about customer usage of Unity Catalog such as audit-logs, billing-logs, and lineage information. See Monitor acc",2026-01-19T08:00:00.000Z,reference,integrations,0.76,True,"Describes CLI commands and parameters for managing system schemas, including command syntax and options that are specific to Databricks CLI.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/table-constraints-commands,table-constraints,table-constraints command group - Azure Databricks,Manage table constraints via Databricks CLI,Learn how to use the Databricks CLI table-constraints commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. 
Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The table-constraints command group within the Databricks CLI contains commands to manage primary key and foreign key constraints that encode relationships between fields in tables.",2026-01-19T08:00:00.000Z,reference,integrations,0.76,True,"Command-group reference for table-constraints will enumerate commands and flags for primary/foreign key management, which are concrete CLI integration details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/tables-commands,tables,tables command group - Azure Databricks,Manage Unity Catalog tables with Databricks CLI,Learn how to use the Databricks CLI tables commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The tables command group within the Databricks CLI contains commands to manage tables in Unity Catalog. A table resides in the third layer of Unity Catalog's three-level namespace. It contains rows of data.",2026-01-19T08:00:00.000Z,reference,integrations,0.76,True,"Provides Databricks CLI tables command syntax and parameters, which are product-specific integration details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/tables-commands,tables,tables command group - Azure Databricks,Administer Unity Catalog tables via Databricks CLI,Learn how to use the Databricks CLI tables commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The tables command group within the Databricks CLI contains commands to manage tables in Unity Catalog. 
A table resides in the third layer of Unity Catalog's three-level namespace. It contains rows of data.",2026-04-24T23:15:00.000Z,reference,integrations,0.78,True,"The tables command group reference lists product-specific CLI commands and flags for managing tables, which are concrete integration points not derivable from generic knowledge.",updated https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/tag-policies-commands,tag-policies,tag-policies command group - Azure Databricks,Administer governed tag policies with Databricks CLI,Learn how to use the Databricks CLI tag-policies commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The tag-policies command group within the Databricks CLI contains commands to manage policies for governed tags in Databricks. See Governed tags.",2026-02-19T19:35:00.000Z,reference,integrations,0.72,True,"tag-policies command group reference contains concrete commands and parameters for managing governed tags, which are Databricks-specific API/CLI patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/temporary-path-credentials-commands,temporary-path-credentials,temporary-path-credentials command group - Azure Databricks,Generate temporary path credentials with Databricks CLI,Learn how to use the Databricks CLI temporary-path-credentials commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The temporary-path-credentials command group within the Databricks CLI contains commands to generate short-lived, downscoped credentials used to access external cloud storage locations registered in Databricks. 
These credentials provide secure and time-limited access t",2026-02-19T19:35:00.000Z,reference,integrations,0.76,True,"temporary-path-credentials command reference will include parameters (scopes, durations, storage locations) for generating downscoped credentials, which are detailed integration/security configuration values.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/temporary-table-credentials-commands,temporary-table-credentials,temporary-table-credentials command group - Azure Databricks,Generate temporary table credentials with Databricks CLI,Learn how to use the Databricks CLI temporary-table-credentials commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The temporary-table-credentials command group within the Databricks CLI contains commands to generate temporary table credentials. These are short-lived, downscoped credentials used to access cloud storage locations where table data is stored in Databricks. See Unity Ca",2026-02-19T08:00:00.000Z,reference,integrations,0.76,True,"Covers CLI commands and options for temporary-table-credentials, including parameter names and usage patterns unique to Databricks.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/token-management-commands,token-management,token-management command group - Azure Databricks,Administer token management using Databricks CLI,Learn how to use the Databricks CLI token-management commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions.
The token-management command group within the Databricks CLI enables administrators to get all tokens and delete tokens for other users. Admins can either get every token, get a specific token by ID, or get all tokens for a particular user.",2026-01-19T08:00:00.000Z,reference,integrations,0.78,True,"Token-management command reference will list commands, IDs, and flags for managing tokens, which are specific CLI integration patterns.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/tokens-commands,tokens,tokens command group - Azure Databricks,Create and revoke tokens with Databricks CLI,Learn how to use the Databricks CLI tokens commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The tokens command group within the Databricks CLI allows you to create, list, and revoke tokens that can be used to authenticate and access Databricks APIs and tools.",2026-01-19T08:00:00.000Z,reference,integrations,0.78,True,"Tokens command group reference exposes concrete CLI commands and parameters for token lifecycle operations, which are product-specific.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/users-commands,users,users command group - Azure Databricks,Manage Databricks users with CLI users commands,Learn how to use the Databricks CLI users commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The users command group within the Databricks CLI allows you to manage user identities in your Databricks workspace. User identities recognized by Databricks are represented by email addresses. See Manage users.
Databricks recommends using SCIM provisioning to sync user",2026-01-19T08:00:00.000Z,reference,integrations,0.78,True,"Describes CLI commands and arguments for user identity management, including parameter names and usage unique to Databricks CLI.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/vector-search-endpoints-commands,vector-search-endpoints,vector-search-endpoints command group - Azure Databricks,Manage vector search endpoints via Databricks CLI,Learn how to use the Databricks CLI vector-search-endpoints commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The vector-search-endpoints command group within the Databricks CLI contains commands to manage vector search endpoints. Endpoints represent the compute resources to host vector search indexes. See Mosaic AI Vector Search.",2026-01-19T08:00:00.000Z,reference,integrations,0.78,True,Vector-search-endpoints command reference will include endpoint-specific CLI parameters and options that are detailed integration patterns.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/users-v2-commands,users-v2,users-v2 command group - Azure Databricks,Manage Databricks users with CLI users-v2 commands,Learn how to use the Databricks CLI users-v2 commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The users-v2 command group within the Databricks CLI allows you to manage user identities in the Databricks workspace.
Databricks recommends using SCIM provisioning to sync users and groups automatically from your identity provider to your Databricks workspace.",2026-04-24T23:15:00.000Z,reference,integrations,0.78,True,"Describes the users-v2 CLI command group with specific command names and options for user identity management, fitting the integrations category as product-specific API/CLI patterns.",new
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/vector-search-endpoints-commands,vector-search-endpoints,vector-search-endpoints command group - Azure Databricks,Operate vector search endpoints using Databricks CLI,Learn how to use the Databricks CLI vector-search-endpoints commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The vector-search-endpoints command group within the Databricks CLI contains commands to manage vector search endpoints. Endpoints represent the compute resources to host vector search indexes. See Mosaic AI Vector Search.",2026-04-24T23:15:00.000Z,reference,integrations,0.8,True,"Provides detailed CLI commands and parameters for vector-search-endpoints, a product-specific integration surface for Mosaic AI Vector Search.",updated
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/vector-search-indexes-commands,vector-search-indexes,vector-search-indexes command group - Azure Databricks,Manage vector search indexes with Databricks CLI,Learn how to use the Databricks CLI vector-search-indexes commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions.
The vector-search-indexes command group within the Databricks CLI contains commands to manage vector search indexes. A vector search index is an efficient representation of your embedding vectors that supports real-time and approximate nearest neighbor (ANN) search que",2026-01-19T08:00:00.000Z,reference,integrations,0.78,True,"Provides CLI command syntax and parameters for vector-search-indexes, which are concrete, product-specific integration details.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/version-command,version,version command - Azure Databricks,Check Databricks CLI version with version command,Learn how to use the Databricks CLI version command,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The version command within the Databricks CLI allows you to retrieve information about the current version of the CLI.",2026-01-19T08:00:00.000Z,reference,integrations,0.7,True,"Even though simple, the version command reference will show exact CLI invocation and any flags, which are specific command-level integration details.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/volumes-commands,volumes,volumes command group - Azure Databricks,Manage Unity Catalog volumes using Databricks CLI,Learn how to use the Databricks CLI volumes commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The volumes command group within the Databricks CLI contains commands to manage volumes in Unity Catalog. Volumes provide features for accessing, storing, governing, organizing and processing files.
See What are Unity Catalog volumes?.",2026-01-19T08:00:00.000Z,reference,integrations,0.76,True,"Volumes command group reference lists CLI commands and parameters for volume operations, which are product-specific integration patterns.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/warehouses-commands,warehouses,warehouses command group - Azure Databricks,Manage SQL warehouses with Databricks CLI,Learn how to use the Databricks CLI warehouses commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The warehouses command group within the Databricks CLI allows you to manage SQL warehouses. A SQL warehouse is a compute resource that lets you run SQL commands on data objects within Databricks SQL. See Connect to a SQL warehouse.",2026-01-19T08:00:00.000Z,reference,integrations,0.76,True,"Warehouses command group will document CLI commands, flags, and parameter names for SQL warehouse management, which are integration details.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/warehouses-commands,warehouses,warehouses command group - Azure Databricks,Manage Databricks SQL warehouses via CLI,Learn how to use the Databricks CLI warehouses commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The warehouses command group within the Databricks CLI allows you to manage SQL warehouses. A SQL warehouse is a compute resource that lets you run SQL commands on data objects within Databricks SQL.
See Connect to a SQL warehouse.",2026-04-24T23:15:00.000Z,reference,integrations,0.8,True,"The warehouses CLI command group reference includes concrete command syntax and options for SQL warehouse management, which are specific integration details beyond generic knowledge.",updated
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/workspace-bindings-commands,workspace-bindings,workspace-bindings command group - Azure Databricks,Configure Unity Catalog workspace bindings with CLI,Learn how to use the Databricks CLI workspace-bindings commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The workspace-bindings command group within the Databricks CLI contains commands to configure (bind) securables in Unity Catalog. A securable in Databricks can be configured as OPEN or ISOLATED. An OPEN securable can be accessed from any workspace, while an ISOLATED securable",2026-01-19T08:00:00.000Z,reference,integrations,0.76,True,"Workspace-bindings command reference will show commands and flags to set OPEN/ISOLATED bindings, which are concrete CLI integration patterns.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/workspace-commands,workspace,workspace command group - Azure Databricks,Manage workspace files and folders via Databricks CLI,Learn how to use the Databricks CLI workspace commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The workspace command group within the Databricks CLI allows you to list, import, export, and delete workspace files and folders.
See What are workspace files?.",2026-01-19T08:00:00.000Z,reference,integrations,0.76,True,"Workspace command group reference includes CLI operations (list, import, export, delete) with specific parameters and options unique to Databricks.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/workspace-conf-commands,workspace-conf,workspace-conf command group - Azure Databricks,Update advanced workspace settings via Databricks CLI,Learn how to use the Databricks CLI workspace-conf commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The workspace-conf command group within the Databricks CLI contains commands to update workspace settings for advanced users.",2026-01-19T08:00:00.000Z,reference,integrations,0.76,True,"Workspace-conf command group documents specific configuration keys and CLI parameters for workspace settings, which are product-specific.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/workspace-entity-tag-assignments-commands,workspace-entity-tag-assignments,workspace-entity-tag-assignments command group - Azure Databricks,Manage workspace entity tag assignments via CLI,Learn how to use the Databricks CLI workspace-entity-tag-assignments commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions.
The workspace-entity-tag-assignments command group within the Databricks CLI contains commands to manage tag assignments on workspace-scoped objects.",2026-02-19T19:35:00.000Z,reference,integrations,0.7,True,"workspace-entity-tag-assignments command group reference includes commands and parameters for tagging workspace-scoped objects, which are Databricks-specific integration details.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/troubleshooting,Troubleshooting,Troubleshoot the Databricks CLI - Azure Databricks,Troubleshoot common Databricks CLI errors and issues,Learn how to troubleshoot issues with the Databricks CLI.,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. Use the following information to troubleshoot issues with the Databricks CLI.",2026-01-19T08:00:00.000Z,troubleshooting,troubleshooting,0.84,True,"A dedicated troubleshooting page will map specific CLI error messages/codes and symptoms to causes and resolutions, which is product-specific troubleshooting knowledge.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/workspace-iam-v2-commands,workspace-iam-v2,workspace-iam-v2 command group - Azure Databricks,Manage Databricks workspace IAM via CLI v2,Learn how to use the Databricks CLI workspace-iam-v2 commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The workspace-iam-v2 command group within the Databricks CLI allows you to manage identities and workspace access.
These APIs are used to manage identities and the workspace access of these identities in Databricks.",2026-04-24T23:15:00.000Z,reference,security,0.78,True,"CLI reference pages enumerate product-specific workspace-iam-v2 commands, flags, and behaviors for managing identities and access in Databricks. These are concrete, versioned security/IAM configuration interfaces that an LLM cannot reliably infer from training data and map directly to security-related operations.",new
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/workspace-settings-v2-commands,workspace-settings-v2,workspace-settings-v2 command group - Azure Databricks,Configure Databricks workspace settings with CLI v2,Learn how to use the Databricks CLI workspace-settings-v2 commands,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The workspace-settings-v2 command group within the Databricks CLI allows you to manage workspace level settings.",2026-04-24T23:15:00.000Z,reference,configuration,0.76,True,"The workspace-settings-v2 CLI reference describes specific commands, parameters, and allowed values for managing workspace-level settings. This is detailed configuration surface (names, options, and behaviors) that qualifies as expert knowledge beyond generic concepts.",new
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/troubleshooting,Troubleshooting,Troubleshoot the Databricks CLI - Azure Databricks,Troubleshoot Azure Databricks CLI issues,Learn how to troubleshoot issues with the Databricks CLI.,"Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions.
Use the following information to troubleshoot issues with the Databricks CLI.",2026-04-20T22:01:00.000Z,troubleshooting,troubleshooting,0.86,True,"A dedicated troubleshooting page for the Databricks CLI will list concrete error messages/codes, causes, and resolution steps specific to this CLI, matching the symptom → cause → solution pattern and providing expert diagnostic knowledge.",updated
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/tutorial,Tutorial,Databricks CLI tutorial - Azure Databricks,,"Use this hands-on tutorial to quickly get started with the Databricks command-line interface (Databricks CLI), provided by Databricks.","Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions. The Databricks command-line interface (also known as the Databricks CLI) utility provides an easy-to-use interface to automate the Azure Databricks platform from your terminal, command prompt, or automation scripts. See What is the Databricks CLI?. This article demo",2026-01-24T08:00:00.000Z,tutorial,,0.45,False,Tutorial-style walkthrough for using the Databricks CLI. Likely focuses on step-by-step usage rather than exhaustive configuration references or expert-only details.,unchanged
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/usage,Basic usage,Basic usage for the Databricks CLI - Azure Databricks,,"Learn how to use the Databricks CLI to output available command groups and commands, output help, and work with CLI output.","Note This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview. Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions.
This page shows you how to list Databricks CLI command groups and commands, display Databricks CLI help, and work with Databricks CLI output. See What is the Databricks CLI?. To install and configure authentication for the Databricks CLI, see Databricks CLI tutorial.",2026-01-27T01:24:00.000Z,overview,,0.3,False,"Basic CLI usage and help/output handling; likely procedural tutorial without detailed config tables, limits, or product-specific error mappings.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/,Overview,Databricks Apps - Azure Databricks,,"Get a conceptual overview of Databricks Apps and learn about its main use cases, requirements, and limitations.","Databricks Apps enables developers to build and deploy secure data and AI applications directly on the Databricks platform, which eliminates the need for separate infrastructure. Apps run on the serverless platform and integrate with key platform services, including Unity Catalog for data governance, Databricks SQL for querying data, and OAuth for authentication. Common use cases include interactive dashboards, RAG chat apps, data entry forms, and custom operational interfaces. Develop apps loca",2026-03-31T23:28:00.000Z,landing-page,,0.3,False,"Conceptual overview of Databricks Apps, use cases, requirements, and limitations; does not emphasize detailed config parameters, error codes, or numeric limits that qualify as expert knowledge.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/,Overview,Databricks Apps - Azure Databricks,,"Get a conceptual overview of Databricks Apps and learn about its main use cases, requirements, and limitations.","Databricks Apps enables developers to build and deploy secure data and AI applications directly on the Databricks platform, which eliminates the need for separate infrastructure.
Apps run on the serverless platform and integrate with key platform services, including Unity Catalog for data governance, Databricks SQL for querying data, and OAuth for authentication. Common use cases include interactive dashboards, RAG chat apps, data entry forms, and custom operational interfaces. Develop apps loca",2026-04-20T08:00:00.000Z,landing-page,,0.1,False,"Conceptual overview of Databricks Apps, use cases, and high-level requirements/limitations. No detailed configuration tables, numeric limits, error codes, or decision matrices; primarily descriptive content.",updated
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/app-development,Develop apps,Develop apps - Azure Databricks,,"This article has details for creating data and AI apps with Databricks Apps, including how to use Databricks features with your apps and important information for developing apps using the supported f","To build data and AI apps with Databricks Apps, you can use any IDE that supports Python, such as PyCharm, IntelliJ IDEA, or Visual Studio Code. Azure Databricks recommends using the Databricks extension for Visual Studio Code, but you can also edit your code in the Databricks notebook and file editor. The Databricks Apps environment automatically sets several environment variables, such as the URL of the Azure Databricks workspace running the app and values required for authentication.
Many apps",2026-03-17T18:09:00.000Z,landing-page,,0.4,False,"Development guidance mentions environment variables and supported frameworks but is summarized at a high level; no explicit parameter tables, default values, or product-specific constraints are evident from the summary.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/app-runtime,Configure app.yaml,Configure Databricks app execution with app.yaml - Azure Databricks,Configure Databricks Apps with app.yaml runtime settings,This article describes how to configure the build and deployment for a Databricks app using an app.yaml file.,"The app.yaml file in a Databricks app defines how your app runs. If your app requires a different entry point or environment-specific configuration, you can include this optional file in your project to override the default behavior. You can use the .yaml or .yml file extension. This file must be located in the root of your project directory.",2026-03-17T18:09:00.000Z,how-to,configuration,0.8,True,"Details how app.yaml controls Databricks app execution, including entry points and environment-specific configuration. This is a product-specific config file with defined fields and behavior, fitting the configuration sub-skill type.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/apps-resource,Apps,Add a Databricks app resource to a Databricks app - Azure Databricks,Configure app-to-app resources for Databricks Apps,"Learn how to add another Databricks app as a resource for your Databricks app, enabling app-to-app communication.","Add another Databricks app as a resource for your app so it can communicate with other deployed apps.
This enables app-to-app interactions, such as calling another app's API or orchestrating workflows across multiple apps.",2026-04-15T19:49:00.000Z,how-to,configuration,0.65,True,"Adding another Databricks app as a resource likely involves specific configuration fields and environment variables for app-to-app communication, which are configuration details unique to this product.",updated
-https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/auth,Configure authorization,Configure authorization in a Databricks app - Azure Databricks,Configure authentication and authorization for Databricks Apps,Learn about the authorization and authentication mechanisms available to enforce data access controls for your app.,"Databricks Apps supports secure application development on Azure Databricks. As apps access data and services within a workspace, they must use authentication and authorization mechanisms that enforce data access controls and respect user permissions. The Databricks Apps authorization model is based on OAuth 2.0 and combines the permissions assigned to the app with those of the user accessing it. To support this framework, Databricks Apps uses two complementary identity models:",2026-04-15T08:00:00.000Z,how-to,security,0.8,True,"Covers Databricks Apps authorization model based on OAuth 2.0 and app/user identity models; likely includes specific scopes, roles, and configuration parameters unique to Databricks Apps.",updated
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/apps-resource,Apps,Add a Databricks app resource to a Databricks app - Azure Databricks,Configure app-to-app resources for Databricks Apps,"Learn how to add another Databricks app as a resource for your Databricks app, enabling app-to-app communication.","Add another Databricks app as a resource for your app so it can communicate with other deployed apps.
This enables app-to-app interactions, such as calling another app's API or orchestrating workflows across multiple apps.",2026-04-15T19:49:00.000Z,how-to,configuration,0.65,True,"Adding another Databricks app as a resource likely involves specific configuration fields and environment variables for app-to-app communication, which are configuration details unique to this product.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/auth,Configure authorization,Configure authorization in a Databricks app - Azure Databricks,Configure authentication and authorization for Databricks Apps,Learn about the authorization and authentication mechanisms available to enforce data access controls for your app.,"Databricks Apps supports secure application development on Azure Databricks. As apps access data and services within a workspace, they must use authentication and authorization mechanisms that enforce data access controls and respect user permissions. The Databricks Apps authorization model is based on OAuth 2.0 and combines the permissions assigned to the app with those of the user accessing it. To support this framework, Databricks Apps uses two complementary identity models:",2026-04-15T08:00:00.000Z,how-to,security,0.8,True,"Covers Databricks Apps authorization model based on OAuth 2.0 and app/user identity models; likely includes specific scopes, roles, and configuration parameters unique to Databricks Apps.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/best-practices,Best practices for apps,Best practices for Databricks Apps - Azure Databricks,Apply security and performance best practices for Databricks apps,Learn about best practices for building apps.,"This page lists important best practices for developing and running Databricks Apps. 
These guidelines focus on security, performance, and platform requirements.",2026-03-17T18:09:00.000Z,best-practice,best-practices,0.8,True,"The page explicitly lists best practices for developing and running Databricks Apps, focusing on security, performance, and platform requirements. These are product-specific DO/DON'T guidelines and likely include concrete recommendations and gotchas unique to Databricks Apps, matching the best-practices category.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/compute-size,Configure compute,Configure compute resources for a Databricks app - Azure Databricks,Select and configure compute size for Databricks Apps,"Configure instance size for your Databricks app to control CPU, memory, and cost for different workload requirements.",Each Databricks app runs on compute resources that determine its processing power and memory. Choose an instance size when you create or edit an app to match your workload requirements.,2026-04-15T19:49:00.000Z,how-to,decision-making,0.65,True,"Choosing instance size for apps to match workload and cost implies guidance on when to pick which size; likely includes Databricks-specific thresholds or recommendations, fitting decision-making.",updated +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/compute-size,Configure compute,Configure compute resources for a Databricks app - Azure Databricks,Choose compute size for Azure Databricks Apps,"Configure instance size for your Databricks app to control CPU, memory, and cost for different workload requirements.",Each Databricks app runs on compute resources that determine its processing power and memory. 
Choose an instance size when you create or edit an app to match your workload requirements.,2026-04-24T08:00:00.000Z,how-to,decision-making,0.65,True,"A page about configuring instance size for Databricks Apps is likely to include SKU/size options, CPU/memory values, and guidance on which size to choose for different workloads. That is expert decision guidance with quantified trade-offs (cost vs. performance) and tier selection, fitting decision-making.",updated https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/configuration,Configure apps,Configure apps - Azure Databricks,"Configure Databricks Apps templates, auth, and routing","Learn how to configure app templates, permissions, authorization, and request handling for your Databricks apps.","Before you define how your app runs or what dependencies it needs, you typically start by creating and configuring the app in your Azure Databricks workspace. This includes choosing a template or building from scratch and setting up authorization and permissions. These steps establish the app's identity and access model in the workspace. Use the following topics to learn how to create and configure your app:",2026-02-02T19:09:00.000Z,landing-page,configuration,0.74,True,"Explains how to configure app templates, permissions, authorization, and request handling, including specific configuration options and models unique to Databricks Apps.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/configure-env,Set up your environment,Set up your Databricks Apps workspace and development environment - Azure Databricks,Prepare workspace and local environment for Databricks Apps,Configure workspace and development environment requirements before you develop an app.,"To build, deploy, and run Databricks apps, your environment must meet specific prerequisites. These include requirements for both your Azure Databricks workspace and your local development environment. 
To deploy and run apps in your Azure Databricks workspace, make sure it meets the following requirements: All apps require Databricks CLI version 0.229.0 or above, configured to access your workspace. To install or update the CLI, see Install or update the Databricks CLI. Azure Databricks recommend",2026-01-29T08:00:00.000Z,how-to,deployment,0.7,True,"Lists specific workspace and local prerequisites (CLI version, workspace features) required to deploy and run apps; product-specific deployment requirements.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/connect-local,Connect using token authentication,Connect to an API Databricks app using token authentication - Azure Databricks,Use token-based auth for Databricks app APIs,"Learn how to connect to your Azure Databricks app APIs using token-based authentication from local machines, external applications, and other apps.","You can call a Databricks app that exposes an HTTP API (for example, a FastAPI or Gradio app) using OAuth 2.0 Bearer token authentication. This method works from your local development environment, external applications, and other Azure Databricks apps. Note This method applies only to apps that expose APIs or endpoints (accessible using /api/routes). For apps that provide only a user interface or background processing, you can't connect using token authentication.",2026-04-07T17:44:00.000Z,how-to,security,0.7,True,"Describes OAuth 2.0 Bearer token authentication for Databricks app HTTP APIs, including applicability constraints (only /api/routes apps). 
This is product-specific security/auth configuration and usage guidance.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/connections,Connections,Add a Unity Catalog connection resource to a Databricks app - Azure Databricks,Add Unity Catalog connection resources to Databricks Apps,Learn how to add Unity Catalog connections as Databricks Apps resources for secure access to external data sources.,"Add Unity Catalog connections as Databricks Apps resources to enable secure access to external services and data sources. Unity Catalog connections manage credentials and authentication details, so you don't have to hardcode credentials in your application code.",2026-04-15T08:00:00.000Z,how-to,security,0.7,True,"Unity Catalog connections manage credentials and authentication; configuring them as resources is security-focused and likely includes specific connection types, scopes, and permissions.",updated -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/create-app-template,Create an app from a template,Create a Databricks app from a template - Azure Databricks,,Create a Databricks app from a template.,"Create a Databricks app quickly by starting from a template. The UI guides you through selecting a framework, naming your app, and configuring required resources. Once you create the app, Azure Databricks deploys it with example code and setup instructions to help you begin development locally.",2026-04-15T19:49:00.000Z,how-to,,0.3,False,"Template-creation walkthrough; summary suggests UI-driven steps, not detailed configuration matrices, limits, or security role definitions.",updated -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/create-custom-app,Create a custom app,Create a custom Databricks app - Azure Databricks,,Create a custom Databricks app using your own code.,"Create a custom Databricks app using your own code. 
Start by naming the app, then upload your code and artifacts to the workspace. Azure Databricks creates the app but leaves deployment up to you, so you have full control over the app’s setup and configuration.",2026-04-15T19:49:00.000Z,how-to,,0.3,False,"Custom app creation flow; likely a procedural guide without deep config parameter tables, limits, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/connections,Connections,Add a Unity Catalog connection resource to a Databricks app - Azure Databricks,Add Unity Catalog connection resources to Databricks Apps,Learn how to add Unity Catalog connections as Databricks Apps resources for secure access to external data sources.,"Add Unity Catalog connections as Databricks Apps resources to enable secure access to external services and data sources. Unity Catalog connections manage credentials and authentication details, so you don't have to hardcode credentials in your application code.",2026-04-15T08:00:00.000Z,how-to,security,0.7,True,"Unity Catalog connections manage credentials and authentication; configuring them as resources is security-focused and likely includes specific connection types, scopes, and permissions.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/create-app-template,Create an app from a template,Create a Databricks app from a template - Azure Databricks,,Create a Databricks app from a template.,"Create a Databricks app quickly by starting from a template. The UI guides you through selecting a framework, naming your app, and configuring required resources. Once you create the app, Azure Databricks deploys it with example code and setup instructions to help you begin development locally.",2026-04-24T08:00:00.000Z,how-to,,0.2,False,"Guides creating an app from a template via UI. 
Appears to be procedural/tutorial content without detailed configuration parameter tables, limits, or error-resolution mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/create-custom-app,Create a custom app,Create a custom Databricks app - Azure Databricks,,Create a custom Databricks app using your own code.,"Create a custom Databricks app using your own code. Start by naming the app, then upload your code and artifacts to the workspace. Azure Databricks creates the app but leaves deployment up to you, so you have full control over the app’s setup and configuration.",2026-04-24T08:00:00.000Z,how-to,,0.2,False,"Describes creating a custom Databricks app and uploading code. This is a how-to flow without explicit expert-level configuration matrices, limits, or troubleshooting structures.",updated https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/dependencies,Manage dependencies,Manage dependencies for a Databricks app - Azure Databricks,Configure Python and Node.js dependencies for Databricks apps,"This article explains how to manage Python dependencies in a Azure Databricks app, including overriding pre-installed libraries.","Each Databricks app can include dependencies for Python, Node.js, or both. You define these dependencies in language-specific files:",2026-03-17T18:09:00.000Z,how-to,configuration,0.7,True,"The page describes how Databricks Apps handle Python/Node.js dependencies via language-specific files and overriding pre-installed libraries. This is product-specific configuration behavior (how dependencies are resolved and managed in Databricks Apps), which goes beyond generic package management knowledge. 
It likely includes specific file names, resolution order, and constraints, fitting the configuration category.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/deploy,Deploy apps,Deploy a Databricks app - Azure Databricks,Deploy Databricks apps via UI and CLI,Learn how to deploy an app after you create and develop it.,"After you create and develop your Azure Databricks app, deploy it to make it accessible in the Azure Databricks workspace. Deployment builds your app, installs dependencies, and runs it using the configuration defined in your project files. You can deploy apps using the Azure Databricks UI or the Databricks CLI. Note If you create an app from a template, Azure Databricks deploys it automatically when you first create it. However, you can still re-deploy it later after you make changes. See Create",2026-04-17T08:00:00.000Z,how-to,deployment,0.7,True,"Covers how Databricks apps are built and run using platform-specific deployment flows (UI and CLI) and project configuration, which are product-specific deployment details rather than generic concepts.",updated +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/deploy,Deploy apps,Deploy a Databricks app - Azure Databricks,Deploy Azure Databricks Apps via UI or CLI,Learn how to deploy an app after you create and develop it.,"After you create and develop your Azure Databricks app, deploy it to make it accessible in the Azure Databricks workspace. Deployment builds your app, installs dependencies, and runs it using the configuration defined in your project files. You can deploy apps using the Azure Databricks UI or the Databricks CLI. Note If you create an app from a template, Azure Databricks deploys it automatically when you first create it. However, you can still re-deploy it later after you make changes. 
See Create",2026-04-24T08:00:00.000Z,how-to,deployment,0.68,True,"The page focuses on how to deploy Databricks Apps, including behavior differences (automatic deployment from templates, redeploy after changes) and likely product-specific deployment commands/constraints. This is deployment-focused expert knowledge beyond generic 'run a build' instructions.",updated https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/embed,Embed apps,Embed Databricks apps in web applications - Azure Databricks,Configure iframe embedding for Databricks apps,Learn how to embed Databricks apps in other web applications.,"Embed Databricks apps in other web applications using an HTML iframe element. To embed an app, specify the app URL as the src attribute in an iframe tag.",2026-01-16T08:00:00.000Z,how-to,configuration,0.65,True,"Explains embedding apps via HTML iframe with specific src URL; likely includes URL patterns and parameters unique to Databricks app embedding, which is configuration-focused.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/environment-variables,Define environment variables,Define environment variables in a Databricks app - Azure Databricks,Configure environment variables for Databricks apps,Learn how to define environment variables for an app.,"Azure Databricks automatically sets certain environment variables in the app runtime environment. These variables provide essential information about the app and workspace, and are accessible to all Databricks apps by default. For a list of default variables, see Databricks Apps environment. If your app requires additional environment variables, define them in the app.yaml configuration file in the env section. Each variable requires a name and a value. 
Variables can use a hardcoded value or referenc",2026-04-06T18:30:00.000Z,how-to,configuration,0.7,True,"Covers defining environment variables via app.yaml in the env section, including default variables set by the platform. This is product-specific configuration behavior and likely includes parameter names and usage patterns unique to Databricks Apps.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/functions,User-defined functions,Add a user-defined function (UDF) resource to a Databricks app - Azure Databricks,Add Unity Catalog UDF resources to Databricks Apps,Learn how to add user-defined functions (UDFs) as Databricks Apps resources to execute registered SQL and Python functions.,"Add user-defined functions (UDFs) registered in Unity Catalog as Databricks Apps resources to enable your app to execute registered SQL and Python functions. UDFs provide reusable business logic, data transformations, and custom operations that can be shared across your organization with centralized governance.",2026-04-15T19:49:00.000Z,how-to,configuration,0.7,True,"Describes adding registered SQL/Python UDFs as resources; likely includes how to reference them and any required configuration fields, which are product-specific configuration patterns.",updated -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/genie,Genie spaces,Add a Genie space resource to a Databricks app - Azure Databricks,Integrate Genie spaces as resources in Databricks Apps,This article explains how to add a Genie space as a Databricks Apps resource.,"Add Genie spaces as Databricks Apps resources to enable natural language querying in your applications. Genie spaces provide a conversational interface for data exploration, allowing users to ask business questions in plain English and receive SQL-based insights from your curated datasets. 
When you add a Genie space as a resource, your app can:",2026-04-15T08:00:00.000Z,how-to,configuration,0.65,True,"Adding Genie spaces as resources for natural language querying likely requires specific configuration parameters and environment variables, which are product-specific configuration patterns.",updated -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/get-started,Get started with apps,Get started with Databricks Apps - Azure Databricks,,Learn how to set up your development environment for Databricks Apps and develop and deploy your first Databricks app.,"This article helps you get started with Databricks Apps using a step-by-step example to create a simple app in your Azure Databricks workspace using a template that follows Azure Databricks best practices. This example walks you through: By the end of this article, you’ll be able to iterate on your app locally and deploy updates to Databricks.",2026-04-15T19:49:00.000Z,get-started,,0.3,False,"Step-by-step getting started tutorial; description does not indicate detailed config tables, limits, or product-specific troubleshooting or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/functions,User-defined functions,Add a user-defined function (UDF) resource to a Databricks app - Azure Databricks,Add Unity Catalog UDF resources to Databricks Apps,Learn how to add user-defined functions (UDFs) as Databricks Apps resources to execute registered SQL and Python functions.,"Add user-defined functions (UDFs) registered in Unity Catalog as Databricks Apps resources to enable your app to execute registered SQL and Python functions. 
UDFs provide reusable business logic, data transformations, and custom operations that can be shared across your organization with centralized governance.",2026-04-15T19:49:00.000Z,how-to,configuration,0.7,True,"Describes adding registered SQL/Python UDFs as resources; likely includes how to reference them and any required configuration fields, which are product-specific configuration patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/genie,Genie spaces,Add a Genie space resource to a Databricks app - Azure Databricks,Integrate Genie spaces as resources in Databricks Apps,This article explains how to add a Genie space as a Databricks Apps resource.,"Add Genie spaces as Databricks Apps resources to enable natural language querying in your applications. Genie spaces provide a conversational interface for data exploration, allowing users to ask business questions in plain English and receive SQL-based insights from your curated datasets. When you add a Genie space as a resource, your app can:",2026-04-15T08:00:00.000Z,how-to,configuration,0.65,True,"Adding Genie spaces as resources for natural language querying likely requires specific configuration parameters and environment variables, which are product-specific configuration patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/get-started,Get started with apps,Get started with Databricks Apps - Azure Databricks,,Learn how to set up your development environment for Databricks Apps and develop and deploy your first Databricks app.,"This article helps you get started with Databricks Apps using a step-by-step example to create a simple app in your Azure Databricks workspace using a template that follows Azure Databricks best practices. 
This example walks you through: By the end of this article, you’ll be able to iterate on your app locally and deploy updates to Databricks.",2026-04-24T08:00:00.000Z,get-started,,0.2,False,"Step-by-step getting started tutorial for Databricks Apps. While it may show example commands or flows, it is focused on walkthrough rather than enumerating configuration matrices, limits, or troubleshooting mappings.",updated https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/http-headers,Access HTTP headers,Access HTTP headers passed to Databricks apps - Azure Databricks,Use HTTP headers forwarded to Databricks apps,This article explains which HTTP headers are passed to your app.,"Databricks Apps passes specific X-Forwarded-* HTTP headers from the reverse proxy to your app. Use these headers to access information about the original request, such as the client IP address or protocol. Databricks Apps includes the following X-Forwarded-* headers in requests that are forwarded from the reverse proxy to your app:",2026-01-16T08:00:00.000Z,reference,configuration,0.8,True,"Lists specific X-Forwarded-* headers passed from the reverse proxy and how to use them, which is detailed environment/configuration information (exact header names and semantics) unique to Databricks Apps.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/key-concepts,Key concepts in apps,Key concepts in Databricks Apps - Azure Databricks,,Learn about key concepts in Databricks app.,"This article introduces the core concepts behind Databricks Apps, including how apps are structured, how they manage dependencies and state, how permissions work, and how apps interact with platform resources. 
Understanding these concepts helps when developing, deploying, and managing apps in your workspace.",2026-03-17T18:09:00.000Z,concept-article,,0.2,False,"Key concepts article describing structure, permissions, and interactions at a conceptual level; lacks concrete configuration tables, numeric thresholds, or error-code-based troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/lakebase,Lakebase,Add a Lakebase resource to a Databricks app - Azure Databricks,Configure Lakebase database resources for Databricks Apps,"This article explains how to add a Lakebase database as a Databricks Apps resource, including Autoscaling and Provisioned options.",Add Lakebase databases as Databricks Apps resources to persist data across deployments. These PostgreSQL-backed resources let your app create and manage schemas and tables that retain state. The following types of Lakebase database resources are available: Both types use the same PostgreSQL connection model and provide the same environment variables to your app.,2026-04-15T19:49:00.000Z,how-to,configuration,0.75,True,"Describes Autoscaling and Provisioned Lakebase options and shared PostgreSQL connection model; likely includes environment variables and config options, fitting configuration with expert details.",updated -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/lakeflow,Lakeflow Jobs,Add a Lakeflow Jobs resource to a Databricks app - Azure Databricks,Add and configure Lakeflow Jobs as Databricks app resources,This article explains how to add a Lakeflow Job as a Databricks Apps resource.,"Add Lakeflow Jobs as Databricks Apps resources so your app can trigger, monitor, and manage workflow automation. 
Lakeflow Jobs provide orchestration for data processing workloads, allowing you to coordinate and run multiple tasks as part of larger workflows within your app.",2026-04-15T19:49:00.000Z,how-to,configuration,0.65,True,"Enabling apps to trigger and monitor Lakeflow Jobs typically involves specific resource configuration fields and parameters, which are product-specific configuration details.",updated -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/mlflow,MLflow experiments,Add an MLflow experiment resource to a Databricks app - Azure Databricks,Configure MLflow experiment resources for Databricks Apps,"Learn how to add MLflow experiments as Databricks Apps resources for tracking and managing AI applications, agents, LLMs, and ML models.","Add MLflow experiments as Databricks Apps resources to enable experiment tracking for your AI applications, agents, LLMs, and ML models. MLflow experiments provide a structured way to organize and log runs, track parameters, metrics, and artifacts throughout the AI application development lifecycle. When you add an MLflow experiment as a resource, your app can:",2026-04-15T08:00:00.000Z,how-to,configuration,0.65,True,Adding MLflow experiments as resources for tracking runs likely includes environment variables and configuration parameters unique to Databricks Apps integration with MLflow.,updated -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/model-serving,Model serving endpoints,Add a model serving endpoint resource to a Databricks app - Azure Databricks,Add model serving endpoint resources to Databricks Apps,"Learn how to add model serving endpoints as Databricks Apps resources, including permission levels and best practices.",Add model serving endpoints as Databricks Apps resources so your app can query machine learning models for inference. 
Model serving endpoints handle model predictions and provide a consistent interface to access deployed models.,2026-04-15T19:49:00.000Z,how-to,configuration,0.7,True,"Integrating model serving endpoints, including permission levels and best practices, implies specific configuration fields and possibly security-related settings unique to Databricks model serving.",updated -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/monitor,Monitor apps,Logging and Monitoring for Databricks Apps - Azure Databricks,Monitor Databricks app logs and audit events,Monitor logs and audit events for your Databricks app.,"Effective logging and monitoring help you detect and respond to security events in Databricks Apps. Apps generate both application-level logs and platform audit logs, which you can use for diagnostics, performance tracking, and security analytics.",2026-04-15T08:00:00.000Z,how-to,security,0.65,True,Focuses on application logs and platform audit logs for Databricks Apps in a security context; likely includes product-specific log sources and monitoring configuration relevant to security operations.,updated -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/networking,Configure networking,Configure networking for Databricks Apps - Azure Databricks,Configure networking and access controls for Databricks Apps,This article explains how to configure networking for your app.,"Databricks Apps supports fine-grained network control to help you secure and manage how your app communicates with the internet and internal resources. 
You can configure both ingress (incoming) and egress (outgoing) traffic rules using a combination of IP access lists, front-end private connectivity, and network policies.",2026-04-17T21:49:00.000Z,how-to,security,0.7,True,"Networking configuration for apps with IP access lists, private connectivity, and network policies is security-focused and likely includes specific settings and patterns unique to Databricks Apps.",updated -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/observability,App telemetry,Configure telemetry for Databricks Apps - Azure Databricks,Configure OpenTelemetry-based telemetry for Databricks Apps,"Configure app telemetry to collect traces, logs, and metrics for your Databricks app in Unity Catalog tables.","Important App telemetry is in Beta. Databricks Apps telemetry collects traces, logs, and metrics and persists them to Unity Catalog tables using the OpenTelemetry (OTel) protocol. After you enable app telemetry, Databricks automatically captures system logs and usage events such as user login and direct API requests. You can also add custom instrumentation using the OpenTelemetry SDK for your framework.",2026-04-15T08:00:00.000Z,how-to,configuration,0.8,True,"Describes configuring app telemetry with OpenTelemetry and Unity Catalog tables, including Databricks-specific settings and parameters for traces, logs, and metrics collection.",updated -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/permissions,Configure permissions,Configure permissions for a Databricks app - Azure Databricks,Manage Databricks app permissions and access control,This article explains how to configure and manage user permissions for your app.,"Permissions control what users can do with Databricks apps, such as accessing, managing, and sharing apps. This is different from authentication, which verifies a user's identity. 
Permissions determine what actions the user is authorized to perform within the app.",2026-04-15T08:00:00.000Z,how-to,security,0.75,True,"Explains configuring and managing user permissions for Databricks apps; such pages typically list specific permission levels/RBAC roles and their effects, which is product-specific security configuration.",updated -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/resources,Add resources,Add resources to a Databricks app - Azure Databricks,Configure Databricks app resources and secret management,This article covers adding resources and managing secrets for your Databricks app.,"Your Databricks apps can integrate with various Azure Databricks platform features, such as Databricks SQL for querying data, Lakeflow Jobs for data ingestion and processing, Mosaic AI Model Serving to access generative AI models, and Azure Databricks secrets for managing sensitive information. In the context of apps, these platform features are referred to as resources.",2026-04-15T08:00:00.000Z,how-to,configuration,0.7,True,"Covers adding various platform features as resources and managing secrets; such pages typically define resource configuration options and environment variables, which are product-specific configuration details.",updated -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/secrets,Secrets,Add a secret resource to a Databricks app - Azure Databricks,Use Databricks secrets as secure resources in apps,This article explains how to add a Databricks secret as a Databricks Apps resource.,"Add Databricks secrets as Databricks Apps resources to securely pass sensitive values, such as API keys or tokens, to your app. Databricks Apps supports secrets stored in secret scopes. 
Apps retrieve these secrets at runtime, which keeps them out of your application code and environment definitions.",2026-04-15T19:49:00.000Z,how-to,security,0.8,True,"Focuses on securely passing sensitive values via secret scopes; likely details secret scope usage, access patterns, and environment variables, which are product-specific security configurations.",updated -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/sql-warehouse,SQL warehouses,Add a SQL warehouse resource to a Databricks app - Azure Databricks,Configure SQL warehouse resources for Databricks Apps,This article explains how to add a SQL warehouse as a Databricks Apps resource.,Add SQL warehouses as Databricks Apps resources to enable your app to connect to compute resources and run SQL queries.,2026-04-15T19:49:00.000Z,how-to,configuration,0.7,True,"Connecting apps to SQL warehouses to run queries typically involves specifying warehouse identifiers, connection parameters, and environment variables, which are configuration details.",updated -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/system-env,App environment,Databricks Apps environment - Azure Databricks,Understand environment and dependencies for Databricks apps,"Learn about the app environment for Databricks Apps, including available binaries, environment variables, and how to manage dependencies.",Your Databricks app runs in a managed environment with the following binaries and resources:,2026-03-31T23:28:00.000Z,reference,configuration,0.85,True,"Describes the managed app environment, including available binaries, environment variables, and dependency management; this is detailed configuration/environment reference information specific to Databricks Apps.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/tables,Unity Catalog tables,Add a Unity Catalog table resource to a Databricks app - Azure Databricks,Add Unity Catalog table resources with 
governed access,"Learn how to add Unity Catalog tables as Databricks Apps resources, including required privileges and best practices.","Add Unity Catalog tables as Databricks Apps resources so your app can query and modify data stored in Unity Catalog with governance and access control. Unity Catalog tables provide structured data storage with fine-grained permissions, so your app can securely read and write data without hardcoding credentials.",2026-04-15T19:49:00.000Z,how-to,security,0.7,True,"Emphasizes required privileges, governance, and access control for Unity Catalog tables; likely lists specific privilege requirements and patterns, fitting product-specific security configuration.",updated +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/lakebase,Lakebase,Add a Lakebase resource to a Databricks app - Azure Databricks,Configure Lakebase database resources for Databricks Apps,"This article explains how to add a Lakebase database as a Databricks Apps resource, including Autoscaling and Provisioned options.",Add Lakebase databases as Databricks Apps resources to persist data across deployments. These PostgreSQL-backed resources let your app create and manage schemas and tables that retain state. 
The following types of Lakebase database resources are available: Both types use the same PostgreSQL connection model and provide the same environment variables to your app.,2026-04-15T19:49:00.000Z,how-to,configuration,0.75,True,"Describes Autoscaling and Provisioned Lakebase options and shared PostgreSQL connection model; likely includes environment variables and config options, fitting configuration with expert details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/lakeflow,Lakeflow Jobs,Add a Lakeflow Jobs resource to a Databricks app - Azure Databricks,Add and configure Lakeflow Jobs as Databricks app resources,This article explains how to add a Lakeflow Job as a Databricks Apps resource.,"Add Lakeflow Jobs as Databricks Apps resources so your app can trigger, monitor, and manage workflow automation. Lakeflow Jobs provide orchestration for data processing workloads, allowing you to coordinate and run multiple tasks as part of larger workflows within your app.",2026-04-15T19:49:00.000Z,how-to,configuration,0.65,True,"Enabling apps to trigger and monitor Lakeflow Jobs typically involves specific resource configuration fields and parameters, which are product-specific configuration details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/mlflow,MLflow experiments,Add an MLflow experiment resource to a Databricks app - Azure Databricks,Configure MLflow experiment resources for Databricks Apps,"Learn how to add MLflow experiments as Databricks Apps resources for tracking and managing AI applications, agents, LLMs, and ML models.","Add MLflow experiments as Databricks Apps resources to enable experiment tracking for your AI applications, agents, LLMs, and ML models. MLflow experiments provide a structured way to organize and log runs, track parameters, metrics, and artifacts throughout the AI application development lifecycle. 
When you add an MLflow experiment as a resource, your app can:",2026-04-15T08:00:00.000Z,how-to,configuration,0.65,True,Adding MLflow experiments as resources for tracking runs likely includes environment variables and configuration parameters unique to Databricks Apps integration with MLflow.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/model-serving,Model serving endpoints,Add a model serving endpoint resource to a Databricks app - Azure Databricks,Add model serving endpoint resources to Databricks Apps,"Learn how to add model serving endpoints as Databricks Apps resources, including permission levels and best practices.",Add model serving endpoints as Databricks Apps resources so your app can query machine learning models for inference. Model serving endpoints handle model predictions and provide a consistent interface to access deployed models.,2026-04-15T19:49:00.000Z,how-to,configuration,0.7,True,"Integrating model serving endpoints, including permission levels and best practices, implies specific configuration fields and possibly security-related settings unique to Databricks model serving.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/monitor,Monitor apps,Logging and Monitoring for Databricks Apps - Azure Databricks,Configure logging and audit monitoring for Databricks Apps,Monitor logs and audit events for your Databricks app.,"Effective logging and monitoring help you detect and respond to security events in Databricks Apps. Apps generate both application-level logs and platform audit logs, which you can use for diagnostics, performance tracking, and security analytics.",2026-04-20T08:00:00.000Z,how-to,security,0.63,True,"The page is about logs and audit events for Databricks Apps and explicitly mentions security event detection. 
Such content typically includes product-specific log types, locations, and possibly required permissions or configuration steps for audit logging, which are security-focused expert details.",updated
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/networking,Configure networking,Configure networking for Databricks Apps - Azure Databricks,Configure networking and access controls for Databricks Apps,This article explains how to configure networking for your app.,"Databricks Apps supports fine-grained network control to help you secure and manage how your app communicates with the internet and internal resources. You can configure both ingress (incoming) and egress (outgoing) traffic rules using a combination of IP access lists, front-end private connectivity, and network policies.",2026-04-17T21:49:00.000Z,how-to,security,0.7,True,"Networking configuration for apps with IP access lists, private connectivity, and network policies is security-focused and likely includes specific settings and patterns unique to Databricks Apps.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/observability,App telemetry,Configure telemetry for Databricks Apps - Azure Databricks,Enable and configure telemetry for Databricks Apps,"Configure app telemetry to collect traces, logs, and metrics for your Databricks app in Unity Catalog tables.","Important App telemetry is in Beta. Databricks Apps telemetry collects traces, logs, and metrics and persists them to Unity Catalog tables using the OpenTelemetry (OTel) protocol. After you enable app telemetry, Databricks automatically captures system logs and usage events such as user login and direct API requests. 
You can also add custom instrumentation using the OpenTelemetry SDK for your framework.",2026-04-20T08:00:00.000Z,how-to,configuration,0.7,True,"Telemetry configuration for Databricks Apps using OpenTelemetry and Unity Catalog will include specific configuration options (tables, protocols, SDK settings, enablement flags). These are concrete configuration parameters and patterns unique to Databricks Apps, fitting the configuration sub-skill.",updated +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/permissions,Configure permissions,Configure permissions for a Databricks app - Azure Databricks,Manage Databricks app permissions and access control,This article explains how to configure and manage user permissions for your app.,"Permissions control what users can do with Databricks apps, such as accessing, managing, and sharing apps. This is different from authentication, which verifies a user's identity. Permissions determine what actions the user is authorized to perform within the app.",2026-04-15T08:00:00.000Z,how-to,security,0.75,True,"Explains configuring and managing user permissions for Databricks apps; such pages typically list specific permission levels/RBAC roles and their effects, which is product-specific security configuration.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/resources,Add resources,Add resources to a Databricks app - Azure Databricks,Configure Databricks app resources and secret management,This article covers adding resources and managing secrets for your Databricks app.,"Your Databricks apps can integrate with various Azure Databricks platform features, such as Databricks SQL for querying data, Lakeflow Jobs for data ingestion and processing, Mosaic AI Model Serving to access generative AI models, and Azure Databricks secrets for managing sensitive information. 
In the context of apps, these platform features are referred to as resources.",2026-04-15T08:00:00.000Z,how-to,configuration,0.7,True,"Covers adding various platform features as resources and managing secrets; such pages typically define resource configuration options and environment variables, which are product-specific configuration details.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/secrets,Secrets,Add a secret resource to a Databricks app - Azure Databricks,Use Databricks secrets as secure resources in apps,This article explains how to add a Databricks secret as a Databricks Apps resource.,"Add Databricks secrets as Databricks Apps resources to securely pass sensitive values, such as API keys or tokens, to your app. Databricks Apps supports secrets stored in secret scopes. Apps retrieve these secrets at runtime, which keeps them out of your application code and environment definitions.",2026-04-15T19:49:00.000Z,how-to,security,0.8,True,"Focuses on securely passing sensitive values via secret scopes; likely details secret scope usage, access patterns, and environment variables, which are product-specific security configurations.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/sql-warehouse,SQL warehouses,Add a SQL warehouse resource to a Databricks app - Azure Databricks,Configure SQL warehouse resources for Databricks Apps,This article explains how to add a SQL warehouse as a Databricks Apps resource.,Add SQL warehouses as Databricks Apps resources to enable your app to connect to compute resources and run SQL queries.,2026-04-15T19:49:00.000Z,how-to,configuration,0.7,True,"Connecting apps to SQL warehouses to run queries typically involves specifying warehouse identifiers, connection parameters, and environment variables, which are configuration details.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/system-env,App environment,Databricks 
Apps environment - Azure Databricks,Understand Databricks Apps system binaries and env vars,"Learn about the app environment for Databricks Apps, including available binaries, environment variables, and how to manage dependencies.",Your Databricks app runs in a managed environment with the following binaries and resources:,2026-04-20T08:00:00.000Z,reference,configuration,0.74,True,"A Databricks Apps 'environment' page typically enumerates the exact binaries available in the container/VM image and lists system environment variables and their meanings. Those are product-specific configuration details (names, paths, versions) that an LLM is unlikely to know from training and map to configuration parameters rather than generic concepts.",updated
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/tables,Unity Catalog tables,Add a Unity Catalog table resource to a Databricks app - Azure Databricks,Add Unity Catalog table resources with governed access,"Learn how to add Unity Catalog tables as Databricks Apps resources, including required privileges and best practices.","Add Unity Catalog tables as Databricks Apps resources so your app can query and modify data stored in Unity Catalog with governance and access control. Unity Catalog tables provide structured data storage with fine-grained permissions, so your app can securely read and write data without hardcoding credentials.",2026-04-15T19:49:00.000Z,how-to,security,0.7,True,"Emphasizes required privileges, governance, and access control for Unity Catalog tables; likely lists specific privilege requirements and patterns, fitting product-specific security configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/tags,Apply tags,Apply tags - Azure Databricks,,Learn how to apply tags to Databricks apps for organization and management.,"Important This feature is inPublic Preview. Use tags to organize and categorize Databricks apps for easier management. 
Databricks Apps also support certification and deprecation system tags to indicate trust or lifecycle status. For more information about tags, seeApply tags to Unity Catalog securable objects. For information about certification and deprecation, seeFlag data as certified or deprecated.",2026-02-02T19:09:00.000Z,concept-article,,0.4,False,Describes tagging apps for organization and lifecycle status; mostly conceptual/UX-level without detailed configuration parameters or numeric constraints.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/tutorial-node,Tutorial: Develop a Node.js app,Tutorial: Develop a Node.js Databricks app - Azure Databricks,,Learn how to develop a Node.js Databricks app with Express and Chart.js.,This tutorial shows you how to create a simple Node.js app in Databricks Apps that serves a web page with a dynamic chart usingChart.jsandExpress. The app includes:,2026-01-16T08:00:00.000Z,tutorial,,0.4,False,"Tutorial for building a simple Node.js app with Express and Chart.js; likely step-by-step example code rather than structured config tables, limits, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/tutorial-streamlit,Tutorial: Develop an app with Streamlit,Tutorial: Develop a Databricks app with Streamlit - Azure Databricks,,Learn how to develop a Databricks app with Streamlit and the Databricks SQL Connector for Python.,This tutorial shows how to build a Databricks app using theDatabricks SQL Connector for PythonandStreamlit. 
You'll learn how to develop an app that does the following:,2026-03-26T08:00:00.000Z,tutorial,,0.4,False,"Tutorial for building a Databricks app with Streamlit and SQL Connector; primarily example workflow, not structured expert reference content.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/uc-volumes,Unity Catalog volumes,Add a Unity Catalog volume resource to a Databricks app - Azure Databricks,Configure Unity Catalog volume resources for Databricks Apps,"Learn how to add Unity Catalog volumes as Databricks Apps resources, including required privileges and best practices.","AddUnity Catalog volumesas Databricks Apps resources so your app can read from and write to files and directories stored in Unity Catalog with governance and access control. Volumes provide persistent storage for unstructured data, such as configuration files, model artifacts, logs, or other file-based data that your app needs.",2026-04-15T19:49:00.000Z,how-to,security,0.7,True,Covers volumes with governance and access control plus required privileges; this is product-specific security and permission configuration for file-based storage.,updated -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/vector-search,Vector search indexes,Add a vector search index resource to a Databricks app - Azure Databricks,Configure vector search index resources in Databricks Apps,Learn how to add vector search indexes as Databricks Apps resources for semantic search and retrieval-augmented generation.,"Addvector search indexesas Databricks Apps resources to enable semantic search and similarity-based retrieval in your applications. 
Vector search indexes store and query high-dimensional vector embeddings, powering use cases like retrieval-augmented generation (RAG), semantic search, and recommendation systems.",2026-04-15T08:00:00.000Z,how-to,configuration,0.7,True,"Page is about adding vector search indexes as app resources and likely includes Databricks-specific resource configuration options (resource definitions, properties, and parameters) that go beyond generic vector search concepts.",updated
-https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/view-app-details,View app details,View the details for a Databricks app - Azure Databricks,Use Databricks app details for monitoring and troubleshooting,Learn how to access detailed information about your app.,"The app details page provides comprehensive information about your Databricks app, including deployment status, runtime logs, environment configuration, and deployment history. Use this page to monitor your app's health, troubleshoot issues, and manage its configuration.",2026-04-15T19:49:00.000Z,how-to,troubleshooting,0.65,True,"Summary emphasizes runtime logs, deployment status, and using the page to troubleshoot issues, which typically includes product-specific diagnostic locations and symptom-to-solution guidance.",updated
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/uc-volumes,Unity Catalog volumes,Add a Unity Catalog volume resource to a Databricks app - Azure Databricks,Configure Unity Catalog volume resources for Databricks Apps,"Learn how to add Unity Catalog volumes as Databricks Apps resources, including required privileges and best practices.","Add Unity Catalog volumes as Databricks Apps resources so your app can read from and write to files and directories stored in Unity Catalog with governance and access control. 
Volumes provide persistent storage for unstructured data, such as configuration files, model artifacts, logs, or other file-based data that your app needs.",2026-04-15T19:49:00.000Z,how-to,security,0.7,True,Covers volumes with governance and access control plus required privileges; this is product-specific security and permission configuration for file-based storage.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/vector-search,Vector search indexes,Add a vector search index resource to a Databricks app - Azure Databricks,Configure vector search index resources in Databricks Apps,Learn how to add vector search indexes as Databricks Apps resources for semantic search and retrieval-augmented generation.,"Add vector search indexes as Databricks Apps resources to enable semantic search and similarity-based retrieval in your applications. Vector search indexes store and query high-dimensional vector embeddings, powering use cases like retrieval-augmented generation (RAG), semantic search, and recommendation systems.",2026-04-15T08:00:00.000Z,how-to,configuration,0.7,True,"Page is about adding vector search indexes as app resources and likely includes Databricks-specific resource configuration options (resource definitions, properties, and parameters) that go beyond generic vector search concepts.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/view-app-details,View app details,View the details for a Databricks app - Azure Databricks,,Learn how to access detailed information about your app.,"The app details page provides comprehensive information about your Databricks app, including deployment status, runtime logs, environment configuration, and deployment history. 
Use this page to monitor your app's health, troubleshoot issues, and manage its configuration.",2026-04-24T08:00:00.000Z,how-to,,0.25,False,"Explains the app details page for Databricks Apps (deployment status, logs, configuration, history). While useful for operations, the summary does not indicate specific error codes, configuration parameter tables, or other structured expert knowledge.",updated https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect-legacy,Legacy versions,Databricks Connect for Databricks Runtime 12.2 LTS and below - Azure Databricks,Plan migration from legacy Databricks Connect runtimes,"Learn how to use Databricks Connect to connect your favorite IDE, notebook server, or custom applications to Azure Databricks clusters.","Important Databricks Connect for Databricks Runtime 12.2 LTS and below is deprecated. Databricks Runtime 12.2 LTS and all earlier LTS versions have reached end of support. UseDatabricks Connect for Databricks Runtime 13.3 LTS and aboveinstead. For information on migrating from Databricks Connect for Databricks Runtime 12.2 LTS and below to Databricks Connect for Databricks Runtime 13.3 LTS and above, seeMigrate to Databricks Connect for PythonorMigrate to Databricks Connect for Scala. Databricks",2026-03-26T08:00:00.000Z,archived,decision-making,0.65,True,Deprecated-runtime article that points to migration paths; contains product-specific guidance on when and how to move to newer runtimes.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/,Overview,Databricks Connect - Azure Databricks,,"Learn about Databricks Connect. Databricks Connect allows you to connect popular IDEs, notebook servers, and other custom applications to Azure Databricks compute.","Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. 
Databricks Connect is a client library for the Databricks Runtime that allows you to connect to Azure Databricks compute from IDEs such as Visual Studio Code, PyCharm, and IntelliJ IDEA, notebooks and any custom application, to enable new interactive user experiences based on your Azure Databricks Lakehouse. Databricks Connect is available for the following languages:",2026-04-14T08:00:00.000Z,overview,,0.3,False,"High-level overview of Databricks Connect capabilities and supported languages; no detailed limits, configuration tables, error codes, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/,Overview,Databricks Connect - Azure Databricks,,"Learn about Databricks Connect. Databricks Connect allows you to connect popular IDEs, notebook servers, and other custom applications to Azure Databricks compute.","Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. Databricks Connect is a client library for the Databricks Runtime that allows you to connect to Azure Databricks compute from IDEs such as Visual Studio Code, PyCharm, and IntelliJ IDEA, notebooks and any custom application, to enable new interactive user experiences based on your Azure Databricks Lakehouse. Databricks Connect is available for the following languages:",2026-04-14T08:00:00.000Z,overview,,0.3,False,"High-level overview of Databricks Connect capabilities and supported languages; no detailed limits, configuration tables, error codes, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/advanced,Advanced usage,Advanced usage of Databricks Connect - Azure Databricks,,Learn about some of the more advanced usage of Databricks Connect. Databricks Connect allows you to connect popular IDEs and other custom applications to Azure Databricks compute.,Note This article covers Databricks Connect for Databricks Runtime 14.0 and above. 
This article describes topics that go beyond the basic setup of Databricks Connect.,2026-01-19T08:00:00.000Z,concept-article,,0.45,False,"Advanced usage topics are likely conceptual or example-based; summary does not indicate explicit limits, config parameter tables, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/cluster-config,Compute configuration,Compute configuration for Databricks Connect - Azure Databricks,Configure compute connections for Databricks Connect,"Learn about configuring compute for Databricks Connect. Databricks Connect allows you to connect popular IDEs, notebook servers, and other custom applications to Azure Databricks clusters or serverles","Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. This page describes different ways of configuring a connection between Databricks Connect and your Azure Databricksclusterorserverless compute. Databricks Connect enables you to connect popular IDEs such as Visual Studio Code, PyCharm, RStudio Desktop, IntelliJ IDEA, notebook servers, and other custom applications to Azure Databricks clusters. SeeWhat is Databricks Connect?.",2026-04-03T08:00:00.000Z,how-to,configuration,0.74,True,"Describes configuration options for connecting Databricks Connect to clusters or serverless compute, likely including specific setting names, modes, and allowed values.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/notebooks,Databricks notebooks,Databricks Connect support in Databricks notebooks - Azure Databricks,Use Databricks Connect within Databricks notebooks,Learn about using Databricks Connect with Databricks notebooks.,"Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. Databricks Connect allows you to connect to Databricks compute from a local development environment outside of Databricks. 
You can then develop, debug, and test your code directly from your IDE before moving your code to a notebook or job in Databricks. SeeWhat is Databricks Connect?.",2026-01-19T08:00:00.000Z,concept-article,integrations,0.62,True,"Explains how to integrate Databricks Connect with notebooks, including specific configuration or invocation patterns that are product-specific.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/,Overview,Databricks Connect for Python - Azure Databricks,Configure and use Databricks Connect for Python,Learn how to use Databricks Connect for Python. Databricks Connect allows you to connect popular IDEs and other custom applications to Azure Databricks clusters.,"Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. Databricks Connect enables you to connect popular IDEs such as PyCharm, notebook servers, and other custom applications to Azure Databricks compute. SeeDatabricks Connect.",2026-04-14T21:45:00.000Z,overview,configuration,0.7,True,"Python-specific Databricks Connect article typically includes concrete client configuration parameters (host, token, cluster ID, runtime version), environment variables, and version compatibility details that are product-specific and not just conceptual usage.",updated +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/,Overview,Databricks Connect for Python - Azure Databricks,Configure and use Databricks Connect for Python,Learn how to use Databricks Connect for Python. Databricks Connect allows you to connect popular IDEs and other custom applications to Azure Databricks clusters.,"Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. Databricks Connect enables you to connect popular IDEs such as PyCharm, notebook servers, and other custom applications to Azure Databricks compute. 
See Databricks Connect.",2026-04-14T21:45:00.000Z,overview,configuration,0.7,True,"Python-specific Databricks Connect article typically includes concrete client configuration parameters (host, token, cluster ID, runtime version), environment variables, and version compatibility details that are product-specific and not just conceptual usage.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/databricks-utilities,Databricks Utilities,Databricks Utilities with Databricks Connect for Python - Azure Databricks,Use Databricks Utilities with Databricks Connect for Python,Learn how to use Databricks Utilities with Databricks Connect for Python. Databricks Connect allows you to connect popular applications to Azure Databricks clusters.,"Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. This article describes how to useDatabricks Utilitieswith Databricks Connect for Python. Databricks Connect enables you to connect popular IDEs, notebook servers, and custom applications to Azure Databricks clusters. SeeWhat is Databricks Connect?. Before you begin to use Databricks Connect, you mustset up the Databricks Connect client. For the Scala version of this article, seeDatabricks Utilities with Databr",2026-01-19T08:00:00.000Z,how-to,integrations,0.8,True,"Describes how to call Databricks Utilities via Databricks Connect, including specific methods, parameters, and behaviors that are product-specific.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/examples,Code examples,Code examples for Databricks Connect for Python - Azure Databricks,Use Databricks Connect for Python code patterns,View code examples that use Databricks Connect for Python. Databricks Connect allows you to connect popular applications to Azure Databricks clusters.,"Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. 
This article provides code examples that use Databricks Connect for Python. Databricks Connect enables you to connect popular IDEs, notebook servers, and custom applications to Azure Databricks clusters. SeeDatabricks Connect. For the Scala version of this article, seeCode examples for Databricks Connect for Scala. Before you begin to use Databricks Connect, you mustset up the Databricks Connect client. The fo",2026-04-14T21:45:00.000Z,how-to,integrations,0.65,True,"Provides concrete code examples for Databricks Connect for Python, showing API/SDK usage patterns, method signatures, and parameters specific to this integration between local IDEs and Azure Databricks clusters.",updated
+https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/examples,Code examples,Code examples for Databricks Connect for Python - Azure Databricks,Use Databricks Connect for Python code patterns,View code examples that use Databricks Connect for Python. Databricks Connect allows you to connect popular applications to Azure Databricks clusters.,"Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. This article provides code examples that use Databricks Connect for Python. Databricks Connect enables you to connect popular IDEs, notebook servers, and custom applications to Azure Databricks clusters. See Databricks Connect. For the Scala version of this article, see Code examples for Databricks Connect for Scala. Before you begin to use Databricks Connect, you must set up the Databricks Connect client. 
The fo",2026-04-14T21:45:00.000Z,how-to,integrations,0.65,True,"Provides concrete code examples for Databricks Connect for Python, showing API/SDK usage patterns, method signatures, and parameters specific to this integration between local IDEs and Azure Databricks clusters.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/install,Install Databricks Connect for Python,Install Databricks Connect for Python - Azure Databricks,Install and configure Databricks Connect for Python,Learn how to install Databricks Connect for Python. Databricks Connect allows you to connect popular IDEs and other custom applications to Azure Databricks clusters.,Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. This article describes how to install Databricks Connect for Python. SeeWhat is Databricks Connect?.,2026-01-19T08:00:00.000Z,install-set-up-deploy,integrations,0.7,True,"Install guide will include concrete configuration steps, environment variables, and version compatibility details for the Python client, which are integration-specific.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/limitations,Limitations,Limitations with Databricks Connect for Python - Azure Databricks,,Learn about the limitations with Databricks Connect for Python. Databricks Connect allows you to connect popular applications to Azure Databricks clusters.,"Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. This article lists limitations with Databricks Connect for Python. Databricks Connect enables you to connect popular IDEs, notebook servers, and custom applications to Azure Databricks clusters. SeeWhat is Databricks Connect?. For the Scala version of this article, seeLimitations with Databricks Connect for Scala. 
Important Depending on the version of Python, Databricks Runtime, and Databricks Connect that you",2026-01-19T08:00:00.000Z,concept-article,,0.4,False,Describes limitations conceptually; summary does not indicate concrete numeric limits or quotas required for limits-quotas classification.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/migrate,Migrate from legacy versions,Migrate to Databricks Connect for Python - Azure Databricks,Migrate from older to new Databricks Connect for Python,Learn how to migrate to Databricks Connect for Python. Databricks Connect allows you to connect popular applications to Azure Databricks clusters.,"This article describes how to migrate from Databricks Connect for Databricks Runtime 12.2 LTS and below to Databricks Connect for Databricks Runtime 13.3 LTS and above for Python. Databricks Connect enables you to connect popular IDEs, notebook servers, and custom applications to Azure Databricks clusters. SeeDatabricks Connect. Before you begin to use Databricks Connect, you mustset up the Databricks Connect client. For the Scala version of this article, seeMigrate to Databricks Connect for Sca",2026-04-14T21:45:00.000Z,upgrade-and-migration-article,decision-making,0.65,True,"Migration guide between Databricks Runtime 12.2 LTS and 13.3+ Connect versions; typically includes version-specific behavior changes, required configuration updates, and guidance on when/how to move, which supports concrete upgrade decisions.",updated +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/migrate,Migrate from legacy versions,Migrate to Databricks Connect for Python - Azure Databricks,Migrate from older to new Databricks Connect for Python,Learn how to migrate to Databricks Connect for Python. 
Databricks Connect allows you to connect popular applications to Azure Databricks clusters.,"This article describes how to migrate from Databricks Connect for Databricks Runtime 12.2 LTS and below to Databricks Connect for Databricks Runtime 13.3 LTS and above for Python. Databricks Connect enables you to connect popular IDEs, notebook servers, and custom applications to Azure Databricks clusters. See Databricks Connect. Before you begin to use Databricks Connect, you must set up the Databricks Connect client. For the Scala version of this article, see Migrate to Databricks Connect for Sca",2026-04-14T21:45:00.000Z,upgrade-and-migration-article,decision-making,0.65,True,"Migration guide between Databricks Runtime 12.2 LTS and 13.3+ Connect versions; typically includes version-specific behavior changes, required configuration updates, and guidance on when/how to move, which supports concrete upgrade decisions.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/spark-shell,PySpark Shell,PySpark shell - Azure Databricks,,Learn about the Pyspark shell that is included in Databricks Connect. Databricks Connect allows you to connect popular IDEs and other custom applications to Azure Databricks compute.,Note This article covers Databricks Connect for Databricks Runtime 14.0 and above. 
Databricks Connect for Python ships with apysparkbinary which is a PySpark REPL (a Spark shell) configured to use Databricks Connect.,2026-01-19T08:00:00.000Z,how-to,,0.3,False,"Likely a usage/overview page for the PySpark shell in Databricks Connect without detailed limits, configs, or troubleshooting matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/testing,Testing,Testing for Databricks Connect for Python - Azure Databricks,Test Databricks Connect for Python code with pytest,Learn how to test functions that use Databricks Connect for Python by using pytest.,"Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. This article describes how to run tests usingpytestwith Databricks Connect for Databricks Runtime 13.3 LTS and above. To install Databricks Connect for Python, seeInstall Databricks Connect for Python. To get started withpytest, seeGet Startedin thepytestdocumentation. Important Databricks Connect and PySpark are mutually exclusive. For more information, seeConflicting PySpark installations. Note When running ",2026-01-19T08:00:00.000Z,how-to,best-practices,0.7,True,"Explains product-specific testing patterns, including how to structure tests, handle mutual exclusivity with PySpark, and possibly configuration for pytest when using Databricks Connect.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/troubleshooting,Troubleshooting,Troubleshooting Databricks Connect for Python - Azure Databricks,Troubleshoot Databricks Connect for Python issues,Learn how to troubleshoot common issues with Databricks Connect for Python. Databricks Connect allows you to connect popular applications to Azure Databricks clusters.,"Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. This article provides troubleshooting information for Databricks Connect for Python. 
Databricks Connect enables you to connect popular IDEs, notebook servers, and custom applications to Azure Databricks clusters. See What is Databricks Connect?. For the Scala version of this article, see Troubleshooting Databricks Connect for Scala.",2026-01-19T08:00:00.000Z,troubleshooting,troubleshooting,0.85,True,Explicit troubleshooting article; Databricks Connect typically documents specific error messages and resolutions that are product-specific.,unchanged @@ -796,11 +815,11 @@ https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/ https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/r/,Overview,Databricks Connect for R - Azure Databricks,,Learn how to use Databricks Connect for R. Databricks Connect allows you to connect popular IDEs and other custom applications to Azure Databricks clusters.,"Note This article covers sparklyr integration with Databricks Connect for Databricks Runtime 13.0 and above. This integration is neither provided by Databricks nor directly supported by Databricks. For questions, go to the Posit Community. To report issues, go to the Issues section of the sparklyr repository in GitHub. For more information, see Databricks Connect v2 in the sparklyr documentation. Databricks Connect enables you to connect popular IDEs such as RStudio Desktop, notebook servers, and other cus",2026-01-24T08:00:00.000Z,tutorial,,0.35,False,How-to for R integration via sparklyr; appears to be general usage guidance without detailed config tables or troubleshooting matrices.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/requirements,Requirements,Databricks Connect usage requirements - Azure Databricks,Meet Databricks Connect runtime and environment requirements,"Learn about Databricks Connect installation requirements. 
Databricks Connect allows you to connect popular IDEs, notebook servers, and other custom applications to Azure Databricks clusters or serverl","Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. This article provides usage requirements for Databricks Connect. For information about Databricks Connect, see What is Databricks Connect?.",2026-03-03T08:00:00.000Z,checklist,limits-quotas,0.66,True,"Usage requirements page typically lists specific supported runtimes, Python/Scala versions, OS constraints, and possibly connection limits, which are concrete numeric and compatibility constraints.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/scala/,Overview,Databricks Connect for Scala - Azure Databricks,,Learn how to use Databricks Connect for Scala. Databricks Connect allows you to connect popular IDEs and other custom applications to Azure Databricks clusters.,"Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. Databricks Connect enables you to connect popular IDEs such as IntelliJ IDEA, notebook servers, and other custom applications to Azure Databricks compute. See What is Databricks Connect?.",2026-01-19T08:00:00.000Z,overview,,0.3,False,General how-to for Databricks Connect for Scala; likely step-by-step usage without deep config tables or troubleshooting matrices.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/scala/databricks-utilities,Databricks Utilities,Databricks Utilities with Databricks Connect for Scala - Azure Databricks,Use Databricks Utilities via Databricks Connect for Scala,Learn how to use Databricks Utilities with Databricks Connect for Scala. Databricks Connect allows you to connect popular applications to Azure Databricks clusters.,"Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. 
This article describes how to useDatabricks Utilitieswith Databricks Connect for Scala. Databricks Connect enables you to connect popular IDEs, notebook servers, and custom applications to Azure Databricks clusters. SeeDatabricks Connect. Before you begin to use Databricks Connect, you mustset up the Databricks Connect client. For the Python version of this article, seeDatabricks Utilities with Databricks Conn",2026-04-14T21:45:00.000Z,how-to,integrations,0.7,True,"Describes how to call Databricks Utilities (dbutils) through Databricks Connect for Scala, which involves specific APIs, method names, and usage patterns unique to this integration path.",updated -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/scala/examples,Code examples,Code examples for Databricks Connect for Scala - Azure Databricks,Use Databricks Connect for Scala code examples,View code examples that use Databricks Connect for Scala. Databricks Connect allows you to connect popular applications to Azure Databricks clusters.,"Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. This article provides code examples that use Databricks Connect for Scala. Databricks Connect enables you to connect popular IDEs, notebook servers, and custom applications to Azure Databricks clusters. SeeDatabricks Connect. For the Python version of this article, seeCode examples for Databricks Connect for Python. Before you begin to use Databricks Connect, you mustset up the Databricks Connect client. 
The f",2026-04-14T21:45:00.000Z,how-to,integrations,0.65,True,"Scala-focused article with concrete Databricks Connect code examples, demonstrating product-specific APIs, method calls, and parameters for integrating Scala applications with Azure Databricks clusters.",updated +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/scala/databricks-utilities,Databricks Utilities,Databricks Utilities with Databricks Connect for Scala - Azure Databricks,Use Databricks Utilities via Databricks Connect for Scala,Learn how to use Databricks Utilities with Databricks Connect for Scala. Databricks Connect allows you to connect popular applications to Azure Databricks clusters.,"Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. This article describes how to use Databricks Utilities with Databricks Connect for Scala. Databricks Connect enables you to connect popular IDEs, notebook servers, and custom applications to Azure Databricks clusters. See Databricks Connect. Before you begin to use Databricks Connect, you must set up the Databricks Connect client. For the Python version of this article, see Databricks Utilities with Databricks Conn",2026-04-14T21:45:00.000Z,how-to,integrations,0.7,True,"Describes how to call Databricks Utilities (dbutils) through Databricks Connect for Scala, which involves specific APIs, method names, and usage patterns unique to this integration path.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/scala/examples,Code examples,Code examples for Databricks Connect for Scala - Azure Databricks,Use Databricks Connect for Scala code examples,View code examples that use Databricks Connect for Scala. Databricks Connect allows you to connect popular applications to Azure Databricks clusters.,"Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. This article provides code examples that use Databricks Connect for Scala. 
Databricks Connect enables you to connect popular IDEs, notebook servers, and custom applications to Azure Databricks clusters. See Databricks Connect. For the Python version of this article, see Code examples for Databricks Connect for Python. Before you begin to use Databricks Connect, you must set up the Databricks Connect client. The f",2026-04-14T21:45:00.000Z,how-to,integrations,0.65,True,"Scala-focused article with concrete Databricks Connect code examples, demonstrating product-specific APIs, method calls, and parameters for integrating Scala applications with Azure Databricks clusters.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/scala/install,Install Databricks Connect for Scala,Install Databricks Connect for Scala - Azure Databricks,Install Databricks Connect client for Scala,Learn how to install Databricks Connect for Scala. Databricks Connect allows you to connect popular IDEs and other custom applications to Azure Databricks clusters.,Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. This article describes how to install the Databricks Connect for Scala client. See Databricks Connect for Scala.,2026-03-03T08:00:00.000Z,install-set-up-deploy,integrations,0.7,True,"Scala install article will specify client artifacts, versions, and configuration parameters for connecting Scala apps to Databricks, which are integration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/scala/jar-compile,Tutorial: Run code on serverless compute,Tutorial: Run Scala code on serverless compute - Azure Databricks,Build and deploy Scala JARs to Databricks serverless,Learn how to deploy Scala JARs on serverless compute using Databricks Connect for Scala,Important Serverless Scala and Java jobs are in Public Preview. You can use JAR tasks to deploy your JAR. See Manage Azure Databricks previews if it's not already enabled. 
This tutorial provides an overview of how to get started with Databricks Connect for Scala using serverless compute. It walks through building a Unity Catalog-enabled compute (either classic compute in standard access mode or serverless compute) compatible Scala JAR file. Tip To create a Scala project that is fully configured to ,2026-03-16T17:36:00.000Z,tutorial,integrations,0.68,True,"Tutorial content, but for Databricks Connect for Scala it typically includes product-specific build settings (for example, required Scala/Spark versions, Maven/SBT coordinates, Unity Catalog / serverless compatibility flags, and packaging details for JAR tasks). These are concrete integration/configuration patterns unique to Databricks rather than generic Scala usage.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/scala/limitations,Limitations,Limitations with Databricks Connect for Scala - Azure Databricks,Understand Databricks Connect for Scala limitations,Learn about the limitations with Databricks Connect for Scala. Databricks Connect allows you to connect popular applications to Azure Databricks compute resources.,"Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. This article lists limitations with Databricks Connect for Scala. Databricks Connect enables you to connect popular IDEs, notebook servers, and custom applications to Azure Databricks compute resources. SeeDatabricks Connect. For the Python version of this article, seeLimitations with Databricks Connect for Python. 
Important Depending on the version of Scala, Java, Databricks Runtime, and Databricks Connect th",2026-04-14T21:45:00.000Z,concept-article,limits-quotas,0.7,True,"Limitations page for Databricks Connect for Scala typically enumerates concrete unsupported operations, version compatibility constraints, and possibly numeric or behavioral limits that are product-specific and not general knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/scala/limitations,Limitations,Limitations with Databricks Connect for Scala - Azure Databricks,Understand Databricks Connect for Scala limitations,Learn about the limitations with Databricks Connect for Scala. Databricks Connect allows you to connect popular applications to Azure Databricks compute resources.,"Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. This article lists limitations with Databricks Connect for Scala. Databricks Connect enables you to connect popular IDEs, notebook servers, and custom applications to Azure Databricks compute resources. See Databricks Connect. For the Python version of this article, see Limitations with Databricks Connect for Python. Important Depending on the version of Scala, Java, Databricks Runtime, and Databricks Connect th",2026-04-14T21:45:00.000Z,concept-article,limits-quotas,0.7,True,"Limitations page for Databricks Connect for Scala typically enumerates concrete unsupported operations, version compatibility constraints, and possibly numeric or behavioral limits that are product-specific and not general knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/scala/migrate,Migrate from legacy versions,Migrate to Databricks Connect for Scala - Azure Databricks,Migrate from legacy to new Scala Databricks Connect,Learn how to migrate to Databricks Connect for Scala. 
Databricks Connect allows you to connect popular applications to Azure Databricks clusters.,"Note Databricks Connect for Databricks Runtime 13.3 LTS and above for Scala is in Public Preview. This article describes how to migrate from Databricks Connect for Databricks Runtime 12.2 LTS and below to Databricks Connect for Databricks Runtime 13.3 LTS and above for Scala. Databricks Connect enables you to connect popular IDEs, notebook servers, and custom applications to Azure Databricks clusters. See What is Databricks Connect?. Before you begin to use Databricks Connect, you must set up the D",2026-01-19T08:00:00.000Z,upgrade-and-migration-article,decision-making,0.7,True,Migration-focused article; migration guidance between runtime generations is product-specific decision-making and upgrade-path knowledge.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/scala/testing,Testing,Testing for Databricks Connect for Scala - Azure Databricks,,Learn how to test functions that use Databricks Connect for Scala by using ScalaTest.,"Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. This article describes how to run tests using ScalaTest with Databricks Connect for Databricks Runtime 13.3 LTS and above. To install Databricks Connect for Scala, see Install Databricks Connect for Scala. To get started with ScalaTest and run it locally, see Getting started in the ScalaTest documentation. 
For example, given the following file src/main/scala/NYCTaxiFunctions.scala containing a getSpark function that ",2026-01-19T08:00:00.000Z,how-to,,0.4,False,"Testing article using ScalaTest; mostly generic testing patterns and example code, not product-specific limits or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/scala/troubleshooting,Troubleshooting,Troubleshooting Databricks Connect for Scala - Azure Databricks,Troubleshoot Databricks Connect for Scala problems,Learn how to troubleshoot common issues with Databricks Connect for Scala. Databricks Connect allows you to connect popular applications to Azure Databricks clusters.,"Note This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above. This article provides troubleshooting information for Databricks Connect for Scala. Databricks Connect enables you to connect popular IDEs, notebook servers, and custom applications to Azure Databricks clusters. See What is Databricks Connect?. For the Python version of this article, see Troubleshooting Databricks Connect for Python.",2026-03-03T08:00:00.000Z,troubleshooting,troubleshooting,0.8,True,Explicit troubleshooting article; expected to map specific errors and version issues to resolutions for Databricks Connect for Scala.,unchanged @@ -833,7 +852,7 @@ https://learn.microsoft.com/en-us/azure/databricks/dev-tools/terraform/cluster-n https://learn.microsoft.com/en-us/azure/databricks/dev-tools/terraform/service-principals,Service principals for Terraform,Provision a service principal by using Terraform - Azure Databricks,Provision Databricks service principals using Terraform,Learn how to use Terraform to provision service principals for Azure Databricks automation scenarios.,"Note To provision a Microsoft Entra ID managed service principal by using the Azure portal and the Azure Databricks user interface instead, see Service principals. 
Microsoft Entra ID managed service principals differ from managed identities for Azure resources, which Azure Databricks also supports for authentication. To learn how to use managed identities for Azure resources instead of Microsoft Entra ID managed service principals for Azure Databricks authentication, see Use Azure managed identiti",2026-01-24T08:00:00.000Z,how-to,security,0.7,True,"Covers creating service principals for Databricks automation with Terraform, including security-focused identity configuration. Likely includes specific fields/parameters for service principal resources and scopes.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/terraform/troubleshoot,Troubleshoot issues with the Terraform provider,Troubleshoot the Databricks Terraform provider - Azure Databricks,Troubleshoot common Databricks Terraform provider errors,Troubleshoot common issues with the Databricks Terraform provider,"This article provides troubleshooting information for common errors when using the Databricks Terraform provider. For information about the Databricks Terraform provider, see Databricks Terraform provider. Note For Terraform-specific support, see the Latest Terraform topics on the HashiCorp Discuss website. 
For issues specific to the Databricks Terraform Provider, see Issues in the databrickslabs/terraform-provider-databricks GitHub repository.",2026-01-19T08:00:00.000Z,troubleshooting,troubleshooting,0.9,True,"Explicit troubleshooting article for Databricks Terraform provider; typically organized by specific error messages/codes and their resolutions, which are product-specific diagnostic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/terraform/workspace-management,Manage resources with Terraform,Manage Databricks workspaces using Terraform - Azure Databricks,Manage Azure Databricks workspace resources using Terraform,"Learn how to manage Azure Databricks workspace resources, such as Azure Databricks secrets, access tokens, notebooks, jobs, and clusters, by using Terraform.","This article shows how to manage resources in an Azure Databricks workspace using the Databricks Terraform provider. The following configuration blocks initialize the most common variables, databricks_spark_version, databricks_node_type, and databricks_current_user.",2026-02-25T08:00:00.000Z,how-to,integrations,0.7,True,"Shows concrete Terraform configuration blocks and variables (spark version, node type, current user) for Databricks workspace resources. This is product-specific integration/config detail, not just conceptual guidance.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/,Overview,Databricks extension for Visual Studio Code - Azure Databricks,,"Learn about the Databricks extension for Visual Studio Code, which enables you to connect your local development machine to a remote Databricks workspace with just a few clicks.","The Databricks extension for Visual Studio Code enables you to connect to your remote Azure Databricks workspaces fromVisual Studio CodeorCursoron your local development machine. 
You can then: Note The Databricks extension for Visual Studio Code supports running R, Scala, and SQL notebooks as automated jobs but does not provide any deeper support for these languages within Visual Studio Code.",2026-04-14T08:00:00.000Z,overview,,0.2,False,"Primarily an overview of the Databricks VS Code extension and its capabilities; no detailed configuration tables, limits, error-code-based troubleshooting, or product-specific decision matrices are evident from the summary.",updated +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/,Overview,Databricks extension for Visual Studio Code - Azure Databricks,,"Learn about the Databricks extension for Visual Studio Code, which enables you to connect your local development machine to a remote Databricks workspace with just a few clicks.","The Databricks extension for Visual Studio Code enables you to connect to your remote Azure Databricks workspaces from Visual Studio Code or Cursor on your local development machine. 
You can then: Note The Databricks extension for Visual Studio Code supports running R, Scala, and SQL notebooks as automated jobs but does not provide any deeper support for these languages within Visual Studio Code.",2026-04-14T08:00:00.000Z,overview,,0.2,False,"Primarily an overview of the Databricks VS Code extension and its capabilities; no detailed configuration tables, limits, error-code-based troubleshooting, or product-specific decision matrices are evident from the summary.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/authentication,Authenticate,Set up authorization for the Databricks extension for Visual Studio Code - Azure Databricks,Configure OAuth authorization for Databricks VS Code extension,Learn how to set up authorization and authentication between the Databricks extension for Visual Studio Code and your Azure Databricks workspace.,"This article describes how to set up authorization and authentication between the Databricks extension for Visual Studio Code and your Azure Databricks workspace if you haven't already configured the extension through project setup. See What is the Databricks extension for Visual Studio Code?. 
The Databricks extension for Visual Studio Code implements portions of the Databricks unified authentication standard, which enables you to configure Azure Databricks OAuth 2.0-based authorization once and th",2026-04-03T08:00:00.000Z,how-to,security,0.8,True,"Covers authorization and authentication; likely includes specific OAuth scopes, token configuration, and workspace auth settings unique to Databricks.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/bundles,Declarative Automation Bundles extension features,Declarative Automation Bundles extension features - Azure Databricks,Use Declarative Automation Bundles with Databricks VS Code,Learn how to debug code by using the Databricks Connect integration in the Databricks extension for Visual Studio Code.,"The Databricks extension for Visual Studio Code provides additional features within Visual Studio Code that enable you to easily define, deploy, and run Declarative Automation Bundles to apply CI/CD best practices to your Lakeflow Jobs, Lakeflow Spark Declarative Pipelines, and MLOps Stacks. See What are Declarative Automation Bundles?. To install the Databricks extension for Visual Studio Code, see Install the Databricks extension for Visual Studio Code.",2026-03-16T17:36:00.000Z,how-to,deployment,0.65,True,"Covers defining, deploying, and running Declarative Automation Bundles for Lakeflow Jobs, Spark Declarative Pipelines, and MLOps Stacks. This is CI/CD-focused, with Databricks-specific deployment patterns and bundle behavior that go beyond generic deployment tutorials.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/command-palette,Command Palette commands,Command Palette commands for the Databricks extension for Visual Studio Code - Azure Databricks,,Learn about Command Palette commands for the Databricks extension for Visual Studio Code.,This article lists Command Palette commands for the Databricks extension for Visual Studio Code. 
See What is the Databricks extension for Visual Studio Code?.,2026-03-16T08:00:00.000Z,reference,,0.5,False,"Lists Command Palette commands; mostly command names and descriptions, not configuration ranges, limits, or troubleshooting mappings.",unchanged @@ -841,7 +860,7 @@ https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/configur https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/databricks-connect,Debug code with Databricks Connect,Debug code using Databricks Connect for the Databricks extension for Visual Studio Code - Azure Databricks,,Learn how to debug code by using the Databricks Connect integration in the Databricks extension for Visual Studio Code.,"This article describes how to use the Databricks Connect integration in the Databricks extension for Visual Studio Code to run and debug individual Python (.py) files. For information about the extension, see What is the Databricks extension for Visual Studio Code?. The Databricks Connect integration also allows you to run and debug notebook cells. See Run and debug notebook cells with Databricks Connect using the Databricks extension for Visual Studio Code.",2026-01-19T08:00:00.000Z,how-to,,0.4,False,Describes debugging with Databricks Connect in VS Code; appears to be workflow guidance rather than detailed troubleshooting or config tables.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/faqs,FAQs,Frequently asked questions about the Databricks extension for Visual Studio Code - Azure Databricks,Resolve common issues with Databricks VS Code extension,Read frequently asked questions about the Databricks extension for Visual Studio Code.,This article lists frequently asked questions about the Databricks extension for Visual Studio Code. 
See What is the Databricks extension for Visual Studio Code?.,2026-03-16T17:36:00.000Z,faq,troubleshooting,0.61,True,"FAQ pages for a specific extension typically include concrete error messages, limitations, and resolution steps (for example, connection/auth issues, sync problems, language support caveats). These map symptoms to causes and fixes, fitting the troubleshooting category.",unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/install,Install the extension,Install the Databricks extension for Visual Studio Code - Azure Databricks,,"Learn how to install and open the Databricks extension for Visual Studio Code, and then configure a project for the extension to use.","This article describes how to install and open the Databricks extension for Visual Studio Code, and then configure a project for the extension to use. See What is the Databricks extension for Visual Studio Code?.",2026-01-19T08:00:00.000Z,install-set-up-deploy,,0.35,False,Installation and basic project configuration; usually procedural without detailed config parameter tables or limits.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/notebooks,Debug notebooks with Databricks Connect,Run and debug notebook cells with Databricks Connect using the Databricks extension for Visual Studio Code - Azure Databricks,,Learn how to run and debug notebooks in Visual Studio Code using the Databricks Connect integration in the Databricks extension for Visual Studio Code.,"You can run and debug notebooks, one cell at a time or all cells at once, and see their results in the Visual Studio Code UI using the Databricks extension for Visual Studio Code Databricks Connect integration. All code runs locally, while all code involving DataFrame operations runs on the cluster in the remote Azure Databricks workspace and run responses are sent back to the local caller. 
All code is debugged locally, while all Spark code continues to run on the cluster in the remote Azure Dat",2026-01-19T08:00:00.000Z,how-to,,0.4,False,"Explains running and debugging notebook cells; likely procedural without product-specific limits, config matrices, or error-code mappings.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/notebooks,Debug notebooks with Databricks Connect,Run and debug notebook cells with Databricks Connect using the Databricks extension for Visual Studio Code - Azure Databricks,,Learn how to run and debug notebooks in Visual Studio Code using the Databricks Connect integration in the Databricks extension for Visual Studio Code.,"You can run and debug notebooks, one cell at a time or all cells at once, and see their results in the Visual Studio Code UI using the Databricks extension for Visual Studio Code Databricks Connect integration. All code runs locally, while all code involving DataFrame operations runs on the cluster in the remote Azure Databricks workspace and run responses are sent back to the local caller. All code is debugged locally, while all Spark code continues to run on the cluster in the remote Azure Dat",2026-04-23T08:00:00.000Z,how-to,,0.3,False,"Primarily a how-to/tutorial for running and debugging Databricks notebooks from VS Code. It likely describes workflow steps and basic integration behavior, but not detailed configuration tables, limits, error-code mappings, or product-specific parameter references that meet the expert-knowledge criteria.",updated https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/pytest,Run tests,Run Python tests using the Databricks extension for Visual Studio Code - Azure Databricks,,Learn how to run tests using the Databricks extension for Visual Studio Code.,This page describes how to run Python tests using the Databricks extension for Visual Studio Code. 
See What is the Databricks extension for Visual Studio Code?.,2026-01-24T08:00:00.000Z,how-to,,0.35,False,Describes running Python tests; likely generic test execution steps rather than product-specific expert configuration or troubleshooting.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/run,Run a file or notebook in Databricks,Run a file on a cluster or a file or notebook as a job in Azure Databricks using the Databricks extension for Visual Studio Code - Azure Databricks,,Learn how to run a file on a cluster or a file or notebook as a job in Azure Databricks using the Databricks extension for Visual Studio Code.,"The Databricks extension for Visual Studio Code allows you to run your Python code on a cluster or your Python, R, Scala, or SQL code or notebook as a job in Azure Databricks. This information assumes that you have already installed and set up the Databricks extension for Visual Studio Code. See Install the Databricks extension for Visual Studio Code. Note To debug code or notebooks from within Visual Studio Code, use Databricks Connect. See Debug code using Databricks Connect for the Databricks e",2026-03-21T08:00:00.000Z,how-to,,0.35,False,How to run files/notebooks as jobs; typical usage article without detailed configuration parameter tables or troubleshooting mappings.,unchanged https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/settings,Extension settings,Settings for the Databricks extension for Visual Studio Code - Azure Databricks,Configure settings for the Databricks VS Code extension,Learn about extension Settings for the Databricks extension for Visual Studio Code.,This article lists extension settings for the Databricks extension for Visual Studio Code. 
See What is the Databricks extension for Visual Studio Code?,2026-01-19T08:00:00.000Z,reference,configuration,0.85,True,"Explicitly lists extension settings; likely includes setting names, allowed values, and defaults, which is product-specific configuration knowledge.",unchanged @@ -849,7 +868,7 @@ https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/troubles https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/tutorial,Tutorial,Tutorial: Run Python on a cluster and as a job using the Databricks extension for Visual Studio Code - Azure Databricks,,Learn how to use the Databricks extension for Visual Studio Code to run your local Python code on a remote Azure Databricks workspace.,"This tutorial walks you through setting up the Databricks extension for Visual Studio Code, and then running Python on an Azure Databricks cluster and as an Azure Databricks job in your remote workspace. See What is the Databricks extension for Visual Studio Code?.",2026-01-19T08:00:00.000Z,tutorial,,0.3,False,Tutorial for using VS Code extension to run Python; typical step-by-step usage without detailed configuration matrices or troubleshooting content.,unchanged https://learn.microsoft.com/en-us/azure/databricks/developers/,Overview,Develop on Databricks - Azure Databricks,,"Learn about Databricks APIs and tools for developing collaborative data science, data engineering, and data analysis solutions in Azure Databricks.","Databricks developer users encompass the data scientists, data engineers, data analysts, machine learning engineers, as well as DevOps and MLOps engineers - all building solutions and integrations to extend and customize Databricks for their specific needs. In addition to the many Databricks APIs and data engineering features available in the workspace, there are also many tools for connecting to Databricks and developing locally that support developer users of Databricks. 
This article provides ",2026-03-16T08:00:00.000Z,landing-page,,0.2,False,"High-level overview of Databricks developer tools and APIs without detailed configuration tables, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/discover/,Overview,Discover data - Azure Databricks,,"Learn how to find datasets, list files, explore data descriptions, and discover tables on Azure Databricks.","Azure Databricks provides a suite of tools and products that simplify the discovery of data assets that are accessible through the Databricks Data Intelligence Platform. This article provides an opinionated overview of how you can discover and preview data that has already been configured for access in your workspace. Topics in this section focus on exploring data objects and data files. If you're looking for information about working with assets such as notebooks, SQL queries, libraries, and mo",2025-07-10T20:02:00.000Z,landing-page,,0.3,False,High-level overview of data discovery tools; mostly conceptual and navigational without detailed configuration tables or product-specific limits.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/discover/database-objects,Explore database objects,Explore database objects - Azure Databricks,,"Learn how to explore database objects like catalogs, schemas, tables, and views with Catalog Explorer and SQL.","This page details how you can discover and explore catalogs, schemas, tables, and other database objects in Unity Catalog. The instructions in this page focus on returning details for database objects that you have at least theBROWSEorSELECTprivilege on. For general information on Unity Catalog privileges, seeUnity Catalog privileges reference. For information about how to set schema ownership and permissions, seeManage Unity Catalog object ownershipandManage privileges in Unity Catalog. 
This pa",2026-04-15T08:00:00.000Z,how-to,,0.3,False,"Task-focused page on exploring catalogs, schemas, tables, and views using Catalog Explorer and SQL. The summary suggests usage instructions rather than detailed configuration tables, limits, or troubleshooting mappings, so it does not clearly fit any expert-knowledge sub-skill type.",updated +https://learn.microsoft.com/en-us/azure/databricks/discover/database-objects,Explore database objects,Explore database objects - Azure Databricks,,"Learn how to explore database objects like catalogs, schemas, tables, and views with Catalog Explorer and SQL.","This page details how you can discover and explore catalogs, schemas, tables, and other database objects in Unity Catalog. The instructions in this page focus on returning details for database objects that you have at least theBROWSEorSELECTprivilege on. For general information on Unity Catalog privileges, seeUnity Catalog privileges reference. For information about how to set schema ownership and permissions, seeManage Unity Catalog object ownershipandManage privileges in Unity Catalog. This pa",2026-04-15T08:00:00.000Z,how-to,,0.3,False,"Task-focused page on exploring catalogs, schemas, tables, and views using Catalog Explorer and SQL. 
The summary suggests usage instructions rather than detailed configuration tables, limits, or troubleshooting mappings, so it does not clearly fit any expert-knowledge sub-skill type.",unchanged https://learn.microsoft.com/en-us/azure/databricks/discover/databricks-datasets,Sample datasets,Sample datasets - Azure Databricks,,Learn how to find and use sample datasets within your existing Azure Databricks workspaces.,There are a variety of sample datasets provided by Azure Databricks and made available by third parties that you can use in your Azure Databricksworkspace.,2026-03-31T08:00:00.000Z,how-to,,0.35,False,"High-level guidance on finding and using sample datasets; likely lacks detailed configuration parameters, limits, or troubleshooting content that would qualify as expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/discover/discover-page,Discover page,Discover page and domains - Azure Databricks,,"Use the **Discover** page to browse curated data assets organized by domains and discover tables, dashboards, and Genie spaces in Azure Databricks.",Important This feature is inBeta. TheDiscoverpage is a curated browsing experience for your organization to discover data assets and insights in Azure Databricks. Discover provides an intuitive interface that showcases your assets and insights without requiring deep technical knowledge of catalog hierarchies. Usedomainsin the discovery experience to logically organize data in a way that's best suited for your business. Warning Domains are built on tags. 
Tag data is stored as plain text and is re,2026-03-26T08:00:00.000Z,how-to,,0.3,False,"Primarily a UI/feature overview of the Discover page and domains; does not focus on detailed configuration parameters, limits, or error codes that would constitute expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/discover/files,Explore storage and find data files,Explore storage and find data files - Azure Databricks,Explore Unity Catalog volumes and storage paths,Learn how to explore data directories in Unity Catalog volumes and other cloud object storage locations to discover files.,"This page focuses on finding and exploring directories and data files managed with Unity Catalog volumes, including UI-based instructions for exploring volumes with Catalog Explorer. It includes examples for programmatic exploration of data in cloud object storage using volume paths and cloud URIs. Databricks recommends using volumes to manage access to data in cloud object storage. For more information on connecting to data in cloud object storage, seeConnect to data sources and external servic",2026-02-23T08:00:00.000Z,concept-article,configuration,0.62,True,"Includes programmatic examples and likely specific path formats and options for exploring volumes and cloud URIs, which are concrete configuration/usage details for Databricks-managed storage.",unchanged @@ -867,7 +886,7 @@ https://learn.microsoft.com/en-us/azure/databricks/error-messages/dc-sqlserver-e https://learn.microsoft.com/en-us/azure/databricks/error-messages/delta-iceberg-compat-v1-violation-error-class,DELTA_ICEBERG_COMPAT_V1_VIOLATION error condition,DELTA_ICEBERG_COMPAT_V1_VIOLATION error class - Azure Databricks,Understand DELTA_ICEBERG_COMPAT_V1_VIOLATION errors,Documentation for the DELTA\_ICEBERG\_COMPAT\_V1\_VIOLATION error condition on Azure Databricks,SQLSTATE: KD00E The validation of IcebergCompatV1 has failed.,2026-01-19T08:00:00.000Z,error-reference,troubleshooting,0.8,True,Error class 
for Delta-Iceberg compatibility validation failures with specific SQLSTATE and condition name. Product-specific error semantics used in troubleshooting.,unchanged https://learn.microsoft.com/en-us/azure/databricks/error-messages/divide-by-zero-error-class,DIVIDE_BY_ZERO error condition,DIVIDE_BY_ZERO error condition - Azure Databricks,Handle DIVIDE_BY_ZERO errors in Databricks SQL,Documentation for the DIVIDE\_BY\_ZERO error condition on Azure Databricks,"SQLSTATE: 22012 Division by zero. Usetry_divideto tolerate divisor being 0 and return NULL instead. If necessary setto “false” to bypass this error.",2024-03-01T08:00:00.000Z,error-reference,troubleshooting,0.9,True,"Includes error class, SQLSTATE, message, suggested use of try_divide, and config flag to bypass, which is a clear symptom→solution mapping.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/error-messages/error-classes,Error classes,Error conditions in Azure Databricks - Azure Databricks,Handle Azure Databricks named error conditions,Documentation for the error conditions used in Azure Databricks,"Applies to:Databricks SQLDatabricks Runtime 12.2 and above Error conditions are descriptive, human-readable strings that unique to the error they describe. You can use error conditions to programmatically handle errors in your application without the need to parse the error message. This is a list of common, named error conditions returned by Azure Databricks.",2026-04-13T08:00:00.000Z,error-reference,troubleshooting,0.9,True,"Page is a catalog of Azure Databricks-specific error conditions used in Databricks SQL and Databricks Runtime, intended for programmatic handling. 
It enumerates product-specific error identifiers and meanings, which are not generally known from training data and map directly to troubleshooting and error-handling logic.",updated +https://learn.microsoft.com/en-us/azure/databricks/error-messages/error-classes,Error classes,Error conditions in Azure Databricks - Azure Databricks,Handle Azure Databricks named error conditions,Documentation for the error conditions used in Azure Databricks,"Applies to:Databricks SQLDatabricks Runtime 12.2 and above Error conditions are descriptive, human-readable strings that unique to the error they describe. You can use error conditions to programmatically handle errors in your application without the need to parse the error message. This is a list of common, named error conditions returned by Azure Databricks.",2026-04-13T08:00:00.000Z,error-reference,troubleshooting,0.9,True,"Page is a catalog of Azure Databricks-specific error conditions used in Databricks SQL and Databricks Runtime, intended for programmatic handling. It enumerates product-specific error identifiers and meanings, which are not generally known from training data and map directly to troubleshooting and error-handling logic.",unchanged https://learn.microsoft.com/en-us/azure/databricks/error-messages/ewkb-parse-error-error-class,EWKB_PARSE_ERROR error condition,EWKB_PARSE_ERROR error condition - Azure Databricks,Fix EWKB_PARSE_ERROR geometry parsing issues,Documentation for the EWKB\_PARSE\_ERROR error condition on Azure Databricks,SQLSTATE: 22023 Error parsing EWKB:at position,2026-01-19T08:00:00.000Z,error-reference,troubleshooting,0.85,True,Defines EWKB_PARSE_ERROR with SQLSTATE and message template including parseError and position. 
Clear mapping from parsing symptom to error condition for debugging.,unchanged https://learn.microsoft.com/en-us/azure/databricks/error-messages/ewkt-parse-error-error-class,EWKT_PARSE_ERROR error condition,EWKT_PARSE_ERROR error condition - Azure Databricks,Fix EWKT_PARSE_ERROR geometry parsing issues,Documentation for the EWKT\_PARSE\_ERROR error condition on Azure Databricks,SQLSTATE: 22023 Error parsing EWKT:at position,2026-01-19T08:00:00.000Z,error-reference,troubleshooting,0.85,True,Documents EWKT_PARSE_ERROR with SQLSTATE and detailed message template. Product-specific error identifier and format used for diagnosing spatial parsing problems.,unchanged https://learn.microsoft.com/en-us/azure/databricks/error-messages/geojson-parse-error-error-class,GEOJSON_PARSE_ERROR error condition,GEOJSON_PARSE_ERROR error class - Azure Databricks,Resolve GEOJSON_PARSE_ERROR in Databricks,Documentation for the GEOJSON\_PARSE\_ERROR error condition on Azure Databricks,SQLSTATE: 22023 Error parsing GeoJSON:at position,2026-01-19T08:00:00.000Z,error-reference,troubleshooting,0.85,True,GEOJSON_PARSE_ERROR error class with SQLSTATE and message template including parseError and position. 
Used directly in troubleshooting malformed GeoJSON input.,unchanged @@ -883,7 +902,7 @@ https://learn.microsoft.com/en-us/azure/databricks/error-messages/missing-aggreg https://learn.microsoft.com/en-us/azure/databricks/error-messages/row-column-access-error-class,ROW_COLUMN_ACCESS error condition,ROW_COLUMN_ACCESS error condition - Azure Databricks,Diagnose ROW_COLUMN_ACCESS errors for filters and masks,Documentation for the ROW\_COLUMN\_ACCESS error condition on Azure Databricks,SQLSTATE: none assigned Error using row filters or column masks: This error class has the following derived error classes:,2026-01-19T08:00:00.000Z,error-reference,troubleshooting,0.8,True,"Describes a Databricks error class related to row filters and column masks and notes derived error classes, which is specific to Databricks security features and used for troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/databricks/error-messages/sqlstates,SQLSTATE codes,SQLSTATE error codes - Azure Databricks,Interpret Azure Databricks SQLSTATE error codes,"Learn about SQLSTATE errors in Azure Databricks. A SQLSTATE is a SQL standard encoding for error conditions used by JDBC, ODBC, and other client APIs.","Applies to:Databricks SQLDatabricks Runtime 12.2 and above All error classes returned by Azure Databricks are associated with a 5 characterSQLSTATE. ASQLSTATEis a SQL standard encoding for error conditions commonly used byJDBC,ODBC, and other client APIs. ASQLSTATEconsists of two portions: A two character class, and a three character subclass. -Each character must be a digit'0'to'9'or'A'to'Z'. While manySQLSTATEvalues are prescribed by the SQL standard, others are common in the industry, specific",2026-04-13T08:00:00.000Z,error-reference,troubleshooting,0.85,True,"Page documents the specific SQLSTATE codes used by Azure Databricks, mapping error classes to 5-character SQLSTATE values for JDBC/ODBC and other clients. 
These are concrete, product-specific error codes and meanings that support diagnosis and handling of failures, fitting the troubleshooting category.",updated +Each character must be a digit'0'to'9'or'A'to'Z'. While manySQLSTATEvalues are prescribed by the SQL standard, others are common in the industry, specific",2026-04-13T08:00:00.000Z,error-reference,troubleshooting,0.85,True,"Page documents the specific SQLSTATE codes used by Azure Databricks, mapping error classes to 5-character SQLSTATE values for JDBC/ODBC and other clients. These are concrete, product-specific error codes and meanings that support diagnosis and handling of failures, fitting the troubleshooting category.",unchanged https://learn.microsoft.com/en-us/azure/databricks/error-messages/table-or-view-not-found-error-class,TABLE_OR_VIEW_NOT_FOUND error condition,TABLE_OR_VIEW_NOT_FOUND error condition - Azure Databricks,Fix TABLE_OR_VIEW_NOT_FOUND errors in Databricks,Documentation for the TABLE\_OR\_VIEW\_NOT\_FOUND error condition on Azure Databricks,"SQLSTATE: 42P01 The table or viewcannot be found. Verify the spelling and correctness of the schema and catalog. If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog. 
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS.",2026-01-19T08:00:00.000Z,error-reference,troubleshooting,0.95,True,"Provides SQLSTATE, error message template, and concrete remediation steps (check schema/catalog, use IF EXISTS), which is clear symptom→cause→solution guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/error-messages/unresolved-routine-error-class,UNRESOLVED_ROUTINE error condition,UNRESOLVED_ROUTINE error condition - Azure Databricks,Resolve UNRESOLVED_ROUTINE function resolution errors,Documentation for the UNRESOLVED\_ROUTINE error condition on Azure Databricks,SQLSTATE: 42883 Cannot resolve functionon search path.,2026-01-19T08:00:00.000Z,error-reference,troubleshooting,0.9,True,"Documents a specific Databricks error class with SQLSTATE and message about resolving functions on a search path, which is product-specific error semantics for troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/databricks/error-messages/unsupported-table-operation-error-class,UNSUPPORTED_TABLE_OPERATION error condition,UNSUPPORTED_TABLE_OPERATION error condition - Azure Databricks,Understand UNSUPPORTED_TABLE_OPERATION errors in Databricks,Documentation for the UNSUPPORTED\_TABLE\_OPERATION error condition on Azure Databricks,SQLSTATE: none assigned The tabledoes not support.,2026-01-19T08:00:00.000Z,error-reference,troubleshooting,0.8,True,"Defines a Databricks error class indicating a table does not support a given operation, which is specific runtime behavior needed for diagnosing failures.",unchanged @@ -891,40 +910,41 @@ https://learn.microsoft.com/en-us/azure/databricks/error-messages/unsupported-vi https://learn.microsoft.com/en-us/azure/databricks/error-messages/wkb-parse-error-error-class,WKB_PARSE_ERROR error condition,WKB_PARSE_ERROR error condition - Azure Databricks,Troubleshoot WKB_PARSE_ERROR for geometry parsing,Documentation for the WKB\_PARSE\_ERROR error condition on Azure 
Databricks,SQLSTATE: 22023 Error parsing WKB:at position,2026-01-19T08:00:00.000Z,error-reference,troubleshooting,0.9,True,"Provides SQLSTATE and a detailed message template including parseError and position, which is specific to Databricks’ WKB parsing behavior and used for error diagnosis.",unchanged https://learn.microsoft.com/en-us/azure/databricks/error-messages/wkt-parse-error-error-class,WKT_PARSE_ERROR error condition,WKT_PARSE_ERROR error condition - Azure Databricks,Troubleshoot WKT_PARSE_ERROR for geometry parsing,Documentation for the WKT\_PARSE\_ERROR error condition on Azure Databricks,SQLSTATE: 22023 Error parsing WKT:at position,2026-01-19T08:00:00.000Z,error-reference,troubleshooting,0.9,True,"Analogous to WKB, this documents WKT parsing errors with SQLSTATE and message template, which is product-specific troubleshooting information.",unchanged https://learn.microsoft.com/en-us/azure/databricks/exploratory-data-analysis/,Exploratory data analysis (EDA),Exploratory data analysis on Azure Databricks: Tools and techniques - Azure Databricks,,Learn about tools and techniques for doing exploratory data analysis (EDA) on Databricks.,This article describes tools and techniques for exploratory data analysis (EDA) on Azure Databricks.,2025-02-12T08:00:00.000Z,landing-page,,0.3,False,"Conceptual overview of EDA tools and techniques; lacks detailed configuration tables, limits, or product-specific edge-case mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/external-access/,Overview,Access Databricks data using external systems - Azure Databricks,Choose patterns for external access to Databricks data,Learn about recommended patterns for using external systems to access data registered to Unity Catalog.,"This page provides an overview of functionality and recommendations for making data managed and governed by Azure Databricks available to external systems. 
These patterns focus on scenarios where your organization needs to integrate trusted tools or systems to Azure Databricks data. If you are looking for guidance on sharing data outside of your organization, seeShare data and AI assets securely.",2026-04-14T17:47:00.000Z,overview,architecture-patterns,0.65,True,"Described as recommended patterns and functionality for making Unity Catalog–governed Databricks data available to external systems. This is product-specific guidance on when and how to use particular access patterns for external tools, which fits architecture-patterns more than generic concepts.",updated +https://learn.microsoft.com/en-us/azure/databricks/external-access/,Overview,Access Databricks data using external systems - Azure Databricks,Choose patterns for external access to Databricks data,Learn about recommended patterns for using external systems to access data registered to Unity Catalog.,"This page provides an overview of functionality and recommendations for making data managed and governed by Azure Databricks available to external systems. These patterns focus on scenarios where your organization needs to integrate trusted tools or systems to Azure Databricks data. If you are looking for guidance on sharing data outside of your organization, seeShare data and AI assets securely.",2026-04-14T17:47:00.000Z,overview,architecture-patterns,0.65,True,"Described as recommended patterns and functionality for making Unity Catalog–governed Databricks data available to external systems. 
This is product-specific guidance on when and how to use particular access patterns for external tools, which fits architecture-patterns more than generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/external-access/admin,Enable external access,Enable external data access to Unity Catalog - Azure Databricks,Configure secure external access to Unity Catalog,Learn how to configure Unity Catalog database objects for access from external systems.,"Important This feature is inPublic Preview. Azure Databricks provides access to Unity Catalog tables using the Unity REST API and Apache Iceberg REST catalog. A metastore admin must enable external data access for each metastore you need to access externally. The user or service principal that configures the connection must have theEXTERNAL USE SCHEMAprivilege for each schema where they need to perform supported operations: reading from managed tables or creating, reading, and writing to externa",2026-03-26T08:00:00.000Z,how-to,security,0.8,True,Details how a metastore admin enables external data access and requires specific privileges like 'EXTERNAL USE SCHEMA'. This is product-specific security and permission configuration guidance.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/external-access/compatibility-mode,Compatibility Mode,Compatibility Mode - Azure Databricks,Enable Compatibility Mode for external table reads,"Use Compatibility Mode to read Unity Catalog managed tables, streaming tables, and materialized views from external Delta and Iceberg clients while maintaining optimal performance on Databricks.","Important This feature is inPublic Preview. Using Compatibility Mode, you can read Unity Catalog managed tables, materialized views, and streaming tables from external systems while maintaining optimal performance on Azure Databricks. 
This feature automatically generates read-only versions of your tables that can be accessed by any Delta Lake or Iceberg client.",2026-02-23T08:00:00.000Z,how-to,configuration,0.65,True,Describes a Databricks-only feature that auto-generates read-only table versions for external clients; includes specific behavior and constraints.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/external-access/credential-vending,Credential vending,Unity Catalog credential vending for external system access - Azure Databricks,Configure Unity Catalog credential vending for external engines,Learn how to control external engine access to data in your Unity Catalog catalogs.,"Important This feature is inPublic Preview. Tip For information about how to read Azure Databricks data using Microsoft Fabric, seeUse Microsoft Fabric to read data that is registered in Unity Catalog. This page describes how Unity Catalog credential vending functionality supports access to data in Azure Databricks from external processing engines. Credential vending supports external systems that connect to Unity Catalog using the Unity REST API and Apache Iceberg REST catalog. SeeAccess Databr",2026-04-14T17:47:00.000Z,how-to,security,0.7,True,"Focuses on controlling external engine access to Unity Catalog data via credential vending. This is product-specific access control and authentication behavior for external systems, aligning with security configuration patterns.",updated -https://learn.microsoft.com/en-us/azure/databricks/external-access/iceberg,Iceberg client access,Access Azure Databricks tables from Apache Iceberg clients - Azure Databricks,Configure Iceberg REST catalog access to Databricks tables,Learn how to configure access to Azure Databricks tables from Apache Iceberg clients.,"Important Unity Catalog Apache Iceberg REST Catalog API is inPublic Previewin Databricks Runtime 16.4 LTS and above. This endpoint is recommended for reading from and writing to tables from Iceberg clients. 
Unity Catalog also has a read-only Iceberg REST Catalog API endpoint. This is a legacy endpoint. SeeRead Databricks tables from Apache Iceberg clients (legacy). The Apache Iceberg REST catalog lets supported clients, such as Apache Spark, Apache Flink, and Trino, read from and write to Unity ",2026-03-31T23:28:00.000Z,how-to,configuration,0.75,True,"Explains how to configure Apache Iceberg clients to read/write Unity Catalog tables via the Iceberg REST catalog, including runtime/version constraints and endpoint usage. This is concrete integration/configuration detail.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/external-access/compatibility-mode,Compatibility Mode,Compatibility Mode - Azure Databricks,Use Compatibility Mode for external Delta and Iceberg clients,"Use Compatibility Mode to read Unity Catalog managed tables, streaming tables, and materialized views from external Delta and Iceberg clients while maintaining optimal performance on Databricks.","Important This feature is inPublic Preview. Using Compatibility Mode, you can read Unity Catalog managed tables, materialized views, and streaming tables from external systems while maintaining optimal performance on Azure Databricks. This feature automatically generates read-only versions of your tables that can be accessed by any Delta Lake or Iceberg client.",2026-04-21T08:00:00.000Z,how-to,integrations,0.65,True,"Describes how Compatibility Mode exposes read-only versions of Unity Catalog tables to external Delta/Iceberg clients, with Databricks-specific configuration and access patterns. 
This is an integration-focused pattern with product-specific behavior, not just conceptual content.",updated +https://learn.microsoft.com/en-us/azure/databricks/external-access/credential-vending,Credential vending,Unity Catalog credential vending for external system access - Azure Databricks,Configure Unity Catalog credential vending for external engines,Learn how to control external engine access to data in your Unity Catalog catalogs.,"Important This feature is inPublic Preview. Tip For information about how to read Azure Databricks data using Microsoft Fabric, seeUse Microsoft Fabric to read data that is registered in Unity Catalog. This page describes how Unity Catalog credential vending functionality supports access to data in Azure Databricks from external processing engines. Credential vending supports external systems that connect to Unity Catalog using the Unity REST API and Apache Iceberg REST catalog. SeeAccess Databr",2026-04-14T17:47:00.000Z,how-to,security,0.7,True,"Focuses on controlling external engine access to Unity Catalog data via credential vending. This is product-specific access control and authentication behavior for external systems, aligning with security configuration patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/external-access/iceberg,Iceberg client access,Access Azure Databricks tables from Apache Iceberg clients - Azure Databricks,Configure Iceberg REST catalog access to Unity Catalog tables,Learn how to configure access to Azure Databricks tables from Apache Iceberg clients.,"Important Unity Catalog Apache Iceberg REST Catalog API is inPublic Previewin Databricks Runtime 16.4 LTS and above. This endpoint is recommended for reading from and writing to tables from Iceberg clients. Unity Catalog also has a read-only Iceberg REST Catalog API endpoint. This is a legacy endpoint. SeeRead Databricks tables from Apache Iceberg clients (legacy). 
The Apache Iceberg REST catalog lets supported clients, such as Apache Spark, Apache Flink, and Trino, read from and write to Unity ",2026-04-20T22:01:00.000Z,how-to,integrations,0.7,True,"Page describes configuring Apache Iceberg REST catalog clients (Spark, Flink, Trino) to read/write Unity Catalog tables, including product-specific REST endpoints, auth/config parameters, and client-side settings. This is concrete integration configuration rather than a conceptual overview.",updated https://learn.microsoft.com/en-us/azure/databricks/external-access/integrations,Supported integrations,Unity Catalog integrations - Azure Databricks,Choose Unity Catalog integration method for external engines,Learn about supported integrations for Unity Catalog using the Unity REST API and Iceberg REST catalog.,"The following table shows integrated engines that can interact with Unity Catalog using theUnity REST APIor theIceberg REST catalog. For full details on support and configuration, and for any engines not listed here, refer to the documentation provided by the integrated engine.",2026-03-04T08:00:00.000Z,reference,decision-making,0.6,True,"Contains a table of integrated engines and whether they use Unity REST API or Iceberg REST catalog, helping users decide which integration path to use per engine. This is selection guidance between options based on capabilities.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/external-access/unity-rest,Delta client access,Access Databricks tables from Delta clients - Azure Databricks,Use Unity REST API from external Delta clients,"Learn how to create, read, and write to Databricks tables from external Delta clients using the Unity REST API.","This page describes how to use the Unity REST API to create, read, and write to Unity Catalog managed and external tables from external Delta clients. For a full list of supported integrations, seeUnity Catalog integrations. 
Tip For information about how to read Azure Databricks data using Microsoft Fabric, seeUse Microsoft Fabric to read data that is registered in Unity Catalog.",2026-04-14T17:47:00.000Z,how-to,integrations,0.7,True,"Explains how to create, read, and write Unity Catalog tables from external Delta clients via the Unity REST API. This implies product-specific API usage and parameters for integrating external Delta clients with Databricks, which matches integrations & coding patterns.",new +https://learn.microsoft.com/en-us/azure/databricks/external-access/unity-rest,Delta client access,Access Databricks tables from Delta clients - Azure Databricks,Use Unity REST API from external Delta clients,"Learn how to create, read, and write to Databricks tables from external Delta clients using the Unity REST API.","This page describes how to use the Unity REST API to create, read, and write to Unity Catalog managed and external tables from external Delta clients. For a full list of supported integrations, seeUnity Catalog integrations. Tip For information about how to read Azure Databricks data using Microsoft Fabric, seeUse Microsoft Fabric to read data that is registered in Unity Catalog.",2026-04-14T17:47:00.000Z,how-to,integrations,0.7,True,"Explains how to create, read, and write Unity Catalog tables from external Delta clients via the Unity REST API. 
This implies product-specific API usage and parameters for integrating external Delta clients with Databricks, which matches integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/files/,Files on Databricks overview,Work with files on Azure Databricks - Azure Databricks,Use Databricks file utilities and APIs,Learn about options for working with files on Azure Databricks.,"Azure Databricks has multiple utilities and APIs for interacting with files in the following locations: This article has examples for interacting with files in these locations for the following tools: Important Some operations in Databricks, especially those using Java or Scala libraries, run as JVM processes, for example: These operations do not support reading from or writing to Unity Catalog volumes or workspace files using standard file paths, such as/Volumes/my-catalog/my-schema/my-volume/m",2026-03-31T08:00:00.000Z,overview,configuration,0.7,True,"Covers Databricks-specific file interaction APIs and constraints (e.g., unsupported paths for JVM processes), which are configuration and behavior details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/files/cwd-dbr-14,CWD for notebooks,What is the default current working directory? - Azure Databricks,Understand Databricks default working directory behavior,Learn about changes to the default current working directory (CWD) for notebook and file execution.,"This article describes how the default current working directory (CWD) works for notebook and file execution. Note Use Databricks Runtime 14.0+ and default workspace configs for more consistency in (CWD) behavior throughout the workspace. There are two default CWD behaviors for code executed locally innotebooksandfiles: This CWD behavior affects all code, including%shand Python or R code that doesn't use Apache Spark. 
The behavior is determined by code language, Databricks Runtime version, works",2026-01-19T08:00:00.000Z,concept-article,configuration,0.8,True,"Explains CWD behavior differences by language, runtime version, and workspace config; these are nuanced, product-specific execution settings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/files/files-recommendations,Recommendations for files in volumes and workspace files,Recommendations for files in volumes and workspace files - Azure Databricks,Choose between Databricks volumes and workspace files,Find storage recommendations for files in volumes and workspace files in Databricks.,"When you upload or save data or files to Azure Databricks, you can choose to store these files using Unity Catalog volumes or workspace files. This article contains recommendations and requirements for using these locations. For more details on volumes and workspace files, seeWhat are Unity Catalog volumes?andWhat are workspace files?. Databricks recommends using Unity Catalog volumes to store data, libraries, and build artifacts. Store notebooks, SQL queries, and code files as workspace files. ",2026-03-31T23:28:00.000Z,best-practice,best-practices,0.8,True,"Provides concrete recommendations and requirements on when to use Unity Catalog volumes vs workspace files for different asset types, which are product-specific best practices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/files/python-unit-tests,Work with Python unit tests,Python unit testing in the workspace - Azure Databricks,,"Discover, run, and track Python unit tests directly in the Databricks workspace using the testing sidebar, inline glyphs, and results panel.","Azure Databricks provides a suite of tools to discover, run, and track Python unit tests directly in the workspace. Use the testing sidebar pane, inline execution glyphs, and a dedicated results pane to manage your tests without leaving the workspace. 
Python unit testing tools are available when you have a valid Python test file open.",2026-02-26T19:56:00.000Z,feature-guide,,0.45,False,"Describes UI tools for Python unit testing; summary does not indicate detailed configuration parameters, error codes, or limits.",unchanged https://learn.microsoft.com/en-us/azure/databricks/files/unzip-files,Unzip compressed files,Expand and read Zip compressed files - Azure Databricks,Unzip and read compressed files in Databricks,Learn how to unzip and read data from Zip compressed files using Azure Databricks.,"You can use theunzipBash command to expand Zip (.zip) compressed files or directories of files. The Azure Databricks%shmagic commandenables execution of arbitrary Bash code, including theunzipcommand. Apache Spark provides native codecs for interacting with compressed Parquet files. Most Parquet files written by Azure Databricks end with.snappy.parquet, indicating they use snappy compression.",2026-03-31T08:00:00.000Z,how-to,integrations,0.65,True,"Describes using `%sh` with `unzip` and notes Parquet compression behavior (e.g., `.snappy.parquet`) in Databricks, which are product-specific integration/usage patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/files/volumes,What are volumes?,What are volumes? - Azure Databricks,Use Unity Catalog volumes for file storage,Learn what Unity Catalog volumes are and how to work with files in them on Azure Databricks.,"Volumes are Unity Catalog objects that govern access to non-tabular data. They provide a logical layer over cloud object storage so you can store, organize, and manage files with centralized governance. For comprehensive documentation on volumes, seeWhat are Unity Catalog volumes?. 
Unity Catalog supports two types of volumes:",2026-03-31T23:28:00.000Z,overview,configuration,0.65,True,Defines Unity Catalog volume types and how they govern access to non-tabular data; these are Databricks-specific storage configuration concepts.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/files/workspace,Workspace files overview,What are workspace files? - Azure Databricks,,Learn what workspace files are and how to interact with them on Azure Databricks.,"Workspace files are the files stored and managed in the Databricks workspace file system. Workspace files can be almost any type of file. Common examples include the following: Note Genie spaces and experiments cannot be workspace files. For recommendations on working with files, seeRecommendations for files in volumes and workspace files. Your Azure Databricks workspace file tree can contain folders attached to a Git repository called ""Databricks Git folders"". Git folders have some additional f",2026-04-16T08:00:00.000Z,overview,,0.3,False,"Conceptual explanation of Databricks workspace files and Git folders; no detailed configuration parameter tables, limits, quotas, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/files/workspace,Workspace files overview,What are workspace files? - Azure Databricks,,Learn what workspace files are and how to interact with them on Azure Databricks.,"Workspace files are the files stored and managed in the Databricks workspace file system. Workspace files can be almost any type of file. Common examples include the following: Note Genie spaces and experiments cannot be workspace files. For recommendations on working with files, seeRecommendations for files in volumes and workspace files. Your Azure Databricks workspace file tree can contain folders attached to a Git repository called ""Databricks Git folders"". 
Git folders have some additional f",2026-04-16T08:00:00.000Z,overview,,0.3,False,"Conceptual explanation of Databricks workspace files and Git folders; no detailed configuration parameter tables, limits, quotas, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/files/workspace-basics,Workspace files basic usage,Workspace files basic usage - Azure Databricks,,"Learn how to use the UI to create, upload, and edit workspace files in Databricks Git folders.","You can use the workspace UI to create, import, and edit workspace files. Note All files present in a repository are synced as workspace files automatically when youclone a Git repository.",2026-01-19T08:00:00.000Z,how-to,,0.4,False,Basic UI how-to for creating and editing workspace files; lacks detailed parameter tables or advanced behavior nuances.,unchanged https://learn.microsoft.com/en-us/azure/databricks/files/workspace-init-scripts,Init scripts in workspace files,Store init scripts in workspace files - Azure Databricks,Store and reference init scripts in Databricks workspace files,Learn how to store and reference init scripts with workspace files in Azure Databricks.,"Databricks recommends storing init scripts in workspace files in Databricks Runtime 11.3 LTS and above if you are not using Unity Catalog. Note There is limited support for init scripts in workspace files in Databricks Runtime 9.1 LTS and 10.4 LTS, but this support does not cover all common use patterns for init scripts, such as referencing other files from init scripts. Databricks recommends using init scripts in cloud object storage for Databricks Runtime 9.1 LTS and 10.4 LTS. 
For more on work",2026-01-19T08:00:00.000Z,concept-article,configuration,0.75,True,"Provides runtime-version-specific support details and recommended storage locations for init scripts, which are Databricks-specific configuration patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/files/workspace-interact,Programmatically interact with workspace files,Programmatically interact with workspace files - Azure Databricks,Programmatically manage Databricks workspace files,"Learn how to programmatically read, create, update, and delete workspace files with Databricks.","You can interact with workspace files stored in Azure Databricks programmatically. This enables tasks such as: You can programmatically create, edit, rename, and delete workspace files in Databricks Runtime 11.3 LTS and above. This functionality is supported for notebooks in Databricks Runtime 16.2 and above, and serverless environment 2 and above. Note To disable writing to workspace files, set the cluster environment variableWSFS_ENABLE_WRITE_SUPPORT=false. For more information, seeEnvironment",2026-01-19T08:00:00.000Z,how-to,configuration,0.8,True,"Includes runtime/version requirements and an environment variable `WSFS_ENABLE_WRITE_SUPPORT=false` controlling behavior, which are concrete configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/databricks/files/workspace-modules,Import modules from workspace files,Work with Python and R modules - Azure Databricks,Import Python and R modules from Databricks workspace files,Learn how to import Python and R modules using workspace files in Databricks.,"This article describes how you can use relative paths to import custom Python and R modules stored in workspace files alongside your Databricks notebooks. Workspace files can facilitate tighter development lifecycles, allowing you tomodularize your code,convert %run commands to import statements, andrefactor Python wheel files to co-versioned modules. 
You can also use the built-in Databricksweb terminal to test your code. Note In Databricks Runtime 14.0 and above, the default current working",2026-01-19T08:00:00.000Z,how-to,configuration,0.75,True,"Describes Databricks-specific module import behavior, relative paths, and CWD behavior for modules, which are configuration/behavior details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/files/write-data,Writing data,Where does Azure Databricks write data? - Azure Databricks,Identify where Databricks writes operational data,Learn where Azure Databricks writes data files.,"This article details locations where Azure Databricks writes data during everyday operations and configurations. Because Azure Databricks has a suite of tools that span many technologies and interact with cloud resources in a shared-responsibility model, the default locations used to store data vary based on the execution environment, configurations, and libraries. The information in this article is meant to help you understand default paths for various operations and how configurations might al",2026-01-19T08:00:00.000Z,faq,configuration,0.75,True,"Details default storage paths and how configurations affect write locations across Databricks components, which are product-specific configuration behaviors.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/classification,Classification,Classification - Azure Databricks,,Learn how to classify a large volume of documents into predefined categories using Classification.,"Important This feature is inPublic Previewand isHIPAA compliant. You can use Classification to classify your documents into predefined categories with AI. Examples of classification include: Classification is built on top of the AI function,ai_classify. 
TheAgentspage provides a UI interface to quickly classify documents and unstructured text and iterate on classification fields for better results.",2026-03-31T23:28:00.000Z,how-to,,0.45,False,"Overview of Classification built on ai_classify and Agents UI; summary suggests conceptual usage, not detailed configuration or troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/custom-llm,Custom LLM (legacy),Use Custom LLM to create a gen AI agent for text (legacy) - Azure Databricks,,Learn how to create a specialized LLM for custom text generation using Custom LLM.,Important This feature is inBeta. Workspace admins can control access to this feature from thePreviewspage. SeeManage Azure Databricks previews. This article describes how to create a generative AI agent for custom text-based tasks using Custom LLM.,2026-04-15T08:00:00.000Z,concept-article,,0.35,False,"Describes creating a generative AI agent using Custom LLM (beta). Summary suggests a feature tutorial rather than detailed limits, configuration matrices, or troubleshooting content.",updated -https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/document-parsing,Document Parsing,Document Parsing - Azure Databricks,Parse documents using Databricks Document Parsing,Learn how to parse and visualize document structure using Document Parsing.,"Document Parsing uses state-of-the-art research techniques to extract and visualize structured data from a wide range of document types, including but not limited to PDFs, images, Word documents (DOC/DOCX), and PowerPoint files (PPT/PPTX). It's designed to handle complex layouts such as tables, charts, and mixed text-image content. 
Document Parsing is built on theai_parse_documentfunction and includes a UI that allows you to parse documents and immediately inspect their structure through formatt",2026-04-16T17:44:00.000Z,how-to,integrations,0.7,True,"Document Parsing is built on the ai_parse_document function and includes a UI. The full page is expected to document the ai_parse_document function’s parameters and usage, which are product-specific integration details (function name, behavior, and likely options) for connecting document content into Databricks workflows.",updated -https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/idp-pipeline-tutorial,Tutorial: Intelligent document processing pipeline,Document Intelligence powered by AI Functions - Azure Databricks,Build an IDP pipeline with Databricks AI Functions,"Build an end-to-end intelligent document processing pipeline using AI Functions and a medallion architecture to ingest, parse, classify, and extract data from PDFs.","Open notebook version of this page This tutorial demonstrates an end-to-endIntelligent Document Processing (IDP)pipeline using three Databricks AI Functions. The pipeline processes SEC-filed legal agreements, classifying each into one of five categories (affiliate, marketing, consulting, hosting, escrow) and extracting relevant terms like party names, dates, and compensation details. Prerequisites Note Thesamples.sec.contractsdata set is available in all workspaces by default. To process your ow",2026-04-16T17:44:00.000Z,concept-article,architecture-patterns,0.6,True,"Describes an end-to-end intelligent document processing pipeline using a medallion architecture and multiple AI Functions. 
This is a product-specific architecture pattern (IDP on Databricks with medallion layers) and provides concrete guidance on how to structure ingestion, parsing, classification, and extraction, which aligns with architecture-patterns.",new +https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/custom-llm,Custom LLM (legacy),Use Custom LLM to create a gen AI agent for text (legacy) - Azure Databricks,,Learn how to create a specialized LLM for custom text generation using Custom LLM.,Important This feature is inBeta. Workspace admins can control access to this feature from thePreviewspage. SeeManage Azure Databricks previews. This article describes how to create a generative AI agent for custom text-based tasks using Custom LLM.,2026-04-15T08:00:00.000Z,concept-article,,0.35,False,"Describes creating a generative AI agent using Custom LLM (beta). Summary suggests a feature tutorial rather than detailed limits, configuration matrices, or troubleshooting content.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/document-parsing,Document Parsing,Document Parsing - Azure Databricks,Parse documents using Databricks Document Parsing,Learn how to parse and visualize document structure using Document Parsing.,"Document Parsing uses state-of-the-art research techniques to extract and visualize structured data from a wide range of document types, including but not limited to PDFs, images, Word documents (DOC/DOCX), and PowerPoint files (PPT/PPTX). It's designed to handle complex layouts such as tables, charts, and mixed text-image content. Document Parsing is built on theai_parse_documentfunction and includes a UI that allows you to parse documents and immediately inspect their structure through formatt",2026-04-16T17:44:00.000Z,how-to,integrations,0.7,True,"Document Parsing is built on the ai_parse_document function and includes a UI. 
The full page is expected to document the ai_parse_document function’s parameters and usage, which are product-specific integration details (function name, behavior, and likely options) for connecting document content into Databricks workflows.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/idp-pipeline-tutorial,Tutorial: Intelligent document processing pipeline,Document Intelligence powered by AI Functions - Azure Databricks,Build an IDP pipeline with Databricks AI Functions,"Build an end-to-end intelligent document processing pipeline using AI Functions and a medallion architecture to ingest, parse, classify, and extract data from PDFs.","Open notebook version of this page This tutorial demonstrates an end-to-endIntelligent Document Processing (IDP)pipeline using three Databricks AI Functions. The pipeline processes SEC-filed legal agreements, classifying each into one of five categories (affiliate, marketing, consulting, hosting, escrow) and extracting relevant terms like party names, dates, and compensation details. Prerequisites Note Thesamples.sec.contractsdata set is available in all workspaces by default. To process your ow",2026-04-16T17:44:00.000Z,concept-article,architecture-patterns,0.6,True,"Describes an end-to-end intelligent document processing pipeline using a medallion architecture and multiple AI Functions. This is a product-specific architecture pattern (IDP on Databricks with medallion layers) and provides concrete guidance on how to structure ingestion, parsing, classification, and extraction, which aligns with architecture-patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/info-extraction,Information Extraction,Information Extraction - Azure Databricks,,Learn how to extract information from a large volume of unlabeled or labeled text documents using Information Extraction.,"Important This feature is inPublic Previewand isHIPAA compliant. 
This page covers the new version of Information Extraction. For information about the previous version, seeUse Information Extraction (legacy) Information Extraction transforms unstructured documents and text into key, structured insights using a defined schema. This allows information embedded in unstructured text, PDFs, images, or tables to be directly used for analysis, reporting, or downstream agents and applications. Examples ",2026-03-31T23:28:00.000Z,how-to,,0.45,False,High-level description of Information Extraction and use cases; likely conceptual with workflow guidance rather than detailed configuration or limits.,unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/intelligent-document-processing,Intelligent document processing,Intelligent document processing - Azure Databricks,,"Learn how to use Databricks AI Functions to turn unstructured documents into structured insights with composable, governed pipelines.","Intelligent Document Processing (IDP) converts unstructured content—such as PDFs, DOCX files, images, and presentations—into structured, enriched data that powers downstream agents, applications, and analytics. With Azure Databricks, you can build end-to-end IDP pipelines directly on the Lakehouse using natively composable AI Functions, includingai_parse_document,ai_extract, andai_classify. These research-developed functions are purpose-built for high-performance document processing. 
Because all",2026-03-31T23:28:00.000Z,concept-article,,0.4,False,Describes intelligent document processing conceptually and mentions AI Functions; appears as solution overview without detailed configs or limits.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/key-info-extraction,Information extraction (legacy),Use Information Extraction (legacy) - Azure Databricks,Create information extraction agents with Databricks,Learn how to extract information from a large volume of unlabeled or labeled text documents using Information Extraction.,Important This feature is inBeta. Workspace admins can control access to this feature from thePreviewspage. SeeManage Azure Databricks previews. Note This page covers the old version of Information Extraction. Databricks recommends using the latest version. SeeInformation Extraction. This page describes how to create a generative AI agent for information extraction using Information Extraction.,2026-04-15T08:00:00.000Z,concept-article,integrations,0.6,True,"Describes creating a generative AI agent for information extraction using the legacy Information Extraction feature. 
This typically involves product-specific APIs/functions and configuration for that agent, which falls under integrations & coding patterns rather than generic concepts.",updated -https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/knowledge-assistant,Create a Knowledge Assistant agent,Use Knowledge Assistant to create a high-quality chatbot over your documents - Azure Databricks,,Learn how to create a custom AI chatbot on your documents using Knowledge Assistant.,This page describes how to use Knowledge Assistant to create a question-and-answer chatbot over your documents and improve its quality based on natural language feedback from your subject matter experts.,2026-04-17T08:00:00.000Z,concept-article,,0.3,False,"Appears to be a how-to/tutorial for creating a chatbot with Knowledge Assistant; no indication of numeric limits, config tables, error-code mappings, or product-specific decision matrices.",updated -https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/multi-agent-supervisor,Create a Supervisor Agent,Use Supervisor Agent to create a coordinated multi-agent system - Azure Databricks,Orchestrate Databricks multi-agent systems with Supervisor,Learn how to create a supervisor agent to orchestrate a multi-agent system using Supervisor Agent.,This page describes how to use Supervisor Agent to create a multi-agent supervisor system that orchestrates AI agents and tools to work together on complex tasks. You can improve their coordination based on natural language feedback from your subject matter experts.,2026-04-17T08:00:00.000Z,concept-article,integrations,0.62,True,"Describes concrete, product-specific patterns for wiring multiple Databricks agents and tools together under a Supervisor Agent, including how agents interact and are coordinated. 
This is implementation-focused integration logic rather than just conceptual multi-agent theory.",updated +https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/key-info-extraction,Information extraction (legacy),Use Information Extraction (legacy) - Azure Databricks,Create information extraction agents with Databricks,Learn how to extract information from a large volume of unlabeled or labeled text documents using Information Extraction.,Important This feature is inBeta. Workspace admins can control access to this feature from thePreviewspage. SeeManage Azure Databricks previews. Note This page covers the old version of Information Extraction. Databricks recommends using the latest version. SeeInformation Extraction. This page describes how to create a generative AI agent for information extraction using Information Extraction.,2026-04-15T08:00:00.000Z,concept-article,integrations,0.6,True,"Describes creating a generative AI agent for information extraction using the legacy Information Extraction feature. 
This typically involves product-specific APIs/functions and configuration for that agent, which falls under integrations & coding patterns rather than generic concepts.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/knowledge-assistant,Create a Knowledge Assistant agent,Use Knowledge Assistant to create a high-quality chatbot over your documents - Azure Databricks,,Learn how to create a custom AI chatbot on your documents using Knowledge Assistant.,This page describes how to use Knowledge Assistant to create a question-and-answer chatbot over your documents and improve its quality based on natural language feedback from your subject matter experts.,2026-04-17T08:00:00.000Z,concept-article,,0.3,False,"Appears to be a how-to/tutorial for creating a chatbot with Knowledge Assistant; no indication of numeric limits, config tables, error-code mappings, or product-specific decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/multi-agent-supervisor,Create a Supervisor Agent,Use Supervisor Agent to create a coordinated multi-agent system - Azure Databricks,,Learn how to create a supervisor agent to orchestrate a multi-agent system using Supervisor Agent.,This page describes how to use Supervisor Agent to create a multi-agent supervisor system that orchestrates AI agents and tools to work together on complex tasks. You can improve their coordination based on natural language feedback from your subject matter experts.,2026-04-24T23:15:00.000Z,concept-article,,0.3,False,"The page describes how to use a Supervisor Agent to orchestrate a multi-agent system in Azure Databricks, but based on the summary it appears to be a conceptual/how-to guide without explicit limits, configuration parameter tables, error-code-based troubleshooting, or quantified decision matrices. 
It focuses on coordination patterns and natural language feedback rather than product-specific numeric thresholds, RBAC roles, or detailed configuration references.",updated https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/multi-agent-supervisor-long-running-tasks,Handle long-running tasks,Supervisor Agent long-running tasks - Azure Databricks,Handle long-running tasks with Databricks Supervisor Agent,Learn how to handle long-running tasks using task continuation with the Supervisor Agent multi-agent supervisor endpoint.,"This page explains how to handle long-running tasks using task continuation withSupervisor Agent. When a supervisor agent works on complex tasks that require many tool calls, the task might need more time than a single API request allows. To handle this, the supervisor supports long-running task mode, which automatically breaks long tasks into multiple request/response cycles so the agent can continue working without timing out.",2026-04-01T08:00:00.000Z,concept-article,limits-quotas,0.7,True,"Explicitly addresses long-running tasks and request timeouts; such pages typically document timeout limits and continuation behavior (e.g., max duration per request) that qualify as expert limits/quotas knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/supervisor-api,Supervisor API,Supervisor API (Beta) - Azure Databricks,Use Databricks Supervisor API to run agent loops,"Learn how to use the Supervisor API to build custom agents with hosted tools, background mode, and model portability.","Important This feature is inBeta. Account admins can control access to this feature from thePreviewspage. SeeManage Azure Databricks previews. The Supervisor API simplifiesbuilding custom agentson Azure Databricks with support forbackground modefor long-running tasks. 
You define the model, tools, and instructions in one request to anOpenResponses-compatible endpoint (POST /mlflow/v1/responses), and Azure Databricks runs the agent loop for you: repeatedly calling the model, selecting and executin",2026-04-21T08:00:00.000Z,concept-article,integrations,0.68,True,"Describes a specific OpenResponses-compatible endpoint (POST /mlflow/v1/responses) and how Supervisor API orchestrates model and tool calls. This is concrete, product-specific API behavior and request pattern that goes beyond generic LLM knowledge, fitting integrations & coding patterns best.",new https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-evaluation/,MLflow 2: Evaluate gen AI apps,Agent Evaluation (MLflow 2) - Azure Databricks,,Introduction to Mosaic AI Agent Evaluation. Learn why evaluation is important for AI applications.,"Important Databricks recommends using MLflow 3 for evaluating and monitoring GenAI apps. This page describes MLflow 2 Agent Evaluation. This article gives an overview of how to work with Agent Evaluation, based on MLflow 2.",2026-02-19T08:00:00.000Z,overview,,0.3,False,High-level overview of Agent Evaluation (MLflow 2); clearly an introduction without detailed configuration or limits.,unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-evaluation/advanced-agent-eval,MLflow 2: Customize AI judges,Customize AI judges (MLflow 2) - Azure Databricks,Customize MLflow 2 AI judges for your agents,"Learn how to customize AI judges for your generative AI applications, including creating custom judges.",Important Databricks recommends using MLflow 3 for evaluating and monitoring GenAI apps. This page describes MLflow 2 Agent Evaluation. This article describes several techniques you can use to customize the LLM judges used to evaluate the quality and latency ofAI agents. 
It covers the following techniques: See theexample notebookillustrating the use of these techniques.,2026-02-19T08:00:00.000Z,how-to,best-practices,0.7,True,"Describes concrete techniques to customize judges, likely including specific parameters and patterns; these are product-specific best-practice customization patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-evaluation/custom-metrics,MLflow 2: Custom metrics,Custom metrics (MLflow 2) - Azure Databricks,Define custom metrics in MLflow 2 Agent Evaluation,Learn how to create custom metrics in Mosaic AI Agent Evaluation programmatically.,"Important Databricks recommends using MLflow 3 for evaluating and monitoring GenAI apps. This page describes MLflow 2 Agent Evaluation. This guide explains how to use custom metrics for evaluating AI applications within Agent Framework. Custom metrics provide flexibility to define evaluation metrics tailored to your specific business use case, whether based on simple heuristics, advanced logic, or programmatic evaluations.",2026-02-19T08:00:00.000Z,how-to,configuration,0.65,True,"Explains how to create custom metrics programmatically; likely includes function signatures and required fields, which are product-specific configuration/code patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-evaluation/evaluate-agent,MLflow 2: Run an evaluation,Run an evaluation and view the results (MLflow 2) - Azure Databricks,,Learn how to use the Mosaic AI Agent Evaluation to evaluate the quality of your generative AI applications.,"Important Databricks recommends using MLflow 3 for evaluating and monitoring GenAI apps. This page describes MLflow 2 Agent Evaluation. This article describes how to run an evaluation and view the results as you develop your AI application. For information about how to monitor deployed agents, seeMonitor GenAI in production. To evaluate an agent you must specify an evaluation set. 
At minimum, an evaluation set is a set of requests to your application that can come either from a curated set of ev",2026-03-18T08:00:00.000Z,how-to,,0.45,False,"Explains how to run an evaluation and view results; summary does not show detailed input schema, limits, or error codes.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-evaluation/evaluation-quickstart,MLflow 2: Quickstart notebook,Agent Evaluation tutorial notebook (MLflow 2) - Azure Databricks,,"Tutorial notebook illustrating how to evaluate a gen AI app using proprietary LLM judges, custom metrics, and labels from domain experts.","Important Databricks recommends using MLflow 3 for evaluating and monitoring GenAI apps. This page describes MLflow 2 Agent Evaluation. The following notebook demonstrates how to evaluate a gen AI app using Agent Evaluation's proprietary LLM judges, custom metrics, and labels from domain experts. It demonstrates the following: To get your agent ready for pre-production, seeGet started with AI agents. For general information, seeAgent Evaluation (MLflow 2). Get notebook",2026-04-17T08:00:00.000Z,tutorial,,0.2,False,"Tutorial/quickstart description for an evaluation notebook; no indication of specific limits, configuration tables, error codes, or product-specific settings. Appears to be procedural guidance rather than detailed reference content with expert-only knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-evaluation/evaluation-quickstart,MLflow 2: Quickstart notebook,"Agent Evaluation (MLflow 2): Custom metrics, guidelines and domain expert labels - Azure Databricks",,"Tutorial notebook illustrating how to evaluate a gen AI app using proprietary LLM judges, custom metrics, and labels from domain experts.","Open notebook version of this page NoteThis notebook describes MLflow 2 Agent Evaluation. Databricks recommends using MLflow 3 for evaluating and monitoring GenAI apps. 
For information about MLflow 3, see evaluation and monitoring on MLflow 3 and migrating to MLflow 3. This notebook demonstrates how to evaluate a GenAI app using Agent Evaluation's proprietary LLM judges, custom metrics, and labels from domain experts. It demonstrates: To get your agent ready for pre-production, see the agent quickst",2026-04-22T17:34:00.000Z,tutorial,,0.2,False,"Notebook-based quickstart for evaluating a GenAI app with MLflow 2 Agent Evaluation; primarily a tutorial and example workflow, not a reference for limits, configuration matrices, troubleshooting codes, or other expert-only details.",updated https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-evaluation/evaluation-schema,MLflow 2: Input schema,Agent Evaluation input schema (MLflow 2) - Azure Databricks,Use MLflow 2 Agent Evaluation input schema,Learn about the schema of the inputs and outputs to Agent Evaluation.,"Important Databricks recommends using MLflow 3 for evaluating and monitoring GenAI apps. This page describes MLflow 2 Agent Evaluation. This article explains the input schema required by Agent Evaluation to assess your application's quality, cost, and latency. The input schema is identical for both online and offline evaluations. For general information about evaluation sets, see Evaluation sets (MLflow 2).",2026-02-19T08:00:00.000Z,reference,configuration,0.7,True,Explains the input schema required by Agent Evaluation; schema definitions and required fields are product-specific configuration details.,unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-evaluation/evaluation-set,MLflow 2: Evaluation sets,Evaluation sets (MLflow 2) - Azure Databricks,Apply best practices for MLflow 2 evaluation sets,"Learn about evaluation sets, the data required to calculate evaluation metrics, and best practices.","Important Databricks recommends using MLflow 3 for evaluating and monitoring GenAI apps. This page describes MLflow 2 Agent Evaluation. 
To measure the quality of an AI agent, you need to be able to define a representative set of requests along with criteria that characterize high-quality responses. You do that by providing an evaluation set. This article covers the various options for your evaluation set and some best practices for creating an evaluation set. Databricks recommends creating a huma",2026-02-19T08:00:00.000Z,concept-article,best-practices,0.7,True,Explicitly mentions best practices for creating evaluation sets in MLflow 2 Agent Evaluation; these are product-specific recommendations for dataset design and usage.,unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-evaluation/llm-judge-metrics,"MLflow 2: How quality, cost, and latency are evaluated","How quality, cost, and latency are assessed by Agent Evaluation (MLflow 2) - Azure Databricks","Interpret MLflow 2 Agent Evaluation quality, cost, latency","Learn how Agent Evaluation metrics and large language model judges help you evaluate the quality, cost, and latency of your agentic application.","Important Databricks recommends using MLflow 3 for evaluating and monitoring GenAI apps. This page describes MLflow 2 Agent Evaluation. This article explains how Agent Evaluation assesses your AI application's quality, cost, and latency and provides insights to guide your quality improvements and cost and latency optimizations. 
It covers the following: For reference information about each of the built-in LLM judges, see Built-in AI judges (MLflow 2).",2026-02-19T08:00:00.000Z,concept-article,decision-making,0.65,True,"Describes how metrics and judges assess quality, cost, and latency and provide insights to guide optimizations; this is product-specific decision guidance for trade-offs.",unchanged @@ -932,12 +952,12 @@ https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-evaluatio https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-evaluation/review-app,MLflow 2: Collect human feedback,Use the review app for human reviews of a gen AI app (MLflow 2) - Azure Databricks,,Learn how to collect human feedback from subject matter experts,Important Databricks recommends using MLflow 3 for evaluating and monitoring GenAI apps. This page describes MLflow 2 Agent Evaluation. This article describes how to use the review app to collect feedback from subject matter experts (SMEs). You can use the review app to do the following:,2026-02-19T08:00:00.000Z,how-to,,0.4,False,"Describes using a review app to collect human feedback; likely workflow/UI guidance without product-specific limits, config tables, or error-code mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-evaluation/synthesize-evaluation-set,Synthesize evaluation sets,Synthesize evaluation sets - Azure Databricks,,Learn about how to synthesize evaluation sets from documents.,"This page describes how to synthetically generate a high-quality evaluation set for measuring the quality of your agent. Manually building an evaluation set is often time-consuming, and it is difficult to ensure that it covers all of the functionality of your agent. 
Agent Evaluation removes this barrier by automatically generating a representative evaluation set from your documents, allowing you to quickly evaluate your agent with good coverage of test cases.",2026-03-10T08:00:00.000Z,how-to,,0.45,False,Explains synthetic evaluation set generation conceptually; no explicit product-specific configuration values or constraints in the summary.,unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-evaluation/troubleshooting,MLflow 2: Troubleshoot evaluation,Troubleshoot evaluation (MLflow 2) - Azure Databricks,Troubleshoot Mosaic AI Agent Evaluation issues,Troubleshoot issues you encounter when using Mosaic AI Agent Evaluation on generative AI applications.,"This article describes issues you might encounter when evaluating generative AI applications using Mosaic AI Agent Evaluation, and how to fix them.",2026-02-19T08:00:00.000Z,troubleshooting,troubleshooting,0.9,True,"Explicit troubleshooting article for Agent Evaluation; likely organized by specific errors and resolutions, including product-specific causes and fixes.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/agent-authentication,Authentication for agents,Authentication for AI agents - Azure Databricks,Configure authentication for Databricks App-based AI agents,Learn about authentication methods for AI agents deployed on Databricks Apps.,"AI agents often need to authenticate to other resources to complete tasks. For example, a deployed agent might need to access a Vector Search index to query unstructured data, a serving endpoint to call a foundation model, or Unity Catalog functions to execute custom logic. This page covers authentication methods for agents deployed on Databricks Apps. For agents deployed on Model Serving endpoints, see Authentication for AI agents (Model Serving). 
Databricks Apps provides two authentication meth",2026-02-10T23:12:00.000Z,concept-article,security,0.75,True,"Covers authentication methods for agents accessing Vector Search, endpoints, and Unity Catalog; likely includes specific auth flows, scopes, and configuration parameters.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/agent-authentication,Authentication for agents,Authentication for AI agents - Azure Databricks,Configure authentication for Databricks Apps AI agents,"Configure app authorization and user authorization for AI agents deployed on Databricks Apps, including service principals and OBO scopes.","AI agents often need to authenticate to other resources to complete tasks. For example, a deployed agent might need to access a Vector Search index to query unstructured data, a serving endpoint to call a foundation model, or Unity Catalog functions to execute custom logic. This page covers authentication methods for agents deployed on Databricks Apps. For agents deployed on Model Serving endpoints, see Authentication for AI agents (Model Serving). Databricks Apps provides two authentication meth",2026-04-21T18:07:00.000Z,concept-article,security,0.74,True,"Covers concrete authentication methods for agents (service principals, OBO scopes) and how Databricks Apps handles app and user authorization to other resources. 
These are product-specific security and identity configuration patterns, matching the security sub-skill.",updated https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/agent-authentication-model-serving,Authentication for agents,Authentication for AI agents (Model Serving) - Azure Databricks,Configure authentication for Databricks AI agents on Model Serving,Learn about authentication methods for AI agents accessing Databricks and external resources.,"Important For new use cases, Databricks recommends deploying agents on Databricks Apps for full control over agent code, server configuration, and deployment workflow. See Author an AI agent and deploy it on Databricks Apps. To migrate an existing agent, see Migrate an agent from Model Serving to Databricks Apps. AI agents often need to authenticate to other resources to complete tasks. For example, a deployed agent might need to access a Vector Search index to query unstructured data or the Promp",2026-04-04T08:00:00.000Z,concept-article,security,0.7,True,"Authentication-focused page for agents accessing Databricks and external resources; likely includes concrete auth flows, token types, scopes, and Databricks-specific configuration details for Model Serving and external resources. This is product-specific security configuration rather than generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/agent-legacy-schema,Legacy input and output schema,Legacy input and output agent schema (Model Serving) - Azure Databricks,Migrate from legacy Databricks agent input/output schemas,"The AI agent schemas ChatAgent, ChatModel, SplitChatMessageRequest, and StringResponse have been deprecated. If you use any of these legacy schemas, Databricks recommends migrating to the ResponsesAge","Important For new use cases, Databricks recommends deploying agents on Databricks Apps for full control over agent code, server configuration, and deployment workflow. 
See Author an AI agent and deploy it on Databricks Apps. To migrate an existing agent, see Migrate an agent from Model Serving to Databricks Apps. Note Databricks recommends migrating to the ResponsesAgent schema to author agents. See Author an AI agent and deploy it on Databricks Apps. AI agents must adhere to specific input and outpu",2026-02-12T19:39:00.000Z,concept-article,configuration,0.8,True,"Details deprecated schemas (ChatAgent, ChatModel, etc.) and recommends ResponsesAgent; this is schema-level configuration knowledge with specific names and structures.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/agent-tool,Overview,AI agent tools - Azure Databricks,Implement Databricks AI agent tools with MCP and Unity Catalog,"Use Mosaic AI Agent Framework to create AI agent tools that expand an LLM beyond language generation using MCP servers, with legacy support for Unity Catalog functions.","AI agent tools give your agents practical capabilities beyond text generation, like searching documents, querying databases, calling REST APIs, or running custom code. Use pre-configured managed MCP servers for immediate access to Azure Databricks data, use external MCP servers to connect to third-party APIs, or build custom tools for specialized business logic. Explore these common tool patterns and implementation examples:",2026-04-07T17:44:00.000Z,concept-article,integrations,0.65,True,"Describes AI agent tools, including managed MCP servers and legacy Unity Catalog function tools. 
Likely contains Databricks-specific tool definitions, parameters, and patterns for connecting agents to data and services, which fits product-specific integration and coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/anthropic-uc-integration,Anthropic,Integrate Anthropic with Databricks Unity Catalog tools - Azure Databricks,Integrate Anthropic SDK with Databricks Unity Catalog tools,Use Databricks Unity Catalog to integrate SQL and Python functions as tools in Anthropic SDK LLM calls. This integration combines the governance of Unity Catalog with Anthropic to create powerful gen ,Use Databricks Unity Catalog to integrate SQL and Python functions as tools in Anthropic SDK LLM calls. This integration combines the governance of Unity Catalog with Anthropic models to create powerful gen AI apps.,2025-09-03T08:00:00.000Z,concept-article,integrations,0.8,True,Shows how to call Anthropic models with UC tools; expected to include Anthropic SDK parameters and Databricks tool wiring details.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/author-agent,Author agents on Databricks Apps,Author an AI agent and deploy it on Databricks Apps - Azure Databricks,Deploy custom AI agents on Databricks Apps with Agent Framework,"Author an AI agent and deploy it on Databricks Apps using Agent Framework and popular agent authoring libraries like LangGraph, PyFunc, and OpenAI.","Build an AI agent and deploy it using Databricks Apps. Databricks Apps gives you full control over the agent code, server configuration, and deployment workflow. This approach is ideal when you need custom server behavior, git-based versioning, or local IDE development. Every conversational agent template includes a built-in chat UI (shown above) with no additional setup required. 
The chat UI supports streaming responses, markdown rendering, Databricks authentication, and optional persistent chat ",2026-04-07T17:44:00.000Z,concept-article,deployment,0.65,True,"Covers authoring an AI agent and deploying it on Databricks Apps, including control over server configuration and deployment workflow. This implies Databricks-specific deployment patterns and requirements (e.g., how agents are hosted, app templates, deployment behavior), fitting the deployment sub-skill.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/author-agent,Author agents on Databricks Apps,Author an AI agent and deploy it on Databricks Apps - Azure Databricks,,"Author an AI agent and deploy it on Databricks Apps using Agent Framework and popular agent authoring libraries like LangGraph, PyFunc, and OpenAI.","Build an AI agent and deploy it using Databricks Apps. Databricks Apps gives you full control over the agent code, server configuration, and deployment workflow. This approach is ideal when you need custom server behavior, git-based versioning, or local IDE development. Tip If your agent uses only Azure Databricks-hosted tools and does not need custom logic between tool calls, you can use the Supervisor API (Beta) to let Azure Databricks manage the agent loop for you. 
Every conversational agent tem",2026-04-21T18:07:00.000Z,concept-article,,0.2,False,"High-level guidance on authoring and deploying agents on Databricks Apps; summary indicates conceptual workflow and when to use Supervisor API, but no concrete API parameters, limits, or detailed configuration tables are evident.",updated https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/author-agent-model-serving,Author agents using Model Serving,Author an AI agent and deploy it on Model Serving - Azure Databricks,Author and deploy Databricks agents on Model Serving,"Author an AI agent in Python, using Agent Framework and popular agent authoring libraries like LangGraph, PyFunc, and OpenAI. Deploy the Agent to Model Serving.","Important For new use cases, Databricks recommends deploying agents on Databricks Apps for full control over agent code, server configuration, and deployment workflow. See Author an AI agent and deploy it on Databricks Apps. To migrate an existing agent, see Migrate an agent from Model Serving to Databricks Apps. 
This page shows how to author an AI agent in Python using Agent Framework and popular agent authoring libraries like LangGraph and OpenAI.",2026-03-18T08:00:00.000Z,concept-article,deployment,0.65,True,"Describes authoring agents in Python and deploying to Databricks Model Serving; such pages typically include endpoint configuration, deployment parameters, and product-specific patterns not known generically.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/build-genai-apps,Overview,Build gen AI apps on Azure Databricks - Azure Databricks,,Overview of building generative AI apps on Databricks.,"This page provides an overview of tools for building, deploying, and managing generative AI (gen AI) apps on Azure Databricks.",2026-03-31T23:28:00.000Z,concept-article,,0.2,False,"Overview of tools for building and managing GenAI apps; high-level and tool-list oriented without detailed decision matrices, configs, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/chat-app,Standalone chat app,Build and share a chat UI with Databricks Apps - Azure Databricks,Build and deploy a chat UI for Databricks agents,Use Databricks Apps to build and deploy a custom chat UI for your agent.,Use Databricks Apps to build and deploy a chat UI for your agent. Agent app templates include this chat UI with no additional setup. 
Use this page to customize the template UI or to add a chat UI to an agent deployed without a template.,2026-03-27T08:00:00.000Z,concept-article,deployment,0.7,True,"Describes building and deploying a chat UI using Databricks Apps; likely includes app configuration, routing, and deployment details specific to Databricks.",unchanged @@ -945,14 +965,14 @@ https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/create-agent,Overview,Create an AI agent - Azure Databricks,,Learn what it means to create an agent on Databricks and the available methods for creating agents.,"This article introduces the process of creating AI agents on Azure Databricks and outlines the available methods for creating agents. To learn more about agents, see Agent system design patterns.",2026-03-31T23:28:00.000Z,concept-article,,0.35,False,Introduces the process and methods for creating agents; appears conceptual and introductory rather than a detailed configuration or troubleshooting reference.,unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/create-custom-tool,UC function tools,Create AI agent tools using Unity Catalog functions - Azure Databricks,Create custom Databricks AI agent tools with Unity Catalog functions,Learn how to create AI agent tools that run custom task-specific operations.,Use Unity Catalog functions to create AI agent tools that execute custom logic and perform specific tasks that extend the capabilities of LLMs beyond language generation.,2026-04-07T17:44:00.000Z,concept-article,integrations,0.7,True,"Shows how to build custom tools using Unity Catalog functions to execute task-specific logic. 
This is concrete, product-specific guidance on defining callable functions/tools for agents, fitting integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/debug-agent,Debug agents,Debug a deployed AI agent - Azure Databricks,Troubleshoot and debug Databricks AI agent deployments,Learn how to debug common issues when deploying AI agents on Databricks.,"This page covers how to debug common issues with AI agents deployed on Azure Databricks. Go to: Most debugging sections on this page apply to agents deployed to Databricks Apps. However, you can also find debugging information for agents deployed on Model Serving (legacy) using the tab selectors.",2026-02-12T22:57:00.000Z,concept-article,troubleshooting,0.82,True,"Organized around debugging issues for agents on Apps and Model Serving, likely listing specific error messages, causes, and resolutions. This matches symptom→cause→solution troubleshooting content.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/deploy-agent,Deploy agents,Deploy an agent for generative AI applications (Model Serving) - Azure Databricks,Deploy Databricks AI agents with Model Serving,Learn how to deploy an agent for your generative AI application on Databricks.,"Important For new use cases, Databricks recommends deploying agents on Databricks Apps for full control over agent code, server configuration, and deployment workflow. See Author an AI agent and deploy it on Databricks Apps. To migrate an existing agent, see Migrate an agent from Model Serving to Databricks Apps. Deploy your AI agent on Mosaic AI Model Serving using the deploy() function from the Agent Framework Python API. Deployment creates a serving endpoint with built-in scalability, monitoring, ",2026-04-17T21:49:00.000Z,concept-article,deployment,0.65,True,Covers deploying agents via the Agent Framework deploy() function on Mosaic AI Model Serving. 
This is product-specific deployment behavior and likely includes endpoint configuration details and constraints unique to Databricks Model Serving.,updated -https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/external-connection-tools,Connect agents to external services,Connect agents to external services - Azure Databricks,Connect Databricks agents to external services and APIs,"Learn how to connect AI agents to external applications and APIs using MCP servers, managed OAuth, the UC connections proxy, or UC function tools.","Important This feature is in Public Preview. Connect your AI agents to external applications like Slack, Google Calendar, or any service with an API. Azure Databricks provides several approaches depending on whether the external service has an MCP server, whether you need per-user authentication, or whether you prefer to call APIs directly from agent code. All approaches rely on a Unity Catalog HTTP connection to securely manage credentials and govern access to external services.",2026-04-15T08:00:00.000Z,concept-article,integrations,0.7,True,"Explains multiple product-specific approaches (MCP servers, managed OAuth, UC connections proxy, direct API calls) and relies on Unity Catalog HTTP connections. This is integration-focused with Databricks-specific connection patterns and configuration details.",updated +https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/deploy-agent,Deploy agents,Deploy an agent for generative AI applications (Model Serving) - Azure Databricks,Deploy Databricks AI agents with Model Serving,Learn how to deploy an agent for your generative AI application on Databricks.,"Important For new use cases, Databricks recommends deploying agents on Databricks Apps for full control over agent code, server configuration, and deployment workflow. See Author an AI agent and deploy it on Databricks Apps. 
To migrate an existing agent, see Migrate an agent from Model Serving to Databricks Apps. Deploy your AI agent on Mosaic AI Model Serving using the deploy() function from the Agent Framework Python API. Deployment creates a serving endpoint with built-in scalability, monitoring, ",2026-04-17T21:49:00.000Z,concept-article,deployment,0.65,True,Covers deploying agents via the Agent Framework deploy() function on Mosaic AI Model Serving. This is product-specific deployment behavior and likely includes endpoint configuration details and constraints unique to Databricks Model Serving.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/external-connection-tools,Connect agents to external services,Connect agents to external services - Azure Databricks,Connect Databricks agents to external services and APIs,"Learn how to connect AI agents to external applications and APIs using MCP servers, managed OAuth, the UC connections proxy, or UC function tools.","Important This feature is in Public Preview. Connect your AI agents to external applications like Slack, Google Calendar, or any service with an API. Azure Databricks provides several approaches depending on whether the external service has an MCP server, whether you need per-user authentication, or whether you prefer to call APIs directly from agent code. All approaches rely on a Unity Catalog HTTP connection to securely manage credentials and govern access to external services.",2026-04-15T08:00:00.000Z,concept-article,integrations,0.7,True,"Explains multiple product-specific approaches (MCP servers, managed OAuth, UC connections proxy, direct API calls) and relies on Unity Catalog HTTP connections. 
This is integration-focused with Databricks-specific connection patterns and configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/feedback-model,Feedback model (deprecated),Feedback model (deprecated) - Azure Databricks,Replace deprecated Databricks agent feedback model,Learn about the deprecated feedback model for collecting agent feedback through experimental APIs.,"Important For new use cases, Databricks recommends deploying agents on Databricks Apps for full control over agent code, server configuration, and deployment workflow. See Author an AI agent and deploy it on Databricks Apps. To migrate an existing agent, see Migrate an agent from Model Serving to Databricks Apps. Important Deprecation notice: The feedback model has been deprecated as of December 4, 2025 and is no longer supported in the latest version of databricks-agents. Action required: Use MLflow",2026-02-23T22:28:00.000Z,concept-article,configuration,0.65,True,Describes deprecated feedback model and required migration to MLflow Traces; likely includes API names and configuration changes specific to Databricks agents.,unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/langchain-uc-integration,LangChain/LangGraph,Integrate LangChain with Databricks Unity Catalog tools - Azure Databricks,Integrate LangChain workflows with Unity Catalog tools,Use Databricks Unity Catalog to integrate SQL and Python functions as tools in LangChain and LangGraph workflows.,Use Databricks Unity Catalog to integrate SQL and Python functions as tools in LangChain and LangGraph workflows. This integration combines the governance of Unity Catalog with LangChain capabilities to build powerful LLM-based applications.,2026-02-10T23:12:00.000Z,concept-article,integrations,0.78,True,"Provides concrete LangChain/LangGraph code and configuration to register Unity Catalog SQL/Python functions as tools. 
Includes parameter names, classes, and wiring patterns unique to this integration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/llamaindex-uc-integration,LlamaIndex,Integrate LlamaIndex with Databricks Unity Catalog tools - Azure Databricks,Integrate LlamaIndex workflows with Databricks Unity Catalog tools,"Use Databricks Unity Catalog to integrate SQL and Python functions as tools in LlamaIndex workflows, enabling indexing and querying of large datasets for LLMs.",Use Databricks Unity Catalog to integrate SQL and Python functions as tools in LlamaIndex workflows. This integration combines Unity Catalog governance with LlamaIndex's capabilities to index and query large datasets for LLMs.,2025-09-03T08:00:00.000Z,concept-article,integrations,0.8,True,Focuses on wiring UC SQL/Python functions into LlamaIndex; expected to contain concrete code and parameter mappings that are product- and SDK-specific.,unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/log-agent,Log and register agents,Log and register AI agents (Model Serving) - Azure Databricks,Log and register Databricks Model Serving AI agents,"Learn how to create, parameterize, and log AI application agents.","Important For new use cases, Databricks recommends deploying agents on Databricks Apps for full control over agent code, server configuration, and deployment workflow. See Author an AI agent and deploy it on Databricks Apps. To migrate an existing agent, see Migrate an agent from Model Serving to Databricks Apps. Log AI agents using Mosaic AI Agent Framework. Logging an agent is the basis of the development process. 
Logging captures a ""point in time"" of the agent's code and configuration so you ca",2026-02-10T23:12:00.000Z,concept-article,configuration,0.7,True,"Explains logging agents as points-in-time of code and configuration; likely includes specific logging APIs, parameters, and schema details unique to Databricks.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/mcp,MCP servers,MCP servers for agents - Azure Databricks,,"Use Model Context Protocol (MCP) servers to connect Databricks AI agents to tools, resources, and external services through a standardized interface.","Model Context Protocol (MCP) is an open source standard that connects AI agents to tools, resources, prompts, and other contextual information. On Databricks, MCP servers are governed through AI Gateway and integrate with the agent framework as a primary way to give agents practical capabilities. See Model Context Protocol (MCP) on Databricks for complete documentation.",2026-04-15T22:08:00.000Z,concept-article,,0.4,False,"Introduces MCP servers for agents and references broader MCP documentation; likely conceptual/overview of the protocol and its role in Databricks rather than detailed config tables, limits, or troubleshooting mappings.",new -https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/migrate-agent-to-apps,Migrate from Model Serving,Migrate an agent from Model Serving to Databricks Apps - Azure Databricks,,"Migrate your AI agent from a Model Serving endpoint to Databricks Apps for full control over agent code, configuration, and deployment.",Migrate an existing AI agent from a Model Serving endpoint to Databricks Apps. 
Databricks recommends authoring agents on Databricks Apps since it provides the following advantages over Model Serving:,2026-04-17T08:00:00.000Z,concept-article,,0.4,False,"Migration guidance from Model Serving to Databricks Apps, but summary suggests high-level recommendations and advantages rather than detailed configuration matrices, limits, or error-code-based troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/mcp,MCP servers,MCP servers for agents - Azure Databricks,,"Use Model Context Protocol (MCP) servers to connect Databricks AI agents to tools, resources, and external services through a standardized interface.","Model Context Protocol (MCP) is an open source standard that connects AI agents to tools, resources, prompts, and other contextual information. On Databricks, MCP servers are governed through AI Gateway and integrate with the agent framework as a primary way to give agents practical capabilities. See Model Context Protocol (MCP) on Databricks for complete documentation.",2026-04-15T22:08:00.000Z,concept-article,,0.4,False,"Introduces MCP servers for agents and references broader MCP documentation; likely conceptual/overview of the protocol and its role in Databricks rather than detailed config tables, limits, or troubleshooting mappings.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/migrate-agent-to-apps,Migrate from Model Serving,Migrate an agent from Model Serving to Databricks Apps - Azure Databricks,,"Migrate your AI agent from a Model Serving endpoint to Databricks Apps for full control over agent code, configuration, and deployment.",Migrate an existing AI agent from a Model Serving endpoint to Databricks Apps. 
Databricks recommends authoring agents on Databricks Apps since it provides the following advantages over Model Serving:,2026-04-17T08:00:00.000Z,concept-article,,0.4,False,"Migration guidance from Model Serving to Databricks Apps, but summary suggests high-level recommendations and advantages rather than detailed configuration matrices, limits, or error-code-based troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/multi-agent-apps,Create a custom multi-agent system,Build a multi-agent system on Databricks Apps - Azure Databricks,Build multi-agent orchestrator apps on Databricks,"Build a multi-agent orchestrator on Databricks Apps that routes requests to other agents, Genie spaces, and serving endpoints.","Instead of building one agent that does everything, a multi-agent orchestrator routes requests to specialized subagents from a single entry point. For example, you can combine a RAG agent that queries unstructured documents with a Genie agent that queries structured data, so users get answers from multiple sources. The orchestrator treats each subagent as a tool and uses its instructions to route requests to the right one. The orchestrator supports the following subagent types:",2026-02-25T19:11:00.000Z,concept-article,architecture-patterns,0.7,True,Describes a multi-agent orchestrator pattern with supported subagent types; provides Databricks-specific architectural guidance on routing and composition of agents and services.,unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/multi-agent-genie,Multi-agent using Genie,Use Genie in multi-agent systems (Model Serving) - Azure Databricks,Create Genie-based multi-agent systems on Databricks,Create a multi-agent system using Genie and LangGraph.,"Important This feature is in Public Preview. 
Important For new use cases, Databricks recommends deploying agents on Databricks Apps for full control over agent code, server configuration, and deployment workflow. See Author an AI agent and deploy it on Databricks Apps. To migrate an existing agent, see Migrate an agent from Model Serving to Databricks Apps. This page describes Genie agent systems and shows how to create a multi-agent system using Mosaic AI Agent Framework and Genie spaces.",2026-03-10T23:23:00.000Z,concept-article,architecture-patterns,0.7,True,Describes Genie agent systems and how to create a multi-agent system using Genie spaces; this is a Databricks-specific multi-agent architecture pattern.,unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/non-conversational-agents,Non-conversational agents,Non-conversational AI agents using MLflow - Azure Databricks,Build non-conversational Databricks AI agents with MLflow,Build non-conversational AI agents that process structured inputs and provide specific outputs without maintaining conversation state using Mosaic AI Agent Framework.,"Open notebook version of this page Legacy workflow: This notebook describes the legacy approach of deploying agents through Model Serving. Databricks recommends deploying agents on Databricks Apps for full control over agent code, server configuration, and deployment workflow. For the recommended approach, see Author an AI agent and deploy it on Databricks Apps. Non-conversational agents process structured inputs to produce specific outputs without maintaining conversation state. 
Each request is ",2026-02-10T23:12:00.000Z,concept-article,architecture-patterns,0.65,True,Describes non-conversational agents that process structured inputs without state; likely includes Databricks-specific patterns and schemas for such agents.,unchanged @@ -960,9 +980,10 @@ https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/query-agent,Query agents,Query an agent deployed on Azure Databricks - Azure Databricks,Query Databricks agents via REST and SDK clients,Learn how to query an AI agent for your generative AI application on Databricks.,Learn how to send requests to agents deployed to Databricks Apps or Model Serving endpoints. Databricks provides multiple query methods to fit different use cases and integration needs. Select the query approach that best fits your use case: Databricks recommends theDatabricks OpenAI Clientfor new applications. Choose theREST APIwhen integrating with platforms that expect OpenAI-compatible endpoints.,2026-03-11T22:00:00.000Z,concept-article,integrations,0.7,True,"Details specific request formats, headers, and client configuration (Databricks OpenAI client vs REST) for querying agents. These are product-specific API integration patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/request-assessment-logs,Request logs and assessment logs (deprecated),Agent inference tables: Request and assessment logs (deprecated) - Azure Databricks,Migrate from deprecated Databricks agent inference tables,Request logs and assessment logs for Mosaic AI Agent Framework are deprecated. Learn how to migrate to MLflow Traces and production monitoring.,"Important For new use cases, Databricks recommends deploying agents on Databricks Apps for full control over agent code, server configuration, and deployment workflow. SeeAuthor an AI agent and deploy it on Databricks Apps. 
To migrate an existing agent, see Migrate an agent from Model Serving to Databricks Apps. Important Deprecation notice: As of December 4, 2025, Databricks no longer automatically populates the payload_request_logs and payload_assessment_logs tables. These tables have been deprecate",2026-02-10T08:00:00.000Z,concept-article,configuration,0.7,True,"Details deprecation of specific request and assessment log tables and migration to MLflow Traces; likely includes table names, schema expectations, and migration steps.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/slack-agent,Slack,Connect an AI agent to Slack - Azure Databricks,Integrate Databricks AI agents with Slack via HTTP connections,Create an AI agent that can post messages to Slack using HTTP Unity Catalog connections.,"Important This feature is in Public Preview. Learn how to create an AI agent that can post messages to Slack using HTTP Unity Catalog connections. This page demonstrates User-to-Machine authentication for external services, allowing your agent to interact with Slack.",2026-04-02T22:35:00.000Z,concept-article,integrations,0.8,True,Shows how to connect an agent to Slack using HTTP Unity Catalog connections and user-to-machine auth; this is a concrete integration with product-specific connection parameters and auth setup.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/stateful-agents,Agent memory,AI agent memory - Azure Databricks,,Learn how to build AI agents with memory capabilities on Databricks Apps using Lakebase as the durable memory store for both short-term and long-term memory.,"Memory lets AI agents remember information from earlier in the conversation or from previous conversations. This lets agents provide context-aware responses and build personalized experiences over time. 
Use Databricks Lakebase, a fully-managed Postgres OLTP database, to manage conversation state and history.",2026-04-17T18:03:00.000Z,concept-article,,0.3,False,"Describes AI agent memory concepts and using Lakehouse/Lakebase as durable memory; likely conceptual and architectural overview without concrete limits, config tables, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/stateful-agents,Agent memory,AI agent memory - Azure Databricks,,Learn how to build AI agents with memory capabilities on Databricks Apps using Lakebase as the durable memory store for both short-term and long-term memory.,"Memory lets AI agents remember information from earlier in the conversation or from previous conversations. This lets agents provide context-aware responses and build personalized experiences over time. Use Databricks Lakebase, a fully-managed Postgres OLTP database, to manage conversation state and history.",2026-04-17T18:03:00.000Z,concept-article,,0.3,False,"Describes AI agent memory concepts and using Lakehouse/Lakebase as durable memory; likely conceptual and architectural overview without concrete limits, config tables, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/stateful-agents-model-serving,Agent memory,AI agent memory (Model Serving) - Azure Databricks,Implement AI agent memory on Databricks Model Serving,Learn how to build AI agents with memory capabilities using Mosaic AI Agent Framework with Lakebase as the durable memory store for both short-term and long-term memory.,"Important This feature is in Public Preview. Important For new use cases, Databricks recommends deploying agents on Databricks Apps for full control over agent code, server configuration, and deployment workflow. See Author an AI agent and deploy it on Databricks Apps. To migrate an existing agent, see Migrate an agent from Model Serving to Databricks Apps. 
Memory lets AI agents remember information from earlier in the conversation or from previous conversations. This lets agents provide context",2026-02-25T23:30:00.000Z,concept-article,architecture-patterns,0.7,True,Explains building agents with memory using Lakebase via Model Serving; likely includes Databricks-specific memory patterns and configuration.,unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/structured-retrieval-tools,Work with structured data,Connect agents to structured data - Azure Databricks,Connect Databricks AI agents to structured data sources,Use Mosaic AI Agent Framework to create AI agent tools to query structured data sources such as SQL tables.,"AI agents often need to query or manipulate structured data to answer questions, update records, or create data pipelines. Databricks provides multiple approaches for connecting agents to structured data in Unity Catalog tables and external data stores. Use pre-configured MCP servers for immediate access to Genie spaces and SQL warehouses, or build custom tools for specialized workflows. This page shows how to:",2026-04-07T17:44:00.000Z,concept-article,integrations,0.7,True,"Focuses on connecting agents to structured data via Unity Catalog tables, SQL warehouses, and external stores using MCP servers and custom tools. This is a concrete integration pattern with Databricks-specific mechanisms and likely parameterized tool definitions.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/supervisor-api-app,Build with Supervisor API,Build a custom agent using the Supervisor API (Beta) - Azure Databricks,Build Databricks Apps agents with Supervisor API orchestration,Use the Databricks Supervisor API to build custom agents on Databricks Apps with server-side tool execution and multi-AI model support.,"Important This feature is in Beta. Account admins can control access to this feature from the Previews page. 
See Manage Azure Databricks previews. You can build an Azure Databricks Apps agent that uses the Supervisor API (Beta) for orchestration instead of managing the agent loop in your own code. The result is the same as authoring a custom agent: a deployed App with a chat UI, an /invocations endpoint, and authentication. The difference is that Azure Databricks runs the agent loop for you. Your agent.py m",2026-04-21T08:00:00.000Z,concept-article,integrations,0.7,True,"Explains using Supervisor API for orchestration instead of custom loops, tied to Databricks Apps agent.py behavior and endpoints. This is product-specific integration behavior and coding pattern for combining Apps with Supervisor API, which qualifies as expert integration knowledge.",new https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/teams-agent,Microsoft Teams,Connect an AI agent to Microsoft Teams - Azure Databricks,Integrate Databricks AI agents with Microsoft Teams via OAuth OBO,Create an AI agent that interacts with Microsoft Teams using OAuth On Behalf Of authentication through Azure Bot Service.,"Important This feature is in Public Preview. This page shows how to integrate AI agents with Microsoft Teams using OAuth ""On Behalf Of"" authentication. 
This integration lets your Databricks agents interact with users through Microsoft Teams while maintaining secure, user-delegated access to Databricks resources.",2026-02-10T08:00:00.000Z,concept-article,integrations,0.8,True,Shows Teams integration using Azure Bot Service and OAuth On Behalf Of; this is a concrete integration with detailed auth configuration and Databricks-specific setup.,unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/unity-catalog-tool-integration,Overview,Integrate Unity Catalog tools with third party generative AI frameworks - Azure Databricks,Use Unity Catalog tools with third-party gen AI frameworks,Use popular generative AI libraries with Unity Catalog governance to integrate the capabilities of third-party agent authoring frameworks.,"Unity Catalog AI agent tools can easily be used in popular gen AI libraries like LangChain, LlamaIndex, OpenAI, and Anthropic. These integrations combine Unity Catalog tool governance with the capabilities of third party agent authoring frameworks. For example:",2025-10-06T08:00:00.000Z,concept-article,integrations,0.8,True,"Explains integrating UC tools with LangChain, LlamaIndex, OpenAI, and Anthropic; likely includes SDK-specific configuration and tool registration patterns unique to Databricks + these frameworks.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/unstructured-retrieval-tools,Query unstructured data,Connect agents to unstructured data - Azure Databricks,Connect Databricks AI agents to unstructured data with Vector Search,"Use Mosaic AI Agent Framework to create AI agent tools to query unstructured data sources, such as a document corpus, using Vector Search indexes.","AI agents often need to query unstructured data like document collections, knowledge bases, or text corpora to answer questions and provide context-aware responses. 
Databricks provides multiple approaches for connecting agents to unstructured data in Vector Search indexes and external vector stores. Use pre-configured MCP servers for immediate access to Databricks Vector Search indexes, develop retriever tools locally with AI Bridge packages, or build custom retriever functions for specialized w",2026-04-07T17:44:00.000Z,concept-article,integrations,0.7,True,"Covers connecting agents to unstructured data via Databricks Vector Search indexes and external vector stores, including retriever tools and AI Bridge packages. This is a product-specific integration pattern with concrete tool and API usage.",unchanged @@ -973,11 +994,11 @@ https://learn.microsoft.com/en-us/azure/databricks/generative-ai/guide/agents-de https://learn.microsoft.com/en-us/azure/databricks/generative-ai/guide/concepts/,Concepts,Concepts: Generative AI on Azure Databricks - Azure Databricks,,"Learn about key concepts for generative AI on Databricks, including challenges, solutions, agent systems, the GenAI developer workflow, Retrieval-Augmented Generation (RAG), and more.","A GenAI app is an application that uses generative AI models (such as large language models, image generation models, and text-to-speech models) to create new outputs, automate complex tasks, or engage in intelligent interactions based on user input. A GenAI app can be powered by simple calls to LLMs or other GenAI models, or by complex AI agents. Read more about levels of complexity. Agents, tools, evaluation, models, and other aspects of GenAI apps can be customized with your proprietary data. 
",2026-03-27T08:00:00.000Z,concept-article,,0.1,False,"Concepts page for generative AI on Databricks; high-level explanations of agents, RAG, workflows without detailed configs, limits, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/guide/gen-ai-capabilities,Gen AI capabilities,Azure Databricks generative AI capabilities - Azure Databricks,,"Use Azure Databricks for building, evaluating, deploying, and monitoring generative AI applications at scale.","Azure Databricks provides a platform for building, evaluating, deploying, and monitoring generative AI applications (GenAI apps). It brings together a suite of tools that tackle thechallenges of developing enterprise-grade GenAI apps. Azure Databricksintegrates with popular open source frameworks, adding enterprise-grade governance, observability, and operational tooling, collectively known as LLMOps. This page lists major features for GenAI, organized by GenAI workflow stages.",2026-03-27T08:00:00.000Z,concept-article,,0.2,False,"Lists Azure Databricks GenAI capabilities by workflow stage; primarily an overview/feature catalog without detailed decision matrices, configs, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/guide/gen-ai-challenges,Key challenges in building GenAI apps,Key challenges in building GenAI apps - Azure Databricks,,"Learn about the three primary challenges to building a production-quality GenAI app: quality, control of data and models, and cost.","Despite the power of modern GenAI models, production-grade generative AI applications are often challenging to build. 
Three key challenges can be summarized as:",2026-03-27T08:00:00.000Z,concept-article,,0.1,False,"Describes key challenges in building GenAI apps; conceptual discussion of quality, control, and cost rather than product-specific expert configuration or troubleshooting details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/,Overview,Model Context Protocol (MCP) on Databricks - Azure Databricks,,"Use Model Context Protocol (MCP) on Databricks to connect AI agents with tools, data, and workflows through a standardized, secure interface.","MCP is an open source standard that connects AI agents to tools, resources, prompts, and other contextual information. AI Gateway is the enterprise control plane for governing MCP servers, in addition to LLM endpoints. Databricks provides the following types of MCP servers: To see your available MCP servers, go to your workspace > Agents > MCP Servers:",2026-04-15T22:08:00.000Z,concept-article,,0.4,False,"Conceptual description of MCP on Databricks and listing types of MCP servers; summary does not indicate detailed configuration parameters, limits, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/,Overview,Model Context Protocol (MCP) on Databricks - Azure Databricks,,"Use Model Context Protocol (MCP) on Databricks to connect AI agents with tools, data, and workflows through a standardized, secure interface.","MCP is an open source standard that connects AI agents to tools, resources, prompts, and other contextual information. AI Gateway is the enterprise control plane for governing MCP servers, in addition to LLM endpoints. 
Databricks provides the following types of MCP servers: To see your available MCP servers, go to your workspace > AI Gateway > MCPs:",2026-04-24T18:05:00.000Z,concept-article,,0.3,False,High-level description of MCP on Databricks and available server types; summary suggests conceptual overview and navigation to specific MCP server pages without detailed configuration tables or numeric limits.,updated https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/connect-external-services,Connect clients to MCP servers,Connect non-Databricks clients to Databricks MCP servers - Azure Databricks,Authenticate external clients to Databricks MCP servers,"Connect external AI assistants and IDEs like Claude, Claude Code, Cursor, Replit, and MCP Inspector to Databricks MCP servers using OAuth or personal access token authentication.","Important This feature is in Public Preview. Connect non-Databricks (external) clients, AI assistants, and IDEs that support Model Context Protocol (MCP) to Databricks MCP servers. This provides access to Databricks data and tools directly in your development environment. By connecting external clients to Databricks MCP servers, you can:",2026-04-02T08:00:00.000Z,concept-article,security,0.8,True,"Focuses on connecting external IDEs/assistants using OAuth or PAT; expected to document specific auth flows, token usage, and security settings unique to Databricks MCP.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/custom-mcp,Host a custom MCP server,Host custom MCP servers using Databricks apps - Azure Databricks,Host custom MCP servers as Databricks apps,Host custom MCP servers on Databricks Apps to connect AI agents to specialized tools and context.,"Host custom or third-party MCP servers as Databricks apps. Custom MCP servers are useful if you already have an MCP server you want to deploy or if you want to run a third-party MCP server as a source of tools. 
Access to custom MCP servers is controlled through Databricks Apps permissions. To monitor custom MCP activity alongside your other MCP servers and LLM endpoints, use AI Gateway.",2026-04-15T08:00:00.000Z,concept-article,configuration,0.65,True,"Describes how to host custom/third-party MCP servers as Databricks apps, controlled via Databricks Apps permissions and monitored via AI Gateway. Likely includes app configuration parameters, permission settings, and wiring to MCP, which are product-specific configuration details.",new -https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/external-mcp,External MCP servers,Use external MCP servers - Azure Databricks,Configure external MCP servers with Databricks proxies,Connect to external MCP servers using Databricks-managed proxies with Unity Catalog connections for secure authentication and token management.,Important This feature is in Public Preview. Connect agents to third-party Model Context Protocol (MCP) servers through Databricks-managed proxies to access external tools and services. Databricks supports both shared principal and per-user authentication for external MCP servers. See Supported authentication methods.,2026-04-16T08:00:00.000Z,concept-article,configuration,0.7,True,"Page is about connecting to external MCP servers via Databricks-managed proxies and Unity Catalog connections. This typically includes concrete connection settings, auth modes (shared principal vs per-user), and configuration parameters specific to Databricks MCP integration, which are product-specific details not inferable from general knowledge.",updated -https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/managed-mcp,Managed MCP servers,Use Databricks managed MCP servers - Azure Databricks,Use Databricks managed MCP servers to connect agents to data,Connect AI agents to Databricks data and tools using managed MCP servers with no setup required.,"Important This feature is in Public Preview. 
Databricks managed MCP servers are ready-to-use servers that connect your AI agents to data stored in Unity Catalog, Databricks Vector Search indexes, Genie spaces, and custom functions. Unity Catalog permissions are always enforced, so agents and users can only access tools and data they're allowed to. To monitor managed MCP server usage and manage access alongside your other AI resources, use AI Gateway.",2026-04-15T08:00:00.000Z,concept-article,integrations,0.7,True,"Describes managed MCP servers that connect agents to Unity Catalog, Vector Search, Genie spaces, and custom functions, with governance via AI Gateway. This is a product-specific integration pattern between agents and Databricks data/tools.",updated +https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/custom-mcp,Host a custom MCP server,Host custom MCP servers using Databricks apps - Azure Databricks,Host custom MCP servers on Databricks Apps with controlled access,Host custom MCP servers on Databricks Apps to connect AI agents to specialized tools and context.,"Host custom or third-party MCP servers as Databricks apps. Custom MCP servers are useful if you already have an MCP server you want to deploy or if you want to run a third-party MCP server as a source of tools. Access to custom MCP servers is controlled through Databricks Apps permissions. 
To monitor custom MCP activity alongside your other MCP servers and LLM endpoints, use AI Gateway.",2026-04-23T08:00:00.000Z,concept-article,security,0.65,True,"Access to custom MCP servers is controlled via Databricks Apps permissions; the full content will detail permission scopes/roles and how they apply to MCP servers, which is product-specific security configuration.",updated +https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/external-mcp,External MCP servers,Use external MCP servers - Azure Databricks,Secure external MCP servers with Databricks-managed proxies,Connect to external MCP servers using Databricks-managed proxies with Unity Catalog connections for secure authentication and token management.,Important This feature is in Public Preview. Connect agents to third-party Model Context Protocol (MCP) servers through Databricks-managed proxies to access external tools and services. Databricks supports both shared principal and per-user authentication for external MCP servers. See Supported authentication methods.,2026-04-23T08:00:00.000Z,concept-article,security,0.75,True,Describes supported authentication methods (shared principal vs per-user) and Unity Catalog connections for token management; this implies concrete auth configuration parameters and security patterns unique to Databricks’ MCP integration.,updated +https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/managed-mcp,Managed MCP servers,Use Databricks managed MCP servers - Azure Databricks,Use Databricks managed MCP servers with Unity Catalog security,Connect AI agents to Databricks data and tools using managed MCP servers with no setup required.,"Important This feature is in Public Preview. Databricks managed MCP servers are ready-to-use servers that connect your AI agents to data stored in Unity Catalog, Databricks Vector Search indexes, Genie spaces, and custom functions. 
Unity Catalog permissions are always enforced, so agents and users can only access tools and data they're allowed to. To monitor managed MCP server usage and manage access alongside your other AI resources, use AI Gateway.",2026-04-23T08:00:00.000Z,concept-article,security,0.65,True,"Managed MCP servers enforce Unity Catalog permissions; the full article is likely to list specific permission models, role mappings, and how access is evaluated for tools and data, which are product-specific security configuration details.",updated https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/managed-mcp-meta-param,Meta parameters,Meta parameters for Databricks managed MCP servers - Azure Databricks,Configure _meta parameters for Databricks managed MCP servers,Use the `_meta` parameter to control MCP server behavior such as result limits and search filters when building AI agents.,"Important This feature is in Public Preview. When you build AI agents that use Databricks managed MCP servers, use the _meta parameter to control tool behavior such as result limits, search filters, or SQL warehouse selection. This allows you to preset configuration while keeping queries flexible for your agent to generate dynamically. The _meta parameter is part of the official MCP specification.",2026-02-24T08:00:00.000Z,concept-article,configuration,0.9,True,"Explicitly about the `_meta` parameter controlling result limits, filters, and warehouse selection; this is a configuration reference with parameter names, allowed values, and behavior—clear expert configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/pretrained-models,AI catalog foundation models,Access generative AI and LLM models from Unity Catalog - Azure Databricks,,Access generative AI or LLM models on Databricks. One-click access from Catalog Explorer.,"Important This feature is in Public Preview. 
Azure Databricks includes a selection of high-quality generative AI and LLM foundation models in Unity Catalog. This article describes how you can use those models and incorporate them into your inference workflows. These models allow you to access state-of-the-art AI capabilities, saving you the time and expense of building your own custom models. For information about using your own custom models with Unity Catalog, see Manage model lifecycle in Unity",2026-03-31T23:28:00.000Z,concept-article,,0.3,False,"Describes accessing generative AI/LLM models from Unity Catalog and one-click access. Likely a usage/feature overview without detailed configuration tables, limits, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/retrieval-augmented-generation,Introduction to RAG,RAG (Retrieval Augmented Generation) on Azure Databricks - Azure Databricks,,Learn about retrieval augmented generation (RAG) on Azure Databricks to achieve greater large language model (LLM) accuracy with your own data.,"Retrieval-augmented generation (RAG) is a powerful technique that combines large language models (LLMs) with real-time data retrieval to generate more accurate, up-to-date, and contextually relevant responses. 
This approach is especially valuable for answering questions about proprietary, frequently changing, or domain-specific information.",2026-03-27T08:00:00.000Z,concept-article,,0.2,False,"RAG overview on Azure Databricks; explains what RAG is and its benefits, but summary does not indicate detailed product-specific configs or limits.",unchanged @@ -992,39 +1013,39 @@ https://learn.microsoft.com/en-us/azure/databricks/generative-ai/tutorials/ai-co https://learn.microsoft.com/en-us/azure/databricks/generative-ai/tutorials/ai-cookbook/quality-overview,Improve gen AI app quality,Improve RAG application quality - Azure Databricks,Optimize Databricks RAG application quality,Get an overview of the key elements of retrieval augmented generation quality: the data pipeline and the RAG chain.,"This article provides an overview of how you can refine each component to increase the quality of your retrieval augmented generation (RAG) application. There are myriad “knobs” to tune at every point in both the offline data pipeline, and online RAG chain. While there are countless others, the article focuses on the most important knobs that have the greatest impact on the quality of your RAG application. Databricks recommends starting with these knobs.",2025-05-09T19:58:00.000Z,concept-article,best-practices,0.7,True,"Explicitly about 'knobs' to tune in pipeline and RAG chain; this is actionable, product-specific tuning guidance rather than generic theory, fitting best-practices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/tutorials/ai-cookbook/quality-rag-chain,Improve RAG chain quality,Improve RAG chain quality - Azure Databricks,Improve Databricks RAG chain quality,The article describes the data pipeline and the steps for processing data for retrieval augmented generation (RAG).,"This article covers how you can improve the quality of the RAG app using components of the RAG chain. 
The RAG chain takes a user query as input, retrieves relevant information given that query, and generates an appropriate response grounded on the retrieved data. While the exact steps within a RAG chain can vary widely depending on the use case and requirements, the following are the key components to consider when building your RAG chain: Learn more about iteratively evaluating and improving yo",2024-10-09T08:00:00.000Z,concept-article,best-practices,0.7,True,"Focuses on improving quality via RAG chain components; likely includes concrete recommendations and patterns for Databricks RAG chains, which are best-practices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/generative-ai/tutorials/external-models-tutorial,Query OpenAI models,Tutorial: Create external model endpoints to query OpenAI models - Azure Databricks,Configure Databricks external endpoints for OpenAI,This tutorial provides step-by-step instructions for configuring and querying an external model endpoint that serves OpenAI models.,"This article provides step-by-step instructions for configuring and querying an external model endpoint that serves OpenAI models for completions, chat, and embeddings using the MLflow Deployments SDK. Learn more about external models. 
If you prefer to use the Serving UI to accomplish this task, see Create an external model serving endpoint.",2024-10-30T08:00:00.000Z,concept-article,configuration,0.7,True,"Step-by-step instructions for configuring external model endpoints via MLflow Deployments; likely includes endpoint parameter names, required fields, and specific configuration options unique to Databricks external models.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/genie-code/,What is Genie Code?,Genie Code - Azure Databricks,,"An introduction to Genie Code in Databricks, a data-intelligent AI partner, integrated across the workspace and Databricks One.","Genie Code is an autonomous AI partner purpose-built for data work in Azure Databricks. Unlike other AI assistants, Genie Code is deeply integrated with Unity Catalog, allowing it to understand your complete data landscape, including tables, columns, and lineage. This contextual awareness enables Genie Code to accelerate complex data workflows while autonomously adapting to your specific data and governance model. Genie Code is designed for data teams to use every day, from experimentation and m",2026-04-15T08:00:00.000Z,overview,,0.2,False,"High-level introduction to Genie Code; summary indicates conceptual overview and positioning without concrete limits, configs, or error mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/genie-code/,What is Genie Code?,Genie Code - Azure Databricks,,"An introduction to Genie Code in Databricks, a data-intelligent AI partner, integrated across the workspace and Databricks One.","Genie Code is an autonomous AI partner purpose-built for data work in Azure Databricks. Unlike other AI assistants, Genie Code is deeply integrated with Unity Catalog, allowing it to understand your complete data landscape, including tables, columns, and lineage. 
This contextual awareness enables Genie Code to accelerate complex data workflows while autonomously adapting to your specific data and governance model. Genie Code is designed for data teams to use every day, from experimentation and m",2026-04-15T08:00:00.000Z,overview,,0.2,False,"High-level introduction to Genie Code; summary indicates conceptual overview and positioning without concrete limits, configs, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/genie-code/github-mcp,Enterprise code search with Github MCP,Enterprise code search on Genie Code via the Github MCP server - Azure Databricks,,Enterprise code search on Genie Code via Github MCP server.,Expand Genie Code's enterprise code search capabilities by connecting to the Github MCP server. Note MCP servers are only supported in Genie Code Agent mode.,2026-03-31T08:00:00.000Z,how-to,,0.4,False,"Describes GitHub MCP integration conceptually; no explicit mention of parameter tables, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/genie-code/impact,Measure Genie Code impact,Measure Genie Code impact - Azure Databricks,,"Measure the impact of Genie Code using system logs and survey data. Measure adoption, engagement, and productivity impacts.","This article provides information on measuring the impact of Genie Code in terms of adoption, engagement, and reported productivity gains.",2026-03-31T08:00:00.000Z,how-to,,0.3,False,Focuses on measuring impact via logs and surveys; likely analytics guidance without product-specific limits or configs.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/genie-code/instructions,Add custom instructions,Add custom instructions - Azure Databricks,Configure Genie Code custom instructions effectively,Learn how to add custom instructions to Genie Code and about best practices for writing instructions.,"You can customize how Genie Code responds by adding custom instructions. 
Genie Code considers these instructions when it generates new responses. For example, instructions can include: With the exception of Quick Fix and Autocomplete, Genie Code applies instructions to every response it generates, including inline suggestions, chat, Suggest Fix, and Agent mode. In addition to user and workspace instructions, Genie Code automatically discovers and reads AGENTS.md and CLAUDE.md files in your workspace",2026-04-15T08:00:00.000Z,best-practice,best-practices,0.7,True,"Covers how to add custom instructions plus best practices for writing them, likely including product-specific guidance and edge cases for how instructions are applied across Genie Code features.",updated -https://learn.microsoft.com/en-us/azure/databricks/genie-code/mcp,Connect to MCP servers,Connect Genie Code to MCP servers - Azure Databricks,Connect Genie Code to external MCP servers,Connect Genie Code to MCP servers to give it access to tools and data sources.,Connect Genie Code to external tools and data sources through the Model Context Protocol (MCP). Genie Code can use any MCP servers that have been added to your workspace and that you have permission to use. Note MCP servers are only supported in Genie Code Agent mode.,2026-04-17T18:03:00.000Z,how-to,integrations,0.7,True,"Focuses on connecting Genie Code to MCP servers (external tools/data). This is an integration pattern specific to Databricks + MCP, likely with configuration details and constraints such as Agent mode-only support.",updated -https://learn.microsoft.com/en-us/azure/databricks/genie-code/sample-data-explorer,Explore sample data,Explore sample data with Genie Code - Azure Databricks,,Use Genie Code to explore and analyze sample datasets from Unity Catalog using natural language.,"Sample Data Explorer with Genie Code provides a conversational interface for discovering and exploring datasets in your workspace. Use natural language to ask questions about your data without writing SQL. 
Using table metadata and row-level samples, Sample Data Explorer generates SQL queries that you can either review manually or configure to run automatically. Sample Data Explorer supports the following table types:",2026-04-15T22:08:00.000Z,how-to,,0.3,False,"Describes using Genie Code to explore sample data with natural language; appears as a usage/tutorial flow without detailed config tables, limits, or troubleshooting mappings.",new -https://learn.microsoft.com/en-us/azure/databricks/genie-code/skills,Genie Code skills,Extend Genie Code with agent skills - Azure Databricks,Create and configure custom Genie Code agent skills,Create skills to extend Genie Code with specialized capabilities for domain-specific tasks and workflows.,"Genie Code comes with built-in skills that are pre-configured for common Azure Databricks workflows, such as writing code in Azure Databricks notebooks, exploring data in Unity Catalog, building dashboards, creating pipelines, and working with MLflow. You can also create your own skills to extend Genie Code in Agent mode with specialized capabilities for your domain-specific tasks. This page explains how to create and optimize skills.",2026-04-15T19:49:00.000Z,how-to,configuration,0.65,True,"Explains how to create and optimize Genie Code skills. Such pages typically define skill configuration parameters and behavior specific to this product, fitting configuration/integrations; configuration is the closest match.",updated -https://learn.microsoft.com/en-us/azure/databricks/genie-code/tips,Improve Genie Code responses,Tips to improve Genie Code responses - Azure Databricks,Apply practical tips to improve Genie Code responses,Learn about best practices for using Genie Code and some tips to help Genie Code provide better responses.,"This page provides general tips and best practices to help Genie Code provide better responses. 
In addition to the tips on this page, you can tailor Genie Code to your needs in the following ways:",2026-04-17T18:03:00.000Z,best-practice,best-practices,0.65,True,"Explicitly described as tips and best practices for using Genie Code. While summary is brief, this type of page typically contains concrete, product-specific DOs/DON’Ts and interaction patterns unique to Genie Code.",updated -https://learn.microsoft.com/en-us/azure/databricks/genie-code/use-genie-code,Use Genie Code,Use Genie Code - Azure Databricks,,"Learn how to enable or disable Genie Code and how Assistant can help you generate, understand, and troubleshoot code.","Genie Code is a context-aware AI assistant that helps you with data work in Databricks notebooks, SQL editor, jobs, AI/BI dashboards, file editor, and more. It is capable of generating, optimizing, explaining, and fixing code and queries. Use the Genie Code chat to ask for help and use Agent mode to let Genie Code autonomously work on complex multi-step tasks. Note Genie Code defaults to using Databricks-hosted AI models if you disable Partner-powered AI features. To learn how Genie Code helps wit",2026-04-16T17:44:00.000Z,how-to,,0.3,False,"Usage overview for Genie Code (enable/disable, general capabilities) but summary does not show detailed configuration tables, limits, or error-code-based troubleshooting.",updated -https://learn.microsoft.com/en-us/azure/databricks/genie/,Genie space overview,What is a Genie space - Azure Databricks,,Learn how Genie spaces are used to explore data through a natural language chat interface.,"This page introduces Genie, an Azure Databricks feature that allows business teams to interact with their data using natural language. 
It uses generative AI tailored to your organization's terminology and data, with the ability to monitor and refine its performance through user feedback.",2026-03-31T08:00:00.000Z,landing-page,,0.1,False,"Introductory 'What is' page for Genie spaces; high-level conceptual overview without detailed configs, limits, or troubleshooting.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/genie-code/instructions,Add custom instructions,Add custom instructions - Azure Databricks,Configure Genie Code custom instructions effectively,Learn how to add custom instructions to Genie Code and about best practices for writing instructions.,"You can customize how Genie Code responds by adding custom instructions. Genie Code considers these instructions when it generates new responses. For example, instructions can include: With the exception of Quick Fix and Autocomplete, Genie Code applies instructions to every response it generates, including inline suggestions, chat, Suggest Fix, and Agent mode. In addition to user and workspace instructions, Genie Code automatically discovers and reads AGENTS.md and CLAUDE.md files in your workspace",2026-04-15T08:00:00.000Z,best-practice,best-practices,0.7,True,"Covers how to add custom instructions plus best practices for writing them, likely including product-specific guidance and edge cases for how instructions are applied across Genie Code features.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/genie-code/mcp,Connect to MCP servers,Connect Genie Code to MCP servers - Azure Databricks,Connect Genie Code to external MCP servers,Connect Genie Code to MCP servers to give it access to tools and data sources.,Connect Genie Code to external tools and data sources through the Model Context Protocol (MCP). Genie Code can use any MCP servers that have been added to your workspace and that you have permission to use. 
Note MCP servers are only supported in Genie Code Agent mode.,2026-04-17T18:03:00.000Z,how-to,integrations,0.7,True,"Focuses on connecting Genie Code to MCP servers (external tools/data). This is an integration pattern specific to Databricks + MCP, likely with configuration details and constraints such as Agent mode-only support.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/genie-code/sample-data-explorer,Explore sample data,Explore sample data with Genie Code - Azure Databricks,,Use Genie Code to explore and analyze sample datasets from Unity Catalog using natural language.,"Sample Data Explorer with Genie Code provides a conversational interface for discovering and exploring datasets in your workspace. Use natural language to ask questions about your data without writing SQL. Using table metadata and row-level samples, Sample Data Explorer generates SQL queries that you can either review manually or configure to run automatically. Sample Data Explorer supports the following table types:",2026-04-21T08:00:00.000Z,how-to,,0.2,False,"Describes using Genie Code to explore sample data; summary suggests conceptual and workflow guidance, not detailed configuration, limits, or troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/databricks/genie-code/skills,Genie Code skills,Extend Genie Code with agent skills - Azure Databricks,Create and configure custom Genie Code agent skills,Create skills to extend Genie Code with specialized capabilities for domain-specific tasks and workflows.,"Genie Code comes with built-in skills that are pre-configured for common Azure Databricks workflows, such as writing code in Azure Databricks notebooks, exploring data in Unity Catalog, building dashboards, creating pipelines, and working with MLflow. You can also create your own skills to extend Genie Code in Agent mode with specialized capabilities for your domain-specific tasks. 
This page explains how to create and optimize skills.",2026-04-15T19:49:00.000Z,how-to,configuration,0.65,True,"Explains how to create and optimize Genie Code skills. Such pages typically define skill configuration parameters and behavior specific to this product, fitting configuration/integrations; configuration is the closest match.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/genie-code/tips,Improve Genie Code responses,Tips to improve Genie Code responses - Azure Databricks,Apply practical tips to improve Genie Code responses,Learn about best practices for using Genie Code and some tips to help Genie Code provide better responses.,"This page provides general tips and best practices to help Genie Code provide better responses. In addition to the tips on this page, you can tailor Genie Code to your needs in the following ways:",2026-04-17T18:03:00.000Z,best-practice,best-practices,0.65,True,"Explicitly described as tips and best practices for using Genie Code. While summary is brief, this type of page typically contains concrete, product-specific DOs/DON’Ts and interaction patterns unique to Genie Code.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/genie-code/use-genie-code,Use Genie Code,Use Genie Code - Azure Databricks,,"Learn how to enable or disable Genie Code and how Assistant can help you generate, understand, and troubleshoot code.","Genie Code is a context-aware AI assistant that helps you with data work in Databricks notebooks, SQL editor, jobs, AI/BI dashboards, file editor, and more. It is capable of generating, optimizing, explaining, and fixing code and queries. Use the Genie Code chat to ask for help and use Agent mode to let Genie Code autonomously work on complex multi-step tasks. Note Genie Code defaults to using Databricks-hosted AI models if you disable Partner-powered AI features. 
To learn how Genie Code helps wit",2026-04-16T17:44:00.000Z,how-to,,0.3,False,"Usage overview for Genie Code (enable/disable, general capabilities) but summary does not show detailed configuration tables, limits, or error-code-based troubleshooting.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/genie/,Genie space overview,What is a Genie space - Azure Databricks,,Learn how Genie spaces are used to explore data through a natural language chat interface.,"This page introduces Genie, an Azure Databricks feature that allows business teams to interact with their data using natural language. It uses generative AI tailored to your organization's terminology and data, with the ability to monitor and refine its performance through user feedback.",2026-04-22T17:34:00.000Z,landing-page,,0.1,False,"Introductory overview of Genie spaces and natural language data exploration; no detailed configuration parameters, limits, or decision matrices.",updated https://learn.microsoft.com/en-us/azure/databricks/genie/agent-mode,Agent mode,Agent mode in Genie spaces - Azure Databricks,,"Use Agent mode in Genie spaces to answer both straightforward data questions and complex, exploratory questions with multi-step reasoning.","Note Agent mode was formerly known as Research Agent. Agent mode extends Genie's capabilities to answer both straightforward data questions and complex business questions. It uses multi-step reasoning and hypothesis testing to uncover deeper insights. When you ask a question, Agent mode creates and refines a research plan, running multiple SQL queries, learning from each result, and iterating until it has enough evidence to provide a comprehensive answer. 
Unlike standard Genie queries, Agent mod",2026-03-18T08:00:00.000Z,concept-article,,0.4,False,"Describes Agent mode behavior and capabilities; appears conceptual/behavioral rather than listing config parameters, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/genie/benchmarks,Benchmarks,Use benchmarks in a Genie space - Azure Databricks,,Learn how to use benchmarks to evaluate and maintain Genie spaces systematically.,This page explains how to use benchmarks to evaluate the accuracy of your Genie space.,2026-01-21T08:00:00.000Z,how-to,,0.4,False,"Explains using benchmarks to evaluate Genie spaces; likely methodology and workflow rather than product-specific limits, configs, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/genie/best-practices,Best practices,Curate an effective Genie space - Azure Databricks,Apply best practices for curating Genie spaces,Learn best practices for creating and maintaining Genie spaces.,"The goal of curating a Genie space is to create an environment where business users can pose natural language questions and receive accurate, consistent answers based on their data. Genie spaces use advanced models that generate sophisticated queries and understand general world knowledge. 
Most business questions are domain-specific, so a space curator's role is to bridge the gap between that general world knowledge and the specialized language used in a specific domain or by a particular compan",2026-04-09T08:00:00.000Z,best-practice,best-practices,0.7,True,"Explicitly a best-practices article for creating and maintaining Genie spaces; likely includes product-specific guidance and gotchas for achieving accurate, consistent answers beyond generic LLM advice.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/genie/conversation-api,Use the Genie API,Use the Genie API to integrate Genie into your applications - Azure Databricks,Integrate Genie natural language querying via API,"Use the Genie API to integrate natural language data querying into applications, chatbots, and AI agent frameworks.","This page explains how to use the Genie API to enable Genie capabilities in your own chatbot, agent, or application.",2026-04-16T08:00:00.000Z,tutorial,integrations,0.7,True,"API integration page for embedding Genie into applications; likely includes endpoint URLs, request/response schemas, and parameter references that are product-specific integration details.",updated +https://learn.microsoft.com/en-us/azure/databricks/genie/conversation-api,Use the Genie API,Use the Genie API to integrate Genie into your applications - Azure Databricks,Integrate Genie via the conversation API into applications,"Use the Genie API to integrate natural language data querying into applications, chatbots, and AI agent frameworks.","This page explains how to use the Genie API to enable Genie capabilities in your own chatbot, agent, or application.",2026-04-21T08:00:00.000Z,tutorial,integrations,0.7,True,"Describes using the Genie API in agents, chatbots, and apps. 
An API reference-style page will include endpoint/parameter details and request/response structures that are product-specific integration patterns.",updated https://learn.microsoft.com/en-us/azure/databricks/genie/embed,Embed a Genie space,Embed a Genie space - Azure Databricks,,Learn how to embed a Genie space as an iframe in an external website or application.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. You can embed a Genie space as an iframe in an external website or application. This lets business users interact with Genie directly within your internal tools or portals without navigating to Azure Databricks.,2026-04-09T17:57:00.000Z,how-to,,0.4,False,"Embedding a Genie space as an iframe is probably a short how-to with basic embed code; unlikely to include detailed config matrices, limits, or security role tables.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/genie/file-upload,Upload files,Upload a file - Azure Databricks,,Learn how to upload small data files to a Genie space.,Important This feature is in Public Preview. This article explains how to upload CSV and Excel files directly into a Genie space for analysis using natural language and in combination with other tables in the space.,2026-04-14T17:47:00.000Z,how-to,,0.4,False,"Covers uploading small CSV/Excel files to a Genie space; while there may be size limits, the summary does not indicate explicit numeric quotas or detailed configuration tables.",updated -https://learn.microsoft.com/en-us/azure/databricks/genie/knowledge-store,Build a knowledge store,Build a knowledge store for more reliable Genie spaces - Azure Databricks,,Build a knowledge store and use prompt matching to improve Genie's accuracy and understanding of your data.,"The Genie knowledge store allows you to curate and enhance your space through localized metadata, prompt matching, and structured SQL instructions. 
These features help Genie understand your data and generate more accurate and relevant responses.",2026-04-17T08:00:00.000Z,how-to,,0.5,False,"Describes building a knowledge store and using prompt matching; summary suggests conceptual and procedural guidance, not detailed limits, config matrices, or error mappings.",updated -https://learn.microsoft.com/en-us/azure/databricks/genie/set-up,Set up a Genie space,Set up and manage a Genie space - Azure Databricks,,Learn how to set up and manage Genie spaces.,"This article explains how to set up and manage a Genie space, a chat interface for business users to ask natural-language questions about their data.",2026-04-17T08:00:00.000Z,how-to,,0.4,False,Explains how to set up and manage a Genie space; likely a setup/tutorial flow without detailed config parameter tables or numeric thresholds.,updated -https://learn.microsoft.com/en-us/azure/databricks/genie/talk-to-genie,Use Genie to explore data,Use a Genie space to explore business data - Azure Databricks,,Learn how to use a Genie space to ask natural language questions about your data and provide feedback to authors to refine and enhance existing Genie spaces.,"This page explains how business teams can use, test, and provide feedback on Genie spaces—no-code, self-service chat interfaces for asking questions about your company data. To follow the provided guidance, a Genie space author must have already set up a Genie space. For information about creating a Genie space, see Set up and manage a Genie space.",2026-03-18T22:49:00.000Z,how-to,,0.2,False,"Describes how business users can interact with Genie spaces and provide feedback. This is end-user usage guidance, not expert-level configuration, limits, or troubleshooting content.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/genie/file-upload,Upload files,Upload a file - Azure Databricks,,Learn how to upload small data files to a Genie space.,Important This feature is in Public Preview. 
This article explains how to upload CSV and Excel files directly into a Genie space for analysis using natural language and in combination with other tables in the space.,2026-04-14T17:47:00.000Z,how-to,,0.4,False,"Covers uploading small CSV/Excel files to a Genie space; while there may be size limits, the summary does not indicate explicit numeric quotas or detailed configuration tables.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/genie/knowledge-store,Build a knowledge store,Build a knowledge store for more reliable Genie spaces - Azure Databricks,Design effective Genie knowledge stores for accurate answers,Build a knowledge store and use prompt matching to improve Genie's accuracy and understanding of your data.,"The Genie knowledge store allows you to curate and enhance your space through localized metadata, prompt matching, and structured SQL instructions. These features help Genie understand your data and generate more accurate and relevant responses.",2026-04-20T08:00:00.000Z,how-to,best-practices,0.64,True,"Focuses on how to build and curate a Genie knowledge store using localized metadata, prompt matching, and SQL instructions. This implies product-specific guidance and patterns to improve accuracy—actionable DO/DON'T style recommendations rather than just concepts.",updated +https://learn.microsoft.com/en-us/azure/databricks/genie/set-up,Set up a Genie space,Set up and manage a Genie space - Azure Databricks,Configure and manage Genie spaces in Azure Databricks,Learn how to set up and manage Genie spaces.,"This article explains how to set up and manage a Genie space, a chat interface for business users to ask natural-language questions about their data.",2026-04-22T08:00:00.000Z,how-to,configuration,0.63,True,"Covers how to set up and manage Genie spaces with product-specific options and settings. 
Likely includes concrete configuration steps and parameters (space settings, permissions, data connections) beyond generic concepts.",updated +https://learn.microsoft.com/en-us/azure/databricks/genie/talk-to-genie,Use Genie to explore data,Use a Genie space to explore business data - Azure Databricks,,Learn how to use a Genie space to ask natural language questions about your data and provide feedback to authors to refine and enhance existing Genie spaces.,"This page explains how business teams can use, test, and provide feedback on Genie spaces—no-code, self-service chat interfaces for asking questions about your company data. To follow the provided guidance, a Genie space author must have already set up a Genie space. For information about creating a Genie space, see Set up and manage a Genie space.",2026-04-22T08:00:00.000Z,how-to,,0.3,False,"Explains how business users interact with Genie spaces and provide feedback; primarily usage guidance without detailed configuration parameters, limits, or troubleshooting mappings.",updated https://learn.microsoft.com/en-us/azure/databricks/genie/troubleshooting,Troubleshooting,Troubleshoot Genie spaces - Azure Databricks,Troubleshoot common Azure Databricks Genie issues,Troubleshoot common issues when creating and maintaining Genie spaces.,This page outlines how to resolve common problems when creating and maintaining Genie spaces.,2026-04-01T08:00:00.000Z,troubleshooting,troubleshooting,0.7,True,"Dedicated troubleshooting page for Genie spaces; such pages typically map specific symptoms and errors to causes and resolutions, which are product-specific expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/getting-started/,Get started on Databricks,Get started tutorials on Azure Databricks - Azure Databricks,,Learn how to get started using Databricks.,"The tutorials in this section introduce core features and guide you through the basics of working with the Azure Databricks platform. 
For information about online training resources, see Get free Databricks training. If you do not have an Azure Databricks account, sign up for a free trial.",2026-03-24T22:15:00.000Z,get-started,,0.1,False,"Navigation hub for getting-started tutorials; no detailed technical guidance, limits, or configuration references.",unchanged https://learn.microsoft.com/en-us/azure/databricks/getting-started/analyst-get-started,Explore data in Databricks One,Tutorial: Explore dashboards and query data with Genie in Databricks One - Azure Databricks,,"Get started as a business analyst using Databricks One. View dashboards, ask natural language data questions with Genie, and discover the assets shared with you.","This tutorial introduces Databricks One, a simplified Azure Databricks interface for business users. This tutorial walks you through:",2026-02-25T23:30:00.000Z,get-started,,0.35,False,"Business analyst tutorial for dashboards and Genie; usage-focused, not deep configuration or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/databricks/getting-started/architecture,Overview,Architecture - Azure Databricks,,Learn about Azure Databricks architecture concepts including platform fundamentals and lakehouse design patterns.,Learn about Azure Databricks architecture concepts including platform fundamentals and lakehouse design patterns.,2026-01-19T08:00:00.000Z,get-started,,0.3,False,General architecture concepts and lakehouse design patterns; likely high-level and conceptual without product-specific numeric thresholds or decision matrices.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/getting-started/best-practices,Databricks best practice articles,Best practice articles - Azure Databricks,,Explore best practice articles to help you make the most out of Azure Databricks.,This article provides a reference of best practice articles you can use to optimize your Azure Databricks activity. 
The Azure Databricks documentation includes a number of best practices articles to help you get the best performance at the lowest cost when using and administering Azure Databricks.,2026-04-15T22:08:00.000Z,get-started,,0.2,False,"Index page listing best practice articles, but itself does not contain concrete recommendations, configs, or quantified guidance.",updated +https://learn.microsoft.com/en-us/azure/databricks/getting-started/best-practices,Databricks best practice articles,Best practice - Azure Databricks,,Explore best practice articles to help you make the most out of Azure Databricks.,This page links to best practice articles for Azure Databricks.,2026-04-21T22:28:00.000Z,get-started,,0.1,False,"Acts as a navigation hub linking to best practice articles rather than containing the concrete, product-specific recommendations itself.",updated https://learn.microsoft.com/en-us/azure/databricks/getting-started/ce-migration,Migrate Community Edition workspaces to Free Edition,Migrate Community Edition workspaces to Free Edition - Azure Databricks,Migrate Databricks Community Edition to Free Edition,Learn how to migrate your Community Edition workspace to Databricks Free Edition using the automated migration tool.,"Important This feature is in Public Preview. With the release of Databricks Free Edition, Community Edition (CE) will soon be retired. Community Edition workspace owners should use the workspace migration tool to migrate to Free Edition as soon as possible. 
For a feature comparison between Community Edition and Free Edition, see Feature comparison.",2026-03-27T08:00:00.000Z,get-started,decision-making,0.65,True,Migration-focused page that discusses moving from Community Edition to Free Edition and references a feature comparison; contains product-specific migration considerations and upgrade path guidance.,unchanged https://learn.microsoft.com/en-us/azure/databricks/getting-started/concepts,Databricks components,Azure Databricks components - Azure Databricks,,"Learn fundamental Azure Databricks components such as workspaces, data objects, clusters, machine learning models, and access.",This article introduces fundamental components you need to understand in order to use Azure Databricks effectively.,2026-04-06T08:00:00.000Z,get-started,,0.1,False,"Fundamental conceptual overview of Azure Databricks components (workspaces, clusters, data objects, etc.) without product-specific limits, configuration tables, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/getting-started/connect/,Integrations,Azure Databricks integrations overview - Azure Databricks,,"Learn how to connect Azure Databricks to data sources, BI, and developer tools.","Databricks provides integrations for connecting to an assortment of data sources and BI tools. 
In addition, with Databricks developer tools you can connect to Databricks from your local machine to automate workflows, build custom solutions, and more.",2026-03-24T22:15:00.000Z,get-started,,0.2,False,"Integrations overview that lists categories of connections and tools but does not expose detailed configuration parameters, limits, or SDK reference tables.",unchanged https://learn.microsoft.com/en-us/azure/databricks/getting-started/create-table,Create a table,Tutorial: Create your first table and grant privileges - Azure Databricks,,Learn how to create a table and assign permissions in Azure Databricks.,"This tutorial provides a quick walkthrough of creating a table and granting privileges in Azure Databricks using the Unity Catalog data governance model. As of November 9, 2023, workspaces in new accounts are automatically enabled for Unity Catalog and include the permissions required for all users to complete this tutorial. If you are unsure if your workspace is enabled for Unity Catalog, see Get started with Unity Catalog. If you would like to familiarize yourself with Unity Catalog data object",2026-03-24T08:00:00.000Z,get-started,,0.35,False,"Tutorial on creating a table and granting privileges; primarily step-by-step usage of Unity Catalog, not a comprehensive security or configuration reference with role matrices or parameter tables.",unchanged https://learn.microsoft.com/en-us/azure/databricks/getting-started/data-pipeline-get-started,Build an ETL pipeline (Lakeflow SDP),Tutorial: Build an ETL pipeline with Lakeflow Spark Declarative Pipelines - Azure Databricks,,"Learn how to create and deploy an ETL (extract, transform, and load) pipeline with Lakeflow Spark Declarative Pipelines.","This tutorial explains how to create and deploy an ETL (extract, transform, and load) pipeline for data orchestration using Lakeflow Spark Declarative Pipelines and Auto Loader. 
An ETL pipeline implements the steps to read data from source systems, transform that data based on requirements, such as data quality checks and record de-duplication, and write the data to a target system, such as a data warehouse or a data lake. In this tutorial, you will use pipelines and Auto Loader to: For more inf",2026-01-30T08:00:00.000Z,get-started,,0.35,False,ETL pipeline tutorial using Lakeflow Spark Declarative Pipelines; focuses on workflow steps rather than configuration matrices or limits.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/getting-started/dataframes,Tutorial: Load and transform data using DataFrames,Tutorial: Load and transform data using Apache Spark DataFrames - Azure Databricks,,"Learn how to load and transform data using the Apache Spark Python (PySpark) DataFrame API, the Apache Spark Scala DataFrame API, and the SparkR SparkDataFrame API in Azure Databricks.","This tutorial shows you how to load and transform data using the Apache Spark Python (PySpark) DataFrame API, the Apache Spark Scala DataFrame API, and the SparkR SparkDataFrame API in Azure Databricks. Note If you are using Databricks Free Edition, select the Python tab for all code examples in this tutorial. Free Edition does not support R or Scala. Additionally, Free Edition restricts outbound internet access, so you must upload the CSV file using the workspace UI instead of downloading it with ",2026-03-31T08:00:00.000Z,get-started,,0.4,False,Step-by-step tutorial for DataFrames; mostly generic Spark usage. 
Free Edition notes are environment constraints but not detailed limits/config matrices.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/getting-started/dataframes,Tutorial: Load and transform data using DataFrames,Tutorial: Load and transform data using Apache Spark DataFrames - Azure Databricks,,"Learn how to load and transform data using the Apache Spark Python (PySpark) DataFrame API, the Apache Spark Scala DataFrame API, and the SparkR SparkDataFrame API in Azure Databricks.","This tutorial shows you how to load and transform data using the Apache Spark Python (PySpark) DataFrame API, the Apache Spark Scala DataFrame API, and the SparkR SparkDataFrame API in Azure Databricks. Note If you are using Databricks Free Edition, select the Python tab for all code examples in this tutorial. Free Edition does not support R or Scala. Additionally, Free Edition restricts outbound internet access, so you must upload the CSV file using the workspace UI instead of downloading it with ",2026-04-23T08:00:00.000Z,get-started,,0.2,False,"Tutorial-style DataFrame usage with basic load/transform examples; no product-specific limits, configuration tables, or specialized patterns beyond standard Spark APIs.",updated https://learn.microsoft.com/en-us/azure/databricks/getting-started/etl-quick-start,Build an ETL pipeline (Apache Spark),Tutorial: Build an ETL pipeline with Apache Spark on the Databricks platform - Azure Databricks,,"Learn how to create and deploy an ETL (extract, transform, and load) pipeline with Apache Spark on the Databricks platform.","This tutorial shows you how to develop and deploy your first ETL (extract, transform, and load) pipeline for data orchestration with Apache Spark. Although this tutorial uses Databricks all-purpose compute, you can also use serverless compute if it's enabled for your workspace. You can also use Lakeflow Spark Declarative Pipelines to build ETL pipelines. 
Databricks Lakeflow Spark Declarative Pipelines reduces the complexity of building, deploying, and maintaining production ETL pipelines. See Tut",2026-04-08T22:17:00.000Z,get-started,,0.1,False,"Step-by-step ETL tutorial; focuses on how to build a first pipeline, not on detailed configuration matrices, limits, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/getting-started/free-edition,Sign up for Databricks Free Edition,Sign up for Databricks Free Edition - Azure Databricks,,Learn how to sign up for Databricks Free Edition and start using Azure Databricks today.,"If you're interested in Databricks for your personal use, sign up for Databricks Free Edition by going to the Databricks Free Edition signup page. Choose your preferred signup method and then Databricks will create a new workspace for you to use.",2026-03-10T23:23:00.000Z,get-started,,0.3,False,"Signup instructions for Free Edition; procedural, not configuration/limits/troubleshooting focused.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/getting-started/free-edition-limitations,Free Edition limitations,Databricks Free Edition limitations - Azure Databricks,Review Azure Databricks Free Edition usage limits,Learn about the limitations of Databricks Free Edition accounts.,"Note Databricks Free Edition is a no-cost offering and does not include guaranteed reliability, support, or service-level agreements. Free Edition accounts include many of the latest Databricks releases, but some features may be limited or unavailable. If you would like access to the full Databricks platform, you must create a new Databricks account by starting a free trial. Databricks Free Edition is available at no cost to all qualified users. 
To keep it accessible to all, each account is subje",2026-04-14T08:00:00.000Z,get-started,limits-quotas,0.9,True,"Page specifically describes limitations of Databricks Free Edition; given the context ('each account is subje...') it almost certainly lists concrete numeric limits/quotas (for example, max clusters, concurrent users, or usage caps), which are product- and tier-specific values that an LLM would not reliably know from training.",updated +https://learn.microsoft.com/en-us/azure/databricks/getting-started/free-edition-limitations,Free Edition limitations,Databricks Free Edition limitations - Azure Databricks,Review Azure Databricks Free Edition usage limits,Learn about the limitations of Databricks Free Edition accounts.,"Note Databricks Free Edition is a no-cost offering and does not include guaranteed reliability, support, or service-level agreements. Free Edition accounts include many of the latest Databricks releases, but some features may be limited or unavailable. If you would like access to the full Databricks platform, you must create a new Databricks account by starting a free trial. Databricks Free Edition is available at no cost to all qualified users. To keep it accessible to all, each account is subje",2026-04-14T08:00:00.000Z,get-started,limits-quotas,0.9,True,"Page specifically describes limitations of Databricks Free Edition; given the context ('each account is subje...') it almost certainly lists concrete numeric limits/quotas (for example, max clusters, concurrent users, or usage caps), which are product- and tier-specific values that an LLM would not reliably know from training.",unchanged https://learn.microsoft.com/en-us/azure/databricks/getting-started/free-training,Free training,Get free Databricks training - Azure Databricks,,Access free training for Azure Databricks.,"As a customer, you have access to all Databricks free customer training offerings. 
These offerings include courses, recorded webinars, and quarterly product roadmap webinars. Access the material from your Databricks workspace account, or create an account to access the free training. Databricks Academy. Azure Databricks technical documentation has many tutorials and information that can help you get up to speed on the platform. The Knowledge Base provides troubleshooting tips and answers to frequent",2026-01-19T08:00:00.000Z,get-started,,0.1,False,"Describes access to training resources; no technical limits, configs, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/getting-started/free-trial,Sign up for a free trial,Sign up for Azure Databricks for free - Azure Databricks,,Learn how to set up a Databricks free trial.,"This page explains how to create your Azure Databricks account and start your free trial. With the free trial, you are eligible to receive credit for free Databricks usage, valid for 14 days after you start your trial.",2026-03-31T08:00:00.000Z,get-started,,0.3,False,"Step-by-step sign-up and trial activation; does not focus on limits, configuration matrices, or decision criteria beyond basic eligibility and duration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/getting-started/free-trial-vs-free-edition,Choose between the free trial and Free Edition,Sign up for Azure Databricks - Azure Databricks,Choose between Databricks Free Edition and free trial,Compare Azure Databricks Free Edition to a free trial.,Azure Databricks offers two no-cost options for getting started with the platform. 
This page helps you understand the differences and choose the right option for your needs:,2026-03-16T08:00:00.000Z,get-started,decision-making,0.8,True,"Explicitly compares two no-cost options and helps users choose; likely includes a feature/benefit comparison table and scenario-based guidance, which is product-specific decision-making content.",unchanged @@ -1032,51 +1053,51 @@ https://learn.microsoft.com/en-us/azure/databricks/getting-started/gen-ai-llm-ag https://learn.microsoft.com/en-us/azure/databricks/getting-started/genie-code-customer-segmentation,Segment customers with Genie Code,Tutorial: Customer segmentation with Genie Code - Azure Databricks,,"Learn how to use Genie Code to run end-to-end customer segmentation in a Databricks notebook, from data profiling to K-means clustering.","In this tutorial, you use Genie Code to run end-to-end customer segmentation directly inside a Databricks notebook. Starting from a raw marketing campaign data set, Genie Code handles data profiling, feature engineering, K-means clustering, and persona generation — all from a single prompt.",2026-03-31T08:00:00.000Z,tutorial,,0.35,False,Tutorial using Genie Code for customer segmentation; focuses on end-to-end example rather than detailed configuration parameters or limits for Genie Code.,unchanged https://learn.microsoft.com/en-us/azure/databricks/getting-started/high-level-architecture,High-level architecture,High-level architecture - Azure Databricks,,"Get a high-level overview of Azure Databricks platform architecture, including control plane, compute plane, and storage components.","This article provides a high-level overview of Azure Databricks architecture, including its enterprise architecture, in combination with Azure.",2026-03-16T08:00:00.000Z,get-started,,0.3,False,"High-level overview of Azure Databricks architecture; conceptual enterprise architecture description without detailed limits, configs, or troubleshooting mappings.",unchanged 
https://learn.microsoft.com/en-us/azure/databricks/getting-started/import-visualize-data,Import and visualize CSV data from a notebook,Tutorial: Import and visualize CSV data from a notebook - Azure Databricks,,"Learn to use a Databricks notebook to import a CSV file into Unity Catalog, load data into a DataFrame, and visualize data by using Python, Scala, and R.","This tutorial walks you through using an Azure Databricks notebook to import data from a CSV file containing baby name data from health.data.ny.gov into your Unity Catalog volume using Python, Scala, and R. You also learn to modify a column name, visualize the data, and save to a table. Note If you are using Databricks Free Edition, select the Python tab for all code examples in this tutorial. Free Edition does not support R or Scala. Additionally, Free Edition restricts outbound internet access, so y",2026-03-31T08:00:00.000Z,get-started,,0.35,False,"Hands-on tutorial for importing and visualizing CSV data; while it mentions Free Edition restrictions, the focus is procedural, not a structured limits table or configuration reference.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/getting-started/ml-get-started,Train and deploy an ML model,Tutorial: Build your first machine learning model on Azure Databricks - Azure Databricks,,Learn how to build a simple machine learning classification model on Databricks using the scikit-learn library.,"This tutorial shows you how to build a machine learning classification model using the scikit-learn library on Azure Databricks. The goal is to create a classification model to predict whether a wine is considered “high-quality”. The dataset consists of 11 features of different wines (for example, alcohol content, acidity, and residual sugar) and a quality ranking between 1 and 10. 
This example also illustrates the use of MLflow to track the model development process, and Hyperopt to automate hyperpara",2026-03-25T08:00:00.000Z,get-started,,0.3,False,"ML tutorial using scikit-learn and MLflow; mainly generic ML workflow and Databricks usage, not Databricks-specific configuration or limits.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/getting-started/quick-start,Query and visualize data,Tutorial: Query and visualize data from a notebook - Azure Databricks,,"Learn data science basics on Azure Databricks. Using a notebook, query and visualize data stored in Unity Catalog by using SQL, Python, Scala, and R.","This tutorial walks you through using an Azure Databricks notebook to query sample data stored in Unity Catalog using SQL, Python, Scala, and R and then visualize the query results in the notebook.",2025-10-24T20:03:00.000Z,get-started,,0.3,False,"Intro tutorial on querying and visualizing data; shows basic usage, not product-specific configs, limits, or troubleshooting.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/getting-started/ml-get-started,Train and deploy an ML model,Get started: Build your first machine learning model on Databricks - Azure Databricks,,Learn how to build a simple machine learning classification model on Databricks using the scikit-learn library with Optuna.,"Open notebook version of this page This example notebook illustrates how to train a machine learning classification model on Databricks. Databricks Runtime for Machine Learning comes with many libraries pre-installed, including scikit-learn for training and pre-processing algorithms, MLflow to track the model development process, and Optuna to scale hyperparameter tuning. In this notebook, you create a classification model to predict whether a wine is considered ""high-quality"". 
The dataset consi",2026-04-20T22:01:00.000Z,get-started,,0.2,False,"Intro ML tutorial using scikit-learn and Optuna on Databricks; focuses on basic model training workflow, not on Databricks-specific limits, configs, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/getting-started/quick-start,Query and visualize data,Tutorial: Query and visualize data from a notebook - Azure Databricks,,"Learn data science basics on Azure Databricks. Using a notebook, query and visualize data stored in Unity Catalog by using SQL, Python, Scala, and R.","This tutorial walks you through using an Azure Databricks notebook to query sample data stored in Unity Catalog using SQL, Python, Scala, and R and then visualize the query results in the notebook. Tip Tell Genie Code (Agent mode) to do this for you:",2026-04-24T18:05:00.000Z,get-started,,0.2,False,"Tutorial-style quickstart for querying and visualizing data in a Databricks notebook; no indication of product-specific limits, configs, error codes, or decision matrices beyond generic how-to steps.",updated https://learn.microsoft.com/en-us/azure/databricks/guides/,Data guides overview,Data guides - Azure Databricks,,"Learn how to find, access, and work with data on the Databricks Data Intelligence Platform.","The Databricks Data Intelligence Platform enables data practitioners throughout your organization to collaborate and productionize data solutions using shared, securely governed data assets and tools. This page helps you identify the correct starting point for your use case. Many tasks on Azure Databricks require elevated permissions. Many organizations restrict these elevated permissions to a small number of users or teams. 
This page disambiguates actions that can be completed by most workspace",2026-03-31T08:00:00.000Z,landing-page,,0.25,False,"Navigation-style data guides hub that routes users to other content; does not itself contain detailed limits, configuration, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/iceberg/,Iceberg,What is Apache Iceberg in Azure Databricks? - Azure Databricks,,Learn about the Apache Iceberg table format and how it is supported on Azure Databricks.,"Important Unity Catalog-managed Iceberg tables are available in Public Preview in Databricks Runtime 16.4 LTS and above. Foreign Iceberg tables are also in Public Preview in Databricks Runtime 16.4 LTS and above. Iceberg v3 features are available in Public Preview in Databricks Runtime 18.0 and above. See Use Apache Iceberg v3 features. Apache Iceberg is an open-source table format for analytics workloads. It supports features like schema evolution, time travel, and hidden partitioning. Like Delta Lak",2026-04-15T19:49:00.000Z,concept-article,,0.3,False,"High-level 'What is Apache Iceberg' overview and support statement; primarily conceptual description of the table format and preview status without detailed limits, configs, or decision matrices.",updated -https://learn.microsoft.com/en-us/azure/databricks/iceberg/iceberg-v3,Iceberg v3,Use Apache Iceberg v3 features - Azure Databricks,Configure and use Apache Iceberg v3 features in Unity Catalog,"Learn how to use Apache Iceberg v3 features with Unity Catalog to enhance query performance and unlock new capabilities for managed Delta tables with UniForm, managed Iceberg, and foreign Iceberg tabl","Important Iceberg v3 features are in Public Preview. This page describes how to use Apache Iceberg v3 features with Unity Catalog. Iceberg v3 enhances query performance and introduces new features for managed Delta Lake tables with UniForm, managed Iceberg tables, and foreign Iceberg tables. 
The key features of Iceberg v3 are: Row lineage is required for all Iceberg v3 tables. There's no command to enable or disable row lineage.",2026-04-10T08:00:00.000Z,concept-article,configuration,0.7,True,"Describes how to use Iceberg v3 features with Unity Catalog, including table-level feature requirements (row lineage) and likely specific configuration/DDL options unique to Databricks’ Iceberg v3 implementation.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/iceberg/,Iceberg,What is Apache Iceberg in Azure Databricks? - Azure Databricks,,Learn about the Apache Iceberg table format and how it is supported on Azure Databricks.,"Important Unity Catalog-managed Iceberg tables are available in Public Preview in Databricks Runtime 16.4 LTS and above. Foreign Iceberg tables are also in Public Preview in Databricks Runtime 16.4 LTS and above. Iceberg v3 features are available in Public Preview in Databricks Runtime 18.0 and above. See Use Apache Iceberg v3 features. Apache Iceberg is an open-source table format for analytics workloads. It supports features like schema evolution, time travel, and hidden partitioning. Like Delta Lak",2026-04-21T08:00:00.000Z,concept-article,,0.2,False,"High-level explanation of Apache Iceberg support in Azure Databricks; summary indicates conceptual overview of features (schema evolution, time travel, hidden partitioning) without product-specific limits, configuration tables, or detailed patterns.",updated +https://learn.microsoft.com/en-us/azure/databricks/iceberg/iceberg-v3,Iceberg v3,Use Apache Iceberg v3 features - Azure Databricks,Configure and use Apache Iceberg v3 features in Unity Catalog,"Learn how to use Apache Iceberg v3 features with Unity Catalog to enhance query performance and unlock new capabilities for managed Delta tables with UniForm, managed Iceberg, and foreign Iceberg tabl","Important Iceberg v3 features are in Public Preview. This page describes how to use Apache Iceberg v3 features with Unity Catalog. 
Iceberg v3 enhances query performance and introduces new features for managed Delta Lake tables with UniForm, managed Iceberg tables, and foreign Iceberg tables. The key features of Iceberg v3 are: Row lineage is required for all Iceberg v3 tables. There's no command to enable or disable row lineage.",2026-04-24T08:00:00.000Z,concept-article,configuration,0.65,True,"Describes how to use Iceberg v3 features with Unity Catalog, including requirements like mandatory row lineage and likely specific table properties/DDL needed to enable v3 behavior. This is product-specific configuration knowledge beyond generic Iceberg concepts.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/,Concepts,Standard connectors in Lakeflow Connect - Azure Databricks,,"Learn about the standard connectors in Databricks Lakeflow Connect, which offer higher levels of ingestion pipeline customization compared to the managed connectors.","This page describes the standard connectors in Databricks Lakeflow Connect, which offer higher levels of ingestion pipeline customization compared to the managed connectors.",2026-03-11T08:00:00.000Z,overview,,0.2,False,"High-level description of standard connectors; appears to be overview/navigation without detailed limits, configs, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/,Overview,Ingest data from cloud object storage - Azure Databricks,Choose incremental ingestion options from cloud object storage,Learn about options for data ingestion from cloud object storage.,This article lists the ways you can configure incremental ingestion from cloud object storage.,2026-01-20T08:00:00.000Z,concept-article,decision-making,0.6,True,"Lists ways to configure incremental ingestion from cloud object storage; likely compares approaches (e.g., Auto Loader vs others) with criteria to help choose, which is decision guidance.",unchanged 
-https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/add-data-external-locations,Load data using a Unity Catalog external location,Load data using a Unity Catalog external location - Azure Databricks,Load data via Unity Catalog external locations in Databricks,Learn how to use the add data UI to create a managed table from a cloud object storage path that's defined as a Unity Catalog external location.,Important This feature is in Public Preview. This article describes how to use the add data UI to create a managed table from data in Azure Data Lake Storage using a Unity Catalog external location. An external location is an object that combines a cloud storage path with a storage credential that authorizes access to the cloud storage path.,2026-03-26T08:00:00.000Z,how-to,configuration,0.7,True,"Describes using the add data UI with Unity Catalog external locations; likely includes specific object types, required settings, and constraints for external locations.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/,Overview,What is Auto Loader? - Azure Databricks,,Learn how to ingest data to the lakehouse with Auto Loader.,Auto Loader incrementally and efficiently processes new data files as they arrive in cloud storage without any additional setup.,2026-04-14T21:45:00.000Z,concept-article,,0.2,False,"High-level overview of Auto Loader and its purpose; summary does not indicate detailed configuration tables, limits, or troubleshooting content.",updated +https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/add-data-external-locations,Load data using a Unity Catalog external location,Load data using a Unity Catalog external location - Azure Databricks,,Learn how to use the add data UI to create a managed table from a cloud object storage path that's defined as a Unity Catalog external location.,Important This feature is in Public Preview. 
This article describes how to use the add data UI to create a managed table from data in Azure Data Lake Storage using a Unity Catalog external location. An external location is an object that combines a cloud storage path with a storage credential that authorizes access to the cloud storage path.,2026-04-23T08:00:00.000Z,how-to,,0.2,False,How-to article using the add data UI; appears to be a step-by-step tutorial without detailed configuration parameter tables or numeric constraints.,updated +https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/,Overview,What is Auto Loader? - Azure Databricks,,Learn how to ingest data to the lakehouse with Auto Loader.,Auto Loader incrementally and efficiently processes new data files as they arrive in cloud storage without any additional setup.,2026-04-23T08:00:00.000Z,concept-article,,0.1,False,"High-level 'What is Auto Loader?' overview; summary indicates conceptual description of incremental ingestion, not detailed limits, configs, or troubleshooting.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/directory-listing-mode,Directory listing mode,Configure Auto Loader streams in directory listing mode - Azure Databricks,Configure Auto Loader directory listing mode for ingestion,Learn about features and optimizations when using Auto Loader in directory listing mode.,"This page describes how to configure Auto Loader streams to use directory listing mode to incrementally discover and ingest cloud data. Auto Loader uses directory listing mode by default. In directory listing mode, Auto Loader identifies new files by listing the input directory. Directory listing mode allows you to quickly start Auto Loader streams without any permission configurations other than access to your data on cloud storage. 
For best performance with directory listing mode, use Databric",2026-04-01T08:00:00.000Z,concept-article,configuration,0.8,True,Describes how to configure streams in directory listing mode and includes performance guidance; likely lists specific options and recommended settings.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/faq,FAQ,Auto Loader FAQ - Azure Databricks,Resolve common Databricks Auto Loader questions and issues,Find answers to commonly asked questions about Auto Loader.,Commonly asked questions about Databricks Auto Loader.,2026-03-31T08:00:00.000Z,faq,troubleshooting,0.7,True,"FAQ for Auto Loader; typically includes specific behaviors, limits, and resolutions to common issues, which are product-specific expert details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/file-detection-modes,Overview,Compare Auto Loader file detection modes - Azure Databricks,Select Auto Loader file detection mode for your workload,Compare directory listing and file notification modes for Auto Loader.,"Auto Loader supports two modes for detecting new files: directory listing and file notification. You can switch file discovery modes across stream restarts and still obtain exactly-once data processing guarantees. Note Auto Loader does not guarantee the order in which files are discovered or processed, regardless of file detection mode. Design your pipelines to handle out-of-order file arrivals. 
For guidance, see Handle out-of-order data.",2026-03-19T18:35:00.000Z,concept-article,decision-making,0.8,True,"Explicit comparison of directory listing vs file notification modes; helps choose between modes based on performance and scalability trade-offs, fitting decision-making criteria.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/file-events-explained,Auto Loader with file events overview,Auto Loader with file events overview - Azure Databricks,Use managed file events with Auto Loader,Learn how Auto Loader with file events works.,The cloudFiles.useManagedFileEvents option with Auto Loader enables efficient file discovery.,2026-03-19T08:00:00.000Z,concept-article,configuration,0.7,True,Explains cloudFiles.useManagedFileEvents option; this is a specific configuration flag with product-specific behavior.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/file-notification-mode,File notification mode,Configure Auto Loader streams in file notification mode - Azure Databricks,Configure Databricks Auto Loader in file notification mode,Configure Auto Loader to use file notification mode to incrementally discover and ingest cloud data.,"This page describes how to configure Auto Loader streams to use file notification mode to incrementally discover and ingest cloud data. In file notification mode, Auto Loader automatically sets up a notification service and queue service that subscribes to file events from the input directory. You can use file notifications to scale Auto Loader to ingest millions of files an hour. When compared to directory listing mode, file notification mode is faster and more scalable. Also, you can switch be",2026-04-09T08:00:00.000Z,how-to,configuration,0.8,True,"The page is about configuring Auto Loader streams in file notification mode, including how the notification and queue services are set up and how to switch between modes. 
This implies specific configuration parameters and mode-specific behavior for Databricks ingestion from cloud object storage, which are product-specific settings rather than generic streaming concepts, so it fits configuration with expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/migrating-to-file-events,Migrate to Auto Loader with file events,Migrate to Auto Loader with file events - Azure Databricks,Migrate existing Auto Loader streams to file events,Learn how to migrate to Auto Loader with file events.,"If you have existing Auto Loader streams that discover files using directory listing or classic notifications, you can migrate them to Auto Loader with file events.",2026-02-25T19:11:00.000Z,how-to,best-practices,0.68,True,"Migration guidance typically includes stepwise changes, required option updates, and edge-case handling when switching modes. These are product-specific operational patterns and gotchas, fitting best-practices.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/options,Options,Auto Loader options - Azure Databricks,Configure Auto Loader cloudFiles options in Databricks,"Reference documentation for Auto Loader and cloudFiles options, parameters, and keywords.",Configuration options specific to the cloudFiles source are prefixed with cloudFiles so that they are in a separate namespace from other Structured Streaming source options.,2026-04-14T08:00:00.000Z,reference,configuration,0.9,True,"Reference for Auto Loader/cloudFiles options implies detailed parameter names, allowed values, and defaults for this specific source, which matches configuration-type expert knowledge.",updated -https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/patterns,Examples,Common data loading patterns - Azure Databricks,Apply common Auto Loader data ingestion patterns,Learn common data loading patterns 
leveraging Auto Loader.,Auto Loader simplifies a number of common data ingestion tasks. This quick reference provides examples for several popular patterns.,2026-03-26T08:00:00.000Z,best-practice,best-practices,0.7,True,Quick reference of common patterns with concrete examples for Auto Loader; likely includes recommended code/config patterns and gotchas specific to this product.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/production,Configuring for production,Configure Auto Loader for production workloads - Azure Databricks,Configure Databricks Auto Loader for production,Learn how to configure Auto Loader for production workloads,Databricks recommends using Auto Loader in Lakeflow Spark Declarative Pipelines for incremental data ingestion. Lakeflow Spark Declarative Pipelines extends functionality in Apache Spark Structured Streaming and allows you to write just a few lines of declarative Python or SQL to deploy a production-quality data pipeline with: Databricks also recommends you follow the streaming best practices for running Auto Loader in production. 
See Production considerations for Structured Streaming.,2026-04-14T21:45:00.000Z,best-practice,best-practices,0.65,True,"Production configuration guidance for Auto Loader is typically product-specific (checkpointing, trigger intervals, schema evolution, scaling patterns) and goes beyond generic streaming theory, giving concrete recommendations for running Auto Loader at scale.",updated +https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/options,Options,Auto Loader options - Azure Databricks,Configure Auto Loader cloudFiles options in Databricks,"Reference documentation for Auto Loader and cloudFiles options, parameters, and keywords.",Configuration options specific to the cloudFiles source are prefixed with cloudFiles so that they are in a separate namespace from other Structured Streaming source options.,2026-04-14T08:00:00.000Z,reference,configuration,0.9,True,"Reference for Auto Loader/cloudFiles options implies detailed parameter names, allowed values, and defaults for this specific source, which matches configuration-type expert knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/patterns,Examples,Common data loading patterns - Azure Databricks,Apply Auto Loader data ingestion patterns in Databricks,Learn common data loading patterns leveraging Auto Loader.,Auto Loader simplifies a number of common data ingestion tasks. 
This quick reference provides examples for several popular patterns.,2026-04-23T08:00:00.000Z,best-practice,architecture-patterns,0.7,True,"Quick-reference of concrete Auto Loader ingestion patterns (incremental loads, schema evolution, directory layouts) with product-specific options and code snippets that guide when and how to use each pattern; this is actionable pattern guidance beyond generic concepts.",updated +https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/production,Configuring for production,Configure Auto Loader for production workloads - Azure Databricks,Configure Databricks Auto Loader for production,Learn how to configure Auto Loader for production workloads,Databricks recommends using Auto Loader in Lakeflow Spark Declarative Pipelines for incremental data ingestion. Lakeflow Spark Declarative Pipelines extends functionality in Apache Spark Structured Streaming and allows you to write just a few lines of declarative Python or SQL to deploy a production-quality data pipeline with: Databricks also recommends you follow the streaming best practices for running Auto Loader in production. 
See Production considerations for Structured Streaming.,2026-04-14T21:45:00.000Z,best-practice,best-practices,0.65,True,"Production configuration guidance for Auto Loader is typically product-specific (checkpointing, trigger intervals, schema evolution, scaling patterns) and goes beyond generic streaming theory, giving concrete recommendations for running Auto Loader at scale.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/schema,Schema inference and evolution,Configure schema inference and evolution in Auto Loader - Azure Databricks,Configure schema inference and evolution in Auto Loader,Learn how schema inference and evolution work in Auto Loader.,"You can configure Auto Loader to automatically detect the schema of loaded data, allowing you to initialize tables without explicitly declaring the data schema and evolve the table schema as new columns are introduced. This eliminates the need to manually track and apply schema changes over time. Auto Loader can also ""rescue"" data that was unexpected (for example, of differing data types) in a JSON blob column, which you can choose to view later using the semi-structured data access APIs. Auto Lo",2026-04-01T08:00:00.000Z,how-to,configuration,0.8,True,Explains how to configure schema inference/evolution and rescue data; likely includes specific Auto Loader options/parameters and their behaviors.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/type-widening,Automatic type widening,Automatic type widening with Auto Loader - Azure Databricks,Enable automatic type widening in Auto Loader,Learn how to use automatic type widening with Auto Loader schema evolution.,"Important This feature is in Public Preview in Databricks Runtime 16.4 and above. Auto Loader incrementally and efficiently processes new data files as they arrive in cloud storage. 
It also reduces pipeline maintenance by automatically handling complex schema changes. For example, you can configure Auto Loader to automatically detect the schema of loaded data, allowing you to initialize tables without explicitly declaring the data schema. You can also evolve the table schema as new columns are i",2026-04-02T22:35:00.000Z,concept-article,configuration,0.75,True,"Feature-specific configuration for automatic type widening; expected to include concrete options, behaviors, and constraints unique to Auto Loader.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/unity-catalog,Auto Loader with Unity Catalog,Using Auto Loader with Unity Catalog - Azure Databricks,Configure Auto Loader with Unity Catalog for secure ingestion,Use Auto Loader for incremental data ingestion from external locations or to tables managed by Unity Catalog.,"Auto Loader can securely ingest data from external locations configured with Unity Catalog. To learn more about securely connecting storage with Unity Catalog, see Connect to cloud object storage using Unity Catalog. Auto Loader relies on Structured Streaming for incremental processing; for recommendations and limitations see Using Unity Catalog with Structured Streaming. Note In Databricks Runtime 11.3 LTS and above, you can use Auto Loader with either standard or dedicated access modes (formerly",2025-02-18T19:07:00.000Z,how-to,best-practices,0.62,True,"Describes how to use Auto Loader with Unity Catalog, including references to Structured Streaming limitations and recommendations. 
This typically includes product-specific configuration guidance and gotchas (for example, access modes, catalog constraints), fitting best-practices.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/,Overview,Get started using COPY INTO to load data - Azure Databricks,,Use the COPY INTO SQL command to load data incrementally and idempotently from cloud storage into a Delta table in Databricks SQL.,"You can use the COPY INTO SQL command to load data from a file location into a Delta table. COPY INTO is retriable and idempotent — files in the source location that have already been loaded are skipped on subsequent runs. COPY INTO offers these capabilities: Note For a more scalable and robust file ingestion experience, Databricks recommends that SQL users use streaming tables. For more information, see Streaming tables. Warning COPY INTO respects the workspace setting for deletion vectors. If enabled",2026-04-17T21:49:00.000Z,concept-article,,0.2,False,"Intro/get-started page for COPY INTO; summary suggests conceptual and tutorial-style usage without explicit mention of detailed limits, config tables, or error mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/unity-catalog,Auto Loader with Unity Catalog,Using Auto Loader with Unity Catalog - Azure Databricks,,Use Auto Loader for incremental data ingestion from external locations or to tables managed by Unity Catalog.,"Auto Loader can securely ingest data from external locations configured with Unity Catalog. To learn more about securely connecting storage with Unity Catalog, see Connect to cloud object storage using Unity Catalog. Auto Loader relies on Structured Streaming for incremental processing; for recommendations and limitations see Using Unity Catalog with Structured Streaming. 
Note In Databricks Runtime 11.3 LTS and above, you can use Auto Loader with either standard or dedicated access modes (formerly",2026-04-21T18:07:00.000Z,how-to,,0.3,False,"Describes using Auto Loader with Unity Catalog and references other docs for recommendations/limitations; summary does not show concrete config tables, numeric limits, or error mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/,Overview,Get started using COPY INTO to load data - Azure Databricks,,Use the COPY INTO SQL command to load data incrementally and idempotently from cloud storage into a Delta table in Databricks SQL.,"You can use the COPY INTO SQL command to load data from a file location into a Delta table. COPY INTO is retriable and idempotent — files in the source location that have already been loaded are skipped on subsequent runs. COPY INTO offers these capabilities: Note For a more scalable and robust file ingestion experience, Databricks recommends that SQL users use streaming tables. For more information, see Streaming tables. Warning COPY INTO respects the workspace setting for deletion vectors. If enabled",2026-04-17T21:49:00.000Z,concept-article,,0.2,False,"Intro/get-started page for COPY INTO; summary suggests conceptual and tutorial-style usage without explicit mention of detailed limits, config tables, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/configure-data-access,Overview,Configure data access for ingestion - Azure Databricks,Configure secure ADLS access for Databricks ingestion,Learn how admins can configure access to cloud object storage so that users can load data into Azure Databricks.,This article describes how admin users can configure access to data in a container in Azure Data Lake Storage (ADLS) so that Azure Databricks users can load data from ADLS into a table in Azure Databricks. 
This article describes the following ways to configure secure access to source data:,2026-03-26T08:00:00.000Z,how-to,security,0.7,True,"Configuration-focused article describing concrete ways to configure secure access from Azure Databricks to ADLS containers. Likely includes storage-specific auth patterns, permission scopes, and configuration parameters unique to Databricks and ADLS rather than just conceptual security guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/examples,Overview,Common data loading patterns using COPY INTO - Azure Databricks,Apply common COPY INTO data loading patterns,Learn common patterns for using COPY INTO to load data to Delta Lake.,Learn common patterns for using COPY INTO to load data from file sources into Delta Lake. There are many options for using COPY INTO. You can also use temporary credentials with COPY INTO in combination with these patterns. See COPY INTO for a full reference of all options.,2024-09-10T17:49:00.000Z,sample,best-practices,0.7,True,"Provides common patterns for COPY INTO usage, which are concrete SQL patterns and option combinations tailored to Databricks, aligning with best-practices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/generate-temporary-credentials,Generate temporary credentials for ingestion,Generate temporary credentials for ingestion - Azure Databricks,Generate temporary ADLS credentials for Databricks ingestion,Learn how admins can generate temporary credentials to share with other Azure Databricks users so they can securely access data in cloud object storage for data ingestion tasks.,This article describes how to get credentials in your Azure storage account that have just enough access to read data in an Azure Data Lake Storage (ADLS) container.,2025-03-17T22:50:00.000Z,how-to,security,0.8,True,"Focuses on creating temporary credentials with least privilege. 
This involves specific permission scopes and security settings, which are security configuration expert knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/temporary-credentials,Use temporary credentials,Load data using COPY INTO with temporary credentials - Azure Databricks,Use temporary credentials with COPY INTO securely,Learn how to use temporary credentials with COPY INTO to load data files into Delta Lake.,"If your Azure Databricks cluster or SQL warehouse doesn't have permissions to read your source files, you can use temporary credentials to access data from external cloud object storage and load files into a Delta Lake table. Depending on how your organization manages your cloud security, you might need to ask a cloud administrator or power user to provide you with credentials. For more information, see Generate temporary credentials for ingestion.",2026-04-13T21:52:00.000Z,how-to,security,0.7,True,"Describes using temporary credentials for accessing external storage with COPY INTO; such content typically includes auth-related parameters, scopes, and security-specific configuration unique to Databricks ingestion.",updated +https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/temporary-credentials,Use temporary credentials,Load data using COPY INTO with temporary credentials - Azure Databricks,Use temporary credentials with COPY INTO securely,Learn how to use temporary credentials with COPY INTO to load data files into Delta Lake.,"If your Azure Databricks cluster or SQL warehouse doesn't have permissions to read your source files, you can use temporary credentials to access data from external cloud object storage and load files into a Delta Lake table. Depending on how your organization manages your cloud security, you might need to ask a cloud administrator or power user to provide you with credentials. 
For more information, see Generate temporary credentials for ingestion.",2026-04-13T21:52:00.000Z,how-to,security,0.7,True,"Describes using temporary credentials for accessing external storage with COPY INTO; such content typically includes auth-related parameters, scopes, and security-specific configuration unique to Databricks ingestion.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/tutorial-dbsql,Use compute with an instance profile,Load data using COPY INTO with a service principal - Azure Databricks,,Learn how to use COPY INTO to load data from cloud object storage into a table in Databricks SQL.,"This article describes how to use the COPY INTO command to load data from an Azure Data Lake Storage (ADLS) container in your Azure account into a table in Databricks SQL. The steps in this article assume that your admin has configured a SQL warehouse to use an Azure Databricks service principal so that you can access your source files in ADLS. If your admin configured a Unity Catalog external location with a storage credential, see Load data using COPY INTO with Unity Catalog volumes or external lo",2025-03-17T22:50:00.000Z,tutorial,,0.45,False,"A tutorial using COPY INTO with a service principal is mostly step-by-step. While it touches security, it is less likely to be a dense reference of roles/permissions beyond what other security pages cover.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/tutorial-notebook,Notebook tutorial,Tutorial: COPY INTO with Spark SQL - Azure Databricks,,Learn how to use the COPY INTO command to load JSON data from a Unity Catalog volume into a Delta table in your Databricks workspace.,"Databricks recommends that you use the COPY INTO command for incremental and bulk data loading for data sources that contain thousands of files. 
In this tutorial, you use the COPY INTO command to load JSON data from a Unity Catalog volume into a Delta table in your Azure Databricks workspace. You use the Wanderbricks sample dataset as the data source. For more advanced ingestion use cases, see What is Auto Loader?.",2026-04-15T19:49:00.000Z,tutorial,,0.2,False,"Tutorial notebook for COPY INTO; likely step-by-step example rather than structured configuration tables, limits, or troubleshooting matrices.",updated -https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/unity-catalog,Use a Unity Catalog volume or external location,Load data using COPY INTO with Unity Catalog volumes or external locations - Azure Databricks,Configure COPY INTO with Unity Catalog volumes,Learn how to use COPY INTO to load data into Unity Catalog managed or external tables from any source and file format supported by COPY INTO.,Learn how to use COPY INTO to ingest data to Unity Catalog managed or external tables from any source and file format supported by COPY INTO. Unity Catalog adds new options for configuring secure access to raw data. You can use Unity Catalog volumes or external locations to access data in cloud object storage. Databricks recommends using volumes to access files in cloud storage as part of the ingestion process using COPY INTO. For more information about recommendations for using volumes and externa,2026-03-26T08:00:00.000Z,how-to,configuration,0.65,True,"How-to for using COPY INTO with Unity Catalog volumes/external locations. 
COPY INTO and Unity Catalog typically require specific option names, table properties, and configuration parameters that are product-specific, going beyond generic SQL knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/tutorial-notebook,Notebook tutorial,Tutorial: COPY INTO with Spark SQL - Azure Databricks,,Learn how to use the COPY INTO command to load JSON data from a Unity Catalog volume into a Delta table in your Databricks workspace.,"Databricks recommends that you use the COPY INTO command for incremental and bulk data loading for data sources that contain thousands of files. In this tutorial, you use the COPY INTO command to load JSON data from a Unity Catalog volume into a Delta table in your Azure Databricks workspace. You use the Wanderbricks sample dataset as the data source. For more advanced ingestion use cases, see What is Auto Loader?.",2026-04-15T19:49:00.000Z,tutorial,,0.2,False,"Tutorial notebook for COPY INTO; likely step-by-step example rather than structured configuration tables, limits, or troubleshooting matrices.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/unity-catalog,Use a Unity Catalog volume or external location,Load data using COPY INTO with Unity Catalog volumes or external locations - Azure Databricks,Load ADLS data into Unity Catalog tables with COPY INTO,Learn how to use COPY INTO to load data into Unity Catalog managed tables from volumes or external locations.,"This article describes how to use the COPY INTO command to load data from an Azure Data Lake Storage (ADLS) container in your Azure account into a table in Databricks SQL. The steps in this article assume that your admin has configured a Unity Catalog volume or external location so that you can access your source files in ADLS. 
If your admin configured a compute resource to use a service principal, see Load data using COPY INTO with a service principal or Tutorial: COPY INTO with Spark SQL instead. If",2026-04-23T08:00:00.000Z,how-to,integrations,0.7,True,"Describes Databricks SQL COPY INTO usage with Unity Catalog volumes/external locations, including product-specific syntax, options, and configuration details for ADLS-backed tables; this is concrete integration and coding pattern guidance.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/onboard-data,Onboard data from ADLS,Set up incremental ingestion from Azure Data Lake Storage - Azure Databricks,Configure incremental ingestion from ADLS with Auto Loader,Set up incremental ingestion from Azure Data Lake Storage into Azure Databricks using Auto Loader and Lakeflow Declarative Pipelines. Securely access source data using a Unity Catalog volume or a Unit,"This article describes how to set up incremental data ingestion from Azure Data Lake Storage into Azure Databricks. You'll learn how to securely access source data in a cloud object storage location that corresponds with a Unity Catalog volume (recommended) or a Unity Catalog external location. Then, you'll learn how to ingest the data incrementally into a Unity Catalog managed table using Auto Loader with Lakeflow Spark Declarative Pipelines. 
Note To set up incremental ingestion in Databricks S",2026-03-26T08:00:00.000Z,how-to,configuration,0.65,True,Step-by-step setup for incremental ingestion including Unity Catalog volumes/external locations and Auto Loader settings; likely includes specific configuration parameters and values.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/create-or-modify-table,Create or modify a table using file upload,Create or modify a table using file upload - Azure Databricks,Create or modify Delta tables via file upload UI,Learn about uploading data and creating tables using the Create or modify a table using file upload page.,"The Create or modify a table using file upload page allows you to upload CSV, TSV, JSON, Avro, Parquet, or text files to create or overwrite a managed Delta Lake table. You can create managed Delta tables in Unity Catalog or in the Hive metastore. Note In addition, you can use the add data UI or COPY INTO to load files from cloud storage. Important You can use the UI to create a Delta table by importing small CSV, TSV, JSON, Avro, Parquet, or text files from your local machine.",2026-03-31T23:28:00.000Z,how-to,configuration,0.65,True,"Describes a specific Databricks UI feature with constraints (file types, small file size, managed Delta tables in Unity Catalog or Hive metastore). These are concrete product-specific configuration/usage details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/data-migration/,Overview,Migrate data to Delta Lake - Azure Databricks,Plan migration of existing data to Delta Lake on Databricks,Learn about ways to convert existing data to Delta Lake while migrating to Azure Databricks.,"Azure Databricks provides tools to simplify the migration of Parquet and Apache Iceberg data into Delta Lake. Databricks recommends storing data using Unity Catalog managed tables, but in-place conversion provides many of the same benefits without needing to fully rewrite all data. 
Databricks recommends using CLONE if the source system continues to receive updates during the migration.",2025-03-27T20:15:00.000Z,overview,decision-making,0.7,True,Discusses when to use CLONE vs in-place conversion and Unity Catalog managed tables; this is migration and option-selection guidance with product-specific trade-offs.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/data-migration/clone-parquet,Clone Parquet and Iceberg tables,Incrementally clone Parquet and Apache Iceberg tables to Delta Lake - Azure Databricks,Incrementally clone Parquet and Iceberg to Delta Lake,Learn how to incrementally clone Parquet and Apache Iceberg tables to Delta Lake.,You can use Azure Databricks clone functionality to incrementally convert data from Parquet or Apache Iceberg data sources to managed or external Delta tables. Azure Databricks clone for Parquet and Iceberg combines functionality used to clone Delta tables and convert tables to Delta Lake. This article describes use cases and limitations for this feature and provides examples. Important This feature is in Public Preview. Note This feature requires Databricks Runtime 11.3 LTS or above.,2026-04-15T19:49:00.000Z,how-to,integrations,0.7,True,"Covers Azure Databricks clone functionality for Parquet and Iceberg, including runtime version requirement (11.3 LTS+), preview status, use cases, limitations, and examples. This is detailed, product-specific behavior for integrating external table formats with Delta Lake, fitting integrations & coding patterns more than other categories.",updated -https://learn.microsoft.com/en-us/azure/databricks/ingestion/data-migration/convert-to-delta,Convert to Delta Lake,Convert to Delta Lake - Azure Databricks,Convert Parquet and Iceberg tables to Delta Lake,Learn how to convert Parquet and Apache Iceberg tables to Delta Lake.,"The CONVERT TO DELTA SQL command performs a one-time conversion for Parquet and Apache Iceberg tables to Delta Lake tables. 
For incremental conversion of Parquet or Iceberg tables to Delta Lake, see Incrementally clone Parquet and Apache Iceberg tables to Delta Lake. Unity Catalog supports the CONVERT TO DELTA SQL command for Parquet and Iceberg tables stored in external locations managed by Unity Catalog. You can configure existing Parquet data files as external tables in Unity Catalog and then conv",2026-04-15T19:49:00.000Z,how-to,integrations,0.68,True,"Describes the product-specific CONVERT TO DELTA SQL command, including how it behaves with Unity Catalog external locations and existing Parquet data files. This is concrete, product-specific integration/command behavior rather than a conceptual overview, but it does not focus on limits, architecture, or deployment.",updated -https://learn.microsoft.com/en-us/azure/databricks/ingestion/file-metadata-column,File metadata column,File metadata column - Azure Databricks,Use the _metadata file column in Databricks,Access metadata about source data files by querying the hidden \_metadata column.,"You can get metadata information for input files with the _metadata column. The _metadata column is a hidden column, and is available for all input file formats. To include the _metadata column in the returned DataFrame, you must explicitly select it in the read query where you specify the source. If the data source contains a column named _metadata, queries return the column from the data source, and not the file metadata. Warning New fields might be added to the _metadata column in future releases. To pr",2026-01-20T08:00:00.000Z,concept-article,best-practices,0.65,True,"Describes behavior of the hidden _metadata column, including precedence when a user column has the same name and warnings about future field additions. 
These are product-specific behavioral details and gotchas that qualify as expert knowledge and best-practice usage patterns.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/google-drive,Google Drive,Ingest files from Google Drive - Azure Databricks,Ingest Google Drive files into Azure Databricks,"Learn how to ingest files from Google Drive into Azure Databricks using `read_files` (Databricks SQL), Auto Loader (`.readStream` with `cloudFiles`), `COPY INTO`, and `spark.read`.","Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. :::note Compliance The Google Drive connector supports use in workspaces with the Configure enhanced security and compliance settings enabled. ::: The standard Google Drive connector in Lakeflow Connect allows you to use Azure Databricks Spark and SQL functions (read_files, spark.read, COPY INTO, and Auto Loader) to create Spark dataframes, materialized vie",2026-04-07T08:00:00.000Z,how-to,integrations,0.7,True,"Connector article for Google Drive ingestion; likely includes Databricks-specific options (e.g., connector names, parameters, and usage patterns for read_files, spark.read, COPY INTO, Auto Loader) that are configuration- and API-specific rather than generic tutorial content.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/,Concepts,Managed connectors in Lakeflow Connect - Azure Databricks,,"Learn how Databricks Lakeflow Connect managed connectors enable you to ingest data from SaaS applications, databases, and data warehouses.","Note Managed connectors in Lakeflow Connect are in various release states. This page provides an overview of managed connectors in Databricks Lakeflow Connect for ingesting data from SaaS applications and databases. The resulting ingestion pipeline is governed by Unity Catalog and is powered by serverless compute and Lakeflow Spark Declarative Pipelines. 
Managed connectors leverage efficient incremental reads and writes to make data ingestion faster, scalable, and more cost-efficient, while your ",2026-04-15T22:08:00.000Z,concept-article,,0.2,False,"Described as an overview of managed connectors in Lakeflow Connect, focusing on capabilities and benefits (incremental reads/writes, cost efficiency). No clear indication of detailed configuration tables, limits, or error mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/ingestion/data-migration/clone-parquet,Clone Parquet and Iceberg tables,Incrementally clone Parquet and Apache Iceberg tables to Delta Lake - Azure Databricks,Incrementally clone Parquet and Iceberg to Delta Lake,Learn how to incrementally clone Parquet and Apache Iceberg tables to Delta Lake.,You can use Azure Databricks clone functionality to incrementally convert data from Parquet or Apache Iceberg data sources to managed or external Delta tables. Azure Databricks clone for Parquet and Iceberg combines functionality used to clone Delta tables and convert tables to Delta Lake. This article describes use cases and limitations for this feature and provides examples. Important This feature is in Public Preview. Note This feature requires Databricks Runtime 11.3 LTS or above.,2026-04-15T19:49:00.000Z,how-to,integrations,0.7,True,"Covers Azure Databricks clone functionality for Parquet and Iceberg, including runtime version requirement (11.3 LTS+), preview status, use cases, limitations, and examples. 
This is detailed, product-specific behavior for integrating external table formats with Delta Lake, fitting integrations & coding patterns more than other categories.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/data-migration/convert-to-delta,Convert to Delta Lake,Convert to Delta Lake - Azure Databricks,Convert Parquet and Iceberg tables to Delta Lake,Learn how to convert Parquet and Apache Iceberg tables to Delta Lake.,"The CONVERT TO DELTA SQL command performs a one-time conversion for Parquet and Apache Iceberg tables to Delta Lake tables. For incremental conversion of Parquet or Iceberg tables to Delta Lake, see Incrementally clone Parquet and Apache Iceberg tables to Delta Lake. Unity Catalog supports the CONVERT TO DELTA SQL command for Parquet and Iceberg tables stored in external locations managed by Unity Catalog. You can configure existing Parquet data files as external tables in Unity Catalog and then conv",2026-04-15T19:49:00.000Z,how-to,integrations,0.68,True,"Describes the product-specific CONVERT TO DELTA SQL command, including how it behaves with Unity Catalog external locations and existing Parquet data files. This is concrete, product-specific integration/command behavior rather than a conceptual overview, but it does not focus on limits, architecture, or deployment.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/file-metadata-column,File metadata column,File metadata column - Azure Databricks,Use the _metadata file metadata column in Databricks,Access metadata about source data files by querying the hidden \_metadata column.,"You can get metadata information for input files with the _metadata column. The _metadata column is a hidden column, and is available for all input file formats. To include the _metadata column in the returned DataFrame, you must explicitly select it in the read query where you specify the source. 
If the data source contains a column named _metadata, queries return the column from the data source, and not the file metadata. Warning New fields might be added to the _metadata column in future releases. To pr",2026-04-23T08:00:00.000Z,concept-article,configuration,0.7,True,"Explains the hidden _metadata column semantics, available fields, and how to select it in queries, including product-specific behavior when a user column has the same name; these are detailed configuration/usage semantics unique to Databricks.",updated +https://learn.microsoft.com/en-us/azure/databricks/ingestion/google-drive,Google Drive,Ingest files from Google Drive - Azure Databricks,Ingest Google Drive files into Azure Databricks,"Learn how to ingest files from Google Drive into Azure Databricks using `read_files` (Databricks SQL), Auto Loader (`.readStream` with `cloudFiles`), `COPY INTO`, and `spark.read`.","Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. :::note Compliance The Google Drive connector supports use in workspaces with the Configure enhanced security and compliance settings enabled. 
::: The standard Google Drive connector in Lakeflow Connect allows you to use Azure Databricks Spark and SQL functions (read_files, spark.read, COPY INTO, and Auto Loader) to create Spark dataframes, materialized vie",2026-04-24T18:05:00.000Z,how-to,integrations,0.8,True,"Covers the Lakeflow Connect Google Drive connector with Databricks-specific functions (read_files, spark.read, COPY INTO, Auto Loader) and likely includes connector options, authentication, and constraints; this is a product-specific integration pattern.",updated +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/,Concepts,Managed connectors in Lakeflow Connect - Azure Databricks,,"Learn how Databricks Lakeflow Connect managed connectors enable you to ingest data from SaaS applications, databases, and data warehouses.","Note Managed connectors in Lakeflow Connect are in various release states. This page provides an overview of managed connectors in Databricks Lakeflow Connect for ingesting data from SaaS applications and databases. The resulting ingestion pipeline is governed by Unity Catalog and is powered by serverless compute and Lakeflow Spark Declarative Pipelines. Managed connectors leverage efficient incremental reads and writes to make data ingestion faster, scalable, and more cost-efficient, while your ",2026-04-15T22:08:00.000Z,concept-article,,0.2,False,"Described as an overview of managed connectors in Lakeflow Connect, focusing on capabilities and benefits (incremental reads/writes, cost efficiency). 
No clear indication of detailed configuration tables, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/azure-sql-db-firewall,Azure SQL Database firewall configuration,Configure firewall settings for Azure SQL Database - Azure Databricks,Configure Azure SQL Database firewall for Databricks ingestion,Learn how to configure your Azure SQL Database firewall to allow access from Azure Databricks classic compute using either service endpoints or private endpoints within a classic VNet injection setup.,"Azure SQL Database provides firewall settings to control network access from external systems, including Lakeflow Connect. The ingestion gateway for the SQL Server connector deploys inside classic compute and the context of the Virtual Network (VNet) associated with an Azure Databricks workspace. The following steps outline how to configure your Azure SQL Database firewall to allow access from Azure Databricks classic compute using either service endpoints or private endpoints within a classic V",2026-03-10T08:00:00.000Z,how-to,security,0.85,True,"Describes precise firewall configuration for Azure SQL Database to allow Databricks classic compute access, including network constructs (VNets, endpoints) and likely specific rules and settings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/cdc-overview,Database connectors (CDC),Database connectors in Lakeflow Connect - Azure Databricks,,Learn about the database connectors in Databricks Lakeflow Connect for ingesting data from relational databases using change data capture (CDC).,Databricks Lakeflow Connect provides fully-managed connectors for ingesting data from relational databases using change data capture (CDC). 
Each connector efficiently tracks changes in the source database and applies them incrementally to Delta tables.,2026-03-31T23:28:00.000Z,landing-page,,0.15,False,"Overview of database connectors using CDC; appears conceptual, describing what the connectors do rather than detailed limits, configuration parameters, or troubleshooting.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/cdc-overview,Database connectors (CDC),Database connectors in Lakeflow Connect - Azure Databricks,,Learn about the database connectors in Databricks Lakeflow Connect for ingesting data from relational databases using change data capture (CDC).,Databricks Lakeflow Connect provides fully-managed connectors for ingesting data from relational databases using change data capture (CDC). Each connector efficiently tracks changes in the source database and applies them incrementally to Delta tables.,2026-04-21T18:07:00.000Z,landing-page,,0.4,False,"Overview of database connectors and CDC in Lakeflow Connect; sounds conceptual (what the connectors do) rather than detailed configuration, limits, or decision-making guidance with quantified criteria.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/column-selection,Column selection,Select columns to ingest - Azure Databricks,Configure column selection for Lakeflow Connect ingestion,"Learn how to select or deselect specific columns for ingestion. By default, managed connectors in Lakeflow Connect ingest all current and future columns in the specified tables.","Applies to: API-based pipeline authoring, SaaS connectors, Database connectors. By default, managed connectors in Lakeflow Connect ingest all current and future columns in the specified tables. 
Optionally use one of the following table configuration properties in your pipeline definition to select or deselect specific columns for ingestion: The example pipeline definitions on this page show how to select three specific columns for ingestion, depending on the pipeline creation interface. To deselect sp",2026-03-16T17:36:00.000Z,how-to,configuration,0.7,True,"Describes specific table configuration properties to include or exclude columns in managed connector pipelines, with example pipeline definitions. This is product-specific configuration behavior rather than generic ingestion guidance.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/common-patterns,Common patterns,Common patterns for managed ingestion pipelines - Azure Databricks,Apply common patterns for Lakeflow managed ingestion pipelines,Learn about common patterns and techniques for managed ingestion pipelines.,"Lakeflow Connect provides patterns and techniques to optimize your managed ingestion pipelines. Use these patterns to control which data is ingested, manage pipeline updates, and configure advanced behaviors. Not all connectors support the common patterns in this section.",2026-03-24T17:49:00.000Z,how-to,best-practices,0.7,True,"Explicitly about patterns and techniques to optimize managed ingestion pipelines, including controlling ingested data and configuring advanced behaviors. These are Databricks-specific best-practice patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/common-patterns,Common patterns,Common patterns for managed ingestion pipelines - Azure Databricks,Apply common Lakeflow Connect ingestion pipeline patterns,Learn about common patterns and techniques for managed ingestion pipelines.,"Lakeflow Connect provides patterns and techniques to optimize your managed ingestion pipelines. 
Use these patterns to control which data is ingested, manage pipeline updates, and configure advanced behaviors. Not all connectors support the common patterns in this section.",2026-04-22T17:34:00.000Z,how-to,best-practices,0.6,True,"Describes concrete patterns and techniques to optimize managed ingestion pipelines (controlling ingested data, managing updates, advanced behaviors) that are specific to Lakeflow Connect connectors, which aligns with product-specific best practices.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/confluence,Confluence,Confluence connector - Azure Databricks,,Overview of the Confluence connector for managed ingestion pipelines.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. The managed Confluence connector in Lakeflow Connect allows you to ingest data from Confluence into Azure Databricks.,2026-04-02T22:35:00.000Z,concept-article,,0.3,False,"Connector overview; likely conceptual description of Confluence connector without detailed limits, configs, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/confluence-faq,FAQ,Confluence connector FAQs - Azure Databricks,,"Find answers to frequently asked questions about the Confluence connector, including data ingestion details, incremental updates, and permission requirements.",Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
This page answers frequently asked questions about the Confluence connector in Databricks Lakeflow Connect.,2026-01-20T08:00:00.000Z,faq,,0.4,False,"FAQ page; description mentions ingestion details and permissions but not explicit error codes, limits, or config tables.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/confluence-limits,Limitations,Confluence connector limitations - Azure Databricks,Review Confluence connector limits and API constraints,"Learn about limitations and considerations for ingesting data from Confluence using Databricks Lakeflow Connect, including content restrictions and API rate limits.",Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page lists limitations and considerations for ingesting data from Confluence using Databricks Lakeflow Connect.,2026-03-26T08:00:00.000Z,limits-and-quotas,limits-quotas,0.8,True,Explicitly about limitations and API rate limits; such pages typically list numeric rate limits and content restrictions specific to this connector.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/confluence-pipeline,Create ingestion pipeline,Ingest data from Confluence - Azure Databricks,,Learn how to create a managed Confluence ingestion pipeline using Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
Learn how to create a managed Confluence ingestion pipeline using Databricks Lakeflow Connect.,2026-03-24T01:32:00.000Z,how-to,,0.3,False,"Pipeline creation tutorial; unlikely to contain detailed config matrices, limits, or troubleshooting mappings beyond generic steps.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/confluence-pipeline,Create ingestion pipeline,Ingest data from Confluence - Azure Databricks,,Learn how to create a managed Confluence ingestion pipeline using Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Learn how to create a managed Confluence ingestion pipeline using Databricks Lakeflow Connect.,2026-04-22T08:00:00.000Z,how-to,,0.3,False,"Appears to be a step-by-step ingestion/tutorial page for Confluence using Lakeflow Connect. Description and summary indicate how-to pipeline creation, not configuration tables, limits, error-code mappings, or product-specific best-practice guidance with quantified impact.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/confluence-reference,Reference,Confluence connector reference - Azure Databricks,Use Confluence connector reference for schemas and metadata,"Reference material for Confluence ingestion using Databricks Lakeflow Connect, including data type transformations, metadata columns, and schema mapping details.",Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
This page contains reference material for the Confluence connector in Lakeflow Connect.,2026-03-04T19:15:00.000Z,reference,configuration,0.8,True,"Reference page explicitly mentions data type transformations, metadata columns, and schema mapping details, which are product-specific configuration/reference parameters.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/confluence-source-setup,Set up data source,Configure OAuth U2M for Confluence ingestion - Azure Databricks,Configure OAuth U2M security for Confluence ingestion,"Learn how to configure OAuth user-to-machine (U2M) authentication for Confluence ingestion into Databricks, including creating API tokens and obtaining required credentials.",Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page describes how to configure OAuth user-to-machine (U2M) authentication for Confluence ingestion into Azure Databricks.,2026-01-24T08:00:00.000Z,how-to,security,0.75,True,"Describes configuring OAuth U2M authentication, which typically includes product-specific auth parameters, token handling, and required credentials for this connector.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/confluence-troubleshoot,Troubleshooting,Troubleshoot Confluence ingestion - Azure Databricks,Diagnose and fix Databricks Confluence ingestion issues,"Learn about common issues with the Confluence connector in Databricks Lakeflow Connect and how to resolve them, including authentication errors and rate limits.",Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
This page describes common issues with the Confluence connector in Databricks Lakeflow Connect and how to resolve them.,2026-03-16T17:36:00.000Z,troubleshooting,troubleshooting,0.8,True,"Explicitly described as troubleshooting common issues with the Confluence connector, including authentication errors and rate limits. Such pages typically map specific error messages and causes to resolutions, which is product-specific expert knowledge.",unchanged @@ -1085,7 +1106,7 @@ https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/d3 https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/d365-limits,Limitations,Microsoft Dynamics 365 ingestion connector limitations - Azure Databricks,Understand Dynamics 365 connector ingestion limits,Understand limitations and restrictions when ingesting Microsoft Dynamics 365 data using the Lakeflow Connect Dynamics 365 connector.,Important This feature is in Public Preview. This page describes limitations and restrictions for the Microsoft Dynamics 365 connector in Lakeflow Connect.,2026-03-10T08:00:00.000Z,limits-and-quotas,limits-quotas,0.8,True,Described as limitations and restrictions for the connector; such pages usually enumerate numeric limits and constraints specific to this integration.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/d365-pipeline,Create ingestion pipeline,Ingest data from Microsoft Dynamics 365 - Azure Databricks,,Learn how to create a managed Microsoft Dynamics 365 ingestion pipeline using Databricks Lakeflow Connect.,Important This feature is in Public Preview. 
Learn how to create a managed Microsoft Dynamics 365 ingestion pipeline using Databricks Lakeflow Connect.,2026-03-23T08:00:00.000Z,how-to,,0.3,False,How-to pipeline creation guide; likely step-by-step without detailed config matrices or limits.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/d365-reference,Reference,Microsoft Dynamics 365 connector reference - Azure Databricks,Use Dynamics 365 connector configuration reference,"Technical reference for authentication, configuration, data types, and pipeline parameters for the Lakeflow Connect Dynamics 365 connector.",Important This feature is in Public Preview. This page provides technical reference information for the Microsoft Dynamics 365 connector in Lakeflow Connect.,2026-02-02T08:00:00.000Z,reference,configuration,0.85,True,"Technical reference for authentication, configuration, data types, and pipeline parameters implies detailed parameter names, allowed values, and defaults unique to this connector.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/d365-source-setup,Set up data source,Configure data source for Microsoft Dynamics 365 ingestion - Azure Databricks,Configure Dynamics 365 and storage for Databricks ingestion,"Set up Azure Synapse Link and ADLS Gen2 to export Microsoft Dynamics 365 data, then configure authentication for Databricks to ingest the exported data.","Important This feature is in Public Preview. Learn how to set up Microsoft Dynamics 365 as a data source for ingestion into Azure Databricks using Lakeflow Connect. For information about how the connector accesses your source data, see How does the connector access D365 data?. 
For a list of supported Dataverse applications, see Which Dynamics 365 applications are supported?.",2026-02-02T19:09:00.000Z,how-to,configuration,0.8,True,"Source setup for D365 plus Synapse Link and ADLS Gen2 will include specific configuration parameters, connection details, and authentication settings unique to this connector.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/d365-source-setup,Set up data source,Configure data source for Microsoft Dynamics 365 ingestion - Azure Databricks,,"Set up Azure Synapse Link and ADLS Gen2 to export Microsoft Dynamics 365 data, then configure authentication for Databricks to ingest the exported data.","Important This feature is in Public Preview. Learn how to set up Microsoft Dynamics 365 as a data source for ingestion into Azure Databricks using Lakeflow Connect. For information about how the connector accesses your source data, see How does the connector access D365 data?. For a list of supported Dataverse applications, see Which Dynamics 365 applications are supported?.",2026-04-23T08:00:00.000Z,how-to,,0.3,False,"Focus is on setting up Dynamics 365 as a data source and configuring authentication. Likely a procedural setup guide rather than a reference of limits, configuration parameter tables, or troubleshooting mappings. No clear evidence of expert-only details from the provided summary.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/d365-troubleshoot,Troubleshooting,Troubleshoot Microsoft Dynamics 365 ingestion - Azure Databricks,Troubleshoot Dynamics 365 data ingestion issues,Troubleshoot common issues when ingesting Microsoft Dynamics 365 data using the Lakeflow Connect Dynamics 365 connector.,Important This feature is in Public Preview. 
This page provides troubleshooting guidance for common issues with the Microsoft Dynamics 365 connector in Lakeflow Connect.,2026-01-24T08:00:00.000Z,troubleshooting,troubleshooting,0.9,True,Explicit troubleshooting guide for this connector; such pages map specific errors and symptoms to causes and resolutions.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/faq,FAQ,Managed connector FAQs - Azure Databricks,,Find answers to frequently asked questions about managed connectors in Databricks Lakeflow Connect.,"Find answers to frequently asked questions about managed connectors in Databricks Lakeflow Connect. For connector-specific FAQs, see the documentation for your connector.",2026-04-08T08:00:00.000Z,faq,,0.0,False,"FAQ page description is generic and does not indicate presence of specific limits, configuration tables, error-code mappings, or other detailed expert knowledge as defined; likely high-level Q&A and connector pointers rather than numeric limits or product-specific configuration matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/full-refresh,Full refresh,Fully refresh target tables - Azure Databricks,Fully refresh Lakeflow Connect managed ingestion target tables,Learn how to fully refresh target tables in your managed ingestion pipelines.,"Applies to: SaaS connectors, Database connectors. Fully refreshing the ingestion pipeline clears the data and state of the target tables, then reprocesses all records from the data source. You can fully refresh all tables in the pipeline or select tables to refresh. Important The ingestion pipeline update might fail during the Initializing or Resetting tables phase. Lakeflow Connect retries the pipeline automatically several times. 
If you interrupt the automatic retries or they eventually fail fatally, ",2026-01-20T08:00:00.000Z,how-to,best-practices,0.65,True,"Describes behavior of full refresh, pipeline phases, and automatic retries during Initializing/Resetting tables, including failure behavior. This is detailed, product-specific operational guidance.",unchanged @@ -1093,7 +1114,7 @@ https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/ga https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-ads-concepts,Concepts,Google Ads connector concepts - Azure Databricks,,Learn how the managed Google Ads connector in Lakeflow Connect works.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Learn how the managed Google Ads connector in Lakeflow Connect works.,2026-02-06T20:21:00.000Z,concept-article,,0.3,False,Concepts page; likely explains how the connector works conceptually rather than listing concrete configuration parameters or limits.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-ads-limits,Limitations,Google Ads connector limitations - Azure Databricks,Review Google Ads connector ingestion limitations,Limitations of the Google Ads connector for managed ingestion pipelines.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. 
Learn about known limitations when using the managed Google Ads connector in Lakeflow Connect.,2026-04-02T22:35:00.000Z,concept-article,limits-quotas,0.8,True,Explicitly about known limitations; such connector limitation pages typically list numeric API limits and connector-specific constraints.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-ads-overview,Google Ads,Google Ads connector - Azure Databricks,,Overview of the managed Google Ads connector in Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. The managed Google Ads connector in Lakeflow Connect allows you to ingest data from Google Ads into Azure Databricks.,2026-04-02T22:35:00.000Z,concept-article,,0.3,False,Overview of Google Ads connector; primarily conceptual description of capabilities.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-ads-pipeline,Create ingestion pipeline,Ingest data from Google Ads - Azure Databricks,,Learn how to create a managed ingestion pipeline to ingest data from Google Ads into Azure Databricks.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Learn how to create a managed ingestion pipeline to ingest data from Google Ads into Azure Databricks.,2026-03-24T01:32:00.000Z,concept-article,,0.3,False,Pipeline creation tutorial; unlikely to include detailed config tables or numeric limits.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-ads-pipeline,Create ingestion pipeline,Ingest data from Google Ads - Azure Databricks,,Learn how to create a managed ingestion pipeline to ingest data from Google Ads into Azure Databricks.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. 
See Manage Azure Databricks previews. Learn how to create a managed ingestion pipeline to ingest data from Google Ads into Azure Databricks.,2026-04-22T08:00:00.000Z,concept-article,,0.3,False,"Described as a guide to create a managed ingestion pipeline from Google Ads. This is typical tutorial content without explicit mention of limits, config matrices, or error-code-based troubleshooting.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-ads-reference,Reference,Google Ads connector reference - Azure Databricks,Use Google Ads connector table schemas and data types,"Reference information for the managed Google Ads connector in Lakeflow Connect, including table schemas and data types.","Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page provides reference information for the managed Google Ads connector in Lakeflow Connect, including supported tables, column definitions, and data types.",2026-02-06T20:21:00.000Z,concept-article,configuration,0.9,True,"Reference page with supported tables, column definitions, and data types is detailed configuration/reference information unique to this connector.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-ads-source-setup,Set up data source,Configure OAuth for Google Ads ingestion - Azure Databricks,Configure OAuth 2.0 for Google Ads Databricks ingestion,Learn how to configure Google Ads to enable authentication from Azure Databricks.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Learn how to configure Google Ads to enable authentication from Azure Databricks. The managed Google Ads connector in Lakeflow Connect uses OAuth 2.0 user-to-machine (U2M) authentication. 
You'll use the authentication details that you retrieve from the steps on this page to create a Unity Catalog connection in Azure Databricks.,2026-02-06T20:21:00.000Z,concept-article,configuration,0.85,True,"OAuth setup for Google Ads will include specific configuration fields (client ID, secret, scopes, redirect URIs) and Unity Catalog connection parameters unique to this integration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-ads-troubleshoot,Troubleshooting,Troubleshoot the Google Ads connector - Azure Databricks,Troubleshoot Google Ads connector ingestion issues,Troubleshoot common errors with the managed Google Ads connector in Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Learn how to troubleshoot common errors with the managed Google Ads connector in Lakeflow Connect.,2026-02-06T20:21:00.000Z,concept-article,troubleshooting,0.9,True,Troubleshooting page explicitly for common errors; will map specific error messages or codes to causes and resolutions for this connector.,unchanged @@ -1101,20 +1122,20 @@ https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-analytics-concepts,Concepts,Google Analytics Raw Data connector concepts - Azure Databricks,,Learn about the managed Google Analytics Raw Data ingestion connector in Databricks Lakeflow Connect.,"The Google Analytics Raw Data connector allows you to ingest raw, event-level data from Google Analytics 4 (GA4) using Databricks Lakeflow Connect and Google BigQuery.",2026-04-02T22:35:00.000Z,concept-article,,0.2,False,Concepts page; likely explains conceptual model of GA4 raw data ingestion rather than concrete limits or configs.,unchanged 
https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-analytics-faq,FAQ,Google Analytics Raw Data connector FAQs - Azure Databricks,,Find answers to frequently asked questions about the Google Analytics Raw Data connector in Databricks Lakeflow Connect.,This page answers frequently asked questions about the Google Analytics Raw Data connector in Databricks Lakeflow Connect.,2026-01-20T08:00:00.000Z,faq,,0.4,False,FAQ page; description does not indicate detailed error-code mappings or numeric limits.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-analytics-limits,Limitations,Google Analytics Raw Data connector limitations - Azure Databricks,Review GA4 raw data connector limits and constraints,"Learn about limitations and considerations for connecting to and ingesting raw, event-level data from Google Analytics using Databricks Lakeflow Connect and Google BigQuery.","This page lists limitations and considerations for ingesting raw, event-level data from Google Analytics using Databricks Lakeflow Connect and Google BigQuery.",2026-04-02T22:35:00.000Z,limits-and-quotas,limits-quotas,0.8,True,Explicitly about limitations and considerations for GA4 raw data ingestion; such pages usually list numeric limits and connector-specific constraints.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-analytics-pipeline,Create ingestion pipeline,Ingest data from Google Analytics 4 - Azure Databricks,,Learn how to ingest Google Analytics 4 data into Azure Databricks using Lakeflow Connect and Google BigQuery.,Learn how to ingest Google Analytics 4 data into Azure Databricks using Lakeflow Connect and Google BigQuery.,2026-04-22T08:00:00.000Z,how-to,,0.3,False,"Page teaches how to ingest Google Analytics 4 data via BigQuery using Lakeflow Connect. Summary suggests a how-to integration tutorial, not a configuration reference, limits table, or decision/troubleshooting guide.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-analytics-reference,Reference,Google Analytics Raw Data connector reference - Azure Databricks,Consult GA4 raw data connector configuration reference,Access reference material for the Google Analytics Raw Data connector in Databricks Lakeflow Connect.,This page contains reference material for the Google Analytics Raw Data connector in Databricks Lakeflow Connect.,2026-01-20T08:00:00.000Z,reference,configuration,0.8,True,"Reference material for the connector implies detailed configuration parameters, data type mappings, and possibly defaults unique to this integration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-analytics-source-setup,Set up data source,Set up Google Analytics 4 and Google BigQuery for Azure Databricks ingestion - Azure Databricks,Set up GA4 and BigQuery access for Databricks ingestion,"Learn how to set up Google Analytics 4 and Google BigQuery to ingest raw, event-level data into Azure Databricks.","This page describes how to set up Google Analytics 4 (GA4) and Google BigQuery (BigQuery) to ingest raw, event-level data into Azure Databricks.",2026-01-24T08:00:00.000Z,how-to,security,0.7,True,"Describes setting up GA4 and BigQuery for ingestion, which 
typically includes project roles, service accounts, and permission scopes—product-specific security configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-analytics-troubleshoot,Troubleshooting,Troubleshoot Google Analytics ingestion - Azure Databricks,Troubleshoot Google Analytics raw data ingestion issues,Learn about common issues with the Google Analytics Raw Data connector in Databricks Lakeflow Connect and how to resolve them.,This page describes common issues with the Google Analytics Raw Data connector in Databricks Lakeflow Connect and how to resolve them.,2026-03-10T23:23:00.000Z,troubleshooting,troubleshooting,0.9,True,"Troubleshooting page will map common connector-specific issues to causes and resolutions, including any relevant error messages or configuration pitfalls.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/hubspot-limits,Limitations,HubSpot connector limitations - Azure Databricks,Review HubSpot connector limits in Lakeflow Connect,Learn about known limitations when using the managed HubSpot connector in Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Learn about known limitations when using the managed HubSpot connector in Lakeflow Connect.,2026-04-17T18:03:00.000Z,concept-article,limits-quotas,0.7,True,"A 'limitations' page for a specific managed connector typically enumerates concrete constraints (such as unsupported objects/fields, pagination or rate limits, data size or schema restrictions) that are unique to the Azure Databricks Lakeflow Connect HubSpot integration. 
These product-specific limits are not generally known from pretraining and fit the limits-quotas category best.",updated +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/hubspot-limits,Limitations,HubSpot connector limitations - Azure Databricks,Review HubSpot connector limits in Lakeflow Connect,Learn about known limitations when using the managed HubSpot connector in Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Learn about known limitations when using the managed HubSpot connector in Lakeflow Connect.,2026-04-17T18:03:00.000Z,concept-article,limits-quotas,0.7,True,"A 'limitations' page for a specific managed connector typically enumerates concrete constraints (such as unsupported objects/fields, pagination or rate limits, data size or schema restrictions) that are unique to the Azure Databricks Lakeflow Connect HubSpot integration. These product-specific limits are not generally known from pretraining and fit the limits-quotas category best.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/hubspot-overview,HubSpot,HubSpot connector - Azure Databricks,,Overview of the HubSpot connector for managed ingestion pipelines.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
The managed HubSpot connector in Lakeflow Connect allows you to ingest data from HubSpot Marketing Hub into Azure Databricks.,2026-04-02T22:35:00.000Z,concept-article,,0.3,False,Overview of HubSpot connector; primarily conceptual description of capabilities.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/hubspot-pipeline,Create ingestion pipeline,Ingest data from HubSpot - Azure Databricks,,Learn how to create a managed HubSpot ingestion pipeline using Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Learn how to create a managed HubSpot ingestion pipeline using Databricks Lakeflow Connect.,2026-03-24T01:32:00.000Z,how-to,,0.3,False,Pipeline creation tutorial; likely step-by-step without detailed config matrices or numeric limits.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/hubspot-pipeline,Create ingestion pipeline,Ingest data from HubSpot - Azure Databricks,,Learn how to create a managed HubSpot ingestion pipeline using Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Learn how to create a managed HubSpot ingestion pipeline using Databricks Lakeflow Connect.,2026-04-22T08:00:00.000Z,how-to,,0.3,False,"HubSpot ingestion pipeline creation with Lakeflow Connect is likely a procedural tutorial. 
No indication of numeric limits, config parameter tables, or structured troubleshooting content in the summary.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/hubspot-reference,Reference,HubSpot connector reference - Azure Databricks,Use HubSpot connector reference for tables and updates,"Reference information for managed HubSpot connector in Lakeflow Connect, including supported source tables and their update patterns.","Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page provides reference information about the managed HubSpot connector in Lakeflow Connect, including supported source tables and their update patterns.",2026-02-13T19:39:00.000Z,concept-article,configuration,0.85,True,Reference information about supported source tables and their update patterns is detailed connector-specific configuration/reference data.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/hubspot-source-setup,Set up data source,Configure OAuth for HubSpot ingestion - Azure Databricks,Configure OAuth 2.0 for HubSpot ingestion,Learn how to configure HubSpot for ingestion from HubSpot Marketing Hub into Azure Databricks.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Learn how to configure HubSpot for ingestion from HubSpot Marketing Hub into Azure Databricks. The managed HubSpot connector in Lakeflow Connect uses OAuth 2.0 user-to-machine (U2M) authentication. 
Use the authentication details that you retrieve from the steps on this page to create a Unity Catalog connection in Azure Databricks.,2026-03-24T22:15:00.000Z,concept-article,security,0.85,True,"Explicitly about configuring OAuth 2.0 U2M authentication and Unity Catalog connection details, which are product-specific security and auth settings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/hubspot-troubleshoot,Troubleshooting,Troubleshoot the HubSpot connector - Azure Databricks,Troubleshoot HubSpot connector ingestion problems,Learn how to troubleshoot common issues with the managed HubSpot connector in Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Learn how to troubleshoot common issues with the managed HubSpot connector in Lakeflow Connect.,2026-02-13T19:39:00.000Z,concept-article,troubleshooting,0.9,True,"Troubleshooting page will contain mappings from connector-specific issues to causes and fixes, including any relevant error messages.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/jira,Jira,Jira connector - Azure Databricks,,Overview of the Jira connector for managed ingestion pipelines.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
The managed Jira connector in Lakeflow Connect allows you to ingest data from Jira into Azure Databricks.,2026-04-02T22:35:00.000Z,concept-article,,0.3,False,Overview of Jira connector; mainly conceptual description of ingestion capabilities.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/jira-faq,FAQ,Jira connector FAQs - Azure Databricks,,"Find answers to frequently asked questions about the Jira connector, including data ingestion details, delete handling, and permission requirements.",Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page answers frequently asked questions about the Jira connector in Databricks Lakeflow Connect.,2026-01-20T08:00:00.000Z,faq,,0.4,False,FAQ page; description mentions delete handling and permissions but not explicit error-code mappings or numeric limits.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/jira-limits,Limitations,Jira connector limitations - Azure Databricks,Review Jira Lakeflow connector ingestion limits,"Learn about limitations and considerations for ingesting data from Jira, including incremental sync support and delete tracking.",Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
This page lists limitations and considerations for ingesting data from Jira using Databricks Lakeflow Connect.,2026-04-02T22:35:00.000Z,limits-and-quotas,limits-quotas,0.7,True,"A 'limitations' page for a specific connector is likely to enumerate concrete ingestion constraints (for example, unsupported operations, maximum sizes, or rate-related limits) that are product-specific and not generally known; treated as limits-quotas.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/jira-pipeline,Create ingestion pipeline,Ingest data from Jira - Azure Databricks,,Learn how to create a managed Jira ingestion pipeline using Databricks Lakeflow Connect.,"Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Learn how to create a managed Jira ingestion pipeline using Databricks Lakeflow Connect. Note The Jira connector automatically retries with exponential backoff when rate limits are encountered. If rate limit errors persist, see Rate limit errors.",2026-03-30T08:00:00.000Z,how-to,,0.35,False,"Pipeline creation guide; mention of automatic retries and rate limits is conceptual here, without explicit numeric limits or config tables.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/jira-pipeline,Create ingestion pipeline,Ingest data from Jira - Azure Databricks,,Learn how to create a managed Jira ingestion pipeline using Databricks Lakeflow Connect.,"Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Learn how to create a managed Jira ingestion pipeline using Databricks Lakeflow Connect. Note The Jira connector automatically retries with exponential backoff when rate limits are encountered. 
If rate limit errors persist, see Rate limit errors.",2026-04-22T08:00:00.000Z,how-to,,0.4,False,"Jira ingestion pipeline page mentions automatic retries with exponential backoff and rate limit errors, but the summary does not show concrete numeric limits, configuration tables, or detailed error-code mappings. It likely links to troubleshooting elsewhere rather than containing that expert reference content itself.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/jira-reference,Reference,Jira connector reference - Azure Databricks,Use Jira Lakeflow connector reference settings,"Access reference material for Jira ingestion, including supported source tables, required OAuth scopes, and permissions.",Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page contains reference material for the Jira connector in Databricks Lakeflow Connect.,2026-01-20T08:00:00.000Z,reference,configuration,0.7,True,"A connector 'reference' page typically lists supported tables, required OAuth scopes, and permissions—i.e., concrete parameter names and required values—matching the configuration category.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/jira-source-setup,Set up data source,Configure Jira for ingestion - Azure Databricks,Configure OAuth 2.0 security for Jira ingestion,Learn how to configure OAuth 2.0 authentication for Jira ingestion into Azure Databricks.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
This page describes how to configure Jira for ingestion into Azure Databricks.,2026-01-24T08:00:00.000Z,how-to,security,0.8,True,"Describes configuring OAuth 2.0 for Jira ingestion, which involves product-specific auth endpoints, scopes, and credential parameters.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/jira-troubleshoot,Troubleshooting,Troubleshoot Jira ingestion - Azure Databricks,Troubleshoot Jira Lakeflow ingestion errors,"Learn about common issues with the Jira connector in Databricks Lakeflow Connect and how to resolve them, including authentication errors and OAuth configuration issues.",Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page describes common issues with the Jira connector in Databricks Lakeflow Connect and how to resolve them.,2026-03-31T23:28:00.000Z,troubleshooting,troubleshooting,0.9,True,"Explicitly a troubleshooting page for Jira ingestion; such pages normally map specific error messages/codes and OAuth misconfigurations to resolutions, which is expert troubleshooting knowledge.",unchanged @@ -1122,7 +1143,7 @@ https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/me https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/meta-ads-concepts,Concepts,Meta Ads connector concepts - Azure Databricks,,Learn about the managed Meta Ads ingestion connector in Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
The Meta Ads connector allows you to ingest advertising data from Meta Ads using Databricks Lakeflow Connect.,2026-04-02T22:35:00.000Z,concept-article,,0.2,False,"A 'concepts' page is typically high-level explanation of how the connector works, not detailed configuration, limits, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/meta-ads-faq,FAQ,Meta Ads ingestion connector FAQs - Azure Databricks,,Find answers to frequently asked questions about the Meta Ads ingestion connector in Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page answers frequently asked questions about the Meta Ads ingestion connector in Databricks Lakeflow Connect.,2026-03-12T23:05:00.000Z,faq,,0.4,False,"FAQ page, but summary does not mention error codes, config parameters, or numeric thresholds; likely high-level Q&A.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/meta-ads-limits,Limitations,Meta Ads ingestion connector limitations - Azure Databricks,Understand Meta Ads ingestion connector limits,Learn about limitations and considerations for ingesting data from Meta Ads using Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
Learn about limitations and considerations for ingesting data from Meta Ads using Databricks Lakeflow Connect.,2026-04-02T22:35:00.000Z,limits-and-quotas,limits-quotas,0.7,True,"A 'limitations and considerations' page for a specific connector typically lists concrete constraints (for example, supported object counts, date ranges, or API limits) that qualify as limits-quotas.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/meta-ads-pipeline,Create ingestion pipeline,Ingest data from Meta Ads - Azure Databricks,,Learn how to create a managed ingestion pipeline to ingest data from Meta Ads into Azure Databricks.,"Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Learn how to create a managed ingestion pipeline to ingest data from Meta Ads into Azure Databricks. For a list of supported objects, see Supported objects.",2026-03-24T01:32:00.000Z,concept-article,,0.3,False,"Pipeline creation pages are generally step-by-step tutorials without exhaustive config matrices, limits, or troubleshooting mappings.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/meta-ads-pipeline,Create ingestion pipeline,Ingest data from Meta Ads - Azure Databricks,,Learn how to create a managed ingestion pipeline to ingest data from Meta Ads into Azure Databricks.,"Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Learn how to create a managed ingestion pipeline to ingest data from Meta Ads into Azure Databricks. For a list of supported objects, see Supported objects.",2026-04-22T08:00:00.000Z,concept-article,,0.2,False,"Appears to be a how-to/tutorial for creating a Meta Ads ingestion pipeline. 
No indication of numeric limits, config parameter tables, error-code mappings, or other expert-only details; likely step-by-step usage instructions.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/meta-ads-reference,Reference,Meta Ads ingestion connector reference - Azure Databricks,Reference Meta Ads connector objects and mappings,"Access reference material for the Meta Ads ingestion connector in Databricks Lakeflow Connect, including data type transformations and supported objects.",Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page contains reference material for the Meta Ads ingestion connector in Databricks Lakeflow Connect.,2026-01-20T08:00:00.000Z,reference,configuration,0.7,True,"Reference material including data type transformations and supported objects implies tables of fields, types, and mappings—product-specific configuration/parameter knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/meta-ads-source-setup,Set up data source,Set up Meta Ads as a data source - Azure Databricks,Configure Meta Ads as Lakeflow data source,Learn how to set up Meta Ads as a data source for ingestion using Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
This page describes how to set up Meta Ads as a data source for ingestion using Databricks Lakeflow Connect.,2026-01-20T08:00:00.000Z,how-to,configuration,0.65,True,"Source setup pages usually include concrete configuration fields (app IDs, secrets, scopes, redirect URIs) and possibly required permissions, which are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/meta-ads-troubleshoot,Troubleshooting,Troubleshoot Meta Ads ingestion - Azure Databricks,Troubleshoot Meta Ads Lakeflow ingestion issues,Learn about common issues with the Meta Ads ingestion connector in Databricks Lakeflow Connect and how to resolve them.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page describes common issues with the Meta Ads ingestion connector in Databricks Lakeflow Connect and how to resolve them.,2026-01-20T08:00:00.000Z,troubleshooting,troubleshooting,0.9,True,Explicit troubleshooting page; expected to contain symptom → cause → solution mappings and possibly error codes for the Meta Ads connector.,unchanged @@ -1132,36 +1153,36 @@ https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/my https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-aws-rds-config,Configure RDS and Aurora,Configure Amazon RDS and Amazon Aurora MySQL for ingestion - Azure Databricks,Configure Amazon RDS and Aurora MySQL for CDC,Learn how to configure Amazon RDS and Amazon Aurora MySQL for ingestion using Databricks Lakeflow Connect.,Important The MySQL connector is in Public Preview. Contact your Azure Databricks account team to request access. Learn how to configure Amazon RDS for MySQL and Amazon Aurora MySQL for ingestion using Lakeflow Connect. 
You must enable binary logging and configure binlog retention to support change data capture.,2026-01-23T23:40:00.000Z,how-to,configuration,0.75,True,Connector-specific setup for RDS/Aurora MySQL; typically lists parameter group settings and required values for binlog and retention.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-azure-config,Configure Azure Database for MySQL,Configure Azure Database for MySQL for ingestion - Azure Databricks,Configure Azure Database for MySQL for CDC,Learn how to configure Azure Database for MySQL for ingestion using Lakeflow Connect.,Important The MySQL connector is in Public Preview. Contact your Azure Databricks account team to request access. Learn how to configure Azure Database for MySQL for ingestion into Azure Databricks. You must enable binary logging and configure binlog retention to support change data capture.,2026-01-23T23:40:00.000Z,how-to,configuration,0.75,True,Describes enabling binary logging and retention for Azure Database for MySQL; likely includes specific parameter names and allowed values.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-ec2-config,Configure MySQL on EC2 or VM,Configure MySQL on Amazon EC2 for ingestion - Azure Databricks,Configure MySQL on EC2 for Databricks ingestion,Learn how to configure MySQL on Amazon EC2 for ingestion using Lakeflow Connect.,Important The MySQL connector is in Public Preview. Contact your Azure Databricks account team to request access. Learn how to configure MySQL running on Amazon EC2 for ingestion. 
You must enable binary logging and configure binlog retention to support change data capture.,2026-01-23T23:40:00.000Z,how-to,configuration,0.7,True,Setup for MySQL on EC2 for CDC ingestion; likely includes my.cnf parameters and required configuration values.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-faq,FAQ,MySQL connector FAQ - Azure Databricks,,Frequently asked questions about the MySQL connector in Databricks Lakeflow Connect.,Important The MySQL connector is in Public Preview. Contact your Azure Databricks account team to request access. Find answers to frequently asked questions about the MySQL connector.,2026-03-31T23:28:00.000Z,faq,,0.35,False,"MySQL connector FAQ; summary does not indicate detailed error-code mappings, config tables, or limits—likely general Q&A rather than expert troubleshooting or configuration reference.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-faq,FAQ,MySQL connector FAQ - Azure Databricks,,Frequently asked questions about the MySQL connector in Databricks Lakeflow Connect.,Important The MySQL connector is in Public Preview. Contact your Azure Databricks account team to request access. Find answers to frequently asked questions about the MySQL connector.,2026-04-21T08:00:00.000Z,faq,,0.4,False,"MySQL connector FAQ may contain some product-specific details, but from the summary it appears to be general Q&A; no clear indication of numeric limits, config tables, or error-code-based troubleshooting to justify classification.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-gcp-config,Configure Google Cloud SQL for MySQL,Configure Google Cloud SQL for MySQL for ingestion - Azure Databricks,Configure Cloud SQL for MySQL for Lakeflow,Learn how to configure Google Cloud SQL for MySQL for data ingestion using Lakeflow Connect.,Important The MySQL connector is in Public Preview. 
Contact your Azure Databricks account team to request access. This page describes how to configure Google Cloud SQL for MySQL for ingestion. You must enable binary logging and configure binlog retention to support change data capture.,2026-01-23T23:40:00.000Z,how-to,configuration,0.7,True,Cloud SQL-specific configuration for binlog and retention; typically lists parameter names and constraints unique to this environment.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-limits,Limitations,MySQL connector limitations - Azure Databricks,Review MySQL Lakeflow connector limitations and considerations,Learn about limitations and considerations for MySQL ingestion using Lakeflow Connect.,Important The MySQL connector is in Public Preview. Contact your Azure Databricks account team to request access. Learn about limitations and considerations for the MySQL connector.,2026-04-01T08:00:00.000Z,limits-and-quotas,limits-quotas,0.8,True,"Explicit limitations and considerations for the MySQL connector; such pages typically list concrete constraints, unsupported operations, and behavioral limits.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-pipeline,Create ingestion pipeline,Create a MySQL ingestion pipeline - Azure Databricks,,Learn how to ingest data from MySQL and load it into Azure Databricks using Lakeflow Connect.,"Important The MySQL connector is in Public Preview. Contact your Azure Databricks account team to request access. Learn how to ingest data from MySQL into Azure Databricks using Lakeflow Connect. 
The MySQL connector supports Amazon RDS for MySQL, Aurora MySQL, Azure Database for MySQL, Google Cloud SQL for MySQL, and MySQL running on EC2.",2026-03-23T08:00:00.000Z,how-to,,0.25,False,"Tutorial for creating a MySQL ingestion pipeline; primarily procedural ingestion steps rather than structured configuration reference, limits, or troubleshooting catalog.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-privileges,Privilege requirements for database users,Grant MySQL user privileges - Azure Databricks,Assign required MySQL privileges for Lakeflow ingestion,Learn about the privileges required for the MySQL user that you use for ingesting data into Azure Databricks.,"Important The MySQL connector is in Public Preview. Contact your Azure Databricks account team to request access. Important This page contains references to the termslave, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this page. Learn how to grant the privileges required for the MySQL user that you use for ingesting data into Azure Databricks. Databricks recommends that you create a dedicated MySQL user solely for Azure Databricks ingestio",2026-01-20T08:00:00.000Z,how-to,security,0.88,True,"Dedicated page on required MySQL user privileges; will list specific GRANT statements and permissions, which are product-specific security/identity configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-reference,Reference,MySQL connector reference - Azure Databricks,Use MySQL connector reference for types and DDL handling,"Find reference material for the MySQL connector in Lakeflow Connect, including data type mappings and DDL operation handling.","Important The MySQL connector is in Public Preview. Contact your Azure Databricks account team to request access. 
Find reference material for the MySQL connector in Lakeflow Connect, including data type mappings and DDL operation handling.",2026-01-20T08:00:00.000Z,reference,configuration,0.86,True,"Reference material including data type mappings and DDL operation handling; this is detailed, product-specific behavior/configuration that qualifies as expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-source-setup,Set up data source,Configure MySQL for ingestion into Azure Databricks - Azure Databricks,Configure MySQL for Lakeflow CDC ingestion,Learn how to configure MySQL for ingestion into Azure Databricks using Lakeflow Connect.,Important The MySQL connector is in Public Preview. Contact your Azure Databricks account team to request access. Learn how to configure MySQL for ingestion into Azure Databricks using Lakeflow Connect. The MySQL connector uses binary log (binlog) replication to capture changes from your MySQL database and incrementally syncs them to Azure Databricks.,2026-01-24T08:00:00.000Z,how-to,configuration,0.82,True,"Focuses on configuring MySQL for ingestion using binlog replication; likely includes specific server settings, required options, and constraints that are product-specific.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-troubleshoot,Troubleshooting,Troubleshoot MySQL ingestion issues - Azure Databricks,Diagnose and fix MySQL Lakeflow Connect ingestion errors,Troubleshoot common issues with MySQL ingestion using Lakeflow Connect.,"Important The MySQL connector is in Public Preview. Contact your Azure Databricks account team to request access. Important This page contains references to the termslave, a term that Microsoft no longer uses. 
When the term is removed from the software, we'll remove it from this page.",2026-01-24T08:00:00.000Z,troubleshooting,troubleshooting,0.9,True,"Troubleshooting page for MySQL ingestion; Databricks connector troubleshooting docs typically list specific error messages/codes and connector-specific resolutions, which are not generally known to LLMs.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-troubleshoot,Troubleshooting,Troubleshoot MySQL ingestion issues - Azure Databricks,Diagnose and fix Lakeflow MySQL ingestion errors,Troubleshoot common issues with MySQL ingestion using Lakeflow Connect.,"Important The MySQL connector is in Public Preview. Contact your Azure Databricks account team to request access. Important This page contains references to the termslave, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this page.",2026-04-22T17:34:00.000Z,troubleshooting,troubleshooting,0.9,True,"Explicitly a troubleshooting guide for MySQL ingestion; Databricks/Lakeflow-specific issues, likely includes concrete error messages and resolutions organized by symptom and fix.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-utility-script,Utility objects script,Prepare MySQL for ingestion using the utility objects script - Azure Databricks,Prepare MySQL with utility objects for Lakeflow,Learn how to prepare MySQL for ingestion into Azure Databricks using the utility objects script.,"Important The MySQL connector is in Public Preview. Contact your Azure Databricks account team to request access. Important This page contains references to the termslave, a term that Microsoft no longer uses. When the term is removed from the software, we'll remove it from this page. 
Complete the MySQL database setup tasks to ingest into Azure Databricks using Lakeflow Connect.",2026-03-16T08:00:00.000Z,how-to,configuration,0.8,True,"Describes preparing MySQL using a utility objects script; likely documents specific database objects, scripts, and configuration steps unique to this connector.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/netsuite,NetSuite,NetSuite connector - Azure Databricks,,Overview of the NetSuite connector for managed ingestion pipelines.,Important This feature is inPublic Preview. The managed NetSuite connector in Lakeflow Connect allows you to ingest data from NetSuite into Azure Databricks.,2026-04-02T22:35:00.000Z,concept-article,,0.2,False,"Connector overview for NetSuite; likely conceptual and marketing without detailed limits, config tables, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/netsuite-limits,Limitations,NetSuite connector limitations - Azure Databricks,Understand NetSuite Lakeflow connector limitations,Learn about known limitations and considerations for ingesting data from NetSuite using Azure Databricks Lakeflow Connect.,Important This feature is inPublic Preview. Learn about known limitations and considerations for ingesting data from NetSuite using Azure Databricks Lakeflow Connect.,2026-03-04T19:15:00.000Z,limits-and-quotas,limits-quotas,0.7,True,"Limitations and considerations page; typically lists unsupported objects, rate limits, and other concrete constraints.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/netsuite-pipeline,Create ingestion pipeline,Ingest data from NetSuite - Azure Databricks,,Learn how to create a managed NetSuite ingestion pipeline using Databricks Lakeflow Connect.,Important This feature is inPublic Preview. 
This page shows how to create a managed NetSuite ingestion pipeline using Lakeflow Connect.,2026-03-25T08:00:00.000Z,how-to,,0.3,False,Pipeline creation tutorial; typically procedural without exhaustive configuration matrices or limits.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/netsuite-pipeline,Create ingestion pipeline,Ingest data from NetSuite - Azure Databricks,,Learn how to create a managed NetSuite ingestion pipeline using Databricks Lakeflow Connect.,Important This feature is in Public Preview. This page shows how to create a managed NetSuite ingestion pipeline using Lakeflow Connect.,2026-04-22T08:00:00.000Z,how-to,,0.2,False,"Described as a page showing how to create a managed NetSuite ingestion pipeline. This is typical tutorial content without evidence of limits, configuration matrices, or troubleshooting mappings.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/netsuite-reference,Reference,NetSuite connector reference - Azure Databricks,Use NetSuite connector tables and type mappings,"Find reference material for the NetSuite ingestion connector in Azure Databricks Lakeflow Connect, including supported data sources, tables, and data type mappings.",Important This feature is in Public Preview. 
Find reference material for the NetSuite ingestion connector in Azure Databricks Lakeflow Connect.,2026-03-25T08:00:00.000Z,reference,configuration,0.8,True,"Reference page with supported data sources, tables, and data type mappings; this is detailed parameter/type mapping information that fits configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/netsuite-source-setup,Set up data source,Configure NetSuite for ingestion into Azure Databricks - Azure Databricks,Configure NetSuite token-based auth for ingestion,Learn how to configure your NetSuite account for Azure Databricks ingestion using token-based authentication.,Important This feature is in Public Preview. Learn how to configure your NetSuite account for Azure Databricks ingestion using token-based authentication (TBA).,2026-01-20T08:00:00.000Z,how-to,configuration,0.75,True,"Describes configuring NetSuite with token-based authentication; such pages usually list specific account settings, token fields, and required roles/permissions—product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/pipeline-maintenance,Pipeline maintenance,Common pipeline maintenance tasks - Azure Databricks,Perform ongoing maintenance for Lakeflow pipelines,Learn how to perform ongoing operations for managed ingestion connectors.,Learn how to perform ongoing operations for managed ingestion pipelines.,2026-02-09T20:20:00.000Z,how-to,best-practices,0.6,True,Covers Databricks-specific operational tasks for managed ingestion connectors; actionable maintenance guidance.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/pipeline-tags,Pipeline tagging,Apply tags to managed ingestion pipelines - Azure Databricks,Configure tags on Lakeflow managed ingestion pipelines,"Learn how to apply tags to managed ingestion pipelines to organize resources, track ownership, and attribute 
costs.","Important This feature is in Public Preview. Applies to: API-based pipeline authoring, SaaS connectors, Database connectors Learn how to apply tags to managed ingestion pipelines to organize resources, track ownership, and attribute costs. Pipeline tags are different from serverless usage policies. Pipeline tags are metadata on the pipeline resource, whereas serverless usage policies apply tags to serverless compute billing records. See Attribute usage with serverless usage policies.",2026-04-16T17:44:00.000Z,how-to,configuration,0.65,True,"Explains how to apply tags to managed ingestion pipelines for organization and cost attribution, distinguishing them from serverless usage policies. This is product-specific configuration of tagging metadata on pipeline resources, fitting configuration.",updated +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/pipeline-tags,Pipeline tagging,Apply tags to managed ingestion pipelines - Azure Databricks,Configure tags on Lakeflow managed ingestion pipelines,"Learn how to apply tags to managed ingestion pipelines to organize resources, track ownership, and attribute costs.","Important This feature is in Public Preview. Applies to: API-based pipeline authoring, SaaS connectors, Database connectors Learn how to apply tags to managed ingestion pipelines to organize resources, track ownership, and attribute costs. Pipeline tags are different from serverless usage policies. Pipeline tags are metadata on the pipeline resource, whereas serverless usage policies apply tags to serverless compute billing records. See Attribute usage with serverless usage policies.",2026-04-16T17:44:00.000Z,how-to,configuration,0.65,True,"Explains how to apply tags to managed ingestion pipelines for organization and cost attribution, distinguishing them from serverless usage policies. 
This is product-specific configuration of tagging metadata on pipeline resources, fitting configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql,PostgreSQL,PostgreSQL ingestion connector - Azure Databricks,,"Learn about the PostgreSQL ingestion workflow, including factors that affect your setup and step-by-step guidance for different user personas.","Important The PostgreSQL connector is in Public Preview. Contact your Azure Databricks account team to request access. This page helps you understand the PostgreSQL ingestion workflow, including the factors that determine your setup approach and the steps involved for different user personas.",2026-04-02T22:35:00.000Z,concept-article,,0.3,False,"Workflow/overview and persona-based guidance; summary does not indicate numeric limits, config tables, or detailed error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-faq,FAQ,PostgreSQL connector FAQs - Azure Databricks,Resolve common PostgreSQL Lakeflow Connect connector issues,Find answers to frequently asked questions about the PostgreSQL connector in Databricks Lakeflow Connect.,Important The PostgreSQL connector for Lakeflow Connect is in Public Preview. Reach out to your Databricks account team to enroll in the Public Preview. 
This page answers frequently asked questions about the PostgreSQL connector in Databricks Lakeflow Connect.,2026-03-31T23:28:00.000Z,faq,troubleshooting,0.65,True,"FAQ for a specific connector usually includes concrete behaviors, edge cases, and resolutions for connector-specific issues that go beyond generic concepts.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-limits,Limitations,PostgreSQL connector limitations - Azure Databricks,Understand PostgreSQL Lakeflow Connect connector limitations,Learn about the limitations and considerations for PostgreSQL ingestion using Databricks Lakeflow Connect.,Important The PostgreSQL connector for Lakeflow Connect is in Public Preview. Reach out to your Databricks account team to enroll in the Public Preview. This page lists the limitations and considerations for PostgreSQL ingestion using Databricks Lakeflow Connect.,2026-03-31T23:28:00.000Z,limits-and-quotas,limits-quotas,0.7,True,"Limitations/considerations page for a specific connector is likely to enumerate concrete constraints (unsupported features, size or performance limits) that qualify as expert knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-faq,FAQ,PostgreSQL connector FAQs - Azure Databricks,,Find answers to frequently asked questions about the PostgreSQL connector in Databricks Lakeflow Connect.,Important The PostgreSQL connector for Lakeflow Connect is in Public Preview. Reach out to your Databricks account team to enroll in the Public Preview. 
This page answers frequently asked questions about the PostgreSQL connector in Databricks Lakeflow Connect.,2026-04-22T08:00:00.000Z,faq,,0.2,False,"FAQ page; description does not indicate detailed error codes, config tables, or numeric limits—likely conceptual and preview/usage Q&A.",updated +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-limits,Limitations,PostgreSQL connector limitations - Azure Databricks,,Learn about the limitations and considerations for PostgreSQL ingestion using Databricks Lakeflow Connect.,Important The PostgreSQL connector for Lakeflow Connect is in Public Preview. Reach out to your Databricks account team to enroll in the Public Preview. This page lists the limitations and considerations for PostgreSQL ingestion using Databricks Lakeflow Connect.,2026-04-21T08:00:00.000Z,limits-and-quotas,,0.4,False,Described as limitations and considerations but not clearly numeric limits/quotas; likely qualitative constraints rather than specific values or tables required for limits-quotas.,updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-maintenance,Maintenance,Maintain PostgreSQL ingestion pipelines - Azure Databricks,Maintain and operate PostgreSQL ingestion pipelines in Lakeflow,Learn how to perform ongoing operations for maintaining your PostgreSQL ingestion pipelines.,Important The PostgreSQL connector for Lakeflow Connect is in Public Preview. Reach out to your Databricks account team to enroll in the Public Preview. 
This page describes the ongoing operations for maintaining PostgreSQL ingestion pipelines.,2026-03-18T22:49:00.000Z,how-to,best-practices,0.65,True,"Ongoing operations and maintenance guidance for ingestion pipelines typically includes product-specific recommendations, gotchas, and operational patterns (for example, how and when to refresh, handle schema changes, or manage performance), which fits best-practices for this connector.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-pipeline,Create ingestion pipeline,Ingest data from PostgreSQL - Azure Databricks,,Learn how to ingest data from PostgreSQL and load it into Azure Databricks using Lakeflow Connect.,"Important The PostgreSQL connector for Lakeflow Connect is in Public Preview. Reach out to your Databricks account team to enroll in the Public Preview. This page describes how to ingest data from PostgreSQL and load it into Azure Databricks using Lakeflow Connect. The PostgreSQL connector supports AWS RDS PostgreSQL, Aurora PostgreSQL, Amazon EC2, Azure Database for PostgreSQL, Azure virtual machines, GCP Cloud SQL for PostgreSQL, and on-premises PostgreSQL databases using Azure ExpressRoute, A",2026-03-23T08:00:00.000Z,how-to,,0.3,False,"How-to pipeline setup; summary does not mention detailed config tables, limits, or error mappings beyond standard tutorial content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-privileges,Privilege requirements for source database users,PostgreSQL database user requirements - Azure Databricks,Configure PostgreSQL user privileges for Databricks ingestion,Learn about the privileges you must grant to the PostgreSQL database user that you plan to use for ingesting data into Azure Databricks.,"Important The PostgreSQL connector for Lakeflow Connect is in Public Preview. Reach out to your Databricks account team to enroll in the Public Preview. 
This page describes the privileges required for the PostgreSQL replication user that you create for Databricks ingestion. Note For AWS RDS and Aurora, you don't need superuser privileges. The rds_replication role provides the necessary replication privileges. Note For Azure Database for PostgreSQL Flexible Server, ensure logical replication is ena",2026-01-20T08:00:00.000Z,how-to,security,0.7,True,"Describes required PostgreSQL privileges/roles for the replication user and cloud-specific notes (RDS/Aurora, Azure Flexible Server); this is product-specific IAM/privilege configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-reference,Reference,PostgreSQL connector reference - Azure Databricks,Reference PostgreSQL connector types and unsupported objects,"Access reference material for the PostgreSQL connector in Databricks Lakeflow Connect. For example, unsupported objects and transformations from PostgreSQL data types to Delta-compatible data types.",Important The PostgreSQL connector for Lakeflow Connect is in Public Preview. Reach out to your Databricks account team to enroll in the Public Preview. This page contains reference material for the PostgreSQL connector in Databricks Lakeflow Connect.,2026-03-31T23:28:00.000Z,reference,configuration,0.7,True,"Reference for unsupported objects and data type transformations implies detailed, product-specific mapping tables and constraints that function as configuration/reference data.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-source-setup,Set up data source,Configure PostgreSQL for ingestion into Azure Databricks - Azure Databricks,Configure PostgreSQL source for Lakeflow Connect ingestion,Preview the source setup tasks for ingestion from PostgreSQL into Azure Databricks using Lakeflow Connect.,Important The PostgreSQL connector for Lakeflow Connect is in Public Preview. 
Reach out to your Databricks account team to enroll in the Public Preview. This page describes the source setup tasks for ingestion from PostgreSQL into Azure Databricks using Lakeflow Connect.,2026-03-18T22:49:00.000Z,how-to,configuration,0.7,True,"Described as 'source setup tasks' for PostgreSQL ingestion, which typically includes product-specific connection settings, parameters, and possibly required configuration values unique to the connector. This aligns with configuration guidance rather than generic tutorial content.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-troubleshoot,Troubleshooting,Troubleshoot PostgreSQL ingestion - Azure Databricks,Troubleshoot PostgreSQL ingestion with Lakeflow Connect,Learn about common issues with the PostgreSQL connector in Databricks Lakeflow Connect and how to resolve them.,Important The PostgreSQL connector for Lakeflow Connect is in Public Preview. Reach out to your Databricks account team to enroll in the Public Preview. 
This page describes the common issues with the PostgreSQL connector in Databricks Lakeflow Connect and how to resolve them.,2026-01-20T08:00:00.000Z,troubleshooting,troubleshooting,0.9,True,"Explicit troubleshooting guide for PostgreSQL connector; will map specific symptoms and errors to causes and fixes, which is expert, product-specific knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/query-based-limits,Limitations,Query-based connector limitations - Azure Databricks,,"Known limitations for query-based connectors in Lakeflow Connect, including deletion tracking and cursor column constraints.",Important This feature is in Public Preview.,2026-04-15T22:08:00.000Z,limits-and-quotas,,0.4,False,"Describes limitations conceptually (deletion tracking, cursor constraints) but summary does not indicate specific numeric limits, quotas, or tier-based tables required for limits-quotas classification.",new -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/query-based-overview,Query-based connectors,Query-based connectors - Azure Databricks,,"Learn how query-based connectors in Lakeflow Connect ingest database data by querying the source directly using a cursor column, without CDC or an ingestion gateway.","Important This feature is in Public Preview. Query-based connectors in Lakeflow Connect ingest data from databases by querying the source directly, without requiring change data capture (CDC) configuration. Instead of relying on binlogs or CDC infrastructure, they use a cursor column—a monotonically increasing timestamp or integer column—to track which rows are new or updated since the last pipeline run. 
Query-based connectors use Unity Catalog connections and Lakehouse Federation to connect to sou",2026-04-15T22:08:00.000Z,concept-article,,0.3,False,"Overview of query-based connectors and cursor-column concept; no clear evidence of numeric limits, config parameter tables, or detailed error mappings from the summary.",new -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/query-based-pipeline,Create ingestion pipeline,Create a query-based ingestion pipeline - Azure Databricks,,Learn how to create a query-based ingestion pipeline in Lakeflow Connect using the Azure Databricks UI or Declarative Automation Bundles.,Important This feature is in Public Preview. This page shows how to create a query-based ingestion pipeline in Lakeflow Connect.,2026-04-15T22:08:00.000Z,how-to,,0.3,False,"How-to guide for creating a query-based ingestion pipeline; described as a UI/automation walkthrough, not a reference of config parameters, limits, or troubleshooting details.",new -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/query-based-reference,Reference,Query-based connector reference - Azure Databricks,Configure query-based connectors in Lakeflow Connect,"Reference for query-based connector configuration parameters, cursor column requirements, soft-deletion and hard-deletion tracking, and error conditions.","Important This feature is in Public Preview. 
This page contains reference documentation for query-based connectors in Lakeflow Connect, including configuration parameters, cursor column requirements, deletion tracking syntax, and error conditions.",2026-04-15T22:08:00.000Z,reference,configuration,0.8,True,"Explicitly described as a reference for connector configuration parameters, cursor column requirements, deletion tracking syntax, and error conditions—indicating parameter-level configuration details and product-specific behaviors.",new -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/query-based-troubleshoot,Troubleshooting,Troubleshoot query-based connectors - Azure Databricks,Troubleshoot Lakeflow Connect query-based connector issues,"Troubleshoot common issues with query-based connectors in Lakeflow Connect, including cursor column errors, connection failures, and deletion tracking.",Important Query-based connectors are in Public Preview. Contact your Azure Databricks account team to request access.,2026-04-15T22:08:00.000Z,troubleshooting,troubleshooting,0.9,True,"Focused on troubleshooting common issues with query-based connectors, including specific error conditions like cursor column errors and connection failures, matching symptom→cause→solution troubleshooting guidance.",new +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-troubleshoot,Troubleshooting,Troubleshoot PostgreSQL ingestion - Azure Databricks,Resolve common PostgreSQL Lakeflow ingestion issues,Learn about common issues with the PostgreSQL connector in Databricks Lakeflow Connect and how to resolve them.,Important The PostgreSQL connector for Lakeflow Connect is in Public Preview. Reach out to your Databricks account team to enroll in the Public Preview. 
This page describes the common issues with the PostgreSQL connector in Databricks Lakeflow Connect and how to resolve them.,2026-04-22T08:00:00.000Z,troubleshooting,troubleshooting,0.9,True,Troubleshooting page for PostgreSQL connector; Databricks-specific ingestion problems with symptom-to-solution mappings and likely error messages.,updated +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/query-based-limits,Limitations,Query-based connector limitations - Azure Databricks,,"Known limitations for query-based connectors in Lakeflow Connect, including deletion tracking and cursor column constraints.",Important This feature is in Public Preview.,2026-04-15T22:08:00.000Z,limits-and-quotas,,0.4,False,"Describes limitations conceptually (deletion tracking, cursor constraints) but summary does not indicate specific numeric limits, quotas, or tier-based tables required for limits-quotas classification.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/query-based-overview,Query-based connectors,Query-based connectors - Azure Databricks,,"Learn how query-based connectors in Lakeflow Connect ingest database data by querying the source directly using a cursor column, without CDC or an ingestion gateway.","Important This feature is in Public Preview. Query-based connectors in Lakeflow Connect ingest data from databases by querying the source directly, without requiring change data capture (CDC) configuration. Instead of relying on binlogs or CDC infrastructure, they use a cursor column—a monotonically increasing timestamp or integer column—to track which rows are new or updated since the last pipeline run. 
Query-based connectors use Unity Catalog connections and Lakehouse Federation to connect to sou",2026-04-15T22:08:00.000Z,concept-article,,0.3,False,"Overview of query-based connectors and cursor-column concept; no clear evidence of numeric limits, config parameter tables, or detailed error mappings from the summary.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/query-based-pipeline,Create ingestion pipeline,Create a query-based ingestion pipeline - Azure Databricks,,Learn how to create a query-based ingestion pipeline in Lakeflow Connect using the Azure Databricks UI or Declarative Automation Bundles.,Important This feature is in Public Preview. This page shows how to create a query-based ingestion pipeline in Lakeflow Connect.,2026-04-15T22:08:00.000Z,how-to,,0.3,False,"How-to guide for creating a query-based ingestion pipeline; described as a UI/automation walkthrough, not a reference of config parameters, limits, or troubleshooting details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/query-based-reference,Reference,Query-based connector reference - Azure Databricks,Configure query-based connectors in Lakeflow Connect,"Reference for query-based connector configuration parameters, cursor column requirements, soft-deletion and hard-deletion tracking, and error conditions.","Important This feature is in Public Preview. 
This page contains reference documentation for query-based connectors in Lakeflow Connect, including configuration parameters, cursor column requirements, deletion tracking syntax, and error conditions.",2026-04-15T22:08:00.000Z,reference,configuration,0.8,True,"Explicitly described as a reference for connector configuration parameters, cursor column requirements, deletion tracking syntax, and error conditions—indicating parameter-level configuration details and product-specific behaviors.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/query-based-troubleshoot,Troubleshooting,Troubleshoot query-based connectors - Azure Databricks,Troubleshoot Lakeflow Connect query-based connector issues,"Troubleshoot common issues with query-based connectors in Lakeflow Connect, including cursor column errors, connection failures, and deletion tracking.",Important Query-based connectors are in Public Preview. Contact your Azure Databricks account team to request access.,2026-04-15T22:08:00.000Z,troubleshooting,troubleshooting,0.9,True,"Focused on troubleshooting common issues with query-based connectors, including specific error conditions like cursor column errors and connection failures, matching symptom→cause→solution troubleshooting guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/row-filtering,Row filtering,Select rows to ingest - Azure Databricks,Configure row filtering for Lakeflow Connect ingestion,Learn how to filter rows during ingestion with Lakeflow Connect to improve performance and minimize data duplication.,Important This feature is in Beta. Applies to: API-based pipeline authoring, SaaS connectors Row filtering allows you to ingest only the data you need by applying conditions similar to a SQL WHERE clause. 
This improves performance (especially for initial loads with historical data) and minimizes data duplication (especially in development environments).,2026-03-31T23:28:00.000Z,how-to,configuration,0.7,True,"Describes a 'row filtering' feature in beta, analogous to SQL WHERE, including its impact on performance and duplication. This is a specific configuration capability for managed connectors.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/run-as,Run as,Configure the Run as identity for a pipeline - Azure Databricks,Configure Run as identity for Lakeflow ingestion pipelines,Configure the Run as identity for a managed ingestion pipeline to control which permissions the pipeline uses when it runs.,"Applies to: SaaS connectors, Database connectors The Run as setting determines which identity's permissions are used when a managed ingestion pipeline runs. By default, the pipeline runs as its owner (a user). Databricks recommends changing Run as to a service principal so that pipelines continue to run if the owner leaves the organization or loses access to the workspace. Using a service principal ensures the pipeline continues to run regardless of changes to individual user accounts.",2026-03-25T08:00:00.000Z,how-to,security,0.7,True,"Explains the 'Run as' setting, defaulting to owner, and recommends using a service principal to ensure continuity. This is product-specific identity/permission configuration guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/saas-overview,SaaS connectors,SaaS connectors in Lakeflow Connect - Azure Databricks,,Learn about the SaaS connectors in Databricks Lakeflow Connect for ingesting data from enterprise applications.,"Databricks Lakeflow Connect provides fully-managed connectors for ingesting data from enterprise SaaS applications. 
Each connector handles source-specific authentication, incremental reads, schema evolution, and automated retries.",2026-04-01T08:00:00.000Z,landing-page,,0.2,False,"High-level overview of SaaS connectors; no detailed limits, configs, or error mappings indicated.",unchanged @@ -1176,14 +1197,14 @@ https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sc https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/servicenow,ServiceNow,ServiceNow connector - Azure Databricks,,"Overview of the ServiceNow connector for managed ingestion pipelines, including authentication methods and step-by-step guidance for different user personas.",The managed ServiceNow connector in Lakeflow Connect allows you to ingest data from ServiceNow into Azure Databricks.,2026-04-02T22:35:00.000Z,concept-article,,0.25,False,"ServiceNow connector overview; primarily conceptual and persona-focused guidance, not detailed limits, configuration matrices, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/servicenow-faq,FAQ,ServiceNow connector FAQs - Azure Databricks,,Find answers to frequently asked questions about the ServiceNow connector in Databricks Lakeflow Connect.,This page answers frequently asked questions about the ServiceNow connector in Databricks Lakeflow Connect.,2026-03-31T23:28:00.000Z,faq,,0.35,False,"FAQ page; may contain some specifics but not clearly structured as limits, configuration reference, or troubleshooting with error codes.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/servicenow-limits,Limitations,ServiceNow connector limitations - Azure Databricks,Understand ServiceNow Lakeflow connector limitations,Learn about limitations and considerations for ingesting data from ServiceNow using Databricks Lakeflow Connect.,This page lists limitations and considerations for ingesting data from ServiceNow using Databricks Lakeflow 
Connect.,2026-04-02T22:35:00.000Z,limits-and-quotas,limits-quotas,0.75,True,"A limitations page for ServiceNow ingestion is expected to list concrete constraints and unsupported scenarios, which are expert limits-quotas details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/servicenow-pipeline,Create ingestion pipeline,Ingest data from ServiceNow - Azure Databricks,,Learn how to create a managed ServiceNow ingestion pipeline using Databricks Lakeflow Connect.,Learn how to create a managed ServiceNow ingestion pipeline using Lakeflow Connect.,2026-03-24T01:32:00.000Z,how-to,,0.3,False,Pipeline creation tutorial for ServiceNow; generally procedural without exhaustive config tables or limits.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/servicenow-pipeline,Create ingestion pipeline,Ingest data from ServiceNow - Azure Databricks,,Learn how to create a managed ServiceNow ingestion pipeline using Databricks Lakeflow Connect.,Learn how to create a managed ServiceNow ingestion pipeline using Lakeflow Connect.,2026-04-22T08:00:00.000Z,how-to,,0.2,False,"ServiceNow ingestion pipeline creation guide; summary suggests procedural instructions only. 
No signs of detailed limits, config tables, or error-resolution content that would qualify as expert knowledge.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/servicenow-reference,Reference,ServiceNow connector reference - Azure Databricks,Use ServiceNow connector data type mappings,"Access reference material for ingestion using the ServiceNow connector in Databricks Lakeflow Connect, including transformations from ServiceNow data types to Delta-compatible data types.",This page contains reference material for the ServiceNow connector in Databricks Lakeflow Connect.,2026-01-20T08:00:00.000Z,reference,configuration,0.8,True,Reference material with transformations from ServiceNow data types to Delta-compatible types implies detailed mapping tables—configuration-level expert knowledge.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/servicenow-source-setup,Set up data source,Configure ServiceNow for Azure Databricks ingestion - Azure Databricks,Configure ServiceNow instance for Databricks ingestion,Learn how to configure your ServiceNow instance for Azure Databricks ingestion.,Learn how to configure your ServiceNow instance for Azure Databricks ingestion.,2026-03-31T23:28:00.000Z,how-to,configuration,0.7,True,"Configuration page for ServiceNow; likely includes specific instance settings, roles, and connection parameters—product-specific configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/servicenow-troubleshoot,Troubleshooting,Troubleshoot ServiceNow ingestion - Azure Databricks,Diagnose and fix Databricks ServiceNow connector issues,Learn about common issues with the ServiceNow connector in Databricks Lakeflow Connect and how to resolve them.,This page describes common issues with the ServiceNow connector in Databricks Lakeflow Connect and how to resolve 
them.,2026-03-16T17:36:00.000Z,troubleshooting,troubleshooting,0.86,True,"The page is explicitly a troubleshooting guide for the ServiceNow connector in Databricks Lakeflow Connect, describing common issues and how to resolve them. Such content typically includes product-specific error messages, causes, and resolution steps that go beyond generic debugging knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint,SharePoint,SharePoint connector - Azure Databricks,,"Overview of the SharePoint connector for managed ingestion pipelines, including authentication methods and step-by-step guidance for different user personas.",Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. The managed SharePoint connector in Lakeflow Connect allows you to ingest unstructured files from SharePoint into Azure Databricks.,2026-04-02T22:35:00.000Z,concept-article,,0.25,False,"SharePoint connector overview; likely conceptual and marketing, not detailed limits, configuration parameters, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-faq,FAQ,Microsoft SharePoint connector FAQs - Azure Databricks,,Find answers to frequently asked questions about the Microsoft SharePoint connector in Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
This page answers frequently asked questions about the Microsoft SharePoint connector in Databricks Lakeflow Connect.,2026-01-20T08:00:00.000Z,faq,,0.35,False,"FAQ page for SharePoint connector; may contain mixed information but not clearly structured as limits, configuration tables, or troubleshooting with error codes.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-limits,Limitations,Microsoft SharePoint connector limitations - Azure Databricks,Review Microsoft SharePoint connector ingestion limits,Learn about limitations and considerations for ingestion from Microsoft SharePoint using Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page lists limitations and considerations for ingestion from Microsoft SharePoint using Databricks Lakeflow Connect.,2026-04-02T22:35:00.000Z,limits-and-quotas,limits-quotas,0.75,True,"A limitations page for SharePoint ingestion is expected to list concrete constraints (for example, file types, sizes, or rate limits), which are expert limits-quotas knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-pipeline,Create ingestion pipeline,Ingest data from SharePoint - Azure Databricks,,Learn how to create a Microsoft SharePoint ingestion pipeline using Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. :::note Compliance The managed SharePoint connector supports use in workspaces with the Configure enhanced security and compliance settings enabled. 
::: This page shows how to create a managed Microsoft SharePoint ingestion pipeline using Lakeflow Connect.,2026-03-31T23:28:00.000Z,how-to,,0.35,False,"Pipeline creation tutorial for SharePoint; mostly procedural and not clearly a configuration reference, limits page, or troubleshooting guide.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-pipeline,Create ingestion pipeline,Ingest data from SharePoint - Azure Databricks,Use SharePoint connector with enhanced security workspaces,Learn how to create a Microsoft SharePoint ingestion pipeline using Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. :::note Compliance The managed SharePoint connector supports use in workspaces with the Configure enhanced security and compliance settings enabled. ::: This page shows how to create a managed Microsoft SharePoint ingestion pipeline using Lakeflow Connect.,2026-04-22T08:00:00.000Z,how-to,security,0.65,True,Explicitly states the managed SharePoint connector supports use in workspaces with 'Configure enhanced security and compliance settings' enabled. This is a product-specific security/compliance capability detail that LLMs are unlikely to know from training and maps to security configuration behavior.,updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-rag,Downstream RAG use case,Downstream RAG use case - Azure Databricks,,"Now that you've created your SharePoint pipeline, learn how to proceed with your downstream RAG use case.","Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Now that you've created your SharePoint pipeline, you can parse the raw documents to text, chunk the parsed data, create embeddings from the chunks, and more. 
You can then use readStream on the output table directly in your downstream pipeline.",2026-01-20T08:00:00.000Z,how-to,,0.3,False,"Downstream RAG use case description; likely conceptual workflow guidance rather than detailed product-specific limits, configs, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-reference,Reference,Microsoft SharePoint connector reference - Azure Databricks,Use Microsoft SharePoint connector reference settings,Access reference material for the Microsoft SharePoint connector in Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page contains reference material for the Microsoft SharePoint connector in Databricks Lakeflow Connect.,2026-01-24T08:00:00.000Z,reference,configuration,0.7,True,"Connector reference pages typically include supported objects, parameters, and possibly type mappings—detailed configuration information specific to this product.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-source-setup-m2m,OAuth M2M,Configure OAuth M2M for SharePoint ingestion - Azure Databricks,Configure OAuth M2M auth for SharePoint ingestion,Learn how to configure OAuth machine-to-machine (M2M) authentication for SharePoint ingestion into Azure Databricks.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Important M2M OAuth for SharePoint is in Public Preview. 
Learn how to configure OAuth machine-to-machine (M2M) authentication for SharePoint ingestion into Azure Databricks.,2026-01-24T08:00:00.000Z,how-to,security,0.75,True,"Describes configuring OAuth machine-to-machine authentication; expected to include specific app registrations, scopes, and permission settings—product-specific security configuration.",unchanged @@ -1191,41 +1212,42 @@ https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sh https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-source-setup-refresh-token,Manual refresh token,Configure manual token refresh authentication for Microsoft SharePoint ingestion - Azure Databricks,Set up manual token refresh for SharePoint ingestion,Learn how to configure manual token refresh authentication for Microsoft SharePoint ingestion into Azure Databricks.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page describes how to configure manual token refresh authentication for Microsoft SharePoint ingestion into Azure Databricks.,2026-03-26T08:00:00.000Z,how-to,security,0.7,True,"Manual token refresh authentication involves specific token lifetimes, refresh endpoints, and secrets; these are product-specific security configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-source-setup-u2m,OAuth U2M,Configure OAuth U2M for Microsoft SharePoint ingestion - Azure Databricks,Configure OAuth U2M auth for SharePoint ingestion,Learn how to configure OAuth user-to-machine (U2M) authentication for Microsoft SharePoint ingestion into Azure Databricks.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. 
This page describes how to configure OAuth user-to-machine (U2M) authentication for Microsoft SharePoint ingestion into Azure Databricks.,2026-03-26T08:00:00.000Z,how-to,security,0.75,True,"User-to-machine OAuth configuration for SharePoint ingestion; likely details scopes, consent, and app settings—security-focused configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-troubleshoot,Troubleshooting,Troubleshoot Microsoft SharePoint ingestion - Azure Databricks,Diagnose and fix Lakeflow SharePoint connector issues,Learn about common issues with the Microsoft SharePoint connector in Databricks Lakeflow Connect and how to resolve them.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page describes common issues with the Microsoft SharePoint connector in Databricks Lakeflow Connect and how to resolve them.,2026-01-30T08:00:00.000Z,troubleshooting,troubleshooting,0.86,True,A dedicated troubleshooting page for the SharePoint connector; likely organized by specific error messages/symptoms and corresponding resolutions unique to Lakeflow Connect.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-concepts,Concepts,Microsoft SQL Server connector concepts - Azure Databricks,,Learn about the fully-managed Microsoft SQL Server connector in Databricks Lakeflow Connect.,"This page describes how the SQL Server connector works, including its core concepts.",2026-03-31T23:28:00.000Z,concept-article,,0.2,False,Concepts/how-it-works description; likely high-level connector concepts rather than detailed configuration or troubleshooting data.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-concepts,Concepts,Microsoft SQL Server connector concepts - Azure Databricks,,Learn about the fully-managed Microsoft SQL Server connector 
in Databricks Lakeflow Connect.,"This page describes how the SQL Server connector works, including its core concepts.",2026-04-22T17:34:00.000Z,concept-article,,0.1,False,"Concepts/overview page explaining how the SQL Server connector works; no indication of numeric limits, config tables, or troubleshooting content.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-faq,FAQ,SQL Server connector FAQs - Azure Databricks,Answer common SQL Server Lakeflow Connect connector questions,Find answers to frequently asked questions about the SQL Server connector in Databricks Lakeflow Connect.,This page answers frequently asked questions about the SQL Server connector in Databricks Lakeflow Connect.,2026-03-31T23:28:00.000Z,faq,troubleshooting,0.65,True,"Connector FAQ is likely to include specific behaviors, edge cases, and resolutions for SQL Server connector usage that are not generic knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-limits,Limitations,SQL Server connector limitations - Azure Databricks,Review SQL Server Lakeflow Connect connector limitations,Learn about limitations and considerations for SQL Server ingestion using Databricks Lakeflow Connect.,This page lists limitations and considerations for SQL Server ingestion using Databricks Lakeflow Connect.,2026-04-02T22:35:00.000Z,limits-and-quotas,limits-quotas,0.7,True,"Limitations/considerations page for SQL Server connector is expected to list concrete constraints and unsupported scenarios, which are expert, product-specific details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-limits,Limitations,SQL Server connector limitations - Azure Databricks,,Learn about limitations and considerations for SQL Server ingestion using Databricks Lakeflow Connect.,This page lists limitations and considerations for SQL Server ingestion using Databricks Lakeflow 
Connect.,2026-04-21T18:07:00.000Z,limits-and-quotas,,0.4,False,Lists limitations and considerations but summary does not show concrete numeric limits or quotas; likely qualitative constraints rather than detailed limit tables.,updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-overview,SQL Server,Microsoft SQL Server ingestion connector - Azure Databricks,,"Learn about the SQL Server ingestion workflow, including factors that affect your setup and step-by-step guidance for different user personas.","This page helps you understand the SQL Server ingestion workflow, including the factors that determine your setup approach and the steps involved for different user personas.",2026-04-02T22:35:00.000Z,concept-article,,0.2,False,"Connector workflow overview; no indication of detailed limits, config matrices, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-pipeline,Create ingestion pipeline,Ingest data from SQL Server - Azure Databricks,,Learn how to ingest data from SQL Server into Azure Databricks using Lakeflow Connect.,"Learn how to ingest data from SQL Server into Azure Databricks using Lakeflow Connect. The SQL Server connector supports Azure SQL Database, Azure SQL Managed Instance, and Amazon RDS SQL databases. This includes SQL Server running on Azure virtual machines (VMs) and Amazon EC2. 
The connector also supports SQL Server on-premises using Azure ExpressRoute and AWS Direct Connect networking.",2026-03-24T01:32:00.000Z,how-to,,0.3,False,"How-to ingestion pipeline guide; summary does not indicate detailed config tables, limits, or error mappings beyond standard tutorial content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-privileges,Privilege requirements for source database users,Microsoft SQL Server database user requirements - Azure Databricks,Set SQL Server user permissions for Databricks ingestion,Learn about the privileges you must grant to the Microsoft SQL Server database user that you plan to use for ingesting data into Azure Databricks.,Learn which privileges to grant the Microsoft SQL Server database user that you plan to use for ingesting into Azure Databricks. Databricks recommends that you create a database user that is solely used for Databricks ingestion.,2026-04-02T22:35:00.000Z,how-to,security,0.8,True,Covers required SQL Server privileges for the ingestion user; this is product-specific security/permission configuration.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-reference,Reference,SQL Server connector reference - Azure Databricks,Use SQL Server connector reference for types and objects,"Access reference material for the SQL Server connector in Databricks Lakeflow Connect. 
For example, unsupported objects and transformations from SQL Server data types to Delta-compatible data types.",This page contains reference material for the SQL Server connector in Databricks Lakeflow Connect.,2026-02-02T08:00:00.000Z,reference,configuration,0.7,True,Reference for unsupported objects and data type transformations implies detailed mapping tables and constraints that serve as expert configuration/reference information.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-source-setup,Set up data source,Configure Microsoft SQL Server for ingestion into Azure Databricks - Azure Databricks,Prepare SQL Server source for Databricks ingestion,Preview the source setup tasks for ingestion from SQL Server into Azure Databricks using Lakeflow Connect.,Preview the source setup tasks for ingestion from SQL Server into Azure Databricks using Lakeflow Connect.,2026-02-02T19:09:00.000Z,how-to,configuration,0.65,True,"Source setup tasks for SQL Server ingestion typically include specific configuration steps, settings, and possibly parameter values unique to Lakeflow Connect.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-troubleshoot,Troubleshooting,Troubleshoot SQL Server ingestion - Azure Databricks,Troubleshoot SQL Server ingestion with Lakeflow Connect,Learn about common issues with the Microsoft SQL Server connector in Databricks Lakeflow Connect and how to resolve them.,This page describes common issues with the Microsoft SQL Server connector in Databricks Lakeflow Connect and how to resolve them.,2026-03-26T08:00:00.000Z,troubleshooting,troubleshooting,0.9,True,Explicit troubleshooting guide; will contain symptom → cause → solution mappings and possibly error codes/messages specific to the SQL Server connector.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-troubleshoot,Troubleshooting,Troubleshoot SQL Server 
ingestion - Azure Databricks,Troubleshoot Databricks Lakeflow SQL Server ingestion,Learn about common issues with the Microsoft SQL Server connector in Databricks Lakeflow Connect and how to resolve them.,This page describes common issues with the Microsoft SQL Server connector in Databricks Lakeflow Connect and how to resolve them.,2026-04-22T17:34:00.000Z,troubleshooting,troubleshooting,0.9,True,"Explicit troubleshooting guide for SQL Server connector; expected to contain connector-specific errors, causes, and resolutions.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-utility,Utility objects script,Prepare SQL Server for ingestion using the utility objects script - Azure Databricks,Run SQL Server utility script for Lakeflow ingestion,Learn how to prepare SQL Server for ingestion into Azure Databricks using the utility objects script.,Complete the SQL Server database setup tasks to ingest into Azure Databricks using Lakeflow Connect.,2026-03-04T19:15:00.000Z,how-to,configuration,0.78,True,"Preparing SQL Server using a utility objects script implies specific script parameters, objects, and configuration steps unique to this connector—expert configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-utility-reference,Script reference,SQL Server utility objects script reference - Azure Databricks,Reference SQL Server utility objects script for Lakeflow,"Access reference material for the SQL Server utility objects script, including components, parameters, and troubleshooting.","Access reference material for the SQL Server utility objects script, including components, parameters, and troubleshooting.",2026-03-04T19:15:00.000Z,reference,configuration,0.84,True,"Reference for a utility script with components and parameters is a configuration reference, likely including parameter names, allowed values, and behavior.",unchanged 
https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/table-rename,Name destination tables,Name a destination table - Azure Databricks,Set custom destination table names in Lakeflow Connect,"Learn how to name a destination table during Lakeflow Connect managed ingestion. By default, destination tables are given the names of the corresponding source tables.","Applies to: UI-based pipeline authoring, API-based pipeline authoring, SaaS connectors, Database connectors. By default, a destination table created during Lakeflow Connect managed ingestion is given the name of the corresponding source table. However, you can optionally specify a different name for the destination table. For example, if you ingest an object into two tables in the same schema, you must specify a unique name for one of the tables to differentiate between them. Managed ingestion connector",2026-03-16T17:36:00.000Z,how-to,configuration,0.7,True,"Describes how to override default destination table naming and the requirement for unique names when ingesting the same object into multiple tables in a schema. This is a concrete, product-specific configuration option.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/tiktok-ads-limits,Limitations,TikTok Ads connector limitations - Azure Databricks,Review TikTok Ads Lakeflow connector limitations,Limitations of the TikTok Ads connector for managed ingestion pipelines.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
The managed TikTok Ads connector in Lakeflow Connect has the following limitations:,2026-04-02T22:35:00.000Z,concept-article,limits-quotas,0.82,True,"A limitations page for a specific connector typically lists concrete constraints (supported objects, date ranges, API caps) that are product-specific and not generally known.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/tiktok-ads-overview,TikTok Ads,TikTok Ads connector - Azure Databricks,,Overview of the TikTok Ads connector for managed ingestion pipelines.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. The managed TikTok Ads connector in Lakeflow Connect allows you to ingest data from TikTok Ads into Azure Databricks.,2026-04-02T22:35:00.000Z,concept-article,,0.1,False,"Connector overview for TikTok Ads; described as an overview without indication of limits, config tables, or troubleshooting details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/tiktok-ads-pipeline,Create ingestion pipeline,Ingest data from TikTok Ads - Azure Databricks,,Learn how to create a managed TikTok Ads ingestion pipeline using Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
This page shows how to create a managed TikTok Ads ingestion pipeline using Lakeflow Connect.,2026-03-24T01:32:00.000Z,how-to,,0.2,False,How-to guide for creating a TikTok Ads ingestion pipeline; appears to be a step-by-step tutorial rather than configuration reference or troubleshooting with expert-only details.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/tiktok-ads-pipeline,Create ingestion pipeline,Ingest data from TikTok Ads - Azure Databricks,,Learn how to create a managed TikTok Ads ingestion pipeline using Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page shows how to create a managed TikTok Ads ingestion pipeline using Lakeflow Connect.,2026-04-22T08:00:00.000Z,how-to,,0.3,False,"Scenario/tutorial-style ingestion guide for TikTok Ads using Lakeflow Connect; likely step-by-step UI usage without detailed config tables, limits, or product-specific error mappings.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/tiktok-ads-reference,Reference,TikTok Ads connector reference - Azure Databricks,"Reference TikTok Ads tables, dimensions, metrics","Reference information for the TikTok Ads connector, including supported tables, dimensions, and metrics.","Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
This article provides reference information for the TikTok Ads connector, including supported tables, dimensions, and metrics.",2026-02-19T23:31:00.000Z,concept-article,configuration,0.8,True,"Lists supported tables, dimensions, and metrics—detailed connector-specific schema/configuration information that is expert reference knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/tiktok-ads-source-setup,Set up data source,Configure TikTok Ads for managed ingestion - Azure Databricks,Configure TikTok Ads auth for Databricks ingestion,Learn how to configure TikTok Ads to enable authentication from Azure Databricks.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Learn how to configure TikTok Ads to enable authentication from Azure Databricks.,2026-02-19T23:31:00.000Z,how-to,security,0.76,True,"Configuring TikTok Ads for authentication will involve app credentials, scopes, and callback settings—product-specific security/auth configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/tiktok-ads-troubleshoot,Troubleshooting,Troubleshoot the TikTok Ads connector - Azure Databricks,Troubleshoot TikTok Ads connector in Lakeflow,Troubleshoot common issues with the TikTok Ads connector.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
Learn how to troubleshoot common issues with the TikTok Ads connector.,2026-02-19T23:31:00.000Z,concept-article,troubleshooting,0.9,True,"Troubleshooting page for a specific connector; expected to contain error messages/codes and resolution steps, which is expert troubleshooting knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/tls-server-certificate-validation,TLS server certificate validation,TLS server certificate validation - Azure Databricks,Configure TLS certificate validation for Lakeflow Connect databases,Configure TLS server certificate validation for Lakeflow Connect database connectors to verify source server identity and prevent person-in-the-middle attacks.,"Lakeflow Connect database connector pipelines encrypt all data in transit using TLS. Starting with newly created pipelines, Lakeflow Connect also validates the source database server's TLS certificate. This certificate validation verifies that the pipeline is connecting to the intended server—not an impersonator—and prevents person-in-the-middle (PITM) attacks. This page applies to the MySQL, PostgreSQL, and SQL Server Lakeflow Connect connectors.",2026-04-22T17:34:00.000Z,how-to,security,0.7,True,"Explains how to configure TLS server certificate validation for specific Lakeflow Connect database connectors (MySQL, PostgreSQL, SQL Server), including product-specific security behavior and settings to prevent PITM attacks, which fits security configuration guidance.",new https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-hcm-limits,Limitations,Workday HCM connector limitations - Azure Databricks,Understand Workday HCM Lakeflow connector limitations,Limitations of the managed Workday Human Capital Management (HCM) connector in Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
This page contains information about known limitations of the managed Workday Human Capital Management (HCM) connector in Lakeflow Connect.,2026-04-02T22:35:00.000Z,limits-and-quotas,limits-quotas,0.8,True,"Explicitly a limitations page; such pages usually enumerate specific unsupported features, object caps, or behavioral constraints that qualify as expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-hcm-overview,Workday HCM,Workday Human Capital Management (HCM) connector - Azure Databricks,,Overview of the managed Workday HCM ingestion connector in Lakeflow Connect.,"Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. The managed Workday Human Capital Management (HCM) connector in Lakeflow Connect allows you to ingest data from Workday HCM into Azure Databricks. Use this connector to consolidate human resources (HR) data such as workers, positions, payroll, and talent records into Azure Databricks for analysis and downstream processing. This page covers the Workday H",2026-04-02T22:35:00.000Z,concept-article,,0.1,False,"Overview of the Workday HCM connector; summary suggests conceptual description of what data it ingests, not detailed limits, config parameters, or troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-hcm-pipeline,Create ingestion pipeline,Ingest data from Workday HCM - Azure Databricks,,Learn how to create a managed Human Capital Management (HCM) ingestion pipeline using Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
This page shows how to create a managed Workday Human Capital Management (HCM) ingestion pipeline using Lakeflow Connect.,2026-03-24T01:32:00.000Z,how-to,,0.2,False,Pipeline creation tutorial for Workday HCM; likely procedural steps rather than structured configuration reference or error-resolution mappings.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-hcm-pipeline,Create ingestion pipeline,Ingest data from Workday HCM - Azure Databricks,,Learn how to create a managed Human Capital Management (HCM) ingestion pipeline using Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page shows how to create a managed Workday Human Capital Management (HCM) ingestion pipeline using Lakeflow Connect.,2026-04-22T08:00:00.000Z,how-to,,0.3,False,"Tutorial for creating a Workday HCM ingestion pipeline; description suggests procedural guidance rather than configuration matrices, limits, or troubleshooting content.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-hcm-reference,Reference,Workday HCM connector reference - Azure Databricks,Use Workday HCM connector reference in Lakeflow,Reference documentation for the managed Workday Human Capital Management (HCM) connector in Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
This page contains reference documentation for the managed Workday Human Capital Management (HCM) connector in Lakeflow Connect.,2026-03-10T19:51:00.000Z,reference,configuration,0.78,True,"Connector reference documentation usually includes supported objects, fields, and mappings—product-specific configuration/reference information.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-hcm-setup,Set up data source,Configure authentication to Workday HCM - Azure Databricks,Configure Workday HCM authentication for Databricks,Learn how to configure Workday Human Capital Management (HCM) to enable authentication from Azure Databricks.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page shows how to configure Workday Human Capital Management (HCM) to enable authentication from Azure Databricks. You'll use the credentials from this page to create a connection in Azure Databricks.,2026-03-10T19:51:00.000Z,how-to,security,0.76,True,"Auth configuration for Workday HCM will include specific credentials, endpoints, and permission scopes—product-specific security configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-hcm-troubleshoot,Troubleshooting,Troubleshoot the Workday HCM connector - Azure Databricks,Troubleshoot Workday HCM connector in Lakeflow,Troubleshoot common errors with the managed Workday Human Capital Management (HCM) connector in Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
This page shows how to troubleshoot common errors with the managed Workday Human Capital Management (HCM) connector in Lakeflow Connect.,2026-03-10T19:51:00.000Z,troubleshooting,troubleshooting,0.9,True,"Troubleshooting page for a specific connector; expected to map errors and symptoms to causes and fixes, which is expert troubleshooting knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-reports,Workday Reports,Workday Reports connector - Azure Databricks,Configure Workday Reports connector authentication and setup,"Overview of the Workday Reports connector for managed ingestion pipelines, including authentication methods and step-by-step guidance for different user personas.","The managed Workday Reports connector in Lakeflow Connect allows you to ingest data from Workday reports into Azure Databricks. This page covers the Workday Reports connector, which ingests data from Workday custom reports. To ingest data from standard Workday HCM modules, see Workday Human Capital Management (HCM) connector.",2026-04-02T22:35:00.000Z,concept-article,configuration,0.64,True,Overview that explicitly includes authentication methods and step-by-step guidance for different personas; likely documents specific auth modes and required settings unique to this connector.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-reports-faq,FAQ,Workday Reports connector FAQs - Azure Databricks,,Find answers to frequently asked questions about the Workday Reports connector in Databricks Lakeflow Connect.,This page answers frequently asked questions about the Workday Reports connector in Databricks Lakeflow Connect.,2026-04-02T22:35:00.000Z,faq,,0.35,False,"FAQ page; summary does not indicate presence of specific error codes, config tables, or limits—likely general Q&A rather than structured expert troubleshooting or config reference.",unchanged 
https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-reports-limits,Limitations,Workday Reports connector limitations - Azure Databricks,Review limitations for Workday Reports ingestion,Learn about limitations and considerations for ingesting Workday reports using Databricks Lakeflow Connect.,This page lists limitations and considerations for ingesting Workday reports using Databricks Lakeflow Connect.,2026-04-02T22:35:00.000Z,limits-and-quotas,limits-quotas,0.8,True,Explicit limitations and considerations for Workday Reports ingestion; such pages typically list concrete constraints and unsupported scenarios that are product-specific.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-reports-pipeline,Create ingestion pipeline,Ingest Workday reports - Azure Databricks,,Learn how to ingest Workday reports into Azure Databricks using Lakeflow Connect.,Learn how to ingest Workday reports into Azure Databricks using Lakeflow Connect.,2026-03-23T08:00:00.000Z,how-to,,0.25,False,"Tutorial on ingesting Workday reports; primarily a how-to pipeline creation guide, not a configuration reference or troubleshooting catalog.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-reports-pipeline,Create ingestion pipeline,Ingest Workday reports - Azure Databricks,,Learn how to ingest Workday reports into Azure Databricks using Lakeflow Connect.,Learn how to ingest Workday reports into Azure Databricks using Lakeflow Connect.,2026-04-22T08:00:00.000Z,how-to,,0.3,False,"How-to page for ingesting Workday reports; appears to be a basic pipeline creation walkthrough without expert-level limits, config tables, or decision matrices.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-reports-reference,Reference,Workday Reports connector reference - Azure Databricks,Use Workday Reports connector reference mappings,"Access 
reference material for the Workday Reports connector in Databricks Lakeflow Connect. For example, unsupported objects and transformations from Workday data types to Delta-compatible data types.",This page contains reference material for the Workday Reports connector in Databricks Lakeflow Connect.,2026-03-31T23:28:00.000Z,reference,configuration,0.84,True,"Reference material including unsupported objects and data type transformations to Delta; this is detailed, product-specific mapping/configuration information.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-reports-source-setup,Set up data source,Configure Workday reports for ingestion - Azure Databricks,Set up Workday reports for Lakeflow ingestion,Learn how to configure Workday reports for ingestion so you can load data into Azure Databricks using Lakeflow Connect.,This page describes how to configure Workday reports for ingestion.,2026-01-20T08:00:00.000Z,how-to,configuration,0.78,True,"Focused on configuring Workday reports for ingestion; likely includes specific configuration fields, formats, and constraints needed for the connector to work.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-reports-troubleshoot,Troubleshooting,Troubleshoot Workday ingestion - Azure Databricks,Diagnose and fix Databricks Workday connector issues,Learn about common issues with the Workday Reports connector in Databricks Lakeflow Connect and how to resolve them.,This page describes common issues with the Workday Reports connector in Databricks Lakeflow Connect and how to resolve them.,2026-03-16T17:36:00.000Z,troubleshooting,troubleshooting,0.9,True,"Page is explicitly a troubleshooting guide for the Workday Reports connector, likely organized by specific symptoms and including product-specific error messages or behaviors with corresponding resolutions.",unchanged 
https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/zendesk-support-concepts,Concepts,Zendesk Support connector concepts - Azure Databricks,,"Learn about the Zendesk Support connector architecture, data models, and pricing.",Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page describes key concepts for the Zendesk Support connector in Lakeflow Connect.,2026-02-02T19:09:00.000Z,concept-article,,0.3,False,"A 'concepts' page about architecture, data models, and pricing is generally high-level and explanatory. Without evidence of numeric thresholds or decision matrices, it is more conceptual than expert configuration or decision-making guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/zendesk-support-limits,Limitations,Zendesk Support connector limitations - Azure Databricks,Understand Zendesk Support connector limitations,Learn about known limitations of the Zendesk Support connector.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews.,2026-04-02T22:35:00.000Z,concept-article,limits-quotas,0.78,True,"A connector limitations page; typically enumerates specific unsupported features, rate limits, or behavioral constraints that are expert-only details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/zendesk-support-overview,Zendesk Support,Zendesk Support connector overview - Azure Databricks,,Overview of the Zendesk Support connector for managed ingestion pipelines.,"Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews.
The managed Zendesk Support connector in Lakeflow Connect allows you to ingest data from Zendesk Support into Azure Databricks, including ticket data, knowledge base content, and community forum data.",2026-04-02T22:35:00.000Z,concept-article,,0.1,False,"Zendesk Support connector overview; summary indicates high-level description of ingested data types, not detailed limits, config parameters, or troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/zendesk-support-pipeline,Create ingestion pipeline,Ingest data from Zendesk Support - Azure Databricks,,Learn how to create a managed Zendesk Support ingestion pipeline using Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page shows how to create a managed Zendesk Support ingestion pipeline using Lakeflow Connect.,2026-03-24T01:32:00.000Z,how-to,,0.2,False,How-to guide for creating a Zendesk Support ingestion pipeline; likely procedural without structured configuration or error-resolution tables.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/zendesk-support-pipeline,Create ingestion pipeline,Ingest data from Zendesk Support - Azure Databricks,,Learn how to create a managed Zendesk Support ingestion pipeline using Databricks Lakeflow Connect.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews.
This page shows how to create a managed Zendesk Support ingestion pipeline using Lakeflow Connect.,2026-04-22T08:00:00.000Z,how-to,,0.3,False,"Zendesk Support ingestion pipeline tutorial; likely focuses on steps to set up a managed connector, not on detailed configuration parameters, quotas, or troubleshooting mappings.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/zendesk-support-reference,Reference,Zendesk Support connector reference - Azure Databricks,Use Zendesk Support connector technical reference in Databricks,"Technical reference for the Zendesk Support connector, including supported tables and data models.",Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page provides technical reference information for the Zendesk Support connector.,2026-02-02T19:09:00.000Z,concept-article,configuration,0.78,True,"A technical reference listing supported tables and data models is detailed, product-specific schema/configuration information (table names, fields, types) that aligns with configuration expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/zendesk-support-source-setup,Set up data source,Configure Zendesk Support for OAuth - Azure Databricks,Configure OAuth security for Zendesk Support connector,Learn how to configure Zendesk Support to enable authentication from Azure Databricks.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Learn how to configure Zendesk Support to enable authentication from Azure Databricks.,2026-02-02T19:09:00.000Z,concept-article,security,0.7,True,"Configuring OAuth for a specific SaaS integration typically involves concrete auth endpoints, scopes, redirect URIs, and app settings.
These are product-specific authentication parameters that match the security sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/zendesk-support-troubleshoot,Troubleshooting,Troubleshoot the Zendesk Support connector - Azure Databricks,Troubleshoot Databricks Zendesk Support connector errors,Troubleshoot common issues with the Zendesk Support connector.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. This page describes common errors when using the Zendesk Support connector.,2026-02-02T19:09:00.000Z,concept-article,troubleshooting,0.9,True,"Described as common errors and how to resolve them. Such content usually maps specific error messages/codes and symptoms to causes and fixes, which is product-specific troubleshooting knowledge.",unchanged @@ -1233,54 +1255,54 @@ https://learn.microsoft.com/en-us/azure/databricks/ingestion/opentelemetry/,Open https://learn.microsoft.com/en-us/azure/databricks/ingestion/opentelemetry/configure,Configure your OTLP client,Configure OpenTelemetry (OTLP) clients to send data to Unity Catalog - Azure Databricks,Configure OTLP clients for Unity Catalog ingestion,"Configure OpenTelemetry SDKs and the OpenTelemetry Collector to send traces, logs, and metrics to Zerobus Ingest.","Important This feature is in Beta. Zerobus Ingest includes an OpenTelemetry Protocol (OTLP) endpoint. You can push traces, logs, and metrics directly into Unity Catalog Delta tables using standard OpenTelemetry SDKs and collectors, without custom libraries.
This page covers retrieving your endpoint, creating target tables, configuring a service principal, and sending your first telemetry data.",2026-04-09T17:57:00.000Z,how-to,configuration,0.8,True,"Configuration-focused article for OpenTelemetry SDKs and Collector; expected to list concrete config keys, endpoint URLs, auth settings, and example config blocks specific to Zerobus/Unity Catalog.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/opentelemetry/queries,Query examples,Query OpenTelemetry data - Azure Databricks,"Query OpenTelemetry traces, logs, and metrics in Databricks","Example SQL queries for OpenTelemetry traces, logs, and metrics ingested into Delta tables by Zerobus Ingest.","Important This feature is in Beta. This page provides example SQL queries for OpenTelemetry data ingested into Delta tables by Zerobus Ingest OTLP. For table schemas and column details, see OpenTelemetry table reference for Zerobus Ingest. In the examples below, replace..with your catalog, schema, and table name prefix. Columns such as attributes, resource.attributes, instrumentation_scope.attributes, and body (logs) are stored as VARIANT. Use the :key::type syntax to extract value",2026-04-09T17:57:00.000Z,how-to,integrations,0.65,True,"Provides concrete SQL query examples against Zerobus OTLP tables, including use of VARIANT columns and key extraction syntax; these are product-specific query patterns and usage examples.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/opentelemetry/table-reference,Table reference,OpenTelemetry table reference for Zerobus Ingest - Azure Databricks,OpenTelemetry table schemas for Zerobus Ingest,"Reference for OpenTelemetry (OTLP) table schemas, gRPC service paths, and data mapping in Zerobus Ingest.",Important This feature is in Beta.
This page provides reference information for the OpenTelemetry (OTLP) table schemas and data mapping used by Zerobus Ingest OTLP.,2026-04-09T17:57:00.000Z,reference,configuration,0.85,True,"Reference for OTLP table schemas and mappings; contains detailed column names, types, and mapping rules that are product-specific configuration/reference data.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/overview,Overview,What is Lakeflow Connect? - Azure Databricks,,"Learn about Databricks Lakeflow Connect, which offers efficient connectors to ingest data from enterprise applications, databases, cloud storage, local files, and more.","Lakeflow Connect offers simple and efficient connectors to ingest data from local files, popular enterprise applications, databases, cloud storage, message buses, and more. This page outlines some of the ways that Lakeflow Connect can improve ETL performance. It also covers common use cases and the range of supported ingestion tools, from fully-managed connectors to fully-customizable frameworks.",2026-03-31T08:00:00.000Z,concept-article,,0.2,False,"High-level overview of Lakeflow Connect capabilities and use cases; summary does not show detailed configuration parameters, limits, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/sftp,SFTP,Ingest files from SFTP servers - Azure Databricks,Ingest data from SFTP using Lakeflow Connect,Learn how to ingest files from SFTP servers using Auto Loader in Lakeflow Connect.,"Important The SFTP connector is in Public Preview. Learn how to ingest files from SFTP servers using Lakeflow Connect.
The SFTP connector extends Auto Loader functionality to provide secure, incremental ingestion from SFTP servers with Unity Catalog governance.",2026-04-13T08:00:00.000Z,how-to,integrations,0.7,True,"SFTP connector usage in Lakeflow Connect is an integration pattern; such pages usually include connector-specific options, connection parameters, and constraints unique to this product.",updated -https://learn.microsoft.com/en-us/azure/databricks/ingestion/sharepoint,SharePoint,Ingest files from SharePoint - Azure Databricks,Ingest SharePoint files into Delta tables,"Learn how to ingest structured, semi-structured, and unstructured files from SharePoint into Delta tables using Auto Loader, `spark.read`, or `COPY INTO`.","Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. :::note Compliance The SharePoint connector supports use in workspaces with the Configure enhanced security and compliance settings enabled. ::: You can ingest structured, semi-structured, and unstructured files from Microsoft SharePoint into Delta tables. The SharePoint connector supports incremental ingestion of SharePoint files using batch and streamin",2026-04-07T08:00:00.000Z,how-to,integrations,0.7,True,"SharePoint connector article; likely includes Databricks-specific connector configuration, supported modes (batch/streaming), and parameter usage for Auto Loader, spark.read, and COPY INTO.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ingestion/overview,Overview,What is Lakeflow Connect? - Azure Databricks,,"Learn about Databricks Lakeflow Connect, which offers efficient connectors to ingest data from enterprise applications, databases, cloud storage, local files, and more.","Lakeflow Connect offers simple and efficient connectors to ingest data from local files, popular enterprise applications, databases, cloud storage, message buses, and more.
This page outlines some of the ways that Lakeflow Connect can improve ETL performance. It also covers common use cases and the range of supported ingestion tools, from fully-managed connectors to fully-customizable frameworks.",2026-04-21T22:28:00.000Z,concept-article,,0.2,False,"High-level overview of Lakeflow Connect capabilities and use cases; no indication of detailed limits, configuration parameters, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/databricks/ingestion/sftp,SFTP,Ingest files from SFTP servers - Azure Databricks,Ingest SFTP server data using Lakeflow Connect Auto Loader,Learn how to ingest files from SFTP servers using Auto Loader in Lakeflow Connect.,"Important The SFTP connector is in Public Preview. Learn how to ingest files from SFTP servers using Lakeflow Connect. The SFTP connector extends Auto Loader functionality to provide secure, incremental ingestion from SFTP servers with Unity Catalog governance.",2026-04-23T08:00:00.000Z,how-to,integrations,0.75,True,"Describes the SFTP connector that extends Auto Loader, including secure, incremental ingestion and Unity Catalog governance; likely documents connector options and behaviors that are specific integration knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/ingestion/sharepoint,SharePoint,Ingest files from SharePoint - Azure Databricks,Ingest SharePoint files into Delta tables with Databricks,"Learn how to ingest structured, semi-structured, and unstructured files from SharePoint into Delta tables using Auto Loader, `spark.read`, or `COPY INTO`.","Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. :::note Compliance The SharePoint connector supports use in workspaces with the Configure enhanced security and compliance settings enabled. ::: You can ingest structured, semi-structured, and unstructured files from Microsoft SharePoint into Delta tables.
The SharePoint connector supports incremental ingestion of SharePoint files using batch and streamin",2026-04-24T18:05:00.000Z,how-to,integrations,0.75,True,"Documents the SharePoint connector for incremental ingestion into Delta using Auto Loader, spark.read, and COPY INTO, with Databricks-specific options and behaviors; this is concrete integration guidance.",updated https://learn.microsoft.com/en-us/azure/databricks/ingestion/variant,Ingest data as semi-structured variant type,Ingest data as semi-structured variant type - Azure Databricks,Ingest data as VARIANT type using Auto Loader and COPY INTO,Learn about using the VARIANT type for ingesting semi-structured data.,"Important This feature is in Public Preview. In Databricks Runtime 15.3 and above, you can use the VARIANT type to ingest semi-structured data. This article describes behavior and provides example patterns for ingesting data from cloud object storage using Auto Loader and COPY INTO, streaming records from Kafka, and SQL commands for creating new tables with variant data or inserting new records using the variant type.
The following table summarizes the supported file formats and Databricks Runtime v",2026-02-09T20:20:00.000Z,concept-article,configuration,0.75,True,"Describes supported file formats, runtime versions, and concrete ingestion patterns/commands for VARIANT; includes tables of supported combinations and options.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/zerobus-errors,Error handling,Zerobus Ingest Error Handling - Azure Databricks,Handle Zerobus Ingest errors and retries,Learn how to use the Zerobus Ingest connector in Lakeflow Connect.,,2026-02-23T22:28:00.000Z,how-to,troubleshooting,0.7,True,"Error handling page will map Zerobus-specific error codes/messages to causes and recovery steps, which is product-specific troubleshooting knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ingestion/zerobus-ingest,Write a client,Use the Zerobus Ingest connector - Azure Databricks,Ingest data with Zerobus Ingest connector,Learn how to use the Zerobus Ingest connector in Lakeflow Connect.,This page describes how to ingest data using the Zerobus Ingest connector in Lakeflow Connect.,2026-04-15T08:00:00.000Z,how-to,integrations,0.7,True,"Connector-specific ingestion page for Zerobus Ingest in Lakeflow Connect likely documents connector parameters, configuration, and usage patterns unique to this integration.",updated +https://learn.microsoft.com/en-us/azure/databricks/ingestion/zerobus-ingest,Write a client,Use the Zerobus Ingest connector - Azure Databricks,Ingest data with Zerobus Ingest connector,Learn how to use the Zerobus Ingest connector in Lakeflow Connect.,This page describes how to ingest data using the Zerobus Ingest connector in Lakeflow Connect.,2026-04-15T08:00:00.000Z,how-to,integrations,0.7,True,"Connector-specific ingestion page for Zerobus Ingest in Lakeflow Connect likely documents connector parameters, configuration, and usage patterns unique to this integration.",unchanged 
https://learn.microsoft.com/en-us/azure/databricks/ingestion/zerobus-limits,Limitations,Zerobus Ingest connector limitations - Azure Databricks,Zerobus Ingest connector limits and constraints,Learn about limitations when using the Zerobus Ingest connector in Lakeflow Connect.,This page lists limitations when using the Zerobus Ingest connector in Lakeflow Connect.,2026-04-08T22:17:00.000Z,limits-and-quotas,limits-quotas,0.85,True,"Explicitly a limitations page; such pages typically list concrete numeric limits (throughput, payload size, connection counts) and unsupported scenarios that qualify as expert limits/quotas knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ingestion/zerobus-overview,Overview,Zerobus Ingest connector overview - Azure Databricks,,Learn about the Zerobus Ingest connector in Lakeflow Connect.,"Zerobus Ingest is a push-based ingestion API that writes data directly into Unity Catalog Delta tables. It is a serverless connector that automatically scales to handle incoming connections. It does not require configuring partitions or managing brokers. With Zerobus Ingest, your ""scaling strategy"" is to open more connections. This streamlines ingestion workflows by eliminating the need for message bus infrastructure. Any application that can integrate with Zerobus Ingest SDKs or communicate via",2026-04-09T17:57:00.000Z,concept-article,,0.3,False,"Zerobus Ingest overview; description suggests conceptual and marketing-style explanation of what the connector is and why to use it, without detailed limits, configs, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/init-scripts/,Init scripts overview,What are init scripts? 
- Azure Databricks,Use Databricks init scripts for cluster customization,"Learn how to use initialization (init) scripts to install packages and libraries, set system properties and environment variables, modify Apache Spark config parameters, and set other configurations o",An init script (initialization script) is a shell script that runs during startup of each cluster node before the Apache Spark driver or executor JVM starts. This article provides recommendations for init scripts and configuration information if you must use them.,2026-03-31T08:00:00.000Z,concept-article,best-practices,0.7,True,Defines init scripts plus recommendations and configuration information for using them; these are Databricks-specific patterns and gotchas for environment setup.,unchanged https://learn.microsoft.com/en-us/azure/databricks/init-scripts/cluster-scoped,Cluster-scoped init scripts,Cluster-scoped init scripts - Azure Databricks,Configure cluster-scoped init scripts in Databricks,Learn how to use cluster-scoped init scripts to configure compute environments for clusters.,"Cluster-scoped init scripts are init scripts defined in a cluster configuration. Cluster-scoped init scripts apply to both clusters you create and those created to run jobs. You can configure cluster-scoped init scripts using the UI, the CLI, and by invoking the Clusters API. This section focuses on performing these tasks using the UI. For the other methods, see the Databricks CLI and the Clusters API.
You can add any number of scripts, and the scripts are executed sequentially in the order provide",2025-02-18T19:07:00.000Z,concept-article,configuration,0.8,True,Describes defining cluster-scoped init scripts via UI/CLI/API and execution order; product-specific configuration details.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/init-scripts/environment-variables,Set and use environment variables with init scripts,Set and use environment variables with init scripts - Azure Databricks,Configure environment variables in Databricks init scripts,Learn about how init script events are logged on Azure Databricks.,Init scripts have access to all environment variables present on a cluster.,2026-04-15T22:08:00.000Z,concept-article,configuration,0.7,True,"The page describes product-specific behavior of environment variables in Azure Databricks init scripts, including how they are exposed and used during cluster initialization. This is configuration-focused, with details unique to Databricks init scripts rather than generic environment variable usage.",updated +https://learn.microsoft.com/en-us/azure/databricks/init-scripts/environment-variables,Set and use environment variables with init scripts,Set and use environment variables with init scripts - Azure Databricks,Configure environment variables in Databricks init scripts,Learn about how init script events are logged on Azure Databricks.,Init scripts have access to all environment variables present on a cluster.,2026-04-15T22:08:00.000Z,concept-article,configuration,0.7,True,"The page describes product-specific behavior of environment variables in Azure Databricks init scripts, including how they are exposed and used during cluster initialization. 
This is configuration-focused, with details unique to Databricks init scripts rather than generic environment variable usage.",unchanged https://learn.microsoft.com/en-us/azure/databricks/init-scripts/global,Global init scripts,Global init scripts - Azure Databricks,Configure global init scripts across Databricks workspace,Learn how to use global init scripts to configure compute environments for all clusters in a workspace.,"Important Databricks recommends configuring all init scripts as cluster-scoped init scripts and managing them across your workspace using cluster policies. See Cluster-scoped init scripts. A global init script runs on all clusters in your workspace configured with dedicated (formerly single user) or legacy no-isolation shared access mode. Only workspace admins can create global init scripts. You can create them using either the UI or REST API. Important Before using global init scripts, consider ",2026-02-04T00:12:00.000Z,concept-article,configuration,0.75,True,"Explains global init scripts, access mode support, admin-only creation, and cautions; product-specific configuration and behavior.",unchanged https://learn.microsoft.com/en-us/azure/databricks/init-scripts/logs,Init script logging,Init script logging - Azure Databricks,Use logging to troubleshoot Databricks init scripts,Learn about how init script events are logged on Azure Databricks.,"Init script start and finish events are captured in cluster event logs. Details are captured in cluster logs. Global init script create, edit, and delete events are also captured in account-level diagnostic logs.",2024-05-03T08:00:00.000Z,concept-article,troubleshooting,0.65,True,Explains where init script start/finish events and global script changes are logged; product-specific diagnostic locations for troubleshooting.,unchanged https://learn.microsoft.com/en-us/azure/databricks/init-scripts/referencing-files,What files can I reference in an init script?
- Azure Databricks,Reference external files safely in Databricks init scripts,Learn about support for referencing other files in init scripts stored in different locations on Azure Databricks.,The support for referencing other files in an init script depends on where the referenced files are stored. This article outlines this behavior and provides recommendations. Databricks recommends managing all init scripts as cluster-scoped init scripts.,2025-02-18T19:07:00.000Z,concept-article,best-practices,0.7,True,Outlines supported file locations for init scripts and provides recommendations; product-specific best practices and constraints.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/integrations/,Technology partner overview,Technology partners - Azure Databricks,,Learn how you can connect technology partners to your Azure Databricks workspace so you can use third-party tools with your Databricks lakehouse data.,"Azure Databricks has validated integrations with various third-party solutions that allow you to work with data through Azure Databricks clusters and SQL warehouses, in many cases with low-code and no-code experiences. These solutions enable common scenarios such as data ingestion, data preparation and transformation, business intelligence (BI), and machine learning. 
Databricks also includes Partner Connect, a user interface that allows some of these validated solutions to integrate more quickly ",2026-01-20T08:00:00.000Z,partner-tools,,0.3,False,High-level overview of technology partner integrations and Partner Connect; primarily marketing/overview without detailed configuration tables or numeric constraints.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/integrations/,Technology partner overview,Technology partners - Azure Databricks,,Learn how you can connect technology partners to your Azure Databricks workspace so you can use third-party tools with your Databricks lakehouse data.,"Azure Databricks has validated integrations with various third-party solutions that allow you to work with data through Azure Databricks clusters and SQL warehouses, in many cases with low-code and no-code experiences. These solutions enable common scenarios such as data ingestion, data preparation and transformation, business intelligence (BI), and machine learning. Databricks also includes Partner Connect, a user interface that allows some of these validated solutions to integrate more quickly ",2026-04-20T22:01:00.000Z,partner-tools,,0.1,False,"High-level overview of technology partner integrations and Partner Connect; appears to be conceptual/marketing and navigational content without detailed parameters, limits, or error codes.",updated https://learn.microsoft.com/en-us/azure/databricks/integrations/configuration,Introduction to Databricks sign-on from partner solutions,Azure Databricks sign-on from partner solutions - Azure Databricks,Configure OAuth sign-on from BI partner tools to Databricks,Discover configurations that enable Azure Databricks sign-on from partner solutions.,"Azure Databricks supports integrations with your favorite BI tools, including Power BI and Tableau. Some of these partner applications are enabled in your account by default as published OAuth application integrations.
OAuth application integrations allow users to access Azure Databricks from partner applications using single sign-on (SSO). Account admins can disable published partner OAuth applications for their account. See Disable dbt Core, Power BI, Tableau Desktop, or Tableau Cloud OAuth appl",2024-04-10T20:51:00.000Z,partner-tools,security,0.7,True,Describes OAuth application integration behavior and configuration for partner SSO into Databricks.,unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/configure-oauth-dbt,Configure Databricks sign-on from dbt Core with Microsoft Entra ID,Configure Azure Databricks sign-on from dbt Core with Microsoft Entra ID - Azure Databricks,Configure Entra ID SSO from dbt Core to Azure Databricks,Learn how to configure Azure Databricks sign-on from dbt Core with Microsoft Entra ID.,"Important This feature is in Public Preview. This page describes how to configure Azure Databricks sign-on from dbt Core with Microsoft Entra ID. After you complete this one-time configuration as an Azure Databricks account admin, users can connect Azure Databricks to dbt Core using single sign-on (SSO). In addition to using Microsoft Entra ID, you can use Databricks M2M OAuth to integrate with dbt Core. See Enable or disable partner OAuth applications.",2026-01-20T08:00:00.000Z,how-to,security,0.7,True,"Configuration-focused page for enabling SSO/OAuth between dbt Core and Azure Databricks using Microsoft Entra ID.
Likely includes specific OAuth scopes, redirect URIs, app registration settings, and Databricks-side security configuration that are product-specific and not generic tutorial content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/configure-oauth-tableau,Configure Databricks sign-on from Tableau Server,Configure Azure Databricks sign-on from Tableau Server - Azure Databricks,Configure OAuth SSO from Tableau Server to Databricks,Learn how to configure Azure Databricks sign-on from Tableau Server.,"This page describes how to configure Azure Databricks sign-on from Tableau Server. After you complete this one-time configuration as an Azure Databricks account admin, users can connect from Tableau Server using SSO authentication. The steps in this article aren't needed for Tableau Desktop and Tableau Cloud, which are enabled as OAuth applications in your Azure Databricks account by default. This page is specific to custom Tableau Server OAuth application creation. For generic custom OAuth appl",2026-01-30T19:03:00.000Z,how-to,integrations,0.76,True,"Configuration guide for a specific Tableau Server ↔ Azure Databricks OAuth integration, likely includes app registration fields, redirect URLs, and Databricks-specific parameters that are product- and partner-specific.",unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/dbt-core-tutorial,dbt Core tutorial,"Tutorial: Create, run, and test dbt models locally - Azure Databricks",,"Learn how to connect your Azure Databricks workspace to dbt Core, an open-source command line tool that enables data teams to transform data.","This tutorial walks you through how to create, run, and test dbt models locally. You can also run dbt projects as Azure Databricks job tasks. 
For more information, see Use dbt transformations in Lakeflow Jobs.",2025-06-11T12:07:00.000Z,tutorial,,0.48,False,Tutorial on creating/running dbt models; primarily workflow and modeling steps rather than configuration matrices or troubleshooting/error code mappings.,unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/enable-disable-oauth,Enable or disable partner OAuth applications,Enable or disable partner OAuth applications - Azure Databricks,Enable or disable partner OAuth apps in Databricks,Learn how to enable or disable OAuth applications that allow you to use SSO to authenticate to Azure Databricks from partner solutions.,"This article describes how to enable and disable partner OAuth applications for your Azure Databricks account. dbt Core, Power BI, Tableau Desktop, and Tableau Cloud OAuth applications are enabled by default for your account. Note Updates to OAuth applications can take 30 minutes to process.",2026-01-29T08:00:00.000Z,how-to,security,0.78,True,Provides concrete steps and behavior (including 30-minute propagation) for managing OAuth app access to Databricks.,unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/excel,Excel,Connect to Azure Databricks from Microsoft Excel - Azure Databricks,,"Learn about the Databricks Excel Add-in, which connects your Databricks workspace to Microsoft Excel so you can browse, import, and query Lakehouse data in your spreadsheets.","Important This feature is in Public Preview. The Azure Databricks Excel Add-in connects your Azure Databricks workspace to Microsoft Excel, bringing governed Lakehouse data directly into your spreadsheets to help you move from data to decisions faster. You can browse and import Azure Databricks tables through an intuitive interface where no SQL knowledge is required. While the add-in offers the flexibility to execute custom SQL queries, it is optional. 
The add-in is fully supported across Excel f",2026-03-02T19:05:00.000Z,overview,,0.45,False,Feature overview of the Excel Add-in; summary suggests conceptual description rather than detailed configuration parameters or limits.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/integrations/excel-query,Import and query data in Excel,Import and query data using the Azure Databricks Excel Add-in - Azure Databricks,,"Learn how to use the Databricks Excel Add-in in Microsoft Excel to analyze data with tables, queries, and custom functions.","Important This feature is in Public Preview. The Azure Databricks Excel Add-in connects your Azure Databricks workspace to Microsoft Excel, bringing governed Lakehouse data directly into your spreadsheets to help you move from data to decisions faster. This page describes how to use the Azure Databricks Excel Add-in to import and analyze data from Azure Databricks in Excel. You can browse and import Azure Databricks tables through an intuitive interface where no SQL knowledge is required. While t",2026-04-13T21:52:00.000Z,how-to,,0.4,False,"Covers how to use the Azure Databricks Excel Add-in to import and query data; likely a usage tutorial focused on UI flows and basic querying rather than configuration parameters, limits, or error-code-based troubleshooting.",updated -https://learn.microsoft.com/en-us/azure/databricks/integrations/excel-setup,Install the Excel Add-in,Set up the Azure Databricks Excel Add-in - Azure Databricks,,"Learn how to install the Databricks Excel Add-in and connect to your Databricks workspace using SSO authentication in Excel for the web, Windows, or macOS.","Important This feature is in Public Preview. The Azure Databricks Excel Add-in connects your Azure Databricks workspace to Microsoft Excel, bringing governed Lakehouse data directly into your spreadsheets to help you move from data to decisions faster. 
The add-in is fully supported across Excel for the web and desktop versions for both Windows and macOS. This page describes the two methods available to set up the Azure Databricks Excel Add-in in Microsoft Excel. This add-in uses single sign-on (S",2026-04-14T08:00:00.000Z,install-set-up-deploy,,0.4,False,"Describes how to install and set up the Azure Databricks Excel Add-in and connect via SSO; appears to be a step-by-step setup/tutorial without detailed parameter tables, limits, or specialized troubleshooting mappings.",updated -https://learn.microsoft.com/en-us/azure/databricks/integrations/google-sheets/,Google Sheets,Databricks Connector for Google Sheets - Azure Databricks,Use Databricks Connector to access data from Google Sheets,Learn how to use the Databricks Connector for Google Sheets to connect your Azure Databricks workspace with Google Sheets.,"The Databricks Connector for Google Sheets connects your Azure Databricks workspace with Google Sheets. Use the connector to query Azure Databricks data from within Google Sheets, enabling further analysis.",2026-04-17T18:03:00.000Z,integration,integrations,0.7,True,"Connector overview for Google Sheets and Databricks; connector docs generally include specific connection options and parameters (workspace URL, auth method, warehouse selection) that are product-specific integration details.",updated -https://learn.microsoft.com/en-us/azure/databricks/integrations/google-sheets/connect,Connect to a workspace,Set up the Databricks Connector for Google Sheets - Azure Databricks,Set up Databricks Connector for Google Sheets,Learn how to connect the Databricks Connector for Google Sheets to your Azure Databricks workspace and select a SQL warehouse.,"To use the Databricks Connector for Google Sheets, you must first connect to your Azure Databricks workspace and select a SQL warehouse. 
This page describes how to set up the initial connection and how to switch to a different workspace.",2026-04-16T08:00:00.000Z,integration,integrations,0.8,True,"Explicitly about setting up the connector and selecting a SQL warehouse; this setup typically involves concrete configuration fields and parameter values (e.g., host, HTTP path, token), which are integration-specific settings.",updated -https://learn.microsoft.com/en-us/azure/databricks/integrations/google-sheets/query-data,Query data,Query Azure Databricks data in Google Sheets - Azure Databricks,Query Databricks data from Google Sheets with SQL,"Learn how to query Azure Databricks data in Google Sheets by selecting tables, writing SQL queries, adding parameters, and creating pivot tables.","This page describes how to query data from your Azure Databricks workspace and import it into Google Sheets using the Databricks Connector for Google Sheets. You can select tables directly, write SQL queries, add parameters, and create pivot tables. The connector automatically saves all queries as imports so you can refresh results and reuse existing queries.",2026-04-16T08:00:00.000Z,integration,integrations,0.65,True,"Describes selecting tables, writing SQL, adding parameters, and how queries are saved and refreshed via the connector; involves product-specific behavior and options of the Databricks–Sheets connector, fitting integrations.",updated -https://learn.microsoft.com/en-us/azure/databricks/integrations/google-sheets/schedule-refresh,Schedule data refreshes,Schedule data refreshes in Google Sheets - Azure Databricks,Configure scheduled data refreshes in Google Sheets connector,Learn how to schedule automatic data refreshes in the Databricks Connector for Google Sheets to keep imported data up to date.,This page describes how to schedule automatic data refreshes in the Databricks Connector for Google Sheets to keep your imported data up to date. 
Scheduled refreshes run your saved queries on a recurring basis and update the corresponding sheets.,2026-04-16T08:00:00.000Z,integration,configuration,0.65,True,"Covers scheduling automatic refreshes for saved queries; likely documents specific scheduling options/fields (intervals, triggers) within the Databricks Connector for Google Sheets, which are configuration details for this integration feature.",updated +https://learn.microsoft.com/en-us/azure/databricks/integrations/excel-query,Import and query data in Excel,Import and query data using the Azure Databricks Excel Add-in - Azure Databricks,,"Learn how to use the Databricks Excel Add-in in Microsoft Excel to analyze data with tables, queries, and custom functions.","Important This feature is in Public Preview. The Azure Databricks Excel Add-in connects your Azure Databricks workspace to Microsoft Excel, bringing governed Lakehouse data directly into your spreadsheets to help you move from data to decisions faster. This page describes how to use the Azure Databricks Excel Add-in to import and analyze data from Azure Databricks in Excel. You can browse and import Azure Databricks tables through an intuitive interface where no SQL knowledge is required. While t",2026-04-13T21:52:00.000Z,how-to,,0.4,False,"Covers how to use the Azure Databricks Excel Add-in to import and query data; likely a usage tutorial focused on UI flows and basic querying rather than configuration parameters, limits, or error-code-based troubleshooting.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/integrations/excel-setup,Install the Excel Add-in,Set up the Azure Databricks Excel Add-in - Azure Databricks,,"Learn how to install the Databricks Excel Add-in and connect to your Databricks workspace using SSO authentication in Excel for the web, Windows, or macOS.","Important This feature is in Public Preview. 
The Azure Databricks Excel Add-in connects your Azure Databricks workspace to Microsoft Excel, bringing governed Lakehouse data directly into your spreadsheets to help you move from data to decisions faster. The add-in is fully supported across Excel for the web and desktop versions for both Windows and macOS. This page describes the two methods available to set up the Azure Databricks Excel Add-in in Microsoft Excel. This add-in uses single sign-on (S",2026-04-23T08:00:00.000Z,install-set-up-deploy,,0.4,False,"Setup guide for the Azure Databricks Excel Add-in focused on installation and SSO connection. Based on the summary, it reads as a step-by-step tutorial rather than a parameter reference; it likely lacks detailed configuration tables, limits, or error-code-based troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/databricks/integrations/google-sheets/,Google Sheets,Databricks Connector for Google Sheets - Azure Databricks,Use Databricks Connector to access data from Google Sheets,Learn how to use the Databricks Connector for Google Sheets to connect your Azure Databricks workspace with Google Sheets.,"The Databricks Connector for Google Sheets connects your Azure Databricks workspace with Google Sheets. 
Use the connector to query Azure Databricks data from within Google Sheets, enabling further analysis.",2026-04-17T18:03:00.000Z,integration,integrations,0.7,True,"Connector overview for Google Sheets and Databricks; connector docs generally include specific connection options and parameters (workspace URL, auth method, warehouse selection) that are product-specific integration details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/integrations/google-sheets/connect,Connect to a workspace,Set up the Databricks Connector for Google Sheets - Azure Databricks,Set up Databricks Connector for Google Sheets,Learn how to connect the Databricks Connector for Google Sheets to your Azure Databricks workspace and select a SQL warehouse.,"To use the Databricks Connector for Google Sheets, you must first connect to your Azure Databricks workspace and select a SQL warehouse. This page describes how to set up the initial connection and how to switch to a different workspace.",2026-04-16T08:00:00.000Z,integration,integrations,0.8,True,"Explicitly about setting up the connector and selecting a SQL warehouse; this setup typically involves concrete configuration fields and parameter values (e.g., host, HTTP path, token), which are integration-specific settings.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/integrations/google-sheets/query-data,Query data,Query Azure Databricks data in Google Sheets - Azure Databricks,Query Azure Databricks data using Google Sheets connector,"Learn how to query Azure Databricks data in Google Sheets by selecting tables, writing SQL queries, adding parameters, and creating pivot tables.","This page describes how to query data from your Azure Databricks workspace and import it into Google Sheets using the Databricks Connector for Google Sheets. You can select tables directly, write SQL queries, add parameters, and create pivot tables. 
The connector automatically saves all queries as imports so you can refresh results and reuse existing queries.",2026-04-20T08:00:00.000Z,integration,integrations,0.7,True,"Covers a specific integration (Databricks Connector for Google Sheets) with concrete behavior: selecting tables, writing SQL, parameters, pivot tables, and automatic saving/refresh of imports. This is product-specific integration knowledge about how the connector works and how queries are handled, beyond generic Sheets or Databricks usage.",updated +https://learn.microsoft.com/en-us/azure/databricks/integrations/google-sheets/schedule-refresh,Schedule data refreshes,Schedule data refreshes in Google Sheets - Azure Databricks,Configure scheduled data refreshes in Google Sheets connector,Learn how to schedule automatic data refreshes in the Databricks Connector for Google Sheets to keep imported data up to date.,This page describes how to schedule automatic data refreshes in the Databricks Connector for Google Sheets to keep your imported data up to date. Scheduled refreshes run your saved queries on a recurring basis and update the corresponding sheets.,2026-04-16T08:00:00.000Z,integration,configuration,0.65,True,"Covers scheduling automatic refreshes for saved queries; likely documents specific scheduling options/fields (intervals, triggers) within the Databricks Connector for Google Sheets, which are configuration details for this integration feature.",unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/graphframes/,GraphFrames overview,How to use GraphFrames on Azure Databricks - Azure Databricks,,Example notebooks for GraphFrames on Azure Databricks. GraphFrames is a package for Apache Spark that provides DataFrame-based graphs.,"This article includes example notebooks to help you get started using GraphFrames on Azure Databricks. GraphFrames is a package for Apache Spark that provides DataFrame-based graphs. It provides high-level APIs in Java, Python, and Scala. 
It aims to provide both the functionality of GraphX and extended functionality taking advantage of Spark DataFrames. This extended functionality includes motif finding, DataFrame-based serialization, and highly expressive graph queries. This article includes three",2025-12-29T08:00:00.000Z,concept-article,,0.3,False,"Primarily links to example notebooks and gives a high-level description of GraphFrames; lacks detailed configuration tables, limits, or product-specific troubleshooting or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/graphframes/user-guide-scala,GraphFrames - Scala,GraphFrames user guide - Scala - Azure Databricks,Use GraphFrames with Scala on Azure Databricks,Learn how to use GraphFrames to do graph analysis using Scala in Azure Databricks.,This article demonstrates examples from the GraphFrames User Guide.,2025-10-06T08:00:00.000Z,concept-article,integrations,0.6,True,"Demonstrates concrete Scala usage of GraphFrames on Databricks with API calls and patterns specific to this integration, which qualifies as a coding/integration pattern rather than generic graph theory content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-odbc-bi,ODBC and JDBC Drivers,Databricks ODBC and JDBC Drivers - Azure Databricks,Choose and start with Databricks ODBC and JDBC drivers,Learn how to get started with the Databricks ODBC and JDBC Drivers to connect to Azure Databricks.,"Azure Databricks provides an ODBC driver and a JDBC driver to connect your tools or clients to Azure Databricks. 
For tool or client specific connection instructions, see Technology partners or your tool's or client's documentation.",2026-01-20T08:00:00.000Z,reference,decision-making,0.62,True,"Entry page for Databricks ODBC/JDBC drivers that helps users decide which driver to use and where to go for tool-specific instructions; while light on numbers, it is product-specific selection guidance between driver options.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/,Overview,Databricks JDBC Driver - Azure Databricks,,"Learn how to get started with the open source Databricks JDBC Driver, which enables you to connect participating apps, tools, and SDKs to Azure Databricks through JDBC.","Note The Databricks JDBC Driver source code is publicly available under the Apache 2.0 license. This reflects Databricks' commitment to transparency, collaboration, and the power of community-driven development. Contributions from developers, users, and the community are welcome. To get started, see the Contribution Guidelines. The Databricks JDBC Driver enables you to connect tools such as DataGrip, DBeaver, and SQL Workbench/J to Azure Databricks through Java Database Connectivity (JDBC), an industry-",2026-04-14T21:45:00.000Z,overview,,0.3,False,"High-level overview of the open source Databricks JDBC driver and supported tools; no detailed property tables, config parameters, limits, or error-code-based troubleshooting content are evident from the summary.",updated +https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/,Overview,Databricks JDBC Driver - Azure Databricks,,"Learn how to get started with the open source Databricks JDBC Driver, which enables you to connect participating apps, tools, and SDKs to Azure Databricks through JDBC.","Note The Databricks JDBC Driver source code is publicly available under the Apache 2.0 license. This reflects Databricks' commitment to transparency, collaboration, and the power of community-driven development. 
Contributions from developers, users, and the community are welcome. To get started, see the Contribution Guidelines. The Databricks JDBC Driver enables you to connect tools such as DataGrip, DBeaver, and SQL Workbench/J to Azure Databricks through Java Database Connectivity (JDBC), an industry-",2026-04-14T21:45:00.000Z,overview,,0.3,False,"High-level overview of the open source Databricks JDBC driver and supported tools; no detailed property tables, config parameters, limits, or error-code-based troubleshooting content are evident from the summary.",unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/authentication,Authentication settings,Authentication settings for the Databricks JDBC Driver - Azure Databricks,Set up authentication for the Databricks JDBC driver,Learn how to configure authentication settings such as OAuth and PAT for the Databricks JDBC Driver.,"The Databricks JDBC Driver supports multiple authentication methods depending on your use case. This page describes how to configure each method and lists the required connection properties. 
To configure authentication for the Databricks JDBC Driver, use one of the following methods:",2026-04-03T17:47:00.000Z,how-to,security,0.88,True,"Authentication settings for JDBC driver with multiple methods; will list specific connection properties per auth flow (OAuth, PAT, etc.), which are product-specific security configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/configure,Configure a connection,Configure a connection to Databricks using the Databricks JDBC Driver - Azure Databricks,Configure Databricks JDBC driver connections (v3+),Learn how to configure a connection to Databricks using the Databricks JDBC Driver.,"This article shows you how to configure a connection to Databricks using the Databricks JDBC Driver, version 3 and above.",2026-04-03T17:47:00.000Z,how-to,configuration,0.86,True,"Shows how to configure connections using the Databricks JDBC driver v3+, including connection URL structure and property names/values; this is detailed configuration reference.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/configure,Configure a connection,Configure a connection to Databricks using the Databricks JDBC Driver - Azure Databricks,Configure Databricks JDBC driver connection settings,Learn how to configure a connection to Databricks using the Databricks JDBC Driver.,"This page shows you how to configure a connection to Databricks using the Databricks JDBC Driver, version 3 and above.",2026-04-24T23:15:00.000Z,how-to,configuration,0.8,True,"A connection configuration page for the Databricks JDBC driver (v3+) will contain driver-specific connection string parameters, required/optional settings, and possibly defaults. 
These are concrete configuration options unique to this product, matching the configuration sub-skill.",updated https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/example,Run queries using the JDBC driver,Run queries using the JDBC driver - Azure Databricks,Understand Databricks JDBC driver parameter limits,See examples of how to run queries using the Databricks JDBC Driver.,"This page contains examples that show you how to run queries using the Databricks JDBC Driver, version 3 and above. Note The Databricks JDBC Driver has a parameters limit of 256 for parameterized statements.",2026-04-07T21:58:00.000Z,how-to,limits-quotas,0.7,True,"The summary explicitly states that the Databricks JDBC Driver has a parameters limit of 256 for parameterized statements. This is a concrete numeric limit with an exact value that is product-specific. Even though the page is mostly examples, this explicit limit qualifies it as limits-quotas.",unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/metadata,Work with metric view metadata,Work with metric view metadata using the Databricks JDBC Driver - Azure Databricks,Use Databricks JDBC metadata for metric views,Use the Databricks JDBC Driver to discover metric views and identify measure columns with enhanced metadata operations.,"The Databricks JDBC Driver supports enhanced metadata operations that allow business intelligence (BI) tools and applications to discover metric views and identify measure columns within them using standard JDBC metadata methods. BI tools can filter and display metric views separately in data source browsers, generate appropriate SQL by distinguishing measures from dimensions, and build rich semantic layer capabilities. 
Data exploration tools can help users discover and understand available business",2026-04-09T22:01:00.000Z,how-to,integrations,0.7,True,"The page describes enhanced JDBC metadata operations for discovering metric views and identifying measure columns using standard JDBC metadata methods. This is a product-specific integration pattern for BI tools, likely including specific metadata method usage and parameters unique to the Databricks JDBC driver, which fits the integrations category.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/properties,Supported connection properties,Supported connection properties - Azure Databricks,Configure Databricks JDBC driver connection properties,"Discover the connection properties supported by the Databricks JDBC Driver, version 3 and above.","This article describes the connection properties supported by the Databricks JDBC Driver, version 3 and above.",2026-04-14T08:00:00.000Z,reference,configuration,0.9,True,"Explicitly documents supported connection properties for the Databricks JDBC Driver v3+, which are product-specific configuration parameters likely presented in tables with names, types, defaults, and allowed values—information that qualifies as expert configuration knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/properties,Supported connection properties,Supported connection properties - Azure Databricks,Use supported connection properties for Databricks JDBC,"Discover the connection properties supported by the Databricks JDBC Driver, version 3 and above.","This article describes the connection properties supported by the Databricks JDBC Driver, version 3 and above.",2026-04-24T08:00:00.000Z,reference,configuration,0.95,True,"Explicitly about 'supported connection properties' for the Databricks JDBC driver. 
Such pages are typically parameter tables with names, types, allowed values, and defaults—precisely the expert configuration knowledge that fits the configuration sub-skill.",updated https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/reference,API reference,Java API reference for the Databricks JDBC Driver - Azure Databricks,Java API reference for the Databricks JDBC driver,"Find API reference documentation for the Databricks JDBC Driver, version 3 and above.","This article provides API reference documentation for the Databricks JDBC Driver, version 3 and above. Describes methods to retrieve connection and statement execution handles. Package: com.databricks.jdbc.api Describes methods to manage the driver connection. Package: com.databricks.client.jdbc com.databricks.client.jdbc.Driver extends com.databricks.client.jdbc.IDatabricksDriver and java.sql.Driver. Describes methods to retrieve results of an asynchronous query. Package: com.databricks.jdbc.api Descri",2026-01-24T08:00:00.000Z,reference,integrations,0.83,True,"API reference with package names, classes, and methods for Databricks JDBC; includes method signatures and usage patterns that are product-specific integration APIs.",unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/volumes,Manage files in volumes,Manage files in volumes with the Databricks JDBC Driver - Azure Databricks,Manage Unity Catalog volume files via Databricks JDBC,"Learn how to upload, download, and delete files in Unity Catalog volumes using the Databricks JDBC Driver, version 3 and above.","Databricks offers bulk ingestion capabilities using Unity Catalog volumes, which allows users to transfer datasets to and from local files like CSV files. See What are Unity Catalog volumes?. 
This article describes how to manage files in volumes, as well as read and write streams to and from volumes, using the Databricks JDBC Driver, version 3 and above.",2026-03-31T08:00:00.000Z,how-to,integrations,0.82,True,"Explains how to upload/download/delete files and stream data using the JDBC driver; includes driver-specific SQL/operations and parameters, which are integration patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/,Overview,Simba JDBC Driver (Legacy) - Azure Databricks,Configure legacy Simba-based Databricks JDBC driver,"Learn how to connect apps, tools, and SDKs to Azure Databricks using the Databricks JDBC Driver through JDBC.","Note This page applies to JDBC driver (Legacy) versions below version 3. For Databricks JDBC driver version 3 and above, see Databricks JDBC Driver. Databricks JDBC, the first version of the driver, is a Simba driver developed by insightsoftware. Use it to connect apps, tools, clients, SDKs, and APIs to Azure Databricks through Java Database Connectivity (JDBC), an industry-standard specification for accessing database management systems. This section supplements the information in the Databricks 
This page describes how to configure Azure Databricks authentication settings for the Databricks JDBC Driver. The Databricks JDBC Driver supports the following Azure Databricks authentication types:",2026-03-23T08:00:00.000Z,how-to,security,0.86,True,"Authentication settings for legacy JDBC driver, listing supported auth types and associated properties; includes security caveats and product-specific auth configuration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/capability,Driver capability settings,Driver capability settings for the Databricks JDBC Driver (Simba) - Azure Databricks,Configure capability settings for legacy Databricks JDBC driver,Learn how to configure special and advanced driver capability settings for the Databricks JDBC Driver (Simba).,"Note This page applies to JDBC driver (Legacy) versions below version 3. For Databricks JDBC driver version 3 and above, see Databricks JDBC Driver. This page describes how to configure special and advanced driver capability settings for the Databricks JDBC Driver. The Databricks JDBC Driver provides the following special and advanced driver capability settings.",2026-04-09T22:01:00.000Z,reference,configuration,0.86,True,"Similar to index 0 but for the legacy Simba JDBC driver, this page documents 'special and advanced driver capability settings' and how to configure them. These are driver-specific configuration parameters and options, which aligns with the configuration sub-skill type.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/capability,Driver capability settings,Driver capability settings for the Databricks JDBC Driver (Simba) - Azure Databricks,Configure advanced capability settings for Databricks JDBC driver,Learn how to configure special and advanced driver capability settings for the Databricks JDBC Driver (Simba).,"Note This page applies to JDBC driver (Legacy) versions below version 3. 
For Databricks JDBC driver version 3 and above, see Databricks JDBC Driver. This page describes how to configure special and advanced driver capability settings for the Databricks JDBC Driver. The Databricks JDBC Driver provides the following special and advanced driver capability settings.",2026-04-24T23:15:00.000Z,reference,integrations,0.78,True,"Page documents special and advanced capability settings for the Databricks (Simba) JDBC driver, which are product- and driver-specific integration parameters. These are configuration-style options for a specific integration (JDBC driver) rather than general configuration of the Databricks service, and include named settings and how to use them, details that are unlikely to be known from pretraining.",updated https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/compute,Compute settings,Compute settings for the Databricks JDBC Driver (Simba) - Azure Databricks,Set compute options for the legacy Databricks JDBC driver,Learn how to configure Azure Databricks compute resource settings for the Databricks JDBC Driver (Simba).,"Note This page applies to JDBC driver (Legacy) versions below version 3. For Databricks JDBC driver version 3 and above, see Databricks JDBC Driver. This page describes how to configure Azure Databricks compute resource settings for the Databricks JDBC Driver. 
Note The JDBC driver doesn't support connecting to job clusters.",2026-03-24T01:32:00.000Z,how-to,configuration,0.8,True,"Compute settings for the Simba JDBC driver, including constraints like not supporting job clusters and likely specific parameters; these are Databricks-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/configure,Configure a connection,Configure a connection to Databricks using the Databricks JDBC Driver (Simba) - Azure Databricks,Configure connections using the legacy Databricks JDBC driver,Learn how to configure a connection to the Databricks JDBC Driver (Simba).,"Note This page applies to JDBC driver (Legacy) versions below version 3. For Databricks JDBC driver version 3 and above, see Databricks JDBC Driver. This page describes how to configure a connection to Azure Databricks using the Databricks JDBC Driver. To configure a connection, combine the following settings into a JDBC connection URL or a programmatic collection of connection properties:",2026-03-24T01:32:00.000Z,how-to,configuration,0.86,True,"Details how to build JDBC URLs and property sets for the Simba driver, including property names and required values; this is concrete configuration reference.",unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/download,Download and reference,Download and reference the Databricks JDBC Driver (Simba) - Azure Databricks,Download and reference the legacy Databricks JDBC driver,Learn how to download and reference the Databricks JDBC Driver (Simba).,"Note This page applies to JDBC driver (Legacy) versions below version 3. For Databricks JDBC driver version 3 and above, see Databricks JDBC Driver. This page describes how to download and reference the Databricks JDBC Driver. 
Before downloading the driver, review the JDBC ODBC driver license.",2026-03-24T01:32:00.000Z,how-to,deployment,0.64,True,"Explains how to obtain and reference the Simba-based JDBC driver, including jar coordinates/paths and class names; these are product-specific distribution and wiring details, closest to deployment/packaging knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/legacy,Legacy driver settings,Legacy Databricks JDBC Driver - Azure Databricks,Authentication settings for very old Databricks JDBC versions,Learn how to configure settings for legacy Databricks JDBC Driver 2.6.22 and below.,"This page describes authentication settings for legacy Databricks JDBC Driver versions 2.6.22 and below. For version 3 and above, see Databricks JDBC Driver. Warning Use the most secure authentication flow available. The authentication methods on this page carry risks not present in other flows. Only use these methods when more secure options, such as managed identities, aren't viable.",2026-01-16T08:00:00.000Z,reference,security,0.78,True,"Covers auth for JDBC driver 2.6.22 and below, including insecure methods and warnings; includes specific auth parameters and flows, which are product-specific security configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/testing,Testing,Testing the Databricks JDBC Driver (Simba) - Azure Databricks,Test and validate legacy Databricks JDBC driver connections,Learn how to get started with testing code that uses the Databricks JDBC Driver (Simba).,"Note This page applies to JDBC driver (Legacy) versions below version 3. For Databricks JDBC driver version 3 and above, see Databricks JDBC Driver. This page describes how to test code that uses the Databricks JDBC Driver. Use any test framework for JDBC-compatible languages. The following examples use JUnit and Mockito to test JDBC driver connections. 
This code is based on the example in Authentication settings for the Databricks JDBC Driver (Simba).",2026-03-24T01:32:00.000Z,how-to,troubleshooting,0.7,True,Provides JUnit/Mockito examples to test JDBC connections based on auth settings; offers concrete patterns to validate and debug connectivity for the Simba driver.,unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/volumes,Files in Unity Catalog volumes,Manage files in Unity Catalog volumes with the Databricks JDBC Driver (Simba) - Azure Databricks,Manage Unity Catalog volume files via legacy JDBC driver,"Learn how to upload, download, and delete files in Unity Catalog volumes using the Databricks JDBC Driver (Simba).","Note This page applies to JDBC driver (Legacy) versions below version 3. For Databricks JDBC driver version 3 and above, see Databricks JDBC Driver. This page describes how to upload, download, and delete files in Unity Catalog volumes using the Databricks JDBC Driver.",2026-03-24T01:32:00.000Z,how-to,integrations,0.8,True,"Shows how to upload/download/delete files in Unity Catalog volumes using the Simba JDBC driver, with driver-specific commands and parameters; this is integration guidance.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/integrations/lovable,Lovable,Connect Lovable to Databricks - Azure Databricks,Connect Lovable no-code apps to Databricks via OAuth,Learn how to connect a Lovable-hosted app to your Databricks workspace to query data using OAuth machine-to-machine authentication.,"Lovable is a no-code application platform that lets teams build and deploy apps without writing code. With the Azure Databricks integration, Lovable-hosted apps can query data stored in your Azure Databricks lakehouse using the Databricks REST API and OAuth machine-to-machine (M2M) authentication. 
For example, you can build forecasting dashboards or reporting apps backed by lakehouse data.",2026-04-13T21:03:00.000Z,integration,integrations,0.75,True,"Integration guide for Lovable-hosted apps using Databricks REST API and OAuth M2M; likely includes specific OAuth scopes, endpoint URLs, and configuration parameters unique to this integration, fitting the integrations sub-skill.",new +https://learn.microsoft.com/en-us/azure/databricks/integrations/lovable,Lovable,Connect Lovable to Databricks - Azure Databricks,Connect Lovable no-code apps to Databricks via OAuth,Learn how to connect a Lovable-hosted app to your Databricks workspace to query data using OAuth machine-to-machine authentication.,"Lovable is a no-code application platform that lets teams build and deploy apps without writing code. With the Azure Databricks integration, Lovable-hosted apps can query data stored in your Azure Databricks lakehouse using the Databricks REST API and OAuth machine-to-machine (M2M) authentication. For example, you can build forecasting dashboards or reporting apps backed by lakehouse data.",2026-04-13T21:03:00.000Z,integration,integrations,0.75,True,"Integration guide for Lovable-hosted apps using Databricks REST API and OAuth M2M; likely includes specific OAuth scopes, endpoint URLs, and configuration parameters unique to this integration, fitting the integrations sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/manage-oauth,Override partner OAuth token lifetime policy,Override partner OAuth token lifetime policy - Azure Databricks,Override OAuth token lifetimes for Azure Databricks partner apps,Learn how to override the OAuth token lifetime policy for partner OAuth applications.,"This page describes how to override the OAuth token lifetime policy for existing partner OAuth applications. To configure single-use refresh tokens for enhanced security, see Single-use refresh tokens. 
Note Updates to partner OAuth applications can take 30 minutes to process.",2026-01-24T08:00:00.000Z,how-to,security,0.7,True,"Describes overriding OAuth token lifetime policy for partner apps. Likely includes specific policy names, configuration parameters, and possibly lifetime values or constraints. This is product-specific security configuration rather than conceptual guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/microsoft-foundry,Foundry,Use Azure Databricks Genie in Microsoft Foundry - Azure Databricks,Add Databricks Genie MCP server to Microsoft Foundry,Learn how to add an Azure Databricks Genie Model Context Protocol (MCP) server as a tool in Microsoft Foundry.,"Important This feature is in Public Preview. This page explains how to add an Azure Databricks Genie Model Context Protocol (MCP) server as a tool in Microsoft Foundry, allowing your AI agents to connect and use the insights of Genie spaces. The Genie MCP server on Azure Databricks invokes Genie as an MCP tool and works out of the box. To learn more, see Available managed servers and What is a Genie space.",2026-03-10T23:23:00.000Z,integration,integrations,0.7,True,"Explains how to register Genie as an MCP tool in Foundry; likely includes specific endpoint URLs, tool configuration fields, and Databricks-specific parameters.",unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/msft-power-platform,Power Platform,Azure Databricks and Microsoft Power Platform integration - Azure Databricks,,"Learn how to connect Azure Databricks to Microsoft Power Platform. Build apps in Power Apps, flows in Power Automate, and agents in Copilot Studio using Azure Databricks data and Genie spaces.","The Azure Databricks connector for Microsoft Power Platform enables you to build low-code applications, workflows, and agents using your Azure Databricks data. 
Build apps in Power Apps, create automated flows in Power Automate, and develop AI agents in Copilot Studio while preserving your Azure Databricks governance controls.",2026-01-26T19:59:00.000Z,integration,,0.35,False,High-level description of the Power Platform connector; summary suggests conceptual integration overview without detailed configuration tables.,unchanged @@ -1288,7 +1310,7 @@ https://learn.microsoft.com/en-us/azure/databricks/integrations/msft-power-platf https://learn.microsoft.com/en-us/azure/databricks/integrations/msft-power-platform-usage,Use Databricks data,Use Azure Databricks data on Microsoft Power Platform - Azure Databricks,,"Learn how to build apps in Power Apps, flows in Power Automate, and agents in Copilot Studio using Azure Databricks data and Genie spaces.",This page explains how to use your Azure Databricks data from the following platforms after creating a connection:,2026-03-10T08:00:00.000Z,integration,,0.4,False,"Usage-focused page on building apps/flows/agents after connection exists. Summary does not indicate detailed configuration tables, limits, or troubleshooting; more of a how-to use data conceptually.",unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/odbc/,Overview,Databricks ODBC Driver - Azure Databricks,,"Learn how to get started with the Databricks ODBC Driver, which enables you to connect participating apps, tools, and SDKs to Azure Databricks through ODBC.","Important As of February 2026, the ODBC driver was renamed from Simba Spark ODBC Driver to Databricks ODBC Driver. Databricks is no longer distributing new versions of the legacy Simba driver, but existing versions remain supported for two years. Databricks recommends migrating to the Databricks ODBC Driver. 
Azure Databricks provides an ODBC driver, which enables you to connect participating apps, tools, clients, SDKs, and APIs to Azure Databricks through Open Database Connectivity (ODBC), an ind",2026-02-25T19:11:00.000Z,overview,,0.45,False,High-level driver overview and migration note; detailed settings are on child pages.,unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/odbc/authentication,Authentication settings,Authentication settings for the Databricks ODBC Driver - Azure Databricks,Configure authentication for the Databricks ODBC driver,Learn how to configure Azure Databricks authentication settings for the Databricks ODBC Driver.,This page describes how to configure Azure Databricks authentication settings for the Databricks ODBC Driver. The Databricks ODBC Driver supports the following Azure Databricks authentication types:,2026-04-03T08:00:00.000Z,how-to,security,0.86,True,"Authentication settings page for ODBC driver; typically enumerates supported auth types and required connection properties (e.g., token, host, HTTPPath), which are product-specific security and auth configuration parameters.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/integrations/odbc/capability,Driver capability settings,Driver capability settings for the Databricks ODBC Driver - Azure Databricks,Configure advanced capability settings for Databricks ODBC driver,Learn how to configure special and advanced driver capability settings for the Databricks ODBC Driver.,This page describes how to configure special and advanced driver capability settings for the Databricks ODBC Driver. The Databricks ODBC Driver provides the following special and advanced driver capability settings.,2026-04-09T22:01:00.000Z,reference,configuration,0.86,True,"The page is specifically about 'special and advanced driver capability settings' for the Databricks ODBC driver. 
These capability flags and their allowed values are product-specific configuration parameters that an LLM is unlikely to know from training. It is not just a tutorial; it documents driver settings and how to configure them, which fits the configuration category.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/integrations/odbc/capability,Driver capability settings,Driver capability settings for the Databricks ODBC Driver - Azure Databricks,Configure advanced capability settings for Databricks ODBC driver,Learn how to configure special and advanced driver capability settings for the Databricks ODBC Driver.,This page describes how to configure special and advanced driver capability settings for the Databricks ODBC Driver. The Databricks ODBC Driver provides the following special and advanced driver capability settings.,2026-04-24T23:15:00.000Z,reference,configuration,0.8,True,"Described as detailing 'special and advanced driver capability settings' for the Databricks ODBC driver. 
Such pages typically list driver capability flags/parameters, allowed values, and behavior, which are product-specific configuration details not inferable from general knowledge.",updated https://learn.microsoft.com/en-us/azure/databricks/integrations/odbc/compute,Compute settings,Compute settings for the Databricks ODBC Driver - Azure Databricks,Configure Databricks compute settings for the ODBC driver,Learn how to configure Azure Databricks compute resource settings for the Databricks ODBC Driver.,This page describes how to configure Azure Databricks compute resource settings for the Databricks ODBC Driver.,2026-02-24T08:00:00.000Z,how-to,configuration,0.82,True,"Describes compute resource settings specifically for the Databricks ODBC driver, likely listing setting names, allowed values, and behavior; this is driver-specific configuration rather than generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/odbc/connect-databricks-excel-python-r,Connect from Python or R,Connect to Azure Databricks from Python or R - Azure Databricks,Connect Python and R clients to Databricks via ODBC,Learn how to use the Databricks ODBC driver to connect Azure Databricks using Python or R.,"In this page, you learn how to use the Databricks ODBC driver to connect Azure Databricks with Python or R language. Once you establish the connection, you can access the data in Azure Databricks from the Python or R clients. 
You can also use the clients to further analyze the data.",2026-03-02T08:00:00.000Z,how-to,integrations,0.8,True,"Demonstrates using the Databricks ODBC driver from Python and R, including connection strings and driver options; these are concrete integration details unique to Databricks.",unchanged https://learn.microsoft.com/en-us/azure/databricks/integrations/odbc/download,Download and install,Download and install the Databricks ODBC Driver - Azure Databricks,,Learn how to download and install the Databricks ODBC Driver.,"Note As of February 2026, new versions of the ODBC driver are being released by Databricks and the driver has been renamed to Databricks ODBC Driver. To migrate from the legacy Simba Spark ODBC driver, see the migration guide. Review the JDBC ODBC driver license before you install the driver. Some tools require separate ODBC driver installation (like Tableau Desktop), while others include the driver (like recent Power BI Desktop releases). If installation isn't required, skip to Next steps.",2026-02-25T19:11:00.000Z,how-to,,0.4,False,Download/install instructions without detailed configuration parameter tables or limits.,unchanged @@ -1304,20 +1326,20 @@ https://learn.microsoft.com/en-us/azure/databricks/jobs/alert,SQL Alert,SQL aler https://learn.microsoft.com/en-us/azure/databricks/jobs/automate,Automate jobs,Automate job creation and management - Azure Databricks,"Automate Databricks job management with CLI, SDK, and REST",Learn about how to get started with developer tools to automate the creation and management of jobs,"This article shows you how to get started with developer tools to automate the creation and management of jobs. It introduces you to the Databricks CLI, the Databricks SDKs, and the REST API. Note This article provides examples for creating and managing jobs using the Databricks CLI, the Databricks Python SDK, and the REST API as an easy introduction to those tools. 
To programmatically manage jobs as part of CI/CD, use Declarative Automation Bundles or the Databricks Terraform provider.",2026-03-16T08:00:00.000Z,how-to,integrations,0.8,True,"Provides concrete examples using Databricks CLI, Python SDK, and REST API for job creation and management, including product-specific commands and parameters.",unchanged https://learn.microsoft.com/en-us/azure/databricks/jobs/backfill-jobs,Backfill jobs,Backfill jobs - Azure Databricks,Configure and run backfill jobs in Lakeflow Jobs,Learn how to run a backfill for an existing job.,"After you have created a job that runs on a schedule, there are times when you need to backfill data into the system. For example: A backfill for data should use the same job automation that you use to keep your data up to date. Job backfills allow you to run jobs using your existing automation for different date ranges.",2026-01-24T08:00:00.000Z,how-to,configuration,0.6,True,Explains how to run backfills using existing automation and date ranges; likely includes job parameterization and schedule configuration details specific to backfill scenarios.,unchanged https://learn.microsoft.com/en-us/azure/databricks/jobs/clean-room-notebook,Clean Room notebook,Clean Room notebook task for jobs - Azure Databricks,Configure Clean Room notebook tasks in Lakeflow Jobs,Learn how to configure a Clean Room notebook task in an Azure Databricks job.,"Use the Clean Room notebook task to run Databricks notebooks in a Clean Room as part of a workflow. 
For more information about using Clean Room notebook tasks to create complex workflows, see Use Lakeflow Jobs to run clean room notebooks.",2026-01-24T08:00:00.000Z,how-to,configuration,0.7,True,"Describes Clean Room notebook tasks; likely includes task configuration parameters and Clean Room–specific options, which are product-specific.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/jobs/compute,Overview,Configure compute for jobs - Azure Databricks,Configure compute for Lakeflow Jobs with recommended patterns,Learn about options and best practices for running your Lakeflow Jobs with Azure Databricks compute resources.,"This article contains recommendations and resources for configuring compute for Lakeflow Jobs. Important Limitations for serverless compute for jobs include the following: For more limitations, see Serverless compute limitations. Each job can have one or more tasks. You define compute resources for each task. Multiple tasks defined for the same job can use the same compute resource.",2026-01-23T08:00:00.000Z,how-to,best-practices,0.75,True,"Explicitly called recommendations and best practices for compute; includes serverless limitations and workload-specific guidance, matching product-specific best-practices criteria.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/jobs/configure-job,Overview,Configure and edit Lakeflow Jobs - Azure Databricks,Configure and edit Azure Databricks Lakeflow Jobs,"Learn how to create, configure, and edit Lakeflow Jobs to orchestrate data processing, machine learning, and analytics pipelines.","You can create and run a job using the Jobs UI, or developer tools such as the Databricks CLI or the REST API. Using the UI or API, you can repair and rerun a failed or canceled job. This article shows how to create, configure, and edit jobs using the Jobs & Pipelines workspace UI. 
For information about other tools, see the following: Tip To view a job as YAML, click the kebab menu to the left of Run now for the job and then click Switch to code version (YAML).",2026-04-17T18:03:00.000Z,how-to,configuration,0.7,True,"Page is about creating and configuring Lakeflow Jobs via the Jobs & Pipelines UI. It typically includes job-level settings, task options, and possibly parameter names and allowed values. These are product-specific configuration details that qualify as expert knowledge under configuration.",updated +https://learn.microsoft.com/en-us/azure/databricks/jobs/compute,Overview,Configure compute for jobs - Azure Databricks,Configure Azure Databricks compute for Lakeflow Jobs,Learn about options and best practices for running your Lakeflow Jobs with Azure Databricks compute resources.,"This article contains recommendations and resources for configuring compute for Lakeflow Jobs. Important Limitations for serverless compute for jobs include the following: For more limitations, see Serverless compute limitations. Each job can have one or more tasks. You define compute resources for each task. Multiple tasks defined for the same job can use the same compute resource.",2026-04-20T17:34:00.000Z,how-to,best-practices,0.7,True,"The page provides product-specific recommendations and limitations for configuring compute for Lakeflow Jobs, including guidance on serverless compute limitations and how to assign compute per task. 
This is actionable, Databricks-specific best-practices content rather than generic concepts, but the summary does not clearly show detailed numeric limits, so it fits best under best-practices rather than limits-quotas.",updated +https://learn.microsoft.com/en-us/azure/databricks/jobs/configure-job,Overview,Configure and edit Lakeflow Jobs - Azure Databricks,Configure and edit Azure Databricks Lakeflow Jobs,"Learn how to create, configure, and edit Lakeflow Jobs to orchestrate data processing, machine learning, and analytics pipelines.","You can create and run a job using the Jobs UI, or developer tools such as the Databricks CLI or the REST API. Using the UI or API, you can repair and rerun a failed or canceled job. This article shows how to create, configure, and edit jobs using the Jobs & Pipelines workspace UI. For information about other tools, see the following: Tip To view a job as YAML, click the kebab menu to the left of Run now for the job and then click Switch to code version (YAML).",2026-04-17T18:03:00.000Z,how-to,configuration,0.7,True,"Page is about creating and configuring Lakeflow Jobs via the Jobs & Pipelines UI. It typically includes job-level settings, task options, and possibly parameter names and allowed values. These are product-specific configuration details that qualify as expert knowledge under configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/jobs/configure-task,Overview,Configure and edit tasks in Lakeflow Jobs - Azure Databricks,,"Learn how to create, configure, and edit tasks in Lakeflow Jobs to orchestrate data processing, machine learning, and analytics pipelines.","This article focuses on instructions for creating, configuring, and editing tasks using the Jobs & Pipelines workspace UI. Azure Databricks manages tasks as components of Lakeflow Jobs. A job has one or more tasks. You create a new job in the workspace UI by configuring the first task. To configure a new job, see Configure and edit Lakeflow Jobs. 
Each task has an associated compute resource that runs the task logic. If you are using serverless, Azure Databricks configures your compute resources. If",2026-03-06T08:00:00.000Z,how-to,,0.5,False,"Task configuration via UI; mostly procedural, no clear evidence of parameter tables with defaults or numeric constraints.",unchanged https://learn.microsoft.com/en-us/azure/databricks/jobs/control-flow,Control flow,Control the flow of tasks within Lakeflow Jobs - Azure Databricks,Configure task dependencies and control flow in Lakeflow Jobs,Learn how to control the flow of tasks within Lakeflow Jobs.,"Some jobs are simply a list of tasks that need to be completed. You can control the execution order of tasks by specifying dependencies between them. You can configure tasks to run in sequence or parallel. However, you can also create branching flows that include conditional tasks, error correction, or cleanup. Lakeflow Jobs provides functionality to control the flow of tasks within a job. The following topics describe ways that you can control the flow of your tasks.",2026-01-23T08:00:00.000Z,how-to,configuration,0.6,True,"Describes conditional tasks, branching, and cleanup; likely includes task-level settings and patterns specific to Databricks job control flow.",unchanged https://learn.microsoft.com/en-us/azure/databricks/jobs/dashboard,Dashboard,Dashboard task for jobs - Azure Databricks,Configure dashboard tasks to refresh Databricks dashboards,Learn how to configure a dashboard task in an Azure Databricks job.,Use a dashboard task to refresh results on a published Databricks dashboard as part of a job.,2026-02-23T08:00:00.000Z,how-to,configuration,0.65,True,Describes dashboard tasks; expected to list configuration parameters for selecting and refreshing dashboards within jobs.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/jobs/dbt,dbt,dbt task for jobs - Azure Databricks,Configure dbt tasks in Azure Databricks jobs,Learn how to configure a dbt task 
in an Azure Databricks job.,"Use the dbt task to configure and run dbt projects on Azure Databricks. Important When dbt tasks run, Databricks injects the DBT_ACCESS_TOKEN for the principal configured in the Run As field.",2026-04-16T08:00:00.000Z,how-to,configuration,0.7,True,"Explains Databricks-specific dbt task configuration, including use of DBT_ACCESS_TOKEN and run-as principal behavior. These are concrete, product-specific configuration details not captured by generic dbt knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/jobs/dbt,dbt,dbt task for jobs - Azure Databricks,Configure dbt tasks in Azure Databricks jobs,Learn how to configure a dbt task in an Azure Databricks job.,"Use the dbt task to configure and run dbt projects on Azure Databricks. Important When dbt tasks run, Databricks injects the DBT_ACCESS_TOKEN for the principal configured in the Run As field.",2026-04-16T08:00:00.000Z,how-to,configuration,0.7,True,"Explains Databricks-specific dbt task configuration, including use of DBT_ACCESS_TOKEN and run-as principal behavior. These are concrete, product-specific configuration details not captured by generic dbt knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/jobs/dbt-platform,dbt platform,dbt platform task for jobs - Azure Databricks,Orchestrate dbt platform jobs with Azure Databricks dbt platform tasks,Learn how to configure a dbt platform task in an Azure Databricks job.,"Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Use the dbt platform task to orchestrate and monitor existing dbt platform jobs directly from Azure Databricks. 
This page explains how to select and trigger dbt jobs, set auto-retry options for failures, and monitor runs.",2026-01-23T08:00:00.000Z,how-to,integrations,0.75,True,"dbt platform task integrates existing dbt platform jobs; likely documents selection of dbt jobs, auto-retry options, and monitoring parameters unique to this integration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/jobs/dynamic-value-references,Dynamic value references,What is a dynamic value reference? - Azure Databricks,Use dynamic value references in Databricks jobs,"Learn how to pass context about a job run, for example, the job ID or the start time for the run, into the job's tasks.",Dynamic value references describe a collection of variables available when configuring jobs and tasks. Use dynamic value references to configure conditional statements for tasks or to pass information as parameters or arguments. Dynamic value references include information such as:,2026-04-14T17:47:00.000Z,how-to,configuration,0.8,True,"Defines the set of dynamic value reference variables (job ID, run start time, etc.) and how they are used in conditions and parameter passing. These are specific configuration tokens and behaviors unique to Databricks.",updated -https://learn.microsoft.com/en-us/azure/databricks/jobs/file-arrival-triggers,Trigger jobs when new files arrive,Trigger jobs when new files arrive - Azure Databricks,Configure file arrival triggers for Lakeflow Jobs,Learn how to trigger a run in Lakeflow Jobs when new files arrive in cloud storage locations.,"You can use file arrival triggers to trigger a run of your job when new files arrive in an external location such as Amazon S3, Azure storage, or Google Cloud Storage. This feature is useful when a scheduled job's efficiency is compromised by irregular new data arrivals.",2026-04-15T08:00:00.000Z,how-to,configuration,0.65,True,"Describes how to trigger jobs based on new files in external locations. 
This usually involves trigger configuration options (paths, patterns, polling intervals, event conditions) that are specific to Lakeflow Jobs. These settings and their behavior are product-specific configuration knowledge.",updated -https://learn.microsoft.com/en-us/azure/databricks/jobs/for-each,Overview,Use a For each task to run another task in a loop - Azure Databricks,Configure For each looping tasks in Databricks jobs,"Learn how to run an Azure Databricks job task in a loop, passing different parameters to each task run.","This article discusses using the For each task with your Lakeflow Jobs, including details on adding and configuring the task in the Jobs UI. Use the For each task to run a nested task in a loop, passing a different set of parameters to each iteration of the task. Adding the For each task to a job requires defining two tasks: The For each task and a nested task. The nested task is the task to run for each iteration of the For each task and is one of the standard Lakeflow Jobs task types. You cannot add anothe",2026-04-16T08:00:00.000Z,how-to,configuration,0.7,True,"Details how to add and configure For each and nested tasks in Lakeflow Jobs, including parameter passing semantics and task-type constraints. These are specific configuration mechanics for Databricks workflows.",updated -https://learn.microsoft.com/en-us/azure/databricks/jobs/for-each-lookup-example,Pass large parameter arrays,Use a lookup table for large parameter arrays in a For each task - Azure Databricks,Handle large parameter arrays in For each tasks,"Example of an Azure Databricks job that uses the For each task in a loop, passing parameters to lookup configuration for each task run.","For each tasks pass parameter arrays to nested tasks iteratively, to give each nested task information for its run. Parameter arrays are limited to 10,000 characters, or 48 KB if you use task value references to pass them. 
When you have a larger amount of data to pass to the nested tasks, you can't directly use the input, task values, or job parameters to pass that data. One alternative to passing the complete data is to store the task data as a JSON file, and pass in a lookup key (into the JSON ",2026-04-17T18:03:00.000Z,how-to,limits-quotas,0.9,True,"Explicitly states numeric limits for parameter arrays (10,000 characters, 48 KB with task value references) and provides a workaround pattern using JSON lookup tables. Contains concrete limits and related configuration behavior.",updated -https://learn.microsoft.com/en-us/azure/databricks/jobs/git,Use Git with jobs,Use Git with Lakeflow Jobs - Azure Databricks,Configure Git-backed source for Lakeflow Jobs tasks,"Learn how to configure Git-backed Lakeflow Jobs tasks, use sparse checkout for large repositories, and optimize caching to maintain workspace performance.","Job tasks can check out source code directly from a remote Git repository. The following task types support remote Git repositories: All tasks in a job must reference the same commit in the remote repository. When a job run begins, Azure Databricks takes a snapshot of the specified branch or commit, so that all tasks in that run use the same version of the code. When you view the run history of a task that runs code stored in a remote Git repository, theTask run detailspane includes Git details,",2026-04-17T18:03:00.000Z,how-to,configuration,0.7,True,"Explains configuring job tasks to check out code from remote Git repositories, including constraints like all tasks using the same commit and how snapshots are taken. These are product-specific configuration behaviors and parameters (branch/commit handling, caching, sparse checkout) that fit the configuration sub-skill.",new +https://learn.microsoft.com/en-us/azure/databricks/jobs/dynamic-value-references,Dynamic value references,What is a dynamic value reference? 
- Azure Databricks,Use dynamic value references in Databricks jobs,"Learn how to pass context about a job run, for example, the job ID or the start time for the run, into the job's tasks.",Dynamic value references describe a collection of variables available when configuring jobs and tasks. Use dynamic value references to configure conditional statements for tasks or to pass information as parameters or arguments. Dynamic value references include information such as:,2026-04-14T17:47:00.000Z,how-to,configuration,0.8,True,"Defines the set of dynamic value reference variables (job ID, run start time, etc.) and how they are used in conditions and parameter passing. These are specific configuration tokens and behaviors unique to Databricks.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/jobs/file-arrival-triggers,Trigger jobs when new files arrive,Trigger jobs when new files arrive - Azure Databricks,Configure file arrival triggers for Lakeflow Jobs,Learn how to trigger a run in Lakeflow Jobs when new files arrive in cloud storage locations.,"You can use file arrival triggers to trigger a run of your job when new files arrive in an external location such as Amazon S3, Azure storage, or Google Cloud Storage. This feature is useful when a scheduled job's efficiency is compromised by irregular new data arrivals.",2026-04-15T08:00:00.000Z,how-to,configuration,0.65,True,"Describes how to trigger jobs based on new files in external locations. This usually involves trigger configuration options (paths, patterns, polling intervals, event conditions) that are specific to Lakeflow Jobs. 
These settings and their behavior are product-specific configuration knowledge.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/jobs/for-each,Overview,Use a For each task to run another task in a loop - Azure Databricks,Configure For each looping tasks in Databricks jobs,"Learn how to run an Azure Databricks job task in a loop, passing different parameters to each task run.","This article discusses using the For each task with your Lakeflow Jobs, including details on adding and configuring the task in the Jobs UI. Use the For each task to run a nested task in a loop, passing a different set of parameters to each iteration of the task. Adding the For each task to a job requires defining two tasks: The For each task and a nested task. The nested task is the task to run for each iteration of the For each task and is one of the standard Lakeflow Jobs task types. You cannot add anothe",2026-04-16T08:00:00.000Z,how-to,configuration,0.7,True,"Details how to add and configure For each and nested tasks in Lakeflow Jobs, including parameter passing semantics and task-type constraints. These are specific configuration mechanics for Databricks workflows.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/jobs/for-each-lookup-example,Pass large parameter arrays,Use a lookup table for large parameter arrays in a For each task - Azure Databricks,Handle large parameter arrays in For each tasks,"Example of an Azure Databricks job that uses the For each task in a loop, passing parameters to lookup configuration for each task run.","For each tasks pass parameter arrays to nested tasks iteratively, to give each nested task information for its run. Parameter arrays are limited to 10,000 characters, or 48 KB if you use task value references to pass them. When you have a larger amount of data to pass to the nested tasks, you can't directly use the input, task values, or job parameters to pass that data. 
One alternative to passing the complete data is to store the task data as a JSON file, and pass in a lookup key (into the JSON ",2026-04-17T18:03:00.000Z,how-to,limits-quotas,0.9,True,"Explicitly states numeric limits for parameter arrays (10,000 characters, 48 KB with task value references) and provides a workaround pattern using JSON lookup tables. Contains concrete limits and related configuration behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/jobs/git,Use Git with jobs,Use Git with Lakeflow Jobs - Azure Databricks,Configure Git-backed source for Lakeflow Jobs tasks,"Learn how to configure Git-backed Lakeflow Jobs tasks, use sparse checkout for large repositories, and optimize caching to maintain workspace performance.","Job tasks can check out source code directly from a remote Git repository. The following task types support remote Git repositories: All tasks in a job must reference the same commit in the remote repository. When a job run begins, Azure Databricks takes a snapshot of the specified branch or commit, so that all tasks in that run use the same version of the code. When you view the run history of a task that runs code stored in a remote Git repository, theTask run detailspane includes Git details,",2026-04-17T18:03:00.000Z,how-to,configuration,0.7,True,"Explains configuring job tasks to check out code from remote Git repositories, including constraints like all tasks using the same commit and how snapshots are taken. 
These are product-specific configuration behaviors and parameters (branch/commit handling, caching, sparse checkout) that fit the configuration sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/databricks/jobs/how-to/create-recurring-job,Use parameters for a recurring job,Setup a recurring query with backfill support using Lakeflow Jobs - Azure Databricks,,"Learn how to set up a recurring, parameterized query with backfill support.","A common scenario is a query that is run on a regular schedule with a job orchestrating it. For example, at the end of each day, a query is run to update a system based on that day's changes to source datasets. This tutorial takes you through creating a query with parameters that identify the time frame to import data for, and then creating a job to schedule that query to run daily. The query and parameters created in this tutorial match best practices, and are set up to allow you to run abackfi",2026-01-24T08:00:00.000Z,tutorial,,0.45,False,"Tutorial for recurring, parameterized queries with backfill; while it mentions best practices, it’s framed as a walkthrough rather than a concentrated best-practices reference with quantified impact.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/jobs/how-to/foreach-sql-lookup-tutorial,Use SQL and For each tasks for metadata-driven orchestration,Use a control table to drive a For each job - Azure Databricks,Build metadata-driven For each jobs with control tables,Scale data ingestion across hundreds of sources using a metadata-driven control table with SQL and For each tasks in Lakeflow Jobs.,"You may need to ingest from many sources. When that list changes, hardcoding it in job configuration means changing code, and redeploying. Usemetadatato address this by storing the list of sources in a table that is read and used at run time. Add a source as a new row and the next job run picks it up with no changes to the job itself. 
This tutorial shows you how to build a job using this approach. A SQL task reads the control table, and a For each task iterates over every row in parallel.",2026-04-17T18:03:00.000Z,tutorial,best-practices,0.65,True,"Shows a concrete, product-specific pattern for scaling ingestion using a control table plus SQL and For each tasks in Lakeflow Jobs. This is an actionable design pattern (metadata-driven orchestration) specific to Databricks jobs, beyond generic looping; fits best-practices as it prescribes how to structure jobs and control tables for maintainability and scalability.",new
+https://learn.microsoft.com/en-us/azure/databricks/jobs/how-to/foreach-sql-lookup-tutorial,Use SQL and For each tasks for metadata-driven orchestration,Use a control table to drive a For each job - Azure Databricks,Build metadata-driven For each jobs with control tables,Scale data ingestion across hundreds of sources using a metadata-driven control table with SQL and For each tasks in Lakeflow Jobs.,"You may need to ingest from many sources. When that list changes, hardcoding it in job configuration means changing code, and redeploying. Use metadata to address this by storing the list of sources in a table that is read and used at run time. Add a source as a new row and the next job run picks it up with no changes to the job itself. This tutorial shows you how to build a job using this approach. A SQL task reads the control table, and a For each task iterates over every row in parallel.",2026-04-17T18:03:00.000Z,tutorial,best-practices,0.65,True,"Shows a concrete, product-specific pattern for scaling ingestion using a control table plus SQL and For each tasks in Lakeflow Jobs. 
This is an actionable design pattern (metadata-driven orchestration) specific to Databricks jobs, beyond generic looping; fits best-practices as it prescribes how to structure jobs and control tables for maintainability and scalability.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/jobs/how-to/run-jobs-with-service-principals,Run jobs with service principals,Run a job with a Microsoft Entra ID service principal - Azure Databricks,Run Lakeflow Jobs using Microsoft Entra service principals,Learn how to run Lakeflow Jobs with a Microsoft Entra ID service principal.,"Lakeflow Jobs provide a non-interactive way to run applications in an Azure Databricks cluster, for example, an ETL job or data analysis task that should run on a scheduled basis. Typically, these jobs run as the user that created them, but this can have some limitations: Using a service account—an account associated with an application rather than a specific user—is a common method to address these limitations. In Azure, you can use a Microsoft Entra ID application and service principal to crea",2026-02-11T08:00:00.000Z,tutorial,security,0.8,True,"Focuses on running jobs as a service principal; likely includes specific RBAC roles, scopes, and auth configuration details for Databricks–Entra integration.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/jobs/how-to/use-airflow-with-jobs,Orchestrate jobs with Airflow,Orchestrate Lakeflow Jobs with Apache Airflow - Azure Databricks,Orchestrate Lakeflow Jobs with Apache Airflow integration,Learn how to orchestrate Lakeflow Jobs in a data pipeline with Apache Airflow and how to set up the Airflow integration.,"This article describes the Apache Airflow support for orchestrating data pipelines with Azure Databricks, has instructions for installing and configuring Airflow locally, and provides an example of deploying and running an Azure Databricks workflow with Airflow. 
Note The Apache Airflow support described on this page uses several open source packages. This includes the Databricks provider for Airflow (including the Databricks Airflow operators). These packages are not directly supported by Databri",2026-01-23T08:00:00.000Z,tutorial,integrations,0.8,True,"Describes installing/configuring Airflow and Databricks provider, plus example workflow; likely includes operator names, connection parameters, and integration-specific settings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/jobs/how-to/use-dbt-in-workflows,Use dbt,Use dbt transformations in Lakeflow Jobs - Azure Databricks,Integrate dbt Core transformations with Lakeflow Jobs,Learn how to integrate your dbt Core transformations in a Lakeflow Jobs workflow.,"You can run your dbt Core projects as a task in a job. By running your dbt Core project as a job task, you can benefit from the following Lakeflow Jobs features: To learn more about dbt Core, see thedbt documentation.",2026-03-16T08:00:00.000Z,tutorial,integrations,0.7,True,Describes running dbt Core projects as a Lakeflow Jobs task; likely includes task configuration parameters and integration-specific settings unique to Databricks–dbt integration.,unchanged @@ -1326,27 +1348,27 @@ https://learn.microsoft.com/en-us/azure/databricks/jobs/how-to/use-python-wheels https://learn.microsoft.com/en-us/azure/databricks/jobs/if-else,If/else,Add branching logic to a job with the If/else task - Azure Databricks,Add If/else branching logic in Databricks jobs,Learn how to use the If/else task in an Azure Databricks job to apply conditional logic to workflows.,"Use theIf/else conditiontask to add boolean conditional logic to task graphs. These tasks consist of a boolean operator and a pair of operands, where the operands might reference the job or task state using configured or dynamic parameters or task values. SeeParameterize jobs. 
For example, suppose you have a task named process_records that maintains a count of records that are not valid in a value named bad_records, and you want to branch processing when you encounter bad records. To add this logic",2026-01-23T08:00:00.000Z,how-to,configuration,0.7,True,"Details how to configure If/else tasks, including operands and parameter references, which are Databricks-specific configuration patterns.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/jobs/jar,Overview,JAR task for jobs - Azure Databricks,Configure JAR tasks for Azure Databricks jobs,Learn how to configure a JAR task in an Azure Databricks job.,"Use the JAR task to deploy Scala or Java code compiled into a JAR (Java ARchive). Before you create a JAR task, see Create a Databricks compatible JAR. Important Serverless Scala and Java jobs are in Public Preview. You can use JAR tasks to deploy your JAR. See Manage Azure Databricks previews if it's not already enabled.",2026-04-10T08:00:00.000Z,how-to,configuration,0.7,True,"How-to page for configuring JAR tasks; likely includes Databricks-specific task settings, parameters, and options (for example, main class, libraries, cluster settings) that are product-specific configuration details rather than generic Java knowledge.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/jobs/jar-create,Create a JAR,Create an Azure Databricks compatible JAR - Azure Databricks,Create Databricks-compatible JARs for Lakeflow Jobs,Learn how to create a JAR that is ready for use in an Azure Databricks job.,"A Java archive (JAR) packages Java or Scala code for deployment in Lakeflow Jobs. This page covers JAR compatibility requirements and project configuration for different compute types. Tip For automated deployment and continuous integration workflows, use Declarative Automation Bundles to create a project from a template with pre-configured build and deployment settings. 
See Build a Scala JAR using Declarative Automation Bundles and Bundle that uploads a JAR file to Unity Catalog. This page describ",2026-03-03T08:00:00.000Z,how-to,configuration,0.7,True,"Describes compatibility requirements and project configuration for different Databricks compute types, including build settings that are product-specific.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/jobs/job-parameters,Configure job parameters,Configure job parameters - Azure Databricks,Configure and use job parameters in Databricks,Learn about options for parameterizing jobs and tasks in Azure Databricks.,"This article describes job parameter functionality and configuring job parameters with the Databricks workspace UI. You can also add job parameters to JSON and YAML definitions used with the REST API, CLI, and Declarative Automation Bundles. See Jobs API, Databricks CLI, and What are Declarative Automation Bundles?.",2026-04-14T21:45:00.000Z,concept-article,configuration,0.8,True,"Explains Databricks job parameter functionality with concrete configuration in UI, JSON, and YAML for REST API, CLI, and automation bundles. Contains specific parameter constructs and schema unique to Databricks jobs.",updated
+https://learn.microsoft.com/en-us/azure/databricks/jobs/job-parameters,Configure job parameters,Configure job parameters - Azure Databricks,Configure and use job parameters in Databricks,Learn about options for parameterizing jobs and tasks in Azure Databricks.,"This article describes job parameter functionality and configuring job parameters with the Databricks workspace UI. You can also add job parameters to JSON and YAML definitions used with the REST API, CLI, and Declarative Automation Bundles. 
See Jobs API, Databricks CLI, and What are Declarative Automation Bundles?.",2026-04-14T21:45:00.000Z,concept-article,configuration,0.8,True,"Explains Databricks job parameter functionality with concrete configuration in UI, JSON, and YAML for REST API, CLI, and automation bundles. Contains specific parameter constructs and schema unique to Databricks jobs.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/jobs/jobs-quickstart,Quickstart,Create your first workflow with Lakeflow Jobs - Azure Databricks,,Learn how to quickly create and orchestrate tasks with Lakeflow Jobs.,"This article demonstrates using Lakeflow Jobs to orchestrate tasks to read and process a sample dataset. In this quickstart, you:",2026-03-26T08:00:00.000Z,tutorial,,0.3,False,"Quickstart tutorial for creating a workflow; typically step-by-step without detailed config matrices, limits, or troubleshooting mappings.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/jobs/large-jobs,Large jobs,Jobs with a large number of tasks - Azure Databricks,Configure and troubleshoot Lakeflow Jobs with many tasks,Learn how to configure and troubleshoot Lakeflow Jobs with a large number of tasks.,You can create jobs that contain up to 1000 tasks. 
The following topics describe how to solve specific issues that may come up with large jobs that have more than 100 tasks.,2026-01-23T08:00:00.000Z,how-to,troubleshooting,0.7,True,Targets issues with jobs up to 1000 tasks and specifically mentions solving issues; likely includes symptom→cause→solution guidance and numeric limits.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/jobs/monitor,Monitor jobs,Monitoring and observability for Lakeflow Jobs - Azure Databricks,Monitor and inspect Lakeflow Jobs runs in Databricks,Learn about tools for monitoring and observability for jobs and tasks in Azure Databricks.,"This article describes the features available in the Azure Databricks UI to view jobs you have access to, view a history of runs for jobs, and view details of job runs. To configure notifications for jobs, seeAdd notifications on a job. To learn about using the Databricks CLI to view jobs and run jobs, run the CLI commandsdatabricks jobs list -h,databricks jobs get -h, anddatabricks jobs run-now -h. To learn about using the Jobs API, see theJobs API. If you have access to thesystem.lakeflowschem",2026-03-16T08:00:00.000Z,how-to,configuration,0.65,True,"Describes specific UI features, CLI commands, and system.lakeflow schema usage for monitoring jobs, which are product-specific observability configurations.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/jobs/notebook,Notebook,Notebook task for jobs - Azure Databricks,Configure notebook tasks in Azure Databricks jobs,Learn how to configure a notebook task in an Azure Databricks job.,Use the notebook task to deploy Databricks notebooks.,2026-04-16T08:00:00.000Z,how-to,configuration,0.7,True,"Task configuration page with product-specific fields and options for notebook tasks (e.g., parameters, cluster settings, run-as options). 
These are concrete configuration parameters and patterns unique to Databricks jobs, not just conceptual guidance.",updated -https://learn.microsoft.com/en-us/azure/databricks/jobs/notifications,Notifications,Add notifications on a job - Azure Databricks,Configure job and task notifications in Databricks,"Learn how to configure email and system notifications when your Lakeflow Jobs start, complete successfully, or fail.","You can set up notifications to be sent on runs of a job and individual job tasks for the following events: You can send notifications to one or more email addresses or third-party destinations such as Slack, Microsoft Teams, PagerDuty, or any webhook-based service. This article describes the different ways you can set up job-level notifications.",2026-04-15T22:08:00.000Z,how-to,configuration,0.7,True,"Describes configuration of notifications for jobs and tasks, including supported destinations (email, Slack, Teams, PagerDuty, webhooks) and event types. These are product-specific notification configuration options.",updated -https://learn.microsoft.com/en-us/azure/databricks/jobs/parameter-use,Access parameter values,Access parameter values from a task - Azure Databricks,Access Databricks job parameters from task code,Learn how to access parameter values of different types in a Lakeflow Jobs task.,"This article describes how to access parameter values from code in your tasks, including Databricks notebooks, Python scripts, and SQL files. Parameters include user-defined parameters, values output from upstream tasks, and metadata values generated by the job. SeeParameterize jobs. While the details vary bytask type, there are four common methods used to reference parameter values from source code: In each of these cases, you reference the parameter's key to access its value. 
The key is someti",2026-04-16T08:00:00.000Z,how-to,integrations,0.65,True,"Shows concrete code patterns for accessing parameters in notebooks, Python scripts, and SQL files, including how Databricks exposes parameter values and metadata. This is a product-specific coding/integration pattern for parameter access.",updated
+https://learn.microsoft.com/en-us/azure/databricks/jobs/monitor,Monitor jobs,Monitoring and observability for Lakeflow Jobs - Azure Databricks,,Learn about tools for monitoring and observability for jobs and tasks in Azure Databricks.,"This article describes the features available in the Azure Databricks UI to view jobs you have access to, view a history of runs for jobs, and view details of job runs. To configure notifications for jobs, see Add notifications on a job. To learn about using the Databricks CLI to view jobs and run jobs, run the CLI commands databricks jobs list -h, databricks jobs get -h, and databricks jobs run-now -h. To learn about using the Jobs API, see the Jobs API. If you have access to the system.lakeflow schem",2026-04-17T08:00:00.000Z,how-to,,0.2,False,"The summary indicates a conceptual/feature overview of monitoring and observability for jobs, referencing UI views, CLI help commands, and the Jobs API, but does not show specific error codes, diagnostic mappings, or configuration tables. It reads as general usage guidance rather than expert troubleshooting or configuration detail.",updated
+https://learn.microsoft.com/en-us/azure/databricks/jobs/notebook,Notebook,Notebook task for jobs - Azure Databricks,Configure notebook tasks in Azure Databricks jobs,Learn how to configure a notebook task in an Azure Databricks job.,Use the notebook task to deploy Databricks notebooks.,2026-04-16T08:00:00.000Z,how-to,configuration,0.7,True,"Task configuration page with product-specific fields and options for notebook tasks (e.g., parameters, cluster settings, run-as options). 
These are concrete configuration parameters and patterns unique to Databricks jobs, not just conceptual guidance.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/jobs/notifications,Notifications,Add notifications on a job - Azure Databricks,Configure job and task notifications in Databricks,"Learn how to configure email and system notifications when your Lakeflow Jobs start, complete successfully, or fail.","You can set up notifications to be sent on runs of a job and individual job tasks for the following events: You can send notifications to one or more email addresses or third-party destinations such as Slack, Microsoft Teams, PagerDuty, or any webhook-based service. This article describes the different ways you can set up job-level notifications.",2026-04-15T22:08:00.000Z,how-to,configuration,0.7,True,"Describes configuration of notifications for jobs and tasks, including supported destinations (email, Slack, Teams, PagerDuty, webhooks) and event types. These are product-specific notification configuration options.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/jobs/parameter-use,Access parameter values,Access parameter values from a task - Azure Databricks,Access Databricks job parameters from task code,Learn how to access parameter values of different types in a Lakeflow Jobs task.,"This article describes how to access parameter values from code in your tasks, including Databricks notebooks, Python scripts, and SQL files. Parameters include user-defined parameters, values output from upstream tasks, and metadata values generated by the job. See Parameterize jobs. While the details vary by task type, there are four common methods used to reference parameter values from source code: In each of these cases, you reference the parameter's key to access its value. 
The key is someti",2026-04-16T08:00:00.000Z,how-to,integrations,0.65,True,"Shows concrete code patterns for accessing parameters in notebooks, Python scripts, and SQL files, including how Databricks exposes parameter values and metadata. This is a product-specific coding/integration pattern for parameter access.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/jobs/parameters,Overview,Parameterize jobs - Azure Databricks,Parameterize Azure Databricks jobs and tasks,Learn about options for parameterizing jobs and tasks in Azure Databricks.,"This article provides an overview of using parameters with jobs and tasks, focusing on the specifics of configuration for jobs and tasks in the UI. You must also update the source code assets you configure as tasks to reference parameters. Parameter references vary by language and task type. See Access parameter values from a task. The following are foundational concepts for understanding parameters for jobs:",2026-01-23T08:00:00.000Z,concept-article,configuration,0.7,True,"Explains concrete parameter configuration options in the Jobs UI and how they map to task code, which is Databricks-specific configuration detail.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/jobs/pipeline,Pipeline,Pipeline task for jobs - Azure Databricks,Configure pipeline tasks to run Lakeflow Spark Declarative Pipelines,Learn how to configure a pipeline task in Lakeflow Jobs to manage Lakeflow Spark Declarative Pipelines.,"Lakeflow Jobs provide a procedural approach to defining relationships between tasks. Lakeflow Spark Declarative Pipelines provide a declarative approach to defining relationships between datasets and transformations. This page describes how you can schedule triggered Lakeflow Spark Declarative Pipelines to run as a task in a job, using the Jobs UI, the Lakeflow Spark Declarative Pipelines UI, or SQL. 
Note A triggered pipeline is a pipeline that does not run continuously, but must be triggered to start",2026-01-23T08:00:00.000Z,how-to,configuration,0.7,True,Explains scheduling triggered pipelines as job tasks; likely includes task configuration options and trigger behavior specific to Lakeflow pipelines.,unchanged
https://learn.microsoft.com/en-us/azure/databricks/jobs/powerbi,Power BI,Power BI task for jobs - Azure Databricks,Configure Power BI tasks to orchestrate semantic models from Databricks,Learn how to configure a Power BI task in an Azure Databricks job.,"Important The Power BI task feature is in Public Preview. While you can publish to Microsoft Power BI online manually from your Azure Databricks workspace, you can use a Power BI task to orchestrate your Power BI semantic models automatically. To learn more about publishing to Power BI in the Azure Databricks UI, see Publish to the Power BI service from Azure Databricks.",2026-01-24T08:00:00.000Z,how-to,integrations,0.7,True,Power BI task integrates Databricks jobs with Power BI service; likely includes connection/config parameters and constraints specific to this integration.,unchanged
https://learn.microsoft.com/en-us/azure/databricks/jobs/privileges,Identities and privileges,"Manage identities, permissions, and privileges for Lakeflow Jobs - Azure Databricks",Manage Lakeflow Jobs identities and permissions in Databricks,"Learn how to configure and manage identities, permissions, and privileges for Lakeflow Jobs.","This article contains recommendations and instructions for managing identities, permissions, and privileges for Lakeflow Jobs. Note Secrets are not redacted from a cluster's Spark driver log stdout and stderr streams. To protect sensitive data, by default, Spark driver logs are viewable only by users with CAN MANAGE permission on job, dedicated access mode, and standard access mode clusters. 
To allow users with CAN ATTACH TO or CAN RESTART permission to view the logs on these clusters, set the follow",2026-04-09T08:00:00.000Z,concept-article,security,0.9,True,"Covers identities, permissions, and privileges for Lakeflow Jobs; mentions specific permissions like CAN MANAGE, CAN ATTACH TO, CAN RESTART and log visibility behavior, which are product-specific RBAC/security configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/jobs/python-script,Python script,Python script task for jobs - Azure Databricks,Configure Python script tasks in Azure Databricks jobs,Learn how to configure a Python script task in an Azure Databricks job.,Use thePython scripttask to run a Python file.,2026-04-16T08:00:00.000Z,how-to,configuration,0.7,True,"Describes how to configure Python script tasks with Databricks-specific options (paths, parameters, environment, cluster/runtime settings). Contains concrete configuration fields and patterns beyond generic Python execution.",updated +https://learn.microsoft.com/en-us/azure/databricks/jobs/python-script,Python script,Python script task for jobs - Azure Databricks,Configure Python script tasks in Azure Databricks jobs,Learn how to configure a Python script task in an Azure Databricks job.,Use thePython scripttask to run a Python file.,2026-04-16T08:00:00.000Z,how-to,configuration,0.7,True,"Describes how to configure Python script tasks with Databricks-specific options (paths, parameters, environment, cluster/runtime settings). 
Contains concrete configuration fields and patterns beyond generic Python execution.",unchanged https://learn.microsoft.com/en-us/azure/databricks/jobs/python-wheel,Python wheel,Python Wheel task for jobs - Azure Databricks,Configure Python wheel tasks in Azure Databricks jobs,Learn how to configure a Python wheel task in an Azure Databricks job.,Use the Python wheel task type to deploy code packaged as a Python wheel.,2026-01-23T08:00:00.000Z,how-to,configuration,0.7,True,Describes Python wheel task type; likely includes configuration fields and constraints for deploying wheel-based applications in jobs.,unchanged https://learn.microsoft.com/en-us/azure/databricks/jobs/repair-job-failures,Troubleshoot,Troubleshoot and repair job failures - Azure Databricks,Troubleshoot and repair Azure Databricks job failures,Learn how to use tools and features in the Azure Databricks UI to troubleshoot and fix failures in your Lakeflow Jobs.,"Suppose you have been notified (for example, through an email notification, a monitoring solution, or in the Lakeflow Jobs UI) that a task has failed in a run of your job. The steps in this article provide guidance to help you identify the cause of failure, suggestions to fix the issues that you find, and how to repair failed job runs.",2026-02-25T08:00:00.000Z,how-to,troubleshooting,0.8,True,"Organized around failed tasks with guidance to identify causes, fix issues, and repair runs using Databricks-specific tools, matching troubleshooting criteria.",unchanged https://learn.microsoft.com/en-us/azure/databricks/jobs/run-classic-jobs,Classic,Best practices for configuring classic Lakeflow Jobs - Azure Databricks,Apply best practices for configuring classic Lakeflow Jobs,Learn about options and best practices for running your Azure Databricks classic workflows with Azure Databricks compute resources.,"Learn general recommendations about features and configurations that can benefit classic Lakeflow Jobs. 
Classic jobs require that you create and tailor specific configurations of compute resources, policies, and performance options that fit the needs of your data transformation scenarios. Specific recommendations for configuring the size and types of compute resources vary based on the workload. Review these best practices before you start configuring your classic workflows in order to avoid unw",2026-01-23T08:00:00.000Z,best-practice,best-practices,0.8,True,"Explicit best-practices article for classic jobs, including compute sizing, policies, and performance options; clearly product-specific configuration recommendations.",unchanged https://learn.microsoft.com/en-us/azure/databricks/jobs/run-if,Configure task dependencies,Configure task dependencies - Azure Databricks,,Learn how to run a task in an Azure Databricks job conditionally based on the status of the task's dependencies.,"The Run if dependencies field allows you to add control flow logic to tasks based on other tasks' success, failure, or completion. Dependencies are visually represented in the job DAG as lines between tasks. Azure Databricks runs upstream tasks before running downstream tasks, running as many of them in parallel as possible. Note Depends on is only visible if the job consists of multiple tasks. Databricks also has the following functionality for control flow and conditionalization:",2026-04-08T19:25:00.000Z,how-to,,0.4,False,"Describes conditional task dependencies and control flow conceptually; no indication of numeric thresholds, configuration tables, or product-specific error codes. Primarily behavior/UX description.",unchanged https://learn.microsoft.com/en-us/azure/databricks/jobs/run-job,Run Job,Run Job task for jobs - Azure Databricks,Configure Run Job tasks for Databricks workflows,Learn how to configure a Run Job task in an Azure Databricks job.,Use the Run Job task to trigger other jobs configured in your Azure Databricks workspace.
Important Databricks does not support jobs with circular dependencies or that nest more than three Run Job tasks. Databricks might block deploying jobs with this pattern in a future release. Circular dependencies are Run Job tasks that directly or indirectly trigger each other. For example:,2026-01-23T08:00:00.000Z,how-to,configuration,0.75,True,"Contains concrete constraints (no circular dependencies, max nesting depth) and task configuration details specific to Databricks jobs.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/jobs/run-serverless-jobs,Serverless,Run your Lakeflow Jobs with serverless compute for workflows - Azure Databricks,Decide when to run Lakeflow Jobs on serverless compute,Learn how to use serverless compute for workflows to run a data processing workflow without configuring and deploying infrastructure.,"Serverless compute for workflows lets you run your job without configuring and deploying infrastructure. With serverless compute, you focus on implementing your data processing and analysis pipelines, and Azure Databricks efficiently manages compute resources, including optimizing and scaling compute for your workloads. Autoscaling and Photon are automatically enabled for the compute resources that run your job. Serverless compute for workflows automatically and continuously optimizes infrastructu",2026-04-16T17:44:00.000Z,how-to,decision-making,0.6,True,"Covers using serverless compute for workflows, including how it manages autoscaling and optimization. Such pages typically include guidance on when to choose serverless vs. other compute options, with trade-offs around management, performance, and cost.
This aligns with decision-making for compute selection in Lakeflow Jobs.",updated +https://learn.microsoft.com/en-us/azure/databricks/jobs/run-serverless-jobs,Serverless,Run your Lakeflow Jobs with serverless compute for workflows - Azure Databricks,Decide when to run Lakeflow Jobs on serverless compute,Learn how to use serverless compute for workflows to run a data processing workflow without configuring and deploying infrastructure.,"Serverless compute for workflows lets you run your job without configuring and deploying infrastructure. With serverless compute, you focus on implementing your data processing and analysis pipelines, and Azure Databricks efficiently manages compute resources, including optimizing and scaling compute for your workloads. Autoscaling and Photon are automatically enabled for the compute resources that run your job. Serverless compute for workflows automatically and continuously optimizes infrastructu",2026-04-16T17:44:00.000Z,how-to,decision-making,0.6,True,"Covers using serverless compute for workflows, including how it manages autoscaling and optimization. Such pages typically include guidance on when to choose serverless vs. other compute options, with trade-offs around management, performance, and cost. This aligns with decision-making for compute selection in Lakeflow Jobs.",unchanged https://learn.microsoft.com/en-us/azure/databricks/jobs/scheduled,Run jobs on a schedule,Run jobs on a schedule - Azure Databricks,Configure time-based schedules for Azure Databricks jobs,Learn how to run your Azure Databricks job on a specific schedule.,Configure your job with the Scheduled trigger to run them on a time-based schedule. The scheduled trigger type has two options: Note You cannot specify the time for the first run for a simple schedule.
The scheduler chooses a time when you configure the schedule.,2026-01-23T08:00:00.000Z,how-to,configuration,0.7,True,Details scheduled trigger options and notes constraints (for example first-run timing); likely includes schedule parameter names and allowed values.,unchanged https://learn.microsoft.com/en-us/azure/databricks/jobs/spark-submit,Spark Submit (legacy),Spark Submit Task Deprecation Notice & Migration Guide - Azure Databricks,Migrate from Spark Submit tasks in Databricks jobs,Deprecation notice for the **Spark Submit** task type with migration guidance.,Warning The Spark Submit task is deprecated and pending removal. Usage of this task type is disallowed for new use cases and strongly discouraged for existing customers. See Spark Submit (legacy) for the original documentation for this task type. Keep reading for migration instructions.,2026-01-23T08:00:00.000Z,upgrade-and-migration-article,decision-making,0.65,True,"Deprecation notice plus migration guidance implies concrete recommendations on which alternative task types to choose and when, which is product-specific decision guidance.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/jobs/sql,SQL,SQL task for jobs - Azure Databricks,Configure SQL tasks and warehouses for Databricks jobs,Learn how to configure a SQL task in an Azure Databricks job.,"Use the SQL task type to configure a SQL query, alert, or SQL file. The SQL task requires Databricks SQL and a serverless or pro SQL warehouse.",2026-04-16T08:00:00.000Z,how-to,configuration,0.7,True,Covers configuring SQL tasks including query/file/alert modes and requirements for Databricks SQL warehouses (serverless or pro).
Contains product-specific configuration options and constraints for SQL tasks.,updated +https://learn.microsoft.com/en-us/azure/databricks/jobs/sql,SQL,SQL task for jobs - Azure Databricks,Configure SQL tasks and warehouses for Databricks jobs,Learn how to configure a SQL task in an Azure Databricks job.,"Use the SQL task type to configure a SQL query, alert, or SQL file. The SQL task requires Databricks SQL and a serverless or pro SQL warehouse.",2026-04-16T08:00:00.000Z,how-to,configuration,0.7,True,Covers configuring SQL tasks including query/file/alert modes and requirements for Databricks SQL warehouses (serverless or pro). Contains product-specific configuration options and constraints for SQL tasks.,unchanged https://learn.microsoft.com/en-us/azure/databricks/jobs/task-parameters,Configure task parameters,Configure task parameters - Azure Databricks,Configure and use task parameters in Azure Databricks jobs,Learn about options for parameterizing tasks in Azure Databricks.,"Task parameters allow you to parameterize tasks using values that can be static, dynamic, or set by upstream tasks. For information on using dynamic values, see What is a dynamic value reference?. For information on passing context between tasks, see Use task values to pass information between tasks. The assets configured by tasks use different syntax to refer to values passed as parameters. See Access parameter values from a task.
Note Some tasks support parameterization but do not have parameter ",2026-04-09T08:00:00.000Z,concept-article,configuration,0.85,True,"Details task parameterization, including syntax differences per asset type and dynamic value references; these are product-specific parameter names and usage patterns that constitute configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/jobs/task-values,Task values,Use task values to pass information between tasks - Azure Databricks,Pass task values between Databricks job tasks,Learn how to share information in an Azure Databricks workflow by passing variables between job tasks.,"Task values refer to the Databricks Utilities taskValues subutility, which lets you pass arbitrary values between tasks in a Databricks job. See taskValues subutility (dbutils.jobs.taskValues). You specify a key-value pair using dbutils.jobs.taskValues.set() in one task and then can use the task name and key to reference the value in subsequent tasks. Note Because dbutils.jobs.taskValues.set() and dbutils.jobs.taskValues.get() in the dbutils.jobs.taskValues subutility are Python functions, they can be used ",2026-01-23T08:00:00.000Z,how-to,integrations,0.7,True,"Describes dbutils.jobs.taskValues API (set/get) with key-based semantics, a Databricks-specific coding pattern and API surface for inter-task communication.",unchanged https://learn.microsoft.com/en-us/azure/databricks/jobs/trigger-table-update,Trigger on update of source tables,Trigger jobs when source tables are updated - Azure Databricks,Trigger Lakeflow Jobs on source table updates,Learn how to trigger a run in Lakeflow Jobs when source tables are updated.,You can use table update triggers to trigger a run of your job when source tables are updated.
Use this feature to run a job when new data is ready without the need for a continuously running cluster or knowledge of the processes that update a table.,2026-01-24T08:00:00.000Z,how-to,configuration,0.65,True,Explains table update triggers; likely documents trigger configuration fields and behavior specific to Databricks table-update events.,unchanged @@ -1357,7 +1379,7 @@ https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/cost-o https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/data-governance/,Data & AI governance,Data and AI governance for the data lakehouse - Azure Databricks,Apply data and AI governance architecture on Databricks,This article covers architectural principles of data and AI governance on the Databricks lakehouse: How to centrally manage data assets and access.,The architectural principles of the data and AI governance pillar cover how to centrally manage assets and access.,2026-01-20T08:00:00.000Z,reference-architecture,architecture-patterns,0.65,True,Describes architectural principles for centrally managing data assets and access in the Databricks lakehouse; product-specific governance architecture patterns.,unchanged https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/data-governance/best-practices,Data & AI governance best practices,Best practices for data and AI governance - Azure Databricks,Implement best practices for Databricks data and AI governance,This article covers best practices supporting principles of data and AI governance on the Databricks lakehouse.,"This article covers best practices of data and AI governance, organized by architectural principles listed in the following sections.",2026-03-26T08:00:00.000Z,best-practice,best-practices,0.75,True,Explicitly a best-practices article organized by governance principles; contains Databricks-specific DOs and DON’Ts for governance.,unchanged
https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/deployment-guide/,Production planning,Databricks production planning - Azure Databricks,Plan production Azure Databricks lakehouse deployments,"Comprehensive planning guide for designing enterprise Azure Databricks lakehouse architecture across account setup, governance, network, storage, and operations for production deployments.","This section provides a structured, phase-by-phase approach to planning and designing a production-ready enterprise Azure Databricks lakehouse platform. It focuses on architectural decisions, design patterns, and best practices rather than step-by-step implementation instructions.",2026-03-24T17:49:00.000Z,best-practice,decision-making,0.7,True,"Phase-based production planning guide focused on architectural decisions and patterns across account, governance, network, storage, and operations; helps choose approaches and structures for enterprise deployments.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/deployment-guide/account-setup,Phase 1: Account,Phase 1: Design account and identity strategy - Azure Databricks,Design Azure Databricks account and identity strategy,"Design account administration and identity management strategy for your Azure Databricks account, including administrative roles, SSO, and user provisioning.","In this phase, you design the foundational account administration and identity management strategy for your Azure Databricks account.",2026-04-15T08:00:00.000Z,best-practice,security,0.7,True,"A design guide for account administration and identity management typically includes specific Databricks account roles, Azure AD/SSO configuration details, and user provisioning patterns. 
These are product-specific security and identity configurations (RBAC roles, scopes, SSO setup) that qualify as security-focused expert knowledge rather than generic concepts.",updated +https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/deployment-guide/account-setup,Phase 1: Account,Phase 1: Design account and identity strategy - Azure Databricks,Design Azure Databricks account and identity strategy,"Design account administration and identity management strategy for your Azure Databricks account, including administrative roles, SSO, and user provisioning.","In this phase, you design the foundational account administration and identity management strategy for your Azure Databricks account.",2026-04-15T08:00:00.000Z,best-practice,security,0.7,True,"A design guide for account administration and identity management typically includes specific Databricks account roles, Azure AD/SSO configuration details, and user provisioning patterns. These are product-specific security and identity configurations (RBAC roles, scopes, SSO setup) that qualify as security-focused expert knowledge rather than generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/deployment-guide/compute,Phase 8: Compute,Phase 8: Design compute configuration - Azure Databricks,Design compute and workspace configuration for Azure Databricks,"Design compute strategy and workspace configuration including cluster sizing, SQL warehouse sizing, policies, and security settings.","In this phase, you design compute resources and workspace settings to optimize performance, cost, and security. Azure Databricks recommends using serverless compute as the primary option. Serverless requires no configuration, is always available, and scales automatically with workloads in seconds. 
Only configure classic compute manually if serverless does not support your use case.",2026-03-24T01:32:00.000Z,best-practice,decision-making,0.7,True,"Helps choose between serverless and classic compute, cluster and SQL warehouse sizing, and policies; Databricks-specific trade-offs and recommendations for different scenarios.",unchanged https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/deployment-guide/delta-lake,Phase 6: Delta Lake,Phase 6: Design Delta Lake architecture - Azure Databricks,Design Delta Lake and medallion data architecture on Databricks,"Design Delta Lake storage architecture, medallion architecture patterns, and data governance strategies for your Azure Databricks lakehouse.","In this phase, you design Delta Lake storage architecture and data organization patterns for your lakehouse.",2026-03-24T01:32:00.000Z,best-practice,architecture-patterns,0.7,True,Focuses on Delta Lake storage architecture and data organization patterns (including medallion) specific to Databricks; these are concrete product-specific architecture patterns.,unchanged https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/deployment-guide/ha-dr,Phase 10: High availability & disaster recovery,Phase 10: Design high availability and disaster recovery - Azure Databricks,Design high availability and disaster recovery for Databricks lakehouse,Design high availability and disaster recovery strategies to ensure business continuity for your Azure Databricks lakehouse.,"In this phase, you design high availability (HA) and disaster recovery (DR) strategies to ensure business continuity and resilience. 
High availability (HA) and disaster recovery (DR) are important facets of the lakehouse, and many organizations implement both to productionize critical use cases on Azure Databricks.",2026-03-24T01:32:00.000Z,best-practice,architecture-patterns,0.65,True,Covers Databricks-specific HA and DR strategies for critical workloads; these are concrete resilience architecture patterns for the platform.,unchanged @@ -1388,11 +1410,11 @@ https://learn.microsoft.com/en-us/azure/databricks/lakehouse/medallion,What is t https://learn.microsoft.com/en-us/azure/databricks/lakehouse/ssot,What does it mean to build a single source of truth?,What does it mean to build a single source of truth? - Azure Databricks,,"The Databricks lakehouse eliminates the need for creating and syncing copies of data across multiple systems by unifying data access and storage in a single system, establishing the lakehouse as the s","The Databricks lakehouse eliminates the need for creating and syncing copies of data across multiple systems by unifying data access and storage in a single system, establishing the lakehouse as the single source of truth (SSOT). Duplicating data often results in data silos, meaning that different teams within an organization may be working with versions of the same data that differ in quality and freshness.",2024-08-06T08:00:00.000Z,concept-article,,0.2,False,"Conceptual explanation of single source of truth; no indication of numeric thresholds, decision matrices, or product-specific configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/languages/,Overview,Languages overview - Azure Databricks,,"Learn how to use Python, SQL, R, and Scala to perform collaborative data science, data engineering, and data analysis in Azure Databricks.","Databricks actively supports developers who want to use their favorite language and corresponding tools to harness Databricks functionality. 
The following table provides links to overviews and information about developer-focused Databricks features and integrations by supported language, which includes Python, R, Scala, and SQL language support and many other tools that enable automating and streamlining your organization's ETL pipelines and software development lifecycle. For recommendations ab",2026-03-16T17:36:00.000Z,landing-page,,0.2,False,"Languages overview that mainly links to other content; lacks product-specific configuration values, limits, or detailed patterns required for expert-knowledge classification.",unchanged https://learn.microsoft.com/en-us/azure/databricks/languages/overview,Language recommendations,Choose a development language - Azure Databricks,Choose a programming language for Azure Databricks,"Learn about the primary languages that Databricks supports, and how to make the best choice for your scenarios.","Databricks supports the use of different programming languages for development and data engineering. This article outlines the available options, where those languages can be used, and their limitations.",2026-03-16T08:00:00.000Z,overview,decision-making,0.65,True,"Explains available languages, where they can be used, and their limitations in Databricks. This is workspace-specific language selection guidance with constraints, fitting decision-making for technology choice.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/languages/python,Python,Azure Databricks for Python developers - Azure Databricks,,Learn about developing notebooks and jobs in Azure Databricks using the Python language. This article provides links to tutorials and key references and tools.,"This section provides a guide to developing notebooks and jobs in Azure Databricks using the Python language, including tutorials for common workflows and tasks, and links to APIs, libraries, and tools. 
To get started:",2026-04-15T08:00:00.000Z,overview,,0.2,False,"High-level guide for Python development on Azure Databricks with links to tutorials and tools; summary suggests navigation/overview content without detailed configuration tables, limits, or troubleshooting mappings.",updated -https://learn.microsoft.com/en-us/azure/databricks/languages/scala,Scala,Azure Databricks for Scala developers - Azure Databricks,,Learn about developing notebooks and jobs in Azure Databricks using the Scala language. This article provides links to tutorials and key references and tools.,"This article provides a guide to developing notebooks and jobs in Azure Databricks using the Scala language. The first section provides links to tutorials for common workflows and tasks. The second section provides links to APIs, libraries, and key tools. A basic workflow for getting started is: Beyond this, you can branch out into more specific topics:",2026-01-19T08:00:00.000Z,overview,,0.3,False,"Similar to the Python page, this is a guide/index for Scala content without specific configuration tables, limits, or troubleshooting mappings.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/languages/python,Python,Azure Databricks for Python developers - Azure Databricks,,Learn about developing notebooks and jobs in Azure Databricks using the Python language. This article provides links to tutorials and key references and tools.,"This section provides a guide to developing notebooks and jobs in Azure Databricks using the Python language, including tutorials for common workflows and tasks, and links to APIs, libraries, and tools. 
To get started:",2026-04-15T08:00:00.000Z,overview,,0.2,False,"High-level guide for Python development on Azure Databricks with links to tutorials and tools; summary suggests navigation/overview content without detailed configuration tables, limits, or troubleshooting mappings.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/languages/scala,Scala,Azure Databricks for Scala developers - Azure Databricks,,Learn about developing notebooks and jobs in Azure Databricks using the Scala language. This article provides links to tutorials and key references and tools.,"This article provides a guide to developing notebooks and jobs in Azure Databricks using the Scala language. The first section provides links to tutorials for common workflows and tasks. The second section provides links to APIs, libraries, and key tools. A basic workflow for getting started is: Beyond this, you can branch out into more specific topics:",2026-04-23T08:00:00.000Z,overview,,0.1,False,"Acts as a navigation/guide page for Scala development on Databricks with links to tutorials and APIs; no detailed limits, configs, or troubleshooting content.",updated https://learn.microsoft.com/en-us/azure/databricks/large-language-models/,Overview,Large language models (LLMs) on Databricks - Azure Databricks,,"How to use large language models (LLMs) on Databricks, including tools and libraries such as Hugging Face, LangChain, and OpenAI.","Azure Databricks makes it simple to access and build off of publicly available large language models. Databricks Runtime for Machine Learning includes libraries like Hugging Face Transformers and LangChain that allow you to integrate existing pre-trained models or other open-source libraries into your workflow. From here, you can leverage Azure Databricks platform capabilities to fine-tune LLMs using your own data for better domain performance. 
In addition, Azure Databricks offers built-in funct",2026-01-22T13:21:00.000Z,concept-article,,0.4,False,"Overview of using LLMs on Databricks with common libraries; largely conceptual and high-level without detailed config tables, limits, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/large-language-models/ai-functions,Overview,Enrich data using AI Functions - Azure Databricks,Use Azure Databricks AI Functions from SQL and Python,"Learn about Azure Databricks AI Functions, built-in SQL functions that allow you to access and serve large language models (LLMs) directly from SQL or Python.","Important This feature is in Public Preview. AI Functions are built-in functions that you can use to apply LLMs or state-of-the-art research techniques on data stored on Azure Databricks for data transformation and enrichment. They can be run from anywhere on Databricks, including Databricks SQL, notebooks, Lakeflow Spark Declarative Pipelines, and Workflows. AI Functions are simple to use, fast, and scalable. Analysts can use them to apply data intelligence to their proprietary data, while data ",2026-04-17T18:03:00.000Z,concept-article,integrations,0.65,True,"Describes concrete, product-specific AI Functions (built-in SQL/Python functions) for invoking LLMs on Databricks data. While the summary is high level, this page in full typically documents specific function names, parameters, and usage patterns unique to Databricks AI Functions, which fits the integrations & coding patterns category.",updated -https://learn.microsoft.com/en-us/azure/databricks/large-language-models/ai-functions-example,Tutorial: Analyze customer reviews using AI functions,Analyze customer reviews using AI Functions - Azure Databricks,,Learn how to use the built-in Databricks SQL function to examine customer reviews and determine if a response needs to be generated.,"Important This feature is in Public Preview.
This article illustrates how to use AI Functions to examine customer reviews and determine if a response needs to be generated. The AI Functions used in this example are built-in Databricks SQL functions, powered by generative AI models made available by Databricks Foundation Model APIs. See Enrich data using AI Functions. This example performs the following on a test dataset called reviews using AI Functions:",2026-04-13T21:52:00.000Z,concept-article,,0.3,False,"Scenario tutorial showing how to analyze reviews with AI Functions. Based on the summary, it focuses on an example workflow rather than enumerating configuration tables, limits, or detailed error mappings. It is primarily instructional/tutorial content, not a reference of expert-only details.",updated +https://learn.microsoft.com/en-us/azure/databricks/large-language-models/ai-functions,Overview,Enrich data using AI Functions - Azure Databricks,Use Azure Databricks AI Functions from SQL and Python,"Learn about Azure Databricks AI Functions, built-in SQL functions that allow you to access and serve large language models (LLMs) directly from SQL or Python.","Important This feature is in Public Preview. AI Functions are built-in functions that you can use to apply LLMs or state-of-the-art research techniques on data stored on Azure Databricks for data transformation and enrichment. They can be run from anywhere on Databricks, including Databricks SQL, notebooks, Lakeflow Spark Declarative Pipelines, and Workflows. AI Functions are simple to use, fast, and scalable. Analysts can use them to apply data intelligence to their proprietary data, while data ",2026-04-17T18:03:00.000Z,concept-article,integrations,0.65,True,"Describes concrete, product-specific AI Functions (built-in SQL/Python functions) for invoking LLMs on Databricks data.
While the summary is high level, this page in full typically documents specific function names, parameters, and usage patterns unique to Databricks AI Functions, which fits the integrations & coding patterns category.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/large-language-models/ai-functions-example,Tutorial: Analyze customer reviews using AI functions,Analyze customer reviews using AI Functions - Azure Databricks,,Learn how to use the built-in Databricks SQL function to examine customer reviews and determine if a response needs to be generated.,"Important This feature is in Public Preview. This article illustrates how to use AI Functions to examine customer reviews and determine if a response needs to be generated. The AI Functions used in this example are built-in Databricks SQL functions, powered by generative AI models made available by Databricks Foundation Model APIs. See Enrich data using AI Functions. This example performs the following on a test dataset called reviews using AI Functions:",2026-04-13T21:52:00.000Z,concept-article,,0.3,False,"Scenario tutorial showing how to analyze reviews with AI Functions. Based on the summary, it focuses on an example workflow rather than enumerating configuration tables, limits, or detailed error mappings. It is primarily instructional/tutorial content, not a reference of expert-only details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/large-language-models/ai-playground,AI Playground,Chat with LLMs and prototype generative AI apps using AI Playground - Azure Databricks,,Chat with supported large language models using Databricks AI Playground available in your Databricks workspace.,"The AI Playground is a chat environment for Large Language Models (LLMs).
In the playground, you can interactively test, prompt, and compare LLMs, as well as prototype tool-calling agents and question-answering bots.",2026-02-10T23:12:00.000Z,concept-article,,0.3,False,Overview of AI Playground and how to chat with LLMs; mostly conceptual/usage description without detailed configuration tables or limits.,unchanged https://learn.microsoft.com/en-us/azure/databricks/large-language-models/ai-query,ai_query: general-purpose function,Use ai_query - Azure Databricks,Call ai_query to access AI models in Databricks,"Learn how to use ai\_query, the general-purpose AI Function that queries any supported model directly from SQL or Python on Azure Databricks.","Important This feature is in Public Preview. ai_query is a general-purpose AI Function that lets you query any supported AI model directly from SQL or Python. Unlike task-specific AI Functions, which are purpose-built and optimized for a single task, ai_query gives you full control over the model, prompt, and parameters. For full syntax and parameter reference, see ai_query function.",2026-04-07T21:58:00.000Z,how-to,integrations,0.8,True,"Focuses on the ai_query function, a Databricks-specific API for querying AI models from SQL/Python. 
Likely includes function signature, parameters (model, prompt, temperature, etc.), and usage constraints unique to Databricks, matching the integrations & coding patterns criteria with concrete parameter references.",unchanged @@ -1410,27 +1432,27 @@ https://learn.microsoft.com/en-us/azure/databricks/ldp/auto-scaling,Autoscaling, https://learn.microsoft.com/en-us/azure/databricks/ldp/best-practices,Best practices,Best practices for Lakeflow Spark Declarative Pipelines - Azure Databricks,Apply best practices for Lakeflow Spark Declarative Pipelines,"Learn recommended patterns for building reliable, efficient, and maintainable Lakeflow Spark Declarative Pipelines on Azure Databricks.","This page describes recommended patterns for designing, building, and operating pipelines with Lakeflow Spark Declarative Pipelines. Apply these guidelines when starting a new pipeline or improving an existing one.",2026-03-31T08:00:00.000Z,concept-article,best-practices,0.7,True,Explicit best-practices article for Lakeflow Spark Declarative Pipelines; likely includes product-specific DO/DON’T guidance and patterns beyond generic Spark advice.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/cdc,Overview,The AUTO CDC APIs: Simplify change data capture with pipelines - Azure Databricks,Implement AUTO CDC and AUTO CDC FROM SNAPSHOT in Lakeflow pipelines,Learn how to implement change data capture with Lakeflow Spark Declarative Pipelines using the AUTO CDC and AUTO CDC FROM SNAPSHOT APIs.,"Lakeflow Spark Declarative Pipelines simplifies change data capture (CDC) with the AUTO CDC and AUTO CDC FROM SNAPSHOT APIs. These APIs automate the complexity of computing slowly changing dimensions (SCD) Type 1 and Type 2 from either a CDC feed or database snapshots. To learn more about these concepts, see Change data capture and snapshots. Note The AUTO CDC APIs replace the APPLY CHANGES APIs and have the same syntax. 
The APPLY CHANGES APIs are still available, but Databricks recommends using the AUTO CD",2026-03-09T18:23:00.000Z,how-to,configuration,0.75,True,"Documents AUTO CDC APIs, including syntax and options for SCD Type 1/2; these are specific configuration/command details for CDC.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/cdc-advanced,Advanced auto CDC topics,Advanced AUTO CDC topics - Azure Databricks,Use advanced AUTO CDC features and monitor processing metrics,"Learn about DML operations on AUTO CDC targets, reading change data feeds, and processing metrics.","This page covers advanced topics for working with AUTO CDC and AUTO CDC FROM SNAPSHOT target tables, including DML operations, reading change data feeds, and monitoring processing metrics. For an introduction to the AUTO CDC APIs, see The AUTO CDC APIs: Simplify change data capture with pipelines.",2026-03-09T18:23:00.000Z,how-to,best-practices,0.7,True,"Covers advanced topics like DML on targets, reading change feeds, and metrics; these are nuanced, product-specific usage patterns and recommendations.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/ldp/clone-hms-to-uc,Migrate from Hive metastore to Unity Catalog,Create a Unity Catalog pipeline by cloning a Hive metastore pipeline - Azure Databricks,Clone Hive metastore pipelines to Unity Catalog via REST,Learn how to use the clone a pipeline REST API request to create a declarative pipeline configured to publish to Unity Catalog based on an existing pipeline that publishes to the Hive metastore.,"This page describes the clone a pipeline request in the Databricks REST API and how you can use it to copy an existing pipeline that publishes to the Hive metastore to a new pipeline that publishes to Unity Catalog. When you call the clone a pipeline request, it: After the clone operation is complete, both the original and new pipelines can run independently. 
This page includes examples of calling the API request directly and through a Python script from a Databricks notebook.",2026-04-07T08:00:00.000Z,how-to,integrations,0.7,True,"Documents a specific Databricks REST API request (clone a pipeline) with concrete usage patterns and examples, including how to transform an HMS-based pipeline into a Unity Catalog pipeline. This is product-specific API/integration behavior beyond generic knowledge.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/ldp/clone-hms-to-uc,Migrate from Hive metastore to Unity Catalog,Create a Unity Catalog pipeline by cloning a Hive metastore pipeline - Azure Databricks,Clone Hive metastore pipelines to Unity Catalog,Learn how to use the clone a pipeline REST API request to create a declarative pipeline configured to publish to Unity Catalog based on an existing pipeline that publishes to the Hive metastore.,"This page describes the clone a pipeline request in the Databricks REST API and how you can use it to copy an existing pipeline that publishes to the Hive metastore to a new pipeline that publishes to Unity Catalog. When you call the clone a pipeline request, it: After the clone operation is complete, both the original and new pipelines can run independently. 
This page includes examples of calling the API request directly and through a Python script from a Databricks notebook.",2026-04-21T08:00:00.000Z,how-to,integrations,0.7,True,"Documents a specific Databricks REST API request (clone a pipeline) with concrete usage patterns and examples, including request/response structure and Python usage; this is product-specific integration/API knowledge beyond generic LLM training.",updated https://learn.microsoft.com/en-us/azure/databricks/ldp/concepts,Concepts,Lakeflow Spark Declarative Pipelines concepts - Azure Databricks,,Learn what Azure Databricks Lakeflow Spark Declarative Pipelines is and the data processing concepts that define it.,"Learn what Lakeflow Spark Declarative Pipelines (SDP) is, the core concepts (such as pipelines, streaming tables, and materialized views) that define it, the relationships between those concepts, and the benefits of using it in your data processing workflows. Note Lakeflow Spark Declarative Pipelines requires the Premium plan. Contact your Databricks account team for more information.",2026-03-04T08:00:00.000Z,concept-article,,0.3,False,"Conceptual overview of Lakeflow Spark Declarative Pipelines, core concepts, and benefits. No clear indication of detailed configuration parameters, limits, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/configure-compute,Classic,Configure classic compute for pipelines - Azure Databricks,Configure classic compute resources for pipelines,Learn how to configure classic compute resources for your Lakeflow Spark Declarative Pipelines.,"This page contains instructions for configuring classic compute for Lakeflow Spark Declarative Pipelines. For a reference of the JSON schema, see the clusters definition in the Pipeline API reference. To create a pipeline that runs on classic compute, users must first have permission to deploy classic compute, either unrestricted creation permission or access to a compute policy. 
Serverless pipelines do not require compute creation permissions. By default, all workspace users can use serverless pip",2026-01-23T08:00:00.000Z,how-to,configuration,0.8,True,"Describes configuring classic compute for pipelines, referencing JSON schema and cluster definitions; includes product-specific configuration parameters and permission requirements.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/configure-pipeline,Overview,Configure Pipelines - Azure Databricks,Configure Lakeflow pipelines with Unity Catalog,Configure Lakeflow Spark Declarative Pipelines using Unity Catalog.,"This page describes the basic configuration for pipelines using the workspace UI. Databricks recommends developing new pipelines using serverless. For configuration instructions for serverless pipelines, see Configure a serverless pipeline. The configuration instructions in this page use Unity Catalog. For instructions for configuring pipelines with legacy Hive metastore, see Use Lakeflow Spark Declarative Pipelines with legacy Hive metastore. This page discusses functionality for the current defa",2026-03-06T08:00:00.000Z,how-to,configuration,0.8,True,"Explicitly about basic pipeline configuration via the workspace UI, including Unity Catalog and serverless vs classic; likely includes specific settings and options.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/convert-to-dab,Convert a pipeline to a bundle,Convert a pipeline into a bundle project - Azure Databricks,,"Learn how to convert existing Lakeflow Spark Declarative Pipelines to bundle projects for easier portability, sharing, and management.","This article shows how to convert an existing pipeline into a Declarative Automation Bundles project. Bundles enable you to define and manage your Azure Databricks data processing configuration in a single, source-controlled YAML file that provides easier maintenance and enables automated deployment to target environments. 
For a tutorial that uses databricks pipelines commands to create a pipelines project, then deploys and runs a pipeline, see Develop pipelines with Declarative Automation Bundles.",2026-03-16T17:36:00.000Z,tutorial,,0.3,False,"Covers converting an existing pipeline into a bundle project; summary suggests a how-to/tutorial style without explicit limits, config parameter tables, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/database-replication,Replicate external database (AUTO CDC),Replicate an external RDBMS table using AUTO CDC - Azure Databricks,Replicate external RDBMS tables with AUTO CDC,Learn how to replicate a table from an external relational database management system (RDBMS) into Azure Databricks using the `AUTO CDC` API in Lakeflow Spark Declarative Pipelines.,This page walks you through how to replicate a table from an external relational database management system (RDBMS) into Azure Databricks using the AUTO CDC API in pipelines. You’ll learn: This pattern is ideal for building slowly changing dimension (SCD) tables or keeping a target table in sync with an external system of record.,2026-01-23T08:00:00.000Z,how-to,integrations,0.7,True,"Describes using the AUTO CDC API to replicate RDBMS tables; likely includes API parameters, configuration patterns, and product-specific behavior for change data capture.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/dbsql/dbsql-for-ldp,Overview,Use pipelines in Databricks SQL - Azure Databricks,,Learn about using Lakeflow Spark Declarative Pipelines in Databricks SQL.,"Lakeflow Spark Declarative Pipelines is the most common way to work with data in pipelines. You can define streaming tables and materialized views with simple query syntax, and Azure Databricks manages the pipelines for you. Pipeline functionality is also available for use outside Lakeflow Spark Declarative Pipelines using Databricks SQL. 
This section teaches you about using pipelines outside Lakeflow Spark Declarative Pipelines, including the following topics.",2026-04-10T21:59:00.000Z,overview,,0.2,False,"High-level description of using pipelines in Databricks SQL and what topics are covered. Appears to be an overview/navigation page without detailed parameters, limits, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ldp/dbsql/materialized,Create materialized views in Databricks SQL,Use materialized views in Databricks SQL - Azure Databricks,Create and manage Databricks SQL materialized views,"Learn how to create, refresh, and query Databricks materialized views in Databricks SQL.",This page describes how to create and refresh materialized views in Databricks SQL to improve performance and reduce the cost of your data processing and analysis workloads.,2026-04-07T08:00:00.000Z,how-to,integrations,0.65,True,"How-to/reference for creating, refreshing, and querying Databricks materialized views in Databricks SQL. 
Likely includes product-specific SQL syntax, options, and operational behaviors that function as API/DSL parameters, which go beyond generic materialized view concepts.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ldp/dbsql/materialized,Create materialized views in Databricks SQL,Use materialized views in Databricks SQL - Azure Databricks,,"Learn how to create, refresh, and query Databricks materialized views in Databricks SQL.",This page describes how to create and refresh materialized views in Databricks SQL to improve performance and reduce the cost of your data processing and analysis workloads.,2026-04-22T08:00:00.000Z,how-to,,0.3,False,"Describes how to create and refresh materialized views; no evidence of limits tables, config parameter references, or troubleshooting/error-code mappings in the summary.",updated https://learn.microsoft.com/en-us/azure/databricks/ldp/dbsql/materialized-configure,Configure materialized views in Databricks SQL,Configure materialized views in Databricks SQL - Azure Databricks,Configure Databricks SQL materialized view options and access,Learn how to configure Databricks materialized views in Databricks SQL.,"This page describes how to configure materialized views in Databricks SQL, including access control on the results. 
Most configuration can be done when you create the materialized view with the CREATE OR REPLACE MATERIALIZED VIEW statement, or after creation, with the ALTER TABLE statement.",2026-03-16T08:00:00.000Z,how-to,configuration,0.8,True,"Explicitly about configuring materialized views, including access control; likely contains specific statement options, parameters, and allowed values (CREATE/ALTER options, access settings), matching configuration criteria.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/dbsql/materialized-monitor,Monitor materialized views in Databricks SQL,Monitor materialized views in Databricks SQL - Azure Databricks,Monitor and manage materialized view refresh data,"Learn how to get information about, manage, and monitor Databricks materialized views in Databricks SQL.",This page describes how to monitor and query refresh data about a materialized view in Databricks SQL.,2026-03-12T18:41:00.000Z,how-to,configuration,0.6,True,Describes how to get information and monitor refresh data; likely includes specific system tables or commands unique to Databricks.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/dbsql/schedule-refreshes,Schedule refreshes in Databricks SQL,Schedule refreshes in Databricks SQL - Azure Databricks,,Learn how to schedule refreshes for materialized views and streaming tables in Databricks SQL.,"You can manually refresh your Databricks SQL created pipeline (either materialized view or streaming table) when you know that the source tables have been updated. However, you can also set a schedule for refreshes, either by time, when source tables update, or via orchestration. This page describes how to create the schedules for your pipelines. 
You can also create notifications and set the performance mode for your scheduled refreshes.",2026-03-11T08:00:00.000Z,how-to,,0.4,False,"Describes how to schedule refreshes; appears to be procedural UI/API steps without limits, config matrices, or detailed troubleshooting.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/ldp/dbsql/streaming,Streaming tables in Databricks SQL,Use streaming tables in Databricks SQL - Azure Databricks,,"Learn how to create, refresh, configure, and monitor Databricks streaming tables in Databricks SQL.","Databricks recommends using streaming tables to ingest data using Databricks SQL. A streaming table is a table registered to Unity Catalog with extra support for streaming or incremental data processing. A pipeline is automatically created for each streaming table. You can use streaming tables for incremental data loading from Kafka and cloud object storage. Note To learn how to use Delta Lake tables as streaming sources and sinks, see Delta table streaming reads and writes.",2026-03-26T08:00:00.000Z,how-to,,0.4,False,"How-to usage of streaming tables in Databricks SQL; no config tables, limits, or product-specific error/diagnostic details.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/ldp/de-agent,Genie Code for pipeline development,Use Genie Code for pipeline development - Azure Databricks,,Learn how to use the Data Engineering Agent to build Lakeflow Spark Declarative Pipelines. The agent can run multi-step workflows from a single prompt.,"Important This feature is in Public Preview. This page introduces Genie Code for pipeline development, an AI data agent available by selecting Agent mode in Genie Code. 
Designed specifically for Lakeflow Spark Declarative Pipelines (SDP) and the Lakeflow Pipelines Editor, it explores data, generates and runs pipeline code, and fixes errors, all from a single prompt.",2026-04-15T08:00:00.000Z,concept-article,,0.3,False,"High-level introduction to Genie Code Data Engineering Agent for Lakeflow Spark Declarative Pipelines. The summary indicates conceptual/feature overview without exposing concrete configuration parameters, error codes, or limits.",updated
+https://learn.microsoft.com/en-us/azure/databricks/ldp/dbsql/streaming,Streaming tables in Databricks SQL,Use streaming tables in Databricks SQL - Azure Databricks,,"Learn how to create, refresh, configure, and monitor Databricks streaming tables in Databricks SQL.","Databricks recommends using streaming tables to ingest data using Databricks SQL. A streaming table is a table registered to Unity Catalog with extra support for streaming or incremental data processing. A pipeline is automatically created for each streaming table. You can use streaming tables for incremental data loading from Kafka and cloud object storage. Note To learn how to use Delta Lake tables as streaming sources and sinks, see Delta table streaming reads and writes.",2026-04-22T17:34:00.000Z,how-to,,0.3,False,"Primarily a how-to/tutorial for creating and using streaming tables; summary does not indicate numeric limits, config tables, error-code mappings, or product-specific decision matrices.",updated
+https://learn.microsoft.com/en-us/azure/databricks/ldp/de-agent,Genie Code for pipeline development,Use Genie Code for pipeline development - Azure Databricks,,Learn how to use the Data Engineering Agent to build Lakeflow Spark Declarative Pipelines. The agent can run multi-step workflows from a single prompt.,"Important This feature is in Public Preview. This page introduces Genie Code for pipeline development, an AI data agent available by selecting Agent mode in Genie Code. 
Designed specifically for Lakeflow Spark Declarative Pipelines (SDP) and the Lakeflow Pipelines Editor, it explores data, generates and runs pipeline code, and fixes errors, all from a single prompt.",2026-04-15T08:00:00.000Z,concept-article,,0.3,False,"High-level introduction to Genie Code Data Engineering Agent for Lakeflow Spark Declarative Pipelines. The summary indicates conceptual/feature overview without exposing concrete configuration parameters, error codes, or limits.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/develop,Overview,Develop Lakeflow Spark Declarative Pipelines - Azure Databricks,Apply development best practices to Lakeflow pipelines,"Learn the steps and recommendations for Lakeflow Spark Declarative Pipelines development and testing in either an Azure Databricks notebook, the Azure Databricks file editor, or locally using an integr","Developing and testing pipeline code differs from other Apache Spark workloads. This article provides an overview of supported functionality, best practices, and considerations when developing pipeline code. For more recommendations and best practices, see Applying software development & DevOps best practices to pipelines. Note You must add source code to a pipeline configuration to validate code or run an update. 
See Configure Pipelines.",2026-01-23T08:00:00.000Z,overview,best-practices,0.7,True,Explicitly states it provides best practices and considerations for developing and testing pipeline code; these are product-specific recommendations for Lakeflow Spark Declarative Pipelines.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/develop-locally,Local development,Develop pipeline code in your local development environment - Azure Databricks,,Learn how to develop Lakeflow Spark Declarative Pipelines code in your local development environment and then upload the pipeline code to your Azure Databricks workspace.,"You can author Python pipeline source code in your preferred integrated development environment (IDE). You cannot validate or run updates on pipeline code written in an IDE. You must deploy source code files back to an Azure Databricks workspace and configure them as part of a pipeline. This article provides an overview of support for local IDE development. For more interactive development and testing, Databricks recommends using the Lakeflow Pipelines Editor. See Develop and debug ETL pipelines w",2026-03-16T17:36:00.000Z,concept-article,,0.2,False,"Overview of local development workflow; appears procedural without detailed configuration tables, limits, or troubleshooting mappings.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/,Overview,Pipeline developer reference - Azure Databricks,,Learn about using Python and SQL to implement Lakeflow Spark Declarative Pipelines.,"This section contains reference and instructions for pipeline developers. Data loading and transformations are implemented in pipelines by queries that define streaming tables and materialized views. To implement these queries, Lakeflow Spark Declarative Pipelines supports SQL and Python interfaces. 
Because these interfaces provide equivalent functionality for most data processing use cases, pipeline developers can choose the interface that they are most comfortable with.",2026-04-13T21:03:00.000Z,overview,,0.2,False,"The description and summary indicate a high-level developer reference overview explaining that pipelines can be implemented with SQL or Python and that they are equivalent for most use cases. There is no evidence of specific configuration parameters, limits, error codes, or decision matrices. It appears conceptual rather than containing expert, product-specific details.",updated +https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/,Overview,Pipeline developer reference - Azure Databricks,,Learn about using Python and SQL to implement Lakeflow Spark Declarative Pipelines.,"This section contains reference and instructions for pipeline developers. Data loading and transformations are implemented in pipelines by queries that define streaming tables and materialized views. To implement these queries, Lakeflow Spark Declarative Pipelines supports SQL and Python interfaces. Because these interfaces provide equivalent functionality for most data processing use cases, pipeline developers can choose the interface that they are most comfortable with.",2026-04-13T21:03:00.000Z,overview,,0.2,False,"The description and summary indicate a high-level developer reference overview explaining that pipelines can be implemented with SQL or Python and that they are equivalent for most use cases. There is no evidence of specific configuration parameters, limits, error codes, or decision matrices. 
It appears conceptual rather than containing expert, product-specific details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/definition-function,Dataset definition functions,Functions to define datasets - Azure Databricks,Define Lakeflow pipeline datasets with Python decorators,Learn about writing Python functions to define datasets in combination with decorators in Lakeflow Spark Declarative Pipelines.,"The pyspark.pipelines (here aliased as dp) module implements much of its core functionality using decorators. These decorators accept a function that defines either a streaming or batch query and returns an Apache Spark DataFrame. The following syntax shows a simple example for defining a pipeline dataset: This page provides an overview of the functions and queries that define datasets in pipelines. For a complete list of available decorators, see Pipeline developer reference. The functions you us",2026-01-23T08:00:00.000Z,reference,integrations,0.8,True,Documents functions and decorators to define datasets; includes specific function signatures and usage patterns unique to this product’s Python API.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/dlt-meta,Create pipelines with dlt-meta,Create pipelines with dlt-meta - Azure Databricks,Generate Lakeflow pipelines from metadata with dlt-meta,Learn about using dlt-meta to automate Lakeflow Spark Declarative Pipelines by using JSON metadata to specify many tables at once.,"This article introduces dlt-meta, a Databricks Labs project that provides tools to generate pipelines from metadata that you maintain. Note The open source dlt-meta project, like all projects in the databrickslabs GitHub account, exists for exploration purposes only. Azure Databricks does not support it or provide service-level agreements (SLAs) for it. Do not submit Azure Databricks support tickets for issues related to this project. 
Instead, file a GitHub issue, which will be reviewed as time perm",2026-01-23T08:00:00.000Z,overview,configuration,0.65,True,Describes using dlt-meta to generate pipelines from JSON metadata; likely includes metadata schema/fields and configuration patterns unique to this tool and product.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/external-dependencies,Manage Python dependencies for pipelines,Manage Python dependencies for pipelines - Azure Databricks,Manage Python dependencies safely in Databricks pipelines,Learn how to use external dependencies such as Python libraries in your Lakeflow Spark Declarative Pipelines.,"Lakeflow Spark Declarative Pipelines supports external dependencies in your pipelines. Databricks recommends using one of two patterns to install Python packages: Pipelines also support using global and cluster-scoped init scripts. However, these external dependencies, particularly init scripts, increase the risk of issues with runtime upgrades. To mitigate these risks, minimize using init scripts in your pipelines. If your processing requires init scripts, automate testing of your pipeline to de",2026-01-23T08:00:00.000Z,how-to,best-practices,0.75,True,"Gives concrete recommendations on using specific patterns (e.g., avoiding init scripts, automating tests) due to runtime upgrade risks; these are product-specific operational best practices.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-metaprogramming,Create multiple flows with metaprogramming,Tutorial: Create multiple flows with different parameters - Azure Databricks,,Learn how to use metaprogramming patterns in Lakeflow Spark Declarative Pipelines to generate multiple pipeline flows from parameterized Python inner functions.,"A pipeline may contain multiple flows that are almost identical, differing only by a few parameters. Defining these flows explicitly is error-prone, redundant, and difficult to maintain. 
Metaprogramming with Python inner functions generates repetitive flows dynamically, with each invocation supplying a different set of parameters.",2026-04-13T21:03:00.000Z,concept-article,,0.3,False,"Tutorial on metaprogramming patterns for Lakeflow Spark Declarative Pipelines; primarily shows a coding pattern with inner functions and parameterization. No configuration tables, limits, error-code mappings, or product-specific settings with values that qualify as expert knowledge per the defined categories.",new
-https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-python-ref-append-flow,append_flow,append_flow - Azure Databricks,Use append_flow decorator for pipeline append-only flows,Learn how to use the append\_flow syntax in Lakeflow Spark Declarative Pipelines with Python to add an append-only flow to a sink.,The @dp.append_flow decorator creates append flows or backfills for your pipeline tables. The function must return an Apache Spark streaming DataFrame. See Load and process data incrementally with Lakeflow Spark Declarative Pipelines flows. Append flows can target streaming tables or sinks.,2026-01-24T08:00:00.000Z,reference,integrations,0.9,True,Reference for @dp.append_flow with required streaming DataFrame semantics; includes function/decorator parameters and constraints specific to Lakeflow pipelines.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-metaprogramming,Create multiple flows with metaprogramming,Tutorial: Create multiple flows with different parameters - Azure Databricks,,Learn how to use metaprogramming patterns in Lakeflow Spark Declarative Pipelines to generate multiple pipeline flows from parameterized Python inner functions.,"A pipeline may contain multiple flows that are almost identical, differing only by a few parameters. Defining these flows explicitly is error-prone, redundant, and difficult to maintain. 
Metaprogramming with Python inner functions generates repetitive flows dynamically, with each invocation supplying a different set of parameters.",2026-04-13T21:03:00.000Z,concept-article,,0.3,False,"Tutorial on metaprogramming patterns for Lakeflow Spark Declarative Pipelines; primarily shows a coding pattern with inner functions and parameterization. No configuration tables, limits, error-code mappings, or product-specific settings with values that qualify as expert knowledge per the defined categories.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-python-ref-append-flow,append_flow,append_flow - Azure Databricks,Use append_flow decorator in Databricks pipelines,Learn how to use the append\_flow syntax in Lakeflow Spark Declarative Pipelines with Python to add an append-only flow to a sink.,The @dp.append_flow decorator creates append flows or backfills for your pipeline tables. The function must return an Apache Spark streaming DataFrame. See Load and process data incrementally with Lakeflow Spark Declarative Pipelines flows. Append flows can target streaming tables or sinks.,2026-04-21T08:00:00.000Z,reference,integrations,0.7,True,Documents the @dp.append_flow Python decorator with product-specific syntax and behavior for streaming DataFrames and append flows; this is concrete API usage/configuration knowledge.,updated https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-python-ref-apply-changes,create_auto_cdc_flow,create_auto_cdc_flow - Azure Databricks,Use create_auto_cdc_flow for pipeline CDC processing,Learn how to use the create\_auto\_cdc\_flow syntax in Lakeflow Spark Declarative Pipelines with Python to process CDC data.,The create_auto_cdc_flow() function creates a flow that uses Lakeflow Spark Declarative Pipelines change data capture (CDC) functionality to process source data from a change data feed (CDF). Note This function replaces the previous function apply_changes(). 
The two functions have the same signature. Databricks recommends updating to use the new name. Important You must declare a target streaming table to apply changes into. You can optionally specify the schema for your target table. When specifyi,2026-03-19T18:35:00.000Z,reference,integrations,0.9,True,"Details create_auto_cdc_flow signature, required target streaming table, and CDC behavior; these are product-specific API semantics and constraints.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-python-ref-apply-changes-from-snapshot,create_auto_cdc_from_snapshot_flow,create_auto_cdc_from_snapshot_flow - Azure Databricks,Use create_auto_cdc_from_snapshot_flow for snapshot CDC,Learn how to use the create\_auto\_cdc\_from\_snapshot\_flow syntax in Lakeflow Spark Declarative Pipelines with Python to create a flow that processes CDC data from database snapshots to a streaming,The create_auto_cdc_from_snapshot_flow function creates a flow that uses Lakeflow Spark Declarative Pipelines change data capture (CDC) functionality to process source data from database snapshots. See How AUTO CDC FROM SNAPSHOT works. Note This function replaces the previous function apply_changes_from_snapshot(). The two functions have the same signature. Databricks recommends updating to use the new name. Important You must have a target streaming table for this operation. 
To create the required ,2026-02-03T08:00:00.000Z,reference,integrations,0.9,True,Documents create_auto_cdc_from_snapshot_flow behavior and requirements; includes API parameters and usage patterns unique to Lakeflow CDC from snapshots.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-python-ref-expectations,expect,Expectations - Azure Databricks,Apply data quality expectations in Lakeflow Python pipelines,Learn how to use expectations to add data quality constraints to datasets in Lakeflow Spark Declarative Pipelines.,"This page contains Python reference documentation for pipeline expectations. Expectation decorators declare data quality constraints on materialized views, streaming tables, or temporary views created in a pipeline. The dp module includes six decorators to control expectations behavior. The following table describes the dimensions on which these permutations differ: You can add multiple expectation decorators to your datasets, providing flexibility in strictness for your data quality constraints. ",2026-01-23T08:00:00.000Z,reference,integrations,0.85,True,"Expectation decorators with six variants and behavioral differences; includes decorator names and semantics that are unique, i.e., detailed API usage.",unchanged
@@ -1454,47 +1476,47 @@ https://learn.microsoft.com/en-us/azure/databricks/ldp/event-hooks,Custom monito
https://learn.microsoft.com/en-us/azure/databricks/ldp/event-hubs,Use Azure Event Hubs as a pipelines data source,Use Azure Event Hubs as a pipeline data source - Azure Databricks,Use Azure Event Hubs as a Lakeflow data source,This article explains how to use Lakeflow Spark Declarative Pipelines to process messages from Azure Event Hubs.,"This article explains how to process messages from Azure Event Hubs in a pipeline. 
You cannot use the Structured Streaming Event Hubs connector because this library is not available as part of Databricks Runtime, and Lakeflow Spark Declarative Pipelines does not allow you to use third-party JVM libraries.",2026-01-23T08:00:00.000Z,how-to,integrations,0.75,True,Explains processing Event Hubs messages in pipelines and notes the Structured Streaming connector is unavailable; implies specific connection patterns and constraints unique to Databricks/Lakeflow.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/expectation-patterns,Advanced expectation patterns,Expectation recommendations and advanced patterns - Azure Databricks,Implement advanced expectation patterns at scale,Learn how to manage data quality with Azure Databricks Lakeflow Spark Declarative Pipelines expectations.,"This article contains recommendations for implementing expectations at scale and examples of advanced patterns supported by expectations. These patterns use multiple datasets in conjunction with expectations and require that users understand the syntax and semantics of materialized views, streaming tables, and expectations. For a basic overview of expectations behavior and syntax, see Manage data quality with pipeline expectations.",2026-01-23T08:00:00.000Z,how-to,best-practices,0.8,True,Explicitly provides recommendations and advanced patterns for expectations using multiple datasets; this is product-specific best-practice guidance for Lakeflow expectations.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/expectations,Data quality,Manage data quality with pipeline expectations - Azure Databricks,Configure pipeline expectations for data quality,Learn how to manage data quality with Azure Databricks Lakeflow Spark Declarative Pipelines expectations.,"Use expectations to apply quality constraints that validate data as it flows through ETL pipelines. 
Expectations provide greater insight into data quality metrics and allow you to fail updates or drop records when detecting invalid records. This article has an overview of expectations, including syntax examples and behavior options. For more advanced use cases and recommended best practices, see Expectation recommendations and advanced patterns.",2026-02-04T08:00:00.000Z,how-to,configuration,0.7,True,"Covers expectations syntax, behavior options, and how to fail updates or drop records; these are product-specific configuration options for data quality enforcement.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/ldp/fix-high-init,Troubleshoot high initialization times,Fixing high initialization times in pipelines - Azure Databricks,Reduce Lakeflow pipeline initialization latency,"Learn how to identify the cause of initialization latency in Lakeflow Spark Declarative Pipelines, and then how to split tables across pipelines.","Pipelines can contain many datasets with many flows to keep them up to date. Pipelines automatically manage updates and clusters to update efficiently. However, there is some overhead with managing large numbers of flows, and at times, this can lead to larger than expected initialization or even management overhead during processing. 
If you are running into delays waiting on triggered pipelines to initialize, such as initialization times over five minutes, consider splitting processing into seve",2026-01-23T08:00:00.000Z,troubleshooting,best-practices,0.8,True,"Discusses causes of high initialization times and recommends splitting tables across pipelines with thresholds (e.g., over five minutes); these are product-specific performance best practices.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ldp/flow-examples,Flow examples,Examples of flows in Lakeflow Spark Declarative Pipelines - Azure Databricks,Example configurations of flows in Lakeflow Spark Declarative Pipelines,Find examples of using Lakeflow Spark Declarative Pipelines flows in Databricks.,,2026-02-19T19:35:00.000Z,sample,configuration,0.65,True,"Provides concrete examples of flow definitions and options, which are detailed configuration patterns for this product.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ldp/flows,Overview,Load and process data incrementally with Lakeflow Spark Declarative Pipelines flows - Azure Databricks,Configure and use flows in Lakeflow pipelines,A Lakeflow Spark Declarative Pipelines flow is a query that loads and processes data incrementally. Learn how to use flows to load and transform data to create new datasets for persistence to target D,"Data is processed in pipelines throughflows. Each flow consists of aqueryand, typically, atarget. The flow processes the query, either as a batch, or incrementally as a stream of data into the target. A flow lives within a pipeline in Lakeflow Spark Declarative Pipelines. 
Typically, flows are defined automatically when you create a query in a pipeline that updates a target, but you can also explicitly define additional flows for more complex processing, such as appending to a single target from ",2026-01-23T08:00:00.000Z,concept-article,configuration,0.65,True,Explains how flows work and how to define them for incremental/batch processing; likely includes flow configuration options and semantics specific to Lakeflow pipelines.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/ldp/flows-backfill,Backfill historical data,Backfilling historical data with pipelines - Azure Databricks,Backfill historical data with Lakeflow pipelines,Creating flows that backfill historical data in Lakeflow Spark Declarative Pipelines.,"In data engineering,backfillingrefers to the process of retroactively processing historical data through a data pipeline that was designed for processing current or streaming data. Typically, this is a separate flow sending data into your existing tables. The following illustration shows a backfill flow sending historical data to the bronze tables in your pipeline. Some scenarios that might require a backfill: A backfill in Lakeflow Spark Declarative Pipelines is supported with a specialized app",2026-01-23T08:00:00.000Z,how-to,best-practices,0.65,True,"Focuses on creating specialized backfill flows; likely includes recommended patterns, caveats, and product-specific approaches for backfilling in Lakeflow pipelines.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ldp/for-each-batch,ForEachBatch sink,Use ForEachBatch to write to arbitrary data sinks in pipelines - Azure Databricks,Use ForEachBatch sink in Lakeflow streaming pipelines,"Use `foreachBatch` in Lakeflow Spark Declarative Pipelines to take arbitrary actions on streaming data, including transformation and writing to one or more data sinks, on Azure Databricks.","Important Theforeach_batch_sinkAPI is inPublic Preview. 
The ForEachBatch sink allows you to process a stream as a series of micro-batches. Each batch can be processed in Python with custom logic similar to Apache Spark Structured Streaming'sforeachBatch. With the Lakeflow Spark Declarative Pipelines (SDP) ForEachBatch sink, you can transform, merge, or write streaming data to one or more targets that do not natively support streaming writes. This page walks you through setting up a ForEachBatch ",2026-04-16T08:00:00.000Z,how-to,integrations,0.65,True,"Explains the ForEachBatch sink API for Lakeflow Spark Declarative Pipelines, including its relation to Structured Streaming foreachBatch and how to write to arbitrary sinks that don’t support streaming. This is a product-specific streaming integration/coding pattern rather than generic streaming guidance.",updated +https://learn.microsoft.com/en-us/azure/databricks/ldp/fix-high-init,Troubleshoot high initialization times,Fixing high initialization times in pipelines - Azure Databricks,Reduce initialization latency in Databricks pipelines,"Learn how to identify the cause of initialization latency in Lakeflow Spark Declarative Pipelines, and then how to split tables across pipelines.","Pipelines can contain many datasets with many flows to keep them up to date. Pipelines automatically manage updates and clusters to update efficiently. However, there is some overhead with managing large numbers of flows, and at times, this can lead to larger than expected initialization or even management overhead during processing. 
If you are running into delays waiting on triggered pipelines to initialize, such as initialization times over five minutes, consider splitting processing into seve",2026-04-21T08:00:00.000Z,troubleshooting,best-practices,0.7,True,"Provides concrete operational guidance for high initialization times (for example, thresholds like initialization over five minutes and splitting tables across pipelines); this is product-specific performance tuning advice.",updated
+https://learn.microsoft.com/en-us/azure/databricks/ldp/flow-examples,Flow examples,Examples of flows in Lakeflow Spark Declarative Pipelines - Azure Databricks,,Find examples of using Lakeflow Spark Declarative Pipelines flows in Databricks.,,2026-04-21T08:00:00.000Z,sample,,0.2,False,"Examples of flows likely show patterns but summary does not indicate product-specific configuration tables, limits, or troubleshooting mappings.",updated
+https://learn.microsoft.com/en-us/azure/databricks/ldp/flows,Overview,Load and process data incrementally with Lakeflow Spark Declarative Pipelines flows - Azure Databricks,,A Lakeflow Spark Declarative Pipelines flow is a query that loads and processes data incrementally. Learn how to use flows to load and transform data to create new datasets for persistence to target D,"Data is processed in pipelines through flows. Each flow consists of a query and, typically, a target. The flow processes the query, either as a batch, or incrementally as a stream of data into the target. A flow lives within a pipeline in Lakeflow Spark Declarative Pipelines. 
Typically, flows are defined automatically when you create a query in a pipeline that updates a target, but you can also explicitly define additional flows for more complex processing, such as appending to a single target from ",2026-04-21T08:00:00.000Z,concept-article,,0.3,False,"Conceptual and how-to description of flows and incremental processing; summary does not indicate numeric limits, decision matrices, or detailed configuration parameters.",updated
+https://learn.microsoft.com/en-us/azure/databricks/ldp/flows-backfill,Backfill historical data,Backfilling historical data with pipelines - Azure Databricks,,Creating flows that backfill historical data in Lakeflow Spark Declarative Pipelines.,"In data engineering, backfilling refers to the process of retroactively processing historical data through a data pipeline that was designed for processing current or streaming data. Typically, this is a separate flow sending data into your existing tables. The following illustration shows a backfill flow sending historical data to the bronze tables in your pipeline. Some scenarios that might require a backfill: A backfill in Lakeflow Spark Declarative Pipelines is supported with a specialized app",2026-04-21T08:00:00.000Z,how-to,,0.4,False,"Backfill concept and scenario description; summary does not show concrete limits, configuration matrices, or error-based troubleshooting content.",updated
+https://learn.microsoft.com/en-us/azure/databricks/ldp/for-each-batch,ForEachBatch sink,Use ForEachBatch to write to arbitrary data sinks in pipelines - Azure Databricks,Use ForEachBatch sink in Lakeflow streaming pipelines,"Use `foreachBatch` in Lakeflow Spark Declarative Pipelines to take arbitrary actions on streaming data, including transformation and writing to one or more data sinks, on Azure Databricks.","Important The foreach_batch_sink API is in Public Preview. The ForEachBatch sink allows you to process a stream as a series of micro-batches. 
Each batch can be processed in Python with custom logic similar to Apache Spark Structured Streaming'sforeachBatch. With the Lakeflow Spark Declarative Pipelines (SDP) ForEachBatch sink, you can transform, merge, or write streaming data to one or more targets that do not natively support streaming writes. This page walks you through setting up a ForEachBatch ",2026-04-16T08:00:00.000Z,how-to,integrations,0.65,True,"Explains the ForEachBatch sink API for Lakeflow Spark Declarative Pipelines, including its relation to Structured Streaming foreachBatch and how to write to arbitrary sinks that don’t support streaming. This is a product-specific streaming integration/coding pattern rather than generic streaming guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/from-json-schema-evolution,from_json schema inference and evolution,Infer and evolve the schema using from_json in pipelines - Azure Databricks,Configure JSON schema inference and evolution with from_json,Learn about schema inference and evolution options supported by the `from_json` SQL function with Lakeflow Spark Declarative Pipelines.,Important This feature is in Public Preview. 
This article describes how to infer and evolve the schema of JSON blobs with the from_json SQL function in Lakeflow Spark Declarative Pipelines.,2026-01-23T08:00:00.000Z,concept-article,configuration,0.7,True,"Describes schema inference and evolution options for from_json in pipelines; likely includes function options, flags, and behaviors unique to this environment.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/full-refresh-st,Full refresh for streaming tables,Full refresh for streaming tables - Azure Databricks,Run full refresh operations for Databricks streaming tables safely,Learn when and how to perform a full refresh for streaming tables in Lakeflow Spark Declarative Pipelines.,"A full refresh of a streaming table discards all existing data and metadata and restarts the stream from the beginning. Specifically, it truncates the streaming table, removes all checkpoint data, and restarts the streaming process with new checkpoints for every flow writing to the table. This page describes when you might be required to run a full refresh, and the impact of running a full refresh. It also includes best practices around full refreshes. For guidance on how to trigger a full refre",2026-04-06T21:50:00.000Z,concept-article,best-practices,0.75,True,"Explains when a full refresh is required, its impact on checkpoints and metadata, and includes best practices specific to Databricks streaming tables and Lakeflow pipelines. 
This is actionable, product-specific guidance (when/when not to run, implications on flows) rather than generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/hive-metastore,Hive metastore (legacy),Use Lakeflow Spark Declarative Pipelines with legacy Hive metastore - Azure Databricks,Configure pipelines with legacy Hive metastore,Learn how to persist tables from your Lakeflow Spark Declarative Pipelines and query the tables from external applications.,"This article details configurations and caveats specific to Lakeflow Spark Declarative Pipelines configured to publish data to the legacy Hive metastore. Databricks recommends using Unity Catalog for all new pipelines. SeeUse Unity Catalog with pipelines. Note This article discusses functionality for the current default publishing mode for pipelines. Pipelines created before February 5, 2025, might use the legacy publishing mode andLIVEvirtual schema. SeeLIVE schema (legacy).",2026-03-30T08:00:00.000Z,how-to,configuration,0.7,True,Details configurations and caveats for publishing to legacy Hive metastore; includes product-specific settings and behavioral differences vs Unity Catalog.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/import-workspace-files,Import Python modules from Git folders or workspace files,Import Python modules from Git folders or workspace files - Azure Databricks,Import Python modules from Git or workspace into pipelines,This article has guidance on importing Python modules and packages from Git folders or workspace files into Lakeflow Spark Declarative Pipelines.,"You can store Python code inDatabricks Git foldersor inworkspace filesand then import that Python code into your pipeline. For more information about working with modules in Git folders or workspace files, seeWork with Python and R modules. 
To import a Python file, you have multiple options:",2026-01-23T08:00:00.000Z,how-to,integrations,0.7,True,Guidance on importing Python modules from Databricks Git folders or workspace files into pipelines; involves product-specific module path/import patterns.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/ldp-sinks,Use sinks in pipelines,Using sinks in pipelines - Azure Databricks,Configure and use Lakeflow sink API with flows,Learn about Lakeflow Spark Declarative Pipelines sinks and the process for configuring and using them with your Lakeflow Spark Declarative Pipelines.,"Important The sink API is in Public Preview. This page describes the Lakeflow Spark Declarative Pipelines sink API and how to use it with flows to write records transformed by a pipeline to an external data sink. External data sinks include Unity Catalog managed and external tables, and event streaming services such as Apache Kafka or Azure Event Hubs. You can also use data sinks to write to custom data sources by writing Python code for that data source. Note",2026-03-11T08:00:00.000Z,concept-article,configuration,0.75,True,"Explicitly about the sink API and how to use it with flows; likely includes parameter names, allowed values, and configuration patterns for external sinks like Kafka and Event Hubs.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/ldp/limitations,Limitations,Pipeline Limitations - Azure Databricks,Understand Lakeflow Spark Declarative Pipeline limits and quotas,Learn about the limitations of Lakeflow Spark Declarative Pipelines in Azure Databricks.,"The following are limitations of Lakeflow Spark Declarative Pipelines that are important to know as you develop your pipelines: An Azure Databricks workspace is limited to 1000 concurrent pipeline updates. The number of datasets that a single pipeline can contain is determined by the pipeline configuration and workload complexity. 
The configuration of a pipeline includes references to source files and folders. If the configuration referencesonlyindividual notebooks or files, the limit per pipeli",2026-03-10T08:00:00.000Z,concept-article,limits-quotas,0.95,True,"Explicitly documents numeric limits (for example 1000 concurrent pipeline updates, per-pipeline dataset limits based on config); matches limits-quotas criteria.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/ldp/limitations,Limitations,Pipeline Limitations - Azure Databricks,Lakeflow Spark pipeline limits in Azure Databricks,Learn about the limitations of Lakeflow Spark Declarative Pipelines in Azure Databricks.,"The following are limitations of Lakeflow Spark Declarative Pipelines that are important to know as you develop your pipelines: An Azure Databricks workspace is limited to 1000 concurrent pipeline updates. The number of datasets that a single pipeline can contain is determined by the pipeline configuration and workload complexity. The configuration of a pipeline includes references to source files and folders. If the configuration references only individual notebooks or files, the limit per pipeli",2026-04-21T08:00:00.000Z,concept-article,limits-quotas,0.95,True,"Explicitly documents concrete numeric limits (for example, 1000 concurrent pipeline updates and per-pipeline dataset limits based on configuration). This matches the limits-quotas criteria with product-specific constraints that are unlikely to be known from training.",updated https://learn.microsoft.com/en-us/azure/databricks/ldp/live-schema,Overview,LIVE schema (legacy) - Azure Databricks,Understand and migrate from legacy LIVE schema,Learn about the legacy LIVE schema and how it functions in legacy publishing mode for Lakeflow Spark Declarative Pipelines.,This article provides an overview of the legacy syntax and behavior for the LIVE virtual schema. 
The LIVE virtual schema is a legacy feature of Lakeflow Spark Declarative Pipelines and is considered deprecated. You can still use legacy publishing mode and the LIVE virtual schema for pipelines that were created with this mode. Databricks recommends migrating all pipelines to the new publishing mode. You have two choices for migration: Both of these methods are one-way migrations. You can't migrate tabl,2026-01-23T08:00:00.000Z,concept-article,configuration,0.65,True,Explains legacy LIVE virtual schema behavior and migration options; includes product-specific schema semantics and migration configuration details.,unchanged
-https://learn.microsoft.com/en-us/azure/databricks/ldp/load,Load data using SDP,Load data in pipelines - Azure Databricks,Load data into Lakeflow pipelines from supported sources,This article describes how to load data to Azure Databricks with Lakeflow Spark Declarative Pipelines.,"You can load data from any data source supported by Apache Spark on Azure Databricks using pipelines. You can define datasets (tables and views) in Lakeflow Spark Declarative Pipelines against any query that returns a Spark DataFrame, including streaming DataFrames and Pandas for Spark DataFrames. For data ingestion tasks, Databricks recommends using streaming tables for most use cases. 
Streaming tables are good for ingesting data from cloud object storage using Auto Loader or from message buses",2026-03-05T08:00:00.000Z,how-to,configuration,0.7,True,Explains how to load data from any Spark-supported source and define datasets; likely includes configuration of streaming tables and ingestion patterns specific to Lakeflow.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/ldp/load,Load data using SDP,Load data in pipelines - Azure Databricks,,"Learn how to load data into pipelines from cloud object storage, message buses, databases, and other data sources on Azure Databricks.","You can load data from any data source supported by Apache Spark on Azure Databricks using pipelines. You can define datasets—tables and views—in Lakeflow Spark Declarative Pipelines against any query that returns a Spark DataFrame, including streaming DataFrames and Pandas for Spark DataFrames. For data ingestion tasks, Databricks recommends using streaming tables for most use cases. Streaming tables are useful for ingesting data from cloud object storage using Auto Loader or from message buses",2026-04-21T18:07:00.000Z,how-to,,0.3,False,"Describes how to load data into pipelines and recommends streaming tables, but summary does not indicate detailed configuration options, limits, or decision matrices with quantified trade-offs.",updated https://learn.microsoft.com/en-us/azure/databricks/ldp/materialized-views,Overview,Materialized views - Azure Databricks,,Learn about materialized views in Lakeflow Spark Declarative Pipelines.,"Like standard views, materialized views are the results of a query and you access them the same way you would a table. Unlike standard views, which recompute results on every query, materialized views cache the results and refresh them on a specified interval. Because a materialized view is precomputed, queries against it can run much faster than against regular views. A materialized view is a declarative pipeline object. 
It includes a query that defines it, a flow to update it, and the cached res",2026-03-11T22:00:00.000Z,concept-article,,0.4,False,"Explains materialized views conceptually in pipelines; summary does not indicate detailed config tables, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/migrate-to-dpm,Migrate legacy publishing pipelines,Enable the default publishing mode in a pipeline - Azure Databricks,Migrate Databricks pipelines to the default publishing mode,"Learn how to enable the default publishing mode for Lakeflow Spark Declarative Pipelines that currently use the legacy publishing mode, or the `LIVE` virtual schema.","This article describes how to migrate pipelines that use the LIVE virtual schema (the legacy publishing mode) to the default publishing mode. The default publishing mode allows a single pipeline to write to multiple catalogs and schemas, and includes a simplified syntax for working with tables and views within the pipeline. The legacy publishing mode is considered deprecated, and Databricks recommends migrating all pipelines to the default publishing mode. Migration affects the metadata of the pip",2026-04-07T08:00:00.000Z,upgrade-and-migration-article,configuration,0.7,True,"Describes how to enable the default publishing mode and migrate from the legacy LIVE virtual schema, affecting pipeline metadata and behavior. 
Likely includes specific configuration steps, settings, and schema behavior unique to Lakeflow Spark Declarative Pipelines, fitting configuration-focused expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/monitor-event-log-schema,Event log schema,Pipeline event log schema - Azure Databricks,Use Lakeflow pipeline event log schema for auditing,Learn about the JSON schema of events in the Lakeflow Spark Declarative Pipelines event log.,"The pipeline event log contains all information related to a pipeline, including audit logs, data quality checks, pipeline progress, and data lineage. The following tables describe the event log schema. Some of these fields contain JSON data that require parsing to perform some queries, such as the details field. Azure Databricks supports the : operator to parse JSON fields. See : (colon sign) operator. Note Some fields in the event log are for internal use by Azure Databricks. The following documenta",2026-03-06T08:00:00.000Z,reference,configuration,0.9,True,Describes JSON schema of pipeline event log with field tables; this is detailed schema/configuration information unique to the product.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/monitor-event-logs,Event log,Pipeline event log - Azure Databricks,Query and use the Lakeflow pipeline event log,Learn about the Lakeflow Spark Declarative Pipelines event log and its schema.,"The pipeline event log contains all information related to a pipeline, including audit logs, data quality checks, pipeline progress, and data lineage. You can use the event log to track, understand, and monitor the state of your data pipelines. You can view event log entries in the pipeline monitoring user interface, the Pipelines REST API, or by directly querying the event log. This section focuses on querying the event log directly. 
You can also define custom actions to run when events are logge",2026-03-11T08:00:00.000Z,concept-article,configuration,0.8,True,"Explains event log usage and access via UI, REST API, and SQL; includes schema usage and likely field semantics for monitoring and custom actions.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/monitoring-ui,Monitor in the UI,Monitor pipelines in the UI - Azure Databricks,,Learn how to monitor Lakeflow Spark Declarative Pipelines through the Azure Databricks user interface.,This section describes using built-in monitoring and observability features for Lakeflow Spark Declarative Pipelines in the Azure Databricks user interface. These features support tasks such as:,2026-02-10T08:00:00.000Z,how-to,,0.45,False,Focuses on using the UI for monitoring; mostly operational UI steps without strong indication of detailed error-code mappings or config tables.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/move-table,Move tables between pipelines,Move tables between pipelines - Azure Databricks,,Learn how to move a streaming table or materialized view from one pipeline to another.,"This article describes how to move streaming tables and materialized views between pipelines. After the move, the pipeline you move the flow to updates the table, rather than the original pipeline. 
This is useful in many scenarios, including:",2026-02-03T08:00:00.000Z,how-to,,0.3,False,Describes how to move tables between pipelines; appears to be a scenario/how-to article without deep configuration references or numeric constraints.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/ldp/multi-file-editor,Lakeflow Pipelines Editor,Develop and debug ETL pipelines with the Lakeflow Pipelines Editor - Azure Databricks,Use Lakeflow Pipelines Editor for ETL development,"Learn how to use the Lakeflow Pipelines Editor to develop and debug ETL (extract, transform, and load) pipelines in Lakeflow Spark Declarative Pipelines.","Important This feature is inPublic Preview. This article describes using the Lakeflow Pipelines Editor to develop and debug ETL (extract, transform, and load) pipelines in Lakeflow Spark Declarative Pipelines (SDP). Note The Lakeflow Pipelines Editor is enabled by default. You can turn it off, or re-enable it if it has been turned off. SeeEnable the Lakeflow Pipelines Editor and updated monitoring.",2026-02-25T08:00:00.000Z,how-to,configuration,0.65,True,Describes how to develop and debug pipelines with the Lakeflow Pipelines Editor; likely includes editor-specific options and pipeline configuration behaviors unique to this product.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/ldp/multi-file-editor,Lakeflow Pipelines Editor,Develop and debug ETL pipelines with the Lakeflow Pipelines Editor - Azure Databricks,,"Learn how to use the Lakeflow Pipelines Editor to develop and debug ETL (extract, transform, and load) pipelines in Lakeflow Spark Declarative Pipelines.","Important This feature is inPublic Preview. This article describes using the Lakeflow Pipelines Editor to develop and debug ETL (extract, transform, and load) pipelines in Lakeflow Spark Declarative Pipelines (SDP). Note The Lakeflow Pipelines Editor is enabled by default. You can turn it off, or re-enable it if it has been turned off. 
See Enable the Lakeflow Pipelines Editor and updated monitoring.",2026-04-21T08:00:00.000Z,how-to,,0.3,False,"Feature usage and UI-focused guidance for Lakeflow Pipelines Editor; no detailed configuration tables, limits, error codes, or product-specific best-practice gotchas evident from summary.",updated https://learn.microsoft.com/en-us/azure/databricks/ldp/notebook-devex,Notebook development,Develop and debug pipelines with a notebook (legacy) - Azure Databricks,Develop and debug Databricks pipelines using legacy notebooks,Learn how to use the legacy notebook editing experience in Lakeflow Spark Declarative Pipelines to develop and debug ETL pipelines.,"Important This feature is in Public Preview. This article describes how to use a notebook in Lakeflow Spark Declarative Pipelines to develop and debug ETL pipelines. Important This page describes the legacy notebook editing experience. The default, recommended experience is the Lakeflow Pipelines Editor. You can use the Lakeflow Pipelines Editor to edit notebooks, or Python or SQL code files for a pipeline. For more information, see Develop and debug ETL pipelines with the Lakeflow Pipelines Edito",2026-04-07T08:00:00.000Z,how-to,integrations,0.65,True,"Describes a product-specific, legacy notebook-based development and debugging experience for Lakeflow Spark Declarative Pipelines. 
Likely includes concrete notebook configuration details, pipeline/notebook linkage, and execution/debug patterns unique to this product, fitting integrations & coding patterns more than generic tutorials.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/observability,Observability,Monitor pipelines - Azure Databricks,Monitor and troubleshoot Lakeflow Spark pipelines,"Learn about monitoring and observability features of Lakeflow Spark Declarative Pipelines that support tasks such as tracking update history, auditing pipelines, and viewing lineage.","This section describes monitoring and observability features for Lakeflow Spark Declarative Pipelines. Additionally, there are troubleshooting topics for specific scenarios.",2026-01-23T08:00:00.000Z,landing-page,troubleshooting,0.65,True,Monitoring/observability section plus explicit mention of troubleshooting topics for specific scenarios suggests symptom→diagnosis→solution guidance and product-specific diagnostic details.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/parameters,Parameterization,Use parameters with pipelines - Azure Databricks,Parameterize Lakeflow Spark Declarative Pipelines,Learn how to use Lakeflow Spark Declarative Pipelines configurations to parameterize pipeline code.,This article explains how to configure parameters for your pipelines.,2026-01-23T08:00:00.000Z,how-to,configuration,0.7,True,"Explains configuring parameters for pipelines; likely includes specific configuration keys, usage patterns, and allowed values unique to this product.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ldp/pipeline-mode,Triggered vs. continuous,Triggered vs. continuous pipeline mode - Azure Databricks,Choose and configure triggered vs continuous pipeline modes,Learn the difference between triggered and continuous mode for Lakeflow Spark Declarative Pipelines.,"This article describes the operational semantics of triggered and continuous modes for pipelines. 
Pipeline mode is independent of the type of table being computed. Both materialized views and Streaming tables can be updated in either pipeline mode. To change between triggered and continuous, use the Pipeline mode option in the pipeline settings while creating or editing a pipeline. See Configure Pipelines. Note Refresh operations for materialized views and Streaming tables defined in Databricks SQL",2026-01-23T08:00:00.000Z,concept-article,configuration,0.65,True,Explains operational semantics of triggered and continuous modes and how to change them in settings; product-specific configuration behavior for pipeline execution.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/ldp/pipeline-mode,Triggered vs. continuous,Triggered vs. continuous pipeline mode - Azure Databricks,,Learn the difference between triggered and continuous mode for Lakeflow Spark Declarative Pipelines.,"This article describes the operational semantics of triggered and continuous modes for pipelines. Pipeline mode is independent of the type of table being computed. Both materialized views and streaming tables can be updated in either pipeline mode. To change between triggered and continuous, use the Pipeline mode option in the pipeline settings while creating or editing a pipeline. See Configure Pipelines. 
Note Refresh operations for materialized views and streaming tables defined in Databricks SQL",2026-04-21T08:00:00.000Z,concept-article,,0.3,False,"Explains semantics of triggered vs continuous pipeline modes conceptually; no evidence of numeric thresholds, configuration tables, or product-specific error/diagnostic details.",updated https://learn.microsoft.com/en-us/azure/databricks/ldp/privileges,Identities and privileges,"Manage identities, permissions, and privileges for pipelines - Azure Databricks",Configure identities and permissions for Databricks pipelines,Learn how to set Lakeflow Spark Declarative Pipelines privileges and the identity used to process pipeline updates.,"This article provides an overview of identities, permissions, and privileges for Lakeflow Spark Declarative Pipelines. Databricks recommends using Unity Catalog for all new pipelines. By default, materialized views and streaming tables created by pipelines configured with Unity Catalog can only be queried by the pipeline owner. See Use Unity Catalog with pipelines. If your pipelines publish datasets to legacy Hive metastore, see Use Lakeflow Spark Declarative Pipelines with legacy Hive metastore. ",2026-01-23T08:00:00.000Z,how-to,security,0.8,True,"Covers identities, permissions, and privileges for pipelines; likely lists specific roles/privileges and default access behaviors, which are product-specific security configurations.",unchanged
For more details on using these various properties and configurations, see the following articles:",2026-03-31T08:00:00.000Z,reference,configuration,0.85,True,"A reference for JSON settings and table properties strongly implies parameter names, allowed values, and possibly defaults in tables, which is product-specific configuration detail.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/query-history,Access query history,Access query history for pipelines - Azure Databricks,Use pipeline query history for debugging and tuning,Learn how to access query insights for Lakeflow Spark Declarative Pipelines to assist with troubleshooting and performance optimization.,"This article explains how to access query histories and query profiles associated with pipeline runs. You can use this information to debug queries, identify performance bottlenecks, and optimize pipeline runs.",2026-01-23T08:00:00.000Z,how-to,troubleshooting,0.7,True,Focuses on using query histories and profiles to debug and optimize; likely includes product-specific UI/commands and interpretation patterns for performance troubleshooting.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/ldp/recover-streaming,Recover from streaming checkpoint failure,Recover a pipeline from streaming checkpoint failure - Azure Databricks,Recover Databricks pipelines from checkpoint failures,Recover Lakeflow Spark Declarative Pipelines from streaming checkpoint failure.,This page describes how to recover a pipeline in Lakeflow Spark Declarative Pipelines when a streaming checkpoint becomes invalid or corrupted.,2026-03-31T08:00:00.000Z,how-to,troubleshooting,0.9,True,Explicitly about recovering from streaming checkpoint corruption; this is a concrete failure mode with product-specific recovery steps and edge cases.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/ldp/serverless,Serverless,Configure a serverless pipeline - Azure Databricks,Configure serverless Lakeflow Spark pipelines 
in Databricks,Configure Lakeflow Spark Declarative Pipelines to use serverless compute and Unity Catalog.,This page describes configurations for serverless pipelines. Databricks recommends developing new pipelines using serverless. Some workloads might require configuring classic compute or working with the legacy Hive metastore. See Configure classic compute for pipelines and Use Lakeflow Spark Declarative Pipelines with legacy Hive metastore. Note,2026-04-15T08:00:00.000Z,how-to,configuration,0.68,True,"The page is specifically about configuring Lakeflow Spark Declarative Pipelines to use serverless compute and Unity Catalog. This implies product-specific configuration options and settings (for example, how to select serverless vs classic compute, and how to bind to Unity Catalog) that go beyond generic knowledge. It is not just a conceptual overview but a configuration-focused guide, so it best fits the configuration sub-skill.",updated +https://learn.microsoft.com/en-us/azure/databricks/ldp/recover-streaming,Recover from streaming checkpoint failure,Recover a pipeline from streaming checkpoint failure - Azure Databricks,Recover Databricks pipelines from checkpoint failures,Recover Lakeflow Spark Declarative Pipelines from streaming checkpoint failure.,This page describes how to recover a pipeline in Lakeflow Spark Declarative Pipelines when a streaming checkpoint becomes invalid or corrupted.,2026-04-21T08:00:00.000Z,how-to,troubleshooting,0.75,True,"Focused on recovering from streaming checkpoint corruption, which typically involves Databricks-specific recovery steps, options, and possibly commands; this is symptom→solution guidance unique to the product.",updated +https://learn.microsoft.com/en-us/azure/databricks/ldp/serverless,Serverless,Configure a serverless pipeline - Azure Databricks,Configure serverless Lakeflow Spark pipelines in Databricks,Configure Lakeflow Spark Declarative Pipelines to use serverless compute and Unity Catalog.,This page describes 
configurations for serverless pipelines. Databricks recommends developing new pipelines using serverless. Some workloads might require configuring classic compute or working with the legacy Hive metastore. See Configure classic compute for pipelines and Use Lakeflow Spark Declarative Pipelines with legacy Hive metastore. Note,2026-04-15T08:00:00.000Z,how-to,configuration,0.68,True,"The page is specifically about configuring Lakeflow Spark Declarative Pipelines to use serverless compute and Unity Catalog. This implies product-specific configuration options and settings (for example, how to select serverless vs classic compute, and how to bind to Unity Catalog) that go beyond generic knowledge. It is not just a conceptual overview but a configuration-focused guide, so it best fits the configuration sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/sinks,Overview,Sinks in Lakeflow Spark Declarative Pipelines - Azure Databricks,Configure custom sinks in Lakeflow pipelines,Learn about using custom sinks for your Lakeflow Spark Declarative Pipelines flows.,"By default when you create a flow, your pipeline writes the resulting query to a Delta table, typically a materialized view or a streaming table. Pipelines also provide functionality to let you write to a wide range of sinks, or even programmatically transform and stream data to any target (or targets) that you can write to with Python. 
The following topics describe the sink functionality in pipelines.",2026-01-23T08:00:00.000Z,overview,configuration,0.7,True,Describes sink functionality and how to write to various targets; likely includes sink configuration options and behaviors specific to Lakeflow Spark Declarative Pipelines.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/source-controlled,Create a source-controlled pipeline,Create a source-controlled pipeline - Azure Databricks,Configure source-controlled Lakeflow pipelines with Git,Learn how to use Declarative Automation Bundles and Git folders to create pipelines that are source-controlled.,"Important The Lakeflow Pipelines Editor is in Public Preview. In Azure Databricks, you can source control a pipeline and all of the code associated with it. By source controlling all files associated with your pipeline, changes to your transformation code, exploration code, and pipeline configuration are all versioned in Git and can be tested in development and confidently deployed to production. 
A source-controlled pipeline offers the following advantages: Azure Databricks allows for pipelines a",2026-03-16T08:00:00.000Z,tutorial,configuration,0.7,True,"Describes how to make pipelines source-controlled using Declarative Automation Bundles and Git folders; likely includes pipeline configuration structure, file layout, and specific settings unique to Lakeflow pipelines.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/stateful-processing,Stateful stream processing,Optimize stateful processing with watermarks - Azure Databricks,Optimize stateful stream processing with watermarks,"Learn how to optimally perform stateful processing such as aggregations, joins, and deduplication in Lakeflow Spark Declarative Pipelines using features such as watermarks.","To effectively manage the data kept in state, use watermarks when performing stateful stream processing in Lakeflow Spark Declarative Pipelines, including aggregations, joins, and deduplication. This article describes how to use watermarks in your pipeline queries and includes examples of the recommended operations. Note To ensure queries that perform aggregations are processed incrementally and not fully recomputed with each update, you must use watermarks.",2026-01-23T08:00:00.000Z,concept-article,best-practices,0.7,True,"Explains how to use watermarks for aggregations, joins, and deduplication in Lakeflow pipelines with examples; these are product-specific performance and correctness recommendations.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ldp/streaming-tables,Overview,Streaming tables - Azure Databricks,,Learn about streaming tables in Azure Databricks.,"A streaming table is a Delta table with additional support for streaming or incremental data processing. A streaming table can be targeted by one or more flows in a pipeline. 
Streaming tables are a good choice for data ingestion for the following reasons: Streaming tables are also a good choice for low-latency streaming transformations for the following reasons: The following diagram illustrates how streaming tables work. On each update, the flows associated with a streaming table read the chang",2026-03-11T22:00:00.000Z,concept-article,,0.3,False,"Conceptual explanation of streaming tables and when to use them; no evidence of limits, config matrices, or detailed troubleshooting.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ldp/streaming-tables,Overview,Streaming tables - Azure Databricks,,Learn about streaming tables in Azure Databricks.,"A Streaming table is a Delta table with additional support for streaming or incremental data processing. A Streaming table can be targeted by one or more flows in a pipeline. Streaming tables are a good choice for data ingestion for the following reasons: Streaming tables are also a good choice for low-latency streaming transformations because they can reason over rows and windows of time, handle high volumes of data, and provide low-latency processing. The following diagram shows how flows read",2026-04-21T18:07:00.000Z,concept-article,,0.4,False,"Explains streaming tables and when they are a good choice, but summary lacks numeric thresholds, detailed decision matrices, or configuration parameter tables.",updated https://learn.microsoft.com/en-us/azure/databricks/ldp/target-schema,Catalog and schema,Set the target catalog and schema - Azure Databricks,Set default catalog and schema for Lakeflow pipelines,Configure the default catalog and schema for a pipeline or override Lakeflow Spark Declarative Pipelines defaults to read or write datasets.,"The Default location for data assets section of the pipeline configuration UI sets the default catalog and schema for a pipeline. 
This default catalog and schema are used for all dataset definitions and table reads, unless overridden within the query. Note Legacy publishing mode uses the LIVE virtual schema to achieve similar behavior. In the default publishing mode (used by all new pipelines), the LIVE keyword is ignored. See LIVE schema (legacy).",2026-03-18T08:00:00.000Z,how-to,configuration,0.7,True,Describes configuring default catalog and schema and overriding defaults; includes specific UI options and behavior differences with legacy LIVE schema.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/ldp/transform,Transform data,Transform data with pipelines - Azure Databricks,Declare and manage data transformations in pipelines,Learn how to use Lakeflow Spark Declarative Pipelines to declare transformations on datasets and specify how records are processed through query logic.,"This article describes how you can use pipelines to declare transformations on datasets and specify how records are processed through query logic. It also contains examples of common transformation patterns for building pipelines. You can define a dataset against any query that returns a DataFrame. You can use Apache Spark built-in operations, UDFs, custom logic, and MLflow models as transformations in Lakeflow Spark Declarative Pipelines. 
After data has been ingested into your pipeline, you can",2026-02-03T08:00:00.000Z,how-to,configuration,0.65,True,"Shows how to declare transformations on datasets and use Spark operations, UDFs, and MLflow models; includes product-specific transformation configuration and patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ldp/transform,Transform data,Transform data with pipelines - Azure Databricks,,Learn how to use Lakeflow Spark Declarative Pipelines to declare transformations on datasets and specify how records are processed through query logic.,"This article describes how you can use pipelines to declare transformations on datasets and specify how records are processed through query logic. It also contains examples of common transformation patterns for building pipelines. You can define a dataset against any query that returns a DataFrame. You can use Apache Spark built-in operations, UDFs, custom logic, and MLflow models as transformations in Lakeflow Spark Declarative Pipelines. After data has been ingested into your pipeline, you can",2026-04-21T08:00:00.000Z,how-to,,0.3,False,"Transformation patterns and general usage of pipelines; summary suggests examples but not specific numeric limits, configuration tables, or error-code-based troubleshooting.",updated https://learn.microsoft.com/en-us/azure/databricks/ldp/tutorial-get-started,Create your first pipeline using the Lakeflow Pipelines Editor,Tutorial: Create your first pipeline using the Lakeflow Pipelines Editor - Azure Databricks,,Learn how to create and deploy a simple pipeline with Lakeflow Spark Declarative Pipelines and the Lakeflow Pipelines Editor.,"Learn how to create a new pipeline using Lakeflow Spark Declarative Pipelines (SDP) for data orchestration and Auto Loader. This tutorial extends the sample pipeline by cleaning the data and creating a query to find the top 100 users. 
In this tutorial, you learn how to use the Lakeflow Pipelines Editor to:",2026-04-07T08:00:00.000Z,concept-article,,0.3,False,"Step-by-step getting-started tutorial for creating a first pipeline; focuses on basic usage and example scenario, not on detailed configuration matrices, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/tutorial-pipelines,Build an ETL pipeline using CDC,Tutorial: Build an ETL pipeline using change data capture - Azure Databricks,,"Learn how to create and deploy an ETL (extract, transform, and load) pipeline using change data capture (CDC) with Lakeflow Spark Declarative Pipelines.","Learn how to create and deploy an ETL (extract, transform, and load) pipeline with change data capture (CDC) using Lakeflow Spark Declarative Pipelines (SDP) for data orchestration and Auto Loader. An ETL pipeline implements the steps to read data from source systems, transform that data based on requirements, such as data quality checks and record de-duplication, and write the data to a target system, such as a data warehouse or a data lake. 
In this tutorial, you'll use data from a customers tabl",2026-04-07T08:00:00.000Z,tutorial,,0.3,False,"How-to tutorial for building an ETL pipeline with CDC; primarily procedural guidance without product-specific limits, configuration tables, or structured troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/tutorial-spatial-pipelines,Build a geospatial pipeline with native spatial types,Tutorial: Build a geospatial pipeline with native spatial types - Azure Databricks,,Learn how to build a geospatial pipeline with Lakeflow Spark Declarative Pipelines using native spatial types and spatial joins.,"Learn how to create and deploy a pipeline that ingests GPS data, converts coordinates to native spatial types, and joins against warehouse geofences to track arrivals using Lakeflow Spark Declarative Pipelines (SDP) for data orchestration and Auto Loader. This tutorial uses Databricks native spatial types (GEOMETRY, GEOGRAPHY) and built-in spatial functions such as ST_Point, ST_GeomFromWKT, and ST_Contains, so you can run geospatial workflows at scale without external libraries. In this tutorial, yo",2026-03-17T08:00:00.000Z,tutorial,,0.4,False,"Geospatial pipeline tutorial with example functions; appears as a scenario walkthrough rather than configuration references, limits, or best-practice gotchas.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/tutorials,Overview,Lakeflow Spark Declarative Pipelines tutorials - Azure Databricks,,Learn how to use Lakeflow Spark Declarative Pipelines in Azure Databricks with tutorials.,The tutorials in this section are designed to help you learn about SDP. 
The following tutorials are available:",2026-04-10T21:59:00.000Z,landing-page,,0.2,False,"Tutorials landing page listing available Lakeflow Spark Declarative Pipelines tutorials; navigation/overview content without detailed limits, configs, or troubleshooting data.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ldp/unity-catalog,Unity Catalog,Use Unity Catalog with pipelines - Azure Databricks,Configure Unity Catalog usage in Lakeflow pipelines,Learn how to read tables from and write tables to Unity Catalog in your Lakeflow Spark Declarative Pipelines.,"Databricks recommends configuring Lakeflow Spark Declarative Pipelines with Unity Catalog. Using Unity Catalog is the default for newly created pipelines. Pipelines configured with Unity Catalog publish all defined materialized views and streaming tables to the specified catalog and schema. Unity Catalog pipelines can read from other Unity Catalog tables and volumes. To manage permissions on the tables created by a Unity Catalog pipeline, use GRANT and REVOKE. Note This article discusses function",2026-04-08T08:00:00.000Z,how-to,configuration,0.68,True,"Describes how Lakeflow Spark Declarative Pipelines behave when configured with Unity Catalog, including default behavior for new pipelines, how materialized views and streaming tables are published to specific catalogs and schemas, and how permissions are managed with GRANT/REVOKE. 
This is product-specific configuration/behavior detail rather than generic concepts.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/ldp/updates,Pipeline update,Run a pipeline update - Azure Databricks,,This article explains what a pipeline update is and how to run an update in Lakeflow Spark Declarative Pipelines.,This article explains pipeline updates and provides details on how to trigger an update.,2026-04-06T08:00:00.000Z,concept-article,,0.3,False,"Described as explaining what a pipeline update is and how to trigger it, which sounds like conceptual and basic operational guidance without clear evidence of detailed configuration tables, limits, or error mappings.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/ldp/unity-catalog,Unity Catalog,Use Unity Catalog with pipelines - Azure Databricks,,Learn how to read tables from and write tables to Unity Catalog in your Lakeflow Spark Declarative Pipelines.,"Databricks recommends configuring Lakeflow Spark Declarative Pipelines with Unity Catalog. Using Unity Catalog is the default for newly created pipelines. Pipelines configured with Unity Catalog publish all defined materialized views and streaming tables to the specified catalog and schema. Unity Catalog pipelines can read from other Unity Catalog tables and volumes. To manage permissions on the tables created by a Unity Catalog pipeline, use GRANT and REVOKE. 
Note This article discusses function",2026-04-21T08:00:00.000Z,how-to,,0.3,False,"Describes using Unity Catalog with pipelines at a conceptual level (read/write, permissions via GRANT/REVOKE); summary does not indicate detailed RBAC role lists, config tables, or other expert-only specifics.",updated +https://learn.microsoft.com/en-us/azure/databricks/ldp/updates,Pipeline update,Run a pipeline update - Azure Databricks,,This article explains what a pipeline update is and how to run an update in Lakeflow Spark Declarative Pipelines.,This article explains pipeline updates and provides details on how to trigger an update.,2026-04-20T17:34:00.000Z,concept-article,,0.3,False,"Explains what a pipeline update is and how to trigger it; summary suggests procedural guidance without detailed configuration matrices, limits, or error-code-based troubleshooting.",updated https://learn.microsoft.com/en-us/azure/databricks/ldp/using-alter-sql,ALTER pipeline datasets,Use ALTER statements with pipeline datasets - Azure Databricks,Use ALTER SQL statements with pipeline datasets,Learn about using Databricks SQL `ALTER` statements with Lakeflow Spark Declarative Pipelines.,"Lakeflow Spark Declarative Pipelines (SDP) defines pipelines in source code that is specific to SDP. You can edit pipeline source in either SQL or Python, for example, in the Lakeflow Pipelines Editor. Lakeflow Connect creates pipelines that ingest data, and create ingestion streaming tables. Azure Databricks also provides a SQL environment called Databricks SQL. 
You can create materialized views and streaming tables with Databricks SQL using pipeline functionality outside of SDP (see Use pipelines",2026-02-05T18:55:00.000Z,concept-article,configuration,0.7,True,Covers how Databricks SQL ALTER interacts with pipeline-defined datasets; product-specific behavior and constraints for schema changes.,unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/what-is-change-data-capture,Change data capture and snapshots,Change data capture and snapshots - Azure Databricks,Design CDC and snapshot patterns in Databricks,"Learn about change data capture (CDC), snapshots, and slowly changing dimensions (SCD).","Data engineers often need to replicate data from sources upstream of Azure Databricks—such as relational databases (Oracle, Postgres, SQL Server)—into Azure Databricks for analytics, reporting, and machine learning. As operational systems change, analytical tables must stay synchronized with those changes. Some teams need to reflect the current state of their operational databases for reporting and analytics. Others need to preserve a full history of changes for auditability, regulatory requirem",2026-03-09T18:23:00.000Z,concept-article,best-practices,0.65,True,"Databricks-focused guidance on CDC, snapshots, and SCD with concrete pipeline implications; product-specific recommendations for different requirements.",unchanged https://learn.microsoft.com/en-us/azure/databricks/ldp/where-is-dlt,Where is DLT?,What happened to Delta Live Tables (DLT)? - Azure Databricks,,Learn how Lakeflow Spark Declarative Pipelines replaced Delta Live Tables (DLT).,"The product formerly known as Delta Live Tables (DLT) has been updated to Lakeflow Spark Declarative Pipelines (SDP). If you have previously used DLT, there is no migration required to use Lakeflow Spark Declarative Pipelines: your code will still work in SDP. 
There are changes that you can make to better take advantage of Lakeflow Spark Declarative Pipelines, both now and in the future, as well as to introduce compatibility with the Apache Spark™ Declarative Pipelines (beginning in Apache Spark",2026-01-23T08:00:00.000Z,concept-article,,0.3,False,"Explains product renaming and high-level migration note; no detailed config, limits, or decision matrices.",unchanged @@ -1509,15 +1531,15 @@ https://learn.microsoft.com/en-us/azure/databricks/libraries/restart-python-proc https://learn.microsoft.com/en-us/azure/databricks/libraries/volume-libraries,Install libraries from a volume,Install libraries from a volume - Azure Databricks,Install Databricks libraries from Unity Catalog volumes,Learn how to upload libraries to volumes and install them onto clusters.,"This article walks you through the steps required to upload libraries or requirements.txt files to volumes and install them onto clusters in Azure Databricks. You can install libraries onto all-purpose compute or job compute. For more information about volumes, see What are Unity Catalog volumes?. For information about working with Unity Catalog, including controlling access and creating objects, see What is Unity Catalog?. For full library compatibility details, see Compute-scoped libraries.",2026-01-19T08:00:00.000Z,how-to,configuration,0.7,True,Explains using volumes and requirements.txt for library installation; relies on Unity Catalog volume configuration and compatibility details.,unchanged https://learn.microsoft.com/en-us/azure/databricks/libraries/workspace-files-libraries,Install libraries from workspace files,Install libraries from workspace files - Azure Databricks,Install libraries from workspace files in Databricks,Learn how to upload libraries to workspace files and install them onto clusters.,"This article walks you through the steps required to upload package or requirements.txt files to workspace files and install them onto clusters in Azure Databricks. 
You can install libraries onto all-purpose compute or job compute. Important This article describes storing libraries as workspace files. This is different than workspace libraries, which are deprecated. For more information about workspace files, see Workspace UI. Databricks Runtime 15.0 or above is required to upload requirements.txt f",2026-01-19T08:00:00.000Z,how-to,configuration,0.7,True,Describes storing libraries as workspace files with runtime version requirements; this is product-specific configuration behavior and constraints.,unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/,AI and ML introduction,AI and machine learning on Databricks - Azure Databricks,,Build AI and machine learning applications on Databricks using unified data and ML platform capabilities.,"Build, deploy, and manage AI and machine learning applications with Mosaic AI, an integrated platform that unifies the entire AI lifecycle from data preparation to production monitoring. For a set of tutorials to get you started, see AI and machine learning tutorials.",2026-03-27T08:00:00.000Z,concept-article,,0.1,False,High-level AI/ML overview for Databricks; no detailed technical parameters or limits indicated.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-ml-tutorials,Tutorials,AI and machine learning tutorials - Azure Databricks,,A curated list of quickstart notebooks and tutorials designed to quickly get you started with AI and ML on Databricks.,Try one of these tutorials to get started. You can import these notebooks to your Databricks workspace.,2026-04-17T18:03:00.000Z,concept-article,,0.1,False,"Curated list of tutorials/quickstarts; navigation/collection page without deep product-specific limits, configs, or troubleshooting content.",updated -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/,Overview,AI Runtime - Azure Databricks,,Learn about Databricks AI Runtime. 
This serverless GPU compute offering is specialized for custom AI workloads such as custom single-node deep learning.,Important AI Runtime for single-node tasks is in Public Preview. The distributed training API for multi-GPU workloads remain in Beta.,2026-04-15T22:08:00.000Z,concept-article,,0.2,False,"High-level description of Databricks AI Runtime and preview status; appears to be an overview page without detailed limits, configuration tables, or troubleshooting mappings.",updated -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/connecting,Connect to AI Runtime,Connect to AI Runtime - Azure Databricks,Connect notebooks and jobs to Databricks AI Runtime,"Learn how to connect to AI Runtime on serverless GPU from notebooks, scheduled jobs, and the Jobs API.","Important AI Runtime for single-node tasks is in Public Preview. The distributed training API for multi-GPU workloads remain in Beta. This article describes how to connect to AI Runtime from interactive notebooks, scheduled jobs, and the Jobs API.",2026-04-15T22:08:00.000Z,how-to,configuration,0.65,True,"Explains how to connect to AI Runtime from notebooks, scheduled jobs, and the Jobs API, which typically involves specifying cluster/runtime identifiers, job configuration fields, and API parameters that are product-specific configuration details.",updated -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/dataloading,Read in data,Load data on AI Runtime - Azure Databricks,Apply data loading best practices on Databricks AI Runtime,Learn about best practices for data loading on Databricks AI Runtime,Important AI Runtime for single-node tasks is in Public Preview. The distributed training API for multi-GPU workloads remain in Beta. This section covers information about loading data on AI Runtime specifically for ML and DL applications. Check the tutorial to learn more about how to load and transform data using the Spark Python API.
Note Unity Catalog is required. All data access on AI Runtime goes through Unity Catalog. Your tables and volumes must be registered in Unity Catalog and accessible t,2026-03-19T18:35:00.000Z,best-practice,best-practices,0.7,True,"Described as best practices for data loading on AI Runtime, including Unity Catalog requirements and likely product-specific patterns and gotchas for ML/DL workloads.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/distributed-training,Distributed training,Multi-GPU workload - Azure Databricks,Use Serverless GPU Python API for multi-GPU training,Distributed training on multi-GPU nodes using serverless\_gpu python library,"Important This feature is inBeta. Workspace admins can control access to this feature from thePreviewspage. SeeManage Azure Databricks previews. You can launch distributed workloads across multiple GPUs on a single node using theServerless GPU Python API. +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-ml-tutorials,Tutorials,AI and machine learning tutorials - Azure Databricks,,A curated list of quickstart notebooks and tutorials designed to quickly get you started with AI and ML on Databricks.,Try one of these tutorials to get started. You canimport these notebooksto your Databricks workspace.,2026-04-17T18:03:00.000Z,concept-article,,0.1,False,"Curated list of tutorials/quickstarts; navigation/collection page without deep product-specific limits, configs, or troubleshooting content.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/,Overview,AI Runtime - Azure Databricks,,Learn about Databricks AI Runtime. This serverless GPU compute offering is specialized for custom AI workloads such as custom single-node deep learning.,Important AI Runtime for single-node tasks is inPublic Preview. 
The distributed training API for multi-GPU workloads remain inBeta.,2026-04-21T08:00:00.000Z,concept-article,,0.3,False,"High-level overview of Databricks AI Runtime and preview/beta status; lacks concrete limits, configuration parameters, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/connecting,Connect to AI Runtime,Connect to AI Runtime - Azure Databricks,Connect notebooks and jobs to Databricks AI Runtime,"Learn how to connect to AI Runtime on serverless GPU from notebooks, scheduled jobs, and the Jobs API.","Important AI Runtime for single-node tasks is inPublic Preview. The distributed training API for multi-GPU workloads remain inBeta. This article describes how to connect to AI Runtime from interactive notebooks, scheduled jobs, and the Jobs API.",2026-04-21T22:28:00.000Z,how-to,configuration,0.7,True,"Explains how to connect from notebooks, scheduled jobs, and Jobs API, which typically involves product-specific connection parameters and API usage patterns not captured by generic knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/dataloading,Read in data,Load data on AI Runtime - Azure Databricks,Apply data loading best practices on Databricks AI Runtime,Learn about best practices for data loading on Databricks AI Runtime,Important AI Runtime for single-node tasks is inPublic Preview. The distributed training API for multi-GPU workloads remain inBeta. This section covers information about loading data on AI Runtime specifically for ML and DL applications. Check thetutorialto learn more about how to load and transform data using the Spark Python API. Note Unity Catalog is required. All data access on AI Runtime goes through Unity Catalog. 
Your tables and volumes must be registered in Unity Catalog and accessible t,2026-04-23T08:00:00.000Z,best-practice,best-practices,0.7,True,"Provides product-specific data loading guidance tied to Unity Catalog requirements and AI Runtime behavior, representing concrete best practices for this service.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/distributed-training,Distributed training,Multi-GPU workload - Azure Databricks,Use serverless GPU Python API for multi-GPU training,Distributed training on multi-GPU nodes using serverless\_gpu python library,"Important This feature is inBeta. Workspace admins can control access to this feature from thePreviewspage. SeeManage Azure Databricks previews. You can launch distributed workloads across multiple GPUs on a single node using theServerless GPU Python API. The API provides a simple, unified interface that abstracts away the details of GPU provisioning, environment setup, and workload distribution. With minimal code changes, you can seamlessly move -from single-GPU training to multi-GPU distributed",2026-04-15T22:08:00.000Z,feature-guide,integrations,0.7,True,"Describes a specific Serverless GPU Python API for distributed training, which will include function signatures, parameters, and constraints unique to this product—matching integration/coding pattern criteria rather than generic ML guidance.",updated -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/environment,Set up environment,Set up your environment - Azure Databricks,Configure Python environments for Databricks AI Runtime,"Learn how to set up and configure Python environments for AI Runtime, including the default and AI environments.","Important AI Runtime for single-node tasks is inPublic Preview. The distributed training API for multi-GPU workloads remain inBeta. 
This page describes how to choose and configure a Python environment for AI Runtime, including environment caching behavior, custom module imports, and known limitations.",2026-03-20T08:00:00.000Z,how-to,configuration,0.8,True,"Explicitly about setting up and configuring Python environments, including environment caching behavior and known limitations, which are product-specific configuration details and constraints.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/,Overview,AI Runtime example notebooks - Azure Databricks,,Use example tutorial notebooks for AI Runtime (serverless GPU).,Important AI Runtime for single-node tasks is in Public Preview. The distributed training API for multi-GPU workloads remain in Beta.
The pages below include various notebook examples that show how to use AI Runtime for different tasks.,2026-03-26T08:00:00.000Z,concept-article,,0.3,False,"Index of example notebooks; does not itself contain configuration tables, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/gpu-classic-ml,Overview,Classic machine learning - Azure Databricks,,Examples for classic machine learning tasks using AI Runtime (serverless GPU).,Important AI Runtime for single-node tasks is in Public Preview. The distributed training API for multi-GPU workloads remain in Beta. This page provides notebook examples for classic machine learning tasks using AI Runtime. These examples demonstrate how to leverage GPUs for traditional ML algorithms and time series forecasting.,2026-03-26T08:00:00.000Z,concept-article,,0.3,False,Landing page for classic ML examples; mostly links and high-level descriptions without explicit product-specific configuration tables.,unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/gpu-computer-vision,Overview,Computer vision - Azure Databricks,,Examples for computer vision tasks using AI Runtime.,Important AI Runtime for single-node tasks is in Public Preview. The distributed training API for multi-GPU workloads remain in Beta. This page provides notebook examples for computer vision tasks using AI Runtime.
These examples demonstrate how to train and fine-tune models for various computer vision applications.,2026-03-19T18:35:00.000Z,concept-article,,0.3,False,"Landing page for computer vision example notebooks; mainly descriptive and linking, without explicit configuration tables or limits.",unchanged @@ -1529,18 +1551,18 @@ https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/e https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/gpu-recommendation,Deep learning recommendation,Deep learning based recommender systems - Azure Databricks,,Examples for building recommendation systems using AI Runtime (serverless GPU).,Important AI Runtime for single-node tasks is in Public Preview. The distributed training API for multi-GPU workloads remain in Beta. This page provides notebook examples for building recommendation systems using AI Runtime. These examples demonstrate how to create efficient recommendation models using modern deep learning approaches.,2026-03-31T23:28:00.000Z,concept-article,,0.3,False,"Overview page for recommender system examples; primarily descriptive with links, not detailed configuration or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-api-h100-starter,Distributed training with H100 GPUs,Get started: Serverless GPU compute with H100 GPUs - Azure Databricks,Get started with H100 serverless GPU using serverless_gpu API,Learn how to use Databricks AI Runtime (serverless GPU) with H100 accelerators to run distributed GPU workloads using the serverless\_gpu Python library.,"Open notebook version of this page This notebook demonstrates how to use Databricks Serverless GPU compute with H100 accelerators. You'll learn how to connect to H100 GPUs and run distributed workloads using the serverless_gpu Python library.
The serverless_gpu library enables seamless execution of GPU workloads directly from Databricks notebooks. It provides decorators and runtime utilities for distributed GPU computing. To learn more, see the Serverless GPU API documentation.",2026-03-26T08:00:00.000Z,tutorial,integrations,0.7,True,"Notebook demonstrates connecting to H100 GPUs and using the serverless_gpu Python library, including decorators and runtime utilities that are specific integration APIs.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-cnn-mnist,Image classification with CNN,Image classification using convolutional neural networks - Azure Databricks,Train CNN image classifier on Databricks AI Runtime,Train a Convolutional Neural Network (CNN) for image classification using PyTorch and the MNIST dataset on Databricks AI Runtime (serverless GPU).,"Open notebook version of this page This notebook demonstrates how to train a Convolutional Neural Network (CNN) for image classification using the MNIST dataset and PyTorch. The MNIST dataset contains 70,000 grayscale images of handwritten digits (0-9), making it ideal for learning image classification techniques.
You'll learn how to:",2026-03-26T08:00:00.000Z,tutorial,integrations,0.6,True,"Notebook tutorial for CNN training on AI Runtime likely includes Databricks-specific environment setup, data access, and logging code patterns that represent concrete integration details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-distributed-finetune-qwen2-0.5b,Distributed fine-tune Qwen2-0.5B with LoRA,Distributed fine-tuning of Qwen2-0.5B with LoRA - Azure Databricks,,Fine-tune Qwen2-0.5B using LoRA and Liger Kernels for memory-efficient distributed training on AI Runtime (serverless GPU) with parameter reduction.,Open notebook version of this page This notebook demonstrates how to efficiently fine-tune the Qwen2-0.5B large language model using parameter-efficient techniques on Serverless GPU Compute. You'll learn how to: Key concepts:,2026-04-17T21:49:00.000Z,tutorial,,0.4,False,"Tutorial-style notebook for fine-tuning Qwen2-0.5B; likely focuses on example code and workflow rather than structured configuration tables, limits, or troubleshooting mappings. Treated as a general how-to rather than a reusable expert reference.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-distributed-finetune-qwen2-0.5b,Distributed fine-tune Qwen2-0.5B with LoRA,Distributed fine-tuning of Qwen2-0.5B with LoRA - Azure Databricks,,Fine-tune Qwen2-0.5B using LoRA and Liger Kernels for memory-efficient distributed training on AI Runtime (serverless GPU) with parameter reduction.,Open notebook version of this page This notebook demonstrates how to efficiently fine-tune the Qwen2-0.5B large language model using parameter-efficient techniques on Serverless GPU Compute. 
You'll learn how to: Key concepts:,2026-04-17T21:49:00.000Z,tutorial,,0.4,False,"Tutorial-style notebook for fine-tuning Qwen2-0.5B; likely focuses on example code and workflow rather than structured configuration tables, limits, or troubleshooting mappings. Treated as a general how-to rather than a reusable expert reference.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-distributed-gpt-oss-20b,Fine-tune GPT-OSS 20B,Distributed fine-tuning of OpenAI gpt-oss-20b - Azure Databricks,Distributed fine-tuning of gpt-oss-20b on Databricks,Fine-tune OpenAI's gpt-oss-20b model using distributed training on AI Runtime (serverless GPU) for large language model customization.,Open notebook version of this page This notebook demonstrates how to fine-tune OpenAI's gpt-oss-20b model using distributed training on serverless GPU compute. You'll learn how to: Key concepts:,2026-03-26T08:00:00.000Z,tutorial,integrations,0.7,True,"Notebook for fine-tuning a specific OpenAI model with distributed training on Databricks, including concrete integration code and configuration patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-distributed-pytorch-fsdp,Train Transformer with PyTorch FSDP,Distributed training using PyTorch FSDP on serverless GPU compute - Azure Databricks,Train Transformers with PyTorch FSDP on Databricks,Train Transformer models using PyTorch FSDP distributed training on AI Runtime (serverless GPU) to shard model parameters across multiple GPUs efficiently.,"Open notebook version of this page This notebook demonstrates how to train a Transformer model using distributed training with PyTorch's Fully Sharded Data Parallel (FSDP) on Databricks serverless GPU compute.
FSDP is a data parallelism technique that shards model parameters, gradients, and optimizer states across multiple GPUs, enabling efficient training of large models that don't fit on a single GPU. In this example, you'll learn how to: This notebook uses synthetic data to keep it self-conta",2026-03-26T08:00:00.000Z,tutorial,integrations,0.7,True,"Notebook for distributed training on Databricks Serverless GPU with PyTorch FSDP will contain concrete configuration (FSDP settings, GPU allocation, Databricks runtime specifics, training loop patterns) that are product- and stack-specific integration patterns.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-finetune-llama-unsloth,Fine-tune Llama-3.2-3B with Unsloth,Finetune Llama-3.2-3B with Unsloth - Azure Databricks,,Fine-tune Llama-3.2-3B using Unsloth library with LoRA for parameter-efficient training on single GPU serverless compute with reduced memory usage.,"Open notebook version of this page This notebook demonstrates how to finetune the Llama-3.2-3B large language model using the Unsloth library. Unsloth provides optimized implementations for parameter-efficient fine-tuning (PEFT) techniques like LoRA (Low-Rank Adaptation), enabling faster training with reduced memory usage. The notebook covers:",2026-04-14T21:45:00.000Z,tutorial,,0.4,False,"Tutorial notebook for fine-tuning Llama-3.2-3B with Unsloth; primarily an end-to-end example.
Does not clearly indicate structured limits, configuration matrices, or troubleshooting content required for expert-knowledge classification.",updated -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-finetune-llama-unsloth-distributed,Distributed fine-tune Llama-3.2-3B with Unsloth,Distributed finetune Llama-3.2-3B with Unsloth on Multiple GPUs - Azure Databricks,Distributed Unsloth finetuning of Llama-3.2-3B on Databricks,Fine-tune Llama-3.2-3B using distributed training across multiple A10 GPUs with Unsloth library for optimized parameter-efficient training.,Open notebook version of this page This notebook demonstrates how to finetune the Llama-3.2-3B LLM using the Unsloth and serverless_gpu library on 8 H100 GPUs,2026-04-15T19:49:00.000Z,tutorial,integrations,0.68,True,"Notebook-style tutorial for fine-tuning Llama-3.2-3B on Azure Databricks Serverless GPU with Unsloth and multiple GPUs. It necessarily includes concrete, product-specific code patterns and configuration parameters (e.g., serverless_gpu, Unsloth setup, multi-GPU configuration) that go beyond generic LLM training knowledge, fitting the integrations & coding patterns category.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-finetune-llama-unsloth,Fine-tune Llama-3.2-3B with Unsloth,Finetune Llama-3.2-3B with Unsloth - Azure Databricks,,Fine-tune Llama-3.2-3B using Unsloth library with LoRA for parameter-efficient training on single GPU serverless compute with reduced memory usage.,"Open notebook version of this page This notebook demonstrates how to finetune the Llama-3.2-3B large language model using the Unsloth library. Unsloth provides optimized implementations for parameter-efficient fine-tuning (PEFT) techniques like LoRA (Low-Rank Adaptation), enabling faster training with reduced memory usage.
The notebook covers:",2026-04-14T21:45:00.000Z,tutorial,,0.4,False,"Tutorial notebook for fine-tuning Llama-3.2-3B with Unsloth; primarily an end-to-end example. Does not clearly indicate structured limits, configuration matrices, or troubleshooting content required for expert-knowledge classification.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-finetune-llama-unsloth-distributed,Distributed fine-tune Llama-3.2-3B with Unsloth,Distributed finetune Llama-3.2-3B with Unsloth on Multiple GPUs - Azure Databricks,Distributed Unsloth finetuning of Llama-3.2-3B on Databricks,Fine-tune Llama-3.2-3B using distributed training across multiple A10 GPUs with Unsloth library for optimized parameter-efficient training.,Open notebook version of this page This notebook demonstrates how to finetune the Llama-3.2-3B LLM using the Unsloth and serverless_gpu library on 8 H100 GPUs,2026-04-15T19:49:00.000Z,tutorial,integrations,0.68,True,"Notebook-style tutorial for fine-tuning Llama-3.2-3B on Azure Databricks Serverless GPU with Unsloth and multiple GPUs. It necessarily includes concrete, product-specific code patterns and configuration parameters (e.g., serverless_gpu, Unsloth setup, multi-GPU configuration) that go beyond generic LLM training knowledge, fitting the integrations & coding patterns category.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-finetune-qwen2-0.5b,Fine-tune Qwen2-0.5B,LoRA fine-tuning of Qwen2-0.5B - Azure Databricks,Fine-tune Qwen2-0.5B with LoRA on Databricks AI Runtime,Fine-tune Qwen2-0.5B large language model using parameter-efficient LoRA techniques on AI Runtime (serverless GPU) for efficient model customization.,Open notebook version of this page This notebook demonstrates how to efficiently fine-tune the Qwen2-0.5B large language model using parameter-efficient techniques. 
You'll learn how to: Key concepts:,2026-03-26T08:00:00.000Z,tutorial,integrations,0.65,True,"Notebook-style tutorial for a specific model on AI Runtime likely includes concrete code, library parameters, and Databricks-specific integration patterns (e.g., checkpointing, logging) that are product-specific coding patterns.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-gpt-oss-120b-ddp-fsdp,Fine-tune GPT-OSS 120B,Fine-tune OpenAI's GPT-OSS 120B model using distributed training - Azure Databricks,Distributed GPT-OSS 120B finetuning with DDP/FSDP on Databricks,Fine-tune OpenAI's GPT-OSS 120B model using supervised fine-tuning on 8 H100 GPUs with DDP and FSDP distributed training strategies.,"Open notebook version of this page This notebook demonstrates supervised fine-tuning (SFT) of the large 120B parameter GPT-OSS model on 8 H100 GPUs using Databricks Serverless GPU Compute. The training leverages: By setting remote=False and specifying 16 GPUs, this can be extended to multi-node training across 16 GPUs.",2026-04-15T22:08:00.000Z,tutorial,integrations,0.7,True,"Describes supervised fine-tuning of a 120B-parameter GPT-OSS model on 8–16 H100 GPUs using Databricks Serverless GPU with DDP and FSDP.
This implies detailed, product-specific code and configuration (e.g., DDP/FSDP settings, GPU count parameters, Databricks runtime specifics) that qualify as expert integration patterns rather than generic ML guidance.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-gpt-oss-120b-ddp-fsdp,Fine-tune GPT-OSS 120B,Fine-tune OpenAI's GPT-OSS 120B model using distributed training - Azure Databricks,Distributed GPT-OSS 120B finetuning with DDP/FSDP on Databricks,Fine-tune OpenAI's GPT-OSS 120B model using supervised fine-tuning on 8 H100 GPUs with DDP and FSDP distributed training strategies.,"Open notebook version of this page This notebook demonstrates supervised fine-tuning (SFT) of the large 120B parameter GPT-OSS model on 8 H100 GPUs using Databricks Serverless GPU Compute. The training leverages: By setting remote=False and specifying 16 GPUs, this can be extended to multi-node training across 16 GPUs.",2026-04-15T22:08:00.000Z,tutorial,integrations,0.7,True,"Describes supervised fine-tuning of a 120B-parameter GPT-OSS model on 8–16 H100 GPUs using Databricks Serverless GPU with DDP and FSDP.
This implies detailed, product-specific code and configuration (e.g., DDP/FSDP settings, GPU count parameters, Databricks runtime specifics) that qualify as expert integration patterns rather than generic ML guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-llama3-8b-llmfoundry,Fine-tune Llama 3.1 8B with LLM Foundry,Fine-tune Llama 3.1 8B using Mosaic LLM Foundry on Databricks Serverless GPU - Azure Databricks,Fine-tune Llama 3.1 8B with Mosaic LLM Foundry,Fine-tune Llama 3.1 8B model using Mosaic LLM Foundry on Databricks serverless GPU with distributed training strategies and model evaluation.,"Open notebook version of this page This notebook demonstrates how to fine-tune a Llama 3.1 8B model using Mosaic LLM Foundry on Databricks Serverless GPU. LLM Foundry is a codebase for training, fine-tuning, evaluating, and deploying large language models with support for distributed training strategies. The notebook uses:",2026-03-19T18:35:00.000Z,tutorial,integrations,0.7,True,"Notebook-style tutorial for Databricks Serverless GPU and Mosaic LLM Foundry will include concrete, product-specific code patterns, configuration parameters, and integration details (e.g., model IDs, training arguments, cluster/runtime specifics) that go beyond generic LLM fine-tuning knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-olmo3-7b-lora-axolotl,Distributed fine-tune Olmo3 7B with Axolotl,Fine-tune Olmo3 7B with Axolotl on multi-GPU serverless compute - Azure Databricks,Fine-tune Olmo3 7B with Axolotl on Databricks serverless GPU,Fine-tune Olmo3 7B model using LoRA and Axolotl framework on Databricks serverless GPU with distributed training orchestration.,"Open notebook version of this page This notebook demonstrates how to fine-tune the Olmo3 7B Instruct model using Axolotl on Databricks serverless GPU compute.
Axolotl provides a high-performance framework for LLM post-training with QLoRA (Quantized Low-Rank Adaptation), enabling efficient fine-tuning on multi-GPU infrastructure. The trained model is logged to MLflow and registered in Unity Catalog for deployment.",2026-03-19T18:35:00.000Z,tutorial,integrations,0.7,True,"Shows Axolotl-based QLoRA fine-tuning with Databricks serverless GPU and MLflow/Unity Catalog integration, including concrete configuration and API usage unique to this environment.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-recommender-system-lightning,Two-tower recommendation model,Distributed training of two tower recommendation model using Lightning - Azure Databricks,Distributed two-tower recommender training with Lightning on Databricks,Training a Two-Tower deep recommendation model,"Open notebook version of this page This notebook demonstrates how to create a two tower recommendation model using the PyTorch Lightning Trainer API with distributed training across 8 H100 GPUs on a single node. Reminders: To get started, configure your notebook to use serverless GPU: Your notebook is now connected to serverless GPU compute.
The @distributed decorator will handle launching your training across all 8 GPUs.",2026-03-31T23:28:00.000Z,tutorial,integrations,0.7,True,"Shows how to use PyTorch Lightning Trainer API with Databricks serverless GPU and @distributed decorator, which are specific integration and configuration patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-retinanet-image-detection-model-training,Object detection with RetinaNet,Train a RetinaNet image detection model - Azure Databricks,Train RetinaNet object detection model on Databricks,Train RetinaNet object detection model from scratch using PyTorch and torchvision on serverless GPU with Feature Pyramid Network and focal loss.,Open notebook version of this page This notebook demonstrates how to train a RetinaNet object detection model from scratch using PyTorch and torchvision on Databricks serverless GPU compute. RetinaNet is a single-stage object detection model that uses a Feature Pyramid Network (FPN) and focal loss to handle class imbalance. The notebook covers:,2026-03-19T18:35:00.000Z,tutorial,integrations,0.6,True,"Demonstrates RetinaNet training with PyTorch on Databricks serverless GPU, including code and configuration patterns specific to Databricks AI Runtime.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-sft-trl-deepspeed-llama-1b,Fine-tune Llama 3.2 1B with LoRA,Fine-tune Llama 3.2 1B with LoRA using Serverless GPU - Azure Databricks,Fine-tune Llama 3.2 1B with LoRA and DeepSpeed,Fine-tune Llama 3.2 1B with LoRA using supervised fine-tuning and DeepSpeed ZeRO Stage 3 optimization on 8 A10 GPUs with serverless compute.,"Open notebook version of this page This notebook demonstrates how to fine-tune a large language model using supervised fine-tuning (SFT) with Low-Rank Adaptation (LoRA) on Databricks Serverless GPU.
The notebook uses the Transformers Reinforcement Learning (TRL) library with DeepSpeed ZeRO Stage 3 optimization to efficiently train Llama 3.2 1B on a single node with 8 H100 GPUs. Key concepts: For more information, see AI Runtime.",2026-03-19T18:35:00.000Z,tutorial,integrations,0.7,True,"Notebook for SFT with LoRA and DeepSpeed ZeRO Stage 3 on Databricks Serverless GPU will contain specific DeepSpeed config options, TRL usage patterns, and Databricks GPU setup details that are concrete integration and coding patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-sft-trl-deepspeed-llama-1b,Fine-tune Llama 3.2 1B with LoRA,Fine-tune Llama 3.2 1B with LoRA using AI Runtime - Azure Databricks,Fine-tune Llama 3.2 1B with LoRA on Databricks,Fine-tune Llama 3.2 1B with LoRA using supervised fine-tuning and DeepSpeed ZeRO Stage 3 optimization on 8 A10 GPUs with serverless compute.,"Open notebook version of this page This notebook demonstrates how to fine-tune a large language model using supervised fine-tuning (SFT) with Low-Rank Adaptation (LoRA) on Databricks AI Runtime. The notebook uses the Transformers Reinforcement Learning (TRL) library with DeepSpeed ZeRO Stage 3 optimization to efficiently train Llama 3.2 1B on a single node with 8 H100 GPUs. Key concepts: For more information, see AI Runtime.",2026-04-21T18:07:00.000Z,tutorial,integrations,0.68,True,"Notebook-style tutorial with concrete, product-specific code and configuration for integrating Databricks AI Runtime, TRL, DeepSpeed ZeRO Stage 3, and LoRA on a specific GPU setup.
Contains detailed parameter choices and patterns for this integration scenario rather than just conceptual guidance.",updated https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-time-series-gluonts-101,Time series forecasting with GluonTS,Forecasting time series with GluonTS - Azure Databricks,Time series forecasting with GluonTS on Databricks GPU,"Use GluonTS for probabilistic time series forecasting on serverless GPU with deep learning models, evaluation metrics, and checkpoint management.","Open notebook version of this page This notebook demonstrates how to use GluonTS for probabilistic time series forecasting on Databricks serverless GPU compute. GluonTS is a Python library focused on deep learning-based approaches for time series modeling. GluonTS provides a toolkit for forecasting and anomaly detection, with pre-built implementations of state-of-the-art models. It supports both PyTorch and MXNet implementations and includes essential components like neural network architectures, f",2026-03-19T18:35:00.000Z,tutorial,integrations,0.65,True,"Shows how to use GluonTS with Databricks serverless GPU, including integration-specific code, model configuration, and checkpoint management patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-xgboost,Train XGBoost on single GPU,Train XGBoost model on a single GPU - Azure Databricks,Train XGBoost regression model on Databricks GPU,Train XGBoost regression model on a single GPU using AI Runtime (serverless GPU) with GPU-accelerated training and Unity Catalog model checkpointing.,"Open notebook version of this page This notebook demonstrates how to train an XGBoost regression model on a single GPU using Databricks serverless GPU compute. GPU acceleration significantly speeds up model training compared to CPU-based training, especially for large datasets. 
Key concepts covered: For more information, see XGBoost GPU Support and Unity Catalog volumes.",2026-03-26T08:00:00.000Z,tutorial,integrations,0.7,True,"Notebook demonstrates XGBoost with GPU acceleration and Unity Catalog checkpointing, including concrete code and configuration parameters for these integrations.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/guides,Guides,User guides for AI Runtime - Azure Databricks,"User guides, migration, and troubleshooting for AI Runtime","Guides for AI Runtime (serverless GPU) including migration from classic workloads, example notebooks, and troubleshooting.","Important AI Runtime for single-node tasks is in Public Preview. The distributed training API for multi-GPU workloads remains in Beta. This page includes migration information, links to example notebooks, and troubleshooting information.",2026-03-26T08:00:00.000Z,how-to,troubleshooting,0.65,True,"Aggregates migration and troubleshooting information; such pages typically link or embed specific error patterns and resolutions for AI Runtime, which are product-specific troubleshooting details.",unchanged @@ -1567,23 +1589,28 @@ https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl/regre https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl/regression-train-api,Regression with the API,Train regression models with AutoML Python API - Azure Databricks,Use AutoML Python API for regression on Databricks,Learn how to train regression models using AutoML with the Python API. The API provides functions to start classification AutoML runs.,"This article demonstrates how to train a model with AutoML using the AutoML Python API. See AutoML Python API reference for more details. The API provides functions to start classification, regression, and forecasting AutoML runs. Each function call trains a set of models and generates a trial notebook for each model. 
See Requirements for AutoML experiments.",2025-11-21T08:00:00.000Z,concept-article,integrations,0.65,True,API-focused article describing AutoML Python functions to start runs; likely includes method signatures and parameter options specific to Databricks AutoML.,unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/databricks-runtime-ml,Overview,Databricks Runtime for Machine Learning - Azure Databricks,Configure Databricks Runtime for Machine Learning compute,Learn about Databricks Runtime for Machine Learning and how to create a classic compute resource using it.,This page describes the Databricks Runtime for Machine Learning and provides guidance for how to create a classic compute resource that uses it.,2026-02-27T23:28:00.000Z,concept-article,configuration,0.7,True,"Describes Databricks Runtime ML and how to create compute with it; likely includes runtime versions, supported features, and specific configuration options for ML workloads.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/databricks-runtime-ml-maintenance,Databricks Runtime ML maintenance policy,Databricks Runtime ML maintenance policy - Azure Databricks,Understand Databricks Runtime ML library maintenance and support,Learn about the maintenance policy for Databricks Runtime for Machine Learning and the libraries it includes.,"Databricks Runtime ML includes a variety of popular ML and DL libraries. The libraries are updated with each release to include new features and fixes. 
This article describes the supported top-tier libraries, their update cadence and the scenarios for when libraries are deprecated.",2025-12-10T00:00:00.000Z,concept-article,limits-quotas,0.65,True,"Maintenance policy for Runtime ML libraries; typically includes version support windows, update cadence, and deprecation timelines—effectively limits/constraints on supported library versions.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/,Feature Store overview,Databricks Feature Store - Azure Databricks,,"Learn about Feature Store and feature engineering in Unity Catalog. Unity Catalog is your feature store, with feature discovery, governance, lineage, and cross-workspace access.","This page is an overview of capabilities available when you use Databricks Feature Store with Unity Catalog. The Databricks Feature Store provides a central registry for features used in your AI and ML models. Feature tables and models are registered in Unity Catalog, providing built-in governance, lineage, and cross-workspace feature sharing and discovery. With Databricks, the entire model training workflow takes place on a single platform, including: When you use features from the feature stor",2026-02-10T08:00:00.000Z,overview,,0.3,False,"Overview of Databricks Feature Store capabilities in Unity Catalog; primarily conceptual description of registry, governance, and workflow without detailed parameters or limits.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/,Feature Store overview,Databricks Feature Store - Azure Databricks,,"Learn about Feature Store in Unity Catalog. Unity Catalog provides feature discovery, governance, lineage, and cross-workspace access.","This page is an overview of capabilities available when you use Databricks Feature Store with Unity Catalog. The Databricks Feature Store provides a central registry for features used in your AI and ML models. 
Feature tables and models are registered in Unity Catalog, providing built-in governance, lineage, and cross-workspace feature sharing and discovery. With Databricks, the entire model training workflow takes place on a single platform, including: When you use features from Databricks Featu",2026-04-23T17:47:00.000Z,overview,,0.2,False,"Overview of Databricks Feature Store in Unity Catalog describing capabilities (governance, lineage, sharing); no indication of numeric limits, detailed configuration parameters, or troubleshooting mappings.",updated https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/automatic-feature-lookup,Automatic feature lookup (Model Serving),Model Serving with automatic feature lookup - Azure Databricks,Configure Model Serving with automatic feature lookup,Learn how to use Databricks serverless real-time inference and Databricks Feature Store to automatically look up feature values from published online stores.,"Model Serving can automatically look up feature values from a Databricks Online Feature Store or a third-party online store. 
For real-time serving of feature values, Databricks recommends using Databricks Online Feature Stores.",2026-01-20T08:00:00.000Z,how-to,configuration,0.75,True,"Explains how Model Serving automatically looks up features from Databricks or third-party online stores; likely includes endpoint configuration, feature lookup settings, and required metadata.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/concepts,Concepts,Feature store overview and glossary - Azure Databricks,,Learn about feature engineering and feature store concepts.,This page explains how the Databricks Feature Store works and defines important terms.,2026-01-20T08:00:00.000Z,concept-article,,0.25,False,Feature store overview and glossary; conceptual definitions rather than actionable configuration or troubleshooting details.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/concepts,Concepts,Feature Store overview and glossary - Azure Databricks,,Learn about feature engineering and feature store concepts.,This page explains how the Databricks Feature Store works and defines important terms.,2026-04-22T08:00:00.000Z,concept-article,,0.2,False,"Conceptual overview and glossary of feature store concepts without product-specific limits, configs, or error mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/declarative-api-reference,Declarative features API reference,Declarative features API reference - Azure Databricks,Reference for Databricks declarative feature APIs,Reference material for the Declarative Feature Engineering APIs.,Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. 
See Manage Azure Databricks previews.,2026-04-24T23:15:00.000Z,reference,configuration,0.8,True,"API reference page will list specific parameters, options, and allowed values for declarative feature engineering, which are product-specific configuration details.",new +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/declarative-apis,Declarative features,Declarative features - Azure Databricks,Use Databricks declarative feature engineering APIs,"Learn about the Declarative Feature Engineering APIs for defining and computing features using aggregations, column selections, and request-time data.","Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. The Declarative Feature Engineering APIs enable you to define and compute features from data sources. Features can be defined using a variety of sources (Delta table and request-time data) and computations (time-windowed aggregations, simple column selections, and more). This guide covers the following workflows: For API details, see Declarative features",2026-04-24T23:15:00.000Z,how-to,integrations,0.65,True,Describes concrete workflows and API-based patterns for defining and computing features from Delta tables and request-time data; this is product-specific integration/coding guidance beyond generic ML concepts.,new https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/feature-function-serving,Feature Serving endpoints,Feature Serving endpoints - Azure Databricks,Configure and use Databricks Feature Serving endpoints,Feature Serving provides structured data for RAG applications and makes data in the Databricks platform available to applications deployed outside of Databricks.,"Databricks Feature Serving makes data in the Databricks platform available to models or applications deployed outside of Azure Databricks. 
Feature Serving endpoints automatically scale to adjust to real-time traffic and provide a high-availability, low-latency service for serving features. This page describes how to set up and use Feature Serving. For a step-by-step tutorial, see Example: Deploy and query a feature serving endpoint. When you use Mosaic AI Model Serving to serve a model that was b",2026-04-10T21:59:00.000Z,how-to,configuration,0.65,True,"Describes how to set up and use Feature Serving endpoints, which typically involves endpoint configuration options (scaling behavior, request/response schema, possibly feature table bindings). This is product-specific configuration for serving structured data to external apps, not just a conceptual overview.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/feature-serving-tutorial,Deploy a feature serving endpoint,Example: Deploy and query a feature serving endpoint - Azure Databricks,Deploy and query a Databricks feature serving endpoint with SDK,"Tutorial, step-by-step instructions to deploy and query a feature serving endpoint.","This article shows how to deploy and query a feature serving endpoint in a step-by-step process. This article uses the Databricks SDK. Some steps can also be completed using the REST API or the Databricks UI and include references to the documentation for those methods. In this example, you have a table of cities with their locations (latitude and longitude) and a recommender app that takes into account the user's current distance from those cities. 
Because the user's location changes constantly",2026-01-21T08:00:00.000Z,tutorial,deployment,0.7,True,Step-by-step tutorial using the Databricks SDK to deploy and query feature serving endpoints; includes deployment-specific API calls and requirements unique to Databricks Feature Serving.,unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/fs-authentication,Authentication,Authentication for working with third-party online stores - Azure Databricks,Configure authentication for third-party online feature stores,Learn how to provide read and write authentication for online stores with the Databricks Feature Store.,This article describes how to configure authentication for publishing feature tables to third-party online stores and looking up features from third-party online stores.,2026-02-11T08:00:00.000Z,how-to,security,0.8,True,"Specifically about configuring read/write authentication for third-party online stores; likely includes auth methods, scopes, and possibly role/permission details unique to Databricks Feature Store.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/lineage,Lineage and governance,Feature governance and lineage - Azure Databricks,,"Learn about feature table governance. Learn about lineage tracking across features, functions, and models and how to view the lineage graph.","This page describes the governance and lineage capabilities of feature engineering in Unity Catalog. 
For information about monitoring the performance of a served model and changes in feature table data, see Data profiling.",2026-03-12T08:00:00.000Z,how-to,,0.5,False,"Governance and lineage overview; likely conceptual explanation of lineage graphs and monitoring, without specific configuration parameters or limits.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/materialized-features,Materialize declarative features,Materialize declarative features - Azure Databricks,Materialize Databricks declarative features to tables,"Learn about the Databricks Declarative Feature Engineering APIs, which enable you to define and compute time-windowed aggregation features from data sources.","Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. After you have created your declarative feature definitions, which are stored in Unity Catalog, you can produce feature data from your source table using the feature definitions. This process is called materializing your features. Azure Databricks creates and manages Lakeflow Spark Declarative Pipelines to populate tables in Unity Catalog for model trai",2026-04-24T23:15:00.000Z,how-to,configuration,0.7,True,"Describes how Databricks creates and manages Lakeflow Spark Declarative Pipelines and how to configure materialization of features into Unity Catalog tables, involving product-specific settings and behaviors.",new https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/migrate-from-online-tables,Migrate from online tables,Migrate from legacy and third party online tables - Azure Databricks,Migrate legacy and third-party online tables to Lakebase,Learn how to migrate data from an online table to a Lakebase synced table.,This page describes how to migrate your existing online tables. You can migrate to the following: Important Databricks online tables are no longer supported. 
Databricks Online Feature Store (powered by Lakebase) is the recommended approach for online feature serving.,2026-03-31T23:28:00.000Z,upgrade-and-migration-article,decision-making,0.7,True,Migration-focused page describing how to move from legacy/third-party online tables to Lakebase synced tables; includes recommended target options and migration paths.,unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/on-demand-features,On-demand feature computation,On-demand feature computation - Azure Databricks,Implement on-demand feature computation with UDFs,Learn about using on-demand features with Databricks Feature Store. Compute features on demand using Python user-defined functions.,"This article describes how to create and use on-demand features in Azure Databricks. To use on-demand features, your workspace must be enabled for Unity Catalog and you must use Databricks Runtime 13.3 LTS ML or above.",2026-02-04T08:00:00.000Z,how-to,integrations,0.65,True,Shows how to create and use on-demand features via Python UDFs with specific runtime and Unity Catalog requirements; product-specific coding and configuration pattern.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/online-feature-store,Databricks Online Feature Store,Databricks Online Feature Stores - Azure Databricks,Use Databricks Online Feature Stores for low-latency serving,Learn about Databricks Online Feature Stores powered by Databricks Lakebase.,"Databricks Online Feature Stores are a high-performance, scalable solution for serving feature data to online applications and real-time machine learning models. Powered by Databricks Lakebase, Online Feature Stores provide low-latency access to feature data at a high scale while maintaining consistency with your offline feature tables. The primary use cases for Online Feature Stores include: New Online Feature Stores are now created as Lakebase Autoscaling projects. 
For details and differences,",2026-03-12T23:05:00.000Z,how-to,limits-quotas,0.6,True,"Describes high-performance, low-latency online feature serving; such service pages typically include concrete performance/latency characteristics and constraints tied to Lakebase projects.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/online-workflows,Use features in online applications,Use features in online workflows - Azure Databricks,Design online feature workflows with Databricks and third-party stores,"Learn about feature engineering online workflows including Databricks Online Feature Store, on-demand feature calculation, and third-party feature stores.","When you use feature engineering in Unity Catalog, every step of your model development process is integrated into the Databricks Data Intelligence Platform. This means you can build automated data pipelines to compute and serve feature values while Databricks handles the infrastructure for you. The Databricks platform provides real-time serving for both features and models, including on-demand computation of feature values.",2026-01-20T08:00:00.000Z,overview,architecture-patterns,0.6,True,"Discusses online workflows including Databricks Online Feature Store, on-demand computation, and third-party stores; likely provides pattern-level guidance on when to use each approach within Databricks.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/online-feature-store,Databricks Online Feature Store,Databricks Online Feature Stores - Azure Databricks,Architect online feature serving with Databricks,Learn about Databricks Online Feature Stores powered by Databricks Lakebase.,"Databricks Online Feature Stores are a high-performance, scalable solution for serving feature data to online applications and real-time machine learning models. 
Powered by Databricks Lakebase, Online Feature Stores provide low-latency access to feature data at a high scale while maintaining consistency with your offline feature tables. The primary use cases for Online Feature Stores include: New Online Feature Stores are now created as Lakebase Autoscaling projects. For details and differences,",2026-04-23T17:47:00.000Z,how-to,architecture-patterns,0.6,True,"Explains Databricks Online Feature Stores, their primary use cases, and differences with Lakebase autoscaling projects, giving product-specific architectural guidance for online vs offline features.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/online-workflows,Serve features,Use features in online workflows - Azure Databricks,Use Databricks features in online ML workflows,"Learn about Databricks Online Feature Store, on-demand feature calculation, and third-party feature stores.","When you use Databricks Feature Store, every step of your model development process is integrated into the Databricks Data Intelligence Platform. This means you can build automated data pipelines to compute and serve feature values while Databricks handles the infrastructure for you. 
The Databricks platform provides real-time serving for both features and models, including on-demand computation of feature values.",2026-04-23T17:47:00.000Z,overview,integrations,0.6,True,"Describes how to integrate Databricks Feature Store with real-time serving and automated pipelines, including on-demand feature computation; these are concrete integration patterns across services.",new https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/publish-features,Publish features,Publish features to a third-party online store - Azure Databricks,Publish Databricks feature tables to third-party online stores,Learn about publishing features to online stores with Databricks Feature Store.,This article describes how to publish features to a third-party online store for real-time serving. Databricks Feature Store supports these online stores:,2026-02-11T08:00:00.000Z,how-to,integrations,0.8,True,Describes publishing features to supported online stores; likely lists supported systems and includes configuration parameters and API usage for each integration.,unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/python-api,Python API,Python API - Azure Databricks,Configure Databricks Feature Engineering Python client,"Learn about the Databricks Feature Engineering Python API, including working with feature tables and online stores. Install the Feature Engineering client for local testing.","This page provides links to the Python API documentation of Databricks Feature Engineering and Databricks legacy Workspace Feature Store, and information about the client packages databricks-feature-engineering and databricks-feature-store. Note As of version 0.17.0, databricks-feature-store has been deprecated. All existing modules from this package are now available in databricks-feature-engineering version 0.2.0 and later. 
For information about migrating to databricks-feature-engineering, see Migrate ",2025-09-03T08:00:00.000Z,concept-article,configuration,0.65,True,"Discusses specific client packages, versions, and migration between databricks-feature-store and databricks-feature-engineering; likely includes install commands and version constraints that are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/rag,Use features with RAG applications,Example: use features with structured RAG applications - Azure Databricks,,Example notebook for how to set up a structured RAG application using online tables in Databricks.,"Retrieval-augmented generation, or RAG, is one of the most common approaches to building generative AI applications. Feature engineering in Unity Catalog supports structured RAG applications using online tables. You create an online table for the structured data that the RAG application needs and host it on a feature serving endpoint. The RAG application uses the feature serving endpoint to look up relevant data from the online table. The typical steps are as follows: The following notebook illu",2025-09-18T08:00:00.000Z,concept-article,,0.3,False,"Example notebook for structured RAG with feature serving; primarily tutorial-style workflow without clear evidence of product-specific limits, configs tables, or troubleshooting mappings.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/serve-declarative-features,Serve declarative features,Serve declarative features - Azure Databricks,Serve Databricks declarative features via endpoints,"Learn how to serve declarative features using model serving endpoints, including models trained with RequestSource features.","Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
Important Feature Serving endpoints are not supported for Declarative Feature Engineering. To serve features online, deploy a model serving endpoint using a model logged through Unity Catalog. Models that are trained using features from Databricks automatically track lineage to the features they were trained on. When deployed as model serving endpoints,",2026-04-24T23:15:00.000Z,how-to,deployment,0.65,True,"Details constraints (feature serving endpoints not supported) and how to deploy model serving endpoints using models logged through Unity Catalog, which are product-specific deployment behaviors and requirements.",new https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/third-party-online-stores,Third-party online stores,Third-party online stores - Azure Databricks,Integrate third-party online stores with Feature Store,Learn about using third-party online stores such as CosmosDB and DynamoDB with Databricks Feature Store.,"For real-time serving of feature values, Databricks recommends using Databricks Online Feature Stores. This article describes how to work with third-party online stores for real-time serving of feature values. With real-time serving, you publish feature tables to a low-latency database and deploy the model or feature spec to a REST endpoint. Databricks Feature Store also supports automatic feature lookup. 
In this case, the input values provided by the client include values that are only available",2026-02-12T04:25:00.000Z,how-to,integrations,0.7,True,Describes using third-party stores like CosmosDB and DynamoDB for real-time serving and automatic feature lookup; product-specific integration patterns with external services.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/time-series,Point-in-time feature joins,Point-in-time feature joins - Azure Databricks,Implement point-in-time correct feature joins for time series,Learn how to ensure point-in-time correctness for ML model development using time series feature tables.,"This page describes how to use point-in-time correctness to create a training dataset that accurately reflects feature values as of the time a label observation was recorded. This is important to prevent data leakage, which occurs when you use feature values for model training that were not available at the time the label was recorded. This type of error can be hard to detect and can negatively affect the model's performance. Time series feature tables include a timestamp key column that ensures ",2026-03-31T08:00:00.000Z,how-to,best-practices,0.7,True,Focuses on preventing data leakage with time-series feature tables and timestamp keys; contains Databricks-specific guidance and patterns for building correct training datasets.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/train-models-with-feature-store,Use feature tables,Train models with feature tables - Azure Databricks,Train models and run batch inference with Databricks feature tables,Learn how to train models and perform batch inference using Feature Engineering in Unity Catalog or features from the Databricks Workspace Feature Store.,"This article describes how you can train models using Feature Engineering in Unity Catalog or the legacy Workspace Feature Store. 
You must first create a training dataset, which defines the features to use and how to join them. Then, when you train a model, the model retains references to the features. When you train a model using Feature Engineering in Unity Catalog, you can view the model's lineage in Catalog Explorer. Tables and functions that were used to create the model are automatically t",2026-03-27T08:00:00.000Z,how-to,integrations,0.65,True,Explains how to create training datasets from feature tables and retain feature references; likely includes code patterns and APIs specific to Databricks Feature Store integration with ML.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/time-series,Point-in-time feature joins,Point-in-time feature joins - Azure Databricks,Implement point-in-time correct feature joins,Learn how to ensure point-in-time correctness for ML model development using time series feature tables.,"This page describes how to use point-in-time correctness to create a training dataset that accurately reflects feature values as of the time a label observation was recorded. This is important to prevent data leakage, which occurs when you use feature values for model training that were not available at the time the label was recorded. This type of error can be hard to detect and can negatively affect the model's performance. 
Time series feature tables include a timestamp key column that ensures ",2026-04-20T08:00:00.000Z,how-to,best-practices,0.7,True,"Provides specific guidance to avoid data leakage using time-series feature tables and timestamp key columns; this is actionable, product-specific best-practice for Databricks feature engineering.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/train-models-with-feature-store,Use feature tables,Train models with feature tables - Azure Databricks,Train ML models with Databricks feature tables,Learn how to train models and perform batch inference using Feature Engineering in Unity Catalog or features from the Databricks Workspace Feature Store.,"This article describes how you can train models using Feature Engineering in Unity Catalog or the legacy Workspace Feature Store. You must first create a training dataset, which defines the features to use and how to join them. Then, when you train a model, the model retains references to the features. When you train a model using Feature Engineering in Unity Catalog, you can view the model's lineage in Catalog Explorer. Tables and functions that were used to create the model are automatically t",2026-04-20T08:00:00.000Z,how-to,integrations,0.65,True,"Covers concrete patterns for integrating feature tables with model training and batch inference, including lineage behavior in Unity Catalog; these are product-specific coding/integration patterns.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/train-with-declarative-features,Use declarative features,Train models with declarative features - Azure Databricks,Train models using Databricks declarative features,Learn how to train machine learning models using declarative features with point-in-time correct feature computation.,"Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. 
See Manage Azure Databricks previews. This page describes how to use declarative features for model training. For information about defining declarative features, see Declarative features.",2026-04-24T23:15:00.000Z,how-to,integrations,0.65,True,"Describes how to wire declarative features into model training with point-in-time computation, a Databricks-specific integration pattern between feature definitions and ML training.",new https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/troubleshooting-and-limitations,Troubleshooting,Troubleshooting and limitations - Azure Databricks,Troubleshoot Databricks Feature Store issues and limits,Troubleshooting and limitations information about Databricks Feature Store.,,2026-03-18T08:00:00.000Z,troubleshooting,troubleshooting,0.85,True,"Explicit troubleshooting and limitations page; typically includes concrete error symptoms, causes, and resolutions plus product-specific limitations that are not general knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/uc/feature-tables-uc,Feature tables,Feature tables - Azure Databricks,Create and manage feature tables in Unity Catalog,"Learn how to create and work with feature tables in Unity Catalog, including updating, browsing, and controlling access to feature tables.","This page describes how to create and work with feature tables in Unity Catalog. This page applies only to workspaces that are enabled for Unity Catalog. If your workspace is not enabled for Unity Catalog, see Work with feature tables in Workspace Feature Store (legacy). 
For details about the commands and parameters used in the examples on this page, see the Feature Engineering Python API reference.",2026-03-27T08:00:00.000Z,how-to,configuration,0.75,True,"Covers creating and working with feature tables in Unity Catalog; likely includes API calls, parameters, and constraints specific to Databricks Feature Engineering.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/uc/ui-uc,Explore features in Unity Catalog,Explore features in Unity Catalog - Azure Databricks,,"Learn about feature discoverability with Feature Engineering in Unity Catalog. Also, how to search for feature tables and explore more details in Catalog Explorer.","With Feature Engineering in Unity Catalog, all of the benefits of Unity Catalog are available for all feature tables, including the following: You can use any Delta table in Unity Catalog that includes a primary key constraint as a feature table. For information about managing tables in Unity Catalog, including privileges, lineage, and tags, see What is Unity Catalog?. You can also explore and manage feature tables using the Features UI. To access the Features UI, click Features in the sidebar. 
Sel",2026-03-10T23:23:00.000Z,how-to,,0.5,False,"Describes discoverability and UI navigation for feature tables; mostly about using Catalog Explorer and Features UI, not detailed config tables.",unchanged @@ -1593,13 +1620,13 @@ https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-stor https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/workspace-feature-store/feature-tables,Create and manage feature tables (legacy),Work with feature tables in Workspace Feature Store (legacy) - Azure Databricks,Manage Workspace Feature Store feature tables,"Learn how to create and work with feature tables in the Workspace Feature Store in Databricks including how to update, control access, and browse feature tables.","Note This documentation covers the Workspace Feature Store. Databricks recommends using Feature Engineering in Unity Catalog. Workspace Feature Store will be deprecated in the future. For information about working with feature tables in Unity Catalog, see Feature tables. This page describes how to create and work with feature tables in the Workspace Feature Store. Note If your workspace is enabled for Unity Catalog, any table managed by Unity Catalog that has a primary key is automatically a featu",2026-01-20T08:00:00.000Z,how-to,configuration,0.7,True,"How-to for creating, updating, and browsing feature tables; likely includes specific API calls, options, and behaviors unique to Workspace Feature Store, which are configuration-like expert details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/workspace-feature-store/ui,Explore features and lineage (legacy),Explore features and lineage (legacy) - Azure Databricks,,"Learn about feature discoverability and lineage tracking with Databricks Feature Store. Search for feature tables, identify source data and downstream models and jobs.","Note This documentation covers the legacy Workspace Feature Store. 
Databricks recommends using Feature Engineering in Unity Catalog. If your workspace is enabled for Unity Catalog, see Explore features in Unity Catalog for information about feature discovery and lineage. With Databricks Workspace Feature Store, you can: To access the Workspace Feature Store UI, in the sidebar, select Machine Learning > Features. A table lists all of the available feature tables, along with the features in the table ",2024-12-11T18:37:00.000Z,how-to,,0.4,False,Describes UI-based feature discovery and lineage; summary suggests navigation and conceptual capabilities rather than detailed configuration parameters or troubleshooting mappings.,unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/,Foundation Model APIs overview,Databricks Foundation Model APIs - Azure Databricks,Overview and limits of Databricks Foundation Model APIs,"This article provides an overview of the Foundation Model APIs in Databricks. It includes requirements for use, supported models, and limitations.","This article provides an overview of Foundation Model APIs on Azure Databricks. It includes requirements for use, supported models, and limitations.",2025-10-07T08:00:00.000Z,concept-article,limits-quotas,0.7,True,"Overview that explicitly includes requirements and limitations; likely lists numeric limits (context length, rate limits, region support) and model availability tables.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/api-reference,Foundation Model APIs reference,Foundation model REST API reference - Azure Databricks,Use Databricks Foundation Model REST APIs,This article provides API reference details for Databricks Foundation Model APIs.,This article provides general API information for Databricks Foundation Model APIs and the models they support. The Foundation Model APIs are designed to be similar to OpenAI's REST API to make migrating existing projects easier. 
Both the pay-per-token and provisioned throughput endpoints accept the same REST API request format.,2026-04-14T08:00:00.000Z,concept-article,integrations,0.85,True,"An API reference for Foundation Model APIs, including request/response formats and parameters, designed to be similar to OpenAI’s REST API. This is a direct SDK/API parameter reference unique to Databricks, fitting integrations & coding patterns.",updated -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/compliance,Foundation Model APIs compliance,Foundation Model APIs compliance and security - Azure Databricks,Compliance and security profiles for Foundation Model APIs,Learn about compliance standards and security profile support for Databricks Foundation Model APIs.,This article describes the compliance standards and security profile support for Databricks Foundation Model APIs. Databricks Foundation Model APIs support various compliance standards to meet enterprise security and regulatory requirements. The availability of these standards varies by deployment mode: pay-per-token or provisioned throughput.,2026-04-16T08:00:00.000Z,concept-article,security,0.8,True,"Describes compliance standards and security profile support for Foundation Model APIs, varying by deployment mode (pay-per-token vs provisioned throughput). This is product-specific security/compliance configuration and support matrix information.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/api-reference,Foundation Model APIs reference,Foundation model REST API reference - Azure Databricks,Use Databricks Foundation Model REST APIs for model integration,This article provides API reference details for Databricks Foundation Model APIs.,This article provides general API information for Databricks Foundation Model APIs and the models they support. 
The Foundation Model APIs are designed to be similar to OpenAI's REST API to make migrating existing projects easier. Both the pay-per-token and provisioned throughput endpoints accept the same REST API request format.,2026-04-23T08:00:00.000Z,concept-article,integrations,0.85,True,"API reference content includes endpoint paths, parameters, types, defaults, and constraints for Foundation Model APIs, which are detailed integration and coding patterns unique to this product.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/compliance,Foundation Model APIs compliance,Foundation Model APIs compliance and security - Azure Databricks,Apply compliance and security profiles to Databricks Foundation Model APIs,Learn about compliance standards and security profile support for Databricks Foundation Model APIs.,This article describes the compliance standards and security profile support for Databricks Foundation Model APIs. Databricks Foundation Model APIs support various compliance standards to meet enterprise security and regulatory requirements. 
The availability of these standards varies by deployment mode: pay-per-token or provisioned throughput.",2026-04-24T08:00:00.000Z,concept-article,security,0.8,True,"Describes compliance standards and security profile support by deployment mode; likely includes specific security profile names, supported standards per mode, and configuration implications, which are product-specific security details.",updated https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/deploy-prov-throughput-foundation-model-apis,Provisioned throughput Foundation Model APIs overview,Provisioned throughput Foundation Model APIs - Azure Databricks,Deploy provisioned throughput Foundation Model APIs on Databricks,Learn how to deploy Foundation Model APIs with provisioned throughput.,"This article demonstrates how to deploy models using Foundation Model APIs provisioned throughput. Azure Databricks recommends provisioned throughput for production workloads, and it provides optimized inference for foundation models with performance guarantees.",2026-01-09T18:49:00.000Z,concept-article,deployment,0.75,True,"Demonstrates deploying models with provisioned throughput; likely includes SKU/tier options, capacity settings, and deployment constraints specific to Databricks provisioned throughput.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/limits,Limitations,Foundation Model APIs limits and quotas - Azure Databricks,Foundation Model APIs rate limits and quotas,Learn about limits and quotas for Databricks Foundation Model APIs workloads.,"This page describes the limits and quotas for Databricks Foundation Model APIs workloads. Databricks Foundation Model APIs enforce rate limits to ensure reliable performance and fair resource allocation across all users. 
These limits vary based on the workspace platform tier, foundation model type and how you deploy your foundation model.",2026-04-16T08:00:00.000Z,concept-article,limits-quotas,0.95,True,"Explicitly a limits and quotas page for Foundation Model APIs, describing rate limits that vary by workspace tier, model type, and deployment mode. This will contain specific numeric limits and tables, which are classic expert-only quota details.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/limits,Limitations,Foundation Model APIs limits and quotas - Azure Databricks,Understand Databricks Foundation Model APIs limits and quotas,Learn about limits and quotas for Databricks Foundation Model APIs workloads.,"This page describes the limits and quotas for Databricks Foundation Model APIs workloads. Databricks Foundation Model APIs enforce rate limits to ensure reliable performance and fair resource allocation across all users. These limits vary based on the workspace platform tier, foundation model type and how you deploy your foundation model.",2026-04-24T08:00:00.000Z,concept-article,limits-quotas,0.95,True,"Explicitly about limits and quotas; will contain numeric rate limits, token caps, and tier-based tables for different workspace tiers and deployment modes, matching the limits-quotas criteria.",updated https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/model-units,Model units in provisioned throughput,Model units in provisioned throughput - Azure Databricks,Understand model units for provisioned throughput capacity,Learn about model units and how they are used to measure Foundation Model APIs provisioned throughput workload capacity.,"Model units are a unit of throughput which determine how much work your endpoint can handle per minute. When you create a new provisioned throughput endpoint, you specify how many model units to provision for each model served. 
The amount of work required to process each request to your endpoint depends on the size of both the input and the generated output. As the number of input and output tokens increases, the amount of work required to process a request also increases. Generating output token",2026-04-06T18:30:00.000Z,concept-article,limits-quotas,0.8,True,"Explains model units as a throughput measure tied to tokens per minute and work per request. Such pages typically include formulas, numeric mappings (e.g., MU to tokens/min), and capacity thresholds that are not generally known.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/prov-throughput-run-benchmark,Provisioned throughput run your own benchmarking,Conduct your own LLM endpoint benchmarking - Azure Databricks,Benchmark Databricks LLM endpoints for latency and throughput,See a Databricks recommended notebook example for benchmarking an LLM endpoint. Also learn how Databricks performs LLM inference and calculates latency and throughput as endpoint performance metrics.,Warning The topics described in this page apply to provisioned throughput workloads that serve models that provision inference capacity based on tokens per second. The following models apply: See Model units in provisioned throughput for supported models that use model units (not tokens per second) to provision inference capacity. This article provides a Databricks recommended notebook example for benchmarking an LLM endpoint. It also includes a brief introduction to how Databricks performs LLM infer,2026-04-02T08:00:00.000Z,concept-article,best-practices,0.7,True,"Provides a recommended notebook and explains how Databricks measures latency and throughput for provisioned throughput endpoints. 
Likely includes concrete metrics definitions, recommended configurations, and interpretation guidance—product-specific best practices for performance testing.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/supported-models,Supported foundation models,Databricks-hosted foundation models available in Foundation Model APIs - Azure Databricks,Choose Databricks-hosted foundation models and endpoints,This article describes the state-of-the-art foundation models hosted by Databricks.,This article describes the state-of-the-art open models that are supported by Databricks Foundation Model APIs. Note See Supported foundation models on Mosaic AI Model Serving for region availability of these models and the supported feature areas. You can send query requests to these models using the pay-per-token endpoints available in your Databricks workspace. See Use foundation models and pay-per-token supported models table for the names of the model endpoints to use. In addition to supporting mo,2026-04-16T08:00:00.000Z,concept-article,decision-making,0.75,True,"Describes which open models are supported by Databricks Foundation Model APIs, including pay-per-token endpoints and model endpoint names. This is concrete model catalog and endpoint naming information used to choose and call specific models, which is not generally knowable from training data.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/supported-models,Supported foundation models,Databricks-hosted foundation models available in Foundation Model APIs - Azure Databricks,Choose Databricks-hosted foundation models in Foundation Model APIs,This article describes the state-of-the-art foundation models hosted by Databricks.,This article describes the state-of-the-art open models that are supported by Databricks Foundation Model APIs. 
Note See Supported foundation models on Mosaic AI Model Serving for region availability of these models and the supported feature areas. You can send query requests to these models using the pay-per-token endpoints available in your Databricks workspace. See Use foundation models and pay-per-token supported models table for the names of the model endpoints to use. In addition to supporting mo",2026-04-24T08:00:00.000Z,concept-article,decision-making,0.75,True,"Lists Databricks-hosted models with details like capabilities, modalities, and possibly cost or feature support; such tables guide model selection and are product-specific decision-making aids.",updated https://learn.microsoft.com/en-us/azure/databricks/machine-learning/load-data/,Load data for training,Load data for machine learning and deep learning - Azure Databricks,Load and prepare data for ML on Databricks,"Learn about loading data for machine learning and deep learning workflows in Databricks, including preparing data for distributed training.","This section covers information about loading data specifically for ML and DL applications. 
For general information about loading data, see Standard connectors in Lakeflow Connect.",2025-05-09T00:36:00.000Z,concept-article,best-practices,0.7,True,"Guidance on loading data specifically for ML/DL and distributed training; likely includes Databricks-specific recommendations (file formats, partitioning, connectors) beyond generic data loading.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/load-data/ddl-data,Prepare data for distributed training,Prepare data for distributed training - Azure Databricks,Prepare data for distributed ML training on Databricks,Learn how to prepare data for distributed training of machine learning models.,This article describes the methods for preparing data for distributed training: Mosaic Streaming and TFRecords.,2024-09-26T21:15:00.000Z,concept-article,configuration,0.7,True,"Describes methods (Mosaic Streaming, TFRecords) and likely specific data layout/configuration requirements for distributed training.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/load-data/streaming,Load data using Mosaic Streaming,Load data using Mosaic Streaming - Azure Databricks,Use Mosaic Streaming to load Spark data into PyTorch,Learn how to load data using Mosaic Streaming for distributed training and evaluation of machine learning models.,"This article describes how to use Mosaic Streaming to convert data from Apache Spark to a format compatible with PyTorch. Mosaic Streaming is an open source data loading library. It enables single-node or distributed training and evaluation of deep learning models from datasets that are already loaded as Apache Spark DataFrames. Mosaic Streaming primarily supports Mosaic Composer, but also integrates with native PyTorch, PyTorch Lightning, and the TorchDistributor. 
Mosaic Streaming provides a seri",2025-09-03T08:00:00.000Z,concept-article,integrations,0.65,True,"Describes using Mosaic Streaming to convert Spark DataFrames to PyTorch-compatible format; likely includes product-specific APIs, parameters, and integration patterns between Spark and PyTorch.",unchanged @@ -1617,26 +1644,26 @@ https://learn.microsoft.com/en-us/azure/databricks/machine-learning/mlops/mlops- https://learn.microsoft.com/en-us/azure/databricks/machine-learning/mlops/mlops-workflow,MLOps workflows,MLOps workflows on Azure Databricks - Azure Databricks,Implement MLOps workflows on Azure Databricks,Learn the recommended Databricks MLOps workflow to optimize performance and efficiency of your machine learning production systems.,"This article describes how you can use MLOps on the Databricks platform to optimize the performance and long-term efficiency of your machine learning (ML) systems. It includes general recommendations for an MLOps architecture and describes a generalized workflow using the Databricks platform that you can use as a model for your ML development-to-production process. For modifications of this workflow for LLMOps applications, see LLMOps workflows. For more details, see The Big Book of MLOps.",2024-12-18T08:00:00.000Z,best-practice,architecture-patterns,0.63,True,"Provides a Databricks-specific MLOps architecture and workflow, mapping platform components to stages; while high-level, it includes concrete platform patterns unique to Databricks.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-inference/,Deploy models for inference,Deploy models for batch inference and prediction - Azure Databricks,Choose Databricks options for batch model inference,Learn about what Databricks offers for performing batch model inference.,"This article describes what Databricks recommends for batch inference. 
For real-time model serving on Azure Databricks, see Deploy models using Mosaic AI Model Serving.",2026-04-07T21:58:00.000Z,concept-article,decision-making,0.65,True,"Described as what Databricks recommends for batch inference, which implies guidance on selecting among Databricks batch inference options and patterns; this is decision-making content with product-specific recommendations rather than generic ML theory.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/,Overview,Deploy models using Mosaic AI Model Serving - Azure Databricks,,Learn about Mosaic AI Model Serving and what it offers for ML and generative AI model deployments.,"This article describes Mosaic AI Model Serving, the Databricks solution for deploying AI and ML models for real-time serving and batch inference.",2026-01-24T08:00:00.000Z,concept-article,,0.3,False,"Overview of Mosaic AI Model Serving; high-level description of capabilities, not detailed deployment constraints or matrices.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/acceptable-use-models,Applicable model terms,Applicable model terms - Azure Databricks,,Learn about model terms for AI models on Databricks. You are responsible for ensuring compliance with model terms and use policies.,This document lists models in the Databricks Services that are subject to separate model terms. These links are provided for your convenience. You are responsible for ensuring compliance with applicable model terms and any applicable use policies.,2026-04-16T08:00:00.000Z,concept-article,,0.4,False,"Lists models subject to separate model terms and links to their terms/acceptable use policies. 
This is primarily legal/terms-of-use content, not technical limits, configuration, or troubleshooting guidance.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/acceptable-use-models,Applicable model terms,Applicable model terms - Azure Databricks,,Learn about model terms for AI models on Databricks. You are responsible for ensuring compliance with model terms and use policies.,This document lists models in the Databricks Services that are subject to separate model terms. These links are provided for your convenience. You are responsible for ensuring compliance with applicable model terms and any applicable use policies.,2026-04-24T08:00:00.000Z,concept-article,,0.2,False,"Lists applicable model terms and policy links; legal/compliance content without technical limits, configuration, or product-specific operational details.",updated https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/configure-load-test,Load test example,Configure a load test for custom model serving endpoints - Azure Databricks,Configure Locust-based load tests for Databricks serving,Learn how to configure a load test. The example provides guidance on how to set up your environment and run a load test to optimize endpoint performance.,"This article provides a load test notebook example and covers setup requirements, authentication, cluster configuration, and step-by-step instructions for running load tests to optimize endpoint performance. The information in this article and the example files get you started on how to configure a load test for your Mosaic AI Model Serving endpoint on Databricks. For more information about load testing and related concepts, see Load testing for serving endpoints.",2025-09-03T08:00:00.000Z,concept-article,best-practices,0.7,True,"Provides a notebook example and detailed steps for setting up environment, authentication, and cluster configuration for load tests. 
These are concrete, Databricks-specific recommendations and configurations for performance testing, fitting best practices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/create-foundation-model-endpoints,Create generative AI model serving endpoints,Create foundation model serving endpoints - Azure Databricks,Create and call Databricks foundation model endpoints,"Learn how to format scoring requests for your served foundation model, and how to send those requests to the model serving endpoint.","In this article, you learn how to create model serving endpoints that deploy and serve foundation models. Mosaic AI Model Serving supports the following models: Model Serving provides the following options for model serving endpoint creation: For creating endpoints that serve traditional ML or Python models, see Create custom model serving endpoints.",2026-04-07T08:00:00.000Z,concept-article,integrations,0.65,True,"Covers formatting scoring requests and sending them to model serving endpoints. This typically includes request body schemas, parameter names, and endpoint-specific options—product-specific integration and coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/create-manage-serving-endpoints,Create custom model serving endpoints,Create custom model serving endpoints - Azure Databricks,Create and configure custom model serving endpoints,Learn how to create and configure model serving endpoints that serve custom models or custom code.,"This article describes how to create model serving endpoints that serve custom models using Databricks Model Serving. 
Model Serving provides the following options for serving endpoint creation: For creating endpoints that serve generative AI models, see Create foundation model serving endpoints.",2025-10-30T01:25:00.000Z,concept-article,configuration,0.82,True,"Describes how to create and configure endpoints for custom models, including endpoint options; endpoint configuration details are product-specific.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/custom-model-serving-uc-logs,OpenTelemetry for custom model serving endpoints,Persist custom model serving data to Unity Catalog - Azure Databricks,Configure telemetry persistence for Databricks custom model serving,"Configure endpoint telemetry for Databricks Custom Model Serving endpoints to persist logs, traces, and metrics to Unity Catalog tables.","Important This feature is in Beta. It is not automatically enabled for all customers and functionality is subject to change. To request access, contact your Azure Databricks account team. Learn how to configure endpoint telemetry to persist OpenTelemetry logs, traces, and metrics from your custom model serving endpoints to Unity Catalog tables. Use the persisted telemetry data to perform root cause analysis, monitor endpoint health, and meet compliance requirements with standard SQL queries.",2026-04-16T08:00:00.000Z,concept-article,configuration,0.8,True,"Describes configuring endpoint telemetry to persist logs, traces, and metrics to Unity Catalog tables. 
This requires specific configuration parameters, table targets, and possibly schema/setting names that are product-specific expert knowledge.",updated -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/custom-models,Deploy custom models,Custom models overview - Azure Databricks,Configure custom models and compute for Databricks serving,"Learn about logging custom models, compute types, dependencies, and expectations for Mosaic AI Model Serving.","This article describes support for custom models using Mosaic AI Model Serving. It provides details about supported model logging options and compute types, how to package model dependencies for serving, and expectations for endpoint creation and scaling.",2026-01-09T08:00:00.000Z,concept-article,configuration,0.7,True,"Details supported model logging options, compute types, dependency packaging, and scaling expectations. These are product-specific configuration options and constraints (e.g., supported runtimes, instance types) that qualify as configuration expert knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/custom-model-serving-uc-logs,OpenTelemetry for custom model serving endpoints,Persist custom model serving data to Unity Catalog - Azure Databricks,Configure telemetry persistence for Databricks custom model serving,"Configure endpoint telemetry for Databricks Custom Model Serving endpoints to persist logs, traces, and metrics to Unity Catalog tables.","Important This feature is in Beta. It is not automatically enabled for all customers and functionality is subject to change. To request access, contact your Azure Databricks account team. Learn how to configure endpoint telemetry to persist OpenTelemetry logs, traces, and metrics from your custom model serving endpoints to Unity Catalog tables. 
Use the persisted telemetry data to perform root cause analysis, monitor endpoint health, and meet compliance requirements with standard SQL queries.",2026-04-16T08:00:00.000Z,concept-article,configuration,0.8,True,"Describes configuring endpoint telemetry to persist logs, traces, and metrics to Unity Catalog tables. This requires specific configuration parameters, table targets, and possibly schema/setting names that are product-specific expert knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/custom-models,Deploy custom models,Custom models overview - Azure Databricks,Configure custom model serving on Azure Databricks,"Learn about logging custom models, compute types, dependencies, and expectations for Mosaic AI Model Serving.","This article describes support for custom models using Mosaic AI Model Serving. It provides details about supported model logging options and compute types, how to package model dependencies for serving, and expectations for endpoint creation and scaling.",2026-04-22T21:32:00.000Z,concept-article,configuration,0.7,True,"Describes supported model logging options, compute types, dependency packaging, and endpoint behavior for Mosaic AI Model Serving—product-specific configuration details beyond generic knowledge.",updated https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/deploy-custom-python-code,Deploy Python code with Model Serving,Deploy Python code with Model Serving - Azure Databricks,Deploy custom Python logic with Databricks Model Serving,Learn how to deploy Python code with Model Serving.,"This article describes how to deploy your customized Python code with Mosaic AI Model Serving. The example in this article focuses on providing guidance for adding preprocessing and postprocessing logic to your model and deploying it. MLflow's Python function, pyfunc, provides flexibility to deploy any piece of Python code or any Python model. 
The following are example scenarios where you might want to use the guide.",2025-09-03T08:00:00.000Z,concept-article,integrations,0.7,True,"Shows how to deploy Python code using MLflow pyfunc with Databricks Model Serving, including how to structure code, handle preprocessing/postprocessing, and package it. This is a concrete coding pattern/integration between Python/MLflow and Databricks serving.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/enable-model-serving-inference-tables,Enable inference tables on model serving endpoints,Enable inference tables on model serving endpoints using the API - Azure Databricks,Enable inference tables on Databricks endpoints via API,Learn how to enable inference tables to automatically capture and log requests and responses to model serving endpoints.,"Important This feature is in Public Preview. Important This article describes the legacy inference table experience, which is only relevant for certain provisioned throughput and custom model serving endpoints. Databricks recommends AI Gateway-enabled inference tables for its availability on custom model, foundation model, and agent serving endpoints. See Migrating to AI Gateway inference tables for instructions on migrating to AI Gateway-enabled inference tables. 
This article explains how to use the",2026-01-22T13:21:00.000Z,concept-article,configuration,0.78,True,Explains API usage to enable inference tables and related settings; includes specific API parameters and behavior unique to Databricks.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/foundation-model-overview,Foundation models overview,Supported foundation models on Mosaic AI Model Serving - Azure Databricks,Select supported foundation models on Mosaic AI,This article provides an overview of the foundation models that are supported by Mosaic AI Model Serving.,"This article describes the foundation models you can serve using Mosaic AI Model Serving. Foundation models are large, pre-trained neural networks that are trained on both large and broad ranges of data. These models are designed to learn general patterns in language, images, or other data types, and can be fine-tuned for specific tasks with additional training. Your use of certain foundation models is subject to the model's terms and acceptable use policy. See Applicable model terms. Model Servin",2026-04-17T08:00:00.000Z,concept-article,decision-making,0.6,True,"Lists which foundation models are supported by Mosaic AI Model Serving, including region availability and supported feature areas. This is concrete, product-specific selection guidance (which model where) that helps decide what to use; such availability matrices are not inferable from training data alone.",updated -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/function-calling,Function calling on Databricks,Function calling on Azure Databricks - Azure Databricks,Implement function calling on Azure Databricks,"Learn about function calling on Databricks, including what it is, when to use it and how to implement it with your generative AI applications.",This article describes function calling and how to use it as part of your generative AI application workflows. 
Databricks Function Calling is OpenAI-compatible and is only available during model serving as part of Foundation Model APIs and serving endpoints that serve external models.,2026-04-16T08:00:00.000Z,concept-article,integrations,0.7,True,"Covers what function calling is, when to use it, and how to implement it with Databricks generative AI apps. Implementation guidance for Databricks’ OpenAI-compatible function calling will include request schema and parameter names unique to this product integration.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/foundation-model-overview,Foundation models overview,Supported foundation models on Mosaic AI Model Serving - Azure Databricks,Select supported foundation models on Mosaic AI Model Serving,This article provides an overview of the foundation models that are supported by Mosaic AI Model Serving.,"This article describes the foundation models you can serve using Mosaic AI Model Serving. Foundation models are large, pre-trained neural networks that are trained on both large and broad ranges of data. These models are designed to learn general patterns in language, images, or other data types, and can be fine-tuned for specific tasks with additional training. Your use of certain foundation models is subject to the model's terms and acceptable use policy. See Applicable model terms. 
Model Servin",2026-04-24T08:00:00.000Z,concept-article,decision-making,0.6,True,"A page listing supported foundation models typically includes model-specific capabilities, regions, and feature support tables; this helps decide which model to use for a scenario, fitting decision-making with product-specific matrices.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/function-calling,Function calling on Databricks,Function calling on Azure Databricks - Azure Databricks,Implement function calling with Databricks foundation model endpoints,"Learn about function calling on Databricks, including what it is, when to use it and how to implement it with your generative AI applications.",This article describes function calling and how to use it as part of your generative AI application workflows. Databricks Function Calling is OpenAI-compatible and is only available during model serving as part of Foundation Model APIs and serving endpoints that serve external models.,2026-04-24T08:00:00.000Z,concept-article,integrations,0.7,True,"Function calling implementation will show concrete JSON schemas, parameter names, and endpoint usage specific to Databricks’ OpenAI-compatible API, which are integration/coding patterns.",updated https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/glossary,Model Serving concepts,Mosaic AI Model Serving concepts - Azure Databricks,,Definitions of key terms related to model serving.,This page provides definitions of key concepts that are used in Mosaic AI Model Serving for model deployments.,2025-11-25T21:04:00.000Z,concept-article,,0.2,False,"Glossary of model serving concepts; definitions are conceptual and not configuration, limits, or error-resolution content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/inference-tables,Inference tables for monitoring and debugging models,Inference tables for monitoring and debugging models - Azure 
Databricks,Use inference tables to monitor and debug Databricks endpoints,"Learn about inference tables for monitoring model serving endpoints, including what they log and feature limitations.","Important This feature is in Public Preview. Important This article describes the legacy inference table experience, which is only relevant for certain provisioned throughput and custom model serving endpoints. Databricks recommends AI Gateway-enabled inference tables for its availability on custom model, foundation model, and agent serving endpoints. See Migrating to AI Gateway inference tables for instructions on migrating to AI Gateway-enabled inference tables. Note If you are serving a gen AI app",2026-03-04T08:00:00.000Z,concept-article,configuration,0.76,True,"Describes what inference tables log, limitations, and legacy vs AI Gateway-enabled behavior; these are detailed, product-specific monitoring configurations.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/manage-serving-endpoints,Overview,Manage model serving endpoints - Azure Databricks,Configure and manage Azure Databricks model serving endpoints,Learn how to manage model serving endpoints with Mosaic AI Model Serving for model deployment and model inference.,This article describes how to manage model serving endpoints using the Serving UI and REST API. See Serving endpoints in the REST API reference. To create model serving endpoints use one of the following:,2026-04-16T17:44:00.000Z,concept-article,configuration,0.7,True,"Management article for serving endpoints typically includes endpoint state options, REST API fields, and configuration parameters (scaling, traffic routing, versions) that are product-specific and not just conceptual. 
These are concrete settings rather than a high-level overview.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/manage-serving-endpoints,Overview,Manage model serving endpoints - Azure Databricks,Configure and manage Azure Databricks model serving endpoints,Learn how to manage model serving endpoints with Mosaic AI Model Serving for model deployment and model inference.,This article describes how to manage model serving endpoints using the Serving UI and REST API. See Serving endpoints in the REST API reference. To create model serving endpoints use one of the following:,2026-04-16T17:44:00.000Z,concept-article,configuration,0.7,True,"Management article for serving endpoints typically includes endpoint state options, REST API fields, and configuration parameters (scaling, traffic routing, versions) that are product-specific and not just conceptual. These are concrete settings rather than a high-level overview.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/metrics-export-serving-endpoint,Export endpoint health metrics to Prometheus and Datadog,Track and export serving endpoint health metrics to Prometheus and Datadog - Azure Databricks,Configure Databricks serving endpoint health metrics export,"Learn how to set up metric collection for your Model Serving endpoints with the metrics export API and common monitoring systems, Prometheus and Datadog.","This article provides an overview of serving endpoint health metrics and shows how to use the metrics export API to export endpoint metrics to Prometheus and Datadog. Endpoint health metrics measure infrastructure behavior such as latency, request rate, error rate, CPU usage, memory usage, etc. 
This tells you how your serving infrastructure is behaving.",2025-09-03T08:00:00.000Z,concept-article,configuration,0.7,True,"Explains using a metrics export API to send metrics to Prometheus and Datadog, which typically includes metric names, labels, endpoint paths, and configuration parameters unique to Databricks.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/migrate-model-serving,Migrate to Model Serving,Migrate to Model Serving - Azure Databricks,Migrate from legacy MLflow to Mosaic AI Model Serving,Migrate your models from using Legacy MLflow Model Serving to the Mosaic AI Model Serving experience built on serverless compute.,"This article demonstrates how to enable Model Serving in your workspace and switch your models to the Mosaic AI Model Serving experience built on serverless compute. Important Starting August 22, 2025, customers will no longer be able to create new serving endpoints using the Legacy MLflow Model Serving experience. 
On September 15, 2025, the legacy experience will reach end of life and all existing endpoints using this service can no longer be used.",2025-09-03T08:00:00.000Z,concept-article,decision-making,0.7,True,"Migration article between legacy and new serving experiences usually includes timelines, feature differences, and guidance on when/how to switch, which is product-specific decision and migration guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/model-serving-custom-artifacts,Package custom artifacts with Model Serving endpoints,Package custom artifacts for Model Serving - Azure Databricks,Package custom artifacts for Databricks Model Serving,Learn how to make your model's file and artifact dependencies available for model serving on a Model Serving endpoint.,This article describes how to ensure your model's file and artifact dependencies are available on your Deploy models using Mosaic AI Model Serving endpoint.,2025-05-09T19:58:00.000Z,concept-article,configuration,0.8,True,Explains how to make file and artifact dependencies available on serving endpoints; artifact handling is product-specific configuration.,unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/model-serving-debug,Debug Model Serving,Debugging guide for Model Serving - Azure Databricks,Debug common issues in Databricks Model Serving endpoints,Learn how to debug and resolve common issues for Mosaic AI Model Serving.,"This article demonstrates debugging steps for common issues that users might encounter when working with model serving endpoints. Common issues could include errors users encounter when the endpoint fails to initialize or start, build failures related to the container, or problems during the operation or running of the model on the endpoint. Tip: Validate before debugging. Having deployment issues? Start with pre-deployment validation to catch common problems before they occur. 
",2025-12-02T23:00:00.000Z,concept-article,troubleshooting,0.9,True,"Organized around common issues (deployment failures, container build errors, runtime problems) with diagnosis and resolution steps; classic symptom→cause→solution troubleshooting content.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/model-serving-genie-code,Observability with Genie Code,Model serving observability with Genie Code - Azure Databricks,Diagnose Databricks model serving issues with Genie Code,"Learn how to use Genie Code to diagnose issues, analyze performance, and get guidance for your model serving endpoints.","Important This feature is in Public Preview. This article describes how Genie Code can help you diagnose issues, analyze performance, and get guidance for your model serving endpoints.",2026-04-16T17:44:00.000Z,feature-guide,troubleshooting,0.7,True,"Genie Code is described as a tool to diagnose issues and analyze performance for model serving endpoints, which implies mappings from symptoms to diagnostics and guidance. This is product-specific troubleshooting content beyond generic debugging advice.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/model-serving-genie-code,Observability with Genie Code,Model serving observability with Genie Code - Azure Databricks,Diagnose Databricks model serving issues with Genie Code,"Learn how to use Genie Code to diagnose issues, analyze performance, and get guidance for your model serving endpoints.","Important This feature is in Public Preview. This article describes how Genie Code can help you diagnose issues, analyze performance, and get guidance for your model serving endpoints.",2026-04-16T17:44:00.000Z,feature-guide,troubleshooting,0.7,True,"Genie Code is described as a tool to diagnose issues and analyze performance for model serving endpoints, which implies mappings from symptoms to diagnostics and guidance. 
This is product-specific troubleshooting content beyond generic debugging advice.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/model-serving-intro,Mosaic AI Model Serving,Tutorial: Deploy and query a custom model - Azure Databricks,,Learn the overview and basic steps for performing model serving on Databricks.,"This article provides the basic steps for deploying and querying a custom model, that is, a traditional ML model, using Mosaic AI Model Serving. The model must be registered in Unity Catalog or in the workspace model registry. To learn about serving and deploying generative AI models instead, see the following articles:",2025-09-03T08:00:00.000Z,concept-article,,0.3,False,Basic steps for model serving; appears as an overview/tutorial without detailed deployment matrices or limits.,unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/model-serving-limits,Limits and region availability,Model Serving limits and regions - Azure Databricks,Review Azure Databricks model serving limits and regions,"Learn about Azure Databricks model serving limits, quotas, and region availability.",This article summarizes the limitations and region availability for Azure Databricks Model Serving and supported endpoint types.,2026-04-07T08:00:00.000Z,concept-article,limits-quotas,0.9,True,"The page is explicitly about Azure Databricks Model Serving limits and region availability and will list concrete numerical limits/quotas and regional support details that are product- and SKU-specific, matching the limits-quotas criteria.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/model-serving-pre-deployment-validation,Pre-deployment validation for Model Serving,Pre-deployment validation for Model Serving - Azure Databricks,Validate models before Databricks Model Serving deployment,Learn how to validate and test models before deploying to Mosaic AI Model Serving 
endpoints.,The guidance in this article can help you catch issues with your model before waiting for the endpoint deployment process. Databricks recommends going through these validation steps to ensure a better development experience when using model serving.,2025-12-02T23:00:00.000Z,concept-article,best-practices,0.76,True,Provides recommended validation steps to catch issues pre-deployment; these are Databricks-specific validation practices and checks.,unchanged @@ -1646,22 +1673,22 @@ https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-servin https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/private-libraries-model-serving,Use custom libraries with Model Serving endpoints,Use custom Python libraries with Model Serving - Azure Databricks,Use custom and private Python libraries in Model Serving,Learn how to use custom libraries and private packages with Mosaic AI Model Serving to create model deployments with enterprise-grade security.,"In this article, you learn how to include custom libraries or libraries from a private mirror server when you log your model, so that you can use them with Mosaic AI Model Serving model deployments. You should complete the steps detailed in this guide after you have a trained ML model ready to deploy but before you create an Azure Databricks Model Serving endpoint. 
Model development often requires the use of custom Python libraries that contain functions for pre- or post-processing, custom model de",2026-02-20T23:39:00.000Z,concept-article,configuration,0.83,True,Shows how to include custom/private libraries when logging models for serving; involves specific packaging and configuration steps unique to Databricks.,unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/production-optimization,Optimize endpoints for production workloads,Optimize Model Serving endpoints for production - Azure Databricks,Optimize Databricks Model Serving endpoints for production,Optimize Model Serving endpoints for production workloads with high throughput and low latency requirements.,"Learn how to optimize Model Serving endpoints for production workloads that require high throughput, low latency, and reliable performance. Optimization strategies fall into three categories:",2025-11-24T20:02:00.000Z,concept-article,best-practices,0.78,True,"Provides optimization strategies for high-throughput, low-latency workloads; these are Databricks-specific performance recommendations and gotchas.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/provider-native-apis,Provider native APIs,Provider native APIs - Azure Databricks,"Call provider-native OpenAI, Anthropic, and Gemini APIs on Databricks","Learn how to query foundation models using provider native APIs for OpenAI, Anthropic, and Google Gemini on Databricks.",Provider native APIs give you direct access to provider-specific API surfaces when you need features beyond the unified OpenAI-compatible APIs. 
Use native APIs to access the latest provider-specific features or to migrate existing provider SDK code to Azure Databricks.,2026-02-19T23:31:00.000Z,concept-article,integrations,0.8,True,"Shows how to use provider-native APIs through Databricks; includes endpoint names, parameters, and configuration details for each provider within Databricks.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-anthropic-messages,Anthropic Messages API,Query with the Anthropic Messages API - Azure Databricks,Query Databricks Anthropic models with Messages API,Learn how to query foundation models using the Anthropic Messages API and send those requests to a model serving endpoint.,"Important The Anthropic Messages API is only compatible with Anthropic pay per token foundation models and external models. For a unified API that works across all providers, use the Chat Completions API. The Anthropic Messages API provides native Anthropic SDK compatibility for Claude models on Azure Databricks. Use this API when you need Anthropic-specific features or are migrating existing Anthropic SDK code.",2026-04-16T17:44:00.000Z,concept-article,integrations,0.8,True,"Explains how to query foundation models using the Anthropic Messages API against Databricks model serving endpoints, including compatibility constraints (only Anthropic pay-per-token and external models) and migration from Anthropic SDK. 
This is a concrete API integration pattern with provider- and SKU-specific constraints.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-anthropic-messages,Anthropic Messages API,Query with the Anthropic Messages API - Azure Databricks,Query Databricks Anthropic models with Messages API,Learn how to query foundation models using the Anthropic Messages API and send those requests to a model serving endpoint.,"Important The Anthropic Messages API is only compatible with Anthropic pay per token foundation models and external models. For a unified API that works across all providers, use the Chat Completions API. The Anthropic Messages API provides native Anthropic SDK compatibility for Claude models on Azure Databricks. Use this API when you need Anthropic-specific features or are migrating existing Anthropic SDK code.",2026-04-16T17:44:00.000Z,concept-article,integrations,0.8,True,"Explains how to query foundation models using the Anthropic Messages API against Databricks model serving endpoints, including compatibility constraints (only Anthropic pay-per-token and external models) and migration from Anthropic SDK. This is a concrete API integration pattern with provider- and SKU-specific constraints.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-chat-models,Query chat models,Query a chat model - Azure Databricks,Write and send chat model queries on Databricks,"Learn how to write query requests for general purpose and chat foundation models, and how to send those requests to a model serving endpoint.","In this article, you learn how to write query requests for foundation models that are optimized for chat and general purpose tasks and send them to your model serving endpoint. 
The examples in this article apply to querying foundation models that are made available using either:",2026-02-11T08:00:00.000Z,concept-article,integrations,0.7,True,"How-to for constructing query requests to chat/general-purpose models via Databricks model serving endpoints. This typically includes request/response schema, parameter names, and API-specific fields unique to Databricks Foundation Model APIs, which fits integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-embedding-models,Query embedding models,Query an embedding model - Azure Databricks,Query embedding models via Databricks endpoints,"Learn how to write query requests for foundation models for embedding tasks, and how to send those requests to a model serving endpoint.","In this article, you learn how to write query requests for foundation models that are optimized for embeddings tasks and send them to your model serving endpoint. The examples in this article apply to querying foundation models that are made available using either:",2026-02-11T08:00:00.000Z,concept-article,integrations,0.7,True,"Describes how to build and send embedding requests to Databricks model serving endpoints. This usually includes body schema, parameter names, and constraints specific to Databricks embedding APIs, which is product-specific integration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-gemini-api,Google Gemini API,Query with the Google Gemini API - Azure Databricks,Integrate Google Gemini API with Databricks,Learn how to query foundation models using the Google Gemini API and send those requests to a model serving endpoint.,"Important The Google Gemini API is only compatible with Gemini pay per token foundation models and external models. For a unified API that works across all providers, use the Chat Completions API. 
The Google Gemini API provides native Google AI SDK compatibility for Gemini models on Azure Databricks. Use this API when you need Gemini-specific features or are migrating existing Google AI SDK code.",2026-03-31T23:28:00.000Z,concept-article,integrations,0.8,True,"Describes using the Google Gemini API and Google AI SDK against Databricks-hosted Gemini models, including compatibility notes and likely request configuration. This is product-specific integration knowledge between Databricks and Gemini.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-openai-responses,OpenAI Responses API,Query with the OpenAI Responses API - Azure Databricks,Use OpenAI Responses API with Databricks endpoints,Learn how to query foundation models using the OpenAI Responses API and send those requests to a model serving endpoint.,"Important The Responses API is only compatible with OpenAI pay per token foundation models and external models. For a unified API that works across all providers, use the Chat Completions API. The OpenAI Responses API is an alternative to the Chat Completions API that provides additional features for OpenAI models, including custom tools and multi-step workflows.",2026-04-06T08:00:00.000Z,concept-article,integrations,0.7,True,"Page is about querying via the OpenAI Responses API into Databricks model serving. 
Likely includes request/response schema, API parameters, and provider-specific options that are product-specific integration details beyond generic LLM knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-reason-models,Query reasoning models,Query reasoning models - Azure Databricks,Query reasoning models using Foundation Model API,"Learn how to write query requests for foundation models for your served foundation model or external model, and how to send those requests to Foundation Model API","In this article, you learn how to write query requests for foundation models optimized for reasoning tasks, and send them to your Foundation Model API endpoint. Mosaic AI Foundation Model API provides a unified API to interact with all Foundation Models, including reasoning models. Reasoning gives foundation models enhanced capabilities to tackle complex tasks. Some models also provide transparency by revealing their step-by-step thought process before delivering a final answer.",2026-04-16T08:00:00.000Z,concept-article,integrations,0.7,True,"Describes how to write and send requests to reasoning-optimized models via the Foundation Model API. This implies specific API parameters and patterns for reasoning models, which are product-specific integration details.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-openai-responses,OpenAI Responses API,Query with the OpenAI Responses API - Azure Databricks,Query Databricks foundation models using the OpenAI Responses API,Learn how to query foundation models using the OpenAI Responses API and send those requests to a model serving endpoint.,"Important The Responses API is only compatible with OpenAI pay per token foundation models and external models. For a unified API that works across all providers, use the Chat Completions API. 
The OpenAI Responses API is an alternative to the Chat Completions API that provides additional features for OpenAI models, including custom tools and multi-step workflows.",2026-04-24T08:00:00.000Z,concept-article,integrations,0.8,True,"Describes how to use the OpenAI Responses API with Databricks endpoints; will include endpoint URLs, request/response schemas, and parameters specific to Databricks’ integration with OpenAI-compatible APIs.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-reason-models,Query reasoning models,Query reasoning models - Azure Databricks,Query reasoning foundation models using Foundation Model API,"Learn how to write query requests for foundation models for your served foundation model or external model, and how to send those requests to Foundation Model API","In this article, you learn how to write query requests for foundation models optimized for reasoning tasks, and send them to your Foundation Model API endpoint. Mosaic AI Foundation Model API provides a unified API to interact with all Foundation Models, including reasoning models. Reasoning gives foundation models enhanced capabilities to tackle complex tasks. 
Some models also provide transparency by revealing their step-by-step thought process before delivering a final answer.",2026-04-24T08:00:00.000Z,concept-article,integrations,0.7,True,"Describes request formats and parameters for reasoning models; includes API-specific fields and usage patterns (e.g., reasoning configuration) that are product-specific integration details.",updated https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-route-optimization,Query route optimized serving endpoints,Query route-optimized serving endpoints - Azure Databricks,Query route-optimized Databricks serving endpoints,Learn how to query route-optimized model serving endpoints.,This article describes how to fetch the appropriate authentication credentials and URL so you can query your route-optimized model serving or feature serving endpoint.,2025-10-27T08:00:00.000Z,concept-article,integrations,0.76,True,Details how to fetch credentials and URLs and how to call route-optimized endpoints; includes endpoint-specific request patterns and auth details.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-vision-models,Query vision models,Query vision models - Azure Databricks,Query vision foundation models via Databricks endpoints,"Learn how to write query requests for foundation models optimized for vision tasks, and how to send those requests to a model serving endpoint.","In this article, you learn how to write query requests for foundation models optimized for vision tasks, and send them to your model serving endpoint. Mosaic AI Model Serving provides a unified API to understand and analyze images using a variety of foundation models, unlocking powerful multimodal capabilities. 
This functionality is available through select Databricks-hosted models as part of Foundation Model APIs and serving endpoints that serve external models.",2026-04-16T08:00:00.000Z,concept-article,integrations,0.7,True,"Focuses on writing query requests for vision-optimized foundation models and sending them to serving endpoints. Likely includes request body formats, image encoding, and model-specific parameters, which are concrete integration details.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-vision-models,Query vision models,Query vision models - Azure Databricks,Query vision foundation models via Mosaic AI Model Serving,"Learn how to write query requests for foundation models optimized for vision tasks, and how to send those requests to a model serving endpoint.","In this article, you learn how to write query requests for foundation models optimized for vision tasks, and send them to your model serving endpoint. Mosaic AI Model Serving provides a unified API to understand and analyze images using a variety of foundation models, unlocking powerful multimodal capabilities. 
This functionality is available through select Databricks-hosted models as part of Foundation Model APIs and serving endpoints that serve external models.",2026-04-24T08:00:00.000Z,concept-article,integrations,0.7,True,"Covers how to structure requests for vision models; will include model-specific request fields (image encodings, content types, parameter names) and endpoint usage patterns unique to Databricks’ API.",updated https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/route-optimization,Route optimization on serving endpoints,Route optimization on serving endpoints - Azure Databricks,Enable route optimization on Databricks serving endpoints,Learn how to enable route-optimization on your model serving endpoints.,"This article describes how to enable route optimization on your model serving or feature serving endpoints. Route optimized serving endpoints dramatically lower overhead latency and allow for substantial improvements in the throughput supported by your endpoint. Route-optimized endpoints are queried differently from non-route-optimized endpoints, including using a different URL and authentication using OAuth tokens. See Query route-optimized serving endpoints for details.",2025-11-24T20:02:00.000Z,concept-article,configuration,0.74,True,"Explains enabling route-optimized endpoints, changed URLs, and OAuth auth; these are specific configuration and usage details for this feature.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/score-custom-model-endpoints,Query custom model serving endpoints,Query serving endpoints for custom models - Azure Databricks,Send scoring requests to Databricks custom model endpoints,Learn how to format scoring requests for a served custom model and how to send those requests to the endpoint that serves your model.,"In this article, learn how to format scoring requests for your served model, and how to send those requests to the model serving endpoint. 
The guidance is relevant to serving custom models, which Databricks defines as traditional ML models or customized Python models packaged in the MLflow format. Register the models in Unity Catalog or in the workspace model registry. Examples include scikit-learn, XGBoost, PyTorch, and Hugging Face transformer models. See Deploy models using Mosaic AI Model Serv",2026-02-12T04:25:00.000Z,concept-article,integrations,0.7,True,Explains request formatting and how to call custom model endpoints; includes request/response patterns and parameters specific to Databricks Model Serving.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/score-foundation-models,Use foundation models and external models overview,Use foundation models - Azure Databricks,Send query requests to Databricks foundation models,Learn which options are available to write query requests for supported foundation model types and how to send those requests to a model serving endpoint.,"In this article, you learn which options are available to write query requests for foundation models and how to send them to your model serving endpoint. You can query foundation models that are hosted by Databricks and foundation models hosted outside of Databricks. For traditional ML or Python models query requests, see Query serving endpoints for custom models. Mosaic AI Model Serving supports Foundation Models APIs and external models for accessing foundation models. Model Serving uses a unified O",2026-04-16T08:00:00.000Z,concept-article,integrations,0.7,True,"Explains how to write and send query requests to Databricks-hosted and external foundation models via model serving endpoints. 
This typically includes request schema, parameters, and endpoint specifics unique to Databricks Foundation Model APIs, fitting integrations & coding patterns.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/score-foundation-models,Use foundation models and external models overview,Use foundation models - Azure Databricks,Send query requests to Databricks and external foundation models,Learn which options are available to write query requests for supported foundation model types and how to send those requests to a model serving endpoint.,"In this article, you learn which options are available to write query requests for foundation models and how to send them to your model serving endpoint. You can query foundation models that are hosted by Databricks and foundation models hosted outside of Databricks. For traditional ML or Python models query requests, see Query serving endpoints for custom models. Mosaic AI Model Serving supports Foundation Models APIs and external models for accessing foundation models. Model Serving uses a unified O",2026-04-24T08:00:00.000Z,concept-article,integrations,0.7,True,"Explains request options and how to send them to endpoints; likely includes request schema, parameter names, and endpoint URLs for Databricks-hosted and external models, which are integration-specific API patterns.",updated https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/serve-multiple-models-to-serving-endpoint,Serve multiple models to a Model Serving endpoint,Serve multiple models to a model serving endpoint - Azure Databricks,Configure multiple models and traffic splitting on Databricks endpoints,Learn how to serve multiple models to a Model Serving endpoint and configure the traffic split between them.,"This article describes how to programmatically configure a model serving endpoint to serve multiple models and the traffic split between them. 
Serving multiple models from a single endpoint enables you to split traffic between different models to compare their performance and facilitate A/B testing. You can also serve different versions of a model at the same time, which makes experimenting with new versions easier, while keeping the current version in production. You can serve any of the follow",2025-09-03T08:00:00.000Z,concept-article,configuration,0.7,True,"Describes how to programmatically configure multiple models and traffic split on a single endpoint, which implies endpoint JSON schema, parameter names, and valid value ranges that are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/serverless-optimized-deployments,Serverless optimized deployments for custom models,Serverless optimized deployments for model serving endpoints - Azure Databricks,Use serverless optimized deployments for model serving,Learn how to use serverless optimized deployments to speed up the model serving deployment process.,This article describes how to use serverless optimized deployments on your model serving endpoints. 
Serverless optimized deployments dramatically lower deployment times and keep the model serving environment the same as the model training environment.,2025-12-29T08:00:00.000Z,concept-article,deployment,0.78,True,Describes serverless optimized deployment behavior and constraints for Databricks endpoints; these deployment mechanics are product-specific.,unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/store-env-variable-model-serving,Configure access to resources from model serving endpoints,Configure access to resources from model serving endpoints - Azure Databricks,Secure external resource access from Databricks endpoints,Learn how to configure access to external and private resources from model serving endpoints with secret-based environment variables.,This article describes how to configure access to external and private resources from model serving endpoints. Model Serving supports plain text environment variables and secrets-based environment variables using Databricks secrets.,2025-10-27T08:00:00.000Z,concept-article,security,0.8,True,"Explains configuring access to external/private resources using environment variables and Databricks secrets. This involves product-specific security configuration patterns (secret scopes, env var usage) and is clearly in the security/identity configuration domain.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/structured-outputs,Structured outputs on Databricks,Structured outputs on Azure Databricks - Azure Databricks,Use structured outputs with Databricks LLM endpoints,"Learn about structured outputs on Databricks, including what it is, when to use it and how to implement it with your generative AI applications.",This article describes structured outputs on Azure Databricks and how to use them as part of your generative AI application workflows. 
Structured outputs works with OpenAI models that support structured outputs.,2025-10-28T08:00:00.000Z,concept-article,integrations,0.8,True,"Describes structured outputs support and implementation; likely includes schema definitions, parameters, and code examples specific to Databricks/OpenAI integration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/web-search,Web search on Databricks,Web search on Azure Databricks - Azure Databricks,Use web search grounding with Databricks models,"Learn about web search on Databricks, including how to ground model responses with real-time information using Gemini, OpenAI, and Anthropic foundation models.",This page describes web search on Azure Databricks and how to use it to ground model responses with real-time information from the web. Web search is available for Anthropic models through Model Context Protocol (MCP).,2026-04-16T08:00:00.000Z,concept-article,integrations,0.65,True,"Describes how to ground model responses with real-time web information using Gemini, OpenAI, and Anthropic models, including Anthropic via MCP. This implies concrete configuration and API usage patterns for web search integration that are product-specific.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/web-search,Web search on Databricks,Web search on Azure Databricks - Azure Databricks,Use web search with Databricks foundation models and MCP,"Learn about web search on Databricks, including how to ground model responses with real-time information using Gemini, OpenAI, and Anthropic foundation models.",This page describes web search on Azure Databricks and how to use it to ground model responses with real-time information from the web. 
Web search is available for Anthropic models through Model Context Protocol (MCP).,2026-04-24T08:00:00.000Z,concept-article,integrations,0.65,True,"Explains how to ground responses with real-time web data using Gemini, OpenAI, Anthropic, and MCP; will include provider-specific parameters, configuration flags, and endpoint usage unique to Databricks’ implementation.",updated https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/what-is-load-test,Load test custom model serving endpoints,Load testing for serving endpoints - Azure Databricks,Plan and execute load testing for Databricks endpoints,Learn about load testing for custom model serving endpoints.,"This article walks you through the essential process of load testing your Mosaic AI Model Serving endpoints to ensure they can handle production workloads effectively. It also provides practical examples, real-world analogies, and step-by-step instructions using the Locust load testing framework, to demonstrate how to measure key performance metrics like requests per second and concurrency limits, so you can size your endpoints correctly and confidently deploy models for your business needs. Tip
This is actionable, product-specific performance and capacity planning guidance—best practices for operating Databricks serving in production.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/preprocess-data/transfer-learning-tensorflow,Featurization for transfer learning,Featurization for transfer learning - Azure Databricks,Featurize data for transfer learning with pandas UDFs on Databricks,Learn from an example notebook how to do featurization for transfer learning using pandas UDFs.,This article provides an example of doing featurization for transfer learning using pandas UDFs.,2024-09-12T18:56:00.000Z,concept-article,integrations,0.6,True,"Provides an example notebook using pandas UDFs for transfer learning featurization, which is a Databricks-specific coding pattern.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ray/,Databricks Ray overview,What is Ray on Azure Databricks? - Azure Databricks,,"Run Ray applications on Azure Databricks to simplify scaling Python AI tasks. Benefit from seamless Apache Spark integration, robust data management, governance, and automated workflows.","Ray is an open source framework for scaling Python applications. It includes libraries specific to AI workloads, making it especially suited for developing AI applications. Ray on Azure Databricks lets you run Ray applications while getting all the platform benefits and features of Azure Databricks. With Ray 2.3.0 and above, you can create Ray clusters and run Ray applications on Apache Spark clusters with Azure Databricks. 
For information about getting started with machine learning on Ray, inclu",2025-02-14T08:00:00.000Z,concept-article,,0.3,False,"High-level overview of Ray on Databricks; mostly conceptual description of capabilities without detailed config tables, limits, or error mappings.",unchanged @@ -1675,7 +1702,7 @@ https://learn.microsoft.com/en-us/azure/databricks/machine-learning/reference-so https://learn.microsoft.com/en-us/azure/databricks/machine-learning/reference-solutions/images-etl-inference,ETL and image processing,Reference solution for image applications - Azure Databricks,Implement distributed image inference on Databricks,"Learn how to do distributed image model inference from reference solution notebooks using pandas UDF, PyTorch, and TensorFlow.","Learn how to do distributed image model inference from reference solution notebooks using pandas UDF, PyTorch, and TensorFlow in a common configuration shared by many real-world image applications. This configuration assumes that you store many images in an object store and optionally have continuously arriving new images.",2024-08-09T08:00:00.000Z,concept-article,best-practices,0.68,True,"Provides a reference solution with concrete patterns (pandas UDFs, PyTorch, TensorFlow) for large-scale image inference on Databricks, which are practical, environment-specific best practices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/reference-solutions/natural-language-processing,Natural language processing with Spark NLP,Natural language processing - Azure Databricks,Run NLP workloads with Spark NLP on Databricks,"Learn how to perform natural language processing tasks on Databricks with Spark ML, spark-nlp, and John Snow Labs.","You can perform natural language processing tasks on Azure Databricks using popular open source libraries such as Spark ML and spark-nlp or proprietary libraries through the Azure Databricks partnership with John Snow Labs. 
For examples of NLP with Hugging Face, see Additional resources",2024-03-27T12:07:00.000Z,concept-article,integrations,0.6,True,"Describes how to perform NLP tasks using Spark ML, spark-nlp, and John Snow Labs on Databricks, which are concrete integration patterns between these libraries and the platform.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/retired-models-policy,Gen AI model maintenance policy,Generative AI models maintenance policy - Azure Databricks,Understand Databricks generative AI model maintenance policy,Learn about the Databricks model maintenance policy for the Foundation Model APIs pay-per-token and Foundation Model Fine-tuning offerings.,"This article describes the model maintenance policy for the Foundation Model APIs pay-per-token, Foundation Model APIs provisioned throughput, and Foundation Model Fine-tuning offerings. In order to continue supporting the most state-of-the-art models, Databricks might update supported models or retire older models for these offerings.",2026-03-11T08:00:00.000Z,concept-article,limits-quotas,0.77,True,"Describes which foundation models are supported, how and when they may be updated or retired, and likely includes timelines and behavioral guarantees, which are offering-specific constraints akin to limits/policy details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/serve-data-ai,Overview,Serve data for ML and AI - Azure Databricks,,Learn how to serve data for AI workloads.,"Mosaic AI provides a unified platform for all parts of AI development, including serving data for AI applications.",2025-11-12T08:00:00.000Z,concept-article,,0.3,False,"Very high-level statement about Mosaic AI serving data; appears to be an overview without detailed settings, limits, or patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/serve-data-ai,Overview,Serve data for ML and AI - Azure Databricks,,Learn how to serve data for AI 
workloads.,"Mosaic AI provides a unified platform for all parts of AI development, including serving data for AI applications.",2026-04-23T17:47:00.000Z,concept-article,,0.2,False,"High-level guidance on serving data for AI workloads in Azure Databricks/Mosaic AI; appears to be conceptual/overview without concrete limits, configuration tables, or error-code-based troubleshooting.",updated https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/,Train models overview,Train AI and ML models - Azure Databricks,,Explore examples of training machine learning and deep learning models in Azure Databricks with popular open-source libraries.,"Azure Databricks offers flexible compute solutions tailored to different machine learning needs, ranging from managed cluster runtimes to fully serverless GPU environments.",2026-03-20T17:45:00.000Z,landing-page,,0.2,False,"High-level overview of training models on Databricks with examples; no indication of numeric limits, specific configs, or troubleshooting patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/deep-learning,Deep learning overview,Deep learning - Azure Databricks,,"Learn about training deep learning models in Azure Databricks using PyTorch, Tensorflow, TorchDistributor, and DeepSpeed.","This article gives a brief introduction to using PyTorch, Tensorflow, and distributed training for developing and fine-tuning deep learning models on Azure Databricks. 
It also includes links to pages with example notebooks illustrating how to use those tools.",2026-03-19T08:00:00.000Z,concept-article,,0.2,False,"High-level intro to deep learning options on Databricks with links to notebooks; no detailed configs, limits, or product-specific patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/distributed-training/,Distributed training overview,Distributed training - Azure Databricks,,Learn how to perform distributed training of machine learning models.,"When possible, Azure Databricks recommends that you train neural networks on a single machine; distributed code for training and inference is more complex than single-machine code and slower due to communication overhead. However, you should consider distributed training and inference if your model or your data are too large to fit in memory on a single machine. For these workloads, Databricks Runtime ML includes the TorchDistributor, DeepSpeed distributor and Ray packages. Azure Databricks also",2026-03-19T08:00:00.000Z,concept-article,,0.3,False,"Conceptual guidance on when to use distributed training; summary suggests no concrete thresholds, configs, or decision matrices.",unchanged @@ -1686,7 +1713,7 @@ https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/ https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/huggingface/,Hugging Face overview,What are Hugging Face Transformers? - Azure Databricks,,This article provides an introduction to Hugging Face Transformers on Azure Databricks. It includes guidance on why to use Hugging Face Transformers and how to install it on your cluster.,This article provides an introduction to Hugging Face Transformers on Azure Databricks. 
It includes guidance on why to use Hugging Face Transformers and how to install it on your cluster.,2024-11-07T23:52:00.000Z,concept-article,,0.35,False,Introductory overview of Hugging Face Transformers on Databricks and installation; mostly conceptual and basic setup without detailed config matrices or product-specific constraints.,unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/huggingface/fine-tune-model,Fine-tune a Hugging Face model,Fine-tune Hugging Face models for a single GPU - Azure Databricks,Fine-tune Hugging Face models on a single GPU in Databricks,Learn how to fine-tune a natural language processing model with Hugging Face Transformers on a single node GPU.,"This article describes how to fine-tune a Hugging Face model with the Hugging Face transformers library on a single GPU. It also includes Databricks-specific recommendations for loading data from the lakehouse and logging models to MLflow, which enables you to use and govern your models on Azure Databricks. The Hugging Face transformers library provides the Trainer utility and Auto Model classes that enable loading and fine-tuning Transformers models. 
These tools are available for the following tasks wi",2024-11-06T08:00:00.000Z,concept-article,best-practices,0.74,True,"Includes Databricks-specific recommendations for loading data from the lakehouse and logging models to MLflow during fine-tuning, which are concrete best practices and patterns unique to this environment.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/huggingface/load-data,Prepare data for fine-tuning models,Prepare data for fine tuning Hugging Face models - Azure Databricks,Prepare datasets for Hugging Face fine-tuning on Databricks,Learn how to prepare data for fine-tuning large language models with Hugging Face Transformers.,This article demonstrates how to prepare your data for fine-tuning open source large language models with Hugging Face Transformers and Hugging Face Datasets.,2024-12-17T15:52:00.000Z,concept-article,best-practices,0.62,True,"Shows concrete steps and patterns for preparing data with Hugging Face Datasets on Databricks for LLM fine-tuning, which are practical, product-specific data prep recommendations.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/mllib,MLlib,Use Apache Spark MLlib on Azure Databricks - Azure Databricks,,"Learn how to train machine learning models using the Apache Spark MLlib Pipelines API in Azure Databricks. Classification, regression, and custom transformer examples.","This page provides example notebooks showing how to use MLlib on Azure Databricks. Apache Spark MLlib is the Apache Spark machine learning library consisting of common learning algorithms and utilities, including classification, regression, clustering, collaborative filtering, dimensionality reduction, and underlying optimization primitives. 
For reference information about MLlib features, Azure Databricks recommends the following Apache Spark API references: The pyspark.ml package from Apache Spar",2026-03-19T08:00:00.000Z,how-to,,0.3,False,"Example notebooks for MLlib usage; no indication of Databricks-specific limits, quotas, or detailed configs.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/mllib,MLlib,Use Apache Spark MLlib on Azure Databricks - Azure Databricks,,"Learn how to train machine learning models using the Apache Spark MLlib Pipelines API in Azure Databricks. Classification, regression, and custom transformer examples.","This page provides example notebooks showing how to use MLlib on Azure Databricks. Apache Spark MLlib is the Apache Spark machine learning library consisting of common learning algorithms and utilities, including classification, regression, clustering, collaborative filtering, dimensionality reduction, and underlying optimization primitives. For reference information about MLlib features, Azure Databricks recommends the following Apache Spark API references: The pyspark.ml package from Apache Spar",2026-04-23T08:00:00.000Z,how-to,,0.2,False,"Example notebooks for using Spark MLlib on Databricks; likely generic ML usage patterns rather than Databricks-specific limits, configs, or error mappings.",updated https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/pytorch,PyTorch,PyTorch - Azure Databricks,,Learn how to train machine learning models on single nodes using PyTorch.,"PyTorch project is a Python package that provides GPU accelerated tensor computation and high level functionalities for building deep learning networks. For licensing details, see the PyTorch license doc on GitHub. To monitor and debug your PyTorch models, consider using TensorBoard. PyTorch is included in Databricks Runtime for Machine Learning. If you are using Databricks Runtime, see Install PyTorch for instructions on installing PyTorch. 
Note This is not a comprehensive guide to PyTorch. For more",2026-03-19T08:00:00.000Z,how-to,,0.2,False,"Basic overview of using PyTorch on Databricks; appears tutorial/intro without detailed configuration tables, limits, or troubleshooting content.",unchanged @@ -1696,7 +1723,7 @@ https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/ https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/tensorboard,TensorBoard,TensorBoard - Azure Databricks,Use TensorBoard for ML debugging on Databricks,Learn how to debug machine learning programs using TensorBoard.,"TensorBoard is a suite of visualization tools for debugging, optimizing, and understanding TensorFlow, PyTorch, Hugging Face Transformers, and other machine learning programs.",2024-06-18T22:22:00.000Z,concept-article,integrations,0.6,True,"Explains how to integrate TensorBoard with Databricks ML workloads, which involves product-specific configuration steps.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/tensorflow,TensorFlow,TensorFlow - Azure Databricks,,Learn how to train machine learning models on single nodes using TensorFlow and debug machine learning programs using inline TensorBoard. A 10-minute tutorial notebook shows an example of training mac,"Note The open-source version of TensorFlow is not compatible with the latest CUDA versions. TensorFlow will be removed in the next major Databricks Runtime ML version. Azure Databricks recommends you install your own versions as needed. TensorFlow is an open-source framework for machine learning created by Google. It supports deep-learning and general numerical computations on CPUs, GPUs, and clusters of GPUs. It is subject to the terms and conditions of the Apache License 2.0. 
Databricks Runtime M",2026-03-19T08:00:00.000Z,how-to,,0.4,False,TensorFlow usage and deprecation note; summary does not show detailed config parameters or numeric constraints.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/training-examples,Overview,Model training examples - Azure Databricks,,Explore examples of training machine learning and deep learning models in Azure Databricks with popular open-source libraries.,"This section includes examples showing how to train machine learning models on Azure Databricks using many popular open-source libraries. You can also use AutoML, which automatically prepares a dataset for model training, performs a set of trials using open-source libraries such as scikit-learn and XGBoost, and creates a Python notebook with the source code for each trial run so you can review, reproduce, and modify the code.",2026-04-15T22:08:00.000Z,overview,,0.2,False,"High-level index/overview page that links to various training examples and mentions AutoML conceptually. It does not itself expose concrete configuration tables, limits, error codes, or detailed product-specific patterns; it mainly describes what kinds of examples exist.",updated +https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/training-examples,Overview,Model training examples - Azure Databricks,,Explore examples of training machine learning and deep learning models in Azure Databricks with popular open-source libraries.,"This section includes examples showing how to train machine learning models on Azure Databricks using many popular open-source libraries. 
You can also use AutoML, which automatically prepares a dataset for model training, performs a set of trials using open-source libraries such as scikit-learn and XGBoost, and creates a Python notebook with the source code for each trial run so you can review, reproduce, and modify the code.",2026-04-15T22:08:00.000Z,overview,,0.2,False,"High-level index/overview page that links to various training examples and mentions AutoML conceptually. It does not itself expose concrete configuration tables, limits, error codes, or detailed product-specific patterns; it mainly describes what kinds of examples exist.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/xgboost,Use XGBoost,Use XGBoost on Azure Databricks - Azure Databricks,Train XGBoost models on Azure Databricks,"Learn how to train machine learning models using XGBoost in Azure Databricks. Examples of single-node and distributed training using Python, PySpark, and Scala.",This article provides examples of training machine learning models using XGBoost in Azure Databricks. Databricks Runtime for Machine Learning includes XGBoost libraries for both Python and Scala. You can train XGBoost models on an individual machine or in a distributed fashion.,2026-03-19T08:00:00.000Z,how-to,integrations,0.65,True,"Provides examples of single-node and distributed XGBoost training in Python, PySpark, and Scala on Databricks; such examples include cluster configs, library usage, and integration patterns specific to Databricks.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/xgboost-scala,Distributed training of XGBoost models using Scala,Distributed training of XGBoost models using Scala - Azure Databricks,Integrate XGBoost with Spark ML pipelines in Scala,"Learn how to use distributed training for XGBoost models in Databricks with Scala. 
Specifically, how to embed an XGBoost model into an MLlib ML pipeline and how to use MLlib cross validation to tune a",The example notebooks show how you can use XGBoost with MLlib: Get notebook Get notebook,2024-03-01T08:00:00.000Z,concept-article,integrations,0.6,True,"Shows how to embed XGBoost models into MLlib pipelines and use cross-validation, which are Databricks-specific integration patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/xgboost-spark,Distributed training of XGBoost models using xgboost.spark,Distributed training of XGBoost models using xgboost.spark - Azure Databricks,Use xgboost.spark for distributed XGBoost on Databricks,Learn how to use distributed training for XGBoost models in Databricks using the Python package xgboost.spark.,"Important This feature is in Public Preview. The Python package xgboost>=1.7 contains a new module xgboost.spark. This module includes the xgboost PySpark estimators xgboost.spark.SparkXGBRegressor, xgboost.spark.SparkXGBClassifier, and xgboost.spark.SparkXGBRanker. These new classes support the inclusion of XGBoost estimators in SparkML Pipelines. For API details, see the XGBoost python spark API doc.",2026-01-12T23:16:00.000Z,concept-article,integrations,0.7,True,"Describes new xgboost.spark estimators and their use in SparkML pipelines, which are specific integration APIs and patterns.",unchanged @@ -1713,9 +1740,9 @@ https://learn.microsoft.com/en-us/azure/databricks/marketplace/provider-policies https://learn.microsoft.com/en-us/azure/databricks/migration/,Migrate data apps to Databricks,Migrate data applications to Azure Databricks - Azure Databricks,Plan migration of data applications to Azure Databricks,"Learn how to migrate data applications such as ETL jobs, enterprise data warehouses, ML, data science, and analytics to Azure Databricks.","This article provides an introduction to migrating existing data applications to Azure Databricks. 
Azure Databricks provides a unified approach that lets you work with data from many source systems on a single platform. For an overview of platform capabilities, see What is Azure Databricks?.",2025-10-08T08:00:00.000Z,concept-article,decision-making,0.6,True,"Migration overview with considerations for moving ETL, ML, and analytics workloads, guiding technology selection and migration approach.",unchanged https://learn.microsoft.com/en-us/azure/databricks/migration/etl,Migrate ETL pipelines to Databricks,Migrate ETL pipelines to Azure Databricks - Azure Databricks,Assess options for migrating ETL pipelines to Databricks,"Learn how to scope the level of effort for migrating various extract, transform, load (ETL) pipelines to Azure Databricks.","This article provides an overview of options for migrating extract, transform, load (ETL) pipelines running on other data systems to Azure Databricks. If you are migrating Apache Spark code, see Adapt your existing Apache Spark code for Azure Databricks. For general information about moving from an enterprise data warehouse to a lakehouse, see Migrate your data warehouse to the Databricks lakehouse. For information about moving from Parquet to Delta Lake, see Migrate a Parquet data lake to Delta L",2025-10-08T08:00:00.000Z,concept-article,decision-making,0.65,True,Helps scope effort and choose approaches for migrating ETL pipelines from other systems to Azure Databricks.,unchanged https://learn.microsoft.com/en-us/azure/databricks/migration/parquet-to-delta-lake,Migrate Parquet to Delta Lake,Migrate a Parquet data lake to Delta Lake - Azure Databricks,Choose a migration path from Parquet to Delta Lake,"Learn what to consider before migrating a Parquet data lake to Delta Lake on Azure Databricks, as well as the four Databricks recommended migration paths to do so.",This article provides recommendations for converting an existing Parquet data lake to Delta Lake. Delta Lake is the underlying format in the Databricks lakehouse. 
See What is Delta Lake in Azure Databricks?.,2025-10-08T08:00:00.000Z,concept-article,decision-making,0.7,True,"Describes considerations and four recommended migration paths, helping decide how to convert a Parquet lake to Delta Lake.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/migration/spark,Migrate Spark code,Adapt your existing Apache Spark code for Azure Databricks - Azure Databricks,Adapt Apache Spark workloads for Azure Databricks,Learn the Databricks recommended steps to migrate Apache Spark workloads to Azure Databricks.,"This article outlines the required changes to adapt existing Apache Spark workloads to run on Azure Databricks. Whether you're moving to Azure Databricks from an on-premises cluster, custom cloud-based infrastructure, or another enterprise Apache Spark offering, most workloads require only a few changes to get into production. Azure Databricks extends, simplifies, and improves the performance of Apache Spark by introducing custom optimizations, configuring and deploying infrastructure, and maint",2026-04-15T08:00:00.000Z,concept-article,best-practices,0.7,True,"Migration guide with Databricks-specific code and configuration changes (e.g., Spark config, cluster behavior, feature differences) that are unique to Azure Databricks and not just generic Spark concepts.",updated
+https://learn.microsoft.com/en-us/azure/databricks/migration/spark,Migrate Spark code,Adapt your existing Apache Spark code for Azure Databricks - Azure Databricks,Adapt Apache Spark workloads for Azure Databricks,Learn the Databricks recommended steps to migrate Apache Spark workloads to Azure Databricks.,"This article outlines the required changes to adapt existing Apache Spark workloads to run on Azure Databricks. Whether you're moving to Azure Databricks from an on-premises cluster, custom cloud-based infrastructure, or another enterprise Apache Spark offering, most workloads require only a few changes to get into production. 
Azure Databricks extends, simplifies, and improves the performance of Apache Spark by introducing custom optimizations, configuring and deploying infrastructure, and maint",2026-04-15T08:00:00.000Z,concept-article,best-practices,0.7,True,"Migration guide with Databricks-specific code and configuration changes (e.g., Spark config, cluster behavior, feature differences) that are unique to Azure Databricks and not just generic Spark concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/migration/warehouse-to-lakehouse,Migrate data warehouse to lakehouse,Migrate your data warehouse to the Databricks lakehouse - Azure Databricks,Migrate enterprise data warehouses to the Databricks lakehouse,Resources and recommendations for migrating your enterprise data warehouse to the Databricks lakehouse.,"This article describes some of the considerations and caveats to consider as you replace your enterprise data warehouse with the Databricks lakehouse. Most workloads, queries, and dashboards defined in enterprise data warehouses can run with minimal code refactoring once admins have completed the initial data migration and governance configuration. Migrating your data warehousing workloads to Azure Databricks is not about eliminating data warehousing, but rather unifying your data ecosystem. For",2025-10-08T08:00:00.000Z,concept-article,decision-making,0.7,True,"Covers considerations, caveats, and when/how to replace a warehouse with a lakehouse, guiding migration decisions.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/mlflow/,Overview,MLflow on Databricks - Azure Databricks,,"Learn about MLflow on Databricks. MLflow is the largest open source AI engineering platform for agents, LLMs, and ML models.","This article describes how MLflow on Databricks is used to develop high-quality generative AI agents and machine learning models. 
Note If you're just getting started with Azure Databricks, consider trying MLflow on Databricks Free Edition.",2026-03-19T18:35:00.000Z,concept-article,,0.2,False,"High-level overview of MLflow on Databricks; summary indicates conceptual description without detailed configs, limits, or error mappings.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/mlflow/,Overview,MLflow on Databricks - Azure Databricks,,"Learn about MLflow on Databricks. MLflow is the largest open source AI engineering platform for agents, LLMs, and ML models.","This article describes how MLflow on Databricks is used to develop high-quality generative AI agents and machine learning models. Note If you're just getting started with Azure Databricks, consider trying MLflow on Databricks Free Edition.",2026-04-22T08:00:00.000Z,concept-article,,0.2,False,"High-level overview of MLflow on Databricks without specific limits, configuration parameter tables, error codes, or decision matrices; primarily conceptual/introductory content.",updated https://learn.microsoft.com/en-us/azure/databricks/mlflow/build-dashboards,Build dashboards with MLflow metadata in system tables,Build dashboards with MLflow metadata in system tables - Azure Databricks,Build dashboards from MLflow system table metadata,Learn how to build dashboards for experiment runs using system tables data,"Important This feature is in Public Preview. Using MLflow metadata in system tables, you can build dashboards to analyze your MLflow experiments and runs from the entire workspace. 
Using the existing MLflow UI and REST APIs for these tasks would require extensive, time-consuming iteration.",2026-02-25T08:00:00.000Z,how-to,configuration,0.6,True,Uses MLflow metadata in system tables; likely includes table/column names and query patterns that are product-specific configuration details.,unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow/databricks-autologging,Databricks Autologging,Databricks Autologging - Azure Databricks,Configure Databricks Autologging with MLflow,"Learn about Azure Databricks Autologging with MLflow, a no-code solution that extends MLflow's automatic logging solution to track your machine learning training experiments.","This page covers how to customize Databricks Autologging, which automatically captures model parameters, metrics, files, and lineage information when you train models from a variety of popular machine learning libraries. Training sessions are recorded as MLflow tracking runs. Model files are also tracked so you can easily log them to the MLflow Model Registry. Note To enable trace logging for generative AI workloads, MLflow supports OpenAI autologging. The following video shows Databricks Autologgin",2025-09-08T18:32:00.000Z,concept-article,configuration,0.7,True,"Covers Databricks Autologging behavior and customization, including what is automatically logged for each supported library and how to configure it, which is Databricks-specific configuration/integration detail.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow/deployment-job,Deployment Jobs,MLflow 3 deployment jobs - Azure Databricks,Use MLflow 3 deployment jobs for model lifecycle,This article describes deployment jobs in MLflow.,"Important This feature is in Public Preview. Note Deployment jobs do not need to be used with MLflow 3 clients or model tracking, and can be enabled on older, existing models in Unity Catalog. However, it is recommended to use MLflow 3. 
This article describes how to use MLflow deployment jobs as part of your machine learning workflow to manage the full lifecycle of ML models.",2026-03-26T08:00:00.000Z,how-to,deployment,0.7,True,Explains MLflow deployment jobs as part of lifecycle; likely includes product-specific deployment behavior and constraints for Unity Catalog models.,unchanged @@ -1743,8 +1770,8 @@ https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/api-reference,M https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/concepts/,Concepts,Concepts & data model - Azure Databricks,,"Understand the core concepts and data model that power MLflow for GenAI - how experiments, traces, assessments, and evaluation work together.","MLflow for GenAI provides a comprehensive data model designed specifically for developing, evaluating, and monitoring generative AI applications. This page explains the core concepts and how they work together.",2026-02-12T08:00:00.000Z,overview,,0.2,False,"Conceptual explanation of MLflow GenAI data model (experiments, traces, assessments) without product-specific limits, configs, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/,Evaluation & monitoring,Evaluate and monitor AI agents - Azure Databricks,,"Agent evaluation and LLM evaluation with MLflow. Monitor and evaluate AI agents, LLMs, and RAG applications for quality, cost, and latency.","MLflow provides comprehensive agent evaluation and LLM evaluation capabilities to help you measure, improve, and maintain the quality of your AI applications. MLflow supports the entire development lifecycle from testing through production monitoring for LLMs, agents, RAG systems, or other GenAI applications. Evaluating AI agents and LLMs is more complex than traditional ML model evaluation. These applications involve multiple components, multi-turn conversations, and nuanced quality criteria. 
B",2026-03-04T19:15:00.000Z,overview,,0.4,False,"High-level overview of evaluation and monitoring capabilities without detailed configuration tables, limits, or product-specific error/decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/align-judges,Align judges with humans,Align judges with humans - Azure Databricks,Align MLflow LLM judges with human evaluators,Learn how to align LLM judges with human evaluation standards to create domain-specific quality assessments that match your team's expertise.,"Judge alignment teaches LLM judges to match human evaluation standards through systematic feedback. This process transforms generic evaluators into domain-specific experts that understand your unique quality criteria, improving agreement with human assessments by 30 to 50 percent compared to baseline judges. Judge alignment follows a three-step workflow: The system uses Simplified Multi-Bootstrap Aggregation (SIMBA) as the default optimization strategy, leveraging DSPy's implementation to iterat",2026-02-25T08:00:00.000Z,how-to,best-practices,0.65,True,"Describes a concrete three-step workflow and SIMBA-based optimization strategy to align judges with human standards, including quantified impact (30–50% improvement), which is product-specific best-practice guidance.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/archive-traces,Archive traces,Archive traces to a Delta table - Azure Databricks,,Learn how to archive GenAI traces and their assessments to a Unity Catalog Delta table for long-term storage and advanced analysis.,"Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. You can save traces and their associated assessments to a Unity Catalog Delta table for long-term storage and advanced analysis. 
This is useful for building custom dashboards, performing in-depth analytics on trace data, and maintaining a durable record of your application's behavior. Note You must have the necessary permissions to write to the specifie",2026-04-17T18:03:00.000Z,how-to,,0.3,False,"Covers archiving traces to a Unity Catalog Delta table. The summary focuses on purpose and high-level usage; it does not indicate detailed configuration tables, permission matrices, or other specific expert-level reference information.",new
-https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/backfill-scorers,Backfill historical traces,Backfill historical traces with scorers - Azure Databricks,,"Learn how to retroactively apply new or updated scorers to historical traces in MLflow, enabling quality assessment of past production data.","Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. You can retroactively apply new or updated scorers to historical traces. This is useful when you add a new scorer and want to evaluate past production data, or when you update an existing scorer and want to re-evaluate previous traces with the new configuration.",2026-04-17T18:03:00.000Z,how-to,,0.3,False,"Explains retroactively applying scorers to historical traces. The description is conceptual/usage-oriented and does not mention specific configuration parameters, limits, or troubleshooting details that would constitute expert knowledge.",new
+https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/archive-traces,Archive traces,Archive traces to a Delta table - Azure Databricks,,Learn how to archive GenAI traces and their assessments to a Unity Catalog Delta table for long-term storage and advanced analysis.,"Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
You can save traces and their associated assessments to a Unity Catalog Delta table for long-term storage and advanced analysis. This is useful for building custom dashboards, performing in-depth analytics on trace data, and maintaining a durable record of your application's behavior. Note You must have the necessary permissions to write to the specifie",2026-04-17T18:03:00.000Z,how-to,,0.3,False,"Covers archiving traces to a Unity Catalog Delta table. The summary focuses on purpose and high-level usage; it does not indicate detailed configuration tables, permission matrices, or other specific expert-level reference information.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/backfill-scorers,Backfill historical traces,Backfill historical traces with scorers - Azure Databricks,,"Learn how to retroactively apply new or updated scorers to historical traces in MLflow, enabling quality assessment of past production data.","Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. You can retroactively apply new or updated scorers to historical traces. This is useful when you add a new scorer and want to evaluate past production data, or when you update an existing scorer and want to re-evaluate previous traces with the new configuration.",2026-04-17T18:03:00.000Z,how-to,,0.3,False,"Explains retroactively applying scorers to historical traces. 
The description is conceptual/usage-oriented and does not mention specific configuration parameters, limits, or troubleshooting details that would constitute expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/build-eval-dataset,Evaluation datasets,Building MLflow evaluation datasets - Azure Databricks,,"Build MLflow evaluation datasets for your GenAI apps using various methods, including from traces, from scratch, from existing data, or synthetic generation.","To systematically test and improve a GenAI application, you use an evaluation dataset. An evaluation dataset is a selected set of example inputs — either labeled (with known expected outputs) or unlabeled (without ground-truth answers). Evaluation datasets help you improve your app's performance in the following ways: MLflow evaluation datasets are stored in Unity Catalog, which provides built-in versioning, lineage, sharing, and governance.",2026-03-29T08:00:00.000Z,how-to,,0.45,False,"Describes ways to build evaluation datasets; summary does not indicate specific schemas, parameter tables, or numeric thresholds.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/code-based-scorer-examples,Code-based scorer examples,Code-based scorer examples - Azure Databricks,Examples of custom code-based scorers for GenAI,This tutorial provides many examples of custom code-based scorers for MLflow Evaluation for GenAI.,"In MLflow Evaluation for GenAI, custom code-based scorers allow you to define flexible evaluation metrics for your AI agent or application. This set of examples and the companion example notebook illustrate many patterns for using code-based scorers with different options for inputs, outputs, implementation, and error handling. 
The image below illustrates some custom scorers' outputs as metrics in the MLflow UI.",2026-03-10T19:51:00.000Z,tutorial,integrations,0.7,True,"Provides many concrete scorer implementations, input/output signatures, and error-handling patterns—rich product-specific coding patterns for MLflow Evaluation.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/concepts/eval-datasets,Evaluation datasets,Evaluation dataset reference - Azure Databricks,Use MLflow evaluation dataset schema and APIs,"MLflow evaluation datasets - test data management with code examples, patterns, and best practices","This page describes the evaluation dataset schema and includes links to the SDK reference for some of the most frequently used methods and classes. For general information and examples of how to use evaluation datasets, see Evaluate GenAI during development.",2026-03-31T23:28:00.000Z,reference,configuration,0.65,True,Describes evaluation dataset schema and references specific methods/classes; likely includes field names and structures that are product-specific configuration details.,unchanged @@ -1766,8 +1793,8 @@ https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/cu
https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/eval-examples,Evaluation examples,MLflow evaluation examples for GenAI - Azure Databricks,,MLflow evaluation harness common usage patterns for GenAI,"This page presents some common usage patterns for the evaluation harness, including data patterns and predict_fn patterns.",2026-03-18T08:00:00.000Z,reference,,0.4,False,Examples of usage patterns are mentioned but not detailed; summary does not show concrete configuration tables or product-specific constraints.,unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/evaluate-app,Tutorial: Evaluate & improve your app,Tutorial: Evaluate and improve a GenAI application - Azure Databricks,,"Learn how to 
systematically evaluate new versions of your GenAI application in MLflow, assess quality after changes, detect regressions, and iteratively improve your app's quality using `mlflow.genai.","This tutorial shows you how to use evaluation datasets to evaluate quality, identify issues, and iteratively improve a generative AI application. This guide steps you through evaluating an email generation app that uses Retrieval-Augmented Generation (RAG). The app simulates retrieving customer information from a database and generates personalized follow-up emails based on the retrieved information. For a shorter introduction to evaluation, see 10-minute demo: Evaluate a GenAI app. This tutorial",2026-03-16T08:00:00.000Z,tutorial,,0.35,False,"Tutorial for evaluating and improving a GenAI app; primarily step-by-step evaluation workflow rather than reference-style configs, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/evaluate-conversations,Evaluate conversations,Evaluate conversations - Azure Databricks,Evaluate multi-turn conversations with MLflow judges,"Learn how to evaluate multi-turn conversations with MLflow. Assess entire conversation sessions with specialized scorers for conversation completeness, user frustration, and dialogue quality.","Conversation evaluation enables you to assess entire conversation sessions rather than individual turns. This is essential for evaluating conversational AI systems where quality emerges over multiple interactions, such as user frustration patterns, conversation completeness, or overall dialogue coherence. Multi-turn judges can be used both for offline evaluation during development (as described on this page) and for continuous monitoring in production. Note Multi-turn evaluation is experimental. 
T",2026-03-06T19:12:00.000Z,how-to,integrations,0.67,True,"Describes specialized multi-turn scorers for conversation completeness, frustration, and dialogue quality, including how to apply them—MLflow-specific evaluation integration.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/manage-production-scorers,Manage production scorers,Manage production scorers - Azure Databricks,,"Learn how to manage the lifecycle of production scorers in MLflow, including listing, updating, stopping, restarting, and deleting scorers for GenAI quality monitoring.","Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. After you set up production monitoring, you can manage your scorers throughout their lifecycle. This page covers how to list, update, stop, restart, and delete scorers. For the full API parameter reference, see Scorer lifecycle management API reference.",2026-04-17T18:03:00.000Z,how-to,,0.3,False,"Describes lifecycle management of production scorers (list, update, stop, restart, delete). The summary suggests basic API usage and operations, without exposing detailed parameter tables, limits, or error mappings that would qualify as expert knowledge under the defined categories.",new
-https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/production-monitoring,Production monitoring,Monitor GenAI apps in production - Azure Databricks,,"Learn how to set up automated quality monitoring for your GenAI applications in MLflow by scheduling scorers to run on production traces, enabling continuous assessment of application quality.","Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Production monitoring lets you automatically run MLflow 3 scorers on traces from your GenAI apps to continuously assess quality. 
You schedule scorers against an MLflow experiment, and the monitoring service evaluates a configurable sample of incoming traces. Results are attached as feedback to each evaluated trace. Production monitoring includes the fol",2026-04-17T18:03:00.000Z,how-to,,0.3,False,"Appears to be a how-to/feature usage page for scheduling MLflow scorers on GenAI traces. The summary does not indicate presence of numeric limits, config parameter tables, error codes, or other product-specific expert details; likely procedural guidance rather than reference-style expert knowledge.",updated
+https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/manage-production-scorers,Manage production scorers,Manage production scorers - Azure Databricks,,"Learn how to manage the lifecycle of production scorers in MLflow, including listing, updating, stopping, restarting, and deleting scorers for GenAI quality monitoring.","Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. After you set up production monitoring, you can manage your scorers throughout their lifecycle. This page covers how to list, update, stop, restart, and delete scorers. For the full API parameter reference, see Scorer lifecycle management API reference.",2026-04-17T18:03:00.000Z,how-to,,0.3,False,"Describes lifecycle management of production scorers (list, update, stop, restart, delete). 
The summary suggests basic API usage and operations, without exposing detailed parameter tables, limits, or error mappings that would qualify as expert knowledge under the defined categories.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/production-monitoring,Production monitoring,Monitor GenAI apps in production - Azure Databricks,,"Learn how to set up automated quality monitoring for your GenAI applications in MLflow by scheduling scorers to run on production traces, enabling continuous assessment of application quality.","Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Production monitoring lets you automatically run MLflow 3 scorers on traces from your GenAI apps to continuously assess quality. You schedule scorers against an MLflow experiment, and the monitoring service evaluates a configurable sample of incoming traces. Results are attached as feedback to each evaluated trace. Production monitoring includes the fol",2026-04-17T18:03:00.000Z,how-to,,0.3,False,"Appears to be a how-to/feature usage page for scheduling MLflow scorers on GenAI traces. The summary does not indicate presence of numeric limits, config parameter tables, error codes, or other product-specific expert details; likely procedural guidance rather than reference-style expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/third-party-scorers/,Third-party scorers,Third-party scorers - Azure Databricks,,"Use third-party evaluation framework scorers with MLflow to access specialized metrics for RAG, agents, safety, and content quality from DeepEval, RAGAS, Phoenix, TruLens, and Guardrails AI.","MLflow integrates with popular open-source evaluation frameworks so that you can use their specialized metrics as scorers alongside built-in LLM judges and code-based scorers. 
Third-party scorers plug directly into mlflow.genai.evaluate(), giving you access to a broad library of evaluation metrics through a single, unified interface.",2026-03-31T23:28:00.000Z,overview,,0.35,False,"High-level integration overview of third-party scorers; no specific parameter mappings, settings tables, or constraints are evident from the summary.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/third-party-scorers/deep-eval,DeepEval scorers,DeepEval scorers - Azure Databricks,,"Use DeepEval scorers with MLflow to evaluate RAG systems, agents, conversational AI, and safety in your GenAI applications.","DeepEval is a comprehensive evaluation framework for LLM applications that provides metrics for RAG systems, agents, conversational AI, and safety evaluation. MLflow integrates with DeepEval so that you can use DeepEval metrics as scorers.",2026-03-31T23:28:00.000Z,how-to,,0.35,False,"Describes DeepEval integration conceptually; summary lacks concrete configuration parameters, defaults, or SDK-specific details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/third-party-scorers/guardrails,Guardrails AI scorers,Guardrails AI scorers - Azure Databricks,,"Use Guardrails AI scorers with MLflow to validate LLM outputs for toxicity, PII, jailbreak attempts, secrets, and content quality in your GenAI applications.","Guardrails AI is a framework for validating LLM outputs using a community-driven hub of validators for safety, PII detection, content quality, and more. 
MLflow integrates with Guardrails AI so that you can use Guardrails validators as scorers, offering rule-based evaluation without requiring LLM calls.",2026-03-31T23:28:00.000Z,how-to,,0.35,False,"Guardrails AI integration overview; no explicit mention of validator configuration parameters, settings tables, or product-specific constraints.",unchanged @@ -1777,15 +1804,15 @@ https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/getting-started/,Get started,Get started: MLflow 3 for GenAI - Azure Databricks,,"Get started with MLflow for GenAI apps through a quickstart guide covering tracing, evaluation, and human feedback",Open notebook version of this page Get started using MLflow 3 for GenAI on Databricks by:,2026-04-06T18:30:00.000Z,tutorial,,0.3,False,"Getting started quickstart for MLflow 3 for GenAI; likely a step-by-step tutorial rather than a reference of configuration parameters, limits, or troubleshooting mappings. Does not clearly indicate expert-only configuration or error code content from the summary.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/getting-started/connect-environment,Set up your environment,Tutorial: Connect your development environment to MLflow - Azure Databricks,Configure dev environments to connect to MLflow,"Learn how to connect your development environment to MLflow for GenAI application development, whether using a local IDE or Databricks Notebook.",This page shows you how to create an MLflow Experiment and connect your development environment to it. An MLflow Experiment is the container for your GenAI application. Learn more about MLflow Experiments in the Experiment data model concept guide. 
Go to the section relevant to your development environment:,2026-02-19T08:00:00.000Z,tutorial,configuration,0.65,True,"Shows how to connect local IDE or Databricks notebooks to MLflow experiments, likely including specific environment variables and MLflow tracking URI values that are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/getting-started/eval,10-min demo: Evaluation,10-minute demo: Evaluate a GenAI app - Azure Databricks,,Learn how to use MLflow Evaluation to assess and improve your GenAI applications with a toy GenAI app.,"Open notebook version of this page This quickstart guides you through evaluating a GenAI application using MLflow. It uses a simple example: filling in blanks in a sentence template to be funny and child-appropriate, similar to the game Mad Libs. This tutorial takes you through the following steps: For a more detailed tutorial, see Tutorial: Evaluate and improve a GenAI application",2026-03-18T18:06:00.000Z,tutorial,,0.2,False,"Quickstart/tutorial for evaluating a GenAI app with MLflow; from the summary it appears to be a step-by-step example rather than a reference for limits, configuration matrices, error codes, or product-specific decision criteria. No indication of numeric limits, config tables, RBAC roles, or troubleshooting mappings.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/getting-started/genie-code,Genie Code for observability & evaluation,Genie Code for agent observability and evaluation - Azure Databricks,,"Use Genie Code to analyze traces, debug errors, review evaluations, and get instrumentation guidance for your GenAI applications using natural language.","Genie Code provides a natural language interface for understanding, debugging, and improving your GenAI applications within MLflow. 
It has read access to everything in your experiment, from traces, prompts, and datasets to evaluation runs, scorers, and labeling sessions — so you can explore your observability and evaluation data conversationally instead of writing queries or navigating multiple UI pages. To get started, click the Genie Code icon in the top-right of your workspace while viewing a",2026-04-16T17:44:00.000Z,overview,,0.3,False,"High-level description of Genie Code capabilities for observability and evaluation; appears to be a getting-started/UX overview without detailed configuration parameters, limits, or error-code-based troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/getting-started/genie-code,Genie Code for observability & evaluation,Genie Code for agent observability and evaluation - Azure Databricks,,"Use Genie Code to analyze traces, debug errors, review evaluations, and get instrumentation guidance for your GenAI applications using natural language.","Genie Code provides a natural language interface for understanding, debugging, and improving your GenAI applications within MLflow. It has read access to everything in your experiment, from traces, prompts, and datasets to evaluation runs, scorers, and labeling sessions — so you can explore your observability and evaluation data conversationally instead of writing queries or navigating multiple UI pages. 
To get started, click the Genie Code icon in the top-right of your workspace while viewing a",2026-04-16T17:44:00.000Z,overview,,0.3,False,"High-level description of Genie Code capabilities for observability and evaluation; appears to be a getting-started/UX overview without detailed configuration parameters, limits, or error-code-based troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/getting-started/human-feedback,10-min demo: Collect human feedback,10-minute demo: Collect human feedback - Azure Databricks,,"Learn the complete human feedback lifecycle in MLflow - collect end-user feedback, add developer annotations, create expert labeling sessions, and use feedback to evaluate GenAI app quality","Open notebook version of this page This tutorial shows how to collect end-user feedback, add developer annotations, create expert review sessions, and use that feedback to evaluate your GenAI app's quality.",2026-02-19T08:00:00.000Z,tutorial,,0.45,False,"Tutorial demo of human feedback lifecycle; summary does not reveal detailed config options, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/getting-started/tracing/tracing-ide,Tracing in a local IDE,Get started: MLflow Tracing for GenAI (Local IDE) - Azure Databricks,,Learn how to build your first GenAI application using MLflow and a Databricks hosted LLM in your IDE.,"This quickstart helps you integrate your GenAI app with MLflow Tracing if you use a local development environment such as an IDE (VS Code, PyCharm, Cursor or others) or a locally-hosted notebook environment (Jupyter or others). If you use a Databricks Notebook, please use the Databricks Notebook quickstart instead. 
By the end of this tutorial, you will have:",2026-02-19T08:00:00.000Z,get-started,,0.35,False,"Local IDE tracing quickstart; focuses on example integration steps, not exhaustive configuration options or expert troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/getting-started/tracing/tracing-notebook,Tracing in a Databricks notebook,Get started: MLflow Tracing for GenAI (Databricks Notebook) - Azure Databricks,,Learn how to build your first GenAI application using MLflow and a Databricks hosted LLM in a Databricks Notebook.,"Open notebook version of this page This quickstart helps you integrate your GenAI app with MLflow Tracing if you use a Databricks notebook as your development environment. If you use a local IDE, please use the IDE quickstart instead. By the end of this tutorial, you will have:",2026-02-19T08:00:00.000Z,tutorial,,0.35,False,"Notebook quickstart for tracing in Databricks; mainly a guided example rather than reference-style limits, configuration matrices, or error-resolution mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/human-feedback/,Human feedback,Human feedback - Azure Databricks,,"Learn how to collect, manage, and utilize human feedback to improve your GenAI applications with MLflow","Human feedback is essential for building high-quality GenAI applications that meet user expectations. MLflow provides tools and a data model to collect, manage, and utilize feedback from developers, end-users, and domain experts. Human feedback complements automated evaluation. 
It can help you create datasets for automated LLM judges and scorers, and also help keep them aligned with human expert judgment.",2026-02-12T08:00:00.000Z,overview,,0.4,False,"Conceptual overview of human feedback tooling and data model; summary lacks specific schemas, roles, or configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/human-feedback/concepts/labeling-schemas,Labeling schemas,Create and manage labeling schemas - Azure Databricks,,"Understand and use labeling schemas in MLflow to define questions for domain experts in the Review App, enabling structured feedback collection for GenAI app evaluation.","Labeling schemas define the specific questions that domain experts answer when labeling existing traces in the Review App. They structure the feedback collection process, ensuring consistent and relevant information for evaluating your GenAI app. Labeling schemas apply only when using the Review App to label existing traces. They are not used for vibe checks in the Review App Chat UI.",2026-02-18T08:00:00.000Z,how-to,,0.5,False,"Conceptual explanation of labeling schemas; while it defines purpose and scope, the summary does not expose concrete schema field definitions or constraints.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/human-feedback/concepts/labeling-sessions,Labeling sessions,Create and manage labeling sessions - Azure Databricks,,Understand MLflow labeling sessions for collecting expert feedback on GenAI app traces and curating evaluation datasets.,"Labeling sessions provide a structured way to gather feedback from domain experts on the behavior of your GenAI applications. A labeling session is a special type of MLflow run that contains a specific set of traces that you want domain experts to review using the MLflow Review App. The goal of a labeling session is to collect human-generated assessments (labels) on existing MLflow Traces. 
You can capture either Feedback or Expectation data, which can then be used to improve your GenAI app through sy",2026-04-15T08:00:00.000Z,how-to,,0.3,False,"Describes the concept of labeling sessions and their role in collecting human feedback. This is primarily conceptual and workflow-oriented, without concrete limits, configuration parameters, or error-resolution mappings required for expert knowledge classification.",updated +https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/human-feedback/concepts/labeling-sessions,Labeling sessions,Create and manage labeling sessions - Azure Databricks,,Understand MLflow labeling sessions for collecting expert feedback on GenAI app traces and curating evaluation datasets.,"Labeling sessions provide a structured way to gather feedback from domain experts on the behavior of your GenAI applications. A labeling session is a special type of MLflow run that contains a specific set of traces that you want domain experts to review using the MLflow Review App. The goal of a labeling session is to collect human-generated assessments (labels) on existing MLflow Traces. You can capture either Feedback or Expectation data, which can then be used to improve your GenAI app through sy",2026-04-15T08:00:00.000Z,how-to,,0.3,False,"Describes the concept of labeling sessions and their role in collecting human feedback. 
This is primarily conceptual and workflow-oriented, without concrete limits, configuration parameters, or error-resolution mappings required for expert knowledge classification.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/human-feedback/dev-annotations,Label during development,Label during development - Azure Databricks,,Learn how to add quality annotations to GenAI application traces during development,"As a developer building GenAI applications, you need a way to track your observations about the quality of your application's outputs. MLflow Tracing allows you to add feedback or expectations directly to traces during development, giving you a quick way to record quality issues, mark successful examples, or add notes for future reference.",2026-02-12T08:00:00.000Z,how-to,,0.45,False,"Describes labeling during development; summary does not show concrete schema fields, parameter names, or constraints.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/human-feedback/expert-feedback/label-existing-traces,Collect feedback and expectations by labeling existing traces,Collect feedback and expectations by labeling existing traces - Azure Databricks,,Learn how to use MLflow Review App to collect structured feedback from domain experts by having them review and label existing GenAI application traces for quality improvement.,One of the most effective ways to improve your GenAI application is to have domain experts review and label existing traces. 
MLflow's Review App provides a structured process for collecting this expert feedback on real interactions with your application.,2026-02-25T08:00:00.000Z,how-to,,0.45,False,Explains using Review App to label traces; summary does not indicate detailed labeling schema fields or configuration tables.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/human-feedback/expert-feedback/label-existing-traces,Collect feedback and expectations by labeling existing traces,Collect feedback and expectations by labeling existing traces - Azure Databricks,,Learn how to use MLflow Review App to collect structured feedback from domain experts by having them review and label existing GenAI application traces for quality improvement.,One of the most effective ways to improve your GenAI application is to have domain experts review and label existing traces. MLflow's Review App provides a structured process for collecting this expert feedback on real interactions with your application.,2026-04-23T17:47:00.000Z,how-to,,0.2,False,"Tutorial-style guidance on using MLflow Review App to label existing GenAI traces; likely focuses on workflow steps rather than product-specific limits, configuration tables, error codes, or security/decision matrices.",updated https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/human-feedback/expert-feedback/live-app-testing,Vibe check using the Chat UI,Test an app version with the Review App's Chat UI - Azure Databricks,Test GenAI apps with MLflow Review App Chat UI,Learn how to use the MLflow Review App Chat UI to enable domain experts to test and provide feedback on your GenAI applications,"The MLflow Review App includes a built-in chat interface that allows domain experts to interactively test your GenAI application and provide immediate feedback. Use the Chat UI as a way to vibe check your app. Important The Review App Chat UI requires an agent deployed to a Model Serving endpoint. 
It does not currently support agents deployed on Databricks Apps. If you deploy your agent on Databricks Apps, you can still label existing traces for evaluation. Databricks is building review and feedback",2026-02-23T22:28:00.000Z,how-to,integrations,0.65,True,"Describes using the Review App Chat UI, including the requirement for a Model Serving endpoint and current lack of Databricks Apps support—product-specific integration constraints and usage.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/overview/,Overview,MLflow for Generative AI - Azure Databricks,,"Learn how MLflow helps you measure, improve, and monitor quality throughout the GenAI application or agent lifecycle.","Traditional software and ML tests aren't built for GenAI's free-form language, making it difficult for teams to measure and improve quality. MLflow solves this by combining AI-powered metrics that reliably measure GenAI quality with comprehensive trace observability, enabling you to measure, improve, and monitor quality throughout your entire application lifecycle.",2026-02-19T08:00:00.000Z,overview,,0.4,False,"Conceptual overview of how MLflow helps with GenAI quality; does not indicate detailed configuration parameters, limits, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/overview/oss-managed-diff,Open source vs. managed MLflow,Open source vs. managed MLflow on Azure Databricks - Azure Databricks,Choose between open source and Databricks MLflow,Understand the differences between Open Source MLflow and Managed MLflow on Databricks for GenAI applications,This page is meant to help open source MLflow users get familiar with using MLflow on Databricks. 
Databricks-managed MLflow uses the same APIs but provides additional capabilities through integrations with the broader Azure Databricks platform.",2026-03-26T08:00:00.000Z,concept-article,decision-making,0.65,True,"Compares open source MLflow with Databricks-managed MLflow, describing additional platform-specific capabilities and when to use each; this is product-specific decision guidance beyond generic concepts.",unchanged @@ -1866,133 +1893,134 @@ https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/integra new OpenAI Agents SDK. Please consider migrating to the new SDK for the latest features and support. MLflow Tracing provides automatic tracing capability for OpenAI Swarm, a multi-agent framework developed by OpenAI. By enabling auto tracing for OpenAI by calling the mlflow.openai.autolog function, MLflow will capture nested traces and log them to the active MLflow Experiment upon invocation of OpenAI SD",2025-11-21T08:00:00.000Z,concept-article,integrations,0.7,True,Details MLflow integration for OpenAI Swarm via mlflow.openai.autolog and nested trace behavior plus deprecation guidance; specific to this product combo.,unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/integrations/txtai,txtai,Tracing txtai - Azure Databricks,Trace txtai embeddings and LLM workflows with MLflow,Tracing txtai,"txtai is an all-in-one embeddings database for semantic search, LLM orchestration and language model workflows. MLflow Tracing provides automatic tracing capability for txtai. 
Auto tracing for txtai can be enabled by calling the mlflow.autolog function; MLflow will capture traces for LLM invocation, embeddings, vector search, and log them to the active MLflow Experiment.",2025-11-21T19:21:00.000Z,concept-article,integrations,0.76,True,"Explains enabling tracing for txtai via mlflow.autolog and what operations (LLM invocation, embeddings, vector search) are captured—concrete integration pattern.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/migrate-uc-trace-table-prefix,Migrate from older experiments,Migrate traces to the latest Unity Catalog table format - Azure Databricks,Migrate MLflow traces to new Unity Catalog format,Migrate MLflow traces from the older schema-linked Unity Catalog format to the current table-prefix format for improved query performance and annotation support.,"If you configured an MLflow experiment to store traces in Unity Catalog during the Beta release using the schema-linked format (catalog.schema), you can migrate those traces to the table-prefix format (catalog.schema.table_prefix) introduced with the Public Preview release. Databricks recommends the table-prefix format for all new and existing UC trace workloads. 
It provides faster time-range queries, richer attribute types, a dedicated annotations table, and support for multiple trace destination",2026-04-21T18:07:00.000Z,concept-article,configuration,0.7,True,"Covers migration from a schema-linked to a table-prefix Unity Catalog format for MLflow traces, including product-specific schema/format details and recommended configuration patterns that an LLM would not know from training.",new https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/mlflow-mcp,MLflow MCP server,MLflow MCP server - Azure Databricks,Use MLflow MCP server to manage traces,Learn how to use the MLflow MCP server to interact with traces from AI applications and coding assistants.,"The MLflow MCP (Model Context Protocol) server enables AI applications and coding assistants to interact with your traces programmatically. The MLflow MCP server exposes all MLflow trace management operations through the MCP protocol, allowing AI assistants to: For complete documentation on the MLflow MCP server, including installation, configuration, and available tools, see the open source MLflow MCP server documentation.",2026-01-29T02:25:00.000Z,concept-article,integrations,0.75,True,Describes MLflow MCP server operations and tools; product-specific integration interface for interacting with traces via MCP.,unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/,Search & analyze traces,Debug and analyze your app with tracing - Azure Databricks,,"Learn how to view, debug, and observe your application traces within the Databricks MLflow UI and notebooks.","MLflow Tracing provides deep insights into your application's behavior, facilitating a complete debugging experience across different environments. By capturing the complete request-response cycle (Input/Output Tracking) and the execution flow, you can visualize and understand your application's logic and decision-making process. 
Examining the inputs, outputs, and metadata for each intermediate step (for example, retrieval, tool calls, LLM interactions) and associated user feedback or the results ",2026-03-11T08:00:00.000Z,concept-article,,0.45,False,"Explains how to debug and analyze traces conceptually in UI/notebooks; likely more of a usage guide than a reference of configs, limits, or error-code-based troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/access-trace-data,Access trace data,Access trace data - Azure Databricks,Access MLflow trace metadata and spans via SDK,"Learn how to access trace info and data using the MLflow Trace SDK, including metadata, spans, assessments, and more.","This page demonstrates how to access every aspect of trace data including metadata, spans, assessments, and more. Once you learn how to access trace data, see Examples: Analyzing traces. The MLflow Trace object consists of two main components: TraceInfo, metadata about the trace: TraceData, the actual execution data:",2026-01-24T08:00:00.000Z,concept-article,integrations,0.7,True,"Explains the MLflow Trace object structure (TraceInfo, TraceData) and how to access components via SDK; includes product-specific object models and methods.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/analyze-traces,Examples: Trace analysis,Examples: Analyzing traces - Azure Databricks,Analyze GenAI traces for errors and performance,"Learn how to analyze trace data using the MLflow Trace SDK, including patterns for monitoring errors, performance, and user activity.","This page shows patterns for analyzing GenAI traces in real-world scenarios. 
For details on mlflow.search_traces(), see Search traces programmatically.",2026-04-17T08:00:00.000Z,concept-article,best-practices,0.7,True,"Provides concrete patterns and recommended ways to analyze trace data (errors, performance, user activity) using the MLflow Trace SDK. These are actionable, product-specific analysis patterns and gotchas rather than generic observability concepts.",updated -https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/query-dbsql,Query traces - DBSQL,Query MLflow traces using MLflow Databricks SQL - Azure Databricks,,"Query MLflow trace data stored in Unity Catalog using DBSQL views for metadata, tags, assessments, and OpenTelemetry log records.","Important Storing MLflow Traces in Unity Catalog is in Beta. By storing trace data in OpenTelemetry format in Unity Catalog, you can query traces using the MLflow Python SDK, or through Databricks SQL using Unity Catalog tables and views.",2026-04-09T08:00:00.000Z,concept-article,,0.3,False,"From the summary, the page is about querying MLflow traces stored in Unity Catalog using Databricks SQL and views. It likely shows example queries and mentions available tables/views, but there is no clear indication of structured configuration tables, limits, error codes, or decision matrices. Without explicit evidence of such expert details, it is treated as general usage/tutorial content.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/query-via-sdk,Search traces - SDK,Search traces programmatically - Azure Databricks,Query MLflow GenAI traces programmatically via SDK,Complete guide to searching and analyzing MLflow traces - from creating searchable traces to querying them effectively,"Search and analyze traces programmatically using mlflow.search_traces(). This function can query traces stored in the MLflow tracking server, inference tables, or Unity Catalog tables. 
You can select subsets of traces to analyze or to create evaluation datasets.",2026-04-17T18:03:00.000Z,concept-article,integrations,0.78,True,"Focuses on mlflow.search_traces() usage across tracking server, inference tables, and Unity Catalog, which is an SDK/API integration pattern. Contains product-specific function signatures, parameters, and usage patterns for querying traces that go beyond generic SDK knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/analyze-traces,Examples: Trace analysis,Examples: Analyzing traces - Azure Databricks,Analyze GenAI traces for errors and performance,"Learn how to analyze trace data using the MLflow Trace SDK, including patterns for monitoring errors, performance, and user activity.","This page shows patterns for analyzing GenAI traces in real-world scenarios. For details on mlflow.search_traces(), see Search traces programmatically.",2026-04-17T08:00:00.000Z,concept-article,best-practices,0.7,True,"Provides concrete patterns and recommended ways to analyze trace data (errors, performance, user activity) using the MLflow Trace SDK. These are actionable, product-specific analysis patterns and gotchas rather than generic observability concepts.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/query-dbsql,Query traces in Unity Catalog,Query MLflow traces stored in Unity Catalog - Azure Databricks,Query Unity Catalog MLflow traces with Databricks SQL,Query MLflow trace data stored in Unity Catalog,"Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
By storing trace data in OpenTelemetry format in Unity Catalog, you can query traces using the MLflow Python SDK or through Databricks SQL using Unity Catalog tables and views.",2026-04-23T17:47:00.000Z,concept-article,integrations,0.65,True,"Explains how to query MLflow trace data stored in Unity Catalog using Databricks SQL and MLflow SDK, involving product-specific table/view usage and query patterns that are not generic SQL knowledge.",new +https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/query-via-sdk,Search traces - SDK,Search traces programmatically - Azure Databricks,Search and analyze MLflow traces via SDK,Complete guide to searching and analyzing MLflow traces - from creating searchable traces to querying them effectively,"Search and analyze traces programmatically using mlflow.search_traces(). This function can query traces stored in the MLflow tracking server, inference tables, or Unity Catalog tables. You can select subsets of traces to analyze or to create evaluation datasets.",2026-04-22T17:34:00.000Z,how-to,integrations,0.7,True,"Documents programmatic querying of traces using mlflow.search_traces(), including product-specific API behavior and parameters for traces stored in different backends, which constitutes detailed SDK integration knowledge.",updated https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/search-traces-examples,Tutorial: Search traces,Tutorial: Search traces programmatically - Azure Databricks,Example queries using mlflow.search_traces(),Tutorial demonstrating simple examples of searching MLflow traces,"Open notebook version of this page This tutorial provides simple examples to get started with mlflow.search_traces(). 
For details on searching traces, see Search traces programmatically.",2026-01-12T23:16:00.000Z,concept-article,integrations,0.61,True,"Provides concrete code examples and query patterns for mlflow.search_traces() beyond generic SDK usage, illustrating product-specific filters and usage idioms.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/ui-traces,Search traces - UI,View traces in the Databricks MLflow UI - Azure Databricks,Access and view MLflow traces in Databricks UI,"Learn how to view, debug, and observe your application traces within the Databricks MLflow UI and notebooks.","All captured traces are logged to an MLflow Experiment. You can access them through the MLflow UI in your Databricks workspace. Tip Traces are stored and served by the managed MLflow Tracking service in your Databricks workspace when MLFLOW_TRACKING_URI is set to databricks. This production‑ready backend requires no additional hosting. See Trace agents deployed on Databricks. Navigate to your Experiment: Go to the experiment where your traces are logged. For example, the experiment set by mlflow.set_",2026-01-24T08:00:00.000Z,concept-article,configuration,0.7,True,"Mentions MLFLOW_TRACKING_URI set to 'databricks' and describes how traces are stored/served by managed tracking; likely includes specific environment variable values and backend behavior, which are configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/prod-tracing,Deploy on Databricks,Trace agents deployed on Databricks - Azure Databricks,Deploy Databricks agents with automatic MLflow tracing,Deploy GenAI applications on Databricks to capture production traces.,"This page shows how to deploy GenAI applications on Databricks so that production traces are captured automatically. For apps deployed outside Databricks, see Trace agents deployed outside of Databricks. 
You can deploy GenAI applications on Databricks using Mosaic AI Agent Framework (recommended) or custom CPU Model Serving. Regardless of which deployment method you choose, traces are logged to your MLflow experiment for real-time viewing. You can optionally store traces long-term in Delta tables u",2026-03-26T08:00:00.000Z,concept-article,deployment,0.7,True,Describes deploying GenAI apps on Databricks via Mosaic AI Agent Framework or Model Serving with automatic trace capture; includes Databricks-specific deployment methods and constraints for production tracing.,unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/prod-tracing-external,Deploy outside of Databricks,Trace agents deployed outside of Databricks - Azure Databricks,Enable MLflow Tracing for agents deployed outside Databricks,"MLflow Tracing provides comprehensive observability for production GenAI agents deployed outside of Databricks by capturing execution details and sending them to your Databricks workspace, where you c","MLflow Tracing provides comprehensive observability for production GenAI agents deployed outside of Databricks by capturing execution details and sending them to your Databricks workspace, where you can view them in the MLflow UI. This page covers deploying agents outside of Databricks with tracing enabled. If your agent is deployed using Databricks Model Serving, see Deploy with Agent Framework (recommended).",2026-01-05T23:39:00.000Z,concept-article,deployment,0.7,True,"Covers how to configure externally deployed agents to send traces to Databricks, including endpoint configuration and deployment considerations—product-specific deployment patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/span-concepts,Span concepts,Span concepts - Azure Databricks,,Span concepts for MLflow Tracing,"The Span object is a fundamental building block in the Trace data model. 
It serves as a container for information about individual steps of a trace, such as LLM calls, tool execution, retrieval operations, and more. Spans are organized hierarchically in a trace to represent your application's execution flow. Each span captures:",2025-12-10T00:00:00.000Z,concept-article,,0.46,False,Conceptual explanation of span objects and hierarchy; primarily data model description without configuration tables or error-resolution content.,unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/third-party/langfuse,Langfuse,Export Langfuse traces to Azure Databricks MLflow - Azure Databricks,Configure Langfuse to export traces to MLflow,Configure Langfuse to export OpenTelemetry traces to Databricks MLflow for storage in Unity Catalog tables.,"Configure Langfuse to send OpenTelemetry-based trace spans to the Azure Databricks MLflow OTLP endpoint, so that traces are stored in Unity Catalog tables alongside your other MLflow traces. Consolidating traces on Azure Databricks provides the following benefits:",2026-04-09T08:00:00.000Z,concept-article,configuration,0.7,True,"Page explains how to configure Langfuse to send OpenTelemetry traces to the Azure Databricks MLflow OTLP endpoint so traces land in Unity Catalog tables. 
This involves product-specific endpoint/config parameters for integration, which aligns with configuration (and partially integrations), but the primary focus is on concrete config settings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/third-party/otel-span-attributes,OTel GenAI attribute mapping,Set OpenTelemetry span attributes for MLflow - Azure Databricks,Set OpenTelemetry span attributes for Databricks MLflow,"Learn which OpenTelemetry span attributes to set so Databricks MLflow correctly displays span types, inputs, outputs, and token usage in traces.","When you send traces from a custom OpenTelemetry-instrumented (OTel) application to Azure Databricks MLflow, you must set specific span attributes to correctly render your trace data in the MLflow UI. This page shows you which OpenTelemetry GenAI Semantic Convention attributes to set. If you use a pre-built integration such as Langfuse, that integration sets these attributes automatically. This page is for applications with custom OTel instrumentation. Note The attributes in Azure Databricks manag",2026-04-09T08:00:00.000Z,concept-article,configuration,0.85,True,"Page lists specific OpenTelemetry GenAI Semantic Convention span attribute names and how they must be set so Databricks MLflow correctly renders span types, inputs, outputs, and token usage. These are detailed, product-specific configuration parameters (attribute keys/values) that an LLM is unlikely to know, fitting configuration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/trace-unity-catalog,Store traces in Unity Catalog,Store MLflow traces in Unity Catalog - Azure Databricks,Configure MLflow trace storage in Unity Catalog,Store and manage MLflow traces in Unity Catalog tables using OpenTelemetry format for scalable trace data storage and querying.,"Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. 
See Manage Azure Databricks previews. Azure Databricks supports storing MLflow traces in Unity Catalog tables using an OpenTelemetry-compatible format (OTEL). By default, MLflow stores traces organized by experiments in the MLflow control plane service. However, storing traces in Unity Catalog using OTEL format provides the following benefits: Access control is managed through Unity Catalog",2026-04-17T18:03:00.000Z,concept-article,configuration,0.7,True,"Explains how to store MLflow traces in Unity Catalog tables using OTEL format, which typically includes product-specific configuration options (e.g., enabling the feature, specifying catalogs/schemas/tables, and access control behavior). These are concrete configuration details unique to Databricks + MLflow traces.",updated +https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/third-party/otel-span-attributes,OTel GenAI attribute mapping,Set OpenTelemetry span attributes for MLflow - Azure Databricks,Set OpenTelemetry span attributes for Databricks MLflow,"Learn which OpenTelemetry span attributes to set so Databricks MLflow correctly displays span types, inputs, outputs, and token usage in traces.","When you send traces from a custom OpenTelemetry-instrumented (OTel) application to Azure Databricks MLflow, you must set specific span attributes to correctly render your trace data in the MLflow UI. This page shows you which OpenTelemetry GenAI Semantic Convention attributes to set. If you use a pre-built integration such as Langfuse, that integration sets these attributes automatically. This page is for applications with custom OTel instrumentation. 
Note The attributes in Azure Databricks manag",2026-04-21T08:00:00.000Z,concept-article,integrations,0.85,True,"Lists specific OpenTelemetry span attribute names and how they must be set for Databricks MLflow to correctly render span types, inputs, outputs, and token usage—this is a product-specific integration contract with concrete parameter requirements.",updated +https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/trace-unity-catalog,Store traces in Unity Catalog,Store MLflow traces in Unity Catalog - Azure Databricks,Configure MLflow trace storage in Unity Catalog,Store and manage MLflow traces in Unity Catalog tables using OpenTelemetry format for scalable trace data storage and querying.,"Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Azure Databricks supports storing MLflow traces in Unity Catalog tables using an OpenTelemetry-compatible format (OTel). By default, MLflow stores traces organized by experiments in the MLflow control plane service. However, storing traces in Unity Catalog using OTel format provides the following benefits:",2026-04-23T17:47:00.000Z,concept-article,configuration,0.7,True,"Describes how to store MLflow traces in Unity Catalog tables using an OpenTelemetry-compatible format, which requires product-specific configuration details (table layout, storage options, and UC-specific behaviors) that go beyond generic concepts.",updated
For GenAI applications, tracing is essential because these systems involve complex, multi-step workflows with multiple components (LLMs, retrievers, tools, agents) that are difficult to debug without complete vi",2026-03-26T08:00:00.000Z,concept-article,,0.2,False,Explains trace concepts and why tracing matters; conceptual observability overview without product-specific numeric limits or config tables.,unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tutorials/examples/custom-scorers,Optimize prompts using custom scorers,Optimize prompts using custom scorers - Azure Databricks,Create custom MLflow scorers for RAG evaluation,"A practical example of evaluating Retrieval Augmented Generation (RAG) applications with MLflow, focusing on assessing hallucination or groundedness and relevance using predefined judges.","This notebook walks you through how to make custom scorers using MLflow make_judge. Often, built-in scorers and judges don't fit all use cases. Take advantage of custom scorers or judges to ensure you have accurate evaluations to optimize against. The notebook walks you through a markdown judge that optimizes a prompt to output in a more markdown format.",2026-02-25T08:00:00.000Z,tutorial,integrations,0.7,True,Walks through using mlflow.make_judge and custom scorers; includes product-specific API usage and patterns for evaluating hallucination and relevance.,unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tutorials/examples/multi-prompt-optimization,Multi-prompt optimization,Optimize multiple prompts together - Azure Databricks,Optimize chained prompts with MLflow multi-prompt workflows,"Learn how to optimize multiple chained prompts simultaneously using MLflow GenAI Optimization, demonstrating multi-prompt workflows with plan and answer functions.","In complex agent systems, you might have chained multiple prompts together. 
You can provide all these prompts together for GEPA to consider and optimize each prompt.",2026-02-25T08:00:00.000Z,tutorial,integrations,0.65,True,Shows multi-prompt optimization with GEPA across plan/answer prompts; likely includes specific MLflow optimization APIs and configuration patterns.,unchanged https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tutorials/examples/prompt-optimization-quickstart,Prompt optimization quickstart,Optimize prompts tutorial - Azure Databricks,Implement MLflow prompt optimization with GEPA and GPT-OSS,A tutorial for optimizing prompts using MLflow GenAI Prompt Optimization with GEPA that demonstrates classification tasks with GPT-OSS 20B.,This tutorial example optimizes a simple prompt (classify this query) using MLflow Prompt Optimization with GEPA and GPT-OSS 20B for classification tasks.,2026-02-25T08:00:00.000Z,tutorial,integrations,0.65,True,"Tutorial with concrete code integrating MLflow Prompt Optimization, GEPA, and GPT-OSS 20B; includes API usage and parameters specific to this integration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/notebooks/,Notebooks overview,Databricks notebooks - Azure Databricks,,"Overview of Databricks notebooks for data science, machine learning, and collaborative code development.","Notebooks are the primary tool for creating data science and machine learning workflows on Azure Databricks. 
Databricks notebooks provide real-time coauthoring in multiple languages, automatic versioning, and built-in data visualizations for developing code and presenting results.",2026-04-13T08:00:00.000Z,landing-page,,0.2,False,"High-level overview of Databricks notebooks; primarily conceptual and feature description without detailed limits, configs, or troubleshooting mappings.",updated -https://learn.microsoft.com/en-us/azure/databricks/notebooks/basic-editing,Basic editing,Basic editing in Databricks notebooks - Azure Databricks,,"Learn the basics of editing Databricks notebooks, including cell types, keyboard shortcuts, toolbar navigation, and essential cell actions.","A Databricks notebook is a web-based code editor that allows you to write code and view results for interactive data analysis. This page covers the basics of using notebooks in Databricks, including how to navigate the toolbar and perform various cell actions.",2026-04-13T08:00:00.000Z,how-to,,0.2,False,Basic editing and UI usage in notebooks; mostly step-by-step editor usage without deep product-specific configuration tables or expert-only details.,updated -https://learn.microsoft.com/en-us/azure/databricks/notebooks/best-practices,Best practices for notebooks,Software engineering best practices for notebooks - Azure Databricks,Apply software engineering practices to Databricks notebooks,"Learn how to apply software engineering best practices to your Azure Databricks notebooks, including version control, code sharing, testing, and CI/CD.","This article provides a hands-on walkthrough that demonstrates how to apply software engineering best practices to your Azure Databricks notebooks, including version control, code sharing, testing, and optionally continuous integration and continuous delivery or deployment (CI/CD). 
In this walkthrough, you:",2026-04-13T08:00:00.000Z,best-practice,best-practices,0.7,True,"Walkthrough for version control, code sharing, testing, and CI/CD specifically for Databricks notebooks; likely includes Databricks-specific patterns, repo usage, and configuration details that go beyond generic best practices.",updated -https://learn.microsoft.com/en-us/azure/databricks/notebooks/code-assistant,Code help using Genie Code,Get coding help from Genie Code - Azure Databricks,Apply Genie Code effectively in Databricks notebooks,"Learn how to use Genie Code in Databricks notebooks for coding help, debugging, autocomplete, and natural language code generation.","This page describes how you can use Genie Code to help you code and debug your notebooks, and provides tips on how to get the most out of Genie Code.",2026-04-13T08:00:00.000Z,feature-guide,best-practices,0.7,True,"Provides tips on getting the most out of Genie Code, which are product-specific usage recommendations and gotchas rather than generic coding advice.",updated -https://learn.microsoft.com/en-us/azure/databricks/notebooks/dashboards,Dashboards,Dashboards in notebooks - Azure Databricks,,Learn how to publish graphs and visualizations derived from notebook output and share them in a presentation format with your organization.,"This page describes how to create dashboards based on notebook cell outputs. 
Dashboards allow you to publish graphs and visualizations, and then share them in a presentation format with your organization.",2026-04-13T08:00:00.000Z,how-to,,0.45,False,Explains creating dashboards from notebook outputs; primarily UI and visualization usage without detailed configuration parameters or limits.,updated -https://learn.microsoft.com/en-us/azure/databricks/notebooks/debugger,Debug notebooks,Debug notebooks - Azure Databricks,Debug Python code in Databricks notebooks,"Learn how to use the Databricks interactive debugger for Python notebooks, including breakpoints, variable inspection, and step-by-step execution.","This page describes how to use the built-in interactive debugger in the Databricks notebook. The debugger is available only for Python. The interactive debugger provides breakpoints, step-by-step execution, variable inspection, and more tools to help you develop code in notebooks more efficiently.",2026-04-13T08:00:00.000Z,how-to,troubleshooting,0.65,True,"Describes the Databricks interactive debugger (breakpoints, variable inspection, step execution) which is a product-specific debugging and troubleshooting workflow.",updated -https://learn.microsoft.com/en-us/azure/databricks/notebooks/ds-agent,Genie Code for data science,Use Genie Code for data science - Azure Databricks,Use Genie Code agent for Databricks data science,Learn how to use the Data Science Agent to build data science workflows. The agent can run multi-step workflows from a single prompt.,"This page introduces Genie Code for data science. 
Designed specifically for Databricks notebooks and the SQL Editor, Genie Code in Agent mode can explore data, generate and run code, and fix errors—all from a single prompt.",2026-04-15T08:00:00.000Z,feature-guide,integrations,0.65,True,Shows how to drive multi-step workflows from prompts in Databricks notebooks/SQL Editor using Genie Code; involves product-specific agent behaviors and invocation patterns.,updated -https://learn.microsoft.com/en-us/azure/databricks/notebooks/eda-tutorial,Tutorial: EDA techniques,Tutorial: EDA techniques using Databricks notebooks - Azure Databricks,,"This tutorial guides you through the basics of conducting exploratory data analysis (EDA) in a Databricks notebook, from loading data to generating insights.","This tutorial guides you through the basics of conducting exploratory data analysis (EDA) using Python in an Azure Databricks notebook, from loading data to generating insights through data visualizations. The notebook used in this tutorial examines global energy and emissions data and demonstrates how to load, clean, and explore data. You can follow along using the example notebook or create your own notebook from scratch.",2026-04-13T08:00:00.000Z,tutorial,,0.2,False,"Tutorial-style EDA walkthrough in Databricks notebooks; focuses on loading, cleaning, and visualizing data without product-specific limits, configuration tables, error codes, or decision matrices.",updated -https://learn.microsoft.com/en-us/azure/databricks/notebooks/ipython-kernel,IPython kernel,IPython kernel - Azure Databricks,,"Learn about the IPython kernel, how it works with Azure Databricks notebooks, and the benefits of using the IPython kernel.","The IPython kernel is a Jupyter kernel for Python code execution. Jupyter, and other compatible notebooks, use the IPython kernel for executing Python notebook code. In Databricks Runtime 11.3 LTS and above, Python notebooks use the IPython kernel to execute Python code. 
In Databricks Runtime 11.3 LTS and above, you can pass input to ipykernel in Python notebooks. This allows you to use interactive tools such as the Python debugger in the notebook. For an example notebook that illustrates how to u",2026-04-14T17:47:00.000Z,reference-architecture,,0.4,False,Explains IPython kernel usage and benefits; mostly conceptual and runtime-version notes without detailed configuration tables or limits.,updated +https://learn.microsoft.com/en-us/azure/databricks/notebooks/,Notebooks overview,Databricks notebooks - Azure Databricks,,"Overview of Databricks notebooks for data science, machine learning, and collaborative code development.","Notebooks are the primary tool for creating data science and machine learning workflows on Azure Databricks. Databricks notebooks provide real-time coauthoring in multiple languages, automatic versioning, and built-in data visualizations for developing code and presenting results.",2026-04-13T08:00:00.000Z,landing-page,,0.2,False,"High-level overview of Databricks notebooks; primarily conceptual and feature description without detailed limits, configs, or troubleshooting mappings.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/notebooks/basic-editing,Basic editing,Basic editing in Databricks notebooks - Azure Databricks,,"Learn the basics of editing Databricks notebooks, including cell types, keyboard shortcuts, toolbar navigation, and essential cell actions.","A Databricks notebook is a web-based code editor that allows you to write code and view results for interactive data analysis. 
This page covers the basics of using notebooks in Databricks, including how to navigate the toolbar and perform various cell actions.",2026-04-13T08:00:00.000Z,how-to,,0.2,False,Basic editing and UI usage in notebooks; mostly step-by-step editor usage without deep product-specific configuration tables or expert-only details.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/notebooks/best-practices,Best practices for notebooks,Software engineering best practices for notebooks - Azure Databricks,Apply engineering best practices to Azure Databricks notebooks,"Learn how to apply software engineering best practices to your Azure Databricks notebooks, including version control, code sharing, testing, and CI/CD.","This article provides a hands-on walkthrough that demonstrates how to apply software engineering best practices to your Azure Databricks notebooks, including version control, code sharing, testing, and optionally continuous integration and continuous delivery or deployment (CI/CD). In this walkthrough, you:",2026-04-23T08:00:00.000Z,best-practice,best-practices,0.7,True,"The page is a hands-on walkthrough with concrete, product-specific recommendations for version control, code sharing, testing, and CI/CD in Azure Databricks notebooks. 
It goes beyond generic advice by showing how to implement these practices in the Databricks environment, which qualifies as expert, product-specific best-practices content.",updated +https://learn.microsoft.com/en-us/azure/databricks/notebooks/code-assistant,Code help using Genie Code,Get coding help from Genie Code - Azure Databricks,Apply Genie Code effectively in Databricks notebooks,"Learn how to use Genie Code in Databricks notebooks for coding help, debugging, autocomplete, and natural language code generation.","This page describes how you can use Genie Code to help you code and debug your notebooks, and provides tips on how to get the most out of Genie Code.",2026-04-13T08:00:00.000Z,feature-guide,best-practices,0.7,True,"Provides tips on getting the most out of Genie Code, which are product-specific usage recommendations and gotchas rather than generic coding advice.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/notebooks/dashboards,Dashboards,Dashboards in notebooks - Azure Databricks,,Learn how to publish graphs and visualizations derived from notebook output and share them in a presentation format with your organization.,"This page describes how to create dashboards based on notebook cell outputs. Dashboards allow you to publish graphs and visualizations, and then share them in a presentation format with your organization.",2026-04-13T08:00:00.000Z,how-to,,0.45,False,Explains creating dashboards from notebook outputs; primarily UI and visualization usage without detailed configuration parameters or limits.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/notebooks/debugger,Debug notebooks,Debug notebooks - Azure Databricks,Debug Python code in Databricks notebooks,"Learn how to use the Databricks interactive debugger for Python notebooks, including breakpoints, variable inspection, and step-by-step execution.","This page describes how to use the built-in interactive debugger in the Databricks notebook. 
The debugger is available only for Python. The interactive debugger provides breakpoints, step-by-step execution, variable inspection, and more tools to help you develop code in notebooks more efficiently.",2026-04-13T08:00:00.000Z,how-to,troubleshooting,0.65,True,"Describes the Databricks interactive debugger (breakpoints, variable inspection, step execution) which is a product-specific debugging and troubleshooting workflow.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/notebooks/ds-agent,Genie Code for data science,Use Genie Code for data science - Azure Databricks,Use Genie Code agent for Databricks data science,Learn how to use the Data Science Agent to build data science workflows. The agent can run multi-step workflows from a single prompt.,"This page introduces Genie Code for data science. Designed specifically for Databricks notebooks and the SQL Editor, Genie Code in Agent mode can explore data, generate and run code, and fix errors—all from a single prompt.",2026-04-15T08:00:00.000Z,feature-guide,integrations,0.65,True,Shows how to drive multi-step workflows from prompts in Databricks notebooks/SQL Editor using Genie Code; involves product-specific agent behaviors and invocation patterns.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/notebooks/eda-tutorial,Tutorial: EDA techniques,Tutorial: EDA techniques using Databricks notebooks - Azure Databricks,,"This tutorial guides you through the basics of conducting exploratory data analysis (EDA) in a Databricks notebook, from loading data to generating insights.","This tutorial guides you through the basics of conducting exploratory data analysis (EDA) using Python in an Azure Databricks notebook, from loading data to generating insights through data visualizations. The notebook used in this tutorial examines global energy and emissions data and demonstrates how to load, clean, and explore data. 
You can follow along using the example notebook or create your own notebook from scratch.",2026-04-13T08:00:00.000Z,tutorial,,0.2,False,"Tutorial-style EDA walkthrough in Databricks notebooks; focuses on loading, cleaning, and visualizing data without product-specific limits, configuration tables, error codes, or decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/notebooks/ipython-kernel,IPython kernel,IPython kernel - Azure Databricks,,"Learn about the IPython kernel, how it works with Azure Databricks notebooks, and the benefits of using the IPython kernel.","The IPython kernel is a Jupyter kernel for Python code execution. Jupyter, and other compatible notebooks, use the IPython kernel for executing Python notebook code. In Databricks Runtime 11.3 LTS and above, Python notebooks use the IPython kernel to execute Python code. In Databricks Runtime 11.3 LTS and above, you can pass input to ipykernel in Python notebooks. This allows you to use interactive tools such as the Python debugger in the notebook. For an example notebook that illustrates how to u",2026-04-14T17:47:00.000Z,reference-architecture,,0.4,False,Explains IPython kernel usage and benefits; mostly conceptual and runtime-version notes without detailed configuration tables or limits.,unchanged https://learn.microsoft.com/en-us/azure/databricks/notebooks/ipywidgets,ipywidgets,ipywidgets - Azure Databricks,,Learn how to use ipywidgets to make your notebooks interactive.,"ipywidgets are visual elements that allow users to specify parameter values in notebook cells. You can use ipywidgets to make your Databricks Python notebooks interactive. The ipywidgets package includes over 30 different controls, including form controls such as sliders, text boxes, and checkboxes, as well as layout controls such as tabs, accordions, and grids. Using these elements, you can build graphical user interfaces to interface with your notebook code. 
Note",2024-06-28T05:10:00.000Z,how-to,,0.3,False,"How-to/tutorial style use of ipywidgets in Databricks notebooks without product-specific limits, configs, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/notebooks/legacy-widgets,Legacy widgets (Runtime 15.1),Legacy notebook widgets: ${param} - Azure Databricks,Use and migrate legacy ${param} notebook widgets,"Learn how to use the deprecated ${param} syntax for notebook widgets in Databricks Runtime 15.1 and below, and how to migrate to the current parameter marker syntax.","Warning The${param}syntax for accessing widget values has been deprecated in Databricks Runtime 15.2 and above. Use the currentDatabricks widgets syntax, (:param) instead. This page shows you how to use the legacy${param}syntax for notebook widgets running on Databricks Runtime 15.1 and below. Databricks recommends youmigrate to the current syntax.",2026-04-13T08:00:00.000Z,reference,configuration,0.8,True,"Details deprecated ${param} syntax, supported runtime versions, and migration to new syntax; includes precise version constraints and parameter usage unique to Databricks.",updated +https://learn.microsoft.com/en-us/azure/databricks/notebooks/legacy-widgets,Legacy widgets (Runtime 15.1),Legacy notebook widgets: ${param} - Azure Databricks,Use and migrate legacy ${param} notebook widgets,"Learn how to use the deprecated ${param} syntax for notebook widgets in Databricks Runtime 15.1 and below, and how to migrate to the current parameter marker syntax.","Warning The${param}syntax for accessing widget values has been deprecated in Databricks Runtime 15.2 and above. Use the currentDatabricks widgets syntax, (:param) instead. This page shows you how to use the legacy${param}syntax for notebook widgets running on Databricks Runtime 15.1 and below. 
Databricks recommends you migrate to the current syntax.",2026-04-13T08:00:00.000Z,reference,configuration,0.8,True,"Details deprecated ${param} syntax, supported runtime versions, and migration to new syntax; includes precise version constraints and parameter usage unique to Databricks.",unchanged https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-cells,Organize notebook cells,Organize notebook cells - Azure Databricks,,Learn about ways you can organize notebook cells for improved readability.,"This article describes Databricks customizations to help you organize notebook cells. For information on how to format your code cells, see Format code cells.",2025-01-13T19:27:00.000Z,how-to,,0.3,False,"Organizing notebook cells is mostly UX guidance; no indication of product-specific limits, configs, or troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-compute,Notebook compute,Notebook compute resources - Azure Databricks,Choose compute resources for Databricks notebooks,"Learn how to attach Databricks notebooks to all-purpose compute, serverless compute, or SQL warehouses to run your code.","This page covers the options for notebook compute resources. You can run a notebook on an all-purpose compute resource, serverless compute, or, for SQL commands, you can use a SQL warehouse, a type of compute optimized for SQL analytics. 
For more information about compute types, see Compute.",2026-04-13T08:00:00.000Z,how-to,decision-making,0.6,True,"Guides selection between all-purpose compute, serverless compute, and SQL warehouses for notebooks, which is a Databricks-specific compute choice decision.",updated -https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-editor,Code navigation,Navigate the Databricks notebook and file editor - Azure Databricks,,"Learn to use the notebook editor based on VS Code, supporting code suggestions and autocomplete, variable inspection, code folding, and diffs.","This page describes the functions available to help you navigate the Databricks notebook and file editor, including keyboard shortcuts, code suggestions and autocomplete, variable inspection, and code folding. When you use the notebook or the file editor, Genie Code is available to help you generate, explain, and debug code. See Get coding help from Genie Code for details. You can select from a selection of editor themes. 
Select View > Editor theme and make a selection from the menu.",2026-04-13T08:00:00.000Z,feature-guide,,0.4,False,"Navigation and usage of the VS Code–based editor (shortcuts, autocomplete, themes); largely UI behavior without configuration tables or error-resolution mappings.",updated -https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-export-import,Export and import notebooks,Import and export Databricks notebooks - Azure Databricks,,"Learn how to import and export Databricks notebooks, convert files to notebooks, and explore supported notebook formats.",This page describes how to import and export notebooks in Azure Databricks and the notebook formats that Azure Databricks supports.,2026-04-13T08:00:00.000Z,how-to,,0.45,False,"Import/export of notebooks and supported formats is largely procedural and format listing; does not strongly match limits, troubleshooting, or detailed configuration patterns.",updated -https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-format,Manage notebook format,Manage notebook format - Azure Databricks,Configure Databricks notebook file formats and commits,Databricks notebooks are created in IPYNB format by default. You can change the default format to source format on the Developer Settings page.,"This page describes the default notebook format in Azure Databricks, how to change your notebook format, and how to manage output commits if your notebook is in a source-controlled folder. By default, notebooks in Databricks are created in .ipynb (IPython or Jupyter) format. You can also choose to use source format instead. You can still import and export notebooks in various formats. 
See Import and export Databricks notebooks.",2026-04-13T08:00:00.000Z,how-to,configuration,0.7,True,"Describes default IPYNB format, switching to source format, and managing output commits in source-controlled folders, which are specific configuration behaviors.",updated -https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-limitations,Notebooks limitations,Known limitations of Databricks notebooks - Azure Databricks,Notebook size and feature limits in Azure Databricks,"Known limitations of Databricks notebooks, including notebook sizing, cell outputs, debugger, SQL warehouse, and widget constraints.","This page covers known limitations of Databricks notebooks. For additional resource limits, see Resource limits.",2026-04-13T08:00:00.000Z,reference,limits-quotas,0.85,True,"Page explicitly documents known limitations such as notebook sizing, cell outputs, debugger, SQL warehouse, and widget constraints; these are typically expressed as concrete numeric limits and constraints that qualify as expert limits/quotas knowledge.",updated -https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-media,"Add images, equations, and other media","Add images, equations, and other media to notebooks - Azure Databricks",,"Learn how to display images, equations, HTML, and links to other notebooks in your Databricks notebook.","This page explains how to display images, equations, HTML, and links to other notebooks.",2026-04-14T17:47:00.000Z,how-to,,0.2,False,"How-to content for adding media to notebooks; likely shows markdown/HTML syntax but no product-specific limits, configs, or error mappings that qualify as expert knowledge per the defined categories.",updated -https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-outputs,Outputs and results,Notebook outputs and results - Azure Databricks,,"Learn how to manage cell outputs, work with results tables, apply filters, download data, and format columns in Databricks notebooks.","After 
you attach a notebook to a cluster and run one or more cells, your notebook has state and displays outputs. This section describes how to manage notebook state and outputs.",2026-04-13T08:00:00.000Z,how-to,,0.4,False,Explains how to view and manage notebook outputs and results; mostly UI operations and generic behavior without detailed configuration or limits.,updated +https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-compute,Notebook compute,Notebook compute resources - Azure Databricks,Choose compute resources for Databricks notebooks,"Learn how to attach Databricks notebooks to all-purpose compute, serverless compute, or SQL warehouses to run your code.","This page covers the options for notebook compute resources. You can run a notebook on an all-purpose compute resource, serverless compute, or, for SQL commands, you can use a SQL warehouse, a type of compute optimized for SQL analytics. For more information about compute types, see Compute.",2026-04-13T08:00:00.000Z,how-to,decision-making,0.6,True,"Guides selection between all-purpose compute, serverless compute, and SQL warehouses for notebooks, which is a Databricks-specific compute choice decision.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-editor,Code navigation,Navigate the Databricks notebook and file editor - Azure Databricks,,"Learn to use the notebook editor based on VS Code, supporting code suggestions and autocomplete, variable inspection, code folding, and diffs.","This page describes the functions available to help you navigate the Databricks notebook and file editor, including keyboard shortcuts, code suggestions and autocomplete, variable inspection, and code folding. When you use the notebook or the file editor, Genie Code is available to help you generate, explain, and debug code. See Get coding help from Genie Code for details. You can select from a selection of editor themes. 
Select View > Editor theme and make a selection from the menu.",2026-04-13T08:00:00.000Z,feature-guide,,0.4,False,"Navigation and usage of the VS Code–based editor (shortcuts, autocomplete, themes); largely UI behavior without configuration tables or error-resolution mappings.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-export-import,Export and import notebooks,Import and export Databricks notebooks - Azure Databricks,,"Learn how to import and export Databricks notebooks, convert files to notebooks, and explore supported notebook formats.",This page describes how to import and export notebooks in Azure Databricks and the notebook formats that Azure Databricks supports.,2026-04-13T08:00:00.000Z,how-to,,0.45,False,"Import/export of notebooks and supported formats is largely procedural and format listing; does not strongly match limits, troubleshooting, or detailed configuration patterns.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-format,Manage notebook format,Manage notebook format - Azure Databricks,Configure Databricks notebook file formats and commits,Databricks notebooks are created in IPYNB format by default. You can change the default format to source format on the Developer Settings page.,"This page describes the default notebook format in Azure Databricks, how to change your notebook format, and how to manage output commits if your notebook is in a source-controlled folder. By default, notebooks in Databricks are created in .ipynb (IPython or Jupyter) format. You can also choose to use source format instead. You can still import and export notebooks in various formats. 
See Import and export Databricks notebooks.",2026-04-13T08:00:00.000Z,how-to,configuration,0.7,True,"Describes default IPYNB format, switching to source format, and managing output commits in source-controlled folders, which are specific configuration behaviors.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-limitations,Notebooks limitations,Known limitations of Databricks notebooks - Azure Databricks,Notebook size and feature limits in Azure Databricks,"Known limitations of Databricks notebooks, including notebook sizing, cell outputs, debugger, SQL warehouse, and widget constraints.","This page covers known limitations of Databricks notebooks. For additional resource limits, see Resource limits.",2026-04-13T08:00:00.000Z,reference,limits-quotas,0.85,True,"Page explicitly documents known limitations such as notebook sizing, cell outputs, debugger, SQL warehouse, and widget constraints; these are typically expressed as concrete numeric limits and constraints that qualify as expert limits/quotas knowledge.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-media,"Add images, equations, and other media","Add images, equations, and other media to notebooks - Azure Databricks",,"Learn how to display images, equations, HTML, and links to other notebooks in your Databricks notebook.","This page explains how to display images, equations, HTML, and links to other notebooks.",2026-04-14T17:47:00.000Z,how-to,,0.2,False,"How-to content for adding media to notebooks; likely shows markdown/HTML syntax but no product-specific limits, configs, or error mappings that qualify as expert knowledge per the defined categories.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-outputs,Outputs and results,Notebook outputs and results - Azure Databricks,,"Learn how to manage cell outputs, work with results tables, apply filters, download data, and format columns in Databricks notebooks.","After 
you attach a notebook to a cluster and run one or more cells, your notebook has state and displays outputs. This section describes how to manage notebook state and outputs.",2026-04-13T08:00:00.000Z,how-to,,0.4,False,Explains how to view and manage notebook outputs and results; mostly UI operations and generic behavior without detailed configuration or limits.,unchanged
https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-tags,Apply tags to notebooks,Apply tags to notebooks - Azure Databricks,Apply and manage Databricks notebook tags,"Learn how to apply tags to notebooks for organization and management, including certification and deprecation system tags.","Important This feature is in Public Preview. Use tags to organize and categorize notebooks for easier management. Notebooks also support certification and deprecation system tags to indicate trust or lifecycle status. For more information about tags, see Apply tags to Unity Catalog securable objects. For information about certification and deprecation, see Flag data as certified or deprecated.",2026-03-19T08:00:00.000Z,how-to,configuration,0.6,True,"Describes notebook tags including certification and deprecation system tags, which are product-specific metadata settings tied to governance behavior.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-ui,Customize notebook appearance,Customize notebook appearance - Azure Databricks,,"Learn how to customize your notebook appearance, such as adding line numbers and enabling dark mode, with various Databricks settings.","This page describes ways you can customize the appearance of your notebook with various Databricks settings. 
You can remove cell margins, add line numbers, wrap lines, and view in dark mode.",2026-04-14T17:47:00.000Z,how-to,,0.4,False,"Customization of notebook appearance (line numbers, dark mode, margins) is UI preference-level configuration without complex parameters or expert-only constraints.",updated
-https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-version-history,Version history,Version history in notebooks - Azure Databricks,,Databricks notebooks maintain a history of notebook versions. Learn about the tools Databricks provides for version control.,"This page describes how you can manage versions in Azure Databricks notebooks. Azure Databricks notebooks maintain a history of notebook versions, allowing you to view and restore previous snapshots of a notebook. You can perform the following actions on versions: add descriptions, restore and delete versions, and clear version history. You can also sync your work in Databricks with a remote Git repository.",2026-04-14T17:47:00.000Z,how-to,,0.45,False,"Version history usage and basic version control operations; likely procedural without detailed limits, quotas, or complex decision matrices.",updated
+https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-ui,Customize notebook appearance,Customize notebook appearance - Azure Databricks,,"Learn how to customize your notebook appearance, such as adding line numbers and enabling dark mode, with various Databricks settings.","This page describes ways you can customize the appearance of your notebook with various Databricks settings. 
You can remove cell margins, add line numbers, wrap lines, and view in dark mode.",2026-04-14T17:47:00.000Z,how-to,,0.4,False,"Customization of notebook appearance (line numbers, dark mode, margins) is UI preference-level configuration without complex parameters or expert-only constraints.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-version-history,Version history,Version history in notebooks - Azure Databricks,,Databricks notebooks maintain a history of notebook versions. Learn about the tools Databricks provides for version control.,"This page describes how you can manage versions in Azure Databricks notebooks. Azure Databricks notebooks maintain a history of notebook versions, allowing you to view and restore previous snapshots of a notebook. You can perform the following actions on versions: add descriptions, restore and delete versions, and clear version history. You can also sync your work in Databricks with a remote Git repository.",2026-04-14T17:47:00.000Z,how-to,,0.45,False,"Version history usage and basic version control operations; likely procedural without detailed limits, quotas, or complex decision matrices.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-workflows,Run a notebook from another notebook,Orchestrate notebooks and modularize code in notebooks - Azure Databricks,,Learn how to orchestrate notebooks and modularize code in notebooks.,Learn how to orchestrate notebooks and modularize code in notebooks. 
See examples and understand when to use alternative methods for notebook orchestration.,2025-01-07T19:53:00.000Z,how-to,,0.35,False,Covers orchestration and modularization patterns but summary doesn’t show Databricks-specific decision matrices or quantified trade-offs.,unchanged
-https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebooks-code,Overview,Develop code in Databricks notebooks - Azure Databricks,Use Databricks notebook features for code development,"Develop code in Databricks notebooks, including code formatting, mixing languages, variable explorer, code modularization with files, and version history.","This page describes how to develop code in Databricks notebooks, including code formatting, autocomplete, mixing languages, and magic commands. For more details about advanced functionality available with the editor, such as autocomplete, variable selection, multi-cursor support, and side-by-side diffs, see Navigate the Databricks notebook and file editor. When you use the notebook or the file editor, Genie Code is available to help you generate, explain, and debug code. See Use Genie Code for more",2026-04-13T08:00:00.000Z,how-to,integrations,0.6,True,"Describes magic commands, mixing languages, and editor behaviors that are Databricks-specific coding patterns and APIs, which qualify as product-specific integration/coding patterns.",updated
-https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebooks-collaborate,Collaborate using notebooks,Collaborate using Databricks notebooks - Azure Databricks,Manage access and collaboration in Databricks notebooks,"Learn how to collaborate using Databricks notebooks, share notebooks with coworkers, manage permissions, and use code comments.",This page describes how to give coworkers access to a notebook and how you can leave comments in a notebook. 
Note Access control is available only in the Premium plan.,2026-04-13T08:00:00.000Z,how-to,security,0.7,True,"Covers sharing notebooks, permissions, and mentions access control availability only in the Premium plan, which is product-specific security and RBAC behavior.",updated
-https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebooks-manage,Manage notebooks,Manage notebooks - Azure Databricks,,"Learn how to create, open, delete, rename, and control access to Databricks notebooks using the Databricks UI, CLI, and Workspace API.","You can manage notebooks using the UI, the CLI, and the Workspace API. This page focuses on performing notebook tasks using the UI. For the other methods, see Databricks CLI and the Workspace API reference.",2026-04-13T08:00:00.000Z,how-to,,0.45,False,"Managing notebooks (create, open, delete, rename, access) via UI/CLI/API is mostly operational; likely lacks deep configuration tables or expert-only constraints.",updated
+https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebooks-code,Overview,Develop code in Databricks notebooks - Azure Databricks,Use Databricks notebook features for code development,"Develop code in Databricks notebooks, including code formatting, mixing languages, variable explorer, code modularization with files, and version history.","This page describes how to develop code in Databricks notebooks, including code formatting, autocomplete, mixing languages, and magic commands. For more details about advanced functionality available with the editor, such as autocomplete, variable selection, multi-cursor support, and side-by-side diffs, see Navigate the Databricks notebook and file editor. When you use the notebook or the file editor, Genie Code is available to help you generate, explain, and debug code. 
See Use Genie Code for more",2026-04-13T08:00:00.000Z,how-to,integrations,0.6,True,"Describes magic commands, mixing languages, and editor behaviors that are Databricks-specific coding patterns and APIs, which qualify as product-specific integration/coding patterns.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebooks-collaborate,Collaborate using notebooks,Collaborate using Databricks notebooks - Azure Databricks,Manage access and collaboration in Databricks notebooks,"Learn how to collaborate using Databricks notebooks, share notebooks with coworkers, manage permissions, and use code comments.",This page describes how to give coworkers access to a notebook and how you can leave comments in a notebook. Note Access control is available only in the Premium plan.,2026-04-13T08:00:00.000Z,how-to,security,0.7,True,"Covers sharing notebooks, permissions, and mentions access control availability only in the Premium plan, which is product-specific security and RBAC behavior.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebooks-manage,Manage notebooks,Manage notebooks - Azure Databricks,,"Learn how to create, open, delete, rename, and control access to Databricks notebooks using the Databricks UI, CLI, and Workspace API.","You can manage notebooks using the UI, the CLI, and the Workspace API. This page focuses on performing notebook tasks using the UI. 
For the other methods, see Databricks CLI and the Workspace API reference.",2026-04-13T08:00:00.000Z,how-to,,0.45,False,"Managing notebooks (create, open, delete, rename, access) via UI/CLI/API is mostly operational; likely lacks deep configuration tables or expert-only constraints.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/notebooks/package-cells,Package cells,Package cells - Azure Databricks,,Learn how to define classes in package cells to use custom Scala classes and objects defined within notebooks reliably in Apache Spark and across notebook sessions.,"To use custom Scala classes and objects defined within notebooks reliably in Spark and across notebook sessions, you should define classes in package cells. A package cell is a cell that is compiled when it is run. A package cell has no visibility with respect to the rest of the notebook. You can think of it as a separate Scala file. Only class and object definitions can go in a package cell. You cannot have any values, variables, or function definitions. The following notebook shows what can happen i",2024-03-01T08:00:00.000Z,how-to,,0.4,False,"Explains package cells conceptually and how to define Scala classes; no clear tables of settings, limits, or error mappings.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/notebooks/run-notebook,Run notebooks,Run Databricks notebooks - Azure Databricks,Run Databricks notebooks safely and efficiently,"Learn how to run a Databricks notebook interactively or as a job, and configure notifications and Databricks Advisor settings.","Before you can run any cell in a notebook, you must attach the notebook to a cluster. To run all the cells in a notebook, select Run All in the notebook toolbar. Important Do not use Run All if steps for mount and unmount are in the same notebook. It could lead to a race condition and possibly corrupt the mount points. To run a single cell, click in the cell and press shift+enter. 
You can also run a subset of lines in a cell or a subset of cells. See Run selected text and Run selected cells. To run all cel",2026-04-14T17:47:00.000Z,how-to,best-practices,0.75,True,"Includes a specific warning about Run All with mount/unmount causing race conditions and corruption, which is a Databricks-specific gotcha and actionable best practice.",updated
-https://learn.microsoft.com/en-us/azure/databricks/notebooks/schedule-notebook-jobs,Schedule notebooks,Create and manage scheduled notebook jobs - Azure Databricks,Create and manage scheduled Databricks notebook jobs,"Learn how to create and manage scheduled notebook jobs directly from the Databricks notebook UI, including compute and parameter options.","You can create and manage notebook jobs directly in the notebook UI. If a notebook is already assigned to one or more jobs, you can create and manage schedules for those jobs. If a notebook is not assigned to a job, you can create a job and a schedule to run the notebook. To learn more about scheduling jobs, see Run jobs on a schedule.",2026-04-13T08:00:00.000Z,how-to,deployment,0.65,True,"Describes creating jobs and schedules directly from the notebook UI, including compute and parameter options; these are Databricks-specific job deployment patterns.",updated
-https://learn.microsoft.com/en-us/azure/databricks/notebooks/share-code,Share code between notebooks,Share code between Databricks notebooks - Azure Databricks,Share and modularize code across Databricks notebooks,Share code between Databricks notebooks. Import a Python function or file into a Databricks notebook.,"This page describes how to use files to modularize your code, including how to create and import Python files. Databricks also supports multi-task jobs which allow you to combine notebooks into workflows with complex dependencies. 
For more information, see Lakeflow Jobs.",2026-04-13T08:00:00.000Z,how-to,integrations,0.7,True,"Covers importing Python files and using Databricks-specific mechanisms (files, multi-task jobs) to share code, which are concrete coding patterns unique to Databricks.",updated
-https://learn.microsoft.com/en-us/azure/databricks/notebooks/test-notebooks,Test notebooks,Test Databricks notebooks - Azure Databricks,Test and schedule Databricks notebook code,"How to test code directly in Databricks notebooks, including automatic scheduling, widgets, Databricks Git folders, and running a notebook from another notebook.","This page briefly describes some techniques that are useful when testing code directly in Databricks notebooks. You can use these methods separately or together. For a detailed walkthrough of how to set up and organize functions and unit tests in Databricks notebooks, see Unit testing for notebooks. Many unit testing libraries work directly within the notebook. For example, you can use the built-in Python unittest package to test notebook code. Test failures appear in the output area of the cell.",2026-04-13T08:00:00.000Z,how-to,best-practices,0.7,True,"Provides Databricks-specific techniques for testing (widgets, Git folders, running notebooks from notebooks) which are concrete, product-specific testing practices.",updated
+https://learn.microsoft.com/en-us/azure/databricks/notebooks/run-notebook,Run notebooks,Run Databricks notebooks - Azure Databricks,Run Databricks notebooks safely and efficiently,"Learn how to run a Databricks notebook interactively or as a job, and configure notifications and Databricks Advisor settings.","Before you can run any cell in a notebook, you must attach the notebook to a cluster. To run all the cells in a notebook, select Run All in the notebook toolbar. Important Do not use Run All if steps for mount and unmount are in the same notebook. It could lead to a race condition and possibly corrupt the mount points. 
To run a single cell, click in the cell and press shift+enter. You can also run a subset of lines in a cell or a subset of cells. See Run selected text and Run selected cells. To run all cel",2026-04-14T17:47:00.000Z,how-to,best-practices,0.75,True,"Includes a specific warning about Run All with mount/unmount causing race conditions and corruption, which is a Databricks-specific gotcha and actionable best practice.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/notebooks/schedule-notebook-jobs,Schedule notebooks,Create and manage scheduled notebook jobs - Azure Databricks,Create and manage scheduled Databricks notebook jobs,"Learn how to create and manage scheduled notebook jobs directly from the Databricks notebook UI, including compute and parameter options.","You can create and manage notebook jobs directly in the notebook UI. If a notebook is already assigned to one or more jobs, you can create and manage schedules for those jobs. If a notebook is not assigned to a job, you can create a job and a schedule to run the notebook. To learn more about scheduling jobs, see Run jobs on a schedule.",2026-04-13T08:00:00.000Z,how-to,deployment,0.65,True,"Describes creating jobs and schedules directly from the notebook UI, including compute and parameter options; these are Databricks-specific job deployment patterns.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/notebooks/share-code,Share code between notebooks,Share code between Databricks notebooks - Azure Databricks,Share and modularize code across Databricks notebooks,Share code between Databricks notebooks. Import a Python function or file into a Databricks notebook.,"This page describes how to use files to modularize your code, including how to create and import Python files. Databricks also supports multi-task jobs which allow you to combine notebooks into workflows with complex dependencies. 
For more information, see Lakeflow Jobs.",2026-04-13T08:00:00.000Z,how-to,integrations,0.7,True,"Covers importing Python files and using Databricks-specific mechanisms (files, multi-task jobs) to share code, which are concrete coding patterns unique to Databricks.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/notebooks/test-notebooks,Test notebooks,Test Databricks notebooks - Azure Databricks,Test and schedule Databricks notebook code,"How to test code directly in Databricks notebooks, including automatic scheduling, widgets, Databricks Git folders, and running a notebook from another notebook.","This page briefly describes some techniques that are useful when testing code directly in Databricks notebooks. You can use these methods separately or together. For a detailed walkthrough of how to set up and organize functions and unit tests in Databricks notebooks, see Unit testing for notebooks. Many unit testing libraries work directly within the notebook. For example, you can use the built-in Python unittest package to test notebook code. Test failures appear in the output area of the cell.",2026-04-13T08:00:00.000Z,how-to,best-practices,0.7,True,"Provides Databricks-specific techniques for testing (widgets, Git folders, running notebooks from notebooks) which are concrete, product-specific testing practices.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/notebooks/testing,Unit testing,Unit testing for notebooks - Azure Databricks,,Learn how to apply techniques and frameworks for unit testing code functions for your Azure Databricks notebooks.,"You can use unit testing to help improve the quality and consistency of your notebooks' code. Unit testing is an approach to testing self-contained units of code, such as functions, early and often. This helps you find problems with your code faster, uncover mistaken assumptions about your code sooner, and streamline your overall coding efforts. 
This article is an introduction to basic unit testing with functions. Advanced concepts such as unit testing classes and interfaces, as well as the use ofst",2025-01-21T08:00:00.000Z,how-to,,0.3,False,Introductory unit testing guidance; likely generic testing patterns rather than Databricks-specific best practices with quantified impact.,unchanged
-https://learn.microsoft.com/en-us/azure/databricks/notebooks/widgets,Databricks widgets,Databricks widgets - Azure Databricks,Configure and use Databricks notebook widgets,"Learn how to use input widgets to add parameters to your Databricks notebooks and dashboards, including widget types, configuration, and layout options.","Input widgets allow you to add parameters to your notebooks and dashboards. You can add a widget from the Databricks UI or using the widget API. To add or edit a widget, you must have CAN EDIT permissions on the notebook. If you are running Databricks Runtime 11.3 LTS or above, you can also use ipywidgets in Databricks notebooks. Databricks widgets are best for: To view the documentation for the widget API in Scala, Python, or R, use the following command: dbutils.widgets.help(). You can also refer",2026-04-13T08:00:00.000Z,how-to,configuration,0.8,True,"Describes widget types, configuration, layout options, and widget API usage; includes specific API names and parameters, which are product-specific configuration details.",updated
-https://learn.microsoft.com/en-us/azure/databricks/oltp/,OLTP databases (Lakebase),Lakebase Postgres - Azure Databricks,,"Learn how to work with Lakebase Postgres, a managed Postgres online transaction processing (OLTP) database.","Lakebase Postgres is a fully managed, cloud-native PostgreSQL database that brings online transaction processing (OLTP) capabilities to the Lakehouse. Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. 
If you are a Lakebase Provisioned user, see Lakebase Provisioned. Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you",2026-04-15T08:00:00.000Z,landing-page,,0.3,False,"High-level introduction to Lakebase Postgres OLTP; summary indicates conceptual overview of the service and variants (Autoscaling vs Provisioned) without concrete limits, configuration tables, or error mappings.",updated
+https://learn.microsoft.com/en-us/azure/databricks/notebooks/widgets,Databricks widgets,Databricks widgets - Azure Databricks,Configure and use Databricks notebook widgets,"Learn how to use input widgets to add parameters to your Databricks notebooks and dashboards, including widget types, configuration, and layout options.","Input widgets allow you to add parameters to your notebooks and dashboards. You can add a widget from the Databricks UI or using the widget API. To add or edit a widget, you must have CAN EDIT permissions on the notebook. If you are running Databricks Runtime 11.3 LTS or above, you can also use ipywidgets in Databricks notebooks. Databricks widgets are best for: To view the documentation for the widget API in Scala, Python, or R, use the following command: dbutils.widgets.help(). You can also refer",2026-04-13T08:00:00.000Z,how-to,configuration,0.8,True,"Describes widget types, configuration, layout options, and widget API usage; includes specific API names and parameters, which are product-specific configuration details.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/oltp/,OLTP databases (Lakebase),Lakebase Postgres - Azure Databricks,,"Learn how to work with Lakebase Postgres, a managed Postgres online transaction processing (OLTP) database.","Lakebase Postgres is a fully managed, cloud-native PostgreSQL database that brings online transaction processing (OLTP) capabilities to the Lakehouse. 
Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you",2026-04-15T08:00:00.000Z,landing-page,,0.3,False,"High-level introduction to Lakebase Postgres OLTP; summary indicates conceptual overview of the service and variants (Autoscaling vs Provisioned) without concrete limits, configuration tables, or error mappings.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/,Lakebase Provisioned,Lakebase - Azure Databricks,,"Overview of Lakebase, a managed PostgreSQL online transaction processing (OLTP) database for the Databricks platform.","Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026. For details, see Autoscaling by default. Lakebase is a fully managed Postgres online transaction processing (O",2026-03-31T23:28:00.000Z,landing-page,,0.25,False,"General overview of Lakebase; summary is introductory and rollout info, without specific limits, configs, or security roles.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/about,What is Lakebase Provisioned?,What is Lakebase Provisioned? - Azure Databricks,,"Learn how to work with Lakebase, a managed PostgreSQL online transaction processing (OLTP) database.","Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. 
For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026. For details, see Autoscaling by default. This page introduces Lakebase Provisioned, a fully managed Postgres O",2026-03-31T23:28:00.000Z,overview,,0.25,False,Introductory page for Lakebase Provisioned; primarily conceptual overview of the service.,unchanged
https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/auth-and-permissions,Authentication and permissions,Authentication and permissions - Azure Databricks,Configure authentication and permissions for Lakebase instances,Learn about authentication methods and permission management for Azure Databricks database instances.,"Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026. For details, see Autoscaling by default. 
This page provides an overview of authentication methods and permissi",2026-03-31T23:28:00.000Z,concept-article,security,0.75,True,"Covers authentication methods and permission management; likely lists specific auth modes, scopes, and permission models unique to Azure Databricks Lakebase.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/authentication,Authenticate,Authenticate to a database instance - Azure Databricks,Authenticate to Lakebase using OAuth tokens,Learn how to obtain an OAuth token and use it to authenticate to your Lakebase database instance.,"Important Lakebase Provisioned is available in the following regions: westus,westus2,eastus,eastus2,centralus,southcentralus,northeurope,westeurope,australiaeast,brazilsouth,canadacentral,centralindia,southeastasia,uksouth. Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created",2026-03-12T18:41:00.000Z,concept-article,security,0.8,True,"Obtaining and using OAuth tokens for Lakebase involves concrete auth endpoints, scopes, and parameters. These are product-specific authentication configuration details, fitting security.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/create/,Create a database instance,Create and manage a database instance - Azure Databricks,,"Learn how to create, connect to, and test a Lakebase database instance.","Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026. 
For details, see Autoscaling by default. To get started with OLTP workloads, create a Lakebase Provisioned dat",2026-04-16T17:44:00.000Z,landing-page,,0.3,False,"Appears to be a how-to guide for creating and managing a Lakebase Provisioned database instance. The summary does not indicate presence of numeric limits, configuration parameter tables, error-code-based troubleshooting, or decision matrices; it mainly describes getting started and mentions region availability and product variants.",updated +https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/create/,Create a database instance,Create and manage a database instance - Azure Databricks,,"Learn how to create, connect to, and test a Lakebase database instance.","Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026. For details, see Autoscaling by default. To get started with OLTP workloads, create a Lakebase Provisioned dat",2026-04-16T17:44:00.000Z,landing-page,,0.3,False,"Appears to be a how-to guide for creating and managing a Lakebase Provisioned database instance. The summary does not indicate presence of numeric limits, configuration parameter tables, error-code-based troubleshooting, or decision matrices; it mainly describes getting started and mentions region availability and product variants.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/create/capacity,Manage capacity,Manage instance capacity - Azure Databricks,Right-size Lakebase instance capacity and scaling,Learn how to manage a Lakebase instance capacity.,"Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually.
For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026. For details, see Autoscaling by default. This page explains the options for right-sizing your Lakebase instanc",2026-03-31T23:28:00.000Z,how-to,decision-making,0.7,True,"Page is about right-sizing capacity; likely includes guidance on choosing instance sizes, thresholds, and scaling strategies based on workload, which is decision-making with product-specific criteria.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/create/child-instance,Restore data and time travel,Restore data and time travel - Azure Databricks,,Learn how to use child instances to perform time travel and restore data to your Lakebase database instance.,"Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026. For details, see Autoscaling by default. This page explains how to use child instances to restore data and per",2026-04-15T08:00:00.000Z,how-to,,0.3,False,"Focuses on using child instances for time travel and restore. From the summary it looks like a feature/how-to explanation without explicit limits, configuration tables, or error-code troubleshooting.
No clear evidence of quantified best practices or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/create/child-instance,Restore data and time travel,Restore data and time travel - Azure Databricks,,Learn how to use child instances to perform time travel and restore data to your Lakebase database instance.,"Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026. For details, see Autoscaling by default. This page explains how to use child instances to restore data and per",2026-04-15T08:00:00.000Z,how-to,,0.3,False,"Focuses on using child instances for time travel and restore. From the summary it looks like a feature/how-to explanation without explicit limits, configuration tables, or error-code troubleshooting. No clear evidence of quantified best practices or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/create/high-availability,Configure for high availability,Configure for high availability - Azure Databricks,Configure high availability for Lakebase instances,Learn how to configure a Lakebase database instance for high availability.,"This page describes how to configure a Lakebase database instance for high availability by enabling readable secondary instances. Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created as Autoscaling projects.
Rollout st",2026-03-31T23:28:00.000Z,how-to,configuration,0.7,True,"Describes enabling readable secondary instances; likely includes specific HA settings, replication options, and constraints unique to Lakebase, which are configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/create/monitor,Monitor a database instance,Monitor a database instance - Azure Databricks,,Learn how to monitor your Lakebase (OLTP) database instance.,"Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026. For details, see Autoscaling by default. This page explains how to monitor the performance of your Lakebase da",2026-03-31T23:28:00.000Z,how-to,,0.45,False,"Monitoring page for instances; summary suggests general performance monitoring without explicit mention of metrics tables, config parameters, or error-code mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/instance,Database instances,What is a database instance? - Azure Databricks,,Learn about Lakebase database instances.,"Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026. For details, see Autoscaling by default.
A Lakebase database instance manages storage and compute resources an",2026-03-31T23:28:00.000Z,concept-article,,0.25,False,Explains what a database instance is; conceptual resource model description without detailed configuration tables or limits in the summary.,unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/manage-privileges,Manage permissions,Manage permissions - Azure Databricks,Grant and manage Lakebase instance access in UI,Provide other users access to the Lakebase database instance using the Azure Databricks UI.,"Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026. For details, see Autoscaling by default. This page describes when and how to grant Azure Databricks users and ",2026-03-31T23:28:00.000Z,how-to,security,0.7,True,"Describes when and how to grant access via the Databricks UI; likely includes specific roles/permissions and their effects, which are product-specific security details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/pg-roles,Manage PostgreSQL roles,Manage PostgreSQL roles - Azure Databricks,Create and manage PostgreSQL roles for Lakebase,Learn how to create and manage Postgres roles for Azure Databricks identities to allow users to access a Lakebase database.,"Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026.
For details, see Autoscaling by default. A Postgres role for the Lakebase database instance owner’s Azure Data",2026-03-31T23:28:00.000Z,concept-article,security,0.8,True,"Focuses on mapping Azure Databricks identities to Postgres roles; likely includes role names, privileges, and configuration patterns unique to Lakebase.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/query/,Connect and query,Connect and query - Azure Databricks,Connect to and query Lakebase database instances,Learn how to work with a Lakebase database instance.,"Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026. For details, see Autoscaling by default. This page outlines the different ways to work with your Lakebase data",2026-03-31T23:28:00.000Z,landing-page,integrations,0.65,True,"Outlines ways to work with Lakebase instances; likely includes connection strings, driver settings, and environment-specific parameters, which are integration and configuration patterns beyond generic Postgres usage.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/query/notebook,Query from Notebooks,Use a notebook to access a database instance - Azure Databricks,Query Lakebase instances from Databricks notebooks,Learn how to use a notebook to query a Lakebase database instance.,"Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling.
New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026. For details, see Autoscaling by default. This page contains code examples that show you how to access your Lak",2026-03-31T23:28:00.000Z,how-to,integrations,0.7,True,"Described as containing code examples for accessing Lakebase from notebooks; this typically includes product-specific connection strings, parameters, and usage patterns that qualify as integration-focused expert knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/query/notebook,Query from Notebooks,Use a notebook to access a database instance - Azure Databricks,Query Lakebase database instances from notebooks,Learn how to use a notebook to query a Lakebase database instance.,"Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026. For details, see Autoscaling by default. This page contains code examples that show you how to access your Lak",2026-04-22T08:00:00.000Z,how-to,integrations,0.65,True,"Page focuses on concrete code examples for accessing a Lakebase database instance from an Azure Databricks notebook. This is a product-specific integration pattern (not just a generic tutorial) showing how to connect and query the service programmatically.
While the summary is truncated, the description indicates detailed, code-based access patterns unique to Lakebase/Databricks rather than conceptual guidance.",updated https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/query/postgres-compatibility,PostgreSQL compatibility,PostgreSQL compatibility - Azure Databricks,,Learn about how Lakebase is compatible with Postgres.,"Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026. For details, see Autoscaling by default. This page describes how a Lakebase database instance is compatible wi",2026-03-31T23:28:00.000Z,how-to,,0.4,False,"PostgreSQL compatibility overview; likely lists supported/unsupported features conceptually rather than detailed config tables, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/query/psql,Query from SQL clients,Access a database instance from SQL clients - Azure Databricks,Connect external SQL clients to Lakebase instances,"Learn how to access a Lakebase database instance from external SQL clients, such as psql, DBeaver, and pgAdmin4.","Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026. For details, see Autoscaling by default.
This page describes how to access a Lakebase database instance from S",2026-03-31T23:28:00.000Z,how-to,integrations,0.75,True,"Explains how to access Lakebase from psql, DBeaver, pgAdmin4; such pages usually include host formats, ports, SSL options, and client-specific parameters, which are product-specific integration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/query/sql-editor,Query from SQL Editor,Access a database instance from the SQL editor - Azure Databricks,,"Learn how to access and query a Lakebase database instance from the SQL editor, including read-only queries on readable secondaries.","Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026. For details, see Autoscaling by default. This page describes how to access a Lakebase database instance from t",2026-03-31T23:28:00.000Z,how-to,,0.3,False,"How-to on accessing/querying a Lakebase instance from the SQL editor; likely step-by-step UI usage without detailed limits, configs, or error-code mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/register-uc,Register database as a catalog,Register your database in Unity Catalog - Azure Databricks,Register Lakebase instances with Unity Catalog,Learn how to register your Lakebase database with Unity Catalog.,"Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling.
New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026. For details, see Autoscaling by default. This page explains how to register your Lakebase database as a read-o",2026-03-31T23:28:00.000Z,how-to,integrations,0.75,True,"Registration with Unity Catalog is a specific integration; page likely details configuration fields, permissions, and constraints for this integration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/roles,Pre-created roles,Pre-created roles and permissions - Azure Databricks,Use pre-created roles and permissions in Lakebase,Learn about the different types of Postgres roles in a Lakebase database instance.,"Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026. For details, see Autoscaling by default.
This page explains the Postgres roles that you can use to govern acce",2026-03-31T23:28:00.000Z,concept-article,security,0.8,True,Explains predefined Postgres roles for Lakebase; these role names and associated permissions are product-specific security configuration knowledge.,unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/sync-data/,Work with Unity Catalog,Work with Unity Catalog - Azure Databricks,Integrate Lakebase with Unity Catalog and synced data,Integrate Lakebase with Unity Catalog to serve lakehouse data through synced tables and enable data governance.,"Integrate your Lakebase Provisioned database with Unity Catalog to enable data governance, federated queries, and automated data synchronization between your Lakehouse and PostgreSQL databases.",2026-03-11T08:00:00.000Z,landing-page,integrations,0.7,True,"Describes integration of Lakebase with Unity Catalog for governance and synchronization. Such a page typically includes configuration options, sync behaviors, and constraints specific to this integration, matching integrations.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/sync-data/sync-table,Serve data with synced tables,Serve lakehouse data with synced tables (Lakebase Provisioned) - Azure Databricks,,Serve lakehouse data through Lakebase Provisioned instances using synced tables for operational applications,"Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026. For details, see Autoscaling by default.
Synced tables let you serve lakehouse data through Lakebase Provision",2026-04-15T08:00:00.000Z,how-to,,0.3,False,"Describes serving lakehouse data via synced tables for Lakebase Provisioned instances. The summary suggests conceptual and procedural content, not detailed limits, configuration matrices, or troubleshooting mappings required for the expert sub-skill types.",updated +https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/sync-data/sync-table,Serve data with synced tables,Serve lakehouse data with synced tables (Lakebase Provisioned) - Azure Databricks,,Serve lakehouse data through Lakebase Provisioned instances using synced tables for operational applications,"Important Lakebase Provisioned is the original Lakebase offering that uses provisioned compute you scale manually. For supported regions, see Region availability. For the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore, see Lakebase Autoscaling. New Lakebase instances will be created as Autoscaling projects. Rollout starts March 12, 2026. For details, see Autoscaling by default. Synced tables let you serve lakehouse data through Lakebase Provision",2026-04-15T08:00:00.000Z,how-to,,0.3,False,"Describes serving lakehouse data via synced tables for Lakebase Provisioned instances. The summary suggests conceptual and procedural content, not detailed limits, configuration matrices, or troubleshooting mappings required for the expert sub-skill types.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/,Lakebase Autoscaling,Lakebase Autoscaling - Azure Databricks,,Projects overview and navigation,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned.
Lakebase Autoscaling is 100% standard Postgres with automatic scaling, instant branching, and deep integration with the Databricks platform. Create a project, get a connection string, and start building.",2026-04-10T21:59:00.000Z,landing-page,,0.2,False,"High-level overview/navigation for Lakebase Autoscaling projects; primarily marketing and orientation content without detailed limits, configs, or error handling.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/about,What is Lakebase?,What is Lakebase Autoscaling? - Azure Databricks,,"Learn about Lakebase Postgres Autoscaling, the next generation of managed Postgres with autoscaling, branching, and scale-to-zero capabilities.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Lakebase Postgres Autoscaling is a fully managed Postgres database built for any application that requires online transaction processing (OLTP) and low-latency data serving. It is integrated into the Databricks platform, enabling you to build real-t",2026-04-10T08:00:00.000Z,overview,,0.2,False,"Conceptual 'What is' overview of Lakebase Autoscaling; mostly describes capabilities and scenarios without concrete numeric limits, configuration parameters, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/active-queries,Active queries,Monitor active queries - Azure Databricks,,View and analyze running queries in your Lakebase Postgres database.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned.
Monitor active queries in your Lakebase Postgres project to track running queries, identify performance bottlenecks, and analyze query execution in real time.",2026-03-31T23:28:00.000Z,how-to,,0.45,False,"Focuses on viewing and analyzing running queries; summary suggests UI usage and conceptual monitoring, not product-specific configuration or error codes.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/api-usage,API,Lakebase Autoscaling API guide - Azure Databricks,"Use Lakebase Autoscaling APIs, CLI, and SDKs","Overview of the Lakebase Autoscaling API for managing projects programmatically using the REST API, Databricks CLI, and Databricks SDKs.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. This page provides an overview of the Lakebase Autoscaling API, including authentication, available endpoints, and common patterns for working with the REST API, Databricks CLI, and Databricks SDKs (Python, Java, Go). For the complete API reference,",2026-04-14T08:00:00.000Z,overview,integrations,0.6,True,"Overview of Lakebase Autoscaling API with authentication, endpoints, and patterns for REST, CLI, and SDKs. This implies product-specific API/SDK parameters and usage patterns, aligning with integrations & coding patterns.",updated -https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/authentication,About authentication,About authentication - Azure Databricks,Configure authentication for Lakebase Postgres connections,"Learn how to authenticate database connections using OAuth tokens or Postgres passwords, including token rotation and security considerations.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore.
For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Learn how to authenticate database connections to Lakebase Postgres. For step-by-step connection instructions, see Quickstart.",2026-04-17T08:00:00.000Z,concept-article,security,0.75,True,"Covers how to authenticate to Lakebase using OAuth tokens or Postgres passwords, including token rotation and security considerations. This is product-specific auth configuration and security guidance.",updated +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/api-usage,API,Lakebase Autoscaling API guide - Azure Databricks,"Use Lakebase Autoscaling APIs, CLI, and SDKs","Overview of the Lakebase Autoscaling API for managing projects programmatically using the REST API, Databricks CLI, and Databricks SDKs.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. This page provides an overview of the Lakebase Autoscaling API, including authentication, available endpoints, and common patterns for working with the REST API, Databricks CLI, and Databricks SDKs (Python, Java, Go). For the complete API reference,",2026-04-14T08:00:00.000Z,overview,integrations,0.6,True,"Overview of Lakebase Autoscaling API with authentication, endpoints, and patterns for REST, CLI, and SDKs.
This implies product-specific API/SDK parameters and usage patterns, aligning with integrations & coding patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/authentication,About authentication,About authentication - Azure Databricks,Configure Lakebase authentication and token rotation,"Learn how to authenticate database connections using OAuth tokens or Postgres passwords, including token rotation and security considerations.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Learn how to authenticate database connections to Lakebase Postgres. For step-by-step connection instructions, see Quickstart.",2026-04-22T08:00:00.000Z,concept-article,security,0.75,True,"Covers how to authenticate to Lakebase Postgres using OAuth tokens or passwords, including token rotation and security considerations. This implies concrete auth flows, token lifetimes, and secure configuration details specific to Lakebase.",updated https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/automate-with-terraform,Terraform,Get started with Terraform for Lakebase - Azure Databricks,Provision Lakebase resources with Terraform,"Get started with Terraform to create Lakebase projects, branches, and endpoints, and learn how to delete them when finished.","This guide helps you get started with Terraform to manage Lakebase resources using the Azure Databricks Terraform provider. You'll create a project, add a development branch and endpoint, and then delete them when finished. This is a typical workflow for managing development and testing environments. Tip This guide covers a subset of available Terraform commands.
For the complete resource reference and all available configuration options, see the Azure Databricks provider documentation on the Ter",2026-03-31T23:28:00.000Z,get-started,deployment,0.7,True,"Shows Terraform configuration for projects, branches, endpoints, and deletion; includes resource types and arguments specific to the Azure Databricks provider for Lakebase.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/autoscaling,Autoscaling,Autoscaling - Azure Databricks,,Autoscaling dynamically adjusts compute resources to match workload demands in real-time.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Autoscaling dynamically adjusts the amount of compute resources allocated to your Lakebase computes in response to current workload demands. As your application experiences varying levels of activity throughout the day, autoscaling automatically inc",2026-04-10T08:00:00.000Z,feature-guide,,0.4,False,"Explains autoscaling behavior conceptually (dynamic compute adjustment, scale-to-zero) but summary does not show specific configuration parameters, thresholds, or limits; reads as feature description rather than detailed config or limits.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/backup-methods,Backup and restore,Backup and restore methods - Azure Databricks,Choose backup and restore methods for Lakebase,Learn about different backup and restore methods,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned.
Lakebase Autoscaling supports different backup and restore options, which you can use separately or in combination, depending on your requirements.",2026-03-31T23:28:00.000Z,concept-article,decision-making,0.6,True,"Describes different backup and restore options and when to use them in combination; likely includes scenario-based guidance and trade-offs between methods, which is decision-making specific to Lakebase Autoscaling.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/branches,Database branches,Branches - Azure Databricks,Use Lakebase branches for database development workflows,Branching,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Branching in Lakebase lets you version, test, and evolve your data environment safely, similar to branching your code in Git. You can instantly create isolated, fully functional branches for development, experimentation, or testing schema changes, w",2026-03-31T23:28:00.000Z,feature-guide,architecture-patterns,0.6,True,"Explains branching semantics for Lakebase databases and how to use them for dev/test/experimentation, a product-specific architectural workflow pattern.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/cli,CLI,Get started with the Databricks CLI for Lakebase - Azure Databricks,Manage Lakebase with the Databricks CLI,"Get started with the Databricks CLI to manage Lakebase Postgres projects, branches, and computes","Important Lakebase Autoscaling is available in the following regions: eastus, eastus2, centralus, southcentralus, westus, westus2, canadacentral, brazilsouth, northeurope, uksouth, westeurope, australiaeast, centralindia, southeastasia.
Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. If you are a Lakebase Provisioned user, see Lakebase Provisioned. This guide helps you get started with the Databricks CLI to manage your Lakebase pr",2026-02-04T19:43:00.000Z,get-started,integrations,0.7,True,"CLI usage for Lakebase will document specific commands, flags, and configuration keys that are product-specific integration details.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/compatibility,Project Postgres compatibility,Postgres compatibility - Azure Databricks,Understand Lakebase and standard Postgres differences,Learn about Lakebase compatibility with standard Postgres.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. This page describes how Lakebase Postgres is compatible with standard Postgres. As a managed Postgres service, there are some differences and limitations.",2026-04-17T08:00:00.000Z,concept-article,limits-quotas,0.6,True,"A compatibility page for a managed Postgres service typically enumerates specific unsupported features, behavioral differences, and limitations (e.g., disabled extensions, restricted settings). These are concrete product-specific limits/constraints that qualify as limits-quotas.",updated
+https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/cli,CLI,Get started with the Databricks CLI for Lakebase - Azure Databricks,,"Get started with the Databricks CLI to manage Lakebase Postgres projects, branches, and computes","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. 
If you are a Lakebase Provisioned user, see Lakebase Provisioned. This guide helps you get started with the Databricks CLI to manage your Lakebase projects, branches, and computes (endpoints). You'll learn how to create a working project in just a few commands. For complete command reference and all available opti",2026-04-22T08:00:00.000Z,get-started,,0.3,False,"Intro to using Databricks CLI for Lakebase; summary suggests step-by-step usage, not parameter reference tables, quotas, or error-code-based troubleshooting. Likely generic tutorial-style content.",updated
+https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/compatibility,Project Postgres compatibility,Postgres compatibility - Azure Databricks,Understand Lakebase and standard Postgres differences,Learn about Lakebase compatibility with standard Postgres.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. This page describes how Lakebase Postgres is compatible with standard Postgres. As a managed Postgres service, there are some differences and limitations.",2026-04-17T08:00:00.000Z,concept-article,limits-quotas,0.6,True,"A compatibility page for a managed Postgres service typically enumerates specific unsupported features, behavioral differences, and limitations (e.g., disabled extensions, restricted settings). 
These are concrete product-specific limits/constraints that qualify as limits-quotas.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/computes-and-endpoints,Computes and endpoints,Computes and endpoints - Azure Databricks,,"Understand how computes, endpoints, and compute instances relate across the Lakebase UI, API, and documentation.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. In Lakebase, you connect to your database through a Lakebase endpoint, a stable database access point identified by a UID. Behind the endpoint, one or more compute instances provide the processing power to run your queries. Your connection strings stay",2026-04-09T22:01:00.000Z,concept-article,,0.2,False,"Appears to be a conceptual/usage overview of computes, endpoints, and instances in Lakebase without clear evidence of numeric limits, configuration tables, or troubleshooting mappings.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/connect,Connect,Connect to your database - Azure Databricks,,Learn how to connect to your Lakebase Postgres project.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Learn how to connect to your database using various clients and authentication methods. 
For managing Lakebase infrastructure (creating projects, branches, computes), see Project permissions.",2026-04-10T18:06:00.000Z,how-to,,0.2,False,"High-level guidance on connecting to Lakebase using various clients and auth methods; summary does not indicate detailed config tables, limits, or error mappings.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/connect-application,Connect an application,Connect an application - Azure Databricks,,Connect your app to Lakebase: Azure Databricks Apps or external applications.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Choose how to connect your application to Lakebase: Databricks Apps (recommended) or external applications using Postgres drivers (SDK or REST) or the Data API. Use Databricks Apps unless you must run in existing infrastructure or a specific framewo",2026-04-10T18:06:00.000Z,how-to,,0.3,False,"Describes options for connecting applications (Databricks Apps vs external apps) and when to use them, but summary does not show quantified trade-offs, limits, or detailed configuration parameters.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/connect-dbeaver,Connect with DBeaver,Connect with DBeaver - Azure Databricks,Connect to Lakebase using DBeaver,Connect to your Lakebase database using DBeaver,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. DBeaver is a versatile universal database management tool that supports a wide range of databases, including PostgreSQL. 
It provides a rich set of features for database administration, query development, and data visualization.",2026-03-31T23:28:00.000Z,how-to,integrations,0.7,True,DBeaver-specific connection instructions; likely includes driver selection and connection property values tailored to Lakebase.,unchanged
https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/connect-overview,Quickstart,Quickstart - Azure Databricks,,Quickstart guide for connecting to your Lakebase Postgres project.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Connect to your database using OAuth tokens or Postgres passwords. For managing Lakebase infrastructure (creating projects, branches, computes), see Project permissions.",2026-03-31T23:28:00.000Z,how-to,,0.3,False,"Quickstart/overview for connecting; summary suggests high-level guidance (OAuth vs password) without detailed parameters, limits, or product-specific patterns.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/connect-pgadmin,Connect with pgAdmin,Connect with pgAdmin - Azure Databricks,Connect and monitor Lakebase with pgAdmin,Connect to your Lakebase database using pgAdmin and monitor database performance,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. pgAdmin is a popular open source graphical administration tool for PostgreSQL. It provides a visual interface for managing databases, running queries, viewing data, and monitoring database performance. 
You can use pgAdmin to connect to your Lakebase",2026-03-31T23:28:00.000Z,how-to,integrations,0.7,True,Shows how to configure pgAdmin for Lakebase and monitor performance; likely includes connection property values and monitoring setup specific to Lakebase.,unchanged
-https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/connect-pghero,Connect with PgHero,Connect with PgHero - Azure Databricks,Monitor Lakebase Postgres performance with PgHero,"Monitor your Lakebase Postgres database performance with PgHero, a performance monitoring tool for Postgres databases.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. PgHero is an open-source performance monitoring tool for Postgres that helps you find and fix data issues using a dashboard interface. You can use PgHero to monitor your Lakebase Postgres database performance, identify slow queries, analyze query pa",2026-04-17T08:00:00.000Z,how-to,integrations,0.65,True,Shows how to connect PgHero to Lakebase Postgres for performance monitoring. This is a concrete integration pattern between a specific monitoring tool and Lakebase.,updated
-https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/connect-psql,Connect with psql,Connect with psql - Azure Databricks,Connect to Lakebase Postgres using psql,Connect to your Lakebase database using psql,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. psql is the native command-line client for PostgreSQL. It provides an interactive session for sending commands to Postgres and running ad-hoc queries. 
For more information about psql, see the psql reference in the PostgreSQL documentation.",2026-04-17T08:00:00.000Z,how-to,integrations,0.65,True,"Describes how to use the PostgreSQL psql client specifically with Lakebase, which likely includes Lakebase-specific connection parameters or patterns beyond generic psql usage.",updated
-https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/connection-strings,Connection strings,Connection strings - Azure Databricks,Configure Lakebase Postgres connection string formats,"Understanding Lakebase connection strings, formats, and configuration","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. A Lakebase connection string includes the role, hostname, and database name. For native Postgres password authentication, the connection string also includes the password. For OAuth authentication, you provide an OAuth token in place of a password. ",2026-04-17T08:00:00.000Z,reference,configuration,0.7,True,"Describes Lakebase-specific connection string structure including required components (role, hostname, database name) and how OAuth tokens vs passwords are supplied. This is product-specific configuration detail beyond generic PostgreSQL knowledge.",updated
+https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/connect-pghero,Connect with PgHero,Connect with PgHero - Azure Databricks,Monitor Lakebase Postgres performance with PgHero,"Monitor your Lakebase Postgres database performance with PgHero, a performance monitoring tool for Postgres databases.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. 
PgHero is an open-source performance monitoring tool for Postgres that helps you find and fix data issues using a dashboard interface. You can use PgHero to monitor your Lakebase Postgres database performance, identify slow queries, analyze query pa",2026-04-17T08:00:00.000Z,how-to,integrations,0.65,True,Shows how to connect PgHero to Lakebase Postgres for performance monitoring. This is a concrete integration pattern between a specific monitoring tool and Lakebase.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/connect-psql,Connect with psql,Connect with psql - Azure Databricks,Connect to Lakebase Postgres using psql,Connect to your Lakebase database using psql,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. psql is the native command-line client for PostgreSQL. It provides an interactive session for sending commands to Postgres and running ad-hoc queries. For more information about psql, see the psql reference in the PostgreSQL documentation.",2026-04-17T08:00:00.000Z,how-to,integrations,0.65,True,"Describes how to use the PostgreSQL psql client specifically with Lakebase, which likely includes Lakebase-specific connection parameters or patterns beyond generic psql usage.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/connection-strings,Connection strings,Connection strings - Azure Databricks,Configure Lakebase Postgres connection string formats,"Understanding Lakebase connection strings, formats, and configuration","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. 
A Lakebase connection string includes the role, hostname, and database name. For native Postgres password authentication, the connection string also includes the password. For OAuth authentication, you provide an OAuth token in place of a password. ",2026-04-17T08:00:00.000Z,reference,configuration,0.7,True,"Describes Lakebase-specific connection string structure including required components (role, hostname, database name) and how OAuth tokens vs passwords are supplied. This is product-specific configuration detail beyond generic PostgreSQL knowledge.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/core-concepts,Core concepts,Core concepts - Azure Databricks,,"Learn about the core concepts that make Lakebase unique: autoscaling, scale-to-zero, branches, and read replicas.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Lakebase is built on a set of features that enable you to develop, test, and scale your database applications efficiently. This section introduces the core concepts that differentiate Lakebase from traditional database systems.",2026-03-31T23:28:00.000Z,concept-article,,0.3,False,"Core concepts page explains autoscaling, scale-to-zero, branches, and replicas conceptually; lacks numeric thresholds or detailed configuration surfaces.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/customer-managed-keys,Customer-managed keys,Customer-managed keys for Lakebase - Azure Databricks,Use customer-managed keys to encrypt Lakebase data,Learn how customer-managed keys (CMK) encrypt Lakebase Autoscaling project data at rest using your own cloud KMS key.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. 
For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Customer-managed keys (CMK) let you encrypt Lakebase Autoscaling project data at rest (stored data) using a key that your organization owns and manages in your cloud key management service (KMS). This gives your organization full sovereignty over it",2026-04-14T17:47:00.000Z,feature-guide,security,0.75,True,"Explains configuring CMK for Lakebase Autoscaling with a cloud KMS key. This includes product-specific encryption settings, key scopes, and possibly required roles/permissions, which are security-focused expert configurations.",new
-https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/data-api,Data API,Lakebase Data API - Azure Databricks,Use Lakebase Data API for RESTful Postgres access,Use the Lakebase Data API to interact with your Postgres database through a RESTful HTTP interface.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. The Lakebase Data API is a PostgREST-compatible RESTful interface that allows you to interact directly with your Lakebase Postgres database using standard HTTP methods. It offers API endpoints derived from your database schema, allowing for secure C",2026-04-17T08:00:00.000Z,how-to,integrations,0.7,True,"Describes a PostgREST-compatible HTTP interface specific to Lakebase, including schema-derived endpoints and secure CRUD operations. 
This is a product-specific API integration pattern rather than a generic concept.",updated
-https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/data-protection,Data protection,Data protection - Azure Databricks,Configure data protection for Lakebase Postgres,"Learn about data protection features in Lakebase Postgres, including customer-managed keys, protected branches, and private connectivity.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Lakebase Postgres provides data protection features to help secure your databases and control access to critical data. These features enable you to protect production environments and establish secure network connectivity between your applications a",2026-04-14T17:47:00.000Z,concept-article,security,0.7,True,"Covers customer-managed keys, protected branches, and private connectivity. These involve specific security configurations, network settings, and access controls unique to Lakebase, aligning with security.",updated
-https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/databricks-apps,Databricks Apps,Using Lakebase with Databricks Apps - Azure Databricks,Integrate Lakebase Postgres with Databricks Apps,Connect a Lakebase database to a Databricks App using a template to store transactional data in managed Postgres,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Databricks Apps let you build and run interactive applications directly in your workspace. 
When you add Lakebase as a resource, Databricks handles the entire auth chain: a service principal is created for your app, granted a matching Postgres role, ",2026-04-17T08:00:00.000Z,how-to,integrations,0.65,True,"Explains how Databricks Apps integrate with Lakebase, including automatic creation of service principals and matching Postgres roles. This is a product-specific integration pattern between Databricks Apps and Lakebase.",updated
+https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/customer-managed-keys,Customer-managed keys,Customer-managed keys for Lakebase - Azure Databricks,Use customer-managed keys to encrypt Lakebase data,Learn how customer-managed keys (CMK) encrypt Lakebase Autoscaling project data at rest using your own cloud KMS key.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Customer-managed keys (CMK) let you encrypt Lakebase Autoscaling project data at rest (stored data) using a key that your organization owns and manages in your cloud key management service (KMS). This gives your organization full sovereignty over it",2026-04-14T17:47:00.000Z,feature-guide,security,0.75,True,"Explains configuring CMK for Lakebase Autoscaling with a cloud KMS key. This includes product-specific encryption settings, key scopes, and possibly required roles/permissions, which are security-focused expert configurations.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/data-api,Data API,Lakebase Data API - Azure Databricks,Use Lakebase Data API for RESTful Postgres access,Use the Lakebase Data API to interact with your Postgres database through a RESTful HTTP interface.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. 
For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. The Lakebase Data API is a PostgREST-compatible RESTful interface that allows you to interact directly with your Lakebase Postgres database using standard HTTP methods. It offers API endpoints derived from your database schema, allowing for secure C",2026-04-17T08:00:00.000Z,how-to,integrations,0.7,True,"Describes a PostgREST-compatible HTTP interface specific to Lakebase, including schema-derived endpoints and secure CRUD operations. This is a product-specific API integration pattern rather than a generic concept.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/data-protection,Data protection,Data protection - Azure Databricks,Configure data protection for Lakebase Postgres,"Learn about data protection features in Lakebase Postgres, including customer-managed keys, protected branches, and private connectivity.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Lakebase Postgres provides data protection features to help secure your databases and control access to critical data. These features enable you to protect production environments and establish secure network connectivity between your applications a",2026-04-14T17:47:00.000Z,concept-article,security,0.7,True,"Covers customer-managed keys, protected branches, and private connectivity. 
These involve specific security configurations, network settings, and access controls unique to Lakebase, aligning with security.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/databricks-apps,Databricks Apps,Using Lakebase with Databricks Apps - Azure Databricks,Use Lakebase Postgres as backend for Databricks Apps,Connect a Lakebase database to a Databricks app using a template to store transactional data in managed Postgres,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Databricks Apps lets you build and deploy interactive applications directly in your Azure Databricks workspace. Adding Lakebase as a resource gives your app a fully managed Postgres backend. Azure Databricks creates a service principal for your app,",2026-04-24T18:05:00.000Z,how-to,integrations,0.7,True,"Explains how to connect a Lakebase database to Databricks Apps as a managed Postgres backend, likely including app resource configuration, service principal behavior, and connection details that are specific to this integration.",updated
https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/dev-workflow-tutorial,Branch-based development workflow,Tutorial: Branch-based development workflow - Azure Databricks,,Learn how to use branches for safe development workflows with Lakebase Postgres,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. 
Learn how to use branches like Git branches, giving each developer an isolated branch for independent work, then resetting to stay in sync.",2026-04-02T22:35:00.000Z,tutorial,,0.3,False,"Tutorial-style branching workflow; summary shows conceptual guidance on using branches like Git without product-specific limits, configs, or error mappings.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/extensions,Postgres extensions,Postgres extensions - Azure Databricks,Configure and install Postgres extensions in Lakebase,Postgres extensions supported by Lakebase.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Lakebase provides support for Postgres extensions, enabling you to extend your database functionality with additional features and capabilities. See the Install an extension section below for extension installation instructions.",2026-03-31T23:28:00.000Z,how-to,configuration,0.65,True,"Page is specifically about supported Postgres extensions and how to install them; likely includes extension names, enable/disable commands, and possibly parameters unique to Lakebase, which are product-specific configuration details.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/external-apps-connect,Connect with Databricks SDK,Connect external app to Lakebase using SDK - Azure Databricks,Connect external apps to Lakebase via SDK,"Connect external applications to Lakebase using the Databricks SDK with OAuth token rotation (Python, Java, Go)","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. 
This guide shows how to connect external applications to Lakebase Autoscaling using standard Postgres drivers (psycopg, pgx, JDBC) with OAuth token rotation. You use the Azure Databricks SDK with a service principal and a connection pool that callsg",2026-03-31T23:28:00.000Z,how-to,integrations,0.85,True,Shows how to use Databricks SDK with Postgres drivers and OAuth token rotation; includes product-specific SDK calls and connection pool patterns.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/external-apps-connect,Connect with Databricks SDK,Connect external app to Lakebase using SDK - Azure Databricks,Connect external apps to Lakebase via SDK and OAuth,"Connect external applications to Lakebase using the Databricks SDK with OAuth token rotation (Python, Java, Go)","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. This guide shows how to connect external applications to Lakebase Autoscaling using standard Postgres drivers (psycopg, pgx, JDBC) with OAuth token rotation. 
You use the Azure Databricks SDK with a service principal and a connection pool that callsg",2026-04-22T08:00:00.000Z,how-to,integrations,0.8,True,"Provides product-specific integration patterns using the Databricks SDK plus standard Postgres drivers, including how to perform OAuth token rotation and connection pooling for Lakebase Autoscaling.",updated
https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/external-apps-manual-api,Connect using REST API,Connect external app to Lakebase using API - Azure Databricks,Connect external apps to Lakebase via REST API,"Connect external applications to Lakebase using REST API without SDK for languages like Node.js, Ruby, PHP","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. This guide shows how to connect external applications to Lakebase Autoscaling using direct REST API calls. Use this approach when a Databricks SDK is not available for your language (Node.js, Ruby, PHP, Elixir, Rust, etc.). If your language has SDK ",2026-03-31T23:28:00.000Z,how-to,integrations,0.85,True,"Describes direct REST API calls for languages without SDK; likely includes endpoint paths, parameters, and auth patterns unique to Lakebase Autoscaling.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/external-monitoring-tools,External tools,External monitoring tools - Azure Databricks,Integrate external monitoring tools with Lakebase Postgres,Monitor your Lakebase Postgres database with external tools like pgAdmin and PgHero.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. 
Monitor your Lakebase Postgres database using external tools that provide performance insights, query analysis, and operational visibility. These tools connect to your database using standard Postgres protocols and help you optimize database perform",2026-03-31T23:28:00.000Z,how-to,integrations,0.7,True,"Covers using pgAdmin, PgHero, etc. with Lakebase; likely includes connection parameters, required settings, and tool-specific configuration for this product, which fits integrations & coding patterns.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/external-monitoring-tools,External tools,External monitoring tools - Azure Databricks,,Monitor your Lakebase Postgres database with external tools like pgAdmin and PgHero.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Monitor your Lakebase Postgres database using external tools that provide performance insights, query analysis, and operational visibility. These tools connect to your database using standard Postgres protocols and help you optimize database perform",2026-04-24T08:00:00.000Z,how-to,,0.3,False,"Describes using external tools like pgAdmin and PgHero; summary indicates high-level integration/monitoring usage, not detailed config tables, quotas, or error mappings. 
Likely tutorial/overview rather than expert reference.",updated https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/feature-store,Feature Store and Model Serving,Feature Store and Model Serving - Azure Databricks,Back Databricks Online Feature Stores with Lakebase,Use Lakebase Autoscaling as the backend for Databricks Online Feature Stores to serve low-latency feature data to ML models and real-time applications.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Databricks Online Feature Stores are powered by Lakebase Autoscaling. When you create an online store using the Feature Engineering client, Databricks provisions a Lakebase Autoscaling project as the underlying storage backend, giving you low-latency ",2026-04-10T18:06:00.000Z,how-to,integrations,0.7,True,"Describing Lakebase Autoscaling as the backend for Online Feature Stores typically includes specific configuration fields, API calls, and constraints for online stores that are unique integration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/framework-examples,Framework examples,Framework connection examples - Azure Databricks,Use frameworks to connect to Lakebase,Connect to Lakebase from various programming languages and frameworks,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. The examples below show how to connect to your Lakebase database from different programming languages and frameworks. You can also get connection snippets for these languages from the Connect dialog in the Lakebase App. 
Note These examples use native ",2026-03-31T23:28:00.000Z,how-to,integrations,0.8,True,Shows code examples for multiple languages/frameworks using native Postgres auth; likely includes concrete driver parameters and patterns specific to Lakebase connectivity.,unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/get-started,Get started,Get started with Lakebase Autoscaling - Azure Databricks,,Get started with Lakebase Autoscaling projects,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. By the end of this guide, you'll have a running Postgres database with sample data, connected to Unity Catalog, with data flowing between Lakebase and the Databricks lakehouse. Steps: ①Create a project→ ②Connect→ ③Create a table→ ④Register in Unity C",2026-04-10T21:59:00.000Z,get-started,,0.3,False,"Getting started walkthrough for Lakebase Autoscaling; step-by-step tutorial to create a project and connect data, but no clear indication of configuration tables, limits, or troubleshooting details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/grant-permissions-programmatically,Grant permissions programmatically,Grant permissions programmatically - Azure Databricks,Programmatically manage Lakebase project permissions,"Grant and manage Lakebase project permissions programmatically using the REST API, Databricks CLI, SDKs, and Terraform.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. 
Lakebase project permissions can be managed programmatically using the standard Azure Databricks Permissions API, the Azure Databricks CLI, Azure Databricks SDKs, and Terraform. For an overview of permission types, default permissions, and how to ma",2026-04-17T18:03:00.000Z,how-to,security,0.7,True,"Covers managing Lakebase project permissions via Databricks Permissions API, CLI, SDKs, and Terraform. This implies product-specific permission scopes/objects and API/CLI parameters that go beyond generic RBAC concepts, fitting the security category.",updated +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/grant-permissions-programmatically,Grant permissions programmatically,Grant permissions programmatically - Azure Databricks,Programmatically manage Lakebase project permissions,"Grant and manage Lakebase project permissions programmatically using the REST API, Databricks CLI, SDKs, and Terraform.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Lakebase project permissions can be managed programmatically using the standard Azure Databricks Permissions API, the Azure Databricks CLI, Azure Databricks SDKs, and Terraform. For an overview of permission types, default permissions, and how to ma",2026-04-17T18:03:00.000Z,how-to,security,0.7,True,"Covers managing Lakebase project permissions via Databricks Permissions API, CLI, SDKs, and Terraform. 
This implies product-specific permission scopes/objects and API/CLI parameters that go beyond generic RBAC concepts, fitting the security category.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/grant-user-access-tutorial,Grant project and database access to a new user,Tutorial: Grant project and database access to a new user - Azure Databricks,Grant Lakebase project and database access,"Learn how to grant Lakebase project and database access to users, including project permissions and Postgres roles.","Important Lakebase Autoscaling is available in the following regions: eastus,eastus2,centralus,southcentralus,westus,westus2,canadacentral,brazilsouth,northeurope,uksouth,westeurope,australiaeast,centralindia,southeastasia. Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Learn how to set up a new user with access to your Lakebase project and database. T",2026-02-12T04:25:00.000Z,tutorial,security,0.7,True,Walkthrough of assigning project permissions and Postgres roles; includes concrete role names and permission steps specific to Lakebase security.,unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/high-availability,High availability,High availability - Azure Databricks,Design for high availability with Lakebase computes,High availability pairs a primary compute with secondary compute instances across availability zones for automatic failover and high availability.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. High availability pairs a primary read/write compute with one or more secondary compute instances distributed across availability zones. 
When the primary becomes unavailable, a secondary compute instance is automatically promoted and your applicatio",2026-03-27T08:00:00.000Z,feature-guide,architecture-patterns,0.65,True,"Describes primary/secondary compute pairing across availability zones and automatic failover behavior, a Lakebase-specific HA pattern.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/integrations,Integrate,Integrate - Azure Databricks,,Integrate Lakebase Postgres with other Databricks services and tools,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Lakebase Postgres integrates with other Databricks services, enabling you to combine transactional database capabilities with the Databricks lakehouse platform and AI tools.",2026-04-10T18:06:00.000Z,how-to,,0.2,False,"Summary describes a conceptual integration overview of Lakebase Postgres with other Databricks services without exposing concrete configuration parameters, code patterns, or limits.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/limitations,Limitations,Lakebase Autoscaling limitations - Azure Databricks,Lakebase Autoscaling feature and compatibility limitations,Limitations for Lakebase Postgres Autoscaling,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. The following limitations apply to Lakebase Postgres Autoscaling. Note This page describes limitations, such as features not yet supported and compatibility constraints. 
If you're looking for project limits (for example, maximum number of projects per ",2026-03-31T23:28:00.000Z,reference,limits-quotas,0.85,True,"Explicitly a limitations page; while it distinguishes from project limits, it will list unsupported features and compatibility constraints specific to Lakebase Autoscaling, which are expert limit/constraint details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage,Manage,Manage - Azure Databricks,,"Manage your Lakebase Postgres projects, branches, computes, roles, databases, catalogs, and synced tables.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Manage all aspects of your Lakebase Postgres projects. You can create, configure, and maintain projects, branches, computes, roles, and databases.",2026-03-31T23:28:00.000Z,landing-page,,0.3,False,"Top-level manage hub page listing areas (projects, branches, computes, etc.); primarily navigational without detailed parameters or limits in the summary.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-branches,Branches,Manage branches - Azure Databricks,Create and manage Lakebase branches,"Create and manage branches in Lakebase Postgres projects for development, testing, and production environments.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Create and manage branches to support different development workflows, testing scenarios, and production environments. Branches allow you to work with isolated database environments without affecting production. 
Note About branch management permissi",2026-03-19T08:00:00.000Z,how-to,configuration,0.7,True,"Branch management for dev/test/prod workflows; likely includes branch operations, constraints, and permission requirements specific to Lakebase.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-computes,Computes,Manage computes - Azure Databricks,Configure and optimize Lakebase Postgres computes,"Configure and manage compute resources for Lakebase Postgres projects including sizing, scaling, and performance optimization.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. A compute is a virtualized service that runs Postgres for your Lakebase projects. Each branch has one primary (read-write) compute. A compute is required to connect to a branch and access its data. For an overview of how computes and endpoints relat",2026-04-17T18:03:00.000Z,how-to,best-practices,0.65,True,"Page is about sizing, scaling, and performance optimization for Lakebase Postgres computes. This typically includes concrete product-specific recommendations (instance sizes, scaling behaviors, configuration patterns) that qualify as best practices rather than generic theory.",updated +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-computes,Computes,Manage computes - Azure Databricks,Configure and optimize Lakebase Postgres computes,"Configure and manage compute resources for Lakebase Postgres projects including sizing, scaling, and performance optimization.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. A compute is a virtualized service that runs Postgres for your Lakebase projects. Each branch has one primary (read-write) compute. A compute is required to connect to a branch and access its data. For an overview of how computes and endpoints relat",2026-04-17T18:03:00.000Z,how-to,best-practices,0.65,True,"Page is about sizing, scaling, and performance optimization for Lakebase Postgres computes. This typically includes concrete product-specific recommendations (instance sizes, scaling behaviors, configuration patterns) that qualify as best practices rather than generic theory.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-databases,Databases,Manage databases - Azure Databricks,Manage Lakebase databases and branch limits,Create and manage databases within Lakebase Postgres projects for different applications and workloads.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. 
A database is a container for SQL objects such as schemas, tables, views, functions, and indexes. In Lakebase, a database exists within a branch of a project, with a limit of 500 databases per branch.",2026-03-31T23:28:00.000Z,how-to,limits-quotas,0.9,True,Explicitly states a numeric limit of 500 databases per branch; this is a concrete service quota that LLMs would not know without documentation.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-computes,Computes,Manage computes - Azure Databricks,Configure and optimize Lakebase Postgres computes,"Configure and manage compute resources for Lakebase Postgres projects including sizing, scaling, and performance optimization.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, seeRegion availability. If you are a Lakebase Provisioned user, seeLakebase Provisioned. A compute is a virtualized service that runs Postgres for your Lakebase projects. Each branch has one primary (read-write) compute. A compute is required to connect to a branch and access its data. For an overview of how computes and endpoints relat",2026-04-17T18:03:00.000Z,how-to,best-practices,0.65,True,"Page is about sizing, scaling, and performance optimization for Lakebase Postgres computes. This typically includes concrete product-specific recommendations (instance sizes, scaling behaviors, configuration patterns) that qualify as best practices rather than generic theory.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-databases,Databases,Manage databases - Azure Databricks,Understand Lakebase Postgres database count limits,Create and manage databases within Lakebase Postgres projects for different applications and workloads.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. 
For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. A database is a container for SQL objects such as schemas, tables, views, functions, and indexes. In Lakebase, a database exists within a branch of a project, with a limit of 500 databases per branch.",2026-04-22T08:00:00.000Z,how-to,limits-quotas,0.75,True,"Explicitly states a concrete numeric limit: 'limit of 500 databases per branch'. This is a product-specific quota value that an LLM would not reliably know from training, matching limits-quotas criteria.",updated https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-high-availability,High availability,Manage high availability - Azure Databricks,Enable and configure Lakebase high availability,"Enable and configure high availability, manage read-only access to secondary compute instances, and obtain connection strings for Lakebase endpoints.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. This guide covers enabling and managing high availability for your Lakebase endpoints. 
For background on how high availability works and how secondary compute instances differ from standalone read replicas, see High availability.",2026-04-09T22:01:00.000Z,how-to,configuration,0.65,True,"High availability management for Lakebase endpoints typically includes specific settings (HA flags, secondary compute behavior, connection string patterns); the page likely documents product-specific HA configuration options and how to obtain/read-only connection strings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-project-permissions,Manage permissions,Manage project permissions - Azure Databricks,Manage Lakebase project permissions and access,Grant and manage permissions for Lakebase Postgres projects to control who can access and manage project resources.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Project permissions control who can access and manage your Lakebase project resources. Use project permissions to grant access to Azure Databricks identities, groups, and service principals for performing actions such as creating branches, managing ",2026-03-31T23:28:00.000Z,how-to,security,0.8,True,"Covers project permissions for identities, groups, and service principals; likely lists specific permission types and scopes for Lakebase resources.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-projects,Projects,Manage projects - Azure Databricks,Create and configure Lakebase Postgres projects,"Create, configure, and manage Lakebase Postgres projects including settings, resources, and lifecycle management.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. 
If you are a Lakebase Provisioned user, see Lakebase Provisioned. A project is the top-level container for your Lakebase resources, including branches, computes, databases, and roles. This page explains how to create projects, understand their structure, configure settings, and manage their lifecycle. If you're ne",2026-04-17T08:00:00.000Z,how-to,configuration,0.7,True,"Explains how to create and configure Lakebase projects, including settings and lifecycle management for branches, computes, databases, and roles. This is product-specific configuration of top-level Lakebase resources.",updated +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-projects,Projects,Manage projects - Azure Databricks,,"Create, configure, and manage Lakebase Postgres projects including settings, resources, and lifecycle management.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. A project is the top-level container for your Lakebase resources, including branches, computes, databases, and roles. This page explains how to create projects, understand their structure, configure settings, and manage their lifecycle. If you're ne",2026-04-23T08:00:00.000Z,how-to,,0.3,False,"Primarily a how-to/manage guide for Lakebase projects; summary does not indicate numeric limits, config parameter tables, error codes, or decision matrices. Likely procedural/tutorial content rather than expert reference details.",updated -https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-read-replicas,Read replicas,Manage read replicas - Azure Databricks,Create and manage Lakebase read replicas,Learn how to create and manage read replicas in Lakebase Postgres projects.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. This guide walks you through creating and managing read replicas for your projects. Read replicas segregate read-only work from your production database operations, with applications ranging from horizontal scaling to analytics workloads. 
For detail",2026-04-17T08:00:00.000Z,how-to,best-practices,0.65,True,"Managing read replicas for Lakebase Postgres will include product-specific guidance on when/how to use replicas, routing read workloads, and operational patterns, which are actionable best practices beyond generic Postgres replication concepts.",updated +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-read-replicas,Read replicas,Manage read replicas - Azure Databricks,Create and manage Lakebase read replicas,Learn how to create and manage read replicas in Lakebase Postgres projects.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. This guide walks you through creating and managing read replicas for your projects. 
Read replicas segregate read-only work from your production database operations, with applications ranging from horizontal scaling to analytics workloads. For detail",2026-04-17T08:00:00.000Z,how-to,best-practices,0.65,True,"Managing read replicas for Lakebase Postgres will include product-specific guidance on when/how to use replicas, routing read workloads, and operational patterns, which are actionable best practices beyond generic Postgres replication concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-roles,Roles,Manage roles - Azure Databricks,Manage Postgres roles in Lakebase projects,"Create, manage, and configure database Postgres roles in Lakebase Postgres projects.","Important Lakebase Autoscaling is available in the following regions: eastus,eastus2,centralus,southcentralus,westus,westus2,canadacentral,brazilsouth,northeurope,uksouth,westeurope,australiaeast,centralindia,southeastasia. Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Postgres roles control access to your Postgres databases, schemas, tables, and othe",2026-03-12T18:41:00.000Z,how-to,security,0.75,True,Covers creation and configuration of Postgres roles within Lakebase; likely includes role attributes and permission mappings unique to this environment.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-roles-permissions,Manage database permissions,Manage database permissions - Azure Databricks,Manage Lakebase Postgres roles and permissions,Learn how to grant database access to Postgres roles in your Lakebase project.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. 
Learn how to grant database permissions to Postgres roles in your Lakebase project. Prerequisite: Before granting permissions, you must have created a Postgres role. You can create OAuth roles for Azure Databricks identities using the databricks_authe",2026-04-17T08:00:00.000Z,how-to,security,0.7,True,Explains granting database permissions to Postgres roles in Lakebase projects and mentions creating OAuth roles via a specific Databricks auth resource. This is product-specific RBAC/permissions configuration.,updated +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-roles-permissions,Manage database permissions,Manage database permissions - Azure Databricks,Grant and manage Lakebase database permissions,"Learn how to grant database permissions to Postgres roles in your Lakebase project, including read, write, and schema-level access.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Grant database permissions to Postgres roles in your Lakebase project using standard SQL GRANT commands. Permissions control what databases, schemas, and tables a role can access. Note This page covers database permissions (Postgres GRANT commands). F",2026-04-24T23:15:00.000Z,how-to,security,0.8,True,"Explains how to use Postgres GRANT commands for Lakebase projects, including schema/table-level access patterns. This is concrete, product-scoped IAM/permission guidance rather than generic SQL theory.",updated -https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-with-bundles,Declarative Automation Bundles,Manage Lakebase with Declarative Automation Bundles - Azure Databricks,Manage Lakebase with Declarative Automation Bundles,"Get started with Declarative Automation Bundles to create Lakebase projects, branches, and endpoints, and learn how to manage them using infrastructure as code.","This guide helps you get started with Declarative Automation Bundles to manage Lakebase resources using infrastructure as code. You'll create a Lakebase project, add a development branch and endpoint, and learn how to manage these resources declaratively. This is a typical workflow for programmatically managing Azure Databricks resources in development and testing environments. For the complete bundle resource reference and all available configuration options, see bundle resources. Note postgres_",2026-04-15T19:49:00.000Z,get-started,configuration,0.7,True,"Covers using Declarative Automation Bundles to define Lakebase projects, branches, and endpoints as IaC. 
This relies on bundle resource schemas and configuration options (resource types, fields), which are product-specific configuration details.",updated +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-roles-permissions,Manage database permissions,Manage database permissions - Azure Databricks,Grant and manage Lakebase database permissions,"Learn how to grant database permissions to Postgres roles in your Lakebase project, including read, write, and schema-level access.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, seeRegion availability. If you are a Lakebase Provisioned user, seeLakebase Provisioned. Grant database permissions to Postgres roles in your Lakebase project using standard SQLGRANTcommands. Permissions control what databases, schemas, and tables a role can access. Note This page covers database permissions (Postgres GRANT commands). F",2026-04-24T23:15:00.000Z,how-to,security,0.8,True,"Explains how to use Postgres GRANT commands for Lakebase projects, including schema/table-level access patterns. This is concrete, product-scoped IAM/permission guidance rather than generic SQL theory.",updated +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-with-bundles,Declarative Automation Bundles,Manage Lakebase with Declarative Automation Bundles - Azure Databricks,Manage Lakebase with Declarative Automation Bundles,"Get started with Declarative Automation Bundles to create Lakebase projects, branches, and endpoints, and learn how to manage them using infrastructure as code.","This guide helps you get started with Declarative Automation Bundles to manage Lakebase resources using infrastructure as code. You'll create a Lakebase project, add a development branch and endpoint, and learn how to manage these resources declaratively. 
This is a typical workflow for programmatically managing Azure Databricks resources in development and testing environments. For the complete bundle resource reference and all available configuration options, see bundle resources. Note postgres_",2026-04-15T19:49:00.000Z,get-started,configuration,0.7,True,"Covers using Declarative Automation Bundles to define Lakebase projects, branches, and endpoints as IaC. This relies on bundle resource schemas and configuration options (resource types, fields), which are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/metrics,Metrics,Metrics dashboard - Azure Databricks,Use Lakebase metrics dashboard for monitoring,Monitor system and database metrics for your Lakebase Postgres project using the metrics dashboard.,"Important Lakebase Autoscaling is available in the following regions: eastus,eastus2,centralus,southcentralus,westus,westus2,canadacentral,brazilsouth,northeurope,uksouth,westeurope,australiaeast,centralindia,southeastasia. Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. If you are a Lakebase Provisioned user, see Lakebase Provisioned. The Metrics dashboard in the Lakebase UI provides graphs for monitoring system and ",2026-02-04T19:43:00.000Z,how-to,configuration,0.6,True,"Metrics dashboard documentation typically lists metric names, meanings, and possibly thresholds specific to Lakebase Postgres.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/monitor,Monitor,Monitor - Azure Databricks,,"Monitor and observe Lakebase Postgres projects with system operations, metrics, and query analysis tools.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. 
Monitor your Lakebase Postgres projects to track performance, resource usage, and system health. Lakebase provides multiple categories of monitoring tools to help you understand your database operations and optimize performance.",2026-03-31T23:28:00.000Z,how-to,,0.4,False,"General monitoring overview (system operations, metrics, query analysis); summary lacks specific metrics tables, config parameters, or error mappings.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/monitor,Monitor,Monitor - Azure Databricks,,"Monitor and observe Lakebase Postgres projects with system operations, metrics, and query analysis tools.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Monitor your Lakebase Postgres projects to track performance, resource usage, and system health. Lakebase provides multiple categories of monitoring tools to help you understand your database operations and optimize performance.",2026-04-24T08:00:00.000Z,how-to,,0.3,False,"Monitoring overview for Lakebase projects; summary mentions categories of tools and optimization but no specific metrics thresholds, limits, or configuration parameter tables. Reads as conceptual/usage guidance.",updated https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/operations,System operations,System operations - Azure Databricks,,Monitor Lakebase system operations to track the health and status of your project.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. System operations are actions performed by the Lakebase platform on projects and resources. 
These operations include both user-initiated activities (such as creating a branch or deleting a database) and system-initiated activities (such as suspendin",2026-03-31T23:28:00.000Z,how-to,,0.4,False,"Describes system operations conceptually (user-initiated vs system-initiated); summary does not show specific operation codes, limits, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/pg-dump-restore,Postgres pg_dump & pg_restore,Postgres pg_dump & pg_restore - Azure Databricks,Backup and restore Lakebase with pg_dump/pg_restore,Create and restore database backups using native Postgres pg\_dump and pg\_restore tools.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. This topic describes how to create a backup of your Lakebase database using the Postgres pg_dump utility and how to restore a backup using pg_restore.",2026-04-17T08:00:00.000Z,how-to,integrations,0.65,True,"Describes using native Postgres tools with Lakebase, which will include Lakebase-specific connection parameters, options, and constraints when running pg_dump/pg_restore, fitting integrations & coding patterns.",updated +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/pg-dump-restore,Postgres pg_dump & pg_restore,Postgres pg_dump & pg_restore - Azure Databricks,Backup and restore Lakebase with pg_dump/pg_restore,Create and restore database backups using native Postgres pg\_dump and pg\_restore tools.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. This topic describes how to create a backup of your Lakebase database using the Postgres pg_dump utility and how to restore a backup using pg_restore.",2026-04-17T08:00:00.000Z,how-to,integrations,0.65,True,"Describes using native Postgres tools with Lakebase, which will include Lakebase-specific connection parameters, options, and constraints when running pg_dump/pg_restore, fitting integrations & coding patterns.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/pg-stat-statements,pg_stat_statements,Monitor with pg_stat_statements - Azure Databricks,Monitor Lakebase queries using pg_stat_statements,Monitor query performance using the pg\_stat\_statements Postgres extension to track execution statistics for all SQL statements.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. 
pg_stat_statements is a Postgres extension that provides a detailed statistical view of SQL statement execution within your Lakebase Postgres database. It tracks information such as execution counts, total and average execution times, and more, helpi",2026-04-17T08:00:00.000Z,how-to,best-practices,0.65,True,"Monitoring guidance with pg_stat_statements for Lakebase Postgres will include product-specific views, recommended queries, and interpretation patterns for performance tuning, which are actionable best practices.",updated +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/pg-dump-restore,Postgres pg_dump & pg_restore,Postgres pg_dump & pg_restore - Azure Databricks,Backup and restore Lakebase with pg_dump/pg_restore,Create and restore database backups using native Postgres pg\_dump and pg\_restore tools.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. This topic describes how to create a backup of your Lakebase database using the Postgres pg_dump utility and how to restore a backup using pg_restore.",2026-04-17T08:00:00.000Z,how-to,integrations,0.65,True,"Describes using native Postgres tools with Lakebase, which will include Lakebase-specific connection parameters, options, and constraints when running pg_dump/pg_restore, fitting integrations & coding patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/pg-stat-statements,pg_stat_statements,Monitor with pg_stat_statements - Azure Databricks,Monitor Lakebase queries using pg_stat_statements,Monitor query performance using the pg\_stat\_statements Postgres extension to track execution statistics for all SQL statements.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. 
For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. pg_stat_statements is a Postgres extension that provides a detailed statistical view of SQL statement execution within your Lakebase Postgres database. It tracks information such as execution counts, total and average execution times, and more, helpi",2026-04-17T08:00:00.000Z,how-to,best-practices,0.65,True,"Monitoring guidance with pg_stat_statements for Lakebase Postgres will include product-specific views, recommended queries, and interpretation patterns for performance tuning, which are actionable best practices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/point-in-time-branching,Querying data at a point in time,Querying data at a point in time - Azure Databricks,Use point-in-time branches in Lakebase,Query data at any point in time using point-in-time branches,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Point-in-time branching allows you to query your database as it existed at any moment within your restore window. This feature creates a new branch with your data from a specific point in time, so you can analyze historical states without affecting ",2026-03-31T23:28:00.000Z,feature-guide,configuration,0.6,True,Explains how to create and query point-in-time branches within a restore window; Lakebase-specific feature behavior and usage steps.,unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/point-in-time-restore,Point-in-time restore,Point-in-time restore - Azure Databricks,,Restore your data to a point in time within your restore window.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. 
For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. You can restore from any point in time within your project's restore window. This restore option lets you quickly recover from an accidental data loss or modification such as an unintended deletion or schema change.",2026-03-31T23:28:00.000Z,feature-guide,,0.4,False,"Explains point-in-time restore conceptually; summary does not indicate specific restore windows, limits, or configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/postgres,Postgres,Postgres - Azure Databricks,,"Postgres features, extensions, and compatibility in Lakebase Postgres projects.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Lakebase Postgres provides a fully managed PostgreSQL experience with support for PostgreSQL extensions and broad compatibility with standard Postgres features. You can use familiar Postgres tools, clients, and workflows with your Lakebase projects.",2026-03-31T23:28:00.000Z,concept-article,,0.2,False,"High-level description of Lakebase Postgres features and compatibility; no concrete limits, configs, or security role details in summary.",unchanged @@ -2002,20 +2030,21 @@ https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/private-link,Pr https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/protected-branches,Protected branches,Protected branches - Azure Databricks,Configure protected branches in Lakebase Postgres,Configure branch protection rules to safeguard critical branches in Lakebase Postgres.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. 
If you are a Lakebase Provisioned user, see Lakebase Provisioned. Protected branches help you safeguard critical branches from accidental modifications or deletions.",2026-03-31T23:28:00.000Z,how-to,security,0.75,True,"Protected branches are an access-control mechanism; page likely defines specific protection rules, permissions, and constraints, which are product-specific security configurations.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/query-data,Query,Query your data - Azure Databricks,,Learn how to query and work with data in your Lakebase Postgres database using various tools and interfaces.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Learn how to query and work with data in your Lakebase Postgres database. Lakebase provides several methods to access and query your data, each suited for different use cases and workflows. Note Before querying, make sure you've connected to your dat",2026-04-10T18:06:00.000Z,how-to,,0.3,False,"General guidance on querying data using various tools; summary does not indicate presence of product-specific configuration tables, limits, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/query-performance,Query performance,Monitor query performance - Azure Databricks,,View and analyze query history for your Lakebase Postgres database.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. 
Monitor query performance for your Lakebase Postgres project to analyze historical query execution, identify slow queries, and find optimization opportunities.",2026-03-31T23:28:00.000Z,how-to,,0.45,False,Historical query performance monitoring; summary indicates conceptual optimization guidance without concrete config values or limits.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/query-sql-editor,Query from Lakehouse SQL Editor,Query from SQL Editor in Lakehouse - Azure Databricks,Query Lakebase from Lakehouse SQL editor,Query your Lakebase database from the SQL editor in Lakehouse by connecting directly to Lakebase compute or registering in Unity Catalog.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. This page describes how to query databases in your Lakebase project from the SQL editor in Lakehouse using two different connection methods. The SQL editor in Lakehouse is a collaborative SQL workspace where you can author queries, browse data catal",2026-04-17T08:00:00.000Z,how-to,integrations,0.7,True,Describes two specific connection methods from the Lakehouse SQL editor to Lakebase compute or via Unity Catalog. This is a product-specific integration and connection pattern.,updated +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/query-sql-editor,Query from Lakehouse SQL Editor,Query from SQL Editor in Lakehouse - Azure Databricks,Query Lakebase from Lakehouse SQL editor,Query your Lakebase database from the SQL editor in Lakehouse by connecting directly to Lakebase compute or registering in Unity Catalog.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. 
If you are a Lakebase Provisioned user, see Lakebase Provisioned. This page describes how to query databases in your Lakebase project from the SQL editor in Lakehouse using two different connection methods. The SQL editor in Lakehouse is a collaborative SQL workspace where you can author queries, browse data catal",2026-04-17T08:00:00.000Z,how-to,integrations,0.7,True,Describes two specific connection methods from the Lakehouse SQL editor to Lakebase compute or via Unity Catalog. This is a product-specific integration and connection pattern.,unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/read-replicas,Read replicas,Read replicas - Azure Databricks,Scale reads with Lakebase read replicas,Scale your application with read replicas that provide independent read-only computes without duplicating data.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Read replicas are independent read-only computes that perform read operations on the same data as your primary read-write compute. Lakebase read replicas don't replicate or duplicate data. 
Instead, read requests are served from the same storage laye",2026-03-31T23:28:00.000Z,feature-guide,architecture-patterns,0.65,True,"Explains how read replicas share the same storage layer without data duplication, providing a Lakebase-specific scaling pattern and trade-offs.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/register-uc,Register database in Unity Catalog,Register a Lakebase database in Unity Catalog - Azure Databricks,Register Lakebase Postgres databases in Unity Catalog,Register your Lakebase database in Unity Catalog to enable unified governance and cross-source queries.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Registering a Lakebase database in Unity Catalog creates a read-only catalog that represents your Postgres database, enabling unified data governance and cross-source analytics across your lakehouse and transactional workloads.",2026-04-10T08:00:00.000Z,how-to,configuration,0.65,True,"A registration guide for a database in Unity Catalog typically includes specific configuration steps, object names, and settings unique to Databricks governance, which are product-specific configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/roles-permissions,Roles and permissions,Roles and permissions - Azure Databricks,Configure Postgres roles and permissions in Lakebase,Set up database access by creating Postgres roles and granting permissions.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. 
Postgres roles control who can access your database and what actions they can perform. Before users can connect to your database, you must create Postgres roles for them and grant appropriate permissions to access data. Note Postgres roles controlda",2026-03-31T23:28:00.000Z,how-to,security,0.7,True,Covers creating roles and granting permissions for Lakebase; likely includes specific GRANT patterns and role usage tailored to this managed service.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/register-uc,Register database in Unity Catalog,Register a Lakebase database in Unity Catalog - Azure Databricks,,Register your Lakebase database in Unity Catalog to enable unified governance and cross-source queries.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Registering a Lakebase database in Unity Catalog creates a read-only catalog that represents your Postgres database, enabling unified data governance and cross-source analytics across your lakehouse and transactional workloads.",2026-04-22T08:00:00.000Z,how-to,,0.3,False,"Explains registering a Lakebase database in Unity Catalog; summary focuses on purpose and outcome (read-only catalog, unified governance) without indicating specific RBAC role lists, config parameters, or limits.",updated +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/roles-permissions,Roles and permissions,Roles and permissions - Azure Databricks,Configure Lakebase Postgres roles and permissions,Set up Postgres database access for your Lakebase project by creating roles and granting the appropriate permissions to connect and query data.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. 
For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Postgres roles control who can access your database and what actions they can perform. Before users can connect to your database, you must create Postgres roles for them and grant appropriate permissions to access data. Note Postgres roles controlda",2026-04-24T23:15:00.000Z,how-to,security,0.8,True,"Details how to set up Postgres roles and grant permissions for Lakebase projects. Likely includes specific role patterns, required privileges to connect/query, and how these map to Azure Databricks, which are product-specific security configurations.",updated https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/scale-to-zero,Scale to zero,Scale to zero - Azure Databricks,Configure Lakebase scale-to-zero idle suspension behavior,Scale to zero automatically suspends inactive computes to minimize costs while maintaining quick reactivation.,"Important Lakebase Autoscaling is available in the following regions:eastus,eastus2,centralus,southcentralus,westus,westus2,canadacentral,brazilsouth,northeurope,uksouth,westeurope,australiaeast,centralindia,southeastasia. Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. If you are a Lakebase Provisioned user, see Lakebase Provisioned. 
Scale to zero automatically suspends your Lakebase compute after a period of inacti",2026-02-04T19:43:00.000Z,feature-guide,configuration,0.7,True,"Explains how scale-to-zero suspends compute after a period of inactivity, implying specific timing/behavioral settings that are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/snapshots,Snapshots,Snapshots - Azure Databricks,Create and restore Lakebase database snapshots,"Create and restore from database snapshots for backup, development, and testing purposes.","Important Lakebase Autoscaling is available in the following regions:eastus,eastus2,centralus,southcentralus,westus,westus2,canadacentral,brazilsouth,northeurope,uksouth,westeurope,australiaeast,centralindia,southeastasia. Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. If you are a Lakebase Provisioned user, see Lakebase Provisioned. A snapshot is a point-in-time capture of a project's root branch, including the sch",2026-03-02T19:05:00.000Z,how-to,configuration,0.7,True,"Snapshot behavior (scope, retention, restore semantics) is specific to Lakebase and constitutes expert configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/sql-editor,Query from Lakebase SQL Editor,Query from Lakebase SQL Editor - Azure Databricks,,Query your database from the Lakebase SQL Editor.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. The Lakebase SQL Editor runs queries on your Lakebase databases directly from the Lakebase App. It offers Postgres-native features such as EXPLAIN/ANALYZE, psql-style meta-commands, and exporting results to CSV/JSON/XLSX. 
Note You can also query your L",2026-04-09T17:57:00.000Z,how-to,,0.4,False,"Describes features of the Lakebase SQL Editor (EXPLAIN/ANALYZE, meta-commands, export formats); likely a usage guide rather than detailed configuration, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/state-management,Agent state and memory,Agent state and memory - Azure Databricks,Store AI agent state in Lakebase Postgres,Use Lakebase Postgres for durable agent state and memory in AI agents built with LangGraph or the OpenAI Agents SDK.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. AI agents need persistent storage to maintain context across turns and sessions. Lakebase Autoscaling provides a fully managed Postgres backend for storing agent state and memory, integrating natively with Databricks authentication and scaling autom",2026-04-10T18:06:00.000Z,how-to,integrations,0.7,True,"Using Lakebase Postgres for agent state with LangGraph or OpenAI Agents SDK implies concrete schema patterns, connection parameters, and SDK usage that are product-specific integration and coding patterns.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/sync-tables,Serve lakehouse data with synced tables,Serve lakehouse data with synced tables - Azure Databricks,Serve lakehouse data via Lakebase synced tables,Serve lakehouse data through Lakebase for operational analytics and real-time applications,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Synced tables let you serve lakehouse data through Lakebase Postgres. 
Unity Catalog tables sync into Postgres so applications can query lakehouse data directly with low latency. This process is commonly known as reverse ETL. The lakehouse is optimiz",2026-04-17T08:00:00.000Z,how-to,architecture-patterns,0.6,True,"Describes using synced tables to serve lakehouse data through Lakebase for operational analytics and real-time apps. This is a product-specific pattern (reverse ETL via Lakebase) with guidance on when/how to use it, fitting architecture & design patterns.",updated +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/sync-tables,Serve lakehouse data with synced tables,Serve lakehouse data with synced tables - Azure Databricks,,Serve lakehouse data through Lakebase for operational analytics and real-time applications,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. Synced tables let you serve lakehouse data through Lakebase Postgres. Unity Catalog tables sync into Postgres so applications can query lakehouse data directly with low latency. This process is commonly known as reverse ETL. The lakehouse is optimiz",2026-04-22T08:00:00.000Z,how-to,,0.3,False,"Describes synced tables and reverse ETL conceptually; summary does not show numeric limits, config parameter tables, or decision matrices. Appears to be conceptual/usage guidance rather than expert configuration or limits.",updated https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/table-editor,Tables editor,Managing data with the Tables editor - Azure Databricks,,"Use the Tables editor to easily view, edit, and manage your data and schemas in Lakebase.","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. 
If you are a Lakebase Provisioned user, see Lakebase Provisioned. The Tables editor in Lakebase offers a visual interface for managing data and schemas. This interactive view lets you add, update, and delete records, filter data, modify columns, drop or truncate tables, export data in both JSON and CSV formats, and ",2026-04-09T17:57:00.000Z,how-to,,0.3,False,"Tables editor page appears to be a UI/usage walkthrough for viewing and editing data; summary does not suggest numeric limits, config parameter tables, or error-code-based troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/tutorial-databricks-apps-autoscaling,Write your own,Connect a custom Databricks app to Lakebase - Azure Databricks,Secure Databricks app access to Lakebase Autoscaling,Learn how to connect a Databricks app to Lakebase Autoscaling using service principal authentication with automatic OAuth token rotation.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. This tutorial shows you how to connect a Databricks app to Lakebase Autoscaling with automatic credential rotation. The app generates fresh database credentials from Databricks before they expire. The example uses Flask, but the authentication patter",2026-04-17T08:00:00.000Z,tutorial,security,0.7,True,"Tutorial focuses on connecting via service principal authentication with automatic OAuth token rotation and handling credential expiry. 
These are product-specific authentication and token-handling patterns, fitting security-focused configuration.",updated +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/transfer-object-ownership,Transfer object ownership,Transfer Postgres object ownership - Azure Databricks,Transfer Lakebase Postgres object ownership via group roles,Transfer ownership of Postgres objects to an Azure Databricks group role in Lakebase using a temporary intermediate role.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. In Lakebase, use a temporary shared role as an intermediate step to transfer Postgres object ownership between roles. You can't do this directly with a standard ALTER TABLE ... OWNER TO command. Note This page covers transferring ownership to an Azure ",2026-04-24T23:15:00.000Z,how-to,security,0.8,True,"Describes a Lakebase-specific workaround for transferring Postgres object ownership using a temporary shared role instead of direct ALTER OWNER, tied to Azure Databricks group roles. This is a product-specific security/permissions pattern and edge case.",new +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/tutorial-databricks-apps-autoscaling,Write your own,Connect a custom Databricks app to Lakebase - Azure Databricks,Connect custom Databricks app to Lakebase Autoscaling,Learn how to connect a Databricks app to Lakebase Autoscaling using service principal authentication with automatic OAuth token rotation.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. 
This tutorial shows you how to connect a Databricks app to Lakebase Autoscaling with automatic credential rotation. The app generates fresh database credentials from Databricks before they expire. The example uses Flask, but the authentication patter",2026-04-24T18:05:00.000Z,tutorial,integrations,0.8,True,"Shows a concrete authentication and connection pattern (service principal, OAuth token rotation, credential refresh) between Databricks Apps and Lakebase Autoscaling, with code-level details and product-specific parameters.",updated https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/tutorials,Tutorials,Lakebase Postgres tutorials - Azure Databricks,,Learn how to use Lakebase Postgres through hands-on tutorials for common workflows and development patterns.,"Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. These tutorials walk you through common workflows and development patterns for Lakebase Postgres.",2026-03-31T23:28:00.000Z,tutorial,,0.3,False,Tutorials landing page listing workflows; primarily navigational without detailed technical content in the summary.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/updates,Updates,Manage updates - Azure Databricks,Configure automatic updates for Lakebase computes,"Manage automatic updates for Postgres minor versions, security patches, and platform features in your Lakebase projects. Lakebase uses prewarming to minimize performance impact during compute restarts","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. 
To keep your Lakebase Postgres instances up to date with the latest patches and features, Lakebase applies updates to your project's computes. You can select an update window by choosing a specific day and hour for updates.",2026-04-15T19:49:00.000Z,reference,configuration,0.6,True,"Describes selecting update windows (specific day and hour) and how Lakebase applies updates and uses prewarming. This is product-specific configuration of update behavior and scheduling, fitting configuration.",updated -https://learn.microsoft.com/en-us/azure/databricks/oltp/upgrade-to-autoscaling,Autoscaling by default,Autoscaling by default - Azure Databricks,Decide between Lakebase Provisioned and Autoscaling projects,"Autoscaling by default for new Lakebase instances. How new projects appear in the UI, connection details, and integrations.","Note Starting March 12, 2026, new Lakebase instances will be created as Lakebase Autoscaling projects (instead of Provisioned instances) so you can use capabilities such as autoscaling compute, scale-to-zero, branching, and instant restore. The rollout begins on March 12 and is expected to complete within a few days. Until your workspace is updated, you can still create Provisioned instances. Once your workspace is in the rollout, any new Lakebase instance you create will be an Autoscaling projec",2026-04-15T08:00:00.000Z,concept-article,decision-making,0.65,True,"Explains the autoscaling-by-default rollout, when new instances become Autoscaling projects, and how this affects creation options and integrations. 
This is product-specific decision guidance about which Lakebase mode you get and under what rollout conditions, helping users understand behavior changes and choose or plan migration paths, which aligns best with decision-making.",updated +https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/updates,Updates,Manage updates - Azure Databricks,Configure automatic updates for Lakebase computes,"Manage automatic updates for Postgres minor versions, security patches, and platform features in your Lakebase projects. Lakebase uses prewarming to minimize performance impact during compute restarts","Important Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned. To keep your Lakebase Postgres instances up to date with the latest patches and features, Lakebase applies updates to your project's computes. You can select an update window by choosing a specific day and hour for updates.",2026-04-15T19:49:00.000Z,reference,configuration,0.6,True,"Describes selecting update windows (specific day and hour) and how Lakebase applies updates and uses prewarming. This is product-specific configuration of update behavior and scheduling, fitting configuration.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/oltp/upgrade-to-autoscaling,Autoscaling by default,Autoscaling by default - Azure Databricks,Decide between Lakebase Provisioned and Autoscaling projects,"Autoscaling by default for new Lakebase instances. How new projects appear in the UI, connection details, and integrations.","Note Starting March 12, 2026, new Lakebase instances will be created as Lakebase Autoscaling projects (instead of Provisioned instances) so you can use capabilities such as autoscaling compute, scale-to-zero, branching, and instant restore. 
The rollout begins on March 12 and is expected to complete within a few days. Until your workspace is updated, you can still create Provisioned instances. Once your workspace is in the rollout, any new Lakebase instance you create will be an Autoscaling projec",2026-04-15T08:00:00.000Z,concept-article,decision-making,0.65,True,"Explains the autoscaling-by-default rollout, when new instances become Autoscaling projects, and how this affects creation options and integrations. This is product-specific decision guidance about which Lakebase mode you get and under what rollout conditions, helping users understand behavior changes and choose or plan migration paths, which aligns best with decision-making.",unchanged https://learn.microsoft.com/en-us/azure/databricks/optimizations/,Overview,Optimization recommendations on Azure Databricks - Azure Databricks,Apply performance optimization recommendations on Azure Databricks,Learn about optimizations and performance recommendations on Azure Databricks.,"Azure Databricks provides many optimizations supporting a variety of workloads on the lakehouse, ranging from large-scale ETL processing to ad-hoc, interactive queries. Many of these optimizations take place automatically. You get their benefits simply by using Azure Databricks. Additionally, most Databricks Runtime features require Delta Lake, the default format used to create tables in Azure Databricks. Azure Databricks configures default values that optimize most workloads. 
But, in some cases",2026-04-06T18:30:00.000Z,overview,best-practices,0.65,True,"Optimization recommendations for Databricks typically include concrete, product-specific tuning guidance (e.g., settings, feature usage, workload-specific advice) beyond generic performance theory.",unchanged https://learn.microsoft.com/en-us/azure/databricks/optimizations/aqe,Adaptive query execution,Adaptive query execution - Azure Databricks,Use adaptive query execution on Databricks,Learn how to use adaptive query execution in Azure Databricks.,"Adaptive query execution (AQE) is query re-optimization that occurs during query execution. The motivation for runtime re-optimization is that Azure Databricks has the most up-to-date accurate statistics at the end of a shuffle and broadcast exchange (referred to as a query stage in AQE). As a result, Azure Databricks can opt for a better physical strategy, pick an optimal post-shuffle partition size and number, or do optimizations that used to require hints, for example, skew join handling. Thi",2024-03-01T08:00:00.000Z,concept-article,best-practices,0.65,True,"Describes Databricks AQE behavior, including how it re-optimizes queries at runtime and handles skew joins, with product-specific tuning implications.",unchanged https://learn.microsoft.com/en-us/azure/databricks/optimizations/archive-delta,Archival support,Archival support in Azure Databricks - Azure Databricks,,Archival support in Azure Databricks enables you to use cloud-based lifecycle policies on cloud object storage containing Delta tables.,Important This feature is in Public Preview for Databricks Runtime 13.3 LTS and above. Archival support in Azure Databricks enables you to use cloud-based lifecycle policies on cloud object storage containing Delta tables. 
Enabling archival support on a Delta Lake table effectively tells Azure Databricks to ignore files that are older than the specified period in the table.,2026-04-09T08:00:00.000Z,feature-guide,,0.3,False,"Feature overview of archival support; summary does not indicate numeric limits, configuration tables, or detailed parameters. Likely conceptual behavior description rather than expert-only configuration or limits.",unchanged @@ -2024,7 +2053,7 @@ https://learn.microsoft.com/en-us/azure/databricks/optimizations/cbo,Cost-based For this to work it is critical to collect table and column statistics and keep them up to date.",2026-02-12T22:57:00.000Z,concept-article,best-practices,0.65,True,"Explains how CBO works in Spark SQL on Databricks and the need for up-to-date stats, with Databricks-specific guidance on using and benefiting from CBO.",unchanged https://learn.microsoft.com/en-us/azure/databricks/optimizations/disk-cache,Disk caching,Optimize performance with caching on Azure Databricks - Azure Databricks,Improve read performance with Databricks disk cache,Learn how Azure Databricks disk caching accelerates data reads.,"Azure Databricks uses disk caching to accelerate data reads by creating copies of remote Parquet data files in nodes' local storage using a fast intermediate data format. The data is cached automatically whenever a file has to be fetched from a remote location. Successive reads of the same data are then performed locally, which results in significantly improved reading speed. The cache works for all Parquet data files (including Delta Lake tables). 
Note In SQL warehouses and Databricks Runtime 1",2024-05-01T08:00:00.000Z,concept-article,best-practices,0.7,True,"Explains Databricks disk caching behavior, when it applies (Parquet/Delta, SQL warehouses, runtimes), and how it impacts performance, which is product-specific.",unchanged https://learn.microsoft.com/en-us/azure/databricks/optimizations/dynamic-file-pruning,Dynamic file pruning,Dynamic file pruning - Azure Databricks,Improve Delta query performance with dynamic file pruning on Azure Databricks,Learn how dynamic file pruning on Azure Databricks can improve queries on Delta Lake tables.,"Dynamic file pruning, can significantly improve the performance of many queries on Delta Lake tables. Dynamic file pruning triggers based on query optimizer decisions, and for queries that contain filter statements orWHEREclauses. You must use Photon-enabled compute to guarantee use of dynamic file pruning inMERGE,UPDATE, andDELETEstatements. OnlySELECTstatements guarantee use of dynamic file pruning when Photon is not used. Dynamic file pruning is especially efficient for non-partitioned tables",2026-03-31T23:28:00.000Z,concept-article,best-practices,0.75,True,"Explains when dynamic file pruning is triggered, its dependency on Photon for certain statements, and which query patterns benefit most. These are Databricks-specific performance behaviors and recommendations.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/optimizations/incremental-refresh,Incremental refresh for materialized views,Incremental refresh for materialized views - Azure Databricks,Choose and configure incremental refresh for Databricks materialized views,Learn how to use incremental refresh with materialized views.,"This article outlines the semantics and requirements for incremental refreshes on materialized views, and identifies the SQL operations, keywords, and clauses that support incremental refresh. 
It includes discussion of the differences between incremental and full refreshes, and includes recommendations for choosing between materialized views and streaming tables. When running updates on materialized views using serverless pipelines, many queries can be incrementally refreshed. Incremental refres",2026-04-07T08:00:00.000Z,feature-guide,decision-making,0.7,True,"Covers semantics and requirements for incremental refresh, supported SQL operations/clauses, and recommendations for choosing between materialized views and streaming tables. This is product-specific decision guidance about when to use each approach and how to configure incremental refresh, matching decision-making with some best-practice aspects.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/optimizations/incremental-refresh,Incremental refresh for materialized views,Incremental refresh for materialized views - Azure Databricks,Choose and configure incremental refresh for Databricks materialized views,Learn how to use incremental refresh with materialized views.,"This article outlines the semantics and requirements for incremental refreshes on materialized views, and identifies the SQL operations, keywords, and clauses that support incremental refresh. It includes discussion of the differences between incremental and full refreshes, and includes recommendations for choosing between materialized views and streaming tables. When running updates on materialized views using serverless pipelines, many queries can be incrementally refreshed. 
Incremental refres",2026-04-21T08:00:00.000Z,feature-guide,decision-making,0.65,True,"Article compares incremental vs full refresh and materialized views vs streaming tables, and provides recommendations for choosing between them; this is product-specific decision guidance about when to use each approach.",updated https://learn.microsoft.com/en-us/azure/databricks/optimizations/isolation/,Isolation levels and write conflicts,Isolation levels and write conflicts - Azure Databricks,,"Learn about isolation levels, write conflicts, and concurrency control for transactions on Azure Databricks.",This page describes isolation levels and write conflict behavior for Delta Lake tables on Azure Databricks. Delta Lake provides ACID transaction guarantees between reads and writes: SeeWhat are ACID guarantees on Azure Databricks?. Note Azure Databricks uses Delta Lake for all tables by default.,2026-03-09T18:23:00.000Z,concept-article,,0.3,False,"High-level explanation of isolation levels and write conflicts; summary suggests conceptual behavior without concrete settings, codes, or config values.",unchanged https://learn.microsoft.com/en-us/azure/databricks/optimizations/isolation/isolation-levels,Isolation levels,Isolation levels (WriteSerializable and Serializable) - Azure Databricks,,Learn about WriteSerializable and Serializable isolation levels for Delta tables on Azure Databricks.,Delta Lake on Azure Databricks supports two isolation levels that control how concurrent operations on a given table interact:,2026-03-09T18:23:00.000Z,concept-article,,0.4,False,"Describes two isolation levels conceptually; summary does not indicate specific configuration parameters, thresholds, or product-specific numeric guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/optimizations/isolation/row-level-concurrency,Row-level concurrency,Row-level concurrency - Azure Databricks,,Learn about row-level concurrency and how it reduces write conflicts for Delta tables on Azure 
Databricks.,Row-level concurrency reduces conflicts between concurrent write operations by detecting changes at the row level and automatically resolving conflicts that occur when concurrent writes update or delete different rows in the same data file.,2026-03-09T18:23:00.000Z,concept-article,,0.4,False,"Row-level concurrency behavior is described conceptually; summary does not show concrete config options, error codes, or numeric thresholds.",unchanged @@ -2049,8 +2078,8 @@ https://learn.microsoft.com/en-us/azure/databricks/optimizations/spark-ui-guide/ https://learn.microsoft.com/en-us/azure/databricks/optimizations/spark-ui-guide/spark-memory-issues,Memory issues,Spark memory issues - Azure Databricks,Diagnose and fix Spark memory issues on Databricks,Debugging Spark memory issues,,2025-05-09T19:58:00.000Z,troubleshooting,troubleshooting,0.7,True,Focused on debugging Spark memory problems with Databricks-specific patterns and likely includes concrete symptoms and remediation steps.,unchanged https://learn.microsoft.com/en-us/azure/databricks/optimizations/spark-ui-guide/spark-rewriting-data,Spark rewriting data,How to determine if Spark is rewriting data - Azure Databricks,Detect unnecessary data rewriting in Databricks Spark writes,How to determine if Spark is rewriting data,"First open the SQL DAG for your write stage. Scroll up to the top of the job's page and click on the Associated SQL Query: You should now see the DAG. If not, scroll around a bit and you should see it: If you're doing a Delete or Update operation, look at the amount of data being written by the writer versus what you expect. 
If you're seeing a lot more data being written than you expect, you're probably rewriting data: If you're doing a merge, the merge node has explicit statistics about how muc",2025-05-09T19:58:00.000Z,troubleshooting,best-practices,0.72,True,"Uses Databricks SQL DAG statistics for delete/update/merge to determine if Spark is rewriting more data than expected; detailed, product-specific diagnostic technique.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pandas/,Pandas overview,Can you use pandas on Azure Databricks? - Azure Databricks,Choose pandas options on Azure Databricks,"Discover options for working with pandas on Azure Databricks. Use DataFrames, convert to PySpark, and apply functions with Arrow.","Databricks Runtime includes pandas as one of the standard Python packages, allowing you to create and leverage pandas DataFrames in Databricks notebooks and jobs. In Databricks Runtime 10.4 LTS and above,Pandas API on Sparkprovides familiar pandas commands on top of PySpark DataFrames. You can alsoconvert DataFrames between pandas and PySpark. Apache Spark includes Arrow-optimized execution of Python logic in the form ofpandas function APIs, which allow users to apply pandas transformations dire",2026-03-31T08:00:00.000Z,overview,decision-making,0.7,True,"Compares pandas, pandas API on Spark, and conversions with PySpark/Arrow, giving guidance on when to use each approach for different scenarios—service-specific decision guidance.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/pandas/pandas-function-apis,pandas function APIs,pandas function APIs - Azure Databricks,Apply pandas function APIs to PySpark DataFrames,Learn how to work with pandas function APIs in Azure Databricks.,"pandas function APIs enable you to directly apply a Python native function that takes and outputs pandas instances to a PySpark DataFrame. 
Similar topandas user-defined functions, function APIs also useApache Arrowto transfer data and pandas to work with the data; however, Python type hints are optional in pandas function APIs. There are three types of pandas function APIs: pandas function APIs leverage the same internal logic that pandas UDF execution uses. They share characteristics such as Py",2026-02-26T08:00:00.000Z,concept-article,integrations,0.75,True,"Describes pandas function APIs, their types, and behavior (Arrow-based execution, type hints) which are concrete API patterns for Databricks users.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/pandas/pandas-on-spark,Pandas API on Spark,Pandas API on Spark - Azure Databricks,Use pandas API on Spark in Databricks,Learn how to use the pandas API on Spark to access data in Azure Databricks.,"Note This feature is available on clusters that run Databricks Runtime 10.0 and above. For clusters that run Databricks Runtime 9.1 LTS and below, useKoalasinstead. Commonly used by data scientists,pandasis a Python package that provides easy-to-use data structures and data analysis tools for the Python programming language. However, pandas does not scale out to big data. Pandas API on Spark fills this gap by providing pandas equivalent APIs that work on Apache Spark. Pandas API on Spark is usef",2026-04-02T22:35:00.000Z,concept-article,configuration,0.75,True,"Includes Databricks Runtime version requirements and how to enable/use pandas API on Spark, which is specific to Databricks environments.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pandas/pandas-function-apis,pandas function APIs,pandas function APIs - Azure Databricks,Use pandas function APIs with PySpark DataFrames,Learn how to work with pandas function APIs in Azure Databricks.,"pandas function APIs enable you to directly apply a Python native function that takes and outputs pandas instances to a PySpark DataFrame. 
Similar to pandas user-defined functions, function APIs also use Apache Arrow to transfer data and pandas to work with the data; however, Python type hints are optional in pandas function APIs. There are three types of pandas function APIs: pandas function APIs leverage the same internal logic that pandas UDF execution uses. They share characteristics such as Py",2026-04-23T08:00:00.000Z,concept-article,integrations,0.6,True,"Describes pandas function APIs that apply native Python functions to PySpark DataFrames, leveraging Arrow and pandas; likely includes specific function signatures, parameter behaviors, and execution characteristics that are product- and API-specific integration patterns.",updated +https://learn.microsoft.com/en-us/azure/databricks/pandas/pandas-on-spark,Pandas API on Spark,Pandas API on Spark - Azure Databricks,,Learn how to use the pandas API on Spark to access data in Azure Databricks.,"Note This feature is available on clusters that run Databricks Runtime 10.0 and above. For clusters that run Databricks Runtime 9.1 LTS and below, use Koalas instead. Commonly used by data scientists, pandas is a Python package that provides easy-to-use data structures and data analysis tools for the Python programming language. However, pandas does not scale out to big data. Pandas API on Spark fills this gap by providing pandas equivalent APIs that work on Apache Spark. 
Pandas API on Spark is usef",2026-04-23T08:00:00.000Z,concept-article,,0.3,False,"Explains pandas API on Spark and mentions runtime version requirement, but otherwise is conceptual/usage guidance without detailed limits, configuration tables, or decision criteria.",updated https://learn.microsoft.com/en-us/azure/databricks/pandas/pyspark-pandas-conversion,pandas to PySpark conversion,Convert between PySpark and pandas DataFrames - Azure Databricks,Convert between PySpark and pandas DataFrames with Arrow,Learn how to use convert Apache Spark DataFrames to and from pandas DataFrames using Apache Arrow in Azure Databricks.,Learn how to convert Apache Spark DataFrames to and from pandas DataFrames using Apache Arrow in Azure Databricks.,2026-02-26T08:00:00.000Z,how-to,integrations,0.75,True,"Focuses on conversion using Apache Arrow, including specific APIs and constraints for interoperability between Spark and pandas on Databricks.",unchanged https://learn.microsoft.com/en-us/azure/databricks/partner-connect/,What is Partner Connect?,What is Databricks Partner Connect? - Azure Databricks,,"Learn about Databricks Partner Connect, an Azure Databricks workflow that enables customers to easily discover and connect with partner solutions.","Partner Connect lets you create trial accounts with select Azure Databricks technology partners and connect your Azure Databricks workspace to partner solutions from the Azure Databricks UI. This allows you to try partner solutions using your data in the Databricks lakehouse, then adopt the solutions that best meet your business needs. Partner Connect simplifies integration by provisioning the required Azure Databricks resources on your behalf, then passing resource details to the partner. 
These",2025-05-09T19:58:00.000Z,overview,,0.2,False,High-level overview of Partner Connect capabilities and workflow; primarily conceptual/marketing without detailed configuration tables or limits.,unchanged https://learn.microsoft.com/en-us/azure/databricks/partner-connect/admin,Manage connections as an administrator,Manage connections in Partner Connect - Azure Databricks,,"Learn how to administer Databricks Partner Connect. Manage service principals, personal access tokens, and partner account users or disconnect from a partner.","You can perform administrative tasks with Azure Databricks workspace connections to partner solutions, such as: To administer Partner Connect, you must sign in to your workspace as a workspace admin. For more information, seeManage users.",2026-01-20T08:00:00.000Z,how-to,,0.3,False,"High-level administration of Partner Connect (service principals, PATs, users). Summary suggests procedural/portal steps without detailed configuration tables, specific parameters, or error mappings.",unchanged @@ -2065,18 +2094,18 @@ https://learn.microsoft.com/en-us/azure/databricks/partner-connect/reverse-etl,C https://learn.microsoft.com/en-us/azure/databricks/partner-connect/semantic-layer,Connect to a semantic layer partner,Connect to semantic layer partners using Partner Connect - Azure Databricks,,Learn about how to connect your Azure Databricks workspace to semantic layer partners using Partner Connect.,"To connect your Azure Databricks workspace to a semantic layer partner solution using Partner Connect, you typically follow the steps in this article. Important Before you follow the steps in this article, see the appropriate partner article for important partner-specific information. There might be differences in the connection steps between partner solutions. 
Some partner solutions also allow you to integrate with Databricks SQL warehouses (formerly Databricks SQL endpoints) or Azure Databrick",2025-01-21T04:23:00.000Z,how-to,,0.35,False,"Semantic layer Partner Connect overview; primarily generic connection workflow and navigation, not detailed configuration tables.",unchanged https://learn.microsoft.com/en-us/azure/databricks/partner-connect/troubleshoot,Troubleshoot connections,Troubleshoot Partner Connect - Azure Databricks,Troubleshoot Azure Databricks Partner Connect issues,Learn how to troubleshoot common issues with Partner Connect.,This section provides information to help address common issues with Partner Connect.,2026-02-09T08:00:00.000Z,troubleshooting-general,troubleshooting,0.8,True,"Explicit troubleshooting page. Likely organized by common Partner Connect symptoms with specific error messages/codes and corresponding resolutions, which are product-specific diagnostic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/partner-connect/walkthrough-fivetran,Connect to Fivetran walkthrough,Walkthrough: Connect to Fivetran using Partner Connect - Azure Databricks,Walkthrough: Connect Fivetran to Databricks via Partner Connect,Follow an example walkthrough that demonstrates how to connect to Fivetran using Partner Connect.,"In this walkthrough, you use Partner Connect to connect a Databricks SQL warehouse in your workspace to Fivetran and then use Fivetran to ingest sample data from Google Sheets into your workspace. Make sure your Azure Databricks account, workspace, and the signed-in user all meet therequirementsfor Partner Connect. In the sidebar, clickMarketplace. InPartner Connect integrations, clickView all. Click theFivetrantile. 
Note If theFivetrantile has a check mark icon inside of it, this means one of y",2025-01-17T08:00:00.000Z,tutorial,integrations,0.66,True,Step-by-step Partner Connect walkthrough including Databricks SQL warehouse selection and Fivetran setup; contains concrete UI steps and integration-specific details.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/partners/bi/bi-metric-view,Query Unity Catalog metric views,Query metric views from BI tools - Azure Databricks,Enable BI compatibility mode for Databricks metric views,Learn how to enable BI compatibility mode to query Databricks Unity Catalog metric views from external BI tools.,"Important This feature is inBeta. BI compatibility mode lets you queryUnity Catalog metric viewsfrom external BI tools. When enabled, Azure Databricks rewrites the queries generated by the BI tool to correctly evaluate metric view measures. This page covers how to enable BI compatibility mode, how it works, supported scenarios, and known limitations.",2026-04-17T21:49:00.000Z,integration,configuration,0.7,True,"Covers enabling BI compatibility mode and its supported scenarios/limitations; such a feature page typically documents specific configuration flags or settings to turn the mode on/off and how it rewrites queries, which is product-specific configuration knowledge.",new -https://learn.microsoft.com/en-us/azure/databricks/partners/bi/fabric,Fabric,Use Microsoft Fabric to read data that is registered in Unity Catalog - Azure Databricks,Read Unity Catalog data from Microsoft Fabric,Learn how to use Microsoft Fabric to read data that is registered in Unity Catalog.,This article gives an overview of how to use Microsoft Fabric to read data that is registered in Unity Catalog.,2026-02-19T19:35:00.000Z,integration,integrations,0.65,True,Product-specific integration between Fabric and Unity Catalog with concrete connection configuration details not covered by generic knowledge.,unchanged 
+https://learn.microsoft.com/en-us/azure/databricks/partners/bi/bi-metric-view,Query Unity Catalog metric views,Query metric views from BI tools - Azure Databricks,Enable and use BI compatibility mode for metric views,Learn how to enable BI compatibility mode to query Databricks Unity Catalog metric views from external BI tools.,"Important This feature is in Beta. BI compatibility mode lets you query Unity Catalog metric views from external BI tools. When enabled, Azure Databricks rewrites the queries generated by the BI tool to correctly evaluate metric view measures. This page covers how to enable BI compatibility mode, how it works, supported scenarios, and known limitations.",2026-04-20T08:00:00.000Z,integration,configuration,0.7,True,"Describes a product-specific feature (BI compatibility mode) with concrete enablement steps and detailed behavior/limitations for querying Unity Catalog metric views from external BI tools. This is configuration-focused expert knowledge about how Azure Databricks rewrites BI tool queries and which scenarios are supported, which isn't generic BI or Databricks knowledge.",updated
Likely high-level integration guidance without detailed configuration tables, parameters, or product-specific limits, so it does not clearly meet any expert-knowledge sub-skill criteria.",updated https://learn.microsoft.com/en-us/azure/databricks/partners/bi/hex,Hex,Connect to Hex - Azure Databricks,Connect Databricks SQL warehouses to Hex,"Learn how to connect your Azure Databricks workspace to Hex, a platform for collaborative data science and analytics.","Hex is a modern data workspace. It makes it easy to connect to data, analyze it in collaborative SQL and Python-powered notebooks, and share work as interactive data apps and stories. Hex has three major elements: You can integrate your Databricks SQL warehouses (formerly Databricks SQL endpoints) with Hex. Note Hex does not integrate with Azure Databricks clusters.",2024-05-14T08:00:00.000Z,integration,integrations,0.7,True,Partner integration with Hex including Databricks SQL–specific connection configuration and constraints (no cluster integration).,unchanged https://learn.microsoft.com/en-us/azure/databricks/partners/bi/looker,Looker,Connect to Looker - Azure Databricks,Connect Looker to Azure Databricks clusters and SQL,"Learn how to connect your Azure Databricks workspace to Looker, a platform for business intelligence, data applications, and embedded analytics.","This article describes how to use Looker with an Azure Databricks cluster or Databricks SQL warehouse (formerly Databricks SQL endpoint). Important When persistent derived tables (PDTs) are enabled, by default Looker regenerates PDTs every 5 minutes by connecting to the associated database. Databricks recommends that you change the default frequency to avoid incurring excess compute costs. 
For more information, seeEnable and manage persistent derived tables (PDTs).",2026-03-06T08:00:00.000Z,integration,integrations,0.7,True,"Looker integration with Databricks generally requires specific JDBC/ODBC settings, dialect selection, and PDT-related configuration. Mention of PDT regeneration frequency hints at product-specific behavior and recommended settings, which are expert integration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/partners/bi/looker-studio,Looker Studio,Connect to Looker Studio - Azure Databricks,Connect Looker Studio to Azure Databricks,Learn how to connect your Azure Databricks workspace to Looker Studio.,This article describes how to use Looker Studio with an Azure Databricks cluster or Databricks SQL warehouse (formerly Databricks SQL endpoint).,2026-01-20T08:00:00.000Z,integration,integrations,0.65,True,"Describes using Looker Studio with Databricks clusters/SQL warehouses. Such pages typically specify connector type, connection string fields, auth modes, and required configuration values, which are product-specific integration patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/partners/bi/microstrategy,MicroStrategy,Connect to MicroStrategy - Azure Databricks,Connect MicroStrategy Workstation to Azure Databricks,Learn how to connect your Azure Databricks workspace to MicroStrategy so you can make data-driven decisions and optimize business processes.,This article describes how to use MicroStrategy Workstation with an Azure Databricks cluster or a Databricks SQL warehouse (formerly Databricks SQL endpoint).,2026-01-20T08:00:00.000Z,integration,integrations,0.65,True,"MicroStrategy–Databricks integration usually needs specific DSN/driver settings, warehouse selection, and authentication parameters. 
These are concrete integration configuration details not covered by generic SDK knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/partners/bi/mode,Mode,Connect to Mode - Azure Databricks,Connect Mode analytics to Azure Databricks,"Learn how to connect your Azure Databricks workspace to Mode, which provides a unified, horizontal analytics layer.",This article describes how to use Mode with an Azure Databricks cluster or a Databricks SQL warehouse (formerly Databricks SQL endpoint).,2024-11-15T19:10:00.000Z,integration,integrations,0.7,True,Integration guide for Mode with Databricks clusters/SQL warehouses including connection configuration details.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi,Power BI,Power BI with Azure Databricks - Azure Databricks,,Learn how to integrate Microsoft Power BI with Azure Databricks for interactive data visualization and business intelligence.,"Microsoft Power BI is a business analytics service that provides interactive visualizations with self-service business intelligence capabilities. When you integrate Power BI with Azure Databricks, you bring scalable data processing and high-performance analytics to all business users, enabling them to create rich reports and dashboards without depending on IT staff or database administrators. 
The integration allows you to publish data sets directly from Azure Databricks, connect Power BI Desktop",2026-04-17T21:49:00.000Z,integration,,0.2,False,"High-level description of integrating Power BI with Azure Databricks; summary suggests conceptual integration overview without specific configuration tables, limits, or product-specific error/decision details.",updated +https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi,Power BI,Power BI with Azure Databricks - Azure Databricks,,Learn how to integrate Microsoft Power BI with Azure Databricks for interactive data visualization and business intelligence.,"Microsoft Power BI is a business analytics service that provides interactive visualizations with self-service business intelligence capabilities. When you integrate Power BI with Azure Databricks, you bring scalable data processing and high-performance analytics to all business users, enabling them to create rich reports and dashboards without depending on IT staff or database administrators. The integration allows you to publish data sets directly from Azure Databricks, connect Power BI Desktop",2026-04-17T21:49:00.000Z,integration,,0.2,False,"High-level description of integrating Power BI with Azure Databricks; summary suggests conceptual integration overview without specific configuration tables, limits, or product-specific error/decision details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi-adbc,Configure ADBC or ODBC driver,Configure ADBC or ODBC driver for Power BI - Azure Databricks,Configure ADBC vs ODBC drivers for Power BI with Databricks,Learn about the Arrow Database Connectivity (ADBC) driver on Microsoft Power BI with Azure Databricks for interactive data visualization and business intelligence.,"Important This feature is inPublic Preview. This page explains how to configure which driver your Power BI connections use. 
Although Arrow Database Connectivity (ADBC) is the default driver for all new Power BI connections created using the UI, pre-existing connections continue to use Open Database Connectivity (ODBC), unless you manually update them to ADBC. TheArrow Database Connectivity (ADBC)driver is a modern, columnar API standard designed for analytical applications, enabling efficient da",2026-03-16T22:07:00.000Z,integration,integrations,0.75,True,"Page focuses on configuring which driver (ADBC or ODBC) Power BI uses to connect to Azure Databricks, including driver selection behavior (ADBC as default for new connections, existing connections remain ODBC unless updated) and product-specific integration details. This fits the integrations category as it covers concrete connection configuration patterns between Power BI and Databricks.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi-desktop,Power BI Desktop,Connect Power BI Desktop to Azure Databricks - Azure Databricks,Connect Power BI Desktop to Azure Databricks,"Learn how to connect to your Azure Databricks workspace from Microsoft Power BI Desktop, a business analytics service that provides interactive visualizations.","This page describes how to connect to Azure Databricks fromMicrosoft Power BI Desktop. Power BI Desktop is a Windows-based application that enables you to connect to, shape, and visualize data from a wide range of sources. With Power BI Desktop, you can create interactive reports and dashboards using self-service business intelligence tools. 
When you use Azure Databricks as a data source with Power BI, you extend the performance and capabilities of Azure Databricks beyond data scientists and dat",2026-04-17T21:49:00.000Z,integration,integrations,0.7,True,"How-to page for connecting Power BI Desktop to Azure Databricks; such connector docs typically include product-specific connection parameters (server/HTTP path, authentication options, possibly gateway settings) that qualify as integration configuration details beyond generic knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi-desktop,Power BI Desktop,Connect Power BI Desktop to Azure Databricks - Azure Databricks,Connect Power BI Desktop to Azure Databricks,"Learn how to connect to your Azure Databricks workspace from Microsoft Power BI Desktop, a business analytics service that provides interactive visualizations.","This page describes how to connect to Azure Databricks from Microsoft Power BI Desktop. Power BI Desktop is a Windows-based application that enables you to connect to, shape, and visualize data from a wide range of sources. With Power BI Desktop, you can create interactive reports and dashboards using self-service business intelligence tools. 
When you use Azure Databricks as a data source with Power BI, you extend the performance and capabilities of Azure Databricks beyond data scientists and dat",2026-04-17T21:49:00.000Z,integration,integrations,0.7,True,"How-to page for connecting Power BI Desktop to Azure Databricks; such connector docs typically include product-specific connection parameters (server/HTTP path, authentication options, possibly gateway settings) that qualify as integration configuration details beyond generic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi-m2m,Set up service principal for M2M OAuth,Configure service principals on Azure Databricks for Power BI - Azure Databricks,Configure Azure Databricks service principals for Power BI M2M OAuth,Learn how to set up service principals on Azure Databricks for the Microsoft Power BI connection.,This page describes how to set up a service principal in Azure Databricks if you want to enable machine-to-machine (M2M) OAuth authentication with Power BI. Machine-to-Machine (M2M) OAuth provides a more secure authentication method for Power BI connections by using service principals instead of personal access tokens. This approach: Note Power BI Desktop 2.143.878.0 (May 2025 release) or above is required for this authentication method.,2026-01-24T08:00:00.000Z,integration,security,0.8,True,"Describes setting up service principals and M2M OAuth for Power BI. 
Expected to include specific app registration settings, Databricks permissions, scopes, and security-related configuration unique to this integration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi-service,Power BI service,Publish to the Power BI service from Azure Databricks - Azure Databricks,Publish Azure Databricks datasets to Power BI service,"Learn how to publish to the Power BI service from Azure Databricks, a business analytics service that provides interactive visualizations.","This page describes how to publish data from Azure Databricks to Microsoft Power BI service. Microsoft Power BI service is a cloud-based business analytics platform that enables users to connect to, visualize, and analyze data. When you integrate Azure Databricks as a data source with the Power BI service, you bring Databricks’ scalable data processing and performance to all users. If you want to connect Power BI Desktop to Azure Databricks, see Connect Power BI Desktop to Azure Databricks. Note T",2026-04-17T08:00:00.000Z,integration,integrations,0.7,True,"Describes publishing data from Databricks to the Power BI service; these docs usually include specific connector/endpoint settings and authentication parameters for the service integration, which are product-specific integration patterns.",updated +https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi-service,Power BI service,Publish to the Power BI service from Azure Databricks - Azure Databricks,Publish Azure Databricks datasets to Power BI service,"Learn how to publish to the Power BI service from Azure Databricks, a business analytics service that provides interactive visualizations.","This page describes how to publish data from Azure Databricks to Microsoft Power BI service. Microsoft Power BI service is a cloud-based business analytics platform that enables users to connect to, visualize, and analyze data. 
When you integrate Azure Databricks as a data source with the Power BI service, you bring Databricks’ scalable data processing and performance to all users. If you want to connect Power BI Desktop to Azure Databricks, see Connect Power BI Desktop to Azure Databricks. Note T",2026-04-17T08:00:00.000Z,integration,integrations,0.7,True,"Describes publishing data from Databricks to the Power BI service; these docs usually include specific connector/endpoint settings and authentication parameters for the service integration, which are product-specific integration patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi-uc-connect,Power BI connection in Unity Catalog,Create a Power BI connection in Unity Catalog for orchestration - Azure Databricks,Create and use Power BI connections in Unity Catalog,Learn how to create a Power BI connection in Unity Catalog to create Power BI tasks for jobs on Azure Databricks.,"This page explains how to connect to Power BI in Unity Catalog to orchestrate publishing to Power BI with a Power BI task (preview). You must first create a connection to store your Entra credentials. Note You must have the CREATE CONNECTION privilege in Unity Catalog to create the Power BI connection. If someone else creates the connection, you must have the USE CONNECTION privilege to use the connection in a Power BI task. If you want to use Power BI Desktop or the Power BI service instead, see:",2026-02-23T08:00:00.000Z,integration,configuration,0.74,True,"Explains creating a Power BI connection object in Unity Catalog with specific privileges (CREATE/USE CONNECTION) and configuration fields, which are product-specific settings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/partners/bi/preset,Preset,Connect to Preset - Azure Databricks,Integrate Preset BI with Azure Databricks,Learn how to set up Azure Databricks to integrate with Preset.,"Preset provides modern business intelligence for your entire organization. 
Preset provides a powerful, easy to use data exploration and visualization platform, powered by open source Apache Superset. You can integrate your Databricks SQL warehouses (formerly Databricks SQL endpoints) and Azure Databricks clusters with Preset.",2024-11-15T19:10:00.000Z,integration,integrations,0.7,True,Shows how to connect Preset (Superset-based) to Databricks SQL warehouses and clusters with product-specific connection patterns.,unchanged https://learn.microsoft.com/en-us/azure/databricks/partners/bi/qlik-sense,Qlik Sense,Connect to Qlik Sense - Azure Databricks,Connect Qlik Sense to Azure Databricks,Learn how to connect your Azure Databricks workspace to Qlik Sense so you can make data-driven decisions and take action.,"Qlik Sense delivers best-in-class cloud analytics that help people of all skill levels to make data-driven decisions and take action. This article describes how to use Qlik Sense with an Azure Databricks cluster or a Databricks SQL warehouse (formerly Databricks SQL endpoint) to analyze data in Delta Lake. Note For information about Qlik Replicate, a solution that helps you pull data from multiple data sources (Oracle, Microsoft SQL Server, SAP, mainframe, and more) into Delta Lake, seeConnect t",2025-05-09T19:58:00.000Z,integration,integrations,0.7,True,"Partner integration for Qlik Sense with Databricks clusters/SQL warehouses and Delta Lake, including Databricks-specific connection details.",unchanged @@ -2110,434 +2139,434 @@ https://learn.microsoft.com/en-us/azure/databricks/partners/reverse-etl/census,C https://learn.microsoft.com/en-us/azure/databricks/partners/reverse-etl/hightouch,Hightouch,Connect to Hightouch - Azure Databricks,Connect Hightouch reverse ETL to Azure Databricks,"Learn how to connect your Azure Databricks workspace to Hightouch, a reverse ETL platform that syncs data from Databricks into business tools.",Hightouch syncs your data in Azure Databricks to the tools that your business teams rely on. 
You can integrate your Databricks SQL warehouses (formerly Databricks SQL endpoints) and Azure Databricks clusters with Hightouch.,2026-01-20T08:00:00.000Z,integration,integrations,0.7,True,"Reverse ETL integration requires specifying Databricks SQL warehouse/cluster connection parameters, authentication, and sync configuration. These are product-specific integration settings beyond generic how-to content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/partners/semantic-layer/atscale,AtScale,Connect to AtScale - Azure Databricks,Connect AtScale semantic layer to Databricks,Learn about how to connect your Azure Databricks workspace to AtScale. AtScale's semantic layer makes lakehouse data easily consumable by BI tools.,"AtScale's semantic layer delivers a single source of governed metrics and KPIs, tied to live lakehouse data and to BI tools including Excel, Tableau, and Power BI. You can connect Databricks SQL warehouses (formerly Databricks SQL endpoints) and Azure Databricks clusters to AtScale.",2024-11-15T19:10:00.000Z,integration,integrations,0.65,True,Explains how to connect AtScale to Databricks SQL warehouses and clusters with product-specific integration details.,unchanged https://learn.microsoft.com/en-us/azure/databricks/partners/semantic-layer/stardog,Stardog,Connect to Stardog - Azure Databricks,Integrate Stardog semantic layer with Databricks,"Learn how to connect your Azure Databricks workspace to Stardog, which provides a flexible semantic data layer designed to answer complex queries across data silos.",The Stardog Enterprise Knowledge Graph Platform provides a foundation for a flexible semantic data layer designed to answer complex queries across data silos. 
You can integrate your Databricks SQL warehouses (formerly Databricks SQL endpoints) and Azure Databricks clusters with Stardog.",2024-11-15T19:10:00.000Z,integration,integrations,0.65,True,"Partner integration article for Stardog with Databricks clusters/SQL warehouses, including configuration patterns unique to this product.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/pyspark/,Overview,PySpark on Azure Databricks - Azure Databricks,,"Learn the fundamentals of PySpark, a Python API for Spark, on Databricks.","Azure Databricks is built on top of Apache Spark, a unified analytics engine for big data and machine learning. PySpark helps you interface with Apache Spark using the Python programming language, which is a flexible language that is easy to learn, implement, and maintain. It also provides many options for data visualization in Databricks. PySpark combines the power of Python and Apache Spark. This article provides an overview of the fundamentals of PySpark on Databricks.",2026-01-19T08:00:00.000Z,overview,,0.3,False,"Overview of PySpark on Databricks; mostly conceptual and introductory without detailed configs, limits, or product-specific troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/pyspark/basics,Basics,PySpark basics - Azure Databricks,,Discover everything you need to know to quickly start using PySpark.,"This article walks through simple examples to illustrate usage of PySpark. It assumes you understand fundamental Apache Spark concepts and are running commands in an Azure Databricks notebook connected to compute. 
You create DataFrames using sample data, perform basic transformations including row and column operations on this data, combine multiple DataFrames and aggregate this data, visualize this data, and then save it to a table or file.",2026-03-31T08:00:00.000Z,how-to,,0.4,False,"Basic PySpark usage tutorial; generic DataFrame operations and saving data, not focused on Databricks-specific configuration or limits.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/,Overview,PySpark on Azure Databricks - Azure Databricks,,"Learn the fundamentals of PySpark, a Python API for Spark, on Databricks.","Azure Databricks is built on top of Apache Spark, a unified analytics engine for big data and machine learning. PySpark helps you interface with Apache Spark using the Python programming language, which is a flexible language that is easy to learn, implement, and maintain. It also provides many options for data visualization in Databricks. PySpark combines the power of Python and Apache Spark. This article provides an overview of the fundamentals of PySpark on Databricks.",2026-04-23T08:00:00.000Z,overview,,0.1,False,"High-level overview of PySpark on Databricks and its fundamentals; no product-specific limits, configuration tables, error codes, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/databricks/pyspark/basics,Basics,PySpark basics - Azure Databricks,,Discover everything you need to know to quickly start using PySpark.,"This article walks through simple examples to illustrate usage of PySpark. It assumes you understand fundamental Apache Spark concepts and are running commands in an Azure Databricks notebook connected to compute. 
You create DataFrames using sample data, perform basic transformations including row and column operations on this data, combine multiple DataFrames and aggregate this data, visualize this data, and then save it to a table or file.",2026-04-23T08:00:00.000Z,how-to,,0.2,False,"Introductory PySpark examples (DataFrames, transforms, visualization, saving data); tutorial-style usage without Databricks-specific limits, configs, or troubleshooting mappings.",updated https://learn.microsoft.com/en-us/azure/databricks/pyspark/datasources,Custom data sources,PySpark custom data sources - Azure Databricks,Implement PySpark custom data sources on Databricks,Learn about PySpark custom data sources on Databricks.,"PySpark custom data sources are created using the Python (PySpark) DataSource API, which enables reading from custom data sources and writing to custom data sinks in Apache Spark using Python. You can use PySpark custom data sources to define custom connections to data systems and implement additional functionality to build out reusable data sources. Note PySpark custom data sources require Databricks Runtime 15.4 LTS and above, or serverless environment version 2.",2026-03-31T23:28:00.000Z,concept-article,integrations,0.8,True,"Documents the PySpark DataSource API usage on Databricks, including runtime version requirements and patterns for custom sources/sinks—product-specific integration code patterns.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/,Overview,PySpark reference - Azure Databricks,,"Discover reference pages for PySpark, a Python API for Spark, on Databricks.","This page provides an overview of the reference available for PySpark, a Python API for Spark. 
For more information about PySpark, see PySpark on Azure Databricks.",2026-04-17T21:49:00.000Z,reference,,0.1,False,"Navigation/overview page for PySpark reference; no detailed configs, limits, or troubleshooting content.",updated -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog,Catalog class,Catalog - Azure Databricks,Use PySpark Catalog API on Azure Databricks,Learn about the Catalog class in PySpark,"User-facing catalog API, accessible through SparkSession.catalog. This is a thin wrapper around its Scala implementation org.apache.spark.sql.catalog.Catalog.",2026-04-17T08:00:00.000Z,reference,integrations,0.7,True,API reference for Catalog class with method signatures and behavior is product-specific integration knowledge (SDK surface) beyond generic concepts.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/cachetable,cacheTable,cacheTable - Azure Databricks,Cache tables with PySpark Catalog.cacheTable,Documentation for the Catalog.cacheTable method in PySpark.,Caches the specified table in-memory or with given storage level. 
Default MEMORY_AND_DISK.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Documents a specific method, default storage level (MEMORY_AND_DISK), and behavior; this is concrete SDK configuration/usage detail.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/clearcache,clearCache,clearCache - Azure Databricks,Clear cached tables with Catalog.clearCache,Documentation for the Catalog.clearCache method in PySpark.,Removes all cached tables from the in-memory cache.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Method-level reference describing behavior of clearCache is product-specific API detail useful for coding against Databricks.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/createtable,createTable,createTable - Azure Databricks,Create tables using Catalog.createTable in PySpark,Documentation for the Catalog.createTable method in PySpark.,Creates a table based on the dataset in a data source.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Describes a concrete Catalog API for table creation tied to Databricks/Spark; this is SDK integration detail.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/currentcatalog,currentCatalog,currentCatalog - Azure Databricks,Get current catalog with Catalog.currentCatalog,Documentation for the Catalog.currentCatalog method in PySpark.,Returns the current default catalog in this session.,2026-04-17T21:49:00.000Z,reference,integrations,0.68,True,"Documents a specific PySpark Catalog method and its return behavior, which is product-specific API knowledge.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/currentdatabase,currentDatabase,currentDatabase - Azure Databricks,Get current database with Catalog.currentDatabase,Documentation for the Catalog.currentDatabase method in PySpark.,Returns the current default database in this 
session.,2026-04-17T21:49:00.000Z,reference,integrations,0.68,True,Method reference for currentDatabase is concrete SDK behavior for Databricks PySpark integration.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/databaseexists,databaseExists,databaseExists - Azure Databricks,Check database existence with Catalog.databaseExists,Documentation for the Catalog.databaseExists method in PySpark.,Check if the database with the specified name exists.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"API method semantics (what it checks, return type) are product-specific integration details.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/dropglobaltempview,dropGlobalTempView,dropGlobalTempView - Azure Databricks,Drop global temp views with Catalog.dropGlobalTempView,Documentation for the Catalog.dropGlobalTempView method in PySpark.,Drops the global temporary view with the given view name in the catalog.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents a specific Catalog method and its effect on global temp views; concrete SDK usage detail.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/droptempview,dropTempView,dropTempView - Azure Databricks,Drop local temp views with Catalog.dropTempView,Documentation for the Catalog.dropTempView method in PySpark.,"Drops the local temporary view with the given view name in the catalog. If the view has been cached before, then it will also be uncached. 
Returns true if this view is dropped successfully, false otherwise.",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Includes behavior details (also uncaches, returns true/false) that are specific to this API and important for integration logic.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/functionexists,functionExists,functionExists - Azure Databricks,Check function existence with Catalog.functionExists,Documentation for the Catalog.functionExists method in PySpark.,Check if the function with the specified name exists. This can either be a temporary function or a function.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Method-level semantics for checking temporary or permanent functions are product-specific API details.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/getdatabase,getDatabase,getDatabase - Azure Databricks,Retrieve databases with Catalog.getDatabase in PySpark,Documentation for the Catalog.getDatabase method in PySpark.,Get the database with the specified name. This throws an AnalysisException when the database cannot be found.,2026-04-17T21:49:00.000Z,reference,integrations,0.72,True,Describes behavior including throwing AnalysisException when not found; this is concrete SDK behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/getfunction,getFunction,getFunction - Azure Databricks,Retrieve functions with Catalog.getFunction in PySpark,Documentation for the Catalog.getFunction method in PySpark.,Get the function with the specified name. This function can be a temporary function or a function. 
This throws an AnalysisException when the function cannot be found.,2026-04-17T21:49:00.000Z,reference,integrations,0.72,True,API reference including error behavior (AnalysisException) is specific integration knowledge.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/gettable,getTable,getTable - Azure Databricks,Retrieve tables and views with Catalog.getTable,Documentation for the Catalog.getTable method in PySpark.,Get the table or view with the specified name. This table can be a temporary view or a table/view. This throws an AnalysisException when no Table can be found.,2026-04-17T21:49:00.000Z,reference,integrations,0.72,True,Documents how the method resolves temp vs permanent tables and its exception behavior; concrete SDK detail.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/iscached,isCached,isCached - Azure Databricks,Check table cache status with Catalog.isCached,Documentation for the Catalog.isCached method in PySpark.,Returns true if the table is currently cached in-memory.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Method semantics (returns true if cached in-memory) are specific to Databricks PySpark Catalog API.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/listcatalogs,listCatalogs,listCatalogs - Azure Databricks,List catalogs in session with Catalog.listCatalogs,Documentation for the Catalog.listCatalogs method in PySpark.,Returns a list of catalogs in this session.,2026-04-17T21:49:00.000Z,reference,integrations,0.68,True,API behavior for listing catalogs in a session is product-specific integration information.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/listcolumns,listColumns,listColumns - Azure Databricks,List table columns with Catalog.listColumns,Documentation for the Catalog.listColumns method in PySpark.,Returns a list of columns for the given table/view in 
the specified database.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Method-level reference for listing columns of a table/view is specific to this API surface.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/listdatabases,listDatabases,listDatabases - Azure Databricks,List databases with Catalog.listDatabases in PySpark,Documentation for the Catalog.listDatabases method in PySpark.,Returns a list of databases available across all sessions.,2026-04-17T21:49:00.000Z,reference,integrations,0.68,True,Describes method returning databases across sessions; concrete SDK usage detail.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/listfunctions,listFunctions,listFunctions - Azure Databricks,List registered functions with Catalog.listFunctions,Documentation for the Catalog.listFunctions method in PySpark.,Returns a list of functions registered in the specified database.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents how to retrieve functions in a database via Catalog API; concrete integration pattern.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/listtables,listTables,listTables - Azure Databricks,List tables and views with Catalog.listTables,Documentation for the Catalog.listTables method in PySpark.,Returns a list of tables/views in the specified database.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,API semantics for listing tables/views in a database are product-specific integration details.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/recoverpartitions,recoverPartitions,recoverPartitions - Azure Databricks,Recover table partitions with Catalog.recoverPartitions,Documentation for the Catalog.recoverPartitions method in PySpark.,Recovers all the partitions of the given table and updates the 
catalog.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Describes a specific Catalog method that updates metadata for partitions; this is detailed SDK behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/refreshbypath,refreshByPath,refreshByPath - Azure Databricks,Refresh cached data by path with Catalog.refreshByPath,Documentation for the Catalog.refreshByPath method in PySpark.,Invalidates and refreshes all the cached data (and the associated metadata) for any DataFrame that contains the given data source path.,2026-04-17T21:49:00.000Z,reference,integrations,0.72,True,Details how cached data and metadata are invalidated/refreshed for DataFrames by path; product-specific API semantics.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/refreshtable,refreshTable,refreshTable - Azure Databricks,Refresh cached tables with Catalog.refreshTable,Documentation for the Catalog.refreshTable method in PySpark.,Invalidates and refreshes all the cached data and metadata of the given table.,2026-04-17T21:49:00.000Z,reference,integrations,0.72,True,Explains behavior of invalidating and refreshing cached data/metadata for a table; concrete Databricks PySpark API detail.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/setcurrentcatalog,setCurrentCatalog,setCurrentCatalog - Azure Databricks,Use Catalog.setCurrentCatalog in Azure Databricks PySpark,Documentation for the Catalog.setCurrentCatalog method in PySpark.,Sets the current default catalog in this session.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"API reference for a specific PySpark Catalog method, including signature and Databricks-specific behavior; this is product-specific integration/code-pattern knowledge beyond generic LLM training.",new 
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/setcurrentdatabase,setCurrentDatabase,setCurrentDatabase - Azure Databricks,Use Catalog.setCurrentDatabase in Azure Databricks PySpark,Documentation for the Catalog.setCurrentDatabase method in PySpark.,Sets the current default database in this session.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents a concrete PySpark Catalog API with exact method name, parameters, and behavior in Databricks; this is detailed SDK usage rather than conceptual content.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/tableexists,tableExists,tableExists - Azure Databricks,Check table existence with Catalog.tableExists in PySpark,Documentation for the Catalog.tableExists method in PySpark.,Check if the table or view with the specified name exists. This can either be a temporary view or a table/view.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Provides exact Databricks PySpark method semantics and usage for checking table or view existence, which is product-specific API knowledge.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/uncachetable,uncacheTable,uncacheTable - Azure Databricks,Remove cached tables with Catalog.uncacheTable in PySpark,Documentation for the Catalog.uncacheTable method in PySpark.,Removes the specified table from the in-memory cache.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Describes a specific Databricks PySpark Catalog API for cache management, including method name and behavior, which is detailed integration knowledge.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column,Column class,Column class - Azure Databricks,Work with the Column class in Azure Databricks PySpark,Documentation for the Column class in PySpark.,A column in a DataFrame. 
Supports Spark Connect,2026-04-17T08:00:00.000Z,reference,integrations,0.85,True,"Full reference for the Databricks PySpark Column class, including methods, behaviors, and Spark Connect support; this is detailed SDK/API knowledge unique to the product.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/alias,alias,alias (Column) - Azure Databricks,Assign aliases to columns with Column.alias in PySpark,Documentation for the Column.alias method in PySpark.,Give the column an alias.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Documents a specific Column method, its signature, and behavior in Databricks PySpark, which is concrete API usage rather than generic theory.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/asc,asc,asc (Column) - Azure Databricks,Sort columns ascending with Column.asc in PySpark,Documentation for the Column.asc method in PySpark.,Sort the column in ascending order.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"API reference for a Databricks PySpark Column method with defined behavior, representing product-specific coding patterns.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/asc_nulls_first,asc_nulls_first,asc_nulls_first (Column) - Azure Databricks,Sort ascending with nulls first using Column.asc_nulls_first,Documentation for the Column.asc_nulls_first method in PySpark.,Sort the column in ascending order with null values appearing first.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Describes a precise Column API for sort behavior including null ordering, which is specific to Databricks PySpark implementation.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/asc_nulls_last,asc_nulls_last,asc_nulls_last (Column) - Azure Databricks,Sort ascending with nulls last using Column.asc_nulls_last,Documentation for the Column.asc_nulls_last method 
in PySpark.,Sort the column in ascending order with null values appearing last.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Provides exact method name and semantics for a Databricks PySpark Column sort variant, which is detailed SDK knowledge.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/astype,astype,astype - Azure Databricks,Cast column types with Column.astype in PySpark,Documentation for the Column.astype method in PySpark.,Alias for cast().,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Documents a Column method alias and its relationship to cast(), which is specific API behavior in Databricks PySpark.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/between,between,between - Azure Databricks,Filter values within ranges using Column.between in PySpark,Documentation for the Column.between method in PySpark.,Check if the column value is between lower and upper bounds (inclusive).,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"API reference for a Column method with defined inclusive bounds semantics, representing concrete Databricks PySpark behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/bitwiseand,bitwiseAND,bitwiseAND - Azure Databricks,Compute bitwise AND with Column.bitwiseAND in Databricks PySpark,Documentation for the Column.bitwiseAND method in PySpark.,Compute bitwise AND of this expression with another expression. 
Changed in Databricks Runtime 13.0: Supports Spark Connect.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Details a specific Column bitwise operation, including Databricks Runtime version notes and Spark Connect support, which is product-specific API knowledge.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/bitwiseor,bitwiseOR,bitwiseOR - Azure Databricks,Compute bitwise OR with Column.bitwiseOR in Databricks PySpark,Documentation for the Column.bitwiseOR method in PySpark.,Compute bitwise OR of this expression with another expression. Changed in Databricks Runtime 13.0: Supports Spark Connect.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Similar to index 11, this documents a precise Column API and runtime/version behavior, which is expert-level integration detail.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/bitwisexor,bitwiseXOR,bitwiseXOR - Azure Databricks,Compute bitwise XOR with Column.bitwiseXOR in Databricks PySpark,Documentation for the Column.bitwiseXOR method in PySpark.,Compute bitwise XOR of this expression with another expression. 
Changed in Databricks Runtime 13.0: Supports Spark Connect.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Provides method-level documentation including Databricks Runtime changes, which is detailed product-specific SDK information.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/cast,cast,cast - Azure Databricks,Change column data types with Column.cast in PySpark,Documentation for the Column.cast method in PySpark.,Convert the column to a different data type.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents the exact casting behavior and usage for a Databricks PySpark Column, which is concrete API reference material.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/contains,contains,contains (Column) - Azure Databricks,Check substring presence with Column.contains in PySpark,Documentation for the Column.contains method in PySpark.,Check if a string column contains a substring.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"API reference for a string-specific Column method, including semantics of substring matching in Databricks PySpark.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/desc,desc,desc (Column) - Azure Databricks,Sort columns descending with Column.desc in PySpark,Documentation for the Column.desc method in PySpark.,Sort the column in descending order.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents a specific Column sorting method and its behavior, which is part of the Databricks PySpark SDK surface.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/desc_nulls_first,desc_nulls_first,desc_nulls_first (Column) - Azure Databricks,Sort descending with nulls first using Column.desc_nulls_first,Documentation for the Column.desc_nulls_first method in PySpark.,Sort the column in descending order with null values appearing
first.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Provides exact method name and null ordering semantics, which are product-specific API details.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/desc_nulls_last,desc_nulls_last,desc_nulls_last (Column) - Azure Databricks,Sort descending with nulls last using Column.desc_nulls_last,Documentation for the Column.desc_nulls_last method in PySpark.,Sort the column in descending order with null values appearing last.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Similar to index 17, this is detailed Column API documentation for Databricks PySpark.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/dropfields,dropFields,dropFields - Azure Databricks,Remove struct fields with Column.dropFields in PySpark,Documentation for the Column.dropFields method in PySpark.,Remove fields from a struct column.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Describes a struct-specific Column method and its behavior, which is concrete Databricks PySpark API knowledge.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/endswith,endswith,endswith (Column) - Azure Databricks,Check string suffixes with Column.endswith in PySpark,Documentation for the Column.endswith method in PySpark.,Check if a string column ends with a substring.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,API reference for a Column string operation with defined semantics in Databricks PySpark.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/eqnullsafe,eqNullSafe,eqNullSafe - Azure Databricks,Perform null-safe equality with Column.eqNullSafe in Databricks PySpark,Documentation for the Column.eqNullSafe method in PySpark.,Equality test that is safe for null values.
Added in Databricks Runtime 11.0. Changed in Databricks Runtime 13.0: Supports Spark Connect.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Documents a null-safe comparison method, including Databricks Runtime version changes and Spark Connect support, which is detailed product-specific behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/getfield,getField,getField - Azure Databricks,Access struct fields with Column.getField in PySpark,Documentation for the Column.getField method in PySpark.,Get a field from a struct column.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Provides method-level semantics for struct field access in Databricks PySpark, which is specific SDK usage.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/getitem,getItem,getItem - Azure Databricks,Access array or map elements with Column.getItem in PySpark,Documentation for the Column.getItem method in PySpark.,Get an item from an array or map column.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents a Column method for array/map indexing, including behavior details unique to Databricks PySpark.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/ilike,ilike,ilike (Column) - Azure Databricks,Use case-insensitive pattern matching with Column.ilike in PySpark,Documentation for the Column.ilike method in PySpark.,Case-insensitive SQL LIKE pattern matching.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,API reference for a Column method implementing case-insensitive SQL LIKE semantics in Databricks PySpark.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/isin,isin,isin - Azure Databricks,Test membership in value lists with Column.isin in PySpark,Documentation for the Column.isin method in PySpark.,Check if the column value is in a list of
values.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Describes a Column method for membership checks, including its behavior in Databricks PySpark.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/isnan,isNaN,isNaN - Azure Databricks,Detect NaN values with Column.isNaN in PySpark,Documentation for the Column.isNaN method in PySpark.,Check if the column value is NaN.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents NaN-specific behavior for a Column method, which is concrete Databricks PySpark API knowledge.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/isnotnull,isNotNull,isNotNull - Azure Databricks,Filter non-null values with Column.isNotNull in PySpark,Documentation for the Column.isNotNull method in PySpark.,Check if the column value is not null.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,API reference for a null-checking Column method with defined semantics in Databricks PySpark.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/isnull,isNull,isNull - Azure Databricks,Filter null values with Column.isNull in PySpark,Documentation for the Column.isNull method in PySpark.,Check if the column value is null.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Documents the null-detection Column method and its behavior in Databricks PySpark.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/like,like,like (Column) - Azure Databricks,Use SQL LIKE pattern matching with Column.like in PySpark,Documentation for the Column.like method in PySpark.,SQL LIKE pattern matching.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Provides method-level documentation for SQL LIKE semantics on a Column in Databricks PySpark.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/name,name,name (Column) - Azure Databricks,Rename 
columns using Column.name alias in PySpark,Documentation for the Column.name method in PySpark.,Alias for alias().,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Explains a Column method alias and its relation to alias(), which is specific to the Databricks PySpark API surface.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/otherwise,otherwise,otherwise - Azure Databricks,Provide default values with Column.otherwise in PySpark,Documentation for the Column.otherwise method in PySpark.,Specify a default value for when conditions in when() are not met.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents how otherwise() interacts with when() conditions in Databricks PySpark, which is detailed API behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/over,over,over - Azure Databricks,Apply window specifications with Column.over in PySpark,Documentation for the Column.over method in PySpark.,Apply a window specification to the column.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"API reference for binding a Column to a window specification, including Databricks-specific semantics for window functions.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/rlike,rlike,rlike (Column) - Azure Databricks,Use regex pattern matching with Column.rlike in PySpark,Documentation for the Column.rlike method in PySpark.,SQL RLIKE (regex LIKE) pattern matching.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents SQL RLIKE semantics on a Column in Databricks PySpark, which is specific API behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/startswith,startswith,startswith (Column) - Azure Databricks,Check string prefixes with Column.startswith in PySpark,Documentation for the Column.startswith method in PySpark.,Check if a string column starts with a
substring.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,API reference for a Column method implementing starts-with semantics in Databricks PySpark.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/substr,substr,substr (Column) - Azure Databricks,Extract substrings with Column.substr in PySpark,Documentation for the Column.substr method in PySpark.,Return a substring of the column.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Documents substring extraction behavior and parameters for a Column in Databricks PySpark.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/try_cast,try_cast,try_cast - Azure Databricks,Safely cast column types with Column.try_cast in Databricks PySpark,Documentation for the Column.try_cast method in PySpark.,Try to convert the column to a different data type. Returns null if conversion fails. Added in Databricks Runtime 15.0,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Describes a Databricks Runtime 15.0-added Column method that returns null on conversion failure, which is new, version-specific API behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/when,when,when (Column) - Azure Databricks,Implement conditional logic with Column.when in PySpark,Documentation for the Column.when method in PySpark.,Evaluate a list of conditions and return one of multiple possible result expressions.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"API reference for multi-branch conditional expressions on Columns, including interaction with otherwise(), which is detailed Databricks PySpark behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/withfield,withField,withField - Azure Databricks,Add or replace struct fields with Column.withField in PySpark,Documentation for the Column.withField method in PySpark.,Add or replace a field in a
struct column.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents struct-manipulation behavior for a Column method in Databricks PySpark, which is specific SDK knowledge.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe,DataFrame class,DataFrame class - Azure Databricks,Use the DataFrame class in Azure Databricks PySpark,Documentation for the DataFrame class in PySpark.,"A distributed collection of data grouped into named columns. A DataFrame is equivalent to a relational table in Spark SQL, and can be created using various functions in SparkSession. Important A DataFrame should not be directly created using the constructor. Supports Spark Connect",2026-04-17T08:00:00.000Z,reference,integrations,0.9,True,"Comprehensive reference for the Databricks PySpark DataFrame class, including creation constraints, methods, and Spark Connect support; this is core product-specific API and integration knowledge.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/agg,agg,agg (DataFrame) - Azure Databricks,Use DataFrame.agg for aggregations in Azure Databricks PySpark,Documentation for the DataFrame.agg method in PySpark.,Aggregate on the entire DataFrame without groups (shorthand for df.groupBy().agg()).,2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,"API reference for DataFrame.agg with method signature, parameters, and behavior details that are product/version-specific and not reliably known from training.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/alias,alias,alias (DataFrame) - Azure Databricks,Apply aliases to DataFrames with alias() in PySpark,Documentation for the DataFrame.alias method in PySpark.,Returns a new DataFrame with an alias set.,2026-04-17T21:49:00.000Z,reference,integrations,0.76,True,"Documents DataFrame.alias behavior and parameters in Databricks’ PySpark, which is concrete API surface rather than
conceptual info.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/approxquantile,approxQuantile,approxQuantile (DataFrame) - Azure Databricks,Compute approximate quantiles with DataFrame.approxQuantile,Documentation for the DataFrame.approxQuantile method in PySpark.,Calculates the approximate quantiles of numerical columns of a DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.82,True,"API reference including parameters (e.g., relative error) and return structure for approxQuantile, which are detailed, product-specific semantics.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/astable,asTable,asTable - Azure Databricks,Convert DataFrames to table arguments with asTable,Documentation for the DataFrame.asTable method in PySpark.,"Converts the DataFrame into a TableArg object, which can be used as a table argument in a TVF (Table-Valued Function) including UDTF (User-Defined Table Function).",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Describes DataFrame.asTable and its use with TVFs/UDTFs, including object type and usage patterns unique to Databricks’ PySpark.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/cache,cache,cache - Azure Databricks,Cache DataFrames with default storage level in PySpark,Documentation for the DataFrame.cache method in PySpark.,Persists the DataFrame with the default storage level (MEMORY_AND_DISK_DESER).,2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,"Documents DataFrame.cache and the specific default storage level (MEMORY_AND_DISK_DESER), which is a concrete configuration detail.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/checkpoint,checkpoint,checkpoint - Azure Databricks,Checkpoint DataFrames and manage logical plans in PySpark,Documentation for the DataFrame.checkpoint method in PySpark.,"Returns a checkpointed version
of this DataFrame. Checkpointing can be used to truncate the logical plan of this DataFrame, which is especially useful in iterative algorithms where the plan may grow exponentially. It will be saved to files inside the checkpoint directory set with SparkContext.setCheckpointDir, or the spark.checkpoint.dir configuration.",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Explains DataFrame.checkpoint behavior and interaction with SparkContext.setCheckpointDir and spark.checkpoint.dir, including side effects and storage location.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/coalesce,coalesce,coalesce (DataFrame) - Azure Databricks,Repartition DataFrames with coalesce(numPartitions),Documentation for the DataFrame.coalesce method in PySpark.,Returns a new DataFrame that has exactly numPartitions partitions.,2026-04-17T21:49:00.000Z,reference,integrations,0.76,True,API reference for DataFrame.coalesce with parameter semantics that are specific to Spark/Databricks implementation.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/collect,collect,collect - Azure Databricks,Collect all DataFrame rows into driver with collect(),Documentation for the DataFrame.collect method in PySpark.,Returns all the records in the DataFrame as a list of Row.,2026-04-17T21:49:00.000Z,reference,integrations,0.76,True,"Describes DataFrame.collect behavior and return type (list of Row), which is specific API knowledge.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/colregex,colRegex,colRegex - Azure Databricks,Select columns by regex with DataFrame.colRegex,Documentation for the DataFrame.colRegex method in PySpark.,Selects column based on the column name specified as a regex and returns it as a Column.,2026-04-17T21:49:00.000Z,reference,integrations,0.77,True,"Documents DataFrame.colRegex usage and return type, a concrete API behavior detail.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/columns,columns,columns - Azure Databricks,Access DataFrame column names via columns property,Documentation for the DataFrame.columns property in PySpark.,Retrieves the names of all columns in the DataFrame as a list. The order of the column names in the list reflects their order in the DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.74,True,API reference for DataFrame.columns property and ordering semantics.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/corr,corr,corr (DataFrame) - Azure Databricks,Compute column correlation with DataFrame.corr,Documentation for the DataFrame.corr method in PySpark.,Calculates the correlation of two columns of a DataFrame as a double value. Currently only supports the Pearson Correlation Coefficient. DataFrame.corr and DataFrameStatFunctions.corr are aliases of each other.,2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,"Details DataFrame.corr usage, supported correlation type, and aliasing with DataFrameStatFunctions.corr.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/count,count,count (DataFrame) - Azure Databricks,Count DataFrame rows with count() in PySpark,Documentation for the DataFrame.count method in PySpark.,Returns the number of rows in this DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.74,True,"API behavior for DataFrame.count, including return type and semantics.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/cov,cov,cov (DataFrame) - Azure Databricks,Calculate sample covariance with DataFrame.cov,Documentation for the DataFrame.cov method in PySpark.,"Calculate the sample covariance for the given columns, specified by their names, as a double value. DataFrame.cov and DataFrameStatFunctions.cov are
aliases.",2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,"Documents DataFrame.cov usage, return type, and aliasing with DataFrameStatFunctions.cov.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/createglobaltempview,createGlobalTempView,createGlobalTempView - Azure Databricks,Create global temporary views from DataFrames,Documentation for the DataFrame.createGlobalTempView method in PySpark.,Creates a global temporary view with this DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.79,True,"API reference for DataFrame.createGlobalTempView, including scope and behavior of global temp views in Databricks.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/createorreplaceglobaltempview,createOrReplaceGlobalTempView,createOrReplaceGlobalTempView - Azure Databricks,Create or replace global temp views with DataFrames,Documentation for the DataFrame.createOrReplaceGlobalTempView method in PySpark.,Creates or replaces a global temporary view using the given name.,2026-04-17T21:49:00.000Z,reference,integrations,0.79,True,"Describes DataFrame.createOrReplaceGlobalTempView semantics, including replacement behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/createorreplacetempview,createOrReplaceTempView,createOrReplaceTempView - Azure Databricks,Create or replace local temp views from DataFrames,Documentation for the DataFrame.createOrReplaceTempView method in PySpark.,Creates or replaces a local temporary view with this DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.79,True,"API semantics for DataFrame.createOrReplaceTempView, including local temp view scope.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/createtempview,createTempView,createTempView - Azure Databricks,Create local temporary views with createTempView,Documentation for the 
DataFrame.createTempView method in PySpark.,Creates a local temporary view with this DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,Documents DataFrame.createTempView behavior and how the view is registered in Spark SQL catalog.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/crossjoin,crossJoin,crossJoin - Azure Databricks,Perform cartesian products with DataFrame.crossJoin,Documentation for the DataFrame.crossJoin method in PySpark.,Returns the cartesian product with another DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.77,True,API reference for DataFrame.crossJoin and its semantics when joining two DataFrames.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/crosstab,crosstab,crosstab (DataFrame) - Azure Databricks,Build contingency tables with DataFrame.crosstab,Documentation for the DataFrame.crosstab method in PySpark.,Computes a pair-wise frequency table of the given columns. Also known as a contingency table. The first column of each row will be the distinct values of col1 and the column names will be the distinct values of col2. The name of the first column will be $col1_$col2.
Pairs that have no occurrences will have zero as their counts. DataFrame.crosstab and DataFrameStatFunctions.crosstab are aliases.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Details DataFrame.crosstab output schema, naming convention ($col1_$col2), and aliasing with DataFrameStatFunctions.crosstab.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/cube,cube,cube - Azure Databricks,Create aggregation cubes with DataFrame.cube,Documentation for the DataFrame.cube method in PySpark.,"Create a multi-dimensional cube for the current DataFrame using the specified columns, allowing aggregations to be performed on them.",2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,API semantics for DataFrame.cube and how it enables multi-dimensional aggregation.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/describe,describe,describe - Azure Databricks,Compute basic column statistics with DataFrame.describe,Documentation for the DataFrame.describe method in PySpark.,Computes basic statistics for numeric and string columns.,2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,"Documents DataFrame.describe behavior, including which statistics are computed for numeric and string columns.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/distinct,distinct,distinct - Azure Databricks,Return distinct DataFrame rows with distinct(),Documentation for the DataFrame.distinct method in PySpark.,Returns a new DataFrame containing the distinct rows in this DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.76,True,API reference for DataFrame.distinct and its deduplication semantics.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/drop,drop,drop (DataFrame) - Azure Databricks,Drop DataFrame columns with drop() in PySpark,Documentation for the DataFrame.drop method in PySpark.,Returns a new
DataFrame without specified columns. This is a no-op if the schema doesn't contain the given column name(s).,2026-04-17T21:49:00.000Z,reference,integrations,0.76,True,"Describes DataFrame.drop behavior, including no-op behavior when columns are missing.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/dropduplicates,dropDuplicates,dropDuplicates - Azure Databricks,Remove duplicate rows with DataFrame.dropDuplicates,Documentation for the DataFrame.dropDuplicates method in PySpark.,"Return a new DataFrame with duplicate rows removed, optionally only considering certain columns.",2026-04-17T21:49:00.000Z,reference,integrations,0.77,True,"API semantics for DataFrame.dropDuplicates, including optional subset of columns.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/dropduplicateswithinwatermark,dropDuplicatesWithinWatermark,dropDuplicatesWithinWatermark - Azure Databricks,Drop duplicates within watermark in streaming DataFrames,Documentation for the DataFrame.dropDuplicatesWithinWatermark method in PySpark.,"Return a new DataFrame with duplicate rows removed, optionally only considering certain columns, within watermark.",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents DataFrame.dropDuplicatesWithinWatermark, a Databricks-specific streaming/Watermark API with nuanced behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/dropna,dropna,dropna - Azure Databricks,Drop null or NaN rows with DataFrame.dropna,Documentation for the DataFrame.dropna method in PySpark.,Returns a new DataFrame omitting rows with null or NaN values. DataFrame.dropna and DataFrameNaFunctions.drop are aliases of each other.,2026-04-17T21:49:00.000Z,reference,integrations,0.77,True,"API reference for DataFrame.dropna, including aliasing with DataFrameNaFunctions.drop and behavior on null/NaN.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/dtypes,dtypes,dtypes - Azure Databricks,Inspect DataFrame column names and types via dtypes,Documentation for the DataFrame.dtypes property in PySpark.,Returns all column names and their data types as a list.,2026-04-17T21:49:00.000Z,reference,integrations,0.74,True,Documents DataFrame.dtypes property and its list-of-tuples structure.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/exceptall,exceptAll,exceptAll - Azure Databricks,Subtract DataFrames with duplicates using exceptAll,Documentation for the DataFrame.exceptAll method in PySpark.,Return a new DataFrame containing rows in this DataFrame but not in another DataFrame while preserving duplicates.,2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,"API semantics for DataFrame.exceptAll, including preservation of duplicates.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/executioninfo,executionInfo,executionInfo - Azure Databricks,Inspect query execution details with DataFrame.executionInfo,Documentation for the DataFrame.executionInfo property in PySpark.,"Returns an ExecutionInfo object after the query was executed. The executionInfo property allows you to introspect information about the actual query execution after a successful execution. Accessing this property before query execution returns None. If the same DataFrame is executed multiple times, the execution info is overwritten by the latest operation. Note This property is dedicated to Spark Connect client only.
With a regular Spark session, it throws an exception.",2026-04-17T21:49:00.000Z,reference,integrations,0.82,True,"Describes DataFrame.executionInfo property, its lifecycle (None before execution, overwritten on re-run), and Spark Connect–only behavior, which are nuanced, product-specific details.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/exists,exists,exists (DataFrame) - Azure Databricks,Build EXISTS subqueries with DataFrame.exists,Documentation for the DataFrame.exists method in PySpark.,Return a Column object for an EXISTS subquery.,2026-04-17T21:49:00.000Z,reference,integrations,0.76,True,API reference for DataFrame.exists returning a Column usable in EXISTS subqueries.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/explain,explain,explain - Azure Databricks,Print logical and physical plans with DataFrame.explain,Documentation for the DataFrame.explain method in PySpark.,Prints the (logical and physical) plans to the console for debugging purposes.,2026-04-17T21:49:00.000Z,reference,integrations,0.79,True,Documents DataFrame.explain behavior and output for debugging query plans.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/fillna,fillna,fillna - Azure Databricks,Fill null values in DataFrames with fillna(),Documentation for the DataFrame.fillna method in PySpark.,Returns a new DataFrame in which null values are filled with a new value. DataFrame.fillna and DataFrameNaFunctions.fill are aliases of each other.,2026-04-17T21:49:00.000Z,reference,integrations,0.77,True,API semantics for DataFrame.fillna and aliasing with DataFrameNaFunctions.fill.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/filter,filter,filter (DataFrame) - Azure Databricks,Filter DataFrame rows with filter() conditions,Documentation for the DataFrame.filter method in PySpark.,Filters rows using the given
condition.,2026-04-17T21:49:00.000Z,reference,integrations,0.76,True,Documents DataFrame.filter usage and accepted condition types.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/first,first,first (DataFrame) - Azure Databricks,Retrieve the first row of a DataFrame with first(),Documentation for the DataFrame.first method in PySpark.,Returns the first row as a Row.,2026-04-17T21:49:00.000Z,reference,integrations,0.74,True,"API behavior for DataFrame.first, including return type Row.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/foreach,foreach,foreach - Azure Databricks,Apply functions to each DataFrame row with foreach(),Documentation for the DataFrame.foreach method in PySpark.,Applies the f function to all Rows of this DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.77,True,Describes DataFrame.foreach behavior and function signature expectations.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/foreachpartition,foreachPartition,foreachPartition - Azure Databricks,Process DataFrame partitions with foreachPartition(),Documentation for the DataFrame.foreachPartition method in PySpark.,Applies the f function to each partition of this DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.77,True,"API semantics for DataFrame.foreachPartition, including per-partition function application.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/freqitems,freqItems,freqItems (DataFrame) - Azure Databricks,Find frequent items in columns with DataFrame.freqItems,Documentation for the DataFrame.freqItems method in PySpark.,"Finding frequent items for columns, possibly with false positives.
Using the frequent element count algorithm described in ""https://doi.org/10.1145/762471.762473, proposed by Karp, Schenker, and Papadimitriou"". DataFrame.freqItems and DataFrameStatFunctions.freqItems are aliases.",2026-04-17T21:49:00.000Z,reference,integrations,0.81,True,"Documents DataFrame.freqItems, its probabilistic behavior (false positives) and relation to the cited algorithm, which is specialized API knowledge.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/groupby,groupBy,groupBy - Azure Databricks,Group DataFrames for aggregation with groupBy(),Documentation for the DataFrame.groupBy method in PySpark.,Groups the DataFrame by the specified columns so that aggregation can be performed on them. See GroupedData for all the available aggregate functions.,2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,API reference for DataFrame.groupBy and its relationship to GroupedData aggregate functions.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/groupingsets,groupingSets,groupingSets - Azure Databricks,Create multi-dimensional aggregations with groupingSets(),Documentation for the DataFrame.groupingSets method in PySpark.,"Create multi-dimensional aggregation for the current DataFrame using the specified grouping sets, so we can run aggregation on them.",2026-04-17T21:49:00.000Z,reference,integrations,0.79,True,"Documents DataFrame.groupingSets semantics for multi-dimensional aggregation, which is specific to Spark SQL.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/head,head,head - Azure Databricks,Return first n rows of a DataFrame with head(),Documentation for the DataFrame.head method in PySpark.,Returns the first n rows.,2026-04-17T21:49:00.000Z,reference,integrations,0.74,True,"API behavior for DataFrame.head, including parameter n and return structure.",new 
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/hint,hint,hint - Azure Databricks,Use DataFrame.hint for query optimization in Databricks,Documentation for the DataFrame.hint method in PySpark.,Specifies some hint on the current DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,API reference for DataFrame.hint with Databricks-specific behavior and parameters; concrete method signature and usage details are product-specific integration/coding knowledge beyond generic concepts.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/inputfiles,inputFiles,inputFiles - Azure Databricks,Retrieve composing files with DataFrame.inputFiles,Documentation for the DataFrame.inputFiles method in PySpark.,"Returns a best-effort snapshot of the files that compose this DataFrame. This method simply asks each constituent BaseRelation for its respective files and takes the union of all results. Depending on the source relations, this may not find all input files. Duplicates are removed.",2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Documents DataFrame.inputFiles behavior, including best-effort semantics and duplicate handling; these method-level details are specific to Databricks/Spark API usage.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/intersect,intersect,intersect - Azure Databricks,Compute DataFrame intersections with intersect,Documentation for the DataFrame.intersect method in PySpark.,Return a new DataFrame containing rows only in both this DataFrame and another DataFrame. Note that any duplicates are removed. 
To preserve duplicates use intersectAll.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Explains DataFrame.intersect semantics, especially duplicate removal and relation to intersectAll; this is concrete API behavior specific to this product.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/intersectall,intersectAll,intersectAll - Azure Databricks,Preserve duplicates using DataFrame.intersectAll,Documentation for the DataFrame.intersectAll method in PySpark.,Return a new DataFrame containing rows in both this DataFrame and another DataFrame while preserving duplicates.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Clarifies how intersectAll differs from intersect in handling duplicates; method semantics are product-specific coding details.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/isempty,isEmpty,isEmpty - Azure Databricks,Check for empty DataFrames with isEmpty,Documentation for the DataFrame.isEmpty method in PySpark.,Checks if the DataFrame is empty and returns a boolean value.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Documents DataFrame.isEmpty return type and behavior; API-level detail for Databricks PySpark usage.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/islocal,isLocal,isLocal - Azure Databricks,Detect local collect capability with isLocal,Documentation for the DataFrame.isLocal method in PySpark.,Returns True if the collect and take methods can be run locally (without any Spark executors).,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Describes DataFrame.isLocal semantics regarding collect/take running without executors; specific to Spark/Databricks execution model.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/isstreaming,isStreaming,isStreaming - Azure Databricks,Work with streaming DataFrames using isStreaming,Documentation 
for the DataFrame.isStreaming property in PySpark.,"Returns True if this DataFrame contains one or more sources that continuously return data as it arrives. A DataFrame that reads data from a streaming source must be executed as a StreamingQuery using the start method in DataStreamWriter. Methods that return a single answer, such as count or collect, will throw an AnalysisException when there is a streaming source present.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Includes behavior with count/collect raising AnalysisException on streaming sources and requirement to use StreamingQuery; concrete product-specific API behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/join,join,join - Azure Databricks,Join DataFrames with DataFrame.join in Databricks,Documentation for the DataFrame.join method in PySpark.,"Joins with another DataFrame, using the given join expression.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,API reference for DataFrame.join with join expressions; method semantics and parameters are specific integration/coding patterns.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/lateraljoin,lateralJoin,lateralJoin - Azure Databricks,Perform lateral joins with DataFrame.lateralJoin,Documentation for the DataFrame.lateralJoin method in PySpark.,"Lateral joins with another DataFrame, using the given join expression.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents lateralJoin behavior and usage; lateral join support and syntax are product-specific API details.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/limit,limit,limit - Azure Databricks,Limit DataFrame result rows with limit,Documentation for the DataFrame.limit method in PySpark.,Limits the result count to the number specified.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Explains DataFrame.limit semantics for restricting row 
counts; concrete method behavior for Databricks PySpark.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/localcheckpoint,localCheckpoint,localCheckpoint - Azure Databricks,Use localCheckpoint for DataFrame logical plan truncation,Documentation for the DataFrame.localCheckpoint method in PySpark.,"Returns a locally checkpointed version of this DataFrame. Checkpointing can be used to truncate the logical plan of this DataFrame, which is especially useful in iterative algorithms where the plan may grow exponentially. Local checkpoints are stored in the executors using the caching subsystem and therefore they are not reliable.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Describes localCheckpoint behavior, storage location, and reliability characteristics; product-specific execution and caching semantics.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/mapinarrow,mapInArrow,mapInArrow - Azure Databricks,Apply Arrow-based batch transforms with mapInArrow,Documentation for the DataFrame.mapInArrow method in PySpark.,"Maps an iterator of batches in the current DataFrame using a Python native function that is performed on pyarrow.RecordBatch objects both as input and output, and returns the result as a DataFrame.",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Details DataFrame.mapInArrow using pyarrow.RecordBatch iterators; includes integration-specific behavior between PySpark and Arrow.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/mapinpandas,mapInPandas,mapInPandas - Azure Databricks,Apply pandas batch transforms with mapInPandas,Documentation for the DataFrame.mapInPandas method in PySpark.,"Maps an iterator of batches in the current DataFrame using a Python native function that is performed on pandas DataFrames both as input and output, and returns the result as a 
DataFrame.",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Documents DataFrame.mapInPandas using pandas DataFrame iterators; specific integration pattern between Databricks PySpark and pandas.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/melt,melt,melt - Azure Databricks,Unpivot wide DataFrames using melt/unpivot,Documentation for the DataFrame.melt method in PySpark.,"Unpivot a DataFrame from wide format to long format, optionally leaving identifier columns set. This is the reverse to groupBy(...).pivot(...).agg(...), except for the aggregation, which cannot be reversed. melt is an alias for unpivot.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Explains DataFrame.melt semantics, relation to groupBy().pivot().agg(), and aliasing to unpivot; concrete API behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/mergeinto,mergeInto,mergeInto - Azure Databricks,Merge source changes into target tables with mergeInto,Documentation for the DataFrame.mergeInto method in PySpark.,"Merges a set of updates, insertions, and deletions based on a source table into a target table.",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Describes DataFrame.mergeInto for updates/inserts/deletes into target tables; Databricks-specific Delta-style merge semantics.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/metadatacolumn,metadataColumn,metadataColumn - Azure Databricks,Access logical metadata columns with metadataColumn,Documentation for the DataFrame.metadataColumn method in PySpark.,Selects a metadata column based on its logical column name and returns it as a Column. 
Added in Databricks Runtime 16.1,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents metadataColumn behavior and its addition in a specific Databricks Runtime version; product-specific API detail.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/na,na,na - Azure Databricks,Handle missing values via DataFrame.na functions,Documentation for the DataFrame.na property in PySpark.,Returns a DataFrameNaFunctions for handling missing values.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Returns DataFrameNaFunctions; exposes product-specific API surface for null handling.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/observe,observe,observe - Azure Databricks,Define and capture DataFrame metrics with observe,Documentation for the DataFrame.observe method in PySpark.,"Define (named) metrics to observe on the DataFrame. This method returns an 'observed' DataFrame that returns the same result as the input, with the following guarantees: It will compute the defined aggregates (metrics) on all the data that is flowing through the Dataset at that point. It will report the value of the defined aggregate columns as soon as we reach a completion point.",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Explains guarantees about metric computation and reporting on observed DataFrames; Databricks-specific monitoring semantics.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/offset,offset,offset - Azure Databricks,Skip initial rows using DataFrame.offset,Documentation for the DataFrame.offset method in PySpark.,Returns a new DataFrame by skipping the first n rows. 
Added in Databricks Runtime 13.1,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents offset behavior and runtime version introduction; concrete API semantics for row skipping.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/orderby,orderBy,orderBy (DataFrame) - Azure Databricks,Sort DataFrames using orderBy/sort aliases,Documentation for the DataFrame.orderBy method in PySpark.,orderBy is an alias for sort.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Clarifies that orderBy is an alias for sort; aliasing behavior is specific API knowledge.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/pandas_api,pandas_api,pandas_api - Azure Databricks,Convert PySpark DataFrames to pandas-on-Spark,Documentation for the DataFrame.pandas_api method in PySpark.,Converts the existing DataFrame into a pandas-on-Spark DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Describes pandas_api conversion behavior; integration between Databricks PySpark and pandas-on-Spark is product-specific.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/persist,persist,persist - Azure Databricks,Configure DataFrame persistence and storage levels,Documentation for the DataFrame.persist method in PySpark.,Sets the storage level to persist the contents of the DataFrame across operations after the first time it is computed. This can only be used to assign a new storage level if the DataFrame does not have a storage level set yet. 
If no storage level is specified it defaults to MEMORY_AND_DISK_DESER.,2026-04-17T21:49:00.000Z,reference,configuration,0.7,True,"Documents persist behavior, including default storage level MEMORY_AND_DISK_DESER and constraints on changing storage level; specific configuration semantics.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/plot,plot,plot - Azure Databricks,Create visualizations with DataFrame.plot accessor,Documentation for the DataFrame.plot property in PySpark.,Returns a PySparkPlotAccessor for plotting functions. You can create plots in two ways:,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Exposes PySparkPlotAccessor and supported plotting approaches; product-specific plotting API.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/printschema,printSchema,printSchema - Azure Databricks,Inspect DataFrame schemas with printSchema,Documentation for the DataFrame.printSchema method in PySpark.,Prints out the schema in the tree format. 
Optionally allows specifying how many levels to print if the schema is nested.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Explains tree-format schema printing and level control; concrete method behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/randomsplit,randomSplit,randomSplit - Azure Databricks,Randomly split DataFrames with randomSplit,Documentation for the DataFrame.randomSplit method in PySpark.,Randomly splits this DataFrame with the provided weights.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents randomSplit behavior with weights; API semantics for dataset partitioning.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/rdd,rdd,rdd - Azure Databricks,Access underlying RDDs via DataFrame.rdd,Documentation for the DataFrame.rdd property in PySpark.,Returns the content as an RDD of Row.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Describes rdd property returning RDD[Row]; product-specific bridge between DataFrame and RDD APIs.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/repartition,repartition,repartition - Azure Databricks,Hash partition DataFrames with repartition,Documentation for the DataFrame.repartition method in PySpark.,Returns a new DataFrame partitioned by the given partitioning expressions. The resulting DataFrame is hash partitioned.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Explains repartition behavior and hash partitioning semantics; concrete API behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/repartitionbyid,repartitionById,repartitionById - Azure Databricks,Partition DataFrames by column identifier with repartitionById,Documentation for the DataFrame.repartitionById method in PySpark.,Returns a new DataFrame partitioned by the given partitioning expressions. 
The resulting DataFrame is partitioned by column identifier.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents repartitionById semantics and partitioning by column identifier; Databricks-specific API.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/repartitionbyrange,repartitionByRange,repartitionByRange - Azure Databricks,Range partition DataFrames with repartitionByRange,Documentation for the DataFrame.repartitionByRange method in PySpark.,Returns a new DataFrame partitioned by the given partitioning expressions. The resulting DataFrame is range partitioned.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Explains repartitionByRange behavior and range partitioning; concrete partitioning API.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/replace,replace,replace (DataFrame) - Azure Databricks,Replace values in DataFrames with replace,Documentation for the DataFrame.replace method in PySpark.,"Returns a new DataFrame replacing a value with another value. DataFrame.replace and DataFrameNaFunctions.replace are aliases of each other. Values to_replace and value must have the same type and can only be numerics, booleans, or strings. Value can have None. 
When replacing, the new value will be cast to the type of the existing column.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Details type constraints, aliasing with DataFrameNaFunctions.replace, and casting behavior; specific method semantics.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/rollup,rollup,rollup - Azure Databricks,Create multi-dimensional aggregations with rollup,Documentation for the DataFrame.rollup method in PySpark.,"Create a multi-dimensional rollup for the current DataFrame using the specified columns, allowing for aggregation on them.",2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents rollup behavior for multi-dimensional aggregation; product-specific group/aggregation API.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/samesemantics,sameSemantics,sameSemantics - Azure Databricks,Compare DataFrame logical plans with sameSemantics,Documentation for the DataFrame.sameSemantics method in PySpark.,Returns True when the logical query plans inside both DataFrames are equal and therefore return the same results.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Explains sameSemantics behavior for logical plan equality; Databricks/Spark-specific query plan API.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/sample,sample,sample - Azure Databricks,Sample subsets of DataFrames with sample,Documentation for the DataFrame.sample method in PySpark.,Returns a sampled subset of this DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents sampling behavior; concrete API semantics for random sampling.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/sampleby,sampleBy,sampleBy (DataFrame) - Azure Databricks,Perform stratified sampling with sampleBy,Documentation for the DataFrame.sampleBy method in PySpark.,Returns a stratified 
sample without replacement based on the fraction given on each stratum.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Explains stratified sampling without replacement and fraction-per-stratum behavior; specific method semantics.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/scalar,scalar,scalar - Azure Databricks,Use scalar subqueries via DataFrame.scalar,Documentation for the DataFrame.scalar method in PySpark.,Return a Column object for a SCALAR Subquery containing exactly one row and one column.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents scalar subquery constraints (exactly one row and one column) and Column return type; product-specific query API.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/schema,schema,schema (DataFrame) - Azure Databricks,Inspect DataFrame schema via schema property,Documentation for the DataFrame.schema property in PySpark.,Returns the schema of this DataFrame as a StructType.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Describes schema property returning StructType; concrete API surface.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/select,select,select - Azure Databricks,Project columns and expressions with select,Documentation for the DataFrame.select method in PySpark.,Projects a set of expressions and returns a new DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents DataFrame.select projection behavior; core API semantics.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/selectexpr,selectExpr,selectExpr - Azure Databricks,Project SQL expressions with selectExpr,Documentation for the DataFrame.selectExpr method in PySpark.,Projects a set of SQL expressions and returns a new DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Explains selectExpr usage with SQL expressions; 
specific to Databricks/Spark SQL integration.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/semantichash,semanticHash,semanticHash - Azure Databricks,Generate logical plan hashes with semanticHash,Documentation for the DataFrame.semanticHash method in PySpark.,Returns a hash code of the logical query plan against this DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents semanticHash behavior for logical query plans; product-specific diagnostic API.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/show,show,show - Azure Databricks,Display DataFrame rows in console with show,Documentation for the DataFrame.show method in PySpark.,Prints the first n rows of the DataFrame to the console.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Explains show behavior for printing first n rows; includes method semantics and parameters specific to this API.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/sort,sort,sort - Azure Databricks,Use DataFrame.sort in Azure Databricks PySpark,Documentation for the DataFrame.sort method in PySpark.,Returns a new DataFrame sorted by the specified column(s).,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"API reference for a specific PySpark DataFrame method on Databricks, including parameters and behavior details that are product- and runtime-specific.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/sortwithinpartitions,sortWithinPartitions,sortWithinPartitions - Azure Databricks,Use sortWithinPartitions for partitioned sorting,Documentation for the DataFrame.sortWithinPartitions method in PySpark.,Returns a new DataFrame with each partition sorted by the specified column(s).,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents a Databricks-specific PySpark DataFrame API with method semantics and parameters 
that go beyond generic conceptual knowledge.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/sparksession,sparkSession,sparkSession - Azure Databricks,Access DataFrame.sparkSession in Databricks PySpark,Documentation for the DataFrame.sparkSession property in PySpark.,Returns the Spark session that created this DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Property reference describing how to obtain the SparkSession from a DataFrame in Databricks PySpark, including behavior specific to this environment.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/stat,stat,stat - Azure Databricks,Use DataFrame.stat for statistics in PySpark,Documentation for the DataFrame.stat property in PySpark.,Returns a DataFrameStatFunctions for statistic functions.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Reference for the stat property returning DataFrameStatFunctions, exposing product-specific statistical APIs and usage patterns.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/storagelevel,storageLevel,storageLevel - Azure Databricks,Check DataFrame.storageLevel in Databricks PySpark,Documentation for the DataFrame.storageLevel property in PySpark.,Gets the DataFrame's current storage level.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Documents a property exposing Spark storage level details for a DataFrame, including behavior tied to Databricks runtime.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/subtract,subtract,subtract - Azure Databricks,Use DataFrame.subtract to diff DataFrames,Documentation for the DataFrame.subtract method in PySpark.,Return a new DataFrame containing rows in this DataFrame but not in another DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,API reference for subtract between DataFrames with Databricks-specific 
semantics and constraints not captured by generic knowledge.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/summary,summary,summary - Azure Databricks,Compute DataFrame.summary statistics in PySpark,Documentation for the DataFrame.summary method in PySpark.,"Computes specified statistics for numeric and string columns. Available statistics are: count, mean, stddev, min, max, arbitrary approximate percentiles specified as a percentage (e.g., 75%).",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Details available statistics, including approximate percentiles and accepted arguments, which are concrete API behaviors.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/tail,tail,tail - Azure Databricks,Use DataFrame.tail to get last rows,Documentation for the DataFrame.tail method in PySpark.,Returns the last num rows as a list of Row.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Method reference for tail with parameter semantics and return type specific to Databricks PySpark.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/take,take,take - Azure Databricks,Use DataFrame.take to get first rows,Documentation for the DataFrame.take method in PySpark.,Returns the first num rows as a list of Row.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents the take method’s parameters and behavior for Databricks PySpark DataFrames.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/to,to,to - Azure Databricks,Reconcile DataFrame schema with to() in PySpark,Documentation for the DataFrame.to method in PySpark.,Returns a new DataFrame where each row is reconciled to match the specified schema.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,API reference for DataFrame.to with schema reconciliation behavior that is implementation-specific.,new 
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/toarrow,toArrow,toArrow - Azure Databricks,Convert DataFrame to PyArrow Table in Databricks,Documentation for the DataFrame.toArrow method in PySpark.,Returns the contents of this DataFrame as PyArrow pyarrow.Table. Added in Databricks Runtime 15.3,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Describes DataFrame.toArrow, including return type, version added, and integration behavior with PyArrow, which are concrete integration details.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/todf,toDF,toDF - Azure Databricks,Rename DataFrame columns with toDF in PySpark,Documentation for the DataFrame.toDF method in PySpark.,Returns a new DataFrame with new specified column names.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Method reference for toDF with specific behavior for renaming columns in Databricks PySpark.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/tojson,toJSON,toJSON - Azure Databricks,Convert DataFrame to JSON RDD in Databricks,Documentation for the DataFrame.toJSON method in PySpark.,Converts a DataFrame into an RDD of string.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Documents DataFrame.toJSON, including output type and behavior, which are concrete API integration details.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/tolocaliterator,toLocalIterator,toLocalIterator - Azure Databricks,Iterate over all DataFrame rows with toLocalIterator,Documentation for the DataFrame.toLocalIterator method in PySpark.,Returns an iterator that contains all of the rows in this DataFrame. The iterator will consume as much memory as the largest partition in this DataFrame. 
With prefetch it may consume up to the memory of the 2 largest partitions.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Explains memory behavior (largest partition, prefetch) and iterator semantics, which are product-specific operational details.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/topandas,toPandas,toPandas - Azure Databricks,Convert Spark DataFrame to pandas in Databricks,Documentation for the DataFrame.toPandas method in PySpark.,Returns the contents of this DataFrame as Pandas pandas.DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Reference for toPandas, including return type and implicit data movement semantics between Spark and pandas.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/transform,transform,transform (DataFrame) - Azure Databricks,Chain custom DataFrame.transform operations,Documentation for the DataFrame.transform method in PySpark.,Returns a new DataFrame. Concise syntax for chaining custom transformations.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents the transform method’s signature and usage pattern for chaining transformations in Databricks PySpark.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/transpose,transpose,transpose - Azure Databricks,Transpose a DataFrame in Databricks PySpark,Documentation for the DataFrame.transpose method in PySpark.,"Transposes a DataFrame such that the values in the specified index column become the new columns of the DataFrame. 
If no index column is provided, the first column is used as the default.",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"API reference for transpose, including default index column behavior, which is specific to this implementation.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/union,union,union - Azure Databricks,Union two DataFrames with union in PySpark,Documentation for the DataFrame.union method in PySpark.,Return a new DataFrame containing the union of rows in this and another DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Describes DataFrame.union semantics and usage in Databricks PySpark.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/unionbyname,unionByName,unionByName - Azure Databricks,Union DataFrames by column name with unionByName,Documentation for the DataFrame.unionByName method in PySpark.,Returns a new DataFrame containing the union of rows in this and another DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Documents unionByName behavior and how it matches columns, which is concrete API behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/unpersist,unpersist,unpersist - Azure Databricks,Unpersist a DataFrame in Databricks PySpark,Documentation for the DataFrame.unpersist method in PySpark.,"Marks the DataFrame as non-persistent, and removes all blocks for it from memory and disk.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Explains unpersist behavior, including removal of blocks from memory and disk, which is implementation-specific.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/unpivot,unpivot,unpivot - Azure Databricks,Unpivot wide DataFrames to long format in PySpark,Documentation for the DataFrame.unpivot method in PySpark.,"Unpivot a DataFrame from wide format to long format, optionally 
leaving identifier columns set. This is the reverse of groupBy(...).pivot(...).agg(...), except for the aggregation, which cannot be reversed. Added in Databricks Runtime 11.1",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Details unpivot behavior, relation to groupBy().pivot().agg(), and version added, which are specific integration semantics.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/where,where,where - Azure Databricks,Filter DataFrames using where alias in PySpark,Documentation for the DataFrame.where method in PySpark.,where is an alias for filter.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,"Clarifies that where is an alias for filter in Databricks PySpark, a concrete API mapping.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/withcolumn,withColumn,withColumn - Azure Databricks,Add or replace columns with withColumn in PySpark,Documentation for the DataFrame.withColumn method in PySpark.,Returns a new DataFrame by adding a column or replacing the existing column that has the same name.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Reference for withColumn behavior when adding or replacing columns in Databricks PySpark.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/withcolumnrenamed,withColumnRenamed,withColumnRenamed - Azure Databricks,Rename a DataFrame column with withColumnRenamed,Documentation for the DataFrame.withColumnRenamed method in PySpark.,Returns a new DataFrame by renaming an existing column. 
This is a no-op if the schema doesn't contain the given column name.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents renaming semantics and no-op behavior when column is missing, which are concrete API details.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/withcolumns,withColumns,withColumns - Azure Databricks,Add multiple columns with withColumns in PySpark,Documentation for the DataFrame.withColumns method in PySpark.,Returns a new DataFrame by adding multiple columns or replacing the existing columns that have the same names.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Describes multi-column add/replace behavior for Databricks PySpark DataFrames.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/withcolumnsrenamed,withColumnsRenamed,withColumnsRenamed - Azure Databricks,Rename multiple columns with withColumnsRenamed,Documentation for the DataFrame.withColumnsRenamed method in PySpark.,Returns a new DataFrame by renaming multiple columns. 
This is a no-op if the schema doesn't contain the given column names.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Explains multi-column rename semantics and no-op behavior, which are specific API behaviors.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/withmetadata,withMetadata,withMetadata - Azure Databricks,Update column metadata with withMetadata in PySpark,Documentation for the DataFrame.withMetadata method in PySpark.,Returns a new DataFrame by updating an existing column with metadata.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents how to modify column metadata via a specific DataFrame API in Databricks PySpark.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/withwatermark,withWatermark,withWatermark - Azure Databricks,Define event-time watermarks with withWatermark,Documentation for the DataFrame.withWatermark method in PySpark.,Defines an event time watermark for this DataFrame. 
A watermark tracks a point in time before which we assume no more late data is going to arrive.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Explains watermark semantics for streaming DataFrames, including late data handling, which is product-specific behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/write,write,write (DataFrame) - Azure Databricks,Write non-streaming DataFrames to storage in Databricks,Documentation for the DataFrame.write property in PySpark.,Interface for saving the content of the non-streaming DataFrame out into external storage.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Reference for DataFrame.write interface, including how it interacts with external storage systems in Databricks.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/writestream,writeStream,writeStream - Azure Databricks,Write streaming DataFrames with writeStream in Databricks,Documentation for the DataFrame.writeStream property in PySpark.,Interface for saving the content of the streaming DataFrame out into external storage.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Documents streaming write interface and semantics for Databricks PySpark DataFrames.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/writeto,writeTo,writeTo - Azure Databricks,Configure v2 data source writes with writeTo,Documentation for the DataFrame.writeTo method in PySpark.,Create a write configuration builder for v2 sources.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Describes the write configuration builder for v2 sources, including API usage specific to Databricks runtime.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframenafunctions,DataFrameNaFunctions class,DataFrameNaFunctions class - Azure Databricks,Handle nulls with DataFrameNaFunctions in PySpark,"Learn about the 
DataFrameNaFunctions class in PySpark for dropping, filling, and replacing null values in a DataFrame.",Functionality for working with missing data in a DataFrame. Supports Spark Connect,2026-04-17T08:00:00.000Z,reference,integrations,0.8,True,"Class reference for DataFrameNaFunctions, including supported operations and Spark Connect support, which are concrete API details.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframenafunctions/drop,drop,drop (DataFrameNaFunctions) - Azure Databricks,Drop null or NaN rows with DataFrameNaFunctions.drop,Documentation for the DataFrameNaFunctions.drop method in PySpark.,Returns a new DataFrame omitting rows with null or NaN values. DataFrame.dropna and DataFrameNaFunctions.drop are aliases of each other.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents drop behavior, aliasing with DataFrame.dropna, and null-handling semantics specific to Databricks PySpark.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframenafunctions/fill,fill,fill (DataFrameNaFunctions) - Azure Databricks,Fill null values with DataFrameNaFunctions.fill,Documentation for the DataFrameNaFunctions.fill method in PySpark.,Returns a new DataFrame in which null values are filled with a new value. DataFrame.fillna and DataFrameNaFunctions.fill are aliases of each other.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Explains fill behavior, aliasing with DataFrame.fillna, and accepted value types, which are concrete API semantics.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframenafunctions/replace,replace,replace (DataFrameNaFunctions) - Azure Databricks,Replace values with DataFrameNaFunctions.replace,Documentation for the DataFrameNaFunctions.replace method in PySpark.,"Returns a new DataFrame replacing a value with another value. DataFrame.replace and DataFrameNaFunctions.replace are aliases of each other. 
Values for to_replace and value must have the same type and can only be numerics, booleans, or strings. value can be None. When replacing, the new value is cast to the type of the existing column.",2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Details type constraints, None handling, and casting behavior for replace, which are precise, product-specific API rules.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader,DataFrameReader class,DataFrameReader class - Azure Databricks,Load data with DataFrameReader in Azure Databricks,Documentation for the DataFrameReader class in PySpark.,"Interface used to load a DataFrame from external storage systems (e.g. file systems, key-value stores, etc). Supports Spark Connect",2026-04-17T08:00:00.000Z,reference,integrations,0.9,True,"Class reference listing methods, parameters, and behaviors for DataFrameReader, including Spark Connect support; this is detailed API integration knowledge.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/csv,csv,csv (DataFrameReader) - Azure Databricks,Read CSV files with DataFrameReader.csv in Databricks,Documentation for the DataFrameReader.csv method in PySpark.,"Loads a CSV file and returns the result as a DataFrame. If inferSchema is enabled, this function reads the input once to determine the schema. 
To avoid this, either disable inferSchema or specify the schema explicitly using schema.",2026-04-17T21:49:00.000Z,reference,integrations,0.9,True,"Method reference with options like inferSchema and schema, including behavior about extra passes over data; these are concrete API parameters and effects.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/excel,excel,excel (DataFrameReader) - Azure Databricks,Read Excel files using DataFrameReader.excel in Databricks,Documentation for the DataFrameReader.excel method in PySpark.,Loads Excel files and returns the result as a DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"API method for loading Excel into DataFrames with product-specific options and behavior, fitting integration patterns.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/format,format,format (DataFrameReader) - Azure Databricks,Specify input data source format with DataFrameReader.format,Documentation for the DataFrameReader.format method in PySpark.,Specifies the input data source format.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents the format method and accepted values for configuring data sources, which are concrete configuration parameters for integrations.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/jdbc,jdbc,jdbc (DataFrameReader) - Azure Databricks,Read database tables via JDBC with DataFrameReader.jdbc,Documentation for the DataFrameReader.jdbc method in PySpark.,"Constructs a DataFrame representing the database table accessible via JDBC URL url. Partitions of the table are retrieved in parallel if either column or predicates is specified. 
If both column and predicates are specified, column takes precedence.",2026-04-17T21:49:00.000Z,reference,integrations,0.9,True,"Describes JDBC URL usage, partitioning via column/predicates, and precedence rules, which are detailed integration and configuration behaviors.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/json,json,json (DataFrameReader) - Azure Databricks,Read JSON and JSON Lines with DataFrameReader.json,Documentation for the DataFrameReader.json method in PySpark.,"Loads JSON files and returns the results as a DataFrame. JSON Lines (newline-delimited JSON) is supported by default. For JSON with one record per file, set the multiLine option to True. If schema is not specified, this function reads the input once to determine the input schema.",2026-04-17T21:49:00.000Z,reference,integrations,0.9,True,"Covers options like multiLine and schema inference behavior, which are specific API parameters and effects for JSON integration.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/load,load,load - Azure Databricks,Load data from sources with DataFrameReader.load,Documentation for the DataFrameReader.load method in PySpark.,Loads data from a data source and returns it as a DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,API reference for load with parameters and behavior for reading from various sources; this is product-specific integration detail.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/option,option,option (DataFrameReader) - Azure Databricks,Set single input option with DataFrameReader.option,Documentation for the DataFrameReader.option method in PySpark.,Adds an input option for the underlying data source.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents the option method for configuring data source parameters, including names and values, which is integration-focused 
configuration.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/options,options,options (DataFrameReader) - Azure Databricks,Set multiple input options with DataFrameReader.options,Documentation for the DataFrameReader.options method in PySpark.,Adds input options for the underlying data source.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,API for bulk-setting data source options; parameter names and usage are product-specific integration details.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/orc,orc,orc (DataFrameReader) - Azure Databricks,Read ORC files with DataFrameReader.orc in Databricks,Documentation for the DataFrameReader.orc method in PySpark.,Loads ORC files and returns the result as aDataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Method reference for loading ORC with specific parameters and behavior, which is concrete integration knowledge.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/parquet,parquet,parquet (DataFrameReader) - Azure Databricks,Read Parquet files with DataFrameReader.parquet,Documentation for the DataFrameReader.parquet method in PySpark.,Loads Parquet files and returns the result as aDataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"API method for Parquet loading with product-specific options and semantics, fitting integration patterns.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/schema,schema,schema - Azure Databricks,Define input schema with DataFrameReader.schema,Documentation for the DataFrameReader.schema method in PySpark.,"Specifies the input schema. Some data sources (such as JSON) can infer the input schema automatically from data. 
By specifying the schema here, the underlying data source can skip the schema inference step, which speeds up data loading.",2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Explains schema parameter usage and its effect on skipping inference, which is specific API behavior for integrations.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/table,table,table (DataFrameReader) - Azure Databricks,Load tables as DataFrames with DataFrameReader.table,Documentation for the DataFrameReader.table method in PySpark.,Returns the specified table as aDataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents how to reference tables and return DataFrames, including parameter usage; this is product-specific integration detail.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/text,text,text (DataFrameReader) - Azure Databricks,Read text files with DataFrameReader.text in Databricks,Documentation for the DataFrameReader.text method in PySpark.,"Loads text files and returns aDataFramewhose schema starts with a string column namedvalue, followed by partitioned columns if any are present. Text files must be encoded as UTF-8. By default, each line in the text file is a new row in the resulting DataFrame.",2026-04-17T21:49:00.000Z,reference,integrations,0.9,True,"Specifies schema (value column), UTF-8 requirement, and line-to-row behavior, which are concrete API semantics.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/xml,xml,xml (DataFrameReader) - Azure Databricks,Read XML files with DataFrameReader.xml in Databricks,Documentation for the DataFrameReader.xml method in PySpark.,"Loads an XML file and returns the result as aDataFrame. 
Ifschemais not specified, this function reads the input once to determine the input schema.",2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,API for XML loading with schema inference behavior; these are specific integration parameters and behaviors.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframestatfunctions,DataFrameStatFunctions class,DataFrameStatFunctions class - Azure Databricks,Use DataFrameStatFunctions for statistics in PySpark,"Learn about the DataFrameStatFunctions class in PySpark for computing statistics such as correlations, covariances, and frequency tables.",Functionality for statistical functions with a DataFrame. Supports Spark Connect,2026-04-17T08:00:00.000Z,reference,integrations,0.8,True,"Class reference for statistical functions, including supported operations and Spark Connect support.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframestatfunctions/approxquantile,approxQuantile,approxQuantile (DataFrameStatFunctions) - Azure Databricks,Compute approximate quantiles with approxQuantile,Documentation for the DataFrameStatFunctions.approxQuantile method in PySpark.,"Calculates the approximate quantiles of numerical columns of aDataFrame. The result of this algorithm has the following deterministic bound: if theDataFramehas N elements and if we request the quantile at probabilitypup to errorerr, then the algorithm will return a samplexfrom theDataFrameso that theexactrank ofxis close to (p _ N). More precisely,floor((p - err) _ N) <= rank(x) <= ceil((p + err) \* N). 
This method implements a variation of the Greenwald-Khanna algorithm with some speed optimizations.",2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Describes deterministic error bounds and algorithm (Greenwald-Khanna variation), which are detailed, implementation-specific semantics.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframestatfunctions/corr,corr,corr (DataFrameStatFunctions) - Azure Databricks,Calculate column correlation with DataFrameStatFunctions.corr,Documentation for the DataFrameStatFunctions.corr method in PySpark.,Calculates the correlation of two columns of a DataFrame as a double value. Currently only supports the Pearson Correlation Coefficient. DataFrame.corr and DataFrameStatFunctions.corr are aliases of each other.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents correlation computation, supported method (Pearson), and aliasing with DataFrame.corr.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframestatfunctions/cov,cov,cov (DataFrameStatFunctions) - Azure Databricks,Compute sample covariance with DataFrameStatFunctions.cov,Documentation for the DataFrameStatFunctions.cov method in PySpark.,"Calculates the sample covariance for the given columns, specified by their names, as a double value. DataFrame.cov and DataFrameStatFunctions.cov are aliases of each other.",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Explains covariance calculation semantics and aliasing with DataFrame.cov, which are concrete API behaviors.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframestatfunctions/crosstab,crosstab,crosstab (DataFrameStatFunctions) - Azure Databricks,Generate crosstab frequency tables with DataFrameStatFunctions.crosstab,Documentation for the DataFrameStatFunctions.crosstab method in PySpark.,"Computes a pair-wise frequency table of the given columns, also known as a contingency table. 
The first column of each row contains the distinct values of col1, and the column names are the distinct values of col2. The name of the first column is $col1_$col2. Pairs with no occurrences have a count of zero. DataFrame.crosstab and DataFrameStatFunctions.crosstab are aliases of each other.",2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Details output schema (first column naming, column names from distinct values, zero counts for missing pairs) which are precise API semantics.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframestatfunctions/freqitems,freqItems,freqItems (DataFrameStatFunctions) - Azure Databricks,Use freqItems for frequent value detection in PySpark,Documentation for the DataFrameStatFunctions.freqItems method in PySpark.,"Finds frequent items for columns, possibly with false positives. Uses the frequent element count algorithm described by Karp, Schenker, and Papadimitriou. DataFrame.freqItems and DataFrameStatFunctions.freqItems are aliases of each other.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,API reference for DataFrameStatFunctions.freqItems with method-specific parameters and behavior that are product-specific integration details rather than generic concepts.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframestatfunctions/sampleby,sampleBy,sampleBy (DataFrameStatFunctions) - Azure Databricks,Stratified sampling with sampleBy in Azure Databricks,Documentation for the DataFrameStatFunctions.sampleBy method in PySpark.,Returns a stratified sample without replacement based on the fraction given on each stratum.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents DataFrameStatFunctions.sampleBy with concrete parameter usage for stratified sampling, which is product-specific API behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter,DataFrameWriter class,DataFrameWriter class 
- Azure Databricks,Write DataFrames with DataFrameWriter in Azure Databricks,Documentation for the DataFrameWriter class in PySpark.,"Interface used to write a DataFrame to external storage systems (e.g. file systems, key-value stores, etc). Supports Spark Connect",2026-04-17T08:00:00.000Z,reference,integrations,0.9,True,Class reference for DataFrameWriter with methods and options for writing to external systems; detailed integration and configuration surface.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/bucketby,bucketBy,bucketBy - Azure Databricks,Bucket output data with DataFrameWriter.bucketBy,Documentation for the DataFrameWriter.bucketBy method in PySpark.,"Buckets the output by the given columns. If specified, the output is laid out on the file system similar to Hive's bucketing scheme, but with a different bucket hash function and is not compatible with Hive's bucketing.",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents bucketBy parameters and behavior relative to Hive bucketing, which is specific API behavior for storage layout.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/clusterby,clusterBy,clusterBy (DataFrameWriter) - Azure Databricks,Cluster data for performance with DataFrameWriter.clusterBy,Documentation for the DataFrameWriter.clusterBy method in PySpark.,Clusters the data by the given columns to optimize query performance.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,API method for clustering output by columns; parameter usage and semantics are product-specific integration details.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/csv,csv,csv (DataFrameWriter) - Azure Databricks,Write CSV files with DataFrameWriter.csv in Databricks,Documentation for the DataFrameWriter.csv method in PySpark.,Saves the content of the DataFrame in CSV format at the specified 
path.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,API for CSV output with options and behavior specific to Databricks/Spark.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/excel,excel,excel (DataFrameWriter) - Azure Databricks,Write Excel files with DataFrameWriter.excel in Databricks,Documentation for the DataFrameWriter.excel method in PySpark.,Saves the content of the DataFrame in Excel format at the specified path.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Method reference for Excel output with product-specific parameters and semantics.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/format,format,format (DataFrameWriter) - Azure Databricks,Specify output data source with DataFrameWriter.format,Documentation for the DataFrameWriter.format method in PySpark.,Specifies the underlying output data source.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents format configuration for output, including accepted values and behavior, which is integration configuration detail.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/insertinto,insertInto,insertInto - Azure Databricks,Insert DataFrame rows into tables with DataFrameWriter.insertInto,Documentation for the DataFrameWriter.insertInto method in PySpark.,Inserts the content of the DataFrame into the specified table. 
Requires that the schema of the DataFrame is the same as the schema of the table.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,Requires schema equality and defines insert behavior; these are concrete integration rules.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/jdbc,jdbc,jdbc (DataFrameWriter) - Azure Databricks,Write DataFrames to databases via DataFrameWriter.jdbc,Documentation for the DataFrameWriter.jdbc method in PySpark.,Saves the content of the DataFrame to an external database table via JDBC.,2026-04-17T21:49:00.000Z,reference,integrations,0.9,True,API for JDBC writes with parameters and behavior specific to Databricks/Spark JDBC integration.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/json,json,json (DataFrameWriter) - Azure Databricks,Write JSON output with DataFrameWriter.json in Databricks,Documentation for the DataFrameWriter.json method in PySpark.,Saves the content of the DataFrame in JSON format (JSON Lines / newline-delimited JSON) at the specified path.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,API for writing JSON Lines with path and options; behavior is specific to this product’s integration.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/mode,mode,mode (DataFrameWriter) - Azure Databricks,Control overwrite behavior with DataFrameWriter.mode,Documentation for the DataFrameWriter.mode method in PySpark.,Specifies the behavior when data or table already exists.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Defines allowed mode values and their effects when data or tables exist, which are concrete API semantics.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/option,option,option (DataFrameWriter) - Azure Databricks,Set single output option with DataFrameWriter.option,Documentation for the 
DataFrameWriter.option method in PySpark.,Adds an output option for the underlying data source.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents how to configure output data source options via named parameters, which is integration configuration detail.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/options,options,options (DataFrameWriter) - Azure Databricks,Set multiple output options with DataFrameWriter.options,Documentation for the DataFrameWriter.options method in PySpark.,Adds output options for the underlying data source.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,API for bulk-setting output options; parameter names and usage are product-specific.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/orc,orc,orc (DataFrameWriter) - Azure Databricks,Write ORC files with DataFrameWriter.orc in Databricks,Documentation for the DataFrameWriter.orc method in PySpark.,Saves the content of theDataFramein ORC format at the specified path.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,API method for ORC output with specific parameters and behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/parquet,parquet,parquet (DataFrameWriter) - Azure Databricks,Write Parquet output with DataFrameWriter.parquet,Documentation for the DataFrameWriter.parquet method in PySpark.,Saves the content of theDataFramein Parquet format at the specified path.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,Method reference for Parquet writes with product-specific semantics and options.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/partitionby,partitionBy,partitionBy (DataFrameWriter) - Azure Databricks,Partition output data with DataFrameWriter.partitionBy,Documentation for the DataFrameWriter.partitionBy method in PySpark.,"Partitions 
the output by the given columns on the file system. If specified, the output is laid out on the file system similar to Hive's partitioning scheme.",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Explains partitionBy parameters and layout similar to Hive partitioning, which is specific integration behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/save,save,save - Azure Databricks,Save DataFrames to data sources with DataFrameWriter.save,Documentation for the DataFrameWriter.save method in PySpark.,"Saves the contents of theDataFrameto a data source. The data source is specified byformatand a set ofoptions. Ifformatis not specified, the default data source configured byspark.sql.sources.defaultis used.",2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Describes interaction of format and options, and default source via spark.sql.sources.default, which is concrete integration behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/saveastable,saveAsTable,saveAsTable - Azure Databricks,Save DataFrames as tables with DataFrameWriter.saveAsTable,Documentation for the DataFrameWriter.saveAsTable method in PySpark.,"Saves the content of theDataFrameas the specified table. If the table already exists, the behavior depends on themodeparameter (default is to throw an exception). 
When mode is 'overwrite', the schema of the DataFrame does not need to match the existing table schema.",2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Details mode behavior and schema matching rules when saving as tables, which are product-specific API semantics.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/sortby,sortBy,sortBy - Azure Databricks,Sort bucketed output with DataFrameWriter.sortBy,Documentation for the DataFrameWriter.sortBy method in PySpark.,Sorts the output in each bucket by the given columns on the file system.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Describes sortBy behavior within buckets on the filesystem, which is concrete API semantics.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/text,text,text (DataFrameWriter) - Azure Databricks,Write text files with DataFrameWriter.text in Databricks,Documentation for the DataFrameWriter.text method in PySpark.,Saves the content of the DataFrame in a text file at the specified path. 
Text files are encoded as UTF-8.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Specifies UTF-8 encoding and path-based output behavior, which are concrete API details.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/xml,xml,xml (DataFrameWriter) - Azure Databricks,Write XML files with DataFrameWriter.xml in Databricks,Documentation for the DataFrameWriter.xml method in PySpark.,Saves the content of the DataFrame in XML format at the specified path.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Method reference for XML output with product-specific options and semantics.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2,DataFrameWriterV2 class,DataFrameWriterV2 class - Azure Databricks,Write tables using DataFrameWriterV2 in Azure Databricks,Documentation for the DataFrameWriterV2 class in PySpark.,"Interface used to write a DataFrame to external storage using the v2 API. 
For most use cases with Databricks tables and Delta Lake, DataFrameWriterV2 provides more powerful and flexible options than the original DataFrameWriter, and supports Spark Connect",2026-04-17T08:00:00.000Z,reference,integrations,0.9,True,Class reference for v2 writer with capabilities for Databricks tables and Delta Lake; includes product-specific configuration surface.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/append,append,append - Azure Databricks,Append data to tables with DataFrameWriterV2.append,Documentation for the DataFrameWriterV2.append method in PySpark.,Appends the contents of the DataFrame to the output table.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"API method describing append semantics for v2 tables, which is specific integration behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/clusterby,clusterBy,clusterBy (DataFrameWriterV2) - Azure Databricks,Cluster data using DataFrameWriterV2.clusterBy,Documentation for the DataFrameWriterV2.clusterBy method in PySpark.,Clusters the data by the given columns to optimize query performance.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents a Databricks-specific writer method that influences query performance; concrete coding pattern for clustering.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/create,create,create - Azure Databricks,Create new tables with DataFrameWriterV2.create,Documentation for the DataFrameWriterV2.create method in PySpark.,"Creates a new table from the contents of the DataFrame. 
The new table's schema, partition layout, properties, and other configuration are based on the configuration set on this writer.",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Explains how table schema, partitioning, and properties derive from writer configuration, which is detailed product-specific API behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/createorreplace,createOrReplace,createOrReplace - Azure Databricks,Use DataFrameWriterV2.createOrReplace in Azure Databricks,Documentation for the DataFrameWriterV2.createOrReplace method in PySpark.,"Creates a new table or replaces an existing table with the contents of the DataFrame. The output table's schema, partition layout, properties, and other configuration are based on the contents of the DataFrame and the configuration set on this writer. If the table exists, its configuration and data are replaced.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,API reference for a specific PySpark/Databricks method with product-specific semantics and options; this is concrete integration/coding detail beyond generic LLM knowledge.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/option,option,option (DataFrameWriterV2) - Azure Databricks,Set write options with DataFrameWriterV2.option,Documentation for the DataFrameWriterV2.option method in PySpark.,Adds a write option for the underlying data source.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,API reference for adding a single write option to a data source; product-specific configuration pattern for Databricks PySpark.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/options,options,options (DataFrameWriterV2) - Azure Databricks,Configure multiple write options with options(),Documentation for the DataFrameWriterV2.options method in PySpark.,Adds write options for the underlying 
data source.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents DataFrameWriterV2.options for setting multiple options; concrete configuration API for Databricks data sources.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/overwrite,overwrite,overwrite - Azure Databricks,Use DataFrameWriterV2.overwrite for conditional updates,Documentation for the DataFrameWriterV2.overwrite method in PySpark.,Overwrites rows matching the given filter condition with the contents of the DataFrame in the output table.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents a concrete writer API (overwrite with filter) specific to Databricks PySpark, including behavior details that are product-specific.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/overwritepartitions,overwritePartitions,overwritePartitions - Azure Databricks,Use overwritePartitions with DataFrameWriterV2,Documentation for the DataFrameWriterV2.overwritePartitions method in PySpark.,"Overwrites all partitions for which the DataFrame contains at least one row with the contents of the DataFrame in the output table. This operation is equivalent to Hive's INSERT OVERWRITE ... PARTITION, which replaces partitions dynamically depending on the contents of the DataFrame.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Describes Databricks-specific partition overwrite behavior equivalent to Hive INSERT OVERWRITE PARTITION, which is concrete integration behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/partitionedby,partitionedBy,partitionedBy - Azure Databricks,Partition tables with DataFrameWriterV2.partitionedBy,Documentation for the DataFrameWriterV2.partitionedBy method in PySpark.,"Partitions the output table created by create, createOrReplace, or replace using the given columns or transforms. 
When specified, the table data is stored by these values for efficient reads. For example, when a table is partitioned by day, it may be stored in a directory layout like: Partitioning is one of the most widely used techniques to optimize physical data layout. It provides a coarse-grained index for skipping unnecessary data reads when queries have predicates on the partitioned columns. F",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Explains how partitionedBy affects physical layout and query skipping in Databricks; concrete product-specific partitioning behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/replace,replace,replace (DataFrameWriterV2) - Azure Databricks,Replace tables using DataFrameWriterV2.replace,Documentation for the DataFrameWriterV2.replace method in PySpark.,"Replaces an existing table with the contents of the DataFrame. The existing table's schema, partition layout, properties, and other configuration are replaced with the contents of the DataFrame and the configuration set on this writer.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Method-level reference for replacing tables with Databricks PySpark, including how schema and properties are handled; product-specific coding pattern.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/tableproperty,tableProperty,tableProperty - Azure Databricks,Add table properties via tableProperty(),Documentation for the DataFrameWriterV2.tableProperty method in PySpark.,Adds a table property.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Method-level reference for setting table properties on Databricks tables through DataFrameWriterV2; specific integration/config behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/using,using,using - Azure Databricks,Specify data source providers with using(),Documentation for 
the DataFrameWriterV2.using method in PySpark.,"Specifies a provider for the underlying output data source. Spark's default catalog supports ""parquet"", ""json"", and other built-in formats.",2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Documents DataFrameWriterV2.using with supported provider strings (parquet, json, etc.) in Databricks; concrete API usage detail.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasource,DataSource class,DataSource - Azure Databricks,Implement custom PySpark DataSource in Databricks,A base class for custom PySpark data sources that enables reading from and writing to external data systems.,"A base class for data sources. This class represents a custom data source that allows for reading from and/or writing to it. The data source provides methods to create readers and writers for reading and writing data, respectively. At least one of the methods reader() or writer() must be implemented by any subclass to make the data source either readable or writable (or both). After implementing this interface, you can load your data source using spark.read.format(...).load() and save data using df.wri",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Defines the base DataSource class and required methods (reader, writer) for custom sources; detailed integration contract unique to Databricks PySpark.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasource/name,name,name (DataSource) - Azure Databricks,Define custom data source name() identifier,Documentation for the DataSource.name method in PySpark.,"Returns a string representing the format name of this data source. By default, it is the class name of the data source. 
It can be overridden to provide a customized short name for the data source.",2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Documents how DataSource.name determines the format string used in spark.read.format; specific to Databricks custom sources.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasource/reader,reader,reader - Azure Databricks,Provide a DataSource.reader implementation,Documentation for the DataSource.reader method in PySpark.,Returns a DataSourceReader instance for reading data. The implementation is required for readable data sources.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Explains the requirement and behavior of DataSource.reader for readable custom sources; concrete integration contract.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasource/schema,schema,schema (DataSource) - Azure Databricks,Infer custom data source schema() in Databricks,Documentation for the DataSource.schema method in PySpark.,"Returns the schema of the data source. It can refer to any field initialized in the __init__ method to infer the data source's schema when users do not explicitly specify it. This method is invoked once when calling spark.read.format(...).load() to get the schema for a data source read operation. 
If this method is not implemented, and a user does not provide a schema when reading the data source, an exception will be thrown.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Details how schema() is used when users omit schema in spark.read.format(...).load(); product-specific API behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasource/simplestreamreader,simpleStreamReader,simpleStreamReader - Azure Databricks,Implement simpleStreamReader for custom sources,Documentation for the DataSource.simpleStreamReader method in PySpark.,Returns a SimpleDataSourceStreamReader instance for reading data.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents DataSource.simpleStreamReader returning SimpleDataSourceStreamReader; specific streaming integration API.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasource/streamreader,streamReader,streamReader - Azure Databricks,Implement streamReader for streaming data sources,Documentation for the DataSource.streamReader method in PySpark.,Returns a DataSourceStreamReader instance for reading streaming data. Added in Databricks Runtime 15.2,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Describes DataSource.streamReader for streaming reads, including Databricks Runtime version; concrete streaming integration contract.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasource/streamwriter,streamWriter,streamWriter - Azure Databricks,Implement streamWriter for streaming sinks,Documentation for the DataSource.streamWriter method in PySpark.,Returns a DataSourceStreamWriter instance for writing data into a streaming sink. 
The implementation is required for writable streaming data sources.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents DataSource.streamWriter returning DataSourceStreamWriter; product-specific streaming sink integration.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasource/writer,writer,writer - Azure Databricks,Provide DataSource.writer for custom sinks,Documentation for the DataSource.writer method in PySpark.,Returns a DataSourceWriter instance for writing data. The implementation is required for writable data sources. Added in Databricks Runtime 14.3 LTS,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Explains DataSource.writer requirements and Databricks Runtime version; concrete integration API for writable sources.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcearrowwriter,DataSourceArrowWriter class,DataSourceArrowWriter - Azure Databricks,Use DataSourceArrowWriter for Arrow-based sinks,A base class for PySpark data source writers that uses PyArrow RecordBatch for better performance when writing data.,"A base class for data source writers that process data using PyArrow's RecordBatch. Unlike DataSourceWriter, which works with an iterator of Spark Row objects, this class is optimized for the Arrow format when writing data. It can offer better performance when interfacing with systems or libraries that natively support Arrow. 
Implement this class and return an instance from DataSource.writer() to make a data source writable using Arrow.",2026-04-17T08:00:00.000Z,reference,integrations,0.8,True,Defines a specialized writer using PyArrow RecordBatch instead of Spark Row; detailed integration pattern unique to Databricks.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcearrowwriter/write,write,write (DataSourceArrowWriter) - Azure Databricks,Implement write() in DataSourceArrowWriter,Documentation for the DataSourceArrowWriter.write method in PySpark.,"Writes an iterator of PyArrow RecordBatch objects to the sink. This method is called once on each executor to write data to the data source. It accepts an iterator of PyArrow RecordBatch objects and returns a single row representing a commit message, or None if there is no commit message. The driver collects commit messages, if any, from all executors and passes them to the commit() method if all tasks run successfully. If any task fails, the abort() method will be called with the collected commit message",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Describes executor-side write semantics, commit messages, and driver commit/abort flow for Arrow writers; concrete product-specific behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcereader,DataSourceReader class,DataSourceReader - Azure Databricks,Implement DataSourceReader for custom inputs,A base class for PySpark data source readers responsible for outputting data from a custom data source.,A base class for data source readers. Data source readers are responsible for outputting data from a data source. 
Implement this class and return an instance from DataSource.reader() to make a data source readable.,2026-04-17T08:00:00.000Z,reference,integrations,0.8,True,Base class contract for custom readers in Databricks PySpark; defines how data is output from a source.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcereader/partitions,partitions,partitions (DataSourceReader) - Azure Databricks,Define partitions() in DataSourceReader for parallelism,Documentation for the DataSourceReader.partitions method in PySpark.,"Returns a sequence of partitions for this data source. Partitions are used to split data reading operations into parallel tasks. If this method returns N partitions, the query planner will create N tasks. Each task will execute read() in parallel, using the respective partition value to read the data. This method is called once during query planning. By default, it returns a single partition with the value None. Subclasses can override this method to return multiple partitions. It's recommended to ",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Explains how partitions() controls task parallelism and default behavior; concrete integration and performance pattern.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcereader/pushfilters,pushFilters,pushFilters (DataSourceReader) - Azure Databricks,Implement pushFilters() for filter pushdown,Documentation for the DataSourceReader.pushFilters method in PySpark.,"Called with the list of filters that can be pushed down to the data source. The list of filters should be interpreted as the AND of the elements. Filter pushdown allows the data source to handle a subset of filters. This can improve performance by reducing the amount of data that needs to be processed by Spark. This method is called once during query planning. By default, it returns all filters, indicating that no filters can be pushed down. 
Subclasses can override this method to implement filte",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Details how filters are passed and interpreted (AND semantics) and default behavior; product-specific optimization API.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcereader/read,read,read (DataSourceReader) - Azure Databricks,Implement read() in DataSourceReader partitions,Documentation for the DataSourceReader.read method in PySpark.,Generates data for a given partition and returns an iterator of tuples or rows. This method is invoked once per partition to read the data. Implementing this method is required for readable data sources. You can initialize any non-serializable resources required for reading data from the data source within this method.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Describes per-partition read semantics and resource initialization; concrete integration contract for custom readers.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourceregistration,DataSourceRegistration class,DataSourceRegistration - Azure Databricks,Register custom DataSource with DataSourceRegistration,"A wrapper for registering Python user-defined data sources in PySpark, accessible via spark.dataSource.",A wrapper for data source registration. This instance can be accessed via spark.dataSource. 
Use it to register a custom DataSource subclass so it can be referenced by name in spark.read.format() and df.write.format().,2026-04-17T08:00:00.000Z,reference,integrations,0.75,True,Explains spark.dataSource wrapper and how to register Python DataSource subclasses for use in spark.read/df.write; product-specific registration API.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourceregistration/register,register,register (DataSourceRegistration) - Azure Databricks,Use register() to expose custom data sources,Documentation for the DataSourceRegistration.register method in PySpark.,Registers a Python user-defined data source.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Method-level reference for DataSourceRegistration.register; concrete integration step to make sources available by name.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamarrowwriter,DataSourceStreamArrowWriter class,DataSourceStreamArrowWriter - Azure Databricks,Use DataSourceStreamArrowWriter for streaming sinks,A base class for PySpark data stream writers that uses PyArrow RecordBatch for better performance when writing streaming data.,"A base class for data stream writers that process data using PyArrow's RecordBatch. Unlike DataSourceStreamWriter, which works with an iterator of Spark Row objects, this class is optimized for the Arrow format when writing streaming data. It can offer better performance when interfacing with systems or libraries that natively support Arrow for streaming use cases. 
Implement this class and return an instance from DataSource.streamWriter() to make a data source writable as a streaming sink using Arrow.",2026-04-17T08:00:00.000Z,reference,integrations,0.8,True,Defines Arrow-based streaming writer semantics for Databricks; specialized integration pattern for streaming with PyArrow.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamarrowwriter/write,write,write (DataSourceStreamArrowWriter) - Azure Databricks,Implement write() in DataSourceStreamArrowWriter,Documentation for the DataSourceStreamArrowWriter.write method in PySpark.,"Writes an iterator of PyArrow RecordBatch objects to the streaming sink. This method is called on executors to write data to the streaming data sink in each microbatch. It accepts an iterator of PyArrow RecordBatch objects and returns a single row representing a commit message, or None if there is no commit message. The driver collects commit messages, if any, from all executors and passes them to the commit() method if all tasks run successfully. If any task fails, the abort() method will be called with ",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Describes microbatch write semantics, commit messages, and driver commit/abort flow for Arrow streaming writers; product-specific behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamreader,DataSourceStreamReader class,DataSourceStreamReader - Azure Databricks,Implement DataSourceStreamReader for streaming inputs,A base class for PySpark streaming data source readers responsible for reading data from a streaming data source.,A base class for streaming data source readers. Data source stream readers are responsible for outputting data from a streaming data source. Implement this class and return an instance from DataSource.streamReader() to make a data source readable as a streaming source. 
Added in Databricks Runtime 15.2,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Base class contract for streaming readers in Databricks, including Runtime version; concrete streaming integration API.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamreader/commit,commit,commit (DataSourceStreamReader) - Azure Databricks,Handle commit() in DataSourceStreamReader offsets,Documentation for the DataSourceStreamReader.commit method in PySpark.,Informs the source that Spark has completed processing all data for offsets less than or equal to end and will only request offsets greater than end in the future. Added in Databricks Runtime 15.2,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Explains how commit signals processed offsets and future offset requests; detailed streaming offset management behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamreader/getdefaultreadlimit,getDefaultReadLimit,getDefaultReadLimit - Azure Databricks,Use getDefaultReadLimit in streaming sources,Documentation for the DataSourceStreamReader.getDefaultReadLimit method in PySpark.,"Returns the read limit potentially passed to the data source through options when creating the data source. Implementing this method is optional. 
By default, it returns ReadAllAvailable, which means there is no limit on the amount of data returned by latestOffset().",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Documents default ReadAllAvailable behavior and how options can pass read limits; product-specific streaming configuration API.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamreader/initialoffset,initialOffset,initialOffset (DataSourceStreamReader) - Azure Databricks,Define initialOffset() for streaming data sources,Documentation for the DataSourceStreamReader.initialOffset method in PySpark.,"Returns the initial offset of the streaming data source. A new streaming query starts reading data from the initial offset. If Spark is restarting an existing query, it will restart from the checkpointed offset rather than the initial one. Added in Databricks Runtime 15.2",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Explains how initial offsets are used vs checkpointed offsets on restart; concrete streaming integration behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamreader/latestoffset,latestOffset,latestOffset - Azure Databricks,Implement latestOffset() with read limits,Documentation for the DataSourceStreamReader.latestOffset method in PySpark.,"Returns the most recent offset available given a read limit. The start offset can be used to determine how much new data should be read given the limit. For the very first microbatch, start is provided from the return value of initialOffset(). For subsequent microbatches, it continues from the last microbatch. The source can return the same offset as the start offset if there is no data to process. ReadLimit can be used by the source to limit the amount of data returned. 
Implement getDefaultReadLimit()",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Details interaction between start offset, read limits, and no-data scenarios; product-specific streaming offset API.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamreader/partitions,partitions,partitions (DataSourceStreamReader) - Azure Databricks,Define partitions() for DataSourceStreamReader,Documentation for the DataSourceStreamReader.partitions method in PySpark.,"Returns a list of InputPartition objects given the start and end offsets. Each InputPartition represents a data split that can be processed by one Spark task. When called with an empty offset range where start == end, this method should return an empty sequence. Added in Databricks Runtime 15.2",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Explains how InputPartition list is derived from start/end offsets and empty-range behavior; concrete streaming partitioning contract.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamreader/read,read,read (DataSourceStreamReader) - Azure Databricks,Implement read() in DataSourceStreamReader,Documentation for the DataSourceStreamReader.read method in PySpark.,Generates data for a given partition and returns an iterator of tuples or rows. This method is invoked once per partition to read the data. Implementing this method is required for stream readers. You can initialize any non-serializable resources required for reading data from the data source within this method. 
Added in Databricks Runtime 15.2,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Describes per-partition streaming read semantics and resource initialization; product-specific streaming integration behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamreader/reportlatestoffset,reportLatestOffset,reportLatestOffset - Azure Databricks,Report latest offsets with reportLatestOffset(),Documentation for the DataSourceStreamReader.reportLatestOffset method in PySpark.,Returns the most recent offset available. The information is used to report the latest offset in the streaming query status. The source can return None if there is no data to process or if the source does not support this method.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Explains how latest offsets are reported for query status and when None is allowed; detailed streaming status integration.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamreader/stop,stop,stop (DataSourceStreamReader) - Azure Databricks,Stop streaming sources with stop() in DataSourceStreamReader,Documentation for the DataSourceStreamReader.stop method in PySpark.,Stops this source and frees any resources it has allocated. Invoked when the streaming query is terminated. Added in Databricks Runtime 15.2,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents lifecycle method to free resources on query termination; concrete streaming integration lifecycle behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamwriter,DataSourceStreamWriter class,DataSourceStreamWriter - Azure Databricks,Implement DataSourceStreamWriter for streaming sinks,A base class for PySpark data stream writers responsible for writing streaming data to a sink microbatch by microbatch.,"A base class for data stream writers. 
Data stream writers are responsible for writing data to a streaming sink. Implement this class and return an instance from DataSource.streamWriter() to make a data source writable as a streaming sink. write() is called on executors for each microbatch, and commit() or abort() is called on the driver after all tasks in the microbatch complete.",2026-04-17T08:00:00.000Z,reference,integrations,0.8,True,"Base class contract for streaming writers, including write/commit/abort semantics per microbatch; detailed Databricks integration pattern.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamwriter/abort,abort,abort (DataSourceStreamWriter) - Azure Databricks,Handle abort() in DataSourceStreamWriter failures,Documentation for the DataSourceStreamWriter.abort method in PySpark.,Aborts this microbatch due to task failures. This method is invoked on the driver when one or more tasks failed. The commit messages are collected from the write() method call from each task and passed to this method. The implementation should use the commit messages to abort the microbatch in the streaming sink.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Explains how commit messages from tasks are used to abort microbatches on driver; product-specific failure-handling behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamwriter/commit,commit,commit (DataSourceStreamWriter) - Azure Databricks,Commit microbatches with DataSourceStreamWriter.commit,Documentation for the DataSourceStreamWriter.commit method in PySpark.,Commits this microbatch with a list of commit messages. This method is invoked on the driver when all tasks run successfully. The commit messages are collected from the write() method call from each task and passed to this method. 
The implementation should use the commit messages to commit the microbatch in the streaming sink.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Describes driver-side commit semantics using commit messages from write() tasks; concrete streaming sink integration behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamwriter/write,write,write (DataSourceStreamWriter) - Azure Databricks,Implement custom streaming sinks with DataSourceStreamWriter.write,Documentation for the DataSourceStreamWriter.write method in PySpark.,"Writes data into the streaming sink. This method is called on executors to write data to the streaming data sink in each microbatch. It accepts an iterator of input data and returns a single row representing a commit message, orNoneif there is no commit message. The driver collects commit messages, if any, from all executors and passes them to thecommit()method if all tasks run successfully. If any task fails, theabort()method will be called with the collected commit messages.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Method-level reference for a Databricks-specific PySpark streaming writer hook, including commit-message behavior between executors and driver; this is product- and API-specific integration behavior not reliably known from pretraining.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcewriter,DataSourceWriter class,DataSourceWriter - Azure Databricks,Implement custom batch data sources with DataSourceWriter,A base class for PySpark data source writers responsible for saving data to a custom data source in batch mode.,A base class for data source writers. Data source writers are responsible for saving data to a data source. Implement this class and return an instance fromDataSource.writer()to make a data source writable. 
Added in Databricks Runtime 14.3 LTS,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Describes a Databricks Runtime–specific base class for custom data source writers and how it is returned from DataSource.writer(); this is concrete API surface and lifecycle behavior for integrations.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcewriter/abort,abort,abort - Azure Databricks,Handle failed batch writes with DataSourceWriter.abort,Documentation for the DataSourceWriter.abort method in PySpark.,Aborts this writing job due to task failures. This method is invoked on the driver when one or more tasks failed. The commit messages are collected from thewrite()method call from each task and passed to this method. The implementation should use the commit messages to abort the writing job to the data source. Added in Databricks Runtime 14.3 LTS,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Documents a specific driver-side abort hook, including how commit messages from write() are passed in on failure; this is detailed, product-specific writer lifecycle behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcewriter/commit,commit,commit - Azure Databricks,Commit successful batch writes with DataSourceWriter.commit,Documentation for the DataSourceWriter.commit method in PySpark.,Commits this writing job with a list of commit messages. This method is invoked on the driver when all tasks run successfully. The commit messages are collected from thewrite()method call from each task and passed to this method. The implementation should use the commit messages to commit the writing job to the data source. 
Added in Databricks Runtime 14.3 LTS,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Explains the commit hook semantics and how commit messages from executors are used on the driver; this is concrete API lifecycle behavior for custom data source integrations.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcewriter/write,write,write - Azure Databricks,Write partition data in custom sources with DataSourceWriter.write,Documentation for the DataSourceWriter.write method in PySpark.,"Writes data into the data source. This method is called once on each executor to write data to the data source. It accepts an iterator of input data and returns a single row representing a commit message, orNoneif there is no commit message. The driver collects commit messages, if any, from all executors and passes them to thecommit()method if all tasks run successfully. If any task fails, theabort()method will be called with the collected commit messages. Added in Databricks Runtime 14.3 LTS",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Details executor-side write behavior and commit-message contract for custom batch data sources; this is specific integration API behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader,DataStreamReader class,DataStreamReader - Azure Databricks,Load streaming DataFrames with DataStreamReader,Learn about the DataStreamReader class in PySpark,"Interface used to load a streaming DataFrame from external storage systems (for example, file systems and key-value stores). 
Usespark.readStreamto access this.",2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,"Product-specific interface for loading streaming DataFrames via spark.readStream, including supported external systems; this is concrete API usage for integrations.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/changes,changes,changes (DataStreamReader) - Azure Databricks,Read table change data capture with DataStreamReader.changes,Documentation for the DataStreamReader.changes method in PySpark.,Returns the row-level changes (Change Data Capture) from the specified table as a streaming DataFrame. Currently only supported for Data Source V2 tables whose catalog implementsTableCatalog.loadChangelog(). Useoption()to specify the starting version or timestamp and processing options.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents a Databricks-specific streaming CDC API, including requirement for TableCatalog.loadChangelog() and use of options for starting version/timestamp; this is detailed integration behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/csv,csv,csv (DataStreamReader) - Azure Databricks,Stream CSV data into DataFrames with DataStreamReader.csv,Documentation for the DataStreamReader.csv method in PySpark.,"Loads a CSV file stream and returns the result as a DataFrame. IfinferSchemais enabled, the function goes through the input once to determine the schema. 
To avoid this pass, disableinferSchemaor specify the schema explicitly usingschema.",2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,API reference for streaming CSV with schema inference behavior and options like inferSchema and schema; these are concrete parameters and semantics for this product’s streaming integration.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/excel,excel,excel (DataStreamReader) - Azure Databricks,Stream Excel files into DataFrames with DataStreamReader.excel,Documentation for the DataStreamReader.excel method in PySpark.,Loads an Excel file stream and returns the result as a DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.55,True,Documents a specific streaming reader for Excel format in Databricks PySpark; this is concrete integration API surface.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/format,format,format (DataStreamReader) - Azure Databricks,Specify streaming input formats with DataStreamReader.format,Documentation for the DataStreamReader.format method in PySpark.,Specifies the input data source format.,2026-04-17T21:49:00.000Z,reference,integrations,0.55,True,API for selecting underlying streaming data source formats; this is configuration of integration endpoints rather than conceptual content.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/json,json,json (DataStreamReader) - Azure Databricks,Stream JSON data with DataStreamReader.json,Documentation for the DataStreamReader.json method in PySpark.,"Loads a JSON file stream and returns the results as a DataFrame. JSON Lines (newline-delimited JSON) is supported by default. For JSON with one record per file, set themultiLineoption totrue. 
Ifschemais not specified, the input schema is inferred from the data.",2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,"Describes streaming JSON behavior including JSON Lines support, multiLine option, and schema inference; these are concrete, product-specific API semantics.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/load,load,load (DataStreamReader) - Azure Databricks,Load generic streaming sources with DataStreamReader.load,Documentation for the DataStreamReader.load method in PySpark.,Loads a data stream from a data source and returns it as a DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.55,True,Core API for loading a data stream from configured sources; this is concrete integration behavior for Databricks streaming.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/name,name,name (DataStreamReader) - Azure Databricks,Name streaming sources for checkpoint evolution with DataStreamReader.name,Documentation for the DataStreamReader.name method in PySpark.,"Assigns a name to the streaming source for checkpoint evolution. This allows streaming queries to evolve by enabling sources to be reordered or added without breaking checkpoint compatibility. 
When source evolution is enabled, all sources in a query must be named.",2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Documents a Databricks-specific feature (source naming for checkpoint evolution) and its requirement that all sources be named when evolution is enabled; this is nuanced, product-specific behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/option,option,option (DataStreamReader) - Azure Databricks,Configure single streaming input option with DataStreamReader.option,Documentation for the DataStreamReader.option method in PySpark.,Adds an input option for the underlying data source.,2026-04-17T21:49:00.000Z,reference,configuration,0.6,True,API for setting named input options on streaming sources; this is a configuration surface with specific parameter semantics for this product.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/options,options,options (DataStreamReader) - Azure Databricks,Configure multiple streaming input options with DataStreamReader.options,Documentation for the DataStreamReader.options method in PySpark.,Adds multiple input options for the underlying data source.,2026-04-17T21:49:00.000Z,reference,configuration,0.6,True,Allows setting multiple named options for streaming sources; this is structured configuration for Databricks streaming integrations.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/orc,orc,orc (DataStreamReader) - Azure Databricks,Stream ORC data into DataFrames with DataStreamReader.orc,Documentation for the DataStreamReader.orc method in PySpark.,Loads an ORC file stream and returns the result as a DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.55,True,API reference for streaming ORC files; this is concrete integration behavior for a specific format.,new 
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/parquet,parquet,parquet (DataStreamReader) - Azure Databricks,Stream Parquet data into DataFrames with DataStreamReader.parquet,Documentation for the DataStreamReader.parquet method in PySpark.,Loads a Parquet file stream and returns the result as a DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.55,True,API reference for streaming Parquet files; this is concrete integration behavior for a specific format.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/schema,schema,schema (DataStreamReader) - Azure Databricks,Define streaming input schemas with DataStreamReader.schema,Documentation for the DataStreamReader.schema method in PySpark.,"Specifies the input schema. Some data sources (for example, JSON) can infer the input schema automatically from data. Specifying the schema here allows the data source to skip schema inference and speed up data loading.",2026-04-17T21:49:00.000Z,reference,configuration,0.6,True,Explains specifying schemas vs inference for streaming sources and performance implications; this is product-specific configuration behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/table,table,table (DataStreamReader) - Azure Databricks,Create streaming DataFrames from tables with DataStreamReader.table,Documentation for the DataStreamReader.table method in PySpark.,Defines a streaming DataFrame on a table. 
The data source corresponding to the table must support streaming mode.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,API for defining a streaming DataFrame on a table with requirement that the table’s data source support streaming; this is concrete integration behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/text,text,text (DataStreamReader) - Azure Databricks,Stream text files into DataFrames with DataStreamReader.text,Documentation for the DataStreamReader.text method in PySpark.,"Loads a text file stream and returns a DataFrame whose schema starts with a string column named value, followed by any partitioned columns. Text files must be encoded as UTF-8. Each line in the text file is a new row in the resulting DataFrame by default.",2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,"Documents UTF-8 requirement, schema (value column plus partitions), and line-to-row mapping; these are specific integration semantics.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/xml,xml,xml (DataStreamReader) - Azure Databricks,Stream XML data into DataFrames with DataStreamReader.xml,Documentation for the DataStreamReader.xml method in PySpark.,"Loads an XML file stream and returns the result as a DataFrame. If schema is not specified, the input schema is inferred from the data.",2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,API for streaming XML with schema inference behavior; this is concrete integration behavior for a specific format.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter,DataStreamWriter class,DataStreamWriter - Azure Databricks,Write streaming DataFrames with DataStreamWriter,Learn about the DataStreamWriter class in PySpark,"Interface used to write a streaming DataFrame to external storage systems (for example, file systems and key-value stores). 
Use df.writeStream to access this.",2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Product-specific interface for writing streaming DataFrames to external systems via df.writeStream; this is concrete integration API surface.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/clusterby,clusterBy,clusterBy (DataStreamWriter) - Azure Databricks,Configure clustered output files with DataStreamWriter.clusterBy,Documentation for the DataStreamWriter.clusterBy method in PySpark.,"Clusters the output by the given columns. Records with similar values on the clustering columns are grouped together in the same file. Clustering improves query efficiency by allowing queries with predicates on the clustering columns to skip unnecessary data. Unlike partitioning, clustering can be used on high-cardinality columns.",2026-04-17T21:49:00.000Z,reference,configuration,0.65,True,Explains clustering semantics vs partitioning for streaming output and its impact on query efficiency; this is product-specific configuration behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/foreach,foreach,foreach (DataStreamWriter) - Azure Databricks,Process streaming rows with custom foreach writers,Documentation for the DataStreamWriter.foreach method in PySpark.,"Sets the output of the streaming query to be processed using the provided writer. 
The processing logic can be specified as a function that takes a row as input, or as an object with process(row) and optional open(partition_id, epoch_id) and close(error) methods.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents DataStreamWriter.foreach semantics, including process(row), open(partition_id, epoch_id), and close(error) hooks; this is detailed integration and coding pattern behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/foreachbatch,foreachBatch,foreachBatch (DataStreamWriter) - Azure Databricks,Process micro-batches with DataStreamWriter.foreachBatch,Documentation for the DataStreamWriter.foreachBatch method in PySpark.,"Sets the output of the streaming query to be processed using the provided function. Supported only in micro-batch execution mode (that is, when the trigger is not continuous). In every micro-batch, the provided function is called with the output rows as a DataFrame and the batch identifier. 
The batch ID can be used to deduplicate and transactionally write the output to external systems.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Explains foreachBatch semantics, micro-batch-only support, and use of batch ID for deduplication and transactional writes; this is nuanced, product-specific integration behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/format,format,format (DataStreamWriter) - Azure Databricks,Select streaming output data source with DataStreamWriter.format,Documentation for the DataStreamWriter.format method in PySpark.,Specifies the underlying output data source.,2026-04-17T21:49:00.000Z,reference,integrations,0.55,True,API for specifying the underlying sink format; this is integration configuration for output.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/option,option,option (DataStreamWriter) - Azure Databricks,Configure single streaming output option with DataStreamWriter.option,Documentation for the DataStreamWriter.option method in PySpark.,Adds an output option for the underlying data source.,2026-04-17T21:49:00.000Z,reference,configuration,0.6,True,API for setting named output options on streaming sinks; this is configuration surface for integrations.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/options,options,options (DataStreamWriter) - Azure Databricks,Configure multiple streaming output options with DataStreamWriter.options,Documentation for the DataStreamWriter.options method in PySpark.,Adds multiple output options for the underlying data source.,2026-04-17T21:49:00.000Z,reference,configuration,0.6,True,Allows setting multiple named options for streaming sinks; this is structured configuration behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/outputmode,outputMode,outputMode 
(DataStreamWriter) - Azure Databricks,Control streaming sink semantics with DataStreamWriter.outputMode,Documentation for the DataStreamWriter.outputMode method in PySpark.,Specifies how data of a streaming DataFrame is written to a streaming sink.,2026-04-17T21:49:00.000Z,reference,configuration,0.65,True,Specifies how streaming data is written to sinks (output modes); this is product-specific configuration of streaming behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/partitionby,partitionBy,partitionBy (DataStreamWriter) - Azure Databricks,Partition streaming output by columns with DataStreamWriter.partitionBy,Documentation for the DataStreamWriter.partitionBy method in PySpark.,Partitions the output by the given columns on the file system. The output is laid out similar to Hive's partitioning scheme.,2026-04-17T21:49:00.000Z,reference,configuration,0.6,True,Describes partitioning layout similar to Hive for streaming outputs; this is concrete configuration behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/queryname,queryName,queryName (DataStreamWriter) - Azure Databricks,Name streaming queries with DataStreamWriter.queryName,Documentation for the DataStreamWriter.queryName method in PySpark.,Specifies the name of the StreamingQuery that can be started withstart(). 
This name must be unique among all currently active queries in the associated SparkSession.,2026-04-17T21:49:00.000Z,reference,configuration,0.6,True,Specifies unique query names for StreamingQuery instances; this is product-specific configuration requirement.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/start,start,start (DataStreamWriter) - Azure Databricks,Start streaming queries with DataStreamWriter.start,Documentation for the DataStreamWriter.start method in PySpark.,Streams the contents of the DataFrame to a data source and returns a StreamingQuery object.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,API for starting streaming writes and obtaining a StreamingQuery; this is concrete integration lifecycle behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/table,table,table (DataStreamWriter) - Azure Databricks,Write streaming results to tables with DataStreamWriter.table,Documentation for the DataStreamWriter.table method in PySpark.,"Alias fortoTable(). Starts the execution of the streaming query, continually outputting results to the given table as new data arrives.",2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Alias for toTable() that starts continuous output to a table; this is specific integration behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/totable,toTable,toTable (DataStreamWriter) - Azure Databricks,Stream DataFrame output into tables with DataStreamWriter.toTable,Documentation for the DataStreamWriter.toTable method in PySpark.,"Starts the execution of the streaming query, continually outputting results to the given table as new data arrives. 
Returns a StreamingQuery object.",2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Documents starting a streaming query that writes to a table and returns a StreamingQuery; this is integration lifecycle behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/trigger,trigger,trigger (DataStreamWriter) - Azure Databricks,Configure streaming triggers with DataStreamWriter.trigger,Documentation for the DataStreamWriter.trigger method in PySpark.,"Sets the trigger for the streaming query. If not set, the query runs as fast as possible, equivalent toprocessingTime='0 seconds'. Only one trigger parameter can be set at a time. For more information, seeConfigure Structured Streaming trigger intervals.",2026-04-17T21:49:00.000Z,reference,configuration,0.7,True,"Explains trigger configuration, default behavior (processingTime='0 seconds'), and single-parameter constraint; this is detailed, product-specific configuration behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/geography,Geography class,Geography - Azure Databricks,Work with geography values using the Geography class,Learn about the Geography class in PySpark,A class to represent a Geography value in Python.,2026-04-17T08:00:00.000Z,reference,integrations,0.55,True,"Defines a Databricks-specific Geography type for Python, used in geospatial integrations and functions; this is concrete API surface.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/geography/fromwkb,fromWKB,fromWKB (Geography) - Azure Databricks,Create Geography objects from WKB with Geography.fromWKB,Documentation for the Geography.fromWKB class method in PySpark.,Constructs aGeographyobject from WKB.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,API for constructing Geography from WKB; this is specific integration behavior with binary geospatial formats.,new 
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/geography/getbytes,getBytes,getBytes (Geography) - Azure Databricks,Export Geography values to WKB with Geography.getBytes,Documentation for the Geography.getBytes method in PySpark.,Returns the WKB of Geography.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Returns WKB representation of Geography; this is concrete integration behavior with external geospatial systems.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/geography/getsrid,getSrid,getSrid (Geography) - Azure Databricks,Retrieve SRID from Geography objects with Geography.getSrid,Documentation for the Geography.getSrid method in PySpark.,Returns the SRID of Geography.,2026-04-17T21:49:00.000Z,reference,integrations,0.55,True,API for obtaining SRID from Geography; this is specific to geospatial integration and coordinate system handling.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/geometry,Geometry class,Geometry - Azure Databricks,Work with geometry values using the Geometry class,Learn about the Geometry class in PySpark,A class to represent a Geometry value in Python.,2026-04-17T08:00:00.000Z,reference,integrations,0.55,True,Defines a Databricks-specific Geometry type for Python; this is concrete API surface for geospatial integrations.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/geometry/fromwkb,fromWKB,fromWKB (Geometry) - Azure Databricks,Use Geometry.fromWKB in Azure Databricks PySpark,Documentation for the Geometry.fromWKB class method in PySpark.,Constructs aGeometryobject from WKB.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"API reference for a specific PySpark Geometry class method, including its signature and behavior, which are product-specific integration details beyond generic LLM knowledge.",new 
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/geometry/getbytes,getBytes,getBytes (Geometry) - Azure Databricks,Return Geometry WKB bytes with getBytes,Documentation for the Geometry.getBytes method in PySpark.,Returns the WKB of Geometry.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents a concrete PySpark Geometry method, exposing exact behavior and usage patterns that are specific to Azure Databricks’ PySpark implementation.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/geometry/getsrid,getSrid,getSrid (Geometry) - Azure Databricks,Retrieve Geometry SRID using getSrid in PySpark,Documentation for the Geometry.getSrid method in PySpark.,Returns the SRID of Geometry.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Method-level reference for Geometry.getSrid with product-specific semantics that go beyond generic spatial concepts.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/groupeddata,GroupedData class,GroupedData - Azure Databricks,Work with GroupedData aggregations in PySpark,Learn about the GroupedData class in PySpark,"A set of methods for aggregations on a DataFrame, created byDataFrame.groupBy. Supports Spark Connect",2026-04-17T08:00:00.000Z,reference,integrations,0.65,True,"Class reference for GroupedData with Databricks-specific behavior and supported methods, which are concrete API details rather than conceptual content.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/groupeddata/agg,agg,agg (GroupedData) - Azure Databricks,Compute custom aggregates with GroupedData.agg,Documentation for the GroupedData.agg method in PySpark.,Computes aggregates and returns the result as aDataFrame. 
The available aggregate functions can be:,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents the exact usage of GroupedData.agg, including supported aggregate functions and calling patterns, which are product-specific API details.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/groupeddata/avg,avg,avg (GroupedData) - Azure Databricks,Calculate group averages with GroupedData.avg,Documentation for the GroupedData.avg method in PySpark.,Computes average values for each numeric column for each group. meanis an alias foravg.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Method reference for GroupedData.avg/mean with defined behavior per numeric column, representing concrete API semantics.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/groupeddata/count,count,count (GroupedData) - Azure Databricks,Count records per group with GroupedData.count,Documentation for the GroupedData.count method in PySpark.,Counts the number of records for each group.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Provides exact behavior of GroupedData.count in Databricks PySpark, which is specific API knowledge.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/groupeddata/max,max,max (GroupedData) - Azure Databricks,Get maximum values per group with GroupedData.max,Documentation for the GroupedData.max method in PySpark.,Computes the max value for each numeric column for each group.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Details the GroupedData.max method behavior for numeric columns, which is concrete API reference material.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/groupeddata/mean,mean,mean (GroupedData) - Azure Databricks,Use GroupedData.mean alias for averages,Documentation for the GroupedData.mean method in PySpark.,Computes average values for each numeric column for each group. 
meanis an alias foravg.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Clarifies that mean is an alias for avg and how it operates on grouped numeric columns, a product-specific API detail.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/groupeddata/min,min,min (GroupedData) - Azure Databricks,Get minimum values per group with GroupedData.min,Documentation for the GroupedData.min method in PySpark.,Computes the min value for each numeric column for each group.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Method-level documentation for GroupedData.min, describing exact aggregation behavior in Databricks PySpark.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/groupeddata/pivot,pivot,pivot (GroupedData) - Azure Databricks,Pivot DataFrame columns with GroupedData.pivot,Documentation for the GroupedData.pivot method in PySpark.,Pivots a column of the currentDataFrameand performs the specified aggregation.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Explains the GroupedData.pivot API and how it performs aggregations, which is specific to the Databricks PySpark implementation.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/groupeddata/sum,sum,sum (GroupedData) - Azure Databricks,Sum numeric columns per group with GroupedData.sum,Documentation for the GroupedData.sum method in PySpark.,Computes the sum for each numeric column for each group.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents the GroupedData.sum method behavior, a concrete API usage pattern.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/inputpartition,InputPartition,InputPartition - Azure Databricks,Implement custom InputPartition for PySpark data sources,Learn about the InputPartition class in PySpark,A base class representing an input partition returned by thepartitions()method ofDataSourceReader. 
Added in Databricks Runtime 14.3 LTS,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Class reference for InputPartition returned by DataSourceReader.partitions(), including runtime version notes, which are product-specific integration details.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/observation,Observation class,Observation - Azure Databricks,Capture DataFrame metrics with Observation in PySpark,Learn about the Observation class in PySpark,A class to observe named metrics on a DataFrame. Metrics are aggregation expressions applied to the DataFrame while it is being processed by an action. An Observation instance collects the metrics while the first action is executed. Subsequent actions do not modify the metrics returned by Observation.get. Retrieval of the metric via Observation.get blocks until the first action has finished and metrics become available.,2026-04-17T08:00:00.000Z,reference,integrations,0.7,True,"Describes Observation class semantics (first action only, blocking get), which are detailed, product-specific API behaviors.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/observation/get,get,get (Observation) - Azure Databricks,Retrieve observed metrics using Observation.get,Documentation for the Observation.get property in PySpark.,Gets the observed metrics. Waits until the observed dataset finishes its first action. Only the result of the first action is available. 
Subsequent actions do not modify the result.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents blocking behavior and single-action semantics of Observation.get, which are non-obvious, product-specific details.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor,PySparkPlotAccessor class,PySparkPlotAccessor class - Azure Databricks,Generate plots from PySpark DataFrames with PySparkPlotAccessor,"Learn about the PySparkPlotAccessor class in PySpark for generating line, bar, scatter, area, pie, box, KDE, and histogram plots.",Accessor for DataFrame plotting functionality in PySpark.,2026-04-17T08:00:00.000Z,reference,integrations,0.75,True,"Class-level API reference for DataFrame plotting in Databricks PySpark, including supported plot types, which is specific integration behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor/area,area,area - Azure Databricks,Draw stacked area plots with PySparkPlotAccessor.area,Documentation for the PySparkPlotAccessor.area method in PySpark.,Draws a stacked area plot. An area plot displays quantitative data visually.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Method reference for area plotting on DataFrames, including semantics of stacked area plots in this API.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor/bar,bar,bar - Azure Databricks,Create vertical bar charts with PySparkPlotAccessor.bar,Documentation for the PySparkPlotAccessor.bar method in PySpark.,"Creates a vertical bar plot. A bar plot presents categorical data with rectangular bars with lengths proportional to the values they represent. It shows comparisons among discrete categories. 
One axis shows the specific categories being compared, and the other axis represents a measured value.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents the bar plotting method on DataFrames, a concrete API usage pattern.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor/barh,barh,barh - Azure Databricks,Create horizontal bar charts with PySparkPlotAccessor.barh,Documentation for the PySparkPlotAccessor.barh method in PySpark.,"Creates a horizontal bar plot. A horizontal bar plot presents quantitative data with rectangular bars with lengths proportional to the values they represent. It shows comparisons among discrete categories. One axis shows the specific categories being compared, and the other axis represents a measured value.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Method-level documentation for horizontal bar plots, including semantics of axes and categories.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor/box,box,box - Azure Databricks,Build box-and-whisker plots with PySparkPlotAccessor.box,Documentation for the PySparkPlotAccessor.box method in PySpark.,"Creates a box-and-whisker plot from DataFrame columns. A box plot is a method for graphically depicting groups of numerical data through their quartiles. The box extends from the Q1 to Q3 quartile values of the data, with a line at the median (Q2). The whiskers extend from the edges of the box to show the range of the data. By default, they extend no more than 1.5 × IQR (IQR = Q3 - Q1) from the edges of the box, ending at the farthest data point within that interval. 
Outliers are plotted as separa",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Explains detailed behavior of box plots (quartiles, whiskers at 1.5×IQR, outliers) as implemented in this plotting API.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor/hist,hist,hist - Azure Databricks,Plot DataFrame histograms with PySparkPlotAccessor.hist,Documentation for the PySparkPlotAccessor.hist method in PySpark.,Draws a histogram of the DataFrame's columns. A histogram is a representation of the distribution of data.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"API reference for histogram plotting from DataFrame columns, which is specific to Databricks' PySpark plotting integration.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor/kde,kde,kde - Azure Databricks,Generate KDE plots with PySparkPlotAccessor.kde,Documentation for the PySparkPlotAccessor.kde method in PySpark.,"Generates a Kernel Density Estimate (KDE) plot using Gaussian kernels. In statistics, kernel density estimation is a non-parametric way to estimate the probability density function (PDF) of a random variable. 
This function uses Gaussian kernels and includes automatic bandwidth determination.",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Describes kernel density estimation plotting with Gaussian kernels and automatic bandwidth in this specific API.,new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor/line,line,line - Azure Databricks,Plot DataFrame lines with PySparkPlotAccessor.line,Documentation for the PySparkPlotAccessor.line method in PySpark.,Plots a DataFrame as lines.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Method reference for line plotting of DataFrames, a concrete integration behavior.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor/pie,pie,pie - Azure Databricks,Create pie charts from DataFrames with PySparkPlotAccessor.pie,Documentation for the PySparkPlotAccessor.pie method in PySpark.,Generates a pie plot. A pie plot is a proportional representation of the numerical data in a column.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents pie plot behavior for numerical columns in this plotting API.,new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor/scatter,scatter,scatter - Azure Databricks,Visualize correlations with PySparkPlotAccessor.scatter,Documentation for the PySparkPlotAccessor.scatter method in PySpark.,"Creates a scatter plot with varying marker point size and color. The coordinates of each point are defined by two DataFrame columns, and filled circles are used to represent each point. 
This kind of plot is useful for seeing complex correlations between two variables, such as natural 2D coordinates like longitude and latitude, or any pair of metrics that can be plotted against each other.",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Explains scatter plot behavior (marker size, color, coordinate mapping) for DataFrame columns in Databricks PySpark.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/row,Row class,Row class - Azure Databricks,Use the Row class for PySpark DataFrames,Documentation for the Row class in PySpark.,A row in DataFrame. The fields in it can be accessed: key in row will search through row keys. Row can be used to create a row object by using named arguments. It is not allowed to omit a named argument to represent that the value is None or missing. This should be explicitly set to None in this case. Changed in Databricks Runtime 7.4: Rows created from named arguments no longer have field names sorted alphabetically and will be ordered in the position as entered.,2026-04-17T08:00:00.000Z,reference,integrations,0.8,True,"Details Row construction, field access rules, and version-specific behavior changes (e.g., field order), which are concrete, product-specific API semantics.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/row/asdict,asDict,asDict - Azure Databricks,Convert Row objects to dictionaries with asDict,Documentation for the Row.asDict method in PySpark.,"Returns the Row as Dict[str, Any].",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Method reference for Row.asDict, including return type and behavior, which is specific to the PySpark Row API.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/runtimeconfig,RuntimeConfig class,RuntimeConfig - Azure Databricks,Manage Spark runtime settings with RuntimeConfig,"User-facing configuration API for Spark runtime properties, accessible through 
spark.conf. Options set here are propagated to Hadoop I/O.","User-facing configuration API, accessible through SparkSession.conf. Supports Spark Connect. Options set here are automatically propagated to the Hadoop configuration during I/O.",2026-04-17T08:00:00.000Z,reference,configuration,0.8,True,"Describes the user-facing configuration API (spark.conf / SparkSession.conf) and propagation to Hadoop I/O, which is core configuration behavior for Databricks PySpark.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/runtimeconfig/get,get,get (RuntimeConfig) - Azure Databricks,Read Spark configuration values with RuntimeConfig.get,Documentation for the RuntimeConfig.get method in PySpark.,"Returns the value of Spark runtime configuration property for the given key, assuming it is set.",2026-04-17T21:49:00.000Z,reference,configuration,0.75,True,"Method-level documentation for retrieving specific runtime configuration properties by key, a concrete configuration API.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/runtimeconfig/getall,getAll,getAll - Azure Databricks,List all Spark configuration properties with RuntimeConfig.getAll,Documentation for the RuntimeConfig.getAll property in PySpark.,Returns all properties set in this conf.,2026-04-17T21:49:00.000Z,reference,configuration,0.75,True,"Documents how to enumerate all properties set in the configuration, which is specific configuration API behavior.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/runtimeconfig/ismodifiable,isModifiable,isModifiable - Azure Databricks,Check if Spark config keys are modifiable with isModifiable,Documentation for the RuntimeConfig.isModifiable method in PySpark.,Indicates whether the configuration property with the given key is modifiable in the current session.,2026-04-17T21:49:00.000Z,reference,configuration,0.75,True,"Provides semantics for determining whether a configuration key can 
be changed in the current session, a product-specific configuration rule.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/runtimeconfig/set,set,set - Azure Databricks,Set Spark runtime configuration with RuntimeConfig.set,Documentation for the RuntimeConfig.set method in PySpark.,Sets the given Spark runtime configuration property.,2026-04-17T21:49:00.000Z,reference,configuration,0.8,True,"Documents how to set configuration properties via the RuntimeConfig API, including key/value semantics.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/runtimeconfig/unset,unset,unset - Azure Databricks,Unset Spark configuration keys with RuntimeConfig.unset,Documentation for the RuntimeConfig.unset method in PySpark.,Resets the configuration property for the given key.,2026-04-17T21:49:00.000Z,reference,configuration,0.8,True,"Explains how to reset configuration properties by key, which is specific configuration API behavior.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/simpledatasourcestreamreader,SimpleDataSourceStreamReader class,SimpleDataSourceStreamReader - Azure Databricks,Implement SimpleDataSourceStreamReader for lightweight streaming,A base class for simplified streaming data source readers in PySpark that reads data and plans the latest offset simultaneously.,"A base class for simplified streaming data source readers. Compared to DataSourceStreamReader, SimpleDataSourceStreamReader doesn't require planning data partitions. The read() method allows reading data and planning the latest offset at the same time. Because SimpleDataSourceStreamReader reads records in the Spark driver to determine the end offset of each batch without partitioning, it is only suited for lightweight use cases where input rate and batch size are small. 
Use DataSourceStreamReader when re",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Class reference describing how SimpleDataSourceStreamReader differs from DataSourceStreamReader, including driver-side reading, offset planning, and suitability constraints, which are detailed, product-specific streaming integration semantics.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/simpledatasourcestreamreader/commit,commit,commit (SimpleDataSourceStreamReader) - Azure Databricks,Commit processed offsets with SimpleDataSourceStreamReader.commit,Documentation for the SimpleDataSourceStreamReader.commit method in PySpark.,Informs the source that Spark has completed processing all data for offsets less than or equal to end and will only request offsets greater than end in the future. Added in Databricks Runtime 15.3,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents commit semantics for streaming offsets (end offset guarantees, future requests) and runtime version, which are specific to Databricks streaming APIs.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/simpledatasourcestreamreader/initialoffset,initialOffset,initialOffset (SimpleDataSourceStreamReader) - Azure Databricks,Determine initial streaming offsets with SimpleDataSourceStreamReader.initialOffset,Documentation for the SimpleDataSourceStreamReader.initialOffset method in PySpark.,"Returns the initial offset of the streaming data source. A new streaming query starts reading data from the initial offset. If Spark is restarting an existing query, it will restart from the checkpointed offset rather than the initial one. 
Added in Databricks Runtime 15.3",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Explains initial vs checkpointed offset behavior for new vs restarted queries, which is nuanced, product-specific streaming behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/simpledatasourcestreamreader/read,read,read (SimpleDataSourceStreamReader) - Azure Databricks,Read streaming data and next offset with SimpleDataSourceStreamReader.read,Documentation for the SimpleDataSourceStreamReader.read method in PySpark.,Reads all available data from the start offset and returns the offset that the next read attempt starts from. Added in Databricks Runtime 15.3,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents how read returns both data and the next start offset, a concrete streaming integration pattern.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/simpledatasourcestreamreader/readbetweenoffsets,readBetweenOffsets,readBetweenOffsets (SimpleDataSourceStreamReader) - Azure Databricks,Re-read batches deterministically with SimpleDataSourceStreamReader.readBetweenOffsets,Documentation for the SimpleDataSourceStreamReader.readBetweenOffsets method in PySpark.,Reads all available data between a specific start offset and end offset. This method is invoked during failure recovery to re-read a batch deterministically. Added in Databricks Runtime 15.3,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Describes deterministic re-reading between specific offsets for failure recovery, which is detailed streaming API behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession,SparkSession class,SparkSession - Azure Databricks,Use SparkSession as the entry point for Databricks PySpark,Learn about the SparkSession class in PySpark,"The entry point to programming Spark with the Dataset and DataFrame API. 
A SparkSession can be used to create DataFrames, register DataFrames as tables, execute SQL over tables, cache tables, and read parquet files.",2026-04-17T08:00:00.000Z,reference,integrations,0.75,True,"Class-level reference for SparkSession in Databricks PySpark, including capabilities like creating DataFrames, registering tables, and executing SQL, which are concrete API behaviors.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/active,active,active - Azure Databricks,Access the active SparkSession with SparkSession.active,Documentation for the SparkSession.active class method in PySpark.,"Returns the active or default SparkSession for the current thread, returned by the builder.",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Documents how to retrieve the active or default SparkSession for the current thread, a specific API behavior important for correct integration.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/addartifacts,addArtifacts,addArtifacts - Azure Databricks,Use SparkSession.addArtifacts in Azure Databricks,Documentation for the SparkSession.addArtifacts method in PySpark.,Adds artifact(s) to the client session. Currently only local files are supported. addArtifact is an alias for addArtifacts.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"API reference for a Databricks-specific PySpark method, including exact method name, behavior, and alias; this is product-specific SDK detail not reliably known from training.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/addtag,addTag,addTag - Azure Databricks,Tag operations with SparkSession.addTag in Databricks,Documentation for the SparkSession.addTag method in PySpark.,"Add a tag to be assigned to all the operations started by this thread in this session. Often, a unit of execution in an application consists of multiple Spark executions. 
Application programmers can use this method to group all those jobs together and give a group tag. The application can use SparkSession.interruptTag to cancel all running executions with this tag. There may be multiple tags present at the same time, so different parts of an application may use different tags to perform cancellati",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents a specific SparkSession method and its semantics for tagging and cancellation, which are product/SDK-specific coding patterns.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/catalog,catalog,catalog - Azure Databricks,Access Spark catalog via SparkSession.catalog,Documentation for the SparkSession.catalog property in PySpark.,"Interface through which the user may create, drop, alter, or query underlying databases, tables, functions, and more.",2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Describes a concrete SparkSession property and its role for database/table/function management; this is specific API surface knowledge.,new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/clearprogresshandlers,clearProgressHandlers,clearProgressHandlers - Azure Databricks,Clear Spark progress handlers with clearProgressHandlers,Documentation for the SparkSession.clearProgressHandlers method in PySpark.,Clears all registered progress handlers.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Reference for a specific SparkSession method used to manage progress handlers; product-specific API behavior.,new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/cleartags,clearTags,clearTags - Azure Databricks,Manage Spark operation tags with clearTags,Documentation for the SparkSession.clearTags method in PySpark.,Clear the current thread's operation tags.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents a concrete SparkSession 
method that clears thread operation tags; SDK-level behavior detail.,new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/conf,conf,conf - Azure Databricks,Use SparkSession.conf for runtime configuration,Documentation for the SparkSession.conf property in PySpark.,"Runtime configuration interface for Spark. This is the interface through which the user can get and set all Spark and Hadoop configurations relevant to Spark SQL. When getting the value of a config, this defaults to the value set in the underlying SparkContext, if any.",2026-04-17T21:49:00.000Z,reference,configuration,0.7,True,Explains a configuration interface property used to get/set Spark and Hadoop configs; this is a specific configuration access pattern.,new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/createdataframe,createDataFrame,createDataFrame - Azure Databricks,Create DataFrames with SparkSession.createDataFrame,Documentation for the SparkSession.createDataFrame method in PySpark.,"Creates a DataFrame from an RDD, a list, a pandas.DataFrame, a numpy.ndarray, or a pyarrow.Table.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"API reference for a key SparkSession method including supported input types (RDD, pandas, numpy, pyarrow); detailed SDK usage.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/datasource,dataSource,dataSource - Azure Databricks,Register data sources via SparkSession.dataSource,Documentation for the SparkSession.dataSource property in PySpark.,Returns a DataSourceRegistration for data source registration.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Describes a specific property returning DataSourceRegistration, a product-specific integration mechanism.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/getactivesession,getActiveSession,getActiveSession - Azure Databricks,Retrieve 
current Spark session with getActiveSession,Documentation for the SparkSession.getActiveSession class method in PySpark.,"Returns the active SparkSession for the current thread, returned by the builder.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents a class method and its behavior for obtaining the active SparkSession; SDK-specific pattern.,new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/gettags,getTags,getTags - Azure Databricks,Inspect current Spark operation tags with getTags,Documentation for the SparkSession.getTags method in PySpark.,Get the tags that are currently set to be assigned to all the operations started by this thread.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,API reference for a SparkSession method returning thread operation tags; product-specific tagging semantics.,new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/interruptall,interruptAll,interruptAll - Azure Databricks,Interrupt all operations in a Spark session,Documentation for the SparkSession.interruptAll method in PySpark.,Interrupt all operations of this session currently running on the connected server.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents SparkSession.interruptAll and its effect on server-side operations; concrete control API.,new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/interruptoperation,interruptOperation,interruptOperation - Azure Databricks,Cancel specific Spark operations with interruptOperation,Documentation for the SparkSession.interruptOperation method in PySpark.,Interrupt an operation of this session with the given operation ID.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Reference for a method that interrupts by operation ID; detailed control behavior unique to this API.,new 
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/interrupttag,interruptTag,interruptTag - Azure Databricks,Cancel tagged Spark operations with interruptTag,Documentation for the SparkSession.interruptTag method in PySpark.,Interrupt all operations of this session with the given operation tag.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Describes a SparkSession method that cancels operations by tag; product-specific tagging and cancellation pattern.,new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/newsession,newSession,newSession - Azure Databricks,Create isolated Spark sessions with newSession,Documentation for the SparkSession.newSession method in PySpark.,"Returns a new SparkSession as a new session that has separate SQL configuration, registered temporary views, and UDFs, but shares the SparkContext and table cache.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Explains behavior of SparkSession.newSession including what is shared vs isolated (SQL config, views, UDFs); nuanced API semantics.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/profile,profile,profile - Azure Databricks,Profile Spark performance via SparkSession.profile,Documentation for the SparkSession.profile property in PySpark.,Returns a Profile for performance and memory profiling.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Documents a property returning a Profile object for performance/memory profiling; specific to this environment.,new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/range,range,range (SparkSession) - Azure Databricks,Generate numeric ranges with SparkSession.range,Documentation for the SparkSession.range method in PySpark.,"Creates a DataFrame with a single LongType column named id, containing elements in a range from start to end (exclusive) with step 
value step.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,API reference for creating a DataFrame with LongType id column and start/end/step semantics; concrete method behavior.,new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/read,read,read (SparkSession) - Azure Databricks,Read data as DataFrames with SparkSession.read,Documentation for the SparkSession.read property in PySpark.,Returns a DataFrameReader that can be used to read data as a DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents the DataFrameReader entry point; specific property and usage pattern for data ingestion.,new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/readstream,readStream,readStream - Azure Databricks,Read streaming data with SparkSession.readStream,Documentation for the SparkSession.readStream property in PySpark.,Returns a DataStreamReader that can be used to read data streams as a streaming DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Reference for DataStreamReader access via SparkSession; streaming-specific integration API.,new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/registerprogresshandler,registerProgressHandler,registerProgressHandler - Azure Databricks,Register Spark progress handlers with registerProgressHandler,Documentation for the SparkSession.registerProgressHandler method in PySpark.,Registers a progress handler to be called when a progress update is received from the server.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Describes a method to hook into server progress updates; product-specific callback integration.,new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/removeprogresshandler,removeProgressHandler,removeProgressHandler - Azure Databricks,Remove Spark progress handlers with removeProgressHandler,Documentation for the 
SparkSession.removeProgressHandler method in PySpark.,Removes a progress handler that was previously registered.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,API reference for unregistering progress handlers; complements registration behavior.,new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/removetag,removeTag,removeTag - Azure Databricks,Remove Spark operation tags with removeTag,Documentation for the SparkSession.removeTag method in PySpark.,Remove a tag previously added to be assigned to all the operations started by this thread in this session. No-op if the tag was not previously added.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents tag removal semantics including no-op behavior if tag not present; detailed API behavior.,new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/sparkcontext,sparkContext,sparkContext - Azure Databricks,Access underlying SparkContext via SparkSession.sparkContext,Documentation for the SparkSession.sparkContext property in PySpark.,Returns the underlying SparkContext.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Describes a property exposing the underlying SparkContext; specific to SparkSession API surface.,new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/sql,sql,sql - Azure Databricks,Run SQL queries with SparkSession.sql in Databricks,Documentation for the SparkSession.sql method in PySpark.,"Returns a DataFrame representing the result of the given query. When kwargs is specified, this method formats the given string using the Python standard formatter. The method binds named parameters to SQL literals or positional parameters from args. 
Named and positional parameters cannot be mixed in the same SQL query.",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Details method behavior including named vs positional parameter binding and restriction on mixing them; nuanced, product-specific SQL execution semantics.",new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/stop,stop,stop (SparkSession) - Azure Databricks,Stop Spark applications with SparkSession.stop,Documentation for the SparkSession.stop method in PySpark.,Stops the underlying SparkContext.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Documents how stopping the SparkSession stops the underlying SparkContext; concrete lifecycle API behavior.,new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/streams,streams,streams - Azure Databricks,Manage streaming queries via SparkSession.streams,Documentation for the SparkSession.streams property in PySpark.,Returns a StreamingQueryManager that allows managing all active StreamingQuery instances on this context.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Describes access to StreamingQueryManager for active StreamingQuery instances; specific streaming management API.,new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/table,table,table (SparkSession) - Azure Databricks,Load tables as DataFrames with SparkSession.table,Documentation for the SparkSession.table method in PySpark.,Returns the specified table as a DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,API reference for retrieving a table as a DataFrame; concrete method behavior.,new
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/tvf,tvf,tvf - Azure Databricks,Call table-valued functions via SparkSession.tvf,Documentation for the SparkSession.tvf property in PySpark.,Returns a TableValuedFunction that can be used to call a table-valued 
function (TVF).,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents a property returning TableValuedFunction for TVF calls; product-specific function invocation pattern.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/udf,udf,udf (SparkSession) - Azure Databricks,Register UDFs with SparkSession.udf in Databricks,Documentation for the SparkSession.udf property in PySpark.,Returns aUDFRegistrationfor UDF registration.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Describes UDFRegistration access and usage; concrete API for user-defined functions.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/udtf,udtf,udtf (SparkSession) - Azure Databricks,Register UDTFs with SparkSession.udtf in Databricks,Documentation for the SparkSession.udtf property in PySpark.,Returns aUDTFRegistrationfor UDTF registration.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents UDTFRegistration access; specific coding pattern for table functions.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/version-session,version,version (SparkSession) - Azure Databricks,Check Spark runtime version via SparkSession.version,Documentation for the SparkSession.version property in PySpark.,The version of Spark on which this application is running.,2026-04-17T21:49:00.000Z,reference,configuration,0.6,True,Provides a property exposing the exact Spark version the app runs on; configuration/environment detail.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery,StreamingQuery class,StreamingQuery - Azure Databricks,Control streaming queries with StreamingQuery in PySpark,Learn about the StreamingQuery class in PySpark,A handle to a query that is executing continuously in the background as new data arrives. 
All methods are thread-safe.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Class-level reference describing a handle to continuous queries and thread-safety; specific streaming control API.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/awaittermination,awaitTermination,awaitTermination (StreamingQuery) - Azure Databricks,Wait for streaming query termination with awaitTermination,Documentation for the StreamingQuery.awaitTermination method in PySpark.,"Waits for the termination of this query, either bystop()or by an exception. If the query has terminated with an exception, the exception will be thrown. Iftimeoutis set, returns whether the query has terminated within the timeout seconds. If the query has already terminated, subsequent calls either return immediately (if stopped normally) or throw the exception immediately (if terminated with an exception).",2026-04-17T21:49:00.000Z,reference,limits-quotas,0.6,True,"Method reference including timeout parameter semantics and behavior on exceptions; involves time-based behavior and return conditions, bordering on timeout limits.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/exception,exception,exception (StreamingQuery) - Azure Databricks,Inspect streaming failures with StreamingQuery.exception,Documentation for the StreamingQuery.exception method in PySpark.,"Returns theStreamingQueryExceptionif the query was terminated by an exception, orNoneif the query terminated normally.",2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Documents how to retrieve StreamingQueryException or None; specific error inspection API.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/explain,explain,explain (StreamingQuery) - Azure Databricks,Explain streaming query plans with StreamingQuery.explain,Documentation for the StreamingQuery.explain method in PySpark.,Prints the 
(logical and physical) plans to the console for debugging.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Reference for printing logical and physical plans for debugging; concrete method behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/id,id,id (StreamingQuery) - Azure Databricks,Use StreamingQuery.id for unique query identification,Documentation for the StreamingQuery.id property in PySpark.,"Returns the unique ID of this query that persists across restarts from checkpoint data. The ID is generated when a query is started for the first time, and is the same every time it is restarted from checkpoint data. There can only be one query with the same ID active in a Spark cluster.",2026-04-17T21:49:00.000Z,reference,limits-quotas,0.65,True,Describes uniqueness constraints (only one query with same ID active in a cluster) and persistence across restarts; this is a behavioral limit/constraint.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/isactive,isActive,isActive (StreamingQuery) - Azure Databricks,Check streaming query activity with isActive,Documentation for the StreamingQuery.isActive property in PySpark.,Returns whether this streaming query is currently active.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Property reference returning whether a query is active; specific status-check API.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/lastprogress,lastProgress,lastProgress (StreamingQuery) - Azure Databricks,Get last progress update via lastProgress,Documentation for the StreamingQuery.lastProgress property in PySpark.,"Returns the most recentStreamingQueryProgressupdate of this streaming query, orNoneif there have been no progress updates.",2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents returning most recent StreamingQueryProgress or None; product-specific progress 
reporting API.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/name,name,name (StreamingQuery) - Azure Databricks,Name streaming queries with StreamingQuery.name,Documentation for the StreamingQuery.name property in PySpark.,"Returns the user-specified name of the query, orNoneif not specified. Set viadf.writeStream.queryName(""query"").start(). If set, this name must be unique across all active queries.",2026-04-17T21:49:00.000Z,reference,limits-quotas,0.6,True,Specifies uniqueness requirement for query names across active queries and how to set them; this is a naming constraint/limit.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/processallavailable,processAllAvailable,processAllAvailable (StreamingQuery) - Azure Databricks,Process all available streaming data with processAllAvailable,Documentation for the StreamingQuery.processAllAvailable method in PySpark.,Blocks until all available data in the source has been processed and committed to the sink. Intended for testing.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Method reference describing blocking behavior until all available data is processed; specific testing/integration pattern.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/recentprogress,recentProgress,recentProgress (StreamingQuery) - Azure Databricks,Access recent streaming progress via recentProgress,Documentation for the StreamingQuery.recentProgress property in PySpark.,Returns an array of the most recentStreamingQueryProgressupdates for this query. 
The number of progress updates retained is configured byspark.sql.streaming.numRecentProgressUpdates.,2026-04-17T21:49:00.000Z,reference,configuration,0.7,True,Describes that the number of retained progress updates is configured by spark.sql.streaming.numRecentProgressUpdates; exposes a specific configuration setting affecting behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/runid,runId,runId (StreamingQuery) - Azure Databricks,Use StreamingQuery.runId in Azure Databricks PySpark,Documentation for the StreamingQuery.runId property in PySpark.,Returns the unique ID of this query that does not persist across restarts. Every query that is started (or restarted from checkpoint) will have a differentrunId.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"API reference for a specific PySpark Databricks property with product-specific behavior (non-persistent unique ID across restarts), which is detailed SDK behavior rather than generic concept.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/status,status,status (StreamingQuery) - Azure Databricks,Access StreamingQuery.status in Databricks PySpark,Documentation for the StreamingQuery.status property in PySpark.,Returns the current status of the query.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents a concrete PySpark property on Databricks (current query status) as part of the SDK surface; this is product-specific API behavior.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/stop,stop,stop (StreamingQuery) - Azure Databricks,Stop a StreamingQuery in Databricks PySpark,Documentation for the StreamingQuery.stop method in PySpark.,Stops this streaming query.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Method-level reference for stopping a streaming query in Databricks PySpark, exposing concrete SDK behavior not covered by 
generic theory.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerylistener,StreamingQueryListener class,StreamingQueryListener - Azure Databricks,Implement StreamingQueryListener for Databricks PySpark,Documentation for the StreamingQueryListener abstract class in PySpark for listening to streaming query lifecycle events.,An abstract class for listening to events related toStreamingQuery. Subclass this class and implement its abstract methods to receive lifecycle event callbacks for streaming queries.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Describes a Databricks-specific abstract listener class and its lifecycle callbacks, which are concrete API integration points.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerylistener/onqueryidle,onQueryIdle,onQueryIdle (StreamingQueryListener) - Azure Databricks,Handle onQueryIdle events in StreamingQueryListener,Documentation for the StreamingQueryListener.onQueryIdle method in PySpark.,Called when the query is idle and waiting for new data to process.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents a specific callback method and when it fires (query idle waiting for data), which is detailed SDK behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerylistener/onqueryprogress,onQueryProgress,onQueryProgress (StreamingQueryListener) - Azure Databricks,Use onQueryProgress for streaming status updates,Documentation for the StreamingQueryListener.onQueryProgress method in PySpark.,"Called when there is some status update (ingestion rate updated, etc.)",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Method-level reference for a Databricks PySpark listener callback with specific semantics around status updates.,new 
-https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerylistener/onquerystarted,onQueryStarted,onQueryStarted (StreamingQueryListener) - Azure Databricks,React to onQueryStarted events in Databricks,Documentation for the StreamingQueryListener.onQueryStarted method in PySpark.,Called when a query is started.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents a concrete lifecycle callback method for streaming queries, part of the Databricks PySpark API surface.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerylistener/onqueryterminated,onQueryTerminated,onQueryTerminated (StreamingQueryListener) - Azure Databricks,Handle onQueryTerminated events in StreamingQueryListener,Documentation for the StreamingQueryListener.onQueryTerminated method in PySpark.,"Called when a query is stopped, with or without error.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Explains a specific termination callback (with or without error) in the Databricks PySpark listener API.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerymanager,StreamingQueryManager class,StreamingQueryManager - Azure Databricks,Manage streaming queries with StreamingQueryManager,Learn about the StreamingQueryManager class in PySpark,Manages all activeStreamingQueryinstances associated with aSparkSession. 
Usespark.streamsto access this.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Class-level reference for managing active streaming queries via Databricks PySpark APIs, including how to access it from SparkSession.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerymanager/active,active,active (StreamingQueryManager) - Azure Databricks,List active streaming queries in Databricks,Documentation for the StreamingQueryManager.active property in PySpark.,Returns a list of all active streaming queries associated with thisSparkSession.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Documents a specific property returning active queries for a SparkSession, a concrete SDK behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerymanager/addlistener,addListener,addListener (StreamingQueryManager) - Azure Databricks,Register StreamingQueryListener with StreamingQueryManager,Documentation for the StreamingQueryManager.addListener method in PySpark.,Registers aStreamingQueryListenerto receive lifecycle event callbacks for streaming queries.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Method-level API reference for registering listeners for streaming lifecycle events in Databricks PySpark.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerymanager/awaitanytermination,awaitAnyTermination,awaitAnyTermination (StreamingQueryManager) - Azure Databricks,Use awaitAnyTermination on StreamingQueryManager,Documentation for the StreamingQueryManager.awaitAnyTermination method in PySpark.,"Waits until any of the queries on the associatedSparkSessionhas terminated since the creation of the context, or sinceresetTerminated()was called. If any query terminated with an exception, the exception will be thrown. Iftimeoutis set, returns whether any query has terminated within the timeout seconds. 
If a query has already terminated, subsequent calls either return immediately (if stopped normally) or throw the exception immediately (if terminated with an exception). UseresetTerminated()to c",2026-04-17T21:49:00.000Z,reference,integrations,0.72,True,"Describes detailed behavior of waiting for any query termination, including timeout semantics and exception propagation, which is product-specific API behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerymanager/get,get,get (StreamingQueryManager) - Azure Databricks,Get streaming query by ID in Databricks,Documentation for the StreamingQueryManager.get method in PySpark.,Returns an active streaming query from thisSparkSessionby its unique ID.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"API reference for retrieving an active streaming query by unique ID from a SparkSession, a concrete integration pattern.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerymanager/removelistener,removeListener,removeListener (StreamingQueryManager) - Azure Databricks,Remove StreamingQueryListener from StreamingQueryManager,Documentation for the StreamingQueryManager.removeListener method in PySpark.,Deregisters aStreamingQueryListener.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents deregistration behavior for streaming listeners in the Databricks PySpark API.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerymanager/resetterminated,resetTerminated,resetTerminated (StreamingQueryManager) - Azure Databricks,Reset terminated streaming queries in Databricks,Documentation for the StreamingQueryManager.resetTerminated method in PySpark.,Forgets past terminated queries so thatawaitAnyTermination()can be used again to wait for new terminations.,2026-04-17T21:49:00.000Z,reference,integrations,0.68,True,"Explains specific behavior of resetTerminated in relation to 
awaitAnyTermination, which is detailed SDK semantics.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udf,UserDefinedFunction class,UserDefinedFunction - Azure Databricks,Work with UserDefinedFunction in Databricks PySpark,Learn about the UserDefinedFunction class in PySpark,A user defined function in Python. The constructor of this class is not supposed to be directly called. Usepyspark.sql.functions.udforpyspark.sql.functions.pandas_udfto create an instance.,2026-04-17T08:00:00.000Z,reference,integrations,0.78,True,"Class-level reference for Databricks’ PySpark UDF wrapper, including how instances are created via specific factory functions.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udf/asnondeterministic,asNondeterministic,asNondeterministic (UserDefined Function) - Azure Databricks,Mark PySpark UserDefinedFunction as nondeterministic,Documentation for the UserDefinedFunction.asNondeterministic method in PySpark.,Updates theUserDefinedFunctionto nondeterministic.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents a specific method that changes optimizer semantics for a UDF, which is product-specific API behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udfregistration,UDFRegistration class,UDFRegistration - Azure Databricks,Register UDFs with UDFRegistration in Databricks,Learn about the UDFRegistration class in PySpark,Wrapper for user-defined function registration. 
This instance can be accessed byspark.udf.,2026-04-17T08:00:00.000Z,reference,integrations,0.78,True,"Documents a Databricks-specific wrapper for UDF registration accessed via spark.udf, including its role in SQL function registration.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udfregistration/register,register,register (UDFRegistration) - Azure Databricks,Register Python functions as SQL UDFs in Databricks,Documentation for the UDFRegistration.register method in PySpark.,Registers a Python function (including lambda functions) or a user-defined function as a SQL function.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"API reference for registering Python functions and UDFs as SQL functions, including supported function types and behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udfregistration/registerjavafunction,registerJavaFunction,registerJavaFunction (UDFRegistration) - Azure Databricks,Register Java UDFs as SQL functions in Databricks,Documentation for the UDFRegistration.registerJavaFunction method in PySpark.,"Registers a Java user-defined function as a SQL function. In addition to a name and the function itself, the return type can be optionally specified. 
When the return type is not specified, it is inferred via reflection.",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents a method for registering Java UDFs, including optional return type inference via reflection, which is product-specific integration detail.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udfregistration/registerjavaudaf,registerJavaUDAF,registerJavaUDAF (UDFRegistration) - Azure Databricks,Register Java UDAFs as SQL functions in Databricks,Documentation for the UDFRegistration.registerJavaUDAF method in PySpark.,Registers a Java user-defined aggregate function as a SQL function.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"API reference for registering Java aggregate functions as SQL functions, a concrete integration pattern.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udtf,UserDefinedTableFunction class,UserDefinedTableFunction - Azure Databricks,Use UserDefinedTableFunction in Databricks PySpark,Learn about the UserDefinedTableFunction class in PySpark,A user-defined table function in Python. The constructor of this class is not supposed to be directly called. 
Usepyspark.sql.functions.udtfto create an instance.,2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,"Class-level reference for Databricks’ PySpark UDTF, including how to construct via pyspark.sql.functions.udtf.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udtf/asdeterministic,asDeterministic,asDeterministic (UserDefinedTableFunction) - Azure Databricks,Mark UserDefinedTableFunction as deterministic,Documentation for the UserDefinedTableFunction.asDeterministic method in PySpark.,Updates theUserDefinedTableFunctionto deterministic.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Method-level API reference that affects execution semantics of UDTFs in Databricks PySpark.,new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udtfregistration,UDTFRegistration class,UDTFRegistration - Azure Databricks,Use UDTFRegistration for table functions in Databricks,Learn about the UDTFRegistration class in PySpark,Wrapper for user-defined table function registration. 
This instance can be accessed byspark.udtf.,2026-04-17T08:00:00.000Z,reference,integrations,0.78,True,"Class-level reference for registering user-defined table functions via spark.udtf, which is Databricks-specific API behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udtfregistration/register,register,register (UDTFRegistration) - Azure Databricks,Register Python UDTFs as SQL table functions,Documentation for the UDTFRegistration.register method in PySpark.,Registers a Python user-defined table function as a SQL table function.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents how to register Python user-defined table functions as SQL table functions, including method semantics.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/variantval,VariantVal class,VariantVal class - Azure Databricks,Work with VariantVal type in Databricks PySpark,Documentation for the VariantVal class in PySpark.,A class to represent a Variant value in Python. Added in Databricks Runtime 15.2,2026-04-17T21:49:00.000Z,reference,integrations,0.82,True,"Introduces a Databricks-specific VariantVal class and its runtime version, which is specialized API knowledge.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/variantval/parsejson,parseJson,parseJson - Azure Databricks,Parse JSON into VariantVal in Databricks PySpark,Documentation for the VariantVal.parseJson method in PySpark.,Converts a JSON string to a VariantVal object. 
Added in Databricks Runtime 15.2,2026-04-17T21:49:00.000Z,reference,integrations,0.82,True,"Method-level reference for converting JSON strings into VariantVal objects, a product-specific data handling API.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/variantval/tojson,toJson,toJson - Azure Databricks,Convert VariantVal to JSON with time zone control,Documentation for the VariantVal.toJson method in PySpark.,Convert the VariantVal to a JSON string. The zone ID represents the time zone that the timestamp should be printed in. It is defaulted to UTC. The list of valid zone IDs can be found by importing thezoneinfomodule and runningzoneinfo.available_timezones(). Added in Databricks Runtime 15.2,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Describes VariantVal.toJson including default UTC zone and how valid zone IDs are obtained via zoneinfo.available_timezones(), which is detailed, product-specific behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/variantval/topython,toPython,toPython - Azure Databricks,Convert VariantVal to native Python structures,Documentation for the VariantVal.toPython method in PySpark.,Convert the VariantVal to a Python data structure. Added in Databricks Runtime 15.2,2026-04-17T21:49:00.000Z,reference,integrations,0.82,True,"Documents VariantVal.toPython behavior, which is specific to Databricks’ Variant type integration with Python.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/window,Window class,Window class - Azure Databricks,Define window specifications with Window in PySpark,Documentation for the Window class in PySpark.,Utility functions for defining window in DataFrames. 
Supports Spark Connect,2026-04-17T08:00:00.000Z,reference,integrations,0.78,True,"Class-level reference for Window utility functions, including Spark Connect support, which is concrete API surface.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/window/orderby,orderBy,orderBy (Window) - Azure Databricks,Specify ordering for Window in Databricks PySpark,Documentation for the Window\.orderBy method in PySpark.,Creates aWindowSpecwith the ordering defined.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Method-level API reference for Window.orderBy, defining how ordering is attached to a WindowSpec.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/window/partitionby,partitionBy,partitionBy (Window) - Azure Databricks,Specify partitioning for Window in Databricks PySpark,Documentation for the Window\.partitionBy method in PySpark.,Creates aWindowSpecwith the partitioning defined.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Documents Window.partitionBy behavior, a concrete configuration of windowing in Databricks PySpark.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/window/rangebetween,rangeBetween,rangeBetween (Window) - Azure Databricks,Define rangeBetween frames for Window in PySpark,Documentation for the Window\.rangeBetween method in PySpark.,"Creates aWindowSpecwith the frame boundaries defined, fromstart(inclusive) toend(inclusive). Bothstartandendare relative from the current row. For example,0means ""current row"",-1means one before the current row, and5means five after the current row. A range-based boundary is based on the actual value of the ORDER BY expression(s). An offset alters the value of the ORDER BY expression — for example, if the current ORDER BY value is10and the lower bound offset is-3, the resulting lower bound is7. 
",2026-04-17T21:49:00.000Z,reference,integrations,0.82,True,"Explains detailed semantics of rangeBetween, including relative offsets and how ORDER BY values are adjusted, which is precise API behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/window/rowsbetween,rowsBetween,rowsBetween (Window) - Azure Databricks,Define rowsBetween frames for Window in PySpark,Documentation for the Window\.rowsBetween method in PySpark.,"Creates aWindowSpecwith the frame boundaries defined, fromstart(inclusive) toend(inclusive). Bothstartandendare relative positions from the current row. For example,0means ""current row"",-1means the row before the current row, and5means the fifth row after the current row. A row-based boundary is based on the position of the row within the partition. An offset indicates the number of rows above or below the current row where the frame starts or ends.",2026-04-17T21:49:00.000Z,reference,integrations,0.82,True,"Documents row-based frame semantics with relative positions, which is detailed behavior of the Databricks PySpark API.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/windowspec,WindowSpec class,WindowSpec class - Azure Databricks,Use WindowSpec to configure windowing in Databricks,Documentation for the WindowSpec class in PySpark.,"A window specification that defines the partitioning, ordering, and frame boundaries. Changed in Databricks Runtime 13.0: Supports Spark Connect. 
Use the static methods in Window to create a WindowSpec.",2026-04-17T08:00:00.000Z,reference,integrations,0.8,True,"Class-level reference for WindowSpec, including partitioning, ordering, frame boundaries, and Spark Connect support.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/windowspec/orderby,orderBy,orderBy (WindowSpec) - Azure Databricks,Set ordering columns on WindowSpec in PySpark,Documentation for the WindowSpec.orderBy method in PySpark.,Defines the ordering columns in a WindowSpec.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Method-level API reference for WindowSpec.orderBy, defining ordering semantics for windows.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/windowspec/partitionby,partitionBy,partitionBy (WindowSpec) - Azure Databricks,Set partitioning columns on WindowSpec in PySpark,Documentation for the WindowSpec.partitionBy method in PySpark.,Defines the partitioning columns in a WindowSpec.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Documents WindowSpec.partitionBy behavior, a concrete configuration method in the Databricks PySpark API.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/windowspec/rangebetween,rangeBetween,rangeBetween (WindowSpec) - Azure Databricks,Configure rangeBetween on WindowSpec in Databricks,Documentation for the WindowSpec.rangeBetween method in PySpark.,"Defines the frame boundaries, from start (inclusive) to end (inclusive). Both start and end are relative from the current row. 
For example, 0 means ""current row"", -1 means one before the current row, and 5 means five after the current row.",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Explains detailed semantics of range-based frame boundaries relative to the current row, which is specific API behavior.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/windowspec/rowsbetween,rowsBetween,rowsBetween (WindowSpec) - Azure Databricks,Configure rowsBetween on WindowSpec in Databricks,Documentation for the WindowSpec.rowsBetween method in PySpark.,"Defines the frame boundaries, from start (inclusive) to end (inclusive). Both start and end are relative positions from the current row. For example, 0 means ""current row"", -1 means the row before the current row, and 5 means the fifth row after the current row.",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents row-based frame boundaries with relative positions, a detailed aspect of the WindowSpec API.",new -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/writercommitmessage,WriterCommitMessage,WriterCommitMessage - Azure Databricks,Use WriterCommitMessage in Databricks data sources,Learn about the WriterCommitMessage class in PySpark,A commit message returned by DataSourceWriter.write and sent back to the driver as an input parameter of DataSourceWriter.commit or DataSourceWriter.abort. Added in Databricks Runtime 14.3 LTS,2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,"Class-level reference for WriterCommitMessage used with DataSourceWriter.write/commit/abort, including its role in driver communication, which is product-specific integration behavior.",new +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/,Overview,PySpark reference - Azure Databricks,,"Discover reference pages for PySpark, a Python API for Spark, on Databricks.","This page provides an overview of reference available for PySpark, a Python API for Spark. 
For more information about PySpark, see PySpark on Azure Databricks.",2026-04-23T08:00:00.000Z,reference,,0.1,False,"This is an overview/index page for PySpark reference on Databricks. It does not itself contain detailed configuration tables, limits, error codes, or best-practice guidance; it mainly routes to other reference pages. Therefore it does not meet the expert-knowledge criteria.",updated +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog,Catalog class,Catalog - Azure Databricks,Use PySpark Catalog API on Azure Databricks,Learn about the Catalog class in PySpark,"User-facing catalog API, accessible through SparkSession.catalog. This is a thin wrapper around its Scala implementation org.apache.spark.sql.catalog.Catalog.",2026-04-17T08:00:00.000Z,reference,integrations,0.7,True,API reference for Catalog class with method signatures and behavior is product-specific integration knowledge (SDK surface) beyond generic concepts.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/cachetable,cacheTable,cacheTable - Azure Databricks,Cache tables with PySpark Catalog.cacheTable,Documentation for the Catalog.cacheTable method in PySpark.,Caches the specified table in-memory or with given storage level. 
Default MEMORY_AND_DISK.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Documents a specific method, default storage level (MEMORY_AND_DISK), and behavior; this is concrete SDK configuration/usage detail.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/clearcache,clearCache,clearCache - Azure Databricks,Clear cached tables with Catalog.clearCache,Documentation for the Catalog.clearCache method in PySpark.,Removes all cached tables from the in-memory cache.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Method-level reference describing behavior of clearCache is product-specific API detail useful for coding against Databricks.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/createtable,createTable,createTable - Azure Databricks,Create tables using Catalog.createTable in PySpark,Documentation for the Catalog.createTable method in PySpark.,Creates a table based on the dataset in a data source.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Describes a concrete Catalog API for table creation tied to Databricks/Spark; this is SDK integration detail.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/currentcatalog,currentCatalog,currentCatalog - Azure Databricks,Get current catalog with Catalog.currentCatalog,Documentation for the Catalog.currentCatalog method in PySpark.,Returns the current default catalog in this session.,2026-04-17T21:49:00.000Z,reference,integrations,0.68,True,"Documents a specific PySpark Catalog method and its return behavior, which is product-specific API knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/currentdatabase,currentDatabase,currentDatabase - Azure Databricks,Get current database with Catalog.currentDatabase,Documentation for the Catalog.currentDatabase method in PySpark.,Returns the current default database 
in this session.,2026-04-17T21:49:00.000Z,reference,integrations,0.68,True,Method reference for currentDatabase is concrete SDK behavior for Databricks PySpark integration.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/databaseexists,databaseExists,databaseExists - Azure Databricks,Check database existence with Catalog.databaseExists,Documentation for the Catalog.databaseExists method in PySpark.,Check if the database with the specified name exists.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"API method semantics (what it checks, return type) are product-specific integration details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/dropglobaltempview,dropGlobalTempView,dropGlobalTempView - Azure Databricks,Drop global temp views with Catalog.dropGlobalTempView,Documentation for the Catalog.dropGlobalTempView method in PySpark.,Drops the global temporary view with the given view name in the catalog.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents a specific Catalog method and its effect on global temp views; concrete SDK usage detail.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/droptempview,dropTempView,dropTempView - Azure Databricks,Drop local temp views with Catalog.dropTempView,Documentation for the Catalog.dropTempView method in PySpark.,"Drops the local temporary view with the given view name in the catalog. If the view has been cached before, then it will also be uncached. 
Returns true if this view is dropped successfully, false otherwise.",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Includes behavior details (also uncaches, returns true/false) that are specific to this API and important for integration logic.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/functionexists,functionExists,functionExists - Azure Databricks,Check function existence with Catalog.functionExists,Documentation for the Catalog.functionExists method in PySpark.,Check if the function with the specified name exists. This can either be a temporary function or a function.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Method-level semantics for checking temporary or permanent functions are product-specific API details.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/getdatabase,getDatabase,getDatabase - Azure Databricks,Retrieve databases with Catalog.getDatabase in PySpark,Documentation for the Catalog.getDatabase method in PySpark.,Get the database with the specified name. This throws an AnalysisException when the database cannot be found.,2026-04-17T21:49:00.000Z,reference,integrations,0.72,True,Describes behavior including throwing AnalysisException when not found; this is concrete SDK behavior.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/getfunction,getFunction,getFunction - Azure Databricks,Retrieve functions with Catalog.getFunction in PySpark,Documentation for the Catalog.getFunction method in PySpark.,Get the function with the specified name. This function can be a temporary function or a function. 
This throws an AnalysisException when the function cannot be found.,2026-04-17T21:49:00.000Z,reference,integrations,0.72,True,API reference including error behavior (AnalysisException) is specific integration knowledge.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/gettable,getTable,getTable - Azure Databricks,Retrieve tables and views with Catalog.getTable,Documentation for the Catalog.getTable method in PySpark.,Get the table or view with the specified name. This table can be a temporary view or a table/view. This throws an AnalysisException when no Table can be found.,2026-04-17T21:49:00.000Z,reference,integrations,0.72,True,Documents how the method resolves temp vs permanent tables and its exception behavior; concrete SDK detail.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/iscached,isCached,isCached - Azure Databricks,Check table cache status with Catalog.isCached,Documentation for the Catalog.isCached method in PySpark.,Returns true if the table is currently cached in-memory.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Method semantics (returns true if cached in-memory) are specific to Databricks PySpark Catalog API.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/listcatalogs,listCatalogs,listCatalogs - Azure Databricks,List catalogs in session with Catalog.listCatalogs,Documentation for the Catalog.listCatalogs method in PySpark.,Returns a list of catalogs in this session.,2026-04-17T21:49:00.000Z,reference,integrations,0.68,True,API behavior for listing catalogs in a session is product-specific integration information.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/listcolumns,listColumns,listColumns - Azure Databricks,List table columns with Catalog.listColumns,Documentation for the Catalog.listColumns method in PySpark.,Returns a list of columns for 
the given table/view in the specified database.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Method-level reference for listing columns of a table/view is specific to this API surface.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/listdatabases,listDatabases,listDatabases - Azure Databricks,List databases with Catalog.listDatabases in PySpark,Documentation for the Catalog.listDatabases method in PySpark.,Returns a list of databases available across all sessions.,2026-04-17T21:49:00.000Z,reference,integrations,0.68,True,Describes method returning databases across sessions; concrete SDK usage detail.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/listfunctions,listFunctions,listFunctions - Azure Databricks,List registered functions with Catalog.listFunctions,Documentation for the Catalog.listFunctions method in PySpark.,Returns a list of functions registered in the specified database.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents how to retrieve functions in a database via Catalog API; concrete integration pattern.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/listtables,listTables,listTables - Azure Databricks,List tables and views with Catalog.listTables,Documentation for the Catalog.listTables method in PySpark.,Returns a list of tables/views in the specified database.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,API semantics for listing tables/views in a database are product-specific integration details.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/recoverpartitions,recoverPartitions,recoverPartitions - Azure Databricks,Recover table partitions with Catalog.recoverPartitions,Documentation for the Catalog.recoverPartitions method in PySpark.,Recovers all the partitions of the given table and updates the 
catalog.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Describes a specific Catalog method that updates metadata for partitions; this is detailed SDK behavior.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/refreshbypath,refreshByPath,refreshByPath - Azure Databricks,Refresh cached data by path with Catalog.refreshByPath,Documentation for the Catalog.refreshByPath method in PySpark.,Invalidates and refreshes all the cached data (and the associated metadata) for any DataFrame that contains the given data source path.,2026-04-17T21:49:00.000Z,reference,integrations,0.72,True,Details how cached data and metadata are invalidated/refreshed for DataFrames by path; product-specific API semantics.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/refreshtable,refreshTable,refreshTable - Azure Databricks,Refresh cached tables with Catalog.refreshTable,Documentation for the Catalog.refreshTable method in PySpark.,Invalidates and refreshes all the cached data and metadata of the given table.,2026-04-17T21:49:00.000Z,reference,integrations,0.72,True,Explains behavior of invalidating and refreshing cached data/metadata for a table; concrete Databricks PySpark API detail.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/setcurrentcatalog,setCurrentCatalog,setCurrentCatalog - Azure Databricks,Use Catalog.setCurrentCatalog in Azure Databricks PySpark,Documentation for the Catalog.setCurrentCatalog method in PySpark.,Sets the current default catalog in this session.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"API reference for a specific PySpark Catalog method, including signature and Databricks-specific behavior; this is product-specific integration/code-pattern knowledge beyond generic LLM training.",unchanged 
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/setcurrentdatabase,setCurrentDatabase,setCurrentDatabase - Azure Databricks,Use Catalog.setCurrentDatabase in Azure Databricks PySpark,Documentation for the Catalog.setCurrentDatabase method in PySpark.,Sets the current default database in this session.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents a concrete PySpark Catalog API with exact method name, parameters, and behavior in Databricks; this is detailed SDK usage rather than conceptual content.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/tableexists,tableExists,tableExists - Azure Databricks,Check table existence with Catalog.tableExists in PySpark,Documentation for the Catalog.tableExists method in PySpark.,Check if the table or view with the specified name exists. This can either be a temporary view or a table/view.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Provides exact Databricks PySpark method semantics and usage for checking table or view existence, which is product-specific API knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/uncachetable,uncacheTable,uncacheTable - Azure Databricks,Remove cached tables with Catalog.uncacheTable in PySpark,Documentation for the Catalog.uncacheTable method in PySpark.,Removes the specified table from the in-memory cache.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Describes a specific Databricks PySpark Catalog API for cache management, including method name and behavior, which is detailed integration knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column,Column class,Column class - Azure Databricks,Work with the Column class in Azure Databricks PySpark,Documentation for the Column class in PySpark.,A column in a DataFrame. 
Supports Spark Connect,2026-04-17T08:00:00.000Z,reference,integrations,0.85,True,"Full reference for the Databricks PySpark Column class, including methods, behaviors, and Spark Connect support; this is detailed SDK/API knowledge unique to the product.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/alias,alias,alias (Column) - Azure Databricks,Assign aliases to columns with Column.alias in PySpark,Documentation for the Column.alias method in PySpark.,Give the column an alias.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Documents a specific Column method, its signature, and behavior in Databricks PySpark, which is concrete API usage rather than generic theory.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/asc,asc,asc (Column) - Azure Databricks,Sort columns ascending with Column.asc in PySpark,Documentation for the Column.asc method in PySpark.,Sort the column in ascending order.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"API reference for a Databricks PySpark Column method with defined behavior, representing product-specific coding patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/asc_nulls_first,asc_nulls_first,asc_nulls_first (Column) - Azure Databricks,Sort ascending with nulls first using Column.asc_nulls_first,Documentation for the Column.asc\_nulls\_first method in PySpark.,Sort the column in ascending order with null values appearing first.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Describes a precise Column API for sort behavior including null ordering, which is specific to Databricks PySpark implementation.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/asc_nulls_last,asc_nulls_last,asc_nulls_last (Column) - Azure Databricks,Sort ascending with nulls last using Column.asc_nulls_last,Documentation for the 
Column.asc\_nulls\_last method in PySpark.,Sort the column in ascending order with null values appearing last.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Provides exact method name and semantics for a Databricks PySpark Column sort variant, which is detailed SDK knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/astype,astype,astype - Azure Databricks,Cast column types with Column.astype in PySpark,Documentation for the Column.astype method in PySpark.,Alias for cast().,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Documents a Column method alias and its relationship to cast(), which is specific API behavior in Databricks PySpark.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/between,between,between - Azure Databricks,Filter values within ranges using Column.between in PySpark,Documentation for the Column.between method in PySpark.,Check if the column value is between lower and upper bounds (inclusive).,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"API reference for a Column method with defined inclusive bounds semantics, representing concrete Databricks PySpark behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/bitwiseand,bitwiseAND,bitwiseAND - Azure Databricks,Compute bitwise AND with Column.bitwiseAND in Databricks PySpark,Documentation for the Column.bitwiseAND method in PySpark.,Compute bitwise AND of this expression with another expression. 
Changed in Databricks Runtime 13.0: Supports Spark Connect.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Details a specific Column bitwise operation, including Databricks Runtime version notes and Spark Connect support, which is product-specific API knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/bitwiseor,bitwiseOR,bitwiseOR - Azure Databricks,Compute bitwise OR with Column.bitwiseOR in Databricks PySpark,Documentation for the Column.bitwiseOR method in PySpark.,Compute bitwise OR of this expression with another expression. Changed in Databricks Runtime 13.0: Supports Spark Connect.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Similar to index 11, this documents a precise Column API and runtime/version behavior, which is expert-level integration detail.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/bitwisexor,bitwiseXOR,bitwiseXOR - Azure Databricks,Compute bitwise XOR with Column.bitwiseXOR in Databricks PySpark,Documentation for the Column.bitwiseXOR method in PySpark.,Compute bitwise XOR of this expression with another expression. 
Changed in Databricks Runtime 13.0: Supports Spark Connect.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Provides method-level documentation including Databricks Runtime changes, which is detailed product-specific SDK information.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/cast,cast,cast - Azure Databricks,Change column data types with Column.cast in PySpark,Documentation for the Column.cast method in PySpark.,Convert the column to a different data type.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents the exact casting behavior and usage for a Databricks PySpark Column, which is concrete API reference material.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/contains,contains,contains (Column) - Azure Databricks,Check substring presence with Column.contains in PySpark,Documentation for the Column.contains method in PySpark.,Check if a string column contains a substring.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"API reference for a string-specific Column method, including semantics of substring matching in Databricks PySpark.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/desc,desc,desc (Column) - Azure Databricks,Sort columns descending with Column.desc in PySpark,Documentation for the Column.desc method in PySpark.,Sort the column in descending order.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents a specific Column sorting method and its behavior, which is part of the Databricks PySpark SDK surface.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/desc_nulls_first,desc_nulls_first,desc_nulls_first (Column) - Azure Databricks,Sort descending with nulls first using Column.desc_nulls_first,Documentation for the Column.desc\_nulls\_first method in PySpark.,Sort the column in descending order with null 
values appearing first.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Provides exact method name and null ordering semantics, which are product-specific API details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/desc_nulls_last,desc_nulls_last,desc_nulls_last (Column) - Azure Databricks,Sort descending with nulls last using Column.desc_nulls_last,Documentation for the Column.desc\_nulls\_last method in PySpark.,Sort the column in descending order with null values appearing last.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Similar to index 17, this is detailed Column API documentation for Databricks PySpark.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/dropfields,dropFields,dropFields - Azure Databricks,Remove struct fields with Column.dropFields in PySpark,Documentation for the Column.dropFields method in PySpark.,Remove fields from a struct column.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Describes a struct-specific Column method and its behavior, which is concrete Databricks PySpark API knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/endswith,endswith,endswith (Column) - Azure Databricks,Check string suffixes with Column.endswith in PySpark,Documentation for the Column.endswith method in PySpark.,Check if a string column ends with a substring.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,API reference for a Column string operation with defined semantics in Databricks PySpark.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/eqnullsafe,eqNullSafe,eqNullSafe - Azure Databricks,Perform null-safe equality with Column.eqNullSafe in Databricks PySpark,Documentation for the Column.eqNullSafe method in PySpark.,Equality test that is safe for null values. 
Added in Databricks Runtime 11.0 Changed in Databricks Runtime 13.0: Supports Spark Connect.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Documents a null-safe comparison method, including Databricks Runtime version changes and Spark Connect support, which is detailed product-specific behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/getfield,getField,getField - Azure Databricks,Access struct fields with Column.getField in PySpark,Documentation for the Column.getField method in PySpark.,Get a field from a struct column.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Provides method-level semantics for struct field access in Databricks PySpark, which is specific SDK usage.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/getitem,getItem,getItem - Azure Databricks,Access array or map elements with Column.getItem in PySpark,Documentation for the Column.getItem method in PySpark.,Get an item from an array or map column.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents a Column method for array/map indexing, including behavior details unique to Databricks PySpark.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/ilike,ilike,ilike (Column) - Azure Databricks,Use case-insensitive pattern matching with Column.ilike in PySpark,Documentation for the Column.ilike method in PySpark.,Case-insensitive SQL LIKE pattern matching.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,API reference for a Column method implementing case-insensitive SQL LIKE semantics in Databricks PySpark.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/isin,isin,isin - Azure Databricks,Test membership in value lists with Column.isin in PySpark,Documentation for the Column.isin method in PySpark.,Check if the column value is in a list of 
values.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Describes a Column method for membership checks, including its behavior in Databricks PySpark.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/isnan,isNaN,isNaN - Azure Databricks,Detect NaN values with Column.isNaN in PySpark,Documentation for the Column.isNaN method in PySpark.,Check if the column value is NaN.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents NaN-specific behavior for a Column method, which is concrete Databricks PySpark API knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/isnotnull,isNotNull,isNotNull - Azure Databricks,Filter non-null values with Column.isNotNull in PySpark,Documentation for the Column.isNotNull method in PySpark.,Check if the column value is not null.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,API reference for a null-checking Column method with defined semantics in Databricks PySpark.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/isnull,isNull,isNull - Azure Databricks,Filter null values with Column.isNull in PySpark,Documentation for the Column.isNull method in PySpark.,Check if the column value is null.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Documents the null-detection Column method and its behavior in Databricks PySpark.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/like,like,like (Column) - Azure Databricks,Use SQL LIKE pattern matching with Column.like in PySpark,Documentation for the Column.like method in PySpark.,SQL LIKE pattern matching.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Provides method-level documentation for SQL LIKE semantics on a Column in Databricks PySpark.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/name,name,name 
(Column) - Azure Databricks,Rename columns using Column.name alias in PySpark,Documentation for the Column.name method in PySpark.,Alias for alias().,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Explains a Column method alias and its relation to alias(), which is specific to the Databricks PySpark API surface.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/otherwise,otherwise,otherwise - Azure Databricks,Provide default values with Column.otherwise in PySpark,Documentation for the Column.otherwise method in PySpark.,Specify a default value for when conditions in when() are not met.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents how otherwise() interacts with when() conditions in Databricks PySpark, which is detailed API behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/over,over,over - Azure Databricks,Apply window specifications with Column.over in PySpark,Documentation for the Column.over method in PySpark.,Apply a window specification to the column.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"API reference for binding a Column to a window specification, including Databricks-specific semantics for window functions.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/rlike,rlike,rlike (Column) - Azure Databricks,Use regex pattern matching with Column.rlike in PySpark,Documentation for the Column.rlike method in PySpark.,SQL RLIKE (regex LIKE) pattern matching.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents SQL RLIKE semantics on a Column in Databricks PySpark, which is specific API behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/startswith,startswith,startswith (Column) - Azure Databricks,Check string prefixes with Column.startswith in PySpark,Documentation for the Column.startswith method in 
PySpark.,Check if a string column starts with a substring.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,API reference for a Column method implementing starts-with semantics in Databricks PySpark.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/substr,substr,substr (Column) - Azure Databricks,Extract substrings with Column.substr in PySpark,Documentation for the Column.substr method in PySpark.,Return a substring of the column.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Documents substring extraction behavior and parameters for a Column in Databricks PySpark.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/try_cast,try_cast,try_cast - Azure Databricks,Safely cast column types with Column.try_cast in Databricks PySpark,Documentation for the Column.try\_cast method in PySpark.,Try to convert the column to a different data type. Returns null if conversion fails. Added in Databricks Runtime 15.0,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Describes a Databricks Runtime 15.0-added Column method that returns null on conversion failure, which is new, version-specific API behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/when,when,when (Column) - Azure Databricks,Implement conditional logic with Column.when in PySpark,Documentation for the Column.when method in PySpark.,Evaluate a list of conditions and return one of multiple possible result expressions.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"API reference for multi-branch conditional expressions on Columns, including interaction with otherwise(), which is detailed Databricks PySpark behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/withfield,withField,withField - Azure Databricks,Add or replace struct fields with Column.withField in PySpark,Documentation for 
the Column.withField method in PySpark.,Add or replace a field in a struct column.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents struct-manipulation behavior for a Column method in Databricks PySpark, which is specific SDK knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe,DataFrame class,DataFrame class - Azure Databricks,Use the DataFrame class in Azure Databricks PySpark,Documentation for the DataFrame class in PySpark.,"A distributed collection of data grouped into named columns. A DataFrame is equivalent to a relational table in Spark SQL, and can be created using various functions in SparkSession. Important A DataFrame should not be directly created using the constructor. Supports Spark Connect",2026-04-17T08:00:00.000Z,reference,integrations,0.9,True,"Comprehensive reference for the Databricks PySpark DataFrame class, including creation constraints, methods, and Spark Connect support; this is core product-specific API and integration knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/agg,agg,agg (DataFrame) - Azure Databricks,Use DataFrame.agg for aggregations in Azure Databricks PySpark,Documentation for the DataFrame.agg method in PySpark.,Aggregate on the entire DataFrame without groups (shorthand for df.groupBy().agg()).,2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,"API reference for DataFrame.agg with method signature, parameters, and behavior details that are product/version-specific and not reliably known from training.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/alias,alias,alias (DataFrame) - Azure Databricks,Apply aliases to DataFrames with alias() in PySpark,Documentation for the DataFrame.alias method in PySpark.,Returns a new DataFrame with an alias set.,2026-04-17T21:49:00.000Z,reference,integrations,0.76,True,"Documents DataFrame.alias 
behavior and parameters in Databricks’ PySpark, which is concrete API surface rather than conceptual info.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/approxquantile,approxQuantile,approxQuantile (DataFrame) - Azure Databricks,Compute approximate quantiles with DataFrame.approxQuantile,Documentation for the DataFrame.approxQuantile method in PySpark.,Calculates the approximate quantiles of numerical columns of a DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.82,True,"API reference including parameters (e.g., relative error) and return structure for approxQuantile, which are detailed, product-specific semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/astable,asTable,asTable - Azure Databricks,Convert DataFrames to table arguments with asTable,Documentation for the DataFrame.asTable method in PySpark.,"Converts the DataFrame into a TableArg object, which can be used as a table argument in a TVF (Table-Valued Function) including UDTF (User-Defined Table Function).",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Describes DataFrame.asTable and its use with TVFs/UDTFs, including object type and usage patterns unique to Databricks’ PySpark.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/cache,cache,cache - Azure Databricks,Cache DataFrames with default storage level in PySpark,Documentation for the DataFrame.cache method in PySpark.,Persists the DataFrame with the default storage level (MEMORY_AND_DISK_DESER).,2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,"Documents DataFrame.cache and the specific default storage level (MEMORY_AND_DISK_DESER), which is a concrete configuration detail.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/checkpoint,checkpoint,checkpoint - Azure Databricks,Checkpoint DataFrames and manage
logical plans in PySpark,Documentation for the DataFrame.checkpoint method in PySpark.,"Returns a checkpointed version of this DataFrame. Checkpointing can be used to truncate the logical plan of this DataFrame, which is especially useful in iterative algorithms where the plan may grow exponentially. It will be saved to files inside the checkpoint directory set with SparkContext.setCheckpointDir, or the spark.checkpoint.dir configuration.",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Explains DataFrame.checkpoint behavior and interaction with SparkContext.setCheckpointDir and spark.checkpoint.dir, including side effects and storage location.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/coalesce,coalesce,coalesce (DataFrame) - Azure Databricks,Repartition DataFrames with coalesce(numPartitions),Documentation for the DataFrame.coalesce method in PySpark.,Returns a new DataFrame that has exactly numPartitions partitions.,2026-04-17T21:49:00.000Z,reference,integrations,0.76,True,API reference for DataFrame.coalesce with parameter semantics that are specific to Spark/Databricks implementation.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/collect,collect,collect - Azure Databricks,Collect all DataFrame rows into driver with collect(),Documentation for the DataFrame.collect method in PySpark.,Returns all the records in the DataFrame as a list of Row.,2026-04-17T21:49:00.000Z,reference,integrations,0.76,True,"Describes DataFrame.collect behavior and return type (list of Row), which is specific API knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/colregex,colRegex,colRegex - Azure Databricks,Select columns by regex with DataFrame.colRegex,Documentation for the DataFrame.colRegex method in PySpark.,Selects column based on the column name specified as a regex and returns it
as Column.,2026-04-17T21:49:00.000Z,reference,integrations,0.77,True,"Documents DataFrame.colRegex usage and return type, a concrete API behavior detail.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/columns,columns,columns - Azure Databricks,Access DataFrame column names via columns property,Documentation for the DataFrame.columns property in PySpark.,Retrieves the names of all columns in the DataFrame as a list. The order of the column names in the list reflects their order in the DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.74,True,API reference for DataFrame.columns property and ordering semantics.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/corr,corr,corr (DataFrame) - Azure Databricks,Compute column correlation with DataFrame.corr,Documentation for the DataFrame.corr method in PySpark.,Calculates the correlation of two columns of a DataFrame as a double value. Currently only supports the Pearson Correlation Coefficient. DataFrame.corr and DataFrameStatFunctions.corr are aliases of each other.,2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,"Details DataFrame.corr usage, supported correlation type, and aliasing with DataFrameStatFunctions.corr.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/count,count,count (DataFrame) - Azure Databricks,Count DataFrame rows with count() in PySpark,Documentation for the DataFrame.count method in PySpark.,Returns the number of rows in this DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.74,True,"API behavior for DataFrame.count, including return type and semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/cov,cov,cov (DataFrame) - Azure Databricks,Calculate sample covariance with DataFrame.cov,Documentation for the DataFrame.cov method in PySpark.,"Calculate the sample covariance
for the given columns, specified by their names, as a double value. DataFrame.cov and DataFrameStatFunctions.cov are aliases.",2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,"Documents DataFrame.cov usage, return type, and aliasing with DataFrameStatFunctions.cov.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/createglobaltempview,createGlobalTempView,createGlobalTempView - Azure Databricks,Create global temporary views from DataFrames,Documentation for the DataFrame.createGlobalTempView method in PySpark.,Creates a global temporary view with this DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.79,True,"API reference for DataFrame.createGlobalTempView, including scope and behavior of global temp views in Databricks.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/createorreplaceglobaltempview,createOrReplaceGlobalTempView,createOrReplaceGlobalTempView - Azure Databricks,Create or replace global temp views with DataFrames,Documentation for the DataFrame.createOrReplaceGlobalTempView method in PySpark.,Creates or replaces a global temporary view using the given name.,2026-04-17T21:49:00.000Z,reference,integrations,0.79,True,"Describes DataFrame.createOrReplaceGlobalTempView semantics, including replacement behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/createorreplacetempview,createOrReplaceTempView,createOrReplaceTempView - Azure Databricks,Create or replace local temp views from DataFrames,Documentation for the DataFrame.createOrReplaceTempView method in PySpark.,Creates or replaces a local temporary view with this DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.79,True,"API semantics for DataFrame.createOrReplaceTempView, including local temp view scope.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/createtempview,createTempView,createTempView - Azure Databricks,Create local temporary views with createTempView,Documentation for the DataFrame.createTempView method in PySpark.,Creates a local temporary view with this DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,Documents DataFrame.createTempView behavior and how the view is registered in Spark SQL catalog.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/crossjoin,crossJoin,crossJoin - Azure Databricks,Perform cartesian products with DataFrame.crossJoin,Documentation for the DataFrame.crossJoin method in PySpark.,Returns the cartesian product with another DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.77,True,API reference for DataFrame.crossJoin and its semantics when joining two DataFrames.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/crosstab,crosstab,crosstab (DataFrame) - Azure Databricks,Build contingency tables with DataFrame.crosstab,Documentation for the DataFrame.crosstab method in PySpark.,Computes a pair-wise frequency table of the given columns. Also known as a contingency table. The first column of each row will be the distinct values of col1 and the column names will be the distinct values of col2. The name of the first column will be $col1_$col2.
Pairs that have no occurrences will have zero as their counts. DataFrame.crosstab and DataFrameStatFunctions.crosstab are aliases.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Details DataFrame.crosstab output schema, naming convention ($col1_$col2), and aliasing with DataFrameStatFunctions.crosstab.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/cube,cube,cube - Azure Databricks,Create aggregation cubes with DataFrame.cube,Documentation for the DataFrame.cube method in PySpark.,"Create a multi-dimensional cube for the current DataFrame using the specified columns, allowing aggregations to be performed on them.",2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,API semantics for DataFrame.cube and how it enables multi-dimensional aggregation.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/describe,describe,describe - Azure Databricks,Compute basic column statistics with DataFrame.describe,Documentation for the DataFrame.describe method in PySpark.,Computes basic statistics for numeric and string columns.,2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,"Documents DataFrame.describe behavior, including which statistics are computed for numeric and string columns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/distinct,distinct,distinct - Azure Databricks,Return distinct DataFrame rows with distinct(),Documentation for the DataFrame.distinct method in PySpark.,Returns a new DataFrame containing the distinct rows in this DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.76,True,API reference for DataFrame.distinct and its deduplication semantics.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/drop,drop,drop (DataFrame) - Azure Databricks,Drop DataFrame columns with drop() in PySpark,Documentation for the DataFrame.drop method in
PySpark.,Returns a new DataFrame without specified columns. This is a no-op if the schema doesn't contain the given column name(s).,2026-04-17T21:49:00.000Z,reference,integrations,0.76,True,"Describes DataFrame.drop behavior, including no-op behavior when columns are missing.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/dropduplicates,dropDuplicates,dropDuplicates - Azure Databricks,Remove duplicate rows with DataFrame.dropDuplicates,Documentation for the DataFrame.dropDuplicates method in PySpark.,"Return a new DataFrame with duplicate rows removed, optionally only considering certain columns.",2026-04-17T21:49:00.000Z,reference,integrations,0.77,True,"API semantics for DataFrame.dropDuplicates, including optional subset of columns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/dropduplicateswithinwatermark,dropDuplicatesWithinWatermark,dropDuplicatesWithinWatermark - Azure Databricks,Drop duplicates within watermark in streaming DataFrames,Documentation for the DataFrame.dropDuplicatesWithinWatermark method in PySpark.,"Return a new DataFrame with duplicate rows removed, optionally only considering certain columns, within watermark.",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents DataFrame.dropDuplicatesWithinWatermark, a Databricks-specific streaming/Watermark API with nuanced behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/dropna,dropna,dropna - Azure Databricks,Drop null or NaN rows with DataFrame.dropna,Documentation for the DataFrame.dropna method in PySpark.,Returns a new DataFrame omitting rows with null or NaN values. DataFrame.dropna and DataFrameNaFunctions.drop are aliases of each other.,2026-04-17T21:49:00.000Z,reference,integrations,0.77,True,"API reference for DataFrame.dropna, including aliasing with DataFrameNaFunctions.drop and behavior on null/NaN.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/dtypes,dtypes,dtypes - Azure Databricks,Inspect DataFrame column names and types via dtypes,Documentation for the DataFrame.dtypes property in PySpark.,Returns all column names and their data types as a list.,2026-04-17T21:49:00.000Z,reference,integrations,0.74,True,Documents DataFrame.dtypes property and its list-of-tuples structure.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/exceptall,exceptAll,exceptAll - Azure Databricks,Subtract DataFrames with duplicates using exceptAll,Documentation for the DataFrame.exceptAll method in PySpark.,Return a new DataFrame containing rows in this DataFrame but not in another DataFrame while preserving duplicates.,2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,"API semantics for DataFrame.exceptAll, including preservation of duplicates.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/executioninfo,executionInfo,executionInfo - Azure Databricks,Inspect query execution details with DataFrame.executionInfo,Documentation for the DataFrame.executionInfo property in PySpark.,"Returns an ExecutionInfo object after the query was executed. The executionInfo property allows you to introspect information about the actual query execution after a successful execution. Accessing this property before query execution returns None. If the same DataFrame is executed multiple times, the execution info is overwritten by the latest operation. Note This property is dedicated to Spark Connect client only.
With a regular Spark session, it throws an exception.",2026-04-17T21:49:00.000Z,reference,integrations,0.82,True,"Describes DataFrame.executionInfo property, its lifecycle (None before execution, overwritten on re-run), and Spark Connect–only behavior, which are nuanced, product-specific details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/exists,exists,exists (DataFrame) - Azure Databricks,Build EXISTS subqueries with DataFrame.exists,Documentation for the DataFrame.exists method in PySpark.,Return a Column object for an EXISTS Subquery.,2026-04-17T21:49:00.000Z,reference,integrations,0.76,True,API reference for DataFrame.exists returning a Column usable in EXISTS subqueries.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/explain,explain,explain - Azure Databricks,Print logical and physical plans with DataFrame.explain,Documentation for the DataFrame.explain method in PySpark.,Prints the (logical and physical) plans to the console for debugging purposes.,2026-04-17T21:49:00.000Z,reference,integrations,0.79,True,Documents DataFrame.explain behavior and output for debugging query plans.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/fillna,fillna,fillna - Azure Databricks,Fill null values in DataFrames with fillna(),Documentation for the DataFrame.fillna method in PySpark.,Returns a new DataFrame in which null values are filled with a new value. DataFrame.fillna and DataFrameNaFunctions.fill are aliases of each other.,2026-04-17T21:49:00.000Z,reference,integrations,0.77,True,API semantics for DataFrame.fillna and aliasing with DataFrameNaFunctions.fill.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/filter,filter,filter (DataFrame) - Azure Databricks,Filter DataFrame rows with filter() conditions,Documentation for the DataFrame.filter method in PySpark.,Filters rows using
the given condition.,2026-04-17T21:49:00.000Z,reference,integrations,0.76,True,Documents DataFrame.filter usage and accepted condition types.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/first,first,first (DataFrame) - Azure Databricks,Retrieve the first row of a DataFrame with first(),Documentation for the DataFrame.first method in PySpark.,Returns the first row as a Row.,2026-04-17T21:49:00.000Z,reference,integrations,0.74,True,"API behavior for DataFrame.first, including return type Row.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/foreach,foreach,foreach - Azure Databricks,Apply functions to each DataFrame row with foreach(),Documentation for the DataFrame.foreach method in PySpark.,Applies the f function to all Rows of this DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.77,True,Describes DataFrame.foreach behavior and function signature expectations.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/foreachpartition,foreachPartition,foreachPartition - Azure Databricks,Process DataFrame partitions with foreachPartition(),Documentation for the DataFrame.foreachPartition method in PySpark.,Applies the f function to each partition of this DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.77,True,"API semantics for DataFrame.foreachPartition, including per-partition function application.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/freqitems,freqItems,freqItems (DataFrame) - Azure Databricks,Find frequent items in columns with DataFrame.freqItems,Documentation for the DataFrame.freqItems method in PySpark.,"Finding frequent items for columns, possibly with false positives.
Using the frequent element count algorithm described in ""https://doi.org/10.1145/762471.762473, proposed by Karp, Schenker, and Papadimitriou"". DataFrame.freqItems and DataFrameStatFunctions.freqItems are aliases.",2026-04-17T21:49:00.000Z,reference,integrations,0.81,True,"Documents DataFrame.freqItems, its probabilistic behavior (false positives) and relation to the cited algorithm, which is specialized API knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/groupby,groupBy,groupBy - Azure Databricks,Group DataFrames for aggregation with groupBy(),Documentation for the DataFrame.groupBy method in PySpark.,Groups the DataFrame by the specified columns so that aggregation can be performed on them. See GroupedData for all the available aggregate functions.,2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,API reference for DataFrame.groupBy and its relationship to GroupedData aggregate functions.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/groupingsets,groupingSets,groupingSets - Azure Databricks,Create multi-dimensional aggregations with groupingSets(),Documentation for the DataFrame.groupingSets method in PySpark.,"Create multi-dimensional aggregation for the current DataFrame using the specified grouping sets, so we can run aggregation on them.",2026-04-17T21:49:00.000Z,reference,integrations,0.79,True,"Documents DataFrame.groupingSets semantics for multi-dimensional aggregation, which is specific to Spark SQL.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/head,head,head - Azure Databricks,Return first n rows of a DataFrame with head(),Documentation for the DataFrame.head method in PySpark.,Returns the first n rows.,2026-04-17T21:49:00.000Z,reference,integrations,0.74,True,"API behavior for DataFrame.head, including parameter n and return structure.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/hint,hint,hint - Azure Databricks,Use DataFrame.hint for query optimization in Databricks,Documentation for the DataFrame.hint method in PySpark.,Specifies some hint on the current DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,API reference for DataFrame.hint with Databricks-specific behavior and parameters; concrete method signature and usage details are product-specific integration/coding knowledge beyond generic concepts.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/inputfiles,inputFiles,inputFiles - Azure Databricks,Retrieve composing files with DataFrame.inputFiles,Documentation for the DataFrame.inputFiles method in PySpark.,"Returns a best-effort snapshot of the files that compose this DataFrame. This method simply asks each constituent BaseRelation for its respective files and takes the union of all results. Depending on the source relations, this may not find all input files. Duplicates are removed.",2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Documents DataFrame.inputFiles behavior, including best-effort semantics and duplicate handling; these method-level details are specific to Databricks/Spark API usage.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/intersect,intersect,intersect - Azure Databricks,Compute DataFrame intersections with intersect,Documentation for the DataFrame.intersect method in PySpark.,Return a new DataFrame containing rows only in both this DataFrame and another DataFrame. Note that any duplicates are removed. 
To preserve duplicates use intersectAll.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Explains DataFrame.intersect semantics, especially duplicate removal and relation to intersectAll; this is concrete API behavior specific to this product.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/intersectall,intersectAll,intersectAll - Azure Databricks,Preserve duplicates using DataFrame.intersectAll,Documentation for the DataFrame.intersectAll method in PySpark.,Return a new DataFrame containing rows in both this DataFrame and another DataFrame while preserving duplicates.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Clarifies how intersectAll differs from intersect in handling duplicates; method semantics are product-specific coding details.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/isempty,isEmpty,isEmpty - Azure Databricks,Check for empty DataFrames with isEmpty,Documentation for the DataFrame.isEmpty method in PySpark.,Checks if the DataFrame is empty and returns a boolean value.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Documents DataFrame.isEmpty return type and behavior; API-level detail for Databricks PySpark usage.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/islocal,isLocal,isLocal - Azure Databricks,Detect local collect capability with isLocal,Documentation for the DataFrame.isLocal method in PySpark.,Returns True if the collect and take methods can be run locally (without any Spark executors).,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Describes DataFrame.isLocal semantics regarding collect/take running without executors; specific to Spark/Databricks execution model.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/isstreaming,isStreaming,isStreaming - Azure Databricks,Work with streaming DataFrames using
isStreaming,Documentation for the DataFrame.isStreaming property in PySpark.,"Returns True if this DataFrame contains one or more sources that continuously return data as it arrives. A DataFrame that reads data from a streaming source must be executed as a StreamingQuery using the start method in DataStreamWriter. Methods that return a single answer, such as count or collect, will throw an AnalysisException when there is a streaming source present.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Includes behavior with count/collect raising AnalysisException on streaming sources and requirement to use StreamingQuery; concrete product-specific API behavior.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/join,join,join - Azure Databricks,Join DataFrames with DataFrame.join in Databricks,Documentation for the DataFrame.join method in PySpark.,"Joins with another DataFrame, using the given join expression.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,API reference for DataFrame.join with join expressions; method semantics and parameters are specific integration/coding patterns.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/lateraljoin,lateralJoin,lateralJoin - Azure Databricks,Perform lateral joins with DataFrame.lateralJoin,Documentation for the DataFrame.lateralJoin method in PySpark.,"Lateral joins with another DataFrame, using the given join expression.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents lateralJoin behavior and usage; lateral join support and syntax are product-specific API details.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/limit,limit,limit - Azure Databricks,Limit DataFrame result rows with limit,Documentation for the DataFrame.limit method in PySpark.,Limits the result count to the number specified.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Explains
DataFrame.limit semantics for restricting row counts; concrete method behavior for Databricks PySpark.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/localcheckpoint,localCheckpoint,localCheckpoint - Azure Databricks,Use localCheckpoint for DataFrame logical plan truncation,Documentation for the DataFrame.localCheckpoint method in PySpark.,"Returns a locally checkpointed version of this DataFrame. Checkpointing can be used to truncate the logical plan of this DataFrame, which is especially useful in iterative algorithms where the plan may grow exponentially. Local checkpoints are stored in the executors using the caching subsystem and therefore they are not reliable.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Describes localCheckpoint behavior, storage location, and reliability characteristics; product-specific execution and caching semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/mapinarrow,mapInArrow,mapInArrow - Azure Databricks,Apply Arrow-based batch transforms with mapInArrow,Documentation for the DataFrame.mapInArrow method in PySpark.,"Maps an iterator of batches in the current DataFrame using a Python native function that is performed on pyarrow.RecordBatchs both as input and output, and returns the result as a DataFrame.",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Details DataFrame.mapInArrow using pyarrow.RecordBatch iterators; includes integration-specific behavior between PySpark and Arrow.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/mapinpandas,mapInPandas,mapInPandas - Azure Databricks,Apply pandas batch transforms with mapInPandas,Documentation for the DataFrame.mapInPandas method in PySpark.,"Maps an iterator of batches in the current DataFrame using a Python native function that is performed on pandas DataFrames both as input and output, and returns the
result as a DataFrame.",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Documents DataFrame.mapInPandas using pandas DataFrame iterators; specific integration pattern between Databricks PySpark and pandas.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/melt,melt,melt - Azure Databricks,Unpivot wide DataFrames using melt/unpivot,Documentation for the DataFrame.melt method in PySpark.,"Unpivot a DataFrame from wide format to long format, optionally leaving identifier columns set. This is the reverse to groupBy(...).pivot(...).agg(...), except for the aggregation, which cannot be reversed. melt is an alias for unpivot.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Explains DataFrame.melt semantics, relation to groupBy().pivot().agg(), and aliasing to unpivot; concrete API behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/mergeinto,mergeInto,mergeInto - Azure Databricks,Merge source changes into target tables with mergeInto,Documentation for the DataFrame.mergeInto method in PySpark.,"Merges a set of updates, insertions, and deletions based on a source table into a target table.",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Describes DataFrame.mergeInto for updates/inserts/deletes into target tables; Databricks-specific Delta-style merge semantics.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/metadatacolumn,metadataColumn,metadataColumn - Azure Databricks,Access logical metadata columns with metadataColumn,Documentation for the DataFrame.metadataColumn method in PySpark.,Selects a metadata column based on its logical column name and returns it as a Column.
Added in Databricks Runtime 16.1,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents metadataColumn behavior and its addition in a specific Databricks Runtime version; product-specific API detail.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/na,na,na - Azure Databricks,Handle missing values via DataFrame.na functions,Documentation for the DataFrame.na property in PySpark.,Returns a DataFrameNaFunctions for handling missing values.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Returns DataFrameNaFunctions; exposes product-specific API surface for null handling.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/observe,observe,observe - Azure Databricks,Define and capture DataFrame metrics with observe,Documentation for the DataFrame.observe method in PySpark.,"Define (named) metrics to observe on the DataFrame. This method returns an 'observed' DataFrame that returns the same result as the input, with the following guarantees: It will compute the defined aggregates (metrics) on all the data that is flowing through the Dataset at that point. It will report the value of the defined aggregate columns as soon as we reach a completion point.",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Explains guarantees about metric computation and reporting on observed DataFrames; Databricks-specific monitoring semantics.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/offset,offset,offset - Azure Databricks,Skip initial rows using DataFrame.offset,Documentation for the DataFrame.offset method in PySpark.,Returns a new DataFrame by skipping the first n rows.
Added in Databricks Runtime 13.1,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents offset behavior and runtime version introduction; concrete API semantics for row skipping.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/orderby,orderBy,orderBy (DataFrame) - Azure Databricks,Sort DataFrames using orderBy/sort aliases,Documentation for the DataFrame.orderBy method in PySpark.,orderBy is an alias for sort.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Clarifies that orderBy is an alias for sort; aliasing behavior is specific API knowledge.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/pandas_api,pandas_api,pandas_api - Azure Databricks,Convert PySpark DataFrames to pandas-on-Spark,Documentation for the DataFrame.pandas_api method in PySpark.,Converts the existing DataFrame into a pandas-on-Spark DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Describes pandas_api conversion behavior; integration between Databricks PySpark and pandas-on-Spark is product-specific.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/persist,persist,persist - Azure Databricks,Configure DataFrame persistence and storage levels,Documentation for the DataFrame.persist method in PySpark.,Sets the storage level to persist the contents of the DataFrame across operations after the first time it is computed. This can only be used to assign a new storage level if the DataFrame does not have a storage level set yet.
If no storage level is specified, it defaults to (MEMORY_AND_DISK_DESER).,2026-04-17T21:49:00.000Z,reference,configuration,0.7,True,"Documents persist behavior, including default storage level MEMORY_AND_DISK_DESER and constraints on changing storage level; specific configuration semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/plot,plot,plot - Azure Databricks,Create visualizations with DataFrame.plot accessor,Documentation for the DataFrame.plot property in PySpark.,Returns a PySparkPlotAccessor for plotting functions. You can create plots in two ways:,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Exposes PySparkPlotAccessor and supported plotting approaches; product-specific plotting API.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/printschema,printSchema,printSchema - Azure Databricks,Inspect DataFrame schemas with printSchema,Documentation for the DataFrame.printSchema method in PySpark.,Prints out the schema in the tree format. 
Optionally allows specifying how many levels to print if the schema is nested.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Explains tree-format schema printing and level control; concrete method behavior.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/randomsplit,randomSplit,randomSplit - Azure Databricks,Randomly split DataFrames with randomSplit,Documentation for the DataFrame.randomSplit method in PySpark.,Randomly splits this DataFrame with the provided weights.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents randomSplit behavior with weights; API semantics for dataset partitioning.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/rdd,rdd,rdd - Azure Databricks,Access underlying RDDs via DataFrame.rdd,Documentation for the DataFrame.rdd property in PySpark.,Returns the content as an RDD of Row.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Describes rdd property returning RDD[Row]; product-specific bridge between DataFrame and RDD APIs.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/repartition,repartition,repartition - Azure Databricks,Hash partition DataFrames with repartition,Documentation for the DataFrame.repartition method in PySpark.,Returns a new DataFrame partitioned by the given partitioning expressions. The resulting DataFrame is hash partitioned.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Explains repartition behavior and hash partitioning semantics; concrete API behavior.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/repartitionbyid,repartitionById,repartitionById - Azure Databricks,Partition DataFrames by column identifier with repartitionById,Documentation for the DataFrame.repartitionById method in PySpark.,Returns a new DataFrame partitioned by the given partitioning expressions. 
The resulting DataFrame is partitioned by column identifier.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents repartitionById semantics and partitioning by column identifier; Databricks-specific API.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/repartitionbyrange,repartitionByRange,repartitionByRange - Azure Databricks,Range partition DataFrames with repartitionByRange,Documentation for the DataFrame.repartitionByRange method in PySpark.,Returns a new DataFrame partitioned by the given partitioning expressions. The resulting DataFrame is range partitioned.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Explains repartitionByRange behavior and range partitioning; concrete partitioning API.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/replace,replace,replace (DataFrame) - Azure Databricks,Replace values in DataFrames with replace,Documentation for the DataFrame.replace method in PySpark.,"Returns a new DataFrame replacing a value with another value. DataFrame.replace and DataFrameNaFunctions.replace are aliases of each other. Values to_replace and value must have the same type and can only be numerics, booleans, or strings. Value can have None. 
When replacing, the new value will be cast to the type of the existing column.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Details type constraints, aliasing with DataFrameNaFunctions.replace, and casting behavior; specific method semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/rollup,rollup,rollup - Azure Databricks,Create multi-dimensional aggregations with rollup,Documentation for the DataFrame.rollup method in PySpark.,"Create a multi-dimensional rollup for the current DataFrame using the specified columns, allowing for aggregation on them.",2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents rollup behavior for multi-dimensional aggregation; product-specific group/aggregation API.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/samesemantics,sameSemantics,sameSemantics - Azure Databricks,Compare DataFrame logical plans with sameSemantics,Documentation for the DataFrame.sameSemantics method in PySpark.,Returns True when the logical query plans inside both DataFrames are equal and therefore return the same results.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Explains sameSemantics behavior for logical plan equality; Databricks/Spark-specific query plan API.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/sample,sample,sample - Azure Databricks,Sample subsets of DataFrames with sample,Documentation for the DataFrame.sample method in PySpark.,Returns a sampled subset of this DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents sampling behavior; concrete API semantics for random sampling.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/sampleby,sampleBy,sampleBy (DataFrame) - Azure Databricks,Perform stratified sampling with sampleBy,Documentation for the DataFrame.sampleBy method in 
PySpark.,Returns a stratified sample without replacement based on the fraction given on each stratum.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Explains stratified sampling without replacement and fraction-per-stratum behavior; specific method semantics.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/scalar,scalar,scalar - Azure Databricks,Use scalar subqueries via DataFrame.scalar,Documentation for the DataFrame.scalar method in PySpark.,Return a Column object for a SCALAR Subquery containing exactly one row and one column.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents scalar subquery constraints (exactly one row and one column) and Column return type; product-specific query API.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/schema,schema,schema (DataFrame) - Azure Databricks,Inspect DataFrame schema via schema property,Documentation for the DataFrame.schema property in PySpark.,Returns the schema of this DataFrame as a StructType.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Describes schema property returning StructType; concrete API surface.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/select,select,select - Azure Databricks,Project columns and expressions with select,Documentation for the DataFrame.select method in PySpark.,Projects a set of expressions and returns a new DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents DataFrame.select projection behavior; core API semantics.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/selectexpr,selectExpr,selectExpr - Azure Databricks,Project SQL expressions with selectExpr,Documentation for the DataFrame.selectExpr method in PySpark.,Projects a set of SQL expressions and returns a new 
DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Explains selectExpr usage with SQL expressions; specific to Databricks/Spark SQL integration.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/semantichash,semanticHash,semanticHash - Azure Databricks,Generate logical plan hashes with semanticHash,Documentation for the DataFrame.semanticHash method in PySpark.,Returns a hash code of the logical query plan against this DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents semanticHash behavior for logical query plans; product-specific diagnostic API.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/show,show,show - Azure Databricks,Display DataFrame rows in console with show,Documentation for the DataFrame.show method in PySpark.,Prints the first n rows of the DataFrame to the console.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Explains show behavior for printing first n rows; includes method semantics and parameters specific to this API.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/sort,sort,sort - Azure Databricks,Use DataFrame.sort in Azure Databricks PySpark,Documentation for the DataFrame.sort method in PySpark.,Returns a new DataFrame sorted by the specified column(s).,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"API reference for a specific PySpark DataFrame method on Databricks, including parameters and behavior details that are product- and runtime-specific.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/sortwithinpartitions,sortWithinPartitions,sortWithinPartitions - Azure Databricks,Use sortWithinPartitions for partitioned sorting,Documentation for the DataFrame.sortWithinPartitions method in PySpark.,Returns a new DataFrame with each partition sorted by the specified 
column(s).,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents a Databricks-specific PySpark DataFrame API with method semantics and parameters that go beyond generic conceptual knowledge.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/sparksession,sparkSession,sparkSession - Azure Databricks,Access DataFrame.sparkSession in Databricks PySpark,Documentation for the DataFrame.sparkSession property in PySpark.,Returns the Spark session that created this DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Property reference describing how to obtain the SparkSession from a DataFrame in Databricks PySpark, including behavior specific to this environment.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/stat,stat,stat - Azure Databricks,Use DataFrame.stat for statistics in PySpark,Documentation for the DataFrame.stat property in PySpark.,Returns a DataFrameStatFunctions for statistic functions.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Reference for the stat property returning DataFrameStatFunctions, exposing product-specific statistical APIs and usage patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/storagelevel,storageLevel,storageLevel - Azure Databricks,Check DataFrame.storageLevel in Databricks PySpark,Documentation for the DataFrame.storageLevel property in PySpark.,Gets the DataFrame's current storage level.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Documents a property exposing Spark storage level details for a DataFrame, including behavior tied to Databricks runtime.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/subtract,subtract,subtract - Azure Databricks,Use DataFrame.subtract to diff DataFrames,Documentation for the DataFrame.subtract method in PySpark.,Return a new DataFrame containing 
rows in this DataFrame but not in another DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,API reference for subtract between DataFrames with Databricks-specific semantics and constraints not captured by generic knowledge.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/summary,summary,summary - Azure Databricks,Compute DataFrame.summary statistics in PySpark,Documentation for the DataFrame.summary method in PySpark.,"Computes specified statistics for numeric and string columns. Available statistics are: count, mean, stddev, min, max, arbitrary approximate percentiles specified as a percentage (e.g., 75%).",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Details available statistics, including approximate percentiles and accepted arguments, which are concrete API behaviors.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/tail,tail,tail - Azure Databricks,Use DataFrame.tail to get last rows,Documentation for the DataFrame.tail method in PySpark.,Returns the last num rows as a list of Row.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Method reference for tail with parameter semantics and return type specific to Databricks PySpark.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/take,take,take - Azure Databricks,Use DataFrame.take to get first rows,Documentation for the DataFrame.take method in PySpark.,Returns the first num rows as a list of Row.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents the take method’s parameters and behavior for Databricks PySpark DataFrames.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/to,to,to - Azure Databricks,Reconcile DataFrame schema with to() in PySpark,Documentation for the DataFrame.to method in PySpark.,Returns a new DataFrame where each row is reconciled to match the 
specified schema.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,API reference for DataFrame.to with schema reconciliation behavior that is implementation-specific.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/toarrow,toArrow,toArrow - Azure Databricks,Convert DataFrame to PyArrow Table in Databricks,Documentation for the DataFrame.toArrow method in PySpark.,Returns the contents of this DataFrame as PyArrow pyarrow.Table. Added in Databricks Runtime 15.3,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Describes DataFrame.toArrow, including return type, version added, and integration behavior with PyArrow, which are concrete integration details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/todf,toDF,toDF - Azure Databricks,Rename DataFrame columns with toDF in PySpark,Documentation for the DataFrame.toDF method in PySpark.,Returns a new DataFrame with new specified column names.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Method reference for toDF with specific behavior for renaming columns in Databricks PySpark.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/tojson,toJSON,toJSON - Azure Databricks,Convert DataFrame to JSON RDD in Databricks,Documentation for the DataFrame.toJSON method in PySpark.,Converts a DataFrame into an RDD of string or DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Documents DataFrame.toJSON, including output type and behavior, which are concrete API integration details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/tolocaliterator,toLocalIterator,toLocalIterator - Azure Databricks,Iterate over all DataFrame rows with toLocalIterator,Documentation for the DataFrame.toLocalIterator method in PySpark.,Returns an iterator that contains all of the rows in this DataFrame. 
The iterator will consume as much memory as the largest partition in this DataFrame. With prefetch it may consume up to the memory of the 2 largest partitions.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Explains memory behavior (largest partition, prefetch) and iterator semantics, which are product-specific operational details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/topandas,toPandas,toPandas - Azure Databricks,Convert Spark DataFrame to pandas in Databricks,Documentation for the DataFrame.toPandas method in PySpark.,Returns the contents of this DataFrame as Pandas pandas.DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Reference for toPandas, including return type and implicit data movement semantics between Spark and pandas.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/transform,transform,transform (DataFrame) - Azure Databricks,Chain custom DataFrame.transform operations,Documentation for the DataFrame.transform method in PySpark.,Returns a new DataFrame. Concise syntax for chaining custom transformations.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents the transform method’s signature and usage pattern for chaining transformations in Databricks PySpark.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/transpose,transpose,transpose - Azure Databricks,Transpose a DataFrame in Databricks PySpark,Documentation for the DataFrame.transpose method in PySpark.,"Transposes a DataFrame such that the values in the specified index column become the new columns of the DataFrame. 
If no index column is provided, the first column is used as the default.",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"API reference for transpose, including default index column behavior, which is specific to this implementation.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/union,union,union - Azure Databricks,Union two DataFrames with union in PySpark,Documentation for the DataFrame.union method in PySpark.,Return a new DataFrame containing the union of rows in this and another DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Describes DataFrame.union semantics and usage in Databricks PySpark.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/unionbyname,unionByName,unionByName - Azure Databricks,Union DataFrames by column name with unionByName,Documentation for the DataFrame.unionByName method in PySpark.,Returns a new DataFrame containing the union of rows in this and another DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Documents unionByName behavior and how it matches columns, which is concrete API behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/unpersist,unpersist,unpersist - Azure Databricks,Unpersist a DataFrame in Databricks PySpark,Documentation for the DataFrame.unpersist method in PySpark.,"Marks the DataFrame as non-persistent, and removes all blocks for it from memory and disk.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Explains unpersist behavior, including removal of blocks from memory and disk, which is implementation-specific.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/unpivot,unpivot,unpivot - Azure Databricks,Unpivot wide DataFrames to long format in PySpark,Documentation for the DataFrame.unpivot method in PySpark.,"Unpivot a DataFrame from wide format to long 
format, optionally leaving identifier columns set. This is the reverse to groupBy(...).pivot(...).agg(...), except for the aggregation, which cannot be reversed. Added in Databricks Runtime 11.1",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Details unpivot behavior, relation to groupBy().pivot().agg(), and version added, which are specific integration semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/where,where,where - Azure Databricks,Filter DataFrames using where alias in PySpark,Documentation for the DataFrame.where method in PySpark.,where is an alias for filter.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,"Clarifies that where is an alias for filter in Databricks PySpark, a concrete API mapping.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/withcolumn,withColumn,withColumn - Azure Databricks,Add or replace columns with withColumn in PySpark,Documentation for the DataFrame.withColumn method in PySpark.,Returns a new DataFrame by adding a column or replacing the existing column that has the same name.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Reference for withColumn behavior when adding or replacing columns in Databricks PySpark.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/withcolumnrenamed,withColumnRenamed,withColumnRenamed - Azure Databricks,Rename a DataFrame column with withColumnRenamed,Documentation for the DataFrame.withColumnRenamed method in PySpark.,Returns a new DataFrame by renaming an existing column. 
This is a no-op if the schema doesn't contain the given column name.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents renaming semantics and no-op behavior when column is missing, which are concrete API details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/withcolumns,withColumns,withColumns - Azure Databricks,Add multiple columns with withColumns in PySpark,Documentation for the DataFrame.withColumns method in PySpark.,Returns a new DataFrame by adding multiple columns or replacing the existing columns that have the same names.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Describes multi-column add/replace behavior for Databricks PySpark DataFrames.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/withcolumnsrenamed,withColumnsRenamed,withColumnsRenamed - Azure Databricks,Rename multiple columns with withColumnsRenamed,Documentation for the DataFrame.withColumnsRenamed method in PySpark.,Returns a new DataFrame by renaming multiple columns. 
This is a no-op if the schema doesn't contain the given column names.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Explains multi-column rename semantics and no-op behavior, which are specific API behaviors.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/withmetadata,withMetadata,withMetadata - Azure Databricks,Update column metadata with withMetadata in PySpark,Documentation for the DataFrame.withMetadata method in PySpark.,Returns a new DataFrame by updating an existing column with metadata.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents how to modify column metadata via a specific DataFrame API in Databricks PySpark.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/withwatermark,withWatermark,withWatermark - Azure Databricks,Define event-time watermarks with withWatermark,Documentation for the DataFrame.withWatermark method in PySpark.,Defines an event time watermark for this DataFrame. 
A watermark tracks a point in time before which we assume no more late data is going to arrive.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Explains watermark semantics for streaming DataFrames, including late data handling, which is product-specific behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/write,write,write (DataFrame) - Azure Databricks,Write non-streaming DataFrames to storage in Databricks,Documentation for the DataFrame.write property in PySpark.,Interface for saving the content of the non-streaming DataFrame out into external storage.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Reference for DataFrame.write interface, including how it interacts with external storage systems in Databricks.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/writestream,writeStream,writeStream - Azure Databricks,Write streaming DataFrames with writeStream in Databricks,Documentation for the DataFrame.writeStream property in PySpark.,Interface for saving the content of the streaming DataFrame out into external storage.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Documents streaming write interface and semantics for Databricks PySpark DataFrames.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/writeto,writeTo,writeTo - Azure Databricks,Configure v2 data source writes with writeTo,Documentation for the DataFrame.writeTo method in PySpark.,Create a write configuration builder for v2 sources.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Describes the write configuration builder for v2 sources, including API usage specific to Databricks runtime.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframenafunctions,DataFrameNaFunctions class,DataFrameNaFunctions class - Azure Databricks,Handle nulls with DataFrameNaFunctions in 
PySpark,"Learn about the DataFrameNaFunctions class in PySpark for dropping, filling, and replacing null values in a DataFrame.",Functionality for working with missing data in a DataFrame. Supports Spark Connect,2026-04-17T08:00:00.000Z,reference,integrations,0.8,True,"Class reference for DataFrameNaFunctions, including supported operations and Spark Connect support, which are concrete API details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframenafunctions/drop,drop,drop (DataFrameNaFunctions) - Azure Databricks,Drop null or NaN rows with DataFrameNaFunctions.drop,Documentation for the DataFrameNaFunctions.drop method in PySpark.,Returns a new DataFrame omitting rows with null or NaN values. DataFrame.dropna and DataFrameNaFunctions.drop are aliases of each other.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents drop behavior, aliasing with DataFrame.dropna, and null-handling semantics specific to Databricks PySpark.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframenafunctions/fill,fill,fill (DataFrameNaFunctions) - Azure Databricks,Fill null values with DataFrameNaFunctions.fill,Documentation for the DataFrameNaFunctions.fill method in PySpark.,Returns a new DataFrame in which null values are filled with a new value. DataFrame.fillna and DataFrameNaFunctions.fill are aliases of each other.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Explains fill behavior, aliasing with DataFrame.fillna, and accepted value types, which are concrete API semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframenafunctions/replace,replace,replace (DataFrameNaFunctions) - Azure Databricks,Replace values with DataFrameNaFunctions.replace,Documentation for the DataFrameNaFunctions.replace method in PySpark.,"Returns a new DataFrame replacing a value with another 
value. DataFrame.replace and DataFrameNaFunctions.replace are aliases of each other. Values for to_replace and value must have the same type and can only be numerics, booleans, or strings. value can be None. When replacing, the new value is cast to the type of the existing column.",2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Details type constraints, None handling, and casting behavior for replace, which are precise, product-specific API rules.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader,DataFrameReader class,DataFrameReader class - Azure Databricks,Load data with DataFrameReader in Azure Databricks,Documentation for the DataFrameReader class in PySpark.,"Interface used to load a DataFrame from external storage systems (e.g. file systems, key-value stores, etc). Supports Spark Connect",2026-04-17T08:00:00.000Z,reference,integrations,0.9,True,"Class reference listing methods, parameters, and behaviors for DataFrameReader, including Spark Connect support; this is detailed API integration knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/csv,csv,csv (DataFrameReader) - Azure Databricks,Read CSV files with DataFrameReader.csv in Databricks,Documentation for the DataFrameReader.csv method in PySpark.,"Loads a CSV file and returns the result as a DataFrame. If inferSchema is enabled, this function reads the input once to determine the schema. 
To avoid this, either disable inferSchema or specify the schema explicitly using schema.",2026-04-17T21:49:00.000Z,reference,integrations,0.9,True,"Method reference with options like inferSchema and schema, including behavior about extra passes over data; these are concrete API parameters and effects.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/excel,excel,excel (DataFrameReader) - Azure Databricks,Read Excel files using DataFrameReader.excel in Databricks,Documentation for the DataFrameReader.excel method in PySpark.,Loads Excel files and returns the result as a DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"API method for loading Excel into DataFrames with product-specific options and behavior, fitting integration patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/format,format,format (DataFrameReader) - Azure Databricks,Specify input data source format with DataFrameReader.format,Documentation for the DataFrameReader.format method in PySpark.,Specifies the input data source format.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents the format method and accepted values for configuring data sources, which are concrete configuration parameters for integrations.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/jdbc,jdbc,jdbc (DataFrameReader) - Azure Databricks,Read database tables via JDBC with DataFrameReader.jdbc,Documentation for the DataFrameReader.jdbc method in PySpark.,"Constructs a DataFrame representing the database table accessible via JDBC URL url. Partitions of the table are retrieved in parallel if either column or predicates is specified. 
If both column and predicates are specified, column takes precedence.",2026-04-17T21:49:00.000Z,reference,integrations,0.9,True,"Describes JDBC URL usage, partitioning via column/predicates, and precedence rules, which are detailed integration and configuration behaviors.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/json,json,json (DataFrameReader) - Azure Databricks,Read JSON and JSON Lines with DataFrameReader.json,Documentation for the DataFrameReader.json method in PySpark.,"Loads JSON files and returns the results as a DataFrame. JSON Lines (newline-delimited JSON) is supported by default. For JSON with one record per file, set the multiLine option to True. If schema is not specified, this function reads the input once to determine the input schema.",2026-04-17T21:49:00.000Z,reference,integrations,0.9,True,"Covers options like multiLine and schema inference behavior, which are specific API parameters and effects for JSON integration.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/load,load,load - Azure Databricks,Load data from sources with DataFrameReader.load,Documentation for the DataFrameReader.load method in PySpark.,Loads data from a data source and returns it as a DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,API reference for load with parameters and behavior for reading from various sources; this is product-specific integration detail.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/option,option,option (DataFrameReader) - Azure Databricks,Set single input option with DataFrameReader.option,Documentation for the DataFrameReader.option method in PySpark.,Adds an input option for the underlying data source.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents the option method for configuring data source parameters, including names and values, which is 
integration-focused configuration.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/options,options,options (DataFrameReader) - Azure Databricks,Set multiple input options with DataFrameReader.options,Documentation for the DataFrameReader.options method in PySpark.,Adds input options for the underlying data source.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,API for bulk-setting data source options; parameter names and usage are product-specific integration details.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/orc,orc,orc (DataFrameReader) - Azure Databricks,Read ORC files with DataFrameReader.orc in Databricks,Documentation for the DataFrameReader.orc method in PySpark.,Loads ORC files and returns the result as a DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Method reference for loading ORC with specific parameters and behavior, which is concrete integration knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/parquet,parquet,parquet (DataFrameReader) - Azure Databricks,Read Parquet files with DataFrameReader.parquet,Documentation for the DataFrameReader.parquet method in PySpark.,Loads Parquet files and returns the result as a DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"API method for Parquet loading with product-specific options and semantics, fitting integration patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/schema,schema,schema - Azure Databricks,Define input schema with DataFrameReader.schema,Documentation for the DataFrameReader.schema method in PySpark.,"Specifies the input schema. Some data sources (such as JSON) can infer the input schema automatically from data. 
By specifying the schema here, the underlying data source can skip the schema inference step, which speeds up data loading.",2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Explains schema parameter usage and its effect on skipping inference, which is specific API behavior for integrations.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/table,table,table (DataFrameReader) - Azure Databricks,Load tables as DataFrames with DataFrameReader.table,Documentation for the DataFrameReader.table method in PySpark.,Returns the specified table as a DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents how to reference tables and return DataFrames, including parameter usage; this is product-specific integration detail.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/text,text,text (DataFrameReader) - Azure Databricks,Read text files with DataFrameReader.text in Databricks,Documentation for the DataFrameReader.text method in PySpark.,"Loads text files and returns a DataFrame whose schema starts with a string column named value, followed by partitioned columns if any are present. Text files must be encoded as UTF-8. By default, each line in the text file is a new row in the resulting DataFrame.",2026-04-17T21:49:00.000Z,reference,integrations,0.9,True,"Specifies schema (value column), UTF-8 requirement, and line-to-row behavior, which are concrete API semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/xml,xml,xml (DataFrameReader) - Azure Databricks,Read XML files with DataFrameReader.xml in Databricks,Documentation for the DataFrameReader.xml method in PySpark.,"Loads an XML file and returns the result as a DataFrame. 
If schema is not specified, this function reads the input once to determine the input schema.",2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,API for XML loading with schema inference behavior; these are specific integration parameters and behaviors.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframestatfunctions,DataFrameStatFunctions class,DataFrameStatFunctions class - Azure Databricks,Use DataFrameStatFunctions for statistics in PySpark,"Learn about the DataFrameStatFunctions class in PySpark for computing statistics such as correlations, covariances, and frequency tables.",Functionality for statistical functions with a DataFrame. Supports Spark Connect,2026-04-17T08:00:00.000Z,reference,integrations,0.8,True,"Class reference for statistical functions, including supported operations and Spark Connect support.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframestatfunctions/approxquantile,approxQuantile,approxQuantile (DataFrameStatFunctions) - Azure Databricks,Compute approximate quantiles with approxQuantile,Documentation for the DataFrameStatFunctions.approxQuantile method in PySpark.,"Calculates the approximate quantiles of numerical columns of a DataFrame. The result of this algorithm has the following deterministic bound: if the DataFrame has N elements and if we request the quantile at probability p up to error err, then the algorithm will return a sample x from the DataFrame so that the exact rank of x is close to (p * N). More precisely, floor((p - err) * N) <= rank(x) <= ceil((p + err) * N). 
This method implements a variation of the Greenwald-Khanna algorithm with some speed optimiza",2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Describes deterministic error bounds and algorithm (Greenwald-Khanna variation), which are detailed, implementation-specific semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframestatfunctions/corr,corr,corr (DataFrameStatFunctions) - Azure Databricks,Calculate column correlation with DataFrameStatFunctions.corr,Documentation for the DataFrameStatFunctions.corr method in PySpark.,Calculates the correlation of two columns of a DataFrame as a double value. Currently only supports the Pearson Correlation Coefficient. DataFrame.corr and DataFrameStatFunctions.corr are aliases of each other.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents correlation computation, supported method (Pearson), and aliasing with DataFrame.corr.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframestatfunctions/cov,cov,cov (DataFrameStatFunctions) - Azure Databricks,Compute sample covariance with DataFrameStatFunctions.cov,Documentation for the DataFrameStatFunctions.cov method in PySpark.,"Calculates the sample covariance for the given columns, specified by their names, as a double value. DataFrame.cov and DataFrameStatFunctions.cov are aliases of each other.",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Explains covariance calculation semantics and aliasing with DataFrame.cov, which are concrete API behaviors.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframestatfunctions/crosstab,crosstab,crosstab (DataFrameStatFunctions) - Azure Databricks,Generate crosstab frequency tables with DataFrameStatFunctions.crosstab,Documentation for the DataFrameStatFunctions.crosstab method in PySpark.,"Computes a pair-wise frequency table of the given columns, also known as a contingency 
table. The first column of each row contains the distinct values of col1, and the column names are the distinct values of col2. The name of the first column is $col1_$col2. Pairs with no occurrences have a count of zero. DataFrame.crosstab and DataFrameStatFunctions.crosstab are aliases of each other.",2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Details output schema (first column naming, column names from distinct values, zero counts for missing pairs) which are precise API semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframestatfunctions/freqitems,freqItems,freqItems (DataFrameStatFunctions) - Azure Databricks,Use freqItems for frequent value detection in PySpark,Documentation for the DataFrameStatFunctions.freqItems method in PySpark.,"Finds frequent items for columns, possibly with false positives. Uses the frequent element count algorithm described by Karp, Schenker, and Papadimitriou. DataFrame.freqItems and DataFrameStatFunctions.freqItems are aliases of each other.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,API reference for DataFrameStatFunctions.freqItems with method-specific parameters and behavior that are product-specific integration details rather than generic concepts.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframestatfunctions/sampleby,sampleBy,sampleBy (DataFrameStatFunctions) - Azure Databricks,Stratified sampling with sampleBy in Azure Databricks,Documentation for the DataFrameStatFunctions.sampleBy method in PySpark.,Returns a stratified sample without replacement based on the fraction given on each stratum.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents DataFrameStatFunctions.sampleBy with concrete parameter usage for stratified sampling, which is product-specific API behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter,DataFrameWriter 
class,DataFrameWriter class - Azure Databricks,Write DataFrames with DataFrameWriter in Azure Databricks,Documentation for the DataFrameWriter class in PySpark.,"Interface used to write a DataFrame to external storage systems (e.g. file systems, key-value stores, etc). Supports Spark Connect",2026-04-17T08:00:00.000Z,reference,integrations,0.9,True,Class reference for DataFrameWriter with methods and options for writing to external systems; detailed integration and configuration surface.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/bucketby,bucketBy,bucketBy - Azure Databricks,Bucket output data with DataFrameWriter.bucketBy,Documentation for the DataFrameWriter.bucketBy method in PySpark.,"Buckets the output by the given columns. If specified, the output is laid out on the file system similar to Hive's bucketing scheme, but with a different bucket hash function and is not compatible with Hive's bucketing.",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents bucketBy parameters and behavior relative to Hive bucketing, which is specific API behavior for storage layout.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/clusterby,clusterBy,clusterBy (DataFrameWriter) - Azure Databricks,Cluster data for performance with DataFrameWriter.clusterBy,Documentation for the DataFrameWriter.clusterBy method in PySpark.,Clusters the data by the given columns to optimize query performance.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,API method for clustering output by columns; parameter usage and semantics are product-specific integration details.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/csv,csv,csv (DataFrameWriter) - Azure Databricks,Write CSV files with DataFrameWriter.csv in Databricks,Documentation for the DataFrameWriter.csv method in PySpark.,Saves the content of 
the DataFrame in CSV format at the specified path.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,API for CSV output with options and behavior specific to Databricks/Spark.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/excel,excel,excel (DataFrameWriter) - Azure Databricks,Write Excel files with DataFrameWriter.excel in Databricks,Documentation for the DataFrameWriter.excel method in PySpark.,Saves the content of the DataFrame in Excel format at the specified path.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Method reference for Excel output with product-specific parameters and semantics.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/format,format,format (DataFrameWriter) - Azure Databricks,Specify output data source with DataFrameWriter.format,Documentation for the DataFrameWriter.format method in PySpark.,Specifies the underlying output data source.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents format configuration for output, including accepted values and behavior, which is integration configuration detail.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/insertinto,insertInto,insertInto - Azure Databricks,Insert DataFrame rows into tables with DataFrameWriter.insertInto,Documentation for the DataFrameWriter.insertInto method in PySpark.,Inserts the content of the DataFrame into the specified table. 
Requires that the schema of the DataFrame is the same as the schema of the table.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,Requires schema equality and defines insert behavior; these are concrete integration rules.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/jdbc,jdbc,jdbc (DataFrameWriter) - Azure Databricks,Write DataFrames to databases via DataFrameWriter.jdbc,Documentation for the DataFrameWriter.jdbc method in PySpark.,Saves the content of the DataFrame to an external database table via JDBC.,2026-04-17T21:49:00.000Z,reference,integrations,0.9,True,API for JDBC writes with parameters and behavior specific to Databricks/Spark JDBC integration.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/json,json,json (DataFrameWriter) - Azure Databricks,Write JSON output with DataFrameWriter.json in Databricks,Documentation for the DataFrameWriter.json method in PySpark.,Saves the content of the DataFrame in JSON format (JSON Lines / newline-delimited JSON) at the specified path.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,API for writing JSON Lines with path and options; behavior is specific to this product’s integration.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/mode,mode,mode (DataFrameWriter) - Azure Databricks,Control overwrite behavior with DataFrameWriter.mode,Documentation for the DataFrameWriter.mode method in PySpark.,Specifies the behavior when data or table already exists.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Defines allowed mode values and their effects when data or tables exist, which are concrete API semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/option,option,option (DataFrameWriter) - Azure Databricks,Use DataFrameWriter.option for Databricks data 
sources,Documentation for the DataFrameWriter.option method in PySpark.,"Adds an output option for the underlying data source. For some available options, see Options.",2026-04-24T18:05:00.000Z,reference,integrations,0.7,True,API reference for DataFrameWriter.option with product-specific write options for Azure Databricks data sources; exposes concrete option names/behaviors that function as integration/configuration details beyond generic PySpark knowledge.,updated +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/options,options,options (DataFrameWriter) - Azure Databricks,Configure multiple DataFrameWriter.options in Databricks,Documentation for the DataFrameWriter.options method in PySpark.,Adds output options for the underlying data source.,2026-04-24T18:05:00.000Z,reference,integrations,0.7,True,API reference for DataFrameWriter.options showing how to set multiple output options for Databricks-backed data sources; includes specific option usage patterns that are product-specific integration details.,updated +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/orc,orc,orc (DataFrameWriter) - Azure Databricks,Write ORC files with DataFrameWriter.orc in Databricks,Documentation for the DataFrameWriter.orc method in PySpark.,Saves the content of the DataFrame in ORC format at the specified path.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,API method for ORC output with specific parameters and behavior.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/parquet,parquet,parquet (DataFrameWriter) - Azure Databricks,Write Parquet output with DataFrameWriter.parquet,Documentation for the DataFrameWriter.parquet method in PySpark.,Saves the content of the DataFrame in Parquet format at the specified path.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,Method reference for Parquet writes with product-specific 
semantics and options.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/partitionby,partitionBy,partitionBy (DataFrameWriter) - Azure Databricks,Partition output data with DataFrameWriter.partitionBy,Documentation for the DataFrameWriter.partitionBy method in PySpark.,"Partitions the output by the given columns on the file system. If specified, the output is laid out on the file system similar to Hive's partitioning scheme.",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Explains partitionBy parameters and layout similar to Hive partitioning, which is specific integration behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/save,save,save - Azure Databricks,Save DataFrames to data sources with DataFrameWriter.save,Documentation for the DataFrameWriter.save method in PySpark.,"Saves the contents of the DataFrame to a data source. The data source is specified by format and a set of options. If format is not specified, the default data source configured by spark.sql.sources.default is used.",2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Describes interaction of format and options, and default source via spark.sql.sources.default, which is concrete integration behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/saveastable,saveAsTable,saveAsTable - Azure Databricks,Save DataFrames as tables with DataFrameWriter.saveAsTable,Documentation for the DataFrameWriter.saveAsTable method in PySpark.,"Saves the content of the DataFrame as the specified table. If the table already exists, the behavior depends on the mode parameter (default is to throw an exception). 
When mode is 'overwrite', the schema of the DataFrame does not need to match the existing table schema.",2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Details mode behavior and schema matching rules when saving as tables, which are product-specific API semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/sortby,sortBy,sortBy - Azure Databricks,Sort bucketed output with DataFrameWriter.sortBy,Documentation for the DataFrameWriter.sortBy method in PySpark.,Sorts the output in each bucket by the given columns on the file system.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Describes sortBy behavior within buckets on the filesystem, which is concrete API semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/text,text,text (DataFrameWriter) - Azure Databricks,Write text files with DataFrameWriter.text in Databricks,Documentation for the DataFrameWriter.text method in PySpark.,Saves the content of the DataFrame in a text file at the specified path. 
Text files are encoded as UTF-8.,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Specifies UTF-8 encoding and path-based output behavior, which are concrete API details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/xml,xml,xml (DataFrameWriter) - Azure Databricks,Write XML files with DataFrameWriter.xml in Databricks,Documentation for the DataFrameWriter.xml method in PySpark.,Saves the content of the DataFrame in XML format at the specified path.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Method reference for XML output with product-specific options and semantics.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2,DataFrameWriterV2 class,DataFrameWriterV2 class - Azure Databricks,Write tables using DataFrameWriterV2 in Azure Databricks,Documentation for the DataFrameWriterV2 class in PySpark.,"Interface used to write a DataFrame to external storage using the v2 API. 
For most use cases with Databricks tables and Delta Lake, DataFrameWriterV2 provides more powerful and flexible options than the original DataFrameWriter. Supports Spark Connect",2026-04-17T08:00:00.000Z,reference,integrations,0.9,True,Class reference for v2 writer with capabilities for Databricks tables and Delta Lake; includes product-specific configuration surface.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/append,append,append - Azure Databricks,Append data to tables with DataFrameWriterV2.append,Documentation for the DataFrameWriterV2.append method in PySpark.,Appends the contents of the DataFrame to the output table.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"API method describing append semantics for v2 tables, which is specific integration behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/clusterby,clusterBy,clusterBy (DataFrameWriterV2) - Azure Databricks,Cluster data using DataFrameWriterV2.clusterBy,Documentation for the DataFrameWriterV2.clusterBy method in PySpark.,Clusters the data by the given columns to optimize query performance.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents a Databricks-specific writer method that influences query performance; concrete coding pattern for clustering.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/create,create,create - Azure Databricks,Create new tables with DataFrameWriterV2.create,Documentation for the DataFrameWriterV2.create method in PySpark.,"Creates a new table from the contents of the DataFrame. 
The new table's schema, partition layout, properties, and other configuration are based on the configuration set on this writer.",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Explains how table schema, partitioning, and properties derive from writer configuration, which is detailed product-specific API behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/createorreplace,createOrReplace,createOrReplace - Azure Databricks,Use DataFrameWriterV2.createOrReplace in Azure Databricks,Documentation for the DataFrameWriterV2.createOrReplace method in PySpark.,"Creates a new table or replaces an existing table with the contents of the DataFrame. The output table's schema, partition layout, properties, and other configuration are based on the contents of the DataFrame and the configuration set on this writer. If the table exists, its configuration and data are replaced.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,API reference for a specific PySpark/Databricks method with product-specific semantics and options; this is concrete integration/coding detail beyond generic LLM knowledge.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/option,option,option (DataFrameWriterV2) - Azure Databricks,Use DataFrameWriterV2.option for Databricks writes,Documentation for the DataFrameWriterV2.option method in PySpark.,"Adds a write option for the underlying data source. 
For some available options, see Options.",2026-04-24T18:05:00.000Z,reference,integrations,0.7,True,API reference for DataFrameWriterV2.option with Databricks-specific write options; documents concrete option names/semantics for integrating with underlying Databricks data sources.,updated +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/options,options,options (DataFrameWriterV2) - Azure Databricks,Configure DataFrameWriterV2 options in Azure Databricks,Documentation for the DataFrameWriterV2.options method in PySpark.,Adds write options for the underlying data source.,2026-04-24T18:05:00.000Z,reference,integrations,0.78,True,"The API reference for DataFrameWriterV2.options lists product-specific write options for Azure Databricks PySpark, including exact option names and their behaviors for underlying data sources. This is detailed integration/configuration knowledge tied to Databricks' Spark runtime and data source connectors, which goes beyond generic LLM training. 
It fits best under integrations because it documents concrete parameters for writing to external storage systems via Databricks.",updated +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/overwrite,overwrite,overwrite - Azure Databricks,Use DataFrameWriterV2.overwrite for conditional updates,Documentation for the DataFrameWriterV2.overwrite method in PySpark.,Overwrites rows matching the given filter condition with the contents of the DataFrame in the output table.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents a concrete writer API (overwrite with filter) specific to Databricks PySpark, including behavior details that are product-specific.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/overwritepartitions,overwritePartitions,overwritePartitions - Azure Databricks,Use overwritePartitions with DataFrameWriterV2,Documentation for the DataFrameWriterV2.overwritePartitions method in PySpark.,"Overwrites all partitions for which the DataFrame contains at least one row with the contents of the DataFrame in the output table. This operation is equivalent to Hive's INSERT OVERWRITE ... PARTITION, which replaces partitions dynamically depending on the contents of the DataFrame.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Describes Databricks-specific partition overwrite behavior equivalent to Hive INSERT OVERWRITE PARTITION, which is concrete integration behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/partitionedby,partitionedBy,partitionedBy - Azure Databricks,Partition tables with DataFrameWriterV2.partitionedBy,Documentation for the DataFrameWriterV2.partitionedBy method in PySpark.,"Partitions the output table created by create, createOrReplace, or replace using the given columns or transforms. When specified, the table data is stored by these values for efficient reads. 
For example, when a table is partitioned by day, it may be stored in a directory layout like: Partitioning is one of the most widely used techniques to optimize physical data layout. It provides a coarse-grained index for skipping unnecessary data reads when queries have predicates on the partitioned columns. F",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Explains how partitionedBy affects physical layout and query skipping in Databricks; concrete product-specific partitioning behavior.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/replace,replace,replace (DataFrameWriterV2) - Azure Databricks,Replace tables using DataFrameWriterV2.replace,Documentation for the DataFrameWriterV2.replace method in PySpark.,"Replaces an existing table with the contents of the DataFrame. The existing table's schema, partition layout, properties, and other configuration are replaced with the contents of the DataFrame and the configuration set on this writer.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Method-level reference for replacing tables with Databricks PySpark, including how schema and properties are handled; product-specific coding pattern.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/tableproperty,tableProperty,tableProperty - Azure Databricks,Add table properties via tableProperty(),Documentation for the DataFrameWriterV2.tableProperty method in PySpark.,Adds a table property.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Method-level reference for setting table properties on Databricks tables through DataFrameWriterV2; specific integration/config behavior.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/using,using,using - Azure Databricks,Specify data source providers with using(),Documentation for the DataFrameWriterV2.using method in PySpark.,"Specifies a 
provider for the underlying output data source. Spark's default catalog supports ""parquet"", ""json"", and other built-in formats.",2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Documents DataFrameWriterV2.using with supported provider strings (parquet, json, etc.) in Databricks; concrete API usage detail.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasource,DataSource class,DataSource - Azure Databricks,Implement custom PySpark DataSource in Databricks,A base class for custom PySpark data sources that enables reading from and writing to external data systems.,"A base class for data sources. This class represents a custom data source that allows for reading from and/or writing to it. The data source provides methods to create readers and writers for reading and writing data, respectively. At least one of the methods reader() or writer() must be implemented by any subclass to make the data source either readable or writable (or both). After implementing this interface, you can load your data source using spark.read.format(...).load() and save data using df.wri",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Defines the base DataSource class and required methods (reader, writer) for custom sources; detailed integration contract unique to Databricks PySpark.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasource/name,name,name (DataSource) - Azure Databricks,Define custom data source name() identifier,Documentation for the DataSource.name method in PySpark.,"Returns a string representing the format name of this data source. By default, it is the class name of the data source. 
It can be overridden to provide a customized short name for the data source.",2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Documents how DataSource.name determines the format string used in spark.read.format; specific to Databricks custom sources.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasource/reader,reader,reader - Azure Databricks,Provide a DataSource.reader implementation,Documentation for the DataSource.reader method in PySpark.,Returns a DataSourceReader instance for reading data. The implementation is required for readable data sources.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Explains the requirement and behavior of DataSource.reader for readable custom sources; concrete integration contract.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasource/schema,schema,schema (DataSource) - Azure Databricks,Infer custom data source schema() in Databricks,Documentation for the DataSource.schema method in PySpark.,"Returns the schema of the data source. It can refer to any field initialized in the __init__ method to infer the data source's schema when users do not explicitly specify it. This method is invoked once when calling spark.read.format(...).load() to get the schema for a data source read operation. 
If this method is not implemented, and a user does not provide a schema when reading the data source, an exception will be thrown.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Details how schema() is used when users omit schema in spark.read.format(...).load(); product-specific API behavior.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasource/simplestreamreader,simpleStreamReader,simpleStreamReader - Azure Databricks,Implement simpleStreamReader for custom sources,Documentation for the DataSource.simpleStreamReader method in PySpark.,Returns a SimpleDataSourceStreamReader instance for reading data.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents DataSource.simpleStreamReader returning SimpleDataSourceStreamReader; specific streaming integration API.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasource/streamreader,streamReader,streamReader - Azure Databricks,Implement streamReader for streaming data sources,Documentation for the DataSource.streamReader method in PySpark.,Returns a DataSourceStreamReader instance for reading streaming data. Added in Databricks Runtime 15.2,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Describes DataSource.streamReader for streaming reads, including Databricks Runtime version; concrete streaming integration contract.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasource/streamwriter,streamWriter,streamWriter - Azure Databricks,Implement streamWriter for streaming sinks,Documentation for the DataSource.streamWriter method in PySpark.,Returns a DataSourceStreamWriter instance for writing data into a streaming sink. 
The implementation is required for writable streaming data sources.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents DataSource.streamWriter returning DataSourceStreamWriter; product-specific streaming sink integration.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasource/writer,writer,writer - Azure Databricks,Provide DataSource.writer for custom sinks,Documentation for the DataSource.writer method in PySpark.,Returns a DataSourceWriter instance for writing data. The implementation is required for writable data sources. Added in Databricks Runtime 14.3 LTS,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Explains DataSource.writer requirements and Databricks Runtime version; concrete integration API for writable sources.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcearrowwriter,DataSourceArrowWriter class,DataSourceArrowWriter - Azure Databricks,Use DataSourceArrowWriter for Arrow-based sinks,A base class for PySpark data source writers that uses PyArrow RecordBatch for better performance when writing data.,"A base class for data source writers that process data using PyArrow's RecordBatch. Unlike DataSourceWriter, which works with an iterator of Spark Row objects, this class is optimized for the Arrow format when writing data. It can offer better performance when interfacing with systems or libraries that natively support Arrow. 
Implement this class and return an instance from DataSource.writer() to make a data source writable using Arrow.",2026-04-17T08:00:00.000Z,reference,integrations,0.8,True,Defines a specialized writer using PyArrow RecordBatch instead of Spark Row; detailed integration pattern unique to Databricks.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcearrowwriter/write,write,write (DataSourceArrowWriter) - Azure Databricks,Implement write() in DataSourceArrowWriter,Documentation for the DataSourceArrowWriter.write method in PySpark.,"Writes an iterator of PyArrow RecordBatch objects to the sink. This method is called once on each executor to write data to the data source. It accepts an iterator of PyArrow RecordBatch objects and returns a single row representing a commit message, or None if there is no commit message. The driver collects commit messages, if any, from all executors and passes them to the commit() method if all tasks run successfully. If any task fails, the abort() method will be called with the collected commit message",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Describes executor-side write semantics, commit messages, and driver commit/abort flow for Arrow writers; concrete product-specific behavior.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcereader,DataSourceReader class,DataSourceReader - Azure Databricks,Implement DataSourceReader for custom inputs,A base class for PySpark data source readers responsible for outputting data from a custom data source.,A base class for data source readers. Data source readers are responsible for outputting data from a data source. 
Implement this class and return an instance from DataSource.reader() to make a data source readable.,2026-04-17T08:00:00.000Z,reference,integrations,0.8,True,Base class contract for custom readers in Databricks PySpark; defines how data is output from a source.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcereader/partitions,partitions,partitions (DataSourceReader) - Azure Databricks,Define partitions() in DataSourceReader for parallelism,Documentation for the DataSourceReader.partitions method in PySpark.,"Returns a sequence of partitions for this data source. Partitions are used to split data reading operations into parallel tasks. If this method returns N partitions, the query planner will create N tasks. Each task will execute read() in parallel, using the respective partition value to read the data. This method is called once during query planning. By default, it returns a single partition with the value None. Subclasses can override this method to return multiple partitions. It's recommended to ",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Explains how partitions() controls task parallelism and default behavior; concrete integration and performance pattern.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcereader/pushfilters,pushFilters,pushFilters (DataSourceReader) - Azure Databricks,Implement pushFilters() for filter pushdown,Documentation for the DataSourceReader.pushFilters method in PySpark.,"Called with the list of filters that can be pushed down to the data source. The list of filters should be interpreted as the AND of the elements. Filter pushdown allows the data source to handle a subset of filters. This can improve performance by reducing the amount of data that needs to be processed by Spark. This method is called once during query planning. By default, it returns all filters, indicating that no filters can be pushed down. 
Subclasses can override this method to implement filte",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Details how filters are passed and interpreted (AND semantics) and default behavior; product-specific optimization API.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcereader/read,read,read (DataSourceReader) - Azure Databricks,Implement read() in DataSourceReader partitions,Documentation for the DataSourceReader.read method in PySpark.,Generates data for a given partition and returns an iterator of tuples or rows. This method is invoked once per partition to read the data. Implementing this method is required for readable data sources. You can initialize any non-serializable resources required for reading data from the data source within this method.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Describes per-partition read semantics and resource initialization; concrete integration contract for custom readers.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourceregistration,DataSourceRegistration class,DataSourceRegistration - Azure Databricks,Register custom DataSource with DataSourceRegistration,"A wrapper for registering Python user-defined data sources in PySpark, accessible via spark.dataSource.",A wrapper for data source registration. This instance can be accessed via spark.dataSource. 
Use it to register a custom DataSource subclass so it can be referenced by name in spark.read.format() and df.write.format().,2026-04-17T08:00:00.000Z,reference,integrations,0.75,True,Explains spark.dataSource wrapper and how to register Python DataSource subclasses for use in spark.read/df.write; product-specific registration API.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourceregistration/register,register,register (DataSourceRegistration) - Azure Databricks,Use register() to expose custom data sources,Documentation for the DataSourceRegistration.register method in PySpark.,Registers a Python user-defined data source.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Method-level reference for DataSourceRegistration.register; concrete integration step to make sources available by name.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamarrowwriter,DataSourceStreamArrowWriter class,DataSourceStreamArrowWriter - Azure Databricks,Use DataSourceStreamArrowWriter for streaming sinks,A base class for PySpark data stream writers that uses PyArrow RecordBatch for better performance when writing streaming data.,"A base class for data stream writers that process data using PyArrow's RecordBatch. Unlike DataSourceStreamWriter, which works with an iterator of Spark Row objects, this class is optimized for the Arrow format when writing streaming data. It can offer better performance when interfacing with systems or libraries that natively support Arrow for streaming use cases. 
Implement this class and return an instance from DataSource.streamWriter() to make a data source writable as a streaming sink using Arrow.",2026-04-17T08:00:00.000Z,reference,integrations,0.8,True,Defines Arrow-based streaming writer semantics for Databricks; specialized integration pattern for streaming with PyArrow.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamarrowwriter/write,write,write (DataSourceStreamArrowWriter) - Azure Databricks,Implement write() in DataSourceStreamArrowWriter,Documentation for the DataSourceStreamArrowWriter.write method in PySpark.,"Writes an iterator of PyArrow RecordBatch objects to the streaming sink. This method is called on executors to write data to the streaming data sink in each microbatch. It accepts an iterator of PyArrow RecordBatch objects and returns a single row representing a commit message, or None if there is no commit message. The driver collects commit messages, if any, from all executors and passes them to the commit() method if all tasks run successfully. If any task fails, the abort() method will be called with ",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Describes microbatch write semantics, commit messages, and driver commit/abort flow for Arrow streaming writers; product-specific behavior.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamreader,DataSourceStreamReader class,DataSourceStreamReader - Azure Databricks,Implement DataSourceStreamReader for streaming inputs,A base class for PySpark streaming data source readers responsible for reading data from a streaming data source.,A base class for streaming data source readers. Data source stream readers are responsible for outputting data from a streaming data source. Implement this class and return an instance from DataSource.streamReader() to make a data source readable as a streaming source. 
Added in Databricks Runtime 15.2,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Base class contract for streaming readers in Databricks, including Runtime version; concrete streaming integration API.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamreader/commit,commit,commit (DataSourceStreamReader) - Azure Databricks,Handle commit() in DataSourceStreamReader offsets,Documentation for the DataSourceStreamReader.commit method in PySpark.,Informs the source that Spark has completed processing all data for offsets less than or equal to end and will only request offsets greater than end in the future. Added in Databricks Runtime 15.2,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Explains how commit signals processed offsets and future offset requests; detailed streaming offset management behavior.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamreader/getdefaultreadlimit,getDefaultReadLimit,getDefaultReadLimit - Azure Databricks,Use getDefaultReadLimit in streaming sources,Documentation for the DataSourceStreamReader.getDefaultReadLimit method in PySpark.,"Returns the read limit potentially passed to the data source through options when creating the data source. Implementing this method is optional. 
By default, it returns ReadAllAvailable, which means there is no limit on the amount of data returned by latestOffset().",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Documents default ReadAllAvailable behavior and how options can pass read limits; product-specific streaming configuration API.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamreader/initialoffset,initialOffset,initialOffset (DataSourceStreamReader) - Azure Databricks,Define initialOffset() for streaming data sources,Documentation for the DataSourceStreamReader.initialOffset method in PySpark.,"Returns the initial offset of the streaming data source. A new streaming query starts reading data from the initial offset. If Spark is restarting an existing query, it will restart from the checkpointed offset rather than the initial one. Added in Databricks Runtime 15.2",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Explains how initial offsets are used vs checkpointed offsets on restart; concrete streaming integration behavior.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamreader/latestoffset,latestOffset,latestOffset - Azure Databricks,Implement latestOffset() with read limits,Documentation for the DataSourceStreamReader.latestOffset method in PySpark.,"Returns the most recent offset available given a read limit. The start offset can be used to determine how much new data should be read given the limit. For the very first microbatch, start is provided from the return value of initialOffset(). For subsequent microbatches, it continues from the last microbatch. The source can return the same offset as the start offset if there is no data to process. ReadLimit can be used by the source to limit the amount of data returned. 
Implement getDefaultReadLimit()",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Details interaction between start offset, read limits, and no-data scenarios; product-specific streaming offset API.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamreader/partitions,partitions,partitions (DataSourceStreamReader) - Azure Databricks,Define partitions() for DataSourceStreamReader,Documentation for the DataSourceStreamReader.partitions method in PySpark.,"Returns a list of InputPartition objects given the start and end offsets. Each InputPartition represents a data split that can be processed by one Spark task. When called with an empty offset range where start == end, this method should return an empty sequence. Added in Databricks Runtime 15.2",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Explains how InputPartition list is derived from start/end offsets and empty-range behavior; concrete streaming partitioning contract.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamreader/read,read,read (DataSourceStreamReader) - Azure Databricks,Implement read() in DataSourceStreamReader,Documentation for the DataSourceStreamReader.read method in PySpark.,Generates data for a given partition and returns an iterator of tuples or rows. This method is invoked once per partition to read the data. Implementing this method is required for stream readers. You can initialize any non-serializable resources required for reading data from the data source within this method. 
Added in Databricks Runtime 15.2,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Describes per-partition streaming read semantics and resource initialization; product-specific streaming integration behavior.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamreader/reportlatestoffset,reportLatestOffset,reportLatestOffset - Azure Databricks,Report latest offsets with reportLatestOffset(),Documentation for the DataSourceStreamReader.reportLatestOffset method in PySpark.,Returns the most recent offset available. The information is used to report the latest offset in the streaming query status. The source can return None if there is no data to process or if the source does not support this method.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Explains how latest offsets are reported for query status and when None is allowed; detailed streaming status integration.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamreader/stop,stop,stop (DataSourceStreamReader) - Azure Databricks,Stop streaming sources with stop() in DataSourceStreamReader,Documentation for the DataSourceStreamReader.stop method in PySpark.,Stops this source and frees any resources it has allocated. Invoked when the streaming query is terminated. Added in Databricks Runtime 15.2,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents lifecycle method to free resources on query termination; concrete streaming integration lifecycle behavior.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamwriter,DataSourceStreamWriter class,DataSourceStreamWriter - Azure Databricks,Implement DataSourceStreamWriter for streaming sinks,A base class for PySpark data stream writers responsible for writing streaming data to a sink microbatch by microbatch.,"A base class for data stream writers. 
Data stream writers are responsible for writing data to a streaming sink. Implement this class and return an instance from DataSource.streamWriter() to make a data source writable as a streaming sink. write() is called on executors for each microbatch, and commit() or abort() is called on the driver after all tasks in the microbatch complete.",2026-04-17T08:00:00.000Z,reference,integrations,0.8,True,"Base class contract for streaming writers, including write/commit/abort semantics per microbatch; detailed Databricks integration pattern.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamwriter/abort,abort,abort (DataSourceStreamWriter) - Azure Databricks,Handle abort() in DataSourceStreamWriter failures,Documentation for the DataSourceStreamWriter.abort method in PySpark.,Aborts this microbatch due to task failures. This method is invoked on the driver when one or more tasks failed. The commit messages are collected from the write() method call from each task and passed to this method. The implementation should use the commit messages to abort the microbatch in the streaming sink.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Explains how commit messages from tasks are used to abort microbatches on driver; product-specific failure-handling behavior.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamwriter/commit,commit,commit (DataSourceStreamWriter) - Azure Databricks,Commit microbatches with DataSourceStreamWriter.commit,Documentation for the DataSourceStreamWriter.commit method in PySpark.,Commits this microbatch with a list of commit messages. This method is invoked on the driver when all tasks run successfully. The commit messages are collected from the write() method call from each task and passed to this method. 
The implementation should use the commit messages to commit the microbatch in the streaming sink.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,Describes driver-side commit semantics using commit messages from write() tasks; concrete streaming sink integration behavior.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamwriter/write,write,write (DataSourceStreamWriter) - Azure Databricks,Implement custom streaming sinks with DataSourceStreamWriter.write,Documentation for the DataSourceStreamWriter.write method in PySpark.,"Writes data into the streaming sink. This method is called on executors to write data to the streaming data sink in each microbatch. It accepts an iterator of input data and returns a single row representing a commit message, or None if there is no commit message. The driver collects commit messages, if any, from all executors and passes them to the commit() method if all tasks run successfully. If any task fails, the abort() method will be called with the collected commit messages.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Method-level reference for a Databricks-specific PySpark streaming writer hook, including commit-message behavior between executors and driver; this is product- and API-specific integration behavior not reliably known from pretraining.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcewriter,DataSourceWriter class,DataSourceWriter - Azure Databricks,Implement custom batch data sources with DataSourceWriter,A base class for PySpark data source writers responsible for saving data to a custom data source in batch mode.,A base class for data source writers. Data source writers are responsible for saving data to a data source. Implement this class and return an instance from DataSource.writer() to make a data source writable. 
Added in Databricks Runtime 14.3 LTS,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Describes a Databricks Runtime–specific base class for custom data source writers and how it is returned from DataSource.writer(); this is concrete API surface and lifecycle behavior for integrations.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcewriter/abort,abort,abort - Azure Databricks,Handle failed batch writes with DataSourceWriter.abort,Documentation for the DataSourceWriter.abort method in PySpark.,Aborts this writing job due to task failures. This method is invoked on the driver when one or more tasks failed. The commit messages are collected from the write() method call from each task and passed to this method. The implementation should use the commit messages to abort the writing job to the data source. Added in Databricks Runtime 14.3 LTS,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Documents a specific driver-side abort hook, including how commit messages from write() are passed in on failure; this is detailed, product-specific writer lifecycle behavior.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcewriter/commit,commit,commit - Azure Databricks,Commit successful batch writes with DataSourceWriter.commit,Documentation for the DataSourceWriter.commit method in PySpark.,Commits this writing job with a list of commit messages. This method is invoked on the driver when all tasks run successfully. The commit messages are collected from the write() method call from each task and passed to this method. The implementation should use the commit messages to commit the writing job to the data source. 
Added in Databricks Runtime 14.3 LTS,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Explains the commit hook semantics and how commit messages from executors are used on the driver; this is concrete API lifecycle behavior for custom data source integrations.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcewriter/write,write,write - Azure Databricks,Write partition data in custom sources with DataSourceWriter.write,Documentation for the DataSourceWriter.write method in PySpark.,"Writes data into the data source. This method is called once on each executor to write data to the data source. It accepts an iterator of input data and returns a single row representing a commit message, or None if there is no commit message. The driver collects commit messages, if any, from all executors and passes them to the commit() method if all tasks run successfully. If any task fails, the abort() method will be called with the collected commit messages. Added in Databricks Runtime 14.3 LTS",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Details executor-side write behavior and commit-message contract for custom batch data sources; this is specific integration API behavior.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader,DataStreamReader class,DataStreamReader - Azure Databricks,Load streaming DataFrames with DataStreamReader,Learn about the DataStreamReader class in PySpark,"Interface used to load a streaming DataFrame from external storage systems (for example, file systems and key-value stores). 
Use spark.readStream to access this.",2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,"Product-specific interface for loading streaming DataFrames via spark.readStream, including supported external systems; this is concrete API usage for integrations.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/changes,changes,changes (DataStreamReader) - Azure Databricks,Read table change data capture with DataStreamReader.changes,Documentation for the DataStreamReader.changes method in PySpark.,Returns the row-level changes (Change Data Capture) from the specified table as a streaming DataFrame. Currently only supported for Data Source V2 tables whose catalog implements TableCatalog.loadChangelog(). Use option() to specify the starting version or timestamp and processing options.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents a Databricks-specific streaming CDC API, including requirement for TableCatalog.loadChangelog() and use of options for starting version/timestamp; this is detailed integration behavior.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/csv,csv,csv (DataStreamReader) - Azure Databricks,Stream CSV data into DataFrames with DataStreamReader.csv,Documentation for the DataStreamReader.csv method in PySpark.,"Loads a CSV file stream and returns the result as a DataFrame. If inferSchema is enabled, the function goes through the input once to determine the schema. 
To avoid this pass, disable inferSchema or specify the schema explicitly using schema.",2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,API reference for streaming CSV with schema inference behavior and options like inferSchema and schema; these are concrete parameters and semantics for this product’s streaming integration.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/excel,excel,excel (DataStreamReader) - Azure Databricks,Stream Excel files into DataFrames with DataStreamReader.excel,Documentation for the DataStreamReader.excel method in PySpark.,Loads an Excel file stream and returns the result as a DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.55,True,Documents a specific streaming reader for Excel format in Databricks PySpark; this is concrete integration API surface.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/format,format,format (DataStreamReader) - Azure Databricks,Specify streaming input formats with DataStreamReader.format,Documentation for the DataStreamReader.format method in PySpark.,Specifies the input data source format.,2026-04-17T21:49:00.000Z,reference,integrations,0.55,True,API for selecting underlying streaming data source formats; this is configuration of integration endpoints rather than conceptual content.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/json,json,json (DataStreamReader) - Azure Databricks,Stream JSON data with DataStreamReader.json,Documentation for the DataStreamReader.json method in PySpark.,"Loads a JSON file stream and returns the results as a DataFrame. JSON Lines (newline-delimited JSON) is supported by default. For JSON with one record per file, set the multiLine option to true. 
If schema is not specified, the input schema is inferred from the data.",2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,"Describes streaming JSON behavior including JSON Lines support, multiLine option, and schema inference; these are concrete, product-specific API semantics.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/load,load,load (DataStreamReader) - Azure Databricks,Load generic streaming sources with DataStreamReader.load,Documentation for the DataStreamReader.load method in PySpark.,Loads a data stream from a data source and returns it as a DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.55,True,Core API for loading a data stream from configured sources; this is concrete integration behavior for Databricks streaming.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/name,name,name (DataStreamReader) - Azure Databricks,Name streaming sources for checkpoint evolution with DataStreamReader.name,Documentation for the DataStreamReader.name method in PySpark.,"Assigns a name to the streaming source for checkpoint evolution. This allows streaming queries to evolve by enabling sources to be reordered or added without breaking checkpoint compatibility. 
When source evolution is enabled, all sources in a query must be named.",2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Documents a Databricks-specific feature (source naming for checkpoint evolution) and its requirement that all sources be named when evolution is enabled; this is nuanced, product-specific behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/option,option,option (DataStreamReader) - Azure Databricks,Configure single streaming input option with DataStreamReader.option,Documentation for the DataStreamReader.option method in PySpark.,Adds an input option for the underlying data source.,2026-04-17T21:49:00.000Z,reference,configuration,0.6,True,API for setting named input options on streaming sources; this is a configuration surface with specific parameter semantics for this product.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/options,options,options (DataStreamReader) - Azure Databricks,Configure multiple streaming input options with DataStreamReader.options,Documentation for the DataStreamReader.options method in PySpark.,Adds multiple input options for the underlying data source.,2026-04-17T21:49:00.000Z,reference,configuration,0.6,True,Allows setting multiple named options for streaming sources; this is structured configuration for Databricks streaming integrations.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/orc,orc,orc (DataStreamReader) - Azure Databricks,Stream ORC data into DataFrames with DataStreamReader.orc,Documentation for the DataStreamReader.orc method in PySpark.,Loads an ORC file stream and returns the result as a DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.55,True,API reference for streaming ORC files; this is concrete integration behavior for a specific format.,unchanged 
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/parquet,parquet,parquet (DataStreamReader) - Azure Databricks,Stream Parquet data into DataFrames with DataStreamReader.parquet,Documentation for the DataStreamReader.parquet method in PySpark.,Loads a Parquet file stream and returns the result as a DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.55,True,API reference for streaming Parquet files; this is concrete integration behavior for a specific format.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/schema,schema,schema (DataStreamReader) - Azure Databricks,Define streaming input schemas with DataStreamReader.schema,Documentation for the DataStreamReader.schema method in PySpark.,"Specifies the input schema. Some data sources (for example, JSON) can infer the input schema automatically from data. Specifying the schema here allows the data source to skip schema inference and speed up data loading.",2026-04-17T21:49:00.000Z,reference,configuration,0.6,True,Explains specifying schemas vs inference for streaming sources and performance implications; this is product-specific configuration behavior.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/table,table,table (DataStreamReader) - Azure Databricks,Create streaming DataFrames from tables with DataStreamReader.table,Documentation for the DataStreamReader.table method in PySpark.,Defines a streaming DataFrame on a table. 
The data source corresponding to the table must support streaming mode.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,API for defining a streaming DataFrame on a table with requirement that the table’s data source support streaming; this is concrete integration behavior.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/text,text,text (DataStreamReader) - Azure Databricks,Stream text files into DataFrames with DataStreamReader.text,Documentation for the DataStreamReader.text method in PySpark.,"Loads a text file stream and returns a DataFrame whose schema starts with a string column named value, followed by any partitioned columns. Text files must be encoded as UTF-8. Each line in the text file is a new row in the resulting DataFrame by default.",2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,"Documents UTF-8 requirement, schema (value column plus partitions), and line-to-row mapping; these are specific integration semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/xml,xml,xml (DataStreamReader) - Azure Databricks,Stream XML data into DataFrames with DataStreamReader.xml,Documentation for the DataStreamReader.xml method in PySpark.,"Loads an XML file stream and returns the result as a DataFrame. If schema is not specified, the input schema is inferred from the data.",2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,API for streaming XML with schema inference behavior; this is concrete integration behavior for a specific format.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter,DataStreamWriter class,DataStreamWriter - Azure Databricks,Write streaming DataFrames with DataStreamWriter,Learn about the DataStreamWriter class in PySpark,"Interface used to write a streaming DataFrame to external storage systems (for example, file systems and key-value stores). 
Use df.writeStream to access this.",2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Product-specific interface for writing streaming DataFrames to external systems via df.writeStream; this is concrete integration API surface.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/clusterby,clusterBy,clusterBy (DataStreamWriter) - Azure Databricks,Configure clustered output files with DataStreamWriter.clusterBy,Documentation for the DataStreamWriter.clusterBy method in PySpark.,"Clusters the output by the given columns. Records with similar values on the clustering columns are grouped together in the same file. Clustering improves query efficiency by allowing queries with predicates on the clustering columns to skip unnecessary data. Unlike partitioning, clustering can be used on high-cardinality columns.",2026-04-17T21:49:00.000Z,reference,configuration,0.65,True,Explains clustering semantics vs partitioning for streaming output and its impact on query efficiency; this is product-specific configuration behavior.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/foreach,foreach,foreach (DataStreamWriter) - Azure Databricks,Process streaming rows with custom foreach writers,Documentation for the DataStreamWriter.foreach method in PySpark.,"Sets the output of the streaming query to be processed using the provided writer. 
The processing logic can be specified as a function that takes a row as input, or as an object with process(row) and optional open(partition_id, epoch_id) and close(error) methods.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents DataStreamWriter.foreach semantics, including process(row), open(partition_id, epoch_id), and close(error) hooks; this is detailed integration and coding pattern behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/foreachbatch,foreachBatch,foreachBatch (DataStreamWriter) - Azure Databricks,Process micro-batches with DataStreamWriter.foreachBatch,Documentation for the DataStreamWriter.foreachBatch method in PySpark.,"Sets the output of the streaming query to be processed using the provided function. Supported only in micro-batch execution mode (that is, when the trigger is not continuous). In every micro-batch, the provided function is called with the output rows as a DataFrame and the batch identifier. 
The batch ID can be used to deduplicate and transactionally write the output to external systems.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Explains foreachBatch semantics, micro-batch-only support, and use of batch ID for deduplication and transactional writes; this is nuanced, product-specific integration behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/format,format,format (DataStreamWriter) - Azure Databricks,Select streaming output data source with DataStreamWriter.format,Documentation for the DataStreamWriter.format method in PySpark.,Specifies the underlying output data source.,2026-04-17T21:49:00.000Z,reference,integrations,0.55,True,API for specifying the underlying sink format; this is integration configuration for output.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/option,option,option (DataStreamWriter) - Azure Databricks,Configure single streaming output option with DataStreamWriter.option,Documentation for the DataStreamWriter.option method in PySpark.,Adds an output option for the underlying data source.,2026-04-17T21:49:00.000Z,reference,configuration,0.6,True,API for setting named output options on streaming sinks; this is configuration surface for integrations.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/options,options,options (DataStreamWriter) - Azure Databricks,Configure multiple streaming output options with DataStreamWriter.options,Documentation for the DataStreamWriter.options method in PySpark.,Adds multiple output options for the underlying data source.,2026-04-17T21:49:00.000Z,reference,configuration,0.6,True,Allows setting multiple named options for streaming sinks; this is structured configuration behavior.,unchanged 
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/outputmode,outputMode,outputMode (DataStreamWriter) - Azure Databricks,Control streaming sink semantics with DataStreamWriter.outputMode,Documentation for the DataStreamWriter.outputMode method in PySpark.,Specifies how data of a streaming DataFrame is written to a streaming sink.,2026-04-17T21:49:00.000Z,reference,configuration,0.65,True,Specifies how streaming data is written to sinks (output modes); this is product-specific configuration of streaming behavior.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/partitionby,partitionBy,partitionBy (DataStreamWriter) - Azure Databricks,Partition streaming output by columns with DataStreamWriter.partitionBy,Documentation for the DataStreamWriter.partitionBy method in PySpark.,Partitions the output by the given columns on the file system. The output is laid out similar to Hive's partitioning scheme.,2026-04-17T21:49:00.000Z,reference,configuration,0.6,True,Describes partitioning layout similar to Hive for streaming outputs; this is concrete configuration behavior.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/queryname,queryName,queryName (DataStreamWriter) - Azure Databricks,Name streaming queries with DataStreamWriter.queryName,Documentation for the DataStreamWriter.queryName method in PySpark.,Specifies the name of the StreamingQuery that can be started with start(). 
This name must be unique among all currently active queries in the associated SparkSession.,2026-04-17T21:49:00.000Z,reference,configuration,0.6,True,Specifies unique query names for StreamingQuery instances; this is product-specific configuration requirement.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/start,start,start (DataStreamWriter) - Azure Databricks,Start streaming queries with DataStreamWriter.start,Documentation for the DataStreamWriter.start method in PySpark.,Streams the contents of the DataFrame to a data source and returns a StreamingQuery object.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,API for starting streaming writes and obtaining a StreamingQuery; this is concrete integration lifecycle behavior.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/table,table,table (DataStreamWriter) - Azure Databricks,Write streaming results to tables with DataStreamWriter.table,Documentation for the DataStreamWriter.table method in PySpark.,"Alias for toTable(). Starts the execution of the streaming query, continually outputting results to the given table as new data arrives.",2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Alias for toTable() that starts continuous output to a table; this is specific integration behavior.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/totable,toTable,toTable (DataStreamWriter) - Azure Databricks,Stream DataFrame output into tables with DataStreamWriter.toTable,Documentation for the DataStreamWriter.toTable method in PySpark.,"Starts the execution of the streaming query, continually outputting results to the given table as new data arrives. 
Returns a StreamingQuery object.",2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Documents starting a streaming query that writes to a table and returns a StreamingQuery; this is integration lifecycle behavior.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/trigger,trigger,trigger (DataStreamWriter) - Azure Databricks,Configure streaming triggers with DataStreamWriter.trigger,Documentation for the DataStreamWriter.trigger method in PySpark.,"Sets the trigger for the streaming query. If not set, the query runs as fast as possible, equivalent to processingTime='0 seconds'. Only one trigger parameter can be set at a time. For more information, see Configure Structured Streaming trigger intervals.",2026-04-17T21:49:00.000Z,reference,configuration,0.7,True,"Explains trigger configuration, default behavior (processingTime='0 seconds'), and single-parameter constraint; this is detailed, product-specific configuration behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/geography,Geography class,Geography - Azure Databricks,Work with geography values using the Geography class,Learn about the Geography class in PySpark,A class to represent a Geography value in Python.,2026-04-17T08:00:00.000Z,reference,integrations,0.55,True,"Defines a Databricks-specific Geography type for Python, used in geospatial integrations and functions; this is concrete API surface.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/geography/fromwkb,fromWKB,fromWKB (Geography) - Azure Databricks,Create Geography objects from WKB with Geography.fromWKB,Documentation for the Geography.fromWKB class method in PySpark.,Constructs a Geography object from WKB.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,API for constructing Geography from WKB; this is specific integration behavior with binary geospatial formats.,unchanged 
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/geography/getbytes,getBytes,getBytes (Geography) - Azure Databricks,Export Geography values to WKB with Geography.getBytes,Documentation for the Geography.getBytes method in PySpark.,Returns the WKB of Geography.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Returns WKB representation of Geography; this is concrete integration behavior with external geospatial systems.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/geography/getsrid,getSrid,getSrid (Geography) - Azure Databricks,Retrieve SRID from Geography objects with Geography.getSrid,Documentation for the Geography.getSrid method in PySpark.,Returns the SRID of Geography.,2026-04-17T21:49:00.000Z,reference,integrations,0.55,True,API for obtaining SRID from Geography; this is specific to geospatial integration and coordinate system handling.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/geometry,Geometry class,Geometry - Azure Databricks,Work with geometry values using the Geometry class,Learn about the Geometry class in PySpark,A class to represent a Geometry value in Python.,2026-04-17T08:00:00.000Z,reference,integrations,0.55,True,Defines a Databricks-specific Geometry type for Python; this is concrete API surface for geospatial integrations.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/geometry/fromwkb,fromWKB,fromWKB (Geometry) - Azure Databricks,Use Geometry.fromWKB in Azure Databricks PySpark,Documentation for the Geometry.fromWKB class method in PySpark.,Constructs a Geometry object from WKB.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"API reference for a specific PySpark Geometry class method, including its signature and behavior, which are product-specific integration details beyond generic LLM knowledge.",unchanged 
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/geometry/getbytes,getBytes,getBytes (Geometry) - Azure Databricks,Return Geometry WKB bytes with getBytes,Documentation for the Geometry.getBytes method in PySpark.,Returns the WKB of Geometry.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents a concrete PySpark Geometry method, exposing exact behavior and usage patterns that are specific to Azure Databricks’ PySpark implementation.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/geometry/getsrid,getSrid,getSrid (Geometry) - Azure Databricks,Retrieve Geometry SRID using getSrid in PySpark,Documentation for the Geometry.getSrid method in PySpark.,Returns the SRID of Geometry.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Method-level reference for Geometry.getSrid with product-specific semantics that go beyond generic spatial concepts.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/groupeddata,GroupedData class,GroupedData - Azure Databricks,Work with GroupedData aggregations in PySpark,Learn about the GroupedData class in PySpark,"A set of methods for aggregations on a DataFrame, created by DataFrame.groupBy. Supports Spark Connect",2026-04-17T08:00:00.000Z,reference,integrations,0.65,True,"Class reference for GroupedData with Databricks-specific behavior and supported methods, which are concrete API details rather than conceptual content.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/groupeddata/agg,agg,agg (GroupedData) - Azure Databricks,Compute custom aggregates with GroupedData.agg,Documentation for the GroupedData.agg method in PySpark.,Computes aggregates and returns the result as a DataFrame. 
The available aggregate functions can be:,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents the exact usage of GroupedData.agg, including supported aggregate functions and calling patterns, which are product-specific API details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/groupeddata/avg,avg,avg (GroupedData) - Azure Databricks,Calculate group averages with GroupedData.avg,Documentation for the GroupedData.avg method in PySpark.,Computes average values for each numeric column for each group. mean is an alias for avg.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Method reference for GroupedData.avg/mean with defined behavior per numeric column, representing concrete API semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/groupeddata/count,count,count (GroupedData) - Azure Databricks,Count records per group with GroupedData.count,Documentation for the GroupedData.count method in PySpark.,Counts the number of records for each group.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Provides exact behavior of GroupedData.count in Databricks PySpark, which is specific API knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/groupeddata/max,max,max (GroupedData) - Azure Databricks,Get maximum values per group with GroupedData.max,Documentation for the GroupedData.max method in PySpark.,Computes the max value for each numeric column for each group.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Details the GroupedData.max method behavior for numeric columns, which is concrete API reference material.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/groupeddata/mean,mean,mean (GroupedData) - Azure Databricks,Use GroupedData.mean alias for averages,Documentation for the GroupedData.mean method in PySpark.,Computes average values for each numeric 
column for each group. mean is an alias for avg.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Clarifies that mean is an alias for avg and how it operates on grouped numeric columns, a product-specific API detail.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/groupeddata/min,min,min (GroupedData) - Azure Databricks,Get minimum values per group with GroupedData.min,Documentation for the GroupedData.min method in PySpark.,Computes the min value for each numeric column for each group.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Method-level documentation for GroupedData.min, describing exact aggregation behavior in Databricks PySpark.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/groupeddata/pivot,pivot,pivot (GroupedData) - Azure Databricks,Pivot DataFrame columns with GroupedData.pivot,Documentation for the GroupedData.pivot method in PySpark.,Pivots a column of the current DataFrame and performs the specified aggregation.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Explains the GroupedData.pivot API and how it performs aggregations, which is specific to the Databricks PySpark implementation.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/groupeddata/sum,sum,sum (GroupedData) - Azure Databricks,Sum numeric columns per group with GroupedData.sum,Documentation for the GroupedData.sum method in PySpark.,Computes the sum for each numeric column for each group.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents the GroupedData.sum method behavior, a concrete API usage pattern.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/inputpartition,InputPartition,InputPartition - Azure Databricks,Implement custom InputPartition for PySpark data sources,Learn about the InputPartition class in PySpark,A base class representing an input partition returned by 
the partitions() method of DataSourceReader. Added in Databricks Runtime 14.3 LTS,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Class reference for InputPartition returned by DataSourceReader.partitions(), including runtime version notes, which are product-specific integration details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/observation,Observation class,Observation - Azure Databricks,Capture DataFrame metrics with Observation in PySpark,Learn about the Observation class in PySpark,A class to observe named metrics on a DataFrame. Metrics are aggregation expressions applied to the DataFrame while it is being processed by an action. An Observation instance collects the metrics while the first action is executed. Subsequent actions do not modify the metrics returned by Observation.get. Retrieval of the metric via Observation.get blocks until the first action has finished and metrics become available.,2026-04-17T08:00:00.000Z,reference,integrations,0.7,True,"Describes Observation class semantics (first action only, blocking get), which are detailed, product-specific API behaviors.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/observation/get,get,get (Observation) - Azure Databricks,Retrieve observed metrics using Observation.get,Documentation for the Observation.get property in PySpark.,Gets the observed metrics. Waits until the observed dataset finishes its first action. Only the result of the first action is available. 
Subsequent actions do not modify the result.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents blocking behavior and single-action semantics of Observation.get, which are non-obvious, product-specific details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor,PySparkPlotAccessor class,PySparkPlotAccessor class - Azure Databricks,Generate plots from PySpark DataFrames with PySparkPlotAccessor,"Learn about the PySparkPlotAccessor class in PySpark for generating line, bar, scatter, area, pie, box, KDE, and histogram plots.",Accessor for DataFrame plotting functionality in PySpark.,2026-04-17T08:00:00.000Z,reference,integrations,0.75,True,"Class-level API reference for DataFrame plotting in Databricks PySpark, including supported plot types, which is specific integration behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor/area,area,area - Azure Databricks,Draw stacked area plots with PySparkPlotAccessor.area,Documentation for the PySparkPlotAccessor.area method in PySpark.,Draws a stacked area plot. An area plot displays quantitative data visually.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Method reference for area plotting on DataFrames, including semantics of stacked area plots in this API.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor/bar,bar,bar - Azure Databricks,Create vertical bar charts with PySparkPlotAccessor.bar,Documentation for the PySparkPlotAccessor.bar method in PySpark.,"Creates a vertical bar plot. A bar plot presents categorical data with rectangular bars with lengths proportional to the values they represent. It shows comparisons among discrete categories. 
One axis shows the specific categories being compared, and the other axis represents a measured value.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents the bar plotting method on DataFrames, a concrete API usage pattern.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor/barh,barh,barh - Azure Databricks,Create horizontal bar charts with PySparkPlotAccessor.barh,Documentation for the PySparkPlotAccessor.barh method in PySpark.,"Creates a horizontal bar plot. A horizontal bar plot presents quantitative data with rectangular bars with lengths proportional to the values they represent. It shows comparisons among discrete categories. One axis shows the specific categories being compared, and the other axis represents a measured value.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Method-level documentation for horizontal bar plots, including semantics of axes and categories.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor/box,box,box - Azure Databricks,Build box-and-whisker plots with PySparkPlotAccessor.box,Documentation for the PySparkPlotAccessor.box method in PySpark.,"Creates a box-and-whisker plot from DataFrame columns. A box plot is a method for graphically depicting groups of numerical data through their quartiles. The box extends from the Q1 to Q3 quartile values of the data, with a line at the median (Q2). The whiskers extend from the edges of the box to show the range of the data. By default, they extend no more than 1.5 × IQR (IQR = Q3 - Q1) from the edges of the box, ending at the farthest data point within that interval. 
Outliers are plotted as separa",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Explains detailed behavior of box plots (quartiles, whiskers at 1.5×IQR, outliers) as implemented in this plotting API.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor/hist,hist,hist - Azure Databricks,Plot DataFrame histograms with PySparkPlotAccessor.hist,Documentation for the PySparkPlotAccessor.hist method in PySpark.,Draws a histogram of the DataFrame's columns. A histogram is a representation of the distribution of data.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"API reference for histogram plotting from DataFrame columns, which is specific to Databricks’ PySpark plotting integration.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor/kde,kde,kde - Azure Databricks,Generate KDE plots with PySparkPlotAccessor.kde,Documentation for the PySparkPlotAccessor.kde method in PySpark.,"Generates a Kernel Density Estimate (KDE) plot using Gaussian kernels. In statistics, kernel density estimation is a non-parametric way to estimate the probability density function (PDF) of a random variable. 
This function uses Gaussian kernels and includes automatic bandwidth determination.",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,Describes kernel density estimation plotting with Gaussian kernels and automatic bandwidth in this specific API.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor/line,line,line - Azure Databricks,Plot DataFrame lines with PySparkPlotAccessor.line,Documentation for the PySparkPlotAccessor.line method in PySpark.,Plots a DataFrame as lines.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Method reference for line plotting of DataFrames, a concrete integration behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor/pie,pie,pie - Azure Databricks,Create pie charts from DataFrames with PySparkPlotAccessor.pie,Documentation for the PySparkPlotAccessor.pie method in PySpark.,Generates a pie plot. A pie plot is a proportional representation of the numerical data in a column.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents pie plot behavior for numerical columns in this plotting API.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor/scatter,scatter,scatter - Azure Databricks,Visualize correlations with PySparkPlotAccessor.scatter,Documentation for the PySparkPlotAccessor.scatter method in PySpark.,"Creates a scatter plot with varying marker point size and color. The coordinates of each point are defined by two DataFrame columns, and filled circles are used to represent each point. 
This kind of plot is useful for seeing complex correlations between two variables, such as natural 2D coordinates like longitude and latitude, or any pair of metrics that can be plotted against each other.",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Explains scatter plot behavior (marker size, color, coordinate mapping) for DataFrame columns in Databricks PySpark.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/row,Row class,Row class - Azure Databricks,Use the Row class for PySpark DataFrames,Documentation for the Row class in PySpark.,A row in DataFrame. The fields in it can be accessed: key in row will search through row keys. Row can be used to create a row object by using named arguments. It is not allowed to omit a named argument to represent that the value is None or missing. This should be explicitly set to None in this case. Changed in Databricks Runtime 7.4: Rows created from named arguments no longer have field names sorted alphabetically and will be ordered in the position as entered.,2026-04-17T08:00:00.000Z,reference,integrations,0.8,True,"Details Row construction, field access rules, and version-specific behavior changes (e.g., field order), which are concrete, product-specific API semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/row/asdict,asDict,asDict - Azure Databricks,Convert Row objects to dictionaries with asDict,Documentation for the Row.asDict method in PySpark.,"Returns the Row as Dict[str, Any].",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Method reference for Row.asDict, including return type and behavior, which is specific to the PySpark Row API.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/runtimeconfig,RuntimeConfig class,RuntimeConfig - Azure Databricks,Manage Spark runtime settings with RuntimeConfig,"User-facing configuration API for Spark runtime properties, accessible 
through spark.conf. Options set here are propagated to Hadoop I/O.","User-facing configuration API, accessible through SparkSession.conf. Supports Spark Connect. Options set here are automatically propagated to the Hadoop configuration during I/O.",2026-04-17T08:00:00.000Z,reference,configuration,0.8,True,"Describes the user-facing configuration API (spark.conf / SparkSession.conf) and propagation to Hadoop I/O, which is core configuration behavior for Databricks PySpark.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/runtimeconfig/get,get,get (RuntimeConfig) - Azure Databricks,Read Spark configuration values with RuntimeConfig.get,Documentation for the RuntimeConfig.get method in PySpark.,"Returns the value of Spark runtime configuration property for the given key, assuming it is set.",2026-04-17T21:49:00.000Z,reference,configuration,0.75,True,"Method-level documentation for retrieving specific runtime configuration properties by key, a concrete configuration API.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/runtimeconfig/getall,getAll,getAll - Azure Databricks,List all Spark configuration properties with RuntimeConfig.getAll,Documentation for the RuntimeConfig.getAll property in PySpark.,Returns all properties set in this conf.,2026-04-17T21:49:00.000Z,reference,configuration,0.75,True,"Documents how to enumerate all properties set in the configuration, which is specific configuration API behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/runtimeconfig/ismodifiable,isModifiable,isModifiable - Azure Databricks,Check if Spark config keys are modifiable with isModifiable,Documentation for the RuntimeConfig.isModifiable method in PySpark.,Indicates whether the configuration property with the given key is modifiable in the current session.,2026-04-17T21:49:00.000Z,reference,configuration,0.75,True,"Provides semantics for determining 
whether a configuration key can be changed in the current session, a product-specific configuration rule.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/runtimeconfig/set,set,set - Azure Databricks,Set Spark runtime configuration with RuntimeConfig.set,Documentation for the RuntimeConfig.set method in PySpark.,Sets the given Spark runtime configuration property.,2026-04-17T21:49:00.000Z,reference,configuration,0.8,True,"Documents how to set configuration properties via the RuntimeConfig API, including key/value semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/runtimeconfig/unset,unset,unset - Azure Databricks,Unset Spark configuration keys with RuntimeConfig.unset,Documentation for the RuntimeConfig.unset method in PySpark.,Resets the configuration property for the given key.,2026-04-17T21:49:00.000Z,reference,configuration,0.8,True,"Explains how to reset configuration properties by key, which is specific configuration API behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/simpledatasourcestreamreader,SimpleDataSourceStreamReader class,SimpleDataSourceStreamReader - Azure Databricks,Implement SimpleDataSourceStreamReader for lightweight streaming,A base class for simplified streaming data source readers in PySpark that reads data and plans the latest offset simultaneously.,"A base class for simplified streaming data source readers. Compared toDataSourceStreamReader,SimpleDataSourceStreamReaderdoesn't require planning data partitions. Theread()method allows reading data and planning the latest offset at the same time. BecauseSimpleDataSourceStreamReaderreads records in the Spark driver to determine the end offset of each batch without partitioning, it is only suited for lightweight use cases where input rate and batch size are small. 
UseDataSourceStreamReaderwhen re",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Class reference describing how SimpleDataSourceStreamReader differs from DataSourceStreamReader, including driver-side reading, offset planning, and suitability constraints, which are detailed, product-specific streaming integration semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/simpledatasourcestreamreader/commit,commit,commit (SimpleDataSourceStreamReader) - Azure Databricks,Commit processed offsets with SimpleDataSourceStreamReader.commit,Documentation for the SimpleDataSourceStreamReader.commit method in PySpark.,Informs the source that Spark has completed processing all data for offsets less than or equal toendand will only request offsets greater thanendin the future. Added in Databricks Runtime 15.3,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents commit semantics for streaming offsets (end offset guarantees, future requests) and runtime version, which are specific to Databricks streaming APIs.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/simpledatasourcestreamreader/initialoffset,initialOffset,initialOffset (SimpleDataSourceStreamReader) - Azure Databricks,Determine initial streaming offsets with SimpleDataSourceStreamReader.initialOffset,Documentation for the SimpleDataSourceStreamReader.initialOffset method in PySpark.,"Returns the initial offset of the streaming data source. A new streaming query starts reading data from the initial offset. If Spark is restarting an existing query, it will restart from the checkpointed offset rather than the initial one. 
Added in Databricks Runtime 15.3",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Explains initial vs checkpointed offset behavior for new vs restarted queries, which is nuanced, product-specific streaming behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/simpledatasourcestreamreader/read,read,read (SimpleDataSourceStreamReader) - Azure Databricks,Read streaming data and next offset with SimpleDataSourceStreamReader.read,Documentation for the SimpleDataSourceStreamReader.read method in PySpark.,Reads all available data from the start offset and returns the offset that the next read attempt starts from. Added in Databricks Runtime 15.3,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents how read returns both data and the next start offset, a concrete streaming integration pattern.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/simpledatasourcestreamreader/readbetweenoffsets,readBetweenOffsets,readBetweenOffsets (SimpleDataSourceStreamReader) - Azure Databricks,Re-read batches deterministically with SimpleDataSourceStreamReader.readBetweenOffsets,Documentation for the SimpleDataSourceStreamReader.readBetweenOffsets method in PySpark.,Reads all available data between a specific start offset and end offset. This method is invoked during failure recovery to re-read a batch deterministically. Added in Databricks Runtime 15.3,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Describes deterministic re-reading between specific offsets for failure recovery, which is detailed streaming API behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession,SparkSession class,SparkSession - Azure Databricks,Use SparkSession as the entry point for Databricks PySpark,Learn about the SparkSession class in PySpark,"The entry point to programming Spark with the Dataset and DataFrame API. 
A SparkSession can be used to create DataFrames, register DataFrames as tables, execute SQL over tables, cache tables, and read parquet files.",2026-04-17T08:00:00.000Z,reference,integrations,0.75,True,"Class-level reference for SparkSession in Databricks PySpark, including capabilities like creating DataFrames, registering tables, and executing SQL, which are concrete API behaviors.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/active,active,active - Azure Databricks,Access the active SparkSession with SparkSession.active,Documentation for the SparkSession.active class method in PySpark.,"Returns the active or default SparkSession for the current thread, returned by the builder.",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Documents how to retrieve the active or default SparkSession for the current thread, a specific API behavior important for correct integration.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/addartifacts,addArtifacts,addArtifacts - Azure Databricks,Use SparkSession.addArtifacts in Azure Databricks,Documentation for the SparkSession.addArtifacts method in PySpark.,Adds artifact(s) to the client session. Currently only local files are supported. addArtifact is an alias for addArtifacts.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"API reference for a Databricks-specific PySpark method, including exact method name, behavior, and alias; this is product-specific SDK detail not reliably known from training.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/addtag,addTag,addTag - Azure Databricks,Tag operations with SparkSession.addTag in Databricks,Documentation for the SparkSession.addTag method in PySpark.,"Add a tag to be assigned to all the operations started by this thread in this session. Often, a unit of execution in an application consists of multiple Spark executions. Application programmers can use this method to group all those jobs together and give a group tag. The application can use SparkSession.interruptTag to cancel all running executions with this tag. There may be multiple tags present at the same time, so different parts of an application may use different tags to perform cancellati",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents a specific SparkSession method and its semantics for tagging and cancellation, which are product/SDK-specific coding patterns.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/catalog,catalog,catalog - Azure Databricks,Access Spark catalog via SparkSession.catalog,Documentation for the SparkSession.catalog property in PySpark.,"Interface through which the user may create, drop, alter, or query underlying databases, tables, functions, and more.",2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Describes a concrete SparkSession property and its role for database/table/function management; this is specific API surface knowledge.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/clearprogresshandlers,clearProgressHandlers,clearProgressHandlers - Azure Databricks,Clear Spark progress handlers with clearProgressHandlers,Documentation for the SparkSession.clearProgressHandlers method in PySpark.,Clears all registered progress handlers.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Reference for a specific SparkSession method used to manage progress handlers; product-specific API behavior.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/cleartags,clearTags,clearTags - Azure Databricks,Manage Spark operation tags with clearTags,Documentation for the SparkSession.clearTags method in PySpark.,Clear the current thread's operation tags.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents a concrete SparkSession method that clears thread operation tags; SDK-level behavior detail.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/conf,conf,conf - Azure Databricks,Use SparkSession.conf for runtime configuration,Documentation for the SparkSession.conf property in PySpark.,"Runtime configuration interface for Spark. This is the interface through which the user can get and set all Spark and Hadoop configurations relevant to Spark SQL. When getting the value of a config, this defaults to the value set in the underlying SparkContext, if any.",2026-04-17T21:49:00.000Z,reference,configuration,0.7,True,Explains a configuration interface property used to get/set Spark and Hadoop configs; this is a specific configuration access pattern.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/createdataframe,createDataFrame,createDataFrame - Azure Databricks,Create DataFrames with SparkSession.createDataFrame,Documentation for the SparkSession.createDataFrame method in PySpark.,"Creates a DataFrame from an RDD, a list, a pandas.DataFrame, a numpy.ndarray, or a pyarrow.Table.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"API reference for a key SparkSession method including supported input types (RDD, pandas, numpy, pyarrow); detailed SDK usage.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/datasource,dataSource,dataSource - Azure Databricks,Register data sources via SparkSession.dataSource,Documentation for the SparkSession.dataSource property in PySpark.,Returns a DataSourceRegistration for data source registration.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Describes a specific property returning DataSourceRegistration, a product-specific integration mechanism.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/getactivesession,getActiveSession,getActiveSession - Azure Databricks,Retrieve current Spark session with getActiveSession,Documentation for the SparkSession.getActiveSession class method in PySpark.,"Returns the active SparkSession for the current thread, returned by the builder.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents a class method and its behavior for obtaining the active SparkSession; SDK-specific pattern.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/gettags,getTags,getTags - Azure Databricks,Inspect current Spark operation tags with getTags,Documentation for the SparkSession.getTags method in PySpark.,Get the tags that are currently set to be assigned to all the operations started by this thread.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,API reference for a SparkSession method returning thread operation tags; product-specific tagging semantics.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/interruptall,interruptAll,interruptAll - Azure Databricks,Interrupt all operations in a Spark session,Documentation for the SparkSession.interruptAll method in PySpark.,Interrupt all operations of this session currently running on the connected server.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents SparkSession.interruptAll and its effect on server-side operations; concrete control API.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/interruptoperation,interruptOperation,interruptOperation - Azure Databricks,Cancel specific Spark operations with interruptOperation,Documentation for the SparkSession.interruptOperation method in PySpark.,Interrupt an operation of this session with the given operation ID.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Reference for a method that interrupts by operation ID; detailed control behavior unique to this API.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/interrupttag,interruptTag,interruptTag - Azure Databricks,Cancel tagged Spark operations with interruptTag,Documentation for the SparkSession.interruptTag method in PySpark.,Interrupt all operations of this session with the given operation tag.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Describes a SparkSession method that cancels operations by tag; product-specific tagging and cancellation pattern.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/newsession,newSession,newSession - Azure Databricks,Create isolated Spark sessions with newSession,Documentation for the SparkSession.newSession method in PySpark.,"Returns a new SparkSession as a new session, that has separate SQL configuration, registered temporary views, and UDFs, but shares the SparkContext and table cache.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Explains behavior of SparkSession.newSession including what is shared vs isolated (SQL config, views, UDFs); nuanced API semantics.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/profile,profile,profile - Azure Databricks,Profile Spark performance via SparkSession.profile,Documentation for the SparkSession.profile property in PySpark.,Returns a Profile for performance and memory profiling.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Documents a property returning a Profile object for performance/memory profiling; specific to this environment.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/range,range,range (SparkSession) - Azure Databricks,Generate numeric ranges with SparkSession.range,Documentation
for the SparkSession.range method in PySpark.,"Creates a DataFrame with a single LongType column named id, containing elements in a range from start to end (exclusive) with step value step.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,API reference for creating a DataFrame with LongType id column and start/end/step semantics; concrete method behavior.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/read,read,read (SparkSession) - Azure Databricks,Read data as DataFrames with SparkSession.read,Documentation for the SparkSession.read property in PySpark.,Returns a DataFrameReader that can be used to read data as a DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents the DataFrameReader entry point; specific property and usage pattern for data ingestion.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/readstream,readStream,readStream - Azure Databricks,Read streaming data with SparkSession.readStream,Documentation for the SparkSession.readStream property in PySpark.,Returns a DataStreamReader that can be used to read data streams as a streaming DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Reference for DataStreamReader access via SparkSession; streaming-specific integration API.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/registerprogresshandler,registerProgressHandler,registerProgressHandler - Azure Databricks,Register Spark progress handlers with registerProgressHandler,Documentation for the SparkSession.registerProgressHandler method in PySpark.,Registers a progress handler to be called when a progress update is received from the server.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Describes a method to hook into server progress updates; product-specific callback integration.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/removeprogresshandler,removeProgressHandler,removeProgressHandler - Azure Databricks,Remove Spark progress handlers with removeProgressHandler,Documentation for the SparkSession.removeProgressHandler method in PySpark.,Removes a progress handler that was previously registered.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,API reference for unregistering progress handlers; complements registration behavior.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/removetag,removeTag,removeTag - Azure Databricks,Remove Spark operation tags with removeTag,Documentation for the SparkSession.removeTag method in PySpark.,Remove a tag previously added to be assigned to all the operations started by this thread in this session. No-op if the tag was not previously added.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents tag removal semantics including no-op behavior if tag not present; detailed API behavior.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/sparkcontext,sparkContext,sparkContext - Azure Databricks,Access underlying SparkContext via SparkSession.sparkContext,Documentation for the SparkSession.sparkContext property in PySpark.,Returns the underlying SparkContext.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Describes a property exposing the underlying SparkContext; specific to SparkSession API surface.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/sql,sql,sql - Azure Databricks,Run SQL queries with SparkSession.sql in Databricks,Documentation for the SparkSession.sql method in PySpark.,"Returns a DataFrame representing the result of the given query. When kwargs is specified, this method formats the given string using the Python standard formatter. The method binds named parameters to SQL literals or positional parameters from args. Named and positional parameters cannot be mixed in the same SQL query.",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Details method behavior including named vs positional parameter binding and restriction on mixing them; nuanced, product-specific SQL execution semantics.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/stop,stop,stop (SparkSession) - Azure Databricks,Stop Spark applications with SparkSession.stop,Documentation for the SparkSession.stop method in PySpark.,Stops the underlying SparkContext.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Documents how stopping the SparkSession stops the underlying SparkContext; concrete lifecycle API behavior.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/streams,streams,streams - Azure Databricks,Manage streaming queries via SparkSession.streams,Documentation for the SparkSession.streams property in PySpark.,Returns a StreamingQueryManager that allows managing all active StreamingQuery instances on this context.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Describes access to StreamingQueryManager for active StreamingQuery instances; specific streaming management API.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/table,table,table (SparkSession) - Azure Databricks,Load tables as DataFrames with SparkSession.table,Documentation for the SparkSession.table method in PySpark.,Returns the specified table as a DataFrame.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,API reference for retrieving a table as a DataFrame; concrete method behavior.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/tvf,tvf,tvf - Azure Databricks,Call table-valued functions via SparkSession.tvf,Documentation for the SparkSession.tvf property in PySpark.,Returns a TableValuedFunction that can be used to call a table-valued function (TVF).,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents a property returning TableValuedFunction for TVF calls; product-specific function invocation pattern.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/udf,udf,udf (SparkSession) - Azure Databricks,Register UDFs with SparkSession.udf in Databricks,Documentation for the SparkSession.udf property in PySpark.,Returns a UDFRegistration for UDF registration.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Describes UDFRegistration access and usage; concrete API for user-defined functions.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/udtf,udtf,udtf (SparkSession) - Azure Databricks,Register UDTFs with SparkSession.udtf in Databricks,Documentation for the SparkSession.udtf property in PySpark.,Returns a UDTFRegistration for UDTF registration.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Documents UDTFRegistration access; specific coding pattern for table functions.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/version-session,version,version (SparkSession) - Azure Databricks,Check Spark runtime version via SparkSession.version,Documentation for the SparkSession.version property in PySpark.,The version of Spark on which this application is running.,2026-04-17T21:49:00.000Z,reference,configuration,0.6,True,Provides a property exposing the exact Spark version the app runs on; configuration/environment detail.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery,StreamingQuery class,StreamingQuery - Azure Databricks,Control streaming queries with StreamingQuery in PySpark,Learn about the StreamingQuery class in PySpark,A handle to a query that is executing continuously in the background as new data arrives. All methods are thread-safe.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Class-level reference describing a handle to continuous queries and thread-safety; specific streaming control API.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/awaittermination,awaitTermination,awaitTermination (StreamingQuery) - Azure Databricks,Wait for streaming query termination with awaitTermination,Documentation for the StreamingQuery.awaitTermination method in PySpark.,"Waits for the termination of this query, either by stop() or by an exception. If the query has terminated with an exception, the exception will be thrown. If timeout is set, returns whether the query has terminated within the timeout seconds. If the query has already terminated, subsequent calls either return immediately (if stopped normally) or throw the exception immediately (if terminated with an exception).",2026-04-17T21:49:00.000Z,reference,limits-quotas,0.6,True,"Method reference including timeout parameter semantics and behavior on exceptions; involves time-based behavior and return conditions, bordering on timeout limits.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/exception,exception,exception (StreamingQuery) - Azure Databricks,Inspect streaming failures with StreamingQuery.exception,Documentation for the StreamingQuery.exception method in PySpark.,"Returns the StreamingQueryException if the query was terminated by an exception, or None if the query terminated normally.",2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Documents how to retrieve StreamingQueryException or None; specific error inspection API.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/explain,explain,explain (StreamingQuery) - Azure Databricks,Explain streaming query plans with
StreamingQuery.explain,Documentation for the StreamingQuery.explain method in PySpark.,Prints the (logical and physical) plans to the console for debugging.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Reference for printing logical and physical plans for debugging; concrete method behavior.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/id,id,id (StreamingQuery) - Azure Databricks,Use StreamingQuery.id for unique query identification,Documentation for the StreamingQuery.id property in PySpark.,"Returns the unique ID of this query that persists across restarts from checkpoint data. The ID is generated when a query is started for the first time, and is the same every time it is restarted from checkpoint data. There can only be one query with the same ID active in a Spark cluster.",2026-04-17T21:49:00.000Z,reference,limits-quotas,0.65,True,Describes uniqueness constraints (only one query with same ID active in a cluster) and persistence across restarts; this is a behavioral limit/constraint.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/isactive,isActive,isActive (StreamingQuery) - Azure Databricks,Check streaming query activity with isActive,Documentation for the StreamingQuery.isActive property in PySpark.,Returns whether this streaming query is currently active.,2026-04-17T21:49:00.000Z,reference,integrations,0.6,True,Property reference returning whether a query is active; specific status-check API.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/lastprogress,lastProgress,lastProgress (StreamingQuery) - Azure Databricks,Get last progress update via lastProgress,Documentation for the StreamingQuery.lastProgress property in PySpark.,"Returns the most recent StreamingQueryProgress update of this streaming query, or None if there have been no progress updates.",2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents returning most recent StreamingQueryProgress or None; product-specific progress reporting API.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/name,name,name (StreamingQuery) - Azure Databricks,Name streaming queries with StreamingQuery.name,Documentation for the StreamingQuery.name property in PySpark.,"Returns the user-specified name of the query, or None if not specified. Set via df.writeStream.queryName(""query"").start(). If set, this name must be unique across all active queries.",2026-04-17T21:49:00.000Z,reference,limits-quotas,0.6,True,Specifies uniqueness requirement for query names across active queries and how to set them; this is a naming constraint/limit.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/processallavailable,processAllAvailable,processAllAvailable (StreamingQuery) - Azure Databricks,Process all available streaming data with processAllAvailable,Documentation for the StreamingQuery.processAllAvailable method in PySpark.,Blocks until all available data in the source has been processed and committed to the sink. Intended for testing.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Method reference describing blocking behavior until all available data is processed; specific testing/integration pattern.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/recentprogress,recentProgress,recentProgress (StreamingQuery) - Azure Databricks,Access recent streaming progress via recentProgress,Documentation for the StreamingQuery.recentProgress property in PySpark.,Returns an array of the most recent StreamingQueryProgress updates for this query. The number of progress updates retained is configured by spark.sql.streaming.numRecentProgressUpdates.,2026-04-17T21:49:00.000Z,reference,configuration,0.7,True,Describes that the number of retained progress updates is configured by spark.sql.streaming.numRecentProgressUpdates; exposes a specific configuration setting affecting behavior.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/runid,runId,runId (StreamingQuery) - Azure Databricks,Use StreamingQuery.runId in Azure Databricks PySpark,Documentation for the StreamingQuery.runId property in PySpark.,Returns the unique ID of this query that does not persist across restarts. Every query that is started (or restarted from checkpoint) will have a different runId.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"API reference for a specific PySpark Databricks property with product-specific behavior (non-persistent unique ID across restarts), which is detailed SDK behavior rather than generic concept.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/status,status,status (StreamingQuery) - Azure Databricks,Access StreamingQuery.status in Databricks PySpark,Documentation for the StreamingQuery.status property in PySpark.,Returns the current status of the query.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents a concrete PySpark property on Databricks (current query status) as part of the SDK surface; this is product-specific API behavior.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/stop,stop,stop (StreamingQuery) - Azure Databricks,Stop a StreamingQuery in Databricks PySpark,Documentation for the StreamingQuery.stop method in PySpark.,Stops this streaming query.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Method-level reference for stopping a streaming query in Databricks PySpark, exposing concrete SDK behavior not covered by generic theory.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerylistener,StreamingQueryListener class,StreamingQueryListener - Azure Databricks,Implement StreamingQueryListener for Databricks PySpark,Documentation for the StreamingQueryListener abstract class in PySpark for listening to streaming query lifecycle events.,An abstract class for listening to events related to StreamingQuery. Subclass this class and implement its abstract methods to receive lifecycle event callbacks for streaming queries.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Describes a Databricks-specific abstract listener class and its lifecycle callbacks, which are concrete API integration points.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerylistener/onqueryidle,onQueryIdle,onQueryIdle (StreamingQueryListener) - Azure Databricks,Handle onQueryIdle events in StreamingQueryListener,Documentation for the StreamingQueryListener.onQueryIdle method in PySpark.,Called when the query is idle and waiting for new data to process.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents a specific callback method and when it fires (query idle waiting for data), which is detailed SDK behavior.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerylistener/onqueryprogress,onQueryProgress,onQueryProgress (StreamingQueryListener) - Azure Databricks,Use onQueryProgress for streaming status updates,Documentation for the StreamingQueryListener.onQueryProgress method in PySpark.,"Called when there is some status update (ingestion rate updated, etc.)",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Method-level reference for a Databricks PySpark listener callback with specific semantics around status updates.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerylistener/onquerystarted,onQueryStarted,onQueryStarted (StreamingQueryListener) - Azure Databricks,React to onQueryStarted events in Databricks,Documentation for the StreamingQueryListener.onQueryStarted method in PySpark.,Called when a query is started.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents a concrete lifecycle callback method for streaming queries, part of the Databricks PySpark API surface.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerylistener/onqueryterminated,onQueryTerminated,onQueryTerminated (StreamingQueryListener) - Azure Databricks,Handle onQueryTerminated events in StreamingQueryListener,Documentation for the StreamingQueryListener.onQueryTerminated method in PySpark.,"Called when a query is stopped, with or without error.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Explains a specific termination callback (with or without error) in the Databricks PySpark listener API.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerymanager,StreamingQueryManager class,StreamingQueryManager - Azure Databricks,Manage streaming queries with StreamingQueryManager,Learn about the StreamingQueryManager class in PySpark,Manages all active StreamingQuery instances associated with a SparkSession. Use spark.streams to access this.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Class-level reference for managing active streaming queries via Databricks PySpark APIs, including how to access it from SparkSession.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerymanager/active,active,active (StreamingQueryManager) - Azure Databricks,List active streaming queries in Databricks,Documentation for the StreamingQueryManager.active property in PySpark.,Returns a list of all active streaming queries associated with this SparkSession.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Documents a specific property returning active queries for a SparkSession, a concrete SDK behavior.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerymanager/addlistener,addListener,addListener (StreamingQueryManager) - Azure Databricks,Register StreamingQueryListener with StreamingQueryManager,Documentation for the StreamingQueryManager.addListener method in PySpark.,Registers a StreamingQueryListener to receive lifecycle event callbacks for streaming queries.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Method-level API reference for registering listeners for streaming lifecycle events in Databricks PySpark.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerymanager/awaitanytermination,awaitAnyTermination,awaitAnyTermination (StreamingQueryManager) - Azure Databricks,Use awaitAnyTermination on StreamingQueryManager,Documentation for the StreamingQueryManager.awaitAnyTermination method in PySpark.,"Waits until any of the queries on the associated SparkSession has terminated since the creation of the context, or since resetTerminated() was called. If any query terminated with an exception, the exception will be thrown. If timeout is set, returns whether any query has terminated within the timeout seconds.
If a query has already terminated, subsequent calls either return immediately (if stopped normally) or throw the exception immediately (if terminated with an exception). UseresetTerminated()to c",2026-04-17T21:49:00.000Z,reference,integrations,0.72,True,"Describes detailed behavior of waiting for any query termination, including timeout semantics and exception propagation, which is product-specific API behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerymanager/get,get,get (StreamingQueryManager) - Azure Databricks,Get streaming query by ID in Databricks,Documentation for the StreamingQueryManager.get method in PySpark.,Returns an active streaming query from thisSparkSessionby its unique ID.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"API reference for retrieving an active streaming query by unique ID from a SparkSession, a concrete integration pattern.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerymanager/removelistener,removeListener,removeListener (StreamingQueryManager) - Azure Databricks,Remove StreamingQueryListener from StreamingQueryManager,Documentation for the StreamingQueryManager.removeListener method in PySpark.,Deregisters aStreamingQueryListener.,2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,Documents deregistration behavior for streaming listeners in the Databricks PySpark API.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerymanager/resetterminated,resetTerminated,resetTerminated (StreamingQueryManager) - Azure Databricks,Reset terminated streaming queries in Databricks,Documentation for the StreamingQueryManager.resetTerminated method in PySpark.,Forgets past terminated queries so thatawaitAnyTermination()can be used again to wait for new terminations.,2026-04-17T21:49:00.000Z,reference,integrations,0.68,True,"Explains specific behavior of 
resetTerminated in relation to awaitAnyTermination, which is detailed SDK semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udf,UserDefinedFunction class,UserDefinedFunction - Azure Databricks,Work with UserDefinedFunction in Databricks PySpark,Learn about the UserDefinedFunction class in PySpark,A user defined function in Python. The constructor of this class is not supposed to be directly called. Usepyspark.sql.functions.udforpyspark.sql.functions.pandas_udfto create an instance.,2026-04-17T08:00:00.000Z,reference,integrations,0.78,True,"Class-level reference for Databricks’ PySpark UDF wrapper, including how instances are created via specific factory functions.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udf/asnondeterministic,asNondeterministic,asNondeterministic (UserDefined Function) - Azure Databricks,Mark PySpark UserDefinedFunction as nondeterministic,Documentation for the UserDefinedFunction.asNondeterministic method in PySpark.,Updates theUserDefinedFunctionto nondeterministic.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Documents a specific method that changes optimizer semantics for a UDF, which is product-specific API behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udfregistration,UDFRegistration class,UDFRegistration - Azure Databricks,Register UDFs with UDFRegistration in Databricks,Learn about the UDFRegistration class in PySpark,Wrapper for user-defined function registration. 
This instance can be accessed byspark.udf.,2026-04-17T08:00:00.000Z,reference,integrations,0.78,True,"Documents a Databricks-specific wrapper for UDF registration accessed via spark.udf, including its role in SQL function registration.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udfregistration/register,register,register (UDFRegistration) - Azure Databricks,Register Python functions as SQL UDFs in Databricks,Documentation for the UDFRegistration.register method in PySpark.,Registers a Python function (including lambda functions) or a user-defined function as a SQL function.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"API reference for registering Python functions and UDFs as SQL functions, including supported function types and behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udfregistration/registerjavafunction,registerJavaFunction,registerJavaFunction (UDFRegistration) - Azure Databricks,Register Java UDFs as SQL functions in Databricks,Documentation for the UDFRegistration.registerJavaFunction method in PySpark.,"Registers a Java user-defined function as a SQL function. In addition to a name and the function itself, the return type can be optionally specified. 
When the return type is not specified, it is inferred via reflection.",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents a method for registering Java UDFs, including optional return type inference via reflection, which is product-specific integration detail.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udfregistration/registerjavaudaf,registerJavaUDAF,registerJavaUDAF (UDFRegistration) - Azure Databricks,Register Java UDAFs as SQL functions in Databricks,Documentation for the UDFRegistration.registerJavaUDAF method in PySpark.,Registers a Java user-defined aggregate function as a SQL function.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"API reference for registering Java aggregate functions as SQL functions, a concrete integration pattern.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udtf,UserDefinedTableFunction class,UserDefinedTableFunction - Azure Databricks,Use UserDefinedTableFunction in Databricks PySpark,Learn about the UserDefinedTableFunction class in PySpark,A user-defined table function in Python. The constructor of this class is not supposed to be directly called. 
Usepyspark.sql.functions.udtfto create an instance.,2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,"Class-level reference for Databricks’ PySpark UDTF, including how to construct via pyspark.sql.functions.udtf.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udtf/asdeterministic,asDeterministic,asDeterministic (UserDefinedTableFunction) - Azure Databricks,Mark UserDefinedTableFunction as deterministic,Documentation for the UserDefinedTableFunction.asDeterministic method in PySpark.,Updates theUserDefinedTableFunctionto deterministic.,2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,Method-level API reference that affects execution semantics of UDTFs in Databricks PySpark.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udtfregistration,UDTFRegistration class,UDTFRegistration - Azure Databricks,Use UDTFRegistration for table functions in Databricks,Learn about the UDTFRegistration class in PySpark,Wrapper for user-defined table function registration. 
This instance can be accessed byspark.udtf.,2026-04-17T08:00:00.000Z,reference,integrations,0.78,True,"Class-level reference for registering user-defined table functions via spark.udtf, which is Databricks-specific API behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udtfregistration/register,register,register (UDTFRegistration) - Azure Databricks,Register Python UDTFs as SQL table functions,Documentation for the UDTFRegistration.register method in PySpark.,Registers a Python user-defined table function as a SQL table function.,2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents how to register Python user-defined table functions as SQL table functions, including method semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/variantval,VariantVal class,VariantVal class - Azure Databricks,Work with VariantVal type in Databricks PySpark,Documentation for the VariantVal class in PySpark.,A class to represent a Variant value in Python. Added in Databricks Runtime 15.2,2026-04-17T21:49:00.000Z,reference,integrations,0.82,True,"Introduces a Databricks-specific VariantVal class and its runtime version, which is specialized API knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/variantval/parsejson,parseJson,parseJson - Azure Databricks,Parse JSON into VariantVal in Databricks PySpark,Documentation for the VariantVal.parseJson method in PySpark.,Converts a JSON string to a VariantVal object. 
Added in Databricks Runtime 15.2,2026-04-17T21:49:00.000Z,reference,integrations,0.82,True,"Method-level reference for converting JSON strings into VariantVal objects, a product-specific data handling API.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/variantval/tojson,toJson,toJson - Azure Databricks,Convert VariantVal to JSON with time zone control,Documentation for the VariantVal.toJson method in PySpark.,Convert the VariantVal to a JSON string. The zone ID represents the time zone that the timestamp should be printed in. It is defaulted to UTC. The list of valid zone IDs can be found by importing thezoneinfomodule and runningzoneinfo.available_timezones(). Added in Databricks Runtime 15.2,2026-04-17T21:49:00.000Z,reference,integrations,0.85,True,"Describes VariantVal.toJson including default UTC zone and how valid zone IDs are obtained via zoneinfo.available_timezones(), which is detailed, product-specific behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/variantval/topython,toPython,toPython - Azure Databricks,Convert VariantVal to native Python structures,Documentation for the VariantVal.toPython method in PySpark.,Convert the VariantVal to a Python data structure. Added in Databricks Runtime 15.2,2026-04-17T21:49:00.000Z,reference,integrations,0.82,True,"Documents VariantVal.toPython behavior, which is specific to Databricks’ Variant type integration with Python.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/window,Window class,Window class - Azure Databricks,Define window specifications with Window in PySpark,Documentation for the Window class in PySpark.,Utility functions for defining window in DataFrames. 
Supports Spark Connect,2026-04-17T08:00:00.000Z,reference,integrations,0.78,True,"Class-level reference for Window utility functions, including Spark Connect support, which is concrete API surface.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/window/orderby,orderBy,orderBy (Window) - Azure Databricks,Specify ordering for Window in Databricks PySpark,Documentation for the Window\.orderBy method in PySpark.,Creates aWindowSpecwith the ordering defined.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Method-level API reference for Window.orderBy, defining how ordering is attached to a WindowSpec.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/window/partitionby,partitionBy,partitionBy (Window) - Azure Databricks,Specify partitioning for Window in Databricks PySpark,Documentation for the Window\.partitionBy method in PySpark.,Creates aWindowSpecwith the partitioning defined.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Documents Window.partitionBy behavior, a concrete configuration of windowing in Databricks PySpark.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/window/rangebetween,rangeBetween,rangeBetween (Window) - Azure Databricks,Define rangeBetween frames for Window in PySpark,Documentation for the Window\.rangeBetween method in PySpark.,"Creates aWindowSpecwith the frame boundaries defined, fromstart(inclusive) toend(inclusive). Bothstartandendare relative from the current row. For example,0means ""current row"",-1means one before the current row, and5means five after the current row. A range-based boundary is based on the actual value of the ORDER BY expression(s). An offset alters the value of the ORDER BY expression — for example, if the current ORDER BY value is10and the lower bound offset is-3, the resulting lower bound is7. 
",2026-04-17T21:49:00.000Z,reference,integrations,0.82,True,"Explains detailed semantics of rangeBetween, including relative offsets and how ORDER BY values are adjusted, which is precise API behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/window/rowsbetween,rowsBetween,rowsBetween (Window) - Azure Databricks,Define rowsBetween frames for Window in PySpark,Documentation for the Window\.rowsBetween method in PySpark.,"Creates aWindowSpecwith the frame boundaries defined, fromstart(inclusive) toend(inclusive). Bothstartandendare relative positions from the current row. For example,0means ""current row"",-1means the row before the current row, and5means the fifth row after the current row. A row-based boundary is based on the position of the row within the partition. An offset indicates the number of rows above or below the current row where the frame starts or ends.",2026-04-17T21:49:00.000Z,reference,integrations,0.82,True,"Documents row-based frame semantics with relative positions, which is detailed behavior of the Databricks PySpark API.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/windowspec,WindowSpec class,WindowSpec class - Azure Databricks,Use WindowSpec to configure windowing in Databricks,Documentation for the WindowSpec class in PySpark.,"A window specification that defines the partitioning, ordering, and frame boundaries. Changed in Databricks Runtime 13.0: Supports Spark Connect. 
Use the static methods inWindowto create aWindowSpec.",2026-04-17T08:00:00.000Z,reference,integrations,0.8,True,"Class-level reference for WindowSpec, including partitioning, ordering, frame boundaries, and Spark Connect support.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/windowspec/orderby,orderBy,orderBy (WindowSpec) - Azure Databricks,Set ordering columns on WindowSpec in PySpark,Documentation for the WindowSpec.orderBy method in PySpark.,Defines the ordering columns in aWindowSpec.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Method-level API reference for WindowSpec.orderBy, defining ordering semantics for windows.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/windowspec/partitionby,partitionBy,partitionBy (WindowSpec) - Azure Databricks,Set partitioning columns on WindowSpec in PySpark,Documentation for the WindowSpec.partitionBy method in PySpark.,Defines the partitioning columns in aWindowSpec.,2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Documents WindowSpec.partitionBy behavior, a concrete configuration method in the Databricks PySpark API.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/windowspec/rangebetween,rangeBetween,rangeBetween (WindowSpec) - Azure Databricks,Configure rangeBetween on WindowSpec in Databricks,Documentation for the WindowSpec.rangeBetween method in PySpark.,"Defines the frame boundaries, fromstart(inclusive) toend(inclusive). Bothstartandendare relative from the current row. 
For example,0means ""current row"",-1means one before the current row, and5means five after the current row.",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Explains detailed semantics of range-based frame boundaries relative to the current row, which is specific API behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/windowspec/rowsbetween,rowsBetween,rowsBetween (WindowSpec) - Azure Databricks,Configure rowsBetween on WindowSpec in Databricks,Documentation for the WindowSpec.rowsBetween method in PySpark.,"Defines the frame boundaries, fromstart(inclusive) toend(inclusive). Bothstartandendare relative positions from the current row. For example,0means ""current row"",-1means the row before the current row, and5means the fifth row after the current row.",2026-04-17T21:49:00.000Z,reference,integrations,0.8,True,"Documents row-based frame boundaries with relative positions, a detailed aspect of the WindowSpec API.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/writercommitmessage,WriterCommitMessage,WriterCommitMessage - Azure Databricks,Use WriterCommitMessage in Databricks data sources,Learn about the WriterCommitMessage class in PySpark,A commit message returned byDataSourceWriter.writeand sent back to the driver as an input parameter ofDataSourceWriter.commitorDataSourceWriter.abort. 
Added in Databricks Runtime 14.3 LTS,2026-04-17T21:49:00.000Z,reference,integrations,0.78,True,"Class-level reference for WriterCommitMessage used with DataSourceWriter.write/commit/abort, including its role in driver communication, which is product-specific integration behavior.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/datatypes,Data types index,PySpark data types - Azure Databricks,,"Learn about data types available for PySpark, a Python API for Spark, on Databricks.",This page provides a list of PySpark data types available on Databricks with links to corresponding reference documentation.,2026-02-25T19:11:00.000Z,reference,,0.4,False,List of PySpark data types with links; largely mirrors upstream Spark types and is not Databricks-specific configuration or troubleshooting.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/,Functions index,PySpark functions - Azure Databricks,,"Learn about functions available for PySpark, a Python API for Spark, on Databricks.",This page provides a list of PySpark SQL functions available on Databricks with links to corresponding reference documentation.,2026-04-17T08:00:00.000Z,reference,,0.3,False,"Index page listing PySpark SQL functions with links; no limits, configuration tables, error mappings, or product-specific decision/security/deployment guidance.",updated +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/,Functions index,PySpark functions - Azure Databricks,,"Learn about functions available for PySpark, a Python API for Spark, on Databricks.",This page provides a list of PySpark SQL functions available on Databricks with links to corresponding reference documentation.,2026-04-17T08:00:00.000Z,reference,,0.3,False,"Index page listing PySpark SQL functions with links; no limits, configuration tables, error mappings, or product-specific decision/security/deployment guidance.",unchanged 
https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/abs,abs,abs - Azure Databricks,Use abs numeric function in Databricks PySpark,Learn how to use the abs function with Python,"Computes the absolute value of the given column or expression. Supports Spark Connect. For the corresponding Databricks SQL function, seeabsfunction.",2026-01-29T20:02:00.000Z,reference,integrations,0.65,True,"Function reference page with exact signature, return type, and behavior for abs in Databricks PySpark; this is concrete API usage.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/acos,acos,acos - Azure Databricks,Use acos inverse cosine in Databricks PySpark,Learn how to use the acos function with Python,"Computes the inverse cosine (also known as arccosine) of the given column or expression. Supports Spark Connect. For the corresponding Databricks SQL function, seeacosfunction.",2026-01-29T20:02:00.000Z,reference,integrations,0.65,True,"Documents the acos function’s parameters, types, and behavior in Databricks PySpark, which is specific function API knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/acosh,acosh,acosh - Azure Databricks,Use acosh inverse hyperbolic cosine in PySpark,Learn how to use the acosh function with Python,"Computes the inverse hyperbolic cosine (also known as arcosh) of the given column or expression. Supports Spark Connect. 
For the corresponding Databricks SQL function, seeacoshfunction.",2026-01-29T20:02:00.000Z,reference,integrations,0.65,True,Function reference for acosh with Databricks PySpark semantics and types; detailed API behavior qualifies as expert knowledge.,unchanged @@ -2552,7 +2581,7 @@ https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/a https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/approx_top_k,approx_top_k,approx_top_k - Azure Databricks,Find top-k frequent items with approx_top_k in PySpark,Learn how to use the approx\_top\_k function with PySpark,"Returns the topkmost frequently occurring item values in a string, boolean, date, timestamp, or numeric columncolalong with their approximate counts. The error in each count may be up to2.0 * numRows / maxItemsTrackedwherenumRowsis the total number of rows.k(default: 5) andmaxItemsTracked(default: 10000) are both integer parameters. Higher values ofmaxItemsTrackedprovide better accuracy at the cost of increased memory usage. 
Columns that have fewer thanmaxItemsTrackeddistinct items will yield ex",2026-01-29T20:02:00.000Z,reference,limits-quotas,0.8,True,"Includes explicit numeric defaults (k default 5, maxItemsTracked default 10000) and an error bound formula 2.0 * numRows / maxItemsTracked, which are precise limits/constraints.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/array,array,array - Azure Databricks,Create array columns with array function in PySpark,Learn how to use the array function with PySpark,Creates a new array column from the input columns or column names.,2026-01-29T20:02:00.000Z,reference,integrations,0.65,True,Function reference for array construction in Databricks PySpark with exact behavior and types.,unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/array_agg,array_agg,array_agg - Azure Databricks,Aggregate values into arrays with array_agg,Learn how to use the array\_agg function with PySpark,Returns a list of objects with duplicates.,2026-01-29T20:02:00.000Z,reference,integrations,0.65,True,Documents array_agg aggregation behavior and return type in Databricks PySpark; concrete function semantics.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/array_append,array_append,array_append - Azure Databricks,,Learn how to use the array\_append function with PySpark,Returns a new array column by appending a value to the existing array. Added in Databricks Runtime 12.2 LTS,2026-04-17T21:49:00.000Z,reference,,0.3,False,"Function reference for array_append focuses on behavior and syntax; no quotas, config parameters, error-code troubleshooting, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/array_append,array_append,array_append - Azure Databricks,,Learn how to use the array\_append function with PySpark,Returns a new array column by appending a value to the existing array. 
Added in Databricks Runtime 12.2 LTS,2026-04-17T21:49:00.000Z,reference,,0.3,False,"Function reference for array_append focuses on behavior and syntax; no quotas, config parameters, error-code troubleshooting, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/array_compact,array_compact,array_compact - Azure Databricks,Remove nulls from arrays with array_compact,Learn how to use the array\_compact function with PySpark,Removes null values from the array.,2026-01-29T20:02:00.000Z,reference,integrations,0.6,True,Function reference for array_compact behavior in Databricks PySpark; product-specific API.,unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/array_contains,array_contains,array_contains - Azure Databricks,Test array membership with array_contains in PySpark,Learn how to use the array\_contains function with PySpark,"Returns a boolean indicating whether the array contains the given value. 
Returns null if the array is null, true if the array contains the given value, and false otherwise.",2026-01-29T20:02:00.000Z,reference,integrations,0.65,True,Describes array_contains semantics including null handling; detailed function behavior.,unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/array_distinct,array_distinct,array_distinct - Azure Databricks,Remove duplicates from arrays with array_distinct,Learn how to use the array\_distinct function with PySpark,Removes duplicate values from the array.,2026-01-29T20:02:00.000Z,reference,integrations,0.6,True,Function reference for array_distinct behavior in Databricks PySpark; concrete API semantics.,unchanged @@ -2563,7 +2592,7 @@ https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/a https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/array_max,array_max,array_max - Azure Databricks,Find maximum element in arrays with array_max,Learn how to use the array\_max function with PySpark,Returns the maximum value of the array.,2026-01-29T20:02:00.000Z,reference,integrations,0.6,True,Function reference for array_max behavior in Databricks PySpark; concrete API semantics.,unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/array_min,array_min,array_min - Azure Databricks,Find minimum element in arrays with array_min,Learn how to use the array\_min function with PySpark,Returns the minimum value of the array.,2026-01-29T20:02:00.000Z,reference,integrations,0.6,True,Function reference for array_min behavior in Databricks PySpark; concrete API semantics.,unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/array_position,array_position,array_position - Azure Databricks,,Learn how to use the array\_position function with PySpark,"Locates the position of the first occurrence of the given value in the given array. Returns null if either of the arguments are null. 
The position is not zero based, but 1 based index. Returns 0 if the given value could not be found in the array.",2026-01-29T20:02:00.000Z,reference,,0.2,False,"API reference for a single PySpark function (array_position); describes behavior but no product-specific limits, configs, error codes, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/array_prepend,array_prepend,array_prepend - Azure Databricks,,Learn how to use the array\_prepend function with PySpark,Returns an array containing the given element as the first element and the rest of the elements from the original array. Added in Databricks Runtime 13.1,2026-04-17T21:49:00.000Z,reference,,0.3,False,"Function reference for array_prepend describes usage and semantics only; lacks numeric limits, configuration tables, troubleshooting mappings, or decision guidance.",updated +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/array_prepend,array_prepend,array_prepend - Azure Databricks,,Learn how to use the array\_prepend function with PySpark,Returns an array containing the given element as the first element and the rest of the elements from the original array. 
Added in Databricks Runtime 13.1,2026-04-17T21:49:00.000Z,reference,,0.3,False,"Function reference for array_prepend describes usage and semantics only; lacks numeric limits, configuration tables, troubleshooting mappings, or decision guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/array_remove,array_remove,array_remove - Azure Databricks,,Learn how to use the array\_remove function with PySpark,Remove all elements that equal to element from the given array.,2026-01-29T20:02:00.000Z,reference,,0.2,False,Function reference (array_remove); generic behavior description without product-specific expert details.,unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/array_repeat,array_repeat,array_repeat - Azure Databricks,,Learn how to use the array\_repeat function with PySpark,Creates an array containing a column repeated count times.,2026-01-29T20:02:00.000Z,reference,,0.2,False,"Function reference (array_repeat); no limits, configs, or specialized patterns beyond basic semantics.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/array_size,array_size,array_size - Azure Databricks,,Learn how to use the array\_size function with PySpark,Returns the total number of elements in the array. The function returns null for null input.,2026-01-29T20:02:00.000Z,reference,,0.2,False,"Function reference (array_size); only describes return value and null handling, which is generic.",unchanged @@ -2695,7 +2724,7 @@ https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/f https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/forall,forall,forall - Azure Databricks,Validate array predicates with forall in Databricks PySpark,Learn how to use the forall function with Python,"Returns whether a predicate holds for every element in the array. Supports Spark Connect. 
For the corresponding Databricks SQL function, see forall function.",2026-01-29T20:02:00.000Z,reference,integrations,0.7,True,Describes Databricks PySpark forall behavior on arrays and its SQL counterpart; concrete integration API.,unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/format_number,format_number,format_number - Azure Databricks,Format numbers as strings with PySpark in Databricks,Learn how to use the format\_number function with Python,"Formats the number X to a format like '#,--#,--#.--', rounded to d decimal places with HALF_EVEN round mode, and returns the result as a string. For the corresponding Databricks SQL function, see format_number function.",2026-01-29T20:02:00.000Z,reference,integrations,0.8,True,"Specifies exact formatting pattern, rounding mode (HALF_EVEN), and decimal handling for Databricks format_number; detailed function semantics.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/format_string,format_string,format_string - Azure Databricks,Use format_string printf-style formatting in Databricks,Learn how to use the format\_string function with Python,"Formats the arguments in printf-style and returns the result as a string column. For the corresponding Databricks SQL function, see format_string function.",2026-01-29T20:02:00.000Z,reference,integrations,0.7,True,Documents Databricks PySpark format_string behavior and SQL mapping; specific API usage pattern.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/from_avro,from_avro (Avro),from_avro - Azure Databricks,Use from_avro in Azure Databricks PySpark,Learn how to use the from\_avro function with PySpark to deserialize binary Avro data into DataFrame columns.,"Converts a binary column of Avro format into its corresponding catalyst value. The specified schema must match the read data, otherwise the behavior is undefined: it may fail or return an arbitrary result. 
If jsonFormatSchema is not provided but both subject and schemaRegistryAddress are provided, the function converts a binary column of Schema Registry Avro format into its corresponding catalyst value.",2026-04-17T21:49:00.000Z,reference,integrations,0.74,True,"The page documents a specific PySpark function (from_avro) in the Azure Databricks runtime, including its parameters, behavior when schema/jsonFormatSchema/subject/schemaRegistryAddress are provided or omitted, and how it integrates with Schema Registry Avro format. This is product- and API-specific integration knowledge (function signature, parameter semantics, and behavior) that goes beyond generic LLM training. It fits best under integrations & coding patterns, as it focuses on a concrete API for deserializing Avro data into Spark DataFrames rather than general concepts or limits.",new +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/from_avro,from_avro (Avro),from_avro - Azure Databricks,Use from_avro in Azure Databricks PySpark,Learn how to use the from\_avro function with PySpark to deserialize binary Avro data into DataFrame columns.,"Converts a binary column of Avro format into its corresponding catalyst value. The specified schema must match the read data, otherwise the behavior is undefined: it may fail or return an arbitrary result. If jsonFormatSchema is not provided but both subject and schemaRegistryAddress are provided, the function converts a binary column of Schema Registry Avro format into its corresponding catalyst value.",2026-04-17T21:49:00.000Z,reference,integrations,0.74,True,"The page documents a specific PySpark function (from_avro) in the Azure Databricks runtime, including its parameters, behavior when schema/jsonFormatSchema/subject/schemaRegistryAddress are provided or omitted, and how it integrates with Schema Registry Avro format. 
This is product- and API-specific integration knowledge (function signature, parameter semantics, and behavior) that goes beyond generic LLM training. It fits best under integrations & coding patterns, as it focuses on a concrete API for deserializing Avro data into Spark DataFrames rather than general concepts or limits.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/from_csv,from_csv,from_csv - Azure Databricks,Parse CSV strings to rows with from_csv in Databricks,Learn how to use the from\_csv function with PySpark,Parses a column containing a CSV string into a row with the specified schema. Returns null if the string cannot be parsed.,2026-01-29T20:02:00.000Z,reference,integrations,0.8,True,"Explains Databricks PySpark from_csv behavior, schema handling, and null on parse failure; product-specific parsing semantics.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/from_json,from_json,from_json - Azure Databricks,Parse JSON strings to complex types with from_json in Databricks,Learn how to use the from\_json function with PySpark,"Parses a column containing a JSON string into a MapType with StringType as keys type, StructType or ArrayType with the specified schema. Returns null, in the case of an unparsable string.",2026-01-29T20:02:00.000Z,reference,integrations,0.8,True,"Details Databricks PySpark from_json return types (MapType, StructType, ArrayType) and null behavior on unparsable strings; concrete API semantics.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/from_unixtime,from_unixtime,from_unixtime - Azure Databricks,Convert Unix epoch seconds to timestamps with from_unixtime in Databricks,Learn how to use the from\_unixtime function with Python,"Converts the number of seconds from unix epoch (1970-01-01 00:00:00 UTC) to a string representing the timestamp of that moment in the current system time zone in the given format. 
For the corresponding Databricks SQL function, see from_unixtime function.",2026-01-29T20:02:00.000Z,reference,integrations,0.75,True,"Specifies epoch origin, time zone handling, and formatting for Databricks PySpark from_unixtime; product-specific function behavior.",unchanged @@ -2819,7 +2848,7 @@ https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/l https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/log2,log2,log2 - Azure Databricks,,Learn how to use the log2 function with Python,"Returns the base-2 logarithm of the argument. Supports Spark Connect. For the corresponding Databricks SQL function, see log2 function.",2026-01-29T20:02:00.000Z,reference,,0.05,False,Base-2 logarithm (log2); generic behavior.,unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/lower,lower,lower - Azure Databricks,,Learn how to use the lower function with Python,"Converts a string expression to lower case. For the corresponding Databricks SQL function, see lower function.",2026-01-29T20:02:00.000Z,reference,,0.05,False,"String to lower case (lower); trivial, generic function.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/lpad,lpad,lpad - Azure Databricks,,Learn how to use the lpad function with Python,"Left-pad the string column to width len with pad. For the corresponding Databricks SQL function, see lpad function.",2026-01-29T20:02:00.000Z,reference,,0.1,False,Left padding (lpad); generic string function behavior.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/ltrim,ltrim,ltrim - Azure Databricks,,Learn how to use the ltrim function with Python,"Trim the spaces from left end for the specified string value. 
For the corresponding Databricks SQL function, see ltrim function.",2026-04-17T21:49:00.000Z,reference,,0.2,False,"Function reference for ltrim in Azure Databricks PySpark; likely just syntax and basic behavior without product-specific limits, configuration tables, error codes, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/ltrim,ltrim,ltrim - Azure Databricks,,Learn how to use the ltrim function with Python,"Trim the spaces from left end for the specified string value. For the corresponding Databricks SQL function, see ltrim function.",2026-04-17T21:49:00.000Z,reference,,0.2,False,"Function reference for ltrim in Azure Databricks PySpark; likely just syntax and basic behavior without product-specific limits, configuration tables, error codes, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/make_date,make_date,make_date - Azure Databricks,,Learn how to use the make\_date function with Python,"Returns a column with a date built from the year, month and day columns. For the corresponding Databricks SQL function, see make_date function.",2026-01-29T20:02:00.000Z,reference,,0.15,False,"make_date; generic date construction semantics, no Databricks-specific configs or limits.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/make_dt_interval,make_dt_interval,make_dt_interval - Azure Databricks,,Learn how to use the make\_dt\_interval function with Python,"Make DayTimeIntervalType duration from days, hours, mins and secs. 
For the corresponding Databricks SQL function, see make_dt_interval function.",2026-01-29T20:02:00.000Z,reference,,0.2,False,make_dt_interval; describes interval construction but no product-specific constraints or configuration tables.,unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/make_interval,make_interval,make_interval - Azure Databricks,,Learn how to use the make\_interval function with Python,"Make interval from years, months, weeks, days, hours, mins and secs. For the corresponding Databricks SQL function, see make_interval function.",2026-01-29T20:02:00.000Z,reference,,0.2,False,"make_interval; generic interval construction, no expert Databricks-specific details.",unchanged @@ -2893,7 +2922,7 @@ https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/r https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/raise_error,raise_error,raise_error - Azure Databricks,,Learn how to use the raise\_error function with PySpark,Throws an exception with the provided error message.,2026-01-29T20:02:00.000Z,reference,,0.25,False,raise_error describes throwing an exception with a message; no error-code mappings or troubleshooting flows.,unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/rand,rand,rand - Azure Databricks,,Learn how to use the rand function with Python,"Generates a random column with independent and identically distributed (i.i.d.) samples uniformly distributed in [0.0, 1.0). Supports Spark Connect. The function is non-deterministic in general case. 
For the corresponding Databricks SQL function, see rand function.",2026-02-24T08:00:00.000Z,reference,,0.2,False,"Function reference for rand in Azure Databricks PySpark; describes behavior of a random-number function but does not present product-specific limits, configuration tables, error codes, or decision matrices beyond what an LLM would generally know about such functions.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/randn,randn,randn - Azure Databricks,,Learn how to use the randn function with Python,"Generates a random column with independent and identically distributed (i.i.d.) samples from the standard normal distribution. Supports Spark Connect. For the corresponding Databricks SQL function, see randn function.",2026-02-24T08:00:00.000Z,reference,,0.2,False,"Function reference for randn in Azure Databricks PySpark; documents a standard normal random-number generator without product-specific limits, quotas, configuration parameters, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/randstr,randstr,randstr - Azure Databricks,Use randstr PySpark function in Azure Databricks,Learn how to use the randstr function with Python,"Returns a string of the specified length whose characters are chosen uniformly at random from the following pool of characters: 0-9, a-z, A-Z. The random seed is optional. The string length must be a constant two-byte or four-byte integer (SMALLINT or INT, respectively). Added in Databricks Runtime 16.1 For the corresponding Databricks SQL function, see randstr function.",2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Documents a Databricks-specific PySpark function with precise behavior and constraints (character pool 0-9, a-z, A-Z; length must be SMALLINT or INT; optional seed; runtime version added). 
This is product- and API-specific reference knowledge that goes beyond generic LLM training and fits best under integrations/coding patterns for this service.",updated +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/randstr,randstr,randstr - Azure Databricks,Use randstr PySpark function in Azure Databricks,Learn how to use the randstr function with Python,"Returns a string of the specified length whose characters are chosen uniformly at random from the following pool of characters: 0-9, a-z, A-Z. The random seed is optional. The string length must be a constant two-byte or four-byte integer (SMALLINT or INT, respectively). Added in Databricks Runtime 16.1 For the corresponding Databricks SQL function, see randstr function.",2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"Documents a Databricks-specific PySpark function with precise behavior and constraints (character pool 0-9, a-z, A-Z; length must be SMALLINT or INT; optional seed; runtime version added). This is product- and API-specific reference knowledge that goes beyond generic LLM training and fits best under integrations/coding patterns for this service.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/rank,rank,rank - Azure Databricks,,Learn how to use the rank function with PySpark,"Window function: returns the rank of rows within a window partition. The difference between rank and dense_rank is that dense_rank leaves no gaps in ranking sequence when there are ties. That is, if you were ranking a competition using dense_rank and had three people tie for second place, you would say that all three were in second place and that the next person came in third. 
Rank would give me sequential numbers, making the person that came in third place (after the ties) would register as com",2026-01-29T20:02:00.000Z,reference,,0.3,False,rank window function description is generic SQL window semantics; no Databricks-specific expert details.,unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/reduce,reduce,reduce - Azure Databricks,,Learn how to use the reduce function with Python,"Applies a binary operator to an initial state and all elements in the array, and reduces this to a single state. The final state is converted into the final result by applying a finish function. For the corresponding Databricks SQL function, see reduce function.",2026-01-29T20:02:00.000Z,reference,,0.3,False,"reduce function summary explains generic array reduction; no product-specific limits, configs, or edge-case tables.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/reflect,reflect,reflect - Azure Databricks,,Learn how to use the reflect function with PySpark,Calls a method with reflection.,2026-01-29T20:02:00.000Z,reference,,0.35,False,"reflect calls a method via reflection; summary lacks parameter tables, security implications, or other expert-only details.",unchanged @@ -2923,7 +2952,7 @@ https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/r https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/round,round,round - Azure Databricks,Round numeric values with PySpark round in Databricks,Learn how to use the round function with Python,"Round the given value to scale decimal places using HALF_UP rounding mode if scale >= 0 or at integral part when scale < 0. Supports Spark Connect. 
For the corresponding Databricks SQL function, see round function.",2026-01-29T20:02:00.000Z,reference,integrations,0.7,True,"Specifies HALF_UP rounding and scale behavior in Databricks PySpark, which is concrete API semantics.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/row_number,row_number,row_number - Azure Databricks,Generate row numbers with PySpark window functions in Databricks,Learn how to use the row\_number function with PySpark,Window function: returns a sequential number starting at 1 within a window partition.,2026-01-29T20:02:00.000Z,reference,integrations,0.65,True,Documents row_number window function usage and partition semantics in Databricks PySpark.,unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/rpad,rpad,rpad - Azure Databricks,Right-pad strings with PySpark rpad in Databricks,Learn how to use the rpad function with Python,"Right-pad the string column to width len with pad. For the corresponding Databricks SQL function, see rpad function.",2026-01-29T20:02:00.000Z,reference,integrations,0.7,True,API reference for rpad including parameters len and pad is product-specific function behavior.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/rtrim,rtrim,rtrim - Azure Databricks,,Learn how to use the rtrim function with Python,"Trim the spaces from right end for the specified string value. For the corresponding Databricks SQL function, see rtrim function.",2026-04-17T21:49:00.000Z,reference,,0.3,False,"Appears to be a simple reference for a standard string trimming function without product-specific parameters, limits, or configuration details. 
Likely mirrors generic rtrim behavior that LLMs already know, and the summary does not indicate any Azure Databricks–specific nuances or expert-only details.",updated +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/rtrim,rtrim,rtrim - Azure Databricks,,Learn how to use the rtrim function with Python,"Trim the spaces from right end for the specified string value. For the corresponding Databricks SQL function, see rtrim function.",2026-04-17T21:49:00.000Z,reference,,0.3,False,"Appears to be a simple reference for a standard string trimming function without product-specific parameters, limits, or configuration details. Likely mirrors generic rtrim behavior that LLMs already know, and the summary does not indicate any Azure Databricks–specific nuances or expert-only details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/schema_of_csv,schema_of_csv,schema_of_csv - Azure Databricks,Infer schema from CSV strings with PySpark in Databricks,Learn how to use the schema\_of\_csv function with PySpark,Parses a CSV string and infers its schema in DDL format.,2026-01-29T20:02:00.000Z,reference,integrations,0.75,True,"Documents schema_of_csv behavior, return DDL format, and parameters, which are Databricks-specific function semantics.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/schema_of_json,schema_of_json,schema_of_json - Azure Databricks,Infer schema from JSON strings with PySpark in Databricks,Learn how to use the schema\_of\_json function with PySpark,Parses a JSON string and infers its schema in DDL format.,2026-01-29T20:02:00.000Z,reference,integrations,0.75,True,API reference for schema_of_json including how it infers and returns schema in DDL format is product-specific.,unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/schema_of_variant,schema_of_variant,schema_of_variant - Azure Databricks,Get SQL schema of 
variant values in Databricks PySpark,Learn how to use the schema\_of\_variant function with PySpark,Returns schema in the SQL format of a variant.,2026-01-29T20:02:00.000Z,reference,integrations,0.78,True,"Documents schema_of_variant for Databricks’ variant type, which is a Databricks-specific data type and function.",unchanged @@ -2941,7 +2970,7 @@ https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/s https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/shiftleft,shiftleft,shiftleft - Azure Databricks,Shift bits left with PySpark shiftleft in Databricks,Learn how to use the shiftleft function with Python,"Shift the given value numBits left. For the corresponding Databricks SQL function, see shiftleft function.",2026-01-29T20:02:00.000Z,reference,integrations,0.65,True,API reference for bit shifting behavior on numeric types in Databricks PySpark.,unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/shiftright,shiftright,shiftright - Azure Databricks,Signed right bit shift with PySpark shiftright in Databricks,Learn how to use the shiftright function with Python,"(Signed) shift the given value numBits right. For the corresponding Databricks SQL function, see shiftright function.",2026-01-29T20:02:00.000Z,reference,integrations,0.65,True,"Documents signed right shift semantics, which are specific to the Databricks PySpark function.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/shiftrightunsigned,shiftrightunsigned,shiftrightunsigned - Azure Databricks,Unsigned right bit shift with PySpark shiftrightunsigned in Databricks,Learn how to use the shiftrightunsigned function with Python,"Unsigned shift the given value numBits right. 
For the corresponding Databricks SQL function, see shiftrightunsigned function.",2026-01-29T20:02:00.000Z,reference,integrations,0.65,True,"Describes unsigned right shift behavior, distinct from signed shiftright, as Databricks PySpark API detail.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/shuffle,shuffle,shuffle - Azure Databricks,,Learn how to use the shuffle function with PySpark,"Generates a random permutation of the given array. The shuffle function is non-deterministic, meaning the order of the output array can be different for each execution.",2026-04-17T21:49:00.000Z,reference,,0.2,False,"API reference for a single PySpark function (shuffle) describing behavior and non-determinism, but no product-specific limits, configuration parameters, error codes, or decision matrices that meet any sub-skill criteria.",updated +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/shuffle,shuffle,shuffle - Azure Databricks,,Learn how to use the shuffle function with PySpark,"Generates a random permutation of the given array. The shuffle function is non-deterministic, meaning the order of the output array can be different for each execution.",2026-04-17T21:49:00.000Z,reference,,0.2,False,"API reference for a single PySpark function (shuffle) describing behavior and non-determinism, but no product-specific limits, configuration parameters, error codes, or decision matrices that meet any sub-skill criteria.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/sign,sign,sign - Azure Databricks,Compute sign of values with PySpark sign in Databricks,Learn how to use the sign function with Python,"Computes the signum of the given value. Supports Spark Connect. 
For the corresponding Databricks SQL function, see sign function.",2026-01-29T20:02:00.000Z,reference,integrations,0.65,True,API reference for sign including supported types and Spark Connect support is product-specific.,unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/signum,signum,signum - Azure Databricks,Compute signum of values with PySpark signum in Databricks,Learn how to use the signum function with Python,"Computes the signum of the given value. Supports Spark Connect. For the corresponding Databricks SQL function, see signum function.",2026-01-29T20:02:00.000Z,reference,integrations,0.65,True,Similar to sign; documents exact behavior and types for Databricks PySpark.,unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/sin,sin,sin - Azure Databricks,Compute sine values with PySpark sin in Databricks,Learn how to use the sin function with Python,"Computes sine of the input column. Supports Spark Connect. For the corresponding Databricks SQL function, see sin function.",2026-01-29T20:02:00.000Z,reference,integrations,0.65,True,Function reference for sin including Spark Connect support is specific to Databricks’ PySpark API.,unchanged @@ -3086,7 +3115,7 @@ https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/t https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/timestamp_micros,timestamp_micros,timestamp_micros - Azure Databricks,,Learn how to use the timestamp\_micros function with Python,"Creates timestamp from the number of microseconds since UTC epoch. 
For the corresponding Databricks SQL function, see timestamp_micros function.",2026-01-29T20:02:00.000Z,reference,,0.3,False,"timestamp_micros function reference; standard behavior description without quotas, config parameters, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/timestamp_millis,timestamp_millis,timestamp_millis - Azure Databricks,,Learn how to use the timestamp\_millis function with Python,"Creates timestamp from the number of milliseconds since UTC epoch. For the corresponding Databricks SQL function, see timestamp_millis function.",2026-01-29T20:02:00.000Z,reference,,0.3,False,"timestamp_millis function reference; no product-specific limits, configuration tables, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/timestamp_seconds,timestamp_seconds,timestamp_seconds - Azure Databricks,,Learn how to use the timestamp\_seconds function with Python,"Converts the number of seconds from the Unix epoch (1970-01-01T00:00:00Z) to a timestamp. For the corresponding Databricks SQL function, see timestamp_seconds function.",2026-01-29T20:02:00.000Z,reference,,0.3,False,"timestamp_seconds function reference; only explains conversion behavior, not expert configuration or limits.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/to_avro,to_avro (Avro),to_avro - Azure Databricks,Use to_avro in PySpark on Azure Databricks,Learn how to use the to\_avro function with PySpark to serialize DataFrame columns into binary Avro format.,"Converts a column into binary of Avro format. If both subject and schemaRegistryAddress are provided, the function converts a column into binary of Schema Registry Avro format. 
The input data schema must have been registered to the given subject in Schema Registry, or the query fails at runtime.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Function reference pages typically include product-specific syntax, parameters, and behavior (for example, interaction with Schema Registry, required subject/schema registration, and failure modes) that go beyond generic LLM knowledge. This is an integration-focused serialization function with concrete API semantics rather than a conceptual overview.",new +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/to_avro,to_avro (Avro),to_avro - Azure Databricks,Use to_avro in PySpark on Azure Databricks,Learn how to use the to\_avro function with PySpark to serialize DataFrame columns into binary Avro format.,"Converts a column into binary of Avro format. If both subject and schemaRegistryAddress are provided, the function converts a column into binary of Schema Registry Avro format. The input data schema must have been registered to the given subject in Schema Registry, or the query fails at runtime.",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Function reference pages typically include product-specific syntax, parameters, and behavior (for example, interaction with Schema Registry, required subject/schema registration, and failure modes) that go beyond generic LLM knowledge. This is an integration-focused serialization function with concrete API semantics rather than a conceptual overview.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/to_binary,to_binary,to_binary - Azure Databricks,,Learn how to use the to\_binary function with Python,"Converts the input col to a binary value based on the supplied format. The format can be a case-insensitive string literal of ""hex"", ""utf-8"", ""utf8"", or ""base64"". By default, the binary format for conversion is ""hex"" if format is omitted. 
The function returns NULL if at least one of the input parameters is NULL. For the corresponding Databricks SQL function, see to_binary function.",2026-01-29T20:02:00.000Z,reference,,0.35,False,"to_binary function reference; lists allowed format literals but no config tables, quotas, or decision/troubleshooting structures.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/to_char,to_char,to_char - Azure Databricks,,Learn how to use the to\_char function with Python,"Convert col to a string based on the format. Throws an exception if the conversion fails. The format can consist of the following characters, case insensitive: '0' or '9': Specifies an expected digit between 0 and 9. A sequence of 0 or 9 in the format string matches a sequence of digits in the input value, generating a result string of the same length as the corresponding sequence in the format string. The result string is left-padded with zeros if the 0/9 sequence comprises more digits than the ma",2026-01-29T20:02:00.000Z,reference,,0.4,False,"to_char function reference; format pattern details are generic API semantics, not product-specific limits/quotas or config matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/to_csv,to_csv,to_csv - Azure Databricks,,Learn how to use the to\_csv function with PySpark,"Converts a column containing a StructType into a CSV string. 
Throws an exception, in the case of an unsupported type.",2026-01-29T20:02:00.000Z,reference,,0.35,False,"to_csv function reference; describes supported types and behavior, without limits, quotas, or configuration tables.",unchanged @@ -3112,7 +3141,7 @@ https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/t https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/transform_keys,transform_keys,transform_keys - Azure Databricks,,Learn how to use the transform\_keys function with Python,"Applies a function to every key-value pair in a map and returns a map with the results of those applications as the new keys for the pairs. Supports Spark Connect. For the corresponding Databricks SQL function, see transform_keys function.",2026-01-29T20:02:00.000Z,reference,,0.3,False,"transform_keys function reference; map transformation semantics only, no expert configuration or limits.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/transform_values,transform_values,transform_values - Azure Databricks,,Learn how to use the transform\_values function with Python,"Applies a function to every key-value pair in a map and returns a map with the results of those applications as the new values for the pairs. Supports Spark Connect. For the corresponding Databricks SQL function, see transform_values function.",2026-01-29T20:02:00.000Z,reference,,0.3,False,"transform_values function reference; map value transformation semantics, no quotas, config matrices, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/translate,translate,translate - Azure Databricks,,Learn how to use the translate function with Python,"The characters in replace is corresponding to the characters in matching. Translation will happen whenever any character in the string is matching with the character in the matching. 
For the corresponding Databricks SQL function, see translate function.",2026-01-29T20:02:00.000Z,reference,,0.3,False,"translate function reference; simple string operation description, no expert-only limits or configuration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/trim,trim,trim - Azure Databricks,Apply trim string function in Databricks PySpark,Learn how to use the trim function with Python,"Trim the spaces from both ends for the specified string column. For the corresponding Databricks SQL function, seetrimfunction.",2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"PySpark function reference pages usually document exact signatures, return types, null-handling, and Databricks-specific behavior. While trim is conceptually simple, the page is an API reference with concrete usage details rather than a high-level concept.",updated +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/trim,trim,trim - Azure Databricks,Apply trim string function in Databricks PySpark,Learn how to use the trim function with Python,"Trim the spaces from both ends for the specified string column. For the corresponding Databricks SQL function, see trim function.",2026-04-17T21:49:00.000Z,reference,integrations,0.65,True,"PySpark function reference pages usually document exact signatures, return types, null-handling, and Databricks-specific behavior. While trim is conceptually simple, the page is an API reference with concrete usage details rather than a high-level concept.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/trunc,trunc,trunc - Azure Databricks,,Learn how to use the trunc function with Python,"Returns date truncated to the unit specified by the format. 
For the corresponding Databricks SQL function, see trunc function.",2026-01-29T20:02:00.000Z,reference,,0.3,False,"trunc function reference; date truncation semantics only, no expert configuration or limits.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/try_add,try_add,try_add - Azure Databricks,,Learn how to use the try\_add function with Python,"Returns the sum of left and right and the result is null on overflow. The acceptable input types are the same as the + operator. Supports Spark Connect. For the corresponding Databricks SQL function, see try_add function.",2026-01-29T20:02:00.000Z,reference,,0.35,False,try_add function reference; describes null-on-overflow behavior but not in the structured limits/config/troubleshooting forms required.,unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/try_aes_decrypt,try_aes_decrypt,try_aes_decrypt - Azure Databricks,,Learn how to use the try\_aes\_decrypt function with PySpark,"This is a special version of aes_decrypt that performs the same operation, but returns a NULL value instead of raising an error if the decryption cannot be performed. Returns a decrypted value of input using AES in mode with padding. Key lengths of 16, 24 and 32 bits are supported. Supported combinations of (mode,padding) are (ECB,PKCS), (GCM,NONE) and (CBC,PKCS). Optional additional authenticated data (AAD) is only supported for GCM. 
If provided for encryption, the identical AAD value must be provided",2026-01-29T20:02:00.000Z,reference,,0.45,False,"try_aes_decrypt function reference; mentions supported modes/paddings and key lengths, but as API semantics rather than limits/quotas or configuration matrices.",unchanged @@ -3120,9 +3149,9 @@ https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/t https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/try_divide,try_divide,try_divide - Azure Databricks,,Learn how to use the try\_divide function with Python,"Returns dividend/divisor. It always performs floating point division. Its result is always null if divisor is 0. Supports Spark Connect. For the corresponding Databricks SQL function, see try_divide function.",2026-01-29T20:02:00.000Z,reference,,0.35,False,"try_divide function reference; describes division semantics and null on divisor=0, but no quotas, config tables, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/try_element_at,try_element_at,try_element_at - Azure Databricks,,Learn how to use the try\_element\_at function with Python,"Collection function: Returns element of array at given (1-based) index or value for given key in a map. For arrays, if index is 0, Spark will throw an error. If index < 0, accesses elements from the last to the first. The function always returns NULL if the index exceeds the length of the array. For maps, the function always returns NULL if the key is not contained in the map. 
For the corresponding Databricks SQL function, see try_element_at function.",2026-01-29T20:02:00.000Z,reference,,0.4,False,"try_element_at function reference; includes some edge-case behavior (index 0, negative index) but not in the structured best-practices/troubleshooting/config forms required.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/try_make_interval,try_make_interval,try_make_interval - Azure Databricks,,Learn how to use the try\_make\_interval function with Python,"This is a special version of make_interval that performs the same operation, but returns a NULL value instead of raising an error if interval cannot be created.",2026-01-29T20:02:00.000Z,reference,,0.35,False,"try_make_interval function reference; describes null-on-error behavior, but no limits, quotas, or configuration matrices.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/try_make_timestamp,try_make_timestamp,try_make_timestamp - Azure Databricks,Create timestamps with try_make_timestamp in Databricks,Learn how to use the try\_make\_timestamp function with Python,"Try to create timestamp from years, months, days, hours, mins, secs and (optional) timezone fields. Alternatively, try to create timestamp from date, time, and (optional) timezone fields. The result data type is consistent with the value of configurationspark.sql.timestampType. The function returns NULL on invalid inputs. Added in Databricks Runtime 16.1",2026-04-17T08:00:00.000Z,reference,integrations,0.75,True,"Documents a Databricks-specific PySpark function including its arguments, behavior on invalid input (returns NULL), and dependence on spark.sql.timestampType. 
These are concrete API semantics and configuration interactions that qualify as expert, product-specific knowledge.",updated -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/try_make_timestamp_ltz,try_make_timestamp_ltz,try_make_timestamp_ltz - Azure Databricks,Use try_make_timestamp_ltz for local-timezone timestamps,Learn how to use the try\_make\_timestamp\_ltz function with Python,"Try to create the current timestamp with local time zone from years, months, days, hours, mins, secs and timezone fields. The function returns NULL on invalid inputs. Added in Databricks Runtime 16.1",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Describes a specific Databricks Runtime function added in version 16.1, including its parameters, local time zone semantics, and NULL behavior on invalid input. This is detailed API behavior tied to a particular product/runtime, fitting integrations & coding patterns.",updated -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/try_make_timestamp_ntz,try_make_timestamp_ntz,try_make_timestamp_ntz - Azure Databricks,Use try_make_timestamp_ntz for local date-time values,Learn how to use the try\_make\_timestamp\_ntz function with Python,"Try to create local date-time from years, months, days, hours, mins, secs fields. Alternatively, try to create local date-time from date and time fields. The function returns NULL on invalid inputs. Added in Databricks Runtime 16.1",2026-04-17T08:00:00.000Z,reference,integrations,0.75,True,"Covers a Databricks-specific PySpark function with defined argument patterns (date/time vs components), local date-time semantics, version introduction (Runtime 16.1), and NULL behavior. 
These are concrete, product-specific API details suitable for an integrations skill.",updated +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/try_make_timestamp,try_make_timestamp,try_make_timestamp - Azure Databricks,Create timestamps with try_make_timestamp in Databricks,Learn how to use the try\_make\_timestamp function with Python,"Try to create timestamp from years, months, days, hours, mins, secs and (optional) timezone fields. Alternatively, try to create timestamp from date, time, and (optional) timezone fields. The result data type is consistent with the value of configuration spark.sql.timestampType. The function returns NULL on invalid inputs. Added in Databricks Runtime 16.1",2026-04-17T08:00:00.000Z,reference,integrations,0.75,True,"Documents a Databricks-specific PySpark function including its arguments, behavior on invalid input (returns NULL), and dependence on spark.sql.timestampType. These are concrete API semantics and configuration interactions that qualify as expert, product-specific knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/try_make_timestamp_ltz,try_make_timestamp_ltz,try_make_timestamp_ltz - Azure Databricks,Use try_make_timestamp_ltz for local-timezone timestamps,Learn how to use the try\_make\_timestamp\_ltz function with Python,"Try to create the current timestamp with local time zone from years, months, days, hours, mins, secs and timezone fields. The function returns NULL on invalid inputs. Added in Databricks Runtime 16.1",2026-04-17T21:49:00.000Z,reference,integrations,0.75,True,"Describes a specific Databricks Runtime function added in version 16.1, including its parameters, local time zone semantics, and NULL behavior on invalid input. 
This is detailed API behavior tied to a particular product/runtime, fitting integrations & coding patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/try_make_timestamp_ntz,try_make_timestamp_ntz,try_make_timestamp_ntz - Azure Databricks,Use try_make_timestamp_ntz for local date-time values,Learn how to use the try\_make\_timestamp\_ntz function with Python,"Try to create local date-time from years, months, days, hours, mins, secs fields. Alternatively, try to create local date-time from date and time fields. The function returns NULL on invalid inputs. Added in Databricks Runtime 16.1",2026-04-17T08:00:00.000Z,reference,integrations,0.75,True,"Covers a Databricks-specific PySpark function with defined argument patterns (date/time vs components), local date-time semantics, version introduction (Runtime 16.1), and NULL behavior. These are concrete, product-specific API details suitable for an integrations skill.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/try_mod,try_mod,try_mod - Azure Databricks,Use try_mod for safe modulo in Databricks PySpark,Learn how to use the try\_mod function with Python,"Returns the remainder after dividend/divisor. Its result is always null if divisor is 0. Supports Spark Connect. For the corresponding Databricks SQL function, see try_mod function.",2026-01-29T20:02:00.000Z,reference,integrations,0.68,True,"Describes a Databricks PySpark function with defined behavior on divisor=0 and Spark Connect support, which is specific API behavior.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/try_multiply,try_multiply,try_multiply - Azure Databricks,Use try_multiply for overflow-safe multiplication,Learn how to use the try\_multiply function with Python,"Returns left * right and the result is null on overflow. The acceptable input types are the same as the * operator. Supports Spark Connect. 
For the corresponding Databricks SQL function, see try_multiply function.",2026-01-29T20:02:00.000Z,reference,integrations,0.68,True,"Details a Databricks PySpark function that returns null on overflow and aligns with operator type rules, which is product-specific behavior.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/try_parse_json,try_parse_json,try_parse_json - Azure Databricks,Parse JSON safely with try_parse_json in PySpark,Learn how to use the try\_parse\_json function with PySpark,Parses a column containing a JSON string into a VariantType. Returns None if a string contains an invalid JSON value.,2026-01-29T20:02:00.000Z,reference,integrations,0.72,True,Function reference for parsing JSON into VariantType with null on invalid JSON; VariantType and behavior are Databricks-specific API details.,unchanged @@ -3142,7 +3171,7 @@ https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/t https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/try_variant_get,try_variant_get,try_variant_get - Azure Databricks,Extract sub-variants with try_variant_get in PySpark,Learn how to use the try\_variant\_get function with PySpark,"Extracts a sub-variant from v according to path, and then casts the sub-variant to targetType. Returns null if the path does not exist or the cast fails.",2026-01-29T20:02:00.000Z,reference,integrations,0.72,True,"Covers Variant type handling, path semantics, casting to targetType, and null-on-missing/cast-fail behavior, all Databricks-specific.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/try_zstd_decompress,try_zstd_decompress,try_zstd_decompress - Azure Databricks,Decompress Zstandard data with try_zstd_decompress,Learn how to use the try\_zstd\_decompress function with Python,"Returns the decompressed value of expr using Zstandard. Supports data compressed in both single-pass mode and streaming mode. 
On decompression failure, it returns NULL.",2026-01-29T20:02:00.000Z,reference,integrations,0.7,True,"Describes Databricks PySpark function behavior for Zstandard decompression, including streaming vs single-pass and null on failure.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/tvf-collations,collations (TVF),collations - Azure Databricks,,Learn how to use the collations function with PySpark,Get all of the Spark SQL string collations.,2026-01-29T20:02:00.000Z,reference,,0.35,False,API reference for collations TVF; likely lists collations but not in the form of limits/quotas or decision matrices as defined.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/tvf-explode,explode (TVF),TableValuedFunction.explode - Azure Databricks,Use TableValuedFunction.explode in Azure Databricks PySpark,Learn how to use the TableValuedFunction.explode function with PySpark,"Returns a DataFrame containing a new row for each element in the given array or map. The default column name iscolfor elements in an array andkeyandvaluefor elements in a map. To use different column names, calltoDF()on the returned DataFrame.",2026-04-17T18:03:00.000Z,reference,integrations,0.68,True,"The page documents a specific Azure Databricks PySpark table-valued function, including its behavior (how arrays/maps are expanded into rows) and default column names (iscol, key, value). These are product- and API-specific details that are not generic PySpark knowledge and qualify as expert integration/coding pattern knowledge. 
It is not about limits, architecture, or configuration, but about a concrete API surface and its semantics.",updated +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/tvf-explode,explode (TVF),TableValuedFunction.explode - Azure Databricks,Use TableValuedFunction.explode in Azure Databricks PySpark,Learn how to use the TableValuedFunction.explode function with PySpark,"Returns a DataFrame containing a new row for each element in the given array or map. The default column name is col for elements in an array and key and value for elements in a map. To use different column names, call toDF() on the returned DataFrame.",2026-04-23T08:00:00.000Z,reference,integrations,0.68,True,"The page documents a specific Azure Databricks PySpark table-valued function, including its behavior (how arrays/maps are expanded into rows) and default column names (col, key, value). These are product- and API-specific details that go beyond generic LLM knowledge and are needed to correctly integrate and use this function in code. It fits best under integrations & coding patterns because it is a concrete API reference for a Databricks-specific PySpark function rather than general configuration, limits, or architecture guidance.",updated https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/tvf-explode_outer,explode_outer (TVF),TableValuedFunction.explode_outer - Azure Databricks,Use TableValuedFunction.explode_outer in Azure Databricks,Learn how to use the TableValuedFunction.explode\_outer function with PySpark,"Returns a DataFrame containing a new row for each element with position in the given array or map. Unlike explode, if the array/map is null or empty then null is produced. 
Uses the default column name col for elements in the array and key and value for elements in the map unless specified otherwise.",2026-01-29T20:02:00.000Z,reference,integrations,0.7,True,"Documents a Databricks-specific table-valued PySpark API, including behavior on null/empty arrays and default column names; this is concrete product API knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/tvf-inline,inline (TVF),TableValuedFunction.inline - Azure Databricks,Use TableValuedFunction.inline in Databricks PySpark,Learn how to use the TableValuedFunction.inline function with PySpark,Explodes an array of structs into a table. This function takes an input column containing an array of structs and returns a new column where each struct in the array is exploded into a separate row.,2026-01-29T20:02:00.000Z,reference,integrations,0.75,True,Product-specific TVF.inline API for exploding arrays of structs into tables.,unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/tvf-inline_outer,inline_outer (TVF),TableValuedFunction.inline_outer - Azure Databricks,Use TableValuedFunction.inline_outer in Azure Databricks,Learn how to use the TableValuedFunction.inline\_outer function with PySpark,"Explodes an array of structs into a table. 
Unlike inline, if the array is null or empty then null is produced for each nested column.",2026-01-29T20:02:00.000Z,reference,integrations,0.7,True,"Documents a Databricks-specific table-valued PySpark function, including behavior differences from inline, which is product-specific API knowledge.",unchanged @@ -3161,7 +3190,7 @@ https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/u https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/udtf,udtf,udtf - Azure Databricks,Create user-defined table functions (UDTF) in PySpark,Learn how to use the udtf function with Python,Creates a user defined table function (UDTF).,2026-01-29T20:02:00.000Z,reference,integrations,0.78,True,"Explains Databricks’ udtf helper, including how to define and use table-valued functions, which is a product-specific integration pattern.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/unbase64,unbase64,unbase64 - Azure Databricks,Decode Base64 strings with unbase64 in PySpark,Learn how to use the unbase64 function with Python,"Decodes a BASE64 encoded string column and returns it as a binary column. For the corresponding Databricks SQL function, see unbase64 function.",2026-01-29T20:02:00.000Z,reference,integrations,0.66,True,"Function reference for Databricks’ unbase64, including return type and mapping to SQL function, which is specific API behavior.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/unhex,unhex,unhex - Azure Databricks,Convert hex strings to bytes with unhex in PySpark,Learn how to use the unhex function with Python,"Inverse of hex. Interprets each pair of characters as a hexadecimal number and converts to the byte representation of number. Supports Spark Connect. 
For the corresponding Databricks SQL function, see unhex function.",2026-01-29T20:02:00.000Z,reference,integrations,0.66,True,"Documents Databricks’ unhex function behavior, including Spark Connect support and SQL mapping, which is product-specific.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/uniform,uniform,uniform - Azure Databricks,Use the uniform random function in Azure Databricks PySpark,Learn how to use the uniform function with Python,"Returns a random value with independent and identically distributed (i.i.d.) values with the specified range of numbers. The random seed is optional. The provided numbers specifying the minimum and maximum values of the range must be constant. If both of these numbers are integers, then the result will also be an integer. Otherwise if one or both of these are floating-point numbers, then the result will also be a floating-point number. Supports Spark Connect. Added in Databricks Runtime 16.1 For",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Function reference page with product-specific behavior (i.i.d. semantics, constant-argument requirement, integer vs float result rules, Spark Connect support, and Databricks Runtime version added). This is concrete API behavior and constraints that go beyond generic knowledge, fitting best under integrations/coding patterns.",updated +https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/uniform,uniform,uniform - Azure Databricks,Use the uniform random function in Azure Databricks PySpark,Learn how to use the uniform function with Python,"Returns a random value with independent and identically distributed (i.i.d.) values with the specified range of numbers. The random seed is optional. The provided numbers specifying the minimum and maximum values of the range must be constant. If both of these numbers are integers, then the result will also be an integer. 
Otherwise if one or both of these are floating-point numbers, then the result will also be a floating-point number. Supports Spark Connect. Added in Databricks Runtime 16.1 For",2026-04-17T21:49:00.000Z,reference,integrations,0.7,True,"Function reference page with product-specific behavior (i.i.d. semantics, constant-argument requirement, integer vs float result rules, Spark Connect support, and Databricks Runtime version added). This is concrete API behavior and constraints that go beyond generic knowledge, fitting best under integrations/coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/unix_date,unix_date,unix_date - Azure Databricks,Get days since epoch with unix_date in PySpark,Learn how to use the unix\_date function with Python,"Returns the number of days since 1970-01-01. For the corresponding Databricks SQL function, see unix_date function.",2026-01-29T20:02:00.000Z,reference,integrations,0.64,True,"Function reference for Databricks’ unix_date, including exact epoch and return semantics, which are API-specific.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/unix_micros,unix_micros,unix_micros - Azure Databricks,Get microseconds since epoch with unix_micros,Learn how to use the unix\_micros function with Python,"Returns the number of microseconds since 1970-01-01 00:00:00 UTC. For the corresponding Databricks SQL function, see unix_micros function.",2026-01-29T20:02:00.000Z,reference,integrations,0.64,True,"Documents Databricks’ unix_micros behavior and units, which is specific to this API.",unchanged https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/unix_millis,unix_millis,unix_millis - Azure Databricks,Get milliseconds since epoch with unix_millis,Learn how to use the unix\_millis function with Python,"Returns the number of milliseconds since 1970-01-01 00:00:00 UTC. Truncates higher levels of precision. 
For the corresponding Databricks SQL function, see unix_millis function.",2026-01-29T20:02:00.000Z,reference,integrations,0.64,True,"Explains Databricks’ unix_millis semantics, including truncation of higher precision, which is product-specific behavior.",unchanged @@ -3204,21 +3233,21 @@ https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/z https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/zstd_decompress,zstd_decompress,zstd_decompress - Azure Databricks,Decompress Zstandard data with zstd_decompress,Learn how to use the zstd\_decompress function with Python,"Returns the decompressed value of expr using Zstandard. Supports data compressed in both single-pass mode and streaming mode. On decompression failure, it throws an exception.",2026-01-29T20:02:00.000Z,reference,integrations,0.8,True,Documents zstd_decompress behavior including support for single-pass and streaming modes and exception on failure.,unchanged https://learn.microsoft.com/en-us/azure/databricks/query-federation/,Set up Lakehouse Federation,What is Lakehouse Federation? - Azure Databricks,,Learn about Databricks Lakehouse Federation and how to use it to run federated queries against multiple external data sources.,Lakehouse Federation is the query federation platform for Databricks. The term query federation describes a collection of features that enable users and systems to run queries against multiple data sources without needing to migrate all data to a unified system. There are two types of federation: query federation and catalog federation. 
This page covers the differences between the types.,2026-02-13T08:00:00.000Z,overview,,0.3,False,"Conceptual overview of Lakehouse Federation and types of federation; lacks detailed config, limits, or security role mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/query-federation/bigquery,BigQuery,Run federated queries on Google BigQuery - Azure Databricks,Configure Databricks Lakehouse Federation for BigQuery,Learn how to configure Azure Databricks Lakehouse Federation to run federated queries on Google BigQuery data that is not managed by Azure Databricks.,"This page describes how to set up Lakehouse Federation to run federated queries on BigQuery data that is not managed by Azure Databricks. To learn more about Lakehouse Federation, seeWhat is Lakehouse Federation? To connect to your BigQuery database using Lakehouse Federation, you must create the following in your Azure Databricks Unity Catalog metastore (workspaces created after November 9, 2023 already have a Unity Catalog metastore provisioned automatically):",2026-04-02T22:35:00.000Z,how-to,integrations,0.78,True,"BigQuery-specific federation setup, including Unity Catalog objects and connection configuration; likely includes parameter names, auth options, and constraints unique to this integration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/query-federation/catalog-federation,Concepts,What is catalog federation? - Azure Databricks,,"Learn about catalog federation in Databricks Lakehouse Federation and how to use it to query an external Hive metastore, AWS Glue, a legacy Databricks Hive metastore, Snowflake, or OneLake.","With catalog federation, you directly access the foreign table in object storage. The query is only executed using Databricks compute and is therefore more cost-effective and performance-optimized. 
Catalog federation is used to federate to platforms that have catalog services and support open table formats like external Hive metastores, legacy Databricks Hive metastores, Salesforce Data 360, Snowflake, and OneLake.",2026-04-17T08:00:00.000Z,concept-article,,0.1,False,"Conceptual overview of catalog federation and supported platforms. The summary indicates high-level behavior and benefits without concrete configuration parameters, limits, or decision matrices, so it does not meet any expert-knowledge criteria.",updated +https://learn.microsoft.com/en-us/azure/databricks/query-federation/catalog-federation,Concepts,What is catalog federation? - Azure Databricks,,"Learn about catalog federation in Databricks Lakehouse Federation and how to use it to query an external Hive metastore, AWS Glue, a legacy Databricks Hive metastore, Snowflake, or OneLake.","With catalog federation, you directly access the foreign table in object storage. The query is only executed using Databricks compute and is therefore more cost-effective and performance-optimized. Catalog federation is used to federate to platforms that have catalog services and support open table formats like external Hive metastores, legacy Databricks Hive metastores, Salesforce Data 360, Snowflake, and OneLake.",2026-04-17T08:00:00.000Z,concept-article,,0.1,False,"Conceptual overview of catalog federation and supported platforms. 
The summary indicates high-level behavior and benefits without concrete configuration parameters, limits, or decision matrices, so it does not meet any expert-knowledge criteria.",unchanged https://learn.microsoft.com/en-us/azure/databricks/query-federation/connections,Manage connections,Manage connections for Lakehouse Federation - Azure Databricks,Manage Lakehouse Federation connections in Unity Catalog,"Learn how to manage Azure Databricks Lakehouse Federation connections, including listing, managing permissions, and dropping them.","This article describes how to list all Lakehouse Federation connections defined in a Unity Catalog metastore, get connection details, grant connection permissions, and drop connections using Catalog Explorer and SQL statements in notebooks or the Databricks SQL query editor. Aconnectionis a securable object in Unity Catalog that specifies a path and credentials for accessing an external database system. See alsoCreate a connection. If you prefer to use the REST API, seeAzure Databricks reference",2026-02-23T23:41:00.000Z,how-to,configuration,0.82,True,"Documents connection objects, SQL/REST operations, and permission management for Lakehouse Federation connections, with concrete parameter usage.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/query-federation/database-federation,Concepts,What is query federation? - Azure Databricks,,"Learn about catalog federation in Databricks Lakehouse Federation and how to use it to query an external Hive metastore, AWS Glue, a legacy internal Databricks Hive metastore, or Snowflake.","With query federation, queries are pushed down to the foreign database using JDBC APIs. The query is executed both in Databricks and using remote compute. Query federation is used for sources like MySQL, PostgreSQL, Redshift, Teradata, and more.",2026-04-17T21:49:00.000Z,concept-article,,0.2,False,"Described as a conceptual 'What is query federation?' 
overview explaining pushdown and supported sources; no indication of specific limits, configuration tables, error codes, or decision matrices, so it does not meet the expert-knowledge criteria for any sub-skill.",updated +https://learn.microsoft.com/en-us/azure/databricks/query-federation/database-federation,Concepts,What is query federation? - Azure Databricks,,"Learn about catalog federation in Databricks Lakehouse Federation and how to use it to query an external Hive metastore, AWS Glue, a legacy internal Databricks Hive metastore, or Snowflake.","With query federation, queries are pushed down to the foreign database using JDBC APIs. The query is executed both in Databricks and using remote compute. Query federation is used for sources like MySQL, PostgreSQL, Redshift, Teradata, and more.",2026-04-17T21:49:00.000Z,concept-article,,0.2,False,"Described as a conceptual 'What is query federation?' overview explaining pushdown and supported sources; no indication of specific limits, configuration tables, error codes, or decision matrices, so it does not meet the expert-knowledge criteria for any sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/databricks/query-federation/databricks,Databricks,Run federated queries on another Databricks workspace - Azure Databricks,Federate Databricks queries across workspaces,Learn how to configure Azure Databricks Lakehouse Federation to run federated queries on data that is in another Databricks workspace.,"This article describes how to set up Lakehouse Federation to run federated queries on Databricks data in another Databricks workspace. To learn more about Lakehouse Federation, seeWhat is Lakehouse Federation?. Important Databricks-to-Databricks Lakehouse Federation is a good tool for running queries on data managed by another Databricks workspace's Hive or AWS Glue metastore. 
For most other scenarios, other Azure Databricks workflows are more efficient: There is no need to set up Lakehouse Fede",2026-03-04T19:15:00.000Z,how-to,integrations,0.8,True,"Describes Databricks-to-Databricks federation with specific connection objects, when to use it vs other workflows, and configuration details unique to the platform.",unchanged https://learn.microsoft.com/en-us/azure/databricks/query-federation/foreign-catalogs,Manage foreign catalogs,Manage and work with foreign catalogs - Azure Databricks,Manage and query foreign catalogs in Databricks,"Learn how to work with foreign catalogs that mirror an external database, using Azure Databricks Lakehouse Federation.","This article describes how to manage foreign catalogs and work with data in foreign catalogs. A foreign catalog is a securable object in Unity Catalog that mirrors a database in an external data system, enabling you to perform read-only queries on that data system in your Azure Databricks workspace, managing access using Unity Catalog. 
You work with foreign catalogs in the same way that you work with any catalog managed by Unity Catalog: View information about a foreign catalog: see View catalog ",2026-01-20T08:00:00.000Z,concept-article,configuration,0.7,True,"Describes how to manage foreign catalogs (DDL, privileges, usage patterns) with specific commands and behaviors unique to Unity Catalog foreign catalogs, which are configuration/usage details not covered by generic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/query-federation/hms-federation-concepts,Hive metastore federation,Hive metastore federation: enable Unity Catalog to govern tables registered in a Hive metastore - Azure Databricks,Plan and use Hive metastore federation with Unity Catalog,"Learn about Hive metastore federation, the Azure Databricks feature that enables you to use Unity Catalog to govern tables that are registered in a Hive metastore.","This article introduces Hive metastore federation, a feature that enables Unity Catalog to govern tables that are stored in a Hive metastore. You can federate an external Hive metastore or a legacy internal Azure Databricks Hive metastore. 
Hive metastore federation can be used for the following use cases: As a step in the migration path to Unity Catalog, enabling incremental migration without code adaptation, with some of your workloads continuing to use data registered in your Hive metastore wh",2026-02-20T08:00:00.000Z,concept-article,decision-making,0.64,True,"Covers when and how to use Hive metastore federation, including migration-path use cases and trade-offs between legacy Hive and Unity Catalog; this is product-specific decision guidance rather than just conceptual description.",unchanged https://learn.microsoft.com/en-us/azure/databricks/query-federation/hms-federation-external,Federate external HMS,Enable Hive metastore federation for an external Hive metastore - Azure Databricks,Configure external Hive metastore federation in Databricks,Learn how to enable Hive metastore federation for external metastores. Hive metastore federation enables you to use Unity Catalog to govern tables that are registered in a Hive metastore.,"This article shows how to federate an external Hive metastore so that your organization can work with your Hive metastore tables using Unity Catalog. For an overview of Hive metastore federation, seeHive metastore federation: enable Unity Catalog to govern tables registered in a Hive metastore.",2026-03-20T17:45:00.000Z,how-to,configuration,0.7,True,"Shows how to federate an external Hive metastore with Unity Catalog, including connector configuration, permissions, and object mappings; these are concrete configuration parameters and steps.",unchanged https://learn.microsoft.com/en-us/azure/databricks/query-federation/hms-federation-internal,Federate legacy workspace HMS,Enable Hive metastore federation for a legacy workspace Hive metastore - Azure Databricks,Enable Hive metastore federation for legacy Databricks workspaces,Learn how to enable Hive metastore federation for internal Azure Databricks workspace Hive metastores. 
Hive metastore federation enables you to use Unity Catalog to govern tables that are registered i,"This article shows how to federate your legacy Azure Databricks Hive metastore so that your organization can work with your Hive metastore tables using Unity Catalog. For an overview of Hive metastore federation, seeHive metastore federation: enable Unity Catalog to govern tables registered in a Hive metastore.",2026-01-20T08:00:00.000Z,how-to,configuration,0.7,True,"Step-by-step configuration for federating an internal workspace Hive metastore, likely including specific setting names, commands, and required Unity Catalog objects; this is detailed configuration knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/query-federation/http,Connect to external HTTP services,Connect to external HTTP services - Azure Databricks,Create Unity Catalog HTTP connections for external APIs,Learn how to create a Unity Catalog connection to an external HTTP service and send authenticated requests from Azure Databricks using the HTTP connection proxy.,"Important This feature is inPublic Preview. An HTTP connection is a Unity Catalog securable object that stores endpoint and credential information for an external HTTP service. Use an HTTP connection to send authenticated requests to external REST APIs, MCP servers, and AI agent tools from Azure Databricks without embedding credentials in your code.",2026-04-16T08:00:00.000Z,how-to,integrations,0.8,True,"Describes creating a Unity Catalog HTTP connection object to call external REST APIs, MCP servers, and AI tools. 
This implies detailed configuration fields (endpoint, credential storage, proxy behavior) and product-specific parameterization for authenticated requests, matching the integrations category with expert configuration knowledge.",new -https://learn.microsoft.com/en-us/azure/databricks/query-federation/http-migration,Migrate to serverless routing,Migrate to serverless routing for HTTP connections - Azure Databricks,Migrate Databricks HTTP routing to serverless compute,Migrate HTTP connection routing from control plane to serverless compute for improved network security and private connectivity.,"Azure Databricks is moving HTTP connection routing from the control plane to the serverless compute plane, which improves network security and enables Private Link for private connectivity. On April 30, 2026, all workspaces will be automatically migrated. If your workspace was created before March 2026 and your external services useIP-based firewall rules, you must eithermigrate to Private Linkorupdate your allowlists to use serverless outbound IPsto avoid connectivity failures. If you are unsur",2026-03-24T01:32:00.000Z,how-to,decision-making,0.7,True,"Describes a time-bound migration (with specific dates) from control plane to serverless routing, including conditions when you must migrate, options (Private Link vs IP allowlists), and consequences; this is concrete guidance for choosing and planning migration paths.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/query-federation/http,Connect to external HTTP services,Connect to external HTTP services - Azure Databricks,Create and use HTTP connections in Databricks,Learn how to create a Unity Catalog connection to an external HTTP service and send authenticated requests from Azure Databricks using the HTTP connection proxy.,"Important This feature is inPublic Preview. An HTTP connection is a Unity Catalog securable object that stores endpoint and credential information for an external HTTP service. 
Use an HTTP connection to send authenticated requests to external REST APIs, MCP servers, and AI agent tools from Azure Databricks without embedding credentials in your code.",2026-04-23T08:00:00.000Z,how-to,integrations,0.86,True,"Details Unity Catalog HTTP connection objects, including endpoint and credential handling, and how to send authenticated requests via the HTTP connection proxy. Contains product-specific configuration concepts and usage patterns for external REST/MCP/agent tools integration.",updated +https://learn.microsoft.com/en-us/azure/databricks/query-federation/http-migration,Migrate to serverless routing,Migrate to serverless routing for HTTP connections - Azure Databricks,Migrate Databricks HTTP routing to serverless,Migrate HTTP connection routing from control plane to serverless compute for improved network security and private connectivity.,"Azure Databricks is moving HTTP connection routing from the control plane to the serverless compute plane, which improves network security and enables Private Link for private connectivity. On May 30, 2026, all workspaces will be automatically migrated. If your workspace was created before March 2026 and your external services useIP-based firewall rules, you must eithermigrate to Private Linkorupdate your allowlists to use serverless outbound IPsto avoid connectivity failures. If you are unsure ",2026-04-23T08:00:00.000Z,how-to,deployment,0.64,True,"Covers a dated migration requirement from control-plane to serverless routing for HTTP connections, including a hard migration deadline and workspace-creation date conditions that affect connectivity. 
These are product-specific deployment/runtime constraints and timelines not inferable from general knowledge.",updated https://learn.microsoft.com/en-us/azure/databricks/query-federation/migrate,Migrate to Lakehouse Federation,Lakehouse Federation: Migrate legacy query federation connections - Azure Databricks,Migrate legacy Databricks query federation to Lakehouse Federation,Learn how to migrate legacy query federation connections to Azure Databricks Lakehouse Federation.,"If you have set uplegacy query federation connections, Databricks recommends that you migrate them to useLakehouse Federation. Legacy query federation involved creating tables in Azure Databricks that referenced an external data source. To “move” those tables into Unity Catalog using Lakehouse Federation, you must create a Lakehouse Federation connection and foreign catalog for the database that includes the table. You can then grant user access to the catalog, or to schemas and tables in the ca",2024-06-18T08:00:00.000Z,how-to,decision-making,0.7,True,"Guides migration from legacy federation to Lakehouse Federation, including mapping of old objects to new connections/catalogs and recommended migration steps.",unchanged https://learn.microsoft.com/en-us/azure/databricks/query-federation/mysql,MySQL,Run federated queries on MySQL - Azure Databricks,Set up Lakehouse Federation for MySQL,Learn how to configure Azure Databricks Lakehouse Federation to run federated queries on MySQL data that is not managed by Azure Databricks.,"This page describes how to set up Lakehouse Federation to run federated queries on MySQL data that is not managed by Azure Databricks. To learn more about Lakehouse Federation, seeWhat is Lakehouse Federation? 
To connect to your MySQL database using Lakehouse Federation, you must create the following in your Azure Databricks Unity Catalog metastore (workspaces created after November 9, 2023 already have a Unity Catalog metastore provisioned automatically):",2026-03-31T23:28:00.000Z,how-to,integrations,0.8,True,"Step-by-step configuration for MySQL federation, including required Unity Catalog objects and connection parameters; product-specific integration pattern.",unchanged https://learn.microsoft.com/en-us/azure/databricks/query-federation/networking,Networking recommendations,Networking recommendations for Lakehouse Federation - Azure Databricks,Configure networking for Databricks Lakehouse Federation data sources,Learn about how to configure network paths between Azure Databricks and data sources in a Lakehouse Federation scenario.,"This article provides guidance for setting up a viable network path between your Azure Databricks clusters or SQL warehouses and the external database system that you are connecting to using Lakehouse Federation. Bear the following important information in mind: For more information on networking in Azure Databricks workspaces, seeNetworking.",2026-02-04T08:00:00.000Z,best-practice,best-practices,0.68,True,"Provides concrete networking guidance (required paths, supported patterns, likely port/protocol and topology recommendations) specific to Lakehouse Federation scenarios; these are actionable product-specific best practices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/query-federation/onelake,OneLake federation,Enable OneLake catalog federation - Azure Databricks,Configure OneLake catalog federation with Unity Catalog,Learn how to enable OneLake federation to directly access Lakehouse and Warehouse data. Enable Unity Catalog to govern tables stored in OneLake.,"Important This feature is inPublic Preview. This article shows how to read data in OneLake using catalog federation. 
This allows Unity Catalog queries to run directly against OneLake storage. OneLake federation enables you to analyze data stored in your Lakehouse or Warehouse without copying it, bringing powerful analytics and AI/BI capabilities in Azure Databricks directly to your OneLake data. Data access is read-only.",2026-03-31T08:00:00.000Z,how-to,configuration,0.72,True,"Explains how to set up catalog federation to OneLake, including required Unity Catalog objects, permissions, and configuration parameters for read-only access; these are concrete configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/query-federation/oracle,Oracle,Run federated queries on Oracle - Azure Databricks,Configure Databricks Lakehouse Federation for Oracle,Learn how to configure Azure Databricks Lakehouse Federation to run federated queries on Oracle data that is not managed by Azure Databricks.,"This page describes how to set up Lakehouse Federation to run federated queries on Oracle data that is not managed by Azure Databricks. To learn more about Lakehouse Federation, seeWhat is Lakehouse Federation? 
To connect to your Oracle database using Lakehouse Federation, you must create the following in your Azure Databricks Unity Catalog metastore (workspaces created after November 9, 2023 already have a Unity Catalog metastore provisioned automatically):",2026-03-31T23:28:00.000Z,how-to,integrations,0.78,True,"Connector-specific setup for Oracle via Lakehouse Federation, including Unity Catalog foreign catalog creation, connection options, and likely parameter names/values and JDBC/driver details that are product-specific integration knowledge rather than generic concepts.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/query-federation/oracle,Oracle,Run federated queries on Oracle - Azure Databricks,Configure Databricks Lakehouse Federation for Oracle,Learn how to configure Azure Databricks Lakehouse Federation to run federated queries on Oracle data that is not managed by Azure Databricks.,"This page describes how to set up Lakehouse Federation to run federated queries on Oracle data that is not managed by Azure Databricks. To learn more about Lakehouse Federation, seeWhat is Lakehouse Federation? To connect to your Oracle database using Lakehouse Federation, you must create the following in your Azure Databricks Unity Catalog metastore (workspaces created after November 9, 2023 already have a Unity Catalog metastore provisioned automatically):",2026-04-23T08:00:00.000Z,how-to,integrations,0.78,True,"Integration-focused page with product-specific connection objects, required properties, and configuration steps for Oracle as a federated source in Unity Catalog. 
Contains concrete parameter names and patterns unique to Databricks Lakehouse Federation rather than generic SQL/Oracle usage.",updated https://learn.microsoft.com/en-us/azure/databricks/query-federation/performance-recommendations,Performance recommendations,Lakehouse Federation performance recommendations - Azure Databricks,Optimize performance of Databricks Lakehouse Federation queries,Learn how to improve the performance of Databricks Lakehouse Federation queries.,This article provides guidance for improving the performance of Lakehouse Federation queries.,2026-03-26T08:00:00.000Z,best-practice,best-practices,0.76,True,"Provides concrete performance recommendations for Lakehouse Federation (for example pushdown strategies, configuration tweaks, and query patterns) that are specific to this feature and go beyond generic SQL tuning.",unchanged https://learn.microsoft.com/en-us/azure/databricks/query-federation/postgresql,PostgreSQL,Run federated queries on PostgreSQL - Azure Databricks,Set up Lakehouse Federation for PostgreSQL,Learn how to configure Azure Databricks Lakehouse Federation to run federated queries on PostgreSQL data that is not managed by Azure Databricks.,"This page describes how to set up Lakehouse Federation to run federated queries on PostgreSQL data that is not managed by Azure Databricks. To learn more about Lakehouse Federation, see What is Lakehouse Federation? 
To connect to your PostgreSQL database using Lakehouse Federation, you must create the following in your Azure Databricks Unity Catalog metastore (workspaces created after November 9, 2023 already have a Unity Catalog metastore provisioned automatically):",2026-03-31T23:28:00.000Z,how-to,integrations,0.8,True,"Step-by-step configuration for PostgreSQL federation, including required Unity Catalog objects and connection parameters; product-specific integration pattern.",unchanged https://learn.microsoft.com/en-us/azure/databricks/query-federation/redshift,Redshift,Run federated queries on Amazon Redshift - Azure Databricks,Configure Databricks federated queries to Amazon Redshift,Learn how to configure Azure Databricks Lakehouse Federation to run federated queries on Amazon Redshift data that is not managed by Azure Databricks.,"This page describes how to set up Lakehouse Federation to run federated queries on Amazon Redshift data that is not managed by Azure Databricks. To learn more about Lakehouse Federation, see What is Lakehouse Federation? 
To connect to your Amazon Redshift database using Lakehouse Federation, you must create the following in your Azure Databricks Unity Catalog metastore (workspaces created after November 9, 2023 already have a Unity Catalog metastore provisioned autom",2026-03-04T19:15:00.000Z,how-to,integrations,0.8,True,"Provides concrete setup for Databricks–Redshift federation, including connection parameters and SQL usage patterns beyond generic knowledge.",unchanged @@ -3229,9 +3258,9 @@ https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake,Bu https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-basic-auth,Basic authentication,Run federated queries on Snowflake (basic authentication) - Azure Databricks,Federate Databricks queries to Snowflake with basic auth,Learn how to configure Azure Databricks Lakehouse Federation to run federated queries on Snowflake data that is not managed by Azure Databricks. Authenticate using Databricks basic authentication (use,"This page describes how to set up Lakehouse Federation to run federated queries on Snowflake data that is not managed by Azure Databricks. To learn more about Lakehouse Federation, see What is Lakehouse Federation? 
To connect to your Snowflake database using Lakehouse Federation, you must create the following in your Azure Databricks Unity Catalog metastore (workspaces created after November 9, 2023 already have a Unity Catalog metastore provisioned automatically): This page covers how to run fed",2026-03-04T19:15:00.000Z,how-to,integrations,0.86,True,"Shows Snowflake federation using username/password via Databricks, with concrete connection options and auth configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-catalog-federation,Snowflake federation,Enable Snowflake catalog federation - Azure Databricks,Enable Snowflake catalog federation in Unity Catalog,Learn how to enable Snowflake federation to use catalog federation to directly access Snowflake Managed Iceberg tables in cloud storage. Enable Unity Catalog to govern tables registered in Snowflake.,"This article shows how to enable Snowflake federation using catalog federation. Catalog federation allows Unity Catalog to read Snowflake Managed Iceberg tables directly from cloud storage, providing better performance and cost-effectiveness compared to query federation. With catalog federation, Unity Catalog directly accesses the Snowflake Managed Iceberg table in object storage. The query is only executed using Databricks compute, which is more cost-effective and performance-optimized. Non-Iceb",2026-04-07T08:00:00.000Z,how-to,configuration,0.74,True,"The article explains how to enable Snowflake catalog federation so Unity Catalog can directly access Snowflake Managed Iceberg tables. Such federation-enablement docs typically include specific configuration steps and parameters (catalog registration, storage locations, permissions, and connector options) that are unique to Databricks–Snowflake integration. 
This is detailed, product-specific configuration rather than conceptual guidance, so configuration is the best fit.",unchanged https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-entra,Microsoft Entra ID,Run federated queries on Snowflake (Microsoft Entra ID) - Azure Databricks,Federate Databricks queries to Snowflake with Entra ID,Learn how to configure Azure Databricks Lakehouse Federation to run federated queries on Snowflake data that is not managed by Azure Databricks. Authenticate using Microsoft Entra ID.,"This page describes how to set up Lakehouse Federation to run federated queries on Snowflake data that is not managed by Azure Databricks. To learn more about Lakehouse Federation, seeWhat is Lakehouse Federation? To connect to your Snowflake database using Lakehouse Federation, you must create the following in your Azure Databricks Unity Catalog metastore (workspaces created after November 9, 2023 already have a Unity Catalog metastore provisioned automatically): This page covers how to run fed",2026-03-04T19:15:00.000Z,how-to,integrations,0.86,True,"Describes Snowflake federation using Microsoft Entra ID, with concrete auth configuration, parameter names, and Databricks-specific setup steps.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-oauth-access-token,OAuth access token,Run federated queries on Snowflake (OAuth access token) - Azure Databricks,Federate Databricks queries to Snowflake with OAuth tokens,Learn how to configure Azure Databricks Lakehouse Federation to run federated queries on Snowflake data that is not managed by Azure Databricks. Authenticate using an OAuth access token.,"This page describes how to set up Lakehouse Federation to run federated queries on Snowflake data that is not managed by Azure Databricks. To learn more about Lakehouse Federation, seeWhat is Lakehouse Federation? 
To connect to your Snowflake database using Lakehouse Federation, you must create the following in your Azure Databricks Unity Catalog metastore (workspaces created after November 9, 2023 already have a Unity Catalog metastore provisioned automatically): This page covers how to run fed",2026-03-04T19:15:00.000Z,how-to,integrations,0.86,True,"Covers using raw OAuth access tokens for Snowflake federation, with detailed parameter usage and Databricks connection configuration.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-oauth-access-token,OAuth access token,Run federated queries on Snowflake (OAuth access token) - Azure Databricks,Use OAuth tokens for Snowflake federation in Databricks,Learn how to configure Azure Databricks Lakehouse Federation to run federated queries on Snowflake data that is not managed by Azure Databricks. Authenticate using an OAuth access token.,"This page describes how to set up Lakehouse Federation to run federated queries on Snowflake data that is not managed by Azure Databricks. To learn more about Lakehouse Federation, seeWhat is Lakehouse Federation? 
To connect to your Snowflake database using Lakehouse Federation, you must create the following in your Azure Databricks Unity Catalog metastore (workspaces created after November 9, 2023 already have a Unity Catalog metastore provisioned automatically): This page covers how to run fed",2026-04-23T08:00:00.000Z,how-to,integrations,0.8,True,"Describes Databricks Lakehouse Federation configuration for Snowflake using OAuth access tokens, including specific connection objects, authentication parameters, and patterns unique to this integration, beyond generic OAuth or Snowflake knowledge.",updated https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-okta,Okta,Run federated queries on Snowflake (Okta) - Azure Databricks,Federate Databricks queries to Snowflake with Okta OAuth,Learn how to run federated queries on Snowflake data not managed by Azure Databricks using Okta as the external OAuth provider.,"This page describes how to set up Lakehouse Federation to run federated queries on Snowflake data that is not managed by Azure Databricks. To learn more about Lakehouse Federation, seeWhat is Lakehouse Federation? 
To connect to your Snowflake database using Lakehouse Federation, you must create the following in your Azure Databricks Unity Catalog metastore (workspaces created after November 9, 2023 already have a Unity Catalog metastore provisioned automatically): Learn how to run federated quer",2026-03-04T19:15:00.000Z,how-to,integrations,0.86,True,"Shows how to use Okta as external OAuth provider for Snowflake federation, including specific configuration fields and Databricks connection properties.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-pem,PEM private key,Run federated queries on Snowflake (PEM private key) - Azure Databricks,Configure Databricks federated queries to Snowflake with PEM,Learn how to configure Azure Databricks Lakehouse Federation to run federated queries on Snowflake data that is not managed by Azure Databricks. Authenticate using a PEM private key.,"This page describes how to set up Lakehouse Federation to run federated queries on Snowflake data that is not managed by Azure Databricks. To learn more about Lakehouse Federation, seeWhat is Lakehouse Federation? To connect to your Snowflake database using Lakehouse Federation, you must create the following in your Azure Databricks Unity Catalog metastore (workspaces created after November 9, 2023 already have a Unity Catalog metastore provisioned automatically): This page describes how to run ",2026-04-13T08:00:00.000Z,how-to,integrations,0.78,True,"How-to page for configuring Lakehouse Federation to Snowflake using PEM private key authentication. 
It necessarily includes product-specific connection objects, parameter names, and configuration details unique to Azure Databricks–Snowflake integration (e.g., Unity Catalog connection properties, authentication fields), which qualify as expert integration knowledge beyond generic SDK usage.",updated +https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-pem,PEM private key,Run federated queries on Snowflake (PEM private key) - Azure Databricks,Configure Databricks federated queries to Snowflake with PEM,Learn how to configure Azure Databricks Lakehouse Federation to run federated queries on Snowflake data that is not managed by Azure Databricks. Authenticate using a PEM private key.,"This page describes how to set up Lakehouse Federation to run federated queries on Snowflake data that is not managed by Azure Databricks. To learn more about Lakehouse Federation, seeWhat is Lakehouse Federation? To connect to your Snowflake database using Lakehouse Federation, you must create the following in your Azure Databricks Unity Catalog metastore (workspaces created after November 9, 2023 already have a Unity Catalog metastore provisioned automatically): This page describes how to run ",2026-04-13T08:00:00.000Z,how-to,integrations,0.78,True,"How-to page for configuring Lakehouse Federation to Snowflake using PEM private key authentication. 
It necessarily includes product-specific connection objects, parameter names, and configuration details unique to Azure Databricks–Snowflake integration (e.g., Unity Catalog connection properties, authentication fields), which qualify as expert integration knowledge beyond generic SDK usage.",unchanged https://learn.microsoft.com/en-us/azure/databricks/query-federation/sql-server,Create connection,Run federated queries on Microsoft SQL Server - Azure Databricks,Configure Databricks Lakehouse Federation for SQL Server,Learn how to configure Databricks Lakehouse Federation to run federated queries on Microsoft SQL Server data that is not managed by Azure Databricks.,"This page describes how to set up Lakehouse Federation to run federated queries on SQL Server data that is not managed by Azure Databricks. To learn more about Lakehouse Federation, seeWhat is Lakehouse Federation? To connect to your SQL Server database using Lakehouse Federation, you must create the following in your Azure Databricks Unity Catalog metastore (workspaces created after November 9, 2023 already have a Unity Catalog metastore provisioned automatically): Lakehouse Federation supports",2026-03-31T23:28:00.000Z,how-to,integrations,0.78,True,"SQL Server-specific federation configuration with Databricks, including supported versions, connection properties, and Unity Catalog foreign catalog setup; these are concrete integration details unique to this product combination.",unchanged https://learn.microsoft.com/en-us/azure/databricks/query-federation/sql-server-entra,Configure OAuth (Entra ID),Configure Microsoft Entra ID for SQL Server federation - Azure Databricks,Configure Entra ID authentication for SQL Server federation,"Learn how to configure Databricks Lakehouse Federation to run federated queries on Microsoft SQL Server using Microsoft Entra ID authentication, including both the user-to-machine (U2M) and machine-to",This page describes how to configure Databricks Lakehouse Federation to run 
federated queries on Microsoft SQL Server using Microsoft Entra ID authentication. Both the user-to-machine (U2M) and machine-to-machine (M2M) OAuth flows are supported.,2026-02-23T08:00:00.000Z,how-to,security,0.84,True,"Focuses on Entra ID auth for SQL Server federation, with OAuth flow details, app registrations, scopes, and role/permission configuration specific to Databricks.",unchanged https://learn.microsoft.com/en-us/azure/databricks/query-federation/sqldw,Synapse (SQL Data Warehouse),Run federated queries on Microsoft Azure Synapse - Azure Databricks,Configure Databricks federated queries to Azure Synapse,Learn how to configure Azure Databricks Lakehouse Federation to run federated queries on Microsoft Azure Synapse (SQL Data Warehouse) data that is not managed by Azure Databricks.,"This article describes how to set up Lakehouse Federation to run federated queries on Azure Synapse (SQL Data Warehouse) data that is not managed by Azure Databricks. To learn more about Lakehouse Federation, seeWhat is Lakehouse Federation?. To connect to an Azure Synapse (SQL Data Warehouse) database using Lakehouse Federation, you must create the following in your Azure Databricks Unity Catalog metastore:",2026-03-04T19:15:00.000Z,how-to,integrations,0.78,True,"How-to for Lakehouse Federation to Synapse, with product-specific connection objects, parameters, and SQL examples that go beyond generic knowledge.",unchanged @@ -3243,13 +3272,13 @@ https://learn.microsoft.com/en-us/azure/databricks/query/formats/binary,Binary f record that contains the raw content and metadata of the file. 
The binary file data source produces a DataFrame with the following columns and possibly partition columns: To read binary files, specify the data source format as binaryFile.",2024-07-26T08:00:00.000Z,sample,integrations,0.75,True,"Defines the binaryFile data source, its output schema (columns) and usage, which are specific configuration and integration details for Databricks.",unchanged https://learn.microsoft.com/en-us/azure/databricks/query/formats/csv,CSV files,Read CSV files - Azure Databricks,Read CSV files using Azure Databricks,Learn how to read CSV files using Azure Databricks.,"This article provides examples for reading CSV files with Azure Databricks using Python, Scala, R, and SQL. Note Databricks recommends the read_files table-valued function for SQL users to read CSV files. read_files is available in Databricks Runtime 13.3 LTS and above. You can also use a temporary view. If you use SQL to read CSV data directly without using temporary views or read_files, the following limitations apply:",2025-02-13T08:00:00.000Z,sample,integrations,0.75,True,"Includes Databricks-specific recommendations (read_files TVF, runtime version requirements) and limitations when not using those patterns, which are concrete integration behaviors.",unchanged https://learn.microsoft.com/en-us/azure/databricks/query/formats/deltasharing,Delta Sharing shared tables,Read Delta Sharing shared tables using Apache Spark DataFrames - Azure Databricks,Read Delta Sharing tables with Spark DataFrames,Learn how to read Delta Sharing shared tables using Apache Spark DataFrames in Azure Databricks.,This article provides syntax examples of using Apache Spark to query data shared using Delta Sharing.
Use the deltasharing keyword as a format option for DataFrame operations.,2024-08-06T08:00:00.000Z,sample,integrations,0.7,True,"Provides Databricks-specific format keyword (deltasharing) and syntax examples for reading shared tables, which are concrete integration patterns beyond generic Spark knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/query/formats/excel,Excel,Read Excel files - Azure Databricks,Read Excel files with Databricks built-in format,Learn how to read Excel files using Spark and SQL. Built-in Excel file format support removes the need for external libraries or manual conversions.,"Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. You can ingest, parse, and query Excel files for batch and streaming workloads using built-in Excel file format support. It automatically infers schema and data types, eliminating the need for external libraries or manual file conversions. This feature provides seamless ingestion from both local uploads and cloud storage.",2026-04-13T08:00:00.000Z,sample,configuration,0.65,True,"Reading Excel via built-in format support usually involves format-specific options (schema inference flags, header handling, streaming options) that are configuration details unique to Databricks’ Excel support.",updated
+https://learn.microsoft.com/en-us/azure/databricks/query/formats/excel,Excel,Read Excel files - Azure Databricks,Read Excel files with built-in Databricks Spark format,Learn how to read Excel files using Spark and SQL. Built-in Excel file format support removes the need for external libraries or manual conversions.,"Important This feature is in Public Preview. You can ingest, parse, and query Excel files for batch and streaming workloads using built-in Excel file format support. It automatically infers schema and data types, eliminating the need for external libraries or manual file conversions.
This feature provides seamless ingestion from both local uploads and cloud storage.",2026-04-21T08:00:00.000Z,sample,integrations,0.7,True,"Explains Databricks’ built-in Excel file format support for batch and streaming, including schema inference and ingestion from local and cloud storage; likely includes format-specific options and parameters, making it an integration/config pattern.",updated https://learn.microsoft.com/en-us/azure/databricks/query/formats/image,Image files,Image - Azure Databricks,Configure Databricks image data source for Spark,Learn how to use binary format to read image files (.jpg) using Azure Databricks.,"Important Databricks recommends that you use the binary file data source to load image data into the Spark DataFrame as raw bytes. See Reference solution for image applications for the recommended workflow to handle image data. The image data source abstracts from the details of image representations and provides a standard API to load image data. To read image files, specify the data source format as image. Similar APIs exist for Scala, Java, and R. You can import a nested directory structure (for examp",2026-01-20T08:00:00.000Z,sample,configuration,0.65,True,"Describes the Databricks-specific `image`/`binary file` data source, including required format names and usage patterns, which are product-specific configuration details.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/query/formats/json,JSON files,JSON files - Azure Databricks,,"Learn how to read and write JSON files in Azure Databricks using single-line and multi-line modes, with options for schema inference and rescued data.","You can read JSON files in single-line or multi-line mode. In single-line mode, a file can be split into many parts and read in parallel. In multi-line mode, a file is loaded as a whole entity and cannot be split.
For more information, see the Apache Spark documentation on JSON Files.",2026-04-01T08:00:00.000Z,sample,,0.2,False,JSON format page is a simple usage description (single-line vs multi-line) without detailed config tables or limits.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/query/formats/json,JSON files,JSON files - Azure Databricks,Process JSON files with Azure Databricks,"Learn how to read and write JSON files in Azure Databricks using single-line and multi-line modes, with options for schema inference and rescued data.","You can read JSON files in single-line or multi-line mode. In single-line mode, a file can be split into many parts and read in parallel. In multi-line mode, a file is loaded as a whole entity and cannot be split. For more information, see the Apache Spark documentation on JSON Files.",2026-04-23T08:00:00.000Z,sample,integrations,0.65,True,"Describes Databricks-specific patterns for reading/writing JSON, including single-line vs multi-line modes, schema inference, and rescued data handling. These are concrete integration behaviors and options tied to Databricks’ Spark implementation, beyond generic JSON knowledge.",updated https://learn.microsoft.com/en-us/azure/databricks/query/formats/mlflow-experiment,MLflow experiment,MLflow experiment - Azure Databricks,Load MLflow experiment run data in Databricks,Learn how to load MLflow experiment run data using Azure Databricks.,"The MLflow experiment data source provides a standard API to load MLflow experiment run data.
You can load data from the notebook experiment, or you can use the MLflow experiment name or experiment ID.,2024-08-29T08:00:00.000Z,sample,integrations,0.7,True,"Documents the MLflow experiment data source and how to load runs by notebook experiment, name, or ID, which is a Databricks-specific integration pattern.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/query/formats/orc,ORC files,Work with ORC files - Azure Databricks,,"Learn how to read and write Apache ORC files in Azure Databricks using the DataFrame API or SQL, including schema specification, partitioning, and compression options.","Apache ORC is a columnar file format that provides optimizations for speeding up queries. It is more efficient than CSV or JSON. Azure Databricks supports ORC for both reading and writing with Apache Spark. For more information, see the Apache Spark documentation on ORC Files.",2026-03-31T23:28:00.000Z,how-to,,0.2,False,"ORC format page is a basic how-to; no product-specific limits, config matrices, or decision/troubleshooting content.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/query/formats/orc,ORC files,Work with ORC files - Azure Databricks,Read and write ORC files in Azure Databricks,"Learn how to read and write Apache ORC files in Azure Databricks using the DataFrame API or SQL, including schema specification, partitioning, and compression options.","Apache ORC is a columnar file format that provides optimizations for speeding up queries. It is more efficient than CSV or JSON. Azure Databricks supports ORC for both reading and writing with Apache Spark. For more information, see the Apache Spark documentation on ORC Files.",2026-04-23T08:00:00.000Z,how-to,integrations,0.65,True,"Page focuses on product-specific code patterns and options for working with ORC via the DataFrame API and SQL (schema specification, partitioning, compression).
These are integration/coding patterns unique to Databricks’ Spark environment rather than generic ORC concepts.",updated https://learn.microsoft.com/en-us/azure/databricks/query/formats/parquet,Parquet files,Read Parquet files using Azure Databricks - Azure Databricks,Read Parquet files using Azure Databricks,Learn how to read data from Apache Parquet files using Azure Databricks.,This article shows you how to read data from Apache Parquet files using Azure Databricks.,2025-05-09T19:58:00.000Z,sample,integrations,0.65,True,Shows Databricks-specific code patterns and options for reading Parquet with Spark on Databricks; this is concrete integration usage rather than conceptual info.,unchanged https://learn.microsoft.com/en-us/azure/databricks/query/formats/text,Text files,Text files - Azure Databricks,Process text files with Azure Databricks,Learn how to read data from text files using Azure Databricks.,"You can process files with the text format option to parse each line in any text-based file as a row in a DataFrame. This can be useful for a number of operations, including log parsing. It can also be useful if you need to ingest CSV or JSON data as raw strings. For more information, see text files.",2024-12-17T08:00:00.000Z,reference,integrations,0.65,True,"Describes using the text format option to parse lines into DataFrames, a Databricks/Spark-specific integration pattern for arbitrary text data.",unchanged https://learn.microsoft.com/en-us/azure/databricks/query/formats/xml,XML files,Read and write XML files - Azure Databricks,Read and write XML files in Azure Databricks,This article describes how to read and write XML files.,"Important This feature is in Public Preview. This article describes how to read and write XML files. Extensible Markup Language (XML) is a markup language for formatting, storing, and sharing data in textual format. It defines a set of rules for serializing data ranging from documents to arbitrary data structures.
Native XML file format support enables ingestion, querying, and parsing of XML data for batch processing or streaming. It can automatically infer and evolve schema and data types, suppo",2024-08-09T08:00:00.000Z,sample,integrations,0.7,True,"Describes native XML file format support, schema inference, and streaming support in Databricks, which are concrete integration behaviors for this product.",unchanged @@ -3258,12 +3287,12 @@ https://learn.microsoft.com/en-us/azure/databricks/reference/api,Databricks refe https://learn.microsoft.com/en-us/azure/databricks/reference/delta-lake,Delta Lake API,Delta Lake API reference - Azure Databricks,,Learn about the Delta Lake API reference guides. Delta Lake is an open source storage layer that brings reliability to data lakes.,"Delta Lake is an open source storage layer that brings reliability to data lakes. Delta Lake provides ACID transactions, scalable metadata handling, and unifies streaming and batch data processing. Delta Lake runs on top of your existing data lake and is fully compatible with Apache Spark APIs. See the Delta Lake website for API references for Scala, Java, and Python. To learn how to use the Delta Lake APIs on Azure Databricks, see: See also the Delta Lake API documentation in the Azure Databricks docu",2026-01-19T08:00:00.000Z,reference,,0.3,False,"High-level pointer to Delta Lake API references; the expert details live in external API docs, not on this summary page.",unchanged https://learn.microsoft.com/en-us/azure/databricks/reference/mlflow-api,MLflow API,MLflow API reference - Azure Databricks,Use MLflow REST APIs on Azure Databricks,Learn how to use the MLflow open-source and Azure Databricks-specific REST APIs.,"The open-source MLflow REST API allows you to create, list, and get experiments and runs, and allows you to log parameters, metrics, and artifacts.
The Databricks Runtime for Machine Learning provides a managed version of the MLflow server, which includes experiment tracking and the Model Registry. For MLflow, there are two REST API reference guides:",2026-01-19T08:00:00.000Z,reference,integrations,0.65,True,"MLflow API reference pages contain concrete REST endpoints, parameters, and Databricks-specific behaviors. This is code-level integration detail between MLflow and Databricks, matching integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/reference/scim-2-1,Account SCIM v2.1 API,Account SCIM v2.1 API reference - Azure Databricks,Manage Databricks account identities with SCIM v2.1,Learn how to use the SCIM open-source and Azure Databricks-specific REST APIs.,"The Account SCIM v2.1 API allows you to create and manage users, groups, and service principals in the Azure Databricks account. You can download a PDF of the Account SCIM v2.1 API reference.",2026-01-19T08:00:00.000Z,reference,security,0.7,True,"SCIM account API reference is about creating and managing users, groups, and service principals. This is identity and access management with specific API semantics, fitting the security sub-skill.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/reference/spark,Apache Spark API,Reference for Apache Spark APIs - Azure Databricks,,"Learn about the Apache Spark API reference guides. Azure Databricks is built on top of Apache Spark, a unified analytics engine for big data and machine learning.","Azure Databricks is built on top of Apache Spark, a unified analytics engine for big data and machine learning. For more information, see Apache Spark overview. Apache Spark has DataFrame APIs for operating on large datasets, which include over 100 operators, in several languages.
To learn how to use the Apache Spark APIs on Azure Databricks, see:",2026-01-19T08:00:00.000Z,reference,,0.3,False,"Overview page linking to Apache Spark API references; does not itself contain detailed configuration, limits, or error mappings.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/release-notes/,Release notes overview,Azure Databricks release notes - Azure Databricks,,"Learn about Azure Databricks releases for the Azure Databricks platform, the Databricks Runtime, Databricks SQL, Lakeflow Spark Declarative Pipelines, and more.","The following sections organize Azure Databricks release notes by release type, including Databricks Runtime releases, platform releases, and feature-specific releases such as Databricks SQL, Lakeflow Spark Declarative Pipelines, and serverless compute.",2026-04-08T08:00:00.000Z,release-notes,,0.2,False,"Top-level release notes index that organizes links by release type; no indication of concrete limits, configs, error codes, or decision matrices on this page itself.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/reference/spark,Apache Spark API,Reference for Apache Spark APIs - Azure Databricks,,"Learn about the Apache Spark API reference guides. Azure Databricks is built on top of Apache Spark, a unified analytics engine for big data and machine learning.","Azure Databricks is built on top of Apache Spark, a unified analytics engine for big data and machine learning. For more information, see Apache Spark overview. Apache Spark has DataFrame APIs for operating on large datasets, which include over 100 operators, in several languages. To learn how to use the Apache Spark APIs on Azure Databricks, see:",2026-04-23T08:00:00.000Z,reference,,0.1,False,"This is a navigation/overview page pointing to Apache Spark API references. It does not itself list configuration parameters, limits, error codes, or product-specific patterns; it just describes that Spark has DataFrame APIs and links out.
That is not expert knowledge per the defined categories.",updated
+https://learn.microsoft.com/en-us/azure/databricks/release-notes/,Release notes overview,Azure Databricks release notes - Azure Databricks,,"Learn about Azure Databricks releases for the Azure Databricks platform, the Databricks Runtime, Databricks SQL, Lakeflow Spark Declarative Pipelines, and more.","The following sections organize Azure Databricks release notes by release type, including Databricks Runtime releases, platform releases, and feature-specific releases such as Databricks SQL, Lakeflow Spark Declarative Pipelines, and serverless compute.",2026-04-22T08:00:00.000Z,release-notes,,0.2,False,"Top-level release notes index; primarily navigation to specific month/runtime notes without detailed limits, configs, or troubleshooting content on this page.",updated https://learn.microsoft.com/en-us/azure/databricks/release-notes/dbconnect/,Databricks Connect release notes,Databricks Connect release notes - Azure Databricks,,Release notes for Databricks Connect releases and maintenance updates,"This page lists releases and maintenance updates issued for Databricks Connect. The Databricks Runtime version of the cluster must be greater than or equal to the Databricks Connect version. Databricks recommends using the latest version to receive any bug fixes and security updates. For system environment information for Databricks Runtime releases, see Databricks Runtime release notes versions and compatibility.",2026-04-10T08:00:00.000Z,release-notes,,0.4,False,"Databricks Connect release notes index; the summary focuses on listing releases and stating version compatibility at a high level.
Without explicit evidence of detailed config parameters, limits, or troubleshooting mappings, it’s treated as non-expert for this classification.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/release-notes/dev-tools/,Databricks developer tools releases,Azure Databricks developer tools and SDKs release notes - Azure Databricks,,See a list of new releases for Azure Databricks developer tools.,"This page provides release links for Azure Databricks developer tools and SDKs. See Develop on Databricks. Azure Databricks developer tools that are Experimental or Private Preview, and those that are hosted by Databricks Labs are not listed. For Azure Databricks developer tools releases prior to October 2023, see the Azure Databricks platform release notes. Tip To be notified of new releases of developer tools and SDKs, configure GitHub notifications for the corresponding GitHub repository. See About G",2026-04-14T21:45:00.000Z,release-notes,,0.2,False,"Release notes index for developer tools and SDKs; summary indicates it mainly links to releases and does not describe specific limits, configs, error codes, or decision matrices.",updated
+https://learn.microsoft.com/en-us/azure/databricks/release-notes/dev-tools/,Databricks developer tools releases,Azure Databricks developer tools and SDKs release notes - Azure Databricks,,See a list of new releases for Azure Databricks developer tools.,"This page provides release links for Azure Databricks developer tools and SDKs. See Develop on Databricks. Azure Databricks developer tools that are Experimental or Private Preview, and those that are hosted by Databricks Labs are not listed. For Azure Databricks developer tools releases prior to October 2023, see the Azure Databricks platform release notes. Tip To be notified of new releases of developer tools and SDKs, configure GitHub notifications for the corresponding GitHub repository.
See About G",2026-04-14T21:45:00.000Z,release-notes,,0.2,False,"Release notes index for developer tools and SDKs; summary indicates it mainly links to releases and does not describe specific limits, configs, error codes, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/dev-tools/bundles,Declarative Automation Bundles feature releases,Declarative Automation Bundles feature release notes - Azure Databricks,,Major feature and upcoming changes release notes for Declarative Automation Bundles,"This article contains details on releases of major Declarative Automation Bundles (formerly known as Databricks Asset Bundles) features and changes. Bundle features are released with the Databricks CLI. For a complete list of updates, see the Databricks CLI GitHub repository release notes. Tip To be notified of new releases of the Databricks CLI and bundle features, configure GitHub notifications for the CLI repository. See About GitHub notifications.",2026-03-16T17:36:00.000Z,release-notes,,0.2,False,"Feature release notes for Declarative Automation Bundles; describes feature changes and points to GitHub releases, but not organized as limits, configuration parameter tables, or troubleshooting mappings.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/,Lakeflow Spark Declarative Pipelines release notes,Lakeflow Spark Declarative Pipelines release notes and the release upgrade process - Azure Databricks,,Learn about new features and bug fixes in Lakeflow Spark Declarative Pipelines and how Lakeflow Spark Declarative Pipelines upgrades are managed.,"This article explains the Lakeflow Spark Declarative Pipelines release process, how the Lakeflow Spark Declarative Pipelines runtime is managed, and provides links to release notes for each Lakeflow Spark Declarative Pipelines release.",2026-04-17T08:00:00.000Z,concept-article,,0.3,False,"High-level explanation of release process and links to release notes; no
indication of concrete limits, configuration tables, error codes, or decision criteria.",updated +https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/,Lakeflow Spark Declarative Pipelines release notes,Lakeflow Spark Declarative Pipelines release notes and the release upgrade process - Azure Databricks,,Learn about new features and bug fixes in Lakeflow Spark Declarative Pipelines and how Lakeflow Spark Declarative Pipelines upgrades are managed.,"This article explains the Lakeflow Spark Declarative Pipelines release process, how the Lakeflow Spark Declarative Pipelines runtime is managed, and provides links to release notes for each Lakeflow Spark Declarative Pipelines release.",2026-04-17T08:00:00.000Z,concept-article,,0.3,False,"High-level explanation of release process and links to release notes; no indication of concrete limits, configuration tables, error codes, or decision criteria.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2022/37/,Release 2022.37,DLT release 2022.37 - Azure Databricks,,"Learn about new features, improvements, and bug fixes in DLT release 2022.37.","September 14 - 22, 2022 These features and improvements were released with the 2022.37 release of DLT.",2026-04-09T08:00:00.000Z,release-notes,,0.3,False,"Release notes for DLT 2022.37; primarily lists features and fixes, not structured as limits, configuration, troubleshooting, or decision-making guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2022/40/,Release 2022.40,DLT release 2022.40 - Azure Databricks,,"Learn about new features, improvements, and bug fixes in DLT release 2022.40.","September 28 - October 5, 2022 These features and improvements were released with the 2022.40 release of DLT.",2026-04-09T08:00:00.000Z,release-notes,,0.3,False,"Release notes for DLT 2022.40; standard release notes, not matching any of the specified expert-knowledge sub-skill types.",unchanged 
https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2022/42/,Release 2022.42,DLT release 2022.42 - Azure Databricks,,"Learn about new features, improvements, and bug fixes in DLT release 2022.42.","October 17 - October 21, 2022 These features and improvements were released with the 2022.42 release of DLT.",2026-04-09T08:00:00.000Z,release-notes,,0.3,False,"Release notes for DLT 2022.42; does not present configuration tables, error-code mappings, or decision matrices as required by the sub-skill definitions.",unchanged @@ -3297,7 +3326,7 @@ https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2024/13/,Re https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2024/20/,Release 2024.20,DLT release 2024.20 - Azure Databricks,,"Learn about new features, improvements, and bug fixes in DLT release 2024.20.","May 23 - 29, 2024 These features and improvements were released with the 2024.20 release of DLT.",2026-04-09T08:00:00.000Z,release-notes,,0.3,False,Release notes for DLT 2024.20; change log content rather than a focused expert-knowledge reference in the specified categories.,unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2024/22/,Release 2024.22,DLT release 2024.22 - Azure Databricks,,"Learn about new features, improvements, and bug fixes in DLT release 2024.22.","June 3 - 7, 2024 These features and improvements were released with the 2024.22 release of DLT.",2026-04-09T08:00:00.000Z,release-notes,,0.3,False,"Release notes for DLT 2024.22; does not provide structured limits, configuration options, troubleshooting mappings, or decision criteria.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2024/29/,Release 2024.29,DLT release 2024.29 - Azure Databricks,,"Learn about new features, improvements, and bug fixes in DLT release 2024.29.","July 22 - 29, 2024 These features and improvements were released with the 2024.29 release of 
DLT.",2026-04-09T08:00:00.000Z,release-notes,,0.3,False,"Release notes for DLT 2024.29; primarily chronological change information, not structured expert guidance per the defined types.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2024/33/,Release 2024.33,DLT release 2024.33 - Azure Databricks,,"Learn about new features, improvements, and bug fixes in DLT release 2024.33.","August 19 - 23, 2024 These features and improvements were released with the 2024.33 release of DLT.",2026-04-09T08:00:00.000Z,release-notes,,0.3,False,"Release notes for DLT 2024.33; no clear mapping to limits, configuration, troubleshooting, or other sub-skill types.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2024/33/,Release 2024.33,DLT release 2024.33 - Azure Databricks,,"Learn about new features, improvements, and bug fixes in DLT release 2024.33.","August 19 - 23, 2024 These features and improvements were released with the 2024.33 release of DLT.",2026-04-21T08:00:00.000Z,release-notes,,0.25,False,"DLT release 2024.33 notes list new features, improvements, and bug fixes; summary does not indicate structured limits, configuration options, or error-code-based troubleshooting content.",updated https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2024/37/,Release 2024.37,DLT release 2024.37 - Azure Databricks,,"Learn about new features, improvements, and bug fixes in DLT release 2024.37.","September 16 - 21, 2024 These features and improvements were released with the 2024.37 release of DLT.",2026-04-09T08:00:00.000Z,release-notes,,0.3,False,Release notes for DLT 2024.37; standard feature/fix listing rather than a reusable expert-knowledge reference in the defined categories.,unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2024/40/,Release 2024.40,DLT release 2024.40 - Azure Databricks,,"Learn about new features, improvements, and bug fixes in DLT release 2024.40.","October 7 
- 11, 2024 These features and improvements were released with the 2024.40 release of DLT.",2026-04-09T08:00:00.000Z,release-notes,,0.3,False,"Release notes for DLT 2024.40; does not present structured limits, configuration parameters, troubleshooting flows, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2024/42/,Release 2024.42,DLT release 2024.42 - Azure Databricks,,"Learn about new features, improvements, and bug fixes in DLT release 2024.42.","October 21 - 24, 2024 These features and improvements were released with the 2024.42 release of DLT.",2026-04-09T08:00:00.000Z,release-notes,,0.3,False,"Release notes for DLT 2024.42; change log style, not matching any of the specified expert-knowledge sub-skill patterns.",unchanged @@ -3314,7 +3343,7 @@ https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2025/28/,Re https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2025/29/,Release 2025.29,Lakeflow Declarative Pipelines release 2025.29 - Azure Databricks,,"Learn about new features, improvements, and bug fixes in Declarative Pipelines release 2025.29.","July 16 - 18, 2025 These features and improvements were released with the 2025.29 release of Declarative Pipelines.",2026-02-03T08:00:00.000Z,release-notes,,0.4,False,"Release notes for a specific Databricks Lakeflow Declarative Pipelines version; likely lists new features and fixes but not organized as limits, configuration tables, troubleshooting mappings, or other defined sub-skill types.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2025/30/,Release 2025.30,Lakeflow Spark Declarative Pipelines release 2025.30 - Azure Databricks,,"Learn about new features, improvements, and bug fixes in Lakehouse Declarative Pipelines release 2025.30.","July 23 - 30, 2025 These features and improvements were released with the 2025.30 release of Lakeflow Spark Declarative 
Pipelines.",2026-01-24T08:00:00.000Z,release-notes,,0.35,False,"Release 2025.30 notes similarly list features and improvements; no indication of limits/quotas, configuration parameter tables, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2025/36/,Release 2025.36,Lakeflow Spark Declarative Pipelines release 2025.36 - Azure Databricks,,"Learn about new features, improvements, and bug fixes in Lakehouse Declarative Pipelines release 2025.30.","August 16 - September 24, 2025 These features and improvements were released with the 2025.36 release of Lakeflow Spark Declarative Pipelines.",2026-01-21T08:00:00.000Z,release-notes,,0.35,False,Release 2025.36 notes describe features and improvements over a date range; summary does not show the specific expert-knowledge patterns required for classification.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2026,Release notes 2026,Lakeflow Spark Declarative Pipelines release notes 2026 - Azure Databricks,,"Learn about new features, improvements, and bug fixes in Lakeflow Spark Declarative Pipelines releases in 2026.","The following Lakeflow Spark Declarative Pipelines features, improvements, and bug fixes were released in 2026. Note Because Lakeflow Spark Declarative Pipelines channel releases follow a rolling upgrade process, channel upgrades are deployed to different regions at different times. Your release, including Databricks Runtime versions, might not be updated until a week or more after the initial release date. 
To find the current Databricks Runtime version for a pipeline, see Runtime information.",2026-04-17T18:03:00.000Z,concept-article,,0.3,False,"Yearly release notes overview; summary mentions features and bug fixes and rolling upgrade timing but no specific limits, configuration parameters, or troubleshooting mappings.",updated
+https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2026,Release notes 2026,Lakeflow Spark Declarative Pipelines release notes 2026 - Azure Databricks,,"Learn about new features, improvements, and bug fixes in Lakeflow Spark Declarative Pipelines releases in 2026.","The following Lakeflow Spark Declarative Pipelines features, improvements, and bug fixes were released in 2026. Note Because Lakeflow Spark Declarative Pipelines channel releases follow a rolling upgrade process, channel upgrades are deployed to different regions at different times. Your release, including Databricks Runtime versions, might not be updated until a week or more after the initial release date. To find the current Databricks Runtime version for a pipeline, see Runtime information.",2026-04-17T18:03:00.000Z,concept-article,,0.3,False,"Yearly release notes overview; summary mentions features and bug fixes and rolling upgrade timing but no specific limits, configuration parameters, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/,Platform release notes,Azure Databricks platform release notes - Azure Databricks,,"Release notes index for the Databricks Data Intelligence Platform, which provides a unified set of tools for managing enterprise-grade data solutions at scale.",Note Access to this page requires authorization. You can try signing in or changing directories. Access to this page requires authorization.
You can try changing directories.,2026-01-21T08:00:00.000Z,release-notes,,0.2,False,"Index page for release notes; no specific technical limits, configs, or error details are provided in the summary.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2018/april,April 2018,April 2018 - Azure Databricks,,April 2018 release notes for new Azure Databricks features and improvements.,"Releases are staged. Your Azure Databricks account may not be updated until a week after the initial release date. Note We are now providing Databricks Runtime deprecation notices in Databricks Runtime release notes versions and compatibility.",2026-04-01T08:00:00.000Z,release-notes,,0.35,False,"April 2018 Databricks release notes mention runtime deprecation notices but remain a change log; they do not provide structured limits, configuration parameters, or troubleshooting mappings.",unchanged
Your Azure Databricks workspace might not be updated until a week or more after the initial release date.,2026-04-10T08:00:00.000Z,release-notes,,0.4,False,"July 2023 Azure Databricks release notes are change-log content and do not provide structured limits, configuration, troubleshooting, or decision-making guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2023/june,June 2023,June 2023 - Azure Databricks,,June 2023 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in June 2023. Note Releases are staged. Your Azure Databricks workspace might not be updated until a week or more after the initial release date.,2026-04-10T08:00:00.000Z,release-notes,,0.4,False,June 2023 Azure Databricks release notes describe new features but are not organized as any of the specified expert-knowledge categories.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2023/march,March 2023,March 2023 - Azure Databricks,,March 2023 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in March 2023. Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-16T08:00:00.000Z,release-notes,,0.0,False,"Release notes summarize new features and changes but are not organized as limits, configuration references, troubleshooting guides, or other structured expert-knowledge categories defined here.",updated -https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2023/may,May 2023,May 2023 - Azure Databricks,,May 2023 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in May 2023. Note Releases are staged. 
Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-16T08:00:00.000Z,release-notes,,0.0,False,"Release notes summarize new features and changes but are not organized as limits, configuration references, troubleshooting guides, or other structured expert-knowledge categories defined here.",updated +https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2023/march,March 2023,March 2023 - Azure Databricks,,March 2023 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in March 2023. Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-16T08:00:00.000Z,release-notes,,0.0,False,"Release notes summarize new features and changes but are not organized as limits, configuration references, troubleshooting guides, or other structured expert-knowledge categories defined here.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2023/may,May 2023,May 2023 - Azure Databricks,,May 2023 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in May 2023. Note Releases are staged. 
Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-16T08:00:00.000Z,release-notes,,0.0,False,"Release notes summarize new features and changes but are not organized as limits, configuration references, troubleshooting guides, or other structured expert-knowledge categories defined here.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2023/november,November 2023,November 2023 - Azure Databricks,,November 2023 release notes for new Azure Databricks platform features and improvements.,These features and Azure Databricks platform improvements were released in November 2023. Note Releases are staged. Your Azure Databricks workspace might not be updated until a week or more after the initial release date.,2026-04-10T08:00:00.000Z,release-notes,,0.4,False,November 2023 Azure Databricks release notes are a platform change log and do not match the structured expert-knowledge categories like limits-quotas or configuration.,unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2023/october,October 2023,October 2023 - Azure Databricks,,October 2023 release notes for new Azure Databricks platform features and improvements.,These features and Azure Databricks platform improvements were released in October 2023. Note Releases are staged. 
Your Azure Databricks workspace might not be updated until a week or more after the initial release date.,2026-04-10T08:00:00.000Z,release-notes,,0.4,False,"October 2023 Azure Databricks release notes list platform improvements but are not structured as best-practices, troubleshooting, or configuration references.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2023/september,September 2023,September 2023 - Azure Databricks,,September 2023 release notes for new Azure Databricks platform features and improvements.,These features and Azure Databricks platform improvements were released in September 2023. Note Releases are staged. Your Azure Databricks workspace might not be updated until a week or more after the initial release date.,2026-04-10T08:00:00.000Z,release-notes,,0.4,False,"September 2023 Azure Databricks release notes are update summaries, not structured expert guidance with numeric limits, parameter tables, or decision matrices.",unchanged @@ -3403,13 +3432,13 @@ https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2024/ju https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2024/march,March 2024,March 2024 - Azure Databricks,,March 2024 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in March 2024. Note Releases are staged. 
Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-10T08:00:00.000Z,release-notes,,0.4,False,"March 2024 Azure Databricks release notes describe new features but do not provide structured limits, configuration options, troubleshooting flows, or decision criteria.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2024/may,May 2024,May 2024 - Azure Databricks,,May 2024 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in May 2024. Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-10T08:00:00.000Z,release-notes,,0.4,False,"May 2024 Azure Databricks release notes provide feature summaries but lack the structured numeric limits, configuration parameters, or decision matrices required for classification.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2024/november,November 2024,November 2024 - Azure Databricks,,November 2024 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in November 2024. Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-10T08:00:00.000Z,release-notes,,0.4,False,"November 2024 Azure Databricks release notes describe new capabilities but are not organized as limits/quotas, configuration parameter tables, or other defined expert-knowledge categories.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2024/october,October 2024,October 2024 - Azure Databricks,,October 2024 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in October 2024. 
Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-10T08:00:00.000Z,release-notes,,0.4,False,"October 2024 Azure Databricks release notes function as a change log rather than a structured guide for limits, configuration, troubleshooting, or decision-making.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2024/october,October 2024,October 2024 - Azure Databricks,,October 2024 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in October 2024. Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-21T08:00:00.000Z,release-notes,,0.3,False,"Monthly release notes page; primarily feature announcements and platform changes, not organized as limits, best practices, configuration references, or troubleshooting content.",updated https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2024/september,September 2024,September 2024 - Azure Databricks,Updated Git folders and Repos asset limits in Databricks,September 2024 release notes for new Azure Databricks features and improvements.,"These features and Azure Databricks platform improvements were released in September 2024. Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date. 
September 30, 2024 Git folders and Repos (Legacy) now support 1GB and 20000 workspace assets per working branch.",2026-01-21T08:00:00.000Z,release-notes,limits-quotas,0.7,True,"Contains a specific new limit: Git folders and Repos (Legacy) now support 1GB and 20000 workspace assets per working branch, which is an exact numerical quota change.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/april,April 2025,April 2025 - Azure Databricks,,April 2025 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in April 2025. Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-03-27T08:00:00.000Z,release-notes,,0.3,False,"April 2025 release notes; visible text is generic and not focused on limits, configs, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/august,August 2025,August 2025 - Azure Databricks,,August 2025 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in August 2025. Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-13T08:00:00.000Z,release-notes,,0.0,False,"Release notes summary page; likely lists new features and changes for August 2025 without the specific numeric limits, config tables, or decision matrices required for classification.",updated +https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/april,April 2025,April 2025 - Azure Databricks,,April 2025 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in April 2025. Note Releases are staged. 
Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-21T08:00:00.000Z,release-notes,,0.3,False,"Monthly release notes page; describes new features and improvements without the structured limits, decision matrices, or config parameter tables needed for classification.",updated +https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/august,August 2025,August 2025 - Azure Databricks,,August 2025 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in August 2025. Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-13T08:00:00.000Z,release-notes,,0.0,False,"Release notes summary page; likely lists new features and changes for August 2025 without the specific numeric limits, config tables, or decision matrices required for classification.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/december,December 2025,December 2025 - Azure Databricks,,December 2025 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in December 2025. Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-01-21T08:00:00.000Z,release-notes,,0.3,False,"December 2025 release notes; only high-level description of staged releases is visible, no concrete expert details in the provided text.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/february,February 2025,February 2025 - Azure Databricks,,February 2025 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in February 2025. Note Releases are staged. 
Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-10T08:00:00.000Z,release-notes,,0.4,False,"February 2025 Azure Databricks release notes are change-log style content, not structured limits, configuration, troubleshooting, or decision guidance as required by the sub-skill types.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/january,January 2025,January 2025 - Azure Databricks,,January 2025 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in January 2025. Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-10T08:00:00.000Z,release-notes,,0.4,False,"January 2025 Azure Databricks release notes list new features and improvements but do not present structured expert knowledge in the forms specified (limits, configuration tables, troubleshooting matrices, etc.).",unchanged +https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/january,January 2025,January 2025 - Azure Databricks,,January 2025 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in January 2025. Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-21T08:00:00.000Z,release-notes,,0.3,False,"Monthly release notes page; focused on change log style updates rather than detailed quotas, configuration options, or troubleshooting guidance.",updated https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/july,July 2025,July 2025 - Azure Databricks,,July 2025 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in July 2025. 
Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-10T08:00:00.000Z,release-notes,,0.3,False,"Monthly Azure Databricks release notes for July 2025; no evidence of detailed limits, configuration parameter tables, or error-code-based troubleshooting content on this page.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/june,June 2025,June 2025 - Azure Databricks,,June 2025 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in June 2025. Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-10T08:00:00.000Z,release-notes,,0.3,False,Monthly Azure Databricks release notes for June 2025; focuses on describing new features and platform improvements rather than the structured expert-knowledge categories defined.,unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/march,March 2025,March 2025 - Azure Databricks,,March 2025 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in March 2025. Note Releases are staged. 
Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-10T08:00:00.000Z,release-notes,,0.4,False,"Monthly Azure Databricks release notes summarize new features and changes but are not organized as limits, configuration references, troubleshooting guides, or other structured expert patterns defined in the taxonomy.",unchanged @@ -3417,10 +3446,10 @@ https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/ma https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/november,November 2025,November 2025 - Azure Databricks,,November 2025 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in November 2025. Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-01-21T08:00:00.000Z,release-notes,,0.3,False,"November 2025 release notes; summary indicates general feature updates, not specific limits, configs, or error codes.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/october,October 2025,October 2025 - Azure Databricks,,October 2025 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in October 2025. Note Releases are staged. 
Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-01-21T08:00:00.000Z,release-notes,,0.3,False,October 2025 release notes; provided summary is generic and does not expose expert-level technical parameters.,unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/september,September 2025,September 2025 - Azure Databricks,,September 2025 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in September 2025. Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-10T08:00:00.000Z,release-notes,,0.3,False,"Monthly Azure Databricks release notes for September 2025; primarily feature announcements and improvements, not the structured technical reference material required for the defined sub-skill types.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2026/april,April 2026,April 2026 - Azure Databricks,,April 2026 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in April 2026. Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-17T18:03:00.000Z,release-notes,,0.0,False,"Release notes summary page; description does not indicate detailed limits, configuration tables, error codes, or other structured expert knowledge. 
Likely a chronological list of feature changes without the structured patterns required by any sub-skill type.",updated +https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2026/april,April 2026,April 2026 - Azure Databricks,,April 2026 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in April 2026. Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-24T23:15:00.000Z,release-notes,,0.3,False,"Monthly release notes page; typically lists new features and changes but not structured limits, configuration tables, or troubleshooting mappings required by the sub-skill types.",updated https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2026/february,February 2026,February 2026 - Azure Databricks,,February 2026 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in February 2026. Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-02-27T08:00:00.000Z,release-notes,,0.3,False,Monthly release notes summary; snippet is generic and does not show specific technical parameters or limits.,unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2026/january,January 2026,January 2026 - Azure Databricks,,January 2026 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in January 2026. Note Releases are staged. 
Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-03-31T23:28:00.000Z,release-notes,,0.3,False,"Monthly release notes; summary text does not show specific technical parameters, limits, or troubleshooting content.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2026/march,March 2026,March 2026 - Azure Databricks,,March 2026 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in March 2026. Note Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-15T08:00:00.000Z,release-notes,,0.0,False,"Release notes summary page; description suggests high-level feature and improvement announcements, not structured limits, configuration parameters, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2026/march,March 2026,March 2026 - Azure Databricks,,March 2026 release notes for new Azure Databricks features and improvements.,These features and Azure Databricks platform improvements were released in March 2026. Note Releases are staged. 
Your Azure Databricks account might not be updated until a week or more after the initial release date.,2026-04-15T08:00:00.000Z,release-notes,,0.0,False,"Release notes summary page; description suggests high-level feature and improvement announcements, not structured limits, configuration parameters, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/release-types,Databricks preview releases,Azure Databricks preview releases - Azure Databricks,,"Learn about Azure Databricks preview release types such as Private Preview, Public Preview, Beta, and Experimental.","Databricks regularly releases previews to allow you to evaluate and provide feedback on features before they're generally available (GA). Previews come in various degrees of maturity, each of which is defined in this article. The following tables list Databricks support options associated with each release type. For information on accessing preview releases, see Manage Azure Databricks previews.",2026-04-09T17:57:00.000Z,release-notes,,0.3,False,"Defines preview release types and support options conceptually; no product-specific limits, configs, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/,Runtime release overview,Databricks Runtime release notes versions and compatibility - Azure Databricks,,"Explore Azure Databricks runtime releases and maintenance updates for runtime releases. Learn which runtime versions are supported, the release support schedule, and the runtime support lifecycle.","This page lists all Databricks Runtime releases and the schedule for supported releases. Each Databricks Runtime version includes updates that improve the usability, reliability, performance, and security of the Databricks platform. To learn about the Databricks Runtime support lifecycle, generally available releases, and Beta releases, see Databricks support lifecycles.
For information on maintenance updates issued for Databricks Runtime releases, see Databricks Runtime maintenance updates.",2026-04-10T08:00:00.000Z,release-notes,,0.2,False,"High-level index of runtime releases and lifecycle; no detailed limits, configs, error codes, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/10.4lts,Databricks Runtime 10.4 LTS,Databricks Runtime 10.4 LTS (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 10.4 LTS, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. The following release notes provide information about Databricks Runtime 10.4 and Databricks Runtime 10.4 Photon, powered by Apache Spark 3.2.1. Photon is in Public Preview. Databricks released this version in March 2022.",2026-04-10T08:00:00.000Z,archived,,0.4,False,"Runtime 10.4 LTS release notes; LTS release information and change log, not limits, configuration matrices, or troubleshooting mappings.",unchanged @@ -3449,7 +3478,7 @@ https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/13.1,Da https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/13.1ml,Databricks Runtime 13.1 ML,Databricks Runtime 13.1 for Machine Learning (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 13.1 ML, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. Databricks Runtime 13.1 for Machine Learning provides a ready-to-go environment for machine learning and data science based on Databricks Runtime 13.1 (EoS).
Databricks Runtime ML contains many popular machine learning libraries, including TensorFlow, PyTorc",2026-04-10T08:00:00.000Z,archived,,0.4,False,Runtime 13.1 ML release notes; similar ML runtime release information.,unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/13.2,Databricks Runtime 13.2,Databricks Runtime 13.2 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 13.2, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. The following release notes provide information about Databricks Runtime 13.2, powered by Apache Spark 3.4.0. Databricks released this version in July 2023.",2026-04-10T18:06:00.000Z,archived,,0.4,False,"Runtime 13.2 release notes; change log content, not limits, configuration, troubleshooting, or decision-making guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/13.2ml,Databricks Runtime 13.2 ML,Databricks Runtime 13.2 for Machine Learning (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 13.2 ML, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. Databricks Runtime 13.2 for Machine Learning provides a ready-to-go environment for machine learning and data science based on Databricks Runtime 13.2 (EoS).
Databricks Runtime ML contains many popular machine learning libraries, including TensorFlow, PyTorc",2026-04-10T08:00:00.000Z,archived,,0.4,False,Runtime 13.2 ML release notes; ML-specific release info without the required expert-knowledge structures.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/13.3lts,Databricks Runtime 13.3 LTS,Databricks Runtime 13.3 LTS - Azure Databricks,,"Release notes about Databricks Runtime 13.3 LTS, powered by Apache Spark.","The following release notes provide information about Databricks Runtime 13.3 LTS, powered by Apache Spark 3.4.1. This version incorporates all features, improvements, and bug fixes from all previous Databricks Runtime releases. Databricks released this version in August 2023. Note LTS means this version is under long-term support. See Databricks Runtime LTS version lifecycle. Tip To see release notes for Databricks Runtime versions that have reached end-of-support (EoS), see End-of-support Databric",2026-04-13T08:00:00.000Z,release-notes,,0.3,False,"Release notes for a specific Databricks Runtime version; summary indicates version, Spark level, and dates but no clear evidence of limits, configuration tables, error codes, or decision matrices. Without detailed content, it appears as change-log style information rather than structured expert guidance per the defined sub-skills.",updated +https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/13.3lts,Databricks Runtime 13.3 LTS,Databricks Runtime 13.3 LTS - Azure Databricks,,"Release notes about Databricks Runtime 13.3 LTS, powered by Apache Spark.","The following release notes provide information about Databricks Runtime 13.3 LTS, powered by Apache Spark 3.4.1. This version incorporates all features, improvements, and bug fixes from all previous Databricks Runtime releases. Databricks released this version in August 2023. Note LTS means this version is under long-term support.
See Databricks Runtime LTS version lifecycle. Tip To see release notes for Databricks Runtime versions that have reached end-of-support (EoS), see End-of-support Databric",2026-04-13T08:00:00.000Z,release-notes,,0.3,False,"Release notes for a specific Databricks Runtime version; summary indicates version, Spark level, and dates but no clear evidence of limits, configuration tables, error codes, or decision matrices. Without detailed content, it appears as change-log style information rather than structured expert guidance per the defined sub-skills.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/13.3lts-ml,Databricks Runtime 13.3 LTS ML,Databricks Runtime 13.3 LTS for Machine Learning - Azure Databricks,,"Release notes about Databricks Runtime 13.3 LTS ML, powered by Apache Spark.","Databricks Runtime 13.3 LTS for Machine Learning provides a ready-to-go environment for machine learning and data science based on Databricks Runtime 13.3 LTS. Databricks Runtime ML contains many popular machine learning libraries, including TensorFlow, PyTorch, and XGBoost. Databricks Runtime ML includes AutoML, a tool to automatically train machine learning pipelines. Databricks Runtime ML also supports distributed deep learning training using Horovod. Note LTS means this version is under long-ter",2024-09-03T08:00:00.000Z,release-notes,,0.3,False,"ML runtime 13.3 LTS release notes describe bundled libraries and capabilities but do not present structured limits, config parameters, or error-resolution mappings.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/13.x-migration,Databricks Runtime 13.x migration,Databricks Runtime 13.x migration guide - Azure Databricks,Migrate workloads to Databricks Runtime 13.x safely,"Migration guide for Databricks Runtime 13.x, powered by Apache Spark.","This guide helps you migrate your Azure Databricks workloads to the latest version of Databricks Runtime 13.x. 
To do so, Databricks recommends that you migrate your workloads in the following order: Note LTS means this version is under long-term support. See Databricks Runtime LTS version lifecycle.",2026-04-10T18:06:00.000Z,archived,decision-making,0.7,True,"Similar to the 12.x guide, a 13.x migration guide will enumerate specific behavioral changes, removed/altered APIs, and configuration adjustments needed when upgrading, which are expert, version-bound details that help decide how and when to move to 13.x.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/14.0,Databricks Runtime 14.0,Databricks Runtime 14.0 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 14.0, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. The following release notes provide information about Databricks Runtime 14.0, powered by Apache Spark 3.5.0. Databricks released this version in September 2023.",2026-04-10T18:06:00.000Z,archived,,0.4,False,"Runtime 14.0 release notes; standard release notes without explicit limits, configuration tables, or troubleshooting mappings.",unchanged
@@ -3458,61 +3487,62 @@ https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/14.1,Da
https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/14.1ml,Databricks Runtime 14.1 ML,Databricks Runtime 14.1 for Machine Learning (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 14.1 ML, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. 
Databricks Runtime 14.1 for Machine Learning provides a ready-to-go environment for machine learning and data science based on Databricks Runtime 14.1 (EoS). Databricks Runtime ML contains many popular machine learning libraries, including TensorFlow, PyTorc",2026-04-10T08:00:00.000Z,archived,,0.4,False,"Runtime 14.1 ML release notes; ML environment release info, not structured expert knowledge per the categories.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/14.2,Databricks Runtime 14.2,Databricks Runtime 14.2 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 14.2, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. The following release notes provide information about Databricks Runtime 14.2, powered by Apache Spark 3.5.0. Databricks released this version in November 2023.",2026-04-10T18:06:00.000Z,archived,,0.4,False,Runtime 14.2 release notes; change log style content without the specific expert-knowledge structures required.,unchanged
https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/14.2ml,Databricks Runtime 14.2 ML,Databricks Runtime 14.2 for Machine Learning (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 14.2 ML, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. Databricks Runtime 14.2 for Machine Learning provides a ready-to-go environment for machine learning and data science based on Databricks Runtime 14.2 (EoS). 
Databricks Runtime ML contains many popular machine learning libraries, including TensorFlow, PyTorc",2026-04-10T08:00:00.000Z,archived,,0.4,False,Runtime 14.2 ML release notes; similar to other ML runtime release notes with library lists and updates.,unchanged
-https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/14.3lts,Databricks Runtime 14.3 LTS,Databricks Runtime 14.3 LTS - Azure Databricks,,"Release notes about Databricks Runtime 14.3, powered by Apache Spark.","The following release notes provide information about Databricks Runtime 14.3 LTS, powered by Apache Spark 3.5.0. This version incorporates all features, improvements, and bug fixes from all previous Databricks Runtime releases. Databricks released this version in February 2024. Note LTS means this version is under long-term support. See Databricks Runtime LTS version lifecycle. Tip To see release notes for Databricks Runtime versions that have reached end-of-support (EoS), see End-of-support Databr",2026-04-13T08:00:00.000Z,release-notes,,0.4,False,"Runtime 14.3 LTS release notes; similar to other runtime release notes, primarily a change log without the specific structures required for the listed sub-skill types.",updated
+https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/14.3lts,Databricks Runtime 14.3 LTS,Databricks Runtime 14.3 LTS - Azure Databricks,,"Release notes about Databricks Runtime 14.3, powered by Apache Spark.","The following release notes provide information about Databricks Runtime 14.3 LTS, powered by Apache Spark 3.5.0. This version incorporates all features, improvements, and bug fixes from all previous Databricks Runtime releases. Databricks released this version in February 2024. Note LTS means this version is under long-term support. See Databricks Runtime LTS version lifecycle. 
Tip To see release notes for Databricks Runtime versions that have reached end-of-support (EoS), see End-of-support Databr",2026-04-13T08:00:00.000Z,release-notes,,0.4,False,"Runtime 14.3 LTS release notes; similar to other runtime release notes, primarily a change log without the specific structures required for the listed sub-skill types.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/14.3lts-ml,Databricks Runtime 14.3 LTS ML,Databricks Runtime 14.3 LTS for Machine Learning - Azure Databricks,,"Release notes about Databricks Runtime 14.3 LTS ML, powered by Apache Spark.","Databricks Runtime 14.3 LTS for Machine Learning provides a ready-to-go environment for machine learning and data science based on Databricks Runtime 14.3 LTS. Databricks Runtime ML contains many popular machine learning libraries, including TensorFlow, PyTorch, and XGBoost. Databricks Runtime ML includes AutoML, a tool to automatically train machine learning pipelines. Databricks Runtime ML also supports distributed deep learning training using Horovod. Note LTS means this version is under long-ter",2024-09-03T08:00:00.000Z,release-notes,,0.3,False,"ML runtime release notes list included libraries and features but not structured limits, configuration tables, or troubleshooting content.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/14.x-migration,Databricks Runtime 14.x migration,Databricks Runtime 14.x migration guide - Azure Databricks,Migrate workloads to Databricks Runtime 14.x safely,"Migration guide for Databricks Runtime 14.x, powered by Apache Spark.","This guide helps you migrate your Azure Databricks workloads to the latest version of Databricks Runtime 14.x. To do so, Databricks recommends that you migrate your workloads in the following order: Note LTS means this version is under long-term support. 
See Databricks Runtime LTS version lifecycle.",2026-04-10T18:06:00.000Z,archived,decision-making,0.7,True,"A 14.x migration guide will list concrete version-specific changes and required actions for upgrade (e.g., changed defaults, removed options), which are expert details and support planning and decision-making for moving to 14.x.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/15.0,Databricks Runtime 15.0,Databricks Runtime 15.0 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 15.0, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. The following release notes provide information about Databricks Runtime 15.0, powered by Apache Spark 3.5.0. Databricks released this version in March 2024.",2026-04-10T18:06:00.000Z,archived,,0.4,False,Runtime 15.0 release notes; primarily release information and high-level changes.,unchanged
https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/15.0ml,Databricks Runtime 15.0 ML,Databricks Runtime 15.0 for Machine Learning (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 15.0 ML, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. Databricks Runtime 15.0 for Machine Learning provides a ready-to-go environment for machine learning and data science based on Databricks Runtime 15.0 (EoS). 
Databricks Runtime ML contains many popular machine learning libraries, including TensorFlow, PyTorc",2026-04-10T08:00:00.000Z,archived,,0.4,False,"Runtime 15.0 ML release notes; focuses on ML environment composition, not on limits, configuration matrices, or troubleshooting.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/15.1,Databricks Runtime 15.1,Databricks Runtime 15.1 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 15.1, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. The following release notes provide information about Databricks Runtime 15.1, powered by Apache Spark 3.5.0. Databricks released this version in April 2024.",2026-04-10T18:06:00.000Z,archived,,0.4,False,"Runtime 15.1 release notes; version-specific change log, not aligned with limits, configuration, troubleshooting, or decision-making categories.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/15.1ml,Databricks Runtime 15.1 ML,Databricks Runtime 15.1 for Machine Learning (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 15.1 ML, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. Databricks Runtime 15.1 for Machine Learning provides a ready-to-go environment for machine learning and data science based on Databricks Runtime 15.1 (EoS). 
Databricks Runtime ML contains many popular machine learning libraries, including TensorFlow, PyTorc",2026-04-10T08:00:00.000Z,archived,,0.4,False,"Runtime 15.1 ML release notes; similar pattern of library versions and features, not structured expert knowledge per the sub-skill definitions.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/15.2,Databricks Runtime 15.2,Databricks Runtime 15.2 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 15.2, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. The following release notes provide information about Databricks Runtime 15.2, powered by Apache Spark 3.5.0. Databricks released this version in May 2024.",2026-04-13T08:00:00.000Z,archived,,0.3,False,"Databricks Runtime 15.2 release notes; description is limited to version, Spark version, and release date. Without detailed content showing limits, config options, or troubleshooting, it does not meet any sub-skill criteria.",updated
+https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/15.2,Databricks Runtime 15.2,Databricks Runtime 15.2 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 15.2, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. The following release notes provide information about Databricks Runtime 15.2, powered by Apache Spark 3.5.0. Databricks released this version in May 2024.",2026-04-13T08:00:00.000Z,archived,,0.3,False,"Databricks Runtime 15.2 release notes; description is limited to version, Spark version, and release date. 
Without detailed content showing limits, config options, or troubleshooting, it does not meet any sub-skill criteria.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/15.2ml,Databricks Runtime 15.2 ML,Databricks Runtime 15.2 for Machine Learning (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 15.2 ML, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. Databricks Runtime 15.2 for Machine Learning provides a ready-to-go environment for machine learning and data science based on Databricks Runtime 15.2 (EoS). Databricks Runtime ML contains many popular machine learning libraries, including TensorFlow, PyTorc",2026-04-10T08:00:00.000Z,archived,,0.4,False,"Runtime 15.2 ML release notes; focuses on included ML libraries and changes, not on limits, configuration tables, or decision guidance.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/15.3,Databricks Runtime 15.3,Databricks Runtime 15.3 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 15.3, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. The following release notes provide information about Databricks Runtime 15.3, powered by Apache Spark 3.5.0. Databricks released this version in June 2024.",2026-04-13T08:00:00.000Z,archived,,0.3,False,"Databricks Runtime 15.3 release notes; only high-level version and lifecycle details are visible. 
No evidence of limits/quotas, configuration settings, troubleshooting mappings, or other expert-knowledge structures.",updated
+https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/15.3,Databricks Runtime 15.3,Databricks Runtime 15.3 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 15.3, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. The following release notes provide information about Databricks Runtime 15.3, powered by Apache Spark 3.5.0. Databricks released this version in June 2024.",2026-04-13T08:00:00.000Z,archived,,0.3,False,"Databricks Runtime 15.3 release notes; only high-level version and lifecycle details are visible. No evidence of limits/quotas, configuration settings, troubleshooting mappings, or other expert-knowledge structures.",unchanged
Databricks Runtime ML contains many popular machine learning libraries, including TensorFlow, PyTorc",2026-04-10T08:00:00.000Z,archived,,0.4,False,"Runtime 15.3 ML release notes; similar to other runtime release notes with library versions and features, not expert-knowledge patterns per the defined categories.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/15.4lts,Databricks Runtime 15.4 LTS,Databricks Runtime 15.4 LTS - Azure Databricks,,"Release notes about Databricks Runtime 15.4 LTS, powered by Apache Spark.","The following release notes provide information about Databricks Runtime 15.4 LTS, powered by Apache Spark 3.5.0. This version incorporates all features, improvements, and bug fixes from all previous Databricks Runtime releases. Databricks released this version in August 2024. Note LTS means this version is under long-term support. See Databricks Runtime LTS version lifecycle. Tip To see release notes for Databricks Runtime versions that have reached end-of-support (EoS), see End-of-support Databric",2026-04-13T08:00:00.000Z,release-notes,,0.4,False,"Runtime 15.4 LTS release notes; focuses on version changes and LTS lifecycle, not on the structured expert-knowledge patterns defined (limits, configuration tables, troubleshooting, etc.).",updated
+https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/15.4lts,Databricks Runtime 15.4 LTS,Databricks Runtime 15.4 LTS - Azure Databricks,,"Release notes about Databricks Runtime 15.4 LTS, powered by Apache Spark.","The following release notes provide information about Databricks Runtime 15.4 LTS, powered by Apache Spark 3.5.0. This version incorporates all features, improvements, and bug fixes from all previous Databricks Runtime releases. Databricks released this version in August 2024. Note LTS means this version is under long-term support. See Databricks Runtime LTS version lifecycle. 
Tip To see release notes for Databricks Runtime versions that have reached end-of-support (EoS), see End-of-support Databric",2026-04-13T08:00:00.000Z,release-notes,,0.4,False,"Runtime 15.4 LTS release notes; focuses on version changes and LTS lifecycle, not on the structured expert-knowledge patterns defined (limits, configuration tables, troubleshooting, etc.).",unchanged
https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/15.4lts-ml,Databricks Runtime 15.4 LTS ML,Databricks Runtime 15.4 LTS for Machine Learning - Azure Databricks,,"Release notes about Databricks Runtime 15.4 LTS ML, powered by Apache Spark.","Databricks Runtime 15.4 LTS for Machine Learning provides a ready-to-go environment for machine learning and data science based on Databricks Runtime 15.4 LTS. Databricks Runtime ML contains many popular machine learning libraries, including TensorFlow, PyTorch, and XGBoost. Databricks Runtime ML includes AutoML, a tool to automatically train machine learning pipelines. Databricks Runtime ML also supports distributed deep learning training using TorchDistributor. Note LTS means this version is unde",2026-01-21T08:00:00.000Z,release-notes,,0.4,False,"Runtime 15.4 LTS ML release notes appear similar to other ML runtime notes, mainly listing included components rather than numeric limits or configuration matrices.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.0,Databricks Runtime 16.0,Databricks Runtime 16.0 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 16.0, powered by Apache Spark.","The following release notes provide information about Databricks Runtime 16.0, powered by Apache Spark 3.5.0. Databricks released this version in November 2024. Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. 
For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility.",2026-04-13T08:00:00.000Z,archived,,0.3,False,"Databricks Runtime 16.0 release notes; appears to be a standard change log with lifecycle info. The provided text does not show any of the specific numeric limits, configuration parameters, or decision matrices required for classification.",updated
+https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.0,Databricks Runtime 16.0,Databricks Runtime 16.0 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 16.0, powered by Apache Spark.","The following release notes provide information about Databricks Runtime 16.0, powered by Apache Spark 3.5.0. Databricks released this version in November 2024. Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility.",2026-04-21T08:00:00.000Z,archived,,0.2,False,"Runtime 16.0 release notes summary; similar to other runtime release notes, mainly version info and high-level changes. The summary does not show numeric limits, configuration tables, or structured troubleshooting content that matches any sub-skill type.",updated
https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.0ml,Databricks Runtime 16.0 ML,Databricks Runtime 16.0 for Machine Learning (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 16.0 ML, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. 
Databricks Runtime 16.0 for Machine Learning provides a ready-to-go environment for machine learning and data science based on Databricks Runtime 16.0 (EoS). Databricks Runtime ML contains many popular machine learning libraries, including TensorFlow, PyTorc",2026-04-10T18:06:00.000Z,archived,,0.4,False,"Release notes for a specific Databricks Runtime ML version; mostly version/library listings and high-level changes, not structured limits, configuration matrices, troubleshooting mappings, or decision criteria as defined by the sub-skill types.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.1,Databricks Runtime 16.1,Databricks Runtime 16.1 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 16.1, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. The following release notes provide information about Databricks Runtime 16.1, powered by Apache Spark 3.5.0. Databricks released this version in December 2024.",2026-04-13T08:00:00.000Z,archived,,0.3,False,"Databricks Runtime 16.1 release notes; summary content is limited to version, Spark level, and dates. There is no clear sign of structured expert knowledge like quotas, config tables, or troubleshooting flows.",updated
+https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.1,Databricks Runtime 16.1,Databricks Runtime 16.1 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 16.1, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. 
The following release notes provide information about Databricks Runtime 16.1, powered by Apache Spark 3.5.0. Databricks released this version in December 2024.",2026-04-13T08:00:00.000Z,archived,,0.3,False,"Databricks Runtime 16.1 release notes; summary content is limited to version, Spark level, and dates. There is no clear sign of structured expert knowledge like quotas, config tables, or troubleshooting flows.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.1ml,Databricks Runtime 16.1 ML,Databricks Runtime 16.1 for Machine Learning (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 16.1 ML, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. Databricks Runtime 16.1 for Machine Learning provides a ready-to-go environment for machine learning and data science based on Databricks Runtime 16.1 (EoS). Databricks Runtime ML contains many popular machine learning libraries, including TensorFlow, PyTorc",2026-04-10T08:00:00.000Z,archived,,0.3,False,"Runtime 16.1 ML release notes; ML library and environment description, not limits, configuration, or troubleshooting per the skill definitions.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.2,Databricks Runtime 16.2,Databricks Runtime 16.2 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 16.2, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. The following release notes provide information about Databricks Runtime 16.2, powered by Apache Spark 3.5.2. 
Databricks released this version in February 2025.",2026-04-13T08:00:00.000Z,archived,,0.3,False,"Databricks Runtime 16.2 release notes; only lifecycle and version information is evident. No indication of detailed limits, configuration matrices, or decision-making guidance per the sub-skill criteria.",updated
+https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.2,Databricks Runtime 16.2,Databricks Runtime 16.2 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 16.2, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. The following release notes provide information about Databricks Runtime 16.2, powered by Apache Spark 3.5.2. Databricks released this version in February 2025.",2026-04-13T08:00:00.000Z,archived,,0.3,False,"Databricks Runtime 16.2 release notes; only lifecycle and version information is evident. No indication of detailed limits, configuration matrices, or decision-making guidance per the sub-skill criteria.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.2ml,Databricks Runtime 16.2 ML,Databricks Runtime 16.2 for Machine Learning (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 16.2 ML, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. Databricks Runtime 16.2 for Machine Learning provides a ready-to-go environment for machine learning and data science based on Databricks Runtime 16.2 (EoS). 
Databricks Runtime ML contains many popular machine learning libraries, including TensorFlow, PyTorc",2026-04-10T08:00:00.000Z,archived,,0.3,False,"Runtime 16.2 ML release notes; describes ML environment and libraries, not limits, configuration options, or decision-making matrices.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.3,Databricks Runtime 16.3,Databricks Runtime 16.3 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 16.3, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. The following release notes provide information about Databricks Runtime 16.3, powered by Apache Spark 3.5.2. Databricks released this version in March 2025.",2026-04-13T08:00:00.000Z,archived,,0.3,False,"Databricks Runtime 16.3 release notes; description is high-level (version, Spark version, release date). Does not clearly expose any of the required expert-knowledge patterns such as limits tables, config parameters, or error-code-based troubleshooting.",updated
+https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.3,Databricks Runtime 16.3,Databricks Runtime 16.3 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 16.3, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. The following release notes provide information about Databricks Runtime 16.3, powered by Apache Spark 3.5.2. 
Databricks released this version in March 2025.",2026-04-13T08:00:00.000Z,archived,,0.3,False,"Databricks Runtime 16.3 release notes; description is high-level (version, Spark version, release date). Does not clearly expose any of the required expert-knowledge patterns such as limits tables, config parameters, or error-code-based troubleshooting.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.3ml,Databricks Runtime 16.3 ML,Databricks Runtime 16.3 for Machine Learning (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 16.3 ML, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support and end-of-life history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility. Databricks Runtime 16.3 for Machine Learning provides a ready-to-go environment for machine learning and data science based on Databricks Runtime 16.3 (EoS). Databricks Runtime ML contains many popular machine learning libraries, including TensorFlow, PyTorc",2026-04-10T08:00:00.000Z,archived,,0.3,False,"Runtime 16.3 ML release notes; focuses on ML libraries and environment, not on quotas, config matrices, or error-resolution content.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.4lts,Databricks Runtime 16.4 LTS,Databricks Runtime 16.4 LTS - Azure Databricks,,"Release notes about Databricks Runtime 16.4 LTS, powered by Apache Spark.","The following release notes provide information about Databricks Runtime 16.4 LTS, powered by Apache Spark 3.5.2. This version incorporates all features, improvements, and bug fixes from all previous Databricks Runtime releases. Databricks released this LTS version in May 2025. There are 2 variants for this release, one supporting Scala 2.12 and one supporting Scala 2.13. Starting with Databricks Runtime 17 (Spark 4), only Scala 2.13 will be supported. 
To help you transition, two images are avai",2026-04-13T08:00:00.000Z,release-notes,,0.4,False,"Runtime 16.4 LTS release notes; includes notes like Scala version support, but overall is a change log rather than limits, configuration matrices, or troubleshooting guidance.",updated +https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.4lts,Databricks Runtime 16.4 LTS,Databricks Runtime 16.4 LTS - Azure Databricks,,"Release notes about Databricks Runtime 16.4 LTS, powered by Apache Spark.","The following release notes provide information about Databricks Runtime 16.4 LTS, powered by Apache Spark 3.5.2. This version incorporates all features, improvements, and bug fixes from all previous Databricks Runtime releases. Databricks released this LTS version in May 2025. There are 2 variants for this release, one supporting Scala 2.12 and one supporting Scala 2.13. Starting with Databricks Runtime 17 (Spark 4), only Scala 2.13 will be supported. To help you transition, two images are avai",2026-04-13T08:00:00.000Z,release-notes,,0.4,False,"Runtime 16.4 LTS release notes; includes notes like Scala version support, but overall is a change log rather than limits, configuration matrices, or troubleshooting guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.4lts-ml,Databricks Runtime 16.4 LTS ML,Databricks Runtime 16.4 LTS for Machine Learning - Azure Databricks,,"Release notes about Databricks Runtime 16.4 LTS ML , powered by Apache Spark.","Databricks Runtime 16.4 LTS for Machine Learning provides a ready-to-go environment for machine learning and data science based onDatabricks Runtime 16.4 LTS. Databricks Runtime ML contains many popular machine learning libraries, including TensorFlow, PyTorch, and XGBoost. Databricks Runtime ML includesAutoML, a tool to automatically train machine learning pipelines. 
Databricks Runtime ML also supports distributed deep learning training using TorchDistributor, DeepSpeed, and Ray. Tip To see rel",2026-01-21T08:00:00.000Z,release-notes,,0.4,False,ML LTS runtime release notes emphasize bundled ML libraries and tools; summary does not show expert-knowledge structures like limits tables or config parameter references.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/17.0,Databricks Runtime 17.0,Databricks Runtime 17.0 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 17.0, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, seeEnd-of-support and end-of-life history. For all supported Databricks Runtime versions, seeDatabricks Runtime release notes versions and compatibility. The following release notes provide information about Databricks Runtime 17.0, powered by Apache Spark 4.0.0. Databricks released this version in June 2025.",2026-04-13T08:00:00.000Z,archived,,0.3,False,"Databricks Runtime 17.0 release notes; summary shows only version, Spark version, and release timing. No visible structured best practices, limits, configuration options, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/17.0,Databricks Runtime 17.0,Databricks Runtime 17.0 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 17.0, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, seeEnd-of-support and end-of-life history. For all supported Databricks Runtime versions, seeDatabricks Runtime release notes versions and compatibility. The following release notes provide information about Databricks Runtime 17.0, powered by Apache Spark 4.0.0. 
Databricks released this version in June 2025.",2026-04-13T08:00:00.000Z,archived,,0.3,False,"Databricks Runtime 17.0 release notes; summary shows only version, Spark version, and release timing. No visible structured best practices, limits, configuration options, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/17.0ml,Databricks Runtime 17.0 ML,Databricks Runtime 17.0 for Machine Learning (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 17.0 ML, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, seeEnd-of-support and end-of-life history. For all supported Databricks Runtime versions, seeDatabricks Runtime release notes versions and compatibility. Databricks Runtime 17.0 for Machine Learning provides a ready-to-go environment for machine learning and data science based onDatabricks Runtime 17.0 (EoS). Databricks Runtime ML contains many popular machine learning libraries, including TensorFlow, PyTorc",2026-04-10T18:06:00.000Z,archived,,0.3,False,"Runtime 17.0 ML release notes; ML library list and environment description, not aligned with any of the specified expert-knowledge skill types.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/17.1,Databricks Runtime 17.1,Databricks Runtime 17.1 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 17.1, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, seeEnd-of-support and end-of-life history. For all supported Databricks Runtime versions, seeDatabricks Runtime release notes versions and compatibility. The following release notes provide information about Databricks Runtime 17.1, powered by Apache Spark 4.0.0. 
Azure Databricks released this version in August 2025.",2026-04-13T08:00:00.000Z,release-notes,,0.3,False,"Databricks Runtime 17.1 release notes; provided summary is lifecycle and version info. Lacks explicit evidence of numeric limits, config tables, security roles, or troubleshooting content required for classification.",updated +https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/17.1,Databricks Runtime 17.1,Databricks Runtime 17.1 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 17.1, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, seeEnd-of-support and end-of-life history. For all supported Databricks Runtime versions, seeDatabricks Runtime release notes versions and compatibility. The following release notes provide information about Databricks Runtime 17.1, powered by Apache Spark 4.0.0. Azure Databricks released this version in August 2025.",2026-04-13T08:00:00.000Z,release-notes,,0.3,False,"Databricks Runtime 17.1 release notes; provided summary is lifecycle and version info. Lacks explicit evidence of numeric limits, config tables, security roles, or troubleshooting content required for classification.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/17.1ml,Databricks Runtime 17.1 ML,Databricks Runtime 17.1 for Machine Learning (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 17.1 ML, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, seeEnd-of-support and end-of-life history. For all supported Databricks Runtime versions, seeDatabricks Runtime release notes versions and compatibility. Databricks Runtime 17.1 for Machine Learning provides a ready-to-go environment for machine learning and data science based onDatabricks Runtime 17.1 (EoS). 
Databricks Runtime ML contains many popular machine learning libraries, including TensorFlow, PyTorc",2026-04-10T18:06:00.000Z,release-notes,,0.3,False,"Runtime 17.1 ML release notes; describes ML environment and libraries, not numeric limits, config parameters, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/17.2,Databricks Runtime 17.2,Databricks Runtime 17.2 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 17.2, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, seeEnd-of-support and end-of-life history. For all supported Databricks Runtime versions, seeDatabricks Runtime release notes versions and compatibility. The following release notes provide information about Databricks Runtime 17.2, powered by Apache Spark 4.0.0. This version incorporates all features, improvements, and bug fixes from all previous Databricks Runtime releases. Databricks released this version",2026-04-13T08:00:00.000Z,release-notes,,0.3,False,"Databricks Runtime 17.2 release notes; description only shows version, Spark level, and lifecycle status. No indication of limits, configuration parameters, troubleshooting mappings, or decision criteria that match the expert-knowledge sub-skill definitions.",updated +https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/17.2,Databricks Runtime 17.2,Databricks Runtime 17.2 (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 17.2, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, seeEnd-of-support and end-of-life history. For all supported Databricks Runtime versions, seeDatabricks Runtime release notes versions and compatibility. The following release notes provide information about Databricks Runtime 17.2, powered by Apache Spark 4.0.0. 
This version incorporates all features, improvements, and bug fixes from all previous Databricks Runtime releases. Databricks released this version",2026-04-13T08:00:00.000Z,release-notes,,0.3,False,"Databricks Runtime 17.2 release notes; description only shows version, Spark level, and lifecycle status. No indication of limits, configuration parameters, troubleshooting mappings, or decision criteria that match the expert-knowledge sub-skill definitions.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/17.2ml,Databricks Runtime 17.2 ML,Databricks Runtime 17.2 for Machine Learning (EoS) - Azure Databricks,,"Release notes about Databricks Runtime 17.2 ML, powered by Apache Spark.","Note Support for this Databricks Runtime version has ended. For the end-of-support date, seeEnd-of-support and end-of-life history. For all supported Databricks Runtime versions, seeDatabricks Runtime release notes versions and compatibility. Databricks Runtime 17.2 for Machine Learning provides a ready-to-go environment for machine learning and data science based onDatabricks Runtime 17.2 (EoS). Databricks Runtime ML contains many popular machine learning libraries, including TensorFlow, PyTorc",2026-04-10T18:06:00.000Z,release-notes,,0.3,False,"Runtime 17.2 ML release notes; primarily lists included ML libraries and versions, not quotas, config matrices, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/17.3lts,Databricks Runtime 17.3 LTS,Databricks Runtime 17.3 LTS - Azure Databricks,,"Release notes about Databricks Runtime 17.3 LTS, powered by Apache Spark.","The following release notes provide information about Databricks Runtime 17.3 LTS, powered by Apache Spark 4.0.0. This version incorporates all features, improvements, and bug fixes from all previous Databricks Runtime releases. 
Databricks released this LTS version in October 2025.",2026-04-13T08:00:00.000Z,release-notes,,0.4,False,"Runtime 17.3 LTS release notes; LTS lifecycle and change details, but not structured as any of the specified expert-knowledge sub-skill categories.",updated +https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/17.3lts,Databricks Runtime 17.3 LTS,Databricks Runtime 17.3 LTS - Azure Databricks,,"Release notes about Databricks Runtime 17.3 LTS, powered by Apache Spark.","The following release notes provide information about Databricks Runtime 17.3 LTS, powered by Apache Spark 4.0.0. This version incorporates all features, improvements, and bug fixes from all previous Databricks Runtime releases. Databricks released this LTS version in October 2025.",2026-04-13T08:00:00.000Z,release-notes,,0.4,False,"Runtime 17.3 LTS release notes; LTS lifecycle and change details, but not structured as any of the specified expert-knowledge sub-skill categories.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/17.3lts-ml,Databricks Runtime 17.3 LTS ML,Databricks Runtime 17.3 LTS for Machine Learning - Azure Databricks,,"Release notes about Databricks Runtime 17.3 LTS ML, powered by Apache Spark.","Databricks Runtime 17.3 LTS for Machine Learning provides a ready-to-go environment for machine learning and data science based onDatabricks Runtime 17.3 LTS. Databricks Runtime ML contains many popular machine learning libraries, including TensorFlow, PyTorch, and XGBoost. Databricks Runtime ML includesAutoML, a tool to automatically train machine learning pipelines. 
Databricks Runtime ML also supports distributed deep learning training using TorchDistributor, DeepSpeed, and Ray.",2026-01-21T08:00:00.000Z,release-notes,,0.4,False,"ML LTS runtime release notes focus on included libraries and capabilities; no indication of numeric limits, config tables, or troubleshooting content.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/18.0,Databricks Runtime 18.0,Databricks Runtime 18.0 - Azure Databricks,,"Release notes about Databricks Runtime 18.0, powered by Apache Spark.","The following release notes provide information about Databricks Runtime 18.0, powered by Apache Spark 4.1.0. This version incorporates all features, improvements, and bug fixes from all previous Databricks Runtime releases. Databricks released this version in January 2026.",2026-04-13T08:00:00.000Z,release-notes,,0.4,False,"Runtime 18.0 release notes; contains product updates and package versions but not in the form of limits/quotas, decision matrices, configuration tables, or error-resolution mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/18.0,Databricks Runtime 18.0,Databricks Runtime 18.0 - Azure Databricks,,"Release notes about Databricks Runtime 18.0, powered by Apache Spark.","The following release notes provide information about Databricks Runtime 18.0, powered by Apache Spark 4.1.0. This version incorporates all features, improvements, and bug fixes from all previous Databricks Runtime releases. Databricks released this version in January 2026.",2026-04-21T08:00:00.000Z,release-notes,,0.2,False,"Release notes summary; likely lists changes and fixes but not organized as limits, configuration matrices, troubleshooting mappings, or other defined sub-skill types. 
No clear indication of numeric limits, config tables, or error-code-based troubleshooting in the provided summary.",updated https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/18.0ml,Databricks Runtime 18.0 ML,Databricks Runtime 18.0 for Machine Learning - Azure Databricks,,"Release notes about Databricks Runtime 18.0 ML, powered by Apache Spark.","Databricks Runtime 18.0 for Machine Learning provides a ready-to-go environment for machine learning and data science based onDatabricks Runtime 18.0. Databricks Runtime ML contains many popular machine learning libraries, including TensorFlow, PyTorch, and XGBoost. Databricks Runtime ML also supports distributed deep learning training using TorchDistributor, DeepSpeed, and Ray.",2026-03-31T08:00:00.000Z,release-notes,,0.4,False,"Runtime ML release notes are primarily feature and library-version announcements; the summary does not indicate structured limits, configuration tables, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/18.1,Databricks Runtime 18.1,Databricks Runtime 18.1 - Azure Databricks,,"Release notes about Databricks Runtime 18.1, powered by Apache Spark.","The following release notes provide information about Databricks Runtime 18.1. This version incorporates all features, improvements, and bug fixes from all previous Databricks Runtime releases. 
Databricks released this version in February 2026.",2026-04-13T08:00:00.000Z,release-notes,,0.4,False,"Runtime 18.1 release notes; primarily version-specific change log rather than structured limits, configuration references, or troubleshooting content as defined by the sub-skill types.",updated +https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/18.1,Databricks Runtime 18.1,Databricks Runtime 18.1 - Azure Databricks,,"Release notes about Databricks Runtime 18.1, powered by Apache Spark.","The following release notes provide information about Databricks Runtime 18.1. This version incorporates all features, improvements, and bug fixes from all previous Databricks Runtime releases. Databricks released this version in February 2026.",2026-04-13T08:00:00.000Z,release-notes,,0.4,False,"Runtime 18.1 release notes; primarily version-specific change log rather than structured limits, configuration references, or troubleshooting content as defined by the sub-skill types.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/18.1ml,Databricks Runtime 18.1 ML,Databricks Runtime 18.1 for Machine Learning - Azure Databricks,,"Release notes about Databricks Runtime 18.1 ML, powered by Apache Spark.","The following release notes provide information about Databricks Runtime 18.1 ML. Azure Databricks released this version in February 2026. Databricks Runtime 18.1 ML is built on top of Databricks Runtime 18.1. 
For information on what's new in Databricks Runtime 18.1, including Apache Spark MLlib and SparkR, see theDatabricks Runtime 18.1release notes.",2026-03-11T08:00:00.000Z,release-notes,,0.4,False,Runtime 18.1 ML release notes; describes included libraries and base runtime but not structured configuration parameters or limits in the summary.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/18.2,Databricks Runtime 18.2 (Beta),Databricks Runtime 18.2 (Beta) - Azure Databricks,,"Release notes about Databricks Runtime 18.2 (Beta), powered by Apache Spark.","The following release notes provide information about Databricks Runtime 18.2 (Beta). This version incorporates all features, improvements, and bug fixes from all previous Databricks Runtime releases. Databricks released this version in April 2026. Important Databricks Runtime 18.2 is inBeta. The contents of the supported environments might change during the Beta. Changes can include the list of packages or versions of installed packages.",2026-04-14T08:00:00.000Z,release-notes,,0.4,False,"Release notes for a specific Databricks Runtime version; likely lists changes, fixes, and package versions but not organized as limits, configuration matrices, troubleshooting guides, or other defined sub-skill patterns.",updated +https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/18.2,Databricks Runtime 18.2 (Beta),Databricks Runtime 18.2 (Beta) - Azure Databricks,,"Release notes about Databricks Runtime 18.2 (Beta), powered by Apache Spark.","The following release notes provide information about Databricks Runtime 18.2 (Beta). This version incorporates all features, improvements, and bug fixes from all previous Databricks Runtime releases. Databricks released this version in April 2026. Important Databricks Runtime 18.2 is inBeta. The contents of the supported environments might change during the Beta. 
Changes can include the list of packages or versions of installed packages.",2026-04-20T08:00:00.000Z,release-notes,,0.0,False,"Release notes summary; no evidence in the provided text of numeric limits, configuration tables, error-code troubleshooting, or other structured expert details tied to the defined sub-skill types.",updated https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/18.2ml,Databricks Runtime 18.2 ML (Beta),Databricks Runtime 18.2 for Machine Learning (Beta) - Azure Databricks,,"Release notes about Databricks Runtime 18.2 ML (Beta), powered by Apache Spark.","The following release notes provide information about Databricks Runtime 18.2 ML. Azure Databricks released this version in April 2026. Important Databricks Runtime 18.2 is inBeta. The contents of the supported environments might change during the Beta. Changes can include the list of packages or versions of installed packages. Databricks Runtime 18.2 ML is built on top of Databricks Runtime 18.2. For information on what's new in Databricks Runtime 18.2, including Apache Spark MLlib and SparkR, ",2026-04-08T22:17:00.000Z,release-notes,,0.3,False,"ML runtime release notes; mainly package versions and feature changes, not limits, configuration tables, or troubleshooting mappings as required by the skill types.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/databricks-runtime-ver,Support lifecycles,Databricks support lifecycles - Azure Databricks,Plan around Databricks Runtime support lifecycles,"Learn about the phases of feature retirement and details about corresponding support for Databricks platform features and Databricks Runtime releases, including Legacy, Deprecated, End of Support (EoS","As part of Azure Databricks's commitment to innovation, platform and runtime features might be retired and replaced by new features. Databricks Runtime releases are also retired and replaced on a regular schedule. 
This page lists retirement phases and details about corresponding support for platform features and Databricks Runtime releases. It also includes SQL queries to detect clusters and jobs using legacy Databricks Runtime versions. For information about previews and release types, seeAzure",2026-04-10T08:00:00.000Z,release-notes,decision-making,0.65,True,"Support lifecycle documentation typically includes concrete dates, phase definitions (Legacy, Deprecated, EoS, EoL), and sometimes queries to detect affected clusters. These are expert, time-bound details that guide decisions on when to upgrade or retire runtimes.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/eos,End-of-support releases,End-of-support Databricks Runtime release notes - Azure Databricks,,Learn about Databricks Runtime versions that have reached end-of-support.,"This page lists release notes for Databricks Runtime versions that have reached end-of-support and end-of-life. For information on supported Databricks Runtime versions, seeDatabricks Runtime release notes versions and compatibility. For information on the Databricks Runtime support policy and schedule, seeDatabricks support lifecycles. Note Databricks Runtime 10.3 and earlier have reached end-of-life. Release notes for those versions have been removed from the documentation. Databricks Runtime ",2026-04-10T18:06:00.000Z,release-notes,,0.2,False,"End-of-support index page; mostly lifecycle and navigation information without numeric limits, configs, or troubleshooting details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/maintenance-updates,Runtime maintenance updates,Databricks Runtime maintenance updates - Azure Databricks,,Maintenance updates issued for Databricks Runtime releases.,"This article lists maintenance updates for supported Databricks Runtime versions. To add a maintenance update to an existing cluster, restart the cluster. 
For the maintenance updates on unsupported Databricks Runtime versions, seeMaintenance updates for Databricks Runtime (archived). Note Releases are staged. Your Azure Databricks account might not update for a few days after the initial release date.",2026-04-10T08:00:00.000Z,release-notes,,0.3,False,"Release/maintenance notes index for runtimes; the summary suggests high-level listing of maintenance updates and process (restart cluster) without exposing specific limits, configs, or troubleshooting mappings in the snippet. Without clear evidence of structured expert details (error codes, config tables, limits), it’s treated as non-expert for this classification.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/maintenance-updates-archive,Databricks Runtime maintenance updates (archived),Maintenance updates for Databricks Runtime (archived) - Azure Databricks,,Archived maintenance updates issued for Databricks runtime releases. The releases listed on this page are no longer supported.,"This archived page lists maintenance updates issued for Databricks Runtime releases that are no longer supported. To add a maintenance update to an existing cluster, restart the cluster. Important This documentation has been retired and might not be updated. The products, services, or technologies mentioned in this content have reached end-of-support. 
SeeDatabricks Runtime release notes versions and compatibility.",2026-04-10T18:06:00.000Z,archived,,0.2,False,"Archived maintenance updates index; describes existence of updates but not detailed limits, configuration parameters, or error-resolution mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/,Serverless compute release notes,Serverless compute release notes - Azure Databricks,,Release notes for Azure Databricks serverless compute.,"This article explains the features and behaviors that are currently available and upcoming on serverless compute for notebooks and jobs. For more information on serverless compute, seeConnect to serverless compute. Azure Databricks periodically releases updates to serverless compute, automatically upgrading the serverless compute runtime to support enhancements and upgrades to the platform. All users get the same updates, rolled out over a short period of time.",2026-04-10T08:00:00.000Z,release-notes,,0.3,False,"Serverless compute release notes overview; the summary indicates staged feature updates and behavior descriptions but no clear evidence of structured limits, config tables, or troubleshooting mappings in the snippet. Treated as general release information rather than a focused expert-knowledge page per the given categories.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/,Serverless environment versions,Serverless environment versions - Azure Databricks,Choose appropriate Databricks serverless environment versions,Reference page for the available serverless environments on Databricks.,"This page provides a reference for the currently released serverless environment versions. Serverless compute for notebooks and jobs uses environment versions, which provide a stable client API based on Spark Connect to ensure application compatibility. 
This allows Databricks to upgrade the server independently, delivering performance improvements, security enhancements, and bug fixes without requiring any code changes to workloads. Each environment version includes a specific Python version and",2026-03-25T08:00:00.000Z,release-notes,configuration,0.7,True,"Provides a reference for currently released environment versions, each with specific Python versions and API compatibility; this is concrete configuration/selection data for environments.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/five,Serverless environment version 5,Serverless environment version 5 - Azure Databricks,Reference serverless environment version 5 system details,Release notes for Azure Databricks serverless compute environment version 5.,"This page outlines the system environment information for serverless environment version 5. To ensure application compatibility, serverless workloads use a versioned API, known as the environment version, that remains compatible with newer serverless versions. To select a base environment, use theBase environmentselector in theEnvironmentside panel in your serverless notebooks. SeeSelect a base environment.",2026-03-25T08:00:00.000Z,release-notes,configuration,0.7,True,"Environment-version pages typically enumerate exact system environment details (Python version, libraries, possibly config constraints) that are product-specific configuration references.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/four,Serverless environment version 4,Serverless environment version 4 - Azure Databricks,Reference serverless environment version 4 system details,Release notes for Azure Databricks serverless compute environment version 4.,"This article outlines the system environment information for serverless environment version 4. 
To ensure compatibility for the application, serverless workloads use a versioned API, known as the environment version, which remains compatible with newer server versions. You can select a base environment that includes this environment version using theEnvironmentside panel in your serverless notebooks. SeeSelect a base environment.",2026-03-25T08:00:00.000Z,release-notes,configuration,0.7,True,"Like other environment-version pages, it outlines system environment information (versions and components) that constitute product-specific configuration details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/maintenance-updates,Runtime maintenance updates,Databricks Runtime maintenance updates - Azure Databricks,,Maintenance updates issued for Databricks Runtime releases.,"This article lists maintenance updates for supported Databricks Runtime versions. To add a maintenance update to an existing cluster, restart the cluster. For the maintenance updates on unsupported Databricks Runtime versions, seeMaintenance updates for Databricks Runtime (archived). Note Releases are staged. Your Azure Databricks account might not update for a few days after the initial release date.",2026-04-17T08:00:00.000Z,release-notes,,0.3,False,"Release notes listing maintenance updates by runtime version; summary does not indicate numeric limits, configuration tables, error-code mappings, or decision matrices. Primarily change-log information rather than reusable expert configuration or troubleshooting knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/maintenance-updates-archive,Databricks Runtime maintenance updates (archived),Maintenance updates for Databricks Runtime (archived) - Azure Databricks,,Archived maintenance updates issued for Databricks runtime releases. 
The releases listed on this page are no longer supported.,"This archived page lists maintenance updates issued for Databricks Runtime releases that are no longer supported. To add a maintenance update to an existing cluster, restart the cluster. Important This documentation has been retired and might not be updated. The products, services, or technologies mentioned in this content have reached end-of-support. SeeDatabricks Runtime release notes versions and compatibility.",2026-04-21T08:00:00.000Z,archived,,0.1,False,"Archived maintenance updates index; primarily a navigation/summary page pointing to other releases. Description does not indicate detailed limits, configuration parameters, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/,Serverless compute release notes,Serverless compute release notes - Azure Databricks,,Release notes for Azure Databricks serverless compute.,"This article explains the features and behaviors that are currently available and upcoming on serverless compute for notebooks and jobs. For more information on serverless compute, seeConnect to serverless compute. Azure Databricks periodically releases updates to serverless compute, automatically upgrading the serverless compute runtime to support enhancements and upgrades to the platform. All users get the same updates, rolled out over a short period of time.",2026-04-21T08:00:00.000Z,release-notes,,0.3,False,"Serverless compute release notes describing features and behaviors over time; no indication of limits tables, config parameter references, or troubleshooting mappings. 
Mostly high-level feature availability information.",updated +https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/,Serverless environment versions,Serverless environment versions - Azure Databricks,Reference available Databricks serverless environment versions,Reference page for the available serverless environments on Databricks.,"This page provides a reference for the currently released serverless environment versions. Serverless compute for notebooks and jobs uses environment versions, which provide a stable client API based on Spark Connect to ensure application compatibility. This allows Databricks to upgrade the server independently, delivering performance improvements, security enhancements, and bug fixes without requiring any code changes to workloads. Each environment version includes a specific Python version and",2026-04-21T08:00:00.000Z,release-notes,configuration,0.65,True,"Described as a reference for currently released serverless environment versions, including specific Python versions and other environment characteristics. This is a configuration reference (environment-version matrix) with concrete values that an LLM would not reliably know from training.",updated +https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/five,Serverless environment version 5,Serverless environment version 5 - Azure Databricks,Review system environment details for serverless environment v5,Release notes for Azure Databricks serverless compute environment version 5.,"This page outlines the system environment information for serverless environment version 5. To ensure application compatibility, serverless workloads use a versioned API, known as the environment version, that remains compatible with newer serverless versions. To select a base environment, use the Base environment selector in the Environment side panel in your serverless notebooks. 
See Select a base environment.",2026-04-23T17:47:00.000Z,release-notes,configuration,0.7,True,"Described as outlining system environment information for a specific serverless environment version. These environment-version pages typically enumerate concrete environment details (Python version, libraries, system settings) that are product-specific configuration data not inferable from general training.",updated +https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/five-gpu,Serverless GPU environment version 5,Serverless GPU environment version 5 (Preview) - Azure Databricks,Inspect configuration for Serverless GPU environment version 5,Release notes for Azure Databricks Serverless GPU compute environment version 5.,"Important This feature is in Public Preview. This page outlines the system environment information for Serverless GPU environment version 5. This compute offering is part of AI Runtime, which is designed for modern AI and deep learning workloads. Serverless GPU environment 5 is built on top of serverless environment 5 (CPU). See what's new in serverless environment 5 (CPU). It includes the following environment: To ensure compatibility for the application, Serverless GPU workloads use a versioned A",2026-04-24T18:05:00.000Z,release-notes,configuration,0.7,True,"GPU environment version page explicitly provides system environment information for a specific GPU runtime, which generally includes concrete versions and settings for AI Runtime. 
This is product-specific configuration detail that qualifies as expert knowledge.",new +https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/four,Serverless environment version 4,Serverless environment version 4 - Azure Databricks,Check system environment details for serverless environment v4,Release notes for Azure Databricks serverless compute environment version 4.,"This article outlines the system environment information for serverless environment version 4. To ensure compatibility for the application, serverless workloads use a versioned API, known as the environment version, which remains compatible with newer server versions. You can select a base environment that includes this environment version using the Environment side panel in your serverless notebooks. See Select a base environment.",2026-04-23T17:47:00.000Z,release-notes,configuration,0.7,True,"Similar to v5, this page outlines system environment information for environment version 4, which typically includes specific runtime versions and settings. These are concrete configuration parameters unique to this product and version.",updated https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/four-gpu,Serverless GPU environment version 4,Serverless GPU environment version 4 (Preview) - Azure Databricks,Use serverless GPU environment version 4 for AI workloads,Release notes for Azure Databricks Serverless GPU compute environment version 4.,"Important This feature is in Public Preview. This article outlines the system environment information for Serverless GPU environment version 4. This compute offering is part of AI Runtime, which is designed for modern AI and deep learning workloads. To ensure compatibility for the application, Serverless GPU workloads use a versioned API, known as the environment version, which remains compatible with newer server versions. 
You can select a base environment that includes this environment version u",2026-03-25T08:00:00.000Z,release-notes,configuration,0.7,True,"Describes a specific GPU environment version with system environment information and preview constraints, which are concrete configuration details for AI Runtime GPU workloads.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/one,Serverless environment version 1,Serverless environment version 1 - Azure Databricks,Reference serverless environment version 1 system details,Release notes for Azure Databricks serverless compute environment version 1.,"This article outlines the system environment information for serverless environment version 1. To ensure compatibility for the application, serverless workloads use a versioned API, known as the environment version, which remains compatible with newer server versions. You can select a base environment that includes this environment version using the Environment side panel in your serverless notebooks. See Select a base environment.",2026-03-25T08:00:00.000Z,release-notes,configuration,0.7,True,"Documents system environment information for environment version 1, providing concrete configuration details unique to this product.",unchanged https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/three,Serverless environment version 3,Serverless environment version 3 - Azure Databricks,Reference serverless environment version 3 system details,Release notes for Azure Databricks serverless compute environment version 3.,"This article outlines the system environment information for serverless environment version 3. To ensure compatibility for the application, serverless workloads use a versioned API, known as the environment version, which remains compatible with newer server versions. 
You can select a base environment that includes this environment version using the Environment side panel in your serverless notebooks. See Select a base environment.",2026-03-25T08:00:00.000Z,release-notes,configuration,0.7,True,"Outlines system environment information for a specific environment version, which is product-specific configuration data.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/three-gpu,Serverless GPU environment version 3,Serverless GPU environment version 3 (Preview) - Azure Databricks,Understand deprecated serverless GPU environment version 3,Release notes for Azure Databricks serverless compute environment for GPU version 3.,"Important Serverless GPU environment 3 has been deprecated. Use Serverless GPUenvironment 4instead. Important This feature is inPublic Preview. This article outlines the system environment information for Serverless GPU environment version 3. This compute offering is part ofAI Runtime, which is designed for modern AI and deep learning workloads. To ensure compatibility for the application, Serverless GPU workloads use a versioned API, known as the environment version, which remains compatible wi",2026-03-25T08:00:00.000Z,release-notes,configuration,0.7,True,"Provides system environment information and deprecation status for a specific GPU environment version, which is concrete configuration and lifecycle detail.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/three-gpu,Serverless GPU environment version 3,Serverless GPU environment version 3 (Preview) - Azure Databricks,Review deprecated Serverless GPU environment version 3 settings,Release notes for Azure Databricks serverless compute environment for GPU version 3.,"Important Serverless GPU environment 3 has been deprecated. Use Serverless GPU environment 4 or environment 5 instead. Important This feature is in Public Preview. 
This article outlines the system environment information for Serverless GPU environment version 3. This compute offering is part of AI Runtime, which is designed for modern AI and deep learning workloads. To ensure compatibility for the application, Serverless GPU workloads use a versioned API, known as the environment version, which remain",2026-04-21T08:00:00.000Z,release-notes,configuration,0.7,True,"Although deprecated, the page is said to outline system environment information for GPU environment version 3. Such environment-version pages usually list specific versions and configuration details, which are product-specific expert configuration knowledge.",updated https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/two,Serverless environment version 2,Serverless environment version 2 - Azure Databricks,Reference serverless environment version 2 system details,Release notes for Azure Databricks serverless compute environment version 2.,"This article outlines the system environment information for serverless environment version 2. To ensure compatibility for the application, serverless workloads use a versioned API, known as the environment version, which remains compatible with newer server versions. You can select a base environment that includes this environment version using the Environment side panel in your serverless notebooks. See Select a base environment.",2026-03-25T08:00:00.000Z,release-notes,configuration,0.7,True,"Similar to other environment-version pages, it documents system environment specifics for that version, which are configuration references.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/release-notes/whats-coming,What's coming?,What's coming? 
- Azure Databricks,,Learn about upcoming Azure Databricks releases and significant behavior changes.,Learn about features and behavioral changes in upcoming Azure Databricks releases.,2026-04-17T21:49:00.000Z,release-notes,,0.2,False,"Upcoming release and behavior-change overview; no concrete limits, configs, or error mappings indicated in the summary.",updated +https://learn.microsoft.com/en-us/azure/databricks/release-notes/whats-coming,What's coming?,What's coming? - Azure Databricks,,Learn about upcoming Azure Databricks releases and significant behavior changes.,Learn about features and behavioral changes in upcoming Azure Databricks releases.,2026-04-24T23:15:00.000Z,release-notes,,0.2,False,"Upcoming features and behavior changes are high-level release notes; typically lack stable numeric limits, config tables, or detailed error mappings required for expert-knowledge sub-skills.",updated https://learn.microsoft.com/en-us/azure/databricks/repos/,Git folders overview,Azure Databricks Git folders - Azure Databricks,Use Azure Databricks Git folders for version control,"Learn how to integrate Git repositories within your Databricks workspace for version control, collaboration, and CI/CD workflows.","Azure Databricks Git folders is a visual Git client and API that integrates Git repositories within your workspace. 
Use Git folders to develop code in notebooks and files while following software development best practices using Git for version control, collaboration, and CI/CD.",2026-03-12T18:41:00.000Z,landing-page,configuration,0.65,True,"Describes Git folders behavior and integration semantics unique to Databricks, beyond generic Git usage.",unchanged https://learn.microsoft.com/en-us/azure/databricks/repos/automate-with-ms-entra,Automate with Microsoft Entra,Authorize a Microsoft Entra service principal to access Git folders - Azure Databricks,Authorize Entra service principals for Databricks Git folders,Configure a Microsoft Entra service principal to automate access to Azure Databricks Git folders from Azure DevOps.,Use Microsoft Entra ID to authenticate access to Azure Databricks Git folders from your Azure DevOps automation. This page explains how to configure an Azure Databricks service principal with Microsoft Entra for authorization.,2026-01-30T08:00:00.000Z,how-to,security,0.75,True,Shows how to configure a Databricks service principal with Microsoft Entra ID to access Git folders from Azure DevOps automation. This is identity and authorization configuration specific to Databricks and Entra.,unchanged https://learn.microsoft.com/en-us/azure/databricks/repos/automate-with-sp,Automate with a service principal,Authorize a service principal to access Git folders - Azure Databricks,Authorize Databricks service principals for Git folders,Learn how to configure a Databricks service principal to access and interact with Databricks Git folders.,"A service principal is a non-human identity used to authenticate automated workflows in Azure Databricks. When you automate Git folder operations, such as pulling the latest code before a job run or programmatically updating notebooks in a CI/CD pipeline, the service principal needs its own Git credentials to access your Git provider. This page explains how to link Git credentials to a service principal using the UI or CLI. 
Tip Use a service principal for automated or shared workflows like CI/CD",2026-02-20T23:39:00.000Z,how-to,security,0.7,True,Explains how to link Git credentials to Databricks service principals via UI/CLI; includes identity and permission configuration details.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/repos/automate-with-terraform,Automate with Terraform,Manage Azure Databricks Git folders using Terraform - Azure Databricks,Manage Databricks Git folders with Terraform,Configure Terraform to use the Databricks provider for Git folder access with a service principal.,You can manage Azure Databricks Git folders in a fully automated environment usingTerraformand thedatabricks_repoTerraform resource. This topic covers two authentication approaches:,2026-01-30T08:00:00.000Z,how-to,deployment,0.7,True,Explains how to use the databricks_repo Terraform resource and authentication approaches to manage Git folders. This is a Databricks-specific deployment/automation pattern using Terraform.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/repos/automate-with-terraform,Automate with Terraform,Manage Azure Databricks Git folders using Terraform - Azure Databricks,Automate Azure Databricks Git folders with Terraform,Configure Terraform to use the Databricks provider for Git folder access with a service principal.,Manage Azure Databricks Git folders in a fully automated environment usingTerraformand theDatabricks Terraform provider. This topic covers two authentication approaches:,2026-04-20T22:01:00.000Z,how-to,integrations,0.7,True,"Page describes configuring the Databricks Terraform provider and service principal authentication for managing Git folders. 
This involves product-specific provider configuration, authentication parameters, and patterns for integrating Terraform with Azure Databricks repos, which qualify as integration-focused expert knowledge beyond generic Terraform usage.",updated https://learn.microsoft.com/en-us/azure/databricks/repos/ci-cd,CI/CD with Git folders,CI/CD with Databricks Git folders - Azure Databricks,Integrate Databricks Git folders into CI/CD workflows,Learn how to use Databricks Git folders in CI/CD workflows.,"Use Databricks Git folders in your CI/CD flows to keep work in source control and integrate it with your data engineering workflows. For a broader overview of CI/CD with Azure Databricks, see CI/CD on Azure Databricks.",2026-03-16T08:00:00.000Z,best-practice,deployment,0.65,True,"Describes how to use Databricks Git folders in CI/CD flows, integrating source control with Databricks jobs. This is a Databricks-specific deployment/CI-CD pattern rather than generic Git usage.",unchanged https://learn.microsoft.com/en-us/azure/databricks/repos/connect-on-prem-git-server,Connect on-prem Git server,Connect to an on-premises Git Server - Azure Databricks,Connect Databricks Serverless Private Git to on-prem Git,Learn how to connect to an on-premises server for Databricks Serverless Private Git,"Note The approach above is provided as general guidance. Network connectivity between Azure and your on-premises environment is outside of the Databricks platform scope. Customers are responsible for designing and implementing connectivity that aligns with their security, availability, and performance requirements. If you must configure an NCC private endpoint rule for your on-premises Git server, you cannot directly add the Git server to the Azure Standard Load Balancer (SLB) backend pool. 
This",2026-01-30T08:00:00.000Z,how-to,architecture-patterns,0.6,True,"Provides guidance and constraints for connecting Databricks Serverless Private Git to on-premises Git servers, including notes about NCC private endpoint rules and Azure SLB backend pools. These are detailed, product-specific network architecture considerations.",unchanged https://learn.microsoft.com/en-us/azure/databricks/repos/enable-disable-repos-with-api,Enable or disable Git folders,Enable or disable Databricks Git folders - Azure Databricks,Enable or disable Databricks Git folders via API,Learn how to enable or disable Databricks Git folders programmatically using the Databricks REST API or SDK.,Databricks Git folders are enabled by default for new workspaces. Admins can disable or re-enable this feature using the /api/2.0/workspace-conf REST API endpoint or a Databricks SDK.,2026-01-30T08:00:00.000Z,how-to,configuration,0.75,True,Shows how admins can toggle Git folders using the /api/2.0/workspace-conf REST endpoint or SDK. This is a specific configuration surface and endpoint unique to Databricks.,unchanged
Git folders supports common Git operations such as cloning a repository, committing and pushing, pulling, branch management, and visually comparing diffs when committing.",2026-03-12T18:41:00.000Z,concept-article,configuration,0.7,True,Details supported Git providers and Git folders capabilities; these are Databricks-specific integration and behavior details.,unchanged https://learn.microsoft.com/en-us/azure/databricks/repos/git-operations-with-repos,Create and manage Git folders,Create and manage Git folders - Azure Databricks,Create and manage Azure Databricks Git folders,"Learn how to create and manage Databricks Git folders, including cloning repositories, branching, committing, pushing, and more.","This page describes how to create Azure Databricks Git folders and perform common Git operations, including cloning, branching, committing, and pushing. This guide covers the following Git operations:",2026-03-11T08:00:00.000Z,how-to,configuration,0.7,True,"Step-by-step Git operations (clone, branch, commit, push) within Git folders, reflecting Databricks-specific workspace behavior.",unchanged https://learn.microsoft.com/en-us/azure/databricks/repos/git-proxy,Set up private Git connectivity,Set up private Git connectivity for Azure Databricks Git folders - Azure Databricks,Configure Git server proxy for private Git with Databricks,Learn about the Databricks Git server proxy for Git folders and how to configure your private Git repository to work with it.,"If you host a private Git server (such as GitHub Enterprise Server, Bitbucket Server, or GitLab self-managed) or your Git server is behind a firewall, you can use the Git server proxy to connect Databricks Git folders to your private repositories. 
The proxy routes Git commands from your Azure Databricks workspace through a compute resource to your private Git server.",2026-01-30T08:00:00.000Z,how-to,configuration,0.7,True,Describes the Databricks Git server proxy and how to configure private Git servers behind firewalls to work with Git folders. This is Databricks-specific networking and Git integration configuration.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/repos/limits,Limitations and FAQ,Databricks Git folder limits and reference - Azure Databricks,Databricks Git folder limits and behaviors,"Learn about size limits, file recoverability, and frequently asked questions for Git integration with Databricks Git folders.","The following sections specify limits for Databricks Git folders and Git integration. For general information, seeResource limits. Jump to: To learn about Databricks asset types supported in Git folders, seeSupported asset types in Git folders.",2026-04-16T08:00:00.000Z,limits-and-quotas,limits-quotas,0.92,True,"Explicitly described as specifying limits for Databricks Git folders and Git integration, including size limits and file recoverability. This is product-specific numerical limits and constraints that qualify as expert knowledge under limits-quotas.",updated -https://learn.microsoft.com/en-us/azure/databricks/repos/repos-setup,Configure Git integration,Configure Git integration for Git folders - Azure Databricks,Configure secure Git integration for Azure Databricks,"Configure Git integration for Databricks Git folders, including credentials, network connectivity, and security features.","This page describes how to configure Git integration for Azure Databricks Git folders, including credentials, network connectivity, and security features. To create a Git folder and start working with Git operations, seeCreate and manage Git folders. Important Use Git folders for interactive development. 
For CI/CD and production deployments, use Declarative Automation Bundles with versioned artifacts and workload identity federation. SeeCI/CD with Databricks Git foldersandWhat are Declarative Au",2026-04-16T08:00:00.000Z,how-to,security,0.68,True,"Page focuses on configuring Git integration for Databricks Git folders, including credentials, network connectivity, and security features. This typically includes product-specific auth modes, token scopes, and network/security configuration details that go beyond generic Git usage, fitting the security sub-skill.",updated +https://learn.microsoft.com/en-us/azure/databricks/repos/limits,Limitations and FAQ,Databricks Git folder limits and reference - Azure Databricks,Databricks Git folder limits and behaviors,"Learn about size limits, file recoverability, and frequently asked questions for Git integration with Databricks Git folders.","The following sections specify limits for Databricks Git folders and Git integration. For general information, see Resource limits. Jump to: To learn about Databricks asset types supported in Git folders, see Supported asset types in Git folders.",2026-04-16T08:00:00.000Z,limits-and-quotas,limits-quotas,0.92,True,"Explicitly described as specifying limits for Databricks Git folders and Git integration, including size limits and file recoverability. This is product-specific numerical limits and constraints that qualify as expert knowledge under limits-quotas.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/repos/repos-setup,Configure Git integration,Configure Git integration for Git folders - Azure Databricks,Configure secure Git integration for Azure Databricks,"Configure Git integration for Databricks Git folders, including credentials, network connectivity, and security features.","This page describes how to configure Git integration for Azure Databricks Git folders, including credentials, network connectivity, and security features. 
To create a Git folder and start working with Git operations, see Create and manage Git folders. Important Use Git folders for interactive development. For CI/CD and production deployments, use Declarative Automation Bundles with versioned artifacts and workload identity federation. See CI/CD with Databricks Git folders and What are Declarative Au",2026-04-16T08:00:00.000Z,how-to,security,0.68,True,"Page focuses on configuring Git integration for Databricks Git folders, including credentials, network connectivity, and security features. This typically includes product-specific auth modes, token scopes, and network/security configuration details that go beyond generic Git usage, fitting the security sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/databricks/repos/serverless-private-git,Serverless Private Git,Configure Databricks Serverless Private Git - Azure Databricks,Set up Databricks Serverless Private Git with Private Link,Learn how to use Databricks Serverless Private Git,Note Databricks Serverless Private Git is in Public Preview. Compute and networking costs apply when serverless compute resources connect to external resources. See Understand Databricks serverless networking costs for billing details. Databricks Serverless Private Git lets you connect a Databricks workspace to a private Git server using serverless compute and Azure Private Link. A Git server is private if internet users can't access it. The following diagram illustrates the overall system architec,2026-03-10T08:00:00.000Z,get-started,architecture-patterns,0.65,True,"Explains Databricks Serverless Private Git, including architecture diagram and how serverless compute and Azure Private Link are used to connect to private Git servers. 
This is a product-specific architectural pattern for secure Git connectivity.",unchanged https://learn.microsoft.com/en-us/azure/databricks/repos/supported-artifact-types,Supported asset types,Supported asset types in Git folders - Azure Databricks,,"Learn about supported asset types in Git folders, including notebooks, files, queries, dashboards, and alerts.","Git folders support version control for specific Azure Databricks asset types. When an asset type is supported in Git folders, you can commit it to a remote Git repository, track changes over time, and collaborate with other users through version control. Note Git folders are workspace folders. The files and notebooks in a Git folder are stored in the same location as other workspace assets. For more information about where Azure Databricks stores data, see Where does Azure Databricks write data?",2026-04-09T08:00:00.000Z,reference,,0.2,False,"Describes which Azure Databricks asset types are supported in Git folders, but from the summary it appears to be a conceptual/feature description without numeric limits, configuration parameter tables, error-code-based troubleshooting, or other detailed product-specific settings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/repos/what-happened-repos,What happened to Repos?,What happened to Databricks Repos? - Azure Databricks,,Learn how Git folders replaced the Databricks Repos feature and what changed.,"Repos are now called Git folders. 
Like Repos, Git folders let you sync workspace folders with remote Git repositories for version control.",2026-01-30T08:00:00.000Z,upgrade-and-migration-article,,0.2,False,"Renaming/feature-change explanation for Repos to Git folders; no detailed limits, configs, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/resources/,Azure Databricks resources,Resources - Azure Databricks,,"Learn how to submit support tickets, manage your support contract, submit product feedback, and monitor Azure Databricks system status.","This section provides information on limits, the Azure Databricks release process, support plans, how to give product feedback, and how to monitor system status.",2026-03-31T23:28:00.000Z,landing-page,,0.1,False,Navigation/resources hub pointing to other docs; does not itself contain detailed limits or configurations.,unchanged https://learn.microsoft.com/en-us/azure/databricks/resources/databricks-geos,Geographies,Azure Geographies: Data residency - Azure Databricks,Manage Databricks data residency with Azure Geographies,"Learn how Geos help you manage data residency for some AI capabilities in Databricks, known as Designated Services.","This article describes how Azure Geographies manage data residency when processing customer content for features known asDesignated Services. 
For more information and a list of Azure data center regions in each Geography, see Data residency in Azure.",2026-02-27T23:28:00.000Z,concept-article,security,0.65,True,"Explains how Geos control where customer content is processed, impacting compliance and data residency configuration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/resources/designated-services,Designated Services,Databricks Designated Services - Azure Databricks,Manage data residency with Databricks Designated Services,Databricks Designated Services are features that use Geos to manage data residency for data processing.,"This article describes Databricks Designated Services, which are features in Databricks that useAzure geographiesto manage data residency when processing customer content. Important If a Designated Service is available in your workspace's Geo, then your content is processed in that Geo. Some Designated Services are not natively available in all Geos, and admins might need to enablecross-Geo processingbefore these Designated Services can be used. If you enable cross-Geo processing on your workspa",2026-04-07T08:00:00.000Z,concept-article,security,0.65,True,"Describes Designated Services, Geos, and cross-Geo processing for data residency; this is product-specific security/compliance configuration around where customer content is processed.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/resources/feature-region-support,Features with limited regional availability,Features with limited regional availability - Azure Databricks,Check Azure Databricks feature availability by region,Learn about feature support for regions supported by Azure Databricks.,"This article lists features with availability differences by region. Note Some features are not included in the tables below. These features, known as Designated Services, use Geographies to manage data residency when processing customer content. 
A Geo is a group of data center regions that Databricks uses to provide predictability and transparency regarding where your data is processed. SeeDatabricks Designated Services. The following tables list support for features that are only available on ",2026-04-17T08:00:00.000Z,feature-availability,deployment,0.7,True,"Contains region-by-region feature support tables, which are effectively a platform/region support matrix; this is deployment-relevant expert knowledge about where specific capabilities are available.",updated +https://learn.microsoft.com/en-us/azure/databricks/resources/designated-services,Designated Services,Databricks Designated Services - Azure Databricks,Decide and configure Databricks Designated Services geos,Databricks Designated Services are features that use Geos to manage data residency for data processing.,"This article describes Databricks Designated Services, which are features in Databricks that use Azure geographies to manage data residency when processing customer content. Important If a Designated Service is available in your workspace's Geo, then your content is processed in that Geo. Some Designated Services are not natively available in all Geos, and admins might need to enable cross-Geo processing before these Designated Services can be used. If you enable cross-Geo processing on your workspa",2026-04-20T08:00:00.000Z,concept-article,decision-making,0.6,True,"Describes which Databricks features are Designated Services, how they use Azure geographies for data residency, and when cross-Geo processing must be enabled. 
This is product-specific guidance for choosing and configuring data residency behavior across geos, impacting compliance and deployment decisions.",updated
+https://learn.microsoft.com/en-us/azure/databricks/resources/feature-region-support,Features with limited regional availability,Features with limited regional availability - Azure Databricks,Check Azure Databricks feature availability by region,Learn about feature support for regions supported by Azure Databricks.,"This article lists features with availability differences by region. Note Some features are not included in the tables below. These features, known as Designated Services, use Geographies to manage data residency when processing customer content. A Geo is a group of data center regions that Databricks uses to provide predictability and transparency regarding where your data is processed. See Databricks Designated Services. The following tables list support for features that are only available on ",2026-04-21T08:00:00.000Z,feature-availability,deployment,0.7,True,"Lists which Azure Databricks features are available in which regions using tables of feature-by-region support. This is concrete, product-specific availability data that changes over time and is not inferable from general training. It directly affects deployment and rollout decisions by region, fitting best under deployment.",updated
https://learn.microsoft.com/en-us/azure/databricks/resources/firewall-rules,Domain name firewall rules,Configure domain name firewall rules - Azure Databricks,Configure domain-based firewall rules for Databricks,Learn how to configure domain name firewall rules for Azure Databricks workspaces.,"If your corporate firewall blocks traffic based on domain names, you must allow HTTPS and WebSocket traffic to Azure Databricks domain names to ensure access to Azure Databricks resources. 
You can choose between two options, one more permissive but easier to configure, the other specific to your workspace domains.",2026-02-04T00:12:00.000Z,concept-article,security,0.7,True,"Provides specific domain names and protocol requirements (HTTPS/WebSocket) that must be allowed through corporate firewalls for Azure Databricks, which is concrete, product-specific security configuration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/resources/glossary,Glossary,Azure Databricks technical terminology glossary - Azure Databricks,,Glossary of key Databricks technical terms.,,2026-04-10T08:00:00.000Z,glossary,,0.2,False,"Glossary of terminology; mostly definitions rather than limits, configs, or troubleshooting mappings.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/resources/glossary,Glossary,Azure Databricks technical terminology glossary - Azure Databricks,,Glossary of key Databricks technical terms.,,2026-04-21T08:00:00.000Z,glossary,,0.0,False,"A terminology glossary; primarily conceptual definitions without configuration values, limits, error codes, or decision matrices. Does not meet any expert-knowledge criteria for the defined sub-skill types.",updated https://learn.microsoft.com/en-us/azure/databricks/resources/ideas,Submit in-product feedback,Submit product feedback - Azure Databricks,,Learn how to influence the Azure Databricks product roadmap by providing feedback directly to the product team from your workspace.,"You can submit feedback directly to the product team to influence the Azure Databricks product roadmap. 
To submit product feedback from the Azure Databricks workspace, do the following:",2026-03-02T08:00:00.000Z,how-to,,0.1,False,"Describes how to submit product feedback; procedural, not technical expert knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/resources/ip-domain-region,IP addresses and domains for Databricks services and assets,IP addresses and domains for Azure Databricks services and assets - Azure Databricks,Configure IPs and domains for Azure Databricks networking,Learn about the IP and domains available for each region.,"This article lists IP addresses and domains for Azure Databricks services and assets. You may need this information if your Azure Databricks workspace is deployed to your own virtual network (VNet) and you use custom routes, also known as user-defined routes (UDR), to manage network traffic using a virtual appliance or firewall. SeeUser-defined route settings for Azure Databricks. Databricks strongly recommends that you use the Azure Databricks service tag instead of specific IP addresses.Azure ",2026-04-14T08:00:00.000Z,concept-article,configuration,0.8,True,"Lists specific IP address ranges and domain names per region for Databricks services, used for UDR/firewall configuration; this is detailed, product-specific configuration data.",updated -https://learn.microsoft.com/en-us/azure/databricks/resources/limits,Resource quotas,Resource limits - Azure Databricks,Review Azure Databricks resource and API limits,Learn about numerical limits for Azure Databricks resources and whether you can request an increase for each limit.,"The following tables list various numerical limits for Azure Databricks resources. For additional information about Azure Databricks resource limits, see each individual resource's overview documentation. For limits on Databricks Free Edition, seeDatabricks Free Edition limitations. This page includes resource limits followed by API rate limits. 
For limits whereFixedisNo, you can request a limit increase through your Azure Databricks account team.",2026-04-09T08:00:00.000Z,limits-and-quotas,limits-quotas,0.98,True,"Explicitly described as listing numerical limits for Azure Databricks resources and API rate limits, likely in tables with exact values and whether they are fixed or can be increased.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/resources/ip-domain-region,IP addresses and domains for Databricks services and assets,IP addresses and domains for Azure Databricks services and assets - Azure Databricks,Configure network rules using Databricks IPs and domains,Learn about the IP and domains available for each region.,"This article lists IP addresses and domains for Azure Databricks services and assets. You may need this information if your Azure Databricks workspace is deployed to your own virtual network (VNet) and you use custom routes, also known as user-defined routes (UDR), to manage network traffic using a virtual appliance or firewall. See User-defined route settings for Azure Databricks. Databricks strongly recommends that you use the Azure Databricks service tag instead of specific IP addresses. Azure ",2026-04-23T08:00:00.000Z,concept-article,configuration,0.8,True,"Provides region-specific IP address ranges and domain names for Azure Databricks services and assets, used for UDRs, firewalls, and VNets. These are detailed, product-specific configuration values that an LLM would not know from training and are used to configure network/security settings.",updated
+https://learn.microsoft.com/en-us/azure/databricks/resources/limits,Resource quotas,Resource limits - Azure Databricks,Review Azure Databricks resource and API limits,Learn about numerical limits for Azure Databricks resources and whether you can request an increase for each limit.,"The following tables list various numerical limits for Azure Databricks resources. 
For additional information about Azure Databricks resource limits, see each individual resource's overview documentation. For limits on Databricks Free Edition, see Databricks Free Edition limitations. This page includes resource limits followed by API rate limits. For limits where Fixed is No, you can request a limit increase through your Azure Databricks account team.",2026-04-21T08:00:00.000Z,limits-and-quotas,limits-quotas,0.95,True,"Explicitly described as listing numerical limits for Azure Databricks resources and API rate limits, likely in tables with exact values and whether they are fixed or can be increased, matching limits-quotas criteria.",updated
https://learn.microsoft.com/en-us/azure/databricks/resources/manage-resource-quotas,Monitor resource quotas,Monitor your usage of Unity Catalog resource quotas - Azure Databricks,Monitor Unity Catalog resource quota usage via APIs,Learn how to monitor your Unity Catalog resource usage using the Unity Catalog resource quotas APIs.,"This article describes how to monitor your usage of Unity Catalog securable objects that are subject to resource quotas. You can use the Unity Catalog resource quotas APIs to track usage. Although some limits can be increased upon request, others are fixed. To avoid disruptions, plan ahead and contact your Azure Databricks account team if you anticipate exceeding your resource quotas.",2026-03-26T08:00:00.000Z,limits-and-quotas,limits-quotas,0.8,True,"Focuses on resource quotas and usage monitoring via specific APIs; quotas imply concrete numeric limits and tracking, which is expert limits-quotas knowledge.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/resources/platform-release,Platform release process,Platform release process - Azure Databricks,Understand Azure Databricks platform release windows,Learn about the Azure Databricks platform release process and scheduled maintenance windows.,"Platform releases are deployed on a regular basis. 
To minimize disruption to users, deployments are scheduled outside of normal business hours for the region in which they occur.",2024-10-07T08:00:00.000Z,concept-article,deployment,0.7,True,"Describes platform release process and scheduled maintenance windows, which are product-specific deployment/maintenance timing details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/resources/pricing,Pricing,Serverless DBU consumption by SKU - Azure Databricks,Understand serverless DBU billing by Azure Databricks SKU,Learn about the SKUs and DBU multipliers used to bill for various Azure Databricks serverless offerings.,"This article explains the SKUs and DBU multipliers used to bill for various Databricks serverless offerings. For Azure Databricks pricing, seepricing details.",2026-03-02T08:00:00.000Z,concept-article,decision-making,0.7,True,"Explains SKUs and DBU multipliers used for billing; this is quantified cost/consumption data that informs SKU and cost decisions, fitting decision-making.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/resources/status,Azure Databricks status page,Status Page - Azure Databricks,,"Learn about the Azure Databricks Status Page, which provides an overview of all core Azure Databricks services.","The Azure DatabricksStatus Pageprovides an overview of all core Azure Databricks services. You can also subscribe to status updates on individual service components and receive an alert whenever the status of the service you are subscribed to changes. The following image is a screenshot, not a live link. To see the actual current status, click this link:Status Page. The status page is broken down by Azure region. 
Select one of the four main geos (Americas,Europe,Asia Pacific, orMiddle East and A",2026-04-13T08:00:00.000Z,concept-article,,0.1,False,"Describes the status page and how to subscribe to updates; no product-specific limits, configs, or troubleshooting details.",updated
+https://learn.microsoft.com/en-us/azure/databricks/resources/pricing,Pricing,Serverless DBU consumption by SKU - Azure Databricks,Choose Azure Databricks serverless SKUs and DBU multipliers,Learn about the SKUs and DBU multipliers used to bill for various Azure Databricks serverless offerings.,"This article explains the SKUs and DBU multipliers used to bill for various Databricks serverless offerings. For Azure Databricks pricing, see pricing details.",2026-04-21T08:00:00.000Z,concept-article,decision-making,0.7,True,"Explains SKUs and DBU multipliers used for billing serverless offerings. This is pricing/tier-specific guidance with concrete multipliers that affect cost decisions, fitting decision-making around SKU selection.",updated
+https://learn.microsoft.com/en-us/azure/databricks/resources/status,Azure Databricks status page,Status Page - Azure Databricks,,"Learn about the Azure Databricks Status Page, which provides an overview of all core Azure Databricks services.","The Azure Databricks Status Page provides an overview of all core Azure Databricks services. You can also subscribe to status updates on individual service components and receive an alert whenever the status of the service you are subscribed to changes. The following image is a screenshot, not a live link. To see the actual current status, click this link: Status Page. The status page is broken down by Azure region. 
Select one of the four main geos (Americas, Europe, Asia Pacific, or Middle East and A",2026-04-13T08:00:00.000Z,concept-article,,0.1,False,"Describes the status page and how to subscribe to updates; no product-specific limits, configs, or troubleshooting details.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/resources/supported-browsers,Supported browsers,Supported browsers - Azure Databricks,Verify supported browsers for Azure Databricks UI access,"Learn which browsers are officially supported by Azure Databricks, including Google Chrome, Firefox, Safari, and Microsoft Edge.","Azure Databricks officially supports the following browsers on Windows and macOS: The following browsers are not supported: Using an unsupported browser might cause unexpected behavior, including security issues. Azure Databricks can assist only in those support cases where an officially supported browser is being used.",2026-01-21T08:00:00.000Z,feature-availability,configuration,0.6,True,Provides explicit list of supported and unsupported browsers; this is product-specific compatibility configuration information.,unchanged
https://learn.microsoft.com/en-us/azure/databricks/resources/supported-regions,Supported regions,Azure Databricks regions - Azure Databricks,,Learn about the regions supported by Azure Databricks.,"This page lists the regions supported by Azure Databricks. Note Some Databricks features, known as Designated Services, use geographies to manage data residency when processing customer content. To learn more, see Data residency in Azure. For more region-related information, see the following pages:",2026-03-24T17:49:00.000Z,feature-availability,,0.4,False,"Lists supported regions; while specific, it is static catalog data rather than limits, configs, or decision matrices.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/schemas/,Overview,What are schemas in Azure Databricks? 
- Azure Databricks,,Learn about schemas (databases) in Azure Databricks and how they work in Unity Catalog.,"In Unity Catalog, a schema is a child of a catalog and can contain tables, views, volumes, models, and functions. Schemas provide more granular categories of data organization than catalogs. This article describes the role of schemas in the Azure Databricks data object hierarchy in Unity Catalog. For information on schemas in the legacy workspace-local Hive metastore, see Database objects in the legacy Hive metastore.",2026-02-26T08:00:00.000Z,concept-article,,0.2,False,"Conceptual overview of schemas and hierarchy; no detailed config tables, limits, or security role specifics.",unchanged
@@ -3547,12 +3577,12 @@ https://learn.microsoft.com/en-us/azure/databricks/schemas/manage-schema,Manage 
https://learn.microsoft.com/en-us/azure/databricks/search/,Search for workspace objects,Search for workspace objects - Azure Databricks,,"Learn how to search for Azure Databricks workspace objects, including notebooks, queries, dashboards, alerts, files, folders, libraries, tables, jobs, repos, and more.","This article describes how to search for tables, volumes, notebooks, queries, dashboards, alerts, files, folders, libraries, jobs, repos, partners, and Marketplace listings in your Azure Databricks workspace. Tables must be registered in Unity Catalog to appear in search results. 
In workspaces that use customer-managed keys for encryption, notebook contents and query contents are not available in search.",2026-03-26T08:00:00.000Z,concept-article,,0.4,False,"Describes how to search for workspace objects; while it mentions a CSP-related behavior, it is mostly procedural UI guidance, not a structured configuration or security reference.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/security/,Security & compliance overview,Security and compliance - Azure Databricks,,Learn about how Azure Databricks secures your data and privacy and how you can secure your Azure Databricks account and data.,"Azure Databricks provides comprehensive security and compliance features to protect your data, users, and workspaces. Configure authentication and access controls, secure network connections, encrypt data at rest and in transit, manage secrets and credentials, and meet regulatory compliance requirements.",2025-12-11T08:00:00.000Z,concept-article,,0.2,False,"Top-level security overview; conceptual description of security and compliance features without detailed roles, parameters, or configuration tables.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/security/auth/,Authentication and access control overview,Authentication and access control - Azure Databricks,,Learn how to manage authentication and access control for your Azure Databricks account and workspaces.,"This article introduces authentication and access control in Azure Databricks. 
For information about securing access to your data, see Data governance with Azure Databricks.",2026-04-03T08:00:00.000Z,concept-article,,0.3,False,"Introductory overview of authentication and access control; does not appear to list specific RBAC roles, scopes, or configuration parameters in detail.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/security/auth/access-control/,Access control overview,Access control lists - Azure Databricks,Configure Azure Databricks workspace object ACLs,Learn how to manage access to Azure Databricks securable objects.,This page describes details about the permissions available for the different workspace objects.,2026-04-15T08:00:00.000Z,concept-article,security,0.78,True,"Access control documentation for Databricks securable objects typically lists concrete permission names, scopes, and role capabilities per object type (clusters, jobs, notebooks, repos, etc.). These RBAC/ACL matrices and exact permission semantics are product-specific security configuration details that qualify as expert knowledge.",updated
+https://learn.microsoft.com/en-us/azure/databricks/security/auth/access-control/,Access control overview,Access control lists - Azure Databricks,Manage access control lists for Databricks workspace objects,Learn how to manage access to Azure Databricks securable objects.,This page describes details about the permissions available for the different workspace objects.,2026-04-24T08:00:00.000Z,concept-article,security,0.8,True,Documents specific permissions and ACL behaviors for different securable object types—detailed RBAC/permission configuration unique to Databricks.,updated
https://learn.microsoft.com/en-us/azure/databricks/security/auth/access-control/service-principal-acl,Manage access to service principals,Roles for managing service principals - Azure Databricks,Assign roles and ACLs for Databricks service principals,Learn how to control access to service principals in Azure Databricks.,"This article 
describes how to manage roles on service principals in your Azure Databricks account. A service principal is an identity that you create in Azure Databricks for use with automated tools, jobs, and applications. Service principals give automated tools and scripts API-only access to Azure Databricks resources, providing greater security than using users or groups. You can grant Azure Databricks users, service principals, and account groups access to use a service principal. This allow",2026-02-11T08:00:00.000Z,concept-article,security,0.8,True,"Describes roles for managing service principals and how to grant access; likely lists specific Databricks roles/permissions and ACL behaviors, which are product-specific security configuration details.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/security/auth/api-access-permissions,Manage personal access tokens,Manage personal access token permissions - Azure Databricks,Configure Azure Databricks personal access token permissions,"Set permissions for personal access tokens and passwords. Control who can create tokens, modify tokens, use passwords to authenticate, and more.","This article describes how to configure permissions for Azure Databricks personal access tokens. To learn how to use credentials to authenticate to Azure Databricks, see Authorize access to Azure Databricks resources. To monitor and revoke personal access tokens, see Monitor and revoke personal access tokens.",2026-04-06T08:00:00.000Z,concept-article,security,0.8,True,"Managing PAT permissions is security-focused and product-specific. 
This page is about configuring who can create/modify tokens and use passwords, which implies concrete permission settings and possibly role or scope names unique to Databricks, matching the security sub-skill.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/security/auth/change-default-workspace-access,Change default workspace access to Consumer access,Change default workspace access to consumer access - Azure Databricks,Change default Databricks workspace access to consumer,Learn how to change the default workspace access for new users to consumer access using group cloning to separate users with authoring privileges from view-only consumers.,Important This feature is inPublic Preview. This page explains how workspace admins can change the default workspace access for new users to consumer access by using group cloning. This feature helps you streamline consumer onboarding at scale while maintaining appropriate access levels for users who need authoring privileges.,2026-03-16T08:00:00.000Z,concept-article,security,0.7,True,Describes how to use group cloning to change default workspace access for new users to consumer access. This involves concrete group/permission configuration patterns unique to Databricks.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/security/auth/change-default-workspace-access,Change default workspace access to Consumer access,Change default workspace access to consumer access - Azure Databricks,Configure default consumer workspace access via group cloning,Learn how to change the default workspace access for new users to consumer access using group cloning to separate users with authoring privileges from view-only consumers.,Important This feature is in Public Preview. This page explains how workspace admins can change the default workspace access for new users to consumer access by using group cloning. 
This feature helps you streamline consumer onboarding at scale while maintaining appropriate access levels for users who need authoring privileges.,2026-04-20T08:00:00.000Z,concept-article,security,0.7,True,"Describes a Databricks-specific pattern (group cloning) to separate consumer vs authoring access, with concrete steps—security configuration guidance.",updated
https://learn.microsoft.com/en-us/azure/databricks/security/auth/default-permissions,Default workspace permissions,Default workspace permissions - Azure Databricks,Understand default workspace permissions in Databricks,Learn about the default permissions for each new workspace.,"This page describes the workspace permissions granted by default when a new workspace is created. In a new workspace, your default permissions depend on whether you're a workspace admin or a non-admin user. For more information about users and groups, see Manage users, service principals, and groups. Audit logs record all changes made to user permissions. The logs show the permission changed and the user who initiated the change. Default permissions are set by Azure Databricks and are shown as in",2025-10-13T08:00:00.000Z,concept-article,security,0.8,True,"Details default permissions for admins and non-admins in a new workspace, including which actions are allowed on which objects. These default ACLs/roles are product-specific security settings.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/security/auth/entitlements,Manage entitlements,Manage entitlements - Azure Databricks,Manage Databricks user and group entitlements,"Learn how to manage entitlements for users, service principals, and groups","This page describes how to manage entitlements for users, service principals, and groups.",2026-03-23T08:00:00.000Z,concept-article,security,0.75,True,"Explains specific entitlements that can be assigned to users, service principals, and groups. 
The entitlement names and effects are Databricks-specific security configuration.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/security/auth/entitlements,Manage entitlements,Manage entitlements - Azure Databricks,Manage Azure Databricks user and group entitlements,"Learn how to manage entitlements for users, service principals, and groups","This page describes how to manage entitlements for users, service principals, and groups.",2026-04-20T22:01:00.000Z,concept-article,security,0.7,True,"Covers specific entitlements and how they’re assigned to users, service principals, and groups—product-specific access control configuration.",updated https://learn.microsoft.com/en-us/azure/databricks/security/auth/jit,Automatically provision users (JIT),Automatically provision users (JIT) - Azure Databricks,Configure just-in-time user provisioning in Databricks,Learn how to configure just-in-time (JIT) provisioning in your Azure Databricks account.,This page explains how to automatically create user accounts when users first log in to Azure Databricks using just-in-time (JIT) provisioning.,2026-02-02T08:00:00.000Z,concept-article,security,0.7,True,"Describes how to configure JIT provisioning for user accounts, which involves specific identity/auth settings and flows unique to Azure Databricks. These are product-specific security configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/keys/,Overview,Data security and encryption - Azure Databricks,,Protect your data with encryption at rest and in-transit. Configure customer-managed keys for more control over your data privacy.,"This article introduces data security configurations to help protect your data. 
For information about securing access to your data, see Data governance with Azure Databricks.",2026-01-24T08:00:00.000Z,concept-article,,0.3,False,"High-level introduction to data security and encryption options; appears conceptual without detailed configuration tables, limits, or product-specific parameters.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/security/keys/cmek-unity-catalog,Customer-managed keys for Unity Catalog,Customer-managed keys for Unity Catalog - Azure Databricks,Configure customer-managed keys for Unity Catalog,Learn how to configure customer-managed keys (CMK) for Unity Catalog to protect catalog data with multi-key encryption.,"Customer-managed keys (CMK) for Unity Catalog let you protect data managed by Azure Databricks with your own encryption keys. You can configure encryption at the catalog level, using a separate key for each catalog based on data sensitivity or compliance requirements. For information about CMK for managed services and workspace storage, see Customer-managed keys for encryption.",2026-03-02T19:05:00.000Z,concept-article,security,0.76,True,"Describes catalog-level CMK configuration and multi-key encryption behavior for Unity Catalog, including how keys map to catalogs and compliance scenarios, which is detailed product-specific security behavior.",unchanged
@@ -3562,7 +3592,7 @@ https://learn.microsoft.com/en-us/azure/databricks/security/keys/cmk-managed-dis
https://learn.microsoft.com/en-us/azure/databricks/security/keys/cmk-managed-services-azure/,Customer-managed keys for managed services overview,Customer-managed keys for managed services - Azure Databricks,Configure CMK for Databricks managed services data,"Learn how to configure your own HSM and software keys (customer-managed keys) for Azure Databricks managed services data in the control plane (notebooks, secrets, Databricks SQL queries, and Databrick","Note This feature requires the Premium plan. 
For additional control of your data, you can add your own key to protect and control access to some types of data. Azure Databricks has three customer-managed key features for different types of data and locations. To compare them, see Customer-managed keys for encryption. Managed services data in the Azure Databricks control plane is encrypted at rest. You can add a customer-managed key for managed services to help protect and control access to the follo",2026-03-10T08:00:00.000Z,concept-article,security,0.8,True,"Describes Databricks control-plane data types that can be encrypted with CMK and how the feature works, including plan requirements and product-specific encryption behavior.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/security/keys/cmk-managed-services-azure/cmk-hsm-managed-services-azure,Enable HSM customer-managed keys for managed services,Enable HSM customer-managed keys for managed services - Azure Databricks,Enable HSM CMK for Databricks managed services,"Learn how to configure your own HSM keys (customer-managed keys) for Azure Databricks managed services data in the control plane (notebooks, secrets, Databricks SQL queries, and Databricks SQL query h","Note This feature requires the Premium plan. This article describes how to configure your own key from Azure Key Vault Managed HSM. 
For instructions on using a key from Azure Key Vault vaults, see Enable customer-managed keys for managed services.",2026-01-24T08:00:00.000Z,concept-article,security,0.86,True,"Provides concrete instructions for configuring Azure Key Vault Managed HSM keys for Databricks managed services data, including product-specific security configuration steps.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/security/keys/cmk-managed-services-azure/customer-managed-key-managed-services-azure,Enable customer-managed keys for managed services,Enable customer-managed keys for managed services - Azure Databricks,Enable CMK for Databricks managed services via Key Vault,"Learn how to configure your own keys (customer-managed keys) for Azure Databricks managed services data in the control plane (notebooks, secrets, Databricks SQL queries, and Databricks SQL query histo","Note This feature requires the Premium plan. For additional control of your data, you can add your own key to protect and control access to some types of data. Azure Databricks has multiple customer-managed key features. To compare the related features, see Customer-managed keys for encryption. Tip This article describes how to configure your own key from Azure Key Vault vaults for managed services. For instructions on using a key from Azure Key Vault Managed HSM, see Enable HSM customer-managed ke",2026-01-24T08:00:00.000Z,concept-article,security,0.86,True,"Step-by-step configuration of customer-managed keys from Azure Key Vault for managed services data, with Databricks-specific security configuration details and parameters.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/security/keys/customer-managed-keys,Customer-managed keys overview,Customer-managed keys for encryption - Azure Databricks,,Learn about using your own key with a Databricks workspace to encrypt some types of data.,Note This feature requires thePremium plan. 
This page provides an overview of customer-managed keys for encryption. Some services and data support adding a customer-managed key to help protect and control access to encrypted data. You can use the key management service in your cloud to maintain a customer-managed encryption key. Azure Databricks supports customer-managed keys from Azure Key Vault vaults and Azure Key Vault Managed HSM (Hardware Security Modules).,2026-04-17T18:03:00.000Z,concept-article,,0.45,False,"Described as an overview of customer‑managed keys for encryption and supported key sources. Likely conceptual and plan‑level info without detailed role names, permission scopes, or concrete configuration parameter tables.",updated
+https://learn.microsoft.com/en-us/azure/databricks/security/keys/customer-managed-keys,Customer-managed keys overview,Customer-managed keys for encryption - Azure Databricks,,Learn about using your own key with a Databricks workspace to encrypt some types of data.,Note This feature requires the Premium plan. This page provides an overview of customer-managed keys for encryption. Some services and data support adding a customer-managed key to help protect and control access to encrypted data. You can use the key management service in your cloud to maintain a customer-managed encryption key. Azure Databricks supports customer-managed keys from Azure Key Vault vaults and Azure Key Vault Managed HSM (Hardware Security Modules).,2026-04-17T18:03:00.000Z,concept-article,,0.45,False,"Described as an overview of customer‑managed keys for encryption and supported key sources. 
Likely conceptual and plan‑level info without detailed role names, permission scopes, or concrete configuration parameter tables.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/keys/customer-managed-keys-dbfs/,Customer-managed keys for DBFS overview,Customer-managed keys for DBFS root - Azure Databricks,Use CMK to encrypt Databricks DBFS root,Learn how to use your own encryption key to encrypt the DBFS storage account.,"Note This feature is available only in the Premium plan. For additional control of your data, you can add your own key to protect and control access to some types of data. Azure Databricks has two customer-managed key features that involve different types of data and locations. For a comparison, see Customer-managed keys for encryption. By default, the storage account is encrypted with Microsoft-managed keys. After you add a customer-managed key for DBFS root, Azure Databricks uses your key to enc",2025-11-11T08:00:00.000Z,concept-article,security,0.8,True,"Explains Databricks-specific behavior when adding CMK for DBFS root and how it changes storage account encryption, which is a product-specific security configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/keys/customer-managed-keys-dbfs/cmk-dbfs-azure-cli,Enable customer-managed keys using Azure CLI,Configure customer-managed keys for DBFS using the Azure CLI - Azure Databricks,Configure DBFS CMK via Azure CLI,Learn how to use the Azure CLI to configure your own encryption key to encrypt the DBFS storage account.,"Note This feature is available only in the Premium plan. You can use the Azure CLI to configure your own encryption key to encrypt the workspace storage account. This article describes how to configure your own key from Azure Key Vault vaults. For instructions on using a key from Azure Key Vault Managed HSM, see Configure HSM customer-managed keys for DBFS using the Azure CLI. 
For more information about customer-managed keys for DBFS, see Customer-managed keys for DBFS root.",2025-10-27T08:00:00.000Z,concept-article,security,0.87,True,"Shows Azure CLI-based configuration for CMK on the Databricks workspace storage account, including specific commands and parameters tied to Databricks DBFS encryption.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/keys/customer-managed-keys-dbfs/cmk-dbfs-azure-portal,Enable customer-managed keys for DBFS using Azure portal,Configure customer-managed keys for DBFS using the Azure portal - Azure Databricks,Configure DBFS CMK via Azure portal for Databricks,Learn how to use the Azure portal to configure your own encryption key to encrypt the DBFS storage account.,"Note This feature is available only in the Premium plan. You can use the Azure portal to configure your own encryption key to encrypt the workspace storage account. This article describes how to configure your own key from Azure Key Vault vaults. For instructions on using a key from Azure Key Vault Managed HSM, see Configure HSM customer-managed keys for DBFS using the Azure portal. For more information about customer-managed keys for DBFS, see Customer-managed keys for DBFS root.",2025-03-03T19:40:00.000Z,concept-article,security,0.8,True,"Stepwise portal configuration of CMK from Azure Key Vault for DBFS storage account encryption, with Databricks-specific requirements and flows.",unchanged @@ -3578,49 +3608,49 @@ https://learn.microsoft.com/en-us/azure/databricks/security/keys/sql-encryption, https://learn.microsoft.com/en-us/azure/databricks/security/network/,Overview,Networking - Azure Databricks,Configure secure networking for Azure Databricks workspaces,"Configure secure network connectivity and security controls for Databricks workspaces, compute planes, and data access.","Databricks provides a secure networking environment by default. 
You can configure additional networking features to control access to workspaces, secure connectivity between the control plane and compute planes, and protect connections to your data sources. For an overview of the networking architecture, see Networking security architecture. Note Azure Databricks charges for networking costs when serverless workloads connect to customer resources. See Understand Databricks serverless networking co",2026-02-09T20:20:00.000Z,concept-article,security,0.6,True,"Describes Databricks-specific networking security options (front-end, serverless, classic) and billing implications for serverless networking, which are product-specific.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/,Classic compute plane networking,Classic compute plane networking - Azure Databricks,Secure classic compute plane networking in Azure Databricks,Learn how to secure your Azure Databricks control plane to the classic compute plane with networking security features.,"This page introduces features to customize network access between the Azure Databricks control plane and the classic compute plane. Connectivity between the control plane and the classic compute plane is always over the cloud network backbone and not the public internet. To learn more about the control plane and the compute plane, see Networking security architecture. To learn more about classic compute and serverless compute, see Compute. 
The features in this section focus on establishing and sec",2026-02-12T04:25:00.000Z,concept-article,security,0.6,True,Introduces Databricks-specific features for securing control-plane to classic compute-plane connectivity over the cloud backbone.,unchanged https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/on-prem-network,Connect a workspace to an on-premises network,Connect your Azure Databricks workspace to your on-premises network - Azure Databricks,Configure Databricks connectivity to on-premises networks,Learn how to connect your Azure Databricks workspace to your on-premises network.,"This article shows how to establish connectivity from your Azure Databricks workspace to your on-premises network. Traffic is routed via a transit virtual network (VNet) to the on-premises network, using the following hub-and-spoke topology. If you need assistance following this guide, contact your Microsoft and Databricks account teams.",2026-01-24T08:00:00.000Z,concept-article,configuration,0.65,True,Shows how to establish connectivity via a hub-and-spoke topology and transit VNet. Contains concrete network configuration patterns specific to Databricks deployments.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/private-link-standard,Classic compute plane Private Link,Configure classic compute plane private connectivity to Azure Databricks - Azure Databricks,Configure classic compute Private Link for Azure Databricks,Configure Azure Private Link for classic compute plane private connectivity.,Use Azure Private Link to create a secure classic compute plane Private Link connection for your Azure Databricks workspace. 
This connection secures traffic between clusters on the classic compute plane and core services on the Azure Databricks control plane.,2026-02-10T19:28:00.000Z,concept-article,security,0.75,True,"Explains how to set up Azure Private Link between classic compute plane clusters and the Databricks control plane, including product-specific connectivity behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/private-link-standard,Classic compute plane Private Link,Configure classic compute plane private connectivity to Azure Databricks - Azure Databricks,Configure classic compute plane Private Link for Databricks,Configure Azure Private Link for classic compute plane private connectivity.,Use Azure Private Link to create a secure classic compute plane Private Link connection for your Azure Databricks workspace. This connection secures traffic between clusters on the classic compute plane and core services on the Azure Databricks control plane.,2026-04-21T22:28:00.000Z,concept-article,security,0.78,True,"Describes how to configure Azure Private Link for the classic compute plane to reach Databricks control plane services, including specific network/security configuration steps and required Azure resources. This is product-specific secure connectivity configuration.",updated https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/secure-cluster-connectivity,Secure cluster connectivity,Enable secure cluster connectivity - Azure Databricks,Enable secure cluster connectivity (no public IP) in Databricks,"Learn about secure cluster connectivity, which provides customer VPCs with no open ports and Databricks Runtime cluster nodes with no public IP addresses.",This article explains how to use secure cluster connectivity for Azure Databricks workspaces. Secure cluster connectivity is also known as no public IP (NPIP). 
Serverless compute resources do not use secure cluster connectivity but also do not have public IP addresses.,2025-11-03T08:00:00.000Z,concept-article,security,0.7,True,Explains secure cluster connectivity/NPIP behavior and how to configure it so clusters have no public IPs. This is a product-specific security/network configuration pattern.,unchanged https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/service-endpoints,Configure service endpoint policies for storage access,Configure Azure virtual network service endpoint policies for storage access from classic compute - Azure Databricks,Configure VNet service endpoint policies for Databricks,Configure Azure virtual network service endpoint policies to allow access to only specific Azure Storage accounts.,"This page explains how to use Azure virtual network service endpoint policies to filter outbound traffic from the classic compute plane, ensuring connections are only made to specific Azure Storage accounts.",2026-02-23T08:00:00.000Z,concept-article,security,0.7,True,"Explains how to configure Azure virtual network service endpoint policies specifically for Azure Databricks classic compute to restrict storage access, including product-specific network/security settings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/udr,User-defined route settings,User-defined route settings for Azure Databricks - Azure Databricks,Configure user-defined routes for Databricks VNets,Learn how to configure user-defined route settings for Azure Databricks.,"If your Azure Databricks workspace is deployed to your own virtual network (VNet), you can use custom routes, also known as user-defined routes (UDR), to ensure that network traffic is routed correctly for your workspace. For example, if you connect the virtual network to your on-premises network, traffic may be routed through the on-premises network and unable to reach the Azure Databricks control plane. 
User-defined routes can solve that problem. You need a UDR for every type of outbound connecti",2026-04-14T17:47:00.000Z,concept-article,configuration,0.8,True,"User‑defined route settings for Databricks require specific route table entries, next hop types, and per‑connection requirements (control plane, data plane, on‑prem). These are detailed configuration parameters and patterns unique to Databricks networking.",updated -https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/update-workspaces,Update workspace network configuration,Update workspace network configuration - Azure Databricks,Update Azure Databricks workspace VNet settings,Learn how to update network configurations for an Azure Databricks workspace.,"This page provides step-by-step instructions for updating the virtual network (VNet) configuration of an existing Azure Databricks workspace. This allows you to migrate a workspace from an Azure Databricks-managed VNet to your own VNet, a process known as VNet injection, or to modify the VNet configuration of an existing VNet-injected workspace.",2026-04-15T22:08:00.000Z,concept-article,configuration,0.72,True,"Step‑by‑step instructions for updating workspace VNet configuration and migrating between managed VNet and VNet injection will include specific workspace properties, supported transitions, and required parameter values—product‑specific configuration knowledge.",updated -https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/vnet-inject,Deploy Azure Databricks in your VNet,Deploy Azure Databricks in your Azure virtual network (VNet injection) - Azure Databricks,Deploy Azure Databricks with VNet injection,"Learn how to deploy Azure Databricks in your Azure Virtual Network, also known as VNet injection.","Deploy Azure Databricks in your Azure VNet to enable network customization, secure connectivity to Azure services and on-premises data sources, and traffic inspection 
capabilities.",2026-04-13T08:00:00.000Z,concept-article,configuration,0.7,True,"VNet injection deployment guidance typically includes required subnets, address ranges, NSG/route requirements, and workspace configuration parameters. These are concrete, product‑specific configuration settings rather than generic networking concepts.",updated +https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/udr,User-defined route settings,User-defined route settings for Azure Databricks - Azure Databricks,Configure user-defined routes for Databricks VNets,Learn how to configure user-defined route settings for Azure Databricks.,"If your Azure Databricks workspace is deployed to your own virtual network (VNet), you can use custom routes, also known as user-defined routes (UDR), to ensure that network traffic is routed correctly for your workspace. For example, if you connect the virtual network to your on-premises network, traffic may be routed through the on-premises network and unable to reach the Azure Databricks control plane. User-defined routes can solve that problem. You need a UDR for every type of outbound connecti",2026-04-14T17:47:00.000Z,concept-article,configuration,0.8,True,"User‑defined route settings for Databricks require specific route table entries, next hop types, and per‑connection requirements (control plane, data plane, on‑prem). These are detailed configuration parameters and patterns unique to Databricks networking.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/update-workspaces,Update workspace network configuration,Update workspace network configuration - Azure Databricks,Update Azure Databricks workspace VNet and network settings,Learn how to update network configurations for an Azure Databricks workspace.,"This page provides step-by-step instructions for updating the virtual network (VNet) configuration of an existing Azure Databricks workspace. 
This allows you to migrate a workspace from an Azure Databricks-managed VNet to your own VNet, a process known as VNet injection, or to modify the VNet configuration of an existing VNet-injected workspace.",2026-04-23T17:47:00.000Z,concept-article,configuration,0.7,True,"Provides detailed steps and options for changing an existing workspace’s VNet configuration (including VNet injection and modifications), with product-specific network configuration parameters and constraints. This is focused on configuration rather than general concepts.",updated +https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/vnet-inject,Deploy Azure Databricks in your VNet,Deploy Azure Databricks in your Azure virtual network (VNet injection) - Azure Databricks,Deploy Azure Databricks with VNet injection,"Learn how to deploy Azure Databricks in your Azure Virtual Network, also known as VNet injection.","Deploy Azure Databricks in your Azure VNet to enable network customization, secure connectivity to Azure services and on-premises data sources, and traffic inspection capabilities.",2026-04-13T08:00:00.000Z,concept-article,configuration,0.7,True,"VNet injection deployment guidance typically includes required subnets, address ranges, NSG/route requirements, and workspace configuration parameters. 
These are concrete, product‑specific configuration settings rather than generic networking concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/vnet-peering,Peer virtual networks,Peer virtual networks - Azure Databricks,Configure VNet peering for Azure Databricks workspaces,Learn how to configure bidirectional virtual network (VNet) peering between Azure Databricks and another Azure virtual network.,This article shows you how to peer an Azure Databricks virtual network (VNet) with an Azure VNet.,2026-02-12T04:25:00.000Z,concept-article,security,0.65,True,"Shows how to peer Databricks VNets with other Azure VNets, including product-specific network topology and security considerations.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/network/concepts/architecture,Networking security architecture,Networking security architecture - Azure Databricks,,Learn about the Databricks control plane and compute plane architecture and how to configure secure network connectivity.,"Azure Databricks operates out of a control plane and a compute plane. To learn more about classic compute and serverless compute, see Compute. 
For additional architecture information, see High-level architecture.",2026-02-11T08:00:00.000Z,concept-article,,0.4,False,Appears to be a conceptual overview of Databricks control/compute plane and networking security architecture without detailed configuration tables or numeric thresholds.,unchanged https://learn.microsoft.com/en-us/azure/databricks/security/network/concepts/private-link,Azure Private Link,Azure Private Link concepts - Azure Databricks,,"Concepts and architecture for Azure Private Link on Azure Databricks, covering inbound (front-end), outbound (serverless), and classic (back-end) connectivity patterns.","Azure Private Link creates a private, secure connection between your Azure Databricks resources and your Azure services and serverless resources, ensuring your network traffic isn't exposed to the public internet. Azure Databricks supports three types of Private Link connectivity:",2026-04-02T08:00:00.000Z,concept-article,,0.4,False,Described as concepts and architecture for Azure Private Link connectivity patterns; likely conceptual without detailed parameter tables or specific config values.,unchanged https://learn.microsoft.com/en-us/azure/databricks/security/network/context-based-policies,Overview,Context-based network policies - Azure Databricks,Manage context-based network policies for Databricks workspaces,Learn how to manage context-based network policies that control inbound access and outbound connections for your Azure Databricks workspaces.,"Azure Databricks context-based policies provide a unified security framework for managing both inbound and outbound traffic to your workspaces. With context-based ingress, administrators can restrict workspace access based on a combination of identity, network source, and request type. Serverless egress policies extend this control to outbound traffic by limiting serverless workloads to authorized destinations. 
Together, these network policies help ensure that both user access and data movement ",2026-02-09T23:11:00.000Z,concept-article,security,0.8,True,"Describes Databricks-specific context-based policies that combine identity, network source, and request type for ingress and serverless egress, including supported controls.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/,Inbound networking,Users to Azure Databricks networking - Azure Databricks,Secure user access to Azure Databricks with front-end controls,Learn how to secure your Azure Databricks workspace with front-end networking security features.,"This guide introduces features to customize network access between users and their Azure Databricks workspaces. By default, users and applications can connect to Azure Databricks from any IP address. Users might access critical data sources using Azure Databricks. If a user's credentials are compromised through phishing or a similar attack, securing network access dramatically reduces the risk of an account takeover. Configurations like private connectivity, IP access lists, and firewalls help k",2026-02-09T08:00:00.000Z,concept-article,security,0.6,True,"Introduces Databricks-specific front-end security features like private connectivity, IP access lists, and firewalls to restrict workspace access.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/context-based-ingress,Context-based ingress control,Context-based ingress control - Azure Databricks,Configure context-based ingress control for Databricks,"This article explains how context-based ingress control lets you control who can reach your Databricks workspace endpoints based on identity, network source, and API scope.","Important This feature is in Public Preview. This page provides an overview of context-based ingress control. For serverless egress control, see What is serverless egress control?. 
To configure ingress policies, see Manage context-based ingress policies.",2026-02-09T08:00:00.000Z,concept-article,security,0.7,True,"Describes configuring ingress policies based on identity, network source, and API scope. These policy constructs and configuration steps are Databricks-specific security settings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/front-end-private-connect,Inbound Private Link,Configure Inbound Private Link - Azure Databricks,Configure inbound Private Link to Databricks workspaces,Learn how to configure Private Link for inbound connections.,"This page provides instructions for configuring inbound private connectivity, which secures the connection between users and their Azure Databricks workspaces.",2026-03-11T08:00:00.000Z,concept-article,security,0.75,True,"Provides concrete steps and parameters to set up inbound Private Link for Databricks, a product-specific network security configuration.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/front-end-private-connect,Inbound Private Link,Configure Inbound Private Link - Azure Databricks,Configure inbound Private Link for Azure Databricks workspaces,Learn how to configure Private Link for inbound connections.,"This page provides instructions for configuring inbound private connectivity, which secures the connection between users and their Azure Databricks workspaces.",2026-04-20T08:00:00.000Z,concept-article,security,0.76,True,"Step-by-step configuration of inbound Private Link for Databricks front-end, with product-specific network/security settings (private endpoint types, DNS/endpoint configuration, required Azure resources). 
This is concrete security configuration rather than generic networking guidance.",updated https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/ip-access-list,IP access lists overview,Manage IP access lists - Azure Databricks,Configure Azure Databricks IP access lists,Learn how to limit Azure Databricks workspace access to the authorized IP addresses only.,This page introduces IP access lists for the Azure Databricks account and workspaces.,2026-02-09T08:00:00.000Z,concept-article,security,0.78,True,"Describes Databricks-specific IP access list behavior and enforcement for accounts and workspaces, including how lists interact with other controls. This is product-specific security configuration beyond generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/ip-access-list-account,IP access lists for the account console,Configure IP access lists for the account console - Azure Databricks,Configure account console IP access lists,Learn how to limit access to the Azure Databricks account console to the authorized IP addresses only.,"This page describes how to configure IP access lists for the Azure Databricks account console. You can also use the Account IP Access Lists API. 
To configure IP access lists for an Azure Databricks workspace, see Configure IP access lists for workspaces.",2026-02-09T08:00:00.000Z,concept-article,security,0.82,True,"Provides specific instructions for configuring IP access lists for the Azure Databricks account console using the Account IP Access Lists API, which is a product-specific security configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/ip-access-list-workspace,IP access lists for workspaces,Configure IP access lists for workspaces - Azure Databricks,Manage workspace IP access lists with CLI and API,Learn how to limit Azure Databricks workspace access to the authorized IP addresses only.,"This page describes how to configure IP access lists for Azure Databricks workspaces. This article discusses the most common tasks you can perform using the Databricks CLI. You can also use the IP Access Lists API. Note Workspace IP access lists and account-level context-based ingress controls are enforced together. A request must be allowed by both controls to succeed. Context-based ingress controls provide finer-grained security by combining identity, request type, and network source conditions.",2026-02-09T08:00:00.000Z,concept-article,security,0.86,True,"Details concrete steps and commands for configuring workspace IP access lists using the Databricks CLI and IP Access Lists API, including how enforcement combines with account-level ingress controls. Contains product-specific security configuration patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/manage-ingress-policies,Manage context-based ingress policies,Manage context-based ingress policies - Azure Databricks,Create and manage context-based ingress policies in Databricks,This document explains how to configure and manage context-based ingress policies for your Databricks workspaces.,"Important This feature is in Public Preview. 
This page shows account admins how to create network policies, author granular ingress rules, and attach a policy to workspaces. For an overview of context-based ingress, see Context-based ingress control. For serverless egress control, see What is serverless egress control?.",2026-02-09T23:11:00.000Z,concept-article,security,0.8,True,"Provides concrete admin steps to create network policies, author granular ingress rules, and attach them to workspaces, including Databricks-specific policy model.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/service-direct-privatelink,Inbound Private Link for performance-intensive services,Configure inbound Private Link for performance-intensive services - Azure Databricks,Configure inbound Private Link for Databricks services,Configure inbound (front-end) Private Link for performance-intensive services on the Databricks platform.,"Important This feature is in Public Preview. This page shows how to configure Private Link for inbound connectivity to performance-intensive services on the Azure Databricks platform. This private connection allows external clients and users to access services on the Azure Databricks platform, such as Zerobus Ingest and Lakebase Autoscaling. Note Databricks does not currently bill for networking costs associated with inbound Private Link connections to performance-intensive services. Charges may ",2026-04-17T08:00:00.000Z,concept-article,configuration,0.7,True,"Private Link setup for performance‑intensive Databricks services will include specific configuration fields (DNS zones, endpoint types, required subnets, resource IDs) and possibly constraints around supported services. 
These are detailed, product‑specific configuration parameters beyond generic knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/service-direct-privatelink,Inbound Private Link for performance-intensive services,Configure inbound Private Link for performance-intensive services - Azure Databricks,Set up inbound Private Link for Databricks performance services,Configure inbound (front-end) Private Link for performance-intensive services on the Databricks platform.,"Important This feature is in Public Preview. This page shows how to configure Private Link for inbound connectivity to performance-intensive services on the Azure Databricks platform. This private connection allows external clients and users to access services on the Azure Databricks platform, such as Zerobus Ingest and Lakebase Autoscaling. 
Note Databricks does not currently bill for networking costs associated with inbound Private Link connections to performance-intensive services. Charges may ",2026-04-21T08:00:00.000Z,concept-article,security,0.7,True,"Covers configuring inbound Private Link specifically for performance-intensive Databricks services (e.g., Zerobus Ingest, Lakehouse Autoscaling) with product-specific endpoint and networking configuration steps. This is detailed security/network configuration, not just conceptual info.",updated +https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/,Serverless compute plane networking,Serverless compute plane networking - Azure Databricks,,Learn how to configure network access from serverless SQL warehouses.,"This page introduces tools to secure network access between the compute resources in the Azure Databricks serverless compute plane and customer resources. To learn more about the control plane and the serverless compute plane, see Networking security architecture. To learn more about classic compute and serverless compute, see Compute. Note Azure Databricks charges for networking costs when serverless workloads connect to customer resources. See Understand Databricks serverless networking costs.",2026-04-13T21:52:00.000Z,concept-article,,0.4,False,"Described as an introduction to serverless compute plane networking and cost note; likely a conceptual/overview page about tools and architecture rather than detailed config tables, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/cost-management,Understand data transfer and connectivity costs,Understand Databricks serverless networking costs - Azure Databricks,Evaluate Databricks serverless networking data transfer costs,Understand data transfer and connectivity costs,"Data transfer and connectivity refer to moving data into and out of Databricks serverless environments. 
Networking charges for serverless products only apply to customers using Azure Databricks serverless compute. Customers using classic compute manage and pay networking costs directly to Azure. Serverless charges apply when serverless compute communicates with your resources and are billed directly by Azure. Important Starting April 7, 2026, configuring service endpoints requires Azure Network ",2026-02-04T08:00:00.000Z,concept-article,decision-making,0.65,True,"Explains when and how networking charges apply for serverless vs classic compute, including which traffic patterns incur costs. This is cost/behavior guidance specific to Databricks serverless networking and informs deployment/usage decisions.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/manage-network-policies,Manage network policies for serverless egress control,Manage network policies for serverless egress control - Azure Databricks,Configure serverless egress network policies in Azure Databricks,This document explains how to configure and manage network policies to control outbound network connections from your serverless workloads in Azure Databricks.,"This page explains how to configure and manage network policies to control outbound network connections from your serverless workloads in Azure Databricks. For ingress control, see Context-based ingress control.",2026-04-07T08:00:00.000Z,concept-article,security,0.76,True,"This page is about configuring and managing network policies for serverless egress control, which is a product-specific security feature. Such docs typically include concrete policy object names, allowed/blocked destination formats, scope definitions, and possibly required roles or permissions to manage these policies. 
That aligns with the security category’s requirement for product-specific security settings and RBAC/scope details rather than generic networking concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/manage-private-endpoint-rules,Manage private endpoint rules,Manage private endpoint rules - Azure Databricks,Manage Databricks serverless private endpoint rules,Learn about managing private endpoint rules for private connectivity from the serverless compute plane to cloud storage.,"Note Azure Databricks charges for networking costs when serverless workloads connect to customer resources. See Understand Databricks serverless networking costs. This article describes how to manage private endpoint rules for private connectivity from serverless compute using the Azure Databricks account console. You can also use the Network connectivity configurations API. To configure private connectivity for serverless compute, see Configure private connectivity to Azure resources and Configure p",2026-03-19T08:00:00.000Z,concept-article,configuration,0.75,True,Describes managing private endpoint rules for serverless connectivity using the account console and Network connectivity configurations API. Contains product-specific configuration objects and API usage.,unchanged https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/network-policies,What is serverless egress control?,What is serverless egress control? - Azure Databricks,Use serverless egress control policies in Azure Databricks,This article explains how serverless egress control allows you to manage outbound network connections from your serverless compute resources.,"Note This feature requires the Premium tier. This page explains how serverless egress control allows you to manage outbound network connections from your serverless compute resources. For ingress control, see Context-based ingress control.
Serverless egress control strengthens your security posture by allowing you to manage outbound connections from your serverless workloads, reducing the risk of data exfiltration. Using network policies, you can: Serverless egress control is supported with the fo",2026-03-05T08:00:00.000Z,concept-article,security,0.8,True,"Explains serverless egress control requirements (Premium tier) and supported workloads, with Databricks-specific policy behavior to restrict outbound connections.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/pl-to-internal-network,Private connectivity to resources in your VNet,Configure private connectivity to resources in your VNet - Azure Databricks,Set up Private Link from serverless to VNets,Configure private connectivity to resources in your VNet,Note Azure Databricks charges for networking costs when serverless workloads connect to customer resources. SeeUnderstand Databricks serverless networking costs. This page explains how to use the Azure Databricks account console to configure Private Link connections from serverless compute to resources in your virtual network (VNet) through an Azure load balancer. Configuring private connectivity for serverless compute provides:,2026-04-16T08:00:00.000Z,concept-article,configuration,0.76,True,"Describes configuring Private Link connections from serverless compute to resources in a VNet via an Azure load balancer using the account console. 
This will involve specific configuration fields, resource types, and allowed values that are detailed product configuration.",updated +https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/pl-to-internal-network,Private connectivity to resources in your VNet,Configure private connectivity to resources in your VNet - Azure Databricks,Set up Private Link from serverless to VNets,Configure private connectivity to resources in your VNet,Note Azure Databricks charges for networking costs when serverless workloads connect to customer resources. See Understand Databricks serverless networking costs. This page explains how to use the Azure Databricks account console to configure Private Link connections from serverless compute to resources in your virtual network (VNet) through an Azure load balancer. Configuring private connectivity for serverless compute provides:,2026-04-16T08:00:00.000Z,concept-article,configuration,0.76,True,"Describes configuring Private Link connections from serverless compute to resources in a VNet via an Azure load balancer using the account console. This will involve specific configuration fields, resource types, and allowed values that are detailed product configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/serverless-firewall,Configure an Azure Storage firewall for serverless compute access (Legacy),Configure a firewall for serverless compute access (legacy) - Azure Databricks,Configure legacy serverless compute firewall access,Learn about configuring your storage firewalls to support serverless compute.,"Important This feature has reached end-of-life and will no longer be available after June 9, 2026. If you configure this feature before the end-of-life date, you must migrate to the new network security perimeter feature by June 9, 2026. Existing customers have been contacted about the required migration.
See Configure an Azure network security perimeter for Azure resources. This page describes how to configure an Azure storage firewall for serverless compute using the Azure Databricks account co",2026-03-27T08:00:00.000Z,concept-article,security,0.7,True,"Explains how to configure Azure storage firewalls to allow Databricks serverless compute, including Databricks-specific network and firewall settings. This is concrete security/network configuration, even though the feature is legacy.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/serverless-nsp-firewall,Configure an Azure Network Security Perimeter for serverless compute access,Configure an Azure network security perimeter for Azure resources - Azure Databricks,Configure Azure NSP for Databricks serverless access,Learn how to configure Azure Network Security Perimeter for Azure resources.,This page describes how to configure Azure Network Security Perimeter (NSP) to control access from serverless compute to your Azure resources using the Azure portal.,2026-04-13T08:00:00.000Z,concept-article,security,0.7,True,"Configuring Azure Network Security Perimeter for serverless access to Azure resources involves specific NSP resource settings, policies, and allowed connections. These are detailed, product‑specific security configuration parameters.",updated
If you configure your Azure resource to only accept connections from private endpoints, any connection to the resource from your classic Databricks compute resources also must use private endpoints. To configure an Azure Storage firewall for serverless compute access using subnets, instead seeConfigure a f",2026-04-13T21:52:00.000Z,concept-article,configuration,0.76,True,"Explains configuring Azure Private Link for serverless communication using the Databricks account console and a specific Network connectivity configurations API. This implies concrete setting names, required values, and Azure resource configuration details that are product‑specific configuration knowledge.",updated -https://learn.microsoft.com/en-us/azure/databricks/security/network/storage/firewall-support,Enable firewall support for your workspace storage account,Enable firewall support for your workspace storage account - Azure Databricks,Enable storage account firewall for Databricks workspaces,Learn how to enable firewall support for your Azure workspace storage account to limit access to authorized resources and networks.,"Each Azure Databricks workspace has an associated Azure storage account in a managed resource group known as theworkspace storage account. This account contains workspace system data (job output, system settings, and logs), the Databricks File System root, and in some cases a Unity Catalog workspace catalog. You can limit access to your workspace storage account to authorized resources and networks only, using the Azure CLI or PowerShell.",2026-04-13T08:00:00.000Z,concept-article,security,0.78,True,"Explains limiting access to the workspace storage account using Azure CLI/PowerShell. 
This will include specific firewall rule parameters, service endpoint or Private Link settings, and identity/authorization details, which are concrete security configuration instructions.",updated -https://learn.microsoft.com/en-us/azure/databricks/security/network/storage/firewall-support-arm-template,ARM template for firewall support,ARM template for firewall support for the workspace storage account - Azure Databricks,Use ARM template to enable Databricks storage firewall,ARM template for limiting access to the workspace storage account.,"This page provides you with the ARM template and a description of the required fields for firewall support for the workspace storage account. Firewall support for your workspace storage account is controlled by the ARM template propertystorageAccountFirewall, which must be set toEnabled. For certain workspace configurations, such as enabling thecompliance security profile, you might need to customize the ARM template. If you have questions regarding the ARM template, please file an Azure support",2026-04-13T21:52:00.000Z,concept-article,configuration,0.86,True,"Provides an ARM template and required fields for workspace storage account firewall support, including a specific property (storageAccountFirewall) and required value (Enabled), plus customizations for certain workspace configurations. 
This is a direct configuration parameter reference with allowed values—clear expert configuration knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/serverless-nsp-firewall,Configure an Azure Network Security Perimeter for serverless compute access,Configure an Azure network security perimeter for Azure resources - Azure Databricks,Configure Azure NSP for Databricks serverless access,Learn how to configure Azure Network Security Perimeter for Azure resources.,This page describes how to configure Azure Network Security Perimeter (NSP) to control access from serverless compute to your Azure resources using the Azure portal.,2026-04-13T08:00:00.000Z,concept-article,security,0.7,True,"Configuring Azure Network Security Perimeter for serverless access to Azure resources involves specific NSP resource settings, policies, and allowed connections. These are detailed, product‑specific security configuration parameters.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/serverless-private-link,Private connectivity to Azure resources,Configure private connectivity to Azure resources - Azure Databricks,Configure Private Link from Databricks serverless compute,Learn about configuring Azure Private Link for serverless communication to Azure services,"This article describes how to configure private connectivity from serverless compute using the Azure Databricks account console UI. You can also use the Network connectivity configurations API. If you configure your Azure resource to only accept connections from private endpoints, any connection to the resource from your classic Databricks compute resources also must use private endpoints.
To configure an Azure Storage firewall for serverless compute access using subnets, instead see Configure a f",2026-04-13T21:52:00.000Z,concept-article,configuration,0.76,True,"Explains configuring Azure Private Link for serverless communication using the Databricks account console and a specific Network connectivity configurations API. This implies concrete setting names, required values, and Azure resource configuration details that are product‑specific configuration knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/security/network/storage/firewall-support,Enable firewall support for your workspace storage account,Enable firewall support for your workspace storage account - Azure Databricks,Enable storage account firewall for Databricks workspaces,Learn how to enable firewall support for your Azure workspace storage account to limit access to authorized resources and networks.,"Each Azure Databricks workspace has an associated Azure storage account in a managed resource group known as the workspace storage account. This account contains workspace system data (job output, system settings, and logs), the Databricks File System root, and in some cases a Unity Catalog workspace catalog. You can limit access to your workspace storage account to authorized resources and networks only, using the Azure CLI or PowerShell.",2026-04-13T08:00:00.000Z,concept-article,security,0.78,True,"Explains limiting access to the workspace storage account using Azure CLI/PowerShell.
This will include specific firewall rule parameters, service endpoint or Private Link settings, and identity/authorization details, which are concrete security configuration instructions.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/security/network/storage/firewall-support-arm-template,ARM template for firewall support,ARM template for firewall support for the workspace storage account - Azure Databricks,Use ARM template to enable Databricks storage firewall,ARM template for limiting access to the workspace storage account.,"This page provides you with the ARM template and a description of the required fields for firewall support for the workspace storage account. Firewall support for your workspace storage account is controlled by the ARM template property storageAccountFirewall, which must be set to Enabled. For certain workspace configurations, such as enabling the compliance security profile, you might need to customize the ARM template. If you have questions regarding the ARM template, please file an Azure support",2026-04-13T21:52:00.000Z,concept-article,configuration,0.86,True,"Provides an ARM template and required fields for workspace storage account firewall support, including a specific property (storageAccountFirewall) and required value (Enabled), plus customizations for certain workspace configurations. This is a direct configuration parameter reference with allowed values—clear expert configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/privacy/,Compliance overview,Compliance - Azure Databricks,,"Learn how Databricks supports auditing, privacy, and compliance in highly regulated industries, including compliance profiles for HIPAA, IRAP, PCI-DSS, FedRAMP High, and FedRAMP Moderate.","Azure Databricks has put in place controls to meet the unique compliance needs of highly regulated industries.
To learn how you can use Delta Lake on Azure Databricks to manage General Data Protection Regulation (GDPR) and California Consumer Privacy Act (CCPA) compliance for your data lake, see Prepare your data for GDPR compliance. For guidance on exporting workspace data for compliance, backup, or data portability requirements, see Export workspace data. For more information about privacy and c",2026-02-26T08:00:00.000Z,concept-article,,0.35,False,"High-level compliance overview (auditing, privacy, regulations) without detailed configuration parameters, limits, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/privacy/c5,Germany C5,Cloud Computing Compliance Criteria Catalog (C5) - Azure Databricks,Apply C5 compliance controls in Azure Databricks,Learn about C5 compliance controls for a workspace.,This page describes Cloud Computing Compliance Criteria Catalog (C5) compliance controls in Azure Databricks.,2026-04-10T18:06:00.000Z,concept-article,security,0.7,True,"Maps the C5 compliance framework to concrete Azure Databricks workspace controls and behaviors, which is product- and standard-specific security/compliance configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/privacy/cccs-medium-protected-b,Canada Protected B,Canada Protected B - Azure Databricks,Configure Canada Protected B controls in Databricks,Canadian Centre for Cybersecurity (CCCS) Medium (Protected B) compliance controls for a workspace.,This page describes Canada Protected B compliance controls in Azure Databricks.,2026-04-02T08:00:00.000Z,concept-article,security,0.68,True,"Describes CCCS Medium (Protected B) compliance controls implemented by Databricks, which are product-specific security/compliance behaviors.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/privacy/enhanced-security-compliance,Configure the enhanced security and compliance settings,Configure enhanced security and compliance
settings - Azure Databricks,Configure enhanced security and compliance in Azure Databricks,Learn about the Enhanced Security and Compliance add-on.,This page describes how to configure enhanced security and compliance settings on your Azure Databricks workspace. Important,2026-04-10T08:00:00.000Z,concept-article,security,0.8,True,"Provides concrete steps and settings to enable the Enhanced Security and Compliance add-on for Databricks workspaces, including product-specific security configuration options and behaviors.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/privacy/enhanced-security-monitoring,Enhanced security monitoring,Enhanced security monitoring - Azure Databricks,Set up enhanced security monitoring in Azure Databricks,Learn how to use Databricks Enhanced security monitoring.,This page describes the enhanced security monitoring feature and how to configure it on your Azure Databricks workspace or account. Enabling this feature on a workspace automatically adds the Enhanced Security and Compliance add-on as described on the pricing page.,2026-02-18T08:00:00.000Z,concept-article,security,0.8,True,"Covers how to configure enhanced security monitoring, including workspace/account-level settings and their effects; this is detailed, product-specific security configuration guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/privacy/export-workspace-data,Export workspace data,Export workspace data - Azure Databricks,,"Learn how to export workspace configuration, data, and AI assets from Databricks for compliance, backup, and migration needs.","This page provides an overview of the tools and approaches for exporting data and configuration from your Azure Databricks workspace.
You can export workspace assets for compliance requirements, data portability, backup purposes, or workspace migration.",2026-01-24T08:00:00.000Z,concept-article,,0.4,False,Described as an overview of tools and approaches for exporting data; likely a conceptual or high-level how-to without detailed configuration tables or limits.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/security/privacy/gdpr-delta,GDPR Compliance,Prepare your data for GDPR compliance - Azure Databricks,Implement GDPR and CCPA-compliant deletions with Delta,This article explains how to prepare your data for General Data Protection Regulation (GDPR) and California Consumer Privacy Act (CCPA) compliance.,"The General Data Protection Regulation (GDPR) and California Consumer Privacy Act (CCPA) are privacy and data security regulations that require companies to permanently and completely delete all personally identifiable information (PII) collected about a customer upon their explicit request. Also known as the “right to be forgotten” (RTBF) or “right to data erasure”, deletion requests must be executed during a specified period (for example, within one calendar month). 
This article guides you thr",2025-12-23T22:56:00.000Z,concept-article,security,0.7,True,"Provides concrete guidance for implementing right-to-be-forgotten workflows on Databricks/Delta, including product-specific patterns for permanent deletion.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/security/privacy/gdpr-delta,GDPR Compliance,Prepare your data for GDPR compliance - Azure Databricks,Prepare Delta Lake data for GDPR and CCPA erasure compliance,This article explains how to prepare your data for General Data Protection Regulation (GDPR) and California Consumer Privacy Act (CCPA) compliance.,"The General Data Protection Regulation (GDPR) and California Consumer Privacy Act (CCPA) are privacy and data security regulations that require companies to permanently and completely delete all personally identifiable information (PII) collected about a customer upon their explicit request. Also known as the “right to be forgotten” (RTBF) or “right to data erasure”, deletion requests must be executed during a specified period (for example, within one calendar month). 
This article guides you thr",2026-04-21T08:00:00.000Z,concept-article,security,0.7,True,"Product-specific guidance for GDPR/CCPA compliance on Delta, including how to structure and manage data for deletion within regulatory timeframes; this is specialized security/privacy configuration and data-handling guidance.",updated https://learn.microsoft.com/en-us/azure/databricks/security/privacy/hipaa,HIPAA,HIPAA - Azure Databricks,Understand HIPAA compliance controls in Databricks,Learn about HIPAA compliance controls for a workspace.,This page describes HIPAA compliance controls in Azure Databricks.,2026-04-01T08:00:00.000Z,concept-article,security,0.68,True,"Lists Databricks HIPAA compliance controls and workspace implications, which are specific to the service and not generic HIPAA guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/privacy/hitrust,HITRUST,HITRUST - Azure Databricks,Use HITRUST compliance controls in Databricks,HITRUST compliance controls of a workspace.,Important This feature is in Public Preview. This page describes HITRUST compliance controls in Azure Databricks.,2026-04-01T08:00:00.000Z,concept-article,security,0.68,True,"Documents Databricks HITRUST compliance controls and how they apply to workspaces, representing product-specific security/compliance configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/privacy/irap,IRAP,Infosec Registered Assessors Program (IRAP) - Azure Databricks,Apply IRAP compliance controls in Databricks,Learn about IRAP compliance controls for a workspace.,This page describes IRAP compliance controls in Azure Databricks.
Important This feature is in Public Preview.,2026-04-02T08:00:00.000Z,concept-article,security,0.68,True,"Describes Databricks IRAP compliance controls and workspace behavior, which is specialized security/compliance information for this product.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/privacy/ismap,ISMAP,Information system Security Management and Assessment Program (ISMAP) - Azure Databricks,Configure ISMAP compliance controls in Azure Databricks,Learn about ISMAP compliance controls for a workspace and how to configure the compliance security profile.,This page describes ISMAP compliance controls in Azure Databricks.,2026-04-01T08:00:00.000Z,concept-article,security,0.7,True,"Compliance-control pages typically list concrete workspace settings, flags, and configuration steps required to meet ISMAP, which are product-specific security configurations not inferable from general training data.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/privacy/k-fsi,K-FSI,Korean Financial Security Institute (K-FSI) compliance controls - Azure Databricks,Implement K-FSI compliance controls in Azure Databricks,K-FSI compliance controls of a workspace.,This page describes Korean Financial Security Institute (K-FSI) compliance controls in Azure Databricks.,2026-04-10T18:06:00.000Z,concept-article,security,0.7,True,"Details how Korean Financial Security Institute (K-FSI) requirements are implemented as controls in Azure Databricks workspaces, a niche, product-specific compliance/security mapping.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/privacy/pci,PCI-DSS,PCI DSS v4.0 - Azure Databricks,Configure PCI DSS v4.0 controls for Databricks workspaces,Learn about PCI DSS compliance controls of a workspace.,This page describes PCI DSS v4.0 compliance controls in Azure Databricks.,2026-04-01T08:00:00.000Z,concept-article,security,0.7,True,"PCI DSS compliance pages generally document specific workspace security
settings, modes, and constraints required for PCI, which are detailed, product-specific security configurations.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/security/privacy/security-profile,Compliance security profile overview,Compliance security profile - Azure Databricks,Use Azure Databricks compliance security profile controls,"Learn about the compliance security profile, its compliance controls, and supported features.","This page describes the compliance security profile, its compliance controls, and supported features. To enable the compliance security profile, seeConfigure enhanced security and compliance settings.",2026-04-15T08:00:00.000Z,concept-article,security,0.7,True,"The page describes a product-specific compliance security profile for Azure Databricks, including concrete security/compliance controls and supported features that are unique to this service. It provides detailed, implementation-focused security configuration and behavior information that goes beyond generic security concepts, fitting the security sub-skill.",updated +https://learn.microsoft.com/en-us/azure/databricks/security/privacy/security-profile,Compliance security profile overview,Compliance security profile - Azure Databricks,Use Azure Databricks compliance security profile controls,"Learn about the compliance security profile, its compliance controls, and supported features.","This page describes the compliance security profile, its compliance controls, and supported features. To enable the compliance security profile, see Configure enhanced security and compliance settings.",2026-04-15T08:00:00.000Z,concept-article,security,0.7,True,"The page describes a product-specific compliance security profile for Azure Databricks, including concrete security/compliance controls and supported features that are unique to this service.
It provides detailed, implementation-focused security configuration and behavior information that goes beyond generic security concepts, fitting the security sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/privacy/tisax,TISAX,Trusted Information Security Assessment Exchange (TISAX) - Azure Databricks,Configure TISAX-aligned controls for Azure Databricks,Learn about TISAX compliance controls for a workspace.,This page describes TISAX compliance controls in Azure Databricks.,2026-04-10T18:06:00.000Z,concept-article,security,0.7,True,"Explains how TISAX requirements translate into specific Azure Databricks workspace controls, which is specialized security/compliance configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/security/privacy/uk-cyber-essentials-plus,UK Cyber Essentials Plus,UK Cyber Essentials Plus - Azure Databricks,Apply UK Cyber Essentials Plus controls in Databricks,Learn about UK Essentials Plus compliance controls for a workspace.,This page describes UK Cyber Essentials Plus compliance controls in Azure Databricks.,2026-04-01T08:00:00.000Z,concept-article,security,0.7,True,UK Cyber Essentials Plus compliance controls require specific workspace security configurations and options that are unique to this product and regulation.,unchanged https://learn.microsoft.com/en-us/azure/databricks/security/secrets/,Keep data secure with secrets,Secret management - Azure Databricks,Manage secrets securely in Azure Databricks,"Use Databricks secrets to securely store and manage sensitive information like credentials, API keys, and tokens.","Many workflows require sensitive information such as credentials, API keys, and tokens. Rather than entering these values directly into notebooks or storing them in plain text, you can securely store them using Databricks secrets and reference them in your notebooks and jobs. This approach enhances security and simplifies credential management. 
This page provides an overview of Databricks secrets. Note Databricks recommends using Unity Catalog to configure access to data in cloud storage. SeeCon",2026-02-11T08:00:00.000Z,concept-article,security,0.72,True,"Beyond conceptual overview, Databricks secrets documentation typically includes product-specific mechanisms for storing and referencing secrets in notebooks and jobs, which are security configuration patterns unique to Databricks.",unchanged @@ -3630,10 +3660,10 @@ https://learn.microsoft.com/en-us/azure/databricks/semi-structured/,Overview,Mod https://learn.microsoft.com/en-us/azure/databricks/semi-structured/complex-types,Transform complex data types,Transform complex data types - Azure Databricks,Transform complex and nested data types in Databricks,Learn how Azure Databricks optimizes transformations on complex data types.,"While working with nested data types, Azure Databricks optimizes certain transformations out-of-the-box. The following code examples demonstrate patterns for working with complex and nested data types in Azure Databricks.",2026-01-20T08:00:00.000Z,how-to,best-practices,0.7,True,"Shows optimized transformation patterns for nested data types with concrete code examples and Databricks-specific optimizations. These are actionable patterns and gotchas for performance and correctness, aligning with best-practices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/semi-structured/higher-order-functions,Higher-order functions,Higher-order functions - Azure Databricks,Use higher-order functions on arrays in Databricks SQL,Learn how Azure Databricks optimizes the performance of higher-order and build-in functions.,Azure Databricks provides dedicated primitives for manipulating arrays in Apache Spark SQL. These primitives make working with arrays easier and more concise and don't require large amounts of boilerplate code. 
The primitives revolve around two functional programming constructs: higher-order functions and anonymous (lambda) functions. These work together to allow you to define functions that manipulate arrays in SQL.,2026-01-20T08:00:00.000Z,how-to,best-practices,0.7,True,Describes dedicated primitives and patterns for manipulating arrays with higher-order and lambda functions in Databricks SQL. Contains specific function behaviors and idiomatic patterns that are product-specific best practices.,unchanged https://learn.microsoft.com/en-us/azure/databricks/semi-structured/json,Query JSON strings,Query JSON strings - Azure Databricks,Query JSON string columns with Databricks SQL operators,Learn how to easily query semi-structured JSON string data with Azure Databricks.,"This article describes the Databricks SQL operators you can use to query and transform semi-structured data stored as JSON strings. Note This feature lets you read semi-structured data without flattening the files. However, for optimal read query performance Databricks recommends that you extract nested columns with the correct data types. You extract a column from fields containing JSON strings using the syntax:, whereis the string column name and:) unique to Databricks for JSON querying, which are product-specific API/parameter details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/semi-structured/variant,Overview,Query variant data - Azure Databricks,Query semi-structured data using VARIANT in Databricks,Learn how to query semi-structured data stored as VARIANT with Azure Databricks.,"Important This feature is inPublic Preview. This article describes how you can query and transform semi-structured data stored asVARIANT. TheVARIANTdata type is available in Databricks Runtime 15.3 and above. Databricks recommends usingVARIANTover JSON strings. For users currently using JSON strings looking to migrate, seeHow is variant different than JSON strings?. 
If you want to see examples for querying semi-structured data stored with JSON strings, seeQuery JSON strings. Note VARIANTcolumns ",2026-01-20T08:00:00.000Z,how-to,best-practices,0.7,True,"Describes how to query and transform VARIANT data, with recommendations to prefer VARIANT over JSON strings and migration considerations. This includes Databricks-specific behaviors and recommended patterns for using the VARIANT type.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/semi-structured/variant,Overview,Query variant data - Azure Databricks,Query and transform VARIANT semi-structured data in Databricks,Learn how to query semi-structured data stored as VARIANT with Azure Databricks.,"Important This feature is in Public Preview. This article describes how you can query and transform semi-structured data stored as VARIANT. The VARIANT data type is available in Databricks Runtime 15.3 and above. Databricks recommends using VARIANT over JSON strings. For users currently using JSON strings looking to migrate, see How is variant different than JSON strings?. If you want to see examples for querying semi-structured data stored with JSON strings, see Query JSON strings. Note VARIANT columns ",2026-04-22T08:00:00.000Z,how-to,configuration,0.7,True,"Details the Databricks VARIANT data type, its behavior versus JSON strings, supported operations, and runtime/version constraints; these are product-specific type semantics and configuration/usage details not captured by generic knowledge.",updated
This article describes the behavior changes and differences in syntax and semantics when working with the variant data type. This article assumes that you are familiar with working with JSON string data on Azure Databricks. For users new to Azure Databricks, you should use variant over JSON strings whenever storing semi-structured data that requires flexibility for changing or unknown schema. See Model semi-structured data. In Databricks Runtime 15.3 an",2026-01-20T08:00:00.000Z,concept-article,best-practices,0.75,True,"Focuses on behavioral, syntax, and semantic differences between VARIANT and JSON strings. These are nuanced, product-specific behaviors and guidance on when/how to use each, fitting best-practices and expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/spark/,Overview,Apache Spark overview - Azure Databricks,,"Find links to resources for working with Apache Spark on Databricks, including DataFrames, streaming, language APIs, and configuration options.",Apache Spark is the technology powering compute clusters and SQL warehouses in Azure Databricks. This page provides an overview of the documentation in this section.,2026-01-19T08:00:00.000Z,landing-page,,0.1,False,"High-level Apache Spark overview and navigation page without detailed limits, configs, or product-specific patterns.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/spark/conf,Configure Spark properties,Set Spark configuration properties on Azure Databricks - Azure Databricks,Set and manage Spark configuration properties on Azure Databricks,Learn about supported options to configure Apache Spark and set Spark confs on Azure Databricks.,"You can set Spark configuration properties (Spark confs) to customize settings in your compute environment. Databricks generally recommends against configuring most Spark properties.
Especially when migrating from open-source Apache Spark or upgrading Databricks Runtime versions, legacy Spark configurations can override new default behaviors that optimize workloads. For many behaviors controlled by Spark properties, Azure Databricks also provides options to either enable behavior at a table leve",2026-04-13T21:52:00.000Z,concept-article,configuration,0.8,True,"Covers supported options to configure Spark on Azure Databricks. Such a page typically lists specific Spark configuration keys, defaults, and Databricks-specific behaviors, which are product-specific configuration details that qualify as expert knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/spark/conf,Configure Spark properties,Set Spark configuration properties on Azure Databricks - Azure Databricks,Set and manage Spark configuration properties on Azure Databricks,Learn about supported options to configure Apache Spark and set Spark confs on Azure Databricks.,"You can set Spark configuration properties (Spark confs) to customize settings in your compute environment. Databricks generally recommends against configuring most Spark properties. Especially when migrating from open-source Apache Spark or upgrading Databricks Runtime versions, legacy Spark configurations can override new default behaviors that optimize workloads. For many behaviors controlled by Spark properties, Azure Databricks also provides options to either enable behavior at a table leve",2026-04-13T21:52:00.000Z,concept-article,configuration,0.8,True,"Covers supported options to configure Spark on Azure Databricks. 
Such a page typically lists specific Spark configuration keys, defaults, and Databricks-specific behaviors, which are product-specific configuration details that qualify as expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/spark/connect-vs-classic,Compare Spark Connect to Spark Classic,Compare Spark Connect to Spark Classic - Azure Databricks,Decide between Spark Connect and Spark Classic,Learn about key differences between Spark Connect and Spark Classic in execution and analysis behavior to avoid unexpected behavior and performance issues when migrating code.,"Spark Connectis a gRPC-based protocol within Apache Spark that specifies how a client application can communicate with a remote Spark Server. It allows remote execution of Spark workloads using the DataFrame API. Spark Connect is used in the following: While both Spark Connect and Spark Classic utilize lazy execution for transformations, there are important differences to know to avoid unexpected behavior and performance issues when migrating existing code from Spark Classic to Spark Connect or ",2026-01-20T23:44:00.000Z,concept-article,decision-making,0.75,True,Comparison article focused on differences in execution and analysis behavior for migration; likely includes scenario-based recommendations and trade-offs.,unchanged https://learn.microsoft.com/en-us/azure/databricks/spark/faq,What is Apache Spark on Databricks?,Apache Spark on Azure Databricks - Azure Databricks,,Learn about Apache Spark on the Azure Databricks Data Intelligence Platform.,Apache Spark is at the heart of the Azure Databricks Data Intelligence Platform and is the technology powering compute clusters and SQL warehouses. 
Azure Databricks is an optimized platform for Apache Spark that provides an efficient and simple platform for running Apache Spark workloads.,2026-01-19T08:00:00.000Z,faq,,0.2,False,"FAQ-style overview of Spark on Databricks; primarily conceptual/platform positioning without concrete configs, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sparkr/,R overview,Azure Databricks for R developers - Azure Databricks,,"Learn how to work with Apache Spark from R using SparkR, sparklyr, and RStudio in Azure Databricks.",This page provides an overview for developing notebooks and jobs in Azure Databricks using the R language.,2026-03-31T23:28:00.000Z,overview,,0.3,False,Overview page for R development on Databricks; primarily navigational without detailed expert-only configuration or troubleshooting content.,unchanged @@ -3646,7 +3676,7 @@ https://learn.microsoft.com/en-us/azure/databricks/sparkr/rstudio,RStudio Deskto https://learn.microsoft.com/en-us/azure/databricks/sparkr/shiny,Shiny on Azure Databricks,Shiny on Azure Databricks - Azure Databricks,Run Shiny applications on Azure Databricks,Learn how to develop interactive applications on Azure Databricks using the Shiny package.,"Shiny is an R package, available on CRAN, used to build interactive R applications and dashboards. You can use Shiny inside RStudio Server hosted on Azure Databricks clusters. You can also develop, host, and share Shiny applications directly from an Azure Databricks notebook. To get started with Shiny, see the Shiny tutorials. You can run these tutorials on Azure Databricks notebooks.
This article describes how to run Shiny applications on Azure Databricks and use Apache Spark inside Shiny applicati",2026-03-02T08:00:00.000Z,how-to,integrations,0.7,True,Shows how to host Shiny apps in Databricks notebooks and RStudio with Spark integration; these are concrete product-specific integration patterns.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sparkr/sparklyr,sparklyr,sparklyr - Azure Databricks,Use sparklyr with Azure Databricks,Learn how to work with Apache Spark from R using sparklyr in Azure Databricks.,"Azure Databricks supports sparklyr in notebooks, jobs, and RStudio Desktop. This article describes how you can use sparklyr and provides example scripts that you can run. See R interface to Apache Spark for more information.",2026-01-19T08:00:00.000Z,how-to,integrations,0.7,True,"Describes how to connect and work with Spark via sparklyr on Databricks, including example scripts and environment-specific usage patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sparkr/sparkr-migration,SparkR to sparklyr migration,Migrate from SparkR to sparklyr - Azure Databricks,Migrate SparkR code to sparklyr on Databricks,"Migrate from SparkR to sparklyr in Azure Databricks following SparkR deprecation in Spark 4.0, with function mappings and code examples.","SparkR was developed as part of Apache Spark, and its design is familiar to users of Scala and Python, but potentially less intuitive for R practitioners. In addition, SparkR is deprecated in Spark 4.0. In contrast, sparklyr is focused on providing a more R-friendly experience. It leverages dplyr syntax, which is familiar to users of tidyverse with patterns like select(), filter(), and mutate() for DataFrame operations. sparklyr is the recommended R package for working with Apache Spark. This page explains",2026-03-31T23:28:00.000Z,how-to,integrations,0.7,True,"Describes concrete function mappings and code examples for migrating from SparkR to sparklyr in Azure Databricks.
These API-level mappings and migration patterns are product- and version-specific coding patterns, fitting integrations & coding patterns.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sparkr/sparkr-vs-sparklyr,SparkR versus sparklyr,Comparing SparkR and sparklyr - Azure Databricks,Decide between SparkR and sparklyr on Databricks,,"Important SparkR in Databricks isdeprecatedin Databricks Runtime 16.0 and above. For information about migrating to sparklyr, seeMigrate from SparkR to sparklyr. There are two APIs for Apache Spark for R users available:SparkRandsparklyr. Databricks recommends you usesparklyr, as SparkR has been deprecated. To help you migrate your code, this article compares these APIs.",2026-03-31T23:28:00.000Z,concept-article,decision-making,0.7,True,"Compares SparkR and sparklyr specifically for Azure Databricks, including deprecation status and migration-oriented guidance to help choose which API to use. This is product- and version-specific decision guidance that an LLM is unlikely to infer from general training data.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sparkr/sparkr-vs-sparklyr,SparkR versus sparklyr,Comparing SparkR and sparklyr - Azure Databricks,Choose between SparkR and sparklyr on Databricks,,"Important SparkR in Databricks is deprecated in Databricks Runtime 16.0 and above. For information about migrating to sparklyr, see Migrate from SparkR to sparklyr. There are two APIs for Apache Spark for R users available: SparkR and sparklyr. Databricks recommends you use sparklyr, as SparkR has been deprecated.
To help you migrate your code, this article compares these APIs.",2026-04-23T08:00:00.000Z,concept-article,decision-making,0.7,True,"Explicit comparison of SparkR vs sparklyr with migration guidance; Databricks-specific recommendation and deprecation details help users decide which R API to use and how to migrate, fitting decision-making criteria.",updated https://learn.microsoft.com/en-us/azure/databricks/sql/,Overview,Data warehousing on Azure Databricks - Azure Databricks - Databricks SQL,,Learn about building a data warehousing solution on the Azure Databricks platform using Databricks SQL.,"Databricks SQL is a cloud data warehouse built on lakehouse architecture. It runs directly on your data lake, supports ANSI SQL with Delta Lake extensions, and provides the tools to build highly performant, cost-effective data warehouses without moving your data.",2026-04-09T22:01:00.000Z,landing-page,,0.2,False,"High-level overview of Databricks SQL as a data warehouse; no concrete limits, configuration tables, error codes, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/dbsql-api-latest,Update to the latest Databricks SQL API version,Update to the latest Databricks SQL API version - Azure Databricks - Databricks SQL,Migrate to the latest Databricks SQL REST API,Learn about the changes in the latest version of the Databricks SQL REST API.,"This page describes changes to the Queries, Alerts, Permissions, Data Sources, and Visualizations APIs included in the latest version of the Databricks SQL API. The legacy API is deprecated and support will end soon. 
Use this page to migrate your applications and integrations to the new API version.",2026-02-12T04:25:00.000Z,concept-article,decision-making,0.65,True,Describes changes between legacy and latest API versions to help migrate applications; this is decision/migration guidance comparing capabilities and required changes.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/get-started/,Databricks SQL tutorials,Get started with data warehousing using Databricks SQL - Azure Databricks - Databricks SQL,,"Get started with Databricks SQL for data warehousing, from basic concepts to advanced usage with BI tools, dashboards, and SQL warehouses.","If you're a data analyst who works primarily with SQL queries and your favorite BI tools, Databricks SQL provides an intuitive environment for running ad-hoc queries and creating dashboards on data stored in your data lake. Note Databricks SQL Serverless is not available in Azure China. Databricks SQL is not available in Azure Government regions.",2026-03-24T01:32:00.000Z,get-started,,0.25,False,"Get-started guide for Databricks SQL; mostly onboarding and navigation, not deep configuration, limits, or troubleshooting content.",unchanged @@ -3654,7 +3684,7 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/get-started/concepts,Data https://learn.microsoft.com/en-us/azure/databricks/sql/get-started/data-warehousing-concepts,Data warehouse architecture,Data warehousing architecture - Azure Databricks - Databricks SQL,,"Architecture and patterns for data warehousing on Azure Databricks including lakehouse architecture, medallion patterns, and data modeling approaches.",Data warehousing refers to collecting and storing data from multiple sources so it can be quickly accessed for business insights and reporting. 
This article contains key concepts for building a data warehouse in your data lakehouse.,2025-11-26T08:00:00.000Z,concept-article,,0.2,False,"Conceptual architecture and patterns (lakehouse, medallion) for data warehousing; generic design concepts rather than product-specific numeric thresholds or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/get-started/sample-dashboards,Sample dashboards,Tutorial: Use sample dashboards - Azure Databricks - Databricks SQL,,Learn how to import and use AI/BI sample dashboards from the samples gallery on Azure Databricks.,This tutorial shows you how to import and use sample dashboards from the samples gallery. These dashboards illustrate some of the rich visualizations you can use to gain insights from your data. No setup is required. These dashboards use data already available in your workspace and rely on a compute resource (called a SQL warehouse) already configured. You don't need to be an administrator to get started. See Dashboards to learn about all of the visualization types and features available for dashb,2026-02-23T08:00:00.000Z,concept-article,,0.25,False,Tutorial on using sample dashboards; step-by-step UI usage without detailed product-specific configuration matrices or limits.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/get-started/sql-etl-tutorial,Tutorial: ETL in Databricks SQL,ETL in Databricks SQL - Azure Databricks - Databricks SQL,,"Build an incremental ETL pipeline in Databricks SQL using streaming tables, AUTO CDC, and materialized views.","When dealing with large amounts of data, you need a pipeline that can process only the new and changed records instead of reprocessing the entire dataset. This is called incremental ETL. In Databricks SQL, you can build incremental ETL pipelines using streaming tables and materialized views, without writing procedural code or scheduling manual refreshes.
This tutorial walks you through a common pattern: tracking product changes over time. You create a source table, capture change events, build a",2026-04-10T21:59:00.000Z,tutorial,,0.3,False,"Tutorial for building an incremental ETL pipeline; primarily step-by-step guidance without configuration tables, limits, or error-resolution mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/,SQL language reference,SQL language reference - Azure Databricks - Databricks SQL,,Learn about the SQL language constructs supported in Databricks SQL.,"This is a SQL command reference for Databricks SQL and Databricks Runtime. For information about how to understand and use the syntax notation and symbols in this reference, seeHow to use the SQL reference. For information about using SQL with Lakeflow Spark Declarative Pipelines, seePipeline SQL language reference. Note Databricks SQL Serverless is not available in Azure China. Databricks SQL is not available in Azure Government regions.",2026-04-03T08:00:00.000Z,language-reference,,0.2,False,"High-level entry page for the SQL language reference; acts as a navigation/overview without exposing specific limits, configuration tables, error codes, or decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/,SQL language reference,SQL language reference - Azure Databricks - Databricks SQL,,Learn about the SQL language constructs supported in Databricks SQL.,"This is a SQL command reference for Databricks SQL and Databricks Runtime. For information about how to understand and use the syntax notation and symbols in this reference, see How to use the SQL reference. For information about using SQL with Lakeflow Spark Declarative Pipelines, see Pipeline SQL language reference. Note Databricks SQL Serverless is not available in Azure China.
Databricks SQL is not available in Azure Government regions.",2026-04-20T08:00:00.000Z,language-reference,,0.4,False,"High-level entry page for the Databricks SQL language reference; likely an index/overview of syntax topics rather than detailed limits, configuration tables, or troubleshooting mappings. Does not clearly match any expert-knowledge sub-skill type as defined.",updated https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/control-flow/case-stmt,CASE statement,CASE statement - Azure Databricks - Databricks SQL,,Learn how to use the CASE statement syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks Runtime 16.3 and above Executes thenStmtN for the first optN that equals expr or elseStmt if no optN matches expr. This is called a simple case statement. Executes thenStmtN for the first condN evaluating to true, or elseStmt if no condN evaluates to true. This is called a searched case statement. For case expressions that yield result values, see CASE expression. This statement may only be used within a compound statement.",2026-01-20T08:00:00.000Z,language-reference,,0.3,False,"CASE statement control-flow syntax; language reference without configuration tables, limits, or troubleshooting mappings.",unchanged
Understand the syntax and limits with examples.,Applies to:Databricks SQLDatabricks Runtime Represents 4-byte signed integer numbers.,2024-03-01T08:00:00.000Z,language-reference,,0.35,False,INT type is standard 4-byte integer; summary does not indicate Databricks-specific behavior or constraints.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/interval-type,INTERVAL type,INTERVAL type - Azure Databricks - Databricks SQL,,Learn about the interval type in Databricks Runtime and Databricks SQL. Interval type represents intervals of time either on a scale of seconds or months. Understand the syntax and limits with example,Applies to:Databricks SQLDatabricks Runtime Represents intervals of time either on a scale of seconds or months.,2025-01-24T08:00:00.000Z,language-reference,,0.45,False,"INTERVAL type description is generic; summary does not show Databricks-specific ranges, limits, or configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/map-type,MAP type,MAP type - Azure Databricks - Databricks SQL,,Learn about the map type in Databricks Runtime and Databricks SQL. Map type represents values comprising a set of key-value pairs. Understand the syntax and limits with examples.,Applies to:Databricks SQLDatabricks Runtime Represents values comprising a set of key-value pairs.,2024-03-01T08:00:00.000Z,language-reference,,0.45,False,MAP type description with examples; summary lacks explicit Databricks-only limits or configuration parameters.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/null-type,VOID type,VOID type - Azure Databricks - Databricks SQL,,Learn about the NULL data types in Databricks Runtime and Databricks SQL. Null type represents the untyped NULL value. 
Understand the syntax and limits with examples.,Applies to:Databricks SQLDatabricks Runtime Represents the untyped NULL value,2026-04-13T21:03:00.000Z,language-reference,,0.2,False,"Page is a basic SQL data type reference (NULL/VOID) without product-specific limits, configuration parameters, or decision/troubleshooting content.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/null-type,VOID type,VOID type - Azure Databricks - Databricks SQL,,Learn about the NULL data types in Databricks Runtime and Databricks SQL. Null type represents the untyped NULL value. Understand the syntax and limits with examples.,Applies to:Databricks SQLDatabricks Runtime Represents the untyped NULL value,2026-04-13T21:03:00.000Z,language-reference,,0.2,False,"Page is a basic SQL data type reference (NULL/VOID) without product-specific limits, configuration parameters, or decision/troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/object-type,OBJECT type,OBJECT type - Azure Databricks - Databricks SQL,Work with OBJECT type and VARIANT schemas in Databricks,Learn about the object type in Databricks Runtime and Databricks SQL. Understand the syntax and limits with examples.,"Applies to:Databricks Runtime 15.3 and above Represents values in a VARIANT with the structure described by a set of fields. Refer to STRUCT for storing and processing structured types described by a sequence of fields. Important The OBJECT cannot be stored in a table column. It is only exposed when calling schema_of_variant or schema_of_variant_agg.
To use an OBJECT type, you must cast it to a STRUCT or MAP.",2025-10-02T18:06:00.000Z,language-reference,best-practices,0.65,True,"Describes that OBJECT cannot be stored in table columns and must be cast to STRUCT or MAP, and that it is only exposed via specific functions; these are Databricks-specific constraints and usage patterns.",unchanged @@ -3716,17 +3746,17 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-dro https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-fsck,FSCK REPAIR TABLE,FSCK REPAIR TABLE - Azure Databricks - Databricks SQL,,Learn how to use the FSCK REPAIR TABLE syntax of the Delta Lake SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Detects and repairs metadata and data file issues for a Delta table. The command has three progressive levels: Each level includes all operations from the previous levels.,2026-01-24T08:00:00.000Z,language-reference,,0.4,False,"FSCK REPAIR TABLE command overview; describes progressive levels conceptually without exposing specific numeric thresholds, error-code mappings, or configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-generate,GENERATE,GENERATE - Azure Databricks - Databricks SQL,,Learn how to use the GENERATE syntax of the Delta Lake SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Generates the given mode (specified as a string) in a Delta table.,2024-11-14T08:00:00.000Z,language-reference,,0.4,False,"GENERATE command description is high-level; no limits, configs, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-merge-into,MERGE INTO,MERGE INTO - Azure Databricks - Databricks SQL,Merge data into Delta Lake tables with MERGE,Learn how to use the MERGE INTO syntax of the Delta Lake SQL language in Databricks SQL and Databricks
Runtime.,"Applies to:Databricks SQLDatabricks Runtime Merges a set of updates, insertions, and deletions based on a source table into a target Delta table. This statement is supported only for Delta Lake tables. This page contains details for using the correct syntax with the MERGE command. See Upsert into a Delta Lake table using merge for more guidance on how to use MERGE operations to manage your data.",2026-03-16T08:00:00.000Z,language-reference,integrations,0.78,True,"Documents Databricks-specific MERGE INTO syntax and semantics for Delta tables, including how updates/inserts/deletes are expressed; these are concrete, product-specific SQL integration details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-optimize,OPTIMIZE,OPTIMIZE - Azure Databricks - Databricks SQL,,Learn how to use the OPTIMIZE syntax of the Delta Lake SQL language in Databricks SQL and Databricks Runtime to optimize the layout of Delta Lake data.,"Applies to:Databricks SQLDatabricks Runtime This page describes theOPTIMIZEcommand, which optimizes the layout of Delta Lake data. You can optimize a subset of data or collocate data by column.
If you don't specify collocation and the table doesn't use liquid clustering, Delta Lake performs bin-packing optimization.",2026-03-31T23:28:00.000Z,language-reference,,0.4,False,"OPTIMIZE command description for Delta tables; focuses on what the command does (bin-packing, collocation) rather than detailed limits, configuration parameter tables, or decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-optimize,OPTIMIZE,OPTIMIZE - Azure Databricks - Databricks SQL,Use OPTIMIZE to tune Delta Lake table layout,Learn how to use the OPTIMIZE syntax of the Delta Lake SQL language in Databricks SQL and Databricks Runtime to optimize the layout of Delta Lake data.,"Applies to:Databricks SQLDatabricks Runtime This page describes the OPTIMIZE command, which optimizes the layout of Delta Lake data. You can optimize a subset of data or collocate data by column.
If you don't specify collocation and the table doesn't use liquid clustering, Delta Lake performs bin-packing optimization.",2026-04-24T08:00:00.000Z,language-reference,configuration,0.74,True,"The OPTIMIZE command reference includes product-specific SQL syntax, options, and parameters (e.g., ZORDER BY, WHERE predicates, retention/cleanup behaviors) that define how to configure Delta Lake table optimization in Databricks. These are concrete, service-specific configuration knobs and behaviors rather than generic SQL knowledge, fitting the configuration sub-skill best.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-reorg-table,REORG TABLE,REORG TABLE - Azure Databricks - Databricks SQL,Use REORG TABLE to optimize Delta layout,Learn how to use the REORG TABLE syntax of the Delta Lake SQL language in Databricks SQL and Databricks Runtime to optimize the layout of Delta Lake data.,"Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Reorganize a Delta Lake table by rewriting files to purge soft-deleted data, such as the column data dropped by ALTER TABLE DROP COLUMN, or by performing Delta Lake checkpointing to improve metadata management.",2026-04-17T18:03:00.000Z,language-reference,configuration,0.7,True,"This is a product-specific SQL language reference for REORG TABLE in Databricks/Delta Lake, including exact syntax, options, and behavior details that are not generic SQL knowledge. It describes how specific parameters affect file rewriting, soft-delete purging, and checkpointing, which fits configuration of product-specific operations more than general SQL concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-restore,RESTORE,RESTORE - Azure Databricks - Databricks SQL,,Learn how to use the RESTORE syntax of the Delta Lake SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Restores a Delta table to an earlier state. 
Restoring to an earlier version number or a timestamp is supported. This page contains details for using the correct syntax with the RESTORE command. See Work with table history for more guidance on navigating Delta Lake table versions with this command.,2024-11-14T18:50:00.000Z,language-reference,,0.45,False,"RESTORE syntax and usage; summary does not indicate numeric limits, config parameters, or error-code mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-update,UPDATE,UPDATE - Azure Databricks - Databricks SQL,Update rows in Delta Lake tables with UPDATE,Learn how to use the UPDATE (table) syntax of the Delta Lake SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Updates the column values for the rows that match a predicate. When no predicate is provided, update the column values for all rows. This statement is only supported for Delta Lake tables.",2026-03-16T08:00:00.000Z,language-reference,integrations,0.68,True,"Provides Databricks SQL UPDATE syntax and behavior restricted to Delta tables, which are product-specific SQL surface details useful for coding against the service.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-vacuum,VACUUM,VACUUM - Azure Databricks - Databricks SQL,,Learn how to use the VACUUM syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Remove unused files from a table directory.
Note This command works differently depending on whether you're working on a Delta, Apache Spark, or Apache Iceberg table.",2026-02-05T18:55:00.000Z,language-reference,,0.55,False,VACUUM command overview; summary does not expose specific retention hours/days or other numeric constraints.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/abs,abs function,abs function - Azure Databricks - Databricks SQL,,Learn the syntax of the abs function of the SQL language in Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Returns the absolute value of the numeric value inexpr.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Page documents a standard abs() SQL function; no product-specific limits, configuration tables, or troubleshooting/decision guidance.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/abs,abs function,abs function - Azure Databricks - Databricks SQL,,Learn the syntax of the abs function of the SQL language in Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Returns the absolute value of the numeric value inexpr.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Page documents a standard abs() SQL function; no product-specific limits, configuration tables, or troubleshooting/decision guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/acos,acos function,acos function - Azure Databricks - Databricks SQL,,Learn the syntax of the acos function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the inverse cosine (arccosine) ofexpr.,2026-01-20T08:00:00.000Z,language-reference,,0.4,False,"Single math function reference (acos); likely just syntax and basic description without Databricks-specific limits, configs, or error mappings.",unchanged 
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/acosh,acosh function,acosh function - Azure Databricks - Databricks SQL,,Learn the syntax of the acosh function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the inverse hyperbolic cosine ofexpr.,2026-01-20T08:00:00.000Z,language-reference,,0.4,False,"Single math function reference (acosh); likely only syntax and description, no product-specific constraints or configuration tables.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/add_months,add_months function,add_months function - Azure Databricks - Databricks SQL,,Learn the syntax of the add\_months function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the date that isnumMonthsafterstartDate.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,"ADD_MONTHS date function reference; describes return value but no limits, configs, or decision/troubleshooting content.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/aes_decrypt,aes_decrypt function,aes_decrypt function - Azure Databricks - Databricks SQL,,Learn the syntax of the aes\_decrypt function of the SQL language in Databricks Runtime and Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Decrypts a binary produced using AES encryption.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"Page describes aes_decrypt() syntax and behavior but is a straightforward function reference without detailed security configuration, limits, or troubleshooting mappings.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/aes_encrypt,aes_encrypt function,aes_encrypt function - Azure Databricks - Databricks SQL,,Learn the syntax of the aes\_encrypt function of the SQL language in Databricks SQL and 
Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Encrypts a binary using AES encryption.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"Page describes aes_encrypt() syntax and behavior; it is a standard function reference without RBAC roles, config tables, limits, or decision/troubleshooting structures.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/aes_decrypt,aes_decrypt function,aes_decrypt function - Azure Databricks - Databricks SQL,,Learn the syntax of the aes\_decrypt function of the SQL language in Databricks Runtime and Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Decrypts a binary produced using AES encryption.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"Page describes aes_decrypt() syntax and behavior but is a straightforward function reference without detailed security configuration, limits, or troubleshooting mappings.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/aes_encrypt,aes_encrypt function,aes_encrypt function - Azure Databricks - Databricks SQL,,Learn the syntax of the aes\_encrypt function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Encrypts a binary using AES encryption.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"Page describes aes_encrypt() syntax and behavior; it is a standard function reference without RBAC roles, config tables, limits, or decision/troubleshooting structures.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/aggregate,aggregate function,aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Aggregates elements in an array using a custom aggregator. 
This function is a synonym forreducefunction.,2026-01-20T08:00:00.000Z,language-reference,,0.45,False,"aggregate array function reference; may show examples but not configuration parameters, limits, or decision matrices specific to Databricks.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_analyze_sentiment,ai_analyze_sentiment function,ai_analyze_sentiment function - Azure Databricks - Databricks SQL,Use ai_analyze_sentiment in Databricks SQL queries,Learn the syntax of the ai\_analyze\_sentiment function of the SQL language in Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime Important This functionality is inPublic PreviewandHIPAA compliant. During the preview: Theai_analyze_sentiment()function allows you to invoke a state-of-the-art generative AI model to perform sentiment analysis on input text using SQL.,2026-03-31T23:28:00.000Z,language-reference,integrations,0.7,True,"AI function that invokes Databricks-managed models; documentation typically includes function signature, argument names, types, defaults, and possibly constraints unique to Databricks AI Functions, fitting integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_classify,ai_classify function,ai_classify function - Azure Databricks - Databricks SQL,Classify text with ai_classify in Databricks SQL,Learn the syntax of the ai\_classify function of the SQL language in Databricks SQL.,"Applies to:Databricks SQLDatabricks Runtime Important This functionality is inPublic PreviewandHIPAA compliant. During the preview: Theai_classify()function classifies text content according to custom labels you provide. You can use simple label names for basic classification, or add label descriptions and instructions to improve accuracy for use cases like customer support routing, document categorization, and content analysis. 
The function accepts text orVARIANToutput from other AI functions l",2026-03-31T23:28:00.000Z,language-reference,integrations,0.7,True,"Describes ai_classify() with custom labels; likely includes parameter schema, accepted types (text/VARIANT), and behavior specific to Databricks AI Functions, which are product-specific integration patterns.",unchanged @@ -3736,12 +3766,12 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_gen,ai_gen function,ai_gen function - Azure Databricks - Databricks SQL,Generate content using ai_gen in Databricks SQL,Learn the syntax of the ai\_gen function of the SQL language in Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime Important This functionality is inPublic PreviewandHIPAA compliant. During the preview: Theai_gen()function invokes a state-of-the-art generative AI model to answer the user-provided prompt using SQL. This function uses a chat model serving endpoint made available byDatabricks Foundation Model APIs.,2026-03-31T23:28:00.000Z,language-reference,integrations,0.75,True,"ai_gen() invokes Databricks Foundation Model APIs; reference will include function signature, parameter options (prompt, model, etc.), and Databricks-specific behavior, fitting integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_generate_text,ai_generate_text function,ai_generate_text function - Azure Databricks - Databricks SQL,Use (deprecated) ai_generate_text for LLM text in Databricks,Learn the syntax of the ai\_generate\_text function of the SQL language in Databricks SQL.,"Applies to:Databricks SQL Important This feature is inPublic Preview. Warning The AI function,ai_generate_text()is deprecated. Databricks recommends usingai_query with external models. 
Returns text generated by a selected large language model (LLM) given the prompt.",2026-01-22T13:21:00.000Z,language-reference,configuration,0.7,True,"Deprecated AI function; documentation typically includes specific parameter names, defaults, and supported models, which are product-specific.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_mask,ai_mask function,ai_mask function - Azure Databricks - Databricks SQL,Mask sensitive entities with ai_mask in Databricks SQL,Learn the syntax of the ai\_mask function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Important This functionality is inPublic PreviewandHIPAA compliant. During the preview: Theai_mask()function allows you to invoke a state-of-the-art generative AI model to mask specified entities in a given text using SQL. This function uses a chat model serving endpoint made available byDatabricks Foundation Model APIs.,2026-03-31T23:28:00.000Z,language-reference,integrations,0.7,True,"AI masking function; page will define entity specification, parameters, and behavior tied to Databricks model endpoints, which are product-specific integration details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_parse_document,ai_parse_document function,ai_parse_document function - Azure Databricks - Databricks SQL,Use ai_parse_document in Databricks SQL queries,Learn the syntax of the ai\_parse\_document function of the SQL language in Databricks SQL.,"Applies to:Databricks SQLDatabricks Runtime Theai_parse_document()function leverages state-of-the-art Databricks-managed research techniques to parse structured content from unstructured documents. 
For a visual UI to validate and iterate on the results ofai_extract, seeDocument Parsing.",2026-04-15T08:00:00.000Z,language-reference,integrations,0.74,True,"Function reference pages for Databricks SQL typically include product-specific syntax, arguments, return types, and constraints for ai_parse_document that are not generally known to LLMs. This is concrete API/SQL-surface behavior, fitting integrations & coding patterns.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_prep_search,ai_prep_search function,ai_prep_search function - Azure Databricks - Databricks SQL,Prepare documents for RAG with ai_prep_search,Learn the syntax of the ai\_prep\_search function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Important This feature is inBeta. Workspace admins can control access to this feature from thePreviewspage. SeeManage Azure Databricks previews. Theai_prep_search()function transforms the structured output ofai_parse_documentinto a format optimized for RAG vector search and information retrieval systems. For each input document, the function splits content into semantic chunks, enriches each chunk with document-level context such as the document title,",2026-04-17T08:00:00.000Z,language-reference,integrations,0.74,True,"Describes ai_prep_search SQL function behavior, arguments, and output format for RAG/search scenarios. 
These are product-specific API details and configuration-like parameters, which qualify as expert integration knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_parse_document,ai_parse_document function,ai_parse_document function - Azure Databricks - Databricks SQL,Use ai_parse_document in Databricks SQL queries,Learn the syntax of the ai\_parse\_document function of the SQL language in Databricks SQL.,"Applies to:Databricks SQLDatabricks Runtime The ai_parse_document() function leverages state-of-the-art Databricks-managed research techniques to parse structured content from unstructured documents. For a visual UI to validate and iterate on the results of ai_parse_document, see Document Parsing.",2026-04-21T08:00:00.000Z,language-reference,integrations,0.7,True,"Function reference pages for Databricks SQL typically include product-specific syntax, parameters, and behavior details (such as required/optional arguments, supported data types, and return structures) that go beyond generic LLM knowledge. While not about limits or configuration, this is expert integration/coding-pattern knowledge for using the ai_parse_document function within Databricks SQL.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_prep_search,ai_prep_search function,ai_prep_search function - Azure Databricks - Databricks SQL,Prepare documents for RAG with ai_prep_search,Learn the syntax of the ai\_prep\_search function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. The ai_prep_search() function transforms the structured output of ai_parse_document into a format optimized for RAG vector search and information retrieval systems.
For each input document, the function splits content into semantic chunks, enriches each chunk with document-level context such as the document title,",2026-04-17T08:00:00.000Z,language-reference,integrations,0.74,True,"Describes ai_prep_search SQL function behavior, arguments, and output format for RAG/search scenarios. These are product-specific API details and configuration-like parameters, which qualify as expert integration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_query,ai_query function,ai_query function - Azure Databricks - Databricks SQL,Call model serving endpoints with ai_query in Databricks,Learn the syntax of the ai\_query function of the SQL language in Databricks SQL. See examples of querying foundation models and traditional ML models for inference.,Applies to:Databricks SQLDatabricks Runtime Important This functionality is inPublic PreviewandHIPAA compliant. During the preview: Invokes an existing Azure DatabricksModel Serving endpointand parses and returns its response. ai_queryis a general purposeAI Functionthat enables you to query existing endpoints for real-time inference or batch inference workloads.,2026-03-27T08:00:00.000Z,language-reference,integrations,0.8,True,"ai_query() is explicitly for invoking Databricks Model Serving endpoints; page will include endpoint parameterization, request/response handling, and constraints unique to Databricks, clearly an integration & coding pattern.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_similarity,ai_similarity function,ai_similarity function - Azure Databricks - Databricks SQL,Compute semantic similarity with ai_similarity in Databricks SQL,Learn the syntax of the ai\_similarity function of the SQL language in Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime Important This functionality is inPublic PreviewandHIPAA compliant. 
During the preview: Theai_similarity()function invokes a state-of-the-art generative AI model fromDatabricks Foundation Model APIsto compare two strings and computes the semantic similarity score using SQL.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.7,True,"Function uses Databricks Foundation Model APIs; documentation will specify parameters, return types, and usage patterns specific to Databricks AI Functions, matching integrations criteria.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_summarize,ai_summarize function,ai_summarize function - Azure Databricks - Databricks SQL,Summarize text using ai_summarize in Databricks SQL,Learn the syntax of the ai\_summarize function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Important This functionality is inPublic PreviewandHIPAA compliant. During the preview: Theai_summarize()function allows you to invoke a state-of-the-art generative AI model to generate a summary of a given text using SQL. This function uses a chat model serving endpoint made available byDatabricks Foundation Model APIs.,2026-03-31T23:28:00.000Z,language-reference,integrations,0.7,True,ai_summarize() integrates with Databricks chat model endpoints; reference will detail function arguments and behavior unique to this product integration.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_translate,ai_translate function,ai_translate function - Azure Databricks - Databricks SQL,Translate text with ai_translate in Databricks SQL,Learn the syntax of the ai\_translate function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Important This functionality is inPublic PreviewandHIPAA compliant. 
During the preview: Theai_translate()function allows you to invoke a state-of-the-art generative AI model to translate text to a specified target language using SQL. This function uses a chat model serving endpoint made available byDatabricks Foundation Model APIsand supports the following languages. Theto_langargument acceptsIETF BCP 47language codes (based onISO 639-1), full language",2026-04-16T17:44:00.000Z,language-reference,integrations,0.78,True,"Documents ai_translate SQL function including supported language codes, argument semantics, and behavior tied to Databricks Foundation Model APIs. This is concrete, product-specific API surface, best matching integrations.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_translate,ai_translate function,ai_translate function - Azure Databricks - Databricks SQL,Translate text with ai_translate in Databricks SQL,Learn the syntax of the ai\_translate function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Important This functionality is inPublic PreviewandHIPAA compliant. During the preview: Theai_translate()function allows you to invoke a state-of-the-art generative AI model to translate text to a specified target language using SQL. This function uses a chat model serving endpoint made available byDatabricks Foundation Model APIsand supports the following languages. Theto_langargument acceptsIETF BCP 47language codes (based onISO 639-1), full language",2026-04-16T17:44:00.000Z,language-reference,integrations,0.78,True,"Documents ai_translate SQL function including supported language codes, argument semantics, and behavior tied to Databricks Foundation Model APIs. 
This is concrete, product-specific API surface, best matching integrations.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ampersandsign,& (ampersand sign) operator,& (ampersand sign) operator - Azure Databricks - Databricks SQL,,Learn the syntax of the & (ampersand sign) operator of the SQL language in Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime Returns the bitwise AND ofexpr1andexpr2.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,Bitwise AND operator reference; generic SQL semantics without Databricks-specific limits or configs.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/and,and predicate,and predicate - Azure Databricks - Databricks SQL,,Learn the syntax of the and predicate of the SQL language in Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime Returns the logical AND ofexpr1andexpr2.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,"Logical AND predicate reference; standard SQL behavior, no product-specific expert content indicated.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/any,any aggregate function,any aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the any aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Returnstrueif at least one value ofexprin the group is true. @@ -3753,50 +3783,50 @@ algorithm, a state of the art cardinality estimation algorithm. 
Results are accu of the maximum relative standard deviation, although this is configurable with therelativeSDparameter as mentioned below.",2024-06-04T18:15:00.000Z,language-reference,,0.4,False,"approx_count_distinct describes HLL++ and default 5% error, but this is algorithmic rather than product limits/quotas or config tables.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/approx_percentile,approx_percentile aggregate function,approx_percentile aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the approx\_percentile aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the approximate percentile of theexprwithin the group.,2024-06-04T18:15:00.000Z,language-reference,,0.3,False,approx_percentile aggregate function; summary is high-level and does not mention Databricks-specific parameters or limits.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/approx_top_k,approx_top_k aggregate function,approx_top_k aggregate function - Azure Databricks - Databricks SQL,Use approx_top_k aggregate in Databricks SQL,Learn the syntax of the approx\_top\_k aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Returns the topkmost frequently occurring item values in anexpralong with their approximate counts.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference for approx_top_k with Databricks-specific syntax, argument behavior, and return structure. 
These are detailed API semantics beyond generic SQL knowledge, fitting integrations & coding patterns.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/approx_top_k_accumulate,approx_top_k_accumulate aggregate function,approx_top_k_accumulate aggregate function - Azure Databricks - Databricks SQL,Accumulate top-K sketches with approx_top_k_accumulate,Learn the syntax of the approx\_top\_k\_accumulate aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks Runtime 18.1 and above Accumulates items into an Apache DataSketches ItemsSketch for approximate frequent items (top-K) estimation. The sketch can later be combined with other sketches usingapprox_top_k_combineor estimated directly usingapprox_top_k_estimate.,2026-04-13T08:00:00.000Z,language-reference,integrations,0.72,True,"Describes how to build and use Apache DataSketches-based state via approx_top_k_accumulate in Databricks Runtime, including how it interacts with related functions. This is specific API usage and integration behavior.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/approx_top_k_combine,approx_top_k_combine aggregate function,approx_top_k_combine aggregate function - Azure Databricks - Databricks SQL,Combine top-K sketches with approx_top_k_combine,Learn the syntax of the approx\_top\_k\_combine aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks Runtime 18.1 and above Combines multiple sketch states produced byapprox_top_k_accumulateinto a single merged sketch. Use this function to combine sketches from different partitions, time periods, or data sources.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.72,True,"Documents approx_top_k_combine behavior for merging sketch states from approx_top_k_accumulate. 
This is product-specific function semantics and usage patterns, aligning with integrations.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/approx_top_k,approx_top_k aggregate function,approx_top_k aggregate function - Azure Databricks - Databricks SQL,Use approx_top_k aggregate in Databricks SQL,Learn the syntax of the approx\_top\_k aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Returns the top k most frequently occurring item values in an expr along with their approximate counts.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference for approx_top_k with Databricks-specific syntax, argument behavior, and return structure. These are detailed API semantics beyond generic SQL knowledge, fitting integrations & coding patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/approx_top_k_accumulate,approx_top_k_accumulate aggregate function,approx_top_k_accumulate aggregate function - Azure Databricks - Databricks SQL,Accumulate top-K sketches with approx_top_k_accumulate,Learn the syntax of the approx\_top\_k\_accumulate aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks Runtime 18.1 and above Accumulates items into an Apache DataSketches ItemsSketch for approximate frequent items (top-K) estimation. The sketch can later be combined with other sketches using approx_top_k_combine or estimated directly using approx_top_k_estimate.,2026-04-13T08:00:00.000Z,language-reference,integrations,0.72,True,"Describes how to build and use Apache DataSketches-based state via approx_top_k_accumulate in Databricks Runtime, including how it interacts with related functions.
This is specific API usage and integration behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/approx_top_k_combine,approx_top_k_combine aggregate function,approx_top_k_combine aggregate function - Azure Databricks - Databricks SQL,Combine top-K sketches with approx_top_k_combine,Learn the syntax of the approx\_top\_k\_combine aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks Runtime 18.1 and above Combines multiple sketch states produced by approx_top_k_accumulate into a single merged sketch. Use this function to combine sketches from different partitions, time periods, or data sources.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.72,True,"Documents approx_top_k_combine behavior for merging sketch states from approx_top_k_accumulate. This is product-specific function semantics and usage patterns, aligning with integrations.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/approx_top_k_estimate,approx_top_k_estimate function,approx_top_k_estimate function - Azure Databricks - Databricks SQL,,Learn the syntax of the approx\_top\_k\_estimate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks Runtime 18.1 and above Returns the top K most frequent items with their estimated counts from a sketch state produced byapprox_top_k_accumulateorapprox_top_k_combine.,2026-02-18T22:10:00.000Z,language-reference,,0.45,False,approx_top_k_estimate returns frequent items from a sketch; summary does not show product-specific parameters or quotas.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array,array function,array function - Azure Databricks - Databricks SQL,,Learn the syntax of the array function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returns an array
with the elements inexpr.,2024-04-21T19:33:00.000Z,language-reference,,0.25,False,ARRAY constructor function; generic SQL-like behavior without Databricks-specific expert details.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_agg,array_agg aggregate function,array_agg aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the array\_agg function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Returns an array consisting of all values inexprwithin the group. This function is a synonym forcollect_listaggregate function.,2024-04-21T19:33:00.000Z,language-reference,,0.25,False,"array_agg aggregate function; synonym for collect_list, no indication of limits or configuration tables.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_append,array_append function,array_append function - Azure Databricks - Databricks SQL,,Learn the syntax of the array\_append function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 12.2 LTS and above Returnsarrayappended byelem.,2024-04-21T19:33:00.000Z,language-reference,,0.25,False,"array_append function; simple behavior description, no product-specific constraints or configs indicated.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_compact,array_compact function,array_compact function - Azure Databricks - Databricks SQL,,Learn the syntax of the array\_distinct function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 12.2 LTS and above Removes NULL elements fromarray.,2024-06-04T18:15:00.000Z,language-reference,,0.25,False,array_compact removes NULLs; straightforward function semantics without expert-level Databricks-specific content.,unchanged 
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_contains,array_contains function,array_contains function - Azure Databricks - Databricks SQL,,Learn the syntax of the array\_contains function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns true ifarraycontainsvalue.,2026-01-20T08:00:00.000Z,language-reference,,0.2,False,"Function reference for array_contains with syntax and basic behavior only; no limits, quotas, configuration tables, error-code mappings, or product-specific thresholds.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_distinct,array_distinct function,array_distinct function - Azure Databricks - Databricks SQL,Remove duplicates with array_distinct in Databricks SQL,Learn the syntax of the array\_distinct function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Removes duplicate values fromarray.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.68,True,"Even though conceptually simple, the page will define exact syntax, type behavior, and edge cases for array_distinct in Databricks SQL, which are concrete API details rather than generic concepts.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_distinct,array_distinct function,array_distinct function - Azure Databricks - Databricks SQL,Remove duplicates with array_distinct in Databricks SQL,Learn the syntax of the array\_distinct function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Removes duplicate values fromarray.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.68,True,"Even though conceptually simple, the page will define exact syntax, type behavior, and edge cases for array_distinct in Databricks SQL, which are concrete API details rather than generic 
concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_except,array_except function,array_except function - Azure Databricks - Databricks SQL,,Learn the syntax of the array\_except function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns an array of the elements inarray1but not inarray2.,2024-03-01T08:00:00.000Z,language-reference,,0.25,False,"array_except function; set difference semantics, no indication of Databricks-specific limits or configs.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_insert,array_insert function,array_insert function - Azure Databricks - Databricks SQL,Insert elements with array_insert in Databricks SQL,Learn the syntax of the array\_insert function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 13.3 LTS and above Returns an expandedarraywhereelemis inserted at theindexposition.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.68,True,"Covers exact syntax and behavior (index handling, return type) of array_insert in Databricks SQL/Runtime 13.3+, which are specific coding patterns for this platform.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_insert,array_insert function,array_insert function - Azure Databricks - Databricks SQL,Insert elements with array_insert in Databricks SQL,Learn the syntax of the array\_insert function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 13.3 LTS and above Returns an expandedarraywhereelemis inserted at theindexposition.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.68,True,"Covers exact syntax and behavior (index handling, return type) of array_insert in Databricks SQL/Runtime 13.3+, which are specific coding patterns for this platform.",unchanged 
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_intersect,array_intersect function,array_intersect function - Azure Databricks - Databricks SQL,,Learn the syntax of the array\_intersect function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns an array of the elements in the intersection ofarray1andarray2.,2024-03-01T08:00:00.000Z,language-reference,,0.25,False,"array_intersect function; intersection semantics, no product-specific limits or configuration tables indicated.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_join,array_join function,array_join function - Azure Databricks - Databricks SQL,,Learn the syntax of the array\_join function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Concatenates the elements ofarray.,2024-03-01T08:00:00.000Z,language-reference,,0.25,False,"array_join function; concatenates array elements, generic behavior without Databricks-specific expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_max,array_max function,array_max function - Azure Databricks - Databricks SQL,,Learn the syntax of the array\_max function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the maximum value inarray.,2024-03-01T08:00:00.000Z,language-reference,,0.4,False,"Function reference for array_max; likely just syntax, description, and simple examples without product-specific limits, configs, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_min,array_min function,array_min function - Azure Databricks - Databricks SQL,,Learn the syntax of the array\_min function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks 
SQLDatabricks Runtime Returns the minimum value inarray.,2024-03-01T08:00:00.000Z,language-reference,,0.4,False,"Function reference for array_min; standard SQL-like behavior, no indication of limits, quotas, or specialized patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_position,array_position function,array_position function - Azure Databricks - Databricks SQL,,Learn the syntax of the array\_position function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the position of the first occurrence ofelementinarray.,2024-12-11T18:37:00.000Z,language-reference,,0.4,False,"Function reference for array_position; describes basic behavior, not product-specific expert guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_prepend,array_prepend function,array_prepend function - Azure Databricks - Databricks SQL,,Learn the syntax of the array\_prepend function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 13.3 LTS and above Returnsarrayprepended byelem.,2024-04-18T08:00:00.000Z,language-reference,,0.4,False,"Function reference for array_prepend; simple description of prepending element to array, no advanced configuration or limits.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_remove,array_remove function,array_remove function - Azure Databricks - Databricks SQL,Remove elements with array_remove in Databricks SQL,Learn the syntax of the array\_remove function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Removes all occurrences ofelementfromarray.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.68,True,"Provides precise function signature and behavior for array_remove in Databricks SQL, including how occurrences are 
matched and removed. This is product-specific API usage.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_repeat,array_repeat function,array_repeat function - Azure Databricks - Databricks SQL,Create repeated arrays with array_repeat in Databricks SQL,Learn the syntax of the array\_repeat function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns an array containingelementcounttimes.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.68,True,"Defines how array_repeat works in Databricks SQL, including parameter order, types, and behavior for count, which are concrete integration/coding details.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_remove,array_remove function,array_remove function - Azure Databricks - Databricks SQL,Remove elements with array_remove in Databricks SQL,Learn the syntax of the array\_remove function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Removes all occurrences ofelementfromarray.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.68,True,"Provides precise function signature and behavior for array_remove in Databricks SQL, including how occurrences are matched and removed. 
This is product-specific API usage.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_repeat,array_repeat function,array_repeat function - Azure Databricks - Databricks SQL,Create repeated arrays with array_repeat in Databricks SQL,Learn the syntax of the array\_repeat function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns an array containingelementcounttimes.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.68,True,"Defines how array_repeat works in Databricks SQL, including parameter order, types, and behavior for count, which are concrete integration/coding details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_size,array_size function,array_size function - Azure Databricks - Databricks SQL,,Learn the syntax of the array\_size function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the number of elements inarray.,2024-03-01T08:00:00.000Z,language-reference,,0.4,False,"Function reference for array_size; generic behavior, no product-specific limits or configuration tables.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_sort,array_sort function,array_sort function - Azure Databricks - Databricks SQL,Sort arrays with array_sort in Databricks SQL,Learn the syntax of the array\_sort function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returnsarraysorted according tofunc.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.68,True,"Documents array_sort syntax and the func parameter behavior in Databricks SQL, which are specific API semantics and coding patterns rather than generic theory.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_union,array_union 
function,array_union function - Azure Databricks - Databricks SQL,,Learn the syntax of the array\_union function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns an array of the elements in the union ofarray1andarray2without duplicates.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Function reference for array_union; primarily syntax and basic behavior without product-specific limits, configuration tables, or decision/troubleshooting content.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_sort,array_sort function,array_sort function - Azure Databricks - Databricks SQL,Sort arrays with array_sort in Databricks SQL,Learn the syntax of the array\_sort function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returnsarraysorted according tofunc.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.68,True,"Documents array_sort syntax and the func parameter behavior in Databricks SQL, which are specific API semantics and coding patterns rather than generic theory.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_union,array_union function,array_union function - Azure Databricks - Databricks SQL,,Learn the syntax of the array\_union function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns an array of the elements in the union ofarray1andarray2without duplicates.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Function reference for array_union; primarily syntax and basic behavior without product-specific limits, configuration tables, or decision/troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/arrays_overlap,arrays_overlap function,arrays_overlap function - Azure Databricks - Databricks SQL,,Learn 
the syntax of the arrays\_overlap function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns true if the intersection ofarray1andarray2is not empty.,2024-03-01T08:00:00.000Z,language-reference,,0.4,False,"Function reference for arrays_overlap; simple boolean predicate, not exposing specialized troubleshooting or quotas.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/arrays_zip,arrays_zip function,arrays_zip function - Azure Databricks - Databricks SQL,,Learn the syntax of the arrays\_zip function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns a merged array of structs in which the nth struct contains all nth values of input arrays.,2024-03-01T08:00:00.000Z,language-reference,,0.4,False,"Function reference for arrays_zip; describes merging arrays into structs, but no indication of expert-only constraints or configs.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ascii,ascii function,ascii function - Azure Databricks - Databricks SQL,,Learn the syntax of the ascii function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the ASCII code point of the first character ofstr.,2024-03-01T08:00:00.000Z,language-reference,,0.4,False,"Function reference for ascii; standard SQL scalar function, behavior well-known and not Databricks-specific.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/asin,asin function,asin function - Azure Databricks - Databricks SQL,,Learn the syntax of the asin function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the inverse sine (arcsine) ofexpr.,2026-01-20T08:00:00.000Z,language-reference,,0.2,False,"Mathematical function reference (asin) with standard 
behavior; lacks product-specific limits, configuration parameters, or troubleshooting details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/asinh,asinh function,asinh function - Azure Databricks - Databricks SQL,,Learn the syntax of the asinh function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the inverse hyperbolic sine ofexpr.,2026-01-20T08:00:00.000Z,language-reference,,0.2,False,"Mathematical function reference (asinh) describing inverse hyperbolic sine; no expert-only limits, quotas, or configuration specifics.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/assert_true,assert_true function,assert_true function - Azure Databricks - Databricks SQL,,Learn the syntax of the assert\_true function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns an error ifexpris not true.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,assert_true function reference; describes simple behavior (error if expression not true) without detailed product-specific constraints or configuration.,updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/asterisksign,* (asterisk sign) operator,* (asterisk sign) operator - Azure Databricks - Databricks SQL,,Learn the syntax of the \* (asterisk sign) operator of the SQL language in Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime Returnsmultipliermultiplied bymultiplicand.,2026-04-13T21:52:00.000Z,language-reference,,0.1,False,Asterisk (*) operator reference; basic arithmetic operator semantics that are generic and not product-specific expert knowledge.,updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/assert_true,assert_true function,assert_true function - Azure Databricks - Databricks SQL,,Learn the syntax of the assert\_true 
function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns an error ifexpris not true.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,assert_true function reference; describes simple behavior (error if expression not true) without detailed product-specific constraints or configuration.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/asterisksign,* (asterisk sign) operator,* (asterisk sign) operator - Azure Databricks - Databricks SQL,,Learn the syntax of the \* (asterisk sign) operator of the SQL language in Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime Returnsmultipliermultiplied bymultiplicand.,2026-04-13T21:52:00.000Z,language-reference,,0.1,False,Asterisk (*) operator reference; basic arithmetic operator semantics that are generic and not product-specific expert knowledge.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/atan,atan function,atan function - Azure Databricks - Databricks SQL,,Learn the syntax of the atan function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the inverse tangent (arctangent) ofexpr.,2026-01-20T08:00:00.000Z,language-reference,,0.2,False,"Mathematical function reference (atan) with standard arctangent behavior; no Databricks-specific limits, configuration, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/atan2,atan2 function,atan2 function - Azure Databricks - Databricks SQL,,Learn the syntax of the atan2 function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Returns the angle in radians between the positive x-axis of a plane and the point specified by the coordinates (exprX,exprY).",2026-01-20T08:00:00.000Z,language-reference,,0.2,False,"Mathematical function reference 
(atan2) describing angle computation; contains syntax and description only, without expert configuration or limits.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/atanh,atanh function,atanh function - Azure Databricks - Databricks SQL,,Learn the syntax of the atanh function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns inverse hyperbolic tangent ofexpr.,2026-01-20T08:00:00.000Z,language-reference,,0.2,False,"Mathematical function reference (atanh) with generic behavior; no product-specific quotas, configuration parameters, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/avg,avg aggregate function,avg aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the avg function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the mean calculated from values of a group. This function is a synonym formeanaggregate function.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"avg aggregate function reference; standard SQL aggregate behavior without Databricks-specific limits, configs, or decision guidance.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/avg,avg aggregate function,avg aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the avg function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the mean calculated from values of a group. 
This function is a synonym formeanaggregate function.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"avg aggregate function reference; standard SQL aggregate behavior without Databricks-specific limits, configs, or decision guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/bangeqsign,!= (bangeq sign) operator,!= (bangeq sign) operator - Azure Databricks - Databricks SQL,,Learn the syntax of the != (bangeq sign) operator of the SQL language in Databricks SQL.,"Applies to:Databricks SQLDatabricks Runtime Returnstrueifexpr1does not equalexpr2, orfalseotherwise. This function is a synonym for<>(lt gt sign) operator.",2024-03-01T08:00:00.000Z,language-reference,,0.3,False,"Operator reference for !=; basic comparison semantics, no expert-only information.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/bangsign,! (bang sign) operator,! (bang sign) operator - Azure Databricks - Databricks SQL,,Learn the syntax of the ! (bang sign) operator of the SQL language in Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime Returns the logicalNOTof a Boolean expression. 
This operator is a synonym fornotoperator.,2024-03-01T08:00:00.000Z,language-reference,,0.3,False,"Operator reference for !; logical NOT semantics, generic SQL behavior.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/base64,base64 function,base64 function - Azure Databricks - Databricks SQL,,Learn the syntax of the base64 function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Convertsexprto a base 64 string usingRFC2045 Base64 transfer encoding for MIME.,2024-03-01T08:00:00.000Z,language-reference,,0.5,False,"base64 function; uses standard RFC2045 encoding, behavior is generic and not Databricks-specific expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/between,between predicate,between predicate - Azure Databricks - Databricks SQL,,Learn the syntax of the between predicate of the SQL language in Databricks SQL.,Tests whetherexpr1is greater or equal thanexpr2and less than or equal toexpr3.,2024-03-01T08:00:00.000Z,language-reference,,0.3,False,"between predicate; standard SQL predicate semantics, no product-specific thresholds or configs.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/bigint,bigint function,bigint function - Azure Databricks - Databricks SQL,,Learn the syntax of the bigint function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Casts the valueexprto BIGINT. This function is a synonym forCAST(expr AS BIGINT). 
Seecastfunctionfor details.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"bigint cast function reference; generic casting behavior, no detailed configuration, limits, or troubleshooting content.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/bigint,bigint function,bigint function - Azure Databricks - Databricks SQL,,Learn the syntax of the bigint function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Casts the valueexprto BIGINT. This function is a synonym forCAST(expr AS BIGINT). Seecastfunctionfor details.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"bigint cast function reference; generic casting behavior, no detailed configuration, limits, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/bin,bin function,bin function - Azure Databricks - Databricks SQL,,Learn the syntax of the bin function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the binary representation ofexpr.,2024-03-01T08:00:00.000Z,language-reference,,0.45,False,"bin function; returns binary representation of a number, standard behavior without expert-only details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/binary,binary function,binary function - Azure Databricks - Databricks SQL,,Learn the syntax of the binary function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Casts the value ofexprto BINARY. This function is a synonym forCAST(expr AS BINARY). 
Seecastfunctionfor details.,2024-03-01T08:00:00.000Z,language-reference,,0.35,False,"binary cast function; simple type cast, no advanced configuration or limits.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/bit_and,bit_and aggregate function,bit_and aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the bit\_and function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the bitwise AND of all input values in the group.,2024-03-01T08:00:00.000Z,language-reference,,0.45,False,"bit_and aggregate; standard bitwise aggregation semantics, no indication of Databricks-specific constraints or patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/bit_count,bit_count function,bit_count function - Azure Databricks - Databricks SQL,,Learn the syntax of the bit\_count function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the number of bits set in the argument. To count bits in aBINARYexpression usebitmap_countfunction.,2024-03-01T08:00:00.000Z,language-reference,,0.45,False,"bit_count function; counts bits in argument, generic behavior with a simple note about bitmap_count, not configuration- or limit-focused.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/bit_get,bit_get function,bit_get function - Azure Databricks - Databricks SQL,,Learn the syntax of the bit\_get function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returns the value of a bit in a binary representation of an integral numeric. 
This function is a synonym forgetbitfunction.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"bit_get function reference; while it notes availability from Databricks Runtime 11.3 LTS, it mainly documents basic function behavior without deeper product-specific constraints or patterns.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/bit_get,bit_get function,bit_get function - Azure Databricks - Databricks SQL,,Learn the syntax of the bit\_get function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returns the value of a bit in a binary representation of an integral numeric. This function is a synonym forgetbitfunction.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"bit_get function reference; while it notes availability from Databricks Runtime 11.3 LTS, it mainly documents basic function behavior without deeper product-specific constraints or patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/bit_length,bit_length function,bit_length function - Azure Databricks - Databricks SQL,,Learn the syntax of the bit\_length function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the bit length of string data or number of bits of binary data.,2024-03-01T08:00:00.000Z,language-reference,,0.45,False,"bit_length function; generic string/binary bit length behavior, not Databricks-specific expert content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/bit_or,bit_or aggregate function,bit_or aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the bit\_or function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the bitwiseORof all input values in the group. 
To aggregate bit positions into aBINARYbitmap use thebitmap_construct_agg()aggregate function. To aggregateBINARYinput values use thebitmap_or_agg()] aggregate function.,2024-03-01T08:00:00.000Z,language-reference,,0.45,False,"bit_or aggregate; standard bitwise OR aggregation, references other bitmap functions but no limits, quotas, or configs.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/bit_xor,bit_xor aggregate function,bit_xor aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the bit\_xor function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the bitwiseXORof all input values in the group.,2024-03-01T08:00:00.000Z,language-reference,,0.45,False,"bit_xor aggregate; generic bitwise XOR aggregation, no expert-only Databricks behavior.",unchanged @@ -3813,15 +3843,15 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions This function is a synonym foreveryaggregate function.",2024-11-26T08:00:00.000Z,language-reference,,0.45,False,"bool_and aggregate; synonym for every, standard boolean aggregation semantics, no expert troubleshooting or configuration content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/bool_or,bool_or aggregate function,bool_or aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the bool\_or function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returnstrueif at least one value inexpris true within the group. 
Thebool_oraggregate function is synonymous withanyaggregate function.,2024-07-18T17:47:00.000Z,language-reference,,0.2,False,"Function reference for bool_or; syntax/behavior is generic SQL-style aggregation without product-specific limits, configs, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/boolean,boolean function,boolean function - Azure Databricks - Databricks SQL,,Learn the syntax of the boolean function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Castsexprto boolean. This function is a synonym forCAST(expr AS binary). Seecastfunctionfor details.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,"Casts to boolean; generic casting behavior, no Databricks-specific limits, configs, or troubleshooting content.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/bracketsign,[ ] (bracket sign) operator,[ ] (bracket sign) operator - Azure Databricks - Databricks SQL,,Learn the syntax of the \[ ] (bracket sign) operator of the SQL language in Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime Returns an array element or map value given an index or key.,2026-04-13T21:52:00.000Z,language-reference,,0.1,False,"[ ] bracket operator reference; generic array/map indexing semantics, no expert-level configuration, limits, or troubleshooting.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/bround,bround function,bround function - Azure Databricks - Databricks SQL,,Learn the syntax of the bround function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the roundedexprusingHALF_EVENrounding mode.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"bround function reference; describes rounding mode but not in the form of product-specific best practices, limits, or configuration 
tables.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/bracketsign,[ ] (bracket sign) operator,[ ] (bracket sign) operator - Azure Databricks - Databricks SQL,,Learn the syntax of the \[ ] (bracket sign) operator of the SQL language in Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime Returns an array element or map value given an index or key.,2026-04-13T21:52:00.000Z,language-reference,,0.1,False,"[ ] bracket operator reference; generic array/map indexing semantics, no expert-level configuration, limits, or troubleshooting.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/bround,bround function,bround function - Azure Databricks - Databricks SQL,,Learn the syntax of the bround function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the roundedexprusingHALF_EVENrounding mode.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"bround function reference; describes rounding mode but not in the form of product-specific best practices, limits, or configuration tables.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/cardinality,cardinality function,cardinality function - Azure Databricks - Databricks SQL,,Learn the syntax of the cardinality function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the size ofexpr.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,"cardinality returns size of array; generic behavior, no Databricks-specific parameters or limits.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/caretsign,^ (caret sign) operator,^ (caret sign) operator - Azure Databricks - Databricks SQL,,Learn the syntax of the ^ (caret sign) operator of the SQL language in Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime Returns the bitwise 
exclusiveOR (XOR)ofexpr1andexpr2.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,"Bitwise XOR operator; standard SQL operator semantics, no special Databricks-only details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/case,case expression,case expression - Azure Databricks - Databricks SQL,Implement CASE expressions in Databricks SQL,Learn the syntax of the case function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime ReturnsresNfor the firstoptNthat equalsexprordefif none matches. ReturnsresNfor the firstcondNevaluating to true, ordefif none found.",2026-01-20T08:00:00.000Z,language-reference,integrations,0.6,True,"Details Databricks SQL CASE expression semantics (first matching branch, default handling), which are part of the product’s SQL dialect and execution behavior.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/cast,cast function,cast function - Azure Databricks - Databricks SQL,,Learn the syntax of the cast function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Casts the valueexprto the target data typetype. This operator is a synonym for::(colon colon sign) operator,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"cast function reference; standard casting semantics and synonym operator, without detailed product-specific constraints or configuration options.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/cast,cast function,cast function - Azure Databricks - Databricks SQL,,Learn the syntax of the cast function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Casts the valueexprto the target data typetype. 
This operator is a synonym for::(colon colon sign) operator,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"cast function reference; standard casting semantics and synonym operator, without detailed product-specific constraints or configuration options.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/cbrt,cbrt function,cbrt function - Azure Databricks - Databricks SQL,Compute cube roots with cbrt in Databricks SQL,Learn the syntax of the cbrt function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the cube root ofexpr.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.58,True,"Function reference for a Databricks SQL numeric function; while mathematically simple, it is still concrete product API surface and behavior.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ceil,ceil function,ceil function - Azure Databricks - Databricks SQL,,Learn the syntax of the ceil function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returns the smallest number not smaller thanexprrounded up totargetScaledigits relative to the decimal point. This function is a synonym ofceilingfunction.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Page documents the Databricks SQL ceil function syntax and behavior. 
This is standard SQL-style function reference without product-specific limits, quotas, configuration tables, or decision/troubleshooting content.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ceiling,ceiling function,ceiling function - Azure Databricks - Databricks SQL,,Learn the syntax of the ceiling function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returns the smallest number not smaller thanexprrounded up totargetScaledigits relative to the decimal point. This function is a synonym ofceilfunction.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Page documents the Databricks SQL ceiling function syntax and behavior. It is a basic function reference and synonym of ceil, without product-specific limits, configuration parameters, or troubleshooting guidance.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ceil,ceil function,ceil function - Azure Databricks - Databricks SQL,,Learn the syntax of the ceil function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returns the smallest number not smaller thanexprrounded up totargetScaledigits relative to the decimal point. This function is a synonym ofceilingfunction.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Page documents the Databricks SQL ceil function syntax and behavior. 
This is standard SQL-style function reference without product-specific limits, quotas, configuration tables, or decision/troubleshooting content.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ceiling,ceiling function,ceiling function - Azure Databricks - Databricks SQL,,Learn the syntax of the ceiling function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returns the smallest number not smaller thanexprrounded up totargetScaledigits relative to the decimal point. This function is a synonym ofceilfunction.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Page documents the Databricks SQL ceiling function syntax and behavior. It is a basic function reference and synonym of ceil, without product-specific limits, configuration parameters, or troubleshooting guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/char,char function,char function - Azure Databricks - Databricks SQL,,Learn the syntax of the char function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the character at the supplied UTF-16 code point. This function is a synonym forchrfunction.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,"char returns UTF-16 character; generic string function, no product-specific parameters or limits.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/char_length,char_length function,char_length function - Azure Databricks - Databricks SQL,,Learn the syntax of the char\_length function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the character length of string data or number of bytes of binary data. 
This function is a synonym forcharacter_lengthfunctionandlengthfunction.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,"char_length string length; generic function, synonyms only, no expert configuration or limits.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/character_length,character_length function,character_length function - Azure Databricks - Databricks SQL,,Learn the syntax of the character\_length function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the character length of string data or number of bytes of binary data. This function is a synonym forchar_lengthfunctionandlengthfunction.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,"character_length string length; same as char_length/length, generic SQL behavior.",unchanged @@ -3833,12 +3863,12 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/collations,collations table function,collations table function - Azure Databricks - Databricks SQL,List supported collations in Azure Databricks SQL,Learn the syntax of the collations table function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 16.1 and above Returns the set of supported collations in Azure Databricks.,2026-01-27T08:00:00.000Z,language-reference,configuration,0.7,True,Explicitly returns set of supported collations; this is a product-specific configuration matrix not known generically.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/collect_list,collect_list aggregate function,collect_list aggregate function - Azure Databricks - Databricks SQL,Aggregate values into arrays with collect_list,Learn the syntax of the collect\_list function of the SQL language in Databricks SQL and Databricks Runtime.,Applies 
to:Databricks SQLDatabricks Runtime Returns an array consisting of all values inexprwithin the group. This function is a synonym forarray_aggaggregate function.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.63,True,"Documents a Databricks SQL aggregate function and its synonym array_agg, which is concrete dialect/API behavior for this product.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/collect_set,collect_set aggregate function,collect_set aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the collect\_set function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns an array consisting of all unique values inexprwithin the group.,2024-04-18T19:27:00.000Z,language-reference,,0.25,False,"collect_set aggregate; generic distinct aggregation, no expert-only Databricks details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/coloncolonsign,:: (colon colon sign) operator,:: (colon colon sign) operator - Azure Databricks - Databricks SQL,,Learn the syntax of the :: (colon colon sign) operator of the SQL language in Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime Casts the valueexprto the target data typetype. This operator is a synonym forcastfunction.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Page describes the :: cast operator in Databricks SQL. 
It is a straightforward language feature description, not containing limits, quotas, configuration tables, or decision/troubleshooting matrices.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/coloncolonsign,:: (colon colon sign) operator,:: (colon colon sign) operator - Azure Databricks - Databricks SQL,,Learn the syntax of the :: (colon colon sign) operator of the SQL language in Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime Casts the valueexprto the target data typetype. This operator is a synonym forcastfunction.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Page describes the :: cast operator in Databricks SQL. It is a straightforward language feature description, not containing limits, quotas, configuration tables, or decision/troubleshooting matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/colonsign,: (colon sign) operator,: (colon sign) operator - Azure Databricks - Databricks SQL,Extract JSON content with Databricks colon operator,Learn the syntax of the : (colon sign) operator of the SQL language in Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime Extracts content from a JSON string using aJSON path expression.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.7,True,"Describes a Databricks SQL-specific : operator for JSON path extraction, including its semantics; this is nonstandard SQL behavior and thus expert product knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/concat,concat function,concat function - Azure Databricks - Databricks SQL,,Learn the syntax of the concat function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the concatenation of the arguments. 
This function is a synonym for||(pipe pipe sign) operator.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Page documents the concat function in Databricks SQL as a synonym for the || operator. This is standard function syntax/behavior, not expert configuration, limits, or troubleshooting content.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/concat,concat function,concat function - Azure Databricks - Databricks SQL,,Learn the syntax of the concat function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the concatenation of the arguments. This function is a synonym for||(pipe pipe sign) operator.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Page documents the concat function in Databricks SQL as a synonym for the || operator. This is standard function syntax/behavior, not expert configuration, limits, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/concat_ws,concat_ws function,concat_ws function - Azure Databricks - Databricks SQL,,Learn the syntax of the concat\_ws function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the concatenation strings separated bysep.,2024-03-01T08:00:00.000Z,language-reference,,0.25,False,"concat_ws; generic separator-based concatenation, no Databricks-specific expert details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/contains,contains function,contains function - Azure Databricks - Databricks SQL,Check substring presence with contains in Databricks SQL,Learn the syntax of the contains function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above 
ReturnstrueifexprcontainssubExpr.,2026-03-05T08:00:00.000Z,language-reference,integrations,0.64,True,Function reference for a Databricks SQL string predicate with specific version applicability; this is concrete API behavior for the service.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/conv,conv function,conv function - Azure Databricks - Databricks SQL,,Learn the syntax of the conv function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime ConvertsnumfromfromBasetotoBase.,2026-04-13T21:52:00.000Z,language-reference,,0.25,False,"Page describes the conv function for base conversion in Databricks SQL. While product-specific, it is a simple function reference without numeric limits, configuration parameters, or decision/troubleshooting structures required for any sub-skill type.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/conv,conv function,conv function - Azure Databricks - Databricks SQL,,Learn the syntax of the conv function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime ConvertsnumfromfromBasetotoBase.,2026-04-13T21:52:00.000Z,language-reference,,0.25,False,"Page describes the conv function for base conversion in Databricks SQL. While product-specific, it is a simple function reference without numeric limits, configuration parameters, or decision/troubleshooting structures required for any sub-skill type.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/convert_timezone,convert_timezone function,convert_timezone function - Azure Databricks - Databricks SQL,,Learn the syntax of the convert\_timezone function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 13.3 LTS and above ConvertsTIMESTAMP_NTZto another time zone. 
The input column is converted toTIMESTAMP_NTZtype before the time zone conversion, if the input column is ofTIMESTAMPorDATEorSTRINGtype.",2024-08-19T08:00:00.000Z,language-reference,,0.35,False,"convert_timezone; while TIMESTAMP_NTZ is Databricks-specific, the page is a basic function reference without detailed limits, configs, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/corr,corr aggregate function,corr aggregate function - Azure Databricks - Databricks SQL,Compute Pearson correlation with corr in Databricks SQL,Learn the syntax of the corr function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns Pearson coefficient of correlation between a group of number pairs.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.6,True,"Documents a Databricks SQL aggregate function for correlation; while conceptually known, its exact availability and usage in this product are API-specific.",unchanged @@ -3871,7 +3901,7 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/date_add,date_add (days) function,date_add (days) function - Azure Databricks - Databricks SQL,,Learn the syntax of the date\_add function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Returns the datenumDaysafterstartDate. 
To add units other than days, usedate_add(unit, value, expr).",2024-03-01T08:00:00.000Z,language-reference,,0.3,False,date_add(days) returns date offset by days; standard date arithmetic without Databricks-specific limits or config options.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/date_add3,date_add function,date_add function - Azure Databricks - Databricks SQL,,Learn the syntax of the date\_add function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 13.3 LTS and above Addsvalueandunitto a timestampexpr. This function is a synonym fortimestampaddfunction.,2024-04-18T08:00:00.000Z,language-reference,,0.3,False,"date_add(timestamp) synonym for timestampadd; function semantics only, no numeric limits, configuration tables, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/date_diff,date_diff function,date_diff function - Azure Databricks - Databricks SQL,,Learn the syntax of the date\_diff (timestamp) function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 13.3 LTS and above Returns the difference between two timestamps measured inunits.date_diff(timestamp) is a synonym fortimestampdifffunction.,2024-04-18T08:00:00.000Z,language-reference,,0.3,False,"date_diff(timestamp) synonym for timestampdiff; generic timestamp difference behavior, not expert configuration or limits.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/date_format,date_format function,date_format function - Azure Databricks - Databricks SQL,,Learn the syntax of the date\_format function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Converts a timestamp to a string in the formatfmt.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"Function reference for 
date_format; likely just syntax, arguments, and examples without limits, quotas, configuration tables, or decision/troubleshooting matrices.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/date_format,date_format function,date_format function - Azure Databricks - Databricks SQL,,Learn the syntax of the date\_format function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Converts a timestamp to a string in the formatfmt.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"Function reference for date_format; likely just syntax, arguments, and examples without limits, quotas, configuration tables, or decision/troubleshooting matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/date_from_unix_date,date_from_unix_date function,date_from_unix_date function - Azure Databricks - Databricks SQL,,Learn the syntax of the date\_from\_unix\_date function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Creates a date from the number of days since1970-01-01. 
This function is a synonym fordate_add(DATE'1970-01-01', days).",2024-03-01T08:00:00.000Z,language-reference,,0.3,False,"date_from_unix_date creates date from days since 1970-01-01; simple arithmetic and synonym info, no expert-only limits or configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/date_part,date_part function,date_part function - Azure Databricks - Databricks SQL,,Learn the syntax of the date\_part function of the SQL language in Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Extracts a part of the date, timestamp, or interval.",2026-01-20T08:00:00.000Z,language-reference,,0.4,False,"date_part function reference; standard SQL-style extraction semantics without Databricks-specific limits, configs, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/date_sub,date_sub function,date_sub function - Azure Databricks - Databricks SQL,,Learn the syntax of the date\_sub function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the datenumDaysbeforestartDate.,2024-03-01T08:00:00.000Z,language-reference,,0.25,False,"date_sub returns date offset by days; standard date arithmetic, no Databricks-specific limits or configuration matrices.",unchanged @@ -3885,18 +3915,18 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/dayofmonth,dayofmonth function,dayofmonth function - Azure Databricks - Databricks SQL,,Learn the syntax of the dayofmonth function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Returns the day of month of the date or timestamp. This function is a synonym forextract(DAY FROM expr). 
Note When extracting fields from aTIMESTAMP(TIMESTAMP_LTZ), the result is based on the session timezone.",2026-01-20T08:00:00.000Z,language-reference,,0.45,False,"dayofmonth function reference; similar to day, with timezone note but no expert-level limits, configs, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/dayofweek,dayofweek function,dayofweek function - Azure Databricks - Databricks SQL,,Learn the syntax of the dayofweek function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Returns the day of week of the date or timestamp. This function is a synonym forextract(DAYOFWEEK FROM expr). Note When extracting fields from aTIMESTAMP(TIMESTAMP_LTZ), the result is based on the session timezone.",2026-01-20T08:00:00.000Z,language-reference,,0.45,False,"dayofweek function reference; describes behavior and timezone dependence but lacks numeric limits, configuration matrices, or decision/troubleshooting structures.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/dayofyear,dayofyear function,dayofyear function - Azure Databricks - Databricks SQL,,Learn the syntax of the dayofyear function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Returns the day of year of the date or timestamp. This function is a synonym forextract(DAY FORM expr). 
Note When extracting fields from aTIMESTAMP(TIMESTAMP_LTZ), the result is based on the session timezone.",2026-01-20T08:00:00.000Z,language-reference,,0.45,False,"dayofyear function reference; basic syntax and behavior with timezone note, no product-specific limits, configuration parameters, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/decimal,decimal function,decimal function - Azure Databricks - Databricks SQL,,Learn the syntax of the decimal function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Casts the valueexprto DECIMAL. This function is a synonym forCAST(expr AS decimal(10, 0)). Seecastfunctionfor details.",2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"decimal casting function reference; describes syntax and behavior but not product-specific limits, quotas, or configuration parameters.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/decimal,decimal function,decimal function - Azure Databricks - Databricks SQL,,Learn the syntax of the decimal function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Casts the valueexprto DECIMAL. This function is a synonym forCAST(expr AS decimal(10, 0)). 
Seecastfunctionfor details.",2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"decimal casting function reference; describes syntax and behavior but not product-specific limits, quotas, or configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/decode,decode (key) function,decode (key) function - Azure Databricks - Databricks SQL,,Learn the syntax of the decode function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Returns the value matching the key.decodecomparesexprto eachkeyNin order and returns the correspondingvalueNfor the first match (like a key-value lookup or switch). If no key matches, it returnsdefValuewhen provided, otherwiseNULL.",2026-02-24T23:25:00.000Z,language-reference,,0.3,False,"decode(key) describes key-value style conditional logic; generic SQL-like behavior, not focused on Databricks-specific limits or configs.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/decode_cs,decode (character set) function,decode (character set) function - Azure Databricks - Databricks SQL,,Learn the syntax of the decode (character set) function of the SQL language in Databricks Runtime and Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime Translates binaryexprto a string using the character set encodingcharSet.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"decode (character set) function reference; explains translation from binary to string but not in the form of limits, configuration tables, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/decode_cs,decode (character set) function,decode (character set) function - Azure Databricks - Databricks SQL,,Learn the syntax of the decode (character set) function of the SQL language in Databricks Runtime and Databricks SQL.,Applies to:Databricks SQLDatabricks 
Runtime Translates binaryexprto a string using the character set encodingcharSet.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"decode (character set) function reference; explains translation from binary to string but not in the form of limits, configuration tables, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/degrees,degrees function,degrees function - Azure Databricks - Databricks SQL,,Learn the syntax of the degrees function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Converts radians to degrees.,2026-01-20T08:00:00.000Z,language-reference,,0.2,False,"Function reference for converting radians to degrees; standard SQL/math behavior without product-specific limits, configs, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/dense_rank,dense_rank ranking window function,dense_rank ranking window function - Azure Databricks - Databricks SQL,,Learn the syntax of the dense\_rank function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the rank of a value compared to all values in the partition.,2025-02-21T19:44:00.000Z,language-reference,,0.3,False,dense_rank window function semantics; generic SQL ranking behavior without Databricks-specific limits or troubleshooting mappings.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/div,div operator,div operator - Azure Databricks - Databricks SQL,,Learn the syntax of the div operator of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the integral part of the division ofdividendbydivisor.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"div operator reference; arithmetic behavior description without quotas, configuration options, or 
decision/troubleshooting content.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/div,div operator,div operator - Azure Databricks - Databricks SQL,,Learn the syntax of the div operator of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the integral part of the division ofdividendbydivisor.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"div operator reference; arithmetic behavior description without quotas, configuration options, or decision/troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/dotsign,. (dot sign) operator,. (dot sign) operator - Azure Databricks - Databricks SQL,,Learn the syntax of the . (dot sign) operator of the SQL language in Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime Returns afieldIdentifiervalue in anSTRUCTor a value bykeyIdentifierin aMAP.,2024-03-01T08:00:00.000Z,language-reference,,0.35,False,". (dot) operator for struct/map field access; language syntax reference, not limits, configuration, or troubleshooting guidance.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/double,double function,double function - Azure Databricks - Databricks SQL,,Learn the syntax of the double function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Casts the valueexprto DOUBLE. This function is a synonym forCAST(expr AS DOUBLE). 
Seecastfunctionfor details.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"double casting function reference; standard syntax/behavior, no product-specific limits, configuration parameters, or best-practice/troubleshooting structures.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/double,double function,double function - Azure Databricks - Databricks SQL,,Learn the syntax of the double function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Casts the valueexprto DOUBLE. This function is a synonym forCAST(expr AS DOUBLE). Seecastfunctionfor details.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"double casting function reference; standard syntax/behavior, no product-specific limits, configuration parameters, or best-practice/troubleshooting structures.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/e,e function,e function - Azure Databricks - Databricks SQL,,Learn the syntax of the e function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the constante.,2026-01-20T08:00:00.000Z,language-reference,,0.2,False,"Returns mathematical constant e; generic behavior, no product-specific constraints or configuration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/element_at,element_at function,element_at function - Azure Databricks - Databricks SQL,,Learn the syntax of the element\_at function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the element of anarrayExpratindex. 
Returns the value of mapExpr for key.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"element_at function reference; describes array/map element access but not expert-knowledge patterns like limits, quotas, or configuration matrices.",updated
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/element_at,element_at function,element_at function - Azure Databricks - Databricks SQL,,Learn the syntax of the element\_at function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime Returns the element of an arrayExpr at index. Returns the value of mapExpr for key.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"element_at function reference; describes array/map element access but not expert-knowledge patterns like limits, quotas, or configuration matrices.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/elt,elt function,elt function - Azure Databricks - Databricks SQL,,Learn the syntax of the elt function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime Returns the nth expression.,2024-03-01T08:00:00.000Z,language-reference,,0.0,False,Parse error: Expecting value: line 9 column 24 (char 1497),unchanged
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/encode,encode function,encode function - Azure Databricks - Databricks SQL,,Learn the syntax of the encode function of the SQL language in Databricks Runtime and Databricks SQL.,Applies to: Databricks SQL Databricks Runtime Returns the binary representation of a string using the charSet character encoding.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"encode function reference; focuses on syntax and behavior for character encoding without detailed limits, configuration tables, or troubleshooting mappings.",updated
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/encode,encode function,encode function - Azure Databricks - Databricks SQL,,Learn the syntax of the encode function of the SQL language in Databricks Runtime and Databricks SQL.,Applies to: Databricks SQL Databricks Runtime Returns the binary representation of a string using the charSet character encoding.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"encode function reference; focuses on syntax and behavior for character encoding without detailed limits, configuration tables, or troubleshooting mappings.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/endswith,endswith function,endswith function - Azure Databricks - Databricks SQL,,Learn the syntax of the endswith function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime 10.4 LTS and above The function operates in BINARY mode if both arguments are BINARY. Returns true if expr ends with endExpr.,2026-03-05T08:00:00.000Z,language-reference,,0.3,False,"String function endswith with note about BINARY mode; minor version applicability but no detailed limits, configs, or error mappings.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/eqeqsign,== (eq eq sign) operator,== (eq eq sign) operator - Azure Databricks - Databricks SQL,,Learn the syntax of the == (eq eq sign) operator of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Returns true if expr1 equals expr2, or false otherwise. This function is a synonym for = (eq sign) operator.
Use equal_null to treat NULL as a comparable value.",2025-01-31T16:09:00.000Z,language-reference,,0.0,False,Parse error: Expecting value: line 9 column 24 (char 1497),unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/eqsign,= (eq sign) operator,= (eq sign) operator - Azure Databricks - Databricks SQL,,Learn the syntax of the = (eq sign) operator of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Returns true if expr1 equals expr2, or false otherwise. This function is a synonym for == (eq eq sign) operator.",2025-01-31T16:09:00.000Z,language-reference,,0.0,False,Parse error: Expecting value: line 9 column 24 (char 1497),unchanged
@@ -3914,22 +3944,22 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/find_in_set,find_in_set function,find_in_set function - Azure Databricks - Databricks SQL,,Learn the syntax of the find\_in\_set function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime Returns the position of a string within a comma-separated list of strings.,2024-03-01T08:00:00.000Z,language-reference,,0.0,False,Parse error: Expecting value: line 9 column 24 (char 1497),unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/first,first aggregate function,first aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the first aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime Returns the first value of expr for a group of rows.
This function is a synonym for first_value aggregate function.,2024-11-20T08:00:00.000Z,language-reference,,0.0,False,Parse error: Expecting value: line 9 column 24 (char 1497),unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/first_value,first_value aggregate function,first_value aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the first\_value aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime Returns the first value of expr for a group of rows. This function is a synonym for first aggregate function.,2024-11-21T21:05:00.000Z,language-reference,,0.0,False,Parse error: Expecting value: line 9 column 24 (char 1497),unchanged
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/flatten,flatten function,flatten function - Azure Databricks - Databricks SQL,Use Databricks SQL flatten function on nested arrays,Learn the syntax of the flatten function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime Transforms an array of arrays into a single array.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference pages contain product-specific syntax, arguments, return types, and behavioral details for Databricks SQL that go beyond generic SQL knowledge, fitting the integrations/coding patterns category best.",updated
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/float,float function,float function - Azure Databricks - Databricks SQL,Cast values to FLOAT with Databricks SQL float,Learn the syntax of the float function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime Casts the value expr to FLOAT. This function is a synonym for CAST(expr AS FLOAT).
See cast function for details.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Documents exact Databricks SQL casting behavior and syntax for FLOAT, which is product-specific API/SQL surface detail appropriate for integrations/coding patterns.",updated
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/floor,floor function,floor function - Azure Databricks - Databricks SQL,Round numbers down using Databricks SQL floor,Learn the syntax of the floor function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime 10.4 LTS and above Returns the largest number not bigger than expr rounded down to targetScale digits relative to the decimal point.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Describes Databricks-specific SQL function syntax and semantics (including targetScale behavior) that are concrete API details, aligning with integrations/coding patterns.",updated
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/flatten,flatten function,flatten function - Azure Databricks - Databricks SQL,Use Databricks SQL flatten function on nested arrays,Learn the syntax of the flatten function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime Transforms an array of arrays into a single array.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference pages contain product-specific syntax, arguments, return types, and behavioral details for Databricks SQL that go beyond generic SQL knowledge, fitting the integrations/coding patterns category best.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/float,float function,float function - Azure Databricks - Databricks SQL,Cast values to FLOAT with Databricks SQL float,Learn the syntax of the float function of the SQL language in Databricks SQL and Databricks
Runtime.,Applies to: Databricks SQL Databricks Runtime Casts the value expr to FLOAT. This function is a synonym for CAST(expr AS FLOAT). See cast function for details.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Documents exact Databricks SQL casting behavior and syntax for FLOAT, which is product-specific API/SQL surface detail appropriate for integrations/coding patterns.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/floor,floor function,floor function - Azure Databricks - Databricks SQL,Round numbers down using Databricks SQL floor,Learn the syntax of the floor function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime 10.4 LTS and above Returns the largest number not bigger than expr rounded down to targetScale digits relative to the decimal point.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Describes Databricks-specific SQL function syntax and semantics (including targetScale behavior) that are concrete API details, aligning with integrations/coding patterns.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/forall,forall function,forall function - Azure Databricks - Databricks SQL,,Learn the syntax of the forall function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime Tests whether func holds for all elements in the array.,2024-03-01T08:00:00.000Z,language-reference,,0.0,False,Parse error: Expecting value: line 9 column 24 (char 1497),unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/format_number,format_number function,format_number function - Azure Databricks - Databricks SQL,,Learn the syntax of the format\_number function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Formats expr like #,###,###.##, rounded to scale decimal
places. Formats expr like fmt.",2026-01-20T08:00:00.000Z,language-reference,,0.25,False,"Number formatting function; likely just syntax and examples, no limits, configs, or error-code-based troubleshooting.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/format_string,format_string function,format_string function - Azure Databricks - Databricks SQL,,Learn the syntax of the format\_string function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Returns a formatted string from printf-style format strings. The function exploits the java.util.Formatter class with Locale.US. For details, see java.util.Formatter.",2024-07-18T17:47:00.000Z,language-reference,,0.0,False,Parse error: Expecting value: line 9 column 24 (char 1497),unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/from_avro,from_avro function,from_avro function - Azure Databricks - Databricks SQL,Use from_avro to parse Avro data in Databricks SQL,Learn the syntax of the from\_avro function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 16.0 and above Returns a struct value with the avroBin and jsonSchemaStr.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.7,True,"Databricks-specific function for decoding Avro with parameters (avroBin, jsonSchemaStr); represents a concrete integration pattern between Databricks SQL and Avro with product-specific function semantics.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/from_csv,from_csv function,from_csv function - Azure Databricks - Databricks SQL,Parse CSV strings with Databricks SQL from_csv,Learn the syntax of the from\_csv function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime Returns a struct value with
the csvStr and schema.,2026-04-13T08:00:00.000Z,language-reference,integrations,0.75,True,"Provides detailed syntax and parameter behavior for from_csv in Databricks SQL, including schema handling, which is product-specific integration/coding information.",updated
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/from_json,from_json function,from_json function - Azure Databricks - Databricks SQL,Parse JSON strings with Databricks SQL from_json,Learn the syntax of the from\_json function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime Returns a struct value with the jsonStr and schema.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,"Covers Databricks SQL from_json function syntax and behavior for converting JSON to structs, a concrete API surface used when integrating JSON data.",updated
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/from_csv,from_csv function,from_csv function - Azure Databricks - Databricks SQL,Parse CSV strings with Databricks SQL from_csv,Learn the syntax of the from\_csv function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime Returns a struct value with the csvStr and schema.,2026-04-13T08:00:00.000Z,language-reference,integrations,0.75,True,"Provides detailed syntax and parameter behavior for from_csv in Databricks SQL, including schema handling, which is product-specific integration/coding information.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/from_json,from_json function,from_json function - Azure Databricks - Databricks SQL,Parse JSON strings with Databricks SQL from_json,Learn the syntax of the from\_json function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime Returns a struct value with
the jsonStr and schema.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,"Covers Databricks SQL from_json function syntax and behavior for converting JSON to structs, a concrete API surface used when integrating JSON data.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/from_unixtime,from_unixtime function,from_unixtime function - Azure Databricks - Databricks SQL,,Learn the syntax of the from\_unixtime function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime Returns unixTime in fmt.,2025-05-09T19:58:00.000Z,language-reference,,0.0,False,Parse error: Expecting value: line 9 column 24 (char 1497),unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/from_utc_timestamp,from_utc_timestamp function,from_utc_timestamp function - Azure Databricks - Databricks SQL,,Learn the syntax of the from\_utc\_timestamp function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Returns the timestamp at timeZone for a timestamp expr at UTC. For a list of valid timezones, see List of tz database time zones.",2024-11-21T21:05:00.000Z,language-reference,,0.0,False,Parse error: Expecting value: line 9 column 24 (char 1497),unchanged
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/from_xml,from_xml function,from_xml function - Azure Databricks - Databricks SQL,Parse XML to structs with Databricks SQL from_xml,Learn the syntax of the from\_xml function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime 14.1 and above Important This feature is in Public Preview.
Returns a struct or a variant value parsed from the xmlStr using schema.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.8,True,"Describes a Databricks SQL public preview function with specific syntax, schema handling, and return types for XML parsing, which is specialized integration/coding knowledge.",updated
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/from_xml,from_xml function,from_xml function - Azure Databricks - Databricks SQL,Parse XML to structs with Databricks SQL from_xml,Learn the syntax of the from\_xml function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime 14.1 and above Important This feature is in Public Preview. Returns a struct or a variant value parsed from the xmlStr using schema.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.8,True,"Describes a Databricks SQL public preview function with specific syntax, schema handling, and return types for XML parsing, which is specialized integration/coding knowledge.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/get,get function,get function - Azure Databricks - Databricks SQL,,Learn the syntax of the get function of the SQL language in Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 11.3 LTS and above Returns the element of an arrayExpr at index, starting with 0.",2024-07-24T23:57:00.000Z,language-reference,,0.0,False,Parse error: Expecting value: line 9 column 24 (char 1497),unchanged
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/get_json_object,get_json_object function,get_json_object function - Azure Databricks - Databricks SQL,,Learn the syntax of the get\_json\_object function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime Extracts a JSON object from path.,2024-07-29T21:03:00.000Z,language-reference,,0.0,False,Parse error: Expecting 
value: line 9 column 24 (char 1497),unchanged
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/getbit,getbit function,getbit function - Azure Databricks - Databricks SQL,Read individual bits using Databricks SQL getbit,Learn the syntax of the bit\_get function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime 10.4 LTS and above Returns the value of a bit in a binary representation of an integral numeric. This function is a synonym of bit_get function.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Documents the getbit/bit_get function’s exact behavior and usage in Databricks SQL, which is concrete, product-specific function/API information suitable for integrations/coding patterns.",updated
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/get_json_object,get_json_object function,get_json_object function - Azure Databricks - Databricks SQL,Use get_json_object in Azure Databricks SQL queries,Learn the syntax of the get\_json\_object function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Extracts a JSON object from path. Tip: For new code, Azure Databricks recommends using the VARIANT data type with the : operator to query JSON data. VARIANT offers better read and write performance, case-sensitive field access, and clearer semantics than string-based JSON parsing. See How is variant different than JSON strings?.",2026-04-23T17:47:00.000Z,language-reference,integrations,0.7,True,"Function reference pages for Databricks SQL typically include product-specific syntax, arguments, return types, and behavior details (such as how JSON paths are interpreted, null/edge-case handling, and examples). 
These are concrete, code-level integration details for working with JSON in Databricks SQL that go beyond generic SQL knowledge, fitting the integrations & coding patterns category.",updated
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/getbit,getbit function,getbit function - Azure Databricks - Databricks SQL,Read individual bits using Databricks SQL getbit,Learn the syntax of the bit\_get function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime 10.4 LTS and above Returns the value of a bit in a binary representation of an integral numeric. This function is a synonym of bit_get function.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Documents the getbit/bit_get function’s exact behavior and usage in Databricks SQL, which is concrete, product-specific function/API information suitable for integrations/coding patterns.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/getdate,getdate function,getdate function - Azure Databricks - Databricks SQL,,Learn the syntax of getdate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime Returns the current timestamp at the start of query evaluation.
This function is a synonym for current_timestamp().,2024-03-01T08:00:00.000Z,language-reference,,0.0,False,Parse error: Expecting value: line 9 column 24 (char 1497),unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/greatest,greatest function,greatest function - Azure Databricks - Databricks SQL,,Learn the syntax of the greatest function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Returns the greatest value of all arguments, skipping null values.",2024-03-01T08:00:00.000Z,language-reference,,0.0,False,Parse error: Expecting value: line 9 column 24 (char 1497),unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/grouping,grouping function,grouping function - Azure Databricks - Databricks SQL,,Learn the syntax of the grouping function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Indicates whether a specified column in a GROUPING SET, ROLLUP, or CUBE represents a subtotal.",2024-03-01T08:00:00.000Z,language-reference,,0.0,False,Parse error: Expecting value: line 9 column 24 (char 1497),unchanged
@@ -3952,7 +3982,7 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_ispentagon,h3_ispentagon function,h3_ispentagon function - Azure Databricks - Databricks SQL,Detect pentagon H3 cells with h3_ispentagon in Databricks,Learn the syntax of the h3\_ispentagon function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime 11.3 LTS and above Returns true if the input BIGINT or hexadecimal STRING corresponds to a pentagonal H3 cell or not.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.76,True,Provides exact function behavior and typing for identifying pentagonal H3 cells in Databricks SQL.,unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_isvalid,h3_isvalid function,h3_isvalid function - Azure Databricks - Databricks SQL,Validate H3 cell IDs with h3_isvalid in Databricks,Learn the syntax of the h3\_isvalid function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime 11.3 LTS and above Returns true if the input BIGINT or STRING is a valid H3 cell ID.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.76,True,Function reference with concrete input types (BIGINT/STRING) and boolean validation semantics for H3 IDs.,unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_kring,h3_kring function,h3_kring function - Azure Databricks - Databricks SQL,Get H3 k-ring neighborhoods with h3_kring in Databricks,Learn the syntax of the h3\_kring function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime 11.3 LTS and above Returns the H3 cells that are within (grid) distance k of the origin cell.
The set of these H3 cells is called the k-ring of the origin cell.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.8,True,Documents Databricks SQL function returning all cells within grid distance k; includes array return and parameter behavior.,unchanged
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_kringdistances,h3_kringdistances function,h3_kringdistances function - Azure Databricks - Databricks SQL,,Learn the syntax of the h3\_kringdistances function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 11.3 LTS and above Returns all H3 cells (represented as long integers or strings) within grid distance k from the origin H3 cell, along with their distance from the origin H3 cell.",2026-04-13T08:00:00.000Z,language-reference,,0.2,False,"Function reference page likely focuses on syntax and semantics of h3_kringdistances without limits, quotas, configuration tables, or troubleshooting mappings. No clear evidence of product-specific numeric limits, config matrices, or error-code-based diagnosis.",updated
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_kringdistances,h3_kringdistances function,h3_kringdistances function - Azure Databricks - Databricks SQL,,Learn the syntax of the h3\_kringdistances function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 11.3 LTS and above Returns all H3 cells (represented as long integers or strings) within grid distance k from the origin H3 cell, along with their distance from the origin H3 cell.",2026-04-13T08:00:00.000Z,language-reference,,0.2,False,"Function reference page likely focuses on syntax and semantics of h3_kringdistances without limits, quotas, configuration tables, or troubleshooting mappings. 
No clear evidence of product-specific numeric limits, config matrices, or error-code-based diagnosis.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_longlatash3,h3_longlatash3 function,h3_longlatash3 function - Azure Databricks - Databricks SQL,Convert longitude/latitude to H3 BIGINT with h3_longlatash3,Learn the syntax of the h3\_longlatash3 function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime 11.3 LTS and above Returns the H3 cell ID (as a BIGINT) corresponding to the provided longitude and latitude at the specified resolution.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.82,True,Function reference mapping lon/lat and resolution to BIGINT H3 ID in Databricks SQL; concrete parameter contract.,unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_longlatash3string,h3_longlatash3string function,h3_longlatash3string function - Azure Databricks - Databricks SQL,Convert longitude/latitude to H3 string with h3_longlatash3string,Learn the syntax of the h3\_longlatash3string function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime 11.3 LTS and above Returns the H3 cell ID (as a hexadecimal STRING) corresponding to the provided longitude and latitude at the specified resolution.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.82,True,Documents Databricks SQL function returning hex STRING H3 IDs from coordinates; product-specific API details.,unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_maxchild,h3_maxchild function,h3_maxchild function - Azure Databricks - Databricks SQL,Get maximum child H3 cell with h3_maxchild in Databricks,Learn the syntax of the h3\_maxchild function of the SQL language in Databricks SQL and Databricks Runtime.,Returns the child of maximum value of the input H3 cell at the
specified resolution. Applies to: Databricks SQL preview Databricks Runtime 11.3 LTS and above,2026-01-20T08:00:00.000Z,language-reference,integrations,0.78,True,Describes preview Databricks SQL function semantics for selecting max-valued child at a resolution; specific API behavior.,unchanged
@@ -3979,11 +4009,11 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/hex,hex function,hex function - Azure Databricks - Databricks SQL,,Learn the syntax of the hex function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime Converts expr to hexadecimal.,2024-03-01T08:00:00.000Z,language-reference,,0.4,False,"HEX function reference; describes conversion behavior but not product-specific limits, quotas, or configuration parameters.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/histogram_numeric,histogram_numeric aggregate function,histogram_numeric aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the histogram\_numeric aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime 10.2 and above Computes a histogram on expr using numBins bins.,2025-05-09T19:58:00.000Z,language-reference,,0.4,False,"HISTOGRAM_NUMERIC aggregate function reference; explains purpose and arguments but no detailed limits, quotas, or config matrices.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/hll_sketch_agg,hll_sketch_agg aggregate function,hll_sketch_agg aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the hll\_sketch\_agg aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 13.3 LTS and above This function utilizes the HyperLogLog algorithm to count a
probabilistic approximation of the number of unique values in a given column, and outputs the result as a binary representation known as a sketch buffer.
-This binary representation is suitable for persistence. Queries can use the resulting buffers to compute approximate unique counts with the hll_sketch_estimate function. The hll_union and hll_union_agg functions can also combine sk",2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"HyperLogLog aggregate function reference (hll_sketch_agg) is primarily syntax/behavior description. The summary does not indicate presence of quotas, config parameter tables, security roles, or troubleshooting mappings required for expert-knowledge sub-skills.",updated
+This binary representation is suitable for persistence. Queries can use the resulting buffers to compute approximate unique counts with the hll_sketch_estimate function. The hll_union and hll_union_agg functions can also combine sk",2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"HyperLogLog aggregate function reference (hll_sketch_agg) is primarily syntax/behavior description. The summary does not indicate presence of quotas, config parameter tables, security roles, or troubleshooting mappings required for expert-knowledge sub-skills.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/hll_sketch_estimate,hll_sketch_estimate function,hll_sketch_estimate function - Azure Databricks - Databricks SQL,,Learn the syntax of the hll\_sketch\_estimate function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 13.3 LTS and above This function utilizes the HyperLogLog algorithm to count a probabilistic approximation of the number of unique values in a given column, consuming a binary representation known as a sketch buffer previously generated by the hll_sketch_agg function and returning the result as a big integer.
The hll_union and hll_union_agg functions can also combine sketches together by consuming and merging these buffers as inputs. The implementation uses th",2024-09-23T08:00:00.000Z,language-reference,,0.45,False,"HLL_SKETCH_ESTIMATE function reference; focuses on behavior and usage, not on quotas, limits, or product-specific configuration values.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/hll_union,hll_union function,hll_union function - Azure Databricks - Databricks SQL,,Learn the syntax of the hll\_union function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 13.3 LTS and above This function utilizes the HyperLogLog algorithm to combine two sketches into a single sketch. Queries can use the resulting buffers to compute approximate unique counts as long integers with the hll_sketch_estimate function. The implementation uses the Apache Datasketches library. Please see HLL for more information.,2024-09-23T08:00:00.000Z,language-reference,,0.45,False,"HLL_UNION function reference; describes combining sketches but lacks numeric limits, decision matrices, or config parameter tables.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/hll_union_agg,hll_union_agg function,hll_union_agg function - Azure Databricks - Databricks SQL,,Learn the syntax of the hll\_union\_agg aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 13.3 LTS and above This function utilizes the HyperLogLog algorithm to combine a group of sketches into a single one. Queries can use the resulting buffers to compute approximate unique counts with the hll_sketch_estimate function. The implementation uses the Apache Datasketches library.
Please see HLL for more information.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"hll_union_agg function reference appears to describe usage and algorithm (HyperLogLog) but not product-specific limits, configuration matrices, or error-code-based troubleshooting. Does not match any expert-knowledge sub-skill criteria.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/hll_union_agg,hll_union_agg function,hll_union_agg function - Azure Databricks - Databricks SQL,,Learn the syntax of the hll\_union\_agg aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 13.3 LTS and above This function utilizes the HyperLogLog algorithm to combine a group of sketches into a single one. Queries can use the resulting buffers to compute approximate unique counts with the hll_sketch_estimate function. The implementation uses the Apache Datasketches library. Please see HLL for more information.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"hll_union_agg function reference appears to describe usage and algorithm (HyperLogLog) but not product-specific limits, configuration matrices, or error-code-based troubleshooting. Does not match any expert-knowledge sub-skill criteria.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/hour,hour function,hour function - Azure Databricks - Databricks SQL,,Learn the syntax of the hour function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime Returns the hour component of a timestamp. This function is a synonym for extract(HOUR FROM expr).
Note When extracting fields from a TIMESTAMP (TIMESTAMP_LTZ), the result is based on the session timezone.",2026-01-20T08:00:00.000Z,language-reference,,0.3,False,"hour function reference; generic SQL behavior (extract hour) that an LLM already knows, with only a note about session timezone, not a detailed config or limit.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/http_request,http_request function,http_request function - Azure Databricks - Databricks SQL,,Learn the syntax of the http\_request function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 16.2 and above Makes an HTTP request using a defined HTTP connection. This function requires named parameter invocation.,2026-01-20T08:00:00.000Z,language-reference,,0.5,False,"http_request function reference; summary only indicates it makes an HTTP request using a defined connection and named parameters, without exposing detailed parameter tables, limits, or security configuration in the provided text.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/hypot,hypot function,hypot function - Azure Databricks - Databricks SQL,,Learn the syntax of the hypot function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns sqrt(expr1 * expr1 + expr2 * expr2).,2026-01-20T08:00:00.000Z,language-reference,,0.3,False,hypot function reference; standard mathematical function behavior that is generic and not product-specific expert knowledge.,unchanged @@ -4001,7 +4031,17 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/input_file_block_start,input_file_block_start function,input_file_block_start function - Azure Databricks - Databricks SQL,,Learn the syntax of the input\_file\_block\_start function of the SQL language
in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the start offset in bytes of the block being read. This function is not available on Unity Catalog. In Databricks SQL and Databricks Runtime 13.3 LTS and above this function is deprecated. Please use _metadata.file_block_start.,2024-04-18T08:00:00.000Z,language-reference,,0.55,False,"INPUT_FILE_BLOCK_START function reference; similar to index 18, mainly behavior and deprecation guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/input_file_name,input_file_name function,input_file_name function - Azure Databricks - Databricks SQL,,Learn the syntax of the input\_file\_name function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime Warning This function is not available on Unity Catalog. In Databricks Runtime 13.3 LTS and above this function is deprecated. Use _metadata.file_name. In Databricks SQL and Databricks Runtime 17.3 LTS and above this function is no longer supported. Use _metadata.file_name instead.
Returns the name of the file being read, or empty string if not available.",2026-01-20T08:00:00.000Z,language-reference,,0.55,False,"input_file_name function reference; summary includes deprecation and support-version notes but not detailed configuration parameters, limits, or decision matrices that would qualify as expert knowledge under the defined categories.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/instr,instr function,instr function - Azure Databricks - Databricks SQL,,Learn the syntax of the instr function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the (1-based) index of the first occurrence of substr in str.,2024-12-12T08:00:00.000Z,language-reference,,0.35,False,INSTR function reference; standard substring search semantics.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/int,int function,int function - Azure Databricks - Databricks SQL,,Learn the syntax of the int function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Casts the value expr to INTEGER. This function is a synonym for CAST(expr AS INTEGER).,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Page documents a simple SQL cast/int function; primarily syntax and basic behavior without product-specific limits, configuration tables, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/int,int function,int function - Azure Databricks - Databricks SQL,,Learn the syntax of the int function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Casts the value expr to INTEGER.
This function is a synonym for CAST(expr AS INTEGER).,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Page documents a simple SQL cast/int function; primarily syntax and basic behavior without product-specific limits, configuration tables, or troubleshooting mappings.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_as_binary,ip_as_binary function,ip_as_binary function - Azure Databricks - Databricks SQL,Use ip_as_binary in Databricks SQL queries,Learn the syntax of the ip\_as\_binary function of the SQL language in Databricks Runtime.,Applies to: Databricks Runtime 18.2 and above Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Returns the canonical binary representation of an IP address or CIDR block.,2026-04-20T17:34:00.000Z,language-reference,integrations,0.78,True,"Function reference pages for Databricks SQL typically include exact syntax, argument types, return types, and product-specific behavior for the ip_as_binary function, which are concrete API/SDK-style details beyond generic SQL knowledge and fit the integrations & coding patterns category.",new +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_as_string,ip_as_string function,ip_as_string function - Azure Databricks - Databricks SQL,Use ip_as_string in Databricks SQL queries,Learn the syntax of the ip\_as\_string function of the SQL language in Databricks Runtime.,Applies to: Databricks Runtime 18.2 and above Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews.
Returns the canonical string representation of an IP address or CIDR block.,2026-04-20T17:34:00.000Z,language-reference,integrations,0.78,True,"The page documents the ip_as_string Databricks SQL function with precise syntax and behavior for converting IPs/CIDRs to canonical strings, which is product-specific API surface detail appropriate for the integrations category.",new +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_cidr,ip_cidr function,ip_cidr function - Azure Databricks - Databricks SQL,Use ip_cidr in Databricks SQL for CIDR handling,Learn the syntax of the ip\_cidr function of the SQL language in Databricks Runtime.,Applies to: Databricks Runtime 18.2 and above Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Returns the canonical representation of an IPv4 or IPv6 CIDR block.,2026-04-20T17:34:00.000Z,language-reference,integrations,0.78,True,"Describes the ip_cidr function’s exact usage and behavior for IPv4/IPv6 CIDR canonicalization in Databricks SQL, representing concrete function-level integration details rather than general concepts.",new +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_cidr_contains,ip_cidr_contains function,ip_cidr_contains function - Azure Databricks - Databricks SQL,Use ip_cidr_contains in Databricks SQL filters,Learn the syntax of the ip\_cidr\_contains function of the SQL language in Databricks Runtime.,"Applies to: Databricks Runtime 18.2 and above Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews.
Returns TRUE if an IP address or CIDR block is contained within another CIDR block, FALSE otherwise.",2026-04-20T17:34:00.000Z,language-reference,integrations,0.78,True,"Provides specific syntax and semantics for the ip_cidr_contains function (TRUE/FALSE containment checks between IPs and CIDRs) in Databricks SQL, which is product-specific function behavior aligned with integrations & coding patterns.",new +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_host,ip_host function,ip_host function - Azure Databricks - Databricks SQL,Use ip_host in Databricks SQL for IP normalization,Learn the syntax of the ip\_host function of the SQL language in Databricks Runtime.,Applies to: Databricks Runtime 18.2 and above Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Returns the canonical representation of an IPv4 or IPv6 address.,2026-04-20T17:34:00.000Z,language-reference,integrations,0.78,True,"Documents the ip_host function’s exact behavior and usage for canonical IPv4/IPv6 address representation in Databricks SQL, which is detailed API-level knowledge suitable for the integrations category.",new +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_network,ip_network function,ip_network function - Azure Databricks - Databricks SQL,Use ip_network in Databricks SQL for network extraction,Learn the syntax of the ip\_network function of the SQL language in Databricks Runtime.,Applies to: Databricks Runtime 18.2 and above Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Returns the network portion of an IPv4 or IPv6 CIDR block in its canonical form.
This function is aliased by the ip_network_first function.,2026-04-20T17:34:00.000Z,language-reference,integrations,0.78,True,"Explains the ip_network function (and its alias behavior) with precise semantics for returning the network portion of CIDR blocks in Databricks SQL, representing concrete function configuration/usage details.",new +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_network_first,ip_network_first function,ip_network_first function - Azure Databricks - Databricks SQL,Use ip_network_first alias in Databricks SQL,Learn the syntax of the ip\_network\_first function of the SQL language in Databricks Runtime.,Applies to: Databricks Runtime 18.2 and above Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Returns the network portion of an IPv4 or IPv6 CIDR block in its canonical form. This function is an alias for the ip_network function.,2026-04-20T17:34:00.000Z,language-reference,integrations,0.78,True,"Clarifies that ip_network_first is an alias for ip_network and documents its exact behavior and usage, which is specific to Databricks SQL’s function surface and fits integrations & coding patterns.",new +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_network_last,ip_network_last function,ip_network_last function - Azure Databricks - Databricks SQL,Use ip_network_last in Databricks SQL for last IP,Learn the syntax of the ip\_network\_last function of the SQL language in Databricks Runtime.,Applies to: Databricks Runtime 18.2 and above Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews.
Returns the last address of an IPv4 or IPv6 CIDR block in its canonical form.,2026-04-20T17:34:00.000Z,language-reference,integrations,0.78,True,"Provides exact syntax and semantics for ip_network_last to return the last address in a CIDR block in Databricks SQL, which is detailed, product-specific function behavior appropriate for integrations.",new +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_prefix_length,ip_prefix_length function,ip_prefix_length function - Azure Databricks - Databricks SQL,Use ip_prefix_length in Databricks SQL for CIDR masks,Learn the syntax of the ip\_prefix\_length function of the SQL language in Databricks Runtime.,Applies to: Databricks Runtime 18.2 and above Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Returns the prefix length of an IPv4 or IPv6 CIDR block.,2026-04-20T17:34:00.000Z,language-reference,integrations,0.78,True,"Documents the ip_prefix_length function’s precise behavior for extracting prefix length from CIDR blocks in Databricks SQL, representing concrete API usage details rather than generic networking theory.",new +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_version,ip_version function,ip_version function - Azure Databricks - Databricks SQL,Use ip_version in Databricks SQL to detect IP type,Learn the syntax of the ip\_version function of the SQL language in Databricks Runtime.,Applies to: Databricks Runtime 18.2 and above Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews.
Returns the IP version (4 or 6) from an IPv4 or IPv6 address or CIDR block.,2026-04-20T17:34:00.000Z,language-reference,integrations,0.78,True,"Describes the ip_version function’s exact return values (4 or 6) and usage with IPv4/IPv6 addresses and CIDRs in Databricks SQL, which is specific function-level behavior aligned with integrations & coding patterns.",new https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/is_account_group_member,is_account_group_member function,is_account_group_member function - Azure Databricks - Databricks SQL,Check Databricks account-level group membership in SQL,Learn the syntax of the is\_account\_group\_member function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime (Unity Catalog only) Returns true if the session (connected) user is a direct or indirect member of the specified group at the account level.,2024-03-01T08:00:00.000Z,language-reference,security,0.7,True,"IS_ACCOUNT_GROUP_MEMBER function is specific to Databricks security; describes account-level group membership evaluation for the session user, which is product-specific IAM behavior.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/is_member,is_member function,is_member function - Azure Databricks - Databricks SQL,Evaluate Databricks workspace group membership in SQL,Learn the syntax of the is\_member function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns true if the session (connected) user is a direct or indirect member of the specified group if the specified group is a workspace local group or an account level group assigned to the workspace.
In most cases you should use the is_account_group_member function to test group membership at the account level.,2024-03-01T08:00:00.000Z,language-reference,security,0.7,True,"IS_MEMBER function describes workspace-local vs account-level groups and how membership is resolved, which is Databricks-specific security/IAM behavior.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/is_valid_utf8,is_valid_utf8 function,is_valid_utf8 function - Azure Databricks - Databricks SQL,,Learn the syntax of the is\_valid\_utf8 function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime 15.4 and above Returns true if the input is a valid UTF-8 string, otherwise returns false.",2025-04-30T00:43:00.000Z,language-reference,,0.4,False,IS_VALID_UTF8 function reference; simple validation behavior without product-specific configuration or limits.,unchanged @@ -4017,21 +4057,21 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/json_array_length,json_array_length function,json_array_length function - Azure Databricks - Databricks SQL,,Learn the syntax of the json\_array\_length function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the number of elements in the outermost JSON array.,2024-04-21T19:33:00.000Z,language-reference,,0.4,False,JSON_ARRAY_LENGTH function reference; standard JSON utility behavior.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/json_object_keys,json_object_keys function,json_object_keys function - Azure Databricks - Databricks SQL,,Learn the syntax of the json\_object\_keys function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns all the keys of the outermost JSON object as an
array.,2025-05-09T19:58:00.000Z,language-reference,,0.4,False,JSON_OBJECT_KEYS function reference; standard JSON key extraction behavior.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/json_tuple,json_tuple table-valued generator function,json_tuple table-valued generator function - Azure Databricks - Databricks SQL,,Learn the syntax of the json\_tuple function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns multiple JSON objects as a tuple.,2024-07-26T08:00:00.000Z,language-reference,,0.45,False,"JSON_TUPLE table-valued function; describes JSON parsing behavior but not limits, quotas, or configuration matrices.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_merge_agg_bigint,kll_merge_agg_bigint aggregate function,kll_merge_agg_bigint aggregate function - Azure Databricks - Databricks SQL,Use kll_merge_agg_bigint for KLL sketches,Learn the syntax of the kll\_merge\_agg\_bigint aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 18.0 and above Consumes multiple KLL (Karnin-Lang-Liberty) sketch buffers for approximate quantile estimation on integer data and merges them into one result buffer.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Aggregate function reference for merging bigint KLL sketch buffers, including exact function identifier and usage semantics.
This is specific to Databricks SQL/Runtime and aligns with integrations & coding patterns.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_merge_agg_double,kll_merge_agg_double aggregate function,kll_merge_agg_double aggregate function - Azure Databricks - Databricks SQL,Use kll_merge_agg_double for KLL sketches,Learn the syntax of the kll\_merge\_agg\_double aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 18.0 and above Consumes multiple KLL (Karnin-Lang-Liberty) sketch buffers for approximate quantile estimation on double-precision floating-point data and merges them into one result buffer.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,Documents the kll_merge_agg_double aggregate function with concrete syntax and behavior for double-precision data. This is product-specific API detail suitable for the integrations & coding patterns sub-skill.,updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_merge_agg_float,kll_merge_agg_float aggregate function,kll_merge_agg_float aggregate function - Azure Databricks - Databricks SQL,Use kll_merge_agg_float for KLL sketches,Learn the syntax of the kll\_merge\_agg\_float aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 18.0 and above Consumes multiple KLL (Karnin-Lang-Liberty) sketch buffers for approximate quantile estimation on single-precision floating-point data and merges them into one result buffer.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference for kll_merge_agg_float, including how it merges float KLL sketch buffers.
Contains specific function name and usage details, which are expert, product-specific integration patterns.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_merge_agg_bigint,kll_merge_agg_bigint aggregate function,kll_merge_agg_bigint aggregate function - Azure Databricks - Databricks SQL,Use kll_merge_agg_bigint for KLL sketches,Learn the syntax of the kll\_merge\_agg\_bigint aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 18.0 and above Consumes multiple KLL (Karnin-Lang-Liberty) sketch buffers for approximate quantile estimation on integer data and merges them into one result buffer.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Aggregate function reference for merging bigint KLL sketch buffers, including exact function identifier and usage semantics. This is specific to Databricks SQL/Runtime and aligns with integrations & coding patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_merge_agg_double,kll_merge_agg_double aggregate function,kll_merge_agg_double aggregate function - Azure Databricks - Databricks SQL,Use kll_merge_agg_double for KLL sketches,Learn the syntax of the kll\_merge\_agg\_double aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 18.0 and above Consumes multiple KLL (Karnin-Lang-Liberty) sketch buffers for approximate quantile estimation on double-precision floating-point data and merges them into one result buffer.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,Documents the kll_merge_agg_double aggregate function with concrete syntax and behavior for double-precision data.
This is product-specific API detail suitable for the integrations & coding patterns sub-skill.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_merge_agg_float,kll_merge_agg_float aggregate function,kll_merge_agg_float aggregate function - Azure Databricks - Databricks SQL,Use kll_merge_agg_float for KLL sketches,Learn the syntax of the kll\_merge\_agg\_float aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 18.0 and above Consumes multiple KLL (Karnin-Lang-Liberty) sketch buffers for approximate quantile estimation on single-precision floating-point data and merges them into one result buffer.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference for kll_merge_agg_float, including how it merges float KLL sketch buffers. Contains specific function name and usage details, which are expert, product-specific integration patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_agg_bigint,kll_sketch_agg_bigint aggregate function,kll_sketch_agg_bigint aggregate function - Azure Databricks - Databricks SQL,Use kll_sketch_agg_bigint in Databricks SQL,Learn the syntax of the kll\_sketch\_agg\_bigint function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.0 and later Creates a KLL (Karnin-Lang-Liberty) sketch for approximate quantile estimation on integer data with configurable accuracy.,2026-04-10T21:59:00.000Z,language-reference,integrations,0.7,True,"Function reference page with product-specific SQL syntax, arguments, return types, and behavior for kll_sketch_agg_bigint.
This is concrete API/SQL surface knowledge (parameters and usage) that an LLM is unlikely to fully know from training and fits the integrations & coding patterns category best.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_agg_double,kll_sketch_agg_double aggregate function,kll_sketch_agg_double aggregate function - Azure Databricks - Databricks SQL,Use kll_sketch_agg_double in Databricks SQL,Learn the syntax of the kll\_sketch\_agg\_double function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.0 and later Creates a KLL (Karnin-Lang-Liberty) sketch for approximate quantile estimation on double precision floating-point data with configurable accuracy.,2026-04-10T21:59:00.000Z,language-reference,integrations,0.7,True,"Documents exact SQL syntax and behavior for kll_sketch_agg_double, including product-specific function signature and usage details. This is concrete API-level knowledge, matching integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_agg_float,kll_sketch_agg_float aggregate function,kll_sketch_agg_float aggregate function - Azure Databricks - Databricks SQL,Use kll_sketch_agg_float in Databricks SQL,Learn the syntax of the kll\_sketch\_agg\_float function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.0 and later Creates a KLL (Karnin-Lang-Liberty) sketch for approximate quantile estimation on single-precision floating-point data with configurable accuracy.,2026-04-10T21:59:00.000Z,language-reference,integrations,0.7,True,"Provides detailed SQL function reference for kll_sketch_agg_float with product-specific syntax and semantics, which qualifies as expert integration/coding pattern knowledge.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_get_n_bigint,kll_sketch_get_n_bigint function,kll_sketch_get_n_bigint function - Azure Databricks - Databricks SQL,Query item count from bigint KLL sketch,Learn the syntax of the kll\_sketch\_get\_n\_bigint function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.0 and later Returns the number of items that have been added to an integer KLL sketch.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.72,True,"Describes kll_sketch_get_n_bigint function syntax and return behavior, which is Databricks-specific SQL function API knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_get_n_double,kll_sketch_get_n_double function,kll_sketch_get_n_double function - Azure Databricks - Databricks SQL,Query item count from double KLL sketch,Learn the syntax of the kll\_sketch\_get\_n\_double function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.0 and later Returns the number of items that have been added to a double KLL sketch.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.72,True,"Function reference for kll_sketch_get_n_double with exact usage and behavior, representing product-specific SQL API details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_get_n_float,kll_sketch_get_n_float function,kll_sketch_get_n_float function - Azure Databricks - Databricks SQL,Query item count from float KLL sketch,Learn the syntax of the kll\_sketch\_get\_n\_float function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.0 and later Returns the number of items that have been added to a float KLL sketch.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.72,True,"Documents kll_sketch_get_n_float function signature and
semantics in Databricks SQL, which is specific integration/API knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_get_quantile_bigint,kll_sketch_get_quantile_bigint function,kll_sketch_get_quantile_bigint function - Azure Databricks - Databricks SQL,,Learn the syntax of the kll\_sketch\_get\_quantile\_bigint function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.0 and later Estimates the value at a given quantile rank (or multiple ranks) from an integer KLL sketch.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"Function reference for kll_sketch_get_quantile_bigint; likely includes syntax and description but not limits, decision matrices, or detailed troubleshooting content.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_get_quantile_double,kll_sketch_get_quantile_double function,kll_sketch_get_quantile_double function - Azure Databricks - Databricks SQL,,Learn the syntax of the kll\_sketch\_get\_quantile\_double function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.0 and later Estimates the value at a given quantile rank (or multiple ranks) from a double KLL sketch.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"Function reference for kll_sketch_get_quantile_double; focused on syntax/usage, not on quotas, configuration matrices, or error-resolution mappings.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_get_quantile_float,kll_sketch_get_quantile_float function,kll_sketch_get_quantile_float function - Azure Databricks - Databricks SQL,,Learn the syntax of the kll\_sketch\_get\_quantile\_float function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.0 and later Estimates the value at a given quantile rank (or multiple ranks) from a
float KLL sketch.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"Function reference for kll_sketch_get_quantile_float; no indication of numeric limits, config parameter tables, or decision/troubleshooting frameworks.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_get_rank_bigint,kll_sketch_get_rank_bigint function,kll_sketch_get_rank_bigint function - Azure Databricks - Databricks SQL,,Learn the syntax of the kll\_sketch\_get\_rank\_bigint function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks Runtime 18.0 and later Estimates the normalized rank (0.0 to 1.0) of a given value in an integer KLL sketch.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"Function reference for kll_sketch_get_rank_bigint; describes estimating normalized rank but not in the form of limits, best practices, or troubleshooting mappings.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_get_rank_double,kll_sketch_get_rank_double function,kll_sketch_get_rank_double function - Azure Databricks - Databricks SQL,Use kll_sketch_get_rank_double in Databricks SQL,Learn the syntax of the kll\_sketch\_get\_rank\_double function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks Runtime 18.0 and later Estimates the normalized rank (0.0 to 1.0) of a given value in a double KLL sketch.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference page with product-specific SQL syntax, argument types, return type, and behavior for kll_sketch_get_rank_double. 
This is concrete API-level knowledge (exact function name, signature, and semantics) that an LLM is unlikely to fully know from training and fits best under integrations & coding patterns.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_get_rank_float,kll_sketch_get_rank_float function,kll_sketch_get_rank_float function - Azure Databricks - Databricks SQL,Use kll_sketch_get_rank_float in Databricks SQL,Learn the syntax of the kll\_sketch\_get\_rank\_float function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks Runtime 18.0 and later Estimates the normalized rank (0.0 to 1.0) of a given value in a float KLL sketch.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Similar to index 0 but for float KLL sketches; documents precise SQL function name, parameters, and behavior. This is detailed product-specific API usage, matching the integrations & coding patterns category.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_get_quantile_bigint,kll_sketch_get_quantile_bigint function,kll_sketch_get_quantile_bigint function - Azure Databricks - Databricks SQL,,Learn the syntax of the kll\_sketch\_get\_quantile\_bigint function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks Runtime 18.0 and later Estimates the value at a given quantile rank (or multiple ranks) from an integer KLL sketch.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"Function reference for kll_sketch_get_quantile_bigint; likely includes syntax and description but not limits, decision matrices, or detailed troubleshooting content.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_get_quantile_double,kll_sketch_get_quantile_double function,kll_sketch_get_quantile_double function - Azure Databricks - Databricks SQL,,Learn the syntax of the 
kll\_sketch\_get\_quantile\_double function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks Runtime 18.0 and later Estimates the value at a given quantile rank (or multiple ranks) from a double KLL sketch.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"Function reference for kll_sketch_get_quantile_double; focused on syntax/usage, not on quotas, configuration matrices, or error-resolution mappings.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_get_quantile_float,kll_sketch_get_quantile_float function,kll_sketch_get_quantile_float function - Azure Databricks - Databricks SQL,,Learn the syntax of the kll\_sketch\_get\_quantile\_float function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks Runtime 18.0 and later Estimates the value at a given quantile rank (or multiple ranks) from a float KLL sketch.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"Function reference for kll_sketch_get_quantile_float; no indication of numeric limits, config parameter tables, or decision/troubleshooting frameworks.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_get_rank_bigint,kll_sketch_get_rank_bigint function,kll_sketch_get_rank_bigint function - Azure Databricks - Databricks SQL,,Learn the syntax of the kll\_sketch\_get\_rank\_bigint function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks Runtime 18.0 and later Estimates the normalized rank (0.0 to 1.0) of a given value in an integer KLL sketch.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"Function reference for kll_sketch_get_rank_bigint; describes estimating normalized rank but not in the form of limits, best practices, or troubleshooting mappings.",unchanged 
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_get_rank_double,kll_sketch_get_rank_double function,kll_sketch_get_rank_double function - Azure Databricks - Databricks SQL,Use kll_sketch_get_rank_double in Databricks SQL,Learn the syntax of the kll\_sketch\_get\_rank\_double function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks Runtime 18.0 and later Estimates the normalized rank (0.0 to 1.0) of a given value in a double KLL sketch.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference page with product-specific SQL syntax, argument types, return type, and behavior for kll_sketch_get_rank_double. This is concrete API-level knowledge (exact function name, signature, and semantics) that an LLM is unlikely to fully know from training and fits best under integrations & coding patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_get_rank_float,kll_sketch_get_rank_float function,kll_sketch_get_rank_float function - Azure Databricks - Databricks SQL,Use kll_sketch_get_rank_float in Databricks SQL,Learn the syntax of the kll\_sketch\_get\_rank\_float function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks Runtime 18.0 and later Estimates the normalized rank (0.0 to 1.0) of a given value in a float KLL sketch.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Similar to index 0 but for float KLL sketches; documents precise SQL function name, parameters, and behavior. 
This is detailed product-specific API usage, matching the integrations & coding patterns category.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_merge_bigint,kll_sketch_merge_bigint function,kll_sketch_merge_bigint function - Azure Databricks - Databricks SQL,Merge bigint KLL sketches in Databricks SQL,Learn the syntax of the kll\_sketch\_merge\_bigint function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks Runtime 18.0 and later Merges two compatible integer KLL sketches into a single sketch.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.78,True,"Explains kll_sketch_merge_bigint function syntax and compatibility requirements for merging sketches, which are Databricks-specific API semantics.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_merge_double,kll_sketch_merge_double function,kll_sketch_merge_double function - Azure Databricks - Databricks SQL,Merge double KLL sketches in Databricks SQL,Learn the syntax of the kll\_sketch\_merge\_double function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks Runtime 18.0 and later Merges two compatible double KLL sketches into a single sketch.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.78,True,"Provides exact usage details for kll_sketch_merge_double, including how sketches are combined, which is product-specific function behavior.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_merge_float,kll_sketch_merge_float function,kll_sketch_merge_float function - Azure Databricks - Databricks SQL,Merge float KLL sketches in Databricks SQL,Learn the syntax of the kll\_sketch\_merge\_float function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks Runtime 18.0 and later Merges two compatible float KLL sketches into a single 
sketch.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.78,True,"Documents kll_sketch_merge_float function contract and constraints, representing Databricks SQL-specific integration details.",unchanged @@ -4068,44 +4108,44 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ltrim,ltrim function,ltrim function - Azure Databricks - Databricks SQL,,Learn the syntax of the ltrim function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returnsstrwith leading characters withintrimStrremoved.,2024-12-12T08:00:00.000Z,language-reference,,0.4,False,LTRIM string function behavior; standard SQL-style function without Databricks-specific constraints or configuration tables.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ltsign,< (lt sign) operator,< (lt sign) operator - Azure Databricks - Databricks SQL,,Learn the syntax of the \< (lt sign) operator of the SQL language in Databricks SQL.,"Applies to:Databricks SQLDatabricks Runtime Returnstrueifexpr1is less thanexpr2, orfalseotherwise.",2025-01-30T08:00:00.000Z,language-reference,,0.3,False,"Basic < comparison operator; generic SQL behavior, no expert Databricks-specific content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/luhn_check,luhn_check function,luhn_check function - Azure Databricks - Databricks SQL,,Learn the syntax of the luhn\_check function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 13.3 LTS and above ReturnstrueifnumStrpasses theLuhn algorithmcheck. 
The Luhn algorithm is used for example, to validate creditcard numbers.",2025-05-09T19:58:00.000Z,language-reference,,0.5,False,"LUHN_CHECK function description; while it notes version applicability, it is a straightforward algorithmic check without limits, configs, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/make_date,make_date function,make_date function - Azure Databricks - Databricks SQL,Use make_date in Databricks SQL queries,Learn the syntax of the make\_date function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Creates a date fromyear,month, anddayfields.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference page with Databricks-specific SQL syntax, arguments, return types, and examples that go beyond generic SQL knowledge.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/make_dt_interval,make_dt_interval function,make_dt_interval function - Azure Databricks - Databricks SQL,Build day-time intervals with make_dt_interval,Learn the syntax of the make\_dt\_interval function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Creates an interval fromdays,hours,minsandsecs.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Documents a Databricks-specific SQL function including version applicability and parameter behavior, which is product-specific API knowledge.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/make_interval,make_interval function,make_interval function - Azure Databricks - Databricks SQL,Create intervals with deprecated make_interval,Learn the syntax of the make\_interval function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Creates an interval 
fromyears,months,weeks,days,hours,minsandsecs. Warning This constructor is deprecated since it generates anINTERVALwhich cannot be compared or operated upon. Please usemake_ym_intervalormake_dt_intervalto produce intervals.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,"Details a deprecated Databricks SQL interval constructor and its behavioral caveats, including guidance to use alternative functions—product-specific API behavior.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/make_timestamp,make_timestamp function,make_timestamp function - Azure Databricks - Databricks SQL,Construct timestamps using make_timestamp,Learn the syntax of the make\_timestamp function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Creates a timestamp fromyear,month,day,hour,min,sec, andtimezonefields.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Provides Databricks SQL-specific function signature and behavior for timestamp construction, including timezone handling.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/make_date,make_date function,make_date function - Azure Databricks - Databricks SQL,Use make_date in Databricks SQL queries,Learn the syntax of the make\_date function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Creates a date fromyear,month, anddayfields.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference page with Databricks-specific SQL syntax, arguments, return types, and examples that go beyond generic SQL knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/make_dt_interval,make_dt_interval function,make_dt_interval function - Azure Databricks - Databricks SQL,Build day-time intervals with make_dt_interval,Learn the syntax of the 
make\_dt\_interval function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Creates an interval fromdays,hours,minsandsecs.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Documents a Databricks-specific SQL function including version applicability and parameter behavior, which is product-specific API knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/make_interval,make_interval function,make_interval function - Azure Databricks - Databricks SQL,Create intervals with deprecated make_interval,Learn the syntax of the make\_interval function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Creates an interval fromyears,months,weeks,days,hours,minsandsecs. Warning This constructor is deprecated since it generates anINTERVALwhich cannot be compared or operated upon. Please usemake_ym_intervalormake_dt_intervalto produce intervals.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,"Details a deprecated Databricks SQL interval constructor and its behavioral caveats, including guidance to use alternative functions—product-specific API behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/make_timestamp,make_timestamp function,make_timestamp function - Azure Databricks - Databricks SQL,Construct timestamps using make_timestamp,Learn the syntax of the make\_timestamp function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Creates a timestamp fromyear,month,day,hour,min,sec, andtimezonefields.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Provides Databricks SQL-specific function signature and behavior for timestamp construction, including timezone handling.",unchanged 
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/make_valid_utf8,make_valid_utf8 function,make_valid_utf8 function - Azure Databricks - Databricks SQL,,Learn the syntax of the make\_valid\_utf8 function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 15.4 and above Returns a string in which all invalid UTF-8 byte sequences instrExpr, are replaced by the Unicode replacement character (U+FFFD).",2025-04-30T00:43:00.000Z,language-reference,,0.55,False,"MAKE_VALID_UTF8 function; describes replacing invalid UTF-8 sequences but lacks numeric limits, config tables, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/make_ym_interval,make_ym_interval function,make_ym_interval function - Azure Databricks - Databricks SQL,Create year-month intervals with make_ym_interval,Learn the syntax of the make\_ym\_interval function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Creates an year-month interval fromyearsandmonths.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Describes a Databricks-only SQL function with version constraints and parameter semantics, which are product-specific.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/map,map function,map function - Azure Databricks - Databricks SQL,Build map literals with Databricks SQL map,Learn the syntax of the map function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Creates a map with the specified key-value pairs.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference for Databricks SQL map construction, including argument rules and behavior, which is specific to this engine.",updated 
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/map_concat,map_concat function,map_concat function - Azure Databricks - Databricks SQL,Merge maps using map_concat in Databricks SQL,Learn the syntax of the map\_concat function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the union of allexprmap expressions.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Documents Databricks SQL-specific behavior for combining maps, including how key collisions are handled—API-level detail.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/make_ym_interval,make_ym_interval function,make_ym_interval function - Azure Databricks - Databricks SQL,Create year-month intervals with make_ym_interval,Learn the syntax of the make\_ym\_interval function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Creates an year-month interval fromyearsandmonths.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Describes a Databricks-only SQL function with version constraints and parameter semantics, which are product-specific.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/map,map function,map function - Azure Databricks - Databricks SQL,Build map literals with Databricks SQL map,Learn the syntax of the map function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Creates a map with the specified key-value pairs.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference for Databricks SQL map construction, including argument rules and behavior, which is specific to this engine.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/map_concat,map_concat function,map_concat function - 
Azure Databricks - Databricks SQL,Merge maps using map_concat in Databricks SQL,Learn the syntax of the map\_concat function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the union of allexprmap expressions.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Documents Databricks SQL-specific behavior for combining maps, including how key collisions are handled—API-level detail.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/map_contains_key,map_contains_key function,map_contains_key function - Azure Databricks - Databricks SQL,,Learn the syntax of the array\_contains function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Returns true ifmapcontainskey.,2024-04-21T19:33:00.000Z,language-reference,,0.35,False,"MAP_CONTAINS_KEY function; simple predicate semantics, no expert-level Databricks-specific content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/map_entries,map_entries function,map_entries function - Azure Databricks - Databricks SQL,,Learn the syntax of the map\_entries function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns an unordered array of all entries inmap.,2024-03-01T08:00:00.000Z,language-reference,,0.35,False,"MAP_ENTRIES function; returns entries as array, but only generic behavior is described.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/map_filter,map_filter function,map_filter function - Azure Databricks - Databricks SQL,,Learn the syntax of the map\_filter function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Filters entries in the map inexprusing the 
functionfunc.,2024-03-01T08:00:00.000Z,language-reference,,0.45,False,"MAP_FILTER function; describes functional-style filtering but no product-specific limits, configs, or error codes.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/map_from_arrays,map_from_arrays function,map_from_arrays function - Azure Databricks - Databricks SQL,Create maps from arrays with map_from_arrays,Learn the syntax of the map\_from\_arrays function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Creates a map with a pair of thekeysandvaluesarrays.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Explains Databricks SQL function semantics for turning key/value arrays into maps, including constraints on array lengths and types.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/map_from_entries,map_from_entries function,map_from_entries function - Azure Databricks - Databricks SQL,Build maps from entries using map_from_entries,Learn the syntax of the map\_from\_entries function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Creates a map created from the specified array of entries.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Provides Databricks SQL-specific function details for constructing maps from entry arrays, which is product-specific API knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/map_from_arrays,map_from_arrays function,map_from_arrays function - Azure Databricks - Databricks SQL,Create maps from arrays with map_from_arrays,Learn the syntax of the map\_from\_arrays function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Creates a map with a pair of 
thekeysandvaluesarrays.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Explains Databricks SQL function semantics for turning key/value arrays into maps, including constraints on array lengths and types.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/map_from_entries,map_from_entries function,map_from_entries function - Azure Databricks - Databricks SQL,Build maps from entries using map_from_entries,Learn the syntax of the map\_from\_entries function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Creates a map created from the specified array of entries.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Provides Databricks SQL-specific function details for constructing maps from entry arrays, which is product-specific API knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/map_keys,map_keys function,map_keys function - Azure Databricks - Databricks SQL,,Learn the syntax of the map\_keys function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns an unordered array containing the keys ofmap.,2024-03-01T08:00:00.000Z,language-reference,,0.35,False,"MAP_KEYS function; returns keys of a map, generic behavior without limits or configs.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/map_values,map_values function,map_values function - Azure Databricks - Databricks SQL,,Learn the syntax of the map\_values function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns an unordered array containing the values ofmap.,2024-03-01T08:00:00.000Z,language-reference,,0.35,False,"MAP_VALUES function; returns values of a map, generic SQL-style function.",unchanged 
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/map_zip_with,map_zip_with function,map_zip_with function - Azure Databricks - Databricks SQL,,Learn the syntax of the map\_zip\_with function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Mergesmap1andmap2into a single map.,2024-03-01T08:00:00.000Z,language-reference,,0.4,False,"MAP_ZIP_WITH function; describes merging maps but no numeric limits, configuration options, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/mask,mask function,mask function - Azure Databricks - Databricks SQL,Mask sensitive strings with Databricks SQL mask,Learn the syntax of the mask function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 12.2 LTS and above Returns a masked version of the inputstr. In Databricks SQL and Databricks Runtime 13.3 LTS and above this function supportsnamed parameter invocation.,2026-04-13T21:52:00.000Z,language-reference,security,0.65,True,"Function reference for a data-masking operation used for security/privacy; includes version-specific behavior and named-parameter support, fitting security-focused configuration/usage.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/mask,mask function,mask function - Azure Databricks - Databricks SQL,Mask sensitive strings with Databricks SQL mask,Learn the syntax of the mask function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 12.2 LTS and above Returns a masked version of the inputstr. 
In Databricks SQL and Databricks Runtime 13.3 LTS and above this function supportsnamed parameter invocation.,2026-04-13T21:52:00.000Z,language-reference,security,0.65,True,"Function reference for a data-masking operation used for security/privacy; includes version-specific behavior and named-parameter support, fitting security-focused configuration/usage.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/max,max aggregate function,max aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the max function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the maximum value ofexprin a group.,2024-12-13T19:08:00.000Z,language-reference,,0.3,False,"MAX aggregate function; generic SQL aggregation semantics, no Databricks-specific constraints or configs.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/max_by,max_by aggregate function,max_by aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the max\_by function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Returns the value of an expression associated with the largest value of a second expression in a group. 
With the optional third argument, returns an array of up tolimitvalues corresponding to the largest values of the ordering expression.",2026-03-05T20:31:00.000Z,language-reference,,0.45,False,"MAX_BY aggregate function; describes behavior including optional limit argument, but no numeric system limits or configuration tables.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/md5,md5 function,md5 function - Azure Databricks - Databricks SQL,,Learn the syntax of the md5 function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns an MD5 128-bit checksum ofexpras a hex string.,2024-03-01T08:00:00.000Z,language-reference,,0.4,False,"MD5 function; standard checksum behavior, no Databricks-specific expert configuration or limits.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/mean,mean aggregate function,mean aggregate function - Azure Databricks - Databricks SQL,Calculate averages with mean aggregate function,Learn the syntax of the mean function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the mean calculated from values of a group. This function is a synonym foravgaggregate function.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Documents Databricks SQL aggregate function semantics and its synonymy with avg, which is specific to this SQL dialect.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/mean,mean aggregate function,mean aggregate function - Azure Databricks - Databricks SQL,Calculate averages with mean aggregate function,Learn the syntax of the mean function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the mean calculated from values of a group. 
This function is a synonym foravgaggregate function.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Documents Databricks SQL aggregate function semantics and its synonymy with avg, which is specific to this SQL dialect.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/measure,measure aggregate function,measure aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the measure expression of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 16.4 and above Returns themeasure_columnaggregated from the values of a group. Unlike a regular aggregate function likeSUM,AVG, orCOUNT, theMEASUREfunction does not specify the aggregation. It inherits the definition of the aggregation from themetric view definition. Using a metric view with measures is superior to regular views because it abstracts the complexity of the underlying aggregations while giving the invoker the freedom to choose the groupin",2026-01-20T08:00:00.000Z,language-reference,,0.3,False,"measure aggregate function tied to metric views is somewhat Databricks-specific, but the summary indicates conceptual behavior rather than detailed configuration tables, limits, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/median,median aggregate function,median aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the median function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returns the median calculated from values of a group.,2026-01-20T08:00:00.000Z,language-reference,,0.2,False,"median aggregate function reference; describes standard aggregation behavior without product-specific limits, quotas, or configuration details.",unchanged 
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/min,min aggregate function,min aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the min function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns the minimum value of expr in a group.,2024-12-13T19:08:00.000Z,language-reference,,0.3,False,MIN aggregate function; standard SQL aggregation semantics.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/min_by,min_by aggregate function,min_by aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the min\_by function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL, Databricks Runtime Returns the value of an expression associated with the smallest value of a second expression in a group. With the optional third argument, returns an array of up to limit values corresponding to the smallest values of the ordering expression.
Semantics are the same as max_by aggregate function with opposite ordering.",2026-03-05T20:31:00.000Z,language-reference,,0.45,False,"MIN_BY aggregate function; behavior mirrors MAX_BY with opposite ordering, but no numeric system limits or configuration tables.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/minussign,- (minus sign) operator,- (minus sign) operator - Azure Databricks - Databricks SQL,Use binary minus operator in Databricks SQL,Learn the syntax of the - (minus sign) operator of the SQL language in Databricks SQL.,Applies to: Databricks SQL, Databricks Runtime Returns the subtraction of expr2 from expr1.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.6,True,"Operator reference for Databricks SQL subtraction, including type and null-handling semantics that are engine-specific.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/minussignunary,- (minus sign) unary operator,- (minus sign) unary operator - Azure Databricks - Databricks SQL,Use unary minus operator in Databricks SQL,Learn the syntax of the - (minus sign) unary operator of the SQL language in Databricks SQL.,Returns the negated value of expr.
This function is a synonym for negative function.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.6,True,Describes Databricks SQL unary negation and its synonym relationship with the negative function—dialect-specific operator behavior.,updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/minussign,- (minus sign) operator,- (minus sign) operator - Azure Databricks - Databricks SQL,Use binary minus operator in Databricks SQL,Learn the syntax of the - (minus sign) operator of the SQL language in Databricks SQL.,Applies to: Databricks SQL, Databricks Runtime Returns the subtraction of expr2 from expr1.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.6,True,"Operator reference for Databricks SQL subtraction, including type and null-handling semantics that are engine-specific.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/minussignunary,- (minus sign) unary operator,- (minus sign) unary operator - Azure Databricks - Databricks SQL,Use unary minus operator in Databricks SQL,Learn the syntax of the - (minus sign) unary operator of the SQL language in Databricks SQL.,Returns the negated value of expr. This function is a synonym for negative function.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.6,True,Describes Databricks SQL unary negation and its synonym relationship with the negative function—dialect-specific operator behavior.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/minute,minute function,minute function - Azure Databricks - Databricks SQL,,Learn the syntax of the minute function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL, Databricks Runtime Returns the minute component of the timestamp in expr. This function is a synonym for extract(MINUTES FROM expr).
Note When extracting fields from a TIMESTAMP (TIMESTAMP_LTZ), the result is based on the session timezone.",2026-01-20T08:00:00.000Z,language-reference,,0.2,False,"Function reference for extracting minute from timestamps; primarily syntax/behavior description without limits, quotas, configuration tables, or product-specific thresholds.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/mod,mod function,mod function - Azure Databricks - Databricks SQL,Compute remainders with mod in Databricks SQL,Learn the syntax of the mod function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns the remainder after dividend / divisor. This function is equivalent to the % (percent sign) operator.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Function reference for Databricks SQL mod, including equivalence to % operator and any edge-case behavior, which is product-specific.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/mod,mod function,mod function - Azure Databricks - Databricks SQL,Compute remainders with mod in Databricks SQL,Learn the syntax of the mod function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns the remainder after dividend / divisor.
This function is equivalent to the % (percent sign) operator.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Function reference for Databricks SQL mod, including equivalence to % operator and any edge-case behavior, which is product-specific.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/mode,mode aggregate function,mode aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the mode function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL, Databricks Runtime 11.3 LTS and above Returns the most frequent, not NULL, value of expr in a group. mode is a non-deterministic function unless deterministic is set to true.",2026-01-20T08:00:00.000Z,language-reference,,0.3,False,"Aggregate function reference for mode; mentions non-determinism flag but likely just syntax/behavior, not detailed configuration tables or thresholds.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/monotonically_increasing_id,monotonically_increasing_id function,monotonically_increasing_id function - Azure Databricks - Databricks SQL,,Learn the syntax of the monotonically\_increasing\_id function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns monotonically increasing 64-bit integers.,2026-01-20T08:00:00.000Z,language-reference,,0.3,False,"monotonically_increasing_id function reference; describes return type and behavior but not quotas, config matrices, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/month,month function,month function - Azure Databricks - Databricks SQL,,Learn the syntax of the month function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL, Databricks Runtime Returns the month component of the timestamp in expr.
This function is a synonym for extract(MONTH FROM expr). Note When extracting fields from a TIMESTAMP (TIMESTAMP_LTZ), the result is based on the session timezone.",2026-01-20T08:00:00.000Z,language-reference,,0.2,False,"month extraction function; standard SQL-style behavior with a note about session timezone, but no detailed product-specific limits or configs.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/months_between,months_between function,months_between function - Azure Databricks - Databricks SQL,,Learn the syntax of the months\_between function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns the number of months elapsed between dates or timestamps in expr1 and expr2.,2026-01-20T08:00:00.000Z,language-reference,,0.2,False,"months_between function; describes return value semantics only, without numeric limits, configuration parameters, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/named_struct,named_struct function,named_struct function - Azure Databricks - Databricks SQL,,Learn the syntax of the named\_struct function of the SQL language in Azure Databricks.,Applies to: Databricks SQL, Databricks Runtime Creates a struct with the specified field names and values.,2024-03-01T08:00:00.000Z,language-reference,,0.3,False,"named_struct SQL function reference; describes creating a struct with given field names and values, but no product-specific limits, configs, or decision criteria.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/nanvl,nanvl function,nanvl function - Azure Databricks - Databricks SQL,,Learn the syntax of the nanvl function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL, Databricks Runtime Returns expr1 if it's not NaN, or expr2 otherwise.",2026-01-20T08:00:00.000Z,language-reference,,0.2,False,"nanvl
function; simple conditional behavior around NaN, no expert-only configuration, limits, or troubleshooting content.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/negative,negative function,negative function - Azure Databricks - Databricks SQL,Negate numeric values with negative function,Learn the syntax of the negative function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns the negated value of expr. This function is a synonym for - (minus sign) unary operator.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.6,True,"Documents a Databricks SQL function and its synonymy with unary minus, which is specific to this SQL implementation.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/next_day,next_day function,next_day function - Azure Databricks - Databricks SQL,Find next weekday date with next_day function,Learn the syntax of the next\_day function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns the first date which is later than expr and named as in dayOfWeek.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Provides Databricks SQL-specific semantics for computing the next occurrence of a weekday from a date expression, including accepted weekday formats.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/negative,negative function,negative function - Azure Databricks - Databricks SQL,Negate numeric values with negative function,Learn the syntax of the negative function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns the negated value of expr.
This function is a synonym for - (minus sign) unary operator.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.6,True,"Documents a Databricks SQL function and its synonymy with unary minus, which is specific to this SQL implementation.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/next_day,next_day function,next_day function - Azure Databricks - Databricks SQL,Find next weekday date with next_day function,Learn the syntax of the next\_day function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns the first date which is later than expr and named as in dayOfWeek.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Provides Databricks SQL-specific semantics for computing the next occurrence of a weekday from a date expression, including accepted weekday formats.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/not,not operator,not operator - Azure Databricks - Databricks SQL,,Learn the syntax of the not function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns logical negation of the argument.
This operator is an alias for ! (bang sign) operator.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,not logical operator reference; generic SQL behavior without product-specific constraints or configuration.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/now,now function,now function - Azure Databricks - Databricks SQL,,Learn the syntax of the now function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns the current timestamp at the start of query evaluation.,2024-03-01T08:00:00.000Z,language-reference,,0.4,False,"now timestamp function; while it notes evaluation at start of query, this is standard behavior and not framed as limits, configs, or best-practices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/nth_value,nth_value analytic window function,nth_value analytic window function - Azure Databricks - Databricks SQL,,Learn the syntax of the nth\_value window function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns the value at a specific offset in the window.,2024-03-01T08:00:00.000Z,language-reference,,0.4,False,nth_value analytic window function reference; describes generic window semantics without Databricks-specific limits or configuration parameters.,unchanged @@ -4117,21 +4157,21 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/octet_length,octet_length function,octet_length function - Azure Databricks - Databricks SQL,,Learn the syntax of the octet\_length function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns the byte length of string data or number of bytes of binary data.,2024-03-01T08:00:00.000Z,language-reference,,0.3,False,"octet_length function
reference; generic byte-length behavior, no Databricks-specific constraints or configs.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/or,or operator,or operator - Azure Databricks - Databricks SQL,,Learn the syntax of the or operator of the SQL language in Databricks SQL.,Applies to: Databricks SQL, Databricks Runtime Returns the logical OR of expr1 and expr2.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,or logical operator reference; standard SQL semantics without product-specific expert guidance.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/overlay,overlay function,overlay function - Azure Databricks - Databricks SQL,,Learn the syntax of the overlay function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Replaces input with replace that starts at pos and is of length len.,2024-05-24T08:00:00.000Z,language-reference,,0.35,False,"overlay string function; describes replacement semantics but no numeric limits, config options, or Databricks-only behaviors.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/parse_json,parse_json function,parse_json function - Azure Databricks - Databricks SQL,Use parse_json in Databricks SQL queries,Learn the syntax of the parse\_json function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime 15.3 and above Returns a VARIANT value from the jsonStr.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference page with product-specific syntax, return type (VARIANT), and behavior details for parse_json that go beyond generic SQL knowledge.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/parse_timestamp,parse_timestamp function,parse_timestamp function - Azure Databricks - Databricks SQL,Use parse_timestamp in Databricks SQL
queries,Learn the syntax of the parse\_timestamp function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks Runtime 18.1 and above If expr is a string, parses it into a TIMESTAMP according to the first matching pattern in the given list of formats. One or more of the formats can reference a predefined list of formats. If expr is a numeric type, parses it as a Unix timestamp.",2026-04-13T08:00:00.000Z,language-reference,integrations,0.7,True,"Function reference page with product-specific syntax, arguments, and behavior for parse_timestamp, including how string and numeric inputs are interpreted and supported format patterns—details that go beyond generic SQL knowledge.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/parse_url,parse_url function,parse_url function - Azure Databricks - Databricks SQL,Extract URL components with parse_url in Databricks,Learn the syntax of the parse\_url function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Extracts a part from url.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Documents Databricks-specific parse_url function syntax and supported URL parts, which are product-specific API details rather than generic concepts.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/parse_json,parse_json function,parse_json function - Azure Databricks - Databricks SQL,Use parse_json in Databricks SQL queries,Learn the syntax of the parse\_json function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime 15.3 and above Returns a VARIANT value from the jsonStr.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference page with product-specific syntax, return type (VARIANT), and behavior details for parse_json that go beyond generic SQL knowledge.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/parse_timestamp,parse_timestamp function,parse_timestamp function - Azure Databricks - Databricks SQL,Use parse_timestamp in Databricks SQL queries,Learn the syntax of the parse\_timestamp function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks Runtime 18.1 and above If expr is a string, parses it into a TIMESTAMP according to the first matching pattern in the given list of formats. One or more of the formats can reference a predefined list of formats. If expr is a numeric type, parses it as a Unix timestamp.",2026-04-13T08:00:00.000Z,language-reference,integrations,0.7,True,"Function reference page with product-specific syntax, arguments, and behavior for parse_timestamp, including how string and numeric inputs are interpreted and supported format patterns—details that go beyond generic SQL knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/parse_url,parse_url function,parse_url function - Azure Databricks - Databricks SQL,Extract URL components with parse_url in Databricks,Learn the syntax of the parse\_url function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Extracts a part from url.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Documents Databricks-specific parse_url function syntax and supported URL parts, which are product-specific API details rather than generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/percent_rank,percent_rank ranking window function,percent_rank ranking window function - Azure Databricks - Databricks SQL,,Learn the syntax of the percent\_rank window function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Computes the percentage ranking of a value within the
partition.,2026-01-20T08:00:00.000Z,language-reference,,0.2,False,"percent_rank window function; standard ranking semantics, not focused on quotas, configs, or decision criteria.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/percentile,percentile aggregate function,percentile aggregate function - Azure Databricks - Databricks SQL,Calculate exact percentiles in Databricks SQL,Learn the syntax of the percentile aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns the exact percentile value of expr at the specified percentage in a group.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Describes Databricks-specific percentile aggregate function semantics and usage, including exact vs approximate behavior, which are concrete API details.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/percentile,percentile aggregate function,percentile aggregate function - Azure Databricks - Databricks SQL,Calculate exact percentiles in Databricks SQL,Learn the syntax of the percentile aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns the exact percentile value of expr at the specified percentage in a group.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Describes Databricks-specific percentile aggregate function semantics and usage, including exact vs approximate behavior, which are concrete API details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/percentile_approx,percentile_approx aggregate function,percentile_approx aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the percentile\_approx aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns the
approximate percentile of the expr within the group. This function is a synonym for approx_percentile aggregate function.,2024-03-01T08:00:00.000Z,language-reference,,0.35,False,"percentile_approx aggregate function; synonym for approx_percentile, but no numeric accuracy bounds, limits, or config tables are indicated.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/percentile_cont,percentile_cont aggregate function,percentile_cont aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the percentile\_cont aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime 10.4 LTS and above Returns the value that corresponds to the percentile of the provided sortKeys using a continuous distribution model.,2026-01-20T08:00:00.000Z,language-reference,,0.3,False,"percentile_cont aggregate function; continuous distribution semantics are described, but no numeric limits, config parameters, or decision matrices are indicated.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/percentile_disc,percentile_disc aggregate function,percentile_disc aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the percentile\_disc aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime 11.3 LTS and above Returns the value that corresponds to the percentile of the provided sortKey using a discrete distribution model.,2024-04-18T08:00:00.000Z,language-reference,,0.45,False,"percentile_disc aggregate function; notes runtime version and discrete model, but lacks numeric limits, configuration parameters, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/percentsign,% (percent sign) operator,% (percent sign) operator - Azure Databricks - Databricks SQL,Use modulo % operator in
Databricks SQL,Learn the syntax of the % (percent sign) operator of the SQL language in Azure Databricks.,Applies to: Databricks SQL, Databricks Runtime Returns the remainder after dividend/divisor. This function is equivalent to mod function.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.6,True,"Operator reference with Databricks-specific behavior and equivalence to mod function; while simple, it is concrete product syntax/semantics.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/percentsign,% (percent sign) operator,% (percent sign) operator - Azure Databricks - Databricks SQL,Use modulo % operator in Databricks SQL,Learn the syntax of the % (percent sign) operator of the SQL language in Azure Databricks.,Applies to: Databricks SQL, Databricks Runtime Returns the remainder after dividend/divisor. This function is equivalent to mod function.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.6,True,"Operator reference with Databricks-specific behavior and equivalence to mod function; while simple, it is concrete product syntax/semantics.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/pi,pi function,pi function - Azure Databricks - Databricks SQL,,Learn the syntax of the pi function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns pi.,2026-01-20T08:00:00.000Z,language-reference,,0.1,False,"pi function; trivial constant-returning function with no configuration, limits, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/pipepipesign,|| (pipe pipe sign) operator,|| (pipe pipe sign) operator - Azure Databricks - Databricks SQL,,Learn the syntax of the || (pipe pipe sign) operator of the SQL language in Databricks SQL.,Applies to: Databricks SQL, Databricks Runtime Returns the concatenation of expr1 and expr2.
This operator is a synonym for concat function.,2024-03-01T08:00:00.000Z,language-reference,,0.25,False,"|| concatenation operator; synonym for concat, generic SQL behavior without Databricks-specific constraints.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/pipesign,| (pipe sign) operator,| (pipe sign) operator - Azure Databricks - Databricks SQL,,Learn the syntax of the | (pipe sign) operator of the SQL language in Databricks SQL.,Applies to: Databricks SQL, Databricks Runtime Returns the bitwise OR of expr1 and expr2.,2024-03-01T08:00:00.000Z,language-reference,,0.25,False,"| bitwise OR operator; basic operator semantics, no expert configuration or limits.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/plussign,+ (plus sign) operator,+ (plus sign) operator - Azure Databricks - Databricks SQL,Use plus + operator in Databricks SQL,Learn the syntax of the + (plus sign) operator of the SQL language in Databricks SQL.,Applies to: Databricks SQL, Databricks Runtime Returns the sum of expr1 and expr2.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.6,True,"Documents Databricks SQL behavior of the + operator (types, coercion, etc.), which is specific to this product’s SQL dialect.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/plussign,+ (plus sign) operator,+ (plus sign) operator - Azure Databricks - Databricks SQL,Use plus + operator in Databricks SQL,Learn the syntax of the + (plus sign) operator of the SQL language in Databricks SQL.,Applies to: Databricks SQL, Databricks Runtime Returns the sum of expr1 and expr2.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.6,True,"Documents Databricks SQL behavior of the + operator (types, coercion, etc.), which is specific to this product’s SQL dialect.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/plussignunary,+ (plus sign) unary operator,+
(plus sign) unary operator - Azure Databricks - Databricks SQL,,Learn the syntax of the + (plus sign) unary operator of the SQL language in Databricks SQL.,Returns the value of expr. This function is a synonym for positive function.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,"+ unary operator; synonym for positive, trivial behavior without expert-only details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/pmod,pmod function,pmod function - Azure Databricks - Databricks SQL,Compute positive modulo with pmod in Databricks,Learn the syntax of the pmod function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns the positive remainder after dividend / divisor.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,Function reference for pmod with Databricks-specific semantics (always positive remainder) and usage details that are not generic SQL knowledge.,updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/pmod,pmod function,pmod function - Azure Databricks - Databricks SQL,Compute positive modulo with pmod in Databricks,Learn the syntax of the pmod function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns the positive remainder after dividend / divisor.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,Function reference for pmod with Databricks-specific semantics (always positive remainder) and usage details that are not generic SQL knowledge.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/posexplode,posexplode table-valued generator function,posexplode table-valued generator function - Azure Databricks - Databricks SQL,,Learn the syntax of the posexplode function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns a set
of rows by un-nesting expr with numbering of positions. In Databricks SQL and Databricks Runtime 16.1 and above this function supports named parameter invocation.,2025-02-04T08:00:00.000Z,language-reference,,0.45,False,"posexplode table-valued function; describes un-nesting with positions and named parameter support, but no limits, quotas, or config tables.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/posexplode_outer,posexplode_outer table-valued generator function,posexplode_outer table-valued generator function - Azure Databricks - Databricks SQL,,Learn the syntax of the posexplode\_outer function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns rows by un-nesting the array with numbering of positions using OUTER semantics. In Databricks SQL and Databricks Runtime 16.1 and above this function supports named parameter invocation.,2025-02-05T19:42:00.000Z,language-reference,,0.45,False,"posexplode_outer table-valued function; OUTER semantics and named parameters noted, but no numeric limits, configuration options, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/position,position function,position function - Azure Databricks - Databricks SQL,,Learn the syntax of the position function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL, Databricks Runtime Returns the position of the first occurrence of substr in str after position pos.
This function is a synonym forlocatefunction.,2024-12-12T08:00:00.000Z,language-reference,,0.3,False,"position function reference; synonym for locate, generic string search behavior without Databricks-specific expert details.",unchanged @@ -4143,15 +4183,15 @@ For details, seejava.util.Formatter.",2024-07-18T17:47:00.000Z,language-referenc https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/quarter,quarter function,quarter function - Azure Databricks - Databricks SQL,,Learn the syntax of the quarter function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the quarter of the year forexprin the range 1 to 4. This function is a synonym forextract(QUARTER FROM expr).,2024-03-01T08:00:00.000Z,language-reference,,0.3,False,"quarter date function; generic extraction of quarter from date, no Databricks-specific limits or configs.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/questiondoublecolonsign,?:: (question mark double colon) operator,?:: (question double colon sign) operator - Azure Databricks - Databricks SQL,Cast with ?:: operator in Databricks SQL,Learn the syntax of the ?:: (question double colon sign) operator of the SQL language in Databricks SQL.,Applies to:Databricks Runtime 15.3 and above Casts the valueexprto the target data typetypewith error toleration. 
This operator is a synonym fortry_castfunction.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.7,True,"Describes a Databricks-specific cast operator, its syntax, supported types, and error-tolerant behavior, which are detailed API semantics unique to this product and align with integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/radians,radians function,radians function - Azure Databricks - Databricks SQL,Use radians function in Databricks SQL,Learn the syntax of the radians function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Convertsexprin degrees to radians.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.6,True,"Provides precise syntax and behavior for the radians function in Databricks SQL/Runtime. While mathematically simple, the exact function signature and platform applicability are product-specific API details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/raise_error,raise_error function,raise_error function - Azure Databricks - Databricks SQL,Raise custom SQL errors in Databricks,Learn the syntax of the raise\_error function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Throws an exception withexpras the message.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Documents raise_error function syntax and behavior for Databricks SQL, including how expressions become error messages—product-specific API behavior.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/raise_error,raise_error function,raise_error function - Azure Databricks - Databricks SQL,Raise custom SQL errors in Databricks,Learn the syntax of the raise\_error function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 
Throws an exception withexpras the message.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Documents raise_error function syntax and behavior for Databricks SQL, including how expressions become error messages—product-specific API behavior.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/rand,rand function,rand function - Azure Databricks - Databricks SQL,Generate random numbers with rand in Databricks SQL,Learn the syntax of the rand function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns a random value between 0 and 1. This function is a synonym forrandomfunction.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.7,True,"Covers the rand function’s syntax, return type, and synonym relationship with random in Databricks SQL, which are concrete API details and coding patterns specific to this environment.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/randn,randn function,randn function - Azure Databricks - Databricks SQL,,Learn the syntax of the randn function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns a random value from a standard normal distribution.,2024-12-16T21:39:00.000Z,language-reference,,0.3,False,"Standard normal random function; generic reference, no expert-only details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/random,random function,random function - Azure Databricks - Databricks SQL,Generate random numbers with random in Databricks SQL,Learn the syntax of the random function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns a random value between 0 and 1. 
This function is a synonym forrandfunction.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.7,True,"Documents the random function, its behavior, and synonymy with rand in Databricks SQL. These are specific function-level integration details rather than generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/randstr,randstr function,randstr function - Azure Databricks - Databricks SQL,,Learn the syntax of the randstr function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 16.1 and above Returns a random string oflengthalpha-numeric characters.,2025-04-30T00:43:00.000Z,language-reference,,0.35,False,"Random string function; likely just syntax and examples, no quotas or config tables.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/range,range table-valued function,range table-valued function - Azure Databricks - Databricks SQL,,Learn the syntax of the range function of the SQL language in Databricks SQL and Databricks Runtime.,Returns a table of values within a specified range.,2024-03-01T08:00:00.000Z,language-reference,,0.35,False,range TVF reference; no indication of limits tables or deployment/config details.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/rank,rank ranking window function,rank ranking window function - Azure Databricks - Databricks SQL,,Learn the syntax of the rank function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the rank of a value compared to all values in the partition.,2024-03-01T08:00:00.000Z,language-reference,,0.35,False,"Window function reference; standard SQL semantics, no Databricks-specific expert constraints.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/read_files,read_files table-valued function,read_files 
table-valued function - Azure Databricks - Databricks SQL,Read files with read_files table function in Databricks,Learn the syntax of the read\_files function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 13.3 LTS and above Reads files under a provided location and returns the data in tabular form. Supports readingJSON,CSV,XML,TEXT,BINARYFILE,PARQUET,AVRO, andORCfile formats. -Can detect the file format automatically and infer a unified schema across all files.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.8,True,"Details a Databricks-specific table-valued function including supported formats, schema inference, and behavior—concrete integration/API usage information.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/read_files,read_files table-valued function,read_files table-valued function - Azure Databricks - Databricks SQL,Use read_files table-valued function in Databricks SQL,Learn the syntax of the read\_files function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 13.3 LTS and above Reads files under a provided location and returns the data in tabular form. Supports readingJSON,CSV,XML,TEXT,BINARYFILE,PARQUET,AVRO, andORCfile formats. +Can detect the file format automatically and infer a unified schema across all files.",2026-04-20T08:00:00.000Z,language-reference,integrations,0.7,True,"Function reference pages for Databricks SQL typically include product-specific syntax, parameters, return schema details, supported file formats, and behavioral nuances (such as how schemas are inferred, partitioning behavior, and options) that go beyond generic SQL knowledge. 
These are concrete API/SDK-style integration details for reading external files into Databricks SQL, fitting the integrations category better than others.",updated https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/read_kafka,read_kafka table-valued function,read_kafka table-valued function - Azure Databricks - Databricks SQL,Query Kafka with read_kafka in Databricks SQL,Learn the syntax of the read\_kafka function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 13.3 LTS and above Reads data from an Apache Kafka cluster and returns the data in tabular form. Can read data from one or more Kafka topics. It supports both batch queries and streaming ingestion.,2026-04-09T08:00:00.000Z,language-reference,integrations,0.78,True,"The read_kafka TVF documentation will contain Databricks-specific integration details for Apache Kafka (such as required/optional options, parameter names, and supported modes like batch vs streaming). 
These are concrete configuration and API usage details unique to Databricks SQL’s Kafka integration, which aligns with the integrations sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/read_kinesis,read_kinesis streaming table-valued function,read_kinesis streaming table-valued function - Azure Databricks - Databricks SQL,Stream from Amazon Kinesis using read_kinesis TVF,Learn the syntax of the read\_kinesis function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 13.3 LTS and above Returns a table with records read fromKinesisfrom one or more streams.,2025-05-09T19:58:00.000Z,language-reference,integrations,0.7,True,"Kinesis streaming integration; page will define TVF parameters (stream name, region, offsets, auth options) that are product-specific configuration for an external service.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/read_pubsub,read_pubsub streaming table-valued function,read_pubsub streaming table-valued function - Azure Databricks - Databricks SQL,Read Google Pub/Sub streams with read_pubsub in Databricks,Learn the syntax of the read\_pubsub function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 13.3 LTS and above Returns a table with records read from Pub/Sub from a topic. 
Only supports streaming queries.,2024-04-18T08:00:00.000Z,language-reference,integrations,0.7,True,"Pub/Sub streaming TVF; requires specific connector options and parameter names for Pub/Sub topics and credentials, which are integration-focused expert details.",unchanged @@ -4162,14 +4202,14 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions The returned relation only supports running as a batch query.",2026-04-09T08:00:00.000Z,language-reference,integrations,0.7,True,"This table-valued function is specific to Databricks streaming internals and will document parameters, constraints (batch-only execution), and usage patterns for reading the state store. These are product-specific API details and integration patterns for interacting with the Databricks state store, which fits the integrations category.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/reduce,reduce function,reduce function - Azure Databricks - Databricks SQL,Aggregate arrays with reduce in Databricks SQL,Learn the syntax of the reduce function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Aggregates elements in an array using a custom aggregator. This function is a synonym foraggregatefunction.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.65,True,Covers syntax and behavior of the reduce aggregate function (synonym for aggregate) in Databricks SQL. These are concrete function semantics and usage patterns specific to this SQL dialect.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/reflect,reflect function,reflect function - Azure Databricks - Databricks SQL,,Learn the syntax of the reflect function of the SQL language in Databricks Runtime.,"Applies to:Databricks Runtime 11.3 LTS and above Calls a method with reflection. The method might return an exception. 
To return aNULLinstead, usetry_reflect.",2024-04-18T08:00:00.000Z,language-reference,,0.5,False,"reflect/try_reflect are JVM reflection helpers; while product-specific, page is likely just syntax and examples, not configuration tables or error-code mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regexp,regexp operator,regexp operator - Azure Databricks - Databricks SQL,,Learn the syntax of the regexp operator of the SQL language in Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Returns true ifstrmatchesregex. This function is a synonym forrlikeoperator.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Describes basic regexp operator syntax and behavior; this is generic SQL/regex functionality without product-specific limits, configs, or patterns.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regexp_count,regexp_count function,regexp_count function - Azure Databricks - Databricks SQL,,Learn the syntax of the regexp\_count function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returns the number of timesstrmatches theregexppattern.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Explains regexp_count syntax and return value; standard regex-count behavior without Databricks-specific limits, configs, or troubleshooting details.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regexp_extract,regexp_extract function,regexp_extract function - Azure Databricks - Databricks SQL,,Learn the syntax of the regexp\_extract function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Extracts the first string instrthat matches theregexpexpression and corresponds to theregexgroup index.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Covers regexp_extract 
syntax and semantics; generic regex extraction behavior, no product-specific configuration tables, limits, or decision guidance.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regexp_extract_all,regexp_extract_all function,regexp_extract_all function - Azure Databricks - Databricks SQL,,Learn the syntax of the regexp\_extract\_all function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Extracts all of the strings instrthat match theregexpexpression and correspond to theregexgroup index.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Describes regexp_extract_all usage; while slightly more specialized, it is still straightforward function syntax without expert-only limits, quotas, or product-specific configs.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regexp_instr,regexp_instr function,regexp_instr function - Azure Databricks - Databricks SQL,,Learn the syntax of the regexp\_instr function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returns the position of the first substring instrthat matchesregexp.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Documents regexp_instr behavior (position of match); standard regex function semantics, no detailed configuration, limits, or troubleshooting mappings.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regexp_like,regexp_like function,regexp_like function - Azure Databricks - Databricks SQL,,Learn the syntax of the regexp\_like function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Returns true ifstrmatchesregex. 
This function is a synonym forrlikeoperator.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"regexp_like is a synonym for rlike; page is basic syntax/semantics without product-specific parameters, limits, or decision matrices.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regexp_replace,regexp_replace function,regexp_replace function - Azure Databricks - Databricks SQL,,Learn the syntax of the regexp\_replace function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Replaces all substrings ofstrthat matchregexpwithrep.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"regexp_replace function description is generic regex replacement behavior; no Databricks-specific configuration tables, quotas, or troubleshooting content.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regexp_substr,regexp_substr function,regexp_substr function - Azure Databricks - Databricks SQL,,Learn the syntax of the regexp\_substr function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returns the first substring instrthat matchesregexp.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,regexp_substr returns first matching substring; this is standard regex function behavior without expert-only product-specific details.,updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regexp,regexp operator,regexp operator - Azure Databricks - Databricks SQL,,Learn the syntax of the regexp operator of the SQL language in Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Returns true ifstrmatchesregex. 
This function is a synonym forrlikeoperator.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Describes basic regexp operator syntax and behavior; this is generic SQL/regex functionality without product-specific limits, configs, or patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regexp_count,regexp_count function,regexp_count function - Azure Databricks - Databricks SQL,,Learn the syntax of the regexp\_count function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returns the number of timesstrmatches theregexppattern.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Explains regexp_count syntax and return value; standard regex-count behavior without Databricks-specific limits, configs, or troubleshooting details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regexp_extract,regexp_extract function,regexp_extract function - Azure Databricks - Databricks SQL,,Learn the syntax of the regexp\_extract function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Extracts the first string instrthat matches theregexpexpression and corresponds to theregexgroup index.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Covers regexp_extract syntax and semantics; generic regex extraction behavior, no product-specific configuration tables, limits, or decision guidance.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regexp_extract_all,regexp_extract_all function,regexp_extract_all function - Azure Databricks - Databricks SQL,,Learn the syntax of the regexp\_extract\_all function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Extracts all of the strings instrthat match theregexpexpression and correspond to theregexgroup 
index.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Describes regexp_extract_all usage; while slightly more specialized, it is still straightforward function syntax without expert-only limits, quotas, or product-specific configs.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regexp_instr,regexp_instr function,regexp_instr function - Azure Databricks - Databricks SQL,,Learn the syntax of the regexp\_instr function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returns the position of the first substring instrthat matchesregexp.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Documents regexp_instr behavior (position of match); standard regex function semantics, no detailed configuration, limits, or troubleshooting mappings.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regexp_like,regexp_like function,regexp_like function - Azure Databricks - Databricks SQL,,Learn the syntax of the regexp\_like function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Returns true ifstrmatchesregex. 
This function is a synonym forrlikeoperator.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"regexp_like is a synonym for rlike; page is basic syntax/semantics without product-specific parameters, limits, or decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regexp_replace,regexp_replace function,regexp_replace function - Azure Databricks - Databricks SQL,,Learn the syntax of the regexp\_replace function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Replaces all substrings ofstrthat matchregexpwithrep.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"regexp_replace function description is generic regex replacement behavior; no Databricks-specific configuration tables, quotas, or troubleshooting content.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regexp_substr,regexp_substr function,regexp_substr function - Azure Databricks - Databricks SQL,,Learn the syntax of the regexp\_substr function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returns the first substring instrthat matchesregexp.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,regexp_substr returns first matching substring; this is standard regex function behavior without expert-only product-specific details.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regr_avgx,regr_avgx aggregate function,regr_avgx aggregate function - Azure Databricks - Databricks SQL,Compute regression mean with regr_avgx in Databricks SQL,Learn the syntax of the regr\_avgx function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returns the mean ofxExprcalculated from values of a group wherexExprandyExprareNOT 
NULL.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.75,True,"Provides exact semantics for the regr_avgx aggregate function, including null-handling rules and version applicability. This is specialized statistical function behavior in this product’s SQL dialect.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regr_avgy,regr_avgy aggregate function,regr_avgy aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the regr\_avgy function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returns the mean ofyExprcalculated from values of a group wherexExprandyExprareNOT NULL.,2025-02-19T01:19:00.000Z,language-reference,,0.35,False,"regr_avgy reference; similar to 25, no Databricks-specific quotas or configs.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regr_count,regr_count aggregate function,regr_count aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the regr\_count aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returns the number of non-null value pairsyExpr,xExprin the group.",2024-04-18T08:00:00.000Z,language-reference,,0.35,False,regr_count reference; standard aggregate semantics.,unchanged @@ -4180,14 +4220,14 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regr_sxy,regr_sxy aggregate function,regr_sxy aggregate function - Azure Databricks - Databricks SQL,Compute regression SXY with regr_sxy in Databricks SQL,Learn the syntax of the regr\_sxy function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returns the sum of products ofyExprandxExprcalculated 
from values of a group wherexExprandyExprareNOT NULL.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.75,True,"Documents the regr_sxy aggregate function, including how it computes sum of products and its null-handling rules. These are concrete, product-specific function semantics aligned with integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regr_syy,regr_syy aggregate function,regr_syy aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the regr\_syy function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returns the sum of squares of theyExprvalues of a group wherexExprandyExprareNOT NULL.,2026-01-20T08:00:00.000Z,language-reference,,0.2,False,"Function reference for regr_syy; describes syntax and behavior but no limits, quotas, config tables, error codes, or product-specific thresholds. Primarily language semantics, which are broadly known and not in any listed sub-skill category.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/remote_query,remote_query table-valued function,remote_query table-valued function - Azure Databricks - Databricks SQL,Use remote_query to access external databases in Databricks SQL,Learn the syntax of the remote\_query function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Important This feature is inPublic Preview. Returns the tabular result of the query executed on the remote database engine. remote_queryfetches data from remote systems using credentials from aconnection. -The function accepts a set of connector options, so besides query. 
This function requiresnamed parameter invocation.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Documents a product-specific table-valued function with named parameters and connector options for querying remote systems using stored connection credentials; this is concrete integration behavior and syntax unique to Databricks SQL, not generic SQL knowledge.",updated +The function accepts a set of connector options in addition to the query. This function requiresnamed parameter invocation.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Documents a product-specific table-valued function with named parameters and connector options for querying remote systems using stored connection credentials; this is concrete integration behavior and syntax unique to Databricks SQL, not generic SQL knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/repeat,repeat function,repeat function - Azure Databricks - Databricks SQL,,Learn the syntax of the repeat function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the string that repeatsexprntimes.,2024-03-01T08:00:00.000Z,language-reference,,0.3,False,"repeat string function; generic behavior, no Databricks-specific constraints listed.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/replace,replace function,replace function - Azure Databricks - Databricks SQL,,Learn the syntax of the replace function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Replaces all occurrences ofsearchwithreplace.,2024-12-12T08:00:00.000Z,language-reference,,0.3,False,replace string function; simple reference.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/reverse,reverse function,reverse function - Azure Databricks - Databricks SQL,,Learn the syntax of the reverse
function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns a reversed string or an array with reverse order of elements.,2024-03-01T08:00:00.000Z,language-reference,,0.3,False,reverse string/array function; no indication of limits or config matrices.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/right,right function,right function - Azure Databricks - Databricks SQL,,Learn the syntax of the right function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the rightmostlencharacters from the stringstr.,2024-03-01T08:00:00.000Z,language-reference,,0.3,False,right substring function; standard SQL-like behavior.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/rint,rint function,rint function - Azure Databricks - Databricks SQL,,Learn the syntax of the rint function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Returnsexprrounded to a whole number as aDOUBLE. This function is a synonym forround(expr, 0).",2026-01-20T08:00:00.000Z,language-reference,,0.2,False,"Function reference for rint; only explains rounding behavior and synonymy with round(expr, 0). 
No configuration parameters, limits, or troubleshooting content.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/rlike,rlike operator,rlike operator - Azure Databricks - Databricks SQL,,Learn the syntax of the rlike operator of the SQL language in Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime 10.0 Returns true ifstrmatchesregex.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"rlike operator page describes basic regex match semantics; no specific limits, configuration parameters, or troubleshooting mappings unique to Databricks.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/round,round function,round function - Azure Databricks - Databricks SQL,,Learn the syntax of the round function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the roundedexprusingHALF_UProunding mode.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"round function description covers generic rounding semantics; lacks product-specific limits, configuration options, or best-practice guidance with quantified impact.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/rlike,rlike operator,rlike operator - Azure Databricks - Databricks SQL,,Learn the syntax of the rlike operator of the SQL language in Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime 10.0 Returns true ifstrmatchesregex.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"rlike operator page describes basic regex match semantics; no specific limits, configuration parameters, or troubleshooting mappings unique to Databricks.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/round,round function,round function - Azure Databricks - Databricks SQL,,Learn the syntax of the round function of the SQL language in Databricks SQL and Databricks Runtime.,Applies 
to:Databricks SQLDatabricks Runtime Returns the roundedexprusingHALF_UProunding mode.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"round function description covers generic rounding semantics; lacks product-specific limits, configuration options, or best-practice guidance with quantified impact.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/row_number,row_number ranking window function,row_number ranking window function - Azure Databricks - Databricks SQL,,Learn the syntax of the row\_number function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Assigns a unique, sequential number to each row, starting with one, according to the ordering of rows in the window partition.",2024-03-01T08:00:00.000Z,language-reference,,0.3,False,"ROW_NUMBER window function reference; standard SQL semantics, no Databricks-specific limits, configs, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/rpad,rpad function,rpad function - Azure Databricks - Databricks SQL,,Learn the syntax of the rpad function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Returnsexpr, right-padded withpadto a length oflen.",2025-05-09T19:58:00.000Z,language-reference,,0.3,False,"RPAD string function reference; generic SQL behavior, no Databricks-specific quotas, configs, or decision guidance.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/rtrim,rtrim function,rtrim function - Azure Databricks - Databricks SQL,,Learn the syntax of the rtrim function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returnsstrwith trailing characters removed.,2026-02-24T08:00:00.000Z,language-reference,,0.3,False,"RTRIM string function reference; standard trimming semantics, no 
product-specific constraints or configuration parameters.",unchanged @@ -4199,9 +4239,9 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/schema_of_xml,schema_of_xml function,schema_of_xml function - Azure Databricks - Databricks SQL,,Learn the syntax of the schema\_of\_xml function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 14.1 and above Important This feature is inPublic Preview. Returns the schema of anXMLstring inDDLformat.,2026-01-22T13:21:00.000Z,language-reference,,0.4,False,"SCHEMA_OF_XML function reference; even with preview note, it is basic function behavior without detailed constraints or configuration tables.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/sec,sec function,sec function - Azure Databricks - Databricks SQL,,Learn the syntax of the sec function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 10.1 Returns the secant ofexpr.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,"SEC trigonometric function reference; generic math behavior, no Databricks-specific expert content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/second,second function,second function - Azure Databricks - Databricks SQL,,Learn the syntax of the second function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the second component of the timestamp inexpr.,2024-03-01T08:00:00.000Z,language-reference,,0.25,False,"SECOND datetime function reference; standard extraction of second component, no product-specific limits or configs.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/secret,secret function,secret function - Azure Databricks - 
Databricks SQL,Use Databricks SQL secret function for secure values,Learn the syntax of the secret function of the SQL language in Azure Databricks.,Applies to:Databricks SQL previewDatabricks Runtime 11.3 LTS and above Extracts a secret value with the givenscopeandkeyfromDatabricks secret service.,2026-04-13T21:52:00.000Z,language-reference,security,0.7,True,"Function reference with product-specific behavior for retrieving secrets from Databricks secret scopes; includes concrete syntax and constraints tied to Databricks secret service, which is implementation-specific security configuration/usage.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/secret,secret function,secret function - Azure Databricks - Databricks SQL,Use Databricks SQL secret function for secure values,Learn the syntax of the secret function of the SQL language in Azure Databricks.,Applies to:Databricks SQL previewDatabricks Runtime 11.3 LTS and above Extracts a secret value with the givenscopeandkeyfromDatabricks secret service.,2026-04-13T21:52:00.000Z,language-reference,security,0.7,True,"Function reference with product-specific behavior for retrieving secrets from Databricks secret scopes; includes concrete syntax and constraints tied to Databricks secret service, which is implementation-specific security configuration/usage.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/sentences,sentences function,sentences function - Azure Databricks - Databricks SQL,,Learn the syntax of the sentences function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Splitsstrinto an array of array of words.,2025-05-09T19:58:00.000Z,language-reference,,0.25,False,"SENTENCES text function reference; generic string processing without Databricks-specific limits, configs, or troubleshooting.",unchanged 
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/sequence,sequence function,sequence function - Azure Databricks - Databricks SQL,,Learn the syntax of the sequence function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Generates an array of elements fromstarttostop(inclusive), incrementing bystep.",2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"Standard SQL-style sequence/array generation; primarily syntax and examples without product-specific limits, configs, or decision logic beyond generic function semantics.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/sequence,sequence function,sequence function - Azure Databricks - Databricks SQL,,Learn the syntax of the sequence function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Generates an array of elements fromstarttostop(inclusive), incrementing bystep.",2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"Standard SQL-style sequence/array generation; primarily syntax and examples without product-specific limits, configs, or decision logic beyond generic function semantics.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/session_user,session_user function,session_user function - Azure Databricks - Databricks SQL,,Learn the syntax of the session\_user function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 14.1 and above Returns the user connected to Azure Databricks. Note The SQL standard differentiates betweenCURRENT_USERandSESSION_USER. 
In Databricks SQL and Databricks Runtime 14.1 and above you should useSESSION_USERinstead ofCURRENT_USERorUSER.",2026-01-20T08:00:00.000Z,language-reference,,0.3,False,"session_user function reference; notes version applicability and preference over CURRENT_USER/USER, but lacks detailed RBAC roles, security configuration parameters, or other expert-level configuration/troubleshooting details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/session_window,session_window grouping expression,session_window grouping expression - Azure Databricks - Databricks SQL,,Learn the syntax of the session\_window grouping expression of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Creates a session-window over a timestamp expression.,2024-04-18T08:00:00.000Z,language-reference,,0.35,False,"SESSION_WINDOW grouping expression; describes windowing behavior but not detailed numeric thresholds, configs, or troubleshooting mappings.",unchanged @@ -4218,16 +4258,16 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/sinh,sinh function,sinh function - Azure Databricks - Databricks SQL,,Learn the syntax of the sinh function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the hyperbolic sine ofexpr.,2026-01-20T08:00:00.000Z,language-reference,,0.2,False,"Function reference for sinh; standard math function semantics with no expert configuration, limits, or decision-making content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/size,size function,size function - Azure Databricks - Databricks SQL,,Learn the syntax of the size function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the 
cardinality of the array or map inexpr.,2024-03-01T08:00:00.000Z,language-reference,,0.25,False,"SIZE function reference; returns cardinality of array/map, but no numeric limits, configuration parameters, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/skewness,skewness aggregate function,skewness aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the skewness function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the skewness value calculated from values of a group.,2026-01-20T08:00:00.000Z,language-reference,,0.2,False,"Aggregate function reference for skewness; explains statistical aggregation but does not include limits, quotas, configuration parameters, or troubleshooting/error-code mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/slashsign,/ (slash sign) operator,/ (slash sign) operator - Azure Databricks - Databricks SQL,,Learn the syntax of the / (slash sign) operator of the SQL language in Azure Databricks.,Applies to:Databricks SQLDatabricks Runtime Returnsdividenddivided bydivisor.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Basic arithmetic division operator description; generic SQL behavior that an LLM already knows, with no product-specific limits, configuration, or troubleshooting content.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/slice,slice function,slice function - Azure Databricks - Databricks SQL,,Learn the syntax of the slice function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns a subset of an array.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"Array slicing semantics similar to many languages; page is a function syntax reference without Databricks-specific limits, quotas, or configuration 
tables.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/smallint,smallint function,smallint function - Azure Databricks - Databricks SQL,,Learn the syntax of the smallint function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Casts the valueexprtoSMALLINT. This function is a synonym forCAST(expr AS SMALLINT). Seecastfunctionfor details.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Casts to SMALLINT and is explicitly a synonym for CAST; generic SQL typing behavior, no Databricks-specific constraints or expert-only details.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/slashsign,/ (slash sign) operator,/ (slash sign) operator - Azure Databricks - Databricks SQL,,Learn the syntax of the / (slash sign) operator of the SQL language in Azure Databricks.,Applies to:Databricks SQLDatabricks Runtime Returnsdividenddivided bydivisor.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Basic arithmetic division operator description; generic SQL behavior that an LLM already knows, with no product-specific limits, configuration, or troubleshooting content.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/slice,slice function,slice function - Azure Databricks - Databricks SQL,,Learn the syntax of the slice function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns a subset of an array.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"Array slicing semantics similar to many languages; page is a function syntax reference without Databricks-specific limits, quotas, or configuration tables.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/smallint,smallint function,smallint function - Azure Databricks - Databricks SQL,,Learn the syntax of the smallint function of the 
SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Casts the valueexprtoSMALLINT. This function is a synonym forCAST(expr AS SMALLINT). Seecastfunctionfor details.,2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Casts to SMALLINT and is explicitly a synonym for CAST; generic SQL typing behavior, no Databricks-specific constraints or expert-only details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/some,some aggregate function,some aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the some function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returnstrueif at least one value ofexprin a group is true. Thesomeaggregate function is synonymous withanyaggregate functionandbool_oraggregate function.,2024-07-18T17:47:00.000Z,language-reference,,0.25,False,"SOME aggregate function reference; logical aggregation semantics, no product-specific limits, configs, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/sort_array,sort_array function,sort_array function - Azure Databricks - Databricks SQL,,Learn the syntax of the sort\_array function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the array inexprin sorted order.,2024-03-01T08:00:00.000Z,language-reference,,0.25,False,"SORT_ARRAY function reference; generic array sorting behavior, no Databricks-specific expert configuration or limits.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/soundex,soundex function,soundex function - Azure Databricks - Databricks SQL,,Learn the syntax of the soundex function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns thesoundexcode of the 
string.,2024-03-01T08:00:00.000Z,language-reference,,0.3,False,"SOUNDEX function reference; phonetic encoding behavior, but no Databricks-specific limits, configs, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/space,space function,space function - Azure Databricks - Databricks SQL,,Learn the syntax of the space function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns a string consisting ofnspaces.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,"SPACE function reference; returns n spaces, but lacks numeric limit tables, configuration parameters, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/spark_partition,spark_partition_id function,spark_partition_id function - Azure Databricks - Databricks SQL,,Learn the syntax of the spark\_partition\_id function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the current partition ID.,2024-03-01T08:00:00.000Z,language-reference,,0.35,False,SPARK_PARTITION_ID function reference; Databricks/Spark-specific but only exposes partition ID without detailed configuration or limits.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/split,split function,split function - Azure Databricks - Databricks SQL,,Learn the syntax of the split function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Splitsstraround occurrences that matchregexand returns an array with a length of at mostlimit.,2024-12-12T08:00:00.000Z,language-reference,,0.3,False,"SPLIT function reference; describes regex-based splitting and limit argument, but no detailed numeric constraints or configuration tables.",unchanged 
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/split_part,split_part function,split_part function - Azure Databricks - Databricks SQL,,Learn the syntax of the split\_part function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Splitsstraround occurrences ofdelimand returns thepartNumpart.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"String splitting function semantics are common across SQL engines; page is a basic function reference without product-specific limits, configuration parameters, or decision/troubleshooting matrices.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/split_part,split_part function,split_part function - Azure Databricks - Databricks SQL,,Learn the syntax of the split\_part function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Splitsstraround occurrences ofdelimand returns thepartNumpart.,2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"String splitting function semantics are common across SQL engines; page is a basic function reference without product-specific limits, configuration parameters, or decision/troubleshooting matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/sql_keywords,sql_keywords function,sql_keywords function - Azure Databricks - Databricks SQL,Use sql_keywords function in Databricks SQL,Learn the syntax of the sql\_keywords function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 13.3 LTS and above Returns the set of SQL keywords in Azure Databricks.,2024-04-18T08:00:00.000Z,language-reference,integrations,0.7,True,"Function reference page with product-specific syntax, arguments, and return behavior for sql_keywords in Databricks SQL/Runtime, which are not generally 
known outside this documentation.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/sqrt,sqrt function,sqrt function - Azure Databricks - Databricks SQL,,Learn the syntax of the sqrt function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the square root ofexpr.,2026-01-20T08:00:00.000Z,language-reference,,0.2,False,"Function reference for sqrt in Databricks SQL; primarily syntax and basic behavior without product-specific limits, configuration tables, or error mappings that meet any sub-skill criteria.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_addpoint,st_addpoint function,st_addpoint function - Azure Databricks - Databricks SQL,Manipulate linestrings with st_addpoint in Databricks,Learn the syntax of the st\_addpoint function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 17.1 and above Important This feature is inPublic Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, seeSQL warehouse types. 
Adds a new point to the n-th position in the input linestringGEOGRAPHYorGEOMETRYvalue.",2026-01-22T13:21:00.000Z,language-reference,integrations,0.78,True,"Geospatial function reference with Databricks-specific behavior, input/return types, and preview/warehouse support details for st_addpoint.",unchanged @@ -4242,7 +4282,7 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_aswkb,st_aswkb function,st_aswkb function - Azure Databricks - Databricks SQL,Export geometries as WKB with st_aswkb,Learn the syntax of the st\_aswkb function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 17.1 and above Important This feature is inPublic Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, seeSQL warehouse types. Returns the inputGEOGRAPHYorGEOMETRYvalue inWKBformat using the specified endianness, if provided. If the endianness is not specified, the returned value is little-endian encoded.",2026-01-22T13:21:00.000Z,language-reference,integrations,0.82,True,Similar to st_asbinary but Databricks-specific; documents endianness default and GEOGRAPHY/GEOMETRY handling.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_aswkt,st_aswkt function,st_aswkt function - Azure Databricks - Databricks SQL,Export geometries as WKT with st_aswkt,Learn the syntax of the st\_aswkt function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 17.1 and above Important This feature is inPublic Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, seeSQL warehouse types. 
Returns the inputGEOGRAPHYorGEOMETRYvalue inWKTformat.",2026-01-22T13:21:00.000Z,language-reference,integrations,0.8,True,"Databricks geospatial function reference for st_aswkt, including supported types and runtime applicability.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_azimuth,st_azimuth function,st_azimuth function - Azure Databricks - Databricks SQL,,Learn the syntax of the st\_azimuth function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 18.0 and above Important This feature is inPublic Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, seeSQL warehouse types. Returns the north-based azimuth from the first point to the second in radians in[0, 2π).",2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"Function reference page describes syntax and behavior of st_azimuth (returns north-based azimuth in radians in [0, 2π)), but does not include product-specific limits, quotas, configuration parameters, error-code troubleshooting, or decision matrices. It is general API semantics that an LLM is likely to know from training rather than deployment- or configuration-specific expert knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_azimuth,st_azimuth function,st_azimuth function - Azure Databricks - Databricks SQL,,Learn the syntax of the st\_azimuth function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 18.0 and above Important This feature is inPublic Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, seeSQL warehouse types. 
Returns the north-based azimuth from the first point to the second in radians in[0, 2π).",2026-04-13T21:52:00.000Z,language-reference,,0.3,False,"Function reference page describes syntax and behavior of st_azimuth (returns north-based azimuth in radians in [0, 2π)), but does not include product-specific limits, quotas, configuration parameters, error-code troubleshooting, or decision matrices. It is general API semantics that an LLM is likely to know from training rather than deployment- or configuration-specific expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_boundary,st_boundary function,st_boundary function - Azure Databricks - Databricks SQL,,Learn the syntax of the st\_boundary function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 18.0 and above Important This feature is inPublic Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, seeSQL warehouse types. Returns the boundary of the inputGEOMETRYvalue asGEOMETRYvalue.",2026-03-19T18:35:00.000Z,language-reference,,0.2,False,"Geospatial function reference (st_boundary) describing return type and behavior; lacks numeric limits, configuration tables, or product-specific best practices or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_buffer,st_buffer function,st_buffer function - Azure Databricks - Databricks SQL,Create geometry buffers with st_buffer in Databricks,Learn the syntax of the st\_buffer function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 17.1 and above Important This feature is inPublic Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, seeSQL warehouse types. 
Returns the buffer of the inputGEOMETRYvalue using the specified radius.",2026-01-22T13:21:00.000Z,language-reference,integrations,0.78,True,Function reference for st_buffer with Databricks-specific radius handling and GEOMETRY support.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_centroid,st_centroid function,st_centroid function - Azure Databricks - Databricks SQL,Compute geometry centroid with st_centroid,Learn the syntax of the st\_centroid function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 17.1 and above Important This feature is inPublic Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, seeSQL warehouse types. Returns the centroid of the inputGEOMETRYvalue as a 2D pointGEOMETRYvalue.",2026-01-22T13:21:00.000Z,language-reference,integrations,0.76,True,"Describes Databricks-specific st_centroid behavior, including 2D projection and return type.",unchanged @@ -4292,7 +4332,7 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions The SRID value of the returnedGEOMETRYvalue is the value of thesridExprif specified, or 0 otherwise.",2026-01-22T13:21:00.000Z,language-reference,integrations,0.7,True,"Explains Databricks SQL function semantics for WKB parsing and SRID defaulting, which are product-specific.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_geomfromwkt,st_geomfromwkt function,st_geomfromwkt function - Azure Databricks - Databricks SQL,Use st_geomfromwkt to parse WKT geometry with SRID,Learn the syntax of the st\_geomfromwkt function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 17.1 and above Important This feature is inPublic Preview. Note This feature is not available on Databricks SQL Classic warehouses. 
To learn more about Databricks SQL warehouses, seeSQL warehouse types. Parses theWKTdescription of a geometry and returns the correspondingGEOMETRYvalue. The SRID value of the returnedGEOMETRYvalue is the value of thesridExprif specified, or 0 otherwise.",2026-01-22T13:21:00.000Z,language-reference,integrations,0.7,True,Similar to index 6 but explicitly for this function; details Databricks behavior for SRID resolution and runtime support.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_interiorringn,st_interiorringn function,st_interiorringn function - Azure Databricks - Databricks SQL,,Learn the syntax of the st\_interiorringn function of the SQL language in Databricks Runtime.,Applies to:Databricks Runtime 17.3 and above Important This feature is inPublic Preview. Returns the n-th interior ring of the input polygon as a linestring.,2026-04-13T08:00:00.000Z,language-reference,,0.2,False,"Function reference page describing syntax and behavior of st_interiorringn; no limits, quotas, configuration tables, error-code troubleshooting, or product-specific thresholds. Primarily generic SQL/geometry semantics that an LLM likely already knows.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_interiorringn,st_interiorringn function,st_interiorringn function - Azure Databricks - Databricks SQL,,Learn the syntax of the st\_interiorringn function of the SQL language in Databricks Runtime.,Applies to:Databricks Runtime 17.3 and above Important This feature is inPublic Preview. Returns the n-th interior ring of the input polygon as a linestring.,2026-04-13T08:00:00.000Z,language-reference,,0.2,False,"Function reference page describing syntax and behavior of st_interiorringn; no limits, quotas, configuration tables, error-code troubleshooting, or product-specific thresholds. 
Primarily generic SQL/geometry semantics that an LLM likely already knows.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_intersection,st_intersection function,st_intersection function - Azure Databricks - Databricks SQL,Use st_intersection to compute geometry intersections,Learn the syntax of the st\_intersection function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 17.1 and above Important This feature is inPublic Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, seeSQL warehouse types. Returns the point-set intersection of the two inputGEOMETRYvalues as a 2DGEOMETRYvalue.",2026-01-22T13:21:00.000Z,language-reference,integrations,0.6,True,"Describes Databricks SQL implementation of geometric intersection returning 2D GEOMETRY, which is specific to this engine.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_intersects,st_intersects function,st_intersects function - Azure Databricks - Databricks SQL,Use st_intersects to test geometry intersections,Learn the syntax of the st\_intersects function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 17.1 and above Important This feature is inPublic Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, seeSQL warehouse types. 
Returns true if the two inputGEOMETRYvalues intersect.",2026-01-22T13:21:00.000Z,language-reference,integrations,0.6,True,"Function reference for Databricks-specific boolean predicate on GEOMETRY values, including runtime/warehouse applicability.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_isempty,st_isempty function,st_isempty function - Azure Databricks - Databricks SQL,Use st_isempty to test empty geography or geometry,Learn the syntax of the st\_isempty function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 17.1 and above Important This feature is inPublic Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, seeSQL warehouse types. Returns true if the inputGEOGRAPHYorGEOMETRYvalue does not contain any non-empty points.",2026-01-22T13:21:00.000Z,language-reference,integrations,0.6,True,"Defines Databricks behavior for detecting non-empty points in geospatial values, which is product-specific.",unchanged @@ -4316,7 +4356,7 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_reverse,st_reverse function,st_reverse function - Azure Databricks - Databricks SQL,Use st_reverse to reverse geospatial geometries,Learn the syntax of the st\_reverse function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 17.1 and above Important This feature is inPublic Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, seeSQL warehouse types. 
Reverses the input GEOGRAPHY or GEOMETRY value.",2026-01-22T13:21:00.000Z,language-reference,integrations,0.6,True,"Documents Databricks behavior for reversing GEOGRAPHY/GEOMETRY values, which is specific to this implementation.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_rotate,st_rotate function,st_rotate function - Azure Databricks - Databricks SQL,Use st_rotate to rotate geometries around Z axis,Learn the syntax of the st\_rotate function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime 17.1 and above Important This feature is in Public Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, see SQL warehouse types. Rotates the input GEOMETRY value around the Z axis by the given rotation angle (in radians).",2026-01-22T13:21:00.000Z,language-reference,integrations,0.65,True,Function reference for Databricks-specific rotation behavior using radians around the Z axis.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_scale,st_scale function,st_scale function - Azure Databricks - Databricks SQL,"Use st_scale to scale geometries in X, Y, and Z",Learn the syntax of the st\_scale function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime 17.1 and above Important This feature is in Public Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, see SQL warehouse types. 
Scales the input GEOMETRY value in the X, Y, and, if specified, Z directions using the provided scaling factors.",2026-01-22T13:21:00.000Z,language-reference,integrations,0.65,True,"Describes Databricks SQL function that scales geometries with separate factors per axis, which is concrete API behavior.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_setpoint,st_setpoint function,st_setpoint function - Azure Databricks - Databricks SQL,,Learn the syntax of the st\_setpoint function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime 17.1 and above Important This feature is in Public Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, see SQL warehouse types. Sets the n-th point of the input linestring GEOGRAPHY or GEOMETRY value.",2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Function reference page for st_setpoint with syntax and description; mentions warehouse type applicability but without detailed matrices, limits, or configuration parameters. Does not match any expert-knowledge sub-skill criteria.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_setpoint,st_setpoint function,st_setpoint function - Azure Databricks - Databricks SQL,,Learn the syntax of the st\_setpoint function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime 17.1 and above Important This feature is in Public Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, see SQL warehouse types. 
Sets the n-th point of the input linestring GEOGRAPHY or GEOMETRY value.",2026-04-13T21:52:00.000Z,language-reference,,0.2,False,"Function reference page for st_setpoint with syntax and description; mentions warehouse type applicability but without detailed matrices, limits, or configuration parameters. Does not match any expert-knowledge sub-skill criteria.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_setsrid,st_setsrid function,st_setsrid function - Azure Databricks - Databricks SQL,Use st_setsrid to change SRID of geospatial values,Learn the syntax of the st\_setsrid function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime 17.1 and above Important This feature is in Public Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, see SQL warehouse types. Returns a new GEOGRAPHY or GEOMETRY value whose SRID is the specified SRID value.",2026-01-22T13:21:00.000Z,language-reference,integrations,0.65,True,Function reference for Databricks behavior when returning a new geography/geometry with a specified SRID.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_simplify,st_simplify function,st_simplify function - Azure Databricks - Databricks SQL,Use st_simplify to simplify geometries with Douglas-Peucker,Learn the syntax of the st\_simplify function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime 17.1 and above Important This feature is in Public Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, see SQL warehouse types. 
Simplifies the input GEOMETRY value using the Douglas-Peucker algorithm.",2026-01-22T13:21:00.000Z,language-reference,integrations,0.65,True,"Describes Databricks implementation of geometry simplification using a specific algorithm, which is product-specific behavior.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_srid,st_srid function,st_srid function - Azure Databricks - Databricks SQL,Use st_srid to read SRID from geography or geometry,Learn the syntax of the st\_srid function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime 17.1 and above Important This feature is in Public Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, see SQL warehouse types. Returns the SRID of the input GEOGRAPHY or GEOMETRY value.",2026-01-22T13:21:00.000Z,language-reference,integrations,0.65,True,Function reference for Databricks-specific SRID retrieval from geospatial values.,unchanged @@ -4342,68 +4382,72 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/stddev,stddev aggregate function,stddev aggregate function - Azure Databricks - Databricks SQL,Use stddev aggregate function in Databricks SQL,Learn the syntax of the stddev aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the sample standard deviation calculated from the values in the group. 
This function is a synonym for the std aggregate function.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.7,True,"Function reference with exact name, behavior, and synonym relationship to std; product-specific SQL API details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/stddev_pop,stddev_pop aggregate function,stddev_pop aggregate function - Azure Databricks - Databricks SQL,Use stddev_pop aggregate function in Databricks SQL,Learn the syntax of the stddev\_pop aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the population standard deviation calculated from the values of a group.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.7,True,Documents Databricks SQL aggregate function for population standard deviation; includes precise function semantics unique to this product.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/stddev_samp,stddev_samp aggregate function,stddev_samp aggregate function - Azure Databricks - Databricks SQL,Use stddev_samp aggregate function in Databricks SQL,Learn the syntax of the stddev\_samp aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the sample standard deviation calculated from the values of a group.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.7,True,Provides exact behavior of the sample standard deviation aggregate in Databricks SQL; this is concrete API-level knowledge.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/str_to_map,str_to_map function,str_to_map function - Azure Databricks - Databricks SQL,Use str_to_map in Databricks SQL queries,Learn the syntax of the str\_to\_map function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Creates a map 
after splitting the input into key-value pairs using delimiters.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference pages typically include product-specific syntax, parameters, and behavior details (e.g., delimiters, null-handling) that are effectively API surface documentation and count as integration/coding patterns.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/str_to_map,str_to_map function,str_to_map function - Azure Databricks - Databricks SQL,Use str_to_map in Databricks SQL queries,Learn the syntax of the str\_to\_map function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Creates a map after splitting the input into key-value pairs using delimiters.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference pages typically include product-specific syntax, parameters, and behavior details (e.g., delimiters, null-handling) that are effectively API surface documentation and count as integration/coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/string,string function,string function - Azure Databricks - Databricks SQL,,Learn the syntax of the string function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Casts the value expr to STRING. This function is a synonym for cast(expr AS STRING). 
See the cast function for details.,2024-12-13T19:08:00.000Z,language-reference,,0.2,False,"string cast function; trivial synonym for CAST, no expert knowledge content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/string_agg,string_agg function,string_agg aggregate function - Azure Databricks - Databricks SQL,Use string_agg aggregate function in Databricks SQL,Learn the syntax of the string\_agg aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 16.4 and later Returns concatenated STRING and BINARY values within a group. This function is an alias for the listagg function.,2026-01-24T08:00:00.000Z,language-reference,integrations,0.75,True,"Details Databricks SQL string_agg/listagg behavior for STRING and BINARY, including aliasing; product-specific function semantics and naming are expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/struct,struct function,struct function - Azure Databricks - Databricks SQL,Create structs with struct function in Databricks SQL,Learn the syntax of the struct function of the SQL language in Azure Databricks.,Applies to: Databricks SQL and Databricks Runtime Creates a STRUCT with the specified field values.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.7,True,Explains Databricks SQL struct function behavior and usage; this is specific to the Databricks SQL language surface.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/substr,substr function,substr function - Azure Databricks - Databricks SQL,,Learn the syntax of the substr function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the substring of expr that starts at pos and is of length len. 
This function is a synonym for the substring function.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,"substr function; synonym for substring, basic string operation without expert-only details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/substring,substring function,substring function - Azure Databricks - Databricks SQL,,Learn the syntax of the substring function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the substring of expr that starts at pos and is of length len. This function is a synonym for the substr function.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,"substring function; standard SQL substring behavior, no numeric limits, quotas, or configuration ranges indicated.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/substring_index,substring_index function,substring_index function - Azure Databricks - Databricks SQL,,Learn the syntax of the substring\_index function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the substring of expr before the count occurrence of the delimiter delim.,2024-12-13T19:08:00.000Z,language-reference,,0.4,False,"substring_index; describes behavior relative to delimiter and count, but no product-specific limits or config parameters.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/sum,sum aggregate function,sum aggregate function - Azure Databricks - Databricks SQL,Apply sum aggregate in Databricks SQL,Learn the syntax of the sum aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the sum calculated from the values of a group.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Aggregate function docs usually specify exact syntax, argument types, null semantics, and 
edge-case behavior that are product-specific API details.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/sum,sum aggregate function,sum aggregate function - Azure Databricks - Databricks SQL,Apply sum aggregate in Databricks SQL,Learn the syntax of the sum aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the sum calculated from the values of a group.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Aggregate function docs usually specify exact syntax, argument types, null semantics, and edge-case behavior that are product-specific API details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/table_changes,table_changes table-valued function,table_changes table-valued function - Azure Databricks - Databricks SQL,Use table_changes for Delta Lake change data feed access,Learn the syntax of the table\_changes function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns a log of changes to a Delta Lake table with Change Data Feed enabled. 
To invoke this function you need to have at least one of the following:,2024-12-17T08:00:00.000Z,language-reference,security,0.65,True,"Describes table_changes TVF for Delta Lake CDF and explicitly notes required permissions (“you need to have at least one of the following”), which is product-specific security/authorization guidance mapping function usage to access requirements.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tan,tan function,tan function - Azure Databricks - Databricks SQL,Use tan trigonometric function in Databricks SQL,Learn the syntax of the tan function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the tangent of expr.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.65,True,"While tangent is generic, the exact function name, signature, and availability in Databricks SQL/Runtime are product-specific API details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tanh,tanh function,tanh function - Azure Databricks - Databricks SQL,Use tanh hyperbolic function in Databricks SQL,Learn the syntax of the tanh function of the SQL language in Databricks SQL and Databricks Runtime,Applies to: Databricks SQL and Databricks Runtime Returns the hyperbolic tangent of expr.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.65,True,"Documents the Databricks SQL tanh function, including its exact name and behavior; this is concrete product API knowledge rather than conceptual math.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/theta_difference,theta_difference function,theta_difference function - Azure Databricks - Databricks SQL,Compute theta_difference on Databricks Theta Sketches,Learn the syntax of the theta\_difference function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 18.0 and above 
Computes the set difference (A minus B) of two Theta Sketch binary representations. The returned sketch contains only values that appear in the first sketch but not in the second.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.8,True,"Theta Sketch functions are niche and Databricks-specific; the page will define exact function signature, accepted binary formats, and behavior, which are specialized API/config details.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/theta_intersection,theta_intersection function,theta_intersection function - Azure Databricks - Databricks SQL,Use theta_intersection with Databricks Theta Sketches,Learn the syntax of the theta\_intersection function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 18.0 and above Computes the set intersection of two Theta Sketch binary representations. The returned sketch contains only values that appear in both sketches.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.8,True,"Documents a specialized function over Theta Sketch binary representations with precise syntax and semantics, which is product-specific integration knowledge.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/theta_intersection_agg,theta_intersection_agg aggregate function,theta_intersection_agg aggregate function - Azure Databricks - Databricks SQL,Aggregate Theta Sketch intersections in Databricks SQL,Learn the syntax of the theta\_intersection\_agg aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 18.0 and above Consumes multiple Theta Sketch buffers and intersects them into one result buffer. 
Returns the approximate count of distinct values that appear in all input sketches.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.8,True,"Aggregate variant over multiple Theta Sketch buffers; includes exact function name, arguments, and return behavior that are specific to Databricks SQL.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/theta_sketch_agg,theta_sketch_agg aggregate function,theta_sketch_agg aggregate function - Azure Databricks - Databricks SQL,Build Theta Sketch aggregates in Databricks SQL,Learn the syntax of the theta\_sketch\_agg aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 18.0 and above Creates a Datasketches Theta Sketch from input values for approximate distinct count estimation. The Theta Sketch algorithm provides probabilistic counting of unique values with configurable accuracy.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.85,True,"Describes theta_sketch_agg with configurable accuracy; likely includes parameters controlling sketch size/accuracy and exact usage patterns, which are detailed API/config info.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/theta_sketch_estimate,theta_sketch_estimate function,theta_sketch_estimate function - Azure Databricks - Databricks SQL,Estimate unique counts from Theta Sketches,Learn the syntax of the theta\_sketch\_estimate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 18.0 and above Returns the estimated number of unique values from a Theta Sketch binary representation.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.8,True,"theta_sketch_estimate is a specialized function; the doc will define its signature, accepted types, and estimation behavior, which are product-specific coding patterns.",updated 
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/theta_union,theta_union function,theta_union function - Azure Databricks - Databricks SQL,Union Theta Sketches with Databricks SQL,Learn the syntax of the theta\_union function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 18.0 and above Merges exactly two Theta Sketch binary representations using set union.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.8,True,theta_union operates on Theta Sketch binaries; the exact function contract and usage are specialized integration details not covered by generic LLM knowledge.,updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/theta_union_agg,theta_union_agg aggregate function,theta_union_agg aggregate function - Azure Databricks - Databricks SQL,Aggregate Theta Sketch unions across partitions,Learn the syntax of the theta\_union\_agg aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 18.0 and above Consumes multiple Theta Sketch buffers and merges them using set union into one result buffer. 
Use this function to combine sketches from different partitions or time periods.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.8,True,"theta_union_agg is an aggregate over multiple sketch buffers; the doc will specify arguments, return type, and partition-combination semantics, which are API-level details.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/theta_difference,theta_difference function,theta_difference function - Azure Databricks - Databricks SQL,Compute theta_difference on Databricks Theta Sketches,Learn the syntax of the theta\_difference function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 18.0 and above Computes the set difference (A minus B) of two Theta Sketch binary representations. The returned sketch contains only values that appear in the first sketch but not in the second.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.8,True,"Theta Sketch functions are niche and Databricks-specific; the page will define exact function signature, accepted binary formats, and behavior, which are specialized API/config details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/theta_intersection,theta_intersection function,theta_intersection function - Azure Databricks - Databricks SQL,Use theta_intersection with Databricks Theta Sketches,Learn the syntax of the theta\_intersection function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 18.0 and above Computes the set intersection of two Theta Sketch binary representations. 
The returned sketch contains only values that appear in both sketches.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.8,True,"Documents a specialized function over Theta Sketch binary representations with precise syntax and semantics, which is product-specific integration knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/theta_intersection_agg,theta_intersection_agg aggregate function,theta_intersection_agg aggregate function - Azure Databricks - Databricks SQL,Aggregate Theta Sketch intersections in Databricks SQL,Learn the syntax of the theta\_intersection\_agg aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 18.0 and above Consumes multiple Theta Sketch buffers and intersects them into one result buffer. Returns the approximate count of distinct values that appear in all input sketches.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.8,True,"Aggregate variant over multiple Theta Sketch buffers; includes exact function name, arguments, and return behavior that are specific to Databricks SQL.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/theta_sketch_agg,theta_sketch_agg aggregate function,theta_sketch_agg aggregate function - Azure Databricks - Databricks SQL,Build Theta Sketch aggregates in Databricks SQL,Learn the syntax of the theta\_sketch\_agg aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 18.0 and above Creates a Datasketches Theta Sketch from input values for approximate distinct count estimation. 
The Theta Sketch algorithm provides probabilistic counting of unique values with configurable accuracy.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.85,True,"Describes theta_sketch_agg with configurable accuracy; likely includes parameters controlling sketch size/accuracy and exact usage patterns, which are detailed API/config info.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/theta_sketch_estimate,theta_sketch_estimate function,theta_sketch_estimate function - Azure Databricks - Databricks SQL,Estimate unique counts from Theta Sketches,Learn the syntax of the theta\_sketch\_estimate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 18.0 and above Returns the estimated number of unique values from a Theta Sketch binary representation.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.8,True,"theta_sketch_estimate is a specialized function; the doc will define its signature, accepted types, and estimation behavior, which are product-specific coding patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/theta_union,theta_union function,theta_union function - Azure Databricks - Databricks SQL,Union Theta Sketches with Databricks SQL,Learn the syntax of the theta\_union function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 18.0 and above Merges exactly two Theta Sketch binary representations using set union.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.8,True,theta_union operates on Theta Sketch binaries; the exact function contract and usage are specialized integration details not covered by generic LLM knowledge.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/theta_union_agg,theta_union_agg aggregate function,theta_union_agg aggregate function - Azure Databricks - 
Databricks SQL,Aggregate Theta Sketch unions across partitions,Learn the syntax of the theta\_union\_agg aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 18.0 and above Consumes multiple Theta Sketch buffers and merges them using set union into one result buffer. Use this function to combine sketches from different partitions or time periods.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.8,True,"theta_union_agg is an aggregate over multiple sketch buffers; the doc will specify arguments, return type, and partition-combination semantics, which are API-level details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tildesign,~ (tilde sign) operator,~ (tilde sign) operator - Azure Databricks - Databricks SQL,,Learn the syntax of the \~ (tilde sign) operator of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the bitwise NOT of expr.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,"Bitwise NOT (~) operator; generic SQL operator semantics without Databricks-specific limits, quotas, or configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/timediff,timediff function,timediff function - Azure Databricks - Databricks SQL,,Learn the syntax of the timediff function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 14.0 and above Returns the difference between two timestamps measured in units. 
This function is a synonym for the timestampdiff function.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,"Function reference for timediff; likely just syntax, arguments, and basic examples without limits, quotas, or product-specific configuration tables.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/timestamp,timestamp function,timestamp function - Azure Databricks - Databricks SQL,,Learn the syntax of the timestamp function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Casts expr to TIMESTAMP. This function is a synonym for CAST(expr AS TIMESTAMP). For details see the cast function.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,"timestamp cast function reference; standard SQL-style behavior, no indication of limits, quotas, or special configuration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/timestamp_micros,timestamp_micros function,timestamp_micros function - Azure Databricks - Databricks SQL,Convert microseconds to timestamp in Databricks SQL,Learn the syntax of the timestamp\_micros function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Creates a timestamp expr microseconds since UTC epoch.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"timestamp_micros is a Databricks SQL function; the page will define exact syntax, accepted numeric ranges, and timezone behavior, which are product-specific function details.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/timestamp_millis,timestamp_millis function,timestamp_millis function - Azure Databricks - Databricks SQL,Convert milliseconds to timestamp in Databricks SQL,Learn the syntax of the timestamp\_millis function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Creates a 
timestamp expr milliseconds since UTC epoch.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"timestamp_millis function documentation includes precise signature and behavior (epoch basis, overflow handling) that are specific to Databricks SQL.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/timestamp_seconds,timestamp_seconds function,timestamp_seconds function - Azure Databricks - Databricks SQL,Convert seconds to timestamp in Databricks SQL,Learn the syntax of the timestamp\_seconds function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Creates timestamp expr seconds since UTC epoch.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"timestamp_seconds function reference will specify exact usage and semantics for converting epoch seconds, which is detailed API knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/timestamp_micros,timestamp_micros function,timestamp_micros function - Azure Databricks - Databricks SQL,Convert microseconds to timestamp in Databricks SQL,Learn the syntax of the timestamp\_micros function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Creates a timestamp expr microseconds since UTC epoch.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"timestamp_micros is a Databricks SQL function; the page will define exact syntax, accepted numeric ranges, and timezone behavior, which are product-specific function details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/timestamp_millis,timestamp_millis function,timestamp_millis function - Azure Databricks - Databricks SQL,Convert milliseconds to timestamp in Databricks SQL,Learn the syntax of the timestamp\_millis function of the SQL language in Databricks SQL and Databricks Runtime.,Applies 
to:Databricks SQLDatabricks Runtime Creates a timestampexprmilliseconds since UTC epoch.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"timestamp_millis function documentation includes precise signature and behavior (epoch basis, overflow handling) that are specific to Databricks SQL.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/timestamp_seconds,timestamp_seconds function,timestamp_seconds function - Azure Databricks - Databricks SQL,Convert seconds to timestamp in Databricks SQL,Learn the syntax of the timestamp\_seconds function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Creates timestampexprseconds since UTC epoch.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"timestamp_seconds function reference will specify exact usage and semantics for converting epoch seconds, which is detailed API knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/timestampadd,timestampadd function,timestampadd function - Azure Databricks - Databricks SQL,,Learn the syntax of the timestampadd function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Addsvalueunits to a timestampexpr.,2024-04-21T19:33:00.000Z,language-reference,,0.25,False,"timestampadd reference; standard add-units-to-timestamp semantics, likely just syntax and examples.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/timestampdiff,timestampdiff function,timestampdiff function - Azure Databricks - Databricks SQL,,Learn the syntax of the timestampdiff function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Returns the difference between two timestamps measured 
inunits.,2024-04-18T08:00:00.000Z,language-reference,,0.25,False,"timestampdiff reference; generic difference-between-timestamps function, no special limits or configuration hinted.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tinyint,tinyint function,tinyint function - Azure Databricks - Databricks SQL,Cast expressions to TINYINT in Databricks SQL,Learn the syntax of the tinyint function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Castsexprto TINYINT. This function is a synonym forCAST(expr AS TINYINT). Seecastfunctionfor details.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"tinyint function (CAST synonym) docs typically include exact casting rules, ranges, and edge-case behavior, which are product-specific function semantics.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tinyint,tinyint function,tinyint function - Azure Databricks - Databricks SQL,Cast expressions to TINYINT in Databricks SQL,Learn the syntax of the tinyint function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Castsexprto TINYINT. This function is a synonym forCAST(expr AS TINYINT). 
Seecastfunctionfor details.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"tinyint function (CAST synonym) docs typically include exact casting rules, ranges, and edge-case behavior, which are product-specific function semantics.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_avro,to_avro function,to_avro function - Azure Databricks - Databricks SQL,,Learn the syntax of the to\_avro function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks Runtime 16.0 and above Returns a Avro binary value with the specified input value.,2024-12-16T21:39:00.000Z,language-reference,,0.3,False,"to_avro reference; describes returning Avro binary, but summary does not indicate detailed config tables or limits.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_binary,to_binary function,to_binary function - Azure Databricks - Databricks SQL,Convert values to BINARY with to_binary,Learn the syntax of the to\_binary function of the SQL language in Azure Databricks.,Applies to:Databricks SQL previewDatabricks Runtime 11.3 LTS and above Returnsexprcast to BINARY based onfmt.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Documents Databricks-specific to_binary behavior, including fmt-based casting rules and applicable runtimes, which are product-specific API details rather than generic SQL knowledge.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_char,to_char function,to_char function - Azure Databricks - Databricks SQL,Format values as strings using to_char,Learn the syntax of the to\_char function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returnsexprcast toSTRINGusing formattingfmt. In Databricks Runtime 14.0 and earlierto_charsupportsexprof numeric types. 
In Databricks SQL and Databricks Runtime 14.1 and aboveto_charalso supportsexprof typesDATE,TIMESTAMP, andBINARY to_charis a synonym forto_varchar.",2026-04-13T08:00:00.000Z,language-reference,integrations,0.7,True,"Describes Databricks-specific to_char/to_varchar behavior, supported types by runtime version, and formatting semantics—API details unique to Databricks SQL.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_binary,to_binary function,to_binary function - Azure Databricks - Databricks SQL,Convert values to BINARY with to_binary,Learn the syntax of the to\_binary function of the SQL language in Azure Databricks.,Applies to:Databricks SQL previewDatabricks Runtime 11.3 LTS and above Returnsexprcast to BINARY based onfmt.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Documents Databricks-specific to_binary behavior, including fmt-based casting rules and applicable runtimes, which are product-specific API details rather than generic SQL knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_char,to_char function,to_char function - Azure Databricks - Databricks SQL,Format values as strings using to_char,Learn the syntax of the to\_char function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returnsexprcast toSTRINGusing formattingfmt. In Databricks Runtime 14.0 and earlierto_charsupportsexprof numeric types. 
In Databricks SQL and Databricks Runtime 14.1 and aboveto_charalso supportsexprof typesDATE,TIMESTAMP, andBINARY to_charis a synonym forto_varchar.",2026-04-13T08:00:00.000Z,language-reference,integrations,0.7,True,"Describes Databricks-specific to_char/to_varchar behavior, supported types by runtime version, and formatting semantics—API details unique to Databricks SQL.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_csv,to_csv function,to_csv function - Azure Databricks - Databricks SQL,,Learn the syntax of the to\_csv function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns a CSV string with the specified struct value.,2024-03-01T08:00:00.000Z,language-reference,,0.3,False,"to_csv reference; returns CSV string from struct, typical function reference content.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_date,to_date function,to_date function - Azure Databricks - Databricks SQL,Cast expressions to DATE with to_date,Learn the syntax of the to\_date function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returnsexprcast to a date using an optional formatting.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Provides Databricks SQL-specific casting rules and optional formatting behavior for to_date, which are concrete API semantics not generally known from generic SQL.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_date,to_date function,to_date function - Azure Databricks - Databricks SQL,Cast expressions to DATE with to_date,Learn the syntax of the to\_date function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returnsexprcast to a date using an optional 
formatting.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Provides Databricks SQL-specific casting rules and optional formatting behavior for to_date, which are concrete API semantics not generally known from generic SQL.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_geography,to_geography function,to_geography function - Azure Databricks - Databricks SQL,,Learn the syntax of the to\_geography function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 17.1 and above Important This feature is inPublic Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, seeSQL warehouse types. Parses the input description of a geography and returns the correspondingGEOGRAPHYvalue. The SRID value of the returnedGEOGRAPHYvalue is 4326.",2026-01-22T13:21:00.000Z,language-reference,,0.4,False,"to_geography reference; mentions SRID 4326 and preview/warehouse support, but summary does not show detailed limits, RBAC, or configuration tables required for expert classification.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_geometry,to_geometry function,to_geometry function - Azure Databricks - Databricks SQL,,Learn the syntax of the to\_geometry function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 17.1 and above Important This feature is inPublic Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, seeSQL warehouse types. Parses the input description of a geometry and returns the correspondingGEOMETRYvalue. 
The SRID value of the returnedGEOMETRYvalue depends on the input format.",2026-01-22T13:21:00.000Z,language-reference,,0.4,False,"to_geometry reference; similar to to_geography with SRID note and preview status, but no explicit limits/quotas or security/config matrices indicated.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_json,to_json function,to_json function - Azure Databricks - Databricks SQL,,Learn the syntax of the to\_json function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns a JSON string with theSTRUCTorVARIANTspecified inexpr.,2024-07-29T21:03:00.000Z,language-reference,,0.3,False,"to_json reference; standard struct/variant-to-JSON conversion, likely only syntax and examples.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_number,to_number function,to_number function - Azure Databricks - Databricks SQL,Convert formatted strings to DECIMAL with to_number,Learn the syntax of the to\_number function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returnsexprcast to DECIMAL using formattingfmt.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Covers Databricks-specific to_number syntax and fmt handling for DECIMAL casting, including runtime applicability—product API details beyond generic SQL.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_timestamp,to_timestamp function,to_timestamp function - Azure Databricks - Databricks SQL,Cast expressions to TIMESTAMP with to_timestamp,Learn the syntax of the to\_timestamp function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returnsexprcast to a timestamp using an optional 
formatting.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Documents Databricks-specific to_timestamp behavior, including optional formatting and casting rules, which are concrete API semantics for this platform.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_unix_timestamp,to_unix_timestamp function,to_unix_timestamp function - Azure Databricks - Databricks SQL,Get UNIX epoch from timestamps with to_unix_timestamp,Learn the syntax of the to\_unix\_timestamp function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the timestamp inexpras a UNIX timestamp.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Defines Databricks SQL behavior for converting timestamps to UNIX epoch, including accepted expr types and return semantics—platform-specific function details.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_number,to_number function,to_number function - Azure Databricks - Databricks SQL,Convert formatted strings to DECIMAL with to_number,Learn the syntax of the to\_number function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returnsexprcast to DECIMAL using formattingfmt.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Covers Databricks-specific to_number syntax and fmt handling for DECIMAL casting, including runtime applicability—product API details beyond generic SQL.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_timestamp,to_timestamp function,to_timestamp function - Azure Databricks - Databricks SQL,Cast expressions to TIMESTAMP with to_timestamp,Learn the syntax of the to\_timestamp function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returnsexprcast to a 
timestamp using an optional formatting.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Documents Databricks-specific to_timestamp behavior, including optional formatting and casting rules, which are concrete API semantics for this platform.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_unix_timestamp,to_unix_timestamp function,to_unix_timestamp function - Azure Databricks - Databricks SQL,Get UNIX epoch from timestamps with to_unix_timestamp,Learn the syntax of the to\_unix\_timestamp function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns the timestamp inexpras a UNIX timestamp.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Defines Databricks SQL behavior for converting timestamps to UNIX epoch, including accepted expr types and return semantics—platform-specific function details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_utc_timestamp,to_utc_timestamp function,to_utc_timestamp function - Azure Databricks - Databricks SQL,,Learn the syntax of the to\_utc\_timestamp function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Returns the timestamp atUTCfor a timestampexprattimeZone. 
For a list of valid timezones, seeList of tz database time zones.",2024-11-21T21:05:00.000Z,language-reference,,0.35,False,to_utc_timestamp reference; mentions valid timezones list but that is external; page itself is a basic function reference.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_varchar,to_varchar function,to_varchar function - Azure Databricks - Databricks SQL,Format values as VARCHAR using to_varchar,Learn the syntax of the to\_varchar function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returnsexprcast toSTRINGusing formattingfmt. In Databricks Runtime 14.0 and earlierto_varcharsupportsexprof numeric types. In Databricks SQL and Databricks Runtime 14.1 and aboveto_varcharalso supportsexprof typesDATE,TIMESTAMP, andBINARY to_varcharis a synonym forto_char.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Explains Databricks-specific to_varchar behavior, supported types by runtime version, and synonymy with to_char—detailed API behavior unique to Databricks SQL.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_varchar,to_varchar function,to_varchar function - Azure Databricks - Databricks SQL,Format values as VARCHAR using to_varchar,Learn the syntax of the to\_varchar function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returnsexprcast toSTRINGusing formattingfmt. In Databricks Runtime 14.0 and earlierto_varcharsupportsexprof numeric types. 
In Databricks SQL and Databricks Runtime 14.1 and aboveto_varcharalso supportsexprof typesDATE,TIMESTAMP, andBINARY to_varcharis a synonym forto_char.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Explains Databricks-specific to_varchar behavior, supported types by runtime version, and synonymy with to_char—detailed API behavior unique to Databricks SQL.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_variant_object,to_variant_object function,to_variant_object function - Azure Databricks - Databricks SQL,Convert complex types to VARIANT objects in Databricks,Learn the syntax of the to\_variant\_object function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 15.3 and above Convert a complex expression (ARRAY,MAP,STRUCT) into aVARIANTwhere maps and structs are converted to variant objects which are unordered.MAPcan only haveSTRINGkeys.",2026-01-20T08:00:00.000Z,language-reference,integrations,0.75,True,"Defines how ARRAY, MAP, and STRUCT are converted into VARIANT objects, including the constraint that MAP keys must be STRING and that maps/structs become unordered variant objects. 
This is detailed, product-specific type-conversion behavior.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/to_xml,to_xml function,to_xml function - Azure Databricks - Databricks SQL,,Learn the syntax of the to\_xml function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns an XML string with the struct or variant specified inexpr.,2025-05-30T05:26:00.000Z,language-reference,,0.3,False,"to_xml reference; standard struct/variant-to-XML conversion, typical function documentation.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/transform,transform function,transform function - Azure Databricks - Databricks SQL,,Learn the syntax of the transform function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Transforms elements in an array inexprusing the functionfunc.,2024-03-01T08:00:00.000Z,language-reference,,0.25,False,"transform reference; array transformation function, generic SQL-like behavior.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/transform_keys,transform_keys function,transform_keys function - Azure Databricks - Databricks SQL,Transform map keys with transform_keys function,Learn the syntax of the transform\_keys function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Transforms keys in a map inexprusing the functionfunc.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Function reference with Databricks-specific semantics for applying a func over map keys, including argument behavior and return structure—API-level details.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/transform_keys,transform_keys function,transform_keys function - Azure Databricks - Databricks SQL,Transform map keys 
with transform_keys function,Learn the syntax of the transform\_keys function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Transforms keys in a map inexprusing the functionfunc.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Function reference with Databricks-specific semantics for applying a func over map keys, including argument behavior and return structure—API-level details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/transform_values,transform_values function,transform_values function - Azure Databricks - Databricks SQL,,Learn the syntax of the transform\_values function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Transforms values in a map inexprusing the functionfunc.,2024-03-01T08:00:00.000Z,language-reference,,0.25,False,"transform_values reference; map value transformation, no expert-only details indicated.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/translate,translate function,translate function - Azure Databricks - Databricks SQL,,Learn the syntax of the translate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns anexprwhere all characters infromhave been replaced with those into.,2024-12-12T08:00:00.000Z,language-reference,,0.25,False,"translate reference; character replacement function, generic SQL behavior.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/trim,trim function,trim function - Azure Databricks - Databricks SQL,,Learn the syntax of the trim function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Removes the leading or trailing space characters fromstr. 
Removes the leading or trailingtrimStrcharacters fromstr.,2024-12-13T19:08:00.000Z,language-reference,,0.25,False,"trim reference; whitespace/character trimming, standard SQL function.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/trunc,trunc function,trunc function - Azure Databricks - Databricks SQL,,Learn the syntax of the trunc function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Returns a date with the date truncated to the unit specified by the format modelunit.,2024-03-01T08:00:00.000Z,language-reference,,0.25,False,"trunc reference; date truncation by unit, typical SQL semantics.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_add,try_add function,try_add function - Azure Databricks - Databricks SQL,,Learn the syntax of the try\_add function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Returns the sum ofexpr1andexpr2, or NULL in case of error.",2024-04-18T08:00:00.000Z,language-reference,,0.3,False,"try_add reference; arithmetic with NULL-on-error semantics, but no product-specific limits or configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_aes_decrypt,try_aes_decrypt function,try_aes_decrypt function - Azure Databricks - Databricks SQL,,Learn the syntax of the try\_aes\_decrypt function of the SQL language in Databricks Runtime and Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime 13.3 LTS and above Decrypts a binary produced using AES encryption and returns NULL if that fails for any reason.,2024-04-21T19:33:00.000Z,language-reference,,0.35,False,"try_aes_decrypt reference; decryption with NULL-on-failure, but summary does not show key-size tables, modes, or security configuration details.",unchanged 
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_avg,try_avg aggregate function,try_avg aggregate function - Azure Databricks - Databricks SQL,Compute averages safely with try_avg aggregate,Learn the syntax of the try\_avg function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returns the mean calculated from values of a group. If there is an overflow, returns NULL.",2026-04-13T08:00:00.000Z,language-reference,integrations,0.65,True,"Documents Databricks-specific try_avg behavior, especially overflow handling returning NULL instead of errors—non-generic aggregate semantics.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_avg,try_avg aggregate function,try_avg aggregate function - Azure Databricks - Databricks SQL,Compute averages safely with try_avg aggregate,Learn the syntax of the try\_avg function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 11.3 LTS and above Returns the mean calculated from values of a group. If there is an overflow, returns NULL.",2026-04-13T08:00:00.000Z,language-reference,integrations,0.65,True,"Documents Databricks-specific try_avg behavior, especially overflow handling returning NULL instead of errors—non-generic aggregate semantics.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_cast,try_cast function,try_cast function - Azure Databricks - Databricks SQL,,Learn the syntax of the try\_cast function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Returns the value ofsourceExprcast to thetargetTypeif the cast is supported; otherwise, it returnsNULL, provided that the cast from the type ofsourceExprtotargetTypeis supported. 
If the source and target types are not a valid cast combination, aDATATYPE_MISMATCHerror is returned. SeeReturnsfor supported cast combinations.",2025-05-14T01:43:00.000Z,language-reference,,0.35,False,"try_cast reference; cast-with-NULL-on-failure; while it may list supported combinations, this is still basic type system behavior rather than limits/quotas or configuration matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_divide,try_divide function,try_divide function - Azure Databricks - Databricks SQL,,Learn the syntax of the try\_divide function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Returnsdividenddivided bydivisor, or NULL ifdivisoris 0.",2026-04-10T08:00:00.000Z,language-reference,,0.2,False,"Function reference for try_divide in Databricks SQL; describes syntax and behavior (returns NULL when divisor is 0) but does not include limits, quotas, configuration tables, error-code troubleshooting, or other product-specific numeric thresholds or settings as defined by the sub-skill types.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_element_at,try_element_at function,try_element_at function - Azure Databricks - Databricks SQL,Safely access array or map elements with try_element_at,Learn the syntax of the try\_element\_at function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Returns the element of anarrayExpratindex, or NULL ifindexis out of bound. 
Returns the value ofmapExprforkey, or NULL idkeydoes not exist.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,Details Databricks-specific behavior for out-of-bounds indices and missing keys returning NULL instead of errors—precise function semantics.,updated
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_element_at,try_element_at function,try_element_at function - Azure Databricks - Databricks SQL,Safely access array or map elements with try_element_at,Learn the syntax of the try\_element\_at function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above Returns the element of anarrayExpratindex, or NULL ifindexis out of bound. Returns the value ofmapExprforkey, or NULL ifkeydoes not exist.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,Details Databricks-specific behavior for out-of-bounds indices and missing keys returning NULL instead of errors—precise function semantics.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_ip_as_binary,try_ip_as_binary function,try_ip_as_binary function - Azure Databricks - Databricks SQL,Use try_ip_as_binary in Databricks SQL queries,Learn the syntax of the try\_ip\_as\_binary function of the SQL language in Databricks Runtime.,Applies to:Databricks Runtime 18.2 and above Important This feature is inBeta. Workspace admins can control access to this feature from thePreviewspage. SeeManage Azure Databricks previews. Returns the canonical binary representation of an IP address or CIDR block.
ReturnsNULLinstead of raising an error if the input is invalid.,2026-04-20T17:34:00.000Z,language-reference,integrations,0.7,True,"Function reference page with product-specific SQL syntax, argument types, return types, and behavior (returns NULL instead of error) for try_ip_as_binary, which are concrete API details rather than conceptual overview.",new +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_ip_as_string,try_ip_as_string function,try_ip_as_string function - Azure Databricks - Databricks SQL,Use try_ip_as_string in Databricks SQL queries,Learn the syntax of the try\_ip\_as\_string function of the SQL language in Databricks Runtime.,Applies to:Databricks Runtime 18.2 and above Important This feature is inBeta. Workspace admins can control access to this feature from thePreviewspage. SeeManage Azure Databricks previews. Returns the canonical string representation of an IP address or CIDR block. ReturnsNULLinstead of raising an error if the input is invalid.,2026-04-20T17:34:00.000Z,language-reference,integrations,0.7,True,"Function reference page with Databricks-specific SQL syntax and behavior for try_ip_as_string, including how invalid input is handled (NULL), which is concrete API behavior not generally known.",new +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_ip_cidr,try_ip_cidr function,try_ip_cidr function - Azure Databricks - Databricks SQL,Use try_ip_cidr for CIDR handling in Databricks SQL,Learn the syntax of the try\_ip\_cidr function of the SQL language in Databricks Runtime.,Applies to:Databricks Runtime 18.2 and above Important This feature is inBeta. Workspace admins can control access to this feature from thePreviewspage. SeeManage Azure Databricks previews. Returns the canonical representation of an IPv4 or IPv6 CIDR block. 
ReturnsNULLinstead of raising an error if the input is invalid.,2026-04-20T17:34:00.000Z,language-reference,integrations,0.7,True,"Documents the try_ip_cidr function with exact SQL signature and behavior for IPv4/IPv6 CIDR normalization and error handling, which are product-specific function semantics.",new +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_ip_host,try_ip_host function,try_ip_host function - Azure Databricks - Databricks SQL,Use try_ip_host for IP parsing in Databricks SQL,Learn the syntax of the try\_ip\_host function of the SQL language in Databricks Runtime.,Applies to:Databricks Runtime 18.2 and above Important This feature is inBeta. Workspace admins can control access to this feature from thePreviewspage. SeeManage Azure Databricks previews. Returns the canonical representation of an IPv4 or IPv6 address. ReturnsNULLinstead of raising an error if the input is invalid.,2026-04-20T17:34:00.000Z,language-reference,integrations,0.7,True,"Provides Databricks SQL function details for try_ip_host, including canonicalization rules and NULL-on-error behavior, which are specific API semantics beyond generic knowledge.",new https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_mod,try_mod function,try_mod function - Azure Databricks - Databricks SQL,,Learn the syntax of the try\_mod function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 15.3 and above Returns the remainder afterdividend / divisororNULLifdivisoris0.,2026-01-20T08:00:00.000Z,language-reference,,0.4,False,"Function reference for try_mod in Databricks SQL; describes return behavior but does not include product-specific limits, configuration tables, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_multiply,try_multiply function,try_multiply function - Azure Databricks - Databricks SQL,,Learn 
the syntax of the try\_multiply function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 10.4 LTS and above Returns multiplier multiplied by multiplicand, or NULL on overflow.",2026-01-20T08:00:00.000Z,language-reference,,0.4,False,"Function reference for try_multiply in Databricks SQL; explains overflow behavior but lacks numeric limits, configuration options, or decision/troubleshooting structures required by the sub-skill types.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_parse_json,try_parse_json function,try_parse_json function - Azure Databricks - Databricks SQL,,Learn the syntax of the try\_parse\_json function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 15.3 and above Returns a VARIANT value from the jsonStr if possible, or NULL if not possible.",2025-04-30T00:43:00.000Z,language-reference,,0.35,False,"try_parse_json reference; JSON parsing to VARIANT with NULL-on-failure, but summary does not show detailed config or limits.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_parse_timestamp,try_parse_timestamp function,try_parse_timestamp function - Azure Databricks - Databricks SQL,Parse timestamps safely with try_parse_timestamp,Learn the syntax of the try\_parse\_timestamp function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks Runtime 18.1 and above If expr is a string, parses it into a TIMESTAMP according to the first matching pattern in the given list of formats, or returns NULL if no pattern matches. If expr is a numeric type, parses it as a Unix timestamp.
Invalid or non-matching expr returns NULL instead of raising an error.",2026-04-13T08:00:00.000Z,language-reference,integrations,0.7,True,"Describes Databricks-specific parsing rules, list-of-formats behavior, numeric Unix timestamp handling, and NULL-on-failure semantics—detailed API behavior.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_parse_timestamp,try_parse_timestamp function,try_parse_timestamp function - Azure Databricks - Databricks SQL,Parse timestamps safely with try_parse_timestamp,Learn the syntax of the try\_parse\_timestamp function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks Runtime 18.1 and above If expr is a string, parses it into a TIMESTAMP according to the first matching pattern in the given list of formats, or returns NULL if no pattern matches. If expr is a numeric type, parses it as a Unix timestamp. Invalid or non-matching expr returns NULL instead of raising an error.",2026-04-13T08:00:00.000Z,language-reference,integrations,0.7,True,"Describes Databricks-specific parsing rules, list-of-formats behavior, numeric Unix timestamp handling, and NULL-on-failure semantics—detailed API behavior.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_reflect,try_reflect function,try_reflect function - Azure Databricks - Databricks SQL,,Learn the syntax of the try\_reflect function of the SQL language in Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 14.1 and above Call a method with reflection, returning NULL if the method returns an exception.
To return an error instead use reflect.",2024-03-01T08:00:00.000Z,language-reference,,0.3,False,"Function reference page with syntax/behavior description but no limits, configs, error-code mappings, or product-specific thresholds beyond what an LLM can infer from general SQL knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_secret,try_secret function,try_secret function - Azure Databricks - Databricks SQL,Retrieve Databricks secrets with try_secret function,Learn the syntax of the try\_secret function of the SQL language in Azure Databricks.,"Applies to: Databricks SQL preview Databricks Runtime 15.0 and above Extracts a secret value with the given scope and key from Databricks secret service, or NULL if the key cannot be retrieved.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,"Covers Databricks SQL integration with Databricks secret service via try_secret, including scope/key parameters and NULL-on-failure behavior—product-specific integration API.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_secret,try_secret function,try_secret function - Azure Databricks - Databricks SQL,Retrieve Databricks secrets with try_secret function,Learn the syntax of the try\_secret function of the SQL language in Azure Databricks.,"Applies to: Databricks SQL preview Databricks Runtime 15.0 and above Extracts a secret value with the given scope and key from Databricks secret service, or NULL if the key cannot be retrieved.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,"Covers Databricks SQL integration with Databricks secret service via try_secret, including scope/key parameters and NULL-on-failure behavior—product-specific integration API.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_subtract,try_subtract function,try_subtract function - Azure Databricks - Databricks SQL,,Learn the syntax of the
try\_subtract function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 10.4 LTS and above Returns the subtraction of expr2 from expr1, or NULL on overflow.",2025-01-22T08:00:00.000Z,language-reference,,0.3,False,"Arithmetic try_ function; behavior (NULL on overflow) is generic and not configuration-, limit-, or error-code–oriented expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_sum,try_sum aggregate function,try_sum aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the try\_sum aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 11.3 LTS and above Returns the sum calculated from values of a group, or NULL if there is an overflow.",2026-01-20T08:00:00.000Z,language-reference,,0.4,False,"Aggregate function reference for try_sum in Databricks SQL; documents that it returns NULL on overflow but does not provide detailed limits, configuration parameters, or structured troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_to_binary,try_to_binary function,try_to_binary function - Azure Databricks - Databricks SQL,,Learn the syntax of the try\_to\_binary function of the SQL language in Azure Databricks.,"Applies to: Databricks SQL preview Databricks Runtime 11.3 LTS and above Returns expr cast to BINARY based on fmt, or NULL if the input is not valid.",2024-04-18T08:00:00.000Z,language-reference,,0.4,False,"Casts to BINARY with format; likely just syntax and examples, without configuration tables, limits, or security/decision matrices.",unchanged @@ -4411,30 +4455,30 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions The SRID value of the returned GEOGRAPHY value is 4326.",2026-01-22T13:21:00.000Z,language-reference,,0.4,False,"Geography parsing function; SRID 4326
is noted but this is standard GIS knowledge, not a configuration, limit, or troubleshooting guide.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_to_geometry,try_to_geometry function,try_to_geometry function - Azure Databricks - Databricks SQL,,Learn the syntax of the try\_to\_geometry function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 17.1 and above Important This feature is in Public Preview. Note This feature is not available on Databricks SQL Classic warehouses. To learn more about Databricks SQL warehouses, see SQL warehouse types. Parses the input description of a geometry and returns the corresponding GEOMETRY value, or NULL if the input description is invalid. The SRID value of the returned GEOMETRY value depends on the input format.",2026-01-22T13:21:00.000Z,language-reference,,0.4,False,"Geometry parsing function; SRID depends on input format but page is a basic function reference, not limits/config/troubleshooting content.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_to_number,try_to_number function,try_to_number function - Azure Databricks - Databricks SQL,Convert formatted strings to DECIMAL with try_to_number,Learn the syntax of the try\_to\_number function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 11.3 LTS and above Returns expr cast to DECIMAL using formatting fmt, or NULL if expr does not match the format.",2026-04-13T08:00:00.000Z,language-reference,integrations,0.7,True,"Documents Databricks-specific try_to_number behavior, including fmt handling and NULL return on format mismatch—detailed casting semantics unique to this platform.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_to_timestamp,try_to_timestamp function,try_to_timestamp function - Azure Databricks - Databricks SQL,Use
try_to_timestamp in Databricks SQL queries,Learn the syntax of the try\_to\_timestamp function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 11.3 LTS and above Returns expr cast to a timestamp using an optional formatting, or NULL if the cast fails.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference page with product-specific syntax, arguments, and behavior for Databricks SQL; this is concrete API-level knowledge not reliably known from training.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_to_number,try_to_number function,try_to_number function - Azure Databricks - Databricks SQL,Convert formatted strings to DECIMAL with try_to_number,Learn the syntax of the try\_to\_number function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 11.3 LTS and above Returns expr cast to DECIMAL using formatting fmt, or NULL if expr does not match the format.",2026-04-13T08:00:00.000Z,language-reference,integrations,0.7,True,"Documents Databricks-specific try_to_number behavior, including fmt handling and NULL return on format mismatch—detailed casting semantics unique to this platform.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_to_timestamp,try_to_timestamp function,try_to_timestamp function - Azure Databricks - Databricks SQL,Use try_to_timestamp in Databricks SQL queries,Learn the syntax of the try\_to\_timestamp function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 11.3 LTS and above Returns expr cast to a timestamp using an optional formatting, or NULL if the cast fails.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference page with product-specific syntax, arguments, and behavior for Databricks SQL; this is concrete API-level knowledge not
reliably known from training.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_url_decode,try_url_decode function,try_url_decode function - Azure Databricks - Databricks SQL,,Learn the syntax of the try\_url\_decode function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks Runtime 16.0 and above Translates a string back from application/x-www-form-urlencoded format, or NULL if the format is incorrect.",2024-12-13T08:00:00.000Z,language-reference,,0.3,False,"URL decode helper returning NULL on invalid format; generic behavior, no expert-only limits or configs.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_validate_utf8,try_validate_utf8 function,try_validate_utf8 function - Azure Databricks - Databricks SQL,,Learn the syntax of the try\_validate\_utf8 function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 15.4 and above Returns the input value if it corresponds to a valid UTF-8 string, or NULL otherwise.",2025-04-30T00:43:00.000Z,language-reference,,0.3,False,"UTF-8 validation helper; returns input or NULL, but no detailed configuration, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_variant_get,try_variant_get function,try_variant_get function - Azure Databricks - Databricks SQL,,Learn the syntax of the try\_variant\_get function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 15.3 and above Extracts a value of type type from variantExpr, specified by path, or NULL if it is not possible to cast to the target type.",2025-04-30T00:43:00.000Z,language-reference,,0.4,False,Variant extraction helper; product-specific type but page is a simple function reference without config tables or limits.,unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_zstd_decompress,try_zstd_decompress function,try_zstd_decompress function - Azure Databricks - Databricks SQL,,Learn the syntax of the try\_zstd\_decompress function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 15.2 and above Returns value decompressed with Zstandard compression. On decompression failure, the function returns NULL",2024-06-04T18:15:00.000Z,language-reference,,0.4,False,"Zstandard decompression helper returning NULL on failure; no quotas, configs, or error-code mapping content.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_difference_double,tuple_difference_double function,tuple_difference_double function - Azure Databricks - Databricks SQL,Compute tuple_difference_double in Databricks SQL,Learn the syntax of the tuple\_difference\_double function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Computes the set difference (A minus B) of two TupleSketch binary representations with double summaries.
The returned sketch contains only keys that appear in the first sketch but not in the second.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Documents a specific Databricks SQL function, including its signature and semantics for TupleSketch binaries with double summaries, which is product-specific API behavior.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_difference_integer,tuple_difference_integer function,tuple_difference_integer function - Azure Databricks - Databricks SQL,Compute tuple_difference_integer in Databricks SQL,Learn the syntax of the tuple\_difference\_integer function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Computes the set difference (A minus B) of two TupleSketch binary representations with integer summaries. The returned sketch contains only keys that appear in the first sketch but not in the second.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference for a Databricks-only TupleSketch operation with integer summaries; includes exact function name and usage semantics that are expert, product-specific details.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_intersection_agg_double,tuple_intersection_agg_double aggregate function,tuple_intersection_agg_double aggregate function - Azure Databricks - Databricks SQL,Aggregate tuple_intersection_agg_double sketches in SQL,Learn the syntax of the tuple\_intersection\_agg\_double aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Computes the intersection of multiple TupleSketch binary representations with double summaries.
Returns a sketch containing only keys common to all input sketches.,2026-04-13T08:00:00.000Z,language-reference,integrations,0.7,True,Describes a Databricks aggregate function over TupleSketch binaries with double summaries; exposes concrete API surface and behavior unique to this runtime.,updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_intersection_agg_integer,tuple_intersection_agg_integer aggregate function,tuple_intersection_agg_integer aggregate function - Azure Databricks - Databricks SQL,Aggregate tuple_intersection_agg_integer sketches in SQL,Learn the syntax of the tuple\_intersection\_agg\_integer aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Computes the intersection of multiple TupleSketch binary representations with integer summaries. Returns a sketch containing only keys common to all input sketches.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Provides syntax and semantics for a Databricks aggregate function on TupleSketch binaries with integer summaries, which is specialized API knowledge.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_intersection_double,tuple_intersection_double function,tuple_intersection_double function - Azure Databricks - Databricks SQL,Use tuple_intersection_double on TupleSketches,Learn the syntax of the tuple\_intersection\_double function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Computes the set intersection of exactly two TupleSketch binary representations with double summaries.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,Function reference for intersecting two TupleSketch binaries with double summaries; details of this function are specific to Databricks SQL.,updated
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_intersection_integer,tuple_intersection_integer function,tuple_intersection_integer function - Azure Databricks - Databricks SQL,Use tuple_intersection_integer on TupleSketches,Learn the syntax of the tuple\_intersection\_integer function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Computes the set intersection of exactly two TupleSketch binary representations with integer summaries.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,Documents a Databricks SQL function for intersecting TupleSketch binaries with integer summaries; includes exact function behavior and usage.,updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_sketch_agg_double,tuple_sketch_agg_double aggregate function,tuple_sketch_agg_double aggregate function - Azure Databricks - Databricks SQL,Create TupleSketches with tuple_sketch_agg_double,Learn the syntax of the tuple\_sketch\_agg\_double aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Creates a Datasketches TupleSketch from key-value pairs where keys are used for distinct counting and double summary values are aggregated according to the specified mode.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,"Aggregate function reference for building TupleSketches from key–double pairs with specific aggregation modes; this is detailed, product-specific API information.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_sketch_agg_integer,tuple_sketch_agg_integer aggregate function,tuple_sketch_agg_integer aggregate function - Azure Databricks - Databricks SQL,Create TupleSketches with tuple_sketch_agg_integer,Learn the syntax of the tuple\_sketch\_agg\_integer aggregate function of the SQL language in
Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Creates a Datasketches TupleSketch from key-value pairs where keys are used for distinct counting and integer summary values are aggregated according to the specified mode.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,Describes how to aggregate key–integer pairs into TupleSketches in Databricks SQL; exposes concrete function signature and semantics.,updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_sketch_estimate_double,tuple_sketch_estimate_double function,tuple_sketch_estimate_double function - Azure Databricks - Databricks SQL,Estimate unique keys with tuple_sketch_estimate_double,Learn the syntax of the tuple\_sketch\_estimate\_double function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Returns the estimated number of unique keys from a TupleSketch with double summaries.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,Function reference for estimating cardinality from TupleSketches with double summaries; this is a specialized Databricks SQL API.,updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_sketch_estimate_integer,tuple_sketch_estimate_integer function,tuple_sketch_estimate_integer function - Azure Databricks - Databricks SQL,Estimate unique keys with tuple_sketch_estimate_integer,Learn the syntax of the tuple\_sketch\_estimate\_integer function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Returns the estimated number of unique keys from a TupleSketch with integer summaries.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,Documents a Databricks SQL function that estimates unique keys from TupleSketches with integer summaries; concrete API-level knowledge.,updated
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_sketch_summary_double,tuple_sketch_summary_double function,tuple_sketch_summary_double function - Azure Databricks - Databricks SQL,Summarize TupleSketch doubles with tuple_sketch_summary_double,Learn the syntax of the tuple\_sketch\_summary\_double function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Aggregates the summary values from a TupleSketch with double summaries according to the specified mode.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,"Provides syntax and behavior for aggregating double summaries from TupleSketches using specific modes; this is detailed, product-specific function behavior.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_sketch_summary_integer,tuple_sketch_summary_integer function,tuple_sketch_summary_integer function - Azure Databricks - Databricks SQL,Summarize TupleSketch integers with tuple_sketch_summary_integer,Learn the syntax of the tuple\_sketch\_summary\_integer function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Aggregates the summary values from a TupleSketch with integer summaries according to the specified mode.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,Function reference for aggregating integer summaries from TupleSketches; includes exact function name and semantics unique to Databricks SQL.,updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_difference_double,tuple_difference_double function,tuple_difference_double function - Azure Databricks - Databricks SQL,Compute tuple_difference_double in Databricks SQL,Learn the syntax of the tuple\_difference\_double function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above
Computes the set difference (A minus B) of two TupleSketch binary representations with double summaries. The returned sketch contains only keys that appear in the first sketch but not in the second.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Documents a specific Databricks SQL function, including its signature and semantics for TupleSketch binaries with double summaries, which is product-specific API behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_difference_integer,tuple_difference_integer function,tuple_difference_integer function - Azure Databricks - Databricks SQL,Compute tuple_difference_integer in Databricks SQL,Learn the syntax of the tuple\_difference\_integer function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Computes the set difference (A minus B) of two TupleSketch binary representations with integer summaries. The returned sketch contains only keys that appear in the first sketch but not in the second.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference for a Databricks-only TupleSketch operation with integer summaries; includes exact function name and usage semantics that are expert, product-specific details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_intersection_agg_double,tuple_intersection_agg_double aggregate function,tuple_intersection_agg_double aggregate function - Azure Databricks - Databricks SQL,Aggregate tuple_intersection_agg_double sketches in SQL,Learn the syntax of the tuple\_intersection\_agg\_double aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Computes the intersection of multiple TupleSketch binary representations with double summaries.
Returns a sketch containing only keys common to all input sketches.,2026-04-13T08:00:00.000Z,language-reference,integrations,0.7,True,Describes a Databricks aggregate function over TupleSketch binaries with double summaries; exposes concrete API surface and behavior unique to this runtime.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_intersection_agg_integer,tuple_intersection_agg_integer aggregate function,tuple_intersection_agg_integer aggregate function - Azure Databricks - Databricks SQL,Aggregate tuple_intersection_agg_integer sketches in SQL,Learn the syntax of the tuple\_intersection\_agg\_integer aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Computes the intersection of multiple TupleSketch binary representations with integer summaries. Returns a sketch containing only keys common to all input sketches.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Provides syntax and semantics for a Databricks aggregate function on TupleSketch binaries with integer summaries, which is specialized API knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_intersection_double,tuple_intersection_double function,tuple_intersection_double function - Azure Databricks - Databricks SQL,Use tuple_intersection_double on TupleSketches,Learn the syntax of the tuple\_intersection\_double function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Computes the set intersection of exactly two TupleSketch binary representations with double summaries.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,Function reference for intersecting two TupleSketch binaries with double summaries; details of this function are specific to Databricks SQL.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_intersection_integer,tuple_intersection_integer function,tuple_intersection_integer function - Azure Databricks - Databricks SQL,Use tuple_intersection_integer on TupleSketches,Learn the syntax of the tuple\_intersection\_integer function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Computes the set intersection of exactly two TupleSketch binary representations with integer summaries.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,Documents a Databricks SQL function for intersecting TupleSketch binaries with integer summaries; includes exact function behavior and usage.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_sketch_agg_double,tuple_sketch_agg_double aggregate function,tuple_sketch_agg_double aggregate function - Azure Databricks - Databricks SQL,Create TupleSketches with tuple_sketch_agg_double,Learn the syntax of the tuple\_sketch\_agg\_double aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Creates a Datasketches TupleSketch from key-value pairs where keys are used for distinct counting and double summary values are aggregated according to the specified mode.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,"Aggregate function reference for building TupleSketches from key–double pairs with specific aggregation modes; this is detailed, product-specific API information.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_sketch_agg_integer,tuple_sketch_agg_integer aggregate function,tuple_sketch_agg_integer aggregate function - Azure Databricks - Databricks SQL,Create TupleSketches with tuple_sketch_agg_integer,Learn the syntax of the tuple\_sketch\_agg\_integer aggregate function of the SQL language
in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Creates a Datasketches TupleSketch from key-value pairs where keys are used for distinct counting and integer summary values are aggregated according to the specified mode.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,Describes how to aggregate key–integer pairs into TupleSketches in Databricks SQL; exposes concrete function signature and semantics.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_sketch_estimate_double,tuple_sketch_estimate_double function,tuple_sketch_estimate_double function - Azure Databricks - Databricks SQL,Estimate unique keys with tuple_sketch_estimate_double,Learn the syntax of the tuple\_sketch\_estimate\_double function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Returns the estimated number of unique keys from a TupleSketch with double summaries.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,Function reference for estimating cardinality from TupleSketches with double summaries; this is a specialized Databricks SQL API.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_sketch_estimate_integer,tuple_sketch_estimate_integer function,tuple_sketch_estimate_integer function - Azure Databricks - Databricks SQL,Estimate unique keys with tuple_sketch_estimate_integer,Learn the syntax of the tuple\_sketch\_estimate\_integer function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Returns the estimated number of unique keys from a TupleSketch with integer summaries.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,Documents a Databricks SQL function that estimates unique keys from TupleSketches with integer summaries; concrete API-level knowledge.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_sketch_summary_double,tuple_sketch_summary_double function,tuple_sketch_summary_double function - Azure Databricks - Databricks SQL,Summarize TupleSketch doubles with tuple_sketch_summary_double,Learn the syntax of the tuple\_sketch\_summary\_double function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Aggregates the summary values from a TupleSketch with double summaries according to the specified mode.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,"Provides syntax and behavior for aggregating double summaries from TupleSketches using specific modes; this is detailed, product-specific function behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_sketch_summary_integer,tuple_sketch_summary_integer function,tuple_sketch_summary_integer function - Azure Databricks - Databricks SQL,Summarize TupleSketch integers with tuple_sketch_summary_integer,Learn the syntax of the tuple\_sketch\_summary\_integer function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Aggregates the summary values from a TupleSketch with integer summaries according to the specified mode.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,Function reference for aggregating integer summaries from TupleSketches; includes exact function name and semantics unique to Databricks SQL.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_sketch_theta_double,tuple_sketch_theta_double function,tuple_sketch_theta_double function - Azure Databricks - Databricks SQL,,Learn the syntax of the tuple\_sketch\_theta\_double function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Returns the theta value (sampling
rate) from a TupleSketch with double summaries.,2026-02-18T22:10:00.000Z,language-reference,,0.4,False,Theta (sampling rate) accessor; function semantics without expert configuration or limits.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_sketch_theta_integer,tuple_sketch_theta_integer function,tuple_sketch_theta_integer function - Azure Databricks - Databricks SQL,,Learn the syntax of the tuple\_sketch\_theta\_integer function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Returns the theta value (sampling rate) from a TupleSketch with integer summaries.,2026-02-18T22:10:00.000Z,language-reference,,0.4,False,Theta accessor for integer summaries; same pattern as tuple_sketch_theta_double.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_union_agg_double,tuple_union_agg_double aggregate function,tuple_union_agg_double aggregate function - Azure Databricks - Databricks SQL,Union multiple TupleSketches with tuple_union_agg_double,Learn the syntax of the tuple\_union\_agg\_double aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Unions multiple TupleSketch binary representations with double summaries into a single merged sketch.
Use this function to combine pre-aggregated sketches from different partitions or data sources.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.8,True,Aggregate function documentation for unioning multiple TupleSketch binaries with double summaries; describes specific API usage for combining pre-aggregated sketches.,updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_union_agg_integer,tuple_union_agg_integer aggregate function,tuple_union_agg_integer aggregate function - Azure Databricks - Databricks SQL,Union multiple TupleSketches with tuple_union_agg_integer,Learn the syntax of the tuple\_union\_agg\_integer aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Unions multiple TupleSketch binary representations with integer summaries into a single merged sketch. Use this function to combine pre-aggregated sketches from different partitions or data sources.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.8,True,"Describes Databricks SQL aggregate function to union TupleSketch binaries with integer summaries; concrete, product-specific API details.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_union_double,tuple_union_double function,tuple_union_double function - Azure Databricks - Databricks SQL,Merge two TupleSketches with tuple_union_double,Learn the syntax of the tuple\_union\_double function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Merges exactly two TupleSketch binary representations with double summaries using set union.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,Function reference for unioning exactly two TupleSketch binaries with double summaries; specialized Databricks SQL function semantics.,updated 
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_union_integer,tuple_union_integer function,tuple_union_integer function - Azure Databricks - Databricks SQL,Merge two TupleSketches with tuple_union_integer,Learn the syntax of the tuple\_union\_integer function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Merges exactly two TupleSketch binary representations with integer summaries using set union.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,Documents a Databricks SQL function that unions two TupleSketch binaries with integer summaries; API-level behavior not generally known.,updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_union_agg_double,tuple_union_agg_double aggregate function,tuple_union_agg_double aggregate function - Azure Databricks - Databricks SQL,Union multiple TupleSketches with tuple_union_agg_double,Learn the syntax of the tuple\_union\_agg\_double aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Unions multiple TupleSketch binary representations with double summaries into a single merged sketch.
Use this function to combine pre-aggregated sketches from different partitions or data sources.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.8,True,Aggregate function documentation for unioning multiple TupleSketch binaries with double summaries; describes specific API usage for combining pre-aggregated sketches.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_union_agg_integer,tuple_union_agg_integer aggregate function,tuple_union_agg_integer aggregate function - Azure Databricks - Databricks SQL,Union multiple TupleSketches with tuple_union_agg_integer,Learn the syntax of the tuple\_union\_agg\_integer aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Unions multiple TupleSketch binary representations with integer summaries into a single merged sketch. Use this function to combine pre-aggregated sketches from different partitions or data sources.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.8,True,"Describes Databricks SQL aggregate function to union TupleSketch binaries with integer summaries; concrete, product-specific API details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_union_double,tuple_union_double function,tuple_union_double function - Azure Databricks - Databricks SQL,Merge two TupleSketches with tuple_union_double,Learn the syntax of the tuple\_union\_double function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Merges exactly two TupleSketch binary representations with double summaries using set union.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,Function reference for unioning exactly two TupleSketch binaries with double summaries; specialized Databricks SQL function semantics.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_union_integer,tuple_union_integer function,tuple_union_integer function - Azure Databricks - Databricks SQL,Merge two TupleSketches with tuple_union_integer,Learn the syntax of the tuple\_union\_integer function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Merges exactly two TupleSketch binary representations with integer summaries using set union.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.75,True,Documents a Databricks SQL function that unions two TupleSketch binaries with integer summaries; API-level behavior not generally known.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/typeof,typeof function,typeof function - Azure Databricks - Databricks SQL,,Learn the syntax of the typeof function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Return a DDL-formatted type string for the data type of the input.,2024-12-12T08:00:00.000Z,language-reference,,0.2,False,"Type inspection helper returning DDL-formatted type; generic behavior, no product-specific limits or configs.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ucase,ucase function,ucase function - Azure Databricks - Databricks SQL,,Learn the syntax of the ucase function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime Returns expr with all characters changed to uppercase according to the collation of expr. 
This function is a synonym for upper function.",2024-12-13T19:08:00.000Z,language-reference,,0.2,False,"Uppercase string function; synonym for upper, entirely generic.",unchanged @@ -4445,30 +4489,30 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/unix_micros,unix_micros function,unix_micros function - Azure Databricks - Databricks SQL,,Learn the syntax of the unix\_micros function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the number of microseconds since 1970-01-01 00:00:00 UTC.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,Microseconds since Unix epoch; generic timestamp function behavior.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/unix_millis,unix_millis function,unix_millis function - Azure Databricks - Databricks SQL,,Learn the syntax of the unix\_millis function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the number of milliseconds since 1970-01-01 00:00:00 UTC.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,Milliseconds since Unix epoch; generic timestamp function behavior.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/unix_seconds,unix_seconds function,unix_seconds function - Azure Databricks - Databricks SQL,,Learn the syntax of the unix\_seconds function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the number of seconds since 1970-01-01 00:00:00 UTC.,2024-03-01T08:00:00.000Z,language-reference,,0.2,False,Seconds since Unix epoch; generic timestamp function behavior.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/unix_timestamp,unix_timestamp function,unix_timestamp function - Azure
Databricks - Databricks SQL,Use unix_timestamp in Azure Databricks SQL,Learn the syntax of the unix\_timestamp function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the UNIX timestamp of current or specified time.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"While unix_timestamp is conceptually common, this page defines the exact Databricks SQL syntax, arguments, and behavior, which are product-specific integration details.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/unix_timestamp,unix_timestamp function,unix_timestamp function - Azure Databricks - Databricks SQL,Use unix_timestamp in Azure Databricks SQL,Learn the syntax of the unix\_timestamp function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the UNIX timestamp of current or specified time.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"While unix_timestamp is conceptually common, this page defines the exact Databricks SQL syntax, arguments, and behavior, which are product-specific integration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/upper,upper function,upper function - Azure Databricks - Databricks SQL,Convert strings to uppercase with Databricks upper,Learn the syntax of the upper function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime Returns expr with all characters changed to uppercase according to the collation of expr. 
This function is a synonym for ucase function.",2024-12-13T19:08:00.000Z,language-reference,integrations,0.65,True,"Documents Databricks SQL upper/ucase function behavior and usage, including collation-specific behavior; these are concrete API semantics.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/url_decode,url_decode function,url_decode function - Azure Databricks - Databricks SQL,Decode URL-encoded strings with url_decode,Learn the syntax of the url\_decode function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 11.3 LTS and above Translates a string back from application/x-www-form-urlencoded format.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Function reference for Databricks SQL url_decode, including its exact behavior and applicable runtimes; this is concrete API knowledge for this platform.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/url_decode,url_decode function,url_decode function - Azure Databricks - Databricks SQL,Decode URL-encoded strings with url_decode,Learn the syntax of the url\_decode function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 11.3 LTS and above Translates a string back from application/x-www-form-urlencoded format.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.65,True,"Function reference for Databricks SQL url_decode, including its exact behavior and applicable runtimes; this is concrete API knowledge for this platform.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/url_encode,url_encode function,url_encode function - Azure Databricks - Databricks SQL,Encode strings as URLs in Databricks SQL,Learn the syntax of the url\_encode function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 11.3
LTS and above Translates a string into application/x-www-form-urlencoded format.,2024-04-21T19:33:00.000Z,language-reference,integrations,0.7,True,"Describes url_encode function with Databricks-specific version applicability and behavior, representing concrete function-level integration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/user,user function,user function - Azure Databricks - Databricks SQL,Get current user with Databricks user function,Learn the syntax of the user function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime 13.3 LTS and above Returns the user executing the statement. This function is an alias for current_user. Note The SQL standard differentiates between CURRENT_USER and SESSION_USER. In Databricks SQL and Databricks Runtime 14.1 and above you should use SESSION_USER instead of CURRENT_USER or USER.",2024-04-21T19:33:00.000Z,language-reference,integrations,0.7,True,"Documents user/current_user/session_user behavior and version-specific guidance, which are detailed, product-specific SQL function semantics.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/uuid,uuid function,uuid function - Azure Databricks - Databricks SQL,,Learn the syntax of the uuid function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime Returns a universally unique identifier (UUID) string. A UUID is a 128-bit value used to uniquely identify objects or entities on the Internet. The term ""globally unique identifier"" (GUID) is also used. 
While UUIDs are not guaranteed to be unique, the probability of two UUIDs being the same is so low that they can be considered unique for practical purposes.",2026-01-20T08:00:00.000Z,language-reference,,0.4,False,"uuid() function page describes standard UUID behavior; lacks Databricks-specific limits, configuration parameters, or troubleshooting details required for any sub-skill type.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/validate_utf8,validate_utf8 function,validate_utf8 function - Azure Databricks - Databricks SQL,Validate UTF-8 strings and handle INVALID_UTF8_STRING,Learn the syntax of the validate\_utf8 function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime 15.4 and above Returns the input value if it corresponds to a valid UTF-8 string, or raises INVALID_UTF8_STRING otherwise.",2026-04-13T21:52:00.000Z,language-reference,troubleshooting,0.7,True,"Documents validate_utf8 behavior including a specific error condition/exception name (INVALID_UTF8_STRING), which is useful for diagnosing and handling product-specific errors.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/validate_utf8,validate_utf8 function,validate_utf8 function - Azure Databricks - Databricks SQL,Validate UTF-8 strings and handle INVALID_UTF8_STRING,Learn the syntax of the validate\_utf8 function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime 15.4 and above Returns the input value if it corresponds to a valid UTF-8 string, or raises INVALID_UTF8_STRING otherwise.",2026-04-13T21:52:00.000Z,language-reference,troubleshooting,0.7,True,"Documents validate_utf8 behavior including a specific error condition/exception name (INVALID_UTF8_STRING), which is useful for diagnosing and handling product-specific errors.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/var_pop,var_pop aggregate function,var_pop aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the var\_pop aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the population variance calculated from values of a group.,2026-01-20T08:00:00.000Z,language-reference,,0.4,False,"var_pop aggregate function page is basic SQL reference (syntax and definition of population variance) without product-specific limits, configs, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/var_samp,var_samp aggregate function,var_samp aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the var\_samp aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the sample variance calculated from values of a group. This function is a synonym for variance aggregate function.,2026-01-20T08:00:00.000Z,language-reference,,0.4,False,"var_samp aggregate function page is standard SQL semantics; no Databricks-specific configuration, limits, or troubleshooting content that fits the expert-knowledge sub-skills.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/variance,variance aggregate function,variance aggregate function - Azure Databricks - Databricks SQL,,Learn the syntax of the variance aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the sample variance calculated from values of a group. 
This function is a synonym for var_samp aggregate function.,2026-01-20T08:00:00.000Z,language-reference,,0.4,False,"variance aggregate function page is a synonym description for var_samp; contains only generic SQL behavior, not product-specific expert details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/variant_explode,variant_explode table-valued function,variant_explode table-valued function - Azure Databricks - Databricks SQL,Unnest VARIANT data with variant_explode in Databricks,Learn the syntax of the variant\_explode table function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 15.3 and above Returns a set of rows by un-nesting input. In Databricks SQL and Databricks Runtime 16.1 and above this function supports named parameter invocation.,2025-02-05T19:42:00.000Z,language-reference,integrations,0.75,True,Table-valued function reference with Databricks-specific behavior for un-nesting VARIANT data and version-specific named parameter support.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/variant_explode_outer,variant_explode_outer table-valued function,variant_explode_outer table-valued function - Azure Databricks - Databricks SQL,Outer explode VARIANT data with variant_explode_outer,Learn the syntax of the variant\_explode\_outer table function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 15.3 and above Returns a set of rows by un-nesting variantExpr using outer semantics.,2024-07-29T21:03:00.000Z,language-reference,integrations,0.75,True,"Documents variant_explode_outer table function and its outer semantics, which are detailed Databricks SQL function behaviors.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/variant_get,variant_get function,variant_get function - Azure Databricks - Databricks SQL,Use
variant_get to extract values from VARIANT columns in Databricks SQL,Learn the syntax of the variant\_get function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime 15.3 and above Extracts a value of type from variantExpr, specified by path.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference pages typically include product-specific syntax, argument types, return types, and edge-case behavior (for example, how paths are specified, null-handling, and error conditions) that go beyond generic SQL knowledge. This is expert, product-specific API surface best aligned with integrations & coding patterns.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/vector_avg,vector_avg aggregate function,vector_avg aggregate function - Azure Databricks - Databricks SQL,Compute element-wise vector averages with vector_avg in Databricks SQL,Learn the syntax of the vector\_avg aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Computes the element-wise average of vectors in an aggregate. Returns a vector where each element is the arithmetic mean of the corresponding elements across all input vectors.,2026-04-13T08:00:00.000Z,language-reference,integrations,0.7,True,"Describes a Databricks-specific aggregate function for vectors, including exact syntax, argument constraints, and return behavior that are not part of standard SQL. 
This is concrete API-level behavior, fitting integrations & coding patterns.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/variant_get,variant_get function,variant_get function - Azure Databricks - Databricks SQL,Use variant_get to extract values from VARIANT columns in Databricks SQL,Learn the syntax of the variant\_get function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime 15.3 and above Extracts a value of type from variantExpr, specified by path.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Function reference pages typically include product-specific syntax, argument types, return types, and edge-case behavior (for example, how paths are specified, null-handling, and error conditions) that go beyond generic SQL knowledge. This is expert, product-specific API surface best aligned with integrations & coding patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/vector_avg,vector_avg aggregate function,vector_avg aggregate function - Azure Databricks - Databricks SQL,Compute element-wise vector averages with vector_avg in Databricks SQL,Learn the syntax of the vector\_avg aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Computes the element-wise average of vectors in an aggregate. Returns a vector where each element is the arithmetic mean of the corresponding elements across all input vectors.,2026-04-13T08:00:00.000Z,language-reference,integrations,0.7,True,"Describes a Databricks-specific aggregate function for vectors, including exact syntax, argument constraints, and return behavior that are not part of standard SQL.
This is concrete API-level behavior, fitting integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/vector_cosine_similarity,vector_cosine_similarity function,vector_cosine_similarity function - Azure Databricks - Databricks SQL,Compute vector cosine similarity in Databricks SQL,Learn the syntax of the vector\_cosine\_similarity function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks Runtime 18.1 and above Computes the cosine similarity between two vectors, measuring the cosine of the angle between them.",2026-03-05T20:31:00.000Z,language-reference,integrations,0.7,True,"Function reference for vector_cosine_similarity with Databricks-specific vector types and behavior, beyond generic math concept.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/vector_inner_product,vector_inner_product function,vector_inner_product function - Azure Databricks - Databricks SQL,Compute vector inner product in Databricks SQL,Learn the syntax of the vector\_inner\_product function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Computes the inner product (dot product) between two vectors.,2026-03-05T20:31:00.000Z,language-reference,integrations,0.7,True,"Documents vector_inner_product function semantics and applicability in Databricks Runtime, which are concrete API details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/vector_l2_distance,vector_l2_distance function,vector_l2_distance function - Azure Databricks - Databricks SQL,Compute vector L2 distance in Databricks SQL,Learn the syntax of the vector\_l2\_distance function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Computes the Euclidean (L2) distance between two 
vectors.,2026-03-05T20:31:00.000Z,language-reference,integrations,0.7,True,"Describes vector_l2_distance function behavior for Databricks vector types, which is specific to this platform’s SQL functions.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/vector_norm,vector_norm function,vector_norm function - Azure Databricks - Databricks SQL,Calculate vector norms with vector_norm in Databricks,Learn the syntax of the vector\_norm function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Computes the Lp norm of a vector using the specified degree.,2026-03-05T20:31:00.000Z,language-reference,integrations,0.7,True,"Function reference for vector_norm including Lp norm semantics and parameters in Databricks Runtime, which are product-specific.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/vector_normalize,vector_normalize function,vector_normalize function - Azure Databricks - Databricks SQL,Normalize vectors with vector_normalize in Databricks,Learn the syntax of the vector\_normalize function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Normalizes a vector to unit length using the specified norm degree.,2026-03-05T20:31:00.000Z,language-reference,integrations,0.7,True,"Documents vector_normalize function behavior and parameters for Databricks vectors, which are concrete integration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/vector_search,vector_search function,vector_search function - Azure Databricks - Databricks SQL,Query Mosaic AI Vector Search indexes via SQL,Learn the syntax of the vector\_search function of the SQL language in Databricks SQL.,Applies to: Databricks SQL Important This feature is in Public Preview.
The vector_search() function allows you to query a Mosaic AI Vector Search index using SQL.,2026-01-20T08:00:00.000Z,language-reference,integrations,0.65,True,"vector_search() is a Databricks-specific integration with Mosaic AI Vector Search; the function reference is likely to include parameter names, types, and constraints for querying the index, which are product-specific API/config details that match the integrations & coding patterns criteria.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/vector_sum,vector_sum aggregate function,vector_sum aggregate function - Azure Databricks - Databricks SQL,Compute element-wise vector sums with vector_sum in Databricks SQL,Learn the syntax of the vector\_sum aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Computes the element-wise sum of vectors in an aggregate. Returns a vector where each element is the sum of the corresponding elements across all input vectors.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Covers a Databricks-specific aggregate function with precise syntax and semantics for vector operations, which are not standard SQL. This is detailed API usage information, matching integrations & coding patterns.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/vector_sum,vector_sum aggregate function,vector_sum aggregate function - Azure Databricks - Databricks SQL,Compute element-wise vector sums with vector_sum in Databricks SQL,Learn the syntax of the vector\_sum aggregate function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks Runtime 18.1 and above Computes the element-wise sum of vectors in an aggregate. 
Returns a vector where each element is the sum of the corresponding elements across all input vectors.,2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Covers a Databricks-specific aggregate function with precise syntax and semantics for vector operations, which are not standard SQL. This is detailed API usage information, matching integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/version,version function,version function - Azure Databricks - Databricks SQL,,Learn the syntax of the version function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the Apache Spark version. Use current_version to retrieve the Databricks SQL version.,2026-01-20T08:00:00.000Z,language-reference,,0.4,False,"version() function page returns Spark version; expected to be simple syntax and return description without detailed configuration tables, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/weekday,weekday function,weekday function - Azure Databricks - Databricks SQL,,Learn the syntax of the weekday function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime Returns the day of the week of expr. This function is a synonym for extract(DAYOFWEEK_ISO FROM expr) - 1.
Note When extracting fields from a TIMESTAMP (TIMESTAMP_LTZ), the result is based on the session timezone.",2026-01-20T08:00:00.000Z,language-reference,,0.4,False,"weekday() function page explains behavior and relation to extract(); while it mentions session timezone, it does not indicate detailed configuration parameters or limits that fit any sub-skill type.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/weekofyear,weekofyear function,weekofyear function - Azure Databricks - Databricks SQL,Compute week of year with Databricks weekofyear,Learn the syntax of the weekofyear function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime Returns the week of the year of expr.,2024-03-01T08:00:00.000Z,language-reference,integrations,0.6,True,"Function reference for weekofyear in Databricks SQL/Runtime, including how it derives week numbers; these are concrete API semantics.",unchanged @@ -4489,14 +4533,14 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/zeroifnull,zeroifnull function,zeroifnull function - Azure Databricks - Databricks SQL,Replace NULL with zero using zeroifnull in Databricks,Learn the syntax of the zeroifnull function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks Runtime 16.0 and above Returns 0 if expr is NULL, or expr otherwise.
This function is a synonym for coalesce(expr, 0).",2025-04-30T00:43:00.000Z,language-reference,integrations,0.7,True,"Documents zeroifnull function, its synonym to coalesce(expr, 0), and version applicability, which are Databricks-specific semantics.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/zip_with,zip_with function,zip_with function - Azure Databricks - Databricks SQL,Merge arrays element-wise with zip_with in Databricks,Learn the syntax of the zip\_with function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Merges the arrays in expr1 and expr2, element-wise, into a single array using func.",2024-03-01T08:00:00.000Z,language-reference,integrations,0.75,True,"Function reference for zip_with, including element-wise merge semantics using a lambda func in Databricks SQL/Runtime, which is product-specific behavior.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/zstd_compress,zstd_compress function,zstd_compress function - Azure Databricks - Databricks SQL,Use zstd_compress in Databricks SQL queries,Learn the syntax of the zstd\_compress function of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 15.2 and above Returns value compressed with Zstandard compression.,2024-06-04T18:15:00.000Z,language-reference,integrations,0.72,True,"Function reference page with exact SQL signature, argument types, return type, and behavior details for zstd_compress that are product-specific and not just conceptual.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/zstd_decompress,zstd_decompress function,zstd_decompress function - Azure Databricks - Databricks SQL,Decompress data with zstd_decompress in Databricks SQL,Learn the syntax of the zstd\_decompress function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies 
to:Databricks SQLDatabricks Runtime 15.2 and above Returns value decompressed with Zstandard compression. On decompression failure, it throws an exception.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Documents a Databricks-specific SQL function including exact syntax, input/return types, and behavior on decompression failure (throws exception). These are concrete, product-specific API details, fitting integrations & coding patterns.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/zstd_decompress,zstd_decompress function,zstd_decompress function - Azure Databricks - Databricks SQL,Decompress data with zstd_decompress in Databricks SQL,Learn the syntax of the zstd\_decompress function of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 15.2 and above Returns value decompressed with Zstandard compression. On decompression failure, it throws an exception.",2026-04-13T21:52:00.000Z,language-reference,integrations,0.7,True,"Documents a Databricks-specific SQL function including exact syntax, input/return types, and behavior on decompression failure (throws exception). 
These are concrete, product-specific API details, fitting integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/how-to-use,How to use the SQL reference,How to use the SQL reference - Azure Databricks - Databricks SQL,,Learn how to read and navigate the SQL language reference.,"Applies to:Databricks SQLDatabricks Runtime This guide explains how to read and navigate the SQL language reference, including platform availability labels and syntax notation.",2026-02-14T02:30:00.000Z,language-reference,,0.4,False,Explains how to read the SQL reference; meta-documentation without product-specific configuration or limits.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/catalog_privileges,CATALOG_PRIVILEGES,CATALOG_PRIVILEGES - Azure Databricks - Databricks SQL,Query catalog privileges via Databricks information schema,Learn about the INFORMATION\_SCHEMA.CATALOG\_PRIVILEGES relation in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and aboveUnity Catalog only INFORMATION_SCHEMA.CATALOG_PRIVILEGES lists principals that have privileges on a catalog.,2026-01-20T08:00:00.000Z,language-reference,security,0.7,True,"Describes INFORMATION_SCHEMA.CATALOG_PRIVILEGES, a security/permissions metadata relation; includes product-specific privilege and principal semantics that map directly to access control behavior.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/catalog_provider_share_usage,CATALOG_PROVIDER_SHARE_USAGE,CATALOG_PROVIDER_SHARE_USAGE - Azure Databricks - Databricks SQL,Inspect catalog provider share usage in Databricks,Learn about the INFORMATION\_SCHEMA.CATALOG\_PROVIDER\_SHARE\_USAGE relation in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and above INFORMATION_SCHEMA.CATALOG_PROVIDER_SHARE_USAGE lists catalogs that have 
mounted provider shares. Information is displayed only for catalogs the user has permission to interact with. This is an extension to the SQL Standard Information Schema.,2024-04-18T08:00:00.000Z,language-reference,configuration,0.7,True,"Describes CATALOG_PROVIDER_SHARE_USAGE relation, a Databricks-specific metadata table for mounted provider shares, which is configuration/metadata structure.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/catalog_tags,CATALOG_TAGS,CATALOG_TAGS - Azure Databricks - Databricks SQL,List catalog tags using INFORMATION_SCHEMA in Databricks,Learn about the INFORMATION\_SCHEMA.CATALOG\_TAGS relation in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 13.3 LTS and aboveUnity Catalog only. INFORMATION_SCHEMA.CATALOG_TAGS contains all the tags that have been applied to all the catalogs within the metastore. Information is displayed only for catalogs the user has permission to interact with. This relation is an extension to the SQL Standard Information Schema.,2024-06-27T08:00:00.000Z,language-reference,configuration,0.7,True,Defines CATALOG_TAGS relation and how tags are represented; this is Databricks-specific metadata configuration.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/catalogs,CATALOGS,CATALOGS - Azure Databricks - Databricks SQL,Describe Unity Catalog catalogs via INFORMATION_SCHEMA,Learn about the INFORMATION\_SCHEMA.CATALOGS relation in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and aboveUnity Catalog only INFORMATION_SCHEMA.CATALOGS describes catalogs. Information is displayed only for catalogs bound to the current workspace that the user has permission to interact with. 
This is an extension to the SQL Standard Information Schema.,2024-11-04T20:06:00.000Z,language-reference,configuration,0.7,True,CATALOGS relation schema and behavior are Databricks-specific metadata configuration details.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/check_constraints,CHECK_CONSTRAINTS,CHECK_CONSTRAINTS - Azure Databricks - Databricks SQL,,Learn about the INFORMATION\_SCHEMA.CHECK\_CONSTRAINTS relation in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks RuntimeUnity Catalog only Reserved for future use. INFORMATION_SCHEMA.CHECK_CONSTRAINTS will describe check constraints defined on tables.,2024-09-17T21:22:00.000Z,language-reference,,0.2,False,"Reserved for future use; no concrete schema, parameters, or behavior described yet.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/column_masks,COLUMN_MASKS,COLUMN_MASKS - Azure Databricks - Databricks SQL,Query COLUMN_MASKS metadata in Databricks SQL,Learn about the INFORMATION\_SCHEMA.COLUMN\_MASKS relation in Databricks SQL and Databricks Runtime.,"Databricks Runtime 12.2 LTS and aboveUnity Catalog only. Important This feature is in Public Preview. INFORMATION_SCHEMA.COLUMN_MASKS contains the column masking metadata for table columns in the catalog, or all catalogs if owned by the SYSTEM catalog. Information is displayed only for columns the user has permission to interact with. Tables that are accessible only through the BROWSE privilege aren't included in the results. This relation is an extension to the SQL Standard Information Schema.",2026-04-16T17:44:00.000Z,language-reference,configuration,0.76,True,"The page documents the INFORMATION_SCHEMA.COLUMN_MASKS relation, including product-specific columns/fields that describe column masking metadata, their meanings, and constraints (for example, visibility rules based on permissions and catalog ownership). 
This is detailed, product-specific configuration/metadata schema knowledge that an LLM is unlikely to know from training and fits best under configuration rather than general concepts.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/column_masks,COLUMN_MASKS,COLUMN_MASKS - Azure Databricks - Databricks SQL,Query COLUMN_MASKS metadata in Databricks SQL,Learn about the INFORMATION\_SCHEMA.COLUMN\_MASKS relation in Databricks SQL and Databricks Runtime.,"Databricks Runtime 12.2 LTS and aboveUnity Catalog only. Important This feature is in Public Preview. INFORMATION_SCHEMA.COLUMN_MASKS contains the column masking metadata for table columns in the catalog, or all catalogs if owned by the SYSTEM catalog. Information is displayed only for columns the user has permission to interact with. Tables that are accessible only through the BROWSE privilege aren't included in the results. This relation is an extension to the SQL Standard Information Schema.",2026-04-16T17:44:00.000Z,language-reference,configuration,0.76,True,"The page documents the INFORMATION_SCHEMA.COLUMN_MASKS relation, including product-specific columns/fields that describe column masking metadata, their meanings, and constraints (for example, visibility rules based on permissions and catalog ownership). This is detailed, product-specific configuration/metadata schema knowledge that an LLM is unlikely to know from training and fits best under configuration rather than general concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/column_tags,COLUMN_TAGS,COLUMN_TAGS - Azure Databricks - Databricks SQL,Retrieve column tagging metadata with COLUMN_TAGS,Learn about the INFORMATION\_SCHEMA.COLUMN\_TAGS relation in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 13.3 LTS and aboveUnity Catalog only. 
INFORMATION_SCHEMA.COLUMN_TAGS contains the column tagging metadata within the table, or all columns if owned by the SYSTEM catalog. Information is displayed only for columns the user has permission to interact with. This relation is an extension to the SQL Standard Information Schema.",2024-06-27T08:00:00.000Z,language-reference,configuration,0.7,True,COLUMN_TAGS relation structure and behavior are Databricks-specific metadata configuration details.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/columns,COLUMNS,COLUMNS - Azure Databricks - Databricks SQL,Use INFORMATION_SCHEMA.COLUMNS in Databricks SQL,Learn about the INFORMATION\_SCHEMA.COLUMNS relation in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and aboveUnity Catalog only INFORMATION_SCHEMA.COLUMNS describes columns of tables and views (relations) in the catalog. The rows returned are limited to the relations the user is privileged to interact with.,2026-02-04T08:00:00.000Z,language-reference,configuration,0.8,True,COLUMNS relation schema and filtering rules are product-specific metadata configuration not generally known.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/connection_privileges,CONNECTION_PRIVILEGES,CONNECTION_PRIVILEGES - Azure Databricks - Databricks SQL,Inspect connection privileges via INFORMATION_SCHEMA in Databricks,Learn about the INFORMATION\_SCHEMA.CONNECTION\_PRIVILEGES relation in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 13.3 LTS and aboveUnity Catalog only INFORMATION_SCHEMA.CONNECTION_PRIVILEGES lists principals that have privileges on a connection. Note Currently, users with the MANAGE privilege on an object cannot view all grants for that object in the INFORMATION_SCHEMA. Instead, the INFORMATION_SCHEMA only shows their own grants on the object. This behavior will be corrected in the future. 
Users with MANAGE privilege can view all grants on an object using SQL command",2024-12-16T21:39:00.000Z,language-reference,security,0.85,True,CONNECTION_PRIVILEGES lists principals and privileges on connections and notes special MANAGE behavior; this is detailed RBAC/permission semantics.,unchanged @@ -4569,7 +4613,7 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/security- https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-ansi-compliance,ANSI compliance in Databricks Runtime,ANSI compliance in Databricks Runtime - Azure Databricks - Databricks SQL,Configure ANSI compliance options in Databricks Runtime,Learn about ANSI compliance in the SQL language constructs supported in Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime This article describes ANSI compliance in Databricks Runtime. For ANSI mode in Databricks SQL, see ANSI_MODE. Spark SQL has two options to support compliance with the ANSI SQL standard: spark.sql.ansi.enabled and spark.sql.storeAssignmentPolicy. spark.sql.ansi.enabled (ANSI mode) spark.sql.storeAssignmentPolicy (store assignment policy) The following table summarizes the behavior: The following subsections present behavior changes in arithmetic operations, t",2026-01-24T08:00:00.000Z,language-reference,configuration,0.75,True,"Describes spark.sql.ansi.enabled and spark.sql.storeAssignmentPolicy options and their behavioral impact. Likely includes allowed values and behavior tables, which are product-specific configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-collation,Collation,Collation - Azure Databricks - Databricks SQL,,Learn about collations in SQL in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 16.1 and above A collation is a set of rules that determines how string comparisons are performed. 
Collations support case-insensitive, accent-insensitive, and trailing-space-insensitive comparisons, as well as language-aware string ordering. Strings in Azure Databricks are represented as UTF-8 encoded Unicode characters. By default, Azure Databricks compares strings by their binary UTF-8 representation, known as UTF8_BINARY collation. UTF8_BINARY comparisons",2026-04-10T21:59:00.000Z,language-reference,,0.4,False,"Explains collation concepts and default behavior (UTF8_BINARY) but, from the summary, does not expose detailed configuration tables, numeric limits, or decision matrices; mostly conceptual/behavioral description.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-datatype-rules,SQL data type rules,SQL data type rules - Azure Databricks - Databricks SQL,Apply SQL data type resolution rules in Databricks,Learn about rules governing SQL data types in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Azure Databricks uses several rules to resolve conflicts among data types: You can also explicitly cast between many types:,2026-01-24T08:00:00.000Z,language-reference,configuration,0.7,True,"Explains Databricks-specific rules for resolving type conflicts and casting between types. 
These are detailed, product-specific language rules not generally known from generic SQL.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-datatypes,Data types and literals,Data types - Azure Databricks - Databricks SQL,,Learn about SQL data types in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime For rules governing how conflicts between data types are resolved, see SQL data type rules.",2026-04-13T08:00:00.000Z,language-reference,,0.0,False,"Page is a general reference for Databricks SQL data types; it defines and describes types but does not focus on limits/quotas, configuration parameters, decision matrices, troubleshooting mappings, or other product-specific expert details as defined in the sub-skill types.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-datatypes,Data types and literals,Data types - Azure Databricks - Databricks SQL,,Learn about SQL data types in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime For rules governing how conflicts between data types are resolved, see SQL data type rules.",2026-04-13T08:00:00.000Z,language-reference,,0.0,False,"Page is a general reference for Databricks SQL data types; it defines and describes types but does not focus on limits/quotas, configuration parameters, decision matrices, troubleshooting mappings, or other product-specific expert details as defined in the sub-skill types.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-datetime-pattern,Datetime patterns,Datetime patterns - Azure Databricks - Databricks SQL,,Learn about SQL datetime patterns in Databricks SQL and Databricks Runtime,Applies to:Databricks SQLDatabricks Runtime There are several common scenarios for datetime usage in Azure Databricks:,2025-05-09T19:58:00.000Z,language-reference,,0.45,False,Datetime patterns page likely lists format strings; summary is 
conceptual and does not show Databricks-specific limits or configuration parameters.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-expression,Expressions,SQL expression - Azure Databricks - Databricks SQL,,"Learn about SQL expressions in Databricks SQL including syntax, parameters, and types of expressions.","Applies to:Databricks SQLDatabricks Runtime An expression is a formula that computes a result based on literals or references to columns, fields, or variables, using functions or operators.",2026-01-24T08:00:00.000Z,language-reference,,0.2,False,"High-level definition of expressions; summary does not indicate product-specific limits, parameters, or error mappings beyond generic SQL expression concepts.",unchanged @@ -4578,12 +4622,12 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-e https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-federated-queries,Federated queries (Lakehouse Federation),Federated queries (Lakehouse Federation) - Azure Databricks - Databricks SQL,,Learn about Unity Catalog federated queries in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 13.3 LTS and aboveUnity Catalog only Query federation allows Azure Databricks to execute queries against data served by other Azure Databricks metastores as well as many third-party database management systems (DBMS) such as PostgreSQL, mySQL, and Snowflake. 
To query data from another system you must: You can now issue queries across the various local and foreign relations.",2024-08-01T21:10:00.000Z,language-reference,,0.4,False,"Conceptual overview of federated queries; no detailed configuration parameter tables, limits, or error mappings are indicated in the summary.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-function-invocation,Function invocation,Function invocation - Azure Databricks - Databricks SQL,Invoke built-in and user-defined functions in Databricks SQL,Learn about function invocation in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime A function invocation executes a builtin function or a user-defined function after associating arguments to the function's parameters. Azure Databricks supports positional parameter invocation as well as named parameter invocation.,2025-02-19T01:19:00.000Z,language-reference,integrations,0.63,True,Documents Databricks-specific function invocation syntax including named vs positional parameters and nuances that go beyond generic SQL knowledge.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-functions,Functions,Functions - Azure Databricks - Databricks SQL,,Learn about SQL functions in the SQL language constructs supported in Databricks Runtime.,Applies to:Databricks Runtime Spark SQL provides two function features to meet a wide range of needs: built-in functions and user-defined functions (UDFs). 
To learn about function resolution and function invocation see: Function invocation.,2024-03-01T08:00:00.000Z,language-reference,,0.3,False,"High-level overview of functions and UDFs; summary does not show Databricks-specific limits, configs, or troubleshooting details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-functions-builtin,Built-in functions,Built-in functions - Azure Databricks - Databricks SQL,,Learn about built-in functions in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime This article presents links to and descriptions of built-in operators and functions for strings and binary types, numeric scalars, aggregations, windows, arrays, maps, dates and timestamps, casting, CSV data, JSON data, XPath manipulation, and other miscellaneous functions. For use cases that are not supported by existing built-in functions, consider defining a custom function. See What are user-defined functions (UDFs)?. Also see:",2026-04-08T08:00:00.000Z,language-reference,,0.2,False,"Index page listing categories of built-in SQL functions; summary shows no parameter tables, limits, or detailed syntax/semantics. Likely a navigational overview rather than expert configuration or troubleshooting content.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-functions-builtin,Built-in functions,Built-in functions - Azure Databricks - Databricks SQL,Use built-in SQL functions in Azure Databricks,Learn about built-in functions in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime This article presents links to and descriptions of built-in operators and functions for strings and binary types, numeric scalars, aggregations, windows, arrays, maps, dates and timestamps, casting, CSV data, JSON data, XPath manipulation, and other miscellaneous functions. 
For use cases that are not supported by existing built-in functions, consider defining a custom function. See What are user-defined functions (UDFs)?. Also see:",2026-04-20T08:00:00.000Z,language-reference,integrations,0.7,True,"The page is a reference for Databricks SQL built-in functions, which includes product-specific function names, signatures, and behaviors that go beyond generic SQL knowledge. This is effectively API/SDK-style surface area for Databricks SQL, fitting best under integrations & coding patterns. It is not about limits, architecture, or deployment, but about concrete callable functions and their usage.",updated https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-functions-builtin-alpha,Alphabetical list of built-in functions,Alphabetical list of built-in functions - Azure Databricks - Databricks SQL,,View an alphabetical list of built-in functions and operators in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime This article provides an alphabetically-ordered list of built-in functions and operators in Azure Databricks.,2026-02-18T22:10:00.000Z,language-reference,,0.3,False,"Alphabetical index of built-in functions; navigational content without expert-level configuration, limits, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-functions-udf-aggregate,User-defined aggregate functions (UDAFs),User-defined aggregate functions (UDAFs) - Azure Databricks - Databricks SQL,Implement user-defined aggregate functions in Databricks,Learn about SQL user-defined aggregate functions in the SQL language constructs supported in Databricks Runtime.,Applies to:Databricks Runtime User-defined aggregate functions (UDAFs) are user-programmable routines that act on multiple rows at once and return a single aggregated value as a result. This documentation lists the classes that are required for creating and registering UDAFs. 
It also contains examples that demonstrate how to define and register UDAFs in Scala and invoke them in Spark SQL.,2024-06-12T18:16:00.000Z,language-reference,integrations,0.69,True,"Contains concrete class names, required interfaces, registration patterns, and Scala/SQL invocation examples for UDAFs that are specific to Databricks Runtime.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-functions-udf-hive,"Integration with Hive UDFs, UDAFs, and UDTFs","Integration with Hive UDFs, UDAFs, and UDTFs - Azure Databricks - Databricks SQL","Integrate Hive UDFs, UDAFs, UDTFs with Databricks SQL","Learn about SQL integration with Hive UDFs, UDAFs, and UDTFs in the SQL language constructs supported in Databricks Runtime.","Applies to:Databricks Runtime Spark SQL supports integration of Hive UDFs, UDAFs, and UDTFs. Similar to Spark UDFs and UDAFs, Hive UDFs work on a single row as input and generate a single row as output, while Hive UDAFs operate on multiple rows and return a single aggregated row as a result. In addition, Hive also supports UDTFs (User Defined Tabular Functions) that act on one row as input and return multiple rows as output. 
To use Hive UDFs/UDAFs/UDTFs, the user should register them in Spark, an",2024-03-01T08:00:00.000Z,language-reference,integrations,0.71,True,"Describes product-specific steps and APIs to register and use Hive UDF/UDAF/UDTF implementations in Spark SQL on Databricks, including required configuration and invocation patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-functions-udf-scalar,External user-defined scalar functions (UDFs),External user-defined scalar functions (UDFs) - Azure Databricks - Databricks SQL,Create and register scalar UDFs in Databricks,Learn about SQL scalar user-defined functions in the SQL language constructs supported in Databricks Runtime.,Applies to:Databricks Runtime User-defined scalar functions (UDFs) are user-programmable routines that act on one row. This documentation lists the classes that are required for creating and registering UDFs. It also contains examples that demonstrate how to define and register UDFs and invoke them in Spark SQL.,2024-03-01T08:00:00.000Z,language-reference,integrations,0.69,True,"Lists required classes, registration APIs, and usage examples for external scalar UDFs in Databricks Runtime, which are concrete product-specific coding patterns.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-h3-geospatial-functions,H3 geospatial functions,H3 geospatial functions - Azure Databricks - Databricks SQL,Use H3 geospatial SQL functions in Azure Databricks,Learn about H3 geospatial functions in Databricks SQL. H3 is a global grid indexing system used for processing and analyzing spatial data.,"Applies to:Databricks SQLDatabricks Runtime H3 is a global grid indexing system. Grid systems use a shape, like rectangles or triangles, to tessellate a surface, which in this case is the Earth's surface. The H3 system was designed to use hexagons (and a few pentagons), and offers 16 levels of resolutions within its hierarchy. 
At higher resolutions, the tessellated shapes are smaller. H3 expressions are only supported in Photon-enabled clusters and Databricks SQL warehouses at the Databricks SQL ",2026-04-10T08:00:00.000Z,language-reference,integrations,0.68,True,"The page is a detailed SQL language reference for H3 geospatial functions in Databricks SQL, containing product-specific function names, signatures, parameter semantics, and behavior that go beyond generic geospatial knowledge. These are concrete API/SQL integration details unique to Databricks SQL rather than conceptual geospatial theory, fitting the integrations & coding patterns category best.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-h3-geospatial-functions,H3 geospatial functions,H3 geospatial functions - Azure Databricks - Databricks SQL,Use H3 geospatial SQL functions in Azure Databricks,Learn about H3 geospatial functions in Databricks SQL. H3 is a global grid indexing system used for processing and analyzing spatial data.,"Applies to:Databricks SQLDatabricks Runtime H3 is a global grid indexing system. Grid systems use a shape, like rectangles or triangles, to tessellate a surface, which in this case is the Earth's surface. The H3 system was designed to use hexagons (and a few pentagons), and offers 16 levels of resolutions within its hierarchy. At higher resolutions, the tessellated shapes are smaller. In Databricks Runtime 17.0 or above, H3 expressions are supported on all compute types except classic Databricks ",2026-04-21T08:00:00.000Z,language-reference,integrations,0.7,True,"Page is a detailed SQL reference for H3 geospatial functions in Databricks SQL/Runtime, with product-specific function names, signatures, argument types, and behavior that go beyond generic geospatial knowledge. 
This is expert, code-level integration knowledge for using H3 within Databricks SQL, fitting the integrations & coding patterns category.",updated https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-h3-geospatial-functions-alpha,Alphabetical list of H3 geospatial functions,Alphabetical list of H3 geospatial functions - Azure Databricks - Databricks SQL,Alphabetical reference of H3 functions in Databricks SQL,View a list of H3 geospatial built-in functions Databricks SQL.,Applies to:Databricks SQLDatabricks Runtime,2025-04-01T19:57:00.000Z,language-reference,integrations,0.62,True,"Alphabetical catalog of H3 function names and signatures supported by Databricks SQL, which is detailed API surface knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-h3-geospatial-functions-examples,H3 geospatial functions example,H3 geospatial functions example - Azure Databricks - Databricks SQL,Example analytics using H3 functions on Databricks,Examples using H3 geospatial built-in functions Databricks SQL.,"In this example, we analyze flight data with various H3 geospatial built-in functions. 
The example notebook uses the following functions.",2024-03-01T08:00:00.000Z,language-reference,integrations,0.66,True,"Shows end-to-end SQL examples with specific H3 function calls and parameters on Databricks, which are concrete integration patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-h3-geospatial-functions-quickstart,H3 Quickstart (Databricks SQL),H3 Quickstart (Databricks SQL) - Azure Databricks - Databricks SQL,,Quickstart guide for H3 geospatial built-in functions Databricks SQL.,The H3 geospatial functions quickstart on this page illustrates the following:,2026-03-31T08:00:00.000Z,language-reference,,0.4,False,"Quickstart for H3 functions; typically step-by-step usage examples rather than detailed limits, configuration matrices, or troubleshooting content.",unchanged @@ -4591,6 +4635,7 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-h https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-identifiers,Identifiers,Identifiers - Azure Databricks - Databricks SQL,,Learn about SQL identifiers in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime An identifier is a string used to identify an object such as a table, view, schema, or column. Azure Databricks supports non-delimited (regular) identifiers and delimited identifiers, which are enclosed within backticks. Identifiers are case-insensitive when referenced. For identifiers persisted with a metastore and data source the characters permitted can be restricted. 
See Names for details on the specific usage of identifiers.",2026-04-10T21:59:00.000Z,language-reference,,0.3,False,"Describes what identifiers are and basic behavior (case-insensitivity, backticks) but, as summarized, does not include detailed parameter tables, limits, or product-specific configuration values.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-information-schema,Information schema,Information schema - Azure Databricks - Databricks SQL,,"Learn about the information schema in Databricks SQL and Databricks Runtime. The purpose of the information schema is to provide a SQL-based, self-describing API to the metadata.","Applies to: Databricks SQL Databricks Runtime 10.4 LTS and above Unity Catalog only In the SYSTEM catalog, the INFORMATION_SCHEMA is a SQL standard schema that provides metadata about objects across all catalogs in the metastore. It does not contain metadata about hive_metastore objects. Separately, each catalog created in Unity Catalog also automatically includes an information_schema that describes metadata about objects in that catalog only. Both types of information schemas automatically filter results",2026-01-20T08:00:00.000Z,language-reference,,0.5,False,"Information schema overview for Databricks; primarily conceptual and descriptive about metadata exposure, without detailed config tables, limits, or decision/troubleshooting structures.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-ip-functions,IP functions,IP functions - Azure Databricks - Databricks SQL,Work with IP address SQL functions in Databricks,Learn about IP functions in Databricks Runtime.,Applies to: Databricks Runtime 18.2 and above Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. 
IP functions operate on IPv4 and IPv6 addresses and CIDR blocks represented as STRING or BINARY values.,2026-04-20T17:34:00.000Z,language-reference,integrations,0.7,True,"Page documents Databricks Runtime IP functions with specific function names, accepted types (STRING/BINARY), IPv4/IPv6 and CIDR handling, and beta/preview behavior. These are product-specific SQL APIs and patterns for handling IP data, which qualifies as expert integration/coding reference rather than generic concepts.",new https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-json-path-expression,JSON path expressions,JSON path expression - Azure Databricks - Databricks SQL,Use JSON path expressions in Databricks SQL,"Learn about JSON path expressions in Databricks Runtime and Databricks SQL. Learn the parameters, returns, and how to extract values with examples and Databricks Runtime.",Applies to: Databricks SQL Databricks Runtime A JSON path expression is used to extract values from a JSON string or a VARIANT using the : operator,2026-01-20T08:00:00.000Z,language-reference,configuration,0.65,True,"Describes Databricks-specific JSON path expression syntax and behavior (e.g., use of ':' operator, handling VARIANT). These are concrete language features unique to this product’s SQL dialect.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-lambda-functions,Lambda functions,Lambda functions - Azure Databricks - Databricks SQL,Write and use lambda functions in Databricks SQL,Learn about lambda functions in Databricks SQL and Databricks Runtime. A lambda function is a parameterized expression that can be passed to a function to control its behavior.,"Applies to: Databricks SQL Databricks Runtime A parameterized expression that can be passed to a function to control its behavior. 
For example, the array_sort function accepts a lambda function as an argument to define a custom sort order.",2024-03-01T08:00:00.000Z,language-reference,integrations,0.64,True,"Explains Databricks SQL lambda syntax and usage (for example with array_sort), including parameterization rules that are product-specific.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-name-resolution,Name resolution,Name resolution - Azure Databricks - Databricks SQL,Name resolution rules in Databricks SQL,Learn about name resolution in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Name resolution is the process by which identifiers are resolved to specific column-, field-, parameter-, or table-references.",2026-03-06T19:12:00.000Z,language-reference,configuration,0.7,True,"Describes how identifiers are resolved to specific objects in Databricks, a product-specific resolution algorithm and behavior.",unchanged
Using parameter markers protects your code from SQL injection attacks since it clearly separates provided values from the structure of the SQL statements. You cannot mix named and unnamed parameter markers in the same SQL statement. You can also use parameter markers in the IDENTIFIER clause, which can be used to parameterize objec",2026-01-24T08:00:00.000Z,language-reference,configuration,0.7,True,"Covers Databricks-specific behavior of named vs unnamed parameter markers, restrictions on mixing them, and their use in IDENTIFIER clauses. These are concrete, product-specific SQL configuration/usage rules.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-parameter-marker,Parameter markers,Parameter markers - Azure Databricks - Databricks SQL,Use parameter markers in Databricks SQL statements,Learn about parameter markers in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Parameter markers are named or unnamed typed placeholder variables used to supply values from the API invoking the SQL statement. Using parameter markers protects your code from SQL injection attacks since it clearly separates provided values from the structure of the SQL statements. You cannot mix named and unnamed parameter markers in the same SQL statement. You can also use parameter markers in the IDENTIFIER clause, which can be used to parameterize objec",2026-04-23T08:00:00.000Z,language-reference,integrations,0.7,True,"Describes product-specific behavior of named and unnamed parameter markers in Databricks SQL, including constraints such as not mixing marker types in a single statement and usage in IDENTIFIER clauses. 
This is concrete, code-level usage guidance tied to how external callers (APIs/clients) pass parameters into SQL, fitting the integrations & coding patterns category.",updated https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-parameters,Configuration parameters,Configuration parameters - Azure Databricks - Databricks SQL,Understand Databricks SQL configuration parameter hierarchy,Learn about configuration parameters in Databricks SQL.,Applies to: Databricks SQL A configuration parameter is a setting which affects the behavior of Databricks SQL outside of the specified SQL syntax. The effective value of a configuration parameter is derived from the different levels where it is set.,2024-07-16T19:13:00.000Z,language-reference,configuration,0.7,True,Defines how configuration parameters work in Databricks SQL and how effective values are derived from multiple levels; this is product-specific configuration behavior not generally known.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-partition,Partitions,Partitions - Azure Databricks - Databricks SQL,Define and manage table partitions in Databricks SQL,Learn about table partitions in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Note On managed Apache Iceberg tables, Unity Catalog supports only liquid clustering and interprets partitions specified in the PARTITION BY clause as clustering keys for liquid clustering. Databricks recommends liquid clustering for all new Delta tables and managed Iceberg tables. See Unity Catalog managed tables in Azure Databricks for Delta Lake and Apache Iceberg and Use liquid clustering for tables. 
A partition is composed of a subset of rows in a tabl",2026-01-20T08:00:00.000Z,language-reference,configuration,0.65,True,Includes Databricks/Unity Catalog–specific behavior where PARTITION BY is interpreted as liquid clustering keys on managed Iceberg tables and recommendations for liquid clustering. These are product-specific configuration semantics.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-principal,Principals,Principal - Azure Databricks - Databricks SQL,Understand principals for Databricks SQL security,Learn about principals in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime A principal is a user, service principal, or group known to the metastore. Principals can be granted privileges and can own securable objects.",2024-09-16T20:12:00.000Z,language-reference,security,0.68,True,"Defines the Databricks concept of principals as securable identities in the metastore, which is foundational to Databricks-specific privilege configuration.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-privileges,Privileges and securable objects in Unity Catalog,Privileges and securable objects in Unity Catalog - Azure Databricks - Databricks SQL,Manage Unity Catalog privileges and securable objects,Learn about privileges and securable objects in Unity Catalog. This is reference information for queries in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Unity Catalog only A privilege is a right granted to a principal to operate on a securable object in the metastore. The privilege model and securable objects differ depending on whether you are using a Unity Catalog metastore or the legacy Hive metastore. This article describes the privilege model for Unity Catalog. If you are using the Hive metastore, see Privileges and securable objects in the Hive metastore. 
For detailed information about how to manage pr",2026-04-15T08:00:00.000Z,language-reference,security,0.78,True,"Unity Catalog privilege reference pages enumerate specific securable object types, privilege names, and their exact semantics in Databricks SQL/Runtime. These are product-specific RBAC/permission details that qualify as security configuration knowledge beyond generic concepts.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-privileges,Privileges and securable objects in Unity Catalog,Privileges and securable objects in Unity Catalog - Azure Databricks - Databricks SQL,Manage Unity Catalog privileges and securable objects,Learn about privileges and securable objects in Unity Catalog. This is reference information for queries in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks RuntimeUnity Catalog only A privilege is a right granted to aprincipalto operate on asecurable objectin the metastore. The privilege model and securable objects differ depending on whether you are using a Unity Catalog metastore or the legacy Hive metastore. This article describes the privilege model for Unity Catalog. If you are using the Hive metastore, seePrivileges and securable objects in the Hive metastore. For detailed information about how to manage pr",2026-04-15T08:00:00.000Z,language-reference,security,0.78,True,"Unity Catalog privilege reference pages enumerate specific securable object types, privilege names, and their exact semantics in Databricks SQL/Runtime. 
These are product-specific RBAC/permission details that qualify as security configuration knowledge beyond generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-privileges-hms,Privileges and securable objects in the Hive metastore,Privileges and securable objects in the Hive metastore - Azure Databricks - Databricks SQL,Configure Hive metastore privileges and securable objects,Learn about privileges and securable objects in Hive metastore in Databricks SQL and Databricks Runtime,"Applies to:Databricks SQLDatabricks Runtime A privilege is a right granted to aprincipalto operate on asecurable objectin the metastore. The privilege model and securable objects differ depending on whether you are using a Unity Catalog metastore or the legacy Hive metastore. This article describes the privilege model for the legacy Hive metastore. If you are using Unity Catalog, seePrivileges and securable objects in Unity Catalog.",2024-09-16T08:00:00.000Z,language-reference,security,0.8,True,"Documents privilege model and securable object types for the legacy Hive metastore on Databricks, including specific privilege names and scopes.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-reserved-words,Reserved words,Reserved words and schemas - Azure Databricks - Databricks SQL,Use reserved words and schemas in Databricks SQL,Learn about reserved words and databases in Databricks SQL.,"Applies to:Databricks SQLDatabricks Runtime Reserved words are literals used as keywords by the SQL language which should not be used as identifiers to avoid unexpected behavior. 
Reserved schema names have special meaning to Azure Databricks.",2024-03-01T08:00:00.000Z,language-reference,configuration,0.7,True,"Lists reserved words and reserved schema names with special meaning in Databricks, which are concrete configuration/usage constraints.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-scripting,SQL Scripting,SQL scripting - Azure Databricks - Databricks SQL,Write SQL scripts with Databricks SQL/PSM syntax,Learn about SQL Scripting in the SQL language constructs supported in Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 16.3 and above You can employ powerful procedural logic using SQL/PSM standard-based scripting syntax. +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-scripting,SQL Scripting,SQL scripting - Azure Databricks - Databricks SQL,Author SQL/PSM procedural scripts in Databricks SQL,Learn about SQL Scripting in the SQL language constructs supported in Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 16.3 and above You can employ powerful procedural logic using SQL/PSM standard-based scripting syntax. Any SQL script consists of and starts with acompound statementblock (BEGIN ... END). A compound statement starts with a section to declare local variables, cursors, user-defined conditions, and condition handlers, which are used to catch exceptions. 
-This is followed by the compound statement body, which consists of:",2026-02-27T08:00:00.000Z,language-reference,integrations,0.7,True,"Details Databricks SQL scripting syntax (compound statements, handlers, cursors) and supported control-of-flow constructs, which are product-specific language features.",unchanged +This is followed by the compound statement body, which consists of:",2026-04-20T08:00:00.000Z,language-reference,integrations,0.65,True,"Page describes Databricks SQL scripting (SQL/PSM) with concrete syntax for compound statements, variable and cursor declarations, condition handlers, and control flow. This is detailed, product-specific language/API behavior for scripting within Databricks SQL, which fits integrations & coding patterns more than other categories.",updated https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-sharing,Delta Sharing,Delta Sharing - Azure Databricks - Databricks SQL,Configure secure Delta Sharing in Databricks SQL,Delta Sharing in Databricks SQL is an open protocol for secure data sharing with other organizations regardless of their computing platforms.,"Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and aboveUnity Catalog only Delta Sharing is an open protocol for secure data sharing with other organizations regardless of which computing platforms they use. It can share collections of tables in a Unity Catalog metastore in real time without copying them, @@ -4648,7 +4693,7 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-s default values if they were changed using theSETcommand. 
When using Databricks Runtime, parameters are known as SQL Conf properties.",2024-03-01T08:00:00.000Z,language-reference,configuration,0.76,True,"Configuration-focused command resetting session-level parameters to global defaults, with Databricks-specific terminology.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-conf-mgmt-set,SET,SET - Azure Databricks - Databricks SQL,Manage Databricks SQL session parameters with SET,Learn how to use the SET syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Sets an Azure Databricks parameter at the session level, returns the value of an existing parameter or returns all parameters with value and meaning. When using Databricks Runtime, parameters are known as SQL Conf properties. To set a SQL variable, use SET VARIABLE. To set query tags for cost attribution and observability, use SET QUERY_TAGS.",2026-02-06T20:21:00.000Z,language-reference,configuration,0.8,True,"Documents Databricks-specific parameter system, including listing all parameters and distinction from SQL variables and query tags.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-conf-mgmt-set-query-tags,SET QUERY_TAGS,SET QUERY_TAGS - Azure Databricks - Databricks SQL,Set and manage QUERY_TAGS in Databricks SQL,Learn how to use the SET QUERY\_TAGS syntax of the SQL language in Databricks SQL.,"Applies to: Databricks SQL Important This feature is in Public Preview. Sets, reads, or removes query tags for the current session. Query tags are custom key-value pairs that you can apply to SQL workloads to enable grouping, filtering, and cost attribution in the system.query.history table. 
For a complete overview of query tags, see Query tags.",2026-04-14T17:47:00.000Z,concept-article,integrations,0.7,True,"The page documents the SET QUERY_TAGS SQL command, including exact syntax, supported operations (set/read/remove), and behavior tied to Databricks’ system.query.history table. These are product-specific API details and configuration patterns for tagging queries, which align with integrations/coding-patterns and represent expert knowledge beyond generic SQL concepts.",updated
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-conf-mgmt-set-query-tags,SET QUERY_TAGS,SET QUERY_TAGS - Azure Databricks - Databricks SQL,Set and manage QUERY_TAGS in Databricks SQL,Learn how to use the SET QUERY\_TAGS syntax of the SQL language in Databricks SQL.,"Applies to: Databricks SQL Important This feature is in Public Preview. Sets, reads, or removes query tags for the current session. Query tags are custom key-value pairs that you can apply to SQL workloads to enable grouping, filtering, and cost attribution in the system.query.history table. For a complete overview of query tags, see Query tags.",2026-04-14T17:47:00.000Z,concept-article,integrations,0.7,True,"The page documents the SET QUERY_TAGS SQL command, including exact syntax, supported operations (set/read/remove), and behavior tied to Databricks’ system.query.history table. 
These are product-specific API details and configuration patterns for tagging queries, which align with integrations/coding-patterns and represent expert knowledge beyond generic SQL concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-conf-mgmt-set-timezone,SET TIMEZONE,SET TIME ZONE - Azure Databricks - Databricks SQL,Set session time zone with SET TIME ZONE in Databricks SQL,Learn how to use the SET TIME ZONE syntax of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime Sets the time zone of the current session.,2025-05-09T19:58:00.000Z,language-reference,configuration,0.74,True,"Databricks-specific command for configuring session time zone, affecting SQL evaluation semantics.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-connector-get,GET,GET - Azure Databricks - Databricks SQL,GET files from volumes using Databricks SQL Connector,Learn how to use the GET syntax of the SQL language in Databricks SQL Connector.,Applies to: Databricks SQL Connector Get a file from a volume to your local storage. Note This statement is only available in the Databricks SQL Connector or when connecting to Databricks using a driver. It cannot be submitted from the UI.,2026-01-24T08:00:00.000Z,language-reference,integrations,0.7,True,"Describes Databricks SQL Connector–only GET statement with precise syntax and constraints (only via connector/driver, not UI). This is a product-specific integration/coding pattern for file transfer.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-connector-put-into,PUT INTO,PUT INTO - Azure Databricks - Databricks SQL,Upload local files to volumes with PUT INTO,Learn how to use the PUT INTO syntax of the SQL language in Databricks SQL Connector.,Applies to: Databricks SQL Connector Put a local file into a volume. 
Note This statement is only available in the Databricks SQL Connector or when connecting to Databricks using a driver. It cannot be submitted from the UI.,2026-01-24T08:00:00.000Z,language-reference,integrations,0.7,True,"Documents Databricks SQL Connector–specific PUT INTO statement for moving local files into volumes, including availability constraints (connector/driver only). This is a concrete integration pattern.",unchanged @@ -4659,7 +4704,7 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-s
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-describe-credential,DESCRIBE CREDENTIAL,DESCRIBE CREDENTIAL - Azure Databricks - Databricks SQL,,Learn how to use the DESCRIBE CREDENTIAL syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 10.4 LTS and above Unity Catalog only Returns the metadata of an existing credential. The metadata information includes credential name, comment, owner and other metadata.",2024-11-07T23:52:00.000Z,language-reference,,0.4,False,"DESCRIBE CREDENTIAL metadata; security-related object but page is just metadata description, not RBAC or config details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-describe-database,DESCRIBE DATABASE,DESCRIBE DATABASE - Azure Databricks - Databricks SQL,,Learn how to use the DESCRIBE DATABASE syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime An alias for DESCRIBE SCHEMA. 
While usage of SCHEMA and DATABASE is interchangeable, SCHEMA is preferred.",2024-03-01T08:00:00.000Z,language-reference,,0.35,False,DESCRIBE DATABASE alias for DESCRIBE SCHEMA; no expert-only numeric or config details.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-describe-function,DESCRIBE FUNCTION,DESCRIBE FUNCTION - Azure Databricks - Databricks SQL,,Learn how to use the DESCRIBE FUNCTION syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Returns the basic metadata information of an existing function. The metadata information includes the function name, implementing class and the usage details. If the optional EXTENDED option is specified, the basic metadata information is returned along with the extended usage information.",2025-02-19T01:19:00.000Z,language-reference,,0.4,False,"DESCRIBE FUNCTION metadata; generic description of function info, no limits or config matrices.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-describe-governed-tag,DESCRIBE GOVERNED TAG,DESCRIBE GOVERNED TAG - Azure Databricks - Databricks SQL,Describe governed tags in Unity Catalog,Learn how to use the DESCRIBE GOVERNED TAG syntax of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime 18.1 and above Unity Catalog only Returns the metadata of an existing governed tag.,2026-04-13T21:03:00.000Z,language-reference,configuration,0.7,True,"This is a detailed SQL syntax reference for DESCRIBE GOVERNED TAG in Databricks/Unity Catalog, including exact command form, parameters, and returned metadata fields. 
These are product-specific configuration and metadata inspection details that an LLM would not reliably know from training, fitting the configuration sub-skill.",new +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-describe-governed-tag,DESCRIBE GOVERNED TAG,DESCRIBE GOVERNED TAG - Azure Databricks - Databricks SQL,Describe governed tags in Unity Catalog,Learn how to use the DESCRIBE GOVERNED TAG syntax of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 18.1 and aboveUnity Catalog only Returns the metadata of an existing governed tag.,2026-04-13T21:03:00.000Z,language-reference,configuration,0.7,True,"This is a detailed SQL syntax reference for DESCRIBE GOVERNED TAG in Databricks/Unity Catalog, including exact command form, parameters, and returned metadata fields. These are product-specific configuration and metadata inspection details that an LLM would not reliably know from training, fitting the configuration sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-describe-location,DESCRIBE LOCATION,DESCRIBE EXTERNAL LOCATION - Azure Databricks - Databricks SQL,,Learn how to use the DESCRIBE EXTERNAL LOCATION syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and aboveUnity Catalog only Returns the metadata of an existing external location. 
The metadata information includes location name, URL, associated credential, owner, and timestamps of creation and last modification.",2024-04-18T08:00:00.000Z,language-reference,,0.45,False,"DESCRIBE EXTERNAL LOCATION metadata; security-adjacent but only returns stored properties, not RBAC roles or config parameters.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-describe-policy,DESCRIBE POLICY,DESCRIBE POLICY - Azure Databricks - Databricks SQL,Use DESCRIBE POLICY for Unity Catalog row filters,Learn how to use the DESCRIBE POLICY SQL statement to view the details of a row filter or column mask policy in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 16.4 and above Unity Catalog only Shows the details of a policy as property-value pairs. To run this statement, you must have the MANAGE privilege on the target securable.",2026-03-13T08:00:00.000Z,language-reference,security,0.7,True,"Security-focused statement tied to Unity Catalog policies; includes specific privilege name (MANAGE) and securable requirements, which are product-specific IAM details that qualify as expert security configuration knowledge.",unchanged @@ -4701,7 +4746,7 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-s
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-show-credentials,SHOW CREDENTIALS,SHOW CREDENTIALS - Azure Databricks - Databricks SQL,List accessible Unity Catalog credentials with SHOW CREDENTIALS,Learn how to use the SHOW CREDENTIALS syntax of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime 10.4 LTS and above Unity Catalog only List all of the credentials present in the workspace that are accessible to the requesting user.,2024-11-07T23:52:00.000Z,language-reference,security,0.7,True,"Security-focused listing of credentials scoped to requesting user, with Unity 
Catalog-specific behavior and requirements.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-show-databases,SHOW DATABASES,SHOW DATABASES - Azure Databricks - Databricks SQL,Use SHOW DATABASES (SCHEMAS) in Databricks SQL,Learn how to use the SHOW DATABASES syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime An alias for SHOW SCHEMAS. While usage of SCHEMA and DATABASE is interchangeable, SCHEMA is preferred.",2024-03-01T08:00:00.000Z,language-reference,integrations,0.65,True,"Documents Databricks aliasing between DATABASE and SCHEMA and exact SHOW DATABASES behavior, which is product-specific.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-show-functions,SHOW FUNCTIONS,SHOW FUNCTIONS - Azure Databricks - Databricks SQL,List functions with SHOW FUNCTIONS in Databricks SQL,Learn how to use the SHOW FUNCTIONS syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Returns the list of functions after applying an optional regex pattern. Databricks SQL supports a large number of functions. You can use SHOW FUNCTIONS in conjunction with describe function to quickly find a function and learn how to use it. 
The LIKE clause is optional, and ensures compatibility with other systems.",2026-01-24T08:00:00.000Z,language-reference,integrations,0.7,True,"Documents Databricks-specific SHOW FUNCTIONS syntax, LIKE/regex usage, and interaction with DESCRIBE FUNCTION; this is concrete command/API behavior beyond generic SQL knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-show-governed-tags,SHOW GOVERNED TAGS,SHOW GOVERNED TAGS - Azure Databricks - Databricks SQL,Use SHOW GOVERNED TAGS in Databricks SQL,Learn how to use the SHOW GOVERNED TAGS syntax of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime 18.1 and above Unity Catalog only Lists all governed tags in the account.,2026-04-13T21:03:00.000Z,language-reference,integrations,0.7,True,"SQL reference pages for a specific product typically include exact syntax, required/optional clauses, and parameter behaviors that are product-specific and not reliably known from training. This page defines the SHOW GOVERNED TAGS command for Databricks SQL/Runtime with precise usage details, fitting the integrations/coding-patterns category (product-specific SQL API surface).",new
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-show-governed-tags,SHOW GOVERNED TAGS,SHOW GOVERNED TAGS - Azure Databricks - Databricks SQL,Use SHOW GOVERNED TAGS in Databricks SQL,Learn how to use the SHOW GOVERNED TAGS syntax of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime 18.1 and above Unity Catalog only Lists all governed tags in the account.,2026-04-13T21:03:00.000Z,language-reference,integrations,0.7,True,"SQL reference pages for a specific product typically include exact syntax, required/optional clauses, and parameter behaviors that are product-specific and not reliably known from training. 
This page defines the SHOW GOVERNED TAGS command for Databricks SQL/Runtime with precise usage details, fitting the integrations/coding-patterns category (product-specific SQL API surface).",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-show-groups,SHOW GROUPS,SHOW GROUPS - Azure Databricks - Databricks SQL,List groups with SHOW GROUPS in Databricks SQL,Learn how to use the SHOW GROUPS syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Lists the groups that match an optionally supplied regular expression pattern. If you don't supply a pattern, the command lists all of the groups in the system. You can optionally supply an identifier to show only the groups a specific user or group belongs to. To run this command you must be an administrator.",2026-01-24T08:00:00.000Z,language-reference,integrations,0.7,True,"Describes a Databricks-only SQL command (SHOW GROUPS), including regex filtering, user/group scoping, and admin requirement—product-specific command semantics and parameters.",unchanged @@ -4750,25 +4795,25 @@ Additionally, the output of this statement may be filtered by an optional matchi https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-sync,SYNC,SYNC - Azure Databricks - Databricks SQL,Use SYNC to upgrade Hive tables to Unity Catalog,Learn how to use the SYNC command of the SQL language in Azure Databricks.,"Applies to: Databricks SQL Databricks Runtime Unity Catalog only Use the SYNC command to upgrade external tables in Hive Metastore to external tables in Unity Catalog. You can also use SYNC to upgrade Hive managed tables that are stored outside of Databricks workspace storage (sometimes called DBFS root) to external tables in Unity Catalog. You cannot use it to upgrade Hive managed tables stored in workspace storage. To upgrade those tables, use CREATE TABLE CLONE.
You can use SYNC to create new tables in",2026-01-24T08:00:00.000Z,language-reference,decision-making,0.65,True,"Gives concrete guidance on when SYNC can and cannot be used (external vs managed tables, storage location) and alternative (CREATE TABLE CLONE), supporting migration decisions between table types.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-comment,How to add comments to SQL statements,Adding comments to SQL statements - Azure Databricks - Databricks SQL,Add comments and hints in Databricks SQL statements,Learn how to annotate SQL with comments in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Comments are useful for documenting SQL code and for temporarily disabling SQL code. You can add comments to SQL code before, after, and within statements. Comments are ignored by Azure Databricks unless they are recognized as hints. The following forms of comments are supported:",2026-02-14T02:30:00.000Z,language-reference,integrations,0.7,True,"Details supported SQL comment syntaxes and special treatment as hints, which is product-specific language behavior.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-catalog,ALTER CATALOG,ALTER CATALOG - Azure Databricks - Databricks SQL,Use ALTER CATALOG to configure Databricks Unity Catalog,Learn how to use the ALTER CATALOG syntax of the SQL language in Databricks SQL or Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 10.4 LTS and above Unity Catalog only Transfers the ownership of a catalog to a new principal, changes the managed storage location of a catalog, applies tags to a catalog, or enables or disables predictive optimization for a catalog.",2026-04-17T18:03:00.000Z,language-reference,configuration,0.78,True,"DDL reference pages for ALTER CATALOG include product-specific SQL syntax, options, and parameters (for ownership transfer, storage location, tags,
predictive optimization) that go beyond generic SQL knowledge. This is configuration-focused expert knowledge about how to set catalog properties in Databricks.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-catalog,ALTER CATALOG,ALTER CATALOG - Azure Databricks - Databricks SQL,Use ALTER CATALOG to configure Databricks Unity Catalog,Learn how to use the ALTER CATALOG syntax of the SQL language in Databricks SQL or Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 10.4 LTS and above Unity Catalog only Transfers the ownership of a catalog to a new principal, changes the managed storage location of a catalog, applies tags to a catalog, or enables or disables predictive optimization for a catalog.",2026-04-17T18:03:00.000Z,language-reference,configuration,0.78,True,"DDL reference pages for ALTER CATALOG include product-specific SQL syntax, options, and parameters (for ownership transfer, storage location, tags, predictive optimization) that go beyond generic SQL knowledge. This is configuration-focused expert knowledge about how to set catalog properties in Databricks.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-catalog-drop-connection,DROP CONNECTION (foreign catalog),DROP CONNECTION (foreign catalog) - Azure Databricks - Databricks SQL,Use DROP CONNECTION to convert foreign catalogs,Use DROP CONNECTION to convert a foreign catalog to a standard catalog in Unity Catalog in Databricks Runtime 17.3 or above.,"Applies to: Databricks SQL Databricks Runtime 17.3 and above Important This feature is in Public Preview and is only available to participating customers at this time. To participate in the preview, apply by filling out this form. This feature only supports dropping the connection for foreign catalogs using Hive Metastore (HMS) and Glue Federation.
Use the DROP CONNECTION command to convert a foreign catalog to a standard catalog in Unity Catalog. After dropping the connection, the catalog no longer ",2026-01-20T08:00:00.000Z,language-reference,configuration,0.7,True,"Provides precise Databricks SQL syntax, options, and behavior for DROP CONNECTION on foreign catalogs (HMS/Glue federation), including side effects after conversion—product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-connection,ALTER CONNECTION,ALTER CONNECTION - Azure Databricks - Databricks SQL,Manage Unity Catalog connections with ALTER CONNECTION,Learn how to use the ALTER CONNECTION syntax of the SQL language in Databricks SQL or Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 13.3 LTS and above Unity Catalog only Transfers the ownership of a connection to a new principal, renames a connection, or changes the connection options. To set a comment on a connection use COMMENT ON CONNECTION.",2026-01-24T08:00:00.000Z,language-reference,configuration,0.8,True,"Defines Databricks-specific ALTER CONNECTION syntax and options (rename, ownership, connection options) with exact parameter names and behaviors, fitting configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-credential,ALTER CREDENTIAL,ALTER CREDENTIAL - Azure Databricks - Databricks SQL,ALTER CREDENTIAL names in Databricks SQL,Learn how to use the ALTER CREDENTIAL syntax of the SQL language in Databricks SQL.,Applies to: Databricks SQL Databricks Runtime 10.4 LTS and above Unity Catalog only Renames a credential.,2025-04-30T00:43:00.000Z,language-reference,configuration,0.7,True,SQL DDL reference for renaming credentials with exact syntax; this is product-specific configuration of security objects.,unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-database,ALTER DATABASE,ALTER DATABASE - Azure Databricks - Databricks SQL,Use ALTER DATABASE (alias for ALTER SCHEMA) in Databricks,Learn how to use the ALTER DATABASE syntax of the SQL language in Databricks SQL and Databricks Runtime,"Applies to: Databricks SQL Databricks Runtime An alias for ALTER SCHEMA. While usage of SCHEMA and DATABASE is interchangeable, SCHEMA is preferred.",2024-03-01T08:00:00.000Z,language-reference,configuration,0.68,True,"Documents that ALTER DATABASE is an alias for ALTER SCHEMA and how it behaves, which is specific DDL/configuration behavior.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-governed-tag,ALTER GOVERNED TAG,ALTER GOVERNED TAG - Azure Databricks - Databricks SQL,Modify governed tags with ALTER GOVERNED TAG in Databricks,Learn how to use the ALTER GOVERNED TAG syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 18.1 and above Unity Catalog only Alters the description or the set of allowed values for a governed tag. To execute this statement, you must have the MANAGE permission on the governed tag.",2026-04-13T21:03:00.000Z,language-reference,configuration,0.78,True,"The page documents the exact ALTER GOVERNED TAG SQL syntax, allowed options, and required MANAGE permission for governed tags in Databricks Unity Catalog.
These are product-specific configuration parameters and privilege requirements, not generic concepts.",new +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-governed-tag,ALTER GOVERNED TAG,ALTER GOVERNED TAG - Azure Databricks - Databricks SQL,Modify governed tags with ALTER GOVERNED TAG in Databricks,Learn how to use the ALTER GOVERNED TAG syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 18.1 and above Unity Catalog only Alters the description or the set of allowed values for a governed tag. To execute this statement, you must have the MANAGE permission on the governed tag.",2026-04-13T21:03:00.000Z,language-reference,configuration,0.78,True,"The page documents the exact ALTER GOVERNED TAG SQL syntax, allowed options, and required MANAGE permission for governed tags in Databricks Unity Catalog. These are product-specific configuration parameters and privilege requirements, not generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-location,ALTER LOCATION,ALTER EXTERNAL LOCATION - Azure Databricks - Databricks SQL,ALTER EXTERNAL LOCATION properties in Databricks,Learn how to use the ALTER EXTERNAL LOCATION syntax of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime 10.4 LTS and above Unity Catalog only Alters properties of an external location or renames the location.,2024-04-18T20:44:00.000Z,language-reference,configuration,0.72,True,"Provides exact ALTER EXTERNAL LOCATION syntax and supported property changes, which are configuration parameters for external locations.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-materialized-view,ALTER MATERIALIZED VIEW,ALTER MATERIALIZED VIEW - Azure Databricks - Databricks SQL,Use ALTER MATERIALIZED VIEW in Databricks SQL,Learn how to use the
ALTER MATERIALIZED VIEW syntax of the SQL language in Databricks SQL.,"Applies to: Databricks SQL Alters metadata associated with the view. Allows you to perform any of the following actions: To add or alter a comment on a materialized view, use COMMENT ON. Note Altering a pipeline-created dataset in ways that contradict the defining SQL can cause some changes to be reverted. See Using ALTER commands with Lakeflow Spark Declarative Pipelines. view_name The name of the materialized view to alter the definition of. The name must not include a temporal specification. schedu",2026-04-07T17:44:00.000Z,language-reference,configuration,0.7,True,"The page is a detailed SQL reference for ALTER MATERIALIZED VIEW in Databricks SQL, specifying exact syntax elements, options, and constraints that are product-specific. It enumerates parameter names (for example, view_name requirements, scheduling/refresh options, and other clauses) and their allowed forms/behaviors, which fits the configuration category more than generic language knowledge. It is not about limits, troubleshooting, or architecture, but about how to configure and modify materialized views via specific SQL options.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-provider,ALTER PROVIDER,ALTER PROVIDER - Azure Databricks - Databricks SQL,ALTER PROVIDER name and ownership in Databricks,Learn how to use the ALTER PROVIDER syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 11.3 LTS and above Unity Catalog only Renames a provider.
-Transfers the ownership of a provider to a new principal.",2024-04-18T20:44:00.000Z,language-reference,configuration,0.7,True,"SQL reference for ALTER PROVIDER with exact operations (rename, transfer ownership), which are configuration of sharing providers.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-provider,ALTER PROVIDER,ALTER PROVIDER - Azure Databricks - Databricks SQL,Use ALTER PROVIDER in Databricks SQL for Unity Catalog,Learn how to use the ALTER PROVIDER syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 11.3 LTS and above Unity Catalog only Renames a provider. +Transfers the ownership of a provider to a new principal. Note ALTER PROVIDER doesn't update provider credentials. To apply a new credential after a provider using the open sharing protocol rotates your bearer token, see Rotate credentials for open recipients.",2026-04-24T18:05:00.000Z,language-reference,integrations,0.7,True,"DDL reference for ALTER PROVIDER with product-specific syntax, options, and behavior (ownership transfer, constraints, and notes about credential handling) that go beyond generic SQL knowledge and are unique to Databricks/Unity Catalog.",updated https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-recipient,ALTER RECIPIENT,ALTER RECIPIENT - Azure Databricks - Databricks SQL,ALTER RECIPIENT name and ownership in Databricks,Learn how to use the ALTER RECIPIENT syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 11.3 LTS and above Unity Catalog only Renames a recipient.
Transfers the ownership of a recipient to a new principal.",2024-08-06T08:00:00.000Z,language-reference,configuration,0.7,True,"Documents ALTER RECIPIENT syntax and behavior, configuring data sharing recipients; this is product-specific DDL configuration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-schema,ALTER SCHEMA,ALTER SCHEMA - Azure Databricks - Databricks SQL,Configure schemas using ALTER SCHEMA in Databricks SQL,Learn how to use the ALTER SCHEMA syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Changes the owner of a schema, changes the managed storage location of a schema, sets predictive optimization behavior, or alters metadata associated with a schema by setting DBPROPERTIES. The specified property values override any existing value with the same property name. While usage of SCHEMA and DATABASE is interchangeable, SCHEMA is preferred.",2026-04-17T18:03:00.000Z,language-reference,configuration,0.78,True,"The ALTER SCHEMA reference describes precise SQL syntax and options for changing schema owner, storage location, predictive optimization, and DBPROPERTIES in Databricks. These are detailed configuration operations unique to this product’s SQL dialect.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-schema,ALTER SCHEMA,ALTER SCHEMA - Azure Databricks - Databricks SQL,Configure schemas using ALTER SCHEMA in Databricks SQL,Learn how to use the ALTER SCHEMA syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Changes the owner of a schema, changes the managed storage location of a schema, sets predictive optimization behavior, or alters metadata associated with a schema by setting DBPROPERTIES. The specified property values override any existing value with the same property name.
While usage of SCHEMA and DATABASE is interchangeable, SCHEMA is preferred.",2026-04-17T18:03:00.000Z,language-reference,configuration,0.78,True,"The ALTER SCHEMA reference describes precise SQL syntax and options for changing schema owner, storage location, predictive optimization, and DBPROPERTIES in Databricks. These are detailed configuration operations unique to this product’s SQL dialect.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-schema-set-managed-location,SET MANAGED LOCATION (FOREIGN SCHEMA),SET MANAGED LOCATION (FOREIGN SCHEMA) - Azure Databricks - Databricks SQL,Set managed locations for foreign schemas in Unity Catalog,Use SET MANAGED LOCATION to modify the default location in cloud storage for new managed tables and managed tables converted from foreign tables in Unity Catalog schemas in Databricks Runtime 17.3 or ,"Applies to: Databricks Runtime 17.3 and above Important This feature is in Public Preview and is only available to participating customers at this time. To participate in the preview, apply by filling out this form. This feature only supports modifying managed locations for schemas in foreign catalogs using HMS and Glue Federation. Use the ALTER SCHEMA SET MANAGED LOCATION command to modify the default location used in cloud storage when you create new managed tables in Unity Catalog schemas.
This c",2026-01-20T08:00:00.000Z,language-reference,configuration,0.75,True,"Describes ALTER SCHEMA SET MANAGED LOCATION for foreign catalogs (HMS/Glue), including scope and behavior for new/converted managed tables—specific storage location configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-share,ALTER SHARE,ALTER SHARE - Azure Databricks - Databricks SQL,Manage data shares with ALTER SHARE in Unity Catalog,Learn how to use the ALTER SHARE syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 10.4 LTS and above Unity Catalog only Adds, alters or removes schemas, tables, materialized views, or views to or from the share. Renames a share. Transfers the ownership of a share to a new principal. Permissions required:",2026-01-24T08:00:00.000Z,language-reference,security,0.8,True,"Covers ALTER SHARE syntax for adding/removing objects, renaming, and transferring ownership, plus required permissions—detailed sharing and access-control configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-streaming-table,ALTER STREAMING TABLE,ALTER STREAMING TABLE - Azure Databricks - Databricks SQL,ALTER STREAMING TABLE metadata in Databricks SQL,Learn how to use the ALTER STREAMING TABLE syntax of the SQL language in Databricks SQL.,"Applies to: Databricks SQL Allows you to perform any of the following actions: To add or alter a comment on a streaming table, use COMMENT ON. Note Altering a pipeline-created dataset in ways that contradict the defining SQL can cause some changes to be reverted.
See Using ALTER commands with Lakeflow Spark Declarative Pipelines.",2026-02-23T08:00:00.000Z,language-reference,configuration,0.72,True,"Provides ALTER STREAMING TABLE syntax and supported actions, including interaction with Lakeflow pipelines, which are detailed configuration behaviors.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-table,ALTER TABLE,ALTER TABLE - Azure Databricks - Databricks SQL,Alter table schema and properties in Databricks SQL,Learn how to use the ALTER TABLE syntax of the SQL language in Databricks SQL.,"Applies to: Databricks SQL Databricks Runtime Alters the schema or properties of a table. For type changes or renaming columns in Delta Lake see rewrite the data. To change the comment on a table or a column, you can also use COMMENT ON. To alter a streaming table or materialized view, use ALTER STREAMING TABLE, or ALTER MATERIALIZED VIEW. The ALTER TABLE command isn't supported for temporary tables. An error is returned if the ALTER TABLE command is applied to a temporary table. If the table is cached, t",2026-01-24T08:00:00.000Z,language-reference,configuration,0.9,True,"Comprehensively documents ALTER TABLE syntax, supported operations, unsupported cases (temporary tables, streaming tables), and behaviors like caching—detailed product-specific configuration rules.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-table,ALTER TABLE,ALTER TABLE - Azure Databricks - Databricks SQL,Alter Databricks SQL tables with ALTER TABLE,Learn how to use the ALTER TABLE syntax of the SQL language in Databricks SQL.,"Applies to: Databricks SQL Databricks Runtime Alters the schema or properties of a table. The ALTER TABLE command isn't supported for temporary tables. An error is returned if the ALTER TABLE command is applied to a temporary table. If the table is cached, the command clears cached data of the table and all its dependents that refer to it.
The cache will be lazily filled when the table or the dependents are accessed the next time. On foreign tables, you can only perform ALTER TABLE SET OWNER and ALTER TA",2026-04-23T17:47:00.000Z,language-reference,configuration,0.7,True,"Detailed DDL syntax and options for ALTER TABLE in Databricks SQL, including supported/unsupported table types (for example, temporary and foreign tables) and product-specific behaviors like cache invalidation, which are not generic SQL and represent expert configuration knowledge.",updated https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-table-add-constraint,ADD CONSTRAINT clause,ADD CONSTRAINT clause - Azure Databricks - Databricks SQL,Add table constraints with ALTER TABLE ADD CONSTRAINT,Learn how to use the ALTER TABLE ADD CONSTRAINT syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Adds an informational primary key, informational foreign key, or an enforced check constraint to an existing Delta Lake table.",2026-01-24T08:00:00.000Z,language-reference,configuration,0.8,True,"Describes Databricks-specific SQL syntax for informational primary/foreign keys and enforced check constraints on Delta tables, including clause forms and behaviors—product-specific DDL configuration.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-table-drop-constraint,DROP CONSTRAINT clause,DROP CONSTRAINT clause - Azure Databricks - Databricks SQL,Drop table constraints with ALTER TABLE DROP CONSTRAINT,Learn how to use the ALTER TABLE DROP CONSTRAINT syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Drops a PRIMARY KEY, FOREIGN KEY, or CHECK constraint from the relation.",2026-01-20T08:00:00.000Z,language-reference,configuration,0.8,True,"Provides exact Databricks SQL syntax and behavior for dropping PRIMARY KEY, FOREIGN KEY, and CHECK
constraints, which are concrete DDL configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-table-manage-column,ALTER TABLE … COLUMN clause,ALTER TABLE ... COLUMN clause - Azure Databricks - Databricks SQL,Alter table columns and fields in Databricks SQL,Learn how to use the ALTER TABLE … COLUMN syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Adds, modifies, or drops a column in a table, or a field in a column in a Delta Lake table.",2026-02-17T08:00:00.000Z,language-reference,configuration,0.8,True,"Documents ALTER TABLE ... COLUMN syntax for adding, modifying, and dropping columns/fields in Delta tables with Databricks-specific options—detailed configuration parameters.",unchanged @@ -4780,7 +4825,7 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-s https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-volume,ALTER VOLUME,ALTER VOLUME - Azure Databricks - Databricks SQL,ALTER VOLUME name and owner in Databricks,Learn how to use the ALTER VOLUME syntax of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime 13.3 LTS and above Unity Catalog only Alters the name or owner of a volume.,2024-06-12T16:26:00.000Z,language-reference,configuration,0.72,True,"SQL reference for ALTER VOLUME with exact operations (rename, change owner) and Unity Catalog constraints, which are configuration details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-cluster-by,CLUSTER BY clause (TABLE),CLUSTER BY clause (TABLE) - Azure Databricks - Databricks SQL,Define liquid clustering with CLUSTER BY on Delta tables,Learn how to use the CLUSTER BY clause syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 13.3 LTS and
above Delta Lake only Defines liquid, multi-dimensional clustering for a relation. Azure Databricks recommends using automatic liquid clustering and predictive optimization for all Unity Catalog managed tables. These features provide intelligent optimization of data layout based on your data usage patterns. You can use this clause when you: Create a table using CREATE TABLE. Alter a table with ALTER TABLE to change the clustering columns. To c",2026-01-20T08:00:00.000Z,language-reference,configuration,0.8,True,"Explains Databricks-specific CLUSTER BY syntax for liquid, multi-dimensional clustering, including how to use it on CREATE/ALTER TABLE and its interaction with predictive optimization—product-specific configuration.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-column-mask,Column mask clause,Column mask clause - Azure Databricks - Databricks SQL,Configure column masks for fine-grained data access,Learn how to use the column mask clause syntax of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime 12.2 LTS and above Unity Catalog only Specifies a function that is applied to a column whenever rows are fetched from the table. All subsequent queries from that column receive the result of evaluating that function over the column in place of the column's original value.
This can be useful for fine-grained access control purposes where the function can inspect the identity or group memberships of the invoking user to determine whether to redact the val,2026-04-03T22:07:00.000Z,language-reference,security,0.85,True,"Details the column mask clause syntax, specifying how to bind masking functions to columns and how identity/group info is used—product-specific security configuration with exact clause forms.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-column-mask,Column mask clause,Column mask clause - Azure Databricks - Databricks SQL,Define and manage column masks in Databricks SQL,Learn how to use the column mask clause syntax of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL Databricks Runtime 12.2 LTS and above Unity Catalog only Specifies a function that is applied to a column whenever rows are fetched from the table. All subsequent queries from that column receive the result of evaluating that function over the column in place of the column's original value.
This can be useful for fine-grained access control purposes where the function can inspect the identity or group memberships of the invoking user to determine whether to redact the val,2026-04-23T21:54:00.000Z,language-reference,security,0.7,True,"Documents product-specific column mask clause syntax and behavior for fine-grained access control, including how masks are applied on query, interaction with user identity/groups, and Unity Catalog constraints—security configuration details unique to Databricks.",updated https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-comment,COMMENT ON,COMMENT ON - Azure Databricks - Databricks SQL,Set object comments with COMMENT ON in Databricks,Learn how to use the COMMENT syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime Sets a comment on a catalog, schema, table, column, share, recipient, provider, or volume. Note If you want to add an AI-generated comment for a table or table column managed by Unity Catalog, see Add AI-generated comments to Unity Catalog objects. Catalogs, shares, recipients, and providers are supported in Unity Catalog only.",2026-02-26T19:56:00.000Z,language-reference,configuration,0.74,True,"Provides COMMENT ON syntax for multiple object types and notes on AI-generated comments and Unity Catalog scope, which are specific configuration behaviors.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-catalog,CREATE CATALOG,CREATE CATALOG - Azure Databricks - Databricks SQL,Create catalogs with Databricks SQL CREATE CATALOG,Learn how to use the CREATE CATALOG syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 10.4 LTS and above Unity Catalog only Creates a catalog with the specified name. If a catalog with the same name already exists, an exception is thrown.
When you create a FOREIGN catalog it will be populated with all the schemas and their tables visible to the authenticating user.",2026-01-24T08:00:00.000Z,language-reference,integrations,0.7,True,"Detailed SQL DDL reference for CREATE CATALOG including Unity Catalog specifics and FOREIGN catalog behavior. Contains product-specific syntax and semantics that function as API/DDL parameters, matching integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-connection,CREATE CONNECTION,CREATE CONNECTION - Azure Databricks - Databricks SQL,Define foreign connections with CREATE CONNECTION,Learn how to use the CREATE CONNECTION syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 13.3 LTS and above Unity Catalog only This command creates a foreign connection (or server), which represents a remote data system of a specific type, using system specific options that provide the location of the remote system and authentication details. Foreign connections enable federated queries.",2026-01-24T08:00:00.000Z,language-reference,integrations,0.78,True,"Describes CREATE CONNECTION DDL for foreign connections, including system-specific options and authentication details. These are concrete configuration/parameter patterns for federated queries, which qualify as product-specific integration patterns.",unchanged @@ -4789,12 +4834,12 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-s as permanent functions are created in the persistent catalog and are made available to all sessions. The resources specified in the USING clause are made available to all executors when they are executed for the first time.
In addition to the SQL interface, Spark allows you to create custom user defined scalar and aggregate functions using Scala, Python, and Jav",2026-01-24T08:00:00.000Z,language-reference,integrations,0.76,True,"Documents CREATE FUNCTION for external functions, including USING clause resource handling and session vs persistent scope. These are detailed, product-specific SQL/Scala/Python/Java UDF integration patterns.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-governed-tag,CREATE GOVERNED TAG,CREATE GOVERNED TAG - Azure Databricks - Databricks SQL,Create governed tags with Databricks CREATE GOVERNED TAG,Learn how to use the CREATE GOVERNED TAG syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 18.1 and above Unity Catalog only Creates a governed tag, which is an account-level resource. To execute this statement, you must have the CREATE privilege at the account level. Account admins and workspace admins have this privilege by default.",2026-04-13T21:03:00.000Z,language-reference,configuration,0.78,True,"The page defines the CREATE GOVERNED TAG SQL syntax, parameters, and required CREATE privilege at the account level, including role assumptions (account/workspace admins). This is expert, product-specific configuration of governed tags in Unity Catalog.",new +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-governed-tag,CREATE GOVERNED TAG,CREATE GOVERNED TAG - Azure Databricks - Databricks SQL,Create governed tags with Databricks CREATE GOVERNED TAG,Learn how to use the CREATE GOVERNED TAG syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL Databricks Runtime 18.1 and above Unity Catalog only Creates a governed tag, which is an account-level resource. To execute this statement, you must have the CREATE privilege at the account level.
Account admins and workspace admins have this privilege by default.",2026-04-13T21:03:00.000Z,language-reference,configuration,0.78,True,"The page defines the CREATE GOVERNED TAG SQL syntax, parameters, and required CREATE privilege at the account level, including role assumptions (account/workspace admins). This is expert, product-specific configuration of governed tags in Unity Catalog.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-location,CREATE LOCATION,CREATE EXTERNAL LOCATION - Azure Databricks - Databricks SQL,Configure external locations with CREATE EXTERNAL LOCATION,Learn how to use the CREATE EXTERNAL LOCATION syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and aboveUnity Catalog only Creates an external location with the specified name. If a location with the same name already exists, an exception is thrown. For how-to instructions, seeOption 2: Create an external location using SQL.",2026-01-20T08:00:00.000Z,language-reference,configuration,0.68,True,"DDL reference for creating external locations in Unity Catalog, including required options and behavior when names conflict. This is concrete configuration of storage locations via SQL, matching configuration-style expert knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-materialized-view,CREATE MATERIALIZED VIEW,CREATE MATERIALIZED VIEW - Azure Databricks - Databricks SQL,Define materialized views in Databricks SQL,Learn how to use the CREATE MATERIALIZED VIEW syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQL Amaterialized viewis a view where precomputed results are available for query and can be updated to reflect changes in the input. 
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-materialized-view,CREATE MATERIALIZED VIEW,CREATE MATERIALIZED VIEW - Azure Databricks - Databricks SQL,Use CREATE MATERIALIZED VIEW in Databricks SQL,Learn how to use the CREATE MATERIALIZED VIEW syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQL Amaterialized viewis a view where precomputed results are available for query and can be updated to reflect changes in the input. Each time a materialized view is refreshed, query results are recalculated to reflect changes in upstream datasets. -All materialized views are backed by an ETL pipeline. You can refresh materialized views manually or on a schedule. To learn more about how to perform a manual refresh, seeREFRESH (MATERIALIZED VIEW or STREAMING TABLE). To learn",2026-04-07T08:00:00.000Z,language-reference,integrations,0.7,True,"The page is a SQL language manual entry for CREATE MATERIALIZED VIEW in Databricks, with Databricks-specific semantics (backing ETL pipeline, refresh behavior, supported clauses). This is concrete, product-specific SQL syntax and behavior, fitting integrations/coding patterns rather than generic concepts.",unchanged +All materialized views are backed by an ETL pipeline. You can refresh materialized views manually or on a schedule. To learn more about how to perform a manual refresh, seeREFRESH (MATERIALIZED VIEW or STREAMING TABLE). To learn",2026-04-22T08:00:00.000Z,language-reference,configuration,0.78,True,"This is a detailed SQL reference page for CREATE MATERIALIZED VIEW in Databricks SQL/Runtime. Such language-manual pages typically enumerate all supported clauses, options, and constraints (e.g., syntax forms, required/optional parameters, allowed values, and behavior of refresh options) that are product-specific and not inferable from generic SQL knowledge. 
That fits the configuration category as it defines concrete, product-specific command parameters and their valid usage rather than conceptual guidance.",updated https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-materialized-view-refresh-policy,REFRESH POLICY clause,REFRESH POLICY clause - Azure Databricks - Databricks SQL,Configure REFRESH POLICY for materialized views,Learn how to use the CREATE MATERIALIZED VIEW REFRESH POLICY syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 17.3 and above Important This feature is inBeta. Available in Databricks Runtime 17.3 and above. Adds a refresh policy to the materialized view, controlling when to incrementalize the refresh. Applies to theCREATE MATERIALIZED VIEWstatement. To learn about incrementalization, seeIncremental refresh for materialized views. You can check whether a SQL query is incrementalizable with theEXPLAIN CREATE MATERIALIZED VIEWstatement. SeeEXPLAIN CREATE MATERIAL",2026-02-23T08:00:00.000Z,language-reference,configuration,0.74,True,Documents REFRESH POLICY clause with specific options controlling incremental refresh behavior. These are concrete configuration parameters and allowed values for a Databricks-specific feature.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-policy,CREATE POLICY,CREATE POLICY - Azure Databricks - Databricks SQL,Define row-level security policies in Databricks SQL,Learn how to use the CREATE POLICY SQL statement to define row filters and column masks on securables in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 16.4 and aboveUnity Catalog only Creates a named policy on a securable. Policies can be row filters or column masks applied to catalogs, schemas, or tables. Thepolicy nameis scoped to the securable the policy is defined on. 
To run this statement, you must have theMANAGEprivilege on the target securable or be its owner.",2026-03-16T17:36:00.000Z,language-reference,security,0.78,True,"This is a detailed SQL syntax reference for CREATE POLICY in Azure Databricks/Unity Catalog, including product-specific semantics for row filters and column masks, required privileges (MANAGE, ownership), and how policies attach to securables. These are concrete, product-unique security configuration details that go beyond generic SQL knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-procedure,CREATE PROCEDURE,CREATE PROCEDURE - Azure Databricks - Databricks SQL,Create and manage procedures in Databricks SQL,Learn how to create and use stored procedures in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQL Databricks Runtime 17.0 and aboveUnity Catalog only Creates a procedure in Unity Catalog that takes or modifies arguments, executes a set of SQL statements, and optionally returns a result set. In addition topositional parameter invocation, you can also invoke procedures usingnamed parameter invocation.",2026-04-06T08:00:00.000Z,language-reference,integrations,0.75,True,"This is a detailed syntax reference for CREATE PROCEDURE in Databricks SQL/Runtime with Unity Catalog, including named vs positional parameters and Databricks-specific behavior. 
It represents expert, product-specific SQL integration/coding knowledge, not general theory or limits tables.",unchanged @@ -4811,7 +4856,7 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-s https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-table-hiveformat,CREATE TABLE with Hive format,CREATE TABLE with Hive format - Azure Databricks - Databricks SQL,Create Hive-format tables in Databricks Runtime,Learn how to use the CREATE TABLE with Hive format syntax of the SQL language in Azure Databricks.,"Applies to:Databricks Runtime Defines a table using Hive format. The clauses between the column definition clause and theAS SELECTclause can appear in any order. For example, you can writeCOMMENT table_commentafterTBLPROPERTIES. Note You must specify either theSTORED ASorROW FORMATclause. Otherwise, the SQL parser uses theCREATE TABLE [USING]syntax to parse it and creates a Delta table by default. table_identifier A table name, optionally qualified with a schema name. Syntax:[schema_name.] table",2024-03-01T08:00:00.000Z,language-reference,configuration,0.76,True,"CREATE TABLE with Hive format syntax, required clauses, and Databricks-specific defaulting to Delta if omitted.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-table-like,CREATE TABLE LIKE,CREATE TABLE LIKE - Azure Databricks - Databricks SQL,Create tables LIKE existing tables in Databricks,Learn how to use the CREATE TABLE LIKE syntax of the SQL language in Databricks SQL.,"Applies to:Databricks SQLDatabricks Runtime Defines a table using the definition and metadata of an existing table. Delta Lake does supportCREATE TABLE LIKEin Databricks SQL and Databricks Runtime 13.3 LTS and above. 
In Databricks Runtime 12.2 LTS and below, useCREATE TABLE AS.",2025-01-24T08:00:00.000Z,language-reference,configuration,0.7,True,CREATE TABLE LIKE behavior and version-specific support for Delta Lake in Databricks Runtime.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-table-using,CREATE TABLE [USING],CREATE TABLE [USING] - Azure Databricks - Databricks SQL,"Create managed, temporary, and external tables in Databricks SQL",Learn how to use the CREATE TABLE \[USING] syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Defines a managed, temporary, orexternal table, and, for non-temporary tables, optionally using a data source. TheCREATE TEMP TABLEcommand creates a session-local temporary table that persists data temporarily during the session. The table name must be unqualified (no schema or catalog prefix). Temporary tables don't reside in the current schema or catalog and are only accessible within the session that created them. Databricks drops temporary tables a",2026-04-09T08:00:00.000Z,language-reference,integrations,0.75,True,"The page documents the Databricks-specific CREATE TABLE [USING] syntax, including behavior of managed, temporary, and external tables and data source usage. This is concrete, product-specific SQL syntax and behavior, aligning with integrations/coding patterns rather than the other categories.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-table-using,CREATE TABLE [USING],CREATE TABLE [USING] - Azure Databricks - Databricks SQL,Define tables with CREATE TABLE in Databricks SQL,Learn how to use the CREATE TABLE \[USING] syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Defines a managed, temporary, orexternal table, and, for non-temporary tables, optionally using a data source. 
TheCREATE TEMP TABLEcommand creates a session-local temporary table that persists data temporarily during the session. The table name must be unqualified (no schema or catalog prefix). Temporary tables don't reside in the current schema or catalog and are only accessible within the session that created them. Databricks drops temporary tables a",2026-04-23T08:00:00.000Z,language-reference,configuration,0.8,True,"This SQL reference for CREATE TABLE [USING] in Databricks SQL/Runtime describes exact syntax and options for managed, temporary, and external tables, including data source usage and behavior differences (e.g., session-local temp tables, qualification rules, persistence semantics). These are product-specific configuration details for table creation (parameters, options, and constraints) that go beyond generic SQL knowledge, matching the configuration sub-skill.",updated https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-view,CREATE VIEW,CREATE VIEW - Azure Databricks - Databricks SQL,Create views and metric views in Databricks SQL,Learn how to use the CREATE VIEW syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Constructs a virtual table that has no physical data based on the result-set of a SQL query or a metric view based on a yaml specification.ALTER VIEWandDROP VIEWonly change metadata. To execute this statement, you must be a metastore administrator or haveUSE CATALOGandUSE SCHEMAprivileges on the catalog and schema, along withCREATE TABLEprivileges in the target schema. The user executing this command will become the owner of the view.",2026-02-25T08:00:00.000Z,language-reference,integrations,0.76,True,"Reference for CREATE VIEW including metric views, YAML-based definitions, and required privileges. 
Contains concrete syntax and permission requirements that are specific to Databricks SQL.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-volume,CREATE VOLUME,CREATE VOLUME - Azure Databricks - Databricks SQL,Create Unity Catalog volumes with CREATE VOLUME,Learn how to use the CREATE VOLUME syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 13.3 LTS and aboveUnity Catalog only Creates a volume with the specified name. If a volume with the same name already exists in the schemaVOLUME_ALREADY_EXISTSis raised. SeeVolumesfor details about using volumes.",2026-01-20T08:00:00.000Z,language-reference,integrations,0.68,True,"DDL reference for CREATE VOLUME, including Unity Catalog constraints and specific error (VOLUME_ALREADY_EXISTS). These are product-specific SQL semantics and options.",unchanged @@ -4825,7 +4870,7 @@ To drop a credential you must have theMANAGEprivilege on the credential or be it https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-drop-database,DROP DATABASE,DROP DATABASE - Azure Databricks - Databricks SQL,Drop databases (schemas) in Databricks SQL,Learn how to use the DROP DATABASE syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime An alias forDROP SCHEMA. While usage ofSCHEMAandDATABASEis interchangeable,SCHEMAis preferred. 
To drop a database you must have theMANAGEprivilege on the database or be its owner.",2024-12-16T21:39:00.000Z,language-reference,configuration,0.64,True,DROP DATABASE aliasing to DROP SCHEMA and privilege requirements are Databricks-specific behaviors.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-drop-function,DROP FUNCTION,DROP FUNCTION - Azure Databricks - Databricks SQL,Drop user-defined functions with DROP FUNCTION in Databricks,Learn how to use the DROP FUNCTION syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Drops a temporary or permanent user-defined function (UDF). To drop a function you must have theMANAGEprivilege on the function, be its owner, or the owner of the schema, catalog, or metastore the function resides in.",2026-01-20T08:00:00.000Z,language-reference,integrations,0.66,True,Reference for DROP FUNCTION including required MANAGE privilege and ownership rules. Contains product-specific DDL semantics and permission patterns for function lifecycle management.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-drop-governed-tag,DROP GOVERNED TAG,DROP GOVERNED TAG - Azure Databricks - Databricks SQL,Use DROP GOVERNED TAG in Databricks SQL,Learn how to use the DROP GOVERNED TAG syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 18.1 and aboveUnity Catalog only Drops a governed tag. When a governed tag is dropped, it is not removed from objects it was applied to. Instead, the tag becomes ungoverned: the tag key and any existing tag values remain on objects, but the tag is no longer subject to a tag policy. 
To execute this statement, you must have theMANAGEpermission on the governed tag.",2026-04-13T21:03:00.000Z,language-reference,configuration,0.7,True,"DDL syntax reference for a specific Databricks SQL statement, including required permission (MANAGE on the governed tag) and product/version scope. This is product-specific configuration/DDL behavior that an LLM is unlikely to know from training.",new +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-drop-governed-tag,DROP GOVERNED TAG,DROP GOVERNED TAG - Azure Databricks - Databricks SQL,Use DROP GOVERNED TAG in Databricks SQL,Learn how to use the DROP GOVERNED TAG syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 18.1 and aboveUnity Catalog only Drops a governed tag. When a governed tag is dropped, it is not removed from objects it was applied to. Instead, the tag becomes ungoverned: the tag key and any existing tag values remain on objects, but the tag is no longer subject to a tag policy. To execute this statement, you must have theMANAGEpermission on the governed tag.",2026-04-13T21:03:00.000Z,language-reference,configuration,0.7,True,"DDL syntax reference for a specific Databricks SQL statement, including required permission (MANAGE on the governed tag) and product/version scope. This is product-specific configuration/DDL behavior that an LLM is unlikely to know from training.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-drop-location,DROP LOCATION,DROP EXTERNAL LOCATION - Azure Databricks - Databricks SQL,Drop external locations in Databricks Unity Catalog,Learn how to use the DROP EXTERNAL LOCATION syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 10.4 LTS and aboveUnity Catalog only Drops an external location. An exception is thrown if the location does not exist in the metastore. 
To drop an external location you must have theMANAGEprivilege on the external location or be its owner.",2025-04-30T22:51:00.000Z,language-reference,configuration,0.68,True,DROP EXTERNAL LOCATION syntax and MANAGE privilege requirements are specific to Databricks.,unchanged @@ -4866,7 +4911,7 @@ unqualified references to objects such as tables, functions, and views that are referenced by SQLs are resolved from the current schema. The default schema name isdefault. While usage ofSCHEMAandDATABASEis interchangeable,SCHEMAis preferred.",2024-04-21T19:33:00.000Z,language-reference,configuration,0.74,True,"Product-specific rules for schema scoping, default schema name, and DATABASE/SCHEMA terminology.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-usedb,USE DATABASE,USE DATABASE - Azure Databricks - Databricks SQL,Change current database with USE DATABASE (USE SCHEMA) in Databricks,Learn how to use the USE DATABASE syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime An alias forUSE SCHEMA. While usage ofSCHEMA,NAMESPACEandDATABASEis interchangeable,SCHEMAis preferred.",2024-03-01T08:00:00.000Z,language-reference,configuration,0.66,True,"Documents Databricks aliasing between DATABASE, SCHEMA, and NAMESPACE and their effect on object resolution.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-dml-insert-into,INSERT INTO,INSERT - Azure Databricks - Databricks SQL,Use INSERT syntax in Databricks SQL,Learn how to use the INSERT syntax of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Inserts new rows into a table and optionally truncates the table or partitions. You specify the inserted rows by value expressions or the result of a query. 
Databricks does not supportINSERTforHive Avrotables if thetimestamp-millistype is present in the table schema.,2026-04-13T08:00:00.000Z,language-reference,integrations,0.7,True,"Product-specific DML syntax and constraints (for example, unsupported Hive Avro timestamp-millis) represent detailed behavioral knowledge beyond generic SQL.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-dml-insert-into,INSERT INTO,INSERT - Azure Databricks - Databricks SQL,Use INSERT syntax in Azure Databricks SQL,Learn how to use the INSERT syntax of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Inserts new rows into a table and optionally truncates the table or partitions. You specify the inserted rows by value expressions or the result of a query. Databricks does not supportINSERTforHive Avrotables if thetimestamp-millistype is present in the table schema.,2026-04-20T08:00:00.000Z,language-reference,integrations,0.7,True,"The page documents the exact, product-specific behavior and constraints of the Databricks SQL INSERT statement (including unsupported cases like Hive Avro with timestamp-millis and Databricks-specific syntax/semantics). This is detailed reference behavior for a specific platform’s SQL dialect, which qualifies as expert knowledge. 
It is not about limits/quotas, troubleshooting, or configuration, but about precise API/SQL usage patterns, so it best fits the integrations & coding patterns category.",updated https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-dml-insert-overwrite-directory,INSERT OVERWRITE DIRECTORY,INSERT OVERWRITE DIRECTORY - Azure Databricks - Databricks SQL,Write query results to directories with INSERT OVERWRITE,Learn how to use the INSERT OVERWRITE DIRECTORY syntax of the SQL language in Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Overwrites the existing data in the directory with the new values using a given Spark file format. You specify the inserted row by value expressions or the result of a query.,2024-03-01T08:00:00.000Z,language-reference,configuration,0.6,True,"INSERT OVERWRITE DIRECTORY reference defines Databricks-specific syntax and behavior for writing to file system directories using Spark formats, which is configuration-level detail.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-dml-insert-overwrite-directory-hive,INSERT OVERWRITE DIRECTORY with Hive format,INSERT OVERWRITE DIRECTORY with Hive format - Azure Databricks - Databricks SQL,Use INSERT OVERWRITE DIRECTORY with HiveSerDe,Learn how to use the INSERT OVERWRITE DIRECTORY with Hive format syntax of the SQL language in Databricks Runtime.,"Applies to:Databricks Runtime Overwrites the existing data in the directory with the new values using HiveSerDe. Hive support must be enabled to use this command. 
You specify the inserted rows by value expressions or the result of a query.",2024-03-01T08:00:00.000Z,language-reference,configuration,0.6,True,"Documents HiveSerDe-specific variant of INSERT OVERWRITE DIRECTORY, including requirement to enable Hive support—Databricks-specific configuration and integration semantics.",unchanged @@ -4874,57 +4919,57 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-s https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-explain,EXPLAIN,EXPLAIN - Azure Databricks - Databricks SQL,,Learn how to use the EXPLAIN syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Provides the logical or physical plans for an input statement. By default, this clause provides information about a physical plan only.",2024-03-01T08:00:00.000Z,language-reference,,0.4,False,"EXPLAIN syntax reference; no numeric limits, decision matrices, or config tables.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-explain-materialized-view,EXPLAIN CREATE MATERIALIZED VIEW,EXPLAIN CREATE MATERIALIZED VIEW - Azure Databricks - Databricks SQL,,Learn how to use the EXPLAIN CREATE MATERIALIZED VIEW syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime 17.3 and above Important This feature is inBeta. Available in Databricks Runtime 17.3 and above. Provides information about whether a query can be incrementalized when being refreshed for a materialized view. To learn about materialized view incrementalization, seeIncremental refresh for materialized views. Important EXPLAIN CREATE MATERIALIZED VIEWconfirms structural eligibility for incrementalization. 
It does not guarantee an incremental refresh will",2026-02-24T20:13:00.000Z,language-reference,,0.45,False,"EXPLAIN CREATE MATERIALIZED VIEW describes eligibility conceptually; no quantified thresholds, limits, or config tables.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-pipeline,SQL Pipeline Syntax,SQL Pipeline Syntax - Azure Databricks - Databricks SQL,Author SQL pipelines in Databricks SQL,Learn how to use SQL pipeline syntax in the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 16.2 and above Azure Databricks supports SQL pipeline syntax which allows composing queries from combinations of chained operators.,2026-04-14T17:47:00.000Z,language-reference,integrations,0.7,True,Describes Databricks-specific SQL pipeline syntax and chaining semantics introduced in specific runtime versions.,updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-query,Query,Query - Azure Databricks - Databricks SQL,Compose queries in Databricks SQL,Learn how to use Query syntax in the SQL language in Databricks SQL and Databricks Runtime.,Retrieves result sets from one or more tables. 
Applies to:Databricks SQLDatabricks Runtime,2026-04-14T08:00:00.000Z,language-reference,integrations,0.65,True,"Documents Databricks-specific query syntax and options, which are implementation details not fully captured by generic SQL knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-pipeline,SQL Pipeline Syntax,SQL Pipeline Syntax - Azure Databricks - Databricks SQL,Author SQL pipelines in Databricks SQL,Learn how to use SQL pipeline syntax in the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime 16.2 and above Azure Databricks supports SQL pipeline syntax which allows composing queries from combinations of chained operators.,2026-04-14T17:47:00.000Z,language-reference,integrations,0.7,True,Describes Databricks-specific SQL pipeline syntax and chaining semantics introduced in specific runtime versions.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-query,Query,Query - Azure Databricks - Databricks SQL,Compose queries in Databricks SQL,Learn how to use Query syntax in the SQL language in Databricks SQL and Databricks Runtime.,Retrieves result sets from one or more tables. Applies to:Databricks SQLDatabricks Runtime,2026-04-14T08:00:00.000Z,language-reference,integrations,0.65,True,"Documents Databricks-specific query syntax and options, which are implementation details not fully captured by generic SQL knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select,SELECT,SELECT (subselect) - Azure Databricks - Databricks SQL,Write SELECT subqueries in Databricks SQL,Learn how to use the subselect syntax in the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Composes a result set from one or moretable references. 
-TheSELECTclause can be part of a query which also includes common table expressions (CTE), set operations, and various other clauses.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.65,True,Covers Databricks-specific SELECT/subselect syntax and how it composes with CTEs and set operations.,updated +TheSELECTclause can be part of a query which also includes common table expressions (CTE), set operations, and various other clauses.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.65,True,Covers Databricks-specific SELECT/subselect syntax and how it composes with CTEs and set operations.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-clusterby,CLUSTER BY clause,CLUSTER BY clause (SELECT) - Azure Databricks - Databricks SQL,Repartition and sort results with CLUSTER BY,Learn how to use the CLUSTER BY syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Repartitions the data based on the input expressions and then sorts the data within each partition. This is semantically equivalent to performing aDISTRIBUTE BYfollowed by aSORT BY. 
This clause only ensures that the resultant rows are sorted within each partition and does not guarantee a total order of output.",2024-03-01T08:00:00.000Z,language-reference,configuration,0.6,True,"CLUSTER BY reference explains Databricks-specific semantics (equivalence to DISTRIBUTE BY + SORT BY, partition-local ordering only), which are engine-specific behavior details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-column-list,SELECT clause,SELECT clause - Azure Databricks - Databricks SQL,Project columns with SELECT clause in Databricks SQL,Learn how to use the SELECT clause syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Collects the columns to be returned from the subquery, including the execution of expressions, aggregations, and deduplication.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.65,True,"Describes Databricks-specific SELECT clause capabilities including expressions, aggregations, and deduplication semantics.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-cte,Common table expression (CTE),Common table expression (CTE) - Azure Databricks - Databricks SQL,Use CTEs in Databricks SQL queries,Learn how to use a common table expression of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Defines a temporary result set that you can reference possibly multiple times within the scope of a SQL statement. 
A CTE is used mainly in aSELECTstatement.,2026-04-14T17:47:00.000Z,language-reference,integrations,0.65,True,"Describes Databricks-specific common table expression syntax and behavior, which are concrete language details.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-column-list,SELECT clause,SELECT clause - Azure Databricks - Databricks SQL,Project columns with SELECT clause in Databricks SQL,Learn how to use the SELECT clause syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Collects the columns to be returned from the subquery, including the execution of expressions, aggregations, and deduplication.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.65,True,"Describes Databricks-specific SELECT clause capabilities including expressions, aggregations, and deduplication semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-cte,Common table expression (CTE),Common table expression (CTE) - Azure Databricks - Databricks SQL,Use CTEs in Databricks SQL queries,Learn how to use a common table expression of the SQL language in Databricks SQL and Databricks Runtime.,Applies to:Databricks SQLDatabricks Runtime Defines a temporary result set that you can reference possibly multiple times within the scope of a SQL statement. 
A CTE is used mainly in a SELECT statement.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.65,True,"Describes Databricks-specific common table expression syntax and behavior, which are concrete language details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-distributeby,DISTRIBUTE BY clause,DISTRIBUTE BY clause - Azure Databricks - Databricks SQL,Control data partitioning with DISTRIBUTE BY,Learn how to use the DISTRIBUTE BY syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime. Repartitions data based on the input expressions. Unlike the CLUSTER BY clause, does not sort the data within each partition.",2024-03-01T08:00:00.000Z,language-reference,configuration,0.6,True,"DISTRIBUTE BY reference describes Databricks-specific repartitioning behavior distinct from CLUSTER BY and ORDER BY, which is engine-specific configuration detail.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-groupby,GROUP BY clause,GROUP BY clause - Azure Databricks - Databricks SQL,GROUP BY and advanced aggregates in Databricks SQL,Learn how to use the GROUP BY syntax of the SQL language in Databricks SQL.,"Applies to: Databricks SQL and Databricks Runtime. The GROUP BY clause is used to group the rows based on a set of specified grouping expressions and compute aggregations on the group of rows based on one or more specified aggregate functions. Databricks SQL also supports advanced aggregations to do multiple aggregations for the same input record set via GROUPING SETS, CUBE, ROLLUP clauses.
-The grouping expressions and advanced aggregations can be mixed in the GROUP BY clause and nested in a GROUPING SETS clause",2026-04-14T17:47:00.000Z,language-reference,integrations,0.65,True,"Includes Databricks-specific GROUP BY capabilities such as GROUPING SETS, CUBE, and ROLLUP usage details.",updated +The grouping expressions and advanced aggregations can be mixed in the GROUP BY clause and nested in a GROUPING SETS clause",2026-04-14T17:47:00.000Z,language-reference,integrations,0.65,True,"Includes Databricks-specific GROUP BY capabilities such as GROUPING SETS, CUBE, and ROLLUP usage details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-having,HAVING clause,HAVING clause - Azure Databricks - Databricks SQL,Filter grouped results with HAVING in Databricks SQL,Learn how to use the HAVING syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime. Filters the results produced by GROUP BY based on the specified condition.
Often used -in conjunction with a GROUP BY clause.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.6,True,Provides concrete HAVING clause syntax and behavior as implemented in Databricks SQL.,updated +in conjunction with a GROUP BY clause.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.6,True,Provides concrete HAVING clause syntax and behavior as implemented in Databricks SQL.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-hints,Hints,Hints - Azure Databricks - Databricks SQL,,Learn how to use hints syntax of the SQL language in Databricks SQL and Databricks Runtime,Applies to: Databricks SQL and Databricks Runtime. Suggest specific approaches to generate an execution plan.,2026-01-20T08:00:00.000Z,language-reference,,0.4,False,"Hints page suggests approaches for execution plans; summary does not show numeric limits, config tables, or explicit best-practice gotchas beyond generic optimizer hints.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-join,JOIN,JOIN - Azure Databricks - Databricks SQL,Join tables using Databricks SQL JOIN,Learn how to use the JOIN syntax of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime. Combines the rows from the preceding left_table reference with the right_table reference based on join criteria.,2026-04-14T17:47:00.000Z,language-reference,integrations,0.65,True,"Covers Databricks-specific JOIN syntax and supported join behaviors, which are detailed language semantics.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-lateral-view,LATERAL VIEW clause,LATERAL VIEW clause - Azure Databricks - Databricks SQL,Use and migrate from LATERAL VIEW in Databricks SQL,Learn how to use the LATERAL VIEW syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks
Runtime. Used in conjunction with generator functions such as EXPLODE, which generates a virtual table containing one or more rows. LATERAL VIEW applies the rows to each original output row. In Databricks SQL and starting with Databricks Runtime 12.2 this clause is deprecated. You should invoke a table valued generator function as a table_reference.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.7,True,"Documents Databricks-specific LATERAL VIEW behavior, generator function usage, and deprecation guidance tied to specific runtime versions.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-join,JOIN,JOIN - Azure Databricks - Databricks SQL,Join tables using Databricks SQL JOIN,Learn how to use the JOIN syntax of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime. Combines the rows from the preceding left_table reference with the right_table reference based on join criteria.,2026-04-14T17:47:00.000Z,language-reference,integrations,0.65,True,"Covers Databricks-specific JOIN syntax and supported join behaviors, which are detailed language semantics.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-lateral-view,LATERAL VIEW clause,LATERAL VIEW clause - Azure Databricks - Databricks SQL,Use and migrate from LATERAL VIEW in Databricks SQL,Learn how to use the LATERAL VIEW syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime. Used in conjunction with generator functions such as EXPLODE, which generates a virtual table containing one or more rows. LATERAL VIEW applies the rows to each original output row. In Databricks SQL and starting with Databricks Runtime 12.2 this clause is deprecated.
You should invoke a table valued generator function as a table_reference.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.7,True,"Documents Databricks-specific LATERAL VIEW behavior, generator function usage, and deprecation guidance tied to specific runtime versions.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-limit,LIMIT clause,LIMIT clause - Azure Databricks - Databricks SQL,Limit query result rows in Databricks SQL,Learn how to use the LIMIT syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime. Constrains the number of rows returned by the Query. In general, this clause is used in conjunction with ORDER BY to -ensure that the results are deterministic.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.6,True,Describes Databricks-specific LIMIT clause syntax and interaction with ORDER BY for deterministic results.,updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-named-window,WINDOW clause,WINDOW clause - Azure Databricks - Databricks SQL,Define reusable WINDOW specifications in Databricks SQL,Learn how to use the WINDOW clause in the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime. The window clause allows you to define and name one or more distinct window specifications once and share them across many window functions within the same query.,2026-04-14T17:47:00.000Z,language-reference,integrations,0.7,True,Explains Databricks-specific WINDOW clause syntax for naming and reusing window definitions across functions.,updated +ensure that the results are deterministic.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.6,True,Describes Databricks-specific LIMIT clause syntax and interaction with ORDER BY for deterministic results.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-named-window,WINDOW clause,WINDOW clause - Azure Databricks - Databricks SQL,Define reusable WINDOW specifications in Databricks SQL,Learn how to use the WINDOW clause in the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime. The window clause allows you to define and name one or more distinct window specifications once and share them across many window functions within the same query.,2026-04-14T17:47:00.000Z,language-reference,integrations,0.7,True,Explains Databricks-specific WINDOW clause syntax for naming and reusing window definitions across functions.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-offset,OFFSET clause,OFFSET clause - Azure Databricks - Databricks SQL,Paginate results with OFFSET in Databricks SQL,Learn how to use the OFFSET syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime 11.3 LTS and above. Skips a number of rows returned by a statement or subquery. This clause is mostly used in conjunction with LIMIT to page through a result set, and ORDER BY to produce a deterministic result. Note When paging through a result set using LIMIT and OFFSET the skipped rows still get processed. These rows merely get suppressed from the result set.
-Pagination with this technique is not advised for resource-intensive queries.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.75,True,"Includes Databricks-specific OFFSET syntax, version constraints, and explicit guidance on performance implications when paging with LIMIT/OFFSET.",updated +Pagination with this technique is not advised for resource-intensive queries.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.75,True,"Includes Databricks-specific OFFSET syntax, version constraints, and explicit guidance on performance implications when paging with LIMIT/OFFSET.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-orderby,ORDER BY clause,ORDER BY clause - Azure Databricks - Databricks SQL,Sort query results with ORDER BY in Databricks SQL,Learn how to use the ORDER BY syntax of the SQL language in Databricks SQL and Databricks Runtime,"Applies to: Databricks SQL and Databricks Runtime. Returns the result rows in a sorted manner in the user specified order.
Unlike the SORT -BY clause, this clause guarantees a total order in the output.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.6,True,"Details Databricks ORDER BY semantics versus SORT BY, which are product-specific SQL behaviors.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-pipeop,SQL Pipeline Operator,Piped operation - Azure Databricks - Databricks SQL,Apply piped operations in Databricks SQL,Learn how to use pipe operation in the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 16.2 and above. Processes the result of the preceding query using a chained operation.,2026-04-14T17:47:00.000Z,language-reference,integrations,0.7,True,"Covers Databricks-specific pipe operator syntax and how queries are chained, which is unique to this product.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-pivot,PIVOT clause,PIVOT clause - Azure Databricks - Databricks SQL,Transform rows with PIVOT in Databricks SQL,Learn how to use the PIVOT syntax of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime. Transforms the rows of the preceding table_reference by rotating unique values of a specified column list into separate columns.,2026-04-14T17:47:00.000Z,language-reference,integrations,0.65,True,Provides Databricks-specific PIVOT clause syntax and behavior for rotating row values into columns.,updated +BY clause, this clause guarantees a total order in the output.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.6,True,"Details Databricks ORDER BY semantics versus SORT BY, which are product-specific SQL behaviors.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-pipeop,SQL Pipeline Operator,Piped operation - Azure Databricks - Databricks SQL,Apply piped operations in Databricks SQL,Learn how
to use pipe operation in the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime 16.2 and above. Processes the result of the preceding query using a chained operation.,2026-04-14T17:47:00.000Z,language-reference,integrations,0.7,True,"Covers Databricks-specific pipe operator syntax and how queries are chained, which is unique to this product.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-pivot,PIVOT clause,PIVOT clause - Azure Databricks - Databricks SQL,Transform rows with PIVOT in Databricks SQL,Learn how to use the PIVOT syntax of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime. Transforms the rows of the preceding table_reference by rotating unique values of a specified column list into separate columns.,2026-04-14T17:47:00.000Z,language-reference,integrations,0.65,True,Provides Databricks-specific PIVOT clause syntax and behavior for rotating row values into columns.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-qualify,QUALIFY clause,QUALIFY clause - Azure Databricks - Databricks SQL,Filter window function results with QUALIFY,Learn how to use the QUALIFY syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime 10.4 LTS and above. Filters the results of window functions.
To use QUALIFY, at least one window function is required to be present in the SELECT list or the QUALIFY clause.",2024-04-18T08:00:00.000Z,language-reference,configuration,0.65,True,"QUALIFY clause reference (10.4+ only) documents Databricks-specific support for filtering on window functions and its requirement for at least one window function, which is dialect-specific behavior.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-sampling,TABLESAMPLE clause,TABLESAMPLE clause - Azure Databricks - Databricks SQL,Sample tables with TABLESAMPLE in Databricks SQL,Learn how to use the TABLESAMPLE syntax of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime. The TABLESAMPLE statement is used to sample the relation.,2025-05-09T19:58:00.000Z,language-reference,configuration,0.65,True,"TABLESAMPLE reference describes how sampling is applied to relations in Databricks SQL, including syntax and behavior, which are engine-specific.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-setops,Set operator,Set operators - Azure Databricks - Databricks SQL,Use set operators in Databricks SQL queries,"Learn how to use the EXCEPT, MINUS, INTERSECT, and UNION set operators of the SQL language in Databricks SQL and Databricks Runtime.",Applies to: Databricks SQL and Databricks Runtime. Combines the preceding subquery1 and subquery2 into a single one.
Azure Databricks supports three types of set operators:,2026-04-14T17:47:00.000Z,language-reference,integrations,0.65,True,"Documents Databricks-supported set operators (EXCEPT, MINUS, INTERSECT, UNION) and their concrete usage.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-setops,Set operator,Set operators - Azure Databricks - Databricks SQL,Use set operators in Databricks SQL queries,"Learn how to use the EXCEPT, MINUS, INTERSECT, and UNION set operators of the SQL language in Databricks SQL and Databricks Runtime.",Applies to: Databricks SQL and Databricks Runtime. Combines the preceding subquery1 and subquery2 into a single one. Azure Databricks supports three types of set operators:,2026-04-14T17:47:00.000Z,language-reference,integrations,0.65,True,"Documents Databricks-supported set operators (EXCEPT, MINUS, INTERSECT, UNION) and their concrete usage.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-sortby,SORT BY clause,SORT BY clause - Azure Databricks - Databricks SQL,Sort within partitions using SORT BY in Databricks SQL,Learn how to use the SORT BY syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime. Returns the result rows sorted within each Spark partition in the user specified order. When the data is spread across multiple Spark partitions, SORT BY might return a partially ordered result. To explicitly control how the data has been split into Spark partitions use the REPARTITION hint.
This is different from the ORDER BY clause, which guarantees a fully ordered output regardless of how Spark splits the data.",2025-01-27T18:57:00.000Z,language-reference,configuration,0.6,True,"SORT BY reference explains partition-local ordering, interaction with REPARTITION hints, and difference from ORDER BY—Databricks-specific execution semantics.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-table-reference,Table reference,table reference - Azure Databricks - Databricks SQL,Reference tables and derived results in Databricks SQL,Learn how to use the table reference syntax in the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime. A table reference is an intermediate result table within SQL. It can be derived from other operators, such as functions, joins or a subquery, reference a base table directly, or be constructed as an inline table.",2025-04-30T00:43:00.000Z,language-reference,configuration,0.65,True,"Table reference page defines how Databricks treats base tables, derived tables, inline tables, and functions as table references—dialect-specific query construction rules.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-tvf,Table-valued function (TVF),Table-valued function (TVF) invocation - Azure Databricks - Databricks SQL,Invoke table-valued functions in Databricks SQL,Learn how to use the table-valued functions of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime. Invokes a function that returns a relation or a set of rows as a table-reference. A TVF can be a: SQL user-defined table function. The range table-valued function. Any table-valued generator function, such as explode. Applies to: Databricks SQL and Databricks Runtime 12.2 LTS and above.
Note Hive UDTF cannot be invoked as a table-reference, but must be invoked from the SELECT or using the LATERAL VIEW clause.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.75,True,"Details Databricks-specific TVF invocation syntax, supported function types, version constraints, and Hive UDTF limitations.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-tvf,Table-valued function (TVF),Table-valued function (TVF) invocation - Azure Databricks - Databricks SQL,Invoke table-valued functions in Databricks SQL,Learn how to use the table-valued functions of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime. Invokes a function that returns a relation or a set of rows as a table-reference. A TVF can be a: SQL user-defined table function. The range table-valued function. Any table-valued generator function, such as explode. Applies to: Databricks SQL and Databricks Runtime 12.2 LTS and above. Note Hive UDTF cannot be invoked as a table-reference, but must be invoked from the SELECT or using the LATERAL VIEW clause.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.75,True,"Details Databricks-specific TVF invocation syntax, supported function types, version constraints, and Hive UDTF limitations.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-unpivot,UNPIVOT clause,UNPIVOT clause - Azure Databricks - Databricks SQL,Use UNPIVOT to rotate columns in Databricks SQL,Learn how to use the UNPIVOT syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime 12.2 LTS and above.
Transforms the rows of the preceding table_reference by rotating groups of columns into rows and collapsing the listed columns: -A first new column holds the original column group names (or alias thereof) as values; this column is followed by a group of columns with the values of each column group.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.7,True,"Describes Databricks-specific UNPIVOT clause behavior and version availability, which are concrete implementation details.",updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-values,VALUES clause,VALUES clause - Azure Databricks - Databricks SQL,Create inline tables with VALUES in Databricks SQL,Learn how to use the VALUES syntax of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime. Produces an inline temporary table for use within the query.,2026-04-14T17:47:00.000Z,language-reference,integrations,0.65,True,Documents Databricks-specific VALUES clause behavior for producing inline temporary tables.,updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-watermark,WATERMARK clause,WATERMARK clause - Azure Databricks - Databricks SQL,Apply WATERMARK for streaming in Databricks SQL,Learn how to use the WATERMARK clause of the SQL language in Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime 12.0 and above. Adds a watermark to a relation in a select statement.
The WATERMARK clause only applies to queries on stateful streaming data, which include stream-stream joins and aggregation.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.7,True,Documents Databricks-specific WATERMARK clause semantics for stateful streaming queries and version requirements.,updated -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-where,WHERE clause,WHERE clause - Azure Databricks - Databricks SQL,Filter rows with WHERE in Databricks SQL,Learn how to use the WHERE syntax of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime. Limits the results of the FROM clause of a query or a subquery based on the specified condition.,2026-04-14T17:47:00.000Z,language-reference,integrations,0.6,True,Provides Databricks-specific WHERE clause syntax and behavior within its SQL dialect.,updated +A first new column holds the original column group names (or alias thereof) as values; this column is followed by a group of columns with the values of each column group.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.7,True,"Describes Databricks-specific UNPIVOT clause behavior and version availability, which are concrete implementation details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-values,VALUES clause,VALUES clause - Azure Databricks - Databricks SQL,Create inline tables with VALUES in Databricks SQL,Learn how to use the VALUES syntax of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime. Produces an inline temporary table for use within the query.,2026-04-14T17:47:00.000Z,language-reference,integrations,0.65,True,Documents Databricks-specific VALUES clause behavior for producing inline temporary tables.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-watermark,WATERMARK
clause,WATERMARK clause - Azure Databricks - Databricks SQL,Apply WATERMARK for streaming in Databricks SQL,Learn how to use the WATERMARK clause of the SQL language in Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime 12.0 and above. Adds a watermark to a relation in a select statement. The WATERMARK clause only applies to queries on stateful streaming data, which include stream-stream joins and aggregation.",2026-04-14T17:47:00.000Z,language-reference,integrations,0.7,True,Documents Databricks-specific WATERMARK clause semantics for stateful streaming queries and version requirements.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-where,WHERE clause,WHERE clause - Azure Databricks - Databricks SQL,Filter rows with WHERE in Databricks SQL,Learn how to use the WHERE syntax of the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime. Limits the results of the FROM clause of a query or a subquery based on the specified condition.,2026-04-14T17:47:00.000Z,language-reference,integrations,0.6,True,Provides Databricks-specific WHERE clause syntax and behavior within its SQL dialect.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-star,Star clause,* (star) clause - Azure Databricks - Databricks SQL,Use star (*) expansion in Databricks SQL,Learn how to use the * (star) syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL and Databricks Runtime. A shorthand to name all the referencable columns in the FROM clause, or a specific table reference's columns or fields in the FROM clause. The list of columns or fields is ordered by the order of table references and the order of columns within each table reference. -In case of fields it is ordered by the order of fields within the struct. The _metadata column is not included in this list. You must reference it explicitly.
Prior to Databricks Runtime 15.0 the st",2026-04-14T17:47:00.000Z,language-reference,integrations,0.75,True,"Details Databricks-specific behavior of * expansion, column ordering, struct field handling, and exclusion of the _metadata column, including version-specific changes.",updated +In case of fields it is ordered by the order of fields within the struct. The _metadata column is not included in this list. You must reference it explicitly. Prior to Databricks Runtime 15.0 the st",2026-04-14T17:47:00.000Z,language-reference,integrations,0.75,True,"Details Databricks-specific behavior of * expansion, column ordering, struct field handling, and exclusion of the _metadata column, including version-specific changes.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-txn-begin,BEGIN TRANSACTION,BEGIN TRANSACTION - Azure Databricks - Databricks SQL,,Learn how to use the BEGIN TRANSACTION syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL. Begins a new interactive transaction that groups multiple SQL statements into a single unit of work that can be committed or rolled back. As an alternative to interactive transactions, you can define non-interactive transactions using the BEGIN ATOMIC ... END; syntax. See ATOMIC compound statement.",2026-03-09T18:23:00.000Z,concept-article,,0.4,False,BEGIN TRANSACTION syntax; generic transactional concept without product-specific numeric or config details.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-txn-begin-atomic,BEGIN ATOMIC,ATOMIC compound statement - Azure Databricks - Databricks SQL,,Learn how to use BEGIN ATOMIC ...
END to run multiple SQL statements as a single atomic transaction on Azure Databricks.,"Applies to: Databricks SQL and Databricks Runtime 18.0 and above. Implements a SQL Script block that can contain a sequence of SQL statements, control-of-flow statements, local variable declarations, and exception handlers. When marked as ATOMIC, the block runs as a transactional unit where all statements succeed together or fail together.",2026-04-07T08:00:00.000Z,concept-article,,0.4,False,"This page describes the BEGIN ATOMIC ... END compound statement in Databricks SQL, focusing on transactional script block syntax and behavior. While it is detailed, it is language semantics rather than limits/quotas, configuration matrices, security roles, troubleshooting error codes, or decision-making guidance, so it does not fit the specified sub-skill types.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-txn-commit,COMMIT,COMMIT - Azure Databricks - Databricks SQL,,Learn how to use the COMMIT syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL. Commits the current interactive transaction, making all changes across all modified tables permanent. For requirements and usage patterns for interactive transactions, see Interactive transactions.",2026-03-09T18:23:00.000Z,concept-article,,0.4,False,"COMMIT syntax; standard transactional semantics, no expert-only limits or configs indicated.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-txn-rollback,ROLLBACK,ROLLBACK - Azure Databricks - Databricks SQL,,Learn how to use the ROLLBACK syntax of the SQL language in Databricks SQL and Databricks Runtime.,"Applies to: Databricks SQL. Rolls back the current transaction, discarding all changes made since the beginning of the transaction.
For requirements and usage patterns for interactive transactions, see Interactive transactions.",2026-03-09T18:23:00.000Z,concept-article,,0.4,False,ROLLBACK syntax; generic transactional behavior without numeric limits or config matrices.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-window-functions-frame,Window frame clause,Window frame clause - Azure Databricks - Databricks SQL,,Learn how to use the window frame clause in window functions in the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime. Specifies a sliding subset of rows within the partition on which the aggregate or analytic window function operates.,2026-04-14T17:47:00.000Z,language-reference,,0.3,False,"Explains the window frame clause conceptually. While it may show syntax, it is primarily standard SQL semantics rather than Databricks-specific configuration tables, limits, or troubleshooting content.",updated +https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-window-functions-frame,Window frame clause,Window frame clause - Azure Databricks - Databricks SQL,,Learn how to use the window frame clause in window functions in the SQL language in Databricks SQL and Databricks Runtime.,Applies to: Databricks SQL and Databricks Runtime. Specifies a sliding subset of rows within the partition on which the aggregate or analytic window function operates.,2026-04-14T17:47:00.000Z,language-reference,,0.3,False,"Explains the window frame clause conceptually.
While it may show syntax, it is primarily standard SQL semantics rather than Databricks-specific configuration tables, limits, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-variables,Variables,Variables - Azure Databricks - Databricks SQL,Declare and manage session variables in Databricks SQL,Learn about variables in Databricks SQL and Databricks Runtime.,"Applies to: Databricks Runtime 14.1 and above. Variables are typed and schema qualified objects which store values that are private to a session. In Azure Databricks variables are temporary and declared within a session using the DECLARE VARIABLE statement. The terms temporary variable and session variable are interchangeable. The schema in which temporary variables reside is system.session. A variable is dropped implicitly at the end of the session that defines it. But you can explicitly drop it earlier ",2025-03-14T19:24:00.000Z,language-reference,configuration,0.76,True,"Explains Databricks variables as schema-qualified session objects, their lifecycle, and system.session schema usage.",unchanged @@ -4933,12 +4978,12 @@ Volumes provide capabilities for accessing, storing, governing, and organizing f While tables provide governance over tabular datasets, volumes add governance over non-tabular datasets.
You can use volumes to store and access files in any format, including structured, semi-structured, and unstructured ",2024-07-01T18:17:00.000Z,language-reference,configuration,0.68,True,"Describes Unity Catalog volumes as objects, their behavior and usage constraints for managing non-tabular data, which is Databricks-specific configuration surface.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-window-functions,Window functions,Window functions - Azure Databricks - Databricks SQL,,Learn how to use window functions in the SQL language in Databricks SQL and Databricks Runtime.,"Applies to:Databricks SQLDatabricks Runtime Functions that operate on a group of rows, referred to as a window, and calculate a return value for each row based on the group of rows. -Window functions are useful for processing tasks such as calculating a moving average, computing a cumulative statistic, or accessing the value of rows given the relative position of the current row.",2026-04-14T17:47:00.000Z,language-reference,,0.3,False,"High-level description of window functions and their uses; likely conceptual with generic examples. Does not obviously focus on product-specific limits, configuration parameters, or error codes.",updated +Window functions are useful for processing tasks such as calculating a moving average, computing a cumulative statistic, or accessing the value of rows given the relative position of the current row.",2026-04-14T17:47:00.000Z,language-reference,,0.3,False,"High-level description of window functions and their uses; likely conceptual with generic examples. 
Does not obviously focus on product-specific limits, configuration parameters, or error codes.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/release-notes/,SQL release notes,Databricks SQL release notes - Azure Databricks - Databricks SQL,,Learn about Databricks SQL releases.,"This article lists new Databricks SQL features and improvements, along with known issues and FAQs.",2026-03-27T08:00:00.000Z,concept-article,,0.3,False,"Top-level Databricks SQL release notes index; summary mentions features, improvements, known issues, and FAQs but not specific limits, configuration tables, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/release-notes/2021,Release notes 2021,Databricks SQL release notes 2021 - Azure Databricks - Databricks SQL,,2021 release notes for new Databricks SQL features and improvements.,The following outlines the improvements and updates in Databricks SQL from January through December 2021.,2026-02-12T08:00:00.000Z,concept-article,,0.3,False,"2021 SQL release notes similarly list improvements and updates without explicit limits, configuration matrices, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/release-notes/2022,Release notes 2022,Databricks SQL release notes 2022 - Azure Databricks - Databricks SQL,,2022 release notes for new Databricks SQL features and improvements.,The following outlines the improvements and updates in Databricks SQL from January through December 2022.,2026-03-31T08:00:00.000Z,concept-article,,0.3,False,2022 SQL release notes are a chronological list of improvements; no evidence of the specific expert-knowledge structures defined in the sub-skill types.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/release-notes/2023,Release notes 2023,Databricks SQL release notes 2023 - Azure Databricks - Databricks SQL,Use new Databricks SQL 2023 language and functions,2023 release notes for new Databricks SQL 
features and improvements.,The following outlines the improvements and updates in Databricks SQL from January through December 2023. The features listed in this section are independent of the SQL Warehouse compute versions described above. The features listed in this section are independent of the SQL Warehouse compute versions described above. Visualizations: Highlights: SQL Language updates:The following builtin functions have been added: The following builtin functions have been enhanced: The features listed in this se,2026-04-10T08:00:00.000Z,concept-article,integrations,0.6,True,"The 2023 SQL release notes explicitly mention new and enhanced builtin functions and visualization updates. Such notes typically include function signatures, parameter behaviors, and subtle changes that are product-specific integration/coding details not generally known to an LLM, fitting the integrations & coding patterns category best among the options.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/release-notes/2024,Release notes 2024,Databricks SQL release notes 2024 - Azure Databricks - Databricks SQL,,2024 release notes for Databricks SQL features and improvements.,"The following outlines the improvements and updates in Databricks SQL from January through December 2024. SQL warehouse system tables(Public Preview) Data discovery Legacy dashboards: Users can now start, create, and alter schedules for streaming tables and materialized views using human-readable syntax instad of CRON scheduling. SeeALTER MATERIALIZED VIEW,ALTER STREAMING TABLE,CREATE MATERIALIZED VIEW, andCREATE STREAMING TABLE. 
You can now use time travel to query previous table versions based",2026-03-11T08:00:00.000Z,concept-article,,0.35,False,"2024 SQL release notes include language updates and features but the summary does not show numeric limits, config tables, or symptom→solution mappings required for classification.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/sql/release-notes/2024,Release notes 2024,Databricks SQL release notes 2024 - Azure Databricks - Databricks SQL,,2024 release notes for Databricks SQL features and improvements.,"The following outlines the improvements and updates in Databricks SQL from January through December 2024. SQL warehouse system tables(Public Preview) Data discovery Legacy dashboards: Users can now start, create, and alter schedules for streaming tables and materialized views using human-readable syntax instead of CRON scheduling. SeeALTER MATERIALIZED VIEW,ALTER STREAMING TABLE,CREATE MATERIALIZED VIEW, andCREATE STREAMING TABLE. You can now use time travel to query previous table versions based",2026-04-21T08:00:00.000Z,concept-article,,0.25,False,"Databricks SQL 2024 release notes describe new capabilities (system tables, time travel, scheduling syntax) but the summary does not show detailed limits, config tables, or troubleshooting mappings required for classification.",updated
https://learn.microsoft.com/en-us/azure/databricks/sql/release-notes/2025,Release notes 2025,Databricks SQL release notes 2025 - Azure Databricks - Databricks SQL,,Release notes index for Databricks SQL features and improvements.,The following Databricks SQL features and improvements were released in 2025.,2026-04-09T08:00:00.000Z,concept-article,,0.3,False,"Databricks SQL 2025 release notes index; similar to 2026, appears to be a feature list without explicit evidence of the structured expert content required for the defined sub-skill types.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/sql/release-notes/2026,Release notes 2026,Databricks SQL 
release notes 2026 - Azure Databricks - Databricks SQL,,Release notes index for Databricks SQL features and improvements.,The following Databricks SQL features and improvements were released in 2026.,2026-04-09T08:00:00.000Z,concept-article,,0.3,False,"Databricks SQL 2026 release notes index; summary suggests a list of features and improvements. No clear indication of limits, configuration matrices, or troubleshooting structures in the snippet, so not classified as expert knowledge under the given categories.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/tpcds-eval,TPC-DS performance evaluation,Use the TPC-DS sample dataset to evaluate system performance - Azure Databricks - Databricks SQL,,Learn how to use the TPC-DS sample dataset to evaluate and benchmark system performance in Databricks SQL warehouses.,"Azure Databricks provides access to the TPC-DS benchmark dataset, a widely used benchmark for testing the performance of systems built for data warehousing and analytics. The dataset is available in two sizes by default in every Unity Catalog-enabled workspace. These datasets are ideal for testing Azure Databricks performance on a standardized benchmark that simulates realistic retail and e-commerce business scenarios. To learn more about this dataset, see theTPC-DS benchmarkdocumentation.",2025-10-03T08:00:00.000Z,concept-article,,0.2,False,"Benchmark usage overview; likely contains example queries and dataset description but not product-specific limits, configs, or troubleshooting matrices.",unchanged @@ -4957,13 +5002,13 @@ https://learn.microsoft.com/en-us/azure/databricks/sql/user/queries/query-tags,Q https://learn.microsoft.com/en-us/azure/databricks/sql/user/queries/schedule-query,Schedule queries,Schedule a query - Azure Databricks - Databricks SQL,,Learn how to schedule queries in Databricks SQL.,"You can use scheduled query executions to update your dashboards or enable routine alerts. 
By default, your queries do not have a schedule. Note If an alert uses your query, the alert runs on its own refresh schedule and does not use the query schedule. To set the schedule: In the Query Editor, clickSchedule>Add scheduleto open a menu with schedule settings. Choose when to run the query. ClickCreate. Your query will run automatically according to the schedule. If you experience a scheduled query",2024-03-01T08:00:00.000Z,concept-article,,0.4,False,"Basic scheduling steps for queries; no indication of limits, config matrices, or error-code-based troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/user/queries/sessions,SQL warehouse sessions,What are SQL warehouse sessions? - Azure Databricks - Databricks SQL,,"Learn about SQL warehouse sessions, which allow you to define variables, create temporary views and tables, and persist state across multiple query runs.","SQL warehouse sessions allow you to define variables, create temporary views, and maintain state changes across multiple query runs. With sessions, you can build SQL logic iteratively without needing to run all statements at once. You can use sessions in the following contexts when attached to a SQL warehouse:",2026-02-06T20:21:00.000Z,concept-article,,0.45,False,Conceptual explanation of SQL warehouse sessions and where they can be used; likely lacks detailed config tables or error mappings.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/sql/user/sql-editor/,SQL editor introduction,Write queries and explore data in the new SQL editor - Azure Databricks - Databricks SQL,,Review the features and tools in the new Databricks SQL editor.,"The Databricks UI includes a SQL editor that you can use to author queries, collaborate with colleagues, browse available data, and create visualizations. This page explains how to use the SQL editor to write, run, manage, and share queries. This article explains how to use the new SQL editor. 
To learn about working with the legacy SQL editor, seeWrite queries and explore data in the legacy SQL editor. Note Starting in late May 2026, the new SQL editor will be enabled by default for all workspac",2026-04-09T08:00:00.000Z,concept-article,,0.2,False,"How-to UI usage for the new SQL editor; no detailed configuration parameter tables, limits, or troubleshooting content.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/sql/user/sql-editor/,SQL editor introduction,Write queries and explore data in the new SQL editor - Azure Databricks - Databricks SQL,,Review the features and tools in the new Databricks SQL editor.,"The Databricks UI includes a SQL editor that you can use to author queries, collaborate with colleagues, browse available data, and create visualizations. This page explains how to use the SQL editor to write, run, manage, and share queries. This article explains how to use the new SQL editor. To learn about working with the legacy SQL editor, seeWrite queries and explore data in the legacy SQL editor. Note Starting in late May 2026, the new SQL editor will be enabled by default for all workspac",2026-04-20T08:00:00.000Z,concept-article,,0.2,False,"Content describes how to use the Databricks SQL editor UI to write, run, and manage queries. This is primarily feature and UI usage guidance, not configuration tables, limits, error codes, or product-specific integration parameters. It lacks the structured expert details required for any of the defined sub-skill types.",updated https://learn.microsoft.com/en-us/azure/databricks/sql/user/sql-editor/custom-format,Custom format SQL,Custom format SQL statements - Azure Databricks - Databricks SQL,Customize SQL auto-formatting in Databricks editor,Learn how to customize autoformatting options in the SQL editor.,Important This feature is inPublic Preview. 
This article explains how to customize SQL auto-formatting options in the Azure Databricks UI.,2025-10-03T08:00:00.000Z,concept-article,configuration,0.6,True,Explains how to customize auto-formatting options; likely includes specific setting names and allowed values for the SQL editor formatting behavior.,unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/user/sql-editor/legacy,Use the legacy SQL editor,Write queries and explore data in the legacy SQL editor - Azure Databricks - Databricks SQL,,Review the features and tools in Databricks SQL's query editor.,"Important The legacy SQL editor will be retired in late July 2026. Starting in late May 2026, the new SQL editor will be enabled by default for all workspaces and the workspace-level opt-out will be removed. Individual users can still switch to the legacy editor until late July 2026, when the legacy editor will be fully retired. Databricks recommends transitioning to thenew SQL editor. SeeWhat's coming?for details. The Azure Databricks UI includes a SQL editor that you can use to author queries,",2026-04-09T08:00:00.000Z,concept-article,,0.2,False,"Describes using the legacy SQL editor and deprecation timeline; focuses on UI behavior without expert-level limits, configs, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/user/sql-editor/mustache-parameters,Mustache parameter syntax,Mustache parameter syntax - Azure Databricks - Databricks SQL,Use mustache parameter syntax in Databricks SQL,Learn about mustache parameter syntax in the Databricks SQL editor.,"Important Mustache parameter syntax is supported in the legacy SQL editor only. Databricks recommends usingnamed parameter markersfor new queries. If you copy a query using mustache syntax into a notebook, AI/BI dashboard dataset editor, or Genie space, you must convert it to named parameter markers before it will run. 
In the legacy SQL editor, any string wrapped in double curly braces ({{ }}) is treated as a query parameter. A widget appears above the results pane where you set the parameter va",2026-04-10T18:06:00.000Z,concept-article,configuration,0.65,True,"Defines product-specific mustache parameter behavior in the legacy SQL editor (double curly braces treated as parameters, widget behavior, and migration to named parameters). This is a concrete, product-unique configuration/usage pattern not generally known, though it is not about numeric limits or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/databricks/sql/user/sql-editor/parameter-widgets,Work with parameter widgets,Work with parameter widgets - Azure Databricks - Databricks SQL,Configure parameter widgets in Databricks SQL editor,Learn how to configure parameter widgets in Databricks.,"When you add anamed parameter markerto a query, Azure Databricks displays a parameter widget in the UI. Widgets let users set parameter values without editing the query directly. You can configure each widget's type, title, and default value. Parameter widgets are supported in the SQL editor, notebooks, AI/BI dashboards, and Genie spaces, but behave differently across these surfaces. This page describes parameter widgets in the SQL editor. 
For other surfaces, see: In the SQL editor, any paramete",2026-03-31T08:00:00.000Z,concept-article,configuration,0.65,True,"Describes widget types, titles, default values, and behavior across surfaces; includes product-specific parameter/widget configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/stateful-applications/,Overview,Build a custom stateful application - Azure Databricks,Implement custom stateful streaming with transformWithState,Learn how to use transformWithState to build a custom stateful application with Structured Streaming on Azure Databricks.,"You can build streaming applications using custom stateful operators to implement low-latency and near real-time solutions that use arbitrary stateful logic. Custom stateful operators unlock new operational use cases and patterns unavailable through traditional Structured Streaming processing. Note Databricks recommends using built-in Structured Streaming functionality for supported stateful operations such as aggregations, deduplication, and streaming joins. SeeWhat is stateful streaming?. Data",2026-04-16T08:00:00.000Z,concept-article,integrations,0.65,True,"Describes product-specific coding pattern for custom stateful operators in Azure Databricks Structured Streaming using transformWithState, which is an integration/coding pattern unique to this service rather than a generic concept.",updated -https://learn.microsoft.com/en-us/azure/databricks/stateful-applications/examples,Example stateful applications,Example stateful applications - Azure Databricks,Implement example custom stateful streaming apps,See examples of using transformWithState to implement custom stateful streaming applications on Azure Databricks.,This article contains code examples for custom stateful applications. Databricks recommends using built-in stateful methods for common operations such as aggregations and joins. 
The patterns in this article use thetransformWithStateoperator and associated classes available in Databricks Runtime 16.2 and above. SeeBuild a custom stateful application. Note Python supports both the row-basedtransformWithStateAPI (available in microbatch mode and real-time mode) and the Pandas-basedtransformWithStat,2026-03-12T23:05:00.000Z,concept-article,integrations,0.75,True,"Contains concrete code examples using transformWithState and related classes, which are Databricks-specific APIs and patterns.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/stateful-applications/,Overview,Build a custom stateful application - Azure Databricks,Implement custom stateful streaming with transformWithState,Learn how to use transformWithState to build a custom stateful application with Structured Streaming on Azure Databricks.,"You can build streaming applications using custom stateful operators to implement low-latency and near real-time solutions that use arbitrary stateful logic. Custom stateful operators unlock new operational use cases and patterns unavailable through traditional Structured Streaming processing. Note Databricks recommends using built-in Structured Streaming functionality for supported stateful operations such as aggregations, deduplication, and streaming joins. SeeWhat is stateful streaming?. 
Data",2026-04-16T08:00:00.000Z,concept-article,integrations,0.65,True,"Describes product-specific coding pattern for custom stateful operators in Azure Databricks Structured Streaming using transformWithState, which is an integration/coding pattern unique to this service rather than a generic concept.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/stateful-applications/examples,Example stateful applications,Example stateful applications - Azure Databricks,Implement custom stateful streaming with transformWithState,See examples of using transformWithState to implement custom stateful streaming applications on Azure Databricks.,This article contains code examples for custom stateful applications. Databricks recommends using built-in stateful methods for common operations such as aggregations and joins. The patterns in this article use thetransformWithStateoperator and associated classes available in Databricks Runtime 16.2 and above. SeeBuild a custom stateful application. Note Python supports both the row-basedtransformWithStateAPI (available in microbatch mode and real-time mode) and the Pandas-basedtransformWithStat,2026-04-21T08:00:00.000Z,concept-article,integrations,0.65,True,Contains concrete code patterns using Databricks Runtime–specific transformWithState APIs (row-based and Pandas-based) for custom stateful applications; these are product-specific coding patterns and API usages that qualify as integrations & coding patterns.,updated https://learn.microsoft.com/en-us/azure/databricks/stateful-applications/legacy,Legacy state operators,Legacy arbitrary stateful operators - Azure Databricks,Use legacy arbitrary stateful operators on Databricks,,"Note Databricks recommends usingtransformWithStateto build custom stateful applications. SeeBuild a custom stateful application. This article has information for features that supportmapGroupsWithState, andflatMapGroupsWithState. 
For more details on these operators, seelink.",2025-02-14T19:16:00.000Z,concept-article,integrations,0.65,True,"Covers mapGroupsWithState and flatMapGroupsWithState support details, which are specific APIs with Databricks-specific behavior and usage patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/stateful-applications/schema-evolution,Schema evolution for state variables,Schema evolution in the state store - Azure Databricks,Handle schema evolution in transformWithState state store,Learn about schema evolution for state variables with transformWithState in Azure Databricks.,This article provides an overview of schema evolution in the state store and examples of types of supported schema changes.,2026-03-12T08:00:00.000Z,concept-article,integrations,0.65,True,"Discusses supported schema changes for state variables in the Databricks state store, including examples of allowed patterns tied to specific APIs.",unchanged https://learn.microsoft.com/en-us/azure/databricks/storage/default-storage,Default storage,Default storage in Databricks - Azure Databricks,Configure and use default storage in Azure Databricks,Learn how to use default storage in Azure Databricks workspaces.,This page explains how default storage on Azure Databricks works and how to create catalogs and data objects that use it.,2026-04-03T08:00:00.000Z,concept-article,configuration,0.7,True,"Explains Databricks-specific behavior of default storage and how catalogs and data objects are created against it, including configuration patterns unique to the service.",unchanged @@ -4973,31 +5018,31 @@ https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/avro-dat https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/batch-size,Micro-batch size,Configure Structured Streaming batch size on Azure Databricks - Azure Databricks,Configure Structured Streaming batch size with admission controls,Limiting the input rate for Structured Streaming queries helps to 
maintain a consistent batch size and prevents large batches from leading to spill and cascading micro-batch processing delays.,"This page explains how to use admission controls to maintain a consistent batch size for streaming queries. Admission controls limit the input rate for Structured Streaming queries, which can help maintain a consistent batch size and prevent large batches from causing spill and cascading micro-batch processing delays. Azure Databricks provides the same options to control Structured Streaming batch sizes for both Delta Lake and Auto Loader. Note You can modify admission control settings without r",2026-03-24T22:15:00.000Z,how-to,configuration,0.85,True,"Explains Databricks-specific admission control settings for Structured Streaming, including configuration parameters to limit input rate and maintain batch size—concrete config knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/checkpoints,Checkpoints,Structured Streaming checkpoints - Azure Databricks,Configure Structured Streaming checkpoints on Databricks,This article provides an overview of Structured Streaming checkpoints.,"Checkpoints and write-ahead logs work together to provide processing guarantees for Structured Streaming workloads. The checkpoint tracks the information that identifies the query, including state information and processed records. When you delete the files in a checkpoint directory or change to a new checkpoint location, the next run of the query begins fresh. A checkpoint directory contains the following: Each query must have a different checkpoint location. 
Multiple queries should never share",2026-03-24T08:00:00.000Z,concept-article,configuration,0.7,True,"Details checkpoint directory structure, required uniqueness per query, and behavior when changing/deleting checkpoints—product-specific operational semantics beyond generic streaming concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/concepts,Overview,Structured Streaming concepts - Azure Databricks,,Learn core concepts for configuring incremental and near real-time workloads with Structured Streaming.,"Apache Spark Structured Streaming is a near real-time processing engine that offers end-to-end fault tolerance with exactly-once processing guarantees using familiar Spark APIs. Structured Streaming lets you express computation on streaming data in the same way you express a batch computation on static data. The Structured Streaming engine performs the computation incrementally and continuously updates the result as streaming data arrives. For a step-by-step tutorial, seeRun your first Structure",2026-04-09T17:57:00.000Z,concept-article,,0.2,False,"Conceptual explanation of Structured Streaming; summary suggests high-level concepts without product-specific limits, configs, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/delta-lake,Delta Lake,Delta table streaming reads and writes - Azure Databricks,Use Delta tables for streaming reads and writes,"Learn how to use Delta Lake tables as streaming sources and sinks, handle upstream changes, and resolve errors from updates and deletes in streaming queries.","This page describes how to use Delta tables as sources and sinks forSpark Structured StreamingwithreadStreamandwriteStream. Delta Lake solves common performance and reliability problems for streaming systems and files. The benefits include: To learn how to load data using streaming tables in Databricks SQL, seeUse streaming tables in Databricks SQL. 
For stream-static joins with Delta Lake, seeStream-static joins.",2026-04-17T18:03:00.000Z,how-to,best-practices,0.6,True,"Covers how to use Delta tables as streaming sources/sinks, handle upstream changes, and resolve errors from updates/deletes in streaming queries—likely includes product-specific streaming patterns and gotchas beyond generic streaming concepts.",updated +https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/delta-lake,Delta Lake,Delta table streaming reads and writes - Azure Databricks,Use Delta tables for streaming reads and writes,"Learn how to use Delta Lake tables as streaming sources and sinks, handle upstream changes, and resolve errors from updates and deletes in streaming queries.","This page describes how to use Delta tables as sources and sinks forSpark Structured StreamingwithreadStreamandwriteStream. Delta Lake solves common performance and reliability problems for streaming systems and files. The benefits include: To learn how to load data using streaming tables in Databricks SQL, seeUse streaming tables in Databricks SQL. 
For stream-static joins with Delta Lake, seeStream-static joins.",2026-04-17T18:03:00.000Z,how-to,best-practices,0.6,True,"Covers how to use Delta tables as streaming sources/sinks, handle upstream changes, and resolve errors from updates/deletes in streaming queries—likely includes product-specific streaming patterns and gotchas beyond generic streaming concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/examples,Examples,Structured Streaming patterns on Azure Databricks - Azure Databricks,Structured Streaming integration patterns with external systems,"See examples of using Spark Structured Streaming with Cassandra, Azure Synapse Analytics, Python notebooks, and Scala notebooks in Azure Databricks.",This contains notebooks and code samples for common patterns for working with Structured Streaming on Azure Databricks.,2026-03-06T08:00:00.000Z,get-started,integrations,0.8,True,"Contains notebooks and code samples for integrating Structured Streaming with Cassandra, Synapse, and notebooks—showing concrete connection and usage patterns specific to Databricks.",unchanged https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/foreach,foreachBatch,Use foreachBatch to write to arbitrary data sinks - Azure Databricks,Write custom streaming sinks with foreachBatch,Use foreachBatch and foreach to write custom outputs with Structured Streaming on Azure Databricks.,This page shows how to useforeachBatchwith Structured Streaming to write the output of a streaming query to data sources that do not have an existing streaming sink. The code patternstreamingDF.writeStream.foreachBatch(...)allows you to apply batch functions to the output data of every micro-batch of the streaming query. Functions used withforeachBatchtake two parameters: You must useforeachBatchfor Delta Lake merge operations in Structured Streaming. 
SeeUpsert from streaming queries usingforeac,2026-04-09T22:01:00.000Z,how-to,integrations,0.7,True,"Documents Databricks-specific Structured Streaming patterns using foreach/foreachBatch, including required patterns for Delta Lake merge; these are concrete code patterns and gotchas unique to this product, fitting integrations/coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/output-mode,Output mode,Select an output mode for Structured Streaming - Azure Databricks,Choose Structured Streaming output modes on Databricks,This article defines output mode for Structured Streaming and provides recommendations for choosing an output mode for a streaming workload.,"This article discusses selecting an output mode for stateful streaming. Only stateful streams containing aggregations require an output mode configuration. Joins only support the append output mode, and output mode doesn't impact deduplication. The arbitrary stateful operatorsmapGroupsWithStateandflatMapGroupsWithStateemit records using their own custom logic, so the stream's output mode doesn't affect their behavior. For stateless streaming, all output modes behave the same. 
To configure output",2026-03-24T08:00:00.000Z,concept-article,decision-making,0.8,True,"Provides guidance on when to use append/complete/update for different stateful workloads, including operator support and behavioral trade-offs—explicit decision guidance for Databricks streaming.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/production,Structured streaming production,Production considerations for Structured Streaming - Azure Databricks,Production best practices for Databricks Structured Streaming,This page contains recommendations to configure production incremental processing workloads with Structured Streaming on Azure Databricks to meet latency or cost requirements for real-time or batch ap,This page contains recommendations for scheduling Structured Streaming workloads using jobs on Azure Databricks. Databricks recommends that you always configure the following: Some workloads benefit from the following: Azure Databricks has introduced Lakeflow Spark Declarative Pipelines to reduce the complexities of managing production infrastructure for Structured Streaming workloads. Databricks recommends using Lakeflow Spark Declarative Pipelines for new Structured Streaming pipelines. SeeLak,2026-04-15T22:08:00.000Z,concept-article,best-practices,0.7,True,"Provides Databricks-specific recommendations for configuring production Structured Streaming workloads (scheduling via jobs, latency vs cost tuning, and use of Lakeflow Spark Declarative Pipelines). 
These are actionable, product-specific best practices rather than generic streaming concepts.",updated +https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/production,Structured streaming production,Production considerations for Structured Streaming - Azure Databricks,Production best practices for Databricks Structured Streaming,This page contains recommendations to configure production incremental processing workloads with Structured Streaming on Azure Databricks to meet latency or cost requirements for real-time or batch ap,This page contains recommendations for scheduling Structured Streaming workloads using jobs on Azure Databricks. Databricks recommends that you always configure the following: Some workloads benefit from the following: Azure Databricks has introduced Lakeflow Spark Declarative Pipelines to reduce the complexities of managing production infrastructure for Structured Streaming workloads. Databricks recommends using Lakeflow Spark Declarative Pipelines for new Structured Streaming pipelines. SeeLak,2026-04-15T22:08:00.000Z,concept-article,best-practices,0.7,True,"Provides Databricks-specific recommendations for configuring production Structured Streaming workloads (scheduling via jobs, latency vs cost tuning, and use of Lakeflow Spark Declarative Pipelines). These are actionable, product-specific best practices rather than generic streaming concepts.",unchanged https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/protocol-buffers,Protobuf,Read and write protocol buffers - Azure Databricks,Serialize and deserialize protocol buffers in Databricks streaming,Learn how to use to\_protobuf and from\_protobuf functions in Structured Streaming processing.,Azure Databricks provides native support for serialization and deserialization between Apache Spark structs and protocol buffers (protobuf). 
Protobuf support is implemented as an Apache Spark DataFrame transformer and can be used with Structured Streaming or for batch operations.,2024-08-02T08:00:00.000Z,how-to,integrations,0.85,True,"Describes to_protobuf/from_protobuf functions and their parameters, which are specific APIs for integrating Spark structs with protobuf schemas.",unchanged https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/read-state,Read state,Read Structured Streaming state information - Azure Databricks,Read and query Structured Streaming state data on Databricks,Learn how to read Structured Streaming query state data and metadata on Azure Databricks.,"You can use DataFrame operations or SQL table-value functions to query Structured Streaming state data and metadata. Use these functions to observe state information for Structured Streaming stateful queries, which can be useful for monitoring and debugging. You must have read access to the checkpoint path for a streaming query in order to query state data or metadata. The functions described in this article provide read-only access to state data and metadata. 
You can only use batch read semanti",2026-03-06T08:00:00.000Z,how-to,configuration,0.8,True,"Documents Databricks-specific SQL table-valued functions and DataFrame APIs for accessing streaming state/metadata, including required permissions and usage constraints.",unchanged https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/concepts,Concepts,Real-time mode in Structured Streaming - Azure Databricks,,"Learn what real-time mode is in Structured Streaming, how it achieves low latency, and when to use it for operational streaming workloads.","This page describes the concepts behind real-time mode in Structured Streaming, including what it is, how it achieves low latency, and when to use it.",2026-04-07T17:44:00.000Z,concept-article,,0.2,False,Conceptual page about what real-time mode is and when to use it; summary suggests high-level concepts without quantified decision matrices or detailed configuration parameters.,unchanged https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/examples,Examples,Real-time mode examples - Azure Databricks,Use real-time mode with Kafka and custom sinks,"Explore code examples for real-time mode in Structured Streaming, including Kafka sources and sinks, stateful queries, aggregations, and custom sinks.","This page provides working code examples for real-time mode queries in Structured Streaming, from simple stateless transformations to complex stateful processing with custom state management. SeeReal-time mode in Structured Streamingfor concepts andTutorial: Run a real-time streaming workloadfor a hands-on tutorial.",2026-04-07T17:44:00.000Z,how-to,integrations,0.65,True,"Provides working code examples for real-time mode with Kafka sources/sinks and custom sinks. 
These examples typically include product-specific API/SDK parameters and configuration patterns for integrations, which qualify as expert integration knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/limitations,Limitations,Real-time mode limitations - Azure Databricks,Understand real-time mode limitations in Structured Streaming,"Known limitations for real-time mode in Structured Streaming, including source, union, and mapPartitions restrictions.",This page describes known limitations for real-time mode in Structured Streaming.,2026-04-14T17:47:00.000Z,reference,limits-quotas,0.7,True,"A limitations page for real-time mode is likely to enumerate concrete constraints (for example, which sources/unions/mapPartitions are disallowed or restricted), which are product-specific behavioral limits not inferable from general knowledge.",updated +https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/limitations,Limitations,Real-time mode limitations - Azure Databricks,Understand real-time mode limitations in Structured Streaming,"Known limitations for real-time mode in Structured Streaming, including source, union, and mapPartitions restrictions.",This page describes known limitations for real-time mode in Structured Streaming.,2026-04-14T17:47:00.000Z,reference,limits-quotas,0.7,True,"A limitations page for real-time mode is likely to enumerate concrete constraints (for example, which sources/unions/mapPartitions are disallowed or restricted), which are product-specific behavioral limits not inferable from general knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/performance,Optimization and monitoring,Optimize and monitor real-time mode query performance - Azure Databricks,Optimize and monitor Databricks real-time streaming performance,Reduce latency and monitor query performance in Structured Streaming real-time mode using asynchronous 
optimization techniques and built-in metrics.,"This page covers compute tuning, techniques for reducing end-to-end latency, and approaches for measuring query performance in real-time mode.",2026-04-07T17:44:00.000Z,how-to,best-practices,0.7,True,"Covers compute tuning, latency reduction techniques, and performance measurement for real-time mode. This is actionable, product-specific performance guidance (how to size, which settings to adjust, how to interpret metrics), fitting best-practices with expert details.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/reference,Reference,Real-time mode reference - Azure Databricks,,"Reference tables for real-time mode in Structured Streaming, including supported environments, languages, sources, sinks, operators, and limitations.","This page provides reference information for real-time mode in Structured Streaming, including supported environments, languages, sources, sinks, and operators. For known limitations, seeReal-time mode limitations.",2026-04-14T17:47:00.000Z,reference,,0.3,False,"Reference of supported environments, languages, sources, sinks, and operators is largely capability listing; summary does not indicate numeric limits, config tables, or error mappings that would qualify as expert knowledge under the defined categories.",updated -https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/setup,Setup,Set up real-time mode - Azure Databricks,Set up and configure real-time mode in Databricks,"Configure real-time mode in Structured Streaming, including prerequisites, compute requirements, query configuration, and compute sizing.","This page describes the prerequisites and configuration needed to run real-time mode queries in Structured Streaming. For a step-by-step tutorial, seeTutorial: Run a real-time streaming workload. 
For conceptual information about real-time mode, seeReal-time mode in Structured Streaming.",2026-04-07T17:44:00.000Z,how-to,configuration,0.75,True,"Focused on prerequisites, compute requirements, query configuration, and compute sizing for real-time mode. This strongly implies tables or lists of specific settings, parameter names, and environment requirements unique to Databricks, matching configuration expert knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/reference,Reference,Real-time mode reference - Azure Databricks,Reference supported features and limits for real-time mode,"Reference tables for real-time mode in Structured Streaming, including supported environments, languages, sources, sinks, operators, and limitations.","This page provides reference information for real-time mode in Structured Streaming, including supported environments, languages, sources, sinks, and operators. For known limitations, seeReal-time mode limitations.",2026-04-23T17:47:00.000Z,reference,limits-quotas,0.8,True,"Provides reference tables for supported environments, languages, sources, sinks, operators, and known limitations for real-time mode; these tables typically include explicit constraints and support matrices that function as product-specific limits/quotas.",updated +https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/setup,Setup,Set up real-time mode - Azure Databricks,Configure real-time mode for Databricks Structured Streaming,"Configure real-time mode in Structured Streaming, including prerequisites, compute requirements, query configuration, and compute sizing.","This page describes the prerequisites and configuration needed to run real-time mode queries in Structured Streaming. For a step-by-step tutorial, seeTutorial: Run a real-time streaming workload. 
For conceptual information about real-time mode, seeReal-time mode in Structured Streaming.",2026-04-20T22:01:00.000Z,how-to,configuration,0.7,True,"Describes prerequisites, compute requirements, and query configuration for real-time mode, likely including specific configuration options and sizing guidance unique to Databricks Structured Streaming, fitting configuration-focused expert knowledge.",updated https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/tutorial,Tutorial,Tutorial: Run a real-time streaming workload - Azure Databricks,,"Tutorial: Run a real-time streaming query using real-time mode in Structured Streaming, including notebook setup and the real-time trigger.","Real-time mode enables ultra-low latency streaming with end-to-end latency as low as five milliseconds, making it ideal for operational workloads like fraud detection and real-time personalization. This tutorial guides you through setting up your first real-time streaming query using a simple example. For conceptual information about real-time mode, when to use it, and supported features, seeReal-time mode in Structured Streaming. For configuration requirements, seeSet up real-time mode.",2026-04-07T17:44:00.000Z,tutorial,,0.3,False,"Tutorial for running a real-time streaming workload; primarily step-by-step example usage. Does not clearly indicate configuration matrices, limits, or decision/troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/rocksdb-state-store,RocksDB state store,Configure RocksDB state store on Azure Databricks - Azure Databricks,Configure RocksDB state store for Databricks Structured Streaming,Learn how to configure a RocksDB state store for Structured Streaming applications on Azure Databricks.,"RocksDB is the default state store provider in Databricks Runtime 17.3 and above. 
For Databricks Runtime versions below 17.3, you can enable RocksDB-based state management by setting the following configuration in the SparkSession before starting the streaming query. You can enable RocksDB on Lakeflow Spark Declarative Pipelines. SeeOptimize pipeline configuration for stateful processing.",2026-03-12T23:05:00.000Z,concept-article,configuration,0.75,True,"Explains how to enable RocksDB state store with specific Spark configuration keys and version constraints (e.g., default in Runtime 17.3+). These are concrete config parameters and defaults.",unchanged https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/stateful-streaming,Joins and aggregations,What is stateful streaming? - Azure Databricks,,Managing the intermediate state information of stateful Structured Streaming queries can help prevent unexpected latency or production problems.,"This page explains stateful Structured Streaming queries, including stateful operations, optimization recommendations, chaining multiple stateful operators, and state rebalancing. AstatefulStructured Streaming query requires incremental updates to intermediate state information, whereas astatelessStructured Streaming query only tracks information about which rows have been processed from the source to the sink. For optimization features available for stateless queries, seeOptimize stateless stre",2026-03-16T17:36:00.000Z,concept-article,,0.3,False,"Conceptual explanation of stateful Structured Streaming in Azure Databricks with some optimization recommendations, but not organized as concrete product-specific best practices with quantified impact, nor as configuration/limits/troubleshooting reference. 
Mostly high-level behavior and concepts rather than expert-only details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/stateless-streaming,Optimize stateless streaming queries,Optimize stateless streaming queries - Azure Databricks,Optimize stateless Structured Streaming queries on Databricks,"Learn how to optimize stateless Structured Streaming queries using AQE, AOS, and shuffle partition configuration in Databricks Runtime 18.0 and above.","This page describes optimization features available for stateless streaming queries in Databricks Runtime 18.0 and above. Stateless Structured Streaming queries process data without maintaining intermediate state. These queries do not use stateful operators such as streaming aggregations,dropDuplicates, or stream-stream joins. Examples include queries that use stream-static joins,MERGE INTOwith Delta tables, and other operations that only track which rows have been processed from source to sink.",2026-03-06T08:00:00.000Z,how-to,best-practices,0.75,True,"Covers Databricks Runtime–specific optimization features (AQE, AOS, shuffle partitions) with concrete tuning recommendations for stateless streaming workloads.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/stream-monitoring,StreamingQueryListener,Monitoring Structured Streaming queries on Azure Databricks - Azure Databricks,Monitor Azure Databricks Structured Streaming queries,Learn how to monitor Structured Streaming applications on Azure Databricks.,Azure Databricks provides built-in monitoring for Structured Streaming applications through the Spark UI under theStreamingtab.,2026-04-15T08:00:00.000Z,feature-guide,best-practices,0.6,True,"Monitoring guidance for Structured Streaming via Spark UI typically includes product-specific recommendations and gotchas (which metrics to watch, how to interpret them, and how to configure monitoring) that go beyond generic debugging advice.",updated 
+https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/stream-monitoring,StreamingQueryListener,Monitoring Structured Streaming queries on Azure Databricks - Azure Databricks,Monitor Databricks Structured Streaming queries effectively,Learn how to monitor Structured Streaming applications on Azure Databricks.,Azure Databricks provides built-in monitoring for Structured Streaming applications through the Spark UI under theStreamingtab.,2026-04-23T08:00:00.000Z,feature-guide,best-practices,0.6,True,"Explains how to use the Databricks/Spark UI Streaming tab and related mechanisms to monitor streaming applications; likely includes product-specific guidance on interpreting metrics and using monitoring features, which is actionable best-practice content.",updated https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/triggers,Trigger intervals,Configure Structured Streaming trigger intervals - Azure Databricks,Configure Structured Streaming trigger intervals on Databricks,"Learn how to configure Structured Streaming trigger intervals, including processingTime, AvailableNow, and real-time mode, to balance cost and latency on Databricks.","This page explains how to configure trigger intervals for Structured Streaming on Azure Databricks. Apache Spark Structured Streaming processes data incrementally. Trigger intervals control how frequently Structured Streaming checks for new data. You can configure trigger intervals for near-real-time processing, for scheduled database refreshes, or batch processing all new data for a day or a week. BecauseWhat is Auto Loader?uses Structured Streaming to load data, understanding how triggers work",2026-04-08T22:17:00.000Z,overview,configuration,0.65,True,"Page is specifically about configuring trigger intervals (processingTime, AvailableNow, real-time mode). 
This typically includes parameter names, allowed values, and examples unique to Databricks Structured Streaming, fitting configuration guidance with expert-level details.",unchanged https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/tutorial,Structured Streaming tutorial,Run your first Structured Streaming workload - Azure Databricks,,Learn the basics of near real-time and incremental processing with Structured Streaming on Azure Databricks.,"This article provides code examples and explanation of basic concepts necessary to run your first Structured Streaming queries on Azure Databricks. You can use Structured Streaming for near real-time and incremental processing workloads. Structured Streaming is one of several technologies that power streaming tables in Lakeflow Spark Declarative Pipelines. Databricks recommends using Lakeflow Spark Declarative Pipelines for all new ETL, ingestion, and Structured Streaming workloads. SeeLakeflow ",2026-03-12T18:41:00.000Z,get-started,,0.3,False,"Introductory tutorial for first Structured Streaming queries; primarily example code and basic concepts, not detailed config matrices or limits.",unchanged https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/unity-catalog,Unity Catalog support,Using Unity Catalog with Structured Streaming - Azure Databricks,Combine Unity Catalog with Structured Streaming workloads,Learn how to leverage Unity Catalog in conjunction with Structured Streaming on Azure Databricks.,Use Structured Streaming with Unity Catalog to manage data governance for your incremental and streaming workloads on Azure Databricks. 
This document outlines supported functionality and suggests best practices for using Unity Catalog and Structured Streaming together.,2025-08-29T23:49:00.000Z,how-to,best-practices,0.7,True,"Outlines supported functionality and best practices for using Unity Catalog with streaming, including governance patterns specific to Databricks.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/watermarks,Watermarks,Apply watermarks to control data processing thresholds - Azure Databricks,Apply watermarks for efficient stateful streaming,Learn the basic concepts of watermarking and recommendations for using watermarks to control state information in common stateful streaming operations.,"This page describes the basic concepts of watermarking and provides recommendations for using watermarks in common stateful streaming operations. You must apply watermarks to stateful streaming operations to avoid infinitely expanding the amount of data kept in state, which can introduce memory issues or increase processing latencies during long-running streaming operations.",2026-02-19T19:35:00.000Z,concept-article,best-practices,0.7,True,Provides recommendations on watermark usage to control state growth and latency; these are product-specific operational guidelines for Structured Streaming.,unchanged +https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/watermarks,Watermarks,Apply watermarks to control data processing thresholds - Azure Databricks,Apply watermarks for Databricks streaming state control,Learn the basic concepts of watermarking and recommendations for using watermarks to control state information in common stateful streaming operations.,"This page describes the basic concepts of watermarking and provides recommendations for using watermarks in common stateful streaming operations. 
You must apply watermarks to stateful streaming operations to avoid infinitely expanding the amount of data kept in state, which can introduce memory issues or increase processing latencies during long-running streaming operations.",2026-04-20T08:00:00.000Z,concept-article,best-practices,0.7,True,"Provides Databricks-specific recommendations on how to use watermarks in stateful streaming to avoid unbounded state and latency issues; includes concrete guidance on when and how to configure watermarks for common operations, which is product-specific best-practice knowledge.",updated https://learn.microsoft.com/en-us/azure/databricks/tables/,Overview,Azure Databricks tables - Azure Databricks,,"Learn about table types, storage formats, and management features in Azure Databricks, including managed, external, temporary, and foreign tables.","Azure Databricks supports multiple table types and storage formats to meet different data management needs. For an overview of table types, storage formats, and Unity Catalog integration, seeAzure Databricks tables concepts.",2026-04-10T18:06:00.000Z,landing-page,,0.2,False,"Appears to be a conceptual overview of Azure Databricks table types and storage formats without evidence of numeric limits, config parameter tables, or product-specific error/decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/tables/automatic-feature-enablement,Automatic feature enablement,Automatic feature enablement - Azure Databricks,Use Automatic Feature Enablement for Unity Catalog tables,Learn how Automatic Feature Enablement automatically turns on best-practice features on your Unity Catalog managed tables with no code changes required.,"Important Automatic feature enablement is inPublic Preview. To enroll, completethis formwith your account ID. No code changes or additional configuration are required after enrollment. 
Automatic feature enablement (AFE) automatically upgrades Unity Catalog managed tables to use generally available recommended features without requiring code changes or manualALTER TABLEstatements. AFE also verifies that clients are compatible before turning on new features. AFE provides the following benefits:",2026-04-16T08:00:00.000Z,feature-guide,best-practices,0.65,True,"Describes an automatic mechanism that turns on recommended features and verifies client compatibility; likely includes product-specific recommendations and behaviors (which features are enabled, conditions, and edge cases) that constitute Databricks-specific best practices.",updated -https://learn.microsoft.com/en-us/azure/databricks/tables/constraints,Table constraints,Constraints on Azure Databricks - Azure Databricks,Define constraints on Databricks Delta tables,Apply enforced constraints and primary or foreign keys on tables in Azure Databricks.,Azure Databricks supports standard SQL constraint management clauses. Constraints fall into two categories: All constraints on Azure Databricks require Delta Lake. Lakeflow Spark Declarative Pipelines has a similar concept known as expectations. SeeManage data quality with pipeline expectations.,2025-10-08T08:00:00.000Z,concept-article,configuration,0.6,True,"Covers Databricks-specific support for enforced constraints and key definitions on Delta tables, including SQL clauses and behavior tied to Delta.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/tables/automatic-feature-enablement,Automatic feature enablement,Automatic feature enablement - Azure Databricks,Use Automatic Feature Enablement for Unity Catalog tables,Learn how Automatic Feature Enablement automatically turns on best-practice features on your Unity Catalog managed tables with no code changes required.,"Important Automatic feature enablement is inPublic Preview. To enroll, completethis formwith your account ID. 
No code changes or additional configuration are required after enrollment. Automatic feature enablement (AFE) automatically upgrades Unity Catalog managed tables to use generally available recommended features without requiring code changes or manualALTER TABLEstatements. AFE also verifies that clients are compatible before turning on new features. AFE provides the following benefits:",2026-04-16T08:00:00.000Z,feature-guide,best-practices,0.65,True,"Describes an automatic mechanism that turns on recommended features and verifies client compatibility; likely includes product-specific recommendations and behaviors (which features are enabled, conditions, and edge cases) that constitute Databricks-specific best practices.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/tables/constraints,Table constraints,Constraints on Azure Databricks - Azure Databricks,Configure SQL constraints on Delta tables in Azure Databricks,Apply enforced constraints and primary or foreign key constraints on Delta Lake tables in Azure Databricks.,"Azure Databricks supports standard SQL constraint management clauses: All constraints on Azure Databricks require Delta Lake. 
For a related concept in Lakeflow Spark Declarative Pipelines, seeManage data quality with pipeline expectations.",2026-04-22T17:34:00.000Z,concept-article,configuration,0.65,True,"Covers supported SQL constraint clauses and their requirements (all constraints require Delta Lake) and likely includes specific syntax and behaviors for enforced, primary, and foreign key constraints in Databricks, which are product-specific configuration details.",updated https://learn.microsoft.com/en-us/azure/databricks/tables/convert-external-managed,Convert external tables to managed,Convert an external table to a managed Unity Catalog table - Azure Databricks,,Learn how to convert an external table to a Unity Catalog managed table in Azure Databricks.,This page describes how to convert an external table to a Unity Catalog managed table in Azure Databricks using theALTER TABLE ... SET MANAGEDcommand or Catalog Explorer.,2026-03-04T19:15:00.000Z,concept-article,,0.4,False,Describes using ALTER TABLE ... SET MANAGED or Catalog Explorer; appears as straightforward DDL usage without deep config or limits.,unchanged https://learn.microsoft.com/en-us/azure/databricks/tables/convert-foreign-external,Convert foreign tables to external,Convert a foreign table to an external Unity Catalog table - Azure Databricks,,Use SET EXTERNAL to convert a foreign table to a Unity Catalog external table in Databricks Runtime 17.3 or above.,"Important This feature is in Public Preview and is only available to participating customers at this time. To participate in the preview, apply byfilling out this form. This feature only supports converting foreign tables federated usingHMS and Glue Federation. 
This page describes how to useSET EXTERNALto convert a foreign table to an external table.",2025-11-13T19:48:00.000Z,concept-article,,0.5,False,How-to for SET EXTERNAL to convert foreign to external tables; preview note but no detailed limits/config tables in summary.,unchanged https://learn.microsoft.com/en-us/azure/databricks/tables/convert-foreign-managed,Convert foreign tables to managed,Convert a foreign table to a managed Unity Catalog table - Azure Databricks,Convert foreign tables to Unity Catalog managed tables,Use SET MANAGED to convert a foreign table to a Unity Catalog managed table in Databricks Runtime 17.3 or above.,"Important This feature is in Public Preview and is only available to participating customers at this time. To participate in the preview, apply byfilling out this form. This feature only supports converting foreign tables federated usingHMS and Glue Federation. This page describes how to useSET MANAGEDto convert a foreign table to a managed table.",2026-03-04T08:00:00.000Z,concept-article,configuration,0.7,True,"Explains using SET MANAGED to convert foreign tables, with runtime/version and federation-type constraints—product-specific DDL configuration behavior.",unchanged @@ -5012,9 +5057,9 @@ https://learn.microsoft.com/en-us/azure/databricks/tables/size,Table size,Table https://learn.microsoft.com/en-us/azure/databricks/tables/table-overview,Tables,What are tables in Azure Databricks? - Azure Databricks,,Learn about the different table types that Azure Databricks supports.,"In Azure Databricks, a table is a structured collection of data stored within a schema. Tables are used to store, query, and manage data using SQL or Spark. The default table type is a Unity Catalog managed table, which uses Delta Lake for reliable data storage. 
Azure Databricks supports three main table types, each with different ownership and data management characteristics: For most use cases, Databricks recommends using managed tables.",2026-01-20T23:44:00.000Z,concept-article,,0.3,False,"Overview of table types and recommendations; lacks numeric limits, config tables, or detailed security roles.",unchanged https://learn.microsoft.com/en-us/azure/databricks/tables/tables-concepts,Tables concepts,Azure Databricks tables concepts - Azure Databricks,,"Core concepts and foundational information about table types, storage formats, and Unity Catalog integration in Azure Databricks.","An Azure Databricks table resides in a schema and contains rows of data. The default table type created in Azure Databricks is a Unity Catalog managed table. The following example shows a managed table namedprod.people_ops_employeesthat contains data about five employees. As a managed table, the data files are stored in Unity Catalog's managed storage location in cloud storage.",2026-01-20T23:44:00.000Z,concept-article,,0.3,False,"Concepts page defining tables and showing a simple example; conceptual rather than configuration, limits, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/databricks/tables/temporary-tables,Temporary tables,Temporary tables - Azure Databricks,,Learn how to use temporary tables in Azure Databricks SQL warehouses for session-scoped intermediate data storage and analysis.,Applies to:Databricks SQL Important This feature is inPublic Preview. This page describes how to use temporary tables for session-scoped intermediate data storage and analysis on SQL warehouse compute. Temporary tables store data for the duration of an Azure Databricks session. Use temporary tables to materialize intermediate results for exploratory analysis or SQL data pipelines without creating permanent tables in your catalog. Temporary tables are available on SQL warehouse compute only.
Class,2026-04-10T18:06:00.000Z,concept-article,,0.3,False,"Focuses on what temporary tables are and when to use them; summary does not indicate numeric limits, detailed configuration parameter tables, or troubleshooting/decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/transactions/,Transactions,Transactions - Azure Databricks,Use multi-statement transactions on Unity Catalog tables,Execute multiple SQL statements as a single atomic transaction on Unity Catalog tables with transaction support on Azure Databricks.,"Important Transactions that write to Unity Catalog managed Delta tables are inPublic Preview. Transactions that write to Unity Catalog managed Iceberg tables are inPrivate Preview. To join this preview, submit themanaged Iceberg tables preview enrollment form. Transactions let you coordinate operations across multiple SQL statements and tables. All changes succeed together or roll back together, ensuring data consistency across your operations and tables. Transactions provide ACID guarantees: at",2026-04-07T08:00:00.000Z,feature-guide,configuration,0.65,True,"Details how to execute transactions across tables, including syntax and constraints for Delta and Iceberg in Unity Catalog; product-specific transactional behavior and configuration.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/transactions/,Transactions,Transactions - Azure Databricks,Configure and run multi-statement transactions on Unity Catalog tables,Execute multiple SQL statements as a single atomic transaction on Unity Catalog tables with transaction support on Azure Databricks.,"Important Transactions that write to Unity Catalog managed Delta tables are inPublic Preview. Transactions that write to Unity Catalog managed Iceberg tables are inPrivate Preview. To join this preview, submit themanaged Iceberg tables preview enrollment form. Transactions let you coordinate operations across multiple SQL statements and tables. 
All changes succeed together or roll back together, ensuring data consistency across your operations and tables. Transactions provide ACID guarantees: at",2026-04-21T08:00:00.000Z,feature-guide,configuration,0.65,True,"Describes how to execute transactions across Unity Catalog managed Delta and Iceberg tables, including preview scope and ACID guarantees. Likely includes specific SQL syntax and constraints for Databricks transactions, which are product-specific configuration/usage details.",updated https://learn.microsoft.com/en-us/azure/databricks/transactions/transaction-modes,Transaction modes,Transaction modes - Azure Databricks,Choose and implement Databricks transaction modes,Learn how to use interactive and non-interactive transaction modes for transactions that span multiple tables on Azure Databricks.,"Important Transactions that write to Unity Catalog managed Delta tables are in Public Preview. Transactions that write to Unity Catalog managed Iceberg tables are in Private Preview. To join this preview, submit the managed Iceberg tables preview enrollment form. Transactions support two modes: non-interactive and interactive. This page covers when to use each mode and includes implementation examples. For requirements and an overview of transactions, see Transactions. For hands-on practice with bot",2026-04-07T08:00:00.000Z,concept-article,decision-making,0.7,True,"Explains when to use interactive vs non-interactive transaction modes with examples; provides scenario-based guidance for selecting modes, which is decision-focused and Databricks-specific.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/transactions/tutorial,Coordinate transactions across tables,Tutorial: Coordinate transactions across tables - Azure Databricks,,Learn how to use transactions on Databricks with hands-on examples for coordinating updates across multiple tables.,"Important Transactions that write to Unity Catalog managed Delta tables are inPublic Preview. 
Transactions that write to Unity Catalog managed Iceberg tables are inPrivate Preview. To join this preview, submit themanaged Iceberg tables preview enrollment form. This tutorial demonstrates how to use transactions to coordinate updates across multiple statements and tables. You learn both transaction modes: non-interactive transactions, which commit automatically, and interactive transactions, which",2026-04-15T08:00:00.000Z,concept-article,,0.3,False,"Described as a tutorial demonstrating how to use transactions with examples and preview notes. Tutorials generally focus on step-by-step usage rather than reference-style limits, configuration matrices, or error-code mappings, so it likely lacks the required expert-knowledge patterns.",updated +https://learn.microsoft.com/en-us/azure/databricks/transactions/tutorial,Coordinate transactions across tables,Tutorial: Coordinate transactions across tables - Azure Databricks,,Learn how to use transactions on Databricks with hands-on examples for coordinating updates across multiple tables.,"Important Transactions that write to Unity Catalog managed Delta tables are in Public Preview. Transactions that write to Unity Catalog managed Iceberg tables are in Private Preview. To join this preview, submit the managed Iceberg tables preview enrollment form. This tutorial demonstrates how to use transactions to coordinate updates across multiple statements and tables. You learn both transaction modes: non-interactive transactions, which commit automatically, and interactive transactions, which",2026-04-15T08:00:00.000Z,concept-article,,0.3,False,"Described as a tutorial demonstrating how to use transactions with examples and preview notes. 
Tutorials generally focus on step-by-step usage rather than reference-style limits, configuration matrices, or error-code mappings, so it likely lacks the required expert-knowledge patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/transform/aggregation,Aggregations,Aggregate data on Azure Databricks - Azure Databricks,,Learn the basics of data aggregation on Azure Databricks.,"This article introduces the general semantics for aggregation and discusses the differences between results computed using batch queries, materialized views, and streaming.",2026-03-12T08:00:00.000Z,concept-article,,0.3,False,"Introductory aggregation semantics and comparison of batch, materialized views, and streaming; likely conceptual without detailed product-specific parameters or limits.",unchanged https://learn.microsoft.com/en-us/azure/databricks/transform/data-modeling,Data modeling,Data modeling - Azure Databricks,Design data models optimized for Azure Databricks,Learn important considerations around data modeling when transforming data on Azure Databricks.,"This article introduces considerations, caveats, and recommendations for data modeling on Azure Databricks. It is targeted toward users who are setting up new tables or authoring ETL workloads, with an emphasis on understanding Azure Databricks behaviors that influence transforming raw data into a new data model. Data modeling decisions depend on how your organization and workloads use tables. The data model you choose impacts query performance, compute costs, and storage costs. 
This includes an",2026-01-23T08:00:00.000Z,concept-article,best-practices,0.65,True,"Provides Databricks-specific modeling recommendations tied to table behavior, performance, and cost in the lakehouse, going beyond generic data modeling theory.",unchanged https://learn.microsoft.com/en-us/azure/databricks/transform/join,Joins,Work with joins on Azure Databricks - Azure Databricks,,Learn common patterns for joining datasets on Azure Databricks with batch or stream processing.,"Databricks supports ANSI standard join syntax. This page describes differences between joins with batch and stream processing. Note Databricks also supports standard syntax for the set operators UNION, INTERSECT, and EXCEPT. See Set operators.",2026-03-31T23:28:00.000Z,concept-article,,0.3,False,Describes join semantics and differences between batch and streaming; appears to be general usage patterns rather than Databricks-specific configuration or limits.,unchanged
- Azure Databricks,,Learn about user-defined functions supported by Azure Databricks and their strengths and limitations.,"User-defined functions (UDFs) allow you to reuse and share code that extends built-in functionality on Azure Databricks. Use UDFs to perform specific tasks like complex calculations, transformations, or custom data manipulations.",2026-01-24T08:00:00.000Z,concept-article,,0.4,False,"Conceptual explanation of UDFs and their strengths/limitations; summary does not show product-specific parameters, limits, or error codes.",unchanged https://learn.microsoft.com/en-us/azure/databricks/udf/aggregate-scala,Scala UDAFs,User-defined aggregate functions - Scala - Azure Databricks,Create Scala user-defined aggregate functions on Databricks,Learn how to implement a user-defined aggregate function in Scala and register it for use from Apache Spark SQL code in Azure Databricks.,This article contains an example of a UDAF and how to register it for use in Apache Spark SQL. See User-defined aggregate functions (UDAFs) for more details.,2025-12-30T23:46:00.000Z,concept-article,integrations,0.65,True,"Shows how to implement and register Scala UDAFs for Databricks Spark SQL, which is a concrete integration pattern.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/udf/pandas,pandas UDFs,pandas user-defined functions - Azure Databricks,Create and use pandas UDFs on Azure Databricks,Learn how to create and use pandas user-defined functions in Python code in Azure Databricks.,"A pandas user-defined function (UDF)—also known as vectorized UDF—is a user-defined function that usesApache Arrowto transfer data and pandas to work with the data. pandas UDFs allow -vectorized operations that can increase performance up to 100x compared to row-at-a-timePython UDFs. For background information, see the blog postNew Pandas UDFs and Python Type Hints in the Upcoming Release of Apache Spark 3.0. 
You define a pandas UDF using the keywordpandas_udfas a decorator and wrap the function ",2025-10-06T08:00:00.000Z,concept-article,integrations,0.7,True,"Details pandas UDFs using Apache Arrow and pandas, including decorators (pandas_udf) and performance characteristics on Databricks. These are concrete code patterns and API usage specific to Databricks’ Spark environment.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/udf/pandas,pandas UDFs,pandas user-defined functions - Azure Databricks,Create and optimize pandas UDFs in Databricks,Learn how to create and use pandas user-defined functions in Python code in Azure Databricks.,"A pandas user-defined function (UDF)—also known as vectorized UDF—is a user-defined function that uses Apache Arrow to transfer data and pandas to work with the data. pandas UDFs allow +vectorized operations that can increase performance up to 100x compared to row-at-a-time Python UDFs. For background information, see the blog post New Pandas UDFs and Python Type Hints in the Upcoming Release of Apache Spark 3.0. You define a pandas UDF using the keyword pandas_udf as a decorator and wrap the function ",2026-04-23T08:00:00.000Z,concept-article,integrations,0.7,True,Details pandas UDFs using Arrow and pandas with Databricks-specific performance characteristics (up to 100x vs row UDFs) and decorator-based definitions; this is a concrete coding pattern for integrating pandas with Spark on Databricks.,updated
It shows how to register UDFs, how to invoke UDFs, and provides caveats about evaluation order of subexpressions in Spark SQL.",2026-02-12T04:25:00.000Z,concept-article,integrations,0.7,True,"Contains concrete examples and caveats for Python UDF registration and evaluation order in Databricks Spark SQL, which are product-specific coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/udf/python-batch-udf,Batch UDFs in Unity Catalog,Batch Python User-defined functions (UDFs) in Unity Catalog - Azure Databricks,Implement batch Python UDFs in Unity Catalog,Learn how to implement batch Python user-defined functions in Unity Catalog.,"Important This feature is in Public Preview. Batch Unity Catalog Python UDFs extend the capabilities of Unity Catalog UDFs by allowing you to write Python code to operate on batches of data, significantly improving efficiency by reducing the overhead associated with row-by-row UDFs. These optimizations make Unity Catalog batch Python UDFs ideal for large-scale data processing.",2026-03-31T08:00:00.000Z,concept-article,configuration,0.7,True,"Covers Unity Catalog batch Python UDFs, including how they operate on batches and performance characteristics vs row UDFs. This is a Databricks-specific UDF configuration and usage pattern, beyond generic pandas or Spark knowledge.",unchanged https://learn.microsoft.com/en-us/azure/databricks/udf/python-udtf,Python UDTFs (user-defined table functions),Python user-defined table functions (UDTFs) - Azure Databricks,Configure Python user-defined table functions on Databricks,Learn how to implement Python user-defined table functions on Azure Databricks.,"Important This feature is in Public Preview in Databricks Runtime 14.3 LTS and above. A user-defined table function (UDTF) allows you to register functions that return tables instead of scalar values. 
Unlike scalar functions that return a single result value from each call, each UDTF is invoked in a SQL statement's FROM clause and returns an entire table as output. Each UDTF call can accept zero or more arguments. These arguments can be scalar expressions or table arguments representing entire input",2025-10-10T08:00:00.000Z,concept-article,configuration,0.7,True,"Describes Python UDTFs in Databricks Runtime, including how they return tables and are invoked in SQL FROM clauses. This is Databricks-specific UDTF configuration and behavior.",unchanged https://learn.microsoft.com/en-us/azure/databricks/udf/scala,Scala UDFs,User-defined scalar functions - Scala - Azure Databricks,Implement Scala scalar UDFs in Databricks SQL,Learn how to implement Scala user-defined functions for use from Apache Spark SQL code in Azure Databricks.,"This article contains Scala user-defined function (UDF) examples. It shows how to register UDFs, how to invoke UDFs, and caveats regarding evaluation order of subexpressions in Spark SQL. See External user-defined scalar functions (UDFs) for more details.",2026-02-12T04:25:00.000Z,concept-article,integrations,0.7,True,"Scala UDF registration and usage details in Databricks Spark SQL, including evaluation caveats, are product-specific coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/databricks/udf/udf-task-context,Get UDF context information,Get task context in a UDF - Azure Databricks,Access task context inside Databricks UDFs,"Get task context information such as the user executing the UDF, job IDs, and cluster tags in Batch Unity Catalog Python UDF and PySpark UDFs.","Use the TaskContext PySpark API to get context information while running a Batch Unity Catalog Python UDF or PySpark UDF. 
For example, context information such as the user's identity and cluster tags can verify a user's identity to access external services.",2026-01-24T08:00:00.000Z,concept-article,integrations,0.7,True,"Shows how to use the TaskContext PySpark API within Batch Unity Catalog Python UDFs and PySpark UDFs to get user identity, job IDs, and cluster tags. This is a product-specific coding pattern for integrating UDF logic with Databricks execution context.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/udf/udtf-unity-catalog,UDTFs in Unity Catalog,Python user-defined table functions (UDTFs) in Unity Catalog - Azure Databricks,Register Python UDTFs in Unity Catalog,Learn how to register Python user-defined table functions (UDTFs) in Unity Catalog on Azure Databricks.,"Important Registering Python UDTFs in Unity Catalog is inPublic Preview. A Unity Catalog user-defined table function (UDTF) registers functions that return complete tables instead of scalar values. Unlike scalar functions that return a single result value from each call, UDTFs are invoked in a SQL statement'sFROMclause and can return multiple rows and columns. UDTFs are particularly useful for: Each UDTF call accepts zero or more arguments. These arguments can be scalar expressions or table argu",2026-01-24T08:00:00.000Z,concept-article,configuration,0.7,True,"Explains Unity Catalog user-defined table functions, how they are invoked in FROM clauses, and argument types. This is specific to Databricks Unity Catalog UDTF configuration and behavior.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/udf/udtf-unity-catalog,UDTFs in Unity Catalog,Python user-defined table functions (UDTFs) in Unity Catalog - Azure Databricks,Register and configure Python UDTFs in Unity Catalog,Learn how to register Python user-defined table functions (UDTFs) in Unity Catalog on Azure Databricks.,"Important Registering Python UDTFs in Unity Catalog is in Public Preview. 
A Unity Catalog user-defined table function (UDTF) registers functions that return complete tables instead of scalar values. Unlike scalar functions that return a single result value from each call, UDTFs are invoked in a SQL statement's FROM clause and can return multiple rows and columns. UDTFs are particularly useful for: Each UDTF call accepts zero or more arguments. These arguments can be scalar expressions or table argu",2026-04-23T08:00:00.000Z,concept-article,configuration,0.65,True,"Covers registering Python user-defined table functions in Unity Catalog, including how arguments and return tables are defined and invoked in SQL FROM clauses; likely includes function registration syntax and catalog-specific configuration details unique to Databricks.",updated https://learn.microsoft.com/en-us/azure/databricks/udf/unity-catalog,UDFs on Unity Catalog,User-defined functions (UDFs) in Unity Catalog - Azure Databricks,Configure SQL and Python UDFs in Unity Catalog,Learn how to implement Python and SQL user-defined functions for use with Unity Catalog on Azure Databricks.,"Important This feature is in Public Preview. User-defined functions (UDFs) in Unity Catalog extend SQL and Python capabilities within Azure Databricks. They allow custom functions to be defined, used, and securely shared and governed across computing environments. Python UDFs registered as functions in Unity Catalog differ in scope and support from PySpark UDFs scoped to a notebook or SparkSession. See User-defined scalar functions - Python. See CREATE FUNCTION (SQL and Python) for complete SQL lang",2026-03-31T08:00:00.000Z,concept-article,configuration,0.7,True,"Describes Unity Catalog–scoped UDFs, their scope differences from PySpark UDFs, and references to CREATE FUNCTION syntax. 
This is product-specific configuration of UDFs within Unity Catalog, including scope and governance behavior.",unchanged https://learn.microsoft.com/en-us/azure/databricks/vector-search/create-vector-search,Create vector search endpoints and indexes,Create vector search endpoints and indexes - Azure Databricks,Configure and create Mosaic AI Vector Search endpoints and indexes,Learn how to create vector search endpoints and indexes using Mosaic AI Vector Search. This article also provides guidance on how to configure a vector search endpoint to serve an embeddings model of ,"This article describes how to create vector search endpoints and indexes using Mosaic AI Vector Search. You can create and manage vector search components, like a vector search endpoint and vector search indices, using the UI, the Python SDK, or the REST API. For example notebooks illustrating how to create and query vector search endpoints, see Vector search example notebooks. For reference information, see the Python SDK reference.",2026-04-08T19:25:00.000Z,how-to,configuration,0.7,True,"Describes how to create and configure vector search endpoints and indexes via UI, Python SDK, and REST API. This typically includes product-specific parameters (for endpoints, indexes, and embedding models) and concrete configuration patterns that go beyond generic knowledge, fitting the configuration sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/databricks/vector-search/custom-embedding-model,Use a custom embedding model,Use a custom embedding model - Azure Databricks,Integrate custom embedding models with Vector Search,Learn how to use a custom embedding model with vector search.,"The following example shows how to log a custom Python model for Databricks to use to generate embeddings. The input schema should be a single ColSpec string, and the output schema should be TensorSpec as your signature. 
The embeddings should be returned in a NumPy array.",2026-01-20T08:00:00.000Z,how-to,integrations,0.8,True,"Shows how to log a custom Python model with specific input/output schema requirements (ColSpec, TensorSpec, NumPy array embeddings); these are concrete integration and API contract details.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/vector-search/embedding-with-oss-models,Register and serve an OSS embedding model,Register and serve an OSS embedding model - Azure Databricks,Deploy OSS embedding model for Databricks Vector Search,Learn how to register and serve an open source embedding model (`e5-small-v2`) in a Model Serving endpoint for use with Vector Search.,"Open notebook version of this page This notebook sets up the open source text embedding model e5-small-v2 in a Model Serving endpoint usable for Vector Search. The model e5-small-v2 is available at https://huggingface.co/intfloat/e5-small-v2. For a list of library versions included in Databricks Runtime, see the release notes for your Databricks Runtime version.",2026-04-22T17:34:00.000Z,tutorial,deployment,0.75,True,"Describes registering and serving the e5-small-v2 model in a Databricks Model Serving endpoint for Vector Search, including runtime/library considerations, which are product-specific deployment details.",new https://learn.microsoft.com/en-us/azure/databricks/vector-search/high-qps,Scale endpoint throughput (Beta),Scale endpoint throughput with high QPS (Beta) - Azure Databricks,Scale Mosaic AI Vector Search endpoints to high QPS,Learn how to use high QPS for scaling endpoints to 1000+ QPS.,"Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. By default, standard endpoints support 20–200 QPS depending on index size. Real-time applications such as search bars, recommendation systems, and entity matching often require 100–1000+ QPS. On standard endpoints only, you can set a minimum QPS. 
Databricks provisions the infrastructure to support that throughput level when indexes are created or synced",2026-02-24T20:13:00.000Z,how-to,limits-quotas,0.85,True,"Explicitly mentions default QPS ranges (20–200) and configuring minimum QPS for standard endpoints; contains numeric throughput limits and configuration, fitting limits-quotas.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/vector-search/query-vector-search,Query a vector search index,Query a vector search index - Azure Databricks,Query Mosaic AI Vector Search indexes with filters and reranking,Learn how to query a Mosaic AI Vector Search vector search index.,"This page describes how to query a vector search index, including pagination, filters, and reranking. For example notebooks illustrating how to create and query vector search endpoints and indexes, seeVector search example notebooks. For reference information, see thePython SDK reference.",2026-04-13T21:03:00.000Z,how-to,integrations,0.68,True,"Querying a Databricks Mosaic AI Vector Search index typically involves product-specific API/SDK parameters (e.g., vector, k, filters, pagination tokens, reranking options) and their allowed values. This is integration-focused, detailing how to call the service and structure requests, which is expert, product-specific knowledge beyond generic vector search concepts.",updated -https://learn.microsoft.com/en-us/azure/databricks/vector-search/retrieval-quality-eval,Retrieval quality evaluation,Evaluate vector search retrieval quality - Azure Databricks,Evaluate and compare Mosaic AI Vector Search retrieval quality,Learn how to use automatic vector search retrieval quality to measure and compare retrieval quality across different search strategies.,"Important This feature is inBeta. Workspace admins can control access to this feature from thePreviewspage. SeeManage Azure Databricks previews. 
Mosaic AI Vector Search provides built-in retrieval quality evaluation that measures and compares the relevance of different search strategies on your data. You can automatically generate evaluation queries from your documents, run multiple retrieval strategies, and generate a detailed report.",2026-04-14T08:00:00.000Z,how-to,best-practices,0.7,True,"The page describes built-in retrieval quality evaluation with automatic query generation and comparison of retrieval strategies. This implies detailed, product-specific guidance on how to configure and interpret evaluation runs, and how to compare strategies on Databricks, which are concrete usage patterns and recommendations rather than generic theory, fitting best-practices.",new -https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search,Introduction to vector search,Mosaic AI Vector Search - Azure Databricks,,"Learn about Mosaic AI Vector Search, a vector search solution built into Databricks and integrated with its governance and productivity tools.","This article gives an overview of Mosaic AI Vector Search, including what it is and how it works.",2026-04-08T19:25:00.000Z,overview,,0.2,False,"High-level overview of Mosaic AI Vector Search describing what it is and how it works conceptually. No evidence of numeric limits, configuration parameter tables, error codes, or decision matrices with thresholds.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/vector-search/query-vector-search,Query a vector search index,Query a vector search index - Azure Databricks,Query Mosaic AI Vector Search indexes with filters and reranking,Learn how to query a Mosaic AI Vector Search vector search index.,"This page describes how to query a vector search index, including pagination, filters, and reranking. For example notebooks illustrating how to create and query vector search endpoints and indexes, see Vector search example notebooks. 
For reference information, see the Python SDK reference.",2026-04-13T21:03:00.000Z,how-to,integrations,0.68,True,"Querying a Databricks Mosaic AI Vector Search index typically involves product-specific API/SDK parameters (e.g., vector, k, filters, pagination tokens, reranking options) and their allowed values. This is integration-focused, detailing how to call the service and structure requests, which is expert, product-specific knowledge beyond generic vector search concepts.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/vector-search/retrieval-quality-eval,Retrieval quality evaluation,Evaluate vector search retrieval quality - Azure Databricks,Evaluate and compare Mosaic AI Vector Search retrieval quality,Learn how to use automatic vector search retrieval quality to measure and compare retrieval quality across different search strategies.,"Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Mosaic AI Vector Search provides built-in retrieval quality evaluation that measures and compares the relevance of different search strategies on your data. You can automatically generate evaluation queries from your documents, run multiple retrieval strategies, and generate a detailed report.",2026-04-14T08:00:00.000Z,how-to,best-practices,0.7,True,"The page describes built-in retrieval quality evaluation with automatic query generation and comparison of retrieval strategies. 
This implies detailed, product-specific guidance on how to configure and interpret evaluation runs, and how to compare strategies on Databricks, which are concrete usage patterns and recommendations rather than generic theory, fitting best-practices.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search,Introduction to vector search,Mosaic AI Vector Search - Azure Databricks,,"Learn about Mosaic AI Vector Search, a vector search solution built into Databricks and integrated with its governance and productivity tools.","This article gives an overview of Mosaic AI Vector Search, including what it is and how it works.",2026-04-23T08:00:00.000Z,overview,,0.2,False,"High-level overview of Mosaic AI Vector Search without detailed limits, configs, or troubleshooting content.",updated https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-best-practices,Performance guide,Vector search performance guide - Azure Databricks,Optimize Mosaic AI Vector Search performance,"Learn how to optimize the performance of Mosaic AI Vector Search, including high QPS Beta for scaling endpoints to 1000+ QPS.","Mosaic AI Vector Search is built for fast, scalable retrieval. Vector search performance depends on many factors, including SKU choice, index size, query type, vector dimensionality, authentication methods, and how your application handles traffic spikes. 
Most workloads perform well out of the box, but for situations where you need to scale or optimize latency, this guide presents practical tips and common patterns to help you configure your system for optimal vector search performance.",2026-03-18T08:00:00.000Z,best-practice,best-practices,0.85,True,"Performance guide with practical, product-specific tuning advice (SKU choice, index size, QPS scaling, auth impact); these are concrete best practices beyond generic vector search theory.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-budget-policies,Budget policies,Vector search budget policies - Azure Databricks,,Learn how to use budget policies to track spending on vector search endpoints.,"Important This feature is inPublic Preview. This article describes how to use budget policies to track vector search costs. Budget policies enable administrators to group and filter billing records across all Azure Databricks serverless products, and provide a dedicated UI for tracking spending. Budget policies are created by workspace admins and can be applied to vector search endpoints and indexes. For general information and details about how to create and manage budget policies, seeAttribute",2026-04-16T17:44:00.000Z,how-to,,0.2,False,"The description focuses on using budget policies to track spending and references a general attribute-based pricing article. 
It reads as cost-tracking/administration guidance without clear evidence of numeric limits, configuration tables, or detailed parameter specs; more conceptual than expert configuration or limits content.",updated -https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-cost-management,Cost management guide,Vector search cost management guide - Azure Databricks,Optimize and manage Mosaic AI Vector Search costs,Learn how to effectively manage your costs when using Mosaic AI Vector Search.,"This article describes how to effectively manage your vector search costs. It covers the following topics: To find endpoints with indexes that receive no query traffic, seeIdentify unused vector search endpoints. Databricks vector search includes the following:",2026-04-15T08:00:00.000Z,best-practice,best-practices,0.64,True,"A cost management guide for a specific service usually contains concrete, product-specific recommendations (for example, how to configure endpoints, index layouts, or traffic patterns to reduce cost, and how to identify unused endpoints). These are actionable DO/DON'T patterns tied to this product’s behavior, fitting best-practices with expert operational guidance.",updated +https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-budget-policies,Budget policies,Vector search budget policies - Azure Databricks,,Learn how to use budget policies to track spending on vector search endpoints.,"Important This feature is in Public Preview. This article describes how to use budget policies to track vector search costs. Budget policies enable administrators to group and filter billing records across all Azure Databricks serverless products, and provide a dedicated UI for tracking spending. Budget policies are created by workspace admins and can be applied to vector search endpoints and indexes. 
For general information and details about how to create and manage budget policies, see Attribute",2026-04-16T17:44:00.000Z,how-to,,0.2,False,"The description focuses on using budget policies to track spending and references a general attribute-based pricing article. It reads as cost-tracking/administration guidance without clear evidence of numeric limits, configuration tables, or detailed parameter specs; more conceptual than expert configuration or limits content.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-cost-management,Cost management guide,Vector search cost management guide - Azure Databricks,Optimize and manage Mosaic AI Vector Search costs,Learn how to effectively manage your costs when using Mosaic AI Vector Search.,"This article describes how to effectively manage your vector search costs. It covers the following topics: To find endpoints with indexes that receive no query traffic, see Identify unused vector search endpoints. Databricks vector search includes the following:",2026-04-15T08:00:00.000Z,best-practice,best-practices,0.64,True,"A cost management guide for a specific service usually contains concrete, product-specific recommendations (for example, how to configure endpoints, index layouts, or traffic patterns to reduce cost, and how to identify unused endpoints). These are actionable DO/DON'T patterns tied to this product’s behavior, fitting best-practices with expert operational guidance.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-endpoint-load-test,Load test vector search endpoints,Configure a load test for vector search endpoints - Azure Databricks,Design and run load tests for Vector Search endpoints,Learn how to configure a load test for a vector search endpoint. 
The example shows how to design and run a load test to optimize endpoint performance.,"This page provides guidance, example code, and an example notebook for load testing vector search endpoints. Load testing helps you understand the performance and production readiness of a vector search endpoint before it's deployed to production. Load testing can tell you about: For more information about load testing and related concepts, see Load testing for serving endpoints.",2026-04-03T08:00:00.000Z,how-to,best-practices,0.7,True,Provides detailed guidance and example code for load testing vector search endpoints; includes product-specific patterns and parameters for realistic performance evaluation.,unchanged
+https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-external-embedding-model-example,Call an OpenAI embeddings model,Vector Search external embedding model (OpenAI) example - Azure Databricks,Use Vector Search with OpenAI embeddings,Learn how to use the Vector Search Python SDK with an external embedding model (OpenAI) to create and query a Vector Search index.,"Open notebook version of this page This notebook shows how to use the Vector Search Python SDK, which provides a VectorSearchClient as a primary API for working with Vector Search. 
This notebook uses Databricks support of external models to access an OpenAI embeddings model to generate embeddings.",2026-04-22T17:34:00.000Z,tutorial,integrations,0.85,True,"Provides concrete code and configuration for using Databricks Vector Search with external OpenAI embedding models via Databricks external models, a product-specific integration pattern.",new
+https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-foundation-embedding-model-gte-example,Call a GTE embeddings model,Vector Search foundational embedding model (GTE) example - Azure Databricks,Use Vector Search with GTE foundation embeddings,Learn how to use the Vector Search Python SDK with a foundational embedding model (GTE) to create and query a Vector Search index.,"Open notebook version of this page This notebook shows how to use the Vector Search Python SDK, which provides a VectorSearchClient as a primary API for working with Vector Search. This notebook uses Databricks Foundation Model APIs to access the GTE embeddings model to generate embeddings.",2026-04-22T17:34:00.000Z,tutorial,integrations,0.85,True,"Shows how to integrate Databricks Vector Search with Databricks Foundation Model APIs (GTE embeddings), including SDK usage and parameters, which is a specific integration pattern.",new
+https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-oauth-token,Use Vector Search with an OAuth token,Use Vector Search with an OAuth token - Azure Databricks,Call Databricks Vector Search with OAuth tokens,Learn how to call a Vector Search endpoint using the Vector Search SDK or HTTP with a fresh OAuth token over the network optimized path.,"Open notebook version of this page This notebook shows how to call a Vector Search endpoint using the vector search SDK or HTTP with a fresh OAuth token. In both cases, the network optimized path is used, as is recommended for any production workload. 
The HTTP calls for creating a token and calling the endpoint can be implemented in a language of your choice. For production applications, keep in mind that the token must be refreshed every 60 minutes. To prevent errors due to a stale token, Datab",2026-04-22T17:34:00.000Z,tutorial,security,0.8,True,"Explains how to obtain and use OAuth tokens with the Vector Search SDK/HTTP, including token lifetime (60 minutes) and refresh considerations, which are product-specific security/auth configuration details.",new
+https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-python-sdk-example,Vector search with the Python SDK,Vector Search Python SDK example usage - Azure Databricks,Use the Databricks Vector Search Python SDK,"Learn how to use the Vector Search Python SDK, which provides a `VectorSearchClient` as a primary API for working with Vector Search.","Open notebook version of this page This notebook shows how to use the Vector Search Python SDK, which provides a VectorSearchClient as a primary API for working with Vector Search. Alternatively, you can call the REST API directly.",2026-04-22T17:34:00.000Z,tutorial,integrations,0.8,True,"Shows detailed usage of VectorSearchClient and related APIs, including parameters and request patterns unique to Databricks Vector Search.",new
https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-retrieval-quality,Quality guide,Vector search retrieval quality guide - Azure Databricks,Improve Mosaic AI Vector Search retrieval quality,"Learn how to optimize the retrieval quality of Mosaic AI Vector Search for RAG, search, and matching applications.","This guide provides a systematic approach to improving retrieval quality for real-time RAG, search, and matching applications using Mosaic AI Vector Search. 
The recommendations are ordered from highest impact/lowest effort to lowest impact/highest effort.",2026-03-27T08:00:00.000Z,best-practice,best-practices,0.8,True,"Systematic, ordered recommendations for tuning retrieval quality in RAG/search/matching; contains Databricks-specific guidance and patterns that go beyond generic RAG advice.",unchanged https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-unused-endpoints,Identify unused endpoints,Identify unused vector search endpoints - Azure Databricks,Identify and clean up unused Vector Search endpoints,Learn how to use audit logs to find Vector Search endpoints that have indexes but receive no query traffic.,This page describes how to use the audit log system table to find vector search endpoints that have indexes but receive no query traffic. Unused endpoints consume resources and incur costs without delivering value. Identifying unused endpoints helps you clean up unused resources and reduce costs.,2026-03-24T22:15:00.000Z,how-to,troubleshooting,0.7,True,Shows how to query audit log system tables to detect endpoints with no query traffic; includes product-specific diagnostic queries and symptom→action guidance.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/vector-search/vs-example-notebooks,Example notebooks,Vector search example notebooks - Azure Databricks,,Example notebooks illustrating how to use Mosaic AI Vector Search.,"The examples in this section demonstrate usage of the vector search Python SDK. 
For reference information, see the Python SDK reference.",2026-01-20T08:00:00.000Z,how-to,,0.3,False,"Index page for example notebooks; points to SDK reference but itself does not contain configuration tables, limits, or troubleshooting content.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/vector-search/vs-example-notebooks,Example notebooks,Vector search example notebooks - Azure Databricks,Run Databricks vector search example notebooks,Example notebooks illustrating how to use Mosaic AI Vector Search.,"The following notebooks show how to use the vector search Python SDK. For reference information, see the Python SDK reference.",2026-04-22T17:34:00.000Z,how-to,integrations,0.7,True,"Example notebooks demonstrate concrete usage of the Vector Search Python SDK with product-specific API calls and patterns, which are integration/coding patterns.",updated
https://learn.microsoft.com/en-us/azure/databricks/views/,Overview,What is a view? - Azure Databricks,,"Learn about Unity Catalog views, temp views, materialized views, and dynamic views in Azure Databricks.","A view is a read-only object that is the result of a query over one or more tables and views in a Unity Catalog metastore. You can create a view from tables and from other views in multiple schemas and catalogs. This article describes the views that you can create in Azure Databricks and provides an explanation of the permissions and compute required to query them. For information about creating views, see:",2026-04-09T08:00:00.000Z,overview,,0.2,False,"Conceptual explanation of what views are and high-level permissions/compute discussion. 
No detailed privilege tables, config parameters, or error codes; primarily overview content.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/views/create-views,Create views,Create and manage views - Azure Databricks,Create and manage Unity Catalog views in Databricks,Learn how to create and manage views on Azure Databricks.,This article shows how to create views in Unity Catalog. See What is a view?.,2026-04-08T22:17:00.000Z,how-to,configuration,0.65,True,"How-to page with concrete SQL patterns and options for creating/managing different view types in Unity Catalog. Contains product-specific syntax and behavior, fitting configuration/integration patterns rather than conceptual overview.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/views/dynamic,Dynamic views,Create a dynamic view - Azure Databricks,Implement dynamic views for fine-grained access control,Learn what a dynamic view is on Azure Databricks.,"In Unity Catalog, you can use dynamic views to configure fine-grained access control, including: Unity Catalog introduces the following functions, which allow you to dynamically limit which users can access a row, column, or record in a view: Azure Databricks recommends that you not grant users the ability to read the tables and views referenced in the view. 
The following examples illustrate how to create dynamic views in Unity Catalog.",2025-05-21T21:02:00.000Z,concept-article,security,0.85,True,"Introduces Unity Catalog-specific functions for row/column-level security and gives examples, which are detailed, product-specific security patterns.",unchanged
@@ -5055,7 +5105,7 @@ https://learn.microsoft.com/en-us/azure/databricks/visualizations/format-numeric
https://learn.microsoft.com/en-us/azure/databricks/visualizations/heatmap,Heatmap visualization,Heatmap options - Azure Databricks,Configure heatmap chart options in Databricks,Learn about heatmap visualization configuration options in Azure Databricks notebooks and Databricks SQL.,"This section covers the configuration options for heatmap chart visualizations in the SQL editor or a notebook. For an example, see heat map example. To learn how to work with visualizations in AI/BI dashboards, see Dashboard visualizations.",2026-03-06T08:00:00.000Z,reference,configuration,0.7,True,Details configuration options for heatmap charts; these are specific to Databricks visualization behavior.,unchanged
https://learn.microsoft.com/en-us/azure/databricks/visualizations/histogram,Histogram visualization,Histogram options - Azure Databricks,Configure histogram visualization options in Databricks,Learn about histogram visualization configuration options in Azure Databricks notebooks and Databricks SQL.,"This section covers the configuration options for histogram chart visualizations created in the SQL editor or a notebook. For an example, see histogram example. 
To learn how to work with visualizations in AI/BI dashboards, see Dashboard visualizations.",2026-03-06T08:00:00.000Z,reference,configuration,0.7,True,Provides product-specific histogram configuration options for Databricks notebooks and SQL editor.,unchanged
https://learn.microsoft.com/en-us/azure/databricks/visualizations/legacy-charts,Migrate legacy charts,Migrate legacy line charts - Azure Databricks,Migrate legacy line charts to new Databricks chart types,Learn about line charts in Azure Databricks and how to migrate from the legacy Line.,Learn about line charts in Azure Databricks and how to migrate from the legacy Line charts. Azure Databricks has three types of line charts: Line and legacy charts Line (v2) and Line (v1).,2024-03-01T08:00:00.000Z,upgrade-and-migration-article,deployment,0.6,True,"Focuses on migration from legacy to new line chart implementations, which is a product-specific upgrade/migration path akin to deployment guidance.",unchanged
-https://learn.microsoft.com/en-us/azure/databricks/visualizations/legacy-visualizations,Legacy visualizations,Legacy visualizations - Azure Databricks,Use and migrate Databricks legacy visualizations,Learn about legacy visualizations in Azure Databricks.,"This article describes legacy Azure Databricks visualizations. SeeVisualizations in Databricks notebooks and SQL editorfor current visualization support when creating visualizations in the SQL editor or a notebook. For information about working with visualizations in AI/BI dashboards, seeAI/BI dashboard visualization types. 
Azure Databricks also natively supports visualization libraries in Python and R and lets you install and use third-party libraries.",2026-03-06T08:00:00.000Z,overview,configuration,0.65,True,"Describes legacy visualization behavior and support in Databricks, including product-specific configuration and compatibility details.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/visualizations/legacy-visualizations,Legacy visualizations,Legacy visualizations - Azure Databricks,,Learn about legacy visualizations in Azure Databricks.,"This article describes legacy Azure Databricks visualizations. See Visualizations in Databricks notebooks and SQL editor for current visualization support when creating visualizations in the SQL editor or a notebook. For information about working with visualizations in AI/BI dashboards, see AI/BI dashboard visualization types. Azure Databricks also natively supports visualization libraries in Python and R and lets you install and use third-party libraries.",2026-04-23T08:00:00.000Z,overview,,0.2,False,"Appears to be a conceptual/feature description of legacy visualizations and references to other visualization docs. No clear evidence of detailed configuration tables, limits, error codes, or product-specific patterns that meet the expert-knowledge criteria.",updated
https://learn.microsoft.com/en-us/azure/databricks/visualizations/maps,Map visualizations,Map options - Azure Databricks,Configure map visualizations in Databricks,Learn about map visualization configuration options in Azure Databricks notebooks and Databricks SQL.,"The map visualizations display results on a geographic map. The query result set must include the appropriate geographic data: This page describes the configuration options available when creating visualizations in the SQL editor or a notebook. 
To learn how to work with visualizations in AI/BI dashboards, see Dashboard visualizations.",2026-03-06T08:00:00.000Z,reference,configuration,0.7,True,"Describes Databricks-specific configuration options for map visualizations, including required geographic data fields.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/visualizations/tables,Table visualization,Table options - Azure Databricks,Customize Databricks table visualization options,Learn about table visualization configuration options in Azure Databricks notebooks and Databricks SQL.,"With Azure Databricks table visualizations you can manually pin, reorder, or hide columns. You can also format the data in a column. This article describes how you can control data presentation in table visualizations. A table visualization can be manipulated independently of the original cell results table. You can: This page describes the configuration options available when creating visualizations in the SQL editor or a notebook. To learn how to work with visualizations in AI/BI dashboards, s",2026-03-06T08:00:00.000Z,reference,configuration,0.75,True,"Explains specific table visualization behaviors (pin, reorder, hide, formatting) that are configuration details unique to Databricks.",unchanged
https://learn.microsoft.com/en-us/azure/databricks/visualizations/visualization-types,Visualization types,Notebook and SQL editor visualization types - Azure Databricks,,"Learn about the types of visualizations available in Databricks notebooks and Databricks SQL, including bar charts, line charts, maps, and more. For AI/BI dashboard visualizations, see dashboard visua","This page outlines the types of visualizations available to use in Azure Databricks notebooks and in the SQL editor, and shows you how to create an example of each visualization type. Important This page covers visualizations for Azure Databricks notebooks and the SQL editor. 
For visualizations in AI/BI dashboards, seeAI/BI dashboard visualization types.",2026-03-06T08:00:00.000Z,reference,,0.25,False,"Lists visualization types and examples; lacks deep config matrices, limits, or decision/troubleshooting content.",unchanged @@ -5066,13 +5116,13 @@ https://learn.microsoft.com/en-us/azure/databricks/volumes/privileges,Volume pri https://learn.microsoft.com/en-us/azure/databricks/volumes/unstructured-data-tutorial,Work with unstructured data,Work with unstructured data in volumes - Azure Databricks,,"Learn how to store, query, process, and share unstructured data files using Unity Catalog volumes.","This page shows you how to store, query, and process unstructured data files using Unity Catalog volumes. You'll learn how to upload files, query metadata, process files with AI functions, apply access control, and share volumes with other organizations. Where possible, instructions for working through this tutorial using the Catalog Explorer UI have been included. If noCatalog Exploreroption is shown, use the provided Python or SQL commands. For a complete overview of volume capabilities and us",2026-02-18T22:10:00.000Z,tutorial,,0.4,False,Tutorial-style walkthrough for unstructured data; mostly step-by-step usage without deep config matrices or limits.,unchanged https://learn.microsoft.com/en-us/azure/databricks/volumes/utility-commands,Create and manage volumes,Create and manage Unity Catalog volumes - Azure Databricks,Create and manage Unity Catalog volumes in Databricks,"Learn how to create, drop, and manage permissions and ownership for volumes on Azure Databricks.","This page contains syntax examples for creating, managing, and dropping Unity Catalog volumes.",2026-04-06T18:30:00.000Z,how-to,configuration,0.7,True,"Provides concrete syntax and options for CREATE/DROP/MANAGE volumes, including product-specific parameters and patterns for Unity Catalog volumes. 
This is configuration-level expert knowledge beyond generic LLM understanding.",unchanged https://learn.microsoft.com/en-us/azure/databricks/volumes/volume-files,Work with files in volumes,Work with files in Unity Catalog volumes - Azure Databricks,Manage files in Unity Catalog volumes across tools,Learn about how to manage files in Unity Catalog volumes.,"This page has examples for managing files in Unity Catalogvolumesfor various user interfaces, tools, libraries, and languages. Databricks recommends using volumes for managing all access to non-tabular data in cloud object storage and for storing workload support files. Examples include the following: Volumes provide Portable Operating System Interface (POSIX)-style paths that work with Filesystem in Userspace (FUSE)-dependent tools and frameworks. This makes them ideal for machine learning fram",2026-03-31T08:00:00.000Z,feature-guide,integrations,0.6,True,Shows concrete path formats and usage patterns for volumes with FUSE-dependent tools and multiple languages; product-specific file access patterns and parameters.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/workspace/,Overview,Workspace UI - Azure Databricks,,Learn how to navigate a Databricks workspace and access features using the Databricks unified navigation experience.,The Azure Databricks workspace is your central hub for accessing all Azure Databricks objects and features.,2026-03-10T23:23:00.000Z,concept-article,,0.2,False,"Workspace UI overview; navigational and conceptual, not configuration or limits-focused.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/workspace/databricks-one,Databricks One,What is Databricks One? 
- Azure Databricks,,"Learn about Databricks One, a simplified UI for business users that provides a single entry point for AI/BI dashboards, Genie, and Databricks Apps.","Databricks One is a user interface designed for business users, giving them a single, intuitive entry point to interact with data and AI in Azure Databricks, without needing to navigate technical concepts such as clusters, queries, models, or notebooks.",2026-04-08T08:00:00.000Z,concept-article,,0.1,False,"High-level description of Databricks One UI for business users; marketing/overview style without detailed settings, limits, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/workspace/databricks-one-account,Account-level Databricks One,Account-level Databricks One - Azure Databricks,,"Learn about account-level Databricks One, a unified analytics view across all workspaces in your Databricks account.","Account-level Databricks One is a unified analytics view across all workspaces in your Azure Databricks account. Business users can discover and interact with shared assets (such as AI/BI dashboards, Genie spaces, and Databricks Apps) from multiple workspaces in a single entry point. Unlike the workspace-level Databricks One experience, which is scoped to a single workspace, account-level Databricks One lets users view and navigate assets from any workspace they have access to in the account.",2026-04-09T17:57:00.000Z,concept-article,,0.1,False,"Conceptual overview of account-level Databricks One; explains what it is and how it differs from workspace-level, but lacks specific quotas, configuration parameters, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/workspace/databricks-one-chat,Chat,Chat in Databricks One - Azure Databricks,,"Use Chat in Databricks One to ask data questions in natural language. 
Chat leverages existing dashboards, queries, and Genie spaces to answer your questions with all available data.","Important This feature is inBeta. Workspace admins can control access to this feature from thePreviewspage. SeeManage Azure Databricks previews. Chat provides a unified, full-screen interface for asking data questions in natural language. Chat uses existing dashboards, queries, Genie spaces, and metric views to answer your questions with all available data.",2026-04-16T08:00:00.000Z,concept-article,,0.1,False,"Feature overview of Chat in Databricks One; summary only mentions what the feature does and that it is in beta, with no indication of numeric limits, configuration tables, error codes, or other expert-only details.",updated -https://learn.microsoft.com/en-us/azure/databricks/workspace/navigate-workspace,Navigate the workspace,Navigate the Lakehouse workspace UI - Azure Databricks,,"Navigate the Databricks workspace UI - homepage, sidebar, search, and core features.","Learn how to navigate the Lakehouse workspace UI and find the features you need. This article explains the core concepts of the Azure Databricks workspace UI, an environment for authoring and accessing all of your Azure Databricks objects. You can manage workspace assets and settings using the workspace UI, theDatabricks CLI, and theWorkspace API. 
Most of the articles in the Azure Databricks documentation focus on performing tasks using the Lakehouse workspace UI.",2026-02-25T08:00:00.000Z,concept-article,,0.25,False,"UI navigation guide for the workspace; describes screens and basic actions without deep configuration tables, limits, or troubleshooting content.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/workspace/,Overview,Workspace UI - Azure Databricks,,Learn how to navigate a Databricks workspace and access features using the Databricks unified navigation experience.,The Azure Databricks workspace is your central hub for accessing all Azure Databricks objects and features.,2026-04-22T08:00:00.000Z,landing-page,,0.1,False,"Workspace UI overview and navigation; conceptual and UX-focused without detailed configuration parameters, limits, or error-resolution content.",updated +https://learn.microsoft.com/en-us/azure/databricks/workspace/databricks-one,Databricks One,What is Databricks One? - Azure Databricks,,"Learn about Databricks One, a simplified UI for business users that provides a single entry point for AI/BI dashboards, Genie, and Databricks Apps.","Databricks One is a user interface designed for business users, giving them a single, intuitive entry point to interact with data and AI in Azure Databricks, without needing to navigate technical concepts such as clusters, queries, models, or notebooks.",2026-04-22T08:00:00.000Z,overview,,0.1,False,"High-level description of Databricks One for business users; marketing/overview style without detailed technical limits, configs, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/databricks/workspace/databricks-one-account,Account-level Databricks One,Account-level Databricks One - Azure Databricks,,"Learn about account-level Databricks One, a unified analytics view across all workspaces in your Databricks account.","Account-level Databricks One is a unified analytics view across all workspaces in your Azure Databricks 
account. Business users can discover and interact with shared assets (such as AI/BI dashboards, Genie spaces, and Databricks Apps) from multiple workspaces in a single entry point. Unlike the workspace-level Databricks One experience, which is scoped to a single workspace, account-level Databricks One lets users view and navigate assets from any workspace they have access to in the account.",2026-04-22T08:00:00.000Z,feature-guide,,0.1,False,"Account-level Databricks One overview; focuses on what it is and how it scopes assets, not on detailed technical configuration or constraints.",updated
+https://learn.microsoft.com/en-us/azure/databricks/workspace/databricks-one-chat,Chat,Chat in Databricks One - Azure Databricks,,"Use Chat in Databricks One to ask data questions in natural language. Chat leverages existing dashboards, queries, and Genie spaces to answer your questions with all available data.","Important This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Azure Databricks previews. Chat provides a unified, full-screen interface for asking data questions in natural language. Chat uses existing dashboards, queries, Genie spaces, and metric views to answer your questions with all available data.",2026-04-22T08:00:00.000Z,how-to,,0.2,False,"Describes Chat in Databricks One at a feature level; no specific configuration tables, error codes, or quantified limits evident from the summary.",updated
+https://learn.microsoft.com/en-us/azure/databricks/workspace/navigate-workspace,Navigate the workspace,Navigate the Lakehouse workspace UI - Azure Databricks,,"Navigate the Databricks workspace UI - homepage, sidebar, search, and core features.","Learn how to navigate the Lakehouse workspace UI and find the features you need. This article explains the core concepts of the Azure Databricks workspace UI, an environment for authoring and accessing all of your Azure Databricks objects. 
You can manage workspace assets and settings using the workspace UI, the Databricks CLI, and the Workspace API. Most of the articles in the Azure Databricks documentation focus on performing tasks using the Lakehouse workspace UI.",2026-04-22T08:00:00.000Z,how-to,,0.1,False,"Explains how to navigate the Lakehouse workspace UI; primarily conceptual navigation guidance, not expert-level configuration, limits, or troubleshooting.",updated
https://learn.microsoft.com/en-us/azure/databricks/workspace/per-workspace-urls,Per-workspace URLs,Per-workspace URLs - Azure Databricks,,Learn about Azure Databricks workspace URLs and how to start using the unique per-workspace URL.,"In April 2020, Azure Databricks added a new unique per-workspace URL for each workspace. This per-workspace URL has the format adb-..azuredatabricks.net The per-workspace URL replaces the deprecated regional URL (.azuredatabricks.net) to access workspaces. Important Avoid using legacy regional URLs. They may not work for new workspaces, are less reliable, and exhibit lower performance than per-workspace URLs.",2024-12-10T08:00:00.000Z,concept-article,,0.45,False,"Describes per-workspace URL format and deprecation of regional URLs; contains a URL pattern but not limits, configs, or decision matrices.",unchanged
+https://learn.microsoft.com/en-us/azure/databricks/workspace/workspace-assets,Introduction to workspace objects,Introduction to workspace objects - Azure Databricks,,Get a high-level overview of the objects you can operate on in an Azure Databricks workspace.,"This article provides a high-level introduction to Azure Databricks workspace objects. You can create, view, and organize workspace objects in the workspace browser across personas.",2026-04-22T08:00:00.000Z,concept-article,,0.1,False,"High-level introduction to workspace objects; conceptual overview rather than detailed configuration, limits, or error-handling content.",updated
+https://learn.microsoft.com/en-us/azure/databricks/workspace/workspace-browser,Workspace browser and file management,Workspace browser - Azure Databricks,,"Learn how to create, view, and organize objects in the workspace browser.","With the workspace browser you can create, browse, and organize Azure Databricks objects, including notebooks, libraries, experiments, queries, dashboards, and alerts, in a single place. You can then share objects and assign permissions at the folder level to organize objects by team or project. You can also browse content in Databricks Git folders. 
You can create, view, and organize workspace objects in the workspace browser across personas.",2026-03-18T08:00:00.000Z,concept-article,,0.2,False,High-level introduction to workspace objects; describes object types conceptually without detailed configuration or security matrices.,unchanged -https://learn.microsoft.com/en-us/azure/databricks/workspace/workspace-browser,Workspace browser and file management,Workspace browser - Azure Databricks,,"Learn how to create, view, and organize objects in the workspace browser.","With the workspace browser you can create, browse, and organize Azure Databricks objects, including notebooks, libraries, experiments, queries, dashboards, and alerts, in a single place. You can then share objects and assign permissions at the folder level to organize objects by team or project. You can also browse content in Databricks Git folders. The workspace browser introduces a contextual browser that allows you to browse content, including content in Git folders, from within a notebook.",2026-03-05T20:31:00.000Z,concept-article,,0.3,False,"Workspace browser usage and organization; operational UI guidance, not configuration reference or limits.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/workspace/workspace-details,Get identifiers for workspace objects,Get identifiers for workspace objects - Azure Databricks,,"Learn how to get your workspace instance name and ID, cluster URLs, notebook URLs, model IDs, and job URLs in Azure Databricks.","This article explains how to get workspace, classic compute, dashboard, directory, model, notebook, and job identifiers and URLs in Azure Databricks.",2025-11-05T08:00:00.000Z,concept-article,,0.35,False,"Shows how to obtain various IDs and URLs; while useful, it is procedural and does not present configuration tables, limits, or error-resolution mappings.",unchanged -https://learn.microsoft.com/en-us/azure/databricks/workspace/workspace-objects,Manage workspace objects,Manage workspace 
objects - Azure Databricks,,Learn how to work with folders and other Azure Databricks workspace objects.,"The objects stored in the Workspace root folder are folders, Git folders, notebooks, files (in Databricks Runtime 11.3 LTS and above), queries, dashboards, Genie spaces, alerts, and experiments. To perform an action on a Workspace object, right-click the object or click to the right of the object. From the drop-down menu you can: In addition to the procedures listed in this article, you can also do the following:",2026-02-23T08:00:00.000Z,concept-article,,0.25,False,"Explains how to manage workspace objects via UI actions; lacks detailed configuration parameters, limits, or troubleshooting mappings.",unchanged +https://learn.microsoft.com/en-us/azure/databricks/workspace/workspace-assets,Introduction to workspace objects,Introduction to workspace objects - Azure Databricks,,Get a high-level overview of the objects you can operate on in an Azure Databricks workspace.,"This article provides a high-level introduction to Azure Databricks workspace objects. You can create, view, and organize workspace objects in the workspace browser across personas.",2026-04-22T08:00:00.000Z,concept-article,,0.1,False,"High-level introduction to workspace objects; conceptual overview rather than detailed configuration, limits, or error-handling content.",updated +https://learn.microsoft.com/en-us/azure/databricks/workspace/workspace-browser,Workspace browser and file management,Workspace browser - Azure Databricks,,"Learn how to create, view, and organize objects in the workspace browser.","With the workspace browser you can create, browse, and organize Azure Databricks objects, including notebooks, libraries, experiments, queries, dashboards, and alerts, in a single place. You can then share objects and assign permissions at the folder level to organize objects by team or project. You can also browse content in Databricks Git folders.
The workspace browser introduces a contextual browser that allows you to browse content, including content in Git folders, from within a notebook.",2026-04-22T08:00:00.000Z,how-to,,0.25,False,"Explains how to create and organize workspace objects; likely procedural UI guidance without detailed parameter tables, limits, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/workspace/workspace-details,Get identifiers for workspace objects,Get identifiers for workspace objects - Azure Databricks,,"Learn how to get your workspace instance name and ID, cluster URLs, notebook URLs, model IDs, and job URLs in Azure Databricks.","This article explains how to get workspace, classic compute, dashboard, directory, model, notebook, and job identifiers and URLs in Azure Databricks.",2026-04-22T08:00:00.000Z,reference,,0.3,False,"Explains how to obtain various IDs and URLs for workspace objects; useful but primarily step-by-step retrieval instructions, not configuration, limits, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/databricks/workspace/workspace-objects,Manage workspace objects,Manage workspace objects - Azure Databricks,,Learn how to work with folders and other Azure Databricks workspace objects.,"The objects stored in the Workspace root folder are folders, Git folders, notebooks, files (in Databricks Runtime 11.3 LTS and above), queries, dashboards, Genie spaces, alerts, and experiments. To perform an action on a Workspace object, right-click the object or click to the right of the object. From the drop-down menu you can: In addition to the procedures listed in this article, you can also do the following:",2026-04-22T08:00:00.000Z,how-to,,0.25,False,"Covers managing workspace objects (folders, notebooks, etc.)
via UI actions; procedural but not focused on expert-level configuration parameters or limits.",updated diff --git a/products/azure-databricks/report.md b/products/azure-databricks/report.md index a37b07ca..9a29764a 100644 --- a/products/azure-databricks/report.md +++ b/products/azure-databricks/report.md @@ -1,60 +1,62 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: security: Identity, access control, encryption, networking, compliance, and secure - integrations for Azure Databricks, Unity Catalog, Lakeflow, Lakebase, apps, and - external data sources. - configuration: 'Configuring and administering Azure Databricks: accounts, workspaces, - security, networking, compute, storage, jobs, ML/serving, Lakehouse/Unity Catalog, - Lakeflow, apps, and system-table–based monitoring.' - decision-making: Guides for choosing Azure Databricks architectures, compute, runtimes, - ML/LLM options, and detailed migration paths (Unity Catalog, Delta, SQL, Connect, - MLflow, serverless, Lakebase, and networking). - limits-quotas: Limits, quotas, and constraints for Databricks compute, AI/BI, connectors, - Lakeflow, Lakebase, model serving, tokens, data types, and Unity Catalog resources, - plus related workarounds. - best-practices: End-to-end Databricks best practices for performance, cost, governance, - streaming, ML/LLM/RAG, BI, Lakeflow, Vector Search, and operational reliability - across Azure Databricks workloads. - architecture-patterns: 'Architectural blueprints and patterns for Databricks: lakehouse, - networking, storage, HA/DR, governance, performance, ML/MLOps, RAG/agents, Lakebase, - streaming, and external data access.' + integrations for Azure Databricks, Unity Catalog, Lakebase, Delta Sharing, Apps, + and ingestion workloads. 
+ configuration: 'Configuring and managing Azure Databricks: accounts, workspaces, + security, networking, compute, storage, jobs, streaming, Unity Catalog, ML/serving, + Marketplace, Lakeflow, and admin tooling.' + decision-making: Guides for choosing and migrating Databricks runtimes, compute, + storage, ML/LLM options, and Unity Catalog, plus cost, sizing, and architecture + decisions for production lakehouse workloads. + limits-quotas: Limits, quotas, and constraints for Azure Databricks compute, AI/BI, + connectors, Lakehouse/Lakebase, SQL types, streaming, Git/Repos, model serving, + and Unity Catalog resources. + best-practices: Best-practice guidance for configuring, optimizing, securing, governing, + and operating Azure Databricks and Lakehouse workloads across BI, streaming, ML/LLM, + Lakeflow, Vector Search, and cost management. + architecture-patterns: 'Architectural blueprints and patterns for Databricks: lakehouse + design, networking, storage, HA/DR, governance, performance, cost, streaming, + ML/MLOps, RAG, and AI/agent systems.' + deployment: Deploying and managing Azure Databricks workspaces, CI/CD, IaC, jobs, + model/feature serving, AI agents, bundles, and related tooling (CLI, Terraform, + GitHub Actions, Azure DevOps). + troubleshooting: 'Diagnosing and fixing Azure Databricks issues: cluster startup/termination, + Spark and SQL errors, connectors/Lakeflow ingestion, jobs/pipelines, CLI/VS Code, + Feature Store, model serving, and AI agents.' integrations: Patterns and APIs for integrating Databricks with external data systems, - tools, and AI/ML frameworks, plus detailed PySpark/SQL function references and - Lakehouse Federation/streaming examples. - troubleshooting: Diagnosing and fixing Databricks errors, job and compute failures, - connector/ingestion issues, SQL error codes, and performance/debugging problems - across Spark, AI, Lakeflow, and tooling. 
- deployment: Deploying and operating Databricks apps, agents, models, jobs, and infrastructure - using CI/CD, IaC, bundles, serving, Terraform, Git, and region/release planning. + BI/AI tools, agents, streaming, ML/LLM serving, Lakehouse Federation, and low-level + Spark/SQL/PySpark coding primitives. skill_description: Expert knowledge for Azure Databricks development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when - using Unity Catalog, Lakehouse/Lakebase, Lakeflow pipelines, Vector Search/RAG, - or model serving, and other Azure Databricks related development tasks. Not for - Azure Synapse Analytics (use azure-synapse-analytics), Azure HDInsight (use azure-hdinsight), - Azure Machine Learning (use azure-machine-learning), Azure Data Factory (use azure-data-factory). + using Unity Catalog, Lakehouse/Lakebase, Lakeflow pipelines, Vector Search, or AI/agent + & model serving, and other Azure Databricks related development tasks. Not for Azure + Synapse Analytics (use azure-synapse-analytics), Azure HDInsight (use azure-hdinsight), + Azure Machine Learning (use azure-machine-learning), Azure Data Explorer (use azure-data-explorer). use_when: Use when using Unity Catalog, Lakehouse/Lakebase, Lakeflow pipelines, Vector - Search/RAG, or model serving, and other Azure Databricks related development tasks. + Search, or AI/agent & model serving, and other Azure Databricks related development + tasks. confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics), Azure HDInsight (use azure-hdinsight), Azure Machine Learning (use azure-machine-learning), - Azure Data Factory (use azure-data-factory). + Azure Data Explorer (use azure-data-explorer). 
--- # Azure Databricks Crawl Report ## Summary -- **Total Pages**: 4808 -- **Fetched**: 4808 +- **Total Pages**: 4858 +- **Fetched**: 4858 - **Fetch Failed**: 0 -- **Classified**: 3218 -- **Unclassified**: 1590 +- **Classified**: 3254 +- **Unclassified**: 1604 ### Incremental Update -- **New Pages**: 459 -- **Updated Pages**: 513 -- **Unchanged**: 3836 -- **Deleted Pages**: 41 +- **New Pages**: 54 +- **Updated Pages**: 397 +- **Unchanged**: 4407 +- **Deleted Pages**: 4 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-databricks/azure-databricks.csv` ## Classification Statistics @@ -62,139 +64,120 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | Type | Count | Percentage | |------|-------|------------| | architecture-patterns | 39 | 0.8% | -| best-practices | 171 | 3.6% | -| configuration | 692 | 14.4% | -| decision-making | 84 | 1.7% | -| deployment | 34 | 0.7% | -| integrations | 1696 | 35.3% | -| limits-quotas | 77 | 1.6% | -| security | 327 | 6.8% | -| troubleshooting | 98 | 2.0% | -| *(Unclassified)* | 1590 | 33.1% | +| best-practices | 168 | 3.5% | +| configuration | 676 | 13.9% | +| decision-making | 87 | 1.8% | +| deployment | 39 | 0.8% | +| integrations | 1740 | 35.8% | +| limits-quotas | 75 | 1.5% | +| security | 335 | 6.9% | +| troubleshooting | 95 | 2.0% | +| *(Unclassified)* | 1604 | 33.0% | ## Changes ### New Pages -- [Unity Catalog connections](https://learn.microsoft.com/en-us/azure/databricks/connect/uc-connections) -- [Connect to external HTTP services](https://learn.microsoft.com/en-us/azure/databricks/query-federation/http) -- [Delta client access](https://learn.microsoft.com/en-us/azure/databricks/external-access/unity-rest) -- [Query-based connectors](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/query-based-overview) -- [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/query-based-pipeline) -- 
[Limitations](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/query-based-limits) -- [Reference](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/query-based-reference) -- [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/query-based-troubleshoot) -- [Create multiple flows with metaprogramming](https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-metaprogramming) -- [Use SQL and For each tasks for metadata-driven orchestration](https://learn.microsoft.com/en-us/azure/databricks/jobs/how-to/foreach-sql-lookup-tutorial) -- [Use Git with jobs](https://learn.microsoft.com/en-us/azure/databricks/jobs/git) -- [Customer-managed keys](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/customer-managed-keys) -- [Explore sample data](https://learn.microsoft.com/en-us/azure/databricks/genie-code/sample-data-explorer) -- [Govern LLMs and MCPs (AI Gateway)](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/) -- [LLMs](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/overview-beta) -- [Host a custom MCP server](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/custom-mcp) -- [Model serving endpoints (previous)](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/overview-serving-endpoints) -- [Retrieval quality evaluation](https://learn.microsoft.com/en-us/azure/databricks/vector-search/retrieval-quality-eval) -- [Tutorial: Intelligent document processing pipeline](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/idp-pipeline-tutorial) -- [Manage production scorers](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/manage-production-scorers) -- *...and 439 more* +- [TLS server certificate validation](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/tls-server-certificate-validation) +- 
[Overview](https://learn.microsoft.com/en-us/azure/databricks/designer/) +- [What is Lakeflow Designer?](https://learn.microsoft.com/en-us/azure/databricks/designer/what-is-lakeflow-designer) +- [Create a Visual data prep](https://learn.microsoft.com/en-us/azure/databricks/designer/build-transformation) +- [Ingest data](https://learn.microsoft.com/en-us/azure/databricks/designer/ingest-data) +- [Built-in operators](https://learn.microsoft.com/en-us/azure/databricks/designer/built-in-operators) +- [Transfer object ownership](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/transfer-object-ownership) +- [AI governance (Unity AI Gateway)](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/) +- [Declarative features](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/declarative-apis) +- [Declarative features API reference](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/declarative-api-reference) +- [Materialize declarative features](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/materialized-features) +- [Use declarative features](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/train-with-declarative-features) +- [Serve features](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/online-workflows) +- [Serve declarative features](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/serve-declarative-features) +- [Vector search with the Python SDK](https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-python-sdk-example) +- [Call an OpenAI embeddings model](https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-external-embedding-model-example) +- [Call a GTE embeddings model](https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-foundation-embedding-model-gte-example) +- 
[Register and serve an OSS embedding model](https://learn.microsoft.com/en-us/azure/databricks/vector-search/embedding-with-oss-models) +- [Use Vector Search with an OAuth token](https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-oauth-token) +- [Supervisor API](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/supervisor-api) +- *...and 34 more* ### Updated Pages -- [Overview](https://learn.microsoft.com/en-us/azure/databricks/connect/) - - Updated: 2026-02-11T08:00:00.000Z → 2026-04-17T18:03:00.000Z -- [Configure managed storage](https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/managed-storage) - - Updated: 2026-01-20T08:00:00.000Z → 2026-04-17T18:03:00.000Z -- [Concepts](https://learn.microsoft.com/en-us/azure/databricks/query-federation/database-federation) - - Updated: 2026-03-23T08:00:00.000Z → 2026-04-17T21:49:00.000Z -- [Free Edition limitations](https://learn.microsoft.com/en-us/azure/databricks/getting-started/free-edition-limitations) - - Updated: 2026-03-27T08:00:00.000Z → 2026-04-14T08:00:00.000Z +- [ORC files](https://learn.microsoft.com/en-us/azure/databricks/query/formats/orc) + - Updated: 2026-03-31T23:28:00.000Z → 2026-04-23T08:00:00.000Z +- [JSON files](https://learn.microsoft.com/en-us/azure/databricks/query/formats/json) + - Updated: 2026-04-01T08:00:00.000Z → 2026-04-23T08:00:00.000Z +- [Legacy visualizations](https://learn.microsoft.com/en-us/azure/databricks/visualizations/legacy-visualizations) + - Updated: 2026-03-06T08:00:00.000Z → 2026-04-23T08:00:00.000Z +- [Query and visualize data](https://learn.microsoft.com/en-us/azure/databricks/getting-started/quick-start) + - Updated: 2025-10-24T20:03:00.000Z → 2026-04-24T18:05:00.000Z +- [Train and deploy an ML model](https://learn.microsoft.com/en-us/azure/databricks/getting-started/ml-get-started) + - Updated: 2026-03-25T08:00:00.000Z → 2026-04-20T22:01:00.000Z +- 
[Overview](https://learn.microsoft.com/en-us/azure/databricks/workspace/) + - Updated: 2026-03-10T23:23:00.000Z → 2026-04-22T08:00:00.000Z +- [Navigate the workspace](https://learn.microsoft.com/en-us/azure/databricks/workspace/navigate-workspace) + - Updated: 2026-02-25T08:00:00.000Z → 2026-04-22T08:00:00.000Z +- [Databricks One](https://learn.microsoft.com/en-us/azure/databricks/workspace/databricks-one) + - Updated: 2026-04-08T08:00:00.000Z → 2026-04-22T08:00:00.000Z - [Chat](https://learn.microsoft.com/en-us/azure/databricks/workspace/databricks-one-chat) - - Updated: 2026-04-03T17:47:00.000Z → 2026-04-16T08:00:00.000Z -- [Tutorial: EDA techniques](https://learn.microsoft.com/en-us/azure/databricks/notebooks/eda-tutorial) - - Updated: 2026-03-10T23:23:00.000Z → 2026-04-13T08:00:00.000Z -- [Workspace files overview](https://learn.microsoft.com/en-us/azure/databricks/files/workspace) - - Updated: 2026-02-25T08:00:00.000Z → 2026-04-16T08:00:00.000Z -- [PEM private key](https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-pem) - - Updated: 2026-03-12T08:00:00.000Z → 2026-04-13T08:00:00.000Z -- [Concepts](https://learn.microsoft.com/en-us/azure/databricks/query-federation/catalog-federation) - - Updated: 2026-02-20T23:39:00.000Z → 2026-04-17T08:00:00.000Z -- [Explore database objects](https://learn.microsoft.com/en-us/azure/databricks/discover/database-objects) - - Updated: 2026-03-26T08:00:00.000Z → 2026-04-15T08:00:00.000Z -- [Overview](https://learn.microsoft.com/en-us/azure/databricks/external-access/) - - Updated: 2026-01-20T08:00:00.000Z → 2026-04-14T17:47:00.000Z -- [Credential vending](https://learn.microsoft.com/en-us/azure/databricks/external-access/credential-vending) - - Updated: 2026-03-20T08:00:00.000Z → 2026-04-14T17:47:00.000Z -- [Selective overwrite](https://learn.microsoft.com/en-us/azure/databricks/delta/selective-overwrite) - - Updated: 2026-03-17T18:09:00.000Z → 2026-04-13T21:52:00.000Z -- 
[Concepts](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/) - - Updated: 2026-03-31T23:28:00.000Z → 2026-04-15T22:08:00.000Z -- [Pipeline tagging](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/pipeline-tags) - - Updated: 2026-02-04T23:58:00.000Z → 2026-04-16T17:44:00.000Z -- [Limitations](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/hubspot-limits) - - Updated: 2026-04-02T22:35:00.000Z → 2026-04-17T18:03:00.000Z -- [Configuring for production](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/production) - - Updated: 2026-02-25T08:00:00.000Z → 2026-04-14T21:45:00.000Z -- [Options](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/options) - - Updated: 2026-04-09T08:00:00.000Z → 2026-04-14T08:00:00.000Z -- [Overview](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/) - - Updated: 2026-02-23T23:41:00.000Z → 2026-04-17T21:49:00.000Z -- [Use temporary credentials](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/temporary-credentials) - - Updated: 2026-03-31T23:28:00.000Z → 2026-04-13T21:52:00.000Z -- *...and 493 more* + - Updated: 2026-04-16T08:00:00.000Z → 2026-04-22T08:00:00.000Z +- [Account-level Databricks One](https://learn.microsoft.com/en-us/azure/databricks/workspace/databricks-one-account) + - Updated: 2026-04-09T17:57:00.000Z → 2026-04-22T08:00:00.000Z +- [Workspace browser and file management](https://learn.microsoft.com/en-us/azure/databricks/workspace/workspace-browser) + - Updated: 2026-03-05T20:31:00.000Z → 2026-04-22T08:00:00.000Z +- [Introduction to workspace objects](https://learn.microsoft.com/en-us/azure/databricks/workspace/workspace-assets) + - Updated: 2026-03-18T08:00:00.000Z → 2026-04-22T08:00:00.000Z +- [Manage workspace 
objects](https://learn.microsoft.com/en-us/azure/databricks/workspace/workspace-objects) + - Updated: 2026-02-23T08:00:00.000Z → 2026-04-22T08:00:00.000Z +- [Get identifiers for workspace objects](https://learn.microsoft.com/en-us/azure/databricks/workspace/workspace-details) + - Updated: 2025-11-05T08:00:00.000Z → 2026-04-22T08:00:00.000Z +- [Oracle](https://learn.microsoft.com/en-us/azure/databricks/query-federation/oracle) + - Updated: 2026-03-31T23:28:00.000Z → 2026-04-23T08:00:00.000Z +- [OAuth access token](https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-oauth-access-token) + - Updated: 2026-03-04T19:15:00.000Z → 2026-04-23T08:00:00.000Z +- [Connect to external HTTP services](https://learn.microsoft.com/en-us/azure/databricks/query-federation/http) + - Updated: 2026-04-16T08:00:00.000Z → 2026-04-23T08:00:00.000Z +- [Migrate to serverless routing](https://learn.microsoft.com/en-us/azure/databricks/query-federation/http-migration) + - Updated: 2026-03-24T01:32:00.000Z → 2026-04-23T08:00:00.000Z +- [Add AI-generated comments](https://learn.microsoft.com/en-us/azure/databricks/comments/ai-comments) + - Updated: 2026-02-26T08:00:00.000Z → 2026-04-23T17:47:00.000Z +- [Create service credentials](https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-services/service-credentials) + - Updated: 2026-01-24T08:00:00.000Z → 2026-04-23T08:00:00.000Z +- *...and 377 more* ### Deleted Pages -- ~~Attribute usage using serverless budget policies~~ (https://learn.microsoft.com/en-us/azure/databricks/admin/usage/budget-policies) -- ~~AI Gateway~~ (https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/) -- ~~AI Gateway (Beta)~~ (https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/overview-beta) -- ~~Serving endpoints~~ (https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/overview-serving-endpoints) -- ~~Create Delta tables~~ 
(https://learn.microsoft.com/en-us/azure/databricks/external-access/create-external-tables) -- ~~Delta client reads~~ (https://learn.microsoft.com/en-us/azure/databricks/external-access/unity-rest) -- ~~Custom MCP server~~ (https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/custom-mcp) -- ~~Build, evaluate, and deploy a retrieval agent~~ (https://learn.microsoft.com/en-us/azure/databricks/generative-ai/tutorials/agent-framework-notebook) -- ~~Query Unity Catalog metric views~~ (https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi-metric-views) -- ~~Catalog~~ (https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog) -- ~~Column~~ (https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column) -- ~~DataFrame~~ (https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe) -- ~~DataFrameNaFunctions~~ (https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframenafunctions) -- ~~DataFrameReader~~ (https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader) -- ~~DataFrameStatFunctions~~ (https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframestatfunctions) -- ~~DataFrameWriter~~ (https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter) -- ~~DataFrameWriterV2~~ (https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2) -- ~~DataSource~~ (https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasource) -- ~~DataSourceArrowWriter~~ (https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcearrowwriter) -- ~~DataSourceReader~~ (https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcereader) -- *...and 21 more* +- ~~Govern LLMs and MCPs (AI Gateway)~~ (https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/) +- 
~~Dashboard local metric views~~ (https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/data-modeling/local-metric-views) +- ~~Use features in online applications~~ (https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/online-workflows) +- ~~Query traces - DBSQL~~ (https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/query-dbsql) ## Classified Pages | TOC Title | Type | Confidence | Reason | |-----------|------|------------|--------| -| [Resource quotas](https://learn.microsoft.com/en-us/azure/databricks/resources/limits) | limits-quotas | 0.98 | Explicitly described as listing numerical limits for Azure Databricks resources and API rate limits, likely in tables with exact values and whether they are fixed or can be increased. | | [Bundle configuration reference](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/reference) | configuration | 0.95 | Explicit configuration reference for all supported YAML keys, likely with parameter names, types, and allowed values—core expert configuration knowledge. | | [CAST_INVALID_INPUT error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/cast-invalid-input-error-class) | troubleshooting | 0.95 | Provides exact error condition name, SQLSTATE, message template, and concrete remediation steps (correct value, change type, use try_cast, or set specific ansiConfig to false). Strong troubleshooting pattern. | | [Dashboard limits](https://learn.microsoft.com/en-us/azure/databricks/dashboards/limits) | limits-quotas | 0.95 | Explicitly described as listing limits for pages, datasets, widgets, visualizations, and subscriptions; this will contain product-specific numeric limits and quotas. 
| | [INVALID_ARRAY_INDEX error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/invalid-array-index-error-class) | troubleshooting | 0.95 | Provides a specific error message template with SQLSTATE, mentions arraySize/indexValue, and recommends using get() or changing ansiConfig, which is concrete symptom→solution guidance unique to Databricks. | | [INVALID_ARRAY_INDEX_IN_ELEMENT_AT error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/invalid-array-index-in-element-at-error-class) | troubleshooting | 0.95 | Details a specific error class with SQLSTATE and message plus recommended use of try_element_at and ansiConfig setting, which is product-specific error handling guidance. | -| [Limitations](https://learn.microsoft.com/en-us/azure/databricks/ldp/limitations) | limits-quotas | 0.95 | Explicitly documents numeric limits (for example 1000 concurrent pipeline updates, per-pipeline dataset limits based on config); matches limits-quotas criteria. | -| [Limitations](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/limits) | limits-quotas | 0.95 | Explicitly a limits and quotas page for Foundation Model APIs, describing rate limits that vary by workspace tier, model type, and deployment mode. This will contain specific numeric limits and tables, which are classic expert-only quota details. | +| [Limitations](https://learn.microsoft.com/en-us/azure/databricks/ldp/limitations) | limits-quotas | 0.95 | Explicitly documents concrete numeric limits (for example, 1000 concurrent pipeline updates and per-pipeline dataset limits based on configuration). This matches the limits-quotas criteria with product-specific constraints that are unlikely to be known from training. 
| +| [Limitations](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/limits) | limits-quotas | 0.95 | Explicitly about limits and quotas; will contain numeric rate limits, token caps, and tier-based tables for different workspace tiers and deployment modes, matching the limits-quotas criteria. | | [Options](https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/kafka/options) | configuration | 0.95 | Reference for Kafka connector options necessarily includes option names, allowed values, and behavior; these are detailed configuration parameters unique to Databricks’ Kafka integration. | +| [Resource quotas](https://learn.microsoft.com/en-us/azure/databricks/resources/limits) | limits-quotas | 0.95 | Explicitly described as listing numerical limits for Azure Databricks resources and API rate limits, likely in tables with exact values and whether they are fixed or can be increased, matching limits-quotas criteria. | +| [Supported connection properties](https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/properties) | configuration | 0.95 | Explicitly about 'supported connection properties' for the Databricks JDBC driver. Such pages are typically parameter tables with names, types, allowed values, and defaults—precisely the expert configuration knowledge that fits the configuration sub-skill. | | [TABLE_OR_VIEW_NOT_FOUND error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/table-or-view-not-found-error-class) | troubleshooting | 0.95 | Provides SQLSTATE, error message template, and concrete remediation steps (check schema/catalog, use IF EXISTS), which is clear symptom→cause→solution guidance. | | [Limitations and FAQ](https://learn.microsoft.com/en-us/azure/databricks/repos/limits) | limits-quotas | 0.92 | Explicitly described as specifying limits for Databricks Git folders and Git integration, including size limits and file recoverability. 
These are product-specific numerical limits and constraints that qualify as expert knowledge under limits-quotas. | -| [ALTER TABLE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-table) | configuration | 0.90 | Comprehensively documents ALTER TABLE syntax, supported operations, unsupported cases (temporary tables, streaming tables), and behaviors like caching—detailed product-specific configuration rules. | | [ARITHMETIC_OVERFLOW error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/arithmetic-overflow-error-class) | troubleshooting | 0.90 | Documents specific error condition name, SQLSTATE code, message template, and config flag to bypass the error. This is a clear symptom → cause → resolution mapping unique to Databricks. | | [AUTO CDC (pipelines)](https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-sql-ref-apply-changes-into) | integrations | 0.90 | AUTO CDC INTO statement reference with syntax and behavior; product-specific SQL command semantics for CDC flows. | | [Best practices](https://learn.microsoft.com/en-us/azure/databricks/delta/best-practices) | best-practices | 0.90 | Explicit DO/DON’T guidance for Delta Lake (file sizing, OPTIMIZE/VACUUM usage, schema evolution, partitioning, concurrency) with Databricks-specific recommendations and gotchas that are not generic Spark knowledge. | | [CREATE FLOW (pipelines)](https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-sql-ref-create-flow) | integrations | 0.90 | CREATE FLOW statement reference; includes syntax, options, and constraints for flows/backfills, which are unique SQL APIs. | -| [Compute configuration](https://learn.microsoft.com/en-us/azure/databricks/compute/configure) | configuration | 0.90 | This is a configuration reference for Databricks compute, detailing available settings when creating all-purpose or job compute resources and how policies affect visibility of those settings.
It clearly maps to configuration, with product-specific options and constraints. | -| [Compute policy reference](https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/policy-definition) | configuration | 0.90 | Explicitly a 'compute policy reference' listing available policy attributes and limitation types, with JSON examples and sample policies. This is a configuration reference with specific attribute names, allowed values, and example configurations that are product-specific expert knowledge. | +| [Compute policy reference](https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/policy-definition) | configuration | 0.90 | Lists all available policy attributes, limitation types, and sample JSON policy definitions—classic configuration reference with parameter names and allowed values. | | [Compute termination error codes](https://learn.microsoft.com/en-us/azure/databricks/compute/troubleshooting/cluster-error-codes) | troubleshooting | 0.90 | Explicitly maps cluster termination error codes to causes and fixes; classic symptom → error code → resolution troubleshooting content. | | [DIVIDE_BY_ZERO error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/divide-by-zero-error-class) | troubleshooting | 0.90 | Includes error class, SQLSTATE, message, suggested use of try_divide, and config flag to bypass, which is a clear symptom→solution mapping. | | [DataFrame class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe) | integrations | 0.90 | Comprehensive reference for the Databricks PySpark DataFrame class, including creation constraints, methods, and Spark Connect support; this is core product-specific API and integration knowledge. 
| | [DataFrameReader class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader) | integrations | 0.90 | Class reference listing methods, parameters, and behaviors for DataFrameReader, including Spark Connect support; this is detailed API integration knowledge. | | [DataFrameWriter class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter) | integrations | 0.90 | Class reference for DataFrameWriter with methods and options for writing to external systems; detailed integration and configuration surface. | | [DataFrameWriterV2 class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2) | integrations | 0.90 | Class reference for v2 writer with capabilities for Databricks tables and Delta Lake; includes product-specific configuration surface. | -| [Databases](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-databases) | limits-quotas | 0.90 | Explicitly states a numeric limit of 500 databases per branch; this is a concrete service quota that LLMs would not know without documentation. | | [Debug Model Serving](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/model-serving-debug) | troubleshooting | 0.90 | Organized around common issues (deployment failures, container build errors, runtime problems) with diagnosis and resolution steps; classic symptom→cause→solution troubleshooting content. | | [Environment variables and fields reference](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/env-vars) | configuration | 0.90 | Explicit reference listing environment variables and configuration fields with names and semantics for unified authentication. This is a configuration table-style reference unique to Databricks. 
| | [Error classes](https://learn.microsoft.com/en-us/azure/databricks/error-messages/error-classes) | troubleshooting | 0.90 | Page is a catalog of Azure Databricks-specific error conditions used in Databricks SQL and Databricks Runtime, intended for programmatic handling. It enumerates product-specific error identifiers and meanings, which are not generally known from training data and map directly to troubleshooting and error-handling logic. | @@ -203,20 +186,19 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [H3_INVALID_GRID_DISTANCE_VALUE error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/h3-invalid-grid-distance-value-error-class) | troubleshooting | 0.90 | Page documents a specific Databricks SQL error class with SQLSTATE, message template, and condition (grid distance must be non‑negative), which is product-specific error behavior useful for diagnosis. | | [H3_INVALID_RESOLUTION_VALUE error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/h3-invalid-resolution-value-error-class) | troubleshooting | 0.90 | Defines a specific Databricks H3 error class with SQLSTATE and message template including minR/maxR bounds, which is detailed product error semantics for troubleshooting. | | [Identities and privileges](https://learn.microsoft.com/en-us/azure/databricks/jobs/privileges) | security | 0.90 | Covers identities, permissions, and privileges for Lakeflow Jobs; mentions specific permissions like CAN MANAGE, CAN ATTACH TO, CAN RESTART and log visibility behavior, which are product-specific RBAC/security configuration details. | -| [Instance type compatibility reference](https://learn.microsoft.com/en-us/azure/databricks/compute/flexible-node-type-instances) | configuration | 0.90 | A reference list of instance type compatibility groups; effectively a configuration matrix of which instance types can be substituted, which is detailed expert configuration knowledge. 
| | [Limits and region availability](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/model-serving-limits) | limits-quotas | 0.90 | The page is explicitly about Azure Databricks Model Serving limits and region availability and will list concrete numerical limits/quotas and regional support details that are product- and SKU-specific, matching the limits-quotas criteria. | | [MISSING_AGGREGATION error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/missing-aggregation-error-class) | troubleshooting | 0.90 | Explains a Databricks SQL error with SQLSTATE and message template, including guidance to adjust GROUP BY or use an any-value expression, which is concrete troubleshooting advice. | | [MLflow 2: Troubleshoot evaluation](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-evaluation/troubleshooting) | troubleshooting | 0.90 | Explicit troubleshooting article for Agent Evaluation; likely organized by specific errors and resolutions, including product-specific causes and fixes. | +| [Manage governed tag permissions](https://learn.microsoft.com/en-us/azure/databricks/admin/governed-tags/manage-permissions) | security | 0.90 | A page focused on managing permissions for governed tags will contain product-specific RBAC roles, privileges, and permission scopes for tag creation and usage, which is expert security knowledge per the criteria. | | [Meta parameters](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/managed-mcp-meta-param) | configuration | 0.90 | Explicitly about the `_meta` parameter controlling result limits, filters, and warehouse selection; this is a configuration reference with parameter names, allowed values, and behavior—clear expert configuration knowledge. 
| | [Options](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/options) | configuration | 0.90 | Reference for Auto Loader/cloudFiles options implies detailed parameter names, allowed values, and defaults for this specific source, which matches configuration-type expert knowledge. | | [Pass large parameter arrays](https://learn.microsoft.com/en-us/azure/databricks/jobs/for-each-lookup-example) | limits-quotas | 0.90 | Explicitly states numeric limits for parameter arrays (10,000 characters, 48 KB with task value references) and provides a workaround pattern using JSON lookup tables. Contains concrete limits and related configuration behavior. | -| [Recover from streaming checkpoint failure](https://learn.microsoft.com/en-us/azure/databricks/ldp/recover-streaming) | troubleshooting | 0.90 | Explicitly about recovering from streaming checkpoint corruption; this is a concrete failure mode with product-specific recovery steps and edge cases. | +| [Rate limits](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/rate-limits-beta) | limits-quotas | 0.90 | A page dedicated to configuring rate limits for AI Gateway endpoints will list concrete numeric limits (requests per second/minute, token caps, default and maximum values) and possibly tier-based tables, which are exactly the kind of expert numeric constraints described under limits-quotas. | | [Reference](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-ads-reference) | configuration | 0.90 | Reference page with supported tables, column definitions, and data types is detailed configuration/reference information unique to this connector. 
| | [Serverless quotas](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/serverless-quotas) | limits-quotas | 0.90 | Explicitly about quotas for serverless compute; such pages typically list DBU/hour limits and possibly per-workspace or per-feature caps that are not knowable from training data. | -| [Supported connection properties](https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/properties) | configuration | 0.90 | Explicitly documents supported connection properties for the Databricks JDBC Driver v3+, which are product-specific configuration parameters likely presented in tables with names, types, defaults, and allowed values—information that qualifies as expert configuration knowledge. | | [Table properties reference](https://learn.microsoft.com/en-us/azure/databricks/delta/table-properties) | configuration | 0.90 | Explicitly a reference list of table properties that control behavior. Such pages typically include property names, allowed values, and effects—product-specific configuration details that qualify as expert knowledge under the configuration category. | | [Troubleshoot](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/troubleshooting) | troubleshooting | 0.90 | Dedicated troubleshooting article; expected to map specific error messages and conditions to causes and resolutions for this extension. | -| [Troubleshoot common sharing errors](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/troubleshooting) | troubleshooting | 0.90 | Explicit troubleshooting article; expected to map specific Delta Sharing error messages or codes to causes and resolutions. | +| [Troubleshoot common sharing errors](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/troubleshooting) | troubleshooting | 0.90 | Explicitly a troubleshooting page describing common errors when accessing shared data. 
It will contain specific error messages/codes and their resolutions, which are product-specific symptom→cause→solution mappings. | | [Troubleshoot issues with the Terraform provider](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/terraform/troubleshoot) | troubleshooting | 0.90 | Explicit troubleshooting article for Databricks Terraform provider; typically organized by specific error messages/codes and their resolutions, which are product-specific diagnostic knowledge. | | [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/d365-troubleshoot) | troubleshooting | 0.90 | Explicit troubleshooting guide for this connector; such pages map specific errors and symptoms to causes and resolutions. | | [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-ads-troubleshoot) | troubleshooting | 0.90 | Troubleshooting page explicitly for common errors; will map specific error messages or codes to causes and resolutions for this connector. | @@ -224,11 +206,11 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/hubspot-troubleshoot) | troubleshooting | 0.90 | Troubleshooting page will contain mappings from connector-specific issues to causes and fixes, including any relevant error messages. | | [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/jira-troubleshoot) | troubleshooting | 0.90 | Explicitly a troubleshooting page for Jira ingestion; such pages normally map specific error messages/codes and OAuth misconfigurations to resolutions, which is expert troubleshooting knowledge. 
| | [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/meta-ads-troubleshoot) | troubleshooting | 0.90 | Explicit troubleshooting page; expected to contain symptom → cause → solution mappings and possibly error codes for the Meta Ads connector. | -| [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-troubleshoot) | troubleshooting | 0.90 | Troubleshooting page for MySQL ingestion; Databricks connector troubleshooting docs typically list specific error messages/codes and connector-specific resolutions, which are not generally known to LLMs. | -| [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-troubleshoot) | troubleshooting | 0.90 | Explicit troubleshooting guide for PostgreSQL connector; will map specific symptoms and errors to causes and fixes, which is expert, product-specific knowledge. | +| [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-troubleshoot) | troubleshooting | 0.90 | Explicitly a troubleshooting guide for MySQL ingestion; covers Databricks/Lakeflow-specific issues and likely includes concrete error messages and resolutions organized by symptom and fix. | +| [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-troubleshoot) | troubleshooting | 0.90 | Troubleshooting page for the PostgreSQL connector; covers Databricks-specific ingestion problems with symptom-to-solution mappings and likely error messages. | | [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/query-based-troubleshoot) | troubleshooting | 0.90 | Focused on troubleshooting common issues with query-based connectors, including specific error conditions like cursor column errors and connection failures, matching symptom→cause→solution troubleshooting guidance. 
| | [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/salesforce-troubleshoot) | troubleshooting | 0.90 | Explicit troubleshooting page; expected to map specific Salesforce connector errors and ingestion issues to causes and fixes. | -| [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-troubleshoot) | troubleshooting | 0.90 | Explicit troubleshooting guide; will contain symptom → cause → solution mappings and possibly error codes/messages specific to the SQL Server connector. | +| [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-troubleshoot) | troubleshooting | 0.90 | Explicit troubleshooting guide for SQL Server connector; expected to contain connector-specific errors, causes, and resolutions. | | [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/tiktok-ads-troubleshoot) | troubleshooting | 0.90 | Troubleshooting page for a specific connector; expected to contain error messages/codes and resolution steps, which is expert troubleshooting knowledge. | | [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-hcm-troubleshoot) | troubleshooting | 0.90 | Troubleshooting page for a specific connector; expected to map errors and symptoms to causes and fixes, which is expert troubleshooting knowledge. | | [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-reports-troubleshoot) | troubleshooting | 0.90 | Page is explicitly a troubleshooting guide for the Workday Reports connector, likely organized by specific symptoms and including product-specific error messages or behaviors with corresponding resolutions. 
| @@ -236,7 +218,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [UNRESOLVED_ROUTINE error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/unresolved-routine-error-class) | troubleshooting | 0.90 | Documents a specific Databricks error class with SQLSTATE and message about resolving functions on a search path, which is product-specific error semantics for troubleshooting. | | [WKB_PARSE_ERROR error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/wkb-parse-error-error-class) | troubleshooting | 0.90 | Provides SQLSTATE and a detailed message template including parseError and position, which is specific to Databricks’ WKB parsing behavior and used for error diagnosis. | | [WKT_PARSE_ERROR error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/wkt-parse-error-error-class) | troubleshooting | 0.90 | Analogous to WKB, this documents WKT parsing errors with SQLSTATE and message template, which is product-specific troubleshooting information. | -| [append_flow](https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-python-ref-append-flow) | integrations | 0.90 | Reference for @dp.append_flow with required streaming DataFrame semantics; includes function/decorator parameters and constraints specific to Lakeflow pipelines. | | [create_auto_cdc_flow](https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-python-ref-apply-changes) | integrations | 0.90 | Details create_auto_cdc_flow signature, required target streaming table, and CDC behavior; these are product-specific API semantics and constraints. | | [create_auto_cdc_from_snapshot_flow](https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-python-ref-apply-changes-from-snapshot) | integrations | 0.90 | Documents create_auto_cdc_from_snapshot_flow behavior and requirements; includes API parameters and usage patterns unique to Lakeflow CDC from snapshots. 
| | [csv](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/csv) | integrations | 0.90 | Method reference with options like inferSchema and schema, including behavior about extra passes over data; these are concrete API parameters and effects. | @@ -254,37 +235,35 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Basic authentication](https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-basic-auth) | integrations | 0.86 | Shows Snowflake federation using username/password via Databricks, with concrete connection options and auth configuration details. | | [Built-in OAuth](https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake) | integrations | 0.86 | Covers Snowflake OAuth integration with Databricks, including connector parameters, auth flow specifics, and SQL/connection configuration unique to this product combo. | | [Bundle resources](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/resources) | configuration | 0.86 | The page is a configuration reference for all supported resource types in Declarative Automation Bundles, describing how to specify resources in the bundle configuration. This implies detailed resource-specific configuration parameters and schema-level options unique to Databricks bundles, matching the configuration category. | -| [Configure a connection](https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/configure) | configuration | 0.86 | Shows how to configure connections using the Databricks JDBC driver v3+, including connection URL structure and property names/values; this is detailed configuration reference. 
| +| [Compute configuration](https://learn.microsoft.com/en-us/azure/databricks/compute/configure) | configuration | 0.86 | A reference for all-purpose and job compute configuration settings; Databricks-specific parameters and options constitute expert configuration knowledge beyond generic cluster setup. | | [Configure a connection](https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/configure) | configuration | 0.86 | Details how to build JDBC URLs and property sets for the Simba driver, including property names and required values; this is concrete configuration reference. | | [Configure serverless environment](https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/dependencies) | configuration | 0.86 | The page describes a configuration side panel for serverless notebooks, including settings for dependencies, memory size, base environment, and usage policies. These are concrete configuration options unique to Databricks serverless, matching the configuration sub-skill. | -| [Driver capability settings](https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/capability) | configuration | 0.86 | Similar to index 0 but for the legacy Simba JDBC driver, this page documents 'special and advanced driver capability settings' and how to configure them. These are driver-specific configuration parameters and options, which aligns with the configuration sub-skill type. | -| [Driver capability settings](https://learn.microsoft.com/en-us/azure/databricks/integrations/odbc/capability) | configuration | 0.86 | The page is specifically about 'special and advanced driver capability settings' for the Databricks ODBC driver. These capability flags and their allowed values are product-specific configuration parameters that an LLM is unlikely to know from training. It is not just a tutorial; it documents driver settings and how to configure them, which fits the configuration category. 
| +| [Connect to external HTTP services](https://learn.microsoft.com/en-us/azure/databricks/query-federation/http) | integrations | 0.86 | Details Unity Catalog HTTP connection objects, including endpoint and credential handling, and how to send authenticated requests via the HTTP connection proxy. Contains product-specific configuration concepts and usage patterns for external REST/MCP/agent tools integration. | | [Enable HSM customer-managed keys for managed services](https://learn.microsoft.com/en-us/azure/databricks/security/keys/cmk-managed-services-azure/cmk-hsm-managed-services-azure) | security | 0.86 | Provides concrete instructions for configuring Azure Key Vault Managed HSM keys for Databricks managed services data, including product-specific security configuration steps. | | [Enable customer-managed keys for managed services](https://learn.microsoft.com/en-us/azure/databricks/security/keys/cmk-managed-services-azure/customer-managed-key-managed-services-azure) | security | 0.86 | Step-by-step configuration of customer-managed keys from Azure Key Vault for managed services data, with Databricks-specific security configuration details and parameters. | | [IP access lists for workspaces](https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/ip-access-list-workspace) | security | 0.86 | Details concrete steps and commands for configuring workspace IP access lists using the Databricks CLI and IP Access Lists API, including how enforcement combines with account-level ingress controls. Contains product-specific security configuration patterns. | | [Microsoft Entra ID](https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-entra) | integrations | 0.86 | Describes Snowflake federation using Microsoft Entra ID, with concrete auth configuration, parameter names, and Databricks-specific setup steps. 
| -| [OAuth access token](https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-oauth-access-token) | integrations | 0.86 | Covers using raw OAuth access tokens for Snowflake federation, with detailed parameter usage and Databricks connection configuration. | | [Okta](https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-okta) | integrations | 0.86 | Shows how to use Okta as external OAuth provider for Snowflake federation, including specific configuration fields and Databricks connection properties. | | [Personal access tokens (legacy)](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/pat) | limits-quotas | 0.86 | Contains explicit numeric limits and lifecycle rules for PATs (for example, up to 600 PATs per user per workspace and automatic revocation after 90 days of inactivity). These are concrete quotas and time-based constraints that an LLM would not reliably know from training, matching the limits-quotas category. | | [Reference](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-reference) | configuration | 0.86 | Reference material including data type mappings and DDL operation handling; this is detailed, product-specific behavior/configuration that qualifies as expert knowledge. | +| [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/troubleshooting) | troubleshooting | 0.86 | A dedicated troubleshooting page for the Databricks CLI will list concrete error messages/codes, causes, and resolution steps specific to this CLI, matching the symptom → cause → solution pattern and providing expert diagnostic knowledge. | | [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/servicenow-troubleshoot) | troubleshooting | 0.86 | The page is explicitly a troubleshooting guide for the ServiceNow connector in Databricks Lakeflow Connect, describing common issues and how to resolve them. 
Such content typically includes product-specific error messages, causes, and resolution steps that go beyond generic debugging knowledge. | | [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-troubleshoot) | troubleshooting | 0.86 | A dedicated troubleshooting page for the SharePoint connector; likely organized by specific error messages/symptoms and corresponding resolutions unique to Lakeflow Connect. | | [h3_tessellateaswkb function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_tessellateaswkb) | integrations | 0.86 | Detailed Databricks SQL function returning ARRAY of structs (cell ID, coverage flag, WKB intersection); complex, product-specific API pattern. | | [ANY FILE privilege](https://learn.microsoft.com/en-us/azure/databricks/data-governance/table-acls/any-file) | security | 0.85 | Defines the ANY FILE securable and its impact on direct filesystem and cloud storage access, independent of table ACLs. This is a product-specific permission model detail (securable name and its effect), clearly in the security domain. | | [Access control (legacy)](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/workspace-feature-store/access-control) | security | 0.85 | Access control article for feature tables; expected to list specific permissions, roles, and scope behaviors that are product-specific security configuration details. | -| [App environment](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/system-env) | configuration | 0.85 | Describes the managed app environment, including available binaries, environment variables, and dependency management; this is detailed configuration/environment reference information specific to Databricks Apps. 
| | [Azure SQL Database firewall configuration](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/azure-sql-db-firewall) | security | 0.85 | Describes precise firewall configuration for Azure SQL Database to allow Databricks classic compute access, including network constructs (VNets, endpoints) and likely specific rules and settings. | -| [Best practices](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/best-practices) | best-practices | 0.85 | Provides an opinionated, Databricks-specific guide for configuring identity and migrating to identity federation, including concrete recommendations and patterns that go beyond generic IAM advice. | +| [Best practices for serverless workspaces](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/serverless-workspaces-best-practices) | best-practices | 0.85 | Explicitly lists best practices for creating, managing, securing, and controlling costs in serverless workspaces, which are product-specific DO/DON'T recommendations. | | [Bundle library dependencies](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/library-dependencies) | configuration | 0.85 | Describes syntax for declaring library dependencies (Python, others) in bundle configuration files, including supported dependency types and fields. | | [Bundle project templates](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/templates) | configuration | 0.85 | Explains syntax and commands for bundle templates; likely includes template parameters, structure, and configuration keys that are specific to bundles. | | [Bundle run identity](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/run-as) | security | 0.85 | Details the `run_as` setting, allowed values (`user_name`, `service_principal_name`), and constraints (non-admin restrictions); product-specific IAM configuration. 
| | [CONNECTION_PRIVILEGES](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/connection_privileges) | security | 0.85 | CONNECTION_PRIVILEGES lists principals and privileges on connections and notes special MANAGE behavior; this is detailed RBAC/permission semantics. | | [CREATE STREAMING TABLE (pipelines)](https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-sql-ref-create-streaming-table) | integrations | 0.85 | CREATE STREAMING TABLE (pipelines) reference; documents syntax and behavior for streaming tables backed by pipelines. | +| [Call a GTE embeddings model](https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-foundation-embedding-model-gte-example) | integrations | 0.85 | Shows how to integrate Databricks Vector Search with Databricks Foundation Model APIs (GTE embeddings), including SDK usage and parameters, which is a specific integration pattern. | +| [Call an OpenAI embeddings model](https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-external-embedding-model-example) | integrations | 0.85 | Provides concrete code and configuration for using Databricks Vector Search with external OpenAI embedding models via Databricks external models, a product-specific integration pattern. | | [Column class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column) | integrations | 0.85 | Full reference for the Databricks PySpark Column class, including methods, behaviors, and Spark Connect support; this is detailed SDK/API knowledge unique to the product. | -| [Column mask clause](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-column-mask) | security | 0.85 | Details the column mask clause syntax, specifying how to bind masking functions to columns and how identity/group info is used—product-specific security configuration with exact clause forms. 
| | [Configure task parameters](https://learn.microsoft.com/en-us/azure/databricks/jobs/task-parameters) | configuration | 0.85 | Details task parameterization, including syntax differences per asset type and dynamic value references; these are product-specific parameter names and usage patterns that constitute configuration knowledge. | | [Connect using REST API](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/external-apps-manual-api) | integrations | 0.85 | Describes direct REST API calls for languages without SDK; likely includes endpoint paths, parameters, and auth patterns unique to Lakebase Autoscaling. | -| [Connect with Databricks SDK](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/external-apps-connect) | integrations | 0.85 | Shows how to use Databricks SDK with Postgres drivers and OAuth token rotation; includes product-specific SDK calls and connection pool patterns. | | [Create and manage recipients (OIDC)](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/create-recipient-oidc-fed) | security | 0.85 | Explains how to federate authentication via OIDC, including JWT/OAuth token handling and IdP integration; contains product-specific security configuration parameters and flows. | | [DC_GA4_RAW_DATA_ERROR error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/dc-ga4-raw-data-error-error-class) | troubleshooting | 0.85 | Defines a specific Databricks error class for GA4 raw data connector with SQLSTATE and message structure including errorCode. Product-specific error diagnosis information. | | [DC_SQLSERVER_ERROR error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/dc-sqlserver-error-error-class) | troubleshooting | 0.85 | Documents a Databricks-specific error class for the SQL Server connector with SQLSTATE and message. This is concrete troubleshooting metadata for connector failures. 
| @@ -293,11 +272,12 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [EWKB_PARSE_ERROR error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/ewkb-parse-error-error-class) | troubleshooting | 0.85 | Defines EWKB_PARSE_ERROR with SQLSTATE and message template including parseError and position. Clear mapping from parsing symptom to error condition for debugging. | | [EWKT_PARSE_ERROR error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/ewkt-parse-error-error-class) | troubleshooting | 0.85 | Documents EWKT_PARSE_ERROR with SQLSTATE and detailed message template. Product-specific error identifier and format used for diagnosing spatial parsing problems. | | [EXTERNAL_LOCATION_PRIVILEGES](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/external_location_privileges) | security | 0.85 | EXTERNAL_LOCATION_PRIVILEGES lists principals and privileges on external locations and notes MANAGE behavior; this is detailed RBAC metadata. | +| [End of life for Standard tier workspaces on Azure Databricks](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/standard-tier) | decision-making | 0.85 | Provides concrete dates and required actions for upgrading from Standard to Premium tier, with billing impact considerations, which is explicit SKU/tier decision and migration guidance. | | [Enforce user isolation cluster types on a workspace](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/enforce-user-isolation) | security | 0.85 | Security-focused workspace setting that blocks certain cluster access types; includes specific cluster type names and behavior when enabled. 
| | [Extension settings](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/settings) | configuration | 0.85 | Explicitly lists extension settings; likely includes setting names, allowed values, and defaults, which is product-specific configuration knowledge. | | [FAQ](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/faq) | limits-quotas | 0.85 | Provides a table of latency impact estimates by trace size and references workspace quotas and rate limits; this is numeric, product-specific limits/behavior not inferable from general knowledge. | | [Failing jobs or executors removed](https://learn.microsoft.com/en-us/azure/databricks/optimizations/spark-ui-guide/failing-spark-jobs) | troubleshooting | 0.85 | Explicitly about diagnosing failed jobs and removed executors, listing common causes and how to use the Spark UI to identify them, which is classic symptom → cause → solution troubleshooting content specific to Databricks. | -| [Foundation Model APIs reference](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/api-reference) | integrations | 0.85 | An API reference for Foundation Model APIs, including request/response formats and parameters, designed to be similar to OpenAI’s REST API. This is a direct SDK/API parameter reference unique to Databricks, fitting integrations & coding patterns. | +| [Foundation Model APIs reference](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/api-reference) | integrations | 0.85 | API reference content includes endpoint paths, parameters, types, defaults, and constraints for Foundation Model APIs, which are detailed integration and coding patterns unique to this product. 
| | [GEOJSON_PARSE_ERROR error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/geojson-parse-error-error-class) | troubleshooting | 0.85 | GEOJSON_PARSE_ERROR error class with SQLSTATE and message template including parseError and position. Used directly in troubleshooting malformed GeoJSON input. | | [GRANT](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/security-grant) | security | 0.85 | GRANT syntax for Databricks, with specific notes that the samples catalog is read-only and that GRANT ON SHARE is used for recipients. Contains detailed RBAC/privilege model behavior. | | [GROUP_BY_AGGREGATE error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/group-by-aggregate-error-class) | troubleshooting | 0.85 | GROUP_BY_AGGREGATE error condition with SQLSTATE and message template describing invalid aggregate usage in GROUP BY. Product-specific error code and guidance for query correction. | @@ -306,7 +286,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Limitations](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/limitations) | limits-quotas | 0.85 | Explicitly a limitations page; while distinct from project limits, it will list unsupported features and compatibility constraints specific to Lakebase Autoscaling, which are expert limit/constraint details. | | [Micro-batch size](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/batch-size) | configuration | 0.85 | Explains Databricks-specific admission control settings for Structured Streaming, including configuration parameters to limit input rate and maintain batch size—concrete config knowledge.
| | [Notebooks limitations](https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-limitations) | limits-quotas | 0.85 | Page explicitly documents known limitations such as notebook sizing, cell outputs, debugger, SQL warehouse, and widget constraints; these are typically expressed as concrete numeric limits and constraints that qualify as expert limits/quotas knowledge. | -| [OTel GenAI attribute mapping](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/third-party/otel-span-attributes) | configuration | 0.85 | Page lists specific OpenTelemetry GenAI Semantic Convention span attribute names and how they must be set so Databricks MLflow correctly renders span types, inputs, outputs, and token usage. These are detailed, product-specific configuration parameters (attribute keys/values) that an LLM is unlikely to know, fitting configuration. | +| [OTel GenAI attribute mapping](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/third-party/otel-span-attributes) | integrations | 0.85 | Lists specific OpenTelemetry span attribute names and how they must be set for Databricks MLflow to correctly render span types, inputs, outputs, and token usage—this is a product-specific integration contract with concrete parameter requirements. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/settings) | configuration | 0.85 | Explicitly about configuration file syntax; will contain keys, allowed values, and structural rules for databricks.yml, which are expert configuration details. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-sql-ref-create-materialized-view) | integrations | 0.85 | CREATE MATERIALIZED VIEW (pipelines) reference; product-specific SQL syntax and behavior tied to pipeline-backed materialized views. 
| | [Overview](https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/python-ref) | integrations | 0.85 | Python language reference for Lakeflow pipelines will enumerate functions, parameters, and constraints, which are detailed SDK/API integration patterns. | @@ -332,7 +312,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/repos/errors-troubleshooting) | troubleshooting | 0.85 | Explicitly a troubleshooting page for Databricks Git folders, providing mappings from common error messages to causes and resolutions when syncing with remote Git repos. Contains product-specific error-handling knowledge. | | [Use IP access lists to restrict access](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/access-list) | security | 0.85 | Defines IP access list behavior, including maximum of 100 IP/CIDR values per recipient and interaction with workspace IP lists; this is precise, product-specific security configuration with numeric limits. | | [Volume privileges](https://learn.microsoft.com/en-us/azure/databricks/volumes/privileges) | security | 0.85 | Defines specific privilege names and which permissions are required for particular volume operations. This is product-specific RBAC/privilege mapping, matching the security category and representing expert configuration knowledge. | -| [YAML syntax reference](https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/yaml-reference) | configuration | 0.85 | Provides the full YAML grammar and supported expressions for metric view definitions, including field names and allowed structures, which are product-specific configuration details not inferable from general knowledge. 
| | [approxQuantile](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframestatfunctions/approxquantile) | integrations | 0.85 | Describes deterministic error bounds and algorithm (Greenwald-Khanna variation), which are detailed, implementation-specific semantics. | | [create_streaming_table](https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-python-ref-streaming-table) | integrations | 0.85 | create_streaming_table reference with function signature and deprecation notes; product-specific API behavior and constraints. | | [crosstab](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframestatfunctions/crosstab) | integrations | 0.85 | Details output schema (first column naming, column names from distinct values, zero counts for missing pairs) which are precise API semantics. | @@ -352,6 +331,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [over](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/over) | integrations | 0.85 | API reference for binding a Column to a window specification, including Databricks-specific semantics for window functions. | | [pandas_udf](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/pandas_udf) | integrations | 0.85 | Explains Databricks-specific pandas_udf behavior, Arrow data transfer, and how it integrates with PySpark, which is detailed integration/API knowledge. | | [parquet](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/parquet) | integrations | 0.85 | Method reference for Parquet writes with product-specific semantics and options. | +| [psql](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/psql-command) | integrations | 0.85 | Describes the psql CLI command, including how it connects to Lakebase Postgres instances and supported modes. 
This is a product-specific integration command with concrete parameters and behavior. | | [replace](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframenafunctions/replace) | integrations | 0.85 | Details type constraints, None handling, and casting behavior for replace, which are precise, product-specific API rules. | | [save](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/save) | integrations | 0.85 | Describes interaction of format and options, and default source via spark.sql.sources.default, which is concrete integration behavior. | | [saveAsTable](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/saveastable) | integrations | 0.85 | Details mode behavior and schema matching rules when saving as tables, which are product-specific API semantics. | @@ -368,7 +348,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Power BI cheat sheet](https://learn.microsoft.com/en-us/azure/databricks/cheat-sheet/power-bi) | best-practices | 0.84 | Explicit best-practices cheat sheet with opinionated, Databricks-specific guidance to optimize query performance and dashboard efficiency. | | [Reference](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-reports-reference) | configuration | 0.84 | Reference material including unsupported objects and data type transformations to Delta; this is detailed, product-specific mapping/configuration information. | | [Script reference](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-utility-reference) | configuration | 0.84 | Reference for a utility script with components and parameters is a configuration reference, likely including parameter names, allowed values, and behavior. 
| -| [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/troubleshooting) | troubleshooting | 0.84 | A dedicated troubleshooting page will map specific CLI error messages/codes and symptoms to causes and resolutions, which is product-specific troubleshooting knowledge. | | [h3_polyfillash3 function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_polyfillash3) | integrations | 0.84 | Provides a Databricks SQL function that returns an ARRAY of H3 cells whose centroids lie in a polygon; detailed API semantics. | | [h3_polyfillash3string function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_polyfillash3string) | integrations | 0.84 | Function reference for ARRAY H3 polyfill behavior in Databricks SQL; product-specific geospatial API. | | [h3_try_coverash3 function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_try_coverash3) | integrations | 0.84 | Describes a Databricks SQL function variant that returns NULL instead of errors on invalid input; specific error-handling semantics and return typing. | @@ -379,7 +358,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Enable customer-managed keys for managed disks](https://learn.microsoft.com/en-us/azure/databricks/security/keys/cmk-managed-disks-azure/cmk-managed-disks-azure) | security | 0.83 | Gives concrete steps to configure customer-managed keys from Azure Key Vault for Databricks compute managed disks, including product-specific encryption behavior and required settings. | | [Use custom libraries with Model Serving endpoints](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/private-libraries-model-serving) | configuration | 0.83 | Shows how to include custom/private libraries when logging models for serving; involves specific packaging and configuration steps unique to Databricks.
| | [permissions](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/permissions-commands) | security | 0.83 | The permissions CLI group is about access control; the reference will list specific permission targets, operations, and possibly role mappings, which are product-specific security configuration details. | -| [Allowlist libraries and init scripts on standard compute](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/allowlist) | security | 0.82 | Describes the allowlist object, default behavior, and MANAGE ALLOWLIST privilege; this is product-specific security/privilege configuration. | +| [Best practices](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/best-practices) | best-practices | 0.82 | Opinionated, product-specific guidance on configuring identity and migrating to identity federation qualifies as best practices. | | [Compute settings](https://learn.microsoft.com/en-us/azure/databricks/integrations/odbc/compute) | configuration | 0.82 | Describes compute resource settings specifically for the Databricks ODBC driver, likely listing setting names, allowed values, and behavior; this is driver-specific configuration rather than generic concepts. | | [Create custom model serving endpoints](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/create-manage-serving-endpoints) | configuration | 0.82 | Describes how to create and configure endpoints for custom models, including endpoint options; endpoint configuration details are product-specific. | | [Debug agents](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/debug-agent) | troubleshooting | 0.82 | Organized around debugging issues for agents on Apps and Model Serving, likely listing specific error messages, causes, and resolutions. This matches symptom→cause→solution troubleshooting content.
| @@ -389,6 +368,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Limitations](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/tiktok-ads-limits) | limits-quotas | 0.82 | A limitations page for a specific connector typically lists concrete constraints (supported objects, date ranges, API caps) that are product-specific and not generally known. | | [Manage connections](https://learn.microsoft.com/en-us/azure/databricks/query-federation/connections) | configuration | 0.82 | Documents connection objects, SQL/REST operations, and permission management for Lakehouse Federation connections, with concrete parameter usage. | | [Manage files in volumes](https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/volumes) | integrations | 0.82 | Explains how to upload/download/delete files and stream data using the JDBC driver; includes driver-specific SQL/operations and parameters, which are integration patterns. | +| [SQL warehouse admin settings](https://learn.microsoft.com/en-us/azure/databricks/admin/sql/) | configuration | 0.82 | Covers workspace-level SQL warehouse settings and access controls with product-specific options and defaults, fitting configuration. | | [Salesforce Data 360 File Sharing](https://learn.microsoft.com/en-us/azure/databricks/query-federation/salesforce-data-cloud-file-sharing) | integrations | 0.82 | Details multiple Salesforce connectors, including a file sharing connector, with tables comparing connectors and configuration specifics for this Databricks integration. | | [Set up best practices](https://learn.microsoft.com/en-us/azure/databricks/partner-connect/best-practice) | best-practices | 0.82 | Explicit best-practices page with product-specific guidance on SSO, hosting, and gateway cluster configuration for Partner Connect, including concrete recommendations and gotchas. 
| | [Set up data source](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-source-setup) | configuration | 0.82 | Focuses on configuring MySQL for ingestion using binlog replication; likely includes specific server settings, required options, and constraints that are product-specific. | @@ -404,6 +384,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [libraries](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/libraries-commands) | integrations | 0.82 | Documents exact CLI commands and options for installing, uninstalling, and checking status of libraries on clusters; these are concrete, product-specific command interfaces. | | [model-versions](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/model-versions-commands) | integrations | 0.82 | Defines CLI commands and parameters for model-versions; these are low-level, product-specific interfaces not derivable from general training. | | [parseJson](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/variantval/parsejson) | integrations | 0.82 | Method-level reference for converting JSON strings into VariantVal objects, a product-specific data handling API. | +| [postgres](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/postgres-commands) | integrations | 0.82 | The postgres command group reference exposes concrete commands, flags, and resource management patterns for Lakebase Postgres via CLI, which are detailed integration/API usage specifics. | | [queries](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/queries-commands) | integrations | 0.82 | queries CLI reference documents exact commands and parameters for CRUD operations on queries, which are specific integration details. 
| | [rangeBetween](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/window/rangebetween) | integrations | 0.82 | Explains detailed semantics of rangeBetween, including relative offsets and how ORDER BY values are adjusted, which is precise API behavior. | | [repos](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/repos-commands) | integrations | 0.82 | repos CLI group documents concrete commands and options for Git folder management, which are specific integration/coding patterns. | @@ -425,6 +406,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [ALTER SHARE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-share) | security | 0.80 | Covers ALTER SHARE syntax for adding/removing objects, renaming, and transferring ownership, plus required permissions—detailed sharing and access-control configuration. | | [ALTER TABLE … COLUMN clause](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-table-manage-column) | configuration | 0.80 | Documents ALTER TABLE ... COLUMN syntax for adding, modifying, and dropping columns/fields in Delta tables with Databricks-specific options—detailed configuration parameters. | | [Access HTTP headers](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/http-headers) | configuration | 0.80 | Lists specific X-Forwarded-* headers passed from the reverse proxy and how to use them, which is detailed environment/configuration information (exact header names and semantics) unique to Databricks Apps. | +| [Access control overview](https://learn.microsoft.com/en-us/azure/databricks/security/auth/access-control/) | security | 0.80 | Documents specific permissions and ACL behaviors for different securable object types—detailed RBAC/permission configuration unique to Databricks. 
| | [Access storage using a service principal & Microsoft Entra ID](https://learn.microsoft.com/en-us/azure/databricks/archive/storage/aad-storage-service-principal) | security | 0.80 | Covers storage access via Entra ID app and service principal, including role assignments, scope definitions, and configuration keys for secure access. | | [Access storage using managed identities](https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/azure-managed-identities) | security | 0.80 | Details how to configure managed identities for metastore root and external storage; includes identity types, scopes, and Databricks-specific steps. | | [Add row filters and column masks](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/filters-and-masks/manually-apply) | security | 0.80 | Shows detailed, product-specific patterns for implementing row filters and column masks with mapping tables, including how they integrate with ABAC. These are concrete security configuration and policy application patterns. | @@ -434,8 +416,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Anthropic](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/anthropic-uc-integration) | integrations | 0.80 | Shows how to call Anthropic models with UC tools; expected to include Anthropic SDK parameters and Databricks tool wiring details. | | [Anthropic](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/integrations/anthropic) | integrations | 0.80 | Shows mlflow.anthropic.autolog usage and captured metadata; integration-specific configuration and behavior. 
| | [Anthropic Messages API](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-anthropic-messages) | integrations | 0.80 | Explains how to query foundation models using the Anthropic Messages API against Databricks model serving endpoints, including compatibility constraints (only Anthropic pay-per-token and external models) and migration from Anthropic SDK. This is a concrete API integration pattern with provider- and SKU-specific constraints. | -| [App telemetry](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/observability) | configuration | 0.80 | Describes configuring app telemetry with OpenTelemetry and Unity Catalog tables, including Databricks-specific settings and parameters for traces, logs, and metrics collection. | -| [Audit log system table](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/audit-logs) | configuration | 0.80 | Reference page for audit log system table schema and sample queries; contains column names, types, and usage patterns that are Databricks-specific configuration/usage details. | +| [Audit log system table](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/audit-logs) | configuration | 0.80 | Provides the exact system table path, schema, and sample queries for audit logs, which is detailed product metadata/configuration rather than conceptual guidance. | | [Authenticate](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/authentication) | security | 0.80 | Explains how to set up authentication between the Databricks CLI and Databricks accounts/workspaces. This typically includes specific auth methods, environment variables, profile settings, and token/OAuth configuration parameters unique to the Databricks CLI, which fits the security category as product-specific authentication configuration. 
| | [Authenticate](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/authentication) | security | 0.80 | Covers authorization and authentication; likely includes specific OAuth scopes, token configuration, and workspace auth settings unique to Databricks. | | [Authenticate](https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/authentication) | security | 0.80 | Obtaining and using OAuth tokens for Lakebase involves concrete auth endpoints, scopes, and parameters. These are product-specific authentication configuration details, fitting security. | @@ -453,6 +434,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [CREATE FUNCTION (SQL)](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-sql-function) | integrations | 0.80 | Reference for CREATE FUNCTION with SQL and Python UDFs, including runtime/version requirements and Unity Catalog constraints. Contains concrete syntax, parameterization, and environment requirements that are product-specific coding patterns. | | [CREATE GROUP](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/security-create-group) | security | 0.80 | CREATE GROUP syntax and behavior for Databricks workspace-local groups, including admin requirement and non-compatibility with Unity Catalog. Product-specific IAM configuration semantics. | | [CREATE TABLE CLONE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-clone) | integrations | 0.80 | Explains CREATE TABLE CLONE syntax and semantics, including deep vs shallow clone behavior, supported formats, versioning, and Unity Catalog specifics. These are detailed, product-specific data management patterns. 
| +| [CREATE TABLE [USING]](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-table-using) | configuration | 0.80 | This SQL reference for CREATE TABLE [USING] in Databricks SQL/Runtime describes exact syntax and options for managed, temporary, and external tables, including data source usage and behavior differences (e.g., session-local temp tables, qualification rules, persistence semantics). These are product-specific configuration details for table creation (parameters, options, and constraints) that go beyond generic SQL knowledge, matching the configuration sub-skill. | | [CREATE TEMPORARY VIEW (pipelines)](https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-sql-ref-create-temporary-view) | integrations | 0.80 | CREATE TEMPORARY VIEW (pipelines) reference, including legacy syntax and expectations support; product-specific SQL semantics. | | [CWD for notebooks](https://learn.microsoft.com/en-us/azure/databricks/files/cwd-dbr-14) | configuration | 0.80 | Explains CWD behavior differences by language, runtime version, and workspace config; these are nuanced, product-specific execution settings. | | [Choose between the free trial and Free Edition](https://learn.microsoft.com/en-us/azure/databricks/getting-started/free-trial-vs-free-edition) | decision-making | 0.80 | Explicitly compares two no-cost options and helps users choose; likely includes a feature/benefit comparison table and scenario-based guidance, which is product-specific decision-making content. | @@ -460,19 +442,18 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Classic](https://learn.microsoft.com/en-us/azure/databricks/jobs/run-classic-jobs) | best-practices | 0.80 | Explicit best-practices article for classic jobs, including compute sizing, policies, and performance options; clearly product-specific configuration recommendations. 
| | [Classic](https://learn.microsoft.com/en-us/azure/databricks/ldp/configure-compute) | configuration | 0.80 | Describes configuring classic compute for pipelines, referencing JSON schema and cluster definitions; includes product-specific configuration parameters and permission requirements. | | [Claude Code](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/integrations/claude-code) | integrations | 0.80 | Describes approaches for tracing Claude Code conversations and agents; includes specific integration patterns and configuration. | -| [Clean room system table](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/clean-rooms) | configuration | 0.80 | Reference for the clean room events system table, including table path and schema; these are specific configuration/metadata details. | | [Cluster policies](https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/cluster-policies-cli) | configuration | 0.80 | Documents product-specific CLI subcommands, flags, and usage patterns for managing cluster policies; these are concrete configuration interfaces unique to Databricks. | | [Cluster-scoped init scripts](https://learn.microsoft.com/en-us/azure/databricks/init-scripts/cluster-scoped) | configuration | 0.80 | Describes defining cluster-scoped init scripts via UI/CLI/API and execution order; product-specific configuration details. | | [Clusters](https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/clusters-cli) | integrations | 0.80 | Documents specific clusters CLI subcommands, parameters, and usage patterns for Databricks CLI ≤0.18, which are product-specific API/CLI integration details. 
| | [Compute settings](https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/compute) | configuration | 0.80 | Compute settings for the Simba JDBC driver, including constraints such as the lack of job cluster support, and likely specific parameters; these are Databricks-specific configuration details. | | [Configuration in Python](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/python/) | configuration | 0.80 | Python support for bundles includes APIs and patterns for defining resources in code and modifying YAML; these are detailed configuration/programmatic patterns specific to bundles. | | [Configuration profiles](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/config-profiles) | configuration | 0.80 | Explains configuration profiles stored in .databrickscfg, including profile names and fields. Product-specific configuration mechanism for managing auth across environments. | -| [Configure SCIM provisioning for MS Entra ID](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/scim/aad) | security | 0.80 | Contains detailed, product-specific configuration steps and parameters for SCIM provisioning from Entra ID to Databricks, including app configuration and sync behavior unique to this integration. | +| [Configure SCIM provisioning for MS Entra ID](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/scim/aad) | security | 0.80 | Provides detailed configuration for Entra ID SCIM provisioning to Databricks, including capabilities vs. automatic identity management; clearly IAM/security configuration. | | [Configure Spark properties](https://learn.microsoft.com/en-us/azure/databricks/spark/conf) | configuration | 0.80 | Covers supported options to configure Spark on Azure Databricks. Such a page typically lists specific Spark configuration keys, defaults, and Databricks-specific behaviors, which are product-specific configuration details that qualify as expert knowledge.
| +| [Configure a connection](https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/configure) | configuration | 0.80 | A connection configuration page for the Databricks JDBC driver (v3+) will contain driver-specific connection string parameters, required/optional settings, and possibly defaults. These are concrete configuration options unique to this product, matching the configuration sub-skill. | | [Configure access to resources from model serving endpoints](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/store-env-variable-model-serving) | security | 0.80 | Explains configuring access to external/private resources using environment variables and Databricks secrets. This involves product-specific security configuration patterns (secret scopes, env var usage) and is clearly in the security/identity configuration domain. | | [Configure app.yaml](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/app-runtime) | configuration | 0.80 | Details how app.yaml controls Databricks app execution, including entry points and environment-specific configuration. This is a product-specific config file with defined fields and behavior, fitting the configuration sub-skill type. | | [Configure authorization](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/auth) | security | 0.80 | Covers Databricks Apps authorization model based on OAuth 2.0 and app/user identity models; likely includes specific scopes, roles, and configuration parameters unique to Databricks Apps. | -| [Configure default Python package repositories](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/default-python-packages) | configuration | 0.80 | Describes workspace-level default pip configuration for notebooks, jobs, and Lakeflow Spark Declarative Pipelines, including how to set private/authenticated repos—product-specific configuration parameters and behavior. 
| | [Configure external locations](https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/external-locations) | configuration | 0.80 | Explains how to define external location objects with specific parameters and their role in governing storage access. | | [Configure job parameters](https://learn.microsoft.com/en-us/azure/databricks/jobs/job-parameters) | configuration | 0.80 | Explains Databricks job parameter functionality with concrete configuration in UI, JSON, and YAML for REST API, CLI, and automation bundles. Contains specific parameter constructs and schema unique to Databricks jobs. | | [Configure materialized views in Databricks SQL](https://learn.microsoft.com/en-us/azure/databricks/ldp/dbsql/materialized-configure) | configuration | 0.80 | Explicitly about configuring materialized views, including access control; likely contains specific statement options, parameters, and allowed values (CREATE/ALTER options, access settings), matching configuration criteria. | @@ -486,12 +467,11 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Connect from Python or R](https://learn.microsoft.com/en-us/azure/databricks/integrations/odbc/connect-databricks-excel-python-r) | integrations | 0.80 | Demonstrates using the Databricks ODBC driver from Python and R, including connection strings and driver options; these are concrete integration details unique to Databricks. | | [Connect to Azure Data Lake Storage and Blob Storage](https://learn.microsoft.com/en-us/azure/databricks/archive/storage/azure-storage) | integrations | 0.80 | ABFS driver setup requires specific configuration options (fs.azure.account.*), endpoint formats, and authentication parameters that are product-specific. 
| | [Connect to a workspace](https://learn.microsoft.com/en-us/azure/databricks/integrations/google-sheets/connect) | integrations | 0.80 | Explicitly about setting up the connector and selecting a SQL warehouse; this setup typically involves concrete configuration fields and parameter values (e.g., host, HTTP path, token), which are integration-specific settings. | -| [Connect to external HTTP services](https://learn.microsoft.com/en-us/azure/databricks/query-federation/http) | integrations | 0.80 | Describes creating a Unity Catalog HTTP connection object to call external REST APIs, MCP servers, and AI tools. This implies detailed configuration fields (endpoint, credential storage, proxy behavior) and product-specific parameterization for authenticated requests, matching the integrations category with expert configuration knowledge. | +| [Connect with Databricks SDK](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/external-apps-connect) | integrations | 0.80 | Provides product-specific integration patterns using the Databricks SDK plus standard Postgres drivers, including how to perform OAuth token rotation and connection pooling for Lakebase Autoscaling. | | [Create Postgres roles](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/postgres-roles) | security | 0.80 | Focuses on creating and managing Postgres roles for Lakebase; likely lists specific default roles, their permissions, and how they map to Lakebase projects, which are product-specific RBAC and permission-scope details. | | [Create a Foundation Model Fine-tuning run (API)](https://learn.microsoft.com/en-us/azure/databricks/large-language-models/foundation-model-training/create-fine-tune-run) | configuration | 0.80 | Explicitly states it describes all parameters used in the Fine-tuning API call; this implies parameter names, allowed values, and defaults—product-specific configuration reference that an LLM would not reliably know. 
| | [Create a Unity Catalog metastore](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/create-metastore) | configuration | 0.80 | Shows how to create a metastore and link it to workspaces; this involves specific configuration fields, region constraints, and admin options that are detailed product configuration. | | [Create and manage recipients (Databricks-to-Databricks)](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/create-recipient) | configuration | 0.80 | Explains recipient objects, including different flows depending on Unity Catalog access; involves product-specific object properties and configuration steps for recipients. | -| [Create and manage recipients (tokens)](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/create-recipient-token) | security | 0.80 | Details creating recipient objects using bearer tokens, token generation, rotation, and scope; this is product-specific authentication and access configuration. | | [Create cluster UI (legacy)](https://learn.microsoft.com/en-us/azure/databricks/archive/compute/configure) | configuration | 0.80 | Details many cluster configuration options (modes, autoscaling, termination, Spark options, tags, logging) with specific setting names and behaviors for the legacy UI, which are product-specific configuration details. | | [Create metastore using service principal](https://learn.microsoft.com/en-us/azure/databricks/archive/unity-catalog/service-principals) | security | 0.80 | Legacy procedure for creating metastores, external locations, and managed storage using a service principal; necessarily includes IAM roles, permissions, and security configuration details, matching security configuration guidance. | | [Credentials](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-storage-credentials) | security | 0.80 | Details Unity Catalog credential and storage credential objects used to access external locations and services. 
These are product-specific security configuration constructs and permissions, fitting the security sub-skill. | @@ -507,8 +487,10 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [DROP CONSTRAINT clause](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-table-drop-constraint) | configuration | 0.80 | Provides exact Databricks SQL syntax and behavior for dropping PRIMARY KEY, FOREIGN KEY, and CHECK constraints, which are concrete DDL configuration details. | | [DROP GROUP](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/security-drop-group) | security | 0.80 | DROP GROUP behavior including exception when group does not exist and constraints around workspace-local groups and Unity Catalog. Concrete, product-specific security/IAM behavior. | | [DSPy](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/integrations/dspy) | integrations | 0.80 | Uses mlflow.dspy.autolog; product-specific integration API and what gets traced. | +| [Data access configuration](https://learn.microsoft.com/en-us/azure/databricks/admin/sql/data-access-configuration) | configuration | 0.80 | Describes workspace data access properties for SQL warehouses and their impact (restarts), which are detailed configuration options. | +| [Data classification system tables](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/data-classification) | configuration | 0.80 | Outlines the data classification results table schema and sample queries; this is precise table configuration and usage guidance. | | [Data prep](https://learn.microsoft.com/en-us/azure/databricks/cheat-sheet/bi-serving-data-prep) | best-practices | 0.80 | Summarizes recommended data prep practices with expected impact and action items, tailored to Databricks BI workloads. 
| -| [Data quality monitoring system tables reference](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/data-quality-monitoring) | configuration | 0.80 | Provides table path, schema, and access rules for the data quality monitoring results system table; this is detailed, product-specific configuration information. | +| [Data quality monitoring system tables reference](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/data-quality-monitoring) | configuration | 0.80 | Provides schema and usage for the data quality monitoring results table, including specific fields for freshness, completeness, and impact—expert configuration details. | | [DataFrameNaFunctions class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframenafunctions) | integrations | 0.80 | Class reference for DataFrameNaFunctions, including supported operations and Spark Connect support, which are concrete API details. | | [DataFrameStatFunctions class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframestatfunctions) | integrations | 0.80 | Class reference for statistical functions, including supported operations and Spark Connect support. | | [DataSource class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasource) | integrations | 0.80 | Defines the base DataSource class and required methods (reader, writer) for custom sources; detailed integration contract unique to Databricks PySpark. | @@ -522,23 +504,24 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Databricks Foundation Models](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/integrations/databricks-foundation-models) | integrations | 0.80 | Explains using mlflow.openai.autolog for Databricks Foundation Models; integration-specific behavior and captured metadata. 
| | [Databricks Utilities](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/databricks-utilities) | integrations | 0.80 | Describes how to call Databricks Utilities via Databricks Connect, including specific methods, parameters, and behaviors that are product-specific. | | [Databricks Utilities](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-utils) | configuration | 0.80 | Reference for dbutils modules and commands (files, secrets, etc.) and their availability constraints (DBFS-only). Contains product-specific APIs and behaviors that act as configuration/control surface for Databricks environments. | -| [Databricks access to customer workspaces](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/workspace-access) | security | 0.80 | Describes how Databricks personnel can access workspaces for support and what security measures are enforced, including workspace access settings and behavior unique to the service. | +| [Databricks access to customer workspaces](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/workspace-access) | security | 0.80 | Explains how Databricks personnel can access workspaces and the enforced security measures; this is specific security configuration and behavior. | | [Databricks widgets](https://learn.microsoft.com/en-us/azure/databricks/notebooks/widgets) | configuration | 0.80 | Describes widget types, configuration, layout options, and widget API usage; includes specific API names and parameters, which are product-specific configuration details. | | [Dataset definition functions](https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/definition-function) | integrations | 0.80 | Documents functions and decorators to define datasets; includes specific function signatures and usage patterns unique to this product’s Python API. 
| +| [Declarative features API reference](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/declarative-api-reference) | configuration | 0.80 | API reference page will list specific parameters, options, and allowed values for declarative feature engineering, which are product-specific configuration details. | | [DeepSeek](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/integrations/deepseek) | integrations | 0.80 | Describes tracing DeepSeek via mlflow.openai.autolog; includes integration behavior and constraints. | | [Default compute access mode](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/default-access-mode) | configuration | 0.80 | Covers a workspace setting that controls default data_security_mode for jobs; this is a product-specific configuration with concrete behavior changes. | | [Default workspace permissions](https://learn.microsoft.com/en-us/azure/databricks/security/auth/default-permissions) | security | 0.80 | Details default permissions for admins and non-admins in a new workspace, including which actions are allowed on which objects. These default ACLs/roles are product-specific security settings. | | [Deployment modes](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/deployment-modes) | configuration | 0.80 | Describes syntax and behaviors for declaring deployment modes (dev, prod, etc.) in bundle configs; product-specific configuration semantics. | | [Directory listing mode](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/directory-listing-mode) | configuration | 0.80 | Describes how to configure streams in directory listing mode and includes performance guidance; likely lists specific options and recommended settings. 
| | [Disable DBFS root and mounts](https://learn.microsoft.com/en-us/azure/databricks/dbfs/disable-dbfs-root-mounts) | configuration | 0.80 | Describes how to disable DBFS root and mounts, including an account-level setting ('Disable legacy features') and workspace-level controls. This is concrete configuration guidance with specific setting names and behaviors. | +| [Driver capability settings](https://learn.microsoft.com/en-us/azure/databricks/integrations/odbc/capability) | configuration | 0.80 | Described as detailing 'special and advanced driver capability settings' for the Databricks ODBC driver. Such pages typically list driver capability flags/parameters, allowed values, and behavior, which are product-specific configuration details not inferable from general knowledge. | | [Dynamic setting overrides](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/overrides) | configuration | 0.80 | Explains how to override or join top-level settings with target settings; involves specific configuration patterns and keys for bundles. | | [Dynamic value references](https://learn.microsoft.com/en-us/azure/databricks/jobs/dynamic-value-references) | configuration | 0.80 | Defines the set of dynamic value reference variables (job ID, run start time, etc.) and how they are used in conditions and parameter passing. These are specific configuration tokens and behaviors unique to Databricks. | -| [Embedding for external users](https://learn.microsoft.com/en-us/azure/databricks/dashboards/share/embedding/external-embed) | security | 0.80 | Details use of service principals and scoped access tokens for external embedding; includes product-specific auth and permission configuration. 
| +| [Enable admin protection for "No Isolation Shared" clusters on your account](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/no-isolation-shared) | security | 0.80 | Details an account-level setting to prevent automatic internal credential generation for admins on no isolation shared clusters and references related isolation settings, which are product-specific security controls. | | [Enable customer-managed keys for DBFS using Azure portal](https://learn.microsoft.com/en-us/azure/databricks/security/keys/customer-managed-keys-dbfs/cmk-dbfs-azure-portal) | security | 0.80 | Stepwise portal configuration of CMK from Azure Key Vault for DBFS storage account encryption, with Databricks-specific requirements and flows. | | [Enable customer-managed keys using PowerShell](https://learn.microsoft.com/en-us/azure/databricks/security/keys/customer-managed-keys-dbfs/cmk-dbfs-powershell) | security | 0.80 | Provides PowerShell-based configuration steps and parameters for setting CMK on the Databricks workspace storage account, clearly security configuration. | | [Enable external access](https://learn.microsoft.com/en-us/azure/databricks/external-access/admin) | security | 0.80 | Details how a metastore admin enables external data access and requires specific privileges like 'EXTERNAL USE SCHEMA'. This is product-specific security and permission configuration guidance. | | [Enable table access control for a cluster](https://learn.microsoft.com/en-us/azure/databricks/data-governance/table-acls/table-acl) | security | 0.80 | Provides concrete steps and settings to enable table ACLs on clusters, which is specific security configuration for Databricks. 
| -| [Enable verbose logs](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/verbose-logs) | configuration | 0.80 | Documents specific workspace configuration keys (enableVerboseAuditLogs, workspaceConfKeys) and emitted events, which are Databricks-specific configuration parameters. | | [Enhanced security monitoring](https://learn.microsoft.com/en-us/azure/databricks/security/privacy/enhanced-security-monitoring) | security | 0.80 | Covers how to configure enhanced security monitoring, including workspace/account-level settings and their effects; this is detailed, product-specific security configuration guidance. | | [Event log](https://learn.microsoft.com/en-us/azure/databricks/ldp/monitor-event-logs) | configuration | 0.80 | Explains event log usage and access via UI, REST API, and SQL; includes schema usage and likely field semantics for monitoring and custom actions. | | [Examples](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/examples) | integrations | 0.80 | Contains notebooks and code samples for integrating Structured Streaming with Cassandra, Synapse, and notebooks—showing concrete connection and usage patterns specific to Databricks. | @@ -549,23 +532,23 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Files in Unity Catalog volumes](https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/volumes) | integrations | 0.80 | Shows how to upload/download/delete files in Unity Catalog volumes using the Simba JDBC driver, with driver-specific commands and parameters; this is integration guidance. | | [Files in Unity Catalog volumes](https://learn.microsoft.com/en-us/azure/databricks/integrations/odbc/volumes) | integrations | 0.80 | Shows how to upload/download/delete files in Unity Catalog volumes using the ODBC driver, with driver-specific commands/SQL and parameters; this is a product-specific integration pattern. 
| | [Fivetran](https://learn.microsoft.com/en-us/azure/databricks/partners/ingestion/fivetran) | integrations | 0.80 | Explains how to integrate Fivetran with Databricks SQL warehouses and clusters, a concrete cross-product integration scenario with specific configuration steps and parameters, which matches the integrations sub-skill. | -| [Foundation Model APIs compliance](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/compliance) | security | 0.80 | Describes compliance standards and security profile support for Foundation Model APIs, varying by deployment mode (pay-per-token vs provisioned throughput). This is product-specific security/compliance configuration and support matrix information. | +| [Foundation Model APIs compliance](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/compliance) | security | 0.80 | Describes compliance standards and security profile support by deployment mode; likely includes specific security profile names, supported standards per mode, and configuration implications, which are product-specific security details. | | [Framework examples](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/framework-examples) | integrations | 0.80 | Shows code examples for multiple languages/frameworks using native Postgres auth; likely includes concrete driver parameters and patterns specific to Lakebase connectivity. | | [Function reference](https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/data-modeling/custom-calculations/function-reference) | configuration | 0.80 | A complete function reference is expert knowledge: lists function names, signatures, argument types, and behavior specific to this product. 
| | [Gaps in execution](https://learn.microsoft.com/en-us/azure/databricks/optimizations/spark-ui-guide/spark-job-gaps) | troubleshooting | 0.80 | Explains specific reasons for gaps in the Databricks Spark jobs timeline and how to determine if they are expected, mapping the symptom (gaps) to likely causes and investigation steps in the Spark UI, which is concrete Databricks-specific troubleshooting guidance. | | [Gemini](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/integrations/gemini) | integrations | 0.80 | Integration via mlflow.gemini.autolog; includes concrete configuration and captured trace details. | | [Generate temporary credentials for ingestion](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/generate-temporary-credentials) | security | 0.80 | Focuses on creating temporary credentials with least privilege. This involves specific permission scopes and security settings, which are security configuration expert knowledge. | -| [Genie Code system table](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/assistant) | configuration | 0.80 | Documents the Genie Code events system table, including its path and schema; this is detailed, product-specific system table configuration. | -| [GitHub Actions](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/provider-github) | security | 0.80 | Provides GitHub-specific configuration (OIDC settings, trust relationships) to federate identities to Databricks. Product- and platform-specific security integration details. | | [GitLab CI/CD](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/provider-gitlab) | security | 0.80 | Describes GitLab CI/CD OIDC configuration and Databricks-side setup for token federation. Contains concrete security parameters and flows unique to this integration. 
| +| [Google Drive](https://learn.microsoft.com/en-us/azure/databricks/ingestion/google-drive) | integrations | 0.80 | Covers the Lakeflow Connect Google Drive connector with Databricks-specific functions (read_files, spark.read, COPY INTO, Auto Loader) and likely includes connector options, authentication, and constraints; this is a product-specific integration pattern. | | [Google Gemini API](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-gemini-api) | integrations | 0.80 | Describes using the Google Gemini API and Google AI SDK against Databricks-hosted Gemini models, including compatibility notes and likely request configuration. This is product-specific integration knowledge between Databricks and Gemini. | | [Groq](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/integrations/groq) | integrations | 0.80 | Uses mlflow.groq.autolog; notes only synchronous calls supported, which is a product-specific constraint and integration detail. | | [Groups](https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/groups-cli) | configuration | 0.80 | Describes Databricks groups CLI subcommands and syntax, including how to manage groups via CLI; these are product-specific configuration/management commands. | | [H3_INVALID_CELL_ID error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/h3-invalid-cell-id-error-class) | troubleshooting | 0.80 | Documents a specific H3-related error class and message, which is unique to Databricks’ H3 expression support. | | [Haystack](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/integrations/haystack) | integrations | 0.80 | Explains mlflow.haystack.autolog usage and what Haystack-specific data (latencies, token usage, metadata) is captured, a concrete integration pattern. 
| | [INSUFFICIENT_TABLE_PROPERTY error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/insufficient-table-property-error-class) | troubleshooting | 0.80 | Documents a specific Databricks error class and message about missing table properties, providing product-specific error semantics for diagnosis. | -| [IP addresses and domains for Databricks services and assets](https://learn.microsoft.com/en-us/azure/databricks/resources/ip-domain-region) | configuration | 0.80 | Lists specific IP address ranges and domain names per region for Databricks services, used for UDR/firewall configuration; this is detailed, product-specific configuration data. | +| [IP addresses and domains for Databricks services and assets](https://learn.microsoft.com/en-us/azure/databricks/resources/ip-domain-region) | configuration | 0.80 | Provides region-specific IP address ranges and domain names for Azure Databricks services and assets, used for UDRs, firewalls, and VNets. These are detailed, product-specific configuration values that an LLM would not know from training and are used to configure network/security settings. | | [Identities and privileges](https://learn.microsoft.com/en-us/azure/databricks/ldp/privileges) | security | 0.80 | Covers identities, permissions, and privileges for pipelines; likely lists specific roles/privileges and default access behaviors, which are product-specific security configurations. | +| [Inference tables](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/inference-tables) | configuration | 0.80 | Describes what inference tables log and how to enable them; this will include table schemas, column names, and configuration steps specific to Databricks AI Gateway, which are expert configuration details. 
| | [Instance pools](https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/instance-pools-cli) | configuration | 0.80 | Covers Databricks pools CLI usage, including required CLI version and specific subcommands/parameters for configuring instance pools, which are detailed configuration interfaces. | | [JDBC connection](https://learn.microsoft.com/en-us/azure/databricks/connect/jdbc-connection) | integrations | 0.80 | Details JDBC connection objects, driver specification, URL, credentials, and usage via Spark and Remote Query SQL API; includes beta/runtime constraints and migration guidance. | | [Job task type settings](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/job-task-types) | configuration | 0.80 | Provides details on how to declare job tasks in bundle configs and includes product-specific guidance (e.g., not using `git_source`/`GIT` in bundles). | @@ -590,24 +573,26 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Log model dependencies](https://learn.microsoft.com/en-us/azure/databricks/mlflow/log-model-dependencies) | configuration | 0.80 | Focuses on logging model dependencies alongside artifacts; likely includes specific config fields, file formats, and patterns unique to MLflow on Databricks. | | [MAX_FILE_PARTITION_BYTES](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/parameters/max_partition_bytes) | configuration | 0.80 | Configuration parameter that controls maximum partition size when reading from file data sources, affecting parallelism; Databricks-specific config with performance implications. | | [METASTORE_PRIVILEGES](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/metastore_privileges) | security | 0.80 | METASTORE_PRIVILEGES lists principals and privileges on the metastore, a core RBAC/security configuration detail. 
|
-| [MLflow system tables](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/mlflow) | configuration | 0.80 | Describes MLflow system tables, their schema, and usage for cross-workspace analytics; these are concrete configuration/metadata details unique to Databricks. |
+| [MLflow system tables](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/mlflow) | configuration | 0.80 | Documents MLflow system table schemas and how to query them, which is detailed Databricks-specific configuration/metadata. |
 | [Manage PostgreSQL roles](https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/pg-roles) | security | 0.80 | Focuses on mapping Azure Databricks identities to Postgres roles; likely includes role names, privileges, and configuration patterns unique to Lakebase. |
 | [Manage access to service principals](https://learn.microsoft.com/en-us/azure/databricks/security/auth/access-control/service-principal-acl) | security | 0.80 | Describes roles for managing service principals and how to grant access; likely lists specific Databricks roles/permissions and ACL behaviors, which are product-specific security configuration details. |
 | [Manage context-based ingress policies](https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/manage-ingress-policies) | security | 0.80 | Provides concrete admin steps to create network policies, author granular ingress rules, and attach them to workspaces, including Databricks-specific policy model. |
+| [Manage database permissions](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-roles-permissions) | security | 0.80 | Explains how to use Postgres GRANT commands for Lakebase projects, including schema/table-level access patterns. This is concrete, product-scoped IAM/permission guidance rather than generic SQL theory. |
+| [Manage notification destinations](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/notification-destinations) | configuration | 0.80 | Describes system notification destinations, including webhook-based configuration for workflow events and access requests; these are concrete product-specific settings. |
 | [Manage permissions](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-project-permissions) | security | 0.80 | Covers project permissions for identities, groups, and service principals; likely lists specific permission types and scopes for Lakebase resources. |
 | [Manage personal access tokens](https://learn.microsoft.com/en-us/azure/databricks/security/auth/api-access-permissions) | security | 0.80 | Managing PAT permissions is security-focused and product-specific. This page is about configuring who can create/modify tokens and use passwords, which implies concrete permission settings and possibly role or scope names unique to Databricks, matching the security sub-skill. |
 | [Manage storage credentials](https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/manage-storage-credentials) | security | 0.80 | Covers listing, updating, and permission recommendations (e.g., grant only CREATE EXTERNAL LOCATION) for storage credentials, which are detailed security practices. |
 | [Manual token generation](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/aad-token-manual) | security | 0.80 | Explains manual generation of Entra ID access tokens for Databricks, including token scopes, lifetimes, and usage. Contains advanced, product-specific security/auth details. |
-| [Marketplace system tables](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/marketplace) | configuration | 0.80 | Documents specific Marketplace system tables, their schema, and how they are enabled and used; these table structures and paths are product-specific configuration knowledge. |
 | [Microsoft Teams](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/teams-agent) | integrations | 0.80 | Shows Teams integration using Azure Bot Service and OAuth On Behalf Of; this is a concrete integration with detailed auth configuration and Databricks-specific setup. |
 | [Mistral](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/integrations/mistral) | integrations | 0.80 | Uses mlflow.mistral.autolog; includes constraint that only synchronous Text Generation API calls are traced, which is expert integration detail. |
 | [Model Serving timeouts](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/model-serving-timeouts) | limits-quotas | 0.80 | Covers different timeout types (deployment, server-side, client-side) and how to handle them; these articles typically include default and maximum timeout values and configuration ranges, which are numeric constraints. |
 | [Model units in provisioned throughput](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/model-units) | limits-quotas | 0.80 | Explains model units as a throughput measure tied to tokens per minute and work per request. Such pages typically include formulas, numeric mappings (e.g., MU to tokens/min), and capacity thresholds that are not generally known. |
 | [Monitor resource quotas](https://learn.microsoft.com/en-us/azure/databricks/resources/manage-resource-quotas) | limits-quotas | 0.80 | Focuses on resource quotas and usage monitoring via specific APIs; quotas imply concrete numeric limits and tracking, which is expert limits-quotas knowledge. |
 | [MySQL](https://learn.microsoft.com/en-us/azure/databricks/query-federation/mysql) | integrations | 0.80 | Step-by-step configuration for MySQL federation, including required Unity Catalog objects and connection parameters; product-specific integration pattern. |
-| [OAuth authentication as a user](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-u2m) | security | 0.80 | Details OAuth 2.0 user authorization for Databricks APIs/SDKs, including token lifetimes (one hour) and unified auth behavior, matching security configuration criteria. |
+| [OAuth access token](https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-oauth-access-token) | integrations | 0.80 | Describes Databricks Lakehouse Federation configuration for Snowflake using OAuth access tokens, including specific connection objects, authentication parameters, and patterns unique to this integration, beyond generic OAuth or Snowflake knowledge. |
 | [Object ownership](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/ownership) | security | 0.80 | Describes Databricks-specific ownership rules, capabilities, and effects on privileges for Unity Catalog securable objects, which are detailed IAM semantics unique to this product. |
 | [OpenAI](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/openai-uc-integration) | integrations | 0.80 | Describes using UC tools in OpenAI workflows; likely includes OpenAI SDK call patterns and Databricks-specific tool configuration not known generically. |
+| [OpenAI Responses API](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-openai-responses) | integrations | 0.80 | Describes how to use the OpenAI Responses API with Databricks endpoints; will include endpoint URLs, request/response schemas, and parameters specific to Databricks’ integration with OpenAI-compatible APIs. |
 | [OpenTelemetry for custom model serving endpoints](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/custom-model-serving-uc-logs) | configuration | 0.80 | Describes configuring endpoint telemetry to persist logs, traces, and metrics to Unity Catalog tables. This requires specific configuration parameters, table targets, and possibly schema/setting names that are product-specific expert knowledge. |
 | [Orchestrate jobs with Airflow](https://learn.microsoft.com/en-us/azure/databricks/jobs/how-to/use-airflow-with-jobs) | integrations | 0.80 | Describes installing/configuring Airflow and Databricks provider, plus example workflow; likely includes operator names, connection parameters, and integration-specific settings. |
 | [Other providers](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/provider-other) | security | 0.80 | Shows how to configure OIDC federation for Terraform Cloud, Bitbucket Pipelines, and Jenkins with Databricks. Contains concrete security configuration fields and flows. |
@@ -625,7 +610,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [Pool best practices](https://learn.microsoft.com/en-us/azure/databricks/compute/pool-best-practices) | best-practices | 0.80 | Explicit best-practices article on how to best configure and use pools; contains product-specific recommendations and patterns. |
 | [PostgreSQL](https://learn.microsoft.com/en-us/azure/databricks/query-federation/postgresql) | integrations | 0.80 | Step-by-step configuration for PostgreSQL federation, including required Unity Catalog objects and connection parameters; product-specific integration pattern. |
 | [Pre-created roles](https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/roles) | security | 0.80 | Explains predefined Postgres roles for Lakebase; these role names and associated permissions are product-specific security configuration knowledge. |
-| [Predictive optimization system table](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/predictive-optimization) | configuration | 0.80 | Outlines the predictive optimization operation history table schema and table path, plus sample queries; these are concrete configuration details for a specific feature. |
+| [Predictive optimization system table](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/predictive-optimization) | configuration | 0.80 | Reference for the predictive optimization system table with schema and sample queries; this is specific configuration/metadata knowledge. |
 | [Prepare data for Foundation Model Fine-tuning](https://learn.microsoft.com/en-us/azure/databricks/large-language-models/foundation-model-training/data-preparation) | configuration | 0.80 | Describes accepted training and evaluation data file formats; such pages typically specify schema fields, required/optional columns, and format constraints—product-specific configuration of input data. |
 | [Pricing system table](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/pricing) | configuration | 0.80 | Describes pricing system table schema and usage; table path and columns are Databricks-specific configuration/metadata details. |
 | [Private Link](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/private-link) | security | 0.80 | Private Link configuration pages normally list endpoint types, required DNS/endpoint settings, and traffic routing specifics for workspace and client connections, which are product-specific security and network configuration details. |
@@ -634,6 +619,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [Privileges reference](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/access-control/privileges-reference) | security | 0.80 | A privilege reference page listing each privilege and applicable securable objects is inherently detailed security configuration knowledge with specific role/privilege names. |
 | [Programmatically interact with workspace files](https://learn.microsoft.com/en-us/azure/databricks/files/workspace-interact) | configuration | 0.80 | Includes runtime/version requirements and an environment variable `WSFS_ENABLE_WRITE_SUPPORT=false` controlling behavior, which are concrete configuration parameters. |
 | [Provider native APIs](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/provider-native-apis) | integrations | 0.80 | Shows how to use provider-native APIs through Databricks; includes endpoint names, parameters, and configuration details for each provider within Databricks. |
+| [Pub/Sub](https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/pub-sub) | integrations | 0.80 | Describes a built-in Pub/Sub connector with exactly-once semantics, authentication, schema, and configuration options; includes product-specific options and caveats (duplicates, out-of-order) that form an integration and coding pattern. |
 | [Publish features](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/publish-features) | integrations | 0.80 | Describes publishing features to supported online stores; likely lists supported systems and includes configuration parameters and API usage for each integration. |
 | [PydanticAI](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/integrations/pydantic-ai) | integrations | 0.80 | Shows MLflow integration with PydanticAI using mlflow.pydantic_ai.autolog and details about recorded agent steps and tool calls; product-specific API pattern. |
 | [Quality guide](https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-retrieval-quality) | best-practices | 0.80 | Systematic, ordered recommendations for tuning retrieval quality in RAG/search/matching; contains Databricks-specific guidance and patterns that go beyond generic RAG advice. |
@@ -643,7 +629,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [REFRESH POLICY clause](https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-sql-ref-create-materialized-view-refresh-policy) | integrations | 0.80 | REFRESH POLICY clause reference with semantics for incrementalization; includes clause options and behavior unique to this product. |
 | [ROUTINE_PRIVILEGES](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/routine_privileges) | security | 0.80 | ROUTINE_PRIVILEGES lists principals and privileges on routines and notes MANAGE behavior; detailed RBAC semantics. |
 | [ROW_COLUMN_ACCESS error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/row-column-access-error-class) | troubleshooting | 0.80 | Describes a Databricks error class related to row filters and column masks and notes derived error classes, which is specific to Databricks security features and used for troubleshooting. |
-| [Rate limits](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/rate-limits-beta) | limits-quotas | 0.80 | A page specifically about configuring rate limits for AI Gateway endpoints is likely to include concrete numeric limits (requests per minute/hour, token caps), configuration ranges, and possibly tier-specific constraints, which qualify as expert limits/quotas knowledge. |
 | [Read shared data (OIDC M2M)](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/sharing-over-oidc-m2m) | integrations | 0.80 | Covers machine-to-machine OAuth client credentials flow for Python clients; includes client registration, token acquisition, and Delta Sharing-specific parameters. |
 | [Read shared data (OIDC U2M)](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/sharing-over-oidc-u2m) | integrations | 0.80 | Describes user-to-machine OIDC flows for tools like Power BI; includes client configuration, scopes, and token usage specific to this integration pattern. |
 | [Read state](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/read-state) | configuration | 0.80 | Documents Databricks-specific SQL table-valued functions and DataFrame APIs for accessing streaming state/metadata, including required permissions and usage constraints. |
@@ -656,18 +641,16 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [Reference](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/salesforce-reference) | configuration | 0.80 | Reference material including unsupported objects and data type transformations implies detailed tables of fields and mappings—product-specific configuration details. |
 | [Reference](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/servicenow-reference) | configuration | 0.80 | Reference material with transformations from ServiceNow data types to Delta-compatible types implies detailed mapping tables—configuration-level expert knowledge. |
 | [Reference](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/tiktok-ads-reference) | configuration | 0.80 | Lists supported tables, dimensions, and metrics—detailed connector-specific schema/configuration information that is expert reference knowledge. |
+| [Reference](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/reference) | limits-quotas | 0.80 | Provides reference tables for supported environments, languages, sources, sinks, operators, and known limitations for real-time mode; these tables typically include explicit constraints and support matrices that function as product-specific limits/quotas. |
 | [Refresh Unity Catalog metadata](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-refresh-foreign) | security | 0.80 | Describes REFRESH FOREIGN for catalogs, schemas, and tables and explicitly lists required privileges for each operation and clause. These are product-specific permission requirements, fitting the security category. |
 | [Repos](https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/repos-cli) | integrations | 0.80 | Repos CLI page describes commands and required CLI versions for interacting with Databricks repos, a product-specific integration. |
-| [Row and column filters](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/filters-and-masks/) | security | 0.80 | Provides concrete guidance on defining and applying row filters and column masks in Unity Catalog to protect sensitive data. This includes product-specific security policy constructs and behaviors. |
+| [Roles and permissions](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/roles-permissions) | security | 0.80 | Details how to set up Postgres roles and grant permissions for Lakebase projects. Likely includes specific role patterns, required privileges to connect/query, and how these map to Azure Databricks, which are product-specific security configurations. |
 | [Row class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/row) | integrations | 0.80 | Details Row construction, field access rules, and version-specific behavior changes (e.g., field order), which are concrete, product-specific API semantics. |
 | [Run jobs with service principals](https://learn.microsoft.com/en-us/azure/databricks/jobs/how-to/run-jobs-with-service-principals) | security | 0.80 | Focuses on running jobs as a service principal; likely includes specific RBAC roles, scopes, and auth configuration details for Databricks–Entra integration. |
 | [Runs](https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/runs-cli) | integrations | 0.80 | Runs CLI documentation includes detailed subcommands and usage for job runs, which are Databricks-specific API/CLI patterns. |
 | [RuntimeConfig class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/runtimeconfig) | configuration | 0.80 | Describes the user-facing configuration API (spark.conf / SparkSession.conf) and propagation to Hadoop I/O, which is core configuration behavior for Databricks PySpark. |
-| [SCIM provisioning overview](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/scim/) | configuration | 0.80 | SCIM provisioning setup includes endpoint URLs, token scopes, and IdP app settings specific to Databricks, matching configuration sub-skill criteria. |
 | [SET](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-conf-mgmt-set) | configuration | 0.80 | Documents Databricks-specific parameter system, including listing all parameters and distinction from SQL variables and query tags. |
 | [SQL serving](https://learn.microsoft.com/en-us/azure/databricks/cheat-sheet/bi-serving-sql-serving) | best-practices | 0.80 | Outlines recommended provisioning, management, and monitoring practices for SQL warehouses, including product-specific settings. |
-| [SQL warehouse admin settings](https://learn.microsoft.com/en-us/azure/databricks/admin/sql/) | configuration | 0.80 | Details workspace-level configuration options and access controls for SQL warehouses, including default behaviors and Unity Catalog assumptions, which are Databricks-specific configuration parameters. |
-| [SQL warehouse events system table](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/warehouse-events) | configuration | 0.80 | Describes a specific system table (path, schema, and usage patterns) for warehouse events; these details are configuration/metadata unique to Azure Databricks. |
 | [STATEMENT_TIMEOUT](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/parameters/statement_timeout) | limits-quotas | 0.80 | STATEMENT_TIMEOUT is a configuration parameter specifying a timeout duration in seconds; this is an explicit timeout limit with units and configurable range, fitting limits-quotas. |
 | [STORAGE_CREDENTIALS](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/storage_credentials) | configuration | 0.80 | STORAGE_CREDENTIALS relation defines how storage credentials are represented and notes deprecation in favor of CREDENTIALS; this is product-specific configuration detail. |
 | [Schema inference and evolution](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/schema) | configuration | 0.80 | Explains how to configure schema inference/evolution and rescue data; likely includes specific Auto Loader options/parameters and their behaviors. |
@@ -678,11 +661,9 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [Service principal authentication](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/azure-sp) | security | 0.80 | Details using Entra service principal credentials for Databricks auth, including when to choose this vs OAuth M2M and how to configure. Product-specific security/auth patterns. |
 | [Set privileges on a data object (legacy)](https://learn.microsoft.com/en-us/azure/databricks/data-governance/table-acls/object-privileges) | security | 0.80 | Describes the legacy Hive metastore privilege model and how to grant/deny/revoke privileges on specific securables. This includes product-specific RBAC/privilege semantics that qualify as security expert knowledge. |
 | [Set up and use managed identities](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/azure-mi-auth) | security | 0.80 | Step-by-step setup for user-assigned/system-assigned managed identities with Databricks accounts and workspaces. Contains concrete security configuration steps and identity bindings. |
-| [Set up data source](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/d365-source-setup) | configuration | 0.80 | Source setup for D365 plus Synapse Link and ADLS Gen2 will include specific configuration parameters, connection details, and authentication settings unique to this connector. |
 | [Set up data source](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/jira-source-setup) | security | 0.80 | Describes configuring OAuth 2.0 for Jira ingestion, which involves product-specific auth endpoints, scopes, and credential parameters. |
-| [Set up environment](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/environment) | configuration | 0.80 | Explicitly about setting up and configuring Python environments, including environment caching behavior and known limitations, which are product-specific configuration details and constraints. |
+| [Set up environment](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/environment) | configuration | 0.80 | Covers choosing and configuring Python environments, environment caching behavior, custom module imports, and known limitations—detailed, product-specific configuration behavior. |
 | [Set up service principal for M2M OAuth](https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi-m2m) | security | 0.80 | Describes setting up service principals and M2M OAuth for Power BI. Expected to include specific app registration settings, Databricks permissions, scopes, and security-related configuration unique to this integration. |
-| [Shared materialization history system table reference](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/materialization) | configuration | 0.80 | Documents the shared materialized data history system table, including table path and schema; this is detailed configuration/metadata for a specific Databricks feature. |
 | [SimpleDataSourceStreamReader class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/simpledatasourcestreamreader) | integrations | 0.80 | Class reference describing how SimpleDataSourceStreamReader differs from DataSourceStreamReader, including driver-side reading, offset planning, and suitability constraints, which are detailed, product-specific streaming integration semantics. |
 | [Slack](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/slack-agent) | integrations | 0.80 | Shows how to connect an agent to Slack using HTTP Unity Catalog connections and user-to-machine auth; this is a concrete integration with product-specific connection parameters and auth setup. |
 | [Slow stage I/O](https://learn.microsoft.com/en-us/azure/databricks/optimizations/spark-ui-guide/slow-spark-stage-low-io) | troubleshooting | 0.80 | Describes causes of slow stages with little I/O and how to identify them using the SQL DAG, providing Databricks-specific symptom → cause mappings and investigation steps, which is troubleshooting content. |
@@ -692,10 +673,10 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [TIMEZONE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/parameters/timezone) | configuration | 0.80 | TIMEZONE is a Databricks-specific configuration parameter controlling local timezone for timestamp operations, with details on how to set it at different scopes. |
 | [Table properties and table options](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-tblproperties) | configuration | 0.80 | TBLPROPERTIES and table options syntax with Databricks-specific storage and tagging behaviors. |
 | [Tracing integrations](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/integrations/) | integrations | 0.80 | Overview of tracing integrations with many frameworks; includes one-line autologging patterns and framework-specific hooks. |
+| [Transfer object ownership](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/transfer-object-ownership) | security | 0.80 | Describes a Lakebase-specific workaround for transferring Postgres object ownership using a temporary shared role instead of direct ALTER OWNER, tied to Azure Databricks group roles. This is a product-specific security/permissions pattern and edge case. |
 | [Troubleshoot](https://learn.microsoft.com/en-us/azure/databricks/jobs/repair-job-failures) | troubleshooting | 0.80 | Organized around failed tasks with guidance to identify causes, fix issues, and repair runs using Databricks-specific tools, matching troubleshooting criteria. |
 | [Troubleshoot compute overview](https://learn.microsoft.com/en-us/azure/databricks/compute/troubleshooting/) | troubleshooting | 0.80 | Dedicated troubleshooting article for compute startup behavior; links to symptom-based resources for diagnosing unresponsive compute and metastore issues. |
 | [Troubleshoot connections](https://learn.microsoft.com/en-us/azure/databricks/partner-connect/troubleshoot) | troubleshooting | 0.80 | Explicit troubleshooting page. Likely organized by common Partner Connect symptoms with specific error messages/codes and corresponding resolutions, which are product-specific diagnostic knowledge. |
-| [Troubleshoot high initialization times](https://learn.microsoft.com/en-us/azure/databricks/ldp/fix-high-init) | best-practices | 0.80 | Discusses causes of high initialization times and recommends splitting tables across pipelines with thresholds (e.g., over five minutes); these are product-specific performance best practices. |
 | [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/scala/troubleshooting) | troubleshooting | 0.80 | Explicit troubleshooting article; expected to map specific errors and version issues to resolutions for Databricks Connect for Scala. |
 | [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/confluence-troubleshoot) | troubleshooting | 0.80 | Explicitly described as troubleshooting common issues with the Confluence connector, including authentication errors and rate limits. Such pages typically map specific error messages and causes to resolutions, which is product-specific expert knowledge. |
 | [UNDROP TABLE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-undrop-table) | limits-quotas | 0.80 | UNDROP command page includes a concrete 7-day retention period for recoverable relations and behavior when multiple drops exist—an explicit numeric limit and product-specific recovery semantics. |
@@ -703,20 +684,23 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [UNSUPPORTED_VIEW_OPERATION error condition](https://learn.microsoft.com/en-us/azure/databricks/error-messages/unsupported-view-operation-error-class) | troubleshooting | 0.80 | Similar to tables, this documents a view-specific error class and message template, which is concrete product behavior for troubleshooting. |
 | [USE_CACHED_RESULT](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/parameters/use_cached_result) | configuration | 0.80 | Describes a specific configuration parameter (USE_CACHED_RESULT), its default (TRUE), and behavior for reusing result sets; includes product-specific config and default value. |
 | [Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/unity-catalog-cli) | configuration | 0.80 | Describes the Unity Catalog CLI, including that it is experimental; CLI docs typically provide specific command and option references, which are detailed configuration/management knowledge. |
+| [Use Vector Search with an OAuth token](https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-oauth-token) | security | 0.80 | Explains how to obtain and use OAuth tokens with the Vector Search SDK/HTTP, including token lifetime (60 minutes) and refresh considerations, which are product-specific security/auth configuration details. |
 | [Use a custom embedding model](https://learn.microsoft.com/en-us/azure/databricks/vector-search/custom-embedding-model) | integrations | 0.80 | Shows how to log a custom Python model with specific input/output schema requirements (ColSpec, TensorSpec, NumPy array embeddings); these are concrete integration and API contract details. |
 | [User-defined route settings](https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/udr) | configuration | 0.80 | User‑defined route settings for Databricks require specific route table entries, next hop types, and per‑connection requirements (control plane, data plane, on‑prem). These are detailed configuration parameters and patterns unique to Databricks networking. |
 | [Utility objects script](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-utility-script) | configuration | 0.80 | Describes preparing MySQL using a utility objects script; likely documents specific database objects, scripts, and configuration steps unique to this connector. |
+| [Vector search with the Python SDK](https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-python-sdk-example) | integrations | 0.80 | Shows detailed usage of VectorSearchClient and related APIs, including parameters and request patterns unique to Databricks Vector Search. |
 | [What is DBFS?](https://learn.microsoft.com/en-us/azure/databricks/dbfs/) | configuration | 0.80 | Explains DBFS root and mount deprecation and recommends alternative storage mechanisms; these are Databricks-specific storage configuration details. |
 | [What is serverless egress control?](https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/network-policies) | security | 0.80 | Explains serverless egress control requirements (Premium tier) and supported workloads, with Databricks-specific policy behavior to restrict outbound connections. |
 | [WindowSpec class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/windowspec) | integrations | 0.80 | Class-level reference for WindowSpec, including partitioning, ordering, frame boundaries, and Spark Connect support. |
 | [Workspace](https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/workspace-cli) | integrations | 0.80 | Workspace CLI documentation provides commands and parameters for managing workspace items, which are Databricks-specific integration details. |
-| [Workspaces system table](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/workspaces) | configuration | 0.80 | Explains the workspaces system table, including its structure and how to join with other tables; these table-level details are product-specific configuration. |
+| [Write your own](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/tutorial-databricks-apps-autoscaling) | integrations | 0.80 | Shows a concrete authentication and connection pattern (service principal, OAuth token rotation, credential refresh) between Databricks Apps and Lakebase Autoscaling, with code-level details and product-specific parameters. |
 | [abort](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamwriter/abort) | integrations | 0.80 | Explains how commit messages from tasks are used to abort microbatches on driver; product-specific failure-handling behavior. |
 | [ai_query function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_query) | integrations | 0.80 | ai_query() is explicitly for invoking Databricks Model Serving endpoints; page will include endpoint parameterization, request/response handling, and constraints unique to Databricks, clearly an integration & coding pattern. |
 | [ai_query: general-purpose function](https://learn.microsoft.com/en-us/azure/databricks/large-language-models/ai-query) | integrations | 0.80 | Focuses on the ai_query function, a Databricks-specific API for querying AI models from SQL/Python. Likely includes function signature, parameters (model, prompt, temperature, etc.), and usage constraints unique to Databricks, matching the integrations & coding patterns criteria with concrete parameter references. |
 | [append](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/append) | integrations | 0.80 | API method describing append semantics for v2 tables, which is specific integration behavior. |
 | [approx_top_k](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/approx_top_k) | limits-quotas | 0.80 | Includes explicit numeric defaults (k default 5, maxItemsTracked default 10000) and an error bound formula 2.0 * numRows / maxItemsTracked, which are precise limits/constraints. |
 | [asTable](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/astable) | integrations | 0.80 | Describes DataFrame.asTable and its use with TVFs/UDTFs, including object type and usage patterns unique to Databricks’ PySpark. |
+| [auth](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/auth-commands) | security | 0.80 | The auth command group reference will document concrete authentication-related commands, flags, and possibly OAuth configuration parameters specific to Databricks CLI. Because it focuses on authentication setup and identity, it best fits the security category and contains expert, product-specific details. |
 | [bitwiseAND](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/bitwiseand) | integrations | 0.80 | Details a specific Column bitwise operation, including Databricks Runtime version notes and Spark Connect support, which is product-specific API knowledge. |
 | [bitwiseOR](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/bitwiseor) | integrations | 0.80 | Similar to index 11, this documents a precise Column API and runtime/version behavior, which is expert-level integration detail. |
 | [bitwiseXOR](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/bitwisexor) | integrations | 0.80 | Provides method-level documentation including Databricks Runtime changes, which is detailed product-specific SDK information. |
@@ -751,6 +735,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [get_json_object](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/get_json_object) | integrations | 0.80 | Describes jsonpath-based extraction and null behavior on invalid JSON; Databricks-specific function semantics. |
 | [git-credentials](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/git-credentials-commands) | integrations | 0.80 | The 'git-credentials' command group reference will specify commands and parameters for registering PATs for Git operations, which are concrete integration and configuration details. |
 | [grouping](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/grouping) | integrations | 0.80 | Explains grouping’s 0/1 return values tied to GROUP BY aggregation state; product-specific aggregate function behavior. |
+| [groups-v2](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/groups-v2-commands) | integrations | 0.80 | Describes Databricks CLI groups-v2 commands with specific operations and parameters for managing workspace groups. This is product-specific CLI/API surface, fitting integrations & coding patterns. |
 | [h3_compact function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_compact) | integrations | 0.80 | Provides exact function usage, argument types, and semantics for H3 compaction in Databricks SQL, which is product-specific. |
 | [h3_hexring function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_hexring) | integrations | 0.80 | Function reference for creating hollow hex rings at distance k; includes Databricks SQL syntax and array return behavior.
| | [h3_kring function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_kring) | integrations | 0.80 | Documents Databricks SQL function returning all cells within grid distance k; includes array return and parameter behavior. | @@ -776,7 +761,9 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [kll_merge_agg_bigint](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/kll_merge_agg_bigint) | limits-quotas | 0.80 | Specifies numeric parameter constraints (k range 8–65535) and behavior when k is omitted, which are explicit limits and defaults. | | [kll_merge_agg_double](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/kll_merge_agg_double) | limits-quotas | 0.80 | Includes explicit numeric limits for k (8–65535) and behavior when not specified, matching limits-quotas criteria. | | [kll_merge_agg_float](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/kll_merge_agg_float) | limits-quotas | 0.80 | Documents numeric range constraints for k (8–65535) and defaulting behavior, which are concrete limits. | +| [knowledge-assistants](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/knowledge-assistants-commands) | integrations | 0.80 | Provides detailed CLI commands and options for the knowledge-assistants command group, exposing product-specific parameters and behaviors that are expert integration details. | | [lag analytic window function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/lag) | integrations | 0.80 | Details Databricks-specific syntax and options for the lag analytic window function, including parameters and default behaviors, which are concrete API semantics. 
| +| [lakeview-embedded](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/lakeview-embedded-commands) | integrations | 0.80 | Covers token-based Lakeview embedded APIs through CLI commands, including specific command names and parameters for embedding dashboards, which are concrete integration patterns. | | [latestOffset](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamreader/latestoffset) | integrations | 0.80 | Details interaction between start offset, read limits, and no-data scenarios; product-specific streaming offset API. | | [like](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/like) | integrations | 0.80 | Provides method-level documentation for SQL LIKE semantics on a Column in Databricks PySpark. | | [like operator](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/like) | integrations | 0.80 | Documents the like operator syntax, escape behavior, and matching semantics in Databricks SQL, which are product-specific operator/API details. | @@ -787,9 +774,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [notification-destinations](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/notification-destinations-commands) | integrations | 0.80 | Documents notification-destinations CLI commands and options, which are specific integration endpoints and configuration parameters. | | [online-tables](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/online-tables-commands) | integrations | 0.80 | Provides exact online-tables CLI commands and arguments, representing concrete product-specific integration patterns. 
| | [option](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/option) | integrations | 0.80 | Documents the option method for configuring data source parameters, including names and values, which is integration-focused configuration. | -| [option](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/option) | integrations | 0.80 | Documents how to configure output data source options via named parameters, which is integration configuration detail. | | [options](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/options) | integrations | 0.80 | API for bulk-setting data source options; parameter names and usage are product-specific integration details. | -| [options](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/options) | integrations | 0.80 | API for bulk-setting output options; parameter names and usage are product-specific. | | [orc](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframereader/orc) | integrations | 0.80 | Method reference for loading ORC with specific parameters and behavior, which is concrete integration knowledge. | | [orc](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/orc) | integrations | 0.80 | API method for ORC output with specific parameters and behavior. | | [otherwise](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/otherwise) | integrations | 0.80 | Documents how otherwise() interacts with when() conditions in Databricks PySpark, which is detailed API behavior. 
| @@ -807,15 +792,12 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [percentile](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/percentile) | integrations | 0.80 | Defines exact percentile computation, valid range [0.0, 1.0], and expression handling for a Databricks PySpark function. | | [percentile_approx](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/percentile_approx) | integrations | 0.80 | Documents approximate percentile semantics, ordering, and percentage interpretation for a Databricks PySpark aggregation function. | | [providers](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/providers-commands) | integrations | 0.80 | providers CLI group for Delta Sharing exposes concrete command names, flags, and behaviors that are product-specific integration patterns. | -| [psql](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/psql-command) | integrations | 0.80 | psql command reference includes connection parameters, flags, and usage patterns specific to Databricks-managed Postgres. | | [pushFilters](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcereader/pushfilters) | integrations | 0.80 | Details how filters are passed and interpreted (AND semantics) and default behavior; product-specific optimization API. | -| [quality-monitors](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/quality-monitors-commands) | integrations | 0.80 | quality-monitors CLI commands and options for creating/editing/deleting monitors are detailed, product-specific interfaces. | | [rangeBetween](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/windowspec/rangebetween) | integrations | 0.80 | Explains detailed semantics of range-based frame boundaries relative to the current row, which is specific API behavior. 
| | [read](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcereader/read) | integrations | 0.80 | Describes per-partition read semantics and resource initialization; concrete integration contract for custom readers. | | [read](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcestreamreader/read) | integrations | 0.80 | Describes per-partition streaming read semantics and resource initialization; product-specific streaming integration behavior. | | [read](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/simpledatasourcestreamreader/read) | integrations | 0.80 | Documents how read returns both data and the next start offset, a concrete streaming integration pattern. | | [readBetweenOffsets](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/simpledatasourcestreamreader/readbetweenoffsets) | integrations | 0.80 | Describes deterministic re-reading between specific offsets for failure recovery, which is detailed streaming API behavior. | -| [read_files table-valued function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/read_files) | integrations | 0.80 | Details a Databricks-specific table-valued function including supported formats, schema inference, and behavior—concrete integration/API usage information. | | [recipients](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/recipients-commands) | integrations | 0.80 | recipients CLI reference lists commands and flags for managing share recipients, representing product-specific integration patterns. | | [register](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udfregistration/register) | integrations | 0.80 | API reference for registering Python functions and UDFs as SQL functions, including supported function types and behavior. 
| | [register](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udtfregistration/register) | integrations | 0.80 | Documents how to register Python user-defined table functions as SQL table functions, including method semantics. | @@ -824,10 +806,8 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [registered-models](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/registered-models-commands) | integrations | 0.80 | registered-models CLI commands and their parameters are detailed, product-specific interfaces for model registry operations. | | [rlike](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/rlike) | integrations | 0.80 | Documents SQL RLIKE semantics on a Column in Databricks PySpark, which is specific API behavior. | | [rowsBetween](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/windowspec/rowsbetween) | integrations | 0.80 | Documents row-based frame boundaries with relative positions, a detailed aspect of the WindowSpec API. | -| [serving-endpoints](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/serving-endpoints-commands) | integrations | 0.80 | serving-endpoints CLI reference defines exact commands and parameters for creating, updating, and deleting serving endpoints, which are detailed integration patterns. | | [set](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/runtimeconfig/set) | configuration | 0.80 | Documents how to set configuration properties via the RuntimeConfig API, including key/value semantics. | | [settings](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/settings-commands) | configuration | 0.80 | settings CLI group manages workspace-level settings; the reference will list specific setting names, allowed values, and commands, which are configuration details. 
| -| [shares](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/shares-commands) | integrations | 0.80 | shares CLI reference documents concrete commands and parameters for share lifecycle and asset registration, which are product-specific integration details. | | [st_asewkt function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_asewkt) | integrations | 0.80 | Documents Databricks-specific st_asewkt function behavior and supported GEOGRAPHY/GEOMETRY types. | | [st_asgeojson function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_asgeojson) | integrations | 0.80 | Provides Databricks-specific API details for st_asgeojson, including formats and type handling. | | [st_astext function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_astext) | integrations | 0.80 | Function reference for st_astext with Databricks-specific behavior and supported geospatial types. | @@ -863,6 +843,8 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [tuple_union_agg_integer aggregate function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tuple_union_agg_integer) | integrations | 0.80 | Describes Databricks SQL aggregate function to union TupleSketch binaries with integer summaries; concrete, product-specific API details. | | [unpivot](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/unpivot) | integrations | 0.80 | Details unpivot behavior, relation to groupBy().pivot().agg(), and version added, which are specific integration semantics. | | [unset](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/runtimeconfig/unset) | configuration | 0.80 | Explains how to reset configuration properties by key, which is specific configuration API behavior. 
| +| [vector-search-endpoints](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/vector-search-endpoints-commands) | integrations | 0.80 | Provides detailed CLI commands and parameters for vector-search-endpoints, a product-specific integration surface for Mosaic AI Vector Search. | +| [warehouses](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/warehouses-commands) | integrations | 0.80 | The warehouses CLI command group reference includes concrete command syntax and options for SQL warehouse management, which are specific integration details beyond generic knowledge. | | [withField](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/withfield) | integrations | 0.80 | Documents struct-manipulation behavior for a Column method in Databricks PySpark, which is specific SDK knowledge. | | [withWatermark](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/withwatermark) | integrations | 0.80 | Explains watermark semantics for streaming DataFrames, including late data handling, which is product-specific behavior. | | [write](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/write) | integrations | 0.80 | Reference for DataFrame.write interface, including how it interacts with external storage systems in Databricks. | @@ -888,21 +870,26 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [ALTER GOVERNED TAG](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-governed-tag) | configuration | 0.78 | The page documents the exact ALTER GOVERNED TAG SQL syntax, allowed options, and required MANAGE permission for governed tags in Databricks Unity Catalog. These are product-specific configuration parameters and privilege requirements, not generic concepts. 
| | [ALTER SCHEMA](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-schema) | configuration | 0.78 | The ALTER SCHEMA reference describes precise SQL syntax and options for changing schema owner, storage location, predictive optimization, and DBPROPERTIES in Databricks. These are detailed configuration operations unique to this product’s SQL dialect. | | [ALTER VIEW](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-view) | configuration | 0.78 | Details ALTER VIEW syntax, cache invalidation behavior, and TBLPROPERTIES handling, which are product-specific configuration semantics. | -| [Access control overview](https://learn.microsoft.com/en-us/azure/databricks/security/auth/access-control/) | security | 0.78 | Access control documentation for Databricks securable objects typically lists concrete permission names, scopes, and role capabilities per object type (clusters, jobs, notebooks, repos, etc.). These RBAC/ACL matrices and exact permission semantics are product-specific security configuration details that qualify as expert knowledge. | +| [Allowlist libraries and init scripts on standard compute](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/allowlist) | security | 0.78 | Covers the Unity Catalog allowlist object, default behavior, and MANAGE ALLOWLIST privilege; this is product-specific security/permission configuration for controlling which artifacts can run on standard access mode compute. 
| | [Best practices for serverless compute](https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/best-practices) | best-practices | 0.78 | The page is explicitly a best-practices guide for Databricks serverless compute, with product-specific recommendations (for example, performance mode selection, dependency handling, networking, and cost monitoring) that go beyond generic advice and are tied to this specific service’s behavior. | | [BigQuery](https://learn.microsoft.com/en-us/azure/databricks/query-federation/bigquery) | integrations | 0.78 | BigQuery-specific federation setup, including Unity Catalog objects and connection configuration; likely includes parameter names, auth options, and constraints unique to this integration. | +| [Billable usage system table](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/billing) | configuration | 0.78 | Provides table path, schema, and example queries for system.billing.usage, which is detailed product-specific configuration/metadata. | | [CREATE CONNECTION](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-connection) | integrations | 0.78 | Describes CREATE CONNECTION DDL for foreign connections, including system-specific options and authentication details. These are concrete configuration/parameter patterns for federated queries, which qualify as product-specific integration patterns. | | [CREATE GOVERNED TAG](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-governed-tag) | configuration | 0.78 | The page defines the CREATE GOVERNED TAG SQL syntax, parameters, and required CREATE privilege at the account level, including role assumptions (account/workspace admins). This is expert, product-specific configuration of governed tags in Unity Catalog. 
| +| [CREATE MATERIALIZED VIEW](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-materialized-view) | configuration | 0.78 | This is a detailed SQL reference page for CREATE MATERIALIZED VIEW in Databricks SQL/Runtime. Such language-manual pages typically enumerate all supported clauses, options, and constraints (e.g., syntax forms, required/optional parameters, allowed values, and behavior of refresh options) that are product-specific and not inferable from generic SQL knowledge. That fits the configuration category as it defines concrete, product-specific command parameters and their valid usage rather than conceptual guidance. | | [CREATE POLICY](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-policy) | security | 0.78 | This is a detailed SQL syntax reference for CREATE POLICY in Azure Databricks/Unity Catalog, including product-specific semantics for row filters and column masks, required privileges (MANAGE, ownership), and how policies attach to securables. These are concrete, product-unique security configuration details that go beyond generic SQL knowledge. | | [CREATE STREAMING TABLE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-streaming-table) | integrations | 0.78 | Describes CREATE STREAMING TABLE syntax and constraints (Delta-only, supported environments, parsing-only behavior on some runtimes). These are detailed, product-specific SQL patterns for streaming/incremental processing. | | [CREATE TABLE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-table) | configuration | 0.78 | Unifies Databricks-specific CREATE TABLE syntaxes (USING, Hive format, LIKE) and their behaviors. 
| +| [Classic compute plane Private Link](https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/private-link-standard) | security | 0.78 | Describes how to configure Azure Private Link for the classic compute plane to reach Databricks control plane services, including specific network/security configuration steps and required Azure resources. This is product-specific secure connectivity configuration. | | [Configuration recommendations](https://learn.microsoft.com/en-us/azure/databricks/compute/cluster-config-best-practices) | best-practices | 0.78 | Explicitly a recommendations/best practices article for compute configuration; likely includes concrete product-specific guidance and gotchas beyond generic advice. | -| [Configure a federation policy](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-federation-policy) | security | 0.78 | Page describes how to create and configure an OAuth token federation policy, which is an authentication/authorization feature. It likely includes specific policy fields, allowed values, scopes, and configuration parameters unique to Databricks OAuth federation, which qualify as product-specific security configuration details. | +| [Configure a federation policy](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-federation-policy) | security | 0.78 | Page is about configuring an OAuth token federation policy, which is a security/identity configuration topic. It likely includes concrete policy parameters (policy names, scopes, audiences, issuer/subject filters, claim mappings, and how they apply account-wide vs workload-specific), which are product-specific security settings rather than generic OAuth theory. 
| +| [Configure default Python package repositories](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/default-python-packages) | configuration | 0.78 | Admin setting for default pip configuration and authenticated/private repos is product-specific configuration with concrete options and behavior, beyond generic knowledge. | | [Configure double encryption for DBFS root](https://learn.microsoft.com/en-us/azure/databricks/security/keys/double-encryption) | security | 0.78 | Describes Databricks-specific double-encryption behavior for DBFS root and how it interacts with Azure Storage encryption, including when and how to enable it. | | [Create and manage ABAC policies](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/abac/policies) | security | 0.78 | ABAC policy configuration for Unity Catalog is product-specific security content. The page describes how to configure row filters and column masks, which typically includes concrete policy syntax, attribute names, and Unity Catalog–specific constructs that an LLM is unlikely to know from training. This fits the security category because it defines access control behavior rather than general concepts. | | [Create connection](https://learn.microsoft.com/en-us/azure/databricks/query-federation/sql-server) | integrations | 0.78 | SQL Server-specific federation configuration with Databricks, including supported versions, connection properties, and Unity Catalog foreign catalog setup; these are concrete integration details unique to this product combination. | | [Databricks SQL CLI](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-sql-cli) | integrations | 0.78 | Describes Databricks SQL CLI usage, including CLI commands, flags, and configuration (profiles, host, token); these are product-specific integration and configuration patterns. 
| | [Databricks SQL Driver for Node.js](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/nodejs-sql-driver) | integrations | 0.78 | Covers Databricks SQL Driver for Node.js with concrete connection configuration and driver-specific options; these are product-specific SDK parameters and code patterns for integrating Node.js with Databricks. | +| [Driver capability settings](https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/capability) | integrations | 0.78 | Page documents special and advanced capability settings for the Databricks (Simba) JDBC driver, which are product- and driver-specific integration parameters. These are configuration-style options for a specific integration (JDBC driver) rather than general configuration of the Databricks service, and include named settings and how to use them—details that are unlikely to be known from pretraining. | | [Enable firewall support for your workspace storage account](https://learn.microsoft.com/en-us/azure/databricks/security/network/storage/firewall-support) | security | 0.78 | Explains limiting access to the workspace storage account using Azure CLI/PowerShell. This will include specific firewall rule parameters, service endpoint or Private Link settings, and identity/authorization details, which are concrete security configuration instructions. | | [Enable inference tables on model serving endpoints](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/enable-model-serving-inference-tables) | configuration | 0.78 | Explains API usage to enable inference tables and related settings; includes specific API parameters and behavior unique to Databricks. | | [Enable or disable partner OAuth applications](https://learn.microsoft.com/en-us/azure/databricks/integrations/enable-disable-oauth) | security | 0.78 | Provides concrete steps and behavior (including 30-minute propagation) for managing OAuth app access to Databricks. 
| @@ -913,19 +900,24 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Legacy driver settings](https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/legacy) | security | 0.78 | Covers auth for JDBC driver 2.6.22 and below, including insecure methods and warnings; includes specific auth parameters and flows, which are product-specific security configuration details. | | [Limitations](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/zendesk-support-limits) | limits-quotas | 0.78 | A connector limitations page; typically enumerates specific unsupported features, rate limits, or behavioral constraints that are expert-only details. | | [MERGE INTO](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-merge-into) | integrations | 0.78 | Documents Databricks-specific MERGE INTO syntax and semantics for Delta tables, including how updates/inserts/deletes are expressed; these are concrete, product-specific SQL integration details. | +| [Manage service principals](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/manage-service-principals) | security | 0.78 | Covers management of service principals at account and workspace scope, which is detailed IAM/security configuration. | | [Model deployment pattern](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/mlops/deployment-patterns) | architecture-patterns | 0.78 | Compares two concrete deployment patterns for ML artifacts with pros and cons and scenario-based guidance, which is exactly product-specific architecture/deployment pattern decision content. | | [Names](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-names) | configuration | 0.78 | Provides explicit naming limitations and behaviors (e.g., special characters, case preservation) for Unity Catalog object names. 
| | [Optimize endpoints for production workloads](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/production-optimization) | best-practices | 0.78 | Provides optimization strategies for high-throughput, low-latency workloads; these are Databricks-specific performance recommendations and gotchas. | -| [Oracle](https://learn.microsoft.com/en-us/azure/databricks/query-federation/oracle) | integrations | 0.78 | Connector-specific setup for Oracle via Lakehouse Federation, including Unity Catalog foreign catalog creation, connection options, and likely parameter names/values and JDBC/driver details that are product-specific integration knowledge rather than generic concepts. | +| [Oracle](https://learn.microsoft.com/en-us/azure/databricks/query-federation/oracle) | integrations | 0.78 | Integration-focused page with product-specific connection objects, required properties, and configuration steps for Oracle as a federated source in Unity Catalog. Contains concrete parameter names and patterns unique to Databricks Lakehouse Federation rather than generic SQL/Oracle usage. | | [PEM private key](https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-pem) | integrations | 0.78 | How-to page for configuring Lakehouse Federation to Snowflake using PEM private key authentication. It necessarily includes product-specific connection objects, parameter names, and configuration details unique to Azure Databricks–Snowflake integration (e.g., Unity Catalog connection properties, authentication fields), which qualify as expert integration knowledge beyond generic SDK usage. 
| | [Performance efficiency best practices](https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/performance-efficiency/best-practices) | best-practices | 0.78 | Described as a best-practices article for performance efficiency on the Databricks lakehouse; these pages typically include product-specific tuning guidance (for clusters, storage formats, partitioning, caching) and concrete configuration patterns unique to Azure Databricks rather than just conceptual advice. | | [Privileges and securable objects in Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-privileges) | security | 0.78 | Unity Catalog privilege reference pages enumerate specific securable object types, privilege names, and their exact semantics in Databricks SQL/Runtime. These are product-specific RBAC/permission details that qualify as security configuration knowledge beyond generic concepts. | +| [Query history system table](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/query-history) | configuration | 0.78 | Provides table path and schema outline for system.query.history, which is detailed product-specific system table configuration. | | [Reference](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-hcm-reference) | configuration | 0.78 | Connector reference documentation usually includes supported objects, fields, and mappings—product-specific configuration/reference information. | | [Reference](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/zendesk-support-reference) | configuration | 0.78 | A technical reference listing supported tables and data models is detailed, product-specific schema/configuration information (table names, fields, types) that aligns with configuration expert knowledge. 
| | [Resource permissions](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/permissions) | security | 0.78 | The article explains how to define permissions for resources in bundle configuration files, including rules such as non-overlapping permissions for users, groups, and service principals. This is product-specific IAM/permissions configuration, fitting the security category. | +| [Restrict workspace admins](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/restrict-workspace-admins) | security | 0.78 | Describes the RestrictWorkspaceAdmins setting with distinct fields controlling ownership changes; this is a product-specific security/RBAC configuration. | +| [SCIM provisioning overview](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/scim/) | security | 0.78 | SCIM provisioning setup between IdPs and Databricks is a concrete identity/security configuration with product-specific steps and parameters. | +| [SQL warehouse events system table](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/warehouse-events) | configuration | 0.78 | Warehouse events table path and event semantics (start, stop, scale) are Databricks-specific system table configuration details. | | [Salesforce Data 360](https://learn.microsoft.com/en-us/azure/databricks/query-federation/salesforce-data-cloud) | integrations | 0.78 | Product-specific instructions for connecting Databricks Lakehouse Federation to Salesforce Data 360 with concrete configuration details. | -| [Search traces - SDK](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/query-via-sdk) | integrations | 0.78 | Focuses on mlflow.search_traces() usage across tracking server, inference tables, and Unity Catalog, which is an SDK/API integration pattern. Contains product-specific function signatures, parameters, and usage patterns for querying traces that go beyond generic SDK knowledge. 
| | [Serverless optimized deployments for custom models](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/serverless-optimized-deployments) | deployment | 0.78 | Describes serverless optimized deployment behavior and constraints for Databricks endpoints; these deployment mechanics are product-specific. | +| [Service principals overview](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/service-principals) | security | 0.78 | Defines Databricks service principals and their security role in API-only access; this is product-specific identity/security behavior. | | [Set up data source](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-reports-source-setup) | configuration | 0.78 | Focused on configuring Workday reports for ingestion; likely includes specific configuration fields, formats, and constraints needed for the connector to work. | | [Spark UI guide overview](https://learn.microsoft.com/en-us/azure/databricks/optimizations/spark-ui-guide/) | best-practices | 0.78 | Step-by-step, Databricks-specific guidance on what to look for in Spark UI pages, how to interpret metrics, and how to act on them; contains concrete, product-specific diagnostic patterns. | | [Strands](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/integrations/strands) | integrations | 0.78 | Shows how to use mlflow.strands.autolog for Strands Agents and what agent call information is captured, plus serverless autologging behavior—specific integration details. | @@ -935,6 +927,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Teradata](https://learn.microsoft.com/en-us/azure/databricks/query-federation/teradata) | configuration | 0.78 | The page is a concrete configuration guide for connecting Azure Databricks Lakehouse Federation to Teradata. 
It describes specific Unity Catalog objects that must be created, and typically such connector pages include connector-specific parameters (server, port, authentication options, JDBC/ODBC properties) and required settings unique to this integration. This is product- and source-specific configuration detail rather than a generic tutorial, so it best fits the configuration sub-skill. |
| [UDFRegistration class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udfregistration) | integrations | 0.78 | Documents a Databricks-specific wrapper for UDF registration accessed via spark.udf, including its role in SQL function registration. |
| [UDTFRegistration class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udtfregistration) | integrations | 0.78 | Class-level reference for registering user-defined table functions via spark.udtf, which is Databricks-specific API behavior. |
+| [Unity Catalog best practices](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/best-practices) | best-practices | 0.78 | Focuses on best practices for configuring Unity Catalog for data governance and isolation. Such guidance typically includes concrete recommendations (how to structure catalogs and schemas, separate environments, and assign permissions) and gotchas unique to Unity Catalog on Azure Databricks, which is product-specific DO/DON'T guidance rather than generic theory. |
| [UserDefinedFunction class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udf) | integrations | 0.78 | Class-level reference for Databricks’ PySpark UDF wrapper, including how instances are created via specific factory functions. 
| | [UserDefinedTableFunction class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/udtf) | integrations | 0.78 | Class-level reference for Databricks’ PySpark UDTF, including how to construct via pyspark.sql.functions.udtf. | | [Utility objects script](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-utility) | configuration | 0.78 | Preparing SQL Server using a utility objects script implies specific script parameters, objects, and configuration steps unique to this connector—expert configuration knowledge. | @@ -943,12 +936,14 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [When to use Spark vs. Ray](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ray/spark-ray-overview) | decision-making | 0.78 | Directly compares Spark and Ray for different workload types with Databricks-specific guidance on when to choose each or both, which is explicit technology selection guidance. | | [Window class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/window) | integrations | 0.78 | Class-level reference for Window utility functions, including Spark Connect support, which is concrete API surface. | | [WriterCommitMessage](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/writercommitmessage) | integrations | 0.78 | Class-level reference for WriterCommitMessage used with DataSourceWriter.write/commit/abort, including its role in driver communication, which is product-specific integration behavior. | +| [YAML syntax reference](https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/yaml-reference) | configuration | 0.78 | A full YAML grammar reference for metric views with supported expressions and feature availability is a configuration reference: specific field names, allowed values, and syntax that an LLM would not reliably infer from training. 
| | [agg](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/agg) | integrations | 0.78 | API reference for DataFrame.agg with method signature, parameters, and behavior details that are product/version-specific and not reliably known from training. | | [ai_translate function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_translate) | integrations | 0.78 | Documents ai_translate SQL function including supported language codes, argument semantics, and behavior tied to Databricks Foundation Model APIs. This is concrete, product-specific API surface, best matching integrations. | +| [alerts-v2](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/alerts-v2-commands) | integrations | 0.78 | CLI reference pages typically list command names, flags, parameters, and usage patterns that are product-specific and not inferable from general knowledge. Even though the summary is brief, the alerts-v2 command group reference will enumerate concrete commands and options for managing SQL alerts, which fits the integrations & coding patterns category. | +| [apps](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/apps-commands) | integrations | 0.78 | As a CLI reference for the apps command group, this page will contain specific commands, arguments, and behaviors for managing Databricks apps. These are product-specific integration patterns for programmatic control, matching the integrations category. | | [bundle](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/bundle-commands) | integrations | 0.78 | CLI reference pages typically enumerate command names, flags, and parameter behaviors that are product-specific and not inferable from general knowledge. 
This page is a command-group reference for 'bundle', so it likely lists options, flags, and usage patterns for deploying and running jobs and Lakeflow pipelines, which fits integrations & coding patterns. | | [cache](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/cache) | integrations | 0.78 | Documents DataFrame.cache and the specific default storage level (MEMORY_AND_DISK_DESER), which is a concrete configuration detail. | | [catalogs](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/catalogs-commands) | integrations | 0.78 | As a CLI command-group reference for 'catalogs', it will contain specific commands, arguments, and behaviors for managing Unity Catalog catalogs. These are concrete API/CLI parameters unique to Databricks, matching integrations. | -| [clean-room-assets](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/clean-room-assets-commands) | integrations | 0.78 | The page documents 'clean-room-assets' CLI commands, including supported asset types and command parameters. Such detailed CLI options and object types are product-specific integration details. | | [connections](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/connections-commands) | integrations | 0.78 | The 'connections' command group manages connections to external data sources, with specific command names and parameters for connection definitions, which are integration patterns. | | [corr](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/corr) | integrations | 0.78 | Details DataFrame.corr usage, supported correlation type, and aliasing with DataFrameStatFunctions.corr. | | [cov](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/cov) | integrations | 0.78 | Documents DataFrame.cov usage, return type, and aliasing with DataFrameStatFunctions.cov. 
| @@ -958,8 +953,8 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [cube](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/cube) | integrations | 0.78 | API semantics for DataFrame.cube and how it enables multi-dimensional aggregation. | | [describe](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/describe) | integrations | 0.78 | Documents DataFrame.describe behavior, including which statistics are computed for numeric and string columns. | | [exceptAll](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/exceptall) | integrations | 0.78 | API semantics for DataFrame.exceptAll, including preservation of duplicates. | -| [experiments](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/experiments-commands) | integrations | 0.78 | The 'experiments' command group reference will include specific commands and flags for creating and managing MLflow experiments, which are detailed CLI integration patterns. | | [functions](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/functions-commands) | integrations | 0.78 | The 'functions' command group reference will list commands and parameters for managing UDFs, including how implementations are specified, which are product-specific CLI integration details. | +| [genie](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/genie-commands) | integrations | 0.78 | CLI reference pages list product-specific commands, flags, and parameters for the genie command group. These are concrete API/CLI integration details (names, options, behaviors) that qualify as expert knowledge beyond generic concepts. 
| | [global-init-scripts](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/global-init-scripts-commands) | configuration | 0.78 | This page documents CLI commands for global init scripts, including how to define and manage scripts that run on every node. These are specific configuration commands and parameters. | | [groupBy](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/groupby) | integrations | 0.78 | API reference for DataFrame.groupBy and its relationship to GroupedData aggregate functions. | | [groups](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/groups-commands) | security | 0.78 | The 'groups' command group reference will include commands and parameters for managing groups used in access control, which is security/identity management specific to Databricks. | @@ -972,19 +967,37 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [h3_maxchild function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_maxchild) | integrations | 0.78 | Describes preview Databricks SQL function semantics for selecting max-valued child at a resolution; specific API behavior. | | [h3_minchild function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_minchild) | integrations | 0.78 | Provides exact function usage for selecting min-valued child H3 cell; Databricks-specific SQL function. | | [h3_stringtoh3 function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_stringtoh3) | integrations | 0.78 | Provides concrete Databricks SQL function for string-to-BIGINT H3 conversion; specific parameter and return contract. 
| +| [ip_as_binary function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_as_binary) | integrations | 0.78 | Function reference pages for Databricks SQL typically include exact syntax, argument types, return types, and product-specific behavior for the ip_as_binary function, which are concrete API/SDK-style details beyond generic SQL knowledge and fit the integrations & coding patterns category. | +| [ip_as_string function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_as_string) | integrations | 0.78 | The page documents the ip_as_string Databricks SQL function with precise syntax and behavior for converting IPs/CIDRs to canonical strings, which is product-specific API surface detail appropriate for the integrations category. | +| [ip_cidr function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_cidr) | integrations | 0.78 | Describes the ip_cidr function’s exact usage and behavior for IPv4/IPv6 CIDR canonicalization in Databricks SQL, representing concrete function-level integration details rather than general concepts. | +| [ip_cidr_contains function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_cidr_contains) | integrations | 0.78 | Provides specific syntax and semantics for the ip_cidr_contains function (TRUE/FALSE containment checks between IPs and CIDRs) in Databricks SQL, which is product-specific function behavior aligned with integrations & coding patterns. | +| [ip_host function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_host) | integrations | 0.78 | Documents the ip_host function’s exact behavior and usage for canonical IPv4/IPv6 address representation in Databricks SQL, which is detailed API-level knowledge suitable for the integrations category. 
| +| [ip_network function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_network) | integrations | 0.78 | Explains the ip_network function (and its alias behavior) with precise semantics for returning the network portion of CIDR blocks in Databricks SQL, representing concrete function configuration/usage details. | +| [ip_network_first function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_network_first) | integrations | 0.78 | Clarifies that ip_network_first is an alias for ip_network and documents its exact behavior and usage, which is specific to Databricks SQL’s function surface and fits integrations & coding patterns. | +| [ip_network_last function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_network_last) | integrations | 0.78 | Provides exact syntax and semantics for ip_network_last to return the last address in a CIDR block in Databricks SQL, which is detailed, product-specific function behavior appropriate for integrations. | +| [ip_prefix_length function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_prefix_length) | integrations | 0.78 | Documents the ip_prefix_length function’s precise behavior for extracting prefix length from CIDR blocks in Databricks SQL, representing concrete API usage details rather than generic networking theory. | +| [ip_version function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_version) | integrations | 0.78 | Describes the ip_version function’s exact return values (4 or 6) and usage with IPv4/IPv6 addresses and CIDRs in Databricks SQL, which is specific function-level behavior aligned with integrations & coding patterns. 
| | [jobs](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/jobs-commands) | integrations | 0.78 | The 'jobs' command group reference will enumerate commands and parameters for job lifecycle management, including job definitions and runs, which are detailed CLI integration patterns. | | [kll_sketch_merge_bigint function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_merge_bigint) | integrations | 0.78 | Explains kll_sketch_merge_bigint function syntax and compatibility requirements for merging sketches, which are Databricks-specific API semantics. | | [kll_sketch_merge_double function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_merge_double) | integrations | 0.78 | Provides exact usage details for kll_sketch_merge_double, including how sketches are combined, which is product-specific function behavior. | | [kll_sketch_merge_float function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_sketch_merge_float) | integrations | 0.78 | Documents kll_sketch_merge_float function contract and constraints, representing Databricks SQL-specific integration details. | | [labs](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/labs-commands) | integrations | 0.78 | CLI reference pages enumerate product-specific command names, flags, and behaviors for the labs command group, which are not inferable from general knowledge and qualify as expert integration/config details. | | [lakeview](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/lakeview-commands) | integrations | 0.78 | Provides detailed CLI command syntax and parameters for the lakeview command group, which are product-specific API/SDK-style details. 
|
+| [options](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/options) | integrations | 0.78 | API reference for DataFrameWriterV2.options, listing exact option names and their behaviors for underlying data sources in Azure Databricks PySpark. These concrete parameters for writing to external storage systems via Databricks are detailed, product-specific integration knowledge rather than generic Spark usage, so it fits best under integrations. |
+| [policies](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/policies-commands) | integrations | 0.78 | Documents the policies command group with specific CLI operations and arguments for ABAC policies. These are product-specific command and parameter details, matching integrations criteria. |
| [pyodbc](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/pyodbc) | integrations | 0.78 | How-to for connecting via ODBC from Python using pyodbc; typically includes driver-specific connection strings, DSN/host/HTTP path parameters, and example code patterns unique to Databricks ODBC, which are product-specific integration details. |
+| [quality-monitor-v2](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/quality-monitor-v2-commands) | integrations | 0.78 | Even though deprecated, the quality-monitor-v2 command group reference contains specific CLI commands and options for managing data quality monitors, which are detailed integration patterns. |
+| [quality-monitors](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/quality-monitors-commands) | integrations | 0.78 | CLI reference pages enumerate product-specific commands, flags, and parameters for the quality-monitors command group, which are not generally known from training and map directly to SDK/API usage patterns. 
| | [query-history](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/query-history-commands) | integrations | 0.78 | query-history CLI commands for storing and retrieving query history are concrete, product-specific command interfaces. | | [read_kafka table-valued function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/read_kafka) | integrations | 0.78 | The read_kafka TVF documentation will contain Databricks-specific integration details for Apache Kafka (such as required/optional options, parameter names, and supported modes like batch vs streaming). These are concrete configuration and API usage details unique to Databricks SQL’s Kafka integration, which aligns with the integrations sub-skill. | | [schema_of_variant](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/schema_of_variant) | integrations | 0.78 | Documents schema_of_variant for Databricks’ variant type, which is a Databricks-specific data type and function. | | [schema_of_variant_agg](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/schema_of_variant_agg) | integrations | 0.78 | Describes schema_of_variant_agg behavior on variant columns, which is specific to Databricks’ variant support. | +| [service-principal-secrets-proxy](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/service-principal-secrets-proxy-commands) | integrations | 0.78 | Describes Databricks CLI service-principal-secrets-proxy commands with concrete command names and options, representing product-specific integration patterns for managing secrets. | | [service-principals](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/service-principals-commands) | security | 0.78 | service-principals CLI commands manage identities used for CI/CD; these are concrete, product-specific IAM configuration interfaces. 
| +| [service-principals-v2](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/service-principals-v2-commands) | integrations | 0.78 | Provides detailed CLI command group reference (service-principals-v2) with specific commands/parameters for identity automation, which are product-specific API/SDK-style integration details. | +| [serving-endpoints](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/serving-endpoints-commands) | integrations | 0.78 | Covers the serving-endpoints CLI command group with concrete command syntax and options for managing model serving endpoints, which are product-specific integration commands. | | [session_window](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/session_window) | integrations | 0.78 | Documents session_window semantics, gap duration, dynamic window expansion, and microsecond precision, which are detailed API behaviors. | +| [shares](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/shares-commands) | integrations | 0.78 | Documents the shares CLI command group with specific commands and parameters for Unity Catalog shares, representing detailed integration/configuration patterns unique to Databricks. | | [st_addpoint function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_addpoint) | integrations | 0.78 | Geospatial function reference with Databricks-specific behavior, input/return types, and preview/warehouse support details for st_addpoint. | | [st_area function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_area) | integrations | 0.78 | Details Databricks geospatial function st_area, including supported types, runtime versions, and behavior not inferable from generic GIS knowledge. 
| | [st_buffer function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_buffer) | integrations | 0.78 | Function reference for st_buffer with Databricks-specific radius handling and GEOMETRY support. | @@ -997,15 +1010,17 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [st_force2d function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_force2d) | integrations | 0.78 | Databricks Runtime function semantics for st_force2d, including projection of GEOGRAPHY/GEOMETRY to 2D. | | [storage-credentials](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/storage-credentials-commands) | integrations | 0.78 | CLI reference pages typically list command names, required/optional parameters, and flags specific to the Databricks CLI and Unity Catalog storage credentials. These are product-specific API/SDK-style parameters and options that qualify as integration/coding patterns. | | [sync](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/sync-commands) | integrations | 0.78 | The sync command reference will include CLI parameters (paths, filters, flags) and behavior details for one-way synchronization, which are product-specific command/parameter patterns. | +| [tables](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/tables-commands) | integrations | 0.78 | The tables command group reference lists product-specific CLI commands and flags for managing tables, which are concrete integration points not derivable from generic knowledge. | | [token-management](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/token-management-commands) | integrations | 0.78 | Token-management command reference will list commands, IDs, and flags for managing tokens, which are specific CLI integration patterns. 
|
| [tokens](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/tokens-commands) | integrations | 0.78 | Tokens command group reference exposes concrete CLI commands and parameters for token lifecycle operations, which are product-specific. |
| [udf](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/udf) | integrations | 0.78 | Documents Databricks’ udf helper, including signatures, return types, and behavior for user-defined functions, which are platform-specific coding patterns. |
| [udtf](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/udtf) | integrations | 0.78 | Explains Databricks’ udtf helper, including how to define and use table-valued functions, which is a product-specific integration pattern. |
| [users](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/users-commands) | integrations | 0.78 | Describes CLI commands and arguments for user identity management, including parameter names and usage unique to Databricks CLI. |
+| [users-v2](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/users-v2-commands) | integrations | 0.78 | Describes the users-v2 CLI command group with specific command names and options for user identity management, fitting the integrations category as product-specific API/CLI patterns. |
| [variant_explode (TVF)](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/tvf-variant_explode) | integrations | 0.78 | Details schema, behavior, and edge cases of variant_explode for Databricks-specific VARIANT type, including handling of nulls and non-variant inputs. |
| [variant_explode_outer (TVF)](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/tvf-variant_explode_outer) | integrations | 0.78 | Documents variant_explode_outer semantics, including behavior differences vs variant_explode and handling of non-array/object inputs. |
-| [vector-search-endpoints](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/vector-search-endpoints-commands) | integrations | 0.78 | Vector-search-endpoints command reference will include endpoint-specific CLI parameters and options that are detailed integration patterns. |
| [vector-search-indexes](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/vector-search-indexes-commands) | integrations | 0.78 | Provides CLI command syntax and parameters for vector-search-indexes, which are concrete, product-specific integration details. |
+| [workspace-iam-v2](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/workspace-iam-v2-commands) | security | 0.78 | CLI reference pages enumerate product-specific workspace-iam-v2 commands, flags, and behaviors for managing identities and access in Databricks. These are concrete, versioned security/IAM configuration interfaces that an LLM cannot reliably infer from training data and map directly to security-related operations. |
| [years](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/years) | integrations | 0.78 | Documents years partition transform, its purpose, Spark Connect support, and deprecation in 4.0.0 with replacement guidance. |
| [zip_with](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/zip_with) | integrations | 0.78 | Documents zip_with semantics including null-padding of shorter arrays and Spark Connect support. |
| [Gen AI model maintenance policy](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/retired-models-policy) | limits-quotas | 0.77 | Describes which foundation models are supported, how and when they may be updated or retired, and likely includes timelines and behavioral guarantees, which are offering-specific constraints akin to limits/policy details. |
@@ -1021,6 +1036,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [CREATE FUNCTION (External)](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-function) | integrations | 0.76 | Documents CREATE FUNCTION for external functions, including USING clause resource handling and session vs persistent scope. These are detailed, product-specific SQL/Scala/Python/Java UDF integration patterns. |
| [CREATE TABLE with Hive format](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-table-hiveformat) | configuration | 0.76 | CREATE TABLE with Hive format syntax, required clauses, and Databricks-specific defaulting to Delta if omitted. |
| [CREATE VIEW](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-view) | integrations | 0.76 | Reference for CREATE VIEW including metric views, YAML-based definitions, and required privileges. Contains concrete syntax and permission requirements that are specific to Databricks SQL. |
+| [Compute system tables](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/compute) | configuration | 0.76 | Reference for compute system tables and their fields is specific to Databricks’ internal monitoring schema. |
| [Configure Databricks sign-on from Tableau Server](https://learn.microsoft.com/en-us/azure/databricks/integrations/configure-oauth-tableau) | integrations | 0.76 | Configuration guide for a specific Tableau Server ↔ Azure Databricks OAuth integration, likely includes app registration fields, redirect URLs, and Databricks-specific parameters that are product- and partner-specific. |
| [Configure encryption for S3 with KMS](https://learn.microsoft.com/en-us/azure/databricks/security/keys/kms-s3) | security | 0.76 | Gives concrete configuration for using KMS keys with s3a:// paths in Unity Catalog, including Databricks-specific cross-cloud encryption behavior and settings. |
| [Customer-managed keys for Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/security/keys/cmek-unity-catalog) | security | 0.76 | Describes catalog-level CMK configuration and multi-key encryption behavior for Unity Catalog, including how keys map to catalogs and compliance scenarios, which is detailed product-specific security behavior. |
@@ -1028,10 +1044,13 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [DataGrip](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/datagrip) | integrations | 0.76 | Tool-specific integration guide; typically includes JDBC URL templates, driver selection, and Databricks-specific parameters for DataGrip. |
| [Driver overload gaps](https://learn.microsoft.com/en-us/azure/databricks/optimizations/spark-ui-guide/spark-driver-overloaded) | best-practices | 0.76 | Gives specific remediation strategies for overloaded drivers (too many streams/jobs, non-Spark code) in Databricks clusters, tied to UI observations and cluster behavior. |
| [FETCH statement](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/control-flow/fetch-stmt) | troubleshooting | 0.76 | Documents FETCH statement behavior including specific condition code SQLSTATE '02000' (CURSOR_NO_MORE_ROWS) and casting error conditions, mapping symptoms to conditions. |
+| [Inbound Private Link](https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/front-end-private-connect) | security | 0.76 | Step-by-step configuration of inbound Private Link for Databricks front-end, with product-specific network/security settings (private endpoint types, DNS/endpoint configuration, required Azure resources). This is concrete security configuration rather than generic networking guidance. |
| [Inference tables for monitoring and debugging models](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/inference-tables) | configuration | 0.76 | Describes what inference tables log, limitations, and legacy vs AI Gateway-enabled behavior; these are detailed, product-specific monitoring configurations. |
+| [Jobs system tables](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/jobs) | configuration | 0.76 | Reference for lakeflow (workflow) system tables and their regional behavior is specific operational metadata/configuration. |
| [Limitations](https://learn.microsoft.com/en-us/azure/databricks/compute/dedicated-limitations) | limits-quotas | 0.76 | Requirements and limitations page for dedicated compute; typically lists runtime-dependent feature support and constraints, which are expert, product-specific limits. |
| [Low-level client APIs (advanced)](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/app-instrumentation/manual-tracing/low-level-api) | integrations | 0.76 | Describes MlflowClient tracing APIs, including explicit trace lifecycle and custom IDs, with parameter-level details—advanced integration patterns unique to MLflow. |
| [Manage network policies for serverless egress control](https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/manage-network-policies) | security | 0.76 | This page is about configuring and managing network policies for serverless egress control, which is a product-specific security feature. Such docs typically include concrete policy object names, allowed/blocked destination formats, scope definitions, and possibly required roles or permissions to manage these policies. That aligns with the security category’s requirement for product-specific security settings and RBAC/scope details rather than generic networking concepts. |
+| [OAuth authentication as a user](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-u2m) | security | 0.76 | Page covers setting up OAuth 2.0 user authorization for Databricks APIs/CLI/SDKs, including token lifetimes (access token valid for one hour) and likely specific auth configuration details (scopes, redirect URIs, client types, unified client auth behavior). These are product-specific authentication settings and token behaviors, fitting the security sub-skill. |
| [OPEN statement](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/control-flow/open-stmt) | troubleshooting | 0.76 | Documents OPEN statement behavior including specific error conditions like TABLE_OR_VIEW_NOT_FOUND and COLUMN_NOT_FOUND_IN_TABLE, which map failures to causes. |
| [One task](https://learn.microsoft.com/en-us/azure/databricks/optimizations/spark-ui-guide/one-spark-task) | best-practices | 0.76 | Explains Databricks-specific causes of single-task stages and how to address them to avoid cluster underutilization; actionable performance tuning guidance. |
| [Performance recommendations](https://learn.microsoft.com/en-us/azure/databricks/query-federation/performance-recommendations) | best-practices | 0.76 | Provides concrete performance recommendations for Lakehouse Federation (for example pushdown strategies, configuration tweaks, and query patterns) that are specific to this feature and go beyond generic SQL tuning. |
@@ -1042,14 +1061,17 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [RESET](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-conf-mgmt-reset) | configuration | 0.76 | Configuration-focused command resetting session-level parameters to global defaults, with Databricks-specific terminology. |
| [SQL warehouse settings for BI workloads](https://learn.microsoft.com/en-us/azure/databricks/compute/sql-warehouse/bi-workload-settings) | best-practices | 0.76 | Provides workload-specific configuration recommendations for BI, including concrete sizing, concurrency, and performance/cost trade-offs for Databricks SQL warehouses. |
| [SQL warehouse sizing, scaling, and queuing behavior](https://learn.microsoft.com/en-us/azure/databricks/compute/sql-warehouse/warehouse-behavior) | decision-making | 0.76 | Explains sizing, autoscaling, and queuing behavior with guidance on choosing configurations to balance performance and cost; includes decision criteria for serverless vs other options. |
+| [SQL warehouses system table](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/warehouses) | configuration | 0.76 | Describes warehouses system table path and snapshot semantics, which are product-specific configuration/metadata details. |
| [Set up data source](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/tiktok-ads-source-setup) | security | 0.76 | Configuring TikTok Ads for authentication will involve app credentials, scopes, and callback settings—product-specific security/auth configuration. |
| [Set up data source](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-hcm-setup) | security | 0.76 | Auth configuration for Workday HCM will include specific credentials, endpoints, and permission scopes—product-specific security configuration. |
+| [Sync users and groups automatically from MS Entra ID](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/automatic-identity-management) | security | 0.76 | Details automatic syncing of users, groups, and service principals from Entra ID, which is a product-specific identity integration/configuration. |
| [TABLE_SHARE_USAGE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/table_share_usage) | configuration | 0.76 | Documents the TABLE_SHARE_USAGE information schema extension with its specific metadata fields and behavior for shares, which is product-specific schema configuration detail. |
| [TABLE_TAGS](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/table_tags) | configuration | 0.76 | Defines the TABLE_TAGS information schema relation and its tagging metadata behavior for tables/views, which is detailed schema/metadata configuration. |
| [VIEWS](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/views) | configuration | 0.76 | Documents the INFORMATION_SCHEMA.VIEWS relation and its specific metadata fields and visibility rules, which are configuration/metadata details. |
| [VOLUME_TAGS](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/volume_tags) | configuration | 0.76 | Defines the VOLUME_TAGS information schema relation and how tagging metadata is exposed, which is product-specific metadata configuration. |
| [Variables](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-variables) | configuration | 0.76 | Explains Databricks variables as schema-qualified session objects, their lifecycle, and system.session schema usage. |
| [alias](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/alias) | integrations | 0.76 | Documents DataFrame.alias behavior and parameters in Databricks’ PySpark, which is concrete API surface rather than conceptual info. |
+| [clean-room-assets](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/clean-room-assets-commands) | integrations | 0.76 | This CLI reference for clean-room-assets will list specific commands and parameters for managing shared assets like FOREIGN_TABLE and NOTEBOOK. These are concrete API/CLI patterns unique to Databricks, aligning with integrations & coding patterns. |
| [clean-room-task-runs](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/clean-room-task-runs-commands) | integrations | 0.76 | Reference for 'clean-room-task-runs' commands will list specific commands, flags, and usage for managing notebook executions in clean rooms, which are concrete integration/CLI patterns. |
| [clean-rooms](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/clean-rooms-commands) | integrations | 0.76 | The 'clean-rooms' command group reference will enumerate commands and parameters for creating and managing clean rooms, which are detailed, product-specific CLI integration points. |
| [coalesce](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/coalesce) | integrations | 0.76 | API reference for DataFrame.coalesce with parameter semantics that are specific to Spark/Databricks implementation. |
@@ -1060,7 +1082,9 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [consumer-personalization-requests](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/consumer-personalization-requests-commands) | integrations | 0.76 | The 'consumer-personalization-requests' command group will define commands and arguments for managing personalization requests, which are specific CLI integration patterns. |
| [distinct](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/distinct) | integrations | 0.76 | API reference for DataFrame.distinct and its deduplication semantics. |
| [drop](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/drop) | integrations | 0.76 | Describes DataFrame.drop behavior, including no-op behavior when columns are missing. |
+| [environments](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/environments-commands) | integrations | 0.76 | The environments command group reference will list specific commands and options for managing environment resources and versions. These CLI details are expert, product-specific integration patterns between tooling and the Databricks Environments API. |
| [exists](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/exists) | integrations | 0.76 | API reference for DataFrame.exists returning a Column usable in EXISTS subqueries. |
+| [experiments](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/experiments-commands) | integrations | 0.76 | This experiments command group reference will contain concrete CLI commands and parameters for creating and managing MLflow experiments in Databricks. These are detailed integration patterns between the CLI and MLflow/Databricks services. |
| [feature-engineering](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/feature-engineering-commands) | integrations | 0.76 | The 'feature-engineering' command group manages features in the feature store, with concrete commands and parameters that represent product-specific integration behavior. |
| [filter](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/filter) | integrations | 0.76 | Documents DataFrame.filter usage and accepted condition types. |
| [grants](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/grants-commands) | security | 0.76 | Grants are core to access control; the CLI reference will list specific grant/revoke commands and permission scopes, fitting the security sub-skill. |
@@ -1085,17 +1109,16 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [st_flipcoordinates function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/st_flipcoordinates) | integrations | 0.76 | Documents Databricks-specific st_flipcoordinates behavior for swapping X/Y in GEOMETRY values. |
| [system-schemas](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/system-schemas-commands) | integrations | 0.76 | Describes CLI commands and parameters for managing system schemas, including command syntax and options that are specific to Databricks CLI. |
| [table-constraints](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/table-constraints-commands) | integrations | 0.76 | Command-group reference for table-constraints will enumerate commands and flags for primary/foreign key management, which are concrete CLI integration details. |
-| [tables](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/tables-commands) | integrations | 0.76 | Provides Databricks CLI tables command syntax and parameters, which are product-specific integration details. |
| [temporary-path-credentials](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/temporary-path-credentials-commands) | integrations | 0.76 | temporary-path-credentials command reference will include parameters (scopes, durations, storage locations) for generating downscoped credentials, which are detailed integration/security configuration values. |
| [temporary-table-credentials](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/temporary-table-credentials-commands) | integrations | 0.76 | Covers CLI commands and options for temporary-table-credentials, including parameter names and usage patterns unique to Databricks. |
| [txtai](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/integrations/txtai) | integrations | 0.76 | Explains enabling tracing for txtai via mlflow.autolog and what operations (LLM invocation, embeddings, vector search) are captured—concrete integration pattern. |
| [variant_get](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/variant_get) | integrations | 0.76 | Provides product-specific behavior for variant_get including path handling, casting rules, and exception conditions. |
| [volumes](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/volumes-commands) | integrations | 0.76 | Volumes command group reference lists CLI commands and parameters for volume operations, which are product-specific integration patterns. |
-| [warehouses](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/warehouses-commands) | integrations | 0.76 | Warehouses command group will document CLI commands, flags, and parameter names for SQL warehouse management, which are integration details. |
| [width_bucket](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/width_bucket) | integrations | 0.76 | Function reference with Databricks-specific argument constraints and null-return behavior when conditions are not met. |
| [workspace](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/workspace-commands) | integrations | 0.76 | Workspace command group reference includes CLI operations (list, import, export, delete) with specific parameters and options unique to Databricks. |
| [workspace-bindings](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/workspace-bindings-commands) | integrations | 0.76 | Workspace-bindings command reference will show commands and flags to set OPEN/ISOLATED bindings, which are concrete CLI integration patterns. |
| [workspace-conf](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/workspace-conf-commands) | integrations | 0.76 | Workspace-conf command group documents specific configuration keys and CLI parameters for workspace settings, which are product-specific. |
+| [workspace-settings-v2](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/workspace-settings-v2-commands) | configuration | 0.76 | The workspace-settings-v2 CLI reference describes specific commands, parameters, and allowed values for managing workspace-level settings. This is detailed configuration surface (names, options, and behaviors) that qualifies as expert knowledge beyond generic concepts. |
| [xpath_double](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/xpath_double) | integrations | 0.76 | Details xpath_double behavior including return of 0 for no match and NaN for non-numeric matches. |
| [xpath_float](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/xpath_float) | integrations | 0.76 | Documents xpath_float semantics including 0 for no match and NaN for non-numeric matches. |
| [xpath_int](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/xpath_int) | integrations | 0.76 | Explains xpath_int behavior including 0 for no match or non-numeric values. |
@@ -1104,16 +1127,14 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [xpath_short](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/xpath_short) | integrations | 0.76 | Explains xpath_short semantics including 0 for no match or non-numeric values. |
| [ANSI compliance in Databricks Runtime](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-ansi-compliance) | configuration | 0.75 | Describes spark.sql.ansi.enabled and spark.sql.storeAssignmentPolicy options and their behavioral impact. Likely includes allowed values and behavior tables, which are product-specific configuration parameters. |
| [ANSI_MODE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/parameters/ansi_mode) | configuration | 0.75 | Describes a specific configuration parameter (ANSI_MODE) with its effects on built-in functions and casts. This is a concrete configuration setting with product-specific behavior. |
-| [About authentication](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/authentication) | security | 0.75 | Covers how to authenticate to Lakebase using OAuth tokens or Postgres passwords, including token rotation and security considerations. This is product-specific auth configuration and security guidance. |
+| [About authentication](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/authentication) | security | 0.75 | Covers how to authenticate to Lakebase Postgres using OAuth tokens or passwords, including token rotation and security considerations. This implies concrete auth flows, token lifetimes, and secure configuration details specific to Lakebase. |
| [About privilege management in Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/) | security | 0.75 | Explains how to control access, manage metastore admins, and object ownership; includes product-specific privilege names and role behaviors that are detailed security configuration knowledge. |
| [Amazon S3 Select](https://learn.microsoft.com/en-us/azure/databricks/archive/connectors/amazon-s3-select) | integrations | 0.75 | The S3 Select connector page describes a specific Spark data source and its configuration for pushing down filters and column selection. This is a product-specific integration pattern with configuration parameters and behavior not generally known. |
| [Authenticate to ADLS using Microsoft Entra ID (formerly Azure AD) credentials](https://learn.microsoft.com/en-us/azure/databricks/archive/credential-passthrough/adls-passthrough) | security | 0.75 | Covers using Microsoft Entra ID credential passthrough to access ADLS from Databricks; such docs normally include authentication configuration details and security-specific settings, which are product-specific security knowledge. |
| [Authenticate with an identity provider token](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-federation-exchange) | security | 0.75 | Explains OAuth 2.0 token exchange with Databricks, including token lifetimes derived from IdP tokens and API usage patterns, which are product-specific security details. |
| [Authentication](https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/kafka/authentication) | security | 0.75 | Authentication article for Kafka connector; expected to list supported auth methods, security protocol settings, and configuration parameters specific to Databricks Kafka integration. |
| [Authentication and permissions](https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/auth-and-permissions) | security | 0.75 | Covers authentication methods and permission management; likely lists specific auth modes, scopes, and permission models unique to Azure Databricks Lakebase. |
-| [Authentication for agents](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/agent-authentication) | security | 0.75 | Covers authentication methods for agents accessing Vector Search, endpoints, and Unity Catalog; likely includes specific auth flows, scopes, and configuration parameters. |
| [Authentication overview](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/) | security | 0.75 | Explains concrete authorization methods, when to use each, and how to configure authentication for Databricks CLI and REST APIs. Contains product-specific auth flows and permission requirements. |
-| [Auto-enable deletion vectors](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/deletion-vectors) | configuration | 0.75 | Provides a workspace setting that controls default creation of Delta tables with deletion vectors, including version constraints (SQL warehouses and DBR 14.0+), which is a product-specific configuration detail. |
| [AutoML Python API reference](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl/automl-api-reference) | integrations | 0.75 | API reference for AutoML Python API will list method signatures, parameter names, types, and behaviors specific to Databricks AutoML, which are concrete integration and coding patterns. |
| [Automate with Microsoft Entra](https://learn.microsoft.com/en-us/azure/databricks/repos/automate-with-ms-entra) | security | 0.75 | Shows how to configure a Databricks service principal with Microsoft Entra ID to access Git folders from Azure DevOps automation. This is identity and authorization configuration specific to Databricks and Entra. |
| [Automatic feature lookup (Model Serving)](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/automatic-feature-lookup) | configuration | 0.75 | Explains how Model Serving automatically looks up features from Databricks or third-party online stores; likely includes endpoint configuration, feature lookup settings, and required metadata. |
@@ -1125,28 +1146,28 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [Bundles with private artifacts](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/artifact-private) | configuration | 0.75 | Explains how to reference private artifacts (e.g., JFrog, private repos) in bundle configs; includes product-specific configuration patterns and constraints. |
| [CONNECTIONS](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/connections) | configuration | 0.75 | CONNECTIONS relation defines how foreign connections are represented and filtered, which is Databricks-specific metadata configuration. |
| [CREATE PROCEDURE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-procedure) | integrations | 0.75 | This is a detailed syntax reference for CREATE PROCEDURE in Databricks SQL/Runtime with Unity Catalog, including named vs positional parameters and Databricks-specific behavior. It represents expert, product-specific SQL integration/coding knowledge, not general theory or limits tables. |
-| [CREATE TABLE [USING]](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-table-using) | integrations | 0.75 | The page documents the Databricks-specific CREATE TABLE [USING] syntax, including behavior of managed, temporary, and external tables and data source usage. This is concrete, product-specific SQL syntax and behavior, aligning with integrations/coding patterns rather than the other categories. |
| [CSV files](https://learn.microsoft.com/en-us/azure/databricks/query/formats/csv) | integrations | 0.75 | Includes Databricks-specific recommendations (read_files TVF, runtime version requirements) and limitations when not using those patterns, which are concrete integration behaviors. |
| [Change workspace storage redundancy](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/workspace-storage-redundancy) | configuration | 0.75 | Explains how to change DBFS storage redundancy and notes date-based behavior changes; involves specific configuration options and platform behavior. |
-| [Classic compute plane Private Link](https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/private-link-standard) | security | 0.75 | Explains how to set up Azure Private Link between classic compute plane clusters and the Databricks control plane, including product-specific connectivity behavior. |
| [Classification data preparation](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl/classification-data-prep) | configuration | 0.75 | Describes how AutoML prepares data and the configurable data settings for classification, which are product-specific configuration details. |
| [Compare Spark Connect to Spark Classic](https://learn.microsoft.com/en-us/azure/databricks/spark/connect-vs-classic) | decision-making | 0.75 | Comparison article focused on differences in execution and analysis behavior for migration; likely includes scenario-based recommendations and trade-offs. |
| [Configure ADBC or ODBC driver](https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi-adbc) | integrations | 0.75 | Page focuses on configuring which driver (ADBC or ODBC) Power BI uses to connect to Azure Databricks, including driver selection behavior (ADBC as default for new connections, existing connections remain ODBC unless updated) and product-specific integration details. This fits the integrations category as it covers concrete connection configuration patterns between Power BI and Databricks. |
| [Configure Azure Database for MySQL](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-azure-config) | configuration | 0.75 | Describes enabling binary logging and retention for Azure Database for MySQL; likely includes specific parameter names and allowed values. |
| [Configure RDS and Aurora](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-aws-rds-config) | configuration | 0.75 | Connector-specific setup for RDS/Aurora MySQL; typically lists parameter group settings and required values for binlog and retention. |
+| [Configure endpoints](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/configure-ai-gateway-endpoints) | configuration | 0.75 | A how-to configuration article for enabling AI Gateway on serving endpoints will include specific setting names, UI/CLI parameters, and possibly JSON configuration snippets unique to Databricks. |
| [Configure external locations for DBFS root](https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/external-locations-dbfs-root) | configuration | 0.75 | Provides detailed steps and constraints for governing DBFS root via external locations, including legacy-specific behavior. |
| [Configure permissions](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/permissions) | security | 0.75 | Explains configuring and managing user permissions for Databricks apps; such pages typically list specific permission levels/RBAC roles and their effects, which is product-specific security configuration. |
| [Connect to Azure Synapse Analytics dedicated pool](https://learn.microsoft.com/en-us/azure/databricks/archive/connectors/synapse-analytics-dedicated-pool) | integrations | 0.75 | Tutorial covers service principal/MSI/SQL auth configuration, connection strings, and Synapse-specific connector settings. |
| [Connect to Google Cloud Storage](https://learn.microsoft.com/en-us/azure/databricks/archive/storage/gcs) | integrations | 0.75 | Describes GCS access via service accounts and keys, including bucket URL formats and Spark configuration parameters unique to this integration. |
| [Cost optimization best practices](https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/cost-optimization/best-practices) | best-practices | 0.75 | Best-practices article for cost optimization, organized by principle; contains Databricks-specific cost control recommendations. |
-| [Create service credentials](https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-services/service-credentials) | security | 0.75 | Defines service credential objects, their scope, and how they encapsulate long-term cloud credentials; product-specific security object configuration. |
| [Customer-managed keys](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/customer-managed-keys) | security | 0.75 | Explains configuring CMK for Lakebase Autoscaling with a cloud KMS key. This includes product-specific encryption settings, key scopes, and possibly required roles/permissions, which are security-focused expert configurations. |
| [DBFS root](https://learn.microsoft.com/en-us/azure/databricks/dbfs/dbfs-root) | best-practices | 0.75 | Provides Databricks-specific recommendations and DO/DON'T guidance for using DBFS root, including avoiding production/sensitive data and migration guidance. These are concrete product-specific practices, not generic storage advice. |
| [Data & AI governance best practices](https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/data-governance/best-practices) | best-practices | 0.75 | Explicitly a best-practices article organized by governance principles; contains Databricks-specific DOs and DON’Ts for governance. |
-| [Data classification system tables](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/data-classification) | configuration | 0.75 | Data classification results table schema and usage are Databricks-specific configuration details. |
+| [Data lineage system tables](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/lineage) | configuration | 0.75 | Reference for lineage system tables with schema and usage patterns; this is specific structural/configuration knowledge about Databricks system tables. |
| [DataSourceRegistration class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourceregistration) | integrations | 0.75 | Explains spark.dataSource wrapper and how to register Python DataSource subclasses for use in spark.read/df.write; product-specific registration API. |
+| [Databases](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-databases) | limits-quotas | 0.75 | Explicitly states a concrete numeric limit: 'limit of 500 databases per branch'. This is a product-specific quota value that an LLM would not reliably know from training, matching limits-quotas criteria. |
| [Deep learning best practices](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/dl-best-practices) | best-practices | 0.75 | Provides Databricks-specific tips and recommendations for deep learning workloads, including resource management and tool usage. |
| [Delta Lake limitations on S3](https://learn.microsoft.com/en-us/azure/databricks/delta/s3-limitations) | best-practices | 0.75 | Documents S3-specific limitations and edge cases for Delta Lake, including behavior under eventual consistency and multi-cluster writes—product- and platform-specific gotchas. |
+| [Disable legacy features](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/legacy-features) | configuration | 0.75 | Describes an account-level setting that disables specific legacy features for new workspaces and includes a date-based availability rule, which is product-specific configuration behavior. |
| [Distributed training with DeepSpeed distributor](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/distributed-training/deepspeed) | integrations | 0.75 | Describes DeepSpeed distributor built on TorchDistributor, including when to use it and how to configure it for memory-constrained large models.
| | [Dynamic file pruning](https://learn.microsoft.com/en-us/azure/databricks/optimizations/dynamic-file-pruning) | best-practices | 0.75 | Explains when dynamic file pruning is triggered, its dependency on Photon for certain statements, and which query patterns benefit most. These are Databricks-specific performance behaviors and recommendations. | | [EXECUTE IMMEDIATE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-execute-immediate) | integrations | 0.75 | Documents Databricks-specific EXECUTE IMMEDIATE syntax, runtime version requirement, parameter markers, and variable assignment semantics—detailed integration/coding pattern for executing dynamic SQL in this environment. | @@ -1154,23 +1175,21 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Enable or disable Git folders](https://learn.microsoft.com/en-us/azure/databricks/repos/enable-disable-repos-with-api) | configuration | 0.75 | Shows how admins can toggle Git folders using the /api/2.0/workspace-conf REST endpoint or SDK. This is a specific configuration surface and endpoint unique to Databricks. | | [Enable serverless SQL warehouses](https://learn.microsoft.com/en-us/azure/databricks/admin/sql/serverless) | configuration | 0.75 | Explains requirements and setup for serverless SQL warehouses, including that they are enabled by default and have no public IP addresses—service-specific configuration and behavior. | | [English SDK for Apache Spark](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/sdk-english) | integrations | 0.75 | Describes a specialized SDK that compiles English to Spark objects; likely includes API usage patterns and parameters unique to this SDK. 
| -| [Example stateful applications](https://learn.microsoft.com/en-us/azure/databricks/stateful-applications/examples) | integrations | 0.75 | Contains concrete code examples using transformWithState and related classes, which are Databricks-specific APIs and patterns. | | [External Apache Hive metastore (legacy)](https://learn.microsoft.com/en-us/azure/databricks/archive/external-metastores/external-hive-metastore) | configuration | 0.75 | Explains how to set up clusters to connect to external Hive metastores, including Databricks-specific configuration steps and properties. | +| [External MCP servers](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/external-mcp) | security | 0.75 | Describes supported authentication methods (shared principal vs per-user) and Unity Catalog connections for token management; this implies concrete auth configuration parameters and security patterns unique to Databricks’ MCP integration. | | [Feature tables](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/uc/feature-tables-uc) | configuration | 0.75 | Covers creating and working with feature tables in Unity Catalog; likely includes API calls, parameters, and constraints specific to Databricks Feature Engineering. | | [Forecasting data preparation](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl/forecasting-data-prep) | configuration | 0.75 | Explains how AutoML prepares data for forecasting and the configurable data settings, which are product-specific configuration parameters. | | [Format numeric visualizations](https://learn.microsoft.com/en-us/azure/databricks/visualizations/format-numeric-types) | configuration | 0.75 | Provides specific numeric format string options and behavior for Databricks visualizations, which are product-specific configuration details. 
| | [Full refresh for streaming tables](https://learn.microsoft.com/en-us/azure/databricks/ldp/full-refresh-st) | best-practices | 0.75 | Explains when a full refresh is required, its impact on checkpoints and metadata, and includes best practices specific to Databricks streaming tables and Lakeflow pipelines. This is actionable, product-specific guidance (when/when not to run, implications on flows) rather than generic concepts. | | [GRANT SHARE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/security-grant-share) | security | 0.75 | Documents GRANT ON SHARE syntax, Unity Catalog-only scope, and recipient model, which are Databricks-specific data sharing security semantics. | +| [Genie Code system table](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/assistant) | configuration | 0.75 | Gives the exact system table path and schema for assistant events plus example queries, which is detailed, product-specific metadata/configuration. | | [Global init scripts](https://learn.microsoft.com/en-us/azure/databricks/init-scripts/global) | configuration | 0.75 | Explains global init scripts, access mode support, admin-only creation, and cautions; product-specific configuration and behavior. | | [Google BigQuery](https://learn.microsoft.com/en-us/azure/databricks/archive/connectors/bigquery) | integrations | 0.75 | BigQuery integration requires key-based authentication, project/dataset/table identifiers, and connector-specific options. | | [How is variant different than JSON strings?](https://learn.microsoft.com/en-us/azure/databricks/semi-structured/variant-json-diff) | best-practices | 0.75 | Focuses on behavioral, syntax, and semantic differences between VARIANT and JSON strings. These are nuanced, product-specific behaviors and guidance on when/how to use each, fitting best-practices and expert knowledge. 
| -| [Iceberg client access](https://learn.microsoft.com/en-us/azure/databricks/external-access/iceberg) | configuration | 0.75 | Explains how to configure Apache Iceberg clients to read/write Unity Catalog tables via the Iceberg REST catalog, including runtime/version constraints and endpoint usage. This is concrete integration/configuration detail. | | [Import modules from workspace files](https://learn.microsoft.com/en-us/azure/databricks/files/workspace-modules) | configuration | 0.75 | Describes Databricks-specific module import behavior, relative paths, and CWD behavior for modules, which are configuration/behavior details. | -| [Inbound Private Link](https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/front-end-private-connect) | security | 0.75 | Provides concrete steps and parameters to set up inbound Private Link for Databricks, a product-specific network security configuration. | | [Ingest data as semi-structured variant type](https://learn.microsoft.com/en-us/azure/databricks/ingestion/variant) | configuration | 0.75 | Describes supported file formats, runtime versions, and concrete ingestion patterns/commands for VARIANT; includes tables of supported combinations and options. | | [Init scripts in workspace files](https://learn.microsoft.com/en-us/azure/databricks/files/workspace-init-scripts) | configuration | 0.75 | Provides runtime-version-specific support details and recommended storage locations for init scripts, which are Databricks-specific configuration patterns. | | [Interoperability & usability best practices](https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/interoperability-and-usability/best-practices) | best-practices | 0.75 | Best-practices article tied to interoperability and usability principles; contains actionable Databricks-specific recommendations. 
| -| [Jobs system tables](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/jobs) | configuration | 0.75 | A 'system table reference' for jobs is inherently a schema/config reference: it will list table/column names, data types, and semantics for the lakeflow (workflow) system tables. That structure is product-specific expert knowledge not derivable from general training and fits the configuration category (reference of fields and how they are used). | | [Lakebase](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/lakebase) | configuration | 0.75 | Describes Autoscaling and Provisioned Lakebase options and shared PostgreSQL connection model; likely includes environment variables and config options, fitting configuration with expert details. | | [Legacy Iceberg REST Catalog](https://learn.microsoft.com/en-us/azure/databricks/archive/legacy/external-access-iceberg) | integrations | 0.75 | Describes using the Iceberg REST catalog to access Unity Catalog tables from external engines, including product-specific integration configuration. | | [Legacy table access control overview](https://learn.microsoft.com/en-us/azure/databricks/data-governance/table-acls/) | security | 0.75 | Describes table access control for the built-in Hive metastore, including how to programmatically grant/revoke access; this involves product-specific security/privilege configuration. | @@ -1183,12 +1202,13 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [MLflow MCP server](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/mlflow-mcp) | integrations | 0.75 | Describes MLflow MCP server operations and tools; product-specific integration interface for interacting with traces via MCP. 
| | [Manage Python dependencies for pipelines](https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/external-dependencies) | best-practices | 0.75 | Gives concrete recommendations on using specific patterns (e.g., avoiding init scripts, automating tests) due to runtime upgrade risks; these are product-specific operational best practices. | | [Manage egress costs (for providers)](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/manage-egress) | best-practices | 0.75 | Provides concrete recommendations (DEEP CLONE, CDF, Cloudflare R2) and when to use them to manage egress; these are product-specific cost-optimization practices. | -| [Manage entitlements](https://learn.microsoft.com/en-us/azure/databricks/security/auth/entitlements) | security | 0.75 | Explains specific entitlements that can be assigned to users, service principals, and groups. The entitlement names and effects are Databricks-specific security configuration. | | [Manage external locations](https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/manage-external-locations) | security | 0.75 | Describes listing, updating, granting permissions, enabling file events, and deleting external locations. Involves product-specific administrative and permission operations on external locations, aligning with security/identity configuration details. | -| [Manage governed tag permissions](https://learn.microsoft.com/en-us/azure/databricks/admin/governed-tags/manage-permissions) | security | 0.75 | Focuses on granting permissions on governed tags; includes specific permission names and scopes that define how tag-based governance is secured. 
| | [Manage private endpoint rules](https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/manage-private-endpoint-rules) | configuration | 0.75 | Describes managing private endpoint rules for serverless connectivity using the account console and Network connectivity configurations API. Contains product-specific configuration objects and API usage. | +| [Manage serverless base environments](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/base-environment) | configuration | 0.75 | Explains creating and managing pre-built cached base environments for serverless notebooks and jobs, which involves specific configuration options unique to Databricks. | +| [Marketplace system tables](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/marketplace) | configuration | 0.75 | Explains the system.marketplace schema and its tables, including how they’re used for analytics—expert table configuration information. | | [Migrate model versions](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/manage-model-lifecycle/migrate-models) | decision-making | 0.75 | Describes using copy_model_version() with specific MLflow client version and environment variable behavior; concrete migration path and thresholds. | -| [Network access system table](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/network) | configuration | 0.75 | Describes network access events system tables that record blocked network events; table structures and usage are product-specific configuration details. | +| [Monitor and manage access to personal access tokens](https://learn.microsoft.com/en-us/azure/databricks/admin/access-control/tokens) | security | 0.75 | Explains admin workflows and controls for listing and revoking PATs in a workspace—detailed, product-specific security configuration/operations. 
| +| [Network access system table](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/network) | configuration | 0.75 | Describes the network access events system table, including schema and how each row maps to blocked requests—specific table configuration/metadata details. | | [OAuth M2M](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-source-setup-m2m) | security | 0.75 | Describes configuring OAuth machine-to-machine authentication; expected to include specific app registrations, scopes, and permission settings—product-specific security configuration. | | [OAuth U2M](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-source-setup-u2m) | security | 0.75 | User-to-machine OAuth configuration for SharePoint ingestion; likely details scopes, consent, and app settings—security-focused configuration. | | [OFFSET clause](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-offset) | integrations | 0.75 | Includes Databricks-specific OFFSET syntax, version constraints, and explicit guidance on performance implications when paging with LIMIT/OFFSET. | @@ -1199,10 +1219,8 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Optimize stateless streaming queries](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/stateless-streaming) | best-practices | 0.75 | Covers Databricks Runtime–specific optimization features (AQE, AOS, shuffle partitions) with concrete tuning recommendations for stateless streaming workloads. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/azure-cli) | security | 0.75 | Shows how to use Azure CLI for Databricks auth, including service principal vs user flows and Databricks-specific recommendations. Product-specific security/auth configuration. 
| | [Overview](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/azure-mi) | security | 0.75 | Describes how managed identities are used to authenticate to Databricks, including which Azure resources and flows are supported. Product-specific security configuration. | -| [Overview](https://learn.microsoft.com/en-us/azure/databricks/jobs/compute) | best-practices | 0.75 | Explicitly called recommendations and best practices for compute; includes serverless limitations and workload-specific guidance, matching product-specific best-practices criteria. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/ldp/cdc) | configuration | 0.75 | Documents AUTO CDC APIs, including syntax and options for SCD Type 1/2; these are specific configuration/command details for CDC. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/mlflow/models) | configuration | 0.75 | Details MLflow model flavors and how to log/load/register models, including streaming models; product-specific configuration and packaging patterns. | -| [Pandas API on Spark](https://learn.microsoft.com/en-us/azure/databricks/pandas/pandas-on-spark) | configuration | 0.75 | Includes Databricks Runtime version requirements and how to enable/use pandas API on Spark, which is specific to Databricks environments. | | [Partition discovery for external tables](https://learn.microsoft.com/en-us/azure/databricks/tables/external-partition-discovery) | configuration | 0.75 | Describes default partition discovery strategy and an optional setting for partition metadata logs; these are concrete configuration options affecting behavior and performance. 
| | [Partner-powered AI features](https://learn.microsoft.com/en-us/azure/databricks/databricks-ai/partner-powered) | security | 0.75 | Describes a specific workspace-level setting controlling use of partner-hosted models, including default behaviors for CSP vs non-CSP workspaces and how to enable/disable; this is product-specific security/compliance configuration. | | [Permissions concepts](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/access-control/permissions-concepts) | security | 0.75 | Describes object hierarchy, privileges, ownership, and inheritance; contains detailed, product-specific permission semantics beyond generic RBAC concepts. | @@ -1221,7 +1239,8 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [ROUTINES](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/routines) | configuration | 0.75 | ROUTINES relation schema and filtering are Databricks-specific metadata configuration. | | [ROW_FILTERS](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/row_filters) | security | 0.75 | ROW_FILTERS relation exposes row-level security filter metadata, a security configuration feature specific to Databricks. | | [Read and write data from Snowflake](https://learn.microsoft.com/en-us/azure/databricks/archive/connectors/snowflake) | integrations | 0.75 | Snowflake connector usage includes account identifiers, warehouse/database/schema options, and connector-specific parameters. | -| [Read shared data (tokens)](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/read-data-open) | integrations | 0.75 | Describes reading Delta Sharing open shares using credential files and multiple tools; likely includes connector/SDK parameters and token-based configuration details unique to Delta Sharing. 
| +| [Recover from streaming checkpoint failure](https://learn.microsoft.com/en-us/azure/databricks/ldp/recover-streaming) | troubleshooting | 0.75 | Focused on recovering from streaming checkpoint corruption, which typically involves Databricks-specific recovery steps, options, and possibly commands; this is symptom→solution guidance unique to the product. | +| [Register and serve an OSS embedding model](https://learn.microsoft.com/en-us/azure/databricks/vector-search/embedding-with-oss-models) | deployment | 0.75 | Describes registering and serving the e5-small-v2 model in a Databricks Model Serving endpoint for Vector Search, including runtime/library considerations, which are product-specific deployment details. | | [Register database as a catalog](https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/register-uc) | integrations | 0.75 | Registration with Unity Catalog is a specific integration; page likely details configuration fields, permissions, and constraints for this integration. | | [Regression data preparation](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl/regression-data-prep) | configuration | 0.75 | Explicitly describes configurable data settings for regression in the AutoML UI, which are product-specific configuration parameters. | | [Reliability best practices](https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/reliability/best-practices) | best-practices | 0.75 | Best-practices article for reliability, organized by principles; contains Databricks-specific reliability recommendations. | @@ -1233,24 +1252,24 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [SCHEMATA](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/schemata) | configuration | 0.75 | SCHEMATA relation defines schema metadata and filtering; this is product-specific configuration. 
| | [SET MANAGED (FOREIGN VIEW)](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-view-set-managed) | configuration | 0.75 | Documents ALTER VIEW SET MANAGED syntax and behavior for converting foreign views (HMS/Glue) to standard views and implications for syncing—Databricks-specific configuration behavior. | | [SET MANAGED LOCATION (FOREIGN SCHEMA)](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-schema-set-managed-location) | configuration | 0.75 | Describes ALTER SCHEMA SET MANAGED LOCATION for foreign catalogs (HMS/Glue), including scope and behavior for new/converted managed tables—specific storage location configuration. | +| [SFTP](https://learn.microsoft.com/en-us/azure/databricks/ingestion/sftp) | integrations | 0.75 | Describes the SFTP connector that extends Auto Loader, including secure, incremental ingestion and Unity Catalog governance; likely documents connector options and behaviors that are specific integration knowledge. | | [SHARES](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/shares) | configuration | 0.75 | SHARES relation schema and filtering rules are Databricks-specific metadata configuration. | | [SHOW CREATE TABLE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-show-create-table) | integrations | 0.75 | Provides Databricks-specific SHOW CREATE TABLE syntax, behavior differences for materialized/streaming tables, runtime version requirements, and error behavior on certain object types—detailed API semantics unique to this product. | | [SHOW GRANTS](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/security-show-grant) | security | 0.75 | Describes SHOW GRANTS syntax, including inherited/denied/granted privilege reporting and usage constraints, which are specific to Databricks security introspection. 
| | [SHOW GRANTS ON SHARE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/security-show-grant-on-share) | security | 0.75 | Documents SHOW GRANTS ON SHARE syntax, Unity Catalog-only scope, and admin requirement, which are Databricks-specific security inspection commands. | | [SHOW GRANTS TO RECIPIENT](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/security-show-grant-to-recipient) | security | 0.75 | Provides SHOW GRANTS TO RECIPIENT syntax and behavior for Unity Catalog shares, including admin requirement, which is specific to Databricks sharing security. | -| [SQL warehouses system table](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/warehouses) | configuration | 0.75 | Reference for warehouses system table including schema and usage; table path and columns are product-specific configuration details. | | [Save DataFrames to TFRecord files and load with TensorFlow](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/load-data/tfrecords-save-load) | integrations | 0.75 | Shows how to use spark-tensorflow-connector with Spark DataFrames and TFRecord, including API usage and configuration details. | | [Service principals for CI/CD](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/service-principals) | security | 0.75 | Describes using service principals and tokens for CI/CD access to Databricks, including security best practices and specific identity configuration patterns. | | [Set up Delta Sharing for your account (for providers)](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/set-up) | configuration | 0.75 | Describes how to enable and set up Delta Sharing at the account level; likely includes specific settings (Unity Catalog workspace requirement, metastore selection, enablement flags) that are product-specific configuration knowledge. 
| | [Set up data source](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/confluence-source-setup) | security | 0.75 | Describes configuring OAuth U2M authentication, which typically includes product-specific auth parameters, token handling, and required credentials for this connector. | | [Set up data source](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/netsuite-source-setup) | configuration | 0.75 | Describes configuring NetSuite with token-based authentication; such pages usually list specific account settings, token fields, and required roles/permissions—product-specific configuration details. | -| [Setup](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/setup) | configuration | 0.75 | Focused on prerequisites, compute requirements, query configuration, and compute sizing for real-time mode. This strongly implies tables or lists of specific settings, parameter names, and environment requirements unique to Databricks, matching configuration expert knowledge. | | [Shallow clone](https://learn.microsoft.com/en-us/azure/databricks/delta/clone-unity-catalog) | configuration | 0.75 | Documents runtime requirements, managed vs external behavior, and VACUUM behavior differences for shallow clones—detailed product-specific configuration and behavior. | +| [SharePoint](https://learn.microsoft.com/en-us/azure/databricks/ingestion/sharepoint) | integrations | 0.75 | Documents the SharePoint connector for incremental ingestion into Delta using Auto Loader, spark.read, and COPY INTO, with Databricks-specific options and behaviors; this is concrete integration guidance. | +| [Shared materialization history system table reference](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/materialization) | configuration | 0.75 | Describes the materialization history system table path and fields, which is detailed product-specific metadata/configuration. 
| | [SparkSession class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession) | integrations | 0.75 | Class-level reference for SparkSession in Databricks PySpark, including capabilities like creating DataFrames, registering tables, and executing SQL, which are concrete API behaviors. | | [Star clause](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-star) | integrations | 0.75 | Details Databricks-specific behavior of * expansion, column ordering, struct field handling, and exclusion of the _metadata column, including version-specific changes. | | [Streaming with AQS](https://learn.microsoft.com/en-us/azure/databricks/archive/azure/aqs) | integrations | 0.75 | Describes the ABS-AQS connector as an optimized file source using Azure Queue Storage to detect new files in Blob Storage; legacy connector docs generally include connector-specific options and behaviors (for example, message deletion semantics) that are expert integration details. | -| [Supported foundation models](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/supported-models) | decision-making | 0.75 | Describes which open models are supported by Databricks Foundation Model APIs, including pay-per-token endpoints and model endpoint names. This is concrete model catalog and endpoint naming information used to choose and call specific models, which is not generally knowable from training data. | -| [Sync users and groups automatically from MS Entra ID](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/automatic-identity-management) | configuration | 0.75 | Automatic identity management setup requires specific Databricks and Entra ID configuration steps and options, which are product-specific configuration details. 
| +| [Supported foundation models](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/supported-models) | decision-making | 0.75 | Lists Databricks-hosted models with details like capabilities, modalities, and possibly cost or feature support; such tables guide model selection and are product-specific decision-making aids. | | [Table visualization](https://learn.microsoft.com/en-us/azure/databricks/visualizations/tables) | configuration | 0.75 | Explains specific table visualization behaviors (pin, reorder, hide, formatting) that are configuration details unique to Databricks. | | [Table-valued function (TVF)](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-tvf) | integrations | 0.75 | Details Databricks-specific TVF invocation syntax, supported function types, version constraints, and Hive UDTF limitations. | | [Trace Node.js apps](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/app-instrumentation/typescript-sdk) | integrations | 0.75 | Covers MLflow Tracing TypeScript SDK for Node.js, including SDK-specific configuration and API parameters that are unique integration details for this product. | @@ -1260,8 +1279,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Use Azure Event Hubs as a pipelines data source](https://learn.microsoft.com/en-us/azure/databricks/ldp/event-hubs) | integrations | 0.75 | Explains processing Event Hubs messages in pipelines and notes the Structured Streaming connector is unavailable; implies specific connection patterns and constraints unique to Databricks/Lakeflow. | | [Use sinks in pipelines](https://learn.microsoft.com/en-us/azure/databricks/ldp/ldp-sinks) | configuration | 0.75 | Explicitly about the sink API and how to use it with flows; likely includes parameter names, allowed values, and configuration patterns for external sinks like Kafka and Event Hubs. 
| | [Use the Databricks connector to connect to another Databricks workspace](https://learn.microsoft.com/en-us/azure/databricks/archive/connectors/databricks) | integrations | 0.75 | Shows connector-specific options, JDBC URL patterns, and authentication parameters for cross-workspace Databricks connectivity. | -| [Users](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/users) | security | 0.75 | Explains how admins manage users in identity-federated workspaces, tied to the Databricks identity model and access control, which is specific IAM configuration/operation knowledge. | -| [Workspace email settings](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/email) | configuration | 0.75 | Describes when users get emails and how to configure notification destinations; involves specific workspace settings and options. | | [Writing data](https://learn.microsoft.com/en-us/azure/databricks/files/write-data) | configuration | 0.75 | Details default storage paths and how configurations affect write locations across Databricks components, which are product-specific configuration behaviors. | | [account command group](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/account-commands) | integrations | 0.75 | Command-group reference for account operations; exposes concrete CLI verbs, arguments, and behavior that are product-specific integration details. | | [account service-principal-federation-policy](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/account-service-principal-federation-policy-commands) | security | 0.75 | Service-principal-federation-policy commands; workload identity federation configuration with product-specific policy schema. 
| @@ -1318,14 +1335,12 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [orderBy](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/window/orderby) | integrations | 0.75 | Method-level API reference for Window.orderBy, defining how ordering is attached to a WindowSpec. | | [orderBy](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/windowspec/orderby) | integrations | 0.75 | Method-level API reference for WindowSpec.orderBy, defining ordering semantics for windows. | | [overlay](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/overlay) | integrations | 0.75 | Details Databricks PySpark overlay behavior with src, replace, pos, and len parameters, which is concrete API semantics. | -| [pandas function APIs](https://learn.microsoft.com/en-us/azure/databricks/pandas/pandas-function-apis) | integrations | 0.75 | Describes pandas function APIs, their types, and behavior (Arrow-based execution, type hints) which are concrete API patterns for Databricks users. | | [pandas to PySpark conversion](https://learn.microsoft.com/en-us/azure/databricks/pandas/pyspark-pandas-conversion) | integrations | 0.75 | Focuses on conversion using Apache Arrow, including specific APIs and constraints for interoperability between Spark and pandas on Databricks. | | [pandas_api](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/pandas_api) | integrations | 0.75 | Describes pandas_api conversion behavior; integration between Databricks PySpark and pandas-on-Spark is product-specific. | | [partitionBy](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/window/partitionby) | integrations | 0.75 | Documents Window.partitionBy behavior, a concrete configuration of windowing in Databricks PySpark. 
| | [partitionBy](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/windowspec/partitionby) | integrations | 0.75 | Documents WindowSpec.partitionBy behavior, a concrete configuration method in the Databricks PySpark API. | | [percent_rank](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/percent_rank) | integrations | 0.75 | Documents Databricks PySpark percent_rank behavior within window partitions, which is concrete API semantics. | | [pivot](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/groupeddata/pivot) | integrations | 0.75 | Explains the GroupedData.pivot API and how it performs aggregations, which is specific to the Databricks PySpark implementation. | -| [postgres](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/postgres-commands) | integrations | 0.75 | postgres command group reference includes commands and parameters for projects, branches, and endpoints, a detailed integration surface. | | [register](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourceregistration/register) | integrations | 0.75 | Method-level reference for DataSourceRegistration.register; concrete integration step to make sources available by name. | | [regr_avgx aggregate function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regr_avgx) | integrations | 0.75 | Provides exact semantics for the regr_avgx aggregate function, including null-handling rules and version applicability. This is specialized statistical function behavior in this product’s SQL dialect. | | [regr_sxx aggregate function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/regr_sxx) | integrations | 0.75 | Explains the regr_sxx aggregate function’s behavior and null-handling in Databricks SQL, which is detailed, product-specific API semantics useful for coding patterns. 
| @@ -1386,6 +1401,8 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [withColumns](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/withcolumns) | integrations | 0.75 | Describes multi-column add/replace behavior for Databricks PySpark DataFrames. | | [withColumnsRenamed](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/withcolumnsrenamed) | integrations | 0.75 | Explains multi-column rename semantics and no-op behavior, which are specific API behaviors. | | [zip_with function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/zip_with) | integrations | 0.75 | Function reference for zip_with, including element-wise merge semantics using a lambda func in Databricks SQL/Runtime, which is product-specific behavior. | +| [App environment](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/system-env) | configuration | 0.74 | A Databricks Apps 'environment' page typically enumerates the exact binaries available in the container/VM image and lists system environment variables and their meanings. Those are product-specific configuration details (names, paths, versions) that an LLM is unlikely to know from training and map to configuration parameters rather than generic concepts. | +| [Authentication for agents](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/agent-authentication) | security | 0.74 | Covers concrete authentication methods for agents (service principals, OBO scopes) and how Databricks Apps handles app and user authorization to other resources. These are product-specific security and identity configuration patterns, matching the security sub-skill. 
| | [COMMENT ON](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-comment) | configuration | 0.74 | Provides COMMENT ON syntax for multiple object types and notes on AI-generated comments and Unity Catalog scope, which are specific configuration behaviors. | | [Compute configuration](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/cluster-config) | configuration | 0.74 | Describes configuration options for connecting Databricks Connect to clusters or serverless compute, likely including specific setting names, modes, and allowed values. | | [Configure apps](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/configuration) | configuration | 0.74 | Explains how to configure app templates, permissions, authorization, and request handling, including specific configuration options and models unique to Databricks Apps. | @@ -1396,6 +1413,8 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [GET DIAGNOSTICS statement](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/control-flow/get-diagnostics-stmt) | troubleshooting | 0.74 | Explains GET DIAGNOSTICS statement syntax and usage to retrieve condition information in handlers, which is core to structured troubleshooting in scripts. | | [Limitations](https://learn.microsoft.com/en-us/azure/databricks/compute/standard-limitations) | limits-quotas | 0.74 | Explicitly a requirements and limitations page for standard compute; such pages typically enumerate concrete feature and behavioral limits tied to access mode and runtime versions. | | [MLflow Model Registry webhooks](https://learn.microsoft.com/en-us/azure/databricks/mlflow/model-registry-webhooks) | integrations | 0.74 | Describes webhook event types, payloads, and API usage for integrating Model Registry events with external systems like CI/CD and Slack, which are concrete integration patterns and API details. 
| +| [OPTIMIZE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-optimize) | configuration | 0.74 | The OPTIMIZE command reference includes product-specific SQL syntax, options, and parameters (e.g., ZORDER BY, WHERE predicates, retention/cleanup behaviors) that define how to configure Delta Lake table optimization in Databricks. These are concrete, service-specific configuration knobs and behaviors rather than generic SQL knowledge, fitting the configuration sub-skill best. | +| [Overview](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/commands) | configuration | 0.74 | A Databricks CLI command reference typically lists product-specific commands, subcommands, and parameters that go beyond generic CLI usage. These command names, flags, and their exact behaviors are expert, product-specific configuration/invocation details that an LLM is unlikely to fully know from training. The page is not just a tutorial; it is a catalog of available commands and options, which best fits the configuration category. | | [Power BI connection in Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi-uc-connect) | configuration | 0.74 | Explains creating a Power BI connection object in Unity Catalog with specific privileges (CREATE/USE CONNECTION) and configuration fields, which are product-specific settings. | | [REFRESH POLICY clause](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-materialized-view-refresh-policy) | configuration | 0.74 | Documents REFRESH POLICY clause with specific options controlling incremental refresh behavior. These are concrete configuration parameters and allowed values for a Databricks-specific feature. 
| | [Route optimization on serving endpoints](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/route-optimization) | configuration | 0.74 | Explains enabling route-optimized endpoints, changed URLs, and OAuth auth; these are specific configuration and usage details for this feature. | @@ -1405,11 +1424,12 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Snowflake federation](https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-catalog-federation) | configuration | 0.74 | The article explains how to enable Snowflake catalog federation so Unity Catalog can directly access Snowflake Managed Iceberg tables. Such federation-enablement docs typically include specific configuration steps and parameters (catalog registration, storage locations, permissions, and connector options) that are unique to Databricks–Snowflake integration. This is detailed, product-specific configuration rather than conceptual guidance, so configuration is the best fit. | | [Span tracing](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/app-instrumentation/manual-tracing/span-tracing) | integrations | 0.74 | Documents mlflow.start_span() usage, parameters, and patterns for span creation, which are specific integration APIs and coding patterns. | | [USE SCHEMA](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-use-schema) | configuration | 0.74 | Product-specific rules for schema scoping, default schema name, and DATABASE/SCHEMA terminology. | -| [ai_parse_document function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_parse_document) | integrations | 0.74 | Function reference pages for Databricks SQL typically include product-specific syntax, arguments, return types, and constraints for ai_parse_document that are not generally known to LLMs. 
This is concrete API/SQL-surface behavior, fitting integrations & coding patterns. | | [ai_prep_search function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_prep_search) | integrations | 0.74 | Describes ai_prep_search SQL function behavior, arguments, and output format for RAG/search scenarios. These are product-specific API details and configuration-like parameters, which qualify as expert integration knowledge. | | [columns](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/columns) | integrations | 0.74 | API reference for DataFrame.columns property and ordering semantics. | | [configure](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/configure-commands) | security | 0.74 | The 'configure' command reference covers authenticating with PATs or Entra ID tokens and includes security best-practice notes. It likely lists specific auth-related parameters and flags, fitting security-focused configuration. | | [count](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/count) | integrations | 0.74 | API behavior for DataFrame.count, including return type and semantics. | +| [data-classification](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/data-classification-commands) | integrations | 0.74 | The data-classification command group reference will provide exact CLI commands and options to manage data classification for Unity Catalog catalogs, including constraints like one coordinator per catalog. These are product-specific command patterns, fitting integrations. | +| [data-quality](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/data-quality-commands) | integrations | 0.74 | This page documents the data-quality CLI commands for managing data quality monitoring on Unity Catalog objects. 
It will enumerate concrete commands and parameters, which are product-specific integration details rather than conceptual guidance. | | [dtypes](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/dtypes) | integrations | 0.74 | Documents DataFrame.dtypes property and its list-of-tuples structure. | | [first](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/first) | integrations | 0.74 | API behavior for DataFrame.first, including return type Row. | | [from_avro (Avro)](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/from_avro) | integrations | 0.74 | The page documents a specific PySpark function (from_avro) in the Azure Databricks runtime, including its parameters, behavior when schema/jsonFormatSchema/subject/schemaRegistryAddress are provided or omitted, and how it integrates with Schema Registry Avro format. This is product- and API-specific integration knowledge (function signature, parameter semantics, and behavior) that goes beyond generic LLM training. It fits best under integrations & coding patterns, as it focuses on a concrete API for deserializing Avro data into Spark DataFrames rather than general concepts or limits. | @@ -1443,7 +1463,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Restart Python process](https://learn.microsoft.com/en-us/azure/databricks/libraries/restart-python-process) | best-practices | 0.72 | Provides a specific recommended pattern (install libraries then run dbutils.library.restartPython()) and explains state-loss behavior; this is a product-specific best-practice and gotcha. 
| | [Spark rewriting data](https://learn.microsoft.com/en-us/azure/databricks/optimizations/spark-ui-guide/spark-rewriting-data) | best-practices | 0.72 | Uses Databricks SQL DAG statistics for delete/update/merge to determine if Spark is rewriting more data than expected; detailed, product-specific diagnostic technique. | | [USE CATALOG](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-use-catalog) | configuration | 0.72 | Describes Databricks-specific catalog scoping rules and side effect of resetting current schema to default. | -| [Update workspace network configuration](https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/update-workspaces) | configuration | 0.72 | Step‑by‑step instructions for updating workspace VNet configuration and migrating between managed VNet and VNet injection will include specific workspace properties, supported transitions, and required parameter values—product‑specific configuration knowledge. | | [approx_top_k_accumulate aggregate function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/approx_top_k_accumulate) | integrations | 0.72 | Describes how to build and use Apache DataSketches-based state via approx_top_k_accumulate in Databricks Runtime, including how it interacts with related functions. This is specific API usage and integration behavior. | | [approx_top_k_combine aggregate function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/approx_top_k_combine) | integrations | 0.72 | Documents approx_top_k_combine behavior for merging sketch states from approx_top_k_accumulate. This is product-specific function semantics and usage patterns, aligning with integrations. 
| | [awaitAnyTermination](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerymanager/awaitanytermination) | integrations | 0.72 | Describes detailed behavior of waiting for any query termination, including timeout semantics and exception propagation, which is product-specific API behavior. | @@ -1472,16 +1491,16 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [ADD ARCHIVE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-resource-mgmt-add-archive) | integrations | 0.70 | Runtime-specific resource management command with supported archive types and relation to LIST ARCHIVE. | | [ALTER CREDENTIAL](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-credential) | configuration | 0.70 | SQL DDL reference for renaming credentials with exact syntax; this is product-specific configuration of security objects. | | [ALTER MATERIALIZED VIEW](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-materialized-view) | configuration | 0.70 | The page is a detailed SQL reference for ALTER MATERIALIZED VIEW in Databricks SQL, specifying exact syntax elements, options, and constraints that are product-specific. It enumerates parameter names (for example, view_name requirements, scheduling/refresh options, and other clauses) and their allowed forms/behaviors, which fits the configuration category more than generic language knowledge. It is not about limits, troubleshooting, or architecture, but about how to configure and modify materialized views via specific SQL options. | -| [ALTER PROVIDER](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-provider) | configuration | 0.70 | SQL reference for ALTER PROVIDER with exact operations (rename, transfer ownership), which are configuration of sharing providers. 
| +| [ALTER PROVIDER](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-provider) | integrations | 0.70 | DDL reference for ALTER PROVIDER with product-specific syntax, options, and behavior (ownership transfer, constraints, and notes about credential handling) that go beyond generic SQL knowledge and are unique to Databricks/Unity Catalog. | | [ALTER RECIPIENT](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-recipient) | configuration | 0.70 | Documents ALTER RECIPIENT syntax and behavior, configuring data sharing recipients; this is product-specific DDL configuration. | +| [ALTER TABLE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-table) | configuration | 0.70 | Detailed DDL syntax and options for ALTER TABLE in Databricks SQL, including supported/unsupported table types (for example, temporary and foreign tables) and product-specific behaviors like cache invalidation, which are not generic SQL and represent expert configuration knowledge. | | [ALTER pipeline datasets](https://learn.microsoft.com/en-us/azure/databricks/ldp/using-alter-sql) | configuration | 0.70 | Covers how Databricks SQL ALTER interacts with pipeline-defined datasets; product-specific behavior and constraints for schema changes. | | [Access control in Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/access-control) | security | 0.70 | Overview of privileges, ABAC policies, and data-level restrictions; likely enumerates specific privilege names and policy constructs that define the product’s security model. | | [Access data shared with you](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/recipient) | configuration | 0.70 | Recipient-focused guide for accessing shared data across both sharing models; includes concrete steps and catalog/SQL usage specific to Delta Sharing. 
| | [Access query history](https://learn.microsoft.com/en-us/azure/databricks/ldp/query-history) | troubleshooting | 0.70 | Focuses on using query histories and profiles to debug and optimize; likely includes product-specific UI/commands and interpretation patterns for performance troubleshooting. | | [Access trace data](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/access-trace-data) | integrations | 0.70 | Explains the MLflowTrace object structure (TraceInfo, TraceData) and how to access components via SDK; includes product-specific object models and methods. | | [Account SCIM v2.1 API](https://learn.microsoft.com/en-us/azure/databricks/reference/scim-2-1) | security | 0.70 | SCIM account API reference is about creating and managing users, groups, and service principals. This is identity and access management with specific API semantics, fitting the security sub-skill. | -| [Account management overview](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/) | configuration | 0.70 | Describes specific account console settings (Unity Catalog metastores, feature enablement, email/language/account naming) and when to use account console vs Azure portal. These are concrete configuration options and scopes unique to Databricks. | -| [Add AI-generated comments](https://learn.microsoft.com/en-us/azure/databricks/comments/ai-comments) | configuration | 0.70 | Describes specific behavior (ALTER commands triggered, impact on pipelines) and concrete steps to configure AI-generated comments. | +| [Account management overview](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/) | configuration | 0.70 | Describes concrete account-level settings (Unity Catalog metastores, feature enablement, email and language settings) and when to use account console vs Azure portal, which are product-specific configuration details. 
| | [Add custom instructions](https://learn.microsoft.com/en-us/azure/databricks/genie-code/instructions) | best-practices | 0.70 | Covers how to add custom instructions plus best practices for writing them, likely including product-specific guidance and edge cases for how instructions are applied across Genie Code features. | | [Add resources](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/resources) | configuration | 0.70 | Covers adding various platform features as resources and managing secrets; such pages typically define resource configuration options and environment variables, which are product-specific configuration details. | | [Add tracing to apps](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/app-instrumentation/) | integrations | 0.70 | Explains concrete approaches to instrumenting Python/TypeScript apps with MLflow Tracing, including specific APIs and patterns. This is integration-focused with product-specific code patterns. | @@ -1491,14 +1510,18 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Agent metadata](https://learn.microsoft.com/en-us/azure/databricks/business-semantics/agent-metadata) | configuration | 0.70 | Describes product-specific metadata fields (display names, formats, synonyms) and their usage within metric views, including version requirements, which are concrete configuration options unique to this feature. | | [Agent state and memory](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/state-management) | integrations | 0.70 | Using Lakebase Postgres for agent state with LangGraph or OpenAI Agents SDK implies concrete schema patterns, connection parameters, and SDK usage that are product-specific integration and coding patterns. 
| | [Agent systems design patterns](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/guide/agent-system-design-patterns) | architecture-patterns | 0.70 | Agent system design patterns page with recommended patterns and practical development advice; Databricks-specific guidance on when to use certain agent architectures and trade-offs. | +| [Analyze the billing logs](https://learn.microsoft.com/en-us/azure/databricks/admin/usage/system-tables) | configuration | 0.70 | Explains how to use the system.billing.usage table, including joins and fields, to monitor costs—detailed system table configuration/usage. | +| [App telemetry](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/observability) | configuration | 0.70 | Telemetry configuration for Databricks Apps using OpenTelemetry and Unity Catalog will include specific configuration options (tables, protocols, SDK settings, enablement flags). These are concrete configuration parameters and patterns unique to Databricks Apps, fitting the configuration sub-skill. | | [Async checkpoint](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/async-checkpointing) | architecture-patterns | 0.70 | Explains when to use async vs sync checkpointing with a trade-off table for latency and guarantees, a Databricks-specific streaming pattern. | | [Async progress tracking](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/async-progress-checking) | architecture-patterns | 0.70 | Describes how async progress tracking affects latency and when it cannot be used (Trigger.once/availableNow), providing Databricks-specific pattern guidance. | -| [Audit data sharing](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/audit-logs) | troubleshooting | 0.70 | Describes specific audit log events for providers and recipients; likely maps event types and fields to actions, which is product-specific diagnostic information. 
| +| [Attribute usage using serverless usage policies](https://learn.microsoft.com/en-us/azure/databricks/admin/usage/budget-policies) | configuration | 0.70 | Describes serverless usage policies, their tag configuration, and how they affect billing records—detailed configuration behavior unique to Databricks. | | [Audit log schemas for security monitoring](https://learn.microsoft.com/en-us/azure/databricks/archive/security/monitor-log-schemas) | security | 0.70 | Audit log schema documentation is security-focused and product-specific, describing fields, meanings, and structures of security monitoring logs. This fits the security sub-skill as it enables precise interpretation of Databricks security events. | | [Authentication for agents](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/agent-authentication-model-serving) | security | 0.70 | Authentication-focused page for agents accessing Databricks and external resources; likely includes concrete auth flows, token types, scopes, and Databricks-specific configuration details for Model Serving and external resources. This is product-specific security configuration rather than generic concepts. | | [Auto Loader with file events overview](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/file-events-explained) | configuration | 0.70 | Explains cloudFiles.useManagedFileEvents option; this is a specific configuration flag with product-specific behavior. | -| [Automate with Terraform](https://learn.microsoft.com/en-us/azure/databricks/repos/automate-with-terraform) | deployment | 0.70 | Explains how to use the databricks_repo Terraform resource and authentication approaches to manage Git folders. This is a Databricks-specific deployment/automation pattern using Terraform. 
| +| [Auto-enable deletion vectors](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/deletion-vectors) | configuration | 0.70 | Workspace setting controlling default creation of Delta tables with deletion vectors is a Databricks-specific configuration option with nuanced applicability. | +| [Automate with Terraform](https://learn.microsoft.com/en-us/azure/databricks/repos/automate-with-terraform) | integrations | 0.70 | Page describes configuring the Databricks Terraform provider and service principal authentication for managing Git folders. This involves product-specific provider configuration, authentication parameters, and patterns for integrating Terraform with Azure Databricks repos, which qualify as integration-focused expert knowledge beyond generic Terraform usage. | | [Automate with a service principal](https://learn.microsoft.com/en-us/azure/databricks/repos/automate-with-sp) | security | 0.70 | Explains how to link Git credentials to Databricks service principals via UI/CLI; includes identity and permission configuration details. | +| [Automatic cluster update](https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/automatic-cluster-update) | configuration | 0.70 | Explains scheduling maintenance windows, frequency, and restart behavior for automatic cluster updates, which are concrete configuration options for Databricks compute. | | [Automatically optimize prompts to improve quality](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/prompt-version-mgmt/prompt-registry/automatically-optimize-prompts) | best-practices | 0.70 | Describes mlflow.genai.optimize_prompts and GEPA algorithm usage; likely includes concrete API parameters and recommended optimization patterns unique to MLflow. 
| | [Automatically provision users (JIT)](https://learn.microsoft.com/en-us/azure/databricks/security/auth/jit) | security | 0.70 | Describes how to configure JIT provisioning for user accounts, which involves specific identity/auth settings and flows unique to Azure Databricks. These are product-specific security configuration details. | | [Azure Cosmos DB connector](https://learn.microsoft.com/en-us/azure/databricks/archive/azure/cosmosdb) | integrations | 0.70 | Legacy but focused on using the Cosmos DB Spark connector with Databricks, including product-specific integration details. | @@ -1506,31 +1529,29 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Basic embedding](https://learn.microsoft.com/en-us/azure/databricks/dashboards/share/embedding/basic) | integrations | 0.70 | Describes iframe-based embedding with Databricks-specific URLs and access behavior; this is a concrete integration pattern. | | [Batch UDFs in Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/udf/python-batch-udf) | configuration | 0.70 | Covers Unity Catalog batch Python UDFs, including how they operate on batches and performance characteristics vs row UDFs. This is a Databricks-specific UDF configuration and usage pattern, beyond generic pandas or Spark knowledge. | | [Become a provider](https://learn.microsoft.com/en-us/azure/databricks/marketplace/become-provider) | security | 0.70 | Describes assigning Marketplace admin roles and provider setup; expected to reference specific RBAC roles and permissions unique to Marketplace. | -| [Best practices](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/best-practices) | best-practices | 0.70 | Explicitly a best-practices page; Databricks CI/CD guidance typically includes product-specific workflow patterns, recommended configurations, and gotchas beyond generic CI/CD theory. 
| +| [Best practices](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/best-practices) | best-practices | 0.70 | Page explicitly focuses on CI/CD best practices and recommended workflows specific to Databricks. It likely includes concrete, product-specific recommendations and patterns for structuring repos, jobs, and deployment flows, which fit the best-practices category rather than generic CI/CD concepts. | | [Best practices](https://learn.microsoft.com/en-us/azure/databricks/genie/best-practices) | best-practices | 0.70 | Explicitly a best-practices article for creating and maintaining Genie spaces; likely includes product-specific guidance and gotchas for achieving accurate, consistent answers beyond generic LLM advice. | | [Best practices](https://learn.microsoft.com/en-us/azure/databricks/ldp/best-practices) | best-practices | 0.70 | Explicit best-practices article for Lakeflow Spark Declarative Pipelines; likely includes product-specific DO/DON’T guidance and patterns beyond generic Spark advice. | | [Best practices for cluster policies](https://learn.microsoft.com/en-us/azure/databricks/archive/compute/policies-best-practices) | best-practices | 0.70 | Explicitly a best-practices article for compute policies; these typically include concrete DO/DON'T guidance and product-specific recommendations for controlling compute creation and governance. | -| [Best practices for notebooks](https://learn.microsoft.com/en-us/azure/databricks/notebooks/best-practices) | best-practices | 0.70 | Walkthrough for version control, code sharing, testing, and CI/CD specifically for Databricks notebooks; likely includes Databricks-specific patterns, repo usage, and configuration details that go beyond generic best practices. 
| -| [Best practices for serverless workspaces](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/serverless-workspaces-best-practices) | best-practices | 0.70 | Focused on concrete DO/DON'T guidance for serverless workspaces, including product-specific recommendations around cost management, security, and platform requirements that go beyond generic advice. | +| [Best practices for notebooks](https://learn.microsoft.com/en-us/azure/databricks/notebooks/best-practices) | best-practices | 0.70 | A hands-on walkthrough with concrete, product-specific recommendations for version control, code sharing, testing, and CI/CD in Azure Databricks notebooks. It goes beyond generic advice by showing how to implement these practices in the Databricks environment, which qualifies it as expert best-practices content. | | [Best practices: DBFS and Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/dbfs/unity-catalog) | best-practices | 0.70 | Described as outlining several best practices around working with Unity Catalog external locations and DBFS, focused on secure and recommended usage patterns specific to Azure Databricks. This is product-specific guidance rather than generic concepts. | -| [Billable usage system table](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/billing) | configuration | 0.70 | Reference for the billable usage system table (system.billing.usage) will include the exact table path, schema, column names, and example queries, which are concrete product-specific data structures and usage patterns that qualify as expert configuration/usage knowledge. | | [Bind a catalog](https://learn.microsoft.com/en-us/azure/databricks/catalogs/binding) | security | 0.70 | How-to page for restricting catalog access to workspaces with concrete Unity Catalog constructs and permission behavior; product-specific security configuration details beyond generic concepts. 
| | [Boxplot visualization](https://learn.microsoft.com/en-us/azure/databricks/visualizations/boxplot) | configuration | 0.70 | Covers detailed configuration options for box charts in Databricks; these are product-specific visualization settings. | | [Branches](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-branches) | configuration | 0.70 | Branch management for dev/test/prod workflows; likely includes branch operations, constraints, and permission requirements specific to Lakebase. | | [Bring your own lineage](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/external-lineage) | configuration | 0.70 | Explains how to add external lineage metadata, including specific APIs or table structures for lineage updates; these are product-specific configuration/integration patterns. | +| [Build with Supervisor API](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/supervisor-api-app) | integrations | 0.70 | Explains using Supervisor API for orchestration instead of custom loops, tied to Databricks Apps agent.py behavior and endpoints. This is product-specific integration behavior and a coding pattern for combining Apps with Supervisor API, which qualifies as expert integration knowledge. | +| [Built-in functions](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-functions-builtin) | integrations | 0.70 | A reference for Databricks SQL built-in functions, covering product-specific function names, signatures, and behaviors that go beyond generic SQL knowledge. This API/SDK-style surface area fits best under integrations & coding patterns: the focus is concrete callable functions and their usage, not limits, architecture, or deployment. 
| | [Bundles in air-gapped environments](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/airgapped-environment) | deployment | 0.70 | Describes setting up bundles in air-gapped networks, including required Docker images and dependency URLs; these are product-specific deployment requirements. | | [CALL](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-call) | integrations | 0.70 | This is a precise SQL syntax reference for the Databricks-specific CALL statement, including argument passing, nesting depth (up to 64 levels), and behavior details that are product- and version-specific. That makes it expert integration/coding-pattern knowledge beyond generic SQL, but it does not focus on limits tables, decisions, or configuration matrices. | | [CATALOGS](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/catalogs) | configuration | 0.70 | CATALOGS relation schema and behavior are Databricks-specific metadata configuration details. | | [CATALOG_PRIVILEGES](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/catalog_privileges) | security | 0.70 | Describes INFORMATION_SCHEMA.CATALOG_PRIVILEGES, a security/permissions metadata relation; includes product-specific privilege and principal semantics that map directly to access control behavior. | | [CATALOG_PROVIDER_SHARE_USAGE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/catalog_provider_share_usage) | configuration | 0.70 | Describes CATALOG_PROVIDER_SHARE_USAGE relation, a Databricks-specific metadata table for mounted provider shares, which is configuration/metadata structure. | | [CATALOG_TAGS](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/catalog_tags) | configuration | 0.70 | Defines CATALOG_TAGS relation and how tags are represented; this is Databricks-specific metadata configuration. 
| -| [CLI](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/cli) | integrations | 0.70 | CLI usage for Lakebase will document specific commands, flags, and configuration keys that are product-specific integration details. | | [COLUMN_TAGS](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/column_tags) | configuration | 0.70 | COLUMN_TAGS relation structure and behavior are Databricks-specific metadata configuration details. | | [CONSTRAINT clause](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-table-constraint) | integrations | 0.70 | Documents CONSTRAINT clause for informational primary/foreign keys and how to add check constraints via ALTER TABLE. Contains concrete syntax and behavior specific to Databricks Delta/SQL. | | [CONSTRAINT_COLUMN_USAGE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/constraint_column_usage) | configuration | 0.70 | CONSTRAINT_COLUMN_USAGE relation describes how constraints reference columns; this is specific metadata schema for Databricks. | | [CONSTRAINT_TABLE_USAGE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/constraint_table_usage) | configuration | 0.70 | CONSTRAINT_TABLE_USAGE relation structure and filtering are Databricks-specific metadata configuration. | | [CREATE CATALOG](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-catalog) | integrations | 0.70 | Detailed SQL DDL reference for CREATE CATALOG including Unity Catalog specifics and FOREIGN catalog behavior. Contains product-specific syntax and semantics that function as API/DDL parameters, matching integrations & coding patterns. 
| -| [CREATE MATERIALIZED VIEW](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-materialized-view) | integrations | 0.70 | The page is a SQL language manual entry for CREATE MATERIALIZED VIEW in Databricks, with Databricks-specific semantics (backing ETL pipeline, refresh behavior, supported clauses). This is concrete, product-specific SQL syntax and behavior, fitting integrations/coding patterns rather than generic concepts. | | [CREATE SHARE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-share) | configuration | 0.70 | CREATE SHARE syntax and privilege requirements for Unity Catalog Delta Sharing are product-specific. | | [CREATE TABLE LIKE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-table-like) | configuration | 0.70 | CREATE TABLE LIKE behavior and version-specific support for Delta Lake in Databricks Runtime. | | [Cache prompts](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/prompt-version-mgmt/prompt-registry/cache-prompts) | configuration | 0.70 | Explains caching strategy differences for version vs alias access; likely includes product-specific behavior and possibly parameters affecting cache usage. | @@ -1538,27 +1559,28 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Catalog class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog) | integrations | 0.70 | API reference for Catalog class with method signatures and behavior is product-specific integration knowledge (SDK surface) beyond generic concepts. 
| | [Catalog commits](https://learn.microsoft.com/en-us/azure/databricks/delta/catalog-commits) | configuration | 0.70 | How-to page for enabling catalog commits; likely includes specific table properties, configuration flags, and allowed values for turning on catalog-level commit coordination, which fits configuration details. | | [Change data feed](https://learn.microsoft.com/en-us/azure/databricks/delta/delta-change-data-feed) | configuration | 0.70 | Describes how to enable and query change data feed, including table properties and options tied to Delta table history. These are product-specific configuration knobs and usage patterns for CDF. | -| [Change default workspace access to Consumer access](https://learn.microsoft.com/en-us/azure/databricks/security/auth/change-default-workspace-access) | security | 0.70 | Describes how to use group cloning to change default workspace access for new users to consumer access. This involves concrete group/permission configuration patterns unique to Databricks. | +| [Change default workspace access to Consumer access](https://learn.microsoft.com/en-us/azure/databricks/security/auth/change-default-workspace-access) | security | 0.70 | Describes a Databricks-specific pattern (group cloning) to separate consumer vs authoring access, with concrete steps—security configuration guidance. | | [Chart visualizations](https://learn.microsoft.com/en-us/azure/databricks/visualizations/charts) | configuration | 0.70 | Outlines available chart options for Databricks visualizations; these are product-specific configuration parameters. | | [Checkpoint V2](https://learn.microsoft.com/en-us/azure/databricks/delta/checkpoint-v2) | configuration | 0.70 | Describes enabling Checkpoint V2 with runtime/version requirements; likely includes table properties or configuration parameters to turn on the feature and constraints around usage, which are product-specific configuration details. 
| | [Checkpoints](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/checkpoints) | configuration | 0.70 | Details checkpoint directory structure, required uniqueness per query, and behavior when changing/deleting checkpoints—product-specific operational semantics beyond generic streaming concepts. | | [Choose a compute type for your workload](https://learn.microsoft.com/en-us/azure/databricks/compute/choose-compute) | decision-making | 0.70 | Page is explicitly about compute selection recommendations and likely includes workload-based guidance and trade-offs between Databricks compute options, which is product-specific decision-making beyond generic concepts. | | [Clean Room notebook](https://learn.microsoft.com/en-us/azure/databricks/jobs/clean-room-notebook) | configuration | 0.70 | Describes Clean Room notebook tasks; likely includes task configuration parameters and Clean Room–specific options, which are product-specific. | +| [Clean room system table](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/clean-rooms) | configuration | 0.70 | Defines the clean room events system table path and schema, which is detailed system metadata/configuration not derivable from general knowledge. | | [Clone Parquet and Iceberg tables](https://learn.microsoft.com/en-us/azure/databricks/ingestion/data-migration/clone-parquet) | integrations | 0.70 | Covers Azure Databricks clone functionality for Parquet and Iceberg, including runtime version requirement (11.3 LTS+), preview status, use cases, limitations, and examples. This is detailed, product-specific behavior for integrating external table formats with Delta Lake, fitting integrations & coding patterns more than other categories. 
| | [Cluster-named init scripts (legacy)](https://learn.microsoft.com/en-us/azure/databricks/archive/init-scripts/legacy-cluster-named) | configuration | 0.70 | Legacy cluster-named init scripts include product-specific behavior (best-effort execution, how they bind to cluster names) and configuration details unique to Databricks init script handling that are not generic knowledge. | | [Clusters UI changes and cluster access modes](https://learn.microsoft.com/en-us/azure/databricks/archive/compute/cluster-ui-preview) | configuration | 0.70 | Explains new vs legacy cluster UI and access mode configuration, including product-specific compute configuration behavior. | | [Code help using Genie Code](https://learn.microsoft.com/en-us/azure/databricks/notebooks/code-assistant) | best-practices | 0.70 | Provides tips on getting the most out of Genie Code, which are product-specific usage recommendations and gotchas rather than generic coding advice. | | [Code interpreter tools](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/code-interpreter-tools) | integrations | 0.70 | Describes the system.ai.python_exec Unity Catalog function and how agents invoke it. This is a specific callable function with defined behavior and usage in SQL/agents, matching integration & coding patterns with product-specific APIs. | | [Code-based scorer examples](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/code-based-scorer-examples) | integrations | 0.70 | Provides many concrete scorer implementations, input/output signatures, and error-handling patterns—rich product-specific coding patterns for MLflow Evaluation. 
| +| [Coding agent integration](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/coding-agent-integration-beta) | integrations | 0.70 | Integration with specific coding agents (Cursor, Gemini CLI, Codex CLI) will require product-specific configuration parameters, endpoint URLs, and possibly token or header settings unique to Databricks’ AI Gateway, fitting the integrations & coding patterns category. | | [Cohort visualization](https://learn.microsoft.com/en-us/azure/databricks/visualizations/cohorts) | configuration | 0.70 | Describes configuration options for cohort visualizations, which are Databricks-specific settings. | | [Collaborate using notebooks](https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebooks-collaborate) | security | 0.70 | Covers sharing notebooks, permissions, and mentions access control availability only in the Premium plan, which is product-specific security and RBAC behavior. | | [Column mapping](https://learn.microsoft.com/en-us/azure/databricks/delta/column-mapping) | configuration | 0.70 | Details how to enable and use column mapping to rename/drop columns and support special characters without rewriting data files. Includes specific table properties/behaviors unique to Delta on Databricks, which are configuration-focused expert details. | +| [Column mask clause](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-column-mask) | security | 0.70 | Documents product-specific column mask clause syntax and behavior for fine-grained access control, including how masks are applied on query, interaction with user identity/groups, and Unity Catalog constraints—security configuration details unique to Databricks. 
| | [Column selection](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/column-selection) | configuration | 0.70 | Describes specific table configuration properties to include or exclude columns in managed connector pipelines, with example pipeline definitions. This is product-specific configuration behavior rather than generic ingestion guidance. | -| [Common patterns](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/common-patterns) | best-practices | 0.70 | Explicitly about patterns and techniques to optimize managed ingestion pipelines, including controlling ingested data and configuring advanced behaviors. These are Databricks-specific best-practice patterns. | | [Compliance security profile overview](https://learn.microsoft.com/en-us/azure/databricks/security/privacy/security-profile) | security | 0.70 | The page describes a product-specific compliance security profile for Azure Databricks, including concrete security/compliance controls and supported features that are unique to this service. It provides detailed, implementation-focused security configuration and behavior information that goes beyond generic security concepts, fitting the security sub-skill. | | [Compound statement](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/control-flow/compound-stmt) | integrations | 0.70 | The page documents product-specific SQL control-flow syntax for Azure Databricks (BEGIN...END compound blocks, ATOMIC behavior, transactional semantics, and notebook invocation constraints). This is detailed, implementation-specific language behavior that goes beyond generic SQL knowledge and is needed for correctly integrating and scripting against Databricks SQL. It does not focus on limits, security, deployment, or troubleshooting, but on precise language constructs and their behavior. 
| | [Compute creation cheat sheet](https://learn.microsoft.com/en-us/azure/databricks/cheat-sheet/compute) | best-practices | 0.70 | Aimed at clear, opinionated guidance for compute creation to optimize performance and cost; Databricks-specific recommendations for choosing compute types qualify as product-specific best practices. | -| [Compute system tables](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/compute) | configuration | 0.70 | Provides a reference for compute system tables, including specific table names and schema used to monitor clusters and node types. These are concrete, product-specific structures and fields that an LLM wouldn’t reliably know without the doc. | | [Compute-scoped libraries](https://learn.microsoft.com/en-us/azure/databricks/libraries/cluster-libraries) | configuration | 0.70 | Details using the Install library UI and policy-enforced library behavior; likely includes product-specific options and constraints for cluster libraries. | | [Conditional access](https://learn.microsoft.com/en-us/azure/databricks/archive/azure-admin/conditional-access) | security | 0.70 | The page describes enabling conditional access for Azure Databricks, which involves identity and access configuration. Such content typically includes specific policy settings and scopes, fitting the security sub-skill. | | [Configuration parameters](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-parameters) | configuration | 0.70 | Defines how configuration parameters work in Databricks SQL and how effective values are derived from multiple levels; this is product-specific configuration behavior not generally known. 
| @@ -1570,19 +1592,17 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Configure Slack notifications](https://learn.microsoft.com/en-us/azure/databricks/ai-bi/admin/slack-subscriptions) | configuration | 0.70 | Describes creating a Slack app and configuring channels as notification destinations; involves specific configuration steps and parameters unique to this integration. | | [Configure an Azure Network Security Perimeter for serverless compute access](https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/serverless-nsp-firewall) | security | 0.70 | Configuring Azure Network Security Perimeter for serverless access to Azure resources involves specific NSP resource settings, policies, and allowed connections. These are detailed, product-specific security configuration parameters. | | [Configure an Azure Storage firewall for serverless compute access (Legacy)](https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/serverless-firewall) | security | 0.70 | Explains how to configure Azure storage firewalls to allow Databricks serverless compute, including Databricks-specific network and firewall settings. This is concrete security/network configuration, even though the feature is legacy. | -| [Configure diagnostic log delivery](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/audit-log-delivery) | security | 0.70 | Describes how to enable audit log delivery with specific Azure roles and permissions (e.g., required actions on Microsoft.Databricks/workspaces), which are concrete security configuration details. 
| -| [Configure endpoints](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/configure-endpoints-beta) | configuration | 0.70 | Explicitly about configuring AI Gateway endpoints; such pages typically include endpoint configuration parameters, allowed values, and defaults, matching the configuration sub-skill. | | [Configure for high availability](https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/create/high-availability) | configuration | 0.70 | Describes enabling readable secondary instances; likely includes specific HA settings, replication options, and constraints unique to Lakebase, which are configuration details. | | [Configure managed storage](https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/managed-storage) | configuration | 0.70 | Describes how to associate managed storage locations at metastore, catalog, and schema levels and how precedence works; this is product-specific configuration behavior (hierarchy overrides, recommended assignment level) that an LLM would not reliably infer without the docs. | | [Configure networking](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/networking) | security | 0.70 | Networking configuration for apps with IP access lists, private connectivity, and network policies is security-focused and likely includes specific settings and patterns unique to Databricks Apps. | | [Configure service endpoint policies for storage access](https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/service-endpoints) | security | 0.70 | Explains how to configure Azure virtual network service endpoint policies specifically for Azure Databricks classic compute to restrict storage access, including product-specific network/security settings. 
| | [Configure single-use refresh tokens](https://learn.microsoft.com/en-us/azure/databricks/integrations/single-use-tokens) | security | 0.70 | Explains and configures single-use refresh tokens via Azure Databricks API. Contains product-specific API fields, security settings, and configuration steps for token rotation that qualify as detailed security configuration. | | [Connect agents to external services](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/external-connection-tools) | integrations | 0.70 | Explains multiple product-specific approaches (MCP servers, managed OAuth, UC connections proxy, direct API calls) and relies on Unity Catalog HTTP connections. This is integration-focused with Databricks-specific connection patterns and configuration details. | +| [Connect to AI Runtime](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/connecting) | configuration | 0.70 | Explains how to connect from notebooks, scheduled jobs, and Jobs API, which typically involves product-specific connection parameters and API usage patterns not captured by generic knowledge. | | [Connect to Amazon S3](https://learn.microsoft.com/en-us/azure/databricks/archive/storage/amazon-s3) | integrations | 0.70 | S3 integration requires specific configuration such as access/secret keys, IAM roles, S3 URL schemes, and Databricks-specific mount or read/write options. | | [Connect to Azure Data Lake Storage](https://learn.microsoft.com/en-us/azure/databricks/archive/storage/tutorial-azure-storage) | integrations | 0.70 | Tutorial for ADLS access via service principal OAuth typically includes tenant IDs, endpoint URLs, Spark config keys, and token-related parameters specific to this product integration. | | [Connect to MCP servers](https://learn.microsoft.com/en-us/azure/databricks/genie-code/mcp) | integrations | 0.70 | Focuses on connecting Genie Code to MCP servers (external tools/data). 
This is an integration pattern specific to Databricks + MCP, likely with configuration details and constraints such as Agent mode-only support. | | [Connect to SQL Workbench/J](https://learn.microsoft.com/en-us/azure/databricks/archive/partners/workbenchj) | integrations | 0.70 | Article on using SQL Workbench/J with Databricks generally includes JDBC URL formats, driver settings, and configuration parameters specific to this tool and Databricks, which are concrete integration patterns rather than generic SQL guidance. | -| [Connect to managed ingestion sources (Lakeflow Connect)](https://learn.microsoft.com/en-us/azure/databricks/connect/managed-ingestion) | security | 0.70 | Describes creating connection objects that store authentication details and USE CONNECTION privileges; product-specific IAM and connection configuration. | | [Connect using token authentication](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/connect-local) | security | 0.70 | Describes OAuth 2.0 Bearer token authentication for Databricks app HTTP APIs, including applicability constraints (only /api/routes apps). This is product-specific security/auth configuration and usage guidance. | | [Connect with DBeaver](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/connect-dbeaver) | integrations | 0.70 | DBeaver-specific connection instructions; likely includes driver selection and connection property values tailored to Lakebase. | | [Connect with pgAdmin](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/connect-pgadmin) | integrations | 0.70 | Shows how to configure pgAdmin for Lakebase and monitor performance; likely includes connection property values and monitoring setup specific to Lakebase. 
| @@ -1595,7 +1615,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Create a Foundation Model Fine-tuning run (UI)](https://learn.microsoft.com/en-us/azure/databricks/large-language-models/foundation-model-training/ui) | configuration | 0.70 | Explains how to create and configure runs using the UI; such UI docs typically enumerate setting names and allowed values for the Databricks fine-tuning feature, which are product-specific configuration details. | | [Create a JAR](https://learn.microsoft.com/en-us/azure/databricks/jobs/jar-create) | configuration | 0.70 | Describes compatibility requirements and project configuration for different Databricks compute types, including build settings that are product-specific. | | [Create a SQL warehouse](https://learn.microsoft.com/en-us/azure/databricks/compute/sql-warehouse/create) | configuration | 0.70 | Describes requirements and advanced configuration options for SQL warehouses, likely including specific settings and parameters in the UI/API, which is product-specific configuration knowledge beyond generic deployment steps. | -| [Create a clean room](https://learn.microsoft.com/en-us/azure/databricks/clean-rooms/create-clean-room) | configuration | 0.70 | Describes creating clean rooms via UI and notes key features and limitations; includes product-specific configuration steps and constraints. | | [Create a custom judge](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/custom-judge/create-custom-judge) | integrations | 0.70 | Step-by-step tutorial using make_judge() with concrete code and workflow for evaluating a support agent—product-specific integration and coding patterns. 
| | [Create a custom multi-agent system](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/multi-agent-apps) | architecture-patterns | 0.70 | Describes a multi-agent orchestrator pattern with supported subagent types; provides Databricks-specific architectural guidance on routing and composition of agents and services. | | [Create a listing (provider)](https://learn.microsoft.com/en-us/azure/databricks/marketplace/create-listing) | configuration | 0.70 | Covers creating shares and listing types (datasets, MCP server, Git repo); likely includes listing configuration options and required fields specific to Marketplace. | @@ -1605,12 +1624,18 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Create an SAP BDC connection](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/sap-bdc/create-connection) | configuration | 0.70 | Page is about creating and managing an SAP Business Data Cloud connection in Azure Databricks Delta Sharing. This typically includes product-specific connection parameters, options, and settings (such as connection properties, authentication settings, and management operations), which qualify as configuration details beyond generic tutorial content. | | [Create and manage Git folders](https://learn.microsoft.com/en-us/azure/databricks/repos/git-operations-with-repos) | configuration | 0.70 | Step-by-step Git operations (clone, branch, commit, push) within Git folders, reflecting Databricks-specific workspace behavior. | | [Create and manage feature tables (legacy)](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/workspace-feature-store/feature-tables) | configuration | 0.70 | How-to for creating, updating, and browsing feature tables; likely includes specific API calls, options, and behaviors unique to Workspace Feature Store, which are configuration-like expert details. 
| -| [Create and manage shares](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/create-share) | configuration | 0.70 | Explains how to define and manage share objects in Unity Catalog; likely includes specific commands/parameters for configuring shares and asset inclusion behavior. | +| [Create and manage governed tags](https://learn.microsoft.com/en-us/azure/databricks/admin/governed-tags/manage-governed-tags) | security | 0.70 | Creating and managing governed tags in Databricks is tightly tied to access control and policy enforcement; such a page typically documents specific tag-related controls and constraints unique to the product, which falls under product-specific security/governance configuration rather than generic tagging concepts. | +| [Create and manage recipients (tokens)](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/create-recipient-token) | security | 0.70 | Describes creating recipient objects and configuring bearer-token based authentication for non-Databricks users. This is product-specific security configuration (recipient object, bearer token method, token handling) that goes beyond generic security concepts. | | [Create and manage volumes](https://learn.microsoft.com/en-us/azure/databricks/volumes/utility-commands) | configuration | 0.70 | Provides concrete syntax and options for CREATE/DROP/MANAGE volumes, including product-specific parameters and patterns for Unity Catalog volumes. This is configuration-level expert knowledge beyond generic LLM understanding. | +| [Create and monitor budgets](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/budgets) | decision-making | 0.70 | Details how budgets are defined (USD, list price, filters) and how to apply them to teams/projects; this is concrete guidance for planning and monitoring spend decisions. 
| | [Create basic resources with Terraform](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/terraform/cluster-notebook-job) | integrations | 0.70 | How-to article with Databricks Terraform provider resource blocks and parameters for clusters, notebooks, and jobs. Contains product-specific configuration patterns and fields beyond generic Terraform usage. | | [Create catalogs](https://learn.microsoft.com/en-us/azure/databricks/catalogs/create-catalog) | configuration | 0.70 | Shows how to create catalogs via Catalog Explorer and SQL; likely includes specific SQL syntax, parameters, and options unique to Unity Catalog, which are product-specific configuration details. | | [Create private exchanges (provider)](https://learn.microsoft.com/en-us/azure/databricks/marketplace/private-exchange) | configuration | 0.70 | Explains creating and managing private exchanges for invited consumers; likely includes specific settings/flags controlling visibility and access for private listings. | +| [Create service credentials](https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-services/service-credentials) | security | 0.70 | Describes how to create and govern service credential objects in Unity Catalog for external cloud services. This is identity/access configuration specific to Azure Databricks and Unity Catalog, likely including object types, permissions, and scope details that qualify as product-specific security configuration. | | [Create vector search endpoints and indexes](https://learn.microsoft.com/en-us/azure/databricks/vector-search/create-vector-search) | configuration | 0.70 | Describes how to create and configure vector search endpoints and indexes via UI, Python SDK, and REST API. This typically includes product-specific parameters (for endpoints, indexes, and embedding models) and concrete configuration patterns that go beyond generic knowledge, fitting the configuration sub-skill. 
| +| [Create workspace using Azure Portal](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/create-workspace) | deployment | 0.70 | Provides product-specific deployment steps for classic and serverless workspaces via the Azure portal, including workspace-type-specific deployment behavior. | +| [Create workspace using PowerShell](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/powershell) | deployment | 0.70 | Describes deploying and reviewing a Databricks workspace with PowerShell, including required modules and commands, which are product-specific deployment details. | +| [Create workspace using the Azure CLI](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/azure-cli) | deployment | 0.70 | Explains how to create a Databricks deployment using Azure CLI, which involves product-specific deployment commands and parameters. | | [Create, edit, and use prompts](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/prompt-version-mgmt/prompt-registry/create-and-edit-prompts) | configuration | 0.70 | How-to for creating and versioning prompts via Python SDK and UI; likely includes specific API calls and parameters for prompt objects. | | [Credential passthrough (legacy)](https://learn.microsoft.com/en-us/azure/databricks/archive/credential-passthrough/) | security | 0.70 | Describes how credential passthrough works, its deprecation, and how it secures storage access in Databricks with product-specific security behavior and configuration. | | [Credential redaction](https://learn.microsoft.com/en-us/azure/databricks/security/keys/redaction) | security | 0.70 | Explains product-specific behavior for how Databricks redacts credentials in logs, which is a security feature unique to the platform and not generic knowledge. 
| @@ -1632,12 +1657,12 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [DROP VOLUME](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-drop-volume) | limits-quotas | 0.70 | Includes a specific retention period for managed volumes: files are retained for 7 days before deletion, which is a concrete product limit not generally known. | | [Dashboard themes](https://learn.microsoft.com/en-us/azure/databricks/ai-bi/admin/themes) | configuration | 0.70 | Explains creating, updating, and deleting workspace themes; theming options (colors, fonts, alignment) are configuration settings specific to AI/BI dashboards. | | [Data API](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/data-api) | integrations | 0.70 | Describes a PostgREST-compatible HTTP interface specific to Lakebase, including schema-derived endpoints and secure CRUD operations. This is a product-specific API integration pattern rather than a generic concept. | -| [Data access configuration](https://learn.microsoft.com/en-us/azure/databricks/admin/sql/data-access-configuration) | security | 0.70 | Data access configuration for SQL warehouses is security-related and usually includes specific properties, flags, and access modes that control external data object access; these are product-specific IAM/data access settings beyond generic knowledge. | -| [Data lineage system tables](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/lineage) | configuration | 0.70 | A reference for lineage system tables, including table names, schemas, and how lineage records are emitted. These are product-specific table structures and behaviors that qualify as configuration/reference details beyond generic knowledge. 
| +| [Data classification](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-classification) | configuration | 0.70 | Explains how to use Databricks Data Classification with Unity Catalog; this usually involves specific configuration steps, options, and possibly classifier settings unique to the service, which counts as expert configuration knowledge beyond generic data classification theory. | | [Data protection](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/data-protection) | security | 0.70 | Covers customer-managed keys, protected branches, and private connectivity. These involve specific security configurations, network settings, and access controls unique to Lakebase, aligning with security. | | [Data quality](https://learn.microsoft.com/en-us/azure/databricks/ldp/expectations) | configuration | 0.70 | Covers expectations syntax, behavior options, and how to fail updates or drop records; these are product-specific configuration options for data quality enforcement. | | [Database objects in the legacy Hive metastore](https://learn.microsoft.com/en-us/azure/databricks/database-objects/hive-metastore) | best-practices | 0.70 | Details behavioral differences and gotchas between Unity Catalog and legacy Hive metastore, including recommendations and edge cases. | | [Databricks AI assistive features trust and safety](https://learn.microsoft.com/en-us/azure/databricks/databricks-ai/databricks-ai-trust) | security | 0.70 | Focuses on data protection, privacy, and trust for AI assistive features; likely includes product-specific handling of user data, retention, and compliance behaviors that are not generic security concepts. 
| +| [Databricks Apps](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/databricks-apps) | integrations | 0.70 | Explains how to connect a Lakebase database to Databricks Apps as a managed Postgres backend, likely including app resource configuration, service principal behavior, and connection details that are specific to this integration. | | [Databricks Autologging](https://learn.microsoft.com/en-us/azure/databricks/mlflow/databricks-autologging) | configuration | 0.70 | Covers Databricks Autologging behavior and customization, including what is automatically logged for each supported library and how to configure it, which is Databricks-specific configuration/integration detail. | | [Databricks CLI from Azure Cloud Shell](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/databricks-cli-from-azure-cloud-shell) | integrations | 0.70 | Describes how to configure and use the Databricks CLI specifically within Azure Cloud Shell, including environment variables/commands that are integration-specific. | | [Databricks Runtime 11.x migration](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/11.x-migration) | decision-making | 0.70 | This is a migration guide that helps decide how and when to move workloads from 10.4 LTS or above to 11.x. It contains version-specific migration paths and guidance, which fits the decision-making category around upgrade paths and migration considerations. | @@ -1654,11 +1679,11 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Dataiku](https://learn.microsoft.com/en-us/azure/databricks/partners/ml/dataiku) | integrations | 0.70 | Partner integration article with product-specific connection steps and configuration details for connecting Databricks clusters and SQL warehouses to Dataiku. 
| | [Declarative Automation Bundles](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-with-bundles) | configuration | 0.70 | Covers using Declarative Automation Bundles to define Lakebase projects, branches, and endpoints as IaC. This relies on bundle resource schemas and configuration options (resource types, fields), which are product-specific configuration details. | | [Deep learning pipelines migration guide](https://learn.microsoft.com/en-us/azure/databricks/archive/spark-3.x-migration/deep-learning-pipelines) | best-practices | 0.70 | Provides Databricks-specific migration tips and gotchas for moving off the Deep Learning Pipelines package, including what components were removed and how to replace them, which are concrete product behaviors and edge cases. | -| [Default policies and policy families](https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/policy-families) | best-practices | 0.70 | Describes default policies for specific use cases and notes that each enforces best practices; includes concrete policy behaviors and how to customize them, which are product-specific best practices. | +| [Default policies and policy families](https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/policy-families) | best-practices | 0.70 | Describes default policies for specific use cases and the rules they enforce as best practices; includes product-specific recommendations for when to use each policy. | | [Default storage](https://learn.microsoft.com/en-us/azure/databricks/storage/default-storage) | configuration | 0.70 | Explains Databricks-specific behavior of default storage and how catalogs and data objects are created against it, including configuration patterns unique to the service. 
| | [Default workspace catalog](https://learn.microsoft.com/en-us/azure/databricks/catalogs/default) | decision-making | 0.70 | Discusses how to choose which catalog should be the default and how to change it; provides product-specific decision guidance and configuration steps for default catalog selection. | | [Define environment variables](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/environment-variables) | configuration | 0.70 | Covers defining environment variables via app.yaml in the env section, including default variables set by the platform. This is product-specific configuration behavior and likely includes parameter names and usage patterns unique to Databricks Apps. | -| [Deletion vectors](https://learn.microsoft.com/en-us/azure/databricks/delta/deletion-vectors) | best-practices | 0.70 | Explains how deletion vectors change DELETE/UPDATE/MERGE behavior, when they are recommended, and their interaction with storage and performance. These are Databricks-specific optimization practices and behavioral gotchas. | +| [Deletion vectors](https://learn.microsoft.com/en-us/azure/databricks/delta/deletion-vectors) | best-practices | 0.70 | Describes Databricks-specific deletion vector behavior (marking rows instead of rewriting Parquet files) and recommendations on when to enable them. This is a product-specific performance and storage optimization pattern, not generic SQL knowledge. | | [Delta Sharing](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-sharing) | security | 0.70 | Delta Sharing is a security-focused feature; the page describes providers, recipients, and shares with Databricks-specific access and sharing semantics that go beyond generic concepts. 
| | [Delta Sharing shared tables](https://learn.microsoft.com/en-us/azure/databricks/query/formats/deltasharing) | integrations | 0.70 | Provides Databricks-specific format keyword (deltasharing) and syntax examples for reading shared tables, which are concrete integration patterns beyond generic Spark knowledge. | | [Delta client access](https://learn.microsoft.com/en-us/azure/databricks/external-access/unity-rest) | integrations | 0.70 | Explains how to create, read, and write Unity Catalog tables from external Delta clients via the Unity REST API. This implies product-specific API usage and parameters for integrating external Delta clients with Databricks, which matches integrations & coding patterns. | @@ -1666,18 +1691,15 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Deploy Python code with Model Serving](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/deploy-custom-python-code) | integrations | 0.70 | Shows how to deploy Python code using MLflow pyfunc with Databricks Model Serving, including how to structure code, handle preprocessing/postprocessing, and package it. This is a concrete coding pattern/integration between Python/MLflow and Databricks serving. | | [Deploy a bundle from the workspace](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/workspace-deploy) | deployment | 0.70 | Covers deployment behavior, unique bundle identities, and synchronization; these are product-specific deployment semantics and constraints. | | [Deploy a feature serving endpoint](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/feature-serving-tutorial) | deployment | 0.70 | Step-by-step tutorial using the Databricks SDK to deploy and query feature serving endpoints; includes deployment-specific API calls and requirements unique to Databricks Feature Serving. 
| -| [Deploy apps](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/deploy) | deployment | 0.70 | Covers how Databricks apps are built and run using platform-specific deployment flows (UI and CLI) and project configuration, which are product-specific deployment details rather than generic concepts. | -| [Deploy custom models](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/custom-models) | configuration | 0.70 | Details supported model logging options, compute types, dependency packaging, and scaling expectations. These are product-specific configuration options and constraints (e.g., supported runtimes, instance types) that qualify as configuration expert knowledge. | +| [Deploy custom models](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/custom-models) | configuration | 0.70 | Describes supported model logging options, compute types, dependency packaging, and endpoint behavior for Mosaic AI Model Serving—product-specific configuration details beyond generic knowledge. | | [Deploy on Databricks](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/prod-tracing) | deployment | 0.70 | Describes deploying GenAI apps on Databricks via Mosaic AI Agent Framework or Model Serving with automatic trace capture; includes Databricks-specific deployment methods and constraints for production tracing. | | [Deploy outside of Databricks](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/prod-tracing-external) | deployment | 0.70 | Covers how to configure externally deployed agents to send traces to Databricks, including endpoint configuration and deployment considerations—product-specific deployment patterns. 
| | [Deployment Jobs](https://learn.microsoft.com/en-us/azure/databricks/mlflow/deployment-job) | deployment | 0.70 | Explains MLflow deployment jobs as part of lifecycle; likely includes product-specific deployment behavior and constraints for Unity Catalog models. | -| [Diagnostic logs reference guide](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/audit-logs) | configuration | 0.70 | Provides a comprehensive reference of audit log services and events, which is product-specific expert knowledge (event names, services, and log structure) used for configuring and consuming logs. | -| [Disable legacy features](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/legacy-features) | configuration | 0.70 | Describes an account setting that controls legacy feature availability, including date-based behavior and which features are affected. This is product-specific configuration behavior not derivable from generic knowledge. | +| [Diagnostic logs reference guide](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/audit-logs) | configuration | 0.70 | Provides a comprehensive reference of audit/diagnostic log services and event types; this is detailed mapping of services to events, akin to configuration metadata. | | [Disable the legacy Hive metastore](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/disable-hms) | configuration | 0.70 | Explains a specific workspace admin setting to block Hive metastore access; this is a concrete configuration option with defined behavior, part of product-specific settings. | | [Disk caching](https://learn.microsoft.com/en-us/azure/databricks/optimizations/disk-cache) | best-practices | 0.70 | Explains Databricks disk caching behavior, when it applies (Parquet/Delta, SQL warehouses, runtimes), and how it impacts performance, which is product-specific. 
| | [Distributed deep learning with HorovodRunner](https://learn.microsoft.com/en-us/azure/databricks/archive/machine-learning/train-model/horovod-runner) | integrations | 0.70 | Details how to use HorovodRunner to launch Horovod training as Spark jobs, including Databricks-specific integration patterns and parameters for distributed execution. | | [Distributed fine-tune Olmo3 7B with Axolotl](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-olmo3-7b-lora-axolotl) | integrations | 0.70 | Shows Axolotl-based QLoRA fine-tuning with Databricks serverless GPU and MLflow/Unity Catalog integration, including concrete configuration and API usage unique to this environment. | -| [Distributed training](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/distributed-training) | integrations | 0.70 | Describes a specific Serverless GPU Python API for distributed training, which will include function signatures, parameters, and constraints unique to this product—matching integration/coding pattern criteria rather than generic ML guidance. | | [Distributed training of XGBoost models using sparkdl.xgboost](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/sparkdl-xgboost) | integrations | 0.70 | Covers sparkdl.xgboost estimators, limitations, and migration guidance; includes product-specific integration details and constraints. | | [Distributed training of XGBoost models using xgboost.spark](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/xgboost-spark) | integrations | 0.70 | Describes new xgboost.spark estimators and their use in SparkML pipelines, which are specific integration APIs and patterns. 
| | [Distributed training with H100 GPUs](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-api-h100-starter) | integrations | 0.70 | Notebook demonstrates connecting to H100 GPUs and using the serverless_gpu Python library, including decorators and runtime utilities that are specific integration APIs. | @@ -1688,59 +1710,59 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Drop table features](https://learn.microsoft.com/en-us/azure/databricks/delta/drop-feature) | configuration | 0.70 | Explains how to use DROP FEATURE and protocol downgrades; likely includes specific feature names, protocol version constraints, and commands, which are detailed configuration and compatibility settings unique to Delta Lake on Databricks. | | [Embed dashboards](https://learn.microsoft.com/en-us/azure/databricks/dashboards/share/embedding/) | integrations | 0.70 | Embedding dashboards requires specific embed URLs, parameters, and integration patterns unique to Databricks. | | [Enable a workspace for Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/enable-workspaces) | configuration | 0.70 | Describes how to enable an existing workspace for Unity Catalog by assigning a metastore, including product-specific configuration steps and conditions (e.g., automatic enablement after a certain date). This is concrete configuration guidance unique to Azure Databricks and Unity Catalog. | -| [Enable admin protection for "No Isolation Shared" clusters on your account](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/no-isolation-shared) | security | 0.70 | Describes a specific account-level security setting that controls automatic internal credential generation for workspace admins on no isolation shared clusters, including product-specific behavior and constraints that are not generic knowledge. 
| +| [Enable verbose logs](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/verbose-logs) | configuration | 0.70 | Documents the exact workspace configuration keys and parameters used to toggle verbose audit logs—concrete configuration settings. | | [Enable web terminal](https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/web-terminal) | configuration | 0.70 | Describes enabling web terminal via admin settings and propagation behavior; involves specific workspace/cluster configuration toggles. | -| [Enable your custom URL](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/custom-url) | configuration | 0.70 | Custom URL setup involves specific configuration fields, DNS/host requirements, and behavior across workspaces. These are concrete configuration parameters and behaviors unique to Databricks accounts. | +| [Enable your custom URL](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/custom-url) | configuration | 0.70 | Covers enabling a custom branded URL for the account; this typically involves specific configuration steps and parameters unique to Azure Databricks. | | [Encrypt queries](https://learn.microsoft.com/en-us/azure/databricks/security/keys/sql-encryption) | security | 0.70 | Describes how to encrypt queries, query history, and results for Databricks SQL with Premium plan constraints and object-type-specific behavior, which is product-specific security configuration. | -| [End of life for Standard tier workspaces on Azure Databricks](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/standard-tier) | decision-making | 0.70 | Contains concrete deprecation and upgrade deadlines (specific dates) and guidance on upgrading tiers; this is decision-focused information (when to move tiers) that changes over time and is not inferable from training alone. 
| | [Error handling](https://learn.microsoft.com/en-us/azure/databricks/ingestion/zerobus-errors) | troubleshooting | 0.70 | Error handling page will map Zerobus-specific error codes/messages to causes and recovery steps, which is product-specific troubleshooting knowledge. | | [Evaluate prompts](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/prompt-version-mgmt/prompt-registry/evaluate-prompts) | best-practices | 0.70 | Shows systematic evaluation of prompt versions with MLflow’s evaluation framework; includes product-specific code patterns and workflows for building datasets and comparing metrics. | +| [Example notebooks](https://learn.microsoft.com/en-us/azure/databricks/vector-search/vs-example-notebooks) | integrations | 0.70 | Example notebooks demonstrate concrete usage of the Vector Search Python SDK with product-specific API calls and patterns, which are integration/coding patterns. | | [Example queries](https://learn.microsoft.com/en-us/azure/databricks/compute/sql-warehouse/monitor/queries) | best-practices | 0.70 | Page provides concrete, product-specific example SQL queries against Databricks system tables to monitor SQL warehouse performance, usage, and costs. These are actionable patterns (what to query, which columns, how to aggregate) that go beyond generic monitoring concepts and represent expert, service-specific best-practice guidance rather than generic SQL knowledge. | -| [Examples](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/patterns) | best-practices | 0.70 | Quick reference of common patterns with concrete examples for Auto Loader; likely includes recommended code/config patterns and gotchas specific to this product. 
| +| [Examples](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/patterns) | architecture-patterns | 0.70 | Quick-reference of concrete Auto Loader ingestion patterns (incremental loads, schema evolution, directory layouts) with product-specific options and code snippets that guide when and how to use each pattern; this is actionable pattern guidance beyond generic concepts. | | [Examples: Trace analysis](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/analyze-traces) | best-practices | 0.70 | Provides concrete patterns and recommended ways to analyze trace data (errors, performance, user activity) using the MLflow Trace SDK. These are actionable, product-specific analysis patterns and gotchas rather than generic observability concepts. | +| [Excel](https://learn.microsoft.com/en-us/azure/databricks/query/formats/excel) | integrations | 0.70 | Explains Databricks’ built-in Excel file format support for batch and streaming, including schema inference and ingestion from local and cloud storage; likely includes format-specific options and parameters, making it an integration/config pattern. | | [Experiment tracking and observability](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/tracking-observability) | best-practices | 0.70 | Covers MLflow usage, GPU health monitoring, logs, and checkpoint management with AI Runtime-specific recommendations and patterns, which are product-specific best practices. | | [Export endpoint health metrics to Prometheus and Datadog](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/metrics-export-serving-endpoint) | configuration | 0.70 | Explains using a metrics export API to send metrics to Prometheus and Datadog, which typically includes metric names, labels, endpoint paths, and configuration parameters unique to Databricks. 
-| [External MCP servers](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/external-mcp) | configuration | 0.70 | Page is about connecting to external MCP servers via Databricks-managed proxies and Unity Catalog connections. This typically includes concrete connection settings, auth modes (shared principal vs per-user), and configuration parameters specific to Databricks MCP integration, which are product-specific details not inferable from general knowledge. |
| [External tables](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-external-tables) | security | 0.70 | Defines storage credential, external location, and external table as Unity Catalog securable objects used to grant access to cloud storage. Involves product-specific security objects and privilege concepts, aligning with security configuration. |
-| [External tools](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/external-monitoring-tools) | integrations | 0.70 | Covers using pgAdmin, PgHero, etc. with Lakebase; likely includes connection parameters, required settings, and tool-specific configuration for this product, which fits integrations & coding patterns. |
| [FAQ](https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/kafka/faq) | troubleshooting | 0.70 | Kafka FAQ for Databricks typically addresses concrete issues, error messages, and behavior clarifications specific to this integration, aligning with troubleshooting-style expert knowledge. |
| [FAQ](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/faq) | troubleshooting | 0.70 | FAQ for Auto Loader; typically includes specific behaviors, limits, and resolutions to common issues, which are product-specific expert details. |
-| [Fan-in and fan-out architecture](https://learn.microsoft.com/en-us/azure/databricks/data-engineering/fan-in-fan-out) | architecture-patterns | 0.70 | Describes how to realize fan-in/fan-out specifically in Lakeflow Spark Declarative Pipelines, including Databricks-specific constructs and wiring patterns, making it a product-specific architecture pattern guide. |
+| [Fan-in and fan-out architecture](https://learn.microsoft.com/en-us/azure/databricks/data-engineering/fan-in-fan-out) | architecture-patterns | 0.70 | Describes concrete fan-in/fan-out implementation patterns specifically for Lakeflow Spark Declarative Pipelines, including how to structure and orchestrate stages. This is product-specific architecture guidance beyond generic fan-in/fan-out concepts. |
| [Feature Store and Model Serving](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/feature-store) | integrations | 0.70 | Describing Lakebase Autoscaling as the backend for Online Feature Stores typically includes specific configuration fields, API calls, and constraints for online stores that are unique integration details. |
| [Feature store integration](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl/feature-store-integration) | integrations | 0.70 | Describes integration of Feature Store tables into AutoML, likely including specific configuration steps and parameters for linking feature tables. |
-| [Features with limited regional availability](https://learn.microsoft.com/en-us/azure/databricks/resources/feature-region-support) | deployment | 0.70 | Contains region-by-region feature support tables, which are effectively a platform/region support matrix; this is deployment-relevant expert knowledge about where specific capabilities are available. |
+| [Features with limited regional availability](https://learn.microsoft.com/en-us/azure/databricks/resources/feature-region-support) | deployment | 0.70 | Lists which Azure Databricks features are available in which regions using tables of feature-by-region support. This is concrete, product-specific availability data that changes over time and is not inferable from general training. It directly affects deployment and rollout decisions by region, fitting best under deployment. |
| [Federate external HMS](https://learn.microsoft.com/en-us/azure/databricks/query-federation/hms-federation-external) | configuration | 0.70 | Shows how to federate an external Hive metastore with Unity Catalog, including connector configuration, permissions, and object mappings; these are concrete configuration parameters and steps. |
| [Federate legacy workspace HMS](https://learn.microsoft.com/en-us/azure/databricks/query-federation/hms-federation-internal) | configuration | 0.70 | Step-by-step configuration for federating an internal workspace Hive metastore, likely including specific setting names, commands, and required Unity Catalog objects; this is detailed configuration knowledge. |
+| [File metadata column](https://learn.microsoft.com/en-us/azure/databricks/ingestion/file-metadata-column) | configuration | 0.70 | Explains the hidden _metadata column semantics, available fields, and how to select it in queries, including product-specific behavior when a user column has the same name; these are detailed configuration/usage semantics unique to Databricks. |
| [FileStore](https://learn.microsoft.com/en-us/azure/databricks/archive/legacy/filestore) | configuration | 0.70 | Explains special FileStore paths, behavior for libraries and feature-generated files, and runtime-specific support details unique to Databricks. |
| [Files on Databricks overview](https://learn.microsoft.com/en-us/azure/databricks/files/) | configuration | 0.70 | Covers Databricks-specific file interaction APIs and constraints (e.g., unsupported paths for JVM processes), which are configuration and behavior details. |
| [Fine tune a Foundation Model](https://learn.microsoft.com/en-us/azure/databricks/large-language-models/foundation-model-training/fine-tune-run-tutorial) | configuration | 0.70 | Covers creating and configuring fine-tuning runs via Mosaic AI Model Training APIs and UI; likely includes run configuration parameters, options, and deployment settings specific to this service. |
-| [Fine-grained access control](https://learn.microsoft.com/en-us/azure/databricks/compute/single-user-fgac) | security | 0.70 | Explains how serverless and dedicated compute enforce row filters, column masks, and dynamic views; this is product-specific access control behavior. |
+| [Fine-grained access control](https://learn.microsoft.com/en-us/azure/databricks/compute/single-user-fgac) | security | 0.70 | Explains how fine-grained access controls (row filters, column masks, dynamic views) are enforced on dedicated compute by routing data filtering through serverless compute; this is Databricks-specific security/authorization behavior. |
| [Fine-tune GPT-OSS 120B](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-gpt-oss-120b-ddp-fsdp) | integrations | 0.70 | Describes supervised fine-tuning of a 120B-parameter GPT-OSS model on 8–16 H100 GPUs using Databricks Serverless GPU with DDP and FSDP. This implies detailed, product-specific code and configuration (e.g., DDP/FSDP settings, GPU count parameters, Databricks runtime specifics) that qualify as expert integration patterns rather than generic ML guidance. |
| [Fine-tune GPT-OSS 20B](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-distributed-gpt-oss-20b) | integrations | 0.70 | Notebook for fine-tuning a specific OpenAI model with distributed training on Databricks, including concrete integration code and configuration patterns. |
| [Fine-tune Llama 3.1 8B with LLM Foundry](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-llama3-8b-llmfoundry) | integrations | 0.70 | Notebook-style tutorial for Databricks Serverless GPU and Mosaic LLM Foundry will include concrete, product-specific code patterns, configuration parameters, and integration details (e.g., model IDs, training arguments, cluster/runtime specifics) that go beyond generic LLM fine-tuning knowledge. |
-| [Fine-tune Llama 3.2 1B with LoRA](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-sft-trl-deepspeed-llama-1b) | integrations | 0.70 | Notebook for SFT with LoRA and DeepSpeed ZeRO Stage 3 on Databricks Serverless GPU will contain specific DeepSpeed config options, TRL usage patterns, and Databricks GPU setup details that are concrete integration and coding patterns. |
| [Flag certified and deprecated data](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/certify-deprecate-data) | configuration | 0.70 | Describes how to apply specific system tags to Unity Catalog securable objects, including tag names and usage semantics, which are product-specific configuration operations. |
| [Flexible node types overview](https://learn.microsoft.com/en-us/azure/databricks/compute/flexible-node-types) | best-practices | 0.70 | Describes behavior of flexible node types, fallback logic, and stockout handling; these are product-specific operational recommendations and patterns. |
| [Foreign tables](https://learn.microsoft.com/en-us/azure/databricks/tables/foreign) | configuration | 0.70 | Details methods for registering foreign tables and behavior differences (e.g., federated Hive metastore metadata behavior), which are product-specific configuration/behavior details. |
| [Foundation Model APIs overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/) | limits-quotas | 0.70 | Overview that explicitly includes requirements and limitations; likely lists numeric limits (context length, rate limits, region support) and model availability tables. |
| [Foundry](https://learn.microsoft.com/en-us/azure/databricks/integrations/microsoft-foundry) | integrations | 0.70 | Explains how to register Genie as an MCP tool in Foundry; likely includes specific endpoint URLs, tool configuration fields, and Databricks-specific parameters. |
-| [Function calling on Databricks](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/function-calling) | integrations | 0.70 | Covers what function calling is, when to use it, and how to implement it with Databricks generative AI apps. Implementation guidance for Databricks’ OpenAI-compatible function calling will include request schema and parameter names unique to this product integration. |
+| [Function calling on Databricks](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/function-calling) | integrations | 0.70 | Function calling implementation will show concrete JSON schemas, parameter names, and endpoint usage specific to Databricks’ OpenAI-compatible API, which are integration/coding patterns. |
| [Function decorators](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/app-instrumentation/manual-tracing/function-decorator) | integrations | 0.70 | Documents the @mlflow.trace decorator usage, including function signatures and parameters unique to MLflow Tracing, which are product-specific coding patterns and API details. |
-| [GDPR Compliance](https://learn.microsoft.com/en-us/azure/databricks/security/privacy/gdpr-delta) | security | 0.70 | Provides concrete guidance for implementing right-to-be-forgotten workflows on Databricks/Delta, including product-specific patterns for permanent deletion. |
+| [GDPR Compliance](https://learn.microsoft.com/en-us/azure/databricks/security/privacy/gdpr-delta) | security | 0.70 | Product-specific guidance for GDPR/CCPA compliance on Delta, including how to structure and manage data for deletion within regulatory timeframes; this is specialized security/privacy configuration and data-handling guidance. |
| [GET](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-connector-get) | integrations | 0.70 | Describes Databricks SQL Connector–only GET statement with precise syntax and constraints (only via connector/driver, not UI). This is a product-specific integration/coding pattern for file transfer. |
| [Gateway event logs](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/gateway-event-logs) | configuration | 0.70 | Defines specific event log metrics and usage patterns for monitoring snapshot and CDC phases; product-specific logging schema and usage. |
-| [Generated columns](https://learn.microsoft.com/en-us/azure/databricks/delta/generated-columns) | configuration | 0.70 | Describes how to declare generated columns, their constraints, and behaviors (e.g., automatic computation, partitioning use). These are product-specific configuration semantics for table schemas. |
+| [Generated columns](https://learn.microsoft.com/en-us/azure/databricks/delta/generated-columns) | configuration | 0.70 | Details how to declare generated columns in Delta Lake, including syntax and behavior when values are omitted, and use for partitioning. These are specific configuration/DDL patterns unique to Delta Lake on Databricks. |
| [Germany C5](https://learn.microsoft.com/en-us/azure/databricks/security/privacy/c5) | security | 0.70 | Maps the C5 compliance framework to concrete Azure Databricks workspace controls and behaviors, which is product- and standard-specific security/compliance configuration knowledge. |
| [Get UDF context information](https://learn.microsoft.com/en-us/azure/databricks/udf/udf-task-context) | integrations | 0.70 | Shows how to use the TaskContext PySpark API within Batch Unity Catalog Python UDFs and PySpark UDFs to get user identity, job IDs, and cluster tags. This is a product-specific coding pattern for integrating UDF logic with Databricks execution context. |
| [Git folders concepts](https://learn.microsoft.com/en-us/azure/databricks/repos/git-folders-concepts) | configuration | 0.70 | Details supported Git providers and Git folders capabilities; these are Databricks-specific integration and behavior details. |
+| [GitHub Actions](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/provider-github) | integrations | 0.70 | Page describes enabling OAuth/OIDC workload identity federation specifically for GitHub Actions CI/CD flows. This is an integration pattern between Azure Databricks and GitHub Actions and likely includes GitHub workflow configuration (OIDC provider, audience, subject, environment variables) and Databricks-side settings, which are concrete, product-specific integration parameters. |
| [GitHub Actions](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/github) | deployment | 0.70 | Describes Databricks-authored GitHub Actions with examples for CI/CD. These actions have specific inputs/outputs and constraints, representing deployment-focused integration patterns unique to Databricks. |
| [Global init scripts (legacy)](https://learn.microsoft.com/en-us/azure/databricks/archive/init-scripts/legacy-global) | configuration | 0.70 | Describes legacy global init script behavior (run on every cluster, cannot reference environment variables, failure handling) and migration specifics, which are product-specific configuration details. |
-| [Google Drive](https://learn.microsoft.com/en-us/azure/databricks/ingestion/google-drive) | integrations | 0.70 | Connector article for Google Drive ingestion; likely includes Databricks-specific options (e.g., connector names, parameters, and usage patterns for read_files, spark.read, COPY INTO, Auto Loader) that are configuration- and API-specific rather than generic tutorial content. |
| [Google Sheets](https://learn.microsoft.com/en-us/azure/databricks/integrations/google-sheets/) | integrations | 0.70 | Connector overview for Google Sheets and Databricks; connector docs generally include specific connection options and parameters (workspace URL, auth method, warehouse selection) that are product-specific integration details. |
| [Govern and audit](https://learn.microsoft.com/en-us/azure/databricks/dashboards/monitor-usage) | configuration | 0.70 | Provides concrete sample SQL against system tables and references specific audit log schemas and fields, which are expert configuration/usage details. |
| [Grant SAP BDC recipients access to shares](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/sap-bdc/share-to-sap) | integrations | 0.70 | How-to for setting up SAP Business Data Cloud as a Delta Sharing recipient. Likely includes product-specific connection parameters, configuration steps, and required settings unique to the Databricks–SAP BDC integration, which go beyond generic knowledge. |
| [Grant permissions programmatically](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/grant-permissions-programmatically) | security | 0.70 | Covers managing Lakebase project permissions via Databricks Permissions API, CLI, SDKs, and Terraform. This implies product-specific permission scopes/objects and API/CLI parameters that go beyond generic RBAC concepts, fitting the security category. |
| [Grant project and database access to a new user](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/grant-user-access-tutorial) | security | 0.70 | Walkthrough of assigning project permissions and Postgres roles; includes concrete role names and permission steps specific to Lakebase security. |
+| [H3 geospatial functions](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-h3-geospatial-functions) | integrations | 0.70 | Page is a detailed SQL reference for H3 geospatial functions in Databricks SQL/Runtime, with product-specific function names, signatures, argument types, and behavior that go beyond generic geospatial knowledge. This is expert, code-level integration knowledge for using H3 within Databricks SQL, fitting the integrations & coding patterns category. |
| [Handle long-running tasks](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/multi-agent-supervisor-long-running-tasks) | limits-quotas | 0.70 | Explicitly addresses long-running tasks and request timeouts; such pages typically document timeout limits and continuation behavior (e.g., max duration per request) that qualify as expert limits/quotas knowledge. |
| [Heatmap visualization](https://learn.microsoft.com/en-us/azure/databricks/visualizations/heatmap) | configuration | 0.70 | Details configuration options for heatmap charts; these are specific to Databricks visualization behavior. |
| [Hex](https://learn.microsoft.com/en-us/azure/databricks/partners/bi/hex) | integrations | 0.70 | Partner integration with Hex including Databricks SQL–specific connection configuration and constraints (no cluster integration). |
@@ -1754,9 +1776,10 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [How to add comments to SQL statements](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-comment) | integrations | 0.70 | Details supported SQL comment syntaxes and special treatment as hints, which is product-specific language behavior. |
| [Hyperopt best practices and troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl-hyperparam-tuning/hyperopt-best-practices) | best-practices | 0.70 | Explicitly focuses on best practices and troubleshooting for Hyperopt on Databricks, likely including gotchas and product-specific recommendations. |
| [Hyperparameter tuning with Optuna](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl-hyperparam-tuning/optuna) | integrations | 0.70 | Describes MLflow 3.0 integration with Optuna and horizontal scaling; this will include concrete code patterns, MLflow/Optuna configuration, and Databricks-specific integration details that qualify as expert integration knowledge. |
-| [INSERT INTO](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-dml-insert-into) | integrations | 0.70 | Product-specific DML syntax and constraints (for example, unsupported Hive Avro timestamp-millis) represent detailed behavioral knowledge beyond generic SQL. |
+| [INSERT INTO](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-dml-insert-into) | integrations | 0.70 | The page documents the exact, product-specific behavior and constraints of the Databricks SQL INSERT statement (including unsupported cases like Hive Avro with timestamp-millis and Databricks-specific syntax/semantics). This is detailed reference behavior for a specific platform’s SQL dialect, which qualifies as expert knowledge. It is not about limits/quotas, troubleshooting, or configuration, but about precise API/SQL usage patterns, so it best fits the integrations & coding patterns category. |
+| [IP functions](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-ip-functions) | integrations | 0.70 | Page documents Databricks Runtime IP functions with specific function names, accepted types (STRING/BINARY), IPv4/IPv6 and CIDR handling, and beta/preview behavior. These are product-specific SQL APIs and patterns for handling IP data, which qualifies as expert integration/coding reference rather than generic concepts. |
| [ISMAP](https://learn.microsoft.com/en-us/azure/databricks/security/privacy/ismap) | security | 0.70 | Compliance-control pages typically list concrete workspace settings, flags, and configuration steps required to meet ISMAP, which are product-specific security configurations not inferable from general training data. |
-| [Iceberg v3](https://learn.microsoft.com/en-us/azure/databricks/iceberg/iceberg-v3) | configuration | 0.70 | Describes how to use Iceberg v3 features with Unity Catalog, including table-level feature requirements (row lineage) and likely specific configuration/DDL options unique to Databricks’ Iceberg v3 implementation. |
+| [Iceberg client access](https://learn.microsoft.com/en-us/azure/databricks/external-access/iceberg) | integrations | 0.70 | Page describes configuring Apache Iceberg REST catalog clients (Spark, Flink, Trino) to read/write Unity Catalog tables, including product-specific REST endpoints, auth/config parameters, and client-side settings. This is concrete integration configuration rather than a conceptual overview. |
| [Identify unused endpoints](https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-unused-endpoints) | troubleshooting | 0.70 | Shows how to query audit log system tables to detect endpoints with no query traffic; includes product-specific diagnostic queries and symptom→action guidance. |
| [Identifying an expensive read](https://learn.microsoft.com/en-us/azure/databricks/optimizations/spark-ui-guide/spark-dag-expensive-read) | troubleshooting | 0.70 | Teaches how to use the Spark DAG view to find which read is most expensive, with Databricks-specific UI usage and interpretation, mapping performance symptom to root cause, which is troubleshooting guidance. |
| [If/else](https://learn.microsoft.com/en-us/azure/databricks/jobs/if-else) | configuration | 0.70 | Details how to configure If/else tasks, including operands and parameter references, which are Databricks-specific configuration patterns. |
@@ -1764,14 +1787,15 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| | [Improve RAG chain quality](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/tutorials/ai-cookbook/quality-rag-chain) | best-practices | 0.70 | Focuses on improving quality via RAG chain components; likely includes concrete recommendations and patterns for Databricks RAG chains, which are best-practices. | | [Improve gen AI app quality](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/tutorials/ai-cookbook/quality-overview) | best-practices | 0.70 | Explicitly about 'knobs' to tune in pipeline and RAG chain; this is actionable, product-specific tuning guidance rather than generic theory, fitting best-practices. | -| [Inbound Private Link for performance-intensive services](https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/service-direct-privatelink) | configuration | 0.70 | Private Link setup for performance‑intensive Databricks services will include specific configuration fields (DNS zones, endpoint types, required subnets, resource IDs) and possibly constraints around supported services. These are detailed, product‑specific configuration parameters beyond generic knowledge. | -| [Incremental refresh for materialized views](https://learn.microsoft.com/en-us/azure/databricks/optimizations/incremental-refresh) | decision-making | 0.70 | Covers semantics and requirements for incremental refresh, supported SQL operations/clauses, and recommendations for choosing between materialized views and streaming tables. This is product-specific decision guidance about when to use each approach and how to configure incremental refresh, matching decision-making with some best-practice aspects. 
| +| [Inbound Private Link for performance-intensive services](https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/service-direct-privatelink) | security | 0.70 | Covers configuring inbound Private Link specifically for performance-intensive Databricks services (e.g., Zerobus Ingest, Lakehouse Autoscaling) with product-specific endpoint and networking configuration steps. This is detailed security/network configuration, not just conceptual info. | +| [Inference tables](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/inference-tables-beta) | configuration | 0.70 | Inference tables for Unity AI Gateway endpoints will document specific table schemas, column names, and enablement settings, which are product-specific configuration details beyond generic monitoring concepts. | | [Ingest formula fields incrementally](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/salesforce-formula-fields) | best-practices | 0.70 | Explains how incremental ingestion of formula fields works, including limitations and error tracking; likely includes product-specific recommendations, gotchas, and configuration flags—best-practices for this connector. | | [Init scripts overview](https://learn.microsoft.com/en-us/azure/databricks/init-scripts/) | best-practices | 0.70 | Defines init scripts plus recommendations and configuration information for using them; these are Databricks-specific patterns and gotchas for environment setup. | | [Install Databricks Connect for Python](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/install) | integrations | 0.70 | Install guide will include concrete configuration steps, environment variables, and version compatibility details for the Python client, which are integration-specific. 
| | [Install Databricks Connect for Scala](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/scala/install) | integrations | 0.70 | Scala install article will specify client artifacts, versions, and configuration parameters for connecting Scala apps to Databricks, which are integration details. | | [Install libraries from a volume](https://learn.microsoft.com/en-us/azure/databricks/libraries/volume-libraries) | configuration | 0.70 | Explains using volumes and requirements.txt for library installation; relies on Unity Catalog volume configuration and compatibility details. | | [Install libraries from workspace files](https://learn.microsoft.com/en-us/azure/databricks/libraries/workspace-files-libraries) | configuration | 0.70 | Describes storing libraries as workspace files with runtime version requirements; this is product-specific configuration behavior and constraints. | +| [Instance type compatibility reference](https://learn.microsoft.com/en-us/azure/databricks/compute/flexible-node-type-instances) | decision-making | 0.70 | Provides a reference list of instance type compatibility groups for flexible node types; this is detailed, product-specific guidance for choosing fallback instances, which supports deployment/selection decisions. | | [Instructor](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/integrations/instructor) | integrations | 0.70 | Explains tracing Instructor by enabling autolog on underlying LLM providers; includes integration patterns and behavior for structured outputs. | | [Introduction to Databricks sign-on from partner solutions](https://learn.microsoft.com/en-us/azure/databricks/integrations/configuration) | security | 0.70 | Describes OAuth application integration behavior and configuration for partner SSO into Databricks. 
| | [Jobs event timeline](https://learn.microsoft.com/en-us/azure/databricks/optimizations/spark-ui-guide/jobs-timeline) | troubleshooting | 0.70 | Page is a Spark UI guide focused on using the jobs timeline to debug pipelines and queries, with Databricks-specific interpretation of UI elements and workflow for diagnosing failures and performance issues (symptom → investigation steps), which is product-specific troubleshooting knowledge. | @@ -1795,15 +1819,11 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Limitations](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/jira-limits) | limits-quotas | 0.70 | A 'limitations' page for a specific connector is likely to enumerate concrete ingestion constraints (for example, unsupported operations, maximum sizes, or rate-related limits) that are product-specific and not generally known; treated as limits-quotas. | | [Limitations](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/meta-ads-limits) | limits-quotas | 0.70 | A 'limitations and considerations' page for a specific connector typically lists concrete constraints (for example, supported object counts, date ranges, or API limits) that qualify as limits-quotas. | | [Limitations](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/netsuite-limits) | limits-quotas | 0.70 | Limitations and considerations page; typically lists unsupported objects, rate limits, and other concrete constraints. | -| [Limitations](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-limits) | limits-quotas | 0.70 | Limitations/considerations page for a specific connector is likely to enumerate concrete constraints (unsupported features, size or performance limits) that qualify as expert knowledge. 
-| [Limitations](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-limits) | limits-quotas | 0.70 | Limitations/considerations page for SQL Server connector is expected to list concrete constraints and unsupported scenarios, which are expert, product-specific details. |
| [Limitations](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/limitations) | limits-quotas | 0.70 | A limitations page for real-time mode is likely to enumerate concrete constraints (for example, which sources/unions/mapPartitions are disallowed or restricted), which are product-specific behavioral limits not inferable from general knowledge. |
| [Link production traces to app versions](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/prompt-version-mgmt/version-tracking/link-production-traces-to-app-versions) | configuration | 0.70 | Describes how to configure deployments so traces include LoggedModel version info; product-specific configuration for version-aware monitoring. |
-| [Liquid clustering](https://learn.microsoft.com/en-us/azure/databricks/delta/clustering) | best-practices | 0.70 | Explains how to use liquid clustering instead of partitioning/ZORDER; likely includes Databricks-specific configuration syntax, recommended patterns, and trade-offs for clustering keys, which are concrete product-specific best practices. |
+| [Liquid clustering](https://learn.microsoft.com/en-us/azure/databricks/delta/clustering) | best-practices | 0.70 | Explains how and when to use liquid clustering instead of partitioning/ZORDER, including product-specific behavior (redefining clustering keys without rewrite, applicability to streaming tables and materialized views). These are Databricks-specific optimization patterns and gotchas. |
| [Load data for training](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/load-data/) | best-practices | 0.70 | Guidance on loading data specifically for ML/DL and distributed training; likely includes Databricks-specific recommendations (file formats, partitioning, connectors) beyond generic data loading. |
| [Load data using Petastorm](https://learn.microsoft.com/en-us/azure/databricks/archive/machine-learning/petastorm) | integrations | 0.70 | Describes product-specific code patterns and parameters for using Petastorm with Azure Databricks and Spark to load Parquet data into TensorFlow/PyTorch, which are concrete integration details beyond generic ML knowledge. |
-| [Load data using SDP](https://learn.microsoft.com/en-us/azure/databricks/ldp/load) | configuration | 0.70 | Explains how to load data from any Spark-supported source and define datasets; likely includes configuration of streaming tables and ingestion patterns specific to Lakeflow. |
-| [Load data using a Unity Catalog external location](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/add-data-external-locations) | configuration | 0.70 | Describes using the add data UI with Unity Catalog external locations; likely includes specific object types, required settings, and constraints for external locations. |
| [Load test example](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/configure-load-test) | best-practices | 0.70 | Provides a notebook example and detailed steps for setting up environment, authentication, and cluster configuration for load tests. These are concrete, Databricks-specific recommendations and configurations for performance testing, fitting best practices. |
| [Load test vector search endpoints](https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-endpoint-load-test) | best-practices | 0.70 | Provides detailed guidance and example code for load testing vector search endpoints; includes product-specific patterns and parameters for realistic performance evaluation. |
| [Log and register agents](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/log-agent) | configuration | 0.70 | Explains logging agents as points-in-time of code and configuration; likely includes specific logging APIs, parameters, and schema details unique to Databricks. |
@@ -1817,7 +1837,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [MLflow 2: Input schema](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-evaluation/evaluation-schema) | configuration | 0.70 | Explains the input schema required by Agent Evaluation; schema definitions and required fields are product-specific configuration details. |
| [MLflow and Ray](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ray/ray-mlflow) | integrations | 0.70 | Explains how to wire Ray distributed workloads into MLflow tracking with Databricks-specific configuration and API usage; this is a concrete integration pattern between two products with code and parameter details that are not generic knowledge. |
| [MLflow experiment](https://learn.microsoft.com/en-us/azure/databricks/query/formats/mlflow-experiment) | integrations | 0.70 | Documents the MLflow experiment data source and how to load runs by notebook experiment, name, or ID, which is a Databricks-specific integration pattern. |
-| [Manage Databricks Previews](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/manage-previews) | configuration | 0.70 | Explains how to enable/disable Databricks preview features at account/workspace scope, involving specific preview management settings unique to the platform. |
| [Manage access request destinations](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/access-request-destinations) | security | 0.70 | Explains how to configure destinations for access requests; involves product-specific security workflow settings and permission flows. |
| [Manage access to DBFS browser](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/dbfs-browser) | configuration | 0.70 | Admin setting controlling DBFS visual browser; distinct from programmatic access, so configuration is product-specific. |
| [Manage access to file upload interface](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/disable-upload-data-ui) | configuration | 0.70 | Workspace setting that controls UI-based data uploads; product-specific configuration with clear effect on capabilities. |
@@ -1826,37 +1845,35 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [Manage catalogs](https://learn.microsoft.com/en-us/azure/databricks/catalogs/manage-catalog) | configuration | 0.70 | Page describes how to view, update, and delete catalogs using Catalog Explorer and SQL commands. This typically includes catalog-specific SQL statements, options, and parameters unique to Unity Catalog, fitting the configuration sub-skill type. |
| [Manage clean rooms](https://learn.microsoft.com/en-us/azure/databricks/clean-rooms/manage-clean-room) | configuration | 0.70 | Explains how collaborators update, monitor, and delete clean rooms; contains product-specific management operations and behaviors. |
| [Manage dashboards with Workspace APIs](https://learn.microsoft.com/en-us/azure/databricks/dashboards/tutorials/workspace-dashboard-api) | integrations | 0.70 | Provides concrete REST API request/response examples and property usage for Workspace and Lakeview APIs, including specific fields and patterns for managing dashboards, which is product-specific integration knowledge. |
-| [Manage database permissions](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-roles-permissions) | security | 0.70 | Explains granting database permissions to Postgres roles in Lakebase projects and mentions creating OAuth roles via a specific Databricks auth resource. This is product-specific RBAC/permissions configuration. |
| [Manage dependencies](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/dependencies) | configuration | 0.70 | The page describes how Databricks Apps handle Python/Node.js dependencies via language-specific files and overriding pre-installed libraries. This is product-specific configuration behavior (how dependencies are resolved and managed in Databricks Apps), which goes beyond generic package management knowledge. It likely includes specific file names, resolution order, and constraints, fitting the configuration category. |
+| [Manage entitlements](https://learn.microsoft.com/en-us/azure/databricks/security/auth/entitlements) | security | 0.70 | Covers specific entitlements and how they’re assigned to users, service principals, and groups—product-specific access control configuration. |
| [Manage foreign catalogs](https://learn.microsoft.com/en-us/azure/databricks/query-federation/foreign-catalogs) | configuration | 0.70 | Describes how to manage foreign catalogs (DDL, privileges, usage patterns) with specific commands and behaviors unique to Unity Catalog foreign catalogs, which are configuration/usage details not covered by generic knowledge.
| -| [Manage metastores](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-metastore) | configuration | 0.70 | Shows how to update, delete, and manage metastore behavior, which involves specific configuration options and admin controls not generally known. | +| [Manage groups](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/manage-groups) | security | 0.70 | Management of groups for access to workspaces, data, and other securables is concrete IAM/security configuration. | +| [Manage metastores](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-metastore) | configuration | 0.70 | A management page for Unity Catalog metastores typically includes product-specific settings (e.g., default catalog, region behavior, assignment rules) and concrete configuration options unique to Databricks Unity Catalog, which qualify as expert configuration knowledge rather than just conceptual overview. | | [Manage metric views](https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/manage) | security | 0.70 | Describes controlling access, permissions (e.g., SELECT), ownership, and lifecycle management; this is product-specific security and governance configuration for Unity Catalog metric views. | | [Manage model lifecycle in Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/manage-model-lifecycle/) | configuration | 0.70 | Describes lifecycle management of models in Unity Catalog with hosted MLflow registry; includes product-specific governance, access control, and workflow patterns. | | [Manage notebook format](https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-format) | configuration | 0.70 | Describes default IPYNB format, switching to source format, and managing output commits in source-controlled folders, which are specific configuration behaviors. 
| -| [Manage notification destinations](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/notification-destinations) | configuration | 0.70 | Describes how admins configure notification destinations and webhooks; such pages typically include concrete setting names, payload formats, and configuration parameters for system notifications, which are product-specific configuration details. | | [Manage permissions](https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/manage-privileges) | security | 0.70 | Describes when and how to grant access via the Databricks UI; likely includes specific roles/permissions and their effects, which are product-specific security details. | | [Manage permissions on dashboards](https://learn.microsoft.com/en-us/azure/databricks/dashboards/tutorials/manage-permissions) | security | 0.70 | Workspace permissions API usage with specific permission properties and request/response patterns is product-specific security configuration beyond generic knowledge. | | [Manage resources with Terraform](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/terraform/workspace-management) | integrations | 0.70 | Shows concrete Terraform configuration blocks and variables (spark version, node type, current user) for Databricks workspace resources. This is product-specific integration/config detail, not just conceptual guidance. | | [Manage schemas](https://learn.microsoft.com/en-us/azure/databricks/schemas/manage-schema) | security | 0.70 | Operational page with concrete SQL/CLI patterns and permission requirements for managing schemas in Unity Catalog vs legacy Hive metastore. Contains product-specific behavior differences and required privileges, which are expert, implementation-level details rather than conceptual overview. 
| -| [Manage serverless base environments](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/base-environment) | configuration | 0.70 | Admin-focused page on creating and managing serverless base environments is likely to enumerate specific workspace settings, parameter names, and allowed values for serverless notebooks and jobs, which are product-specific configuration details rather than generic concepts. | | [Manage service credentials](https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-services/manage-service-credentials) | security | 0.70 | Covers listing, updating, granting permissions on, and deleting service credentials; product-specific security administration patterns. | -| [Manage service principals](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/manage-service-principals) | security | 0.70 | Describes concrete management operations for Databricks service principals at account and workspace scope, including permissions and access control patterns that are specific to the product. | -| [Manage the Personal Compute cluster policy](https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/personal-compute) | configuration | 0.70 | Explains how the Personal Compute default policy works and how to enable/manage it, including constraints on compute creation; these are specific configuration behaviors for this product feature. | -| [Managed MCP servers](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/managed-mcp) | integrations | 0.70 | Describes managed MCP servers that connect agents to Unity Catalog, Vector Search, Genie spaces, and custom functions, with governance via AI Gateway. This is a product-specific integration pattern between agents and Databricks data/tools. 
| | [Managed versus external](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/managed-versus-external) | decision-making | 0.70 | The page focuses on when to use managed versus external tables and volumes in Unity Catalog and describes trade-offs around storage location, lifecycle, and governance. This is product-specific decision guidance about selecting between options rather than a generic concept, fitting the decision-making sub-skill. No numeric limits or config tables are indicated, so other categories are less appropriate. | | [Manual refresh token](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-source-setup-refresh-token) | security | 0.70 | Manual token refresh authentication involves specific token lifetimes, refresh endpoints, and secrets; these are product-specific security configuration details. | | [Many small jobs](https://learn.microsoft.com/en-us/azure/databricks/optimizations/spark-ui-guide/small-spark-jobs) | best-practices | 0.70 | Provides concrete recommendations for handling many small jobs (e.g., data size <10GB, parallelizing operations, using specific Databricks features like Lakeflow Spark Declarative Pipelines), which are product-specific performance best practices rather than generic theory. | | [Map visualizations](https://learn.microsoft.com/en-us/azure/databricks/visualizations/maps) | configuration | 0.70 | Describes Databricks-specific configuration options for map visualizations, including required geographic data fields. | +| [Materialize declarative features](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/materialized-features) | configuration | 0.70 | Describes how Databricks creates and manages Lakeflow Spark Declarative Pipelines and how to configure materialization of features into Unity Catalog tables, involving product-specific settings and behaviors. 
| | [Memory issues](https://learn.microsoft.com/en-us/azure/databricks/optimizations/spark-ui-guide/spark-memory-issues) | troubleshooting | 0.70 | Focused on debugging Spark memory problems with Databricks-specific patterns and likely includes concrete symptoms and remediation steps. | | [Migrate Parquet to Delta Lake](https://learn.microsoft.com/en-us/azure/databricks/migration/parquet-to-delta-lake) | decision-making | 0.70 | Describes considerations and four recommended migration paths, helping decide how to convert a Parquet lake to Delta Lake. | | [Migrate Spark code](https://learn.microsoft.com/en-us/azure/databricks/migration/spark) | best-practices | 0.70 | Migration guide with Databricks-specific code and configuration changes (e.g., Spark config, cluster behavior, feature differences) that are unique to Azure Databricks and not just generic Spark concepts. | | [Migrate data warehouse to lakehouse](https://learn.microsoft.com/en-us/azure/databricks/migration/warehouse-to-lakehouse) | decision-making | 0.70 | Covers considerations, caveats, and when/how to replace a warehouse with a lakehouse, guiding migration decisions. | | [Migrate existing resources](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/migrate-resources) | configuration | 0.70 | Describes using `databricks bundle generate` and UI-exported configuration to bind existing jobs/pipelines/apps; involves concrete configuration structures and fields. | -| [Migrate from Hive metastore to Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/ldp/clone-hms-to-uc) | integrations | 0.70 | Documents a specific Databricks REST API request (clone a pipeline) with concrete usage patterns and examples, including how to transform an HMS-based pipeline into a Unity Catalog pipeline. This is product-specific API/integration behavior beyond generic knowledge. 
| +| [Migrate from Hive metastore to Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/ldp/clone-hms-to-uc) | integrations | 0.70 | Documents a specific Databricks REST API request (clone a pipeline) with concrete usage patterns and examples, including request/response structure and Python usage; this is product-specific integration/API knowledge beyond generic LLM training. | | [Migrate from classic to serverless](https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/migration) | decision-making | 0.70 | The article focuses on migrating from classic to serverless compute, including which workloads can or cannot migrate and feature gaps. This is product-specific decision guidance about when to use serverless vs. classic and how to phase migration, fitting the decision-making category. | | [Migrate from dbx to bundles](https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/dbx/dbx-migrate) | decision-making | 0.70 | Migration article comparing dbx and Declarative Automation Bundles with limitations and feature comparisons to guide migration decisions; this is product-specific decision guidance beyond generic knowledge. | | [Migrate from legacy dashboards](https://learn.microsoft.com/en-us/azure/databricks/archive/legacy/clone-legacy-to-aibi) | limits-quotas | 0.70 | Contains a concrete product limit: cloning is only supported for legacy dashboards with a maximum of 100 widgets, which is a specific numeric constraint not inferable from general knowledge. | | [Migrate from legacy versions](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/scala/migrate) | decision-making | 0.70 | Migration-focused article; migration guidance between runtime generations is product-specific decision-making and upgrade-path knowledge. 
| +| [Migrate from older experiments](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/migrate-uc-trace-table-prefix) | configuration | 0.70 | Covers migration from a schema-linked to a table-prefix Unity Catalog format for MLflow traces, including product-specific schema/format details and recommended configuration patterns that an LLM would not know from training. | | [Migrate from online tables](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/migrate-from-online-tables) | decision-making | 0.70 | Migration-focused page describing how to move from legacy/third-party online tables to Lakebase synced tables; includes recommended target options and migration paths. | | [Migrate legacy publishing pipelines](https://learn.microsoft.com/en-us/azure/databricks/ldp/migrate-to-dpm) | configuration | 0.70 | Describes how to enable the default publishing mode and migrate from the legacy LIVE virtual schema, affecting pipeline metadata and behavior. Likely includes specific configuration steps, settings, and schema behavior unique to Lakeflow Spark Declarative Pipelines, fitting configuration-focused expert knowledge. | | [Migrate models to Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/manage-model-lifecycle/migrate-to-uc) | decision-making | 0.70 | Migration guidance from Workspace Model Registry to Unity Catalog with recommendations and workflow steps; supports decision-making and migration planning. | @@ -1864,16 +1881,15 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Migrate to MLflow 3 from MLflow 2 and Agent Evaluation](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/agent-eval-migration) | decision-making | 0.70 | Migration guide between MLflow 2 Agent Evaluation and MLflow 3; likely includes feature comparisons, constraints (Managed vs OSS), and concrete guidance on when/how to move. 
| | [Migrate to Model Serving](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/migrate-model-serving) | decision-making | 0.70 | Migration article between legacy and new serving experiences usually includes timelines, feature differences, and guidance on when/how to switch, which is product-specific decision and migration guidance. | | [Migrate to UC-only workspaces](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/upgrade/uc-only-migration) | decision-making | 0.70 | Provides guidance for accounts with legacy workspaces on preparing for UC-only enforcement with specific dates and migration considerations, which is decision and migration-planning guidance. | -| [Migrate to serverless routing](https://learn.microsoft.com/en-us/azure/databricks/query-federation/http-migration) | decision-making | 0.70 | Describes a time-bound migration (with specific dates) from control plane to serverless routing, including conditions when you must migrate, options (Private Link vs IP allowlists), and consequences; this is concrete guidance for choosing and planning migration paths. | | [Migration guide](https://learn.microsoft.com/en-us/azure/databricks/integrations/odbc/migration) | decision-making | 0.70 | Migration guide between legacy Simba Spark ODBC and new Databricks ODBC driver; likely includes mapping of settings, compatibility notes, and coexistence guidance, which are product-specific decision and migration details. | | [Mitigation requirements for OpenAI models](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/open-ai-mitigation-requirements) | security | 0.70 | Describes mitigation requirements for high-risk OpenAI use cases on Databricks; these are product- and provider-specific security/compliance requirements. 
| | [Mode](https://learn.microsoft.com/en-us/azure/databricks/partners/bi/mode) | integrations | 0.70 | Integration guide for Mode with Databricks clusters/SQL warehouses including connection configuration details. | | [Model serving endpoints](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/model-serving) | configuration | 0.70 | Integrating model serving endpoints, including permission levels and best practices, implies specific configuration fields and possibly security-related settings unique to Databricks model serving. | | [MongoDB](https://learn.microsoft.com/en-us/azure/databricks/archive/connectors/mongodb) | integrations | 0.70 | Uses MongoDB Spark connector with specific connection URIs, options, and authentication parameters for Atlas integration. | | [Monitor alerts](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/data-profiling/monitor-alerts) | configuration | 0.70 | Shows how to build SQL queries on profile and drift metric tables and configure alert frequency and notifications; uses specific system tables and alert settings that are product-specific. | -| [Monitor and manage access to personal access tokens](https://learn.microsoft.com/en-us/azure/databricks/admin/access-control/tokens) | security | 0.70 | Details admin workflows and controls for listing, monitoring, and revoking PATs in Databricks, including product-specific token management behavior. | | [Monitor costs](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/monitor-costs) | best-practices | 0.70 | Shows how to use a specific system table (system.billing.usage) with example queries to analyze costs and attribute spending. This is product-specific, actionable cost-monitoring guidance. 
| -| [Monitor job costs](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/jobs-cost) | configuration | 0.70 | Provides concrete example queries and references to specific Lakeflow and billing system tables/fields for cost and performance monitoring. These table/column details and usage patterns are product-specific. | +| [Monitor job costs](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/jobs-cost) | configuration | 0.70 | Provides concrete example queries and explains which system tables/fields to use for job cost attribution—detailed product-specific usage of system tables. | +| [Monitor serverless costs](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/serverless-billing) | configuration | 0.70 | Shows which fields in system.billing.usage correspond to serverless compute and how to query them—specific configuration/field-level knowledge. | | [Monitor usage](https://learn.microsoft.com/en-us/azure/databricks/ai-bi/admin/audit) | troubleshooting | 0.70 | Provides sample queries against system audit tables to monitor Genie spaces; these are product-specific diagnostic patterns (symptom-to-query-to-detection) that go beyond generic logging concepts. | | [Multi-agent using Genie](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/multi-agent-genie) | architecture-patterns | 0.70 | Describes Genie agent systems and how to create a multi-agent system using Genie spaces; this is a Databricks-specific multi-agent architecture pattern. | | [NULL semantics](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-null-semantics) | integrations | 0.70 | Details Databricks-specific NULL handling across operators and expressions, including edge-case semantics. 
| @@ -1888,7 +1904,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Observability with Genie Code](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/model-serving-genie-code) | troubleshooting | 0.70 | Genie Code is described as a tool to diagnose issues and analyze performance for model serving endpoints, which implies mappings from symptoms to diagnostics and guidance. This is product-specific troubleshooting content beyond generic debugging advice. | | [Observation class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/observation) | integrations | 0.70 | Describes Observation class semantics (first action only, blocking get), which are detailed, product-specific API behaviors. | | [Ollama](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/integrations/ollama) | integrations | 0.70 | Integration page with product-specific autolog function name, OpenAI-compatible endpoint usage, and Databricks serverless caveat; contains concrete configuration/invocation patterns unique to this integration. | -| [OpenAI Responses API](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-openai-responses) | integrations | 0.70 | Page is about querying via the OpenAI Responses API into Databricks model serving. Likely includes request/response schema, API parameters, and provider-specific options that are product-specific integration details beyond generic LLM knowledge. | | [Optimization and monitoring](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/performance) | best-practices | 0.70 | Covers compute tuning, latency reduction techniques, and performance measurement for real-time mode. This is actionable, product-specific performance guidance (how to size, which settings to adjust, how to interpret metrics), fitting best-practices with expert details. 
| | [Optimize](https://learn.microsoft.com/en-us/azure/databricks/delta/optimize) | best-practices | 0.70 | Explains OPTIMIZE behavior, interaction with liquid clustering and partitioning, and recommends enabling predictive optimization—product-specific performance best practices. | | [Optimize join performance](https://learn.microsoft.com/en-us/azure/databricks/transform/optimize-joins) | best-practices | 0.70 | Contains Databricks-specific recommendations (Photon behavior, join type choices, cross-join cautions) tailored to the platform’s optimizer/runtime, which are concrete product-specific performance practices. | @@ -1899,11 +1914,11 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Overview](https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/) | security | 0.70 | Explains required storage connections and how Unity Catalog governs access; includes product-specific security and access configuration patterns. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/abac/) | security | 0.70 | Describes ABAC in Unity Catalog; includes product-specific policy constructs, tag usage, and evaluation semantics that define how security is enforced. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/work-tasks) | configuration | 0.70 | Describes development and lifecycle of bundles; typically includes concrete steps, commands, and configuration patterns for managing bundle resources. | -| [Overview](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/commands) | configuration | 0.70 | The page is a detailed reference of Databricks CLI commands for versions 0.205+ and includes product-specific command names, parameters, and usage patterns that go beyond generic CLI knowledge. 
This is configuration-oriented expert knowledge about how to control Azure Databricks via its CLI, which an LLM is unlikely to fully know from training. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/) | configuration | 0.70 | Python-specific Databricks Connect article typically includes concrete client configuration parameters (host, token, cluster ID, runtime version), environment variables, and version compatibility details that are product-specific and not just conceptual usage. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/configure-data-access) | security | 0.70 | Configuration-focused article describing concrete ways to configure secure access from Azure Databricks to ADLS containers. Likely includes storage-specific auth patterns, permission scopes, and configuration parameters unique to Databricks and ADLS rather than just conceptual security guidance. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/examples) | best-practices | 0.70 | Provides common patterns for COPY INTO usage, which are concrete SQL patterns and option combinations tailored to Databricks, aligning with best-practices. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/ingestion/data-migration/) | decision-making | 0.70 | Discusses when to use CLONE vs in-place conversion and Unity Catalog managed tables; this is migration and option-selection guidance with product-specific trade-offs. | +| [Overview](https://learn.microsoft.com/en-us/azure/databricks/jobs/compute) | best-practices | 0.70 | The page provides product-specific recommendations and limitations for configuring compute for Lakeflow Jobs, including guidance on serverless compute limitations and how to assign compute per task. 
This is actionable, Databricks-specific best-practices content rather than generic concepts, but the summary does not clearly show detailed numeric limits, so it fits best under best-practices rather than limits-quotas. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/jobs/configure-job) | configuration | 0.70 | Page is about creating and configuring Lakeflow Jobs via the Jobs & Pipelines UI. It typically includes job-level settings, task options, and possibly parameter names and allowed values. These are product-specific configuration details that qualify as expert knowledge under configuration. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/jobs/for-each) | configuration | 0.70 | Details how to add and configure For each and nested tasks in Lakeflow Jobs, including parameter passing semantics and task-type constraints. These are specific configuration mechanics for Databricks workflows. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/jobs/jar) | configuration | 0.70 | How-to page for configuring JAR tasks; likely includes Databricks-specific task settings, parameters, and options (for example, main class, libraries, cluster settings) that are product-specific configuration details rather than generic Java knowledge. | @@ -1913,16 +1928,15 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/databricks-runtime-ml) | configuration | 0.70 | Describes Databricks Runtime ML and how to create compute with it; likely includes runtime versions, supported features, and specific configuration options for ML workloads. 
| | [Overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/manage-serving-endpoints) | configuration | 0.70 | Management article for serving endpoints typically includes endpoint state options, REST API fields, and configuration parameters (scaling, traffic routing, versions) that are product-specific and not just conceptual. These are concrete settings rather than a high-level overview. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/monitor-diagnose-endpoints) | configuration | 0.70 | Summarizes monitoring tools and how they apply to model serving; tool availability and usage are product-specific observability configuration details. | -| [Overview](https://learn.microsoft.com/en-us/azure/databricks/semi-structured/variant) | best-practices | 0.70 | Describes how to query and transform VARIANT data, with recommendations to prefer VARIANT over JSON strings and migration considerations. This includes Databricks-specific behaviors and recommended patterns for using the VARIANT type. | -| [Overview of D2D sharing](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/share-data-databricks) | configuration | 0.70 | Covers how to share data via the Databricks-to-Databricks protocol; typically includes share/recipient configuration steps and parameters unique to this feature. | -| [Overview of open sharing](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/share-data-open) | configuration | 0.70 | Provider-focused overview of open sharing; typically includes how to configure shares and endpoints for non-Databricks tools, with product-specific parameters. 
| +| [Overview](https://learn.microsoft.com/en-us/azure/databricks/semi-structured/variant) | configuration | 0.70 | Details the Databricks VARIANT data type, its behavior versus JSON strings, supported operations, and runtime/version constraints; these are product-specific type semantics and configuration/usage details not captured by generic knowledge. | +| [Overview of data lineage](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-lineage) | configuration | 0.70 | Describes how to use Catalog Explorer and lineage system tables; such a page typically includes concrete UI options, lineage query patterns, and table/column names for system tables, which are product-specific configuration/usage details rather than generic concepts. | | [PARAMETERS](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/parameters) | configuration | 0.70 | PARAMETERS relation lists routine parameters; its schema and filtering are Databricks-specific metadata details. | | [PCI-DSS](https://learn.microsoft.com/en-us/azure/databricks/security/privacy/pci) | security | 0.70 | PCI DSS compliance pages generally document specific workspace security settings, modes, and constraints required for PCI, which are detailed, product-specific security configurations. | | [PROVIDERS](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/providers) | configuration | 0.70 | PROVIDERS relation defines how providers are represented and filtered, which is Databricks-specific metadata configuration. | | [PUT INTO](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-connector-put-into) | integrations | 0.70 | Documents Databricks SQL Connector–specific PUT INTO statement for moving local files into volumes, including availability constraints (connector/driver only). This is a concrete integration pattern. 
| | [Package code for Databricks Model Serving](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/prompt-version-mgmt/version-tracking/optionally-package-app-code-and-files-for-databricks-model-serving) | deployment | 0.70 | Explains packaging app code into LoggedModel for Model Serving and Agent Framework; includes deployment-specific packaging requirements and patterns. | | [Pandas overview](https://learn.microsoft.com/en-us/azure/databricks/pandas/) | decision-making | 0.70 | Compares pandas, pandas API on Spark, and conversions with PySpark/Arrow, giving guidance on when to use each approach for different scenarios—service-specific decision guidance. | -| [Parameter markers](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-parameter-marker) | configuration | 0.70 | Covers Databricks-specific behavior of named vs unnamed parameter markers, restrictions on mixing them, and their use in IDENTIFIER clauses. These are concrete, product-specific SQL configuration/usage rules. | +| [Parameter markers](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-parameter-marker) | integrations | 0.70 | Describes product-specific behavior of named and unnamed parameter markers in Databricks SQL, including constraints such as not mixing marker types in a single statement and usage in IDENTIFIER clauses. This is concrete, code-level usage guidance tied to how external callers (APIs/clients) pass parameters into SQL, fitting the integrations & coding patterns category. | | [Parameterization](https://learn.microsoft.com/en-us/azure/databricks/ldp/parameters) | configuration | 0.70 | Explains configuring parameters for pipelines; likely includes specific configuration keys, usage patterns, and allowed values unique to this product. 
| | [Partitioning tables](https://learn.microsoft.com/en-us/azure/databricks/tables/partitions) | architecture-patterns | 0.70 | Focused on when to use table partitioning vs liquid clustering, with Databricks-specific guidance and trade-offs for different workloads and table types (Delta vs Iceberg). This is a design/architecture decision pattern specific to Databricks table layout. | | [Phase 1: Account](https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/deployment-guide/account-setup) | security | 0.70 | A design guide for account administration and identity management typically includes specific Databricks account roles, Azure AD/SSO configuration details, and user provisioning patterns. These are product-specific security and identity configurations (RBAC roles, scopes, SSO setup) that qualify as security-focused expert knowledge rather than generic concepts. | @@ -1934,21 +1948,19 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Pipeline](https://learn.microsoft.com/en-us/azure/databricks/jobs/pipeline) | configuration | 0.70 | Explains scheduling triggered pipelines as job tasks; likely includes task configuration options and trigger behavior specific to Lakeflow pipelines. | | [Platform administration cheat sheet](https://learn.microsoft.com/en-us/azure/databricks/cheat-sheet/administration) | best-practices | 0.70 | Described as clear, opinionated guidance for admins on cost, observability, governance, and security; likely includes concrete product-specific recommendations and gotchas. | | [Platform release process](https://learn.microsoft.com/en-us/azure/databricks/resources/platform-release) | deployment | 0.70 | Describes platform release process and scheduled maintenance windows, which are product-specific deployment/maintenance timing details. 
| -| [Point-in-time feature joins](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/time-series) | best-practices | 0.70 | Focuses on preventing data leakage with time-series feature tables and timestamp keys; contains Databricks-specific guidance and patterns for building correct training datasets. | +| [Point-in-time feature joins](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/time-series) | best-practices | 0.70 | Provides specific guidance to avoid data leakage using time-series feature tables and timestamp key columns; this is actionable, product-specific best-practice for Databricks feature engineering. | | [Power BI](https://learn.microsoft.com/en-us/azure/databricks/jobs/powerbi) | integrations | 0.70 | Power BI task integrates Databricks jobs with Power BI service; likely includes connection/config parameters and constraints specific to this integration. | | [Power BI Desktop](https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi-desktop) | integrations | 0.70 | How-to page for connecting Power BI Desktop to Azure Databricks; such connector docs typically include product-specific connection parameters (server/HTTP path, authentication options, possibly gateway settings) that qualify as integration configuration details beyond generic knowledge. | | [Power BI service](https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi-service) | integrations | 0.70 | Describes publishing data from Databricks to the Power BI service; these docs usually include specific connector/endpoint settings and authentication parameters for the service integration, which are product-specific integration patterns. 
| | [Predictive table optimization](https://learn.microsoft.com/en-us/azure/databricks/optimizations/predictive-optimization) | best-practices | 0.70 | Includes rollout dates, default enablement behavior, and recommendations for enabling predictive optimization—product-specific operational best practices and behavior details. | | [Prepare data for distributed training](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/load-data/ddl-data) | configuration | 0.70 | Describes methods (Mosaic Streaming, TFRecords) and likely specific data layout/configuration requirements for distributed training. | | [Preset](https://learn.microsoft.com/en-us/azure/databricks/partners/bi/preset) | integrations | 0.70 | Shows how to connect Preset (Superset-based) to Databricks SQL warehouses and clusters with product-specific connection patterns. | -| [Pricing](https://learn.microsoft.com/en-us/azure/databricks/resources/pricing) | decision-making | 0.70 | Explains SKUs and DBU multipliers used for billing; this is quantified cost/consumption data that informs SKU and cost decisions, fitting decision-making. | +| [Pricing](https://learn.microsoft.com/en-us/azure/databricks/resources/pricing) | decision-making | 0.70 | Explains SKUs and DBU multipliers used for billing serverless offerings. This is pricing/tier-specific guidance with concrete multipliers that affect cost decisions, fitting decision-making around SKU selection. | | [Privilege requirements for source database users](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-privileges) | security | 0.70 | Describes required PostgreSQL privileges/roles for the replication user and cloud-specific notes (RDS/Aurora, Azure Flexible Server); this is product-specific IAM/privilege configuration. 
| | [Production planning](https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/deployment-guide/) | decision-making | 0.70 | Phase-based production planning guide focused on architectural decisions and patterns across account, governance, network, storage, and operations; helps choose approaches and structures for enterprise deployments. | -| [Projects](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-projects) | configuration | 0.70 | Explains how to create and configure Lakebase projects, including settings and lifecycle management for branches, computes, databases, and roles. This is product-specific configuration of top-level Lakebase resources. | | [Provisioned throughput run your own benchmarking](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/prov-throughput-run-benchmark) | best-practices | 0.70 | Provides a recommended notebook and explains how Databricks measures latency and throughput for provisioned throughput endpoints. Likely includes concrete metrics definitions, recommended configurations, and interpretation guidance—product-specific best practices for performance testing. | -| [Pub/Sub](https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/pub-sub) | integrations | 0.70 | Describes a built-in Pub/Sub connector; such pages typically include connector options, required parameters, and semantics (exactly-once, ordering) that are product-specific integration details. | | [Pulsar](https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/pulsar) | integrations | 0.70 | Pulsar streaming connector configuration is an integration pattern; will include connector options and parameters specific to Databricks’ Pulsar support. 
| -| [Purge workspace storage](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/storage) | configuration | 0.70 | Explains how to permanently purge deleted notebook cells, notebooks, experiments, and cluster logs, including Databricks-specific storage behavior and controls that go beyond generic deletion. | +| [Purge workspace storage](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/storage) | configuration | 0.70 | Details how to permanently purge deleted notebooks, cells, experiments, and logs; this is product-specific configuration/operations behavior. | | [Python UDTFs (user-defined table functions)](https://learn.microsoft.com/en-us/azure/databricks/udf/python-udtf) | configuration | 0.70 | Describes Python UDTFs in Databricks Runtime, including how they return tables and are invoked in SQL FROM clauses. This is Databricks-specific UDTF configuration and behavior. | | [Python scalar UDFs](https://learn.microsoft.com/en-us/azure/databricks/udf/python) | integrations | 0.70 | Contains concrete examples and caveats for Python UDF registration and evaluation order in Databricks Spark SQL, which are product-specific coding patterns. | | [Python script](https://learn.microsoft.com/en-us/azure/databricks/jobs/python-script) | configuration | 0.70 | Describes how to configure Python script tasks with Databricks-specific options (paths, parameters, environment, cluster/runtime settings). Contains concrete configuration fields and patterns beyond generic Python execution. 
| @@ -1960,21 +1972,20 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Query OpenAI models](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/tutorials/external-models-tutorial) | configuration | 0.70 | Step-by-step instructions for configuring external model endpoints via MLflow Deployments; likely includes endpoint parameter names, required fields, and specific configuration options unique to Databricks external models. | | [Query PostgreSQL with Databricks](https://learn.microsoft.com/en-us/azure/databricks/archive/connectors/postgresql) | integrations | 0.70 | Includes PostgreSQL-specific JDBC URL patterns, driver usage, and Databricks read/write examples that are concrete integration details. | | [Query SQL Server with Databricks](https://learn.microsoft.com/en-us/azure/databricks/archive/connectors/sql-server) | integrations | 0.70 | Contains SQL Server JDBC URL formats, authentication options, and Databricks-specific Spark configuration for this integration. | -| [Query Unity Catalog metric views](https://learn.microsoft.com/en-us/azure/databricks/partners/bi/bi-metric-view) | configuration | 0.70 | Covers enabling BI compatibility mode and its supported scenarios/limitations; such a feature page typically documents specific configuration flags or settings to turn the mode on/off and how it rewrites queries, which is product-specific configuration knowledge. | +| [Query Unity Catalog metric views](https://learn.microsoft.com/en-us/azure/databricks/partners/bi/bi-metric-view) | configuration | 0.70 | Describes a product-specific feature (BI compatibility mode) with concrete enablement steps and detailed behavior/limitations for querying Unity Catalog metric views from external BI tools. This is configuration-focused expert knowledge about how Azure Databricks rewrites BI tool queries and which scenarios are supported, which isn't generic BI or Databricks knowledge. 
| | [Query agents](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/query-agent) | integrations | 0.70 | Details specific request formats, headers, and client configuration (Databricks OpenAI client vs REST) for querying agents. These are product-specific API integration patterns. | | [Query caching](https://learn.microsoft.com/en-us/azure/databricks/sql/user/queries/query-caching) | best-practices | 0.70 | Explains different caching layers, how they work, and how to disable result caching; likely includes product-specific behaviors and recommendations for performance and cost. | | [Query chat models](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-chat-models) | integrations | 0.70 | How-to for constructing query requests to chat/general-purpose models via Databricks model serving endpoints. This typically includes request/response schema, parameter names, and API-specific fields unique to Databricks Foundation Model APIs, which fits integrations & coding patterns. | | [Query custom model serving endpoints](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/score-custom-model-endpoints) | integrations | 0.70 | Explains request formatting and how to call custom model endpoints; includes request/response patterns and parameters specific to Databricks Model Serving. | +| [Query data](https://learn.microsoft.com/en-us/azure/databricks/integrations/google-sheets/query-data) | integrations | 0.70 | Covers a specific integration (Databricks Connector for Google Sheets) with concrete behavior: selecting tables, writing SQL, parameters, pivot tables, and automatic saving/refresh of imports. This is product-specific integration knowledge about how the connector works and how queries are handled, beyond generic Sheets or Databricks usage. 
| | [Query embedding models](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-embedding-models) | integrations | 0.70 | Describes how to build and send embedding requests to Databricks model serving endpoints. This usually includes body schema, parameter names, and constraints specific to Databricks embedding APIs, which is product-specific integration knowledge. | | [Query from Lakehouse SQL Editor](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/query-sql-editor) | integrations | 0.70 | Describes two specific connection methods from the Lakehouse SQL editor to Lakebase compute or via Unity Catalog. This is a product-specific integration and connection pattern. | -| [Query from Notebooks](https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/query/notebook) | integrations | 0.70 | Described as containing code examples for accessing Lakebase from notebooks; this typically includes product-specific connection strings, parameters, and usage patterns that qualify as integration-focused expert knowledge. | | [Query history](https://learn.microsoft.com/en-us/azure/databricks/sql/user/queries/query-history) | troubleshooting | 0.70 | Explicitly about using query history UI to debug issues; likely maps performance symptoms to diagnostic views/fields and actions, which is troubleshooting-focused expert knowledge. | -| [Query history system table](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/query-history) | configuration | 0.70 | Reference page for a specific system table, including its schema and column details. This is product-specific configuration/metadata that an LLM is unlikely to know exactly from training and is used to configure queries and analytics over system tables. 
| | [Query optimization with constraints](https://learn.microsoft.com/en-us/azure/databricks/sql/user/queries/query-optimization-constraints) | best-practices | 0.70 | Contains examples of using primary keys with RELY for optimization and notes Photon requirements; this is concrete, product-specific optimization guidance. | -| [Query reasoning models](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-reason-models) | integrations | 0.70 | Describes how to write and send requests to reasoning-optimized models via the Foundation Model API. This implies specific API parameters and patterns for reasoning models, which are product-specific integration details. | +| [Query reasoning models](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-reason-models) | integrations | 0.70 | Describes request formats and parameters for reasoning models; includes API-specific fields and usage patterns (e.g., reasoning configuration) that are product-specific integration details. | | [Query tags](https://learn.microsoft.com/en-us/azure/databricks/sql/user/queries/query-tags) | configuration | 0.70 | Describes custom key-value tags, where they appear, and how to use them for grouping and cost attribution; includes product-specific tag configuration and behavior. | | [Query unstructured data](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/unstructured-retrieval-tools) | integrations | 0.70 | Covers connecting agents to unstructured data via Databricks Vector Search indexes and external vector stores, including retriever tools and AI Bridge packages. This is a product-specific integration pattern with concrete tool and API usage. 
| -| [Query vision models](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-vision-models) | integrations | 0.70 | Focuses on writing query requests for vision-optimized foundation models and sending them to serving endpoints. Likely includes request body formats, image encoding, and model-specific parameters, which are concrete integration details. | +| [Query vision models](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-vision-models) | integrations | 0.70 | Covers how to structure requests for vision models; will include model-specific request fields (image encodings, content types, parameter names) and endpoint usage patterns unique to Databricks’ API. | | [RECIPIENTS](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/recipients) | configuration | 0.70 | RECIPIENTS relation schema and filtering rules are Databricks-specific metadata configuration. | | [REFERENTIAL_CONSTRAINTS](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/referential_constraints) | configuration | 0.70 | REFERENTIAL_CONSTRAINTS relation describes RI relationships; its exact schema and behavior are product-specific metadata. | | [REMOVE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-connector-remove) | integrations | 0.70 | Defines Databricks-specific REMOVE statement for deleting files from volumes via SQL Connector/driver only. This is detailed, product-specific integration behavior. | @@ -1983,8 +1994,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [ROUTINE_COLUMNS](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/routine_columns) | configuration | 0.70 | ROUTINE_COLUMNS relation defines how result columns of TVFs are exposed; this is Databricks-specific metadata configuration. 
| | [RStudio Desktop](https://learn.microsoft.com/en-us/azure/databricks/sparkr/rstudio) | integrations | 0.70 | Describes concrete configuration steps and connection patterns between local RStudio and Databricks clusters, which are product-specific integration details. | | [Range join](https://learn.microsoft.com/en-us/azure/databricks/optimizations/range-join) | best-practices | 0.70 | Covers Databricks-specific range join optimization with concrete tuning guidance and join hints that go beyond generic Spark knowledge. | -| [Read in data](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/dataloading) | best-practices | 0.70 | Described as best practices for data loading on AI Runtime, including Unity Catalog requirements and likely product-specific patterns and gotchas for ML/DL workloads. | -| [Read shared data (Databricks-to-Databricks)](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/read-data-databricks) | integrations | 0.70 | How-to for consuming Databricks-to-Databricks shares with product-specific connection behavior and security model; likely includes concrete options/parameters for accessing shared data from Databricks workspaces. | +| [Read in data](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/dataloading) | best-practices | 0.70 | Provides product-specific data loading guidance tied to Unity Catalog requirements and AI Runtime behavior, representing concrete best practices for this service. | | [Reference](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/jira-reference) | configuration | 0.70 | A connector 'reference' page typically lists supported tables, required OAuth scopes, and permissions—i.e., concrete parameter names and required values—matching the configuration category. 
| | [Reference](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/meta-ads-reference) | configuration | 0.70 | Reference material including data type transformations and supported objects implies tables of fields, types, and mappings—product-specific configuration/parameter knowledge. | | [Reference](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-reference) | configuration | 0.70 | Reference for unsupported objects and data type transformations implies detailed, product-specific mapping tables and constraints that function as configuration/reference data. | @@ -1993,10 +2003,9 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Replicate external database (AUTO CDC)](https://learn.microsoft.com/en-us/azure/databricks/ldp/database-replication) | integrations | 0.70 | Describes using the AUTO CDC API to replicate RDBMS tables; likely includes API parameters, configuration patterns, and product-specific behavior for change data capture. | | [Request logs and assessment logs (deprecated)](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/request-assessment-logs) | configuration | 0.70 | Details deprecation of specific request and assessment log tables and migration to MLflow Traces; likely includes table names, schema expectations, and migration steps. | | [Reserved words](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-reserved-words) | configuration | 0.70 | Lists reserved words and reserved schema names with special meaning in Databricks, which are concrete configuration/usage constraints. | -| [Restrict workspace admins](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/restrict-workspace-admins) | security | 0.70 | Describes a specific admin control (RestrictWorkspaceAdmins) with concrete, product-specific behavior and fields that govern workspace admin permissions. 
This is detailed configuration/security behavior that isn’t obvious from general knowledge and maps to security/permission settings. | | [Retrieval quality evaluation](https://learn.microsoft.com/en-us/azure/databricks/vector-search/retrieval-quality-eval) | best-practices | 0.70 | The page describes built-in retrieval quality evaluation with automatic query generation and comparison of retrieval strategies. This implies detailed, product-specific guidance on how to configure and interpret evaluation runs, and how to compare strategies on Databricks, which are concrete usage patterns and recommendations rather than generic theory, fitting best-practices. | | [Review logged output table](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/anomaly-detection/results) | security | 0.70 | Contains product-specific governance details: exact system table name (system.data_quality_monitoring.table_results), default storage behavior, and access control note that only account admins can access and must grant others. These are concrete, product-specific security/permission behaviors. | -| [Roles and permissions](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/roles-permissions) | security | 0.70 | Covers creating roles and granting permissions for Lakebase; likely includes specific GRANT patterns and role usage tailored to this managed service. | +| [Row and column filters](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/filters-and-masks/) | security | 0.70 | Provides guidance on using row filters and column masks to protect sensitive data at row/column level. This is product-specific data governance and security configuration, likely including specific SQL constructs and Unity Catalog behaviors. 
| | [Row filtering](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/row-filtering) | configuration | 0.70 | Describes a 'row filtering' feature in beta, analogous to SQL WHERE, including its impact on performance and duplication. This is a specific configuration capability for managed connectors. | | [Run as](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/run-as) | security | 0.70 | Explains the 'Run as' setting, defaulting to owner, and recommends using a service principal to ensure continuity. This is product-specific identity/permission configuration guidance. | | [Run jobs on a schedule](https://learn.microsoft.com/en-us/azure/databricks/jobs/scheduled) | configuration | 0.70 | Details scheduled trigger options and notes constraints (for example first-run timing); likely includes schedule parameter names and allowed values. | @@ -2008,7 +2017,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [SET QUERY_TAGS](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-conf-mgmt-set-query-tags) | integrations | 0.70 | The page documents the SET QUERY_TAGS SQL command, including exact syntax, supported operations (set/read/remove), and behavior tied to Databricks’ system.query.history table. These are product-specific API details and configuration patterns for tagging queries, which align with integrations/coding-patterns and represent expert knowledge beyond generic SQL concepts. | | [SET RECIPIENT](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-set-recipient) | configuration | 0.70 | Unity Catalog/Delta Sharing-specific session configuration to mock CURRENT_RECIPIENT in provider queries. 
| | [SET VARIABLE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-set-variable) | configuration | 0.70 | SQL reference page with exact Databricks-specific SET variable syntax and behavior, including how it differs from SET config. This is product-specific configuration semantics not inferable from general SQL knowledge. | -| [SFTP](https://learn.microsoft.com/en-us/azure/databricks/ingestion/sftp) | integrations | 0.70 | SFTP connector usage in Lakeflow Connect is an integration pattern; such pages usually include connector-specific options, connection parameters, and constraints unique to this product. | | [SHARE_RECIPIENT_PRIVILEGES](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/share_recipient_privileges) | security | 0.70 | Documents a Databricks-specific INFORMATION_SCHEMA relation with exact column names and semantics for share recipient privileges, which is product-specific security metadata not generally known from training. | | [SHOW ALL IN SHARE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-show-all-in-share) | integrations | 0.70 | Documents precise Databricks SQL syntax and behavior for SHOW ALL IN SHARE, specific to Unity Catalog and Delta Sharing. | | [SHOW CATALOGS](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-show-catalogs) | integrations | 0.70 | Provides Databricks-specific SHOW CATALOGS syntax, regex pattern usage, and metastore behavior not covered by generic SQL. | @@ -2030,7 +2038,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [SQL](https://learn.microsoft.com/en-us/azure/databricks/jobs/sql) | configuration | 0.70 | Covers configuring SQL tasks including query/file/alert modes and requirements for Databricks SQL warehouses (serverless or pro). Contains product-specific configuration options and constraints for SQL tasks. 
| | [SQL Pipeline Operator](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-pipeop) | integrations | 0.70 | Covers Databricks-specific pipe operator syntax and how queries are chained, which is unique to this product. | | [SQL Pipeline Syntax](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-pipeline) | integrations | 0.70 | Describes Databricks-specific SQL pipeline syntax and chaining semantics introduced in specific runtime versions. | -| [SQL Scripting](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-scripting) | integrations | 0.70 | Details Databricks SQL scripting syntax (compound statements, handlers, cursors) and supported control-of-flow constructs, which are product-specific language features. | | [SQL Server CDC setup](https://learn.microsoft.com/en-us/azure/databricks/archive/connectors/sql-server-cdc) | integrations | 0.70 | Covers enabling built-in CDC in SQL Server for use with the Databricks connector. This typically includes specific database options, T-SQL commands, and connector expectations, which are expert integration details rather than generic CDC concepts. | | [SQL Server change tracking setup](https://learn.microsoft.com/en-us/azure/databricks/archive/connectors/sql-server-ct) | integrations | 0.70 | Guidance on enabling SQL Server change tracking specifically for the Databricks SQL Server connector, including when CT vs CDC is used, is a product-specific integration pattern with configuration steps and likely T-SQL options, fitting integrations. | | [SQL Server database setup (legacy)](https://learn.microsoft.com/en-us/azure/databricks/archive/connectors/sql-server-ddl-legacy) | integrations | 0.70 | Legacy DDL capture and schema evolution setup for SQL Server ingestion in Lakeflow Connect implies detailed database objects (internal tables, stored procedures, triggers) and connector-specific behavior. 
These are product-specific integration patterns and configurations. | @@ -2040,26 +2047,26 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Scala UDFs](https://learn.microsoft.com/en-us/azure/databricks/udf/scala) | integrations | 0.70 | Scala UDF registration and usage details in Databricks Spark SQL, including evaluation caveats, are product-specific coding patterns. | | [Scale Ray](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ray/scale-ray) | best-practices | 0.70 | Provides concrete recommendations for autoscaling, head node sizing, heterogeneous clusters, and resource allocation for Ray on Databricks, which are product-specific performance and scaling best practices. | | [Scale to zero](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/scale-to-zero) | configuration | 0.70 | Explains how scale-to-zero suspends compute after a period of inactivity, implying specific timing/behavioral settings that are product-specific configuration details. | +| [Search traces - SDK](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/query-via-sdk) | integrations | 0.70 | Documents programmatic querying of traces using mlflow.search_traces(), including product-specific API behavior and parameters for traces stored in different backends, which constitutes detailed SDK integration knowledge. | | [Search traces - UI](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/ui-traces) | configuration | 0.70 | Mentions MLFLOW_TRACKING_URI set to 'databricks' and describes how traces are stored/served by managed tracking; likely includes specific environment variable values and backend behavior, which are configuration details. 
| | [Secret workflow example](https://learn.microsoft.com/en-us/azure/databricks/security/secrets/example-secret-workflow) | integrations | 0.70 | Tutorial shows concrete code and configuration patterns for integrating Databricks secrets with JDBC connections to Azure Data Lake Storage, including parameter usage and secret reference syntax. | | [Secrets in Spark conf or environment variables](https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secrets-spark-conf-env-var) | security | 0.70 | Provides concrete syntax and behavior for referencing secrets in Spark configuration properties and environment variables, including automatic redaction from logs; this is product-specific security configuration/usage. | | [Secure cluster connectivity](https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/secure-cluster-connectivity) | security | 0.70 | Explains secure cluster connectivity/NPIP behavior and how to configure it so clusters have no public IPs. This is a product-specific security/network configuration pattern. | | [Security, compliance & privacy](https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/security-compliance-and-privacy/) | security | 0.70 | Security-focused architectural principles for protecting Databricks workloads and data; product-specific security and compliance patterns. | | [Security, compliance & privacy best practices](https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/security-compliance-and-privacy/best-practices) | security | 0.70 | Summarizes Databricks Security Best Practices guide along pillar principles; includes Databricks-specific security recommendations and configurations. | -| [Selective overwrite](https://learn.microsoft.com/en-us/azure/databricks/delta/selective-overwrite) | best-practices | 0.70 | Provides concrete recommendations such as preferring replaceWhere for most operations and using restore if data is accidentally overwritten. 
These are product-specific DO/DON’T operational patterns for Delta Lake, fitting best-practices. | | [September 2024](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2024/september) | limits-quotas | 0.70 | Contains a specific new limit: Git folders and Repos (Legacy) now support 1 GB and 20,000 workspace assets per working branch, which is an exact numerical quota change. | | [Serve LLMs to an optimized LLM serving endpoint](https://learn.microsoft.com/en-us/azure/databricks/archive/machine-learning/llm-optimized-model-serving) | configuration | 0.70 | Provides product-specific configuration and code patterns for enabling optimized LLM serving, including performance characteristics and deprecated API usage. | | [Serve multiple models to a Model Serving endpoint](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/serve-multiple-models-to-serving-endpoint) | configuration | 0.70 | Describes how to programmatically configure multiple models and traffic split on a single endpoint, which implies endpoint JSON schema, parameter names, and valid value ranges that are product-specific configuration details. | -| [Serverless GPU environment version 3](https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/three-gpu) | configuration | 0.70 | Provides system environment information and deprecation status for a specific GPU environment version, which is concrete configuration and lifecycle detail. | +| [Serverless GPU environment version 3](https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/three-gpu) | configuration | 0.70 | Although deprecated, the page is said to outline system environment information for GPU environment version 3. Such environment-version pages usually list specific versions and configuration details, which constitute product-specific expert configuration knowledge. 
| | [Serverless GPU environment version 4](https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/four-gpu) | configuration | 0.70 | Describes a specific GPU environment version with system environment information and preview constraints, which are concrete configuration details for AI Runtime GPU workloads. | +| [Serverless GPU environment version 5](https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/five-gpu) | configuration | 0.70 | This GPU environment version page explicitly provides system environment information for a specific GPU runtime, which generally includes concrete versions and settings for AI Runtime. This is a product-specific configuration detail that qualifies as expert knowledge. | | [Serverless environment version 1](https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/one) | configuration | 0.70 | Documents system environment information for environment version 1, providing concrete configuration details unique to this product. | | [Serverless environment version 2](https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/two) | configuration | 0.70 | Similar to other environment-version pages, it documents system environment specifics for that version, which are configuration references. | | [Serverless environment version 3](https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/three) | configuration | 0.70 | Outlines system environment information for a specific environment version, which is product-specific configuration data. 
| -| [Serverless environment version 4](https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/four) | configuration | 0.70 | Like other environment-version pages, it outlines system environment information (versions and components) that constitute product-specific configuration details. | -| [Serverless environment version 5](https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/five) | configuration | 0.70 | Environment-version pages typically enumerate exact system environment details (Python version, libraries, possibly config constraints) that are product-specific configuration references. | -| [Serverless environment versions](https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/) | configuration | 0.70 | Provides a reference for currently released environment versions, each with specific Python versions and API compatibility; this is concrete configuration/selection data for environments. | +| [Serverless environment version 4](https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/four) | configuration | 0.70 | Similar to v5, this page outlines system environment information for environment version 4, which typically includes specific runtime versions and settings. These are concrete configuration parameters unique to this product and version. | +| [Serverless environment version 5](https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/five) | configuration | 0.70 | Described as outlining system environment information for a specific serverless environment version. These environment-version pages typically enumerate concrete environment details (Python version, libraries, system settings) that are product-specific configuration data not inferable from general training. 
| +| [Serverless workspaces overview](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/serverless-workspaces) | decision-making | 0.70 | Describes benefits, limitations, and use cases for serverless workspaces versus other options, providing product-specific guidance on when to choose this workspace type. | | [Service principals for Terraform](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/terraform/service-principals) | security | 0.70 | Covers creating service principals for Databricks automation with Terraform, including security-focused identity configuration. Likely includes specific fields/parameters for service principal resources and scopes. | -| [Service principals overview](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/service-principals) | security | 0.70 | Covers product-specific identity model for service principals in Azure Databricks, including how they interact with workspaces and identity federation. Contains concrete, platform-specific security/identity behavior beyond generic concepts. | | [Set and use environment variables with init scripts](https://learn.microsoft.com/en-us/azure/databricks/init-scripts/environment-variables) | configuration | 0.70 | The page describes product-specific behavior of environment variables in Azure Databricks init scripts, including how they are exposed and used during cluster initialization. This is configuration-focused, with details unique to Databricks init scripts rather than generic environment variable usage. | | [Set the tracking server and experiment](https://learn.microsoft.com/en-us/azure/databricks/mlflow/tracking-server-configuration) | configuration | 0.70 | Covers how to configure tracking servers for local, cross-workspace, and multi-workspace scenarios; likely includes specific settings and patterns for data location. 
| | [Set up a connection](https://learn.microsoft.com/en-us/azure/databricks/integrations/msft-power-platform-setup) | configuration | 0.70 | Covers creating a Databricks connection in Power Apps/Power Automate. Likely includes connector-specific parameters, authentication options, and required fields that are detailed configuration knowledge. | @@ -2070,9 +2077,9 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Set up data source](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/zendesk-support-source-setup) | security | 0.70 | Configuring OAuth for a specific SaaS integration typically involves concrete auth endpoints, scopes, redirect URIs, and app settings. These are product-specific authentication parameters that match the security sub-skill. | | [Set up private Git connectivity](https://learn.microsoft.com/en-us/azure/databricks/repos/git-proxy) | configuration | 0.70 | Describes the Databricks Git server proxy and how to configure private Git servers behind firewalls to work with Git folders. This is Databricks-specific networking and Git integration configuration. | | [Set up your environment](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/configure-env) | deployment | 0.70 | Lists specific workspace and local prerequisites (CLI version, workspace features) required to deploy and run apps; product-specific deployment requirements. | +| [Setup](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/setup) | configuration | 0.70 | Describes prerequisites, compute requirements, and query configuration for real-time mode, likely including specific configuration options and sizing guidance unique to Databricks Structured Streaming, fitting configuration-focused expert knowledge. 
| | [Share code between notebooks](https://learn.microsoft.com/en-us/azure/databricks/notebooks/share-code) | integrations | 0.70 | Covers importing Python files and using Databricks-specific mechanisms (files, multi-task jobs) to share code, which are concrete coding patterns unique to Databricks. | | [Share feature tables across workspaces (legacy)](https://learn.microsoft.com/en-us/azure/databricks/archive/machine-learning/feature-store/multiple-workspaces) | configuration | 0.70 | Describes how to configure a centralized feature store and access it from multiple workspaces, including Databricks-specific setup recommendations. | -| [SharePoint](https://learn.microsoft.com/en-us/azure/databricks/ingestion/sharepoint) | integrations | 0.70 | SharePoint connector article; likely includes Databricks-specific connector configuration, supported modes (batch/streaming), and parameter usage for Auto Loader, spark.read, and COPY INTO. | | [Sharing bundle files](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/sharing) | configuration | 0.70 | Explains how to configure shared folders, libraries, and variables across bundles; involves specific configuration patterns and paths unique to Databricks. | | [Shiny on Azure Databricks](https://learn.microsoft.com/en-us/azure/databricks/sparkr/shiny) | integrations | 0.70 | Shows how to host Shiny apps in Databricks notebooks and RStudio with Spark integration; these are concrete product-specific integration patterns. | | [Sign in with Azure CLI](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/azure-cli-login) | security | 0.70 | Concrete instructions for logging into Databricks via Azure CLI with user or service principal, including required commands and parameters. Security/auth focused and product-specific. 
| @@ -2080,11 +2087,11 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Skew join hints](https://learn.microsoft.com/en-us/azure/databricks/archive/legacy/skew-join) | best-practices | 0.70 | Discusses Databricks-specific configuration (spark.sql.adaptive.skewJoin.enabled) and hint usage to handle data skew, which are concrete product-specific tuning practices. | | [Snapshots](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/snapshots) | configuration | 0.70 | Snapshot behavior (scope, retention, restore semantics) is specific to Lakebase and constitutes expert configuration knowledge. | | [SparkR to sparklyr migration](https://learn.microsoft.com/en-us/azure/databricks/sparkr/sparkr-migration) | integrations | 0.70 | Describes concrete function mappings and code examples for migrating from SparkR to sparklyr in Azure Databricks. These API-level mappings and migration patterns are product- and version-specific coding patterns, fitting integrations & coding patterns. | -| [SparkR versus sparklyr](https://learn.microsoft.com/en-us/azure/databricks/sparkr/sparkr-vs-sparklyr) | decision-making | 0.70 | Compares SparkR and sparklyr specifically for Azure Databricks, including deprecation status and migration-oriented guidance to help choose which API to use. This is product- and version-specific decision guidance that an LLM is unlikely to infer from general training data. | +| [SparkR versus sparklyr](https://learn.microsoft.com/en-us/azure/databricks/sparkr/sparkr-vs-sparklyr) | decision-making | 0.70 | Explicit comparison of SparkR vs sparklyr with migration guidance; Databricks-specific recommendation and deprecation details help users decide which R API to use and how to migrate, fitting decision-making criteria. 
| | [Stack](https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/stack-cli) | deployment | 0.70 | Stack CLI is used to deploy and manage stacks of Databricks resources; the page documents product-specific deployment commands and constraints. | | [Standalone chat app](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/chat-app) | deployment | 0.70 | Describes building and deploying a chat UI using Databricks Apps; likely includes app configuration, routing, and deployment details specific to Databricks. | | [Stateful stream processing](https://learn.microsoft.com/en-us/azure/databricks/ldp/stateful-processing) | best-practices | 0.70 | Explains how to use watermarks for aggregations, joins, and deduplication in Lakeflow pipelines with examples; these are product-specific performance and correctness recommendations. | -| [Store traces in Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/trace-unity-catalog) | configuration | 0.70 | Explains how to store MLflow traces in Unity Catalog tables using OTEL format, which typically includes product-specific configuration options (e.g., enabling the feature, specifying catalogs/schemas/tables, and access control behavior). These are concrete configuration details unique to Databricks + MLflow traces. | +| [Store traces in Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/trace-unity-catalog) | configuration | 0.70 | Describes how to store MLflow traces in Unity Catalog tables using an OpenTelemetry-compatible format, which requires product-specific configuration details (table layout, storage options, and UC-specific behaviors) that go beyond generic concepts. 
| | [StreamingQueryListener class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerylistener) | integrations | 0.70 | Describes a Databricks-specific abstract listener class and its lifecycle callbacks, which are concrete API integration points. | | [StreamingQueryManager class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerymanager) | integrations | 0.70 | Class-level reference for managing active streaming queries via Databricks PySpark APIs, including how to access it from SparkSession. | | [Structured Streaming writes to Azure Synapse](https://learn.microsoft.com/en-us/azure/databricks/archive/azure/stream-synapse) | integrations | 0.70 | Describes Structured Streaming writes to Synapse using a specific connector and COPY semantics, a Databricks–Synapse integration pattern. | @@ -2092,7 +2099,9 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [SuperAnnotate](https://learn.microsoft.com/en-us/azure/databricks/partners/ml/superannotate) | integrations | 0.70 | Shows how to integrate SuperAnnotate’s Python SDK with Databricks, including product-specific patterns for transforming annotations into Spark DataFrames. | | [Swarm](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/integrations/swarm) | integrations | 0.70 | Details MLflow integration for OpenAI Swarm via mlflow.openai.autolog and nested trace behavior plus deprecation guidance; specific to this product combo. | | [Synapse with Polybase](https://learn.microsoft.com/en-us/azure/databricks/archive/azure/synapse-polybase) | integrations | 0.70 | Described as including configuration information for the legacy PolyBase connector between Azure Databricks and Synapse; such connector docs typically list connector-specific options and parameters that qualify as product-specific integration details. 
| +| [System tables overview](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/) | configuration | 0.70 | Reference for system tables schema and usage is product-specific configuration/metadata needed for monitoring and analytics. | | [TISAX](https://learn.microsoft.com/en-us/azure/databricks/security/privacy/tisax) | security | 0.70 | Explains how TISAX requirements translate into specific Azure Databricks workspace controls, which is specialized security/compliance configuration knowledge. | +| [TLS server certificate validation](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/tls-server-certificate-validation) | security | 0.70 | Explains how to configure TLS server certificate validation for specific Lakeflow Connect database connectors (MySQL, PostgreSQL, SQL Server), including product-specific security behavior and settings to prevent PITM attacks, which fits security configuration guidance. | | [Tableau](https://learn.microsoft.com/en-us/azure/databricks/partners/bi/tableau) | integrations | 0.70 | Covers Partner Connect plus direct Tableau connections. Such docs typically include server URL formats, connector selection, sign-on configuration, and Databricks-specific options, which are expert integration and configuration details. | | [Task values](https://learn.microsoft.com/en-us/azure/databricks/jobs/task-values) | integrations | 0.70 | Describes dbutils.jobs.taskValues API (set/get) with key-based semantics, a Databricks-specific coding pattern and API surface for inter-task communication. | | [Terraform](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/automate-with-terraform) | deployment | 0.70 | Shows Terraform configuration for projects, branches, endpoints, and deletion; includes resource types and arguments specific to the Azure Databricks provider for Lakebase. 
| @@ -2111,6 +2120,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Transaction modes](https://learn.microsoft.com/en-us/azure/databricks/transactions/transaction-modes) | decision-making | 0.70 | Explains when to use interactive vs non-interactive transaction modes with examples; provides scenario-based guidance for selecting modes, which is decision-focused and Databricks-specific. | | [Transactional writes to cloud storage with DBIO](https://learn.microsoft.com/en-us/azure/databricks/archive/legacy/dbio-commit) | best-practices | 0.70 | DBIO usage involves product-specific write patterns, commit protocols, and configuration options that address correctness and performance edge cases. | | [Transform complex data types](https://learn.microsoft.com/en-us/azure/databricks/semi-structured/complex-types) | best-practices | 0.70 | Shows optimized transformation patterns for nested data types with concrete code examples and Databricks-specific optimizations. These are actionable patterns and gotchas for performance and correctness, aligning with best-practices. | +| [Troubleshoot high initialization times](https://learn.microsoft.com/en-us/azure/databricks/ldp/fix-high-init) | best-practices | 0.70 | Provides concrete operational guidance for high initialization times (for example, thresholds like initialization over five minutes and splitting tables across pipelines); this is product-specific performance tuning advice. | | [Troubleshooting](https://learn.microsoft.com/en-us/azure/databricks/genie/troubleshooting) | troubleshooting | 0.70 | Dedicated troubleshooting page for Genie spaces; such pages typically map specific symptoms and errors to causes and resolutions, which are product-specific expert knowledge. 
| | [Tune file size](https://learn.microsoft.com/en-us/azure/databricks/delta/tune-file-size) | best-practices | 0.70 | Provides Databricks-specific recommendations for manual vs automatic file size tuning, including when Unity Catalog managed tables override manual tuning and how file size interacts with clustering and predictive optimization. These are concrete, product-specific tuning practices. | | [Tutorial: Create and deploy a bundle in the workspace](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/workspace-tutorial) | configuration | 0.70 | Tutorial for workspace bundle creation and deployment; will show concrete configuration steps and requirements specific to workspace-based bundles. | @@ -2121,37 +2131,39 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Type widening](https://learn.microsoft.com/en-us/azure/databricks/delta/type-widening) | configuration | 0.70 | Describes a Databricks Runtime–specific feature (type widening) with product-version constraints and behavior details (only works on Runtime 15.4 LTS+ and requires Delta Lake). While the summary doesn’t show full parameter tables, this is a product-specific configuration/behavior toggle rather than a generic concept, so it fits configuration better than other categories. | | [UC function tools](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/create-custom-tool) | integrations | 0.70 | Shows how to build custom tools using Unity Catalog functions to execute task-specific logic. This is concrete, product-specific guidance on defining callable functions/tools for agents, fitting integrations & coding patterns. | | [UDFs on Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/udf/unity-catalog) | configuration | 0.70 | Describes Unity Catalog–scoped UDFs, their scope differences from PySpark UDFs, and references to CREATE FUNCTION syntax. 
This is product-specific configuration of UDFs within Unity Catalog, including scope and governance behavior. | -| [UDTFs in Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/udf/udtf-unity-catalog) | configuration | 0.70 | Explains Unity Catalog user-defined table functions, how they are invoked in FROM clauses, and argument types. This is specific to Databricks Unity Catalog UDTF configuration and behavior. | | [UK Cyber Essentials Plus](https://learn.microsoft.com/en-us/azure/databricks/security/privacy/uk-cyber-essentials-plus) | security | 0.70 | UK Cyber Essentials Plus compliance controls require specific workspace security configurations and options that are unique to this product and regulation. | | [UNPIVOT clause](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-unpivot) | integrations | 0.70 | Describes Databricks-specific UNPIVOT clause behavior and version availability, which are concrete implementation details. | | [Unified authentication](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/unified-auth) | security | 0.70 | Defines unified authentication model and how it applies to CLI, Terraform, and SDKs. Includes product-specific auth configuration behavior and reuse patterns. | -| [Unity Catalog best practices](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/best-practices) | best-practices | 0.70 | The page is explicitly a best practices guide for using Unity Catalog to meet data governance needs and data isolation on Azure Databricks. Such guidance is typically product-specific, including concrete recommendations on how to structure catalogs, schemas, permissions, and isolation patterns that go beyond generic governance theory. This aligns with the best-practices sub-skill type. 
| | [Unity Catalog connections](https://learn.microsoft.com/en-us/azure/databricks/connect/uc-connections) | security | 0.70 | Unity Catalog connections are securable objects with endpoint and credential definitions; documentation for these objects generally includes permission scopes, object-level security semantics, and how they differ from storage/service credentials, which is product-specific IAM/security configuration. | | [Unity Catalog support](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/unity-catalog) | best-practices | 0.70 | Outlines supported functionality and best practices for using Unity Catalog with streaming, including governance patterns specific to Databricks. | | [Unity Catalog tables](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/tables) | security | 0.70 | Emphasizes required privileges, governance, and access control for Unity Catalog tables; likely lists specific privilege requirements and patterns, fitting product-specific security configuration. | | [Unity Catalog volumes](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/uc-volumes) | security | 0.70 | Covers volumes with governance and access control plus required privileges; this is product-specific security and permission configuration for file-based storage. | +| [Update workspace network configuration](https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/update-workspaces) | configuration | 0.70 | Provides detailed steps and options for changing an existing workspace’s VNet configuration (including VNet injection and modifications), with product-specific network configuration parameters and constraints. This is focused on configuration rather than general concepts. 
| | [Upgrade to privilege inheritance](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/upgrade-privilege-model) | security | 0.70 | Explains how to move to Unity Catalog Privilege Model 1.0, including behavior changes around privilege inheritance and upgrade implications, which are product-specific security/authorization details. | +| [Usage tracking](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/usage-tracking-beta) | configuration | 0.70 | Usage tracking system table for Unity AI Gateway endpoints will include concrete schema details (column names for token counts, latency, request/response metadata) and example queries specific to Databricks system tables, which are product-specific configuration/telemetry details not inferable from general knowledge. | | [Use Git with jobs](https://learn.microsoft.com/en-us/azure/databricks/jobs/git) | configuration | 0.70 | Explains configuring job tasks to check out code from remote Git repositories, including constraints like all tasks using the same commit and how snapshots are taken. These are product-specific configuration behaviors and parameters (branch/commit handling, caching, sparse checkout) that fit the configuration sub-skill. | | [Use a Python package](https://learn.microsoft.com/en-us/azure/databricks/jobs/how-to/use-python-wheels-in-workflows) | integrations | 0.70 | Covers packaging code as a wheel and using the Python wheel task; likely documents task-specific parameters and constraints unique to this job type. | +| [Use a Unity Catalog volume or external location](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/unity-catalog) | integrations | 0.70 | Describes Databricks SQL COPY INTO usage with Unity Catalog volumes/external locations, including product-specific syntax, options, and configuration details for ADLS-backed tables; this is concrete integration and coding pattern guidance. 
|
| [Use a custom image as your compute environment](https://learn.microsoft.com/en-us/azure/databricks/compute/custom-containers) | configuration | 0.70 | Covers customizing compute with Docker images and init scripts; such content typically includes image configuration parameters and environment setup details specific to Databricks Container Services. |
| [Use custom metrics](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/data-profiling/custom-metrics) | configuration | 0.70 | Explains how to define custom and drift metrics for profiling, including how they plug into the profiling system and metric tables, which are Databricks-specific configuration patterns. |
| [Use dbt](https://learn.microsoft.com/en-us/azure/databricks/jobs/how-to/use-dbt-in-workflows) | integrations | 0.70 | Describes running dbt Core projects as a Lakeflow Jobs task; likely includes task configuration parameters and integration-specific settings unique to Databricks–dbt integration. |
-| [Use foundation models and external models overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/score-foundation-models) | integrations | 0.70 | Explains how to write and send query requests to Databricks-hosted and external foundation models via model serving endpoints. This typically includes request schema, parameters, and endpoint specifics unique to Databricks Foundation Model APIs, fitting integrations & coding patterns. |
+| [Use foundation models and external models overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/score-foundation-models) | integrations | 0.70 | Explains request options and how to send them to endpoints; likely includes request schema, parameter names, and endpoint URLs for Databricks-hosted and external models, which are integration-specific API patterns. |
| [Use level of detail expressions](https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/level-of-detail) | integrations | 0.70 | Covers Databricks-specific level-of-detail expression syntax and behavior within metric views, with concrete expression patterns and semantics that are unique to this product feature, aligning with integrations & coding patterns. |
| [Use service credentials to access services](https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-services/use-service-credentials) | integrations | 0.70 | Shows how to use service credential objects when connecting to external cloud services; includes concrete usage patterns and parameters for integrations. |
| [Use temporary credentials](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/temporary-credentials) | security | 0.70 | Describes using temporary credentials for accessing external storage with COPY INTO; such content typically includes auth-related parameters, scopes, and security-specific configuration unique to Databricks ingestion. |
-| [Use the Genie API](https://learn.microsoft.com/en-us/azure/databricks/genie/conversation-api) | integrations | 0.70 | API integration page for embedding Genie into applications; likely includes endpoint URLs, request/response schemas, and parameter references that are product-specific integration details. |
+| [Use the Genie API](https://learn.microsoft.com/en-us/azure/databricks/genie/conversation-api) | integrations | 0.70 | Describes using the Genie API in agents, chatbots, and apps. An API reference-style page will include endpoint/parameter details and request/response structures that are product-specific integration patterns. |
| [User-defined functions](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/functions) | configuration | 0.70 | Describes adding registered SQL/Python UDFs as resources; likely includes how to reference them and any required configuration fields, which are product-specific configuration patterns. |
| [User-defined functions](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/udf) | integrations | 0.70 | UDF article explains serialization behavior, supported patterns, and possibly limitations for UDFs over Databricks Connect, which are specific coding and integration details. |
-| [Users and groups overview](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/) | security | 0.70 | Covers centralized identity management across account and workspaces, including syncing from identity providers and controlling access to workspaces, data, and compute—product-specific IAM behavior. |
-| [Vacuum](https://learn.microsoft.com/en-us/azure/databricks/delta/vacuum) | best-practices | 0.70 | VACUUM behavior on Delta/Unity Catalog is product-specific; page typically includes retention thresholds, recommended settings, and gotchas (e.g., data loss risks, minimum retention), which are concrete Databricks-specific maintenance recommendations. |
+| [Users](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/users) | security | 0.70 | Procedures and constraints for adding, updating, and removing users in identity-federated workspaces are specific IAM/security operations. |
+| [Users and groups overview](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/) | security | 0.70 | Centralized identity management and access control across account/workspaces is product-specific IAM behavior and configuration. |
+| [Vacuum](https://learn.microsoft.com/en-us/azure/databricks/delta/vacuum) | best-practices | 0.70 | Covers using VACUUM with retention thresholds and recommendations (predictive optimization, regular scheduling) that are specific to Databricks Delta behavior and compliance/cost trade-offs. This is actionable product-specific maintenance guidance rather than generic SQL knowledge. |
| [Variant shredding](https://learn.microsoft.com/en-us/azure/databricks/delta/variant-shredding) | best-practices | 0.70 | Describes how to apply shredding to VARIANT columns, including Databricks-specific behavior and recommended usage patterns to improve query performance on semi-structured data. These are concrete optimization practices. |
| [Vector search indexes](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/vector-search) | configuration | 0.70 | Page is about adding vector search indexes as app resources and likely includes Databricks-specific resource configuration options (resource definitions, properties, and parameters) that go beyond generic vector search concepts. |
| [View table details](https://learn.microsoft.com/en-us/azure/databricks/delta/table-details) | configuration | 0.70 | DESCRIBE DETAIL exposes many table-level configuration fields and metadata (e.g., properties, protocol versions, stats) that are product-specific. This is effectively a reference for what details are returned and how to interpret them, which is configuration-focused expert knowledge beyond generic SQL DESCRIBE behavior. |
| [Volume and file paths](https://learn.microsoft.com/en-us/azure/databricks/volumes/paths) | configuration | 0.70 | Details restrictions on path overlaps and path-based access patterns for tables and volumes; product-specific path management rules. |
| [WATERMARK clause](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-watermark) | integrations | 0.70 | Documents Databricks-specific WATERMARK clause semantics for stateful streaming queries and version requirements. |
| [WINDOW clause](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-named-window) | integrations | 0.70 | Explains Databricks-specific WINDOW clause syntax for naming and reusing window definitions across functions. |
-| [Watermarks](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/watermarks) | best-practices | 0.70 | Provides recommendations on watermark usage to control state growth and latency; these are product-specific operational guidelines for Structured Streaming. |
+| [Watermarks](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/watermarks) | best-practices | 0.70 | Provides Databricks-specific recommendations on how to use watermarks in stateful streaming to avoid unbounded state and latency issues; includes concrete guidance on when and how to configure watermarks for common operations, which is product-specific best-practice knowledge. |
| [What files can I reference in an init script?](https://learn.microsoft.com/en-us/azure/databricks/init-scripts/referencing-files) | best-practices | 0.70 | Outlines supported file locations for init scripts and provides recommendations; product-specific best practices and constraints. |
| [Work with Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/sync-data/) | integrations | 0.70 | Describes integration of Lakebase with Unity Catalog for governance and synchronization. Such a page typically includes configuration options, sync behaviors, and constraints specific to this integration, matching integrations. |
| [Work with clean rooms as a collaborator](https://learn.microsoft.com/en-us/azure/databricks/clean-rooms/clean-room-collaborator) | limits-quotas | 0.70 | States a specific numeric limit: a clean room can have ten total collaborators; this is an explicit quota plus collaborator workflow details. |
@@ -2159,10 +2171,10 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [Work with output tables](https://learn.microsoft.com/en-us/azure/databricks/clean-rooms/output-tables) | configuration | 0.70 | Defines output tables as temporary read-only tables and explains how to create and access them; includes product-specific table behavior and lifecycle semantics. |
| [Work with structured data](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/structured-retrieval-tools) | integrations | 0.70 | Focuses on connecting agents to structured data via Unity Catalog tables, SQL warehouses, and external stores using MCP servers and custom tools. This is a concrete integration pattern with Databricks-specific mechanisms and likely parameterized tool definitions. |
| [Workspace libraries](https://learn.microsoft.com/en-us/azure/databricks/archive/legacy/workspace-libraries) | configuration | 0.70 | Describes workspace library behavior, installation scopes, and interactions with compute that are specific configuration mechanics of Databricks. |
+| [Workspaces system table](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/workspaces) | configuration | 0.70 | Explains the workspaces system table, its columns and lifecycle status fields, and how to join with other tables—expert schema/configuration information. |
| [Write a client](https://learn.microsoft.com/en-us/azure/databricks/ingestion/zerobus-ingest) | integrations | 0.70 | Connector-specific ingestion page for Zerobus Ingest in Lakeflow Connect likely documents connector parameters, configuration, and usage patterns unique to this integration. |
-| [Write your own](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/tutorial-databricks-apps-autoscaling) | security | 0.70 | Tutorial focuses on connecting via service principal authentication with automatic OAuth token rotation and handling credential expiry. These are product-specific authentication and token-handling patterns, fitting security-focused configuration. |
| [XML files](https://learn.microsoft.com/en-us/azure/databricks/query/formats/xml) | integrations | 0.70 | Describes native XML file format support, schema inference, and streaming support in Databricks, which are concrete integration behaviors for this product. |
-| [Zerobus system tables reference](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/zerobus-ingest) | configuration | 0.70 | Zerobus system tables reference with schema and regional behavior; these are product-specific configuration/telemetry details. |
+| [Zerobus system tables reference](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/zerobus-ingest) | configuration | 0.70 | System table reference pages enumerate table paths, column names, types, and semantics plus example queries—product-specific configuration/metadata details that function like a schema/config reference and are not generally known to LLMs. |
| [account access-control](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/account-access-control-commands) | security | 0.70 | Access-control command group for managing access rules; contains security-focused operations and likely references to permissions and scopes. |
| [account billable-usage](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/account-billable-usage-commands) | integrations | 0.70 | Billable-usage command group with specific parameters (account, date range) for exporting logs; product-specific CLI/API surface. |
| [account credentials](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/account-credentials-commands) | configuration | 0.70 | Commands to manage credential configurations and IAM roles; includes specific configuration objects and fields. |
@@ -2202,14 +2214,15 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [ai_generate_text() example](https://learn.microsoft.com/en-us/azure/databricks/archive/machine-learning/ai-generate-text-example) | integrations | 0.70 | Describes Databricks-specific SQL function ai_generate_text(), including how to call it against OpenAI, parameter usage, and query patterns that are unique to Databricks SQL and not generic OpenAI usage. |
| [ai_mask function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_mask) | integrations | 0.70 | AI masking function; page will define entity specification, parameters, and behavior tied to Databricks model endpoints, which are product-specific integration details. |
| [ai_parse_document](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/ai_parse_document) | integrations | 0.70 | Function reference for AI-based document parsing returning VariantType, including error behavior on invalid blobs; Databricks-specific API. |
+| [ai_parse_document function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_parse_document) | integrations | 0.70 | Function reference pages for Databricks SQL typically include product-specific syntax, parameters, and behavior details (such as required/optional arguments, supported data types, and return structures) that go beyond generic LLM knowledge. While not about limits or configuration, this is expert integration/coding-pattern knowledge for using the ai_parse_document function within Databricks SQL. |
| [ai_similarity function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_similarity) | integrations | 0.70 | Function uses Databricks Foundation Model APIs; documentation will specify parameters, return types, and usage patterns specific to Databricks AI Functions, matching integrations criteria. |
| [ai_summarize function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_summarize) | integrations | 0.70 | ai_summarize() integrates with Databricks chat model endpoints; reference will detail function arguments and behavior unique to this product integration. |
| [alerts](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/alerts-commands) | integrations | 0.70 | CLI reference pages typically list command names, required/optional parameters, and flags specific to the Databricks alerts API. These are product-specific API/SDK-style parameters and options that qualify as integration patterns rather than generic tutorial content. |
| [api](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/api-commands) | integrations | 0.70 | The api command reference will enumerate CLI parameters, request structure, and usage patterns for invoking arbitrary Databricks REST endpoints. These are concrete, product-specific integration details (parameters, flags, usage) that fit the integrations sub-skill. |
+| [append_flow](https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-python-ref-append-flow) | integrations | 0.70 | Documents the @dp.append_flow Python decorator with product-specific syntax and behavior for streaming DataFrames and append flows; this is concrete API usage/configuration knowledge. |
| [approx_count_distinct](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/approx_count_distinct) | integrations | 0.70 | Function reference for approximate distinct counting in Databricks PySpark, including parameters and behavior. |
| [approx_percentile](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/approx_percentile) | integrations | 0.70 | Documents approx_percentile semantics and parameters in Databricks PySpark; detailed function behavior. |
| [approx_top_k aggregate function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/approx_top_k) | integrations | 0.70 | Function reference for approx_top_k with Databricks-specific syntax, argument behavior, and return structure. These are detailed API semantics beyond generic SQL knowledge, fitting integrations & coding patterns. |
-| [apps](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/apps-commands) | integrations | 0.70 | Apps command group; product-specific app lifecycle operations and parameters for apps running on Databricks. |
| [area](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor/area) | integrations | 0.70 | Method reference for area plotting on DataFrames, including semantics of stacked area plots in this API. |
| [array_join](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/array_join) | integrations | 0.70 | Documents array_join behavior including delimiter and null_replacement semantics; detailed function behavior. |
| [artifact-allowlists](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/artifact-allowlists-commands) | integrations | 0.70 | This command group reference will describe specific CLI commands and parameters for managing artifact allowlists in Unity Catalog, including command syntax and options. These are product-specific configuration/integration parameters, matching the integrations category. |
@@ -2219,7 +2232,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [asc](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/asc) | integrations | 0.70 | API reference for a Databricks PySpark Column method with defined behavior, representing product-specific coding patterns. |
| [asc_nulls_first](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/asc_nulls_first) | integrations | 0.70 | Describes a precise Column API for sort behavior including null ordering, which is specific to Databricks PySpark implementation. |
| [asc_nulls_last](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/column/asc_nulls_last) | integrations | 0.70 | Provides exact method name and semantics for a Databricks PySpark Column sort variant, which is detailed SDK knowledge. |
-| [auth](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/auth-commands) | integrations | 0.70 | Auth command reference pages document concrete CLI commands, flags, and authentication parameters (e.g., profiles, tokens, OAuth flows) unique to the Databricks CLI. These are detailed integration/configuration parameters rather than high-level concepts, fitting integrations. |
| [avg](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/groupeddata/avg) | integrations | 0.70 | Method reference for GroupedData.avg/mean with defined behavior per numeric column, representing concrete API semantics. |
| [bar](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor/bar) | integrations | 0.70 | Documents the bar plotting method on DataFrames, a concrete API usage pattern. |
| [barh](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/plotaccessor/barh) | integrations | 0.70 | Method-level documentation for horizontal bar plots, including semantics of axes and categories. |
@@ -2235,7 +2247,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [createOrReplace](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/createorreplace) | integrations | 0.70 | API reference for a specific PySpark/Databricks method with product-specific semantics and options; this is concrete integration/coding detail beyond generic LLM knowledge. |
| [createTable](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/createtable) | integrations | 0.70 | Describes a concrete Catalog API for table creation tied to Databricks/Spark; this is SDK integration detail. |
| [current-user](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/current-user-commands) | integrations | 0.70 | The 'current-user' command group will specify exact commands and output structure for querying the authenticated user or service principal, which are product-specific CLI details. |
-| [data-quality](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/data-quality-commands) | configuration | 0.70 | Data-quality command group will list specific operations and parameters for data quality management, which are expert configuration patterns. |
| [database](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/database-commands) | configuration | 0.70 | Database instance management commands and their arguments are detailed configuration interfaces unique to Databricks. |
| [databaseExists](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/databaseexists) | integrations | 0.70 | API method semantics (what it checks, return type) are product-specific integration details. |
| [dbt](https://learn.microsoft.com/en-us/azure/databricks/jobs/dbt) | configuration | 0.70 | Explains Databricks-specific dbt task configuration, including use of DBT_ACCESS_TOKEN and run-as principal behavior. These are concrete, product-specific configuration details not captured by generic dbt knowledge. |
@@ -2265,11 +2276,11 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [from_avro function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/from_avro) | integrations | 0.70 | Databricks-specific function for decoding Avro with parameters (avroBin, jsonSchemaStr); represents a concrete integration pattern between Databricks SQL and Avro with product-specific function semantics. |
| [from_json schema inference and evolution](https://learn.microsoft.com/en-us/azure/databricks/ldp/from-json-schema-evolution) | configuration | 0.70 | Describes schema inference and evolution options for from_json in pipelines; likely includes function options, flags, and behaviors unique to this environment. |
| [functionExists](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/functionexists) | integrations | 0.70 | Method-level semantics for checking temporary or permanent functions are product-specific API details. |
-| [genie](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/genie-commands) | integrations | 0.70 | The 'genie' command group contains commands for Genie; this implies specific CLI operations and parameters unique to Genie spaces, fitting integrations. |
| [get](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/observation/get) | integrations | 0.70 | Documents blocking behavior and single-action semantics of Observation.get, which are non-obvious, product-specific details. |
| [getActiveSession](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/getactivesession) | integrations | 0.70 | Documents a class method and its behavior for obtaining the active SparkSession; SDK-specific pattern. |
| [getBytes](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/geometry/getbytes) | integrations | 0.70 | Documents a concrete PySpark Geometry method, exposing exact behavior and usage patterns that are specific to Azure Databricks’ PySpark implementation. |
| [getSrid](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/geometry/getsrid) | integrations | 0.70 | Method-level reference for Geometry.getSrid with product-specific semantics that go beyond generic spatial concepts. |
+| [get_json_object function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/get_json_object) | integrations | 0.70 | Function reference pages for Databricks SQL typically include product-specific syntax, arguments, return types, and behavior details (such as how JSON paths are interpreted, null/edge-case handling, and examples). These are concrete, code-level integration details for working with JSON in Databricks SQL that go beyond generic SQL knowledge, fitting the integrations & coding patterns category. |
| [getbit](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/getbit) | integrations | 0.70 | Documents Databricks PySpark getbit behavior and mapping to Databricks SQL; concrete API details. |
| [getbit function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/getbit) | integrations | 0.70 | Documents the getbit/bit_get function’s exact behavior and usage in Databricks SQL, which is concrete, product-specific function/API information suitable for integrations/coding patterns. |
| [h3_hexring](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/h3_hexring) | integrations | 0.70 | Function reference page with product-specific signature, argument types, return types, and behavior details for the h3_hexring PySpark API, which are not reliably known from pretraining. |
@@ -2365,9 +2376,12 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [onQueryProgress](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerylistener/onqueryprogress) | integrations | 0.70 | Method-level reference for a Databricks PySpark listener callback with specific semantics around status updates. |
| [onQueryStarted](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerylistener/onquerystarted) | integrations | 0.70 | Documents a concrete lifecycle callback method for streaming queries, part of the Databricks PySpark API surface. |
| [onQueryTerminated](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquerylistener/onqueryterminated) | integrations | 0.70 | Explains a specific termination callback (with or without error) in the Databricks PySpark listener API. |
+| [option](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/option) | integrations | 0.70 | API reference for DataFrameWriter.option with product-specific write options for Azure Databricks data sources; exposes concrete option names/behaviors that function as integration/configuration details beyond generic PySpark knowledge. |
+| [option](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/option) | integrations | 0.70 | API reference for DataFrameWriterV2.option with Databricks-specific write options; documents concrete option names/semantics for integrating with underlying Databricks data sources. |
+| [options](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/options) | integrations | 0.70 | API reference for DataFrameWriter.options showing how to set multiple output options for Databricks-backed data sources; includes specific option usage patterns that are product-specific integration details. |
| [overwrite](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/overwrite) | integrations | 0.70 | Documents a concrete writer API (overwrite with filter) specific to Databricks PySpark, including behavior details that are product-specific. |
| [overwritePartitions](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/overwritepartitions) | integrations | 0.70 | Describes Databricks-specific partition overwrite behavior equivalent to Hive INSERT OVERWRITE PARTITION, which is concrete integration behavior. |
-| [pandas UDFs](https://learn.microsoft.com/en-us/azure/databricks/udf/pandas) | integrations | 0.70 | Details pandas UDFs using Apache Arrow and pandas, including decorators (pandas_udf) and performance characteristics on Databricks. These are concrete code patterns and API usage specific to Databricks’ Spark environment. |
+| [pandas UDFs](https://learn.microsoft.com/en-us/azure/databricks/udf/pandas) | integrations | 0.70 | Details pandas UDFs using Arrow and pandas with Databricks-specific performance characteristics (up to 100x vs row UDFs) and decorator-based definitions; this is a concrete coding pattern for integrating pandas with Spark on Databricks. |
| [parse_json function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/parse_json) | integrations | 0.70 | Function reference page with product-specific syntax, return type (VARIANT), and behavior details for parse_json that go beyond generic SQL knowledge. |
| [parse_timestamp function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/parse_timestamp) | integrations | 0.70 | Function reference page with product-specific syntax, arguments, and behavior for parse_timestamp, including how string and numeric inputs are interpreted and supported format patterns—details that go beyond generic SQL knowledge. |
| [parse_url function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/parse_url) | integrations | 0.70 | Documents Databricks-specific parse_url function syntax and supported URL parts, which are product-specific API details rather than generic concepts. |
@@ -2383,6 +2397,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [range](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/range) | integrations | 0.70 | API reference for creating a DataFrame with LongType id column and start/end/step semantics; concrete method behavior. |
| [read](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/read) | integrations | 0.70 | Documents the DataFrameReader entry point; specific property and usage pattern for data ingestion. |
| [readStream](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/readstream) | integrations | 0.70 | Reference for DataStreamReader access via SparkSession; streaming-specific integration API. |
+| [read_files table-valued function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/read_files) | integrations | 0.70 | Function reference pages for Databricks SQL typically include product-specific syntax, parameters, return schema details, supported file formats, and behavioral nuances (such as how schemas are inferred, partitioning behavior, and options) that go beyond generic SQL knowledge. These are concrete API/SDK-style integration details for reading external files into Databricks SQL, fitting the integrations category better than others. |
| [read_kinesis streaming table-valued function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/read_kinesis) | integrations | 0.70 | Kinesis streaming integration; page will define TVF parameters (stream name, region, offsets, auth options) that are product-specific configuration for an external service. |
| [read_pubsub streaming table-valued function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/read_pubsub) | integrations | 0.70 | Pub/Sub streaming TVF; requires specific connector options and parameter names for Pub/Sub topics and credentials, which are integration-focused expert details. |
| [read_pulsar streaming table-valued function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/read_pulsar) | integrations | 0.70 | Pulsar integration in public preview; page will list Pulsar-specific options and TVF parameters, a product-specific integration pattern. |
@@ -2519,6 +2534,10 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [transform](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/transform) | integrations | 0.70 | Documents the transform method’s signature and usage pattern for chaining transformations in Databricks PySpark. |
| [trigger](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/trigger) | configuration | 0.70 | Explains trigger configuration, default behavior (processingTime='0 seconds'), and single-parameter constraint; this is detailed, product-specific configuration behavior. |
| [try_element_at function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_element_at) | integrations | 0.70 | Details Databricks-specific behavior for out-of-bounds indices and missing keys returning NULL instead of errors—precise function semantics. |
+| [try_ip_as_binary function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_ip_as_binary) | integrations | 0.70 | Function reference page with product-specific SQL syntax, argument types, return types, and behavior (returns NULL instead of error) for try_ip_as_binary, which are concrete API details rather than conceptual overview. |
+| [try_ip_as_string function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_ip_as_string) | integrations | 0.70 | Function reference page with Databricks-specific SQL syntax and behavior for try_ip_as_string, including how invalid input is handled (NULL), which is concrete API behavior not generally known. |
+| [try_ip_cidr function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_ip_cidr) | integrations | 0.70 | Documents the try_ip_cidr function with exact SQL signature and behavior for IPv4/IPv6 CIDR normalization and error handling, which are product-specific function semantics. |
+| [try_ip_host function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_ip_host) | integrations | 0.70 | Provides Databricks SQL function details for try_ip_host, including canonicalization rules and NULL-on-error behavior, which are specific API semantics beyond generic knowledge. |
| [try_parse_timestamp function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_parse_timestamp) | integrations | 0.70 | Describes Databricks-specific parsing rules, list-of-formats behavior, numeric Unix timestamp handling, and NULL-on-failure semantics—detailed API behavior. |
| [try_to_binary](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/try_to_binary) | integrations | 0.70 | Describes Databricks PySpark function behavior on conversion failures (NULL vs error) and relation to SQL function, which is product-specific. |
| [try_to_geography](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/try_to_geography) | integrations | 0.70 | Function reference page with product-specific behavior (input types BINARY/string, return type Geography, null-handling semantics, and preview status). This is concrete API behavior rather than conceptual overview, fitting integrations/coding patterns best among the available categories. |
@@ -2597,13 +2616,16 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [DROP LOCATION](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-drop-location) | configuration | 0.68 | DROP EXTERNAL LOCATION syntax and MANAGE privilege requirements are specific to Databricks. |
| [DROP PROCEDURE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-drop-procedure) | configuration | 0.68 | DROP PROCEDURE syntax and privilege/ownership rules are Databricks-specific. |
| [Debugging using the Spark UI](https://learn.microsoft.com/en-us/azure/databricks/compute/troubleshooting/debugging-spark-ui) | troubleshooting | 0.68 | Troubleshooting-focused article for Spark applications using UI and logs; likely organized around symptoms and diagnostic steps specific to Databricks Spark UI. |
+| [Deploy apps](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/deploy) | deployment | 0.68 | The page focuses on how to deploy Databricks Apps, including behavior differences (automatic deployment from templates, redeploy after changes) and likely product-specific deployment commands/constraints. This is deployment-focused expert knowledge beyond generic 'run a build' instructions. |
| [Distributed fine-tune Llama-3.2-3B with Unsloth](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-finetune-llama-unsloth-distributed) | integrations | 0.68 | Notebook-style tutorial for fine-tuning Llama-3.2-3B on Azure Databricks Serverless GPU with Unsloth and multiple GPUs. It necessarily includes concrete, product-specific code patterns and configuration parameters (e.g., serverless_gpu, Unsloth setup, multi-GPU configuration) that go beyond generic LLM training knowledge, fitting the integrations & coding patterns category. |
| [ETL and image processing](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/reference-solutions/images-etl-inference) | best-practices | 0.68 | Provides a reference solution with concrete patterns (pandas UDFs, PyTorch, TensorFlow) for large-scale image inference on Databricks, which are practical, environment-specific best practices. |
+| [Embedding for external users](https://learn.microsoft.com/en-us/azure/databricks/dashboards/share/embedding/external-embed) | security | 0.68 | Describes product-specific security configuration for embedding dashboards for external users, including use of service principals, scoped access tokens, and workspace settings. This is concrete IAM/auth configuration unique to Azure Databricks embedding rather than a generic overview. |
| [Enrich traces with tags & metadata](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/attach-tags/) | integrations | 0.68 | Contains product-specific API usage and field semantics (mutable tags vs immutable metadata) for MLflow Tracing on Databricks, including concrete key/value patterns and how they affect search and organization—details that go beyond generic tagging concepts. |
-| [H3 geospatial functions](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-h3-geospatial-functions) | integrations | 0.68 | The page is a detailed SQL language reference for H3 geospatial functions in Databricks SQL, containing product-specific function names, signatures, parameter semantics, and behavior that go beyond generic geospatial knowledge. These are concrete API/SQL integration details unique to Databricks SQL rather than conceptual geospatial theory, fitting the integrations & coding patterns category best. |
+| [Fine-tune Llama 3.2 1B with LoRA](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-sft-trl-deepspeed-llama-1b) | integrations | 0.68 | Notebook-style tutorial with concrete, product-specific code and configuration for integrating Databricks AI Runtime, TRL, DeepSpeed ZeRO Stage 3, and LoRA on a specific GPU setup. Contains detailed parameter choices and patterns for this integration scenario rather than just conceptual guidance. |
| [HIPAA](https://learn.microsoft.com/en-us/azure/databricks/security/privacy/hipaa) | security | 0.68 | Lists Databricks HIPAA compliance controls and workspace implications, which are specific to the service and not generic HIPAA guidance. |
| [HITRUST](https://learn.microsoft.com/en-us/azure/databricks/security/privacy/hitrust) | security | 0.68 | Documents Databricks HITRUST compliance controls and how they apply to workspaces, representing product-specific security/compliance configuration.
| | [IRAP](https://learn.microsoft.com/en-us/azure/databricks/security/privacy/irap) | security | 0.68 | Describes Databricks IRAP compliance controls and workspace behavior, which is specialized security/compliance information for this product. | +| [Manage workspace-local groups (legacy)](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/workspace-local-groups) | security | 0.68 | Describes legacy workspace-local group management, which is specific to Databricks’ IAM model and not generic knowledge. | | [Manual tracing](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/app-instrumentation/manual-tracing/) | best-practices | 0.68 | Guides when and how to use manual tracing for production-ready observability, including concrete code patterns and recommendations specific to MLflow Tracing, fitting product-specific best practices. | | [Migrate](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/migrate) | decision-making | 0.68 | Focuses on migrating from Databricks CLI versions 0.18 and below to 0.205+, a product-specific upgrade scenario. It likely covers when and how to move, differences between versions, and concrete migration steps that an LLM would not reliably know from training, matching the decision-making category’s migration/upgrade path criterion. | | [Migrate to Auto Loader with file events](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/migrating-to-file-events) | best-practices | 0.68 | Migration guidance typically includes stepwise changes, required option updates, and edge-case handling when switching modes. These are product-specific operational patterns and gotchas, fitting best-practices 
| @@ -2615,9 +2637,9 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [SHOW CONNECTIONS](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-show-connections) | integrations | 0.68 | Describes Databricks-specific SHOW CONNECTIONS command, including supported runtime versions and listing semantics. | | [SHOW VOLUMES](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-show-volumes) | configuration | 0.68 | Unity Catalog-specific command for listing volumes accessible to the current user with optional pattern filtering. | | [Serverless](https://learn.microsoft.com/en-us/azure/databricks/ldp/serverless) | configuration | 0.68 | Specifically covers configuring Lakeflow Spark Declarative Pipelines to use serverless compute and Unity Catalog, which implies product-specific options and settings (for example, selecting serverless vs classic compute and binding to Unity Catalog) that go beyond generic knowledge. It is a configuration-focused guide rather than a conceptual overview, so it best fits the configuration sub-skill. | +| [Supervisor API](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/supervisor-api) | integrations | 0.68 | Describes a specific OpenResponses-compatible endpoint (POST /mlflow/v1/responses) and how Supervisor API orchestrates model and tool calls. This is concrete, product-specific API behavior and request pattern that goes beyond generic LLM knowledge, fitting integrations & coding patterns best. 
| | [Tutorial: Run code on serverless compute](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/scala/jar-compile) | integrations | 0.68 | Tutorial content, but for Databricks Connect for Scala it typically includes product-specific build settings (for example, required Scala/Spark versions, Maven/SBT coordinates, Unity Catalog / serverless compatibility flags, and packaging details for JAR tasks). These are concrete integration/configuration patterns unique to Databricks rather than generic Scala usage. | | [UPDATE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-update) | integrations | 0.68 | Provides Databricks SQL UPDATE syntax and behavior restricted to Delta tables, which are product-specific SQL surface details useful for coding against the service. | -| [Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/ldp/unity-catalog) | configuration | 0.68 | Describes how Lakeflow Spark Declarative Pipelines behave when configured with Unity Catalog, including default behavior for new pipelines, how materialized views and streaming tables are published to specific catalogs and schemas, and how permissions are managed with GRANT/REVOKE. This is product-specific configuration/behavior detail rather than generic concepts. | | [Volumes](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-volumes) | configuration | 0.68 | Describes Unity Catalog volumes as objects, their behavior and usage constraints for managing non-tabular data, which is Databricks-specific configuration surface. | | [array_distinct function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_distinct) | integrations | 0.68 | Even though conceptually simple, the page will define exact syntax, type behavior, and edge cases for array_distinct in Databricks SQL, which are concrete API details rather than generic concepts. 
| | [array_insert function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array_insert) | integrations | 0.68 | Covers exact syntax and behavior (index handling, return type) of array_insert in Databricks SQL/Runtime 13.3+, which are specific coding patterns for this platform. | @@ -2627,7 +2649,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [currentCatalog](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/currentcatalog) | integrations | 0.68 | Documents a specific PySpark Catalog method and its return behavior, which is product-specific API knowledge. | | [currentDatabase](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/currentdatabase) | integrations | 0.68 | Method reference for currentDatabase is concrete SDK behavior for Databricks PySpark integration. | | [event_log table-valued function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/event_log) | integrations | 0.68 | The page documents a specific Databricks SQL table-valued function (event_log) with product-specific behavior and constraints, including who can call it (ownership requirement) and how it exposes event logs for materialized views, streaming tables, and Lakeflow pipelines. This is detailed, product-specific function behavior that an LLM is unlikely to know from training and is closest to an integration/coding pattern for querying internal event logs. | -| [explode (TVF)](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/tvf-explode) | integrations | 0.68 | The page documents a specific Azure Databricks PySpark table-valued function, including its behavior (how arrays/maps are expanded into rows) and default column names (iscol, key, value). These are product- and API-specific details that are not generic PySpark knowledge and qualify as expert integration/coding pattern knowledge. 
It is not about limits, architecture, or configuration, but about a concrete API surface and its semantics. | +| [explode (TVF)](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/tvf-explode) | integrations | 0.68 | The page documents a specific Azure Databricks PySpark table-valued function, including its behavior (how arrays/maps are expanded into rows) and default column names (col, key, value). These are product- and API-specific details that go beyond generic LLM knowledge and are needed to correctly integrate and use this function in code. It fits best under integrations & coding patterns because it is a concrete API reference for a Databricks-specific PySpark function rather than general configuration, limits, or architecture guidance. | | [kurtosis aggregate function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kurtosis) | integrations | 0.68 | Reference for the kurtosis aggregate function in Databricks SQL/Runtime, including exact function name, arguments, and behavior, which is product-specific. | | [listCatalogs](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/listcatalogs) | integrations | 0.68 | API behavior for listing catalogs in a session is product-specific integration information. | | [listDatabases](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/catalog/listdatabases) | integrations | 0.68 | Describes method returning databases across sessions; concrete SDK usage detail. | @@ -2672,12 +2694,11 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Adaptive query execution](https://learn.microsoft.com/en-us/azure/databricks/optimizations/aqe) | best-practices | 0.65 | Describes Databricks AQE behavior, including how it re-optimizes queries at runtime and handles skew joins, with product-specific tuning implications. 
| | [Advanced techniques](https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/advanced-techniques) | integrations | 0.65 | Describes product-specific syntax and patterns (window measures, composability) for Databricks metric views, including concrete YAML/code expressions and behaviors that are unique to this feature, fitting integrations & coding patterns more than generic concepts. | | [Align judges with humans](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/align-judges) | best-practices | 0.65 | Describes a concrete three-step workflow and SIMBA-based optimization strategy to align judges with human standards, including quantified impact (30–50% improvement), which is product-specific best-practice guidance. | -| [Analyze the billing logs](https://learn.microsoft.com/en-us/azure/databricks/admin/usage/system-tables) | configuration | 0.65 | Describes the billable usage system table and how to join it with other system tables. This is detailed, product-specific schema/usage information that functions as configuration/metadata for cost monitoring. | | [Anomalo](https://learn.microsoft.com/en-us/azure/databricks/partners/data-governance/anomalo) | integrations | 0.65 | Describes how to integrate Anomalo with Databricks clusters and SQL warehouses, including connector-specific configuration. | | [App Version Tracking](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/prompt-version-mgmt/version-tracking/version-concepts) | configuration | 0.65 | API-focused description of LoggedModel and version tracking; likely includes specific fields, behaviors, and usage patterns for versioned GenAI apps. 
| | [Apps](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/apps-resource) | configuration | 0.65 | Adding another Databricks app as a resource likely involves specific configuration fields and environment variables for app-to-app communication, which are configuration details unique to this product. | | [AtScale](https://learn.microsoft.com/en-us/azure/databricks/partners/semantic-layer/atscale) | integrations | 0.65 | Explains how to connect AtScale to Databricks SQL warehouses and clusters with product-specific integration details. | -| [Author agents on Databricks Apps](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/author-agent) | deployment | 0.65 | Covers authoring an AI agent and deploying it on Databricks Apps, including control over server configuration and deployment workflow. This implies Databricks-specific deployment patterns and requirements (e.g., how agents are hosted, app templates, deployment behavior), fitting the deployment sub-skill. | +| [Audit data sharing](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/audit-logs) | security | 0.65 | Covers how providers and recipients use audit logs to monitor Delta Sharing events. Audit log event types and their meanings are product-specific security and compliance details that qualify as expert knowledge. | | [Author agents using Model Serving](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/author-agent-model-serving) | deployment | 0.65 | Describes authoring agents in Python and deploying to Databricks Model Serving; such pages typically include endpoint configuration, deployment parameters, and product-specific patterns not known generically. 
| | [Author bundles in the workspace](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/workspace-author) | configuration | 0.65 | Focuses on authoring bundles in workspace; includes specific UI-driven configuration flows and constraints for bundle files and targets. | | [Automate Unity Catalog setup using Terraform](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/terraform/automate-uc) | integrations | 0.65 | Unity Catalog automation via Databricks Terraform provider with deployment requirements and validation tips. While partly linking out, it encodes product-specific integration steps and constraints for this service. | @@ -2685,7 +2706,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Autoscaling by default](https://learn.microsoft.com/en-us/azure/databricks/oltp/upgrade-to-autoscaling) | decision-making | 0.65 | Explains the autoscaling-by-default rollout, when new instances become Autoscaling projects, and how this affects creation options and integrations. This is product-specific decision guidance about which Lakebase mode you get and under what rollout conditions, helping users understand behavior changes and choose or plan migration paths, which aligns best with decision-making. | | [Avro files](https://learn.microsoft.com/en-us/azure/databricks/query/formats/avro) | integrations | 0.65 | Documents the Avro data source support and usage in Databricks, which is a concrete integration pattern with this file format. | | [Azure DevOps](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/azure-devops) | deployment | 0.65 | Configures Azure DevOps pipelines to build, test, and deploy Python wheels to Databricks; includes Databricks-specific CI/CD pipeline configuration details. 
| -| [Backfill historical data](https://learn.microsoft.com/en-us/azure/databricks/ldp/flows-backfill) | best-practices | 0.65 | Focuses on creating specialized backfill flows; likely includes recommended patterns, caveats, and product-specific approaches for backfilling in Lakeflow pipelines. | | [CI/CD with Git folders](https://learn.microsoft.com/en-us/azure/databricks/repos/ci-cd) | deployment | 0.65 | Describes how to use Databricks Git folders in CI/CD flows, integrating source control with Databricks jobs. This is a Databricks-specific deployment/CI-CD pattern rather than generic Git usage. | | [COMPUTE STATISTICS](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-analyze-compute-statistics) | best-practices | 0.65 | Includes product-specific recommendation to enable predictive optimization for Unity Catalog managed tables, tied to a specific command and behavior; actionable performance guidance. | | [CREATE BLOOMFILTER INDEX (deprecated)](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-create-bloomfilter-index) | integrations | 0.65 | Although deprecated, this page is a detailed syntax reference for CREATE BLOOMFILTER INDEX in Databricks SQL, including product-specific options and behavior and explicit migration guidance to predictive I/O or liquid clustering. This is expert, product-specific SQL integration knowledge rather than conceptual content. | @@ -2696,19 +2716,22 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Classification with the API](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl/classification-train-api) | integrations | 0.65 | API article for starting classification runs; likely lists function parameters and options unique to Databricks AutoML. 
| | [Clean and validate](https://learn.microsoft.com/en-us/azure/databricks/transform/validate) | best-practices | 0.65 | Provides Databricks product-specific patterns and recommendations for implementing data quality rules in batch and streaming workloads. | | [Clone](https://learn.microsoft.com/en-us/azure/databricks/delta/clone) | configuration | 0.65 | Details clone command behavior (deep vs shallow), versioned cloning, and cross-format cloning; these are product-specific DDL semantics. | +| [Cluster events API pagination changes](https://learn.microsoft.com/en-us/azure/databricks/compute/events-api-updates) | integrations | 0.65 | Describes a product-specific API behavior change with concrete field names being deprecated (limit, offset, total_count, next_page) and their replacement with token-based pagination fields on a specific date. This is detailed, time-bound API contract knowledge that an LLM is unlikely to infer from training data and is directly relevant to how clients must integrate with the Databricks cluster_events API. | | [Code examples](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/examples) | integrations | 0.65 | Provides concrete code examples for Databricks Connect for Python, showing API/SDK usage patterns, method signatures, and parameters specific to this integration between local IDEs and Azure Databricks clusters. | | [Code examples](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/scala/examples) | integrations | 0.65 | Scala-focused article with concrete Databricks Connect code examples, demonstrating product-specific APIs, method calls, and parameters for integrating Scala applications with Azure Databricks clusters. 
| | [Common table expression (CTE)](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-cte) | integrations | 0.65 | Describes Databricks-specific common table expression syntax and behavior, which are concrete language details. | | [Compare multiple model types](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl-hyperparam-tuning/hyperopt-model-selection) | integrations | 0.65 | Demonstrates Hyperopt-driven model selection with MLflow tracking, involving Databricks-specific integration patterns and configurations. | -| [Compatibility Mode](https://learn.microsoft.com/en-us/azure/databricks/external-access/compatibility-mode) | configuration | 0.65 | Describes a Databricks-only feature that auto-generates read-only table versions for external clients; includes specific behavior and constraints. | +| [Compatibility Mode](https://learn.microsoft.com/en-us/azure/databricks/external-access/compatibility-mode) | integrations | 0.65 | Describes how Compatibility Mode exposes read-only versions of Unity Catalog tables to external Delta/Iceberg clients, with Databricks-specific configuration and access patterns. This is an integration-focused pattern with product-specific behavior, not just conceptual content. | | [Computes](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-computes) | best-practices | 0.65 | Page is about sizing, scaling, and performance optimization for Lakebase Postgres computes. This typically includes concrete product-specific recommendations (instance sizes, scaling behaviors, configuration patterns) that qualify as best practices rather than generic theory. 
| -| [Configure compute](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/compute-size) | decision-making | 0.65 | Choosing instance size for apps to match workload and cost implies guidance on when to pick which size; likely includes Databricks-specific thresholds or recommendations, fitting decision-making. | +| [Configure compute](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/compute-size) | decision-making | 0.65 | A page about configuring instance size for Databricks Apps is likely to include SKU/size options, CPU/memory values, and guidance on which size to choose for different workloads. That is expert decision guidance with quantified trade-offs (cost vs. performance) and tier selection, fitting decision-making. | +| [Configure diagnostic log delivery](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/audit-log-delivery) | configuration | 0.65 | Contains specific configuration steps and required Azure permissions (including exact action name) for enabling log delivery—product-specific configuration details. | +| [Configure endpoints](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/configure-endpoints-beta) | configuration | 0.65 | Page is specifically about configuring AI Gateway endpoints; likely includes endpoint setting names, allowed values, and possibly tables of configuration parameters unique to this product. | | [Configuring for production](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/production) | best-practices | 0.65 | Production configuration guidance for Auto Loader is typically product-specific (checkpointing, trigger intervals, schema evolution, scaling patterns) and goes beyond generic streaming theory, giving concrete recommendations for running Auto Loader at scale. 
| | [Connect a workspace to an on-premises network](https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/on-prem-network) | configuration | 0.65 | Shows how to establish connectivity via a hub-and-spoke topology and transit VNet. Contains concrete network configuration patterns specific to Databricks deployments. | | [Connect and query](https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/query/) | integrations | 0.65 | Outlines ways to work with Lakebase instances; likely includes connection strings, driver settings, and environment-specific parameters, which are integration and configuration patterns beyond generic Postgres usage. | -| [Connect to AI Runtime](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/connecting) | configuration | 0.65 | Explains how to connect to AI Runtime from notebooks, scheduled jobs, and the Jobs API, which typically involves specifying cluster/runtime identifiers, job configuration fields, and API parameters that are product-specific configuration details. | | [Connect to Infoworks](https://learn.microsoft.com/en-us/azure/databricks/archive/partners/infoworks) | integrations | 0.65 | Partner integration article for Infoworks on Databricks generally contains concrete setup steps, connection parameters, and product-specific configuration patterns, which qualify as integrations-focused expert knowledge despite being retired. | | [Connect to cloud storage (legacy)](https://learn.microsoft.com/en-us/azure/databricks/archive/storage/connect-storage-index) | integrations | 0.65 | Legacy storage access patterns usually document storage mount commands, configuration keys, and connection parameters for different cloud object stores. 
| +| [Connect to managed ingestion sources (Lakeflow Connect)](https://learn.microsoft.com/en-us/azure/databricks/connect/managed-ingestion) | configuration | 0.65 | Explains how to create and use connection objects in Catalog Explorer for Lakeflow Connect managed ingestion, including USE CONNECTION privileges and authentication storage. This is product-specific configuration of connections and permissions, likely with concrete settings and roles, fitting the configuration sub-skill. | | [Connect with PgHero](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/connect-pghero) | integrations | 0.65 | Shows how to connect PgHero to Lakebase Postgres for performance monitoring. This is a concrete integration pattern between a specific monitoring tool and Lakebase. | | [Connect with psql](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/connect-psql) | integrations | 0.65 | Describes how to use the PostgreSQL psql client specifically with Lakebase, which likely includes Lakebase-specific connection parameters or patterns beyond generic psql usage. | | [Cost-based optimizer](https://learn.microsoft.com/en-us/azure/databricks/optimizations/cbo) | best-practices | 0.65 | Explains how CBO works in Spark SQL on Databricks and the need for up-to-date stats, with Databricks-specific guidance on using and benefiting from CBO. | @@ -2716,7 +2739,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Create GPU-enabled compute](https://learn.microsoft.com/en-us/azure/databricks/compute/gpu) | decision-making | 0.65 | Covers when to use GPU-enabled compute, requirements, and how to create them; this is service-specific selection guidance between GPU and non-GPU compute. 
|
| [Create and edit metric views](https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/create-edit) | configuration | 0.65 | Covers creating/editing metric views via UI and SQL; likely includes specific SQL syntax, YAML schema, and field options that constitute product-specific configuration knowledge. |
| [Create generative AI model serving endpoints](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/create-foundation-model-endpoints) | integrations | 0.65 | Covers formatting scoring requests and sending them to model serving endpoints. This typically includes request body schemas, parameter names, and endpoint-specific options—product-specific integration and coding patterns. |
-| [Create materialized views in Databricks SQL](https://learn.microsoft.com/en-us/azure/databricks/ldp/dbsql/materialized) | integrations | 0.65 | How-to/reference for creating, refreshing, and querying Databricks materialized views in Databricks SQL. Likely includes product-specific SQL syntax, options, and operational behaviors that function as API/DSL parameters, which go beyond generic materialized view concepts. |
+| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-pipeline) | security | 0.65 | Explicitly states the managed SharePoint connector supports use in workspaces with 'Configure enhanced security and compliance settings' enabled. This is a product-specific security/compliance capability detail that LLMs are unlikely to know from training and maps to security configuration behavior. |
| [Create or modify a table using file upload](https://learn.microsoft.com/en-us/azure/databricks/ingestion/create-or-modify-table) | configuration | 0.65 | Describes a specific Databricks UI feature with constraints (file types, small file size, managed Delta tables in Unity Catalog or Hive metastore). These are concrete product-specific configuration/usage details. |
| [Create pipelines with dlt-meta](https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/dlt-meta) | configuration | 0.65 | Describes using dlt-meta to generate pipelines from JSON metadata; likely includes metadata schema/fields and configuration patterns unique to this tool and product. |
| [Create schemas](https://learn.microsoft.com/en-us/azure/databricks/schemas/create-schema) | configuration | 0.65 | Shows concrete SQL/UX steps and options for creating schemas in both Unity Catalog and legacy Hive metastore; includes product-specific commands and behavior differences. |
@@ -2730,19 +2753,20 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [Data skipping](https://learn.microsoft.com/en-us/azure/databricks/delta/data-skipping) | best-practices | 0.65 | Contains Databricks-specific recommendations on how to leverage data skipping (stats columns, Z-order, OPTIMIZE) and when they are effective, including interactions with liquid clustering. These are concrete, product-specific performance practices rather than generic concepts. |
| [DataFrames and tables](https://learn.microsoft.com/en-us/azure/databricks/sparkr/dataframes-tables) | integrations | 0.65 | Shows how to work with R data.frames, Spark DataFrames, and tables using SparkR, sparklyr, and dplyr in Azure Databricks. Contains product-specific code patterns for integrating these R libraries with Databricks Spark, beyond generic R or Spark knowledge. |
| [DataSourceWriter class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcewriter) | integrations | 0.65 | Describes a Databricks Runtime–specific base class for custom data source writers and how it is returned from DataSource.writer(); this is concrete API surface and lifecycle behavior for integrations. |
-| [Databricks Apps](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/databricks-apps) | integrations | 0.65 | Explains how Databricks Apps integrate with Lakebase, including automatic creation of service principals and matching Postgres roles. This is a product-specific integration pattern between Databricks Apps and Lakebase. |
| [Databricks Light](https://learn.microsoft.com/en-us/azure/databricks/archive/runtime/light) | decision-making | 0.65 | The page explains what Databricks Light does not support and when it is appropriate for jobs that do not need advanced features. That is product-specific guidance on when to choose this runtime versus full Databricks Runtime, fitting decision-making. |
| [Databricks Runtime ML maintenance policy](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/databricks-runtime-ml-maintenance) | limits-quotas | 0.65 | Maintenance policy for Runtime ML libraries; typically includes version support windows, update cadence, and deprecation timelines—effectively limits/constraints on supported library versions. |
| [Databricks online tables (legacy)](https://learn.microsoft.com/en-us/azure/databricks/archive/machine-learning/feature-store/online-tables) | limits-quotas | 0.65 | Online tables have region-specific availability, deprecation timelines (not accessible after a specific date), and pricing references, which are concrete product limits and lifecycle constraints. |
| [Dates and timestamps](https://learn.microsoft.com/en-us/azure/databricks/archive/spark-3.x-migration/dates-timestamps) | configuration | 0.65 | Explains significant changes to Date and Timestamp types in Databricks Runtime 7.0, including behavior and formatting differences, which are product-specific configuration/behavior details. |
| [Debug notebooks](https://learn.microsoft.com/en-us/azure/databricks/notebooks/debugger) | troubleshooting | 0.65 | Describes the Databricks interactive debugger (breakpoints, variable inspection, step execution) which is a product-specific debugging and troubleshooting workflow. |
| [Declarative Automation Bundles extension features](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/bundles) | deployment | 0.65 | Covers defining, deploying, and running Declarative Automation Bundles for Lakeflow Jobs, Spark Declarative Pipelines, and MLOps Stacks. This is CI/CD-focused, with Databricks-specific deployment patterns and bundle behavior that go beyond generic deployment tutorials. |
+| [Declarative features](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/declarative-apis) | integrations | 0.65 | Describes concrete workflows and API-based patterns for defining and computing features from Delta tables and request-time data; this is product-specific integration/coding guidance beyond generic ML concepts. |
+| [Delete a workspace](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/delete-workspace) | deployment | 0.65 | Covers workspace deletion behavior, including which resources are automatically cleaned up and when to force delete DBFS storage and access connectors, which are product-specific lifecycle/deployment details. |
| [Deploy agents](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/deploy-agent) | deployment | 0.65 | Covers deploying agents via the Agent Framework deploy() function on Mosaic AI Model Serving. This is product-specific deployment behavior and likely includes endpoint configuration details and constraints unique to Databricks Model Serving. |
| [Deploy batch inference pipelines](https://learn.microsoft.com/en-us/azure/databricks/large-language-models/batch-inference-pipelines) | deployment | 0.65 | Covers deploying batch inference pipelines as scheduled workflows and structured streaming; such a page typically includes Databricks-specific deployment patterns, workflow/streaming constraints, and possibly schedule/throughput details that go beyond generic deployment knowledge. |
| [Deploy models for inference](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-inference/) | decision-making | 0.65 | Described as what Databricks recommends for batch inference, which implies guidance on selecting among Databricks batch inference options and patterns; this is decision-making content with product-specific recommendations rather than generic ML theory. |
-| [Designated Services](https://learn.microsoft.com/en-us/azure/databricks/resources/designated-services) | security | 0.65 | Describes Designated Services, Geos, and cross-Geo processing for data residency; this is product-specific security/compliance configuration around where customer content is processed. |
-| [Disaster recovery overview](https://learn.microsoft.com/en-us/azure/databricks/admin/disaster-recovery) | architecture-patterns | 0.65 | Covers Databricks-specific disaster recovery patterns and how the service behaves in regional outages as part of a broader data ecosystem; this is product-specific DR architecture guidance beyond generic DR concepts. |
+| [Disaster recovery overview](https://learn.microsoft.com/en-us/azure/databricks/admin/disaster-recovery) | architecture-patterns | 0.65 | Focuses on DR planning for Databricks within a broader data ecosystem; likely includes Databricks-specific DR patterns and guidance on when to use them, which is architecture and pattern guidance specific to the service. |
| [Distributed deep learning with horovod.spark](https://learn.microsoft.com/en-us/azure/databricks/archive/machine-learning/train-model/horovod-spark) | integrations | 0.65 | Describes how to use the horovod.spark package with Databricks for distributed training, including product-specific APIs and integration behavior with Spark clusters. |
+| [Distributed training](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/distributed-training) | integrations | 0.65 | Describes the Serverless GPU Python API for distributed training, a product-specific integration/coding pattern with unique API usage and behavior for multi-GPU workloads. |
| [Distributed training algorithms with Hyperopt](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl-hyperparam-tuning/hyperopt-distributed-ml) | integrations | 0.65 | Shows how to combine Hyperopt with HorovodRunner and MLflow on Databricks, which is a product-specific integration pattern. |
| [Distributed training for Spark ML models](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/distributed-training/distributed-ml-for-spark-connect) | integrations | 0.65 | Example-focused article on pyspark.ml.connect for distributed training and inference via Databricks Connect; likely documents module-specific APIs and patterns not generally known. |
| [Distributed training with TensorFlow 2](https://learn.microsoft.com/en-us/azure/databricks/archive/machine-learning/train-model/spark-tf-distributor) | integrations | 0.65 | Covers Databricks-specific usage of spark-tensorflow-distributor built on tf.distribute.Strategy, including API usage and parameters for running distributed TensorFlow on Spark clusters, which are concrete integration patterns. |
@@ -2751,18 +2775,14 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [ElasticSearch](https://learn.microsoft.com/en-us/azure/databricks/archive/connectors/elasticsearch) | integrations | 0.65 | Elasticsearch connector usage involves index URL formats and Spark connector options specific to this integration. |
| [Embed apps](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/embed) | configuration | 0.65 | Explains embedding apps via HTML iframe with specific src URL; likely includes URL patterns and parameters unique to Databricks app embedding, which is configuration-focused. |
| [Evaluation datasets](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/concepts/eval-datasets) | configuration | 0.65 | Describes evaluation dataset schema and references specific methods/classes; likely includes field names and structures that are product-specific configuration details. |
+| [Example stateful applications](https://learn.microsoft.com/en-us/azure/databricks/stateful-applications/examples) | integrations | 0.65 | Contains concrete code patterns using Databricks Runtime–specific transformWithState APIs (row-based and Pandas-based) for custom stateful applications; these are product-specific coding patterns and API usages that qualify as integrations & coding patterns. |
| [Examples](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/examples) | integrations | 0.65 | Provides working code examples for real-time mode with Kafka sources/sinks and custom sinks. These examples typically include product-specific API/SDK parameters and configuration patterns for integrations, which qualify as expert integration knowledge. |
-| [Excel](https://learn.microsoft.com/en-us/azure/databricks/query/formats/excel) | configuration | 0.65 | Reading Excel via built-in format support usually involves format-specific options (schema inference flags, header handling, streaming options) that are configuration details unique to Databricks’ Excel support. |
-| [FAQ](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-faq) | troubleshooting | 0.65 | FAQ for a specific connector usually includes concrete behaviors, edge cases, and resolutions for connector-specific issues that go beyond generic concepts. |
| [FAQ](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-faq) | troubleshooting | 0.65 | Connector FAQ is likely to include specific behaviors, edge cases, and resolutions for SQL Server connector usage that are not generic knowledge. |
| [FLOAT type](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/float-type) | limits-quotas | 0.65 | FLOAT type docs specify it as 4-byte single-precision with associated numeric limits, which are explicit size/precision constraints. |
-| [Fabric](https://learn.microsoft.com/en-us/azure/databricks/partners/bi/fabric) | integrations | 0.65 | Product-specific integration between Fabric and Unity Catalog with concrete connection configuration details not covered by generic knowledge. |
| [Feature Serving endpoints](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/feature-function-serving) | configuration | 0.65 | Describes how to set up and use Feature Serving endpoints, which typically involves endpoint configuration options (scaling behavior, request/response schema, possibly feature table bindings). This is product-specific configuration for serving structured data to external apps, not just a conceptual overview. |
| [Feature compatibility](https://learn.microsoft.com/en-us/azure/databricks/delta/feature-compatibility) | decision-making | 0.65 | Compares protocol versions and feature compatibility across clients; used to decide which protocol/features to enable based on reader/writer support, a product-specific compatibility and selection guide. |
| [Feedback model (deprecated)](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/feedback-model) | configuration | 0.65 | Describes deprecated feedback model and required migration to MLflow Traces; likely includes API names and configuration changes specific to Databricks agents. |
-| [File metadata column](https://learn.microsoft.com/en-us/azure/databricks/ingestion/file-metadata-column) | best-practices | 0.65 | Describes behavior of the hidden _metadata column, including precedence when a user column has the same name and warnings about future field additions. These are product-specific behavioral details and gotchas that qualify as expert knowledge and best-practice usage patterns. |
| [Fine-tune Qwen2-0.5B](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-finetune-qwen2-0.5b) | integrations | 0.65 | Notebook-style tutorial for a specific model on AI Runtime likely includes concrete code, library parameters, and Databricks-specific integration patterns (e.g., checkpointing, logging) that are product-specific coding patterns. |
-| [Flow examples](https://learn.microsoft.com/en-us/azure/databricks/ldp/flow-examples) | configuration | 0.65 | Provides concrete examples of flow definitions and options, which are detailed configuration patterns for this product. |
| [ForEachBatch sink](https://learn.microsoft.com/en-us/azure/databricks/ldp/for-each-batch) | integrations | 0.65 | Explains the ForEachBatch sink API for Lakeflow Spark Declarative Pipelines, including its relation to Structured Streaming foreachBatch and how to write to arbitrary sinks that don’t support streaming. This is a product-specific streaming integration/coding pattern rather than generic streaming guidance. |
| [Forecasting with the API](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl/forecasting-train-api) | integrations | 0.65 | API reference-style article for forecasting runs; likely includes method signatures and parameter options specific to Databricks AutoML. |
| [Full refresh](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/full-refresh) | best-practices | 0.65 | Describes behavior of full refresh, pipeline phases, and automatic retries during Initializing/Resetting tables, including failure behavior. This is detailed, product-specific operational guidance. |
@@ -2772,19 +2792,21 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [Genie spaces](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/genie) | configuration | 0.65 | Adding Genie spaces as resources for natural language querying likely requires specific configuration parameters and environment variables, which are product-specific configuration patterns. |
| [Geographies](https://learn.microsoft.com/en-us/azure/databricks/resources/databricks-geos) | security | 0.65 | Explains how Geos control where customer content is processed, impacting compliance and data residency configuration. |
| [Git folders overview](https://learn.microsoft.com/en-us/azure/databricks/repos/) | configuration | 0.65 | Describes Git folders behavior and integration semantics unique to Databricks, beyond generic Git usage. |
-| [Governed tags](https://learn.microsoft.com/en-us/azure/databricks/admin/governed-tags/) | security | 0.65 | Governed tags automatically enforce policies; the page includes security-related warnings and describes how tags affect permissions, which is product-specific governance and security behavior. |
| [Grant access to shares](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/grant-access) | security | 0.65 | Describes how providers grant, view, update, and revoke access to Delta Sharing shares. This typically involves product-specific permission models, roles, or access control constructs in Unity Catalog, which qualify as security configuration details. |
| [GroupedData class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/groupeddata) | integrations | 0.65 | Class reference for GroupedData with Databricks-specific behavior and supported methods, which are concrete API details rather than conceptual content. |
+| [Groups](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/groups) | security | 0.65 | Explains Databricks groups as securable-object access constructs; this is product-specific identity and access model information. |
| [Guides](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/guides) | troubleshooting | 0.65 | Aggregates migration and troubleshooting information; such pages typically link or embed specific error patterns and resolutions for AI Runtime, which are product-specific troubleshooting details. |
| [High availability](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/high-availability) | architecture-patterns | 0.65 | Describes primary/secondary compute pairing across availability zones and automatic failover behavior, a Lakebase-specific HA pattern. |
| [High availability](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-high-availability) | configuration | 0.65 | High availability management for Lakebase endpoints typically includes specific settings (HA flags, secondary compute behavior, connection string patterns); the page likely documents product-specific HA configuration options and how to obtain/read-only connection strings. |
| [Hive tables](https://learn.microsoft.com/en-us/azure/databricks/archive/legacy/hive-tables) | integrations | 0.65 | Shows how to define external tables pointing at cloud storage, including table DDL, path formats, and compatibility details specific to Databricks. |
-| [Host a custom MCP server](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/custom-mcp) | configuration | 0.65 | Describes how to host custom/third-party MCP servers as Databricks apps, controlled via Databricks Apps permissions and monitored via AI Gateway. Likely includes app configuration parameters, permission settings, and wiring to MCP, which are product-specific configuration details. |
+| [Host a custom MCP server](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/custom-mcp) | security | 0.65 | Access to custom MCP servers is controlled via Databricks Apps permissions; the full content will detail permission scopes/roles and how they apply to MCP servers, which is product-specific security configuration. |
| [IDENTIFIER clause](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-names-identifier-clause) | integrations | 0.65 | Documents a product-specific SQL feature (IDENTIFIER clause) with concrete usage rules and constraints (e.g., allowed argument types, runtime version applicability, replacement of object names). This is detailed syntax/behavior knowledge that is not generic SQL and is unlikely to be fully known from pretraining. Best fits integrations & coding patterns because it defines precise language/parameter usage for safe, injection-resistant SQL construction. |
| [INFORMATION_SCHEMA_CATALOG_NAME](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/information_schema_catalog_name) | configuration | 0.65 | INFORMATION_SCHEMA_CATALOG_NAME relation is a Databricks-specific metadata endpoint exposing catalog naming. |
+| [Iceberg v3](https://learn.microsoft.com/en-us/azure/databricks/iceberg/iceberg-v3) | configuration | 0.65 | Describes how to use Iceberg v3 features with Unity Catalog, including requirements like mandatory row lineage and likely specific table properties/DDL needed to enable v3 behavior. This is product-specific configuration knowledge beyond generic Iceberg concepts. |
| [Image files](https://learn.microsoft.com/en-us/azure/databricks/query/formats/image) | configuration | 0.65 | Describes the Databricks-specific `image`/`binary file` data source, including required format names and usage patterns, which are product-specific configuration details. |
| [Import data to DBFS](https://learn.microsoft.com/en-us/azure/databricks/archive/legacy/data-tab) | configuration | 0.65 | Covers UI-driven table creation and file upload flows tied to DBFS and Databricks metadata, including specific UI paths and behaviors. |
| [Improve Genie Code responses](https://learn.microsoft.com/en-us/azure/databricks/genie-code/tips) | best-practices | 0.65 | Explicitly described as tips and best practices for using Genie Code. While summary is brief, this type of page typically contains concrete, product-specific DOs/DON’Ts and interaction patterns unique to Genie Code. |
+| [Incremental refresh for materialized views](https://learn.microsoft.com/en-us/azure/databricks/optimizations/incremental-refresh) | decision-making | 0.65 | Article compares incremental vs full refresh and materialized views vs streaming tables, and provides recommendations for choosing between them; this is product-specific decision guidance about when to use each approach. |
| [Init script logging](https://learn.microsoft.com/en-us/azure/databricks/init-scripts/logs) | troubleshooting | 0.65 | Explains where init script start/finish events and global script changes are logged; product-specific diagnostic locations for troubleshooting. |
| [InputPartition](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/inputpartition) | integrations | 0.65 | Class reference for InputPartition returned by DataSourceReader.partitions(), including runtime version notes, which are product-specific integration details. |
| [Install a library with an init script](https://learn.microsoft.com/en-us/azure/databricks/archive/compute/libraries-init-scripts) | best-practices | 0.65 | Contains Databricks-specific guidance on why installing libraries via init scripts causes issues (for example, 'Module not found' during jobs) and recommends alternative patterns, which are concrete product-specific gotchas. |
@@ -2792,16 +2814,14 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [Instance pools overview](https://learn.microsoft.com/en-us/azure/databricks/compute/pool-index) | decision-making | 0.65 | Explains what pools are and when to use them, including a recommendation to prefer serverless compute when supported; this is product-specific guidance that helps choose between pools and serverless for different workloads, fitting decision-making around compute options. |
| [Interoperability & usability](https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/interoperability-and-usability/) | architecture-patterns | 0.65 | Covers architectural principles for user experience and integration with external systems specific to the Databricks lakehouse; product-focused interaction patterns. |
| [JOIN](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-join) | integrations | 0.65 | Covers Databricks-specific JOIN syntax and supported join behaviors, which are detailed language semantics. |
+| [JSON files](https://learn.microsoft.com/en-us/azure/databricks/query/formats/json) | integrations | 0.65 | Describes Databricks-specific patterns for reading/writing JSON, including single-line vs multi-line modes, schema inference, and rescued data handling. These are concrete integration behaviors and options tied to Databricks’ Spark implementation, beyond generic JSON knowledge. |
| [JSON path expressions](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-json-path-expression) | configuration | 0.65 | Describes Databricks-specific JSON path expression syntax and behavior (e.g., use of ':' operator, handling VARIANT). These are concrete language features unique to this product’s SQL dialect. |
| [LLMOps](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/mlops/llmops) | best-practices | 0.65 | Describes Databricks-specific recommended LLMOps workflow patterns and steps (evaluation, deployment, monitoring) that go beyond generic MLOps concepts and are tailored to Databricks features and tooling, providing actionable product-specific guidance. |
-| [LLMs](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/overview-beta) | security | 0.65 | Described as covering permissions, usage tracking, rate limits, and coding agent integration. Likely includes product-specific governance and permission settings, fitting security (with some limits-quotas aspects). |
| [LOAD DATA](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-dml-load) | configuration | 0.65 | LOAD DATA reference describes behavior for directories vs files and partitioned loads into Hive SerDe tables on Databricks, which are concrete command semantics. |
| [Lakeflow Jobs](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/lakeflow) | configuration | 0.65 | Enabling apps to trigger and monitor Lakeflow Jobs typically involves specific resource configuration fields and parameters, which are product-specific configuration details. |
-| [Lakeflow Pipelines Editor](https://learn.microsoft.com/en-us/azure/databricks/ldp/multi-file-editor) | configuration | 0.65 | Describes how to develop and debug pipelines with the Lakeflow Pipelines Editor; likely includes editor-specific options and pipeline configuration behaviors unique to this product. |
| [Language recommendations](https://learn.microsoft.com/en-us/azure/databricks/languages/overview) | decision-making | 0.65 | Explains available languages, where they can be used, and their limitations in Databricks. This is workspace-specific language selection guidance with constraints, fitting decision-making for technology choice. |
| [Legacy state operators](https://learn.microsoft.com/en-us/azure/databricks/stateful-applications/legacy) | integrations | 0.65 | Covers mapGroupsWithState and flatMapGroupsWithState support details, which are specific APIs with Databricks-specific behavior and usage patterns. |
| [Legacy versions](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect-legacy) | decision-making | 0.65 | Deprecated-runtime article that points to migration paths; contains product-specific guidance on when and how to move to newer runtimes. |
-| [Legacy visualizations](https://learn.microsoft.com/en-us/azure/databricks/visualizations/legacy-visualizations) | configuration | 0.65 | Describes legacy visualization behavior and support in Databricks, including product-specific configuration and compatibility details. |
| [Lightup](https://learn.microsoft.com/en-us/azure/databricks/partners/data-governance/lightup) | integrations | 0.65 | Partner integration article for Lightup with Databricks clusters and SQL warehouses, including configuration specifics. |
| [Load data using Mosaic Streaming](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/load-data/streaming) | integrations | 0.65 | Describes using Mosaic Streaming to convert Spark DataFrames to PyTorch-compatible format; likely includes product-specific APIs, parameters, and integration patterns between Spark and PyTorch. |
| [Looker Studio](https://learn.microsoft.com/en-us/azure/databricks/partners/bi/looker-studio) | integrations | 0.65 | Describes using Looker Studio with Databricks clusters/SQL warehouses. Such pages typically specify connector type, connection string fields, auth modes, and required configuration values, which are product-specific integration patterns. |
@@ -2813,11 +2833,11 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [Maintenance](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-maintenance) | best-practices | 0.65 | Ongoing operations and maintenance guidance for ingestion pipelines typically includes product-specific recommendations, gotchas, and operational patterns (for example, how and when to refresh, handle schema changes, or manage performance), which fits best-practices for this connector. |
| [Manage dashboards with AI/BI APIs](https://learn.microsoft.com/en-us/azure/databricks/dashboards/tutorials/dashboard-crud-api) | integrations | 0.65 | Describes Databricks REST API for dashboard CRUD and permissions. Likely includes endpoint names, request/response parameters, and product-specific API behavior, which fits integrations & coding patterns. |
| [Manage embedding](https://learn.microsoft.com/en-us/azure/databricks/ai-bi/admin/embed) | security | 0.65 | Workspace admin page for managing embedding options is likely to include security-related settings, scopes, and possibly role-based controls specific to embedding AI/BI assets. |
-| [Manage groups](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/manage-groups) | security | 0.65 | Provides product-specific guidance on managing groups in Azure Databricks for securing workspaces and data, including how groups map to permissions and identity federation behavior. |
| [Manage listings (provider)](https://learn.microsoft.com/en-us/azure/databricks/marketplace/manage-listings) | configuration | 0.65 | Explains editing, unpublishing, deleting, and revoking access to listings; these are product-specific management operations and states for Marketplace listings. |
| [Manage requests and installed products (consumer)](https://learn.microsoft.com/en-us/azure/databricks/marketplace/manage-requests-consumer) | configuration | 0.65 | Describes managing shared products and pending requests; involves product-specific UI/API operations and states for Marketplace assets. |
-| [Manage workspace-local groups (legacy)](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/workspace-local-groups) | configuration | 0.65 | Legacy workspace-local groups behavior is Databricks-specific; page likely documents exact configuration flows and constraints unique to these legacy groups. |
-| [Manage your subscription](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/account) | decision-making | 0.65 | Explains how to upgrade, downgrade, or cancel subscriptions. Such content typically includes SKU/plan options, constraints, and implications of each choice, which are decision-making details for subscription management. |
+| [Manage the Personal Compute cluster policy](https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/personal-compute) | decision-making | 0.65 | Explains how the Personal Compute policy works, when to grant it, and how it limits compute creation—guidance for choosing this policy for certain scenarios. |
+| [Manage your subscription](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/account) | decision-making | 0.65 | Explains how to upgrade, downgrade, or cancel subscriptions; likely includes SKU/tier implications and concrete steps, providing product-specific decision and configuration guidance. |
+| [Managed MCP servers](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/managed-mcp) | security | 0.65 | Managed MCP servers enforce Unity Catalog permissions; the full article is likely to list specific permission models, role mappings, and how access is evaluated for tools and data, which are product-specific security configuration details. |
| [Managed tables](https://learn.microsoft.com/en-us/azure/databricks/tables/managed) | configuration | 0.65 | Describes how to create, query, update, and drop managed Delta and Iceberg tables with Unity Catalog; involves product-specific DDL and table configuration behavior. |
| [Matillion](https://learn.microsoft.com/en-us/azure/databricks/partners/prep/matillion) | configuration | 0.65 | Integration setup for Matillion with Databricks SQL warehouses and clusters. Likely includes connector configuration fields, authentication options, and Databricks-specific settings that constitute expert configuration knowledge. |
| [Merge data](https://learn.microsoft.com/en-us/azure/databricks/delta/merge) | integrations | 0.65 | Covers Delta Lake MERGE syntax and extended, non-standard capabilities specific to Databricks/Delta Lake, including code patterns and behavior beyond ANSI SQL. This is product-specific coding pattern knowledge. |
@@ -2827,11 +2847,9 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [Migrate from legacy versions](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/migrate) | decision-making | 0.65 | Migration guide between Databricks Runtime 12.2 LTS and 13.3+ Connect versions; typically includes version-specific behavior changes, required configuration updates, and guidance on when/how to move, which supports concrete upgrade decisions. |
| [Migrate optimized LLM endpoints to provisioned throughput](https://learn.microsoft.com/en-us/azure/databricks/archive/machine-learning/migrate-provisioned-throughput) | decision-making | 0.65 | Guides migration from optimized LLM serving to provisioned throughput, including when and how to switch approaches in Databricks. |
| [Model inference for NLP](https://learn.microsoft.com/en-us/azure/databricks/archive/machine-learning/train-model/model-inference-nlp) | integrations | 0.65 | Shows concrete, product-specific code and configuration patterns for using Hugging Face Transformers pipelines with Spark on Databricks, including how to structure inference, which APIs to call, and Databricks-specific usage details beyond generic LLM knowledge. |
-| [Monitor apps](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/monitor) | security | 0.65 | Focuses on application logs and platform audit logs for Databricks Apps in a security context; likely includes product-specific log sources and monitoring configuration relevant to security operations. |
-| [Monitor default storage costs](https://learn.microsoft.com/en-us/azure/databricks/admin/usage/default-storage) | integrations | 0.65 | Describes concrete use of the billable usage system table with product-specific fields and example queries to monitor default storage costs. |
-| [Monitor jobs](https://learn.microsoft.com/en-us/azure/databricks/jobs/monitor) | configuration | 0.65 | Describes specific UI features, CLI commands, and system.lakeflow schema usage for monitoring jobs, which are product-specific observability configurations. |
-| [Monitor model serving costs](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/model-serving-cost) | configuration | 0.65 | Gives concrete examples of using billing system tables for Mosaic AI Model Serving endpoints, including specific table/field usage. These telemetry schema details are expert, product-specific knowledge. |
-| [Monitor serverless costs](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/serverless-billing) | configuration | 0.65 | Explains how to query the billable usage system table for serverless compute, including specific fields and their meanings. This is product-specific table/field configuration knowledge. |
+| [Monitor default storage costs](https://learn.microsoft.com/en-us/azure/databricks/admin/usage/default-storage) | configuration | 0.65 | Explains how to use the billable usage system table to isolate default storage costs, including relevant fields—product-specific configuration/usage. |
+| [Monitor model serving costs](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/model-serving-cost) | configuration | 0.65 | Gives specific guidance on which billing/system tables and fields to query for model serving endpoints—expert configuration/usage details. |
+| [Monitor usage using tags](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/usage-detail-tags) | best-practices | 0.65 | Gives Databricks-specific guidance on using tags for cost tracking and includes security warnings about tag content; these are product-specific DO/DON’T recommendations. |
| [Monte Carlo](https://learn.microsoft.com/en-us/azure/databricks/partners/data-governance/monte-carlo) | integrations | 0.65 | Explains how to connect Monte Carlo’s platform to Databricks with product-specific integration steps and settings. |
| [Mounting object storage](https://learn.microsoft.com/en-us/azure/databricks/dbfs/mounts) | best-practices | 0.65 | Covers a deprecated pattern (DBFS mounts) and recommends migration to Unity Catalog external locations with Databricks-specific guidance on when and why to avoid mounts. This is actionable, product-specific best-practice content. |
| [Multi-destination pipelines](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/multi-destination-pipeline) | configuration | 0.65 | Details how to write to multiple destination catalogs and schemas from one pipeline and the constraint that duplicate table names in the same schema are not supported, requiring specific naming configuration. This is product-specific configuration behavior. |
@@ -2841,6 +2859,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [Non-conversational agents](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/non-conversational-agents) | architecture-patterns | 0.65 | Describes non-conversational agents that process structured inputs without state; likely includes Databricks-specific patterns and schemas for such agents.
| | [Notebook development](https://learn.microsoft.com/en-us/azure/databricks/ldp/notebook-devex) | integrations | 0.65 | Describes a product-specific, legacy notebook-based development and debugging experience for Lakeflow Spark Declarative Pipelines. Likely includes concrete notebook configuration details, pipeline/notebook linkage, and execution/debug patterns unique to this product, fitting integrations & coding patterns more than generic tutorials. | | [OBJECT type](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/object-type) | best-practices | 0.65 | Describes that OBJECT cannot be stored in table columns and must be cast to STRUCT or MAP, and that it is only exposed via specific functions; these are Databricks-specific constraints and usage patterns. | +| [ORC files](https://learn.microsoft.com/en-us/azure/databricks/query/formats/orc) | integrations | 0.65 | Page focuses on product-specific code patterns and options for working with ORC via the DataFrame API and SQL (schema specification, partitioning, compression). These are integration/coding patterns unique to Databricks’ Spark environment rather than generic ORC concepts. | | [Observability](https://learn.microsoft.com/en-us/azure/databricks/ldp/observability) | troubleshooting | 0.65 | Monitoring/observability section plus explicit mention of troubleshooting topics for specific scenarios suggests symptom→diagnosis→solution guidance and product-specific diagnostic details. | | [On-demand feature computation](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/on-demand-features) | integrations | 0.65 | Shows how to create and use on-demand features via Python UDFs with specific runtime and Unity Catalog requirements; product-specific coding and configuration pattern. 
| | [Onboard data from ADLS](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/onboard-data) | configuration | 0.65 | Step-by-step setup for incremental ingestion including Unity Catalog volumes/external locations and Auto Loader settings; likely includes specific configuration parameters and values. | @@ -2852,7 +2871,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Overview](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/agent-tool) | integrations | 0.65 | Describes AI agent tools, including managed MCP servers and legacy Unity Catalog function tools. Likely contains Databricks-specific tool definitions, parameters, and patterns for connecting agents to data and services, which fits product-specific integration and coding patterns. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/jobs/triggers) | configuration | 0.65 | Describes available trigger options; likely enumerates trigger types and their specific configuration parameters and constraints. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/large-language-models/ai-functions) | integrations | 0.65 | Describes concrete, product-specific AI Functions (built-in SQL/Python functions) for invoking LLMs on Databricks data. While the summary is high level, this page in full typically documents specific function names, parameters, and usage patterns unique to Databricks AI Functions, which fits the integrations & coding patterns category. | -| [Overview](https://learn.microsoft.com/en-us/azure/databricks/ldp/flows) | configuration | 0.65 | Explains how flows work and how to define them for incremental/batch processing; likely includes flow configuration options and semantics specific to Lakeflow pipelines. 
| | [Overview](https://learn.microsoft.com/en-us/azure/databricks/ldp/live-schema) | configuration | 0.65 | Explains legacy LIVE virtual schema behavior and migration options; includes product-specific schema semantics and migration configuration details. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/manage-model-lifecycle/workspace-model-registry) | configuration | 0.65 | Legacy registry management with conditions based on default catalog and runtime versions; includes product-specific behavior and upgrade guidance. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/optimizations/) | best-practices | 0.65 | Optimization recommendations for Databricks typically include concrete, product-specific tuning guidance (e.g., settings, feature usage, workload-specific advice) beyond generic performance theory. | @@ -2881,13 +2899,15 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Python wheel builds tutorial](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/python-wheel) | configuration | 0.65 | Describes how to declare and configure Python wheel build/deploy in bundles, likely including specific YAML keys and parameter values unique to Declarative Automation Bundles. | | [QUALIFY clause](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-qualify) | configuration | 0.65 | QUALIFY clause reference (10.4+ only) documents Databricks-specific support for filtering on window functions and its requirement for at least one window function, which is dialect-specific behavior. | | [Query](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-query) | integrations | 0.65 | Documents Databricks-specific query syntax and options, which are implementation details not fully captured by generic SQL knowledge. 
| -| [Query data](https://learn.microsoft.com/en-us/azure/databricks/integrations/google-sheets/query-data) | integrations | 0.65 | Describes selecting tables, writing SQL, adding parameters, and how queries are saved and refreshed via the connector; involves product-specific behavior and options of the Databricks–Sheets connector, fitting integrations. | +| [Query endpoints](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/query-endpoints-beta) | integrations | 0.65 | Describes querying AI Gateway endpoints using unified and native APIs; likely includes request/response parameters, API-specific options, and product-specific integration patterns. | | [Query examples](https://learn.microsoft.com/en-us/azure/databricks/ingestion/opentelemetry/queries) | integrations | 0.65 | Provides concrete SQL query examples against Zerobus OTLP tables, including use of VARIANT columns and key extraction syntax; these are product-specific query patterns and usage examples. | +| [Query from Notebooks](https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/query/notebook) | integrations | 0.65 | Page focuses on concrete code examples for accessing a Lakebase database instance from an Azure Databricks notebook. This is a product-specific integration pattern (not just a generic tutorial) showing how to connect and query the service programmatically. While the summary is truncated, the description indicates detailed, code-based access patterns unique to Lakebase/Databricks rather than conceptual guidance. | +| [Query traces in Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/query-dbsql) | integrations | 0.65 | Explains how to query MLflow trace data stored in Unity Catalog using Databricks SQL and MLflow SDK, involving product-specific table/view usage and query patterns that are not generic SQL knowledge. 
| | [Quick reference](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/agent-eval-migration-reference) | decision-making | 0.65 | Quick reference summarizing key changes for migration; helps choose equivalent APIs and paths, a decision-focused mapping between old and new approaches. | | [Read replicas](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-read-replicas) | best-practices | 0.65 | Managing read replicas for Lakebase Postgres will include product-specific guidance on when/how to use replicas, routing read workloads, and operational patterns, which are actionable best practices beyond generic Postgres replication concepts. | | [Read replicas](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/read-replicas) | architecture-patterns | 0.65 | Explains how read replicas share the same storage layer without data duplication, providing a Lakebase-specific scaling pattern and trade-offs. | +| [Read shared data (tokens)](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/read-data-open) | security | 0.65 | Explains how to use credential files and bearer tokens to securely read shared data, including token-based access behavior and rotation managed by providers. This is product-specific authentication/authorization usage rather than generic tutorial content. | | [Reference architecture](https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/reference) | architecture-patterns | 0.65 | Provides Databricks-specific reference architectures across ingestion, transformation, serving, and analysis. While summary is high-level, the downloadable reference architectures typically encode concrete product-specific patterns and component choices that guide when to use which pattern. 
| -| [Register database in Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/register-uc) | configuration | 0.65 | A registration guide for a database in Unity Catalog typically includes specific configuration steps, object names, and settings unique to Databricks governance, which are product-specific configuration details. | | [Regression with the API](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl/regression-train-api) | integrations | 0.65 | API-focused article describing AutoML Python functions to start runs; likely includes method signatures and parameter options specific to Databricks AutoML. | | [Reliability](https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/reliability/) | architecture-patterns | 0.65 | Reliability pillar principles for Databricks; product-specific patterns for failure recovery and continuity. | | [Request and access data products (consumer)](https://learn.microsoft.com/en-us/azure/databricks/marketplace/get-started-consumer) | configuration | 0.65 | How-to for consumers in Unity Catalog-enabled workspaces; likely includes workspace/permission requirements and specific steps/parameters to attach Marketplace products. | @@ -2896,6 +2916,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [SET TAG](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-set-tag) | security | 0.65 | Describes SET TAG with specific Unity Catalog securable objects and required privileges (including ASSIGN on governed tags), which are product-specific RBAC details. | | [SHOW DATABASES](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-show-databases) | integrations | 0.65 | Documents Databricks aliasing between DATABASE and SCHEMA and exact SHOW DATABASES behavior, which is product-specific. 
| | [SHOW SCHEMAS](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-show-schemas) | integrations | 0.65 | Documents Databricks-specific SHOW SCHEMAS behavior, regex usage, and DATABASE/SCHEMA terminology. | +| [SQL Scripting](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-scripting) | integrations | 0.65 | Page describes Databricks SQL scripting (SQL/PSM) with concrete syntax for compound statements, variable and cursor declarations, condition handlers, and control flow. This is detailed, product-specific language/API behavior for scripting within Databricks SQL, which fits integrations & coding patterns more than other categories. | | [SQL basics](https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/sql-dev) | integrations | 0.65 | Describes new SQL keywords and functions for Lakeflow SDP; product-specific SQL constructs and patterns. | | [SYNC](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-sync) | decision-making | 0.65 | Gives concrete guidance on when SYNC can and cannot be used (external vs managed tables, storage location) and alternative (CREATE TABLE CLONE), supporting migration decisions between table types. | | [Safety](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/concepts/judges/is_safe) | configuration | 0.65 | Documents the Safety judge scorer, including how it evaluates content and returns pass/fail plus rationale; likely includes specific scorer names/parameters that are product-specific configuration of evaluation. 
| @@ -2904,8 +2925,10 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Schedule data refreshes](https://learn.microsoft.com/en-us/azure/databricks/integrations/google-sheets/schedule-refresh) | configuration | 0.65 | Covers scheduling automatic refreshes for saved queries; likely documents specific scheduling options/fields (intervals, triggers) within the Databricks Connector for Google Sheets, which are configuration details for this integration feature. | | [Schedule notebooks](https://learn.microsoft.com/en-us/azure/databricks/notebooks/schedule-notebook-jobs) | deployment | 0.65 | Describes creating jobs and schedules directly from the notebook UI, including compute and parameter options; these are Databricks-specific job deployment patterns. | | [Schema evolution for state variables](https://learn.microsoft.com/en-us/azure/databricks/stateful-applications/schema-evolution) | integrations | 0.65 | Discusses supported schema changes for state variables in the Databricks state store, including examples of allowed patterns tied to specific APIs. | +| [Selective overwrite](https://learn.microsoft.com/en-us/azure/databricks/delta/selective-overwrite) | best-practices | 0.65 | Covers specific Delta Lake options (replaceWhere, dynamic partition overwrite) with guidance on when to use each and a recommendation to prefer replaceWhere. This is product-specific DO/DON'T guidance and patterns rather than generic theory. | +| [Serve declarative features](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/serve-declarative-features) | deployment | 0.65 | Details constraints (feature serving endpoints not supported) and how to deploy model serving endpoints using models logged through Unity Catalog, which are product-specific deployment behaviors and requirements. 
| | [Serverless Private Git](https://learn.microsoft.com/en-us/azure/databricks/repos/serverless-private-git) | architecture-patterns | 0.65 | Explains Databricks Serverless Private Git, including architecture diagram and how serverless compute and Azure Private Link are used to connect to private Git servers. This is a product-specific architectural pattern for secure Git connectivity. | -| [Serverless workspaces overview](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/serverless-workspaces) | decision-making | 0.65 | Explains benefits and limitations of serverless workspaces and when they are ideal (production-only serverless, training, testing, limited permissions), providing Databricks-specific decision guidance rather than just a conceptual overview. | +| [Serverless environment versions](https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/) | configuration | 0.65 | Described as a reference for currently released serverless environment versions, including specific Python versions and other environment characteristics. This is a configuration reference (environment-version matrix) with concrete values that an LLM would not reliably know from training. | | [Set operator](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-setops) | integrations | 0.65 | Documents Databricks-supported set operators (EXCEPT, MINUS, INTERSECT, UNION) and their concrete usage. | | [Set up data source](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/meta-ads-source-setup) | configuration | 0.65 | Source setup pages usually include concrete configuration fields (app IDs, secrets, scopes, redirect URIs) and possibly required permissions, which are product-specific configuration details. 
| | [Set up data source](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-source-setup-overview) | configuration | 0.65 | Overview of supported authentication methods for SharePoint ingestion; likely lists specific auth modes and required settings, which are configuration details. | @@ -2918,23 +2941,22 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Stardog](https://learn.microsoft.com/en-us/azure/databricks/partners/semantic-layer/stardog) | integrations | 0.65 | Partner integration article for Stardog with Databricks clusters/SQL warehouses, including configuration patterns unique to this product. | | [Support lifecycles](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/databricks-runtime-ver) | decision-making | 0.65 | Support lifecycle documentation typically includes concrete dates, phase definitions (Legacy, Deprecated, EoS, EoL), and sometimes queries to detect affected clusters. These are expert, time-bound details that guide decisions on when to upgrade or retire runtimes. | | [Supported classification tags](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-classification-tags) | configuration | 0.65 | Reference list of supported system classification tags, organized by global, regional, and compliance frameworks. The exact tag names and their organization are product-specific configuration/metadata details that an LLM would not reliably know from training. | -| [System tables overview](https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/) | configuration | 0.65 | A system tables reference for monitoring costs, auditing, and tracking workloads typically documents table names, schemas, and possibly configuration needed to enable or query them, which are detailed product-specific structures not known generically. 
| | [TABLESAMPLE clause](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-sampling) | configuration | 0.65 | TABLESAMPLE reference describes how sampling is applied to relations in Databricks SQL, including syntax and behavior, which are engine-specific. | | [TIMESTAMP type](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/timestamp-type) | limits-quotas | 0.65 | Documents that TIMESTAMP uses session local time zone and represents an absolute point in time; full page typically includes valid ranges and precision. These are concrete temporal representation constraints. | | [TRUNCATE TABLE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-truncate-table) | configuration | 0.65 | Documents TRUNCATE TABLE behavior and constraints (no views/external/temp tables, Delta Lake partition clause limitation, cache clearing behavior), which are specific to Databricks/Delta implementation. | +| [Table constraints](https://learn.microsoft.com/en-us/azure/databricks/tables/constraints) | configuration | 0.65 | Covers supported SQL constraint clauses and their requirements (all constraints require Delta Lake) and likely includes specific syntax and behaviors for enforced, primary, and foreign key constraints in Databricks, which are product-specific configuration details. | | [Table reference](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-table-reference) | configuration | 0.65 | Table reference page defines how Databricks treats base tables, derived tables, inline tables, and functions as table references—dialect-specific query construction rules. 
| | [Table size](https://learn.microsoft.com/en-us/azure/databricks/tables/size) | best-practices | 0.65 | Explains why reported table size differs from cloud storage and gives concrete Databricks-specific methods and recommendations to compute detailed storage metrics and control costs. These are actionable, product-specific cost-optimization practices. | | [Tag database objects](https://learn.microsoft.com/en-us/azure/databricks/database-objects/tags) | security | 0.65 | Describes tagging mechanics, including security warning about plain-text, globally replicated tag data; product-specific metadata and security considerations. | | [Text files](https://learn.microsoft.com/en-us/azure/databricks/query/formats/text) | integrations | 0.65 | Describes using the text format option to parse lines into DataFrames, a Databricks/Spark-specific integration pattern for arbitrary text data. | | [Time series forecasting with GluonTS](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-time-series-gluonts-101) | integrations | 0.65 | Shows how to use GluonTS with Databricks serverless GPU, including integration-specific code, model configuration, and checkpoint management patterns. | | [Train a PySpark model and save in MLeap format](https://learn.microsoft.com/en-us/azure/databricks/archive/model-export/tracking-ex-pyspark) | integrations | 0.65 | Notebook example that combines MLflow tracking with exporting a PySpark model to MLeap format, demonstrating Databricks-specific integration between tracking and model serialization. | -| [Transactions](https://learn.microsoft.com/en-us/azure/databricks/transactions/) | configuration | 0.65 | Details how to execute transactions across tables, including syntax and constraints for Delta and Iceberg in Unity Catalog; product-specific transactional behavior and configuration. 
| -| [Transform data](https://learn.microsoft.com/en-us/azure/databricks/ldp/transform) | configuration | 0.65 | Shows how to declare transformations on datasets and use Spark operations, UDFs, and MLflow models; includes product-specific transformation configuration and patterns. | +| [Transactions](https://learn.microsoft.com/en-us/azure/databricks/transactions/) | configuration | 0.65 | Describes how to execute transactions across Unity Catalog managed Delta and Iceberg tables, including preview scope and ACID guarantees. Likely includes specific SQL syntax and constraints for Databricks transactions, which are product-specific configuration/usage details. | | [Trigger intervals](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/triggers) | configuration | 0.65 | Page is specifically about configuring trigger intervals (processingTime, AvailableNow, real-time mode). This typically includes parameter names, allowed values, and examples unique to Databricks Structured Streaming, fitting configuration guidance with expert-level details. | | [Trigger jobs when new files arrive](https://learn.microsoft.com/en-us/azure/databricks/jobs/file-arrival-triggers) | configuration | 0.65 | Describes how to trigger jobs based on new files in external locations. This usually involves trigger configuration options (paths, patterns, polling intervals, event conditions) that are specific to Lakeflow Jobs. These settings and their behavior are product-specific configuration knowledge. | | [Trigger on update of source tables](https://learn.microsoft.com/en-us/azure/databricks/jobs/trigger-table-update) | configuration | 0.65 | Explains table update triggers; likely documents trigger configuration fields and behavior specific to Databricks table-update events. | -| [Triggered vs. 
continuous](https://learn.microsoft.com/en-us/azure/databricks/ldp/pipeline-mode) | configuration | 0.65 | Explains operational semantics of triggered and continuous modes and how to change them in settings; product-specific configuration behavior for pipeline execution. | | [Tutorial: Create a workspace with Terraform](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/terraform/azure-workspace) | configuration | 0.65 | Provides a concrete Terraform sample using `azurerm` and `azurerm_databricks_workspace`, including specific resource blocks and parameters—product-specific configuration. | +| [UDTFs in Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/udf/udtf-unity-catalog) | configuration | 0.65 | Covers registering Python user-defined table functions in Unity Catalog, including how arguments and return tables are defined and invoked in SQL FROM clauses; likely includes function registration syntax and catalog-specific configuration details unique to Databricks. | | [UNSET TAG](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-unset-tag) | security | 0.65 | UNSET TAG describes removing tags from specific securable objects and the exact privilege requirements, which are product-specific RBAC configuration details. | | [Understand data transfer and connectivity costs](https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/cost-management) | decision-making | 0.65 | Explains when and how networking charges apply for serverless vs classic compute, including which traffic patterns incur costs. This is cost/behavior guidance specific to Databricks serverless networking and informs deployment/usage decisions. 
| | [Unity Catalog upgrade overview](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/upgrade/) | decision-making | 0.65 | Covers upgrading legacy workspaces and migrating off Hive metastore/DBFS/unsupported runtimes; such migration guidance typically includes scenario-based recommendations and trade-offs for when and how to move. | @@ -2944,8 +2966,8 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Upgrade from Workspace Feature Store](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/uc/upgrade-feature-table-to-uc) | decision-making | 0.65 | Describes upgrade process and use of upgrade_workspace_table; includes migration guidance and recommended versions, which are product-specific decision and migration details. | | [Use SQL and For each tasks for metadata-driven orchestration](https://learn.microsoft.com/en-us/azure/databricks/jobs/how-to/foreach-sql-lookup-tutorial) | best-practices | 0.65 | Shows a concrete, product-specific pattern for scaling ingestion using a control table plus SQL and For each tasks in Lakeflow Jobs. This is an actionable design pattern (metadata-driven orchestration) specific to Databricks jobs, beyond generic looping; fits best-practices as it prescribes how to structure jobs and control tables for maintainability and scalability. | | [Use XGBoost](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/xgboost) | integrations | 0.65 | Provides examples of single-node and distributed XGBoost training in Python, PySpark, and Scala on Databricks; such examples include cluster configs, library usage, and integration patterns specific to Databricks. | -| [Use a Unity Catalog volume or external location](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/unity-catalog) | configuration | 0.65 | How-to for using COPY INTO with Unity Catalog volumes/external locations. 
COPY INTO and Unity Catalog typically require specific option names, table properties, and configuration parameters that are product-specific, going beyond generic SQL knowledge. | -| [Use feature tables](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/train-models-with-feature-store) | integrations | 0.65 | Explains how to create training datasets from feature tables and retain feature references; likely includes code patterns and APIs specific to Databricks Feature Store integration with ML. | +| [Use declarative features](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/train-with-declarative-features) | integrations | 0.65 | Describes how to wire declarative features into model training with point-in-time computation, a Databricks-specific integration pattern between feature definitions and ML training. | +| [Use feature tables](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/train-models-with-feature-store) | integrations | 0.65 | Covers concrete patterns for integrating feature tables with model training and batch inference, including lineage behavior in Unity Catalog; these are product-specific coding/integration patterns. | | [Use prompts in deployed apps](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/prompt-version-mgmt/prompt-registry/use-prompts-in-deployed-apps) | configuration | 0.65 | Describes configuring production apps to load prompts from Prompt Registry using aliases; likely includes concrete API usage and configuration patterns specific to MLflow Prompt Registry. | | [VALUES clause](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-values) | integrations | 0.65 | Documents Databricks-specific VALUES clause behavior for producing inline temporary tables. 
| | [VARIANT type](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/variant-type) | best-practices | 0.65 | Contains Databricks-specific constraints: VARIANT is public preview, Iceberg v2 tables do not support VARIANT while v3 does; this is product-specific behavior and compatibility guidance. | @@ -2953,14 +2975,13 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [VS Code workspace directory](https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/workspace-dir) | configuration | 0.65 | Describes how to choose and configure a Databricks workspace directory in the VS Code extension, which is a product-specific configuration procedure. | | [Variant](https://learn.microsoft.com/en-us/azure/databricks/delta/variant) | configuration | 0.65 | Documents Databricks VARIANT type behavior, enabling/disabling it on tables, and runtime constraints, which are product-specific configuration details. | | [Vibe check using the Chat UI](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/human-feedback/expert-feedback/live-app-testing) | integrations | 0.65 | Describes using the Review App Chat UI, including the requirement for a Model Serving endpoint and current lack of Databricks Apps support—product-specific integration constraints and usage. | -| [View app details](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/view-app-details) | troubleshooting | 0.65 | Summary emphasizes runtime logs, deployment status, and using the page to troubleshoot issues, which typically includes product-specific diagnostic locations and symptom-to-solution guidance. 
| | [View expenses](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/data-profiling/expense) | configuration | 0.65 | Shows how to query the system.billing.usage system table to distinguish anomaly detection vs profiling costs, which is a product-specific configuration/usage of billing tables. | -| [Web search on Databricks](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/web-search) | integrations | 0.65 | Describes how to ground model responses with real-time web information using Gemini, OpenAI, and Anthropic models, including Anthropic via MCP. This implies concrete configuration and API usage patterns for web search integration that are product-specific. | +| [Web search on Databricks](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/web-search) | integrations | 0.65 | Explains how to ground responses with real-time web data using Gemini, OpenAI, and Anthropic models, including Anthropic via MCP; likely includes provider-specific parameters, configuration flags, and endpoint usage unique to Databricks’ implementation. | | [What are volumes?](https://learn.microsoft.com/en-us/azure/databricks/files/volumes) | configuration | 0.65 | Defines Unity Catalog volume types and how they govern access to non-tabular data; these are Databricks-specific storage configuration concepts. | | [What is a data lakehouse?](https://learn.microsoft.com/en-us/azure/databricks/lakehouse/) | architecture-patterns | 0.65 | Explains the lakehouse architectural pattern specifically in the context of Azure Databricks, including when and how to use it for ACID, governance, ETL, BI, and ML. While mostly conceptual, it is a product-specific architecture pattern description rather than generic data architecture.
| | [Work with Hive metastore alongside Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/hive-metastore) | configuration | 0.65 | Explains a specific, non-obvious configuration pattern for using the legacy per-workspace Hive metastore with Unity Catalog. This is product-specific setup behavior that goes beyond generic concepts, fitting configuration/interop guidance. | | [Work with parameter widgets](https://learn.microsoft.com/en-us/azure/databricks/sql/user/sql-editor/parameter-widgets) | configuration | 0.65 | Describes widget types, titles, default values, and behavior across surfaces; includes product-specific parameter/widget configuration details. | -| [Workspace appearance settings](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/appearance) | configuration | 0.65 | Covers concrete workspace settings (themes, date/time format, language) and how admins manage them; these are specific configuration options rather than generic UI concepts. | +| [Workspace email settings](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/email) | configuration | 0.65 | Defines specific workspace-level email notification settings and events, which are product-specific configuration options. | | [abort](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datasourcewriter/abort) | integrations | 0.65 | Documents a specific driver-side abort hook, including how commit messages from write() are passed in on failure; this is detailed, product-specific writer lifecycle behavior. | | [abs](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/abs) | integrations | 0.65 | Function reference page with exact signature, return type, and behavior for abs in Databricks PySpark; this is concrete API usage. 
| | [account budget-policy](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/account-budget-policy-commands) | decision-making | 0.65 | Budget-policy commands for thresholds and alerts; supports cost-control decisions with concrete policy fields and thresholds. | @@ -3017,8 +3038,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [named_struct](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/named_struct) | integrations | 0.65 | API reference for Databricks PySpark named_struct, defining how fields and values are combined. | | [next_day function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/next_day) | integrations | 0.65 | Provides Databricks SQL-specific semantics for computing the next occurrence of a weekday from a date expression, including accepted weekday formats. | | [now](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/now) | integrations | 0.65 | Clarifies evaluation timing (start of query) for Databricks PySpark now function, which is specific API behavior. | -| [option](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/option) | integrations | 0.65 | API reference for adding a single write option to a data source; product-specific configuration pattern for Databricks PySpark. | -| [options](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/options) | integrations | 0.65 | Documents DataFrameWriterV2.options for setting multiple options; concrete configuration API for Databricks data sources. | | [outputMode](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/outputmode) | configuration | 0.65 | Specifies how streaming data is written to sinks (output modes); this is product-specific configuration of streaming behavior. 
| | [percentile aggregate function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/percentile) | integrations | 0.65 | Describes Databricks-specific percentile aggregate function semantics and usage, including exact vs approximate behavior, which are concrete API details. | | [pg_stat_statements](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/pg-stat-statements) | best-practices | 0.65 | Monitoring guidance with pg_stat_statements for Lakebase Postgres will include product-specific views, recommended queries, and interpretation patterns for performance tuning, which are actionable best practices. | @@ -3130,6 +3149,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [variance](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/variance) | integrations | 0.65 | Documents aliasing and behavior mapping for variance to var_samp in Databricks PySpark, including signatures and usage details. | | [vector_search function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/vector_search) | integrations | 0.65 | vector_search() is a Databricks-specific integration with Mosaic AI Vector Search; the function reference is likely to include parameter names, types, and constraints for querying the index, which are product-specific API/config details that match the integrations & coding patterns criteria. | | [xxhash64 function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/xxhash64) | integrations | 0.65 | Documents xxhash64 function behavior and 64-bit hash return type in Databricks SQL/Runtime, which is specific API knowledge. | +| [Build a knowledge store](https://learn.microsoft.com/en-us/azure/databricks/genie/knowledge-store) | best-practices | 0.64 | Focuses on how to build and curate a Genie knowledge store using localized metadata, prompt matching, and SQL instructions. 
This implies product-specific guidance and patterns to improve accuracy—actionable DO/DON'T style recommendations rather than just concepts. | | [Connect to a data ingestion partner](https://learn.microsoft.com/en-us/azure/databricks/partner-connect/ingestion) | integrations | 0.64 | Describes concrete steps to connect to ingestion partners using Partner Connect; includes Databricks UI flows and partner-specific integration behavior. | | [Cost management guide](https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-cost-management) | best-practices | 0.64 | A cost management guide for a specific service usually contains concrete, product-specific recommendations (for example, how to configure endpoints, index layouts, or traffic patterns to reduce cost, and how to identify unused endpoints). These are actionable DO/DON'T patterns tied to this product’s behavior, fitting best-practices with expert operational guidance. | | [DROP DATABASE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-drop-database) | configuration | 0.64 | DROP DATABASE aliasing to DROP SCHEMA and privilege requirements are Databricks-specific behaviors. | @@ -3139,6 +3159,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Informatica Cloud Data Integration](https://learn.microsoft.com/en-us/azure/databricks/partners/ingestion/informatica-cloud-data-integration) | integrations | 0.64 | Integration guide for Informatica Cloud with Databricks SQL warehouses; includes connector configuration and Databricks-specific settings. | | [Install libraries from a package repository](https://learn.microsoft.com/en-us/azure/databricks/libraries/package-repositories) | security | 0.64 | Includes security-relevant behavior (DBFS root deprecation, disabled by default in certain runtimes) and version-specific constraints for library storage, which are product-specific security/config details. 
| | [Lambda functions](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-lambda-functions) | integrations | 0.64 | Explains Databricks SQL lambda syntax and usage (for example with array_sort), including parameterization rules that are product-specific. | +| [Migrate to serverless routing](https://learn.microsoft.com/en-us/azure/databricks/query-federation/http-migration) | deployment | 0.64 | Covers a dated migration requirement from control-plane to serverless routing for HTTP connections, including a hard migration deadline and workspace-creation date conditions that affect connectivity. These are product-specific deployment/runtime constraints and timelines not inferable from general knowledge. | | [Qlik Replicate](https://learn.microsoft.com/en-us/azure/databricks/partners/ingestion/qlik) | integrations | 0.64 | Explains how to set up Qlik Replicate with Databricks and Delta Lake; likely includes connection details and Databricks-specific configuration steps. | | [Queries and interruptions](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/queries) | best-practices | 0.64 | Focuses on handling asynchronous queries and interruptions; likely includes concrete code patterns and product-specific behaviors/gotchas for query cancellation and retries. | | [Rivery](https://learn.microsoft.com/en-us/azure/databricks/partners/ingestion/rivery) | integrations | 0.64 | Describes integrating Rivery with Databricks SQL warehouses and notes lack of cluster integration; includes product-specific integration constraints. 
| @@ -3161,22 +3182,23 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Collect user feedback](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/collect-user-feedback/) | integrations | 0.63 | Explains MLflow’s structured feedback/assessment model on traces and how to log it via APIs, which is a product-specific data model and integration pattern not captured by generic LLM knowledge. | | [Function invocation](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-function-invocation) | integrations | 0.63 | Documents Databricks-specific function invocation syntax including named vs positional parameters and nuances that go beyond generic SQL knowledge. | | [MLOps workflows](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/mlops/mlops-workflow) | architecture-patterns | 0.63 | Provides a Databricks-specific MLOps architecture and workflow, mapping platform components to stages; while high-level, it includes concrete platform patterns unique to Databricks. | +| [Monitor apps](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/monitor) | security | 0.63 | The page is about logs and audit events for Databricks Apps and explicitly mentions security event detection. Such content typically includes product-specific log types, locations, and possibly required permissions or configuration steps for audit logging, which are security-focused expert details. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/) | deployment | 0.63 | CI/CD on Azure Databricks documentation typically includes Databricks-specific deployment patterns, supported methods, and constraints for automating builds/tests/deployments. These are product-specific deployment details and patterns, beyond generic CI/CD concepts, fitting the deployment sub-skill. 
| +| [Set up a Genie space](https://learn.microsoft.com/en-us/azure/databricks/genie/set-up) | configuration | 0.63 | Covers how to set up and manage Genie spaces with product-specific options and settings. Likely includes concrete configuration steps and parameters (space settings, permissions, data connections) beyond generic concepts. | | [View entity relationships](https://learn.microsoft.com/en-us/azure/databricks/catalog-explorer/entity-relationship-diagram) | configuration | 0.63 | Describes how to access and interpret the ERD in Catalog Explorer, including conditions (tables with foreign key constraints) and UI/feature behavior that are specific to Databricks. | | [collect_list aggregate function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/collect_list) | integrations | 0.63 | Documents a Databricks SQL aggregate function and its synonym array_agg, which is concrete dialect/API behavior for this product. | | [Alphabetical list of H3 geospatial functions](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-h3-geospatial-functions-alpha) | integrations | 0.62 | Alphabetical catalog of H3 function names and signatures supported by Databricks SQL, which is detailed API surface knowledge. | | [Alphabetical list of ST geospatial functions](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-st-geospatial-functions-alpha) | integrations | 0.62 | Alphabetical list of ST geospatial functions available in Databricks Runtime, exposing the exact function surface area. | -| [Auto Loader with Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/unity-catalog) | best-practices | 0.62 | Describes how to use Auto Loader with Unity Catalog, including references to Structured Streaming limitations and recommendations. 
This typically includes product-specific configuration guidance and gotchas (for example, access modes, catalog constraints), fitting best-practices. | | [CREATE SERVER](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-server) | configuration | 0.62 | Documents Databricks-specific aliasing of CREATE SERVER to CREATE CONNECTION and preferred usage. | | [Connect to a data preparation partner](https://learn.microsoft.com/en-us/azure/databricks/partner-connect/prep) | integrations | 0.62 | Describes typical steps to connect to data preparation partners, with notes about differences per partner and SQL warehouse vs cluster integration options. | | [Connect to a machine learning partner](https://learn.microsoft.com/en-us/azure/databricks/partner-connect/ml) | integrations | 0.62 | Describes typical steps to connect to ML partners, with Databricks SQL warehouse/cluster integration options and partner-specific considerations. | -| [Create a Supervisor Agent](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/multi-agent-supervisor) | integrations | 0.62 | Describes concrete, product-specific patterns for wiring multiple Databricks agents and tools together under a Supervisor Agent, including how agents interact and are coordinated. This is implementation-focused integration logic rather than just conceptual multi-agent theory. | | [Databricks notebooks](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/notebooks) | integrations | 0.62 | Explains how to integrate Databricks Connect with notebooks, including specific configuration or invocation patterns that are product-specific. 
| | [Explore storage and find data files](https://learn.microsoft.com/en-us/azure/databricks/discover/files) | configuration | 0.62 | Includes programmatic examples and likely specific path formats and options for exploring volumes and cloud URIs, which are concrete configuration/usage details for Databricks-managed storage. | | [MySQL](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql) | decision-making | 0.62 | Described as helping understand the MySQL ingestion workflow and factors that determine setup approach for different personas; likely includes scenario-based guidance and trade-offs for how to configure ingestion. | | [ODBC and JDBC Drivers](https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-odbc-bi) | decision-making | 0.62 | Entry page for Databricks ODBC/JDBC drivers that helps users decide which driver to use and where to go for tool-specific instructions; while light on numbers, it is product-specific selection guidance between driver options. | | [Prepare data for fine-tuning models](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/huggingface/load-data) | best-practices | 0.62 | Shows concrete steps and patterns for preparing data with Hugging Face Datasets on Databricks for LLM fine-tuning, which are practical, product-specific data prep recommendations. | | [Prophecy](https://learn.microsoft.com/en-us/azure/databricks/partners/prep/prophecy) | integrations | 0.62 | Explains integrating Prophecy with Databricks compute; includes steps and configuration details specific to this integration. | +| [Query metric views](https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/query) | integrations | 0.62 | Shows concrete SQL query patterns and how to consume metric views from external tools. This is a product-specific querying/integration pattern rather than generic SQL usage. 
| | [Tutorial: Trace and analyze users and environments](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/add-context-to-traces-tutorial) | integrations | 0.62 | Step-by-step tutorial with concrete code for adding user/session/environment context to traces and querying it, demonstrating specific integration patterns with MLflow Tracing. | | [user](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/user) | integrations | 0.62 | Documents Databricks’ user function behavior (returns current database per summary), which is specific to this environment’s semantics. | | [FAQs](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/faqs) | troubleshooting | 0.61 | FAQ pages for a specific extension typically include concrete error messages, limitations, and resolution steps (for example, connection/auth issues, sync problems, language support caveats). These map symptoms to causes and fixes, fitting the troubleshooting category. | @@ -3188,32 +3210,33 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [API](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/api-usage) | integrations | 0.60 | Overview of Lakebase Autoscaling API with authentication, endpoints, and patterns for REST, CLI, and SDKs. This implies product-specific API/SDK parameters and usage patterns, aligning with integrations & coding patterns. | | [Apply tags to notebooks](https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-tags) | configuration | 0.60 | Describes notebook tags including certification and deprecation system tags, which are product-specific metadata settings tied to governance behavior. 
| | [Assess performance](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/tutorials/ai-cookbook/evaluate-assess-performance) | best-practices | 0.60 | Covers metrics for retrieval, response, and system performance; likely includes concrete metric definitions and usage patterns specific to Databricks RAG apps. | -| [Attribute usage using serverless usage policies](https://learn.microsoft.com/en-us/azure/databricks/admin/usage/budget-policies) | configuration | 0.60 | Describes serverless usage policies and how tags are applied and logged in billing records. This is product-specific configuration behavior for policies and tagging, beyond generic tagging concepts. | | [BOOLEAN type](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/boolean-type) | configuration | 0.60 | Documents Databricks-specific BOOLEAN type semantics and limits. While simple, it is part of the concrete type system configuration for this product. | | [Backfill jobs](https://learn.microsoft.com/en-us/azure/databricks/jobs/backfill-jobs) | configuration | 0.60 | Explains how to run backfills using existing automation and date ranges; likely includes job parameterization and schedule configuration details specific to backfill scenarios. | | [Backup and restore](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/backup-methods) | decision-making | 0.60 | Describes different backup and restore options and when to use them in combination; likely includes scenario-based guidance and trade-offs between methods, which is decision-making specific to Lakebase Autoscaling. | | [Build dashboards with MLflow metadata in system tables](https://learn.microsoft.com/en-us/azure/databricks/mlflow/build-dashboards) | configuration | 0.60 | Uses MLflow metadata in system tables; likely includes table/column names and query patterns that are product-specific configuration details. 
| | [CLUSTER BY clause](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-clusterby) | configuration | 0.60 | CLUSTER BY reference explains Databricks-specific semantics (equivalence to DISTRIBUTE BY + SORT BY, partition-local ordering only), which are engine-specific behavior details. | | [Classic compute plane networking](https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/) | security | 0.60 | Introduces Databricks-specific features for securing control-plane to classic compute-plane connectivity over the cloud backbone. | +| [Common patterns](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/common-patterns) | best-practices | 0.60 | Describes concrete patterns and techniques to optimize managed ingestion pipelines (controlling ingested data, managing updates, advanced behaviors) that are specific to Lakeflow Connect connectors, which aligns with product-specific best practices. | | [Connect on-prem Git server](https://learn.microsoft.com/en-us/azure/databricks/repos/connect-on-prem-git-server) | architecture-patterns | 0.60 | Provides guidance and constraints for connecting Databricks Serverless Private Git to on-premises Git servers, including notes about NCC private endpoint rules and Azure SLB backend pools. These are detailed, product-specific network architecture considerations. | | [Control flow](https://learn.microsoft.com/en-us/azure/databricks/jobs/control-flow) | configuration | 0.60 | Describes conditional tasks, branching, and cleanup; likely includes task-level settings and patterns specific to Databricks job control flow. | | [Cost optimization](https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/cost-optimization/) | architecture-patterns | 0.60 | Cost optimization pillar principles for Databricks; product-specific patterns for structuring workloads and resources to manage cost. 
| -| [Create and monitor budgets](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/budgets) | configuration | 0.60 | Details how budgets work (USD, SKU list price, platform add-ons) and how they are applied and monitored. These are product-specific configuration semantics for budgets that an LLM is unlikely to know exactly. | +| [Create and manage compute policies](https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/policies) | configuration | 0.60 | Explains how to configure compute policies in the workspace, tying into a reference of policy attributes; this is concrete product configuration behavior. | | [Custom format SQL](https://learn.microsoft.com/en-us/azure/databricks/sql/user/sql-editor/custom-format) | configuration | 0.60 | Explains how to customize auto-formatting options; likely includes specific setting names and allowed values for the SQL editor formatting behavior. | | [DISTRIBUTE BY clause](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-distributeby) | configuration | 0.60 | DISTRIBUTE BY reference describes Databricks-specific repartitioning behavior distinct from CLUSTER BY and ORDER BY, which is engine-specific configuration detail. | | [Data skipping index](https://learn.microsoft.com/en-us/azure/databricks/archive/spark-3.x-migration/dataskipping-index) | configuration | 0.60 | Documents the DATASKIPPING INDEX SQL syntax and its removal, with guidance to use Delta tables instead; this is specific SQL configuration and feature behavior for Databricks. | | [DataStreamReader class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader) | integrations | 0.60 | Product-specific interface for loading streaming DataFrames via spark.readStream, including supported external systems; this is concrete API usage for integrations. 
| | [DataStreamWriter class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter) | integrations | 0.60 | Product-specific interface for writing streaming DataFrames to external systems via df.writeStream; this is concrete integration API surface. | | [Database branches](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/branches) | architecture-patterns | 0.60 | Explains branching semantics for Lakebase databases and how to use them for dev/test/experimentation, a product-specific architectural workflow pattern. | -| [Databricks Online Feature Store](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/online-feature-store) | limits-quotas | 0.60 | Describes high-performance, low-latency online feature serving; such service pages typically include concrete performance/latency characteristics and constraints tied to Lakebase projects. | +| [Databricks Online Feature Store](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/online-feature-store) | architecture-patterns | 0.60 | Explains Databricks Online Feature Stores, their primary use cases, and differences with Lakebase autoscaling projects, giving product-specific architectural guidance for online vs offline features. | | [Define quality: Evaluation sets](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/tutorials/ai-cookbook/evaluate-define-quality) | best-practices | 0.60 | Describes how to create evaluation sets specifically for RAG applications; likely includes Databricks-specific structures and examples for evaluation data, which are actionable best practices. 
|
| [Delta Lake](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/delta-lake) | best-practices | 0.60 | Covers how to use Delta tables as streaming sources/sinks, handle upstream changes, and resolve errors from updates/deletes in streaming queries—likely includes product-specific streaming patterns and gotchas beyond generic streaming concepts. |
+| [Designated Services](https://learn.microsoft.com/en-us/azure/databricks/resources/designated-services) | decision-making | 0.60 | Describes which Databricks features are Designated Services, how they use Azure geographies for data residency, and when cross-Geo processing must be enabled. This is product-specific guidance for choosing and configuring data residency behavior across geos, impacting compliance and deployment decisions. |
| [Distributed training of XGBoost models using Scala](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/xgboost-scala) | integrations | 0.60 | Shows how to embed XGBoost models into MLlib pipelines and use cross-validation, which are Databricks-specific integration patterns. |
| [Download files from the internet](https://learn.microsoft.com/en-us/azure/databricks/volumes/download-internet-files) | best-practices | 0.60 | Describes recommended patterns and gotchas for downloading from the public internet into Databricks, including where to store data (volumes vs other locations); product-specific guidance beyond generic HTTP download. |
| [Evaluation and monitoring](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/tutorials/ai-cookbook/fundamentals-evaluation-monitoring-rag) | best-practices | 0.60 | Covers evaluation and monitoring with emphasis on quality, cost, and latency; likely includes Databricks-specific recommendations and metrics wiring that constitute concrete best practices. |
| [Evaluation tools infrastructure](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/tutorials/ai-cookbook/evaluate-enable-measurement) | architecture-patterns | 0.60 | Describes infrastructure needed for measuring RAG quality and how Databricks provides it; likely includes concrete architectural components and patterns for evaluation pipelines, which are design-pattern-level expert guidance. |
| [Featurization for transfer learning](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/preprocess-data/transfer-learning-tensorflow) | integrations | 0.60 | Provides an example notebook using pandas UDFs for transfer learning featurization, which is a Databricks-specific coding pattern. |
-| [Foundation models overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/foundation-model-overview) | decision-making | 0.60 | Lists which foundation models are supported by Mosaic AI Model Serving, including region availability and supported feature areas. This is concrete, product-specific selection guidance (which model where) that helps decide what to use; such availability matrices are not inferable from training data alone. |
+| [Foundation models overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/foundation-model-overview) | decision-making | 0.60 | A page listing supported foundation models typically includes model-specific capabilities, regions, and feature support tables; this helps decide which model to use for a scenario, fitting decision-making with product-specific matrices. |
| [Get started with MLflow Java and Scala](https://learn.microsoft.com/en-us/azure/databricks/archive/mlflow/quick-start-java-scala) | integrations | 0.60 | Quickstart includes concrete MLflow API usage from Java/Scala on Databricks, with specific method calls and patterns for tracking experiments that are product- and SDK-specific rather than conceptual. |
| [Get started with MLflow R](https://learn.microsoft.com/en-us/azure/databricks/archive/mlflow/quick-start-r) | integrations | 0.60 | Quickstart notebook for MLflow tracking in R on Databricks, showing product-specific API usage and integration patterns between R, MLflow, and Databricks experiments. |
| [GraphFrames - Scala](https://learn.microsoft.com/en-us/azure/databricks/integrations/graphframes/user-guide-scala) | integrations | 0.60 | Demonstrates concrete Scala usage of GraphFrames on Databricks with API calls and patterns specific to this integration, which qualifies as a coding/integration pattern rather than generic graph theory content. |
@@ -3229,7 +3252,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [Koalas](https://learn.microsoft.com/en-us/azure/databricks/archive/legacy/koalas) | integrations | 0.60 | Koalas usage in Databricks involves environment/runtime constraints and code patterns specific to this platform, beyond generic pandas knowledge. |
| [LIMIT clause](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-limit) | integrations | 0.60 | Describes Databricks-specific LIMIT clause syntax and interaction with ORDER BY for deterministic results. |
| [MLOps Stacks tutorial](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/mlops-stacks) | best-practices | 0.60 | Describes MLOps Stacks that follow production best practices out of the box and how to create/deploy them with bundles; likely includes Databricks-specific patterns and recommendations. |
-| [Manage providers (for recipients)](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/manage-provider) | security | 0.60 | Explains provider objects and how recipients view and manage provider information in Unity Catalog. This is tied to access control and identity of data providers, which is a product-specific security/authorization construct. |
| [Map visualizations](https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/visualizations/maps) | configuration | 0.60 | Focuses on map visualization configuration options and required geographic data; such pages typically enumerate specific settings and allowed values for map types, layers, and data fields, which is product-specific configuration knowledge. |
| [Materialization](https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/materialization) | architecture-patterns | 0.60 | Explains how Databricks metric view materialization works with Lakeflow Spark Declarative Pipelines and aggregate-aware query rewriting, including product-specific behavior and when to use materialized views for performance, which is closer to an architecture/design pattern for this feature. |
| [Metrics](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/metrics) | configuration | 0.60 | Metrics dashboard documentation typically lists metric names, meanings, and possibly thresholds specific to Lakebase Postgres. |
@@ -3250,7 +3272,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [Overview](https://learn.microsoft.com/en-us/azure/databricks/jobs/) | decision-making | 0.60 | Introduces concepts and choices for managing production workloads with Lakeflow Jobs. Likely includes guidance on when to use jobs, task types, and orchestration options for different scenarios, which is service-specific decision guidance rather than generic workflow theory. |
| [Overview](https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebooks-code) | integrations | 0.60 | Describes magic commands, mixing languages, and editor behaviors that are Databricks-specific coding patterns and APIs, which qualify as product-specific integration/coding patterns. |
| [Overview](https://learn.microsoft.com/en-us/azure/databricks/security/network/) | security | 0.60 | Describes Databricks-specific networking security options (front-end, serverless, classic) and billing implications for serverless networking, which are product-specific. |
-| [Overview of data lineage](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-lineage) | configuration | 0.60 | Describes how to use Catalog Explorer and lineage system tables; likely includes specific table names, UI options, and query patterns that are product-specific configuration/usage details. |
| [Pipeline maintenance](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/pipeline-maintenance) | best-practices | 0.60 | Covers Databricks-specific operational tasks for managed ingestion connectors; actionable maintenance guidance. |
| [Precisely](https://learn.microsoft.com/en-us/azure/databricks/partners/data-governance/precisely) | integrations | 0.60 | Describes how to integrate Precisely’s suite with Databricks, including connector-specific configuration beyond generic patterns. |
| [Predictive I/O](https://learn.microsoft.com/en-us/azure/databricks/optimizations/predictive-io) | best-practices | 0.60 | Outlines Databricks predictive I/O capabilities, categories, and Photon exclusivity, which are specific performance features not generally known. |
@@ -3266,20 +3287,19 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [STRING type](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/string-type) | limits-quotas | 0.60 | States that STRING supports sequences of any length ≥ 0; full page typically clarifies any practical or storage limits. These are concrete constraints on string values in this engine. |
| [STRUCT type](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/struct-type) | configuration | 0.60 | Describes STRUCT type syntax and how fields are structured in Databricks SQL, which is concrete type-system configuration for this product. |
| [Schema enforcement and evolution](https://learn.microsoft.com/en-us/azure/databricks/tables/schema-enforcement) | configuration | 0.60 | Describes Databricks-specific schema-on-write enforcement behavior for Delta-backed tables versus external data, including default behaviors. |
-| [Serve lakehouse data with synced tables](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/sync-tables) | architecture-patterns | 0.60 | Describes using synced tables to serve lakehouse data through Lakebase for operational analytics and real-time apps. This is a product-specific pattern (reverse ETL via Lakebase) with guidance on when/how to use it, fitting architecture & design patterns. |
+| [Serve features](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/online-workflows) | integrations | 0.60 | Describes how to integrate Databricks Feature Store with real-time serving and automated pipelines, including on-demand feature computation; these are concrete integration patterns across services. |
| [Serverless](https://learn.microsoft.com/en-us/azure/databricks/jobs/run-serverless-jobs) | decision-making | 0.60 | Covers using serverless compute for workflows, including how it manages autoscaling and optimization. Such pages typically include guidance on when to choose serverless vs. other compute options, with trade-offs around management, performance, and cost. This aligns with decision-making for compute selection in Lakeflow Jobs. |
| [StreamingQuery class](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery) | integrations | 0.60 | Class-level reference describing a handle to continuous queries and thread-safety; specific streaming control API. |
-| [StreamingQueryListener](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/stream-monitoring) | best-practices | 0.60 | Monitoring guidance for Structured Streaming via Spark UI typically includes product-specific recommendations and gotchas (which metrics to watch, how to interpret them, and how to configure monitoring) that go beyond generic debugging advice. |
+| [StreamingQueryListener](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/stream-monitoring) | best-practices | 0.60 | Explains how to use the Databricks/Spark UI Streaming tab and related mechanisms to monitor streaming applications; likely includes product-specific guidance on interpreting metrics and using monitoring features, which is actionable best-practice content. |
| [Supported browsers](https://learn.microsoft.com/en-us/azure/databricks/resources/supported-browsers) | configuration | 0.60 | Provides explicit list of supported and unsupported browsers; this is product-specific compatibility configuration information. |
| [Supported integrations](https://learn.microsoft.com/en-us/azure/databricks/external-access/integrations) | decision-making | 0.60 | Contains a table of integrated engines and whether they use Unity REST API or Iceberg REST catalog, helping users decide which integration path to use per engine. This is selection guidance between options based on capabilities. |
| [TIMESTAMP_NTZ type](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/timestamp-ntz-type) | best-practices | 0.60 | Includes Databricks-specific notes: public preview status, requirement to enable support for Delta tables, and that new tables auto-enable while existing ones do not; these are product-specific behaviors and gotchas. |
-| [Table constraints](https://learn.microsoft.com/en-us/azure/databricks/tables/constraints) | configuration | 0.60 | Covers Databricks-specific support for enforced constraints and key definitions on Delta tables, including SQL clauses and behavior tied to Delta. |
| [TensorBoard](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/tensorboard) | integrations | 0.60 | Explains how to integrate TensorBoard with Databricks ML workloads, which involves product-specific configuration steps. |
| [Terraform CDK Databricks provider](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/terraform/cdktf) | decision-making | 0.60 | Explains CDKTF deprecation and recommends using the Databricks Terraform provider instead, providing product-specific guidance on which approach to choose for IaC. |
| [Tutorial: Intelligent document processing pipeline](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/idp-pipeline-tutorial) | architecture-patterns | 0.60 | Describes an end-to-end intelligent document processing pipeline using a medallion architecture and multiple AI Functions. This is a product-specific architecture pattern (IDP on Databricks with medallion layers) and provides concrete guidance on how to structure ingestion, parsing, classification, and extraction, which aligns with architecture-patterns. |
| [Update jobs](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/jobs-update) | configuration | 0.60 | Main table of scenarios and code changes for jobs when upgrading; contains concrete path and table-reference patterns unique to Unity Catalog migration, fitting configuration/coding adjustments. |
| [Updates](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/updates) | configuration | 0.60 | Describes selecting update windows (specific day and hour) and how Lakebase applies updates and uses prewarming. This is product-specific configuration of update behavior and scheduling, fitting configuration. |
-| [Use features in online applications](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/online-workflows) | architecture-patterns | 0.60 | Discusses online workflows including Databricks Online Feature Store, on-demand computation, and third-party stores; likely provides pattern-level guidance on when to use each approach within Databricks. |
+| [Usage dashboards](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/usage) | configuration | 0.60 | Explains how to access and import specific pre-built dashboards from the account console—product-specific configuration/usage steps. |
| [WHERE clause](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select-where) | integrations | 0.60 | Provides Databricks-specific WHERE clause syntax and behavior within its SQL dialect. |
| [Work with files in volumes](https://learn.microsoft.com/en-us/azure/databricks/volumes/volume-files) | integrations | 0.60 | Shows concrete path formats and usage patterns for volumes with FUSE-dependent tools and multiple languages; product-specific file access patterns and parameters. |
| [any_value](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/any_value) | integrations | 0.60 | Defines any_value aggregation semantics for groups in Databricks PySpark; concrete function behavior. |
@@ -3333,6 +3353,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [options](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamreader/options) | configuration | 0.60 | Allows setting multiple named options for streaming sources; this is structured configuration for Databricks streaming integrations. |
| [options](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/options) | configuration | 0.60 | Allows setting multiple named options for streaming sinks; this is structured configuration behavior. |
| [orderBy](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/orderby) | integrations | 0.60 | Clarifies that orderBy is an alias for sort; aliasing behavior is specific API knowledge. |
+| [pandas function APIs](https://learn.microsoft.com/en-us/azure/databricks/pandas/pandas-function-apis) | integrations | 0.60 | Describes pandas function APIs that apply native Python functions to PySpark DataFrames, leveraging Arrow and pandas; likely includes specific function signatures, parameter behaviors, and execution characteristics that are product- and API-specific integration patterns. |
| [partitionBy](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/datastreamwriter/partitionby) | configuration | 0.60 | Describes partitioning layout similar to Hive for streaming outputs; this is concrete configuration behavior. |
| [printSchema](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/printschema) | integrations | 0.60 | Explains tree-format schema printing and level control; concrete method behavior. |
| [profile](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/sparksession/profile) | integrations | 0.60 | Documents a property returning a Profile object for performance/memory profiling; specific to this environment. |
@@ -3406,15 +3427,12 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [list_secrets table function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/list_secrets) | 0.55 | Secrets listing function; may mention scopes and auth, but description suggests basic behavior without detailed RBAC role tables or config parameters. |
| [make_valid_utf8 function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/make_valid_utf8) | 0.55 | MAKE_VALID_UTF8 function; describes replacing invalid UTF-8 sequences but lacks numeric limits, config tables, or troubleshooting mappings. |
| [<=> (lt eq gt sign) operator](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/lteqgtsign) | 0.50 | Describes the <=> operator semantics; while slightly product-specific, it is core SQL semantics without limits, configs, or decision matrices. |
-| [Build a knowledge store](https://learn.microsoft.com/en-us/azure/databricks/genie/knowledge-store) | 0.50 | Describes building a knowledge store and using prompt matching; summary suggests conceptual and procedural guidance, not detailed limits, config matrices, or error mappings. |
| [Command Palette commands](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/command-palette) | 0.50 | Lists Command Palette commands; mostly command names and descriptions, not configuration ranges, limits, or troubleshooting mappings. |
-| [Configure endpoints](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/configure-ai-gateway-endpoints) | 0.50 | How-to configure AI Gateway on endpoints; likely has some settings but summary does not clearly indicate full configuration tables with defaults/ranges. |
| [Convert foreign tables to external](https://learn.microsoft.com/en-us/azure/databricks/tables/convert-foreign-external) | 0.50 | How-to for SET EXTERNAL to convert foreign to external tables; preview note but no detailed limits/config tables in summary. |
| [Explore features in Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/uc/ui-uc) | 0.50 | Describes discoverability and UI navigation for feature tables; mostly about using Catalog Explorer and Features UI, not detailed config tables. |
| [Filter on fields](https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/filters/field-filters) | 0.50 | Field filter behavior and multi-dataset connections; likely conceptual and UI-based without structured configuration tables. |
| [GEOGRAPHY type](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/geography-type) | 0.50 | GEOGRAPHY type is product-specific and in preview, but summary only gives conceptual description; no explicit limits, config parameters, or error mappings shown. |
| [GEOMETRY type](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/geometry-type) | 0.50 | GEOMETRY type is product-specific and in preview; summary describes coordinate system but not concrete limits or configuration tables. |
-| [Inference tables](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/inference-tables) | 0.50 | Describes AI Gateway-enabled inference tables; may include schema details but summary does not show explicit config ranges or quotas. |
| [Information schema](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-information-schema) | 0.50 | Information schema overview for Databricks; primarily conceptual and descriptive about metadata exposure, without detailed config tables, limits, or decision/troubleshooting structures. |
| [Labeling schemas](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/human-feedback/concepts/labeling-schemas) | 0.50 | Conceptual explanation of labeling schemas; while it defines purpose and scope, the summary does not expose concrete schema field definitions or constraints. |
| [Lineage and governance](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/lineage) | 0.50 | Governance and lineage overview; likely conceptual explanation of lineage graphs and monitoring, without specific configuration parameters or limits. |
@@ -3445,7 +3463,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [CLEAR CACHE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-cache-clear-cache) | 0.45 | CLEAR CACHE syntax; simple cache-clearing behavior, no numeric limits or specialized configs. |
| [Classification](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/classification) | 0.45 | Overview of Classification built on ai_classify and Agents UI; summary suggests conceptual usage, not detailed configuration or troubleshooting. |
| [Code-based scorers](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/custom-scorers) | 0.45 | Explains what custom code-based scorers are and when to use them; summary does not expose specific scorer APIs, parameter tables, or configuration ranges. |
-| [Collect feedback and expectations by labeling existing traces](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/human-feedback/expert-feedback/label-existing-traces) | 0.45 | Explains using Review App to label traces; summary does not indicate detailed labeling schema fields or configuration tables. |
| [Compare MLflow runs and models using graphs and charts](https://learn.microsoft.com/en-us/azure/databricks/mlflow/visualize-runs) | 0.45 | Describes visualizations for comparing runs; UI usage guidance rather than detailed configuration parameters or expert-only behavior. |
| [Customer-managed keys overview](https://learn.microsoft.com/en-us/azure/databricks/security/keys/customer-managed-keys) | 0.45 | Described as an overview of customer‑managed keys for encryption and supported key sources. Likely conceptual and plan‑level info without detailed role names, permission scopes, or concrete configuration parameter tables. |
| [DESCRIBE LOCATION](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-describe-location) | 0.45 | DESCRIBE EXTERNAL LOCATION metadata; security-adjacent but only returns stored properties, not RBAC roles or config parameters. |
@@ -3469,8 +3486,8 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [MAP type](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/map-type) | 0.45 | MAP type description with examples; summary lacks explicit Databricks-only limits or configuration parameters. |
| [MLflow 2: Run an evaluation](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-evaluation/evaluate-agent) | 0.45 | Explains how to run an evaluation and view results; summary does not show detailed input schema, limits, or error codes. |
| [Manage notebooks](https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebooks-manage) | 0.45 | Managing notebooks (create, open, delete, rename, access) via UI/CLI/API is mostly operational; likely lacks deep configuration tables or expert-only constraints. |
+| [Manage providers (for recipients)](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/manage-provider) | 0.45 | Describes how recipients view provider information and manage provider objects in the UI; appears to be basic usage/navigation without detailed configuration parameters, limits, or troubleshooting mappings. |
| [Monitor a database instance](https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/create/monitor) | 0.45 | Monitoring page for instances; summary suggests general performance monitoring without explicit mention of metrics tables, config parameters, or error-code mappings. |
-| [Monitor compute](https://learn.microsoft.com/en-us/azure/databricks/compute/cluster-metrics) | 0.45 | Explains how to view compute metrics and notes approximate delay; mostly UI usage and conceptual metrics info without detailed config or limits. |
| [Monitor in the UI](https://learn.microsoft.com/en-us/azure/databricks/ldp/monitoring-ui) | 0.45 | Focuses on using the UI for monitoring; mostly operational UI steps without strong indication of detailed error-code mappings or config tables. |
| [Overview](https://learn.microsoft.com/en-us/azure/databricks/integrations/odbc/) | 0.45 | High-level driver overview and migration note; detailed settings are on child pages. |
| [Per-workspace URLs](https://learn.microsoft.com/en-us/azure/databricks/workspace/per-workspace-urls) | 0.45 | Describes per-workspace URL format and deprecation of regional URLs; contains a URL pattern but not limits, configs, or decision matrices. |
@@ -3530,21 +3547,20 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [xpath_double function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/xpath_double) | 0.45 | xpath_double() function page is an XML extraction helper; likely just syntax and return type without product-specific configuration tables, limits, or troubleshooting content. |
| [xpath_float function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/xpath_float) | 0.45 | xpath_float() function page similarly provides basic SQL function reference; no evidence of expert-level limits, configuration matrices, or error-resolution mappings. |
| [Agent mode](https://learn.microsoft.com/en-us/azure/databricks/genie/agent-mode) | 0.40 | Describes Agent mode behavior and capabilities; appears conceptual/behavioral rather than listing config parameters, limits, or troubleshooting mappings. |
-| [Applicable model terms](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/acceptable-use-models) | 0.40 | Lists models subject to separate model terms and links to their terms/acceptable use policies. This is primarily legal/terms-of-use content, not technical limits, configuration, or troubleshooting guidance. |
| [Apply tags](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/tags) | 0.40 | Describes tagging apps for organization and lifecycle status; mostly conceptual/UX-level without detailed configuration parameters or numeric constraints. |
| [April 2022](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2022/april) | 0.40 | April 2022 Azure Databricks release notes describe platform updates but are not structured as limits-quotas, configuration, troubleshooting, or decision-making guidance. |
| [April 2023](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2023/april) | 0.40 | April 2023 Azure Databricks release notes provide platform update information but do not match the structured expert-knowledge patterns defined in the taxonomy. |
| [April 2024](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2024/april) | 0.40 | April 2024 Azure Databricks release notes are not organized as any of the defined expert-knowledge sub-skill types; they are chronological feature updates. |
| [August 2022](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2022/august) | 0.40 | August 2022 Azure Databricks release notes are update announcements, not structured expert-knowledge references as defined by the taxonomy. |
| [August 2023](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2023/august) | 0.40 | August 2023 Azure Databricks release notes function as a chronological list of changes rather than any of the defined expert-knowledge sub-skill types. |
-| [Automatic cluster update](https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/automatic-cluster-update) | 0.40 | Describes what automatic cluster update does and how to schedule it, but summary does not indicate specific configuration parameter tables, limits, or product-unique gotchas; likely a procedural/how-to page rather than expert configuration reference. |
| [Autoscaling](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/autoscaling) | 0.40 | Explains autoscaling behavior conceptually (dynamic compute adjustment, scale-to-zero) but summary does not show specific configuration parameters, thresholds, or limits; reads as feature description rather than detailed config or limits. |
| [Azure Private Link](https://learn.microsoft.com/en-us/azure/databricks/security/network/concepts/private-link) | 0.40 | Described as concepts and architecture for Azure Private Link connectivity patterns; likely conceptual without detailed parameter tables or specific config values. |
| [BEGIN ATOMIC](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-txn-begin-atomic) | 0.40 | This page describes the BEGIN ATOMIC ... END compound statement in Databricks SQL, focusing on transactional script block syntax and behavior. While it is detailed, it is language semantics rather than limits/quotas, configuration matrices, security roles, troubleshooting error codes, or decision-making guidance, so it does not fit the specified sub-skill types. |
| [BEGIN TRANSACTION](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-txn-begin) | 0.40 | BEGIN TRANSACTION syntax; generic transactional concept without product-specific numeric or config details. |
-| [Basics](https://learn.microsoft.com/en-us/azure/databricks/pyspark/basics) | 0.40 | Basic PySpark usage tutorial; generic DataFrame operations and saving data, not focused on Databricks-specific configuration or limits. |
+| [Backfill historical data](https://learn.microsoft.com/en-us/azure/databricks/ldp/flows-backfill) | 0.40 | Backfill concept and scenario description; summary does not show concrete limits, configuration matrices, or error-based troubleshooting content. |
| [Benchmarks](https://learn.microsoft.com/en-us/azure/databricks/genie/benchmarks) | 0.40 | Explains using benchmarks to evaluate Genie spaces; likely methodology and workflow rather than product-specific limits, configs, or error mappings. |
| [Build a geospatial pipeline with native spatial types](https://learn.microsoft.com/en-us/azure/databricks/ldp/tutorial-spatial-pipelines) | 0.40 | Geospatial pipeline tutorial with example functions; appears as a scenario walkthrough rather than configuration references, limits, or best-practice gotchas. |
+| [Built-in operators](https://learn.microsoft.com/en-us/azure/databricks/designer/built-in-operators) | 0.40 | Reference for built-in operators with configuration options, but summary does not indicate structured parameter tables with defaults/ranges or other expert-only details; likely general usage descriptions. |
| [CACHE TABLE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-cache-cache-table) | 0.40 | CACHE TABLE syntax overview; describes caching behavior conceptually without detailed cache size limits, tier-specific constraints, or configuration parameter tables. |
| [COMMIT](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-txn-commit) | 0.40 | COMMIT syntax; standard transactional semantics, no expert-only limits or configs indicated. |
| [CONVERT TO DELTA](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-convert-to-delta) | 0.40 | CONVERT TO DELTA command description; explains behavior and process but summary does not indicate numeric limits, configuration tables, or decision matrices required for expert-knowledge classification. |
@@ -3553,6 +3569,8 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [Collation](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-collation) | 0.40 | Explains collation concepts and default behavior (UTF8_BINARY) but, from the summary, does not expose detailed configuration tables, numeric limits, or decision matrices; mostly conceptual/behavioral description. |
| [Configure task dependencies](https://learn.microsoft.com/en-us/azure/databricks/jobs/run-if) | 0.40 | Describes conditional task dependencies and control flow conceptually; no indication of numeric thresholds, configuration tables, or product-specific error codes. Primarily behavior/UX description. |
| [Convert external tables to managed](https://learn.microsoft.com/en-us/azure/databricks/tables/convert-external-managed) | 0.40 | Describes using ALTER TABLE ... SET MANAGED or Catalog Explorer; appears as straightforward DDL usage without deep config or limits. |
+| [Create and manage shares](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/create-share) | 0.40 | Explains what shares are and how they conceptually work (what assets can be included, behavior when sharing schemas). Summary does not indicate detailed configuration tables, limits, or troubleshooting content. |
+| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/jira-pipeline) | 0.40 | Jira ingestion pipeline page mentions automatic retries with exponential backoff and rate limit errors, but the summary does not show concrete numeric limits, configuration tables, or detailed error-code mappings. It likely links to troubleshooting elsewhere rather than containing that expert reference content itself. |
| [Create visualizations with Genie Code](https://learn.microsoft.com/en-us/azure/databricks/dashboards/tutorials/query-based-params) | 0.40 | Step-by-step tutorial on using query-based parameters in dashboards; focuses on workflow steps rather than enumerating configuration options, limits, or specialized patterns beyond generic tutorial content. |
| [Custom judges](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/custom-judge/) | 0.40 | Explains custom judges conceptually; summary does not show specific make_judge() parameters, config tables, or product-specific edge cases. |
| [Customize notebook appearance](https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-ui) | 0.40 | Customization of notebook appearance (line numbers, dark mode, margins) is UI preference-level configuration without complex parameters or expert-only constraints. |
@@ -3564,6 +3582,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [DESCRIBE PROCEDURE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-describe-procedure) | 0.40 | DESCRIBE PROCEDURE metadata; no numeric limits, config parameters, or security role details. |
| [DROP VIEW](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-drop-view) | 0.40 | Primarily SQL syntax reference for DROP VIEW with privilege notes; no numeric limits, configuration tables, or product-specific thresholds. |
| [Data types index](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/datatypes) | 0.40 | List of PySpark data types with links; largely mirrors upstream Spark types and is not Databricks-specific configuration or troubleshooting. |
+| [Database connectors (CDC)](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/cdc-overview) | 0.40 | Overview of database connectors and CDC in Lakeflow Connect; sounds conceptual (what the connectors do) rather than detailed configuration, limits, or decision-making guidance with quantified criteria. |
| [Databricks Connect release notes](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dbconnect/) | 0.40 | Databricks Connect release notes index; the summary focuses on listing releases and stating version compatibility at a high level. Without explicit evidence of detailed config parameters, limits, or troubleshooting mappings, it’s treated as non-expert for this classification. |
| [Databricks Runtime 10.4 LTS](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/10.4lts) | 0.40 | Runtime 10.4 LTS release notes; LTS release information and change log, not limits, configuration matrices, or troubleshooting mappings. |
| [Databricks Runtime 10.4 LTS ML](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/10.4lts-ml) | 0.40 | Runtime 10.4 LTS ML release notes; ML runtime release information without the required expert-knowledge structures. |
@@ -3609,17 +3628,13 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [Databricks Runtime 16.4 LTS ML](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.4lts-ml) | 0.40 | ML LTS runtime release notes emphasize bundled ML libraries and tools; summary does not show expert-knowledge structures like limits tables or config parameter references. |
| [Databricks Runtime 17.3 LTS](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/17.3lts) | 0.40 | Runtime 17.3 LTS release notes; LTS lifecycle and change details, but not structured as any of the specified expert-knowledge sub-skill categories. |
| [Databricks Runtime 17.3 LTS ML](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/17.3lts-ml) | 0.40 | ML LTS runtime release notes focus on included libraries and capabilities; no indication of numeric limits, config tables, or troubleshooting content. |
-| [Databricks Runtime 18.0](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/18.0) | 0.40 | Runtime 18.0 release notes; contains product updates and package versions but not in the form of limits/quotas, decision matrices, configuration tables, or error-resolution mappings.
| | [Databricks Runtime 18.0 ML](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/18.0ml) | 0.40 | Runtime ML release notes are primarily feature and library-version announcements; the summary does not indicate structured limits, configuration tables, or troubleshooting mappings. | | [Databricks Runtime 18.1](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/18.1) | 0.40 | Runtime 18.1 release notes; primarily version-specific change log rather than structured limits, configuration references, or troubleshooting content as defined by the sub-skill types. | | [Databricks Runtime 18.1 ML](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/18.1ml) | 0.40 | Runtime 18.1 ML release notes; describes included libraries and base runtime but not structured configuration parameters or limits in the summary. | -| [Databricks Runtime 18.2 (Beta)](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/18.2) | 0.40 | Release notes for a specific Databricks Runtime version; likely lists changes, fixes, and package versions but not organized as limits, configuration matrices, troubleshooting guides, or other defined sub-skill patterns. | | [Databricks Terraform provider overview](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/terraform/) | 0.40 | High-level overview of Databricks Terraform provider; likely conceptual and marketing-style without detailed config tables or limits. | | [Debug code with Databricks Connect](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/databricks-connect) | 0.40 | Describes debugging with Databricks Connect in VS Code; appears to be workflow guidance rather than detailed troubleshooting or config tables. 
| -| [Debug notebooks with Databricks Connect](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/notebooks) | 0.40 | Explains running and debugging notebook cells; likely procedural without product-specific limits, config matrices, or error-code mappings. | | [December 2024](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2024/december) | 0.40 | December 2024 Azure Databricks release notes are primarily a chronological feature/change log, not a structured best-practices, configuration, or troubleshooting reference. | | [Deep learning using TensorFlow with HorovodRunner](https://learn.microsoft.com/en-us/azure/databricks/archive/machine-learning/train-model/mnist-tensorflow-keras) | 0.40 | Notebook example of TensorFlow + Keras with HorovodRunner for MNIST; mainly workflow and example code without structured configuration tables or product-specific limits/troubleshooting. | -| [Delete a workspace](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/delete-workspace) | 0.40 | Describes workspace deletion and notes about retained resources; summary does not show detailed configuration options or limits, more of a procedural guide. | | [Develop apps](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/app-development) | 0.40 | Development guidance mentions environment variables and supported frameworks but is summarized at a high level; no explicit parameter tables, default values, or product-specific constraints are evident from the summary. | | [Distributed fine-tune Qwen2-0.5B with LoRA](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-distributed-finetune-qwen2-0.5b) | 0.40 | Tutorial-style notebook for fine-tuning Qwen2-0.5B; likely focuses on example code and workflow rather than structured configuration tables, limits, or troubleshooting mappings. Treated as a general how-to rather than a reusable expert reference. 
| | [Download and install](https://learn.microsoft.com/en-us/azure/databricks/integrations/odbc/download) | 0.40 | Download/install instructions without detailed configuration parameter tables or limits. | @@ -3637,6 +3652,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [FAQ](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-analytics-faq) | 0.40 | FAQ page; description does not indicate detailed error-code mappings or numeric limits. | | [FAQ](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/jira-faq) | 0.40 | FAQ page; description mentions delete handling and permissions but not explicit error-code mappings or numeric limits. | | [FAQ](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/meta-ads-faq) | 0.40 | FAQ page, but summary does not mention error codes, config parameters, or numeric thresholds; likely high-level Q&A. | +| [FAQ](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-faq) | 0.40 | MySQL connector FAQ may contain some product-specific details, but from the summary it appears to be general Q&A; no clear indication of numeric limits, config tables, or error-code-based troubleshooting to justify classification. | | [FAQs and best practices](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/faqs) | 0.40 | FAQ page; summary does not indicate detailed error codes, config tables, or limits—likely conceptual Q&A and clarifications. | | [FSCK REPAIR TABLE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-fsck) | 0.40 | FSCK REPAIR TABLE command overview; describes progressive levels conceptually without exposing specific numeric thresholds, error-code mappings, or configuration parameters. 
| | [February 2023](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2023/february) | 0.40 | February 2023 Azure Databricks release notes list new features and improvements but lack the structured numeric or configuration detail required for classification. | @@ -3654,12 +3670,10 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [IPython kernel](https://learn.microsoft.com/en-us/azure/databricks/notebooks/ipython-kernel) | 0.40 | Explains IPython kernel usage and benefits; mostly conceptual and runtime-version notes without detailed configuration tables or limits. | | [ITERATE statement](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/control-flow/iterate-stmt) | 0.40 | Language reference for ITERATE control-flow syntax; mostly generic SQL semantics without configuration tables, limits, or product-specific thresholds. | | [Import and query data in Excel](https://learn.microsoft.com/en-us/azure/databricks/integrations/excel-query) | 0.40 | Covers how to use the Azure Databricks Excel Add-in to import and query data; likely a usage tutorial focused on UI flows and basic querying rather than configuration parameters, limits, or error-code-based troubleshooting. | -| [Inference tables](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/inference-tables-beta) | 0.40 | Explains inference tables conceptually; no explicit evidence of configuration parameter ranges or quotas. | -| [Install the Excel Add-in](https://learn.microsoft.com/en-us/azure/databricks/integrations/excel-setup) | 0.40 | Describes how to install and set up the Azure Databricks Excel Add-in and connect via SSO; appears to be a step-by-step setup/tutorial without detailed parameter tables, limits, or specialized troubleshooting mappings. 
| +| [Install the Excel Add-in](https://learn.microsoft.com/en-us/azure/databricks/integrations/excel-setup) | 0.40 | Setup guide for the Azure Databricks Excel Add-in focused on installation and SSO connection. Based on the summary, it reads as a step-by-step tutorial rather than a parameter reference; it likely lacks detailed configuration tables, limits, or error-code-based troubleshooting. | | [Intelligent document processing](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/intelligent-document-processing) | 0.40 | Describes intelligent document processing conceptually and mentions AI Functions; appears as solution overview without detailed configs or limits. | | [Isolation levels](https://learn.microsoft.com/en-us/azure/databricks/optimizations/isolation/isolation-levels) | 0.40 | Describes two isolation levels conceptually; summary does not indicate specific configuration parameters, thresholds, or product-specific numeric guidance. | | [January 2023](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2023/january) | 0.40 | January 2023 Azure Databricks release notes are change-log style and do not present structured expert knowledge in any of the specified sub-skill formats. | -| [January 2025](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/january) | 0.40 | January 2025 Azure Databricks release notes list new features and improvements but do not present structured expert knowledge in the forms specified (limits, configuration tables, troubleshooting matrices, etc.). | | [July 2023](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2023/july) | 0.40 | July 2023 Azure Databricks release notes are change-log content and do not provide structured limits, configuration, troubleshooting, or decision-making guidance. 
| [July 2024](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2024/july) | 0.40 | July 2024 Azure Databricks release notes list platform updates but do not match any specific sub-skill pattern such as limits tables, RBAC role references, or troubleshooting mappings. |
| [June 2022](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2022/june) | 0.40 | June 2022 Azure Databricks release notes list new features and changes but do not provide structured limits, configuration parameters, or troubleshooting mappings. |
@@ -3668,13 +3682,14 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
| [LEAVE statement](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/control-flow/leave-stmt) | 0.40 | Language reference for LEAVE control-flow syntax; describes behavior but not product-specific limits, configs, or decision matrices. |
| [LOOP statement](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/control-flow/loop-stmt) | 0.40 | Language reference for LOOP statement; explains syntax and semantics but lacks numeric limits, configuration parameters, or troubleshooting mappings. |
| [Limitations](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/limitations) | 0.40 | Describes limitations conceptually; summary does not indicate concrete numeric limits or quotas required for limits-quotas classification. |
+| [Limitations](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-limits) | 0.40 | Described as limitations and considerations, but these do not appear to be concrete numeric limits or quotas; likely qualitative constraints rather than the specific values or tables required for limits-quotas classification.
| | [Limitations](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/query-based-limits) | 0.40 | Describes limitations conceptually (deletion tracking, cursor constraints) but summary does not indicate specific numeric limits, quotas, or tier-based tables required for limits-quotas classification. | +| [Limitations](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-limits) | 0.40 | Lists limitations and considerations but summary does not show concrete numeric limits or quotas; likely qualitative constraints rather than detailed limit tables. | | [MCP servers](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/mcp) | 0.40 | Introduces MCP servers for agents and references broader MCP documentation; likely conceptual/overview of the protocol and its role in Databricks rather than detailed config tables, limits, or troubleshooting mappings. | | [MLflow 2: Collect human feedback](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-evaluation/review-app) | 0.40 | Describes using a review app to collect human feedback; likely workflow/UI guidance without product-specific limits, config tables, or error-code mappings. | | [MLflow 3 for models](https://learn.microsoft.com/en-us/azure/databricks/mlflow/mlflow-3-install) | 0.40 | Getting-started install article with demo notebooks; likely step-by-step but not a consolidated configuration reference or expert-only details. | | [MSCK REPAIR TABLE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-repair-table) | 0.40 | Describes REPAIR TABLE behavior and when to use SYNC METADATA, but no numeric limits, configuration tables, or detailed decision matrices. 
| | [Manage Foundation Model Fine-tuning runs](https://learn.microsoft.com/en-us/azure/databricks/large-language-models/foundation-model-training/view-manage-runs) | 0.40 | Focuses on viewing, managing, and analyzing runs; summary does not indicate detailed parameter tables, error codes, or configuration options—likely workflow guidance rather than expert configuration or troubleshooting content. | -| [Manage classic compute](https://learn.microsoft.com/en-us/azure/databricks/compute/clusters-manage) | 0.40 | Management article (start, stop, edit, monitor) appears procedural; summary does not indicate detailed config tables, limits, or error-code mappings. | | [March 2022](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2022/march) | 0.40 | March 2022 Azure Databricks release notes function as release announcements and do not contain the structured expert knowledge required for classification into the defined sub-skill types. | | [March 2024](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2024/march) | 0.40 | March 2024 Azure Databricks release notes describe new features but do not provide structured limits, configuration options, troubleshooting flows, or decision criteria. | | [March 2025](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/march) | 0.40 | Monthly Azure Databricks release notes summarize new features and changes but are not organized as limits, configuration references, troubleshooting guides, or other structured expert patterns defined in the taxonomy. | @@ -3682,35 +3697,36 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [May 2024](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2024/may) | 0.40 | May 2024 Azure Databricks release notes provide feature summaries but lack the structured numeric limits, configuration parameters, or decision matrices required for classification. 
| | [Metric views](https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/) | 0.40 | Explains what metric views are and how to define/govern them; sounds like conceptual plus basic how-to, without clear indication of detailed config tables or limits. | | [Migrate from Model Serving](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/migrate-agent-to-apps) | 0.40 | Migration guidance from Model Serving to Databricks Apps, but summary suggests high-level recommendations and advantages rather than detailed configuration matrices, limits, or error-code-based troubleshooting. | -| [Monitor](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/monitor) | 0.40 | General monitoring overview (system operations, metrics, query analysis); summary lacks specific metrics tables, config parameters, or error mappings. | +| [Monitor compute](https://learn.microsoft.com/en-us/azure/databricks/compute/cluster-metrics) | 0.40 | Explains how to view compute metrics in the UI and notes storage location and delay; appears to be a usage guide without detailed configuration parameters, limits, or troubleshooting mappings. | | [Networking security architecture](https://learn.microsoft.com/en-us/azure/databricks/security/network/concepts/architecture) | 0.40 | Appears to be a conceptual overview of Databricks control/compute plane and networking security architecture without detailed configuration tables or numeric thresholds. | | [November 2023](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2023/november) | 0.40 | November 2023 Azure Databricks release notes are a platform change log and do not match the structured expert-knowledge categories like limits-quotas or configuration. 
| | [November 2024](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2024/november) | 0.40 | November 2024 Azure Databricks release notes describe new capabilities but are not organized as limits/quotas, configuration parameter tables, or other defined expert-knowledge categories. | -| [OPTIMIZE](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-optimize) | 0.40 | OPTIMIZE command description for Delta tables; focuses on what the command does (bin-packing, collocation) rather than detailed limits, configuration parameter tables, or decision matrices. | | [October 2022](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2022/october) | 0.40 | October 2022 Azure Databricks release notes summarize platform improvements but are not structured as limits, configuration, troubleshooting, or decision-making content. | | [October 2023](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2023/october) | 0.40 | October 2023 Azure Databricks release notes list platform improvements but are not structured as best-practices, troubleshooting, or configuration references. | -| [October 2024](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2024/october) | 0.40 | October 2024 Azure Databricks release notes function as a change log rather than a structured guide for limits, configuration, troubleshooting, or decision-making. | | [Organize training runs with experiments](https://learn.microsoft.com/en-us/azure/databricks/mlflow/experiments) | 0.40 | Explains how to create/manage experiments and navigate UI; organizational guidance rather than deep configuration or error mapping. | | [Outputs and results](https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-outputs) | 0.40 | Explains how to view and manage notebook outputs and results; mostly UI operations and generic behavior without detailed configuration or limits. 
| | [Overview](https://learn.microsoft.com/en-us/azure/databricks/archive/connectors/external-systems) | 0.40 | High-level index/overview of external connectors; detailed configuration is in the linked connector pages, not here. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-federation) | 0.40 | Described as overview information about OAuth token federation; likely conceptual without detailed config tables or role mappings. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/) | 0.40 | Conceptual description of Declarative Automation Bundles; summary suggests overview of what bundles are, not detailed configuration references or limits. | -| [Overview](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/) | 0.40 | Conceptual description of MCP on Databricks and listing types of MCP servers; summary does not indicate detailed configuration parameters, limits, or troubleshooting mappings. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/large-language-models/) | 0.40 | Overview of using LLMs on Databricks with common libraries; largely conceptual and high-level without detailed config tables, limits, or error mappings. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/ldp/) | 0.40 | High-level overview of Lakeflow Spark Declarative Pipelines; summary doesn’t indicate detailed configs, limits, or troubleshooting content. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/ldp/materialized-views) | 0.40 | Explains materialized views conceptually in pipelines; summary does not indicate detailed config tables, limits, or troubleshooting mappings. | +| [Overview](https://learn.microsoft.com/en-us/azure/databricks/ldp/streaming-tables) | 0.40 | Explains streaming tables and when they are a good choice, but summary lacks numeric thresholds, detailed decision matrices, or configuration parameter tables. 
| | [Overview](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/overview/) | 0.40 | Conceptual overview of how MLflow helps with GenAI quality; does not indicate detailed configuration parameters, limits, or troubleshooting content. | +| [Overview of D2D sharing](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/share-data-databricks) | 0.40 | Overview of Databricks-to-Databricks Delta Sharing for providers; primarily conceptual workflow and requirements, not detailed configuration tables, error codes, or quantified trade-offs. | +| [Overview of open sharing](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/share-data-open) | 0.40 | Provider-focused overview of open sharing protocol; summary suggests conceptual guidance and workflow, not product-specific limits, configuration tables, or error mappings. | | [Package cells](https://learn.microsoft.com/en-us/azure/databricks/notebooks/package-cells) | 0.40 | Explains package cells conceptually and how to define Scala classes; no clear tables of settings, limits, or error mappings. | +| [Parameters](https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/filters/parameters) | 0.40 | Describes using parameters to make dashboards interactive; focuses on runtime substitution and filtering, but no explicit config tables or limits are indicated. | | [Point-in-time restore](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/point-in-time-restore) | 0.40 | Explains point-in-time restore conceptually; summary does not indicate specific restore windows, limits, or configuration parameters. | | [PostgreSQL compatibility](https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/query/postgres-compatibility) | 0.40 | PostgreSQL compatibility overview; likely lists supported/unsupported features conceptually rather than detailed config tables, limits, or error mappings. 
| | [Provider policies](https://learn.microsoft.com/en-us/azure/databricks/marketplace/provider-policies) | 0.40 | Provider policy/legal terms; while detailed, they are contractual rather than technical configuration, limits, or troubleshooting for an AI skills system. | | [Query LLMs](https://learn.microsoft.com/en-us/azure/databricks/large-language-models/llm-serving-intro) | 0.40 | Getting started with LLM serving; pay-per-token mention but no specific numeric limits, config tables, or decision matrices. | -| [Query endpoints](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/query-endpoints-beta) | 0.40 | Describes how to query AI Gateway endpoints; summary does not indicate detailed parameter tables or quotas. | | [Query from Lakebase SQL Editor](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/sql-editor) | 0.40 | Describes features of the Lakebase SQL Editor (EXPLAIN/ANALYZE, meta-commands, export formats); likely a usage guide rather than detailed configuration, limits, or troubleshooting mappings. | | [REFRESH (MATERIALIZED VIEW or STREAMING TABLE)](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-refresh-full) | 0.40 | Explains REFRESH for materialized views/streaming tables and mentions synchronous default, but lacks numeric limits, config parameter tables, or detailed troubleshooting. | | [REPEAT statement](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/control-flow/repeat-stmt) | 0.40 | Language reference for REPEAT statement; standard control-flow description without expert-only limits, quotas, or product-specific configuration details. | | [RESIGNAL statement](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/control-flow/resignal-stmt) | 0.40 | RESIGNAL statement reference; while Databricks-specific, it is a straightforward syntax/behavior description without numeric limits, config tables, or error-code troubleshooting mappings. 
| | [ROLLBACK](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-txn-rollback) | 0.40 | ROLLBACK syntax; generic transactional behavior without numeric limits or config matrices. | +| [Read shared data (Databricks-to-Databricks)](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/read-data-databricks) | 0.40 | Describes how recipients read shared data using Databricks-to-Databricks protocol; appears to be a usage/tutorial style page without detailed configuration matrices or troubleshooting content. | | [Release 2025.12](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2025/12/) | 0.40 | DLT release notes for version 2025.12; does not provide structured limits, configuration parameters, troubleshooting error mappings, or decision matrices. | | [Release 2025.15](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2025/15/) | 0.40 | DLT release notes for version 2025.15; change log style content rather than expert-knowledge reference material in the defined categories. | | [Release 2025.16](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2025/16/) | 0.40 | DLT release notes for version 2025.16; primarily a chronological list of changes, not structured as limits-quotas, configuration, troubleshooting, or decision-making guidance. | @@ -3721,17 +3737,16 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Release 2025.28](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2025/28/) | 0.40 | Versioned release notes for Lakeflow Spark Declarative Pipelines; contains change log information rather than structured limits, configuration, troubleshooting, or decision matrices. 
|
 | [Release 2025.29](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2025/29/) | 0.40 | Release notes for a specific Databricks Lakeflow Declarative Pipelines version; likely lists new features and fixes but not organized as limits, configuration tables, troubleshooting mappings, or other defined sub-skill types. |
 | [Row-level concurrency](https://learn.microsoft.com/en-us/azure/databricks/optimizations/isolation/row-level-concurrency) | 0.40 | Row-level concurrency behavior is described conceptually; summary does not show concrete config options, error codes, or numeric thresholds. |
+| [SQL language reference](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/) | 0.40 | High-level entry page for the Databricks SQL language reference; likely an index/overview of syntax topics rather than detailed limits, configuration tables, or troubleshooting mappings. Does not clearly match any expert-knowledge sub-skill type as defined. |
 | [Schedule queries](https://learn.microsoft.com/en-us/azure/databricks/sql/user/queries/schedule-query) | 0.40 | Basic scheduling steps for queries; no indication of limits, config matrices, or error-code-based troubleshooting. |
 | [Schedule refreshes in Databricks SQL](https://learn.microsoft.com/en-us/azure/databricks/ldp/dbsql/schedule-refreshes) | 0.40 | Describes how to schedule refreshes; appears to be procedural UI/API steps without limits, config matrices, or detailed troubleshooting. |
 | [Schedules and subscriptions](https://learn.microsoft.com/en-us/azure/databricks/dashboards/share/schedule-subscribe) | 0.40 | Describes scheduling updates and subscriptions conceptually; summary does not indicate specific numeric schedules, quotas, or config tables. |
 | [Search for workspace objects](https://learn.microsoft.com/en-us/azure/databricks/search/) | 0.40 | Describes how to search for workspace objects; while it mentions a CSP-related behavior, it is mostly procedural UI guidance, not a structured configuration or security reference. |
 | [September 2023](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2023/september) | 0.40 | September 2023 Azure Databricks release notes are update summaries, not structured expert guidance with numeric limits, parameter tables, or decision matrices. |
 | [Serverless compute plane networking](https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/) | 0.40 | Described as an introduction to serverless compute plane networking and cost note; likely a conceptual/overview page about tools and architecture rather than detailed config tables, limits, or error mappings. |
-| [Set up a Genie space](https://learn.microsoft.com/en-us/azure/databricks/genie/set-up) | 0.40 | Explains how to set up and manage a Genie space; likely a setup/tutorial flow without detailed config parameter tables or numeric thresholds. |
 | [Set up and manage Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/get-started) | 0.40 | ‘Get started’ article is likely a procedural setup guide without dense configuration parameter tables or decision matrices; mostly onboarding steps. |
 | [SparkR ML tutorial](https://learn.microsoft.com/en-us/azure/databricks/sparkr/glm-tutorial) | 0.40 | Tutorial on using glm with SparkR; mostly generic modeling steps and syntax, not Databricks-specific configuration or limits. |
 | [SparkR overview](https://learn.microsoft.com/en-us/azure/databricks/sparkr/overview) | 0.40 | SparkR overview with deprecation note; largely conceptual and lifecycle info without detailed configs, limits, or patterns. |
-| [Streaming tables in Databricks SQL](https://learn.microsoft.com/en-us/azure/databricks/ldp/dbsql/streaming) | 0.40 | How-to usage of streaming tables in Databricks SQL; no config tables, limits, or product-specific error/diagnostic details. |
 | [Supported regions](https://learn.microsoft.com/en-us/azure/databricks/resources/supported-regions) | 0.40 | Lists supported regions; while specific, it is static catalog data rather than limits, configs, or decision matrices. |
 | [System operations](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/operations) | 0.40 | Describes system operations conceptually (user-initiated vs system-initiated); summary does not show specific operation codes, limits, or troubleshooting mappings. |
 | [Table visualizations](https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/visualizations/tables) | 0.40 | Covers table and pivot visualization configuration; primarily UI-driven formatting options, not expert-only configuration or limits. |
@@ -3740,14 +3755,11 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [Track models and experiments overview](https://learn.microsoft.com/en-us/azure/databricks/mlflow/tracking) | 0.40 | General description of MLflow tracking and experiments; likely conceptual with basic usage, not a detailed config or troubleshooting reference. |
 | [Tutorial: Develop a Node.js app](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/tutorial-node) | 0.40 | Tutorial for building a simple Node.js app with Express and Chart.js; likely step-by-step example code rather than structured config tables, limits, or decision matrices. |
 | [Tutorial: Develop an app with Streamlit](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/tutorial-streamlit) | 0.40 | Tutorial for building a Databricks app with Streamlit and SQL Connector; primarily example workflow, not structured expert reference content. |
-| [Tutorial: Load and transform data using DataFrames](https://learn.microsoft.com/en-us/azure/databricks/getting-started/dataframes) | 0.40 | Step-by-step tutorial for DataFrames; mostly generic Spark usage. Free Edition notes are environment constraints but not detailed limits/config matrices. |
 | [Upload files](https://learn.microsoft.com/en-us/azure/databricks/genie/file-upload) | 0.40 | Covers uploading small CSV/Excel files to a Genie space; while there may be size limits, the summary does not indicate explicit numeric quotas or detailed configuration tables. |
-| [Usage tracking](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/usage-tracking-beta) | 0.40 | Monitoring usage via system table; likely describes schema but summary does not show detailed config ranges or limits. |
 | [Use Databricks data](https://learn.microsoft.com/en-us/azure/databricks/integrations/msft-power-platform-usage) | 0.40 | Usage-focused page on building apps/flows/agents after connection exists. Summary does not indicate detailed configuration tables, limits, or troubleshooting; more of a how-to use data conceptually. |
 | [Use a JAR on serverless compute](https://learn.microsoft.com/en-us/azure/databricks/jobs/how-to/use-jars-in-workflows) | 0.40 | Tutorial for creating and running JARs on serverless compute; no detailed limits, config matrices, or product-specific parameters. |
 | [Use the data profiling dashboard](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/data-profiling/monitor-dashboard) | 0.40 | Describes what the automatically created dashboard shows but appears more descriptive than configuration-focused; no clear parameter tables, limits, or decision matrices indicated in the summary. |
 | [Use the web terminal](https://learn.microsoft.com/en-us/azure/databricks/compute/web-terminal) | 0.40 | Describes web terminal usage and capabilities; appears as a feature overview/tutorial without deep configuration tables or error mappings. |
-| [Visualization types](https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/visualizations/types) | 0.40 | Catalog of visualization types and examples; mostly conceptual/how-to, not deep config or limits. |
 | [WHILE statement](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/control-flow/while-stmt) | 0.40 | WHILE statement reference; generic control-flow semantics without expert-only numeric limits, configuration options, or decision matrices. |
 | [What are ACID guarantees on Databricks?](https://learn.microsoft.com/en-us/azure/databricks/lakehouse/acid) | 0.40 | Explains ACID guarantees conceptually for Delta Lake on Azure Databricks; while product-specific, it is primarily conceptual behavior rather than limits, configuration parameters, or decision matrices as defined by the sub-skill types. |
 | [What are UDFs?](https://learn.microsoft.com/en-us/azure/databricks/udf/) | 0.40 | Conceptual explanation of UDFs and their strengths/limitations; summary does not show product-specific parameters, limits, or error codes. |
@@ -3837,7 +3849,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [BIGINT type](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/bigint-type) | 0.35 | BIGINT type is standard; summary only notes 8-byte signed integer with examples, no Databricks-specific constraints or configs. |
 | [BINARY type](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/binary-type) | 0.35 | BINARY type description appears generic; summary lacks explicit limits, config parameters, or Databricks-only behavior. |
 | [Build an ETL pipeline (Lakeflow SDP)](https://learn.microsoft.com/en-us/azure/databricks/getting-started/data-pipeline-get-started) | 0.35 | ETL pipeline tutorial using Lakeflow Spark Declarative Pipelines; focuses on workflow steps rather than configuration matrices or limits. |
-| [Clean rooms overview](https://learn.microsoft.com/en-us/azure/databricks/clean-rooms/) | 0.35 | Introductory overview of Clean Rooms; primarily conceptual without detailed configuration parameters or limits. |
 | [Compliance overview](https://learn.microsoft.com/en-us/azure/databricks/security/privacy/) | 0.35 | High-level compliance overview (auditing, privacy, regulations) without detailed configuration parameters, limits, or decision matrices. |
 | [Connect to a data governance partner](https://learn.microsoft.com/en-us/azure/databricks/partner-connect/data-governance) | 0.35 | Data governance Partner Connect overview; mostly generic steps and navigation without detailed product-specific configuration. |
 | [Connect to a reverse ETL partner](https://learn.microsoft.com/en-us/azure/databricks/partner-connect/reverse-etl) | 0.35 | Generic Partner Connect reverse ETL overview; mainly navigation and high-level steps without detailed config matrices. |
@@ -3845,10 +3856,7 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [Connect to a semantic layer partner](https://learn.microsoft.com/en-us/azure/databricks/partner-connect/semantic-layer) | 0.35 | Semantic layer Partner Connect overview; primarily generic connection workflow and navigation, not detailed configuration tables. |
 | [Create a dashboard](https://learn.microsoft.com/en-us/azure/databricks/dashboards/tutorials/create-dashboard) | 0.35 | Step-by-step UI tutorial for creating a dashboard; mostly basic usage without detailed configuration matrices or platform-specific constraints. |
 | [Create a table](https://learn.microsoft.com/en-us/azure/databricks/getting-started/create-table) | 0.35 | Tutorial on creating a table and granting privileges; primarily step-by-step usage of Unity Catalog, not a comprehensive security or configuration reference with role matrices or parameter tables. |
-| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/jira-pipeline) | 0.35 | Pipeline creation guide; mention of automatic retries and rate limits is conceptual here, without explicit numeric limits or config tables. |
-| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-pipeline) | 0.35 | Pipeline creation tutorial for SharePoint; mostly procedural and not clearly a configuration reference, limits page, or troubleshooting guide. |
 | [Create workspace using Bicep](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/bicep) | 0.35 | Bicep deployment how-to; appears to be generic IaC usage rather than a detailed configuration reference with Databricks-specific parameters. |
-| [Create workspace using PowerShell](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/powershell) | 0.35 | PowerShell deployment how-to; summary focuses on prerequisites and basic commands, not on Databricks-specific deployment constraints or matrices. |
 | [Create workspace using an ARM template](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/arm-template) | 0.35 | How-to for deploying with ARM templates; likely generic template usage without detailed Databricks-specific configuration tables in the summary. |
 | [Custom LLM (legacy)](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/custom-llm) | 0.35 | Describes creating a generative AI agent using Custom LLM (beta). Summary suggests a feature tutorial rather than detailed limits, configuration matrices, or troubleshooting content. |
 | [Custom calculations](https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/data-modeling/custom-calculations/) | 0.35 | Explains custom calculations in dashboards; likely focuses on UI usage and formula concepts, not on product-specific limits, config ranges, or troubleshooting mappings. |
@@ -3859,13 +3867,11 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [DeepEval scorers](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/third-party-scorers/deep-eval) | 0.35 | Describes DeepEval integration conceptually; summary lacks concrete configuration parameters, defaults, or SDK-specific details. |
 | [Detect anomalies at scale](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/anomaly-detection/) | 0.35 | The anomaly detection page describes what anomaly detection is, what it monitors, and how to use it, plus a billing note about default storage. From the summary, it looks like a feature explanation/usage guide rather than a limits table, configuration reference, or troubleshooting guide with specific error codes or parameters. Lacking clear evidence of detailed numeric thresholds or config tables, it is not classified as expert knowledge here. |
 | [Explore data in Databricks One](https://learn.microsoft.com/en-us/azure/databricks/getting-started/analyst-get-started) | 0.35 | Business analyst tutorial for dashboards and Genie; usage-focused, not deep configuration or troubleshooting. |
-| [FAQ](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-faq) | 0.35 | MySQL connector FAQ; summary does not indicate detailed error-code mappings, config tables, or limits—likely general Q&A rather than expert troubleshooting or configuration reference. |
 | [FAQ](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/salesforce-faq) | 0.35 | FAQ page; may mix conceptual and minor specifics but is not clearly organized as limits, configuration tables, or troubleshooting with error codes. |
 | [FAQ](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/servicenow-faq) | 0.35 | FAQ page; may contain some specifics but not clearly structured as limits, configuration reference, or troubleshooting with error codes. |
 | [FAQ](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-faq) | 0.35 | FAQ page for SharePoint connector; may contain mixed information but not clearly structured as limits, configuration tables, or troubleshooting with error codes. |
 | [FAQ](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-reports-faq) | 0.35 | FAQ page; summary does not indicate presence of specific error codes, config tables, or limits—likely general Q&A rather than structured expert troubleshooting or config reference. |
 | [Forecasting with the UI](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl/forecasting) | 0.35 | Classic compute forecasting tutorial; mentions runtime requirement but not detailed limits, config matrices, or troubleshooting mappings. |
-| [Get identifiers for workspace objects](https://learn.microsoft.com/en-us/azure/databricks/workspace/workspace-details) | 0.35 | Shows how to obtain various IDs and URLs; while useful, it is procedural and does not present configuration tables, limits, or error-resolution mappings. |
 | [Guardrails AI scorers](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/third-party-scorers/guardrails) | 0.35 | Guardrails AI integration overview; no explicit mention of validator configuration parameters, settings tables, or product-specific constraints. |
 | [How Photon improves query performance](https://learn.microsoft.com/en-us/azure/databricks/compute/photon) | 0.35 | High-level explanation of Photon benefits and compatibility; lacks detailed configuration parameters, limits, or troubleshooting mappings. |
 | [Hugging Face overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/huggingface/) | 0.35 | Introductory overview of Hugging Face Transformers on Databricks and installation; mostly conceptual and basic setup without detailed config matrices or product-specific constraints. |
@@ -3888,7 +3894,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [RAGAS scorers](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/third-party-scorers/ragas) | 0.35 | RAGAS integration overview; summary does not indicate detailed configuration or metric parameterization. |
 | [Release 2025.30](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2025/30/) | 0.35 | Release 2025.30 notes similarly list features and improvements; no indication of limits/quotas, configuration parameter tables, or troubleshooting mappings. |
 | [Release 2025.36](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2025/36/) | 0.35 | Release 2025.36 notes describe features and improvements over a date range; summary does not show the specific expert-knowledge patterns required for classification. |
-| [Release notes 2024](https://learn.microsoft.com/en-us/azure/databricks/sql/release-notes/2024) | 0.35 | 2024 SQL release notes include language updates and features but the summary does not show numeric limits, config tables, or symptom→solution mappings required for classification. |
 | [Run a file or notebook in Databricks](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/run) | 0.35 | How to run files/notebooks as jobs; typical usage article without detailed configuration parameter tables or troubleshooting mappings. |
 | [Run a notebook from another notebook](https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-workflows) | 0.35 | Covers orchestration and modularization patterns but summary doesn’t show Databricks-specific decision matrices or quantified trade-offs. |
 | [Run tests](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/pytest) | 0.35 | Describes running Python tests; likely generic test execution steps rather than product-specific expert configuration or troubleshooting. |
@@ -3977,21 +3982,26 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [AI Playground](https://learn.microsoft.com/en-us/azure/databricks/large-language-models/ai-playground) | 0.30 | Overview of AI Playground and how to chat with LLMs; mostly conceptual/usage description without detailed configuration tables or limits. |
 | [AI agent skills](https://learn.microsoft.com/en-us/azure/databricks/agent-skills/) | 0.30 | Conceptual description of agent skills format and usage; no indication of detailed config tables, limits, or error codes. |
 | [AI catalog foundation models](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/pretrained-models) | 0.30 | Describes accessing generative AI/LLM models from Unity Catalog and one-click access. Likely a usage/feature overview without detailed configuration tables, limits, or troubleshooting content. |
+| [AI governance (Unity AI Gateway)](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/) | 0.30 | Overview of Unity AI Gateway capabilities; summary indicates conceptual governance description without concrete limits, config tables, or decision matrices. |
 | [AI/BI release notes](https://learn.microsoft.com/en-us/azure/databricks/ai-bi/release-notes/) | 0.30 | Top-level AI/BI release notes index; summary indicates a list of features and updates, not structured limits, configuration, or troubleshooting content. |
+| [Add AI-generated comments](https://learn.microsoft.com/en-us/azure/databricks/comments/ai-comments) | 0.30 | Page is primarily a feature overview and how-to for AI-generated comments in Unity Catalog. The summary does not indicate presence of numeric limits, configuration parameter tables, error-code-based troubleshooting, or other product-specific expert details as defined by the sub-skill types. |
+| [Administration topics](https://learn.microsoft.com/en-us/azure/databricks/admin/) | 0.30 | High-level administration landing page that aggregates topics; no indication of detailed limits, configs, or error mappings. |
 | [Agent memory](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/stateful-agents) | 0.30 | Describes AI agent memory concepts and using Lakehouse/Lakebase as durable memory; likely conceptual and architectural overview without concrete limits, config tables, or troubleshooting mappings. |
 | [Aggregations](https://learn.microsoft.com/en-us/azure/databricks/transform/aggregation) | 0.30 | Introductory aggregation semantics and comparison of batch, materialized views, and streaming; likely conceptual without detailed product-specific parameters or limits. |
 | [Alphabetical list of built-in functions](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-functions-builtin-alpha) | 0.30 | Alphabetical index of built-in functions; navigational content without expert-level configuration, limits, or troubleshooting. |
-| [Apache Spark API](https://learn.microsoft.com/en-us/azure/databricks/reference/spark) | 0.30 | Overview page linking to Apache Spark API references; does not itself contain detailed configuration, limits, or error mappings. |
 | [April 2019](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2019/april) | 0.30 | April 2019 Databricks release notes provide feature updates but no structured limits, configuration, or troubleshooting content as required by the sub-skill definitions. |
 | [April 2020](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2020/april) | 0.30 | April 2020 Databricks release notes are a feature change log; they lack numeric limits, config tables, or error-code-based troubleshooting content. |
 | [April 2021](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2021/april) | 0.30 | April 2021 release notes; monthly product changes, not limits, configuration, troubleshooting, or decision-making guidance. |
-| [April 2025](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/april) | 0.30 | April 2025 release notes; visible text is generic and not focused on limits, configs, or error mappings. |
+| [April 2025](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/april) | 0.30 | Monthly release notes page; describes new features and improvements without the structured limits, decision matrices, or config parameter tables needed for classification. |
+| [April 2026](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2026/april) | 0.30 | Monthly release notes page; typically lists new features and changes but not structured limits, configuration tables, or troubleshooting mappings required by the sub-skill types. |
 | [Archival support](https://learn.microsoft.com/en-us/azure/databricks/optimizations/archive-delta) | 0.30 | Feature overview of archival support; summary does not indicate numeric limits, configuration tables, or detailed parameters. Likely conceptual behavior description rather than expert-only configuration or limits. |
 | [Archive traces](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/archive-traces) | 0.30 | Covers archiving traces to a Unity Catalog Delta table. The summary focuses on purpose and high-level usage; it does not indicate detailed configuration tables, permission matrices, or other specific expert-level reference information. |
 | [August 2018](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2018/august) | 0.30 | August 2018 release notes; feature list style, not structured expert guidance. |
 | [August 2019](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2019/august) | 0.30 | August 2019 release notes; feature list style, not structured expert guidance per defined sub-skill types. |
 | [August 2021](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2021/august) | 0.30 | August 2021 release notes; primarily a list of changes, not a structured expert-knowledge page per the taxonomy. |
 | [Authentication and access control overview](https://learn.microsoft.com/en-us/azure/databricks/security/auth/) | 0.30 | Introductory overview of authentication and access control; does not appear to list specific RBAC roles, scopes, or configuration parameters in detail. |
+| [Author dashboards](https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/) | 0.30 | Basic how-to for creating and organizing dashboards; appears procedural without detailed config tables, limits, or error-resolution content. |
+| [Auto Loader with Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/unity-catalog) | 0.30 | Describes using Auto Loader with Unity Catalog and references other docs for recommendations/limitations; summary does not show concrete config tables, numeric limits, or error mappings. |
 | [Backfill historical traces](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/backfill-scorers) | 0.30 | Explains retroactively applying scorers to historical traces. The description is conceptual/usage-oriented and does not mention specific configuration parameters, limits, or troubleshooting details that would constitute expert knowledge. |
 | [Basic usage](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/usage) | 0.30 | Basic CLI usage and help/output handling; likely procedural tutorial without detailed config tables, limits, or product-specific error mappings. |
 | [Bloom filter indexes (deprecated)](https://learn.microsoft.com/en-us/azure/databricks/optimizations/bloom-filters) | 0.30 | Deprecation notice and high-level guidance to avoid Bloom filter indexes; no indication of numeric limits, config tables, or detailed migration/decision matrices. |
@@ -4002,9 +4012,9 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [Bundle with pipelines tutorial](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/pipelines-tutorial) | 0.30 | Tutorial for using bundles with pipelines; procedural guidance rather than detailed configuration matrices or limits. |
 | [Business semantics](https://learn.microsoft.com/en-us/azure/databricks/business-semantics/) | 0.30 | High-level overview of Unity Catalog business semantics; described as an introduction and navigation page rather than detailed configuration, limits, or troubleshooting content. |
 | [CASE statement](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/control-flow/case-stmt) | 0.30 | CASE statement control-flow syntax; language reference without configuration tables, limits, or troubleshooting mappings. |
+| [CLI](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/cli) | 0.30 | Intro to using Databricks CLI for Lakebase; summary suggests step-by-step usage, not parameter reference tables, quotas, or error-code-based troubleshooting. Likely generic tutorial-style content. |
 | [Classic compute overview](https://learn.microsoft.com/en-us/azure/databricks/compute/use-compute) | 0.30 | High-level overview of classic compute access and creation; no detailed limits, configs, or error mappings. |
 | [Classification with the UI](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl/classification) | 0.30 | Similar to regression article but for classification; summary indicates general usage, not expert-level configuration or limits. |
-| [Coding agent integration](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/coding-agent-integration-beta) | 0.30 | Described as an integration overview for coding agents using AI Gateway, likely focused on conceptual integration and basic setup rather than detailed config parameter tables or error-code troubleshooting. |
 | [Concepts](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-ads-concepts) | 0.30 | Concepts page; likely explains how the connector works conceptually rather than listing concrete configuration parameters or limits. |
 | [Concepts](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/zendesk-support-concepts) | 0.30 | A 'concepts' page about architecture, data models, and pricing is generally high-level and explanatory. Without evidence of numeric thresholds or decision matrices, it is more conceptual than expert configuration or decision-making guidance. |
 | [Concepts](https://learn.microsoft.com/en-us/azure/databricks/ldp/concepts) | 0.30 | Conceptual overview of Lakeflow Spark Declarative Pipelines, core concepts, and benefits. No clear indication of detailed configuration parameters, limits, or decision matrices. |
@@ -4017,27 +4027,28 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [Coordinate transactions across tables](https://learn.microsoft.com/en-us/azure/databricks/transactions/tutorial) | 0.30 | Described as a tutorial demonstrating how to use transactions with examples and preview notes. Tutorials generally focus on step-by-step usage rather than reference-style limits, configuration matrices, or error-code mappings, so it likely lacks the required expert-knowledge patterns. |
 | [Core concepts](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/core-concepts) | 0.30 | Core concepts page explains autoscaling, scale-to-zero, branches, and replicas conceptually; lacks numeric thresholds or detailed configuration surfaces. |
 | [Correctness](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/concepts/judges/is_correct) | 0.30 | High-level explanation of the Correctness judge; no evidence of concrete config values, schemas, or error mappings. |
+| [Cost management tools on Databricks](https://learn.microsoft.com/en-us/azure/databricks/admin/usage/) | 0.30 | High-level navigation/overview of cost management tools without detailed configuration tables, limits, or error mappings. |
 | [Create a Knowledge Assistant agent](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/knowledge-assistant) | 0.30 | Appears to be a how-to/tutorial for creating a chatbot with Knowledge Assistant; no indication of numeric limits, config tables, error-code mappings, or product-specific decision matrices. |
-| [Create a custom app](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/create-custom-app) | 0.30 | Custom app creation flow; likely a procedural guide without deep config parameter tables, limits, or troubleshooting mappings. |
+| [Create a Supervisor Agent](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/multi-agent-supervisor) | 0.30 | The page describes how to use a Supervisor Agent to orchestrate a multi-agent system in Azure Databricks, but based on the summary it appears to be a conceptual/how-to guide without explicit limits, configuration parameter tables, error-code-based troubleshooting, or quantified decision matrices. It focuses on coordination patterns and natural language feedback rather than product-specific numeric thresholds, RBAC roles, or detailed configuration references. |
+| [Create a Visual data prep](https://learn.microsoft.com/en-us/azure/databricks/designer/build-transformation) | 0.30 | Step-by-step guidance on creating a Visual data prep; tutorial-style content without explicit limits, configuration reference tables, or troubleshooting mappings. |
+| [Create a clean room](https://learn.microsoft.com/en-us/azure/databricks/clean-rooms/create-clean-room) | 0.30 | UI-based how-to for creating a clean room; appears procedural without detailed config tables, limits, or product-specific best-practice guidance. |
 | [Create a database instance](https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/create/) | 0.30 | Appears to be a how-to guide for creating and managing a Lakebase Provisioned database instance. The summary does not indicate presence of numeric limits, configuration parameter tables, error-code-based troubleshooting, or decision matrices; it mainly describes getting started and mentions region availability and product variants. |
-| [Create an app from a template](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/create-app-template) | 0.30 | Template-creation walkthrough; summary suggests UI-driven steps, not detailed configuration matrices, limits, or security role definitions. |
-| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/confluence-pipeline) | 0.30 | Pipeline creation tutorial; unlikely to contain detailed config matrices, limits, or troubleshooting mappings beyond generic steps. |
+| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/confluence-pipeline) | 0.30 | Appears to be a step-by-step ingestion/tutorial page for Confluence using Lakeflow Connect. Description and summary indicate how-to pipeline creation, not configuration tables, limits, error-code mappings, or product-specific best-practice guidance with quantified impact. |
 | [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/d365-pipeline) | 0.30 | How-to pipeline creation guide; likely step-by-step without detailed config matrices or limits. |
-| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-ads-pipeline) | 0.30 | Pipeline creation tutorial; unlikely to include detailed config tables or numeric limits. |
-| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-analytics-pipeline) | 0.30 | How-to ingestion pipeline guide; likely procedural without detailed config matrices or numeric constraints. |
-| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/hubspot-pipeline) | 0.30 | Pipeline creation tutorial; likely step-by-step without detailed config matrices or numeric limits. |
-| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/meta-ads-pipeline) | 0.30 | Pipeline creation pages are generally step-by-step tutorials without exhaustive config matrices, limits, or troubleshooting mappings. |
-| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/netsuite-pipeline) | 0.30 | Pipeline creation tutorial; typically procedural without exhaustive configuration matrices or limits. |
+| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-ads-pipeline) | 0.30 | Described as a guide to create a managed ingestion pipeline from Google Ads. This is typical tutorial content without explicit mention of limits, config matrices, or error-code-based troubleshooting. |
+| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-analytics-pipeline) | 0.30 | Page teaches how to ingest Google Analytics 4 data via BigQuery using Lakeflow Connect. Summary suggests a how-to integration tutorial, not a configuration reference, limits table, or decision/troubleshooting guide. |
+| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/hubspot-pipeline) | 0.30 | HubSpot ingestion pipeline creation with Lakeflow Connect is likely a procedural tutorial. No indication of numeric limits, config parameter tables, or structured troubleshooting content in the summary. |
 | [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-pipeline) | 0.30 | How-to pipeline setup; summary does not mention detailed config tables, limits, or error mappings beyond standard tutorial content. |
 | [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/query-based-pipeline) | 0.30 | How-to guide for creating a query-based ingestion pipeline; described as a UI/automation walkthrough, not a reference of config parameters, limits, or troubleshooting details. |
 | [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/salesforce-pipeline) | 0.30 | How-to ingest data from Salesforce; likely a step-by-step tutorial without comprehensive configuration parameter tables or limits. |
-| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/servicenow-pipeline) | 0.30 | Pipeline creation tutorial for ServiceNow; generally procedural without exhaustive config tables or limits. |
 | [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-pipeline) | 0.30 | How-to ingestion pipeline guide; summary does not indicate detailed config tables, limits, or error mappings beyond standard tutorial content. |
+| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/tiktok-ads-pipeline) | 0.30 | Scenario/tutorial-style ingestion guide for TikTok Ads using Lakeflow Connect; likely step-by-step UI usage without detailed config tables, limits, or product-specific error mappings. |
+| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-hcm-pipeline) | 0.30 | Tutorial for creating a Workday HCM ingestion pipeline; description suggests procedural guidance rather than configuration matrices, limits, or troubleshooting content. |
+| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-reports-pipeline) | 0.30 | How-to page for ingesting Workday reports; appears to be a basic pipeline creation walkthrough without expert-level limits, config tables, or decision matrices. |
| +| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/zendesk-support-pipeline) | 0.30 | Zendesk Support ingestion pipeline tutorial; likely focuses on steps to set up a managed connector, not on detailed configuration parameters, quotas, or troubleshooting mappings. | +| [Create materialized views in Databricks SQL](https://learn.microsoft.com/en-us/azure/databricks/ldp/dbsql/materialized) | 0.30 | Describes how to create and refresh materialized views; no evidence of limits tables, config parameter references, or troubleshooting/error-code mappings in the summary. | | [Create multiple flows with metaprogramming](https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-metaprogramming) | 0.30 | Tutorial on metaprogramming patterns for Lakeflow Spark Declarative Pipelines; primarily shows a coding pattern with inner functions and parameterization. No configuration tables, limits, error-code mappings, or product-specific settings with values that qualify as expert knowledge per the defined categories. | -| [Create workspace using the Azure CLI](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/azure-cli) | 0.30 | CLI deployment how-to; summary does not mention product-specific deployment constraints or matrices. | | [Create your first pipeline using the Lakeflow Pipelines Editor](https://learn.microsoft.com/en-us/azure/databricks/ldp/tutorial-get-started) | 0.30 | Step-by-step getting-started tutorial for creating a first pipeline; focuses on basic usage and example scenario, not on detailed configuration matrices, limits, or error mappings. | -| [Dashboard local metric views](https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/data-modeling/local-metric-views) | 0.30 | Describes dashboard-local metric views and their scope; mainly conceptual and workflow-focused, without numeric limits, config parameter tables, or decision matrices. 
| -| [Data classification](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-classification) | 0.30 | The summary indicates a feature overview of Databricks Data Classification and its purpose (automatically classify and tag sensitive data). It does not clearly indicate the presence of configuration tables, role definitions, or other detailed parameters. It appears more conceptual/feature-usage oriented than a parameterized configuration or security reference, so it is not classified as expert knowledge under the given types. | | [Databricks AI assistive features overview](https://learn.microsoft.com/en-us/azure/databricks/databricks-ai/) | 0.30 | Introductory page for AI assistive features; conceptual overview without detailed configuration parameters, limits, or security role mappings. | | [Databricks Ray overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ray/) | 0.30 | High-level overview of Ray on Databricks; mostly conceptual description of capabilities without detailed config tables, limits, or error mappings. | | [Databricks Runtime 13.3 LTS](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/13.3lts) | 0.30 | Release notes for a specific Databricks Runtime version; summary indicates version, Spark level, and dates but no clear evidence of limits, configuration tables, error codes, or decision matrices. Without detailed content, it appears as change-log style information rather than structured expert guidance per the defined sub-skills. | @@ -4045,7 +4056,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Databricks Runtime 14.3 LTS ML](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/14.3lts-ml) | 0.30 | ML runtime release notes list included libraries and features but not structured limits, configuration tables, or troubleshooting content. 
| | [Databricks Runtime 15.2](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/15.2) | 0.30 | Databricks Runtime 15.2 release notes; description is limited to version, Spark version, and release date. Without detailed content showing limits, config options, or troubleshooting, it does not meet any sub-skill criteria. | | [Databricks Runtime 15.3](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/15.3) | 0.30 | Databricks Runtime 15.3 release notes; only high-level version and lifecycle details are visible. No evidence of limits/quotas, configuration settings, troubleshooting mappings, or other expert-knowledge structures. | -| [Databricks Runtime 16.0](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.0) | 0.30 | Databricks Runtime 16.0 release notes; appears to be a standard change log with lifecycle info. The provided text does not show any of the specific numeric limits, configuration parameters, or decision matrices required for classification. | | [Databricks Runtime 16.1](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.1) | 0.30 | Databricks Runtime 16.1 release notes; summary content is limited to version, Spark level, and dates. There is no clear sign of structured expert knowledge like quotas, config tables, or troubleshooting flows. | | [Databricks Runtime 16.1 ML](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.1ml) | 0.30 | Runtime 16.1 ML release notes; ML library and environment description, not limits, configuration, or troubleshooting per the skill definitions. | | [Databricks Runtime 16.2](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.2) | 0.30 | Databricks Runtime 16.2 release notes; only lifecycle and version information is evident. No indication of detailed limits, configuration matrices, or decision-making guidance per the sub-skill criteria. 
| @@ -4060,7 +4070,8 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Databricks Runtime 17.2 ML](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/17.2ml) | 0.30 | Runtime 17.2 ML release notes; primarily lists included ML libraries and versions, not quotas, config matrices, or error mappings. | | [Databricks Runtime 18.2 ML (Beta)](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/18.2ml) | 0.30 | ML runtime release notes; mainly package versions and feature changes, not limits, configuration tables, or troubleshooting mappings as required by the skill types. | | [Databricks preview releases](https://learn.microsoft.com/en-us/azure/databricks/release-notes/release-types) | 0.30 | Defines preview release types and support options conceptually; no product-specific limits, configs, or error mappings. | -| [Datasets](https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/data-modeling/datasets) | 0.30 | How-to article on creating and managing dashboard datasets via the editor; likely procedural UI steps rather than structured configuration references or expert-only constraints. | +| [Datasets](https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/data-modeling/datasets) | 0.30 | Explains creating and managing dashboard datasets using the editor; summary suggests step-by-step usage, not detailed configuration matrices or limits. | +| [Debug notebooks with Databricks Connect](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/notebooks) | 0.30 | Primarily a how-to/tutorial for running and debugging Databricks notebooks from VS Code. It likely describes workflow steps and basic integration behavior, but not detailed configuration tables, limits, error-code mappings, or product-specific parameter references that meet the expert-knowledge criteria. 
| | [December 2018](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2018/december) | 0.30 | December 2018 Databricks release notes are a feature change log and lack structured limits, configuration tables, or troubleshooting mappings. | | [December 2019](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2019/december) | 0.30 | December 2019 Databricks release notes list features and improvements without structured numeric limits, configuration parameter tables, or error-code-based troubleshooting. | | [December 2021](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2021/december) | 0.30 | December 2021 release notes; summarizes platform improvements but does not provide structured expert-knowledge as defined. | @@ -4076,13 +4087,11 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Downstream RAG use case](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-rag) | 0.30 | Downstream RAG use case description; likely conceptual workflow guidance rather than detailed product-specific limits, configs, or troubleshooting. | | [Dynamics 365](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/d365) | 0.30 | Overview of Dynamics 365 connector; mainly conceptual and architectural description via Synapse Link. | | [End-to-end classical ML models](https://learn.microsoft.com/en-us/azure/databricks/mlflow/end-to-end-example) | 0.30 | End-to-end tutorial notebook; primarily example workflow, not configuration matrices or limits/quotas. | -| [Example notebooks](https://learn.microsoft.com/en-us/azure/databricks/vector-search/vs-example-notebooks) | 0.30 | Index page for example notebooks; points to SDK reference but itself does not contain configuration tables, limits, or troubleshooting content. 
| | [Exploratory data analysis (EDA)](https://learn.microsoft.com/en-us/azure/databricks/exploratory-data-analysis/) | 0.30 | Conceptual overview of EDA tools and techniques; lacks detailed configuration tables, limits, or product-specific edge-case mappings. | | [Explore database objects](https://learn.microsoft.com/en-us/azure/databricks/discover/database-objects) | 0.30 | Task-focused page on exploring catalogs, schemas, tables, and views using Catalog Explorer and SQL. The summary suggests usage instructions rather than detailed configuration tables, limits, or troubleshooting mappings, so it does not clearly fit any expert-knowledge sub-skill type. | -| [Explore sample data](https://learn.microsoft.com/en-us/azure/databricks/genie-code/sample-data-explorer) | 0.30 | Describes using Genie Code to explore sample data with natural language; appears as a usage/tutorial flow without detailed config tables, limits, or troubleshooting mappings. | | [External models](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/external-models/) | 0.30 | Overview of external models, supported providers, and limitations. From the summary it appears high-level; without explicit tables of limits, config parameters, or error mappings, it is primarily conceptual/feature overview. | +| [External tools](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/external-monitoring-tools) | 0.30 | Describes using external tools like pgAdmin and PgHero; summary indicates high-level integration/monitoring usage, not detailed config tables, quotas, or error mappings. Likely tutorial/overview rather than expert reference. | | [FOR statement](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/control-flow/for-stmt) | 0.30 | FOR statement control-flow syntax; describes looping over query results but not product-specific limits, configuration parameters, or error-resolution mappings. 
| -| [Feature Store overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/) | 0.30 | Overview of Databricks Feature Store capabilities in Unity Catalog; primarily conceptual description of registry, governance, and workflow without detailed parameters or limits. | | [February 2018](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2018/february) | 0.30 | February 2018 release notes; general release info, not aligned with any sub-skill category. | | [February 2019](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2019/february) | 0.30 | February 2019 Databricks release notes are standard release notes without explicit numeric limits, configuration parameter references, or error-code-based troubleshooting. | | [February 2020](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2020/february) | 0.30 | February 2020 Databricks release notes are standard release notes and do not match any of the expert-knowledge sub-skill patterns (no limits tables, config references, or troubleshooting mappings). | @@ -4090,18 +4099,17 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [February 2022](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2022/february) | 0.30 | February 2022 release notes; change log style, not a limits, configuration, troubleshooting, or decision-making reference. | | [February 2024](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2024/february) | 0.30 | Release notes for February 2024; change log style, not a structured expert-knowledge page per the defined sub-skill types. | | [February 2026](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2026/february) | 0.30 | Monthly release notes summary; snippet is generic and does not show specific technical parameters or limits. 
| -| [Filters and drill through](https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/filters/) | 0.30 | Describes how to use dashboard filter widgets conceptually and via UI steps; no product-specific limits, config tables, or error mappings. | +| [Filters and drill through](https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/filters/) | 0.30 | Explains filter types and how to work with them; summary does not indicate numeric limits, parameter tables, or error/diagnostic mappings. | | [Functions](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-functions) | 0.30 | High-level overview of functions and UDFs; summary does not show Databricks-specific limits, configs, or troubleshooting details. | | [Functions index](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/) | 0.30 | Index page listing PySpark SQL functions with links; no limits, configuration tables, error mappings, or product-specific decision/security/deployment guidance. | | [Genie Code for dashboard authoring](https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/dashboard-agent) | 0.30 | Feature and workflow overview for Genie Code dashboard authoring; no indication of numeric limits, config tables, error codes, or product-specific parameters. | | [Genie Code for observability & evaluation](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/getting-started/genie-code) | 0.30 | High-level description of Genie Code capabilities for observability and evaluation; appears to be a getting-started/UX overview without detailed configuration parameters, limits, or error-code-based troubleshooting. | | [Genie Code for pipeline development](https://learn.microsoft.com/en-us/azure/databricks/ldp/de-agent) | 0.30 | High-level introduction to Genie Code Data Engineering Agent for Lakeflow Spark Declarative Pipelines. 
The summary indicates conceptual/feature overview without exposing concrete configuration parameters, error codes, or limits. | +| [Get identifiers for workspace objects](https://learn.microsoft.com/en-us/azure/databricks/workspace/workspace-details) | 0.30 | Explains how to obtain various IDs and URLs for workspace objects; useful but primarily step-by-step retrieval instructions, not configuration, limits, or troubleshooting mappings. | | [Get started](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/getting-started/) | 0.30 | Getting started quickstart for MLflow 3 for GenAI; likely a step-by-step tutorial rather than a reference of configuration parameters, limits, or troubleshooting mappings. Does not clearly indicate expert-only configuration or error code content from the summary. | | [Get started](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/get-started) | 0.30 | Getting started walkthrough for Lakebase Autoscaling; step-by-step tutorial to create a project and connect data, but no clear indication of configuration tables, limits, or troubleshooting details. | -| [Get started with apps](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/get-started) | 0.30 | Step-by-step getting started tutorial; description does not indicate detailed config tables, limits, or product-specific troubleshooting or decision matrices. | | [Google Ads](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-ads-overview) | 0.30 | Overview of Google Ads connector; primarily conceptual description of capabilities. | | [Google Analytics](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-analytics) | 0.30 | Overview of Google Analytics Raw Data connector; mainly conceptual description of ingesting GA4 via BigQuery. 
| -| [Govern LLMs and MCPs (AI Gateway)](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/) | 0.30 | Overview of AI Gateway beta and its role; summary does not show concrete limits, configuration parameter tables, or error mappings—primarily conceptual governance description. | | [GraphFrames overview](https://learn.microsoft.com/en-us/azure/databricks/integrations/graphframes/) | 0.30 | Primarily links to example notebooks and gives a high-level description of GraphFrames; lacks detailed configuration tables, limits, or product-specific troubleshooting or decision matrices. | | [Groundedness](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/concepts/judges/is_grounded) | 0.30 | Conceptual description of the RetrievalGroundedness judge; summary does not indicate detailed parameters, config tables, or product-specific limits/metrics. | | [Guide: Agents development workflow](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/guide/agents-dev-workflow) | 0.30 | Guide to overall agents development workflow; lifecycle-focused and conceptual, not centered on specific configuration parameters, limits, or error-resolution mappings. | @@ -4109,13 +4117,14 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [High-level architecture](https://learn.microsoft.com/en-us/azure/databricks/getting-started/high-level-architecture) | 0.30 | High-level overview of Azure Databricks architecture; conceptual enterprise architecture description without detailed limits, configs, or troubleshooting mappings. | | [HubSpot](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/hubspot-overview) | 0.30 | Overview of HubSpot connector; primarily conceptual description of capabilities. 
| | [IF statement](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/control-flow/if-stmt) | 0.30 | IF THEN ELSE statement syntax; standard control-flow construct without Databricks-specific limits, configuration matrices, or troubleshooting content. | -| [Iceberg](https://learn.microsoft.com/en-us/azure/databricks/iceberg/) | 0.30 | High-level 'What is Apache Iceberg' overview and support statement; primarily conceptual description of the table format and preview status without detailed limits, configs, or decision matrices. | | [Identifiers](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-identifiers) | 0.30 | Describes what identifiers are and basic behavior (case-insensitivity, backticks) but, as summarized, does not include detailed parameter tables, limits, or product-specific configuration values. | +| [Ingest data](https://learn.microsoft.com/en-us/azure/databricks/designer/ingest-data) | 0.30 | Describes options for ingesting data (local files, cloud storage, etc.) into Designer; appears to be procedural guidance rather than detailed configuration reference or limits/quotas. | | [Isolation levels and write conflicts](https://learn.microsoft.com/en-us/azure/databricks/optimizations/isolation/) | 0.30 | High-level explanation of isolation levels and write conflicts; summary suggests conceptual behavior without concrete settings, codes, or config values. | | [January 2018](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2018/january) | 0.30 | January 2018 Databricks release notes describe early platform changes but do not match any of the expert-knowledge sub-skill patterns (no limits tables, config references, or troubleshooting mappings). 
| | [January 2019](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2019/january) | 0.30 | January 2019 Databricks release notes list platform improvements but do not match any of the structured expert-knowledge sub-skill patterns. | | [January 2022](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2022/january) | 0.30 | January 2022 Azure Databricks release notes; product changes but no structured expert-knowledge content as defined. | | [January 2024](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2024/january) | 0.30 | Release notes for January 2024; documents new features but not in the form of limits, configuration tables, troubleshooting mappings, or decision matrices. | +| [January 2025](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/january) | 0.30 | Monthly release notes page; focused on change log style updates rather than detailed quotas, configuration options, or troubleshooting guidance. | | [January 2026](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2026/january) | 0.30 | Monthly release notes; summary text does not show specific technical parameters, limits, or troubleshooting content. | | [Jenkins](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/jenkins) | 0.30 | Summary indicates a general how-to for using Jenkins with Azure Databricks CI/CD. It reads as a workflow/tutorial article without explicit mention of product-specific limits, configuration parameter tables, or deployment matrices. Likely mostly step-by-step guidance that an LLM can approximate from generic Jenkins + Databricks patterns. | | [Jira](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/jira) | 0.30 | Overview of Jira connector; mainly conceptual description of ingestion capabilities. 
| @@ -4132,14 +4141,17 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [June 2021](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2021/june) | 0.30 | June 2021 release notes; change log style, not a structured expert-knowledge reference according to the defined sub-skill types. | | [June 2025](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/june) | 0.30 | Monthly Azure Databricks release notes for June 2025; focuses on describing new features and platform improvements rather than the structured expert-knowledge categories defined. | | [LLM judges & scorers](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/concepts/scorers) | 0.30 | Conceptual description of scorers and LLM judges; likely focuses on what they are and how they’re used, without detailed numeric limits or configuration tables. | +| [LLMs](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/overview-beta) | 0.30 | Overview of AI Gateway for LLM endpoints mentioning permissions, usage tracking, and rate limits, but summary does not show specific numeric limits or detailed RBAC role definitions. | | [Labeling sessions](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/human-feedback/concepts/labeling-sessions) | 0.30 | Describes the concept of labeling sessions and their role in collecting human feedback. This is primarily conceptual and workflow-oriented, without concrete limits, configuration parameters, or error-resolution mappings required for expert knowledge classification. | +| [Lakeflow Pipelines Editor](https://learn.microsoft.com/en-us/azure/databricks/ldp/multi-file-editor) | 0.30 | Feature usage and UI-focused guidance for Lakeflow Pipelines Editor; no detailed configuration tables, limits, error codes, or product-specific best-practice gotchas evident from summary. 
|
 | [Lakeflow Spark Declarative Pipelines release notes](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/) | 0.30 | High-level explanation of release process and links to release notes; no indication of concrete limits, configuration tables, error codes, or decision criteria. |
 | [Legacy dashboards](https://learn.microsoft.com/en-us/azure/databricks/archive/legacy/legacy-dashboards) | 0.30 | Legacy dashboards overview using SQL editor and visualizations; appears to be conceptual/usage guidance without detailed limits, configs, or troubleshooting. |
+| [Load data using SDP](https://learn.microsoft.com/en-us/azure/databricks/ldp/load) | 0.30 | Describes how to load data into pipelines and recommends streaming tables, but summary does not indicate detailed configuration options, limits, or decision matrices with quantified trade-offs. |
+| [Local metric views](https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/data-modeling/local-metric-views) | 0.30 | Describes local metric views conceptually and their benefits; no indication of numeric thresholds, config tables, or security/decision matrices. |
 | [Low shuffle merge](https://learn.microsoft.com/en-us/azure/databricks/optimizations/low-shuffle-merge) | 0.30 | Describes low shuffle merge optimization conceptually and its availability in certain runtimes; summary does not show specific configuration parameters, limits, or troubleshooting details. |
 | [MLflow 2: Evaluate gen AI apps](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-evaluation/) | 0.30 | High-level overview of Agent Evaluation (MLflow 2); clearly an introduction without detailed configuration or limits. |
 | [MLflow API Reference](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/api-reference) | 0.30 | Index page linking to external MLflow API docs; summary suggests navigation/reference only without inline parameter tables or config details. |
 | [MLflow for agents & LLMs](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/) | 0.30 | High-level overview of MLflow 3 for GenAI (tracking, evaluation, observability). Summary suggests conceptual description of capabilities without detailed configuration tables, limits, or error mappings. |
-| [MLlib](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/mllib) | 0.30 | Example notebooks for MLlib usage; no indication of Databricks-specific limits, quotas, or detailed configs. |
 | [Manage](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage) | 0.30 | Top-level manage hub page listing areas (projects, branches, computes, etc.); primarily navigational without detailed parameters or limits in the summary. |
 | [Manage connections as an administrator](https://learn.microsoft.com/en-us/azure/databricks/partner-connect/admin) | 0.30 | High-level administration of Partner Connect (service principals, PATs, users). Summary suggests procedural/portal steps without detailed configuration tables, specific parameters, or error mappings. |
 | [Manage production scorers](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/manage-production-scorers) | 0.30 | Describes lifecycle management of production scorers (list, update, stop, restart, delete). The summary suggests basic API usage and operations, without exposing detailed parameter tables, limits, or error mappings that would qualify as expert knowledge under the defined categories. |
@@ -4152,9 +4164,8 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [May 2020](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2020/may) | 0.30 | May 2020 Databricks release notes describe platform changes but not in the structured forms required for limits, configuration, troubleshooting, or decision-making classifications. |
 | [May 2025](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/may) | 0.30 | Monthly Azure Databricks release notes for May 2025; change-log content does not match limits, configuration, troubleshooting, or other targeted expert-knowledge sub-skill patterns. |
 | [Measure Genie Code impact](https://learn.microsoft.com/en-us/azure/databricks/genie-code/impact) | 0.30 | Focuses on measuring impact via logs and surveys; likely analytics guidance without product-specific limits or configs. |
-| [Model serving endpoints (previous)](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/overview-serving-endpoints) | 0.30 | Described as an overview of AI Gateway for serving endpoints and supported features. Sounds like conceptual/feature overview without explicit limits, config tables, or error mappings. |
+| [Monitor](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/monitor) | 0.30 | Monitoring overview for Lakebase projects; summary mentions categories of tools and optimization but no specific metrics thresholds, limits, or configuration parameter tables. Reads as conceptual/usage guidance. |
 | [Monitor fairness and bias for classification models](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/data-profiling/fairness-bias) | 0.30 | Fairness/bias monitoring description is conceptual; no product-specific limits, configs, error codes, or decision matrices with quantified thresholds. |
-| [Monitor usage using tags](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/usage-detail-tags) | 0.30 | Explains how to use tags for cost attribution conceptually; summary does not indicate detailed parameter tables, limits, or product-specific error codes. |
 | [Mosaic AI Model Serving](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/model-serving-intro) | 0.30 | Basic steps for model serving; appears as an overview/tutorial without detailed deployment matrices or limits. |
 | [Move tables between pipelines](https://learn.microsoft.com/en-us/azure/databricks/ldp/move-table) | 0.30 | Describes how to move tables between pipelines; appears to be a scenario/how-to article without deep configuration references or numeric constraints. |
 | [November 2018](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2018/november) | 0.30 | November 2018 Databricks release notes describe new features but do not provide numeric limits, configuration parameter tables, or decision matrices. |
@@ -4167,21 +4178,22 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [October 2018](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2018/october) | 0.30 | October 2018 release notes; release history, not limits/config/troubleshooting content. |
 | [October 2019](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2019/october) | 0.30 | October 2019 Databricks release notes describe platform updates but lack explicit limits, configuration tables, or troubleshooting mappings. |
 | [October 2021](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2021/october) | 0.30 | October 2021 release notes; change log content without the structured patterns required for expert-knowledge classification. |
+| [October 2024](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2024/october) | 0.30 | Monthly release notes page; primarily feature announcements and platform changes, not organized as limits, best practices, configuration references, or troubleshooting content. |
 | [October 2025](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/october) | 0.30 | October 2025 release notes; provided summary is generic and does not expose expert-level technical parameters. |
 | [Organize notebook cells](https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-cells) | 0.30 | Organizing notebook cells is mostly UX guidance; no indication of product-specific limits, configs, or troubleshooting. |
-| [Overview](https://learn.microsoft.com/en-us/azure/databricks/data-engineering/best-practices) | 0.30 | High-level pointer page that links to other best practices; summary does not show concrete product-specific configs, numbers, or patterns. |
 | [Overview](https://learn.microsoft.com/en-us/azure/databricks/data-sharing/) | 0.30 | Overview of data and AI asset sharing options (Delta Sharing, Marketplace, Clean Rooms); appears to be conceptual/feature overview without numeric limits, config tables, or decision matrices. |
 | [Overview](https://learn.microsoft.com/en-us/azure/databricks/database-objects/) | 0.30 | Conceptual explanation of database objects and their relationship to catalogs and schemas; high-level architecture without detailed configuration or limits. |
 | [Overview](https://learn.microsoft.com/en-us/azure/databricks/delta/) | 0.30 | Conceptual overview of Delta Lake; no indication of detailed limits, configs, or troubleshooting mappings. |
 | [Overview](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/) | 0.30 | High-level description of the Databricks CLI and its usage. The summary does not indicate detailed configuration tables, limits, or error-code-based troubleshooting. It appears to be an overview/introduction rather than expert configuration or troubleshooting content. |
-| [Overview](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/) | 0.30 | Conceptual overview of Databricks Apps, use cases, requirements, and limitations; does not emphasize detailed config parameters, error codes, or numeric limits that qualify as expert knowledge. |
 | [Overview](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/) | 0.30 | High-level overview of Databricks Connect capabilities and supported languages; no detailed limits, configuration tables, error codes, or decision matrices. |
 | [Overview](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/scala/) | 0.30 | General how-to for Databricks Connect for Scala; likely step-by-step usage without deep config tables or troubleshooting matrices. |
 | [Overview](https://learn.microsoft.com/en-us/azure/databricks/discover/) | 0.30 | High-level overview of data discovery tools; mostly conceptual and navigational without detailed configuration tables or product-specific limits. |
+| [Overview](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/) | 0.30 | High-level description of MCP on Databricks and available server types; summary suggests conceptual overview and navigation to specific MCP server pages without detailed configuration tables or numeric limits. |
 | [Overview](https://learn.microsoft.com/en-us/azure/databricks/getting-started/architecture) | 0.30 | General architecture concepts and lakehouse design patterns; likely high-level and conceptual without product-specific numeric thresholds or decision matrices. |
 | [Overview](https://learn.microsoft.com/en-us/azure/databricks/ingestion/zerobus-overview) | 0.30 | Zerobus Ingest overview; description suggests conceptual and marketing-style explanation of what the connector is and why to use it, without detailed limits, configs, or error mappings. |
 | [Overview](https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/) | 0.30 | High-level overview of the open source Databricks JDBC driver and supported tools; no detailed property tables, config parameters, limits, or error-code-based troubleshooting content are evident from the summary. |
-| [Overview](https://learn.microsoft.com/en-us/azure/databricks/ldp/streaming-tables) | 0.30 | Conceptual explanation of streaming tables and when to use them; no evidence of limits, config matrices, or detailed troubleshooting. |
+| [Overview](https://learn.microsoft.com/en-us/azure/databricks/ldp/flows) | 0.30 | Conceptual and how-to description of flows and incremental processing; summary does not indicate numeric limits, decision matrices, or detailed configuration parameters. |
+| [Overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/) | 0.30 | High-level overview of Databricks AI Runtime and preview/beta status; lacks concrete limits, configuration parameters, or decision matrices. |
 | [Overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/) | 0.30 | Index of example notebooks; does not itself contain configuration tables, limits, or troubleshooting mappings. |
 | [Overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/gpu-classic-ml) | 0.30 | Landing page for classic ML examples; mostly links and high-level descriptions without explicit product-specific configuration tables. |
 | [Overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/gpu-computer-vision) | 0.30 | Landing page for computer vision example notebooks; mainly descriptive and linking, without explicit configuration tables or limits. |
@@ -4189,31 +4201,27 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [Overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/gpu-fsdp) | 0.30 | High-level description that the page has notebook examples for FSDP; summary does not indicate detailed parameter tables, limits, or product-specific configuration references—likely just example usage. |
 | [Overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/gpu-llms) | 0.30 | Landing page for LLM example notebooks; primarily links to tutorials and does not describe specific configs or limits. |
 | [Overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/) | 0.30 | Overview of Mosaic AI Model Serving; high-level description of capabilities, not detailed deployment constraints or matrices. |
-| [Overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/serve-data-ai) | 0.30 | Very high-level statement about Mosaic AI serving data; appears to be an overview without detailed settings, limits, or patterns. |
 | [Overview](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/prompt-version-mgmt/prompt-registry/) | 0.30 | Appears to be an overview of Prompt Registry; description suggests conceptual intro without detailed config tables or limits. |
-| [Overview](https://learn.microsoft.com/en-us/azure/databricks/pyspark/) | 0.30 | Overview of PySpark on Databricks; mostly conceptual and introductory without detailed configs, limits, or product-specific troubleshooting. |
 | [Overview](https://learn.microsoft.com/en-us/azure/databricks/query/formats/) | 0.30 | High-level overview of supported data formats and default protocols; no detailed configuration tables, limits, or product-specific edge cases. |
 | [Overview](https://learn.microsoft.com/en-us/azure/databricks/security/keys/) | 0.30 | High-level introduction to data security and encryption options; appears conceptual without detailed configuration tables, limits, or product-specific parameters. |
 | [Overview](https://learn.microsoft.com/en-us/azure/databricks/volumes/) | 0.30 | Conceptual overview of Unity Catalog volumes and when to use them; lacks detailed config tables, limits, or error mappings. |
-| [Parameters](https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/filters/parameters) | 0.30 | Explains dashboard parameters and interactivity; appears to be usage/tutorial content without detailed configuration tables or limits. |
-| [Pipeline update](https://learn.microsoft.com/en-us/azure/databricks/ldp/updates) | 0.30 | Described as explaining what a pipeline update is and how to trigger it, which sounds like conceptual and basic operational guidance without clear evidence of detailed configuration tables, limits, or error mappings. |
+| [Pandas API on Spark](https://learn.microsoft.com/en-us/azure/databricks/pandas/pandas-on-spark) | 0.30 | Explains pandas API on Spark and mentions runtime version requirement, but otherwise is conceptual/usage guidance without detailed limits, configuration tables, or decision criteria. |
+| [Pipeline update](https://learn.microsoft.com/en-us/azure/databricks/ldp/updates) | 0.30 | Explains what a pipeline update is and how to trigger it; summary suggests procedural guidance without detailed configuration matrices, limits, or error-code-based troubleshooting. |
 | [PostgreSQL](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql) | 0.30 | Workflow/overview and persona-based guidance; summary does not indicate numeric limits, config tables, or detailed error mappings. |
 | [Production monitoring](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/eval-monitor/production-monitoring) | 0.30 | Appears to be a how-to/feature usage page for scheduling MLflow scorers on GenAI traces. The summary does not indicate presence of numeric limits, config parameter tables, error codes, or other product-specific expert details; likely procedural guidance rather than reference-style expert knowledge. |
-| [Publishing](https://learn.microsoft.com/en-us/azure/databricks/dashboards/share/share) | 0.30 | Covers how to publish and share dashboards; likely procedural sharing steps without limits, error mappings, or detailed configuration matrices. |
+| [Projects](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-projects) | 0.30 | Primarily a how-to/manage guide for Lakebase projects; summary does not indicate numeric limits, config parameter tables, error codes, or decision matrices. Likely procedural/tutorial content rather than expert reference details. |
 | [PySpark Shell](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/spark-shell) | 0.30 | Likely a usage/overview page for the PySpark shell in Databricks Connect without detailed limits, configs, or troubleshooting matrices. |
 | [PyTorch](https://learn.microsoft.com/en-us/azure/databricks/mlflow/tracking-ex-pytorch) | 0.30 | End-to-end PyTorch tutorial; focuses on workflow and code, not on Databricks-specific configuration matrices or limits. |
 | [Query](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/query-data) | 0.30 | General guidance on querying data using various tools; summary does not indicate presence of product-specific configuration tables, limits, or troubleshooting content. |
-| [Query and visualize data](https://learn.microsoft.com/en-us/azure/databricks/getting-started/quick-start) | 0.30 | Intro tutorial on querying and visualizing data; shows basic usage, not product-specific configs, limits, or troubleshooting. |
 | [Query from SQL Editor](https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/query/sql-editor) | 0.30 | How-to on accessing/querying a Lakebase instance from the SQL editor; likely step-by-step UI usage without detailed limits, configs, or error-code mappings. |
 | [Query streaming data](https://learn.microsoft.com/en-us/azure/databricks/query/streaming) | 0.30 | Streaming query page appears to be conceptual and example-focused without numeric limits, config tables, or error-code-based troubleshooting. |
-| [Query traces - DBSQL](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/query-dbsql) | 0.30 | From the summary, the page is about querying MLflow traces stored in Unity Catalog using Databricks SQL and views. It likely shows example queries and mentions available tables/views, but there is no clear indication of structured configuration tables, limits, error codes, or decision matrices. Without explicit evidence of such expert details, it is treated as general usage/tutorial content. |
 | [Query-based connectors](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/query-based-overview) | 0.30 | Overview of query-based connectors and cursor-column concept; no clear evidence of numeric limits, config parameter tables, or detailed error mappings from the summary. |
 | [Quickstart](https://learn.microsoft.com/en-us/azure/databricks/jobs/jobs-quickstart) | 0.30 | Quickstart tutorial for creating a workflow; typically step-by-step without detailed config matrices, limits, or troubleshooting mappings. |
 | [Quickstart](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/connect-overview) | 0.30 | Quickstart/overview for connecting; summary suggests high-level guidance (OAuth vs password) without detailed parameters, limits, or product-specific patterns. |
 | [R overview](https://learn.microsoft.com/en-us/azure/databricks/sparkr/) | 0.30 | Overview page for R development on Databricks; primarily navigational without detailed expert-only configuration or troubleshooting content. |
 | [RAG governance and LLMOps](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/tutorials/ai-cookbook/fundamentals-governance-llmops) | 0.30 | Described as a brief overview with links to governance and LLMOps resources; likely high-level without detailed RBAC roles or config parameters. |
-| [Reference](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/reference) | 0.30 | Reference of supported environments, languages, sources, sinks, and operators is largely capability listing; summary does not indicate numeric limits, config tables, or error mappings that would qualify as expert knowledge under the defined categories. |
 | [Reference solutions overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/reference-solutions/) | 0.30 | Index page listing reference solutions; primarily navigational without detailed technical content itself. |
+| [Register database in Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/register-uc) | 0.30 | Explains registering a Lakebase database in Unity Catalog; summary focuses on purpose and outcome (read-only catalog, unified governance) without indicating specific RBAC role lists, config parameters, or limits. |
 | [Regression with the UI](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl/regression) | 0.30 | How-to usage article for AutoML regression; summary suggests general workflow, not detailed configuration tables or product-specific constraints. |
 | [Release 2022.37](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2022/37/) | 0.30 | Release notes for DLT 2022.37; primarily lists features and fixes, not structured as limits, configuration, troubleshooting, or decision-making guidance. |
 | [Release 2022.40](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2022/40/) | 0.30 | Release notes for DLT 2022.40; standard release notes, not matching any of the specified expert-knowledge sub-skill types. |
@@ -4248,7 +4256,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [Release 2024.20](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2024/20/) | 0.30 | Release notes for DLT 2024.20; change log content rather than a focused expert-knowledge reference in the specified categories. |
 | [Release 2024.22](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2024/22/) | 0.30 | Release notes for DLT 2024.22; does not provide structured limits, configuration options, troubleshooting mappings, or decision criteria. |
 | [Release 2024.29](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2024/29/) | 0.30 | Release notes for DLT 2024.29; primarily chronological change information, not structured expert guidance per the defined types. |
-| [Release 2024.33](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2024/33/) | 0.30 | Release notes for DLT 2024.33; no clear mapping to limits, configuration, troubleshooting, or other sub-skill types. |
 | [Release 2024.37](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2024/37/) | 0.30 | Release notes for DLT 2024.37; standard feature/fix listing rather than a reusable expert-knowledge reference in the defined categories. |
 | [Release 2024.40](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2024/40/) | 0.30 | Release notes for DLT 2024.40; does not present structured limits, configuration parameters, troubleshooting flows, or decision matrices. |
 | [Release 2024.42](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2024/42/) | 0.30 | Release notes for DLT 2024.42; change log style, not matching any of the specified expert-knowledge sub-skill patterns. |
@@ -4256,17 +4263,14 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [Release 2025.04](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2025/04/) | 0.30 | Release notes listing new features and fixes for a specific DLT release; not a structured guide on limits, configuration, troubleshooting, or other defined sub-skills. |
 | [Release notes 2021](https://learn.microsoft.com/en-us/azure/databricks/sql/release-notes/2021) | 0.30 | 2021 SQL release notes similarly list improvements and updates without explicit limits, configuration matrices, or troubleshooting mappings. |
 | [Release notes 2022](https://learn.microsoft.com/en-us/azure/databricks/sql/release-notes/2022) | 0.30 | 2022 SQL release notes are a chronological list of improvements; no evidence of the specific expert-knowledge structures defined in the sub-skill types. |
-| [Release notes 2024](https://learn.microsoft.com/en-us/azure/databricks/ai-bi/release-notes/2024) | 0.30 | 2024 AI/BI release notes summary indicates a chronological list of improvements, not structured expert-knowledge content per the defined categories. |
-| [Release notes 2025](https://learn.microsoft.com/en-us/azure/databricks/ai-bi/release-notes/2025) | 0.30 | AI/BI 2025 release notes index; similar to 2026, appears to be a chronological feature list without clear indication of structured expert details that fit the defined sub-skill types. |
 | [Release notes 2025](https://learn.microsoft.com/en-us/azure/databricks/sql/release-notes/2025) | 0.30 | Databricks SQL 2025 release notes index; similar to 2026, appears to be a feature list without explicit evidence of the structured expert content required for the defined sub-skill types. |
 | [Release notes 2026](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2026) | 0.30 | Yearly release notes overview; summary mentions features and bug fixes and rolling upgrade timing but no specific limits, configuration parameters, or troubleshooting mappings. |
 | [Release notes 2026](https://learn.microsoft.com/en-us/azure/databricks/sql/release-notes/2026) | 0.30 | Databricks SQL 2026 release notes index; summary suggests a list of features and improvements. No clear indication of limits, configuration matrices, or troubleshooting structures in the snippet, so not classified as expert knowledge under the given categories. |
 | [Restore data and time travel](https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/create/child-instance) | 0.30 | Focuses on using child instances for time travel and restore. From the summary it looks like a feature/how-to explanation without explicit limits, configuration tables, or error-code troubleshooting. No clear evidence of quantified best practices or decision matrices. |
-| [Runtime maintenance updates](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/maintenance-updates) | 0.30 | Release/maintenance notes index for runtimes; the summary suggests high-level listing of maintenance updates and process (restart cluster) without exposing specific limits, configs, or troubleshooting mappings in the snippet. Without clear evidence of structured expert details (error codes, config tables, limits), it’s treated as non-expert for this classification. |
+| [Runtime maintenance updates](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/maintenance-updates) | 0.30 | Release notes listing maintenance updates by runtime version; summary does not indicate numeric limits, configuration tables, error-code mappings, or decision matrices. Primarily change-log information rather than reusable expert configuration or troubleshooting knowledge. |
+| [SAP BDC semantic metadata](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/sap-bdc/semantic-metadata) | 0.30 | Describes semantic metadata syncing from SAP BDC into Unity Catalog at a conceptual level; no indication of configuration parameters, limits, or detailed mappings that would qualify as expert configuration or integration patterns. |
 | [SQL release notes](https://learn.microsoft.com/en-us/azure/databricks/sql/release-notes/) | 0.30 | Top-level Databricks SQL release notes index; summary mentions features, improvements, known issues, and FAQs but not specific limits, configuration tables, or decision matrices. |
 | [SQL warehouses overview](https://learn.microsoft.com/en-us/azure/databricks/compute/sql-warehouse/) | 0.30 | Primarily a conceptual/usage overview of SQL warehouses and how to connect; no detailed limits, config tables, or error mappings that meet the expert-knowledge criteria. |
-| [Scala](https://learn.microsoft.com/en-us/azure/databricks/languages/scala) | 0.30 | Similar to the Python page, this is a guide/index for Scala content without specific configuration tables, limits, or troubleshooting mappings. |
-| [Schema evolution](https://learn.microsoft.com/en-us/azure/databricks/data-engineering/schema-evolution) | 0.30 | Conceptual explanation of schema evolution and its importance; summary does not indicate product-specific configuration tables, limits, or error mappings. |
 | [September 2018](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2018/september) | 0.30 | September 2018 Databricks release notes are standard release documentation without structured expert knowledge such as limits, configuration parameters, or troubleshooting mappings. |
 | [September 2019](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2019/september) | 0.30 | September 2019 Databricks release notes are standard release documentation without structured limits/quotas, configuration parameters, or decision matrices. |
 | [September 2020](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2020/september) | 0.30 | September 2020 Databricks release notes are a change log; they do not present structured limits, quotas, configuration parameter tables, or error-code-based troubleshooting. |
@@ -4274,25 +4278,28 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics
 | [September 2022](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2022/september) | 0.30 | September 2022 Azure Databricks release notes; monthly change log without structured limits, configuration, or troubleshooting mappings. |
 | [September 2025](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/september) | 0.30 | Monthly Azure Databricks release notes for September 2025; primarily feature announcements and improvements, not the structured technical reference material required for the defined sub-skill types. |
 | [Serve data with synced tables](https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/sync-data/sync-table) | 0.30 | Describes serving lakehouse data via synced tables for Lakebase Provisioned instances. The summary suggests conceptual and procedural content, not detailed limits, configuration matrices, or troubleshooting mappings required for the expert sub-skill types. |
+| [Serve lakehouse data with synced tables](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/sync-tables) | 0.30 | Describes synced tables and reverse ETL conceptually; summary does not show numeric limits, config parameter tables, or decision matrices. Appears to be conceptual/usage guidance rather than expert configuration or limits. |
 | [Serverless compute for notebooks](https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/notebooks) | 0.30 | How-to article for using serverless compute for notebooks; appears procedural without detailed config matrices or limits. |
-| [Serverless compute overview](https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/) | 0.30 | Explains how to connect to serverless compute; likely a usage/tutorial page rather than configuration reference with parameter tables. |
-| [Serverless compute release notes](https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/) | 0.30 | Serverless compute release notes overview; the summary indicates staged feature updates and behavior descriptions but no clear evidence of structured limits, config tables, or troubleshooting mappings in the snippet. Treated as general release information rather than a focused expert-knowledge page per the given categories. |
+| [Serverless compute overview](https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/) | 0.30 | Page appears to describe how to connect and use Azure Databricks serverless compute for notebooks, workflows, and Lakeflow pipelines. Based on the description, it is likely a conceptual/how-to connection guide without detailed limits tables, configuration parameter matrices, error-code-based troubleshooting, or decision matrices. Lacking clear evidence of numeric limits, config tables, or product-specific troubleshooting, it does not meet the expert-knowledge criteria for any sub-skill type. |
+| [Serverless compute release notes](https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/) | 0.30 | Serverless compute release notes describing features and behaviors over time; no indication of limits tables, config parameter references, or troubleshooting mappings. Mostly high-level feature availability information. |
 | [Serverless forecasting](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/serverless-forecasting) | 0.30 | Primarily a UI-driven tutorial for running serverless forecasting; no detailed config tables, limits, or product-specific troubleshooting. |
 | [Set alerts](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/anomaly-detection/alerts) | 0.30 | High-level description of setting up and viewing alerts; summary does not indicate specific configuration parameters, limits, or error mappings. |
 | [Set up Lakehouse Federation](https://learn.microsoft.com/en-us/azure/databricks/query-federation/) | 0.30 | Conceptual overview of Lakehouse Federation and types of federation; lacks detailed config, limits, or security role mappings. |
+| [Set up data source](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/d365-source-setup) | 0.30 | Focus is on setting up Dynamics 365 as a data source and configuring authentication. Likely a procedural setup guide rather than a reference of limits, configuration parameter tables, or troubleshooting mappings. No clear evidence of expert-only details from the provided summary. |
 | [Sign up for Databricks Free Edition](https://learn.microsoft.com/en-us/azure/databricks/getting-started/free-edition) | 0.30 | Signup instructions for Free Edition; procedural, not configuration/limits/troubleshooting focused. |
 | [Sign up for a free trial](https://learn.microsoft.com/en-us/azure/databricks/getting-started/free-trial) | 0.30 | Step-by-step sign-up and trial activation; does not focus on limits, configuration matrices, or decision criteria beyond basic eligibility and duration. |
 | [Single node PyTorch to distributed deep learning](https://learn.microsoft.com/en-us/azure/databricks/archive/machine-learning/train-model/mnist-pytorch) | 0.30 | Notebook-based tutorial on adapting single-node PyTorch to distributed training with HorovodRunner; primarily workflow and example code, not configuration tables, limits, or product-specific troubleshooting. |
 | [Source control](https://learn.microsoft.com/en-us/azure/databricks/dashboards/automate/git-support) | 0.30 | Explains how to use Databricks Git folders for version control and CI/CD of dashboards. The summary suggests a procedural/tutorial focus without specific configuration matrices, limits, or error-code troubleshooting content required for expert classification. |
+| [Streaming tables in Databricks SQL](https://learn.microsoft.com/en-us/azure/databricks/ldp/dbsql/streaming) | 0.30 | Primarily a how-to/tutorial for creating and using streaming tables; summary does not indicate numeric limits, config tables, error-code mappings, or product-specific decision matrices. |
 | [Structured Streaming tutorial](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/tutorial) | 0.30 | Introductory tutorial for first Structured Streaming queries; primarily example code and basic concepts, not detailed config matrices or limits. |
 | [Tables](https://learn.microsoft.com/en-us/azure/databricks/tables/table-overview) | 0.30 | Overview of table types and recommendations; lacks numeric limits, config tables, or detailed security roles. |
 | [Tables concepts](https://learn.microsoft.com/en-us/azure/databricks/tables/tables-concepts) | 0.30 | Concepts page defining tables and showing a simple example; conceptual rather than configuration, limits, or troubleshooting. |
 | [Tables editor](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/table-editor) | 0.30 | Tables editor page appears to be a UI/usage walkthrough for viewing and editing data; summary does not suggest numeric limits, config parameter tables, or error-code-based troubleshooting. |
-| [Technology partner overview](https://learn.microsoft.com/en-us/azure/databricks/integrations/) | 0.30 | High-level overview of technology partner integrations and Partner Connect; primarily marketing/overview without detailed configuration tables or numeric constraints. |
 | [Temporary tables](https://learn.microsoft.com/en-us/azure/databricks/tables/temporary-tables) | 0.30 | Focuses on what temporary tables are and when to use them; summary does not indicate numeric limits, detailed configuration parameter tables, or troubleshooting/decision matrices. |
 | [Text widgets](https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/visualizations/text-widgets) | 0.30 | Text widget usage and markdown formatting; generic and UI-focused, not product-specific expert configuration. |
 | [Traditional ML model workflow](https://learn.microsoft.com/en-us/azure/databricks/mlflow/mlflow3-ml-workflow) | 0.30 | Example workflow notebook with screenshots; tutorial-style content rather than structured expert reference on limits, configs, or troubleshooting. |
-| [Train and deploy an ML model](https://learn.microsoft.com/en-us/azure/databricks/getting-started/ml-get-started) | 0.30 | ML tutorial using scikit-learn and MLflow; mainly generic ML workflow and Databricks usage, not Databricks-specific configuration or limits. |
+| [Transform data](https://learn.microsoft.com/en-us/azure/databricks/ldp/transform) | 0.30 | Transformation patterns and general usage of pipelines; summary suggests examples but not specific numeric limits, configuration tables, or error-code-based troubleshooting. |
+| [Triggered vs. continuous](https://learn.microsoft.com/en-us/azure/databricks/ldp/pipeline-mode) | 0.30 | Explains semantics of triggered vs continuous pipeline modes conceptually; no evidence of numeric thresholds, configuration tables, or product-specific error/diagnostic details. |
 | [Tutorial](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/tutorial) | 0.30 | Tutorial for using VS Code extension to run Python; typical step-by-step usage without detailed configuration matrices or troubleshooting content. |
 | [Tutorial](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/tutorial) | 0.30 | Tutorial for running a real-time streaming workload; primarily step-by-step example usage. Does not clearly indicate configuration matrices, limits, or decision/troubleshooting content. |
 | [Tutorial: Analyze customer reviews using AI functions](https://learn.microsoft.com/en-us/azure/databricks/large-language-models/ai-functions-example) | 0.30 | Scenario tutorial showing how to analyze reviews with AI Functions.
Based on the summary, it focuses on an example workflow rather than enumerating configuration tables, limits, or detailed error mappings. It is primarily instructional/tutorial content, not a reference of expert-only details. | @@ -4300,17 +4307,18 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [Tutorial: ETL in Databricks SQL](https://learn.microsoft.com/en-us/azure/databricks/sql/get-started/sql-etl-tutorial) | 0.30 | Tutorial for building an incremental ETL pipeline; primarily step-by-step guidance without configuration tables, limits, or error-resolution mappings. | | [Tutorials](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/tutorials) | 0.30 | Tutorials landing page listing workflows; primarily navigational without detailed technical content in the summary. | | [Unit testing](https://learn.microsoft.com/en-us/azure/databricks/notebooks/testing) | 0.30 | Introductory unit testing guidance; likely generic testing patterns rather than Databricks-specific best practices with quantified impact. | +| [Unity Catalog](https://learn.microsoft.com/en-us/azure/databricks/ldp/unity-catalog) | 0.30 | Describes using Unity Catalog with pipelines at a conceptual level (read/write, permissions via GRANT/REVOKE); summary does not indicate detailed RBAC role lists, config tables, or other expert-only specifics. | | [Update schema](https://learn.microsoft.com/en-us/azure/databricks/delta/update-schema) | 0.30 | High-level description of schema evolution capabilities and coordination concerns; the summary does not indicate specific configuration parameters, numeric thresholds, or error codes. Likely more of a conceptual/how-to page than a reference with expert-only details. 
| -| [Usage dashboards](https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/usage) | 0.30 | High-level explanation of importing and using pre-built usage dashboards; likely a navigation/overview page without detailed parameter tables or product-specific limits. | | [Use Genie Code](https://learn.microsoft.com/en-us/azure/databricks/genie-code/use-genie-code) | 0.30 | Usage overview for Genie Code (enable/disable, general capabilities) but summary does not show detailed configuration tables, limits, or error-code-based troubleshooting. | +| [Use Genie to explore data](https://learn.microsoft.com/en-us/azure/databricks/genie/talk-to-genie) | 0.30 | Explains how business users interact with Genie spaces and provide feedback; primarily usage guidance without detailed configuration parameters, limits, or troubleshooting mappings. | | [Use features with RAG applications](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/rag) | 0.30 | Example notebook for structured RAG with feature serving; primarily tutorial-style workflow without clear evidence of product-specific limits, configs tables, or troubleshooting mappings. | | [Use the simple compute form](https://learn.microsoft.com/en-us/azure/databricks/compute/simple-form) | 0.30 | Explains how to use the simple compute UI; mostly procedural UI steps without deep configuration reference tables or limits. | -| [Visualizations](https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/visualizations/) | 0.30 | Explains how to configure dashboard visualizations; appears to be UI/tutorial style without detailed config parameter tables or numeric constraints. | +| [Visualization types](https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/visualizations/types) | 0.30 | Lists visualization types and examples; appears as feature usage documentation rather than detailed configuration, limits, or troubleshooting guidance. 
| +| [Visualizations](https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/visualizations/) | 0.30 | Covers how to use and customize visualization widgets; likely UI-driven instructions without expert-only limits, quotas, or config reference tables. | | [Wanderbricks dataset](https://learn.microsoft.com/en-us/azure/databricks/discover/wanderbricks-dataset) | 0.30 | Describes the Wanderbricks sample dataset schema and use cases; this is descriptive dataset documentation, not configuration, limits, or troubleshooting guidance. | | [Where is DLT?](https://learn.microsoft.com/en-us/azure/databricks/ldp/where-is-dlt) | 0.30 | Explains product renaming and high-level migration note; no detailed config, limits, or decision matrices. | | [Window frame clause](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-window-functions-frame) | 0.30 | Explains the window frame clause conceptually. While it may show syntax, it is primarily standard SQL semantics rather than Databricks-specific configuration tables, limits, or troubleshooting content. | | [Window functions](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-window-functions) | 0.30 | High-level description of window functions and their uses; likely conceptual with generic examples. Does not obviously focus on product-specific limits, configuration parameters, or error codes. | -| [Workspace browser and file management](https://learn.microsoft.com/en-us/azure/databricks/workspace/workspace-browser) | 0.30 | Workspace browser usage and organization; operational UI guidance, not configuration reference or limits. | | [Workspace files overview](https://learn.microsoft.com/en-us/azure/databricks/files/workspace) | 0.30 | Conceptual explanation of Databricks workspace files and Git folders; no detailed configuration parameter tables, limits, quotas, or troubleshooting mappings. 
| | [aes_decrypt function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/aes_decrypt) | 0.30 | Page describes aes_decrypt() syntax and behavior but is a straightforward function reference without detailed security configuration, limits, or troubleshooting mappings. | | [aes_encrypt function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/aes_encrypt) | 0.30 | Page describes aes_encrypt() syntax and behavior; it is a standard function reference without RBAC roles, config tables, limits, or decision/troubleshooting structures. | @@ -4460,25 +4468,26 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [xpath_number function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/xpath_number) | 0.30 | Function reference for xpath_number; likely just syntax, arguments, and basic behavior without product-specific limits, quotas, config tables, or troubleshooting mappings. | | [year function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/year) | 0.30 | Simple date function (year) reference; describes behavior and note about session timezone but no detailed limits, config matrices, or error mappings. | | [What is Catalog Explorer?](https://learn.microsoft.com/en-us/azure/databricks/catalog-explorer/) | 0.28 | Introductory description of Catalog Explorer UI; lacks detailed configuration parameters or expert-only behavior. | -| [Concepts](https://learn.microsoft.com/en-us/azure/databricks/dashboards/concepts) | 0.25 | Conceptual explanation of dashboard capabilities and usage; does not expose detailed configuration parameters, limits, or product-specific troubleshooting. | -| [Concepts](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/concepts) | 0.25 | Feature store overview and glossary; conceptual definitions rather than actionable configuration or troubleshooting details. 
| | [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-pipeline) | 0.25 | Tutorial for creating a MySQL ingestion pipeline; primarily procedural ingestion steps rather than structured configuration reference, limits, or troubleshooting catalog. | -| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-reports-pipeline) | 0.25 | Tutorial on ingesting Workday reports; primarily a how-to pipeline creation guide, not a configuration reference or troubleshooting catalog. | | [Data discovery and collaboration in the lakehouse](https://learn.microsoft.com/en-us/azure/databricks/lakehouse/collaboration) | 0.25 | Covers collaboration and data discovery concepts using Unity Catalog and Delta Sharing; appears architectural and conceptual rather than detailed config or troubleshooting. | | [Data guides overview](https://learn.microsoft.com/en-us/azure/databricks/guides/) | 0.25 | Navigation-style data guides hub that routes users to other content; does not itself contain detailed limits, configuration, or troubleshooting mappings. | | [Database instances](https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/instance) | 0.25 | Explains what a database instance is; conceptual resource model description without detailed configuration tables or limits in the summary. | | [Databricks SQL tutorials](https://learn.microsoft.com/en-us/azure/databricks/sql/get-started/) | 0.25 | Get-started guide for Databricks SQL; mostly onboarding and navigation, not deep configuration, limits, or troubleshooting content. | | [Lakebase Provisioned](https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/) | 0.25 | General overview of Lakebase; summary is introductory and rollout info, without specific limits, configs, or security roles. 
| -| [Manage workspace objects](https://learn.microsoft.com/en-us/azure/databricks/workspace/workspace-objects) | 0.25 | Explains how to manage workspace objects via UI actions; lacks detailed configuration parameters, limits, or troubleshooting mappings. | -| [Navigate the workspace](https://learn.microsoft.com/en-us/azure/databricks/workspace/navigate-workspace) | 0.25 | UI navigation guide for the workspace; describes screens and basic actions without deep configuration tables, limits, or troubleshooting content. | +| [Manage workspace objects](https://learn.microsoft.com/en-us/azure/databricks/workspace/workspace-objects) | 0.25 | Covers managing workspace objects (folders, notebooks, etc.) via UI actions; procedural but not focused on expert-level configuration parameters or limits. | +| [Release 2024.33](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/2024/33/) | 0.25 | DLT release 2024.33 notes list new features, improvements, and bug fixes; summary does not indicate structured limits, configuration options, or error-code-based troubleshooting content. | +| [Release notes 2024](https://learn.microsoft.com/en-us/azure/databricks/sql/release-notes/2024) | 0.25 | Databricks SQL 2024 release notes describe new capabilities (system tables, time travel, scheduling syntax) but summary does not show detailed limits, config tables, or troubleshooting mappings required for classification. | | [Salesforce](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/salesforce-concepts) | 0.25 | Salesforce connector 'concepts' page; primarily high-level explanation and scenario description, not detailed limits, configs, or troubleshooting. | | [Sample dashboards](https://learn.microsoft.com/en-us/azure/databricks/sql/get-started/sample-dashboards) | 0.25 | Tutorial on using sample dashboards; step-by-step UI usage without detailed product-specific configuration matrices or limits. 
| | [ServiceNow](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/servicenow) | 0.25 | ServiceNow connector overview; primarily conceptual and persona-focused guidance, not detailed limits, configuration matrices, or troubleshooting. | | [SharePoint](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint) | 0.25 | SharePoint connector overview; likely conceptual and marketing, not detailed limits, configuration parameters, or troubleshooting. | | [Tutorial: Run code on classic compute](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/scala/tutorial) | 0.25 | Tutorial for running code from IntelliJ; typical walkthrough content rather than expert configuration, limits, or troubleshooting data. | +| [View app details](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/view-app-details) | 0.25 | Explains the app details page for Databricks Apps (deployment status, logs, configuration, history). While useful for operations, the summary does not indicate specific error codes, configuration parameter tables, or other structured expert knowledge. | | [Visualization types](https://learn.microsoft.com/en-us/azure/databricks/visualizations/visualization-types) | 0.25 | Lists visualization types and examples; lacks deep config matrices, limits, or decision/troubleshooting content. | | [Visualizations](https://learn.microsoft.com/en-us/azure/databricks/visualizations/) | 0.25 | General visualization overview; no detailed option tables, limits, or product-specific troubleshooting. | | [What is Lakebase Provisioned?](https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/about) | 0.25 | Introductory page for Lakebase Provisioned; primarily conceptual overview of the service. 
| +| [Workspace browser and file management](https://learn.microsoft.com/en-us/azure/databricks/workspace/workspace-browser) | 0.25 | Explains how to create and organize workspace objects; likely procedural UI guidance without detailed parameter tables, limits, or troubleshooting mappings. | +| [Workspace creation overview](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/) | 0.25 | Overview/landing page for workspace creation and management options without detailed settings, limits, or troubleshooting mappings. | | [any aggregate function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/any) | 0.25 | ANY aggregate function reference; describes boolean aggregation and synonyms, but no limits, configs, or error mappings. | | [any_value aggregate function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/any_value) | 0.25 | ANY_VALUE aggregate function; non-deterministic behavior but summary does not indicate product-specific configuration or constraints. | | [array function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/array) | 0.25 | ARRAY constructor function; generic SQL-like behavior without Databricks-specific expert details. | @@ -4576,120 +4585,136 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [> (gt sign) operator](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/gtsign) | 0.20 | Operator syntax/semantics for '>' in Databricks SQL; no limits, configs, error codes, or product-specific expert patterns. | | [AI governance](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/ai-governance) | 0.20 | High-level AI governance overview; lacks concrete RBAC role lists, config parameters, or detailed decision matrices. 
| | [Add images, equations, and other media](https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-media) | 0.20 | How-to content for adding media to notebooks; likely shows markdown/HTML syntax but no product-specific limits, configs, or error mappings that qualify as expert knowledge per the defined categories. | -| [Administration topics](https://learn.microsoft.com/en-us/azure/databricks/admin/) | 0.20 | High-level administration landing page describing areas like account, workspaces, users, security, and monitoring. No indication of specific limits, configuration parameter tables, error codes, or decision matrices; appears to be conceptual/navigation content. | +| [Administration overview](https://learn.microsoft.com/en-us/azure/databricks/admin/admin-concepts) | 0.20 | Conceptual overview of admin roles and responsibilities without product-specific numeric limits, configs, or troubleshooting content. | +| [Applicable model terms](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/acceptable-use-models) | 0.20 | Lists applicable model terms and policy links; legal/compliance content without technical limits, configuration, or product-specific operational details. | | [August 2020](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2020/august) | 0.20 | August 2020 release notes (including mention of version 3.26 regional rollout) are change-log style content. They do not provide structured limits, configuration tables, troubleshooting steps, or decision criteria that fit the defined sub-skill types. | -| [Author dashboards](https://learn.microsoft.com/en-us/azure/databricks/dashboards/manage/) | 0.20 | Covers basic authoring operations (create, view, organize); primarily conceptual/how-to UI usage without deep config or limits. 
| +| [Author agents on Databricks Apps](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/author-agent) | 0.20 | High-level guidance on authoring and deploying agents on Databricks Apps; summary indicates conceptual workflow and when to use Supervisor API, but no concrete API parameters, limits, or detailed configuration tables are evident. | | [AutoML overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl/) | 0.20 | Conceptual 'What is AutoML?' overview and basic requirements; no indication of detailed parameters, limits, or error codes in the summary. | | [Basic editing](https://learn.microsoft.com/en-us/azure/databricks/notebooks/basic-editing) | 0.20 | Basic editing and UI usage in notebooks; mostly step-by-step editor usage without deep product-specific configuration tables or expert-only details. | +| [Basics](https://learn.microsoft.com/en-us/azure/databricks/pyspark/basics) | 0.20 | Introductory PySpark examples (DataFrames, transforms, visualization, saving data); tutorial-style usage without Databricks-specific limits, configs, or troubleshooting mappings. | | [Budget policies](https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-budget-policies) | 0.20 | The description focuses on using budget policies to track spending and references a general attribute-based pricing article. It reads as cost-tracking/administration guidance without clear evidence of numeric limits, configuration tables, or detailed parameter specs; more conceptual than expert configuration or limits content. | -| [Built-in functions](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-functions-builtin) | 0.20 | Index page listing categories of built-in SQL functions; summary shows no parameter tables, limits, or detailed syntax/semantics. Likely a navigational overview rather than expert configuration or troubleshooting content. 
| | [CHECK_CONSTRAINTS](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/information-schema/check_constraints) | 0.20 | Reserved for future use; no concrete schema, parameters, or behavior described yet. | +| [Chat](https://learn.microsoft.com/en-us/azure/databricks/workspace/databricks-one-chat) | 0.20 | Describes Chat in Databricks One at a feature level; no specific configuration tables, error codes, or quantified limits evident from the summary. | +| [Clean rooms overview](https://learn.microsoft.com/en-us/azure/databricks/clean-rooms/) | 0.20 | High-level feature introduction for Databricks Clean Rooms without detailed limits, configuration parameters, or troubleshooting mappings. | +| [Collect feedback and expectations by labeling existing traces](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/human-feedback/expert-feedback/label-existing-traces) | 0.20 | Tutorial-style guidance on using MLflow Review App to label existing GenAI traces; likely focuses on workflow steps rather than product-specific limits, configuration tables, error codes, or security/decision matrices. | | [Computes and endpoints](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/computes-and-endpoints) | 0.20 | Appears to be a conceptual/usage overview of computes, endpoints, and instances in Lakebase without clear evidence of numeric limits, configuration tables, or troubleshooting mappings. | +| [Concepts](https://learn.microsoft.com/en-us/azure/databricks/dashboards/concepts) | 0.20 | Conceptual description of AI/BI dashboard capabilities; lacks concrete limits, configuration parameters, or troubleshooting mappings. | | [Concepts](https://learn.microsoft.com/en-us/azure/databricks/ingestion/) | 0.20 | High-level description of standard connectors; appears to be overview/navigation without detailed limits, configs, or troubleshooting content. 
| | [Concepts](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/) | 0.20 | Described as an overview of managed connectors in Lakeflow Connect, focusing on capabilities and benefits (incremental reads/writes, cost efficiency). No clear indication of detailed configuration tables, limits, or error mappings. | | [Concepts](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-analytics-concepts) | 0.20 | Concepts page; likely explains conceptual model of GA4 raw data ingestion rather than concrete limits or configs. | | [Concepts](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/meta-ads-concepts) | 0.20 | A 'concepts' page is typically high-level explanation of how the connector works, not detailed configuration, limits, or troubleshooting. | -| [Concepts](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-concepts) | 0.20 | Concepts/how-it-works description; likely high-level connector concepts rather than detailed configuration or troubleshooting data. | +| [Concepts](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/concepts) | 0.20 | Conceptual overview and glossary of feature store concepts without product-specific limits, configs, or error mappings. | | [Concepts](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/concepts/) | 0.20 | Conceptual explanation of MLflow GenAI data model (experiments, traces, assessments) without product-specific limits, configs, or decision matrices. | | [Concepts](https://learn.microsoft.com/en-us/azure/databricks/query-federation/database-federation) | 0.20 | Described as a conceptual 'What is query federation?' overview explaining pushdown and supported sources; no indication of specific limits, configuration tables, error codes, or decision matrices, so it does not meet the expert-knowledge criteria for any sub-skill. 
| | [Concepts](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/concepts) | 0.20 | Conceptual page about what real-time mode is and when to use it; summary suggests high-level concepts without quantified decision matrices or detailed configuration parameters. | | [Connect](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/connect) | 0.20 | High-level guidance on connecting to Lakebase using various clients and auth methods; summary does not indicate detailed config tables, limits, or error mappings. | -| [Cost management tools on Databricks](https://learn.microsoft.com/en-us/azure/databricks/admin/usage/) | 0.20 | Section overview of cost management tools; appears to be navigational/summary content without detailed configuration, limits, or troubleshooting data. | -| [Create and manage compute policies](https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/policies) | 0.20 | High-level guidance on creating and managing compute policies; the summary does not indicate detailed parameter tables, limits, or product-specific gotchas. It appears more like conceptual/administrative how-to content than expert reference. | -| [Create and manage governed tags](https://learn.microsoft.com/en-us/azure/databricks/admin/governed-tags/manage-governed-tags) | 0.20 | The page describes how to create and manage governed tags and includes a general security warning about tag data being stored as plain text and replicated globally. From the summary, it appears to be a procedural/how-to page without detailed RBAC roles, configuration parameter tables, or other expert-only specifics. It does not clearly match limits, configuration, troubleshooting, or other expert categories based on the provided excerpt. 
| -| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/tiktok-ads-pipeline) | 0.20 | How-to guide for creating a TikTok Ads ingestion pipeline; appears to be a step-by-step tutorial rather than configuration reference or troubleshooting with expert-only details. | -| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-hcm-pipeline) | 0.20 | Pipeline creation tutorial for Workday HCM; likely procedural steps rather than structured configuration reference or error-resolution mappings. | -| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/zendesk-support-pipeline) | 0.20 | How-to guide for creating a Zendesk Support ingestion pipeline; likely procedural without structured configuration or error-resolution tables. | -| [Create workspace using Azure Portal](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/create-workspace) | 0.20 | Step-by-step deployment via Azure Portal without tier matrices, constraints, or detailed configuration parameter tables; primarily a basic how-to guide rather than expert configuration or deployment reference. | +| [Create a custom app](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/create-custom-app) | 0.20 | Describes creating a custom Databricks app and uploading code. This is a how-to flow without explicit expert-level configuration matrices, limits, or troubleshooting structures. | +| [Create an app from a template](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/create-app-template) | 0.20 | Guides creating an app from a template via UI. Appears to be procedural/tutorial content without detailed configuration parameter tables, limits, or error-resolution mappings. 
| +| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/meta-ads-pipeline) | 0.20 | Appears to be a how-to/tutorial for creating a Meta Ads ingestion pipeline. No indication of numeric limits, config parameter tables, error-code mappings, or other expert-only details; likely step-by-step usage instructions. | +| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/netsuite-pipeline) | 0.20 | Described as a page showing how to create a managed NetSuite ingestion pipeline. This is typical tutorial content without evidence of limits, configuration matrices, or troubleshooting mappings. | +| [Create ingestion pipeline](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/servicenow-pipeline) | 0.20 | ServiceNow ingestion pipeline creation guide; summary suggests procedural instructions only. No signs of detailed limits, config tables, or error-resolution content that would qualify as expert knowledge. | | [DROP BLOOMFILTER INDEX (deprecated)](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-drop-bloomfilter-index) | 0.20 | This is a narrow SQL syntax reference for dropping a deprecated Bloom filter index. It likely describes command usage and options but not limits/quotas, configuration tables, error-code-based troubleshooting, or other expert-only operational details as defined in the sub-skill types. | -| [Dashboards](https://learn.microsoft.com/en-us/azure/databricks/dashboards/) | 0.20 | Overview of Databricks dashboards and sharing capabilities; lacks numeric limits, detailed configuration tables, troubleshooting mappings, or other product-specific expert details. | +| [Dashboards](https://learn.microsoft.com/en-us/azure/databricks/dashboards/) | 0.20 | High-level overview of Databricks dashboards and sharing; no numeric limits, configuration tables, error codes, or product-specific decision matrices. 
| | [Dashboards with Genie spaces](https://learn.microsoft.com/en-us/azure/databricks/dashboards/genie-spaces) | 0.20 | Describes Genie spaces with dashboards and natural language exploration; appears to be feature usage/overview, not deep config or limits. | | [Data warehouse architecture](https://learn.microsoft.com/en-us/azure/databricks/sql/get-started/data-warehousing-concepts) | 0.20 | Conceptual architecture and patterns (lakehouse, medallion) for data warehousing; generic design concepts rather than product-specific numeric thresholds or decision matrices. | -| [Databricks Runtime maintenance updates (archived)](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/maintenance-updates-archive) | 0.20 | Archived maintenance updates index; describes existence of updates but not detailed limits, configuration parameters, or error-resolution mappings. | -| [Databricks best practice articles](https://learn.microsoft.com/en-us/azure/databricks/getting-started/best-practices) | 0.20 | Index page listing best practice articles, but itself does not contain concrete recommendations, configs, or quantified guidance. | +| [Databricks Runtime 16.0](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.0) | 0.20 | Runtime 16.0 release notes summary; similar to other runtime release notes, mainly version info and high-level changes. The summary does not show numeric limits, configuration tables, or structured troubleshooting content that matches any sub-skill type. | +| [Databricks Runtime 18.0](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/18.0) | 0.20 | Release notes summary; likely lists changes and fixes but not organized as limits, configuration matrices, troubleshooting mappings, or other defined sub-skill types. No clear indication of numeric limits, config tables, or error-code-based troubleshooting in the provided summary. 
| | [Databricks developer tools releases](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dev-tools/) | 0.20 | Release notes index for developer tools and SDKs; summary indicates it mainly links to releases and does not describe specific limits, configs, error codes, or decision matrices. | | [December 2020](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2020/december) | 0.20 | December 2020 Azure Databricks release notes describe feature changes over time, not structured expert knowledge like limits, configuration parameters, or troubleshooting mappings. They do not match any of the specified sub-skill categories. | | [Declarative Automation Bundles feature releases](https://learn.microsoft.com/en-us/azure/databricks/release-notes/dev-tools/bundles) | 0.20 | Feature release notes for Declarative Automation Bundles; describes feature changes and points to GitHub releases, but not organized as limits, configuration parameter tables, or troubleshooting mappings. | | [Dedicated compute overview](https://learn.microsoft.com/en-us/azure/databricks/compute/dedicated-overview) | 0.20 | Dedicated compute overview; primarily conceptual description of access mode and benefits. | | [Deep learning overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/deep-learning) | 0.20 | High-level intro to deep learning options on Databricks with links to notebooks; no detailed configs, limits, or product-specific patterns. | -| [Delta Sharing overview](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/) | 0.20 | Introductory overview of Delta Sharing; no detailed limits, configs, error codes, or decision matrices. | +| [Delta Sharing overview](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/) | 0.20 | High-level introduction to Delta Sharing and related concepts (Marketplace, Clean Rooms) without concrete limits, configuration parameters, or detailed procedures. 
| | [Direct deployment engine](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/direct) | 0.20 | Explains migration from Terraform-based to direct deployment engine for Databricks bundles; summary indicates a feature announcement and high-level migration guidance, not detailed deployment matrices, configuration parameter tables, or quantified trade-offs. | | [Elements of a well-architected lakehouse](https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/well-architected) | 0.20 | High-level introduction to the Databricks well-architected framework; acts as a navigation/overview page. | | [End-of-support releases](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/eos) | 0.20 | End-of-support index page; mostly lifecycle and navigation information without numeric limits, configs, or troubleshooting details. | +| [Explore sample data](https://learn.microsoft.com/en-us/azure/databricks/genie-code/sample-data-explorer) | 0.20 | Describes using Genie Code to explore sample data; summary suggests conceptual and workflow guidance, not detailed configuration, limits, or troubleshooting. | | [Expressions](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-expression) | 0.20 | High-level definition of expressions; summary does not indicate product-specific limits, parameters, or error mappings beyond generic SQL expression concepts. | +| [FAQ](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-faq) | 0.20 | FAQ page; description does not indicate detailed error codes, config tables, or numeric limits—likely conceptual and preview/usage Q&A. | +| [Fabric](https://learn.microsoft.com/en-us/azure/databricks/partners/bi/fabric) | 0.20 | Described as an overview of using Microsoft Fabric to read Unity Catalog data. 
Likely high-level integration guidance without detailed configuration tables, parameters, or product-specific limits, so it does not clearly meet any expert-knowledge sub-skill criteria. | +| [Feature Store overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/) | 0.20 | Overview of Databricks Feature Store in Unity Catalog describing capabilities (governance, lineage, sharing); no indication of numeric limits, detailed configuration parameters, or troubleshooting mappings. | +| [Flow examples](https://learn.microsoft.com/en-us/azure/databricks/ldp/flow-examples) | 0.20 | Flow examples likely illustrate common patterns, but the summary does not indicate product-specific configuration tables, limits, or troubleshooting mappings. | | [Gen AI capabilities](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/guide/gen-ai-capabilities) | 0.20 | Lists Azure Databricks GenAI capabilities by workflow stage; primarily an overview/feature catalog without detailed decision matrices, configs, or troubleshooting. | -| [Glossary](https://learn.microsoft.com/en-us/azure/databricks/resources/glossary) | 0.20 | Glossary of terminology; mostly definitions rather than limits, configs, or troubleshooting mappings. | -| [Groups](https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/groups) | 0.20 | Described as an overview of groups; likely conceptual without detailed role names, config parameters, or limits. | +| [Get started with apps](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/get-started) | 0.20 | Step-by-step getting started tutorial for Databricks Apps. While it may show example commands or flows, it is focused on walkthrough rather than enumerating configuration matrices, limits, or troubleshooting mappings. 
| +| [Governed tags](https://learn.microsoft.com/en-us/azure/databricks/admin/governed-tags/) | 0.20 | Described as an overview of governed tags with a general warning about tag data; likely conceptual governance explanation without detailed permissions, config tables, or numeric limits. | +| [Iceberg](https://learn.microsoft.com/en-us/azure/databricks/iceberg/) | 0.20 | High-level explanation of Apache Iceberg support in Azure Databricks; summary indicates conceptual overview of features (schema evolution, time travel, hidden partitioning) without product-specific limits, configuration tables, or detailed patterns. | | [Integrate](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/integrations) | 0.20 | Summary describes a conceptual integration overview of Lakebase Postgres with other Databricks services without exposing concrete configuration parameters, code patterns, or limits. | | [Integrations](https://learn.microsoft.com/en-us/azure/databricks/getting-started/connect/) | 0.20 | Integrations overview that lists categories of connections and tools but does not expose detailed configuration parameters, limits, or SDK reference tables. | | [Introduction to RAG](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/retrieval-augmented-generation) | 0.20 | RAG overview on Azure Databricks; explains what RAG is and its benefits, but summary does not indicate detailed product-specific configs or limits. | -| [Introduction to vector search](https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search) | 0.20 | High-level overview of Mosaic AI Vector Search describing what it is and how it works conceptually. No evidence of numeric limits, configuration parameter tables, error codes, or decision matrices with thresholds. 
| -| [Introduction to workspace objects](https://learn.microsoft.com/en-us/azure/databricks/workspace/workspace-assets) | 0.20 | High-level introduction to workspace objects; describes object types conceptually without detailed configuration or security matrices. | -| [JSON files](https://learn.microsoft.com/en-us/azure/databricks/query/formats/json) | 0.20 | JSON format page is a simple usage description (single-line vs multi-line) without detailed config tables or limits. | +| [Introduction to vector search](https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search) | 0.20 | High-level overview of Mosaic AI Vector Search without detailed limits, configs, or troubleshooting content. | | [June 2020](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2020/june) | 0.20 | June 2020 release notes describe staged releases and feature additions but lack the structured, reusable expert knowledge (limits, config parameters, error mappings, or decision matrices) required for any of the specified sub-skill categories. | | [Key concepts in apps](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/key-concepts) | 0.20 | Key concepts article describing structure, permissions, and interactions at a conceptual level; lacks concrete configuration tables, numeric thresholds, or error-code-based troubleshooting. | | [Lakebase Autoscaling](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/) | 0.20 | High-level overview/navigation for Lakebase Autoscaling projects; primarily marketing and orientation content without detailed limits, configs, or error handling. | | [Legacy alerts](https://learn.microsoft.com/en-us/azure/databricks/sql/user/alerts/legacy) | 0.20 | Explains legacy alerts conceptually and how they run queries; no indication of numeric limits, config tables, or error-code-based troubleshooting. 
| +| [Legacy visualizations](https://learn.microsoft.com/en-us/azure/databricks/visualizations/legacy-visualizations) | 0.20 | Appears to be a conceptual/feature description of legacy visualizations and references to other visualization docs. No clear evidence of detailed configuration tables, limits, error codes, or product-specific patterns that meet the expert-knowledge criteria. | +| [Load data using a Unity Catalog external location](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/add-data-external-locations) | 0.20 | How-to article using the add data UI; appears to be a step-by-step tutorial without detailed configuration parameter tables or numeric constraints. | | [Local development](https://learn.microsoft.com/en-us/azure/databricks/ldp/develop-locally) | 0.20 | Overview of local development workflow; appears procedural without detailed configuration tables, limits, or troubleshooting mappings. | | [MLOps Stacks](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/mlops/mlops-stacks) | 0.20 | High-level description of MLOps Stacks and process-as-code benefits; no concrete limits, config parameter tables, error codes, or product-specific decision matrices with thresholds. | -| [MLflow 2: Quickstart notebook](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-evaluation/evaluation-quickstart) | 0.20 | Tutorial/quickstart description for an evaluation notebook; no indication of specific limits, configuration tables, error codes, or product-specific settings. Appears to be procedural guidance rather than detailed reference content with expert-only knowledge. 
| +| [MLflow 2: Quickstart notebook](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-evaluation/evaluation-quickstart) | 0.20 | Notebook-based quickstart for evaluating a GenAI app with MLflow 2 Agent Evaluation; primarily a tutorial and example workflow, not a reference for limits, configuration matrices, troubleshooting codes, or other expert-only details. | +| [MLlib](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/mllib) | 0.20 | Example notebooks for using Spark MLlib on Databricks; likely generic ML usage patterns rather than Databricks-specific limits, configs, or error mappings. | +| [Manage Databricks Previews](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/manage-previews) | 0.20 | Managing previews is mostly procedural/admin UI guidance without detailed config tables or numeric constraints. | +| [Manage classic compute](https://learn.microsoft.com/en-us/azure/databricks/compute/clusters-manage) | 0.20 | Described as a how-to for managing compute (start, stop, edit, monitor); likely procedural UI/API usage without detailed configuration tables, limits, or error-code-based troubleshooting. | | [Manage workspace settings overview](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/) | 0.20 | Section overview for workspace settings; navigation-style content without specific settings tables in the summary. | | [May 2021](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2021/may) | 0.20 | Release notes summarize new features and improvements for May 2021 but are not organized as limits, configuration references, troubleshooting guides, or decision matrices. They typically lack structured numeric limits, config tables, or error-code mappings required by the defined sub-skill types. 
| | [Meta Ads](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/meta-ads) | 0.20 | Described as an overview of the Meta Ads connector; likely conceptual/marketing without detailed limits, configs, or error mappings. | | [Model Serving concepts](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/glossary) | 0.20 | Glossary of model serving concepts; definitions are conceptual and not configuration, limits, or error-resolution content. | +| [Model serving endpoints (previous)](https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/overview-serving-endpoints) | 0.20 | Described as an overview of Mosaic AI Gateway for serving endpoints and supported features; overviews typically lack detailed numeric limits, configuration tables, or error mappings. | +| [Monitor jobs](https://learn.microsoft.com/en-us/azure/databricks/jobs/monitor) | 0.20 | The summary indicates a conceptual/feature overview of monitoring and observability for jobs, referencing UI views, CLI help commands, and the Jobs API, but does not show specific error codes, diagnostic mappings, or configuration tables. It reads as general usage guidance rather than expert troubleshooting or configuration detail. | | [NetSuite](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/netsuite) | 0.20 | Connector overview for NetSuite; likely conceptual and marketing without detailed limits, config tables, or troubleshooting content. | | [Notebook tutorial](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/tutorial-notebook) | 0.20 | Tutorial notebook for COPY INTO; likely step-by-step example rather than structured configuration tables, limits, or troubleshooting matrices. 
| | [Notebooks overview](https://learn.microsoft.com/en-us/azure/databricks/notebooks/) | 0.20 | High-level overview of Databricks notebooks; primarily conceptual and feature description without detailed limits, configs, or troubleshooting mappings. | -| [ORC files](https://learn.microsoft.com/en-us/azure/databricks/query/formats/orc) | 0.20 | ORC format page is a basic how-to; no product-specific limits, config matrices, or decision/troubleshooting content. | | [October 2020](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2020/october) | 0.20 | October 2020 release notes list platform updates but are not focused on quotas, configuration matrices, security roles, or error-resolution flows. They are temporal change logs rather than reusable expert-knowledge references as defined. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/data-governance/) | 0.20 | High-level overview of data governance concepts and Unity Catalog; primarily conceptual framework rather than detailed configuration or troubleshooting content. | +| [Overview](https://learn.microsoft.com/en-us/azure/databricks/designer/) | 0.20 | High-level description of Lakeflow Designer as a visual, no-code experience; appears to be overview/marketing without detailed limits, config tables, or troubleshooting content. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/tutorials) | 0.20 | Tutorials index/overview page; navigation content pointing to other tutorials, not detailed expert configuration or troubleshooting content itself. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/) | 0.20 | Primarily an overview of the Databricks VS Code extension and its capabilities; no detailed configuration tables, limits, error-code-based troubleshooting, or product-specific decision matrices are evident from the summary. 
| | [Overview](https://learn.microsoft.com/en-us/azure/databricks/developers/) | 0.20 | High-level overview of Databricks developer tools and APIs without detailed configuration tables, limits, or error mappings. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/build-genai-apps) | 0.20 | Overview of tools for building and managing GenAI apps; high-level and tool-list oriented without detailed decision matrices, configs, or troubleshooting mappings. | -| [Overview](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/) | 0.20 | High-level overview of Auto Loader and its purpose; summary does not indicate detailed configuration tables, limits, or troubleshooting content. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/) | 0.20 | Intro/get-started page for COPY INTO; summary suggests conceptual and tutorial-style usage without explicit mention of detailed limits, config tables, or error mappings. | -| [Overview](https://learn.microsoft.com/en-us/azure/databricks/ingestion/overview) | 0.20 | High-level overview of Lakeflow Connect capabilities and use cases; summary does not show detailed configuration parameters, limits, or decision matrices. | +| [Overview](https://learn.microsoft.com/en-us/azure/databricks/ingestion/overview) | 0.20 | High-level overview of Lakeflow Connect capabilities and use cases; no indication of detailed limits, configuration parameters, or decision matrices. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/languages/) | 0.20 | Languages overview that mainly links to other content; lacks product-specific configuration values, limits, or detailed patterns required for expert-knowledge classification. 
| | [Overview](https://learn.microsoft.com/en-us/azure/databricks/large-language-models/foundation-model-training/) | 0.20 | Overview of Foundation Model Fine-tuning; summary indicates conceptual description and regional preview info, but no specific configuration parameters, limits, or decision matrices. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/ldp/dbsql/dbsql-for-ldp) | 0.20 | High-level description of using pipelines in Databricks SQL and what topics are covered. Appears to be an overview/navigation page without detailed parameters, limits, or error mappings. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/) | 0.20 | The description and summary indicate a high-level developer reference overview explaining that pipelines can be implemented with SQL or Python and that they are equivalent for most use cases. There is no evidence of specific configuration parameters, limits, error codes, or decision matrices. It appears conceptual rather than containing expert, product-specific details. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/ldp/tutorials) | 0.20 | Tutorials landing page listing available Lakeflow Spark Declarative Pipelines tutorials; navigation/overview content without detailed limits, configs, or troubleshooting data. | -| [Overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/) | 0.20 | High-level description of Databricks AI Runtime and preview status; appears to be an overview page without detailed limits, configuration tables, or troubleshooting mappings. | +| [Overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/serve-data-ai) | 0.20 | High-level guidance on serving data for AI workloads in Azure Databricks/Mosaic AI; appears to be conceptual/overview without concrete limits, configuration tables, or error-code-based troubleshooting. 
| | [Overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/training-examples) | 0.20 | High-level index/overview page that links to various training examples and mentions AutoML conceptually. It does not itself expose concrete configuration tables, limits, error codes, or detailed product-specific patterns; it mainly describes what kinds of examples exist. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/marketplace/) | 0.20 | High-level introduction to Databricks Marketplace; primarily conceptual and marketing-style overview without detailed configs or limits. | -| [Overview](https://learn.microsoft.com/en-us/azure/databricks/mlflow/) | 0.20 | High-level overview of MLflow on Databricks; summary indicates conceptual description without detailed configs, limits, or error mappings. | +| [Overview](https://learn.microsoft.com/en-us/azure/databricks/mlflow/) | 0.20 | High-level overview of MLflow on Databricks without specific limits, configuration parameter tables, error codes, or decision matrices; primarily conceptual/introductory content. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/query/) | 0.20 | The page is a broad overview of how to query data in Azure Databricks, covering core concepts and general procedures with some code examples. It does not focus on detailed configuration tables, limits, error codes, or decision matrices with quantified trade-offs. The content is largely conceptual and tutorial-like, which an LLM is likely to know from training, so it does not meet the expert-knowledge criteria for any sub-skill type. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/schemas/) | 0.20 | Conceptual overview of schemas and hierarchy; no detailed config tables, limits, or security role specifics. 
| | [Overview](https://learn.microsoft.com/en-us/azure/databricks/sql/) | 0.20 | High-level overview of Databricks SQL as a data warehouse; no concrete limits, configuration tables, error codes, or decision matrices. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/concepts) | 0.20 | Conceptual explanation of Structured Streaming; summary suggests high-level concepts without product-specific limits, configs, or troubleshooting mappings. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/tables/) | 0.20 | Appears to be a conceptual overview of Azure Databricks table types and storage formats without evidence of numeric limits, config parameter tables, or product-specific error/decision matrices. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/views/) | 0.20 | Conceptual explanation of what views are and high-level permissions/compute discussion. No detailed privilege tables, config parameters, or error codes; primarily overview content. | -| [Overview](https://learn.microsoft.com/en-us/azure/databricks/workspace/) | 0.20 | Workspace UI overview; navigational and conceptual, not configuration or limits-focused. | +| [Partner tools](https://learn.microsoft.com/en-us/azure/databricks/ai-bi/tools) | 0.20 | High-level description of BI tools that integrate with Azure Databricks; likely a catalog/overview without detailed configuration tables, limits, or decision matrices. | | [Platform release notes](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/) | 0.20 | Index page for release notes; no specific technical limits, configs, or error details are provided in the summary. | | [Postgres](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/postgres) | 0.20 | High-level description of Lakebase Postgres features and compatibility; no concrete limits, configs, or security role details in summary. 
| | [Power BI](https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi) | 0.20 | High-level description of integrating Power BI with Azure Databricks; summary suggests conceptual integration overview without specific configuration tables, limits, or product-specific error/decision details. | | [Profile data in a table](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/data-profiling/) | 0.20 | High-level overview of data profiling concepts and usage; no detailed configuration parameters, limits, or product-specific error/decision matrices. | +| [Publishing](https://learn.microsoft.com/en-us/azure/databricks/dashboards/share/share) | 0.20 | How-to for publishing and sharing dashboards; likely basic sharing steps without detailed RBAC role lists, security scopes, or other expert-only details. | | [PyTorch](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/pytorch) | 0.20 | Basic overview of using PyTorch on Databricks; appears tutorial/intro without detailed configuration tables, limits, or troubleshooting content. | | [Python](https://learn.microsoft.com/en-us/azure/databricks/languages/python) | 0.20 | High-level guide for Python development on Azure Databricks with links to tutorials and tools; summary suggests navigation/overview content without detailed configuration tables, limits, or troubleshooting mappings. | | [Queries overview](https://learn.microsoft.com/en-us/azure/databricks/sql/user/queries/) | 0.20 | Covers accessing and managing saved queries via the UI; no detailed configuration options, limits, or troubleshooting mappings. | -| [Query metric views](https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/query) | 0.20 | The page focuses on how to query metric views and shows common query patterns. 
This is closer to tutorial/query examples than to configuration, limits, or troubleshooting content with product-specific parameters or error codes. It does not clearly meet any expert-knowledge sub-skill criteria. | -| [Release notes 2026](https://learn.microsoft.com/en-us/azure/databricks/ai-bi/release-notes/2026) | 0.20 | Release notes typically list feature changes and dates but not the specific limits, configuration tables, error-code mappings, or decision matrices required by the defined sub-skill types. The description does not indicate presence of numeric limits, configuration parameters, or troubleshooting details. | -| [Release notes overview](https://learn.microsoft.com/en-us/azure/databricks/release-notes/) | 0.20 | Top-level release notes index that organizes links by release type; no indication of concrete limits, configs, error codes, or decision matrices on this page itself. | +| [Query and visualize data](https://learn.microsoft.com/en-us/azure/databricks/getting-started/quick-start) | 0.20 | Tutorial-style quickstart for querying and visualizing data in a Databricks notebook; no indication of product-specific limits, configs, error codes, or decision matrices beyond generic how-to steps. | +| [Release notes 2024](https://learn.microsoft.com/en-us/azure/databricks/ai-bi/release-notes/2024) | 0.20 | 2024 AI/BI release notes summarize features and updates; no indication of structured limits, configuration parameters, or decision matrices that meet expert-knowledge criteria. | +| [Release notes 2025](https://learn.microsoft.com/en-us/azure/databricks/ai-bi/release-notes/2025) | 0.20 | Page is a yearly release-notes overview for AI/BI; description does not indicate presence of numeric limits, config tables, or error-code-based troubleshooting. 
| +| [Release notes 2026](https://learn.microsoft.com/en-us/azure/databricks/ai-bi/release-notes/2026) | 0.20 | Release notes typically list new features and fixes but not structured limits, configs, or troubleshooting mappings as defined; summary provides no evidence of detailed expert-knowledge content. | +| [Release notes overview](https://learn.microsoft.com/en-us/azure/databricks/release-notes/) | 0.20 | Top-level release notes index; primarily navigation to specific month/runtime notes without detailed limits, configs, or troubleshooting content on this page. | | [Runtime release overview](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/) | 0.20 | High-level index of runtime releases and lifecycle; no detailed limits, configs, error codes, or decision matrices. | -| [SAP BDC semantic metadata](https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/sap-bdc/semantic-metadata) | 0.20 | Describes what semantic metadata syncs from SAP BDC into Unity Catalog and how it behaves conceptually. Based on the summary, it focuses on behavior and concepts (comments, keys, governance tags, source of truth) rather than concrete configuration parameters, limits, or troubleshooting mappings. | | [SQL Server](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-overview) | 0.20 | Connector workflow overview; no indication of detailed limits, config matrices, or error mappings. | -| [SQL editor introduction](https://learn.microsoft.com/en-us/azure/databricks/sql/user/sql-editor/) | 0.20 | How-to UI usage for the new SQL editor; no detailed configuration parameter tables, limits, or troubleshooting content. | -| [SQL language reference](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/) | 0.20 | High-level entry page for the SQL language reference; acts as a navigation/overview without exposing specific limits, configuration tables, error codes, or decision matrices. 
| +| [SQL editor introduction](https://learn.microsoft.com/en-us/azure/databricks/sql/user/sql-editor/) | 0.20 | Content describes how to use the Databricks SQL editor UI to write, run, and manage queries. This is primarily feature and UI usage guidance, not configuration tables, limits, error codes, or product-specific integration parameters. It lacks the structured expert details required for any of the defined sub-skill types. | | [SaaS connectors](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/saas-overview) | 0.20 | High-level overview of SaaS connectors; no detailed limits, configs, or error mappings indicated. | +| [Schema evolution](https://learn.microsoft.com/en-us/azure/databricks/data-engineering/schema-evolution) | 0.20 | Primarily conceptual explanation of schema evolution and its importance; summary does not indicate product-specific configuration values, limits, or detailed patterns beyond general concepts. | | [Security & compliance overview](https://learn.microsoft.com/en-us/azure/databricks/security/) | 0.20 | Top-level security overview; conceptual description of security and compliance features without detailed roles, parameters, or configuration tables. | | [Standard compute overview](https://learn.microsoft.com/en-us/azure/databricks/compute/standard-overview) | 0.20 | Standard compute overview; conceptual description of access mode without detailed limits, configs, or decision matrices. | | [Supported asset types](https://learn.microsoft.com/en-us/azure/databricks/repos/supported-artifact-types) | 0.20 | Describes which Azure Databricks asset types are supported in Git folders, but from the summary it appears to be a conceptual/feature description without numeric limits, configuration parameter tables, error-code-based troubleshooting, or other detailed product-specific settings. 
| | [TPC-DS performance evaluation](https://learn.microsoft.com/en-us/azure/databricks/sql/tpcds-eval) | 0.20 | Benchmark usage overview; likely contains example queries and dataset description but not product-specific limits, configs, or troubleshooting matrices. | | [Trace concepts](https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/tracing-101) | 0.20 | Explains trace concepts and why tracing matters; conceptual observability overview without product-specific numeric limits or config tables. | +| [Train and deploy an ML model](https://learn.microsoft.com/en-us/azure/databricks/getting-started/ml-get-started) | 0.20 | Intro ML tutorial using scikit-learn and Optuna on Databricks; focuses on basic model training workflow, not on Databricks-specific limits, configs, or troubleshooting mappings. | | [Train models overview](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-model/) | 0.20 | High-level overview of training models on Databricks with examples; no indication of numeric limits, specific configs, or troubleshooting patterns. | | [Tutorial: EDA techniques](https://learn.microsoft.com/en-us/azure/databricks/notebooks/eda-tutorial) | 0.20 | Tutorial-style EDA walkthrough in Databricks notebooks; focuses on loading, cleaning, and visualizing data without product-specific limits, configuration tables, error codes, or decision matrices. | -| [Use Genie to explore data](https://learn.microsoft.com/en-us/azure/databricks/genie/talk-to-genie) | 0.20 | Describes how business users can interact with Genie spaces and provide feedback. This is end-user usage guidance, not expert-level configuration, limits, or troubleshooting content. 
| +| [Tutorial: Load and transform data using DataFrames](https://learn.microsoft.com/en-us/azure/databricks/getting-started/dataframes) | 0.20 | Tutorial-style DataFrame usage with basic load/transform examples; no product-specific limits, configuration tables, or specialized patterns beyond standard Spark APIs. | | [Use the legacy SQL editor](https://learn.microsoft.com/en-us/azure/databricks/sql/user/sql-editor/legacy) | 0.20 | Describes using the legacy SQL editor and deprecation timeline; focuses on UI behavior without expert-level limits, configs, or error mappings. | | [User-defined metadata](https://learn.microsoft.com/en-us/azure/databricks/delta/custom-metadata) | 0.20 | Describes adding comments and tags as custom metadata for Delta/Unity Catalog tables but appears to be conceptual/how-to without product-specific limits, config parameter tables, or error mappings. | | [VOID type](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/null-type) | 0.20 | Page is a basic SQL data type reference (NULL/VOID) without product-specific limits, configuration parameters, or decision/troubleshooting content. | @@ -4698,12 +4723,13 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [What is Apache Spark on Databricks?](https://learn.microsoft.com/en-us/azure/databricks/spark/faq) | 0.20 | FAQ-style overview of Spark on Databricks; primarily conceptual/platform positioning without concrete configs, limits, or troubleshooting mappings. | | [What is Genie Code?](https://learn.microsoft.com/en-us/azure/databricks/genie-code/) | 0.20 | High-level introduction to Genie Code; summary indicates conceptual overview and positioning without concrete limits, configs, or error mappings. 
| | [What is Lakebase?](https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/about) | 0.20 | Conceptual 'What is' overview of Lakebase Autoscaling; mostly describes capabilities and scenarios without concrete numeric limits, configuration parameters, or decision matrices. | +| [What is Lakeflow Designer?](https://learn.microsoft.com/en-us/azure/databricks/designer/what-is-lakeflow-designer) | 0.20 | Conceptual 'what is' page explaining Lakeflow Designer and key concepts; no indication of numeric limits, configuration parameter tables, or decision matrices. | | [What is Partner Connect?](https://learn.microsoft.com/en-us/azure/databricks/partner-connect/) | 0.20 | High-level overview of Partner Connect capabilities and workflow; primarily conceptual/marketing without detailed configuration tables or limits. | | [What is Unity Catalog?](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/) | 0.20 | ‘What is Unity Catalog?’ is an overview of key concepts; does not indicate detailed settings, limits, or error-resolution mappings. | | [What is the medallion lakehouse architecture?](https://learn.microsoft.com/en-us/azure/databricks/lakehouse/medallion) | 0.20 | Conceptual description of medallion architecture layers; no numeric thresholds, configuration parameters, or product-specific limits. | -| [What's coming?](https://learn.microsoft.com/en-us/azure/databricks/release-notes/whats-coming) | 0.20 | Upcoming release and behavior-change overview; no concrete limits, configs, or error mappings indicated in the summary. | +| [What's coming?](https://learn.microsoft.com/en-us/azure/databricks/release-notes/whats-coming) | 0.20 | Upcoming features and behavior changes are high-level release notes; typically lack stable numeric limits, config tables, or detailed error mappings required for expert-knowledge sub-skills. 
| | [Workspace Feature Store overview (legacy)](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/workspace-feature-store/) | 0.20 | Overview of legacy Workspace Feature Store; mainly conceptual description of workflow and deprecation notice without detailed configs, limits, or troubleshooting content. | -| [Workspace creation overview](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/) | 0.20 | High-level overview of workspace creation and management options without detailed settings, limits, or decision matrices. | +| [Workspace appearance settings](https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/appearance) | 0.20 | Workspace appearance (labels, themes, date/time, language) is basic UI configuration without detailed parameters or expert-only behavior. | | [^ (caret sign) operator](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/caretsign) | 0.20 | Bitwise XOR operator; standard SQL operator semantics, no special Databricks-only details. | | [abs function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/abs) | 0.20 | Page documents a standard abs() SQL function; no product-specific limits, configuration tables, or troubleshooting/decision guidance. | | [add_months function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/add_months) | 0.20 | ADD_MONTHS date function reference; describes return value but no limits, configs, or decision/troubleshooting content. | @@ -4862,7 +4888,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [unix_millis function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/unix_millis) | 0.20 | Milliseconds since Unix epoch; generic timestamp function behavior. 
| | [unix_seconds function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/unix_seconds) | 0.20 | Seconds since Unix epoch; generic timestamp function behavior. | | [~ (tilde sign) operator](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/tildesign) | 0.20 | Bitwise NOT (~) operator; generic SQL operator semantics without Databricks-specific limits, quotas, or configuration. | -| [Database connectors (CDC)](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/cdc-overview) | 0.15 | Overview of database connectors using CDC; appears conceptual, describing what the connectors do rather than detailed limits, configuration parameters, or troubleshooting. | | [asc](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/asc) | 0.15 | asc function reference; generic sort expression behavior only. | | [ascii](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/ascii) | 0.15 | ascii function reference; generic string/character behavior. | | [asin](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/asin) | 0.15 | asin function reference; standard math function semantics. | @@ -4878,38 +4903,48 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [* (asterisk sign) operator](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/asterisksign) | 0.10 | Asterisk (*) operator reference; basic arithmetic operator semantics that are generic and not product-specific expert knowledge. | | [AI and ML introduction](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/) | 0.10 | High-level AI/ML overview for Databricks; no detailed technical parameters or limits indicated. 
| | [AI/BI concepts](https://learn.microsoft.com/en-us/azure/databricks/ai-bi/concepts) | 0.10 | Conceptual explanation of AI/BI and compound AI; lacks product-specific numeric limits, config parameters, or troubleshooting mappings. | -| [Account-level Databricks One](https://learn.microsoft.com/en-us/azure/databricks/workspace/databricks-one-account) | 0.10 | Conceptual overview of account-level Databricks One; explains what it is and how it differs from workspace-level, but lacks specific quotas, configuration parameters, or troubleshooting mappings. | -| [Administration overview](https://learn.microsoft.com/en-us/azure/databricks/admin/admin-concepts) | 0.10 | High-level overview of Azure Databricks admin concepts and roles; no specific limits, configuration parameters, error codes, or decision matrices with quantified criteria. | +| [Account-level Databricks One](https://learn.microsoft.com/en-us/azure/databricks/workspace/databricks-one-account) | 0.10 | Account-level Databricks One overview; focuses on what it is and how it scopes assets, not on detailed technical configuration or constraints. | +| [Apache Spark API](https://learn.microsoft.com/en-us/azure/databricks/reference/spark) | 0.10 | This is a navigation/overview page pointing to Apache Spark API references. It does not itself list configuration parameters, limits, error codes, or product-specific patterns; it just describes that Spark has DataFrame APIs and links out. That is not expert knowledge per the defined categories. | | [Archived docs](https://learn.microsoft.com/en-us/azure/databricks/archive/) | 0.10 | Archive landing page describing retired documentation; no specific technical details itself. | | [Azure Databricks resources](https://learn.microsoft.com/en-us/azure/databricks/resources/) | 0.10 | Navigation/resources hub pointing to other docs; does not itself contain detailed limits or configurations. 
| | [Azure Databricks status page](https://learn.microsoft.com/en-us/azure/databricks/resources/status) | 0.10 | Describes the status page and how to subscribe to updates; no product-specific limits, configs, or troubleshooting details. | | [Batch vs. streaming](https://learn.microsoft.com/en-us/azure/databricks/data-engineering/batch-vs-streaming) | 0.10 | Conceptual explanation of batch vs streaming semantics; no concrete product-specific thresholds or configs. | | [Build an ETL pipeline (Apache Spark)](https://learn.microsoft.com/en-us/azure/databricks/getting-started/etl-quick-start) | 0.10 | Step-by-step ETL tutorial; focuses on how to build a first pipeline, not on detailed configuration matrices, limits, or troubleshooting content. | -| [Chat](https://learn.microsoft.com/en-us/azure/databricks/workspace/databricks-one-chat) | 0.10 | Feature overview of Chat in Databricks One; summary only mentions what the feature does and that it is in beta, with no indication of numeric limits, configuration tables, error codes, or other expert-only details. | | [Compute overview](https://learn.microsoft.com/en-us/azure/databricks/compute/) | 0.10 | High-level overview of compute types (serverless, classic, SQL warehouses) and where to view/manage them; no indication of numeric limits, configuration tables, error codes, or detailed decision matrices. | | [Concepts](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/guide/concepts/) | 0.10 | Concepts page for generative AI on Databricks; high-level explanations of agents, RAG, workflows without detailed configs, limits, or troubleshooting. | +| [Concepts](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-concepts) | 0.10 | Concepts/overview page explaining how the SQL Server connector works; no indication of numeric limits, config tables, or troubleshooting content. 
| | [Concepts](https://learn.microsoft.com/en-us/azure/databricks/query-federation/catalog-federation) | 0.10 | Conceptual overview of catalog federation and supported platforms. The summary indicates high-level behavior and benefits without concrete configuration parameters, limits, or decision matrices, so it does not meet any expert-knowledge criteria. | | [Databricks AI/BI](https://learn.microsoft.com/en-us/azure/databricks/ai-bi/) | 0.10 | High-level product/feature overview of Databricks AI/BI; no detailed limits, configuration tables, error codes, or decision matrices. | -| [Databricks One](https://learn.microsoft.com/en-us/azure/databricks/workspace/databricks-one) | 0.10 | High-level description of Databricks One UI for business users; marketing/overview style without detailed settings, limits, or decision matrices. | +| [Databricks One](https://learn.microsoft.com/en-us/azure/databricks/workspace/databricks-one) | 0.10 | High-level description of Databricks One for business users; marketing/overview style without detailed technical limits, configs, or decision matrices. | +| [Databricks Runtime maintenance updates (archived)](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/maintenance-updates-archive) | 0.10 | Archived maintenance updates index; primarily a navigation/summary page pointing to other releases. Description does not indicate detailed limits, configuration parameters, or troubleshooting mappings. | | [Databricks SQL concepts](https://learn.microsoft.com/en-us/azure/databricks/sql/get-started/concepts) | 0.10 | Conceptual introduction to Databricks SQL concepts; lacks product-specific numeric limits, config parameters, or troubleshooting mappings. | +| [Databricks best practice articles](https://learn.microsoft.com/en-us/azure/databricks/getting-started/best-practices) | 0.10 | Acts as a navigation hub linking to best practice articles rather than containing the concrete, product-specific recommendations itself. 
| | [Databricks components](https://learn.microsoft.com/en-us/azure/databricks/getting-started/concepts) | 0.10 | Fundamental conceptual overview of Azure Databricks components (workspaces, clusters, data objects, etc.) without product-specific limits, configuration tables, or troubleshooting mappings. | | [Databricks overview](https://learn.microsoft.com/en-us/azure/databricks/introduction/) | 0.10 | High-level product introduction to Azure Databricks; conceptual and marketing-style overview without detailed limits, configs, or error mappings. | | [Databricks reference documentation](https://learn.microsoft.com/en-us/azure/databricks/reference/api) | 0.10 | Navigation hub listing reference sections; no direct expert content such as limits, configs, or error mappings on this page itself. | | [Free training](https://learn.microsoft.com/en-us/azure/databricks/getting-started/free-training) | 0.10 | Describes access to training resources; no technical limits, configs, or troubleshooting content. | -| [Genie space overview](https://learn.microsoft.com/en-us/azure/databricks/genie/) | 0.10 | Introductory 'What is' page for Genie spaces; high-level conceptual overview without detailed configs, limits, or troubleshooting. | +| [Genie space overview](https://learn.microsoft.com/en-us/azure/databricks/genie/) | 0.10 | Introductory overview of Genie spaces and natural language data exploration; no detailed configuration parameters, limits, or decision matrices. | | [Get started on Databricks](https://learn.microsoft.com/en-us/azure/databricks/getting-started/) | 0.10 | Navigation hub for getting-started tutorials; no detailed technical guidance, limits, or configuration references. | | [Guiding principles](https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/guiding-principles) | 0.10 | Guiding principles article; architectural philosophy without concrete numeric thresholds or configs. 
| | [Intro to the GenAI guide](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/guide/) | 0.10 | Introduction to generative AI apps and agent systems on Azure Databricks. Described as conceptual explanations and scenario guides, without indication of numeric limits, configuration tables, or detailed error/decision matrices. | +| [Introduction to workspace objects](https://learn.microsoft.com/en-us/azure/databricks/workspace/workspace-assets) | 0.10 | High-level introduction to workspace objects; conceptual overview rather than detailed configuration, limits, or error-handling content. | | [Key challenges in building GenAI apps](https://learn.microsoft.com/en-us/azure/databricks/generative-ai/guide/gen-ai-challenges) | 0.10 | Describes key challenges in building GenAI apps; conceptual discussion of quality, control, and cost rather than product-specific expert configuration or troubleshooting details. | +| [Navigate the workspace](https://learn.microsoft.com/en-us/azure/databricks/workspace/navigate-workspace) | 0.10 | Explains how to navigate the Lakehouse workspace UI; primarily conceptual navigation guidance, not expert-level configuration, limits, or troubleshooting. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/data-engineering/) | 0.10 | High-level overview of Lakeflow and data engineering; no detailed settings, limits, or error mappings. | +| [Overview](https://learn.microsoft.com/en-us/azure/databricks/data-engineering/best-practices) | 0.10 | This is a hub/overview page that links to other best practices articles and does not itself contain concrete product-specific recommendations, numeric thresholds, or configuration details. It is primarily navigational and conceptual. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/data-engineering/concepts) | 0.10 | Conceptual overview of data engineering topics; lacks product-specific parameters or numeric thresholds. 
| -| [Overview](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/) | 0.10 | Navigation/overview page for PySpark reference; no detailed configs, limits, or troubleshooting content. | +| [Overview](https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/) | 0.10 | Conceptual overview of Databricks Apps, use cases, and high-level requirements/limitations. No detailed configuration tables, numeric limits, error codes, or decision matrices; primarily descriptive content. | +| [Overview](https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/) | 0.10 | High-level 'What is Auto Loader?' overview; summary indicates conceptual description of incremental ingestion, not detailed limits, configs, or troubleshooting. | +| [Overview](https://learn.microsoft.com/en-us/azure/databricks/pyspark/) | 0.10 | High-level overview of PySpark on Databricks and its fundamentals; no product-specific limits, configuration tables, error codes, or decision matrices. | +| [Overview](https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/) | 0.10 | This is an overview/index page for PySpark reference on Databricks. It does not itself contain detailed configuration tables, limits, error codes, or best-practice guidance; it mainly routes to other reference pages. Therefore it does not meet the expert-knowledge criteria. | | [Overview](https://learn.microsoft.com/en-us/azure/databricks/spark/) | 0.10 | High-level Apache Spark overview and navigation page without detailed limits, configs, or product-specific patterns. | -| [Partner tools](https://learn.microsoft.com/en-us/azure/databricks/ai-bi/tools) | 0.10 | High-level overview of BI tools that integrate with Databricks; likely marketing/integrations overview without detailed config tables or parameters. 
| +| [Overview](https://learn.microsoft.com/en-us/azure/databricks/workspace/) | 0.10 | Workspace UI overview and navigation; conceptual and UX-focused without detailed configuration parameters, limits, or error-resolution content. | | [Procedural vs. declarative](https://learn.microsoft.com/en-us/azure/databricks/data-engineering/procedural-vs-declarative) | 0.10 | Conceptual comparison of procedural vs declarative processing; no Databricks-specific configs or limits. | +| [Scala](https://learn.microsoft.com/en-us/azure/databricks/languages/scala) | 0.10 | Acts as a navigation/guide page for Scala development on Databricks with links to tutorials and APIs; no detailed limits, configs, or troubleshooting content. | | [Scope of the lakehouse](https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/scope) | 0.10 | Scope and roles overview for lakehouse platform; conceptual, not configuration or limits. | | [Submit in-product feedback](https://learn.microsoft.com/en-us/azure/databricks/resources/ideas) | 0.10 | Describes how to submit product feedback; procedural, not technical expert knowledge. | | [Tables and views](https://learn.microsoft.com/en-us/azure/databricks/data-engineering/tables-views) | 0.10 | Overview of tables and views; lacks detailed configuration tables or numeric constraints. | +| [Technology partner overview](https://learn.microsoft.com/en-us/azure/databricks/integrations/) | 0.10 | High-level overview of technology partner integrations and Partner Connect; appears to be conceptual/marketing and navigational content without detailed parameters, limits, or error codes. | | [TikTok Ads](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/tiktok-ads-overview) | 0.10 | Connector overview for TikTok Ads; described as an overview without indication of limits, config tables, or troubleshooting details. 
| | [Tutorials](https://learn.microsoft.com/en-us/azure/databricks/dashboards/tutorials/) | 0.10 | Tutorials index page; primarily navigation to how-to guides without consolidated limits, configs, or decision criteria. | | [Tutorials](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-ml-tutorials) | 0.10 | Curated list of tutorials/quickstarts; navigation/collection page without deep product-specific limits, configs, or troubleshooting content. | @@ -4954,11 +4989,12 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [= (eq sign) operator](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/eqsign) | - | Parse error: Expecting value: line 9 column 24 (char 1497) | | [== (eq eq sign) operator](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/eqeqsign) | - | Parse error: Expecting value: line 9 column 24 (char 1497) | | [>= (gt eq sign) operator](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/gteqsign) | - | Parse error: Expecting value: line 9 column 24 (char 1497) | -| [April 2026](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2026/april) | - | Release notes summary page; description does not indicate detailed limits, configuration tables, error codes, or other structured expert knowledge. Likely a chronological list of feature changes without the structured patterns required by any sub-skill type. | | [August 2025](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2025/august) | - | Release notes summary page; likely lists new features and changes for August 2025 without the specific numeric limits, config tables, or decision matrices required for classification. | | [Azure Databricks Documentation](https://learn.microsoft.com/en-us/azure/databricks/) | - | High-level product documentation landing page; no detailed limits, configs, or troubleshooting content. 
| | [Data types and literals](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-datatypes) | - | Page is a general reference for Databricks SQL data types; it defines and describes types but does not focus on limits/quotas, configuration parameters, decision matrices, troubleshooting mappings, or other product-specific expert details as defined in the sub-skill types. | +| [Databricks Runtime 18.2 (Beta)](https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/18.2) | - | Release notes summary; no evidence in the provided text of numeric limits, configuration tables, error-code troubleshooting, or other structured expert details tied to the defined sub-skill types. | | [FAQ](https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/faq) | - | FAQ page description is generic and does not indicate presence of specific limits, configuration tables, error-code mappings, or other detailed expert knowledge as defined; likely high-level Q&A and connector pointers rather than numeric limits or product-specific configuration matrices. | +| [Glossary](https://learn.microsoft.com/en-us/azure/databricks/resources/glossary) | - | A terminology glossary; primarily conceptual definitions without configuration values, limits, error codes, or decision matrices. Does not meet any expert-knowledge criteria for the defined sub-skill types. | | [March 2023](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2023/march) | - | Release notes summarize new features and changes but are not organized as limits, configuration references, troubleshooting guides, or other structured expert-knowledge categories defined here. | | [March 2026](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2026/march) | - | Release notes summary page; description suggests high-level feature and improvement announcements, not structured limits, configuration parameters, or troubleshooting mappings. 
| | [May 2023](https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2023/may) | - | Release notes summarize new features and changes but are not organized as limits, configuration references, troubleshooting guides, or other structured expert-knowledge categories defined here. | @@ -4978,7 +5014,6 @@ confusable_not_for: Not for Azure Synapse Analytics (use azure-synapse-analytics | [from_unixtime function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/from_unixtime) | - | Parse error: Expecting value: line 9 column 24 (char 1497) | | [from_utc_timestamp function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/from_utc_timestamp) | - | Parse error: Expecting value: line 9 column 24 (char 1497) | | [get function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/get) | - | Parse error: Expecting value: line 9 column 24 (char 1497) | -| [get_json_object function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/get_json_object) | - | Parse error: Expecting value: line 9 column 24 (char 1497) | | [getdate function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/getdate) | - | Parse error: Expecting value: line 9 column 24 (char 1497) | | [greatest function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/greatest) | - | Parse error: Expecting value: line 9 column 24 (char 1497) | | [grouping function](https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/grouping) | - | Parse error: Expecting value: line 9 column 24 (char 1497) | diff --git a/products/azure-defender-for-cloud/azure-defender-for-cloud.csv b/products/azure-defender-for-cloud/azure-defender-for-cloud.csv index f156059b..f9da2d53 100644 --- a/products/azure-defender-for-cloud/azure-defender-for-cloud.csv +++ 
b/products/azure-defender-for-cloud/azure-defender-for-cloud.csv @@ -7,12 +7,12 @@ https://learn.microsoft.com/en-us/azure/defender-for-cloud/agentless-malware-sca https://learn.microsoft.com/en-us/azure/defender-for-cloud/agentless-vulnerability-assessment-azure,Vulnerability assessments for supported environments,Vulnerability assessments for Defender for Container supported environments - Microsoft Defender for Cloud,Apply agentless vulnerability assessment for containers,Learn about vulnerability assessments for images and containers with Microsoft Defender Vulnerability Management.,Defender for Containers performs agentless vulnerability assessment on container images in supported runtime environments and supported container registries. Relevant recommendations are generated for vulnerabilities detected in a container registry image or running container. Vulnerability assessment of images in supported container registries is performed when Registry access is enabled for the Defender for Cloud Security Posture Management or Defender for Containers plans.
Vulnerability assessm,2026-03-04T08:00:00.000Z,concept-article,best-practices,0.7,True,"Details supported runtime environments and registries, and how agentless assessment is triggered when specific Defender plans and registry access are enabled; this is product-specific operational guidance on using the feature effectively.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/agentless-vulnerability-assessment-docker-hub,Vulnerability assessments for Docker Hub,Vulnerability assessments for Docker Hub external registry with Microsoft Defender Vulnerability Management - Microsoft Defender for Cloud,Configure Docker Hub vulnerability assessments with Defender,Configure vulnerability assessments for Docker Hub as an external registry with Microsoft Defender Vulnerability Management.,"A key aspect of Defender for Containers' security solution is to provide container image vulnerability assessment throughout its lifecycle, from code development to cloud deployment. To achieve this goal, comprehensive coverage is needed for all stages of the container image life cycle, including container images from external registries. Docker Hub, widely used by enterprises, SMBs, and the open-source community, is supported in this feature. 
Customers using Docker Hub can use Defender for Cont",2025-05-18T08:00:00.000Z,how-to,configuration,0.75,True,"How-to configure Docker Hub as an external registry; likely includes specific configuration parameters, scopes, and integration settings.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/agentless-vulnerability-assessment-jfrog-artifactory,Vulnerability assessments for JFrog Artifactory,Vulnerability assessments for JFrog Artifactory external registry with Microsoft Defender Vulnerability Management - Microsoft Defender for Cloud,Configure JFrog Artifactory vulnerability assessments with Defender,Configure vulnerability assessments for JFrog Artifactory as an external registry with Microsoft Defender Vulnerability Management.,"Microsoft Defender for Containers provides inventory discovery and vulnerability assessment of a container image throughout its lifecycle, from code development to cloud deployment. Defender for Containers protects the JFrog Artifactory (Cloud) container registry images with the same security capabilities available for the cloud-native registry images in Azure Container Registry (ACR), Elastic Container Registry (ECR), and Google Container Registry (GCR).",2025-04-22T22:02:00.000Z,how-to,configuration,0.75,True,Describes onboarding JFrog Artifactory Cloud registry; likely includes product-specific configuration steps and parameters.,unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/ai-model-security,Discover AI models,Discover AI models - Microsoft Defender for Cloud,Configure and use Defender for Cloud AI model security,Learn about AI model security in Microsoft Defender for Cloud.,"Important This feature is currently in preview and included with the Microsoft Defender for AI Services plan. During preview, there is no additional charge for AI model scanning. However, enabling the Defender for AI Services plan may incur costs related to threat protection features. 
Continued inclusion of AI model scanning feature as part of Defender for AI Services is not guaranteed when it becomes generally available (GA), and licensing requirements may change. If that occurs, a notification",2026-04-14T22:13:00.000Z,concept-article,security,0.65,True,"Covers AI model security within Defender for Cloud and a specific Defender for AI Services plan. Although the summary emphasizes preview and licensing notes, this type of feature article typically includes product-specific security behaviors (how AI models are discovered/scanned, required plan/enablement details, and possibly scope of protection). Among the available categories, it best fits 'security' as it is centered on securing AI models in this product context.",updated +https://learn.microsoft.com/en-us/azure/defender-for-cloud/ai-model-security,Discover AI models,Discover AI models - Microsoft Defender for Cloud,Configure and use Defender for Cloud AI model security,Learn about AI model security in Microsoft Defender for Cloud.,"Important This feature is currently in preview and included with the Microsoft Defender for AI Services plan. During preview, there is no additional charge for AI model scanning. However, enabling the Defender for AI Services plan may incur costs related to threat protection features. Continued inclusion of AI model scanning feature as part of Defender for AI Services is not guaranteed when it becomes generally available (GA), and licensing requirements may change. If that occurs, a notification",2026-04-14T22:13:00.000Z,concept-article,security,0.65,True,"Covers AI model security within Defender for Cloud and a specific Defender for AI Services plan. Although the summary emphasizes preview and licensing notes, this type of feature article typically includes product-specific security behaviors (how AI models are discovered/scanned, required plan/enablement details, and possibly scope of protection). 
Among the available categories, it best fits 'security' as it is centered on securing AI models in this product context.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/ai-onboarding,Enable threat protection for AI services,Enable threat protection for AI services - Microsoft Defender for Cloud,Enable Defender threat protection for Azure AI services,Learn how to enable threat protection for AI services on your Azure subscription for Microsoft Defender for Cloud.,Threat protection for AI services in Microsoft Defender for Cloud protects Microsoft Foundry workloads on an Azure subscription by providing insights to threats that might affect your generative AI applications and agents.,2026-03-30T17:14:00.000Z,install-set-up-deploy,configuration,0.65,True,"The page covers how to enable threat protection for AI services in Defender for Cloud at the subscription level, including specific onboarding steps and configuration options for Microsoft Foundry workloads. These are product-specific enablement and configuration details rather than conceptual overview, aligning with the configuration sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/ai-security-posture,Overview,Overview - AI security posture management - Microsoft Defender for Cloud,,Learn about AI security posture management in Microsoft Defender for Cloud and how it protects resources from AI threats.,"The Defender Cloud Security Posture Management (CSPM) plan in Microsoft Defender for Cloud secures enterprise-built, multicloud, or hybrid cloud environments. These environments include Azure, Amazon Web Services (AWS), and Google Cloud Platform (GCP) Vertex AI. The Defender CSPM plan secures generative AI applications and AI agents (Preview) throughout their entire lifecycle. 
Defender for Cloud reduces risks to cross-cloud AI workloads by: Important To enable AI security posture management capa",2025-12-02T12:08:00.000Z,concept-article,,0.28,False,"AI security posture management overview; summary is conceptual and does not indicate detailed configuration parameters, limits, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/ai-threat-protection,Overview,Overview - AI threat protection - Microsoft Defender for Cloud,,Learn about AI threat protection in Microsoft Defender for Cloud and how it protects your resources from AI threats.,"Microsoft Defender for Cloud's threat protection for AI services identifies threats to generative AI applications and agents in real time and helps respond to security issues. Defender for Cloud's AI threat protection works with Azure AI Content Safety Prompt Shields and Microsoft's threat intelligence to provide security alerts for threats like data leakage, data poisoning, jailbreak, credential theft and more.",2026-01-26T12:03:00.000Z,overview,,0.25,False,"AI threat protection overview; summary is high-level description of capabilities and threats, not detailed configuration, limits, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/alert-validation,Validate alert configuration,How to validate alerts - Microsoft Defender for Cloud,Validate Defender for Cloud alert generation and coverage,Learn how to validate security alerts in Microsoft Defender for Cloud to ensure your system is properly configured and can effectively monitor threats.,"This article explains how to validate that your system is properly configured for Microsoft Defender for Cloud alerts, ensuring you can effectively monitor and respond to security threats.",2025-07-14T08:00:00.000Z,how-to,troubleshooting,0.65,True,"Focused on validating alerts; such pages usually map expected alerts/symptoms to configuration checks and fixes, which is product-specific 
troubleshooting guidance.",unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/alerts-ai-workloads,Alerts for AI services,Alerts for AI services - Microsoft Defender for Cloud,Interpret and respond to Defender for Cloud AI alerts,This article lists the security alerts for AI services visible in Microsoft Defender for Cloud.,"This article lists the security alerts you might get for AI from Microsoft Defender for Cloud and any Microsoft Defender plans you enabled. The alerts shown in your environment depend on the resources and services you're protecting, and your customized configuration. Learn how to respond to these alerts. Learn how to export alerts. Note Alerts from different sources might take different amounts of time to appear. For example, alerts that require analysis of network traffic might take longer to a",2026-04-16T08:00:00.000Z,reference,troubleshooting,0.72,True,"The page is a catalog of specific security alerts for AI services in Microsoft Defender for Cloud, including alert names and how to respond. This is product-specific, symptom-to-action guidance tied to concrete alert outputs, which fits the troubleshooting category. It goes beyond conceptual security guidance by mapping alerts to recommended responses.",updated +https://learn.microsoft.com/en-us/azure/defender-for-cloud/alerts-ai-workloads,Alerts for AI services,Alerts for AI services - Microsoft Defender for Cloud,Interpret and respond to Defender for Cloud AI alerts,This article lists the security alerts for AI services visible in Microsoft Defender for Cloud.,"This article lists the security alerts you might get for AI from Microsoft Defender for Cloud and any Microsoft Defender plans you enabled. The alerts shown in your environment depend on the resources and services you're protecting, and your customized configuration. Learn how to respond to these alerts. Learn how to export alerts. Note Alerts from different sources might take different amounts of time to appear. 
For example, alerts that require analysis of network traffic might take longer to a",2026-04-16T08:00:00.000Z,reference,troubleshooting,0.72,True,"The page is a catalog of specific security alerts for AI services in Microsoft Defender for Cloud, including alert names and how to respond. This is product-specific, symptom-to-action guidance tied to concrete alert outputs, which fits the troubleshooting category. It goes beyond conceptual security guidance by mapping alerts to recommended responses.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/alerts-azure-app-service,Alerts for Azure App Service,Alerts for Azure App Service - Microsoft Defender for Cloud,Understand Defender for Cloud alerts for Azure App Service,This article lists the security alerts for Azure App Service visible in Microsoft Defender for Cloud.,"This article lists the security alerts you might get for Azure App Service from Microsoft Defender for Cloud and any Microsoft Defender plans you enabled. The alerts shown in your environment depend on the resources and services you're protecting, and your customized configuration. Note Some of the recently added alerts powered by Microsoft Defender Threat Intelligence and Microsoft Defender for Endpoint might be undocumented. Learn how to respond to these alerts. Learn how to export alerts.",2025-12-11T12:07:00.000Z,reference,security,0.8,True,Reference of App Service alert types and meanings; detailed alert catalog is expert security knowledge.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/alerts-azure-cosmos-db,Alerts for Azure Cosmos DB,Alerts for Azure Cosmos DB - Microsoft Defender for Cloud,Understand Defender for Cloud alerts for Azure Cosmos DB,This article lists the security alerts for Azure Cosmos DB visible in Microsoft Defender for Cloud.,"This article lists the security alerts you might get for Azure Cosmos DB from Microsoft Defender for Cloud and any Microsoft Defender plans you enabled. 
The alerts shown in your environment depend on the resources and services you're protecting, and your customized configuration. Note Some of the recently added alerts powered by Microsoft Defender Threat Intelligence and Microsoft Defender for Endpoint might be undocumented. Learn how to respond to these alerts. Learn how to export alerts. Note ",2024-08-08T22:06:00.000Z,reference,security,0.8,True,Lists Cosmos DB alert types and semantics; product-specific security detection behavior.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/alerts-azure-ddos-protection,Alerts for Azure DDoS Protection,Alerts for Azure DDoS Protection - Microsoft Defender for Cloud,Understand Defender for Cloud alerts for Azure DDoS Protection,This article lists the security alerts for Azure DDoS Protection visible in Microsoft Defender for Cloud.,"This article lists the security alerts you might get for Azure DDoS Protection from Microsoft Defender for Cloud and any Microsoft Defender plans you enabled. The alerts shown in your environment depend on the resources and services you're protecting, and your customized configuration. Note Some of the recently added alerts powered by Microsoft Defender Threat Intelligence and Microsoft Defender for Endpoint might be undocumented. Learn how to respond to these alerts. 
Learn how to export alerts.",2024-08-08T22:06:00.000Z,reference,security,0.8,True,Reference of DDoS Protection alert types; detailed alert semantics are expert security knowledge.,unchanged @@ -35,7 +35,7 @@ https://learn.microsoft.com/en-us/azure/defender-for-cloud/alerts-windows-machin https://learn.microsoft.com/en-us/azure/defender-for-cloud/anti-malware,Antimalware,Anti-malware detection and blocking - Microsoft Defender for Cloud,Configure container runtime anti-malware policies,"Learn how to configure Container runtime anti-malware detection and blocking to block or alert on malware in Azure, Amazon Web Service (AWS), and Google Cloud Project (GCP) environments.",Container runtime anti-malware detects and blocks malware when a container runs an executable that the system identifies as malicious software. This feature sends alerts when it identifies malware and lets you block malware. You can define anti-malware policies that set conditions for alerts and blocking. These policies help you distinguish legitimate activity from potential threats. Container runtime anti-malware detection and blocking is part of the Defender for Containers plan. This feature i,2026-02-22T23:14:00.000Z,how-to,security,0.7,True,"Explains product-specific anti-malware policy settings, conditions, and behaviors for blocking/alerting across clouds, which are concrete security configuration details.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/api-security-posture-overview,Overview,API security posture overview - Microsoft Defender for Cloud,,"Learn how Microsoft Defender for Cloud enhances API security posture management for your APIs across Azure API Management, Function Apps, and Logic Apps.","APIs are entry points into cloud-native apps. They connect services, apps, and data, making them targets for attackers. API security posture management helps protect APIs by assessing risks from misconfigurations and vulnerabilities. 
The Defender Cloud Security Posture Management (CSPM) plan in Microsoft Defender for Cloud offers API discovery and posture across your Azure Function Apps and Logic Apps and your managed APIs across your Azure API Management Platform. Note API discovery and securit",2026-01-04T08:00:00.000Z,concept-article,,0.3,False,"API security posture overview; focuses on what is assessed and supported platforms, not on detailed configuration or limits.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/apply-security-baseline,Remediate OS misconfigurations (Cloud Security Benchmark),Review OS misconfiguration recommendations in Microsoft Defender for Cloud - Microsoft Defender for Cloud,Review OS misconfiguration recommendations against MCSB baselines,Learn how Microsoft Defender for Cloud uses the guest configuration to compare machine OS settings with baselines in Microsoft Cloud Security Benchmark.,Microsoft Defender for Cloud provides security recommendations to improve organizational security posture and reduce risk. An important element in risk reduction is machine hardening. Defender for Cloud assesses operating system settings against compute security baselines provided by the Microsoft Cloud Security Benchmark (MCSB). 
Machine information is gathered for assessment using the Azure Policy machine configuration extension (formerly known as the guest configuration) on the machine. Learn mo,2025-02-19T08:00:00.000Z,how-to,best-practices,0.7,True,Describes how Defender for Cloud assesses OS settings against Microsoft Cloud Security Benchmark and how to act on recommendations; includes product-specific baseline mapping and remediation patterns.,unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/asset-inventory,Cloud asset inventory,Cloud asset inventory - Microsoft Defender for Cloud,,Learn about the cloud asset inventory in Microsoft Defender for Cloud and Security Exposure Management,"The asset inventory page of Microsoft Defender for Cloud shows the security posture of the resources you connected to Defender for Cloud. It offers a unified, contextual view of cloud infrastructure across Azure, AWS, and GCP. It categorizes assets by workload, criticality, and coverage status, while integrating health data, device actions, and risk signals into a single interface. Defender for Cloud periodically analyzes the security state of resources connected to your subscriptions to identify",2025-12-03T08:00:00.000Z,how-to,,0.25,False,"Asset inventory page description focuses on what is shown and how posture is summarized; no indication of specific configuration parameters, quotas, or troubleshooting flows.",unchanged +https://learn.microsoft.com/en-us/azure/defender-for-cloud/asset-inventory,Cloud asset inventory,Cloud asset inventory - Microsoft Defender for Cloud,,Learn about the cloud asset inventory in Microsoft Defender for Cloud and Security Exposure Management,"The asset inventory page of Microsoft Defender for Cloud shows the security posture of the resources you connected to Defender for Cloud. It offers a unified, contextual view of cloud infrastructure across Azure, AWS, and GCP. 
It categorizes assets by workload, criticality, and coverage status, while integrating health data, device actions, and risk signals into a single interface. Defender for Cloud periodically analyzes the security state of resources connected to your subscriptions to identify",2026-04-19T08:00:00.000Z,how-to,,0.2,False,"Overview of cloud asset inventory and posture visualization; lacks product-specific numeric limits, configuration parameters, or troubleshooting mappings.",updated https://learn.microsoft.com/en-us/azure/defender-for-cloud/assign-access-to-workload,Assign access to workload owners,Assign access to workload owners - Microsoft Defender for Cloud,Assign granular access to AWS and GCP connectors,Learn how to assign access to a workload owner of an Amazon Web Service or Google Cloud Platform connector.,"When you onboard your Amazon Web Service (AWS) or Google Cloud Platform (GCP) environments, Defender for Cloud automatically creates a security connector as an Azure resource within the connected subscription and resource group. Defender for Cloud also creates the identity provider as an IAM role required during the onboarding process. 
To assign permissions to users on a specific connector below the parent connector, you need to determine which AWS accounts or GCP projects you want users to acce",2024-12-11T12:17:00.000Z,how-to,security,0.7,True,"Describes assigning access to specific connectors; likely includes role names, scopes, and IAM/RBAC mappings that are product-specific security configuration.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/assign-regulatory-compliance-standards,Choose compliance standards,Assign regulatory compliance standards in Microsoft Defender for Cloud - Microsoft Defender for Cloud,,Learn how to assign regulatory compliance standards in Microsoft Defender for Cloud.,"In Defender for Cloud, regulatory compliance standards are implemented using Azure Policy initiatives and evaluated through the Regulatory compliance dashboard. You can assign regulatory compliance standards to specific scopes such as Azure subscriptions, Amazon Web Services (AWS) accounts, and Google Cloud Platform (GCP) projects. Defender for Cloud continually assesses the scoped environment against the standards. Based on these assessments, it shows whether in-scope resources are compliant or",2025-12-28T12:02:00.000Z,how-to,,0.5,False,Assigning compliance standards to scopes is mostly policy/UI workflow; summary doesn’t show detailed parameter tables or limits.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/attack-path-api,Retrieve attack path data with API,Retrieve attack path data with API - Microsoft Defender for Cloud,Query Defender attack path data via ARG API,Learn how to Retrieve attack path data with APIs in Microsoft Defender for Cloud and enhance the security of your environment.,"You can consume attack path data programmatically by querying Azure Resource Graph (ARG) API. API responses contain only externally-driven, exploitable attack paths that focus on real threats rather than broad scenarios. 
Learn how to query ARG API.",2025-09-15T17:11:00.000Z,how-to,integrations,0.7,True,"API-based retrieval via Azure Resource Graph; likely includes specific query schema, fields, and parameters unique to Defender for Cloud.",unchanged @@ -49,14 +49,14 @@ https://learn.microsoft.com/en-us/azure/defender-for-cloud/ci-cd-pipeline-scanning-with-defender-cli,Integrate Defender for Cloud CLI with CI/CD pipelines,CI/CD in pipeline-scanning with Defender for Cloud CLI - Microsoft Defender for Cloud,Integrate Defender for Cloud CLI into CI/CD pipelines,Learn how to integrate Microsoft Defender for Cloud CLI into your CI/CD workflows for automated security scanning and standards-based SARIF output.,Microsoft Defender for Cloud Command‑Line Interface (Defender for Cloud CLI) lets you embed security scanning directly in your continuous integration and continuous deployment (CI/CD) workflows. The CLI orchestrates security scanners and can run locally for developers.,2026-01-06T06:03:00.000Z,concept-article,integrations,0.7,True,"Describes embedding the Defender CLI into CI/CD; likely includes command-line flags, configuration options, and SARIF output settings that are product-specific integration details.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/cloud-infrastructure-dashboard,Cloud overview dashboard,Cloud overview dashboard in Microsoft Defender for Cloud - Microsoft Defender for Cloud,,"Learn how to use the Cloud overview dashboard to monitor security posture, threat protection, and exposure management across your multicloud environment.","The Cloud Overview dashboard is the landing page for Microsoft Defender for Cloud in the unified security portal (Defender portal). It gives security teams a clear, actionable view of their cloud security status both pre and post breach helping them understand where to focus, track progress over time, and take immediate action. 
It also provides a scale view at the tenant-level or scope based. Important Microsoft Defender for Cloud is expanding to the Defender portal to provide a unified security",2025-11-25T14:31:00.000Z,how-to,,0.2,False,"Cloud overview dashboard article is primarily UI/experience description and posture visualization; summary does not indicate detailed settings tables, limits, or error-resolution mappings.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/cloud-scopes-unified-rbac,Cloud scopes and unified RBAC,Manage cloud scopes and unified role-based access control in Microsoft Defender for Cloud - Microsoft Defender for Cloud,Manage cloud scopes and unified RBAC in Defender,Learn how to configure cloud scopes and unified role-based access control for granular permissions management across your cloud environments.,"Note This capability is currently in preview. For details about current gaps and restrictions, see Known limitations. Cloud scopes and unified role-based access control (unified RBAC) in the Microsoft Defender portal let you segment multicloud resources (Azure, AWS, GCP, and connected DevOps/registry sources) into meaningful groupings and apply least‑privilege access consistently. They provide: Important Unified RBAC in the Defender portal is distinct from Azure RBAC. Permissions don't flow betwee",2025-11-25T14:31:00.000Z,how-to,security,0.78,True,"Explicitly about unified RBAC and scopes; likely lists specific Defender roles, permission scopes, and how they differ from Azure RBAC. 
This is product-specific IAM configuration, matching the security category.",unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/cloud-security-explorer-kubernetes-clusters,Investigate Kubernetes vulnerabilities with Cloud Security Explorer,Build Cloud Security Explorer queries to identify vulnerabilities in Kubernetes clusters - Microsoft Defender for Cloud,Author Cloud Security Explorer queries for AKS vulnerabilities,Learn how to build queries with Cloud Security Explorer in Microsoft Defender for Cloud to investigate vulnerabilities in Kubernetes clusters.,"Use Cloud Security Explorer to identify vulnerabilities in your Kubernetes clusters. The following examples show how to build queries to investigate container images and cluster nodes, and can be adapted to filter results based on your requirements. For an introduction to Cloud Security Explorer queries, see Build queries with Cloud Security Explorer.",2026-04-16T11:04:00.000Z,how-to,integrations,0.62,True,"Page provides concrete query examples for Cloud Security Explorer targeting Kubernetes clusters and container images. These examples encode product-specific query schema, fields, and patterns unique to Defender for Cloud’s explorer, which aligns best with integrations & coding patterns (query construction as an API-like integration).",new +https://learn.microsoft.com/en-us/azure/defender-for-cloud/cloud-security-explorer-kubernetes-clusters,Investigate Kubernetes vulnerabilities with Cloud Security Explorer,Build Cloud Security Explorer queries to identify vulnerabilities in Kubernetes clusters - Microsoft Defender for Cloud,Author Cloud Security Explorer queries for AKS vulnerabilities,Learn how to build queries with Cloud Security Explorer in Microsoft Defender for Cloud to investigate vulnerabilities in Kubernetes clusters.,"Use Cloud Security Explorer to identify vulnerabilities in your Kubernetes clusters. 
The following examples show how to build queries to investigate container images and cluster nodes, and can be adapted to filter results based on your requirements. For an introduction to Cloud Security Explorer queries, see Build queries with Cloud Security Explorer.",2026-04-16T11:04:00.000Z,how-to,integrations,0.62,True,"Page provides concrete query examples for Cloud Security Explorer targeting Kubernetes clusters and container images. These examples encode product-specific query schema, fields, and patterns unique to Defender for Cloud’s explorer, which aligns best with integrations & coding patterns (query construction as an API-like integration).",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/cloud-security-explorer-software-vulnerabilities,Cloud Security Explorer software queries,Building Cloud Security Explorer software vulnerabilities query - Microsoft Defender for Cloud,,Learn to build queries with cloud security explorer in Microsoft Defender for Cloud to proactively identify software vulnerabilities in VMs and container images,You can use the Cloud Security Explorer to identify software vulnerabilities. The following examples build queries to identify software vulnerabilities in Virtual Machines (VMs) and container images. 
Read Build queries Cloud Security Explorer for an introduction to Cloud Security Explorer queries.,2024-10-07T11:12:00.000Z,how-to,,0.4,False,"Shows example queries for Cloud Security Explorer; likely query examples but not organized as config tables, limits, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/cluster-security-dashboard,Overview,Review security findings in the AKS security dashboard - Microsoft Defender for Cloud,,"Learn how to review and investigate alerts, vulnerabilities, misconfigurations, and compliance findings in the AKS security dashboard in Microsoft Defender for Cloud.","The AKS security dashboard shows security findings for an Azure Kubernetes Service (AKS) cluster in Microsoft Defender for Cloud. It includes alerts, vulnerabilities, misconfigurations, and compliance results to help you identify and prioritize issues.",2026-04-16T11:04:00.000Z,how-to,,0.3,False,"Described as how to review and investigate findings in the AKS security dashboard. This is primarily UI/navigation and conceptual guidance on using the dashboard, without clear indication of product-specific configuration parameters, RBAC roles, or detailed troubleshooting mappings.",new +https://learn.microsoft.com/en-us/azure/defender-for-cloud/cluster-security-dashboard,Overview,Review security findings in the AKS security dashboard - Microsoft Defender for Cloud,,"Learn how to review and investigate alerts, vulnerabilities, misconfigurations, and compliance findings in the AKS security dashboard in Microsoft Defender for Cloud.","The AKS security dashboard shows security findings for an Azure Kubernetes Service (AKS) cluster in Microsoft Defender for Cloud. It includes alerts, vulnerabilities, misconfigurations, and compliance results to help you identify and prioritize issues.",2026-04-16T11:04:00.000Z,how-to,,0.3,False,"Described as how to review and investigate findings in the AKS security dashboard. 
This is primarily UI/navigation and conceptual guidance on using the dashboard, without clear indication of product-specific configuration parameters, RBAC roles, or detailed troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/code-to-runtime-mapping,Code to runtime enrichment for recommendations,Code to runtime for recommendations - Microsoft Defender for Cloud,,Learn how to use code to runtime visibility to trace security issues from runtime back to source code and fix them at the origin to prevent recurrence.,"Modern cloud applications move through stages that might include source code, pipelines, registries, and runtime environments. A small code change can create many cloud workloads across your environments. When a security issue appears at runtime, you might not know where the issue starts or how many assets it affects. Code to runtime gives you end-to-end visibility across the software development lifecycle (SDLC). This feature helps you find the origin of an issue, assess its blast radius, and f",2026-03-10T22:26:00.000Z,how-to,,0.2,False,"The page describes the concept of code-to-runtime visibility and how it helps trace security issues across the SDLC. 
From the summary, it appears conceptual and workflow-oriented without detailed configuration tables, specific error codes, or concrete parameter values, so it does not clearly meet any expert-knowledge sub-skill criteria.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/common-questions-microsoft-defender-vulnerability-management,Common questions,Microsoft Defender Vulnerability Management FAQ - Microsoft Defender for Cloud,,Answers to common questions on the new Container VA offering powered by Microsoft Defender Vulnerability Management,Get answers to common questions on the new Container VA offering powered by Microsoft Defender Vulnerability Management solution.,2025-07-15T08:00:00.000Z,faq,,0.35,False,"FAQ-style content; summary suggests high-level Q&A without specific error codes, configs, or numeric thresholds.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-agentless-containers,Agentless container posture in Defender CSPM,Agentless container posture in Defender CSPM - Microsoft Defender for Cloud,,"Learn how agentless container posture offers discovery, visibility, and vulnerability assessment for containers without installing a sensor on your machines.","The Defender for Cloud Security Posture Management (CSPM) plan in Defender for Cloud provides container posture capabilities for Azure, AWS, and GCP. For requirements and support, see the Containers support matrix in Defender for Cloud. 
Agentless container posture provides easy and seamless visibility into your Kubernetes assets and security posture, with contextual risk analysis that empowers security teams to prioritize remediation based on actual risk behind security issues, and proactively hu",2025-03-31T08:00:00.000Z,concept-article,,0.3,False,"Agentless container posture concept article; summary emphasizes capabilities and visibility, not specific configuration settings or error-resolution mappings.",unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-agentless-data-collection,Agentless machine scanning,Agentless machine scanning in Microsoft Defender for Cloud - Microsoft Defender for Cloud,,Learn how Defender for Cloud can gather information about multicloud machine without installing an agent.,"Agentless machine scanning in Microsoft Defender for Cloud improves the security posture of machines connected to Defender for Cloud. Agentless scanning doesn't need any installed agents or network connectivity, and doesn't affect machine performance. Agentless machine scanning: Agentless scanning is available in the following Defender for Cloud plans:",2025-02-19T08:00:00.000Z,concept-article,,0.3,False,"High-level description of agentless machine scanning and supported plans; summary suggests conceptual overview and plan listing without detailed configuration parameters, limits, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-attack-path,Investigate risks with security explorer and attack paths,Investigate risks with security explorer/attack paths in Microsoft Defender for Cloud - Microsoft Defender for Cloud,,Learn about investigating risks with security explorer/attack paths in Microsoft Defender for Cloud.,"One of the biggest challenges for security teams today is the number of daily security issues. Numerous security issues need resolution, but resources are insufficient. 
Defender for Cloud's contextual security capabilities help security teams assess the risk behind each security issue and identify the highest-risk issues that need immediate resolution. Defender for Cloud helps security teams reduce the risk of impactful breaches effectively. All of these capabilities are available as part of the",2025-11-25T14:31:00.000Z,concept-article,,0.3,False,"Attack path/security explorer article describes capabilities and usage conceptually; summary does not show specific configuration values, error codes, or decision matrices with thresholds.",unchanged +https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-agentless-data-collection,Agentless machine scanning,Agentless machine scanning in Microsoft Defender for Cloud - Microsoft Defender for Cloud,,Learn how Defender for Cloud can gather information about multicloud machine without installing an agent.,"Agentless machine scanning in Microsoft Defender for Cloud improves the security posture of machines connected to Defender for Cloud. Agentless scanning doesn't need any installed agents or network connectivity, and doesn't affect machine performance. Agentless machine scanning: Agentless scanning is available in the following Defender for Cloud plans:",2026-04-19T08:00:00.000Z,concept-article,,0.3,False,"Conceptual description of agentless machine scanning and supported plans; no detailed limits, configuration tables, error codes, or decision matrices with quantified trade-offs.",updated +https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-attack-path,Investigate risks with security explorer and attack paths,Investigate risks with security explorer/attack paths in Microsoft Defender for Cloud - Microsoft Defender for Cloud,,Learn about investigating risks with security explorer/attack paths in Microsoft Defender for Cloud.,"One of the biggest challenges for security teams today is the number of daily security issues. 
Numerous security issues need resolution, but resources are insufficient. Defender for Cloud's contextual security capabilities help security teams assess the risk behind each security issue and identify the highest-risk issues that need immediate resolution. Defender for Cloud helps security teams reduce the risk of impactful breaches effectively. All of these capabilities are available as part of the",2026-04-19T08:00:00.000Z,concept-article,,0.2,False,"Explains attack paths/security explorer conceptually; no concrete configuration values, limits, or structured troubleshooting/decision content.",updated https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-authentication-architecture-aws,Authentication architecture for AWS connectors,Authentication architecture for AWS connectors - Microsoft Defender for Cloud,Understand AWS connector authentication architecture in Defender for Cloud,"Learn how Microsoft Defender for Cloud authenticates to AWS using short-lived credentials, federated trust, and role-based access controls.","When you connect an AWS account to Microsoft Defender for Cloud, the service uses federated authentication to securely call AWS APIs without storing long-lived credentials. Temporary access is granted through AWS Security Token Service (STS), using short-lived credentials exchanged through a cross-cloud trust relationship. 
This article explains how that trust is established and how short-lived credentials are used to access AWS resources securely.",2025-12-18T12:04:00.000Z,article,security,0.7,True,"Explains federated auth, STS, and trust setup for AWS connectors; product-specific authentication architecture and permissions, fitting security configuration patterns.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-cloud-security-posture-management,Overview Cloud Security Posture Management (CSPM),What is Cloud Security Posture Management (CSPM) - Microsoft Defender for Cloud,,Learn more about Cloud Security Posture Management (CSPM) in Microsoft Defender for Cloud and how it helps improve your security posture.,"Cloud Security Posture Management (CSPM) is a core feature of Microsoft Defender for Cloud. CSPM provides continuous visibility into the security state of your cloud assets and workloads, offering actionable guidance to improve your security posture across Azure, AWS, and GCP. Defender for Cloud continually assesses your cloud infrastructure against security standards defined for your Azure subscriptions, Amazon Web Service (AWS) accounts, and Google Cloud Platform (GCP) projects. Defender for C",2025-12-10T12:09:00.000Z,concept-article,,0.25,False,"Conceptual overview of CSPM; describes capabilities and benefits without detailed configuration parameters, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-data-security-posture,About Data Security Posture Management,Overview - Data security posture management - Microsoft Defender for Cloud,,"Explore how Microsoft Defender for Cloud enhances data security posture management across multicloud environments, ensuring comprehensive protection.",Organizations move data to the cloud at an exponential rate using multiple data stores such as object stores and managed/hosted databases as digital transformation accelerates. 
The cloud's dynamic and complex nature increases data threat surfaces and risks. Security teams face challenges with data visibility and protecting the cloud data estate. Data security posture management in Microsoft Defender for Cloud helps you reduce data risk and respond to data breaches. With data security posture man,2025-02-23T12:10:00.000Z,concept-article,,0.24,False,Data security posture management overview; describes challenges and high-level capabilities without indicating specific configuration settings or limits.,unchanged @@ -115,9 +115,9 @@ https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-apis-man
https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-apis-posture,"Investigate findings, recommendations, alerts",Investigate your API security findings and posture - Microsoft Defender for Cloud,Investigate API security findings and posture in Defender for APIs,Learn how to analyze your API security alerts and posture in Microsoft Defender for Cloud,"This article describes how to investigate API security findings, alerts, and security posture recommendations for APIs protected by Microsoft Defender for APIs.",2025-07-15T08:00:00.000Z,concept-article,best-practices,0.65,True,"Explains how to analyze findings, alerts, and posture recommendations; will include product-specific navigation, filtering, and recommended investigation steps for API security.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-apis-prepare,Support and prerequisites,Support and prerequisites for deploying the Defender for APIs plan - Microsoft Defender for Cloud,Check prerequisites to deploy Defender for APIs,Learn about the requirements for Defender for APIs deployment in Microsoft Defender for Cloud,Review the onboarding requirements on this page before setting up Microsoft Defender for APIs.,2026-03-31T17:20:00.000Z,checklist,configuration,0.7,True,"The page describes concrete, product-specific prerequisites and 
support requirements for enabling the Defender for APIs plan (for example, which API Management configurations/tiers or environments are supported and required). These are configuration/support matrix details that are not inferable from general knowledge and are needed before onboarding, fitting the configuration category best.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-apis-validation,Validate APIs alerts,Validate your Microsoft Defender for APIs alerts - Microsoft Defender for Cloud,Trigger and validate Defender for APIs alerts,"Validate your Microsoft Defender for APIs alerts and ensure the security of your APIs with full lifecycle protection, detection, and response coverage.","Microsoft Defender for APIs offers full lifecycle protection, detection, and response coverage for APIs that are published in Azure API Management. One of the main capabilities is the ability to detect exploits of the Open Web Application Security Project (OWASP) API Top 10 vulnerabilities through runtime observations of anomalies using machine learning-based and rule-based detections. 
This page walks you through the steps to trigger an alert for one of your API endpoints through Defender for AP",2024-08-13T17:00:00.000Z,how-to,troubleshooting,0.7,True,"Walkthrough to trigger alerts for OWASP API Top 10 likely includes specific alert IDs, conditions, and validation steps (symptom→verification).",unchanged
-https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-app-service-introduction,Overview,Microsoft Defender for App Service - the benefits and features - Microsoft Defender for Cloud,,Learn about the capabilities of Microsoft Defender for App Service and how to enable it on your subscription.,,2025-05-13T08:00:00.000Z,overview,,0.3,False,Benefits/features overview of Defender for App Service; summary suggests high-level content without detailed configs or troubleshooting.,unchanged +https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-app-service-introduction,Overview,Microsoft Defender for App Service - the benefits and features - Microsoft Defender for Cloud,,Learn about the capabilities of Microsoft Defender for App Service and how to enable it on your subscription.,,2026-04-19T08:00:00.000Z,overview,,0.1,False,"Benefits and features overview for Defender for App Service; describes what the service does and how to enable it, without detailed configuration parameters, limits, or troubleshooting content.",updated https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-cloud-glossary,Defender for Cloud glossary of terms,Defender for Cloud glossary - Microsoft Defender for Cloud,,The glossary provides a brief description of important Defender for Cloud platform terms and concepts.,This glossary provides a brief description of important terms and concepts for the Microsoft Defender for Cloud platform. Select the Learn more links to go to related terms in the glossary. 
This glossary can help you to learn and use the product tools quickly and effectively.,2024-09-09T11:07:00.000Z,article,,0.2,False,"Glossary of terms is conceptual reference; it defines terminology but not configuration, limits, or troubleshooting details.",unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-cloud-introduction,What is Microsoft Defender for Cloud?,Microsoft Defender for Cloud Overview - Microsoft Defender for Cloud,,"Secure your Azure, hybrid, and multicloud resources with Microsoft Defender for Cloud. This cloud-native application protection platform (CNAPP) includes two key capabilities, cloud security posture m","Important Microsoft Defender for Cloud is expanding to the Defender portal to provide a unified security experience across cloud and code environments. As part of this expansion, some features are now available in the Microsoft Defender Portal, and additional capabilities will be added to the Defender portal over time. This change is designed to: To identify documentation specific for the Defender Portal, look for the portal entry point at the top of the article. This pivot indicates whether the",2026-01-01T08:00:00.000Z,overview,,0.2,False,"High-level product overview of Defender for Cloud; primarily conceptual CNAPP/CSPM/CWPP description without concrete limits, configs, or error mappings.",unchanged +https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-cloud-introduction,What is Microsoft Defender for Cloud?,Microsoft Defender for Cloud Overview - Microsoft Defender for Cloud,,"Secure your Azure, hybrid, and multicloud resources with Microsoft Defender for Cloud. This cloud-native application protection platform (CNAPP) includes two key capabilities, cloud security posture m","Important Microsoft Defender for Cloud is expanding to the Defender portal to provide a unified security experience across cloud and code environments. 
As part of this expansion, some features are now available in the Microsoft Defender Portal, and additional capabilities will be added to the Defender portal over time. This change is designed to: To identify documentation specific for the Defender Portal, look for the portal entry point at the top of the article. This pivot indicates whether the",2026-04-19T08:00:00.000Z,overview,,0.2,False,"High-level overview of Microsoft Defender for Cloud (CNAPP, CSPM, CWPP) and portal transition; no detailed limits, configuration tables, error codes, or decision matrices.",updated https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-container-registries-introduction,Defender for container registries (deprecated),Microsoft Defender for container registries - the benefits and features - Microsoft Defender for Cloud,,Learn about the benefits and features of Microsoft Defender for container registries.,"Important We started a public preview of Azure Vulnerability Assessment powered by MDVM. For more information, see Vulnerability assessments for Azure with Microsoft Defender Vulnerability Management. Azure Container Registry (ACR) is a managed, private Docker registry service that stores and manages your container images for Azure deployments in a central registry. It's based on the open-source Docker Registry 2.0. 
To protect the Azure Resource Manager based registries in your subscription, enab",2025-07-15T08:00:00.000Z,overview,,0.3,False,Benefits/features overview of Defender for container registries; largely conceptual without detailed configs or limits in the summary.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-containers-arc-enable-portal,Enable via portal,Enable Defender for Containers on Arc-enabled Kubernetes via portal - Microsoft Defender for Cloud,,"Learn how to enable Microsoft Defender for Containers on your Arc-enabled Kubernetes clusters through the Azure portal, with options to enable all components or deploy specific components selectively.","This article shows you how to enable Microsoft Defender for Containers on your Arc-enabled Kubernetes clusters through the Azure portal. You can choose to enable all security features at once for comprehensive protection, or selectively deploy specific components based on your requirements.",2025-12-04T12:03:00.000Z,how-to,,0.4,False,"Portal enablement how-to; likely step-by-step UI instructions without detailed config tables, limits, or product-specific troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-containers-arc-enable-programmatically,Enable programmatically,Deploy Defender for Containers on Arc-Enabled Kubernetes Programmatically - Microsoft Defender for Cloud,Programmatically enable Defender for Containers on Arc,Learn how to enable Microsoft Defender for Containers on Arc-enabled Kubernetes clusters using Azure CLI and ARM templates.,"This article describes how to enable Microsoft Defender for Containers on Arc-enabled Kubernetes clusters by using programmatic methods. 
Tip For Azure portal deployment instructions, see Deploy Defender for Containers on Arc-enabled Kubernetes using Azure portal.",2025-12-04T12:03:00.000Z,how-to,configuration,0.65,True,"Programmatic deployment via CLI and ARM typically includes resource types, parameter names, and example payloads that are product-specific configuration details beyond generic knowledge.",unchanged @@ -148,13 +148,13 @@ https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-containe
https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-containers-introduction,Overview,Overview of Microsoft Defender for Containers - Microsoft Defender for Cloud,,"Learn about Microsoft Defender for Containers, a cloud-native solution that secures your containerized assets across multicloud and on-premises environments.","Microsoft Defender for Containers is a cloud-native solution that enhances, monitors, and maintains the security of your containerized assets. These assets include Kubernetes clusters, nodes, workloads, registries, images, and more. It protects applications across multicloud and on-premises environments. Defender for Containers helps you with five core domains of container security: Security posture management runs continuous monitoring of cloud APIs, Kubernetes APIs, and Kubernetes workloads. 
It",2026-03-30T11:04:00.000Z,overview,,0.2,False,"Introduction to Microsoft Defender for Containers focusing on what it is and its core domains; summary does not indicate concrete limits, configuration tables, error codes, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-databases-enable-cosmos-protections,Deploy Defender for Azure Cosmos DB,Enable Microsoft Defender for Azure Cosmos DB - Microsoft Defender for Cloud,Enable Microsoft Defender for Azure Cosmos DB,Learn how to enable enhanced security features in Microsoft Defender for Azure Cosmos DB.,"Microsoft Defender for Azure Cosmos DB protection is available at both the Subscription level, and resource level. You can enable Microsoft Defender for Cloud on your subscription to protect all database types on your subscription including Microsoft Defender for Azure Cosmos DB (recommended). You can also choose to enable Microsoft Defender for Azure Cosmos DB at the Resource level to protect a specific Azure Cosmos DB account.",2025-06-30T08:00:00.000Z,how-to,security,0.64,True,"Enablement article for a security plan at subscription and resource levels; likely includes specific configuration paths, scopes, and security-related settings.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-databases-introduction,Overview,Overview of Defender for Open-Source Relational Databases - Microsoft Defender for Cloud,,"Learn about the benefits and features of Microsoft Defender for Open-Source Relational Databases such as PostgreSQL, MySQL, and MariaDB.","In Microsoft Defender for Cloud, the Defender for Open-Source Relational Databases plan within Defender for Databases detects anomalous activities that indicate unusual and potentially harmful attempts to access or exploit databases. 
With this plan, you can address potential threats to databases without the need to be a security expert or manage advanced security-monitoring systems.",2025-12-21T12:03:00.000Z,overview,,0.1,False,"Overview of Defender for open-source relational databases; marketing/benefits style content without detailed limits, configs, or troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-databases-overview,Overview,Overview of Defender for Databases - Microsoft Defender for Cloud,,"Discover the advantages and capabilities of Microsoft Defender for Databases, including support for PostgreSQL, MySQL, and MariaDB.","In Microsoft Defender for Cloud, the Defender for Databases plan helps protect your database estate from threats and vulnerabilities. The Defender for Databases plan provides threat protection and security management across cloud environments. Defender for Databases includes four offerings that relate to database types: Microsoft Defender for Azure SQL Databases: Offers threat protection for Azure SQL databases by detecting and responding to potential security threats. Microsoft Defender for SQL",2025-05-13T17:04:00.000Z,overview,,0.25,False,"High-level overview of Defender for Databases offerings; no indication of detailed configuration, limits, or troubleshooting content.",unchanged +https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-databases-overview,Overview,Overview of Defender for Databases - Microsoft Defender for Cloud,,"Discover the advantages and capabilities of Microsoft Defender for Databases, including support for PostgreSQL and MySQL.","In Microsoft Defender for Cloud, the Defender for Databases plan helps protect your database estate from threats and vulnerabilities. The Defender for Databases plan provides threat protection and security management across cloud environments. 
Defender for Databases includes four offerings that relate to database types: Microsoft Defender for Azure SQL Databases: Offers threat protection for Azure SQL databases by detecting and responding to potential security threats. Microsoft Defender for SQL",2026-04-20T17:15:00.000Z,overview,,0.1,False,"High-level overview of Defender for Databases capabilities and supported database types; no specific limits, configuration tables, error codes, or decision matrices.",updated https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-devops-introduction,DevOps security overview,Microsoft Defender for Cloud DevOps security benefits - Microsoft Defender for Cloud,,"Learn about the benefits and features of Microsoft Defender for Cloud DevOps security, including visibility, posture management, and threat protection.","Microsoft Defender for Cloud enables comprehensive visibility, posture management, and threat protection across multicloud environments, including Azure, Amazon Web Services (AWS), Google Cloud Platform (GCP), and on-premises resources. DevOps security in Defender for Cloud uses a central console to help security teams protect applications and resources from code to cloud across multi-pipeline environments, including Azure DevOps, GitHub, and GitLab. 
DevOps security recommendations can be correl",2025-03-13T11:19:00.000Z,overview,,0.3,False,"DevOps security benefits overview; focuses on visibility and posture management conceptually across platforms, not on specific configs, limits, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-dns-alerts,Respond to Microsoft Defender for DNS alerts,Respond to Microsoft Defender for DNS alerts - Microsoft Defender for Cloud,Respond to Microsoft Defender for DNS security alerts,Learn best practices for responding to alerts that indicate security risks in DNS services.,"Important When you receive a security alert about suspicious and anomalous activities identified in DNS transactions, we recommend you investigate and respond to the alert as described below. Even if you're familiar with the application or user that triggered the alert, it's important to verify the situation surrounding every alert.",2025-06-11T17:01:00.000Z,how-to,troubleshooting,0.65,True,"Described as best practices for responding to DNS alerts; likely includes specific alert types and recommended investigation/remediation steps, fitting troubleshooting symptom→solution patterns.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-dns-introduction,Overview,Microsoft Defender for DNS - the benefits and features - Microsoft Defender for Cloud,,Learn about the benefits and features of Microsoft Defender for DNS.,"Important Microsoft Defender for DNS provides an additional layer of protection for all Azure resources that use Azure DNS's Azure-provided name resolution capability. 
From within Azure DNS, Defender for DNS monitors the queries from these resources and detects suspicious activities without the need for any extra agents.",2025-08-20T22:10:00.000Z,overview,,0.25,False,"Benefits/features overview for Defender for DNS; summary shows conceptual description of monitoring behavior without detailed configs, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-key-vault-introduction,Handle Defender for Key Vault alerts,Microsoft Defender for Key Vault - the benefits and features - Microsoft Defender for Cloud,,Learn about the benefits and features of Microsoft Defender for Key Vault.,"Azure Key Vault is a cloud service that safeguards encryption keys and secrets like certificates, connection strings, and passwords. Enable Microsoft Defender for Key Vault for Azure-native, advanced threat protection for Azure Key Vault, providing another layer of security intelligence.",2025-08-20T08:00:00.000Z,overview,,0.1,False,Duplicate of the Defender for Key Vault introduction; same conceptual overview nature without expert-level configuration or limits.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-key-vault-introduction,Overview,Microsoft Defender for Key Vault - the benefits and features - Microsoft Defender for Cloud,,Learn about the benefits and features of Microsoft Defender for Key Vault.,"Azure Key Vault is a cloud service that safeguards encryption keys and secrets like certificates, connection strings, and passwords. 
Enable Microsoft Defender for Key Vault for Azure-native, advanced threat protection for Azure Key Vault, providing another layer of security intelligence.",2025-08-20T08:00:00.000Z,overview,,0.1,False,Introduction to Defender for Key Vault benefits and features; appears to be a conceptual/marketing overview without detailed configs or limits.,unchanged
-https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-resource-manager-introduction,Overview,Microsoft Defender for Resource Manager - Benefits and Features - Microsoft Defender for Cloud,,Learn about the benefits and features of Microsoft Defender for Resource Manager to protect your Azure resources from potential threats.,"Azure Resource Manager is the deployment and management service for Azure. It provides a management layer that enables you to create, update, and delete resources in your Azure account. You use management features, like access control, locks, and tags, to secure and organize your resources after deployment. The cloud management layer is a crucial service connected to all your cloud resources. Because of this, it is also a potential target for attackers. Consequently, we recommend security operati",2025-08-20T22:10:00.000Z,overview,,0.2,False,Benefits and features overview for Defender for Resource Manager; summary indicates conceptual security positioning rather than detailed configuration or troubleshooting content.,unchanged
+https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-resource-manager-introduction,Overview,Microsoft Defender for Resource Manager - Benefits and Features - Microsoft Defender for Cloud,,Learn about the benefits and features of Microsoft Defender for Resource Manager to protect your Azure resources from potential threats.,"Azure Resource Manager is the deployment and management service for Azure. It provides a management layer that enables you to create, update, and delete resources in your Azure account. 
You use management features, like access control, locks, and tags, to secure and organize your resources after deployment. The cloud management layer is a crucial service connected to all your cloud resources. Because of this, it is also a potential target for attackers. Consequently, we recommend security operati",2026-04-19T08:00:00.000Z,overview,,0.2,False,"Introductory benefits and features overview for Defender for Resource Manager. The summary indicates conceptual explanation of what the service is and why it matters, without specific RBAC roles, configuration parameters, limits, or decision matrices. This is not expert-knowledge content per the defined categories.",updated https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-resource-manager-usage,Handle Defender for Resource Manager alerts,Responding to Defender for Resource Manager alerts - Microsoft Defender for Cloud,Investigate and respond to Defender for Resource Manager alerts,Learn about the steps necessary for responding to alerts from Microsoft Defender for Resource Manager,"Investigate and respond to alerts from Microsoft Defender for Resource Manager as described in this article. Defender for Resource Manager protects all connected resources. 
Verify the situation surrounding every alert, even if you're familiar with the application or user that triggered it.",2025-04-01T11:25:00.000Z,how-to,troubleshooting,0.65,True,"Alert response article will map specific Defender for Resource Manager alert types to investigation and remediation steps, which is product-specific troubleshooting guidance.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-servers-overview,Overview,Overview of Defender for Servers in Defender for Cloud - Microsoft Defender for Cloud,,"Get an overview of the Defender for Servers plan in Microsoft Defender for Cloud, including its features and integration with other Defender services.",The Defender for Servers plan in Microsoft Defender for Cloud reduces security risk and exposure for machines in your organization. It provides recommendations to improve and remediate security posture. Defender for Servers also protects machines against real-time security threats and attacks. Note Defender for Servers no longer supports the Log Analytics agent and Azure Monitoring Agent (AMA). Agentless machine scanning and the integration with Microsoft Defender for Endpoint replace these agents f,2025-11-02T08:00:00.000Z,concept-article,,0.4,False,Overview of Defender for Servers plan; mostly conceptual feature description.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-sql-alerts,Investigate Defender for SQL security alerts,Explore and investigate Defender for SQL security alerts - Microsoft Defender for Cloud,Explore and investigate Defender for SQL security alerts,Learn how to explore and investigate Defender for SQL security alerts in Microsoft Defender for Cloud.,There are several ways to view Microsoft Defender for SQL alerts in Microsoft Defender for Cloud: The Alerts page. The machine's security page. The workload protections dashboard. 
Through the direct link provided in the alert's email.,2024-08-07T16:44:00.000Z,how-to,security,0.64,True,"Describes specific locations and flows to view and investigate Defender for SQL alerts, which is product-specific security operations guidance.",unchanged @@ -170,7 +170,7 @@ https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-
Migrating to the new plan is a simple process, read here about how to migrate from the classic plan. If you're using Defender for Storage (classic) with per-transac",2025-05-13T08:00:00.000Z,overview,decision-making,0.65,True,"Migration-focused article comparing classic and new plans, including pricing model differences and feature availability; this is plan-selection guidance with concrete trade-offs and migration advice.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-classic-enable,Enable and configure Defender for Storage (classic),Enable and configure Microsoft Defender for Storage (classic) - Microsoft Defender for Cloud,Enable and configure Defender for Storage classic via templates,Learn how to enable and configure Microsoft Defender for Storage (classic) to protect your storage accounts from potential security threats.,"This article explains how to enable and configure Microsoft Defender for Storage (classic) on your subscriptions using various templates such as PowerShell, REST API, and others. Note Defender for Storage (classic) is unavailable for new subscriptions as of February 5, 2025. You can also upgrade to the new Microsoft Defender for Storage plan and use advanced security capabilities, including malware scanning and sensitive data threat detection. 
Benefit from a predictable and granular pricing struct",2025-03-20T17:01:00.000Z,how-to,configuration,0.7,True,"Covers enabling/configuring Defender for Storage (classic) using PowerShell, REST API, and templates, which necessarily includes specific configuration parameters and request schemas.",unchanged
-https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-classic-migrate,Migrating to the new plan,Migrate from Defender for Storage (classic) - Microsoft Defender for Cloud,Migrate from Defender for Storage classic to new plan,Learn about how to migrate from Defender for Storage (classic) to the new Defender for Storage plan to take advantage of its enhanced capabilities and pricing.,"On March 28, 2023, we introduced the new Defender for Storage plan. This plan offers several benefits not available in the Defender for Storage (classic) per-transaction or per-storage account pricing plans, such as: The new pricing plan charges based on the number of storage accounts you protect, simplifying calculations and allowing for easy scaling as your needs change. For detailed pricing information, see the pricing page. You can also estimate costs with the Defender for Cloud cost calculato",2025-05-13T08:00:00.000Z,how-to,decision-making,0.7,True,"Explicit migration article with plan comparison, pricing model explanation, and migration steps; supports decision-making and migration path selection between pricing models.",unchanged +https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-classic-migrate,Migrating to the new plan,Migrate from Defender for Storage (classic) - Microsoft Defender for Cloud,Decide and migrate from Defender for Storage classic to new plan,Learn about how to migrate from Defender for Storage (classic) to the new Defender for Storage plan to take advantage of its enhanced capabilities and pricing.,"On March 28, 2023, we introduced the new Defender for Storage plan. 
This plan offers several benefits not available in the Defender for Storage (classic) per-transaction or per-storage account pricing plans, such as: The new pricing plan charges based on the number of storage accounts you protect, simplifying calculations and allowing for easy scaling as your needs change. For detailed pricing information, see the pricing page. You can also estimate costs with the Defender for Cloud cost calculato",2026-04-19T08:00:00.000Z,how-to,decision-making,0.7,True,"Migration and plan-selection guidance between Defender for Storage (classic) and the new plan. Such docs typically include comparison of pricing models, capabilities, and migration steps, helping users decide when and how to move. This is product-specific decision-making and migration guidance that goes beyond generic knowledge.",updated https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-configure-malware-scan,Set up automated remediation for malware detection,Set Up Automated Remediation for Malware Detection - Microsoft Defender for Cloud,Set up automated malware remediation in Defender for Storage,Learn how to set up automated remediation for malware detection in Microsoft Defender for Storage to protect your Azure Storage accounts from harmful files.,"Defender for Storage supports different ways to handle malicious files. Select the remediation option that fits your scenario: By using malware scanning, you can build your automated remediation by using these scan result options: Tip To explore malware scanning in Defender for Storage, try the hands-on lab. 
The Ninja training module gives step-by-step instructions to: This lab is part of the Microsoft Defender for Cloud training series and gives practical experience with security features.",2026-01-08T08:00:00.000Z,how-to,configuration,0.65,True,"Describes concrete ways to handle malicious files and build automated remediation based on scan result options, which implies product-specific configuration flows and option values rather than just conceptual guidance.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-data-sensitivity,Overview,Sensitive data threat detection in Defender for Storage - Microsoft Defender for Cloud,,"Understand how Microsoft Defender for Storage detects and protects sensitive data from exposure, enhancing your organization's data security.","Sensitive data threat detection helps you prioritize and examine security alerts efficiently. It considers the sensitivity of the data at risk, leading to better detection and prevention of data breaches. This capability helps security teams reduce the likelihood of data breaches by quickly identifying and addressing the most significant risks. It also enhances sensitive data protection by detecting exposure events and suspicious activities on resources containing sensitive data. 
This feature is",2025-08-14T11:04:00.000Z,overview,,0.3,False,"High-level description of sensitive data threat detection benefits and behavior; summary suggests conceptual overview without specific configuration parameters, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-infrastructure-as-code-enablement,Enable with Infrastructure as Code,Enable Defender for Storage by Using Infrastructure as Code - Microsoft Defender for Cloud,Enable Defender for Storage with IaC templates,Learn how to enable and configure Microsoft Defender for Storage by using infrastructure as code (IaC) templates.,"We recommend that you enable Microsoft Defender for Storage on the subscription level. Doing so helps ensure that all storage accounts currently in the subscription are protected. Protection for storage accounts that you create after enabling Defender for Storage on the subscription level starts up to 24 hours after creation. Tip You can always configure specific storage accounts with custom settings that differ from the settings configured at the subscription level. That is, you can override subs",2025-09-16T22:15:00.000Z,how-to,deployment,0.76,True,"Infrastructure-as-code enablement article; likely contains ARM/Bicep/Terraform parameters and deployment constraints, including timing detail (up to 24 hours) that is product-specific.",unchanged @@ -186,7 +186,7 @@ https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-portal/enabl https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-portal/integration-faq,FAQ,Integration FAQ for Defender portal,,"Frequently asked questions about integrating and using Microsoft Defender for Cloud in the Defender portal, including migration, compatibility, and feature questions.","Note This is currently in preview. For details about current gaps and restrictions, see Known limitations. 
This article answers frequently asked questions about using Microsoft Defender for Cloud in the Defender portal, including integration, migration, and compatibility topics.",2026-02-09T08:00:00.000Z,faq,,0.4,False,"FAQ about integration and migration; likely conceptual Q&A without structured error codes, configs, or numeric thresholds.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-portal/known-limitations,Known limitations,Known limitations in the Defender portal,Understand current limitations of Defender portal preview,Understand current limitations and known issues when using Microsoft Defender for Cloud in the Defender portal during the preview phase.,These are the known limitations during the preview release of Defender for Cloud in the Defender portal. Note Enabling the Defender portal experience won't impact your experience in the Azure portal.,2026-02-09T08:00:00.000Z,reference,limits-quotas,0.65,True,"Known limitations page for preview; typically lists concrete unsupported scenarios and constraints specific to this portal experience, which function as product limits/quotas.",unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-sensor-change-log,Defender sensor for Containers changelog,Defender Sensor for Defender for Containers Changelog - Microsoft Defender for Cloud,,Learn about the version history and updates for the Defender sensor in Microsoft Defender for Containers.,"The Sensor for Microsoft Defender for Containers release notes provides a detailed version history of sensor updates. Each version includes new features, improvements, and fixes to enhance functionality. Use this changelog to stay informed about the latest updates and plan your deployments accordingly. For more information about deploying the sensor in Defender for Containers, see Configure Microsoft Defender for Containers components. 
To see the version of the sensor run: kubectl get -n kube-sys",2026-03-17T17:15:00.000Z,reference,,0.2,False,"The page is a changelog/release notes for the Defender sensor, listing version history, features, and fixes. It does not clearly expose structured limits, configuration parameter tables, security roles, troubleshooting mappings, or decision matrices as defined in the sub-skill types. While it may contain some product-specific details, they are not organized in a way that matches any of the targeted expert-knowledge categories.",unchanged +https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-sensor-change-log,Defender sensor for Containers changelog,Defender Sensor for Defender for Containers Changelog - Microsoft Defender for Cloud,,Learn about the version history and updates for the Defender sensor in Microsoft Defender for Containers.,"The Sensor for Microsoft Defender for Containers release notes provides a detailed version history of sensor updates. Each version includes new features, improvements, and fixes to enhance functionality. Use this changelog to stay informed about the latest updates and plan your deployments accordingly. For more information about deploying the sensor in Defender for Containers, see Configure Microsoft Defender for Containers components. To see the version of the sensor run: kubectl get -n kube-sys",2026-04-20T11:04:00.000Z,reference,,0.2,False,"A changelog/release notes page lists version history, features, and fixes. While it is detailed, it does not fit any of the defined sub-skill types (no limits tables, configuration matrices, troubleshooting mappings, or decision criteria). 
It’s primarily historical/update information rather than reusable expert configuration, deployment, or troubleshooting guidance.",updated https://learn.microsoft.com/en-us/azure/defender-for-cloud/delegate-with-copilot,Delegate recommendations with Microsoft Security Copilot,Delegate recommendations with Microsoft Security Copilot - Microsoft Defender for Cloud,,Learn how to delegate recommendations with Copilot in Microsoft Defender for Cloud and improve your security posture.,Microsoft Defender for Cloud's integration with Microsoft Security Copilot lets you delegate recommendations on the recommendations page with natural language prompts. You can delegate recommendations to another person or team. Delegating recommendations can improve your security posture by having the right people address the risks and vulnerabilities in your environment.,2025-11-25T14:31:00.000Z,how-to,,0.35,False,Describes delegating recommendations with Copilot; mainly UX/process guidance without deep technical reference content.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/deploy-helm,Configure sensor deployed with Helm,Install Defender for Containers sensor using Helm - Microsoft Defender for Cloud,Deploy Defender for Containers sensor via Helm,Learn how to install the Microsoft Defender for Containers sensor on Kubernetes clusters using Helm.,"This article describes how to install and configure the Microsoft Defender for Containers sensor on AKS, EKS, and GKE clusters by using Helm. You learn about prerequisites, enabling Defender for Containers, and step-by-step deployment instructions for different environments. Defender for Containers supports multiple deployment models for deploying the sensor, including automatic provisioning and Helm-based installation. 
Helm-based deployment provides greater control over sensor versioning and up",2026-03-12T22:19:00.000Z,how-to,deployment,0.68,True,"Page focuses on installing Defender for Containers sensor on AKS/EKS/GKE using Helm with product-specific deployment models and environment-specific steps. While the summary emphasizes step-by-step instructions rather than matrices, Helm-based deployment of a security sensor across multiple managed Kubernetes platforms is a product-specific deployment pattern that an LLM is unlikely to know in detail from training. It describes how to enable Defender for Containers and use Helm-based installation as an alternative to automatic provisioning, which fits deployment more than configuration or integrations.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/deploy-vulnerability-assessment-byol-vm,Enable vulnerability scanning with a BYOL solution,Enable vulnerability scanning with a Bring Your Own License (BYOL) solution - Microsoft Defender for Cloud,Use BYOL vulnerability assessment with Defender for Cloud,Deploy a BYOL vulnerability assessment solution on your Azure virtual machines to get recommendations in Microsoft Defender for Cloud that can help you protect your virtual machines.,"The Defender for Servers plan in Microsoft Defender for Cloud provides vulnerability scanning for protected machines. Vulnerability scanning uses integrated Microsoft Defender Vulnerability Management as its scanner. Warning Bring your own license (BYOL) capability is being deprecated. Starting February 3rd, you will no longer be able to add new BYOL security solutions or onboard new machines to existing ones. By May 1st, the deprecation will be complete, and no data will be available. 
If you are ",2025-02-19T08:00:00.000Z,how-to,decision-making,0.7,True,"Describes enabling BYOL vulnerability solutions and includes deprecation dates and migration guidance, which are product-specific decision and migration considerations.",unchanged @@ -204,9 +204,9 @@ https://learn.microsoft.com/en-us/azure/defender-for-cloud/edit-devops-connector https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-agentless-scanning-vms,Enable agentless machine scanning,Enable agentless scanning for Virtual Machines - Microsoft Defender for Cloud,Configure agentless scanning for virtual machines,Run agentless scanning on Virtual Machines (VMs) for vulnerabilities and threats in Microsoft Defender for Cloud.,"Agentless machine scanning in Microsoft Defender for Cloud improves the security posture of machines connected to Defender for Cloud. Agentless machine scanning includes capabilities such as scanning for software inventory, vulnerabilities, secrets, and malware. When you turn on Defender for Servers Plan 2 or the Defender Cloud Security Posture Management (CSPM) plan, agentless machine scanning is enabled by default. If needed, you can use the instructions in this article to enable agentless mach",2025-12-24T12:03:00.000Z,how-to,configuration,0.7,True,"Describes enabling agentless scanning, including when it is enabled by default and how to turn it on; likely includes specific setting names and plan-based behavior.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-api-security-posture,Enable API security posture with Defender CSPM,Enable API security posture management - Microsoft Defender for Cloud,,Learn how to enable API security posture management in Microsoft Defender for Cloud to protect your APIs.,"The Defender Cloud Security Posture Management (CSPM) plan in Microsoft Defender for Cloud gives you a complete view of your APIs across Azure API Management, Function Apps, and Logic Apps. 
It helps you improve API security by finding misconfigurations and vulnerabilities. This article explains how to enable API security posture management in your Defender CSPM plan and assess your API security. Defender CSPM onboards APIs without an agent and regularly checks for risks and sensitive data exposu",2026-03-31T11:04:00.000Z,how-to,,0.3,False,"Primarily describes that Defender CSPM provides API security posture management and explains that APIs are onboarded agentlessly and checked for risks. The summary does not indicate presence of detailed configuration parameter tables, RBAC role definitions, or error-code mappings; it reads as a how-to/enablement overview rather than expert configuration or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-defender-for-databases-aws,Enable on AWS (Preview),Enable Defender for open-source relational databases on Amazon Web Services (AWS) - Microsoft Defender for Cloud,Enable Defender for open-source databases on AWS,Learn how to enable Microsoft Defender for open-source relational databases to detect potential security threats on AWS environments.,"The Defender for open-source relational databases plan in Microsoft Defender for Cloud helps you detect and investigate unusual activity in your AWS RDS databases. This plan supports the following database instance types: This article explains how to enable Defender for open-source relational databases on AWS so that you can start receiving alerts for suspicious activity. 
When you enable this plan, Defender for Cloud also discovers sensitive data in your AWS account and enriches security insight",2025-12-28T23:03:00.000Z,how-to,security,0.62,True,Explains enabling Defender for open-source relational databases on AWS RDS; likely includes cloud-specific configuration details and security integration steps.,unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-defender-for-databases-azure,Enable on Azure,Enable Defender for open-source relational databases on Azure - Microsoft Defender for Cloud,Enable Defender for open-source databases on Azure,Learn how to enable Microsoft Defender for open-source relational databases to detect potential security threats on Azure environments.,"Microsoft Defender for Cloud detects anomalous activities indicating unusual and potentially harmful attempts to access or exploit databases for the following services: To get alerts from the Microsoft Defender plan, you need to follow the instructions on this page to enable Defender for open-source relational databases Azure. 
Learn more about this Microsoft Defender plan in Overview of Microsoft Defender for open-source relational databases.",2025-06-30T08:00:00.000Z,how-to,security,0.62,True,"Enablement article for a security plan; typically includes specific Azure configuration steps, scopes, and possibly role/permission requirements unique to this plan.",unchanged +https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-defender-for-databases-azure,Enable on Azure,Enable Defender for open-source relational databases on Azure - Microsoft Defender for Cloud,Enable Defender for open-source Azure databases,Learn how to enable Microsoft Defender for open-source relational databases to detect potential security threats on Azure environments.,"Microsoft Defender for Cloud detects anomalous activities indicating unusual and potentially harmful attempts to access or exploit databases for the following services: To get alerts from the Microsoft Defender plan, you need to follow the instructions on this page to enable Defender for open-source relational databases Azure. Learn more about this Microsoft Defender plan in Overview of Microsoft Defender for open-source relational databases.",2026-04-20T17:15:00.000Z,how-to,configuration,0.7,True,"Enablement/how-to article for a specific Defender for Cloud plan. These pages typically include product-specific configuration steps (which resource types are supported, where to enable the plan, plan names/toggles, and required settings) rather than just conceptual security overview. 
That makes it primarily configuration-focused rather than generic guidance.",updated https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-defender-for-endpoint,Enable Defender integration,Enable Defender for Endpoint in Defender for Cloud - Microsoft Defender for Cloud,Enable Defender for Endpoint integration in Defender for Cloud,Learn how to enable Microsoft Defender for Endpoint integration in Microsoft Defender for Cloud to protect your multicloud and on-premises machines.,Microsoft Defender for Cloud integrates natively with Microsoft Defender for Endpoint to provide Defender for Endpoint and Defender Vulnerability Management capabilities in Defender for Cloud. This article explains how to manually enable Defender for Endpoint integration when necessary.,2025-06-30T08:00:00.000Z,how-to,integrations,0.8,True,"How-to enable integration; typically includes specific configuration steps, settings, and plan prerequisites unique to this integration.",unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-defender-for-storage-data-sensitivity,Enable sensitive data threat detection,Enable sensitive data threat detection - Microsoft Defender for Cloud,Enable and configure sensitive data threat detection for Storage,Learn how to enable and configure sensitive data threat detection in Microsoft Defender for Storage to protect your data and enhance security.,Sensitive data threat detection is enabled by default when you enable Defender for Storage. You can enable it or disable it in the Azure portal or with other at-scale methods. 
This feature is included in the price of Defender for Storage.,2025-07-01T08:00:00.000Z,how-to,configuration,0.6,True,"Explains enabling/disabling sensitive data threat detection via portal and at-scale methods, which typically includes specific switches, policy settings, or API parameters unique to this feature.",unchanged +https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-defender-for-storage-data-sensitivity,Enable sensitive data threat detection,Enable sensitive data threat detection - Microsoft Defender for Cloud,Configure sensitive data threat detection for Defender for Storage,Learn how to enable and configure sensitive data threat detection in Microsoft Defender for Storage to protect your data and enhance security.,Sensitive data threat detection is enabled by default when you enable Defender for Storage. You can enable it or disable it in the Azure portal or with other at-scale methods. This feature is included in the price of Defender for Storage.,2026-04-19T08:00:00.000Z,how-to,security,0.65,True,"Configuration-focused page for enabling/disabling sensitive data threat detection in Defender for Storage. 
While the summary is high-level, the actual doc is expected to include product-specific security configuration steps and options (for example, where and how to toggle the feature, scope of protection), which qualify as security expert knowledge beyond generic concepts.",updated https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-defender-sql-at-scale,Enable Defender for SQL servers on machines at scale,Enable Microsoft Defender for SQL Servers on Machines at scale - Microsoft Defender for Cloud,Enable Defender for SQL on Machines at scale,"Learn how to protect your Microsoft SQL servers on Azure VMs, on-premises, and in hybrid and multicloud environments with Microsoft Defender for Cloud at scale.","Microsoft Defender for Cloud's SQL Servers on Machines component of the Defender for Databases plan protects SQL IaaS and Defender for SQL extensions. The SQL Servers on Machines component identifies and mitigates potential database vulnerabilities while detecting anomalous activity that could indicate threats to your databases. When you enable the SQL Server on machines component of the Defender for Databases plan, the auto-provision process is automatically initiated. The auto-provision process ",2025-05-05T08:00:00.000Z,how-to,deployment,0.65,True,"Describes enabling Defender for SQL on Machines at scale and mentions auto-provisioning; likely includes deployment patterns, requirements, and constraints specific to large environments.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-just-in-time-access,Enable just-in-time access,Enable just-in-time access - Microsoft Defender for Cloud,Enable just-in-time access for Azure virtual machines,Learn how just-in-time VM access (JIT) in Microsoft Defender for Cloud helps you control access to your Azure virtual machines.,"Defender for Servers in Microsoft Defender for Cloud provides a just-in-time machine access feature. 
You can use Microsoft Defender for Cloud's just-in-time access to protect your Azure VMs from unauthorized network access. Many times firewalls contain allow rules that leave your VMs vulnerable to attack. JIT lets you allow access to your VMs only when the access is needed, on the ports needed, and for the period of time needed. In this article, you learn how to set up and use just-in-time access",2025-10-28T22:12:00.000Z,how-to,configuration,0.8,True,"How-to set up and use JIT access; typically includes specific port, time, and policy settings unique to this feature.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-periodic-system-updates,Remediate system updates,Remediate system updates and patches recommendations - Microsoft Defender for Cloud,Remediate system update and patch recommendations in Defender for Cloud,Learn how to enable system updates on your servers to keep them secure and healthy by following the steps provided in this guide to ensure optimal security.,"Microsoft Defender for Cloud provides security recommendations to improve your organizational security posture and reduce risk. An important element in risk reduction is to harden machines across your business environment. As part of the hardening strategy, Defender for Cloud assesses machines to check that the latest system updates and patches are installed, and issues security recommendations if they're not. System updates and patches are crucial for keeping machines secure and healthy. 
Update",2025-02-19T08:00:00.000Z,how-to,best-practices,0.7,True,Guides enabling and managing system updates based on Defender for Cloud recommendations; includes product-specific assessment behavior and recommended remediation steps.,unchanged @@ -294,8 +294,8 @@ https://learn.microsoft.com/en-us/azure/defender-for-cloud/export-to-splunk-or-q https://learn.microsoft.com/en-us/azure/defender-for-cloud/express-configuration-azure-commands,Manage with Azure CLI,Express configuration Azure Command Line Interface (CLI) commands reference - Microsoft Defender for Cloud,SQL VA express configuration Azure CLI commands reference,"In this article, you can review the Express configuration Azure Command Line Interface (CLI) commands reference and copy example scripts to use in your environments.","This article lists the Azure Command Line Interface (CLI) commands that can be used with SQL vulnerability assessment express configuration. The examples in this article should be run in PowerShell; they aren't for use ""as is"" with Bash. Note For Azure CLI reference for the classic configuration, see Manage findings in your Azure SQL databases You can get the list of available scan IDs with this cmdlet -Get SQL vulnerability assessment scan",2024-09-17T17:04:00.000Z,sample,configuration,0.8,True,"Provides Azure CLI command references and examples for SQL VA express configuration, including command names and parameter usage.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/express-configuration-powershell-commands,Example - express configuration,Express configuration PowerShell commands reference - Microsoft Defender for Cloud,SQL VA express configuration PowerShell commands reference,"In this article, you can review the Express configuration PowerShell commands reference and copy example scripts to use in your environments.","This article lists the PowerShell commands that can be used with SQL vulnerability assessment express configuration. 
Make a local copy of the script located on Express configuration PowerShell wrapper module, and save the file with the following file name SqlVulnerabilityAssessmentCommands.psm1, which can be referenced with the following commands: Example 1: Example 2: Example 1: Example 2: Example 1: Example 2: Example 1: Example 2: Example 1: Example 2:",2025-03-30T08:00:00.000Z,sample,configuration,0.8,True,"Lists specific PowerShell commands, parameters, and examples for SQL VA express configuration, which are detailed configuration options.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/express-configuration-sql-commands,Express configuration PowerShell wrapper module,Express configuration PowerShell wrapper module - Microsoft Defender for Cloud,SQL VA express configuration PowerShell wrapper module reference,"In this article, you can review the express configuration SQL vulnerability assessment PowerShell commands reference and copy example scripts to use in your environments.",This article provides the PowerShell wrapper for SQL vulnerability assessment express configuration. We recommend copying the script to a local file with the following file name SqlVulnerabilityAssessmentCommands.psm1. You should use the Express configuration PowerShell commands reference to run the script.,2025-04-01T11:25:00.000Z,sample,configuration,0.78,True,"Reference for a PowerShell module (commands and usage) specific to SQL VA express configuration, including function names and usage patterns.",unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-copilot,Common questions about Copilot in Defender for Cloud,Common questions - Copilot in Microsoft Defender for Cloud - Microsoft Defender for Cloud,,Frequently asked questions about Copilot for Microsoft Defender for Cloud.,Microsoft Defender for Cloud's integration with Microsoft Security Copilot brings the benefits of AI and automation to your security operations. 
Copilot helps you prioritize and respond to security incidents faster and more effectively.,2026-04-15T17:07:00Z,faq,,0.2,False,"Copilot FAQ for Defender for Cloud is mainly descriptive (capabilities, availability, high-level behavior) and does not typically include numeric limits, configuration tables, or detailed troubleshooting content that would qualify as expert knowledge under the given categories.",updated -https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-cspm,Common questions about CSPM,Common questions - cloud security posture management (CSPM) - Microsoft Defender for Cloud,,Frequently asked questions about cloud security posture management (CSPM) for Microsoft Defender for Cloud.,One of Microsoft Defender for Cloud's main pillars for cloud security is Cloud Security Posture Management (CSPM). CSPM provides you with hardening guidance that helps you efficiently and effectively improve your security. CSPM also gives you visibility into your current security situation.,2026-04-15T17:07:00Z,faq,,0.2,False,"FAQ about Defender for Cloud CSPM appears to be conceptual and explanatory (what CSPM is, general benefits, common questions). No indication of specific numeric limits, configuration parameter tables, RBAC role lists, or detailed error-code-based troubleshooting that would qualify as expert knowledge under the defined categories.",updated +https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-copilot,Common questions about Copilot in Defender for Cloud,Common questions - Copilot in Microsoft Defender for Cloud - Microsoft Defender for Cloud,,Frequently asked questions about Copilot for Microsoft Defender for Cloud.,Microsoft Defender for Cloud's integration with Microsoft Security Copilot brings the benefits of AI and automation to your security operations. 
Copilot helps you prioritize and respond to security incidents faster and more effectively.,2026-04-15T17:07:00Z,faq,,0.2,False,"Copilot FAQ for Defender for Cloud is mainly descriptive (capabilities, availability, high-level behavior) and does not typically include numeric limits, configuration tables, or detailed troubleshooting content that would qualify as expert knowledge under the given categories.",unchanged +https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-cspm,Common questions about CSPM,Common questions - cloud security posture management (CSPM) - Microsoft Defender for Cloud,,Frequently asked questions about cloud security posture management (CSPM) for Microsoft Defender for Cloud.,One of Microsoft Defender for Cloud's main pillars for cloud security is Cloud Security Posture Management (CSPM). CSPM provides you with hardening guidance that helps you efficiently and effectively improve your security. CSPM also gives you visibility into your current security situation.,2026-04-15T17:07:00Z,faq,,0.2,False,"FAQ about Defender for Cloud CSPM appears to be conceptual and explanatory (what CSPM is, general benefits, common questions). 
No indication of specific numeric limits, configuration parameter tables, RBAC role lists, or detailed error-code-based troubleshooting that would qualify as expert knowledge under the defined categories.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-defender-for-apis,Common questions about Defender for APIs,Common questions - Defender for APIs - Microsoft Defender for Cloud,,Frequently asked questions about Defender for APIs,,2025-08-10T11:03:00Z,faq,,0.4,False,FAQ for Defender for APIs; likely mostly conceptual answers without detailed config tables or limits.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-defender-for-containers,Common questions about Defender for Containers,Common questions about protecting containers - Microsoft Defender for Cloud,,Get answers to common questions about protecting containers,Get answers to common questions about protecting containers,2025-11-30T23:03:00Z,faq,,0.35,False,"FAQ page; summary doesn’t indicate detailed configuration tables, limits, or troubleshooting mappings, likely mostly conceptual Q&A.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-defender-for-databases,Common questions,Common Questions - Defender for Databases - Microsoft Defender for Cloud,,Get answers to frequently asked questions about Microsoft Defender for Databases.,Get answers to common questions about Microsoft Defender for Databases.,2025-11-30T23:03:00Z,faq,,0.35,False,"FAQ for Defender for Databases; summary suggests general Q&A without detailed error codes, configs, or numeric thresholds.",unchanged @@ -304,9 +304,9 @@ https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-defender-for-serv https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-defender-for-storage,Common questions about Defender for Storage,Common questions - Defender for Storage - Microsoft Defender for Cloud,,Get answers to frequently asked questions about Microsoft Defender for 
Storage.,Get answers to common questions about Microsoft Defender for Storage.,2025-11-30T23:03:00Z,faq,,0.2,False,"FAQ summary only; likely mixed conceptual and pricing questions without clear indication of detailed limits, configs, or error codes from the provided text.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-defender-for-storage-classic,Common questions about Defender for Storage (classic),Common questions - Defender for Storage classic - Microsoft Defender for Cloud,,Get answers to frequently asked questions about Microsoft Defender for Storage classic.,Get answers to common questions about Microsoft Defender for Storage classic.,2025-11-30T23:03:00Z,faq,,0.2,False,"Classic plan FAQ summary only; no evidence of detailed limits, configuration tables, or error-code troubleshooting in the provided text.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-endor-labs,Common questions,Common questions - Endor Labs integration - Microsoft Defender for Cloud,Resolve common issues in Endor Labs integration,Get answers to frequently asked questions about the Endor Labs integration.,Get answers to common questions about the Endor Labs integration.,2025-11-30T23:03:00Z,faq,troubleshooting,0.6,True,"FAQ for a specific integration typically covers concrete issues, behaviors, and resolutions, which are product-specific troubleshooting knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-general,Common questions about Defender for Cloud,Common questions - General questions - Microsoft Defender for Cloud,,"Frequently asked general questions about Microsoft Defender for Cloud, a product that helps you prevent, detect, and respond to threats",,2026-04-14T11:10:00Z,faq,,0.2,False,"General FAQ about Defender for Cloud; primarily conceptual and product-overview style questions without detailed limits, configuration tables, or error-code-based troubleshooting.",updated 
-https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-permissions,Common questions about permissions,Common questions - permissions - Microsoft Defender for Cloud,Understand and assign Defender for Cloud permissions,"This FAQ answers questions about permissions in Microsoft Defender for Cloud, a product that helps you prevent, detect, and respond to threats.",,2026-04-15T17:07:00Z,faq,security,0.7,True,"Permissions FAQ for Defender for Cloud is likely to list specific Azure RBAC roles, required permissions, and scope details unique to the product, which matches the security sub-skill criteria for product-specific role names and access requirements.",updated -https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-regulatory-compliance,Common questions about regulatory compliance,Common questions - regulatory compliance questions - Microsoft Defender for Cloud,,Frequently asked general questions about regulatory compliance,,2026-04-15T17:07:00Z,faq,,0.3,False,"Regulatory compliance FAQ tends to describe supported standards and conceptual compliance behavior; it usually lacks concrete configuration parameters, limits, or detailed troubleshooting mappings required for the defined sub-skill types.",updated +https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-general,Common questions about Defender for Cloud,Common questions - General questions - Microsoft Defender for Cloud,Resolve common Microsoft Defender for Cloud issues,"Frequently asked general questions about Microsoft Defender for Cloud, a product that helps you prevent, detect, and respond to threats",,2026-04-19T22:03:00Z,faq,troubleshooting,0.65,True,"FAQ pages for security products typically include specific error messages, feature behaviors, and product-specific clarifications (for example, what certain alerts mean, how billing or data collection behaves in edge cases). 
These are symptom→explanation mappings that qualify as product-specific troubleshooting knowledge beyond generic concepts.",updated +https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-permissions,Common questions about permissions,Common questions - permissions - Microsoft Defender for Cloud,Understand and assign Defender for Cloud permissions,"This FAQ answers questions about permissions in Microsoft Defender for Cloud, a product that helps you prevent, detect, and respond to threats.",,2026-04-15T17:07:00Z,faq,security,0.7,True,"Permissions FAQ for Defender for Cloud is likely to list specific Azure RBAC roles, required permissions, and scope details unique to the product, which matches the security sub-skill criteria for product-specific role names and access requirements.",unchanged +https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-regulatory-compliance,Common questions about regulatory compliance,Common questions - regulatory compliance questions - Microsoft Defender for Cloud,,Frequently asked general questions about regulatory compliance,,2026-04-15T17:07:00Z,faq,,0.3,False,"Regulatory compliance FAQ tends to describe supported standards and conceptual compliance behavior; it usually lacks concrete configuration parameters, limits, or detailed troubleshooting mappings required for the defined sub-skill types.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-runtime-gated,Frequently asked questions,Gated Deployment FAQ for Defender for Containers - Microsoft Defender for Cloud,,"Find answers to common questions about gated deployment in Defender for Containers, including rule creation, exemptions, and multicloud support.","This FAQ addresses common questions about gated deployment in Microsoft Defender for Containers. 
Gated deployment enforces container image security policies at deployment time in supported Kubernetes environments, based on vulnerability scan results from integrated container registries.",2025-11-26T12:02:00.000Z,concept-article,,0.35,False,FAQ about gated deployment; likely high-level answers without detailed error codes or configuration tables.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/file-integrity-monitoring-enable-defender-endpoint,Enable file integrity monitoring,Enable File Integrity Monitoring - Microsoft Defender for Cloud,Configure File Integrity Monitoring with Defender for Endpoint,Learn how to enable File Integrity Monitoring when you collect data with Microsoft Defender for Endpoint.,"In Defender for Servers Plan 2 in Microsoft Defender for Cloud, the File Integrity Monitoring feature helps to keep enterprise assets and resources secure. It scans and analyzes operating system files, Windows registries, application software, and Linux system files for changes that might indicate an attack. After you enable Defender for Servers Plan 2, follow the instructions in this article to configure File Integrity Monitoring using the Microsoft Defender for Endpoint agent and agentless machi",2026-03-22T08:00:00.000Z,how-to,configuration,0.65,True,Described as instructions to configure File Integrity Monitoring using the Microsoft Defender for Endpoint agent and agentless machines.
This implies product-specific configuration steps and settings rather than just conceptual guidance.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/file-integrity-monitoring-overview,Overview,Overview of file integrity monitoring in Microsoft Defender for Cloud - Microsoft Defender for Cloud,,Learn about tracking file change with file integrity monitoring in Microsoft Defender for Cloud.,"The file integrity monitoring feature in Microsoft Defender for Cloud's Defender for Servers Plan 2, scans operating system files, Windows registries, application software, and Linux system files. It analyzes these files for changes that might indicate an attack. File integrity monitoring helps you to:",2026-03-22T11:08:00.000Z,concept-article,,0.2,False,"High-level overview of file integrity monitoring in Defender for Cloud; no specific configuration parameters, limits, error codes, or detailed settings are indicated in the summary.",unchanged @@ -318,10 +318,10 @@ https://learn.microsoft.com/en-us/azure/defender-for-cloud/github-action,Configu
By following this guide, you:",2026-03-03T23:21:00.000Z,how-to,,0.3,False,"Page is a step-by-step setup guide for integrating GitHub Advanced Security with Defender for Cloud; it appears procedural rather than a reference for limits, configuration matrices, troubleshooting codes, or best-practice gotchas with quantified impact.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/github-advanced-security-overview,Overview,GitHub Advanced Security Integration with Microsoft Defender for Cloud - Microsoft Defender for Cloud,,Learn how GitHub Advanced Security integrates with Microsoft Defender for Cloud to provide unified code-to-cloud security from development to production.,"GitHub Advanced Security (GHAS) integration with Microsoft Defender for Cloud connects your source code repositories to cloud workloads. This integration automatically maps code changes to production environments, prioritizes security alerts based on real runtime context, and enables coordinated remediation workflows between development and security teams. It provides unified security visibility across your development lifecycle. 
Use this integration to: This overview explains how the integratio",2026-03-03T23:21:00.000Z,overview,,0.2,False,"Page is an overview of the GitHub Advanced Security integration with Defender for Cloud; it describes capabilities and benefits but does not expose concrete limits, configuration parameter tables, error codes, or decision matrices with quantified trade-offs.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/governance-rules,Drive remediation with governance rules,Drive Recommendation Remediation by Using Governance Rules - Microsoft Defender for Cloud,Configure governance rules to enforce Defender remediation,Learn how to drive remediation of security recommendations by using governance rules in Microsoft Defender for Cloud.,"Security teams are responsible for improving their organization's security posture, but team members might not always follow through to implement security recommendations. Security teams can set governance rules to help drive accountability and create a service-level agreement (SLA) around the remediation process. For an in-depth discussion around why governance rules are helpful, watch this episode of the Defender for Cloud in the field video series.",2026-03-04T08:00:00.000Z,how-to,security,0.7,True,"Describes how to use Defender for Cloud governance rules to drive remediation of security recommendations, including SLAs and accountability. This is product-specific security posture management and enforcement guidance, not generic governance theory, and aligns best with security-focused configuration and process.",unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/how-to-enable-agentless-containers,Onboard agentless containers for CSPM,How-to enable agentless container posture - Microsoft Defender for Cloud,,Learn how to onboard agentless containers,"Enable agentless container posture in Defender CSPM to gain visibility into Kubernetes clusters and container images without deploying agents.
Agentless container posture is available for Azure, AWS, and GCP environments.",2026-04-16T11:04:00.000Z,how-to,,0.4,False,"Described as a how-to onboarding article for agentless container posture. From the summary it looks like a step-by-step enablement guide without explicit mention of configuration parameter tables, limits, or error codes. Likely procedural onboarding content rather than deep configuration, limits, or troubleshooting guidance.",updated -https://learn.microsoft.com/en-us/azure/defender-for-cloud/how-to-manage-attack-path,Identify and remediate attack paths,Identify and remediate attack paths - Microsoft Defender for Cloud,,Learn how to identify and remediate attack paths in Microsoft Defender for Cloud and enhance the security of your environment.,"Defender for Cloud uses a proprietary algorithm to locate potential attack paths specific to your multicloud environment. Defender for Cloud focuses on real, externally driven and exploitable threats rather than broad scenarios. The algorithm detects attack paths that begin outside your organization and progress to business-critical targets, helping you cut through the noise and act faster. You can use attack path analysis to address security issues that pose immediate threats and have the greates",2025-11-25T14:31:00.000Z,how-to,,0.4,False,"Describes attack path identification conceptually; summary doesn’t show concrete config parameters, limits, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/how-to-manage-cloud-security-explorer,Build queries with Cloud security explorer,Build queries with cloud security explorer - Microsoft Defender for Cloud,,Learn how to build queries with cloud security explorer in Microsoft Defender for Cloud to proactively identify security risks in your cloud environment.,"Defender for Cloud's contextual security capabilities help security teams reduce the risk of significant breaches.
Defender for Cloud uses environmental context to assess security issues, identify the biggest risks, and distinguish them from less risky issues. The cloud security explorer uses snapshot publishing, a method of publishing data at regular intervals known as snapshots. Snapshots ensure that the workload configuration data is refreshed daily, keeping it fresh and accurate. Use the cl",2025-03-31T17:04:00.000Z,how-to,,0.4,False,Cloud security explorer query building; summary suggests conceptual and UI-level guidance without detailed config tables or limits.,unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/how-to-test-attack-path-and-security-explorer-with-vulnerable-container-image,Attack path analysis and enhanced risk-hunting for containers,Attack path analysis and enhanced risk-hunting for containers - Microsoft Defender for Cloud,,Learn how to test attack paths and perform enhanced risk-hunting for containers with cloud security explorer in Microsoft Defender for Cloud,Attack path analysis identifies potential paths that attackers could use to reach high-impact resources in your environment. It analyzes relationships between resources and highlights issues that can be remediated to reduce risk. This article shows how to test attack path analysis by deploying a mock vulnerable container image.,2026-04-16T11:04:00.000Z,how-to,,0.3,False,"Focuses on demonstrating attack path analysis using a mock vulnerable container image. The summary suggests a scenario/tutorial style article, not one organized around error codes, configuration matrices, or quantified limits. 
No clear evidence of product-specific config tables or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/defender-for-cloud/how-to-enable-agentless-containers,Onboard agentless containers for CSPM,How-to enable agentless container posture - Microsoft Defender for Cloud,,Learn how to onboard agentless containers,"Enable agentless container posture in Defender CSPM to gain visibility into Kubernetes clusters and container images without deploying agents. Agentless container posture is available for Azure, AWS, and GCP environments.",2026-04-19T08:00:00.000Z,how-to,,0.3,False,"How-to onboarding for agentless container posture appears procedural; summary does not indicate detailed config tables, limits, or error-code-based troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/defender-for-cloud/how-to-manage-attack-path,Identify and remediate attack paths,Identify and remediate attack paths - Microsoft Defender for Cloud,Use Defender for Cloud attack path analysis,Learn how to identify and remediate attack paths in Microsoft Defender for Cloud and enhance the security of your environment.,"Defender for Cloud uses a proprietary algorithm to locate potential attack paths specific to your multicloud environment. Defender for Cloud focuses on real, externally driven and exploitable threats rather than broad scenarios. The algorithm detects attack paths that begin outside your organization and progress to business-critical targets, helping you cut through the noise and act faster. You can use attack path analysis to address security issues that pose immediate threats and have the greates",2026-04-19T08:00:00.000Z,how-to,security,0.65,True,"Page describes how to identify and remediate attack paths in Microsoft Defender for Cloud, including product-specific security workflows and configuration steps for using the attack path feature.
This is concrete, product-specific security guidance rather than generic concepts.",updated +https://learn.microsoft.com/en-us/azure/defender-for-cloud/how-to-manage-cloud-security-explorer,Build queries with Cloud security explorer,Build queries with cloud security explorer - Microsoft Defender for Cloud,Query security risks with cloud security explorer,Learn how to build queries with cloud security explorer in Microsoft Defender for Cloud to proactively identify security risks in your cloud environment.,"Defender for Cloud's contextual security capabilities help security teams reduce the risk of significant breaches. Defender for Cloud uses environmental context to assess security issues, identify the biggest risks, and distinguish them from less risky issues. The cloud security explorer uses snapshot publishing, a method of publishing data at regular intervals known as snapshots. Snapshots ensure that the workload configuration data is refreshed daily, keeping it fresh and accurate. Use the cl",2026-04-19T08:00:00.000Z,how-to,security,0.6,True,"Page explains how to build and run queries in Defender for Cloud’s cloud security explorer, including product-specific query usage and configuration patterns for proactively identifying risks. This is detailed, feature-specific security configuration/usage guidance.",updated +https://learn.microsoft.com/en-us/azure/defender-for-cloud/how-to-test-attack-path-and-security-explorer-with-vulnerable-container-image,Attack path analysis and enhanced risk-hunting for containers,Attack path analysis and enhanced risk-hunting for containers - Microsoft Defender for Cloud,,Learn how to test attack paths and perform enhanced risk-hunting for containers with cloud security explorer in Microsoft Defender for Cloud,Attack path analysis identifies potential paths that attackers could use to reach high-impact resources in your environment.
It analyzes relationships between resources and highlights issues that can be remediated to reduce risk. This article shows how to test attack path analysis by deploying a mock vulnerable container image.,2026-04-16T11:04:00.000Z,how-to,,0.3,False,"Focuses on demonstrating attack path analysis using a mock vulnerable container image. The summary suggests a scenario/tutorial style article, not one organized around error codes, configuration matrices, or quantified limits. No clear evidence of product-specific config tables or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/iac-template-mapping,Map IaC templates from code to cloud,Map Infrastructure as Code templates from code to cloud - Microsoft Defender for Cloud,,Learn how to map your Infrastructure as Code (IaC) templates to your cloud resources.,"Mapping Infrastructure as Code (IaC) templates to cloud resources helps you ensure consistent, secure, and auditable infrastructure provisioning. It supports rapid response to security threats and a security-by-design approach. You can use mapping to discover misconfigurations in runtime resources. Then, remediate at the template level to help ensure no drift and to facilitate deployment via CI/CD methodology.",2025-06-23T22:26:00.000Z,how-to,,0.4,False,IaC mapping article is more conceptual/feature-usage; summary doesn’t indicate detailed parameter tables or numeric limits.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/iac-vulnerabilities,Discover IaC misconfigurations,Scan for misconfigurations in Infrastructure as Code - Microsoft Defender for Cloud,Configure IaC misconfiguration scanning with Microsoft Security DevOps,Learn how to use Microsoft Security DevOps scanning with Microsoft Defender for Cloud to find misconfigurations in Infrastructure as Code (IaC).,"You can set up Microsoft Security DevOps to scan your connected GitHub repository or Azure DevOps project. 
Use a GitHub action or an Azure DevOps extension to run Microsoft Security DevOps only on your Infrastructure as Code (IaC) source code, and help reduce your pipeline runtime. This article shows you how to apply a template YAML configuration file to scan your connected repository or project specifically for IaC security issues by using Microsoft Security DevOps rules.",2025-05-19T08:00:00.000Z,how-to,configuration,0.75,True,Shows how to apply a template YAML configuration file and run MSDO only on IaC code; includes concrete YAML parameters and patterns unique to this scanning setup.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/identify-ai-workload-model,Discover generative AI workloads,Discover generative AI workloads - Microsoft Defender for Cloud,,Learn how to use the cloud security explorer to determine which AI workloads and models are running in your environment.,"The Defender Cloud Security Posture Management (CSPM) plan in Microsoft Defender for Cloud provides a comprehensive view of your organization's AI Bill of Materials (AI BOM). The instructions in this article explain how to use the cloud security explorer to identify the AI workloads and models that are running in your environment. With the results, you can assess the security posture of the scanned AI workloads.",2025-05-19T08:00:00.000Z,how-to,,0.32,False,Describes using cloud security explorer to discover AI workloads; appears to be a usage guide without detailed configuration tables or numeric thresholds.,unchanged @@ -342,10 +342,10 @@ https://learn.microsoft.com/en-us/azure/defender-for-cloud/internet-exposure-ana https://learn.microsoft.com/en-us/azure/defender-for-cloud/introduction-malware-scanning,Introduction to malware scanning,Introduction to Defender for Storage malware scanning - Microsoft Defender for Cloud,,Discover how malware scanning in Microsoft Defender for Storage enhances security. 
It improves compliance and data integrity by detecting and mitigating threats.,"Malware scanning in Microsoft Defender for Storage improves the security of your Azure Storage accounts by detecting and mitigating malware threats. It uses Microsoft Defender Antivirus (MDAV) to scan your storage content, helping ensure data security and compliance. Defender for Storage offers two types of malware scanning: On-upload malware scanning: Scans blobs automatically when they're uploaded or modified, providing fast detection. This type of scanning is ideal for applications that invol",2025-12-09T18:45:00.000Z,how-to,,0.35,False,Introduction to malware scanning; high-level explanation of on-upload and on-demand modes without detailed configuration tables in the summary.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/investigate-resource-health,Investigate resource health,Tutorial - Investigate the health of your resources - Microsoft Defender for Cloud,,Tutorial: Learn how to investigate the health of your resources using Microsoft Defender for Cloud.,"The resource health page provides a snapshot view of the overall health of a single resource. You can review detailed information about the resource and all recommendations that apply to that resource. Also, if you're using any of the advanced protection plans of Microsoft Defender for Cloud, you can see outstanding security alerts for that specific resource too.
This single page, in Defender for Cloud's portal pages shows: In this tutorial you'll learn how to:",2025-11-25T14:31:00.000Z,tutorial,,0.35,False,Tutorial on investigating resource health; likely UI-driven walkthrough without detailed configuration tables or numeric limits.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/just-in-time-access-overview,Overview,Understand just-in-time virtual machine access - Microsoft Defender for Cloud,,This document explains how just-in-time VM access in Microsoft Defender for Cloud helps you control access to your Azure virtual machines,"Microsoft Defender for Cloud's Defender for Servers Plan 2 offers the just-in-time machine access feature. Just-in-time protects your resources from threat actors actively hunting for machines with open management ports, such as Remote Desktop Protocol (RDP) or Secure Shell (SSH). All machines are potential targets for attacks. Once compromised, a machine can serve as an entry point to further attack resources in the environment. To reduce attack surfaces, minimize open ports, especially managem",2025-03-10T17:02:00.000Z,how-to,,0.45,False,Conceptual overview of just-in-time VM access; summary doesn’t show detailed configuration parameters or decision matrices.,unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/kubernetes-nodes-malware,Review and remediate malware alerts for Kubernetes nodes,Review and remediate malware alerts for Kubernetes nodes - Microsoft Defender for Cloud,Handle malware alerts on Kubernetes nodes in Defender,Learn how to review and remediate malware alerts for Kubernetes nodes in Defender for Containers.,"Defender for Containers uses the Microsoft Defender Antivirus anti-malware engine to scan Kubernetes nodes for malicious files. 
When malware is detected, Defender for Cloud generates security alerts that can be investigated and remediated in Defender for Cloud and Defender XDR.",2026-04-16T11:04:00.000Z,how-to,troubleshooting,0.7,True,"The page describes how Defender for Containers raises and manages malware alerts on Kubernetes nodes, and how to investigate and remediate them in Defender for Cloud and Defender XDR. This is product-specific incident/alert handling (alert → investigation steps → remediation) that fits troubleshooting and goes beyond generic anti-malware concepts.",new +https://learn.microsoft.com/en-us/azure/defender-for-cloud/kubernetes-nodes-malware,Review and remediate malware alerts for Kubernetes nodes,Review and remediate malware alerts for Kubernetes nodes - Microsoft Defender for Cloud,Handle malware alerts on Kubernetes nodes in Defender,Learn how to review and remediate malware alerts for Kubernetes nodes in Defender for Containers.,"Defender for Containers uses the Microsoft Defender Antivirus anti-malware engine to scan Kubernetes nodes for malicious files. When malware is detected, Defender for Cloud generates security alerts that can be investigated and remediated in Defender for Cloud and Defender XDR.",2026-04-16T11:04:00.000Z,how-to,troubleshooting,0.7,True,"The page describes how Defender for Containers raises and manages malware alerts on Kubernetes nodes, and how to investigate and remediate them in Defender for Cloud and Defender XDR. 
This is product-specific incident/alert handling (alert → investigation steps → remediation) that fits troubleshooting and goes beyond generic anti-malware concepts.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/kubernetes-nodes-overview,Overview,Overview of Kubernetes Nodes Protection - Microsoft Defender for Cloud,,Learn about Defender for Containers vulnerability assessment and malware detection for Kubernetes nodes.,"In addition to protecting the Kubernetes cluster control plane and workloads, Defender for Cloud also extends security and compliance over the Kubernetes nodes in the customer's Azure Kubernetes Service (AKS).",2026-03-09T12:18:00.000Z,overview,,0.2,False,"High-level overview of Kubernetes nodes protection; no specific RBAC roles, configuration parameters, limits, or error codes are indicated in the summary.",unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/kubernetes-nodes-va,Review and remediate VA findings for Kubernetes nodes,Review and remediate Kubernetes node vulnerabilities - Microsoft Defender for Cloud,Investigate and fix Defender Kubernetes node vulnerabilities,Learn how to review and remediate vulnerability findings for Kubernetes nodes in Microsoft Defender for Cloud.,"Defender for Cloud scans the VMs that host Kubernetes nodes for vulnerabilities in the operating system and installed software. When vulnerabilities are detected, Defender for Cloud generates recommendations with detailed findings to help you review and remediate them. Reviewing and remediating these vulnerabilities is part of the shared responsibility for maintaining Kubernetes node security.",2026-04-16T11:04:00.000Z,how-to,troubleshooting,0.68,True,"The page is focused on reviewing and remediating specific vulnerability findings for Kubernetes nodes in Defender for Cloud.
These workflows, recommendation details, and product-specific remediation steps constitute expert, product-specific troubleshooting knowledge (symptom → findings → remediation actions), beyond generic security concepts.",new -https://learn.microsoft.com/en-us/azure/defender-for-cloud/kubernetes-workload-protections,Kubernetes data plane hardening,Kubernetes data plane hardening - Microsoft Defender for Cloud,Apply Defender for Cloud Kubernetes data plane hardening,Learn how to use Microsoft Defender for Cloud's set of Kubernetes data plane hardening security recommendations,"This page describes how to use Microsoft Defender for Cloud's set of security recommendations dedicated to Kubernetes data plane hardening. Tip For a list of the security recommendations that might appear for Kubernetes clusters and nodes, review container recommendations.",2026-04-16T11:04:00.000Z,how-to,security,0.68,True,"Page focuses on Defender for Cloud security recommendations specifically for Kubernetes data plane hardening. These are product-specific security controls and recommendations (what to enable, how to use the recommendations) that go beyond generic Kubernetes security concepts, fitting the security sub-skill.",updated +https://learn.microsoft.com/en-us/azure/defender-for-cloud/kubernetes-nodes-va,Review and remediate VA findings for Kubernetes nodes,Review and remediate Kubernetes node vulnerabilities - Microsoft Defender for Cloud,Investigate and fix Defender Kubernetes node vulnerabilities,Learn how to review and remediate vulnerability findings for Kubernetes nodes in Microsoft Defender for Cloud.,"Defender for Cloud scans the VMs that host Kubernetes nodes for vulnerabilities in the operating system and installed software. When vulnerabilities are detected, Defender for Cloud generates recommendations with detailed findings to help you review and remediate them.
Reviewing and remediating these vulnerabilities is part of the shared responsibility for maintaining Kubernetes node security.",2026-04-16T11:04:00.000Z,how-to,troubleshooting,0.68,True,"The page is focused on reviewing and remediating specific vulnerability findings for Kubernetes nodes in Defender for Cloud. These workflows, recommendation details, and product-specific remediation steps constitute expert, product-specific troubleshooting knowledge (symptom → findings → remediation actions), beyond generic security concepts.",unchanged +https://learn.microsoft.com/en-us/azure/defender-for-cloud/kubernetes-workload-protections,Kubernetes data plane hardening,Kubernetes data plane hardening - Microsoft Defender for Cloud,Apply Defender for Cloud Kubernetes data plane hardening,Learn how to use Microsoft Defender for Cloud's set of Kubernetes data plane hardening security recommendations,"This page describes how to use Microsoft Defender for Cloud's set of security recommendations dedicated to Kubernetes data plane hardening. Tip For a list of the security recommendations that might appear for Kubernetes clusters and nodes, review container recommendations.",2026-04-16T11:04:00.000Z,how-to,security,0.68,True,"Page focuses on Defender for Cloud security recommendations specifically for Kubernetes data plane hardening.
These are product-specific security controls and recommendations (what to enable, how to use the recommendations) that go beyond generic Kubernetes security concepts, fitting the security sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/logging-ingestion,Integrate GCP logging,Integrate GCP cloud logging - Microsoft Defender for Cloud,Ingest GCP Cloud Logging into Defender for Cloud via Pub/Sub,Learn how to ingest Google Cloud Platform (GCP) Cloud Logging into Microsoft Defender for Cloud using Pub/Sub.,"Microsoft Defender for Cloud can collect activity logs from Google Cloud Platform (GCP) by ingesting Cloud Logging data through Pub/Sub. These logs provide activity context used by Cloud Infrastructure Entitlement Management (CIEM) in Defender for Cloud, including risk-based recommendations and attack path analysis across your Google Cloud environments. GCP Cloud Logging ingestion is available for individual GCP projects and for GCP organizations that use centralized logging. Learn more about th",2026-02-11T12:03:00.000Z,install-set-up-deploy,integrations,0.7,True,Describes Pub/Sub-based log ingestion; likely includes product-specific integration settings and constraints.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/manage-mcsb,Manage MCSB recommendations,Manage MCSB in Microsoft Defender for Cloud - Microsoft Defender for Cloud,Configure and manage MCSB security standard,Learn how to manage the MCSB standard in Microsoft Defender for Cloud,"Microsoft Defender for Cloud assesses resources against security standards. By default, when you onboard cloud accounts to Defender for Cloud, the Microsoft Cloud Security Benchmark (MCSB) standard is enabled. Defender for Cloud starts assessing the security posture of your resource against controls in the MCSB standard, and issues security recommendations based on the assessments. 
This article describes how you can manage recommendations provided by MCSB.",2025-05-20T08:00:00.000Z,how-to,configuration,0.6,True,"Managing MCSB recommendations likely includes specific controls, policy settings, and configuration options tied to this benchmark.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/manage-respond-alerts,Manage and respond to security alerts,Manage and respond to security alerts - Microsoft Defender for Cloud,,This document helps you to use Microsoft Defender for Cloud capabilities to manage and respond to security alerts.,"Defender for Cloud collects, analyzes, and integrates log data from your Azure, hybrid, and multicloud resources, the network, and connected partner solutions, such as firewalls and endpoint agents. Defender for Cloud uses the log data to detect real threats and reduce false positives. A list of prioritized security alerts is shown in Defender for Cloud along with the information you need to quickly investigate the problem and the steps to take to remediate an attack. This article shows you how ",2025-10-26T22:03:00.000Z,how-to,,0.3,False,Explains how to manage and respond to alerts; summary indicates procedural UI guidance without specific error-code mappings or configuration parameter tables.,unchanged @@ -354,7 +354,7 @@ https://learn.microsoft.com/en-us/azure/defender-for-cloud/monitoring-components https://learn.microsoft.com/en-us/azure/defender-for-cloud/on-demand-malware-scanning,On-demand malware scanning,Microsoft Defender for Storage on-demand malware scanning - Microsoft Defender for Cloud,,Learn about the benefits and features of on-demand malware scanning in Microsoft Defender for Storage.,"On-demand malware scanning in Microsoft Defender for Storage enables you to scan existing blobs and files in your Azure Storage accounts whenever needed. 
This capability provides flexibility to scan stored data in response to evolving security requirements, compliance needs, or security incidents, ensuring your data is continuously protected. By using Microsoft Defender Antivirus with the latest malware definitions, on-demand scanning offers a cloud-native solution. It doesn't require extra infr",2026-03-03T18:21:00.000Z,how-to,,0.3,False,"From the summary, the page is a feature/benefits overview of on-demand malware scanning in Microsoft Defender for Storage. It describes what the capability does and why it's useful, but there's no evidence of specific limits/quotas, configuration parameter tables, RBAC role lists, or error-code-based troubleshooting. Without concrete numeric limits, settings, or security role details, it doesn't meet the expert-knowledge criteria for any sub-skill type.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/on-upload-malware-scanning,On-upload malware scanning,Microsoft Defender for Storage on-upload malware scanning - Microsoft Defender for Cloud,Configure on-upload malware scanning for Azure Storage,Learn how on-upload malware scanning in Microsoft Defender for Storage provides real-time detection and protection against malicious content.,"On-upload malware scanning in Microsoft Defender for Storage automatically scans blobs when they're uploaded or modified, providing fast detection of malicious content. This cloud-native, SaaS-based solution uses Microsoft Defender Antivirus to perform comprehensive malware scans, ensuring your storage accounts remain secure without the need for extra infrastructure or maintenance. 
By integrating on-upload scanning into your storage accounts, you can: Malware upload is a top threat for cloud sto",2025-09-16T08:00:00.000Z,how-to,security,0.66,True,"Describes product-specific behavior and configuration of on-upload scanning using MDAV, including when scans occur and how it integrates with storage accounts.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/onboard-machines-with-defender-for-endpoint,Connect machines with Defender for Endpoint,Onboard non-Azure servers with Defender for Endpoint - Microsoft Defender for Cloud,,Learn how to connect your non-Azure machines directly to Microsoft Defender for Cloud with Microsoft Defender for Endpoint.,"Microsoft Defender for Cloud lets you onboard on-premises and multicloud servers through Microsoft Defender for Endpoint. This setup protects non-Azure machines without the need for any additional agents and shows all servers, Azure and non-Azure, in one unified security view. Note To connect your non-Azure machines via Azure Arc, see Connect your non-Azure machines to Microsoft Defender for Cloud with Azure Arc.",2025-12-22T18:24:00.000Z,quickstart,,0.45,False,How-to for onboarding non-Azure servers via Defender for Endpoint; mostly procedural without detailed config parameter tables.,unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/onboard-management-group,Enable Defender for Cloud for a management group,Onboard a management group - Microsoft Defender for Cloud,Enable Defender for Cloud on management groups via policy,Learn how to use a supplied Azure Policy definition to enable Microsoft Defender for Cloud for all the subscriptions in a management group.,"You can use Azure Policy to enable Microsoft Defender for Cloud on all the Azure subscriptions within the same management group (MG). 
This is more convenient than accessing them individually from the portal, and works even if the subscriptions belong to different owners.",2025-07-15T08:00:00.000Z,how-to,configuration,0.65,True,"Uses Azure Policy definitions to onboard subscriptions; likely includes policy names, parameters, and assignment scopes that are configuration details.",unchanged +https://learn.microsoft.com/en-us/azure/defender-for-cloud/onboard-management-group,Enable Defender for Cloud for a management group,Onboard a management group - Microsoft Defender for Cloud,,Learn how to use a supplied Azure Policy definition to enable Microsoft Defender for Cloud for all the subscriptions in a management group.,"You can use Azure Policy to enable Microsoft Defender for Cloud on all the Azure subscriptions within the same management group (MG). This is more convenient than accessing them individually from the portal, and works even if the subscriptions belong to different owners.",2026-04-19T08:00:00.000Z,how-to,,0.2,False,"Onboarding a management group with Azure Policy is primarily a procedural how-to. It’s unlikely to contain configuration parameter tables, limits, or detailed troubleshooting mappings; it mainly explains using a supplied policy definition to enable Defender for Cloud, which is standard tutorial content.",updated https://learn.microsoft.com/en-us/azure/defender-for-cloud/onboarding-guide-42crunch,Onboard 42Crunch (preview),Technical onboarding guide for 42Crunch (preview) - Microsoft Defender for Cloud,Onboard 42Crunch API security with Defender,Learn how to use 42Crunch with Microsoft Defender.,42Crunch enables a standardized approach to securing APIs that automates the enforcement of API security compliance across distributed development and security teams. The 42Crunch API security platform empowers developers to build security from the integrated development environment (IDE) into the CI/CD pipeline. 
This seamless DevSecOps approach to API security reduces governance costs and accelerates the delivery of secure APIs.,2025-07-15T08:00:00.000Z,how-to,integrations,0.7,True,"Technical onboarding guide will contain concrete configuration parameters, tokens, and integration steps unique to 42Crunch.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/onboarding-guide-bright,Onboard Bright Security (preview),Technical onboarding guide for Bright Security (preview) - Microsoft Defender for Cloud,Connect Bright Security DAST with Defender,Learn how to use Bright Security with Microsoft Defender for Cloud to enhance your application security testing.,"Bright provides a developer-centric enterprise Dynamic Application Security Testing (DAST) solution. It scans applications and APIs from the outside-in, mimicking how a hacker would approach the application, and automatically tests for vulnerabilities that bad actors could use to exploit. Unlike legacy DAST tools designed exclusively for expert security users after the application is already in production, Bright’s tool was built to be ""developer-first."" It was designed to empower developers to ",2025-07-15T08:00:00.000Z,how-to,integrations,0.7,True,Technical onboarding for Bright Security will include specific configuration options and integration parameters.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/onboarding-guide-stackhawk,Onboard StackHawk (preview),Technical onboarding guide for StackHawk (preview) - Microsoft Defender for Cloud,Integrate StackHawk testing with Defender for Cloud,Learn how to use StackHawk with Microsoft Defender for Cloud to enhance your application security testing.,StackHawk makes API and application security testing part of software delivery. 
The StackHawk platform offers engineering teams the ability to find and fix application bugs at any stage of software development and gives Security teams insight into the security posture of applications and APIs being developed.,2025-07-15T08:00:00.000Z,how-to,integrations,0.7,True,"Onboarding StackHawk requires product-specific configuration, endpoints, and parameter values.",unchanged @@ -368,12 +368,12 @@ https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-ser https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-servers-agents,Understand data collection,Understand Defender for Servers data collection in Microsoft Defender for Cloud - Microsoft Defender for Cloud,Understand Defender for Servers data collection design,Understand how the Defender for Servers plan collects data.,This article helps you to understand how Defender for Servers in Microsoft Defender for Cloud collects data for assessment.,2025-02-19T08:00:00.000Z,concept-article,architecture-patterns,0.7,True,"Explains how data is collected for assessment; likely details which agents, signals, and pipelines are used in different environments, which is product-specific architecture/pattern information.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-servers-data-workspace,Understand data storage and workspaces,Plan Defender for Servers data residency - Microsoft Defender for Cloud,Plan Defender for Servers data residency and workspaces,Review data residency and workspace design for Microsoft Defender for Servers.,"This article helps you understand how data is stored in Microsoft Defender for Cloud, and when you need a Log Analytics workspace. 
Defender for Servers is one of the paid plans provided by Microsoft Defender for Cloud.",2025-02-19T08:00:00.000Z,concept-article,architecture-patterns,0.7,True,"Covers data residency and Log Analytics workspace design; typically includes guidance on when to use single vs multiple workspaces and trade-offs, which is architecture/decision pattern content.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-servers-roles,Plan roles and permissions,Plan Defender for Servers roles and permissions - Microsoft Defender for Cloud,Configure roles and permissions for Defender for Servers,Review roles and permissions for Microsoft Defender for Servers.,This article helps you understand how to control access to Defender for Servers. Defender for Servers is one of the paid plans provided by Microsoft Defender for Cloud.,2025-02-19T08:00:00.000Z,concept-article,security,0.8,True,"Roles and permissions article almost certainly lists specific Azure RBAC role names, scopes, and what each can do in Defender for Servers, which is product-specific security configuration.",unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-servers-scale,Scale a deployment,Scale a Defender for Servers deployment - Microsoft Defender for Cloud,Scale Microsoft Defender for Servers across environments,"Scale protection of Azure, AWS, GCP, and on-premises servers by using Microsoft Defender for Servers.",This article helps you scale your Microsoft Defender for Servers deployment. 
Defender for Servers is one of the paid plans provided by Microsoft Defender for Cloud.,2025-02-19T08:00:00.000Z,concept-article,architecture-patterns,0.65,True,"Scaling deployment across Azure, AWS, GCP, and on-premises usually involves product-specific architectural patterns and constraints for large environments.",unchanged +https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-servers-scale,Scale a deployment,Scale a Defender for Servers deployment - Microsoft Defender for Cloud,Scale Microsoft Defender for Servers across environments,"Scale protection of Azure, AWS, GCP, and on-premises servers by using Microsoft Defender for Servers.",This article helps you scale your Microsoft Defender for Servers deployment. Defender for Servers is one of the paid plans provided by Microsoft Defender for Cloud.,2026-04-19T08:00:00.000Z,concept-article,deployment,0.68,True,"Scaling Defender for Servers across Azure, AWS, GCP, and on-prem typically involves plan- and platform-specific deployment patterns and constraints (for example, which onboarding/deployment methods are supported per environment and SKU). This kind of article usually contains concrete guidance on how to roll out protection at scale, which is deployment-focused and product-specific rather than conceptual. 
Even without seeing the full text, the focus on 'scale your deployment' and multi-cloud/on-prem coverage strongly indicates expert deployment knowledge.",updated https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-servers-select-plan,Choose a plan and deployment scope,Select a Defender for Servers plan - Microsoft Defender for Cloud,Choose the right Defender for Servers plan,This article helps you understand which Defender for Servers plan to deploy in Microsoft Defender for Cloud.,This article helps you understand which Defender for Servers plan to deploy in Microsoft Defender for Cloud.,2025-11-02T08:00:00.000Z,concept-article,decision-making,0.85,True,"Explicitly about selecting which Defender for Servers plan to deploy; such pages typically include comparison tables, capabilities per plan, and recommendations for scenarios, which is decision-making guidance.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/policy-reference,Azure Policy built-ins,Built-in policy definitions - Microsoft Defender for Cloud,Use built-in Azure Policy definitions for Defender for Cloud,Lists Azure Policy built-in policy definitions for Microsoft Defender for Cloud. These built-in policy definitions provide common approaches to managing your Azure resources.,"This page is an index of Azure Policy built-in policy definitions related to Microsoft Defender for Cloud. The following groupings of policy definitions are available: For more information about security policies, see Working with security policies. For other Azure Policy built-ins for other services, see Azure Policy built-in definitions. The name of each built-in policy definition links to the policy definition in the Azure portal. 
Use the link in the Version column to view the source on the Azure Po",2025-03-26T08:00:00.000Z,reference,configuration,0.7,True,"Lists specific built-in policy definitions tied to Defender for Cloud; these are concrete configuration artifacts (names, effects, scopes) that are product-specific and not generic knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/powershell-onboarding,Automate onboarding using PowerShell,Onboard with PowerShell - Microsoft Defender for Cloud,Onboard Defender for Cloud using PowerShell,This document walks you through the process of enabling Microsoft Defender for Cloud with PowerShell cmdlets.,"You can secure your Azure workloads programmatically, using the Microsoft Defender for Cloud PowerShell module. Using PowerShell enables you to automate tasks and avoid the human error inherent in manual tasks. This is especially useful in large-scale deployments that involve dozens of subscriptions with hundreds and thousands of resources, all of which must be secured from the beginning. Onboarding Microsoft Defender for Cloud using PowerShell enables you to programmatically automate onboarding",2025-08-11T11:03:00.000Z,quickstart,configuration,0.7,True,"PowerShell onboarding uses specific cmdlets, parameters, and scripts that are detailed configuration knowledge for this product.",unchanged +https://learn.microsoft.com/en-us/azure/defender-for-cloud/powershell-onboarding,Automate onboarding using PowerShell,Onboard with PowerShell - Microsoft Defender for Cloud,Automate Defender for Cloud onboarding with PowerShell,This document walks you through the process of enabling Microsoft Defender for Cloud with PowerShell cmdlets.,"You can secure your Azure workloads programmatically, using the Microsoft Defender for Cloud PowerShell module. Using PowerShell enables you to automate tasks and avoid the human error inherent in manual tasks. 
This is especially useful in large-scale deployments that involve dozens of subscriptions with hundreds and thousands of resources, all of which must be secured from the beginning. Onboarding Microsoft Defender for Cloud using PowerShell enables you to programmatically automate onboarding",2026-04-19T08:00:00.000Z,quickstart,integrations,0.6,True,"PowerShell onboarding docs for a specific Azure service typically list concrete cmdlets, parameter names, and required values (for example, enabling specific Defender plans across subscriptions). These are product-specific integration/coding patterns for using the Defender for Cloud PowerShell module, matching the integrations category.",updated https://learn.microsoft.com/en-us/azure/defender-for-cloud/powershell-sample-vulnerability-assessment-azure-sql,Enable vulnerability assessments on Azure SQL databases with the express configuration,PowerShell script sample - Enable vulnerability assessment on a SQL server - Microsoft Defender for Cloud,PowerShell script to enable SQL VA express configuration,"In this article, learn how to enable vulnerability assessments on Azure SQL databases with the express configuration using a PowerShell script.","This PowerShell script enables the express configuration of vulnerability assessments on an Azure SQL Server. If vulnerability assessment has already been configured using the classic configuration, this script migrates it to the express configuration and copies all of the pre-existing baseline definitions. Your scan history isn't copied over to the new configuration. 
Your scan history remains accessible on the storage account that was previously used by the classic configuration.",2025-05-25T08:00:00.000Z,sample,configuration,0.7,True,"Provides a concrete PowerShell script with parameters and behavior for enabling/migrating SQL VA express configuration, which is specific configuration automation.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/powershell-sample-vulnerability-assessment-baselines,Set up baselines for vulnerability assessments on Azure SQL databases,PowerShell script sample - Set up baselines on Azure SQL databases - Microsoft Defender for Cloud,PowerShell script to set SQL VA baselines,"In this article, learn how to set up baselines for vulnerability assessments on Azure SQL databases using a PowerShell script.","This PowerShell script sets up baselines based on latest vulnerability assessment scan results for all databases in an Azure SQL Server. This sample requires Azure PowerShell Az 1.0 or later. Run Get-Module -ListAvailable Az to see which versions are installed. @@ -385,18 +385,18 @@ https://learn.microsoft.com/en-us/azure/defender-for-cloud/privacy,Manage user d https://learn.microsoft.com/en-us/azure/defender-for-cloud/protect-network-resources,Protect network resources,Protecting your network resources - Microsoft Defender for Cloud,Apply Defender networking recommendations for Azure,This document addresses recommendations in Microsoft Defender for Cloud that help you protect your Azure network resources and stay in compliance with security policies.,"Microsoft Defender for Cloud continuously analyzes the security state of your Azure resources for network security best practices. When Defender for Cloud identifies potential security vulnerabilities, it creates recommendations that guide you through the process of configuring the needed controls to harden and protect your resources. Review Defender for Cloud networking recommendations. 
This article addresses recommendations that apply to your Azure resources from a network security perspective.",2025-12-23T12:17:00.000Z,concept-article,best-practices,0.6,True,Network protection article addresses concrete Defender recommendations and configuration steps for Azure network resources.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/query-software-bill-of-materials,Query software bill of materials (SBOM),Query software bill of materials (SBOM) - Microsoft Defender for Cloud,Query SBOM data in Defender for Cloud using Cloud Security Explorer,Learn how to query Software Bill of Materials (SBOM) results in Microsoft Defender for Cloud's Cloud Security Explorer.,"Microsoft Defender for Cloud's DevOps Security agentless scanning capabilities automatically generate a Software Bill of Materials (SBOM) for connected code repositories. When a scan finishes, the process publishes the repository and identified packages to the cloud security graph. You can use Defender for Cloud's cloud security explorer to query this data. By using the cloud security explorer, you can locate specific packages (dependencies) and identify exactly which repositories use them. 
Use thi",2026-01-18T12:03:00.000Z,how-to,configuration,0.7,True,"Describes how to query SBOM results via cloud security explorer; includes specific query patterns, fields, and relationships unique to Defender for Cloud's SBOM graph.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/quickstart-automation-alert,Create autoresponses to alerts using an ARM template or Bicep,Create a security automation for specific security alerts by using an Azure Resource Manager template (ARM template) or Bicep - Microsoft Defender for Cloud,,"Learn how to create a Microsoft Defender for Cloud automation to trigger a logic app, which will be triggered by specific Defender for Cloud alerts by using an Azure Resource Manager template (ARM tem","In this quickstart, you'll learn how to use an Azure Resource Manager template (ARM template) or a Bicep file to create a workflow automation. The workflow automation will trigger a logic app when specific security alerts are received by Microsoft Defender for Cloud.",2025-05-18T08:00:00.000Z,quickstart,,0.4,False,"Quickstart using ARM/Bicep; typically step-by-step deployment/automation example, not a comprehensive configuration reference.",unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/quickstart-onboard-aws,Connect your AWS account,Connect your AWS account - Microsoft Defender for Cloud,,"Defend your AWS resources with Microsoft Defender for Cloud, a guide to set up and configure Defender for Cloud to protect your workloads in AWS.","Microsoft Defender for Cloud helps protect workloads running in Amazon Web Services (AWS). To assess your AWS resources and get security recommendations, you need to connect your AWS account to Defender for Cloud. The connector gathers configuration and security signals from AWS services. By using this information, Defender for Cloud can analyze posture, generate recommendations, and surface alerts. 
For more information, watch the New AWS connector in Defender for Cloud video from the Defender for ",2026-01-06T12:02:00.000Z,install-set-up-deploy,,0.45,False,"Quickstart for connecting AWS account; mostly procedural onboarding, not a deep configuration or troubleshooting reference.",unchanged +https://learn.microsoft.com/en-us/azure/defender-for-cloud/quickstart-onboard-aws,Connect your AWS account,Connect your AWS account - Microsoft Defender for Cloud,,"Defend your AWS resources with Microsoft Defender for Cloud, a guide to set up and configure Defender for Cloud to protect your workloads in AWS.","Microsoft Defender for Cloud helps protect workloads running in Amazon Web Services (AWS). To assess your AWS resources and get security recommendations, you need to connect your AWS account to Defender for Cloud. The connector gathers configuration and security signals from AWS services. By using this information, Defender for Cloud can analyze posture, generate recommendations, and surface alerts. For more information, watch the New AWS connector in Defender for Cloud video from the Defender for ",2026-04-19T08:00:00.000Z,install-set-up-deploy,,0.3,False,"Quickstart for connecting an AWS account to Defender for Cloud; primarily a step-by-step onboarding/tutorial without detailed configuration parameter tables, limits, or troubleshooting mappings.",updated https://learn.microsoft.com/en-us/azure/defender-for-cloud/quickstart-onboard-devops,Connect Azure DevOps environments,Connect your Azure DevOps organizations - Microsoft Defender for Cloud,,Learn how to connect your Azure DevOps environment to Defender for Cloud.,"This page provides a simple onboarding experience to connect Azure DevOps environments to Microsoft Defender for Cloud, and automatically discover Azure DevOps repositories. 
By connecting your Azure DevOps environments to Defender for Cloud, you extend the security capabilities of Defender for Cloud to your Azure DevOps resources and improve security posture. Learn more.",2025-09-03T17:16:00.000Z,quickstart,,0.35,False,Quickstart onboarding Azure DevOps; likely a simple connector setup without detailed parameter tables or troubleshooting mappings.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/quickstart-onboard-gcp,Connect your GCP project,Connect your GCP project - Microsoft Defender for Cloud,,Defend your GCP resources by using Microsoft Defender for Cloud. Protect your workloads and enhance your cloud security with our comprehensive solution.,"Microsoft Defender for Cloud provides security posture management and threat protection for workloads running in Google Cloud Platform (GCP). This article shows you how to connect a GCP project or organization to Microsoft Defender for Cloud so Microsoft Defender for Cloud can discover resources, assess security posture, and surface security recommendations and alerts.",2026-02-11T12:03:00.000Z,install-set-up-deploy,,0.45,False,Quickstart for connecting GCP project; mainly onboarding steps without detailed config matrices or limits.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/quickstart-onboard-github,Connect GitHub environments,Connect your GitHub organizations - Microsoft Defender for Cloud,,Learn how to connect your GitHub Environment to Defender for Cloud and enhance the security of your GitHub resources.,"In this quick start, you connect your GitHub organizations on the Environment settings page in Microsoft Defender for Cloud. This page provides a simple onboarding experience to autodiscover your GitHub repositories. 
By connecting your GitHub environments to Defender for Cloud, you extend the security capabilities of Defender for Cloud to your GitHub resources and improve security posture. Learn more.",2025-11-12T23:03:00.000Z,quickstart,,0.35,False,Quickstart for connecting GitHub organizations; mainly onboarding steps without deep configuration or troubleshooting content.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/quickstart-onboard-gitlab,Connect GitLab environments,Connect your GitLab groups - Microsoft Defender for Cloud,,Learn how to connect your GitLab Environment to Defender for Cloud.,"In this quickstart, you connect your GitLab groups on the Environment settings page in Microsoft Defender for Cloud. This page provides a simple onboarding experience to automatically discover your GitLab resources. By connecting your GitLab groups to Defender for Cloud, you extend the security capabilities of Defender for Cloud to your GitLab resources. These features include: Foundational Cloud Security Posture Management (CSPM) features: You can assess your GitLab security posture through GitLa",2026-01-23T23:18:00.000Z,quickstart,,0.35,False,"Quickstart for connecting GitLab groups; summary suggests straightforward onboarding, not detailed configuration or error-resolution mappings.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/quickstart-onboard-machines,Connect machines with Azure Arc,Connect on-premises machines - Microsoft Defender for Cloud,,Learn how to connect your non-Azure machines to Microsoft Defender for Cloud and monitor their security posture using Azure Arc and Defender for Endpoint.,"Microsoft Defender for Cloud monitors the security posture of non-Azure machines, but first you need to connect them to Azure. Connect non-Azure computers in any of the following ways: This article describes the methods for onboarding with Azure Arc. 
If you're connecting machines from other cloud providers, see Connect your AWS account or Connect your GCP project. The multicloud connectors for Amazon Web Services (AWS) and Google Cloud Platform (GCP) in Defender for Cloud handle the Azure Arc deplo",2025-03-13T08:00:00.000Z,install-set-up-deploy,,0.45,False,Quickstart for connecting on-premises machines via Azure Arc; step-by-step onboarding rather than configuration reference.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-ai,AI recommendations,Reference table for all AI security recommendations in Microsoft Defender for Cloud - Microsoft Defender for Cloud,Use Defender for Cloud AI security recommendations,This article lists all Microsoft Defender for Cloud AI security recommendations that help you harden and protect your resources.,"This article lists all the AI security recommendations you might see in Microsoft Defender for Cloud. The recommendations that appear in your environment are based on the resources that you're protecting and on your customized configuration. You can see the recommendations in the portal that apply to your resources. 
To learn about actions that you can take in response to these recommendations, see Remediate recommendations in Defender for Cloud.",2026-02-09T12:03:00.000Z,reference,security,0.75,True,Reference table of AI security recommendations with specific recommendation IDs and semantics; product-specific security guidance not derivable from general training.,unchanged
-https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-api,API recommendations,Reference table for all API security recommendations in Microsoft Defender for Cloud - Microsoft Defender for Cloud,Apply Defender for Cloud API security recommendations,This article lists all Microsoft Defender for Cloud API security recommendations that help you harden and protect your resources.,"This article lists all the API/API management security recommendations you might see in Microsoft Defender for Cloud. The recommendations that appear in your environment are based on the resources that you're protecting and on your customized configuration. You can see the recommendations in the portal that apply to your resources. To learn about actions that you can take in response to these recommendations, see Remediate recommendations in Defender for Cloud.",2025-06-25T17:25:00.000Z,reference,security,0.75,True,Reference table of API/API Management security recommendations; contains concrete recommendation definitions unique to Defender for Cloud.,unchanged
+https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-api,API recommendations,Reference table for all API security recommendations in Microsoft Defender for Cloud - Microsoft Defender for Cloud,Use Defender for Cloud API security recommendations,This article lists all Microsoft Defender for Cloud API security recommendations that help you harden and protect your resources.,"This article lists all the API/API management security recommendations you might see in Microsoft Defender for Cloud. 
The recommendations that appear in your environment are based on the resources that you're protecting and on your customized configuration. You can see the recommendations in the portal that apply to your resources. To learn about actions that you can take in response to these recommendations, see Remediate recommendations in Defender for Cloud.",2026-04-19T22:03:00.000Z,reference,security,0.65,True,"Reference table of all Defender for Cloud API security recommendations is product-specific security guidance. It enumerates concrete recommendation identifiers and their meanings for API/API Management resources, which are unique to this service and used to harden configurations.",updated https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-app-services,Azure App Service recommendations,Reference table for Azure App Service security recommendations - Microsoft Defender for Cloud,Use Defender for Cloud security recommendations for Azure App Service,This article lists the Microsoft Defender for Cloud security recommendations for Azure App Service.,"This article lists all the security recommendations you might see issued by the Microsoft Defender for Cloud plan - Microsoft Defender for Cloud for Azure App Service. The recommendations that appear in your environment are based on the resources that you're protecting and on your customized configuration. You can see the recommendations in the portal that apply to your resources. To learn about actions that you can take in response to these recommendations, see Remediate recommendations in Defende",2026-03-30T11:04:00.000Z,reference,security,0.78,True,"Reference table of Defender for Cloud recommendations is product-specific security guidance, mapping concrete recommendation IDs/names to Azure App Service scenarios. 
While not classic RBAC/parameter docs, it is expert security configuration knowledge unique to Defender for Cloud and App Service.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-compute,Compute recommendations,Reference table for all compute security recommendations in Microsoft Defender for Cloud - Microsoft Defender for Cloud,Apply Defender for Cloud compute security recommendations,This article lists all Microsoft Defender for Cloud compute security recommendations that help you harden and protect your resources.,"This article lists all the multicloud compute security recommendations you might see in Microsoft Defender for Cloud. The recommendations that appear in your environment are based on the resources that you're protecting and on your customized configuration. You can see the recommendations in the portal that apply to your resources. To learn about actions that you can take in response to these recommendations, see Remediate recommendations in Defender for Cloud. Tip If a recommendation description s",2026-03-30T08:00:00.000Z,reference,security,0.78,True,Lists all Defender for Cloud compute security recommendations with product-specific guidance for hardening VMs and related resources. This is specialized security configuration knowledge tied to Defender for Cloud’s recommendation model.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-container,Container recommendations,Reference table for all container security recommendations in Microsoft Defender for Cloud - Microsoft Defender for Cloud,Apply Defender for Cloud container security recommendations,This article lists all Microsoft Defender for Cloud container security recommendations that help you harden and protect your resources.,"This article lists all the container security recommendations you might see in Microsoft Defender for Cloud. 
The recommendations that appear in your environment are based on the resources that you're protecting and on your customized configuration. You can see the recommendations in the portal that apply to your resources. Tip If a recommendation description says No related policy, usually it's because that recommendation is dependent on a different recommendation. For example, the recommendation En",2026-03-29T08:00:00.000Z,reference,security,0.78,True,"Provides a full reference of container security recommendations in Defender for Cloud, which is specialized security guidance for containerized workloads beyond generic best practices.",unchanged
-https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-data,Data recommendations,Reference table for all data security recommendations in Microsoft Defender for Cloud - Microsoft Defender for Cloud,Use Defender for Cloud data security recommendations,This article lists all Microsoft Defender for Cloud data security recommendations that help you harden and protect your resources.,"This article lists all the data security recommendations you might see in Microsoft Defender for Cloud. The recommendations that appear in your environment are based on the resources that you're protecting and on your customized configuration. You can see the recommendations in the portal that apply to your resources. To learn about actions that you can take in response to these recommendations, see Remediate recommendations in Defender for Cloud. Tip If a recommendation description says No related ",2026-04-15T08:00:00.000Z,reference,security,0.7,True,"The page is a reference table of all Microsoft Defender for Cloud data security recommendations. It enumerates product-specific recommendation IDs, names, and details that are unique to Defender for Cloud’s security model. 
This is expert, product-specific security guidance rather than generic concepts, but it does not focus on limits, configuration parameters, or troubleshooting patterns.",updated
+https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-data,Data recommendations,Reference table for all data security recommendations in Microsoft Defender for Cloud - Microsoft Defender for Cloud,Use Defender for Cloud data security recommendations,This article lists all Microsoft Defender for Cloud data security recommendations that help you harden and protect your resources.,"This article lists all the data security recommendations you might see in Microsoft Defender for Cloud. The recommendations that appear in your environment are based on the resources that you're protecting and on your customized configuration. You can see the recommendations in the portal that apply to your resources. To learn about actions that you can take in response to these recommendations, see Remediate recommendations in Defender for Cloud. Tip If a recommendation description says No related ",2026-04-15T08:00:00.000Z,reference,security,0.7,True,"The page is a reference table of all Microsoft Defender for Cloud data security recommendations. It enumerates product-specific recommendation IDs, names, and details that are unique to Defender for Cloud’s security model. 
This is expert, product-specific security guidance rather than generic concepts, but it does not focus on limits, configuration parameters, or troubleshooting patterns.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-deprecated,Deprecated recommendations,Reference table for all deprecated security recommendations in Microsoft Defender for Cloud - Microsoft Defender for Cloud,Review deprecated Defender for Cloud security recommendations,This article lists all Microsoft Defender for Cloud deprecated security recommendations that help you harden and protect your resources.,This article lists all the deprecated security recommendations in Microsoft Defender for Cloud.,2025-05-18T08:00:00.000Z,reference,security,0.7,True,Lists deprecated security recommendations with their identifiers; this is product-specific security metadata useful for posture management.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-devops,DevOps recommendations,Reference table for all security recommendations for DevOps - Microsoft Defender for Cloud,Apply Defender for Cloud DevOps security recommendations,This article lists all Microsoft Defender for Cloud security recommendations that help you harden and protect your DevOps resources.,"This article lists the recommendations you might see in Microsoft Defender for Cloud if you connect an Azure DevOps, GitHub, or GitLab environment by using the Environment settings page. The recommendations that appear in your environment are based on the resources that you're protecting and on your customized configuration. You can see the recommendations in the portal that apply to your resources. 
To learn about actions that you can take in response to these recommendations, see Remediate recommendatio",2025-05-18T08:00:00.000Z,reference,security,0.75,True,"Lists DevOps security recommendations for Azure DevOps, GitHub, GitLab; specific recommendation semantics are expert security configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-identity-access,Identity and access recommendations,Reference table for all identity and access security recommendations in Microsoft Defender for cloud - Microsoft Defender for Cloud,Implement identity and access recommendations in Defender for Cloud,This article lists all Microsoft Defender for Cloud identity and access security recommendations that help you harden and protect your resources.,"This article lists all the identity and access security recommendations you might see in Microsoft Defender for Cloud. The recommendations that appear in your environment are based on the resources that you're protecting and on your customized configuration. You can see the recommendations in the portal that apply to your resources. To learn about actions that you can take in response to these recommendations, see Remediate recommendations in Defender for Cloud. Tip If a recommendation description ",2026-03-29T08:00:00.000Z,reference,security,0.8,True,"Focused on identity and access security recommendations, likely including specific Azure policy and identity configurations. 
This is specialized IAM/security guidance aligned with Defender for Cloud’s recommendation engine.",unchanged @@ -404,14 +404,14 @@ https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-refer
https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-keyvault,Keyvault recommendations,Reference table for all Keyvault security recommendations in Microsoft Defender for Cloud - Microsoft Defender for Cloud,Use Defender for Cloud Key Vault security recommendations,This article lists all Microsoft Defender for Cloud Keyvault security recommendations that help you harden and protect your resources.,"This article lists all the Keyvault security recommendations you might see in Microsoft Defender for Cloud. The recommendations that appear in your environment are based on the resources that you're protecting and on your customized configuration. You can see the recommendations in the portal that apply to your resources. To learn about actions that you can take in response to these recommendations, see Remediate recommendations in Defender for Cloud. Tip If a recommendation description says No rela",2025-06-16T05:08:00.000Z,reference,security,0.8,True,Lists Key Vault security recommendations with specific definitions; specialized security configuration knowledge for Key Vault in Defender for Cloud.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-networking,Networking recommendations,Reference table for all networking security recommendations - Microsoft Defender for Cloud,Apply Defender for Cloud networking security recommendations,This article lists all Microsoft Defender for Cloud networking security recommendations that help you harden and protect your resources.,"This article lists all the networking security recommendations you might see in Microsoft Defender for Cloud. 
The recommendations that appear in your environment are based on the resources that you're protecting and on your customized configuration. You can see the recommendations in the portal that apply to your resources. To learn about actions that you can take in response to these recommendations, see Remediate recommendations in Defender for Cloud. Tip If a recommendation description says No re",2026-03-30T11:04:00.000Z,reference,security,0.78,True,"Lists all networking security recommendations (NSGs, firewalls, etc.) in Defender for Cloud. These are product-specific security hardening instructions beyond generic networking security concepts.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-serverless-protection,Serverless protection recommendations,Reference table for serverless protection security recommendations - Microsoft Defender for Cloud,Use Defender for Cloud serverless protection recommendations,This article lists the Microsoft Defender for Cloud security recommendations for serverless protection.,"This article lists all the security recommendations you might see issued by the Microsoft Defender for Cloud plan - Defender Cloud Security Posture Management (CSPM) for serverless protection. The recommendations that appear in your environment are based on the resources that you're protecting and on your customized configuration. You can see the recommendations in the portal that apply to your resources. 
To learn about actions that you can take in response to these recommendations, see Remediate r",2025-12-01T12:02:00.000Z,reference,security,0.75,True,Reference table of serverless protection recommendations; product-specific security guidance for serverless workloads.,unchanged
-https://learn.microsoft.com/en-us/azure/defender-for-cloud/regional-availability,Regional availability for Defender for Cloud plans,Microsoft Defender for Cloud Regional Availability - Microsoft Defender for Cloud,Check Defender for Cloud regional and plan availability,"Discover the regional availability of Microsoft Defender for Cloud plans across Azure, AWS, and GCP. Find supported services by region and platform.","Important All Microsoft Defender for Cloud features will be officially retired in the Azure in China region on August 18, 2026. Due to this upcoming retirement, Azure in China customers are no longer able to onboard new subscriptions to the service. A new subscription is any subscription that was not already onboarded to the Microsoft Defender for Cloud service prior to August 18, 2025, the date of the retirement announcement. For more information on the retirement, see Microsoft Defender for Clo",2026-04-14T08:00:00.000Z,concept-article,decision-making,0.7,True,"Page provides region-by-region availability of Microsoft Defender for Cloud plans and supported services across Azure, AWS, and GCP, including specific retirement dates and onboarding restrictions for Azure China. 
This is expert, product-specific data used to decide where and how the service can be used, fitting decision-making around regional support and lifecycle rather than generic limits or configuration.",updated
+https://learn.microsoft.com/en-us/azure/defender-for-cloud/regional-availability,Regional availability for Defender for Cloud plans,Microsoft Defender for Cloud Regional Availability - Microsoft Defender for Cloud,Check Defender for Cloud regional and plan availability,"Discover the regional availability of Microsoft Defender for Cloud plans across Azure, AWS, and GCP. Find supported services by region and platform.","Important All Microsoft Defender for Cloud features will be officially retired in the Azure in China region on August 18, 2026. Due to this upcoming retirement, Azure in China customers are no longer able to onboard new subscriptions to the service. A new subscription is any subscription that was not already onboarded to the Microsoft Defender for Cloud service prior to August 18, 2025, the date of the retirement announcement. For more information on the retirement, see Microsoft Defender for Clo",2026-04-14T08:00:00.000Z,concept-article,decision-making,0.7,True,"Page provides region-by-region availability of Microsoft Defender for Cloud plans and supported services across Azure, AWS, and GCP, including specific retirement dates and onboarding restrictions for Azure China. 
This is expert, product-specific data used to decide where and how the service can be used, fitting decision-making around regional support and lifecycle rather than generic limits or configuration.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/regulatory-compliance-dashboard,Improve regulatory compliance,Improve regulatory compliance in Microsoft Defender for Cloud - Microsoft Defender for Cloud,,Learn how to improve regulatory compliance in Microsoft Defender for Cloud.,"Microsoft Defender for Cloud helps you to meet regulatory compliance requirements by continuously assessing resources against compliance controls, and identifying issues that are blocking you from achieving a particular compliance certification. In the Regulatory compliance dashboard, you manage and interact with compliance standards. You can see which compliance standards are assigned, turn standards on and off for Azure, AWS, and GCP, review the status of assessments against standards, and mor",2025-07-15T08:00:00.000Z,tutorial,,0.3,False,"Regulatory compliance dashboard usage; summary suggests conceptual and UI guidance, not detailed config or error codes.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/release-notes,What's new in Defender for Cloud features,What's new in Microsoft Defender for Cloud features - Microsoft Defender for Cloud,,What's new and updated in Microsoft Defender for Cloud features,"This article summarizes what's new in Microsoft Defender for Cloud. It includes information about new features in preview or in general availability (GA), feature updates, upcoming feature plans, and deprecated functionality. This page is updated frequently with the latest updates in Defender for Cloud. Find the latest information about security recommendations and alerts in What's new in recommendations and alerts. 
If you're looking for items older than six months, you can find them in the What's",2026-04-01T08:00:00.000Z,overview,,0.2,False,"Release notes summarizing new and updated Defender for Cloud features without clear indication of detailed limits, configuration tables, or troubleshooting mappings; primarily change log/overview content.",unchanged
-https://learn.microsoft.com/en-us/azure/defender-for-cloud/release-notes-archive,Archived release notes (older than six months),Archive of what's new in Microsoft Defender for Cloud - Microsoft Defender for Cloud,,A description of what's new and changed in Microsoft Defender for Cloud from six months ago and earlier.,"This page provides you with information about features, fixes, and deprecations that are older than six months. For the latest updates, read What's new in Defender for Cloud?.",2025-12-23T08:00:00.000Z,reference,,0.25,False,"Release notes archive; while detailed, it’s historical change log rather than reusable expert configuration, limits, or troubleshooting reference.",unchanged
+https://learn.microsoft.com/en-us/azure/defender-for-cloud/release-notes-archive,Archived release notes (older than six months),Archive of what's new in Microsoft Defender for Cloud - Microsoft Defender for Cloud,,A description of what's new and changed in Microsoft Defender for Cloud from six months ago and earlier.,"This page provides you with information about features, fixes, and deprecations that are older than six months. 
For the latest updates, read What's new in Defender for Cloud?.",2026-04-19T08:00:00.000Z,reference,,0.2,False,"Release notes archive listing historical feature changes and deprecations; does not focus on structured limits, configuration matrices, troubleshooting mappings, or other categorized expert patterns defined in the sub-skill types.",updated https://learn.microsoft.com/en-us/azure/defender-for-cloud/release-notes-recommendations-alerts,"What's new in recommendations, alerts, and incidents","New and upcoming changes in recommendations, alerts, and incidents - Microsoft Defender for Cloud",,"Get release notes for new and upcoming changes in recommendations, alerts, and incidents in Microsoft Defender for Cloud.","This article summarizes what's new in security recommendations, alerts, and incidents in Microsoft Defender for Cloud. It includes information about new, modified, and deprecated recommendations and alerts. This page is updated frequently with the latest recommendations and alerts in Defender for Cloud. Recommendations older than six months are found in the relevant recommendations reference list. 
Find the latest information about new and updated Defender for Cloud features in What's new in Defen",2026-03-29T08:00:00.000Z,overview,,0.2,False,"Release notes for recommendations, alerts, and incidents; likely lists new/modified items but not organized as troubleshooting guides with error-code-to-solution mappings or detailed configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/remediate-cloud-deployment-secrets,Remediate Cloud deployment secrets,Remediate cloud deployment secrets security issues in Microsoft Defender for Cloud - Microsoft Defender for Cloud,Remediate cloud deployment secrets in Defender,Learn how to remediate cloud deployment secrets security issues in Microsoft Defender for Cloud.,"Microsoft Defender for Cloud provides secrets scanning for virtual machines, and for cloud deployments, to reduce lateral movement risk. This article helps you to identify and remediate security risks with cloud deployment secrets.",2025-05-25T08:00:00.000Z,overview,best-practices,0.6,True,"Similar to index 38 but for cloud deployment secrets; remediation guidance is likely organized by finding type with recommended actions specific to Defender for Cloud, qualifying as product-specific best practices.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/remediate-code-with-copilot,Remediate code with Microsoft Security Copilot,Remediate code with Microsoft Security Copilot - Microsoft Defender for Cloud,,Learn how to remediate code with Copilot in Microsoft Defender for Cloud and improve your security posture.,Microsoft Defender for Cloud's integration with Microsoft Security Copilot lets you remediate Infrastructure as Code (IaC) misconfigurations in your code repositories. Remediating IaC findings with Copilot lets you address security misconfigurations and vulnerabilities early in the development cycle by automatically generating pull requests (PRs) that correct the identified weaknesses. 
This remediation ensures that security issues in code are addressed accurately and promptly.,2025-09-25T08:00:00.000Z,how-to,,0.4,False,Explains using Copilot to remediate IaC misconfigurations; summary doesn’t indicate detailed configuration schemas or numeric limits.,unchanged
-https://learn.microsoft.com/en-us/azure/defender-for-cloud/remediate-server-secrets,Remediate issues with VM secrets,Remediate machine secrets in Microsoft Defender for Cloud - Microsoft Defender for Cloud,Remediate machine secrets findings in Defender,Learn how to remediate security issues with machine secrets in Microsoft Defender for Cloud.,"Microsoft Defender for Cloud can scan machines and cloud deployments for supported secrets, to reduce lateral movement risk. This article helps you to identify and remediate machine secrets scan findings. Note This page describes the classic Recommendations view in Defender for Cloud. For the latest experience in the Defender portal, see Review security recommendations. It’s important to be able to prioritize secrets and identify which ones need immediate attention. To help you do this, Defender for C",2025-02-19T08:00:00.000Z,overview,best-practices,0.6,True,"Remediation article for machine secrets likely includes prioritized remediation guidance, categorization of secret severity, and product-specific recommendations on how to handle different findings—these are concrete, product-specific best practices and gotchas.",unchanged
+https://learn.microsoft.com/en-us/azure/defender-for-cloud/remediate-server-secrets,Remediate issues with VM secrets,Remediate machine secrets in Microsoft Defender for Cloud - Microsoft Defender for Cloud,,Learn how to remediate security issues with machine secrets in Microsoft Defender for Cloud.,"Microsoft Defender for Cloud can scan machines and cloud deployments for supported secrets, to reduce lateral movement risk. This article helps you to identify and remediate machine secrets scan findings. 
Note This page describes the classic Recommendations view in Defender for Cloud. For the latest experience in the Defender portal, see Review security recommendations. It’s important to be able to prioritize secrets and identify which ones need immediate attention. To help you do this, Defender for C",2026-04-19T08:00:00.000Z,overview,,0.4,False,"Guidance on identifying and remediating machine secrets findings is likely workflow-focused; summary does not show specific error codes, configuration parameters, or numeric thresholds.",updated https://learn.microsoft.com/en-us/azure/defender-for-cloud/remediate-vulnerability-findings-vm,Remediate machine vulnerabilities,Remediate machine vulnerability findings - Microsoft Defender for Cloud,Remediate machine vulnerability findings in Defender for Servers,Learn about remediating machine vulnerabilities in Microsoft Defender for Cloud.,The Defender for Servers plan in Microsoft Defender for Cloud provides agentless and agent-based vulnerability scanning for protected machines using Microsoft Defender Vulnerability Management.,2025-02-19T08:00:00.000Z,how-to,best-practices,0.6,True,Explains how to act on vulnerability findings from Defender Vulnerability Management; includes product-specific workflows and remediation guidance.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/remediate-with-copilot,Remediate recommendations with Microsoft Security Copilot,Remediate recommendations with Microsoft Security Copilot - Microsoft Defender for Cloud,,Learn how to remediate recommendations with Copilot in Microsoft Defender for Cloud and improve your security posture.,"Microsoft Defender for Cloud's integration with Microsoft Security Copilot lets you remediate recommendations on the recommendations page with natural language prompts. Remediating recommendations with Microsoft Security Copilot lets you improve your security posture by addressing the risks and vulnerabilities in your environment. 
Once Microsoft Security Copilot summarizes a recommendation in Defender for Cloud, you can decide how to handle it. By using prompts, you can have Microsoft Security C",2025-09-25T11:03:00.000Z,how-to,,0.35,False,"Covers using Copilot to remediate recommendations; summary suggests workflow guidance, not detailed config or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/resolve-disk-scanning-error,Resolve agentless scanning error,Resolve agentless scan error - Microsoft Defender for Cloud,Resolve agentless disk scan errors for GCP in Defender,Troubleshoot disk scan error in Microsoft Defender for Cloud to ensure your resources are connected and protected.,"After you connect your Google Cloud Platform (GCP) project to Microsoft Defender for Cloud, Defender for Cloud uses agentless machine scanning to identify vulnerabilities in your virtual machines (VMs). Defender for Cloud then provides security recommendations and alerts, along with guidance for remediation. If no agentless scan results appear within 24 hours after you connect your GCP project, it’s possible that the GCP organizational policy Compute Storage resource use restrictions (Compute Eng",2025-10-26T17:02:00.000Z,how-to,troubleshooting,0.8,True,"Focuses on missing scan results and ties them to a specific GCP organizational policy, with steps to resolve; clear symptom→cause→solution mapping.",unchanged @@ -435,7 +435,7 @@ https://learn.microsoft.com/en-us/azure/defender-for-cloud/security-policy-conce
https://learn.microsoft.com/en-us/azure/defender-for-cloud/security-recommendations,Security recommendations,Security Recommendations - Microsoft Defender for Cloud,,Learn about security recommendations in Microsoft Defender for Cloud to improve the security posture of your environments.,"Security recommendations in Microsoft Defender for Cloud contain actionable guidance that helps you secure your resources and improve your security posture. 
Each recommendation provides detailed steps to address specific security issues identified in your cloud environments. Security recommendations are generated based on continuous assessment of your resources against security policies and compliance standards. They provide clear remediation steps, prioritization based on risk, and guidance to ",2026-01-04T12:05:00.000Z,concept-article,,0.3,False,High-level description of security recommendations and posture; likely conceptual without product-specific config tables.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/sensitive-info-types,Supported sensitive data information types,Sensitive information types supported by Microsoft Defender for Cloud - Microsoft Defender for Cloud,Reference sensitive information types in Defender,List table of sensitive information types supported by Microsoft Defender for Cloud,This article lists all sensitive information types supported by Microsoft Defender for Cloud (a subset of what's supported in Microsoft Purview). The following table links to each sensitive information type's description and whether the sensitive information type is scanned by default. The sensitivity settings page allows you to modify the default settings. Note Custom information types from Microsoft Purview aren't scanned by default.,2024-08-07T16:44:00.000Z,reference,configuration,0.8,True,"Explicitly a list table of sensitive information types and whether each is scanned by default. 
This is detailed, product-specific configuration/reference data that LLMs are unlikely to know from training.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/sentinel-connected-aws,Connect a Sentinel connected AWS account,Connect a Microsoft Sentinel connected AWS account to Defender for Cloud - Microsoft Defender for Cloud,Resolve Sentinel-connected AWS onboarding issues in Defender,Troubleshoot AWS connector deployment issues in Microsoft Defender for Cloud to ensure your resources are connected and protected.,"Microsoft Defender for Cloud generates a CloudFormation template that includes the resources required to onboard your Amazon Web Services (AWS) account. Microsoft Defender for Cloud and Microsoft Sentinel can both ingest AWS CloudTrail events. By default, the Microsoft Sentinel connector receives CloudTrail notifications directly from Amazon S3 through an Amazon SQS queue. Because an Amazon SQS queue supports only one consumer, enabling CloudTrail ingestion for Defender for Cloud requires config",2025-12-18T12:04:00.000Z,how-to,troubleshooting,0.8,True,Describes resolving AWS connector deployment issues when Sentinel already consumes SQS; includes specific configuration changes and error scenarios.,unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/serverless-protection,Serverless protection (Preview),What is Serverless protection (Preview) - Microsoft Defender for Cloud,,Learn about Serverless protection in Microsoft Defender for Cloud and how it helps secure your serverless resources.,"Microsoft Defender for Cloud, as a Cloud-Native Application Protection Platform (CNAPP), delivers comprehensive visibility, security, and posture management for serverless workloads across multicloud environments. It extends coverage to Azure Web Apps, Azure Functions, and Amazon Web Service (AWS) Lambda, ensuring these resources are fully protected. 
Serverless protection automatically discovers and inventories all Web Apps, Azure Functions, and AWS Lambda functions in your environment. Once it ",2026-04-13T11:03:00.000Z,overview,,0.3,False,"Appears to be a conceptual/feature overview of Defender for Cloud serverless protection (what it is, what it covers). The summary does not indicate presence of concrete RBAC roles, config tables, limits, or error codes; likely marketing/overview rather than detailed configuration or troubleshooting.",new +https://learn.microsoft.com/en-us/azure/defender-for-cloud/serverless-protection,Serverless protection (Preview),What is Serverless protection (Preview) - Microsoft Defender for Cloud,,Learn about Serverless protection in Microsoft Defender for Cloud and how it helps secure your serverless resources.,"Microsoft Defender for Cloud, as a Cloud-Native Application Protection Platform (CNAPP), delivers comprehensive visibility, security, and posture management for serverless workloads across multicloud environments. It extends coverage to Azure Web Apps, Azure Functions, and Amazon Web Service (AWS) Lambda, ensuring these resources are fully protected. Serverless protection automatically discovers and inventories all Web Apps, Azure Functions, and AWS Lambda functions in your environment. Once it ",2026-04-13T11:03:00.000Z,overview,,0.3,False,"Appears to be a conceptual/feature overview of Defender for Cloud serverless protection (what it is, what it covers). 
The summary does not indicate presence of concrete RBAC roles, config tables, limits, or error codes; likely marketing/overview rather than detailed configuration or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/simulate-alerts-sql-machines,Simulate alerts for SQL servers on machines,Simulate alerts for SQL servers on machines - Microsoft Defender for Cloud,Simulate Defender for SQL alerts on machines,Learn how to simulate alerts for SQL servers on machines in Microsoft Defender for Cloud.,"Microsoft Defender for Cloud provides a SQL simulated alert feature that helps organizations and security teams validate deployment and test the preparedness of security teams' detection, response, and automation workflows without creating actual security risks. The simulation injects telemetry records on target machines (Azure Virtual Machines (VMs) or Arc-connected machines) through a custom script extension named Sql-SimulateAlert. The simulated alerts include full runtime context such as host,",2026-02-09T12:03:00.000Z,how-to,security,0.68,True,"Details the SQL simulated alert feature, including the Sql-SimulateAlert extension and how telemetry is injected, which is product-specific security testing behavior.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/software-inventory,Review the software inventory,Review the software inventory in Defender for Cloud - Microsoft Defender for Cloud,,Learn how to review the software inventory in Microsoft Defender for Cloud,"The Defender for Servers plan in Microsoft Defender for Cloud provides vulnerability scanning using Microsoft Defender Vulnerability Management. Microsoft Defender for Endpoint and Defender Vulnerability Management are integrated natively into Defender for Cloud. The software inventory feature, provided by Defender Vulnerability Management, shows a list of known software in your organization, with security information about discovered applications. 
This article explains how to review the softwar",2025-02-19T08:00:00.000Z,how-to,,0.3,False,"How-to UI walkthrough for viewing software inventory; no detailed configuration tables, limits, or product-specific error mappings.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/sql-azure-vulnerability-assessment-enable,Deploy vulnerability assessment on your Azure SQL databases (Express),Enable vulnerability assessment (Express) - Microsoft Defender for Cloud,Enable SQL vulnerability assessment (Express) for Azure SQL and Synapse,"Learn how to enable the express configuration of SQL vulnerability assessment on Azure SQL Database, Azure SQL Managed Instance, and Azure Synapse Analytics.","In this article, you learn how to enable vulnerability assessment so you can find and remediate database vulnerabilities. We recommend that you enable vulnerability assessment using the express configuration so you aren't dependent on a storage account. You can also enable vulnerability assessment using the classic configuration.",2025-07-20T08:00:00.000Z,how-to,configuration,0.75,True,"Covers enabling VA express configuration, including specific configuration choices (express vs classic) and likely parameter details; product-specific configuration guidance.",unchanged @@ -446,10 +446,10 @@ https://learn.microsoft.com/en-us/azure/defender-for-cloud/sql-azure-vulnerabili https://learn.microsoft.com/en-us/azure/defender-for-cloud/sql-azure-vulnerability-assessment-rules-changelog,Vulnerability assessment rules changelog,SQL vulnerability assessment rules changelog - Microsoft Defender for Cloud,Changelog for SQL vulnerability assessment rules,"Changelog for SQL vulnerability assessment rules with SQL Server, Azure SQL Database, Azure SQL Managed Instance, and Azure Synapse Analytics","This article details the changes made to the SQL vulnerability assessment service rules. Rules that are updated, removed, or added will be outlined below. 
For an updated list of SQL vulnerability assessment rules, see SQL vulnerability assessment rules.",2024-08-07T16:44:00.000Z,reference,security,0.7,True,"Documents specific changes to VA rules over time, which is detailed product behavior history relevant to security assessments.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/subassessment-rest-api,REST API (secure score),Container vulnerability assessments powered by Microsoft Defender Vulnerability Management subassessments - Microsoft Defender for Cloud,Use Defender VM subassessments for container vulnerabilities,Learn about container vulnerability assessments powered by Microsoft Defender Vulnerability Management subassessments,,2025-07-15T08:00:00.000Z,how-to,integrations,0.63,True,"Describes the specific subassessment model and fields for container VA powered by Defender Vulnerability Management, including REST schema details that are product-specific integration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/summarize-with-copilot,Summarize recommendations with Microsoft Security Copilot,Summarize recommendations with Microsoft Security Copilot - Microsoft Defender for Cloud,,Learn how to summarize recommendations with Microsoft Security Copilot in Microsoft Defender for Cloud and improve your security posture.,"Microsoft Defender for Cloud's integration with Microsoft Security Copilot lets you summarize a recommendation so you can better understand the risks and vulnerabilities in your environment. When you summarize a recommendation, you get a quick overview of the recommendation in natural language. 
This summary helps you understand the information presented in a recommendation and lets you prioritize your remediation efforts.",2025-09-25T11:03:00.000Z,how-to,,0.35,False,How-to summarize recommendations with Copilot; appears to be a feature walkthrough rather than a configuration or troubleshooting reference.,unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-cloud,Defender for Cloud support matrices,"Interoperability with Azure services, Azure clouds, and client operating systems - Microsoft Defender for Cloud",Check Defender for Cloud interoperability across Azure services and environments,"Learn about the Azure cloud environments where Defender for Cloud can be used, the Azure services that Defender for Cloud protects, and the client operating systems that Defender for Cloud supports.","Important All Microsoft Defender for Cloud features will be officially retired in the Azure in China region on August 18, 2026. Due to this upcoming retirement, Azure in China customers are no longer able to onboard new subscriptions to the service. A new subscription is any subscription that was not already onboarded to the Microsoft Defender for Cloud service prior to August 18, 2025, the date of the retirement announcement. 
For more information on the retirement, seeMicrosoft Defender for Clo",2026-03-30T17:14:00.000Z,limits-and-quotas,deployment,0.65,True,"Support matrix for which Azure clouds, services, and client OSes are supported, including region-specific retirement dates and onboarding constraints; this is effectively a platform/tier support matrix relevant to deployment and environment planning.",unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-containers,Support matrices for Containers,Containers support matrix in Defender for Cloud - Microsoft Defender for Cloud,Support matrix for Defender for Containers features,Review support requirements for container capabilities in Microsoft Defender for Cloud.,"Caution This article references CentOS, a Linux distribution that reached end of service on June 30, 2024. Consider your use and plan accordingly. For more information, see theCentOS End Of Life guidance. Important All Microsoft Defender for Cloud features will be officially retired in the Azure in China region on August 18, 2026. Due to this upcoming retirement, Azure in China customers are no longer able to onboard new subscriptions to the service. 
A new subscription is any subscription that w",2026-02-22T23:14:00.000Z,limits-and-quotas,deployment,0.82,True,"Support matrix article; typically lists which container capabilities are supported on which platforms/versions with constraints and retirement dates, which are deployment-specific expert details.",unchanged +https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-cloud,Defender for Cloud support matrices,"Interoperability with Azure services, Azure clouds, and client operating systems - Microsoft Defender for Cloud",Check Defender for Cloud interoperability and support matrix,"Learn about the Azure cloud environments where Defender for Cloud can be used, the Azure services that Defender for Cloud protects, and the client operating systems that Defender for Cloud supports.","Important All Microsoft Defender for Cloud features will be officially retired in the Azure in China region on August 18, 2026. Due to this upcoming retirement, Azure in China customers are no longer able to onboard new subscriptions to the service. A new subscription is any subscription that was not already onboarded to the Microsoft Defender for Cloud service prior to August 18, 2025, the date of the retirement announcement. 
For more information on the retirement, see Microsoft Defender for Clo",2026-04-19T08:00:00.000Z,limits-and-quotas,deployment,0.7,True,"Support matrix for Azure environments, services, and client OSs is product-specific interoperability data that affects where and how Defender for Cloud can be deployed and used; this is not generic knowledge and is maintained as a matrix unique to the service.",updated +https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-containers,Support matrices for Containers,Containers support matrix in Defender for Cloud - Microsoft Defender for Cloud,Container support matrix for Defender for Cloud,Review support requirements for container capabilities in Microsoft Defender for Cloud.,"Caution This article references CentOS, a Linux distribution that reached end of service on June 30, 2024. Consider your use and plan accordingly. For more information, see the CentOS End Of Life guidance. Important All Microsoft Defender for Cloud features will be officially retired in the Azure in China region on August 18, 2026. Due to this upcoming retirement, Azure in China customers are no longer able to onboard new subscriptions to the service. A new subscription is any subscription that w",2026-04-23T08:00:00.000Z,limits-and-quotas,deployment,0.8,True,"Support matrix for container capabilities; such pages list which container platforms/versions/regions are supported, retirement timelines, and environment-specific constraints. 
This is expert deployment knowledge (platform support and constraints by environment), matching the deployment sub-skill definition.",updated https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-servers,Defender for Servers support matrices,Support for the Defender for Servers plan - Microsoft Defender for Cloud,Review support matrix and requirements for Defender for Servers,"Review support requirements, network configurations, and feature support for the ""Defender for Servers"" plan in Microsoft Defender for Cloud.","Important All Microsoft Defender for Cloud features will be officially retired in the Azure in China region on August 18, 2026. Due to this upcoming retirement, Azure in China customers are no longer able to onboard new subscriptions to the service. A new subscription is any subscription that was not already onboarded to the Microsoft Defender for Cloud service prior to August 18, 2025, the date of the retirement announcement. For more information on the retirement, see Microsoft Defender for Clo",2025-12-22T08:00:00.000Z,limits-and-quotas,deployment,0.75,True,Support matrix for Defender for Servers including requirements and feature support; used for deployment planning and environment compatibility.,unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-storage,Prerequisites,Prerequisites for Microsoft Defender for Storage - Microsoft Defender for Cloud,Prerequisites and permissions for Defender for Storage,Learn about the prerequisites and permissions required to enable Microsoft Defender for Storage and its features of malware scanning and sensitive-data threat detection.,This article lists the prerequisites and permissions required toenable Microsoft Defender for Storageand its features.,2025-12-11T18:08:00.000Z,reference,security,0.78,True,"Explicitly lists prerequisites and permissions; likely includes specific RBAC roles, required actions, and security-related 
configuration details unique to Defender for Storage.",unchanged +https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-storage,Prerequisites,Prerequisites for Microsoft Defender for Storage - Microsoft Defender for Cloud,Prerequisites and permissions for Defender for Storage,Learn about the prerequisites and permissions required to enable Microsoft Defender for Storage and its features of malware scanning and sensitive-data threat detection.,This article lists the prerequisites and permissions required to enable Microsoft Defender for Storage and its features.,2026-04-20T06:03:00.000Z,reference,security,0.8,True,"Support-matrix/prerequisites page for a security product; these typically enumerate specific RBAC roles, permission scopes, and security-related requirements to enable malware scanning and sensitive-data threat detection. That aligns with the security sub-skill (product-specific roles/permissions and security settings).",updated https://learn.microsoft.com/en-us/azure/defender-for-cloud/tenant-wide-permissions-management,Grant and request tenant-wide permissions,Grant and request tenant-wide permissions - Microsoft Defender for Cloud,Manage tenant-wide permissions in Defender for Cloud,Learn how to manage tenant-wide permissions in Microsoft Defender for Cloud effectively to enhance your organization's security.,"A user with the Microsoft Entra role of Global Administrator might have tenant-wide responsibilities, but lack the Azure permissions to view that organization-wide information in Microsoft Defender for Cloud. 
Permission elevation is required because Microsoft Entra role assignments don't grant access to Azure resources.",2025-07-15T08:00:00.000Z,how-to,security,0.75,True,"Focuses on permission elevation and tenant-wide access; expected to detail specific Microsoft Entra roles, Azure roles, and how they interact.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/test-agentless-malware-scanning,Test agentless malware scanning alerts,Test agentless malware scanning for VMs in Microsoft Defender for Cloud - Microsoft Defender for Cloud,Test agentless malware scanning alerts for VMs,Test agentless malware scanning in Microsoft Defender for Cloud.,"In addition to the next-generation antimalware protection provided by agent-based Defender for Endpoint integration with Defender for Cloud, Defender for Servers Plan 2 provides agentless malware scanning as part of its agentless scanning capabilities. This article describes how to create a test alert to make sure that agentless malware scanning is working as expected.",2025-02-19T08:00:00.000Z,how-to,configuration,0.7,True,Describes how to create a test alert to validate scanning; likely includes specific steps and configuration values unique to this feature.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/threat-intelligence-reports,Generate threat intelligence reports,Threat intelligence report - Microsoft Defender for Cloud,,This page helps you to use Microsoft Defender for Cloud threat intelligence reports during an investigation to find more information about security alerts,Microsoft Defender for Cloud's threat intelligence reports can help you learn more about a threat that triggered a security alert.,2025-07-30T08:00:00.000Z,how-to,,0.3,False,"Threat intelligence reports usage overview; summary doesn’t indicate specific configuration parameters, limits, or error-code mappings.",unchanged @@ -468,8 +468,8 @@ 
https://learn.microsoft.com/en-us/azure/defender-for-cloud/tutorial-enable-datab https://learn.microsoft.com/en-us/azure/defender-for-cloud/tutorial-enable-key-vault-plan,Protect key vaults with Defender for Key Vault,Protect your key vaults with the Defender for Key Vault plan - Microsoft Defender for Cloud,,Learn how to enable the Defender for Key Vault plan on your Azure subscription for Microsoft Defender for Cloud.,"Azure Key Vault is a cloud service that safeguards encryption keys and secrets like certificates, connection strings, and passwords. Enable Microsoft Defender for Key Vault for Azure-native, advanced threat protection for Azure Key Vault, providing an additional layer of security intelligence. Learn more about Microsoft Defender for Key Vault. You can learn more about Defender for Key Vault's pricing on the pricing page. You can also estimate costs with the Defender for Cloud cost calculator.",2025-09-28T17:10:00.000Z,install-set-up-deploy,,0.3,False,Tutorial to enable a plan; mostly portal steps and conceptual description of Defender for Key Vault without detailed config parameters or limits.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/tutorial-enable-resource-manager-plan,Protect resources with Defender for Resource Manager,Protect your resources with the Resource Manager plan - Microsoft Defender for Cloud,,Learn how to enable the Defender for Resource Manager plan on your Azure subscription for Microsoft Defender for Cloud.,"Azure Resource Manager is the deployment and management service for Azure. It provides a management layer that enables you to create, update, and delete resources in your Azure account. You use management features, like access control, locks, and tags, to secure and organize your resources after deployment. 
Microsoft Defender for Resource Manager automatically monitors the resource management operations in your organization, whether they're performed through the Azure portal, Azure REST APIs, Azu",2025-09-28T17:10:00.000Z,install-set-up-deploy,,0.3,False,Plan enablement tutorial for Defender for Resource Manager; likely basic onboarding steps without expert-level configuration or troubleshooting content.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/tutorial-enable-servers-plan,Deploy Defender for Servers,Protect your servers with Defender for Servers - Microsoft Defender for Cloud,,Learn how to enable the Defender for Servers plan in Microsoft Defender for Cloud to protect your virtual machines and reduce security risks.,"The Defender for Servers plan in Microsoft Defender for Cloud protects Windows and Linux virtual machines (VMs) that run in Azure, Amazon Web Service (AWS), Google Cloud Platform (GCP), and in on-premises environments. Defender for Servers provides recommendations to improve the security posture of machines and protects machines against security threats. This article helps you deploy a Defender for Servers plan. Note After you enable a plan, a 30-day trial period begins. You can't stop, pause, o",2025-09-28T08:00:00.000Z,install-set-up-deploy,,0.5,False,"Tutorial for enabling Defender for Servers; includes note about 30-day trial but otherwise mainly procedural, not a configuration or limits reference.",unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/tutorial-enable-storage-plan,Protect storage accounts with Defender for Storage,Deploy Microsoft Defender for Storage - Microsoft Defender for Cloud,Deploy Microsoft Defender for Storage on Azure,Learn how to enable Microsoft Defender for Storage on your Azure subscription for Microsoft Defender for Cloud.,"Microsoft Defender for Storage is an Azure-native solution. 
It offers an advanced layer of intelligence for detecting and mitigating threats in storage accounts. It usesMicrosoft Defender Threat Intelligence, Microsoft Defender Antivirus technologies, and sensitive data discovery. It helps protect the Azure Blob Storage, Azure Files, and Azure Data Lake Storage services. Defender for Storage provides a comprehensive alert suite, near-real-time malware scanning (as an add-on), and sensitive-data ",2025-09-28T17:10:00.000Z,install-set-up-deploy,deployment,0.6,True,Tutorial for enabling Defender for Storage; likely includes concrete deployment steps and possibly plan/scope considerations specific to this service.,unchanged -https://learn.microsoft.com/en-us/azure/defender-for-cloud/tutorial-protect-resources,Protect VMs,Protect your Virtual Machines (VMs) with Microsoft Defender for Servers - Microsoft Defender for Cloud,,This tutorial shows you how to configure a just-in-time VM access policy and an application control policy.,"Defender for Servers in Microsoft Defender for Cloud, limits your exposure to threats by using access and application controls to block malicious activity. Just-in-time (JIT) virtual machine (VM) access reduces your exposure to attacks by enabling you to deny persistent access to VMs. Instead, you provide controlled and audited access to VMs only when needed. 
Defender for Cloud uses machine learning to analyze the processes running in the VM and helps you apply allowlist rules using this intelli",2025-07-15T08:00:00.000Z,tutorial,,0.2,False,"Tutorial for enabling JIT access and application control; step-by-step usage, not deep configuration, limits, or troubleshooting content.",unchanged +https://learn.microsoft.com/en-us/azure/defender-for-cloud/tutorial-enable-storage-plan,Protect storage accounts with Defender for Storage,Deploy Microsoft Defender for Storage - Microsoft Defender for Cloud,,Learn how to enable Microsoft Defender for Storage on your Azure subscription for Microsoft Defender for Cloud.,"Microsoft Defender for Storage is an Azure-native solution. It offers an advanced layer of intelligence for detecting and mitigating threats in storage accounts. It uses Microsoft Defender Threat Intelligence, Microsoft Defender Antivirus technologies, and sensitive data discovery. It helps protect the Azure Blob Storage, Azure Files, and Azure Data Lake Storage services. Defender for Storage provides a comprehensive alert suite, near-real-time malware scanning (as an add-on), and sensitive-data ",2026-04-20T06:03:00.000Z,install-set-up-deploy,,0.3,False,"Tutorial-style deployment/enablement article for Defender for Storage; summary emphasizes conceptual description and step-by-step enabling, but does not indicate detailed configuration parameter tables, limits, or SKU matrices. 
Lacks clear evidence of expert-only configuration or constraints.",updated +https://learn.microsoft.com/en-us/azure/defender-for-cloud/tutorial-protect-resources,Protect VMs,Protect your Virtual Machines (VMs) with Microsoft Defender for Servers - Microsoft Defender for Cloud,Configure JIT access and application control for Defender for Servers,This tutorial shows you how to configure a just-in-time VM access policy and an application control policy.,"Defender for Servers in Microsoft Defender for Cloud limits your exposure to threats by using access and application controls to block malicious activity. Just-in-time (JIT) virtual machine (VM) access reduces your exposure to attacks by enabling you to deny persistent access to VMs. Instead, you provide controlled and audited access to VMs only when needed. Defender for Cloud uses machine learning to analyze the processes running in the VM and helps you apply allowlist rules using this intelli",2026-04-19T08:00:00.000Z,tutorial,security,0.7,True,"Tutorial for Microsoft Defender for Servers that walks through configuring just-in-time VM access and application control. 
These are product-specific security configurations (JIT policy settings, application allowlisting behavior) that go beyond generic security concepts and represent concrete, service-specific security guidance.",updated https://learn.microsoft.com/en-us/azure/defender-for-cloud/understand-malware-scan-results,Understanding malware scanning results,Understand malware scanning results - Microsoft Defender for Cloud,Interpret and act on Defender for Storage malware scan results,"Learn how to understand and interpret the results from malware scanning in Microsoft Defender for Storage, including how to take appropriate actions.","When a blob is scanned for malware, the scan result can be assessed in several ways: Whether you're looking to automate responses to specific scan outcomes or to keep a detailed record of all scans, these options can be tailored to meet your needs. Scan results fall into two categories: successful states and error states. Understanding these states is important for interpreting the results of malware scanning and taking appropriate action. Note For storage accounts that exceed the throughput capaci",2025-12-11T12:07:00.000Z,how-to,troubleshooting,0.7,True,Explains malware scan result states (success/error) and appropriate actions; this is symptom→interpretation→action guidance specific to Defender for Storage scanning.,unchanged
Defender for SQL Servers on Machines plan includes an updated agent architecture that simplifies onboarding and improves SQL protection. To gain visibility and provide protection, the plan requires each SQL server instance to be registered within Azure. Registration occurs automatically with the SQL Server IaaS Agent extension which automates registration for Azure VMs. Arc-enabled SQ",2025-04-28T11:11:00.000Z,how-to,configuration,0.7,True,Explains updating plan configuration and registration behavior; product-specific configuration and agent behavior details.,unchanged https://learn.microsoft.com/en-us/azure/defender-for-cloud/verify-machine-protection,Verify SQL machine protection,Verify SQL machine protection - Microsoft Defender for Cloud,Verify Defender for SQL Servers on Machines protection status,Verify that SQL VMs are protected with the Defender for SQL Servers on Machines plan as expected. Ensure that all security measures are properly implemented.,"Important This article applies to commercial clouds. If you're using Government clouds, see the Verify SQL machine protection government article. 
After enabling protection for SQL Servers installed on Virtual Machines (VM), on-premises machines, and multicloud resources with the Defender for SQL Servers on Machines plan, verify that your SQL servers are protected as expected.",2025-04-28T11:11:00.000Z,how-to,deployment,0.6,True,Describes how to verify that SQL machines are protected; product-specific verification steps tied to deployment state.,unchanged diff --git a/products/azure-defender-for-cloud/report.md b/products/azure-defender-for-cloud/report.md index d5351ee1..edfd2a0b 100644 --- a/products/azure-defender-for-cloud/report.md +++ b/products/azure-defender-for-cloud/report.md @@ -1,12 +1,12 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: - integrations: Integrating Defender for Cloud with CI/CD, SIEM, EDR, ITSM, APIs, - and multi-cloud logs, plus querying/exporting security data and automating tickets - and container/AKS scans - configuration: 'Configuring Defender for Cloud features: onboarding, policies, exemptions, - scanning (agentless, containers, SQL, storage, DevOps), alerts, exports, private - links, and data security posture.' + integrations: Integrating Defender for Cloud with CI/CD, SIEM, EDR, ticketing, cloud + logs, partners, and APIs, plus exporting/querying security data via ARG, REST, + CLI, and PowerShell. + configuration: 'How to enable and tune Defender for Cloud features: configuring + scans, alerts, exports, exemptions, DevOps integrations, data security posture, + and plan settings across Azure and multicloud.' architecture-patterns: 'Architectural guidance for Defender for Servers/Containers: agentless scanning, malware/vuln detection on VMs/Kubernetes, data collection, residency, workspaces, and large-scale deployment.' @@ -15,33 +15,33 @@ category_descriptions: for Cloud, AKS, registries, and CI/CD. 
security: 'Security alerts, permissions, and hardening for Defender for Cloud: configuring protections, interpreting alerts/recommendations, RBAC/CIEM, data handling, and - securing SQL, storage, containers, VMs, APIs, and Kubernetes.' - troubleshooting: 'Diagnosing and fixing Defender for Cloud issues: alert validation/interpretation, - connector/onboarding problems (AWS/GCP/APIs/SQL/K8s), deployment gaps, malware/vuln - handling, and incident references.' - decision-making: Guidance for choosing and planning Defender for Cloud plans, pricing, - portals, agents, regional availability, and migration/transition paths for servers, - containers, storage, and GCP. - deployment: Deploying and managing Defender for Cloud plans and agents (Containers, - SQL, Storage) across AKS/EKS/GKE and servers using portal, CLI, IaC, policies, - APIs, and reviewing support/regions. + securing VMs, containers, SQL, storage, APIs, and Kubernetes.' + troubleshooting: 'Diagnosing and fixing Defender for Cloud issues: alert validation/response, + connector and SQL/Kubernetes/container problems, GCP/AWS onboarding, and malware/vulnerability + scan errors.' + decision-making: Guidance for choosing Defender for Cloud plans, portals, pricing + and cost allocation, regional availability, migration paths, and deployment options + for servers, containers, storage, and GCP. + deployment: Deploying and managing Defender for Cloud protections (Containers, SQL, + Storage, Servers) at scale via portal, CLI/IaC/Policy/REST, plus migration, removal, + and support matrices. limits-quotas: 'Limits, quotas, and prerequisites for Defender for Cloud features: free trials, data ingestion, APIs, DevOps, portal preview, alert export limits, and data collection extension changes.' 
skill_description: Expert knowledge for Azure Defender For Cloud development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. - Use when securing Azure VMs, AKS/containers, SQL/storage, multi-cloud connectors, - or Defender for DevOps APIs, and other Azure Defender For Cloud related development - tasks. Not for Azure Defender For Iot (use azure-defender-for-iot), Azure DDos Protection - (use azure-ddos-protection), Azure External Attack Surface Management (use azure-external-attack-surface-management), - Azure Security (use azure-security). -use_when: Use when securing Azure VMs, AKS/containers, SQL/storage, multi-cloud connectors, - or Defender for DevOps APIs, and other Azure Defender For Cloud related development + Use when securing Azure VMs, AKS/containers, SQL/storage, multicloud connectors, + or Defender for Servers plans, and other Azure Defender For Cloud related development + tasks. Not for Azure Defender For Iot (use azure-defender-for-iot), Azure Security + (use azure-security), Azure Sentinel (use azure-sentinel), Azure Firewall Manager + (use azure-firewall-manager). +use_when: Use when securing Azure VMs, AKS/containers, SQL/storage, multicloud connectors, + or Defender for Servers plans, and other Azure Defender For Cloud related development tasks. confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot), Azure - DDos Protection (use azure-ddos-protection), Azure External Attack Surface Management - (use azure-external-attack-surface-management), Azure Security (use azure-security). + Security (use azure-security), Azure Sentinel (use azure-sentinel), Azure Firewall + Manager (use azure-firewall-manager). 
--- # Azure Defender For Cloud Crawl Report @@ -50,75 +50,76 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot), - **Total Pages**: 473 - **Fetched**: 473 - **Fetch Failed**: 0 -- **Classified**: 254 -- **Unclassified**: 219 +- **Classified**: 255 +- **Unclassified**: 218 ### Incremental Update -- **New Pages**: 5 -- **Updated Pages**: 12 -- **Unchanged**: 456 -- **Deleted Pages**: 5 +- **New Pages**: 0 +- **Updated Pages**: 27 +- **Unchanged**: 446 +- **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-defender-for-cloud/azure-defender-for-cloud.csv` ## Classification Statistics | Type | Count | Percentage | |------|-------|------------| -| architecture-patterns | 6 | 1.3% | -| best-practices | 16 | 3.4% | -| configuration | 64 | 13.5% | +| architecture-patterns | 5 | 1.1% | +| best-practices | 15 | 3.2% | +| configuration | 62 | 13.1% | | decision-making | 15 | 3.2% | | deployment | 25 | 5.3% | -| integrations | 26 | 5.5% | +| integrations | 27 | 5.7% | | limits-quotas | 6 | 1.3% | -| security | 71 | 15.0% | -| troubleshooting | 25 | 5.3% | -| *(Unclassified)* | 219 | 46.3% | +| security | 74 | 15.6% | +| troubleshooting | 26 | 5.5% | +| *(Unclassified)* | 218 | 46.1% | ## Changes -### New Pages - -- [Serverless protection (Preview)](https://learn.microsoft.com/en-us/azure/defender-for-cloud/serverless-protection) -- [Review and remediate VA findings for Kubernetes nodes](https://learn.microsoft.com/en-us/azure/defender-for-cloud/kubernetes-nodes-va) -- [Review and remediate malware alerts for Kubernetes nodes](https://learn.microsoft.com/en-us/azure/defender-for-cloud/kubernetes-nodes-malware) -- [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/cluster-security-dashboard) -- [Investigate Kubernetes vulnerabilities with Cloud Security Explorer](https://learn.microsoft.com/en-us/azure/defender-for-cloud/cloud-security-explorer-kubernetes-clusters) - ### Updated Pages -- 
[Regional availability for Defender for Cloud plans](https://learn.microsoft.com/en-us/azure/defender-for-cloud/regional-availability) - - Updated: 2025-09-11T08:00:00.000Z → 2026-04-14T08:00:00.000Z -- [Common questions about Defender for Cloud](https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-general) - - Updated: 2026-01-28T18:11:00Z → 2026-04-14T11:10:00Z -- [Common questions about permissions](https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-permissions) - - Updated: 2025-11-30T23:03:00Z → 2026-04-15T17:07:00Z -- [Common questions about regulatory compliance](https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-regulatory-compliance) - - Updated: 2025-10-23T17:10:00Z → 2026-04-15T17:07:00Z -- [Common questions about Copilot in Defender for Cloud](https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-copilot) - - Updated: 2025-05-19T11:24:00Z → 2026-04-15T17:07:00Z +- [Agentless machine scanning](https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-agentless-data-collection) + - Updated: 2025-02-19T08:00:00.000Z → 2026-04-19T08:00:00.000Z +- [Cloud asset inventory](https://learn.microsoft.com/en-us/azure/defender-for-cloud/asset-inventory) + - Updated: 2025-12-03T08:00:00.000Z → 2026-04-19T08:00:00.000Z +- [Investigate risks with security explorer and attack paths](https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-attack-path) + - Updated: 2025-11-25T14:31:00.000Z → 2026-04-19T08:00:00.000Z - [Onboard agentless containers for CSPM](https://learn.microsoft.com/en-us/azure/defender-for-cloud/how-to-enable-agentless-containers) - - Updated: 2025-03-31T05:32:00.000Z → 2026-04-16T11:04:00.000Z -- [Attack path analysis and enhanced risk-hunting for containers](https://learn.microsoft.com/en-us/azure/defender-for-cloud/how-to-test-attack-path-and-security-explorer-with-vulnerable-container-image) - - Updated: 2025-05-18T08:00:00.000Z → 2026-04-16T11:04:00.000Z -- [Discover AI 
models](https://learn.microsoft.com/en-us/azure/defender-for-cloud/ai-model-security) - - Updated: 2026-03-30T17:14:00.000Z → 2026-04-14T22:13:00.000Z -- [Kubernetes data plane hardening](https://learn.microsoft.com/en-us/azure/defender-for-cloud/kubernetes-workload-protections) - - Updated: 2025-07-15T08:00:00.000Z → 2026-04-16T11:04:00.000Z -- [Common questions about CSPM](https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-cspm) - - Updated: 2025-05-19T11:24:00Z → 2026-04-15T17:07:00Z -- [Data recommendations](https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-data) - - Updated: 2026-03-30T11:04:00.000Z → 2026-04-15T08:00:00.000Z -- [Alerts for AI services](https://learn.microsoft.com/en-us/azure/defender-for-cloud/alerts-ai-workloads) - - Updated: 2026-03-25T08:00:00.000Z → 2026-04-16T08:00:00.000Z - -### Deleted Pages - -- ~~Investigate clusters with Cloud Security Explorer~~ (https://learn.microsoft.com/en-us/azure/defender-for-cloud/cloud-security-explorer-kubernetes-clusters) -- ~~Protect clusters with AKS Security Dashboard~~ (https://learn.microsoft.com/en-us/azure/defender-for-cloud/cluster-security-dashboard) -- ~~Kubernetes node malware detection~~ (https://learn.microsoft.com/en-us/azure/defender-for-cloud/kubernetes-nodes-malware) -- ~~Kubernetes node vulnerability assessment~~ (https://learn.microsoft.com/en-us/azure/defender-for-cloud/kubernetes-nodes-va) -- ~~Serverless protection~~ (https://learn.microsoft.com/en-us/azure/defender-for-cloud/serverless-protection) + - Updated: 2026-04-16T11:04:00.000Z → 2026-04-19T08:00:00.000Z +- [Remediate issues with VM secrets](https://learn.microsoft.com/en-us/azure/defender-for-cloud/remediate-server-secrets) + - Updated: 2025-02-19T08:00:00.000Z → 2026-04-19T08:00:00.000Z +- [Common questions about Defender for Cloud](https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-general) + - Updated: 2026-04-14T11:10:00Z → 2026-04-19T22:03:00Z +- [Enable 
Defender for Cloud for a management group](https://learn.microsoft.com/en-us/azure/defender-for-cloud/onboard-management-group) + - Updated: 2025-07-15T08:00:00.000Z → 2026-04-19T08:00:00.000Z +- [Automate onboarding using PowerShell](https://learn.microsoft.com/en-us/azure/defender-for-cloud/powershell-onboarding) + - Updated: 2025-08-11T11:03:00.000Z → 2026-04-19T08:00:00.000Z +- [Enable on Azure](https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-defender-for-databases-azure) + - Updated: 2025-06-30T08:00:00.000Z → 2026-04-20T17:15:00.000Z +- [Protect storage accounts with Defender for Storage](https://learn.microsoft.com/en-us/azure/defender-for-cloud/tutorial-enable-storage-plan) + - Updated: 2025-09-28T17:10:00.000Z → 2026-04-20T06:03:00.000Z +- [Prerequisites](https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-storage) + - Updated: 2025-12-11T18:08:00.000Z → 2026-04-20T06:03:00.000Z +- [Support matrices for Containers](https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-containers) + - Updated: 2026-02-22T23:14:00.000Z → 2026-04-23T08:00:00.000Z +- [What is Microsoft Defender for Cloud?](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-cloud-introduction) + - Updated: 2026-01-01T08:00:00.000Z → 2026-04-19T08:00:00.000Z +- [Defender for Cloud support matrices](https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-cloud) + - Updated: 2026-03-30T17:14:00.000Z → 2026-04-19T08:00:00.000Z +- [Connect your AWS account](https://learn.microsoft.com/en-us/azure/defender-for-cloud/quickstart-onboard-aws) + - Updated: 2026-01-06T12:02:00.000Z → 2026-04-19T08:00:00.000Z +- [Enable sensitive data threat detection](https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-defender-for-storage-data-sensitivity) + - Updated: 2025-07-01T08:00:00.000Z → 2026-04-19T08:00:00.000Z +- [Migrating to the new 
plan](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-classic-migrate) + - Updated: 2025-05-13T08:00:00.000Z → 2026-04-19T08:00:00.000Z +- [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-resource-manager-introduction) + - Updated: 2025-08-20T22:10:00.000Z → 2026-04-19T08:00:00.000Z +- [Scale a deployment](https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-servers-scale) + - Updated: 2025-02-19T08:00:00.000Z → 2026-04-19T08:00:00.000Z +- [Defender sensor for Containers changelog](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-sensor-change-log) + - Updated: 2026-03-17T17:15:00.000Z → 2026-04-20T11:04:00.000Z +- *...and 7 more* ## Classified Pages @@ -136,7 +137,6 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot), | [Choose a plan and deployment scope](https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-servers-select-plan) | decision-making | 0.85 | Explicitly about selecting which Defender for Servers plan to deploy; such pages typically include comparison tables, capabilities per plan, and recommendations for scenarios, which is decision-making guidance. | | [Download a CSV report](https://learn.microsoft.com/en-us/azure/defender-for-cloud/export-alerts-to-csv) | limits-quotas | 0.85 | Explicitly mentions Azure Resource Graph limitation of 25,000 rows per file; this is a concrete numeric export limit that qualifies as a quota. | | [User roles and permissions](https://learn.microsoft.com/en-us/azure/defender-for-cloud/permissions) | security | 0.85 | Describes specific Azure RBAC roles and what they can do in Defender for Cloud; includes role names and permission scopes, matching security sub-skill. 
| -| [Support matrices for Containers](https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-containers) | deployment | 0.82 | Support matrix article; typically lists which container capabilities are supported on which platforms/versions with constraints and retirement dates, which are deployment-specific expert details. | | [Alerts for Azure App Service](https://learn.microsoft.com/en-us/azure/defender-for-cloud/alerts-azure-app-service) | security | 0.80 | Reference of App Service alert types and meanings; detailed alert catalog is expert security knowledge. | | [Alerts for Azure Cosmos DB](https://learn.microsoft.com/en-us/azure/defender-for-cloud/alerts-azure-cosmos-db) | security | 0.80 | Lists Cosmos DB alert types and semantics; product-specific security detection behavior. | | [Alerts for Azure DDoS Protection](https://learn.microsoft.com/en-us/azure/defender-for-cloud/alerts-azure-ddos-protection) | security | 0.80 | Reference of DDoS Protection alert types; detailed alert semantics are expert security knowledge. | @@ -160,12 +160,14 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot), | [Manage with Azure CLI](https://learn.microsoft.com/en-us/azure/defender-for-cloud/express-configuration-azure-commands) | configuration | 0.80 | Provides Azure CLI command references and examples for SQL VA express configuration, including command names and parameter usage. | | [Plan roles and permissions](https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-servers-roles) | security | 0.80 | Roles and permissions article almost certainly lists specific Azure RBAC role names, scopes, and what each can do in Defender for Servers, which is product-specific security configuration. 
| | [Prepare Azure resources for exporting to Splunk and QRadar](https://learn.microsoft.com/en-us/azure/defender-for-cloud/export-to-splunk-or-qradar) | integrations | 0.80 | How-to for setting up Event Hubs and Entra ID for QRadar/Splunk export; such pages contain product-specific integration settings and sometimes PowerShell parameters. | +| [Prerequisites](https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-storage) | security | 0.80 | Support-matrix/prerequisites page for a security product; these typically enumerate specific RBAC roles, permission scopes, and security-related requirements to enable malware scanning and sensitive-data threat detection. That aligns with the security sub-skill (product-specific roles/permissions and security settings). | | [Remove](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-containers-aws-remove) | deployment | 0.80 | Removal article that includes tables mapping Defender features to installed components; provides detailed deployment/component relationships. | | [Remove](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-containers-azure-remove) | deployment | 0.80 | Removal/disablement article that includes tables mapping Defender features to installed components; this is product-specific deployment/component knowledge. | | [Remove](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-containers-gcp-remove) | deployment | 0.80 | Removal article that includes tables mapping Defender features to installed components; provides detailed deployment/component relationships for GCP. | | [Resolve Domain Restricted Sharing policy](https://learn.microsoft.com/en-us/azure/defender-for-cloud/resolve-gcp-sharing-policy) | troubleshooting | 0.80 | Troubleshooting GCP policy conflicts; maps the Domain Restricted Sharing policy to deployment failures and provides specific remediation steps. 
| | [Resolve VPC Service Controls restriction issues](https://learn.microsoft.com/en-us/azure/defender-for-cloud/resolve-vpc-service-controls-issues) | troubleshooting | 0.80 | Explains how VPC Service Controls block Defender scanning and provides specific ingress/egress policy configurations to fix it. | | [Resolve agentless scanning error](https://learn.microsoft.com/en-us/azure/defender-for-cloud/resolve-disk-scanning-error) | troubleshooting | 0.80 | Focuses on missing scan results and ties them to a specific GCP organizational policy, with steps to resolve; clear symptom→cause→solution mapping. | +| [Support matrices for Containers](https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-containers) | deployment | 0.80 | Support matrix for container capabilities; such pages list which container platforms/versions/regions are supported, retirement timelines, and environment-specific constraints. This is expert deployment knowledge (platform support and constraints by environment), matching the deployment sub-skill definition. | | [Supported sensitive data information types](https://learn.microsoft.com/en-us/azure/defender-for-cloud/sensitive-info-types) | configuration | 0.80 | Explicitly a list table of sensitive information types and whether each is scanned by default. This is detailed, product-specific configuration/reference data that LLMs are unlikely to know from training. | | [Troubleshoot connectors guide](https://learn.microsoft.com/en-us/azure/defender-for-cloud/troubleshoot-connectors) | troubleshooting | 0.80 | Explicit troubleshooting guide for connectors; expected to map symptoms to causes and resolutions, possibly with error messages and diagnostic steps. 
| | [Troubleshoot express or classic configuration](https://learn.microsoft.com/en-us/azure/defender-for-cloud/troubleshoot-vulnerability-findings) | troubleshooting | 0.80 | Explicit troubleshooting article for configuration issues in express/classic modes, mapping configuration problems to causes and fixes. | @@ -178,7 +180,6 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot), | [Container recommendations](https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-container) | security | 0.78 | Provides a full reference of container security recommendations in Defender for Cloud, which is specialized security guidance for containerized workloads beyond generic best practices. | | [Express configuration PowerShell wrapper module](https://learn.microsoft.com/en-us/azure/defender-for-cloud/express-configuration-sql-commands) | configuration | 0.78 | Reference for a PowerShell module (commands and usage) specific to SQL VA express configuration, including function names and usage patterns. | | [Networking recommendations](https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-networking) | security | 0.78 | Lists all networking security recommendations (NSGs, firewalls, etc.) in Defender for Cloud. These are product-specific security hardening instructions beyond generic networking security concepts. | -| [Prerequisites](https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-storage) | security | 0.78 | Explicitly lists prerequisites and permissions; likely includes specific RBAC roles, required actions, and security-related configuration details unique to Defender for Storage. | | [Syntax](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-cli-syntax) | integrations | 0.78 | The page documents product-specific CLI commands, syntax, and parameters for the Microsoft Defender for Cloud container security scanner. 
It includes command names, flags, and usage patterns unique to this product, which are not generally known from training data. This aligns best with the integrations sub-skill, as it focuses on concrete CLI/API parameter usage rather than generic concepts. | | [Enable with Infrastructure as Code](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-infrastructure-as-code-enablement) | deployment | 0.76 | Infrastructure-as-code enablement article; likely contains ARM/Bicep/Terraform parameters and deployment constraints, including timing detail (up to 24 hours) that is product-specific. | | [Enable with PowerShell](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-powershell-enablement) | deployment | 0.76 | PowerShell-based deployment article; likely includes cmdlet names, parameters, and product-specific deployment behavior including the 24-hour protection delay. | @@ -186,7 +187,6 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot), | [Enable with the Azure portal](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-azure-portal-enablement) | deployment | 0.76 | Portal-based enablement; includes product-specific deployment behavior (subscription-level coverage and up to 24-hour delay for new accounts) and configuration nuances. | | [Vulnerability assessment rules](https://learn.microsoft.com/en-us/azure/defender-for-cloud/sql-azure-vulnerability-assessment-rules) | security | 0.76 | Detailed list of built-in VA rules with titles and descriptions, representing product-specific security checks and best-practice mappings. | | [AI recommendations](https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-ai) | security | 0.75 | Reference table of AI security recommendations with specific recommendation IDs and semantics; product-specific security guidance not derivable from general training. 
| -| [API recommendations](https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-api) | security | 0.75 | Reference table of API/API Management security recommendations; contains concrete recommendation definitions unique to Defender for Cloud. | | [Alerts schemas](https://learn.microsoft.com/en-us/azure/defender-for-cloud/alerts-schemas) | configuration | 0.75 | Alert schema reference typically lists field names, types, and structures for different export paths, which are detailed configuration/integration schemas. | | [Configure the Microsoft Security DevOps Azure DevOps extension](https://learn.microsoft.com/en-us/azure/defender-for-cloud/configure-azure-devops-extension) | configuration | 0.75 | Describes configuring the MSDO Azure DevOps extension and lists specific tools; such articles include YAML/task parameters, configuration options, and tool-specific settings unique to this product. | @@ -216,7 +216,6 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot), | [Authentication](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-cli-authentication) | security | 0.70 | Details connector-based and token-based authentication methods for Azure DevOps and GitHub, including scopes and token handling, which are product-specific security configuration patterns. 
| | [Authentication architecture for AWS connectors](https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-authentication-architecture-aws) | security | 0.70 | Explains federated auth, STS, and trust setup for AWS connectors; product-specific authentication architecture and permissions, fitting security configuration patterns. | | [Authentication architecture for GCP connectors](https://learn.microsoft.com/en-us/azure/defender-for-cloud/authentication-architecture-google-cloud) | security | 0.70 | Details federated identity, STS, and service account impersonation; product-specific auth configuration and security model. | -| [Automate onboarding using PowerShell](https://learn.microsoft.com/en-us/azure/defender-for-cloud/powershell-onboarding) | configuration | 0.70 | PowerShell onboarding uses specific cmdlets, parameters, and scripts that are detailed configuration knowledge for this product. | | [Azure Policy built-ins](https://learn.microsoft.com/en-us/azure/defender-for-cloud/policy-reference) | configuration | 0.70 | Lists specific built-in policy definitions tied to Defender for Cloud; these are concrete configuration artifacts (names, effects, scopes) that are product-specific and not generic knowledge. | | [Azure portal vs Defender portal feature comparison](https://learn.microsoft.com/en-us/azure/defender-for-cloud/azure-portal-vs-defender-portal-comparison) | decision-making | 0.70 | Feature comparison between Azure portal and Defender portal experiences; such pages typically include comparison tables of capabilities and guidance on when to use each portal, which fits decision-making criteria. | | [Binary drift detection and blocking](https://learn.microsoft.com/en-us/azure/defender-for-cloud/binary-drift-detection) | security | 0.70 | Covers Defender for Cloud’s specific binary drift detection feature, including how it evaluates processes vs. original images and how to configure blocking, which is product-specific security behavior. 
| @@ -234,6 +233,7 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot), | [Create custom standards and recommendations](https://learn.microsoft.com/en-us/azure/defender-for-cloud/create-custom-recommendations) | configuration | 0.70 | Creating custom standards/recommendations will involve specific policy definitions, parameters, and configuration structures unique to Defender for Cloud. | | [Customize sensitivity settings](https://learn.microsoft.com/en-us/azure/defender-for-cloud/data-sensitivity-settings) | configuration | 0.70 | Article is specifically about configuring data sensitivity settings. Likely includes setting names, options, and their effects, which are product-specific configuration details. | | [Data recommendations](https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-data) | security | 0.70 | The page is a reference table of all Microsoft Defender for Cloud data security recommendations. It enumerates product-specific recommendation IDs, names, and details that are unique to Defender for Cloud’s security model. This is expert, product-specific security guidance rather than generic concepts, but it does not focus on limits, configuration parameters, or troubleshooting patterns. | +| [Defender for Cloud support matrices](https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-cloud) | deployment | 0.70 | Support matrix for Azure environments, services, and client OSs is product-specific interoperability data that affects where and how Defender for Cloud can be deployed and used; this is not generic knowledge and is maintained as a matrix unique to the service. 
| | [Defender for Containers deployment overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-containers-deployment-overview) | decision-making | 0.70 | Deployment overview that helps choose the right deployment path across Kubernetes environments; likely includes comparison of options and guidance for different scenarios. | | [Deploy the Azure Monitor agent](https://learn.microsoft.com/en-us/azure/defender-for-cloud/auto-deploy-azure-monitoring-agent) | configuration | 0.70 | Use of AMA in Defender will involve agent configuration options, data collection rules, and settings unique to this integration. | | [Deprecated recommendations](https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-deprecated) | security | 0.70 | Lists deprecated security recommendations with their identifiers; this is product-specific security metadata useful for posture management. | @@ -243,6 +243,7 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot), | [Enable Cloud infrastructure entitlement management (CIEM)](https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-permissions-management) | security | 0.70 | Enablement article for CIEM is likely to include specific configuration steps, scopes, and permission requirements unique to Defender for Cloud’s entitlement model, which qualify as product-specific security configuration knowledge. | | [Enable agentless machine scanning](https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-agentless-scanning-vms) | configuration | 0.70 | Describes enabling agentless scanning, including when it is enabled by default and how to turn it on; likely includes specific setting names and plan-based behavior. 
|
 | [Enable and configure Defender for Storage (classic)](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-classic-enable) | configuration | 0.70 | Covers enabling/configuring Defender for Storage (classic) using PowerShell, REST API, and templates, which necessarily includes specific configuration parameters and request schemas. |
+| [Enable on Azure](https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-defender-for-databases-azure) | configuration | 0.70 | Enablement/how-to article for a specific Defender for Cloud plan. These pages typically include product-specific configuration steps (which resource types are supported, where to enable the plan, plan names/toggles, and required settings) rather than just conceptual security overview. That makes it primarily configuration-focused rather than generic guidance. |
 | [Enable programmatically](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-containers-azure-enable-programmatically) | integrations | 0.70 | Describes programmatic deployment of Defender for Containers components on AKS using Azure CLI, REST API, and ARM templates. This typically includes product-specific API/CLI parameters, resource types, and configuration values that qualify as integration & coding patterns beyond generic tutorial content. |
 | [Enable via portal](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-containers-aws-enable-portal) | deployment | 0.70 | Portal-based enablement for EKS; includes AWS-specific deployment steps and component selection details. |
 | [Enable via portal](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-containers-gcp-enable-portal) | deployment | 0.70 | Portal-based enablement for GKE; includes GCP-specific deployment steps and component selection details. |
@@ -259,7 +260,7 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot),
 | [Microsoft Security Private Link for Microsoft Defender for Cloud](https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-private-links) | configuration | 0.70 | Describes how to enable and wire Microsoft Security Private Link to Defender for Cloud using specific Azure resources (Security Private Link resource, private endpoints, VNet connectivity). This is product-specific configuration guidance rather than a generic concept, and includes concrete setup details for secure traffic routing over the Microsoft backbone. |
 | [Migrate from the Microsoft Monitoring Agent or the Azure Monitor Agent](https://learn.microsoft.com/en-us/azure/defender-for-cloud/migrate-file-integrity-monitoring) | deployment | 0.70 | Migration guide from MMA/AMA to Defender for Endpoint-based FIM; likely includes product-specific migration steps, constraints, and sequencing. |
 | [Migrate to AMA](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-sql-autoprovisioning) | deployment | 0.70 | Covers migration from deprecated MMA/Log Analytics agent to AMA autoprovisioning; likely includes product-specific migration steps, constraints, and deployment requirements. |
-| [Migrating to the new plan](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-classic-migrate) | decision-making | 0.70 | Explicit migration article with plan comparison, pricing model explanation, and migration steps; supports decision-making and migration path selection between pricing models. |
+| [Migrating to the new plan](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-classic-migrate) | decision-making | 0.70 | Migration and plan-selection guidance between Defender for Storage (classic) and the new plan. Such docs typically include comparison of pricing models, capabilities, and migration steps, helping users decide when and how to move. This is product-specific decision-making and migration guidance that goes beyond generic knowledge. |
 | [Modify plan settings](https://learn.microsoft.com/en-us/azure/defender-for-cloud/configure-servers-coverage) | configuration | 0.70 | Explains how to view protected machines and adjust plan settings; product-specific configuration options for coverage. |
 | [Onboard 42Crunch (preview)](https://learn.microsoft.com/en-us/azure/defender-for-cloud/onboarding-guide-42crunch) | integrations | 0.70 | Technical onboarding guide will contain concrete configuration parameters, tokens, and integration steps unique to 42Crunch. |
 | [Onboard Bright Security (preview)](https://learn.microsoft.com/en-us/azure/defender-for-cloud/onboarding-guide-bright) | integrations | 0.70 | Technical onboarding for Bright Security will include specific configuration options and integration parameters. |
@@ -267,6 +268,7 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot),
 | [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-cli-overview) | configuration | 0.70 | CLI overview for orchestrating scans in pipelines implies product-specific commands, flags, and behaviors that constitute configuration/usage patterns beyond generic CLI knowledge. |
 | [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/integration-defender-for-endpoint) | integrations | 0.70 | Integration article between Defender for Endpoint/Vulnerability Management and Defender for Cloud; likely includes configuration parameters, plan requirements, and data flow specifics. |
 | [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/runtime-gated-overview) | security | 0.70 | Details how gated deployment uses scan results and admission control to enforce image policies across specific registries and clusters, which is product-specific security behavior. |
+| [Protect VMs](https://learn.microsoft.com/en-us/azure/defender-for-cloud/tutorial-protect-resources) | security | 0.70 | Tutorial for Microsoft Defender for Servers that walks through configuring just-in-time VM access and application control. These are product-specific security configurations (JIT policy settings, application allowlisting behavior) that go beyond generic security concepts and represent concrete, service-specific security guidance. |
 | [Queries - Azure Resource Graph](https://learn.microsoft.com/en-us/azure/defender-for-cloud/resource-graph-samples) | integrations | 0.70 | Sample queries page contains concrete KQL query patterns against Defender-specific tables and resource types, which are product-specific integration/query patterns. |
 | [Query software bill of materials (SBOM)](https://learn.microsoft.com/en-us/azure/defender-for-cloud/query-software-bill-of-materials) | configuration | 0.70 | Describes how to query SBOM results via cloud security explorer; includes specific query patterns, fields, and relationships unique to Defender for Cloud's SBOM graph. |
 | [Regional availability for Defender for Cloud plans](https://learn.microsoft.com/en-us/azure/defender-for-cloud/regional-availability) | decision-making | 0.70 | Page provides region-by-region availability of Microsoft Defender for Cloud plans and supported services across Azure, AWS, and GCP, including specific retirement dates and onboarding restrictions for Azure China. This is expert, product-specific data used to decide where and how the service can be used, fitting decision-making around regional support and lifecycle rather than generic limits or configuration. |
@@ -300,32 +302,35 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot),
 | [Disable vulnerabilities on images (secure score)](https://learn.microsoft.com/en-us/azure/defender-for-cloud/disable-vulnerability-findings-containers-secure-score) | security | 0.68 | Operational, product-specific guidance on creating and managing disable rules that affect secure score and findings; includes concrete UI/field-level behavior and side effects that go beyond generic concepts. |
 | [Kubernetes data plane hardening](https://learn.microsoft.com/en-us/azure/defender-for-cloud/kubernetes-workload-protections) | security | 0.68 | Page focuses on Defender for Cloud security recommendations specifically for Kubernetes data plane hardening. These are product-specific security controls and recommendations (what to enable, how to use the recommendations) that go beyond generic Kubernetes security concepts, fitting the security sub-skill. |
 | [Review and remediate VA findings for Kubernetes nodes](https://learn.microsoft.com/en-us/azure/defender-for-cloud/kubernetes-nodes-va) | troubleshooting | 0.68 | The page is focused on reviewing and remediating specific vulnerability findings for Kubernetes nodes in Defender for Cloud. These workflows, recommendation details, and product-specific remediation steps constitute expert, product-specific troubleshooting knowledge (symptom → findings → remediation actions), beyond generic security concepts. |
+| [Scale a deployment](https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-servers-scale) | deployment | 0.68 | Scaling Defender for Servers across Azure, AWS, GCP, and on-prem typically involves plan- and platform-specific deployment patterns and constraints (for example, which onboarding/deployment methods are supported per environment and SKU). This kind of article usually contains concrete guidance on how to roll out protection at scale, which is deployment-focused and product-specific rather than conceptual. Even without seeing the full text, the focus on 'scale your deployment' and multi-cloud/on-prem coverage strongly indicates expert deployment knowledge. |
 | [Simulate alerts for SQL servers on machines](https://learn.microsoft.com/en-us/azure/defender-for-cloud/simulate-alerts-sql-machines) | security | 0.68 | Details the SQL simulated alert feature, including the Sql-SimulateAlert extension and how telemetry is injected, which is product-specific security testing behavior. |
 | [Classic configuration](https://learn.microsoft.com/en-us/azure/defender-for-cloud/configure-vulnerability-findings-classic) | security | 0.66 | Similar to express mode but for classic configuration; includes product-specific behavior for scanning, findings, and disabling. |
 | [Configure a Docker Hub external container registry](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-containers-enable-external-registry-for-docker-hub) | configuration | 0.66 | How-to onboarding external registries typically includes specific configuration fields (URLs, credentials, scopes) and product-specific requirements, fitting the configuration/integration pattern; here it is primarily configuration of Defender for Containers. |
 | [Express configuration](https://learn.microsoft.com/en-us/azure/defender-for-cloud/configure-vulnerability-findings-express) | security | 0.66 | Describes express configuration mode for SQL VA, including how findings are used and disabled, which is product-specific security configuration behavior. |
 | [On-upload malware scanning](https://learn.microsoft.com/en-us/azure/defender-for-cloud/on-upload-malware-scanning) | security | 0.66 | Describes product-specific behavior and configuration of on-upload scanning using MDAV, including when scans occur and how it integrates with storage accounts. |
+| [API recommendations](https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-api) | security | 0.65 | Reference table of all Defender for Cloud API security recommendations is product-specific security guidance. It enumerates concrete recommendation identifiers and their meanings for API/API Management resources, which are unique to this service and used to harden configurations. |
 | [Assessment checks for endpoint detection and response solutions](https://learn.microsoft.com/en-us/azure/defender-for-cloud/endpoint-detection-response-solution-recommendations) | troubleshooting | 0.65 | Describes how to remediate EDR-related recommendations on VMs; maps specific recommendation findings (symptoms) to remediation steps (solutions). |
+| [Common questions about Defender for Cloud](https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-general) | troubleshooting | 0.65 | FAQ pages for security products typically include specific error messages, feature behaviors, and product-specific clarifications (for example, what certain alerts mean, how billing or data collection behaves in edge cases). These are symptom→explanation mappings that qualify as product-specific troubleshooting knowledge beyond generic concepts. |
 | [Common questions about data collection, agents, agentless and workspaces](https://learn.microsoft.com/en-us/azure/defender-for-cloud/prepare-deprecation-log-analytics-mma-agent) | decision-making | 0.65 | Same as index 22; deprecation article provides migration guidance and trade-offs for replacing the MMA-based features. |
 | [Configure agentless code scanning (Preview)](https://learn.microsoft.com/en-us/azure/defender-for-cloud/agentless-code-scanning) | configuration | 0.65 | Configuration article for agentless scanning is expected to include scanner settings, scopes, and product-specific options that qualify as configuration details. |
 | [Configure email notifications for alerts and attack paths](https://learn.microsoft.com/en-us/azure/defender-for-cloud/configure-email-notifications) | configuration | 0.65 | Covers fine-tuning email notifications by severity and risk levels; such pages usually include specific setting names and allowed values for notification configuration. |
 | [Configure pull request annotations](https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-pull-request-annotations) | configuration | 0.65 | Enabling PR annotations in GitHub/Azure DevOps requires specific configuration steps and settings (e.g., toggles, scopes) that are product-specific. |
 | [Defender for Cloud cost calculator](https://learn.microsoft.com/en-us/azure/defender-for-cloud/cost-calculator) | decision-making | 0.65 | Cost calculator guidance typically includes plan/tier options and cost breakdowns, helping choose configurations based on quantified cost trade-offs. |
-| [Defender for Cloud support matrices](https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-cloud) | deployment | 0.65 | Support matrix for which Azure clouds, services, and client OSes are supported, including region-specific retirement dates and onboarding constraints; this is effectively a platform/tier support matrix relevant to deployment and environment planning. |
 | [Disable vulnerability findings](https://learn.microsoft.com/en-us/azure/defender-for-cloud/disable-vulnerability-findings) | configuration | 0.65 | Shows how to ignore certain vulnerability findings; requires product-specific configuration steps, scopes, and options for disabling findings. |
 | [Discover AI models](https://learn.microsoft.com/en-us/azure/defender-for-cloud/ai-model-security) | security | 0.65 | Covers AI model security within Defender for Cloud and a specific Defender for AI Services plan. Although the summary emphasizes preview and licensing notes, this type of feature article typically includes product-specific security behaviors (how AI models are discovered/scanned, required plan/enablement details, and possibly scope of protection). Among the available categories, it best fits 'security' as it is centered on securing AI models in this product context. |
 | [Enable Data Security Posture Management](https://learn.microsoft.com/en-us/azure/defender-for-cloud/data-security-posture-enable) | configuration | 0.65 | Enablement article is likely to include specific toggles, plan requirements, and configuration steps unique to Defender for Cloud’s data posture feature, fitting the configuration category. |
-| [Enable Defender for Cloud for a management group](https://learn.microsoft.com/en-us/azure/defender-for-cloud/onboard-management-group) | configuration | 0.65 | Uses Azure Policy definitions to onboard subscriptions; likely includes policy names, parameters, and assignment scopes that are configuration details. |
 | [Enable Defender for SQL Servers on Machines](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-sql-usage) | deployment | 0.65 | Covers enabling the plan for SQL VMs and Arc SQL, including agent architecture transition; product-specific deployment/enablement details. |
 | [Enable Defender for SQL servers on machines at scale](https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-defender-sql-at-scale) | deployment | 0.65 | Describes enabling Defender for SQL on Machines at scale and mentions auto-provisioning; likely includes deployment patterns, requirements, and constraints specific to large environments. |
 | [Enable file integrity monitoring](https://learn.microsoft.com/en-us/azure/defender-for-cloud/file-integrity-monitoring-enable-defender-endpoint) | configuration | 0.65 | Described as instructions to configure File Integrity Monitoring using the Microsoft Defender for Endpoint agent and agentless machines. This implies product-specific configuration steps and settings rather than just conceptual guidance. |
 | [Enable programmatically](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-containers-arc-enable-programmatically) | configuration | 0.65 | Programmatic deployment via CLI and ARM typically includes resource types, parameter names, and example payloads that are product-specific configuration details beyond generic knowledge. |
+| [Enable sensitive data threat detection](https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-defender-for-storage-data-sensitivity) | security | 0.65 | Configuration-focused page for enabling/disabling sensitive data threat detection in Defender for Storage. While the summary is high-level, the actual doc is expected to include product-specific security configuration steps and options (for example, where and how to toggle the feature, scope of protection), which qualify as security expert knowledge beyond generic concepts. |
 | [Enable threat protection for AI services](https://learn.microsoft.com/en-us/azure/defender-for-cloud/ai-onboarding) | configuration | 0.65 | The page covers how to enable threat protection for AI services in Defender for Cloud at the subscription level, including specific onboarding steps and configuration options for Microsoft Foundry workloads. These are product-specific enablement and configuration details rather than conceptual overview, aligning with the configuration sub-skill. |
 | [Gain application and end-user context for AI alerts](https://learn.microsoft.com/en-us/azure/defender-for-cloud/gain-end-user-context-ai) | configuration | 0.65 | Explains how to add end-user and application context to AI alerts; this typically involves specific configuration fields, headers, or identity mappings unique to Defender for Cloud AI threat protection. |
 | [Gated deployment for Infrastructure as Code](https://learn.microsoft.com/en-us/azure/defender-for-cloud/gated-deployment-infrastructure-as-code) | deployment | 0.65 | Covers IaC deployment of the gated deployment admission controller, including required access to ACRs and cluster-level deployment specifics, which are product-specific deployment patterns. |
 | [Get started](https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-servers) | architecture-patterns | 0.65 | Planning guide for protecting on-premises and multicloud servers; likely includes product-specific deployment patterns, when to use each onboarding method, and trade-offs between data collection approaches and environments. |
 | [Handle Defender for Resource Manager alerts](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-resource-manager-usage) | troubleshooting | 0.65 | Alert response article will map specific Defender for Resource Manager alert types to investigation and remediation steps, which is product-specific troubleshooting guidance. |
 | [Identify SQL Servers protected by Microsoft Monitoring Agent](https://learn.microsoft.com/en-us/azure/defender-for-cloud/identify-sql-servers-protected-by-monitor-agent) | deployment | 0.65 | Guides identifying MMA-protected SQL servers and prerequisites for migration to Arc; product-specific deployment/migration considerations. |
+| [Identify and remediate attack paths](https://learn.microsoft.com/en-us/azure/defender-for-cloud/how-to-manage-attack-path) | security | 0.65 | Page describes how to identify and remediate attack paths in Microsoft Defender for Cloud, including product-specific security workflows and configuration steps for using the attack path feature. This is concrete, product-specific security guidance rather than generic concepts. |
 | [Investigate Defender for Endpoint misconfigurations](https://learn.microsoft.com/en-us/azure/defender-for-cloud/endpoint-detection-misconfiguration) | best-practices | 0.65 | Covers specific misconfiguration checks and how to remediate them via Defender for Endpoint; these are product-specific edge cases and recommended actions. |
 | [Investigate findings, recommendations, alerts](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-apis-posture) | best-practices | 0.65 | Explains how to analyze findings, alerts, and posture recommendations; will include product-specific navigation, filtering, and recommended investigation steps for API security. |
 | [Known limitations](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-portal/known-limitations) | limits-quotas | 0.65 | Known limitations page for preview; typically lists concrete unsupported scenarios and constraints specific to this portal experience, which function as product limits/quotas. |
@@ -340,7 +345,6 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot),
 | [Remediate vulnerabilities for running images (secure score)](https://learn.microsoft.com/en-us/azure/defender-for-cloud/view-and-remediate-vulnerabilities-for-images-secure-score) | best-practices | 0.65 | Classic secure score approach for images running on clusters; includes product-specific recommendations and prioritization behavior. |
 | [Respond to Microsoft Defender for DNS alerts](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-dns-alerts) | troubleshooting | 0.65 | Described as best practices for responding to DNS alerts; likely includes specific alert types and recommended investigation/remediation steps, fitting troubleshooting symptom→solution patterns. |
 | [Review security recommendations](https://learn.microsoft.com/en-us/azure/defender-for-cloud/review-security-recommendations) | security | 0.65 | Page focuses on how to review and work with Microsoft Defender for Cloud security recommendations, including product-specific recommendation details, risk factors, and prioritization. This is concrete, product-specific security guidance rather than generic concepts, but it does not center on limits, deployment, or configuration tables. |
-| [Scale a deployment](https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-servers-scale) | architecture-patterns | 0.65 | Scaling deployment across Azure, AWS, GCP, and on-premises usually involves product-specific architectural patterns and constraints for large environments. |
 | [Security policies and standards](https://learn.microsoft.com/en-us/azure/defender-for-cloud/security-policy-concept) | security | 0.65 | Security policy article describes how standards, controls, and conditions are applied to resources. These are product-specific security configuration behaviors and assessment logic that go beyond generic policy concepts. |
 | [Security solutions (legacy)](https://learn.microsoft.com/en-us/azure/defender-for-cloud/partner-integration) | integrations | 0.65 | Legacy integration reference usually lists supported partner solutions and configuration details for each, which are product-specific integration patterns. |
 | [Set up automated remediation for malware detection](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-configure-malware-scan) | configuration | 0.65 | Describes concrete ways to handle malicious files and build automated remediation based on scan result options, which implies product-specific configuration flows and option values rather than just conceptual guidance. |
@@ -354,22 +358,20 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot),
 | [Verify deployment](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-containers-gcp-verify) | troubleshooting | 0.64 | Verification guide for GKE; likely includes specific checks and expected states for Defender components, which are product-specific diagnostics. |
 | [REST API (secure score)](https://learn.microsoft.com/en-us/azure/defender-for-cloud/subassessment-rest-api) | integrations | 0.63 | Describes the specific subassessment model and fields for container VA powered by Defender Vulnerability Management, including REST schema details that are product-specific integration knowledge. |
 | [Enable on AWS (Preview)](https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-defender-for-databases-aws) | security | 0.62 | Explains enabling Defender for open-source relational databases on AWS RDS; likely includes cloud-specific configuration details and security integration steps. |
-| [Enable on Azure](https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-defender-for-databases-azure) | security | 0.62 | Enablement article for a security plan; typically includes specific Azure configuration steps, scopes, and possibly role/permission requirements unique to this plan. |
 | [Investigate Kubernetes vulnerabilities with Cloud Security Explorer](https://learn.microsoft.com/en-us/azure/defender-for-cloud/cloud-security-explorer-kubernetes-clusters) | integrations | 0.62 | Page provides concrete query examples for Cloud Security Explorer targeting Kubernetes clusters and container images. These examples encode product-specific query schema, fields, and patterns unique to Defender for Cloud’s explorer, which aligns best with integrations & coding patterns (query construction as an API-like integration). |
 | [Support and prerequisites](https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-data-security-posture-prepare) | configuration | 0.62 | Prerequisites page for data security posture management typically lists supported resource types, regions, and enabling settings. These are concrete product-specific configuration and support details that qualify as expert configuration knowledge. |
 | [Vulnerability assessment](https://learn.microsoft.com/en-us/azure/defender-for-cloud/sql-azure-vulnerability-assessment-overview) | security | 0.62 | Covers how to configure SQL VA and interpret reports for multiple services, which includes product-specific assessment behavior and configuration steps. |
+| [Automate onboarding using PowerShell](https://learn.microsoft.com/en-us/azure/defender-for-cloud/powershell-onboarding) | integrations | 0.60 | PowerShell onboarding docs for a specific Azure service typically list concrete cmdlets, parameter names, and required values (for example, enabling specific Defender plans across subscriptions). These are product-specific integration/coding patterns for using the Defender for Cloud PowerShell module, matching the integrations category. |
+| [Build queries with Cloud security explorer](https://learn.microsoft.com/en-us/azure/defender-for-cloud/how-to-manage-cloud-security-explorer) | security | 0.60 | Page explains how to build and run queries in Defender for Cloud’s cloud security explorer, including product-specific query usage and configuration patterns for proactively identifying risks. This is detailed, feature-specific security configuration/usage guidance. |
 | [Chargeback process using Azure Cost Analysis](https://learn.microsoft.com/en-us/azure/defender-for-cloud/chargeback) | decision-making | 0.60 | Chargeback process using Cost Analysis involves concrete steps and criteria for allocating costs across teams, supporting financial decision-making for service usage. |
 | [Common questions](https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-endor-labs) | troubleshooting | 0.60 | FAQ for a specific integration typically covers concrete issues, behaviors, and resolutions, which are product-specific troubleshooting knowledge. |
 | [Configure GCP plans](https://learn.microsoft.com/en-us/azure/defender-for-cloud/configure-google-plans) | decision-making | 0.60 | Guides which GCP plans to enable, with defaults and plan behaviors; helps choose plans per scenario, aligning with decision-making on SKU/plan selection. |
-| [Enable sensitive data threat detection](https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-defender-for-storage-data-sensitivity) | configuration | 0.60 | Explains enabling/disabling sensitive data threat detection via portal and at-scale methods, which typically includes specific switches, policy settings, or API parameters unique to this feature. |
 | [How Microsoft secures customer data](https://learn.microsoft.com/en-us/azure/defender-for-cloud/data-security) | security | 0.60 | Explains how Defender for Cloud manages and safeguards collected data; such pages usually include product-specific data flows, storage locations, and compliance/security behaviors. |
 | [Manage Defender for APIs](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-apis-manage) | configuration | 0.60 | Managing the plan deployment and offboarding APIs implies product-specific configuration steps and options. |
 | [Manage MCSB recommendations](https://learn.microsoft.com/en-us/azure/defender-for-cloud/manage-mcsb) | configuration | 0.60 | Managing MCSB recommendations likely includes specific controls, policy settings, and configuration options tied to this benchmark. |
 | [Partner applications](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-partner-applications) | integrations | 0.60 | Integration of partner scan results into Defender implies specific data formats, mappings, and configuration steps. |
 | [Protect network resources](https://learn.microsoft.com/en-us/azure/defender-for-cloud/protect-network-resources) | best-practices | 0.60 | Network protection article addresses concrete Defender recommendations and configuration steps for Azure network resources. |
-| [Protect storage accounts with Defender for Storage](https://learn.microsoft.com/en-us/azure/defender-for-cloud/tutorial-enable-storage-plan) | deployment | 0.60 | Tutorial for enabling Defender for Storage; likely includes concrete deployment steps and possibly plan/scope considerations specific to this service. |
 | [Remediate Cloud deployment secrets](https://learn.microsoft.com/en-us/azure/defender-for-cloud/remediate-cloud-deployment-secrets) | best-practices | 0.60 | Similar to index 38 but for cloud deployment secrets; remediation guidance is likely organized by finding type with recommended actions specific to Defender for Cloud, qualifying as product-specific best practices. |
-| [Remediate issues with VM secrets](https://learn.microsoft.com/en-us/azure/defender-for-cloud/remediate-server-secrets) | best-practices | 0.60 | Remediation article for machine secrets likely includes prioritized remediation guidance, categorization of secret severity, and product-specific recommendations on how to handle different findings—these are concrete, product-specific best practices and gotchas. |
 | [Remediate machine vulnerabilities](https://learn.microsoft.com/en-us/azure/defender-for-cloud/remediate-vulnerability-findings-vm) | best-practices | 0.60 | Explains how to act on vulnerability findings from Defender Vulnerability Management; includes product-specific workflows and remediation guidance. |
 | [Review pull request annotations](https://learn.microsoft.com/en-us/azure/defender-for-cloud/review-pull-request-annotations) | best-practices | 0.60 | Focuses on reviewing PR annotations for security; likely includes product-specific UI locations, annotation formats, and recommended review workflows that are actionable best practices. |
 | [Reviewing results](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-cli-reviewing-results) | best-practices | 0.60 | Guides how to interpret and use scan results mapped to container images; includes product-specific navigation, query patterns, and recommended workflows for handling findings. |
@@ -387,7 +389,6 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot),
 | [Deploy Defender for Servers](https://learn.microsoft.com/en-us/azure/defender-for-cloud/tutorial-enable-servers-plan) | 0.50 | Tutorial for enabling Defender for Servers; includes note about 30-day trial but otherwise mainly procedural, not a configuration or limits reference. |
 | [Connect machines with Azure Arc](https://learn.microsoft.com/en-us/azure/defender-for-cloud/quickstart-onboard-machines) | 0.45 | Quickstart for connecting on-premises machines via Azure Arc; step-by-step onboarding rather than configuration reference. |
 | [Connect machines with Defender for Endpoint](https://learn.microsoft.com/en-us/azure/defender-for-cloud/onboard-machines-with-defender-for-endpoint) | 0.45 | How-to for onboarding non-Azure servers via Defender for Endpoint; mostly procedural without detailed config parameter tables. |
-| [Connect your AWS account](https://learn.microsoft.com/en-us/azure/defender-for-cloud/quickstart-onboard-aws) | 0.45 | Quickstart for connecting AWS account; mostly procedural onboarding, not a deep configuration or troubleshooting reference. |
 | [Connect your GCP project](https://learn.microsoft.com/en-us/azure/defender-for-cloud/quickstart-onboard-gcp) | 0.45 | Quickstart for connecting GCP project; mainly onboarding steps without detailed config matrices or limits. |
 | [Deploy Defender for Azure SQL Databases](https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-sql-database-plan) | 0.45 | How-to for enabling Defender for Azure SQL Databases; mostly procedural without deep configuration matrices. |
 | [Edit DevOps connectors](https://learn.microsoft.com/en-us/azure/defender-for-cloud/edit-devops-connector) | 0.45 | Editing DevOps connectors is likely a procedural UI tutorial; summary doesn’t show detailed config parameter tables or error mappings. |
@@ -403,7 +404,6 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot),
 | [Assessing Defender for Endpoint EDR](https://learn.microsoft.com/en-us/azure/defender-for-cloud/endpoint-detection-response) | 0.40 | High-level overview of EDR capabilities with Defender for Endpoint; summary suggests conceptual content rather than detailed configuration or troubleshooting. |
 | [Assign a recommendation to an active user](https://learn.microsoft.com/en-us/azure/defender-for-cloud/active-user) | 0.40 | Assigning recommendations to active users; appears to be feature usage, not deep config or troubleshooting. |
 | [Automate responses to alerts](https://learn.microsoft.com/en-us/azure/defender-for-cloud/workflow-automations) | 0.40 | Workflow automation overview; likely a how-to without detailed configuration matrices or limits in the summary. |
-| [Build queries with Cloud security explorer](https://learn.microsoft.com/en-us/azure/defender-for-cloud/how-to-manage-cloud-security-explorer) | 0.40 | Cloud security explorer query building; summary suggests conceptual and UI-level guidance without detailed config tables or limits. |
 | [Cloud Security Explorer software queries](https://learn.microsoft.com/en-us/azure/defender-for-cloud/cloud-security-explorer-software-vulnerabilities) | 0.40 | Shows example queries for Cloud Security Explorer; likely query examples but not organized as config tables, limits, or error mappings. |
 | [Common questions](https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-defender-for-servers) | 0.40 | FAQ for Defender for Servers; likely general Q&A without detailed config tables, limits, or troubleshooting mappings. |
 | [Common questions about Defender for APIs](https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-defender-for-apis) | 0.40 | FAQ for Defender for APIs; likely mostly conceptual answers without detailed config tables or limits. |
@@ -415,14 +415,13 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot),
 | [Enable preview features](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-portal/enable-preview-features) | 0.40 | How to enable preview features; likely a procedural guide without deep configuration tables or product-specific patterns. |
 | [Enable via portal](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-containers-arc-enable-portal) | 0.40 | Portal enablement how-to; likely step-by-step UI instructions without detailed config tables, limits, or product-specific troubleshooting content. |
 | [FAQ](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-portal/integration-faq) | 0.40 | FAQ about integration and migration; likely conceptual Q&A without structured error codes, configs, or numeric thresholds. |
-| [Identify and remediate attack paths](https://learn.microsoft.com/en-us/azure/defender-for-cloud/how-to-manage-attack-path) | 0.40 | Describes attack path identification conceptually; summary doesn’t show concrete config parameters, limits, or error mappings. |
 | [Install](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-cli-install) | 0.40 | Install article likely focuses on download and basic install commands; these are generic and not configuration tables or limits, so not treated as expert configuration knowledge per the criteria. |
 | [Map IaC templates from code to cloud](https://learn.microsoft.com/en-us/azure/defender-for-cloud/iac-template-mapping) | 0.40 | IaC mapping article is more conceptual/feature-usage; summary doesn’t indicate detailed parameter tables or numeric limits. |
-| [Onboard agentless containers for CSPM](https://learn.microsoft.com/en-us/azure/defender-for-cloud/how-to-enable-agentless-containers) | 0.40 | Described as a how-to onboarding article for agentless container posture. From the summary it looks like a step-by-step enablement guide without explicit mention of configuration parameter tables, limits, or error codes. Likely procedural onboarding content rather than deep configuration, limits, or troubleshooting guidance. |
 | [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-servers-overview) | 0.40 | Overview of Defender for Servers plan; mostly conceptual feature description. |
 | [Power BI integration overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/integration-power-bi) | 0.40 | Power BI integration overview; mostly conceptual benefits and high-level description. |
 | [Protect resources with Defender CSPM](https://learn.microsoft.com/en-us/azure/defender-for-cloud/tutorial-enable-cspm-plan) | 0.40 | Tutorial for enabling CSPM plan; mostly conceptual and procedural without detailed config matrices or numeric thresholds. |
 | [Remediate code with Microsoft Security Copilot](https://learn.microsoft.com/en-us/azure/defender-for-cloud/remediate-code-with-copilot) | 0.40 | Explains using Copilot to remediate IaC misconfigurations; summary doesn’t indicate detailed configuration schemas or numeric limits. |
+| [Remediate issues with VM secrets](https://learn.microsoft.com/en-us/azure/defender-for-cloud/remediate-server-secrets) | 0.40 | Guidance on identifying and remediating machine secrets findings is likely workflow-focused; summary does not show specific error codes, configuration parameters, or numeric thresholds. |
 | [Remediate recommendations](https://learn.microsoft.com/en-us/azure/defender-for-cloud/implement-security-recommendations) | 0.40 | Describes remediation workflow for recommendations; likely step-by-step UI actions without expert-only configuration details.
| | [Review resources exempted from recommendations](https://learn.microsoft.com/en-us/azure/defender-for-cloud/review-exemptions) | 0.40 | Reviewing exempted resources is mostly workflow/UI; summary doesn’t show detailed config or limits. | | [ServiceNow integration overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/integration-servicenow) | 0.40 | ServiceNow integration overview; high-level description without explicit config parameter tables in summary. | @@ -468,13 +467,14 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot), | [AWS ECR coverage in Defender for Containers](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-twenty-five) | 0.30 | Governance capability improvements episode is about deploying governance at scale and monitoring rules, but summary does not show specific configuration parameter sets or numeric thresholds. | | [Agentless code scanning](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-sixty-three) | 0.30 | Agentless code scanning overview and enablement demo; focuses on how the capability works and its importance, not on detailed configuration parameters or limits. | | [Agentless container posture in Defender CSPM](https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-agentless-containers) | 0.30 | Agentless container posture concept article; summary emphasizes capabilities and visibility, not specific configuration settings or error-resolution mappings. | -| [Agentless machine scanning](https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-agentless-data-collection) | 0.30 | High-level description of agentless machine scanning and supported plans; summary suggests conceptual overview and plan listing without detailed configuration parameters, limits, or error mappings. 
| +| [Agentless machine scanning](https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-agentless-data-collection) | 0.30 | Conceptual description of agentless machine scanning and supported plans; no detailed limits, configuration tables, error codes, or decision matrices with quantified trade-offs. | | [Attack path analysis and enhanced risk-hunting for containers](https://learn.microsoft.com/en-us/azure/defender-for-cloud/how-to-test-attack-path-and-security-explorer-with-vulnerable-container-image) | 0.30 | Focuses on demonstrating attack path analysis using a mock vulnerable container image. The summary suggests a scenario/tutorial style article, not one organized around error codes, configuration matrices, or quantified limits. No clear evidence of product-specific config tables or troubleshooting mappings. | | [Binary drift detection in Defender for Containers](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-fifty-two) | 0.30 | Binary drift detection explanation and policy creation demo; while it shows how to create policies, the description does not indicate detailed parameter tables or numeric thresholds. | | [Cloud Detection Response experience](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-fifty-five) | 0.30 | Cloud Detection Response experience overview and scenario demos; focuses on how it works and capabilities, not on configuration tables or troubleshooting mappings. | | [Cloud secure score](https://learn.microsoft.com/en-us/azure/defender-for-cloud/secure-score-security-controls) | 0.30 | Secure score article appears to explain scoring concept and benchmark application; summary does not show numeric limits, config tables, or product-specific error mappings. 
| | [Common questions about regulatory compliance](https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-regulatory-compliance) | 0.30 | Regulatory compliance FAQ tends to describe supported standards and conceptual compliance behavior; it usually lacks concrete configuration parameters, limits, or detailed troubleshooting mappings required for the defined sub-skill types. | | [Connect an integration](https://learn.microsoft.com/en-us/azure/defender-for-cloud/connect-an-integration) | 0.30 | Generic guidance on connecting integrations; appears to be a navigation/how-to page without deep configuration reference. | +| [Connect your AWS account](https://learn.microsoft.com/en-us/azure/defender-for-cloud/quickstart-onboard-aws) | 0.30 | Quickstart for connecting an AWS account to Defender for Cloud; primarily a step-by-step onboarding/tutorial without detailed configuration parameter tables, limits, or troubleshooting mappings. | | [Defender Threat Intelligence (Defender TI)](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-twenty-three) | 0.30 | SQL VA enhancements episode mentions enabling experience and customizing baselines with scripts, but summary does not clearly indicate structured configuration tables or numeric ranges. | | [Defender for container registries (deprecated)](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-container-registries-introduction) | 0.30 | Benefits/features overview of Defender for container registries; largely conceptual without detailed configs or limits in the summary. | | [Demystifying Defender for Servers](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-twenty-seven) | 0.30 | Zero Trust and Defender for Cloud episode is about principles and best practices conceptually; summary does not show product-specific numeric or configuration details. 
| @@ -488,19 +488,18 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot), | [Explore risks to sensitive data](https://learn.microsoft.com/en-us/azure/defender-for-cloud/data-security-review-risks) | 0.30 | Explains using attack paths and security explorer for sensitive data risks; appears to be a usage guide without specific config tables, limits, or error codes. | | [Generate threat intelligence reports](https://learn.microsoft.com/en-us/azure/defender-for-cloud/threat-intelligence-reports) | 0.30 | Threat intelligence reports usage overview; summary doesn’t indicate specific configuration parameters, limits, or error-code mappings. | | [Improve regulatory compliance](https://learn.microsoft.com/en-us/azure/defender-for-cloud/regulatory-compliance-dashboard) | 0.30 | Regulatory compliance dashboard usage; summary suggests conceptual and UI guidance, not detailed config or error codes. | -| [Investigate risks with security explorer and attack paths](https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-attack-path) | 0.30 | Attack path/security explorer article describes capabilities and usage conceptually; summary does not show specific configuration values, error codes, or decision matrices with thresholds. | | [Kubernetes gated deployment](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-sixty-two) | 0.30 | Kubernetes gated deployment feature overview and demo; describes how to create a gated rule and plan adoption, but no explicit configuration tables or numeric thresholds are indicated. | | [Malware automated remediation](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-sixty-five) | 0.30 | Malware automated remediation capability overview and configuration demo; high-level feature usage without explicit configuration parameter tables or limits. 
| | [Manage and respond to security alerts](https://learn.microsoft.com/en-us/azure/defender-for-cloud/manage-respond-alerts) | 0.30 | Explains how to manage and respond to alerts; summary indicates procedural UI guidance without specific error-code mappings or configuration parameter tables. | | [Manage security incidents](https://learn.microsoft.com/en-us/azure/defender-for-cloud/incidents) | 0.30 | High-level guidance on managing incidents; summary suggests conceptual workflow, not detailed error codes, config tables, or limits. | | [Microsoft CNAPP solution](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-forty-eight) | 0.30 | CNAPP solution discussion about adoption and considerations; decision-oriented but described at conceptual level without concrete comparison tables or quantified trade-offs. | | [On-demand malware scanning](https://learn.microsoft.com/en-us/azure/defender-for-cloud/on-demand-malware-scanning) | 0.30 | From the summary, the page is a feature/benefits overview of on-demand malware scanning in Microsoft Defender for Storage. It describes what the capability does and why it's useful, but there's no evidence of specific limits/quotas, configuration parameter tables, RBAC role lists, or error-code-based troubleshooting. Without concrete numeric limits, settings, or security role details, it doesn't meet the expert-knowledge criteria for any sub-skill type. | +| [Onboard agentless containers for CSPM](https://learn.microsoft.com/en-us/azure/defender-for-cloud/how-to-enable-agentless-containers) | 0.30 | How-to onboarding for agentless container posture appears procedural; summary does not indicate detailed config tables, limits, or error-code-based troubleshooting. 
| | [Onboarding Docker Hub and JFrog Artifactory](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-fifty-seven) | 0.30 | Onboarding Docker Hub and JFrog Artifactory demo; tutorial-style onboarding without explicit parameter tables or tier constraints in the description. | | [Other threat protections](https://learn.microsoft.com/en-us/azure/defender-for-cloud/other-threat-protections) | 0.30 | Overview of additional threat protections; appears marketing/feature listing without deep technical details. | | [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/api-security-posture-overview) | 0.30 | API security posture overview; focuses on what is assessed and supported platforms, not on detailed configuration or limits. | | [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/cluster-security-dashboard) | 0.30 | Described as how to review and investigate findings in the AKS security dashboard. This is primarily UI/navigation and conceptual guidance on using the dashboard, without clear indication of product-specific configuration parameters, RBAC roles, or detailed troubleshooting mappings. | | [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/containers-software-supply-chain-security-introduction) | 0.30 | Introduction to supply chain security; appears conceptual/overview without detailed configs, limits, or decision matrices. | -| [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-app-service-introduction) | 0.30 | Benefits/features overview of Defender for App Service; summary suggests high-level content without detailed configs or troubleshooting. | | [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-containers-aws-overview) | 0.30 | EKS overview for Defender for Containers; primarily conceptual description of capabilities and architecture without detailed configuration or troubleshooting content. 
| | [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-containers-azure-overview) | 0.30 | AKS overview for Defender for Containers; mainly describes capabilities and benefits without detailed configuration, limits, or troubleshooting mappings. | | [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-containers-gcp-overview) | 0.30 | GKE overview for Defender for Containers; conceptual description of capabilities without detailed configuration or troubleshooting mappings. | @@ -509,6 +508,7 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot), | [Overview of partner integrations](https://learn.microsoft.com/en-us/azure/defender-for-cloud/partner-integrations) | 0.30 | Overview of partner integrations; summary is conceptual and marketing-style without detailed configuration or decision matrices. | | [Protect key vaults with Defender for Key Vault](https://learn.microsoft.com/en-us/azure/defender-for-cloud/tutorial-enable-key-vault-plan) | 0.30 | Tutorial to enable a plan; mostly portal steps and conceptual description of Defender for Key Vault without detailed config parameters or limits. | | [Protect resources with Defender for Resource Manager](https://learn.microsoft.com/en-us/azure/defender-for-cloud/tutorial-enable-resource-manager-plan) | 0.30 | Plan enablement tutorial for Defender for Resource Manager; likely basic onboarding steps without expert-level configuration or troubleshooting content. | +| [Protect storage accounts with Defender for Storage](https://learn.microsoft.com/en-us/azure/defender-for-cloud/tutorial-enable-storage-plan) | 0.30 | Tutorial-style deployment/enablement article for Defender for Storage; summary emphasizes conceptual description and step-by-step enabling, but does not indicate detailed configuration parameter tables, limits, or SKU matrices. Lacks clear evidence of expert-only configuration or constraints. 
| | [Review changes in file integrity monitoring](https://learn.microsoft.com/en-us/azure/defender-for-cloud/file-integrity-monitoring-review-changes) | 0.30 | Describes reviewing file integrity monitoring results; appears as feature usage guidance without specific config parameters or error-code troubleshooting. | | [Review data security alerts](https://learn.microsoft.com/en-us/azure/defender-for-cloud/review-data-security-alerts) | 0.30 | Reviewing data security alerts is described at a high level; summary does not indicate specific alert IDs, error codes, or detailed symptom-to-solution mappings. | | [Review the software inventory](https://learn.microsoft.com/en-us/azure/defender-for-cloud/software-inventory) | 0.30 | How-to UI walkthrough for viewing software inventory; no detailed configuration tables, limits, or product-specific error mappings. | @@ -525,8 +525,6 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot), | [AKS security dashboard](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-fifty-eight) | 0.25 | AKS security dashboard overview and capabilities; focuses on insights and use cases, not on numeric thresholds or configuration schemas. | | [Agentless malware detection](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-forty-four) | 0.25 | Agentless malware detection overview and user experience demo; no mention of specific configuration parameters, limits, or error-code-based troubleshooting. | | [Agentless secrets scanning for virtual machines](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-forty-two) | 0.25 | Agentless secrets scanning feature overview with prerequisites and demo; lacks explicit configuration schemas, limits, or troubleshooting structures. 
| -| [Archived release notes (older than six months)](https://learn.microsoft.com/en-us/azure/defender-for-cloud/release-notes-archive) | 0.25 | Release notes archive; while detailed, it’s historical change log rather than reusable expert configuration, limits, or troubleshooting reference. | -| [Cloud asset inventory](https://learn.microsoft.com/en-us/azure/defender-for-cloud/asset-inventory) | 0.25 | Asset inventory page description focuses on what is shown and how posture is summarized; no indication of specific configuration parameters, quotas, or troubleshooting flows. | | [Cloud security explorer and attack path analysis](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-twenty) | 0.25 | Regulatory compliance dashboard updates episode is UI- and feature-focused; summary does not show detailed configuration parameters or limits. | | [Code reachability analysis](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-sixty) | 0.25 | Code reachability analysis explanation and demo; conceptual prioritization of vulnerabilities and use of attack path, not detailed configuration or limits. | | [Common questions about DevOps security](https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-defender-for-devops) | 0.25 | DevOps Security FAQ summary; likely mixed conceptual Q&A without clear evidence of detailed limits, configuration tables, or error-code mappings from the provided text. | @@ -548,7 +546,6 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot), | [Native integration with ServiceNow](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-forty-one) | 0.25 | Describes native integration with ServiceNow and shows how to configure it, but at a tutorial/demo level; no explicit parameter tables, limits, or error mappings are indicated. 
| | [New secure score](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-sixty-six) | 0.25 | New secure score explanation and portal access; conceptual description of calculation and dashboard enhancements, not detailed numeric thresholds or decision matrices. | | [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/ai-threat-protection) | 0.25 | AI threat protection overview; summary is high-level description of capabilities and threats, not detailed configuration, limits, or troubleshooting. | -| [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-databases-overview) | 0.25 | High-level overview of Defender for Databases offerings; no indication of detailed configuration, limits, or troubleshooting content. | | [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-dns-introduction) | 0.25 | Benefits/features overview for Defender for DNS; summary shows conceptual description of monitoring behavior without detailed configs, limits, or error mappings. | | [Overview Cloud Security Posture Management (CSPM)](https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-cloud-security-posture-management) | 0.25 | Conceptual overview of CSPM; describes capabilities and benefits without detailed configuration parameters, limits, or troubleshooting mappings. | | [Remediate security recommendations with governance](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-fifteen) | 0.25 | Defender for Servers integration with MDE episode covers architecture and deployment options but summary does not show concrete config parameter lists or error mappings. 
| @@ -557,33 +554,35 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot), | [About Data Security Posture Management](https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-data-security-posture) | 0.24 | Data security posture management overview; describes challenges and high-level capabilities without indicating specific configuration settings or limits. | | [Microsoft Cloud Security Benchmark (MCSB)](https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-regulatory-compliance) | 0.22 | Explains Microsoft cloud security benchmark and how it is applied; summary suggests conceptual mapping of standards, not concrete configuration or troubleshooting content. | | [Regulatory compliance standards](https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-regulatory-compliance-standards) | 0.22 | Regulatory compliance overview; likely maps standards conceptually without detailed product-specific configuration or numeric thresholds. | +| [Archived release notes (older than six months)](https://learn.microsoft.com/en-us/azure/defender-for-cloud/release-notes-archive) | 0.20 | Release notes archive listing historical feature changes and deprecations; does not focus on structured limits, configuration matrices, troubleshooting mappings, or other categorized expert patterns defined in the sub-skill types. | | [Capabilities to counter identity-based supply chain attacks](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-thirty-seven) | 0.20 | Episode on identity-based supply chain attacks with conceptual explanations and mitigation recommendations; no specific error codes, RBAC roles, or configuration parameters are described. | +| [Cloud asset inventory](https://learn.microsoft.com/en-us/azure/defender-for-cloud/asset-inventory) | 0.20 | Overview of cloud asset inventory and posture visualization; lacks product-specific numeric limits, configuration parameters, or troubleshooting mappings. 
| | [Cloud overview dashboard](https://learn.microsoft.com/en-us/azure/defender-for-cloud/cloud-infrastructure-dashboard) | 0.20 | Cloud overview dashboard article is primarily UI/experience description and posture visualization; summary does not indicate detailed settings tables, limits, or error-resolution mappings. | | [Code to runtime enrichment for recommendations](https://learn.microsoft.com/en-us/azure/defender-for-cloud/code-to-runtime-mapping) | 0.20 | The page describes the concept of code-to-runtime visibility and how it helps trace security issues across the SDLC. From the summary, it appears conceptual and workflow-oriented without detailed configuration tables, specific error codes, or concrete parameter values, so it does not clearly meet any expert-knowledge sub-skill criteria. | | [Common questions about CSPM](https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-cspm) | 0.20 | FAQ about Defender for Cloud CSPM appears to be conceptual and explanatory (what CSPM is, general benefits, common questions). No indication of specific numeric limits, configuration parameter tables, RBAC role lists, or detailed error-code-based troubleshooting that would qualify as expert knowledge under the defined categories. | | [Common questions about Copilot in Defender for Cloud](https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-copilot) | 0.20 | Copilot FAQ for Defender for Cloud is mainly descriptive (capabilities, availability, high-level behavior) and does not typically include numeric limits, configuration tables, or detailed troubleshooting content that would qualify as expert knowledge under the given categories. | -| [Common questions about Defender for Cloud](https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-general) | 0.20 | General FAQ about Defender for Cloud; primarily conceptual and product-overview style questions without detailed limits, configuration tables, or error-code-based troubleshooting. 
| | [Common questions about Defender for Storage](https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-defender-for-storage) | 0.20 | FAQ summary only; likely mixed conceptual and pricing questions without clear indication of detailed limits, configs, or error codes from the provided text. | | [Common questions about Defender for Storage (classic)](https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-defender-for-storage-classic) | 0.20 | Classic plan FAQ summary only; no evidence of detailed limits, configuration tables, or error-code troubleshooting in the provided text. | | [Data security dashboard](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-thirty-eight) | 0.20 | Data security dashboard overview and demo; focuses on purpose, prerequisites, and supported data types, not on detailed configuration tables or limits. | | [Defender CSPM support for GCP and more updates](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-thirty-six) | 0.20 | Video episode description about CSPM support for GCP and multicloud strategy; high-level feature discussion without concrete limits, configs, or troubleshooting details. | | [Defender for APIs reaches GA](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-thirty-nine) | 0.20 | GA announcement and feature recap for Defender for APIs; marketing/overview style content without quantified limits, configs, or troubleshooting mappings. | | [Defender for Cloud glossary of terms](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-cloud-glossary) | 0.20 | Glossary of terms is conceptual reference; it defines terminology but not configuration, limits, or troubleshooting details. | -| [Defender sensor for Containers changelog](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-sensor-change-log) | 0.20 | The page is a changelog/release notes for the Defender sensor, listing version history, features, and fixes. 
It does not clearly expose structured limits, configuration parameter tables, security roles, troubleshooting mappings, or decision matrices as defined in the sub-skill types. While it may contain some product-specific details, they are not organized in a way that matches any of the targeted expert-knowledge categories. | +| [Defender sensor for Containers changelog](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-sensor-change-log) | 0.20 | A changelog/release notes page lists version history, features, and fixes. While it is detailed, it does not fit any of the defined sub-skill types (no limits tables, configuration matrices, troubleshooting mappings, or decision criteria). It’s primarily historical/update information rather than reusable expert configuration, deployment, or troubleshooting guidance. | +| [Enable Defender for Cloud for a management group](https://learn.microsoft.com/en-us/azure/defender-for-cloud/onboard-management-group) | 0.20 | Onboarding a management group with Azure Policy is primarily a procedural how-to. It’s unlikely to contain configuration parameter tables, limits, or detailed troubleshooting mappings; it mainly explains using a supplied policy definition to enable Defender for Cloud, which is standard tutorial content. | +| [Investigate risks with security explorer and attack paths](https://learn.microsoft.com/en-us/azure/defender-for-cloud/concept-attack-path) | 0.20 | Explains attack paths/security explorer conceptually; no concrete configuration values, limits, or structured troubleshooting/decision content. | | [Map container images from code to runtime](https://learn.microsoft.com/en-us/azure/defender-for-cloud/container-image-mapping) | 0.20 | Content is about mapping container images from code to runtime using Defender CSPM DevOps capabilities. 
The summary suggests a conceptual/feature overview of traceability from CI/CD to runtime, without clear indication of specific error codes, configuration tables, limits, or decision matrices. Lacking evidence of concrete parameters, thresholds, or product-specific troubleshooting mappings, it does not meet the expert-knowledge criteria for any sub-skill type. | | [Microsoft Defender for Containers in a multiCloud environment](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-nine) | 0.20 | Multicloud containers episode focuses on capabilities and onboarding demos; summary does not indicate detailed config tables or numeric thresholds. | | [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-containers-arc-overview) | 0.20 | High-level overview of Defender for Containers on Arc; primarily conceptual capabilities and scenarios without detailed configuration parameters, limits, or troubleshooting mappings. | | [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-containers-introduction) | 0.20 | Introduction to Microsoft Defender for Containers focusing on what it is and its core domains; summary does not indicate concrete limits, configuration tables, error codes, or decision matrices. | -| [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-resource-manager-introduction) | 0.20 | Benefits and features overview for Defender for Resource Manager; summary indicates conceptual security positioning rather than detailed configuration or troubleshooting content. | +| [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-resource-manager-introduction) | 0.20 | Introductory benefits and features overview for Defender for Resource Manager. The summary indicates conceptual explanation of what the service is and why it matters, without specific RBAC roles, configuration parameters, limits, or decision matrices. 
This is not expert-knowledge content per the defined categories. | | [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/file-integrity-monitoring-overview) | 0.20 | High-level overview of file integrity monitoring in Defender for Cloud; no specific configuration parameters, limits, error codes, or detailed settings are indicated in the summary. | | [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/github-advanced-security-overview) | 0.20 | Page is an overview of the GitHub Advanced Security integration with Defender for Cloud; it describes capabilities and benefits but does not expose concrete limits, configuration parameter tables, error codes, or decision matrices with quantified trade-offs. | | [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/kubernetes-nodes-overview) | 0.20 | High-level overview of Kubernetes nodes protection; no specific RBAC roles, configuration parameters, limits, or error codes are indicated in the summary. | | [Overview of Defender for Cloud in Defender portal](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-portal/defender-for-cloud-defender-portal) | 0.20 | Overview of Defender for Cloud in Defender portal; navigation and conceptual description, no detailed configs or limits. | -| [Protect VMs](https://learn.microsoft.com/en-us/azure/defender-for-cloud/tutorial-protect-resources) | 0.20 | Tutorial for enabling JIT access and application control; step-by-step usage, not deep configuration, limits, or troubleshooting content. | | [Protecting containers in GCP with Defender for Containers](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-ten) | 0.20 | Protecting containers in GCP episode mentions architecture and onboarding but summary does not show concrete configuration parameters or limits. 
| | [Risk prioritization](https://learn.microsoft.com/en-us/azure/defender-for-cloud/risk-prioritization) | 0.20 | Content describes how Defender for Cloud prioritizes risks and recommendations conceptually; no specific RBAC roles, configuration parameters, limits, error codes, or decision matrices with quantified thresholds are evident. | | [Threat landscape for containers](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-eleven) | 0.20 | Threat landscape and detections episode is about attack evolution and detections; summary does not show structured troubleshooting or configuration details. | -| [What is Microsoft Defender for Cloud?](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-cloud-introduction) | 0.20 | High-level product overview of Defender for Cloud; primarily conceptual CNAPP/CSPM/CWPP description without concrete limits, configs, or error mappings. | +| [What is Microsoft Defender for Cloud?](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-cloud-introduction) | 0.20 | High-level overview of Microsoft Defender for Cloud (CNAPP, CSPM, CWPP) and portal transition; no detailed limits, configuration tables, error codes, or decision matrices. | | [What's new in Defender for Cloud features](https://learn.microsoft.com/en-us/azure/defender-for-cloud/release-notes) | 0.20 | Release notes summarizing new and updated Defender for Cloud features without clear indication of detailed limits, configuration tables, or troubleshooting mappings; primarily change log/overview content. | | [What's new in recommendations, alerts, and incidents](https://learn.microsoft.com/en-us/azure/defender-for-cloud/release-notes-recommendations-alerts) | 0.20 | Release notes for recommendations, alerts, and incidents; likely lists new/modified items but not organized as troubleshooting guides with error-code-to-solution mappings or detailed configuration parameters. 
| | [Lessons learned from the field](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-six) | 0.15 | Lessons from the field episode is experience- and scenario-focused; summary does not indicate structured best-practices with product-specific configs or numeric guidance. | @@ -597,7 +596,9 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot), | [Live from Microsoft Ignite 2023](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-forty) | 0.10 | Ignite 2023 recap and experiences using Defender for Cloud; primarily announcements and use cases, not detailed technical guidance. | | [Microsoft Defender for Containers](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-three) | 0.10 | Containers episode description focuses on what's new, pricing model, and demo; summary lacks specific config values, limits, or error-resolution mappings. | | [New AWS connector](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-one) | 0.10 | Video episode description about AWS connector is high-level and scenario-focused; summary does not indicate concrete config tables, limits, or error mappings. | +| [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-app-service-introduction) | 0.10 | Benefits and features overview for Defender for App Service; describes what the service does and how to enable it, without detailed configuration parameters, limits, or troubleshooting content. | | [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-databases-introduction) | 0.10 | Overview of Defender for open-source relational databases; marketing/benefits style content without detailed limits, configs, or troubleshooting. 
| +| [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-databases-overview) | 0.10 | High-level overview of Defender for Databases capabilities and supported database types; no specific limits, configuration tables, error codes, or decision matrices. | | [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-key-vault-introduction) | 0.10 | Introduction to Defender for Key Vault benefits and features; appears to be a conceptual/marketing overview without detailed configs or limits. | | [Overview](https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-introduction) | 0.10 | High-level introduction to Microsoft Defender for Storage describing benefits and capabilities without specific limits, configuration parameters, error codes, or decision matrices. Primarily conceptual/marketing overview, not expert configuration, troubleshooting, or decision-making content. | | [Updates from Microsoft Ignite 2024](https://learn.microsoft.com/en-us/azure/defender-for-cloud/episode-fifty-four) | 0.10 | Ignite 2024 updates and vision discussion; announcement-style content without detailed technical matrices or configs. | diff --git a/products/azure-defender-for-iot/azure-defender-for-iot.csv b/products/azure-defender-for-iot/azure-defender-for-iot.csv index d278c650..e64b9883 100644 --- a/products/azure-defender-for-iot/azure-defender-for-iot.csv +++ b/products/azure-defender-for-iot/azure-defender-for-iot.csv @@ -126,7 +126,7 @@ https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/integrati https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/integrations/logrhythm,LogRhythm,Integrate LogRhythm with Microsoft Defender for IoT - Microsoft Defender for IoT,Send Defender for IoT alerts to LogRhythm,Learn how to send Microsoft Defender for IoT alerts to LogRhythm.,This article describes how to send Microsoft Defender for IoT alerts to LogRhythm.
Integrating Defender for IoT with LogRhythm provides visibility into the security and resiliency of OT networks and a unified approach to IT and OT security.,2023-05-15T17:32:00.000Z,integration,integrations,0.75,True,"Explains how to send alerts to LogRhythm, which involves specific connector or syslog configuration unique to this product pair.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/integrations/netwitness,RSA NetWitness,Integrate RSA NetWitness with Microsoft Defender for IoT - Microsoft Defender for IoT,Send Defender for IoT alerts to RSA NetWitness,Learn how to send Microsoft Defender for IoT alerts to RSA NetWitness.,"This article describes how to send Microsoft Defender for IoT alerts to RSA NetWitness. Integrating Defender for IoT with NetWitness provides visibility into the security and resiliency of OT networks and a unified approach to IT and OT security. Note Defender for IoT plans to retire the NetWitness integration on December 1, 2025",2025-05-21T17:04:00.000Z,integration,integrations,0.75,True,"Describes how to send alerts to NetWitness, which requires specific configuration steps and parameters unique to this integration.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/integrations/on-premises-sentinel,Microsoft Sentinel (legacy),Connect Defender for IoT on-premises resources to Microsoft Sentinel (legacy) - Microsoft Defender for IoT,Connect on-premises Defender for IoT to Sentinel (legacy),This article describes the legacy method for connecting your OT sensor to Microsoft Sentinel.,"This article describes the legacy method for connecting your OT sensor to Microsoft Sentinel. Stream data into Microsoft Sentinel whenever you want to use Microsoft Sentinel's advanced threat hunting, security analytics, and automation features when responding to security incidents and threats across your network. Important This feature will be deprecated in January 2025.
If you're using a cloud connected sensor, we recommend that you connect Defender for IoT data using the Microsoft Sentinel sol",2025-01-09T18:03:00.000Z,how-to,integrations,0.7,True,Legacy connection method to Sentinel is an integration pattern with specific data streaming configuration and constraints.,unchanged -https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/integrations/send-cloud-data-to-partners,Stream cloud alerts to a partner SIEM,Stream Microsoft Defender for IoT cloud alerts to a partner SIEM - Microsoft Defender for IoT,Stream Defender for IoT cloud alerts to external SIEMs,"Learn how to send Microsoft Defender for IoT data on the cloud to a partner SIEM via Microsoft Sentinel and Azure Event Hubs, using Splunk as an example.","As more businesses convert OT systems to digital IT infrastructures, security operations center (SOC) teams and chief information security officers (CISOs) are increasingly responsible for handling threats from OT networks. We recommend using Microsoft Defender for IoT's out-of-the-box data connector and solution to integrate with Microsoft Sentinel and bridge the gap between the IT and OT security challenge.
However, if you have other security information and event management (SIEM) systems, you ca",2025-10-24T05:11:00.000Z,integration,integrations,0.8,True,"Describes sending data via Sentinel and Event Hubs to SIEMs like Splunk; this involves specific connector, Event Hub, and data format configurations unique to this integration.",unchanged +https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/integrations/send-cloud-data-to-partners,Stream cloud alerts to a partner SIEM,Stream Microsoft Defender for IoT cloud alerts to a partner SIEM - Microsoft Defender for IoT,Integrate Defender for IoT alerts with partner SIEMs,"Learn how to send Microsoft Defender for IoT data on the cloud to a partner SIEM via Microsoft Sentinel and Azure Event Hubs, using Splunk as an example.","As more businesses convert OT systems to digital IT infrastructures, security operations center (SOC) teams and chief information security officers (CISOs) are increasingly responsible for handling threats from OT networks. We recommend using Microsoft Defender for IoT's out-of-the-box data connector and solution to integrate with Microsoft Sentinel and bridge the gap between the IT and OT security challenge. However, if you have other security information and event management (SIEM) systems, you ca",2026-04-22T19:06:00.000Z,integration,integrations,0.68,True,"The page describes a product-specific integration pattern for streaming Defender for IoT cloud alerts to third-party SIEMs via Microsoft Sentinel and Azure Event Hubs (with Splunk as an example). This involves concrete, service-specific integration steps and configuration details that go beyond generic SIEM integration knowledge, fitting the integrations sub-skill.
It is not just a conceptual overview or marketing content.",updated https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/integrations/service-now-legacy,ServiceNow (legacy),Microsoft Defender for IoT integration with ServiceNow (legacy) - Microsoft Defender for IoT,Configure legacy ServiceNow integration for Defender for IoT,"In this tutorial, learn how to integrate the legacy ServiceNow integration with Microsoft Defender for IoT.","Note A new Operational Technology Manager integration is now available from the ServiceNow store. The new integration streamlines Microsoft Defender for IoT sensor appliances, OT assets, network connections, and vulnerabilities to ServiceNow's Operational Technology (OT) data model. Check the software version, as more up-to-date versions of this software may be available on the ServiceNow site. Please read ServiceNow's supporting links and docs for ServiceNow's terms of service. Microsoft D",2023-05-15T17:32:00.000Z,reference,integrations,0.7,True,"Legacy ServiceNow integration articles typically include product-specific connection parameters, endpoint URLs, authentication settings, and configuration steps unique to this integration, which qualify as expert integration knowledge beyond generic tutorials.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/iot-advanced-threat-monitoring,Investigate Defender for IoT incidents with Microsoft Sentinel,Investigate and detect threats for IoT devices - Microsoft Defender for IoT,Use Sentinel solution to detect IoT threats,"This tutorial describes how to use the Microsoft Sentinel data connector and solution for Microsoft Defender for IoT to secure your entire environment. Detect and respond to threats, including multis","The integration between Microsoft Defender for IoT and Microsoft Sentinel enables SOC teams to efficiently and effectively detect and respond to security threats across your network.
Enhance your security capabilities with the Microsoft Defender for IoT solution, a set of bundled content configured specifically for Defender for IoT data that includes analytics rules, workbooks, and playbooks. In this tutorial, you: Important The Microsoft Sentinel content hub experience is currently in PREVIEW, as is",2023-03-12T00:00:00.000Z,tutorial,integrations,0.65,True,"Focuses on using the Sentinel data connector and Defender for IoT solution; such content typically details specific analytics rules, workbooks, and connector settings unique to this integration.",unchanged https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/iot-solution,Connect Defender for IoT cloud data to Microsoft Sentinel,Connect Microsoft Defender for IoT with Microsoft Sentinel - Microsoft Defender for IoT,Connect Defender for IoT with Microsoft Sentinel,"This tutorial describes how to integrate Microsoft Sentinel and Microsoft Defender for IoT with the Microsoft Sentinel data connector to secure your entire environment. Detect and respond to threats, ","Microsoft Defender for IoT enables you to secure your entire OT and Enterprise IoT environment, whether you need to protect existing devices or build security into new innovations. Microsoft Sentinel and Microsoft Defender for IoT help to bridge the gap between IT and OT security challenges, and to empower SOC teams with out-of-the-box capabilities to efficiently and effectively detect and respond to security threats.
The integration between Microsoft Defender for IoT and Microsoft Sentinel hel",2023-03-08T00:00:00.000Z,tutorial,integrations,0.7,True,"Integration tutorial between Defender for IoT and Sentinel will include connector configuration, data types, and product-specific parameters for the data connector and solution.",unchanged diff --git a/products/azure-defender-for-iot/report.md b/products/azure-defender-for-iot/report.md index dfdd92c0..9d438d2c 100644 --- a/products/azure-defender-for-iot/report.md +++ b/products/azure-defender-for-iot/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-03-16' +generated_at: '2026-04-26' category_descriptions: troubleshooting: Diagnosing and fixing Defender for IoT micro agent and OT sensor issues, understanding/handling security and health alerts, and validating sensor/agent @@ -12,9 +12,9 @@ category_descriptions: topology and sensor placement.' limits-quotas: Info on OT trial setup, supported/retiring features, appliance catalog and requirements, and Defender for IoT data retention and storage limits. - integrations: Integrating Defender for IoT with SIEMs, firewalls, ServiceNow, Sentinel, - OT sensors, and micro agents, plus using APIs, playbooks, and workbooks to automate - alerts and manage inventory/vulnerabilities. + integrations: Integrating Defender for IoT/OT with SIEMs, firewalls, ServiceNow, + Sentinel, REST APIs, and micro agents, plus automating alerts, inventories, vulnerabilities, + and visualizations. deployment: 'Planning and deploying Defender for IoT OT sensors: hardware/VM options, appliance-specific guides, traffic mirroring, onboarding, activation, and moving IoT security resources across regions.' @@ -29,17 +29,16 @@ category_descriptions: skill_description: Expert knowledge for Azure Defender For Iot development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. 
- Use when deploying OT sensors, configuring micro agents, setting up traffic mirroring, - or integrating with Sentinel/SIEM, and other Azure Defender For Iot related development + Use when deploying OT sensors, configuring traffic mirroring, integrating with Sentinel/SIEM, + or managing IoT micro agents, and other Azure Defender For Iot related development tasks. Not for Azure Defender For Cloud (use azure-defender-for-cloud), Azure Security - (use azure-security), Azure External Attack Surface Management (use azure-external-attack-surface-management), - Azure Sentinel (use azure-sentinel). -use_when: Use when deploying OT sensors, configuring micro agents, setting up traffic - mirroring, or integrating with Sentinel/SIEM, and other Azure Defender For Iot related - development tasks. + (use azure-security), Azure IoT (use azure-iot), Azure IoT Hub (use azure-iot-hub). +use_when: Use when deploying OT sensors, configuring traffic mirroring, integrating + with Sentinel/SIEM, or managing IoT micro agents, and other Azure Defender For Iot + related development tasks. confusable_not_for: Not for Azure Defender For Cloud (use azure-defender-for-cloud), - Azure Security (use azure-security), Azure External Attack Surface Management (use - azure-external-attack-surface-management), Azure Sentinel (use azure-sentinel). + Azure Security (use azure-security), Azure IoT (use azure-iot), Azure IoT Hub (use + azure-iot-hub). 
--- # Azure Defender For Iot Crawl Report @@ -53,8 +52,8 @@ confusable_not_for: Not for Azure Defender For Cloud (use azure-defender-for-clo ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 0 -- **Unchanged**: 180 +- **Updated Pages**: 1 +- **Unchanged**: 179 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-defender-for-iot/azure-defender-for-iot.csv` @@ -75,6 +74,11 @@ confusable_not_for: Not for Azure Defender For Cloud (use azure-defender-for-clo ## Changes +### Updated Pages + +- [Stream cloud alerts to a partner SIEM](https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/integrations/send-cloud-data-to-partners) + - Updated: 2025-10-24T05:11:00.000Z → 2026-04-22T19:06:00.000Z + ## Classified Pages | TOC Title | Type | Confidence | Reason | @@ -106,7 +110,6 @@ confusable_not_for: Not for Azure Defender For Cloud (use azure-defender-for-clo | [SSL/TLS certificate requirements](https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/best-practices/certificate-requirements) | security | 0.80 | Details certificate file requirements and validation (CRL, expiration) for Defender for IoT OT sensors, which are product-specific security configuration rules. | | [SSO for sensor console login](https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/set-up-sso) | security | 0.80 | SSO setup with Microsoft Entra ID includes specific application IDs, claims, and configuration parameters for the sensor console, which are product-specific security settings. | | [Sensor health message reference](https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/sensor-health-messages) | troubleshooting | 0.80 | A reference of sensor health messages implies specific message texts and meanings, mapping symptoms to causes and actions, which is product-specific troubleshooting knowledge. 
| -| [Stream cloud alerts to a partner SIEM](https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/integrations/send-cloud-data-to-partners) | integrations | 0.80 | Describes sending data via Sentinel and Event Hubs to SIEMs like Splunk; this involves specific connector, Event Hub, and data format configurations unique to this integration. | | [Vulnerability management](https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/api/sensor-vulnerability-apis) | integrations | 0.80 | Vulnerability management API reference exposes product-specific endpoints and response schemas aligned with integration patterns. | | [Which appliances do I need?](https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/ot-appliance-sizing) | decision-making | 0.80 | Helps choose physical vs virtual appliances and hardware profiles based on monitoring needs; product-specific sizing and selection guidance. | | [Work with Defender-IoT-micro-agent for Eclipse ThreadX (Preview)](https://learn.microsoft.com/en-us/azure/defender-for-iot/device-builders/how-to-threadx-security-module) | configuration | 0.80 | Explicitly about configuring/customizing the ThreadX micro agent for network, bandwidth, and memory; likely includes product-specific parameters and ranges. | @@ -169,6 +172,7 @@ confusable_not_for: Not for Azure Defender For Cloud (use azure-defender-for-clo | [Validate after installation](https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/ot-deploy/post-install-validation-ot-software) | troubleshooting | 0.70 | Post-install validation via UI/CLI for system health; likely includes checks, commands, and symptom→resolution guidance specific to this product. 
| | [Visualize data with workbooks](https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/workbooks) | integrations | 0.70 | Using Azure Monitor workbooks with Defender for IoT requires product-specific queries and workbook configuration tied to this data source. | | [YS-techsystems YS-FIT2 (Rugged MIL-STD-810G)](https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/appliance-catalog/ys-techsystems-ys-fit2) | deployment | 0.70 | Includes deployment and installation details for a specific hardware platform, which are product-specific deployment patterns. | +| [Stream cloud alerts to a partner SIEM](https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/integrations/send-cloud-data-to-partners) | integrations | 0.68 | The page describes a product-specific integration pattern for streaming Defender for IoT cloud alerts to third-party SIEMs via Microsoft Sentinel and Azure Event Hubs (with Splunk as an example). This involves concrete, service-specific integration steps and configuration details that go beyond generic SIEM integration knowledge, fitting the integrations sub-skill. It is not just a conceptual overview or marketing content. | | [Analyze OT programming details and changes](https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/how-to-analyze-programming-details-changes) | security | 0.65 | Investigating programming events and code changes on OT devices is a product-specific forensic/security workflow, likely with unique event types and fields. | | [Audit user activity](https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/track-user-activity) | security | 0.65 | Auditing user activity typically documents specific log locations, event types, and access patterns unique to Defender for IoT, which are security-related expert details. 
| | [Azure connection methods](https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/architecture-connections) | architecture-patterns | 0.65 | Describes supported architecture models for connecting sensors to Azure; product-specific connection patterns and when to use them. | diff --git a/products/azure-deployment-environments/azure-deployment-environments.csv b/products/azure-deployment-environments/azure-deployment-environments.csv index 84f9d7a2..9a07e920 100644 --- a/products/azure-deployment-environments/azure-deployment-environments.csv +++ b/products/azure-deployment-environments/azure-deployment-environments.csv @@ -2,13 +2,13 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_sk https://learn.microsoft.com/en-us/azure/deployment-environments/best-practice-catalog-structure,Best practices for designing catalogs,Best practices for structuring an Azure Deployment Environments catalog - Azure Deployment Environments,Apply catalog structure best practices in ADE,Get guidelines for structuring an Azure Deployment Environments catalog to help ensure efficient caching.,This article describes best practices for structuring an Azure Deployment Environments catalog.,2025-03-26T22:03:00.000Z,best-practice,best-practices,0.7,True,Explicit best-practices article for catalog structure with ADE-specific guidance likely including concrete recommendations and gotchas.,unchanged https://learn.microsoft.com/en-us/azure/deployment-environments/concept-azure-developer-cli-with-deployment-environments,Use Azure Developer CLI (azd) with ADE,Use Azure Developer CLI with Azure Deployment Environments - Azure Deployment Environments,,Understand how ADE and `azd` work together to provision application infrastructure and deploy application code to the new infrastructure.,"In this article, you learn about Azure Developer CLI (azd) and how it works with Azure Deployment Environments (ADE) to simplify the process of provisioning application 
infrastructure and deploying application code to the new infrastructure. azd is an open-source command-line tool that provides developer-friendly commands that map to key stages in your workflow. You can install azd locally on your machine or use it in other environments. With ADE, you can create environments from an environment def",2024-11-19T18:02:00.000Z,concept-article,,0.4,False,Conceptual article on how azd and ADE work together; integration overview without detailed parameter tables.,unchanged
Azure role-based access control (RBAC) provides built-in role definitions that outline the perm",2025-12-30T23:13:00.000Z,concept-article,security,0.8,True,Describes built-in roles and how they map to org roles; RBAC role names and scopes are product-specific security configuration.,unchanged -https://learn.microsoft.com/en-us/azure/deployment-environments/concept-environment-yaml,Parameters and data types in environment.yaml,environment.yaml schema - Azure Deployment Environments,Configure environment.yaml schema for ADE definitions,Learn how to use environment.yaml to define parameters in your environment definition.,Azure Deployment Environments environment definitions are infrastructure as code (IaC) that are written in Bicep or Terraform and stored in repositories. You can modify and adapt environment definitions for your requirements and then use them to create a deployment environment on Azure. The environment.yaml schema defines and describes the types of Azure resources included in environment definitions.,2025-03-27T22:02:00.000Z,concept-article,configuration,0.8,True,"Schema article for environment.yaml will list specific fields, allowed values, and structure—product-specific configuration reference.",unchanged +https://learn.microsoft.com/en-us/azure/deployment-environments/concept-environment-yaml,Parameters and data types in environment.yaml,environment.yaml schema - Azure Deployment Environments,Configure environment.yaml schema for Azure Deployment Environments,Learn how to use environment.yaml to define parameters in your environment definition.,"Azure Deployment Environments environment definitions are infrastructure as code (IaC) templates written in ARM, Bicep, Terraform, or other frameworks supported through the ADE extensibility model, and stored in repositories. You can modify and adapt environment definitions for your requirements and then use them to create a deployment environment on Azure. 
The environment.yaml schema defines and describes the types of Azure resources included in environment definitions.",2026-04-23T22:13:00.000Z,concept-article,configuration,0.8,True,"Describes the environment.yaml schema used to define environment parameters; schema docs typically include specific field names, allowed values, and structure, which are product-specific configuration details.",updated https://learn.microsoft.com/en-us/azure/deployment-environments/concept-environments-key-concepts,Key concepts,Key Concepts and Roles - Azure Deployment Environments,,"Learn the key concepts, role definitions, features, and terminology for Azure Deployment Environments.","In this article, you learn about the key concepts and components of Azure Deployment Environments. This knowledge helps you more effectively deploy environments for your scenarios. As you learn about Deployment Environments, you might encounter components of Microsoft Dev Box, a complementary service that shares certain architectural components. Dev Box provides developers with a cloud-based development workstation, called a dev box, which is configured with the tools they need for their work. Th",2025-07-30T17:10:00.000Z,concept-article,,0.1,False,Conceptual key concepts and roles; primarily terminology and roles mapping.,unchanged https://learn.microsoft.com/en-us/azure/deployment-environments/concept-extensibility-model,What is the ADE Extensibility Model?,ADE Extensibility Model - Azure Deployment Environments,,Learn how the ADE extensibility model enables you to use custom container images to create deployment environments.,"Azure Deployment Environments (ADE) enables you to provide a curated set of infrastructure-as-code (IaC) templates that your development teams use to perform deployments. ADE offers power and flexibility for organizations through an extensibility model which enables platform engineers to define preapproved templates using their preferred IaC framework.
The following diagram shows the full workflow for ADE. The catalog stores IaC templates, which reference container images for use in deployments.",2025-01-21T23:02:00.000Z,concept-article,,0.4,False,Conceptual description of ADE extensibility model; workflow-level overview rather than detailed config or troubleshooting.,unchanged -https://learn.microsoft.com/en-us/azure/deployment-environments/concept-reliability-deployment-environments,Reliability in Azure Deployment Environments,Reliability and Availability in Azure Deployment Environments - Azure Deployment Environments,,Learn how Azure Deployment Environments ensure reliability with availability zones and disaster recovery strategies. Discover best practices for resiliency.,"This article describes reliability support in Azure Deployment Environments. It covers intra-regional resiliency with availability zones and inter-region resiliency with disaster recovery. For a more detailed overview of reliability in Azure, see Azure reliability.",2026-01-10T06:10:00.000Z,concept-article,,0.3,False,Reliability/availability overview; likely conceptual resiliency description without detailed numeric thresholds or config matrices.,unchanged +https://learn.microsoft.com/en-us/azure/deployment-environments/concept-reliability-deployment-environments,Reliability in Azure Deployment Environments,Reliability and Availability in Azure Deployment Environments - Azure Deployment Environments,Apply resiliency best practices in Azure Deployment Environments,Learn how Azure Deployment Environments ensure reliability with availability zones and disaster recovery strategies. Discover best practices for resiliency.,"This article describes reliability support in Azure Deployment Environments. It covers intra-regional resiliency with availability zones and inter-region resiliency with disaster recovery.
For a more detailed overview of reliability in Azure, see Azure reliability.",2026-04-23T22:13:00.000Z,concept-article,best-practices,0.65,True,Focuses on reliability and availability with availability zones and disaster recovery; likely includes ADE-specific resiliency recommendations and patterns beyond generic reliability concepts.,updated https://learn.microsoft.com/en-us/azure/deployment-environments/configure-environment-definition,Add & configure an environment definition,Add and Configure an Environment Definition in a Catalog - Azure Deployment Environments,Configure ADE environment definitions and container images,Learn how to add and configure an environment definition to use in Azure Deployment Environments projects.,"This article explains how to add, update, or delete an environment definition in an Azure Deployment Environments catalog. It also explains how to reference a container image to deploy your environment. In Deployment Environments, you use a catalog to provide your development teams with a curated set of predefined infrastructure as code (IaC) templates called environment definitions. An environment definition is composed of at least two files: Your development teams use the environment definitions th",2025-07-30T22:07:00.000Z,how-to,configuration,0.7,True,"Explains environment definition composition and configuration, including referencing container images—product-specific config details.",unchanged -https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-authenticate,Authenticate to REST APIs,Authenticate to Azure Deployment Environments REST APIs - Azure Deployment Environments,Authenticate to ADE REST APIs using Azure CLI,"Learn how to authenticate to Azure Deployment Environments REST APIs, as administrator or developer, by using Azure CLI.","In this article, you learn how to authenticate to Microsoft Dev Box REST APIs by using Azure CLI. 
Authentication is a crucial step for accessing both administrator (control plane) and developer (data plane) APIs. This guide walks you through retrieving an access token from Microsoft Entra ID, understanding the token's structure and validity, and using the bearer token to access REST APIs. By following these steps, you can securely interact with Microsoft Dev Box services. Tip Before authenticati",2025-04-08T22:02:00.000Z,concept-article,security,0.7,True,Explains obtaining and using Entra access tokens for ADE/Dev Box REST APIs; includes auth flows and token usage—security/auth configuration.,unchanged +https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-authenticate,Authenticate to REST APIs,Authenticate to Azure Deployment Environments REST APIs - Azure Deployment Environments,Authenticate to Azure Deployment Environments REST APIs securely,"Learn how to authenticate to Azure Deployment Environments REST APIs, as administrator or developer, by using Azure CLI.","In this article, you learn how to authenticate to Azure Deployment Environments REST APIs by using Azure CLI. Authentication is a crucial step for accessing both administrator (control plane) and developer (data plane) APIs. This guide walks you through retrieving an access token from Microsoft Entra ID, understanding the token's structure and validity, and using the bearer token to access REST APIs. 
By following these steps, you can securely interact with Azure Deployment Environments services.",2026-04-23T22:13:00.000Z,concept-article,security,0.8,True,"Covers authentication to ADE REST APIs using Azure CLI and Entra ID tokens; likely includes specific scopes, endpoints, and token usage patterns that are product-specific security configuration details.",updated https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-configure-azure-developer-cli-deployment-environments,Create an environment from an azd template,Create an environment with Azure Developer CLI - Azure Deployment Environments,,Use an `azd` template to provision application infrastructure and deploy application code to the new infrastructure.,"In this article, you create a new environment from an existing Azure Developer CLI (azd) compatible template by using azd. You learn how to configure Azure Deployment Environments (ADE) and azd to work together to provision application infrastructure and deploy application code to the new infrastructure. To learn the key concepts of how azd and ADE work together, see Use Azure Developer CLI with Azure Deployment Environments.",2024-11-27T08:00:00.000Z,how-to,,0.4,False,"How-to for creating an environment with azd; tutorial-style integration, not a deep config or reference.",unchanged https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-configure-catalog,Add & configure a catalog,Add a catalog from a GitHub or Azure Repos repository - Azure Deployment Environments,,Learn how to add a catalog in your Azure Deployment Environments dev center or project to provide environment definitions for your developers.,"This article explains how to add and configure a catalog for your Azure Deployment Environments dev center or project. Catalogs help you provide a set of curated infrastructure-as-code (IaC) templates, known as environment definitions for your development teams to create environments. 
You can attach your own source control repository from GitHub or Azure Repos as a catalog and specify the folder with your environment definitions. Deployment Environments scans the folder for environment definitions ",2025-02-24T08:00:00.000Z,how-to,,0.4,False,"How-to for adding a catalog; mostly procedural steps, not a full configuration reference with parameter tables.",unchanged https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-configure-devcenter-environment-types,Configure dev center environment types,Configure dev center environment types - Azure Deployment Environments,,Learn how to define the environment types available for all projects within a dev center by creating them as dev center environment types.,"In Azure Deployment Environments, you use environment types to define the environments that development teams can deploy. You have the flexibility to name the environment types according to the naming conventions that your enterprise uses: for example, sandbox, dev, test, and production. You can specify deployment settings and the permissions that are available to developers per environment type and per project. 
In this article, you learn how to:",2025-03-26T22:03:00.000Z,how-to,,0.5,False,Describes configuring environment types but summary doesn’t indicate detailed parameter tables or numeric thresholds.,unchanged @@ -24,7 +24,7 @@ https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-manage-en https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-request-quota-increase,Request a quota limit increase,Request a quota limit increase for Azure Deployment Environments resources - Azure Deployment Environments,Request ADE quota and capacity limit increases,Learn how to request a quota increase to extend the number of Deployment Environments resources you can use in your subscription.,"This guide explains how to submit a support request to increase the number of resources available to Azure Deployment Environments in your Azure subscription. If your organization uses Deployment Environments extensively, you might encounter a quota limit during deployment. When you reach the limit for a resource in your subscription, you can request a limit increase (sometimes called a capacity increase or a quota increase) to extend the number of resources available. The request process allows th",2025-03-26T17:06:00.000Z,how-to,limits-quotas,0.7,True,Quota increase guide will reference specific ADE resource limits/quotas that trigger requests—numeric limits not generally known.,unchanged https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-schedule-environment-deletion,Automatically delete an environment,Schedule an Environment for Automatic Deletion - Azure Deployment Environments,,Learn how to schedule a deletion date and time for an environment. Set an expiration date to specify when the environment and its resources are deleted.,"In this article, you learn how to set an expiration date for a deployment environment. On the expiration date, Azure Deployment Environments automatically deletes the environment and all its resources. 
If your timeline changes, you can change the expiration date. Working with many deployment environments across multiple projects can be challenging. Scheduled deletion helps you manage your environments by automatically deleting them on a specific date at a specific time. Using automatic expiratio",2025-07-30T22:07:00.000Z,how-to,,0.5,False,Describes scheduling environment deletion; likely simple UI/CLI steps without numeric limits or complex config.,unchanged https://learn.microsoft.com/en-us/azure/deployment-environments/overview-what-is-azure-deployment-environments,What is Azure Deployment Environments?,What Is Azure Deployment Environments? - Azure Deployment Environments,,"Enable developer teams to spin up infrastructure for deploying apps with templates, adding governance for Azure resource types, security, and cost.",Azure Deployment Environments empowers development teams to quickly and easily spin up app infrastructure with project-based templates that establish consistency and best practices while maximizing security. This on-demand access to secure environments speeds up the stages of the software development lifecycle in a compliant and cost-efficient way. This article provides an overview of Deployment Environments. A deployment environment is a collection of Azure infrastructure resources defined in a t,2025-07-30T17:10:00.000Z,overview,,0.1,False,"High-level overview of Azure Deployment Environments without detailed limits, configs, or patterns.",unchanged -https://learn.microsoft.com/en-us/azure/deployment-environments/quickstart-create-access-environments,Create and access an environment,Create a Deployment Environment - Azure Deployment Environments,,Learn how to create and access an environment in Azure Deployment Environments by using the developer portal.,"This quickstart describes how to create and access an environment in Azure Deployment Environments by using the developer portal. 
As a developer, you can create environments associated with a project in Deployment Environments. An environment provides preconfigured Azure resources for deploying your application.",2025-07-30T22:07:00.000Z,quickstart,,0.1,False,Basic how-to for creating an environment via portal; no deep configuration or limits.,unchanged +https://learn.microsoft.com/en-us/azure/deployment-environments/quickstart-create-access-environments,Create and access an environment,Create a deployment environment - Azure Deployment Environments,,Learn how to create and access an environment in Azure Deployment Environments by using the developer portal.,"This quickstart describes how to create and access an environment in Azure Deployment Environments by using the developer portal. As a developer, you can create environments associated with a project in Deployment Environments. An environment provides preconfigured Azure resources for deploying your application.",2026-04-23T22:13:00.000Z,quickstart,,0.2,False,"Quickstart for creating/accessing environments via the developer portal; primarily step-by-step UI usage without detailed limits, configuration matrices, or product-specific troubleshooting.",updated
A dev center is the top-level resource for Deployment Environments that contains the collection of development projects. ",2025-07-24T08:00:00.000Z,quickstart,,0.2,False,Quickstart setup guide; mostly step-by-step portal usage without detailed configuration tables or expert-only data.,unchanged https://learn.microsoft.com/en-us/azure/deployment-environments/quickstart-create-dev-center-project-azure-resource-manager,Create dev center and project (Azure Resource Manager),Create a Dev Center and Project for Deployment Environments by Using an ARM Template - Azure Deployment Environments,,Learn how to create and configure a dev center and project for Azure Deployment Environments by using an ARM template.,This quickstart describes how to use an Azure Resource Manager template (ARM template) to create and configure an Azure Deployment Environments dev center and project for creating an environment. An Azure Resource Manager template is a JavaScript Object Notation (JSON) file that defines the infrastructure and configuration for your project. The template uses declarative syntax. You describe your intended deployment without writing the sequence of programming commands to create the deployment. If y,2025-07-30T22:07:00.000Z,quickstart-arm,,0.2,False,ARM template quickstart; shows example template but not a comprehensive config reference or product-specific patterns.,unchanged
By using the ADE CLI, you can interact with information about your environment and its environment definition, upload and access files related to the environment, record additional logging for the executing operation, and upload and access the outputs of an environment deployment.
+ configuration: Defining environment.yaml schemas, configuring ADE environment definitions, + and building/using custom container images (including required CLI env vars) for + Azure Deployment Environments limits-quotas: How to view current Azure Deployment Environments quotas/capacity, understand default limits, and request increases for org, project, and environment resource usage. @@ -23,14 +22,13 @@ category_descriptions: update, and delete ADE environments. skill_description: Expert knowledge for Azure Deployment Environments development including troubleshooting, best practices, limits & quotas, security, configuration, - integrations & coding patterns, and deployment. Use when designing ADE catalogs, - environment.yaml schemas, custom images, RBAC/roles, or CI/CD image pipelines, and - other Azure Deployment Environments related development tasks. Not for Azure DevTest - Labs (use azure-devtest-labs), Azure Dev Box (use azure-dev-box), Azure Integration - Environments (use azure-integration-environments), Azure Managed Applications (use - azure-managed-applications). -use_when: Use when designing ADE catalogs, environment.yaml schemas, custom images, - RBAC/roles, or CI/CD image pipelines, and other Azure Deployment Environments related + integrations & coding patterns, and deployment. Use when defining environment.yaml, + ADE catalogs, RBAC/managed identities, custom images, or CI/CD pipelines, and other + Azure Deployment Environments related development tasks. Not for Azure DevTest Labs + (use azure-devtest-labs), Azure Dev Box (use azure-dev-box), Azure Integration Environments + (use azure-integration-environments), Azure Managed Applications (use azure-managed-applications). +use_when: Use when defining environment.yaml, ADE catalogs, RBAC/managed identities, + custom images, or CI/CD pipelines, and other Azure Deployment Environments related development tasks. 
confusable_not_for: Not for Azure DevTest Labs (use azure-devtest-labs), Azure Dev Box (use azure-dev-box), Azure Integration Environments (use azure-integration-environments), @@ -43,13 +41,13 @@ confusable_not_for: Not for Azure DevTest Labs (use azure-devtest-labs), Azure D - **Total Pages**: 32 - **Fetched**: 32 - **Fetch Failed**: 0 -- **Classified**: 14 -- **Unclassified**: 18 +- **Classified**: 15 +- **Unclassified**: 17 ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 0 -- **Unchanged**: 32 +- **Updated Pages**: 4 +- **Unchanged**: 28 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-deployment-environments/azure-deployment-environments.csv` @@ -57,17 +55,28 @@ confusable_not_for: Not for Azure DevTest Labs (use azure-devtest-labs), Azure D | Type | Count | Percentage | |------|-------|------------| -| best-practices | 1 | 3.1% | +| best-practices | 2 | 6.2% | | configuration | 4 | 12.5% | | deployment | 2 | 6.2% | | integrations | 1 | 3.1% | | limits-quotas | 1 | 3.1% | | security | 4 | 12.5% | | troubleshooting | 1 | 3.1% | -| *(Unclassified)* | 18 | 56.2% | +| *(Unclassified)* | 17 | 53.1% | ## Changes +### Updated Pages + +- [Create and access an environment](https://learn.microsoft.com/en-us/azure/deployment-environments/quickstart-create-access-environments) + - Updated: 2025-07-30T22:07:00.000Z → 2026-04-23T22:13:00.000Z +- [Parameters and data types in environment.yaml](https://learn.microsoft.com/en-us/azure/deployment-environments/concept-environment-yaml) + - Updated: 2025-03-27T22:02:00.000Z → 2026-04-23T22:13:00.000Z +- [Reliability in Azure Deployment Environments](https://learn.microsoft.com/en-us/azure/deployment-environments/concept-reliability-deployment-environments) + - Updated: 2026-01-10T06:10:00.000Z → 2026-04-23T22:13:00.000Z +- [Authenticate to REST APIs](https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-authenticate) + - Updated: 2025-04-08T22:02:00.000Z 
→ 2026-04-23T22:13:00.000Z + ## Classified Pages | TOC Title | Type | Confidence | Reason | @@ -75,17 +84,18 @@ confusable_not_for: Not for Azure DevTest Labs (use azure-devtest-labs), Azure D | [ADE CLI variables](https://learn.microsoft.com/en-us/azure/deployment-environments/reference-deployment-environment-variables) | configuration | 0.90 | Lists ADE-specific environment variables, names, and usage constraints for custom image scripts—core configuration reference. | | [Grant access to Azure Deployment Environments](https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-manage-deployment-environments-access) | security | 0.85 | Details specific built-in roles (DevCenter Project Admin, Deployment Environments User, DevCenter Owner) and scope usage—RBAC security configuration. | | [Troubleshoot custom image errors and warnings](https://learn.microsoft.com/en-us/azure/deployment-environments/troubleshoot-custom-image-logs-errors) | troubleshooting | 0.85 | Explicit troubleshooting guide with ADE-specific error log file ($ADE_ERROR_LOG), CLI commands, and symptom-to-resolution steps. | +| [Authenticate to REST APIs](https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-authenticate) | security | 0.80 | Covers authentication to ADE REST APIs using Azure CLI and Entra ID tokens; likely includes specific scopes, endpoints, and token usage patterns that are product-specific security configuration details. | | [Azure role-based access control](https://learn.microsoft.com/en-us/azure/deployment-environments/concept-deployment-environments-role-based-access-control) | security | 0.80 | Describes built-in roles and how they map to org roles; RBAC role names and scopes are product-specific security configuration. 
| | [Configure a managed identity](https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-configure-managed-identity) | security | 0.80 | Details how to configure managed identity for ADE dev centers; includes identity scopes and permissions—security configuration. | -| [Parameters and data types in environment.yaml](https://learn.microsoft.com/en-us/azure/deployment-environments/concept-environment-yaml) | configuration | 0.80 | Schema article for environment.yaml will list specific fields, allowed values, and structure—product-specific configuration reference. | +| [Parameters and data types in environment.yaml](https://learn.microsoft.com/en-us/azure/deployment-environments/concept-environment-yaml) | configuration | 0.80 | Describes the environment.yaml schema used to define environment parameters; schema docs typically include specific field names, allowed values, and structure, which are product-specific configuration details. | | [ADE CLI reference](https://learn.microsoft.com/en-us/azure/deployment-environments/reference-deployment-environment-cli) | integrations | 0.70 | CLI reference for ADE custom image commands; includes command parameters and behavior—product-specific API/CLI integration details. | | [Add & configure an environment definition](https://learn.microsoft.com/en-us/azure/deployment-environments/configure-environment-definition) | configuration | 0.70 | Explains environment definition composition and configuration, including referencing container images—product-specific config details. | -| [Authenticate to REST APIs](https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-authenticate) | security | 0.70 | Explains obtaining and using Entra access tokens for ADE/Dev Box REST APIs; includes auth flows and token usage—security/auth configuration. 
| | [Best practices for designing catalogs](https://learn.microsoft.com/en-us/azure/deployment-environments/best-practice-catalog-structure) | best-practices | 0.70 | Explicit best-practices article for catalog structure with ADE-specific guidance likely including concrete recommendations and gotchas. | | [Configure ARM or Bicep container image](https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-configure-extensibility-model-custom-image) | configuration | 0.70 | How-to for building and using custom images with ADE, including image references and script usage—product-specific configuration patterns. | | [Request a quota limit increase](https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-request-quota-increase) | limits-quotas | 0.70 | Quota increase guide will reference specific ADE resource limits/quotas that trigger requests—numeric limits not generally known. | | [Automate with Azure Pipelines (CI/CD)](https://learn.microsoft.com/en-us/azure/deployment-environments/tutorial-deploy-environments-in-cicd-azure-devops) | deployment | 0.65 | Shows ADE integration with Azure Pipelines; includes product-specific pipeline configuration for environment deployments. | | [Automate with GitHub Actions (CI/CD)](https://learn.microsoft.com/en-us/azure/deployment-environments/tutorial-deploy-environments-in-cicd-github) | deployment | 0.65 | Tutorial for CI/CD integration with ADE; likely includes ADE-specific GitHub Actions configuration and constraints for deployments. | +| [Reliability in Azure Deployment Environments](https://learn.microsoft.com/en-us/azure/deployment-environments/concept-reliability-deployment-environments) | best-practices | 0.65 | Focuses on reliability and availability with availability zones and disaster recovery; likely includes ADE-specific resiliency recommendations and patterns beyond generic reliability concepts. 
| ## Unclassified Pages @@ -103,9 +113,8 @@ confusable_not_for: Not for Azure DevTest Labs (use azure-devtest-labs), Azure D | [Create and configure a dev center from the CLI](https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-create-configure-dev-center) | 0.30 | CLI quickstart to create a dev center; step-by-step commands rather than exhaustive configuration or patterns. | | [Create and configure a project from the CLI](https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-create-configure-projects) | 0.30 | CLI quickstart for creating a project; mostly basic commands and flow. | | [Create environments with Azure CLI](https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-create-access-environments) | 0.30 | CLI how-to for creating/accessing an environment; basic usage, not a reference. | -| [Reliability in Azure Deployment Environments](https://learn.microsoft.com/en-us/azure/deployment-environments/concept-reliability-deployment-environments) | 0.30 | Reliability/availability overview; likely conceptual resiliency description without detailed numeric thresholds or config matrices. | | [Configure Azure Deployment Environments](https://learn.microsoft.com/en-us/azure/deployment-environments/quickstart-create-and-configure-devcenter) | 0.20 | Quickstart setup guide; mostly step-by-step portal usage without detailed configuration tables or expert-only data. | +| [Create and access an environment](https://learn.microsoft.com/en-us/azure/deployment-environments/quickstart-create-access-environments) | 0.20 | Quickstart for creating/accessing environments via the developer portal; primarily step-by-step UI usage without detailed limits, configuration matrices, or product-specific troubleshooting. 
| | [Create dev center and project (Azure Resource Manager)](https://learn.microsoft.com/en-us/azure/deployment-environments/quickstart-create-dev-center-project-azure-resource-manager) | 0.20 | ARM template quickstart; shows example template but not a comprehensive config reference or product-specific patterns. | -| [Create and access an environment](https://learn.microsoft.com/en-us/azure/deployment-environments/quickstart-create-access-environments) | 0.10 | Basic how-to for creating an environment via portal; no deep configuration or limits. | | [Key concepts](https://learn.microsoft.com/en-us/azure/deployment-environments/concept-environments-key-concepts) | 0.10 | Conceptual key concepts and roles; primarily terminology and roles mapping. | | [What is Azure Deployment Environments?](https://learn.microsoft.com/en-us/azure/deployment-environments/overview-what-is-azure-deployment-environments) | 0.10 | High-level overview of Azure Deployment Environments without detailed limits, configs, or patterns. | diff --git a/products/azure-devops/azure-devops.csv b/products/azure-devops/azure-devops.csv index 10e7136c..8b80d848 100644 --- a/products/azure-devops/azure-devops.csv +++ b/products/azure-devops/azure-devops.csv @@ -20,24 +20,24 @@ https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/configure-se https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/configure-storage?view=azure-devops,Configure additional storage,Configure storage - Managed DevOps Pools,Configure additional storage for Managed DevOps Pools agents,Learn how to add an empty data disk to your agents in Managed DevOps Pools.,"Do you want more disk space for your agents? Managed DevOps Pools supports attaching an empty data disk to the agents in your pool. 
When you attach a data disk, you can get more storage space without incurring the potentially greater cost of moving your virtual machine (VM) size to a more expensive size that has more built-in storage.",2025-12-17T22:03:00.000Z,how-to,configuration,0.75,True,Describes attaching data disks to agents; includes product-specific storage configuration options and constraints.,unchanged https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/demands?view=azure-devops,Demands,Configure demands - Managed DevOps Pools,Configure demands and capabilities in Managed DevOps Pools,Learn how to configure demands for Managed DevOps Pools.,"Pipelines use demands to specify what capabilities an agent needs for Azure DevOps to send a pipeline job to the agent. In Managed DevOps Pools, demands like ImageOverride work just like demands in Azure Pipelines. A pipeline job is routed to a specific agent that has attributes that match the demand. You can use some demands, like WorkFolder and Priority, to configure attributes on the agent. This article describes the demands available in Managed DevOps Pools and how to use them.",2025-12-17T08:00:00.000Z,how-to,configuration,0.8,True,"Describes supported demands like ImageOverride, WorkFolder, Priority and how they affect agent selection; these are product-specific configuration knobs.",unchanged https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/diagnostics?view=azure-devops,Diagnostic logs,Diagnostic logs - Managed DevOps Pools,Use diagnostic logs for Managed DevOps Pools troubleshooting,Learn how to troubleshoot using diagnostic logs.,"Diagnostic settings in Azure are used to collect resource logs. An Azure resource emits resource logs and provides rich, frequent data about the operation of that resource. These logs are captured per request and are also referred to as ""data plane logs"". See Diagnostic settings in Azure Monitor for a recommended overview of the functionality in Azure. 
The content of these logs varies by resource type. In Managed DevOps Pools, two options are available to log:",2024-11-18T18:38:00.000Z,how-to,troubleshooting,0.7,True,"Diagnostic logging article for a specific service; likely includes concrete log categories, event names, and configuration steps for Azure Monitor diagnostic settings, which are product-specific troubleshooting details.",unchanged -https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/faq?view=azure-devops,Frequently asked questions,Frequently asked questions - Managed DevOps Pools,,Learn the answers to frequently asked questions for Managed DevOps Pools.,,2026-04-17T21:04:00.000Z,faq,,0.2,False,"FAQ pages are often mixed; without the full content, this is more likely to be conceptual and clarifying common questions rather than providing structured limits, configuration tables, or error-code-based troubleshooting. Insufficient evidence that it contains the specific numeric limits, config parameters, or error mappings required for expert-knowledge classification.",updated +https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/faq?view=azure-devops,Frequently asked questions,Managed DevOps Pools FAQ - Managed DevOps Pools,Understand quotas and options for Managed DevOps Pools,"Get answers to frequently asked questions about Managed DevOps Pools, including quotas, VM SKUs, regions, and pricing.",,2026-04-22T21:02:00.000Z,faq,limits-quotas,0.78,True,"FAQ explicitly mentions quotas, VM SKUs, regions, and pricing; such FAQs typically enumerate concrete numeric quotas, allowed SKUs, and region-specific constraints that qualify as expert limits/quotas knowledge.",updated https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/features-timeline?view=azure-devops,Roadmap and features timeline,Features timeline and roadmap - Managed DevOps Pools,,Learn about new features in Managed DevOps Pools.,,2026-04-03T21:03:00.000Z,overview,,0.0,False,"Features timeline and roadmap 
page is primarily release/roadmap information, not configuration, limits, or troubleshooting guidance with reusable expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/manage-costs?view=azure-devops,Manage cost and performance,Manage cost and performance - Managed DevOps Pools,Optimize Managed DevOps Pools cost and performance,Learn how to manage cost and performance for your Managed DevOps Pools.,"Managed DevOps Pools provides several different options for configuring the performance of your pool. This article describes options for matching your pool's performance to the demands of your workload, by increasing or reducing the performance and cost of your pools.",2025-09-26T08:00:00.000Z,concept-article,best-practices,0.7,True,"Describes how to tune pool performance vs. cost; likely includes product-specific recommendations (e.g., instance sizes, scaling behaviors) and configuration guidance unique to Managed DevOps Pools.",unchanged https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/migrate-from-scale-set-agents?view=azure-devops,Compare with Azure Virtual Machine Scale Set agents,Compare Managed DevOps Pools with Azure Virtual Machine Scale Set agents - Managed DevOps Pools,Choose between Managed DevOps Pools and VM scale-set agents,Learn about the differences between Managed DevOps Pools and Azure Virtual Machine Scale Set agents.,"Managed DevOps Pools is a new service that is an evolution of Azure DevOps Virtual Machine Scale Set agent pools, simplifying custom pool creation even further by improving scalability and reliability of custom pools. Managed DevOps Pools is a fully managed service where the virtual machines that run the agents live in a Microsoft Azure subscription and not in your own Azure subscription, like when using Azure DevOps Virtual Machine Scale Set agent pools. 
If you're considering using auto-scalabl",2026-01-30T02:12:00.000Z,overview,decision-making,0.8,True,"Explicit comparison between Managed DevOps Pools and VM Scale Set agents; helps decide which to use, with service-specific trade-offs and migration considerations.",unchanged https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/monitor-pool?view=azure-devops,Monitor,Monitor - Managed DevOps Pools,,Learn how to view the health of your Managed DevOps Pools.,"Managed DevOps Pools provides several options for monitoring your pool instances. The Overview page provides predefined metrics charts, and you can configure custom charts on the Metrics page. Use these tools to monitor the health of your Managed DevOps Pools instances.",2025-04-25T16:22:00.000Z,concept-article,,0.2,False,"Monitoring overview and UI usage for Managed DevOps Pools; no numeric limits, config tables, or product-specific patterns beyond generic charts.",unchanged https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/overview?view=azure-devops,Overview,Managed DevOps Pools Overview - Managed DevOps Pools,,Learn about how you can use Managed DevOps Pools to spin up pools that are tailored to your specific needs.,Managed DevOps Pools empowers development teams to quickly and easily spin up Azure DevOps agent pools that are tailored to their specific needs. Managed DevOps Pools implements security best practices and provides ways to balance cost and performance. It also provides paths for the most common scenarios and significantly reduces the time teams spend creating and maintaining custom pools. Managed DevOps Pools is an evolution of Azure DevOps Virtual Machine Scale Sets agent pools. 
It simplifies c,2026-03-05T22:02:00.000Z,overview,,0.2,False,"Managed DevOps Pools overview describes capabilities and scenarios; no indication of detailed configuration parameters, limits, or decision matrices in the summary.",unchanged https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/prerequisites?view=azure-devops,Prerequisites,Prerequisites for Managed DevOps Pools - Managed DevOps Pools,Configure prerequisites for Managed DevOps Pools,Learn how to configure your Azure subscription and Azure DevOps organization for use with Managed DevOps Pools.,"There are a few things you need to prepare before using Managed DevOps Pools for the first time. At a high level, you need: This article shows you how to configure your Azure subscription and Azure DevOps organization for use with Managed DevOps Pools. These configuration steps only need to be performed a single time per Azure DevOps organization and Azure subscription. Note If you're creating a Managed DevOps Pool from a pipeline, grant the permissions described in Verify Azure permissions and Ver",2025-10-09T02:04:00.000Z,concept-article,configuration,0.7,True,"Describes concrete steps and settings to prepare Azure subscriptions and DevOps orgs; likely includes required roles, permissions, and resource configurations specific to this service.",unchanged -https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/pricing?view=azure-devops,Pricing,Managed DevOps Pools pricing - Managed DevOps Pools,Estimate and compare Managed DevOps Pools costs,Learn how pricing is calculated for your Managed DevOps Pools.,"Managed DevOps Pools pricing is a combination of the cost of the Azure services your pool uses, like compute, storage, and data egress, and the standard Azure DevOps Services parallel jobs pricing for self-hosted agents. 
This article describes how to estimate and project the costs for your Managed DevOps Pools.",2026-04-17T21:04:00.000Z,concept-article,decision-making,0.7,True,"Pricing guidance for Managed DevOps Pools typically includes concrete cost components, rate structures, and how Azure DevOps parallel job pricing combines with underlying Azure services. This is product-specific decision guidance to project and optimize costs, fitting the decision-making category more than generic pricing marketing.",updated +https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/pricing?view=azure-devops,Pricing,Managed DevOps Pools pricing - Managed DevOps Pools,Estimate and compare Managed DevOps Pools costs,Learn how pricing is calculated for your Managed DevOps Pools.,"Managed DevOps Pools pricing is a combination of the cost of the Azure services your pool uses, like compute, storage, and data egress, and the standard Azure DevOps Services parallel jobs pricing for self-hosted agents. This article describes how to estimate and project the costs for your Managed DevOps Pools.",2026-04-17T21:04:00.000Z,concept-article,decision-making,0.7,True,"Pricing guidance for Managed DevOps Pools typically includes concrete cost components, rate structures, and how Azure DevOps parallel job pricing combines with underlying Azure services. 
This is product-specific decision guidance to project and optimize costs, fitting the decision-making category more than generic pricing marketing.",unchanged https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/quickstart-arm-template?view=azure-devops,ARM template,Create a Managed DevOps Pool using an ARM template - Managed DevOps Pools,,Learn how to create a Managed DevOps Pool using an Azure Resource Manager template (ARM template).,"This article shows you how to create a Managed DevOps Pool using an ARM template, and run a pipeline in the new pool.",2025-07-29T08:00:00.000Z,quickstart,,0.4,False,Quickstart using ARM template; focused on one deployment example rather than full configuration matrices or limits.,unchanged https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/quickstart-azure-cli?view=azure-devops,Azure CLI,Create a Managed DevOps Pool using Azure CLI - Managed DevOps Pools,,Learn how to create a Managed DevOps Pool using Azure CLI.,"This article shows you how to create a Managed DevOps Pool using Azure CLI, and run a pipeline in it.",2025-04-30T21:16:00.000Z,quickstart,,0.4,False,"Quickstart using Azure CLI; tutorial-style creation and pipeline run, not exhaustive configuration or limits.",unchanged https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/quickstart-azure-portal?view=azure-devops,Azure portal,Create a Managed DevOps Pool using the Azure portal - Managed DevOps Pools,,Learn how to create a Managed DevOps Pool using the Azure portal.,"This article shows you how to create a Managed DevOps pool, and run a pipeline in the new pool.",2025-04-30T08:00:00.000Z,quickstart,,0.4,False,"Quickstart using Azure portal; step-by-step tutorial to create a pool and run a pipeline, but not a comprehensive configuration reference.",unchanged https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/quickstart-bicep?view=azure-devops,Bicep,Create a Managed DevOps Pool using a Bicep template - Managed 
DevOps Pools,,Learn how to create a Managed DevOps Pool using a Bicep template.,"This article shows you how to create a Managed DevOps Pool using a Bicep template, and run a pipeline in the new pool.",2025-08-01T14:40:00.000Z,quickstart,,0.4,False,"Quickstart using Bicep; example deployment, not a full configuration or decision matrix.",unchanged https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/reliability-managed-devops-pools?view=azure-devops,Reliability,Reliability in Managed DevOps Pools - Managed DevOps Pools,Understand reliability and disaster recovery for Managed DevOps Pools,Learn about reliability and disaster recovery options with Managed DevOps Pools.,"This article describes reliability support in Managed DevOps Pools, and it covers cross-region disaster recovery (DR).",2025-11-04T23:40:00.000Z,reliability-article,architecture-patterns,0.65,True,Covers reliability and cross-region DR architecture for this service; includes product-specific resilience patterns and options.,unchanged -https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/troubleshooting?view=azure-devops,Troubleshooting,Troubleshoot Managed DevOps Pools issues - Managed DevOps Pools,Troubleshoot common Managed DevOps Pools issues,Learn how to troubleshoot common issues with Managed DevOps Pools.,This article provides solutions to common Managed DevOps Pools issues.,2025-05-21T20:49:00.000Z,how-to,troubleshooting,0.9,True,Explicit troubleshooting guide; expected to map specific symptoms and errors to causes and resolutions for Managed DevOps Pools.,unchanged +https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/troubleshooting?view=azure-devops,Troubleshooting,Troubleshoot Managed DevOps Pools issues - Managed DevOps Pools,Diagnose and fix Azure Managed DevOps Pools issues,Learn how to troubleshoot common issues with Managed DevOps Pools.,This article provides solutions to common Managed DevOps Pools 
issues.,2026-04-20T08:00:00.000Z,troubleshooting,troubleshooting,0.86,True,"Troubleshooting article for Managed DevOps Pools; by nature these pages list concrete symptoms, likely include specific error messages, causes, and product-specific remediation steps that aren't generic debugging advice.",updated https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/view-agents?view=azure-devops,View agent status,View agents - Managed DevOps Pools,,View the status of agents in Managed DevOps Pools.,You can view the status of the agents in your pool on theAgentspane. You can use this information to find out how many agents are running jobs and to see how many standby agents are online.,2025-11-04T23:40:00.000Z,concept-article,,0.5,False,"Viewing agent status is mostly UI usage; summary suggests simple monitoring, not detailed configuration or limits.",unchanged https://learn.microsoft.com/en-us/azure/devops/marketplace-extensibility/?view=azure-devops,Marketplace & Extensibility,Marketplace & extensibility documentation - Azure DevOps,,"Discover, manage, and develop extensions, widgets, and integrations for Azure DevOps.","Discover, manage, and develop extensions and integrations for Azure DevOps.",2026-04-03T21:03:00Z,landing-page,,0.0,False,"Marketplace & extensibility documentation landing page; describes discovering and developing extensions but is primarily a hub, not a detailed reference with expert-only specifics.",unchanged https://learn.microsoft.com/en-us/azure/devops/mcp-server/mcp-server-overview?view=azure-devops,Azure DevOps MCP Server overview,Enable AI assistance with the Azure DevOps MCP Server - Azure Boards,,"Learn about the Azure DevOps Model Context Protocol (MCP) Server, which enhances your AI assistant with real-time Azure DevOps context for smarter, more accurate project insights and decision-making c","Azure DevOps Services Consider asking your AI assistant ""Get my current sprint work items, then identify which ones might be at risk"" 
and getting instant access to your actual Azure DevOps data. The Azure DevOps Model Context Protocol (MCP) Server provides your AI assistant with secure access to work items, pull requests, builds, test plans, and documentation from your Azure DevOps organization. Unlike cloud-based solutions that require sending your data externally, the Azure DevOps MCP Server r",2025-12-12T18:05:00.000Z,overview,,0.2,False,"High-level overview of Azure DevOps MCP Server and its benefits; no detailed configuration tables, limits, or product-specific error/decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/devops/mcp-server/remote-mcp-server?view=azure-devops,Remote MCP Server (preview),Set up the remote Azure DevOps MCP Server (preview) - Azure DevOps Services,Configure remote Azure DevOps MCP Server endpoint access,Learn how to configure the remote Azure DevOps MCP Server for AI-assisted development without local installation by using streamable HTTP transport.,"Azure DevOps Services Important The remote Azure DevOps MCP Server is currently in public preview. Preview features might have limited functionality and can change before general availability. Support The remote Azure DevOps MCP Server is a hosted version of the Azure DevOps MCP Server that doesn't require a local installation. Instead of running the server on your machine, you connect your AI assistant directly to the Azure DevOps–hosted endpoint by using streamable HTTP transport. The remote ser",2026-03-31T20:43:00.000Z,how-to,configuration,0.7,True,"Page is a how-to for setting up the remote Azure DevOps MCP Server using streamable HTTP transport. It likely includes concrete endpoint URLs, required headers, configuration parameters, and possibly auth settings specific to this preview service. 
These are product-specific configuration details that an LLM would not reliably know from training.",unchanged +https://learn.microsoft.com/en-us/azure/devops/mcp-server/remote-mcp-server?view=azure-devops,Remote MCP Server (preview),Set up the remote Azure DevOps MCP Server (preview) - Azure DevOps Services,Configure remote Azure DevOps MCP Server endpoint,Learn how to configure the remote Azure DevOps MCP Server for AI-assisted development without local installation by using streamable HTTP transport.,"Azure DevOps Services Important The remote Azure DevOps MCP Server is currently in public preview. Preview features might have limited functionality and can change before general availability. Support The remote Azure DevOps MCP Server is a hosted version of the Azure DevOps MCP Server that doesn't require a local installation. Instead of running the server on your machine, you connect your AI assistant directly to the Azure DevOps–hosted endpoint by using streamable HTTP transport. The remote ser",2026-04-21T20:06:00.000Z,how-to,configuration,0.7,True,"Page is a how-to for setting up the remote MCP Server with streamable HTTP transport. 
It likely includes concrete endpoint URLs, required headers, authentication parameters, and configuration options specific to the Azure DevOps-hosted MCP Server, which are product-specific details not inferable from general knowledge.",updated https://learn.microsoft.com/en-us/azure/devops/organizations/?view=azure-devops,Settings >>,Settings & Usage Documentation - Azure DevOps,,"Configure resources and manage settings for an organization, project, team, or user.","Configure resources and manage settings for an organization, project, team, or user.",2026-03-03T22:06:00Z,landing-page,,0.2,False,"Landing/navigation page for Azure DevOps organization, project, team, and user settings; description indicates high-level configuration and usage documentation without specific limits, configuration parameter tables, or error-code-based troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/add-privacy-policy-url?view=azure-devops,Add privacy policy URL,Add privacy policy URL to comply with the GDPR - Azure DevOps Services,Configure privacy policy URL for Azure DevOps public projects,"Learn how to add your Organization's privacy policy URL for your public project, which describes how you handle internal and external guest data privacy.",Azure DevOps Services A privacy policy is legally required for all websites and apps that collect or use personal data from users. This article shows how to add your privacy policy URL to your organization in Azure DevOps for public projects. This URL links to your custom document that describes how you handle both internal and external guest data privacy. The custom privacy policy URL appears only in Organization settings on the Overview page in Azure DevOps. 
The Microsoft Privacy Statement continu,2025-07-18T23:47:00.000Z,how-to,security,0.65,True,Covers GDPR-related privacy policy configuration for public projects; product-specific security/compliance setting with concrete placement and behavior.,unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/change-application-access-policies?view=azure-devops,Set up security policies,Change application connection and security policies for organizations - Azure DevOps Services,Configure Azure DevOps application access and security policies,"Manage security policies for accessing organization through Conditional Access, OAuth, SSH, and personal access tokens (PATs).","Important Azure DevOps doesn't support Alternate Credentials authentication. If you're still using Alternate Credentials, switch to a more secure authentication method. This article shows how to manage your organization's security policies that determine how users and applications can access services and resources in your organization. You can access most of these policies in Organization settings.",2025-10-13T14:05:00.000Z,how-to,security,0.9,True,"Manages Conditional Access, OAuth, SSH, and PAT policies; includes product-specific security settings and supported/unsupported auth methods.",unchanged @@ -48,10 +48,10 @@ https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/create-organization?view=azure-devops,Create your organization,Create an organization - Azure DevOps,,"Learn how to create an Azure DevOps organization using a personal Microsoft account, GitHub account, or work or school account.","Azure DevOps Services Use an organization to connect groups of related projects and help scale up your enterprise. You can use a personal Microsoft account, GitHub account, or a work or school account. 
When you use your work or school account, you automatically connect your organization to your Microsoft Entra ID. Note You must create all organizations manually through the web portal. Automated creation of organizations isn't supported. However, automated organization configuration, project creati",2026-03-04T02:02:00.000Z,how-to,,0.2,False,"Step-by-step guide to creating an Azure DevOps organization; mostly procedural UI instructions without detailed configuration parameter tables, limits, or security role specifics.",unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/delete-your-organization?view=azure-devops,Delete organization,Delete or remove an organization - Azure DevOps Services,Delete Azure DevOps organization and data retention,"Learn how to permanently delete an Azure DevOps organization, what happens to users and data, recovery options, and administrator deletion capabilities for Microsoft Entra ID administrators.","Azure DevOps Services When you no longer need an organization, you can delete it from Azure DevOps. If you change your mind within 28 days, you can recover your organization. After 28 days, your organization and all associated data get permanently deleted. 
When you delete your organization, the following results occur: Caution In rare cases, our deletion process might take up to 70 days due to backend retries and the need to delete data from multiple sources.",2026-04-03T01:04:00.000Z,how-to,limits-quotas,0.7,True,"Contains specific time-based deletion behavior (28-day recovery window and up to 70 days for full backend deletion), which are concrete service limits/timeframes not inferable from general knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-azure-access?view=azure-devops,Access via Microsoft Entra,Access via Microsoft Entra FAQs,Manage Azure DevOps access via Microsoft Entra ID,"Learn the answers to frequently asked questions (FAQs), like how to understand Microsoft Entra groups, add users, connect to, disconnect from, or switch your directory.","Azure DevOps Services Important Azure DevOps doesn't support Alternate Credentials authentication. If you're still using Alternate Credentials, switch to a more secure authentication method. Learn the answers to the following frequently asked questions (FAQs) about access to your Azure DevOps organization via Microsoft Entra ID. 
We grouped the FAQs by the following subjects:",2026-01-27T18:16:00Z,faq,security,0.75,True,"FAQ on Entra groups, adding users, connecting/switching directories; contains product-specific identity and access configuration behavior.",unchanged -https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-configure-customize-organization?view=azure-devops,Create and configure an organization,Organization FAQs - Azure DevOps,,"Learn the answers to frequently asked questions (FAQs) about creating, deleting, restoring, and configuring your organization in Azure DevOps.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Learn the answers to frequently asked questions (FAQs) about creating and configuring an organization in Azure DevOps, grouped into the following categories: For more information about user and permissions management, see User and permissions management FAQs.",2026-02-24T02:03:00Z,faq,,0.3,False,"Organization FAQ; description doesn’t indicate detailed limits, error codes, or configuration tables, so treated as general Q&A.",unchanged -https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-set-up-vs?view=azure-devops,Set up Visual Studio,Set up Visual Studio with Azure DevOps FAQs - Azure DevOps Services,Set up Visual Studio integration with Azure DevOps,"Having problems installing Visual Studio, signing in, or handling an expired subscription? 
Learn answers to these frequently asked questions (FAQs).",Azure DevOps Services Learn the answers to frequently asked questions (FAQs) about setting up Visual Studio with Azure DevOps.,2024-08-12T16:52:00Z,faq,integrations,0.6,True,"FAQ for setting up Visual Studio with Azure DevOps; likely includes specific connection settings, authentication flows, and troubleshooting for this integration.",unchanged -https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-user-and-permissions-management?view=azure-devops,Manage users and permissions,User and permissions management FAQs - Azure DevOps,"Manage Azure DevOps users, permissions, and tokens securely","Learn the answers to frequently asked questions (FAQs), like the permissions that are required to manage users and user access, manage Visual Studio subscriptions, and more.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This article provides answers to frequently asked questions (FAQs) about user and permissions management in Azure DevOps. The FAQs are organized by topic to help you quickly find information about managing users, permissions, access levels, Visual Studio subscriptions, GitHub Enterprise integration, and related administrative tasks. 
Important Consider using the more secure Microsoft Entra tokens over higher-risk personal access ",2026-03-24T21:04:00Z,faq,security,0.65,True,"FAQ includes product-specific security guidance (e.g., recommendation to use Microsoft Entra tokens over personal access tokens) and likely details on required permissions/roles for specific management tasks, which are service-specific security configurations.",unchanged +https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-azure-access?view=azure-devops,Access via Microsoft Entra,Access via Microsoft Entra FAQs - Azure DevOps,Configure Azure DevOps access via Microsoft Entra ID,"Learn the answers to frequently asked questions (FAQs), like how to understand Microsoft Entra groups, add users, connect to, disconnect from, or switch your directory.","Azure DevOps Services Important Azure DevOps doesn't support Alternate Credentials authentication. If you're still using Alternate Credentials, switch to a more secure authentication method. Learn the answers to the following frequently asked questions (FAQs) about access to your Azure DevOps organization via Microsoft Entra ID. We grouped the FAQs by the following subjects:",2026-04-24T22:42:00Z,faq,security,0.82,True,"FAQ about access via Microsoft Entra ID; explicitly mentions authentication methods and directory connection. 
Such content typically includes specific security settings, supported/unsupported auth methods, and directory linkage behaviors unique to Azure DevOps.",updated +https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-configure-customize-organization?view=azure-devops,Create and configure an organization,Organization FAQs - Azure DevOps,,"Learn the answers to frequently asked questions (FAQs) about creating, deleting, restoring, and configuring your organization in Azure DevOps.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Learn the answers to frequently asked questions (FAQs) about creating and configuring an organization in Azure DevOps, grouped into the following categories: For more information about user and permissions management, see User and permissions management FAQs.",2026-04-22T21:02:00Z,faq,,0.3,False,"Organization FAQ about creating/configuring organizations; description suggests general how-to and conceptual guidance without clear indication of numeric limits, detailed configuration tables, or error-code-based troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-set-up-vs?view=azure-devops,Set up Visual Studio,Set up Visual Studio with Azure DevOps FAQs - Azure DevOps,Resolve Visual Studio setup and sign-in issues with Azure DevOps,"Having problems installing Visual Studio, signing in, or handling an expired subscription? 
Learn answers to these frequently asked questions (FAQs).",Azure DevOps Services Learn the answers to frequently asked questions (FAQs) about setting up Visual Studio with Azure DevOps.,2026-04-22T21:02:00Z,faq,troubleshooting,0.7,True,"FAQ focused on problems installing Visual Studio, signing in, and handling expired subscriptions; likely organized around concrete error conditions and resolutions specific to Azure DevOps integration, matching troubleshooting criteria.",updated +https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-user-and-permissions-management?view=azure-devops,Manage users and permissions,User and permissions management FAQs - Azure DevOps,"Manage Azure DevOps users, permissions, and access","Learn the answers to frequently asked questions (FAQs), like the permissions that are required to manage users and user access, manage Visual Studio subscriptions, and more.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This article provides answers to frequently asked questions (FAQs) about user and permissions management in Azure DevOps. The FAQs are organized by subject to help you quickly find information about managing users, permissions, access levels, Visual Studio subscriptions, GitHub Enterprise integration, and related administrative tasks. 
Important Consider using the more secure Microsoft Entra tokens over higher-risk personal acces",2026-04-24T22:42:00Z,faq,security,0.7,True,"User and permissions management FAQ likely lists specific Azure DevOps access levels, permission scopes, and possibly role names and requirements; description also highlights security-related token guidance, fitting product-specific security configuration.",updated https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/organization-management?view=azure-devops,Organization management overview,Manage an organization - Azure DevOps Services,,"Manage an organization, so you can collaborate with others to develop apps, plan and track work, integrate with other services, get more features and extensions.","Azure DevOps Services With an organization in Azure DevOps Services, you can do the following tasks: By using these capabilities, you can enhance your development process and improve collaboration within your team. Note If you're just getting started, see Get started managing your organization. For information about managing an on-premises Azure DevOps Server, see Administrative tasks quick reference. Tip You can use AI to help with Azure DevOps tasks. See Enable AI assistance with Azure DevOps MCP",2026-02-18T02:04:00.000Z,overview,,0.3,False,"General organization management overview; mostly conceptual and navigational without detailed limits, config matrices, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/recover-your-organization?view=azure-devops,Recover organization,Recover a deleted organization - Azure DevOps Services,Recover deleted Azure DevOps organizations within retention limits,"Learn how to restore your organization and data up to 90 days after being deleted, done with organization owner permissions.","Azure DevOps Services After you delete an organization, it's disabled but available for 28 days. 
If you change your mind during this time, you can recover the organization. After 28 days, the organization and data are permanently deleted. In rare cases, our deletion process might take up to 70 days due to backend retries and the need to delete data from multiple sources.",2025-09-12T19:46:00.000Z,how-to,limits-quotas,0.85,True,Defines 28-day availability for recovery and mentions up to 70 days for backend deletion; explicit numeric retention limits.,unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/rename-organization?view=azure-devops,Rename organization,Rename your organization - Azure DevOps Services,,Learn how to rename your organization and what to do before and after you rename it.,"Azure DevOps Services You can change your organization name (URL) at any time in Azure DevOps. This action allows you to update the URL to better reflect your organization's branding or structure. Changing the organization name updates the URL used to access your Azure DevOps resources, including the URLs of your projects, repositories and other resources in the organization. Follow the steps in this article to rename your organization and ensure a smooth transition for your team. 
Caution The re",2025-10-27T22:02:00.000Z,how-to,,0.35,False,"How-to for renaming an organization; while it has a caution, description doesn’t indicate numeric limits, detailed config tables, or error mappings.",unchanged @@ -65,16 +65,14 @@ https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/at-me https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/change-email-address?view=azure-devops,Change your preferred email address,Change your preferred notification email address - Azure DevOps,,Change the email address used to receive alerts or email notifications managed in Azure DevOps.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Change your preferred email address for notifications through your organization profile settings. By default, Azure DevOps sends notifications to the email address associated with your organization profile, which is typically the email you used to sign in. Tip Quick access: Go to User settings > Profile to update your notification email address. Important",2025-08-14T03:52:00.000Z,how-to,,0.3,False,Simple profile setting change for notification email; lacks detailed config matrices or limits.,unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/concepts-email-recipients?view=azure-devops,How email recipients are determined,How Notification Email Recipients are Determined - Azure DevOps,Determine Azure DevOps notification email recipients,Explore how email recipients are determined for notifications and events in Azure DevOps.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Many factors determine the recipients of an email notification when an event matches a subscription. If you're unaware, these factors can result in your inbox receiving too many or too few emails. Learn about how the type of subscription, its delivery settings, delivery preferences, and other factors determine the set of recipients. 
Note Many concepts addressed in this article apply to earlier versions of Azure DevOps, althou",2025-07-15T16:57:00.000Z,concept-article,configuration,0.65,True,"Explains product-specific rules for how subscriptions, delivery settings, and preferences combine to determine recipients; nuanced behavior unique to Azure DevOps.",unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/exclude-self-from-email?view=azure-devops,Limit your emails,Prevent notification emails to yourself from events - Azure DevOps,Exclude event initiators from Azure DevOps notifications,Learn how to exclude the initiator of an event in Azure DevOps Services from receiving notification emails,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This article shows how to prevent the initiator of an event from receiving notification emails when that event triggers a team or role-based subscription. The feature is useful when people don't want emails for actions they themselves perform (for example, creating a pull request).",2025-10-27T22:02:00.000Z,how-to,configuration,0.6,True,Describes a specific notification setting to suppress emails to initiators; product-specific behavior and config.,unchanged -https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/faq-notifications?view=azure-devops,FAQs,Notification FAQs - Azure DevOps,,General questions and answers about notifications settings in Azure DevOps.,Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022,2025-07-17T19:00:00Z,faq,,0.35,False,"FAQ-style content; summary doesn’t indicate detailed error codes, config tables, or limits.",unchanged +https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/faq-notifications?view=azure-devops,FAQs,Notification FAQs - Azure DevOps,,General questions and answers about notifications settings in Azure DevOps.,Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 
2022,2026-04-22T21:02:00Z,faq,,0.2,False,"FAQ about notifications is likely conceptual and behavioral (how notifications work, common questions) without detailed error codes, configuration tables, or numeric limits; does not clearly match troubleshooting or other expert-knowledge categories based on the description.",updated https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/integrate-third-party-services?view=azure-devops,"Send notifications to third-party services (Slack, Teams)",How to integrate with third-party services - Azure DevOps,Integrate Azure DevOps notifications with third-party services,"Learn how to integrate third-party services, such as Slack, Microsoft Teams, and others with Azure DevOps.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Integrate with your favorite services by notifying them when events happen in Azure DevOps. We have multiple messaging app integrations to help users receive notifications in response to events in Azure DevOps, complete workflows and take proactive actions such as allowing users to approve release deployments and creating work items from their channels. See the following examples for specific service hook integrations: Or, inte",2025-10-27T22:02:00.000Z,overview,integrations,0.7,True,"Covers service hooks and messaging app integrations (Slack, Teams, etc.) 
with product-specific integration patterns and actions.",unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/manage-team-group-global-organization-notifications?view=azure-devops,"Manage team, group, Global notifications","Manage Notifications - Team, Project, Organization - Azure DevOps",,"Configure email notifications for your team, project, or organization when changes occur to source code, git, work items, and builds in Azure DevOps.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 You can manage email notifications for your team, project, organization, or collection and receive notifications when changes occur to work items, code reviews, pull requests, source control files, and builds. For example, when a high priority work item is assigned to your team's area path, a notification email is sent to the team. For more information, see Notification types. Tip You can use AI to help with this task later in t",2026-03-04T02:02:00.000Z,how-to,,0.4,False,"Explains managing notifications at team/project/org level; appears to be procedural without specific limits, configuration matrices, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/manage-your-personal-notifications?view=azure-devops,Manage personal notifications,Manage Personal Notification Settings - Azure DevOps,,"Set up and manage personal notifications in Azure DevOps and receive messages when changes occur to source code, git, work items, and builds.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 There are many ways you can manage your personal notifications in Azure DevOps: You receive personal notifications as email messages when changes occur to builds, code, pipelines, work, artifacts, extensions, releases, and more. For information about team and project-level notifications, see Team and project-level notifications and Manage team or group notifications. 
Note For on-premises Azure DevOps Server, configure an SMTP ser",2026-03-04T02:02:00.000Z,how-to,,0.4,False,"Describes ways to manage personal notifications; summary does not show config parameter tables, limits, or error mappings—more of a standard feature how-to.",unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/navigating-the-ui?view=azure-devops,Navigate the UI,Navigate the Notifications UI - Azure DevOps,,"Explore the notifications pages in Azure DevOps, including personal status, team and project-level status, and global settings.",Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This article describes how to navigate the Notifications user interface in Azure DevOps. You can set notifications at the following levels:,2025-05-06T22:42:00.000Z,concept-article,,0.2,False,UI navigation guide for notifications pages; no expert-level configuration or troubleshooting content.,unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/oob-built-in-notifications?view=azure-devops,Default notifications,Default and Supported Notifications - Azure DevOps,Configure default Azure DevOps notification subscriptions,"Learn about out of the box or default notifications set in Azure DevOps: build or release, code under Git or TFVC source control, extension management, and work items.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 You can configure default subscriptions to send notifications to certain roles or user groups with specific associations to an event. Some examples include users who are assigned the Reviewer role on a pull request, or the Assignee (current) role that identifies the current Assigned To user of a changed work item. 
In the description of the default subscription, you can see the roles that receive notifications, such as ""Notifies you ",2025-05-05T17:46:00.000Z,concept-article,configuration,0.7,True,"Reference for built-in/default notifications with role mappings (Reviewer, Assignee); product-specific subscription configuration.",unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/oob-supported-event-types?view=azure-devops,Events,Event Types for Notifications Subscriptions - Azure DevOps,Reference event types for Azure DevOps notifications,"Review supported event types for automatic notifications in Azure DevOps: build or release, code under Git or TFVC source control, and work items.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This reference article identifies the supported event types for notification subscriptions in Azure DevOps. The following sections list the event types by project notification category, including Build, Release, Work items, and Code under Git source control or Team Foundation version control (TFVC).",2025-05-05T17:34:00.000Z,concept-article,configuration,0.7,True,Lists supported event types by category; product-specific event taxonomy used in notification configuration.,unchanged -https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/troubleshoot-delayed-email?view=azure-devops,Why are my emails delayed,Why are my notification emails delayed - Azure DevOps,Investigate delayed Azure DevOps notification emails,Troubleshooting steps for delayed emails from notifications in Azure DevOps.,Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 You might not receive an expected notification email. 
Learn how to check the notification statistics.",2025-10-27T22:02:00.000Z,how-to,troubleshooting,0.7,True,Troubleshooting delayed notifications using notification statistics; symptom → diagnosis → resolution pattern.,unchanged -https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/troubleshoot-not-getting-email?view=azure-devops,Why am I not getting an email,Troubleshoot Not Receiving Notification Emails - Azure DevOps,Troubleshoot missing Azure DevOps notification emails,Discover why you aren't receiving emails from your Azure DevOps notification subscriptions and fix the problem.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This article provides troubleshooting information to help you address why you might not be receiving an expected subscription or notification email. An email is sent when an event occurs that matches a notification subscription. For more information about notification subscriptions, see the notifications overview. If you're not receiving an expected notification email, here are some possible causes:",2025-12-04T14:04:00.000Z,troubleshooting-general,troubleshooting,0.8,True,Symptom-based guide (not receiving emails) with possible causes and resolutions specific to Azure DevOps notifications.,unchanged -https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/troubleshoot-unexpected-email?view=azure-devops,Why am I getting this email,Why did I get this email - Azure DevOps,Troubleshoot unexpected Azure DevOps notification emails,Troubleshoot why you're receiving automatic notification emails from Azure DevOps.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 If you get a notification email that you didn't expect, it could be for one of the following reasons:",2025-10-27T22:02:00.000Z,how-to,troubleshooting,0.8,True,Symptom-based guide (unexpected emails) mapping causes to solutions for Azure DevOps notification system.,unchanged 
+https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/troubleshoot-not-getting-email?view=azure-devops,Troubleshoot notification emails,Troubleshoot notification emails - Azure DevOps,Diagnose and fix missing Azure DevOps notification emails,"Troubleshoot missing, delayed, or unexpected notification emails from Azure DevOps subscriptions and fix the problem.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 An email is sent when an event occurs that matches a notification subscription. For more information about notification subscriptions, see the notifications overview. Note For on-premises Azure DevOps Server, configure an SMTP server so team members can see the Notifications option from their organization or user profile menu and receive notifications. This article helps you troubleshoot common issues with Azure DevOps notification em",2026-04-22T21:02:00.000Z,troubleshooting,troubleshooting,0.8,True,"Article explicitly focuses on troubleshooting missing, delayed, or unexpected notification emails; such guides typically map symptoms (no email, delayed email) to causes (SMTP misconfiguration, subscription issues) and resolutions, often including product-specific checks and configuration steps, fitting the troubleshooting sub-skill.",new https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/unsubscribe-default-notification?view=azure-devops,Unsubscribe from default notification,View Subscribed Notifications - Azure DevOps,View and unsubscribe from Azure DevOps notifications,View your notifications and unsubscribe from a default or built-in notification in Azure DevOps.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 If you want to stop receiving specific email notifications, you can unsubscribe from the notifications in Azure DevOps. You can opt out of default or built-in subscriptions and unsubscribe from other notifications. 
For a description of each default subscription, see Default notifications.",2025-05-07T21:52:00.000Z,how-to,configuration,0.6,True,Covers managing default/built-in subscriptions; product-specific notification configuration behavior.,unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/use-subscription-logging?view=azure-devops,Enable subscription logging for troubleshooting,Use subscription logging to troubleshoot notifications - Azure DevOps,Use subscription logging to debug Azure DevOps notifications,Enable subscription logging and access diagnostic logs to troubleshoot notification issues in Azure DevOps.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Note For on-premises Azure DevOps Server, configure an SMTP server so team members can see the Notifications option from their organization or user profile menu and receive notifications. Subscription logging helps you troubleshoot notification issues by providing diagnostic information from the notifications pipeline. This feature is disabled by default. When enabled, Azure DevOps collects up to 25 logs or one hour's worth of diag",2025-07-17T19:00:00.000Z,how-to,troubleshooting,0.75,True,"Explicit troubleshooting article; describes enabling subscription logging, diagnostic logs, and specific limits (up to 25 logs or one hour).",unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/view-organization-notification-statistics?view=azure-devops,View notification statistics for your organization,View organization-level notifications statistics - Azure DevOps,View and interpret Azure DevOps notification statistics,View organization-level notifications statistics in Azure DevOps.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Notification statistics show the top 10 most active subscriptions and top event initiators in your organization, for the current day. 
Administrators should periodically review statistics to ensure there are no unintended high volume subscriptions or event initiators. Note For on-premises Azure DevOps Server, configure an SMTP server so team members can see the Notifications option from their organization or user profile menu and re",2025-10-27T22:02:00.000Z,overview,configuration,0.6,True,"Product-specific statistics (top 10 subscriptions, event initiators) and how admins should use them; configuration/monitoring behavior unique to service.",unchanged @@ -115,7 +113,7 @@ https://learn.microsoft.com/en-us/azure/devops/organizations/security/default-tf https://learn.microsoft.com/en-us/azure/devops/organizations/security/download-permissions-report-release?view=azure-devops,Download release permissions report,Download permissions report for a pipeline release - Azure DevOps Services,Download and interpret pipeline release permissions report,Download json-formatted permission report for a pipeline release.,"Azure DevOps Services To determine the effective permissions of users and groups for a release, you can download the permissions report. Requesting the report generates an email with a link to download the report. The report lists the effective permissions for the release you select, for each user and group specified at the time the report is generated. Inherited permissions come from a parent group that you can view from the web portal. 
The report is a json-formatted report that you can open us",2025-02-07T09:49:00.000Z,how-to,security,0.65,True,"Describes a product-specific mechanism to generate a JSON permissions report for a release, including how inherited permissions are represented; concrete security diagnostics feature.",unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/security/download-permissions-report?view=azure-devops,Download repository permissions report,Download permissions report for a repository - Azure DevOps Services,Download Azure DevOps repository permissions report,Download json-formatted permission report for a repository.,"Azure DevOps Services To determine the effective permissions of users and groups for a repository, you can download the permissions report. Requesting the report generates an email with a link to download the report. The report lists the effective permissions for the repository you select, for each user and group specified at the time the report is generated. Inherited permissions come from a parent group which you can view from the web portal. The report is a json-formatted report that you can ",2025-02-07T09:49:00.000Z,how-to,security,0.65,True,Describes obtaining effective permissions for users/groups on a repo; product-specific security/permissions reporting behavior.,unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/security/export-users-audit-log?view=azure-devops,Export user list,Export a list of users and their access levels - Azure DevOps,Export Azure DevOps users and access levels,"Determine the access level-stakeholder, basic, advanced, or Visual Studio Enterprise-granted to user accounts.",Azure DevOps Services You can get a list of users and groups that have access to your organization in Azure DevOps by exporting users. 
The downloaded list also indicates access levels.",2025-07-17T19:00:00.000Z,how-to,security,0.65,True,"Deals with access levels (stakeholder, basic, advanced, VS Enterprise) which are product-specific permission tiers.",unchanged -https://learn.microsoft.com/en-us/azure/devops/organizations/security/get-started-stakeholder?view=azure-devops,Get started as a Stakeholder,Get started with Stakeholder access - Azure DevOps,Use Stakeholder access permissions in Azure DevOps,Learn how to add and update work items and view work tracking progress with Stakeholder access in Azure Boards.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Stakeholders are users with free but limited access to Azure DevOps features and functions. With Stakeholder access, you can add and modify work items, manage build and release pipelines, and view dashboards. You can check project status and provide direction, feedback, feature ideas, and business alignment to a team. For more information, see Create your first pipeline and Supported source repositories. Stakeholders are users wit",2025-12-22T08:00:00.000Z,how-to,security,0.6,True,"Defines what Stakeholder access can and cannot do; product-specific access level capabilities and restrictions, which are security/permission details.",unchanged +https://learn.microsoft.com/en-us/azure/devops/organizations/security/get-started-stakeholder?view=azure-devops,Get started as a Stakeholder,Get started with Stakeholder access - Azure DevOps,,Learn how to add and update work items and view work tracking progress with Stakeholder access in Azure Boards.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Stakeholders are users with free but limited access to Azure DevOps features and functions. With Stakeholder access, you can add and modify work items, manage build and release pipelines, and view dashboards. You can check project status and provide direction, feedback, feature ideas, and business alignment to a team. 
For more information, see Create your first pipeline and Supported source repositories. Stakeholders are users wit",2026-04-24T22:42:00.000Z,how-to,,0.35,False,"Stakeholder access getting-started guide describes capabilities and basic usage; summary doesn’t indicate detailed RBAC role tables, configuration parameters, or numeric limits—more of a conceptual/usage overview.",updated https://learn.microsoft.com/en-us/azure/devops/organizations/security/look-up-azure-devops-administrator?view=azure-devops,Look up Azure DevOps administrator,Look up an Azure DevOps Administrator - Azure DevOps,Identify Azure DevOps Administrators via Microsoft Entra,Learn how to identify members of the Azure DevOps Administrators group on Microsoft Entra.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 The Azure DevOps Administrator role is a Microsoft Entra role for tenant-wide administration of Azure DevOps. Members of this role can: If you're not an Azure DevOps Administrator, you can't see the tenant-level policies on the Microsoft Entra page in Organization Settings. Note The Azure DevOps Administrator role is different from the Project Collection Administrator role. Project Collection Administrators manage a specific organi",2026-03-10T13:05:00.000Z,how-to,security,0.7,True,"Explains the Azure DevOps Administrator Entra role, its specific capabilities, and how it differs from Project Collection Administrators, plus how to look up members. These are product-specific role definitions and access behaviors.",unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/security/look-up-organization-owner?view=azure-devops,Look up organization owner,Look up the organization owner - Azure DevOps,,How-to guide for finding the owner of your organization.,"Azure DevOps Services Each organization has a single owner who can add users, adjust access levels, manage permissions, and customize projects. You can find the owner in your organization settings. 
To change the Organization owner, see Change organization owner.",2026-03-10T13:05:00.000Z,how-to,,0.2,False,"Simple how-to for finding the organization owner; no detailed RBAC roles, configuration parameters, or numeric constraints.",unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/security/look-up-project-administrators?view=azure-devops,Look up a project administrator,Find a project administrator - Azure DevOps,Find and manage Azure DevOps project administrators,Quickly identify members of the Project Administrators group in Azure DevOps.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 The Project Administrators group is the primary administrative security group for a project. Members of this group are authorized to perform the following tasks: To add users to the Project Administrators group or change a project-level permission, see Change project-level permissions.",2026-03-10T13:05:00.000Z,quickstart,security,0.65,True,"Defines the Project Administrators security group and enumerates its authorized tasks, with guidance on locating members. These are concrete, product-specific permission capabilities tied to a named role.",unchanged @@ -128,8 +126,8 @@ https://learn.microsoft.com/en-us/azure/devops/organizations/security/security-s https://learn.microsoft.com/en-us/azure/devops/organizations/security/set-object-level-permissions?view=azure-devops,Set object-level permissions,Set object-level permissions - Azure DevOps,Configure object-level permissions in Azure DevOps,"How to grant permission and access at the object-level - repos, pipelines, work items, area paths, queries, wikis, dashboards, and more.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 As you manage security for your organization, set permissions at the organization/collection level, project level, and object level. 
This article helps you access the security dialogs for setting permissions at the object level, as the user interface varies somewhat across Azure DevOps. For more information, see Get started with permissions, access, and security groups. The following items are considered objects: Work items, t",2026-03-10T13:05:00.000Z,how-to,security,0.8,True,"Explains how to access and use security dialogs to set permissions on specific Azure DevOps objects (repos, pipelines, work items, etc.), reflecting detailed IAM behavior and UI-specific steps unique to the product.",unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/security/set-permissions-access-test?view=azure-devops,Set permissions & access for testing,Set permissions and access for manual testing - Azure DevOps,Set permissions and access for Azure DevOps manual testing,"How to grant or restrict access to test plans, test suites, test cases, and other test-related features.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 To fully utilize Azure Test Plans, you need to understand and configure the necessary permissions and access levels. This article outlines the steps, so you can: By following these guidelines, you can ensure that your team has the appropriate access to efficiently manage and execute test plans. To manage access to manual test features, grant specific permissions to users or groups at the object or project level for the following ",2026-03-10T13:05:00.000Z,how-to,security,0.8,True,"Gives specific instructions for configuring permissions and access levels for Azure Test Plans objects (test plans, suites, cases, etc.), mapping roles/permissions to testing capabilities. 
This is detailed, product-specific security/permission configuration.",unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/security/set-permissions-access-work-tracking?view=azure-devops,Set work tracking & plan permissions,Set Permissions for Work Tracking - Azure DevOps,Set permissions for Azure DevOps work tracking,"Learn how to grant or restrict access to work tracking tasks by setting object or project-level permissions for Azure DevOps, and default permissions for objects.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 To manage work tracking effectively, assign specific permissions to users or groups for particular objects, projects, or collections. You can also define custom rules for processes or projects that apply to specific users or groups, controlling their actions accordingly. For most features, we recommend adding users to the project's Contributors group, which grants comprehensive access and ensures a seamless and efficient work trac",2025-08-14T17:05:00.000Z,how-to,security,0.7,True,"Provides concrete recommendations (e.g., use Contributors group) and describes object vs project-level permissions and custom rules; product-specific security configuration patterns.",unchanged -https://learn.microsoft.com/en-us/azure/devops/organizations/security/stakeholder-access?view=azure-devops,Stakeholder access quick reference,Stakeholder access quick reference - Azure DevOps,Understand Azure DevOps Stakeholder access permissions,Stakeholder access to common user tasks for Azure DevOps,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Stakeholder access gives an unlimited number of users free, limited access to your organization. Stakeholders can: Stakeholders don't have access to code repositories (Azure Repos). To contribute to the code base, assign users at least Basic access. To get started, see Get started as a Stakeholder. 
For administrative tasks, see Manage your project.",2026-04-14T01:03:00.000Z,overview,security,0.7,True,"Page defines the exact capabilities and restrictions of the Stakeholder access level (for example, no access to code repositories and need at least Basic access to contribute to code), which are product-specific permission details that function as RBAC-like security configuration knowledge.",updated -https://learn.microsoft.com/en-us/azure/devops/organizations/security/troubleshoot-permissions?view=azure-devops,Troubleshoot permissions,"Troubleshoot access, permission issues - Azure DevOps",Troubleshoot Azure DevOps access and permission issues,Find helpful troubleshooting information for resolving access and permission issues in Azure DevOps.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Due to the extensive security and permission structure of Azure DevOps, you might need to investigate why a user lacks access to a project, service, or feature they expect. Find step-by-step guidance to understand and address issues a user might encounter when connecting to a project or accessing an Azure DevOps service or feature.",2025-12-04T14:04:00.000Z,troubleshooting,troubleshooting,0.8,True,"Organized as step-by-step guidance to diagnose why users lack access to projects or features, mapping symptoms to likely permission causes and resolutions; product-specific troubleshooting of security/permissions.",unchanged +https://learn.microsoft.com/en-us/azure/devops/organizations/security/stakeholder-access?view=azure-devops,Stakeholder access quick reference,Stakeholder access quick reference - Azure DevOps,Understand Azure DevOps Stakeholder access permissions,Stakeholder access to common user tasks for Azure DevOps,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Stakeholder access gives an unlimited number of users free, limited access to your organization. 
Stakeholders can: Stakeholders don't have access to code repositories (Azure Repos). To contribute to the code base, assign users at least Basic access. To get started, see Get started as a Stakeholder. For administrative tasks, see Manage your project.",2026-04-14T01:03:00.000Z,overview,security,0.7,True,"Page defines the exact capabilities and restrictions of the Stakeholder access level (for example, no access to code repositories and need at least Basic access to contribute to code), which are product-specific permission details that function as RBAC-like security configuration knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/devops/organizations/security/troubleshoot-permissions?view=azure-devops,Troubleshoot permissions,"Troubleshoot access, permission issues - Azure DevOps",Diagnose and fix Azure DevOps permission issues,Find helpful troubleshooting information for resolving access and permission issues in Azure DevOps.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Due to the extensive security and permission structure of Azure DevOps, you might need to investigate why a user lacks access to a project, service, or feature they expect. Find step-by-step guidance to understand and address issues a user might encounter when connecting to a project or accessing an Azure DevOps service or feature.",2026-04-20T08:00:00.000Z,troubleshooting,troubleshooting,0.78,True,"The page is explicitly a troubleshooting guide for Azure DevOps access and permission problems, providing step-by-step diagnosis and resolution for why a user cannot access a project, service, or feature. 
This is product-specific troubleshooting knowledge (symptom → investigation steps → resolution) that goes beyond generic debugging concepts.",updated https://learn.microsoft.com/en-us/azure/devops/organizations/security/view-permissions?view=azure-devops,View permissions,View permissions and effective access - Azure DevOps,View and troubleshoot Azure DevOps effective permissions,"Learn how to view permissions, check effective permissions for a user or group, and troubleshoot access issues in Azure DevOps.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This article shows how to view permissions and check effective access for users and groups at the organization, project, and repository (or other object) levels. It explains permission states (Allow, Deny, Inherit), how inheritance and group membership affect effective permissions, and steps to troubleshoot common access problems. You learn about: Quick steps: Note Permission management features and UI vary slightly between A",2026-02-04T22:06:00.000Z,how-to,security,0.7,True,Shows exact UI paths and possibly APIs to inspect effective permissions and resolve access issues; product-specific security troubleshooting steps.,unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/settings/about-areas-iterations?view=azure-devops,Area & iteration paths,How are area and iteration paths used? - Azure DevOps,,Learn how to effectively use area paths and iteration paths in Azure DevOps to organize and manage your projects.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Area paths group work items by team, product, or feature area. Iteration paths group work into sprints, milestones, or other time-related periods. Both fields support hierarchical paths. Define area and iteration paths for a project, and teams can select which paths to use for their backlog and Agile tools. Learn how Agile tools use these paths inAgile tools that rely on areas and iterations. 
Note Area paths and iteration pat",2025-12-22T08:00:00.000Z,overview,,0.3,False,Conceptual explanation of area and iteration paths; organizational guidance without numeric limits or detailed configuration tables.,unchanged https://learn.microsoft.com/en-us/azure/devops/organizations/settings/about-settings?view=azure-devops,About settings,Settings overview for Azure DevOps - Azure DevOps,,"Overview of settings available to administrators for your team, project, collection, and organization in Azure DevOps.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 You can configure resources for yourself, your team, project, or organization from the administrative Settings page. The settings available to you depend on your security group membership or administrative role. If you're new to being a Project Administrator, see Get started as an administrator for a comprehensive guide. Note You can delegate several tasks to a user with Basic or Stakeholder access by adding them to the Project Co",2026-02-17T08:00:00.000Z,overview,,0.1,False,High-level settings overview; navigation/overview content without detailed configuration parameter tables or limits.,unchanged @@ -774,18 +772,18 @@ https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/boards/sprint- https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/boards/sprint-269-update,February 11,Azure DevOps Release Notes - Azure Boards Sprint 269 Update,,"See the Sprint 269 feature updates for Azure Boards, including next steps.",,2026-02-12T02:04:00.000Z,article,,0.2,False,"Azure Boards Sprint 269 release notes are feature updates; no indication of structured limits, configuration matrices, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/boards/sprint-270-update,March 5,Azure DevOps Release Notes - Azure Boards Sprint 270 Update,,"See the Sprint 270 feature updates for Azure Boards, including next 
steps.",,2026-03-06T03:04:00.000Z,article,,0.2,False,"Sprint release notes for Azure Boards describe new features and changes but do not focus on structured limits, configuration matrices, troubleshooting mappings, or other stable expert reference data as defined by the sub-skill types. Content is primarily update/what’s new information rather than reusable expert knowledge in the specified categories.",unchanged https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/boards/sprint-271-update,March 31,Azure DevOps Release Notes - Azure Boards Sprint 271 Update,,"See the Sprint 271 feature updates for Azure Boards, including next steps.",,2026-03-31T20:43:00.000Z,release-notes,,0.0,False,"Sprint release notes describe new features and changes but do not focus on limits, configuration matrices, error codes, or other structured expert reference data as defined by the sub-skill types.",unchanged -https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/boards/sprint-272-update,April 14,Azure DevOps Release Notes - Azure Boards Sprint 272 Update,,"See the Sprint 272 feature updates for Azure Boards, including next steps.",,2026-04-14T16:51:00.000Z,release-notes,,0.2,False,"Sprint release notes for Azure Boards describe new and changed features but do not focus on structured limits, configuration matrices, troubleshooting mappings, or other stable expert reference data as defined by the sub-skill types. 
Content is primarily update/what’s new information rather than reusable expert knowledge in the specified categories.",new +https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/boards/sprint-272-update,April 14,Azure DevOps Release Notes - Azure Boards Sprint 272 Update,,"See the Sprint 272 feature updates for Azure Boards, including next steps.",,2026-04-14T16:51:00.000Z,release-notes,,0.2,False,"Sprint release notes for Azure Boards describe new and changed features but do not focus on structured limits, configuration matrices, troubleshooting mappings, or other stable expert reference data as defined by the sub-skill types. Content is primarily update/what’s new information rather than reusable expert knowledge in the specified categories.",unchanged https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/general/sprint-270-update,March 5,Azure DevOps release notes - Azure DevOps Sprint 270 Update,,"See the Sprint 270 feature updates for Azure DevOps, including next steps.",,2026-03-06T03:04:00.000Z,release-notes,,0.0,False,"Sprint release notes describe new features and changes but typically do not provide structured limits, configuration matrices, troubleshooting mappings, or other stable expert reference data as defined by the sub-skill types.",unchanged https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/general/sprint-271-update,March 31,Azure DevOps release notes - Azure DevOps Sprint 271 Update,,"See the Sprint 271 feature updates for Azure DevOps, including next steps.",,2026-03-31T20:43:00.000Z,release-notes,,0.2,False,"General Azure DevOps Sprint 271 release notes are primarily change logs and feature summaries, not detailed references with numeric limits, configuration parameter tables, troubleshooting mappings, or decision criteria required for the defined sub-skill types.",unchanged -https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/general/sprint-272-update,April 14,Azure DevOps release notes - 
Azure DevOps Sprint 272 Update,,"See the Sprint 272 feature updates for Azure DevOps, including next steps.",,2026-04-14T16:51:00.000Z,release-notes,,0.2,False,"General Azure DevOps Sprint 272 release notes are primarily change logs and feature announcements. They usually lack detailed limits, configuration parameter tables, troubleshooting mappings, or decision matrices required for the expert-knowledge sub-skill types, so no classification is assigned.",new +https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/general/sprint-272-update,April 14,Azure DevOps release notes - Azure DevOps Sprint 272 Update,,"See the Sprint 272 feature updates for Azure DevOps, including next steps.",,2026-04-14T16:51:00.000Z,release-notes,,0.2,False,"General Azure DevOps Sprint 272 release notes are primarily change logs and feature announcements. They usually lack detailed limits, configuration parameter tables, troubleshooting mappings, or decision matrices required for the expert-knowledge sub-skill types, so no classification is assigned.",unchanged https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/ghazdo/sprint-268-update,January 26,Azure DevOps release notes - GitHub Advanced Security for Azure DevOps 268 Update,,"See the Sprint 268 feature updates for GitHub Advanced Security for Azure DevOps, including next steps.",,2026-01-27T18:16:00.000Z,article,,0.25,False,"Sprint 268 update for GitHub Advanced Security for Azure DevOps; change log style content without numeric limits, config matrices, or error code mappings.",unchanged https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/ghazdo/sprint-269-update,February 11,Azure DevOps release notes - GitHub Advanced Security for Azure DevOps 269 Update,,"See the Sprint 269 feature updates for GitHub Advanced Security for Azure DevOps, including next steps.",,2026-02-18T02:04:00.000Z,article,,0.25,False,"Sprint 269 update for GitHub Advanced Security for Azure DevOps; future-dated release 
notes, primarily feature announcements, not structured limits, configuration, or troubleshooting guidance.",unchanged https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/ghazdo/sprint-270-update,March 5,Azure DevOps release notes - GitHub Advanced Security for Azure DevOps 270 Update,,"See the Sprint 270 feature updates for GitHub Advanced Security for Azure DevOps, including next steps.",,2026-03-06T03:04:00.000Z,article,,0.2,False,"Sprint release notes typically describe new features and changes at a high level without structured limits, configuration matrices, error-code troubleshooting flows, or decision frameworks. The description suggests a feature update summary for GitHub Advanced Security for Azure DevOps, not detailed quotas, configs, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/ghazdo/sprint-271-update,March 31,Azure DevOps release notes - GitHub Advanced Security for Azure DevOps 271 Update,,"See the Sprint 271 feature updates for GitHub Advanced Security for Azure DevOps, including next steps.",,2026-03-31T20:43:00.000Z,release-notes,,0.2,False,"Sprint release notes for GitHub Advanced Security for Azure DevOps typically describe new features and changes at a high level without structured limits, configuration tables, error-code mappings, or decision matrices that match any sub-skill category.",unchanged -https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/ghazdo/sprint-272-update,April 14,Azure DevOps release notes - GitHub Advanced Security for Azure DevOps Sprint 272 Update,,"See the Sprint 272 feature updates for GitHub Advanced Security for Azure DevOps, including next steps.",,2026-04-14T16:51:00.000Z,release-notes,,0.2,False,"Sprint release notes for GitHub Advanced Security for Azure DevOps typically describe new features and changes at a high level without structured limits, configuration tables, error-code mappings, or decision matrices. 
They are primarily update/what’s new content rather than deep technical reference, so they don’t fit any expert-knowledge sub-skill type defined.",new +https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/ghazdo/sprint-272-update,April 14,Azure DevOps release notes - GitHub Advanced Security for Azure DevOps Sprint 272 Update,,"See the Sprint 272 feature updates for GitHub Advanced Security for Azure DevOps, including next steps.",,2026-04-14T16:51:00.000Z,release-notes,,0.2,False,"Sprint release notes for GitHub Advanced Security for Azure DevOps typically describe new features and changes at a high level without structured limits, configuration tables, error-code mappings, or decision matrices. They are primarily update/what’s new content rather than deep technical reference, so they don’t fit any expert-knowledge sub-skill type defined.",unchanged https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/pipelines/sprint-269-update,February 11,Azure DevOps release notes - Azure Pipelines Sprint 269 update,,"See the Sprint 269 feature updates for Azure Pipelines, including next steps.",,2026-02-13T02:04:00.000Z,release-notes,,0.2,False,"Azure Pipelines Sprint 269 release notes list new features and changes, not structured expert-knowledge content like quotas, config matrices, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/pipelines/sprint-271-update,March 31,Azure DevOps release notes - Azure Pipelines Sprint 271 Update,,"See the Sprint 271 feature updates for Azure Pipelines, including next steps.",,2026-03-31T20:43:00.000Z,release-notes,,0.2,False,"Sprint release notes describe new and changed features but are primarily announcements and high-level descriptions, not structured expert references like limits, configuration matrices, error-code troubleshooting, or decision guides. 
They typically lack the stable, tabulated parameters, quotas, or role definitions required by the specified sub-skill types.",unchanged -https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/pipelines/sprint-272-update,April 14,Azure DevOps release notes - Azure Pipelines Sprint 272 Update,,"See the Sprint 272 feature updates for Azure Pipelines, including next steps.",,2026-04-14T16:51:00.000Z,release-notes,,0.0,False,"Sprint release notes describe new features and changes but are not structured as limits, configuration references, troubleshooting guides, or other defined sub-skill types with stable expert knowledge. Content is primarily update/what’s new information rather than reusable technical reference with numeric limits, config tables, or decision matrices.",new +https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/pipelines/sprint-272-update,April 14,Azure DevOps release notes - Azure Pipelines Sprint 272 Update,,"See the Sprint 272 feature updates for Azure Pipelines, including next steps.",,2026-04-14T16:51:00.000Z,release-notes,,0.0,False,"Sprint release notes describe new features and changes but are not structured as limits, configuration references, troubleshooting guides, or other defined sub-skill types with stable expert knowledge. 
Content is primarily update/what’s new information rather than reusable technical reference with numeric limits, config tables, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/repos/sprint-268-update,January 26,Azure DevOps Release Notes - Azure Repos Sprint 268 Update,,"See the Sprint 268 feature updates for Azure Repos, including next steps.",,2026-01-27T18:16:00.000Z,release-notes,,0.25,False,"Sprint 268 Azure Repos release notes focus on new features and changes, not on numeric limits, configuration parameter references, or error-code-based troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/repos/sprint-269-update,February 11,Azure DevOps Release Notes - Azure Repos Sprint 269 Update,,"See the Sprint 269 feature updates for Azure Repos, including next steps.",,2026-02-12T02:04:00.000Z,release-notes,,0.25,False,"Sprint 269 Azure Repos release notes are what’s-new content for a specific sprint; they don’t present limits, configuration matrices, or troubleshooting mappings in the required structured way.",unchanged https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/repos/sprint-270-update,March 5,Azure DevOps Release Notes - Azure Repos Sprint 270 Update,,"See the Sprint 270 feature updates for Azure Repos, including next steps.",,2026-03-06T03:04:00.000Z,release-notes,,0.2,False,"Sprint release notes typically describe new features and changes without structured limits, configuration matrices, error-code troubleshooting, or decision frameworks. 
This page is an update summary for Azure Repos Sprint 270 and is unlikely to contain the kind of stable, structured expert knowledge (limits, quotas, config tables, error mappings, or decision matrices) targeted by these sub-skill types.",unchanged @@ -793,7 +791,7 @@ https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/sprint-268-upd https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/sprint-269-update,February 11,Support for GitHub Copilot Custom Agents in Azure Boards,,Support for GitHub Copilot Custom Agents in Azure Boards,"Azure Boards now supports selecting GitHub Copilot custom agents when creating a pull request from a work item. Custom agents created at the repository or organization level in GitHub automatically appear in Azure DevOps, and your selected agent will generate the code changes and create the pull request in the chosen repository. Check out the release notes for details.",2026-03-06T03:04:00.000Z,article,,0.2,False,"Sprint release note announcing support for GitHub Copilot custom agents in Azure Boards. Summary suggests feature description only, without detailed configuration options, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/sprint-270-update,March 5,Auto‑complete pull requests by default,,Auto‑complete pull requests by default,"With this sprint, we're adding a new repository setting that enables pull requests to be set to auto‑complete by default when they’re created. This helps teams reduce manual follow‑ups and streamline the merge process. The setting can be configured at the project level or per repository, giving teams flexibility in how they manage pull request completion. Check out the release notes for details.",2026-03-11T01:04:00.000Z,article,,0.2,False,"Sprint release note describing a new auto-complete PR setting at a conceptual level. 
No specific configuration parameter tables, limits, or troubleshooting details are indicated.",unchanged https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/sprint-271-update,March 31,Improved deployment visibility with artifact IDs and stage level views,,Improved deployment visibility with artifact IDs and stage level views,"This sprint enhances Azure Pipelines deployment clarity by surfacing the exact build artifact deployed in each pipeline run and introducing a new Stages view. Together, these updates make it easier to track what version is deployed, where it’s running, and how deployments progress across environments Check out the release notes for details.",2026-03-31T20:43:00.000Z,article,,0.2,False,"Sprint update teaser text pointing to release notes; marketing-style summary without detailed technical configuration, limits, or troubleshooting content.",unchanged -https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/sprint-272-update,April 14,CodeQL default setup public preview and enhanced work item experiences,,CodeQL default setup public preview and enhanced work item experiences,"This sprint brings built-in Code Search to Azure DevOps without requiring an extension, CodeQL default setup to public preview for GitHub Advanced Security for Azure DevOps, enhances Azure Boards editing and filtering experiences, and improves GitHub service connection details in Azure Pipelines. 
Check out the release notes for details.",2026-04-14T16:51:00.000Z,article,,0.2,False,"Sprint release notes summarizing new features; lacks concrete limits, configuration tables, troubleshooting mappings, or decision criteria.",new +https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/sprint-272-update,April 14,CodeQL default setup public preview and enhanced work item experiences,,CodeQL default setup public preview and enhanced work item experiences,"This sprint brings built-in Code Search to Azure DevOps without requiring an extension, CodeQL default setup to public preview for GitHub Advanced Security for Azure DevOps, enhances Azure Boards editing and filtering experiences, and improves GitHub service connection details in Azure Pipelines. Check out the release notes for details.",2026-04-14T16:51:00.000Z,article,,0.2,False,"Sprint release notes summarizing new features; lacks concrete limits, configuration tables, troubleshooting mappings, or decision criteria.",unchanged https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/testplans/sprint-268-update,January 26,Azure DevOps Release Notes - Azure Test Plans Sprint 268 Update,,"See the Sprint 268 feature updates for Azure Test Plans, including next steps.",,2026-01-27T18:16:00.000Z,article,,0.25,False,"Sprint 268 Azure Test Plans release notes; feature updates and next steps, not structured limits, configuration, or troubleshooting guidance.",unchanged https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/testplans/sprint-270-update,March 5,Azure DevOps Release Notes - Azure Test Plans Sprint 270 Update,,"See the Sprint 270 feature updates for Azure Test Plans, including next steps.",,2026-03-06T03:04:00.000Z,article,,0.2,False,"Sprint release notes for Azure Test Plans describe new features and changes but do not focus on structured limits, configuration matrices, troubleshooting mappings, or other expert-reference data as defined by the sub-skill types. 
Content is primarily update/what’s new information rather than reusable expert knowledge tables or patterns.",unchanged https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/testplans/sprint-271-update,March 31,Azure DevOps Release Notes - Azure Test Plans Sprint 271 Update,,"See the Sprint 271 feature updates for Azure Test Plans, including next steps.",,2026-03-31T20:43:00.000Z,release-notes,,0.0,False,"Sprint release notes describe new features and changes but do not focus on structured limits, configuration matrices, troubleshooting mappings, or other categorized expert-knowledge patterns defined in the sub-skill types.",unchanged @@ -801,8 +799,8 @@ https://learn.microsoft.com/en-us/azure/devops/release-notes/docswhatsnew/,What' https://learn.microsoft.com/en-us/azure/devops/release-notes/docswhatsnew/azure-devops-docs-2026-01,January 2026,"Azure DevOps docs: What's new for January 1, 2026 - January 31, 2026",,"What's new in the Azure DevOps docs for January 1, 2026 - January 31, 2026.","Welcome to what's new in the Azure DevOps docs from January 1, 2026 through January 31, 2026. This article lists some of the major changes to docs during this period.",2026-03-11T17:05:00.000Z,article,,0.0,False,"Monthly 'what's new' summary for Azure DevOps docs; functions as a change log and navigation aid, not a source of expert configuration, limits, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/devops/release-notes/docswhatsnew/azure-devops-docs-2026-02,February 2026,"Azure DevOps docs: What's new for February 1, 2026 - February 28, 2026",,"What's new in the Azure DevOps docs for February 1, 2026 - February 28, 2026.","Welcome to what's new in the Azure DevOps docs from February 1, 2026 through February 28, 2026. 
This article lists some of the major changes to docs during this period.",2026-04-09T17:06:00.000Z,article,,0.0,False,"Another monthly 'what's new' documentation summary; contains no product-specific limits, configs, or troubleshooting details.",unchanged https://learn.microsoft.com/en-us/azure/devops/release-notes/docswhatsnew/azure-devops-docs-whats-new,March 2026,"Azure DevOps docs: What's new for March 1, 2026 - March 31, 2026",,"What's new in the Azure DevOps docs for March 1, 2026 - March 31, 2026.","Welcome to what's new in the Azure DevOps docs from March 1, 2026 through March 31, 2026. This article lists some of the major changes to docs during this period.",2026-04-09T17:06:00.000Z,article,,0.0,False,"Monthly 'what's new' documentation summary; meta-doc about changes, not technical guidance or detailed product behavior.",unchanged -https://learn.microsoft.com/en-us/azure/devops/release-notes/features-timeline,Roadmap,Azure DevOps Roadmap,,Azure DevOps feature roadmap,|What's New|Developer Community|DevOps Blog|Documentation|,2026-01-22T22:02:00.000Z,article,,0.0,False,"Roadmap/landing page with navigation; no technical configuration, limits, or troubleshooting details.",unchanged -https://learn.microsoft.com/en-us/azure/devops/release-notes/features-timeline-released,Released features,Azure DevOps Released Features,,Azure DevOps release notes and server build numbers,|What's New|Developer Community|DevOps Blog|Documentation|,2026-04-14T08:00:00.000Z,release-notes,,0.2,False,"High-level release notes timeline and build numbers; no detailed limits, configuration parameters, error codes, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/devops/release-notes/features-timeline,Roadmap,Azure DevOps Roadmap,,Azure DevOps feature roadmap,|What's New|Developer Community|DevOps Blog|Documentation|,2026-04-02T08:00:00.000Z,article,,0.0,False,"Roadmap/what's new navigation content without detailed limits, configuration parameters, 
troubleshooting mappings, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/devops/release-notes/features-timeline-released,Released features,Azure DevOps Released Features,,Azure DevOps release notes and server build numbers,|What's New|Developer Community|DevOps Blog|Documentation|,2026-04-14T08:00:00.000Z,release-notes,,0.2,False,"High-level release notes timeline and build numbers; no detailed limits, configuration parameters, error codes, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/devops/report/?view=azure-devops,Reporting and Analytics >>,Analytics & Reporting - Azure DevOps,,"Create dashboards, track team velocity, and generate reports to monitor progress and improve development processes in Azure DevOps.","Transform your development data into actionable insights. Create dashboards, track progress, and make data-driven decisions that accelerate your team's delivery.",2026-04-10T01:04:00Z,landing-page,,0.1,False,"Landing/overview page for Azure DevOps analytics and reporting with conceptual guidance on dashboards and tracking progress, but no evidence of specific limits, configuration tables, error codes, or other product-specific expert details.",unchanged https://learn.microsoft.com/en-us/azure/devops/report/analytics/analytics-best-practices?view=azure-devops,Best practices,Analytics best practices - Azure DevOps,Apply best practices when querying Azure DevOps Analytics,Learn about the best practices to use when you query Analytics for Azure DevOps.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Analytics is the reporting platform for Azure DevOps, which allows you to gain insights from your data and make data-driven decisions. Analytics is optimized for fast read-access and server-based aggregations, and it provides various tools to visualize and analyze your data. 
In this article, we share some best practices for using Analytics in Azure DevOps.",2025-07-15T16:57:00.000Z,overview,best-practices,0.8,True,"Explicit best-practices article for Analytics queries; likely includes product-specific query patterns, performance tips, and gotchas.",unchanged https://learn.microsoft.com/en-us/azure/devops/report/analytics/analytics-permissions-prerequisites?view=azure-devops,Permissions and prerequisites,Permissions and prerequisites to access Analytics - Azure DevOps,Meet permissions and prerequisites for Azure DevOps Analytics,Understand the permissions and prerequisites to meet to access and generate reports with Analytics.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 To work with Analytics and create reports, several prerequisites must be met as summarized in this article. By default, all project members are provided access to Analytics data for the projects they are members of, including members added to the projectReadersgroup. Users withStakeholderaccess have no access to view or edit Analytics views.",2025-10-27T22:02:00.000Z,overview,security,0.85,True,"Details prerequisites and permissions for Analytics access, including default access for project members and exclusion of Stakeholder users; contains concrete permission behaviors.",unchanged @@ -813,11 +811,11 @@ https://learn.microsoft.com/en-us/azure/devops/report/analytics/entity-reference https://learn.microsoft.com/en-us/azure/devops/report/analytics/entity-reference-pipelines?view=azure-devops,Pipelines (Azure Pipelines),Pipelines properties reference for Analytics - Azure DevOps,Azure Pipelines Analytics properties and enums reference,"Properties, enumerated types, and members metadata reference for the Analytics service for Azure Pipelines.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 The Analytics service collects pipeline and test activity generated via Azure Pipelines. 
This article describes the properties that you can use to generate an Analytics report for pipelines. You use a combination of properties to filter a query, aggregate data, or build a report. Note This article provides descriptions of entities, properties, and enumerated types supported by the Analytics data model. To query the data mode",2025-10-27T22:02:00.000Z,reference,configuration,0.7,True,"Describes properties and enumerated members for pipeline-related Analytics entities, which are concrete configuration/schema details.",unchanged https://learn.microsoft.com/en-us/azure/devops/report/analytics/entity-reference-test-plans?view=azure-devops,Testing (Azure Test Plans),Test metadata reference for Analytics - Azure DevOps,Azure Test Plans Analytics metadata and properties,"Properties, enumerated types, and members metadata reference for the Analytics service and Azure DevOps testing.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 The Analytics service collects all data for all Azure DevOps test activities. Azure Test Plans supports the definition and execution of planned and exploratory tests. And with Azure Pipelines, you can also execute automated tests with Continuous Integration/Continuous Deployment (CI/CD) workflows. 
If you're new to Azure DevOps testing, we recommend viewing the following articles: The metadata information provided in this arti",2025-10-27T22:02:00.000Z,reference,configuration,0.7,True,"Provides detailed metadata for test-related Analytics entities, including property names and enums, which are configuration-level details.",unchanged https://learn.microsoft.com/en-us/azure/devops/report/dashboards/add-charts-to-dashboard?view=azure-devops,Add charts and built-in reports to a dashboard,Add built-in charts to a team dashboard - Azure DevOps,Add built-in charts to Azure DevOps dashboards,Learn how to add system-generated charts or query-based charts to a team dashboard.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This article explains how to add query-based charts and in-context reports to a dashboard from their respective functional pages. For example, you can add the Team Velocity in-context Analytics report to a dashboard. After adding the report, you can modify the corresponding widget configuration parameters to suit your needs.",2025-03-25T14:59:00.000Z,how-to,configuration,0.65,True,Explains adding system-generated and query-based charts and modifying widget configuration parameters; contains product-specific chart configuration options.,unchanged -https://learn.microsoft.com/en-us/azure/devops/report/dashboards/add-markdown-to-dashboard?view=azure-devops,Add Markdown to a dashboard,Add Markdown content to a team dashboard - Azure DevOps,,Learn how to add and configure the Markdown widget you add to a team dashboard in Azure DevOps.,Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Add the Markdown widget to a dashboard to share useful information with your team and stakeholders. 
Use it to display: The following example shows team information and links:,2026-04-14T01:03:00.000Z,how-to,,0.1,False,"How-to guide for adding Markdown widget to dashboards; lacks numeric limits, configuration matrices, or detailed diagnostic/security information.",updated +https://learn.microsoft.com/en-us/azure/devops/report/dashboards/add-markdown-to-dashboard?view=azure-devops,Add Markdown to a dashboard,Add Markdown content to a team dashboard - Azure DevOps,,Learn how to add and configure the Markdown widget you add to a team dashboard in Azure DevOps.,Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Add the Markdown widget to a dashboard to share useful information with your team and stakeholders. Use it to display: The following example shows team information and links:,2026-04-14T01:03:00.000Z,how-to,,0.1,False,"How-to guide for adding Markdown widget to dashboards; lacks numeric limits, configuration matrices, or detailed diagnostic/security information.",unchanged https://learn.microsoft.com/en-us/azure/devops/report/dashboards/add-widget-to-dashboard?view=azure-devops,Add widgets to a dashboard,Add a widget to a team dashboard - Azure DevOps,Add and configure widgets on Azure DevOps dashboards,Learn how to select and configure widgets that you add to a team dashboard in Azure DevOps.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Widgets smartly format data to provide access to easily consumable data. You add widgets to your team dashboards to gain visibility into the status and trends occurring as you develop your software project. Each widget provides access to a chart, user-configurable information, or a set of links that open a feature or function. You can add one or more charts or widgets to your dashboard. Up to 200 widgets total. 
You add severa",2026-02-17T08:00:00.000Z,quickstart,limits-quotas,0.8,True,Explicitly states a numeric limit: up to 200 widgets per dashboard; this is a concrete product quota not generally known to LLMs.,unchanged https://learn.microsoft.com/en-us/azure/devops/report/dashboards/analytics-extension?view=azure-devops-server,Enable or install Analytics,Install or enable Analytics - Azure DevOps Server,Install or enable Azure DevOps Server Analytics service,Learn how to add or enable Analytics for your Azure DevOps Server collection.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 For Azure DevOps Server 2020 and later versions, the Analytics service is generally available and automatically enabled for all new project collections added to your server. For project collections upgraded from a previous version, you might need to manually enable it. You enable Analytics for each project collection for which you want to generate Analytics reports. Note The Analytics Marketplace extension and Analytics are not",2025-03-25T14:59:00.000Z,how-to,configuration,0.75,True,"Describes enabling Analytics per project collection, version-specific behavior, and extension deprecation; product-specific configuration and deployment requirements.",unchanged
-https://learn.microsoft.com/en-us/azure/devops/report/dashboards/analytics-widgets?view=azure-devops,Analytics-based widgets,Analytics Widgets Overview for Azure DevOps - Azure DevOps,,"Analytics widgets in Azure DevOps help you visualize project health, track progress, and optimize team performance. Explore key widgets and boost productivity.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Analytics widgets provide valuable insights into the health and status of your work. Add an Analytics widget to a dashboard the same way you add any other type of widget. 
For details, see Add a widget to your dashboard.",2026-04-14T01:03:00.000Z,overview,,0.1,False,"Analytics widgets overview and usage; no limits, configuration tables, error codes, or product-specific thresholds. Primarily conceptual/how-to content.",updated +https://learn.microsoft.com/en-us/azure/devops/report/dashboards/analytics-widgets?view=azure-devops,Analytics-based widgets,Analytics Widgets Overview for Azure DevOps - Azure DevOps,,"Analytics widgets in Azure DevOps help you visualize project health, track progress, and optimize team performance. Explore key widgets and boost productivity.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Analytics widgets provide valuable insights into the health and status of your work. Add an Analytics widget to a dashboard the same way you add any other type of widget. For details, see Add a widget to your dashboard.",2026-04-14T01:03:00.000Z,overview,,0.1,False,"Analytics widgets overview and usage; no limits, configuration tables, error codes, or product-specific thresholds. Primarily conceptual/how-to content.",unchanged https://learn.microsoft.com/en-us/azure/devops/report/dashboards/burndown-guidance?view=azure-devops,Burndown and burnup guidance,Burndown and burnup guidance - Azure DevOps,Choose and use Azure DevOps burndown and burnup charts,Learn how to choose and use burndown/burnup charts to review sprints and releases in Azure DevOps.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Burndown and burnup charts support project management to visually track work completed over time. Sprint burndown charts track planned work for a team and a selected sprint, so teams can review how efficiently they plan and execute sprint over sprint. Burndown charts generally show a downward trend. But, if teams add work through a sprint or release period, then the chart shows upward trends. 
These charts help teams monitor w",2025-10-27T22:02:00.000Z,overview,best-practices,0.7,True,"Guidance on when and how to use different burndown/burnup charts; provides actionable, product-specific recommendations for chart selection and interpretation.",unchanged https://learn.microsoft.com/en-us/azure/devops/report/dashboards/charts?view=azure-devops,Create work tracking charts,"Status and trend work item, query-based charts - Azure DevOps",,"Learn how to add status, progress, and trend charts to dashboards from flat-list queries in Azure DevOps.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Chart the results of a flat-list query to quickly view the status of work in progress. You can create pie, column, bar, pivot, trend, or burndown charts that show a count of work items or a sum of numeric fields like Story Points, Effort, or Remaining Work — grouped by State, Assigned To, or any other system or custom field. Tip You can use AI to help with this task later in this article, or see Enable AI assistance with Azure Dev",2026-03-24T17:07:00.000Z,how-to,,0.2,False,"Describes adding status and trend charts from queries. 
This is primarily feature usage/tutorial content; the summary does not indicate detailed configuration parameters, limits, or security/diagnostic specifics that would qualify as expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/devops/report/dashboards/configure-burndown-burnup-widgets?view=azure-devops,Configure a burndown or burnup widget,Configure a burndown or burnup widget - Azure DevOps,,Learn how to configure a burndown or burnup widget to create charts that you add to a dashboard to track progress across one or more teams in Azure DevOps.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 In Azure DevOps dashboards, burndown and burnup widgets give you flexibility to create charts for any type of scope or number of teams in specified time periods. Burndown charts focus on remaining work. Burnup charts focus on completed work. Both chart types help your team determine whether you're on track to complete your work by the end date. For an overview of all burndown and burnup charts available to you, see Burndown and bu",2026-03-19T14:34:00.000Z,how-to,,0.2,False,"Covers configuring burndown/burnup widgets conceptually (scope, teams, time periods). The summary suggests a general how-to without explicit parameter tables, numeric limits, or security/diagnostic mappings that would constitute expert knowledge.",unchanged @@ -831,7 +829,7 @@ https://learn.microsoft.com/en-us/azure/devops/report/dashboards/cycle-time-and- https://learn.microsoft.com/en-us/azure/devops/report/dashboards/dashboard-focus?view=azure-devops,Define your dashboard focus,Create Actionable Dashboards in Azure DevOps - Azure DevOps,,Design effective dashboards in Azure DevOps to keep your team informed and projects on track. Learn how to create and customize dashboards with actionable widgets.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Actionable dashboards keep your team and stakeholders informed and projects on track. 
You can create dashboards for projects that serve multiple teams or a specific team, and add widgets that show content for the signed-in user. Tip You can use AI to help with this task later in this article, or see Enable AI assistance with Azure DevOps MCP Server to get started.",2026-03-23T08:00:00.000Z,conceptual,,0.2,False,"Covers how to create and customize dashboards with widgets; appears to be a conceptual/UX and basic how-to guide without detailed configuration tables, limits, or product-specific diagnostic/security parameters.",unchanged https://learn.microsoft.com/en-us/azure/devops/report/dashboards/dashboard-permissions?view=azure-devops,Set dashboard permissions (Security),Set dashboard permissions for team members - Azure DevOps,Set Azure DevOps dashboard permissions for team members,"Learn how to set permissions to create, edit, or delete dashboards in Azure DevOps.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Dashboards are viewable by all members of the Project Valid Users group. Permissions to edit, delete, or manage dashboards can be configured for both team and project dashboards. As a member of the Project Administrators group, you can set the default dashboard permissions for all teams. As a team or project administrator, you have the flexibility to set individual dashboard permissions for team members. This enables you to t",2025-12-19T16:00:00.000Z,how-to,security,0.8,True,"Explains dashboard permissions, mentions Project Valid Users and Project Administrators groups; contains concrete RBAC group names and permission behaviors.",unchanged https://learn.microsoft.com/en-us/azure/devops/report/dashboards/dashboards?view=azure-devops,"Add, rename, & delete dashboards","Add, Rename, Delete, and Manage Team Dashboards - Azure DevOps",Configure and manage Azure DevOps team dashboards,"Learn how to create, edit, and delete Azure DevOps dashboards. 
Find out how to set dashboard permissions and add widgets to dashboards to view progress and trends.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This quickstart shows how to create, edit, rename, and delete dashboards in Azure DevOps. It also covers permissions, adding widgets, and troubleshooting common issues. Learn how to do the following tasks: Quick steps: Tip You can use AI to help with Azure DevOps tasks. See Enable AI assistance with Azure DevOps MCP Server to get started.",2026-02-17T08:00:00.000Z,quickstart,configuration,0.65,True,"Task-focused guide on creating, editing, deleting dashboards and setting permissions; contains product-specific permission options and widget configuration steps.",unchanged -https://learn.microsoft.com/en-us/azure/devops/report/dashboards/faqs?view=azure-devops,FAQs,FAQs - Azure dashboards and charts - Azure DevOps,Troubleshoot Azure DevOps dashboards and charts issues,Answers to frequently asked questions about Azure DevOps dashboards and charts,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Find answers to frequently asked questions when using Azure DevOps dashboards, charts, and reports.",2025-12-19T16:00:00Z,faq,troubleshooting,0.65,True,"FAQ for dashboards and charts typically includes symptom-based Q&A and product-specific resolutions, qualifying as troubleshooting guidance.",unchanged +https://learn.microsoft.com/en-us/azure/devops/report/dashboards/faqs?view=azure-devops,FAQs,Azure DevOps dashboards and charts FAQs - Azure DevOps,,"Get answers to frequently asked questions about Azure DevOps dashboards, charts, and reports.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Find answers to frequently asked questions about using Azure DevOps dashboards, charts, and reports.",2026-04-22T21:02:00Z,faq,,0.0,False,"FAQ page about Azure DevOps dashboards and charts; based on the description it likely covers usage questions and conceptual guidance, 
without explicit numeric limits, configuration parameter tables, error-code-to-solution mappings, or other product-specific expert details as defined in the sub-skill types.",updated https://learn.microsoft.com/en-us/azure/devops/report/dashboards/overview?view=azure-devops,"Dashboards, charts, reports, & widgets","Understand dashboards, charts, reports, and widgets - Azure DevOps",,"Learn about charts, widgets, dashboards, and reports available to monitor status and trends in Azure DevOps.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Gain visibility into your team's progress by adding one or more widgets or charts to your dashboard. Customizable, highly configurable dashboards provide you and your teams with the flexibility to share information, monitor progress and trends, and improve your workflow processes. Each team can tailor their dashboards to share information and monitor their progress. If you're just starting out, read Add, rename, and delete das",2026-02-17T08:00:00.000Z,overview,,0.2,False,"Conceptual overview of dashboards, charts, and widgets; no product-specific limits, configs, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/devops/report/dashboards/quick-ref?view=azure-devops,Dashboards quick reference,"Quick reference to dashboards, charts, and reports - Azure DevOps",,"An index of articles that explain dashboard, chart, report, and widget tasks for Azure Boards in Azure DevOps.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Use this index to quickly access information on tasks for configuring or accessing dashboards, charts, reports, and widgets.",2025-12-19T16:00:00.000Z,overview,,0.1,False,Quick reference/index page linking to other docs; no embedded expert configuration or troubleshooting content.,unchanged https://learn.microsoft.com/en-us/azure/devops/report/dashboards/team-velocity?view=azure-devops,View/configure velocity,View and configure team velocity - Azure 
DevOps,Configure and interpret team velocity reports in Azure DevOps,Learn how to calculate and track team velocity across sprints using the in-context Analytics report or Velocity widget chart in Azure DevOps.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Velocity metrics provide valuable insights that help teams plan and forecast sprints, and evaluate how accurately they estimate and meet planned commitments. These metrics indicate how much work a team can complete during a sprint, based on either the count of work items completed or the sum of estimates for effort (product backlog items), story points (user stories), or size (requirements). Use velocity to aid in determining team ",2025-12-01T22:03:00.000Z,tutorial,configuration,0.7,True,Covers configuration of in-context Analytics report and Velocity widget; includes product-specific settings and usage patterns.,unchanged @@ -1024,8 +1022,8 @@ https://learn.microsoft.com/en-us/azure/devops/server/release-notes/azuredevops2 https://learn.microsoft.com/en-us/azure/devops/server/release-notes/azuredevops2022?view=azure-devops,RTW Release Notes,Azure DevOps Server 2022 Release Notes - Azure DevOps Server & TFS,,Azure DevOps Server 2022 Release Notes,"|Developer Community|System Requirements and Compatibility|License Terms|DevOps Blog|SHA-256 Hashes| In this article, you will find information regarding the newest release for Azure DevOps Server. To learn more about installing or upgrading an Azure DevOps Server deployment, see Azure DevOps Server Requirements. To download Azure DevOps Server products, visit the Azure DevOps Server Downloads page. 
Direct upgrade to Azure DevOps Server 2022 is supported from Azure DevOps Server 2019 or Team Found",2025-10-30T22:08:00.000Z,release-notes,,0.25,False,"Release notes for Azure DevOps Server 2022; summary does not show structured limits, configuration, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/devops/server/release-notes/azuredevops2022u1?view=azure-devops,Update 1 Release Notes,Azure DevOps Server 2022 Update 1 Release Notes - Azure DevOps Server & TFS,,Azure DevOps Server 2022 Update 1 Release Notes,"|Developer Community|System Requirements and Compatibility|License Terms|DevOps Blog|SHA-256 Hashes| In this article, you will find information regarding the newest release for Azure DevOps Server. To learn more about installing or upgrading an Azure DevOps Server deployment, see Azure DevOps Server Requirements. To download Azure DevOps Server products, visit the Azure DevOps Server Downloads page. Direct upgrade to Azure DevOps Server 2022 Update 1 is supported from Azure DevOps Server 2019 or T",2025-10-30T22:08:00.000Z,release-notes,,0.25,False,"Release notes for a specific update; primarily change log content, not fitting the defined sub-skill categories.",unchanged https://learn.microsoft.com/en-us/azure/devops/server/release-notes/azuredevops2022u2?view=azure-devops,Update 2 Release Notes,Azure DevOps Server 2022 Update 2 Release Notes - Azure DevOps Server & TFS,,Azure DevOps Server 2022 Update 2 Release Notes,"|Developer Community|System Requirements and Compatibility|License Terms|DevOps Blog|SHA-256 Hashes| In this article, you will find information regarding the newest release for Azure DevOps Server. To learn more about installing or upgrading an Azure DevOps Server deployment, see Azure DevOps Server Requirements. To download Azure DevOps Server products, visit the Azure DevOps Server Downloads page. 
Direct upgrade to Azure DevOps Server 2022 Update 2 is supported from Azure DevOps Server 2019 or T",2026-02-10T08:00:00.000Z,release-notes,,0.25,False,"Release notes for a specific update; summary does not indicate structured limits, configuration references, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/devops/server/release-notes/azuredevopsserver-sha?view=azure-devops,SHA-256 Values,Azure DevOps Server SHA-256 Hashes - Azure DevOps,,Describes the Server SHA-256 values used to compare the expected hash value of your download to verify your download integrity.,"You can get the file hash for your download by running the following PowerShell cmdlet. Get-FileHash -Path 'W:\temp\AzureDevOps.iso' -Algorithm SHA256 You can then compare the value that you get to the expected hash provided on this page to verify its integrity. If the hash doesn't match, your download may be corrupted and you should download the file again. Note The hashes provided on this page only apply to downloads obtained from Azure DevOps Server Downloads.",2026-04-14T18:38:00.000Z,reference,,0.3,False,"Describes how to compute and compare SHA-256 hashes and then lists expected hash values for specific Azure DevOps Server downloads. While the hashes are specific, they are static verification data rather than reusable expert knowledge for an AI skill (no limits, configs, error codes, or decision guidance).",updated -https://learn.microsoft.com/en-us/azure/devops/server/release-notes/azuredevopsserver?view=azure-devops,RTW Release Notes,Azure DevOps Server Release Notes - Azure DevOps Server & TFS,,Azure DevOps Server Release Notes,"|Developer Community|System Requirements and Compatibility|License Terms|DevOps Blog|SHA-256 Hashes| In this article, you will find information regarding the newest release for Azure DevOps Server. To learn more about installing or upgrading an Azure DevOps Server deployment, see Azure DevOps Server Requirements. 
To download Azure DevOps Server products, visit the Azure DevOps Server Downloads page. Direct upgrade to Azure DevOps Server is supported from Azure DevOps Server 2019 or Team Foundation",2026-04-14T08:00:00.000Z,release-notes,,0.2,False,"High-level release notes landing page pointing to other content (system requirements, downloads, blog). The summary does not show concrete limits, configuration tables, error codes, or decision matrices; it mainly serves as navigation/overview.",updated +https://learn.microsoft.com/en-us/azure/devops/server/release-notes/azuredevopsserver-sha?view=azure-devops,SHA-256 Values,Azure DevOps Server SHA-256 Hashes - Azure DevOps,,Describes the Server SHA-256 values used to compare the expected hash value of your download to verify your download integrity.,"You can get the file hash for your download by running the following PowerShell cmdlet. Get-FileHash -Path 'W:\temp\AzureDevOps.iso' -Algorithm SHA256 You can then compare the value that you get to the expected hash provided on this page to verify its integrity. If the hash doesn't match, your download may be corrupted and you should download the file again. Note The hashes provided on this page only apply to downloads obtained from Azure DevOps Server Downloads.",2026-04-14T18:38:00.000Z,reference,,0.3,False,"Describes how to compute and compare SHA-256 hashes and then lists expected hash values for specific Azure DevOps Server downloads. 
While the hashes are specific, they are static verification data rather than reusable expert knowledge for an AI skill (no limits, configs, error codes, or decision guidance).",unchanged +https://learn.microsoft.com/en-us/azure/devops/server/release-notes/azuredevopsserver?view=azure-devops,RTW Release Notes,Azure DevOps Server Release Notes - Azure DevOps Server & TFS,,Azure DevOps Server Release Notes,"|Developer Community|System Requirements and Compatibility|License Terms|DevOps Blog|SHA-256 Hashes| In this article, you will find information regarding the newest release for Azure DevOps Server. To learn more about installing or upgrading an Azure DevOps Server deployment, see Azure DevOps Server Requirements. To download Azure DevOps Server products, visit the Azure DevOps Server Downloads page. Direct upgrade to Azure DevOps Server is supported from Azure DevOps Server 2019 or Team Foundation",2026-04-14T08:00:00.000Z,release-notes,,0.2,False,"High-level release notes landing page pointing to other content (system requirements, downloads, blog). The summary does not show concrete limits, configuration tables, error codes, or decision matrices; it mainly serves as navigation/overview.",unchanged https://learn.microsoft.com/en-us/azure/devops/server/release-notes/tfs?view=azure-devops,Team Foundation Server,TFS Release Notes - Azure DevOps Server & TFS,,"TFS (Team Foundation Server) release notes for 2015, 2017, and 2018. In 2018, Microsoft changed the name TFS to Azure DevOps Services.",Note Access to this page requires authorization. You can try signing in or changing directories. Access to this page requires authorization. 
You can try changing directories.,2025-10-30T22:08:00.000Z,release-notes,,0.1,False,"Access-controlled TFS release notes index; from the visible summary, it is a navigation/change-log page without clear expert-knowledge structures.",unchanged https://learn.microsoft.com/en-us/azure/devops/server/requirements?view=azure-devops-server,Requirements,Setup & upgrade requirements - Azure DevOps Server,Verify Azure DevOps Server setup and upgrade requirements,"Learn about hardware, operating systems, SQL Server requirements to install Azure DevOps Server.","Azure DevOps Server |Azure DevOps Server |Azure DevOps Server 2022 | Azure DevOps Server 2020 Prior to installing or upgrading an Azure DevOps deployment, review the requirements provided in this article. In addition to these requirements, review the following articles as well:",2025-12-09T18:31:00.000Z,install-set-up-deploy,deployment,0.7,True,"Details supported OS, SQL versions, and hardware requirements for install/upgrade; these are product-specific deployment constraints and matrices.",unchanged https://learn.microsoft.com/en-us/azure/devops/server/tfs-is-now-azure-devops-server?view=azure-devops,TFS is now Azure DevOps Server,TFS rebrand to Azure DevOps Server - Azure DevOps,,"Azure DevOps Server, formerly named Visual Studio Team Services, introduces features from the cloud-hosted Azure DevOps Services.","Azure DevOps Server |Azure DevOps Server |Azure DevOps Server 2022 | Azure DevOps Server 2020 Following the rebrand of Visual Studio Team Services (VSTS) as Azure DevOps Services, Microsoft rebranded Visual Studio Team Foundation Server (TFS) as Azure DevOps Server 2019 and up. This new release brings many of the newest features from the cloud-hosted Azure DevOps Services into the on-premises server product. Azure DevOps Services was formerly called Visual Studio Team Services. 
For more information, se",2025-10-30T22:08:00.000Z,concept-article,,0.1,False,"Rebranding/overview content without technical limits, configs, or decision matrices.",unchanged @@ -1040,8 +1038,8 @@ https://learn.microsoft.com/en-us/azure/devops/server/upgrade/get-started?view=a https://learn.microsoft.com/en-us/azure/devops/server/upgrade/pre-production?view=azure-devops-server,Pre-production,Do a pre-production upgrade - Azure DevOps Server & TFS,Run a pre-production Azure DevOps Server upgrade dry run,Do a dry run of your upgrade in pre-production to avoid surprises in production.,Azure DevOps Server |Azure DevOps Server |Azure DevOps Server 2022 | Azure DevOps Server 2020,2025-10-30T22:08:00.000Z,upgrade-and-migration-article,deployment,0.65,True,Guides setting up and executing a pre-production upgrade to validate process and catch issues; product-specific deployment/upgrade practice.,unchanged https://learn.microsoft.com/en-us/azure/devops/server/upgrade/pre-upgrade?view=azure-devops-server,Pre-upgrade,Use TfsPreUpgrade - Azure DevOps Server & TFS,Use TfsPreUpgrade to speed TFS upgrades,Use TfsPreUpgrade to reduce the time required to upgrade TFS 2013 to TFS 2015,Azure DevOps Server |Azure DevOps Server |Azure DevOps Server 2022 | Azure DevOps Server 2020,2025-10-30T22:08:00.000Z,upgrade-and-migration-article,deployment,0.65,True,Describes using TfsPreUpgrade tool to reduce upgrade time from TFS 2013 to 2015; product-specific pre-upgrade deployment optimization.,unchanged https://learn.microsoft.com/en-us/azure/devops/server/whats-new?view=azure-devops-server,What's new?,What's new for Azure DevOps Server - Azure DevOps,,Your guide to new features that support DevOps made available with Azure DevOps Server,"Azure DevOps Server |Azure DevOps Server |Azure DevOps Server 2022 | Azure DevOps Server 2020 You can use Azure DevOps Server, previously named Visual Studio Team Foundation Server (TFS), to manage your product lifecycle, reduce risks, and improve team 
efficiency. Updates are made every few weeks to the cloud-hosted version, Azure DevOps Services. These updates are then rolled up and made available through quarterly updates to the on-premises Azure DevOps Server and TFS. To understand the differ",2025-10-30T22:08:00.000Z,whats-new,,0.3,False,"High-level 'what's new' overview for Azure DevOps Server; primarily marketing/feature summary without detailed limits, configuration, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/devops/test/?view=azure-devops,Azure Test Plans >>,Azure Test Plans documentation - Azure DevOps,,"Plan, run, and track manual and exploratory tests, collect stakeholder feedback, and review automated test results with Azure Test Plans.","Plan, execute, and track manual, exploratory, and automated tests to ship high-quality software with Azure DevOps.",2026-04-17T21:04:00Z,landing-page,,0.1,False,"Landing/overview page for Azure Test Plans documentation with navigation and high-level description; does not expose specific limits, configuration tables, error codes, or other expert-only technical details.",updated -https://learn.microsoft.com/en-us/azure/devops/troubleshoot/?view=azure-devops,Troubleshooting & FAQs >>,Troubleshooting and FAQs - Azure DevOps,,"Find troubleshooting guides and FAQs for connections, pipelines, security, notifications, migrations, and more in Azure DevOps.",Find troubleshooting guides and FAQs for common issues across Azure DevOps.,2026-03-14T23:28:00Z,landing-page,,0.3,False,Top-level troubleshooting and FAQ hub that links to other guides; the page itself is unlikely to contain specific error-code-to-solution mappings.,unchanged +https://learn.microsoft.com/en-us/azure/devops/test/?view=azure-devops,Azure Test Plans >>,Azure Test Plans documentation - Azure DevOps,,"Plan, run, and track manual and exploratory tests, collect stakeholder feedback, and review automated test results with Azure Test Plans.","Plan, execute, and track manual, 
exploratory, and automated tests to ship high-quality software with Azure DevOps.",2026-04-17T21:04:00Z,landing-page,,0.1,False,"Landing/overview page for Azure Test Plans documentation with navigation and high-level description; does not expose specific limits, configuration tables, error codes, or other expert-only technical details.",unchanged +https://learn.microsoft.com/en-us/azure/devops/troubleshoot/?view=azure-devops,Troubleshooting & FAQs >>,Troubleshooting and FAQs - Azure DevOps,,"Find troubleshooting guides and FAQs for connections, pipelines, security, notifications, migrations, and more in Azure DevOps.",Find troubleshooting guides and FAQs for common issues across Azure DevOps.,2026-04-22T21:02:00Z,landing-page,,0.2,False,"Landing page that aggregates troubleshooting guides and FAQs across Azure DevOps. It does not itself contain specific error codes, commands, or symptom-to-solution mappings; it just links to other resources.",updated https://learn.microsoft.com/en-us/previous-versions/azure/devops/reference/xml/add-or-modify-work-item-fields-to-support-reporting?view=tfs-2017,Change reporting attributes,Change reporting attributes - Azure DevOps Server,Change Azure DevOps work item field reporting attributes,Customize which fields appear in the relational warehouse or cube to support reporting for Team Foundation Server,"Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS 2018 - TFS 2013 Important This article applies to project customization for On-premises XML process models. For you to view reports, you must have configured your on-premises Azure DevOps Server and project to support reporting. See Add reports to a project. For an overview of process models and customization options, see Customize your work tracking experience. 
You use work item fields to track data for a work item type, to define the filt",2022-03-01T00:00:00.000Z,how-to,configuration,0.75,True,"Describes how to configure which fields appear in the warehouse/cube, including field attributes; this is product-specific configuration behavior.",unchanged https://learn.microsoft.com/en-us/previous-versions/azure/devops/reference/xml/link-param-xml-elements-reference?view=tfs-2017,Link and Param,Link and Param XML elements reference - Azure DevOps & TFS,Configure Link and Param XML elements for Azure DevOps forms,"Adds a hyperlink to a field or a standalone label on a work item form by using the Link element, Team Foundation Server","Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS 2018 - TFS 2013 You can add a hyperlink to a field or a standalone label on a work item form by using the Link element. You use the Link element in the following instances to: To add elements to a form, you modify the definition for a work item type. See Modify or add a custom work item type. The Link element is either a child element of the Control element, or a child element of the WebpageControlTarget or WebpageControlOptions elements. For more inf",2017-04-05T00:00:00.000Z,reference,configuration,0.9,True,"Detailed XML reference for Link and Param elements, including where they can appear in the form definition, is product-specific configuration with concrete element names and structure.",unchanged https://learn.microsoft.com/en-us/previous-versions/azure/devops/reference/xml/reportable-fields-reference?view=tfs-2017,Reportable fields reference,Reportable fields reference - Azure DevOps,Default reportable fields for Azure DevOps warehouse and cube,Default set of fields that appear in the relational warehouse database or the cube,"Azure DevOps Server 2022 - Azure DevOps Server 2019 | TFS 2018 - TFS 2013 Important This topic applies to project customization for On-premises XML process models. 
For you to view SQL Server reports, you must have configured your Azure DevOps Server or Team Foundation Server (TFS) and project to support reporting. See Add reports to a project. For an overview of process models and customization options, see Customize your work tracking experience. A default set of fields appears in the relational ",2019-02-01T00:00:00.000Z,reference,configuration,0.75,True,Lists the default set of fields that appear in the relational warehouse/cube; effectively a configuration reference of field names and reporting attributes.,unchanged diff --git a/products/azure-devops/report.md b/products/azure-devops/report.md index 048a852a..15499fc2 100644 --- a/products/azure-devops/report.md +++ b/products/azure-devops/report.md @@ -1,27 +1,26 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: architecture-patterns: 'Architectural guidance for Azure DevOps/Server: pool architecture, reliability/DR, SQL/database dependencies, and design patterns for simple to complex multi-server topologies and analytics modeling.' - configuration: 'Configuring Azure DevOps/Server: Managed DevOps Pools, notifications, - Boards/work items, Analytics/OData/Power BI, dashboards, search, backups, networking, - email/SMTP, and server admin settings.' + configuration: 'Configuring Azure DevOps/Server: pools, agents, networking, notifications, + work items, dashboards, Analytics, backups, SQL, services, and server/admin settings.' security: 'Managing Azure DevOps security: identities, auth, permissions, access levels, groups, auditing, project/repo/pipeline rights, server service accounts, SSL, and download integrity.' - troubleshooting: 'Diagnosing and fixing Azure DevOps issues: Managed DevOps Pools, - performance, email notifications, connectivity/allowlists, permissions, dashboards/Analytics, - wikis restore, and upgrade failures.' 
+ troubleshooting: Troubleshooting Azure DevOps connectivity, performance, permissions, + notifications, wikis, analytics, Managed DevOps Pools, VS sign-in/setup, and server/project + collection upgrades. + limits-quotas: Limits, quotas, and constraints for Azure DevOps orgs, projects, + naming, work tracking, wikis, dashboards, pipelines, analytics, and managed pools, + plus related retention and notification behaviors best-practices: 'Guidance on optimizing Azure DevOps performance, analytics, and reporting: cost-efficient pools, fast OData queries, Power BI reports, dashboards, and data cleanup/maintenance.' decision-making: 'Guidance on Azure DevOps architectural choices: org/project/team structure, work tracking and wikis, analytics/reporting, cost and topology of agents and Azure DevOps Server deployments.' - limits-quotas: Limits, quotas, and behaviors for Azure DevOps orgs, projects, naming, - access levels, work tracking, dashboards, wikis, pipelines, and Analytics data - availability/latency. integrations: Integrating Azure DevOps with tools (VS, SIEM, notifications, clients) and building Analytics/OData- and Power BI–based reports for work items, pipelines, and test/requirements metrics. @@ -31,13 +30,12 @@ category_descriptions: skill_description: Expert knowledge for Azure DevOps development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when - managing Boards/work items, pipelines, repos, Analytics/OData/Power BI, or Azure - DevOps Server deployments, and other Azure DevOps related development tasks. Not - for Azure Boards (use azure-boards), Azure Pipelines (use azure-pipelines), Azure - Repos (use azure-repos), Azure Test Plans (use azure-test-plans). 
-use_when: Use when managing Boards/work items, pipelines, repos, Analytics/OData/Power - BI, or Azure DevOps Server deployments, and other Azure DevOps related development - tasks. + managing org/projects, repos, pipelines, agents/pools, work items/boards, or Analytics/Power + BI, and other Azure DevOps related development tasks. Not for Azure Boards (use + azure-boards), Azure Pipelines (use azure-pipelines), Azure Repos (use azure-repos), + Azure Test Plans (use azure-test-plans). +use_when: Use when managing org/projects, repos, pipelines, agents/pools, work items/boards, + or Analytics/Power BI, and other Azure DevOps related development tasks. confusable_not_for: Not for Azure Boards (use azure-boards), Azure Pipelines (use azure-pipelines), Azure Repos (use azure-repos), Azure Test Plans (use azure-test-plans). --- @@ -45,17 +43,17 @@ confusable_not_for: Not for Azure Boards (use azure-boards), Azure Pipelines (us ## Summary -- **Total Pages**: 955 -- **Fetched**: 955 +- **Total Pages**: 953 +- **Fetched**: 953 - **Fetch Failed**: 0 -- **Classified**: 259 -- **Unclassified**: 696 +- **Classified**: 256 +- **Unclassified**: 697 ### Incremental Update -- **New Pages**: 5 -- **Updated Pages**: 9 -- **Unchanged**: 941 -- **Deleted Pages**: 0 +- **New Pages**: 1 +- **Updated Pages**: 13 +- **Unchanged**: 939 +- **Deleted Pages**: 3 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-devops/azure-devops.csv` ## Classification Statistics @@ -67,42 +65,52 @@ confusable_not_for: Not for Azure Boards (use azure-boards), Azure Pipelines (us | configuration | 75 | 7.9% | | decision-making | 12 | 1.3% | | deployment | 29 | 3.0% | -| integrations | 42 | 4.4% | -| limits-quotas | 11 | 1.2% | -| security | 57 | 6.0% | -| troubleshooting | 14 | 1.5% | -| *(Unclassified)* | 696 | 72.9% | +| integrations | 41 | 4.3% | +| limits-quotas | 12 | 1.3% | +| security | 56 | 5.9% | +| troubleshooting | 12 | 1.3% | +| *(Unclassified)* | 697 | 73.1% | ## Changes ### 
New Pages -- [April 14](https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/boards/sprint-272-update) -- [April 14](https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/sprint-272-update) -- [April 14](https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/pipelines/sprint-272-update) -- [April 14](https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/ghazdo/sprint-272-update) -- [April 14](https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/general/sprint-272-update) +- [Troubleshoot notification emails](https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/troubleshoot-not-getting-email?view=azure-devops) ### Updated Pages -- [Pricing](https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/pricing?view=azure-devops) - - Updated: 2025-04-29T22:41:00.000Z → 2026-04-17T21:04:00.000Z +- [FAQs](https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/faq-notifications?view=azure-devops) + - Updated: 2025-07-17T19:00:00Z → 2026-04-22T21:02:00Z +- [Troubleshooting](https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/troubleshooting?view=azure-devops) + - Updated: 2025-05-21T20:49:00.000Z → 2026-04-20T08:00:00.000Z - [Frequently asked questions](https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/faq?view=azure-devops) - - Updated: 2025-08-01T21:25:00.000Z → 2026-04-17T21:04:00.000Z -- [Azure Test Plans >>](https://learn.microsoft.com/en-us/azure/devops/test/?view=azure-devops) - - Updated: 2026-04-03T21:03:00Z → 2026-04-17T21:04:00Z -- [Released features](https://learn.microsoft.com/en-us/azure/devops/release-notes/features-timeline-released) - - Updated: 2026-03-31T08:00:00.000Z → 2026-04-14T08:00:00.000Z -- [Analytics-based widgets](https://learn.microsoft.com/en-us/azure/devops/report/dashboards/analytics-widgets?view=azure-devops) - - Updated: 2025-10-27T22:02:00.000Z → 2026-04-14T01:03:00.000Z -- [Add Markdown to a 
dashboard](https://learn.microsoft.com/en-us/azure/devops/report/dashboards/add-markdown-to-dashboard?view=azure-devops) - - Updated: 2025-02-12T21:31:00.000Z → 2026-04-14T01:03:00.000Z -- [Stakeholder access quick reference](https://learn.microsoft.com/en-us/azure/devops/organizations/security/stakeholder-access?view=azure-devops) - - Updated: 2025-05-19T16:40:00.000Z → 2026-04-14T01:03:00.000Z -- [RTW Release Notes](https://learn.microsoft.com/en-us/azure/devops/server/release-notes/azuredevopsserver?view=azure-devops) - - Updated: 2026-03-13T08:00:00.000Z → 2026-04-14T08:00:00.000Z -- [SHA-256 Values](https://learn.microsoft.com/en-us/azure/devops/server/release-notes/azuredevopsserver-sha?view=azure-devops) - - Updated: 2026-03-13T21:18:00.000Z → 2026-04-14T18:38:00.000Z + - Updated: 2026-04-17T21:04:00.000Z → 2026-04-22T21:02:00.000Z +- [Create and configure an organization](https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-configure-customize-organization?view=azure-devops) + - Updated: 2026-02-24T02:03:00Z → 2026-04-22T21:02:00Z +- [Manage users and permissions](https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-user-and-permissions-management?view=azure-devops) + - Updated: 2026-03-24T21:04:00Z → 2026-04-24T22:42:00Z +- [Access via Microsoft Entra](https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-azure-access?view=azure-devops) + - Updated: 2026-01-27T18:16:00Z → 2026-04-24T22:42:00Z +- [Set up Visual Studio](https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-set-up-vs?view=azure-devops) + - Updated: 2024-08-12T16:52:00Z → 2026-04-22T21:02:00Z +- [Get started as a Stakeholder](https://learn.microsoft.com/en-us/azure/devops/organizations/security/get-started-stakeholder?view=azure-devops) + - Updated: 2025-12-22T08:00:00.000Z → 2026-04-24T22:42:00.000Z +- [Remote MCP Server 
(preview)](https://learn.microsoft.com/en-us/azure/devops/mcp-server/remote-mcp-server?view=azure-devops) + - Updated: 2026-03-31T20:43:00.000Z → 2026-04-21T20:06:00.000Z +- [Troubleshooting & FAQs >>](https://learn.microsoft.com/en-us/azure/devops/troubleshoot/?view=azure-devops) + - Updated: 2026-03-14T23:28:00Z → 2026-04-22T21:02:00Z +- [Roadmap](https://learn.microsoft.com/en-us/azure/devops/release-notes/features-timeline) + - Updated: 2026-01-22T22:02:00.000Z → 2026-04-02T08:00:00.000Z +- [FAQs](https://learn.microsoft.com/en-us/azure/devops/report/dashboards/faqs?view=azure-devops) + - Updated: 2025-12-19T16:00:00Z → 2026-04-22T21:02:00Z +- [Troubleshoot permissions](https://learn.microsoft.com/en-us/azure/devops/organizations/security/troubleshoot-permissions?view=azure-devops) + - Updated: 2025-12-04T14:04:00.000Z → 2026-04-20T08:00:00.000Z + +### Deleted Pages + +- ~~Why are my emails delayed~~ (https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/troubleshoot-delayed-email?view=azure-devops) +- ~~Why am I not getting an email~~ (https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/troubleshoot-not-getting-email?view=azure-devops) +- ~~Why am I getting this email~~ (https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/troubleshoot-unexpected-email?view=azure-devops) ## Classified Pages @@ -112,14 +120,15 @@ confusable_not_for: Not for Azure Boards (use azure-boards), Azure Pipelines (us | [ProcessConfiguration XML reference](https://learn.microsoft.com/en-us/azure/devops/reference/xml/process-configuration-xml-element?view=azure-devops-server) | configuration | 0.90 | Explicit XML syntax and usage for ProcessConfiguration elements (fields, columns, mappings) is a structured configuration reference with element/attribute names and allowed values. 
| | [Set Analytics permissions (Security)](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/analytics-security?view=azure-devops) | security | 0.90 | Explains View analytics permission, default assignments to Contributors with Basic access, and Stakeholder restrictions; includes specific permission names and security handling. | | [Set up security policies](https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/change-application-access-policies?view=azure-devops) | security | 0.90 | Manages Conditional Access, OAuth, SSH, and PAT policies; includes product-specific security settings and supported/unsupported auth methods. | -| [Troubleshooting](https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/troubleshooting?view=azure-devops) | troubleshooting | 0.90 | Explicit troubleshooting guide; expected to map specific symptoms and errors to causes and resolutions for Managed DevOps Pools. | | [Work tracking, process, & project limits](https://learn.microsoft.com/en-us/azure/devops/organizations/settings/work/object-limits?view=azure-devops) | limits-quotas | 0.90 | Explicitly documents operational and object limits for work items, queries, backlogs, boards, and customizations with specific numeric constraints. | +| [Troubleshooting](https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/troubleshooting?view=azure-devops) | troubleshooting | 0.86 | Troubleshooting article for Managed DevOps Pools; by nature these pages list concrete symptoms, likely include specific error messages, causes, and product-specific remediation steps that aren't generic debugging advice. | | [Connect using Advanced Functions](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/data-connector-functions?view=azure-devops) | integrations | 0.85 | Documents specific Power Query M functions (e.g., VSTS.AccountContents) with arguments and behavior unique to the Azure DevOps connector, matching integration/config parameter criteria. 
| | [Import, export, & manage work item types](https://learn.microsoft.com/en-us/azure/devops/reference/witadmin/witadmin-import-export-manage-wits?view=azure-devops) | configuration | 0.85 | Provides specific witadmin commands and parameters for managing work item types; this is a configuration/command reference unique to Azure DevOps. | | [Manage work item fields](https://learn.microsoft.com/en-us/azure/devops/reference/witadmin/manage-work-item-fields?view=azure-devops) | configuration | 0.85 | Details witadmin commands and field attributes for listing, deleting, and modifying work item fields; clearly a configuration reference with product-specific parameters. | | [Permissions and prerequisites](https://learn.microsoft.com/en-us/azure/devops/report/analytics/analytics-permissions-prerequisites?view=azure-devops) | security | 0.85 | Details prerequisites and permissions for Analytics access, including default access for project members and exclusion of Stakeholder users; contains concrete permission behaviors. | | [Recover organization](https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/recover-your-organization?view=azure-devops) | limits-quotas | 0.85 | Defines 28-day availability for recovery and mentions up to 70 days for backend deletion; explicit numeric retention limits. | | [witAdmin command reference](https://learn.microsoft.com/en-us/azure/devops/reference/witadmin/witadmin-customize-and-manage-objects-for-tracking-work?view=azure-devops) | configuration | 0.85 | Command-line reference for witadmin with specific commands, arguments, and behaviors for managing work item types, fields, categories, and link types is product-specific configuration detail. | +| [Access via Microsoft Entra](https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-azure-access?view=azure-devops) | security | 0.82 | FAQ about access via Microsoft Entra ID; explicitly mentions authentication methods and directory connection. 
Such content typically includes specific security settings, supported/unsupported auth methods, and directory linkage behaviors unique to Azure DevOps. | | [Resolve errors with an Analytics view](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/troubleshooting-views?view=azure-devops) | troubleshooting | 0.82 | Described as troubleshooting guidance for common Analytics view issues, including size warnings and verification errors; likely maps specific error messages/symptoms to causes and fixes, which is product-specific troubleshooting knowledge. | | [About pipeline security roles](https://learn.microsoft.com/en-us/azure/devops/organizations/security/about-security-roles?view=azure-devops) | security | 0.80 | Defines specific pipeline security roles and their allowed operations for different pipeline resources; includes product-specific role names and scopes. | | [Add time-in-state measures](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/create-timeinstate-report?view=azure-devops) | best-practices | 0.80 | Provides specific DAX formulas and patterns to compute time-in-state for Azure DevOps work items, which are product-specific code patterns and recipes. | @@ -156,14 +165,13 @@ confusable_not_for: Not for Azure Boards (use azure-boards), Azure Pipelines (us | [Test duration trend](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/sample-test-analytics-test-duration-trend?view=azure-devops) | integrations | 0.80 | Shows how to query Analytics for day-wise average test duration over a selected range using concrete OData queries, which is product-specific integration knowledge. | | [Test summary](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/sample-test-analytics-test-summary?view=azure-devops) | integrations | 0.80 | Provides concrete OData queries and entity/field usage to summarize test runs by outcome for pipelines, which is product-specific integration knowledge. 
| | [Test summary trend](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/sample-test-summary-trend?view=azure-devops) | integrations | 0.80 | Contains sample OData queries and schema details to build failed test counts and pass rate trend charts over time, which are specific to Azure DevOps test analytics. | -| [Troubleshoot permissions](https://learn.microsoft.com/en-us/azure/devops/organizations/security/troubleshoot-permissions?view=azure-devops) | troubleshooting | 0.80 | Organized as step-by-step guidance to diagnose why users lack access to projects or features, mapping symptoms to likely permission causes and resolutions; product-specific troubleshooting of security/permissions. | +| [Troubleshoot notification emails](https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/troubleshoot-not-getting-email?view=azure-devops) | troubleshooting | 0.80 | Article explicitly focuses on troubleshooting missing, delayed, or unexpected notification emails; such guides typically map symptoms (no email, delayed email) to causes (SMTP misconfiguration, subscription issues) and resolutions, often including product-specific checks and configuration steps, fitting the troubleshooting sub-skill. | | [Unattended install](https://learn.microsoft.com/en-us/azure/devops/server/install/unattended?view=azure-devops-server) | deployment | 0.80 | Documents tfsconfig /unattend and all preset configuration parameters for multi-machine installs; product-specific deployment automation configuration. | -| [Why am I getting this email](https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/troubleshoot-unexpected-email?view=azure-devops) | troubleshooting | 0.80 | Symptom-based guide (unexpected emails) mapping causes to solutions for Azure DevOps notification system. 
| -| [Why am I not getting an email](https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/troubleshoot-not-getting-email?view=azure-devops) | troubleshooting | 0.80 | Symptom-based guide (not receiving emails) with possible causes and resolutions specific to Azure DevOps notifications. | | [Wiki file structure](https://learn.microsoft.com/en-us/azure/devops/project/wiki/wiki-file-structure?view=azure-devops) | configuration | 0.80 | Details wiki Git repo conventions including .order files and folder layout; these are product-specific configuration/structure rules not generally known. | | [Configuration by outcome matrix](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/sample-test-plans-configuration-by-outcome?view=azure-devops) | integrations | 0.78 | Provides specific OData queries and schema usage to build configuration-by-outcome matrices, enabling release decisions per configuration, which is product-specific integration knowledge. | | [Duration](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/sample-pipelines-duration?view=azure-devops) | integrations | 0.78 | Shows exact OData queries and fields to compute pipeline run duration from Analytics, which is a concrete integration pattern beyond generic BI usage. | | [Duration trend](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/sample-pipelines-duration-trend?view=azure-devops) | integrations | 0.78 | Provides detailed OData query examples and column mappings to build daily duration trend charts, which are specific to Azure DevOps Analytics schema. | +| [Frequently asked questions](https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/faq?view=azure-devops) | limits-quotas | 0.78 | FAQ explicitly mentions quotas, VM SKUs, regions, and pricing; such FAQs typically enumerate concrete numeric quotas, allowed SKUs, and region-specific constraints that qualify as expert limits/quotas knowledge. 
| | [Lead/Cycle time](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/sample-boards-leadcycletime?view=azure-devops) | integrations | 0.78 | Contains specific OData queries, entities, and field selections for computing lead and cycle time from Azure DevOps Analytics, which are detailed integration patterns not generally known. | | [Outcome summary](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/sample-pipelines-outcome-summary?view=azure-devops) | integrations | 0.78 | Includes concrete OData queries against PipelineRuns and related entities plus column usage to build outcome summary reports in Power BI, which is product-specific integration detail. | | [Outcome summary for all pipelines](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/sample-pipelines-allpipelines?view=azure-devops) | integrations | 0.78 | Shows how to query Analytics for pass rate, failures, duration, etc. across all pipelines with specific OData query patterns and fields, representing detailed integration guidance. | @@ -172,11 +180,11 @@ confusable_not_for: Not for Azure Boards (use azure-boards), Azure Pipelines (us | [Requirements tracking - Rollup](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/sample-stories-overview-rollup?view=azure-devops) | integrations | 0.78 | Shows how to aggregate metrics for one-level rollups (e.g., features from stories) using specific OData queries and relationships, which is product-specific integration knowledge. | | [Stage wise failures](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/sample-pipelines-stagewise-failures?view=azure-devops) | integrations | 0.78 | Contains specific Analytics OData queries and schema usage to calculate daily stage failures for pipelines, which is detailed, product-specific integration knowledge. 
| | [Tester by outcome matrix](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/sample-test-plans-tester-by-outcome?view=azure-devops) | integrations | 0.78 | Contains concrete OData queries (v3.0-preview) and field usage to distribute test point outcomes across testers, which is detailed Azure DevOps–Power BI integration content. | +| [Troubleshoot permissions](https://learn.microsoft.com/en-us/azure/devops/organizations/security/troubleshoot-permissions?view=azure-devops) | troubleshooting | 0.78 | The page is explicitly a troubleshooting guide for Azure DevOps access and permission problems, providing step-by-step diagnosis and resolution for why a user cannot access a project, service, or feature. This is product-specific troubleshooting knowledge (symptom → investigation steps → resolution) that goes beyond generic debugging concepts. | | [Execution trends](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/sample-test-plans-execution-trend?view=azure-devops) | integrations | 0.76 | Includes OData queries and schema usage to compute outcome trends for manual test plans over time, which are detailed Azure DevOps Analytics integration patterns. | | [Pass rate trend of test](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/sample-test-analytics-pass-rate-trend-test?view=azure-devops) | integrations | 0.76 | Provides sample OData queries and field usage to compute pass rate trends for an individual test in a pipeline, which is detailed Azure DevOps Analytics integration guidance. | | [Progress status](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/sample-test-plans-progress-status?view=azure-devops) | integrations | 0.76 | Contains specific Analytics OData queries and schema usage to summarize execution state of manual test plans, which is expert, product-specific integration content. 
| | [Suite-level aggregation](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/sample-test-plans-aggregate-data-level?view=azure-devops) | integrations | 0.76 | Provides specific OData queries and hierarchical suite aggregation logic using Analytics entities, which is expert, product-specific integration guidance. | -| [Access via Microsoft Entra](https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-azure-access?view=azure-devops) | security | 0.75 | FAQ on Entra groups, adding users, connecting/switching directories; contains product-specific identity and access configuration behavior. | | [Add AD or Entra ID security groups to built-in security groups](https://learn.microsoft.com/en-us/azure/devops/organizations/security/add-ad-aad-built-in-security-groups?view=azure-devops) | security | 0.75 | Describes product-specific security group types (project vs collection), how to add Microsoft Entra/AD groups into built-in Azure DevOps groups like Contributors/Readers, and related permission behavior. This is concrete IAM configuration, not just conceptual security. | | [Add or modify a field](https://learn.microsoft.com/en-us/azure/devops/reference/add-modify-field?view=azure-devops-server) | configuration | 0.75 | Explains how to add/modify fields in the on-prem XML process model, including process-specific behavior; product-specific configuration details. | | [Change permissions at the organization or collection-level](https://learn.microsoft.com/en-us/azure/devops/organizations/security/change-organization-collection-level-permissions?view=azure-devops) | security | 0.75 | Shows concrete steps and locations for configuring organization/collection-level permissions in Azure DevOps, tied to specific roles and scopes; product-specific security configuration. 
| @@ -256,6 +264,7 @@ confusable_not_for: Not for Azure Boards (use azure-boards), Azure Pipelines (us | [Manage OAuth apps](https://learn.microsoft.com/en-us/azure/devops/organizations/settings/manage-authorizations?view=azure-devops) | security | 0.70 | Covers Azure DevOps OAuth-based authorizations, including product-specific security behavior and configuration details for granting other services access. This is security-focused configuration for access to resources, beyond generic OAuth concepts. | | [Manage TFVC file types](https://learn.microsoft.com/en-us/azure/devops/server/admin/manage-file-types?view=azure-devops) | configuration | 0.70 | Explains file type definitions, merge behavior, and multiple checkout settings; product-specific configuration of version control behavior. | | [Manage cost and performance](https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/manage-costs?view=azure-devops) | best-practices | 0.70 | Describes how to tune pool performance vs. cost; likely includes product-specific recommendations (e.g., instance sizes, scaling behaviors) and configuration guidance unique to Managed DevOps Pools. | +| [Manage users and permissions](https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-user-and-permissions-management?view=azure-devops) | security | 0.70 | User and permissions management FAQ likely lists specific Azure DevOps access levels, permission scopes, and possibly role names and requirements; description also highlights security-related token guidance, fitting product-specific security configuration. | | [Manage users or groups](https://learn.microsoft.com/en-us/azure/devops/organizations/security/add-remove-manage-user-group-security-group?view=azure-devops) | security | 0.70 | Operational guide for adding/removing users and groups, using default/custom groups; includes specific group names and permission implications. 
| | [Markdown guidance](https://learn.microsoft.com/en-us/azure/devops/project/wiki/markdown-guidance?view=azure-devops) | configuration | 0.70 | Documents Azure DevOps-supported Markdown syntax and behaviors across wikis, dashboards, and PRs; includes product-specific formatting features beyond generic Markdown knowledge. | | [Move or clone to new hardware](https://learn.microsoft.com/en-us/azure/devops/server/admin/move-clone-hardware?view=azure-devops-server) | deployment | 0.70 | Restoration-based move/clone procedure with Azure DevOps-specific steps and constraints for preserving project history, which are deployment patterns. | @@ -266,7 +275,7 @@ confusable_not_for: Not for Azure Boards (use azure-boards), Azure Pipelines (us | [Pricing](https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/pricing?view=azure-devops) | decision-making | 0.70 | Pricing guidance for Managed DevOps Pools typically includes concrete cost components, rate structures, and how Azure DevOps parallel job pricing combines with underlying Azure services. This is product-specific decision guidance to project and optimize costs, fitting the decision-making category more than generic pricing marketing. | | [Project & organization-scoped queries](https://learn.microsoft.com/en-us/azure/devops/report/extend-analytics/account-scoped-queries?view=azure-devops) | integrations | 0.70 | Explains constructing project- and organization-scoped OData queries; includes scope parameters and patterns specific to Azure DevOps Analytics. | | [Provide help text, hyperlinks, or web content](https://learn.microsoft.com/en-us/azure/devops/reference/xml/provide-help-text-hyperlinks-web-content-form?view=azure-devops-server) | configuration | 0.70 | Page is about specific form controls for Azure DevOps work item forms (tooltip, text, hyperlink, HTML/web content). This usually includes control/element names, attributes, and usage constraints that are product-specific configuration details. 
| -| [Remote MCP Server (preview)](https://learn.microsoft.com/en-us/azure/devops/mcp-server/remote-mcp-server?view=azure-devops) | configuration | 0.70 | Page is a how-to for setting up the remote Azure DevOps MCP Server using streamable HTTP transport. It likely includes concrete endpoint URLs, required headers, configuration parameters, and possibly auth settings specific to this preview service. These are product-specific configuration details that an LLM would not reliably know from training. | +| [Remote MCP Server (preview)](https://learn.microsoft.com/en-us/azure/devops/mcp-server/remote-mcp-server?view=azure-devops) | configuration | 0.70 | Page is a how-to for setting up the remote MCP Server with streamable HTTP transport. It likely includes concrete endpoint URLs, required headers, authentication parameters, and configuration options specific to the Azure DevOps-hosted MCP Server, which are product-specific details not inferable from general knowledge. | | [Request change in permissions](https://learn.microsoft.com/en-us/azure/devops/organizations/security/request-changes-permissions?view=azure-devops) | security | 0.70 | Page is a concrete, product-specific guide for resolving insufficient-permission messages by mapping specific tasks and error conditions to required Azure DevOps roles/permissions and the process to request elevation. This is detailed security/permission configuration behavior rather than generic concepts. | | [Requirements](https://learn.microsoft.com/en-us/azure/devops/server/requirements?view=azure-devops-server) | deployment | 0.70 | Details supported OS, SQL versions, and hardware requirements for install/upgrade; these are product-specific deployment constraints and matrices. 
| | [Restore a deployment to new hardware (advanced)](https://learn.microsoft.com/en-us/azure/devops/server/admin/backup/tut-single-svr-home?view=azure-devops-server) | deployment | 0.70 | Tutorial for restoring a TFS deployment using backups and installation media; includes ordered deployment steps and requirements unique to this product. | @@ -275,6 +284,7 @@ confusable_not_for: Not for Azure Boards (use azure-boards), Azure Pipelines (us | [Security automation scripts](https://learn.microsoft.com/en-us/azure/devops/organizations/security/security-scripts?view=azure-devops) | security | 0.70 | Provides concrete PowerShell scripts and parameters for auditing access, service connections, and dependencies; product-specific security automation patterns. | | [Send notifications to third-party services (Slack, Teams)](https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/integrate-third-party-services?view=azure-devops) | integrations | 0.70 | Covers service hooks and messaging app integrations (Slack, Teams, etc.) with product-specific integration patterns and actions. | | [Service accounts & dependencies](https://learn.microsoft.com/en-us/azure/devops/server/admin/service-accounts-dependencies?view=azure-devops) | security | 0.70 | Details built-in services and service accounts, their roles, and deployment implications; product-specific identity and security configuration. | +| [Set up Visual Studio](https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-set-up-vs?view=azure-devops) | troubleshooting | 0.70 | FAQ focused on problems installing Visual Studio, signing in, and handling expired subscriptions; likely organized around concrete error conditions and resolutions specific to Azure DevOps integration, matching troubleshooting criteria. 
| | [Set up secure sockets layer](https://learn.microsoft.com/en-us/azure/devops/server/admin/setup-secure-sockets-layer?view=azure-devops-server) | security | 0.70 | Explains how to enable/require HTTPS with SSL, including product-specific bindings and configuration steps; clearly a security configuration topic. | | [Set work tracking & plan permissions](https://learn.microsoft.com/en-us/azure/devops/organizations/security/set-permissions-access-work-tracking?view=azure-devops) | security | 0.70 | Provides concrete recommendations (e.g., use Contributors group) and describes object vs project-level permissions and custom rules; product-specific security configuration patterns. | | [Stakeholder access quick reference](https://learn.microsoft.com/en-us/azure/devops/organizations/security/stakeholder-access?view=azure-devops) | security | 0.70 | Page defines the exact capabilities and restrictions of the Stakeholder access level (for example, no access to code repositories and need at least Basic access to contribute to code), which are product-specific permission details that function as RBAC-like security configuration knowledge. | @@ -288,7 +298,6 @@ confusable_not_for: Not for Azure Boards (use azure-boards), Azure Pipelines (us | [View the OData query of a built-in report](https://learn.microsoft.com/en-us/azure/devops/report/extend-analytics/view-odata-query-analytics-report?view=azure-devops) | integrations | 0.70 | Shows how to view and reuse the exact OData queries behind built-in Analytics reports/widgets, which is a product-specific integration technique. | | [View/configure CFD](https://learn.microsoft.com/en-us/azure/devops/report/dashboards/cumulative-flow?view=azure-devops) | configuration | 0.70 | Explains configuring CFD reports and their options; includes Azure DevOps-specific report configuration and possibly constraints. 
| | [View/configure velocity](https://learn.microsoft.com/en-us/azure/devops/report/dashboards/team-velocity?view=azure-devops) | configuration | 0.70 | Covers configuration of in-context Analytics report and Velocity widget; includes product-specific settings and usage patterns. | -| [Why are my emails delayed](https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/troubleshoot-delayed-email?view=azure-devops) | troubleshooting | 0.70 | Troubleshooting delayed notifications using notification statistics; symptom → diagnosis → resolution pattern. | | [Work tracking (Azure Boards)](https://learn.microsoft.com/en-us/azure/devops/report/analytics/entity-reference-boards?view=azure-devops) | configuration | 0.70 | Lists properties and enumerated types for Boards Analytics entities; this is detailed schema/configuration metadata not inferable from general knowledge. | | [4-Reconnect services and users](https://learn.microsoft.com/en-us/azure/devops/server/admin/backup/tut-single-svr-reconn-svcs-users?view=azure-devops-server) | deployment | 0.65 | Explains starting project collections, verifying groups, and clearing server caches after restore; product-specific post-deployment steps. | | [About access levels](https://learn.microsoft.com/en-us/azure/devops/organizations/security/access-levels?view=azure-devops) | security | 0.65 | Describes access level types, what features they unlock, and how to assign them; includes specific access level names and constraints. | @@ -306,14 +315,12 @@ confusable_not_for: Not for Azure Boards (use azure-boards), Azure Pipelines (us | [Download release permissions report](https://learn.microsoft.com/en-us/azure/devops/organizations/security/download-permissions-report-release?view=azure-devops) | security | 0.65 | Describes a product-specific mechanism to generate a JSON permissions report for a release, including how inherited permissions are represented; concrete security diagnostics feature. 
| | [Download repository permissions report](https://learn.microsoft.com/en-us/azure/devops/organizations/security/download-permissions-report?view=azure-devops) | security | 0.65 | Describes obtaining effective permissions for users/groups on a repo; product-specific security/permissions reporting behavior. | | [Export user list](https://learn.microsoft.com/en-us/azure/devops/organizations/security/export-users-audit-log?view=azure-devops) | security | 0.65 | Deals with access levels (stakeholder, basic, advanced, VS Enterprise) which are product-specific permission tiers. | -| [FAQs](https://learn.microsoft.com/en-us/azure/devops/report/dashboards/faqs?view=azure-devops) | troubleshooting | 0.65 | FAQ for dashboards and charts typically includes symptom-based Q&A and product-specific resolutions, qualifying as troubleshooting guidance. | | [Get started with a new install](https://learn.microsoft.com/en-us/azure/devops/server/install/get-started?view=azure-devops-server) | decision-making | 0.65 | Guides choosing between single, dual, and multi-server deployments with hardware recommendation references and scenario-based guidance, helping decide deployment approach rather than just how-to steps. | | [How email recipients are determined](https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/concepts-email-recipients?view=azure-devops) | configuration | 0.65 | Explains product-specific rules for how subscriptions, delivery settings, and preferences combine to determine recipients; nuanced behavior unique to Azure DevOps. | | [Locate or change a product key](https://learn.microsoft.com/en-us/azure/devops/server/upgrade/change-product-key?view=azure-devops-server) | configuration | 0.65 | Describes how licensing works (no key at install) and how to opt-in to full edition via admin console; product-specific configuration behavior. 
| | [Look up a project administrator](https://learn.microsoft.com/en-us/azure/devops/organizations/security/look-up-project-administrators?view=azure-devops) | security | 0.65 | Defines the Project Administrators security group and enumerates its authorized tasks, with guidance on locating members. These are concrete, product-specific permission capabilities tied to a named role. | | [Look up a project collection administrator](https://learn.microsoft.com/en-us/azure/devops/organizations/security/look-up-project-collection-administrators?view=azure-devops) | security | 0.65 | Explains how to locate members of a specific privileged security group in Azure DevOps with product-specific steps and role behavior; this is concrete security configuration rather than conceptual overview. | | [Manage search indexing](https://learn.microsoft.com/en-us/azure/devops/project/search/manage-search?view=azure-devops-server) | configuration | 0.65 | Describes managing search extension and indexing status, including pausing, resuming, and reindexing; product-specific operational configuration details. | -| [Manage users and permissions](https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-user-and-permissions-management?view=azure-devops) | security | 0.65 | FAQ includes product-specific security guidance (e.g., recommendation to use Microsoft Entra tokens over personal access tokens) and likely details on required permissions/roles for specific management tasks, which are service-specific security configurations. | | [Manually back up](https://learn.microsoft.com/en-us/azure/devops/server/admin/backup/manually-backup-tfs?view=azure-devops-server) | configuration | 0.65 | Manual backup via SQL Server tools for Azure DevOps Server will include product-specific database names, sequences, and options required to keep backups in sync, which are concrete configuration steps unique to this product. 
| | [Monitor usage](https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/usage-monitoring?view=azure-devops) | troubleshooting | 0.65 | Guidance for investigating delayed/unfulfilled requests using usage messages and audit logs; symptom-based performance troubleshooting. | | [Move project collection](https://learn.microsoft.com/en-us/azure/devops/server/admin/move-project-collection?view=azure-devops-server) | deployment | 0.65 | Describes moving collections between deployments and domains with specific steps and considerations; this is a migration/deployment scenario. | @@ -347,7 +354,6 @@ confusable_not_for: Not for Azure Boards (use azure-boards), Azure Pipelines (us | [Copy a dashboard](https://learn.microsoft.com/en-us/azure/devops/report/dashboards/copy-dashboard?view=azure-devops) | configuration | 0.60 | Describes copying dashboards between projects/teams and how widgets are duplicated; includes Azure DevOps-specific behavior and options. | | [Data available and versioning](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/data-available-in-analytics?view=azure-devops) | limits-quotas | 0.60 | Describes which data is available depending on platform/version and Analytics version; effectively a capability/availability matrix with product-specific constraints. | | [FAQs](https://learn.microsoft.com/en-us/azure/devops/server/faq?view=azure-devops-server) | troubleshooting | 0.60 | FAQ for server administration typically includes specific error scenarios, support constraints (like not modifying databases), and targeted resolutions; these are product-specific troubleshooting and operational rules. | -| [Get started as a Stakeholder](https://learn.microsoft.com/en-us/azure/devops/organizations/security/get-started-stakeholder?view=azure-devops) | security | 0.60 | Defines what Stakeholder access can and cannot do; product-specific access level capabilities and restrictions, which are security/permission details. 
| | [Get started with an upgrade](https://learn.microsoft.com/en-us/azure/devops/server/upgrade/get-started?view=azure-devops-server) | deployment | 0.60 | Covers supported upgrade paths, prerequisites, and process considerations specific to Azure DevOps Server/TFS, which are product-specific deployment/upgrade requirements. | | [Historical data representation](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/analytics-historical-filtering?view=azure-devops) | architecture-patterns | 0.60 | Explains how historical data is stored and how to choose entity sets and filters for trend reporting; this is a product-specific data modeling pattern for Analytics. | | [Limit your emails](https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/exclude-self-from-email?view=azure-devops) | configuration | 0.60 | Describes a specific notification setting to suppress emails to initiators; product-specific behavior and config. | @@ -359,7 +365,6 @@ confusable_not_for: Not for Azure Boards (use azure-boards), Azure Pipelines (us | [Quick reference index](https://learn.microsoft.com/en-us/azure/devops/reference/quick-reference-index-boards-settings?view=azure-devops) | configuration | 0.60 | Index into many specific configuration/customization tasks for Azure Boards; serves as a map of product-specific configuration capabilities. | | [Refresh data caches on clients](https://learn.microsoft.com/en-us/azure/devops/server/admin/backup/refresh-data-caches?view=azure-devops-server) | best-practices | 0.60 | Explains specific maintenance operations that require cache refresh and how to avoid workspace errors; these are product-specific gotchas and operational recommendations. 
| | [Refresh data, add last refresh date](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/add-last-refresh-time?view=azure-devops) | best-practices | 0.60 | Describes concrete steps and expressions to surface last refresh time depending on data source (Analytics view vs OData), which are practical, product-specific reporting patterns. | -| [Set up Visual Studio](https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-set-up-vs?view=azure-devops) | integrations | 0.60 | FAQ for setting up Visual Studio with Azure DevOps; likely includes specific connection settings, authentication flows, and troubleshooting for this integration. | | [Split project collection](https://learn.microsoft.com/en-us/azure/devops/server/admin/split-team-project-collection?view=azure-devops-server) | deployment | 0.60 | Operational guide for splitting collections, including steps and constraints; product-specific reconfiguration of deployment topology. | | [Support lifecycle and servicing](https://learn.microsoft.com/en-us/azure/devops/server/install/servicing?view=azure-devops-server) | decision-making | 0.60 | Lifecycle and servicing article will include version-specific support dates, patching cadence, and upgrade guidance, which inform decisions about when to upgrade or patch. | | [Unpublish a code wiki](https://learn.microsoft.com/en-us/azure/devops/project/wiki/publish-repo-to-wiki?view=azure-devops) | configuration | 0.60 | Same content as index 4: product-specific behavior for publishing Markdown from Git repos as wikis and managing multiple wikis per project. | @@ -397,10 +402,10 @@ confusable_not_for: Not for Azure Boards (use azure-boards), Azure Pipelines (us | [Create work item from wiki content](https://learn.microsoft.com/en-us/azure/devops/project/wiki/create-embed-wit-from-wiki?view=azure-devops) | 0.35 | How-to for creating and embedding work items from wiki text; mostly UI steps without detailed configuration tables or error mappings. 
| | [December 8](https://learn.microsoft.com/en-us/azure/devops/release-notes/2022/sprint-213-update) | 0.35 | Security improvements including managed identity for ACR service connections and read-only GitHub token scope; summary suggests high-level security changes, not detailed role/permission matrices. | | [Delete a project](https://learn.microsoft.com/en-us/azure/devops/organizations/projects/delete-project?view=azure-devops) | 0.35 | Describes deleting and restoring projects; appears to be procedural with caution notes, but lacks detailed limits, configuration matrices, or security role/permission tables. | -| [FAQs](https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/faq-notifications?view=azure-devops) | 0.35 | FAQ-style content; summary doesn’t indicate detailed error codes, config tables, or limits. | | [Functional code search](https://learn.microsoft.com/en-us/azure/devops/project/search/functional-code-search?view=azure-devops) | 0.35 | Explains functional code search usage; likely mostly UI and query examples without detailed config tables, limits, or error mappings. | | [Functional package search](https://learn.microsoft.com/en-us/azure/devops/project/search/functional-package-search?view=azure-devops) | 0.35 | Shows how to search for packages across feeds; feature usage without detailed configuration matrices or limits. | | [Functional work item search](https://learn.microsoft.com/en-us/azure/devops/project/search/functional-work-item-search?view=azure-devops) | 0.35 | Describes functional work item search filters; primarily feature usage, not deep configuration or troubleshooting. 
| +| [Get started as a Stakeholder](https://learn.microsoft.com/en-us/azure/devops/organizations/security/get-started-stakeholder?view=azure-devops) | 0.35 | Stakeholder access getting-started guide describes capabilities and basic usage; summary doesn’t indicate detailed RBAC role tables, configuration parameters, or numeric limits—more of a conceptual/usage overview. | | [Go mobile](https://learn.microsoft.com/en-us/azure/devops/project/navigation/mobile-work?view=azure-devops) | 0.35 | Describes mobile browser support for work items; mostly usage guidance with one support limitation note, but not detailed enough for a configuration or limits skill. | | [January 28](https://learn.microsoft.com/en-us/azure/devops/release-notes/2020/sprint-164-update) | 0.35 | Read-only variables and output variables in deployment jobs; still a sprint feature note, not a full configuration reference. | | [July 29](https://learn.microsoft.com/en-us/azure/devops/release-notes/2022/sprint-207-update) | 0.35 | Change of default build job authorization scope from Project Collection to Project; security-related but described at a high level without detailed RBAC role/permission mapping. | @@ -430,7 +435,7 @@ confusable_not_for: Not for Azure Boards (use azure-boards), Azure Pipelines (us | [Create a report using an OData query](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/create-quick-report-odataq?view=azure-devops) | 0.30 | Quickstart for creating a bug trend report with OData; focuses on connection and basic query usage, not on detailed config matrices or limits. | | [Create a wiki for your project](https://learn.microsoft.com/en-us/azure/devops/project/wiki/wiki-create-repo?view=azure-devops) | 0.30 | How-to for creating a project wiki repo and opening it; no limits, config tables, error codes, or product-specific numeric details. 
| | [Create an Analytics view](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/analytics-views-create?view=azure-devops) | 0.30 | Primarily a how-to for creating Analytics Views in the UI and conceptual guidance on filtering and shaping data for Power BI. Lacks detailed parameter tables, limits, or product-specific configuration values. | -| [Create and configure an organization](https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-configure-customize-organization?view=azure-devops) | 0.30 | Organization FAQ; description doesn’t indicate detailed limits, error codes, or configuration tables, so treated as general Q&A. | +| [Create and configure an organization](https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-configure-customize-organization?view=azure-devops) | 0.30 | Organization FAQ about creating/configuring organizations; description suggests general how-to and conceptual guidance without clear indication of numeric limits, detailed configuration tables, or error-code-based troubleshooting. | | [Databases and deployment topologies](https://learn.microsoft.com/en-us/azure/devops/server/admin/backup/backup-db-architecture?view=azure-devops-server) | 0.30 | Describes backup concepts and topologies; summary suggests conceptual guidance without concrete numeric limits, config tables, or product-specific commands. | | [December 2](https://learn.microsoft.com/en-us/azure/devops/release-notes/2019/sprint-161-update) | 0.30 | GitHub Actions integration and multi-repo support; integration is mentioned but without detailed parameter/setting tables here. | | [Developer Resources >>](https://learn.microsoft.com/en-us/azure/devops/dev-resources/?view=azure-devops) | 0.30 | Developer resources hub; links to CLI/REST/TypeScript tools, but this page is a navigation overview. 
| @@ -459,7 +464,6 @@ confusable_not_for: Not for Azure Boards (use azure-boards), Azure Pipelines (us | [September 23](https://learn.microsoft.com/en-us/azure/devops/release-notes/2019/sprint-158-update) | 0.30 | User assignment-based billing and daily billing; licensing behavior, not technical limits/quotas or configuration matrices. | | [September 4](https://learn.microsoft.com/en-us/azure/devops/release-notes/2024/sprint-244-update) | 0.30 | Describes ability to retrieve branches and alerts via API; while APIs are mentioned, the page is a release note summary, not a full API/config reference here. | | [Time zone settings](https://learn.microsoft.com/en-us/azure/devops/organizations/settings/timezone-settings-usage?view=azure-devops) | 0.30 | Explains how Azure DevOps time zone settings work and where they are applied, but appears to be conceptual/behavioral guidance without numeric limits, configuration tables, or product-specific thresholds. | -| [Troubleshooting & FAQs >>](https://learn.microsoft.com/en-us/azure/devops/troubleshoot/?view=azure-devops) | 0.30 | Top-level troubleshooting and FAQ hub that links to other guides; the page itself is unlikely to contain specific error-code-to-solution mappings. | | [What's new?](https://learn.microsoft.com/en-us/azure/devops/server/whats-new?view=azure-devops-server) | 0.30 | High-level 'what's new' overview for Azure DevOps Server; primarily marketing/feature summary without detailed limits, configuration, or troubleshooting mappings. | | [About projects](https://learn.microsoft.com/en-us/azure/devops/organizations/projects/about-projects?view=azure-devops) | 0.25 | Conceptual article about projects and scaling; primarily architecture/organization concepts without numeric thresholds or decision matrices. 
| | [Add comments to wiki](https://learn.microsoft.com/en-us/azure/devops/project/wiki/add-comments-wiki?view=azure-devops) | 0.25 | Simple feature how-to for adding comments to wiki pages; lacks deep config, limits, or troubleshooting content. | @@ -659,6 +663,7 @@ confusable_not_for: Not for Azure Boards (use azure-boards), Azure Pipelines (us | [December 8](https://learn.microsoft.com/en-us/azure/devops/release-notes/2022/pipelines/sprint-213-update) | 0.20 | Sprint 213 Azure Pipelines release notes; primarily feature announcements and fixes. | | [Define your dashboard focus](https://learn.microsoft.com/en-us/azure/devops/report/dashboards/dashboard-focus?view=azure-devops) | 0.20 | Covers how to create and customize dashboards with widgets; appears to be a conceptual/UX and basic how-to guide without detailed configuration tables, limits, or product-specific diagnostic/security parameters. | | [End-to-end traceability](https://learn.microsoft.com/en-us/azure/devops/cross-service/end-to-end-traceability?view=azure-devops) | 0.20 | Traceability article is described as an overview of tools and features with links; likely conceptual guidance rather than detailed configuration or limits. | +| [FAQs](https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/faq-notifications?view=azure-devops) | 0.20 | FAQ about notifications is likely conceptual and behavioral (how notifications work, common questions) without detailed error codes, configuration tables, or numeric limits; does not clearly match troubleshooting or other expert-knowledge categories based on the description. | | [February 04](https://learn.microsoft.com/en-us/azure/devops/release-notes/2021/boards/sprint-182-update) | 0.20 | Sprint 182 Azure Boards release notes list incremental feature updates; they don’t provide the kind of numeric limits, decision matrices, or config references required for expert-knowledge classification. 
| | [February 04](https://learn.microsoft.com/en-us/azure/devops/release-notes/2021/general/sprint-182-update) | 0.20 | Sprint 182 release notes describe sprint updates rather than structured expert content like quotas, security roles, or configuration tables. | | [February 04](https://learn.microsoft.com/en-us/azure/devops/release-notes/2021/reporting/sprint-182-update) | 0.20 | Sprint 182 reporting update is a release-note style page; it lacks numeric limits, config parameter tables, or error-code mappings. | @@ -713,7 +718,6 @@ confusable_not_for: Not for Azure Boards (use azure-boards), Azure Pipelines (us | [February 9](https://learn.microsoft.com/en-us/azure/devops/release-notes/2024/ghazdo/sprint-234-update) | 0.20 | Sprint 234 release notes describe new GitHub Advanced Security features, not limits, configuration, or troubleshooting content. | | [February 9](https://learn.microsoft.com/en-us/azure/devops/release-notes/2024/pipelines/sprint-234-update) | 0.20 | Sprint 234 Azure Pipelines release notes; change log content, not limits, configuration, or troubleshooting guidance. | | [February 9](https://learn.microsoft.com/en-us/azure/devops/release-notes/2024/repos/sprint-234-update) | 0.20 | Sprint 234 Azure Repos release notes are incremental change documentation, not structured expert guidance as defined by the sub-skill types. | -| [Frequently asked questions](https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/faq?view=azure-devops) | 0.20 | FAQ pages are often mixed; without the full content, this is more likely to be conceptual and clarifying common questions rather than providing structured limits, configuration tables, or error-code-based troubleshooting. Insufficient evidence that it contains the specific numeric limits, config parameters, or error mappings required for expert-knowledge classification. 
| | [GitHub integration](https://learn.microsoft.com/en-us/azure/devops/cross-service/github-integration?view=azure-devops) | 0.20 | GitHub integration overview focuses on capabilities and benefits; summary does not indicate detailed config tables, quotas, or error mappings. | | [January 11](https://learn.microsoft.com/en-us/azure/devops/release-notes/2024/general/sprint-232-update) | 0.20 | Sprint 232 release notes summarize new features; they are not limits, configuration, troubleshooting, or decision-making documentation. | | [January 11](https://learn.microsoft.com/en-us/azure/devops/release-notes/2024/testplans/sprint-232-update) | 0.20 | Sprint release notes for Azure Test Plans; primarily feature announcements and high-level changes without structured limits, configuration tables, error mappings, or decision matrices. | @@ -1009,6 +1013,7 @@ confusable_not_for: Not for Azure Boards (use azure-boards), Azure Pipelines (us | [September 8](https://learn.microsoft.com/en-us/azure/devops/release-notes/2021/repos/sprint-192-update) | 0.20 | Sprint 192 Azure Repos release notes are historical change notes, not organized around limits, configuration options, or error-resolution mappings. | | [Set user preferences](https://learn.microsoft.com/en-us/azure/devops/organizations/settings/set-your-preferences?view=azure-devops) | 0.20 | How-to for changing profile preferences; mostly UI steps without detailed config parameter tables or numeric constraints. | | [Settings >>](https://learn.microsoft.com/en-us/azure/devops/organizations/?view=azure-devops) | 0.20 | Landing/navigation page for Azure DevOps organization, project, team, and user settings; description indicates high-level configuration and usage documentation without specific limits, configuration parameter tables, or error-code-based troubleshooting. 
| +| [Troubleshooting & FAQs >>](https://learn.microsoft.com/en-us/azure/devops/troubleshoot/?view=azure-devops) | 0.20 | Landing page that aggregates troubleshooting guides and FAQs across Azure DevOps. It does not itself contain specific error codes, commands, or symptom-to-solution mappings; it just links to other resources. | | [View/configure sprint burndown](https://learn.microsoft.com/en-us/azure/devops/report/dashboards/configure-sprint-burndown?view=azure-devops) | 0.20 | Explains how to configure and monitor sprint burndown charts and references other configuration pages. From the summary it appears to be a usage guide without detailed configuration tables, limits, or product-specific error/security details. | | [What are Analytics views?](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/what-are-analytics-views?view=azure-devops) | 0.20 | High-level explanation of Analytics views and their scope; no detailed limits, configuration tables, or product-specific error/decision matrices. | | [Work with favorites](https://learn.microsoft.com/en-us/azure/devops/project/navigation/set-favorites?view=azure-devops) | 0.20 | Shows how to set favorites; simple UI feature usage without deep technical details. | @@ -1057,6 +1062,7 @@ confusable_not_for: Not for Azure Boards (use azure-boards), Azure Pipelines (us | [What is Analytics?](https://learn.microsoft.com/en-us/azure/devops/report/powerbi/what-is-analytics?view=azure-devops) | 0.10 | Conceptual overview of Azure DevOps Analytics and its purpose; does not expose numeric limits, configuration tables, error codes, or decision matrices. | | [April 14](https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/pipelines/sprint-272-update) | - | Sprint release notes describe new features and changes but are not structured as limits, configuration references, troubleshooting guides, or other defined sub-skill types with stable expert knowledge. 
Content is primarily update/what’s new information rather than reusable technical reference with numeric limits, config tables, or decision matrices. | | [Customization >>](https://learn.microsoft.com/en-us/azure/devops/reference/?view=azure-devops) | - | High-level entry page for Azure Boards configuration & customization; acts as a navigation hub without exposing specific parameter tables or expert configuration details itself. | +| [FAQs](https://learn.microsoft.com/en-us/azure/devops/report/dashboards/faqs?view=azure-devops) | - | FAQ page about Azure DevOps dashboards and charts; based on the description it likely covers usage questions and conceptual guidance, without explicit numeric limits, configuration parameter tables, error-code-to-solution mappings, or other product-specific expert details as defined in the sub-skill types. | | [February 2026](https://learn.microsoft.com/en-us/azure/devops/release-notes/docswhatsnew/azure-devops-docs-2026-02) | - | Another monthly 'what's new' documentation summary; contains no product-specific limits, configs, or troubleshooting details. | | [January 2026](https://learn.microsoft.com/en-us/azure/devops/release-notes/docswhatsnew/azure-devops-docs-2026-01) | - | Monthly 'what's new' summary for Azure DevOps docs; functions as a change log and navigation aid, not a source of expert configuration, limits, or troubleshooting content. | | [March 2026](https://learn.microsoft.com/en-us/azure/devops/release-notes/docswhatsnew/azure-devops-docs-whats-new) | - | Monthly 'what's new' documentation summary; meta-doc about changes, not technical guidance or detailed product behavior. 
| @@ -1064,7 +1070,7 @@ confusable_not_for: Not for Azure Boards (use azure-boards), Azure Pipelines (us | [March 31](https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/testplans/sprint-271-update) | - | Sprint release notes describe new features and changes but do not focus on structured limits, configuration matrices, troubleshooting mappings, or other categorized expert-knowledge patterns defined in the sub-skill types. | | [March 5](https://learn.microsoft.com/en-us/azure/devops/release-notes/2026/general/sprint-270-update) | - | Sprint release notes describe new features and changes but typically do not provide structured limits, configuration matrices, troubleshooting mappings, or other stable expert reference data as defined by the sub-skill types. | | [Marketplace & Extensibility](https://learn.microsoft.com/en-us/azure/devops/marketplace-extensibility/?view=azure-devops) | - | Marketplace & extensibility documentation landing page; describes discovering and developing extensions but is primarily a hub, not a detailed reference with expert-only specifics. | -| [Roadmap](https://learn.microsoft.com/en-us/azure/devops/release-notes/features-timeline) | - | Roadmap/landing page with navigation; no technical configuration, limits, or troubleshooting details. | +| [Roadmap](https://learn.microsoft.com/en-us/azure/devops/release-notes/features-timeline) | - | Roadmap/what's new navigation content without detailed limits, configuration parameters, troubleshooting mappings, or decision matrices. | | [Roadmap and features timeline](https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/features-timeline?view=azure-devops) | - | Features timeline and roadmap page is primarily release/roadmap information, not configuration, limits, or troubleshooting guidance with reusable expert knowledge. 
| | [What's new](https://learn.microsoft.com/en-us/azure/devops/release-notes/docswhatsnew/) | - | Navigation/change-log style page listing new and updated docs; no technical limits, configs, patterns, or troubleshooting content. | | [Wiki, Search, & Navigation >>](https://learn.microsoft.com/en-us/azure/devops/project/?view=azure-devops) | - | Navigation/overview page for wikis, search, and navigation in Azure DevOps; no indication of detailed configuration, limits, or troubleshooting content. | diff --git a/products/azure-education-hub/report.md b/products/azure-education-hub/report.md index 51a3c99c..78453ff1 100644 --- a/products/azure-education-hub/report.md +++ b/products/azure-education-hub/report.md @@ -8,10 +8,13 @@ category_descriptions: skill_description: Expert knowledge for Azure Education Hub development including troubleshooting, and limits & quotas. Use when managing Azure for Students credits, yearly quotas, renewals, or Dev Tools for Teaching sign-in issues, and other Azure - Education Hub related development tasks. + Education Hub related development tasks. Not for Azure Lab Services (use azure-lab-services), + Azure DevTest Labs (use azure-devtest-labs), Azure Virtual Desktop (use azure-virtual-desktop). use_when: Use when managing Azure for Students credits, yearly quotas, renewals, or Dev Tools for Teaching sign-in issues, and other Azure Education Hub related development tasks. +confusable_not_for: Not for Azure Lab Services (use azure-lab-services), Azure DevTest + Labs (use azure-devtest-labs), Azure Virtual Desktop (use azure-virtual-desktop). 
 ---

 # Azure Education Hub Crawl Report
diff --git a/products/azure-elastic-san/azure-elastic-san.csv b/products/azure-elastic-san/azure-elastic-san.csv
index 253c2472..61d9ad00 100644
--- a/products/azure-elastic-san/azure-elastic-san.csv
+++ b/products/azure-elastic-san/azure-elastic-san.csv
@@ -1,6 +1,6 @@
 url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type
 https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-batch-create-sample,Create elastic SAN volumes in a batch,Create multiple Azure Elastic SAN volumes in a batch,Batch-create Azure Elastic SAN volumes with PowerShell,Azure PowerShell Script Sample - Learn how to create multiple Elastic SAN volumes in a batch.,"The PowerShell script in this article simplifies creating multiple Azure Elastic SAN volumes as a batch. To use the script, make a .csv file with pre-filled values to create as many volumes of varying sizes as you want. Format your .csv file with five columns: ResourceGroupName, ElasticSanName, VolumeGroupName, Name, and SizeGiB. The following screenshot provides an example: Then, use the following script to create your volumes.",2026-01-13T12:18:00.000Z,sample,integrations,0.7,True,"Provides a PowerShell script and CSV schema (specific column names and usage) to create multiple volumes. This is a concrete automation/integration pattern with product-specific parameters, fitting integrations.",unchanged
-https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-best-practices,Best practices,Azure Elastic SAN configuration best practices,Apply Azure Elastic SAN performance best practices,"Learn best practices for optimizing Azure Elastic SAN performance. 
Get recommendations for client-side settings, MPIO, iSCSI configurations, and deployment sizing",This article provides guidance on how to get optimal performance with an environment that uses an Azure Elastic SAN.,2026-01-13T12:18:00.000Z,concept-article,best-practices,0.9,True,"Provides concrete configuration recommendations for client settings, MPIO, iSCSI, and deployment sizing to optimize performance. These are product-specific DO/DON'T guidelines and tuning values, matching best-practices.",unchanged +https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-best-practices,Best practices,Azure Elastic SAN configuration best practices,Optimize Azure Elastic SAN configuration and performance,"Learn best practices for optimizing Azure Elastic SAN performance. Get recommendations for client-side settings, MPIO, iSCSI configurations, and deployment sizing",This article provides guidance on how to get optimal performance with an environment that uses an Azure Elastic SAN.,2026-04-24T06:15:00.000Z,concept-article,best-practices,0.85,True,"A dedicated best-practices article for Azure Elastic SAN with concrete, product-specific recommendations (client-side settings, MPIO, iSCSI configuration, deployment sizing). These are actionable DO/DON'T guidelines and tuning values unique to Elastic SAN performance characteristics, not generic storage advice.",updated https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-configure-customer-managed-keys,Configure customer-managed keys with Key Vault,Configure Customer-Managed Keys for Azure Elastic SAN,Configure customer-managed keys for Azure Elastic SAN,Learn how to configure Azure Elastic SAN encryption with customer-managed keys for an Elastic SAN volume group by using the Azure PowerShell module or Azure CLI.,"All data written to an Elastic SAN volume is automatically encrypted at rest with a data encryption key (DEK). 
Azure uses envelope encryption to encrypt the DEK by using a Key Encryption Key (KEK). By default, Azure uses a platform-managed KEK (managed by Microsoft), but you can create and manage your own KEK. This article shows how to configure encryption of an Elastic SAN volume group by using customer-managed keys stored in an Azure Key Vault.",2026-01-13T12:18:00.000Z,how-to,security,0.85,True,"Shows how to configure CMK-based encryption for Elastic SAN volume groups using Key Vault via PowerShell/CLI, including key and scope parameters. This is detailed encryption and key management configuration, matching security.",unchanged https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-configure-private-endpoints,Configure private endpoints,Configure private endpoints for Azure Elastic SAN,Configure private endpoints for Azure Elastic SAN volume groups,"Learn how to configure private endpoint connections to Azure Elastic SAN volumes for secure network isolation by using Azure portal, PowerShell, or CLI","A private endpoint enables you to connect to your Elastic SAN volume group over a private IP address within your virtual network. When you use a private endpoint, traffic between your virtual network and the Elastic SAN stays entirely on Azure's private backbone, without traversing the public internet. Once you configure and approve a private endpoint, it automatically grants access to the subnet where it resides. This configuration provides strong network isolation and is ideal for production o",2026-01-13T12:18:00.000Z,how-to,security,0.8,True,"Shows how to configure private endpoints for Elastic SAN, including behavior (automatic subnet access, network isolation). 
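The customer-managed-keys entry above describes envelope encryption: a key encryption key (KEK) wraps the data encryption key (DEK) that actually encrypts the volume data. A toy sketch of that structure — XOR stands in for a real cipher here, so this is an illustration only, not secure code:

```python
import secrets

def toy_cipher(data: bytes, key: bytes) -> bytes:
    """XOR stand-in for a real cipher -- illustration only, not secure."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# A data encryption key (DEK) encrypts the volume data at rest.
dek = secrets.token_bytes(32)
ciphertext = toy_cipher(b"volume block", dek)

# Envelope encryption: a key encryption key (KEK) wraps the DEK.
# With customer-managed keys, the KEK is held in Azure Key Vault.
kek = secrets.token_bytes(32)
wrapped_dek = toy_cipher(dek, kek)

# Rotating the KEK re-wraps only the 32-byte DEK; the bulk
# ciphertext never has to be re-encrypted.
new_kek = secrets.token_bytes(32)
wrapped_dek = toy_cipher(toy_cipher(wrapped_dek, kek), new_kek)

# To read: unwrap the DEK with the current KEK, then decrypt.
plaintext = toy_cipher(ciphertext, toy_cipher(wrapped_dek, new_kek))
print(plaintext)  # b'volume block'
```

The practical payoff — a standard property of envelope encryption, not specific to this page — is that swapping or revoking the KEK in Key Vault never requires re-encrypting the stored data.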
This is product-specific secure networking configuration, matching security.",unchanged https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-configure-service-endpoints,Configure service endpoints,Configure Service Endpoints for Azure Elastic SAN,Configure service endpoints for Azure Elastic SAN access,Learn how to configure service endpoints to access Azure Elastic SAN volumes.,"A service endpoint enables secure connectivity to Elastic SAN from a subnet within your virtual network, without requiring a private IP. Virtual network service endpoints are public and accessible via the internet. You can configure virtual network rules to control access to your volume group when using storage service endpoints. This article shows you how to configure service endpoint connections to your Elastic SAN.",2026-01-13T12:18:00.000Z,how-to,security,0.75,True,"Explains configuring service endpoints and virtual network rules for Elastic SAN volume groups. These are specific access control and network security settings, fitting security.",unchanged @@ -17,7 +17,8 @@ https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-networki https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-performance,Performance,Learn about Azure Elastic SAN and VM performance,Understand Azure Elastic SAN and VM performance limits,"Learn how your workload's performance is handled by Azure Elastic SAN and Azure Virtual Machines. Understand IOPS, throughput allocation, and throttling scenarios.","This article explains how Elastic SAN performance works, and how the combination of Elastic SAN limits and Azure Virtual Machines (VM) limits can affect the performance of your workloads.",2026-01-09T23:14:00.000Z,concept-article,limits-quotas,0.8,True,"Explains how Elastic SAN limits and VM limits interact, including IOPS, throughput allocation, and throttling. 
Such content typically includes specific numeric limits and allocation rules, fitting limits-quotas.",unchanged https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-performance-on-azure-vmware-solutions,Performance on Azure VMware Solution,Elastic SAN datastore performance on Azure VMware solutions,Optimize Elastic SAN datastore performance on AVS,"Benchmark results and guidance for Azure Elastic SAN datastores used with Azure VMware Solution, including IOPS- and throughput-intensive workloads.","This article describes the performance characteristics of Azure Elastic SAN datastores when used with Azure VMware Solution. It presents benchmark results for common workload patterns and provides enough configuration and test details to help you compare these results with your own environment's results. The results in this article are intended as reference only, not as guaranteed performance targets. Actual performance changes depending on workload characteristics, VM configuration, and Elastic S",2026-03-04T18:23:00.000Z,concept-article,best-practices,0.7,True,"The page provides benchmark-based, product-specific performance guidance for Azure Elastic SAN used as datastores with Azure VMware Solution, including concrete workload patterns, configuration details, and how to interpret/compare results. This is actionable, service-specific tuning and usage guidance rather than generic performance theory, fitting best under best-practices.",unchanged https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-planning,Planning,Plan for an Azure Elastic SAN deployment,Plan Azure Elastic SAN capacity and configuration,"Plan for an Azure Elastic SAN deployment. Learn about storage capacity, performance, redundancy, and encryption.","To plan for an Elastic SAN deployment, you need to understand its three main components: the SAN itself, volume groups, and volumes. 
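The performance entry above notes that workload performance is governed jointly by Elastic SAN limits and VM limits. A minimal sketch of the resulting cap — all numbers here are hypothetical, not published targets:

```python
def effective_iops(vm_limit: int, volume_limit: int, san_limit: int) -> int:
    """The workload is throttled by whichever applicable cap is lowest."""
    return min(vm_limit, volume_limit, san_limit)

# Hypothetical caps: the VM allows fewer IOPS than the volume or the
# SAN, so the VM limit is what the workload actually experiences.
print(effective_iops(vm_limit=25_600, volume_limit=64_000, san_limit=80_000))  # 25600
```

This mirrors the throttling scenarios the page covers: raising a volume's provisioned limit has no visible effect while the attached VM remains the lower bound.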
When you deploy a SAN, you make selections while configuring the SAN, including the redundancy of the entire SAN, and how much performance and storage the SAN has. Then you create volume groups that you use to manage volumes at scale. Any settings you apply to a volume group are inherited by volumes inside that volume group. Finally, you partition the storage capac",2026-01-09T23:14:00.000Z,concept-article,decision-making,0.7,True,"Planning article that discusses storage capacity, performance, redundancy, and encryption choices for SAN, volume groups, and volumes. Likely includes concrete thresholds and configuration guidance to choose sizes and redundancy options, which supports deployment decision-making.",unchanged -https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-scale-targets,Scale targets,Azure Elastic SAN Scalability and Performance Targets,"Azure Elastic SAN scalability, IOPS, and throughput limits","Learn about the capacity, IOPS, and throughput rates for Azure Elastic SAN. Learn which regions support higher capacities.","An Elastic SAN has three main components: the SAN itself, volume groups, and volumes.",2026-01-09T23:14:00.000Z,concept-article,limits-quotas,0.95,True,"Explicitly about capacity, IOPS, throughput, and region support. This type of article typically contains numeric limits and performance targets per SAN/volume, matching limits-quotas criteria.",unchanged +https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-scale-targets,Scale targets,Azure Elastic SAN Scalability and Performance Targets,"Elastic SAN capacity, IOPS, and throughput limits","Learn about the capacity, IOPS, and throughput rates for Azure Elastic SAN. 
Learn which regions support higher capacities.","An Elastic SAN has three main components: the SAN itself, volume groups, and volumes.",2026-04-24T06:15:00.000Z,concept-article,limits-quotas,0.95,True,"A scalability and performance targets page for Azure Elastic SAN that necessarily lists specific capacity, IOPS, and throughput numbers, often by region or tier. These numeric limits and targets are classic limits/quotas content and represent expert knowledge beyond generic LLM training.",updated https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-shared-volumes,Clustered applications,Use clustered applications on Azure Elastic SAN,Use clustered applications with shared Azure Elastic SAN volumes,Learn how to deploy clustered applications on Elastic SAN volumes and share Elastic SAN volumes between compute clients.,"You can attach Azure Elastic SAN volumes to multiple compute clients at the same time, so you can deploy or migrate cluster applications to Azure. To share an Elastic SAN volume, you need to use a cluster manager, such as Windows Server Failover Cluster (WSFC) or Pacemaker. The cluster manager handles cluster node communications and write locking. Elastic SAN doesn't natively offer a fully managed filesystem that can be accessed over SMB or NFS. When used as a shared volume, Elastic SAN volumes 
This is a product-specific deployment/architecture pattern for clustered workloads.",unchanged https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-snapshots,Using volume snapshots,Backup Azure Elastic SAN volumes,Use snapshots to back up Azure Elastic SAN volumes,"Learn about snapshots for Azure Elastic SAN, including their best uses, how to create them, and how to use them to create new volumes or export to a managed disk.","Azure Elastic SAN volume snapshots are incremental point-in-time backups of your volumes. The first snapshot you take doesn't occupy any space, and every subsequent snapshot consists only of the changes to the Elastic SAN volume since the last snapshot. This approach is different from a managed disk snapshot. The first snapshot you take for a managed disk is a full copy of the managed disk and each subsequent snapshot consists of only the changes to the disk since the last snapshot. Snapshots of",2026-01-09T23:14:00.000Z,concept-article,best-practices,0.7,True,"Snapshot article describes how Elastic SAN snapshots behave versus managed disk snapshots and when/how to use them for backup, volume creation, or export. It likely includes product-specific recommendations and gotchas (for example, behavior of first vs subsequent snapshots) that qualify as best-practice guidance rather than just conceptual backup theory.",unchanged +https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-transition-iqn-naming-authority,Transition IQN Naming Authority on Connected Volumes,Update IQN Naming Authority on Azure Elastic SAN Volumes,Transition Elastic SAN IQN naming authority,Learn how to transition the iSCSI Qualified Name (IQN) naming authority for Azure Elastic SAN volumes that are already connected to clients.,"Elastic SAN volume groups issue a unique identifier called an iSCSI Qualified Name (IQN) that your volumes use to connect. 
As of June 30, 2025, all new volume groups issue an IQN using the naming authority net.azure.storage. If you have volume groups that were created prior to June 30, 2025, you may be connected to volumes using the authority net.windows.core and we recommend you transition to the naming authority net.azure.storage using the steps outlined in this article.",2026-04-22T22:14:00.000Z,how-to,configuration,0.7,True,"Describes a product-specific, time-bound change to IQN naming authority for Azure Elastic SAN volumes and prescribes concrete steps to reconfigure existing connections. This is detailed operational/configuration guidance (how to update IQNs and handle existing clients) that is unlikely to be known generically by an LLM and is specific to this service’s behavior and timeline.",new https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-troubleshoot,Troubleshoot your Elastic SAN,Troubleshoot Azure Elastic SAN,Troubleshoot common Azure Elastic SAN issues and errors,Troubleshoot issues with Azure Elastic SAN,This article lists common issues related to Azure Elastic SAN. It also provides possible causes and resolutions for these issues.,2026-01-09T23:14:00.000Z,how-to,troubleshooting,0.95,True,"Explicit troubleshooting article listing common issues, causes, and resolutions. Likely includes specific error messages/codes and diagnostic steps, matching troubleshooting criteria.",unchanged diff --git a/products/azure-elastic-san/report.md b/products/azure-elastic-san/report.md index 05a785c0..a7f6389a 100644 --- a/products/azure-elastic-san/report.md +++ b/products/azure-elastic-san/report.md @@ -1,19 +1,19 @@ --- -generated_at: '2026-03-19' +generated_at: '2026-04-26' category_descriptions: integrations: Using PowerShell to batch-create Elastic SAN volumes and configuring Linux and Windows clients to connect, mount, and use those iSCSI-based volumes. 
- best-practices: Performance tuning for Elastic SAN volumes and AVS datastores, plus - how to design, configure, and use snapshots for backup and recovery best practices. + best-practices: Tuning Elastic SAN performance and configuration (including AVS + datastores) and using volume snapshots for backup and recovery best practices. security: Encrypting Elastic SAN with customer-managed keys and configuring secure access via private endpoints, service endpoints, and other network security options for volumes and volume groups. - configuration: Configuring, deploying, resizing, deleting, and monitoring Azure - Elastic SAN resources and volumes, including safe capacity changes and using built-in - metrics effectively. - limits-quotas: 'Performance and scale limits for Elastic SAN: max volumes, capacity, - IOPS/throughput per volume/volume group/SAN, and how VM size and workload affect - achievable performance.' + configuration: Configuring, resizing, deleting, and monitoring Azure Elastic SAN + resources/volumes, plus managing iSCSI IQN naming authority and safe operational + changes. + limits-quotas: Details on Elastic SAN capacity, IOPS, throughput, and VM performance + limits, including how these quotas scale, constrain workloads, and impact design + choices. decision-making: Guidance on sizing and configuring Elastic SAN (performance, capacity, architecture) and deciding how to integrate it with AKS workloads and storage patterns. @@ -25,32 +25,32 @@ category_descriptions: error codes/logs. skill_description: Expert knowledge for Azure Elastic SAN development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, - security, configuration, and integrations & coding patterns. Use when scripting - Elastic SAN volumes, tuning AVS datastores, using CMK encryption, sizing for IOPS, - or running clustered SQL, and other Azure Elastic SAN related development tasks. 
- Not for Azure Blob Storage (use azure-blob-storage), Azure Files (use azure-files), - Azure NetApp Files (use azure-netapp-files), Azure Managed Lustre (use azure-managed-lustre). -use_when: Use when scripting Elastic SAN volumes, tuning AVS datastores, using CMK - encryption, sizing for IOPS, or running clustered SQL, and other Azure Elastic SAN - related development tasks. -confusable_not_for: Not for Azure Blob Storage (use azure-blob-storage), Azure Files - (use azure-files), Azure NetApp Files (use azure-netapp-files), Azure Managed Lustre - (use azure-managed-lustre). + security, configuration, and integrations & coding patterns. Use when creating iSCSI + volumes, AVS datastores, snapshots, CMK encryption, or AKS-integrated Elastic SAN + storage, and other Azure Elastic SAN related development tasks. Not for Azure Blob + Storage (use azure-blob-storage), Azure NetApp Files (use azure-netapp-files), Azure + Managed Lustre (use azure-managed-lustre), Azure Files (use azure-files). +use_when: Use when creating iSCSI volumes, AVS datastores, snapshots, CMK encryption, + or AKS-integrated Elastic SAN storage, and other Azure Elastic SAN related development + tasks. +confusable_not_for: Not for Azure Blob Storage (use azure-blob-storage), Azure NetApp + Files (use azure-netapp-files), Azure Managed Lustre (use azure-managed-lustre), + Azure Files (use azure-files). 
--- # Azure Elastic SAN Crawl Report ## Summary -- **Total Pages**: 22 -- **Fetched**: 22 +- **Total Pages**: 23 +- **Fetched**: 23 - **Fetch Failed**: 0 -- **Classified**: 21 +- **Classified**: 22 - **Unclassified**: 1 ### Incremental Update -- **New Pages**: 0 -- **Updated Pages**: 0 -- **Unchanged**: 22 +- **New Pages**: 1 +- **Updated Pages**: 2 +- **Unchanged**: 20 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-elastic-san/azure-elastic-san.csv` @@ -58,25 +58,36 @@ confusable_not_for: Not for Azure Blob Storage (use azure-blob-storage), Azure F | Type | Count | Percentage | |------|-------|------------| -| architecture-patterns | 1 | 4.5% | -| best-practices | 3 | 13.6% | -| configuration | 4 | 18.2% | -| decision-making | 1 | 4.5% | -| integrations | 3 | 13.6% | -| limits-quotas | 2 | 9.1% | -| security | 6 | 27.3% | -| troubleshooting | 1 | 4.5% | -| *(Unclassified)* | 1 | 4.5% | +| architecture-patterns | 1 | 4.3% | +| best-practices | 3 | 13.0% | +| configuration | 5 | 21.7% | +| decision-making | 1 | 4.3% | +| integrations | 3 | 13.0% | +| limits-quotas | 2 | 8.7% | +| security | 6 | 26.1% | +| troubleshooting | 1 | 4.3% | +| *(Unclassified)* | 1 | 4.3% | ## Changes +### New Pages + +- [Transition IQN Naming Authority on Connected Volumes](https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-transition-iqn-naming-authority) + +### Updated Pages + +- [Best practices](https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-best-practices) + - Updated: 2026-01-13T12:18:00.000Z → 2026-04-24T06:15:00.000Z +- [Scale targets](https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-scale-targets) + - Updated: 2026-01-09T23:14:00.000Z → 2026-04-24T06:15:00.000Z + ## Classified Pages | TOC Title | Type | Confidence | Reason | |-----------|------|------------|--------| -| [Scale 
targets](https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-scale-targets) | limits-quotas | 0.95 | Explicitly about capacity, IOPS, throughput, and region support. This type of article typically contains numeric limits and performance targets per SAN/volume, matching limits-quotas criteria. | +| [Scale targets](https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-scale-targets) | limits-quotas | 0.95 | A scalability and performance targets page for Azure Elastic SAN that necessarily lists specific capacity, IOPS, and throughput numbers, often by region or tier. These numeric limits and targets are classic limits/quotas content and represent expert knowledge beyond generic LLM training. | | [Troubleshoot your Elastic SAN](https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-troubleshoot) | troubleshooting | 0.95 | Explicit troubleshooting article listing common issues, causes, and resolutions. Likely includes specific error messages/codes and diagnostic steps, matching troubleshooting criteria. | -| [Best practices](https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-best-practices) | best-practices | 0.90 | Provides concrete configuration recommendations for client settings, MPIO, iSCSI, and deployment sizing to optimize performance. These are product-specific DO/DON'T guidelines and tuning values, matching best-practices. | +| [Best practices](https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-best-practices) | best-practices | 0.85 | A dedicated best-practices article for Azure Elastic SAN with concrete, product-specific recommendations (client-side settings, MPIO, iSCSI configuration, deployment sizing). These are actionable DO/DON'T guidelines and tuning values unique to Elastic SAN performance characteristics, not generic storage advice. 
| | [Configure customer-managed keys with Key Vault](https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-configure-customer-managed-keys) | security | 0.85 | Shows how to configure CMK-based encryption for Elastic SAN volume groups using Key Vault via PowerShell/CLI, including key and scope parameters. This is detailed encryption and key management configuration, matching security. | | [Manage customer keys](https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-encryption-manage-customer-keys) | security | 0.85 | Details lifecycle and management of CMKs (rotation, access control) for Elastic SAN. Contains product-specific key management behaviors and requirements, fitting security. | | [Configure private endpoints](https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-configure-private-endpoints) | security | 0.80 | Shows how to configure private endpoints for Elastic SAN, including behavior (automatic subnet access, network isolation). This is product-specific secure networking configuration, matching security. | @@ -91,6 +102,7 @@ confusable_not_for: Not for Azure Blob Storage (use azure-blob-storage), Azure F | [Performance on Azure VMware Solution](https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-performance-on-azure-vmware-solutions) | best-practices | 0.70 | The page provides benchmark-based, product-specific performance guidance for Azure Elastic SAN used as datastores with Azure VMware Solution, including concrete workload patterns, configuration details, and how to interpret/compare results. This is actionable, service-specific tuning and usage guidance rather than generic performance theory, fitting best under best-practices. 
| | [Planning](https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-planning) | decision-making | 0.70 | Planning article that discusses storage capacity, performance, redundancy, and encryption choices for SAN, volume groups, and volumes. Likely includes concrete thresholds and configuration guidance to choose sizes and redundancy options, which supports deployment decision-making. | | [Resize an Elastic SAN](https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-expand) | configuration | 0.70 | Describes how to increase/decrease SAN and volume sizes. Such docs typically include constraints (e.g., expansion vs shrink rules, allowed ranges), which are product-specific configuration details. | +| [Transition IQN Naming Authority on Connected Volumes](https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-transition-iqn-naming-authority) | configuration | 0.70 | Describes a product-specific, time-bound change to IQN naming authority for Azure Elastic SAN volumes and prescribes concrete steps to reconfigure existing connections. This is detailed operational/configuration guidance (how to update IQNs and handle existing clients) that is unlikely to be known generically by an LLM and is specific to this service’s behavior and timeline. | | [Using volume snapshots](https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-snapshots) | best-practices | 0.70 | Snapshot article describes how Elastic SAN snapshots behave versus managed disk snapshots and when/how to use them for backup, volume creation, or export. It likely includes product-specific recommendations and gotchas (for example, behavior of first vs subsequent snapshots) that qualify as best-practice guidance rather than just conceptual backup theory. 
| | [Clustered applications](https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-shared-volumes) | architecture-patterns | 0.65 | Covers pattern of sharing Elastic SAN volumes across multiple clients using cluster managers (WSFC, Pacemaker) and explains constraints (no managed SMB/NFS). This is a product-specific deployment/architecture pattern for clustered workloads. | | [Networking](https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-networking) | security | 0.65 | Describes specific networking options (service endpoints, private endpoints, iSCSI) and how to restrict access by subnets and endpoints. This is product-specific access control/network isolation guidance, fitting security configuration. | diff --git a/products/azure-energy-data-services/azure-energy-data-services.csv b/products/azure-energy-data-services/azure-energy-data-services.csv index 6bb5c75d..d3c2bf16 100644 --- a/products/azure-energy-data-services/azure-energy-data-services.csv +++ b/products/azure-energy-data-services/azure-energy-data-services.csv @@ -46,7 +46,7 @@ https://learn.microsoft.com/en-us/azure/energy-data-services/tutorial-petrel-ddm https://learn.microsoft.com/en-us/azure/energy-data-services/tutorial-reservoir-ddms-apis,Use Reservoir DDMS APIs,Tutorial: Use Reservoir DDMS APIs to work with reservoir data - Microsoft Azure Data Manager for Energy,Read reservoir data using Reservoir DDMS REST APIs,Learn to use OSDU Reservoir DDMS APIs.,"In this article, you learn how to read data from Reservoir DDMS REST APIs with curl commands.",2025-08-08T11:10:00.000Z,tutorial,integrations,0.65,True,"Uses Reservoir DDMS REST APIs with curl; implies specific endpoints, query parameters, and payload formats.",unchanged https://learn.microsoft.com/en-us/azure/energy-data-services/tutorial-reservoir-ddms-websocket,Use Reservoir DDMS websocket APIs,Tutorial: Use Reservoir DDMS websocket endpoints to work with reservoir data - Microsoft Azure Data Manager for 
Energy,Use Reservoir DDMS websocket endpoints for data,Learn to use OSDU Reservoir DDMS websocket APIs.,"Use Reservoir Domain Data Management Services (DDMS) APIs in PowerShell to work with reservoir data in an Azure Data Manager for Energy resource. In this tutorial, you learn how to use a Reservoir DDMS websocket endpoint to: For more information about DDMS, see DDMS concepts.",2025-02-13T18:07:00.000Z,tutorial,integrations,0.7,True,"Websocket API usage is highly product-specific; likely includes endpoint URLs, message formats, and connection parameters.",unchanged https://learn.microsoft.com/en-us/azure/energy-data-services/tutorial-rock-and-fluid-samples-ddms,Use Rock and Fluid Samples DDMS APIs,Tutorial: Use Rock and Fluid Samples (RAFS) DDMS APIs - Microsoft Azure Data Manager for Energy,Call RAFS DDMS APIs for rock and fluid samples,Learn how to use RAFS DDMS v2 endpoints for rock and fluid samples.,"Rock and Fluid Samples (RAFS) DDMS allows you to manage storage, retrieval, and association of rock and fluid sample master data, analyses, and reports. This tutorial shows a step-by-step workflow using cURL to create and interact with these entities in your Azure Data Manager for Energy instance. In this tutorial, you learn how to use the RAFS DDMS APIs to: This tutorial shows common end-to-end cURL-based interactions with RAFS DDMS API endpoints.",2025-11-06T08:03:00.000Z,tutorial,integrations,0.7,True,End-to-end cURL interactions with RAFS DDMS endpoints; includes concrete API paths and request/response structures.,unchanged -https://learn.microsoft.com/en-us/azure/energy-data-services/tutorial-seismic-change-tier,Change tier for seismic workloads,Change storage tier for seismic datasets in Azure Data Manager for Energy,,"Learn how to change the storage tier of seismic datasets in Azure Data Manager for Energy to optimize storage costs using Hot, Cool, and Cold tiers.","Important This feature is currently in preview and available by default on Developer SKU. 
See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability. Use the Seismic DDMS Change Tier operation in Azure Data Manager for Energy to move datasets between Hot, Cool, and Cold storage tiers based on access frequency. Moving rarely accessed data to cooler tiers reduces storage co",2026-04-13T06:11:00.000Z,tutorial,,0.2,False,"Tutorial-style guidance on changing storage tiers (Hot/Cool/Cold) for seismic datasets; no evidence of numeric limits, configuration parameter tables, error codes, or decision matrices with quantified trade-offs. Content appears conceptual/operational rather than detailed expert reference.",updated +https://learn.microsoft.com/en-us/azure/energy-data-services/tutorial-seismic-change-tier,Change tier for seismic workloads,Change storage tier for seismic datasets in Azure Data Manager for Energy,,"Learn how to change the storage tier of seismic datasets in Azure Data Manager for Energy to optimize storage costs using Hot, Cool, and Cold tiers.","Important This feature is currently in preview and available by default on Developer SKU. See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability. Use the Seismic DDMS Change Tier operation in Azure Data Manager for Energy to move datasets between Hot, Cool, and Cold storage tiers based on access frequency. Moving rarely accessed data to cooler tiers reduces storage co",2026-04-13T06:11:00.000Z,tutorial,,0.2,False,"Tutorial-style guidance on changing storage tiers (Hot/Cool/Cold) for seismic datasets; no evidence of numeric limits, configuration parameter tables, error codes, or decision matrices with quantified trade-offs. 
Content appears conceptual/operational rather than detailed expert reference.",unchanged https://learn.microsoft.com/en-us/azure/energy-data-services/tutorial-seismic-ddms,Use Seismic Store DDMS APIs,Tutorial: Work with seismic data by using Seismic DDMS APIs - Microsoft Azure Data Manager for Energy,Call Seismic DDMS APIs with cURL,This tutorial shows sample steps for interacting with the Seismic DDMS APIs in Azure Data Manager for Energy.,"This tutorial demonstrates how to utilize Seismic Domain Data Management Services (DDMS) APIs with cURL to manage seismic data within an Azure Data Manager for Energy instance. In this tutorial, you learn how to: For more information about DDMS, see DDMS concepts.",2025-03-18T17:03:00.000Z,tutorial,integrations,0.65,True,"Tutorial for Seismic DDMS APIs with cURL implies specific REST endpoints, parameters, and request patterns unique to this product.",unchanged https://learn.microsoft.com/en-us/azure/energy-data-services/tutorial-seismic-ddms-sdutil,Use Seismic Store DDMS sdutil,Tutorial: Use sdutil to load data into Seismic Store - Microsoft Azure Data Manager for Energy,Use sdutil CLI to interact with Seismic Store,"This tutorial shows you how to set up and use sdutil, a command-line tool for interacting with Seismic Store.","Seismic Store is a cloud-based solution for storing and managing datasets of any size. It provides a secure way to access datasets through a scoped authorization mechanism. Seismic Store overcomes cloud providers' object size limitations by managing generic datasets as multiple independent objects. Sdutil is a command-line Python tool for interacting with Seismic Store. 
You can use sdutil to perform basic operations like uploading data to Seismic Store, downloading datasets from Seismic Store, m",2024-01-24T18:05:00.000Z,tutorial,integrations,0.7,True,"Command-line tool usage for Seismic Store is product-specific integration; likely includes concrete CLI parameters, options, and patterns unique to this service.",unchanged https://learn.microsoft.com/en-us/azure/energy-data-services/tutorial-well-delivery-ddms,Use Well Delivery DDMS APIs,Tutorial: Work with well data records by using Well Delivery DDMS APIs,Manage well records with Well Delivery DDMS APIs,Learn how to work with well data records in your Azure Data Manager for Energy instance by using Well Delivery Domain Data Management Services (DDMS) APIs in Postman.,"This tutorial demonstrates how to utilize Well Delivery Domain Data Management Services (DDMS) APIs with cURL to manage well records within an Azure Data Manager for Energy instance. In this tutorial, you learn how to: For more information about DDMS, see DDMS concepts.",2025-07-31T11:10:00.000Z,tutorial,integrations,0.65,True,API-focused tutorial (Well Delivery DDMS) using Postman/cURL; contains concrete endpoints and request schemas specific to this service.,unchanged diff --git a/products/azure-energy-data-services/report.md b/products/azure-energy-data-services/report.md index 69506f73..b4c71e2c 100644 --- a/products/azure-energy-data-services/report.md +++ b/products/azure-energy-data-services/report.md @@ -48,8 +48,8 @@ confusable_not_for: Not for Azure Data Explorer (use azure-data-explorer), Azure ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 1 -- **Unchanged**: 51 +- **Updated Pages**: 0 +- **Unchanged**: 52 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-energy-data-services/azure-energy-data-services.csv` @@ -68,11 +68,6 @@ confusable_not_for: Not for Azure Data Explorer (use azure-data-explorer), Azure ## Changes -### Updated Pages - -- [Change 
tier for seismic workloads](https://learn.microsoft.com/en-us/azure/energy-data-services/tutorial-seismic-change-tier) - - Updated: 2026-03-10T16:12:00.000Z → 2026-04-13T06:11:00.000Z - ## Classified Pages | TOC Title | Type | Confidence | Reason | diff --git a/products/azure-expressroute/azure-expressroute.csv b/products/azure-expressroute/azure-expressroute.csv index 14fe4e92..633a7518 100644 --- a/products/azure-expressroute/azure-expressroute.csv +++ b/products/azure-expressroute/azure-expressroute.csv @@ -24,7 +24,7 @@ https://learn.microsoft.com/en-us/azure/expressroute/expressroute-erdirect-about https://learn.microsoft.com/en-us/azure/expressroute/expressroute-faqs,FAQ,FAQ - Azure ExpressRoute,"Azure ExpressRoute FAQ for services, costs, and connectivity","The ExpressRoute FAQ contains information about Supported Azure Services, Cost, Data and Connections, SLA, Providers and Locations, Bandwidth, and other Technical Details.",,2025-01-10T08:00:00.000Z,faq,troubleshooting,0.7,True,"ExpressRoute FAQ typically includes concrete answers about supported services, bandwidth, SLAs, and technical behaviors; these are product-specific clarifications and edge cases useful for troubleshooting and decisions.",unchanged https://learn.microsoft.com/en-us/azure/expressroute/expressroute-for-cloud-solution-providers,ExpressRoute for Cloud Solution Providers (CSP),ExpressRoute for Cloud Solution Providers - Azure,,This article provides information for Cloud Solution Providers that want to incorporate Azure services and ExpressRoute into their offerings.,"Microsoft provides hyper-scale services for traditional resellers and distributors (CSP) to be able to rapidly provision new services and solutions for your customers without the need to invest in developing these new services. 
To allow the Cloud Solution Provider (CSP) the ability to directly manage these new services, Microsoft provides programs and APIs that allow the CSP to manage Microsoft Azure resources on behalf of your customers. One of those resources is ExpressRoute. ExpressRoute allo",2025-12-09T08:00:00.000Z,concept-article,,0.3,False,"Appears to be a program/offer overview for Cloud Solution Providers using ExpressRoute, likely focused on business/programmatic aspects and high-level APIs rather than detailed technical limits, configuration parameters, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/expressroute/expressroute-global-reach,Overview,About Azure ExpressRoute Global Reach,,Learn how Azure ExpressRoute Global Reach can link ExpressRoute circuits together to make a private network between your on-premises networks.,"ExpressRoute provides a private and resilient connection between your on-premises networks and the Microsoft Cloud, allowing access to services like Azure and Microsoft 365 from your data center or corporate network. For instance, you might have a branch office in San Francisco with an ExpressRoute circuit in Silicon Valley, and another branch office in London with an ExpressRoute circuit there. Both offices can connect to Azure resources in US West and UK South, but they can't directly communic",2025-04-09T08:00:00.000Z,concept-article,,0.2,False,"Conceptual overview of ExpressRoute Global Reach; no detailed limits, configs, or decision matrices evident from summary.",unchanged -https://learn.microsoft.com/en-us/azure/expressroute/expressroute-howto-add-gateway-portal-resource-manager,Azure portal,Configure a virtual network gateway for ExpressRoute using the Azure portal,Create and manage ExpressRoute virtual network gateways,"Learn how to create, configure, and manage virtual network gateways for ExpressRoute using the Azure portal. 
This guide covers gateway creation, SKU upgrades, and configuration settings.","This article shows you how to create, configure, and manage a virtual network gateway for ExpressRoute using the Azure portal. You can use these steps to create a gateway for a virtual network created with the Resource Manager deployment model. For more information about virtual network gateways and gateway configuration settings, see About virtual network gateways for ExpressRoute.",2025-11-18T17:01:00.000Z,how-to,configuration,0.7,True,"Portal-based configuration including gateway creation, SKU upgrades, and settings; contains concrete configuration parameters.",unchanged +https://learn.microsoft.com/en-us/azure/expressroute/expressroute-howto-add-gateway-portal-resource-manager,Azure portal,Configure a virtual network gateway for ExpressRoute using the Azure portal,Configure Azure ExpressRoute virtual network gateway in portal,"Learn how to create, configure, and manage virtual network gateways for ExpressRoute using the Azure portal. This guide covers gateway creation, SKU upgrades, and configuration settings.","This article shows you how to create, configure, and manage a virtual network gateway for ExpressRoute using the Azure portal. You can use these steps to create a gateway for a virtual network created with the Resource Manager deployment model. For more information about virtual network gateways and gateway configuration settings, see About virtual network gateways for ExpressRoute.",2026-04-24T17:42:00.000Z,how-to,configuration,0.68,True,"Step-by-step portal guide that includes product-specific configuration options for ExpressRoute virtual network gateways (gateway types, SKUs, and settings). 
While largely procedural, it encodes concrete configuration choices and option names unique to ExpressRoute gateways rather than just conceptual overview content.",updated https://learn.microsoft.com/en-us/azure/expressroute/expressroute-howto-add-gateway-resource-manager,Azure PowerShell,Configure a virtual network gateway for ExpressRoute using PowerShell,Manage ExpressRoute virtual network gateways with PowerShell,"Learn how to add, resize, and remove a virtual network gateway for ExpressRoute using Azure PowerShell. This guide covers gateway creation, SKU selection, and configuration steps.","This article shows you how to add, resize, and remove a virtual network gateway for a preexisting virtual network using PowerShell. The steps apply to virtual networks created with the Resource Manager deployment model for ExpressRoute. For more information, see About ExpressRoute virtual network gateways.",2025-11-18T17:01:00.000Z,how-to,configuration,0.7,True,"PowerShell guide to add, resize, and remove gateways; includes cmdlets and SKU selection details.",unchanged https://learn.microsoft.com/en-us/azure/expressroute/expressroute-howto-add-ipv6,Add IPv6 support for private peering,Azure ExpressRoute: Add IPv6 support for private peering,Add IPv6 support to ExpressRoute private peering,"Learn how to add IPv6 support to connect to Azure deployments using the Azure portal, Azure CLI, or Azure PowerShell.","This article describes how to add IPv6 support to connect via ExpressRoute to your resources in Azure using the Azure portal, Azure CLI, or Azure PowerShell. Note Some aspects of the portal experience are still being implemented. Therefore, when using the Azure portal, follow the exact order of steps provided in this document to successfully add IPv6 support. 
Specifically, make sure to create your virtual network and subnet, or add IPv6 address space to your existing virtual network and GatewayS",2025-07-25T22:07:00.000Z,how-to,configuration,0.7,True,Describes exact steps and ordering to enable IPv6 for ExpressRoute private peering using portal/CLI/PowerShell; includes product-specific constraints.,unchanged https://learn.microsoft.com/en-us/azure/expressroute/expressroute-howto-circuit-arm,Azure PowerShell,Quickstart: Create and modify an ExpressRoute circuit using Azure PowerShell,,"This quickstart shows you how to create, provision, verify, update, delete, and deprovision an ExpressRoute circuit.","This quickstart shows you how to create an ExpressRoute circuit in three different resiliency types: Maximum Resiliency, High Resiliency, and Standard Resiliency using Azure PowerShell. You'll learn how to check the status, update, delete, or deprovision a circuit using PowerShell cmdlets.",2026-03-16T08:00:00.000Z,quickstart,,0.2,False,"Duplicate of index 2: PowerShell quickstart for ExpressRoute circuits. It is a basic how-to guide without structured configuration parameter documentation or product-specific troubleshooting/error mappings, so it doesn’t fit any expert-knowledge sub-skill type.",unchanged diff --git a/products/azure-expressroute/report.md b/products/azure-expressroute/report.md index 677789d9..11b44569 100644 --- a/products/azure-expressroute/report.md +++ b/products/azure-expressroute/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-12' +generated_at: '2026-04-26' category_descriptions: limits-quotas: ExpressRoute bandwidth, route, and gateway limits, FastPath constraints, rate limiting on provider circuits, and how to monitor advertised routes to stay
- configuration: How to configure and manage ExpressRoute circuits, peerings, VNets, - gateways, routing/BGP, NAT, IPv6, monitoring, resiliency, and Global Reach using - portal, PowerShell, and CLI + configuration: Configuring and managing ExpressRoute circuits, peerings, gateways, + routing, NAT, BFD, IPv6, Global Reach, monitoring, resiliency, and linking VNets + using portal, PowerShell, and CLI. architecture-patterns: Designing resilient, highly available ExpressRoute topologies, multi-circuit routing, coexistence with S2S VPN, DR/backup patterns, and using Microsoft peering for PSTN services. @@ -30,17 +30,17 @@ category_descriptions: skill_description: Expert knowledge for Azure ExpressRoute development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when - designing ExpressRoute circuits/gateways, BGP routing, Global Reach, FastPath, or - S2S VPN coexistence, and other Azure ExpressRoute related development tasks. Not - for Azure Internet Peering (use azure-internet-peering), Azure Peering Service (use - azure-peering-service), Azure Virtual WAN (use azure-virtual-wan), Azure VPN Gateway - (use azure-vpn-gateway). + designing ExpressRoute circuits/gateways, BGP routing, Global Reach, encryption + (IPsec/MACsec), or automation, and other Azure ExpressRoute related development + tasks. Not for Azure Internet Peering (use azure-internet-peering), Azure Virtual + WAN (use azure-virtual-wan), Azure VPN Gateway (use azure-vpn-gateway), Azure Virtual + Network (use azure-virtual-network). use_when: Use when designing ExpressRoute circuits/gateways, BGP routing, Global Reach, - FastPath, or S2S VPN coexistence, and other Azure ExpressRoute related development + encryption (IPsec/MACsec), or automation, and other Azure ExpressRoute related development tasks. 
confusable_not_for: Not for Azure Internet Peering (use azure-internet-peering), Azure - Peering Service (use azure-peering-service), Azure Virtual WAN (use azure-virtual-wan), - Azure VPN Gateway (use azure-vpn-gateway). + Virtual WAN (use azure-virtual-wan), Azure VPN Gateway (use azure-vpn-gateway), + Azure Virtual Network (use azure-virtual-network). --- # Azure ExpressRoute Crawl Report @@ -54,8 +54,8 @@ confusable_not_for: Not for Azure Internet Peering (use azure-internet-peering), ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 0 -- **Unchanged**: 95 +- **Updated Pages**: 1 +- **Unchanged**: 94 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-expressroute/azure-expressroute.csv` @@ -76,6 +76,11 @@ confusable_not_for: Not for Azure Internet Peering (use azure-internet-peering), ## Changes +### Updated Pages + +- [Azure portal](https://learn.microsoft.com/en-us/azure/expressroute/expressroute-howto-add-gateway-portal-resource-manager) + - Updated: 2025-11-18T17:01:00.000Z → 2026-04-24T17:42:00.000Z + ## Classified Pages | TOC Title | Type | Confidence | Reason | @@ -99,7 +104,6 @@ confusable_not_for: Not for Azure Internet Peering (use azure-internet-peering), | [Azure PowerShell](https://learn.microsoft.com/en-us/azure/expressroute/expressroute-howto-coexist-resource-manager) | configuration | 0.70 | How-to for coexistence of ExpressRoute and site-to-site VPN using PowerShell; includes product-specific gateway configuration steps and parameters beyond generic knowledge. | | [Azure PowerShell](https://learn.microsoft.com/en-us/azure/expressroute/expressroute-howto-reset-peering) | configuration | 0.70 | Shows PowerShell commands and parameters to enable/disable ExpressRoute peerings and describes resulting BGP session behavior, fitting configuration knowledge. 
| | [Azure PowerShell](https://learn.microsoft.com/en-us/azure/expressroute/expressroute-howto-routing-arm) | configuration | 0.70 | PowerShell article for routing/peering typically includes specific parameter names (peer ASN, VLAN ID, prefixes, routing configuration fields) and how they must be set for ExpressRoute circuits, which matches product-specific configuration knowledge. | -| [Azure portal](https://learn.microsoft.com/en-us/azure/expressroute/expressroute-howto-add-gateway-portal-resource-manager) | configuration | 0.70 | Portal-based configuration including gateway creation, SKU upgrades, and settings; contains concrete configuration parameters. | | [Azure portal](https://learn.microsoft.com/en-us/azure/expressroute/expressroute-howto-linkvnet-portal-resource-manager) | configuration | 0.70 | Portal how-to for creating VNet-to-ExpressRoute connections; includes specific connection settings and resource relationships. | | [Azure portal](https://learn.microsoft.com/en-us/azure/expressroute/expressroute-howto-linkvnet-portal-resource-manager) | configuration | 0.70 | Portal how-to for creating VNet-to-ExpressRoute connections; includes specific connection settings and resource relationships. | | [Azure portal](https://learn.microsoft.com/en-us/azure/expressroute/expressroute-howto-reset-peering-portal) | configuration | 0.70 | Details how enabling/disabling peerings affects BGP sessions on primary/secondary connections and uses specific portal configuration options, which is product-specific configuration behavior. | @@ -133,6 +137,7 @@ confusable_not_for: Not for Azure Internet Peering (use azure-internet-peering), | [Rate limit for service provider ports](https://learn.microsoft.com/en-us/azure/expressroute/provider-rate-limit) | limits-quotas | 0.70 | Explains how rate limiting works over service provider ports and how to monitor throughput and drops. 
Such content typically includes specific bandwidth thresholds, drop behaviors, and possibly per-circuit limits, which are numeric constraints and limits. | | [Roles and permissions](https://learn.microsoft.com/en-us/azure/expressroute/roles-permissions) | security | 0.70 | Explains required permissions across circuits, gateways, VNets, and IPs; likely lists specific RBAC roles and scopes. | | [Understand connectivity models](https://learn.microsoft.com/en-us/azure/expressroute/expressroute-connectivity-models) | decision-making | 0.70 | Compares four specific connectivity models (CloudExchange, point-to-point, IPVPN, ExpressRoute Direct) and guides selection with provider options; product-specific decision guidance. | +| [Azure portal](https://learn.microsoft.com/en-us/azure/expressroute/expressroute-howto-add-gateway-portal-resource-manager) | configuration | 0.68 | Step-by-step portal guide that includes product-specific configuration options for ExpressRoute virtual network gateways (gateway types, SKUs, and settings). While largely procedural, it encodes concrete configuration choices and option names unique to ExpressRoute gateways rather than just conceptual overview content. | | [Azure CLI](https://learn.microsoft.com/en-us/azure/expressroute/howto-circuit-cli) | integrations | 0.65 | Duplicate of index 11; CLI commands and parameters for ExpressRoute circuits are product-specific integration patterns. | | [Azure PowerShell](https://learn.microsoft.com/en-us/azure/expressroute/expressroute-howto-linkvnet-arm) | configuration | 0.65 | PowerShell-based configuration of VNet links to ExpressRoute circuits, including update operations; concrete configuration details. | | [Azure PowerShell](https://learn.microsoft.com/en-us/azure/expressroute/expressroute-howto-set-global-reach) | configuration | 0.65 | PowerShell-based configuration of Global Reach; contains product-specific cmdlets and parameter usage. 
| diff --git a/products/azure-extended-zones/azure-extended-zones.csv b/products/azure-extended-zones/azure-extended-zones.csv index ead2c111..c00a9325 100644 --- a/products/azure-extended-zones/azure-extended-zones.csv +++ b/products/azure-extended-zones/azure-extended-zones.csv @@ -7,7 +7,7 @@ https://learn.microsoft.com/en-us/azure/extended-zones/backup-virtual-machine,Ba https://learn.microsoft.com/en-us/azure/extended-zones/create-azure-policy,Create a custom Azure Policy in an Extended Zone,Deploy a Custom Azure Policy in an Azure Extended Zone,Create custom Azure Policy definitions for Extended Zones,Learn how to deploy a custom Azure policy in an Azure extended zone.,"In this article, you learn how to create and deploy a custom Azure policy in an Azure extended zone. Note Azure Policy is supported in Azure Extended Zones with custom policies. Built-in Azure Policy definitions aren't supported in extended zones yet. To enforce governance in extended zones, you must create and deploy custom Azure Policy definitions that are tailored to the unique characteristics of these zones. Examples are extendedLocation, extendedLocation.name, and extendedLocation.type. 
You mi",2026-03-31T06:10:00.000Z,how-to,configuration,0.8,True,"Explains that only custom policies are supported and references specific policy fields like extendedLocation, extendedLocation.name, and extendedLocation.type; these are product-specific configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/extended-zones/create-storage-account,Deploy a storage account in an Extended Zone,Deploy a Storage Account in an Azure Extended Zone,,Learn how to deploy a storage account in an Azure extended zone.,"In this article, you learn how to create an Azure storage account in an extended zone.",2026-03-31T06:10:00.000Z,how-to,,0.4,False,Tutorial for creating a storage account in an extended zone; likely basic wizard steps without configuration matrices or quotas specific enough to qualify.,unchanged https://learn.microsoft.com/en-us/azure/extended-zones/deploy-aks-cluster,Deploy an AKS cluster in an Extended Zone,Deploy an Azure Kubernetes Service (AKS) Cluster in an Extended Zone,,Learn how to deploy an Azure Kubernetes Service (AKS) cluster in an Azure extended zone by using the Azure portal.,"Azure Kubernetes Service (AKS) is a managed Kubernetes service that lets you quickly deploy and manage clusters. In this article, you learn how to create an AKS cluster in extended zones. 
If you don't have an Azure subscription, create a free account before you begin.",2026-03-31T06:10:00.000Z,how-to,,0.4,False,"Tutorial for creating an AKS cluster in an extended zone via portal; summary suggests step-by-step guidance without detailed config tables, limits, or specialized patterns.",unchanged -https://learn.microsoft.com/en-us/azure/extended-zones/deploy-azure-firewall,Deploy Azure Firewall in an Extended Zone (Preview),Deploy Azure Firewall in Azure Extended Zones,Configure Azure Firewall deployment in Extended Zones,"Learn how to deploy Azure Firewall in Azure Extended Zones using ARM templates, including routing configuration, firewall rules, and deployment validation.","In this article, you learn how to deploy Azure Firewall in Azure Extended Zones using ARM templates. It provides setup instructions, including ARM template snippets and deployment validation steps. Azure Firewall in Azure Extended Zones behaves the same as Azure Firewall in global Azure regions — same SKUs (Standard and Premium), Firewall Policy and rule collections, autoscaling, and availability. The difference is in the setup and deployment. The firewall and its associated resources are created wi",2026-04-17T22:08:00.000Z,how-to,configuration,0.68,True,"The article includes ARM template snippets and specific routing and firewall rule configuration details for Azure Firewall in Azure Extended Zones. 
These are product-specific configuration patterns and parameters (template properties, routing setup, validation steps) that go beyond generic deployment guidance and represent expert knowledge about how this service must be configured in this specialized environment.",new +https://learn.microsoft.com/en-us/azure/extended-zones/deploy-azure-firewall,Deploy Azure Firewall in an Extended Zone (Preview),Deploy Azure Firewall in Azure Extended Zones,Configure Azure Firewall deployment in Extended Zones,"Learn how to deploy Azure Firewall in Azure Extended Zones using ARM templates, including routing configuration, firewall rules, and deployment validation.","In this article, you learn how to deploy Azure Firewall in Azure Extended Zones using ARM templates. It provides setup instructions, including ARM template snippets and deployment validation steps. Azure Firewall in Azure Extended Zones behaves the same as Azure Firewall in global Azure regions — same SKUs (Standard and Premium), Firewall Policy and rule collections, autoscaling, and availability. The difference is in the setup and deployment. The firewall and its associated resources are created wi",2026-04-17T22:08:00.000Z,how-to,configuration,0.68,True,"The article includes ARM template snippets and specific routing and firewall rule configuration details for Azure Firewall in Azure Extended Zones. 
These are product-specific configuration patterns and parameters (template properties, routing setup, validation steps) that go beyond generic deployment guidance and represent expert knowledge about how this service must be configured in this specialized environment.",unchanged https://learn.microsoft.com/en-us/azure/extended-zones/deploy-vm-arm-template,Deploy a VM in an Extended Zone - ARM template,Deploy a virtual machine in an Extended Zone - ARM template,,Learn how to deploy a virtual machine in an Azure Extended Zone using an Azure Resource Manager template (ARM template).,"In this quickstart, you learn how to deploy a virtual machine (VM) in an Extended Zone using an Azure Resource Manager template (ARM template). An Azure Resource Manager template is a JavaScript Object Notation (JSON) file that defines the infrastructure and configuration for your project. The template uses declarative syntax. You describe your intended deployment without writing the sequence of programming commands to create the deployment. You can also complete this quickstart using the Azure por",2026-02-25T08:00:00.000Z,quickstart-arm,,0.2,False,"ARM template deployment quickstart; focuses on example template usage rather than exhaustive configuration references, limits, or best-practice guidance.",unchanged https://learn.microsoft.com/en-us/azure/extended-zones/deploy-vm-cli,Deploy a VM in an Extended Zone - Azure CLI,Deploy a virtual machine in an Extended Zone using Azure CLI,,Learn how to deploy a virtual machine in an Azure Extended Zone using Azure CLI.,"In this quickstart, you learn how to deploy a virtual machine (VM) in an Extended Zone using Azure CLI. An Azure Resource Manager template is a JavaScript Object Notation (JSON) file that defines the infrastructure and configuration for your project. The template uses declarative syntax. You describe your intended deployment without writing the sequence of programming commands to create the deployment. 
You can also complete this quickstart using the Azure portal. If you don't have an Azure subscrip",2025-09-25T22:12:00.000Z,quickstart,,0.2,False,"Quickstart for deploying a VM using CLI/ARM; primarily tutorial content, not a catalog of configuration options, limits, or decision criteria.",unchanged https://learn.microsoft.com/en-us/azure/extended-zones/deploy-vm-portal,Deploy a VM in an Extended Zone - Portal,Quickstart: Deploy a virtual machine in an Extended Zone - Azure portal,,Learn how to deploy a virtual machine (VM) in an Azure Extended Zone using the Azure portal.,"In this quickstart, you learn how to deploy a virtual machine (VM) to an Extended Zone using the Azure portal. Note Access the Extended Zones-specific VM Creation Portal experience here to follow along this quickstart tutorial. If you don't have an Azure subscription, create a free account before you begin.",2025-12-08T08:00:00.000Z,quickstart,,0.2,False,"Quickstart for deploying a VM via portal; likely step-by-step UI instructions without detailed configuration parameter tables, limits, or product-specific troubleshooting mappings.",unchanged diff --git a/products/azure-extended-zones/report.md b/products/azure-extended-zones/report.md index 4d0574c7..0f26369f 100644 --- a/products/azure-extended-zones/report.md +++ b/products/azure-extended-zones/report.md @@ -34,9 +34,9 @@ confusable_not_for: Not for Azure Reliability (use azure-reliability), Azure Res - **Unclassified**: 12 ### Incremental Update -- **New Pages**: 1 +- **New Pages**: 0 - **Updated Pages**: 0 -- **Unchanged**: 17 +- **Unchanged**: 18 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-extended-zones/azure-extended-zones.csv` @@ -52,10 +52,6 @@ confusable_not_for: Not for Azure Reliability (use azure-reliability), Azure Res ## Changes -### New Pages - -- [Deploy Azure Firewall in an Extended Zone (Preview)](https://learn.microsoft.com/en-us/azure/extended-zones/deploy-azure-firewall) - ## 
Classified Pages | TOC Title | Type | Confidence | Reason | diff --git a/products/azure-external-attack-surface-management/azure-external-attack-surface-management.csv b/products/azure-external-attack-surface-management/azure-external-attack-surface-management.csv index 0778c8fd..036c39c1 100644 --- a/products/azure-external-attack-surface-management/azure-external-attack-surface-management.csv +++ b/products/azure-external-attack-surface-management/azure-external-attack-surface-management.csv @@ -1,23 +1,23 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/asn-asset-filters,ASN asset filters,ASN asset filters - Defender ASN domain asset filters,Use ASN asset filters in Defender EASM inventory,"This article outlines the filter functionality available in Microsoft Defender External Attack Surface Management for ASN assets specifically, including operators and applicable field values.",These filters specifically apply to ASN assets. Use these filters when searching for a specific ASN or group of ASNs.,2023-04-17T23:20:00.000Z,how-to,configuration,0.8,True,"ASN asset filters article will specify ASN-related filter fields and operators, a product-specific configuration reference.",unchanged -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/contact-asset-filters,Contact asset filters,Contact asset filters - Defender EASM contact asset filters,Use contact asset filters in Defender EASM inventory,"This article outlines the filter functionality available in Microsoft Defender External Attack Surface Management for contact assets specifically, including operators and applicable field values.",These filters specifically apply to contact assets. 
Use these filters when searching for a specific contact.,2023-04-17T23:20:00.000Z,how-to,configuration,0.8,True,"Contact asset filters page describes specific filter fields and operators for contacts, which are configuration parameters unique to this product.",unchanged -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/data-connections,Leveraging data connections,Defender EASM data connections,Configure Defender EASM data connections to Log Analytics and ADX,The data connector sends Defender EASM asset data to Log Analytics and Azure Data Explorer. You can export Defender EASM data to either tool.,This article discusses the data connections feature in Microsoft Defender External Attack Surface Management (Defender EASM).,2025-07-22T17:26:00.000Z,how-to,integrations,0.7,True,"Data connector configuration to Log Analytics and Azure Data Explorer usually includes connector names, parameters, and schema details that are product-specific integration settings.",unchanged -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/deploying-the-defender-easm-azure-resource,Deploying the Defender EASM Azure resource,Create a Defender EASM Azure resource,,This article explains how to create a Microsoft Defender External Attack Surface Management (Defender EASM) Azure resource by using the Azure portal.,"This article explains how to create a Microsoft Defender External Attack Surface Management (Defender EASM) Azure resource by using the Azure portal. Users can begin their usage of Defender EASM with a 30-day free trial. Once the trial is nearing expiration, you will be notified via banners and push notifications. 
Creating the Defender EASM Azure resource involves two steps:",2023-10-11T22:21:00.000Z,quickstart,,0.4,False,"Step-by-step creation of a Defender EASM Azure resource and trial info; summary does not indicate deployment matrices, limits, or detailed config tables beyond standard portal steps.",unchanged -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/discovering-your-attack-surface,Discovering your attack surface,Discovering your attack surface,,"Microsoft has preemptively configured the attack surfaces of many organizations, mapping their initial attack surface by discovering infrastructure that’s connected to known assets.",,2023-04-17T23:20:00.000Z,tutorial,,0.25,False,"Describes preconfigured attack surfaces conceptually; no evidence of numeric limits, configuration parameters, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/domain-asset-filters,Domain asset filters,Domain asset filters - Defender EASM domain asset filters,Apply domain asset filters in Defender EASM inventory,"This article outlines the filter functionality available in Microsoft Defender External Attack Surface Management for domain assets specifically, including operators and applicable field values.",These filters specifically apply to domain assets. 
Use these filters when searching for a specific subset of domain assets.,2023-04-17T23:20:00.000Z,how-to,configuration,0.8,True,"Domain-specific filter article will list field names, operators, and valid values for domain assets, matching configuration-parameter style documentation.",unchanged -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/easm-copilot,Microsoft Security Copilot and Defender EASM,Microsoft Security Copilot in Defender EASM,,Learn how to use Microsoft Security Copilot to get information about your Microsoft Defender External Attack Surface Management (Defender EASM) data.,"Microsoft Defender External Attack Surface Management (Defender EASM) continuously discovers and maps your digital attack surface to provide an external view of your online infrastructure. This visibility helps security and IT teams identify unknowns, prioritize risk, eliminate threats, and extend vulnerability and exposure control beyond the firewall. Attack surface insights are generated by analyzing vulnerability and infrastructure data to showcase the key areas of concern for your organizati",2024-12-10T18:02:00.000Z,conceptual,,0.35,False,"Explains using Security Copilot with Defender EASM data at a high level; summary does not show concrete API parameters, config tables, or error codes.",unchanged -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/host-asset-filters,Host asset filters,Host asset filters - Defender EASM host asset filters,Apply host asset filters in Defender EASM inventory,"This article outlines the filter functionality available in Microsoft Defender External Attack Surface Management for host assets specifically, including operators and applicable field values.",These filters specifically apply to host assets. 
Use these filters when searching for a specific subset of host assets.,2023-08-02T11:22:00.000Z,how-to,configuration,0.8,True,"Host asset filters page focuses on filter fields and operators for hosts, which are product-specific configuration options.",unchanged -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/inventory-filters,Inventory filters overview,Inventory filters overview - Defender EASM inventory filters overview,Use Defender EASM inventory filters and saved queries,This article outlines the filter functionality available in Defender EASM to help you find specific subsets of inventory assets based on selected parameters.,This article outlines the filter functionality available in Microsoft Defender External Attack Surface Management (Defender EASM). Filtering helps you find specific subsets of inventory assets based on selected parameters. This article outlines each filter and operator and provides guidance on input options that yield the best results. It also explains how to save queries for easy accessibility to the filtered results.,2023-07-11T22:03:00.000Z,how-to,configuration,0.8,True,"Outlines each filter and operator with guidance on input options; this implies tables of filter names, operators, and allowed values, which are concrete configuration parameters.",unchanged -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/ip-address-asset-filters,IP address asset filters,IP address asset filters - Defender EASM IP address asset filters,Use IP address asset filters in Defender EASM,"This article outlines the filter functionality available in Microsoft Defender External Attack Surface Management for IP address assets specifically, including operators and applicable field values.",These filters specifically apply to IP address assets. 
Use these filters when searching for a specific subset of IPs.,2023-07-19T17:27:00.000Z,how-to,configuration,0.8,True,"IP address asset filters article defines filter fields (IP ranges, status, etc.) and operators, which are product-specific configuration options.",unchanged -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/ip-block-asset-filters,IP block asset filters,IP block asset filters - Defender EASM IP block asset filters,Use IP block asset filters in Defender EASM,"This article outlines the filter functionality available in Microsoft Defender External Attack Surface Management for IP block assets specifically, including operators and applicable field values.",These filters specifically apply to IP block assets. Use these filters when searching for a specific subset of IP blocks.,2023-04-17T23:20:00.000Z,how-to,configuration,0.8,True,"IP block asset filters page similarly documents filter fields and operators for IP blocks, fitting configuration criteria.",unchanged -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/modifying-inventory-assets,Modifying inventory assets,Modify inventory assets,,This article outlines how to update assets with customized text labels to categorize and make use of inventory data.,"This article outlines how to modify inventory assets. You can change the state of an asset, assign an external ID or apply labels to help provide context and use inventory data. You can also mark CVEs and other observations as non-applicable to remove them from your reported counts. You can also remove assets from their inventory in bulk based on the method with which they are discovered. 
For instance, users can remove a seed from a discovery group and elect to remove any assets that are discove",2024-09-26T22:04:00.000Z,how-to,,0.35,False,"Covers modifying inventory assets (states, labels, non-applicable CVEs) as workflow guidance; no specific config tables or error mappings evident.",unchanged -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/overview,Overview,Defender EASM Overview,,Learn how Microsoft Defender External Attack Surface Management (Defender EASM) continuously discovers and maps your digital attack surface to give you an external view of your online infrastructure.,"Microsoft Defender External Attack Surface Management (Defender EASM) continuously discovers and maps your digital attack surface to give you an external view of your online infrastructure. Defender EASM gives your security and IT teams essential visibility to help them identify unknowns, prioritize risk, eliminate threats, and extend control of vulnerabilities and exposure beyond the firewall. Attack surface insights are generated by using vulnerability and infrastructure data to showcase key a",2026-03-10T22:11:00.000Z,conceptual,,0.1,False,"High-level product overview of Microsoft Defender EASM without specific limits, configuration parameters, error codes, or decision matrices; primarily conceptual and marketing-style description.",unchanged -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/page-asset-filters,Page asset filters,Page asset filters - Defender EASM page asset filters,Apply page asset filters in Defender EASM inventory,"This article outlines the filter functionality available in Microsoft Defender External Attack Surface Management for page assets specifically, including operators and applicable field values.",These filters specifically apply to page assets. 
Use these filters when searching for a specific subset of page assets.,2023-07-19T17:27:00.000Z,how-to,configuration,0.8,True,"Page asset filters article will enumerate filterable fields and operators for page assets, a configuration reference.",unchanged -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/policy-engine,Policy engine automation,Policy engine automation,Configure Defender EASM policy engine automation rules,Automate inventory curation by leveraging the policy engine to proactively implement certain actions based on predetermined parameters.,"The policy engine enables Defender External Attack Surface Management (Defender EASM) users to automate certain actions based on predetermined parameters. You can elect to label assets or change their states based on highly flexible query parameters to automate the curation of your attack surface. Once defined, policies run automatically to ensure that your inventory is categorized according to your specific needs on a recurrent basis.  With the policy engine, you can apply business context to ",2024-06-18T22:05:00.000Z,how-to,configuration,0.65,True,"Describes defining policies based on flexible query parameters to label or change asset states; likely includes specific policy fields, operators, and allowed values, which are product-specific configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/ssl-certificate-asset-filters,SSL certificate asset filters,SSL certificate asset filters - Defender EASM SSL certificate asset filters,Use SSL certificate asset filters in Defender EASM,"This article outlines the filter functionality available in Microsoft Defender External Attack Surface Management for SSL certificate assets specifically, including operators and applicable field valu","These filters specifically apply to SSL certificate assets. 
Use these filters when searching for a specific cert, or select group of certs.",2023-04-17T23:20:00.000Z,how-to,configuration,0.8,True,SSL certificate asset filters will list certificate-related fields and operators; this is a structured configuration reference.,unchanged -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/understanding-asset-details,Understanding asset details,Understand asset details,,Learn how Microsoft Defender External Attack Surface Management discovers and defines your organization's internet-exposed attack surface.,"Microsoft Defender External Attack Surface Management (Defender EASM) frequently scans all inventory assets and collects robust contextual metadata that powers Attack Surface Insights. This data can also be viewed more granularly on the asset details page. The data that's provided changes depending on the asset type. For instance, the platform provides unique Whois data for domains, hosts, and IP addresses. It provides signature algorithm data for Secure Sockets Layer (SSL) certificates. This ar",2024-09-26T22:04:00.000Z,how-to,,0.35,False,"Describes asset details and metadata types (Whois, SSL info) but summary suggests UI/field description, not config tables or quotas.",unchanged -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/understanding-billable-assets,Understand billable assets,Understand billable assets - Microsoft Defender EASM,Understand Defender EASM billing and billable asset counts,"This article describes how users are billed for their Defender EASM resource usage, and guides them to the dashboard that displays their counts.","When customers create their first Microsoft Defender External Attack Surface Management (Defender EASM) resource, they're automatically granted a 30-day free trial. Once the trial completes, customers are automatically charged based on their count of billable assets. 
The charged amount appears on their core Azure billing, with “Defender EASM” appearing as separate line item on their invoice.",2023-04-18T17:34:00.000Z,how-to,limits-quotas,0.7,True,Explains billing based on billable asset counts and a 30-day free trial; such pages typically define what constitutes a billable asset and may include thresholds or counting rules that are numeric and product-specific.,unchanged -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/understanding-dashboards,Understanding dashboards,Understanding dashboards,,Microsoft Defender External Attack Surface Management (Defender EASM) offers a series of four dashboards designed to help users quickly surface valuable insights derived from their Attack Surface inve,"Microsoft Defender External Attack Surface Management (Defender EASM) offers a series of four dashboards designed to help users quickly surface valuable insights derived from their Approved inventory. These dashboards help organizations prioritize the vulnerabilities, risks and compliance issues that pose the greatest threat to their Attack Surface, making it easy to quickly mitigate key issues. Defender EASM provides eight dashboards:",2024-08-27T22:05:00.000Z,how-to,,0.25,False,"Dashboard overview and purpose; no indication of numeric thresholds, decision matrices, or product-specific troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/understanding-inventory-assets,Understanding inventory assets,Understanding Inventory Assets,,Learn how Microsoft Defender External Attack Surface Management (Defender EASM) uses proprietary discovery technology to recursively searches for infrastructure with observed connections to known legi,Microsoft Defender External Attack Surface Management (Defender EASM) uses Microsoft proprietary discovery technology to recursively search for infrastructure through observed connections to known legitimate assets (discovery seeds). 
It makes inferences about that infrastructure's relationship to the organization to uncover previously unknown and unmonitored properties. Defender EASM discovery includes the following kinds of assets: These asset types make up your attack surface inventory in Defe,2024-12-10T18:02:00.000Z,conceptual,,0.3,False,"Describes asset discovery conceptually; no detailed configuration tables, limits, or troubleshooting content indicated.",unchanged -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/using-and-managing-discovery,Using and managing discovery,Use and manage discovery,,Learn how to use Microsoft Defender External Attack Surface Management to discover and manage your organization's internet-exposed attack surface.,"Microsoft Defender External Attack Surface Management (Defender EASM) relies on proprietary discovery technology to continuously define your organization's unique internet-exposed attack surface. Discovery scans the internet for assets owned by your organization to uncover previously unknown and unmonitored properties. Discovered assets are indexed in your inventory to provide a dynamic system of record of web applications, third-party dependencies, and web infrastructure under your organization",2023-12-18T23:04:00.000Z,how-to,,0.35,False,Explains using and managing discovery at a feature level; summary does not show concrete config parameters or limits.,unchanged -https://learn.microsoft.com/en-us/azure/external-attack-surface-management/what-is-discovery,What is Discovery?,What Is Discovery?,,Learn how Microsoft Defender External Attack Surface Management (Defender EASM) uses proprietary discovery technology to continuously define your organization’s unique internet-exposed attack surface.,"Microsoft Defender External Attack Surface Management (Defender EASM) uses Microsoft proprietary discovery technology to continuously define your organization’s unique internet-exposed attack surface. 
The Defender EASM discovery feature scans known assets that are owned by your organization to uncover previously unknown and unmonitored properties. Discovered assets are indexed in your organization's inventory. Defender EASM gives you a dynamic system of record for web applications, third-party d",2024-12-10T18:02:00.000Z,concept-article,,0.3,False,"Explains what discovery is and how it works conceptually; lacks numeric limits, config parameters, or decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/external-attack-surface-management/asn-asset-filters,ASN asset filters,ASN asset filters - Defender ASN domain asset filters,,"This article outlines the filter functionality available in Microsoft Defender External Attack Surface Management for ASN assets specifically, including operators and applicable field values.",These filters specifically apply to ASN assets. Use these filters when searching for a specific ASN or group of ASNs.,2026-04-24T17:58:00.000Z,how-to,,0.45,False,"ASN asset filters article is described as applying to ASN assets; the summary does not include specific filter fields, operators, or constraints.",updated +https://learn.microsoft.com/en-us/azure/external-attack-surface-management/contact-asset-filters,Contact asset filters,Contact asset filters - Defender EASM contact asset filters,,"This article outlines the filter functionality available in Microsoft Defender External Attack Surface Management for contact assets specifically, including operators and applicable field values.",These filters specifically apply to contact assets. 
Use these filters when searching for a specific contact.,2026-04-24T17:58:00.000Z,how-to,,0.45,False,"Contact asset filters article is summarized as applying to contacts; lacks explicit listing of filter fields, operators, or constraints in the provided text.",updated +https://learn.microsoft.com/en-us/azure/external-attack-surface-management/data-connections,Leveraging data connections,Defender EASM data connections,,The data connector sends Defender EASM asset data to Log Analytics and Azure Data Explorer. You can export Defender EASM data to either tool.,This article discusses the data connections feature in Microsoft Defender External Attack Surface Management (Defender EASM).,2026-04-24T17:58:00.000Z,how-to,,0.35,False,States that data can be exported to Log Analytics and Azure Data Explorer; summary lacks specific connector configuration parameters or schema details.,updated +https://learn.microsoft.com/en-us/azure/external-attack-surface-management/deploying-the-defender-easm-azure-resource,Deploying the Defender EASM Azure resource,Create a Defender EASM Azure resource,,This article explains how to create a Microsoft Defender External Attack Surface Management (Defender EASM) Azure resource by using the Azure portal.,"This article explains how to create a Microsoft Defender External Attack Surface Management (Defender EASM) Azure resource by using the Azure portal. Users can begin their usage of Defender EASM with a 30-day free trial. Once the trial is nearing expiration, you will be notified via banners and push notifications. 
Creating the Defender EASM Azure resource involves two steps:",2026-04-24T17:58:00.000Z,quickstart,,0.35,False,Explains how to create a Defender EASM Azure resource and mentions a 30-day trial; appears to be a basic setup tutorial without deployment matrices or tier-specific constraints.,updated +https://learn.microsoft.com/en-us/azure/external-attack-surface-management/discovering-your-attack-surface,Discovering your attack surface,Discovering your attack surface,,"Microsoft has preemptively configured the attack surfaces of many organizations, mapping their initial attack surface by discovering infrastructure that’s connected to known assets.",,2026-04-24T17:58:00.000Z,tutorial,,0.3,False,"Describes that Microsoft preconfigures attack surfaces for many organizations; high-level discovery behavior without detailed configuration, limits, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/external-attack-surface-management/domain-asset-filters,Domain asset filters,Domain asset filters - Defender EASM domain asset filters,,"This article outlines the filter functionality available in Microsoft Defender External Attack Surface Management for domain assets specifically, including operators and applicable field values.",These filters specifically apply to domain assets. 
Use these filters when searching for a specific subset of domain assets.,2026-04-24T17:58:00.000Z,how-to,,0.45,False,"Domain asset filters article is likely a reference, but the provided summary only states that filters apply to domain assets, without exposing concrete field names, operators, or constraints.",updated +https://learn.microsoft.com/en-us/azure/external-attack-surface-management/easm-copilot,Microsoft Security Copilot and Defender EASM,Microsoft Security Copilot in Defender EASM,,Learn how to use Microsoft Security Copilot to get information about your Microsoft Defender External Attack Surface Management (Defender EASM) data.,"Microsoft Defender External Attack Surface Management (Defender EASM) continuously discovers and maps your digital attack surface to provide an external view of your online infrastructure. This visibility helps security and IT teams identify unknowns, prioritize risk, eliminate threats, and extend vulnerability and exposure control beyond the firewall. Attack surface insights are generated by analyzing vulnerability and infrastructure data to showcase the key areas of concern for your organizati",2026-04-24T17:58:00.000Z,integration,,0.3,False,"Overview of using Microsoft Security Copilot with Defender EASM; summary does not show specific prompts, configuration parameters, or error/diagnostic details.",updated +https://learn.microsoft.com/en-us/azure/external-attack-surface-management/host-asset-filters,Host asset filters,Host asset filters - Defender EASM host asset filters,,"This article outlines the filter functionality available in Microsoft Defender External Attack Surface Management for host assets specifically, including operators and applicable field values.",These filters specifically apply to host assets. 
Use these filters when searching for a specific subset of host assets.,2026-04-24T17:58:00.000Z,how-to,,0.45,False,"Host asset filters article is referenced at a high level; summary does not show specific filter fields, operators, or allowed values.",updated +https://learn.microsoft.com/en-us/azure/external-attack-surface-management/inventory-filters,Inventory filters overview,Inventory filters overview - Defender EASM inventory filters overview,,This article outlines the filter functionality available in Defender EASM to help you find specific subsets of inventory assets based on selected parameters.,This article outlines the filter functionality available in Microsoft Defender External Attack Surface Management (Defender EASM). Filtering helps you find specific subsets of inventory assets based on selected parameters. This article outlines each filter and operator and provides guidance on input options that yield the best results. It also explains how to save queries for easy accessibility to the filtered results.,2026-04-24T17:58:00.000Z,how-to,,0.45,False,"Outlines filter functionality and operators conceptually; while product-specific, the summary does not indicate detailed parameter tables or numeric constraints that qualify as expert configuration knowledge.",updated +https://learn.microsoft.com/en-us/azure/external-attack-surface-management/ip-address-asset-filters,IP address asset filters,IP address asset filters - Defender EASM IP address asset filters,,"This article outlines the filter functionality available in Microsoft Defender External Attack Surface Management for IP address assets specifically, including operators and applicable field values.",These filters specifically apply to IP address assets. 
Use these filters when searching for a specific subset of IPs.,2026-04-24T17:58:00.000Z,how-to,,0.45,False,"IP address asset filters article is described at a high level; summary does not expose specific filter names, operators, or allowed values.",updated +https://learn.microsoft.com/en-us/azure/external-attack-surface-management/ip-block-asset-filters,IP block asset filters,IP block asset filters - Defender EASM IP block asset filters,,"This article outlines the filter functionality available in Microsoft Defender External Attack Surface Management for IP block assets specifically, including operators and applicable field values.",These filters specifically apply to IP block assets. Use these filters when searching for a specific subset of IP blocks.,2026-04-24T17:58:00.000Z,how-to,,0.45,False,IP block asset filters article is summarized generically; no explicit expert configuration details or parameter tables are present in the summary.,updated +https://learn.microsoft.com/en-us/azure/external-attack-surface-management/modifying-inventory-assets,Modifying inventory assets,Modify inventory assets,,This article outlines how to update assets with customized text labels to categorize and make use of inventory data.,"This article outlines how to modify inventory assets. You can change the state of an asset, assign an external ID or apply labels to help provide context and use inventory data. You can also mark CVEs and other observations as non-applicable to remove them from your reported counts. You can also remove assets from their inventory in bulk based on the method with which they are discovered. 
For instance, users can remove a seed from a discovery group and elect to remove any assets that are discove",2026-04-24T17:58:00.000Z,how-to,,0.35,False,"Describes modifying inventory assets (states, labels, non-applicable CVEs) as workflow guidance; no specific configuration parameter tables or numeric constraints.",updated +https://learn.microsoft.com/en-us/azure/external-attack-surface-management/overview,Overview,Defender EASM Overview,,Learn how Microsoft Defender External Attack Surface Management (Defender EASM) continuously discovers and maps your digital attack surface to give you an external view of your online infrastructure.,"Microsoft Defender External Attack Surface Management (Defender EASM) continuously discovers and maps your digital attack surface to give you an external view of your online infrastructure. Defender EASM gives your security and IT teams essential visibility to help them identify unknowns, prioritize risk, eliminate threats, and extend control of vulnerabilities and exposure beyond the firewall. Attack surface insights are generated by using vulnerability and infrastructure data to showcase key a",2026-04-24T17:58:00.000Z,overview,,0.2,False,"High-level product overview of Defender EASM capabilities and benefits without concrete configuration parameters, limits, or detailed technical patterns.",updated +https://learn.microsoft.com/en-us/azure/external-attack-surface-management/page-asset-filters,Page asset filters,Page asset filters - Defender EASM page asset filters,,"This article outlines the filter functionality available in Microsoft Defender External Attack Surface Management for page assets specifically, including operators and applicable field values.",These filters specifically apply to page assets. 
Use these filters when searching for a specific subset of page assets.,2026-04-24T17:58:00.000Z,how-to,,0.45,False,Page asset filters article is described generically; no explicit expert-level configuration details or parameter tables are visible in the summary.,updated +https://learn.microsoft.com/en-us/azure/external-attack-surface-management/policy-engine,Policy engine automation,Policy engine automation,,Automate inventory curation by leveraging the policy engine to proactively implement certain actions based on predetermined parameters.,"The policy engine enables Defender External Attack Surface Management (Defender EASM) users to automate certain actions based on predetermined parameters. You can elect to label assets or change their states based on highly flexible query parameters to automate the curation of your attack surface. Once defined, policies run automatically to ensure that your inventory is categorized according to your specific needs on a recurrent basis.  With the policy engine, you can apply business context to ",2026-04-24T17:58:00.000Z,how-to,,0.35,False,"Policy engine automation is described conceptually (label assets, change states based on queries); summary does not show concrete parameter tables or quantified best practices.",updated +https://learn.microsoft.com/en-us/azure/external-attack-surface-management/ssl-certificate-asset-filters,SSL certificate asset filters,SSL certificate asset filters - Defender EASM SSL certificate asset filters,,"This article outlines the filter functionality available in Microsoft Defender External Attack Surface Management for SSL certificate assets specifically, including operators and applicable field valu","These filters specifically apply to SSL certificate assets. 
Use these filters when searching for a specific cert, or select group of certs.",2026-04-24T17:58:00.000Z,how-to,,0.45,False,"SSL certificate asset filters article is likely detailed, but the summary only notes that it applies to certs; no concrete configuration or numeric constraints are shown.",updated +https://learn.microsoft.com/en-us/azure/external-attack-surface-management/understanding-asset-details,Understanding asset details,Understand asset details,,Learn how Microsoft Defender External Attack Surface Management discovers and defines your organization's internet-exposed attack surface.,"Microsoft Defender External Attack Surface Management (Defender EASM) frequently scans all inventory assets and collects robust contextual metadata that powers Attack Surface Insights. This data can also be viewed more granularly on the asset details page. The data that's provided changes depending on the asset type. For instance, the platform provides unique Whois data for domains, hosts, and IP addresses. It provides signature algorithm data for Secure Sockets Layer (SSL) certificates. This ar",2026-04-24T17:58:00.000Z,how-to,,0.3,False,"Describes asset details and metadata types (Whois, SSL signature algorithms) at a conceptual level; no specific settings, limits, or error mappings.",updated +https://learn.microsoft.com/en-us/azure/external-attack-surface-management/understanding-billable-assets,Understand billable assets,Understand billable assets - Microsoft Defender EASM,,"This article describes how users are billed for their Defender EASM resource usage, and guides them to the dashboard that displays their counts.","When customers create their first Microsoft Defender External Attack Surface Management (Defender EASM) resource, they're automatically granted a 30-day free trial. Once the trial completes, customers are automatically charged based on their count of billable assets. 
The charged amount appears on their core Azure billing, with “Defender EASM” appearing as a separate line item on their invoice.",2026-04-24T17:58:00.000Z,how-to,,0.4,False,"Mentions 30-day free trial and billing based on billable asset counts, but no detailed pricing tables, tier-specific limits, or numeric thresholds beyond the generic trial duration.",updated
+https://learn.microsoft.com/en-us/azure/external-attack-surface-management/understanding-dashboards,Understanding dashboards,Understanding dashboards,,Microsoft Defender External Attack Surface Management (Defender EASM) offers a series of four dashboards designed to help users quickly surface valuable insights derived from their Attack Surface inve,"Microsoft Defender External Attack Surface Management (Defender EASM) offers a series of four dashboards designed to help users quickly surface valuable insights derived from their Approved inventory. These dashboards help organizations prioritize the vulnerabilities, risks and compliance issues that pose the greatest threat to their Attack Surface, making it easy to quickly mitigate key issues. Defender EASM provides eight dashboards:",2026-04-24T17:58:00.000Z,how-to,,0.2,False,"Overview of dashboards and their purpose; does not include configuration tables, thresholds, or troubleshooting content.",updated
+https://learn.microsoft.com/en-us/azure/external-attack-surface-management/understanding-inventory-assets,Understanding inventory assets,Understanding Inventory Assets,,Learn how Microsoft Defender External Attack Surface Management (Defender EASM) uses proprietary discovery technology to recursively search for infrastructure with observed connections to known legi,Microsoft Defender External Attack Surface Management (Defender EASM) uses Microsoft proprietary discovery technology to recursively search for infrastructure through observed connections to known legitimate assets (discovery seeds). 
It makes inferences about that infrastructure's relationship to the organization to uncover previously unknown and unmonitored properties. Defender EASM discovery includes the following kinds of assets: These asset types make up your attack surface inventory in Defe,2026-04-24T17:58:00.000Z,concept-article,,0.3,False,"Describes what inventory assets are and how discovery works conceptually; no specific configuration tables, limits, or product-unique troubleshooting details.",updated +https://learn.microsoft.com/en-us/azure/external-attack-surface-management/using-and-managing-discovery,Using and managing discovery,Use and manage discovery,,Learn how to use Microsoft Defender External Attack Surface Management to discover and manage your organization's internet-exposed attack surface.,"Microsoft Defender External Attack Surface Management (Defender EASM) relies on proprietary discovery technology to continuously define your organization's unique internet-exposed attack surface. Discovery scans the internet for assets owned by your organization to uncover previously unknown and unmonitored properties. 
Discovered assets are indexed in your inventory to provide a dynamic system of record of web applications, third-party dependencies, and web infrastructure under your organization",2026-04-24T17:58:00.000Z,how-to,,0.3,False,"Explains how discovery is used and managed in general terms; no detailed configuration options, limits, or decision matrices are indicated.",updated +https://learn.microsoft.com/en-us/azure/external-attack-surface-management/what-is-discovery,What is Discovery?,What Is Discovery?,,Learn how Microsoft Defender External Attack Surface Management (Defender EASM) uses proprietary discovery technology to continuously define your organization’s unique internet-exposed attack surface.,"Microsoft Defender External Attack Surface Management (Defender EASM) uses Microsoft proprietary discovery technology to continuously define your organization’s unique internet-exposed attack surface. The Defender EASM discovery feature scans known assets that are owned by your organization to uncover previously unknown and unmonitored properties. Discovered assets are indexed in your organization's inventory. 
Defender EASM gives you a dynamic system of record for web applications, third-party d",2026-04-24T17:58:00.000Z,concept-article,,0.3,False,"Conceptual explanation of Defender EASM discovery; lacks numeric limits, decision matrices, or detailed configuration parameters.",updated diff --git a/products/azure-external-attack-surface-management/report.md b/products/azure-external-attack-surface-management/report.md index e71495cf..f4df05da 100644 --- a/products/azure-external-attack-surface-management/report.md +++ b/products/azure-external-attack-surface-management/report.md @@ -30,55 +30,93 @@ confusable_not_for: Not for Azure Defender For Cloud (use azure-defender-for-clo - **Total Pages**: 22 - **Fetched**: 22 - **Fetch Failed**: 0 -- **Classified**: 12 -- **Unclassified**: 10 +- **Classified**: 0 +- **Unclassified**: 22 ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 0 -- **Unchanged**: 22 +- **Updated Pages**: 22 +- **Unchanged**: 0 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-external-attack-surface-management/azure-external-attack-surface-management.csv` ## Classification Statistics -| Type | Count | Percentage | -|------|-------|------------| -| configuration | 10 | 45.5% | -| integrations | 1 | 4.5% | -| limits-quotas | 1 | 4.5% | -| *(Unclassified)* | 10 | 45.5% | +*No classified pages* ## Changes +### Updated Pages + +- [Overview](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/overview) + - Updated: 2026-03-10T22:11:00.000Z → 2026-04-24T17:58:00.000Z +- [Understanding inventory assets](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/understanding-inventory-assets) + - Updated: 2024-12-10T18:02:00.000Z → 2026-04-24T17:58:00.000Z +- [What is Discovery?](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/what-is-discovery) + - Updated: 2024-12-10T18:02:00.000Z → 2026-04-24T17:58:00.000Z +- [Understanding asset 
details](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/understanding-asset-details) + - Updated: 2024-09-26T22:04:00.000Z → 2026-04-24T17:58:00.000Z +- [Understanding dashboards](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/understanding-dashboards) + - Updated: 2024-08-27T22:05:00.000Z → 2026-04-24T17:58:00.000Z +- [Using and managing discovery](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/using-and-managing-discovery) + - Updated: 2023-12-18T23:04:00.000Z → 2026-04-24T17:58:00.000Z +- [Modifying inventory assets](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/modifying-inventory-assets) + - Updated: 2024-09-26T22:04:00.000Z → 2026-04-24T17:58:00.000Z +- [Policy engine automation](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/policy-engine) + - Updated: 2024-06-18T22:05:00.000Z → 2026-04-24T17:58:00.000Z +- [Understand billable assets](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/understanding-billable-assets) + - Updated: 2023-04-18T17:34:00.000Z → 2026-04-24T17:58:00.000Z +- [Leveraging data connections](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/data-connections) + - Updated: 2025-07-22T17:26:00.000Z → 2026-04-24T17:58:00.000Z +- [Inventory filters overview](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/inventory-filters) + - Updated: 2023-07-11T22:03:00.000Z → 2026-04-24T17:58:00.000Z +- [Domain asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/domain-asset-filters) + - Updated: 2023-04-17T23:20:00.000Z → 2026-04-24T17:58:00.000Z +- [Host asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/host-asset-filters) + - Updated: 2023-08-02T11:22:00.000Z → 2026-04-24T17:58:00.000Z +- [Page asset 
filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/page-asset-filters) + - Updated: 2023-07-19T17:27:00.000Z → 2026-04-24T17:58:00.000Z +- [Contact asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/contact-asset-filters) + - Updated: 2023-04-17T23:20:00.000Z → 2026-04-24T17:58:00.000Z +- [SSL certificate asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/ssl-certificate-asset-filters) + - Updated: 2023-04-17T23:20:00.000Z → 2026-04-24T17:58:00.000Z +- [IP address asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/ip-address-asset-filters) + - Updated: 2023-07-19T17:27:00.000Z → 2026-04-24T17:58:00.000Z +- [IP block asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/ip-block-asset-filters) + - Updated: 2023-04-17T23:20:00.000Z → 2026-04-24T17:58:00.000Z +- [ASN asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/asn-asset-filters) + - Updated: 2023-04-17T23:20:00.000Z → 2026-04-24T17:58:00.000Z +- [Deploying the Defender EASM Azure resource](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/deploying-the-defender-easm-azure-resource) + - Updated: 2023-10-11T22:21:00.000Z → 2026-04-24T17:58:00.000Z +- *...and 2 more* + ## Classified Pages -| TOC Title | Type | Confidence | Reason | -|-----------|------|------------|--------| -| [ASN asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/asn-asset-filters) | configuration | 0.80 | ASN asset filters article will specify ASN-related filter fields and operators, a product-specific configuration reference. 
| -| [Contact asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/contact-asset-filters) | configuration | 0.80 | Contact asset filters page describes specific filter fields and operators for contacts, which are configuration parameters unique to this product. | -| [Domain asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/domain-asset-filters) | configuration | 0.80 | Domain-specific filter article will list field names, operators, and valid values for domain assets, matching configuration-parameter style documentation. | -| [Host asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/host-asset-filters) | configuration | 0.80 | Host asset filters page focuses on filter fields and operators for hosts, which are product-specific configuration options. | -| [IP address asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/ip-address-asset-filters) | configuration | 0.80 | IP address asset filters article defines filter fields (IP ranges, status, etc.) and operators, which are product-specific configuration options. | -| [IP block asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/ip-block-asset-filters) | configuration | 0.80 | IP block asset filters page similarly documents filter fields and operators for IP blocks, fitting configuration criteria. | -| [Inventory filters overview](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/inventory-filters) | configuration | 0.80 | Outlines each filter and operator with guidance on input options; this implies tables of filter names, operators, and allowed values, which are concrete configuration parameters. 
| -| [Page asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/page-asset-filters) | configuration | 0.80 | Page asset filters article will enumerate filterable fields and operators for page assets, a configuration reference. | -| [SSL certificate asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/ssl-certificate-asset-filters) | configuration | 0.80 | SSL certificate asset filters will list certificate-related fields and operators; this is a structured configuration reference. | -| [Leveraging data connections](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/data-connections) | integrations | 0.70 | Data connector configuration to Log Analytics and Azure Data Explorer usually includes connector names, parameters, and schema details that are product-specific integration settings. | -| [Understand billable assets](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/understanding-billable-assets) | limits-quotas | 0.70 | Explains billing based on billable asset counts and a 30-day free trial; such pages typically define what constitutes a billable asset and may include thresholds or counting rules that are numeric and product-specific. | -| [Policy engine automation](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/policy-engine) | configuration | 0.65 | Describes defining policies based on flexible query parameters to label or change asset states; likely includes specific policy fields, operators, and allowed values, which are product-specific configuration details. 
| +*No classified pages* ## Unclassified Pages | TOC Title | Confidence | Reason | |-----------|------------|--------| -| [Deploying the Defender EASM Azure resource](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/deploying-the-defender-easm-azure-resource) | 0.40 | Step-by-step creation of a Defender EASM Azure resource and trial info; summary does not indicate deployment matrices, limits, or detailed config tables beyond standard portal steps. | -| [Microsoft Security Copilot and Defender EASM](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/easm-copilot) | 0.35 | Explains using Security Copilot with Defender EASM data at a high level; summary does not show concrete API parameters, config tables, or error codes. | -| [Modifying inventory assets](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/modifying-inventory-assets) | 0.35 | Covers modifying inventory assets (states, labels, non-applicable CVEs) as workflow guidance; no specific config tables or error mappings evident. | -| [Understanding asset details](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/understanding-asset-details) | 0.35 | Describes asset details and metadata types (Whois, SSL info) but summary suggests UI/field description, not config tables or quotas. | -| [Using and managing discovery](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/using-and-managing-discovery) | 0.35 | Explains using and managing discovery at a feature level; summary does not show concrete config parameters or limits. | -| [Understanding inventory assets](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/understanding-inventory-assets) | 0.30 | Describes asset discovery conceptually; no detailed configuration tables, limits, or troubleshooting content indicated. 
| -| [What is Discovery?](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/what-is-discovery) | 0.30 | Explains what discovery is and how it works conceptually; lacks numeric limits, config parameters, or decision matrices. | -| [Discovering your attack surface](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/discovering-your-attack-surface) | 0.25 | Describes preconfigured attack surfaces conceptually; no evidence of numeric limits, configuration parameters, or troubleshooting mappings. | -| [Understanding dashboards](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/understanding-dashboards) | 0.25 | Dashboard overview and purpose; no indication of numeric thresholds, decision matrices, or product-specific troubleshooting. | -| [Overview](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/overview) | 0.10 | High-level product overview of Microsoft Defender EASM without specific limits, configuration parameters, error codes, or decision matrices; primarily conceptual and marketing-style description. | +| [ASN asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/asn-asset-filters) | 0.45 | ASN asset filters article is described as applying to ASN assets; the summary does not include specific filter fields, operators, or constraints. | +| [Contact asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/contact-asset-filters) | 0.45 | Contact asset filters article is summarized as applying to contacts; lacks explicit listing of filter fields, operators, or constraints in the provided text. 
| +| [Domain asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/domain-asset-filters) | 0.45 | Domain asset filters article is likely a reference, but the provided summary only states that filters apply to domain assets, without exposing concrete field names, operators, or constraints. | +| [Host asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/host-asset-filters) | 0.45 | Host asset filters article is referenced at a high level; summary does not show specific filter fields, operators, or allowed values. | +| [IP address asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/ip-address-asset-filters) | 0.45 | IP address asset filters article is described at a high level; summary does not expose specific filter names, operators, or allowed values. | +| [IP block asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/ip-block-asset-filters) | 0.45 | IP block asset filters article is summarized generically; no explicit expert configuration details or parameter tables are present in the summary. | +| [Inventory filters overview](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/inventory-filters) | 0.45 | Outlines filter functionality and operators conceptually; while product-specific, the summary does not indicate detailed parameter tables or numeric constraints that qualify as expert configuration knowledge. | +| [Page asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/page-asset-filters) | 0.45 | Page asset filters article is described generically; no explicit expert-level configuration details or parameter tables are visible in the summary. 
| +| [SSL certificate asset filters](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/ssl-certificate-asset-filters) | 0.45 | SSL certificate asset filters article is likely detailed, but the summary only notes that it applies to certs; no concrete configuration or numeric constraints are shown. | +| [Understand billable assets](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/understanding-billable-assets) | 0.40 | Mentions 30-day free trial and billing based on billable asset counts, but no detailed pricing tables, tier-specific limits, or numeric thresholds beyond the generic trial duration. | +| [Deploying the Defender EASM Azure resource](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/deploying-the-defender-easm-azure-resource) | 0.35 | Explains how to create a Defender EASM Azure resource and mentions a 30-day trial; appears to be a basic setup tutorial without deployment matrices or tier-specific constraints. | +| [Leveraging data connections](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/data-connections) | 0.35 | States that data can be exported to Log Analytics and Azure Data Explorer; summary lacks specific connector configuration parameters or schema details. | +| [Modifying inventory assets](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/modifying-inventory-assets) | 0.35 | Describes modifying inventory assets (states, labels, non-applicable CVEs) as workflow guidance; no specific configuration parameter tables or numeric constraints. | +| [Policy engine automation](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/policy-engine) | 0.35 | Policy engine automation is described conceptually (label assets, change states based on queries); summary does not show concrete parameter tables or quantified best practices. 
| +| [Discovering your attack surface](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/discovering-your-attack-surface) | 0.30 | Describes that Microsoft preconfigures attack surfaces for many organizations; high-level discovery behavior without detailed configuration, limits, or troubleshooting mappings. | +| [Microsoft Security Copilot and Defender EASM](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/easm-copilot) | 0.30 | Overview of using Microsoft Security Copilot with Defender EASM; summary does not show specific prompts, configuration parameters, or error/diagnostic details. | +| [Understanding asset details](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/understanding-asset-details) | 0.30 | Describes asset details and metadata types (Whois, SSL signature algorithms) at a conceptual level; no specific settings, limits, or error mappings. | +| [Understanding inventory assets](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/understanding-inventory-assets) | 0.30 | Describes what inventory assets are and how discovery works conceptually; no specific configuration tables, limits, or product-unique troubleshooting details. | +| [Using and managing discovery](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/using-and-managing-discovery) | 0.30 | Explains how discovery is used and managed in general terms; no detailed configuration options, limits, or decision matrices are indicated. | +| [What is Discovery?](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/what-is-discovery) | 0.30 | Conceptual explanation of Defender EASM discovery; lacks numeric limits, decision matrices, or detailed configuration parameters. 
| +| [Overview](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/overview) | 0.20 | High-level product overview of Defender EASM capabilities and benefits without concrete configuration parameters, limits, or detailed technical patterns. | +| [Understanding dashboards](https://learn.microsoft.com/en-us/azure/external-attack-surface-management/understanding-dashboards) | 0.20 | Overview of dashboards and their purpose; does not include configuration tables, thresholds, or troubleshooting content. | diff --git a/products/azure-files/azure-files.csv b/products/azure-files/azure-files.csv index ce28582b..464daf0a 100644 --- a/products/azure-files/azure-files.csv +++ b/products/azure-files/azure-files.csv @@ -33,20 +33,20 @@ https://learn.microsoft.com/en-us/azure/storage/files/analyze-files-metrics,Anal https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/orchestrations/haystack,Haystack,Build RAG pipelines with Haystack and Azure Files,Integrate Haystack RAG pipelines with Azure Files,Use Haystack as an orchestration framework to build retrieval-augmented generation (RAG) pipelines using data stored in Azure Files.,"Haystack is an open-source framework that models every pipeline as a directed acyclic graph (DAG) of typed components. By using Haystack with Azure Files, you can build retrieval-augmented generation (RAG) pipelines that use your existing file shares as a primary data source. Haystack separates indexing (embed and write) from querying (embed, retrieve, prompt, and generate) into distinct pipeline objects, making each independently testable and deployable.",2026-04-10T22:10:00.000Z,overview,integrations,0.65,True,"Covers Haystack DAG-based pipelines wired to Azure Files as the document source. 
Likely includes concrete component wiring and configuration specific to this integration, not just generic Haystack concepts.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/orchestrations/langchain,LangChain,Build RAG pipelines with LangChain and Azure Files,Integrate LangChain RAG pipelines with Azure Files,Use LangChain as an orchestration framework to build retrieval-augmented generation (RAG) pipelines using data stored in Azure Files.,"LangChain is an open-source framework designed to simplify the creation of applications powered by large language models (LLMs). By using LangChain with Azure Files, you can build robust retrieval-augmented generation (RAG) pipelines that leverage your existing file shares as a primary data source. LangChain's modular architecture andLangChain Expression Language (LCEL)allow you to swap components—such as document loaders, retrievers, and vector stores—with minimal code changes.",2026-04-10T22:10:00.000Z,overview,integrations,0.65,True,"Describes using LangChain components (document loaders, retrievers, vector stores) specifically with Azure Files as a data source. Likely includes product-specific code patterns and configuration parameters for this integration beyond generic LangChain usage.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/orchestrations/llamaindex,LlamaIndex,Build RAG pipelines with LlamaIndex and Azure Files,Integrate LlamaIndex RAG pipelines with Azure Files,Use LlamaIndex as an orchestration framework to build retrieval-augmented generation (RAG) pipelines using data stored in Azure Files.,"LlamaIndex is an open-source framework designed for building retrieval-augmented generation (RAG) applications. 
By using LlamaIndex with Azure Files, you can build RAG pipelines that use your existing file shares as a primary data source. LlamaIndex provides fine-grained control over each stage of the pipeline through abstractions such asSentenceSplitterfor chunking,VectorStoreIndexfor indexing, andRetrieverQueryEnginefor query-time retrieval and response synthesis.",2026-04-10T22:10:00.000Z,overview,integrations,0.65,True,"Explains using LlamaIndex abstractions (SentenceSplitter, VectorStoreIndex, RetrieverQueryEngine) with Azure Files as the backing store. This is a product-specific integration pattern with concrete code/config unique to this combination.",unchanged -https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/setup,Get started,Prepare Azure Files data for document-based RAG applications with open-source frameworks,Authenticate and access Azure Files for RAG ingestion,Learn how to authenticate to an Azure file share and download files for ingestion into a document-based RAG application using open-source frameworks.,This article describes how to authenticate to an Azure file share and download its contents for use with open‑source retrieval‑augmented generation (RAG) tooling.,2026-04-10T22:10:00.000Z,how-to,configuration,0.65,True,"Focuses on authenticating to Azure file shares and downloading contents for RAG tooling. 
Likely includes specific auth methods, connection strings, and configuration parameters (for SDK/CLI) that are product-specific configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/haystack-pinecone/tutorial-haystack-pinecone,Use Haystack with Pinecone,Build a RAG pipeline using Azure Files with Haystack and Pinecone,Build Haystack + Pinecone RAG over Azure Files,Learn how to build a retrieval-augmented generation (RAG) pipeline that queries documents stored in Azure Files using Haystack for orchestration and Pinecone as the vector database.,"In this tutorial, you build a retrieval-augmented generation (RAG) pipeline over documents stored in Azure Files. The pipeline uses Haystack for orchestration and Pinecone as the vector database. Haystack models the pipeline as an explicit directed acyclic graph (DAG) of typed components — embedder, retriever, prompt builder, generator — so you can inspect and extend each stage independently.",2026-04-10T22:10:00.000Z,tutorial,integrations,0.75,True,"Tutorial integrating Haystack DAG pipelines with Pinecone and Azure Files. The explicit component wiring (embedder, retriever, prompt builder, generator) to Azure Files-backed data is a concrete integration pattern.",unchanged -https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/haystack-qdrant/tutorial-haystack-qdrant,Use Haystack with Qdrant,Build a RAG pipeline using Azure Files with Haystack and Qdrant,Implement Haystack–Qdrant RAG with Azure Files,Learn how to build a retrieval-augmented generation (RAG) pipeline that queries documents stored in Azure Files using Haystack for orchestration and Qdrant as the vector database.,"In this tutorial, you build a retrieval-augmented generation (RAG) pipeline over documents stored in Azure Files. 
The pipeline uses Haystack for orchestration and Qdrant as the vector database. Haystack models every pipeline as an explicit directed acyclic graph (DAG) of typed components, so you can inspect and modify each stage independently. Qdrant stores all documents in a single collection and uses indexed payload filtering to scope queries at retrieval time, giving you query-time flexibilit",2026-04-10T22:10:00.000Z,tutorial,integrations,0.68,True,"Tutorial describes a specific integration pattern between Azure Files, Haystack, and Qdrant for RAG, including how Qdrant collections and indexed payload filtering are used at query time. This is concrete, product-specific integration guidance rather than a conceptual overview.",unchanged -https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/haystack-weaviate/tutorial-haystack-weaviate,Use Haystack with Weaviate,Build a RAG pipeline using Azure Files with Haystack and Weaviate,Implement Haystack–Weaviate RAG with Azure Files,Learn how to build a retrieval-augmented generation (RAG) pipeline that queries documents stored in Azure Files using Haystack for orchestration and Weaviate as the vector database.,"In this tutorial, you build a retrieval-augmented generation (RAG) pipeline over documents stored in Azure Files. The pipeline uses Haystack for orchestration and Weaviate as the vector database. Haystack models each pipeline as an explicit directed acyclic graph (DAG) of typed components, and Weaviate provideshybrid searchthat blends vector similarity with BM25 keyword matching in a single query, configurable via analphaparameter.",2026-04-10T22:10:00.000Z,tutorial,integrations,0.68,True,"Tutorial shows a concrete integration pattern between Azure Files, Haystack, and Weaviate for RAG, including product-specific orchestration and vector DB usage details (DAG components, hybrid search with alpha parameter). 
This is code-focused integration knowledge that goes beyond generic concepts.",unchanged -https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/langchain-pinecone/tutorial-langchain-pinecone,Use LangChain with Pinecone,Build a RAG pipeline using Azure Files with LangChain and Pinecone,Build LangChain + Pinecone RAG over Azure Files,Learn how to build a retrieval-augmented generation (RAG) pipeline that queries documents stored in Azure Files using LangChain for orchestration and Pinecone as the vector database.,"In this tutorial, you build a retrieval-augmented generation (RAG) pipeline over documents stored in Azure Files. The pipeline uses LangChain for orchestration and Pinecone as the vector database. Pinecone namespaces map to Azure Files directory structure, providing scoped retrieval per department without additional configuration.",2026-04-10T22:10:00.000Z,tutorial,integrations,0.75,True,"Tutorial wiring LangChain, Pinecone, and Azure Files together. Mentions Pinecone namespaces mapping to Azure Files directories, which is a concrete integration pattern and configuration behavior unique to this combination.",unchanged -https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/langchain-qdrant/tutorial-langchain-qdrant,Use LangChain with Qdrant,Build a RAG pipeline using Azure Files with LangChain and Qdrant,Build LangChain + Qdrant RAG over Azure Files,Learn how to build a retrieval-augmented generation (RAG) pipeline that queries documents stored in Azure Files using LangChain for orchestration and Qdrant as the vector database.,"In this tutorial, you build a retrieval-augmented generation (RAG) pipeline over documents stored in Azure Files. The pipeline uses LangChain for orchestration and Qdrant as the vector database. 
Qdrant stores all documents in a single collection and uses indexedpayload filteringto scope queries by department at retrieval time, rather than partitioning data into separate namespaces or indexes.",2026-04-10T22:10:00.000Z,tutorial,integrations,0.75,True,"Tutorial integrating LangChain, Qdrant, and Azure Files. Uses a single Qdrant collection with indexed payload filtering for department scoping, a concrete integration and query pattern specific to this stack.",unchanged -https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/langchain-weaviate/tutorial-langchain-weaviate,Use LangChain with Weaviate,Build a RAG pipeline using Azure Files with LangChain and Weaviate,Build LangChain + Weaviate RAG over Azure Files,Learn how to build a retrieval-augmented generation (RAG) pipeline that queries documents stored in Azure Files using LangChain for orchestration and Weaviate as the vector database.,"In this tutorial, you build a retrieval-augmented generation (RAG) pipeline over documents stored in Azure Files. The pipeline uses LangChain for orchestration and Weaviate as the vector database. Weaviate multi-tenancy maps to Azure Files directory structure, providing isolated retrieval per department with automatic tenant creation on first write.",2026-04-10T22:10:00.000Z,tutorial,integrations,0.75,True,"Tutorial integrating LangChain, Weaviate, and Azure Files. 
Uses Weaviate multi-tenancy mapped to Azure Files directories with automatic tenant creation, which is a specific integration behavior and configuration pattern.",unchanged -https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/llamaindex-pinecone/tutorial-llamaindex-pinecone,Use LlamaIndex with Pinecone,Build a RAG pipeline using Azure Files with LlamaIndex and Pinecone,Build LlamaIndex + Pinecone RAG over Azure Files,Learn how to build a retrieval-augmented generation (RAG) pipeline that queries documents stored in Azure Files using LlamaIndex for orchestration and Pinecone as the vector database.,"In this tutorial, you build a retrieval-augmented generation (RAG) pipeline over documents stored in Azure Files. The pipeline uses LlamaIndex for orchestration and Pinecone as the vector database. LlamaIndex node parsers preserve document structure during chunking, and Pinecone namespaces map to Azure Files directory structure for scoped retrieval per department.",2026-04-10T22:10:00.000Z,tutorial,integrations,0.75,True,Tutorial integrating LlamaIndex node parsers with Pinecone namespaces mapped to Azure Files directories. Contains specific integration patterns and configuration for this combination.,unchanged -https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/llamaindex-qdrant/tutorial-llamaindex-qdrant,Use LlamaIndex with Qdrant,Build a RAG pipeline using Azure Files with LlamaIndex and Qdrant,Build LlamaIndex + Qdrant RAG over Azure Files,Learn how to build a retrieval-augmented generation (RAG) pipeline that queries documents stored in Azure Files using LlamaIndex for orchestration and Qdrant as the vector database.,"In this tutorial, you build a retrieval-augmented generation (RAG) pipeline over documents stored in Azure Files. 
The pipeline uses LlamaIndex for orchestration and Qdrant as the vector database. Qdrant stores all documents in a single collection and uses payload filtering to scope queries at retrieval time, while LlamaIndex provides fine-grained control over node parsing, indexing, and response synthesis.",2026-04-10T22:10:00.000Z,tutorial,integrations,0.75,True,Tutorial integrating LlamaIndex with Qdrant using a single collection and payload filtering for scoping. This is a specific integration and configuration pattern unique to this stack.,unchanged
-https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/llamaindex-weaviate/tutorial-llamaindex-weaviate,Use LlamaIndex with Weaviate,Build a RAG pipeline using Azure Files with LlamaIndex and Weaviate,Build LlamaIndex + Weaviate RAG over Azure Files,Learn how to build a retrieval-augmented generation (RAG) pipeline that queries documents stored in Azure Files using LlamaIndex for orchestration and Weaviate as the vector database.,"In this tutorial, you build a retrieval-augmented generation (RAG) pipeline over documents stored in Azure Files. The pipeline uses LlamaIndex for orchestration and Weaviate as the vector database. Weaviate supports hybrid search — combining vector similarity with BM25 keyword matching — which helps retrieve both semantic matches and exact terms from file shares that contain a mix of structured and unstructured documents.",2026-04-10T22:10:00.000Z,tutorial,integrations,0.75,True,Tutorial integrating LlamaIndex with Weaviate hybrid search over Azure Files. 
The hybrid search usage and mapping to file-share content is a concrete integration pattern with product-specific query/config details.,unchanged
+https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/setup,Get started,Prepare Azure Files data for document-based RAG applications with open-source frameworks,Authenticate and download Azure Files data for RAG,Learn how to authenticate to an Azure file share and download files for ingestion into a document-based RAG application using open-source frameworks.,"Applies to: ✔️ SMB file shares with Microsoft Entra ID authentication This article shows you how to create a project directory, authenticate to an Azure file share, and build the download logic that each open-source RAG tutorial in this section depends on. When you're finished, your project directory should look like this and be ready for any of the open-source RAG tutorials: Note This article uses Azure file shares accessed over Server Message Block (SMB), authenticated with a Microsoft Entra id",2026-04-23T22:13:00.000Z,how-to,integrations,0.7,True,"Shows how to authenticate to Azure file shares over SMB with Microsoft Entra ID and build download logic used by multiple RAG tutorials. 
This implies concrete integration patterns, including specific authentication parameters and code-level access patterns unique to Azure Files + Entra ID, which fits integrations & coding patterns.",updated
+https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/haystack-pinecone/tutorial-haystack-pinecone,Use Haystack with Pinecone,Build a RAG pipeline using Azure Files with Haystack and Pinecone,Integrate Azure Files with Haystack and Pinecone RAG,Learn how to build a retrieval-augmented generation (RAG) pipeline that queries documents stored in Azure Files using Haystack for orchestration and Pinecone as the vector database.,"Applies to: ✔️ SMB file shares with Microsoft Entra ID authentication In this tutorial, you build a retrieval-augmented generation (RAG) pipeline over documents stored in Azure Files. The pipeline uses Haystack for orchestration and Pinecone as the vector database. The sections that follow walk through each component of the pipeline. If you'd rather start from a complete, runnable script and read along, create a file named haystack-pinecone.py in your project directory, copy the contents of haystack",2026-04-23T22:13:00.000Z,tutorial,integrations,0.75,True,"Describes a RAG pipeline using Azure Files, Haystack, and Pinecone. 
It necessarily includes concrete configuration of Haystack components, Pinecone indexes, and Azure Files access, which is expert integration knowledge.",updated
+https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/haystack-qdrant/tutorial-haystack-qdrant,Use Haystack with Qdrant,Build a RAG pipeline using Azure Files with Haystack and Qdrant,Implement Haystack–Qdrant RAG with Azure Files,Learn how to build a retrieval-augmented generation (RAG) pipeline that queries documents stored in Azure Files using Haystack for orchestration and Qdrant as the vector database.,"Applies to: ✔️ SMB file shares with Microsoft Entra ID authentication In this tutorial, you build a retrieval-augmented generation (RAG) pipeline over documents stored in Azure Files. The pipeline uses Haystack for orchestration and Qdrant as the vector database. The sections that follow walk through each component of the pipeline. If you'd rather start from a complete, runnable script and read along, create a file named haystack-qdrant.py in your project directory, copy the contents of haystack-qdr",2026-04-23T22:13:00.000Z,tutorial,integrations,0.68,True,"Tutorial describes a specific integration of Azure Files with Haystack and Qdrant as the vector database. 
It likely contains concrete configuration (Qdrant collection settings, connection parameters, auth details) and code patterns unique to this stack, which aligns with the integrations & coding patterns sub-skill.",updated
+https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/haystack-weaviate/tutorial-haystack-weaviate,Use Haystack with Weaviate,Build a RAG pipeline using Azure Files with Haystack and Weaviate,Implement Haystack–Weaviate RAG with Azure Files,Learn how to build a retrieval-augmented generation (RAG) pipeline that queries documents stored in Azure Files using Haystack for orchestration and Weaviate as the vector database.,"Applies to: ✔️ SMB file shares with Microsoft Entra ID authentication In this tutorial, you build a retrieval-augmented generation (RAG) pipeline over documents stored in Azure Files. The pipeline uses Haystack for orchestration and Weaviate as the vector database. The sections that follow walk through each component of the pipeline. If you'd rather start from a complete, runnable script and read along, create a file named haystack-weaviate.py in your project directory, copy the contents of haystack",2026-04-23T22:13:00.000Z,tutorial,integrations,0.68,True,"Tutorial shows a concrete integration pattern between Azure Files, Haystack, and Weaviate. 
It likely includes product-specific SDK usage and configuration parameters (endpoints, index/collection names, auth setup) that go beyond generic RAG concepts, fitting the integrations & coding patterns category.",updated
+https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/langchain-pinecone/tutorial-langchain-pinecone,Use LangChain with Pinecone,Build a RAG pipeline using Azure Files with LangChain and Pinecone,Integrate Azure Files with LangChain and Pinecone RAG,Learn how to build a retrieval-augmented generation (RAG) pipeline that queries documents stored in Azure Files using LangChain for orchestration and Pinecone as the vector database.,"Applies to: ✔️ SMB file shares with Microsoft Entra ID authentication In this tutorial, you build a retrieval-augmented generation (RAG) pipeline over documents stored in Azure Files. The pipeline uses LangChain for orchestration and Pinecone as the vector database. The sections that follow walk through each component of the pipeline. If you'd rather start from a complete, runnable script and read along, create a file named langchain-pinecone.py in your project directory, copy the contents of langch",2026-04-23T22:13:00.000Z,tutorial,integrations,0.75,True,"Tutorial builds a full RAG pipeline using Azure Files, LangChain, and Pinecone. 
Such pages typically include concrete SDK usage, configuration parameters (for vector DB, embeddings, and file access), and code patterns specific to this integration stack, which matches the integrations sub-skill.",updated
+https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/langchain-qdrant/tutorial-langchain-qdrant,Use LangChain with Qdrant,Build a RAG pipeline using Azure Files with LangChain and Qdrant,Integrate Azure Files with LangChain and Qdrant RAG,Learn how to build a retrieval-augmented generation (RAG) pipeline that queries documents stored in Azure Files using LangChain for orchestration and Qdrant as the vector database.,"Applies to: ✔️ SMB file shares with Microsoft Entra ID authentication In this tutorial, you build a retrieval-augmented generation (RAG) pipeline over documents stored in Azure Files. The pipeline uses LangChain for orchestration and Qdrant as the vector database. The sections that follow walk through each component of the pipeline. If you'd rather start from a complete, runnable script and read along, create a file named langchain-qdrant.py in your project directory, copy the contents of langchain-",2026-04-23T22:13:00.000Z,tutorial,integrations,0.75,True,"Builds a RAG pipeline using Azure Files, LangChain, and Qdrant. 
This requires specific configuration of Qdrant, LangChain vector stores, and Azure Files access, which is expert integration knowledge beyond generic RAG concepts.",updated
+https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/langchain-weaviate/tutorial-langchain-weaviate,Use LangChain with Weaviate,Build a RAG pipeline using Azure Files with LangChain and Weaviate,Integrate Azure Files with LangChain and Weaviate RAG,Learn how to build a retrieval-augmented generation (RAG) pipeline that queries documents stored in Azure Files using LangChain for orchestration and Weaviate as the vector database.,"Applies to: ✔️ SMB file shares with Microsoft Entra ID authentication In this tutorial, you build a retrieval-augmented generation (RAG) pipeline over documents stored in Azure Files. The pipeline uses LangChain for orchestration and Weaviate as the vector database. The sections that follow walk through each component of the pipeline. If you'd rather start from a complete, runnable script and read along, create a file named langchain-weaviate.py in your project directory, copy the contents of langch",2026-04-23T22:13:00.000Z,tutorial,integrations,0.75,True,"Similar to index 3 but with Weaviate as the vector database. 
It focuses on wiring Azure Files into a LangChain + Weaviate pipeline, with product-specific configuration and code patterns for that integration.",updated
+https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/llamaindex-pinecone/tutorial-llamaindex-pinecone,Use LlamaIndex with Pinecone,Build a RAG pipeline using Azure Files with LlamaIndex and Pinecone,Integrate Azure Files with LlamaIndex and Pinecone RAG,Learn how to build a retrieval-augmented generation (RAG) pipeline that queries documents stored in Azure Files using LlamaIndex for orchestration and Pinecone as the vector database.,"Applies to: ✔️ SMB file shares with Microsoft Entra ID authentication In this tutorial, you build a retrieval-augmented generation (RAG) pipeline over documents stored in Azure Files. The pipeline uses LlamaIndex for orchestration and Pinecone as the vector database. The sections that follow walk through each component of the pipeline. To start from a complete, runnable script and read along, create a file named llamaindex-pinecone.py in your project directory, copy the contents of llamaindex-pineco",2026-04-23T22:13:00.000Z,tutorial,integrations,0.75,True,"Shows a concrete RAG pipeline using Azure Files with LlamaIndex and Pinecone. 
Such content typically includes specific constructor parameters, index configuration, and file access patterns unique to this combination, fitting integrations.",updated
+https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/llamaindex-qdrant/tutorial-llamaindex-qdrant,Use LlamaIndex with Qdrant,Build a RAG pipeline using Azure Files with LlamaIndex and Qdrant,Integrate Azure Files with LlamaIndex and Qdrant RAG,Learn how to build a retrieval-augmented generation (RAG) pipeline that queries documents stored in Azure Files using LlamaIndex for orchestration and Qdrant as the vector database.,"Applies to: ✔️ SMB file shares with Microsoft Entra ID authentication In this tutorial, you build a retrieval-augmented generation (RAG) pipeline over documents stored in Azure Files. The pipeline uses LlamaIndex for orchestration and Qdrant as the vector database. The sections that follow walk through each component of the pipeline. 
If you'd rather start from a complete, runnable script and read along, create a file named llamaindex-qdrant.py in your project directory, copy the contents of llamaind",2026-04-23T22:13:00.000Z,tutorial,integrations,0.75,True,"Covers building a RAG pipeline using Azure Files, LlamaIndex, and Qdrant, including specific configuration and code patterns for that stack, which qualifies as integrations & coding patterns.",updated
+https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/llamaindex-weaviate/tutorial-llamaindex-weaviate,Use LlamaIndex with Weaviate,Build a RAG pipeline using Azure Files with LlamaIndex and Weaviate,Integrate Azure Files with LlamaIndex and Weaviate RAG,Learn how to build a retrieval-augmented generation (RAG) pipeline that queries documents stored in Azure Files using LlamaIndex for orchestration and Weaviate as the vector database.,"Applies to: ✔️ SMB file shares with Microsoft Entra ID authentication In this tutorial, you build a retrieval-augmented generation (RAG) pipeline over documents stored in Azure Files. The pipeline uses LlamaIndex for orchestration and Weaviate as the vector database. The sections that follow walk through each component of the pipeline. If you'd rather start from a complete, runnable script and read along, create a file named llamaindex-weaviate.py in your project directory, copy the contents of llam",2026-04-23T22:13:00.000Z,tutorial,integrations,0.75,True,"Provides code and configuration to connect Azure Files to LlamaIndex and Weaviate. 
This is a product-specific integration pattern with concrete SDK usage and settings, not just conceptual guidance.",updated https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/vector-databases/pinecone,Pinecone,Build RAG pipelines with Pinecone and Azure Files,Use Pinecone vector database with Azure Files RAG,Use Pinecone as a vector database to build retrieval-augmented generation (RAG) pipelines using data stored in Azure Files.,"Pinecone is a managed vector database that you can use to index and retrieve embeddings for retrieval-augmented generation (RAG) workloads. When combined with Azure Files as the document source, Pinecone serves as the vector store for similarity search while orchestration frameworks such as LangChain, LlamaIndex, or Haystack handle ingestion and query-time retrieval. Pinecone is operated and managed by Pinecone Systems, Inc. and is available as a third-party service.",2026-04-10T22:10:00.000Z,overview,integrations,0.7,True,"Describes Pinecone as vector store with Azure Files as document source and orchestration frameworks. Typically includes Pinecone index/namespace configuration and parameters specific to this integration, which qualifies as product-specific integration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/vector-databases/qdrant,Qdrant,Build RAG pipelines with Qdrant and Azure Files,Use Qdrant vector database with Azure Files RAG,Use Qdrant as a vector database to build retrieval-augmented generation (RAG) pipelines using data stored in Azure Files.,"Qdrant is a vector database built in Rust that supports advanced payload filtering for scoping queries by metadata at retrieval time. 
When combined with Azure Files as the document source, Qdrant serves as the vector store for similarity search while orchestration frameworks such as LangChain, LlamaIndex, or Haystack handle ingestion and query-time retrieval. Qdrant is operated and managed by Qdrant Solutions GmbH and is available as a third-party service.",2026-04-10T22:10:00.000Z,overview,integrations,0.7,True,"Describes Qdrant with Azure Files and orchestration frameworks, including payload filtering for metadata-based scoping. This is a product-specific integration pattern with configuration details unique to Qdrant + Azure Files.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/vector-databases/weaviate,Weaviate,Build RAG pipelines with Weaviate and Azure Files,Use Weaviate vector database with Azure Files RAG,Use Weaviate as a vector database to build retrieval-augmented generation (RAG) pipelines using data stored in Azure Files.,"Weaviate is a vector database that supports hybrid search, combining vector similarity with BM25 keyword matching in a single query. When combined with Azure Files as the document source, Weaviate serves as the vector store for similarity search while orchestration frameworks such as LangChain, LlamaIndex, or Haystack handle ingestion and query-time retrieval. Weaviate is operated and managed by Weaviate B.V. and is available as a third-party service.",2026-04-10T22:10:00.000Z,overview,integrations,0.7,True,"Explains Weaviate hybrid search with Azure Files as source and orchestration frameworks. 
Likely includes Weaviate schema/config parameters and query patterns specific to this setup, fitting integrations criteria.",unchanged -https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/overview,What is retrieval-augmented generation?,Retrieval-Augmented Generation (RAG) with Azure Files,,"Learn how to build retrieval-augmented generation (RAG) pipelines over documents stored in Azure Files, using either Azure-native AI services or non-Microsoft open-source AI tooling for orchestration,",This article explains how Azure Files can serve as the document source for retrieval-augmented generation (RAG) pipelines.,2026-04-10T22:10:00.000Z,overview,,0.2,False,"High-level overview of using Azure Files as a data source for RAG; no concrete limits, configuration tables, error codes, or decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/overview,What is retrieval-augmented generation?,Retrieval-Augmented Generation (RAG) with Azure Files,,"Learn how to build retrieval-augmented generation (RAG) pipelines over documents stored in Azure Files, using either Azure-native AI services or non-Microsoft open-source AI tooling for orchestration,","Retrieval-augmented generation (RAG) is a technique for grounding a large language model's responses in your own content. 
Instead of relying only on what the model learned during training, a RAG pipeline: The result is natural-language search over your own documents, with answers grounded in the retrieved content.",2026-04-23T22:13:00.000Z,overview,,0.2,False,"High-level overview of RAG with Azure Files; describes the concept and general pipeline behavior without indicating product-specific limits, configuration tables, or detailed decision/troubleshooting content.",updated
https://learn.microsoft.com/en-us/azure/storage/files/authorize-data-operations-portal,Choose how to authorize access to file data in the Azure portal,Authorize Access to Azure File Share Data in the Azure portal,Authorize Azure portal access to Azure file data,"When you access file data using the Azure portal, the portal makes requests to Azure Files behind the scenes. These requests can be authenticated and authorized using either your Microsoft Entra accou","Applies to: ✔️ SMB file shares When you access file data by using the Azure portal, the portal makes requests to the Azure Files service behind the scenes. You can authorize these requests by using either your Microsoft Entra account (preferred) or the storage account access key (less secure). The portal shows which method you're using and enables you to switch between the two methods if you have the appropriate permissions. 
By default, the portal uses whichever method you're already using to auth",2026-03-05T18:25:00.000Z,how-to,security,0.76,True,"Explains how portal requests to Azure Files are authorized, with specific details on using Entra accounts vs access keys and required permissions, which are product-specific security behaviors.",unchanged
https://learn.microsoft.com/en-us/azure/storage/files/authorize-oauth-rest,Authorize access using Microsoft Entra ID with OAuth over REST,Enable Access to Azure File Shares Using OAuth Over REST,Enable OAuth-based REST access to Azure file shares,Learn how to authorize admin-level read and write access to Azure file shares and directories via OAuth authentication over REST APIs by using Microsoft Entra ID.,"✔️ Applies to: Classic SMB and NFS file shares created with the Microsoft.Storage resource provider ✖️ Doesn't apply to: File shares created with the Microsoft.FileShares resource provider (preview) By using Azure Files OAuth over REST, users and applications can get admin-level read and write access to Azure file shares through the OAuth authentication protocol. This access method uses Microsoft Entra ID for REST API-based access. Users, groups, Microsoft services such as the Azure portal, and partner se",2026-03-26T22:23:00.000Z,concept-article,security,0.8,True,"Covers configuring OAuth over REST for Azure Files with Entra ID, including admin-level access semantics and identity scopes; this is product-specific authentication/authorization configuration beyond generic OAuth usage.",unchanged
https://learn.microsoft.com/en-us/azure/storage/files/azure-files-case-study,Customer case studies,Azure Files customer case studies,,Case studies describing how Microsoft customers use Azure Files and Azure File Sync in different industries.,Customers in diverse industries use Azure Files and Azure File Sync to host their file shares and sync on-premises file servers to the cloud. 
Learn directly from the customer use cases listed here.,2025-09-05T17:17:00.000Z,concept-article,,0.2,False,"Customer case studies are narrative/marketing-style usage stories, not technical reference with limits, configs, or error mappings. They don't provide reusable expert configuration or troubleshooting knowledge.",unchanged
@@ -58,9 +58,9 @@ https://learn.microsoft.com/en-us/azure/storage/files/encryption-in-transit-for-
https://learn.microsoft.com/en-us/azure/storage/files/file-estimate-cost,Cost estimation examples,How to Estimate the Cost of Azure Files,Estimate Azure Files costs across billing models,Learn how to estimate the cost of Azure Files usage across different billing models.,"In this article, you learn how to estimate the cost difference between the Provisioned v1 and Provisioned v2 SSD billing models for Azure Files in three scenarios. Important These prices are meant only as examples and shouldn't be used to calculate your costs. For official prices, see the Azure Files pricing. For more information about how to choose the correct billing model, see Understand Azure Files billing. We also encourage you to use Azure Pricing calculator to perform the actual calculations.",2025-08-14T22:10:00.000Z,concept-article,decision-making,0.75,True,Walks through cost estimation scenarios with concrete usage patterns and price impacts between billing models. Provides quantified trade-offs and scenario-based recommendations.,unchanged
https://learn.microsoft.com/en-us/azure/storage/files/files-change-redundancy-configuration,Change the redundancy configuration,Change Redundancy Configuration for Azure Files,Change redundancy configuration for Azure Files accounts,Learn how to change how Azure Files data in an existing storage account is replicated.,"Azure always stores multiple copies of your data to protect it in the face of both planned and unplanned events. These events include transient hardware failures, network or power outages, and natural disasters. 
Data redundancy ensures that your storage account meets the Service-Level Agreement (SLA) for Microsoft Online Services. This article describes the process of changing replication settings for an existing storage account that hosts Azure file shares. Important If you're using a zonal stor",2025-01-15T08:00:00.000Z,how-to,configuration,0.7,True,"Describes exact steps and constraints for changing replication settings (e.g., allowed transitions, unsupported combinations). These are concrete configuration rules unique to Azure Files.",unchanged
https://learn.microsoft.com/en-us/azure/storage/files/files-data-protection-overview,Data protection overview,Data Protection Overview for Azure Files,,Learn how to protect your data in Azure Files. Understand the concepts and processes involved with backup and recovery of Azure file shares.,"Azure Files gives you many tools to protect your data, including soft delete, share snapshots, Azure Backup, and Azure File Sync. This article describes how to protect your data in Azure Files, and the concepts and processes involved with backup and recovery of Azure file shares. Watch this video to learn how Azure Files advanced data protection helps enterprises stay protected against ransomware and accidental data loss while delivering greater business continuity.",2026-03-11T22:19:00.000Z,overview,,0.3,False,"High-level overview of Azure Files data protection options (soft delete, snapshots, backup, sync) without detailed numeric limits, configuration tables, or error-code-based troubleshooting.",unchanged
-https://learn.microsoft.com/en-us/azure/storage/files/files-disaster-recovery,Disaster recovery and failover,Disaster recovery and failover for Azure Files,Plan disaster recovery and failover for Azure Files,Learn how to recover your data in Azure Files. Understand the concepts and processes involved with disaster recovery and storage account failover.,"Microsoft strives to ensure that Azure services are always available. 
However, unplanned service outages might occur, and you should have a disaster recovery (DR) plan in place for handling a regional service outage. An important part of a disaster recovery plan is preparing to fail over to the secondary endpoint when the primary endpoint becomes unavailable. This article describes the concepts and processes involved with disaster recovery (DR) and storage account failover. Important Azure File ",2026-03-11T22:19:00.000Z,concept-article,best-practices,0.6,True,"Covers DR planning and failover processes for Azure Files, including product-specific guidance on when and how to fail over storage accounts; organized as actionable recommendations rather than just conceptual DR theory.",unchanged +https://learn.microsoft.com/en-us/azure/storage/files/files-disaster-recovery,Disaster recovery and failover,Disaster Recovery and Failover for Azure Files,Plan disaster recovery and failover for Azure Files,Learn how to recover your data in Azure Files. Understand the concepts and processes involved with disaster recovery and storage account failover.,"Microsoft strives to ensure that Azure services are always available. However, unplanned service outages might occur, and you should have a disaster recovery (DR) plan in place for handling a regional service outage. An important part of a disaster recovery plan is preparing to fail over to the secondary endpoint when the primary endpoint becomes unavailable. This article describes the concepts and processes involved with disaster recovery (DR) and storage account failover. Important Azure File ",2026-04-22T08:00:00.000Z,concept-article,best-practices,0.65,True,"Disaster recovery guidance for Azure Files typically includes product-specific recommendations (for example when to trigger storage account failover, how to handle secondary endpoints, and sequencing of recovery steps) that go beyond generic DR theory. 
While the summary is high level, this topic usually documents Azure Files–specific behaviors and gotchas during failover, which qualifies as product-specific best practices.",updated
https://learn.microsoft.com/en-us/azure/storage/files/files-manage-namespaces,Use DFS-N with Azure Files,How to use DFS-N with Azure Files,Integrate DFS Namespaces with Azure file shares,"Learn how to use DFS Namespaces (DFS-N) with Azure Files. DFS Namespaces works with SMB file shares, agnostic of where those file shares are hosted.","Applies to: ✔️ SMB file shares Distributed File System Namespaces, commonly referred to as DFS Namespaces or DFS-N, is a Windows Server server role that's used to simplify the deployment and maintenance of SMB file shares in production. DFS Namespaces provides storage namespace virtualization, enabling you to provide a layer of indirection between the UNC path of your file share and the actual file share. DFS Namespaces works with SMB file shares, agnostic of where those file shares are hosted. ",2026-03-11T22:19:00.000Z,how-to,configuration,0.64,True,"Explains using DFS-N with Azure Files, including namespace configuration and path mapping specifics that are product- and scenario-specific configuration knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/storage/files/files-managed-identities,Managed identities with Microsoft Entra ID,Use Managed Identities with Azure Files (Preview),Use managed identities to access Azure SMB file shares,Learn how to authenticate managed identities to allow applications and VMs to access SMB Azure file shares by using identity-based authentication with Entra ID.,"Applies to: ✔️ SMB file shares This article explains how you can use managed identities to allow Windows and Linux virtual machines (VMs) to access SMB Azure file shares by using identity-based authentication with Microsoft Entra ID (preview). A managed identity is an identity in Entra ID that Azure automatically manages. 
Typically, you use managed identities when developing cloud applications to manage the credentials for authenticating to Azure services. By the end of this guide, you create a sto",2026-03-27T08:00:00.000Z,how-to,security,0.76,True,"Shows how to configure managed identities with Azure Files, including storage account and identity bindings; these are concrete, product-specific identity configuration steps rather than generic managed identity concepts.",unchanged
+https://learn.microsoft.com/en-us/azure/storage/files/files-managed-identities,Managed identities with Microsoft Entra ID,Use Managed Identities with Azure Files,Configure managed identity access to Azure file shares,Learn how to authenticate managed identities to allow applications and VMs to access SMB Azure file shares by using identity-based authentication with Microsoft Entra ID.,"Applies to: ✔️ SMB file shares This article explains how you can use managed identities to allow Windows and Linux virtual machines (VMs) to access SMB Azure file shares by using identity-based authentication with Microsoft Entra ID. A managed identity is an identity in Microsoft Entra ID that Azure automatically manages. Typically, you use managed identities when developing cloud applications to manage the credentials for authenticating to Azure services. Azure Files now supports both application ",2026-04-20T08:00:00.000Z,how-to,security,0.78,True,"Page describes how to use managed identities with Azure Files SMB shares, including product-specific Microsoft Entra ID configuration, role assignments, and identity-based authentication settings. These are concrete security/RBAC configuration steps unique to Azure Files rather than generic identity concepts.",updated
https://learn.microsoft.com/en-us/azure/storage/files/files-monitoring-alerts,Create alerts,Monitor Azure Files by Creating Alerts,Create Azure Monitor alerts for Azure Files health,"Learn how to use Azure Monitor to create alerts on metrics and logs for Azure Files. 
Monitor throttling, capacity, and egress. Create an alert on high server latency.","✔️Applies to:Classic SMB and NFS file shares created with the Microsoft.Storage resource provider ✖️Doesn't apply to:File shares created with the Microsoft.FileShares resource provider (preview) Azure Monitor alerts proactively notify you when important conditions are found in your monitoring data. They allow you to identify and address issues in your system before your customers notice them. You can set alerts on metrics, logs, and the activity log. This article shows you how to create alerts on t",2026-03-11T22:19:00.000Z,how-to,configuration,0.7,True,"Shows how to configure alerts on Azure Files metrics/logs (throttling, capacity, egress, server latency) with specific metric names and alert settings; these are concrete configuration patterns for monitoring this product.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/files-network-security-perimeter,Network security perimeter,Network Security Perimeter for Azure Files,Configure network security perimeter for Azure Files,Network security perimeter enables you to define a logical network isolation boundary for Azure file shares and other PaaS resources that are deployed outside your virtual networks.,"Network security perimeter allows organizations to define a logical network isolation boundary for PaaS resources such as Azure Files that are deployed outside their virtual networks. This feature restricts public network access to PaaS resources outside the perimeter. However, you can exempt access by using explicit access rules for public inbound and outbound traffic. This helps prevent unwanted data exfiltration from your storage resources. Within a network security perimeter, member resources",2026-02-12T06:12:00.000Z,how-to,security,0.7,True,"Describes Azure Files–specific NSP configuration, including member/resource rules and access policies. 
Contains product-specific security settings and rule constructs beyond generic network isolation concepts.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/files-nfs-protocol,NFS file shares,NFS file shares in Azure Files,Secure and configure NFS file shares in Azure Files,"Learn about file shares hosted in Azure Files using the Network File System (NFS) protocol, including security, networking, feature support, and regional availability.","Applies to:✔️ NFS file shares Azure Files offers two industry-standard file system protocols for mounting Azure file shares: the Server Message Block (SMB) protocol and the Network File System (NFS) protocol, allowing you to pick the protocol that is the best fit for your workload. Azure file shares don't support accessing an individual Azure file share with both the SMB and NFS protocols, although you can create SMB and NFS file shares within the same FileStorage storage account. Azure Files offers",2026-04-08T17:12:00.000Z,concept-article,security,0.65,True,"NFS protocol page for Azure Files covering security, networking, feature support, and regional availability. Azure NFS has product-specific security and networking requirements (e.g., allowed networks, auth models, export rules) that constitute expert configuration/security knowledge beyond generic NFS concepts, fitting the security sub-skill best.",unchanged
This practice originates from security guidance about legacy and deprecated versions of the SMB protocol. Although SMB 3.x is an internet-safe protocol, older versions of SMB, especially SMB 1, aren't. SMB 1, also known as CIFS (Common Internet File System), is included with many Linux distributions. SMB 1 is an outdated, inefficient, and insecure protocol. The good news is that Azure ",2024-05-08T22:08:00.000Z,how-to,security,0.65,True,"Provides concrete Linux configuration steps and commands to disable SMB1 in the context of Azure Files, including package/module settings. This is product- and protocol-specific security hardening guidance.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/files-reserve-capacity,Optimize costs with storage reservations,Reduce costs for Azure Files with Reservations - Azure Files,Reduce Azure Files costs using reservations,"Learn how to save costs on Azure file share deployments by using Azure Files Reservations, also called reserved instances. Get a discount on capacity when you commit to a Reservation for either one ye",✔️Applies to:Classic file shares created with the Microsoft.Storage resource provider with either the provisioned v1 or pay-as-you-go billing models ✖️Doesn't apply to:All file shares using the provisioned v2 billing model including file shares created with the Microsoft.FileShares resource provider (preview) or classic file shares created with the Microsoft.Storage resource provider You can save money on the storage costs for Azure file shares with Azure Files Reservations. Azure Files Reservat,2026-02-18T10:58:00.000Z,concept-article,decision-making,0.7,True,"Describes when reservations apply (which billing models), discount behavior, and term options (1-year/3-year). 
Guides cost-optimization decisions with product-specific constraints.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/files-smb-protocol,SMB file shares,SMB File Shares in Azure Files,Configure SMB Azure file shares and security features,"Learn about file shares hosted in Azure Files using the Server Message Block (SMB) protocol, including features, security, and SMB Multichannel.","Applies to:✔️ SMB file shares Azure Files offers two industry-standard protocols for mounting Azure file shares: the Server Message Block (SMB) protocol and the Network File System (NFS) protocol. Azure Files enables you to pick the file system protocol that is the best fit for your workload. Azure file shares don't support accessing an individual Azure file share with both the SMB and NFS protocols, although you can create SMB and NFS file shares within the same storage account. For all file shares,",2026-04-09T06:11:00.000Z,concept-article,security,0.65,True,"Page is about SMB file shares in Azure Files and explicitly mentions features and security. This documentation typically includes SMB-specific security options (e.g., supported authentication methods, encryption requirements, supported SMB versions, and possibly RBAC/permission details) that are product-specific. That aligns best with the security sub-skill, and goes beyond generic SMB knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/storage/files/files-whats-new,What's new in Azure Files?,What's New in Azure Files and Azure File Sync,,Learn about new features and enhancements in Azure Files and Azure File Sync.,Azure Files and Azure File Sync are updated regularly to offer new features and enhancements. 
This article provides detailed information about what's new in Azure Files and Azure File Sync.,2025-11-18T08:00:00.000Z,concept-article,,0.1,False,"What's new changelog; mostly release notes and feature announcements without structured limits, configs, or troubleshooting mappings.",unchanged +https://learn.microsoft.com/en-us/azure/storage/files/files-whats-new,What's new in Azure Files?,What's New in Azure Files and Azure File Sync,,Learn about new features and enhancements in Azure Files and Azure File Sync.,Azure Files and Azure File Sync are updated regularly to offer new features and enhancements. This article provides detailed information about what's new in Azure Files and Azure File Sync.,2026-04-20T08:00:00.000Z,concept-article,,0.2,False,"Release notes / what's new page listing recent features and changes for Azure Files and Azure File Sync. It does not focus on limits, configuration matrices, troubleshooting mappings, or other structured expert details as defined; primarily update/announcement content.",updated https://learn.microsoft.com/en-us/azure/storage/files/glusterfs-migration-guide,Migrate GlusterFS to Azure Files,GlusterFS to Azure Files Migration Guide,Migrate GlusterFS volumes to Azure Files,Red Hat Gluster Storage (based on GlusterFS) has reached the end of its support lifecycle. Use this guide to migrate GlusterFS volumes to Azure Files.,"This article provides guidance on migrating data from GlusterFS volumes to Azure Files, Microsoft's fully managed file service in the cloud. 
Azure Files offers both SMB (Server Message Block) and NFS (Network File System) protocols, making it suitable for both Windows and Linux workloads.",2025-11-07T05:43:00.000Z,how-to,deployment,0.65,True,"Provides guidance for moving from GlusterFS to Azure Files, including protocol choices and migration steps; product- and source-specific migration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/migrate-files-between-shares,Migrate files between Azure file shares,Copy Files Between Azure File Shares,Copy files between Azure file shares with tools,Learn how to copy files from one Azure file share to another using common copy tools such as AzCopy and Robocopy.,"This article describes how to copy files between Azure file shares using common copy tools. You can copy files between HDD and SSD file shares, file shares using a different billing model, or file shares in different Azure regions. This article doesn't provide guidance for migrations to Azure Files. If you want to migrate to Azure Files, see Migrate to SMB Azure file shares or Migrate to NFS Azure file shares. 
If you're using Azure File Sync and you want to migrate between Azure file shares, see Mig",2026-02-03T18:20:00.000Z,how-to,configuration,0.7,True,"Shows how to use AzCopy/Robocopy to copy between shares, including cross-tier and cross-region scenarios; contains product-specific command usage and options.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/migrate-files-storage-mover,Use Azure Storage Mover to migrate to Azure file shares,Migrate to Azure Files using Azure Storage Mover,Migrate SMB/NFS shares to Azure Files via Storage Mover,"Learn how to migrate on-premises SMB or NFS file shares to Azure file shares with full fidelity using Azure Storage Mover, a fully managed migration service.","Applies to:✔️ SMB and NFS Azure file shares This migration guide describes how to migrate on-premises files to Azure file shares with full fidelity using Azure Storage Mover, a fully managed migration service. You can use Storage Mover to migrate from SMB or NFS source shares, including Windows Server, Linux, or NAS. Storage Mover uses the FileREST API to move the data. Storage Mover isn't currently available in Azure Government clouds. Note If you're using or plan to use Azure File Sync for clou",2026-01-22T18:12:00.000Z,how-to,deployment,0.7,True,"End-to-end migration guide using Azure Storage Mover, a managed migration service; includes product-specific migration configuration and constraints (e.g., cloud availability).",unchanged
You can assign share-level permissions through Azure RBAC.","Applies to:✔️ SMB file shares Regardless of which identity source you choose for identity-based authentication on your storage account, you need to configure authorization and access control. Azure Files enforces authorization on user access at the share level, the directory level, and the file level. You can assign share-level permissions to Microsoft Entra users or groups that are managed through Azure RBAC. With Azure RBAC, the credentials that you use for file access should be available or sync",2026-03-26T22:23:00.000Z,overview,,0.2,False,"Authorization overview describing concepts (share-level, directory-level, file-level, RBAC) without detailed role lists, parameter tables, or other concrete configuration data; primarily conceptual.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/storage-files-configure-p2s-vpn-linux,Configure Point-to-Site VPN on Linux,Configure a Point-to-Site VPN on Linux for Azure Files,Configure Linux point-to-site VPN for Azure Files access,Learn how to configure a point-to-site virtual private network (VPN) on Linux to mount your Azure file shares directly on premises.,"✔️Applies to:Classic SMB and NFS file shares created with the Microsoft.Storage resource provider ✖️Doesn't apply to:File shares created with the Microsoft.FileShares resource provider (preview) You can use a point-to-site virtual private network (VPN) connection to mount your Azure file shares from outside of Azure, without sending data over the open internet. A point-to-site VPN connection is a VPN connection between Azure and an individual client machine. 
To use a point-to-site VPN connection",2026-03-11T22:19:00.000Z,how-to,configuration,0.68,True,"Details configuring P2S VPN on Linux specifically for Azure Files, including VPN settings, client configuration parameters, and Azure-side options that go beyond generic VPN knowledge.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/storage-files-configure-p2s-vpn-windows,Configure Point-to-Site VPN on Windows,Configure a Point-to-Site VPN on Windows for Azure Files,Configure Windows P2S VPN access to Azure Files,How to configure a point-to-site (P2S) VPN on Windows for use with SMB Azure file shares to mount your Azure file shares over SMB from outside of Azure without opening up port 445.,"Applies to:✔️ SMB Azure file shares You can use a point-to-site (P2S) VPN connection to mount your Azure file shares over SMB from outside of Azure, without opening up port 445. A point-to-site VPN connection is a VPN connection between Azure and an individual client. To use a P2S VPN connection with Azure Files, you must configure a VPN connection for each client that wants to connect. If you have many clients that need to connect to your Azure file shares from your on-premises network, you can",2025-11-07T05:43:00.000Z,how-to,configuration,0.7,True,"Step-by-step configuration for Windows P2S VPN to Azure Files with product-specific parameters (VPN client config, certificate requirements, address spaces). Contains concrete settings and values unique to Azure Files networking, beyond generic VPN knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/storage/files/storage-files-configure-s2s-vpn,Configure Site-to-Site VPN,Configure a Site-to-Site VPN for Azure Files,Configure site-to-site VPN for Azure Files access,"Learn how to configure a site-to-site (S2S) VPN for use with Azure Files so you can mount your Azure file shares from on premises. 
Use the Azure portal, PowerShell, or CLI.","You can use a site-to-site (S2S) VPN connection to mount your Azure file shares from your on-premises network, without sending data over the open internet. You can set up a S2S VPN using Azure VPN Gateway, which is an Azure resource offering VPN services. You deploy VPN Gateway in a resource group alongside storage accounts or other Azure resources. We strongly recommend that you read Azure Files networking overview before continuing with this article for a complete discussion of the networking opt",2025-04-01T11:23:00.000Z,how-to,configuration,0.7,True,"Detailed S2S VPN configuration for Azure Files using VPN Gateway, including gateway types, address spaces, and Azure-specific parameters. This is concrete configuration reference, not just conceptual guidance.",unchanged
+https://learn.microsoft.com/en-us/azure/storage/files/storage-files-configure-s2s-vpn,Configure Site-to-Site VPN,Configure a Site-to-Site VPN for Azure Files,Configure site-to-site VPN connectivity for Azure Files,"Learn how to configure a site-to-site VPN for use with Azure Files so you can mount your Azure file shares from on premises. Use the Azure portal, PowerShell, or CLI.","✔️Applies to:Classic SMB and NFS file shares created with the Microsoft.Storage resource provider ✖️Doesn't apply to:File shares created with the Microsoft.FileShares resource provider (preview) You can use a site-to-site VPN connection to mount your Azure file shares from your on-premises network, without sending data over the open internet. You can set up a site-to-site VPN using Azure VPN Gateway, which is an Azure resource offering VPN services. 
You deploy VPN Gateway in a resource group alon",2026-04-23T06:20:00.000Z,how-to,configuration,0.7,True,"Page walks through configuring a site-to-site VPN with Azure VPN Gateway specifically for Azure Files, including Azure resource configuration, likely with gateway types, SKU choices, address spaces, and other product-specific parameters. This is detailed configuration guidance rather than generic VPN theory or simple how-to.",updated https://learn.microsoft.com/en-us/azure/storage/files/storage-files-developer-overview,Overview,Overview of application development with Azure Files - Azure Storage,Choose development approaches for Azure Files applications,Learn how to develop applications and services that use Azure Files to store data.,This article provides an overview of application development with Azure Files and helps you decide which approach is best based on the needs of your app.,2026-03-11T22:19:00.000Z,concept-article,decision-making,0.65,True,"Overview targeted at helping developers decide between different programming models/approaches for Azure Files; includes product-specific trade-offs between APIs/SDKs and access methods, which is decision guidance rather than generic storage info.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/storage-files-faq,Azure Files FAQ,Azure Files frequently asked questions (FAQ),Azure Files and File Sync FAQ with limits and behaviors,"Get answers to frequently asked questions (FAQ) about Azure Files and Azure File Sync. You can mount Azure file shares concurrently on cloud or on-premises Windows, Linux, or macOS deployments.","Azure Files offers fully managed file shares in the cloud that are accessible via the industry-standard Server Message Block (SMB) protocol and the Network File System (NFS) protocol. You can mount Azure file shares concurrently on cloud or on-premises deployments of Windows, Linux, and macOS. 
You also can cache Azure file shares on Windows Server machines by using Azure File Sync for fast access close to where the data is used.",2025-09-30T08:00:00.000Z,faq,limits-quotas,0.7,True,"Azure Files FAQ typically includes many numeric limits (max shares, file size, throughput behaviors) and edge-case behaviors. These are expert details not obvious from overviews.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/storage-files-how-to-mount-nfs-shares,Mount NFS file share on Linux,Mount an NFS Azure File Share on Linux,,"Learn how to mount a Network File System (NFS) Azure file share on Linux, including configuring network security and mount options.","✔️Applies to:Classic NFS file shares created with the Microsoft.Storage resource provider ✔️Applies to:File shares created with the Microsoft.FileShares resource provider (preview) Azure file shares can be mounted in Linux distributions using either the Server Message Block (SMB) protocol or the Network File System (NFS) protocol. This article is focused on mounting with NFS. For details on mounting SMB file shares, see Use Azure Files with Linux. 
For details on each of the available protocols, s",2026-03-11T05:11:00.000Z,how-to,,0.45,False,Focuses on mounting NFS Azure file shares on Linux; mostly procedural without detailed configuration parameter tables or product-specific limits/roles that meet the expert-knowledge criteria.,unchanged
@@ -95,7 +95,7 @@ https://learn.microsoft.com/en-us/azure/storage/files/storage-files-identity-aut
https://learn.microsoft.com/en-us/azure/storage/files/storage-files-identity-auth-hybrid-cloud-trust,Enable authentication for hybrid identities on legacy clients,Configure Cloud Trust between AD DS and Entra ID,Configure cloud trust between AD DS and Entra ID for Azure Files,Learn how to enable Microsoft Entra Kerberos authentication over SMB for Azure Files and establish a cloud trust between on-premises Active Directory Domain Services (AD DS) and Microsoft Entra ID. Yo,"Applies to:✔️ SMB file shares Many organizations want to use identity-based authentication for SMB Azure file shares in environments that span both on-premises Active Directory Domain Services (AD DS) and Microsoft Entra ID (formerly Azure Active Directory), but don't meet the necessary operating system or domain prerequisites. 
In such scenarios, you can enable Microsoft Entra Kerberos authentication for hybrid user identities and then establish a cloud trust between your on-premises AD DS and En",2026-03-19T22:26:00.000Z,how-to,security,0.8,True,"Describes establishing Microsoft Entra Kerberos authentication and cloud trust between on-prem AD DS and Entra ID for Azure Files, with product-specific identity configuration steps and parameters that qualify as expert security configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/storage-files-identity-auth-hybrid-identities-enable,Enable authentication for hybrid or cloud-only identities,Microsoft Entra Kerberos Authentication for Azure Files,Enable Entra Kerberos auth for hybrid Azure Files users,Learn how to enable identity-based Kerberos authentication over Server Message Block (SMB) for Azure Files through Microsoft Entra ID. Hybrid and cloud-only users can then access Azure file shares by ,Applies to:✔️ SMB file shares This article explains how to enable and configure Microsoft Entra ID (formerly Azure AD) for authenticating hybrid or cloud-only identities (preview). Hybrid identities are on-premises Active Directory Domain Services (AD DS) identities that are synced to Microsoft Entra ID by using either Microsoft Entra Connect Sync or Microsoft Entra Cloud Sync. Cloud-only identities are created and managed only in Microsoft Entra ID. 
When you enable Microsoft Entra Kerberos authentic,2026-03-04T08:00:00.000Z,how-to,security,0.8,True,"Explains configuring Microsoft Entra Kerberos authentication for hybrid and cloud-only identities with Azure Files, including product-specific identity, RBAC, and security configuration details.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/storage-files-identity-auth-linux-kerberos-enable,Configure Linux clients,Use Kerberos Authentication for Linux Clients with Azure Files,Configure Kerberos auth for Linux Azure Files clients,Learn how to enable identity-based Kerberos authentication for Linux clients over Server Message Block (SMB) for Azure Files using on-premises Active Directory Domain Services (AD DS) or Microsoft Ent,"Applies to:✔️ SMB file shares For more information on supported options and considerations, see Overview of Azure Files identity-based authentication options for SMB access. Azure Files supports identity-based authentication over Server Message Block (SMB) for Linux virtual machines (VMs) using the Kerberos authentication protocol through the following methods: To use AD DS, you must sync your AD DS to Microsoft Entra ID by using Microsoft Entra Connect Sync. 
Note This article uses Ubuntu for the e",2026-03-04T06:13:00.000Z,how-to,security,0.78,True,"Details Kerberos-based identity configuration for Linux SMB clients with Azure Files, including specific Entra/AD DS sync requirements and security-related configuration steps unique to this scenario.",unchanged
-https://learn.microsoft.com/en-us/azure/storage/files/storage-files-identity-configure-file-level-permissions,Configure directory/file-level permissions,Configure Directory and File-Level Permissions for Azure Files,Configure NTFS ACL permissions for Azure file shares,Learn how to configure Windows ACLs for directory-level and file-level permissions for Active Directory authentication to Azure file shares over SMB for granular access control.,"Applies to:✔️ SMB file shares Before you can configure directory-level and file-level permissions, you must assign share-level permissions to an identity with Azure role-based access control (RBAC). After the share-level permissions propagate, follow the steps in this article to configure Windows access control lists (ACLs), also known as NTFS permissions, for more granular access control.",2026-04-16T22:31:00.000Z,how-to,security,0.78,True,"The page gives product-specific, stepwise guidance for configuring Windows ACLs/NTFS permissions on Azure Files with Active Directory authentication, including required RBAC share-level permissions and how they interact with file-level ACLs. 
This is concrete security configuration knowledge (roles, permission layers, and configuration behavior) that goes beyond generic security concepts.",updated
+https://learn.microsoft.com/en-us/azure/storage/files/storage-files-identity-configure-file-level-permissions,Configure directory/file-level permissions,Configure Directory and File-Level Permissions for Azure Files,Configure NTFS ACL permissions for Azure file shares,Learn how to configure Windows ACLs for directory-level and file-level permissions for Active Directory authentication to Azure file shares over SMB for granular access control.,"Applies to:✔️ SMB file shares Before you can configure directory-level and file-level permissions, you must assign share-level permissions to an identity with Azure role-based access control (RBAC). After the share-level permissions propagate, follow the steps in this article to configure Windows access control lists (ACLs), also known as NTFS permissions, for more granular access control.",2026-04-16T22:31:00.000Z,how-to,security,0.78,True,"The page gives product-specific, stepwise guidance for configuring Windows ACLs/NTFS permissions on Azure Files with Active Directory authentication, including required RBAC share-level permissions and how they interact with file-level ACLs. 
This is concrete security configuration knowledge (roles, permission layers, and configuration behavior) that goes beyond generic security concepts.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/storage-files-identity-multiple-forests,Use Azure Files with multiple AD DS forests,Use Azure Files with Multiple Active Directory Forests,Configure Azure Files with multiple AD DS forests,Configure on-premises Active Directory Domain Services (AD DS) authentication for SMB Azure file shares with an AD DS environment using multiple forests.,"Applies to:✔️ SMB Azure file shares Many organizations want to use identity-based authentication for SMB Azure file shares in environments that have multiple on-premises Active Directory Domain Services (AD DS) forests. This is a common IT scenario, especially following mergers and acquisitions, where the acquired company's AD forests are isolated from the parent company's AD forests. This article explains how forest trust relationships work and provides step-by-step instructions for multi-fores",2026-03-03T08:00:00.000Z,how-to,security,0.7,True,"Describes how to configure Azure Files AD DS authentication in multi-forest environments, including forest trust requirements and specific configuration steps/parameters for identity integration. This is product-specific security and identity configuration guidance, not just conceptual explanation.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/storage-files-introduction,What is Azure Files?,Introduction to Azure Files,,"An overview of Azure Files, a service that enables you to create and use network file shares in the cloud using either SMB or NFS protocols.","Azure Files provides fully managed file shares in the cloud that you can access through the Server Message Block (SMB) protocol, Network File System (NFS) protocol, and Azure Files REST API. You can mount Azure file shares concurrently from cloud or on-premises deployments. 
You can access SMB Azure file shares from Windows, Linux, and macOS clients. You can access NFS Azure file shares from Linux clients. You can also cache SMB Azure file shares on Windows servers by using Azure File Sync for fast ac",2026-04-03T21:52:00.000Z,overview,,0.1,False,"High-level introduction to Azure Files describing what the service is and basic capabilities (SMB/NFS access, platforms supported). No specific limits, configuration tables, error codes, or decision matrices are present.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/storage-files-migration-linux-hybrid,Migrate from Linux to a hybrid file server with Azure File Sync,Linux migration to Azure File Sync,Migrate Linux servers to Azure File Sync hybrid,Learn how to migrate files from a Linux server location to a hybrid cloud deployment with Azure File Sync and Azure file shares.,"✔️Applies to:Classic SMB file shares created with the Microsoft.Storage resource provider ✖️Doesn't apply to:All NFS file shares including file shares created with the Microsoft.FileShares resource provider (preview) or classic file shares created with the Microsoft.Storage resource provider This migration article is one of several involving the keywords NFS and Azure File Sync. 
Check if this article applies to your scenario: If your scenario is different, look through the table of migration guid",2026-02-18T10:58:00.000Z,how-to,deployment,0.7,True,Migration guide from Linux to a hybrid Azure File Sync deployment; includes product-specific migration steps and constraints.,unchanged
@@ -110,7 +110,7 @@ https://learn.microsoft.com/en-us/azure/storage/files/storage-files-monitoring-r
https://learn.microsoft.com/en-us/azure/storage/files/storage-files-netapp-comparison,Azure Files and Azure NetApp Files comparison,Azure Files and Azure NetApp Files comparison,Choose between Azure Files and Azure NetApp Files,"Compare the scalability, performance, and features of Azure Files and Azure NetApp Files.","This article provides a comparison of Azure Files and Azure NetApp Files. Most workloads that require cloud file storage work well on either Azure Files or Azure NetApp Files. To help determine the best fit for your workload, review the information provided in this article. For more information, see the Azure Files and Azure NetApp Files documentation and the Shared storage for all enterprise file-workloads session which covers choosing between Azure Files and Azure NetApp Files.",2026-03-03T06:13:00.000Z,concept-article,decision-making,0.78,True,"The page is explicitly a comparison to help determine which storage service to use for specific workloads. Such comparison articles typically include feature and capability matrices (performance, scalability, protocol support, scenarios) that guide service selection. 
This is product-specific decision guidance rather than a generic overview, fitting the decision-making sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/storage-files-networking-dns,Configure DNS forwarding for Azure Files,Configure DNS forwarding for Azure Files,Configure DNS forwarding to Azure Files private endpoints,Learn how to configure DNS forwarding for Azure Files to properly resolve the fully qualified domain name (FQDN) of your storage account to your private endpoint's IP address.,"✔️Applies to:All Azure file shares Azure Files enables you to create private endpoints for the storage accounts containing your file shares. Although useful for many different applications, private endpoints are especially useful for connecting to your Azure file shares from your on-premises network using a VPN or ExpressRoute connection using private-peering. In order for connections to your storage account to go over your network tunnel, the fully qualified domain name (FQDN) of your storage a",2024-09-09T08:00:00.000Z,how-to,configuration,0.8,True,"Contains DNS zone names, record patterns, and forwarding rules required for Azure Files private endpoints. These are precise configuration details (FQDN formats, record types) that qualify as expert configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/storage-files-networking-endpoints,Configure Azure Files network endpoints,Configure Azure Files Network Endpoints,Configure public and private endpoints for Azure Files,Learn how to configure public and private network endpoints for Server Message Block (SMB) and Network File System (NFS) Azure file shares. 
Restrict access by setting up a private link.,"✔️ Applies to: Classic file shares created with the Microsoft.Storage resource provider ✖️ Doesn't apply to: File shares created with the Microsoft.FileShares resource provider (preview) Azure Files provides two main types of endpoints for accessing Azure file shares: Public and private endpoints exist on the Azure storage account. A storage account is a management construct that represents a shared pool of storage in which you can deploy multiple file shares, as well as other storage resources, suc",2024-05-10T08:00:00.000Z,how-to,configuration,0.8,True,"Explains configuring SMB/NFS endpoints, including private link and access restrictions on storage accounts; includes product-specific endpoint settings and behaviors.",unchanged -https://learn.microsoft.com/en-us/azure/storage/files/storage-files-networking-overview,Networking considerations for direct access,Networking Considerations for Azure Files,Configure secure networking for Azure file shares,"An overview of networking considerations and options for Azure Files, including secure transfer, public and private endpoints, VPN, ExpressRoute, DNS, and firewall settings.","✔️Applies to:All Azure file shares You can access your Azure file shares over the public internet accessible endpoint, over one or more private endpoints on your network(s), or by caching your Azure file share on-premises with Azure File Sync (SMB file shares only). This article focuses on how to configure Azure Files for direct access over public and/or private endpoints. To learn how to cache your Azure file share on-premises with Azure File Sync, seeIntroduction to Azure File Sync. We recomme",2026-04-08T17:12:00.000Z,overview,security,0.65,True,"Networking considerations article for Azure Files that covers secure transfer, public vs private endpoints, VPN/ExpressRoute, DNS, and firewall settings. 
It likely includes specific configuration options (e.g., secure transfer required, firewall rules, endpoint types) and is focused on secure access patterns, fitting the security category.",unchanged +https://learn.microsoft.com/en-us/azure/storage/files/storage-files-networking-overview,Networking considerations for direct access,Networking Considerations for Azure Files,,"An overview of networking considerations and options for Azure Files, including secure transfer, public and private endpoints, VPN, ExpressRoute, DNS, and firewall settings.","✔️ Applies to: All Azure file shares You can access your Azure file shares over the public internet accessible endpoint, over one or more private endpoints on your network(s), or by caching your Azure file share on-premises with Azure File Sync (SMB file shares only). This article focuses on how to configure Azure Files for direct access over public and/or private endpoints. To learn how to cache your Azure file share on-premises with Azure File Sync, see Introduction to Azure File Sync. We recomme",2026-04-22T06:17:00.000Z,overview,,0.2,False,"Networking article is an overview of connectivity options (public/private endpoints, VPN, ExpressRoute, DNS, firewall). Summary suggests conceptual guidance without detailed parameter tables, limits, or specific security/RBAC settings, so it does not meet the expert-knowledge thresholds for the defined sub-skills.",updated https://learn.microsoft.com/en-us/azure/storage/files/storage-files-planning,Plan for an Azure Files deployment,Plan for an Azure Files Deployment,Plan Azure Files deployment and access model,"Understand how to plan for an Azure Files deployment. You can either direct mount an SMB or NFS file share, or cache SMB file shares on-premises with Azure File Sync.","You can deploy Azure Files in two main ways: by directly mounting the serverless Azure file shares or by caching file shares on-premises using Azure File Sync. 
Deployment considerations differ based on which option you choose. Direct mount of an Azure file share: Because Azure Files provides either Server Message Block (SMB) or Network File System (NFS) access, you can mount Azure file shares on-premises or in the cloud using the standard SMB or NFS clients available in your OS. Because Azure file",2025-09-18T08:00:00.000Z,concept-article,decision-making,0.7,True,Planning guide comparing direct mount vs Azure File Sync with deployment considerations; helps choose deployment approach and includes scenario-based guidance beyond generic concepts.,unchanged https://learn.microsoft.com/en-us/azure/storage/files/storage-files-prevent-file-share-deletion,File share soft delete,Azure file share soft delete,Configure and use soft delete for Azure file shares,Learn about soft delete for Azure Files and how you can use it for data recovery and preventing accidental deletion of Azure file shares.,"✔️ Applies to: Classic SMB and NFS file shares created with the Microsoft.Storage resource provider ✖️ Doesn't apply to: File shares created with the Microsoft.FileShares resource provider (preview) Azure Files offers soft delete, which allows you to recover your file share if you mistakenly deleted it.",2026-03-11T22:19:00.000Z,concept-article,configuration,0.7,True,"Feature-specific article for Azure Files soft delete; typically includes retention period settings, enable/disable behavior, and recovery behavior that are product-specific configuration details rather than just conceptual description.",unchanged https://learn.microsoft.com/en-us/azure/storage/files/storage-files-scale-targets,Scalability and performance targets,Azure Files scalability and performance targets,"Azure Files scalability, IOPS, and throughput limits","Learn about the scalability and performance targets for Azure Files, including file share storage, IOPS, and throughput.","Azure Files offers fully managed file shares in the cloud that are accessible via SMB and NFS file 
sharing protocols. This article discusses the scalability and performance targets for Azure Files. In addition to the limits set by Azure Files, other variables in your deployment can affect the targets listed in this article. You should test your usage pattern to determine whether the scalability and performance of Azure Files meet your requirements. In Azure, a resource is a manageable item that you",2025-09-09T05:11:00.000Z,concept-article,limits-quotas,0.9,True,"Explicitly a limits/targets article; contains tables of maximum share size, IOPS, throughput per share/account, and other numeric constraints that are expert knowledge.",unchanged diff --git a/products/azure-files/report.md b/products/azure-files/report.md index d89190c5..db633420 100644 --- a/products/azure-files/report.md +++ b/products/azure-files/report.md @@ -1,39 +1,40 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: configuration: 'Configuring Azure Files and Azure File Sync: networking/VPN and - private endpoints, monitoring/alerts, cloud tiering, DFS integration, redundancy, - soft delete, and secure access for apps and RAG.' + private endpoints, monitoring and alerts, cloud tiering, DFS integration, redundancy, + soft delete, and file copy tools.' decision-making: 'Guidance for planning Azure Files deployments: choosing tiers, redundancy, billing/cost models, reservations, access patterns, and migration/architecture options for SMB/NFS and File Sync.' - best-practices: Disaster recovery, lifecycle, and performance best practices for - Azure Files and Azure File Sync, including failover planning, server/drive replacement, - large directory handling, and VDI/FSLogix usage. - security: Securing Azure Files with identity-based auth (AD DS, Entra ID, Kerberos), - NTFS/share permissions, TLS/SMB/NFS hardening, and network/firewall/proxy configuration - for secure access. 
+ best-practices: 'Best practices for Azure Files and Azure File Sync: DR/failover + planning, server/drive replacement, safe deprovision/recovery, and performance + tuning for SMB/NFS and VDI/FSLogix workloads' + security: Securing Azure Files with identity-based auth (AD DS/Entra/Kerberos), + NTFS/share permissions, SMB/NFS security, TLS, firewalls/proxies, and network + perimeter configuration. deployment: Guides for migrating and syncing data to Azure Files/Azure File Sync from NAS, Linux, GlusterFS, SMB/NFS shares, and moving File Sync resources safely across scopes. limits-quotas: 'Azure Files/File Sync limits: capacity, IOPS/throughput, scalability targets, API throttling behavior, redundancy/region support, and FAQ on performance-related constraints.' - integrations: Patterns and code samples for building RAG apps over Azure Files using - Haystack, LangChain, LlamaIndex with Pinecone/Qdrant/Weaviate, plus .NET, Java, - and Python integration guides. + integrations: 'Patterns and code to build RAG apps with Azure Files: authenticating + and loading files, and integrating Haystack/LangChain/LlamaIndex with Pinecone, + Qdrant, Weaviate, plus .NET/Java/Python SDK usage.' skill_description: Expert knowledge for Azure Files development including best practices, decision making, limits & quotas, security, configuration, integrations & coding - patterns, and deployment. Use when configuring Azure Files/File Sync, SMB/NFS access, - DFS/VDI setups, RAG over files, or data migrations, and other Azure Files related - development tasks. Not for Azure Blob Storage (use azure-blob-storage), Azure NetApp - Files (use azure-netapp-files), Azure Managed Lustre (use azure-managed-lustre), - Azure Virtual Machines (use azure-virtual-machines). -use_when: Use when configuring Azure Files/File Sync, SMB/NFS access, DFS/VDI setups, - RAG over files, or data migrations, and other Azure Files related development tasks. + patterns, and deployment. 
Use when configuring Azure Files/File Sync networking, + SMB/NFS access, DR/failover, performance limits, or RAG app integration, and other + Azure Files related development tasks. Not for Azure Blob Storage (use azure-blob-storage), + Azure NetApp Files (use azure-netapp-files), Azure Table Storage (use azure-table-storage), + Azure Queue Storage (use azure-queue-storage). +use_when: Use when configuring Azure Files/File Sync networking, SMB/NFS access, DR/failover, + performance limits, or RAG app integration, and other Azure Files related development + tasks. confusable_not_for: Not for Azure Blob Storage (use azure-blob-storage), Azure NetApp - Files (use azure-netapp-files), Azure Managed Lustre (use azure-managed-lustre), - Azure Virtual Machines (use azure-virtual-machines). + Files (use azure-netapp-files), Azure Table Storage (use azure-table-storage), Azure + Queue Storage (use azure-queue-storage). --- # Azure Files Crawl Report @@ -42,13 +43,13 @@ confusable_not_for: Not for Azure Blob Storage (use azure-blob-storage), Azure N - **Total Pages**: 126 - **Fetched**: 126 - **Fetch Failed**: 0 -- **Classified**: 107 -- **Unclassified**: 19 +- **Classified**: 106 +- **Unclassified**: 20 ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 1 -- **Unchanged**: 125 +- **Updated Pages**: 16 +- **Unchanged**: 110 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-files/azure-files.csv` @@ -57,20 +58,50 @@ confusable_not_for: Not for Azure Blob Storage (use azure-blob-storage), Azure N | Type | Count | Percentage | |------|-------|------------| | best-practices | 12 | 9.5% | -| configuration | 23 | 18.3% | +| configuration | 22 | 17.5% | | decision-making | 17 | 13.5% | | deployment | 9 | 7.1% | -| integrations | 18 | 14.3% | +| integrations | 19 | 15.1% | | limits-quotas | 4 | 3.2% | -| security | 24 | 19.0% | -| *(Unclassified)* | 19 | 15.1% | +| security | 23 | 18.3% | +| *(Unclassified)* | 20 | 15.9% | ## 
Changes ### Updated Pages -- [Configure directory/file-level permissions](https://learn.microsoft.com/en-us/azure/storage/files/storage-files-identity-configure-file-level-permissions) - - Updated: 2026-04-07T08:00:00.000Z → 2026-04-16T22:31:00.000Z +- [Use Haystack with Weaviate](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/haystack-weaviate/tutorial-haystack-weaviate) + - Updated: 2026-04-10T22:10:00.000Z → 2026-04-23T22:13:00.000Z +- [Use Haystack with Qdrant](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/haystack-qdrant/tutorial-haystack-qdrant) + - Updated: 2026-04-10T22:10:00.000Z → 2026-04-23T22:13:00.000Z +- [What's new in Azure Files?](https://learn.microsoft.com/en-us/azure/storage/files/files-whats-new) + - Updated: 2025-11-18T08:00:00.000Z → 2026-04-20T08:00:00.000Z +- [Managed identities with Microsoft Entra ID](https://learn.microsoft.com/en-us/azure/storage/files/files-managed-identities) + - Updated: 2026-03-27T08:00:00.000Z → 2026-04-20T08:00:00.000Z +- [Networking considerations for direct access](https://learn.microsoft.com/en-us/azure/storage/files/storage-files-networking-overview) + - Updated: 2026-04-08T17:12:00.000Z → 2026-04-22T06:17:00.000Z +- [Configure Site-to-Site VPN](https://learn.microsoft.com/en-us/azure/storage/files/storage-files-configure-s2s-vpn) + - Updated: 2025-04-01T11:23:00.000Z → 2026-04-23T06:20:00.000Z +- [Disaster recovery and failover](https://learn.microsoft.com/en-us/azure/storage/files/files-disaster-recovery) + - Updated: 2026-03-11T22:19:00.000Z → 2026-04-22T08:00:00.000Z +- [What is retrieval-augmented generation?](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/overview) + - Updated: 2026-04-10T22:10:00.000Z → 2026-04-23T22:13:00.000Z +- [Get 
started](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/setup) + - Updated: 2026-04-10T22:10:00.000Z → 2026-04-23T22:13:00.000Z +- [Use LangChain with Pinecone](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/langchain-pinecone/tutorial-langchain-pinecone) + - Updated: 2026-04-10T22:10:00.000Z → 2026-04-23T22:13:00.000Z +- [Use LangChain with Weaviate](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/langchain-weaviate/tutorial-langchain-weaviate) + - Updated: 2026-04-10T22:10:00.000Z → 2026-04-23T22:13:00.000Z +- [Use LangChain with Qdrant](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/langchain-qdrant/tutorial-langchain-qdrant) + - Updated: 2026-04-10T22:10:00.000Z → 2026-04-23T22:13:00.000Z +- [Use LlamaIndex with Pinecone](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/llamaindex-pinecone/tutorial-llamaindex-pinecone) + - Updated: 2026-04-10T22:10:00.000Z → 2026-04-23T22:13:00.000Z +- [Use LlamaIndex with Weaviate](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/llamaindex-weaviate/tutorial-llamaindex-weaviate) + - Updated: 2026-04-10T22:10:00.000Z → 2026-04-23T22:13:00.000Z +- [Use LlamaIndex with Qdrant](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/llamaindex-qdrant/tutorial-llamaindex-qdrant) + - Updated: 2026-04-10T22:10:00.000Z → 2026-04-23T22:13:00.000Z +- [Use Haystack with 
Pinecone](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/haystack-pinecone/tutorial-haystack-pinecone) + - Updated: 2026-04-10T22:10:00.000Z → 2026-04-23T22:13:00.000Z ## Classified Pages @@ -102,21 +133,21 @@ confusable_not_for: Not for Azure Blob Storage (use azure-blob-storage), Azure N | [Configure directory/file-level permissions](https://learn.microsoft.com/en-us/azure/storage/files/storage-files-identity-configure-file-level-permissions) | security | 0.78 | The page gives product-specific, stepwise guidance for configuring Windows ACLs/NTFS permissions on Azure Files with Active Directory authentication, including required RBAC share-level permissions and how they interact with file-level ACLs. This is concrete security configuration knowledge (roles, permission layers, and configuration behavior) that goes beyond generic security concepts. | | [Enable AD DS authentication](https://learn.microsoft.com/en-us/azure/storage/files/storage-files-identity-ad-ds-enable) | security | 0.78 | How-to guide for configuring AD DS authentication on Azure Files, including product-specific identity/security settings, required parameters, and step sequences that go beyond generic knowledge of SMB or Active Directory. | | [Enable Microsoft Entra Domain Services](https://learn.microsoft.com/en-us/azure/storage/files/storage-files-identity-auth-domain-services-enable) | security | 0.78 | Covers enabling Microsoft Entra Domain Services for SMB Azure file shares with concrete configuration steps, roles, and identity/security settings specific to Azure Files. 
| +| [Managed identities with Microsoft Entra ID](https://learn.microsoft.com/en-us/azure/storage/files/files-managed-identities) | security | 0.78 | Page describes how to use managed identities with Azure Files SMB shares, including product-specific Microsoft Entra ID configuration, role assignments, and identity-based authentication settings. These are concrete security/RBAC configuration steps unique to Azure Files rather than generic identity concepts. | | [Choose how to authorize access to file data in the Azure portal](https://learn.microsoft.com/en-us/azure/storage/files/authorize-data-operations-portal) | security | 0.76 | Explains how portal requests to Azure Files are authorized, with specific details on using Entra accounts vs access keys and required permissions, which are product-specific security behaviors. | -| [Managed identities with Microsoft Entra ID](https://learn.microsoft.com/en-us/azure/storage/files/files-managed-identities) | security | 0.76 | Shows how to configure managed identities with Azure Files, including storage account and identity bindings; these are concrete, product-specific identity configuration steps rather than generic managed identity concepts. | | [Azure File Sync agent silent installation](https://learn.microsoft.com/en-us/azure/storage/file-sync/file-sync-agent-silent-installation) | configuration | 0.75 | Silent installation requires specific command-line parameters, transforms, and options unique to the agent installer. | | [Choose your cloud tiering policies](https://learn.microsoft.com/en-us/azure/storage/file-sync/file-sync-choose-cloud-tiering-policies) | decision-making | 0.75 | Explicitly about choosing and adjusting policies; provides scenario-based recommendations for policy values and trade-offs. 
| | [Cost estimation examples](https://learn.microsoft.com/en-us/azure/storage/files/file-estimate-cost) | decision-making | 0.75 | Walks through cost estimation scenarios with concrete usage patterns and price impacts between billing models. Provides quantified trade-offs and scenario-based recommendations. | | [Migrate to NFS Azure file shares](https://learn.microsoft.com/en-us/azure/storage/files/storage-files-migration-nfs) | decision-making | 0.75 | Describes migration from Linux file servers to NFS Azure Files and compares fpsync vs rsync performance; includes tool comparison and scenario guidance. | | [Modify Azure File Sync topology](https://learn.microsoft.com/en-us/azure/storage/file-sync/file-sync-modify-sync-topology) | best-practices | 0.75 | Explicitly described as best practices to avoid errors and data loss when changing topology; product-specific recommendations and edge cases. | | [Update password](https://learn.microsoft.com/en-us/azure/storage/files/storage-files-identity-ad-ds-update-password) | security | 0.75 | Describes how the AD principal password functions as a Kerberos key for Azure Files and how to rotate it; product-specific security/identity maintenance guidance. | -| [Use Haystack with Pinecone](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/haystack-pinecone/tutorial-haystack-pinecone) | integrations | 0.75 | Tutorial integrating Haystack DAG pipelines with Pinecone and Azure Files. The explicit component wiring (embedder, retriever, prompt builder, generator) to Azure Files-backed data is a concrete integration pattern. | -| [Use LangChain with Pinecone](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/langchain-pinecone/tutorial-langchain-pinecone) | integrations | 0.75 | Tutorial wiring LangChain, Pinecone, and Azure Files together. 
Mentions Pinecone namespaces mapping to Azure Files directories, which is a concrete integration pattern and configuration behavior unique to this combination. | -| [Use LangChain with Qdrant](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/langchain-qdrant/tutorial-langchain-qdrant) | integrations | 0.75 | Tutorial integrating LangChain, Qdrant, and Azure Files. Uses a single Qdrant collection with indexed payload filtering for department scoping, a concrete integration and query pattern specific to this stack. | -| [Use LangChain with Weaviate](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/langchain-weaviate/tutorial-langchain-weaviate) | integrations | 0.75 | Tutorial integrating LangChain, Weaviate, and Azure Files. Uses Weaviate multi-tenancy mapped to Azure Files directories with automatic tenant creation, which is a specific integration behavior and configuration pattern. | -| [Use LlamaIndex with Pinecone](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/llamaindex-pinecone/tutorial-llamaindex-pinecone) | integrations | 0.75 | Tutorial integrating LlamaIndex node parsers with Pinecone namespaces mapped to Azure Files directories. Contains specific integration patterns and configuration for this combination. | -| [Use LlamaIndex with Qdrant](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/llamaindex-qdrant/tutorial-llamaindex-qdrant) | integrations | 0.75 | Tutorial integrating LlamaIndex with Qdrant using a single collection and payload filtering for scoping. This is a specific integration and configuration pattern unique to this stack. 
| -| [Use LlamaIndex with Weaviate](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/llamaindex-weaviate/tutorial-llamaindex-weaviate) | integrations | 0.75 | Tutorial integrating LlamaIndex with Weaviate hybrid search over Azure Files. The hybrid search usage and mapping to file-share content is a concrete integration pattern with product-specific query/config details. | +| [Use Haystack with Pinecone](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/haystack-pinecone/tutorial-haystack-pinecone) | integrations | 0.75 | Describes a RAG pipeline using Azure Files, Haystack, and Pinecone. It necessarily includes concrete configuration of Haystack components, Pinecone indexes, and Azure Files access, which is expert integration knowledge. | +| [Use LangChain with Pinecone](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/langchain-pinecone/tutorial-langchain-pinecone) | integrations | 0.75 | Tutorial builds a full RAG pipeline using Azure Files, LangChain, and Pinecone. Such pages typically include concrete SDK usage, configuration parameters (for vector DB, embeddings, and file access), and code patterns specific to this integration stack, which matches the integrations sub-skill. | +| [Use LangChain with Qdrant](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/langchain-qdrant/tutorial-langchain-qdrant) | integrations | 0.75 | Builds a RAG pipeline using Azure Files, LangChain, and Qdrant. This requires specific configuration of Qdrant, LangChain vector stores, and Azure Files access, which is expert integration knowledge beyond generic RAG concepts. 
| +| [Use LangChain with Weaviate](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/langchain-weaviate/tutorial-langchain-weaviate) | integrations | 0.75 | Similar to index 3 but with Weaviate as the vector database. It focuses on wiring Azure Files into a LangChain + Weaviate pipeline, with product-specific configuration and code patterns for that integration. | +| [Use LlamaIndex with Pinecone](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/llamaindex-pinecone/tutorial-llamaindex-pinecone) | integrations | 0.75 | Shows a concrete RAG pipeline using Azure Files with LlamaIndex and Pinecone. Such content typically includes specific constructor parameters, index configuration, and file access patterns unique to this combination, fitting integrations. | +| [Use LlamaIndex with Qdrant](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/llamaindex-qdrant/tutorial-llamaindex-qdrant) | integrations | 0.75 | Covers building a RAG pipeline using Azure Files, LlamaIndex, and Qdrant, including specific configuration and code patterns for that stack, which qualifies as integrations & coding patterns. | +| [Use LlamaIndex with Weaviate](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/llamaindex-weaviate/tutorial-llamaindex-weaviate) | integrations | 0.75 | Provides code and configuration to connect Azure Files to LlamaIndex and Weaviate. This is a product-specific integration pattern with concrete SDK usage and settings, not just conceptual guidance. 
| | [Encryption in transit for NFS shares](https://learn.microsoft.com/en-us/azure/storage/files/encryption-in-transit-for-nfs-shares) | security | 0.72 | Explains how to encrypt data in transit for NFS Azure file shares using TLS. This will include product-specific security configuration steps and parameters for enabling TLS on NFSv4.1 volumes, which is concrete security configuration rather than conceptual content. | | [NFS performance](https://learn.microsoft.com/en-us/azure/storage/files/nfs-performance) | best-practices | 0.72 | NFS performance tuning for Azure file shares (such as using the nconnect mount option and other mount/throughput settings) is product- and platform-specific, with concrete configuration guidance for Linux clients and Azure Files. This is actionable optimization advice unique to this service, so it fits best-practices. | | [SMB performance](https://learn.microsoft.com/en-us/azure/storage/files/smb-performance) | best-practices | 0.72 | Performance tuning guidance for SMB Azure file shares is product-specific and typically includes concrete recommendations (for example, enabling SMB Multichannel, client/OS settings, and Azure Files configuration choices) that go beyond generic knowledge. These are actionable DO/DON'T style optimizations unique to Azure Files SMB, fitting best-practices. | @@ -125,11 +156,12 @@ confusable_not_for: Not for Azure Blob Storage (use azure-blob-storage), Azure N | [Azure Files FAQ](https://learn.microsoft.com/en-us/azure/storage/files/storage-files-faq) | limits-quotas | 0.70 | Azure Files FAQ typically includes many numeric limits (max shares, file size, throughput behaviors) and edge-case behaviors. These are expert details not obvious from overviews. 
| | [Change the redundancy configuration](https://learn.microsoft.com/en-us/azure/storage/files/files-change-redundancy-configuration) | configuration | 0.70 | Describes exact steps and constraints for changing replication settings (e.g., allowed transitions, unsupported combinations). These are concrete configuration rules unique to Azure Files. | | [Configure Point-to-Site VPN on Windows](https://learn.microsoft.com/en-us/azure/storage/files/storage-files-configure-p2s-vpn-windows) | configuration | 0.70 | Step-by-step configuration for Windows P2S VPN to Azure Files with product-specific parameters (VPN client config, certificate requirements, address spaces). Contains concrete settings and values unique to Azure Files networking, beyond generic VPN knowledge. | -| [Configure Site-to-Site VPN](https://learn.microsoft.com/en-us/azure/storage/files/storage-files-configure-s2s-vpn) | configuration | 0.70 | Detailed S2S VPN configuration for Azure Files using VPN Gateway, including gateway types, address spaces, and Azure-specific parameters. This is concrete configuration reference, not just conceptual guidance. | +| [Configure Site-to-Site VPN](https://learn.microsoft.com/en-us/azure/storage/files/storage-files-configure-s2s-vpn) | configuration | 0.70 | Page walks through configuring a site-to-site VPN with Azure VPN Gateway specifically for Azure Files, including Azure resource configuration, likely with gateway types, SKU choices, address spaces, and other product-specific parameters. This is detailed configuration guidance rather than generic VPN theory or simple how-to. | | [Create a file share (Microsoft.FileShares)](https://learn.microsoft.com/en-us/azure/storage/files/create-file-share) | decision-making | 0.70 | The article explicitly tells readers to review criteria before creating a share to decide if the Microsoft.FileShares resource provider is the right fit. 
This implies product-specific decision guidance between classic Microsoft.Storage file shares and the new Microsoft.FileShares model. While primarily a how-to, it embeds selection guidance unique to this preview provider, which qualifies as decision-making content more than generic configuration or overview. | | [Create alerts](https://learn.microsoft.com/en-us/azure/storage/files/files-monitoring-alerts) | configuration | 0.70 | Shows how to configure alerts on Azure Files metrics/logs (throttling, capacity, egress, server latency) with specific metric names and alert settings; these are concrete configuration patterns for monitoring this product. | | [Delete an Azure File Sync server endpoint](https://learn.microsoft.com/en-us/azure/storage/file-sync/file-sync-server-endpoint-delete) | best-practices | 0.70 | Focuses on preserving data integrity and availability when removing endpoints; scenario-based guidance and gotchas specific to Azure File Sync behavior. | | [File share soft delete](https://learn.microsoft.com/en-us/azure/storage/files/storage-files-prevent-file-share-deletion) | configuration | 0.70 | Feature-specific article for Azure Files soft delete; typically includes retention period settings, enable/disable behavior, and recovery behavior that are product-specific configuration details rather than just conceptual description. | +| [Get started](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/setup) | integrations | 0.70 | Shows how to authenticate to Azure file shares over SMB with Microsoft Entra ID and build download logic used by multiple RAG tutorials. This implies concrete integration patterns, including specific authentication parameters and code-level access patterns unique to Azure Files + Entra ID, which fits integrations & coding patterns. 
| | [Install Azure File Sync agent extension on Arc-enabled Windows Servers](https://learn.microsoft.com/en-us/azure/storage/file-sync/file-sync-extension) | configuration | 0.70 | Describes installing/validating/uninstalling the agent extension; likely includes extension name, parameters, and portal/CLI configuration specifics. | | [Java](https://learn.microsoft.com/en-us/azure/storage/files/storage-java-how-to-use-file-storage) | integrations | 0.70 | Provides Java-specific SDK/API usage for Azure Files, including configuration parameters and code patterns for creating/deleting shares, directories, and files; these are product-specific integration details. | | [Large directory best practices](https://learn.microsoft.com/en-us/azure/storage/files/nfs-large-directories) | best-practices | 0.70 | Gives specific recommendations for working with very large directories on Azure Files NFS mounts, including mount options, commands, and operational patterns that are unique to this service scenario. | @@ -159,10 +191,10 @@ confusable_not_for: Not for Azure Blob Storage (use azure-blob-storage), Azure N | [Weaviate](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/vector-databases/weaviate) | integrations | 0.70 | Explains Weaviate hybrid search with Azure Files as source and orchestration frameworks. Likely includes Weaviate schema/config parameters and query patterns specific to this setup, fitting integrations criteria. | | [Zonal placement](https://learn.microsoft.com/en-us/azure/storage/files/zonal-placement) | decision-making | 0.70 | Explains how and when to choose specific availability zones for SSD Azure Files accounts, including constraints (LRS, SSD tier, supported regions) and latency considerations; this is concrete placement/architecture decision guidance. 
| | [Configure Point-to-Site VPN on Linux](https://learn.microsoft.com/en-us/azure/storage/files/storage-files-configure-p2s-vpn-linux) | configuration | 0.68 | Details configuring P2S VPN on Linux specifically for Azure Files, including VPN settings, client configuration parameters, and Azure-side options that go beyond generic VPN knowledge. | -| [Use Haystack with Qdrant](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/haystack-qdrant/tutorial-haystack-qdrant) | integrations | 0.68 | Tutorial describes a specific integration pattern between Azure Files, Haystack, and Qdrant for RAG, including how Qdrant collections and indexed payload filtering are used at query time. This is concrete, product-specific integration guidance rather than a conceptual overview. | -| [Use Haystack with Weaviate](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/haystack-weaviate/tutorial-haystack-weaviate) | integrations | 0.68 | Tutorial shows a concrete integration pattern between Azure Files, Haystack, and Weaviate for RAG, including product-specific orchestration and vector DB usage details (DAG components, hybrid search with alpha parameter). This is code-focused integration knowledge that goes beyond generic concepts. | +| [Use Haystack with Qdrant](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/haystack-qdrant/tutorial-haystack-qdrant) | integrations | 0.68 | Tutorial describes a specific integration of Azure Files with Haystack and Qdrant as the vector database. It likely contains concrete configuration (Qdrant collection settings, connection parameters, auth details) and code patterns unique to this stack, which aligns with the integrations & coding patterns sub-skill. 
| +| [Use Haystack with Weaviate](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/haystack-weaviate/tutorial-haystack-weaviate) | integrations | 0.68 | Tutorial shows a concrete integration pattern between Azure Files, Haystack, and Weaviate. It likely includes product-specific SDK usage and configuration parameters (endpoints, index/collection names, auth setup) that go beyond generic RAG concepts, fitting the integrations & coding patterns category. | | [Disable SMB on the Linux SMB client](https://learn.microsoft.com/en-us/azure/storage/files/files-remove-smb1-linux) | security | 0.65 | Provides concrete Linux configuration steps and commands to disable SMB1 in the context of Azure Files, including package/module settings. This is product- and protocol-specific security hardening guidance. | -| [Get started](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/setup) | configuration | 0.65 | Focuses on authenticating to Azure file shares and downloading contents for RAG tooling. Likely includes specific auth methods, connection strings, and configuration parameters (for SDK/CLI) that are product-specific configuration details. | +| [Disaster recovery and failover](https://learn.microsoft.com/en-us/azure/storage/files/files-disaster-recovery) | best-practices | 0.65 | Disaster recovery guidance for Azure Files typically includes product-specific recommendations (for example when to trigger storage account failover, how to handle secondary endpoints, and sequencing of recovery steps) that go beyond generic DR theory. While the summary is high level, this topic usually documents Azure Files–specific behaviors and gotchas during failover, which qualifies as product-specific best practices. 
| | [Haystack](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/orchestrations/haystack) | integrations | 0.65 | Covers Haystack DAG-based pipelines wired to Azure Files as the document source. Likely includes concrete component wiring and configuration specific to this integration, not just generic Haystack concepts. | | [LangChain](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/orchestrations/langchain) | integrations | 0.65 | Describes using LangChain components (document loaders, retrievers, vector stores) specifically with Azure Files as a data source. Likely includes product-specific code patterns and configuration parameters for this integration beyond generic LangChain usage. | | [LlamaIndex](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/orchestrations/llamaindex) | integrations | 0.65 | Explains using LlamaIndex abstractions (SentenceSplitter, VectorStoreIndex, RetrieverQueryEngine) with Azure Files as the backing store. This is a product-specific integration pattern with concrete code/config unique to this combination. | @@ -171,7 +203,6 @@ confusable_not_for: Not for Azure Blob Storage (use azure-blob-storage), Azure N | [Migrate from one Azure file share to another](https://learn.microsoft.com/en-us/azure/storage/file-sync/file-sync-share-to-share-migration) | deployment | 0.65 | Migration procedure differs based on cloud tiering state; contains detailed steps and constraints for moving data between shares and storage accounts. 
| | [Monitor cloud tiering](https://learn.microsoft.com/en-us/azure/storage/file-sync/file-sync-monitor-cloud-tiering) | configuration | 0.65 | Describes monitoring via server endpoint blade and Azure Monitor; likely lists specific metrics and their meanings, which are product-specific configuration/monitoring details. | | [NFS file shares](https://learn.microsoft.com/en-us/azure/storage/files/files-nfs-protocol) | security | 0.65 | NFS protocol page for Azure Files covering security, networking, feature support, and regional availability. Azure NFS has product-specific security and networking requirements (e.g., allowed networks, auth models, export rules) that constitute expert configuration/security knowledge beyond generic NFS concepts, fitting the security sub-skill best. | -| [Networking considerations for direct access](https://learn.microsoft.com/en-us/azure/storage/files/storage-files-networking-overview) | security | 0.65 | Networking considerations article for Azure Files that covers secure transfer, public vs private endpoints, VPN/ExpressRoute, DNS, and firewall settings. It likely includes specific configuration options (e.g., secure transfer required, firewall rules, endpoint types) and is focused on secure access patterns, fitting the security category. | | [Overview](https://learn.microsoft.com/en-us/azure/storage/files/storage-files-developer-overview) | decision-making | 0.65 | Overview targeted at helping developers decide between different programming models/approaches for Azure Files; includes product-specific trade-offs between APIs/SDKs and access methods, which is decision guidance rather than generic storage info. | | [Plan for an Azure File Sync deployment](https://learn.microsoft.com/en-us/azure/storage/file-sync/file-sync-planning) | decision-making | 0.65 | Planning article that discusses deployment options and how choice changes behavior; typically includes scenario-based guidance and trade-offs for when to choose each option. 
| | [Replace Windows file servers with Azure File Sync](https://learn.microsoft.com/en-us/azure/storage/files/windows-server-to-azure-files) | decision-making | 0.65 | Discusses replacing/extending Windows file servers with Azure Files and Azure File Sync, likely including scenario-based deployment approaches and trade-offs for migration decisions. | @@ -182,7 +213,6 @@ confusable_not_for: Not for Azure Blob Storage (use azure-blob-storage), Azure N | [Migrate from on-premises NAS to a hybrid file server using DataBox](https://learn.microsoft.com/en-us/azure/storage/files/storage-files-migration-nas-hybrid-databox) | deployment | 0.64 | Step-by-step migration scenario combining Azure File Sync and Data Box with product-specific constraints (applicability to classic SMB shares, exclusions for NFS and Microsoft.FileShares) and process details that guide a concrete deployment/migration pattern. | | [Use DFS-N with Azure Files](https://learn.microsoft.com/en-us/azure/storage/files/files-manage-namespaces) | configuration | 0.64 | Explains using DFS-N with Azure Files, including namespace configuration and path mapping specifics that are product- and scenario-specific configuration knowledge. | | [Add an Azure File Sync Server endpoint](https://learn.microsoft.com/en-us/azure/storage/file-sync/file-sync-server-endpoint-create) | decision-making | 0.60 | Explicitly about understanding options and decisions for server endpoint creation; likely includes scenario-based guidance on which options to choose for different use cases. | -| [Disaster recovery and failover](https://learn.microsoft.com/en-us/azure/storage/files/files-disaster-recovery) | best-practices | 0.60 | Covers DR planning and failover processes for Azure Files, including product-specific guidance on when and how to fail over storage accounts; organized as actionable recommendations rather than just conceptual DR theory. 
| ## Unclassified Pages @@ -202,8 +232,9 @@ confusable_not_for: Not for Azure Blob Storage (use azure-blob-storage), Azure N | [Extend Windows file servers with Azure File Sync](https://learn.microsoft.com/en-us/azure/storage/file-sync/file-sync-extend-servers) | 0.30 | Tutorial-style extension of Windows file servers; likely step-by-step but not focused on config matrices, limits, or error mappings. | | [Customer case studies](https://learn.microsoft.com/en-us/azure/storage/files/azure-files-case-study) | 0.20 | Customer case studies are narrative/marketing-style usage stories, not technical reference with limits, configs, or error mappings. They don't provide reusable expert configuration or troubleshooting knowledge. | | [Mount SMB file share on Windows](https://learn.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-windows) | 0.20 | Primarily a how-to tutorial for mounting SMB Azure file shares on Windows. While it may show commands, it is generic usage guidance without configuration tables, limits, or product-specific diagnostic/security details that rise to expert-knowledge level per the defined categories. | +| [Networking considerations for direct access](https://learn.microsoft.com/en-us/azure/storage/files/storage-files-networking-overview) | 0.20 | Networking article is an overview of connectivity options (public/private endpoints, VPN, ExpressRoute, DNS, firewall). Summary suggests conceptual guidance without detailed parameter tables, limits, or specific security/RBAC settings, so it does not meet the expert-knowledge thresholds for the defined sub-skills. | | [Overview of authorization and access control](https://learn.microsoft.com/en-us/azure/storage/files/storage-files-authorization-overview) | 0.20 | Authorization overview describing concepts (share-level, directory-level, file-level, RBAC) without detailed role lists, parameter tables, or other concrete configuration data; primarily conceptual. 
| | [What is Azure File Sync?](https://learn.microsoft.com/en-us/azure/storage/file-sync/file-sync-introduction) | 0.20 | High-level introduction to Azure File Sync; summary indicates conceptual overview of what the service does without specific limits, configuration tables, error codes, or decision matrices. | -| [What is retrieval-augmented generation?](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/overview) | 0.20 | High-level overview of using Azure Files as a data source for RAG; no concrete limits, configuration tables, error codes, or decision matrices. | +| [What is retrieval-augmented generation?](https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/overview) | 0.20 | High-level overview of RAG with Azure Files; describes the concept and general pipeline behavior without indicating product-specific limits, configuration tables, or detailed decision/troubleshooting content. | +| [What's new in Azure Files?](https://learn.microsoft.com/en-us/azure/storage/files/files-whats-new) | 0.20 | Release notes / what's new page listing recent features and changes for Azure Files and Azure File Sync. It does not focus on limits, configuration matrices, troubleshooting mappings, or other structured expert details as defined; primarily update/announcement content. | | [What is Azure Files?](https://learn.microsoft.com/en-us/azure/storage/files/storage-files-introduction) | 0.10 | High-level introduction to Azure Files describing what the service is and basic capabilities (SMB/NFS access, platforms supported). No specific limits, configuration tables, error codes, or decision matrices are present. | -| [What's new in Azure Files?](https://learn.microsoft.com/en-us/azure/storage/files/files-whats-new) | 0.10 | What's new changelog; mostly release notes and feature announcements without structured limits, configs, or troubleshooting mappings. 
| diff --git a/products/azure-firewall/azure-firewall.csv b/products/azure-firewall/azure-firewall.csv index 0d52ea57..0818b97e 100644 --- a/products/azure-firewall/azure-firewall.csv +++ b/products/azure-firewall/azure-firewall.csv @@ -29,7 +29,7 @@ https://learn.microsoft.com/en-us/azure/firewall/features-by-sku,Azure Firewall https://learn.microsoft.com/en-us/azure/firewall/firewall-azure-policy,Portal,Use Azure Policy to help secure your Azure Firewall deployments,Enforce Azure Firewall security using Azure Policy,Govern Azure Firewall configurations by applying Azure Policies that enforce security best practices and organizational compliance standards.,"Azure Policy is a service in Azure that you can use to create, assign, and manage policies. These policies enforce different rules and effects over your resources, so those resources stay compliant with your corporate standards and service level agreements. Azure Policy evaluates your resources for noncompliance with assigned policies. For example, you can use a policy to allow only a certain size of virtual machines in your environment or to enforce a specific tag on resources. You can use Azur",2026-03-29T11:12:00.000Z,how-to,security,0.65,True,"Describes using Azure Policy to govern Azure Firewall configurations and enforce security/compliance standards; likely includes specific built-in policy definitions and parameters, which are product-specific security configuration details.",unchanged https://learn.microsoft.com/en-us/azure/firewall/firewall-best-practices,Best practices for performance,Azure Firewall best practices for performance,Optimize Azure Firewall performance with tuning guidelines,"Learn how to configure Azure Firewall to maximize performance and minimize latency using best practices for rules, SNAT, IDPS, and monitoring.","To maximize theperformanceof your Azure Firewall and Firewall policy, it’s important to follow best practices. 
However, certain network behaviors or features can affect the firewall’s performance and latency, despite its performance optimization capabilities.",2026-03-29T11:12:00.000Z,concept-article,best-practices,0.86,True,"The page provides product-specific performance tuning guidance for Azure Firewall, including concrete recommendations on configuring rules, SNAT, IDPS, and monitoring to minimize latency and maximize throughput. These are actionable DO/DON'T style best practices tied to Azure Firewall behavior rather than generic networking advice, so it fits the best-practices sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/firewall/firewall-copilot,Microsoft Copilot for Security,Azure Firewall integration in Microsoft Security Copilot,,Learn about using Microsoft Security Copilot to investigate traffic flagged by Azure Firewall with Intrusion Detection and Prevention System (IDPS).,"Security Copilot is a generative AI-powered security solution that helps increase the efficiency and capabilities of security personnel to improve security outcomes at machine speed and scale. It provides a natural language, assistive copilot experience that helps support security professionals in end-to-end scenarios such as incident response, threat hunting, intelligence gathering, and posture management. For more information about what it can do, seeWhat is Microsoft Security Copilot?",2026-03-29T11:12:00.000Z,concept-article,,0.2,False,"High-level overview of Azure Firewall integration with Microsoft Security Copilot; mostly conceptual description of capabilities, not detailed configuration or limits.",unchanged -https://learn.microsoft.com/en-us/azure/firewall/firewall-faq,FAQ,Azure Firewall FAQ,Azure Firewall FAQ limits and behaviors,"FAQ for Azure Firewall. 
A managed, cloud-based network security service that protects your Azure Virtual Network resources.",,2026-04-15T17:15:00Z,faq,limits-quotas,0.74,True,"The FAQ includes concrete, product-specific numeric limits and behaviors (for example, maximum number of IP groups, rule collection limits, SNAT port allocations, throughput expectations, and other capacity-related figures) that are unlikely to be reliably known from general training data. These are expressed as exact values and constraints, fitting the limits-quotas category better than the others.",updated +https://learn.microsoft.com/en-us/azure/firewall/firewall-faq,FAQ,Azure Firewall FAQ,Azure Firewall FAQ limits and behaviors,"FAQ for Azure Firewall. A managed, cloud-based network security service that protects your Azure Virtual Network resources.",,2026-04-15T17:15:00Z,faq,limits-quotas,0.74,True,"The FAQ includes concrete, product-specific numeric limits and behaviors (for example, maximum number of IP groups, rule collection limits, SNAT port allocations, throughput expectations, and other capacity-related figures) that are unlikely to be reliably known from general training data. These are expressed as exact values and constraints, fitting the limits-quotas category better than the others.",unchanged https://learn.microsoft.com/en-us/azure/firewall/firewall-multi-hub-spoke,Routing in hub and spoke,Use Azure Firewall to Route a Multi-hub and Spoke Topology,Architect multi-hub and spoke routing with Azure Firewall,Learn how to deploy Azure Firewall to route traffic in a multi-hub and spoke topology.,The hub-and-spoke topology is a common network architecture pattern in Azure that consolidates network resources and centralizes network services. This topology provides network connectivity and security to virtual networks deployed across different subscriptions or tenants. 
You can implement hub-and-spoke architectures in two ways: This article focuses on self-managed hub-and-spoke topologies where you have full visibility and control over the hub virtual network and Azure Firewall deployment. ,2025-09-29T22:15:00.000Z,concept-article,architecture-patterns,0.7,True,Covers using Azure Firewall in self-managed multi-hub-and-spoke topologies; this is a concrete Azure networking architecture pattern.,unchanged https://learn.microsoft.com/en-us/azure/firewall/firewall-performance,Performance,Azure Firewall performance,Plan Azure Firewall performance and SKU throughput,"Learn about Azure Firewall performance data and throughput benchmarks for Basic, Standard, and Premium SKUs across different use cases.","Reliable firewall performance is essential to operate and protect your virtual networks in Azure. More advanced features, like those found in Azure Firewall Premium, require more processing complexity and affect firewall performance and overall network performance. Azure Firewall has three versions: Basic, Standard, and Premium. Azure Firewall Basic Azure Firewall Basic is intended for small and medium size (SMB) customers to secure their Azure cloud environments. It provides the essential prote",2026-03-29T11:12:00.000Z,concept-article,decision-making,0.7,True,"Provides performance data and throughput benchmarks for Basic, Standard, and Premium across use cases; supports SKU/tier selection and capacity planning with quantified trade-offs.",unchanged https://learn.microsoft.com/en-us/azure/firewall/firewall-preview,Preview features,Azure Firewall preview features,,Learn about Azure Firewall preview features that are publicly available for deployment and testing.,"You can deploy and test the following Azure Firewall preview features. Some preview features are available on the Azure portal, and some are only visible by using a feature flag. Important These features are currently in preview. 
@@ -78,8 +78,8 @@ https://learn.microsoft.com/en-us/azure/firewall/service-tags,Service tags,Overv https://learn.microsoft.com/en-us/azure/firewall/snat-private-range,SNAT private ranges,Azure Firewall SNAT private IP address ranges,Configure SNAT private IP ranges in Azure Firewall,You can configure IP address ranges for SNAT.,"Azure Firewall provides SNAT capability for all outbound traffic to public IP addresses. By default, Azure Firewall doesn't use SNAT with network rules when the destination IP address is in a private IP address range perIANA RFC 1918or shared address space perIANA RFC 6598. Application rules always use SNAT through atransparent proxyregardless of the destination IP address. This default behavior is suitable when routing traffic directly to the Internet. However, there are scenarios where you mig",2026-03-28T08:00:00.000Z,how-to,configuration,0.8,True,"Explains default SNAT behavior with RFC1918/RFC6598 ranges and how to override it; includes product-specific configuration options and edge-case behavior for SNAT, which is expert configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/firewall/sql-fqdn-filtering,Application rules with SQL FQDNs,Configure Azure Firewall application rules with SQL FQDNs,Configure Azure Firewall application rules with SQL FQDNs,Configure SQL FQDNs in Azure Firewall application rules to control access to SQL server instances from your virtual networks.,"You can configure Azure Firewall application rules with SQL FQDNs. This configuration limits access from your virtual networks to only the specified SQL server instances. By using SQL FQDNs, you can filter traffic: SQL FQDN filtering is supported inproxy-modeonly (port 1433). If you use SQL in the default redirect mode, you can filter access by using the SQL service tag as part ofnetwork rules. 
If you use non-default ports for SQL IaaS traffic, you can configure those ports in the firewall appli",2026-03-30T22:11:00.000Z,how-to,configuration,0.75,True,"Provides product-specific configuration details: SQL FQDN filtering supported only in proxy mode on port 1433, behavior in redirect mode, and handling non-default ports—these are concrete configuration behaviors unique to Azure Firewall.",unchanged -https://learn.microsoft.com/en-us/azure/firewall/support-help,Support and troubleshooting,Azure Firewall support and help options,,How to obtain help and support for questions or problems when you create solutions using Azure Firewall.,Here are suggestions for where you can get help when developing your Azure Firewall solutions.,2026-04-14T11:11:00.000Z,troubleshooting,,0.0,False,"Support/help options page; no technical limits, configuration parameters, error codes, or product-specific expert details—primarily guidance on where to get assistance.",new -https://learn.microsoft.com/en-us/azure/firewall/tcp-session-behavior,TCP idle timeout behavior,Understanding Azure Firewall TCP (Transmission Control Protocol) session management and idle timeout behavior,Manage Azure Firewall TCP session idle timeouts,Learn about the behavior of long-running sessions and TCP idle timeout for Azure Firewall.,"This article explains the behavior of long-running sessions and the TCP idle timeout for Azure Firewall. 
Understanding these concepts is crucial for maintaining network security, optimizing firewall resources, and ensuring uninterrupted connectivity for critical applications.",2026-03-29T11:12:00.000Z,concept-article,limits-quotas,0.7,True,"Focuses on TCP session management and idle timeout behavior; such articles typically include specific default timeout values and ranges, which are numeric limits unique to the product.",unchanged +https://learn.microsoft.com/en-us/azure/firewall/support-help,Support and troubleshooting,Azure Firewall support and help options,,How to obtain help and support for questions or problems when you create solutions using Azure Firewall.,Here are suggestions for where you can get help when developing your Azure Firewall solutions.,2026-04-14T11:11:00.000Z,troubleshooting,,0.0,False,"Support/help options page; no technical limits, configuration parameters, error codes, or product-specific expert details—primarily guidance on where to get assistance.",unchanged +https://learn.microsoft.com/en-us/azure/firewall/tcp-session-behavior,TCP idle timeout behavior,Understanding Azure Firewall TCP (Transmission Control Protocol) session management and idle timeout behavior,Configure Azure Firewall TCP session idle timeouts,Learn about the behavior of long-running sessions and TCP idle timeout for Azure Firewall.,"This article explains the behavior of long-running sessions and the TCP idle timeout for Azure Firewall. Understanding these concepts is crucial for maintaining network security, optimizing firewall resources, and ensuring uninterrupted connectivity for critical applications.",2026-04-20T22:11:00.000Z,concept-article,limits-quotas,0.78,True,The page describes Azure Firewall TCP session management with specific idle timeout values and behaviors for long-running sessions. 
These are product-specific timeout limits and behaviors that qualify as expert knowledge under limits-quotas.,updated https://learn.microsoft.com/en-us/azure/firewall/threat-intel,Threat intelligence,Azure Firewall threat intelligence based filtering,Configure Azure Firewall threat intelligence filtering,You can enable Threat intelligence-based filtering for your firewall to alert and deny traffic from/to known malicious IP addresses and domains.,"You can enable Threat intelligence-based filtering for your firewall to alert and deny traffic from/to known malicious IP addresses, FQDNs, and URLs. The IP addresses, domains and URLs are sourced from the Microsoft Threat Intelligence feed, which includes multiple sources including the Microsoft Cyber Security team. When threat intelligence-based filtering is enabled, Azure Firewall evaluates traffic against the threat intelligence rules before applying NAT, network, or application rules. Admin",2025-07-24T22:10:00.000Z,concept-article,security,0.85,True,"Details enabling threat intelligence-based filtering, including behavior (evaluated before NAT/network/app rules) and data sources (Microsoft Threat Intelligence feed), which are product-specific security settings.",unchanged https://learn.microsoft.com/en-us/azure/firewall/tutorial-firewall-deploy-portal,Deploy and configure - classic,Deploy & configure Azure Firewall using the Azure portal,Deploy and configure Azure Firewall in portal,"In this article, you learn how to deploy and configure Azure Firewall using the Azure portal.","Controlling outbound network access is an important part of an overall network security plan. For example, you might want to limit access to web sites. Or, you might want to limit the outbound IP addresses and ports that can be accessed. One way you can control outbound network access from an Azure subnet is with Azure Firewall. 
With Azure Firewall, you can configure: Network traffic is subjected to the configured firewall rules when you route your network traffic to the firewall as the subnet d",2026-02-06T06:10:00.000Z,how-to,security,0.65,True,Portal-based configuration of Azure Firewall rules and routing for outbound control is security configuration specific to this product.,unchanged https://learn.microsoft.com/en-us/azure/firewall/tutorial-firewall-deploy-portal-policy,Portal,Tutorial: Deploy and configure Azure Firewall and policy using the Azure portal,,Deploy and configure Azure Firewall and policy rules by using the Azure portal.,"Controlling outbound network access is an important part of an overall network security plan. For example, you might want to limit access to websites. Or, you might want to limit the outbound IP addresses and ports that can be accessed. You can control outbound network access from an Azure subnet by using Azure Firewall and Firewall Policy. By using Azure Firewall and Firewall Policy, you can configure: Network traffic is subjected to the configured firewall rules when you route your network tra",2026-03-31T06:10:00.000Z,tutorial,,0.4,False,General tutorial on deploying and configuring firewall and policy via portal; primarily step-by-step without detailed config matrices or numeric limits.,unchanged diff --git a/products/azure-firewall/report.md b/products/azure-firewall/report.md index 01c37f56..266f3e89 100644 --- a/products/azure-firewall/report.md +++ b/products/azure-firewall/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: decision-making: Guidance on choosing Azure Firewall SKUs (Basic/Standard/Premium), comparing features and performance, and planning or changing deployments based @@ -11,7 +11,7 @@ category_descriptions: DNS/proxy/FTP, maintenance windows, monitoring/logging, and advanced Premium/PowerShell management. 
limits-quotas: Azure Firewall capacity, IP/port/session limits, SNAT scaling with - NAT Gateway, prescaling ranges, and TCP idle timeout behaviors and configuration. + NAT Gateway, prescaling options, and TCP idle timeout behaviors and configuration. troubleshooting: Diagnosing Azure Firewall issues using known limitations, packet captures, and Sentinel log analysis for malware detection and traffic investigation. best-practices: Best practices for Azure Firewall DNS proxy/caching, performance @@ -28,14 +28,14 @@ category_descriptions: skill_description: Expert knowledge for Azure Firewall development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when - configuring Azure Firewall SKUs, policies/rules, TLS inspection, hub-spoke DNAT, - or SFTP to Storage, and other Azure Firewall related development tasks. Not for - Azure Application Gateway (use azure-application-gateway), Azure Front Door (use - azure-front-door), Azure Web Application Firewall (use azure-web-application-firewall), + selecting Firewall SKUs, designing hub-spoke/forced tunneling, configuring DNAT/SNAT/app + rules, TLS inspection, or DNS proxy, and other Azure Firewall related development + tasks. Not for Azure Application Gateway (use azure-application-gateway), Azure + Front Door (use azure-front-door), Azure Web Application Firewall (use azure-web-application-firewall), Azure DDos Protection (use azure-ddos-protection). -use_when: Use when configuring Azure Firewall SKUs, policies/rules, TLS inspection, - hub-spoke DNAT, or SFTP to Storage, and other Azure Firewall related development - tasks. +use_when: Use when selecting Firewall SKUs, designing hub-spoke/forced tunneling, + configuring DNAT/SNAT/app rules, TLS inspection, or DNS proxy, and other Azure Firewall + related development tasks. 
confusable_not_for: Not for Azure Application Gateway (use azure-application-gateway), Azure Front Door (use azure-front-door), Azure Web Application Firewall (use azure-web-application-firewall), Azure DDos Protection (use azure-ddos-protection). @@ -51,10 +51,10 @@ confusable_not_for: Not for Azure Application Gateway (use azure-application-gat - **Unclassified**: 25 ### Incremental Update -- **New Pages**: 1 +- **New Pages**: 0 - **Updated Pages**: 1 -- **Unchanged**: 83 -- **Deleted Pages**: 1 +- **Unchanged**: 84 +- **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-firewall/azure-firewall.csv` ## Classification Statistics @@ -74,18 +74,10 @@ confusable_not_for: Not for Azure Application Gateway (use azure-application-gat ## Changes -### New Pages - -- [Support and troubleshooting](https://learn.microsoft.com/en-us/azure/firewall/support-help) - ### Updated Pages -- [FAQ](https://learn.microsoft.com/en-us/azure/firewall/firewall-faq) - - Updated: 2026-03-06T12:21:00Z → 2026-04-15T17:15:00Z - -### Deleted Pages - -- ~~Known issues and limitations~~ (https://learn.microsoft.com/en-us/azure/firewall/firewall-known-issues) +- [TCP idle timeout behavior](https://learn.microsoft.com/en-us/azure/firewall/tcp-session-behavior) + - Updated: 2026-03-29T11:12:00.000Z → 2026-04-20T22:11:00.000Z ## Classified Pages @@ -99,6 +91,7 @@ confusable_not_for: Not for Azure Application Gateway (use azure-application-gat | [Packet capture on Azure Firewall](https://learn.microsoft.com/en-us/azure/firewall/packet-capture) | troubleshooting | 0.80 | Packet capture usage is framed for troubleshooting; article covers how to capture and analyze traffic, a product-specific diagnostic workflow. 
| | [SNAT private ranges](https://learn.microsoft.com/en-us/azure/firewall/snat-private-range) | configuration | 0.80 | Explains default SNAT behavior with RFC1918/RFC6598 ranges and how to override it; includes product-specific configuration options and edge-case behavior for SNAT, which is expert configuration knowledge. | | [Secure firewall deployment](https://learn.microsoft.com/en-us/azure/firewall/secure-firewall) | best-practices | 0.80 | Explicitly a best-practices article for securing Azure Firewall, likely including concrete recommendations and configurations for network, data, logging, and threat detection. | +| [TCP idle timeout behavior](https://learn.microsoft.com/en-us/azure/firewall/tcp-session-behavior) | limits-quotas | 0.78 | The page describes Azure Firewall TCP session management with specific idle timeout values and behaviors for long-running sessions. These are product-specific timeout limits and behaviors that qualify as expert knowledge under limits-quotas. | | [Application rules with SQL FQDNs](https://learn.microsoft.com/en-us/azure/firewall/sql-fqdn-filtering) | configuration | 0.75 | Provides product-specific configuration details: SQL FQDN filtering supported only in proxy mode on port 1433, behavior in redirect mode, and handling non-default ports—these are concrete configuration behaviors unique to Azure Firewall. | | [Certificates](https://learn.microsoft.com/en-us/azure/firewall/premium-certificates) | security | 0.75 | Details requirement for valid intermediate CA certificates and use of Azure Key Vault for TLS inspection; these are product-specific security configuration steps. | | [DNS proxy settings](https://learn.microsoft.com/en-us/azure/firewall/dns-settings) | configuration | 0.75 | Describes specific DNS settings for Azure Firewall, including default behavior (Azure DNS, proxy disabled) and configuration options, which are product-specific configuration parameters. 
| @@ -125,7 +118,6 @@ confusable_not_for: Not for Azure Application Gateway (use azure-application-gat | [Prescaling](https://learn.microsoft.com/en-us/azure/firewall/prescaling) | limits-quotas | 0.70 | Prescaling involves setting minimum and maximum capacity units; this feature typically includes numeric ranges and constraints for capacity units, which are limit/quota-like expert details. | | [Protect Azure Kubernetes Service (AKS)](https://learn.microsoft.com/en-us/azure/firewall/protect-azure-kubernetes-service) | security | 0.70 | Shows how to secure AKS inbound/outbound traffic with Azure Firewall, including scenario-specific rule and routing configurations. | | [Routing in hub and spoke](https://learn.microsoft.com/en-us/azure/firewall/firewall-multi-hub-spoke) | architecture-patterns | 0.70 | Covers using Azure Firewall in self-managed multi-hub-and-spoke topologies; this is a concrete Azure networking architecture pattern. | -| [TCP idle timeout behavior](https://learn.microsoft.com/en-us/azure/firewall/tcp-session-behavior) | limits-quotas | 0.70 | Focuses on TCP session management and idle timeout behavior; such articles typically include specific default timeout values and ranges, which are numeric limits unique to the product. | | [Track rule set changes](https://learn.microsoft.com/en-us/azure/firewall/rule-set-change-tracking) | configuration | 0.70 | Explains how to query and analyze rule collection group changes via Azure Resource Graph, a product-specific configuration and auditing pattern. | | [Change Azure Firewall SKU](https://learn.microsoft.com/en-us/azure/firewall/change-sku) | decision-making | 0.68 | The page is focused on how to upgrade/downgrade between Azure Firewall Standard and Premium SKUs, including when you would change (to gain or drop specific security capabilities). This is SKU/feature-based selection and migration guidance rather than just a how-to. 
It provides product-specific guidance on choosing between SKUs and how to move between them, which fits the decision-making category better than generic configuration or deployment. | | [Monitoring Azure Firewall reference](https://learn.microsoft.com/en-us/azure/firewall/monitor-firewall-reference) | configuration | 0.68 | A 'monitoring data reference' article typically enumerates specific log categories, metrics, schema fields, and diagnostic settings for Azure Firewall in Azure Monitor. These are product-specific configuration and reference details (names of tables, fields, categories, and how to enable them), which fits the configuration category as it documents concrete monitoring/diagnostic configuration options. | diff --git a/products/azure-firmware-analysis/azure-firmware-analysis.csv b/products/azure-firmware-analysis/azure-firmware-analysis.csv index 2fa1fd1d..d034fcc4 100644 --- a/products/azure-firmware-analysis/azure-firmware-analysis.csv +++ b/products/azure-firmware-analysis/azure-firmware-analysis.csv @@ -1,7 +1,7 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type https://learn.microsoft.com/en-us/azure/firmware-analysis/automate-firmware-analysis-service-principals,Automate firmware analysis using service principals,Use service principals to automate workflows in firmware analysis,Automate firmware analysis with service principals,Learn about how to use service principals to automate workflows for firmware analysis.,Many users of the firmware analysis service may need to automate their workflow. The command az login creates an interactive login experience with two-factor authentication that makes it difficult for users to fully automate their workflow. A service principal is a secure identity with proper permissions that authenticates to Azure in the command line without requiring two-factor authentication or an interactive log-in.
This article explains how to create a service principal and use it to interact w,2025-09-12T08:00:00.000Z,concept-article,security,0.7,True,"Describes creating and using service principals with appropriate permissions for firmware analysis automation, including auth configuration details, fitting the security category.",unchanged https://learn.microsoft.com/en-us/azure/firmware-analysis/firmware-analysis-faq,FAQ,Frequently asked questions about firmware analysis,Resolve common issues in Azure firmware analysis,Find answers to some of the common questions about firmware analysis.,This article addresses frequent questions about firmware analysis. Firmware analysis is a tool that analyzes firmware images and provides an understanding of security vulnerabilities in the firmware images.,2026-03-31T22:19:00.000Z,faq,troubleshooting,0.65,True,"FAQ pages for a niche service typically include product-specific behaviors, limitations, and answers to concrete operational questions (for example, what certain results mean, supported formats, or how long analysis takes). These are troubleshooting-style symptom→explanation mappings that go beyond generic concepts, so they qualify as expert knowledge even though the summary doesn’t list specific error codes.",unchanged -https://learn.microsoft.com/en-us/azure/firmware-analysis/firmware-analysis-integration-with-azure-device-registry,Firmware analysis integration with Azure Device Registry,Firmware analysis integration with Azure Device Registry,,Learn about how firmware analysis results are mapped to deployed devices and assets in Azure Device Registry.,"Azure Device Registry maintains an inventory of two types of resources: Assets and Devices. Firmware images will be mapped to both types of Azure Device Registry resources. Firmware analysis and Azure Device Registry operate as complementary Azure services.
Firmware analysis evaluates the security of firmware images, while Azure Device Registry tracks deployed devices and assets and their associated metadata. To learn more about Azure Device Registry, visit Integration with Azure Device Registry ",2026-04-14T22:21:00.000Z,conceptual,,0.3,False,"Describes conceptual integration between Firmware analysis and Azure Device Registry; summary does not indicate specific configuration parameters, code, limits, or troubleshooting details required for any sub-skill type.",new +https://learn.microsoft.com/en-us/azure/firmware-analysis/firmware-analysis-integration-with-azure-device-registry,Firmware analysis integration with Azure Device Registry,Firmware analysis integration with Azure Device Registry,,Learn about how firmware analysis results are mapped to deployed devices and assets in Azure Device Registry.,"Azure Device Registry maintains an inventory of two types of resources: Assets and Devices. Firmware images will be mapped to both types of Azure Device Registry resources. Firmware analysis and Azure Device Registry operate as complementary Azure services. Firmware analysis evaluates the security of firmware images, while Azure Device Registry tracks deployed devices and assets and their associated metadata.
To learn more about Azure Device Registry, visit Integration with Azure Device Registry ",2026-04-14T22:21:00.000Z,conceptual,,0.3,False,"Describes conceptual integration between Firmware analysis and Azure Device Registry; summary does not indicate specific configuration parameters, code, limits, or troubleshooting details required for any sub-skill type.",unchanged https://learn.microsoft.com/en-us/azure/firmware-analysis/firmware-analysis-rbac,Firmware analysis role-based access control,Azure Role-Based Access Control for firmware analysis,Configure RBAC access for Azure firmware analysis,Learn about how to use Azure Role-Based Access Control for firmware analysis.,"As a user of firmware analysis, you may want to manage access to your firmware image analysis results. Azure Role-Based Access Control (RBAC) is an authorization system that enables you to control who has access to your analysis results, what permissions they have, and at what level of the resource hierarchy. This article explains how to store firmware analysis results in Azure, manage access permissions, and use RBAC to share these results within your organization and with third parties. To lea",2025-11-19T18:10:00.000Z,concept-article,security,0.8,True,"RBAC article will list specific roles, scopes, and permission mappings for firmware analysis resources, matching the security sub-skill criteria.",unchanged https://learn.microsoft.com/en-us/azure/firmware-analysis/interpreting-extractor-paths,Interpreting extractor paths from analysis results,Interpreting extractor paths from SBOM view in firmware analysis,Interpret SBOM extractor paths in firmware analysis,Learn how to interpret extractor paths from the SBOM view in firmware analysis results.,"A firmware image is a collection of files and file systems containing software that operates hardware. Often, it includes compressed files, executables, and system files. These file systems may or may not include other file systems within each file.
For example, a firmware image that’s a .zip file may include individual files such as executables within it but may also include other compressed file systems, such as a SquashFS file. You can visualize it like the following: Each circle represents a",2026-03-31T22:19:00.000Z,conceptual,best-practices,0.7,True,"The page teaches how to interpret extractor paths in the SBOM view for nested file systems and compressed images. This is detailed, product-specific guidance on reading this tool’s output correctly (how paths map to embedded file systems and components), which is actionable usage advice and thus fits best-practices.",unchanged https://learn.microsoft.com/en-us/azure/firmware-analysis/overview-firmware-analysis,Overview,Firmware analysis overview,,"Learn how firmware analysis helps device builders and operators to evaluate the security of IoT, OT and network devices.","Just like computers have operating systems, IoT devices have firmware, and it's the firmware that runs and controls IoT devices. For IoT device builders, security is a near-universal concern as IoT devices have traditionally lacked basic security measures. For example, IoT attack vectors typically use easily exploitable--but easily correctable--weaknesses such as hardcoded user accounts, outdated and vulnerable open-source packages, or a manufacturer's private cryptographic signing key. 
Use the ",2025-09-26T05:10:00.000Z,conceptual,,0.2,False,"High-level overview of firmware analysis and IoT security concerns without product-specific limits, configs, or detailed procedures.",unchanged @@ -12,7 +12,7 @@ https://learn.microsoft.com/en-us/azure/firmware-analysis/quickstart-upload-firm https://learn.microsoft.com/en-us/azure/firmware-analysis/quickstart-upload-firmware-using-powershell,Analyze firmware images using Azure PowerShell,Quickstart: Upload firmware images to firmware analysis using Azure PowerShell,Upload firmware to Azure analysis with PowerShell,Learn how to upload firmware images for analysis using the Azure PowerShell.,This article explains how to use Azure PowerShell to upload firmware images to firmware analysis. Firmware analysis is a tool that analyzes firmware images and provides an understanding of security vulnerabilities in the firmware images.,2025-09-26T05:10:00.000Z,quickstart,integrations,0.65,True,"PowerShell quickstart will contain cmdlet names and parameter sets specific to firmware analysis, which are concrete integration details beyond generic SDK usage.",unchanged https://learn.microsoft.com/en-us/azure/firmware-analysis/quickstart-upload-firmware-using-python,Analyze firmware images using a Python script,Quickstart: Upload firmware images to firmware analysis using Python,Upload firmware to Azure analysis using Python,Learn how to upload firmware images for analysis using a Python script.,This article explains how to use a Python script to upload firmware images to firmware analysis.
Firmware analysis is a tool that analyzes firmware images and provides an understanding of security vulnerabilities in the firmware images.,2026-01-13T06:22:00.000Z,quickstart,integrations,0.7,True,"Python quickstart necessarily documents API endpoints, request payloads, and parameter usage specific to the firmware analysis service, fitting integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/firmware-analysis/release-notes,What's new?,What's new in firmware analysis,,Learn about the latest updates for firmware analysis.,"This article lists new features and feature enhancements in the firmware analysis service. -Get notified about when to revisit this page for updates by copying and pasting this URL into your RSS feed reader: https://learn.microsoft.com/api/search/rss?search=%22What%27s+new+in+firmware+analysis%22&locale=en-us",2026-04-14T22:21:00.000Z,conceptual,,0.2,False,"Release notes listing new features and enhancements; no clear indication of detailed limits, configuration tables, error codes, or other structured expert data per the defined categories.",updated +Get notified about when to revisit this page for updates by copying and pasting this URL into your RSS feed reader: https://learn.microsoft.com/api/search/rss?search=%22What%27s+new+in+firmware+analysis%22&locale=en-us",2026-04-14T22:21:00.000Z,conceptual,,0.2,False,"Release notes listing new features and enhancements; no clear indication of detailed limits, configuration tables, error codes, or other structured expert data per the defined categories.",unchanged https://learn.microsoft.com/en-us/azure/firmware-analysis/tutorial-analyze-firmware,Tutorial using firmware analysis with the Azure portal,Analyze a firmware image with the firmware analysis service.,Analyze firmware images with Azure firmware analysis,Learn to analyze a compiled firmware image using firmware analysis.,This tutorial describes how to use the firmware analysis page to upload a firmware image for
security analysis and view analysis results.,2025-09-26T05:10:00.000Z,tutorial,best-practices,0.6,True,"Tutorial on analyzing firmware images is likely to include product-specific guidance on interpreting results and handling edge cases, which are actionable best practices for this service.",unchanged https://learn.microsoft.com/en-us/azure/firmware-analysis/understand-weaknesses-data,Understanding and prioritizing weaknesses data in firmware analysis,Understanding and prioritizing weaknesses data in Firmware analysis,Prioritize firmware weaknesses using analysis results,Learn what the weaknesses data are in the CVE view of the firmware analysis results.,"Firmware analysis surfaces weaknesses detected in firmware components extracted during analysis. These signals help you understand potential security risks, but they should be interpreted carefully and in context. This article explains weakness-related fields you may see in firmware analysis results, how they relate to one another, and how to evaluate them together to prioritize risk effectively. Note The presence of a weakness or CVE in firmware analysis doesn't necessarily mean a device is vul",2026-03-31T22:19:00.000Z,conceptual,best-practices,0.7,True,"The page explains how to interpret multiple weakness-related fields in firmware analysis results and how to evaluate them together to prioritize risk. 
That is product-specific guidance on how to use this service’s data correctly (how to read CVE view, how to weigh signals), which fits best-practices rather than generic security theory.",unchanged diff --git a/products/azure-firmware-analysis/report.md b/products/azure-firmware-analysis/report.md index a9241a45..c7a7f7cf 100644 --- a/products/azure-firmware-analysis/report.md +++ b/products/azure-firmware-analysis/report.md @@ -43,9 +43,9 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot), - **Unclassified**: 3 ### Incremental Update -- **New Pages**: 1 -- **Updated Pages**: 1 -- **Unchanged**: 14 +- **New Pages**: 0 +- **Updated Pages**: 0 +- **Unchanged**: 16 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-firmware-analysis/azure-firmware-analysis.csv` @@ -63,15 +63,6 @@ confusable_not_for: Not for Azure Defender For Iot (use azure-defender-for-iot), ## Changes -### New Pages - -- [Firmware analysis integration with Azure Device Registry](https://learn.microsoft.com/en-us/azure/firmware-analysis/firmware-analysis-integration-with-azure-device-registry) - -### Updated Pages - -- [What's new?](https://learn.microsoft.com/en-us/azure/firmware-analysis/release-notes) - - Updated: 2026-03-31T22:19:00.000Z → 2026-04-14T22:21:00.000Z - ## Classified Pages | TOC Title | Type | Confidence | Reason | diff --git a/products/azure-front-door/azure-front-door.csv b/products/azure-front-door/azure-front-door.csv index e215285b..eb0b9155 100644 --- a/products/azure-front-door/azure-front-door.csv +++ b/products/azure-front-door/azure-front-door.csv @@ -23,7 +23,7 @@ https://learn.microsoft.com/en-us/azure/frontdoor/front-door-cdn-comparison,Fron https://learn.microsoft.com/en-us/azure/frontdoor/front-door-custom-domain,Azure portal,Add a Custom Domain to Azure Front Door,,"In this article, you learn how to onboard a custom domain to Azure Front Door.","Applies to:✔️ Front Door (classic) Important This 
article shows how to add a custom domain to your Front Door. When you use Azure Front Door for application delivery, a custom domain is necessary if you want your own domain name to be visible in your end-user request. Having a visible domain name can be convenient for your customers and useful for branding purposes. After you create a Front Door profile, the default frontend host is a subdomain of azurefd.net. This name is included in the URL for",2025-09-25T08:00:00.000Z,how-to,,0.4,False,How-to for adding a custom domain; likely step-by-step UI instructions rather than a comprehensive configuration parameter reference or limits table.,unchanged https://learn.microsoft.com/en-us/azure/frontdoor/front-door-custom-domain-https,Configure HTTPS on a custom domain,Configure HTTPS on Front Door (classic) custom domain,Configure HTTPS and certificates for Front Door custom domains,Learn how to enable and disable HTTPS on your Azure Front Door (classic) configuration for a custom domain.,"Applies to:✔️ Front Door (classic) Important This article explains how to enable HTTPS for a custom domain associated with your Front Door (classic). Using HTTPS on your custom domain (for example, https://www.contoso.com) ensures secure data transmission via TLS/SSL encryption. When a web browser connects to a website using HTTPS, it validates the website's security certificate and verifies its legitimacy, providing security and protecting your web applications from malicious attacks.
Azure Fron",2025-08-07T17:10:00.000Z,how-to,security,0.75,True,"HTTPS configuration for custom domains typically details certificate options, validation methods, and security-related settings specific to Front Door, including parameter names and modes.",unchanged https://learn.microsoft.com/en-us/azure/frontdoor/front-door-ddos,DDoS protection,DDoS protection on Azure Front Door,Understand DDoS protection with Azure Front Door,"Learn how Azure Front Door provides robust protection against DDoS attacks, ensuring the security and performance of your web applications.","Azure Front Door is a Content Delivery Network (CDN) that helps protect your origins from HTTP(S) DDoS attacks by distributing traffic across its 192 edge Points of Presence (POPs) worldwide. These POPs use Azure's large private WAN to deliver your web applications and services faster and more securely to your end users. Azure Front Door includes layer 3, 4, and 7 DDoS protection and a Web Application Firewall (WAF) to safeguard your applications from common exploits and vulnerabilities.",2024-11-19T18:02:00.000Z,concept-article,security,0.61,True,"Explains concrete DDoS protection capabilities and architecture (layers, POP distribution) specific to Front Door; security-focused expert content.",unchanged -https://learn.microsoft.com/en-us/azure/frontdoor/front-door-faq,FAQ,Azure Front Door frequently asked questions (FAQ),Azure Front Door FAQs on limits and behavior,This page provides answers to frequently asked questions about Azure Front Door.,"This article provides answers to the most frequently asked questions about Azure Front Door features and functionality. If you don't see the answer to your question, you can contact us through the following channels (in escalating order): The feedback section of this article. Azure Front Door Feedback. 
Microsoft Support: To create a new support request, in the Azure portal, on the Help tab, select the Help + support button, and then select New support request.",2026-04-14T22:21:00Z,faq,limits-quotas,0.7,True,"FAQ pages for Azure networking services typically include concrete, product-specific details such as maximum endpoints, routing rules, WAF limits, propagation times, and other numeric constraints and behaviors that are not obvious from general training data. These align with the limits-quotas category because they expose exact values and service behaviors that guide real-world usage.",updated +https://learn.microsoft.com/en-us/azure/frontdoor/front-door-faq,FAQ,Azure Front Door frequently asked questions (FAQ),Azure Front Door FAQs on limits and behavior,This page provides answers to frequently asked questions about Azure Front Door.,"This article provides answers to the most frequently asked questions about Azure Front Door features and functionality. If you don't see the answer to your question, you can contact us through the following channels (in escalating order): The feedback section of this article. Azure Front Door Feedback. Microsoft Support: To create a new support request, in the Azure portal, on the Help tab, select the Help + support button, and then select New support request.",2026-04-14T22:21:00Z,faq,limits-quotas,0.7,True,"FAQ pages for Azure networking services typically include concrete, product-specific details such as maximum endpoints, routing rules, WAF limits, propagation times, and other numeric constraints and behaviors that are not obvious from general training data.
These align with the limits-quotas category because they expose exact values and service behaviors that guide real-world usage.",unchanged https://learn.microsoft.com/en-us/azure/frontdoor/front-door-how-to-onboard-apex-domain,Add a root or apex domain,Onboard a root or apex domain to Azure Front Door,Onboard root or apex domains to Azure Front Door,Learn how to onboard a root or apex domain to an existing Azure Front Door by using the Azure portal.,"Important Azure Front Door (classic) will be retired on March 31, 2027. To avoid any service disruption, it's important that you migrate your Azure Front Door (classic) profiles to Azure Front Door Standard or Premium tier by March 2027. For more information, see Azure Front Door (classic) retirement. Azure Front Door uses CNAME records to validate domain ownership for the onboarding of custom domains. Azure Front Door doesn't expose the front-end IP address associated with your Azure Front Door pro",2025-04-30T08:00:00.000Z,how-to,configuration,0.6,True,Detailed onboarding process using CNAME validation and domain ownership; includes specific DNS record requirements and constraints.,unchanged https://learn.microsoft.com/en-us/azure/frontdoor/front-door-how-to-redirect-https,Configure HTTP to HTTPS redirect,Configure HTTP to HTTPS redirection using the Azure portal - Azure Front Door,,This article shows you how to redirect traffic from HTTP to HTTPS for an Azure Front Door (classic) profile using the Azure portal.,"Important Azure Front Door (classic) will be retired on March 31, 2027. To avoid any service disruption, it's important that you migrate your Azure Front Door (classic) profiles to Azure Front Door Standard or Premium tier by March 2027. For more information, see Azure Front Door (classic) retirement. This guide explains how to redirect traffic from HTTP to HTTPS for an Azure Front Door (classic) profile using the Azure portal.
This setup ensures that all traffic to your domain is securely redirecte",2024-11-19T18:02:00.000Z,how-to,,0.4,False,Step-by-step guide to set up HTTP→HTTPS redirection; mostly UI workflow rather than detailed configuration matrices or limits.,unchanged https://learn.microsoft.com/en-us/azure/frontdoor/front-door-http-headers-protocol,HTTP headers protocol support,Protocol support for HTTP headers in Azure Front Door,Understand HTTP header protocol support in Azure Front Door,This article describes HTTP header protocols that Front Door supports.,"This article outlines the protocol that Front Door supports with parts of the call path (see image). In the following sections, you find information about HTTP headers supported by Front Door. Important Azure Front Door doesn't certify any HTTP headers that aren't documented in this article.",2025-04-30T08:00:00.000Z,how-to,configuration,0.85,True,Reference for which HTTP headers are supported on which segments of the call path; this is a product-specific compatibility matrix of header behavior that LLMs cannot infer generically.,unchanged @@ -99,4 +99,5 @@ https://learn.microsoft.com/en-us/azure/frontdoor/tier-migration,Classic to Stan https://learn.microsoft.com/en-us/azure/frontdoor/tier-upgrade,Upgrade from Standard to Premium tier - Portal,Upgrade from Azure Front Door Standard to Premium,Upgrade Front Door Standard to Premium tier,This article shows you how to upgrade from an Azure Front Door Standard to an Azure Front Door Premium profile.,"Applies to:✔️ Front Door Standard Upgrade your Azure Front Door Standard to Premium for advanced capabilities and increased quota limits without any downtime. For a detailed comparison, see Tier comparison. This guide explains how to upgrade your Azure Front Door Standard profile. After upgrading, you'll be billed for the Azure Front Door Premium monthly base fee at an hourly rate.
Important Downgrading from Premium to Standard isn't supported.",2024-11-19T18:02:00.000Z,concept-article,decision-making,0.64,True,"Describes upgrade path, billing behavior, and irreversible nature of downgrade; informs SKU/tier selection and upgrade decisions.",unchanged https://learn.microsoft.com/en-us/azure/frontdoor/tier-upgrade-powershell,Upgrade from Standard to Premium tier - PowerShell,Upgrade from Azure Front Door Standard to Premium with Azure PowerShell,Upgrade Front Door Standard to Premium via PowerShell,This article shows you how to upgrade from an Azure Front Door Standard to an Azure Front Door Premium profile with Azure PowerShell.,"Applies to:✔️ Front Door Standard Azure Front Door allows upgrading from Standard to Premium for enhanced capabilities and increased quota limits. This upgrade doesn't cause any downtime to your services or applications. For more information on the differences between Standard and Premium, see Tier comparison. This guide explains how to upgrade an Azure Front Door Standard profile to Premium. After the upgrade, you'll be charged the Azure Front Door Premium monthly base fee at an hourly rate. Imp",2025-05-13T05:03:00.000Z,how-to,deployment,0.64,True,Provides PowerShell commands and process for upgrading tiers; deployment-specific expert instructions.,unchanged https://learn.microsoft.com/en-us/azure/frontdoor/understanding-pricing,Price comparison between tiers,Compare Pricing Between Azure Front Door Tiers,"Compare Azure Front Door Standard, Premium, and Classic pricing","Learn about Azure Front Door billing models and compare pricing across Standard, Premium, and Classic tiers.","Applies to:✔️ Front Door Standard ✔️ Front Door Premium ✔️ Front Door (classic) Note Prices shown in this article are examples and are for illustration purposes only. For pricing information according to your region, see the Pricing page. Azure Front Door has three tiers: Standard, Premium, and (classic).
This article describes the billing model for Azure Front Door and compares the pricing for the Standard, Premium, and (classic) tiers. When migrating from Azure Front Door (classic) to Standard o",2025-09-25T08:00:00.000Z,concept-article,decision-making,0.85,True,"Explicitly compares pricing across tiers; such pages typically include comparison tables, per-unit prices, and guidance for migration between tiers, directly supporting SKU/tier selection decisions with quantified trade-offs.",unchanged +https://learn.microsoft.com/en-us/azure/frontdoor/video-on-demand-live-streaming,Video on-demand and live streaming,Delivering Video On-Demand (VOD) and Live Streaming - Azure Front Door,,Improve live and VOD delivery with Microsoft’s global edge network. Get configuration guidance and performance results.,"Azure Front Door's architecture is designed to optimize global content delivery by leveraging Microsoft's private backbone network together with a large global-edge footprint. Azure Front Door minimizes latency and avoids congested public internet paths by routing traffic across Microsoft's private network between distributed edge locations, providing highly resilient delivery infrastructure. Combined with Microsoft's extensive peering relationships and redundant network paths, this architecture",2026-04-22T11:13:00.000Z,concept-article,,0.2,False,"Appears to be an architectural/performance overview of using Azure Front Door for VOD and live streaming. The summary indicates high-level architecture and benefits but does not show concrete limits, configuration parameter tables, error codes, or decision matrices with quantified thresholds. 
Without evidence of specific numeric limits, config values, or troubleshooting mappings, it does not meet the expert-knowledge criteria for any sub-skill type.",new https://learn.microsoft.com/en-us/azure/frontdoor/web-application-firewall,Web Application Firewall (WAF) on Front Door,Web Application Firewall (WAF) on Azure Front Door,Features of WAF on Azure Front Door,This article provides a list of the various features available with Web Application Firewall (WAF) on Azure Front Door.,"Azure Web Application Firewall (WAF) on Azure Front Door offers centralized protection for your web applications. It safeguards against common exploits and vulnerabilities, ensuring high availability and compliance. This article outlines the features of Azure Web Application Firewall on Azure Front Door. For more information, see WAF on Azure Front Door.",2024-11-19T18:02:00.000Z,concept-article,security,0.62,True,"Lists WAF capabilities and modes specific to Front Door; while high-level, it enumerates product-specific security features.",unchanged diff --git a/products/azure-front-door/report.md b/products/azure-front-door/report.md index 18c565e5..c8539427 100644 --- a/products/azure-front-door/report.md +++ b/products/azure-front-door/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: architecture-patterns: 'Architectural patterns for Azure Front Door: apex domain setup, blue/green deployments, manual failover with Traffic Manager, static blob @@ -47,16 +47,16 @@ confusable_not_for: Not for Azure Application Gateway (use azure-application-gat ## Summary -- **Total Pages**: 101 -- **Fetched**: 101 +- **Total Pages**: 102 +- **Fetched**: 102 - **Fetch Failed**: 0 - **Classified**: 77 -- **Unclassified**: 24 +- **Unclassified**: 25 ### Incremental Update -- **New Pages**: 0 -- **Updated Pages**: 1 -- **Unchanged**: 100 +- **New Pages**: 1 +- **Updated Pages**: 0 +- **Unchanged**: 101 - **Deleted Pages**: 0 - **Compared 
With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-front-door/azure-front-door.csv` @@ -64,23 +64,22 @@ confusable_not_for: Not for Azure Application Gateway (use azure-application-gat | Type | Count | Percentage | |------|-------|------------| -| architecture-patterns | 5 | 5.0% | +| architecture-patterns | 5 | 4.9% | | best-practices | 2 | 2.0% | -| configuration | 28 | 27.7% | -| decision-making | 8 | 7.9% | -| deployment | 9 | 8.9% | +| configuration | 28 | 27.5% | +| decision-making | 8 | 7.8% | +| deployment | 9 | 8.8% | | integrations | 2 | 2.0% | -| limits-quotas | 5 | 5.0% | -| security | 17 | 16.8% | +| limits-quotas | 5 | 4.9% | +| security | 17 | 16.7% | | troubleshooting | 1 | 1.0% | -| *(Unclassified)* | 24 | 23.8% | +| *(Unclassified)* | 25 | 24.5% | ## Changes -### Updated Pages +### New Pages -- [FAQ](https://learn.microsoft.com/en-us/azure/frontdoor/front-door-faq) - - Updated: 2026-04-08T17:12:00Z → 2026-04-14T22:21:00Z +- [Video on-demand and live streaming](https://learn.microsoft.com/en-us/azure/frontdoor/video-on-demand-live-streaming) ## Classified Pages @@ -188,6 +187,7 @@ confusable_not_for: Not for Azure Application Gateway (use azure-application-gat | [Create a Front Door - Portal](https://learn.microsoft.com/en-us/azure/frontdoor/create-front-door-portal) | 0.20 | Quickstart using portal; step-by-step creation without broad config reference tables or limits. | | [Create a Front Door - PowerShell](https://learn.microsoft.com/en-us/azure/frontdoor/create-front-door-powershell) | 0.20 | Quickstart tutorial for creating a Front Door profile with PowerShell; primarily step-by-step instructions without configuration matrices, limits, or troubleshooting mappings. 
| | [Front Door (classic) retirement FAQ](https://learn.microsoft.com/en-us/azure/frontdoor/classic-retirement-faq) | 0.20 | Retirement FAQ content is primarily conceptual and procedural (dates, migration guidance, feature overview) without detailed limits, configuration tables, error-code-based troubleshooting, or quantified decision matrices that meet the expert-knowledge criteria. | +| [Video on-demand and live streaming](https://learn.microsoft.com/en-us/azure/frontdoor/video-on-demand-live-streaming) | 0.20 | Appears to be an architectural/performance overview of using Azure Front Door for VOD and live streaming. The summary indicates high-level architecture and benefits but does not show concrete limits, configuration parameter tables, error codes, or decision matrices with quantified thresholds. Without evidence of specific numeric limits, config values, or troubleshooting mappings, it does not meet the expert-knowledge criteria for any sub-skill type. | | [What is Azure Front Door (classic)?](https://learn.microsoft.com/en-us/azure/frontdoor/classic-overview) | 0.20 | An overview page; primarily conceptual description of the service and its capabilities without detailed numeric limits, configuration tables, or troubleshooting mappings. | | [Accelerate and secure your web application](https://learn.microsoft.com/en-us/azure/frontdoor/scenarios) | 0.10 | High-level scenario/overview of Azure Front Door capabilities and when to consider using it; no specific limits, configuration parameters, error codes, or decision matrices with quantified trade-offs. | | [Support and troubleshooting](https://learn.microsoft.com/en-us/azure/frontdoor/support-help) | 0.10 | Page is a general support/help landing page pointing to forums and support channels, without product-specific error codes, diagnostic steps, configuration parameters, or limits. It does not meet the troubleshooting or other expert-knowledge criteria. 
| diff --git a/products/azure-functions/azure-functions.csv b/products/azure-functions/azure-functions.csv index 3b03648d..cfbe7fa2 100644 --- a/products/azure-functions/azure-functions.csv +++ b/products/azure-functions/azure-functions.csv @@ -6,7 +6,7 @@ https://learn.microsoft.com/en-us/azure/azure-functions/concept-file-access-opti https://learn.microsoft.com/en-us/azure/azure-functions/configure-encrypt-at-rest-using-cmk,Encrypt site data,Encrypt your application source at rest,Encrypt Azure Functions application source at rest,Encrypt your application data in Azure Storage and deploy it as a package file.,Encrypting your function app's application data at rest requires an Azure Storage Account and an Azure Key Vault. These services are used when you run your app from a deployment package.,2022-03-24T11:04:00.000Z,article,security,0.7,True,"Describes using Azure Storage and Key Vault to encrypt application data when running from a deployment package, which is a product-specific security configuration scenario.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/configure-monitoring,Configure monitoring,Configure monitoring for Azure Functions,Configure Application Insights monitoring for Azure Functions,Learn how to connect your function app to Application Insights for monitoring and how to configure data collection.,"Azure Functions integrates with Application Insights to better enable you to monitor your function apps. Application Insights, a feature of Azure Monitor, is an extensible Application Performance Management (APM) service that collects data generated by your function app, including information your app writes to logs. Application Insights integration is typically enabled when your function app is created. 
If your app doesn't have the instrumentation key set, you must first enable Application Insig",2025-05-19T08:00:00.000Z,how-to,configuration,0.75,True,"Shows how to connect a function app to Application Insights and configure data collection, including instrumentation key settings and related parameters, which are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/configure-networking-how-to,Networking,How to use a secured storage account with Azure Functions,Use secured storage accounts with Azure Functions,Learn how to use a secured storage account in a virtual network as the default storage account for a function app in Azure Functions.,"Azure Functions requires an Azure Storage account when you create a function app instance. This default storage account is used by the Functions runtime to maintain the health of your function app. For more information, see Storage considerations for Azure Functions. This article shows you how to use a secured storage account as the default storage account. For an in-depth tutorial on how to create your function app with inbound and outbound access restrictions, see the Integrate with a virtual ne",2025-05-12T22:03:00.000Z,how-to,security,0.75,True,"Details how to configure a function app to use a secured storage account in a virtual network as its default storage, involving product-specific security and networking settings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-functions/consumption-plan,Consumption plan (legacy),Azure Functions Consumption plan hosting (legacy),Plan migration from legacy Azure Functions Consumption plan,"Learn about Azure Functions Consumption plan hosting, a legacy serverless hosting option. We recommend the Flex Consumption plan for new serverless function apps.","Important Function apps still running the end-of-life v3 runtime on Linux in a Consumption plan stop running after September 30, 2026. 
To avoid service disruption, migrate your app to the v4 runtime. The option to host function apps on Linux in a Consumption plan is retiring on 30 September 2028. The Linux Consumption plan isn't getting any new features or language versions. Apps running on Windows in a Consumption plan aren't currently affected. Migrate your apps to the Flex Consumption plan before t",2026-04-17T22:08:00.000Z,concept-article,decision-making,0.65,True,"Includes specific end-of-life dates, platform-specific retirement details, and explicit recommendations to migrate to Flex Consumption, providing concrete guidance for hosting-plan and migration decisions.",updated +https://learn.microsoft.com/en-us/azure/azure-functions/consumption-plan,Consumption plan (legacy),Azure Functions Consumption plan hosting (legacy),Plan migration from legacy Azure Functions Consumption plan,"Learn about Azure Functions Consumption plan hosting, a legacy serverless hosting option. We recommend the Flex Consumption plan for new serverless function apps.","Important Function apps still running the end-of-life v3 runtime on Linux in a Consumption plan stop running after September 30, 2026. To avoid service disruption, migrate your app to the v4 runtime. The option to host function apps on Linux in a Consumption plan is retiring on 30 September 2028. The Linux Consumption plan isn't getting any new features or language versions. 
Apps running on Windows in a Consumption plan aren't currently affected. Migrate your apps to the Flex Consumption plan before t",2026-04-17T22:08:00.000Z,concept-article,decision-making,0.65,True,"Includes specific end-of-life dates, platform-specific retirement details, and explicit recommendations to migrate to Flex Consumption, providing concrete guidance for hosting-plan and migration decisions.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/container-concepts,Containerized functions,Linux container support in Azure Functions,Run Azure Functions in Linux containers,Describes the options for and benefits of running your function code in Linux containers in Azure.,"When you plan and develop your individual functions to run in Azure Functions, you're typically focused on the code itself. Azure Functions makes it easy to deploy just your code project to a function app in Azure. When you deploy your project to a Linux function app, your code runs in a container that is created for you automatically and seamlessly integrates with Functions management tools. Functions also supports containerized function app deployments. In a containerized deployment, you creat",2025-07-30T22:07:00.000Z,concept-article,architecture-patterns,0.65,True,Describes container-based deployment options and patterns specific to Azure Functions (built-in vs custom containers) and when to use each; product-specific architecture guidance.,unchanged https://learn.microsoft.com/en-us/azure/azure-functions/create-first-function-azure-developer-cli,Scalable web API,Build a scalable web API using Azure Functions,,Learn how to use the Azure Developer CLI (azd) to create resources and deploy a scalable web API project to a Flex Consumption plan on Azure.,"In this quickstart, you use Azure Developer command-line tools to build a scalable web API with function endpoints that respond to HTTP requests. 
After testing the code locally, you deploy it to a new serverless function app you create running in a Flex Consumption plan in Azure Functions. The project source uses the Azure Developer CLI (azd) to simplify deploying your code to Azure. This deployment follows current best practices for secure and scalable Azure Functions deployments. By default, t",2025-12-03T23:21:00.000Z,quickstart,,0.4,False,Quickstart for building a web API with azd; mostly step-by-step tutorial without detailed config matrices or limits.,unchanged https://learn.microsoft.com/en-us/azure/azure-functions/create-resources-azure-powershell,Azure PowerShell,Create function app resources in Azure using PowerShell,Provision Azure Functions hosting resources with PowerShell,Azure PowerShell scripts that show you how to create the Azure resources required to host your functions code in Azure.,"The Azure PowerShell example scripts in this article create function apps and other resources required to host your functions in Azure. A function app provides an execution context in which your functions are executed. All functions running in a function app share the same resources and connections, and they're all scaled together. After the resources are created, you can deploy your project files to the new function app. To learn more, see Deployment methods. 
Every function app requires your Pow",2023-05-02T08:00:00.000Z,sample,deployment,0.7,True,PowerShell scripts for creating function apps and related resources; includes concrete deployment resource configuration for Functions.,unchanged @@ -17,7 +17,7 @@ https://learn.microsoft.com/en-us/azure/azure-functions/disable-function,Disable https://learn.microsoft.com/en-us/azure/azure-functions/dotnet-aspire-integration,Aspire integration,Guide for Using Azure Functions with Aspire,Integrate Azure Functions with .NET Aspire applications,"Learn how to use Azure Functions with Aspire, which simplifies authoring of distributed applications in the cloud.","Aspire is an opinionated stack that simplifies development of distributed applications in the cloud. The integration of Aspire with Azure Functions enables you to develop, debug, and orchestrate an Azure Functions .NET project as part of the Aspire app host.",2025-11-10T23:18:00.000Z,conceptual,integrations,0.65,True,"Describes concrete integration patterns, configuration, and orchestration between Aspire app host and Functions, including specific APIs and settings unique to this integration.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/dotnet-isolated-in-process-differences,Execution mode comparison,Differences between in-process and isolated worker process .NET Azure Functions,Compare in-process vs isolated .NET Azure Functions models,Compares features and functionality differences between running .NET Functions in-process or as an isolated worker process.,"There are two execution models for .NET functions: Important Support will end for the in-process model on November 10, 2026. We highly recommend that you migrate your apps to the isolated worker model for full support. This article describes the current state of the functional and behavioral differences between the two models. 
To migrate from the in-process model to the isolated worker model, see Migrate .NET apps from the in-process model to the isolated worker model.",2026-01-22T23:18:00.000Z,conceptual,decision-making,0.7,True,"Provides feature comparison and behavioral differences, including support end date and trade-offs, to decide which execution model to use and when to migrate.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/dotnet-isolated-process-guide,Isolated worker model,Guide for running C# Azure Functions in an isolated worker process,,"Learn how to use the .NET isolated worker model to run your C# functions in Azure, which lets you run your functions on currently supported versions of .NET and .NET Framework.","This article introduces working with Azure Functions in .NET using the isolated worker model. This model lets your project target versions of .NET independently of other runtime components. For information about specific .NET versions supported, see supported version. Use the following links to get started right away building .NET isolated worker model functions. To learn about deploying an isolated worker model project to Azure, see Deploy to Azure Functions.",2026-02-24T08:00:00.000Z,how-to,,0.2,False,"Primarily an introduction to the .NET isolated worker model for Azure Functions and links to other content. The summary indicates conceptual overview and getting-started guidance, without mention of specific configuration tables, limits, error codes, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/azure-functions/durable-functions/durable-functions-overview,Durable Functions,Durable Functions overview - Azure Durable,,"Learn how Durable Functions extends Azure Functions to build reliable, stateful workflows in a serverless environment.","Durable Functions is an extension of Azure Functions that lets you build stateful workflows in a serverless environment by writing orchestrator, activity, and entity functions in code. 
The Durable Functions runtime manages state, checkpoints, retries, and recovery so your workflows can run reliably for long periods. Tip Not sure whether to use Durable Functions or the standalone Durable Task SDKs? See Choose your hosting model.",2026-04-03T21:52:00.000Z,get-started,,0.2,False,"High-level overview of Durable Functions concepts (orchestrator, activity, entity functions) without product-specific limits, configuration tables, error codes, or decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/azure-functions/durable-functions/durable-functions-overview,Durable Functions,Durable Functions Overview: Stateful Serverless Workflows - Azure Durable,,"Learn how Durable Functions extends Azure Functions to build reliable, stateful workflows in a serverless environment. Get started with supported languages, storage providers, and quickstarts.","Durable Functions is an extension of Azure Functions that lets you build stateful workflows in a serverless environment by writing orchestrator, activity, and entity functions in code. The Durable Functions runtime manages state, checkpoints, retries, and recovery so your workflows can run reliably for long periods. Tip Not sure whether to use Durable Functions or the standalone Durable Task SDKs? See Choose your hosting model.",2026-04-22T08:00:00.000Z,get-started,,0.1,False,"Page is a high-level conceptual overview of Durable Functions (what it is, basic concepts like orchestrator/activity/entity functions). 
It does not provide specific limits, configuration tables, error codes, decision matrices, or other detailed product-specific expert knowledge as defined by the sub-skill types.",updated https://learn.microsoft.com/en-us/azure/azure-functions/durable-functions/tutorial-durable-text-analysis-azure-files,Durable text analysis on a mounted share,Tutorial: Durable text analysis with a mounted Azure Files share in Azure Functions - Azure Durable,,Learn how to deploy a Python Azure Functions app that uses Durable Functions to orchestrate parallel text file analysis by using a mounted Azure Files share on a Flex Consumption plan.,"In this tutorial, you deploy a Python Azure Functions app that uses Durable Functions to orchestrate parallel text file analysis. Your function app mounts an Azure Files share, analyzes multiple text files in parallel (fan-out), aggregates the results (fan-in), and returns them to the caller. This approach demonstrates a key advantage of storage mounts: shared file access across multiple function instances without per-request network overhead. In this tutorial, you: Note The code samples for this ",2026-04-03T21:52:00.000Z,tutorial,,0.3,False,"Durable Functions tutorial demonstrating fan-out/fan-in with Azure Files; primarily a walkthrough without detailed limits, configuration tables, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/errors-diagnostics/diagnostic-events/azfd0001,AZFD0001,AZFD0001: AzureWebJobsStorage app setting is not present. 
- Azure Functions,Resolve AZFD0001 missing AzureWebJobsStorage setting,Learn how to troubleshoot the event 'AZFD0001: AzureWebJobsStorage app setting is not present' in Azure Functions,This event occurs when the function app doesn't have the AzureWebJobsStorage app setting configured for the function app.,2023-12-07T23:03:00.000Z,error-reference,troubleshooting,0.84,True,"Diagnostic event page with specific event ID, condition, and steps to configure AzureWebJobsStorage, matching symptom→cause→solution.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/errors-diagnostics/diagnostic-events/azfd0002,AZFD0002,AZFD0002: Value of AzureWebJobsStorage app setting is invalid. - Azure Functions,Fix AZFD0002 invalid AzureWebJobsStorage value,Learn how to troubleshoot the event 'AZFD0002: Value of AzureWebJobsStorage app setting is invalid' in Azure Functions,This event occurs when the value of the AzureWebJobsStorage app setting is set to either an invalid Azure Storage account connection string or to a Key Vault reference.,2023-12-07T23:03:00.000Z,error-reference,troubleshooting,0.84,True,Maps event ID to invalid connection string/Key Vault reference and provides product-specific resolution guidance.,unchanged @@ -72,8 +72,8 @@ https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-cache https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-cosmosdb,Functions 1.x (legacy),Azure Cosmos DB bindings for Functions 1.x,Use Azure Cosmos DB bindings with Azure Functions 1.x,Understand how to use Azure Cosmos DB triggers and bindings in Azure Functions 1.x.,"Important Support will end for version 1.x of the Azure Functions runtime on September 14, 2026. We highly recommend that you migrate your apps to version 4.x for full support. This article explains how to work with Azure Cosmos DB bindings in Azure Functions. Azure Functions supports trigger, input, and output bindings for Azure Cosmos DB. 
Keep these important considerations in mind when using the Azure Cosmos DB binding for the Functions v1.x runtime: This article is for Azure Functions 1.x. We re",2023-09-27T22:04:00.000Z,reference,integrations,0.75,True,Version-specific binding behavior and considerations for Functions v1; includes extension configuration details unique to this runtime.,unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-cosmosdb-v2,Overview,Azure Cosmos DB bindings for Functions,Use Azure Cosmos DB bindings with Azure Functions v4,Understand how to use Azure Cosmos DB triggers and bindings in Azure Functions.,"This set of articles explains how to work with Azure Cosmos DB bindings in Azure Functions. Azure Functions supports trigger, input, and output bindings for Azure Cosmos DB. For an end-to-end scenario that uses the Azure Cosmos DB extension, see Quickstart: Respond to database changes in Azure Cosmos DB using Azure Functions. Important This version of the Azure Cosmos DB binding extension supports Azure Functions version 4.x. If your app still uses version 1.x of the Functions runtime, instead see Az",2025-12-21T08:00:00.000Z,reference,integrations,0.7,True,Overview of Cosmos DB bindings with links to trigger/input/output details; includes extension versioning and runtime compatibility specifics.,unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-cosmosdb-v2-input,Input,Azure Cosmos DB input binding for Functions 2.x and higher,Configure Azure Cosmos DB input binding for Azure Functions,Learn to use the Azure Cosmos DB input binding in Azure Functions.,"The Azure Cosmos DB input binding uses the SQL API to retrieve one or more Azure Cosmos DB documents and passes them to the input parameter of the function. The document ID or query parameters can be determined based on the trigger that invokes the function. For information on setup and configuration details, see the overview. 
Note When the collection is partitioned, lookup operations must also specify the partition key value. Important This article uses tabs to support multiple versions of the No",2023-07-31T16:56:00.000Z,reference,integrations,0.85,True,"Input binding reference with parameter names, partition key requirements, and query options; product-specific integration configuration.",unchanged -https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-cosmosdb-v2-output,Output,Azure Cosmos DB output binding for Functions 2.x and higher,Configure Azure Cosmos DB output binding for Functions,Learn to use the Azure Cosmos DB output binding in Azure Functions.,"The Azure Cosmos DB output binding lets you write a new document to an Azure Cosmos DB database using the SQL API. For information on setup and configuration details, see the overview. Important This article uses tabs to support multiple versions of the Node.js programming model. The v4 model is generally available and is designed to have a more flexible and intuitive experience for JavaScript and TypeScript developers. For more details about how the v4 model works, refer to the Azure Functions No",2025-09-22T22:32:00.000Z,reference,configuration,0.78,True,"Azure Functions binding reference for Cosmos DB output normally documents binding properties (e.g., databaseName, containerName/collectionName, createIfNotExists, partitionKey, connection) with exact parameter names, types, and defaults, plus version-specific behavior for the Node.js programming model. 
This is detailed, product-specific configuration information that fits the configuration sub-skill type.",updated -https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-cosmosdb-v2-trigger,Trigger,Azure Cosmos DB trigger for Functions 2.x and higher,Configure Azure Cosmos DB trigger binding for Functions,Learn to use the Azure Cosmos DB trigger in Azure Functions.,"The Azure Cosmos DB Trigger uses the Azure Cosmos DB change feed to listen for inserts and updates across partitions. The change feed publishes new and updated items, not including updates from deletions. For an end-to-end scenario that uses the Azure Cosmos DB trigger, see Quickstart: Respond to database changes in Azure Cosmos DB using Azure Functions. For information on setup and configuration details, see the overview. Cosmos DB scaling decisions for the Consumption and Premium plans are done vi",2026-04-14T22:21:00.000Z,reference,configuration,0.78,True,"Binding reference pages for Azure Functions triggers typically include detailed configuration tables (e.g., connection, leaseCollectionName, maxItemsPerInvocation, preferredLocations) with allowed values, defaults, and behavior notes specific to the Cosmos DB trigger and Functions runtime versions. These are product-specific settings that go beyond generic knowledge and match the configuration sub-skill criteria.",updated +https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-cosmosdb-v2-output,Output,Azure Cosmos DB output binding for Functions 2.x and higher,Configure Azure Cosmos DB output binding for Functions,Learn to use the Azure Cosmos DB output binding in Azure Functions.,"The Azure Cosmos DB output binding lets you write a new document to an Azure Cosmos DB database using the SQL API. For information on setup and configuration details, see the overview. Important This article uses tabs to support multiple versions of the Node.js programming model. 
The v4 model is generally available and is designed to have a more flexible and intuitive experience for JavaScript and TypeScript developers. For more details about how the v4 model works, refer to the Azure Functions No",2025-09-22T22:32:00.000Z,reference,configuration,0.78,True,"Azure Functions binding reference for Cosmos DB output normally documents binding properties (e.g., databaseName, containerName/collectionName, createIfNotExists, partitionKey, connection) with exact parameter names, types, and defaults, plus version-specific behavior for the Node.js programming model. This is detailed, product-specific configuration information that fits the configuration sub-skill type.",unchanged +https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-cosmosdb-v2-trigger,Trigger,Azure Cosmos DB trigger for Functions 2.x and higher,Configure Azure Cosmos DB trigger binding for Functions,Learn to use the Azure Cosmos DB trigger in Azure Functions.,"The Azure Cosmos DB Trigger uses the Azure Cosmos DB change feed to listen for inserts and updates across partitions. The change feed publishes new and updated items, not including updates from deletions. For an end-to-end scenario that uses the Azure Cosmos DB trigger, see Quickstart: Respond to database changes in Azure Cosmos DB using Azure Functions. For information on setup and configuration details, see the overview. Cosmos DB scaling decisions for the Consumption and Premium plans are done vi",2026-04-14T22:21:00.000Z,reference,configuration,0.78,True,"Binding reference pages for Azure Functions triggers typically include detailed configuration tables (e.g., connection, leaseCollectionName, maxItemsPerInvocation, preferredLocations) with allowed values, defaults, and behavior notes specific to the Cosmos DB trigger and Functions runtime versions. 
These are product-specific settings that go beyond generic knowledge and match the configuration sub-skill criteria.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-dapr,Overview,Dapr Extension for Azure Functions,Integrate Azure Functions with Dapr extension bindings,Learn to use the Dapr triggers and bindings in Azure Functions.,"The Dapr Extension for Azure Functions is a set of tools and services that allow developers to easily integrate Azure Functions with the Distributed Application Runtime (Dapr) platform. Azure Functions is an event-driven compute service that provides a set of triggers and bindings to easily connect with other Azure services. Dapr provides a set of building blocks and best practices for building distributed applications, including microservices, state management, pub/sub messaging, and more. With the",2025-08-18T22:33:00.000Z,reference,integrations,0.76,True,"Dapr extension reference describes specific trigger and binding types, configuration fields, and usage patterns for Azure Functions+Dapr, which are product-specific integration details beyond generic Dapr concepts.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-dapr-input-secret,Secret,Dapr Secret input binding for Azure Functions,Access secrets with Dapr input binding in Azure Functions,Learn how to access Dapr Secret input binding data during function execution in Azure Functions.,"The Dapr secret input binding allows you to read secrets data as input during function execution. 
For information on setup and configuration details of the Dapr extension, see the Dapr extension overview.",2024-12-03T23:07:00.000Z,reference,integrations,0.76,True,"Provides binding configuration and usage patterns for Dapr secret input binding in Azure Functions, which are product-specific integration details.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-dapr-input-state,State,Dapr State input binding for Azure Functions,Use Dapr state input binding in Azure Functions,Learn how to provide Dapr State input binding data during a function execution in Azure Functions.,"The Dapr state input binding allows you to read Dapr state during a function execution. For information on setup and configuration details of the Dapr extension, see the Dapr extension overview.",2024-12-03T23:07:00.000Z,reference,integrations,0.76,True,"Explains how to read Dapr state via an input binding, including binding names and configuration fields that are specific to the Azure Functions Dapr extension.",unchanged @@ -94,8 +94,8 @@ https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-event https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-event-grid-output,Output,Azure Event Grid output binding for Azure Functions,Send events with Event Grid output binding in Functions,Learn to send an Event Grid event in Azure Functions.,"Use the Event Grid output binding to write events to a custom topic. You must have a valid access key for the custom topic. The Event Grid output binding doesn't support shared access signature (SAS) tokens. For information on setup and configuration details, see How to work with Event Grid triggers and bindings in Azure Functions. Important This article uses tabs to support multiple versions of the Node.js programming model. 
The v4 model is generally available and is designed to have a more flexi",2023-09-24T11:29:00.000Z,reference,integrations,0.82,True,"Describes Event Grid output binding configuration (access key requirement, unsupported SAS tokens, binding parameters) that are specific to Azure Functions integration.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-event-grid-trigger,Trigger,Azure Event Grid trigger for Azure Functions,Configure Azure Event Grid trigger for Azure Functions,Learn to run code when Event Grid events in Azure Functions are dispatched.,"Use the function trigger to respond to an event sent by an Event Grid source. You must have an event subscription to the source to receive events. To learn how to create an event subscription, see Create a subscription. For information on binding setup and configuration, see the overview. Note Event Grid triggers aren't natively supported in an internal load balancer App Service Environment (ASE). The trigger uses an HTTP request that can't reach the function app without a gateway into the virtual ",2025-08-20T08:00:00.000Z,reference,integrations,0.82,True,"Trigger reference includes binding properties, subscription configuration, and ASE limitations specific to Azure Functions’ Event Grid trigger, which are integration-specific details.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-event-hubs,Overview,Azure Event Hubs bindings for Azure Functions,Integrate Azure Functions with Event Hubs bindings,Learn to use Azure Event Hubs trigger and bindings in Azure Functions.,This article explains how to work with Azure Event Hubs bindings for Azure Functions. 
Azure Functions supports trigger and output bindings for Event Hubs.,2024-06-10T17:07:00.000Z,reference,integrations,0.84,True,"Event Hubs bindings reference includes trigger/output binding configuration, consumer group, connection, and partition settings unique to Azure Functions’ Event Hubs integration.",unchanged -https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-event-hubs-output,Output,Azure Event Hubs output binding for Azure Functions,Configure Azure Event Hubs output bindings in Functions,Learn to write messages to Azure Event Hubs streams using Azure Functions.,"This article explains how to work with Azure Event Hubs bindings for Azure Functions. Azure Functions supports trigger and output bindings for Event Hubs. For information on setup and configuration details, see the overview. Use the Event Hubs output binding to write events to an event stream. You must have send permission to an event hub to write events to it. Make sure the required package references are in place before you try to implement an output binding. Important This article uses tabs to s",2025-06-22T05:19:00.000Z,reference,configuration,0.74,True,"Event Hubs output binding docs for Azure Functions normally provide binding configuration tables (e.g., connection, eventHubName, partitionKey) and language-specific attribute/JSON settings with required/optional flags and defaults. This is expert, product-specific configuration knowledge, matching the configuration sub-skill.",updated -https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-event-hubs-trigger,Trigger,Azure Event Hubs trigger for Azure Functions,Configure Azure Event Hubs trigger bindings in Functions,Learn to use Azure Event Hubs trigger in Azure Functions.,"This article explains how to work with Azure Event Hubs trigger for Azure Functions. Azure Functions supports trigger and output bindings for Event Hubs. For information on setup and configuration details, see the overview. 
Use the function trigger to respond to an event sent to an event hub event stream. You need read access to the underlying event hub to set up the trigger. When the function is triggered, the message passed to the function is typed as a string. Event Hubs scaling decisions for the ",2024-06-10T17:07:00.000Z,reference,configuration,0.74,True,"Binding reference pages for Azure Functions typically include detailed attribute/property tables (e.g., connection, eventHubName, consumerGroup) with allowed values, defaults, and behavior specific to the Event Hubs trigger. These are product-specific configuration parameters rather than just conceptual guidance, so this fits the configuration sub-skill.",updated +https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-event-hubs-output,Output,Azure Event Hubs output binding for Azure Functions,Configure Azure Event Hubs output bindings in Functions,Learn to write messages to Azure Event Hubs streams using Azure Functions.,"This article explains how to work with Azure Event Hubs bindings for Azure Functions. Azure Functions supports trigger and output bindings for Event Hubs. For information on setup and configuration details, see the overview. Use the Event Hubs output binding to write events to an event stream. You must have send permission to an event hub to write events to it. Make sure the required package references are in place before you try to implement an output binding. Important This article uses tabs to s",2025-06-22T05:19:00.000Z,reference,configuration,0.74,True,"Event Hubs output binding docs for Azure Functions normally provide binding configuration tables (e.g., connection, eventHubName, partitionKey) and language-specific attribute/JSON settings with required/optional flags and defaults. 
This is expert, product-specific configuration knowledge, matching the configuration sub-skill.",unchanged +https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-event-hubs-trigger,Trigger,Azure Event Hubs trigger for Azure Functions,Configure Azure Event Hubs trigger bindings in Functions,Learn to use Azure Event Hubs trigger in Azure Functions.,"This article explains how to work with Azure Event Hubs trigger for Azure Functions. Azure Functions supports trigger and output bindings for Event Hubs. For information on setup and configuration details, see the overview. Use the function trigger to respond to an event sent to an event hub event stream. You need read access to the underlying event hub to set up the trigger. When the function is triggered, the message passed to the function is typed as a string. Event Hubs scaling decisions for the ",2024-06-10T17:07:00.000Z,reference,configuration,0.74,True,"Binding reference pages for Azure Functions typically include detailed attribute/property tables (e.g., connection, eventHubName, consumerGroup) with allowed values, defaults, and behavior specific to the Event Hubs trigger. These are product-specific configuration parameters rather than just conceptual guidance, so this fits the configuration sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-event-iot,Overview,Azure IoT Hub bindings for Azure Functions,Integrate Azure Functions with IoT Hub bindings,Learn to use IoT Hub trigger and binding in Azure Functions.,"This set of articles explains how to work with Azure Functions bindings for IoT Hub. The IoT Hub support is based on the Azure Event Hubs Binding. 
Important While the following code samples use the Event Hub API, the given syntax is applicable for IoT Hub functions.",2024-06-10T17:07:00.000Z,reference,integrations,0.8,True,IoT Hub bindings reference (based on Event Hubs) includes binding configuration and usage patterns specific to Azure Functions’ IoT Hub integration.,unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-event-iot-trigger,Trigger,Azure IoT Hub trigger for Azure Functions,Configure Azure IoT Hub trigger for Azure Functions,Learn to respond to events sent to an IoT hub event stream in Azure Functions.,"This article explains how to work with Azure Functions bindings for IoT Hub. The IoT Hub support is based on the Azure Event Hubs Binding. For information on setup and configuration details, see the overview. Important While the following code samples use the Event Hub API, the given syntax is applicable for IoT Hub functions. Use the function trigger to respond to an event sent to an event hub event stream. You need read access to the underlying event hub to set up the trigger. When the function ",2024-06-10T17:07:00.000Z,reference,integrations,0.8,True,"Trigger reference covers binding parameters, required permissions, and event stream handling specific to Azure Functions’ IoT Hub trigger.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-expressions-patterns,Binding expression patterns,Azure Functions Binding Expressions and Patterns,Use Azure Functions binding expressions and patterns,Learn how to create various Azure Functions binding expressions based on common patterns.,"One of the most powerful features of triggers and bindings in Azure Functions is binding expressions. In the function.json file and in function parameters and code, you can use expressions that resolve to values from various sources. Most expressions are wrapped in curly braces. 
For example, in a queue trigger function, {queueTrigger} resolves to the queue message text. If the path property for a blob output binding is container/{queueTrigger} and a queue message HelloWorld triggers the function, a blob name",2025-07-22T22:09:00.000Z,how-to,configuration,0.65,True,"Binding expressions reference specific binding properties, syntax patterns, and resolution rules (e.g., {queueTrigger}, path formats) that are product-specific configuration details.",unchanged @@ -125,7 +125,7 @@ https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-rabbi https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-register,Register binding extensions,Register Azure Functions Binding Extensions,Register and configure Azure Functions binding extensions,Learn how to register Azure Functions binding extensions so that the triggers and bindings that the extension defines can be used in your function code.,The Azure Functions runtime natively runs HTTP and timer triggers. The behaviors of the other supported triggers and bindings are implemented in separate extension packages. Projects that use a .NET class library use binding extensions that are installed in the project as NuGet packages. Extension bundles allow non-.NET apps to use binding extensions without having to interact with .NET infrastructure.,2025-07-22T22:09:00.000Z,concept-article,configuration,0.7,True,"Binding extension registration for .NET vs non-.NET involves specific package names, host.json/function.json settings, and versioning details that are product-specific configuration parameters beyond generic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-sendgrid,SendGrid,Azure Functions SendGrid bindings,Use Azure Functions SendGrid output binding,Azure Functions SendGrid bindings reference.,"This article explains how to send email by using SendGrid bindings in Azure Functions. 
Azure Functions supports an output binding for SendGrid. This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the following resources: Create your first function C# developer references: Create your first function JavaScript developer reference Create your first function TypeScript developer reference Create your first function Java developer reference Create",2025-08-18T22:33:00.000Z,reference,integrations,0.78,True,"Binding reference pages typically include binding-specific configuration properties, parameter names, and JSON schema for function.json that are unique to Azure Functions + SendGrid integration.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-service-bus,Overview,Azure Service Bus bindings for Azure Functions,Configure Azure Service Bus bindings for Functions,Learn to use Azure Service Bus triggers and bindings in Azure Functions.,Azure Functions integrates with Azure Service Bus via triggers and bindings. Integrating with Service Bus allows you to build functions that react to and send queue or topic messages.,2025-08-18T22:33:00.000Z,reference,integrations,0.78,True,"Service Bus bindings reference includes trigger/output binding configuration properties, function.json schema, and attribute parameters specific to this integration.",unchanged -https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-service-bus-output,Output,Azure Service Bus output bindings for Azure Functions,Configure Azure Service Bus output bindings in Functions,Learn to send Azure Service Bus messages from Azure Functions.,"Use Azure Service Bus output binding to send queue or topic messages. For information on setup and configuration details, see the overview. Important This article uses tabs to support multiple versions of the Node.js programming model. 
The v4 model is generally available and is designed to have a more flexible and intuitive experience for JavaScript and TypeScript developers. For more details about how the v4 model works, refer to the Azure Functions Node.js developer guide. To learn more about th",2025-09-15T22:11:00.000Z,reference,configuration,0.74,True,"Service Bus output binding documentation generally includes binding property tables (e.g., queueName, topicName, connection, entityType) and their valid values and behaviors across languages and Node.js models. These concrete configuration parameters and defaults are expert configuration details, not just conceptual usage.",updated +https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-service-bus-output,Output,Azure Service Bus output bindings for Azure Functions,Configure Azure Service Bus output bindings in Functions,Learn to send Azure Service Bus messages from Azure Functions.,"Use Azure Service Bus output binding to send queue or topic messages. For information on setup and configuration details, see the overview. Important This article uses tabs to support multiple versions of the Node.js programming model. The v4 model is generally available and is designed to have a more flexible and intuitive experience for JavaScript and TypeScript developers. For more details about how the v4 model works, refer to the Azure Functions Node.js developer guide. To learn more about th",2025-09-15T22:11:00.000Z,reference,configuration,0.74,True,"Service Bus output binding documentation generally includes binding property tables (e.g., queueName, topicName, connection, entityType) and their valid values and behaviors across languages and Node.js models. 
These concrete configuration parameters and defaults are expert configuration details, not just conceptual usage.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-service-bus-trigger,Trigger,Azure Service Bus trigger for Azure Functions,Configure Azure Service Bus trigger for Functions,Learn to run an Azure Function when Azure Service Bus messages are created.,"Use the Service Bus trigger to respond to messages from a Service Bus queue or topic. Starting with extension version 3.1.0, you can trigger on a session-enabled queue or topic. For information on setup and configuration details, see the overview. Service Bus scaling decisions for the Consumption and Premium plans are made based on target-based scaling. For more information, see Target-based scaling. Important This article uses tabs to support multiple versions of the Node.js programming model. Th",2025-12-17T12:14:00.000Z,reference,integrations,0.82,True,"Trigger reference includes binding configuration options (e.g., queueName, topicName, connection, session settings), scale behavior details, and SDK attributes unique to this trigger.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-signalr-service,Overview,Azure Functions SignalR Service bindings,Configure Azure Functions SignalR Service bindings,Understand how to use SignalR Service bindings with Azure Functions.,This set of articles explains how to authenticate and send real-time messages to clients connected to Azure SignalR Service by using SignalR Service bindings in Azure Functions. 
Azure Functions runtime version 2.x and higher supports input and output bindings for SignalR Service.,2025-08-18T22:33:00.000Z,reference,integrations,0.78,True,"Binding reference for SignalR Service includes binding types, configuration properties, and function.json/attribute details specific to this integration.",unchanged @@ -155,7 +155,7 @@ https://learn.microsoft.com/en-us/azure/azure-functions/functions-concurrency,Co https://learn.microsoft.com/en-us/azure/azure-functions/functions-consumption-costs,Consumption plan costs,Estimating consumption-based costs in Azure Functions,Estimate and compare Azure Functions consumption plan costs,Learn how to better estimate the costs that you might incur when running your function app in either the Consumption plan or the Flex Consumption plan in Azure Functions.,"This article shows you how to estimate plan costs for both the Flex Consumption plan and the legacy Consumption plan. Choose the hosting option that best supports the feature, performance, and cost requirements for your function executions. For more information, see Azure Functions scale and hosting. This article focuses on the two consumption plans because billing in these plans depends on active periods of executions inside each instance. 
Provides fast horizontal scaling, with flexible compute opt",2025-10-30T05:16:00.000Z,conceptual,decision-making,0.75,True,"Cost estimation for Consumption vs Flex Consumption plans typically includes formulas, pricing dimensions, and usage-based thresholds that guide plan selection and capacity planning.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-container-apps-hosting,Azure Container Apps hosting,Azure Container Apps hosting of Azure Functions,Host Azure Functions on Azure Container Apps,Learn about how you can use Azure Functions on Azure Container Apps to host and manage containerized function apps in Azure.,"Important A new hosting method for running Azure Functions directly in Azure Container Apps is now available. See Native Azure Functions Support in Azure Container Apps. This integration allows you to use the full features and capabilities of Azure Container Apps. You also benefit from the functions programming model and simplicity of autoscaling provided by Azure Functions. We recommend this approach for most new workloads. For more information, see Azure Functions on Azure Container Apps. Azure ",2025-05-22T05:20:00.000Z,concept-article,architecture-patterns,0.65,True,"Explains native Functions support in Container Apps, when to use this hosting model, and trade-offs vs other options; architecture/hosting pattern guidance.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-continuous-deployment,Continuous deployment,Continuous Deployment for Azure Functions,Configure continuous deployment for Azure Functions,Use the continuous deployment features of Azure App Service when publishing to Azure Functions.,"Azure Functions enables you to continuously deploy the changes made in a source control repository to a connected function app. This source control integration enables a workflow in which a code update triggers build, packaging, and deployment from your project to Azure. 
You should always configure continuous deployment for a staging slot and not for the production slot. When you use the production slot, code updates are pushed directly to production without being verified in Azure. Instead, enabl",2025-12-16T08:00:00.000Z,concept-article,deployment,0.7,True,"Describes Azure Functions–specific continuous deployment behavior, including recommendation to use staging slots instead of production and App Service integration details, which are product-specific deployment patterns.",unchanged -https://learn.microsoft.com/en-us/azure/azure-functions/functions-core-tools-reference,Azure Functions Core Tools,Azure Functions Core Tools reference,Use Azure Functions Core Tools command reference,Reference documentation that supports the Azure Functions Core Tools (func.exe).,"This article provides reference documentation for the Azure Functions Core Tools, which lets you develop, manage, and deploy Azure Functions projects from your local computer. To learn more about using Core Tools, see Work with Azure Functions Core Tools. Core Tools commands are organized into the following contexts, each providing a unique set of actions. Before using the commands in this article, install the Core Tools.",2026-04-12T11:12:00.000Z,reference,configuration,0.7,True,"A Core Tools reference page will enumerate commands, options, and parameters (names, allowed values, defaults) for func.exe. 
This is product-specific configuration/command surface that LLMs won’t fully know from training and matches the configuration category’s focus on parameter references and defaults.",updated +https://learn.microsoft.com/en-us/azure/azure-functions/functions-core-tools-reference,Azure Functions Core Tools,Azure Functions Core Tools reference,Use Azure Functions Core Tools command reference,Reference documentation that supports the Azure Functions Core Tools (func.exe).,"This article provides reference documentation for the Azure Functions Core Tools, which lets you develop, manage, and deploy Azure Functions projects from your local computer. To learn more about using Core Tools, see Work with Azure Functions Core Tools. Core Tools commands are organized into the following contexts, each providing a unique set of actions. Before using the commands in this article, install the Core Tools.",2026-04-12T11:12:00.000Z,reference,configuration,0.7,True,"A Core Tools reference page will enumerate commands, options, and parameters (names, allowed values, defaults) for func.exe. This is product-specific configuration/command surface that LLMs won’t fully know from training and matches the configuration category’s focus on parameter references and defaults.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-ai-enabled-apps,AI-enabled functions,Use AI tools and models in Azure Functions,,"Learn how Azure Functions supports AI integration in your applications, including LLMs, RAG, agentic workflows, and AI frameworks. Build scalable AI-powered serverless solutions.","Azure Functions provides serverless compute resources that integrate with AI and Azure services to streamline building cloud-hosted intelligent applications. This article provides a survey of the breadth of AI-related scenarios, integrations, and other AI resources that you can use in your function apps. 
Consider using Azure Functions in your AI-enabled experiences for these scenarios: Select one of these scenarios to learn more in this article. This article is language-specific, so make sure yo",2026-03-04T23:27:00.000Z,conceptual,,0.1,False,"Survey of AI-related scenarios and integrations for Azure Functions; appears conceptual and scenario-oriented without detailed config tables, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-container-registry,Create functions in containers,Create Azure Functions in a local Linux container,,Get started with Azure Functions by creating a containerized function app on your local computer and publishing the image to a container registry.,"This article shows you how to use Azure Functions Core tools to create your first function in a Linux container on your local computer, verify the function locally, and then publish the containerized function to a container registry. From a container registry, you can easily deploy your containerized functions to Azure. For a complete example of deploying containerized functions to Azure, which include the steps in this article, see one of the following articles: You can also create a function a",2025-07-01T17:29:00.000Z,how-to,,0.2,False,"Tutorial-style get-started article for creating and publishing an Azure Functions Linux container image using Core Tools and a container registry. It focuses on step-by-step commands and workflow, without configuration parameter tables, limits/quotas, error-code-based troubleshooting, or product-specific decision matrices. 
The content is generic deployment/how-to guidance that an LLM can already approximate from training, not expert-only reference details.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-first-function-bicep,Bicep,Create your function app resources in Azure using Bicep,Provision Azure Functions resources using Bicep,Create and deploy to Azure a simple HTTP triggered serverless function using Bicep.,"In this article, you use Bicep to create a function app in a Flex Consumption plan in Azure, along with its required Azure resources. The function app provides a serverless execution context for your function code executions. The app uses Microsoft Entra ID with managed identities to connect to other Azure resources. Completing this quickstart incurs a small cost of a few USD cents or less in your Azure account. Bicep is a domain-specific language (DSL) that uses declarative syntax to deploy Azur",2025-03-20T11:11:00.000Z,quickstart,deployment,0.65,True,Shows how to define a Flex Consumption function app and related resources via Bicep; contains product-specific deployment resource definitions and settings.,unchanged @@ -178,14 +178,14 @@ To create a C# Script app that supports in-portal editing, you must choose a run https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-storage-queue-triggered-function,Queue storage trigger,Create a function in Azure triggered by queue messages,,Use Azure Functions to create a serverless function that is invoked by messages submitted to a queue in Azure.,"Learn how to create a function that is triggered when messages are submitted to an Azure Storage queue. Note In-portal editing is only supported for JavaScript, PowerShell, and C# Script functions. Python in-portal editing is supported only when running in the Consumption plan. To create a C# Script app that supports in-portal editing, you must choose a runtimeVersion that supports the in-process model. 
When possible, you should develop your functions locally. To learn more about the limitations on",2024-09-18T08:00:00.000Z,how-to,,0.45,False,Queue-trigger tutorial is primarily step-by-step; the brief notes on in-portal editing are incidental and not the main focus of expert configuration guidance.,unchanged -https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-vnet,Connect to a Virtual Network,Use private endpoints to integrate Azure Functions with a virtual network,Secure Azure Functions with VNet private endpoints,This tutorial shows you how to connect a function to an Azure virtual network and lock it down by using private endpoints.,"This tutorial shows you how to use Azure Functions to connect to resources in an Azure virtual network by using private endpoints. You create a new function app using a new storage account that's locked behind a virtual network by using the Azure portal. The virtual network uses a Service Bus queue trigger. In this tutorial, you'll:",2026-04-17T06:16:00.000Z,tutorial,security,0.7,True,"A tutorial on integrating Azure Functions with a virtual network using private endpoints generally includes product-specific security configuration: exact resource types to create, required subnet and private endpoint settings, and sometimes specific RBAC roles or access restrictions for storage and Service Bus. 
These are concrete, Azure-specific security configuration steps rather than generic networking concepts, so it fits the security sub-skill.",updated +https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-vnet,Connect to a Virtual Network,Use private endpoints to integrate Azure Functions with a virtual network,Secure Azure Functions with VNet private endpoints,This tutorial shows you how to connect a function to an Azure virtual network and lock it down by using private endpoints.,"This tutorial shows you how to use Azure Functions to connect to resources in an Azure virtual network by using private endpoints. You create a new function app using a new storage account that's locked behind a virtual network by using the Azure portal. The virtual network uses a Service Bus queue trigger. In this tutorial, you'll:",2026-04-17T06:16:00.000Z,tutorial,security,0.7,True,"A tutorial on integrating Azure Functions with a virtual network using private endpoints generally includes product-specific security configuration: exact resource types to create, required subnet and private endpoint settings, and sometimes specific RBAC roles or access restrictions for storage and Service Bus. These are concrete, Azure-specific security configuration steps rather than generic networking concepts, so it fits the security sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-your-first-function-visual-studio,Visual Studio,Quickstart: Create your first C# function in Azure using Visual Studio,,"In this quickstart, you learn how to use Visual Studio to create and publish a C# HTTP triggered function to Azure Functions.","Azure Functions lets you use Visual Studio to create local C# function projects and then easily publish this project to run in a scalable serverless environment in Azure. 
If you prefer to develop your C# apps locally using Visual Studio Code, you should instead consider the Visual Studio Code-based version of this article. By default, this article shows you how to create C# functions that run on .NET 8 in an isolated worker process. Function apps that run in an isolated worker process are supported",2025-08-11T22:18:00.000Z,quickstart,,0.3,False,Visual Studio C# quickstart; focuses on creating and publishing a simple HTTP-triggered function.,unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-custom-handlers,Custom handlers,Azure Functions custom handlers,Configure Azure Functions custom handlers for any runtime,Learn to use Azure Functions with any language or runtime version.,"Azure Functions executes your app code by using language-specific handlers. These language-specific handlers allow Functions to support most key languages by default. However, you might need to run code in another language or package. Custom handlers are lightweight web servers that receive events from the Azure Functions host process. You can use custom handlers to deploy to Azure Functions any code project that supports HTTP primitives. Custom handlers are best suited for situations where you wa",2025-12-03T23:21:00.000Z,concept-article,configuration,0.7,True,"Custom handlers require specific host.json configuration, process startup commands, and HTTP contract details that are concrete configuration parameters unique to Azure Functions.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-debug-powershell-local,Debug local PowerShell functions,Debug PowerShell Azure Functions locally,,Learn how to debug PowerShell functions when running locally.,"Azure Functions lets you develop your functions as PowerShell scripts. 
You can debug your PowerShell functions locally as you would any PowerShell scripts using the following standard development tools: Azure Functions Core Tools supports local debugging of Azure Functions, including PowerShell functions.",2020-11-05T23:03:00.000Z,conceptual,,0.2,False,"Local debugging guidance for PowerShell Functions using standard tools; no product-specific error codes, config matrices, or limits.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-deploy-container,Linux container (Premium),Create your first containerized Azure Functions,Deploy containerized Azure Functions on Linux in Azure,Get started by deploying your first function app from a Linux image in a container registry to Azure Functions.,"In this article, you create a function app running in a Linux container and deploy it to Azure Functions. Deploying your function code to Azure Functions in a container requires Premium plan or Dedicated (App Service) plan hosting. Completing this article incurs costs of a few US dollars in your Azure account, which you can minimize by cleaning-up resources when you're done. 
Tip When you need to run your event-driven functions in Azure in the same environment as other microservices, APIs, websites, wo",2025-02-05T23:02:00.000Z,quickstart,deployment,0.7,True,"Describes requirements for Premium vs Dedicated plans, container registry usage, and plan-specific constraints for containerized Functions, which are deployment-specific details.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-deploy-container-apps,Azure Container Apps hosting (legacy),Create your first containerized Azure Functions on Azure Container Apps,Deploy containerized Azure Functions to Container Apps,Get started with Azure Functions on Azure Container Apps by deploying your first function app from a Linux image in a container registry.,"In this article, you create a function app running in a Linux container and deploy it to an Azure Container Apps environment from a container registry. By deploying to Container Apps, you're able to integrate your function apps into cloud-native microservices. For more information, see Azure Container Apps hosting of Azure Functions. Important A new hosting method for running Azure Functions directly in Azure Container Apps is now available. See Native Azure Functions Support in Azure Container Ap",2025-05-22T05:20:00.000Z,quickstart,deployment,0.7,True,"Walks through deploying a Linux container image with Functions to Azure Container Apps, a product-specific deployment scenario with environment requirements.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-deployment-slots,Deployment slots,Azure Functions deployment slots,Use deployment slots with Azure Functions apps,Learn to create and use deployment slots with Azure Functions by using the Azure portal or with Azure CLI.,"Azure Functions deployment slots allow your function app to run different instances called slots. Slots are different environments exposed by using a publicly available endpoint. 
One app instance is always mapped to the production slot, and you can swap instances assigned to a slot on demand. The number of available slots depends on your specific hosting option: The following descriptions reflect how functions are affected by swapping slots:",2025-05-07T08:00:00.000Z,conceptual,deployment,0.75,True,"Explains Functions deployment slots, including how swaps affect functions and that slot availability depends on hosting option, which is a product-specific deployment constraint matrix.",unchanged -https://learn.microsoft.com/en-us/azure/azure-functions/functions-deployment-technologies,Deployment options,Deployment technologies in Azure Functions,Select deployment technologies for Azure Functions apps,Learn the different ways you can deploy code to Azure Functions.,You can use several different technologies to deploy your Azure Functions project code to Azure. This article provides an overview of the deployment methods available to you and recommendations for the best method to use in various scenarios. It also provides a comprehensive list of and key details about the underlying deployment technologies.,2026-04-12T08:00:00.000Z,concept-article,deployment,0.7,True,"Provides an overview of all supported deployment methods for Azure Functions with recommendations for which to use in various scenarios and key details about each underlying technology, which is product-specific deployment guidance.",updated +https://learn.microsoft.com/en-us/azure/azure-functions/functions-deployment-technologies,Deployment options,Deployment technologies in Azure Functions,Select deployment technologies for Azure Functions apps,Learn the different ways you can deploy code to Azure Functions.,You can use several different technologies to deploy your Azure Functions project code to Azure. This article provides an overview of the deployment methods available to you and recommendations for the best method to use in various scenarios. 
It also provides a comprehensive list of, and key details about, the underlying deployment technologies.,2026-04-12T08:00:00.000Z,concept-article,deployment,0.7,True,"Provides an overview of all supported deployment methods for Azure Functions with recommendations for which to use in various scenarios and key details about each underlying technology, which is product-specific deployment guidance.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-develop-local,Develop and debug locally,Develop and run Azure Functions locally,Configure and run Azure Functions locally with Core Tools,Learn how to code and test Azure Functions on your local computer before you run them on Azure Functions.,"Whenever possible, create and validate your Azure Functions code project in a local development environment. By using Azure Functions Core Tools, you get a local runtime version of Azure Functions that integrates with popular development tools for integrated development, debugging, and deployment. Your local functions can even connect to live Azure services. This article provides some shared guidance for local development, such as working with the local.settings.json file. It also links to de",2026-01-23T08:00:00.000Z,concept-article,configuration,0.7,True,"Includes details on local.settings.json schema, connection string handling, and emulator usage that are concrete configuration parameters for local development.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-develop-vs,Visual Studio development,Develop Azure Functions using Visual Studio,Develop and publish C# Azure Functions with Visual Studio,Find out how to use Azure Functions Tools for Visual Studio 2022 to develop and test C# class library function apps and publish them to Azure.,"Visual Studio provides a way to develop, test, and deploy C# class library functions to Azure. If this experience is your first with Azure Functions, see Azure Functions overview. 
To get started right away, consider completing the Functions quickstart for Visual Studio. This article provides detailed information about how to use Visual Studio to develop C# class library functions and publish them to Azure. There are two models for developing C# class library functions: the isolated worker model and ",2025-09-23T05:17:00.000Z,how-to,deployment,0.6,True,"Covers Visual Studio project templates, publish profiles, and configuration options for deploying Functions, which are concrete deployment patterns.",unchanged
@@ -209,7 +209,7 @@ https://learn.microsoft.com/en-us/azure/azure-functions/functions-idempotent,Des
https://learn.microsoft.com/en-us/azure/azure-functions/functions-identity-access-azure-sql-with-managed-identity,Access Azure SQL with managed identity,Connect a function app to Azure SQL with managed identity and SQL bindings - Azure Functions,Connect Azure Functions to Azure SQL via managed identity,Learn how to connect Azure SQL bindings through managed identity.,"Azure Functions provides a managed identity, which is a turn-key solution for securing access to Azure SQL Database and other Azure services. Managed identities make your app more secure by eliminating secrets from your app, such as credentials in the connection strings. In this tutorial, you'll add managed identity to an Azure Function that utilizes Azure SQL bindings. A sample Azure Function project with SQL bindings is available in the ToDo backend example. 
When you're finished with this tutorial",2023-10-12T17:01:00.000Z,tutorial,security,0.75,True,"Shows how to configure Azure SQL bindings to use managed identity, including specific binding configuration and identity setup steps that are security-focused and product-specific.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-identity-based-connections-tutorial,Use identity for host connections,Create a function app without default storage secrets in its definition - Azure Functions,Configure identity-based connections for Azure Functions,Learn how to remove storage connection strings from your function app definition and use identity-based connections instead.,"This tutorial shows you how to configure a function app using Microsoft Entra identities instead of secrets or connection strings, where possible. Using identities helps you avoid accidentally leaking sensitive secrets and can provide better visibility into how data is accessed. To learn more about identity-based connections, see configure an identity-based connection. While the procedures shown work generally for all languages, this tutorial currently supports C# class library functions on Windo",2024-06-27T08:00:00.000Z,tutorial,security,0.7,True,"Tutorial shows concrete, product-specific steps and setting names for replacing storage connection strings with Microsoft Entra/managed identity in a function app. 
Includes identity configuration details that are implementation-specific, not just conceptual.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-identity-based-connections-tutorial-2,Use identity for triggers and bindings,Use identity-based connections with Azure Functions triggers and bindings,Use managed identity with Functions triggers and bindings,Learn how to use identity-based connections instead of secrets when connecting to a Service Bus queue using Azure Functions.,"This tutorial shows you how to configure Azure Functions to connect to Azure Service Bus queues by using managed identities, instead of secrets stored in the function app settings. The tutorial is a continuation of the Create a function app without default storage secrets in its definition tutorial. To learn more about identity-based connections, see Configure an identity-based connection. While the procedures shown work generally for all languages, this tutorial currently supports C# class librar",2024-07-02T22:19:00.000Z,tutorial,security,0.7,True,"Tutorial configures Azure Functions triggers/bindings (Service Bus queue) to use identity-based connections instead of secrets, with specific setting names and wiring patterns that are product-specific security configuration.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-functions/functions-infrastructure-as-code,Infrastructure as code,Automate function app resource deployment to Azure,Automate Azure Functions deployment with Bicep or ARM,"Learn how to build, validate, and use a Bicep file or an Azure Resource Manager template to deploy your function app and related Azure resources.","You can use a Bicep file or an Azure Resource Manager (ARM) template to automate the process of deploying your function app. During the deployment, you can use existing Azure resources or create new ones. 
You can obtain these benefits in your production apps by using deployment automation, both infrastructure-as-code (IaC) and continuous integration and deployment (CI/CD): This article shows you how to automate the creation of Azure resources and deployment configurations for Azure Functions. To",2026-03-15T11:12:00.000Z,conceptual,deployment,0.65,True,"The article focuses on automating deployment of function apps and related resources using Bicep/ARM, including deployment configurations for production and CI/CD. This is product-specific deployment automation guidance rather than a generic tutorial, fitting the deployment sub-skill.",unchanged +https://learn.microsoft.com/en-us/azure/azure-functions/functions-infrastructure-as-code,Infrastructure as code,Automate function app resource deployment to Azure,Deploy Azure Functions with Bicep or ARM templates,"Learn how to build, validate, and use a Bicep file or an Azure Resource Manager template to deploy your function app and related Azure resources.","To automate deploying your function app, use a Bicep file or an Azure Resource Manager template (ARM template). During deployment, you can use existing Azure resources or create new ones. By using deployment automation, both infrastructure as code (IaC) and continuous integration and deployment (CI/CD), you can bring these benefits to your production apps: This article shows you how to automate the creation of Azure resources and deployment configurations for Azure Functions. To learn more about",2026-04-23T06:20:00.000Z,how-to,deployment,0.7,True,"The article goes beyond a basic tutorial and describes product-specific deployment automation for Azure Functions using Bicep/ARM, including how to structure and parameterize templates for function apps and related resources. 
This is deployment-focused IaC guidance that an LLM is unlikely to fully infer from general knowledge, but it does not center on limits, security, or configuration matrices.",updated
https://learn.microsoft.com/en-us/azure/azure-functions/functions-integrate-storage-queue-output-binding,Azure portal,Add messages to an Azure Storage queue using Functions,,Use Azure Functions to create a serverless function that's triggered by an HTTP request and creates a message in an Azure Storage queue.,"In Azure Functions, input and output bindings provide a declarative way to make data from external services available to your code. In this article, you use an output binding to create a message in a queue when an HTTP request triggers a function. You use an Azure Storage container to view the queue messages that your function creates.",2024-07-02T22:19:00.000Z,how-to,,0.3,False,Tutorial-style integration of Storage queue output binding; mostly step-by-step usage without detailed binding parameter tables or product-specific constraints.,unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-integrate-store-unstructured-data-cosmosdb,Azure Cosmos DB - portal,Store unstructured data using Azure Cosmos DB and Functions,Integrate Azure Functions with Azure Cosmos DB for unstructured data,Store unstructured data using Azure Functions and Azure Cosmos DB,"Azure Cosmos DB is a great way to store unstructured and JSON data. Combined with Azure Functions, Azure Cosmos DB makes storing data quick and easy with much less code than required for storing data in a relational database. Note At this time, the Azure Cosmos DB trigger, input bindings, and output bindings work with SQL API and Graph API accounts only. In Azure Functions, input and output bindings provide a declarative way to connect to external service data from your function. 
In this article,",2022-10-12T00:00:00.000Z,quickstart,integrations,0.7,True,"Covers Cosmos DB triggers and bindings, including binding configuration, supported APIs (SQL/Graph), and code patterns that are specific integration details between Functions and Cosmos DB.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-kubernetes-keda,Functions in Kubernetes,Azure Functions on Kubernetes with KEDA,Host Azure Functions on Kubernetes with KEDA,"Understand how to run Azure Functions in Kubernetes in the cloud or on-premises using KEDA, Kubernetes-based event driven autoscaling.","The Azure Functions runtime provides flexibility in hosting where and how you want. KEDA (Kubernetes-based Event Driven Autoscaling) pairs seamlessly with the Azure Functions runtime and tooling to provide event driven scale in Kubernetes. Important Running your containerized function apps on Kubernetes, either by using KEDA or by direct deployment, is an open-source effort that you can use free of cost. Best-effort support is provided by contributors and from the community by using GitHub issues i",2024-08-20T22:03:00.000Z,conceptual,deployment,0.7,True,"Describes running Functions on Kubernetes with KEDA, including event-driven autoscaling behavior and support model, which are specific deployment patterns for this product.",unchanged
@@ -217,9 +217,9 @@ https://learn.microsoft.com/en-us/azure/azure-functions/functions-machine-learni
https://learn.microsoft.com/en-us/azure/azure-functions/functions-manually-run-non-http,Manually run a non HTTP-triggered function,Manually run a non HTTP-triggered Azure Functions,,Use an HTTP request to run a non-HTTP triggered Azure Functions,"This article demonstrates how to manually run a non HTTP-triggered function via a specially formatted HTTP request. In some contexts, such as during development and troubleshooting, you might need to run ""on-demand"" an Azure Function that is indirectly triggered. 
Examples of indirect triggers include functions on a schedule or functions that run as the result of events. The procedure described in this article is equivalent to using the Test/Run functionality of a function's Code + Test tab in the Azure p",2025-05-08T08:00:00.000Z,article,,0.4,False,"Shows how to manually invoke non-HTTP triggers via HTTP; largely procedural without detailed config tables, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-mcp-foundry-tools,Connect to Foundry Agent Service,Connect an MCP server on Azure Functions to Foundry Agent Service,Connect MCP servers on Azure Functions to Foundry Agent Service,"Learn how to connect your MCP server hosted on Azure Functions to Azure AI Foundry Agent Service, enabling your agents to use custom tools.","This article shows you how to connect your Model Context Protocol (MCP) server hosted on Azure Functions to Microsoft Foundry Agent Service. After completing this guide, your agent can discover and invoke the tools exposed by your MCP server. This article follows this basic process for configuring the MCP server connection from Foundry Agent Service:",2026-02-13T12:10:00.000Z,how-to,integrations,0.75,True,"Shows how to configure an MCP server hosted on Functions so Foundry Agent Service can discover and invoke tools, involving specific integration configuration steps and parameters.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-mcp-tutorial,Host MCP servers for AI-enabled functions,Tutorial: Host an MCP server on Azure Functions,Securely host MCP servers on Azure Functions,"Host your MCP server on Azure Functions with ease. Learn to configure endpoints, enable authentication, and deploy scalable serverless solutions.",This article shows you how to host remote Model Context Protocol (MCP) servers on Azure Functions. 
You also learn how to use built-in authentication to configure server endpoint authorization and better secure your AI tools. There are two ways to host a remote MCP server in Azure Functions: Note The ability to have Azure Functions host MCP servers you create using official MCP SDKs is currently in preview. This tutorial covers both MCP server options supported by Functions. Select the tab that bes,2025-11-18T18:43:00.000Z,tutorial,security,0.7,True,"Covers configuring MCP server endpoints and built-in authentication/authorization for Functions, including endpoint auth settings and security options.",unchanged -https://learn.microsoft.com/en-us/azure/azure-functions/functions-monitoring,Monitor function executions,Monitor executions in Azure Functions,,"Learn how to use Azure Application Insights with Azure Functions to monitor function executions. Application Insights collects log, performance, and error data.","Azure Functionsoffers built-in integration withAzure Application Insightsto monitor function executions. This article provides an overview of the monitoring capabilities provided by Azure for monitoring Azure Functions, including how to choose a telemetry exporter. Application Insights collects log, performance, and error data. By automatically detecting performance anomalies and featuring powerful analytics tools, you can more easily diagnose issues and better understand how your functions are ",2026-04-13T11:11:00.000Z,concept-article,,0.3,False,"Monitoring article is described as an overview of capabilities and exporter choice; no indication of specific error codes, configuration tables, or parameter defaults unique to this product. 
Likely conceptual/how-to monitoring guidance rather than detailed troubleshooting, configuration, or limits.",updated
+https://learn.microsoft.com/en-us/azure/azure-functions/functions-monitoring,Monitor function executions,Monitor executions in Azure Functions,,"Learn how to use Azure Application Insights with Azure Functions to monitor function executions. Application Insights collects log, performance, and error data.","Azure Functions offers built-in integration with Azure Application Insights to monitor function executions. This article provides an overview of the monitoring capabilities provided by Azure for monitoring Azure Functions, including how to choose a telemetry exporter. Application Insights collects log, performance, and error data. By automatically detecting performance anomalies and featuring powerful analytics tools, you can more easily diagnose issues and better understand how your functions are ",2026-04-13T11:11:00.000Z,concept-article,,0.3,False,"Monitoring article is described as an overview of capabilities and exporter choice; no indication of specific error codes, configuration tables, or parameter defaults unique to this product. Likely conceptual/how-to monitoring guidance rather than detailed troubleshooting, configuration, or limits.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-networking-faq,Networking FAQ,Frequently asked questions about networking in Azure Functions,,Answers to some of the most common questions and scenarios for networking with Azure Functions.,"This article lists frequently asked questions about networking in Azure Functions. 
For a more comprehensive overview, see Functions networking options.",2022-04-01T11:02:00Z,faq,,0.3,False,"Networking FAQ is likely conceptual Q&A and high-level guidance without detailed configuration tables, numeric thresholds, or error-code-based troubleshooting.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-functions/functions-networking-options,Networking options,Azure Functions networking options,Configure Azure Functions networking and access controls,"Explore all supported Azure Functions networking features, including IP restrictions, private and service endpoints, and virtual network integration.","This article describes the networking features available across the hosting options for Azure Functions. The following networking options can be categorized as inbound and outbound networking features. Inbound features allow you to restrict access to your app, whereas outbound features allow you to connect your app to resources secured by a virtual network and control how outbound traffic is routed. Thehosting modelshave different levels of network isolation available. Choosing the correct one h",2026-04-17T06:16:00.000Z,concept-article,security,0.65,True,"The page details concrete networking features (IP restrictions, private endpoints, VNet integration) per Azure Functions hosting model, including which isolation options are available on which plans. These are product-specific security and access configuration details that go beyond generic networking concepts.",updated
+https://learn.microsoft.com/en-us/azure/azure-functions/functions-networking-options,Networking options,Azure Functions networking options,Configure Azure Functions networking and access controls,"Explore all supported Azure Functions networking features, including IP restrictions, private and service endpoints, and virtual network integration.","This article describes the networking features available across the hosting options for Azure Functions. 
The following networking options can be categorized as inbound and outbound networking features. Inbound features allow you to restrict access to your app, whereas outbound features allow you to connect your app to resources secured by a virtual network and control how outbound traffic is routed. The hosting models have different levels of network isolation available. Choosing the correct one h",2026-04-17T06:16:00.000Z,concept-article,security,0.65,True,"The page details concrete networking features (IP restrictions, private endpoints, VNet integration) per Azure Functions hosting model, including which isolation options are available on which plans. These are product-specific security and access configuration details that go beyond generic networking concepts.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-node-troubleshoot,Node.js,Troubleshoot Node.js apps in Azure Functions,Troubleshoot Node.js Azure Functions deployment and runtime issues,Learn how to troubleshoot common errors when you deploy or run a Node.js app in Azure Functions.,Important The content of this article changes based on your choice of the Node.js programming model in the selector at the top of the page. The v4 model is generally available and is designed to have a more flexible and intuitive experience for JavaScript and TypeScript developers. Learn more about the differences between v3 and v4 in the migration guide. This article provides a guide for troubleshooting common scenarios in Node.js function apps. 
The Diagnose and solve problems tab in the Azure port,2023-09-26T22:04:00.000Z,troubleshooting-general,troubleshooting,0.85,True,"Provides guidance for common Node.js function app issues, likely including specific error messages and resolutions, fitting the troubleshooting pattern.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-node-upgrade-v4,Migrate Node.js to model v4.x,Migrate to v4 of the Node.js model for Azure Functions,,This article shows you how to upgrade your existing function apps running on v3 of the Node.js programming model to v4.,"This article discusses the differences between version 3 and version 4 of the Node.js programming model and how to upgrade an existing v3 app. If you want to create a new v4 app instead of upgrading an existing v3 app, see the tutorial for either Visual Studio Code (VS Code) or Azure Functions Core Tools. This article uses ""tip"" alerts to highlight the most important concrete actions that you should take to upgrade your app. Version 4 is designed to provide Node.js developers with the following ben",2024-01-18T05:44:00.000Z,how-to,,0.4,False,"Migration guide between Node.js v3 and v4 models; mostly conceptual and step-by-step upgrade instructions without detailed config tables, limits, or product-specific error mappings.",unchanged
@@ -241,8 +241,8 @@ https://learn.microsoft.com/en-us/azure/azure-functions/functions-scenarios,Scen
https://learn.microsoft.com/en-us/azure/azure-functions/functions-target-based-scaling,Target-based scaling,Target-based scaling in Azure Functions,Use target-based scaling for Azure Functions triggers,Explains target-based scaling behaviors of Consumption plan and Premium plan function apps.,"Target-based scaling provides a fast and intuitive scaling model for customers and is currently supported for these binding extensions: Target-based scaling replaces the previous Azure Functions incremental scaling model as the default for these extension types. 
Incremental scaling added or removed a maximum of one worker at each new instance rate, with complex decisions for when to scale. In contrast, target-based scaling allows scale-up of four instances at a time, and the scaling decision is b",2025-11-18T18:43:00.000Z,conceptual,limits-quotas,0.75,True,"Describes target-based scaling model with specific numeric behaviors (e.g., up to four instances at a time, previous incremental model constraints); these are scaling limits/quotas.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings,About triggers and bindings,Triggers and Bindings in Azure Functions,,Learn how to use triggers and bindings to connect your Azure function to online events and cloud-based services.,"In this article, you learn the high-level concepts surrounding triggers and bindings for functions. Triggers cause a function to run. A trigger defines how a function is invoked, and a function must have exactly one trigger. Triggers can also pass data into your function, as you would with method calls. Binding to a function is a way of declaratively connecting your functions to other resources. Bindings either pass data into your function (an input binding) or enable you to write data out from y",2026-02-26T08:00:00.000Z,concept-article,,0.1,False,"Page is a high-level conceptual overview of Azure Functions triggers and bindings without detailed limits, configuration tables, error codes, or product-specific numeric thresholds.",unchanged https://learn.microsoft.com/en-us/azure/azure-functions/functions-twitter-email,Functions with Logic Apps,Create a function that integrates with Azure Logic Apps,Integrate Azure Functions with Logic Apps and AI,Create a function that integrates with Azure Logic Apps and Azure AI services. The resulting workflow categorizes tweet sentiments and sends email notifications.,"Azure Functions integrates with Azure Logic Apps in the Logic Apps Designer. 
This integration allows you to use the computing power of Functions in orchestrations with other Azure and third-party services. This tutorial shows you how to create a workflow to analyze X activity. As tweets are evaluated, the workflow sends notifications when positive sentiments are detected. In this tutorial, you learn to:",2024-08-09T05:35:00.000Z,tutorial,integrations,0.6,True,Shows how Functions integrate into a Logic Apps workflow with Azure AI; includes product-specific configuration steps for cross-service integration.,unchanged
-https://learn.microsoft.com/en-us/azure/azure-functions/functions-versions,Compare runtime versions,Compare Azure Functions Runtime Versions,Choose the right Azure Functions runtime version,"Learn how Azure Functions supports multiple versions of the runtime, and understand the differences between them and how to choose the one that's right for you.","Azure Functions currently supports two versions of the runtime host. The following table details the currently supported runtime versions, their support level, and when to use them: *Support ends September 14, 2026. For more information, seethe version 1.x support announcement. Azure Functions currently supports only version 4.x of the runtime host. Important Versions 2.x and 3.x of the Azure Functions runtime are no longer supported. For more information, seeRetired versions. 
Important Function",2026-04-12T11:12:00.000Z,concept-article,decision-making,0.7,True,"Contains a comparison of Azure Functions runtime versions with support status, retirement information, and explicit guidance on when to use each version, enabling concrete version-selection decisions beyond generic knowledge.",updated -https://learn.microsoft.com/en-us/azure/azure-functions/functions-zone-redundancy,Zone redundancy,Configure Zone Redundancy for Azure Functions,Configure zone-redundant Azure Functions apps,"Learn how to configure zone redundancy for Azure Functions, create zone-redundant Function Apps, and migrate existing function apps to use multiple availability zones.","Zone redundancy enables your function apps to be resilient to problems in Azure availability zones, so your app remains available when a datacenter or zone has an outage. This article provides step-by-step guidance for configuring Azure Functions to be zone-redundant, depending on your hosting plan. For information about how availability zones work with Azure Functions, seeReliability in Azure Functions. Availability zone configuration for Azure Functions depends on yourFunctions hosting plan: I",2026-04-16T22:31:00.000Z,how-to,deployment,0.7,True,"The article gives hosting-plan-specific guidance and constraints for creating and migrating zone-redundant Function Apps. This includes which plans support zone redundancy and how to configure it per plan, which is product- and tier-specific deployment behavior that an LLM is unlikely to know from training.",updated +https://learn.microsoft.com/en-us/azure/azure-functions/functions-versions,Compare runtime versions,Compare Azure Functions Runtime Versions,Choose the right Azure Functions runtime version,"Learn how Azure Functions supports multiple versions of the runtime, and understand the differences between them and how to choose the one that's right for you.","Azure Functions currently supports two versions of the runtime host. 
The following table details the currently supported runtime versions, their support level, and when to use them: *Support ends September 14, 2026. For more information, see the version 1.x support announcement. Azure Functions currently supports only version 4.x of the runtime host. Important Versions 2.x and 3.x of the Azure Functions runtime are no longer supported. For more information, see Retired versions. Important Function",2026-04-17T22:08:00.000Z,concept-article,decision-making,0.74,True,"Page compares Azure Functions runtime versions with concrete support timelines, indicates which versions are currently supported or retired, and provides guidance on when to use each version. This is product-specific decision guidance (version selection and lifecycle) rather than just conceptual description, fitting the decision-making category.",updated
+https://learn.microsoft.com/en-us/azure/azure-functions/functions-zone-redundancy,Zone redundancy,Configure Zone Redundancy for Azure Functions,Configure zone-redundant Azure Functions apps,"Learn how to configure zone redundancy for Azure Functions, create zone-redundant Function Apps, and migrate existing function apps to use multiple availability zones.","Zone redundancy enables your function apps to be resilient to problems in Azure availability zones, so your app remains available when a datacenter or zone has an outage. This article provides step-by-step guidance for configuring Azure Functions to be zone-redundant, depending on your hosting plan. For information about how availability zones work with Azure Functions, see Reliability in Azure Functions. Availability zone configuration for Azure Functions depends on your Functions hosting plan: I",2026-04-16T22:31:00.000Z,how-to,deployment,0.7,True,"The article gives hosting-plan-specific guidance and constraints for creating and migrating zone-redundant Function Apps. 
This includes which plans support zone redundancy and how to configure it per plan, which is product- and tier-specific deployment behavior that an LLM is unlikely to know from training.",unchanged
https://learn.microsoft.com/en-us/azure/azure-functions/how-to-create-function-azure-cli,Command line (Core Tools),Create a function in Azure from the command line,,"Learn how to use command line tools, such as Azure Functions Core Tools, to create a function code project, create Azure resources, and publish function code to run in Azure Functions.","In this article, you use local command-line tools to create a function that responds to HTTP requests. After verifying your code locally, you deploy it to a serverless Flex Consumption hosting plan in Azure Functions. Completing this quickstart incurs a small cost of a few USD cents or less in your Azure account. Make sure to select your preferred development language at the top of the article.",2025-12-03T23:21:00.000Z,quickstart,,0.3,False,CLI-based quickstart for creating and deploying a function; no indication of extensive config references or quotas.,unchanged
https://learn.microsoft.com/en-us/azure/azure-functions/how-to-create-function-vs-code,Visual Studio Code,Create and deploy function code to Azure using Visual Studio Code,,"Learn how to create a function, then publish the local code project to serverless hosting in Azure Functions using the Azure Functions extension in Visual Studio Code.","Use Visual Studio Code to create a function that responds to HTTP requests from a template. Use GitHub Copilot to improve the generated function code, verify code updates locally, and then deploy it to the serverless Flex Consumption hosting plan in Azure Functions. Use Visual Studio Code to create a custom handler function that responds to HTTP requests. After verifying the code locally, you deploy it to the serverless Flex Consumption hosting plan in Azure Functions. 
Custom handlers can be used ",2025-12-03T23:21:00.000Z,quickstart,,0.3,False,"VS Code quickstart; primarily step-by-step creation and deployment, not a configuration or troubleshooting reference.",unchanged
https://learn.microsoft.com/en-us/azure/azure-functions/ip-addresses,IP addresses,IP addresses in Azure Functions,Understand and manage Azure Functions app IP addresses,"Learn how to find inbound and outbound IP addresses for function apps, and what causes them to change.","This article explains the following concepts related to IP addresses of function apps: IP addresses are associated with function apps, not with individual functions. Incoming HTTP requests can't use the inbound IP address to call individual functions; they must use the default domain name (functionappname.azurewebsites.net) or a custom domain name.",2025-05-06T08:00:00.000Z,conceptual,configuration,0.65,True,"Explains how inbound/outbound IPs are assigned, how to retrieve them, and when they change—product-specific behavior and configuration details not captured by generic knowledge.",unchanged
@@ -256,8 +256,8 @@ https://learn.microsoft.com/en-us/azure/azure-functions/migrate-version-1-versio
https://learn.microsoft.com/en-us/azure/azure-functions/migrate-version-3-version-4,Migrate v3.x to v4.x,Migrate apps from Azure Functions version 3.x to 4.x,Migrate Azure Functions apps from runtime v3 to v4,Learn how to migrate your existing function apps running on version 3.x of the Azure Functions runtime to be able to run on version 4.x of the runtime.,"Azure Functions version 4.x is highly backwards compatible to version 3.x. Most apps should safely migrate to 4.x without requiring significant code changes. For more information about Functions runtime versions, see Azure Functions runtime versions overview. Important As of December 13, 2022, function apps running on versions 2.x and 3.x of the Azure Functions runtime have reached the end of extended support. 
For more information, see Retired versions. This article walks you through the process o",2025-09-30T08:00:00.000Z,how-to,decision-making,0.75,True,"Walks through migration from v3.x to v4.x, including compatibility considerations and steps, which are migration decision and execution guidance.",unchanged
https://learn.microsoft.com/en-us/azure/azure-functions/migration/migrate-aws-lambda-to-azure-functions,Migrate from AWS Lambda,Migrate AWS Lambda workloads to Azure Functions,Plan migration of AWS Lambda workloads to Azure Functions,Learn how to migrate workloads from AWS Lambda to Azure Functions. Compare functionality and optimize workloads on Azure.,Migrating a serverless workload that uses Amazon Web Services (AWS) Lambda to Azure Functions requires careful planning and implementation. This article provides essential guidance to help you:,2025-10-29T08:00:00.000Z,conceptual,decision-making,0.8,True,"Provides guidance comparing AWS Lambda and Azure Functions and how to migrate, including trade-offs and optimization on Azure, which is decision-making and migration-focused.",unchanged
https://learn.microsoft.com/en-us/azure/azure-functions/migration/migrate-plan-consumption-to-flex,Migrate Consumption plan apps to Flex Consumption,Migrate Consumption plan apps to Flex Consumption in Azure Functions,Migrate Azure Functions from Consumption to Flex plan,Learn how to migrate an existing function app in Azure running in an Azure Functions Consumption hosting plan to instead run in the Flex Consumption hosting plan.,"This article shows you how to migrate your existing function apps from the Consumption plan to the Flex Consumption plan. For most apps, this migration is straightforward and your code doesn't need to change. Important Support for hosting function apps on Linux in a Consumption plan retires on September 30, 2028. As of today, feature and language enhancements aren't being made to the Linux Consumption plan. 
-Follow this article to migrate your Consumption plan apps to instead run in the Flex Consump",2026-04-16T17:19:00.000Z,concept-article,deployment,0.7,True,"Migration between Consumption and Flex Consumption plans is a deployment/hosting concern. The article is specifically about moving existing apps between plans, which falls under deployment patterns and migration paths. It likely includes plan-specific requirements and constraints (for example Linux Consumption retirement date and guidance to move), which are product-specific deployment details.",updated
-https://learn.microsoft.com/en-us/azure/azure-functions/migration/scenario-migrate-linux-consumption-to-flex,Migrate Linux apps to Flex Consumption using Copilot,Quickstart: Migrate Linux Consumption apps to Flex Consumption using GitHub Copilot,Migrate Linux Consumption Functions to Flex Consumption,Use GitHub Copilot with Azure skills to interactively migrate your Linux function apps from the Consumption plan to the Flex Consumption plan.,"In this quickstart, use GitHub Copilot with the Azure skills plugin to interactively migrate your Linux function apps from the Consumption plan to the Flex Consumption plan. Copilot automates most of the migration, including assessment, app creation, configuration, deployment, and validation. Important This article demonstrates how to use Copilot to recreate an existing Linux Consumption app in a Flex Consumption plan. The Azure skill that Copilot uses to achieve the migration work is designed to wor",2026-04-16T17:19:00.000Z,quickstart,decision-making,0.65,True,"Migration scenario between specific Azure Functions plans typically includes plan-specific constraints, required configuration changes, and step ordering that are not generic knowledge. 
Even though it’s framed as a Copilot quickstart, such migration docs usually encode concrete prerequisites, supported/unsupported features, and required settings for Flex Consumption vs Linux Consumption, which directly guide when and how to move between plans (a decision-making scenario).",new
+Follow this article to migrate your Consumption plan apps to instead run in the Flex Consump",2026-04-16T17:19:00.000Z,concept-article,deployment,0.7,True,"Migration between Consumption and Flex Consumption plans is a deployment/hosting concern. The article is specifically about moving existing apps between plans, which falls under deployment patterns and migration paths. It likely includes plan-specific requirements and constraints (for example Linux Consumption retirement date and guidance to move), which are product-specific deployment details.",unchanged
+https://learn.microsoft.com/en-us/azure/azure-functions/migration/scenario-migrate-linux-consumption-to-flex,Migrate Linux apps to Flex Consumption using Copilot,Quickstart: Migrate Linux Consumption apps to Flex Consumption using GitHub Copilot,Migrate Linux Consumption Functions to Flex Consumption,Use GitHub Copilot with Azure skills to interactively migrate your Linux function apps from the Consumption plan to the Flex Consumption plan.,"In this quickstart, use GitHub Copilot with the Azure skills plugin to interactively migrate your Linux function apps from the Consumption plan to the Flex Consumption plan. Copilot automates most of the migration, including assessment, app creation, configuration, deployment, and validation. Important This article demonstrates how to use Copilot to recreate an existing Linux Consumption app in a Flex Consumption plan. 
The Azure skill that Copilot uses to achieve the migration work is designed to wor",2026-04-16T17:19:00.000Z,quickstart,decision-making,0.65,True,"Migration scenario between specific Azure Functions plans typically includes plan-specific constraints, required configuration changes, and step ordering that are not generic knowledge. Even though it’s framed as a Copilot quickstart, such migration docs usually encode concrete prerequisites, supported/unsupported features, and required settings for Flex Consumption vs Linux Consumption, which directly guide when and how to move between plans (a decision-making scenario).",unchanged
https://learn.microsoft.com/en-us/azure/azure-functions/monitor-functions,Monitor Azure Functions,Monitor Azure Functions,,Start here to learn how to monitor Azure Functions.,"This article describes: Note If you're already familiar with this service and/or Azure Monitor and just want to know how to analyze monitoring data, see the Analyze section near the end of this article. When you have critical applications and business processes that rely on Azure resources, you need to monitor and get alerts for your system. The Azure Monitor service collects and aggregates metrics and logs from every component of your system. Azure Monitor provides you with a view of availability",2025-12-08T08:00:00.000Z,concept-article,,0.4,False,High-level monitoring overview; mostly conceptual description of Azure Monitor and Functions monitoring without detailed config tables or error mappings.,unchanged
https://learn.microsoft.com/en-us/azure/azure-functions/monitor-functions-opentelemetry-distributed-tracing,OpenTelemetry distributed tracing,Monitor Azure Functions with OpenTelemetry distributed tracing,Configure OpenTelemetry distributed tracing for Azure Functions,"Learn how to implement OpenTelemetry distributed tracing across multiple function calls in your function app. 
See how to monitor function calls, track performance, and gain observability into your ser","This article demonstrates OpenTelemetry support in Azure Functions, which enables distributed tracing across multiple function calls by using integrated Application Insights and OpenTelemetry support. To help you get started, an Azure Developer CLI (azd) template is used to create your code project as well as the Azure deployment in which to run your app. In this tutorial, you use the azd tool to: The required Azure resources created by this template follow current best practices for secure and sca",2026-01-30T23:12:00.000Z,tutorial,configuration,0.7,True,"Demonstrates configuring OpenTelemetry and Application Insights for Functions, including tracing settings and instrumentation parameters.",unchanged
https://learn.microsoft.com/en-us/azure/azure-functions/monitor-functions-reference,Monitoring data reference,Monitoring data reference for Azure Functions,Reference for Azure Functions monitoring data schema,This article contains important reference material you need when you monitor Azure Functions.,This article contains all the monitoring reference information for this service. See Monitor Azure Functions for details on the data you can collect for Azure Functions and how to use it. 
See Monitor executions in Azure Functions for details on using Application Insights to collect and analyze log data from individual functions in your function app.,2025-10-30T05:16:00.000Z,reference,configuration,0.7,True,"Monitoring reference describes specific metrics, log categories, dimensions, and schema used for monitoring Azure Functions, which are product-specific configuration/telemetry details.",unchanged
diff --git a/products/azure-functions/report.md b/products/azure-functions/report.md
index 7e750cb8..42a61ae2 100644
--- a/products/azure-functions/report.md
+++ b/products/azure-functions/report.md
@@ -1,21 +1,21 @@
---
-generated_at: '2026-04-19'
+generated_at: '2026-04-26'
category_descriptions:
  configuration: 'Configuring Azure Functions apps: bindings, triggers, app/host settings, runtime versions, plans, networking, monitoring/telemetry, and local/Core Tools setup.'
-  decision-making: Guidance on choosing Functions hosting/runtime models, estimating
-    costs, and planning migrations (plans, runtimes, languages, AWS Lambda, Express.js,
-    Service Bus) for optimal architecture.
+  decision-making: Guidance on choosing Functions hosting/runtime models, comparing
+    costs and plans, and planning migrations (between runtimes, models, platforms
+    like AWS Lambda and Express.js).
  security: 'Securing Functions apps: encryption at rest, secure storage, keys/secrets, managed identity, private endpoints/VNet, Web PubSub, networking/access controls, and secure MCP hosting.'
  architecture-patterns: Running Functions in Linux containers, Durable Functions design with Azure Storage, and hosting Functions on Azure Container Apps for scalable, container-based architectures.
-  deployment: 'Deploying and hosting Azure Functions: provisioning plans (Consumption/Flex/Kubernetes),
-    containers, CI/CD (GitHub/Azure Pipelines), slots, zero‑downtime, and migration/deployment
-    automation.'
+ deployment: 'Deploying and hosting Azure Functions: provisioning plans (Consumption/Flex), + templates (Bicep/ARM/Terraform), containers/Kubernetes, CI/CD (GitHub/Azure Pipelines), + slots, and migration tasks.' integrations: Patterns and how-tos for wiring Functions to external systems (DBs, messaging, AI/OpenAI, Dapr, MCP, storage, HTTP) using triggers/bindings, plus integration with API Mgmt, Logic Apps, and on-prem. @@ -31,14 +31,14 @@ category_descriptions: skill_description: Expert knowledge for Azure Functions development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when - building HTTP/queue-triggered apps, Durable Functions, Linux/container hosting, - API Mgmt/Logic Apps, or Flex plans, and other Azure Functions related development - tasks. Not for Azure App Service (use azure-app-service), Azure Logic Apps (use - azure-logic-apps), Azure Container Apps (use azure-container-apps), Azure Kubernetes - Service (AKS) (use azure-kubernetes-service). -use_when: Use when building HTTP/queue-triggered apps, Durable Functions, Linux/container - hosting, API Mgmt/Logic Apps, or Flex plans, and other Azure Functions related development - tasks. + building HTTP/queue/event-triggered Functions, Durable workflows, containerized + Functions, or API Management/Logic Apps integrations, and other Azure Functions + related development tasks. Not for Azure App Service (use azure-app-service), Azure + Logic Apps (use azure-logic-apps), Azure Container Apps (use azure-container-apps), + Azure Kubernetes Service (AKS) (use azure-kubernetes-service). +use_when: Use when building HTTP/queue/event-triggered Functions, Durable workflows, + containerized Functions, or API Management/Logic Apps integrations, and other Azure + Functions related development tasks. 
confusable_not_for: Not for Azure App Service (use azure-app-service), Azure Logic Apps (use azure-logic-apps), Azure Container Apps (use azure-container-apps), Azure Kubernetes Service (AKS) (use azure-kubernetes-service). @@ -54,9 +54,9 @@ confusable_not_for: Not for Azure App Service (use azure-app-service), Azure Log - **Unclassified**: 52 ### Incremental Update -- **New Pages**: 1 -- **Updated Pages**: 14 -- **Unchanged**: 269 +- **New Pages**: 0 +- **Updated Pages**: 3 +- **Unchanged**: 281 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-functions/azure-functions.csv` @@ -77,40 +77,14 @@ confusable_not_for: Not for Azure App Service (use azure-app-service), Azure Log ## Changes -### New Pages - -- [Migrate Linux apps to Flex Consumption using Copilot](https://learn.microsoft.com/en-us/azure/azure-functions/migration/scenario-migrate-linux-consumption-to-flex) - ### Updated Pages -- [Connect to a Virtual Network](https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-vnet) - - Updated: 2026-02-25T08:00:00.000Z → 2026-04-17T06:16:00.000Z -- [Zone redundancy](https://learn.microsoft.com/en-us/azure/azure-functions/functions-zone-redundancy) - - Updated: 2026-03-13T11:12:00.000Z → 2026-04-16T22:31:00.000Z -- [Networking options](https://learn.microsoft.com/en-us/azure/azure-functions/functions-networking-options) - - Updated: 2026-03-03T08:00:00.000Z → 2026-04-17T06:16:00.000Z +- [Infrastructure as code](https://learn.microsoft.com/en-us/azure/azure-functions/functions-infrastructure-as-code) + - Updated: 2026-03-15T11:12:00.000Z → 2026-04-23T06:20:00.000Z - [Compare runtime versions](https://learn.microsoft.com/en-us/azure/azure-functions/functions-versions) - - Updated: 2026-03-19T08:00:00.000Z → 2026-04-12T11:12:00.000Z -- [Consumption plan (legacy)](https://learn.microsoft.com/en-us/azure/azure-functions/consumption-plan) - - Updated: 2026-03-25T11:19:00.000Z → 2026-04-17T22:08:00.000Z -- 
[Deployment options](https://learn.microsoft.com/en-us/azure/azure-functions/functions-deployment-technologies) - - Updated: 2026-01-15T08:00:00.000Z → 2026-04-12T08:00:00.000Z -- [Monitor function executions](https://learn.microsoft.com/en-us/azure/azure-functions/functions-monitoring) - - Updated: 2025-05-07T08:00:00.000Z → 2026-04-13T11:11:00.000Z -- [Migrate Consumption plan apps to Flex Consumption](https://learn.microsoft.com/en-us/azure/azure-functions/migration/migrate-plan-consumption-to-flex) - - Updated: 2026-01-22T06:12:00.000Z → 2026-04-16T17:19:00.000Z -- [Azure Functions Core Tools](https://learn.microsoft.com/en-us/azure/azure-functions/functions-core-tools-reference) - - Updated: 2026-03-06T17:59:00.000Z → 2026-04-12T11:12:00.000Z -- [Trigger](https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-cosmosdb-v2-trigger) - - Updated: 2025-12-21T08:00:00.000Z → 2026-04-14T22:21:00.000Z -- [Output](https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-cosmosdb-v2-output) - - Updated: 2023-10-05T08:00:00.000Z → 2025-09-22T22:32:00.000Z -- [Trigger](https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-event-hubs-trigger) - - Updated: 2023-09-11T21:49:00.000Z → 2024-06-10T17:07:00.000Z -- [Output](https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-event-hubs-output) - - Updated: 2024-04-26T06:07:00.000Z → 2025-06-22T05:19:00.000Z -- [Output](https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-service-bus-output) - - Updated: 2025-06-22T05:19:00.000Z → 2025-09-15T22:11:00.000Z + - Updated: 2026-04-12T11:12:00.000Z → 2026-04-17T22:08:00.000Z +- [Durable Functions](https://learn.microsoft.com/en-us/azure/azure-functions/durable-functions/durable-functions-overview) + - Updated: 2026-04-03T21:52:00.000Z → 2026-04-22T08:00:00.000Z ## Classified Pages @@ -246,6 +220,7 @@ confusable_not_for: Not for Azure App Service (use azure-app-service), Azure Log | 
[Use an outbound NAT gateway](https://learn.microsoft.com/en-us/azure/azure-functions/functions-how-to-use-nat-gateway) | configuration | 0.75 | Details how to route Functions outbound traffic through an Azure NAT gateway, including subnet and NAT configuration specific to Functions. | | [AZFD0012](https://learn.microsoft.com/en-us/azure/azure-functions/errors-diagnostics/diagnostic-events/azfd0012) | security | 0.74 | Event tied to StrictHISModeWarn/StrictHISModeEnabled feature flags in AzureWebJobsFeatureFlags, describing security-related configuration behavior. | | [Binding](https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-dapr-trigger) | integrations | 0.74 | Page focuses on Dapr input binding triggers and the specific Dapr events and binding configuration used by Azure Functions, matching the integrations pattern with concrete binding parameters. | +| [Compare runtime versions](https://learn.microsoft.com/en-us/azure/azure-functions/functions-versions) | decision-making | 0.74 | Page compares Azure Functions runtime versions with concrete support timelines, indicates which versions are currently supported or retired, and provides guidance on when to use each version. This is product-specific decision guidance (version selection and lifecycle) rather than just conceptual description, fitting the decision-making category. | | [Output](https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-event-hubs-output) | configuration | 0.74 | Event Hubs output binding docs for Azure Functions normally provide binding configuration tables (e.g., connection, eventHubName, partitionKey) and language-specific attribute/JSON settings with required/optional flags and defaults. This is expert, product-specific configuration knowledge, matching the configuration sub-skill. 
| | [Output](https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-service-bus-output) | configuration | 0.74 | Service Bus output binding documentation generally includes binding property tables (e.g., queueName, topicName, connection, entityType) and their valid values and behaviors across languages and Node.js models. These concrete configuration parameters and defaults are expert configuration details, not just conceptual usage. | | [Service Invocation](https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-dapr-trigger-svc-invoke) | integrations | 0.74 | Describes how Azure Functions is triggered by Dapr service invocation, including trigger configuration and binding parameters unique to this integration. | @@ -266,7 +241,6 @@ confusable_not_for: Not for Azure App Service (use azure-app-service), Azure Log | [Build and deploy using GitHub Actions](https://learn.microsoft.com/en-us/azure/azure-functions/functions-how-to-github-actions) | deployment | 0.70 | Provides a concrete GitHub Actions workflow pattern for Azure Functions, including YAML-based configuration and Azure-specific deployment steps, which are product-specific CI/CD details rather than generic deployment guidance. | | [Choose a file access strategy](https://learn.microsoft.com/en-us/azure/azure-functions/concept-file-access-options) | decision-making | 0.70 | Compares storage bindings, external databases, and Azure Files mounts with trade-offs and guidance on when to use each; includes plan support constraints (Linux only, not supported on Consumption) and scenario-based recommendations, fitting decision-making criteria. | | [Command line](https://learn.microsoft.com/en-us/azure/azure-functions/functions-add-output-binding-storage-queue-cli) | integrations | 0.70 | Shows how to add an Azure Storage queue output binding using CLI tools, with product-specific binding configuration details. 
| -| [Compare runtime versions](https://learn.microsoft.com/en-us/azure/azure-functions/functions-versions) | decision-making | 0.70 | Contains a comparison of Azure Functions runtime versions with support status, retirement information, and explicit guidance on when to use each version, enabling concrete version-selection decisions beyond generic knowledge. | | [Connect to a Virtual Network](https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-vnet) | security | 0.70 | A tutorial on integrating Azure Functions with a virtual network using private endpoints generally includes product-specific security configuration: exact resource types to create, required subnet and private endpoint settings, and sometimes specific RBAC roles or access restrictions for storage and Service Bus. These are concrete, Azure-specific security configuration steps rather than generic networking concepts, so it fits the security sub-skill. | | [Continuous deployment](https://learn.microsoft.com/en-us/azure/azure-functions/functions-continuous-deployment) | deployment | 0.70 | Describes Azure Functions–specific continuous deployment behavior, including recommendation to use staging slots instead of production and App Service integration details, which are product-specific deployment patterns. | | [Custom handlers](https://learn.microsoft.com/en-us/azure/azure-functions/functions-custom-handlers) | configuration | 0.70 | Custom handlers require specific host.json configuration, process startup commands, and HTTP contract details that are concrete configuration parameters unique to Azure Functions. | @@ -283,6 +257,7 @@ confusable_not_for: Not for Azure App Service (use azure-app-service), Azure Log | [Host MCP servers](https://learn.microsoft.com/en-us/azure/azure-functions/self-hosted-mcp-servers) | deployment | 0.70 | Explains two concrete hosting approaches for MCP servers on Functions, including supported SDKs and setup steps; product-specific deployment pattern. 
| | [Host MCP servers for AI-enabled functions](https://learn.microsoft.com/en-us/azure/azure-functions/functions-mcp-tutorial) | security | 0.70 | Covers configuring MCP server endpoints and built-in authentication/authorization for Functions, including endpoint auth settings and security options. | | [How to connect to services](https://learn.microsoft.com/en-us/azure/azure-functions/add-bindings-existing-function) | configuration | 0.70 | Shows how to define bindings in function.json or code with specific properties and values, which are concrete configuration parameters for integrating with other services. | +| [Infrastructure as code](https://learn.microsoft.com/en-us/azure/azure-functions/functions-infrastructure-as-code) | deployment | 0.70 | The article goes beyond a basic tutorial and describes product-specific deployment automation for Azure Functions using Bicep/ARM, including how to structure and parameterize templates for function apps and related resources. This is deployment-focused IaC guidance that an LLM is unlikely to fully infer from general knowledge, but it does not center on limits, security, or configuration matrices. | | [JavaScript](https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-node) | best-practices | 0.70 | Node.js developer reference includes product-specific recommendations, binding usage patterns, and configuration examples (like context bindings, async patterns) that go beyond generic Node.js knowledge. | | [Language support policy](https://learn.microsoft.com/en-us/azure/azure-functions/language-support-policy) | decision-making | 0.70 | Language support policy pages typically include version-specific support windows, retirement dates, and matrix-style tables per language/runtime that change over time and aren't inferable from training data. This is expert, time-sensitive guidance used to decide which language/runtime to choose. 
| | [Linux container (Premium)](https://learn.microsoft.com/en-us/azure/azure-functions/functions-deploy-container) | deployment | 0.70 | Describes requirements for Premium vs Dedicated plans, container registry usage, and plan-specific constraints for containerized Functions, which are deployment-specific details. | @@ -331,7 +306,6 @@ confusable_not_for: Not for Azure App Service (use azure-app-service), Azure Log | [Containerized functions](https://learn.microsoft.com/en-us/azure/azure-functions/container-concepts) | architecture-patterns | 0.65 | Describes container-based deployment options and patterns specific to Azure Functions (built-in vs custom containers) and when to use each; product-specific architecture guidance. | | [Developer reference guide](https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-python) | best-practices | 0.65 | Python developer reference includes trigger/binding usage, worker behaviors, and configuration specifics (like function signatures and decorators) unique to Azure Functions. | | [IP addresses](https://learn.microsoft.com/en-us/azure/azure-functions/ip-addresses) | configuration | 0.65 | Explains how inbound/outbound IPs are assigned, how to retrieve them, and when they change—product-specific behavior and configuration details not captured by generic knowledge. | -| [Infrastructure as code](https://learn.microsoft.com/en-us/azure/azure-functions/functions-infrastructure-as-code) | deployment | 0.65 | The article focuses on automating deployment of function apps and related resources using Bicep/ARM, including deployment configurations for production and CI/CD. This is product-specific deployment automation guidance rather than a generic tutorial, fitting the deployment sub-skill. 
| | [Java](https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-java) | best-practices | 0.65 | Java reference typically includes function signatures, annotation usage, and configuration details unique to Azure Functions’ Java worker, which are concrete coding patterns. | | [Migrate .NET apps to the isolated model](https://learn.microsoft.com/en-us/azure/azure-functions/migrate-dotnet-to-isolated-model) | decision-making | 0.65 | Provides a prescriptive migration guide from the in-process to isolated worker model for .NET Azure Functions, including time-bound support information, concrete steps, and model-specific considerations. This is expert, product-specific guidance for choosing and moving between execution models rather than a generic overview. | | [Migrate Linux apps to Flex Consumption using Copilot](https://learn.microsoft.com/en-us/azure/azure-functions/migration/scenario-migrate-linux-consumption-to-flex) | decision-making | 0.65 | Migration scenario between specific Azure Functions plans typically includes plan-specific constraints, required configuration changes, and step ordering that are not generic knowledge. Even though it’s framed as a Copilot quickstart, such migration docs usually encode concrete prerequisites, supported/unsupported features, and required settings for Flex Consumption vs Linux Consumption, which directly guide when and how to move between plans (a decision-making scenario). | @@ -396,12 +370,12 @@ confusable_not_for: Not for Azure App Service (use azure-app-service), Azure Log | [Core Tools development](https://learn.microsoft.com/en-us/azure/azure-functions/functions-run-local) | 0.20 | Primarily a how-to/tutorial for running Azure Functions locally with Core Tools. It focuses on workflow steps (create project, run, debug, deploy) rather than product-specific limits, configuration matrices, or detailed parameter tables. 
It does not appear to contain numeric limits, tier-specific constraints, RBAC role lists, or structured troubleshooting mappings with error codes. | | [Create functions in containers](https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-container-registry) | 0.20 | Tutorial-style get-started article for creating and publishing an Azure Functions Linux container image using Core Tools and a container registry. It focuses on step-by-step commands and workflow, without configuration parameter tables, limits/quotas, error-code-based troubleshooting, or product-specific decision matrices. The content is generic deployment/how-to guidance that an LLM can already approximate from training, not expert-only reference details. | | [Debug local PowerShell functions](https://learn.microsoft.com/en-us/azure/azure-functions/functions-debug-powershell-local) | 0.20 | Local debugging guidance for PowerShell Functions using standard tools; no product-specific error codes, config matrices, or limits. | -| [Durable Functions](https://learn.microsoft.com/en-us/azure/azure-functions/durable-functions/durable-functions-overview) | 0.20 | High-level overview of Durable Functions concepts (orchestrator, activity, entity functions) without product-specific limits, configuration tables, error codes, or decision matrices. | | [Isolated worker model](https://learn.microsoft.com/en-us/azure/azure-functions/dotnet-isolated-process-guide) | 0.20 | Primarily an introduction to the .NET isolated worker model for Azure Functions and links to other content. The summary indicates conceptual overview and getting-started guidance, without mention of specific configuration tables, limits, error codes, or decision matrices. 
| | [Process real-time events](https://learn.microsoft.com/en-us/azure/azure-functions/scenario-real-time-events-processing) | 0.20 | Quickstart/tutorial for real-time event processing with Event Hubs trigger and Flex Consumption plan; describes workflow and best practices at a high level but lacks specific limits, configuration matrices, or error-code-based troubleshooting. | | [Respond to blob storage events](https://learn.microsoft.com/en-us/azure/azure-functions/scenario-blob-storage-events) | 0.20 | Quickstart/tutorial for responding to Blob Storage events using Azure Functions and azd; focuses on step-by-step deployment and basic usage, without detailed configuration parameter tables, limits, or troubleshooting mappings. | | [AI-enabled functions](https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-ai-enabled-apps) | 0.10 | Survey of AI-related scenarios and integrations for Azure Functions; appears conceptual and scenario-oriented without detailed config tables, limits, or troubleshooting mappings. | | [About triggers and bindings](https://learn.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings) | 0.10 | Page is a high-level conceptual overview of Azure Functions triggers and bindings without detailed limits, configuration tables, error codes, or product-specific numeric thresholds. | +| [Durable Functions](https://learn.microsoft.com/en-us/azure/azure-functions/durable-functions/durable-functions-overview) | 0.10 | Page is a high-level conceptual overview of Durable Functions (what it is, basic concepts like orchestrator/activity/entity functions). It does not provide specific limits, configuration tables, error codes, decision matrices, or other detailed product-specific expert knowledge as defined by the sub-skill types. 
| | [Functions overview](https://learn.microsoft.com/en-us/azure/azure-functions/functions-overview) | 0.10 | High-level overview of Azure Functions; no specific limits, configuration tables, error codes, or product-specific decision matrices. | | [Get started](https://learn.microsoft.com/en-us/azure/azure-functions/functions-get-started) | 0.10 | Getting-started navigation article pointing to other content; lacks detailed configuration, limits, or troubleshooting data. | | [Scenarios](https://learn.microsoft.com/en-us/azure/azure-functions/functions-scenarios) | 0.10 | Scenario descriptions for using Azure Functions; does not include numeric limits, config parameters, or troubleshooting mappings. | diff --git a/products/azure-health-data-services/azure-health-data-services.csv b/products/azure-health-data-services/azure-health-data-services.csv index ad51f2aa..ad4a9fd6 100644 --- a/products/azure-health-data-services/azure-health-data-services.csv +++ b/products/azure-health-data-services/azure-health-data-services.csv @@ -48,7 +48,7 @@ https://learn.microsoft.com/en-us/azure/healthcare-apis/azure-api-for-fhir/regis https://learn.microsoft.com/en-us/azure/healthcare-apis/azure-api-for-fhir/register-public-azure-ad-client-app,Public client application,Register a public client app in Microsoft Entra ID - Azure API for FHIR,Register public client apps for Azure API for FHIR,"This article explains how to register a public client application in Microsoft Entra ID, in preparation for deploying FHIR API in Azure.","Important Azure API for FHIR will be retired on September 30, 2026.Follow themigration strategiesto transition toAzure Health Data Services FHIR® serviceby that date. Due to the retirement of Azure API for FHIR, new customer deployments won't be allowed beginning April 1, 2025.Azure Health Data Services FHIR serviceis the evolved version of Azure API for FHIR that enables customers to manageFHIRandDICOMservices with integrations into other Azure services. 
In this article, you learn how to regist",2025-12-05T08:00:00.000Z,article,security,0.8,True,"Explains public client registration with specific configuration for authenticating to Azure API for FHIR (scopes, redirect URIs, platform settings).",unchanged https://learn.microsoft.com/en-us/azure/healthcare-apis/azure-api-for-fhir/register-resource-azure-ad-client-app,Resource application,Register a resource app in Microsoft Entra ID - Azure API for FHIR,Register resource (API) app for Azure API for FHIR,"Register a resource (or API) app in Microsoft Entra ID, so that client applications can request access to the resource when authenticating.","Important Azure API for FHIR will be retired on September 30, 2026. Follow the migration strategies to transition to Azure Health Data Services FHIR® service by that date. Due to the retirement of Azure API for FHIR, new customer deployments won't be allowed beginning April 1, 2025. Azure Health Data Services FHIR service is the evolved version of Azure API for FHIR that enables customers to manage FHIR and DICOM services with integrations into other Azure services. In this article, you learn how to regist",2025-12-05T08:00:00.000Z,article,security,0.8,True,Covers registering a resource/API app with specific exposed scopes and permissions required for FHIR data plane access.,unchanged https://learn.microsoft.com/en-us/azure/healthcare-apis/azure-api-for-fhir/register-service-azure-ad-client-app,Service client application,Register a service app in Microsoft Entra ID - Azure API for FHIR,Register service client apps for Azure API for FHIR,Learn how to register a service client application in Microsoft Entra ID.,"Important Azure API for FHIR will be retired on September 30, 2026. Follow the migration strategies to transition to Azure Health Data Services FHIR® service by that date.
Due to the retirement of Azure API for FHIR, new customer deployments won't be allowed beginning April 1, 2025. Azure Health Data Services FHIR service is the evolved version of Azure API for FHIR that enables customers to manage FHIR and DICOM services with integrations into other Azure services. In this article, you learn how to regist",2025-12-05T08:00:00.000Z,article,security,0.8,True,Describes service client registration (daemon/service principal) with required permissions and configuration to access FHIR resources.,unchanged -https://learn.microsoft.com/en-us/azure/healthcare-apis/azure-api-for-fhir/release-notes-2026,Release notes,Azure API for FHIR monthly releases 2026,Plan migration from Azure API for FHIR retirement,This article provides details about the Azure API for FHIR monthly features and enhancements in 2026.,"Important Azure API for FHIR will be retired on September 30, 2026. Follow the migration strategies to transition to Azure Health Data Services FHIR® service by that date. Due to the retirement of Azure API for FHIR, new customer deployments won't be allowed beginning April 1, 2025. Azure Health Data Services FHIR service is the evolved version of Azure API for FHIR that enables customers to manage FHIR and DICOM services with integrations into other Azure services. Azure API for FHIR® provides a fully man",2026-04-12T06:11:00.000Z,reference,decision-making,0.7,True,"Release notes include concrete retirement dates and deployment cutoffs (for example, no new deployments after April 1, 2025 and full retirement on September 30, 2026), which are time-bound product decisions not inferable from general training data.
These details guide when and how to move to Azure Health Data Services FHIR service, fitting decision-making around migration and lifecycle planning.",updated +https://learn.microsoft.com/en-us/azure/healthcare-apis/azure-api-for-fhir/release-notes-2026,Release notes,Azure API for FHIR monthly releases 2026,Plan migration from Azure API for FHIR retirement,This article provides details about the Azure API for FHIR monthly features and enhancements in 2026.,"Important Azure API for FHIR will be retired on September 30, 2026. Follow the migration strategies to transition to Azure Health Data Services FHIR® service by that date. Due to the retirement of Azure API for FHIR, new customer deployments won't be allowed beginning April 1, 2025. Azure Health Data Services FHIR service is the evolved version of Azure API for FHIR that enables customers to manage FHIR and DICOM services with integrations into other Azure services. Azure API for FHIR® provides a fully man",2026-04-12T06:11:00.000Z,reference,decision-making,0.7,True,"Release notes include concrete retirement dates and deployment cutoffs (for example, no new deployments after April 1, 2025 and full retirement on September 30, 2026), which are time-bound product decisions not inferable from general training data. These details guide when and how to move to Azure Health Data Services FHIR service, fitting decision-making around migration and lifecycle planning.",unchanged https://learn.microsoft.com/en-us/azure/healthcare-apis/azure-api-for-fhir/search-samples,Search examples,Search examples for Azure API for FHIR,Use FHIR search examples with Azure API for FHIR,"How to search using different search parameters, modifiers, and other FHIR search tools","Important Azure API for FHIR will be retired on September 30, 2026. Follow the migration strategies to transition to Azure Health Data Services FHIR® service by that date.
Due to the retirement of Azure API for FHIR, new customer deployments won't be allowed beginning April 1, 2025. Azure Health Data Services FHIR service is the evolved version of Azure API for FHIR that enables customers to manage FHIR and DICOM services with integrations into other Azure services. The following are examples of using Fast",2024-10-08T05:34:00.000Z,reference,integrations,0.7,True,"Provides concrete HTTP query examples with parameters, modifiers, and behavior specific to Azure API for FHIR’s search implementation.",unchanged https://learn.microsoft.com/en-us/azure/healthcare-apis/azure-api-for-fhir/security-controls-policy,Security controls by Azure policy,Azure Policy Regulatory Compliance controls for Azure API for FHIR,Apply regulatory compliance policies to Azure API for FHIR,Lists Azure Policy Regulatory Compliance controls available for Azure API for FHIR. These built-in policy definitions provide common approaches to managing the compliance of your Azure resources.,"Important Azure API for FHIR will be retired on September 30, 2026. Follow the migration strategies to transition to Azure Health Data Services FHIR® service by that date. Due to the retirement of Azure API for FHIR, new customer deployments won't be allowed beginning April 1, 2025. Azure Health Data Services FHIR service is the evolved version of Azure API for FHIR that enables customers to manage FHIR and DICOM services with integrations into other Azure services.
Regulatory Compliance in Azure Policy pro",2025-12-05T08:00:00.000Z,sample,security,0.75,True,Lists Azure Policy regulatory compliance controls and built-in definitions; these are product-specific security/compliance configurations.,unchanged https://learn.microsoft.com/en-us/azure/healthcare-apis/azure-api-for-fhir/smart-on-fhir,SMART on FHIR,SMART on FHIR - Azure API for FHIR,Enable SMART on FHIR apps with Azure API for FHIR,This tutorial describes how to enable SMART on FHIR applications with the Azure API for FHIR.,"Important Azure API for FHIR will be retired on September 30, 2026. Follow the migration strategies to transition to Azure Health Data Services FHIR® service by that date. Due to the retirement of Azure API for FHIR, new customer deployments won't be allowed beginning April 1, 2025. Azure Health Data Services FHIR service is the evolved version of Azure API for FHIR that enables customers to manage FHIR and DICOM services with integrations into other Azure services. Substitutable Medical Applications and R",2025-12-05T08:00:00.000Z,tutorial,integrations,0.7,True,"Describes enabling SMART on FHIR, including specific authorization endpoints, scopes, and configuration parameters unique to Azure API for FHIR.",unchanged @@ -186,13 +186,13 @@ https://learn.microsoft.com/en-us/azure/healthcare-apis/fhir/use-custom-headers- https://learn.microsoft.com/en-us/azure/healthcare-apis/fhir/using-curl,Access using cURL,Access Azure Health Data Services with cURL,Access Azure Health Data Services APIs with cURL,This article explains how to access Azure Health Data Services with cURL,"In this article, you learn how to access Azure Health Data Services with cURL.",2025-10-10T08:00:00.000Z,tutorial,integrations,0.7,True,"Provides concrete cURL command patterns, authentication headers, and endpoint usage specific to Azure Health Data Services, beyond generic HTTP usage.",unchanged https://learn.microsoft.com/en-us/azure/healthcare-apis/fhir/using-rest-client,Access using REST
Client,Access Azure Health Data Services using REST Client,Access Azure Health Data Services using VS Code REST Client,This article explains how to access the Healthcare APIs using the REST Client extension in VS Code,"In this article, you learn how to access Azure Health Data Services using REST Client extension in Visual Studio Code.",2025-10-10T08:00:00.000Z,tutorial,integrations,0.7,True,"Shows how to configure and use the REST Client extension with this service, including environment variables, auth headers, and request templates specific to the APIs.",unchanged https://learn.microsoft.com/en-us/azure/healthcare-apis/fhir/validation-against-profiles,Use $validate,Validate FHIR resources against profiles in Azure Health Data Services,Validate FHIR resources against stored profiles,This article describes how to validate FHIR resources against profiles in the FHIR service.,"In the store profiles in the FHIR® service article, you walked through the basics of FHIR profiles and storing them. The FHIR service in Azure Health Data Services allows validating resources against profiles to see if the resources conform to the profiles. This article guides you through how to use $validate for validating resources against profiles.
$validate is an operation in Fast Healthcare Interoperability Resources (FHIR) that allows you to ensure that a FHIR resource conforms to the base reso",2025-10-10T08:00:00.000Z,reference,configuration,0.78,True,"Describes using $validate in this FHIR service, including operation parameters, profile references, and validation behavior specific to the platform.",unchanged -https://learn.microsoft.com/en-us/azure/healthcare-apis/get-access-token,Get an access token,Get an access token to use the FHIR service or the DICOM service,Obtain access tokens for FHIR and DICOM services,Learn how to get an access token to access the FHIR service or the DICOM service using Azure CLI or PowerShell.,"To use the FHIR® service or the DICOM® service, you need an access token that verifies your identity and permissions to the service. You can get an access token by using PowerShell or the Azure Command-Line Interface (CLI). These tools help you create and manage Azure resources. Manage the permissions for users and applications to access FHIR or the DICOM services through role assignments from the Azure portal or by using scripts.",2026-04-14T06:14:00.000Z,how-to,security,0.7,True,"Describes concrete steps and commands to get access tokens and manage permissions via role assignments for FHIR/DICOM services. This is product-specific authentication and authorization configuration, matching the security category.",updated +https://learn.microsoft.com/en-us/azure/healthcare-apis/get-access-token,Get an access token,Get an access token to use the FHIR service or the DICOM service,Obtain access tokens for FHIR and DICOM services,Learn how to get an access token to access the FHIR service or the DICOM service using Azure CLI or PowerShell.,"To use the FHIR® service or the DICOM® service, you need an access token that verifies your identity and permissions to the service. You can get an access token by using PowerShell or the Azure Command-Line Interface (CLI).
These tools help you create and manage Azure resources. Manage the permissions for users and applications to access FHIR or the DICOM services through role assignments from the Azure portal or by using scripts.",2026-04-14T06:14:00.000Z,how-to,security,0.7,True,"Describes concrete steps and commands to get access tokens and manage permissions via role assignments for FHIR/DICOM services. This is product-specific authentication and authorization configuration, matching the security category.",unchanged https://learn.microsoft.com/en-us/azure/healthcare-apis/github-projects,GitHub projects,Related GitHub Projects for Azure Health Data Services,,Lists all Open Source (GitHub) repositories,We have many open-source projects on GitHub that provide you with the source code and instructions to deploy services for various uses. You're always welcome to visit our GitHub repositories to learn and experiment with our features and products.,2026-02-25T08:00:00.000Z,reference,,0.0,False,List of related GitHub projects; navigation/links page without embedded technical reference content.,unchanged https://learn.microsoft.com/en-us/azure/healthcare-apis/health-data-services-get-started,Deployment overview,Introduction to Azure Health Data Services,,"Learn how Azure Health Data Services empowers healthcare organizations to manage data securely, support interoperability, and enable analytics.","Azure Health Data Services offers a suite of technologies that empower healthcare organizations to manage health data securely and compliantly.
Azure Health Data Services streamlines the handling of various health data types, supports interoperability, and enables advanced analytics.",2026-02-25T08:00:00.000Z,conceptual,,0.1,False,"Introductory/marketing-style overview of Azure Health Data Services; lacks concrete configuration values, limits, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/healthcare-apis/healthcare-apis-faqs,FAQ,FAQs about Azure Health Data Services,,This document provides answers to the frequently asked questions about Azure Health Data Services.,These are some of the frequently asked questions for the Azure Health Data Services.,2026-02-27T06:11:00.000Z,reference,,0.4,False,"FAQ page; likely mixed conceptual and practical Q&A but summary does not indicate structured error-code mappings, config tables, or numeric limits.",unchanged https://learn.microsoft.com/en-us/azure/healthcare-apis/healthcare-apis-overview,What is Azure Health Data Services?,What is Azure Health Data Services?,,This article is an overview of Azure Health Data Services.,"Azure Health Data Services is a cloud-based solution that helps you collect, store, and analyze health data from different sources and formats. It supports various healthcare standards, such as FHIR® and DICOM®, and converts data from legacy or proprietary device formats to FHIR. 
Azure Health Data Services enables you to: Designed to meet your current health data needs and built to adapt to future developments, Azure Health Data Services is a powerful and flexible solution that is always evolvin",2026-02-23T08:00:00.000Z,overview,,0.1,False,"High-level product overview of Azure Health Data Services without numeric limits, configuration tables, or detailed error/security specifics.",unchanged https://learn.microsoft.com/en-us/azure/healthcare-apis/healthcare-apis-quickstart,Azure Health Data Services quickstart,Azure Health Data Services quickstart,,Learn how to create a workspace for Azure Health Data Services by using the Azure portal. The workspace is a centralized logical container for instances of the FHIR service and DICOM service.,Follow the steps in this article to create a workspace before you deploy instances of Azure Health Data Services in the Azure portal. The workspace is a centralized logical container for Azure Health Data services such as FHIR® and DICOM® services. It allows you to organize and manage configuration settings that are shared among all the underlying datasets and services.,2026-02-25T08:00:00.000Z,quickstart,,0.2,False,Quickstart for creating a workspace via portal; primarily step-by-step tutorial without detailed configuration parameter tables or product-specific constraints.,unchanged -https://learn.microsoft.com/en-us/azure/healthcare-apis/known-issues,Known issues,Azure Health Data Services known issues,Review known issues for Azure Health Data Services,Learn about the known issues of Azure Health Data Services.,"This article describes known issues with Azure Health Data Services, which includes the FHIR® and DICOM®. Refer to the table for details about resolution dates or possible workarounds.",2026-04-16T06:12:00.000Z,reference,troubleshooting,0.7,True,"Known issues pages typically list specific symptoms, causes, and workarounds with dates. 
These are highly product-specific troubleshooting details that an LLM will not know from training.",updated +https://learn.microsoft.com/en-us/azure/healthcare-apis/known-issues,Known issues,Azure Health Data Services known issues,Review known issues for Azure Health Data Services,Learn about the known issues of Azure Health Data Services.,"This article describes known issues with Azure Health Data Services, which includes the FHIR® and DICOM®. Refer to the table for details about resolution dates or possible workarounds.",2026-04-16T06:12:00.000Z,reference,troubleshooting,0.7,True,"Known issues pages typically list specific symptoms, causes, and workarounds with dates. These are highly product-specific troubleshooting details that an LLM will not know from training.",unchanged https://learn.microsoft.com/en-us/azure/healthcare-apis/logging,Enable logging,Logging for Azure Health Data Services,,Learn to monitor Azure Health Data Services with AuditLogs for secure healthcare service trails and operational insights. Discover log types and uses.,"While activity logs are available for each Azure resource from the Azure portal, Azure Health Data Services emits resource logs, which include two categories of logs: AuditLogs and DiagnosticLogs. 
Here's an example of the AuditLog:",2026-02-25T08:00:00.000Z,tutorial,,0.3,False,"Describes logging, AuditLogs, and DiagnosticLogs conceptually; summary does not indicate specific log schema tables, error codes, or configuration parameter references.",unchanged https://learn.microsoft.com/en-us/azure/healthcare-apis/network-access-security,Manage network access security,Manage network access security in Azure Health Data Services,Configure network access security for Health Data Services,Learn about network access security and outbound connections for the FHIR and DICOM services in Azure Health Data Services.,Azure Health Data Services provides multiple options for securing network access to its features and for managing outbound connections made by the FHIR® or DICOM®.,2026-02-25T08:00:00.000Z,concept-article,security,0.7,True,"Focuses on securing network access and outbound connections for FHIR and DICOM; likely includes product-specific network security options and settings (firewall, private endpoints, allowed origins) that qualify as security configuration details.",unchanged https://learn.microsoft.com/en-us/azure/healthcare-apis/register-application,Register an application via Azure portal,Register a client application in Microsoft Entra ID for the Azure Health Data Services,Register Entra client app for Azure Health Data Services,"Learn how to register a client application in Microsoft Entra ID for Azure Health Data Services. Add secrets, certificates, and API permissions to enable secure access.","In this article, you learn how to register a client application in Microsoft Entra ID to access Azure Health Data Services. When you register a client application, you can authenticate and securely connect to FHIR and DICOM services. 
For more information, see Register an application with the Microsoft identity platform.",2026-03-28T06:12:00.000Z,tutorial,security,0.7,True,"Covers registering a client application in Microsoft Entra ID specifically for Azure Health Data Services, including adding secrets, certificates, and API permissions. This implies product-specific permission scopes and configuration steps for FHIR and DICOM access, which fall under security (auth configuration and access control) and are detailed, product-specific settings.",unchanged @@ -202,6 +202,6 @@ https://learn.microsoft.com/en-us/azure/healthcare-apis/release-notes-2022,2022, https://learn.microsoft.com/en-us/azure/healthcare-apis/release-notes-2023,2023,Release notes for 2023 Azure Health Data Services monthly releases,,"2023 - Find out about features and improvements introduced in 2023 for the FHIR, DICOM, and MedTech services in Azure Health Data Services. Review the monthly release notes and learn how to get the mo","This article describes features, enhancements, and bug fixes released in 2023 for the FHIR® service, DICOM® service, and MedTech service in Azure Health Data Services.",2024-03-14T11:16:00.000Z,reference,,0.3,False,Release notes for 2023; change log content rather than structured expert-knowledge reference material.,unchanged https://learn.microsoft.com/en-us/azure/healthcare-apis/release-notes-2024,2024,Release notes for 2024 Azure Health Data Services monthly releases,,"2024 - Stay updated with the latest features and improvements for the FHIR, DICOM, and MedTech services in Azure Health Data Services in 2024.
Read the monthly release notes and learn how to get the m","This article describes features, enhancements, and bug fixes released in 2024 for the FHIR® service, DICOM® service, and MedTech service in Azure Health Data Services.",2025-05-29T22:04:00.000Z,reference,,0.3,False,"Release notes for 2024; similar to other release notes, not a stable reference for limits, configuration, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/healthcare-apis/release-notes-2025,2025,Release notes for 2025 Azure Health Data Services monthly releases,,"2025 - Stay updated with the latest features and improvements for the FHIR, DICOM, and MedTech services in Azure Health Data Services in 2025. Read the monthly release notes and learn how to get the m","Release notes describe features, enhancements, and bug fixes released in 2025 for the FHIR® service, Azure API for FHIR, DICOM® service, and MedTech service in Azure Health Data Services.",2025-12-12T18:11:00.000Z,reference,,0.3,False,"Release notes for 2025; primarily feature and bug-fix history, not structured limits, configuration, or troubleshooting guidance.",unchanged -https://learn.microsoft.com/en-us/azure/healthcare-apis/release-notes-2026,2026,Release notes for 2026 Azure Health Data Services monthly releases,,"2026 - Stay updated with the latest features and improvements for the FHIR, DICOM, and MedTech services in Azure Health Data Services in 2026. Read the monthly release notes and learn how to get the m","Release notes describe features, enhancements, and bug fixes released in 2026 for the FHIR® service and DICOM® service in Azure Health Data Services.",2026-04-13T22:10:00.000Z,reference,,0.0,False,"Release notes summary only; no specific fixes, configuration details, or error codes are provided in the description. 
Without concrete technical content, it does not meet the expert knowledge criteria.",updated -https://learn.microsoft.com/en-us/azure/healthcare-apis/services-features-regional-availability,Regional availability of services & features,Azure Health Data Services regional availability of services and features,Check Azure Health Data Services regional availability,This article is an overview of Azure Health Data Services regional availability of services and features. The availability of Azure Health Data Services services and features can vary by region.,"This article provides an overview of Azure Health Data Services regional availability of services and features. The availability of Azure Health Data Services services and features can vary by region, including Azure API for FHIR. For example, the Fast Healthcare Interoperability Resources (FHIR®) may be available in a region, but the Digital Imaging and Communications in Medicine (DICOM®) service may not. For information about general Microsoft product availability, refer to Products available by",2026-04-16T06:12:00.000Z,overview,deployment,0.65,True,"Regional availability matrices for specific services/features are product- and region-specific details that change over time and are not reliably known from training data. This is effectively a deployment constraint by region, fitting deployment (platform/tier support matrix by region).",updated +https://learn.microsoft.com/en-us/azure/healthcare-apis/release-notes-2026,2026,Release notes for 2026 Azure Health Data Services monthly releases,,"2026 - Stay updated with the latest features and improvements for the FHIR, DICOM, and MedTech services in Azure Health Data Services in 2026.
Read the monthly release notes and learn how to get the m","Release notes describe features, enhancements, and bug fixes released in 2026 for the FHIR® service and DICOM® service in Azure Health Data Services.",2026-04-22T19:06:00.000Z,reference,,0.2,False,"Monthly release notes typically list new features and fixes but not structured limits, configuration matrices, troubleshooting mappings, or decision criteria as defined by the sub-skill types. Without evidence of specific limits, config tables, error-code mappings, or decision matrices, this is general update information rather than reusable expert-knowledge patterns.",updated +https://learn.microsoft.com/en-us/azure/healthcare-apis/services-features-regional-availability,Regional availability of services & features,Azure Health Data Services regional availability of services and features,Check Azure Health Data Services regional availability,This article is an overview of Azure Health Data Services regional availability of services and features. The availability of Azure Health Data Services services and features can vary by region.,"This article provides an overview of Azure Health Data Services regional availability of services and features. The availability of Azure Health Data Services services and features can vary by region, including Azure API for FHIR. For example, the Fast Healthcare Interoperability Resources (FHIR®) may be available in a region, but the Digital Imaging and Communications in Medicine (DICOM®) service may not. For information about general Microsoft product availability, refer to Products available by",2026-04-16T06:12:00.000Z,overview,deployment,0.65,True,"Regional availability matrices for specific services/features are product- and region-specific details that change over time and are not reliably known from training data.
This is effectively a deployment constraint by region, fitting deployment (platform/tier support matrix by region).",unchanged https://learn.microsoft.com/en-us/azure/healthcare-apis/workspace-overview,Workspace overview,What is the workspace? - Azure Health Data Services,,This article describes an overview of the Azure Health Data Services workspace.,"The Azure Health Data Services workspace is a logical container for all your healthcare service instances such as Fast Healthcare Interoperability Resources (FHIR®) and Digital Imaging and Communications in Medicine (DICOM®) services. The workspace also creates a compliance boundary (HIPAA, HITRUST) within which protected health information can travel. You can provision multiple data services within a workspace, and by design, they work seamlessly with one another. With the workspace, you can or",2026-02-25T08:00:00.000Z,overview,,0.1,False,"High-level overview of Azure Health Data Services workspace; no specific limits, configuration tables, security roles, or troubleshooting details.",unchanged diff --git a/products/azure-health-data-services/report.md b/products/azure-health-data-services/report.md index 6e41a8d6..ccb5a3fd 100644 --- a/products/azure-health-data-services/report.md +++ b/products/azure-health-data-services/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: integrations: Coding patterns and integration guides for FHIR, DICOM, MedTech, and de-identification APIs, including REST/SDK usage, bulk ops, data pipelines, and @@ -52,8 +52,8 @@ confusable_not_for: Not for Azure Health Bot (use azure-health-bot), Azure API M ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 5 -- **Unchanged**: 197 +- **Updated Pages**: 1 +- **Unchanged**: 201 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-health-data-services/azure-health-data-services.csv` @@ -76,16 +76,8 @@ confusable_not_for: Not for Azure 
Health Bot (use azure-health-bot), Azure API M ### Updated Pages -- [Regional availability of services & features](https://learn.microsoft.com/en-us/azure/healthcare-apis/services-features-regional-availability) - - Updated: 2025-09-08T22:37:00.000Z → 2026-04-16T06:12:00.000Z -- [Get an access token](https://learn.microsoft.com/en-us/azure/healthcare-apis/get-access-token) - - Updated: 2025-06-02T08:00:00.000Z → 2026-04-14T06:14:00.000Z - [2026](https://learn.microsoft.com/en-us/azure/healthcare-apis/release-notes-2026) - - Updated: 2026-04-09T17:25:00.000Z → 2026-04-13T22:10:00.000Z -- [Known issues](https://learn.microsoft.com/en-us/azure/healthcare-apis/known-issues) - - Updated: 2026-02-24T23:11:00.000Z → 2026-04-16T06:12:00.000Z -- [Release notes](https://learn.microsoft.com/en-us/azure/healthcare-apis/azure-api-for-fhir/release-notes-2026) - - Updated: 2026-03-26T22:23:00.000Z → 2026-04-12T06:11:00.000Z + - Updated: 2026-04-13T22:10:00.000Z → 2026-04-22T19:06:00.000Z ## Classified Pages @@ -283,6 +275,7 @@ confusable_not_for: Not for Azure Health Bot (use azure-health-bot), Azure API M | [Reliability](https://learn.microsoft.com/en-us/azure/healthcare-apis/deidentification/reliability-health-data-services-deidentification) | 0.30 | High-level reliability guidance (active-active, Azure Front Door) without evidence of specific limits, configuration parameter tables, or decision matrices that meet any sub-skill criteria. | | [Transparency Note](https://learn.microsoft.com/en-us/azure/healthcare-apis/deidentification/transparency-note) | 0.30 | Transparency note is conceptual/responsible AI oriented; lacks concrete configuration parameters, limits, or troubleshooting details. | | [What is the DICOM service?](https://learn.microsoft.com/en-us/azure/healthcare-apis/dicom/overview) | 0.30 | High-level overview of the DICOM service benefits and use cases; primarily conceptual/marketing without detailed configuration or limits. 
| +| [2026](https://learn.microsoft.com/en-us/azure/healthcare-apis/release-notes-2026) | 0.20 | Monthly release notes typically list new features and fixes but not structured limits, configuration matrices, troubleshooting mappings, or decision criteria as defined by the sub-skill types. Without evidence of specific limits, config tables, error-code mappings, or decision matrices, this is general update information rather than reusable expert-knowledge patterns. | | [Azure API for FHIR](https://learn.microsoft.com/en-us/azure/healthcare-apis/azure-api-for-fhir/overview) | 0.20 | Azure API for FHIR overview plus retirement notice; no evidence of detailed limits, configuration parameters, or troubleshooting mappings. | | [Azure Health Data Services quickstart](https://learn.microsoft.com/en-us/azure/healthcare-apis/healthcare-apis-quickstart) | 0.20 | Quickstart for creating a workspace via portal; primarily step-by-step tutorial without detailed configuration parameter tables or product-specific constraints. | | [CMS Interoperability and Patient Access rule introduction](https://learn.microsoft.com/en-us/azure/healthcare-apis/fhir/centers-for-medicare-tutorial-introduction) | 0.20 | Tutorial series introduction; high-level description of CMS rule and implementation guides, not detailed configuration or troubleshooting content. | @@ -296,5 +289,4 @@ confusable_not_for: Not for Azure Health Bot (use azure-health-bot), Azure API M | [FHIR service](https://learn.microsoft.com/en-us/azure/healthcare-apis/fhir/overview) | 0.10 | Overview of FHIR service capabilities; no concrete limits, config parameters, or troubleshooting mappings are indicated. | | [What is Azure Health Data Services?](https://learn.microsoft.com/en-us/azure/healthcare-apis/healthcare-apis-overview) | 0.10 | High-level product overview of Azure Health Data Services without numeric limits, configuration tables, or detailed error/security specifics. 
| | [Workspace overview](https://learn.microsoft.com/en-us/azure/healthcare-apis/workspace-overview) | 0.10 | High-level overview of Azure Health Data Services workspace; no specific limits, configuration tables, security roles, or troubleshooting details. | -| [2026](https://learn.microsoft.com/en-us/azure/healthcare-apis/release-notes-2026) | - | Release notes summary only; no specific fixes, configuration details, or error codes are provided in the description. Without concrete technical content, it does not meet the expert knowledge criteria. | | [GitHub projects](https://learn.microsoft.com/en-us/azure/healthcare-apis/github-projects) | - | List of related GitHub projects; navigation/links page without embedded technical reference content. | diff --git a/products/azure-impact-reporting/azure-impact-reporting.csv b/products/azure-impact-reporting/azure-impact-reporting.csv index 5f93b4b0..801d418d 100644 --- a/products/azure-impact-reporting/azure-impact-reporting.csv +++ b/products/azure-impact-reporting/azure-impact-reporting.csv @@ -5,10 +5,10 @@ https://learn.microsoft.com/en-us/azure/azure-impact-reporting/connectors-troubl https://learn.microsoft.com/en-us/azure/azure-impact-reporting/create-azure-monitor-connector,Create an impact connector for Azure Monitor alerts,Azure Impact Reporting: Create an Azure Monitor Alert Connector,Create Azure Impact Reporting connectors for alerts,Learn how to create an Azure Impact Reporting connector for Azure Monitor alerts.,"Important Azure Impact Reporting is currently in preview. For legal terms that apply to Azure features that are in beta, in preview, or otherwise not yet released into general availability, see Supplemental Terms of Use for Microsoft Azure Previews. You can create impact connectors through AzCLI, Azure PowerShell, or the Azure portal. 
This article outlines the connector creation process and requirements.",2025-10-20T22:03:00.000Z,overview,configuration,0.7,True,"Details creating impact connectors via Azure CLI, PowerShell, or portal. This typically includes resource properties, required parameters, and configuration options specific to the connector resource.",unchanged https://learn.microsoft.com/en-us/azure/azure-impact-reporting/creating-logic-app,Report impacts by using a logic app,Azure Impact Reporting: Create a Logic App,Use Logic Apps to send Azure impact reports,Learn how to onboard your workload to Azure Impact Reporting by using a logic app.,"Important Azure Impact Reporting is currently in preview. For legal terms that apply to Azure features that are in beta, in preview, or otherwise not yet released into general availability, see Supplemental Terms of Use for Microsoft Azure Previews. To learn more about available impact management actions, see the API docs. Use a logic app as a REST client for impact reporting.",2025-10-20T22:03:00.000Z,how-to,integrations,0.7,True,"Explains using a Logic App as a REST client for impact reporting. This typically includes connector/action configuration, request body schema, and product-specific integration patterns.",unchanged https://learn.microsoft.com/en-us/azure/azure-impact-reporting/faq,FAQ for Azure Impact Reporting,Azure Impact Reporting: FAQ,,Azure Impact Reporting frequently asked questions.,"Important Azure Impact Reporting is currently in preview. For legal terms that apply to Azure features that are in beta, in preview, or otherwise not yet released into general availability, see Supplemental Terms of Use for Microsoft Azure Previews. 
Here are answers to common questions about Azure Impact Reporting.",2025-10-20T22:03:00.000Z,faq,,0.4,False,"Azure Impact Reporting FAQ; likely general questions and answers without structured troubleshooting, configuration parameter tables, or numeric limits.",unchanged -https://learn.microsoft.com/en-us/azure/azure-impact-reporting/guest-health-faq,FAQ for Guest Health Reporting,Azure HPC Guest Health Reporting - FAQ,Troubleshoot Azure HPC Guest Health Reporting issues,Frequently asked questions for Guest Health Reporting.,"Here are answers to common questions about Guest Health Reporting. Important Guest Health Reporting is currently in preview. For legal terms that apply to Azure features that are in beta, in preview, or otherwise not yet released into general availability, see the Supplemental Terms of Use for Microsoft Azure Previews.",2026-04-14T22:03:00.000Z,faq,troubleshooting,0.68,True,"FAQ for a niche preview feature likely includes product-specific behaviors, constraints, and answers to concrete operational questions that aren't general knowledge, fitting a troubleshooting/diagnostic pattern rather than conceptual overview.",updated +https://learn.microsoft.com/en-us/azure/azure-impact-reporting/guest-health-faq,FAQ for Guest Health Reporting,Azure HPC Guest Health Reporting - FAQ,Troubleshoot Azure HPC Guest Health Reporting issues,Frequently asked questions for Guest Health Reporting.,"Here are answers to common questions about Guest Health Reporting. Important Guest Health Reporting is currently in preview. 
For legal terms that apply to Azure features that are in beta, in preview, or otherwise not yet released into general availability, see the Supplemental Terms of Use for Microsoft Azure Previews.",2026-04-14T22:03:00.000Z,faq,troubleshooting,0.68,True,"FAQ for a niche preview feature likely includes product-specific behaviors, constraints, and answers to concrete operational questions that aren't general knowledge, fitting a troubleshooting/diagnostic pattern rather than conceptual overview.",unchanged https://learn.microsoft.com/en-us/azure/azure-impact-reporting/guest-health-impact-categories,Impact categories,Azure HPC Guest Health Reporting - Impact Categories,Use valid HPC Guest Health impact categories,View Guest Health Reporting impact categories.,"To properly report issues to Guest Health Reporting, you must use an impact category that starts with Resource.HPC. There are three main types of impact categories for high-performance computing (HPC): Important Guest Health Reporting is currently in preview. For legal terms that apply to Azure features that are in beta, in preview, or otherwise not yet released into general availability, see the Supplemental Terms of Use for Microsoft Azure Previews.",2025-09-26T22:02:00.000Z,overview,configuration,0.8,True,"Specifies that impact categories must start with 'Resource.HPC' and mentions three main types. This is concrete, product-specific configuration data about allowed category values.",unchanged https://learn.microsoft.com/en-us/azure/azure-impact-reporting/guest-health-impact-report,Report node health,Azure HPC Guest Health Reporting - Report Node Health,,Share the health status of a supercomputing virtual machine with Azure.,"This article shows how to use Guest Health Reporting to share the health status of a supercomputing virtual machine (VM) with Azure. Before you begin, follow the instructions for onboarding and access management in the feature overview. Important Guest Health Reporting is currently in preview. 
For legal terms that apply to Azure features that are in beta, in preview, or otherwise not yet released into general availability, see the Supplemental Terms of Use for Microsoft Azure Previews.",2026-04-09T11:11:00.000Z,overview,,0.2,False,"From the summary, the page appears to be a how-to/tutorial style description of using Guest Health Reporting for Azure HPC VMs, without clear evidence of numeric limits, detailed configuration parameter tables, error-code-based troubleshooting, or decision matrices. It likely explains usage rather than providing the kind of product-specific limits, configuration catalogs, or troubleshooting mappings required for expert-knowledge classification.",unchanged -https://learn.microsoft.com/en-us/azure/azure-impact-reporting/guest-health-log-upload,Log Upload for Guest Health Reporting,Azure HPC Guest Health Reporting - Provide Log File,Provide diagnostic log files for Guest Health Reporting,Provide a log file along with guest health report.,"Providing a log file helps make more precise repair actions. A log file should be uploaded prior to making the guest health report (GHR), with the GHR request containing a LogUrl field inside the additionalProperties section of the request body. 
The LogUrl points to the uploaded log file from one of the supported platform-specific diagnostic tools (for example NVidia bug report).",2026-04-14T22:03:00.000Z,overview,integrations,0.7,True,"Describes a specific request body schema (LogUrl in additionalProperties) and how to integrate platform-specific diagnostic tools with Guest Health Reporting, which is product-specific API/configuration knowledge beyond generic tutorials.",new +https://learn.microsoft.com/en-us/azure/azure-impact-reporting/guest-health-log-upload,Log Upload for Guest Health Reporting,Azure HPC Guest Health Reporting - Provide Log File,Provide diagnostic log files for Guest Health Reporting,Provide a log file along with guest health report.,"Providing a log file helps make more precise repair actions. A log file should be uploaded prior to making the guest health report (GHR), with the GHR request containing a LogUrl field inside the additionalProperties section of the request body. The LogUrl points to the uploaded log file from one of the supported platform-specific diagnostic tools (for example NVidia bug report).",2026-04-14T22:03:00.000Z,overview,integrations,0.7,True,"Describes a specific request body schema (LogUrl in additionalProperties) and how to integrate platform-specific diagnostic tools with Guest Health Reporting, which is product-specific API/configuration knowledge beyond generic tutorials.",unchanged
Important Guest Health Reporting is currently in preview. For legal terms that apply to Azure features that are in beta, in preview, or otherwise not yet relea",2026-02-24T18:03:00.000Z,overview,,0.3,False,"Guest Health Reporting overview for HPC; appears conceptual (what it is, why) without specific configuration parameters, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/azure-impact-reporting/overview,Overview,Azure Impact Reporting: Overview,,Azure Impact Reporting is a service that you can use to report observed performance and availability regressions with your Azure workloads.,"Important Azure Impact Reporting is currently in preview. For legal terms that apply to Azure features that are in beta, in preview, or otherwise not yet released into general availability, see Supplemental Terms of Use for Microsoft Azure Previews. To enhance your investigations into observed workload regressions, Azure Impact Reporting provides an Azure platform relevant to the resource and the time of the regression. When you report an issue, we investigate changes, outages, and other platform",2025-10-20T22:03:00.000Z,overview,,0.2,False,"High-level overview of Azure Impact Reporting; summary indicates conceptual description of service and preview notice without concrete limits, configs, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/azure-impact-reporting/report-impact,Report a resource impact,Azure Impact Reporting: Report an Impact,Report Azure workload impact via Service Health and API,Learn how to provide necessary details to report an impact to your Azure workloads.,"Important Azure Impact Reporting is currently in preview. For legal terms that apply to Azure features that are in beta, in preview, or otherwise not yet released into general availability, see Supplemental Terms of Use for Microsoft Azure Previews. You can use the Azure Service Health Report an issue pane and the REST API to report an issue. 
You can also use an Azure Monitor connector to report an impact automatically when certain alerts get triggered.",2025-10-20T22:03:00.000Z,how-to,integrations,0.65,True,"Describes using Azure Service Health 'Report an issue' pane, REST API, and Azure Monitor connector to submit impact reports. Likely includes request schema, parameters, and integration-specific details beyond generic tutorials.",unchanged diff --git a/products/azure-impact-reporting/report.md b/products/azure-impact-reporting/report.md index 4f7f4d51..3f95cf6c 100644 --- a/products/azure-impact-reporting/report.md +++ b/products/azure-impact-reporting/report.md @@ -32,9 +32,9 @@ confusable_not_for: Not for Azure Carbon Optimization (use azure-carbon-optimiza - **Unclassified**: 5 ### Incremental Update -- **New Pages**: 1 -- **Updated Pages**: 1 -- **Unchanged**: 13 +- **New Pages**: 0 +- **Updated Pages**: 0 +- **Unchanged**: 15 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-impact-reporting/azure-impact-reporting.csv` @@ -49,15 +49,6 @@ confusable_not_for: Not for Azure Carbon Optimization (use azure-carbon-optimiza ## Changes -### New Pages - -- [Log Upload for Guest Health Reporting](https://learn.microsoft.com/en-us/azure/azure-impact-reporting/guest-health-log-upload) - -### Updated Pages - -- [FAQ for Guest Health Reporting](https://learn.microsoft.com/en-us/azure/azure-impact-reporting/guest-health-faq) - - Updated: 2025-09-26T22:02:00.000Z → 2026-04-14T22:03:00.000Z - ## Classified Pages | TOC Title | Type | Confidence | Reason | diff --git a/products/azure-iot-edge/azure-iot-edge.csv b/products/azure-iot-edge/azure-iot-edge.csv index c12db34d..88ebf052 100644 --- a/products/azure-iot-edge/azure-iot-edge.csv +++ b/products/azure-iot-edge/azure-iot-edge.csv @@ -1,6 +1,6 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type 
-https://learn.microsoft.com/en-us/azure/iot-edge/about-iot-edge,About Azure IoT Edge,What is Azure IoT Edge,,"Learn how Azure IoT Edge enables you to deploy, run, and monitor containerized Linux applications at the edge for better business insights and offline decision-making.","Applies to: IoT Edge 1.5 Important IoT Edge 1.5 LTS is the supported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, see Update IoT Edge. Azure IoT Edge is a device-focused runtime that enables you to deploy, run, and monitor containerized Linux applications, bringing analytics closer to your devices for faster insights and offline decision-making. Analytics drives business value in IoT solutions, but not all analytics need to be in the cloud.",2026-04-13T22:10:00.000Z,overview,,0.1,False,"High-level overview of Azure IoT Edge capabilities and concepts without product-specific limits, configuration parameters, or detailed operational guidance.",updated -https://learn.microsoft.com/en-us/azure/iot-edge/configure-connect-verify-gpu,"Configure, connect, and verify a GPU",Configure and connect an IoT Edge module with a GPU,,Configure your environment to connect and verify your GPU to process modules from your IoT Edge device.,"Applies to: IoT Edge 1.5 Important IoT Edge 1.5 LTS is the supported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, see Update IoT Edge. This tutorial shows you how to build a GPU-enabled virtual machine (VM). From the VM, you run an IoT Edge device that allocates work from one of its modules to your GPU. 
Use the Azure portal, the Azure Cloud Shell, and your VM's command line to:",2026-04-13T22:10:00.000Z,tutorial,,0.4,False,"Tutorial-style guidance for using a GPU with IoT Edge; likely focuses on step-by-step setup rather than structured configuration tables, limits, or error-code-based troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/iot-edge/about-iot-edge,About Azure IoT Edge,What is Azure IoT Edge,,"Learn how Azure IoT Edge enables you to deploy, run, and monitor containerized Linux applications at the edge for better business insights and offline decision-making.","Applies to: IoT Edge 1.5 Important IoT Edge 1.5 LTS is the supported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, see Update IoT Edge. Azure IoT Edge is a device-focused runtime that enables you to deploy, run, and monitor containerized Linux applications, bringing analytics closer to your devices for faster insights and offline decision-making. Analytics drives business value in IoT solutions, but not all analytics need to be in the cloud.",2026-04-13T22:10:00.000Z,overview,,0.1,False,"High-level overview of Azure IoT Edge capabilities and concepts without product-specific limits, configuration parameters, or detailed operational guidance.",unchanged +https://learn.microsoft.com/en-us/azure/iot-edge/configure-connect-verify-gpu,"Configure, connect, and verify a GPU",Configure and connect an IoT Edge module with a GPU,,Configure your environment to connect and verify your GPU to process modules from your IoT Edge device.,"Applies to: IoT Edge 1.5 Important IoT Edge 1.5 LTS is the supported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, see Update IoT Edge. This tutorial shows you how to build a GPU-enabled virtual machine (VM). From the VM, you run an IoT Edge device that allocates work from one of its modules to your GPU. 
Use the Azure portal, the Azure Cloud Shell, and your VM's command line to:",2026-04-13T22:10:00.000Z,tutorial,,0.4,False,"Tutorial-style guidance for using a GPU with IoT Edge; likely focuses on step-by-step setup rather than structured configuration tables, limits, or error-code-based troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/configure-device,Configure device settings,Configure Azure IoT Edge device settings,Configure Azure IoT Edge device settings via config.toml,This article shows you how to configure Azure IoT Edge device settings and options using the config.toml file.,"Applies to: IoT Edge 1.5 Important IoT Edge 1.5 LTS is the supported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, see Update IoT Edge. This article shows settings and options for configuring the IoT Edge /etc/aziot/config.toml file of an IoT Edge device. IoT Edge uses the config.toml file to initialize settings for the device. Each of the sections of the config.toml file has several options. Not all options are mandatory, as they apply to specific sc",2025-05-14T08:00:00.000Z,how-to,configuration,0.9,True,"Detailed reference of config.toml sections, option names, and allowed values for IoT Edge devices.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/debug-module-vs-code,Debug modules with VS Code,Debug Azure IoT Edge modules using Visual Studio Code,,Debug Azure IoT Edge modules in Visual Studio Code with step-by-step guidance for multiple programming languages and device architectures.,"Applies to: IoT Edge 1.5 Important IoT Edge 1.5 LTS is the supported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, see Update IoT Edge. This article explains how to use Visual Studio Code to debug IoT Edge modules in multiple languages. On your development computer, use Visual Studio Code to attach and debug your module in a local or remote module container. 
This article includes steps for two IoT Edge development tools. Select the tool version ",2025-05-08T08:00:00.000Z,concept-article,,0.45,False,"Debugging tutorial in VS Code for IoT Edge modules; mainly step-by-step IDE usage and attach instructions, not structured troubleshooting mappings or product-specific error code references.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/deploy-confidential-applications,Confidential computing,Confidential applications as Azure IoT Edge modules,Deploy confidential computing applications as IoT Edge modules,Use the Open Enclave SDK and API to write confidential applications and deploy them as IoT Edge modules for confidential computing,"Applies to: IoT Edge 1.5 Important IoT Edge 1.5 LTS is the supported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, see Update IoT Edge. Azure IoT Edge supports confidential applications that run within secure enclaves on the device. Encryption provides security for data while in transit or at rest, but enclaves provide security for data and workloads while in use. IoT Edge supports Open Enclave as a standard for developing confidential applicati",2026-02-20T08:00:00.000Z,concept-article,security,0.7,True,"Describes using Open Enclave and secure enclaves with IoT Edge, including confidential application deployment patterns.",unchanged @@ -17,7 +17,7 @@ https://learn.microsoft.com/en-us/azure/iot-edge/how-to-configure-iot-edge-for-l https://learn.microsoft.com/en-us/azure/iot-edge/how-to-configure-iot-edge-for-linux-on-windows-networking,Understand VM networking,Networking for Azure IoT Edge for Linux on Windows,Select and configure networking options for EFLOW,Learn about how to configure custom networking for Azure IoT Edge for Linux on Windows virtual machine.,"Applies to: IoT Edge 1.5 Important IoT Edge 1.5 LTS is the supported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. 
If you are on an earlier release, see Update IoT Edge. This article helps you decide which networking option is best for your scenario and provide insights into IoT Edge for Linux on Windows (EFLOW) configuration requirements. To connect the IoT Edge for Linux on Windows (EFLOW) virtual machine over a network to your host, to other virtual machines on your Windows h",2026-02-20T08:00:00.000Z,concept-article,decision-making,0.7,True,"Helps choose between different EFLOW networking models and details configuration requirements for each, providing product-specific decision guidance and settings.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/how-to-configure-module-build-options,Configure module build options,Configure module build options,Configure IoT Edge module build and deployment options,Learn how to use the module.json file to configure build and deployment options for an IoT Edge module,"Applies to: IoT Edge 1.5 Important IoT Edge 1.5 LTS is the supported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, see Update IoT Edge. The module.json file controls how modules are built and deployed. IoT Edge module projects in Visual Studio and Visual Studio Code include the module.json file. This file has configuration details for the IoT Edge module, like the version and platform used when building the module.",2025-08-21T05:12:00.000Z,how-to,configuration,0.78,True,"The page explains the IoT Edge module.json schema and how it controls build and deployment behavior. It contains product-specific configuration fields, their meanings, and allowed values for IoT Edge modules, which are not generic knowledge. 
This aligns with the configuration sub-skill focused on specific setting names and their usage.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/how-to-configure-multiple-nics,Multiple NICs,Configure multiple NICs for Azure IoT Edge for Linux on Windows,Attach and configure multiple NICs for EFLOW VM,Configuration for attaching multiple network interfaces to Azure IoT Edge for Linux on Windows virtual machine,"Applies to: IoT Edge 1.5 Important IoT Edge 1.5 LTS is the supported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, see Update IoT Edge. By default, the Azure IoT Edge for Linux on Windows (EFLOW) virtual machine has a single network interface card (NIC) assigned. However, you can configure the EFLOW VM with multiple network interfaces by using the EFLOW support for attaching multiple network interfaces to the virtual machine. This functionality ",2025-01-22T05:32:00.000Z,concept-article,configuration,0.8,True,"Provides EFLOW-specific steps and parameters to add and configure multiple network interfaces on the EFLOW VM, including how EFLOW exposes and uses them.",unchanged -https://learn.microsoft.com/en-us/azure/iot-edge/how-to-configure-proxy-support,Configure proxy support,Configure devices for network proxies for Azure IoT Edge,Configure proxy settings for Azure IoT Edge devices,Learn how to configure Azure IoT Edge devices to communicate through a proxy server.,"Applies to: IoT Edge 1.5 Important IoT Edge 1.5 LTS is the supported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, see Update IoT Edge. IoT Edge devices send HTTPS requests to communicate with IoT Hub. If your device connects to a network that uses a proxy server, configure the IoT Edge runtime to communicate through the server. 
Proxy servers can also affect individual IoT Edge modules if they make HTTP or HTTPS requests that aren't routed t",2026-04-13T22:10:00.000Z,how-to,configuration,0.8,True,"Describes how to configure IoT Edge runtime and modules to use network proxies; this usually includes specific environment variables, configuration keys, and allowed values, matching the configuration sub-skill definition.",updated +https://learn.microsoft.com/en-us/azure/iot-edge/how-to-configure-proxy-support,Configure proxy support,Configure devices for network proxies for Azure IoT Edge,Configure proxy settings for Azure IoT Edge devices,Learn how to configure Azure IoT Edge devices to communicate through a proxy server.,"Applies to: IoT Edge 1.5 Important IoT Edge 1.5 LTS is the supported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, see Update IoT Edge. IoT Edge devices send HTTPS requests to communicate with IoT Hub. If your device connects to a network that uses a proxy server, configure the IoT Edge runtime to communicate through the server. Proxy servers can also affect individual IoT Edge modules if they make HTTP or HTTPS requests that aren't routed t",2026-04-13T22:10:00.000Z,how-to,configuration,0.8,True,"Describes how to configure IoT Edge runtime and modules to use network proxies; this usually includes specific environment variables, configuration keys, and allowed values, matching the configuration sub-skill definition.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/how-to-connect-downstream-device,Connect a downstream device,Connect a downstream device to an Azure IoT Edge gateway,Configure downstream devices to connect via IoT Edge gateway,How to configure downstream devices to connect to Azure IoT Edge gateway devices.,"Applies to: IoT Edge 1.5 Important IoT Edge 1.5 LTS is the supported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, see Update IoT Edge. 
This article gives instructions for setting up a trusted connection between downstream devices and IoT Edge transparentgateways. In a transparent gateway scenario, one or more devices send messages through a single gateway device that maintains the connection to IoT Hub. In this article, the termsgatewayandIoT E",2026-02-27T08:00:00.000Z,concept-article,configuration,0.7,True,"Covers detailed configuration steps and parameters to establish a trusted connection between downstream devices and an IoT Edge transparent gateway (cert trust, connection strings, gateway hostname, and client settings). These are specific configuration patterns unique to Azure IoT Edge.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/how-to-connect-downstream-iot-edge-device,Configure gateways for IoT Edge devices,How to create nested Azure IoT Edge device hierarchies,Configure nested Azure IoT Edge device hierarchies,How to create a trusted connection between an IoT Edge gateway and a downstream IoT Edge device.,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, seeUpdate IoT Edge. This article explains how to establish a trusted connection between an IoT Edge gateway and a downstream IoT Edge device. This configuration is callednested edge. In a gateway scenario, an IoT Edge device can be both a gateway and a downstream device. You can layer multiple IoT Edge gateways to create a device h",2026-02-27T08:00:00.000Z,concept-article,configuration,0.7,True,"Explains how to set up a trusted connection between an IoT Edge gateway and downstream IoT Edge devices in a nested hierarchy, including product-specific configuration of certificates, identities, and gateway relationships. 
This is concrete configuration guidance rather than conceptual architecture only.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/how-to-connect-usb-devices,Connect a USB device,How to connect a USB device to Azure IoT Edge for Linux on Windows,Configure USB over IP connectivity to EFLOW VM,How to connect a USB device using USB over IP to the Azure IoT Edge for Linux on Windows (EFLOW) virtual machine.,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, seeUpdate IoT Edge. In some scenarios, your workloads need to get data or communicate with USB devices. Because Azure IoT Edge for Linux on Windows (EFLOW) runs as a virtual machine, you need to connect these devices to the virtual machine. This article guides you through the steps necessary to connect a USB device to the EFLOW vir",2025-01-22T05:32:00.000Z,concept-article,configuration,0.8,True,"Gives concrete steps and configuration details to expose host USB devices to the EFLOW VM using USB over IP, including product-specific commands and settings.",unchanged @@ -38,7 +38,7 @@ https://learn.microsoft.com/en-us/azure/iot-edge/how-to-explore-curated-visualiz https://learn.microsoft.com/en-us/azure/iot-edge/how-to-install-iot-edge-kubernetes,Run IoT Edge on Kubernetes,How to install IoT Edge on Kubernetes,Install Azure IoT Edge on Kubernetes with KubeVirt,Learn on how to install IoT Edge on Kubernetes using a local development cluster environment,You can install IoT Edge on Kubernetes usingKubeVirttechnology. 
KubeVirt is an open-source project from the Cloud Native Computing Foundation (CNCF) that provides a Kubernetes virtualization API and runtime to define and manage virtual machines.,2025-05-09T05:16:00.000Z,concept-article,deployment,0.7,True,"Product-specific deployment method using KubeVirt and Kubernetes, including environment and runtime constraints.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/how-to-install-iot-edge-ubuntuvm,Deploy IoT Edge VM using an ARM template,Run Azure IoT Edge on Ubuntu Virtual Machines,,How to run Azure IoT Edge on an Ubuntu virtual machine,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, seeUpdate IoT Edge. The Azure IoT Edge runtime turns a device into an IoT Edge device. Deploy the runtime on devices as small as a Raspberry Pi or as large as an industrial server. After you set up the IoT Edge runtime, deploy business logic to the device from the cloud. To learn more about how the IoT Edge runtime works and its co",2025-06-09T22:05:00.000Z,how-to,,0.4,False,How-to guide for installing IoT Edge on Ubuntu VMs; mostly procedural setup steps and generic commands rather than structured configuration references or expert-only constraints.,unchanged https://learn.microsoft.com/en-us/azure/iot-edge/how-to-install-iot-edge-ubuntuvm-bicep,Deploy IoT Edge VM using a Bicep file,Run Azure IoT Edge on Ubuntu VMs by using Bicep,,Learn how to run Azure IoT Edge on Ubuntu virtual machines by deploying and provisioning using Bicep.,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, seeUpdate IoT Edge. The Azure IoT Edge runtime turns a device into an IoT Edge device. You can deploy the runtime on devices as small as a Raspberry Pi or as large as an industrial server. 
After you set up the IoT Edge runtime, deploy business logic to the device from the cloud. To learn more about how the IoT Edge runtime works an",2025-07-22T05:10:00.000Z,how-to,,0.4,False,"Step-by-step Bicep deployment tutorial for IoT Edge on Ubuntu VMs; likely shows commands and example parameters but not organized configuration tables, limits, or product-specific troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/iot-edge/how-to-manage-device-certificates,Manage device certificates,Manage IoT Edge certificates - Azure IoT Edge,Manage certificates for secure Azure IoT Edge devices,How to install and manage certificates on an Azure IoT Edge device to prepare for production deployment.,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, seeUpdate IoT Edge. All IoT Edge devices use certificates to create secure connections between the runtime and any modules running on the device. IoT Edge devices functioning as gateways use these same certificates to connect to their downstream devices as well. Note The termroot CAused throughout this article refers to the top",2026-04-13T08:00:00.000Z,concept-article,security,0.85,True,"Covers installation and management of IoT Edge device certificates for production; such content typically includes certificate types, trust chain requirements, and product-specific security configuration details, fitting the security sub-skill.",updated +https://learn.microsoft.com/en-us/azure/iot-edge/how-to-manage-device-certificates,Manage device certificates,Manage IoT Edge certificates - Azure IoT Edge,Manage certificates for secure Azure IoT Edge devices,How to install and manage certificates on an Azure IoT Edge device to prepare for production deployment.,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. 
If you're using an earlier release, seeUpdate IoT Edge. All IoT Edge devices use certificates to create secure connections between the runtime and any modules running on the device. IoT Edge devices functioning as gateways use these same certificates to connect to their downstream devices as well. Note The termroot CAused throughout this article refers to the top",2026-04-13T08:00:00.000Z,concept-article,security,0.85,True,"Covers installation and management of IoT Edge device certificates for production; such content typically includes certificate types, trust chain requirements, and product-specific security configuration details, fitting the security sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/how-to-monitor-iot-edge-deployments,Monitor IoT Edge deployments,Monitor IoT Edge deployments - Azure IoT Edge,,High-level monitoring including edgeHub and edgeAgent reported properties and automatic deployment metrics.,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, seeUpdate IoT Edge. Azure IoT Edge gives you real-time information about the modules deployed to your IoT Edge devices. The IoT Hub service gets status from the devices and shows it to you. Monitoring is also important fordeployments made at scalethat include automatic deployments and layered deployments. 
Devices and modules have s",2025-06-07T05:16:00.000Z,concept-article,,0.4,False,"Monitoring overview for IoT Edge deployments; describes reported properties and metrics conceptually, without detailed metric reference tables or thresholds.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/how-to-monitor-module-twins,Monitor module twins,Monitor module twins - Azure IoT Edge,Monitor IoT Edge module twins for health,How to interpret device twins and module twins to determine connectivity and health.,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, seeUpdate IoT Edge. Module twins in Azure IoT Hub let you monitor the connectivity and health of your IoT Edge deployments. Module twins store information in your IoT hub about the performance of your running modules. TheIoT Edge agentand theIoT Edge hubruntime modules each maintain their own module twins:$edgeAgentand$edgeHub.",2026-04-01T08:00:00.000Z,how-to,best-practices,0.65,True,"Explains how to interpret $edgeAgent and $edgeHub module twins to assess connectivity and health. Likely includes product-specific fields, status patterns, and recommended interpretations (for example, which twin properties indicate issues), which are concrete, IoT Edge–specific operational practices rather than generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/how-to-observability,End-to-end observability for IoT Edge,How to implement IoT Edge observability using monitoring and troubleshooting,,Learn how to build an observability solution for an IoT Edge System,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, seeUpdate IoT Edge. 
In this article, you learn the concepts and techniques for implementing both observability dimensions:measuring and monitoringandtroubleshooting. You learn about the following topics:",2025-06-06T22:07:00.000Z,how-to,,0.4,False,High-level observability concepts (measuring/monitoring and troubleshooting) for IoT Edge; more architectural and conceptual than parameter- or error-code–driven expert content.,unchanged @@ -55,18 +55,18 @@ https://learn.microsoft.com/en-us/azure/iot-edge/how-to-provision-single-device- https://learn.microsoft.com/en-us/azure/iot-edge/how-to-retrieve-iot-edge-logs,Retrieve logs with direct methods,Retrieve Azure IoT Edge logs,Retrieve and upload Azure IoT Edge logs via direct methods,Retrieve Azure IoT Edge logs and upload them to Azure Blob Storage for remote monitoring.,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, seeUpdate IoT Edge. Get logs from your IoT Edge deployments without physical or SSH access by using direct methods in the IoT Edge agent module. Direct methods run on the device and you can invoke them from the cloud. The IoT Edge agent has direct methods that help you monitor and manage your IoT Edge devices remotely. The direct m",2025-06-04T05:14:00.000Z,how-to,integrations,0.65,True,"Describes IoT Edge agent direct methods for log retrieval and upload to Blob Storage, including method names and payload parameters; this is product-specific API integration detail.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/how-to-share-windows-folder-to-vm,Shared folders,Share Windows folder with Azure IoT Edge for Linux on Windows,Share Windows folders with the EFLOW virtual machine,How to share a Windows folders and files with the Azure IoT Edge for Linux on Windows virtual machine.,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. 
IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, seeUpdate IoT Edge. The Azure IoT Edge for Linux on Windows (EFLOW) virtual machine is isolated from the Windows host OS, and the virtual machine can't access the host file system. By default, the EFLOW virtual machine has its own file system and can't access folders or files on the host computer. TheEFLOW file and folder sharing m",2025-06-09T22:05:00.000Z,how-to,configuration,0.8,True,"Explains EFLOW-specific file and folder sharing mechanisms, including configuration options and paths that are unique to EFLOW rather than generic file sharing.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/how-to-troubleshoot-monitoring-and-faq,FAQ and troubleshooting,Monitoring troubleshooting and FAQ - Azure IoT Edge,Troubleshoot Azure Monitor integration for IoT Edge metrics,Troubleshooting Azure Monitor integration and FAQ,,2025-08-08T08:00:00.000Z,concept-article,troubleshooting,0.8,True,"Explicitly a monitoring troubleshooting and FAQ article; likely organized by symptoms and includes specific error messages, diagnostic steps, and resolutions for IoT Edge–Azure Monitor integration.",unchanged -https://learn.microsoft.com/en-us/azure/iot-edge/how-to-update-iot-edge,Update IoT Edge,Update IoT Edge version on devices,,Learn how to update IoT Edge devices to run the latest versions of the security subsystem and the IoT Edge runtime.,"Applies to:IoT Edge 1.5IoT Edge 1.4 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. As the IoT Edge service releases new versions, update your IoT Edge devices for the latest features and security improvements. This article provides information about how to update your IoT Edge devices when a new version is available. Two logical components of an IoT Edge device need to be updated if you want to move to a newer version. 
Security subsys",2026-04-13T08:00:00.000Z,how-to,,0.4,False,"Focuses on updating IoT Edge versions; likely a procedural update guide without structured configuration parameter tables, limits, or detailed troubleshooting matrices.",updated +https://learn.microsoft.com/en-us/azure/iot-edge/how-to-update-iot-edge,Update IoT Edge,Update IoT Edge version on devices,,Learn how to update IoT Edge devices to run the latest versions of the security subsystem and the IoT Edge runtime.,"Applies to:IoT Edge 1.5IoT Edge 1.4 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. As the IoT Edge service releases new versions, update your IoT Edge devices for the latest features and security improvements. This article provides information about how to update your IoT Edge devices when a new version is available. Two logical components of an IoT Edge device need to be updated if you want to move to a newer version. Security subsys",2026-04-13T08:00:00.000Z,how-to,,0.4,False,"Focuses on updating IoT Edge versions; likely a procedural update guide without structured configuration parameter tables, limits, or detailed troubleshooting matrices.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/how-to-use-create-options,Understand and use createOptions,Configure Container Create Options for Azure IoT Edge Modules,Configure IoT Edge module container createOptions,"Configure IoT Edge modules with Docker-compatible create options for tasks like port mapping, memory limits, and GPU optimization.","Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, seeUpdate IoT Edge. Use thecreateOptionsparameter in the deployment manifest to configure the module containers at runtime. By using this parameter, you can restrict the module's access to the host device's resources or configure networking. 
IoT Edge modules run as Docker-compatible containers on your IoT Edge device. Docker of",2026-03-05T23:11:00.000Z,concept-article,configuration,0.78,True,"The page describes the IoT Edge-specific use of the Docker-compatible createOptions parameter in deployment manifests, including concrete JSON configuration patterns for networking, resource limits, and device access. These are product-specific configuration details (parameter names, structures, and allowed settings) that go beyond generic Docker knowledge and qualify as expert configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/how-to-visual-studio-develop-module,Develop modules with Visual Studio,Develop and debug Azure IoT Edge modules using Visual Studio,,Use Visual Studio to develop a custom IoT Edge module and deploy to an IoT device.,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, seeUpdate IoT Edge. This article shows you how to use Visual Studio 2022 to develop, debug, and deploy custom Azure IoT Edge modules. Visual Studio 2022 provides templates for IoT Edge modules written in C and C#. Supported device architectures include Windows x64, Linux x64, ARM32, and ARM64 (preview). For more information abo",2026-04-06T08:00:00.000Z,concept-article,,0.2,False,"Primarily a how-to/tutorial for developing and debugging IoT Edge modules in Visual Studio. 
From the summary, it focuses on using templates and supported architectures, without indicating detailed configuration tables, limits, error-code mappings, or product-specific settings that meet the expert-knowledge criteria for any sub-skill type.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/iot-edge-as-gateway,IoT Edge device as a gateway,Use Azure IoT Edge as a gateway for downstream devices,Choose Azure IoT Edge gateway patterns for devices,"Use Azure IoT Edge to create a transparent, opaque, or proxy gateway device that sends data from multiple connected devices to the cloud or processes locally.","Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, seeUpdate IoT Edge. IoT Edge devices can operate as gateways, providing a connection between other devices on the network and IoT Hub. The IoT Edge hub module acts like IoT Hub, so it can handle connections from other devices that have an identity with the same IoT hub. This type of gateway pattern is calledtransparentbecause messa",2026-02-25T08:00:00.000Z,concept-article,architecture-patterns,0.7,True,"Describes transparent, opaque, and proxy gateway patterns specific to IoT Edge and when to use each.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/iot-edge-certs,IoT Edge certificates,Understand how IoT Edge uses certificates for security - Azure IoT Edge,Configure certificate-based security for Azure IoT Edge,"How Azure IoT Edge uses certificate to validate devices, modules, and downstream devices enabling secure connections between them.","Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, seeUpdate IoT Edge. IoT Edge uses different types of certificates for different purposes. 
This article explains how IoT Edge uses certificates with Azure IoT Hub and IoT Edge gateway scenarios. Important For brevity, this article applies to IoT Edge version 1.2 or later. Certificate concepts for version 1.1 are similar, but som",2026-03-02T08:00:00.000Z,concept-article,security,0.74,True,"The article explains how IoT Edge uses different certificate types with IoT Hub and gateway scenarios. It contains product-specific security behavior and certificate role details (device, module, downstream device validation) that are unique to IoT Edge’s security model, matching the security sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/iot-edge-for-linux-on-windows,About EFLOW,What is Azure IoT Edge for Linux on Windows,,Overview of running Linux IoT Edge modules with Azure IoT Edge for Linux on Windows. Run containerized Linux workloads alongside Windows applications.,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, seeUpdate IoT Edge. Azure IoT Edge for Linux on Windows (EFLOW) enables you to run containerized Linux workloads alongside Windows applications in Windows deployments. Businesses that rely on Windows to power their edge devices and solutions can now take advantage of the cloud-native analytics solutions being built in Linux. Az",2026-03-02T23:28:00.000Z,concept-article,,0.2,False,"Described as an overview of Azure IoT Edge for Linux on Windows (EFLOW), explaining what it is and its high-level capabilities. 
This is conceptual/marketing-style content without clear indication of detailed configuration parameters, limits, or troubleshooting data.",unchanged -https://learn.microsoft.com/en-us/azure/iot-edge/iot-edge-for-linux-on-windows-benefits,Benefits,Why use Azure IoT Edge for Linux on Windows?,,Benefits for using Azure IoT Edge for Linux on Windows (EFLOW) to deploy production Linux-based cloud-native workloads on Windows devices.,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, seeUpdate IoT Edge. Azure IoT Edge for Linux on Windows (EFLOW) lets you run business logic and analytics on devices by deploying production Linux-based cloud-native workloads on Windows devices. Connecting devices to Microsoft Azure lets you bring cloud intelligence to your business. Running workloads on devices helps you resp",2026-04-13T22:10:00.000Z,concept-article,,0.1,False,"Content is about benefits and high-level reasons to use IoT Edge for Linux on Windows. It is conceptual/marketing-style justification without detailed configuration, limits, or troubleshooting data.",updated +https://learn.microsoft.com/en-us/azure/iot-edge/iot-edge-for-linux-on-windows-benefits,Benefits,Why use Azure IoT Edge for Linux on Windows?,,Benefits for using Azure IoT Edge for Linux on Windows (EFLOW) to deploy production Linux-based cloud-native workloads on Windows devices.,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, seeUpdate IoT Edge. Azure IoT Edge for Linux on Windows (EFLOW) lets you run business logic and analytics on devices by deploying production Linux-based cloud-native workloads on Windows devices. Connecting devices to Microsoft Azure lets you bring cloud intelligence to your business. 
Running workloads on devices helps you resp",2026-04-13T22:10:00.000Z,concept-article,,0.1,False,"Content is about benefits and high-level reasons to use IoT Edge for Linux on Windows. It is conceptual/marketing-style justification without detailed configuration, limits, or troubleshooting data.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/iot-edge-for-linux-on-windows-networking,Networking,Azure IoT Edge for Linux on Windows networking,Configure networking between Windows host and EFLOW virtual machine,Overview of Azure IoT Edge for Linux on Windows networking between the Windows host OS and the IoT Edge for Linux on Windows (EFLOW) virtual machine.,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, seeUpdate IoT Edge. This article provides information about how to configure the networking between the Windows host OS and the IoT Edge for Linux on Windows (EFLOW) virtual machine. EFLOW uses anAzure LinuxLinux virtual machine in order to run IoT Edge modules. For more information about EFLOW architecture, seeWhat is Azure IoT Ed",2026-02-25T08:00:00.000Z,concept-article,configuration,0.6,True,"Provides details on networking configuration between Windows host and EFLOW VM; includes network modes, ports, and settings specific to this product.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/iot-edge-for-linux-on-windows-security,Security,Azure IoT Edge for Linux on Windows security,Understand and configure security principles for IoT Edge for Linux on Windows,Overview of the Azure IoT Edge for Linux on Windows security framework and the different security premises that are enabled by default or optional.,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, seeUpdate IoT Edge. 
Azure IoT Edge for Linux on Windows uses all the security features of a Windows client or server host and makes sure all extra components follow the same security principles. This article explains the different security principles that are enabled by default, and some optional principles you can enable.",2025-07-21T22:12:00.000Z,concept-article,security,0.6,True,Describes EFLOW security framework and which security premises are enabled by default or optional; includes product-specific security behaviors and configuration options.,unchanged -https://learn.microsoft.com/en-us/azure/iot-edge/iot-edge-for-linux-on-windows-support,Supported platforms,Supported platforms for IoT Edge for Linux on Windows,Check supported platforms for IoT Edge on Windows,Learn which operating systems and container engines are supported for Azure IoT Edge for Linux on Windows,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, seeUpdate IoT Edge. This article provides details about which systems support IoT Edge for Linux on Windows, whether generally available or in preview.",2026-04-13T22:10:00.000Z,concept-article,deployment,0.65,True,"The page details which operating systems and container engines support IoT Edge for Linux on Windows, likely in matrix form distinguishing GA vs preview. This is product- and platform-specific deployment support information that an LLM would not reliably know from training.",updated +https://learn.microsoft.com/en-us/azure/iot-edge/iot-edge-for-linux-on-windows-support,Supported platforms,Supported platforms for IoT Edge for Linux on Windows,Check supported platforms for IoT Edge on Windows,Learn which operating systems and container engines are supported for Azure IoT Edge for Linux on Windows,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. 
If you're using an earlier release, seeUpdate IoT Edge. This article provides details about which systems support IoT Edge for Linux on Windows, whether generally available or in preview.",2026-04-13T22:10:00.000Z,concept-article,deployment,0.65,True,"The page details which operating systems and container engines support IoT Edge for Linux on Windows, likely in matrix form distinguishing GA vs preview. This is product- and platform-specific deployment support information that an LLM would not reliably know from training.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/iot-edge-for-linux-on-windows-updates,Updates,Azure IoT Edge for Linux on Windows updates,,Overview of Azure IoT Edge for Linux on Windows updates. Learn how to update your IoT Edge for Linux on Windows devices when a new version is available.,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, seeUpdate IoT Edge. When a new version of the IoT Edge for Linux on Windows (EFLOW) application is released, update your IoT Edge devices to get the latest features and security improvements. This article explains how to update your IoT Edge for Linux on Windows devices when a new version is available. With IoT Edge for Linux on Wi",2025-07-21T22:12:00.000Z,concept-article,,0.45,False,Explains how to update EFLOW; mostly procedural update steps without detailed configuration matrices or constraints.,unchanged -https://learn.microsoft.com/en-us/azure/iot-edge/iot-edge-limits-and-restrictions,Limits and restrictions,Azure IoT Edge limits and restrictions,Review Azure IoT Edge limits and restrictions,Understand the limits and restrictions when using Azure IoT Edge,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, seeUpdate IoT Edge. 
This article explains the limits and restrictions when you use IoT Edge.",2026-04-13T22:10:00.000Z,concept-article,limits-quotas,0.95,True,"Explicitly documents IoT Edge limits and restrictions; such pages typically include concrete numeric constraints (for example, module counts, message sizes, or resource limits) that qualify as expert knowledge under limits-quotas.",updated +https://learn.microsoft.com/en-us/azure/iot-edge/iot-edge-limits-and-restrictions,Limits and restrictions,Azure IoT Edge limits and restrictions,Review Azure IoT Edge limits and restrictions,Understand the limits and restrictions when using Azure IoT Edge,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, seeUpdate IoT Edge. This article explains the limits and restrictions when you use IoT Edge.",2026-04-23T06:20:00.000Z,concept-article,limits-quotas,0.95,True,"The page is explicitly about IoT Edge limits and restrictions and will list concrete numerical constraints (for modules, routes, message sizes, etc.) that are version-specific and not reliably known from training data, matching the limits-quotas criteria.",updated https://learn.microsoft.com/en-us/azure/iot-edge/iot-edge-modules,Understand Azure IoT Edge modules,How Azure IoT Edge Modules Run Logic on Devices,,"Learn how Azure IoT Edge modules run logic on devices, using containerized applications, and secure communication with IoT Hub.","Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, seeUpdate IoT Edge. Azure IoT Edge lets you deploy and manage business logic on edge devices by usingmodules. Azure IoT Edge modules are the smallest unit of computation managed by IoT Edge. They can contain Azure services, such as Azure Stream Analytics, or your own solution-specific code. 
To understand how modules are developed, ",2026-02-27T23:17:00.000Z,concept-article,,0.35,False,Conceptual explanation of modules and how they run; not a structured config or limits reference.,unchanged https://learn.microsoft.com/en-us/azure/iot-edge/iot-edge-runtime,IoT Edge runtime,Azure IoT Edge runtime and architecture explained,,"Discover how Azure IoT Edge runtime manages modules, security, and communication with IoT Hub to optimize IoT solutions.","Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, seeUpdate IoT Edge. The IoT Edge runtime is a set of programs that turn a device into an IoT Edge device. The runtime components let IoT Edge devices receive code to run at the edge, and then communicate results. The IoT Edge runtime is responsible for the following functions on IoT Edge devices: The responsibilities of the IoT",2026-03-02T08:00:00.000Z,concept-article,,0.2,False,"Described as an explanation of runtime and architecture responsibilities; this is primarily conceptual architecture/overview content without clear decision matrices, limits, or configuration tables in the summary.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/iot-edge-security-manager,IoT Edge security manager,Azure IoT Edge security manager and module runtime,,Understand how Azure IoT Edge security manager and module runtime enable IoT Edge device security and the integrity of security services.,"Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, seeUpdate IoT Edge. The Azure IoT Edge security manager is a well-bounded security core that protects the IoT Edge device and its components by abstracting secure silicon hardware. 
The security manager focuses on security hardening and gives a technology integration point to original equipment manufacturers (OEM). The security ",2026-03-02T08:00:00.000Z,concept-article,,0.25,False,"Explains security manager and module runtime conceptually; summary emphasizes understanding and integration points, not specific RBAC roles, configuration parameters, or security setting tables.",unchanged @@ -83,7 +83,7 @@ https://learn.microsoft.com/en-us/azure/iot-edge/reference-iot-edge-for-linux-on https://learn.microsoft.com/en-us/azure/iot-edge/security,Securing Azure IoT Edge,Security framework for Azure IoT Edge,,"Learn about the security, authentication, and authorization standards used to develop Azure IoT Edge for you to consider in your solution design.","Applies to: IoT Edge 1.5 Important IoT Edge 1.5 LTS is the supported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, see Update IoT Edge. Azure IoT Edge addresses risks inherent in moving your data and analytics to the intelligent edge. IoT Edge security standards balance flexibility for different deployment scenarios with the protection customers expect from Azure services. IoT Edge runs on various makes and models of hardware, supports several o",2026-02-24T08:00:00.000Z,concept-article,,0.45,False,Security framework overview; high-level standards and concepts without detailed role names or config parameters.,unchanged https://learn.microsoft.com/en-us/azure/iot-edge/support,Supported platforms,IoT Edge supported platforms,Check supported platforms for Azure IoT Edge deployment,"Azure IoT Edge supported operating systems, runtimes, and container engines.","Applies to: IoT Edge 1.5 Important IoT Edge 1.5 LTS is the supported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, see Update IoT Edge.
This article explains what operating system platforms, IoT Edge runtimes, container engines, and components are supported by IoT Edge, whether generally available or in preview.",2026-03-02T23:28:00.000Z,concept-article,deployment,0.65,True,"A supported platforms page typically includes detailed matrices of operating systems, runtimes, and container engines by version and support status. These are product-specific deployment constraints and compatibility details that qualify as expert knowledge under deployment.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/troubleshoot,Diagnose IoT Edge devices,Troubleshoot Azure IoT Edge,Troubleshoot and diagnose Azure IoT Edge issues,"Use this article to learn standard diagnostic skills for Azure IoT Edge, like retrieving component status and logs","Applies to: IoT Edge 1.5 Important IoT Edge 1.5 LTS is the supported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, see Update IoT Edge. If you experience issues running Azure IoT Edge in your environment, use this article as a guide for troubleshooting and diagnostics.",2026-04-01T08:00:00.000Z,troubleshooting-general,troubleshooting,0.9,True,"Explicit troubleshooting article for Azure IoT Edge. Typically includes specific error messages, diagnostic commands, log locations, and symptom→cause→solution flows that are unique to IoT Edge and not broadly known, matching the troubleshooting criteria.",unchanged -https://learn.microsoft.com/en-us/azure/iot-edge/troubleshoot-common-errors,Resolve common errors,Troubleshoot Azure IoT Edge common errors,Diagnose and fix common Azure IoT Edge errors,"Resolve common issues in Azure IoT Edge solutions. Learn how to troubleshoot issues with provisioning, deployment, the IoT Edge runtime, and networking.","Applies to:IoT Edge 1.5 Important IoT Edge 1.5 LTS is thesupported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024.
If you're using an earlier release, seeUpdate IoT Edge. Use this article to identify and resolve common issues when using IoT Edge solutions. For information about how to find logs and errors from your IoT Edge device, seeTroubleshoot your IoT Edge device.",2026-04-14T22:21:00.000Z,troubleshooting-general,troubleshooting,0.9,True,"The page is explicitly a troubleshooting guide for Azure IoT Edge, organized around common issues and how to resolve them. These mappings of specific IoT Edge behaviors/logs to causes and fixes are product-specific troubleshooting knowledge that goes beyond generic debugging.",updated +https://learn.microsoft.com/en-us/azure/iot-edge/troubleshoot-common-errors,Resolve common errors,Troubleshoot Azure IoT Edge common errors,Diagnose and fix common Azure IoT Edge errors,"Resolve common issues in Azure IoT Edge solutions. Learn how to troubleshoot issues with provisioning, deployment, the IoT Edge runtime, and networking.","Applies to: IoT Edge 1.5 Important IoT Edge 1.5 LTS is the supported release. IoT Edge 1.4 LTS reached end of life on November 12, 2024. If you're using an earlier release, see Update IoT Edge. Use this article to identify and resolve common issues when using IoT Edge solutions. For information about how to find logs and errors from your IoT Edge device, see Troubleshoot your IoT Edge device.",2026-04-14T22:21:00.000Z,troubleshooting-general,troubleshooting,0.9,True,"The page is explicitly a troubleshooting guide for Azure IoT Edge, organized around common issues and how to resolve them.
These mappings of specific IoT Edge behaviors/logs to causes and fixes are product-specific troubleshooting knowledge that goes beyond generic debugging.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/troubleshoot-in-portal,Troubleshoot in the Azure portal,Troubleshoot Azure IoT Edge devices from the Azure portal,Troubleshoot Azure IoT Edge devices from the Azure portal,Use the troubleshooting page in the Azure portal to monitor IoT Edge devices and modules,"Applies to: IoT Edge 1.5 Important IoT Edge 1.5 LTS is the supported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, see Update IoT Edge. IoT Edge lets you monitor and troubleshoot modules in the Azure portal. The troubleshooting page wraps the IoT Edge agent's direct methods, letting you retrieve logs from deployed modules and restart them remotely. This article explains how to access and filter device and module logs in the Azure portal.",2025-05-09T22:02:00.000Z,concept-article,troubleshooting,0.7,True,"Explains portal troubleshooting page that wraps $edgeAgent direct methods; includes specific operations, log retrieval options, and restart actions tied to IoT Edge diagnostics.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/troubleshoot-iot-edge-for-linux-on-windows,Diagnose virtual machine,Troubleshoot your IoT Edge for Linux on Windows device,Troubleshoot Azure IoT Edge for Linux on Windows devices,Learn standard diagnostic skills for troubleshooting Azure IoT Edge for Linux on Windows (EFLOW) like retrieving component status and logs.,"Applies to: IoT Edge 1.5 Important IoT Edge 1.5 LTS is the supported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, see Update IoT Edge. If you run into issues with Azure IoT Edge for Linux on Windows (EFLOW), use this article to troubleshoot and diagnose the problem.
You can also check IoT Edge for Linux on Windows GitHub issues for a similar reported issue.",2025-07-22T22:09:00.000Z,troubleshooting-general,troubleshooting,0.85,True,"Dedicated troubleshooting guide with EFLOW-specific diagnostic commands, log locations, and symptom-to-solution mappings that are unique to this product.",unchanged https://learn.microsoft.com/en-us/azure/iot-edge/troubleshoot-iot-edge-for-linux-on-windows-common-errors,Resolve common errors,Troubleshoot Azure IoT Edge for Linux on Windows common issues,Resolve common Azure IoT Edge for Linux on Windows issues,Learn how to resolve common issues encountered when deploying an IoT Edge for Linux on Windows (EFLOW) solution.,"Applies to: IoT Edge 1.5 Important IoT Edge 1.5 LTS is the supported release. IoT Edge 1.4 LTS is end of life as of November 12, 2024. If you are on an earlier release, see Update IoT Edge. Use this article to fix common issues that can happen when you deploy IoT Edge for Linux on Windows solutions.",2026-02-26T08:00:00.000Z,troubleshooting-general,troubleshooting,0.8,True,"Lists common EFLOW deployment and runtime issues with their causes and fixes, including product-specific error patterns and resolutions.",unchanged diff --git a/products/azure-iot-edge/report.md b/products/azure-iot-edge/report.md index 0b54155c..56228919 100644 --- a/products/azure-iot-edge/report.md +++ b/products/azure-iot-edge/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: configuration: Configuring IoT Edge devices, networking, gateways, provisioning (DPS, X.509, TPM, symmetric keys), EFLOW/VM settings, storage, proxies, and built-in/custom
- limits-quotas: 'Azure IoT Edge service and resource limits: max modules, routes, - deployments, message sizes, throttling, and other scalability and quota constraints + limits-quotas: 'IoT Edge-specific limits and quotas: max modules, routes, deployments, + message sizes, resource constraints, and other scalability or capacity restrictions for edge solutions.' skill_description: Expert knowledge for Azure IoT Edge development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when provisioning IoT Edge/EFLOW, deploying modules via manifests/CI-CD, using DPS/X.509, - or designing gateway topologies, and other Azure IoT Edge related development tasks. - Not for Azure IoT Hub (use azure-iot-hub), Azure IoT Central (use azure-iot-central), + gateways, or nested edge, and other Azure IoT Edge related development tasks. Not + for Azure IoT Hub (use azure-iot-hub), Azure IoT Central (use azure-iot-central), Azure IoT Operations (use azure-iot-operations), Azure Stack Edge (use azure-stack-edge). use_when: Use when provisioning IoT Edge/EFLOW, deploying modules via manifests/CI-CD, - using DPS/X.509, or designing gateway topologies, and other Azure IoT Edge related - development tasks. + using DPS/X.509, gateways, or nested edge, and other Azure IoT Edge related development + tasks. confusable_not_for: Not for Azure IoT Hub (use azure-iot-hub), Azure IoT Central (use azure-iot-central), Azure IoT Operations (use azure-iot-operations), Azure Stack Edge (use azure-stack-edge). 
@@ -54,8 +54,8 @@ confusable_not_for: Not for Azure IoT Hub (use azure-iot-hub), Azure IoT Central ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 9 -- **Unchanged**: 89 +- **Updated Pages**: 1 +- **Unchanged**: 97 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-iot-edge/azure-iot-edge.csv` @@ -78,30 +78,14 @@ confusable_not_for: Not for Azure IoT Hub (use azure-iot-hub), Azure IoT Central ### Updated Pages -- [Resolve common errors](https://learn.microsoft.com/en-us/azure/iot-edge/troubleshoot-common-errors) - - Updated: 2026-03-03T23:37:00.000Z → 2026-04-14T22:21:00.000Z -- [Benefits](https://learn.microsoft.com/en-us/azure/iot-edge/iot-edge-for-linux-on-windows-benefits) - - Updated: 2025-08-21T05:12:00.000Z → 2026-04-13T22:10:00.000Z -- [Supported platforms](https://learn.microsoft.com/en-us/azure/iot-edge/iot-edge-for-linux-on-windows-support) - - Updated: 2024-11-19T23:02:00.000Z → 2026-04-13T22:10:00.000Z -- [About Azure IoT Edge](https://learn.microsoft.com/en-us/azure/iot-edge/about-iot-edge) - - Updated: 2026-02-20T08:00:00.000Z → 2026-04-13T22:10:00.000Z -- [Configure, connect, and verify a GPU](https://learn.microsoft.com/en-us/azure/iot-edge/configure-connect-verify-gpu) - - Updated: 2025-06-05T22:05:00.000Z → 2026-04-13T22:10:00.000Z - [Limits and restrictions](https://learn.microsoft.com/en-us/azure/iot-edge/iot-edge-limits-and-restrictions) - - Updated: 2025-07-11T22:35:00.000Z → 2026-04-13T22:10:00.000Z -- [Configure proxy support](https://learn.microsoft.com/en-us/azure/iot-edge/how-to-configure-proxy-support) - - Updated: 2025-05-09T08:00:00.000Z → 2026-04-13T22:10:00.000Z -- [Update IoT Edge](https://learn.microsoft.com/en-us/azure/iot-edge/how-to-update-iot-edge) - - Updated: 2026-03-03T23:37:00.000Z → 2026-04-13T08:00:00.000Z -- [Manage device certificates](https://learn.microsoft.com/en-us/azure/iot-edge/how-to-manage-device-certificates) - - Updated: 2026-03-23T08:00:00.000Z → 
2026-04-13T08:00:00.000Z + - Updated: 2026-04-13T22:10:00.000Z → 2026-04-23T06:20:00.000Z ## Classified Pages | TOC Title | Type | Confidence | Reason | |-----------|------|------------|--------| -| [Limits and restrictions](https://learn.microsoft.com/en-us/azure/iot-edge/iot-edge-limits-and-restrictions) | limits-quotas | 0.95 | Explicitly documents IoT Edge limits and restrictions; such pages typically include concrete numeric constraints (for example, module counts, message sizes, or resource limits) that qualify as expert knowledge under limits-quotas. | +| [Limits and restrictions](https://learn.microsoft.com/en-us/azure/iot-edge/iot-edge-limits-and-restrictions) | limits-quotas | 0.95 | The page is explicitly about IoT Edge limits and restrictions and will list concrete numerical constraints (for modules, routes, message sizes, etc.) that are version-specific and not reliably known from training data, matching the limits-quotas criteria. | | [Configure device settings](https://learn.microsoft.com/en-us/azure/iot-edge/configure-device) | configuration | 0.90 | Detailed reference of config.toml sections, option names, and allowed values for IoT Edge devices. | | [Diagnose IoT Edge devices](https://learn.microsoft.com/en-us/azure/iot-edge/troubleshoot) | troubleshooting | 0.90 | Explicit troubleshooting article for Azure IoT Edge. Typically includes specific error messages, diagnostic commands, log locations, and symptom→cause→solution flows that are unique to IoT Edge and not broadly known, matching the troubleshooting criteria. | | [Resolve common errors](https://learn.microsoft.com/en-us/azure/iot-edge/troubleshoot-common-errors) | troubleshooting | 0.90 | The page is explicitly a troubleshooting guide for Azure IoT Edge, organized around common issues and how to resolve them. These mappings of specific IoT Edge behaviors/logs to causes and fixes are product-specific troubleshooting knowledge that goes beyond generic debugging. 
| diff --git a/products/azure-iot-hub/azure-iot-hub.csv b/products/azure-iot-hub/azure-iot-hub.csv index a6400b73..7bc7a5c3 100644 --- a/products/azure-iot-hub/azure-iot-hub.csv +++ b/products/azure-iot-hub/azure-iot-hub.csv @@ -1,5 +1,5 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type -https://learn.microsoft.com/en-us/azure/iot-dps/about-iot-dps,What is IoT Hub Device Provisioning Service?,Overview of Azure IoT Hub Device Provisioning Service - Azure IoT Hub Device Provisioning Service,,Describes production scale device provisioning in Azure with the Device Provisioning Service (DPS) and IoT Hub,"The IoT Hub Device Provisioning Service (DPS) is a helper service for IoT Hub that enables zero-touch, just-in-time provisioning to the right IoT hub without requiring human intervention. In a cloud-based solution, DPS enables the provisioning of millions of devices in a secure and scalable manner. Many of the manual steps traditionally involved in provisioning are automated with DPS to reduce the time to deploy IoT devices and lower the risk of manual error.",2025-02-28T05:35:00.000Z,overview,,0.1,False,"High-level overview of DPS capabilities and concepts without numeric limits, configuration tables, or detailed patterns.",unchanged +https://learn.microsoft.com/en-us/azure/iot-dps/about-iot-dps,What is IoT Hub Device Provisioning Service?,Overview of Azure IoT Hub Device Provisioning Service - Azure IoT Hub Device Provisioning Service,,Describes production scale device provisioning in Azure with the Device Provisioning Service (DPS) and IoT Hub,"The IoT Hub Device Provisioning Service (DPS) is a helper service for IoT Hub that enables zero-touch, just-in-time provisioning to the right IoT hub without requiring human intervention. In a cloud-based solution, DPS enables the provisioning of millions of devices in a secure and scalable manner. 
Many of the manual steps traditionally involved in provisioning are automated with DPS to reduce the time to deploy IoT devices and lower the risk of manual error.",2026-01-22T19:36:00.000Z,overview,,0.1,False,"High-level conceptual overview of Azure IoT Hub Device Provisioning Service; no specific limits, configuration parameters, error codes, or decision matrices with quantified trade-offs.",updated https://learn.microsoft.com/en-us/azure/iot-dps/concepts-control-access-dps,Overview,Access control and security for Azure DPS - Azure IoT Hub Device Provisioning Service,,"Overview on controlling access to Azure IoT Hub Device Provisioning Service, links to articles on Microsoft Entra integration and SAS options.",This article describes the available options for securing your Azure IoT Hub Device Provisioning Service (DPS). The provisioning service uses authentication and permissions to grant access to each endpoint. Permissions allow the authentication process to limit access to a service instance based on functionality. There are two different ways for controlling access to DPS:,2023-10-11T22:21:00.000Z,concept-article,,0.35,False,"High-level overview of access control options; detailed token or RBAC specifics are in linked articles, not here.",unchanged https://learn.microsoft.com/en-us/azure/iot-dps/concepts-control-access-dps-azure-ad,Control access to DPS with Microsoft Entra ID (preview),Access control and security for DPS with Microsoft Entra ID - Azure IoT Hub Device Provisioning Service,Secure DPS APIs using Microsoft Entra ID and RBAC,Control access to Azure IoT Hub Device Provisioning Service (DPS) for back-end apps. Includes information about Microsoft Entra ID and RBAC.,"You can use Microsoft Entra ID to authenticate requests to Azure IoT Hub Device Provisioning Service (DPS) APIs, like create device identity and invoke direct method. You can also use Azure role-based access control (Azure RBAC) to authorize those same service APIs.
By using these technologies together, you can grant permissions to access Azure IoT Hub Device Provisioning Service (DPS) APIs to a Microsoft Entra security principal. This security principal could be a user, group, or application se",2023-10-11T22:21:00.000Z,concept-article,security,0.75,True,"Covers using Entra ID and Azure RBAC with DPS APIs, including specific role-based access patterns and scopes unique to DPS.",unchanged https://learn.microsoft.com/en-us/azure/iot-dps/concepts-custom-allocation,Custom allocation policies,Using custom allocation policies with Azure DPS - Azure IoT Hub Device Provisioning Service,,Understand how custom allocation policies enable provisioning to multiple IoT hubs with the Azure IoT Hub Device Provisioning Service (DPS),"Custom allocation policies give you more control over how devices are assigned to your IoT hubs. By using custom allocation policies, you can define your own allocation policies when the built-in policies provided by the Device Provisioning Service (DPS) don't meet the requirements of your scenario. For example, maybe you want to examine the certificate a device is using during provisioning and assign the device to an IoT hub based on a certificate property. Or, maybe you have information stored",2025-08-08T08:00:00.000Z,concept-article,,0.3,False,Explains custom allocation policies conceptually; lacks decision matrices with thresholds or detailed integration parameter tables.,unchanged @@ -11,7 +11,7 @@ https://learn.microsoft.com/en-us/azure/iot-dps/concepts-service,DPS terminology https://learn.microsoft.com/en-us/azure/iot-dps/concepts-symmetric-key-attestation,Symmetric key attestation,Symmetric key attestation with Azure DPS - Azure IoT Hub Device Provisioning Service,,This article provides a conceptual overview of symmetric key attestation using IoT Device Provisioning Service (DPS).,"This article describes the identity attestation process when using symmetric keys with the Device Provisioning Service. 
Symmetric key attestation is a simple approach to authenticating a device with a Device Provisioning Service instance. This attestation method represents a ""Hello world"" experience for developers who are new to device provisioning, or don't have strict security requirements. Device attestation using a TPM or an X.509 certificate is more secure, and should be used for more stringent",2025-08-07T08:00:00.000Z,concept-article,,0.05,False,"Conceptual overview of symmetric key attestation; no numeric limits, config tables, or security role details.",unchanged https://learn.microsoft.com/en-us/azure/iot-dps/concepts-tpm-attestation,TPM attestation,TPM Attestation with Azure DPS - Azure IoT Hub Device Provisioning Service,,This article provides a conceptual overview of the TPM attestation flow using IoT Device Provisioning Service (DPS).,"This article describes the concepts involved when provisioning devices using Trusted Platform Module (TPM) attestation in the Device Provisioning Service (DPS). This article is relevant to all personas involved in getting a device ready for deployment. A Trusted Platform Module (TPM) is a type of hardware security module (HSM). This article assumes that you're using a discrete, firmware, or integrated TPM. Software emulated TPMs are well-suited for prototyping or testing, but they don't provide ",2025-08-07T08:00:00.000Z,concept-article,,0.05,False,Conceptual TPM attestation overview; no expert-only numeric or configuration reference content.,unchanged https://learn.microsoft.com/en-us/azure/iot-dps/concepts-x509-attestation,X.509 certificate attestation,X.509 certificate attestation with Azure DPS - Azure IoT Hub Device Provisioning Service,,Describes concepts specific to using X.509 certificate attestation with Device Provisioning Service (DPS) and IoT Hub,"This article describes the concepts involved when provisioning devices using X.509 certificate attestation in the Device Provisioning Service (DPS).
This article is relevant to all personas involved in getting a device ready for deployment. X.509 certificates can be stored in a hardware security module (HSM). Tip We strongly recommend using an HSM with devices to securely store secrets, like the X.509 certificate, on your devices in production.",2025-11-03T22:11:00.000Z,concept-article,,0.05,False,Conceptual X.509 attestation article; high-level guidance without detailed configuration parameters or limits.,unchanged -https://learn.microsoft.com/en-us/azure/iot-dps/dps-faq,DPS FAQ,Azure IoT Hub Device Provisioning Service frequently asked questions (FAQ) - Azure IoT Hub Device Provisioning Service,,Find answer to common questions about Azure IoT Hub Device Provisioning Service.,This article answers to common questions about Azure IoT Hub Device Provisioning Service. The following articles are covered:,2025-12-18T23:12:00Z,faq,,0.3,False,"FAQ summary only; likely mixed conceptual and general Q&A without strong indication of detailed error codes, limits, or config tables.",unchanged +https://learn.microsoft.com/en-us/azure/iot-dps/dps-faq,DPS FAQ,Azure IoT Hub Device Provisioning Service frequently asked questions (FAQ) - Azure IoT Hub Device Provisioning Service,Azure IoT DPS FAQ with limits and behaviors,Find answers to common questions about Azure IoT Hub Device Provisioning Service.,This article answers common questions about Azure IoT Hub Device Provisioning Service. The following articles are covered:,2026-04-24T22:19:00Z,faq,limits-quotas,0.7,True,"FAQ pages for Azure services typically include concrete, product-specific details such as maximum enrollments per DPS instance, rate limits, throttling behavior, supported regions, and other numeric constraints or timeouts that are not obvious from general training data.
These align with the limits-quotas category because they expose exact values and behaviors (for example, how many registrations per second, how many linked IoT hubs, or other numeric constraints) that an LLM would not reliably know without this documentation.",updated https://learn.microsoft.com/en-us/azure/iot-dps/how-to-control-access,Control access to DPS with SAS,Access control and security for DPS with security tokens - Azure IoT Hub Device Provisioning Service,Configure DPS access control with SAS tokens,Control access to Azure IoT Hub Device Provisioning Service (DPS) for backend apps by using shared access signatures and security tokens.,This article describes the available options for securing your Azure IoT Hub Device Provisioning Service (DPS). The provisioning service uses authentication and permissions to grant access to each endpoint. Permissions allow the authentication process to limit access to a service instance based on functionality. This article discusses: The authentication process and the tokens the provisioning service uses to verify permissions against both the Service and Device REST APIs. The different permissions ,2025-08-07T08:00:00.000Z,concept-article,security,0.7,True,"Describes DPS authentication with SAS/security tokens, including specific permissions for Service and Device REST APIs; this is product-specific security configuration.",unchanged https://learn.microsoft.com/en-us/azure/iot-dps/how-to-legacy-device-symm-key,Provision devices with symmetric keys,Tutorial - Provision devices using a symmetric key enrollment group in DPS - Azure IoT Hub Device Provisioning Service,,This tutorial shows how to use symmetric keys to provision devices through an enrollment group in your Device Provisioning Service (DPS) instance.,"This tutorial shows how to securely provision multiple simulated symmetric key devices to a single IoT Hub using an enrollment group.
The Azure IoT Hub Device Provisioning Service supports two types of enrollments for provisioning devices: The Azure IoT Hub Device Provisioning Service supports three forms of authentication for provisioning devices: Some devices might not have a certificate, TPM, or any other security feature that can be used to securely identify the device. For such devices, the",2025-08-07T08:00:00.000Z,tutorial,,0.3,False,Tutorial for symmetric key enrollment groups; mostly how-to steps without detailed security role mappings or config tables.,unchanged https://learn.microsoft.com/en-us/azure/iot-dps/how-to-manage-enrollments,Manage enrollments,Manage device enrollments in the Azure portal - Azure IoT Hub Device Provisioning Service,Manage DPS device enrollments in Azure portal,How to manage group and individual device enrollments for your Device Provisioning Service (DPS) in the Azure portal.,"A device enrollment creates a record of a single device or a group of devices that can at some point register with the Azure IoT Hub Device Provisioning Service (DPS). The enrollment record contains the initial configuration for the device as part of that enrollment. Included in the configuration is either the IoT hub to which a device is assigned, or an allocation policy that applies to a set of IoT hubs. This article shows you how to manage device enrollments for your provisioning service.
The D",2025-08-11T08:00:00.000Z,how-to,configuration,0.7,True,Portal how-to for managing individual and group enrollments; includes DPS enrollment configuration options and fields.,unchanged @@ -31,7 +31,7 @@ https://learn.microsoft.com/en-us/azure/iot-dps/iot-dps-https-x509-support,Use H https://learn.microsoft.com/en-us/azure/iot-dps/iot-dps-ip-filtering,Configure IP filtering,Microsoft Azure IoT DPS IP connection filters - Azure IoT Hub Device Provisioning Service,Configure IP filtering rules for Azure IoT DPS,How to use IP filtering to block connections from specific IP addresses to your Azure IoT DPS instance.,Security is an important aspect of any IoT solution. Sometimes you need to explicitly specify the IP addresses from which devices can connect as part of your security configuration. The IP filter feature for an Azure IoT Hub Device Provisioning Service (DPS) enables you to configure rules for rejecting or accepting traffic from specific IPv4 addresses.,2025-11-04T23:10:00.000Z,how-to,security,0.8,True,"IP filter feature is a security control; such pages typically list rule formats, priority behavior, allow/deny semantics, and configuration parameters unique to DPS, matching product-specific security settings.",unchanged https://learn.microsoft.com/en-us/azure/iot-dps/iot-dps-understand-ip-address,Understanding DPS IP addresses,Understanding the IP address of your DPS instance - Azure IoT Hub Device Provisioning Service,Query and manage DPS instance IP address properties,Query your DPS IP address and its properties. The IP address of your DPS instance can change during scenarios like disaster recovery or regional failover.,The IP address prefixes for the public endpoints of an IoT Hub Device Provisioning Service (DPS) are published periodically under the AzureIoTHub service tag.
You can use these IP address prefixes to control connectivity between an IoT DPS instance and devices or network assets to implement various network isolation goals:,2025-11-04T23:10:00.000Z,concept-article,configuration,0.65,True,Explains how to query DPS IP address and use Azure service tags; likely includes specific properties and configuration steps for network isolation.,unchanged https://learn.microsoft.com/en-us/azure/iot-dps/iot-mqtt-connect-to-iot-dps,MQTT support,Use MQTT to communicate with Azure IoT DPS - Azure IoT Device Provisioning Service,Connect devices to Azure IoT DPS over MQTT,Guidance on using the MQTT protocol to connect a device to the Azure IoT Device Provisioning Service device-facing endpoint.,"This article describes how devices can use the MQTT protocol to communicate with the Azure IoT Device Provisioning Service (DPS) device endpoint. The DPS device endpoint supports device connectivity using: DPS isn't a full-featured MQTT broker and doesn't support all the behaviors specified in the MQTT v3.1.1 standard. All device communication with DPS must be secured using TLS. Therefore, DPS doesn't support insecure MQTT connections over port 1883. Note DPS currently doesn't support devices us",2026-03-30T17:14:00.000Z,how-to,integrations,0.78,True,"The page gives product-specific MQTT connection details for Azure IoT DPS, including supported endpoints, TLS-only ports (no insecure 1883), protocol behaviors that differ from the MQTT 3.1.1 standard, and DPS-specific topic/connection requirements. 
These are concrete integration and protocol-usage details unique to this service, not just conceptual MQTT guidance.",unchanged -https://learn.microsoft.com/en-us/azure/iot-dps/libraries-sdks,Libraries and SDKs,IoT Hub Device Provisioning Service libraries and SDKs - Azure IoT Hub Device Provisioning Service,,Information about the device and service libraries available for developing solutions with Device Provisioning Service (CPS).,"The Microsoft SDKs for IoT Hub Device Provisioning Service (DPS) help you build device and backend applications that provision IoT devices to one or more IoT hubs. The SDKs handle the underlying transport and security protocols between your devices or backend apps and DPS, freeing you to focus on application development. By using the SDKs, you get support for future updates to DPS, including security updates. This article describes the three categories of SDKs, lists the DPS SDKs published in po",2026-04-15T22:11:00.000Z,reference,,0.2,False,"Primarily a catalog/overview of available SDKs; no detailed configuration tables, limits, or product-specific troubleshooting content.",updated +https://learn.microsoft.com/en-us/azure/iot-dps/libraries-sdks,Libraries and SDKs,IoT Hub Device Provisioning Service libraries and SDKs - Azure IoT Hub Device Provisioning Service,,Information about the device and service libraries available for developing solutions with Device Provisioning Service (DPS).,"The Microsoft SDKs for IoT Hub Device Provisioning Service (DPS) help you build device and backend applications that provision IoT devices to one or more IoT hubs. The SDKs handle the underlying transport and security protocols between your devices or backend apps and DPS, freeing you to focus on application development. By using the SDKs, you get support for future updates to DPS, including security updates.
This article describes the three categories of SDKs, lists the DPS SDKs published in po",2026-04-15T22:11:00.000Z,reference,,0.2,False,"Primarily a catalog/overview of available SDKs; no detailed configuration tables, limits, or product-specific troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/iot-dps/monitor-iot-dps,Monitor Device Provisioning Service,Monitor Azure IoT Hub Device Provisioning Service - Azure IoT Hub Device Provisioning Service,,Start here to learn how to monitor metrics and logs from the Azure IoT Hub Device Provisioning Service by using Azure Monitor.,"This article describes: Note If you're already familiar with this service and/or Azure Monitor and just want to know how to analyze monitoring data, see theAnalyzesection near the end of this article. When you have critical applications and business processes that rely on Azure resources, you need to monitor and get alerts for your system. The Azure Monitor service collects and aggregates metrics and logs from every component of your system. Azure Monitor provides you with a view of availability",2025-08-04T08:00:00.000Z,concept-article,,0.4,False,"High-level monitoring how-to; summary doesn’t show detailed metric/diagnostic tables, thresholds, or product-specific troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/iot-dps/monitor-iot-dps-reference,Monitoring data reference,Monitoring Device Provisioning Service data reference - Azure IoT Hub Device Provisioning Service,Reference for Azure DPS monitoring metrics and logs,This article contains important reference material you need when you monitor Azure IoT Hub Device Provisioning Service.,This article contains all the monitoring reference information for this service. 
SeeMonitor Azure IoT Hub Device Provisioning Servicefor details on the data you can collect for IoT Hub Device Provisioning Service and how to use it.,2024-08-09T05:35:00.000Z,reference,configuration,0.8,True,"Monitoring reference pages typically contain exhaustive tables of metrics, dimensions, categories, and log schemas with exact names and meanings—product-specific configuration/telemetry details.",unchanged https://learn.microsoft.com/en-us/azure/iot-dps/public-network-access,Managing public network access,Manage public network access for DPS - Azure IoT Hub Device Provisioning Service,Manage public network access and private endpoints for DPS,Documentation on how to disable and enable public network access for Azure IoT Device Provisioning Service (DPS),"To restrict access toa private endpoint for DPS in your virtual network, disable public network access. To do so, use the Azure portal or thepublicNetworkAccessAPI. You can also allow public access by using the portal or thepublicNetworkAccessAPI.",2025-11-04T23:10:00.000Z,how-to,security,0.75,True,Covers enabling/disabling public network access and using the publicNetworkAccess API; involves specific security-related configuration flags and behaviors for DPS networking.,unchanged @@ -102,25 +102,25 @@ https://learn.microsoft.com/en-us/azure/iot-hub-device-update/update-manifest,Up https://learn.microsoft.com/en-us/azure/iot-hub/authenticate-authorize-azure-ad,Microsoft Entra ID,Control access with Microsoft Entra ID - Azure IoT Hub,Secure IoT Hub APIs with Microsoft Entra ID and RBAC,Understand how Azure IoT Hub uses Microsoft Entra ID to authenticate identities and authorize access to IoT hubs and devices.,"You can use Microsoft Entra ID to authenticate requests to Azure IoT Hub service APIs, likecreate device identityandinvoke direct method. You can also use Azure role-based access control (Azure RBAC) to authorize those same service APIs. 
By using these technologies together, you can grant permissions to access IoT Hub service APIs to a Microsoft Entra security principal. This security principal could be a user, group, or application service principal. Authenticating access by using Microsoft Ent",2025-03-28T08:00:00.000Z,conceptual,security,0.8,True,"Details IoT Hub’s Entra ID auth flows and Azure RBAC roles/scopes for service APIs, which are concrete security configuration patterns.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/authenticate-authorize-sas,Shared access signatures,Control access with shared access signatures - Azure IoT Hub,Control IoT Hub access with SAS tokens,Understand how Azure IoT Hub uses shared access signatures (SAS) to authenticate identities and authorize access to IoT hubs and devices.,"IoT Hub uses shared access signature (SAS) tokens to authenticate devices and services to avoid sending keys on the wire. You use SAS tokens to grant time-bounded access to devices and services to specific functionality in IoT Hub. To get authorization to connect to IoT Hub, devices and services must send SAS tokens signed with either a shared access or symmetric key. Symmetric keys are stored with a device identity in the identity registry. This article introduces: Note Some of the features men",2025-03-20T08:00:00.000Z,concept-article,security,0.8,True,"Covers SAS token formats, scopes, and key usage specific to IoT Hub devices and services, which are detailed security/auth configuration mechanics.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/authenticate-authorize-x509,Third-party X.509 certificates,Authenticate with X.509 certificates - Azure IoT Hub,Authenticate IoT Hub devices with X.509 certificates,Understand how Azure IoT Hub uses X.509 certificates to authenticate IoT hubs and devices.,IoT Hub uses X.509 certificates to authenticate devices. 
X.509 authentication allows authentication of an IoT device as part of the Transport Layer Security (TLS) standard connection establishment. This article describes how to use third-party managed X.509 CA certificates to authenticate devices connecting to IoT Hub.,2025-02-02T12:10:00.000Z,conceptual,security,0.75,True,"Describes IoT Hub-specific use of CA-signed X.509 certs for device auth, including certificate handling patterns unique to the service.",unchanged -https://learn.microsoft.com/en-us/azure/iot-hub/concept-certificate-issuance,Issuance of device certificates,Certificate Issuance in Azure IoT Hub Certificate Management (Preview) - Azure IoT Hub,,"Learn how Azure Device Registry issues X.509 certificates to IoT devices during provisioning, including the certificate hierarchy, issuance flow, and how IoT Hub trusts issued certificates.","Device certificate issuance is the process by which devices request and receive a certificate as part of provisioning. Azure Device Registry (ADR) generates and issues a new X.509 certificate to your IoT devices during provisioning. This article explains the responsibility of a device to send a Certificate Signing Request (CSR), how ADR and Device Provisioning Service (DPS) work together to issue certificates at scale, and how IoT Hub trusts the issued certificates. 
Important Azure IoT Hub with ",2026-04-15T22:11:00.000Z,concept-article,,0.4,False,"Conceptual explanation of certificate issuance flow and responsibilities; likely focuses on process and roles rather than concrete configuration parameters, limits, or error mappings.",new -https://learn.microsoft.com/en-us/azure/iot-hub/concept-certificate-renewal,Renewal of device certificates,Certificate Renewal in Azure IoT Hub Certificate Management (Preview) - Azure IoT Hub,Plan and execute IoT certificate renewal with ADR,"Learn how certificate renewal works in Azure IoT Hub certificate management, including when to renew, how the renewal flow works, and how devices can track certificate expiration.","Certificate renewal is the process where an already provisioned device requests and receives a new operational certificate for its existing device identity. Renewal is used to replace certificates that are nearing expiration or have expired. The device remains provisioned with the same identity; only the operational certificate is updated. Each IoT device must track certificate expiration and initiate a renewal before expiration. 
This article explains when to renew, the available renewal paths, ",2026-04-15T22:11:00.000Z,concept-article,best-practices,0.65,True,"Explains when to renew, available renewal paths, and how devices track expiration; this is lifecycle guidance with product-specific recommendations and edge cases around renewal timing and flows.",new -https://learn.microsoft.com/en-us/azure/iot-hub/concepts-certificate-policy-management,Certificate revocation and policy management,Certificate Revocation and Policy Management (Preview) - Azure IoT Hub,Manage certificate revocation and policies for IoT Hub,"This article discusses the concepts of revoking leaf certificates, revoking policies, deleting policies, and deleting credential resources in Azure IoT Hub certificate management.","Certificate management in Azure IoT Hub enables you to issue, manage, and revoke X.509 certificates throughout their lifecycle. This article introduces the key concepts related to revoking device certificates, revoking policies, deleting policies, and deleting credential resources. These operations are part of a coordinated lifecycle management strategy that helps you maintain your security posture when certificates expire, devices are decommissioned, or business requirements change. 
Important A",2026-04-15T22:11:00.000Z,concept-article,security,0.65,True,"Focuses on revoking device certificates, revoking policies, and deleting credential resources; these are security lifecycle operations with product-specific behaviors and implications for trust and access.",new +https://learn.microsoft.com/en-us/azure/iot-hub/concept-certificate-issuance,Issuance of device certificates,Certificate Issuance in Azure IoT Hub Certificate Management (Preview) - Azure IoT Hub,,"Learn how Azure Device Registry issues X.509 certificates to IoT devices during provisioning, including the certificate hierarchy, issuance flow, and how IoT Hub trusts issued certificates.","Device certificate issuance is the process by which devices request and receive a certificate as part of provisioning. Azure Device Registry (ADR) generates and issues a new X.509 certificate to your IoT devices during provisioning. This article explains the responsibility of a device to send a Certificate Signing Request (CSR), how ADR and Device Provisioning Service (DPS) work together to issue certificates at scale, and how IoT Hub trusts the issued certificates. 
Important Azure IoT Hub with ",2026-04-15T22:11:00.000Z,concept-article,,0.4,False,"Conceptual explanation of certificate issuance flow and responsibilities; likely focuses on process and roles rather than concrete configuration parameters, limits, or error mappings.",unchanged +https://learn.microsoft.com/en-us/azure/iot-hub/concept-certificate-renewal,Renewal of device certificates,Certificate Renewal in Azure IoT Hub Certificate Management (Preview) - Azure IoT Hub,Plan and execute IoT certificate renewal with ADR,"Learn how certificate renewal works in Azure IoT Hub certificate management, including when to renew, how the renewal flow works, and how devices can track certificate expiration.","Certificate renewal is the process where an already provisioned device requests and receives a new operational certificate for its existing device identity. Renewal is used to replace certificates that are nearing expiration or have expired. The device remains provisioned with the same identity; only the operational certificate is updated. Each IoT device must track certificate expiration and initiate a renewal before expiration. 
This article explains when to renew, the available renewal paths, ",2026-04-15T22:11:00.000Z,concept-article,best-practices,0.65,True,"Explains when to renew, available renewal paths, and how devices track expiration; this is lifecycle guidance with product-specific recommendations and edge cases around renewal timing and flows.",unchanged +https://learn.microsoft.com/en-us/azure/iot-hub/concepts-certificate-policy-management,Certificate revocation and policy management,Certificate Revocation and Policy Management (Preview) - Azure IoT Hub,Manage certificate revocation and policies for IoT Hub,"This article discusses the concepts of revoking leaf certificates, revoking policies, deleting policies, and deleting credential resources in Azure IoT Hub certificate management.","Certificate management in Azure IoT Hub enables you to issue, manage, and revoke X.509 certificates throughout their lifecycle. This article introduces the key concepts related to revoking device certificates, revoking policies, deleting policies, and deleting credential resources. These operations are part of a coordinated lifecycle management strategy that helps you maintain your security posture when certificates expire, devices are decommissioned, or business requirements change. 
Important A",2026-04-15T22:11:00.000Z,concept-article,security,0.65,True,"Focuses on revoking device certificates, revoking policies, and deleting credential resources; these are security lifecycle operations with product-specific behaviors and implications for trust and access.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/concepts-manage-device-reconnections,Manage device reconnections,Manage device reconnections to create resilient applications - Azure IoT,Design resilient Azure IoT Hub device reconnection,Manage the device connection and reconnection process to ensure resilient applications by using the Azure IoT Hub device SDKs.,This article provides high-level guidance to help you design resilient applications by adding a device reconnection strategy. It explains why devices disconnect and need to reconnect. And it describes specific strategies that developers can use to reconnect devices that have been disconnected.,2026-03-30T17:14:00.000Z,concept-article,best-practices,0.68,True,"The article provides concrete, product-specific guidance on how to implement reconnection strategies with Azure IoT Hub device SDKs (for example, handling transient disconnects, retry patterns, and SDK-specific behaviors). This is actionable DO/DON'T style guidance tied to IoT Hub’s connection model rather than generic networking advice, so it fits best under best-practices.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/create-connect-device,Create and connect a device,Register and connect an IoT device - Azure IoT Hub,Manage IoT Hub device identities and connection strings,"How to create, manage, and delete Azure IoT devices and how to retrieve the device connection string.","Create a device identity for your device to connect to Azure IoT Hub. 
This article introduces key tasks for managing a device identity including registering the device, collecting its connection information, and then deleting or disabling a device at the end of its lifecycle.",2025-05-20T08:00:00.000Z,how-to,security,0.65,True,"Covers creating, disabling, and deleting device identities and retrieving connection strings—core identity and access management operations specific to IoT Hub.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/create-hub,Create an IoT hub (no ADR integration),Create an Azure IoT hub - Azure IoT Hub,,"How to create, manage, and delete Azure IoT hubs through the Azure portal, CLI, and PowerShell. Includes information about retrieving the service connection string.","This article explains how to create an IoT hub without Azure Device Registry (ADR) and certificate management integration. If you want to create an IoT hub integrated with these preview features, see Get started with ADR and certificate management in IoT Hub (Preview).",2025-06-25T08:00:00.000Z,how-to,,0.45,False,How-to create an IoT hub via portal/CLI/PowerShell; usually step-by-step without comprehensive config parameter tables or limits; more basic provisioning tutorial.,unchanged https://learn.microsoft.com/en-us/azure/iot-hub/how-to-cloud-to-device-messaging,Send cloud-to-device messages,Send cloud-to-device messages - Azure IoT Hub,Send and receive IoT Hub cloud-to-device messages with SDKs,"How to send cloud-to-device messages from a back-end app and receive them on a device app using the Azure IoT SDKs for C#, Python, Java, and Node.js.","Azure IoT Hub is a fully managed service that enables bi-directional communications, including cloud-to-device (C2D) messages from solution back ends to millions of devices. This article describes how to use the Azure IoT SDKs to build the following types of applications: Device applications that receive and handle cloud-to-device messages from an IoT Hub messaging queue. 
Back-end applications that send cloud-to-device messages to a single device through an IoT Hub messaging queue. This article ",2025-01-06T08:00:00.000Z,how-to,integrations,0.7,True,Shows device and backend apps using IoT SDKs for C2D messaging; includes API usage and parameters specific to IoT Hub messaging queues.,unchanged https://learn.microsoft.com/en-us/azure/iot-hub/how-to-collect-device-logs,Capture trace logs,Collect device debug logs - Azure IoT Hub,Collect device debug logs using IoT SDKs,"To troubleshoot device issues, it's sometimes useful to collect low-level debug logs from the devices. This article shows how to use the device SDKs to generate debug logs.","To troubleshoot device issues, it's sometimes useful to collect low-level debug logs from the devices. This article shows how to capture debug logs from the device SDKs. The steps outlined in this article assume you have either direct or remote access to the device. Caution If you're sharing logs with a support engineer or adding them to a GitHub issue, be sure to remove any confidential information such as connection strings.",2025-08-13T08:00:00.000Z,how-to,troubleshooting,0.7,True,"Shows how to enable and capture low-level logs in IoT device SDKs with product-specific logging options, used for troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/iot-hub/how-to-configure-credential,Configure a credential,Configure a credential in Azure Device Registry - Azure IoT Hub,Configure ADR credential for Microsoft-backed certificates,Configure a credential in your Azure Device Registry namespace to enable Microsoft-backed X.509 certificate management for IoT devices.,"When you enable Microsoft-backed X.509 certificate management in your Azure Device Registry (ADR) namespace, you create a single credential resource within that ADR namespace. A credential manages one unique root CA within your own cloud PKI. 
Important Azure IoT Hub with ADR integration and Microsoft-backed X.509 certificate management is in public preview and isn't recommended for production workloads. For more information, see the FAQ: What is new in IoT Hub? Microsoft manages the PKI and root CA for",2026-04-15T22:11:00.000Z,how-to,configuration,0.8,True,"How-to for configuring a credential resource in ADR; likely includes specific resource names, fields, and allowed values for enabling Microsoft-backed X.509 management, which are product-specific configuration details.",new -https://learn.microsoft.com/en-us/azure/iot-hub/how-to-create-policy,Create a policy with a Microsoft root CA,Create or Edit a Policy with Microsoft Root CA in Azure Device Registry - Azure IoT Hub,Create or edit Microsoft-root CA policy in ADR,Create or edit a policy in your Azure Device Registry namespace to issue Microsoft-backed X.509 certificates for IoT devices.,"This article explains how to create or edit a policy within your Azure Device Registry (ADR) namespace to manage an issuing CA signed by your namespace's unique root CA. Use this workflow if you want ADR to provide a fully managed public key infrastructure (PKI) for your namespace. When a device requests a certificate, the platform returns a full certificate chain consisting of: The device certificate: Unique to the specific IoT device. 
The issuing CA (ICA): The CA managed by ADR that signs the device ",2026-04-15T22:11:00.000Z,how-to,configuration,0.8,True,"Describes creating/editing a policy for issuing CAs signed by a Microsoft-managed root; such articles typically include policy parameter names, structures, and required settings that are detailed configuration knowledge.",new -https://learn.microsoft.com/en-us/azure/iot-hub/how-to-create-policy-external-certificate,Create a policy with an external root CA,Create or Edit a Policy with an External Root CA in Azure Device Registry - Azure IoT Hub,Configure ADR policy with external root CA,Create or edit an external CA policy in Azure Device Registry so you can use your external CA to issue IoT device certificates.,"This article explains how to create or edit a policy within your Azure Device Registry namespace to manage an issuing CA that is signed by an external root CA. Use this workflow if your organization maintains a private Public Key Infrastructure (PKI) and requires all IoT devices to chain up to a common trusted root. When a device requests a certificate via Device Registry, the platform returns a full certificate chain consisting of: The device certificate: Unique to the specific IoT device. 
The Micro",2026-04-15T22:11:00.000Z,how-to,configuration,0.8,True,"Covers creating/editing a policy using an external root CA; expected to define specific configuration fields, certificate chain requirements, and parameter values unique to ADR integration.",new +https://learn.microsoft.com/en-us/azure/iot-hub/how-to-configure-credential,Configure a credential,Configure a credential in Azure Device Registry - Azure IoT Hub,Configure ADR credential for Microsoft-backed certificates,Configure a credential in your Azure Device Registry namespace to enable Microsoft-backed X.509 certificate management for IoT devices.,"When you enable Microsoft-backed X.509 certificate management in your Azure Device Registry (ADR) namespace, you create a single credential resource within that ADR namespace. A credential manages one unique root CA within your own cloud PKI. Important Azure IoT Hub with ADR integration and Microsoft-backed X.509 certificate management is in public preview and isn't recommended for production workloads. For more information, see the FAQ: What is new in IoT Hub? Microsoft manages the PKI and root CA for",2026-04-15T22:11:00.000Z,how-to,configuration,0.8,True,"How-to for configuring a credential resource in ADR; likely includes specific resource names, fields, and allowed values for enabling Microsoft-backed X.509 management, which are product-specific configuration details.",unchanged +https://learn.microsoft.com/en-us/azure/iot-hub/how-to-create-policy,Create a policy with a Microsoft root CA,Create or Edit a Policy with Microsoft Root CA in Azure Device Registry - Azure IoT Hub,Create or edit Microsoft-root CA policy in ADR,Create or edit a policy in your Azure Device Registry namespace to issue Microsoft-backed X.509 certificates for IoT devices.,"This article explains how to create or edit a policy within your Azure Device Registry (ADR) namespace to manage an issuing CA signed by your namespace's unique root CA. 
Use this workflow if you want ADR to provide a fully managed public key infrastructure (PKI) for your namespace. When a device requests a certificate, the platform returns a full certificate chain consisting of: The device certificate: Unique to the specific IoT device. The issuing CA (ICA): The CA managed by ADR that signs the device ",2026-04-15T22:11:00.000Z,how-to,configuration,0.8,True,"Describes creating/editing a policy for issuing CAs signed by a Microsoft-managed root; such articles typically include policy parameter names, structures, and required settings that are detailed configuration knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/iot-hub/how-to-create-policy-external-certificate,Create a policy with an external root CA,Create or Edit a Policy with an External Root CA in Azure Device Registry - Azure IoT Hub,Configure ADR policy with external root CA,Create or edit an external CA policy in Azure Device Registry so you can use your external CA to issue IoT device certificates.,"This article explains how to create or edit a policy within your Azure Device Registry namespace to manage an issuing CA that is signed by an external root CA. Use this workflow if your organization maintains a private Public Key Infrastructure (PKI) and requires all IoT devices to chain up to a common trusted root. When a device requests a certificate via Device Registry, the platform returns a full certificate chain consisting of: The device certificate: Unique to the specific IoT device. 
The Micro",2026-04-15T22:11:00.000Z,how-to,configuration,0.8,True,"Covers creating/editing a policy using an external root CA; expected to define specific configuration fields, certificate chain requirements, and parameter values unique to ADR integration.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/how-to-device-management,Get started with device management,Device management using direct methods - Azure IoT Hub,Implement device management actions using IoT Hub direct methods,How to use Azure IoT Hub direct methods for device management tasks including invoking a remote device reboot.,"Back-end apps can use Azure IoT Hub primitives, such as device twins and direct methods, to remotely start and monitor device management actions on devices. Use a direct method from a back-end application to initiate device management actions, such as reboot, factory reset, and firmware update. The device is responsible for: This article shows you how a back-end app and a device app can work together to initiate and monitor a remote device action using a direct method. Note The features described i",2025-01-07T05:32:00.000Z,how-to,integrations,0.7,True,"Shows backend and device app working together with direct methods; includes method names, payload schemas, and call patterns specific to IoT Hub device management.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/how-to-device-twins,Get started with device twins,Get started with Azure IoT Hub device twins - Azure IoT Hub,Use IoT Hub device and service SDKs with device twins,How to use the Azure IoT SDKs to create device and backend service application code for device twins.,"Use the Azure IoT Hub device SDK and service SDK to develop applications that handle common device twin tasks. Device twins are JSON documents that store device state information including metadata, configurations, and conditions. IoT Hub persists a device twin for each device that connects to it. 
You can use device twins to: For more information about device twins, including when to use device twins, seeUnderstand and use device twins in IoT Hub. Note The features described in this article are ",2025-01-02T23:04:00.000Z,how-to,integrations,0.75,True,Shows how to implement device and backend code for twins; includes SDK APIs and patterns unique to IoT Hub device twin integration.,unchanged https://learn.microsoft.com/en-us/azure/iot-hub/how-to-disable-dr,Disable disaster recovery,Disable Disaster Recovery in Azure IoT Hub - Azure IoT Hub,Decide when to disable IoT Hub disaster recovery,Learn how to disable disaster recovery failover for Azure IoT Hub in specific regions by using the Azure portal to manage data replication settings.,"Azure IoT Hub provides Microsoft-initiated failover and manual failover by replicating data to apaired regionfor each IoT hub. For some regions, you can avoid data replication outside of the region by disabling disaster recovery (DR) when you create an IoT hub.",2025-05-30T17:07:00.000Z,how-to,decision-making,0.65,True,"Explains disabling DR and its regional replication implications, guiding decisions about data residency vs. failover for specific regions.",unchanged -https://learn.microsoft.com/en-us/azure/iot-hub/how-to-disable-enable-device,Disable or enable a device,Disable or Enable a Device in Azure Device Registry - Azure IoT Hub,Disable or enable IoT devices in Azure Device Registry,Disable or enable a device in Azure Device Registry so you can pause or resume device activity in Azure IoT Hub preview deployments.,"Use Azure Device Registry to disable a device when you need to stop device activity without deleting the device resource. Enable the device again when you're ready to return it to service. Important Azure IoT Hub with ADR integration and Microsoft-backed X.509 certificate management is inpublic previewand isn't recommended for production workloads. For more information, see theFAQ: What is new in IoT Hub? 
Disabling a device might interrupt active operations, data collection, or dependent workflo",2026-04-15T22:11:00.000Z,how-to,configuration,0.7,True,Operational guide for toggling device state in ADR; likely includes specific API/portal fields and state values that control device behavior in IoT Hub preview deployments.,new +https://learn.microsoft.com/en-us/azure/iot-hub/how-to-disable-enable-device,Disable or enable a device,Disable or Enable a Device in Azure Device Registry - Azure IoT Hub,Disable or enable IoT devices in Azure Device Registry,Disable or enable a device in Azure Device Registry so you can pause or resume device activity in Azure IoT Hub preview deployments.,"Use Azure Device Registry to disable a device when you need to stop device activity without deleting the device resource. Enable the device again when you're ready to return it to service. Important Azure IoT Hub with ADR integration and Microsoft-backed X.509 certificate management is inpublic previewand isn't recommended for production workloads. For more information, see theFAQ: What is new in IoT Hub? Disabling a device might interrupt active operations, data collection, or dependent workflo",2026-04-15T22:11:00.000Z,how-to,configuration,0.7,True,Operational guide for toggling device state in ADR; likely includes specific API/portal fields and state values that control device behavior in IoT Hub preview deployments.,unchanged https://learn.microsoft.com/en-us/azure/iot-hub/how-to-file-upload,Upload files from devices,Upload files from your device to the cloud with Azure IoT Hub - Azure IoT Hub,Upload device files to cloud using IoT Hub SDKs,"How to upload files from a device to the cloud using the Azure IoT SDKs for C#, Python, Java, and Node.js.","This article demonstrates how to: In some scenarios, you can't easily map the data your devices send into the relatively small device-to-cloud messages that IoT Hub accepts. 
The file upload capabilities in IoT Hub enable you to move large or complex data to the cloud. For example: These files are typically batch processed in the cloud, using tools such asAzure Data Factoryor theHadoopstack. When you need to upload files from a device, you can still use the security and reliability of IoT Hub. Th",2025-11-11T08:00:00.000Z,how-to,integrations,0.7,True,"Shows SDK usage in C#, Python, Java, Node.js for file upload; includes API calls and parameters specific to IoT Hub file upload integration.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/how-to-module-twins,Get started with module twins,Get started with module identity and module identity twins - Azure IoT Hub,Use module identities and twins with IoT Hub,Learn how to create module identities and update module identity twins using the Azure IoT Hub SDKs.,"Module identities and module identity twins are similar to Azure IoT Hub device identities and device twins, but provide finer granularity. While Azure IoT Hub device identities and device twins enable the back-end application to configure a device and provide visibility on the device's conditions, a module identity and module identity twin provide these capabilities for individual components of a device. 
On capable devices with multiple components, such as operating system devices or firmware d",2025-01-05T12:12:00.000Z,how-to,integrations,0.65,True,"How-to for creating and updating module identities and module twins using specific IoT Hub SDK APIs and patterns, which are product-specific integration details beyond generic LLM knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/iot-hub/how-to-revoke-certificate-delete-policy,Revoke certificates and delete policies,Revoke Certificates and Delete Policies in Azure Device Registry - Azure IoT Hub,Revoke device certificates and clean up ADR policies,"Learn how to revoke device and policy certificates, and delete policies and credential resources in Azure Device Registry for IoT Hub.","This article shows you how to run certificate lifecycle operations for Azure Device Registry in Azure IoT Hub: Use these procedures when you need to respond to a security event, retire certificate resources, or clean up certificate paths in production. Important Azure IoT Hub with ADR integration and Microsoft-backed X.509 certificate management is inpublic previewand isn't recommended for production workloads. For more information, see theFAQ: What is new in IoT Hub?",2026-04-15T22:11:00.000Z,how-to,security,0.75,True,Procedural guide for revoking certificates and deleting policies/credentials in ADR; contains concrete steps and operations that directly affect security posture and are specific to this product.,new -https://learn.microsoft.com/en-us/azure/iot-hub/how-to-routing-portal,Create message routes and endpoints,Create and delete routes and endpoints in Azure portal - Azure IoT Hub,,Learn how to create and delete routes and endpoints in Azure IoT Hub by using the Azure portal for message routing.,"This article shows you how to create a route and endpoint in your hub in Azure IoT Hub and then delete your route and endpoint. 
Learn how to use the Azure portal to create routes and endpoints for Azure Event Hubs, Azure Service Bus queues and topics, Azure Storage, and Azure Cosmos DB. To learn more about how routing works in IoT Hub, see Use IoT Hub message routing to send device-to-cloud messages to Azure services. To walk through setting up a route that sends messages to storage and then test",2025-08-13T08:00:00.000Z,how-to,,0.2,False,"Step-by-step portal tutorial for creating and deleting IoT Hub routes and endpoints; no indication of configuration tables, limits, error codes, or product-specific best-practice guidance beyond generic how-to instructions.",new +https://learn.microsoft.com/en-us/azure/iot-hub/how-to-revoke-certificate-delete-policy,Revoke certificates and delete policies,Revoke Certificates and Delete Policies in Azure Device Registry - Azure IoT Hub,Revoke device certificates and clean up ADR policies,"Learn how to revoke device and policy certificates, and delete policies and credential resources in Azure Device Registry for IoT Hub.","This article shows you how to run certificate lifecycle operations for Azure Device Registry in Azure IoT Hub: Use these procedures when you need to respond to a security event, retire certificate resources, or clean up certificate paths in production. Important Azure IoT Hub with ADR integration and Microsoft-backed X.509 certificate management is in public preview and isn't recommended for production workloads. 
For more information, see the FAQ: What is new in IoT Hub?",2026-04-15T22:11:00.000Z,how-to,security,0.75,True,Procedural guide for revoking certificates and deleting policies/credentials in ADR; contains concrete steps and operations that directly affect security posture and are specific to this product.,unchanged +https://learn.microsoft.com/en-us/azure/iot-hub/how-to-routing-portal,Create message routes and endpoints,Create and delete routes and endpoints in Azure portal - Azure IoT Hub,,Learn how to create and delete routes and endpoints in Azure IoT Hub by using the Azure portal for message routing.,"This article shows you how to create a route and endpoint in your hub in Azure IoT Hub and then delete your route and endpoint. Learn how to use the Azure portal to create routes and endpoints for Azure Event Hubs, Azure Service Bus queues and topics, Azure Storage, and Azure Cosmos DB. To learn more about how routing works in IoT Hub, see Use IoT Hub message routing to send device-to-cloud messages to Azure services. To walk through setting up a route that sends messages to storage and then test",2025-08-13T08:00:00.000Z,how-to,,0.2,False,"Step-by-step portal tutorial for creating and deleting IoT Hub routes and endpoints; no indication of configuration tables, limits, error codes, or product-specific best-practice guidance beyond generic how-to instructions.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/how-to-schedule-broadcast-jobs,Create back-end app to schedule jobs,Use jobs to schedule tasks for groups of devices - Azure IoT Hub,Use IoT Hub service SDK to schedule broadcast jobs,How to use the service SDK to schedule a job that invokes a device direct method and updates desired device twin properties.,"This article shows you how to create back-end app code to schedule and broadcast jobs. 
Use Azure IoT Hub to schedule and track jobs that update up to millions of devices for these operations: A job wraps one of these actions and tracks the execution against a set of devices that is defined by a device twin query. For example, a back-end app can use a job to invoke a direct method on 10,000 devices that reboots the devices. You specify the set of devices with a device twin query and schedule the ",2025-01-21T23:02:00.000Z,how-to,integrations,0.65,True,"Shows backend code using the service SDK to schedule jobs; includes SDK method names, parameters, and patterns specific to IoT Hub job scheduling.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/iot-concepts-and-iot-hub,What is IoT Hub?,What is Azure IoT Hub? - Azure IoT Hub,,This article discusses the basic concepts of how Azure IoT Hub helps users connect IoT applications and their attached devices.,"The Internet of Things (IoT) connects physical devices to exchange data over the internet. With over 10 billion connected devices worldwide, anything embedded with sensors and software can join this network. Azure IoT Hub is a managed service that acts as a central message hub in a cloud-based IoT solution. It enables reliable and secure communication at scale between an IoT application and its attached devices. Almost any device can be connected to an IoT hub. Several messaging patterns are sup",2025-11-18T17:01:00.000Z,overview,,0.1,False,"Conceptual 'What is IoT Hub?' overview; marketing/introductory content without detailed limits, configs, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-amqp-support,AMQP support,Understand Azure IoT Hub AMQP support,Use AMQP protocol with Azure IoT Hub endpoints,This article describes support for devices connecting to IoT Hub device-facing and service-facing endpoints using the AMQP protocol. 
Includes information about built-in AMQP support in the Azure IoT d,"Azure IoT Hub supports OASIS Advanced Message Queuing Protocol @@ -129,7 +129,7 @@ https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-automatic-device-managem https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-azure-service-health-integration,IoT Hub service and resource health,Check Azure IoT Hub service and resource health,,Use Azure Service Health and Azure Resource Health to monitor your IoT Hub,"Azure IoT Hub integrates with Azure Service Health to enable service-level health monitoring of the IoT Hub service and individual IoT hubs. You can also set up alerts to be notified when the status of the IoT Hub service or an IoT hub (instance) changes. Azure Service Health is a combination of three smaller services: Azure Resource Health, Azure Service Health, and the Azure status page. The sections in this article describe each service and its relationship to IoT Hub. Azure Service Health help",2025-12-15T23:10:00.000Z,how-to,,0.45,False,Explains integration with Azure Service Health and Resource Health conceptually; likely lacks detailed error-code mappings or configuration tables.,unchanged https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-bulk-identity-mgmt,Bulk import and export IoT devices,Import and export device identities - Azure IoT Hub,Bulk import and export IoT Hub device identities,"Use the Azure IoT service SDK to import and export device identities so that you can create, update, and delete device identities in bulk.","Each IoT hub has an identity registry that you can use to create device resources in the service. The identity registry also enables you to control access to the device-facing endpoints. This article describes how to import and export device identities in bulk to and from an identity registry, using the ImportExportDeviceSample sample included with the Microsoft Azure IoT SDK for .NET. 
For more information about how you can use this capability when migrating an IoT hub to a different region, see H",2025-08-13T08:00:00.000Z,how-to,integrations,0.7,True,"Uses .NET service SDK sample for bulk identity operations; likely includes specific API calls, parameters, and file formats unique to IoT Hub identity registry.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-certificate-management-concepts,Key concepts for certificate management,Key Concepts for Certificate Management (Preview) - Azure IoT Hub,,This article discusses the basic concepts of certificate management in Azure IoT Hub.,"Certificate management in Azure IoT Hub is designed to simplify the management of X.509 certificates for IoT devices. This article introduces the fundamental concepts related to certificate management and certificate-based authentication in IoT Hub. For more information, see What is certificate management (preview)? Important Azure IoT Hub with ADR integration and Microsoft-backed X.509 certificate management is in public preview and isn't recommended for production workloads. For more information,",2025-11-18T17:01:00.000Z,overview,,0.25,False,"Key concepts article; focuses on explaining certificate management ideas, not detailed configuration or limits.",unchanged -https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-certificate-management-overview,Microsoft certificate management (preview),What is Microsoft-backed X.509 Certificate Management (Preview)? - Azure IoT Hub,,This article discusses the basic concepts of how certificate management in Azure IoT Hub helps users manage device certificates.,"Certificate management is an optional feature of Azure Device Registry (ADR) that simplifies the issuance and lifecycle management of X.509 certificates for IoT devices. This feature configures a unique, cloud Public Key Infrastructure (PKI) for each ADR namespace, eliminating the need for on-prem servers, complex connectors, or dedicated hardware. 
By automating certificate issuance and renewal, ADR ensures that provisioned devices maintain a secure, seamless connection when authenticating with Azur",2026-04-15T22:11:00.000Z,overview,,0.3,False,"Described as a conceptual overview of certificate management; likely focuses on what the feature is and benefits, not detailed config parameters or limits.",new +https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-certificate-management-overview,Microsoft certificate management (preview),What is Microsoft-backed X.509 Certificate Management (Preview)? - Azure IoT Hub,,This article discusses the basic concepts of how certificate management in Azure IoT Hub helps users manage device certificates.,"Certificate management is an optional feature of Azure Device Registry (ADR) that simplifies the issuance and lifecycle management of X.509 certificates for IoT devices. This feature configures a unique, cloud Public Key Infrastructure (PKI) for each ADR namespace, eliminating the need for on-prem servers, complex connectors, or dedicated hardware. By automating certificate issuance and renewal, ADR ensures that provisioned devices maintain a secure, seamless connection when authenticating with Azur",2026-04-15T22:11:00.000Z,overview,,0.3,False,"Described as a conceptual overview of certificate management; likely focuses on what the feature is and benefits, not detailed config parameters or limits.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-compare-event-hubs,Compare IoT Hub and Event Hubs,Compare Azure IoT Hub to Azure Event Hubs,Choose between Azure IoT Hub and Event Hubs,"A comparison of the IoT Hub and Event Hubs Azure services highlighting functional differences and use cases. The comparison includes supported protocols, device management, monitoring, and file upload","Azure provides services developed for diverse types of connectivity and communication to help you connect your data to the power of the cloud. 
Both Azure IoT Hub and Azure Event Hubs are cloud services that can ingest large amounts of data and process or store that data for business insights. The two services are similar in that they both support ingestion of data with low latency and high reliability, but they're designed for different purposes. IoT Hub was developed to address the unique requi",2025-03-28T08:00:00.000Z,product-comparison,decision-making,0.8,True,"Explicit comparison article; usually includes feature comparison tables, supported protocols, and scenario-based recommendations to help select the right service.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-configuration-best-practices,Device configuration best practices,Device configuration best practices for Azure IoT Hub,Apply IoT Hub automatic device configuration best practices,Learn about best practices for using automatic device management to minimize repetitive and complex tasks involved in managing IoT devices at scale.,"Automatic device management in Azure IoT Hub automates many repetitive and complex tasks of managing large device fleets over the entirety of their lifecycles. This article defines many of the best practices for the various roles involved in developing and operating an IoT solution. 
IoT hardware manufacturer/integrator: Manufacturers of IoT hardware, integrators assembling hardware from various manufacturers, or suppliers providing hardware for an IoT deployment manufactured or integrated by othe",2025-08-05T08:00:00.000Z,concept-article,best-practices,0.75,True,Explicit best-practices article with product-specific guidance for roles and lifecycle operations; likely includes concrete patterns and gotchas for IoT Hub automatic configurations.,unchanged https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-configure-file-upload,Configure file upload,Configure file upload in IoT Hub,Configure IoT Hub file upload to Azure Storage,How to use the Azure portal to configure your IoT hub to enable file uploads from connected devices. Includes information about configuring the destination Azure storage account.,"Configuring file uploads in your IoT hub enables your connected devices to upload files to an Azure storage account. This article shows you how to configure file uploads on your IoT hub using the Azure portal, Azure CLI, and Azure PowerShell. To use the file upload functionality in IoT Hub, you must first associate an Azure storage account and blob container with your IoT hub. 
IoT Hub automatically generates SAS URIs with write permissions to this blob container for devices to use when they uploa",2025-12-01T08:00:00.000Z,how-to,configuration,0.8,True,"Details associating storage accounts/containers and enabling file upload; includes specific settings and behaviors (SAS URIs, containers) unique to IoT Hub.",unchanged @@ -157,7 +157,7 @@ https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-sdks,IoT Servic https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-sdks,Resource Manager template,Azure IoT Hub device and service SDKs,,Links to the Azure IoT Hub SDKs that you can use to build device apps and back-end apps.,"IoT Hub provides three categories of software development kits (SDKs) to help you build device and back-end applications: IoT Hub device SDKs enable you to build applications that run on your IoT devices using the device client or module client. These apps send telemetry to your IoT hub, and can also receive messages, jobs, methods, or twin updates from your IoT hub. You can use these SDKs to build device apps that use Azure IoT Plug and Play conventions and models to advertise their capabilities t",2025-02-28T23:02:00.000Z,concept-article,,0.4,False,Primarily an index/overview of SDKs with links; lacks detailed parameter tables or product-specific patterns itself.,unchanged https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-device-management-overview,Overview of device management,Overview of device management with Microsoft Azure IoT Hub,,"Overview of device management in Azure IoT Hub--enterprise device lifecycle and device management patterns such as, reboot, factory reset, firmware update, configuration, device twins, queries, jobs, ","Azure IoT Hub provides the features and an extensibility model that enable device and back-end developers to build robust device management solutions. 
Devices range from constrained sensors and single purpose microcontrollers, to powerful gateways that route communications for groups of devices. Also, the use cases and requirements for IoT operators vary significantly across industries. Despite this variation, device management with IoT Hub provides the capabilities, patterns, and code libraries",2025-05-27T17:04:00.000Z,concept-article,,0.45,False,"Device management overview; focuses on patterns and concepts (reboot, firmware update, etc.) rather than detailed configuration parameters or limits.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-device-registry-overview,Azure Device Registry integration (preview),Integration with Azure Device Registry (preview) - Azure IoT Hub,,This article discusses the basic concepts of how Azure Device Registry helps users manage IoT devices.,"This article provides background on Azure Device Registry (ADR) integration with Azure IoT Hub (preview). Important Azure IoT Hub with ADR integration and Microsoft-backed X.509 certificate management is in public preview and isn't recommended for production workloads. For more information, see the FAQ: What is new in IoT Hub?",2026-02-24T06:10:00.000Z,overview,,0.2,False,"Conceptual overview of ADR integration; background article without indication of numeric thresholds, config tables, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-device-registry-setup,Get started (ADR integration),Deploy IoT Hub with ADR Integration and Certificate Management (Preview) - Azure IoT Hub,Deploy IoT Hub with ADR and certificate management,Learn how to create an IoT Hub with ADR integration and Microsoft-backed X.509 certificate management.,"This article explains how to deploy IoT Hub with Azure Device Registry (ADR) integration and Microsoft-backed X.509 certificate management. 
Important Azure IoT Hub with ADR integration and Microsoft-backed X.509 certificate management is in public preview and isn't recommended for production workloads. For more information, see the FAQ: What is new in IoT Hub?",2026-04-15T22:11:00.000Z,how-to,configuration,0.7,True,How-to deployment/configuration article for creating an IoT Hub with ADR integration and Microsoft-backed X.509 certificate management; likely includes specific resource/parameter names and required settings beyond generic knowledge.,updated +https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-device-registry-setup,Get started (ADR integration),Deploy IoT Hub with ADR Integration and Certificate Management (Preview) - Azure IoT Hub,Deploy IoT Hub with ADR and certificate management,Learn how to create an IoT Hub with ADR integration and Microsoft-backed X.509 certificate management.,"This article explains how to deploy IoT Hub with Azure Device Registry (ADR) integration and Microsoft-backed X.509 certificate management. Important Azure IoT Hub with ADR integration and Microsoft-backed X.509 certificate management is in public preview and isn't recommended for production workloads. For more information, see the FAQ: What is new in IoT Hub?",2026-04-15T22:11:00.000Z,how-to,configuration,0.7,True,How-to deployment/configuration article for creating an IoT Hub with ADR integration and Microsoft-backed X.509 certificate management; likely includes specific resource/parameter names and required settings beyond generic knowledge.,unchanged
Configure your business applications to listen for IoT Hub events so that you can react to critical events in a reliable, scalable, and secure manner. For example, build an application that updates a database, creates a work ticket, and delivers an email notification every time a new IoT device is registered to your IoT hub. Azure Event Grid is a fully manage",2025-04-01T17:03:00.000Z,conceptual,,0.45,False,Integration overview between IoT Hub and Event Grid; describes capabilities and scenarios but not detailed parameter tables or decision matrices.,unchanged https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-event-grid-routing-comparison,Compare message and event routing,"Compare Event Grid, routing for IoT Hub - Azure IoT Hub",Decide between IoT Hub routing and Event Grid,"IoT Hub offers its own message routing service, but also integrates with Event Grid for event publishing. Compare the two features.","Azure IoT Hub provides the capability to stream data from your connected devices and integrate that data into your business applications. IoT Hub offers two methods for integrating IoT events into other Azure services or business applications. This article discusses the two features that provide this capability, so that you can choose which option is best for your scenario. Note Some of the features mentioned in this article, like cloud-to-device messaging, device twins, and device management, a",2025-08-13T08:00:00.000Z,product-comparison,decision-making,0.8,True,Explicit comparison of IoT Hub routing vs Event Grid for integrating events; provides scenario-based recommendations and trade-offs for choosing one or both.,unchanged https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-faq,Frequently asked questions,FAQ: What is New in Azure IoT Hub? 
(Preview) - Azure IoT Hub,,Learn about the new features and improvements in Azure IoT Hub.,Azure IoT Hub introduces advanced capabilities to improve security and unify operations across cloud and edge-connected deployments. This FAQ addresses common questions about the new generation of IoT Hub.,2026-02-24T06:10:00.000Z,troubleshooting,,0.3,False,FAQ about new features; likely conceptual and policy-focused rather than detailed technical limits or config parameters.,unchanged @@ -173,8 +173,8 @@ https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-troubleshoot-connectivit https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-understand-ip-address,Understanding IoT hub IP address,Understanding the IP address of your IoT hub,Understand and manage Azure IoT Hub IP addresses,Understand how to query your IoT hub IP address and its properties. The IP address of your IoT hub can change during certain scenarios such as disaster recovery or regional failover.,"The IP address prefixes of IoT Hub public endpoints are published periodically under the AzureIoTHub service tag. Note For devices that are deployed inside of on-premises networks, Azure IoT Hub supports virtual network connectivity integration with private endpoints. For more information, see IoT Hub support for virtual networks with Azure Private Link. 
You can use these IP address prefixes to control connectivity between IoT Hub and your devices or network assets in order to implement various net",2025-05-27T17:04:00.000Z,concept-article,configuration,0.65,True,"Explains how to query hub IPs and use service tags; such docs usually include specific commands, properties, and networking behaviors unique to IoT Hub.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-upgrade,Upgrade an IoT hub,Upgrade Azure IoT Hub,Select and upgrade Azure IoT Hub tier and size,Change the pricing and scale tier for IoT Hub to get more messaging and device management capabilities.,"As your IoT solution grows, Azure IoT Hub is ready to help you scale. Azure IoT Hub offers two tiers, basic (B) and standard (S), to accommodate customers that want to use different features. Within each tier are three sizes (1, 2, and 3) that determine the number of messages that can be sent each day. When you have more devices and need more capabilities, there are three ways to adjust your IoT hub to suit your needs: Add units within the IoT hub to increase the daily message limit for that hub",2025-07-29T22:11:00.000Z,upgrade-and-migration-article,decision-making,0.8,True,"Explains basic vs standard tiers, sizes, and ways to scale with message limits per day; provides concrete tier selection and upgrade guidance with quantitative trade-offs.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-what-is-new,What's new?,What is New in Azure IoT Hub? (Preview) - Azure IoT Hub,,This article explains the new features and improvements in Azure IoT Hub.,"Starting November 2025, Azure IoT Hub introduces two new preview features: integration with Azure Device Registry (ADR) and enhanced Microsoft-backed X.509 certificate management. These features help improve security, simplify device management, and streamline operations for IoT deployments. This article provides an overview of both features and how to get started. 
Important Azure IoT Hub with ADR integration and Microsoft-backed X.509 certificate management is in public preview and isn't recommende",2026-02-24T06:10:00.000Z,overview,,0.45,False,"Overview of new preview features; summary doesn’t indicate detailed config tables, limits, or decision matrices—more feature announcement style.",unchanged -https://learn.microsoft.com/en-us/azure/iot-hub/iot-mqtt-connect-to-iot-hub,MQTT support,Use MQTT to communicate with Azure IoT Hub - Azure IoT Hub,Connect devices to Azure IoT Hub using MQTT,Guidance on using the MQTT protocol to connect a device to IoT Hub. Includes using the Azure IoT device SDKs and connecting directly using MQTT.,"This article describes how devices can use the MQTT protocol to communicate with Azure IoT Hub. The IoT Hub device endpoints support device connectivity using: Note Some of the features mentioned in this article, like cloud-to-device messaging, device twins, and device management, are only available in the standard tier of IoT Hub. For more information about the basic and standard/free IoT Hub tiers, see Choose the right IoT Hub tier and size for your solution. All device communication with IoT H",2026-04-15T22:11:00.000Z,how-to,integrations,0.7,True,"Protocol-specific guidance for MQTT with IoT Hub typically includes exact connection endpoints, topic formats, username/password patterns, and feature constraints unique to IoT Hub, which are integration details not generally known.",updated -https://learn.microsoft.com/en-us/azure/iot-hub/iot-sdks,IoT Device SDKs,Azure IoT device and service SDKs,,A list of the IoT SDKs and libraries. 
Includes SDKs for device development and SDKs for building service applications.,"This reference lists the Azure SDKs you can use to build IoT solutions, including device, service, and management SDKs for IoT Hub and Device Provisioning Service (DPS), preview SDKs for certificate management, and links to Azure Digital Twins control plane and data plane APIs.",2026-04-15T22:11:00.000Z,reference,,0.1,False,"Page is primarily a catalog/list of available IoT SDKs and related APIs with links. It does not contain detailed configuration parameters, limits, error codes, or product-specific decision matrices; it’s more of a navigational/overview reference.",updated +https://learn.microsoft.com/en-us/azure/iot-hub/iot-mqtt-connect-to-iot-hub,MQTT support,Use MQTT to communicate with Azure IoT Hub - Azure IoT Hub,Connect devices to Azure IoT Hub using MQTT,Guidance on using the MQTT protocol to connect a device to IoT Hub. Includes using the Azure IoT device SDKs and connecting directly using MQTT.,"This article describes how devices can use the MQTT protocol to communicate with Azure IoT Hub. The IoT Hub device endpoints support device connectivity using: Note Some of the features mentioned in this article, like cloud-to-device messaging, device twins, and device management, are only available in the standard tier of IoT Hub. For more information about the basic and standard/free IoT Hub tiers, see Choose the right IoT Hub tier and size for your solution. All device communication with IoT H",2026-04-15T22:11:00.000Z,how-to,integrations,0.7,True,"Protocol-specific guidance for MQTT with IoT Hub typically includes exact connection endpoints, topic formats, username/password patterns, and feature constraints unique to IoT Hub, which are integration details not generally known.",unchanged +https://learn.microsoft.com/en-us/azure/iot-hub/iot-sdks,IoT Device SDKs,Azure IoT device and service SDKs,,A list of the IoT SDKs and libraries. 
Includes SDKs for device development and SDKs for building service applications.,"This reference lists the Azure SDKs you can use to build IoT solutions, including device, service, and management SDKs for IoT Hub and Device Provisioning Service (DPS), preview SDKs for certificate management, and links to Azure Digital Twins control plane and data plane APIs.",2026-04-15T22:11:00.000Z,reference,,0.1,False,"Page is primarily a catalog/list of available IoT SDKs and related APIs with links. It does not contain detailed configuration parameters, limits, error codes, or product-specific decision matrices; it’s more of a navigational/overview reference.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/manage-device-twins,Manage device twins,How to manage devices and modules using twins - Azure IoT Hub,Manage IoT Hub device and module twins via portal and CLI,Use the Azure portal and Azure CLI to query and update device twins and module twins in your Azure IoT hub.,"Use the Azure portal and Azure CLI to manage devices through device twins and module twins. This article focuses on device twins for simplicity, but all of the concepts and processes work in a similar way for module twins. This article describes device twin management tasks available in the Azure portal or Azure CLI to manage device twins remotely. For information about developing device applications to handle device twin changes, see Get started with device twins. 
In IoT Hub, a device twin is a JS",2024-08-15T05:40:00.000Z,how-to,configuration,0.7,True,Shows portal and CLI operations for querying/updating twins; includes command syntax and twin query usage specific to IoT Hub.,unchanged https://learn.microsoft.com/en-us/azure/iot-hub/migrate-hub-arm,Manually migrate an IoT hub using ARM,How to manually migrate an IoT hub - Azure IoT Hub,Manually migrate Azure IoT Hub across regions or SKUs,"Use the Azure portal, ARM templates, and service SDKs to manually migrate an Azure IoT hub to a new region or new SKU","Use the Azure portal, Azure Resource Manager templates, and Azure IoT Hub service SDKs to migrate an IoT hub to a new region, a new tier, or a new configuration. The steps in this article are useful if you want to: To migrate a hub, you need a subscription with administrative access to the original hub. You can put the new hub in a new resource group and region, in the same subscription as the original hub, or even in a new subscription. You just can't use the same name because the hub name has ",2024-12-09T08:00:00.000Z,how-to,deployment,0.7,True,"Covers manual migration using portal, ARM templates, and SDKs with IoT Hub-specific constraints (naming, subscriptions, regions); deployment/migration-focused expert guidance.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/migrate-hub-state-cli,Automatically migrate an IoT hub using CLI,How to migrate an IoT hub - Azure IoT Hub,Migrate Azure IoT Hub using Azure CLI state commands,"Use the Azure CLI iot hub state command group to migrate an IoT hub to a new region, a new tier, or a new configuration","Use the Azure CLI to migrate an IoT hub to a new region, a new tier, or a new configuration. 
The steps in this article are useful if you want to:",2023-04-16T11:19:00.000Z,how-to,deployment,0.7,True,"Describes product-specific migration flow (regions, tiers, configurations) using IoT Hub state CLI; includes constraints and steps unique to IoT Hub deployment/migration.",unchanged @@ -193,7 +193,7 @@ https://learn.microsoft.com/en-us/azure/iot-hub/quickstart-control-device,Contro https://learn.microsoft.com/en-us/azure/iot-hub/quickstart-send-telemetry-cli,Send telemetry using CLI,Quickstart - Send telemetry to Azure IoT Hub (CLI) quickstart,,"This quickstart shows developers new to IoT Hub how to get started by using the Azure CLI to create an IoT hub, send telemetry, and view messages between a device and the hub.","IoT Hub is an Azure service designed to collect large volumes of telemetry data from IoT devices for storage or processing in the cloud. In this codeless quickstart, you use the Azure CLI to create an IoT hub and a simulated device. You send device telemetry to the hub, and also send messages, call methods, and update properties on the device. Additionally, you use the Azure portal to visualize device metrics. This article provides a basic workflow for developers using the CLI to interact with a",2025-01-10T08:00:00.000Z,quickstart,,0.2,False,"Quickstart tutorial for sending telemetry; focuses on basic workflow, not deep configuration, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/raspberry-pi-get-started,Connect Raspberry Pi to run samples,Connect Raspberry Pi to Azure IoT Hub - Azure IoT Hub,,Connect a Raspberry Pi to Azure IoT Hub and test sample scenarios that send data to the Azure cloud.,This article provides basic steps for getting started with connecting a Raspberry Pi that's running Raspberry Pi OS to the cloud by using Azure IoT Hub. 
You can use a physical Raspberry Pi device or an online device emulator.,2025-03-20T08:00:00.000Z,how-to,,0.35,False,"Getting-started tutorial for Raspberry Pi; mostly step-by-step setup and sample code, not deep config tables or limits.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/reference-iot-hub-extension,Azure IoT Hub extension for Visual Studio Code,Azure IoT Hub extension for Visual Studio Code - Azure IoT Hub,Use the Azure IoT Hub VS Code extension,Reference documentation containing information about the Azure IoT Hub extension for Visual Studio Code.,"Visual Studio Code (VS Code) lets you add extensions, such as languages, debuggers, and tools, to your VS Code installation to support your development workflow. The Azure IoT Hub extension for Visual Studio Code lets you add Azure IoT Hub support to your VS Code installation, so you can manage and interact with your IoT hubs, devices, and modules during development. The Azure IoT Hub extension is available from the Visual Studio Code Marketplace. Note Some of the features mentioned in this article,",2023-05-12T11:13:00.000Z,reference,configuration,0.7,True,"Reference for extension commands, settings, and capabilities, which are concrete configuration/integration details for development tooling.",unchanged
-https://learn.microsoft.com/en-us/azure/iot-hub/reference-self-sign-script,Self-sign script for external CSR,Script to Self Sign a CSR Certificate in Azure Device Registry,Use PowerShell script to self-sign IoT Hub CSR certificates,Use a sample script to self sign a certificate signing request (CSR) for test purposes.,You can use the provided PowerShell script to create a free root CA and private key using OpenSSL.
This script should only be used for testing and never for production environments.,2026-04-15T22:11:00.000Z,reference,integrations,0.7,True,Page provides a concrete PowerShell/OpenSSL script and parameters for generating a root CA and self-signed CSR specifically for Azure IoT Hub device registry testing. This is product- and tooling-specific integration/code pattern knowledge rather than a generic concept.,new
+https://learn.microsoft.com/en-us/azure/iot-hub/reference-self-sign-script,Self-sign script for external CSR,Script to Self Sign a CSR Certificate in Azure Device Registry,Use PowerShell script to self-sign IoT Hub CSR certificates,Use a sample script to self sign a certificate signing request (CSR) for test purposes.,You can use the provided PowerShell script to create a free root CA and private key using OpenSSL. This script should only be used for testing and never for production environments.,2026-04-15T22:11:00.000Z,reference,integrations,0.7,True,Page provides a concrete PowerShell/OpenSSL script and parameters for generating a root CA and self-signed CSR specifically for Azure IoT Hub device registry testing. This is product- and tooling-specific integration/code pattern knowledge rather than a generic concept.,unchanged
https://learn.microsoft.com/en-us/azure/iot-hub/reference-x509-certificates,X.509 certificates,X.509 certificates,,"Reference documentation containing information about X.509 certificates, including certificate fields, certificate extensions, and certificate formats.","X.509 certificates are digital documents that represent a user, computer, service, or device. A certificate authority (CA), subordinate CA, or registration authority issues X.509 certificates. The certificates contain the public key of the certificate subject. They don't contain the subject's private key, which must be stored securely. RFC 5280 documents public key certificates, including their fields and extensions.
Public key certificates are digitally signed and typically contain the following ",2023-04-26T00:16:00.000Z,reference,,0.4,False,"General X.509 certificate reference (fields, extensions) aligned with RFC 5280; largely generic PKI knowledge rather than IoT Hub–specific configuration.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/regenerate-keys,Regenerate access keys,Regenerate access keys - Azure IoT Hub,Regenerate IoT Hub shared access keys safely,"Use the Azure portal, Azure CLI, or REST API to renew shared access policy keys for your IoT hub instance and devices.","Shared access signatures are one way to grant permissions at the service or device level. This article describes the process to regenerate those keys when you need to renew them in your applications. For more information, see Control access to IoT Hub with shared access signatures.",2024-11-19T18:02:00.000Z,how-to,security,0.65,True,"Provides concrete procedures via portal/CLI/REST for rotating IoT Hub SAS keys, which are product-specific security operations.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/schedule-jobs-cli,Schedule bulk operations with jobs,Use jobs to schedule tasks for groups of devices (CLI) - Azure IoT Hub,Schedule IoT Hub jobs for device groups using Azure CLI,Use the Azure CLI to schedule jobs that invoke a direct method and update device twin properties of a simulated device.,"Use the Azure CLI to schedule and track jobs that update millions of devices. Use jobs to: Conceptually, a job wraps one of these actions and tracks the progress of execution against a set of devices. A device twin query defines the set of devices with which a job interacts. For example, a back-end app can use a job to invoke a reboot method on 10,000 devices, specified by a device twin query and scheduled at a future time.
That application can then track progress as each of those devices receiv",2025-01-10T23:02:00.000Z,how-to,deployment,0.6,True,"CLI-focused job scheduling; includes specific CLI commands, parameters, and possibly constraints for large-scale job execution, which are product-specific operational deployment patterns.",unchanged @@ -207,7 +207,7 @@ https://learn.microsoft.com/en-us/azure/iot-hub/troubleshoot-message-routing,Tro https://learn.microsoft.com/en-us/azure/iot-hub/tutorial-connectivity,Test device connectivity,Tutorial - Check device connectivity to Azure IoT Hub,Troubleshoot device connectivity to Azure IoT Hub,"Tutorial - Use IoT Hub tools to troubleshoot, during development, device connectivity issues to your IoT hub.","In this tutorial, you use Azure IoT Hub portal tools and Azure CLI commands to test device connectivity. This tutorial also uses a simple device simulator that you run on your desktop machine. If you don't have an Azure subscription, create a free account before you begin. In this tutorial, you learn how to:",2023-04-25T22:05:00.000Z,tutorial,troubleshooting,0.7,True,"Tutorial explicitly for troubleshooting connectivity; typically includes specific CLI commands, portal tools, and symptom-based checks unique to IoT Hub connectivity.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/tutorial-manual-failover,Perform manual failover,Tutorial - Manually failover an Azure IoT hub,Perform manual failover for an Azure IoT hub,Learn how to perform a manual failover of your IoT hub to a different region and then return it to the original region.,"Manual failover is a feature of the IoT Hub service that allows customers to failover their hub's operations from a primary region to the corresponding Azure geo-paired region. The manual failover feature is offered to customers at no additional cost for IoT hubs created after May 18, 2017. Note Manual failover can be done in the event of a regional disaster or an extended service outage.
You can also perform a planned failover to test your disaster recovery capabilities, although we recommend usin",2022-11-18T00:00:00.000Z,tutorial,deployment,0.65,True,"Tutorial on failing over between geo-paired regions with constraints (date, cost, scenarios), which are product-specific deployment/DR operations.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/tutorial-message-enrichments,Use message enrichments,Tutorial - Use message enrichments - Azure IoT Hub,Set up and use IoT Hub message enrichments,Tutorial showing how to use message enrichments for Azure IoT Hub messages,"Message enrichments are the ability of Azure IoT Hub to stamp messages with additional information before the messages are sent to the designated endpoint. One reason to use message enrichments is to include data that can be used to simplify downstream processing. For example, enriching device messages with a device twin tag can reduce load on customers to make device twin API calls for this information. For more information, see Overview of message enrichments. In the first part of this tutorial, ",2023-05-11T08:00:00.000Z,tutorial,configuration,0.6,True,"Tutorial with stepwise configuration of enrichments in portal/CLI, including specific setting names and usage patterns that are product-specific.",unchanged
-https://learn.microsoft.com/en-us/azure/iot-hub/tutorial-routing,Tutorial - Route device messages to storage,Tutorial - Configure message routing - Azure IoT Hub,,Tutorial - Route device messages to an Azure Storage account with message routing for Azure IoT Hub using the Azure CLI and the Azure portal,"Use message routing in Azure IoT Hub to send telemetry data from your IoT devices to Azure services such as blob storage, Service Bus Queues, Service Bus Topics, and Event Hubs. Every IoT hub has a default built-in endpoint that is compatible with Event Hubs. You can also create custom endpoints and route messages to other Azure services by defining routing queries.
Each message that arrives at the IoT hub is routed to all endpoints whose routing queries it matches. If a message doesn't match an",2025-03-31T08:00:00.000Z,tutorial,,0.2,False,"Tutorial for configuring IoT Hub message routing using CLI and portal; appears to be a walkthrough without detailed limits, configuration matrices, or troubleshooting/error-code mappings.",new +https://learn.microsoft.com/en-us/azure/iot-hub/tutorial-routing,Tutorial - Route device messages to storage,Tutorial - Configure message routing - Azure IoT Hub,,Tutorial - Route device messages to an Azure Storage account with message routing for Azure IoT Hub using the Azure CLI and the Azure portal,"Use message routing in Azure IoT Hub to send telemetry data from your IoT devices to Azure services such as blob storage, Service Bus Queues, Service Bus Topics, and Event Hubs. Every IoT hub has a default built-in endpoint that is compatible with Event Hubs. You can also create custom endpoints and route messages to other Azure services by defining routing queries. Each message that arrives at the IoT hub is routed to all endpoints whose routing queries it matches. If a message doesn't match an",2025-03-31T08:00:00.000Z,tutorial,,0.2,False,"Tutorial for configuring IoT Hub message routing using CLI and portal; appears to be a walkthrough without detailed limits, configuration matrices, or troubleshooting/error-code mappings.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/tutorial-use-metrics-and-diags,Use metrics and logs to monitor IoT Hub,Tutorial - Set up and use metrics and logs with an Azure IoT hub,,Tutorial - Learn how to set up and use metrics and logs with an Azure IoT hub to provide data to analyze and diagnose problems your hub may be having.,"Use Azure Monitor to collect metrics and logs from your IoT hub to monitor the operation of your solution and troubleshoot problems when they occur. 
In this tutorial, you'll learn how to create charts based on metrics, how to create alerts that trigger on metrics, how to send IoT Hub operations and errors to Azure Monitor Logs, and how to check the logs for errors. This tutorial uses the Azure sample from the .NET send telemetry quickstart to send messages to the IoT hub. You can always use a devi",2023-04-25T22:05:00.000Z,tutorial,,0.4,False,Tutorial-style setup of metrics and logs; mostly step-by-step UI usage without deep config matrices or error-code mappings.,unchanged https://learn.microsoft.com/en-us/azure/iot-hub/tutorial-use-mqtt,Tutorial - Connect a device with MQTT,Tutorial: Use MQTT to create an IoT device client,Connect IoT devices to Azure IoT Hub via MQTT,Tutorial - Use the MQTT protocol directly to create an IoT device client without using the Azure IoT device SDKs,"You should use one of the Azure IoT Device SDKs to build your IoT device clients if at all possible. However, in scenarios such as using a memory constrained device, you might need to use an MQTT library to communicate with your IoT hub. The samples in this tutorial use the Eclipse Mosquitto MQTT library. In this tutorial, you learn how to: You can use either a Windows or Linux development machine to complete the steps in this tutorial. If you don't have an Azure subscription, create a free account",2026-03-30T17:14:00.000Z,tutorial,integrations,0.65,True,"Tutorial shows product-specific MQTT usage against Azure IoT Hub, including broker/endpoint details and protocol patterns that go beyond generic MQTT knowledge.
While framed as a tutorial, it likely includes concrete connection parameters and message patterns unique to IoT Hub, fitting integrations & coding patterns better than generic how-to content.",unchanged https://learn.microsoft.com/en-us/azure/iot-hub/tutorial-x509-test-certs,Create and upload certificates for testing,Tutorial - Create and upload certificates for testing - Azure IoT Hub,Create and upload X.509 test certificates for IoT Hub,Tutorial - Create a root certificate authority and use it to create subordinate CA and client certificates that you can use for testing purposes with Azure IoT Hub.,"You can use X.509 certificates to authenticate devices to your IoT hub. For production environments, we recommend that you purchase an X.509 CA certificate from a professional certificate services vendor. You can then issue certificates within your organization from an internal, self-managed certificate authority (CA) chained to the purchased CA certificate as part of a comprehensive public key infrastructure (PKI) strategy. 
For more information about getting an X.509 CA certificate from a profe",2025-03-27T08:00:00.000Z,tutorial,security,0.7,True,"Step-by-step creation of root/subordinate/client certs and upload to IoT Hub, with concrete commands and formats that are product-specific security configuration.",unchanged diff --git a/products/azure-iot-hub/report.md b/products/azure-iot-hub/report.md index 69410cb8..9e853e10 100644 --- a/products/azure-iot-hub/report.md +++ b/products/azure-iot-hub/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: security: 'Securing IoT Hub, DPS, and Device Update: auth (Entra ID, RBAC, SAS, X.509), certificates and revocation, TLS/ciphers, keys, network/IP controls, private @@ -10,6 +10,9 @@ category_descriptions: architecture-patterns: Design patterns for DPS lifecycle/HA/DR, VNet connectivity, secure device streams, and reliably persisting ordered IoT Hub events with Cosmos DB. + limits-quotas: Details on IoT Hub, DPS, and Device Update service limits, quotas, + and throttling behaviors, including scale constraints and how operations are restricted + under load. configuration: 'Configuring IoT Hub and DPS: enroll devices, manage certificates and ADR, set twins, jobs, routing, endpoints, file upload, message enrichments, queries, and monitoring/metrics/logs.' @@ -22,20 +25,18 @@ category_descriptions: deployment: 'Deploying and updating IoT Hubs and devices: region/SKU migration, failover, ARM/Bicep deployments, Device Update (image/package, proxy, OS support), and scheduling jobs via CLI.' - limits-quotas: Details on IoT Hub and Device Update service limits, quotas, throttling - behavior, and how many devices/operations you can scale to before hitting constraints. decision-making: Guidance for choosing IoT Hub vs alternatives, tiers/scale, pricing, routing, comms patterns (C2D/D2C), monitoring methods, and when to use or disable disaster recovery. 
skill_description: Expert knowledge for Azure IoT Hub development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when - provisioning devices via DPS, managing twins/jobs/routing, using device streams, - or integrating Device Update, and other Azure IoT Hub related development tasks. - Not for Azure IoT (use azure-iot), Azure IoT Central (use azure-iot-central), Azure + configuring IoT Hub/DPS, device twins/routing, MQTT/AMQP clients, Cosmos DB event + storage, or Device Update, and other Azure IoT Hub related development tasks. Not + for Azure IoT (use azure-iot), Azure IoT Central (use azure-iot-central), Azure IoT Edge (use azure-iot-edge), Azure Defender For Iot (use azure-defender-for-iot). -use_when: Use when provisioning devices via DPS, managing twins/jobs/routing, using - device streams, or integrating Device Update, and other Azure IoT Hub related development +use_when: Use when configuring IoT Hub/DPS, device twins/routing, MQTT/AMQP clients, + Cosmos DB event storage, or Device Update, and other Azure IoT Hub related development tasks. confusable_not_for: Not for Azure IoT (use azure-iot), Azure IoT Central (use azure-iot-central), Azure IoT Edge (use azure-iot-edge), Azure Defender For Iot (use azure-defender-for-iot). 
@@ -47,14 +48,14 @@ confusable_not_for: Not for Azure IoT (use azure-iot), Azure IoT Central (use az - **Total Pages**: 204 - **Fetched**: 204 - **Fetch Failed**: 0 -- **Classified**: 136 -- **Unclassified**: 68 +- **Classified**: 137 +- **Unclassified**: 67 ### Incremental Update -- **New Pages**: 12 -- **Updated Pages**: 4 -- **Unchanged**: 188 -- **Deleted Pages**: 3 +- **New Pages**: 0 +- **Updated Pages**: 2 +- **Unchanged**: 202 +- **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-iot-hub/azure-iot-hub.csv` ## Classification Statistics @@ -67,44 +68,19 @@ confusable_not_for: Not for Azure IoT (use azure-iot), Azure IoT Central (use az | decision-making | 9 | 4.4% | | deployment | 9 | 4.4% | | integrations | 21 | 10.3% | -| limits-quotas | 2 | 1.0% | +| limits-quotas | 3 | 1.5% | | security | 35 | 17.2% | | troubleshooting | 13 | 6.4% | -| *(Unclassified)* | 68 | 33.3% | +| *(Unclassified)* | 67 | 32.8% | ## Changes -### New Pages - -- [Microsoft certificate management (preview)](https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-certificate-management-overview) -- [Configure a credential](https://learn.microsoft.com/en-us/azure/iot-hub/how-to-configure-credential) -- [Create a policy with a Microsoft root CA](https://learn.microsoft.com/en-us/azure/iot-hub/how-to-create-policy) -- [Create a policy with an external root CA](https://learn.microsoft.com/en-us/azure/iot-hub/how-to-create-policy-external-certificate) -- [Issuance of device certificates](https://learn.microsoft.com/en-us/azure/iot-hub/concept-certificate-issuance) -- [Renewal of device certificates](https://learn.microsoft.com/en-us/azure/iot-hub/concept-certificate-renewal) -- [Certificate revocation and policy management](https://learn.microsoft.com/en-us/azure/iot-hub/concepts-certificate-policy-management) -- [Revoke certificates and delete policies](https://learn.microsoft.com/en-us/azure/iot-hub/how-to-revoke-certificate-delete-policy) -- [Disable 
or enable a device](https://learn.microsoft.com/en-us/azure/iot-hub/how-to-disable-enable-device) -- [Create message routes and endpoints](https://learn.microsoft.com/en-us/azure/iot-hub/how-to-routing-portal) -- [Tutorial - Route device messages to storage](https://learn.microsoft.com/en-us/azure/iot-hub/tutorial-routing) -- [Self-sign script for external CSR](https://learn.microsoft.com/en-us/azure/iot-hub/reference-self-sign-script) - ### Updated Pages -- [Libraries and SDKs](https://learn.microsoft.com/en-us/azure/iot-dps/libraries-sdks) - - Updated: 2026-02-13T06:11:00.000Z → 2026-04-15T22:11:00.000Z -- [Get started (ADR integration)](https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-device-registry-setup) - - Updated: 2026-02-23T18:40:00.000Z → 2026-04-15T22:11:00.000Z -- [MQTT support](https://learn.microsoft.com/en-us/azure/iot-hub/iot-mqtt-connect-to-iot-hub) - - Updated: 2026-03-30T17:14:00.000Z → 2026-04-15T22:11:00.000Z -- [IoT Device SDKs](https://learn.microsoft.com/en-us/azure/iot-hub/iot-sdks) - - Updated: 2026-03-30T17:14:00.000Z → 2026-04-15T22:11:00.000Z - -### Deleted Pages - -- ~~Azure portal~~ (https://learn.microsoft.com/en-us/azure/iot-hub/how-to-routing-portal) -- ~~Certificate management (preview)~~ (https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-certificate-management-overview) -- ~~Route device messages to storage~~ (https://learn.microsoft.com/en-us/azure/iot-hub/tutorial-routing) +- [What is IoT Hub Device Provisioning Service?](https://learn.microsoft.com/en-us/azure/iot-dps/about-iot-dps) + - Updated: 2025-02-28T05:35:00.000Z → 2026-01-22T19:36:00.000Z +- [DPS FAQ](https://learn.microsoft.com/en-us/azure/iot-dps/dps-faq) + - Updated: 2025-12-18T23:12:00Z → 2026-04-24T22:19:00Z ## Classified Pages @@ -185,6 +161,7 @@ confusable_not_for: Not for Azure IoT (use azure-iot), Azure IoT Central (use az | [Create an X.509 enrollment group with DPS service 
SDK](https://learn.microsoft.com/en-us/azure/iot-dps/quick-enroll-device-x509) | integrations | 0.70 | Shows using Azure IoT service SDK to create X.509 enrollment groups; includes DPS-specific API/SDK parameters and configuration. | | [Create and read IoT Hub messages](https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-messages-construct) | integrations | 0.70 | Defines exact IoT Hub message structure, system properties, and headers for C2D and D2C messages; these are product-specific schema and protocol details that function as integration contract knowledge. | | [Create and upload certificates for testing](https://learn.microsoft.com/en-us/azure/iot-hub/tutorial-x509-test-certs) | security | 0.70 | Step-by-step creation of root/subordinate/client certs and upload to IoT Hub, with concrete commands and formats that are product-specific security configuration. | +| [DPS FAQ](https://learn.microsoft.com/en-us/azure/iot-dps/dps-faq) | limits-quotas | 0.70 | FAQ pages for Azure services typically include concrete, product-specific details such as maximum enrollments per DPS instance, rate limits, throttling behavior, supported regions, and other numeric constraints or timeouts that are not obvious from general training data. These exact values and behaviors (for example, registrations per second or the number of linked IoT hubs) fit the limits-quotas category because an LLM would not reliably know them without this documentation. | | [Device Update Supported Platforms](https://learn.microsoft.com/en-us/azure/iot-hub-device-update/support) | deployment | 0.70 | Supported platforms article lists specific OS versions and components supported/preview, which are deployment constraints.
| | [Disable or enable a device](https://learn.microsoft.com/en-us/azure/iot-hub/how-to-disable-enable-device) | configuration | 0.70 | Operational guide for toggling device state in ADR; likely includes specific API/portal fields and state values that control device behavior in IoT Hub preview deployments. | | [Get started (ADR integration)](https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-device-registry-setup) | configuration | 0.70 | How-to deployment/configuration article for creating an IoT Hub with ADR integration and Microsoft-backed X.509 certificate management; likely includes specific resource/parameter names and required settings beyond generic knowledge. | @@ -288,7 +265,6 @@ confusable_not_for: Not for Azure IoT (use azure-iot), Azure IoT Central (use az | [Automate provisioning with GitHub Actions](https://learn.microsoft.com/en-us/azure/iot-dps/tutorial-automation-github-actions) | 0.30 | GitHub Actions GitOps tutorial; CI/CD example but no tier support matrix or deployment constraints by SKU. | | [Automatic device management at scale](https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-automatic-device-management) | 0.30 | High-level description of automatic device management; no detailed limits, configs, or decision matrices beyond what an LLM likely knows. | | [Custom allocation policies](https://learn.microsoft.com/en-us/azure/iot-dps/concepts-custom-allocation) | 0.30 | Explains custom allocation policies conceptually; lacks decision matrices with thresholds or detailed integration parameter tables. | -| [DPS FAQ](https://learn.microsoft.com/en-us/azure/iot-dps/dps-faq) | 0.30 | FAQ summary only; likely mixed conceptual and general Q&A without strong indication of detailed error codes, limits, or config tables. 
| | [Device Update account and instance](https://learn.microsoft.com/en-us/azure/iot-hub-device-update/device-update-resources) | 0.30 | Resource overview for Device Update accounts/instances; no detailed limits, configs, or patterns beyond what an LLM likely knows. | | [Device Update agent](https://learn.microsoft.com/en-us/azure/iot-hub-device-update/device-update-agent-overview) | 0.30 | Agent conceptual structure overview; no indication of config parameters, limits, or troubleshooting mappings. | | [Frequently asked questions](https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-faq) | 0.30 | FAQ about new features; likely conceptual and policy-focused rather than detailed technical limits or config parameters. | @@ -313,7 +289,7 @@ confusable_not_for: Not for Azure IoT (use azure-iot), Azure IoT Central (use az | [Tutorial - Route device messages to storage](https://learn.microsoft.com/en-us/azure/iot-hub/tutorial-routing) | 0.20 | Tutorial for configuring IoT Hub message routing using CLI and portal; appears to be a walkthrough without detailed limits, configuration matrices, or troubleshooting/error-code mappings. | | [X.509 certificate sample](https://learn.microsoft.com/en-us/azure/iot-dps/quick-create-simulated-device-x509) | 0.20 | Tutorial for provisioning an X.509 simulated device; step-by-step example rather than reference-style expert details. | | [IoT Device SDKs](https://learn.microsoft.com/en-us/azure/iot-hub/iot-sdks) | 0.10 | Page is primarily a catalog/list of available IoT SDKs and related APIs with links. It does not contain detailed configuration parameters, limits, error codes, or product-specific decision matrices; it’s more of a navigational/overview reference. | -| [What is IoT Hub Device Provisioning Service?](https://learn.microsoft.com/en-us/azure/iot-dps/about-iot-dps) | 0.10 | High-level overview of DPS capabilities and concepts without numeric limits, configuration tables, or detailed patterns. 
| +| [What is IoT Hub Device Provisioning Service?](https://learn.microsoft.com/en-us/azure/iot-dps/about-iot-dps) | 0.10 | High-level conceptual overview of Azure IoT Hub Device Provisioning Service; no specific limits, configuration parameters, error codes, or decision matrices with quantified trade-offs. | | [What is IoT Hub?](https://learn.microsoft.com/en-us/azure/iot-hub/iot-concepts-and-iot-hub) | 0.10 | Conceptual 'What is IoT Hub?' overview; marketing/introductory content without detailed limits, configs, or troubleshooting. | | [DPS terminology](https://learn.microsoft.com/en-us/azure/iot-dps/concepts-service) | 0.05 | Terminology and glossary; conceptual definitions only. | | [Symmetric key attestation](https://learn.microsoft.com/en-us/azure/iot-dps/concepts-symmetric-key-attestation) | 0.05 | Conceptual overview of symmetric key attestation; no numeric limits, config tables, or security role details. | diff --git a/products/azure-iot-operations/azure-iot-operations.csv b/products/azure-iot-operations/azure-iot-operations.csv index 456dc105..5c20fe27 100644 --- a/products/azure-iot-operations/azure-iot-operations.csv +++ b/products/azure-iot-operations/azure-iot-operations.csv @@ -24,13 +24,13 @@ https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/howto-cr https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/howto-dataflow-filter,Filter data,Filter data in a data flow - Azure IoT Operations,Configure filter stages in Azure IoT Operations data flows,Filter messages in a data flow based on conditions using Azure IoT Operations.,"Use the filter stage in a data flow to drop messages that match a condition. When a filter expression evaluates to true, the message isdropped. When the expression evaluates to false, the message passes through to the next stage. You can define multiple filter rules. Each rule specifies input fields and a boolean expression. 
The rules use OR logic: if any rule evaluates to true, the message is dropped.",2026-04-02T18:15:00.000Z,how-to,configuration,0.7,True,"Explains filter rules, boolean expressions, and OR logic for dropping messages, which are specific configuration semantics for this product’s filter operation.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/howto-dataflow-graph-wasm,Use WASM transforms,Use WASM transforms in data flow graphs - Azure IoT Operations,Build and use WebAssembly transforms in Azure IoT Operations data flow graphs,Learn how to build and deploy custom WebAssembly transforms in data flow graphs in Azure IoT Operations.,"Azure IoT Operations data flow graphs include built-in transforms for common processing tasks like mapping, filtering, and aggregation. When you need custom logic beyond what the built-in transforms provide, you can deploy WebAssembly (WASM) modules as custom transforms in your data flow graph pipelines. Tip For most data processing scenarios, start with the built-in transforms. Use WASM transforms when you need custom business logic, specialized algorithms, or processing that the built-in options ",2026-03-25T08:00:00.000Z,how-to,integrations,0.7,True,"Covers how to package and deploy WASM modules as custom transforms, including when to use them, which are product-specific integration and coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/howto-dataflow-graphs-enrich,Enrich with external data,Enrich data with external datasets in data flow graphs - Azure IoT Operations,Configure enrichment with external datasets in data flow graphs,Learn how to augment incoming messages with data from an external state store by configuring datasets in Azure IoT Operations data flow graphs.,"Sometimes the incoming message doesn't contain everything you need.
A temperature reading might arrive with a device ID, but the display name, location, and calibration offset live in a separate lookup table. Enrichment lets you pull that external data into your transform rules. For an overview of data flow graphs, see Data flow graphs overview.",2026-04-02T18:15:00.000Z,how-to,configuration,0.7,True,"Shows how to attach external state stores and lookup data in transforms, which are product-specific configuration patterns.",unchanged
-https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/howto-dataflow-graphs-filter-route,Filter and route data,Filter and route data in data flow graphs - Azure IoT Operations,,"Learn how to filter, branch, and merge messages using data flow graphs in Azure IoT Operations.","Data flow graphs provide two ways to control which messages flow through your pipeline: filter transforms drop unwanted messages, and branch transforms route each message down one of two paths based on a condition. After branching, a concatenate transform merges the paths back together. For an overview of data flow graphs and how transforms compose in a pipeline, see Data flow graphs overview.",2026-04-16T08:00:00.000Z,how-to,,0.3,False,"This is a how-to/tutorial on filtering and routing data in data flow graphs. The summary shows conceptual explanation of filter/branch/concatenate transforms, but no indication of detailed configuration tables, error codes, or product-specific constraints.",updated
+https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/howto-dataflow-graphs-filter-route,Filter and route data,Filter and route data in data flow graphs - Azure IoT Operations,,"Learn how to filter, branch, and merge messages using data flow graphs in Azure IoT Operations.","Data flow graphs provide two ways to control which messages flow through your pipeline: filter transforms drop unwanted messages, and branch transforms route each message down one of two paths based on a condition.
After branching, a concatenate transform merges the paths back together. For an overview of data flow graphs and how transforms compose in a pipeline, see Data flow graphs overview.",2026-04-16T08:00:00.000Z,how-to,,0.3,False,"This is a how-to/tutorial on filtering and routing data in data flow graphs. The summary shows conceptual explanation of filter/branch/concatenate transforms, but no indication of detailed configuration tables, error codes, or product-specific constraints.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/howto-dataflow-graphs-map,Transform data with map,Transform data with map in data flow graphs - Azure IoT Operations,Configure map transforms in Azure IoT Operations data flow graphs,"Learn how to define map rules that rename, restructure, compute, and copy fields in Azure IoT Operations data flow graphs.","A map transform takes each incoming message and produces an output message based on your rules. You can rename fields, reorganize them into new structures, compute derived values, or remove unwanted fields. Wildcard rules let you copy all fields at once. For an overview of data flow graphs and how transforms compose in a pipeline, see Data flow graphs overview.",2026-04-02T18:15:00.000Z,how-to,configuration,0.7,True,"Details rules for renaming, restructuring, computing, and wildcard copying of fields in map transforms, which are specific configuration semantics.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/howto-dataflow-graphs-topic-routing,Route messages to different topics,Route messages to different topics in data flow graphs - Azure IoT Operations,Configure dynamic MQTT topic routing in data flow graphs,Learn how to dynamically set the output MQTT topic based on message content using data flow graphs in Azure IoT Operations.,"Some scenarios require messages to arrive on different MQTT topics depending on their content. 
For example, sensor readings above a critical threshold might need to go to an alerts topic, while normal readings go to a historian topic. With data flow graphs, you can set the output topic dynamically, even though the dataflow has a single destination.",2026-04-02T18:15:00.000Z,how-to,configuration,0.7,True,"Describes how to set output topics based on message content within a single destination, which is a specific routing configuration behavior.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/howto-dataflow-graphs-window,Aggregate data over time,Aggregate data over time in data flow graphs - Azure IoT Operations,Configure windowed aggregations in Azure IoT Operations data flow graphs,"Learn how to compute averages, sums, min/max, counts, and other aggregations over tumbling time windows in Azure IoT Operations data flow graphs.","A window transform collects messages over a fixed time interval and produces a single output message with aggregated values. Instead of forwarding every reading individually, you can compute statistics like averages, minimums, or counts and send one consolidated result downstream.",2026-04-02T18:15:00.000Z,how-to,configuration,0.7,True,"Explains window transforms, tumbling intervals, and aggregation outputs, which are specific configuration options for time-based processing.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/open-telemetry,OpenTelemetry,Configure OpenTelemetry data flow endpoints in Azure IoT Operations (preview) - Azure IoT Operations,Configure OpenTelemetry endpoints for Azure IoT Operations data flows,Learn how to configure data flow endpoints for OpenTelemetry destinations to send metrics and logs to observability platforms.,"OpenTelemetry (OTEL) data flow endpoints send metrics and logs to OpenTelemetry collectors, which can then forward the data to observability platforms like Grafana dashboards and Azure Monitor. 
You can configure the endpoint settings, authentication, Transport Layer Security (TLS), and batching options. This article describes how to create and configure an OpenTelemetry dataflow endpoint to export asset data from your MQTT broker to an OpenTelemetry collector. The article describes the OTEL dataf",2026-04-02T18:15:00.000Z,how-to,configuration,0.8,True,"Includes endpoint, auth, TLS, and batching configuration for OTEL collectors, which are detailed product-specific configuration parameters and behaviors.",unchanged -https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/overview-dataflow,Overview,Process and route data with data flows - Azure IoT Operations,,Learn about data flows and how to process and route data in Azure IoT Operations.,"Data flows simplify the setup of data paths to move, transform, and enrich data. By using data flows, you can connect various data sources and perform data operations. The data flow component is part of Azure IoT Operations, which you deploy as an Azure Arc extension. You configure a data flow by using the operations experience web UI, the Azure CLI, or Azure Resource Manager templates. You can write configurations for various use cases, such as: Data flows aren't limited to the region where you",2026-04-17T11:12:00.000Z,concept-article,,0.2,False,"Described as an overview of data flows and how to process and route data. It reads like conceptual and introductory content without evidence of specific limits, configuration tables, or decision matrices.",updated -https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/overview-dataflow-comparison,Data flows vs. 
data flow graphs - Azure IoT Operations,Choose between data flows and data flow graphs in Azure IoT Operations,"Understand the differences between data flows and data flow graphs in Azure IoT Operations, and choose the right approach for your scenario.","Azure IoT Operations provides two ways to process and route data: data flows and data flow graphs. Both use shared infrastructure (endpoints, profiles, source/destination configuration), but they differ in pipeline shape, transform capabilities, and endpoint support.",2026-04-17T11:12:00.000Z,concept-article,decision-making,0.65,True,"The page is explicitly about choosing the right approach for a scenario and compares two product-specific mechanisms (data flows vs. data flow graphs). This is selection guidance between options within the same service, which fits decision-making. While the summary is high level, the purpose is to guide choice, implying comparison criteria beyond generic concepts.",updated +https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/overview-dataflow,Overview,Process and route data with data flows - Azure IoT Operations,,Learn about data flows and how to process and route data in Azure IoT Operations.,"Data flows simplify the setup of data paths to move, transform, and enrich data. By using data flows, you can connect various data sources and perform data operations. The data flow component is part of Azure IoT Operations, which you deploy as an Azure Arc extension. You configure a data flow by using the operations experience web UI, the Azure CLI, or Azure Resource Manager templates. You can write configurations for various use cases, such as: Data flows aren't limited to the region where you",2026-04-17T11:12:00.000Z,concept-article,,0.2,False,"Described as an overview of data flows and how to process and route data. 
It reads like conceptual and introductory content without evidence of specific limits, configuration tables, or decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/overview-dataflow-comparison,Data flows vs. data flow graphs,Data flows vs. data flow graphs - Azure IoT Operations,Choose between data flows and data flow graphs in Azure IoT Operations,"Understand the differences between data flows and data flow graphs in Azure IoT Operations, and choose the right approach for your scenario.","Azure IoT Operations provides two ways to process and route data: data flows and data flow graphs. Both use shared infrastructure (endpoints, profiles, source/destination configuration), but they differ in pipeline shape, transform capabilities, and endpoint support.",2026-04-17T11:12:00.000Z,concept-article,decision-making,0.65,True,"The page is explicitly about choosing the right approach for a scenario and compares two product-specific mechanisms (data flows vs. data flow graphs). This is selection guidance between options within the same service, which fits decision-making. While the summary is high level, the purpose is to guide choice, implying comparison criteria beyond generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/tutorial-mqtt-bridge,Bi-directional messaging with Event Grid,Tutorial: Bi-directional MQTT bridge to Azure Event Grid - Azure IoT Operations,,Learn how to create a bi-directional MQTT bridge to Azure Event Grid using Azure IoT Operations data flows.,"In this tutorial, you set up a bi-directional MQTT bridge between an Azure IoT Operations MQTT broker and Azure Event Grid. 
To keep the tutorial simple, use the default settings for the Azure IoT Operations MQTT broker and Azure Event Grid endpoints, and no transformation is applied.",2026-04-02T18:15:00.000Z,tutorial,,0.2,False,"Tutorial using default settings to build an MQTT bridge; likely step-by-step without detailed config matrices, limits, or product-specific troubleshooting/error mappings.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/tutorial-opc-ua-to-data-lake,Send data to Data Lake Storage,Tutorial: Send data from an OPC UA server to Azure Data Lake Storage Gen 2 using Azure IoT Operations - Azure IoT Operations,,Learn how to send data from an OPC UA server to Azure Data Lake Storage Gen 2 using Azure IoT Operations.,"In the quickstart, you created a data flow that sends data from Azure IoT Operations to Event Hubs, and then to Microsoft Fabric via EventStreams. However, it's also possible to send the data directly to a storage endpoint without using Event Hubs. This approach requires creating a Delta Lake schema that represents the data, uploading the schema to Azure IoT Operations, and then creating a data flow that reads the data from the OPC UA server and writes it to the storage endpoint. This tutorial b",2024-11-19T18:02:00.000Z,how-to,,0.35,False,"Tutorial for sending OPC UA data to Data Lake; focused on steps and example schema, not broad configuration or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/deploy-iot-ops/concept-production-examples,Production deployment examples,Production deployment examples - Azure IoT Operations,Use Azure IoT Operations production deployment sizing examples,Describes some example deployments to help you understand how to scale your solution.,This article describes two example Azure IoT Operations deployments that collect data from edge and transfer it to the cloud. 
These examples are based on real-world scenarios that take hardware capability and data volumes into consideration. Use these examples to better understand how much data Azure IoT Operations can handle with certain hardware. Microsoft used similar configurations and data volumes to validate Azure IoT Operations and measure its performance.,2025-03-19T17:20:00.000Z,concept-article,decision-making,0.7,True,"Uses real-world scenarios with hardware capability and data volume to illustrate how much data can be handled, guiding capacity and scaling decisions.",unchanged @@ -64,7 +64,7 @@ https://learn.microsoft.com/en-us/azure/iot-operations/develop-edge-apps/overvie https://learn.microsoft.com/en-us/azure/iot-operations/develop-edge-apps/quickstart-get-started-sdks,Start developing with the SDKs,Quickstart: Start developing with the Azure IoT Operations SDKs - Azure IoT Operations,,"Set up a development environment for building and running the samples, as well as creating and testing your own Azure IoT Operations highly available edge applications.","Get started developing with the Azure IoT Operations SDKs. Follow these steps to set up your development environment for building and running the samples, as well as creating and testing your own highly available edge applications. 
GitHub repository|.NET SDK|Go SDK|Rust SDK",2026-03-09T17:11:00.000Z,quickstart-sdk,,0.3,False,"Quickstart for setting up a development environment and running samples; appears to be step-by-step tutorial content without detailed configuration matrices, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/develop-edge-apps/reference-state-store-protocol,State store protocol,State store protocol - Azure IoT Operations,Implement Azure IoT Operations state store protocol,State store protocol guidance for developers who need to implement their own state store clients.,"The state store is a distributed storage system within the Azure IoT Operations cluster. The state store offers the same high availability guarantees as MQTT messages in MQTT broker. According to the MQTT5/RPC protocol guidelines, clients should use MQTT v5 to interact with the state store. This article provides protocol guidance for developers who need to implement their own state store clients.",2025-07-31T17:19:00.000Z,reference,integrations,0.78,True,"Protocol reference for custom state store clients; likely includes MQTT v5 topic formats, payload schemas, and request/response parameters that are product-specific integration details.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/develop-edge-apps/tutorial-event-driven-with-dapr,Build an event-driven app with Dapr,Build an event-driven app with Dapr - Azure IoT Operations,,Learn how to create a Dapr application that aggregates data and publishing on another topic using MQTT broker.,"In this walkthrough, you deploy a Dapr application to the cluster. The Dapr application consumes simulated MQTT data published to MQTT broker, applies a windowing function, and then publishes the result back to MQTT broker. The published output represents how high volume data can be aggregated on the edge to reduce message frequency and size. 
The Dapr application is stateless, and uses the MQTT broker state store to cache past values needed for the window calculations. The Dapr application perfo",2025-07-31T17:19:00.000Z,tutorial,,0.2,False,End-to-end Dapr event-driven app walkthrough; primarily a scenario tutorial without reference-style expert details.,unchanged -https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/concept-assets-devices,Understand assets and devices,Assets and Devices - Azure IoT Operations,Define Azure IoT Operations assets and devices in Device Registry,Understand the Azure Device Registry resources that define assets and devices.,"Important To view the asset endpoint (classic) documentation, go to What is asset management in Azure IoT Operations on the site for previous versions. Azure IoT Operations uses the terms asset and device to refer to configuration resources. These configuration resources don't map directly to the physical assets and devices in your environment. Instead, they define how a connector in Azure IoT Operations connects to and interacts with the physical assets and devices in your environment. In Azure IoT O",2026-02-24T23:11:00.000Z,concept-article,configuration,0.65,True,"Explains how asset and device configuration resources map to physical entities and connectors, which is product-specific configuration modeling.",unchanged +https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/concept-assets-devices,Understand assets and devices,Assets and Devices - Azure IoT Operations,,Understand the Azure Device Registry resources that define assets and devices.,"Important To view the asset endpoint (classic) documentation, go to What is asset management in Azure IoT Operations on the site for previous versions. Azure IoT Operations uses the terms asset and device to refer to configuration resources. These configuration resources don't map directly to the physical assets and devices in your environment. 
Instead, they define how a connector in Azure IoT Operations connects to and interacts with the physical assets and devices in your environment. In Azure IoT O",2026-04-22T22:14:00.000Z,concept-article,,0.2,False,"Conceptual explanation of what assets and devices represent in Azure IoT Operations; describes mapping to physical entities but lacks concrete configuration parameters, limits, or troubleshooting details.",updated https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/howto-configure-opc-ua,Configure OPC UA assets and devices,How to use the connector for OPC UA - Azure IoT Operations,Configure OPC UA assets and devices in Azure IoT Operations,Use the operations experience web UI or the Azure CLI to configure assets and devices for OPC UA connections.,"OPC UA servers are software applications that communicate with assets. OPC UA servers expose OPC UA data points that represent data points. OPC UA data points provide real-time or historical data about the status, performance, quality, or condition of assets. An asset in Azure IoT Operations is a logical entity that you create to represent a physical asset or device. An Azure IoT Operations asset can have custom properties, data points, streams, and events that describe its behavior and characteristi",2026-04-07T17:12:00.000Z,how-to,configuration,0.7,True,"Describes how to configure assets and devices for OPC UA connections using the operations experience UI and Azure CLI. 
Although the summary is conceptual, this type of 'how to configure' article for a specific connector typically includes product-specific configuration parameters, field names, and possibly CLI options that qualify as expert configuration knowledge beyond generic tutorials.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/howto-configure-opc-ua-certificates-infrastructure,Configure OPC UA application authentication,Configure OPC UA certificates - Azure IoT Operations,Configure OPC UA certificate trust for Azure IoT Operations connector,How to configure and manage the OPC UA certificates trust relationship in the context of the connector for OPC UA,"In this article, you learn how to configure the OPC UA certificates infrastructure for the connector for OPC UA. This configuration lets you determine which OPC UA servers you trust to securely establish a session with. Based on the OPC UA specification, the connector for OPC UA acts as a single OPC UA application when it establishes secure communications with OPC UA servers. The connector for OPC UA uses the same application instance certificate for all secure channels it opens to your OPC UA se",2026-04-07T17:12:00.000Z,how-to,security,0.85,True,"Explicitly about configuring and managing OPC UA certificate trust relationships for the connector. 
This will include concrete steps, certificate locations/usage, and trust configuration behavior specific to this product, which is expert-level security configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/howto-connect-kafka,Connect to Kafka endpoints,Connect to a Kafka source - Azure IoT Operations,,"Use a data flow and the connector for MQTT to ingest data from a Kafka-compatible source such as Azure Event Hubs, discover topics as assets, and route data through the MQTT broker in Azure IoT Operat","Many messaging services expose a Kafka-compatible endpoint, including Azure Event Hubs, Confluent Cloud, and self-hosted Apache Kafka clusters. Azure IoT Operations doesn't include a dedicated southbound Kafka connector, but you can ingest data from any Kafka-compatible source by using a data flow and the connector for MQTT. For simplicity, this article uses an Azure Event Hubs namespace as the Kafka source. To connect to a Kafka source, you combine: A data flow with a Kafka inbound endpoint that ",2026-03-06T08:00:00.000Z,how-to,,0.4,False,"Kafka source connection how-to; focuses on combining data flows and MQTT connector to ingest Kafka-compatible data. Summary suggests a scenario tutorial, not detailed config/limits/troubleshooting content.",unchanged @@ -76,9 +76,9 @@ https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/ho https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/howto-use-mqtt-connector,Connect to MQTT endpoints,How to use the connector for MQTT - Azure IoT Operations,,Use the operations experience web UI or the Azure CLI to configure assets and devices for connections to MQTT topics.,"By using the connector for MQTT, you can model MQTT endpoints as assets in Azure IoT Operations. These MQTT endpoints can be on external MQTT brokers or on the built-in MQTT broker in Azure IoT Operations. 
The MQTT connector detects new topic paths as they appear, and you can view the custom resources that represent the detected topics. An asset in Azure IoT Operations is a logical entity that you create to represent a physical asset or device. An Azure IoT Operations asset can have custom propert",2026-03-25T16:54:00.000Z,how-to,,0.4,False,"MQTT connector how-to; describes modeling endpoints as assets and detecting topics. Appears as a usage tutorial rather than a reference of configuration parameters, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/howto-use-onvif-connector,Connect to ONVIF-compliant cameras,How to use the connector for ONVIF - Azure IoT Operations,Integrate ONVIF cameras with Azure IoT Operations via connector,Use the operations experience web UI to discover and configure assets and devices to use media streams from ONVIF compliant cameras.,"In Azure IoT Operations, the connector for ONVIF enables you to discover and use an ONVIF conformant camera connected to your Azure IoT Operations cluster. An asset in Azure IoT Operations is a logical entity that you create to represent a physical asset or device. An Azure IoT Operations asset can have custom properties, data points, streams, and events that describe its behavior and characteristics. An asset is associated with one or more devices. 
Azure IoT Operations stores asset definitions in ",2025-12-11T08:00:00.000Z,how-to,integrations,0.7,True,"Shows how to discover and configure ONVIF camera assets/devices, a concrete integration scenario.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/howto-use-operations-experience,Use the operations experience UI,Manage resources in the operations experience UI - Azure IoT Operations,Manage Azure IoT Operations resources in the operations experience UI,Use the operations experience web UI to manage resources such as your asset and device configurations.,"The operations experience web UI lets OT users manage resources in Azure IoT Operations. The operations experience is a web-based user interface that gives you a consistent way to manage resources like devices, assets, and data flows. This article describes how to use the operations experience web UI to manage core resources like To learn how to use the operations experience to manage assets and devices, see:",2025-11-05T18:14:00.000Z,how-to,configuration,0.6,True,"Details how to configure and manage core resources (devices, assets, data flows) via the dedicated UI, which is product-specific configuration behavior.",unchanged -https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/howto-use-sse-connector,Connect to SSE endpoints,How to use the connector for SSE - Azure IoT Operations,Configure SSE connector assets and devices in Azure IoT Operations,Use the operations experience web UI or the Azure CLI to configure assets and devices for connections to server-sent event (SSE) endpoints.,"In Azure IoT Operations, the connector for server-sent events (SSE) enables access to data from SSE endpoints exposed by HTTP services. An asset in Azure IoT Operations is a logical entity that you create to represent a physical asset or device. 
An Azure IoT Operations asset can have custom properties, data points, streams, and events that describe its behavior and characteristics. An asset is associated with one or more devices. Azure IoT Operations stores asset definitions in the Azure Device Re",2026-03-10T16:12:00.000Z,how-to,configuration,0.65,True,"How-to for configuring assets/devices to connect to SSE endpoints via UI or CLI. Likely includes product-specific configuration parameters, CLI flags, and settings for the SSE connector, which qualifies as configuration-focused expert knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/howto-use-sse-connector,Connect to SSE endpoints,How to use the connector for SSE - Azure IoT Operations,Configure SSE connector assets and devices in Azure IoT Operations,Use the operations experience web UI or the Azure CLI to configure assets and devices for connections to server-sent event (SSE) endpoints.,"In Azure IoT Operations, the connector for server-sent events (SSE) enables access to data from SSE endpoints exposed by HTTP services. An asset in Azure IoT Operations is a logical entity that you create to represent a physical asset or device. An Azure IoT Operations asset can have custom properties, data points, streams, and events that describe its behavior and characteristics. An asset is associated with one or more devices. 
Azure IoT Operations stores asset definitions in the Azure Device Re",2026-04-23T08:00:00.000Z,how-to,configuration,0.65,True,"How-to for configuring the SSE connector using UI and CLI; likely includes product-specific configuration parameters (asset properties, data points, streams, events, and device associations) and command/setting details that qualify as expert configuration knowledge.",updated https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/overview-akri,Understand Akri services,Learn about Akri services - Azure IoT Operations,,"Understand how the Akri services enable you to dynamically configure and deploy Akri connectors to connect a broad variety of assets and devices to the Azure IoT Operations cluster, ingest telemetry f","The Microsoft Akri framework lets you perform the following tasks in Azure IoT Operations: The following diagram shows the architecture of the Akri services in Azure IoT Operations. The following steps explain how the Akri services work together to configure devices and assets, and connect them to your physical assets and devices:",2026-03-25T16:54:00.000Z,overview,,0.2,False,"Akri services overview; describes architecture and how services work together, but appears conceptual without detailed configuration parameters, limits, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/overview-manage-assets,Understand asset and device management,Asset and Device Management - Azure IoT Operations,,Understand concepts and options for managing the devices and assets that are part of your Azure IoT Operations solution.,"In Azure IoT Operations, a key task is to manage the assets and devices that are part of your solution. This article:",2026-04-17T11:12:00.000Z,overview,,0.2,False,"The page is an overview of asset and device management concepts in Azure IoT Operations. 
It does not expose concrete limits, configuration parameter tables, error-code-based troubleshooting, or decision matrices with quantified trade-offs. Content is primarily conceptual and descriptive rather than expert-level reference material.",updated +https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/overview-manage-assets,Understand asset and device management,Asset and Device Management - Azure IoT Operations,,Understand concepts and options for managing the devices and assets that are part of your Azure IoT Operations solution.,"In Azure IoT Operations, a key task is to manage the assets and devices that are part of your solution. This article:",2026-04-22T08:00:00.000Z,overview,,0.2,False,"High-level overview of asset and device management concepts in Azure IoT Operations; no indication of numeric limits, configuration tables, error codes, or other detailed expert-only data.",updated https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/overview-opc-ua-connector,Understand the connector for OPC UA,Connect and control industrial assets using the connector for OPC UA - Azure IoT Operations,,Use the connector for OPC UA to connect to OPC UA servers and exchange messages and data with the MQTT broker in a Kubernetes cluster.,"OPC UA (OPC Unified Architecture) is a standard developed by the OPC Foundation to enable the exchange of data between industrial components at the edge and with the cloud. The connector for OPC UA can route messages from OPC UA servers to the MQTT broker and send control messages to OPC UA servers. OPC UA provides a consistent, secure, documented standard based on widely used data formats. Industrial components can implement the OPC UA standard to enable universal data exchange. 
By using the OPC U",2026-04-07T17:12:00.000Z,overview,,0.2,False,"High-level overview of the OPC UA connector and its role in routing messages; summary indicates conceptual description of OPC UA and Azure IoT Operations without concrete limits, configuration tables, error codes, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/overview-opc-ua-connector-certificates-management,Understand OPC UA application authentication,Understand OPC UA certificates infrastructure - Azure IoT Operations,Manage OPC UA application certificates for Azure IoT Operations,Describes the security concepts of OPC UA and how they can be managed in the context of the connector for OPC UA.,"The connector for OPC UA is an OPC UA client application that lets you connect securely to OPC UA servers. In OPC UA, the security includes: This article focuses on application authentication and how to configure the connector for OPC UA to connect securely to your OPC UA servers at the edge. In OPC UA, every application instance has an X.509 certificate that it uses to establish trust with the other OPC UA applications it communicates with. To learn more about OPC UA application security, see Appl",2026-04-07T17:12:00.000Z,concept-article,security,0.75,True,"Focuses on OPC UA security concepts and certificate management in the context of the connector. 
Such content typically includes product-specific certificate handling, trust model details, and possibly role/scope implications for secure connections, which are security configuration details not captured by generic training.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/end-to-end-tutorials/tutorial-add-assets,Add OPC UA assets to your cluster,Tutorial: Add assets - Azure IoT Operations,,Tutorial: Add OPC UA assets that publish messages to the MQTT broker in your Azure IoT Operations cluster.,"In this tutorial, you manually add OPC UA assets to your Azure IoT Operations cluster. These assets publish messages to the MQTT broker in your Azure IoT Operations cluster. Typically, an OT user completes these steps. An asset is a physical device or logical entity that represents a device, a machine, a system, or a process. For example, a physical asset could be a pump, a motor, a tank, or a production line. A logical asset that you define can have properties, stream data points, or generate eve",2026-02-16T08:00:00.000Z,tutorial,,0.2,False,Tutorial for adding assets; likely UI/step instructions rather than deep configuration matrices or limits.,unchanged @@ -88,10 +88,10 @@ https://learn.microsoft.com/en-us/azure/iot-operations/end-to-end-tutorials/tuto https://learn.microsoft.com/en-us/azure/iot-operations/get-started-end-to-end-sample/quickstart-configure,Configure your instance,Quickstart: Configure your cluster - Azure IoT Operations,,"Quickstart: Configure devices, assets, and data flows in your cluster to process and route data from a simulated OPC PLC server to the cloud.","In this quickstart, you configure the following resources in your Azure IoT Operations cluster: In the context of Azure IoT operations, an asset is a logical representation of a physical device or system that you want to monitor or control. OPC UA servers are software applications that communicate with assets. OPC UA data points are values that OPC UA servers expose. 
OPC UA data points can provide real-time or historical data about the status, performance, quality, or condition of assets. In this q",2025-07-31T17:19:00.000Z,quickstart,,0.3,False,"Quickstart for configuring a sample pipeline; mostly procedural, not a catalog of configuration options or expert constraints.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/get-started-end-to-end-sample/quickstart-deploy,Run Azure IoT Operations in Codespaces,Quickstart: Run Azure IoT Operations in Codespaces - Azure IoT Operations,,Quickstart: Deploy Azure IoT Operations to a Kubernetes cluster running in GitHub Codespaces.,"In this quickstart, you deploy Azure IoT Operations to an Azure Arc-enabled Kubernetes cluster so that you can remotely manage your devices and workloads. At the end of the quickstart, you have a cluster that you can manage from the cloud. The rest of the quickstarts in this end-to-end series build on this one to define sample assets, data processing pipelines, and visualizations.",2025-11-20T12:10:00.000Z,quickstart,,0.3,False,Quickstart tutorial for deploying to Codespaces; likely step-by-step without detailed config matrices or limits.,unchanged https://learn.microsoft.com/en-us/azure/iot-operations/get-started-end-to-end-sample/quickstart-get-insights,Get insights from your data,Quickstart: Get insights from your processed data - Azure IoT Operations,,Quickstart: Use a real-time dashboard to capture insights from the OPC UA data you sent to Event Hubs.,"In this quickstart, you populate a real-time dashboard to capture insights from the OPC UA data that you sent to Event Hubs in the previous quickstart. Using Microsoft Fabric Real-Time Intelligence, you bring your data from Event Hubs into Microsoft Fabric, and map it into a KQL database that can be a source for real-time dashboards. Then, you build a dashboard to display that data in visual tiles that capture insights and show the values over time.
These operations are the last steps in the sampl",2025-03-07T23:05:00.000Z,quickstart,,0.2,False,Quickstart for building a dashboard; focused on example flow rather than product-specific limits or configuration catalogs.,unchanged -https://learn.microsoft.com/en-us/azure/iot-operations/manage-layered-network/concept-layered-network,Layered networking,Layered networking for Azure IoT Operations - Azure IoT Operations,,"Learn about deploying Azure IoT Operations across layered (Purdue/ISA-95) industrial networks using Envoy proxy chaining, CoreDNS, and Kubernetes-based configuration.","In many industrial environments, segmented networking architectures (such as thePurdue Network Architecture) separate assets, control systems, and business applications into distinct network layers. Lower layers typically can't connect directly to the internet and can only communicate with adjacent layers. Azure IoT Operations supports deploying across these layered networks, using Envoy proxy chaining, CoreDNS, and Kubernetes-based configuration to route traffic between layers. The following di",2026-04-16T06:12:00.000Z,concept-article,,0.25,False,"Conceptual explanation of layered networking (Purdue/ISA-95) and how Azure IoT Operations supports it. The summary mentions technologies used (Envoy, CoreDNS, Kubernetes) but not concrete configuration tables, limits, or troubleshooting mappings.",new +https://learn.microsoft.com/en-us/azure/iot-operations/manage-layered-network/concept-layered-network,Layered networking,Layered networking for Azure IoT Operations - Azure IoT Operations,,"Learn about deploying Azure IoT Operations across layered (Purdue/ISA-95) industrial networks using Envoy proxy chaining, CoreDNS, and Kubernetes-based configuration.","In many industrial environments, segmented networking architectures (such as the Purdue Network Architecture) separate assets, control systems, and business applications into distinct network layers.
Lower layers typically can't connect directly to the internet and can only communicate with adjacent layers. Azure IoT Operations supports deploying across these layered networks, using Envoy proxy chaining, CoreDNS, and Kubernetes-based configuration to route traffic between layers. The following di",2026-04-16T06:12:00.000Z,concept-article,,0.25,False,"Conceptual explanation of layered networking (Purdue/ISA-95) and how Azure IoT Operations supports it. The summary mentions technologies used (Envoy, CoreDNS, Kubernetes) but not concrete configuration tables, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/manage-layered-network/howto-private-connectivity,Deploy with private connectivity,Deploy Azure IoT Operations with private connectivity - Azure IoT Operations,Configure private connectivity for Azure IoT Operations,"Configure private connectivity for Azure IoT Operations using Private Link, Arc Gateway, and Azure Firewall Explicit Proxy.","This article describes how to configure private connectivity for Azure IoT Operations. Follow the sections in order: These scenarios apply to environments with a single Arc-enabled Kubernetes cluster. There's no Purdue-style network segmentation, no proxy chaining across layers, and no Envoy deployment. If you have a layered network topology, see Tutorial: Deploy Azure IoT Operations in a layered network with private connectivity instead.",2026-04-08T22:12:00.000Z,how-to,security,0.68,True,"Step-by-step configuration of private connectivity using Private Link, Arc Gateway, and Azure Firewall Explicit Proxy. Such pages typically include specific Azure resource types, RBAC roles, DNS/endpoint settings, and network/security configuration parameters unique to this product scenario.
This aligns best with security (private connectivity, Private Link, firewall/proxy configuration) rather than generic configuration.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/manage-layered-network/howto-troubleshoot-private-connectivity,Troubleshoot private connectivity,Troubleshoot private connectivity for Azure IoT Operations - Azure IoT Operations,Troubleshoot private connectivity in Azure IoT Operations,"Diagnose and resolve DNS, Private Endpoint, RBAC, and connectivity issues for Azure IoT Operations in private network deployments.","This article helps you diagnose and resolve common issues when running Azure IoT Operations with private connectivity. For setup instructions, see Deploy Azure IoT Operations with private connectivity.",2026-04-08T22:12:00.000Z,troubleshooting,troubleshooting,0.86,True,"Explicit troubleshooting article for DNS, Private Endpoint, RBAC, and connectivity issues. It is organized around diagnosing and resolving issues in private network deployments and will contain symptom → cause → solution mappings, likely with specific error messages, Azure diagnostics, and product-specific checks, which matches the troubleshooting sub-skill definition.",unchanged -https://learn.microsoft.com/en-us/azure/iot-operations/manage-layered-network/overview-layered-network,Overview,Azure IoT Operations networking - Azure IoT Operations,,Learn about Azure IoT Operations networking,"Networking is a foundational aspect of deploying and managing distributed systems, especially in hybrid and multicloud environments. In Azure IoT Operations, secure networking enables reliable connectivity between on-premises resources, edge devices, and Azure services. Proper network configuration is essential for communication, security, and scalability of IoT Operations and Kubernetes clusters.
This article describes key networking options for IoT Operations.",2026-04-16T06:12:00.000Z,concept-article,,0.2,False,"Networking overview article describing key networking options for IoT Operations. The summary suggests high-level conceptual guidance without specific configuration parameters, limits, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/iot-operations/manage-layered-network/overview-layered-network,Overview,Azure IoT Operations networking - Azure IoT Operations,,Learn about Azure IoT Operations networking,"Networking is a foundational aspect of deploying and managing distributed systems, especially in hybrid and multicloud environments. In Azure IoT Operations, secure networking enables reliable connectivity between on-premises resources, edge devices, and Azure services. Proper network configuration is essential for communication, security, and scalability of IoT Operations and Kubernetes clusters. This article describes key networking options for IoT Operations.",2026-04-16T06:12:00.000Z,concept-article,,0.2,False,"Networking overview article describing key networking options for IoT Operations. The summary suggests high-level conceptual guidance without specific configuration parameters, limits, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/manage-mqtt-broker/howto-broker-diagnostics,Diagnostic settings,Configure Azure IoT Operations MQTT broker diagnostics settings - Azure IoT Operations,Configure diagnostics for Azure IoT MQTT broker,"Learn how to configure diagnostics settings for the Azure IoT Operations MQTT broker, like logs, metrics, self-check, and tracing.","Set up diagnostic settings to configure metrics, logs, and self-check for the MQTT broker. Important The diagnostics are set on the Broker resource. Configure diagnostics during initial deployment by using the Azure CLI or the Azure portal. If you want to change broker settings, deploy a new broker resource. 
To learn more, see Customize default Broker.",2025-05-19T17:08:00.000Z,how-to,configuration,0.7,True,"Covers logs, metrics, self-check, tracing; expected to include specific diagnostic setting names, categories, and configuration fields for this broker.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/manage-mqtt-broker/howto-broker-mqtt-client-options,MQTT client options,Configure Azure IoT Operations MQTT client options - Azure IoT Operations,Set advanced MQTT client options on broker,"Learn how to configure advanced client options for the Azure IoT Operations MQTT broker, like session expiry, message expiry, receive maximum, and subscriber queue limit.","Important This setting requires that you modify the Broker resource. It's configured only at initial deployment by using the Azure CLI or the Azure portal. A new deployment is required if Broker configuration changes are needed. To learn more, see Customize default Broker. The MQTT broker advanced client options control how the broker interacts with MQTT clients. These settings, negotiated between the broker and the client during connection, include session expiry, message expiry, receive maximum",2025-06-26T22:19:00.000Z,how-to,configuration,0.82,True,"Focuses on session expiry, message expiry, receive maximum, subscriber queue limit; these are negotiated settings with specific parameter names and ranges unique to this broker implementation.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/manage-mqtt-broker/howto-broker-persistence,Persistence,Data persistence for the Azure IoT Operations MQTT broker - Azure IoT Operations,Configure data persistence for Azure MQTT broker,Learn how to configure the data persistence feature for the Azure IoT Operations MQTT broker for data durability.,"The data persistence feature is designed as a complementary mechanism to the replication system.
While the broker replicates data across multiple nodes, a cluster-wide shutdown can still result in data loss. To mitigate this risk, the MQTT broker supports persistent storage, which lets critical data be written to disk and preserved across restarts. This data persistence feature is different from the Disk-backed message buffer, which uses disk as an extension of memory but is ephemeral and doesn't",2025-10-27T08:00:00.000Z,how-to,configuration,0.78,True,Describes configuring persistent storage for the broker; likely lists specific configuration parameters and options for durability that are unique to this product.,unchanged @@ -118,6 +118,6 @@ https://learn.microsoft.com/en-us/azure/iot-operations/secure-iot-ops/howto-mana https://learn.microsoft.com/en-us/azure/iot-operations/secure-iot-ops/howto-manage-secrets,Manage secrets,Manage secrets - Azure IoT Operations,Manage Azure IoT Operations secrets with Key Vault and Kubernetes,"Create, update, and manage secrets that are required to give your Arc-enabled Kubernetes cluster access to Azure resources.","Azure IoT Operations uses Azure Key Vault as the managed vault solution on the cloud, and uses Azure Key Vault Secret Store extension for Kubernetes to sync the secrets down from the cloud and store them on the edge as Kubernetes secrets.
Edge resources like connectors and dataflows can then use these secrets for authentication when connecting to external systems.",2026-02-10T12:11:00.000Z,how-to,security,0.8,True,"Describes how Azure IoT Operations uses Key Vault and the Secret Store extension to sync secrets to edge as Kubernetes secrets, a product-specific security pattern.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/secure-iot-ops/howto-validate-images,Validate images,Validate images - Azure IoT Operations,Validate Azure IoT Operations container and Helm images,Validate that Azure IoT Operations docker and helm images are legitimate.,Azure IoT Operations signs its docker and helm images to allow users to verify the integrity and origin of the images they use. Signing utilizes a public/private key pair to prove that Microsoft built a container image by creating a digital signature and adding it to the image. This article provides the steps to verify that an image was signed by Microsoft. Download Notation. 
Download the Microsoft signing public certificate:https://www.microsoft.com/pkiops/certs/Microsoft%20Supply%20Chain%20RSA,2024-11-19T18:02:00.000Z,how-to,security,0.7,True,"Provides concrete steps and URLs for verifying signed images, a product-specific supply-chain security configuration.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/troubleshoot/iot-operations-faq,FAQ,Azure IoT Operations frequently asked questions - Azure IoT Operations,,This article provides a list of frequently asked questions (FAQ) for Azure IoT Operations and their answers.,This article provides a list of frequently asked questions (FAQ) for Azure IoT Operations and their answers.,2026-04-08T22:12:00Z,faq,,0.3,False,"FAQ page is likely high-level Q&A without structured error codes, config tables, or numeric limits; more general guidance than expert configuration/troubleshooting data.",unchanged -https://learn.microsoft.com/en-us/azure/iot-operations/troubleshoot/known-issues,Known issues,Known Issues - Azure IoT Operations,Resolve known issues in Azure IoT Operations components,"Known issues for the MQTT broker, connector for OPC UA, OPC PLC simulator, data flows, and operations experience web UI.","This article lists the current known issues you might encounter when using Azure IoT Operations. The guidance helps you identify these issues and provides workarounds where available. For general troubleshooting guidance, seeTroubleshoot Azure IoT Operations.",2026-04-17T17:13:00.000Z,troubleshooting-known-issue,troubleshooting,0.78,True,"A 'known issues' page for specific components (MQTT broker, OPC UA connector, OPC PLC simulator, data flows, web UI) typically lists concrete symptoms, product-specific error behaviors, and workarounds. 
This is organized around identifying issues and applying fixes, which aligns with troubleshooting guidance rather than generic concepts.",updated +https://learn.microsoft.com/en-us/azure/iot-operations/troubleshoot/known-issues,Known issues,Known Issues - Azure IoT Operations,Resolve known issues in Azure IoT Operations components,"Known issues for the MQTT broker, connector for OPC UA, OPC PLC simulator, data flows, and operations experience web UI.","This article lists the current known issues you might encounter when using Azure IoT Operations. The guidance helps you identify these issues and provides workarounds where available. For general troubleshooting guidance, see Troubleshoot Azure IoT Operations.",2026-04-17T17:13:00.000Z,troubleshooting-known-issue,troubleshooting,0.78,True,"A 'known issues' page for specific components (MQTT broker, OPC UA connector, OPC PLC simulator, data flows, web UI) typically lists concrete symptoms, product-specific error behaviors, and workarounds. This is organized around identifying issues and applying fixes, which aligns with troubleshooting guidance rather than generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/troubleshoot/tips-tools,Tips and tools,Tips and tools for troubleshooting Azure IoT Operations - Azure IoT Operations,,"Use common Kubernetes and MQTT tools such as kubectl, k9s, MQTT explorer, and mosquitto to troubleshoot and test your Azure IoT Operations instance.","This article describes how to use some common tools when you're learning, exploring, or troubleshooting your Azure IoT Operations instances.
These tools are in addition to the capabilities provided by the Azure portal, Azure CLI, operations experience web UI, and observability resources.",2026-04-02T18:15:00.000Z,how-to,,0.35,False,"Describes generic use of tools like kubectl, k9s, MQTT Explorer, mosquitto; likely high-level guidance without product-specific error codes, config tables, or limits.",unchanged https://learn.microsoft.com/en-us/azure/iot-operations/troubleshoot/troubleshoot,Troubleshoot,Troubleshoot Azure IoT Operations - Azure IoT Operations,Troubleshoot Azure IoT Operations deployments and runtime,Troubleshoot your Azure IoT Operations deployment and configuration,"This article contains troubleshooting tips for Azure IoT Operations. The troubleshooting guidance helps you diagnose and resolve issues you might encounter when deploying, configuring, or running Azure IoT Operations by: For information about known issues and temporary workarounds, see Known issues: Azure IoT Operations.",2026-02-11T18:17:00.000Z,troubleshooting-general,troubleshooting,0.78,True,Central troubleshooting article; expected to map specific symptoms and errors to causes and resolutions for Azure IoT Operations components.,unchanged diff --git a/products/azure-iot-operations/report.md b/products/azure-iot-operations/report.md index d7803126..f6d583bc 100644 --- a/products/azure-iot-operations/report.md +++ b/products/azure-iot-operations/report.md @@ -1,9 +1,9 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: - configuration: Configuring Azure IoT Operations data flows, endpoints, routing, - transforms, persistence, MQTT broker settings, device/asset models, and observability/metrics - for monitoring and tuning. + configuration: Configuring Azure IoT Operations data flows, endpoints, schemas, + MQTT broker, assets/devices, persistence, scaling, and observability/metrics to + tune, monitor, and manage the system.
deployment: Deploying, cloning, upgrading, and securing Azure IoT Operations in production (incl. private networks), plus deploying observability (Prometheus/Grafana) and WASM/graph workloads. @@ -30,15 +30,16 @@ category_descriptions: skill_description: Expert knowledge for Azure IoT Operations development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. - Use when building IoT data flows/graphs, MQTT broker setups, WASM/ONNX workloads, - Akri/REST connectors, or OPC UA/MQTT security, and other Azure IoT Operations related - development tasks. Not for Azure IoT (use azure-iot), Azure IoT Hub (use azure-iot-hub), - Azure IoT Edge (use azure-iot-edge), Azure IoT Central (use azure-iot-central). -use_when: Use when building IoT data flows/graphs, MQTT broker setups, WASM/ONNX workloads, - Akri/REST connectors, or OPC UA/MQTT security, and other Azure IoT Operations related - development tasks. -confusable_not_for: Not for Azure IoT (use azure-iot), Azure IoT Hub (use azure-iot-hub), - Azure IoT Edge (use azure-iot-edge), Azure IoT Central (use azure-iot-central). + Use when configuring MQTT broker/data flows, OPC UA assets, Akri/REST/WASM connectors, + ONNX transforms, or Prometheus/Grafana, and other Azure IoT Operations related development + tasks. Not for Azure IoT Hub (use azure-iot-hub), Azure IoT Edge (use azure-iot-edge), + Azure IoT Central (use azure-iot-central), Azure Digital Twins (use azure-digital-twins). +use_when: Use when configuring MQTT broker/data flows, OPC UA assets, Akri/REST/WASM + connectors, ONNX transforms, or Prometheus/Grafana, and other Azure IoT Operations + related development tasks. +confusable_not_for: Not for Azure IoT Hub (use azure-iot-hub), Azure IoT Edge (use + azure-iot-edge), Azure IoT Central (use azure-iot-central), Azure Digital Twins + (use azure-digital-twins). 
--- # Azure IoT Operations Crawl Report @@ -47,13 +48,13 @@ confusable_not_for: Not for Azure IoT (use azure-iot), Azure IoT Hub (use azure- - **Total Pages**: 122 - **Fetched**: 122 - **Fetch Failed**: 0 -- **Classified**: 84 -- **Unclassified**: 38 +- **Classified**: 83 +- **Unclassified**: 39 ### Incremental Update -- **New Pages**: 1 -- **Updated Pages**: 6 -- **Unchanged**: 115 +- **New Pages**: 0 +- **Updated Pages**: 3 +- **Unchanged**: 119 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-iot-operations/azure-iot-operations.csv` @@ -63,35 +64,25 @@ confusable_not_for: Not for Azure IoT (use azure-iot), Azure IoT Hub (use azure- |------|-------|------------| | architecture-patterns | 1 | 0.8% | | best-practices | 2 | 1.6% | -| configuration | 39 | 32.0% | +| configuration | 38 | 31.1% | | decision-making | 2 | 1.6% | | deployment | 6 | 4.9% | | integrations | 15 | 12.3% | | limits-quotas | 1 | 0.8% | | security | 14 | 11.5% | | troubleshooting | 4 | 3.3% | -| *(Unclassified)* | 38 | 31.1% | +| *(Unclassified)* | 39 | 32.0% | ## Changes -### New Pages - -- [Layered networking](https://learn.microsoft.com/en-us/azure/iot-operations/manage-layered-network/concept-layered-network) - ### Updated Pages -- [Known issues](https://learn.microsoft.com/en-us/azure/iot-operations/troubleshoot/known-issues) - - Updated: 2026-04-09T17:25:00.000Z → 2026-04-17T17:13:00.000Z - [Understand asset and device management](https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/overview-manage-assets) - - Updated: 2026-03-10T16:12:00.000Z → 2026-04-17T11:12:00.000Z -- [Data flows vs. 
data flow graphs](https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/overview-dataflow-comparison) - - Updated: 2026-04-02T08:00:00.000Z → 2026-04-17T11:12:00.000Z -- [Overview](https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/overview-dataflow) - - Updated: 2026-03-25T16:54:00.000Z → 2026-04-17T11:12:00.000Z -- [Filter and route data](https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/howto-dataflow-graphs-filter-route) - - Updated: 2026-04-03T06:12:00.000Z → 2026-04-16T08:00:00.000Z -- [Overview](https://learn.microsoft.com/en-us/azure/iot-operations/manage-layered-network/overview-layered-network) - - Updated: 2026-04-10T22:10:00.000Z → 2026-04-16T06:12:00.000Z + - Updated: 2026-04-17T11:12:00.000Z → 2026-04-22T08:00:00.000Z +- [Understand assets and devices](https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/concept-assets-devices) + - Updated: 2026-02-24T23:11:00.000Z → 2026-04-22T22:14:00.000Z +- [Connect to SSE endpoints](https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/howto-use-sse-connector) + - Updated: 2026-03-10T16:12:00.000Z → 2026-04-23T08:00:00.000Z ## Classified Pages @@ -165,14 +156,13 @@ confusable_not_for: Not for Azure IoT (use azure-iot), Azure IoT Hub (use azure- | [Create a custom Akri connector with .NET](https://learn.microsoft.com/en-us/azure/iot-operations/develop-edge-apps/howto-develop-akri-connectors) | integrations | 0.66 | Shows building a REST connector using a specific .NET template; likely includes connector configuration parameters and patterns unique to Akri/IoT Operations integration. 
| | [Build WASM modules](https://learn.microsoft.com/en-us/azure/iot-operations/develop-edge-apps/howto-build-wasm-modules) | integrations | 0.65 | How-to for building WASM modules for Azure IoT Operations data flows likely includes product-specific build commands, CLI parameters (aio-dataflow), and configuration details unique to this service, which fits integrations & coding patterns more than generic tutorials. | | [Clone an instance](https://learn.microsoft.com/en-us/azure/iot-operations/deploy-iot-ops/howto-clone-instance) | deployment | 0.65 | Describes cloning behavior and scenarios for Azure IoT Operations instances, which is a deployment pattern unique to this product. | -| [Connect to SSE endpoints](https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/howto-use-sse-connector) | configuration | 0.65 | How-to for configuring assets/devices to connect to SSE endpoints via UI or CLI. Likely includes product-specific configuration parameters, CLI flags, and settings for the SSE connector, which qualifies as configuration-focused expert knowledge. | +| [Connect to SSE endpoints](https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/howto-use-sse-connector) | configuration | 0.65 | How-to for configuring the SSE connector using UI and CLI; likely includes product-specific configuration parameters (asset properties, data points, streams, events, and device associations) and command/setting details that qualify as expert configuration knowledge. | | [Create stateful WASM graphs with the state store](https://learn.microsoft.com/en-us/azure/iot-operations/develop-edge-apps/howto-wasm-state-store) | integrations | 0.65 | Describes using the state store with WASM operators in Azure IoT Operations; likely includes product-specific APIs/config parameters for persisting state across messages, which are integration/coding patterns beyond generic concepts. | | [Data flows vs. 
data flow graphs](https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/overview-dataflow-comparison) | decision-making | 0.65 | The page is explicitly about choosing the right approach for a scenario and compares two product-specific mechanisms (data flows vs. data flow graphs). This is selection guidance between options within the same service, which fits decision-making. While the summary is high level, the purpose is to guide choice, implying comparison criteria beyond generic concepts. | | [Deploy observability resources](https://learn.microsoft.com/en-us/azure/iot-operations/configure-observability-monitoring/howto-configure-observability) | deployment | 0.65 | Shows how to deploy observability resources, configure Prometheus metrics, and set up Grafana dashboards using Azure Monitor managed service for Prometheus, which are product-specific deployment and configuration patterns. | | [Enrich data](https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/concept-dataflow-enrich) | integrations | 0.65 | Describes how to use contextualization datasets and query conditions in transforms, which are product-specific enrichment mechanisms and patterns. | | [Map data](https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/concept-dataflow-mapping) | integrations | 0.65 | Reference-style description of a custom mapping language with syntax and functions specific to Azure IoT Operations, which are product-specific transformation and coding patterns. | | [Schemas](https://learn.microsoft.com/en-us/azure/iot-operations/connect-to-cloud/concept-schema-registry) | configuration | 0.65 | Describes how schemas are stored and used in data flows, including where schemas are applied, which is product-specific configuration behavior. 
| -| [Understand assets and devices](https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/concept-assets-devices) | configuration | 0.65 | Explains how asset and device configuration resources map to physical entities and connectors, which is product-specific configuration modeling. | | [Upgrade](https://learn.microsoft.com/en-us/azure/iot-operations/deploy-iot-ops/howto-upgrade) | deployment | 0.65 | Product-specific upgrade process for Azure IoT Operations instances qualifies as deployment expertise. | | [Use schema registry with WASM modules](https://learn.microsoft.com/en-us/azure/iot-operations/develop-edge-apps/howto-wasm-schema-registry) | integrations | 0.65 | Shows how to use schema registry validation with WASM modules; this typically involves specific configuration parameters, schema references, and validation behaviors unique to Azure IoT Operations, matching integrations & coding patterns. | | [Build Akri connectors with VS Code extension](https://learn.microsoft.com/en-us/azure/iot-operations/develop-edge-apps/howto-build-akri-connectors-vscode) | integrations | 0.64 | Describes using a dedicated VS Code extension to create connectors; likely includes extension-specific settings, templates, and configuration fields. | @@ -220,7 +210,8 @@ confusable_not_for: Not for Azure IoT (use azure-iot), Azure IoT Hub (use azure- | [Overview](https://learn.microsoft.com/en-us/azure/iot-operations/manage-layered-network/overview-layered-network) | 0.20 | Networking overview article describing key networking options for IoT Operations. The summary suggests high-level conceptual guidance without specific configuration parameters, limits, or decision matrices. 
| | [Understand Akri services](https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/overview-akri) | 0.20 | Akri services overview; describes architecture and how services work together, but appears conceptual without detailed configuration parameters, limits, or decision matrices. | | [Understand WASM module capabilities](https://learn.microsoft.com/en-us/azure/iot-operations/develop-edge-apps/concepts-wasm-modules) | 0.20 | Described as explaining architecture, operator types, host APIs, and schemas conceptually; sounds like a conceptual overview of WASM modules and graph definitions rather than detailed config tables or limits. | -| [Understand asset and device management](https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/overview-manage-assets) | 0.20 | The page is an overview of asset and device management concepts in Azure IoT Operations. It does not expose concrete limits, configuration parameter tables, error-code-based troubleshooting, or decision matrices with quantified trade-offs. Content is primarily conceptual and descriptive rather than expert-level reference material. | +| [Understand asset and device management](https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/overview-manage-assets) | 0.20 | High-level overview of asset and device management concepts in Azure IoT Operations; no indication of numeric limits, configuration tables, error codes, or other detailed expert-only data. | +| [Understand assets and devices](https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/concept-assets-devices) | 0.20 | Conceptual explanation of what assets and devices represent in Azure IoT Operations; describes mapping to physical entities but lacks concrete configuration parameters, limits, or troubleshooting details. 
| | [Understand the connector for OPC UA](https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/overview-opc-ua-connector) | 0.20 | High-level overview of the OPC UA connector and its role in routing messages; summary indicates conceptual description of OPC UA and Azure IoT Operations without concrete limits, configuration tables, error codes, or decision matrices. | | [Upload sensor data to the cloud](https://learn.microsoft.com/en-us/azure/iot-operations/end-to-end-tutorials/tutorial-upload-messages-to-cloud) | 0.20 | Tutorial for sending messages to cloud via data flow; appears procedural, not a reference of configs, limits, or troubleshooting mappings. | | [Overview](https://learn.microsoft.com/en-us/azure/iot-operations/overview-iot-operations) | 0.10 | High-level product overview of Azure IoT Operations features and use cases without concrete limits, configs, or error details. | diff --git a/products/azure-iot/azure-iot.csv b/products/azure-iot/azure-iot.csv index 03da2f04..1e3d48e5 100644 --- a/products/azure-iot/azure-iot.csv +++ b/products/azure-iot/azure-iot.csv @@ -1,9 +1,11 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type https://learn.microsoft.com/en-us/azure/iot/howto-connect-on-premises-sap-to-azure,Connect on-premises SAP system to Azure,Connect an on-premises SAP system to Azure - Azure IoT,Connect on-premises SAP ERP to Azure industrial IoT,Step by step guide that shows how to connect an on-premises SAP Enterprise Resource Planning system to Azure.,"Many manufacturers use on-premises SAP Enterprise Resource Planning (ERP) systems. Often, manufacturers connect SAP systems to Industrial IoT solutions, and use the connected system to retrieve data for manufacturing processes, customer orders, and inventory status. This article describes how to connect these SAP-based ERP systems. This solution usesIEC 62541. 
Open Platform Communications (OPC) Unified Architecture (UA) for all operational technology data. The following diagram shows an overview ",2024-12-13T18:02:00.000Z,how-to,integrations,0.7,True,Step-by-step integration of SAP ERP via OPC UA and Azure; product-specific integration architecture and configuration details.,unchanged https://learn.microsoft.com/en-us/azure/iot/howto-iot-industrial-dataspaces,Enable industrial dataspace in Azure,Enable an industrial dataspace on Azure - Azure IoT,Enable industrial dataspace architectures on Azure,Step by step guide that shows how to enable an industrial dataspace on Azure based on open-source reference implementations.,"Many manufacturers need to provide data about their manufactured products to their customers in digital and machine-readable form. Sometimes a law such as the European Commission's Digital Product Passport legislation mandates this requirement. To provide this data, manufacturers often create an industrial dataspace between their enterprise systems and their customer's systems. The dataspace provides a secure, point-to-point communication channel for digital product data between the manufacturer a",2025-02-05T18:02:00.000Z,how-to,architecture-patterns,0.6,True,Describes how to build an industrial dataspace using Azure and open-source implementations; architecture-specific guidance for secure data exchange.,unchanged +https://learn.microsoft.com/en-us/azure/iot/iot-device-registry-namespace-guidance,Best practices for namespaces,Best practices for Azure Device Registry namespaces - Azure Device Registry,Design and choose Azure Device Registry namespaces,"Learn how to design Azure Device Registry namespaces for your IoT solution, including when to create new namespaces and when to reuse existing ones.","Azure Device Registry uses namespaces to organize devices and assets. Because namespaces act as long-lived organizational boundaries, it's important to plan them before you deploy. 
This article helps you decide:",2026-04-21T17:17:00.000Z,best-practice,decision-making,0.78,True,"The article focuses on how to design and plan Azure Device Registry namespaces, including when to create new namespaces versus reuse existing ones. This is product-specific decision guidance about organizational boundaries and solution design, matching the decision-making category rather than generic best practices. It helps users choose between options for different solution scenarios.",new +https://learn.microsoft.com/en-us/azure/iot/iot-device-registry-schema-registry-guidance,Best practices for schema registries,Best practices for Azure Device Registry schema registries - Azure Device Registry,Plan Azure Device Registry schema registries for IoT,"Learn how to design Azure Device Registry schema registries for your Azure IoT Operations solution, including when to create new registries and when to reuse existing ones.","The schema registry, a feature of Azure Device Registry, is a synchronized repository that's accessible both in the cloud and at the edge. It stores definitions of messages coming from edge assets and exposes an API to access those schemas from either location. This article helps you decide:",2026-04-21T17:17:00.000Z,best-practice,decision-making,0.78,True,"The article provides guidance on how to design schema registries in Azure Device Registry, including when to create new registries and when to reuse existing ones for Azure IoT Operations. 
This is product-specific selection and design guidance for different solution scenarios, aligning with decision-making rather than generic conceptual content.",new https://learn.microsoft.com/en-us/azure/iot/iot-glossary,IoT glossary,Azure IoT glossary of terms - Azure IoT,,Developer guide - a glossary explaining some of the common terms used in the Azure IoT articles.,This article lists some of the common terms used in the IoT articles.,2026-03-09T08:00:00.000Z,conceptual,,0.0,False,"Glossary of IoT terms is conceptual reference, not expert configuration, limits, troubleshooting, or decision-making content.",unchanged -https://learn.microsoft.com/en-us/azure/iot/iot-introduction,What is Azure IoT?,Introduction to Azure IoT - Azure IoT,,"Introduction explaining Azure IoT on Azure, including how IoT Hub and Azure IoT Operations are part of the Azure IoT portfolio and the adaptive cloud approach.","Azure IoTis Microsoft's portfolio of services for connecting, managing, and deriving intelligence from IoT devices and industrial equipment at scale. It uses a collection of cloud services, edge components, and SDKs, and applies theadaptive cloud approachto unify cloud-connected devices and on-premises operational technology (OT) environments under a common management, data, and AI model. 
Raw sensor telemetry flows through a consistent pipeline and ultimately becomes actionable intelligence for ",2026-04-15T11:11:00.000Z,overview,,0.1,False,"High-level introduction to Azure IoT portfolio and concepts without specific limits, configuration parameters, error codes, or decision matrices.",updated -https://learn.microsoft.com/en-us/azure/iot/iot-overview-device-development,IoT device development,IoT device development - Azure IoT,,"An overview of Azure IoT device development including an introduction to the device SDKs, modeling, IoT Edge modules, and a survey of the available tools.","This overview introduces the key concepts around developing devices that connect to typical Azure IoT solutions. Each section includes links to content that provides further detail and guidance. In a cloud-connected solution, devices connect directly to cloud-connected services such as IoT Hub, while in an edge-connected solution devices connect to edge-connected services in your environment such as Azure IoT Operations. The following diagram shows a high-level view of the components in a typica",2026-04-15T11:11:00.000Z,overview,,0.1,False,"Overview of IoT device development concepts and components; does not expose concrete configuration tables, quotas, error mappings, or product-specific best-practice details.",new +https://learn.microsoft.com/en-us/azure/iot/iot-introduction,What is Azure IoT?,Introduction to Azure IoT - Azure IoT,,"Introduction explaining Azure IoT on Azure, including how IoT Hub and Azure IoT Operations are part of the Azure IoT portfolio and the adaptive cloud approach.","Azure IoT is Microsoft's portfolio of services for connecting, managing, and deriving intelligence from IoT devices and industrial equipment at scale. 
It uses a collection of cloud services, edge components, and SDKs, and applies the adaptive cloud approach to unify cloud-connected devices and on-premises operational technology (OT) environments under a common management, data, and AI model. Raw sensor telemetry flows through a consistent pipeline and ultimately becomes actionable intelligence for ",2026-04-15T11:11:00.000Z,overview,,0.1,False,"High-level introduction to Azure IoT portfolio and concepts without specific limits, configuration parameters, error codes, or decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/iot/iot-overview-device-development,IoT device development,IoT device development - Azure IoT,,"An overview of Azure IoT device development including an introduction to the device SDKs, modeling, IoT Edge modules, and a survey of the available tools.","This overview introduces the key concepts around developing devices that connect to typical Azure IoT solutions. Each section includes links to content that provides further detail and guidance. In a cloud-connected solution, devices connect directly to cloud-connected services such as IoT Hub, while in an edge-connected solution devices connect to edge-connected services in your environment such as Azure IoT Operations. The following diagram shows a high-level view of the components in a typica",2026-04-15T11:11:00.000Z,overview,,0.1,False,"Overview of IoT device development concepts and components; does not expose concrete configuration tables, quotas, error mappings, or product-specific best-practice details.",unchanged
In a cloud-based solution, devices and assets connect directly to the cloud. In an edge-based solution, devices and assets connect to an edge runtime environment. You must secure your physical assets and devices, edge infrastructure, and cloud services to protect your IoT solution from threats. You must also secure the data that flows through your IoT solution, whether it's at the edge or in the cloud. This",2025-06-18T11:21:00.000Z,conceptual,,0.3,False,"Security overview and best practices at a high level; summary does not indicate specific RBAC roles, config values, or compliance settings.",unchanged https://learn.microsoft.com/en-us/azure/iot/iot-services-and-technologies,Choose an Azure IoT service,Choose an Azure IoT service - Azure IoT,,Describes the collection of services and technologies you can use to build Azure IoT cloud-based and edge-based solutions.,Azure IoT services and technologies provide you with options to create a wide variety of IoT solutions that enable digital transformation for your organization. 
This article describes Azure IoT services and technologies such as:,2025-03-28T22:03:00.000Z,product-comparison,,0.2,False,Describes available Azure IoT services; appears as catalog/overview without detailed decision matrices or quantified comparisons.,unchanged https://learn.microsoft.com/en-us/azure/iot/iot-support-help,Support and help options,Azure IoT support and help options - Azure IoT,,How to obtain help and support for questions or problems when you create solutions using Azure IoT Services.,Here are suggestions for where you can get help when developing your Azure IoT solutions.,2025-03-20T17:02:00.000Z,concept-article,,0.1,False,"Support and help options; meta-information, not technical configuration or troubleshooting content.",unchanged diff --git a/products/azure-iot/report.md b/products/azure-iot/report.md index 022fd2b1..c920a423 100644 --- a/products/azure-iot/report.md +++ b/products/azure-iot/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: integrations: Patterns and code for integrating devices via MQTT and IoT Plug and Play, building device/service apps, formatting payloads, using DPS/IoT Hub, and @@ -7,14 +7,17 @@ category_descriptions: architecture-patterns: Reference architectures and patterns for industrial IoT on Azure, including dataspace-based designs, component choices, and end-to-end implementation guidance for industrial scenarios. -skill_description: Expert knowledge for Azure IoT development including architecture - & design patterns, and integrations & coding patterns. Use when using MQTT, IoT - Plug and Play, DPS/IoT Hub, SAP ERP integration, or industrial IoT reference architectures, - and other Azure IoT related development tasks. Not for Azure IoT Hub (use azure-iot-hub), - Azure IoT Edge (use azure-iot-edge), Azure IoT Central (use azure-iot-central), - Azure Defender For Iot (use azure-defender-for-iot). 
-use_when: Use when using MQTT, IoT Plug and Play, DPS/IoT Hub, SAP ERP integration, - or industrial IoT reference architectures, and other Azure IoT related development + decision-making: Guidance on designing Azure Device Registry namespaces and schema + registries, including structure, organization, and planning for IoT device data + and metadata. +skill_description: Expert knowledge for Azure IoT development including decision making, + architecture & design patterns, and integrations & coding patterns. Use when using + MQTT or IoT Plug and Play, DPS/IoT Hub, SAP ERP integration, device registries, + or schema registries, and other Azure IoT related development tasks. Not for Azure + IoT Hub (use azure-iot-hub), Azure IoT Edge (use azure-iot-edge), Azure IoT Central + (use azure-iot-central), Azure Defender For Iot (use azure-defender-for-iot). +use_when: Use when using MQTT or IoT Plug and Play, DPS/IoT Hub, SAP ERP integration, + device registries, or schema registries, and other Azure IoT related development tasks. 
confusable_not_for: Not for Azure IoT Hub (use azure-iot-hub), Azure IoT Edge (use azure-iot-edge), Azure IoT Central (use azure-iot-central), Azure Defender For Iot @@ -24,16 +27,16 @@ confusable_not_for: Not for Azure IoT Hub (use azure-iot-hub), Azure IoT Edge (u ## Summary -- **Total Pages**: 9 -- **Fetched**: 9 +- **Total Pages**: 11 +- **Fetched**: 11 - **Fetch Failed**: 0 -- **Classified**: 3 +- **Classified**: 5 - **Unclassified**: 6 ### Incremental Update -- **New Pages**: 1 -- **Updated Pages**: 1 -- **Unchanged**: 7 +- **New Pages**: 2 +- **Updated Pages**: 0 +- **Unchanged**: 9 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-iot/azure-iot.csv` @@ -41,25 +44,24 @@ confusable_not_for: Not for Azure IoT Hub (use azure-iot-hub), Azure IoT Edge (u | Type | Count | Percentage | |------|-------|------------| -| architecture-patterns | 2 | 22.2% | -| integrations | 1 | 11.1% | -| *(Unclassified)* | 6 | 66.7% | +| architecture-patterns | 2 | 18.2% | +| decision-making | 2 | 18.2% | +| integrations | 1 | 9.1% | +| *(Unclassified)* | 6 | 54.5% | ## Changes ### New Pages -- [IoT device development](https://learn.microsoft.com/en-us/azure/iot/iot-overview-device-development) - -### Updated Pages - -- [What is Azure IoT?](https://learn.microsoft.com/en-us/azure/iot/iot-introduction) - - Updated: 2025-10-07T08:00:00.000Z → 2026-04-15T11:11:00.000Z +- [Best practices for namespaces](https://learn.microsoft.com/en-us/azure/iot/iot-device-registry-namespace-guidance) +- [Best practices for schema registries](https://learn.microsoft.com/en-us/azure/iot/iot-device-registry-schema-registry-guidance) ## Classified Pages | TOC Title | Type | Confidence | Reason | |-----------|------|------------|--------| +| [Best practices for namespaces](https://learn.microsoft.com/en-us/azure/iot/iot-device-registry-namespace-guidance) | decision-making | 0.78 | The article focuses on how to design and plan Azure Device Registry namespaces, 
including when to create new namespaces versus reuse existing ones. This is product-specific decision guidance about organizational boundaries and solution design, matching the decision-making category rather than generic best practices. It helps users choose between options for different solution scenarios. | +| [Best practices for schema registries](https://learn.microsoft.com/en-us/azure/iot/iot-device-registry-schema-registry-guidance) | decision-making | 0.78 | The article provides guidance on how to design schema registries in Azure Device Registry, including when to create new registries and when to reuse existing ones for Azure IoT Operations. This is product-specific selection and design guidance for different solution scenarios, aligning with decision-making rather than generic conceptual content. | | [Connect on-premises SAP system to Azure](https://learn.microsoft.com/en-us/azure/iot/howto-connect-on-premises-sap-to-azure) | integrations | 0.70 | Step-by-step integration of SAP ERP via OPC UA and Azure; product-specific integration architecture and configuration details. | | [Enable industrial dataspace in Azure](https://learn.microsoft.com/en-us/azure/iot/howto-iot-industrial-dataspaces) | architecture-patterns | 0.60 | Describes how to build an industrial dataspace using Azure and open-source implementations; architecture-specific guidance for secure data exchange. | | [Implement industrial IoT reference solution](https://learn.microsoft.com/en-us/azure/iot/tutorial-iot-industrial-solution-architecture) | architecture-patterns | 0.60 | Reference architecture for industrial IoT with specific use cases (OEE, forecasting, anomaly detection); likely includes Azure-service-specific architectural patterns and components. 
| diff --git a/products/azure-key-vault/azure-key-vault.csv b/products/azure-key-vault/azure-key-vault.csv index 44960877..ef20777f 100644 --- a/products/azure-key-vault/azure-key-vault.csv +++ b/products/azure-key-vault/azure-key-vault.csv @@ -30,7 +30,7 @@ https://learn.microsoft.com/en-us/azure/key-vault/general/authentication,Key Vau https://learn.microsoft.com/en-us/azure/key-vault/general/authentication-requests-and-responses,"Authentication, requests and responses","Authentication, requests, and responses",Formulate authenticated JSON requests to Azure Key Vault,Learn how Azure Key Vault uses JSON-formatted requests and responses and about required authentication for using a key vault.,Azure Key Vault provides two types of containers to store and manage secrets for your cloud applications: Here are the suffixes of the URLs used to access each type of object. Azure Key Vault supports JSON formatted requests and responses. Requests to the Azure Key Vault are directed to a valid Azure Key Vault URL using HTTPS with some URL parameters and JSON encoded request and response bodies. This article covers specifics for the Azure Key Vault service. For general information on using Azure ,2026-04-10T08:00:00.000Z,concept-article,configuration,0.65,True,"Details URL suffixes for different object types and JSON request/response specifics for Key Vault. These are product-specific request formats and endpoint patterns, aligning with configuration of how to call the service.",unchanged https://learn.microsoft.com/en-us/azure/key-vault/general/autorotation,Understanding auto-rotation,Understanding autorotation in Azure Key Vault,,"Learn about autorotation concepts for keys, secrets, and certificates in Azure Key Vault","Cryptographic assets like certificates, keys, and secrets have limited lifetimes. As a security best practice, these assets should be rotated regularly to reduce the risk of compromise and ensure compliance with security policies. 
Azure Key Vault provides automation capabilities for rotating these assets, helping organizations maintain strong security posture with minimal operational overhead.",2026-04-10T08:00:00.000Z,concept-article,,0.4,False,"Conceptual explanation of autorotation; summary does not show specific configuration values, ranges, or decision matrices that would qualify as expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/key-vault/general/azure-policy,Integrate Azure Key Vault with Azure Policy,Integrate Azure Key Vault with Azure Policy,Apply Azure Policy to govern Azure Key Vault,Learn how to integrate Azure Key Vault with Azure Policy,"Azure Policy is a governance tool that gives users the ability to audit and manage their Azure environment at scale, allowing them to place guardrails on Azure resources to ensure they're compliant with assigned policy rules. It allows users to perform audit, real-time enforcement, and remediation of their Azure environment. The results of audits performed by policy are available to users in a compliance dashboard where they can see a drill-down of which resources and components are compli",2026-04-10T08:00:00.000Z,how-to,configuration,0.65,True,"Integration with Azure Policy will list specific policy definitions, aliases, and parameters for Key Vault (e.g., enforcing soft delete), which are product-specific configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/key-vault/general/backup,Back up and restore objects,"Back up a secret, key, or certificate stored in Azure Key Vault",,"Use this document to help back up a secret, key, or certificate stored in Azure Key Vault.","This document shows you how to back up secrets, keys, and certificates stored in your key vault. 
A backup is intended to provide you with an offline copy of all your secrets in the unlikely event that you lose access to your key vault.",2026-04-10T08:00:00.000Z,how-to,,0.45,False,"Backup article is likely a procedural how-to using portal/CLI/PowerShell; summary doesn’t indicate detailed parameter tables, limits, or error mappings beyond generic backup steps.",unchanged +https://learn.microsoft.com/en-us/azure/key-vault/general/backup,Back up and restore objects,"Back up a secret, key, or certificate stored in Azure Key Vault",,"Use this document to help back up a secret, key, or certificate stored in Azure Key Vault.","This document shows you how to back up secrets, keys, and certificates stored in your key vault. A backup is intended to provide you with an offline copy of all your secrets in the unlikely event that you lose access to your key vault.",2026-04-22T22:13:00.000Z,how-to,,0.3,False,"Appears to be a how-to/tutorial on backing up Key Vault objects, not a reference for limits, configuration matrices, or error-code-based troubleshooting. Likely lacks detailed numeric limits, config parameter tables, or decision matrices.",updated https://learn.microsoft.com/en-us/azure/key-vault/general/basic-concepts,Basic concepts,What is Azure Key Vault?,,Learn how Azure Key Vault safeguards cryptographic keys and secrets that cloud applications and services use.,"Azure Key Vault is a cloud service for securely storing and accessing secrets. A secret is anything that you want to tightly control access to, such as API keys, passwords, certificates, or cryptographic keys. Key Vault service supports two types of containers: vaults and managed hardware security module (HSM) pools. Vaults support storing software and HSM-backed keys, secrets, and certificates. Managed HSM pools only support HSM-backed keys. See Azure Key Vault REST API overview for complete detai",2026-04-10T08:00:00.000Z,concept-article,,0.2,False,"High-level 'What is Azure Key Vault?' 
overview; conceptual description of service and containers, no detailed numeric limits or configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/key-vault/general/client-libraries,Client libraries,Client Libraries for Azure Key Vault,,Client Libraries for Azure Key Vault,"The Azure Key Vault client libraries provide programmatic access to Key Vault functionality across multiple languages, including .NET, Python, Java, JavaScript, and Spring. These libraries follow the latest Azure SDK guidelines and integrate with Azure Identity for authentication.",2026-04-10T08:00:00.000Z,tutorial,,0.35,False,Client libraries page is a catalog/overview of SDKs; it typically links out to references and doesn’t itself contain detailed configuration tables or error mappings.,unchanged https://learn.microsoft.com/en-us/azure/key-vault/general/common-error-codes,Common error codes,Common error codes for Azure Key Vault,Resolve common Azure Key Vault error codes,Common error codes for Azure Key Vault,The error codes listed in the following table may be returned by an operation on Azure Key Vault.,2026-04-10T08:00:00.000Z,reference,troubleshooting,0.9,True,"Common error codes table maps specific Key Vault error identifiers to explanations and possibly remediation steps, which is product-specific troubleshooting content.",unchanged @@ -61,7 +61,7 @@ https://learn.microsoft.com/en-us/azure/key-vault/general/rbac-access-policy,RBA https://learn.microsoft.com/en-us/azure/key-vault/general/rbac-guide,Use an Azure RBAC for managing access,Grant permission to applications to access an Azure key vault using Azure RBAC,Configure Azure RBAC permissions for Key Vault access,"Learn how to provide access to keys, secrets, and certificates using Azure role-based access control.","Note Key Vault resource provider supports two resource types: vaults and managed HSMs. Access control described in this article only applies to vaults. 
To learn more about access control for managed HSM, see Managed HSM access control. Azure role-based access control (Azure RBAC) is an authorization system built on Azure Resource Manager that provides centralized access management of Azure resources. Starting with API version 2026-02-01, Azure RBAC is the default access control model for newly created ",2026-04-10T08:00:00.000Z,how-to,security,0.78,True,"RBAC guide necessarily lists specific built-in role names, scopes, and how they map to Key Vault data-plane permissions, which are product-specific security details beyond generic RBAC concepts.",unchanged https://learn.microsoft.com/en-us/azure/key-vault/general/rbac-migration,Vault access policies to Azure RBAC migration guide,Migrate to Azure role-based access control,Migrate Azure Key Vault from access policies to RBAC,Learn how to migrate from vault access policies to Azure roles.,"Azure Key Vault offers two access control models: Azure role-based access control (Azure RBAC), and an access policy model. Azure RBAC is the default and recommended access control model for Azure Key Vault. Starting with API version 2026-02-01, Azure RBAC is the default access control model for new vaults. For a comparison of the two methods of authorization, see Azure role-based access control (Azure RBAC) vs. access policies. 
For information on preparing your existing deployments for this chan",2026-04-10T08:00:00.000Z,how-to,decision-making,0.7,True,"Migration guidance between access policies and RBAC for Key Vault includes product-specific steps, caveats, and decision points about when/how to switch models that go beyond conceptual overview.",unchanged https://learn.microsoft.com/en-us/azure/key-vault/general/rest-error-codes,REST API error codes,REST API error codes - Azure Key Vault,Interpret Azure Key Vault REST API error codes,An operation on an Azure Key Vault web service may return the following error codes.,An operation on an Azure Key Vault web service may return the following error codes.,2026-04-10T08:00:00.000Z,reference,troubleshooting,0.9,True,"REST error codes article lists specific HTTP status codes and Key Vault error codes with meanings and likely causes, which is core troubleshooting knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/key-vault/general/secure-key-vault,Secure your Key Vault,Secure your Azure Key Vault,Apply security best practices to Azure Key Vault,"Learn how to secure Azure Key Vault, with best practices for protecting your deployment.","Azure Key Vault protects cryptographic keys, certificates (and the private keys associated with the certificates), and secrets (such as connection strings and passwords) in the cloud. When storing sensitive and business-critical data, however, you must take steps to maximize the security of your vaults and the data stored in them. The security recommendations in this article implement Zero Trust principles: ""Verify explicitly"", ""Use least privilege access"", and ""Assume breach"". 
For comprehensive",2026-01-30T08:00:00.000Z,article,security,0.85,True,"Explicit security hardening article with Key Vault–specific recommendations (Zero Trust, least privilege, likely specific settings and patterns), which are product-specific security best practices.",unchanged +https://learn.microsoft.com/en-us/azure/key-vault/general/secure-key-vault,Secure your Key Vault,Secure your Azure Key Vault,Apply security best practices to Azure Key Vault,"Learn how to secure Azure Key Vault, with best practices for protecting your deployment.","Azure Key Vault protects cryptographic keys, certificates (and the private keys associated with the certificates), and secrets (such as connection strings and passwords) in the cloud. When storing sensitive and business-critical data, however, you must take steps to maximize the security of your vaults and the data stored in them. The security recommendations in this article implement Zero Trust principles: ""Verify explicitly"", ""Use least privilege access"", and ""Assume breach"". For comprehensive",2026-04-20T08:00:00.000Z,best-practice,security,0.7,True,"Described as a security recommendations article implementing Zero Trust principles specifically for Azure Key Vault. Such guidance typically includes product-specific security settings (e.g., private endpoints, firewall rules, RBAC role names, access policies) and concrete configuration guidance beyond generic security concepts.",updated https://learn.microsoft.com/en-us/azure/key-vault/general/service-limits,Service limits,Azure Key Vault service limits,Review Azure Key Vault and Managed HSM service limits,"Review the service limits for Azure Key Vault, including transaction throughput, API request limits, and Managed HSM capacity.",Azure Key Vault service supports two resource types: Vaults and Managed HSMs. 
The following two sections describe the service limits for each of them respectively.,2025-07-21T05:30:00.000Z,reference,limits-quotas,0.95,True,"Explicit service limits page for vaults and Managed HSMs, including transaction throughput, API request limits, and capacity values with exact numbers.",unchanged https://learn.microsoft.com/en-us/azure/key-vault/general/soft-delete-overview,Azure Key Vault soft-delete,Azure Key Vault soft-delete,Configure and use Azure Key Vault soft-delete safely,"Soft-delete in Azure Key Vault allows you to recover deleted key vaults and key vault objects, such as keys, secrets, and certificates.","Important If a key vault does not have soft-delete protection enabled, deleting a key deletes it permanently. Customers are strongly encouraged to turn on soft delete enforcement for their vaults via Azure Policy. Important When a Key Vault is soft-deleted, services that are integrated with the Key Vault are deleted. For example: Azure RBAC role assignments and Event Grid subscriptions. Recovering a soft-deleted Key Vault does not restore these services. They must be recreated. Key Vault's soft-",2026-04-10T08:00:00.000Z,feature-guide,security,0.7,True,"Describes soft-delete behavior and its impact on integrated services (for example, RBAC role assignments and Event Grid subscriptions not being restored). 
These are product-specific security and recovery behaviors beyond generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/key-vault/general/troubleshoot-azure-policy-for-key-vault,Troubleshoot issues with implementing Azure Key Vault with Azure Policy,Troubleshoot issues with implementing Azure policy on Key Vault,Troubleshoot Azure Policy enforcement on Key Vault,Troubleshooting issues with implementing Azure policy on Key Vault,"This article shows you how to troubleshoot general errors that might occur when you set up the Azure Policy for Key Vault, and suggests ways to resolve them.",2026-03-26T08:00:00.000Z,how-to,troubleshooting,0.85,True,"Explicit troubleshooting guide for Azure Policy with Key Vault, likely mapping specific policy errors/symptoms to causes and resolutions, which is product-specific diagnostic knowledge.",unchanged @@ -99,7 +99,7 @@ https://learn.microsoft.com/en-us/azure/key-vault/keys/quick-create-powershell,P https://learn.microsoft.com/en-us/azure/key-vault/keys/quick-create-python,Python,Quickstart – Azure Key Vault Python client library – manage keys,,"Learn how to create, retrieve, and delete keys from an Azure key vault using the Python client library","Get started with the Azure Key Vault client library for Python. Follow these steps to install the package and try out example code for basic tasks. By using Key Vault to store cryptographic keys, you avoid storing such keys in your code, which increases the security of your app. 
API reference documentation|Library source code|Package (Python Package Index)",2026-03-30T08:00:00.000Z,quickstart,,0.25,False,"Quickstart for Python Key Vault client library; introductory usage without detailed configuration parameters, limits, or structured troubleshooting guidance.",unchanged https://learn.microsoft.com/en-us/azure/key-vault/keys/quick-create-template,ARM template,Azure Quickstart - Create an Azure key vault and a key by using Azure Resource Manager template,,"Quickstart showing how to create Azure key vaults, and add a key to the vaults by using Azure Resource Manager template (ARM template).","Azure Key Vault is a cloud service that provides a secure store for secrets, such as keys, passwords, and certificates. This quickstart focuses on the process of deploying an Azure Resource Manager template (ARM template) to create a key vault and a key.",2026-04-10T08:00:00.000Z,quickstart,,0.2,False,"ARM template quickstart for creating a key vault and key; focuses on a single example template, not comprehensive configuration or expert patterns.",unchanged https://learn.microsoft.com/en-us/azure/key-vault/keys/quick-create-terraform,Terraform,Quickstart: Create an Azure key vault and key using Terraform,Provision Key Vault and key using Terraform,"In this article, you create an Azure key vault and key using Terraform","Azure Key Vault is a cloud service that provides a secure store for secrets, such as keys, passwords, and certificates. This article focuses on the process of deploying a Terraform file to create a key vault and a key. Terraform enables the definition, preview, and deployment of cloud infrastructure. Using Terraform, you create configuration files using HCL syntax. The HCL syntax allows you to specify the cloud provider - such as Azure - and the elements that make up your cloud infrastructure. 
Afte",2026-01-08T08:00:00.000Z,quickstart,deployment,0.62,True,"Contains Terraform HCL configuration for Azure Key Vault and keys, including resource types and arguments, which are product-specific deployment details.",unchanged
-https://learn.microsoft.com/en-us/azure/key-vault/keys/secure-keys,Secure your Key Vault keys,Secure your Azure Key Vault keys,Apply security best practices for Azure Key Vault keys,"Learn how to secure Azure Key Vault keys, with best practices specific to cryptographic key management.","Azure Key Vault keys protect cryptographic keys used for encryption, digital signatures, and key wrapping operations. This article provides security recommendations specific to cryptographic key management. Note This article focuses on security practices specific to Key Vault keys. For comprehensive Key Vault security guidance including network security, identity and access management, and vault architecture, see Secure your Azure Key Vault.",2026-04-10T08:00:00.000Z,best-practice,best-practices,0.8,True,"Explicitly a security recommendations article for Key Vault keys; typically includes product-specific rotation intervals, key usage patterns, and do/don’t guidance tailored to this service, which fits best-practices with expert knowledge.",unchanged
+https://learn.microsoft.com/en-us/azure/key-vault/keys/secure-keys,Secure your Key Vault keys,Secure your Azure Key Vault keys,Apply secure key management practices in Azure Key Vault,"Learn how to secure Azure Key Vault keys, with best practices specific to cryptographic key management.","Azure Key Vault keys protect cryptographic keys used for encryption, digital signatures, and key wrapping operations. This article provides security recommendations specific to cryptographic key management. Note This article focuses on security practices specific to Key Vault keys.
For comprehensive Key Vault security guidance including network security, identity and access management, and vault architecture, see Secure your Azure Key Vault.",2026-04-22T22:13:00.000Z,best-practice,best-practices,0.7,True,"The article focuses on concrete security recommendations specific to Azure Key Vault keys and cryptographic key management (DOs/DON’Ts and product-specific guidance), which aligns with best-practices. It goes beyond generic security concepts and provides actionable, service-specific advice, but does not emphasize numeric limits, decision matrices, or configuration parameter tables.",updated
https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/access-control,Access control,Azure Key Vault Managed HSM access control,Manage access control and authorization for Managed HSM,Learn how to manage access permissions for Azure Key Vault Managed HSM and keys. Understand the authentication and authorization models for Managed HSM and how to secure your HSMs.,"Azure Key Vault Managed HSM is a cloud service that safeguards encryption keys. Because this data is sensitive and critical to your business, you need to secure your managed hardware security modules (HSMs) by allowing only authorized applications and users to access the data. This article provides an overview of the Managed HSM access control model. It explains authentication and authorization, and describes how to secure access to your managed HSMs.
For practical implementation guidance, seeSe",2026-01-30T08:00:00.000Z,concept-article,security,0.7,True,Describes authentication and authorization models and how to secure access; likely includes specific access patterns and security configuration unique to Managed HSM.,unchanged
https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/authorize-azure-resource-manager,Authorize Azure Resource Manager,Allow key management operations through Azure Resource Manager,Configure Azure Resource Manager access to Managed HSM,Learn how to allow key management operations through ARM,"For many asynchronous operations in the Portal and Template deployments, Azure Resource Manager must be trusted to act on behalf of users. Azure Key Vault trusts Azure Resource Manager but, for many higher assurance environments, such trust in the Azure portal and Azure Resource Manager may be considered a risk. Azure Managed HSM doesn't trust Azure Resource Manager by default. However, for environments where such risk is an acceptable tradeoff for the ease of use of the Azure portal and templat",2026-03-26T08:00:00.000Z,tutorial,security,0.7,True,"Describes how to allow key management operations via ARM in higher-assurance environments. This is product-specific security configuration (trusting ARM, likely via specific access policies/roles or settings) that goes beyond generic concepts.",unchanged
https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/azure-policy,Integrate Managed HSM with Azure Policy,Integrate Azure Managed HSM with Azure Policy,,Learn how to integrate Azure Managed HSM with Azure Policy,"Azure Policy is a governance tool that gives users the ability to audit and manage their Azure environment at scale. Azure Policy lets you place guardrails on Azure resources to ensure they're compliant with assigned policy rules. It allows users to perform audit, real-time enforcement, and remediation of their Azure environment.
The results of audits performed by policy are available to users in a compliance dashboard, where they'll be able to see a drill-down of which resources and components a",2026-03-26T08:00:00.000Z,how-to,,0.3,False,"Integration of Managed HSM with Azure Policy is governance/overview oriented; summary does not indicate specific policy definitions, parameter tables, or RBAC/role details unique enough to qualify as expert configuration or security guidance.",unchanged
@@ -123,8 +123,8 @@ https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/network-security,N
https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/overview,About Managed HSM,Azure Managed HSM Overview - Azure Managed HSM,,Azure Managed HSM is a cloud service that safeguards your cryptographic keys for cloud applications.,"Important We have updated our HSM fleet to a FIPS 140-3 level 3 validated firmware for both Azure Key Vault Managed HSM and Azure Key Vault Premium. See full details at Updating Managed HSM Firmware for Enhanced Security and Compliance. Azure Key Vault Managed HSM (Hardware Security Module) is a fully managed, highly available, single-tenant, standards-compliant cloud service that enables you to safeguard cryptographic keys for your cloud applications, using FIPS 140-3 Level 3 validated HSMs. It is",2025-11-19T08:00:00.000Z,overview,,0.3,False,"High-level service overview and marketing-style description without detailed configuration, limits, or patterns.",unchanged
https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/policy-grammar,Secure key release policy grammar,Azure Managed HSM Secure key release policy grammar,Author secure key release policies for Managed HSM,Managed HSM Secure key release policy grammar,"This article documents a simplified EBNF grammar for secure key release policy, which itself is modeled on Azure Policy.
For a complete example of a secure key release policy, see the confidential VM key release policy.",2025-04-14T08:00:00.000Z,reference,configuration,0.85,True,"Documents an EBNF grammar for secure key release policy, which is a precise configuration language definition unique to this product.",unchanged
https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/private-link,Use private endpoints,Configure Azure Key Vault Managed HSM with private endpoints,Configure Managed HSM private endpoints with Private Link,Learn how to integrate Azure Key Vault Managed HSM with Azure Private Link Service,"Azure Private Link Service enables you to access Azure Services (for example, Managed HSM, Azure Storage, and Azure Cosmos DB etc.) and Azure hosted customer/partner services over a Private Endpoint in your virtual network. An Azure Private Endpoint is a network interface that connects you privately and securely to a service powered by Azure Private Link. The private endpoint uses a private IP address from your VNet, effectively bringing the service into your VNet. All traffic to the service can",2026-01-30T08:00:00.000Z,how-to,security,0.8,True,"Provides specific steps and parameters to integrate Managed HSM with Private Link, which are product-specific network security configurations.",unchanged
-https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/quick-create-cli,CLI,Quickstart - Provision and activate an Azure Managed HSM,,Quickstart showing how to provision and activate a managed HSM using Azure CLI,"In this quickstart, you create and activate an Azure Key Vault Managed HSM (Hardware Security Module) by using Azure CLI. Managed HSM is a fully managed, highly available, single-tenant, standards-compliant cloud service that enables you to safeguard cryptographic keys for your cloud applications, using FIPS 140-3 Level 3 validated HSMs.
For more information on Managed HSM, review the Overview.",2026-04-14T08:00:00.000Z,quickstart,,0.2,False,"Quickstart focused on provisioning and activating Managed HSM via Azure CLI; primarily step-by-step commands without configuration tables, limits, error-code mappings, or product-specific decision matrices. Does not meet thresholds for limits, configuration, troubleshooting, or other expert-knowledge categories.",updated
-https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/quick-create-powershell,PowerShell,Create and retrieve attributes of a managed key in Azure Key Vault – Azure PowerShell,,Quickstart showing how to set and retrieve a managed key from Azure Key Vault using Azure PowerShell,"In this quickstart, you create and activate an Azure Key Vault Managed HSM (Hardware Security Module) by using PowerShell. Managed HSM is a fully managed, highly available, single-tenant, standards-compliant cloud service that enables you to safeguard cryptographic keys for your cloud applications, using FIPS 140-3 Level 3 validated HSMs. For more information on Managed HSM, review the Overview.",2026-04-14T08:00:00.000Z,quickstart,,0.2,False,"Quickstart for creating and activating Managed HSM using PowerShell; mainly procedural instructions and basic usage. Lacks detailed limits, configuration parameter tables, security role breakdowns, or troubleshooting mappings required for expert-knowledge classification.",updated
+https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/quick-create-cli,CLI,Quickstart - Provision and activate an Azure Managed HSM,,Quickstart showing how to provision and activate a managed HSM using Azure CLI,"In this quickstart, you create and activate an Azure Key Vault Managed HSM (Hardware Security Module) by using Azure CLI.
Managed HSM is a fully managed, highly available, single-tenant, standards-compliant cloud service that enables you to safeguard cryptographic keys for your cloud applications, using FIPS 140-3 Level 3 validated HSMs. For more information on Managed HSM, review the Overview.",2026-04-14T08:00:00.000Z,quickstart,,0.2,False,"Quickstart focused on provisioning and activating Managed HSM via Azure CLI; primarily step-by-step commands without configuration tables, limits, error-code mappings, or product-specific decision matrices. Does not meet thresholds for limits, configuration, troubleshooting, or other expert-knowledge categories.",unchanged
+https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/quick-create-powershell,PowerShell,Create and retrieve attributes of a managed key in Azure Key Vault – Azure PowerShell,,Quickstart showing how to set and retrieve a managed key from Azure Key Vault using Azure PowerShell,"In this quickstart, you create and activate an Azure Key Vault Managed HSM (Hardware Security Module) by using PowerShell. Managed HSM is a fully managed, highly available, single-tenant, standards-compliant cloud service that enables you to safeguard cryptographic keys for your cloud applications, using FIPS 140-3 Level 3 validated HSMs. For more information on Managed HSM, review the Overview.",2026-04-14T08:00:00.000Z,quickstart,,0.2,False,"Quickstart for creating and activating Managed HSM using PowerShell; mainly procedural instructions and basic usage.
Lacks detailed limits, configuration parameter tables, security role breakdowns, or troubleshooting mappings required for expert-knowledge classification.",unchanged
https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/quick-create-template,ARM template,Azure Quickstart - Create a Managed HSM using an Azure Resource Manager template,,Quickstart showing how to create an Azure Key Vault Managed HSM using Resource Manager template,"This quickstart shows how to use an Azure Resource Manager template (ARM template) to create an Azure Key Vault managed HSM. Managed HSM is a fully managed, highly available, single-tenant, standards-compliant cloud service that enables you to safeguard cryptographic keys for your cloud applications, using FIPS 140-3 Level 3 validated HSMs. For more information on Managed HSM, review the Overview. An Azure Resource Manager template is a JavaScript Object Notation (JSON) file that defines the infrastr",2026-03-30T08:00:00.000Z,quickstart,,0.25,False,ARM template quickstart for Managed HSM; shows one template pattern rather than a full configuration reference or decision/limits guide.,unchanged
https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/quickstart-dotnet,.NET SDK,Quickstart - Azure Key Vault Managed HSM client library for .NET,,Learn how to access keys in Azure Managed HSM using the .NET client library,"Get started with the Azure Key Vault Managed HSM client library for .NET. Managed HSM is a fully managed, highly available, single-tenant, standards-compliant cloud service that enables you to safeguard cryptographic keys for your cloud applications, using FIPS 140-3 Level 3 validated HSMs. For more information on Managed HSM, review the Overview. In this quickstart, you learn how to access and perform cryptographic operations on keys in a Managed HSM using the .NET client library.
Managed HSM clie",2026-03-30T08:00:00.000Z,quickstart,,0.3,False,".NET client library quickstart for Managed HSM; basic usage and sample code, not a deep configuration or troubleshooting reference.",unchanged
https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/quickstart-javascript,JavaScript SDK,Quickstart - Azure Key Vault Managed HSM client library for JavaScript,,Learn how to access keys in Azure Managed HSM using the JavaScript client library,"Get started with the Azure Key Vault Managed HSM client library for JavaScript. Managed HSM is a fully managed, highly available, single-tenant, standards-compliant cloud service that enables you to safeguard cryptographic keys for your cloud applications, using FIPS 140-3 Level 3 validated HSMs. For more information on Managed HSM, review the Overview. In this quickstart, you learn how to access and perform cryptographic operations on keys in a Managed HSM using the JavaScript client library. Mana",2026-03-30T08:00:00.000Z,quickstart,,0.3,False,"JavaScript client library quickstart for Managed HSM; focuses on basic operations, not on expert-level configuration or limits.",unchanged
diff --git a/products/azure-key-vault/report.md b/products/azure-key-vault/report.md
index 5ebf816f..ce30c673 100644
--- a/products/azure-key-vault/report.md
+++ b/products/azure-key-vault/report.md
@@ -1,12 +1,12 @@
---
-generated_at: '2026-04-19'
+generated_at: '2026-04-26'
category_descriptions:
  integrations: 'Using Key Vault from code and services: JS/Go/.NET/Python client patterns for keys/secrets/certs, rotation and backup, plus integrations with Event Grid, Logic Apps, Databricks, DigiCert, and TLS offload.'
- security: 'Securing Azure Key Vault and Managed HSM: auth, RBAC vs access policies,
- firewalls, Private Link, soft-delete, backups, and security best practices for
- keys, secrets, and certificates.'
+ security: 'Securing Key Vault and Managed HSM: auth with Entra ID, RBAC vs access
+ policies, network/firewall/Private Link, soft delete, backup/restore, and hardening
+ best practices for keys, secrets, and certs.'
  configuration: 'Configuring Key Vault and Managed HSM: monitoring, alerts, logging, policies, key types/rotation, secure key release, replication, and special secret formats (e.g., multiline).'
@@ -16,21 +16,20 @@ category_descriptions:
  logging latency, soft-delete behavior, and network/IP firewall configuration.'
  decision-making: Guidance on planning key and HSM capacity, scaling, and migrating cryptographic workloads or Key Vault access control from access policies to RBAC
- best-practices: Guidance on BYOK/HSM key import, key/secret security best practices,
- disaster recovery for Managed HSM, and automating single/dual-credential secret
- rotation in Key Vault.
+ best-practices: Best practices for BYOK/HSM key generation and transfer, secure
+ key management, disaster recovery for Managed HSM, and automating single/dual-credential
+ secret rotation in Key Vault.
  deployment: How to deploy and provision Azure Key Vault and Managed HSM (vaults, keys, secrets) using ARM templates, Bicep, Terraform, Azure CLI, and PowerShell
skill_description: Expert knowledge for Azure Key Vault development including troubleshooting, best practices, decision making, limits & quotas, security, configuration, integrations
- & coding patterns, and deployment. Use when using Key Vault/Managed HSM APIs, RBAC
- vs access policies, Private Link, key rotation, or IaC deployment, and other Azure
- Key Vault related development tasks. Not for Azure Dedicated HSM (use azure-dedicated-hsm),
+ & coding patterns, and deployment. Use when using KV/Managed HSM SDKs, Entra auth/RBAC,
+ Private Link, key rotation/backup, or IaC deployments, and other Azure Key Vault
+ related development tasks.
Not for Azure Dedicated HSM (use azure-dedicated-hsm), Azure Cloud Hsm (use azure-cloud-hsm), Azure Payment Hsm (use azure-payment-hsm), Azure Information Protection (use azure-information-protection).
-use_when: Use when using Key Vault/Managed HSM APIs, RBAC vs access policies, Private
- Link, key rotation, or IaC deployment, and other Azure Key Vault related development
- tasks.
+use_when: Use when using KV/Managed HSM SDKs, Entra auth/RBAC, Private Link, key rotation/backup,
+ or IaC deployments, and other Azure Key Vault related development tasks.
confusable_not_for: Not for Azure Dedicated HSM (use azure-dedicated-hsm), Azure Cloud Hsm (use azure-cloud-hsm), Azure Payment Hsm (use azure-payment-hsm), Azure Information Protection (use azure-information-protection).
@@ -47,8 +46,8 @@ confusable_not_for: Not for Azure Dedicated HSM (use azure-dedicated-hsm), Azure
### Incremental Update
- **New Pages**: 0
-- **Updated Pages**: 2
-- **Unchanged**: 160
+- **Updated Pages**: 3
+- **Unchanged**: 159
- **Deleted Pages**: 0
- **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-key-vault/azure-key-vault.csv`
@@ -70,10 +69,12 @@ confusable_not_for: Not for Azure Dedicated HSM (use azure-dedicated-hsm), Azure
### Updated Pages
-- [CLI](https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/quick-create-cli)
- - Updated: 2026-04-07T22:10:00.000Z → 2026-04-14T08:00:00.000Z
-- [PowerShell](https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/quick-create-powershell)
- - Updated: 2026-03-30T08:00:00.000Z → 2026-04-14T08:00:00.000Z
+- [Back up and restore objects](https://learn.microsoft.com/en-us/azure/key-vault/general/backup)
+ - Updated: 2026-04-10T08:00:00.000Z → 2026-04-22T22:13:00.000Z
+- [Secure your Key Vault](https://learn.microsoft.com/en-us/azure/key-vault/general/secure-key-vault)
+ - Updated: 2026-01-30T08:00:00.000Z → 2026-04-20T08:00:00.000Z
+- [Secure your Key Vault keys](https://learn.microsoft.com/en-us/azure/key-vault/keys/secure-keys)
+ - Updated: 2026-04-10T08:00:00.000Z → 2026-04-22T22:13:00.000Z
## Classified Pages
@@ -89,7 +90,6 @@ confusable_not_for: Not for Azure Dedicated HSM (use azure-dedicated-hsm), Azure
| [Diagnose Private Links Configuration Issues](https://learn.microsoft.com/en-us/azure/key-vault/general/private-link-diagnostics) | troubleshooting | 0.85 | Diagnostics article will map specific Private Link misconfigurations, status codes, and connectivity symptoms to root causes and resolutions, which is product-specific troubleshooting knowledge. |
| [Monitoring Key Vault data reference](https://learn.microsoft.com/en-us/azure/key-vault/general/monitor-key-vault-reference) | configuration | 0.85 | Monitoring data reference typically lists all metrics, dimensions, log categories, and schemas for Key Vault, which are detailed configuration/reference data. |
| [Secure key release policy grammar](https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/policy-grammar) | configuration | 0.85 | Documents an EBNF grammar for secure key release policy, which is a precise configuration language definition unique to this product. |
-| [Secure your Key Vault](https://learn.microsoft.com/en-us/azure/key-vault/general/secure-key-vault) | security | 0.85 | Explicit security hardening article with Key Vault–specific recommendations (Zero Trust, least privilege, likely specific settings and patterns), which are product-specific security best practices. |
| [Troubleshoot issues with implementing Azure Key Vault with Azure Policy](https://learn.microsoft.com/en-us/azure/key-vault/general/troubleshoot-azure-policy-for-key-vault) | troubleshooting | 0.85 | Explicit troubleshooting guide for Azure Policy with Key Vault, likely mapping specific policy errors/symptoms to causes and resolutions, which is product-specific diagnostic knowledge.
|
| [About secrets](https://learn.microsoft.com/en-us/azure/key-vault/secrets/about-secrets) | limits-quotas | 0.80 | Contains a specific numerical constraint: secrets are stored as octet sequences with a maximum size of 25 KB each. This is an exact product limit that qualifies as expert knowledge under limits-quotas. |
| [Bring your own key specification](https://learn.microsoft.com/en-us/azure/key-vault/keys/byok-specification) | configuration | 0.80 | A formal BYOK specification typically defines exact formats, parameter names, key sizes, wrapping algorithms, and file structures required for import, which are detailed configuration constraints not derivable from general knowledge. |
@@ -99,7 +99,6 @@ confusable_not_for: Not for Azure Dedicated HSM (use azure-dedicated-hsm), Azure
| [Prepare for Azure RBAC as default](https://learn.microsoft.com/en-us/azure/key-vault/general/access-control-default) | security | 0.80 | Explains change to Azure RBAC as default access control, specific API version (2026-02-01), and retirement date for earlier versions. Contains product-specific security model details and version timelines relevant to secure configuration. |
| [Secure access to your Managed HSM](https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/how-to-secure-access) | security | 0.80 | Tutorial on securing access with Azure RBAC and local RBAC; likely includes specific role names, assignment patterns, and separation-of-duties configurations unique to Managed HSM. |
| [Secure key release policy grammar](https://learn.microsoft.com/en-us/azure/key-vault/keys/policy-grammar) | configuration | 0.80 | Documents an EBNF grammar for secure key release policy, including specific fields and operators, which is detailed configuration syntax.
|
-| [Secure your Key Vault keys](https://learn.microsoft.com/en-us/azure/key-vault/keys/secure-keys) | best-practices | 0.80 | Explicitly a security recommendations article for Key Vault keys; typically includes product-specific rotation intervals, key usage patterns, and do/don’t guidance tailored to this service, which fits best-practices with expert knowledge. |
| [Use private endpoints](https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/private-link) | security | 0.80 | Provides specific steps and parameters to integrate Managed HSM with Private Link, which are product-specific network security configurations. |
| [Virtual network service endpoints for Azure Key Vault](https://learn.microsoft.com/en-us/azure/key-vault/general/overview-vnet-service-endpoints) | security | 0.80 | Describes restricting access via VNets, IPv4 ranges, and exceptions for trusted Microsoft services. This is product-specific network security configuration, including special-case behavior, fitting the security category. |
| [Access Key Vault behind a firewall](https://learn.microsoft.com/en-us/azure/key-vault/general/access-behind-firewall) | security | 0.78 | Article specifies ports, hosts, and IP ranges required for Key Vault access from restricted networks, which are concrete, product-specific security/network configuration values. |
@@ -136,7 +135,9 @@ confusable_not_for: Not for Azure Dedicated HSM (use azure-dedicated-hsm), Azure
| [Receive notifications via Logic Apps](https://learn.microsoft.com/en-us/azure/key-vault/general/event-grid-logicapps) | integrations | 0.70 | Guide wires Key Vault events through Event Grid into Logic Apps; this requires product-specific event types, schema fields, and connector configuration parameters, which are integration details.
|
| [Recovery management with soft-delete and purge protection](https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/recovery) | configuration | 0.70 | Describes how to enable and manage recovery features with specific CLI/PowerShell commands and flags, which are configuration details. |
| [Scaling guidance](https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/scaling-guidance) | decision-making | 0.70 | Scaling guidance with benchmark performance numbers supports capacity planning and tier/size decisions, including quantified trade-offs. |
+| [Secure your Key Vault](https://learn.microsoft.com/en-us/azure/key-vault/general/secure-key-vault) | security | 0.70 | Described as a security recommendations article implementing Zero Trust principles specifically for Azure Key Vault. Such guidance typically includes product-specific security settings (e.g., private endpoints, firewall rules, RBAC role names, access policies) and concrete configuration guidance beyond generic security concepts. |
| [Secure your Key Vault certificates](https://learn.microsoft.com/en-us/azure/key-vault/certificates/secure-certificates) | security | 0.70 | Explicitly described as providing security recommendations specific to Key Vault certificates; such guidance typically includes product-specific practices (for example, recommended access models, rotation patterns, and configuration details) that go beyond generic security concepts. |
+| [Secure your Key Vault keys](https://learn.microsoft.com/en-us/azure/key-vault/keys/secure-keys) | best-practices | 0.70 | The article focuses on concrete security recommendations specific to Azure Key Vault keys and cryptographic key management (DOs/DON’Ts and product-specific guidance), which aligns with best-practices. It goes beyond generic security concepts and provides actionable, service-specific advice, but does not emphasize numeric limits, decision matrices, or configuration parameter tables.
|
| [Secure your Key Vault secrets](https://learn.microsoft.com/en-us/azure/key-vault/secrets/secure-secrets) | security | 0.70 | The article is explicitly focused on security recommendations specific to Azure Key Vault secrets management. It provides product-specific guidance on how to secure secrets (for example, how to configure access, rotation, and usage patterns) rather than generic security theory. This aligns with the 'security' sub-skill type, as it contains concrete, service-specific security practices rather than just conceptual overviews. |
| [Secure your managed HSM](https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/secure-managed-hsm) | security | 0.70 | Explicitly a security hardening article for Managed HSM. Likely includes product-specific recommendations (network isolation, identity setup, access control patterns) and possibly role names or configuration settings, which qualify as expert security best practices. |
| [Sign and verify with key](https://learn.microsoft.com/en-us/azure/key-vault/keys/javascript-developer-guide-sign-verify-key) | integrations | 0.70 | Provides concrete signing/verifying method usage and parameters in the JS client library, which are specific integration patterns. |
@@ -174,7 +175,6 @@ confusable_not_for: Not for Azure Dedicated HSM (use azure-dedicated-hsm), Azure
| TOC Title | Confidence | Reason |
|-----------|------------|--------|
| [Certificate access control](https://learn.microsoft.com/en-us/azure/key-vault/certificates/certificate-access-control) | 0.50 | Access control overview for certificates; mentions control/data plane and Entra ID but no specific RBAC role names or permission scopes. |
-| [Back up and restore objects](https://learn.microsoft.com/en-us/azure/key-vault/general/backup) | 0.45 | Backup article is likely a procedural how-to using portal/CLI/PowerShell; summary doesn’t indicate detailed parameter tables, limits, or error mappings beyond generic backup steps.
|
| [Move Key Vault to Another Resource Group](https://learn.microsoft.com/en-us/azure/key-vault/general/move-resourcegroup) | 0.45 | Moving a vault between resource groups is typically a step-by-step operation guide; description doesn’t suggest detailed constraints tables or error-code-based troubleshooting. |
| [Move Key Vault to Another Subscription](https://learn.microsoft.com/en-us/azure/key-vault/general/move-subscription) | 0.45 | Moving a vault between subscriptions is similar to index 13; mainly procedural guidance without clear indication of expert-level limits, configuration matrices, or error mappings. |
| [Create certificate signing requests](https://learn.microsoft.com/en-us/azure/key-vault/certificates/create-certificate-signing-request) | 0.40 | How-to for creating and merging CSRs; procedural guidance without evidence of detailed configuration parameters or limits. |
@@ -195,6 +195,7 @@ confusable_not_for: Not for Azure Dedicated HSM (use azure-dedicated-hsm), Azure
| [About certificates](https://learn.microsoft.com/en-us/azure/key-vault/certificates/about-certificates) | 0.30 | High-level overview of Key Vault certificates and behaviors; no detailed limits, configuration tables, error codes, or product-specific numeric thresholds. |
| [Apps, API keys, and Key Vault secrets](https://learn.microsoft.com/en-us/azure/key-vault/general/apps-api-keys-secrets) | 0.30 | Article is an overview/tutorial on using Key Vault for API keys; description suggests step-by-step usage rather than detailed configuration matrices or error/limit specifics. |
| [Azure Key Vault Developer's Guide](https://learn.microsoft.com/en-us/azure/key-vault/general/developers-guide) | 0.30 | Developer’s guide is described as a general integration overview; likely focuses on concepts and basic usage patterns rather than detailed config tables, limits, or error mappings.
|
+| [Back up and restore objects](https://learn.microsoft.com/en-us/azure/key-vault/general/backup) | 0.30 | Appears to be a how-to/tutorial on backing up Key Vault objects, not a reference for limits, configuration matrices, or error-code-based troubleshooting. Likely lacks detailed numeric limits, config parameter tables, or decision matrices. |
| [Certificate creation methods](https://learn.microsoft.com/en-us/azure/key-vault/certificates/create-certificate) | 0.30 | Describes methods to create/import certificates; summary indicates conceptual options, not detailed configuration values, limits, or security role mappings. |
| [Configure certificate rotation](https://learn.microsoft.com/en-us/azure/key-vault/certificates/tutorial-rotate-certificates) | 0.30 | Tutorial on updating autorotation frequency; summary does not show specific numeric ranges, configuration tables, or decision matrices, just procedural steps. |
| [Firmware updates](https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/firmware-update) | 0.30 | Announcement-style content about firmware update and FIPS validation; summary does not indicate detailed configuration, limits, or troubleshooting.
| diff --git a/products/azure-kubernetes-service/azure-kubernetes-service.csv b/products/azure-kubernetes-service/azure-kubernetes-service.csv index 24f9cfad..0359e789 100644 --- a/products/azure-kubernetes-service/azure-kubernetes-service.csv +++ b/products/azure-kubernetes-service/azure-kubernetes-service.csv @@ -1,5 +1,5 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type -https://learn.microsoft.com/en-us/azure/aks/access-control-managed-azure-ad,Cluster and node access control with Conditional Access,Control cluster and node access using Conditional Access with AKS-managed Microsoft Entra integration - Azure Kubernetes Service,Use Conditional Access for AKS control plane and nodes,Learn how to access clusters and nodes using Conditional Access when integrating Microsoft Entra ID in your Azure Kubernetes Service (AKS) clusters.,"When you integrate Microsoft Entra ID with your AKS cluster, you can use Conditional Access to control access to your cluster control plane and cluster nodes. This article shows you how to enable Conditional Access on your AKS clusters for both control plane access and SSH access to nodes. Note Microsoft Entra Conditional Access has Microsoft Entra ID P1, P2, or Governance capabilities requiring a Premium P2 SKU.
For more on Microsoft Entra ID licenses and SKUs, see Microsoft Entra ID Governance li",2025-12-10T12:02:00.000Z,concept-article,security,0.76,True,"Details configuration of Conditional Access for AKS API and SSH, including specific Entra policies and AKS integration steps.",unchanged +https://learn.microsoft.com/en-us/azure/aks/access-control-managed-azure-ad,Conditional Access for cluster and node access,Control cluster and node access using Conditional Access with Microsoft Entra integration - Azure Kubernetes Service,Configure Conditional Access for AKS clusters and nodes,Learn how to access clusters and nodes using Conditional Access when integrating Microsoft Entra ID in your Azure Kubernetes Service (AKS) clusters.,"When you integrate Microsoft Entra ID with your AKS cluster, you can use Conditional Access to control access to your cluster control plane and cluster nodes. This article shows you how to enable Conditional Access on your AKS clusters for both control plane access and SSH access to nodes. Note Microsoft Entra Conditional Access has Microsoft Entra ID P1, P2, or Governance capabilities requiring a Premium P2 SKU.
For more on Microsoft Entra ID licenses and SKUs, see Microsoft Entra ID Governance li",2025-12-10T12:02:00.000Z,concept-article,security,0.8,True,"Shows how to use Microsoft Entra Conditional Access with AKS; will contain specific policy settings, scopes, and role requirements, which are detailed security configuration instructions.",new https://learn.microsoft.com/en-us/azure/aks/access-private-cluster,Access a private cluster remotely,Access a Private Azure Kubernetes Service (AKS) Cluster using the Command Invoke or Run Command Feature - Azure Kubernetes Service,Access private AKS clusters using command invoke and Run command,Learn how to access a private Azure Kubernetes Service (AKS) cluster using the Azure CLI command invoke feature or the Azure portal Run command feature.,"Deploy and Explore When you access a private Azure Kubernetes Service (AKS) cluster, you need to connect to the cluster from the cluster virtual network (VNet), a peered network, or a configured private endpoint. These approaches require extra configuration, such as setting up a VPN or Express Route.
With the Azure CLI, you can use command invoke to access private clusters without the need to configure a VPN or Express Route. command invoke allows you to remotely invoke commands, like kubectl and helm,",2025-12-17T23:08:00.000Z,how-to,security,0.65,True,Uses Azure CLI command invoke and portal Run command to reach private clusters without VPN/ExpressRoute; involves specific commands and access scopes that are AKS- and Azure-specific secure access patterns.,unchanged https://learn.microsoft.com/en-us/azure/aks/active-active-solution,Active-active,Recommended active-active high availability solution overview for Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Active-active high availability pattern for AKS,Learn about the recommended active-active high availability solution overview for Azure Kubernetes Service (AKS).,"When you create an application in Azure Kubernetes Service (AKS) and choose an Azure region during resource creation, it's a single-region app. In the event of a disaster that causes the region to become unavailable, your application also becomes unavailable.
If you create an identical deployment in a secondary Azure region, your application becomes less susceptible to a single-region disaster, which guarantees business continuity, and any data replication across the regions lets you recover you",2024-08-01T20:29:00.000Z,concept-article,architecture-patterns,0.7,True,"Describes a specific AKS active-active architecture, including when to use it and trade-offs, which is product-specific design guidance.",unchanged https://learn.microsoft.com/en-us/azure/aks/active-passive-solution,Active-passive,Recommended active-passive disaster recovery solution overview for Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Active-passive disaster recovery pattern for AKS,Learn about an active-passive disaster recovery solution overview for Azure Kubernetes Service (AKS).,"When you create an application in Azure Kubernetes Service (AKS) and choose an Azure region during resource creation, it's a single-region app. When the region becomes unavailable during a disaster, your application also becomes unavailable. 
If you create an identical deployment in a secondary Azure region, your application becomes less susceptible to a single-region disaster, which guarantees business continuity, and any data replication across the regions lets you recover your last application",2024-08-01T20:29:00.000Z,concept-article,architecture-patterns,0.7,True,"Provides an AKS-focused active-passive DR architecture with scenario-based guidance and trade-offs, beyond generic DR theory.",unchanged @@ -23,9 +23,14 @@ https://learn.microsoft.com/en-us/azure/aks/airflow-deploy,Configure and deploy https://learn.microsoft.com/en-us/azure/aks/airflow-overview,Overview,Deploy Apache Airflow on AKS with Helm - Azure Kubernetes Service,,Learn the high-level architecture of deploying production-ready Apache Airflow on Azure Kubernetes Service (AKS) and the available Airflow executors.,"In this guide, you deploy Apache Airflow on Azure Kubernetes Service (AKS) using Helm. You learn how to set up an AKS cluster, install Helm, deploy Airflow using the Helm chart, and explore the Airflow UI. This article provides a high-level overview of the architecture and components involved in deploying production-ready Airflow on AKS. Important Open-source software is mentioned throughout AKS documentation and samples. 
Software that you deploy is excluded from AKS service-level agreements, li",2025-08-05T22:07:00.000Z,overview,,0.3,False,High-level architecture overview for Airflow on AKS; summary does not indicate detailed configuration parameters or limits.,unchanged https://learn.microsoft.com/en-us/azure/aks/aks-communication-manager,AKS Communication Manager,AKS Communication Manager - Azure Kubernetes Service,Configure AKS Communication Manager for maintenance notifications,Learn how to set up and receive notices in Azure Resource Notifications for Azure Kubernetes Service maintenance events.,"The Azure Kubernetes Service (AKS) communication manager streamlines notifications for all your AKS maintenance tasks by using Azure Resource Notifications and Azure Resource Graph frameworks. The communication manager gives you timely alerts on event triggers and outcomes, so that you can closely monitor your upgrades. If maintenance fails, the communication manager notifies you with the reasons for the failure. This information reduces operational hassles related to observability and follow-up",2025-10-01T17:10:00.000Z,how-to,configuration,0.7,True,Describes how to set up notifications using Azure Resource Notifications and Resource Graph for AKS maintenance events—product-specific configuration.,unchanged https://learn.microsoft.com/en-us/azure/aks/aks-component-versioning,AKS component versioning,Understanding AKS component versioning - Azure Kubernetes Service,Manage AKS component versioning and patching strategy,"Learn how different AKS components are versioned, patched, and upgraded across AKS cluster control plane and nodes.","Azure Kubernetes Service (AKS) manages multiple components that follow different versioning and patching strategies. Understanding how these components are versioned helps you plan upgrades, track security patches, and manage your cluster lifecycle effectively. 
This article explains the versioning approach for different categories of AKS components and how they're maintained over time.",2025-08-06T05:08:00.000Z,concept-article,configuration,0.7,True,"Explains how different AKS components are versioned and patched, including product-specific versioning behavior and upgrade mechanics.",unchanged -https://learn.microsoft.com/en-us/azure/aks/aks-desktop-app,Deploy an application using AKS desktop,Deploy an application using AKS desktop (preview) - Azure Kubernetes Service,,"This article guides you through deploying an application using AKS desktop, enabling you to manage your containerized workloads with an intuitive, application-centric interface.","Applies to: ✔️AKS Automatic clusters Deploy applications to Azure Kubernetes Service (AKS) using AKS desktop, an application-focused experience that simplifies Kubernetes management. This guide walks you through the steps to deploy your first application using AKS desktop.",2026-01-12T23:20:00.000Z,how-to,,0.45,False,Step-by-step app deployment via AKS desktop; likely a basic tutorial without detailed config matrices or limits.,unchanged -https://learn.microsoft.com/en-us/azure/aks/aks-desktop-overview,AKS desktop overview,AKS desktop for Azure Kubernetes Service (AKS) (preview) - Azure Kubernetes Service,,"Learn about AKS desktop, an application-focused experience for deploying and managing workloads on Azure Kubernetes Service (AKS) that accelerates time to business value.","Applies to: ✔️AKS Automatic clusters AKS desktop delivers an application-focused experience for deploying and managing workloads on Azure Kubernetes Service (AKS). It accelerates time to business value by providing a guided, self-service user experience (UX) built on supported AKS features, best practices, and open-source Headlamp.
AKS desktop works within your existing environment and tools, enabling team collaboration through role-based access control (RBAC) while simplifying Kubernetes managem",2026-01-12T23:20:00.000Z,overview,,0.5,False,"Overview of AKS desktop experience; summary mentions best practices but not concrete parameters, limits, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/aks/aks-desktop-permissions,Set up permissions in AKS desktop,Set up permissions and role-based access control (RBAC) in AKS desktop (preview) - Azure Kubernetes Service,Set up RBAC permissions for AKS desktop users,Learn how to set up permissions and role-based access control (RBAC) for AKS desktop in Azure Kubernetes Service (AKS) based on your role as a cluster operator or developer.,"Applies to: ✔️AKS Automatic clusters AKS desktop uses Azure role-based access control (RBAC) to manage user permissions for accessing and managing resources within AKS desktop. Depending on your role as a cluster operator or developer, you have different responsibilities and required permissions to work with AKS desktop effectively. This article guides you through the setup process based on your role: cluster operator or developer. Note When you create Projects in AKS desktop, AKS managed namespa",2026-01-12T23:20:00.000Z,how-to,security,0.8,True,Describes Azure RBAC roles and required permissions for operators vs developers in AKS desktop; product-specific security and access configuration.,unchanged +https://learn.microsoft.com/en-us/azure/aks/aks-desktop-app,Deploy an application,Deploy an Application using AKS Desktop for Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,Learn how to deploy a containerized application to AKS using AKS desktop without writing Kubernetes manifests.,"This article shows you how to sign in, add a cluster, create a Project, deploy an application, and view metrics in AKS desktop for Azure Kubernetes Service (AKS).
For an overview of AKS desktop, see AKS desktop overview.",2026-04-23T22:11:00.000Z,how-to,,0.2,False,"How-to guide for signing in, adding a cluster, creating a project, and deploying an app. It reads as a basic usage tutorial without explicit mention of configuration tables, limits, RBAC roles, or troubleshooting mappings.",new +https://learn.microsoft.com/en-us/azure/aks/aks-desktop-deploy-ai-assistant,Use AI with AKS desktop,Troubleshoot Azure Kubernetes Service (AKS) Workloads with Natural Language in AKS Desktop (preview) - Azure Kubernetes Service,Use AI assistant to troubleshoot AKS Desktop workloads,Learn how to use the AI-powered troubleshooting assistant in AKS desktop to diagnose and resolve Kubernetes issues using natural language.,"AKS desktop includes an AI-powered troubleshooting assistant that provides natural language diagnostics for your AKS workloads. By securely connecting to your cluster, the assistant can analyze logs, events, and metrics to identify issues and recommend resolutions in plain language. This allows developers and operators to quickly understand and fix problems without needing deep Kubernetes expertise. This article walks you through how to enable and use the AI troubleshooting assistant in AKS desk",2026-04-23T22:11:00.000Z,how-to,troubleshooting,0.68,True,"The article explains how to enable and use an AI-powered troubleshooting assistant that analyzes AKS logs, events, and metrics to identify issues and recommend resolutions.
This is a product-specific troubleshooting workflow (symptom to diagnosis to resolution) using a unique AKS Desktop feature, which aligns with the troubleshooting sub-skill rather than generic conceptual content.",new +https://learn.microsoft.com/en-us/azure/aks/aks-desktop-deploy-troubleshooting,Troubleshoot an application,Troubleshoot an Application using Insights in AKS Desktop (preview) - Azure Kubernetes Service,Use AKS Desktop Insights to troubleshoot Kubernetes apps,Learn how to troubleshoot Kubernetes applications in AKS desktop using the built-in Insights feature powered by Inspektor Gadget.,"AKS desktop includes an integrated troubleshooting suite called Insights, powered by Inspektor Gadget, an open-source eBPF (extended Berkeley Packet Filter)-based debugging tool. Without modifying your code or restarting anything, it lets you understand network traffic between pods, trace DNS failures, and explore running processes to spot unexpected or resource-heavy activity, all through an intuitive UI with a few clicks. This article walks you through enabling Insights and using it to diagnose ",2026-04-23T22:11:00.000Z,how-to,troubleshooting,0.72,True,"The page describes a product-specific troubleshooting suite (Insights powered by Inspektor Gadget) and how to use it to diagnose issues like network traffic, DNS failures, and process anomalies. This is organized around diagnosing and resolving issues in AKS Desktop with concrete, tool-specific guidance, which fits the troubleshooting sub-skill.
It goes beyond generic debugging by focusing on this specific integrated tool and workflow.",new +https://learn.microsoft.com/en-us/azure/aks/aks-desktop-install-cluster-setup,Set up a cluster for AKS desktop,Configure an Azure Kubernetes Service (AKS) Cluster for AKS Desktop - Azure Kubernetes Service,Configure AKS clusters for AKS Desktop compatibility,Learn how to set up a compatible Azure Kubernetes Service (AKS) cluster for AKS desktop.,"AKS desktop supports Standard and Automatic (recommended) AKS clusters. However, there are some differences in setup and features available between the two tiers. This article covers recommended Azure Kubernetes Service (AKS) cluster configurations and add-ons for AKS desktop. Tip If you want to get started with AKS desktop as quickly as possible, we recommend:",2026-04-23T22:11:00.000Z,how-to,configuration,0.7,True,"Described as covering recommended AKS cluster configurations and add-ons for AKS Desktop, and differentiating between Standard and Automatic clusters. This implies concrete configuration options, required add-ons, and possibly tier-specific settings unique to AKS Desktop, matching the configuration sub-skill.",new +https://learn.microsoft.com/en-us/azure/aks/aks-desktop-overview,AKS desktop overview,AKS Desktop for Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,"Learn what AKS desktop is, who it's for, and how it helps development teams deploy and manage applications on AKS without deep Kubernetes expertise.","This article provides an overview of AKS desktop: an application-focused developer portal for Azure Kubernetes Service (AKS) that simplifies application deployment and management without requiring deep Kubernetes expertise. AKS desktop is built on supported AKS features, best practices, and open-source Headlamp. To install AKS desktop, see the AKS desktop GitHub repository. Important AKS desktop abstracts Kubernetes complexity but doesn't remove access.
You can still use kubectl, YAML, or other ex",2026-04-23T22:11:00.000Z,overview,,0.1,False,"High-level overview of AKS Desktop, its purpose, and benefits. No indication of numeric limits, configuration tables, RBAC role details, or troubleshooting content.",updated +https://learn.microsoft.com/en-us/azure/aks/aks-desktop-permissions,Set up permissions,Set up Permissions and Role-Based Access Control (RBAC) in AKS Desktop - Azure Kubernetes Service,Configure RBAC permissions for AKS Desktop roles,Learn how to configure RBAC permissions for AKS desktop based on your role as a cluster operator or developer.,"Depending on your role as a cluster operator or developer, you can provide an environment (Project) in AKS desktop for developers to deploy, migrate, or manage applications, or allow them to self-serve deploying and managing applications on a dedicated AKS cluster. Alternatively, you might want to grant more developers access to manage, observe, and troubleshoot applications. By default, when you create a Project in AKS desktop you can share this with other people in your organization, and it se",2026-04-23T22:11:00.000Z,how-to,security,0.8,True,"Focused on setting up permissions and RBAC for AKS Desktop, distinguishing cluster operator and developer roles. This will necessarily include specific RBAC role names, scopes, and permission configurations unique to AKS Desktop usage, which fits the security sub-skill.",new +https://learn.microsoft.com/en-us/azure/aks/aks-desktop-projects,Projects in AKS desktop,Projects in AKS Desktop - Azure Kubernetes Service,,"Learn about Projects in AKS desktop, the application-centric grouping of Kubernetes resources that simplifies deployment and management of applications on Azure Kubernetes Service (AKS).","Projects are the primary units for managing applications in AKS desktop. Projects group related Kubernetes resources, such as deployments, services, and configuration, into a single logical unit. 
This article provides an overview of Projects in AKS desktop, including their benefits, features, and how they compare to traditional Kubernetes namespaces.",2026-04-23T22:11:00.000Z,overview,,0.1,False,"Conceptual explanation of Projects in AKS Desktop and how they relate to namespaces. No specific configuration parameters, limits, or decision matrices are indicated.",new +https://learn.microsoft.com/en-us/azure/aks/aks-desktop-quickstart-auto,Quickstart,Quickstart: Get Started Deploying and Managing Applications using AKS Automatic with AKS Desktop - Azure Kubernetes Service,,Learn how to deploy and manage a containerized application on Azure Kubernetes Service (AKS) using AKS desktop without writing Kubernetes manifests.,"Deploying and managing Kubernetes applications typically requires writing YAML manifests, running kubectl commands, and switching between multiple tools. AKS desktop removes this complexity with guided workflows that let developers and DevOps engineers deploy, monitor, troubleshoot, and clean up applications without deep Kubernetes expertise. This guide walks you through deploying a TypeScript application using AKS desktop, from creating a cluster through exploring application logs, metrics, sca",2026-04-23T22:11:00.000Z,quickstart,,0.2,False,"Quickstart walkthrough for deploying an app with AKS Desktop. Primarily step-by-step tutorial; no evidence of detailed configuration matrices, limits, or product-specific troubleshooting/error-code mappings.",new https://learn.microsoft.com/en-us/azure/aks/aks-diagnostics,Configure AKS diagnostics,Azure Kubernetes Service (AKS) Diagnose and Solve Problems overview - Azure Kubernetes Service,,Learn about self-diagnosing clusters in Azure Kubernetes Service.,"Troubleshooting Azure Kubernetes Service (AKS) cluster issues plays an important role in maintaining your cluster, especially if your cluster is running mission-critical workloads. 
AKS Diagnose and Solve Problems is an intelligent, self-diagnostic experience that:",2024-10-09T21:59:00.000Z,how-to,,0.35,False,Overview of the Diagnose and Solve Problems experience; summary does not indicate detailed error-code mappings or configuration tables.,unchanged https://learn.microsoft.com/en-us/azure/aks/aks-extension-attach-azure-container-registry,Attach to Azure Container Registry (ACR),Attach to Azure Container Registry (ACR) using the Azure Kubernetes Service (AKS) extension for Visual Studio Code - Azure Kubernetes Service,Attach AKS extension in VS Code to Azure Container Registry,Learn how to attach to Azure Container Registry (ACR) using the Azure Kubernetes Service (AKS) extension for Visual Studio Code.,"In this article, you learn how to attach to Azure Container Registry (ACR) using the Azure Kubernetes Service (AKS) extension for Visual Studio Code.",2024-08-01T20:29:00.000Z,how-to,integrations,0.7,True,Describes how to configure the AKS VS Code extension to connect to ACR; involves product-specific integration steps and parameters.,unchanged https://learn.microsoft.com/en-us/azure/aks/aks-extension-draft-deployment,Create a Kubernetes deployment,Create a Kubernetes deployment using Automated Deployments in the Azure Kubernetes Service (AKS) extension for Visual Studio Code - Azure Kubernetes Service,,Learn how create a Kubernetes deployment using Automated Deployments in the Azure Kubernetes Service (AKS) extension for Visual Studio Code.,"In this article, you learn how to create a Kubernetes deployment using Automated Deployments in the Azure Kubernetes Service (AKS) extension for Visual Studio Code. 
Automated Deployments provides an easy way to automate the process of scaling, updating, and maintaining your applications.",2024-08-01T20:29:00.000Z,how-to,,0.3,False,"Tutorial on creating a Kubernetes deployment using the AKS VS Code extension; primarily workflow guidance, not detailed configuration references or limits.",unchanged @@ -38,11 +43,12 @@ https://learn.microsoft.com/en-us/azure/aks/aks-managed-gpu-nodes,Use managed GP https://learn.microsoft.com/en-us/azure/aks/aks-migration,Plan and execute a migration,Migrate to Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Plan and execute migration to Azure Kubernetes Service,This article shows you how to migrate to Azure Kubernetes Service (AKS).,"To help you plan and execute a successful migration to Azure Kubernetes Service (AKS), this guide provides details for the current recommended AKS configuration. While this article doesn't cover every scenario, it contains links to more detailed information for planning a successful migration. 
In this article, we summarize migration details for: Note Depending on your scenario, the following open-source tools might help with your migration:",2024-10-30T21:59:00.000Z,concept-article,decision-making,0.7,True,Migration guide summarizing recommended AKS configurations and linking to detailed planning content; provides concrete guidance for migration decisions rather than just conceptual overview.,unchanged https://learn.microsoft.com/en-us/azure/aks/aks-model-context-protocol-server,AKS MCP server,Connect your Azure Kubernetes Service (AKS) cluster to AI agents using the Model Context Protocol (MCP) server - Azure Kubernetes Service,Connect AKS clusters to AI agents via MCP server,Learn how to install and use the Model Context Protocol (MCP) server to intelligently troubleshoot and manage your Azure Kubernetes Service (AKS) clusters.,"The AKS Model Context Protocol (MCP) server enables AI assistants to interact with Azure Kubernetes Service (AKS) clusters with clarity, safety, and control. It serves as a bridge between AI tools (like GitHub Copilot, Claude, and other MCP-compatible AI assistants) and AKS, translating natural language requests into AKS operations and returning the results in a format the AI tools can understand. 
The AKS MCP server connects to Azure using the Azure SDK and provides a set of tools that AI assist",2026-02-18T06:03:00.000Z,how-to,integrations,0.8,True,"Describes MCP server installation and tool definitions that bridge AI assistants and AKS, with specific operations and configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/aks/aks-production-upgrade-strategies,Production upgrade strategies,AKS Production Upgrade Strategies - Azure Kubernetes Service,Apply production upgrade patterns for AKS clusters,Proven patterns to use for upgrading Azure Kubernetes Service clusters in production with minimal downtime and maximum safety.,Upgrade your production Azure Kubernetes Service (AKS) clusters safely by using these proven patterns. Each strategy is optimized for specific business constraints and risk tolerance.,2025-08-02T05:06:00.000Z,how-to,architecture-patterns,0.7,True,Provides proven patterns for production upgrades with trade-offs for downtime and safety; these are AKS-specific operational patterns.,unchanged +https://learn.microsoft.com/en-us/azure/aks/aks-service-permissions,AKS service permissions reference,AKS service permissions reference - Azure Kubernetes Service,Reference AKS service permissions and required roles,"Reference for the Azure permissions required by the identity creating an AKS cluster, the cluster identity at runtime, and AKS node access.","When creating a cluster, AKS generates or modifies resources it needs (like VMs and NICs) to create and run the cluster on behalf of the user. This identity is distinct from the cluster's identity permission, which is created during cluster creation. For the built-in roles used to grant these permissions, see Azure built-in roles for Containers. For a worked example of granting a service principal the permissions needed for a custom virtual network, see Use a service principal with AKS.
For an ori",2026-04-23T06:10:00.000Z,reference,security,0.85,True,"Permissions reference for AKS; will enumerate specific Azure RBAC roles, actions, and scopes required for cluster creation and runtime, which is product-specific security configuration knowledge.",new https://learn.microsoft.com/en-us/azure/aks/aks-support-help,Support options for AKS,Support and troubleshooting for Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Support and troubleshooting options for Azure Kubernetes Service,This article provides support and troubleshooting options for Azure Kubernetes Service (AKS).,,2025-06-30T08:00:00.000Z,troubleshooting,troubleshooting,0.7,True,"Central troubleshooting article likely maps common AKS issues and error conditions to diagnostic steps and solutions, including product-specific commands and guidance.",unchanged https://learn.microsoft.com/en-us/azure/aks/aks-virtual-machine-sizes,"VM sizes, generations, and features","Virtual Machine (VM) Sizes, Generations, and Features for Azure Kubernetes Service (AKS) - Azure Kubernetes Service",Choose VM sizes and generations for AKS workloads,"Learn about VM fundamentals on AKS, like different VM sizes, generations, features, how to check for available VM sizes, why some VM sizes might not be available, and what happens when a VM size retir","Azure Kubernetes Service (AKS) supports various virtual machine (VM) sizes, generations, and features to accommodate different workloads and performance requirements. 
This article provides an overview of available VM sizes and generations for AKS, how to check for available VM sizes in your region, reasons why certain VM sizes might not be available, and what happens when a VM size retires.",2025-10-23T22:11:00.000Z,overview,decision-making,0.65,True,"Discusses available VM sizes, generations, and reasons some sizes are unavailable or retired; used to decide which VM SKUs to use for AKS, aligning with decision-making on capacity and SKU selection.",unchanged https://learn.microsoft.com/en-us/azure/aks/api-server-authorized-ip-ranges,Define API server authorized IP ranges,API Server Authorized IP Ranges in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Secure AKS API server with authorized IP ranges,Learn how to secure access to the API server in Azure Kubernetes Service (AKS) using authorized IP address ranges to limit access to specific IP addresses and CIDRs.,This article shows you how to use API server authorized IP address ranges to limit which IP addresses and CIDRs can access control plane endpoints for your Azure Kubernetes Service (AKS) workloads.,2025-12-30T23:03:00.000Z,how-to,security,0.8,True,How-to for configuring authorized IP ranges; includes specific CLI/ARM parameters and constraints for AKS control plane endpoints.,unchanged https://learn.microsoft.com/en-us/azure/aks/api-server-service-tags,Use service tags for API server authorized IP ranges,Use Service Tags for API Server Authorized IP Ranges in Azure Kubernetes Service (AKS) (preview) - Azure Kubernetes Service,Use service tags for AKS API authorized IP ranges,Learn how to use service tags to specify authorized IP ranges for the API server in Azure Kubernetes Service (AKS).,"Service tags for API server authorized IP ranges is a preview feature that allows you to use service tags to specify authorized IP ranges for the API server in Azure Kubernetes Service (AKS). 
This feature simplifies the management of authorized IP ranges by allowing you to use predefined service tags instead of manually specifying individual IP addresses or CIDR ranges. Important AKS preview features are available on a self-service, opt-in basis. Previews are provided ""as is"" and ""as available,""",2025-12-30T23:03:00.000Z,how-to,security,0.75,True,Preview feature configuration; likely includes specific service tag names and usage patterns for AKS API server security.,unchanged -https://learn.microsoft.com/en-us/azure/aks/api-server-vnet-integration,Use API Server VNet integration,API Server VNet Integration in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,Learn how to create an Azure Kubernetes Service (AKS) cluster with API Server VNet Integration,"An Azure Kubernetes Service (AKS) cluster configured with API Server VNet Integration projects the API server endpoint directly into a delegated subnet in the VNet where AKS is deployed. API Server VNet Integration enables network communication between the API server and the cluster nodes without requiring a private link or tunnel. The API server is available behind an internal load balancer VIP in the delegated subnet, which the nodes are configured to utilize. By using API Server VNet Integrat",2026-04-15T08:00:00.000Z,how-to,,0.4,False,"The summary indicates this is primarily a conceptual/feature explanation of API Server VNet Integration and how it works (projecting the API server into a delegated subnet, internal load balancer VIP, etc.). 
There is no clear evidence of detailed configuration tables, limits, error codes, or prescriptive decision matrices in the description, so it does not clearly meet any expert-knowledge sub-skill criteria based on the provided snippet.",updated +https://learn.microsoft.com/en-us/azure/aks/api-server-vnet-integration,Use API Server VNet integration,API Server VNet Integration in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,Learn how to create an Azure Kubernetes Service (AKS) cluster with API Server VNet Integration,"An Azure Kubernetes Service (AKS) cluster configured with API Server VNet Integration projects the API server endpoint directly into a delegated subnet in the VNet where AKS is deployed. API Server VNet Integration enables network communication between the API server and the cluster nodes without requiring a private link or tunnel. The API server is available behind an internal load balancer VIP in the delegated subnet, which the nodes are configured to utilize. By using API Server VNet Integrat",2026-04-15T08:00:00.000Z,how-to,,0.4,False,"The summary indicates this is primarily a conceptual/feature explanation of API Server VNet Integration and how it works (projecting the API server into a delegated subnet, internal load balancer VIP, etc.). 
There is no clear evidence of detailed configuration tables, limits, error codes, or prescriptive decision matrices in the description, so it does not clearly meet any expert-knowledge sub-skill criteria based on the provided snippet.",unchanged https://learn.microsoft.com/en-us/azure/aks/app-routing,Application routing add-on overview,Azure Kubernetes Service (AKS) managed NGINX ingress with the application routing add-on - Azure Kubernetes Service,,Use the application routing add-on to securely access applications deployed on Azure Kubernetes Service (AKS).,"Caution The Kubernetes SIG Network and the Security Response Committee announced the upcoming retirement of the Ingress NGINX project, with maintenance ending in March 2026. There's no immediate action required today for AKS clusters using the application routing add-on with NGINX. Microsoft will provide official support for critical security patches for application routing add-on NGINX Ingress resources through November 2026. AKS is aligning with upstream Kubernetes by moving to Gateway API as the long-t",2026-03-19T11:02:00.000Z,how-to,,0.3,False,"Application routing add-on with NGINX ingress description and deprecation notice; summary suggests high-level guidance and support timelines, not detailed limits, config matrices, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/aks/app-routing-dns-ssl,Custom domain name and SSL certificate configuration,Set up a Custom Domain Name and SSL Certificate with the Application Routing Add-on for Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Set up custom domains and SSL for AKS application routing,Understand the advanced configuration options that are supported with the application routing add-on for Azure Kubernetes Service (AKS).,"Important The Kubernetes SIG Network and the Security Response Committee announced the upcoming retirement of the Ingress NGINX project, with maintenance ending in March 2026. 
There's no immediate action required today for AKS clusters using the Application Routing add-on with NGINX. Microsoft will provide official support for critical security patches for Application Routing add-on NGINX Ingress resources through November 2026. AKS is aligning with upstream Kubernetes by moving to Gateway API as the long",2026-02-04T23:11:00.000Z,how-to,configuration,0.8,True,"Focuses on DNS and SSL configuration with the application routing add-on; expected to include specific annotation names, certificate settings, and DNS configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/aks/app-routing-gateway-api,Kubernetes Gateway API with application routing (preview),Azure Kubernetes Service (AKS) application routing add-on with the Kubernetes Gateway API (preview) - Azure Kubernetes Service,,Use the application routing add-on to manage ingress traffic on Azure Kubernetes Service (AKS) using the Kubernetes Gateway API.,"Important AKS preview features are available on a self-service, opt-in basis. Previews are provided ""as is"" and ""as available,"" and they're excluded from the service-level agreements and limited warranty. AKS previews are partially covered by customer support on a best-effort basis. As such, these features aren't meant for production use. 
For more information, see the following support articles: Caution The Kubernetes SIG Network and the Security Response Committee announced the upcoming retirement",2026-03-24T22:13:00.000Z,how-to,,0.3,False,"Covers AKS application routing add-on with Gateway API in preview; summary focuses on preview caveats and high-level usage, not on numeric limits, decision matrices, or detailed configuration/diagnostic tables.",unchanged
You can either bring your own deployment files for quick pipeline creation or generate Dockerfiles and Kubernetes manifests to containerize and deploy non-container",2025-05-19T15:24:00.000Z,how-to,deployment,0.75,True,"Describes using Automated Deployments to generate GitHub Actions or Azure DevOps pipelines for AKS, including AKS-specific pipeline configuration.",unchanged https://learn.microsoft.com/en-us/azure/aks/automatic/aks-automatic-managed-system-node-pools,Create an AKS Automatic cluster with managed system node pools,Create an Azure Kubernetes Service (AKS) Automatic Cluster with Managed System Node Pools (Preview) - Azure Kubernetes Service,,Learn how to create an Azure Kubernetes Service (AKS) Automatic cluster with managed system node pools using the Azure CLI.,"In this quickstart, you learn how to create an Azure Kubernetes Service (AKS) Automatic cluster with managed system node pools (preview) using the Azure CLI. With managed system node pools on AKS Automatic clusters, AKS manages the system node pool and its components for you. AKS automatically handles creating, upgrading, and scaling the system node pool, so you can focus on your application workloads. Important AKS preview features are available on a self-service, opt-in basis. 
Previews are provi",2025-11-28T23:03:00.000Z,quickstart,,0.4,False,"Quickstart to create AKS Automatic cluster with managed system node pools; focuses on CLI steps, not on enumerating configuration options or quotas.",unchanged https://learn.microsoft.com/en-us/azure/aks/automatic/aks-automatic-managed-system-node-pools-about,Overview,Overview of AKS Automatic Clusters with Managed System Node Pools (Preview) - Azure Kubernetes Service,,"Learn about the managed system node pools (preview) feature for Azure Kubernetes Service (AKS) Automatic clusters, including its key features, benefits, and restrictions.","In this article, you learn about the managed system node pools (preview) feature for Azure Kubernetes Service (AKS) Automatic clusters. With this feature, AKS automatically manages system node pools in your cluster, including configuration, scaling, and maintenance. To create an AKS Automatic cluster with managed system node pools, see the Create an Azure Kubernetes Service (AKS) Automatic cluster with managed system node pools (preview) quickstart. Important AKS preview features are available on a",2025-11-18T16:04:00.000Z,overview,,0.4,False,"Overview of managed system node pools for AKS Automatic; describes feature, benefits, and restrictions at a high level without detailed numeric limits or config tables in the summary.",unchanged -https://learn.microsoft.com/en-us/azure/aks/automatic/quick-automatic-custom-network,Public cluster,Quickstart: Create an Azure Kubernetes Service (AKS) Automatic cluster in a custom virtual network - Azure Kubernetes Service,Create AKS Automatic cluster in custom VNet with SLA,Learn how to quickly deploy a Kubernetes cluster and deploy an application in Azure Kubernetes Service (AKS) Automatic in a custom virtual network.,"Applies to: ✔️ AKS Automatic Azure Kubernetes Service (AKS) Automatic provides the easiest managed Kubernetes experience for developers, DevOps engineers, and platform engineers. 
Ideal for modern and AI applications, AKS Automatic automates AKS cluster setup and operations and embeds best practice configurations. Users of any skill level can benefit from the security, performance, and dependability of AKS Automatic for their applications. AKS Automatic also includes a pod readiness SLA that guarante",2026-04-14T06:03:00.000Z,quickstart,limits-quotas,0.65,True,"Quickstart for AKS Automatic in a custom virtual network that again references the pod readiness SLA with the same quantified guarantee (99.9% within 5 minutes); this is a specific numeric service constraint, fitting limits-quotas best.",updated -https://learn.microsoft.com/en-us/azure/aks/automatic/quick-automatic-from-code,From a code repository,Quickstart: Deploy an application to a new Azure Kubernetes Service (AKS) Automatic cluster from a code repository using Automated Deployments - Azure Kubernetes Service,,Learn how to quickly deploy an application from source code to a new Azure Kubernetes Service (AKS) Automatic cluster.,"Applies to: ✔️ AKS Automatic Use automated deployments to build and deploy an application from a code repository to a new or existing AKS Automatic cluster. Automated deployments simplify the process of setting up a GitHub Action workflow to build and deploy your code. Once connected, every new commit you make kicks off the pipeline. Automated deployments build on draft.sh. 
When you create a new deployment workflow, you can use an existing Dockerfile, generate a Dockerfile, use existing Kubernetes m",2026-04-14T06:03:00.000Z,quickstart,,0.2,False,"Quickstart workflow for deploying from source using automated deployments; appears to be step-by-step tutorial without detailed config tables, limits, or troubleshooting mappings.",updated -https://learn.microsoft.com/en-us/azure/aks/automatic/quick-automatic-managed-network,Inside a managed virtual network,Quickstart: Create an Azure Kubernetes Service (AKS) Automatic cluster - Azure Kubernetes Service,Use AKS Automatic pod readiness SLA guarantees,Learn how to quickly deploy a Kubernetes cluster and deploy an application in Azure Kubernetes Service (AKS) Automatic.,"Applies to: ✔️ AKS Automatic Azure Kubernetes Service (AKS) Automatic is a managed Kubernetes experience that automates AKS cluster setup and operations and embeds best practice configurations. AKS Automatic also includes a [pod readiness SLA][azure-sla] that guarantees 99.9% of qualifying pod readiness operations complete within 5 minutes, guaranteeing reliable, self-healing infrastructure for your applications. 
In this quickstart, you learn how to:",2026-04-15T06:34:00.000Z,quickstart,limits-quotas,0.65,True,"Quickstart for creating an AKS Automatic cluster that explicitly includes a pod readiness SLA with a quantified guarantee (99.9% of qualifying pod readiness operations complete within 5 minutes), which is a specific numeric reliability constraint not generally known; closest fit is limits-quotas because it defines a concrete time-bound service guarantee.",updated -https://learn.microsoft.com/en-us/azure/aks/automatic/quick-automatic-private-custom-network,Private cluster,Quickstart: Create a private Azure Kubernetes Service (AKS) Automatic cluster in a custom virtual network - Azure Kubernetes Service,Create private AKS Automatic cluster with readiness SLA,Learn how to quickly deploy a private Kubernetes cluster and deploy an application in Azure Kubernetes Service (AKS) Automatic in a custom virtual network.,"Applies to: ✔️ AKS Automatic Azure Kubernetes Service (AKS) Automatic provides the easiest managed Kubernetes experience for developers, DevOps engineers, and platform engineers. Ideal for modern and AI applications, AKS Automatic automates AKS cluster setup and operations and embeds best practice configurations. Users of any skill level can benefit from the security, performance, and dependability of AKS Automatic for their applications. 
AKS Automatic also includes a pod readiness SLA that guarante",2026-04-14T06:03:00.000Z,quickstart,limits-quotas,0.65,True,"Quickstart for a private AKS Automatic cluster in a custom VNet that includes the same pod readiness SLA statement with explicit numeric guarantee; this constitutes product-specific numeric limits/constraints, aligning with limits-quotas.",updated +https://learn.microsoft.com/en-us/azure/aks/automatic/quick-automatic-custom-network,Public cluster,Quickstart: Create an Azure Kubernetes Service (AKS) Automatic cluster in a custom virtual network - Azure Kubernetes Service,Create AKS Automatic cluster in custom VNet with SLA,Learn how to quickly deploy a Kubernetes cluster and deploy an application in Azure Kubernetes Service (AKS) Automatic in a custom virtual network.,"Applies to: ✔️ AKS Automatic Azure Kubernetes Service (AKS) Automatic provides the easiest managed Kubernetes experience for developers, DevOps engineers, and platform engineers. Ideal for modern and AI applications, AKS Automatic automates AKS cluster setup and operations and embeds best practice configurations. Users of any skill level can benefit from the security, performance, and dependability of AKS Automatic for their applications. 
AKS Automatic also includes a pod readiness SLA that guarante",2026-04-14T06:03:00.000Z,quickstart,limits-quotas,0.65,True,"Quickstart for AKS Automatic in a custom virtual network that again references the pod readiness SLA with the same quantified guarantee (99.9% within 5 minutes); this is a specific numeric service constraint, fitting limits-quotas best.",unchanged +https://learn.microsoft.com/en-us/azure/aks/automatic/quick-automatic-from-code,From a code repository,Quickstart: Deploy an application to a new Azure Kubernetes Service (AKS) Automatic cluster from a code repository using Automated Deployments - Azure Kubernetes Service,,Learn how to quickly deploy an application from source code to a new Azure Kubernetes Service (AKS) Automatic cluster.,"Applies to: ✔️ AKS Automatic Use automated deployments to build and deploy an application from a code repository to a new or existing AKS Automatic cluster. Automated deployments simplify the process of setting up a GitHub Action workflow to build and deploy your code. Once connected, every new commit you make kicks off the pipeline. Automated deployments build on draft.sh. 
When you create a new deployment workflow, you can use an existing Dockerfile, generate a Dockerfile, use existing Kubernetes m",2026-04-14T06:03:00.000Z,quickstart,,0.2,False,"Quickstart workflow for deploying from source using automated deployments; appears to be step-by-step tutorial without detailed config tables, limits, or troubleshooting mappings.",unchanged +https://learn.microsoft.com/en-us/azure/aks/automatic/quick-automatic-managed-network,Inside a managed virtual network,Quickstart: Create an Azure Kubernetes Service (AKS) Automatic cluster - Azure Kubernetes Service,Use AKS Automatic pod readiness SLA guarantees,Learn how to quickly deploy a Kubernetes cluster and deploy an application in Azure Kubernetes Service (AKS) Automatic.,"Applies to: ✔️ AKS Automatic Azure Kubernetes Service (AKS) Automatic is a managed Kubernetes experience that automates AKS cluster setup and operations and embeds best practice configurations. AKS Automatic also includes a [pod readiness SLA][azure-sla] that guarantees 99.9% of qualifying pod readiness operations complete within 5 minutes, guaranteeing reliable, self-healing infrastructure for your applications. 
In this quickstart, you learn how to:",2026-04-15T06:34:00.000Z,quickstart,limits-quotas,0.65,True,"Quickstart for creating an AKS Automatic cluster that explicitly includes a pod readiness SLA with a quantified guarantee (99.9% of qualifying pod readiness operations complete within 5 minutes), which is a specific numeric reliability constraint not generally known; closest fit is limits-quotas because it defines a concrete time-bound service guarantee.",unchanged +https://learn.microsoft.com/en-us/azure/aks/automatic/quick-automatic-private-custom-network,Private cluster,Quickstart: Create a private Azure Kubernetes Service (AKS) Automatic cluster in a custom virtual network - Azure Kubernetes Service,Create private AKS Automatic cluster with readiness SLA,Learn how to quickly deploy a private Kubernetes cluster and deploy an application in Azure Kubernetes Service (AKS) Automatic in a custom virtual network.,"Applies to: ✔️ AKS Automatic Azure Kubernetes Service (AKS) Automatic provides the easiest managed Kubernetes experience for developers, DevOps engineers, and platform engineers. Ideal for modern and AI applications, AKS Automatic automates AKS cluster setup and operations and embeds best practice configurations. Users of any skill level can benefit from the security, performance, and dependability of AKS Automatic for their applications. 
AKS Automatic also includes a pod readiness SLA that guarante",2026-04-14T06:03:00.000Z,quickstart,limits-quotas,0.65,True,"Quickstart for a private AKS Automatic cluster in a custom VNet that includes the same pod readiness SLA statement with explicit numeric guarantee; this constitutes product-specific numeric limits/constraints, aligning with limits-quotas.",unchanged https://learn.microsoft.com/en-us/azure/aks/autoscale-gpu-workloads-with-keda,Autoscale GPU workloads with KEDA,Autoscale GPU Workloads using KEDA and NVIDIA DCGM Exporter metrics on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Autoscale AKS GPU workloads with KEDA and DCGM,Learn how to autoscale GPU-based workloads using KEDA and NVIDIA DCGM Exporter metrics on Azure Kubernetes Service (AKS).,"In this article, you learn how to autoscale GPU workloads on Azure Kubernetes Service (AKS) by using GPU metrics collected by the NVIDIA Data Center GPU Manager (DCGM) exporter. These metrics are exposed through Azure Managed Prometheus and consumed by Kubernetes Event-Driven Autoscaling (KEDA) to automatically scale workloads based on real-time GPU utilization. 
This solution helps optimize GPU resource usage and control operational costs by dynamically adjusting application scale in response to",2025-05-20T05:26:00.000Z,how-to,integrations,0.7,True,"Shows autoscaling using KEDA and NVIDIA DCGM metrics; likely includes KEDA ScaledObject configuration, metric names, and Prometheus query parameters specific to this integration.",unchanged https://learn.microsoft.com/en-us/azure/aks/availability-sets-on-aks,Availability Sets deprecation,Availability Sets Deprecation in Azure Kubernetes Services (AKS) - Azure Kubernetes Service,Migrate AKS clusters from Availability Sets to VM node pools,Learn about the deprecation of Availability Sets and how to migrate from Availability Sets to Virtual Machines node pools in Azure Kubernetes Service (AKS).,"This article details how Azure Kubernetes Service (AKS) is phasing out support for Virtual Machine Availability Sets (VMAS) in favor of Virtual Machines (VMs). We recommend using Virtual Machine node pools for AKS-optimized VMs. Virtual Machine node pools: Important Starting on September 30, 2025, Azure Kubernetes Service (AKS) no longer supports Availability Sets. Clusters with Availability Sets after this date are considered out of support. Migrate all workloads to Virtual Machines node pools befor",2026-03-30T22:07:00.000Z,overview,decision-making,0.82,True,"The page provides concrete deprecation timelines (for example, the September 30, 2025 end-of-support date) and prescriptive guidance to move from Availability Sets to Virtual Machines node pools. 
It includes AKS-specific migration and support considerations that inform technology and configuration choices, aligning with decision-making.",unchanged -https://learn.microsoft.com/en-us/azure/aks/azure-ad-rbac,Use Kubernetes RBAC with Microsoft Entra integration,Use Microsoft Entra ID and Kubernetes RBAC for clusters - Azure Kubernetes Service,Use Entra groups with Kubernetes RBAC in AKS,Learn how to use Microsoft Entra group membership to restrict access to cluster resources using Kubernetes role-based access control (Kubernetes RBAC) in Azure Kubernetes Service (AKS).,"Azure Kubernetes Service (AKS) can be configured to use Microsoft Entra ID for user authentication. In this configuration, you sign in to an AKS cluster using a Microsoft Entra authentication token. Once authenticated, you can use the built-in Kubernetes role-based access control (RBAC) to manage access to namespaces and cluster resources based on a user's identity or group membership. This article shows you how to: Control access using Kubernetes RBAC in an AKS cluster based on Microsoft Entra ",2025-12-17T06:03:00.000Z,how-to,security,0.8,True,Shows how to wire Entra group membership into Kubernetes RBAC with concrete role/rolebinding YAML and AKS-specific settings.,unchanged https://learn.microsoft.com/en-us/azure/aks/azure-app-configuration,Install Azure App Configuration AKS extension,Azure App Configuration extension for Azure Kubernetes Service - Azure Kubernetes Service,Install and manage Azure App Configuration extension on AKS,Install and configure Azure App Configuration extension on your Azure Kubernetes Service (AKS).,"Azure App Configuration provides a service to centrally manage application settings and feature flags. Azure App Configuration Kubernetes Provider is a Kubernetes operator that gets key-values, Key Vault references and feature flags from Azure App Configuration and builds them into Kubernetes ConfigMaps and Secrets. 
Azure App Configuration extension for Azure Kubernetes Service (AKS) allows you to install and manage Azure App Configuration Kubernetes Provider on your AKS cluster via Azure Resource ",2025-10-24T05:03:00.000Z,concept-article,configuration,0.7,True,"Covers installing and configuring the Azure App Configuration Kubernetes Provider as an AKS extension; expected to include extension resource properties, parameter names, and configuration options unique to this integration.",unchanged https://learn.microsoft.com/en-us/azure/aks/azure-app-configuration-quickstart,Quickstart with Azure App Configuration,Quickstart Generate ConfigMap from Azure App Configuration - Azure Kubernetes Service,Configure AKS workloads with App Configuration provider,Learn how to configure the workload in Azure Kubernetes Service (AKS) with ConfigMap that is generated from Azure App Configuration.,"You can externalize the configurations of your Azure Kubernetes Service (AKS) workloads and manage them in Azure App Configuration. The Azure App Configuration Kubernetes provider runs as a container in your cluster. Key benefits include: The Azure App Configuration Kubernetes provider is available as an AKS extension. By following this document, you can easily install the extension and connect your AKS cluster with an App Configuration store using the Service Connector in the Azure portal. 
For inf",2025-02-10T12:05:00.000Z,quickstart,configuration,0.65,True,"Quickstart for generating ConfigMaps from Azure App Configuration via the AKS extension; likely includes provider-specific settings, CLI flags, and connection parameters that qualify as product-specific configuration.",unchanged https://learn.microsoft.com/en-us/azure/aks/azure-app-configuration-settings,Configure Azure App Configuration AKS extension,Configure the Azure App Configuration extension for your Azure Kubernetes Service - Azure Kubernetes Service,Tune Azure App Configuration extension settings for AKS,Learn how to configure the Azure App Configuration extension specifically for your Azure Kubernetes Service (AKS).,"Once you created the Azure App Configuration extension, you can configure the extension to work best for you and your project using various configuration options, like: The extension enables you to configure Azure App Configuration extension settings by using the --configuration-settings parameter in the Azure CLI. Tip For a list of available options, see Azure App Configuration Kubernetes Provider helm values.",2024-09-10T08:00:00.000Z,concept-article,configuration,0.8,True,"Explicitly about configuring the extension using --configuration-settings and references Helm values; likely contains parameter names, allowed values, and defaults, which are expert configuration details.",unchanged https://learn.microsoft.com/en-us/azure/aks/azure-cni-overlay,Create an Azure CNI Overlay cluster,Configure Azure CNI Overlay Networking in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure Azure CNI Overlay networking in AKS,"Learn how to configure Azure CNI Overlay networking in Azure Kubernetes Service (AKS), including setup, dual-stack networking, and example workload deployment.","This article explains the setup process, dual-stack networking configuration, and an example workload deployment for Azure CNI Overlay in Azure Kubernetes Service (AKS) clusters. 
For an overview of Azure CNI Overlay networking, see Overview of Azure CNI Overlay networking in Azure Kubernetes Service (AKS). Important Starting on November 30, 2025, Azure Kubernetes Service (AKS) no longer supports or provides security updates for Azure Linux 2.0. The Azure Linux 2.0 node image is frozen at the 202512",2026-01-15T23:03:00.000Z,how-to,configuration,0.72,True,"Explains setup, dual-stack configuration, and example workloads; includes AKS-specific configuration parameters and supported combinations.",unchanged -https://learn.microsoft.com/en-us/azure/aks/azure-cni-overlay-pod-expand,Expand pod CIDR space in Azure CNI Overlay clusters,Expand Pod CIDR Space in Azure CNI Overlay Azure Kubernetes Service (AKS) Clusters - Azure Kubernetes Service,Expand pod CIDR space for Azure CNI Overlay in AKS,Learn how to expand pod CIDR space in Azure CNI Overlay AKS clusters with Linux nodes and validate the new pod CIDR block.,"You can expand your pod Classless Inter-Domain Routing (CIDR) space on Azure CNI Overlay clusters in Azure Kubernetes Service with Linux nodes only. The operation uses the az aks update command and allows expansions without the need to re-create your AKS cluster. Important AKS preview features are available on a self-service, opt-in basis. Previews are provided ""as is"" and ""as available,"" and they're excluded from the service-level agreements and limited warranty. 
AKS previews are partially covere",2026-01-15T23:03:00.000Z,how-to,configuration,0.7,True,Uses az aks update with specific options to expand pod CIDR; this is detailed configuration and operational guidance unique to AKS.,unchanged +https://learn.microsoft.com/en-us/azure/aks/azure-cni-overlay-pod-expand,Expand pod CIDR space in Azure CNI Overlay clusters,Expand Pod CIDR Space in Azure CNI Overlay Azure Kubernetes Service (AKS) Clusters - Azure Kubernetes Service,,Learn how to expand pod CIDR space in Azure CNI Overlay AKS clusters with Linux nodes and validate the new pod CIDR block.,You can expand your pod Classless Inter-Domain Routing (CIDR) space on Azure CNI Overlay clusters in Azure Kubernetes Service with Linux nodes only. The operation uses the az aks update command and allows expansions without the need to re-create your AKS cluster.,2026-04-20T22:08:00.000Z,how-to,,0.2,False,"Primarily describes that pod CIDR space can be expanded using az aks update; summary does not indicate detailed configuration tables, limits, or product-specific error mappings, so it reads as a procedural capability/tutorial rather than expert-knowledge configuration or limits content.",updated https://learn.microsoft.com/en-us/azure/aks/azure-cni-powered-by-cilium,Use Azure CNI Powered by Cilium,Configure Azure CNI Powered by Cilium in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure Azure CNI Powered by Cilium in AKS,Learn how to create an Azure Kubernetes Service (AKS) cluster with Azure CNI Powered by Cilium.,Deploy and Explore Azure CNI Powered by Cilium combines the robust control plane of Azure Container Networking Interface (CNI) with the data plane of Cilium to provide high-performance networking and security. 
Azure CNI Powered by Cilium provides the following benefits by making use of eBPF programs loaded into the Linux kernel and a more efficient API object structure:,2025-12-19T05:33:00.000Z,how-to,configuration,0.68,True,Covers how to deploy and configure the Cilium-based data plane with Azure CNI; includes AKS-specific flags and settings.,unchanged -https://learn.microsoft.com/en-us/azure/aks/azure-disk-customer-managed-keys,BYOK for Azure managed disks,Use a customer-managed key to encrypt Azure managed disks in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure customer-managed disk encryption keys for AKS,Bring your own keys (BYOK) to encrypt managed OS and data disks in AKS.,"Azure encrypts all data in a managed disk at rest. By default, data is encrypted with Microsoft-managed keys. For more control over encryption keys, you can supply customer-managed keys to use for encryption at rest for both the OS and data disks for your AKS clusters. Learn more about customer-managed keys on Linux and Windows.",2025-04-10T22:05:00.000Z,concept-article,security,0.7,True,"Page is a how-to for enabling customer-managed keys (CMK) for AKS-managed OS and data disks. It typically includes AKS- and disk-specific configuration details such as required Azure roles, key vault access policies, specific resource/parameter names, and wiring between AKS cluster, managed disks, and Key Vault. These are product-specific security configuration steps rather than just conceptual encryption info.",unchanged +https://learn.microsoft.com/en-us/azure/aks/azure-disk-customer-managed-keys,Encrypt Azure managed disks using customer-managed keys,Use a customer-managed key to encrypt Azure managed disks in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Encrypt AKS managed disks with customer-managed keys,Bring your own keys (BYOK) to encrypt managed OS and data disks in AKS.,"Azure encrypts all data in a managed disk at rest. 
By default, data is encrypted with Microsoft-managed keys. For more control over encryption keys, you can supply customer-managed keys to use for encryption at rest for both the OS and data disks for your AKS clusters. Learn more about customer-managed keys on Linux and Windows.",2025-04-10T22:05:00.000Z,concept-article,security,0.7,True,"Describes how to use BYOK for OS and data disks in AKS; involves product-specific key and disk encryption configuration, which is security-focused.",new https://learn.microsoft.com/en-us/azure/aks/azure-hpc-cache,Azure HPC Cache,Integrate Azure HPC Cache with Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Integrate Azure HPC Cache with AKS workloads,Learn how to integrate HPC Cache with Azure Kubernetes Service (AKS).,"Azure HPC Cache speeds access to your data for high-performance computing (HPC) tasks. By caching files in Azure, Azure HPC Cache brings the scalability of cloud computing to your existing workflow. This article shows you how to integrate Azure HPC Cache with Azure Kubernetes Service (AKS).",2024-08-01T20:29:00.000Z,concept-article,integrations,0.7,True,"Describes AKS-specific integration steps and configuration parameters to mount and use Azure HPC Cache from pods, which is not generic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/aks/azure-hybrid-benefit,Use Azure Hybrid Benefit,Use Azure Hybrid Benefit - Azure Kubernetes Service,,Learn how to save costs for Windows workloads by using existing Windows Server licenses on Azure Kubernetes Service.,"Azure Hybrid Benefit is a program that enables you to significantly reduce the costs of running workloads in the cloud. With Azure Hybrid Benefit for Azure Kubernetes Service (AKS), you can maximize the value of your on-premises licenses and modernize your applications at no extra cost. 
Azure Hybrid Benefit enables you to use your on-premises licenses that also have either active Software Assurance (SA) or a qualifying subscription to get Windows virtual machines (VMs) on Azure at a reduced cost",2024-10-09T05:32:00.000Z,concept-article,,0.3,False,Cost/benefit overview of Azure Hybrid Benefit for AKS; likely licensing and pricing narrative without detailed technical configs or limits.,unchanged https://learn.microsoft.com/en-us/azure/aks/azure-linux-aks-partner-solutions,Azure Linux AKS partner solutions,Azure Linux AKS Container Host partner solutions - Azure Linux AKS Container Host partner solutions,,"Discover partner-tested solutions that enable you to build, test, deploy, manage, and monitor your AKS environment using Azure Linux Container Host.","Microsoft collaborates with partners to ensure your build, test, deployment, configuration, and monitoring of your applications perform optimally with Azure Linux Container Host on AKS. The third party partners featured in this article have introduction guides to help you start using their solutions with your applications running on Azure Linux Container Host on AKS. 
Important Starting on November 30, 2025, Azure Kubernetes Service (AKS) no longer supports or provides security updates for Azure L",2025-12-17T12:02:00.000Z,concept-article,,0.2,False,Partner solutions listing and marketing-style guidance; no indication of concrete technical parameters or patterns.,unchanged @@ -84,7 +89,7 @@ https://learn.microsoft.com/en-us/azure/aks/best-practices-gpu,GPU best practice https://learn.microsoft.com/en-us/azure/aks/best-practices-ml-ops,Best practices,Machine learning operations (MLOps) best practices in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,MLOps best practices for AKS machine learning,Learn about best practices for machine learning operations (MLOps) and how you can use them with your AI and machine learning workflows on Azure Kubernetes Service (AKS).,"This article describes best practices and considerations to keep in mind when using MLOps in AKS. For more information on MLOps, see Machine learning operations (MLOps) for AI and machine learning workflows.",2024-10-22T22:00:00.000Z,best-practice,best-practices,0.7,True,"Contains AKS-specific MLOps recommendations, including concrete patterns and configurations for CI/CD, scaling, and operations of ML workloads.",unchanged https://learn.microsoft.com/en-us/azure/aks/best-practices-monitoring-proactive,Proactive monitoring best practices,Proactive monitoring best practices for Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Apply proactive monitoring best practices for AKS clusters,Learn the best practices for proactively monitoring your Azure Kubernetes Service (AKS) cluster and workloads.,This article covers the best practices for proactive monitoring on Azure Kubernetes Service (AKS) and provides a comprehensive list of the key signals AKS recommends for you to monitor. Proactively monitoring your AKS clusters is crucial for reducing downtime and saving business interruptions for your applications. 
This process involves identifying and monitoring key indicators of abnormal behavior in your cluster that might lead to major issues or downtime.,2024-11-10T12:11:00.000Z,best-practice,best-practices,0.72,True,Explicit best-practices article with recommended key signals and proactive monitoring patterns specific to AKS; likely includes concrete metric names and thresholds.,unchanged https://learn.microsoft.com/en-us/azure/aks/best-practices-performance-scale,For small to medium workloads,Performance and scaling best practices for small to medium workloads in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Optimize AKS performance and scaling for small to medium workloads,Learn the best practices for performance and scaling for small to medium workloads in Azure Kubernetes Service (AKS).,"Note This article focuses on general best practices for small to medium workloads. For best practices specific to large workloads, see Performance and scaling best practices for large workloads in Azure Kubernetes Service (AKS). As you deploy and maintain clusters in AKS, you can use the following best practices to help you optimize performance and scaling. 
In this article, you learn about: Important Starting on November 30, 2025, Azure Kubernetes Service (AKS) no longer supports or provides securit",2025-12-17T12:02:00.000Z,best-practice,best-practices,0.8,True,"Best-practices article with AKS-specific performance and scaling guidance; expected to include concrete recommendations (e.g., node sizes, pod density, autoscaler settings) that are product-specific.",unchanged -https://learn.microsoft.com/en-us/azure/aks/best-practices-performance-scale-large,For large workloads,Performance and scaling best practices for large workloads in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Optimize performance and scaling for large AKS workloads,Learn the best practices for performance and scaling for large workloads in Azure Kubernetes Service (AKS).,"Note This article focuses on general best practices for large workloads. For best practices specific to small to medium workloads, see Performance and scaling best practices for small to medium workloads in Azure Kubernetes Service (AKS). As you deploy and maintain clusters in AKS, you can use the following best practices to help you optimize performance and scaling. Keep in mind that large is a relative term. Kubernetes has a multi-dimensional scale envelope, and the scale envelope for your workload",2026-04-17T22:07:00.000Z,best-practice,best-practices,0.78,True,"The page provides AKS-specific performance and scaling recommendations for large clusters/workloads (for example, guidance on node counts, pod densities, control plane considerations, and configuration patterns unique to AKS). 
These are concrete, product-specific DO/DON'T guidelines rather than generic Kubernetes theory, fitting the best-practices category.",updated +https://learn.microsoft.com/en-us/azure/aks/best-practices-performance-scale-large,For large workloads,Performance and scaling best practices for large workloads in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Optimize performance and scaling for large AKS workloads,Learn the best practices for performance and scaling for large workloads in Azure Kubernetes Service (AKS).,"Note This article focuses on general best practices for large workloads. For best practices specific to small to medium workloads, see Performance and scaling best practices for small to medium workloads in Azure Kubernetes Service (AKS). As you deploy and maintain clusters in AKS, you can use the following best practices to help you optimize performance and scaling. Keep in mind that large is a relative term. Kubernetes has a multi-dimensional scale envelope, and the scale envelope for your workload",2026-04-17T22:07:00.000Z,best-practice,best-practices,0.78,True,"The page provides AKS-specific performance and scaling recommendations for large clusters/workloads (for example, guidance on node counts, pod densities, control plane considerations, and configuration patterns unique to AKS). These are concrete, product-specific DO/DON'T guidelines rather than generic Kubernetes theory, fitting the best-practices category.",unchanged 
Learn how to manage volumes and work w,"Ephemeral NVMe data disks provide high-performance, low-latency storage that's ideal for demanding workloads running on Azure Kubernetes Service (AKS). Many modern applications, such as AI/ML training, data analytics, and high-throughput databases, require fast temporary storage to process large volumes of intermediate data efficiently. By using ephemeral NVMe disks, you can significantly improve application responsiveness and throughput, while optimizing for cost and scalability in your AKS clu",2026-02-14T06:03:00.000Z,best-practice,best-practices,0.8,True,"Contains AKS- and VM-size-specific recommendations, caveats, and usage patterns for ephemeral NVMe data disks that go beyond generic storage advice.",unchanged https://learn.microsoft.com/en-us/azure/aks/blue-green-node-pool-upgrade,Configure blue-green node pool upgrades,Blue-Green Node Pool Upgrades in Azure Kubernetes Service (AKS) (preview) - Azure Kubernetes Service,Use blue-green node pool upgrade strategy in AKS,Perform upgrades of AKS node pools using a blue-green deployment strategy to ensure workload availability during updates.,"Blue-green upgrades enable you to upgrade your AKS node pools side by side by creating a parallel green node pool with the new configuration while maintaining the existing blue node pool. This strategy allows you to test and validate the new configuration before switching traffic, with the ability to quickly roll back if issues arise. 
This article explains when to use blue-green upgrades, how the process works, configuration options, and considerations for using this upgrade strategy.",2026-02-21T06:10:00.000Z,how-to,architecture-patterns,0.8,True,"Explains when to use blue-green upgrades, how they work, and configuration options specific to AKS node pools—an AKS-specific upgrade pattern.",unchanged https://learn.microsoft.com/en-us/azure/aks/certificate-rotation,Manage cluster certificates and rotation,Certificate Rotation in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Manage AKS certificate rotation and autorotation,"Learn about certificate rotation in an Azure Kubernetes Service (AKS) cluster, including how to manually rotate certificates and enable autorotation for enhanced security.","Azure Kubernetes Service (AKS) uses certificates for authentication with many of its components. You need to periodically rotate those certificates for security or policy reasons. This article shows you how certificate rotation works in your AKS cluster. Important Starting on March 30, 2026, Azure Kubernetes Service (AKS) no longer supports the aks-disable-kubelet-serving-certificate-rotation=true node pool tag to disable Kubelet Serving Certificate Rotation (KSCR). 
You can create new node pools usin",2025-12-17T06:03:00.000Z,how-to,security,0.75,True,"Explains certificate rotation behavior and how to configure it; includes AKS-specific commands, flags, and deprecation details for node pool tags.",unchanged @@ -100,18 +105,20 @@ https://learn.microsoft.com/en-us/azure/aks/cluster-extensions,About cluster ext https://learn.microsoft.com/en-us/azure/aks/cluster-health-monitor,Cluster Health Monitor checker (preview),Use Cluster Health Monitor checker in Azure Kubernetes Service (AKS) (preview) - Azure Kubernetes Service,Configure and use AKS Cluster Health Monitor checker,Learn how the Cluster Health Monitor checker in Azure Kubernetes Service (AKS) runs data plane health checks and supports CoreDNS remediation.,"This article shows you how to deploy Cluster Health Monitor in Azure Kubernetes Service (AKS). Cluster Health Monitor helps you detect AKS managed component issues earlier and improve resiliency by enabling automatic remediation for unhealthy CoreDNS pods in specific scenarios. It runs periodic checks for components such as CoreDNS, metrics server, and API server, and exposes results as Prometheus metrics for alerting. Important Cluster Health Monitor is an AKS-managed checker for system compone",2026-03-27T06:02:00.000Z,how-to,configuration,0.7,True,"Describes how to deploy and configure the AKS Cluster Health Monitor, including product-specific behavior (periodic checks for CoreDNS, metrics server, API server, Prometheus metrics exposure, and automatic remediation scenarios). 
This is expert configuration knowledge for an AKS-managed checker rather than a conceptual overview.",unchanged https://learn.microsoft.com/en-us/azure/aks/compare-container-options-with-aks,Compare AKS with other Azure container options,Comparing AKS with Other Azure Container Options - Azure Kubernetes Service,Choose between AKS and other Azure container services,"Understand when to use Azure Kubernetes Service (AKS) and how it compares to other container options including Azure Container Instances, Azure App Service, Azure Functions, and Azure Container Apps.",There are many options for teams to build and deploy cloud native and containerized applications on Azure. This article compares several container options on Azure. There's no perfect solution for every use case and every team. The following explanation provides general guidance and recommendations as a starting point to help find the best fit for your team and your requirements.,2026-02-18T23:06:00.000Z,overview,decision-making,0.7,True,"Explicitly compares AKS with ACI, App Service, Functions, and Container Apps to guide service selection; provides scenario-based recommendations and criteria for when to use each option.",unchanged https://learn.microsoft.com/en-us/azure/aks/concepts-ai-ml-language-models,Small and large language models,Concepts - Small and large language models - Azure Kubernetes Service,Choose small vs large language models on AKS,"Learn about small and large language models, including when to use them and how you can onboard them to your AI and machine learning workflows on Azure Kubernetes Service (AKS).","In this article, you learn about small and large language models, including when to use them and how you can use them with your AI and machine learning workflows on Azure Kubernetes Service (AKS).",2025-07-21T17:11:00.000Z,concept-article,decision-making,0.65,True,"Provides AKS-focused guidance on when to use small vs large models in different scenarios, likely with concrete 
criteria for selection.",unchanged +https://learn.microsoft.com/en-us/azure/aks/concepts-cluster-authentication,Cluster authentication concepts,Cluster authentication concepts in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,"Learn how Azure Kubernetes Service (AKS) authenticates Kubernetes API requests using Microsoft Entra ID, and how to disable local cluster admin accounts in production.","This article describes how Azure Kubernetes Service (AKS) authenticates callers to the Kubernetes API — that is, who can connect to the control plane. It covers the recommended Microsoft Entra ID-based authentication path and how to lock down break-glass access. For how AKS evaluates what an authenticated caller is allowed to do, see Cluster authorization concepts. For the other identity scenarios in AKS, see: For an orientation across all four AKS identity scenarios, see Access and identity option",2026-04-23T06:10:00.000Z,concept-article,,0.3,False,"Cluster authentication concepts article; describes how AKS authenticates callers and recommended patterns, but summary suggests conceptual guidance rather than detailed role/parameter references.",new +https://learn.microsoft.com/en-us/azure/aks/concepts-cluster-authorization,Cluster authorization concepts,Cluster authorization concepts in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,"Learn how authorization for the Kubernetes API works in Azure Kubernetes Service (AKS), and how to choose between Kubernetes RBAC and Microsoft Entra ID authorization with optional Azure ABAC conditio","This article describes how Azure Kubernetes Service (AKS) decides what an authenticated caller is allowed to do against the Kubernetes API. It covers the two authorization models AKS supports and fine-grained custom resource control with Azure ABAC conditions. For how AKS authenticates callers in the first place, see Cluster authentication concepts. 
For an orientation across all four AKS identity scenarios, see Access and identity options for AKS.",2026-04-23T06:10:00.000Z,concept-article,,0.45,False,"Conceptual explanation of AKS authorization models (Kubernetes RBAC vs Entra ID with ABAC); no detailed role tables, config parameters, or error/diagnostic mappings.",new https://learn.microsoft.com/en-us/azure/aks/concepts-fine-tune-language-models,Fine-tuning language models,Concepts - Fine-tuning language models for AI and machine learning workflows - Azure Kubernetes Service,,Learn about how you can customize language models to use in your AI and machine learning workflows on Azure Kubernetes Service (AKS).,"In this article, you learn about fine-tuning language models, including some common methods and how applying the tuning results can improve the performance of your AI and machine learning workflows on Azure Kubernetes Service (AKS).",2026-04-06T22:07:00.000Z,concept-article,,0.2,False,"Conceptual overview of fine-tuning language models on AKS; describes methods and benefits but does not include product-specific limits, configuration tables, error codes, or decision matrices with quantified trade-offs.",unchanged -https://learn.microsoft.com/en-us/azure/aks/concepts-identity,Access and identity concepts,Concepts - Access and identity in Azure Kubernetes Services (AKS) - Azure Kubernetes Service,,"Learn about access and identity in Azure Kubernetes Service (AKS), including Microsoft Entra integration, Kubernetes role-based access control (Kubernetes RBAC), and roles and bindings.","You can authenticate, authorize, secure, and control access to Kubernetes clusters in a variety of ways: Kubernetes RBAC and AKS help you secure your cluster access and provide only the minimum required permissions to developers and operators. 
This article introduces the core concepts that help you authenticate and assign permissions in AKS.",2024-08-01T20:29:00.000Z,concept-article,,0.2,False,Conceptual overview of identity and access in AKS; lacks detailed configuration parameters or error mappings.,unchanged +https://learn.microsoft.com/en-us/azure/aks/concepts-identity,Access and identity overview,Concepts - Access and identity in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,"Learn the five identity scenarios in Azure Kubernetes Service (AKS) — Kubernetes control-plane authentication and authorization, AKS resource (ARM) authorization, cluster identity, and workload identi",AKS uses identity in five distinct scenarios. Each scenario answers a different question and has its own configuration model. This article gives a brief introduction to each and points to the deep-dive documentation.,2026-04-23T06:10:00.000Z,concept-article,,0.1,False,"Identity concepts overview; summary indicates orientation across scenarios and links to deep dives, not detailed configuration or role definitions.",new https://learn.microsoft.com/en-us/azure/aks/concepts-machine-learning-ops,Concepts,Concepts - Machine learning operations (MLOps) for AI and machine learning workflows - Azure Kubernetes Service,,Learn about MLOps and how you can use it with your AI and machine learning workflows on Azure Kubernetes Service (AKS).,"In this article, you learn about machine learning operations (MLOps), including what types of practices and tools are involved, and how it can simplify and speed up your AI and machine learning workflows on Azure Kubernetes Service (AKS).",2024-10-22T22:00:00.000Z,concept-article,,0.2,False,"Conceptual overview of MLOps; lacks product-specific parameters, limits, or detailed configurations.",unchanged https://learn.microsoft.com/en-us/azure/aks/concepts-managed-namespaces,Overview,Overview of Managed Namespaces in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,Learn how to 
simplify namespace management and resource isolation in Azure Kubernetes Service (AKS) with managed namespaces.,"Applies to:✔️ AKS Automatic ✔️ AKS Standard As you manage clusters in Azure Kubernetes Service (AKS), you often need to isolate teams and workloads. With logical isolation, you can use a single AKS cluster for multiple workloads, teams, or environments. Kubernetes namespaces form the logical isolation boundary for workloads and resources. Performing logical isolation involves implementing scripts and processes to create namespaces, set resource limits, apply network policies, and grant team acce",2025-11-18T16:04:00.000Z,overview,,0.35,False,Conceptual overview of managed namespaces and logical isolation; mostly descriptive without detailed configuration parameter tables.,unchanged -https://learn.microsoft.com/en-us/azure/aks/concepts-network,Networking concepts,Concepts - Networking in Azure Kubernetes Services (AKS) - Azure Kubernetes Service,,"Learn about networking in Azure Kubernetes Service (AKS), including kubenet and Azure CNI networking, ingress controllers, load balancers, and static IP addresses.","In a container-based, microservices approach to application development, application components work together to process their tasks. Kubernetes provides various resources enabling this cooperation: This article introduces the core concepts that provide networking to your applications in AKS:",2026-04-13T22:08:00.000Z,concept-article,,0.1,False,"This is a conceptual networking overview for AKS (kubenet vs Azure CNI, ingress, load balancers). 
It explains what the components are but does not provide detailed configuration tables, limits, or decision matrices with quantified trade-offs, so it lacks the required expert-knowledge signals.",updated -https://learn.microsoft.com/en-us/azure/aks/concepts-network-azure-cni-overlay,Overview,Overview of Azure CNI Overlay Networking in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,"Learn about Azure CNI Overlay networking in Azure Kubernetes Service (AKS), including its architecture, IP address planning, and differences from kubenet.","Azure CNI Overlay is a networking model for Azure Kubernetes Service (AKS) that provides efficient IP address management and high-performance pod communication. This article provides an overview of Azure CNI Overlay, including its architecture, IP address planning, and differences from the traditional kubenet networking model.",2026-04-16T08:00:00.000Z,overview,,0.2,False,"Described as an overview of Azure CNI Overlay architecture and IP planning; from the summary it appears conceptual (architecture, differences from kubenet) without clear evidence of concrete configuration tables, limits, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/aks/concepts-network,Networking concepts,Concepts - Networking in Azure Kubernetes Services (AKS) - Azure Kubernetes Service,,"Learn about networking in Azure Kubernetes Service (AKS), including kubenet and Azure CNI networking, ingress controllers, load balancers, and static IP addresses.","In a container-based, microservices approach to application development, application components work together to process their tasks. Kubernetes provides various resources enabling this cooperation: This article introduces the core concepts that provide networking to your applications in AKS:",2026-04-13T22:08:00.000Z,concept-article,,0.1,False,"This is a conceptual networking overview for AKS (kubenet vs Azure CNI, ingress, load balancers). 
It explains what the components are but does not provide detailed configuration tables, limits, or decision matrices with quantified trade-offs, so it lacks the required expert-knowledge signals.",unchanged +https://learn.microsoft.com/en-us/azure/aks/concepts-network-azure-cni-overlay,Overview,Overview of Azure CNI Overlay Networking in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,"Learn about Azure CNI Overlay networking in Azure Kubernetes Service (AKS), including its architecture, IP address planning, and differences from kubenet.","Azure CNI Overlay is a networking model for Azure Kubernetes Service (AKS) that provides efficient IP address management and high-performance pod communication. This article provides an overview of Azure CNI Overlay, including its architecture, IP address planning, and differences from the traditional kubenet networking model.",2026-04-16T08:00:00.000Z,overview,,0.2,False,"Described as an overview of Azure CNI Overlay architecture and IP planning; from the summary it appears conceptual (architecture, differences from kubenet) without clear evidence of concrete configuration tables, limits, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/aks/concepts-network-azure-cni-pod-subnet,Use Azure CNI Pod Subnet,Concepts - Azure CNI Pod Subnet networking in AKS - Azure Kubernetes Service,,"Learn about Azure CNI Pod Subnet, dynamic IP allocation mode, and static block allocation mode in Azure Kubernetes Service (AKS).",Azure CNI Pod Subnet assigns IP addresses to pods from a separate subnet from your cluster Nodes. 
This feature is available in two modes: Dynamic IP Allocation and Static Block Allocation.,2025-10-16T22:11:00.000Z,concept-article,,0.35,False,Concepts article about Azure CNI Pod Subnet modes; primarily descriptive without clear indication of config tables or numeric thresholds in the summary.,unchanged https://learn.microsoft.com/en-us/azure/aks/concepts-network-cni-overview,CNI networking overview,Concepts - CNI Networking in AKS - Azure Kubernetes Service,,Learn about CNI networking options in Azure Kubernetes Service (AKS),"Kubernetes uses Container Networking Interface (CNI) plugins to manage networking in Kubernetes clusters. CNI plug-ins are responsible for assigning IP addresses to pods, network routing between pods, Kubernetes service routing, and more. Azure Kubernetes Service (AKS) provides multiple CNI plugins that you can use in your clusters, depending on your networking requirements.",2026-04-02T22:13:00.000Z,concept-article,,0.2,False,"Conceptual overview of CNI networking options in AKS (what CNI is, high-level behavior). The summary does not indicate specific numeric limits, configuration tables, or decision matrices; it is primarily explanatory, so it does not meet the expert-knowledge criteria.",unchanged https://learn.microsoft.com/en-us/azure/aks/concepts-network-ingress,Overview,Concepts - Ingress Networking in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,Learn about ingress networking in Azure Kubernetes Service (AKS) including ingress controllers,"Important The Kubernetes SIG Network and the Security Response Committee announced the upcoming retirement of the Ingress NGINX project, with maintenance ending in March 2026. There's no immediate action required today for AKS clusters using the Application Routing add-on with NGINX. Microsoft will provide official support for critical security patches for Application Routing add-on NGINX Ingress resources through November 2026. 
AKS is aligning with upstream Kubernetes by moving to Gateway API as the long",2026-02-04T23:11:00.000Z,concept-article,,0.4,False,"Ingress networking concepts and deprecation notice; primarily conceptual and lifecycle information, not detailed configuration or numeric limits.",unchanged -https://learn.microsoft.com/en-us/azure/aks/concepts-network-ip-address-planning,IP address planning,Concepts - IP address planning in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Plan IP address space for Azure AKS clusters,Learn about IP address planning in Azure Kubernetes Service (AKS).,"This article provides guidance on IP address planning for Azure Kubernetes Service (AKS) clusters. For specific guidance on IP address planning for individual CNI options, see the next steps section for links to plugin documentation.",2026-04-16T08:00:00.000Z,concept-article,decision-making,0.65,True,"The article gives concrete guidance on IP address planning for AKS clusters, including how to size address spaces for nodes and pods and how to choose ranges for different CNI options. This is used to make design decisions about cluster networking and capacity, aligning with decision-making rather than just conceptual networking content.",updated +https://learn.microsoft.com/en-us/azure/aks/concepts-network-ip-address-planning,IP address planning,Concepts - IP address planning in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Plan IP address space for Azure AKS clusters,Learn about IP address planning in Azure Kubernetes Service (AKS).,"This article provides guidance on IP address planning for Azure Kubernetes Service (AKS) clusters. 
For specific guidance on IP address planning for individual CNI options, see the next steps section for links to plugin documentation.",2026-04-16T08:00:00.000Z,concept-article,decision-making,0.65,True,"The article gives concrete guidance on IP address planning for AKS clusters, including how to size address spaces for nodes and pods and how to choose ranges for different CNI options. This is used to make design decisions about cluster networking and capacity, aligning with decision-making rather than just conceptual networking content.",unchanged https://learn.microsoft.com/en-us/azure/aks/concepts-network-isolated,Overview,Network isolated AKS clusters - Azure Kubernetes Service,,Learn how network isolated AKS clusters work,"Organizations typically have strict security and compliance requirements to regulate egress (outbound) network traffic from a cluster to eliminate risks of data exfiltration. By default, standard SKU Azure Kubernetes Service (AKS) clusters have unrestricted outbound internet access. This level of network access allows nodes and services you run to access external resources as needed. If you wish to restrict egress traffic, a limited number of ports and addresses must be accessible to maintain he",2025-07-08T08:00:00.000Z,concept-article,,0.3,False,"Conceptual explanation of network-isolated clusters and motivations; summary suggests no detailed port/FQDN tables, config parameters, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/aks/concepts-network-legacy-cni,Legacy CNI options,Concepts - AKS Legacy Container Networking Interfaces (CNI) - Azure Kubernetes Service,Choose and understand legacy CNI options in Azure Kubernetes Service,Learn about legacy CNI networking options in Azure Kubernetes Service (AKS),"Important On 31 March 2028, kubenet networking for Azure Kubernetes Service (AKS) will be retired. 
To avoid service disruptions, you'll need to upgrade to Azure Container Networking Interface (CNI) overlay before that date, when workloads running on kubenet for AKS will no longer be supported. In Azure Kubernetes Service (AKS), while Azure CNI Overlay and Azure CNI Pod Subnet are recommended for most scenarios, legacy networking models such as Azure CNI Node Subnet and kubenet are still available and su",2026-04-16T08:00:00.000Z,concept-article,decision-making,0.65,True,"The page compares legacy AKS networking models (Azure CNI Node Subnet, kubenet) against recommended options (Azure CNI Overlay, Pod Subnet) and discusses when they are still supported. This is product-specific selection guidance between networking models, which fits decision-making, even though the summary also mentions retirement timelines.",updated +https://learn.microsoft.com/en-us/azure/aks/concepts-network-legacy-cni,Legacy CNI options,Concepts - AKS Legacy Container Networking Interfaces (CNI) - Azure Kubernetes Service,Choose and understand legacy CNI options in Azure Kubernetes Service,Learn about legacy CNI networking options in Azure Kubernetes Service (AKS),"Important On 31 March 2028, kubenet networking for Azure Kubernetes Service (AKS) will be retired. To avoid service disruptions, you'll need to upgrade to Azure Container Networking Interface (CNI) overlay before that date, when workloads running on kubenet for AKS will no longer be supported. In Azure Kubernetes Service (AKS), while Azure CNI Overlay and Azure CNI Pod Subnet are recommended for most scenarios, legacy networking models such as Azure CNI Node Subnet and kubenet are still available and su",2026-04-16T08:00:00.000Z,concept-article,decision-making,0.65,True,"The page compares legacy AKS networking models (Azure CNI Node Subnet, kubenet) against recommended options (Azure CNI Overlay, Pod Subnet) and discusses when they are still supported. 
This is product-specific selection guidance between networking models, which fits decision-making, even though the summary also mentions retirement timelines.",unchanged https://learn.microsoft.com/en-us/azure/aks/concepts-network-services,Services,Concepts - Services in Azure Kubernetes Services (AKS) - Azure Kubernetes Service,,"Learn about networking services in Azure Kubernetes Service (AKS), including what Kubernetes Services are and what types of services are available in AKS.","You can use Kubernetes Services to logically group pods and provide network connectivity by allowing direct access to them through a specific IP address or DNS name on a designated port. This allows you to expose your application workloads to other services within the cluster or to external clients without having to manually manage the network configuration for each pod hosting a workload. You can specify what kind of service you want using Kubernetes Service type values. For more information, see ",2025-07-16T17:07:00.000Z,concept-article,,0.4,False,Conceptual explanation of Kubernetes Services types in AKS; mostly generic Kubernetes knowledge without AKS-specific limits or configuration matrices.,unchanged https://learn.microsoft.com/en-us/azure/aks/concepts-pod-sandboxing,Pod Sandboxing overview,Overview of Pod Sandboxing - Azure Kubernetes Service,,Learn how to spin up Kata pods to enforce compute isolation across your various workloads,"As clusters scale and host workloads from multiple teams or tenants, shared infrastructure can introduce complexity. Customers often need mechanisms to enforce stronger isolation between workloads. This isolation might be driven by performance considerations like separating resource-intensive or bursty workloads from more predictable workloads to avoid disruptions. Or by security requirements that call for isolating sensitive workloads from others. 
Customers today can opt for either logical or p",2025-11-18T16:04:00.000Z,how-to,,0.35,False,High-level overview of pod sandboxing and isolation; lacks detailed configuration parameters or numeric thresholds.,unchanged https://learn.microsoft.com/en-us/azure/aks/concepts-preview-api-life-cycle,AKS preview API lifecycle,AKS Preview API life cycle - Azure Kubernetes Service,Understand AKS preview API lifecycle and deprecation timing,Learn about the AKS preview API life cycle.,"Deploy and Explore The Azure Kubernetes Service (AKS) preview APIs (APIs that end in -preview) have a lifespan of ~one year from their release date. @@ -119,11 +126,11 @@ This means that you can expect the 2023-01-02-preview API to be deprecated somew tools built on them (such as the AKS Preview CLI Extension). After an API version is deprecated, it will no longer function",2026-03-31T22:11:00.000Z,concept-article,limits-quotas,0.7,True,"Contains a specific, time-bound lifecycle rule for AKS preview API versions (e.g., ~one year lifespan tied to release date and deprecation behavior). This is a product-specific limit/constraint with concrete timing that affects usage and is unlikely to be reliably known from general training data.",unchanged https://learn.microsoft.com/en-us/azure/aks/concepts-scale,Scaling concepts,Scaling options for applications in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,"Learn about scaling in Azure Kubernetes Service (AKS), including the horizontal pod autoscaler, cluster autoscaler, and Azure Container Instances.","When running applications in Azure Kubernetes Service (AKS), you might need to actively increase or decrease the amount of compute resources in your cluster. As you change the number of application instances you have, you might need to change the number of underlying Kubernetes nodes. You might also need to provision a large number of other application instances. 
This article introduces core AKS application scaling concepts, including manually scaling pods or nodes, using the Horizontal pod autosc",2025-11-18T19:02:00.000Z,concept-article,,0.3,False,"Conceptual overview of scaling options (HPA, cluster autoscaler, ACI); high-level without clear evidence of numeric thresholds or config tables.",unchanged https://learn.microsoft.com/en-us/azure/aks/concepts-scheduler-configuration,Overview,Concepts - Scheduler Configuration in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,Learn about scheduler configuration and advanced scheduling concepts for Azure Kubernetes Service (AKS).,"This article covers scheduler configuration and advanced scheduling concepts for workload placement in Azure Kubernetes Service (AKS), including configurable scheduler profiles, scheduling plugins, and scheduling constraints.",2025-12-05T23:11:00.000Z,concept-article,,0.3,False,"Concepts article on scheduler configuration; primarily explanatory, not focused on detailed config tables or numeric thresholds.",unchanged -https://learn.microsoft.com/en-us/azure/aks/concepts-security,Security concepts,Concepts - Security in Azure Kubernetes Services (AKS) - Azure Kubernetes Service,,"Learn about security in Azure Kubernetes Service (AKS), including master and node communication, network policies, and Kubernetes secrets.","Container security protects the entire end-to-end pipeline from build to the application workloads running in Azure Kubernetes Service (AKS). The Secure Supply Chain includes the build environment and registry. Kubernetes includes security components, such as pod security standards and Secrets. Azure includes components like Active Directory, Microsoft Defender for Containers, Azure Policy, Azure Key Vault, network security groups, and orchestrated cluster upgrades. 
AKS combines these security comp",2025-11-10T18:12:00.000Z,concept-article,,0.25,False,Security concepts overview; high-level description of components and services without detailed RBAC roles or config parameters.,unchanged +https://learn.microsoft.com/en-us/azure/aks/concepts-security,Security overview,Concepts - Security in Azure Kubernetes Services (AKS) - Azure Kubernetes Service,,"Learn about security in Azure Kubernetes Service (AKS), including master and node communication, network policies, and Kubernetes secrets.","Container security protects the entire end-to-end pipeline from build to the application workloads running in Azure Kubernetes Service (AKS). The Secure Supply Chain includes the build environment and registry. Kubernetes includes security components, such as pod security standards and Secrets. Azure includes components like Active Directory, Microsoft Defender for Containers, Azure Policy, Azure Key Vault, network security groups, and orchestrated cluster upgrades. AKS combines these security comp",2025-11-10T18:12:00.000Z,concept-article,,0.1,False,Security concepts overview for AKS; primarily conceptual description of components and services without detailed RBAC role lists or configuration parameters.,new
This article introduces the core concepts that provide storage to your applications in ",2025-10-01T22:07:00.000Z,concept-article,,0.2,False,"Conceptual overview of AKS storage (volumes, PVs, storage classes) without detailed configuration tables or product-specific limits.",unchanged https://learn.microsoft.com/en-us/azure/aks/concepts-sustainable-software-engineering,Sustainable software engineering best practices,Concepts - Sustainable software engineering in Azure Kubernetes Services (AKS) - Azure Kubernetes Service,,Learn about sustainable software engineering in Azure Kubernetes Service (AKS).,"The sustainable software engineering principles are a set of competencies to help you define, build, and run sustainable applications. The overall goal is to reduce the carbon footprint in every aspect of your application. The Azure Well-Architected Framework guidance for sustainability aligns with the Principles of Sustainable Software Engineering from the Green Software Foundation and provides an overview of the principles of sustainable software engineering. Sustainable software engineering is ",2024-08-01T20:29:00.000Z,concept-article,,0.2,False,"Conceptual sustainability principles and high-level guidance; lacks concrete AKS-specific configuration values, limits, or patterns.",unchanged -https://learn.microsoft.com/en-us/azure/aks/concepts-vulnerability-management,Security vulnerability management,Vulnerability management for Azure Kubernetes Service - Azure Kubernetes Service,,Learn how Microsoft manages security vulnerabilities for Azure Kubernetes Service (AKS) clusters.,"Vulnerability management involves detecting, assessing, mitigating, and reporting on any security vulnerabilities that exist in an organization's systems and software. Vulnerability management is a shared responsibility between you and Microsoft. 
This article describes how Microsoft manages security vulnerabilities and security updates (also referred to as patches), for Azure Kubernetes Service (AKS) clusters. Important Starting on November 30, 2025, Azure Kubernetes Service (AKS) no longer suppo",2025-09-11T05:03:00.000Z,concept-article,,0.25,False,"Vulnerability management overview; describes shared responsibility and process, not specific configuration or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/aks/confidential-containers-overview,Confidential Containers overview,Confidential Containers (preview) with Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,Learn about Confidential Containers (preview) on an Azure Kubernetes Service (AKS) cluster to maintain security and protect sensitive information.,"Important The Confidential Containers preview is set to sunset in March 2026. After this date, customers with existing Confidential Container node pools should expect to see reduced functionality, and you won't be able to spin up any new nodes with the KataCcIsolation runtime. Customers currently using Confidential Container node pools can continue using them as normal. If you want to move off Confidential Containers, consider the following alternatives: If you have additional questions, please cre",2025-10-25T05:08:00.000Z,concept-article,,0.3,False,"Confidential Containers overview; mostly conceptual plus deprecation note, not focused on detailed configuration or security role mappings.",unchanged
Vulnerability management is a shared responsibility between you and Microsoft. This article describes how Microsoft manages security vulnerabilities and security updates (also referred to as patches), for Azure Kubernetes Service (AKS) clusters. Important Starting on November 30, 2025, Azure Kubernetes Service (AKS) no longer suppo",2025-09-11T05:03:00.000Z,concept-article,,0.4,False,"Conceptual description of vulnerability management responsibilities and patching for AKS; summary suggests high-level process rather than specific error codes, configuration parameters, or limits.",new +https://learn.microsoft.com/en-us/azure/aks/confidential-containers-overview,Confidential Containers (preview),Confidential Containers (preview) with Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Understand AKS Confidential Containers preview capabilities and alternatives,Learn about Confidential Containers (preview) on an Azure Kubernetes Service (AKS) cluster to maintain security and protect sensitive information.,"Important The Confidential Containers preview is set to sunset in March 2026. After this date, customers with existing Confidential Container node pools should expect to see reduced functionality, and you won't be able to spin up any new nodes with the KataCcIsolation runtime. Customers currently using Confidential Container node pools can continue using them as normal. 
If you want to move off Confidential Containers, consider the following alternatives: If you have additional questions, please cre",2025-10-25T05:08:00.000Z,concept-article,decision-making,0.6,True,"Overview of Confidential Containers preview including sunset timelines and suggested alternatives; provides product-specific guidance on when to use or migrate away, which is decision-focused and time-bound expert knowledge.",new https://learn.microsoft.com/en-us/azure/aks/configure-aks-scheduler,Configure a scheduler profile,Configure Scheduler Profiles on Azure Kubernetes Service (AKS) (preview) - Azure Kubernetes Service,Configure scheduler profiles and plugins in AKS,Learn how to set scheduler profiles to achieve advanced scheduling behaviors on Azure Kubernetes Service (AKS).,"In this article, you learn how to deploy example scheduler profiles in Azure Kubernetes Service (AKS) to configure advanced scheduling behavior using in-tree scheduling plugins. This guide also explains how to verify the successful application of custom scheduler profiles targeting specific node pools or the entire AKS cluster.",2025-12-16T23:08:00.000Z,how-to,configuration,0.8,True,"How-to for scheduler profiles; likely includes specific profile names, plugin configuration fields, and AKS-specific constraints.",unchanged https://learn.microsoft.com/en-us/azure/aks/configure-azure-cni,Use Azure CNI,Configure Azure CNI Networking in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure Azure CNI networking for AKS clusters,Learn how to configure Azure CNI (advanced) networking in Azure Kubernetes Service (AKS).,"This article shows you how to use Container Networking Interface (CNI) networking in Azure to create and use a virtual network subnet for an Azure Kubernetes Service (AKS) cluster. 
For more information on network options and considerations, see Networking concepts for applications in Azure Kubernetes Service.",2026-01-29T08:00:00.000Z,how-to,configuration,0.7,True,"How-to for Azure CNI with specific subnet requirements, CLI flags, and AKS properties; these are concrete configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/aks/configure-azure-cni-dynamic-ip-allocation,Use Azure CNI Pod Subnet - Dynamic IP Allocation,Configure Azure CNI Pod Subnet - Dynamic IP Allocation and enhanced subnet support - Azure Kubernetes Service,Configure Azure CNI Pod Subnet dynamic IP allocation,Learn how to configure Azure CNI Pod Subnet - Dynamic IP Allocation and enhanced subnet support in Azure Kubernetes Service (AKS),"A drawback with the traditional CNI is the exhaustion of pod IP addresses as the AKS cluster grows, which results in the need to rebuild your entire cluster in a bigger subnet. The new dynamic IP allocation capability in Azure CNI solves this problem by allocating pod IPs from a subnet separate from the subnet hosting the AKS cluster. It offers the following benefits: This article shows you how to use Azure CNI Pod Subnet - Dynamic IP Allocation and enhanced subnet support in AKS.",2025-07-24T05:08:00.000Z,how-to,configuration,0.7,True,Shows how to enable dynamic IP allocation and enhanced subnet support; involves specific AKS configuration properties and constraints.,unchanged @@ -132,24 +139,24 @@ The Pod Subnet - Static Block Allocation in Azure CNI solves this problem by ass https://learn.microsoft.com/en-us/azure/aks/configure-dual-stack,Use dual-stack networking,Configure dual-stack networking in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure dual-stack IPv4/IPv6 networking in AKS,Learn how to configure dual-stack networking in Azure Kubernetes Service (AKS).,"You can deploy your AKS clusters in a dual-stack mode when using a dual-stack Azure virtual network. 
In this configuration, nodes receive both an IPv4 and IPv6 address from the Azure virtual network subnet. Pods receive both an IPv4 and IPv6 address from a logically different address space to the Azure virtual network subnet of the nodes. Network address translation (NAT) is then configured so that the pods can reach resources on the Azure virtual network. The source IP address of the traffic is",2025-12-17T12:02:00.000Z,how-to,configuration,0.7,True,"Describes dual-stack setup, address assignment, and NAT behavior; such how-to pages include specific AKS flags and network configuration details.",unchanged https://learn.microsoft.com/en-us/azure/aks/configure-dual-stack,Use kubenet with dual-stack networking,Configure dual-stack networking in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure dual-stack IPv4/IPv6 networking in AKS,Learn how to configure dual-stack networking in Azure Kubernetes Service (AKS).,"You can deploy your AKS clusters in a dual-stack mode when using a dual-stack Azure virtual network. In this configuration, nodes receive both an IPv4 and IPv6 address from the Azure virtual network subnet. Pods receive both an IPv4 and IPv6 address from a logically different address space to the Azure virtual network subnet of the nodes. Network address translation (NAT) is then configured so that the pods can reach resources on the Azure virtual network. 
The source IP address of the traffic is",2025-12-17T12:02:00.000Z,how-to,configuration,0.7,True,Duplicate of index 19 URL; same dual-stack configuration guidance with AKS-specific settings.,unchanged https://learn.microsoft.com/en-us/azure/aks/configure-kube-proxy,Use kube-proxy configuration,Configure kube-proxy (iptables/IPVS/nftables) (Preview) - Azure Kubernetes Service,Configure kube-proxy backends on AKS clusters,Learn how to configure kube-proxy to utilize different load balancing configurations with Azure Kubernetes Service (AKS).,"kube-proxy is a component of Kubernetes that handles routing traffic for services within the cluster. There are three backends available for Layer 3/4 load balancing in upstream kube-proxy: iptables, IPVS, and nftables. For more information, see the Kubernetes documentation on kube-proxy. Note You can disable the AKS-managed kube-proxy DaemonSet to support bring-your-own CNI. Important AKS preview features are available on a self-service, opt-in basis. Previews are provided ""as is"" and ""as available,"" and",2026-01-29T06:02:00.000Z,how-to,configuration,0.7,True,"How-to configuration article for kube-proxy on AKS; likely includes AKS-specific flags, modes (iptables/IPVS/nftables), and configuration parameters rather than just concepts.",unchanged -https://learn.microsoft.com/en-us/azure/aks/configure-kubenet,Use kubenet,Configure kubenet networking in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure kubenet networking for Azure Kubernetes Service clusters,Learn how to configure kubenet (basic) network in Azure Kubernetes Service (AKS) to deploy an AKS cluster into an existing virtual network and subnet.,"Important On 31 March 2028, kubenet networking for Azure Kubernetes Service (AKS) will be retired. To avoid service disruptions, you'll need to upgrade to Azure Container Networking Interface (CNI) overlay before that date, when workloads running on kubenet for AKS will no longer be supported. 
AKS clusters use kubenet and create an Azure virtual network and subnet for you by default. With kubenet, nodes get an IP address from the Azure virtual network subnet. Pods receive an IP address from a logica",2026-04-16T08:00:00.000Z,how-to,configuration,0.7,True,"A kubenet configuration article for AKS generally contains AKS-specific network settings (subnet requirements, route table configuration, cluster creation flags, and possibly IP range examples) that go beyond generic Kubernetes knowledge. These concrete, product-specific configuration steps and parameters constitute expert configuration knowledge.",updated +https://learn.microsoft.com/en-us/azure/aks/configure-kubenet,Use kubenet,Configure kubenet networking in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure kubenet networking for Azure Kubernetes Service clusters,Learn how to configure kubenet (basic) network in Azure Kubernetes Service (AKS) to deploy an AKS cluster into an existing virtual network and subnet.,"Important On 31 March 2028, kubenet networking for Azure Kubernetes Service (AKS) will be retired. To avoid service disruptions, you'll need to upgrade to Azure Container Networking Interface (CNI) overlay before that date, when workloads running on kubenet for AKS will no longer be supported. AKS clusters use kubenet and create an Azure virtual network and subnet for you by default. With kubenet, nodes get an IP address from the Azure virtual network subnet. Pods receive an IP address from a logica",2026-04-16T08:00:00.000Z,how-to,configuration,0.7,True,"A kubenet configuration article for AKS generally contains AKS-specific network settings (subnet requirements, route table configuration, cluster creation flags, and possibly IP range examples) that go beyond generic Kubernetes knowledge. 
These concrete, product-specific configuration steps and parameters constitute expert configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/aks/configure-load-balancer-standard,Configure a Standard Load Balancer,Configure a Public Standard Load Balancer in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure Standard Load Balancer settings for AKS,"Learn how you can configure a public standard load balancer in Azure Kubernetes Service (AKS) to meet your workload needs, including changing the inbound pool type, scaling outbound IPs, and configuri","You can customize different settings for your standard public load balancer at cluster creation time or by updating the cluster. These customization options allow you to create a load balancer that meets your workload needs. With the standard load balancer, you can: Important You can only use one outbound IP option (managed IPs, bring your own IP, or IP prefix) at a given time.",2025-11-04T23:06:00.000Z,how-to,configuration,0.78,True,"Covers changing inbound pool type, scaling outbound IPs, and configuring allocated outbound ports; includes specific parameters and constraints.",unchanged https://learn.microsoft.com/en-us/azure/aks/configure-node-binpack-scheduler,Configure node bin packing scheduler profile,Bin Pack Nodes with Scheduler Profiles on Azure Kubernetes Service (AKS) (preview) - Azure Kubernetes Service,Configure AKS scheduler profiles for node bin packing,Learn how to configure scheduler profiles to reduce idle costs and improve node utilization on Azure Kubernetes Service (AKS).,"In this article, you learn how to bin pack your nodes to improve node utilization for Azure Kubernetes Service (AKS) clusters using the in-tree scheduling plugin, NodeResourcesFit. The default AKS scheduler operates in a NodeResourcesFit:LeastAllocated mode, which prioritizes nodes with lower utilization when scheduling pods. 
Configurable Scheduler Profiles on AKS allows you to change this default behavior and fine-tune the configuration to prioritize nodes with higher utilization. This documentation c",2026-03-12T08:00:00.000Z,how-to,configuration,0.75,True,"This article describes configuring scheduler profiles and the NodeResourcesFit plugin to change AKS scheduling behavior. Such guidance typically includes specific scheduler profile names, plugin configuration fields, and example YAML/CLI parameters that are unique to AKS. These concrete configuration options and their effects on scheduling behavior constitute expert, product-specific configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/aks/configure-static-egress-gateway,Configure Static Egress Gateway,Configure Static Egress Gateway in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure static egress gateway for AKS outbound IPs,Learn how to configure Static Egress Gateway in Azure Kubernetes Service (AKS) to manage egress traffic from a constant IP address.,"Static Egress Gateway in AKS provides a streamlined solution for configuring fixed source IP addresses for outbound traffic from your AKS workloads. This feature allows you to route egress traffic through a dedicated ""gateway node pool"". By using the Static Egress Gateway, you can efficiently manage and control outbound IP addresses and ensure that your AKS workloads can communicate with external systems securely and consistently, using predefined IPs. This article provides step-by-step instruct",2026-03-10T17:04:00.000Z,how-to,configuration,0.7,True,"The article describes how to configure Static Egress Gateway in AKS, which typically involves product-specific configuration fields (for gateway node pools, routing, IP settings) and step-by-step setup with concrete parameter names and values. 
This is detailed configuration guidance rather than just conceptual networking overview.",unchanged https://learn.microsoft.com/en-us/azure/aks/considerations-pod-sandboxing,Pod Sandboxing considerations,Considerations for Pod Sandboxing on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Plan resource and security considerations for AKS pod sandboxing,Learn about some considerations that should be taken into account when deploying Pod Sandboxing.,"For Pod Sandboxing deployments on Azure Kubernetes Service (AKS) there are several items to consider in regard to resource management, memory management, CPU management, and security.",2025-11-18T16:04:00.000Z,how-to,best-practices,0.7,True,"Discusses concrete considerations and recommendations for resource, memory, CPU, and security management when using pod sandboxing in AKS.",unchanged -https://learn.microsoft.com/en-us/azure/aks/container-network-insights-agent-overview,Overview,Container Network Insights Agent for AKS overview (Public Preview) - Azure Kubernetes Service,,"Learn about Container Network Insights Agent, an AI-powered diagnostic assistant that helps you troubleshoot networking issues in Azure Kubernetes Service (AKS) clusters.","Container Network Insights Agent is an AI-powered diagnostic assistant that helps you identify and resolve networking issues in your Azure Kubernetes Service (AKS) clusters. Describe a problem in natural language, such as DNS failures, packet drops, unreachable services, or blocked traffic. The agent collects evidence from your cluster and returns a structured report with root cause analysis and remediation guidance. 
Unlike tools that operate only at the Kubernetes layer, Container Network Insig",2026-04-16T11:03:00.000Z,overview,,0.2,False,"Overview/marketing-style description of the Container Network Insights Agent; no detailed error codes, configuration tables, limits, or decision matrices are evident in the summary.",new +https://learn.microsoft.com/en-us/azure/aks/container-network-insights-agent-overview,Overview,Container Network Insights Agent for AKS overview (Public Preview) - Azure Kubernetes Service,,"Learn about Container Network Insights Agent, an AI-powered diagnostic assistant that helps you troubleshoot networking issues in Azure Kubernetes Service (AKS) clusters.","Container Network Insights Agent is an AI-powered diagnostic assistant that helps you identify and resolve networking issues in your Azure Kubernetes Service (AKS) clusters. Describe a problem in natural language, such as DNS failures, packet drops, unreachable services, or blocked traffic. The agent collects evidence from your cluster and returns a structured report with root cause analysis and remediation guidance. 
Unlike tools that operate only at the Kubernetes layer, Container Network Insig",2026-04-16T11:03:00.000Z,overview,,0.2,False,"Overview/marketing-style description of the Container Network Insights Agent; no detailed error codes, configuration tables, limits, or decision matrices are evident in the summary.",unchanged https://learn.microsoft.com/en-us/azure/aks/container-network-observability-guide,Advanced Container Networking Services observability guide,Use Advanced Container Networking Services for diagnosing and resolving network issues - Azure Kubernetes Service,Diagnose AKS network issues using Advanced Container Networking Services,This article walks you through how to use Advanced Container Networking Services for diagnosing and resolving network issues in Azure Kubernetes Service (AKS).,"This guide helps you navigate Advanced Container Networking Services (ACNS) as the primary solution for addressing real-world networking use cases in Azure Kubernetes Service (AKS). Whether troubleshooting DNS resolution problems, optimizing ingress and egress traffic, or ensuring compliance with network policies, this manual demonstrates how to harness ACNS observability dashboards, Container Network Logs, Container Network Metrics, and visualization tools to diagnose and resolve issues effectively",2025-08-18T17:09:00.000Z,concept-article,troubleshooting,0.7,True,"Guide for diagnosing and resolving real-world network issues using ACNS dashboards, logs, and metrics; likely organized by symptom and uses product-specific tools and views.",unchanged https://learn.microsoft.com/en-us/azure/aks/container-network-observability-how-to,Configure Container Network Metrics,Set up Container Network Observability for Azure Kubernetes Service (AKS) - Azure managed Prometheus and Grafana - Azure Kubernetes Service,Set up container network observability with Prometheus and Grafana,Get started with Container Network Observability for your AKS cluster using Azure managed Prometheus and 
Grafana.,"This article shows you how to set up Container Network Observability for Azure Kubernetes Service (AKS) using Managed Prometheus and Grafana and BYO Prometheus and Grafana and to visualize the scraped metrics You can use Container Network Observability to collect data about the network traffic of your AKS clusters. It enables a centralized platform for monitoring application and network health. Currently, metrics are stored in Prometheus and Grafana can be used to visualize them. Container Netwo",2025-09-11T05:03:00.000Z,how-to,configuration,0.7,True,"Step-by-step setup for Container Network Observability with managed/BYO Prometheus and Grafana; likely includes specific configuration flags, CRDs, and parameter values for scraping and dashboards.",unchanged https://learn.microsoft.com/en-us/azure/aks/container-network-observability-logs,Overview,Container Network Logs Overview - Azure Kubernetes Service,Configure container network logs and component renaming in AKS,Get an overview of container network logs (preview) in Advanced Container Networking Services for Azure Kubernetes Service (AKS).,"Important Component renaming (starting November 11, 2025) We are renaming components in the Container Network Logs feature to improve clarity and consistency: What's changing Action items for existing users to enable new naming Update Azure CLI(MUST - First step!): Update Preview CLI Extension(MUST): Disable Monitoring: Re-enable Monitoring: Re-enable ACNS Container Network Logs: Apply new ContainerNetworkLog CRD: Apply your updated CRD configuration with the new naming. 
Reimport Grafana Dashboa",2025-11-18T19:02:00.000Z,overview,configuration,0.7,True,"Component renaming and required CLI/CRD changes imply specific commands, resource names, and configuration steps for Container Network Logs, which are product-specific settings.",unchanged https://learn.microsoft.com/en-us/azure/aks/container-network-observability-metrics,Overview,Node and Pod Metrics with Advanced Container Networking Services - Azure Kubernetes Service,,Get an overview of container network metrics in Advanced Container Networking Services for Azure Kubernetes Service (AKS).,"Advanced Container Networking Services in Azure Kubernetes Service (AKS) facilitates the collection of comprehensive container network metrics to give you valuable insights into the performance of your containerized environment. The capability continuously captures essential metrics at the node level and pod level, including traffic volume, dropped packets, connection states, and Domain Name System (DNS) resolution times for effective monitoring and optimizing network performance. Capturing thes",2025-12-17T12:02:00.000Z,overview,,0.35,False,Described as an overview of container network metrics; likely conceptual description of available metrics rather than detailed config or limits.,unchanged -https://learn.microsoft.com/en-us/azure/aks/container-network-performance-ebpf-host-routing,Overview,eBPF Host Routing with Advanced Container Networking Services (ACNS) - Azure Kubernetes Service,,An overview of eBPF Host Routing on Advanced Container Networking Services with Azure Kubernetes Service (AKS).,"Important AKS preview features are available on a self-service, opt-in basis. Previews are provided ""as is"" and ""as available,"" and they're excluded from the service-level agreements and limited warranty. AKS previews are partially covered by customer support on a best-effort basis. As such, these features aren't meant for production use. 
For more information, see the following support articles: As containerized workloads scale across distributed environments, the need for high-performance, low-",2025-10-31T17:08:00.000Z,concept-article,,0.35,False,eBPF host routing overview; summary suggests conceptual performance/security discussion rather than detailed configuration or limits.,unchanged +https://learn.microsoft.com/en-us/azure/aks/container-network-performance-ebpf-host-routing,Overview,eBPF Host Routing with Advanced Container Networking Services (ACNS) - Azure Kubernetes Service,,An overview of eBPF Host Routing on Advanced Container Networking Services with Azure Kubernetes Service (AKS).,"Important AKS preview features are available on a self-service, opt-in basis. Previews are provided ""as is"" and ""as available,"" and they're excluded from the service-level agreements and limited warranty. AKS previews are partially covered by customer support on a best-effort basis. As such, these features aren't meant for production use. For more information, see the following support articles: As containerized workloads scale across distributed environments, the need for high-performance, low-",2026-04-20T22:08:00.000Z,concept-article,,0.2,False,"Overview of eBPF host routing; summary indicates conceptual performance discussion without detailed configuration tables, limits, or decision matrices.",updated https://learn.microsoft.com/en-us/azure/aks/container-network-security-cilium-mutual-tls-concepts,Overview,Cilium mutual TLS authentication and encryption with Advanced Container Networking Services (ACNS) - Azure Kubernetes Service,,An overview of Advanced Container Networking Services' Cilium mTLS encryption capabilities on Azure Kubernetes Service (AKS).,"Cilium mTLS encryption provides transparent, mutual TLS (mTLS) encryption and authentication for pod-to-pod traffic in Kubernetes without requiring application changes or introducing any extra networking stack. 
It ensures that both the source and destination workloads are cryptographically authenticated before any traffic is exchanged. This approach enables a zero-trust networking model for Kubernetes workloads. All encryption and authentication happens below the application layer, meaning workl",2026-03-23T11:02:00.000Z,concept-article,,0.3,False,"Described as an overview of Cilium mTLS encryption capabilities and zero-trust networking concepts for AKS. Based on the summary, it focuses on conceptual behavior (transparent mTLS, pod-to-pod encryption, zero-trust model) rather than concrete configuration parameters, limits, error codes, or decision matrices. No clear indication of RBAC role names, config tables, or numeric thresholds, so it does not meet any sub-skill detection criteria.",unchanged https://learn.microsoft.com/en-us/azure/aks/container-network-security-cilium-mutual-tls-how-to,Set up Cilium mTLS encryption,Deploy Cilium mTLS encryption with Advanced Container Networking Services - Azure Kubernetes Service,Configure Cilium mTLS encryption on AKS with ACNS,Get started with Cilium mTLS encryption for Advanced Container Networking Services on your AKS cluster.,"Important Cilium mTLS encryption with Advanced Container Networking Services is currently in PREVIEW. See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability. This article shows you how to deploy Cilium mTLS encryption with Advanced Container Networking Services in Azure Kubernetes Service (AKS) clusters.",2026-03-18T22:12:00.000Z,how-to,configuration,0.7,True,"A how-to article for deploying Cilium mTLS encryption on AKS with Advanced Container Networking Services is likely to include concrete configuration steps: specific CLI flags, YAML fields, configuration parameter names, and possibly default/allowed values for enabling mTLS at the cluster or CNI level. 
These are product-specific configuration details that qualify as expert knowledge under the configuration sub-skill type.",unchanged https://learn.microsoft.com/en-us/azure/aks/container-network-security-fqdn-filtering-concepts,Overview,FQDN filtering with Advanced Container Networking Services (ACNS) - Azure Kubernetes Service,,An overview of Advanced Container Networking Services' Security capabilities on Azure Kubernetes Service (AKS).,"Containerized environments present unique security challenges. Traditional network security methods, often reliant on IP-based filtering, can become cumbersome and less effective as IP addresses frequently change. Additionally, understanding network traffic patterns and identifying potential threats can be complex. FQDN filtering offers an efficient and user-friendly approach for managing network policies. By defining these policies based on domain names rather than IP addresses, organizations c",2025-04-30T17:05:00.000Z,concept-article,,0.3,False,"FQDN filtering concepts article; summary indicates high-level explanation of benefits and use, not detailed config or limits.",unchanged https://learn.microsoft.com/en-us/azure/aks/container-network-security-l7-policy-concepts,Overview,L7 Policy with Advanced Container Networking Services (ACNS) - Azure Kubernetes Service,,An overview of Advanced Container Networking Services' L7 Policy capabilities on Azure Kubernetes Service (AKS).,"Network policies are essential for securing Kubernetes clusters by defining and controlling pod communication. They mitigate unauthorized access and potential security breaches by regulating traffic flow. Advanced Container Networking Services strengthens security with FQDN-based network policies. Expanding on this foundation, Advanced Container Networking Services now provides L7 policy support, enabling detailed inspection and management of application-level traffic. 
This advancement enhances b",2025-04-30T17:05:00.000Z,concept-article,,0.35,False,L7 policy capabilities overview; appears conceptual description of features and benefits without detailed configuration tables.,unchanged -https://learn.microsoft.com/en-us/azure/aks/container-network-security-wireguard-encryption-concepts,Overview,WireGuard Encryption with Advanced Container Networking Services (ACNS) - Azure Kubernetes Service,,An overview of Advanced Container Networking Services' WireGurard encryption capabilities on Azure Kubernetes Service (AKS).,"As organizations increasingly rely on Azure Kubernetes Service (AKS) to run containerized workloads, ensuring the security of network traffic between applications and services becomes essential especially in regulated or security-sensitive environments. In-transit encryption with WireGuard protects data as it moves between pods and nodes, mitigating risks of interception or tampering. WireGuard is known for its simplicity, and robust cryptography, offers a powerful solution for securing communic",2026-04-08T11:02:00.000Z,concept-article,,0.2,False,"Primarily a conceptual/overview article describing WireGuard encryption capabilities in ACNS for AKS. The summary suggests high-level security and feature explanation without specific RBAC roles, configuration parameter tables, limits, or error codes.",unchanged -https://learn.microsoft.com/en-us/azure/aks/control-kubeconfig-access,Limit access to cluster configuration file,Limit access to kubeconfig in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Control kubeconfig access using Azure RBAC for AKS,Learn how to control access to the Kubernetes configuration file (kubeconfig) for cluster administrators and cluster users,You can interact with Kubernetes clusters using thekubectltool. The Azure CLI provides an easy way to get the access credentials andkubeconfigconfiguration file to connect to your AKS clusters usingkubectl. 
You can use Azure role-based access control (Azure RBAC) to limit who can get access to thekubeconfigfile and the permissions they have. This article shows you how to assign Azure roles that limit who can get the configuration information for an AKS cluster.,2024-08-01T20:29:00.000Z,how-to,security,0.8,True,Shows how to assign Azure roles to limit kubeconfig retrieval; includes specific RBAC role names and scope patterns unique to AKS.,unchanged +https://learn.microsoft.com/en-us/azure/aks/container-network-security-wireguard-encryption-concepts,Overview,WireGuard Encryption with Advanced Container Networking Services - Azure Kubernetes Service,,An overview of Advanced Container Networking Services' WireGuard encryption capabilities on Azure Kubernetes Service (AKS).,"As organizations increasingly rely on Azure Kubernetes Service (AKS) to run containerized workloads, ensuring the security of network traffic between applications and services becomes essential, especially in regulated or security-sensitive environments. In-transit encryption with WireGuard protects data as it moves between pods and nodes, mitigating risks of interception or tampering. WireGuard, known for its simplicity and robust cryptography, offers a powerful solution for securing communic",2026-04-24T17:17:00.000Z,concept-article,,0.2,False,"Conceptual overview of WireGuard encryption on AKS; summary suggests high-level explanation without concrete configuration parameters, limits, or role definitions.",updated +https://learn.microsoft.com/en-us/azure/aks/control-kubeconfig-access,Control access to kubeconfig,Limit access to kubeconfig in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Restrict kubeconfig access in AKS using Azure RBAC,Learn how to control access to the Kubernetes configuration file (kubeconfig) for cluster administrators and cluster users,You can interact with Kubernetes clusters using the kubectl tool. 
The Azure CLI provides an easy way to get the access credentials and kubeconfig configuration file to connect to your AKS clusters using kubectl. You can use Azure role-based access control (Azure RBAC) to limit who can get access to the kubeconfig file and the permissions they have. This article shows you how to assign Azure roles that limit who can get the configuration information for an AKS cluster.,2024-08-01T20:29:00.000Z,how-to,security,0.8,True,"Shows how to assign Azure roles to control kubeconfig access; contains specific RBAC role names and scope usage, which are concrete security configuration details.",new https://learn.microsoft.com/en-us/azure/aks/control-plane-metrics-monitor,Monitor control plane metrics,Monitor AKS Control Plane Metrics - Azure Kubernetes Service,Configure Azure Monitor for AKS control plane metrics,Learn how to monitor Azure Kubernetes Service (AKS) control plane metrics (preview) by using Azure Monitor.,"In this article, you learn how to monitor the Azure Kubernetes Service (AKS) control plane by using control plane metrics in Azure Monitor. AKS supports a subset of control plane metrics free through Azure Monitor platform metrics. The control plane metrics feature gives you visibility into the availability and performance of critical control plane components like the API server, etcd, the scheduler, the autoscaler, and the controller manager in AKS. 
The feature is also fully compatible with the ",2025-07-11T05:05:00.000Z,how-to,configuration,0.65,True,"How-to for enabling control plane metrics in Azure Monitor; likely includes metric names, namespaces, and configuration parameters specific to AKS control plane monitoring.",unchanged https://learn.microsoft.com/en-us/azure/aks/core-aks-concepts,Core AKS concepts,Azure Kubernetes Service (AKS) Core Concepts - Azure Kubernetes Service,,Learn about the core concepts of Azure Kubernetes Service (AKS).,"This article describes core concepts of Azure Kubernetes Service (AKS), a managed Kubernetes service that you can use to deploy and operate containerized applications at scale on Azure. Important Starting on November 30, 2025, Azure Kubernetes Service (AKS) no longer supports or provides security updates for Azure Linux 2.0. The Azure Linux 2.0 node image is frozen at the 202512.06.0 release. Beginning on March 31, 2026, node images will be removed, and you'll be unable to scale your node pools. Mi",2025-12-30T23:03:00.000Z,concept-article,,0.3,False,Core concepts article; mostly conceptual explanation of AKS constructs. 
The deprecation dates for Azure Linux 2.0 are specific but not within any defined sub-skill category.,unchanged https://learn.microsoft.com/en-us/azure/aks/coredns-autoscale,Configure autoscaling for CoreDNS,Configure Autoscaling for CoreDNS in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure CoreDNS autoscaling settings in AKS,Learn how to configure and customize CoreDNS autoscaling in Azure Kubernetes Service (AKS) clusters.,This article explains how to configure and customize CoreDNS autoscaling in Azure Kubernetes Service (AKS) clusters.,2025-10-22T05:06:00.000Z,how-to,configuration,0.7,True,Explains how to tune CoreDNS autoscaler with specific parameters and thresholds; these are product-specific configuration values.,unchanged @@ -169,12 +176,12 @@ https://learn.microsoft.com/en-us/azure/aks/create-volume-azure-disk,Create pers https://learn.microsoft.com/en-us/azure/aks/create-volume-azure-files,Create persistent volumes with Azure Files CSI driver,Create and Manage Persistent Volumes with Azure Files in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Use Azure Files CSI volumes with AKS pods,Learn how to create and manage persistent volumes using Azure Files with the Container Storage Interface (CSI) driver in Azure Kubernetes Service (AKS) to provide scalable and reliable storage for you,"If multiple pods need concurrent access to the same storage volume, you can use Azure Files to connect using the Server Message Block (SMB) or NFS protocol. This article shows you how to dynamically and statically create an Azure file share for use by multiple pods in an Azure Kubernetes Service (AKS) cluster. Note The Azure File CSI driver only permits the mounting of SMB file shares using key-based (NTLM v2) authentication, and therefore doesn't support the maximum security profile of Azure File ",2026-03-31T08:00:00.000Z,how-to,integrations,0.7,True,"How-to for dynamically and statically creating Azure Files shares via the CSI driver for AKS. 
These pages typically include StorageClass, PersistentVolume, and PersistentVolumeClaim YAML with driver names, parameters, and mount options that are specific to the Azure File CSI driver (e.g., protocol selection, secrets, volumeHandle formats). That constitutes product-specific integration and configuration patterns beyond generic Kubernetes knowledge.",unchanged https://learn.microsoft.com/en-us/azure/aks/csi-disk-move-subscriptions,Move a persistent volume between clusters,Move Azure Disk persistent volumes to another AKS cluster in the same or a different subscription - Azure Kubernetes Service,Move Azure Disk PVs between AKS clusters,Learn how to move a persistent volume between Azure Kubernetes Service clusters in the same subscription or a different subscription.,This article describes how to safely move Azure Disk persistent volumes from one Azure Kubernetes Service (AKS) cluster to another in the same subscription or in a different subscription. The target subscription must be in the same region. The sequence of steps to complete this move is:,2024-08-01T20:29:00.000Z,how-to,deployment,0.7,True,"Stepwise procedure for safely moving Azure Disk-backed PVs across AKS clusters/subscriptions, including ordering and AKS-specific operations, is specialized operational knowledge.",unchanged https://learn.microsoft.com/en-us/azure/aks/csi-migrate-in-tree-volumes,Migrate from in-tree to CSI driver,Migrate from in-tree storage class to CSI drivers on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Migrate AKS in-tree volumes to CSI drivers,Learn how to migrate from in-tree persistent volume to the Container Storage Interface (CSI) driver in an Azure Kubernetes Service (AKS) cluster.,"The implementation of the Container Storage Interface (CSI) driver was introduced in Azure Kubernetes Service (AKS) starting with version 1.21. 
By adopting and using CSI as the standard, your existing stateful workloads using in-tree Persistent Volumes (PVs) should be migrated or upgraded to use the CSI driver. To make this process as simple as possible, and to ensure no data loss, this article provides different migration options. These options include scripts to help ensure a smooth migration fr",2025-04-12T08:00:00.000Z,how-to,deployment,0.65,True,Migration runbooks and scripts for moving AKS in-tree PVs to CSI drivers are product- and version-specific operational knowledge that goes beyond generic Kubernetes concepts.,unchanged -https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-configuration-options,3 - Apply extra configurations or perform troubleshooting,Azure Key Vault Provider for Secrets Store CSI Driver for Azure Kubernetes Service (AKS) Configuration Options - Azure Kubernetes Service,Configure Azure Key Vault CSI provider options on AKS,Learn about configuration and troubleshooting options for the Azure Key Vault provider for Secrets Store CSI Driver in AKS.,"This article describes configuration options for the Azure Key Vault provider for Secrets Store CSI Driver in Azure Kubernetes Service (AKS), including how to enable and customize autorotation of secrets, sync mounted content with a Kubernetes secret, and access metrics.",2026-01-30T23:07:00.000Z,how-to,configuration,0.86,True,"Explicitly about configuration options (autorotation, secret sync, metrics) for the Azure Key Vault provider; contains product-specific settings and behaviors.",unchanged -https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-driver,1 - Configure Secrets Store CSI Driver,Use the Azure Key Vault provider for Secrets Store CSI Driver for Azure Kubernetes Service (AKS) secrets - Azure Kubernetes Service,Integrate AKS with Azure Key Vault via CSI driver,Learn how to use the Azure Key Vault provider for Secrets Store CSI Driver to integrate secrets stores with Azure Kubernetes Service (AKS).,The Azure 
Key Vault provider for Secrets Store Container Storage Interface (CSI) Driver allows for the integration of an Azure Key Vault as a secret store with an Azure Kubernetes Service (AKS) cluster via aCSI volume.,2026-04-04T06:03:00.000Z,how-to,integrations,0.7,True,"The article describes using the Azure Key Vault provider for the Secrets Store CSI Driver with AKS, which is a concrete integration pattern between AKS and Key Vault. It typically includes provider-specific configuration objects, parameters, and Kubernetes resource specs unique to this integration, matching the integrations & coding patterns category.",unchanged -https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-identity-access,2 - Provide Azure Key Vault access,Access Azure Key Vault with the CSI Driver Identity Provider - Azure Kubernetes Service,Secure AKS Key Vault access with CSI identity options,Learn how to integrate the Azure Key Vault Provider for Secrets Store CSI Driver with your Azure credentials and user identities.,The Secrets Store Container Storage Interface (CSI) Driver on Azure Kubernetes Service (AKS) provides various methods of identity-based access to your Azure Key Vault. This article outlines these methods and best practices for when to use Azure role-based access control (Azure RBAC) or OpenID Connect (OIDC) security models to access your key vault and AKS cluster. 
You can use one of the following access methods: Learn how to connect to Azure Key Vault with the Secrets Store CSI Driver in an Azur,2026-02-25T23:14:00.000Z,how-to,security,0.82,True,"Covers identity-based access methods (Azure RBAC, OIDC) for Secrets Store CSI Driver with AKS, including security model choices and best practices specific to this integration.",unchanged -https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-nginx-tls,Configure TLS for NGINX Ingress controller,Set up Secrets Store CSI Driver to enable NGINX Ingress Controller with TLS on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure Secrets Store CSI with NGINX TLS on AKS,How to configure Secrets Store CSI Driver to enable NGINX Ingress Controller with TLS for Azure Kubernetes Service (AKS).,"Important TheKubernetes SIG Networkand the Security Response Committeeannounced the upcoming retirementof theIngress NGINX project, with maintenance ending inMarch 2026. There's no immediate action required today for AKS clusters using theApplication Routing add-on with NGINX. Microsoft will provide official support for critical security patches for Application Routing add-on NGINX Ingress resources throughNovember 2026. 
AKS is aligning with upstream Kubernetes by moving toGateway APIas the long",2026-02-04T23:11:00.000Z,how-to,integrations,0.76,True,"Detailed integration pattern between Secrets Store CSI Driver, Azure Key Vault, and NGINX Ingress TLS on AKS with product-specific configuration steps and parameters.",unchanged +https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-configuration-options,Configure and troubleshoot the Azure Key Vault provider for Secrets Store CSI Driver,Azure Key Vault Provider for Secrets Store CSI Driver for Azure Kubernetes Service (AKS) Configuration Options - Azure Kubernetes Service,Configure Azure Key Vault provider options for Secrets Store CSI on AKS,Learn about configuration and troubleshooting options for the Azure Key Vault provider for Secrets Store CSI Driver in AKS.,"This article describes configuration options for the Azure Key Vault provider for Secrets Store CSI Driver in Azure Kubernetes Service (AKS), including how to enable and customize autorotation of secrets, sync mounted content with a Kubernetes secret, and access metrics.",2026-01-30T23:07:00.000Z,how-to,configuration,0.85,True,"Explicitly about configuration options for the Key Vault provider in CSI Driver; likely includes parameter tables, autorotation settings, sync options, and metrics configuration with specific names and allowed values.",new +https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-driver,Configure the Azure Key Vault provider for Secrets Store CSI Driver,Use the Azure Key Vault provider for Secrets Store CSI Driver for Azure Kubernetes Service (AKS) secrets - Azure Kubernetes Service,Use Azure Key Vault provider for Secrets Store CSI on AKS,Learn how to use the Azure Key Vault provider for Secrets Store CSI Driver to integrate secrets stores with Azure Kubernetes Service (AKS).,The Azure Key Vault provider for Secrets Store Container Storage Interface (CSI) Driver allows for the integration of an Azure Key Vault as a secret store with an 
Azure Kubernetes Service (AKS) cluster via a CSI volume.,2026-04-04T06:03:00.000Z,how-to,integrations,0.8,True,"Covers Azure Key Vault provider for CSI Driver with AKS; typically includes SecretProviderClass schema, parameter names, and provider-specific options that are product-specific integration details.",new +https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-identity-access,Provide access to Azure Key Vault from Secrets Store CSI Driver,Access Azure Key Vault with the CSI Driver Identity Provider - Azure Kubernetes Service,Choose identity models for AKS Secrets Store CSI to access Key Vault,Learn how to integrate the Azure Key Vault Provider for Secrets Store CSI Driver with your Azure credentials and user identities.,The Secrets Store Container Storage Interface (CSI) Driver on Azure Kubernetes Service (AKS) provides various methods of identity-based access to your Azure Key Vault. This article outlines these methods and best practices for when to use Azure role-based access control (Azure RBAC) or OpenID Connect (OIDC) security models to access your key vault and AKS cluster. 
You can use one of the following access methods: Learn how to connect to Azure Key Vault with the Secrets Store CSI Driver in an Azur",2026-02-25T23:14:00.000Z,how-to,security,0.8,True,"Focuses on identity-based access (Azure RBAC vs OIDC) for Key Vault via CSI Driver; likely includes specific roles, scopes, and configuration for AKS and Key Vault, which are product-specific security settings and decision guidance.",new +https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-nginx-tls,Serve NGINX TLS certs from Key Vault,Set up Secrets Store CSI Driver to enable NGINX Ingress Controller with TLS on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure Secrets Store CSI with NGINX TLS on AKS,How to configure Secrets Store CSI Driver to enable NGINX Ingress Controller with TLS for Azure Kubernetes Service (AKS).,"Caution The Kubernetes SIG Network and the Security Response Committee announced the upcoming retirement of the Ingress NGINX project, with maintenance ending in March 2026. There's no immediate action required today for AKS clusters using the application routing add-on with NGINX. Microsoft will provide official support for critical security patches for application routing add-on NGINX Ingress resources through November 2026. 
AKS is aligning with upstream Kubernetes by moving to Gateway API as the long-t",2026-02-04T23:11:00.000Z,how-to,integrations,0.7,True,"How-to for integrating Secrets Store CSI Driver, Azure Key Vault, and NGINX Ingress TLS on AKS; likely includes YAML specs, driver parameters, and configuration values unique to this integration.",new https://learn.microsoft.com/en-us/azure/aks/csi-storage-drivers,Overview of CSI storage drivers,Use Container Storage Interface (CSI) Drivers on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,"Learn about and deploy the Container Storage Interface (CSI) drivers for Azure Disks, Azure Files, and Azure Blob storage on your Azure Kubernetes Service (AKS) cluster to enable flexible and efficien","The Container Storage Interface (CSI) is a standard for exposing arbitrary block and file storage systems to containerized workloads on Kubernetes. When using CSI, Azure Kubernetes Service (AKS) can write, deploy, and iterate plug-ins to expose new or improve existing storage systems in Kubernetes without having to touch the core Kubernetes code and wait for its release cycles. The CSI storage driver support on AKS allows you to natively use Azure Disks, Azure Files, or Azure Blob storage as persist",2026-04-01T06:03:00.000Z,how-to,,0.3,False,"Appears to be a conceptual/overview page about CSI drivers on AKS and how to use Azure Disks/Files/Blob via CSI. 
The summary does not indicate detailed limits, configuration parameter tables, error codes, or decision matrices; it reads like a capability/feature explanation and high-level guidance.",unchanged -https://learn.microsoft.com/en-us/azure/aks/custom-certificate-authority,Use custom certificate authorities,Custom Certificate Authority (CA) in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure custom certificate authorities on AKS nodes,Learn how to use a custom certificate authority (CA) to add certificates to your nodes in an Azure Kubernetes Service (AKS) cluster.,"Custom Certificate Authority (CA) allows you to add up to 10 base64-encoded certificates to your node's trust store. This feature is often needed when certificate authorities (CAs) are required to be present on the node, like when connecting to a private registry. This article shows you how to create custom CAs and apply them to your AKS clusters. Note The Custom CA feature adds your custom certificates to the trust store of the AKS node. Certificates added with this feature aren't available to ",2026-01-06T23:11:00.000Z,how-to,security,0.8,True,Describes Custom CA feature with explicit limit (up to 10 certificates) and how they are applied to node trust stores; includes AKS-specific configuration steps.,unchanged +https://learn.microsoft.com/en-us/azure/aks/custom-certificate-authority,Custom certificate authorities on nodes,Custom Certificate Authority (CA) in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure custom certificate authorities on AKS nodes,Learn how to use a custom certificate authority (CA) to add certificates to your nodes in an Azure Kubernetes Service (AKS) cluster.,"Custom Certificate Authority (CA) allows you to add up to 10 base64-encoded certificates to your node's trust store. This feature is often needed when certificate authorities (CAs) are required to be present on the node, like when connecting to a private registry. 
This article shows you how to create custom CAs and apply them to your AKS clusters. Note The Custom CA feature adds your custom certificates to the trust store of the AKS node. Certificates added with this feature aren't available to ",2026-01-06T23:11:00.000Z,how-to,limits-quotas,0.8,True,States a concrete numeric limit (up to 10 base64-encoded certificates per node trust store) and explains behavior; this is specific limit/constraint information.,new https://learn.microsoft.com/en-us/azure/aks/custom-node-configuration,Custom node configuration,Customize the Node Configuration for Azure Kubernetes Service (AKS) Node Pools - Azure Kubernetes Service,Configure AKS node OS and kubelet settings,Learn how to customize the configuration on Azure Kubernetes Service (AKS) cluster nodes and node pools.,"Customizing your node configuration allows you to adjust operating system (OS) settings or kubelet parameters to match the needs of your workloads. When you create an AKS cluster or add a node pool to your cluster, you can customize a subset of commonly used OS and kubelet settings. To configure settings beyond this subset, you canuse a daemon set to customize your needed configurations without losing AKS support for your nodes.",2026-04-02T22:13:00.000Z,how-to,configuration,0.78,True,"The page describes how to customize AKS node and node pool configuration, including specific OS and kubelet parameters that can be set only through AKS-supported mechanisms. 
These are product-specific configuration options (names, scopes, and supported subsets) that go beyond generic Kubernetes knowledge, fitting the configuration sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/aks/customize-resource-configuration,Customize resource configuration with cost optimized autoscaling,Customize the resource configuration for managed add-ons - Azure Kubernetes Service,Customize resource configuration for AKS managed add-ons,Learn how to customize the resource configuration for managed add-ons.,This article provides an overview of how to customize the resource configuration for Azure Kubernetes Service (AKS) managed add-ons with cost optimized add-on scaling (Preview).,2025-06-20T11:03:00.000Z,how-to,configuration,0.76,True,"Explains how to customize resource configuration for managed add-ons with cost-optimized scaling; likely includes parameter names, ranges, and examples.",unchanged https://learn.microsoft.com/en-us/azure/aks/dapr,Install the Dapr extension,Install Dapr Extension for Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Install Dapr extension on AKS and Arc-enabled clusters,Install and configure Dapr on your Azure Kubernetes Service (AKS) and Arc-enabled Kubernetes clusters using the Dapr cluster extension.,"Dapr simplifies building resilient, stateless, and stateful applications that run on the cloud and edge and embrace the diversity of languages and developer frameworks. 
With Dapr's sidecar architecture, you can keep your code platform agnostic while tackling challenges around building microservices, like: Note If you plan to install Dapr in a Kubernetes production environment, see the Dapr production guidelines.",2026-02-19T18:11:00.000Z,how-to,deployment,0.65,True,"Covers installing and configuring the Dapr cluster extension on AKS/Arc, including Azure-specific extension parameters and setup.",unchanged @@ -188,7 +195,7 @@ https://learn.microsoft.com/en-us/azure/aks/deploy-application-az-cli,Deploy Kub https://learn.microsoft.com/en-us/azure/aks/deploy-application-template,Deploy Kubernetes applications with ARM template,Deploy an Azure Kubernetes application by using an ARM template - Azure Kubernetes Service,Deploy Azure Kubernetes applications using ARM templates,Learn how to deploy an Azure Kubernetes application by using an ARM template.,"To deploy a Kubernetes application programmatically through Azure CLI, you select the Kubernetes application and settings, generate an ARM template, accept legal terms and conditions, and finally deploy the ARM template.",2024-08-01T20:29:00.000Z,concept-article,deployment,0.7,True,Shows how to generate and use ARM templates to deploy Kubernetes applications programmatically to AKS; product-specific deployment automation.,unchanged https://learn.microsoft.com/en-us/azure/aks/deploy-batch-jobs-with-kueue,Schedule and deploy batch jobs with Kueue,Schedule and Deploy Batch Jobs with Kueue on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Schedule and deploy batch jobs with Kueue on AKS,Learn how to define Kueue deployments and efficiently schedule batch workloads on your Azure Kubernetes Service (AKS) cluster.,"In this article, you learn how to schedule and deploy sample batch jobs on Azure Kubernetes Service (AKS) using Kueue. 
Also, this guide covers installing Kueue, configuring ResourceFlavors and ClusterQueues for fine-grained resource management, and submitting jobs via LocalQueues. You also learn how to use Kueue to queue up a sample batch job and track the results across Pending, Running, and Finished states. Important Open-source software is mentioned throughout AKS documentation and samples. S",2025-10-02T17:09:00.000Z,how-to,configuration,0.7,True,"Shows how to define ResourceFlavors, ClusterQueues, LocalQueues; includes concrete YAML and Kueue-specific configuration patterns on AKS.",unchanged https://learn.microsoft.com/en-us/azure/aks/deploy-cluster-terraform-verified-module,Deploy a production-ready cluster using Terraform with an Azure Verified Module (AVM),Deploy a Production-Ready Azure Kubernetes Service (AKS) Cluster using Terraform with an Azure Verified Module (AVM) - Azure Kubernetes Service,Deploy production AKS clusters with Terraform AVM,Learn how to deploy a production-ready Azure Kubernetes Service (AKS) cluster using Terraform with an Azure Verified Module (AVM).,"Azure Verified Modules (AVMs) are pre-defined, reusable Infrastructure as Code (IaC) modules developed and maintained by Microsoft for Bicep and Terraform. AVMs are designed to help you deploy Azure resources in a consistent and reliable manner, following best practices and compliance standards. In this article, you learn how to deploy a production-ready AKS cluster using Terraform with an Azure Verified Module (AVM). For more information about AVMs, see Azure Verified Modules.",2026-04-09T22:07:00.000Z,how-to,deployment,0.65,True,"The article describes deploying a production-ready AKS cluster using a specific Azure Verified Module for Terraform. AVMs encode Microsoft-authored, product-specific deployment patterns and constraints for AKS (for example, required/expected configuration combinations, production-ready defaults, and module-specific parameters). 
This goes beyond generic Terraform usage and contains expert, product-specific deployment guidance, so it best fits the deployment sub-skill.",unchanged -https://learn.microsoft.com/en-us/azure/aks/deploy-confidential-containers-default-policy,Deploy Confidential Containers with default policy,Deploy an AKS cluster with Confidential Containers (preview) - Azure Kubernetes Service,Deploy AKS clusters with Confidential Containers,Learn how to create an Azure Kubernetes Service (AKS) cluster with Confidential Containers (preview) and a default security policy by using the Azure CLI.,"In this article, you use the Azure CLI to deploy an Azure Kubernetes Service (AKS) cluster and configure Confidential Containers (preview) with an automatically generated security policy. You then deploy an application as a Confidential container. To learn more, read theoverview of AKS Confidential Containers. In general, getting started with AKS Confidential Containers involves the following steps. Important Starting onNovember 30, 2025, Azure Kubernetes Service (AKS) no longer supports or prov",2025-01-15T23:01:00.000Z,quickstart,deployment,0.7,True,"CLI-based deployment of Confidential Containers; likely includes required node pool types, flags, and constraints specific to this preview runtime.",unchanged +https://learn.microsoft.com/en-us/azure/aks/deploy-confidential-containers-default-policy,Deploy with default policy,Deploy an AKS cluster with Confidential Containers (preview) - Azure Kubernetes Service,Deploy AKS clusters with Confidential Containers and default policy,Learn how to create an Azure Kubernetes Service (AKS) cluster with Confidential Containers (preview) and a default security policy by using the Azure CLI.,"In this article, you use the Azure CLI to deploy an Azure Kubernetes Service (AKS) cluster and configure Confidential Containers (preview) with an automatically generated security policy. You then deploy an application as a Confidential container. 
To learn more, read the overview of AKS Confidential Containers. In general, getting started with AKS Confidential Containers involves the following steps. Important Starting on November 30, 2025, Azure Kubernetes Service (AKS) no longer supports or prov",2025-01-15T23:01:00.000Z,quickstart,deployment,0.75,True,"CLI-based deployment of AKS with Confidential Containers and auto-generated security policy; likely includes required flags, runtime options, and constraints specific to this deployment scenario.",new https://learn.microsoft.com/en-us/azure/aks/deploy-extensions-az-cli,Use the Azure CLI,Deploy and manage cluster extensions by using the Azure CLI - Azure Kubernetes Service,Deploy and manage AKS cluster extensions with Azure CLI,Learn how to use Azure CLI to deploy and manage extensions for Azure Kubernetes Service clusters.,"You can create extension instances in an AKS cluster, setting required and optional parameters including options related to updates and configurations. You can also view, list, update, and delete extension instances. Before you begin, read about cluster extensions. Note The examples provided in this article are not complete, and are only meant to showcase functionality. 
For a comprehensive list of commands and their parameters, see the az k8s-extension CLI reference.",2024-08-01T20:29:00.000Z,how-to,configuration,0.7,True,"Shows CLI parameters for creating, updating, and deleting extension instances, including required/optional settings—product-specific configuration commands.",unchanged https://learn.microsoft.com/en-us/azure/aks/deploy-marketplace,Deploy Kubernetes applications from Azure Marketplace,Deploy a Kubernetes application from Azure Marketplace - Azure Kubernetes Service,Deploy Kubernetes applications from Azure Marketplace to AKS,Learn how to deploy Kubernetes applications from Azure Marketplace on an Azure Kubernetes Service (AKS) cluster.,"In this article, you learn how to deploy and manage a Kubernetes application from Azure Marketplace. Azure Marketplace is an online store that contains thousands of IT software applications and services built by industry-leading technology companies. In Azure Marketplace, you can find, try, buy, and deploy the software and services that you need to build new solutions and manage your cloud infrastructure. 
The catalog includes solutions for different industries and technical areas, free trials, an",2024-10-15T22:00:00.000Z,how-to,deployment,0.7,True,"Explains how to select, deploy, and manage Kubernetes applications from Azure Marketplace onto AKS; includes Azure-specific deployment flow.",unchanged https://learn.microsoft.com/en-us/azure/aks/deploy-mongodb-cluster,Deploy MongoDB cluster,Configure and deploy a MongoDB cluster on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure and deploy a MongoDB cluster on AKS,"In this article, you learn how to configure and deploy a MongoDB cluster on AKS.","In this article, you configure and deploy a MongoDB cluster on Azure Kubernetes Service (AKS).",2025-09-15T08:00:00.000Z,how-to,configuration,0.7,True,"Deployment configuration for MongoDB on AKS; likely includes YAML manifests, operator settings, and MongoDB-specific parameters unique to this solution.",unchanged @@ -217,15 +224,16 @@ https://learn.microsoft.com/en-us/azure/aks/eks-web-prepare,Prepare for deploymen
https://learn.microsoft.com/en-us/azure/aks/eks-web-rearchitect,Rearchitect for AKS,Rearchitect AWS EKS web application for Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Rearchitect AWS EKS web app and WAF for AKS,Understand the architectural differences and steps to replicate the AWS EKS web application workload and AWS WAF protection in Azure.,"Now that you have a better understanding of the platform differences between AWS and Azure, let's examine the web application architecture on AWS and the modifications needed to make it compatible with Azure Kubernetes Service (AKS).",2024-12-13T23:02:00.000Z,how-to,architecture-patterns,0.75,True,Details architectural modifications to move an AWS EKS web app with AWS WAF to AKS with Azure WAF and Application Gateway for Containers; product-specific pattern guidance.,unchanged https://learn.microsoft.com/en-us/azure/aks/eks-web-refactor,Refactor app,Migrate Amazon Web Services (AWS) web 
application to Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Migrate Yelb web application from EKS to AKS,Step-by-step guide on updating and migrating the AWS web application to Azure Kubernetes Service (AKS).,"This article outlines the steps necessary for migrating the Yelb application from AWS EKS to AKS. Please note that the Yelb application is self-contained and doesn't rely on external services, so it can be migrated from AWS to Azure without code changes.",2024-12-13T23:02:00.000Z,how-to,deployment,0.65,True,"Provides concrete steps to move the Yelb app from AWS EKS to AKS, including AKS-specific deployment instructions; focused on practical migration deployment.",unchanged https://learn.microsoft.com/en-us/azure/aks/eks-web-understand,Understand platform differences,Understand platform differences for the web application workload - Azure Kubernetes Service,Understand AWS vs Azure differences for web apps,Learn about the key differences between the AWS and Azure platforms related to the web application hosting.,"Before you migrate the sample web application to Azure, make sure you have a solid understanding of the operational differences between the AWS and Azure platforms. This article walks through some of the key concepts for this workload and provides links to resources for more information. For a comprehensive comparison between Azure and AWS services, see AWS to Azure services comparison. Both AKS and EKS provide multiple options for deploying a managed Kubernetes cluster on Azure and AWS. 
These op",2025-11-20T23:11:00.000Z,how-to,decision-making,0.65,True,Explains operational differences between AWS and Azure for hosting web apps on managed Kubernetes; supports choosing equivalent Azure components and approaches.,unchanged -https://learn.microsoft.com/en-us/azure/aks/enable-authentication-microsoft-entra-id,Enable AKS-managed Microsoft Entra integration,Enable AKS-managed Microsoft Entra integration on an Azure Kubernetes Service cluster - Azure Kubernetes Service,Enable AKS-managed Microsoft Entra integration,Learn how to enable AKS-managed Microsoft Entra integration on an Azure Kubernetes Service cluster with kubelogin.,"The AKS-managed Microsoft Entra integration simplifies the Microsoft Entra integration process. Previously, you were required to create a client and server app, and the Microsoft Entra tenant had to assign Directory Readers role permissions. Now, the Azure Kubernetes Service (AKS) resource provider manages the client and server apps for you. Cluster administrators can configure Kubernetes role-based access control (Kubernetes RBAC) based on a user's identity or directory group membership. Microsof",2025-10-07T22:07:00.000Z,how-to,security,0.8,True,"Step-by-step configuration of AKS-managed Entra integration with specific parameters, roles, and kubelogin usage unique to AKS.",unchanged
-https://learn.microsoft.com/en-us/azure/aks/enable-fips-nodes,Enable FIPS,Enable Federal Information Process Standard (FIPS) for Azure Kubernetes Service (AKS) node pools - Azure Kubernetes Service,Enable FIPS 140-2 compliant node pools in AKS,Learn how to enable Federal Information Process Standard (FIPS) for Azure Kubernetes Service (AKS) node pools.,The Federal Information Processing Standard (FIPS) 140-2 is a US government standard that defines minimum security requirements for cryptographic modules in information technology products and systems. 
Azure Kubernetes Service (AKS) allows you to create Linux and Windows node pools with FIPS 140-2 enabled. Deployments running on FIPS-enabled node pools can use those cryptographic modules to provide increased security and help meet security controls as part of FedRAMP compliance. For more informa,2026-04-09T06:03:00.000Z,how-to,security,0.7,True,"Page is a how-to for enabling FIPS 140-2 on AKS Linux/Windows node pools, tied to FedRAMP compliance. It likely includes AKS- and OS-specific configuration flags/parameters (for example, node pool creation options, image types, or CLI arguments) that are product-specific security settings rather than generic FIPS concepts, fitting the security sub-skill.",unchanged -https://learn.microsoft.com/en-us/azure/aks/enable-host-encryption,Enable host-based encryption,Enable host-based encryption on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Enable host-based encryption for AKS node VMs,Learn how to configure a host-based encryption in an Azure Kubernetes Service (AKS) cluster.,"With host-based encryption, the data stored on the VM host of your AKS agent nodes' VMs is encrypted at rest and flows encrypted to the Storage service. This means the temp disks are encrypted at rest with platform-managed keys. The cache of OS and data disks is encrypted at rest with either platform-managed keys or customer-managed keys depending on the encryption type set on those disks. By default, when using AKS, OS and data disks use server-side encryption with platform-managed keys. 
The ca",2025-10-16T22:11:00.000Z,how-to,security,0.72,True,Explains how to turn on host-based encryption for AKS nodes with specific configuration options and interactions with disk encryption types.,unchanged +https://learn.microsoft.com/en-us/azure/aks/enable-fips-nodes,FIPS-compliant node pools,Enable Federal Information Processing Standard (FIPS) for Azure Kubernetes Service (AKS) Node Pools - Azure Kubernetes Service,Create FIPS 140-2 compliant AKS node pools,Learn how to enable Federal Information Processing Standard (FIPS) for Azure Kubernetes Service (AKS) node pools.,The Federal Information Processing Standard (FIPS) 140-2 is a US government standard that defines minimum security requirements for cryptographic modules in information technology products and systems. Azure Kubernetes Service (AKS) allows you to create Linux and Windows node pools with FIPS 140-2 enabled. Deployments running on FIPS-enabled node pools can use those cryptographic modules to provide increased security and help meet security controls as part of FedRAMP compliance. For more informa,2026-04-21T06:03:00.000Z,how-to,security,0.7,True,Shows how to enable FIPS 140-2 for AKS node pools; includes product-specific security and compliance configuration steps.,new +https://learn.microsoft.com/en-us/azure/aks/enable-host-encryption,Host-based encryption,Enable host-based encryption on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Enable host-based encryption for AKS node VMs,Learn how to configure a host-based encryption in an Azure Kubernetes Service (AKS) cluster.,"With host-based encryption, the data stored on the VM host of your AKS agent nodes' VMs is encrypted at rest and flows encrypted to the Storage service. This means the temp disks are encrypted at rest with platform-managed keys. The cache of OS and data disks is encrypted at rest with either platform-managed keys or customer-managed keys depending on the encryption type set on those disks. 
By default, when using AKS, OS and data disks use server-side encryption with platform-managed keys. The ca",2025-10-16T22:11:00.000Z,how-to,security,0.7,True,"Explains configuring host-based encryption for AKS nodes, including how temp disks and caches are encrypted with platform or customer-managed keys; product-specific security configuration.",new https://learn.microsoft.com/en-us/azure/aks/enable-keda-existing-cluster,Enable on an existing cluster using the Azure portal,Enable the Kubernetes Event-driven Autoscaling (KEDA) add-on on an existing AKS cluster - Azure Kubernetes Service,Enable KEDA add-on on existing AKS clusters,Use the Azure portal to enable the Kubernetes Event-driven Autoscaling (KEDA) add-on on an existing Azure Kubernetes Service (AKS) cluster.,"Important The KEDA add-on for AKS doesn't currently support modifying the CPU requests or limits and other Helm values for the Metrics Server or Operator. Keep this limitation in mind when using the add-on. If you have any questions, feel free to reach out here. This article shows you how to enable the Kubernetes Event-driven Autoscaling (KEDA) add-on on an existing Azure Kubernetes Service (AKS) cluster using the Azure portal. 
Important Your cluster Kubernetes version determines what KEDA version w",2025-03-14T22:01:00.000Z,how-to,deployment,0.66,True,Portal-based enablement of KEDA on existing clusters; includes AKS-version-to-KEDA-version mapping and portal configuration steps specific to this deployment path.,unchanged +https://learn.microsoft.com/en-us/azure/aks/entra-id-authorization,Use Microsoft Entra ID authorization for the Kubernetes API,Use Microsoft Entra ID authorization for the Kubernetes API in AKS - Azure Kubernetes Service,Authorize AKS Kubernetes API with Entra ID RBAC,Learn how to authorize Kubernetes API access in Azure Kubernetes Service (AKS) using Microsoft Entra ID role assignments and ABAC conditions.,"This article shows how to authorize calls to the Kubernetes API in Azure Kubernetes Service (AKS) using Microsoft Entra ID identities. Entra ID authorization for the Kubernetes API uses Azure RBAC role assignments to grant access to Kubernetes resources. For built-in Kubernetes resources, assign one of the AKS built-in roles (such as Azure Kubernetes Service RBAC Reader) at the cluster or namespace scope. 
For custom resources (CRDs), assign a custom role with Azure ABAC conditions that specify wh",2026-04-23T06:10:00.000Z,how-to,security,0.76,True,"Describes using Microsoft Entra ID for Kubernetes API authorization with Azure RBAC and ABAC; includes specific AKS built-in role names and guidance on assigning roles and conditions, which are product-specific security configurations.",new +https://learn.microsoft.com/en-us/azure/aks/entra-id-control-plane-authentication,Microsoft Entra ID authentication to cluster control plane,Enable Microsoft Entra ID authentication for the AKS control plane - Azure Kubernetes Service,Enable Entra ID authentication for AKS control plane,Learn how to enable Microsoft Entra ID authentication for the Kubernetes API server (control plane) on an Azure Kubernetes Service (AKS) cluster.,"The Microsoft Entra integration simplifies the Microsoft Entra integration process. Previously, you were required to create a client and server app, and the Microsoft Entra tenant had to assign Directory Readers role permissions. Now, the Azure Kubernetes Service (AKS) resource provider manages the client and server apps for you. Cluster administrators can configure Kubernetes role-based access control (Kubernetes RBAC) based on a user's identity or directory group membership. 
Learn more about the",2026-04-23T06:10:00.000Z,how-to,security,0.8,True,"How-to for configuring Microsoft Entra authentication for the AKS API server; will include specific configuration parameters, flags, and identity settings unique to AKS security.",new https://learn.microsoft.com/en-us/azure/aks/events,Monitor Kubernetes object events,Use Kubernetes events for troubleshooting - Azure Kubernetes Service,Use Kubernetes events to troubleshoot AKS clusters,"Learn about Kubernetes events, which provide details on pods, nodes, and other Kubernetes objects.",Deploy and Explore This article shows you how to use Kubernetes events to monitor and troubleshoot issues in your Azure Kubernetes Service (AKS) clusters.,2025-07-01T05:08:00.000Z,how-to,troubleshooting,0.65,True,"How-to for using Kubernetes events for troubleshooting; likely maps event types/reasons to causes and actions in AKS, which is symptom→solution guidance.",unchanged https://learn.microsoft.com/en-us/azure/aks/extended-zones,Deploy a cluster in an Azure Extended Zone,Azure Kubernetes Service (AKS) for Extended Zones (preview) - Azure Kubernetes Service,Deploy AKS clusters in Azure Extended Zones,Learn how to deploy an Azure Kubernetes Service (AKS) for Azure Extended Zone cluster.,"Azure Kubernetes Service (AKS) for Extended Zones provides an extensive and sophisticated set of capabilities that make it simpler to deploy and operate a fully managed Kubernetes cluster in an Extended Zone scenario. Important AKS preview features are available on a self-service, opt-in basis. Previews are provided ""as is"" and ""as available,"" and they're excluded from the service-level agreements and limited warranty. 
AKS previews are partially covered by customer support on a best-effort basis",2025-03-26T22:01:00.000Z,how-to,deployment,0.65,True,"Extended Zones is a preview, AKS-specific deployment scenario; such articles usually include constraints, supported regions/SKUs, and configuration details that are not generally known from training.",unchanged -https://learn.microsoft.com/en-us/azure/aks/external-identity-provider-authentication-configure,Configure external identity providers,Configure External Identity Providers with AKS Structured Authentication (Preview) - Azure Kubernetes Service,Configure external identity providers for AKS structured authentication,Learn how to configure external identity providers for Azure Kubernetes Service (AKS) using structured authentication and JWT authenticators.,"This article shows you how to configure GitHub and Google Identity external identity providers for Azure Kubernetes Service (AKS) control plane authentication using structured authentication. You learn how to create JSON Web Token (JWT) authenticators, configure claim validation and mapping, and test the authentication flow. Important AKS preview features are available on a self-service, opt-in basis. Previews are provided ""as is"" and ""as available,"" and they're excluded from the service-level a",2026-03-19T06:02:00.000Z,how-to,security,0.85,True,"Provides concrete, product-specific security configuration steps for AKS structured authentication: how to define JWT authenticators, configure claim validation and mapping, and wire GitHub/Google Identity into AKS control plane auth. 
These are detailed configuration patterns and parameters unique to AKS external IdP integration, fitting the security sub-skill.",unchanged -https://learn.microsoft.com/en-us/azure/aks/external-identity-provider-authentication-overview,External identity provider authentication overview,Use External Identity Providers with AKS Structured Authentication (Preview) - Azure Kubernetes Service,Understand AKS structured authentication with external IdPs,"Learn how to use external identity providers for Azure Kubernetes Service (AKS) control plane authentication using structured authentication, enabling integration with your organization's existing ide","Azure Kubernetes Service (AKS) supports structured authentication, which allows you to configure external identity providers for authenticating users to the Kubernetes API server. This feature is based on the upstream Kubernetes structured authentication configuration. AKS implements this functionality through JSON Web Token (JWT) authenticators that validate tokens from external identity providers according to your configuration. With structured authentication, organizations can integrate AKS wi",2026-03-19T06:02:00.000Z,overview,security,0.7,True,"Describes AKS-specific structured authentication behavior and how AKS implements upstream Kubernetes structured authentication with JWT authenticators for external identity providers. 
This is product-specific security/auth configuration knowledge (control plane auth model, how tokens are validated) that goes beyond generic identity concepts, but it’s primarily an overview so details are moderate.",unchanged -https://learn.microsoft.com/en-us/azure/aks/faq,FAQ,AKS Frequently Asked Questions,"AKS FAQ with node, pod, and quota limits",Frequently asked questions about Azure Kubernetes Service (AKS),This article provides answers to some of the most common questions about Azure Kubernetes Service (AKS).,2026-04-15T06:34:00Z,faq,limits-quotas,0.7,True,"AKS FAQ pages typically include concrete platform constraints such as maximum node counts, pod density per node, supported Kubernetes version windows, and other numeric service limits that are not inferable from general training data. These are expert, product-specific values that fit the limits-quotas category.",updated +https://learn.microsoft.com/en-us/azure/aks/external-identity-provider-authentication-configure,Set up external identity provider authentication to cluster,Configure External Identity Providers with AKS Structured Authentication (Preview) - Azure Kubernetes Service,Configure external JWT identity providers for AKS,Learn how to configure external identity providers for Azure Kubernetes Service (AKS) using structured authentication and JWT authenticators.,"This article shows you how to configure GitHub and Google Identity external identity providers for Azure Kubernetes Service (AKS) control plane authentication using structured authentication. You learn how to create JSON Web Token (JWT) authenticators, configure claim validation and mapping, and test the authentication flow. Important AKS preview features are available on a self-service, opt-in basis. 
Previews are provided ""as is"" and ""as available,"" and they're excluded from the service-level a",2026-04-23T06:10:00.000Z,how-to,security,0.78,True,"How-to configuration for GitHub and Google Identity as external providers using structured authentication; includes product-specific JWT authenticator settings, claim validation and mapping details that are implementation-specific security configuration.",new +https://learn.microsoft.com/en-us/azure/aks/external-identity-provider-authentication-overview,External identity provider authentication to cluster (overview),Use External Identity Providers with AKS Structured Authentication (Preview) - Azure Kubernetes Service,,"Learn how to use external identity providers for Azure Kubernetes Service (AKS) control plane authentication using structured authentication, enabling integration with your organization's existing ide","Azure Kubernetes Service (AKS) supports structured authentication, which allows you to configure external identity providers for authenticating users to the Kubernetes API server. This feature is based on the upstream Kubernetes structured authentication configuration. AKS implements this functionality through JSON Web Token (JWT) authenticators that validate tokens from external identity providers according to your configuration. 
With structured authentication, organizations can integrate AKS wi",2026-03-19T06:02:00.000Z,overview,,0.4,False,"High-level overview of structured authentication and external identity providers for AKS; primarily conceptual without detailed configuration parameters, limits, or error mappings.",new +https://learn.microsoft.com/en-us/azure/aks/faq,FAQ,AKS Frequently Asked Questions,"AKS FAQ with node, pod, and quota limits",Frequently asked questions about Azure Kubernetes Service (AKS),This article provides answers to some of the most common questions about Azure Kubernetes Service (AKS).,2026-04-15T06:34:00Z,faq,limits-quotas,0.7,True,"AKS FAQ pages typically include concrete platform constraints such as maximum node counts, pod density per node, supported Kubernetes version windows, and other numeric service limits that are not inferable from general training data. These are expert, product-specific values that fit the limits-quotas category.",unchanged
+https://learn.microsoft.com/en-us/azure/aks/flatcar-container-linux-for-aks,Use Flatcar Container Linux for AKS,Flatcar Container Linux for Azure Kubernetes Service (AKS) (preview) Overview - Azure Kubernetes Service,Plan migration off Flatcar Container Linux on AKS,"Learn about Flatcar Container Linux for Azure Kubernetes Service (AKS), including limitations, migration information, and resources to get started.","Important Starting on June 8, 2026, AKS no longer supports Flatcar Container Linux for Azure Kubernetes Service (AKS) (preview). At that point, AKS will no longer produce new Flatcar Container Linux node images or provide security patches, and you'll be unable to create new node pools with Flatcar Container Linux. On September 8, 2026, AKS will remove all existing Flatcar Container Linux node images, causing scaling and remediation (reimage and redeploy) operations to fail. 
To avoid disruption, we",2026-03-10T22:16:00.000Z,overview,decision-making,0.7,True,"The page contains time-bound, product-specific deprecation details (exact support end dates, what operations will fail when images are removed, and required migration actions). These are expert, time-sensitive details not inferable from general training data and are used to decide when and how to move away from Flatcar on AKS. It is not primarily limits, configuration, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/aks/free-standard-pricing-tiers,Pricing tiers for AKS,"Azure Kubernetes Service (AKS) Free, Standard, and Premium Pricing Tiers - Azure Kubernetes Service","Select AKS Free, Standard, or Premium pricing tier","Learn about the Free, Standard, and Premium pricing tiers for Azure Kubernetes Service (AKS) cluster management, including when to use each tier and how to create or update clusters using Azure CLI.","Manage your Azure Kubernetes Service (AKS) clusters using AKS pricing tiers. 
This article explains the differences between these tiers, when to use each tier, and how to create or update AKS clusters using Azure CLI.",2026-01-07T23:08:00.000Z,how-to,decision-making,0.8,True,"Explains differences between AKS pricing tiers and when to use each; likely includes comparison criteria and guidance for choosing tiers, fitting decision-making.",unchanged https://learn.microsoft.com/en-us/azure/aks/generation-2-vms,Use generation 2 VMs,Use Generation 2 (Gen 2) Virtual Machines (VMs) on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Adopt and migrate to Gen2 VMs for AKS node pools,"Learn how to check available Gen 2 VM sizes, create AKS node pools with Gen 2 VMs, migrate from Gen 1 to Gen 2 VMs on AKS, and verify VM generation.","In this article, you learn how to use Generation 2 (Gen 2) virtual machines (VMs) on Azure Kubernetes Service (AKS), including how to check available Gen 2 VM sizes, create AKS node pools with Gen 2 VMs, migrate from Gen 1 to Gen 2 VMs on AKS, and verify the VM generation of your AKS nodes.",2025-10-23T22:11:00.000Z,how-to,decision-making,0.66,True,"Details which VM sizes support Gen2, how to create/migrate node pools, and considerations when moving from Gen1 to Gen2—AKS-specific migration and selection guidance.",unchanged @@ -235,33 +243,33 @@ https://learn.microsoft.com/en-us/azure/aks/github-actions-azure-files-create-in https://learn.microsoft.com/en-us/azure/aks/github-actions-azure-files-deploy-test,Deploy and test GitHub Actions,Deploy and test highly available GitHub Actions on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Deploy highly available GitHub Actions runners on AKS,Learn how to deploy and test highly available GitHub Actions with Azure Files on Azure Kubernetes Service (AKS).,"In this article, you learn how to deploy and test highly available GitHub Actions with Azure Files on Azure Kubernetes Service (AKS).",2025-07-22T05:03:00.000Z,how-to,deployment,0.65,True,Scenario-specific 
deployment of self-hosted GitHub Actions on AKS with Azure Files; includes AKS- and storage-specific configuration details beyond generic CI/CD knowledge.,unchanged https://learn.microsoft.com/en-us/azure/aks/github-actions-azure-files-overview,Overview,Solution overview for deploying highly available GitHub Actions on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,"In this article, we provide an overview of deploying highly available GitHub Actions on Azure Kubernetes Service (AKS) using Azure Files.","In this guide, you deploy a highly available GitHub Actions controller and self-hosted agents running on Azure Kubernetes Service (AKS). The self-hosted runners use SMB Azure file shares for persistent storage. Important Open-source software is mentioned throughout AKS documentation and samples. Software that you deploy is excluded from AKS service-level agreements, limited warranty, and Azure support. As you use open-source technology alongside AKS, consult the support options available from th",2025-07-22T05:03:00.000Z,overview,,0.35,False,"Solution overview for GitHub Actions on AKS; summary indicates high-level architecture and prerequisites, not detailed configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/aks/gpu-health-monitoring,GPU health checking with NPD,GPU health monitoring in Node Problem Detector (NPD) in Azure Kubernetes Service (AKS) nodes - Azure Kubernetes Service,Diagnose GPU node health with NPD in AKS,Learn about how AKS uses Node Problem Detector to expose issues on GPU-enabled nodes.,"This article describes how Azure Kubernetes Service (AKS) uses Node Problem Detector (NPD) to monitor the health of GPU-enabled node pools. 
NPD is a Kubernetes component that detects and reports node-level issues, including hardware faults, driver errors, and connectivity problems that can affect the performance and availability of GPU workloads.",2026-03-10T22:16:00.000Z,how-to,troubleshooting,0.68,True,"The page explains how AKS uses Node Problem Detector specifically for GPU-enabled nodes, which typically includes concrete node-level issue types, GPU-related error conditions, and how they are surfaced for diagnosis. This is product- and scenario-specific troubleshooting knowledge (GPU hardware/driver/connectivity issues on AKS nodes) that goes beyond generic Kubernetes concepts, mapping symptoms on GPU nodes to detection and remediation steps.",unchanged -https://learn.microsoft.com/en-us/azure/aks/gpu-multi-instance,Multi-instance GPU node pool,Create a multi-instance GPU node pool in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,Learn how to create a multi-instance GPU node pool in Azure Kubernetes Service (AKS).,"Certain NVIDIA GPUs can be divided in up to seven independent instances. Each instance has its own Stream Multiprocessor (SM), which is responsible for executing instructions in parallel, and GPU memory. For more information on GPU partitioning, see NVIDIA MIG. This article walks you through how to create a multi-instance GPU node pool using a MIG-compatible VM size in an Azure Kubernetes Service (AKS) cluster.",2026-04-17T22:07:00.000Z,how-to,,0.3,False,"Primarily a procedural guide for creating a multi-instance GPU node pool.
The summary does not indicate detailed configuration tables, limits, or troubleshooting/error mappings; likely a standard step-by-step tutorial rather than expert reference content.",updated +https://learn.microsoft.com/en-us/azure/aks/gpu-multi-instance,Multi-instance GPU node pool,Create a multi-instance GPU node pool in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,Learn how to create a multi-instance GPU node pool in Azure Kubernetes Service (AKS).,"Certain NVIDIA GPUs can be divided in up to seven independent instances. Each instance has its own Stream Multiprocessor (SM), which is responsible for executing instructions in parallel, and GPU memory. For more information on GPU partitioning, see NVIDIA MIG. This article walks you through how to create a multi-instance GPU node pool using a MIG-compatible VM size in an Azure Kubernetes Service (AKS) cluster.",2026-04-17T22:07:00.000Z,how-to,,0.3,False,"Primarily a procedural guide for creating a multi-instance GPU node pool.
The summary does not indicate detailed configuration tables, limits, or troubleshooting/error mappings; likely a standard step-by-step tutorial rather than expert reference content.",unchanged https://learn.microsoft.com/en-us/azure/aks/how-to-apply-fqdn-filtering-policies,Set up FQDN filtering,Set up FQDN filtering feature for Container Network Security in Advanced Container Networking Services (ACNS) - Azure Kubernetes Service,Configure FQDN filtering policies with ACNS on AKS,Get started with FQDN Filtering Feature for Advanced Container Networking Services (ACNS) for your AKS cluster using Azure managed Cilium Network Policies.,This article shows you how to set up Advanced Container Networking Services with Container Network Security feature in AKS clusters.,2026-04-06T22:07:00.000Z,how-to,configuration,0.7,True,"A setup guide for FQDN filtering with Container Network Security and Azure-managed Cilium Network Policies will include specific policy schema fields, configuration parameters, and allowed values for enabling and tuning FQDN filtering on AKS, which are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/aks/how-to-apply-l7-policies,Set up Layer 7 policies,Set up Layer 7(L7) policies with Advanced Container Networking Services (ACNS) - Azure Kubernetes Service,Configure L7 network policies with ACNS on AKS,Get started with Layer 7(L7) Feature for Advanced Container Networking Services (ACNS) for your AKS cluster using Azure managed Cilium Network Policies.,This article demonstrates how to set up L7 policies with Advanced Container Networking Services in AKS clusters. 
Continue only after you have reviewed the limitations and considerations listed on the Layer 7 Policy Overview page.,2026-03-02T23:10:00.000Z,how-to,configuration,0.7,True,"How-to for setting up L7 policies with Azure managed Cilium on AKS will necessarily include product-specific policy objects, fields, and configuration parameters (e.g., CRD names, YAML spec fields, required annotations) that are unique to ACNS and AKS networking. This is concrete configuration guidance rather than just conceptual networking content.",unchanged -https://learn.microsoft.com/en-us/azure/aks/how-to-apply-wireguard,Set up WireGuard encryption,Deploy WireGuard Encryption with Advanced Container Networking Services - Azure Kubernetes Service,Configure WireGuard encryption for ACNS on AKS clusters,Get started with WireGuard Encryption Feature for Advanced Container Networking Services on your AKS cluster.,"Important WireGuard encryption with Advanced Cluster Networking Services is currently in PREVIEW. See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability. This article shows you how to deploy WireGuard encryption with Advanced Container Networking Services in Azure Kubernetes Service (AKS) clusters.",2026-04-08T11:02:00.000Z,how-to,configuration,0.7,True,"A 'how to deploy' feature article for WireGuard encryption in ACNS typically includes AKS- and ACNS-specific configuration steps, such as CLI flags, YAML fields, and parameter names/values required to enable the feature.
These are product-specific configuration details that go beyond generic deployment commands.",unchanged -https://learn.microsoft.com/en-us/azure/aks/how-to-configure-container-network-insights-agent,Install and use the Container Network Insights Agent,Deploy and use Container Network Insights Agent on AKS - Azure Kubernetes Service,Configure and use Container Network Insights Agent on AKS,Learn how to deploy Container Network Insights Agent as an AKS extension to troubleshoot networking issues in Azure Kubernetes Service (AKS) clusters.,"This article shows how to deploy Container Network Insights Agent on your Azure Kubernetes Service (AKS) cluster, configure authentication and identity, and use the agent to troubleshoot networking issues. Container Network Insights Agent is an AI-powered diagnostic assistant that runs as an in-cluster web application. You describe networking problems in natural language, and the agent runs diagnostic commands (kubectl,cilium,hubble) against your cluster. It returns structured, evidence-backed r",2026-04-16T11:03:00.000Z,how-to,configuration,0.7,True,"How-to article for deploying and configuring the agent, including authentication and identity. 
Likely contains product-specific configuration parameters and settings (extension configuration, identity/auth options, possibly CLI/ARM parameters) that go beyond generic deployment steps.",new +https://learn.microsoft.com/en-us/azure/aks/how-to-apply-wireguard,Set up WireGuard encryption,Deploy WireGuard Encryption with Advanced Container Networking Services - Azure Kubernetes Service,Configure WireGuard encryption on AKS with ACNS,Get started with WireGuard Encryption Feature for Advanced Container Networking Services on your AKS cluster.,This article shows you how to deploy WireGuard encryption with Advanced Container Networking Services in Azure Kubernetes Service (AKS) clusters.,2026-04-24T17:17:00.000Z,how-to,security,0.7,True,"How-to deployment/configuration article for WireGuard on AKS; likely includes specific CLI flags, configuration fields, and required settings unique to AKS networking, which are product-specific security configuration details.",updated +https://learn.microsoft.com/en-us/azure/aks/how-to-configure-container-network-insights-agent,Install and use the Container Network Insights Agent,Deploy and use Container Network Insights Agent on AKS - Azure Kubernetes Service,Configure and use Container Network Insights Agent on AKS,Learn how to deploy Container Network Insights Agent as an AKS extension to troubleshoot networking issues in Azure Kubernetes Service (AKS) clusters.,"This article shows how to deploy Container Network Insights Agent on your Azure Kubernetes Service (AKS) cluster, configure authentication and identity, and use the agent to troubleshoot networking issues. Container Network Insights Agent is an AI-powered diagnostic assistant that runs as an in-cluster web application. You describe networking problems in natural language, and the agent runs diagnostic commands (kubectl,cilium,hubble) against your cluster. 
It returns structured, evidence-backed r",2026-04-16T11:03:00.000Z,how-to,configuration,0.7,True,"How-to article for deploying and configuring the agent, including authentication and identity. Likely contains product-specific configuration parameters and settings (extension configuration, identity/auth options, possibly CLI/ARM parameters) that go beyond generic deployment steps.",unchanged https://learn.microsoft.com/en-us/azure/aks/how-to-configure-container-network-logs,Configure Container Network Logs,Set Up Container Network Logs - Azure Kubernetes Service,Set up container network flow logs with Advanced Container Networking,"Learn how to set up container network flow logs with storage with Advanced Container Networking Services (preview) in Azure Kubernetes Service (AKS).""","Important Component renaming (starting November 11, 2025) We are renaming components in the Container Network Logs feature to improve clarity and consistency: What's changing Action items for existing users to enable new naming Update Azure CLI(MUST - First step!): Update Preview CLI Extension(MUST): Disable Monitoring: Re-enable Monitoring: Re-enable ACNS Container Network Logs: Apply new ContainerNetworkLog CRD: Apply your updated CRD configuration with the new naming. 
Reimport Grafana Dashboa",2025-11-18T19:02:00.000Z,how-to,configuration,0.78,True,"Setup guide for ContainerNetworkLog CRD and monitoring; likely includes CRD schema fields, CLI parameters, and storage configuration unique to ACNS.",unchanged https://learn.microsoft.com/en-us/azure/aks/how-to-configure-container-network-metrics-filtering,Configure container network metrics filtering,Configure Container network metrics filtering for Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure container network metrics filtering in AKS with Cilium,Learn how to configure container network metrics filtering to optimize data collection and reduce storage costs in Azure Kubernetes Service (AKS) with Cilium.,"This article shows you how to configure container network metrics filtering for Azure Kubernetes Service (AKS) with Cilium to optimize data collection, reduce storage costs, and focus on the metrics most relevant to your monitoring needs. Configure container network metrics filtering enables dynamic management of Hubble metrics cardinality through Kubernetes Custom Resource Definitions (CRDs). 
This feature allows you to dynamically control the cardinality, dimensions, and targets of Hubble metr",2025-11-08T06:03:00.000Z,how-to,configuration,0.8,True,"Explicitly about configuring metrics filtering via CRDs; likely contains CRD field names, allowed values, and examples for controlling Hubble metrics cardinality and dimensions.",unchanged -https://learn.microsoft.com/en-us/azure/aks/how-to-enable-ebpf-host-routing,Set up eBPF Host routing,Enable eBPF Host Routing with Advanced Container Networking Services - Azure Kubernetes Service,Enable eBPF host routing with ACNS on AKS,Get started with eBPF Host Routing for Advanced Container Networking Services on your AKS cluster.,"Important eBPF Host Routing with Advanced Container Networking Services is currently in PREVIEW. See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability. This article shows you how to enable eBPF Host Routing with Advanced Container Networking Services (ACNS) on Azure Kubernetes Service (AKS) clusters.",2026-01-08T18:15:00.000Z,how-to,configuration,0.7,True,"How-to enable feature article; likely includes AKS CLI flags, required versions, and configuration parameters specific to this preview feature.",unchanged +https://learn.microsoft.com/en-us/azure/aks/how-to-enable-ebpf-host-routing,Set up eBPF Host routing,Enable eBPF Host Routing with Advanced Container Networking Services - Azure Kubernetes Service,Enable eBPF host routing on AKS with ACNS,Get started with eBPF Host Routing for Advanced Container Networking Services on your AKS cluster.,"Important eBPF Host Routing with Advanced Container Networking Services is currently in PREVIEW. See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
This article shows you how to enable eBPF Host Routing with Advanced Container Networking Services (ACNS) on Azure Kubernetes Service (AKS) clusters.",2026-04-20T22:08:00.000Z,how-to,configuration,0.7,True,"Step-by-step enablement article for eBPF host routing; likely contains AKS-specific configuration flags, parameters, and required settings that qualify as expert configuration knowledge.",updated https://learn.microsoft.com/en-us/azure/aks/howto-deploy-java-liberty-app,Build Java app with WebSphere Liberty or Open Liberty,Deploy a Java application with Open Liberty/WebSphere Liberty on an Azure Kubernetes Service (AKS) cluster - Azure Kubernetes Service,,"Deploy a Java application with Open Liberty or WebSphere Liberty on an AKS cluster by using the Azure Marketplace offer, which automatically provisions resources.","This article demonstrates how to: The Open Liberty Operator simplifies the deployment and management of applications running on Kubernetes clusters. With the Open Liberty Operator or WebSphere Liberty Operator, you can also perform more advanced operations, such as gathering traces and dumps. This article uses the Azure Marketplace offer for Open Liberty or WebSphere Liberty to accelerate your journey to AKS. The offer automatically provisions some Azure resources, including: If you prefer manua",2026-03-26T08:00:00.000Z,how-to,,0.2,False,"Tutorial-style deployment of Java Liberty apps to AKS using a Marketplace offer; summary indicates step-by-step guidance but no mention of configuration tables, limits, error codes, or product-specific settings with defaults/ranges.",unchanged https://learn.microsoft.com/en-us/azure/aks/howto-deploy-java-quarkus-app,Build Java app with Quarkus,Deploy Quarkus on Azure Kubernetes Service - Azure Kubernetes Service,,Shows how to quickly stand up Quarkus on Azure Kubernetes Service.,"This article shows you how to quickly deploy Red Hat Quarkus on Azure Kubernetes Service (AKS) with a simple CRUD application. 
The application is a ""to do list"" with a JavaScript front end and a REST endpoint. Azure Database for PostgreSQL Flexible Server provides the persistence layer for the app. The article shows you how to test your app locally and deploy it to AKS.",2026-03-26T08:00:00.000Z,how-to,,0.2,False,"How-to guide for deploying a Quarkus CRUD app on AKS with PostgreSQL; summary suggests a basic tutorial without detailed configuration parameter tables, limits, or specialized troubleshooting guidance.",unchanged https://learn.microsoft.com/en-us/azure/aks/howto-deploy-java-wls-app,Build Java app with WebLogic Server,Deploy WebLogic Server on Azure Kubernetes Service using the Azure portal - Azure Kubernetes Service,,Shows how to quickly stand up WebLogic Server on Azure Kubernetes Service.,"This article demonstrates how to: This article uses the Azure Marketplace offer for WebLogic Server to accelerate your journey to AKS. The offer automatically provisions several Azure resources, including the following resources: Then, the article introduces building an image to update the WebLogic Server cluster. The image provides the application and WDT models.
If you prefer a less automated approach to deploying WebLogic on AKS, see the step-by-step guidance included in the official documentat",2026-03-26T08:00:00.000Z,how-to,,0.2,False,"Walkthrough for deploying WebLogic Server on AKS via Azure portal and Marketplace; description focuses on provisioning resources and building an image, without explicit expert-only configuration matrices, limits, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/aks/http-proxy,Use an HTTP proxy,Configure Azure Kubernetes Service (AKS) nodes with an HTTP proxy - Azure Kubernetes Service,Configure HTTP proxy settings for AKS nodes,Learn how to configure Azure Kubernetes Service (AKS) clusters to use an HTTP proxy for outbound internet access.,"In this article, you learn how to configure Azure Kubernetes Service (AKS) clusters to use an HTTP proxy for outbound internet access. AKS clusters deployed into managed or custom virtual networks have certain outbound dependencies that are necessary to function properly, which created problems in environments requiring internet access to be routed through HTTP proxies. Nodes had no way of bootstrapping the configuration, environment variables, and certificates necessary to access internet servi",2026-03-30T17:11:00.000Z,how-to,configuration,0.78,True,"How-to article for configuring AKS clusters to use an HTTP proxy. These pages typically include node/cluster-level environment variable names (HTTP_PROXY, HTTPS_PROXY, NO_PROXY), AKS-specific configuration flags, and sometimes required endpoint lists or certificate handling steps that are product-specific. 
This is concrete configuration guidance rather than conceptual networking theory.",unchanged -https://learn.microsoft.com/en-us/azure/aks/identity-bindings,Set up identity bindings,Set up identity bindings on Azure Kubernetes Service (AKS) (preview) - Azure Kubernetes Service,Set up identity bindings for AKS workload identity,Learn how to enable and configure identity bindings on AKS to map a user-assigned managed identity (UAMI) across multiple clusters while using a single federated identity credential.,Set up identity bindings on your Azure Kubernetes Service (AKS) clusters to map a user-assigned managed identity (UAMI) across multiple clusters while using a single federated identity credential (FIC). This setup helps you scale Microsoft Entra authentication for workloads without hitting FIC limits.,2026-01-28T18:10:00.000Z,how-to,security,0.76,True,Configuration-focused article on mapping UAMIs across clusters via identity bindings with specific AKS settings.,unchanged -https://learn.microsoft.com/en-us/azure/aks/identity-bindings-concepts,Identity bindings overview,Identity bindings for Azure Kubernetes Service (AKS) (preview) - Azure Kubernetes Service,Understand AKS identity binding limits and scale,Learn about identity bindings on AKS and how they extend Microsoft Entra workload identity for large scale scenarios that exceed federated identity credential limits.,"Identity binding is a preview feature for Azure Kubernetes Service (AKS) that extends the existing workload identity feature to address scale limitations around federated identity credentials (FICs) on user-assigned managed identities (UAMIs). With workload identity for AKS, a single UAMI can't have more than 20 FICs.
Large Kubernetes platform deployments might span more than 20 clusters (each cluster has a unique issuer) or have many combinations that require mapping to t",2026-02-26T08:00:00.000Z,concept-article,limits-quotas,0.82,True,"Explicitly documents the 20 FIC per UAMI limit and how identity bindings address this, which is a concrete product limit.",unchanged -https://learn.microsoft.com/en-us/azure/aks/image-cleaner,Remove vulnerable images with ImageCleaner,Use Image Cleaner on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure Image Cleaner to remove stale AKS images,Learn how to use Image Cleaner to clean up vulnerable stale images on Azure Kubernetes Service (AKS),"It's common to use pipelines to build and deploy images on Azure Kubernetes Service (AKS) clusters. While great for image creation, this process often doesn't account for the stale images left behind and can lead to image bloat on cluster nodes. These images might contain vulnerabilities, which might create security issues. To remove security risks in your clusters, you can clean these unreferenced images. Manually cleaning images can be time intensive. Image Cleaner performs automatic image ide",2024-12-05T23:00:00.000Z,how-to,configuration,0.7,True,"Describes using Image Cleaner; likely includes configuration parameters (intervals, thresholds, labels) and AKS-specific deployment patterns.",unchanged -https://learn.microsoft.com/en-us/azure/aks/image-integrity,Validate signed images with Image Integrity,Use Image Integrity to validate signed images before deploying them to your Azure Kubernetes Service (AKS) clusters (Preview) - Azure Kubernetes Service,Validate signed container images with AKS Image Integrity,Learn how to use Image Integrity to validate signed images before deploying them to your Azure Kubernetes Service (AKS) clusters.,"Azure Kubernetes Service (AKS) and its underlying container model provide increased scalability and manageability for cloud native applications.
With AKS, you can launch flexible software applications according to the runtime needs of your system. However, this flexibility can introduce new challenges. In these application environments, using signed container images helps verify that your deployments are built from a trusted entity and that images haven't been tampered with since their creation.",2025-09-01T08:00:00.000Z,how-to,security,0.75,True,"How-to for Image Integrity; likely includes specific admission configuration, policy settings, and integration details unique to AKS.",unchanged +https://learn.microsoft.com/en-us/azure/aks/identity-bindings,Set up identity bindings for scalable workload identity,Set up identity bindings on Azure Kubernetes Service (AKS) (preview) - Azure Kubernetes Service,Set up AKS identity bindings for workload identity,Learn how to enable and configure identity bindings on AKS to map a user-assigned managed identity (UAMI) across multiple clusters while using a single federated identity credential.,Set up identity bindings on your Azure Kubernetes Service (AKS) clusters to map a user-assigned managed identity (UAMI) across multiple clusters while using a single federated identity credential (FIC).
This setup helps you scale Microsoft Entra authentication for workloads without hitting FIC limits.,2026-01-28T18:10:00.000Z,how-to,security,0.7,True,"How-to guide for configuring identity bindings to map a UAMI across clusters using a single FIC; contains product-specific identity configuration steps, which are security-related.",new +https://learn.microsoft.com/en-us/azure/aks/identity-bindings-concepts,Identity Bindings for scalable workload identity (overview),Identity bindings for Azure Kubernetes Service (AKS) (preview) - Azure Kubernetes Service,Scale AKS workload identity with identity bindings,Learn about identity bindings on AKS and how they extend Microsoft Entra workload identity for large scale scenarios that exceed federated identity credential limits.,"Identity binding is a preview feature for Azure Kubernetes Service (AKS) that extends the existing workload identity feature to address scale limitations around federated identity credentials (FICs) on user-assigned managed identities (UAMIs). With workload identity for AKS, a single UAMI can't have more than 20 FICs. Large Kubernetes platform deployments might span more than 20 clusters (each cluster has a unique issuer) or have many combinations that require mapping to t",2026-02-26T08:00:00.000Z,concept-article,limits-quotas,0.86,True,Explicitly documents a concrete limit (a single UAMI can't have more than 20 federated identity credentials) and how identity bindings address this scale constraint; this is expert limit/quota knowledge.,new +https://learn.microsoft.com/en-us/azure/aks/image-cleaner,Remove vulnerable images with Image Cleaner,Use Image Cleaner on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure and run Image Cleaner to remove stale AKS images,Learn how to use Image Cleaner to clean up vulnerable stale images on Azure Kubernetes Service (AKS),"It's common to use pipelines to build and deploy images on Azure Kubernetes Service (AKS) clusters.
While great for image creation, this process often doesn't account for the stale images left behind and can lead to image bloat on cluster nodes. These images might contain vulnerabilities, which might create security issues. To remove security risks in your clusters, you can clean these unreferenced images. Manually cleaning images can be time intensive. Image Cleaner performs automatic image ide",2024-12-05T23:00:00.000Z,how-to,configuration,0.7,True,"Describes using Image Cleaner on AKS; likely includes configuration options, schedules, labels/annotations, and operational parameters that are specific to this AKS feature.",new +https://learn.microsoft.com/en-us/azure/aks/image-integrity,Verify signed container images at deploy time,Use Image Integrity to validate signed images before deploying them to your Azure Kubernetes Service (AKS) clusters (Preview) - Azure Kubernetes Service,Enforce signed container image validation with AKS Image Integrity,Learn how to use Image Integrity to validate signed images before deploying them to your Azure Kubernetes Service (AKS) clusters.,"Azure Kubernetes Service (AKS) and its underlying container model provide increased scalability and manageability for cloud native applications. With AKS, you can launch flexible software applications according to the runtime needs of your system. However, this flexibility can introduce new challenges. 
In these application environments, using signed container images helps verify that your deployments are built from a trusted entity and that images haven't been tampered with since their creation.",2025-09-01T08:00:00.000Z,how-to,security,0.7,True,"Covers validating signed images before deployment; likely includes policy configuration, admission controls, and verification settings specific to AKS Image Integrity, which are product-specific security configurations.",new https://learn.microsoft.com/en-us/azure/aks/imds-restriction,Restrict access to IMDS,Block pod access to the IMDS endpoint (preview) - Azure Kubernetes Service,Restrict AKS pod access to IMDS endpoint,Learn how to enable IMDS restriction on an AKS cluster to restrict pod access to the IMDS endpoint (preview).,"By default, all pods running in an Azure Kubernetes Service (AKS) cluster can access theAzure Instance Metadata Service (IMDS)endpoint. You can now optionally restrict access to the IMDS endpoint from your Azure Kubernetes Service (AKS) clusters to enhance security (preview). Important AKS preview features are available on a self-service, opt-in basis. Previews are provided ""as is"" and ""as available,"" and they're excluded from the service-level agreements and limited warranty. 
AKS previews are p",2025-05-30T11:08:00.000Z,how-to,security,0.78,True,Explains enabling IMDS restriction with AKS-specific preview settings and flags to control pod access to metadata service.,unchanged https://learn.microsoft.com/en-us/azure/aks/improve-network-fault-tolerance-in-aks-using-tcp-keepalive,Improve network fault tolerance using TCP keep-alive,Improve network fault tolerance in Azure Kubernetes Service using TCP keepalive - Azure Kubernetes Service,Use TCP keepalive to improve AKS network resilience,Learn how to use TCP keepalive to enhance network fault tolerance in cloud applications hosted in Azure Kubernetes Service.,"In a standard Transmission Control Protocol (TCP) connection, no data flows between the peers when the connection is idle. Therefore, applications or API requests that use TCP to communicate with servers handling long-running requests might have to rely on connection timeouts to become aware of the termination or loss of connection. This article illustrates the use of TCP keepalive to enhance fault tolerance in applications hosted in Azure Kubernetes Service (AKS).",2024-12-03T18:01:00.000Z,concept-article,best-practices,0.65,True,"Shows how to use TCP keepalive for fault tolerance; likely includes concrete sysctl values, timeout settings, and AKS-specific recommendations.",unchanged https://learn.microsoft.com/en-us/azure/aks/integrations,"Overview of add-ons, extensions, and integrations","Add-ons, extensions, and other integrations with Azure Kubernetes Service (AKS) - Azure Kubernetes Service",,"Learn about the add-ons, extensions, and open-source integrations you can use with Azure Kubernetes Service (AKS).",Azure Kubernetes Service (AKS) provides extra functionality for your clusters using add-ons and extensions. Open-source projects and third parties provide many more integrations that are commonly used with AKS.
The AKS support policy doesn't support the open-source and third-party integrations.,2025-04-02T22:01:00.000Z,overview,,0.45,False,High-level overview of add-ons and integrations; summary does not indicate detailed config tables or error/limit specifics.,unchanged https://learn.microsoft.com/en-us/azure/aks/internal-lb,Create an internal load balancer,Create an internal load balancer - Azure Kubernetes Service,Create and use internal load balancers in AKS,Learn how to create and use an internal load balancer to expose your services with Azure Kubernetes Service (AKS).,You can create and use an internal load balancer to restrict access to your applications in Azure Kubernetes Service (AKS). An internal load balancer doesn't have a public IP and makes a Kubernetes service accessible only to applications that can reach the private IP. These applications can be within the same virtual network or in another virtual network through virtual network peering. This article shows you how to create and use an internal load balancer with AKS. Important Starting on Septembe,2026-01-28T18:10:00.000Z,how-to,configuration,0.7,True,"How-to for internal load balancers with AKS, including annotations and network requirements; these are concrete configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/aks/intro-aks-automatic,Compare AKS Standard and AKS Automatic,Introduction to Azure Kubernetes Service (AKS) Automatic - Azure Kubernetes Service,,Simplify deployment and management of container-based applications in Azure by learning about the features and benefits of Azure Kubernetes Service Automatic.,"Applies to:✔️ AKS Automatic Azure Kubernetes Service (AKS) Automatic offers an experience that makes the most common tasks on Kubernetes fast and frictionless, while preserving the flexibility, extensibility, and consistency of Kubernetes.
Azure takes care of your cluster setup, including node management, scaling, security, and preconfigured settings that follow AKS well-architected recommendations. Automatic clusters dynamically allocate compute resources based on your specific workload require",2026-04-14T06:03:00.000Z,overview,,0.2,False,"Introductory/marketing-style overview of AKS Automatic features and benefits; no concrete limits, configuration tables, error codes, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/aks/intro-aks-automatic,Compare AKS Standard and AKS Automatic,Introduction to Azure Kubernetes Service (AKS) Automatic - Azure Kubernetes Service,,Simplify deployment and management of container-based applications in Azure by learning about the features and benefits of Azure Kubernetes Service Automatic.,"Applies to:✔️ AKS Automatic Azure Kubernetes Service (AKS) Automatic offers an experience that makes the most common tasks on Kubernetes fast and frictionless, while preserving the flexibility, extensibility, and consistency of Kubernetes. Azure takes care of your cluster setup, including node management, scaling, security, and preconfigured settings that follow AKS well-architected recommendations. Automatic clusters dynamically allocate compute resources based on your specific workload require",2026-04-14T06:03:00.000Z,overview,,0.2,False,"Introductory/marketing-style overview of AKS Automatic features and benefits; no concrete limits, configuration tables, error codes, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/aks/istio-about,About Istio,Istio-based service mesh add-on for Azure Kubernetes Service - Azure Kubernetes Service,,Istio-based service mesh add-on for Azure Kubernetes Service.,Istio addresses the challenges developers and operators face with a distributed or microservices architecture.
The Istio-based service mesh add-on provides an officially supported and tested integration for Azure Kubernetes Service (AKS).,2025-10-17T22:05:00.000Z,concept-article,,0.4,False,High-level description of the Istio-based add-on and its benefits; summary suggests marketing/overview rather than detailed configuration or limits.,unchanged https://learn.microsoft.com/en-us/azure/aks/istio-cni,Istio CNI (Preview),Enable Istio CNI for Istio-based service mesh add-on for Azure Kubernetes Service - Azure Kubernetes Service,Enable Istio CNI for secure Istio add-on workloads on AKS,Enable Istio CNI for enhanced security in Istio-based service mesh add-on for Azure Kubernetes Service,"This article shows you how to enable Istio CNI for the Istio-based service mesh add-on on Azure Kubernetes Service (AKS). Istio CNI improves security by eliminating the need for privileged network capabilities in application workloads within the service mesh. Important AKS preview features are available on a self-service, opt-in basis. Previews are provided ""as is"" and ""as available,"" and they're excluded from the service-level agreements and limited warranty. AKS previews are partially covered ",2025-10-27T17:15:00.000Z,how-to,security,0.7,True,Istio CNI configuration removes privileged capabilities from workloads; article necessarily includes specific security-related settings and constraints for AKS.,unchanged https://learn.microsoft.com/en-us/azure/aks/istio-deploy-addon,Getting started,Deploy Istio-based service mesh add-on for Azure Kubernetes Service - Azure Kubernetes Service,Deploy Istio-based service mesh add-on on AKS clusters,Deploy Istio-based service mesh add-on for Azure Kubernetes Service,"This article shows you how to install the Istio-based service mesh add-on for Azure Kubernetes Service (AKS) cluster. For more information on Istio and the service mesh add-on, see Istio-based service mesh add-on for Azure Kubernetes Service. 
Tip You can use Azure Copilot to help deploy Istio to your AKS clusters in the Azure portal. For more information, see Work with AKS clusters efficiently using Azure Copilot.",2025-09-13T05:11:00.000Z,how-to,deployment,0.65,True,"Installation article for a specific AKS add-on typically includes required cluster versions, flags, and constraints unique to the add-on deployment.",unchanged https://learn.microsoft.com/en-us/azure/aks/istio-deploy-egress,Deploy egress gateways,Azure Kubernetes Service (AKS) egress gateway for Istio service mesh add-on - Azure Kubernetes Service,Deploy and configure egress gateways for AKS Istio add-on,Deploy egress gateways for Istio service mesh add-on for Azure Kubernetes Service,This article shows you how to deploy egress gateways for the Istio service mesh add-on for Azure Kubernetes Service (AKS) cluster.,2025-05-29T22:02:00.000Z,how-to,configuration,0.65,True,Egress gateway setup involves specific Istio and AKS configuration objects and patterns that are not generic knowledge.,unchanged https://learn.microsoft.com/en-us/azure/aks/istio-deploy-ingress,Deploy ingress gateways,Azure Kubernetes Service (AKS) external or internal ingresses for Istio service mesh add-on - Azure Kubernetes Service,Configure external and internal ingress gateways for AKS Istio add-on,Deploy external or internal ingresses for Istio service mesh add-on for Azure Kubernetes Service,"This article shows you how to deploy external or internal ingresses for the Istio service mesh add-on for Azure Kubernetes Service (AKS) cluster. 
Note When you perform a minor revision upgrade of the Istio add-on, another deployment for the external / internal gateways will be created for the new control plane revision.",2026-01-28T23:09:00.000Z,how-to,configuration,0.7,True,"Ingress deployment for the add-on includes specific gateway resources, annotations, and behavior during minor revision upgrades, which are product-specific configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/aks/istio-gateway-api,Ingress with Kubernetes Gateway API,Kubernetes Gateway API Ingress for Istio Service Mesh Add-on for Azure Kubernetes Service (AKS) (preview) - Azure Kubernetes Service,Configure Istio Gateway API ingress on AKS,Configure ingresses for the Istio service mesh add-on for AKS using the Kubernetes Gateway API.,"Important AKS preview features are available on a self-service, opt-in basis. Previews are provided ""as is"" and ""as available,"" and they're excluded from the service-level agreements and limited warranty. AKS previews are partially covered by customer support on a best-effort basis. As such, these features aren't meant for production use. For more information, see the following support articles: The Istio service mesh add-on supports both Istio's own ingress traffic management API and the Kubernet",2026-03-19T11:02:00.000Z,how-to,configuration,0.68,True,"The page describes how to configure ingress for the Istio service mesh add-on on AKS using the Kubernetes Gateway API, including product-specific configuration objects and fields (Gateway, HTTPRoute, listeners, references to Istio add-on behavior). This is concrete, product-specific configuration guidance rather than a conceptual overview, matching the configuration sub-skill. 
It does not primarily focus on limits, troubleshooting, or architecture trade-offs.",unchanged +https://learn.microsoft.com/en-us/azure/aks/istio-gateway-api,Ingress with Kubernetes Gateway API,Kubernetes Gateway API Ingress for Istio Service Mesh Add-on for Azure Kubernetes Service (AKS) (preview) - Azure Kubernetes Service,Configure Istio Gateway API ingress on AKS,Configure ingresses for the Istio service mesh add-on for AKS using the Kubernetes Gateway API.,"Important AKS preview features are available on a self-service, opt-in basis. Previews are provided ""as is"" and ""as available,"" and they're excluded from the service-level agreements and limited warranty. AKS previews are partially covered by customer support on a best-effort basis. As such, these features aren't meant for production use. For more information, see the following support articles: The Istio service mesh add-on supports both Istio's own ingress traffic management API and the Kubernet",2026-04-21T17:10:00.000Z,how-to,configuration,0.68,True,"The page describes how to configure ingress for the Istio service mesh add-on on AKS using the Kubernetes Gateway API, including product-specific resource kinds, fields, and configuration patterns that are unique to this integration. These are concrete configuration details rather than just conceptual explanations, so it best fits the configuration sub-skill.",updated https://learn.microsoft.com/en-us/azure/aks/istio-latency,Latency comparison,Istio service mesh Azure Kubernetes Service add-on latency comparison - Azure Kubernetes Service,Compare latency impact across AKS Istio add-on versions,Istio service mesh Azure Kubernetes Service add-on latency compared across addon versions,"This document elaborates on the data plane latency performance across Istio add-on versions and Kubernetes version. The results evaluate the impact of adding sidecar proxies to the data path, showcasing the P90 and P99 latency difference. 
The comparison measures the difference between traffic routed through the sidecar and traffic sent directly to the pod.",2024-10-18T22:01:00.000Z,concept-article,limits-quotas,0.8,True,"Focuses on P90/P99 latency differences across versions and Kubernetes releases, providing concrete numeric performance data that LLMs would not know from training.",unchanged https://learn.microsoft.com/en-us/azure/aks/istio-meshconfig,Mesh configuration,Configure Istio-based service mesh add-on for Azure Kubernetes Service - Azure Kubernetes Service,Configure MeshConfig and supported settings for Istio add-on on AKS,Configure Istio-based service mesh add-on for Azure Kubernetes Service,"Deploy and Explore Open-source Istio uses MeshConfig to define mesh-wide settings for the Istio service mesh. Istio-based service mesh add-on for AKS builds on top of MeshConfig and classifies different properties as supported, allowed, and blocked. This article walks through how to configure Istio-based service mesh add-on for Azure Kubernetes Service and the support policy applicable for such configuration.",2025-08-23T05:05:00.000Z,how-to,configuration,0.75,True,"Details which MeshConfig properties are supported/allowed/blocked for the AKS Istio add-on, a clear product-specific configuration matrix.",unchanged https://learn.microsoft.com/en-us/azure/aks/istio-metrics-managed-prometheus,Collect metrics in Azure Managed Prometheus,Collect metrics for Istio service mesh add-on workloads for Azure Kubernetes Service in Azure Managed Prometheus - Azure Kubernetes Service,Integrate AKS Istio add-on metrics with Azure Managed Prometheus,Collect metrics for Istio service mesh add-on workloads for Azure Kubernetes Service in Azure Managed Prometheus,This guide explains how to set up and use Azure Managed Prometheus to collect metrics from Istio service mesh add-on workloads on your Azure Kubernetes cluster.,2025-03-05T23:02:00.000Z,how-to,integrations,0.7,True,"Shows how to configure scraping and metrics 
export from Istio workloads to Azure Managed Prometheus, including config parameters and labels specific to this integration.",unchanged @@ -283,23 +291,24 @@ https://learn.microsoft.com/en-us/azure/aks/keda-deploy-add-on-arm,Use an ARM te https://learn.microsoft.com/en-us/azure/aks/keda-deploy-add-on-cli,Use the Azure CLI,Install the Kubernetes Event-driven Autoscaling (KEDA) add-on using the Azure CLI - Azure Kubernetes Service,Install KEDA add-on on AKS using Azure CLI,Use the Azure CLI to deploy the Kubernetes Event-driven Autoscaling (KEDA) add-on to Azure Kubernetes Service (AKS).,"Important The KEDA add-on for AKS doesn't currently support modifying the CPU requests or limits and other Helm values for theMetrics ServerorOperator. Keep this limitation in mind when using the add-on. If you have any questions, feel free to reach outhere. This article shows you how to install the Kubernetes Event-driven Autoscaling (KEDA) add-on to Azure Kubernetes Service (AKS) using the Azure CLI. Important Your cluster Kubernetes version determines what KEDA version will be installed on yo",2024-08-06T17:07:00.000Z,how-to,deployment,0.68,True,"CLI-based deployment of KEDA add-on; includes specific az commands, flags, and version behavior tied to AKS Kubernetes versions.",unchanged https://learn.microsoft.com/en-us/azure/aks/keda-integrations,Kubernetes Event-driven Autoscaler (KEDA) integrations,Integrations with Kubernetes Event-driven Autoscaling (KEDA) on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Use KEDA integrations with Azure and OSS services,Integrations with Kubernetes Event-driven Autoscaling (KEDA) on Azure Kubernetes Service (AKS).,The Kubernetes Event-driven Autoscaling (KEDA) add-on for AKS integrates with features provided by Azure and open-source projects. 
Important The AKS support policy doesn't cover integrations with open-source projects.,2025-07-21T17:11:00.000Z,concept-article,integrations,0.7,True,"Describes KEDA integrations with Azure and open-source projects; expected to list specific scalers, configuration parameters, and integration patterns unique to AKS KEDA add-on.",unchanged https://learn.microsoft.com/en-us/azure/aks/keda-workload-identity,Securely scale your applications using the Kubernetes Event-driven Autoscaling (KEDA) add-on and workload identity,Securely scale your applications using the Kubernetes Event-driven Autoscaling (KEDA) add-on and workload identity - Azure Kubernetes Service,Secure KEDA-based autoscaling on AKS with workload identity,Learn how to securely scale your applications using the KEDA add-on and workload identity on Azure Kubernetes Service (AKS).,"This article shows you how to securely scale your applications with the Kubernetes Event-driven Autoscaling (KEDA) add-on and workload identity on Azure Kubernetes Service (AKS). Important Your cluster Kubernetes version determines what KEDA version will be installed on your AKS cluster. To see which KEDA version maps to each AKS version, see the AKS managed add-ons column of the Kubernetes component version table. 
For GA Kubernetes versions, AKS offers full support of the corresponding KEDA minor ",2024-08-01T20:29:00.000Z,how-to,security,0.74,True,"Focuses on secure scaling using workload identity; likely includes specific Azure AD / workload identity configuration, role assignments, and KEDA identity bindings.",unchanged -https://learn.microsoft.com/en-us/azure/aks/kms-data-encryption,Use KMS platform-managed keys or customer-managed keys,Enable KMS data encryption in Azure Kubernetes Service (AKS) clusters (Preview) - Azure Kubernetes Service,Enable KMS data encryption for AKS secrets,Learn how to enable Key Management Service (KMS) data encryption with platform-managed keys or customer-managed keys in AKS.,"This article shows you how to enable Key Management Service (KMS) data encryption for Kubernetes secrets in Azure Kubernetes Service (AKS). KMS encryption encrypts Kubernetes secrets stored in etcd using Azure Key Vault keys. AKS supports two key management options: For more information about encryption concepts and key options, see Data encryption at rest concepts for AKS. Important AKS preview features are available on a self-service, opt-in basis. Previews are provided ""as is"" and ""as availabl",2025-12-22T23:07:00.000Z,how-to,security,0.76,True,"How-to for enabling KMS encryption with platform-managed or customer-managed keys, including AKS- and Key Vault–specific configuration parameters and options.",unchanged -https://learn.microsoft.com/en-us/azure/aks/kms-data-encryption-concepts,Overview of KMS etcd encryption,Data encryption at rest concepts for Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,Learn about encryption at rest for Kubernetes secrets in Azure Kubernetes Service (AKS) using Azure Key Vault and the KMS provider.,"Azure Kubernetes Service (AKS) stores sensitive data such as Kubernetes secrets in etcd, the distributed key-value store used by Kubernetes. 
For enhanced security and compliance requirements, AKS supports encryption of Kubernetes secrets at rest using the Kubernetes Key Management Service (KMS) provider integrated with Azure Key Vault. This article explains the key concepts, encryption models, and key management options available for protecting Kubernetes secrets at rest in AKS.",2025-12-22T23:07:00.000Z,concept-article,,0.3,False,"Conceptual explanation of encryption at rest and KMS models; no detailed configuration tables, limits, or product-specific parameters.",unchanged -https://learn.microsoft.com/en-us/azure/aks/kms-observability,Observability for KMS etcd encryption,Observability for Azure Kubernetes Service (AKS) Clusters with Key Management Service (KMS) Etcd Encryption (legacy) - Azure Kubernetes Service,Monitor AKS legacy KMS etcd encryption metrics,Learn how to view observability metrics and improve observability for AKS clusters with KMS etcd encryption.,This article shows you how to view observability metrics and improve observability for AKS clusters with KMS etcd encryption.,2025-12-22T08:00:00.000Z,how-to,configuration,0.64,True,"Covers observability metrics and configuration for KMS etcd encryption, including specific metric names and monitoring settings unique to AKS.",unchanged +https://learn.microsoft.com/en-us/azure/aks/kms-data-encryption,Configure KMS encryption at rest of K8s API resources,Enable KMS data encryption in Azure Kubernetes Service (AKS) clusters (Preview) - Azure Kubernetes Service,Enable KMS-based secret encryption for AKS clusters,Learn how to enable Key Management Service (KMS) data encryption with platform-managed keys or customer-managed keys in AKS.,"This article shows you how to enable Key Management Service (KMS) data encryption for Kubernetes secrets in Azure Kubernetes Service (AKS). KMS encryption encrypts Kubernetes secrets stored in etcd using Azure Key Vault keys. 
AKS supports two key management options: For more information about encryption concepts and key options, see Data encryption at rest concepts for AKS. Important AKS preview features are available on a self-service, opt-in basis. Previews are provided ""as is"" and ""as availabl",2025-12-22T23:07:00.000Z,how-to,security,0.74,True,Step-by-step configuration of KMS data encryption with platform-managed or customer-managed keys; includes product-specific security and key management settings.,new +https://learn.microsoft.com/en-us/azure/aks/kms-data-encryption-concepts,KMS encryption at rest of K8s API resources (overview),Data encryption at rest concepts for Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,Learn about encryption at rest for Kubernetes secrets in Azure Kubernetes Service (AKS) using Azure Key Vault and the KMS provider.,"Azure Kubernetes Service (AKS) stores sensitive data such as Kubernetes secrets in etcd, the distributed key-value store used by Kubernetes. For enhanced security and compliance requirements, AKS supports encryption of Kubernetes secrets at rest using the Kubernetes Key Management Service (KMS) provider integrated with Azure Key Vault. 
This article explains the key concepts, encryption models, and key management options available for protecting Kubernetes secrets at rest in AKS.",2025-12-22T23:07:00.000Z,concept-article,,0.45,False,"Conceptual overview of data encryption at rest and KMS provider integration; does not emphasize concrete configuration parameters, limits, or decision matrices.",new +https://learn.microsoft.com/en-us/azure/aks/kms-observability,KMS observability (legacy),Observability for Azure Kubernetes Service (AKS) Clusters with Key Management Service (KMS) Etcd Encryption (legacy) - Azure Kubernetes Service,Monitor and troubleshoot AKS KMS etcd encryption,Learn how to view observability metrics and improve observability for AKS clusters with KMS etcd encryption.,This article shows you how to view observability metrics and improve observability for AKS clusters with KMS etcd encryption.,2025-12-22T08:00:00.000Z,how-to,troubleshooting,0.62,True,"Focuses on observability metrics and how to improve observability for KMS etcd encryption; likely includes specific metrics, their meanings, and actions, which are troubleshooting/diagnostic patterns unique to this feature.",new https://learn.microsoft.com/en-us/azure/aks/kubelet-logs,View kubelet logs,View Kubelet Logs from AKS Nodes - Azure Kubernetes Service,Collect and view kubelet logs from AKS nodes,Learn how to view troubleshooting information in the kubelet logs from Azure Kubernetes Service (AKS) nodes.,"Deploy and Explore You might need to review logs to troubleshoot a problem in your Azure Kubernetes Service (AKS) cluster. You can use tools in the Azure portal to view logs for AKS main components and cluster containers. Occasionally, you might need to get kubelet logs from AKS nodes to help you troubleshoot an issue. This article shows you how to use journalctl to view kubelet logs on an AKS node. 
Alternatively, you can collect kubelet logs by using the syslog collection feature in Container insights ",2025-08-23T05:05:00.000Z,how-to,troubleshooting,0.7,True,Shows how to use journalctl and Container insights syslog collection to get kubelet logs; includes specific commands and log locations for AKS nodes.,unchanged -https://learn.microsoft.com/en-us/azure/aks/kubelogin-authentication,Azure kubelogin,Use kubelogin to authenticate in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Authenticate to AKS using kubelogin and Entra ID,Learn how to use the kubelogin plugin for all Microsoft Entra authentication methods in Azure Kubernetes Service (AKS).,"The kubelogin plugin in Azure is a client-go credential plugin that implements Microsoft Entra authentication. The kubelogin plugin offers features that aren't available in the kubectl command-line tool. For more information, see the kubelogin introduction and the kubectl introduction. This article provides an overview and examples of how to use kubelogin for all supported Microsoft Entra authentication methods in AKS.",2025-12-12T23:08:00.000Z,how-to,security,0.8,True,"Shows kubelogin usage for multiple Entra auth methods; likely includes specific command flags, environment variables, and AKS kubeconfig integration details.",unchanged +https://learn.microsoft.com/en-us/azure/aks/kubelogin-authentication,Authenticate with kubelogin,Use kubelogin to authenticate in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Use kubelogin for Entra-based AKS authentication,Learn how to use the kubelogin plugin for all Microsoft Entra authentication methods in Azure Kubernetes Service (AKS).,"The kubelogin plugin in Azure is a client-go credential plugin that implements Microsoft Entra authentication. The kubelogin plugin offers features that aren't available in the kubectl command-line tool. For more information, see the kubelogin introduction and the kubectl introduction. 
This article provides an overview and examples of how to use kubelogin for all supported Microsoft Entra authentication methods in AKS.",2025-12-12T23:08:00.000Z,how-to,security,0.75,True,"Provides concrete examples of kubelogin usage for AKS; includes command options, modes, and parameters specific to Entra authentication with AKS, which are product-specific security/auth configuration details.",new https://learn.microsoft.com/en-us/azure/aks/kubernetes-action,GitHub Actions for Kubernetes,"Build, test, and deploy containers to Azure Kubernetes Service (AKS) using GitHub Actions - Azure Kubernetes Service",Use GitHub Actions to build and deploy to AKS,"Learn how to use GitHub Actions to build, test, and deploy containers to Azure Kubernetes Service (AKS).",GitHub Actions gives you the flexibility to build an automated software development lifecycle workflow. You can use multiple Kubernetes actions to deploy to containers from Azure Container Registry (ACR) to Azure Kubernetes Service (AKS) with GitHub Actions.,2024-08-01T20:29:00.000Z,how-to,deployment,0.8,True,Includes GitHub Actions workflow configuration for building and deploying containers from ACR to AKS; AKS-specific deployment automation details.,unchanged https://learn.microsoft.com/en-us/azure/aks/kubernetes-center-azure-portal,Create and manage AKS resources in the Azure portal with Kubernetes center,Create and Manage Kubernetes resources in the Azure portal with Kubernetes Center (preview) - Azure Kubernetes Service,,"Learn how to create and manage your Kubernetes resources directly in the Azure portal using Kubernetes Center (preview), which centralizes creation, management, and monitoring of your Kubernetes clust","Kubernetes Center (preview) in the Azure portal centralizes and simplifies AKS resource creation and management. 
This feature provides a unified experience across all your AKS resources and delivers intelligent insights and streamlined workflows, allowing platform teams to maintain control while enabling developers to move quickly and confidently. This article shows you how to get started with Kubernetes Center (preview) in the Azure portal, including how to access it, explore its features, and ",2025-11-18T16:04:00.000Z,how-to,,0.4,False,"Kubernetes Center overview; describes portal experience and workflows, but summary does not show detailed configuration parameters or limits.",unchanged https://learn.microsoft.com/en-us/azure/aks/kubernetes-helm,Install existing applications with Helm,Install existing applications with Helm in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Install and use Helm to deploy apps on AKS,Learn how to use the Helm packaging tool to deploy containers in an Azure Kubernetes Service (AKS) cluster,"Helm is an open-source packaging tool that helps you install and manage the lifecycle of Kubernetes applications. Similar to Linux package managers, such as APT and Yum, you can use Helm to manage Kubernetes charts, which are packages of preconfigured Kubernetes resources. 
This article shows you how to configure and use Helm in a Kubernetes cluster on Azure Kubernetes Service (AKS).",2024-08-01T20:29:00.000Z,how-to,deployment,0.65,True,Shows AKS-specific steps to configure Helm and deploy charts; includes concrete commands and configuration beyond generic Helm concepts.,unchanged https://learn.microsoft.com/en-us/azure/aks/kubernetes-portal,Azure portal Kubernetes resource view,Access Kubernetes Resources using the Azure Portal - Azure Kubernetes Service,Manage AKS Kubernetes resources through Azure portal UI,Learn how to access and manage your Azure Kubernetes Service (AKS) resources using the Azure portal.,"In this article, you learn how to access and manage your Azure Kubernetes Service (AKS) resources using the Azure portal.",2026-01-26T23:08:00.000Z,how-to,configuration,0.6,True,"Portal-based AKS resource management typically documents specific blades, actions, and constraints unique to AKS integration in the portal, which are product-specific configuration/management details.",unchanged -https://learn.microsoft.com/en-us/azure/aks/kubernetes-service-principal,Create a service principal,Use a Service Principal with Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Use Microsoft Entra service principals with AKS,Learn how to create and manage a Microsoft Entra service principal with a cluster in Azure Kubernetes Service (AKS).,"Azure Kubernetes Service (AKS) clusters require either a Microsoft Entra service principal or a managed identity to dynamically create and manage other Azure resources. This article describes how to create a Microsoft Entra service principal and use it with your AKS cluster. Note For optimal security and ease of use, we recommend using managed identities instead of service principals to authorize access from an AKS cluster to other resources in Azure. 
A managed identity is a special type of service ",2026-01-22T23:12:00.000Z,how-to,security,0.8,True,"How-to for creating and attaching service principals to AKS; includes specific role assignments, scopes, and AKS-specific identity behavior.",unchanged +https://learn.microsoft.com/en-us/azure/aks/kubernetes-rbac-entra-id,Use Kubernetes RBAC with Microsoft Entra integration,Use Microsoft Entra ID and Kubernetes RBAC for clusters - Azure Kubernetes Service,Use Entra groups with Kubernetes RBAC in AKS,Learn how to use Microsoft Entra group membership to restrict access to cluster resources using Kubernetes role-based access control (Kubernetes RBAC) in Azure Kubernetes Service (AKS).,"Azure Kubernetes Service (AKS) can be configured to use Microsoft Entra ID for user authentication. In this configuration, you sign in to an AKS cluster using a Microsoft Entra authentication token. Once authenticated, you can use the built-in Kubernetes role-based access control (RBAC) to manage access to namespaces and cluster resources based on a user's identity or group membership. This article shows you how to: Control access using Kubernetes RBAC in an AKS cluster based on Microsoft Entra ",2026-04-23T06:10:00.000Z,how-to,security,0.7,True,"Shows how to wire Entra group membership into Kubernetes RBAC for AKS; includes concrete RBAC role/clusterrole and binding patterns tied to Entra groups, which are product-specific security configuration details.",new +https://learn.microsoft.com/en-us/azure/aks/kubernetes-service-principal,Service principal as cluster identity,Use a Service Principal with Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure Entra service principals for AKS clusters,Learn how to create and manage a Microsoft Entra service principal with a cluster in Azure Kubernetes Service (AKS).,"Azure Kubernetes Service (AKS) clusters require either a Microsoft Entra service principal or a managed identity to dynamically create and manage other Azure resources. 
This article describes how to create a Microsoft Entra service principal and use it with your AKS cluster. Note For optimal security and ease of use, we recommend using managed identities instead of service principals to authorize access from an AKS cluster to other resources in Azure. A managed identity is a special type of service ",2026-01-22T23:12:00.000Z,how-to,security,0.7,True,"Explains creating and attaching a Microsoft Entra service principal to AKS; includes product-specific identity and role assignment configuration steps, which fall under security configuration.",new https://learn.microsoft.com/en-us/azure/aks/kueue-overview,Learn about Kueue for batch scheduling,Install Kueue on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Install and configure Kueue for batch on AKS,"Install and configure Kueue on your Azure Kubernetes Service (AKS) cluster, including enabling advanced features and verifying deployment.","In this article, you learn how to install and configure Kueue to schedule batch workloads on an Azure Kubernetes Service (AKS) cluster. You also explore different Kueue concepts, installation methods to enable advanced Kueue features, and learn how to verify your deployments. Important Open-source software is mentioned throughout AKS documentation and samples. Software that you deploy is excluded from AKS service-level agreements, limited warranty, and Azure support. 
As you use open-source techn",2025-11-04T05:34:00.000Z,how-to,configuration,0.7,True,"Covers installation methods and enabling advanced Kueue features; likely includes CRD versions, Helm values, and AKS-specific setup steps.",unchanged https://learn.microsoft.com/en-us/azure/aks/learn/quick-flatcar-deploy-arm-template,Use ARM template,Quickstart: Deploy an Azure Kubernetes Service (AKS) cluster with Flatcar Container Linux for AKS (preview) using an ARM template - Azure Kubernetes Service,,Learn how to quickly deploy a Kubernetes cluster with Flatcar Container Linux for AKS (preview) using an Azure Resource Manager template and deploy an application in Azure Kubernetes Service (AKS).,"Azure Kubernetes Service (AKS) is a managed Kubernetes service that lets you quickly deploy and manage clusters. In this quickstart, you: An Azure Resource Manager template is a JavaScript Object Notation (JSON) file that defines the infrastructure and configuration for your project. The template uses declarative syntax. You describe your intended deployment without writing the sequence of programming commands to create the deployment. Note To get started with quickly provisioning an AKS cluster, ",2025-11-10T18:12:00.000Z,quickstart,,0.4,False,"Quickstart for AKS with Flatcar using ARM template; explains template usage with one scenario, not a full configuration matrix.",unchanged https://learn.microsoft.com/en-us/azure/aks/learn/quick-flatcar-deploy-cli,Use the Azure CLI,Quickstart: Deploy an Azure Kubernetes Service (AKS) cluster with Flatcar Container Linux for AKS (preview) using Azure CLI - Azure Kubernetes Service,,Learn how to deploy an Azure Kubernetes Service cluster (AKS) with Flatcar Container Linux for AKS (preview) and deploy an application using Azure CLI.,"Azure Kubernetes Service (AKS) is a managed Kubernetes service that lets you quickly deploy and manage clusters. 
In this quickstart, you learn how to: Note This article includes steps to deploy a cluster with default settings for evaluation purposes only. Before you deploy a production-ready cluster, we recommend that you familiarize yourself with our baseline reference architecture to consider how it aligns with your business requirements.",2025-11-10T18:12:00.000Z,quickstart,,0.4,False,"Quickstart for AKS with Flatcar using CLI; preview note and default deployment steps, not a configuration or limits reference.",unchanged https://learn.microsoft.com/en-us/azure/aks/learn/quick-kubernetes-deploy-azd,Use the Azure Developer CLI,Quickstart: Deploy an Azure Kubernetes Service (AKS) cluster using the Azure Developer CLI - Azure Kubernetes Service,,Learn how to quickly deploy a Kubernetes cluster and deploy an application in Azure Kubernetes Service (AKS) using the Azure Developer CLI.,"Azure Kubernetes Service (AKS) is a managed Kubernetes service that lets you quickly deploy and manage clusters. In this quickstart, you learn to: Note To get started with quickly provisioning an AKS cluster, this article includes steps to deploy a cluster with default settings for evaluation purposes only. 
Before deploying a production-ready cluster, we recommend that you familiarize yourself with our baseline reference architecture to consider how it aligns with your business requirements.",2024-08-01T20:29:00.000Z,quickstart,,0.3,False,"Quickstart for deploying AKS via Azure Developer CLI; uses defaults and focuses on workflow, not expert configuration or decision guidance.",unchanged https://learn.microsoft.com/en-us/azure/aks/learn/quick-kubernetes-deploy-bicep,Use Bicep,Quickstart: Deploy an Azure Kubernetes Service (AKS) cluster using Bicep - Azure Kubernetes Service,,Learn how to quickly deploy a Kubernetes cluster using a Bicep file and deploy an application in Azure Kubernetes Service (AKS).,"Azure Kubernetes Service (AKS) is a managed Kubernetes service that lets you quickly deploy and manage clusters. In this quickstart, you: Note To get started with quickly provisioning an AKS cluster, this article includes steps to deploy a cluster with default settings for evaluation purposes only. Before deploying a production-ready cluster, we recommend that you familiarize yourself with our baseline reference architecture to consider how it aligns with your business requirements.",2024-08-01T20:29:00.000Z,quickstart,,0.4,False,"Quickstart using Bicep; focuses on deploying with a template and defaults, not on enumerating configuration options or limits.",unchanged https://learn.microsoft.com/en-us/azure/aks/learn/quick-kubernetes-deploy-bicep-kubernetes-extension,Use Bicep Kubernetes extension,Deploy an AKS cluster using the Bicep Kubernetes extension - Azure Kubernetes Service,,Learn how to quickly deploy a Kubernetes cluster using the Bicep Kubernetes extension and deploy an application in Azure Kubernetes Service (AKS).,"Azure Kubernetes Service (AKS) is a managed Kubernetes service that lets you quickly deploy and manage clusters. In this quickstart, you: Important The Bicep Kubernetes extension is currently in preview.
You can enable the feature from the Bicep configuration file by adding: Note To get started with quickly provisioning an AKS cluster, this article includes steps to deploy a cluster with default settings for evaluation purposes only. Before deploying a production-ready cluster, we recommend that y",2024-08-06T22:01:00.000Z,quickstart,,0.4,False,"Quickstart using Bicep Kubernetes extension; preview note and basic deployment steps, but summary does not indicate detailed parameter tables or constraints.",unchanged -https://learn.microsoft.com/en-us/azure/aks/learn/quick-kubernetes-deploy-cli,Use the Azure CLI,Deploy an Azure Kubernetes Service (AKS) Cluster Using Azure CLI - Azure Kubernetes Service,,Learn how to deploy an Azure Kubernetes Service cluster (AKS) with default settings using Azure CLI and deploy a multi-container application.,"Azure Kubernetes Service (AKS) is a managed Kubernetes service that lets you quickly deploy and manage clusters. In this quickstart, you learn how to: Note This article includes steps to deploy a cluster with default settings for evaluation purposes only. Before you deploy a production-ready cluster, we recommend that you familiarize yourself with ourbaseline reference architectureto consider how it aligns with your business requirements.
If you only want to deploy an Azure Kubernetes Service cl",2026-03-13T22:09:00.000Z,quickstart,,0.2,False,"Quickstart tutorial for creating an AKS cluster and deploying an app via Azure CLI with default settings; it does not focus on limits, configuration matrices, error codes, or product-specific best practices beyond generic deployment steps.",unchanged +https://learn.microsoft.com/en-us/azure/aks/learn/quick-kubernetes-deploy-cli,Use the Azure CLI,Deploy an Azure Kubernetes Service (AKS) Cluster Using Azure CLI - Azure Kubernetes Service,,Learn how to deploy an Azure Kubernetes Service cluster (AKS) with default settings using Azure CLI and deploy a multi-container application.,"Azure Kubernetes Service (AKS) is a managed Kubernetes service that lets you quickly deploy and manage clusters. In this quickstart, you learn how to: Note This article includes steps to deploy a cluster for evaluation purposes only. Before you deploy a production-ready cluster, we recommend that you familiarize yourself with our baseline reference architecture to consider how it aligns with your business requirements. If you only want to deploy an Azure Kubernetes Service cluster, select Deploy to",2026-04-22T22:13:00.000Z,quickstart,,0.2,False,"Quickstart tutorial for creating an AKS cluster and deploying an app via Azure CLI. It shows basic commands and default settings but does not provide configuration parameter tables, limits, quotas, error-code-based troubleshooting, or decision matrices. 
Content is procedural, not expert reference material.",updated https://learn.microsoft.com/en-us/azure/aks/learn/quick-kubernetes-deploy-portal,Use the Azure portal,Quickstart: Deploy an Azure Kubernetes Service (AKS) cluster using the Azure portal - Azure Kubernetes Service,,Learn how to quickly deploy a Kubernetes cluster and deploy an application in Azure Kubernetes Service (AKS) using the Azure portal.,"Azure Kubernetes Service (AKS) is a managed Kubernetes service that lets you quickly deploy and manage clusters. In this quickstart, you: Note To get started with quickly provisioning an AKS cluster, this article includes steps to deploy a cluster with default settings for evaluation purposes only. Before deploying a production-ready cluster, we recommend that you familiarize yourself with our baseline reference architecture to consider how it aligns with your business requirements.",2025-12-21T12:02:00.000Z,quickstart,,0.3,False,Quickstart for deploying AKS via portal; basic deployment steps without detailed expert-level configuration or quotas.,unchanged https://learn.microsoft.com/en-us/azure/aks/learn/quick-kubernetes-deploy-powershell,Use Azure PowerShell,Quickstart: Deploy an Azure Kubernetes Service (AKS) cluster using Azure PowerShell - Azure Kubernetes Service,,Learn how to quickly deploy a Kubernetes cluster and deploy an application in Azure Kubernetes Service (AKS) using PowerShell.,"Azure Kubernetes Service (AKS) is a managed Kubernetes service that lets you quickly deploy and manage clusters. In this quickstart, you: Note To get started with quickly provisioning an AKS cluster, this article includes steps to deploy a cluster with default settings for evaluation purposes only.
Before deploying a production-ready cluster, we recommend that you familiarize yourself with our baseline reference architecture to consider how it aligns with your business requirements.",2024-12-12T23:02:00.000Z,quickstart,,0.3,False,"Quickstart for deploying AKS via PowerShell; evaluation-focused defaults, not deep configuration or limits.",unchanged https://learn.microsoft.com/en-us/azure/aks/learn/quick-kubernetes-deploy-rm-template,Use ARM template,Quickstart: Deploy an Azure Kubernetes Service (AKS) cluster using an ARM template - Azure Kubernetes Service,,Learn how to quickly deploy a Kubernetes cluster using an Azure Resource Manager template and deploy an application in Azure Kubernetes Service (AKS).,"Azure Kubernetes Service (AKS) is a managed Kubernetes service that lets you quickly deploy and manage clusters. In this quickstart, you: An Azure Resource Manager template is a JavaScript Object Notation (JSON) file that defines the infrastructure and configuration for your project. The template uses declarative syntax. You describe your intended deployment without writing the sequence of programming commands to create the deployment. Note To get started with quickly provisioning an AKS cluster, ",2024-08-01T20:29:00.000Z,quickstart,,0.4,False,"Quickstart using ARM template; explains template conceptually and shows one deployment example, not a full configuration reference.",unchanged @@ -310,15 +319,13 @@ https://learn.microsoft.com/en-us/azure/aks/learn/quick-windows-container-deploy https://learn.microsoft.com/en-us/azure/aks/learn/quick-windows-container-deploy-terraform,Use Terraform,Quickstart: Create a Windows-based Azure Kubernetes Service (AKS) cluster using Terraform - Azure Kubernetes Service,,"In this quickstart, you create an Azure Kubernetes cluster with a default node pool and a separate Windows node pool.","In this quickstart, you create an Azure Kubernetes cluster with a Windows node pool using Terraform. 
Azure Kubernetes Service (AKS) is a managed container orchestration service provided by Azure. It simplifies the deployment, scaling, and operations of containerized applications. The service uses Kubernetes, an open-source system for automating the deployment, scaling, and management of containerized applications. The Windows node pool allows you to run Windows containers in your Kubernetes clus",2024-08-01T20:29:00.000Z,quickstart,,0.4,False,"Quickstart for Windows-based AKS cluster using Terraform; focuses on creating a cluster and Windows node pool, not on enumerating limits or configuration options.",unchanged https://learn.microsoft.com/en-us/azure/aks/limit-egress-traffic,Restrict and control cluster egress traffic,Limit Network Traffic with Azure Firewall in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Limit AKS egress traffic using Azure Firewall,Learn how to control egress traffic with Azure Firewall to set restrictions for outbound network connections in AKS clusters.,"This article shows you how to use the outbound network and fully qualified domain name (FQDN) rules for AKS clusters to control egress traffic using Azure Firewall. 
To simplify this configuration, Azure Firewall provides an Azure Kubernetes Service (AzureKubernetesService) FQDN tag that restricts outbound traffic from the AKS cluster.",2026-01-06T23:11:00.000Z,how-to,configuration,0.7,True,Uses AKS outbound/FQDN rules and the AzureKubernetesService FQDN tag; this is specific configuration knowledge about tags and rule setup.,unchanged https://learn.microsoft.com/en-us/azure/aks/load-balancer-standard,Create a Load Balancer Service,Use a Public Standard Load Balancer in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Use public Standard Load Balancer with AKS,Learn how to use a public load balancer with a Standard SKU to expose your services with Azure Kubernetes Service (AKS).,"The Azure Load Balancer operates at layer 4 of the Open Systems Interconnection (OSI) model that supports both inbound and outbound scenarios. It distributes inbound flows that arrive at the load balancer's front end to the back end pool instances. A public load balancer integrated with AKS serves two purposes: This article covers integration with a public load balancer on AKS. For internal load balancer integration, see Use an internal load balancer in AKS.",2025-11-04T23:06:00.000Z,how-to,configuration,0.7,True,Describes integration of Standard Load Balancer with AKS; includes AKS-specific behavior and configuration options.,unchanged +https://learn.microsoft.com/en-us/azure/aks/local-accounts,Disable local cluster admin accounts,Manage local accounts with Microsoft Entra integration - Azure Kubernetes Service,Manage AKS local admin accounts with Entra integration,Learn how to manage local accounts when integrating Microsoft Entra ID in your Azure Kubernetes Service (AKS) clusters.,"When you deploy an AKS cluster, local accounts are enabled by default. Even when you enable RBAC or Microsoft Entra integration, --admin access still exists as a non-auditable backdoor option. 
This article shows you how to disable local accounts on an existing cluster, create a new cluster with local accounts disabled, and re-enable local accounts on existing clusters.",2026-04-23T06:10:00.000Z,how-to,security,0.8,True,"Operational guide for disabling/enabling local accounts on AKS clusters; includes specific flags, API fields, and configuration options that control local admin access, which are product-specific security settings.",new https://learn.microsoft.com/en-us/azure/aks/localdns-custom,Configure LocalDNS,Configure LocalDNS in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure LocalDNS for faster AKS DNS resolution,Learn how to improve your Domain Name System (DNS) resolution performance and resiliency in AKS using localDNS.,"LocalDNS is a feature in Azure Kubernetes Service (AKS) designed to enhance the Domain Name System (DNS) resolution performance and resiliency for workloads running in your cluster. When you deploy a DNS proxy on each node, LocalDNS reduces DNS query latency, improves reliability during network disruptions, and provides advanced configuration options for DNS caching and forwarding. This article explains how LocalDNS works, its configuration options, and how to enable, verify, and troubleshoot Lo",2026-02-27T18:12:00.000Z,how-to,configuration,0.76,True,"Describes LocalDNS feature, configuration options, and enable/verify steps; includes specific AKS settings and deployment patterns.",unchanged https://learn.microsoft.com/en-us/azure/aks/long-term-support,Long-term support,Long-term support for Azure Kubernetes Service (AKS) versions - Azure Kubernetes Service,Use long-term support channels for AKS Kubernetes versions,Learn about Azure Kubernetes Service (AKS) long-term support for Kubernetes,"The Kubernetes community releases a new minor version approximately every four months, with a support window for each version for one year. 
In Azure Kubernetes Service (AKS), this support window is called community support. AKS supports versions of Kubernetes that are within this community support window to push bug fixes and security updates from community releases. While the community support release cadence provides benefits, it requires that you keep up to date with Kubernetes releases, which c",2025-08-28T22:06:00.000Z,concept-article,limits-quotas,0.8,True,"Describes extended support windows and version lifecycles beyond community support, with concrete timelines and constraints that are effectively support limits.",unchanged https://learn.microsoft.com/en-us/azure/aks/manage-abort-operations,Abort long running operations,Abort an AKS long-running operation - Azure Kubernetes Service,Abort long-running AKS cluster operations via API,Learn how to terminate a long running operation on an Azure Kubernetes Service cluster at the node pool or cluster level.,"Sometimes deployment or other processes running within pods on nodes in a cluster can run for periods of time longer than expected due to various reasons. You can get insight into the progress of any ongoing operation, such as create, upgrade, and scale, using any preview API version after 2024-01-02-preview using the following az rest command: This command provides you with a percentage that indicates how close the operation is to completion. 
You can use this method to get these insights for up t",2024-08-01T20:29:00.000Z,how-to,configuration,0.7,True,"Describes use of specific preview API versions and az rest invocation pattern to inspect and abort AKS operations, including operation status fields and behavior, which are AKS-specific operational details.",unchanged -https://learn.microsoft.com/en-us/azure/aks/manage-azure-rbac,Use Azure RBAC for Kubernetes authorization,Use Azure RBAC for Kubernetes Authorization - Azure Kubernetes Service,Configure Azure RBAC for Kubernetes authorization in AKS,Learn how to use Azure role-based access control (Azure RBAC) for Kubernetes Authorization with Azure Kubernetes Service (AKS).,"This article covers how to use Azure RBAC for Kubernetes Authorization, which allows for the unified management and access control across Azure resources, AKS, and Kubernetes resources. For more information, seeAzure RBAC for Kubernetes Authorization. Note When usingintegrated authentication between Microsoft Entra ID and AKS, you can use Microsoft Entra users, groups, or service principals as subjects inKubernetes role-based access control (Kubernetes RBAC). With this feature, you don't need to",2024-11-08T08:00:00.000Z,how-to,security,0.8,True,"Contains specific Azure role definitions, scopes, and AKS configuration to unify Azure RBAC with Kubernetes authorization.",unchanged -https://learn.microsoft.com/en-us/azure/aks/manage-local-accounts-managed-azure-ad,Manage local accounts,Manage local accounts with AKS-managed Microsoft Entra integration - Azure Kubernetes Service,Manage AKS local accounts with Entra integration,Learn how to managed local accounts when integrating Microsoft Entra ID in your Azure Kubernetes Service (AKS) clusters.,"When you deploy an AKS cluster, local accounts are enabled by default. Even when you enable RBAC or Microsoft Entra integration,--adminaccess still exists as a non-auditable backdoor option. 
This article shows you how to disable local accounts on an existing cluster, create a new cluster with local accounts disabled, and re-enable local accounts on existing clusters.",2024-08-01T20:29:00.000Z,how-to,security,0.77,True,Shows how to disable/enable local admin accounts on AKS clusters with specific commands and flags affecting security posture.,unchanged https://learn.microsoft.com/en-us/azure/aks/manage-ssh-node-access,Manage SSH access to nodes,Manage SSH access on Azure Kubernetes Service cluster nodes - Azure Kubernetes Service,Configure and manage SSH access to AKS nodes,Learn how to configure SSH and manage SSH keys on Azure Kubernetes Service (AKS) cluster nodes.,"This article describes how to configure SSH access (preview) on your AKS clusters or node pools, during initial deployment or at a later time. AKS supports the following configuration options to manage SSH access on cluster nodes: Important AKS preview features are available on a self-service, opt-in basis. Previews are provided ""as is"" and ""as available,"" and they're excluded from the service-level agreements and limited warranty. AKS previews are partially covered by customer support on a best",2025-12-10T12:02:00.000Z,how-to,security,0.74,True,"Provides concrete SSH configuration options and key management behavior for AKS clusters/node pools, including preview constraints and supported patterns, which are AKS-specific security configurations.",unchanged https://learn.microsoft.com/en-us/azure/aks/managed-gateway-api,Ingress with the Kubernetes Gateway API (preview),Azure Kubernetes Service (AKS) Managed Gateway API Installation (preview) - Azure Kubernetes Service,,Learn how to install the Kubernetes Gateway API Custom Resource Definitions (CRDs) on your Azure Kubernetes Service (AKS) cluster using the Managed Gateway API installation.,"The Kubernetes Gateway API is a specification for traffic management on Kubernetes clusters. 
The specification enhances the Ingress API, which lacks a unified and provider-agnostic approach for advanced traffic routing. The Managed Gateway API Installation for Azure Kubernetes Service (AKS) installs the Custom Resource Definitions (CRDs) for the Kubernetes Gateway API. You can install these CRDs independently of any specific Gateway API implementation. Note The Managed Gateway API installation only ins",2026-03-31T22:11:00.000Z,how-to,,0.2,False,"Describes installing Gateway API CRDs on AKS; from the summary it looks like a conceptual/installation overview without detailed config tables, limits, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/aks/managed-identity-overview,Overview,Overview of managed identities in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,"This article provides an overview of managed identities in Azure Kubernetes Service (AKS), including system-assigned, user-assigned, and pre-created kubelet managed identities. It also explains how th","This article provides an overview of system-assigned and user-assigned managed identities in AKS, including how they work, role assignments, and AKS-specific managed identity features. To enable a managed identity on a new or existing AKS cluster, seeUse a managed identity in Azure Kubernetes Service (AKS). For more information about managed identities in Azure, see theManaged identities for Azure resources documentation. 
Note The system-assigned and user-assigned identity types differ from awor",2025-11-04T05:34:00.000Z,overview,,0.35,False,"Managed identities overview; primarily conceptual explanation of identity types and behavior, not detailed configuration steps or role tables.",unchanged -https://learn.microsoft.com/en-us/azure/aks/managed-identity-overview,Use managed identities,Overview of Managed Identities in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,"This article provides an overview of managed identities in Azure Kubernetes Service (AKS), including system-assigned, user-assigned, and pre-created kubelet managed identities. It also explains how th","This article provides an overview of system-assigned and user-assigned managed identities in AKS, including how they work, role assignments, and AKS-specific managed identity features.] For more information about managed identities in Azure, see theManaged identities for Azure resources documentation. Note The system-assigned and user-assigned identity types differ from aworkload identity, which is intended for use by an application running on a pod.",2025-11-04T05:34:00.000Z,overview,,0.1,False,"Described as an overview of managed identities in AKS (system-assigned, user-assigned, kubelet identity) and how they work. This is conceptual/architectural content without clear indication of concrete configuration tables, parameter values, or role/permission matrices that go beyond what an LLM would know from training.",unchanged +https://learn.microsoft.com/en-us/azure/aks/managed-identity-overview,Managed identities overview,Overview of Managed Identities in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,"This article provides an overview of managed identities in Azure Kubernetes Service (AKS), including system-assigned, user-assigned, and pre-created kubelet managed identities. 
It also explains how th","This article provides an overview of system-assigned and user-assigned managed identities in AKS, including how they work, role assignments, and AKS-specific managed identity features. For more information about managed identities in Azure, see the Managed identities for Azure resources documentation. Note Managed identities cover the cluster-to-Azure identity scenario in AKS — how the AKS cluster acts on Azure to manage resources on your behalf. For the other identity scenarios (control-plane aut",2025-11-04T05:34:00.000Z,overview,,0.4,False,"Overview of managed identities in AKS (system-assigned, user-assigned, kubelet identity); largely conceptual without detailed role/permission tables or config parameters.",new
For a detailed overview of managed namespaces, see the managed namespaces overview.",2025-11-18T16:04:00.000Z,how-to,configuration,0.65,True,"Step-by-step guide for managed namespaces; likely includes AKS-specific resource definitions, quota settings, and access control configuration fields.",unchanged https://learn.microsoft.com/en-us/azure/aks/migrate-from-autoscaler-to-node-auto-provisioning,Migrate from Cluster Autoscaler to NAP,Migrate from cluster autoscaler to node auto-provisioning - Azure Kubernetes Service,Migrate AKS from cluster autoscaler to node auto-provisioning,Learn how to migrate your Azure Kubernetes Service (AKS) cluster from cluster autoscaler to node auto-provisioning.,"Migrate your existing Azure Kubernetes Service (AKS) cluster from cluster autoscaler to node auto-provisioning using the steps in this guide. Node auto-provisioning (NAP) uses pending pod resource requirements to decide the optimal virtual machine (VM) configuration to run those workloads in the most efficient and cost-effective manner. Node auto-provisioning is based on the open-source Karpenter project and the AKS Karpenter provider. Node auto-provisioning automatically deploys, configures, and manag",2026-04-02T22:13:00.000Z,how-to,deployment,0.68,True,"Provides product-specific migration steps and constraints for moving from AKS cluster autoscaler to node auto-provisioning (Karpenter-based). 
While not a generic tutorial, it contains AKS-specific operational guidance and requirements for how to transition between autoscaling mechanisms in production, which is deployment-focused expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/aks/migrate-from-npm-to-cilium-network-policy,Migrate from Network Policy Manager to Cilium Network policy,Migrate from Network Policy Manager (NPM) to Cilium Network Policy - Azure Kubernetes Service,Plan and execute migration from NPM to Cilium in AKS,"This article is a comprehensive guide to plan, execute, and validate the migration from Network Policy Manager (NPM) to Cilium Network Policy","In this article, we provide a comprehensive guide to plan, execute, and validate the migration from Network Policy Manager (NPM) to Cilium Network Policy. The goal is to ensure policy parity, minimize service disruption, and align with Azure CNI's strategic direction toward eBPF-based networking and enhanced observability. Important This guide applies exclusively to AKS clusters running Linux nodes. Cilium Network Policy isn't currently supported for Windows nodes in AKS.",2026-02-17T23:08:00.000Z,how-to,decision-making,0.62,True,"Stepwise migration guide with planning, execution, and validation; helps decide and carry out migration between two network policy engines with AKS-specific constraints.",unchanged @@ -330,17 +337,17 @@ https://learn.microsoft.com/en-us/azure/aks/monitor-aks,Monitoring concepts and https://learn.microsoft.com/en-us/azure/aks/monitor-aks-mongodb,Monitor with Percona Monitoring and Management,Monitoring for MongoDB cluster on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure monitoring for MongoDB clusters on AKS with PMM,"In this article, you learn how to monitor a MongoDB cluster on Azure Kubernetes Service (AKS).","In this article, you learn how to monitor a MongoDB cluster on Azure Kubernetes Service (AKS) using Percona Monitoring and Management (PMM). 
Monitoring is essential for ensuring the health and performance of your MongoDB cluster. By monitoring key metrics, you can identify issues early and take corrective actions to prevent downtime and data loss.",2025-03-20T22:01:00.000Z,how-to,configuration,0.7,True,"Monitoring setup using Percona Monitoring and Management; likely includes exporter configuration, metric names, and integration details specific to AKS+MongoDB.",unchanged https://learn.microsoft.com/en-us/azure/aks/monitor-aks-reference,AKS monitoring data reference,Monitoring data reference for Azure Kubernetes Service - Azure Kubernetes Service,Reference for AKS monitoring data and metrics,This article contains important reference material you need when you monitor Azure Kubernetes Service (AKS) by using Azure Monitor.,This article contains all the monitoring reference information for this service. See Monitor Azure Kubernetes Service (AKS) for details on the data you can collect for AKS and how to use it.,2025-02-27T23:00:00.000Z,reference,configuration,0.7,True,"Monitoring data reference includes specific table names, metric names, and schema details for AKS telemetry in Azure Monitor; configuration-level reference for monitoring.",unchanged https://learn.microsoft.com/en-us/azure/aks/monitor-gpu-metrics,Monitor GPU metrics,GPU observability in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,This article provides a conceptual overview of key utilization and performance NVIDIA DCGM GPU metrics on Azure Kubernetes Service (AKS).,"Efficient placement and optimization of GPU workloads often requires visibility into resource utilization and performance. Managed GPU metrics on AKS (preview) provide automated collection and exposure of GPU utilization, memory, and performance data across NVIDIA GPU-enabled node pools. This enables platform administrators to optimize cluster resources and developers to tune and debug workloads with limited manual instrumentation. 
In this article, you learn about GPU metrics collected by the NV",2025-11-07T23:08:00.000Z,concept-article,,0.4,False,Conceptual overview of GPU metrics and observability; likely describes metric types but not detailed configuration tables or limits.,unchanged -https://learn.microsoft.com/en-us/azure/aks/nat-gateway,Use a NAT Gateway,Create a managed or user-assigned NAT gateway for your Azure Kubernetes Service (AKS) cluster - Azure Kubernetes Service,Understand AKS NAT gateway outbound flow limits,Learn how to create an AKS cluster with managed NAT integration and user-assigned NAT gateway.,"While you can route egress traffic through an Azure Load Balancer, there are limitations on the number of outbound flows of traffic you can have. Azure NAT Gateway allows up to 64,512 outbound UDP and TCP traffic flows per IP address with a maximum of 16 IP addresses. There are three outbound types that support NAT gateway -managedNATGatewayV2(Preview),managedNATGateway, anduserAssignedNATGateway. In a managed NAT gateway model, AKS manages the NAT gateway to provide outbound connectivity for yo",2026-04-10T22:06:00.000Z,how-to,limits-quotas,0.85,True,"The summary explicitly states a precise numeric limit: Azure NAT Gateway allows up to 64,512 outbound UDP and TCP flows per IP address with a maximum of 16 IP addresses. These are concrete, product-specific limits and quotas that an LLM is unlikely to know reliably from training. 
The rest of the page describes NAT gateway outbound types for AKS, but the standout expert knowledge is the exact flow and IP limits, which matches the limits-quotas sub-skill definition.",unchanged +https://learn.microsoft.com/en-us/azure/aks/nat-gateway,Use a NAT Gateway,Create a managed or user-assigned NAT gateway for your Azure Kubernetes Service (AKS) cluster - Azure Kubernetes Service,Configure AKS NAT gateway outbound flow limits,Learn how to create an AKS cluster with managed NAT integration and user-assigned NAT gateway.,"While you can route egress traffic through an Azure Load Balancer, there are limitations on the number of outbound flows of traffic you can have. Azure NAT Gateway allows up to 64,512 outbound UDP and TCP traffic flows per IP address with a maximum of 16 IP addresses. There are three outbound types that support NAT gateway - managedNATGatewayV2 (Preview), managedNATGateway, and userAssignedNATGateway. In a managed NAT gateway model, AKS manages the NAT gateway to provide outbound connectivity for yo",2026-04-15T08:00:00.000Z,how-to,limits-quotas,0.78,True,"Contains concrete outbound flow limits for Azure NAT Gateway (64,512 flows per IP, up to 16 IPs) and product-specific outbound type options for AKS, which are numeric constraints not generally known from training.",updated
This level of network access allows nodes and services you run to access external resources as needed. If you wish to restrict egress traffic, a limited number of ports and addresses must be accessible to maintain he",2025-06-20T17:00:00.000Z,how-to,configuration,0.7,True,"How-to for configuring outbound and inbound restrictions; such pages typically include specific NSG rules, ports, service tags, and AKS-specific settings, which are expert configuration details.",unchanged https://learn.microsoft.com/en-us/azure/aks/network-policy-best-practices,Network policy best practices,Best practices for network policies in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Apply AKS network policy security best practices,Learn the best practices for how to design and use network policies in Azure Kubernetes Service (AKS).,"Kubernetes, by default, operates as a flat network where all pods can communicate freely with each other. This unrestricted connectivity can be convenient for developers but poses significant security risks as applications scale. Imagine an organization deploying multiple microservices, each handling sensitive data, customer transactions, or backend operations. Without any restrictions, any compromised pod could potentially access unauthorized data or disrupt services. 
To address these security ",2025-04-16T17:03:00.000Z,best-practice,best-practices,0.78,True,Best-practices article specific to AKS network policies with concrete DO/DON'T guidance and AKS-specific patterns beyond generic Kubernetes concepts.,unchanged https://learn.microsoft.com/en-us/azure/aks/node-access,Connect securely to cluster nodes,Connect to Azure Kubernetes Service (AKS) cluster nodes - Azure Kubernetes Service,Securely connect to AKS cluster nodes for maintenance,Learn how to connect to Azure Kubernetes Service (AKS) cluster nodes for troubleshooting and maintenance tasks.,"Throughout the lifecycle of your Azure Kubernetes Service (AKS) cluster, you eventually need to directly access an AKS node. This access could be for maintenance, log collection, or troubleshooting operations. This article describes two options for secure connection against AKS Linux and Windows nodes. One requires that you have Kubernetes API access, and the other is through the AKS ARM API, which provides direct private IP information. For security reasons, AKS nodes aren't exposed to the inte",2026-03-04T23:10:00.000Z,troubleshooting,security,0.7,True,"The page describes concrete, product-specific methods to securely access AKS Linux and Windows nodes, including use of Kubernetes API vs AKS ARM API, and emphasizes security constraints around node exposure. 
While it’s partly troubleshooting-oriented, the core expert content is about secure access patterns and configuration for node connectivity, which aligns best with the security sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/aks/node-auto-drain,Node auto-drain,Automatically drain Azure Kubernetes Service (AKS) nodes - Azure Kubernetes Service,Protect workloads with AKS node auto-drain,Learn about node auto-drain functionality and how AKS protects your workloads from scheduled VM maintenance events.,"Node auto-drain helps you protect your node workloads from disruptions when scheduled events occur on the underlying virtual machines (VMs) in any of your node pools. When certain node events happen, node auto-drain attempts a cordon and drain of the affected node so the workloads can be rescheduled safely. An example of when node auto-drain might occur is when scheduled events on a spot node pool cause a preempt node event. The spot node with the taint ""kubernetes.azure.com/scalesetpriority: spot""ma",2025-07-28T22:03:00.000Z,concept-article,best-practices,0.68,True,"Documents AKS behavior on scheduled events (e.g., spot eviction), taints used, and how cordon/drain is triggered, which are specific operational patterns and gotchas for AKS.",unchanged
This is primarily conceptual/behavioral documentation; the summary does not indicate concrete numeric limits, configuration tables, error codes, or decision matrices with thresholds. Likely no detailed expert-only parameters or quotas.",updated +https://learn.microsoft.com/en-us/azure/aks/node-auto-provisioning,Overview,Overview of Node Auto-Provisioning (NAP) in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,"Learn about node auto-provisioning in AKS, including how it works, prerequisites, upgrade behavior, limitations, and resources to get started.","This article provides an overview of node auto-provisioning (NAP) in Azure Kubernetes Service (AKS), including how it works, upgrade behavior, prerequisites, limitations, and resources to get started.",2026-04-14T06:03:00.000Z,overview,,0.3,False,"Described as an overview of AKS Node Auto-Provisioning covering how it works, prerequisites, upgrade behavior, and limitations. This is primarily conceptual/behavioral documentation; the summary does not indicate concrete numeric limits, configuration tables, error codes, or decision matrices with thresholds. Likely no detailed expert-only parameters or quotas.",unchanged https://learn.microsoft.com/en-us/azure/aks/node-auto-provisioning-aksnodeclass,Configure AKSNodeClass custom resources,Configure AKSNodeClass Resources for Node Auto-Provisioning (NAP) in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure AKSNodeClass for AKS node auto-provisioning,Learn how to configure Azure-specific settings for AKS node auto-provisioning using AKSNodeClass resources.,"This article explains how to configure AKSNodeClass resources to define Azure-specific settings for node auto-provisioning (NAP) in Azure Kubernetes Service (AKS) using Karpenter. AKSNodeClass allows you to customize various aspects of the nodes that Karpenter provisions, such as the virtual machine (VM) image, operating system (OS) disk size, maximum pods per node, and kubelet configurations. 
Important Starting on November 30, 2025, Azure Kubernetes Service (AKS) no longer supports or provides secur",2026-01-20T23:08:00.000Z,how-to,configuration,0.78,True,"Configuration article for AKSNodeClass and NAP with Azure-specific fields (VM image, OS disk size, max pods per node, kubelet settings). Likely includes CRD field names, allowed values, and defaults that are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/aks/node-auto-provisioning-custom-vnet,Use NAP in a custom VNet,Create an Azure Kubernetes Service (AKS) Cluster with Node Auto-Provisioning (NAP) in a Custom Virtual Network - Azure Kubernetes Service,Create NAP-enabled AKS clusters in custom VNets,Learn how to create an AKS cluster with node auto-provisioning in a custom virtual network.,"This article shows you how to create a virtual network (VNet) and subnet, create a managed identity with permissions to access the VNet, and create an Azure Kubernetes Service (AKS) cluster in your custom VNet with node auto-provisioning (NAP) enabled.",2025-10-30T22:08:00.000Z,how-to,configuration,0.72,True,"Shows creating VNets, identities, and NAP-enabled clusters; includes networking parameters, identity roles, and AKS-specific NAP settings.",unchanged
NAP optimizes your cluster by:,2025-10-30T22:08:00.000Z,overview,configuration,0.68,True,"Explains configuring node disruption policies for NAP nodes; likely includes specific policy fields, thresholds, and examples that control NAP behavior.",unchanged https://learn.microsoft.com/en-us/azure/aks/node-auto-provisioning-networking,Configure networking,Understand Networking Configurations for Node Auto-Provisioning (NAP) in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure networking and RBAC for NAP-enabled AKS clusters,"Learn about networking configuration requirements and recommendations for AKS clusters using node auto-provisioning (NAP), including supported configurations, subnet behavior, RBAC setup, and CIDR con","This article provides an overview of networking configuration requirements and recommendations for Azure Kubernetes Service (AKS) clusters using node auto-provisioning (NAP). It covers supported configurations, default subnet behavior, role-based access control (RBAC) setup, and classless inter-domain routing (CIDR) considerations. 
For an overview of node auto-provisioning in AKS, see Overview of node auto-provisioning (NAP) in Azure Kubernetes Service (AKS).",2025-10-30T22:08:00.000Z,overview,security,0.6,True,"Covers networking requirements, supported configurations, RBAC setup, and CIDR considerations; includes role definitions and network configuration details specific to NAP.",unchanged -https://learn.microsoft.com/en-us/azure/aks/node-auto-provisioning-node-pools,Configure NodePool custom resources,Configure Node Pools for Node Auto-Provisioning (NAP) in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure AKS node pools for Node Auto-Provisioning,"This article shows you how to configure node pools for Node Auto-Provisioning (NAP) in Azure Kubernetes Service (AKS), including SKU selectors, limits, and weights.","This article explains how to configure node pools for node auto-provisioning (NAP) in Azure Kubernetes Service (AKS), including SKU selectors, resource limits, and priority weights. It also provides examples to help you get started.",2026-04-13T17:13:00.000Z,how-to,configuration,0.75,True,"Focuses on configuring node pools for AKS NAP, including SKU selectors, resource limits, and priority weights. 
This implies product-specific configuration parameters and examples (such as selector fields, limit fields, and weight values) that qualify as expert configuration knowledge beyond generic Kubernetes concepts.",updated +https://learn.microsoft.com/en-us/azure/aks/node-auto-provisioning-node-pools,Configure NodePool custom resources,Configure Node Pools for Node Auto-Provisioning (NAP) in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure AKS node pools for Node Auto-Provisioning,"This article shows you how to configure node pools for Node Auto-Provisioning (NAP) in Azure Kubernetes Service (AKS), including SKU selectors, limits, and weights.","This article explains how to configure node pools for node auto-provisioning (NAP) in Azure Kubernetes Service (AKS), including SKU selectors, resource limits, and priority weights. It also provides examples to help you get started.",2026-04-13T17:13:00.000Z,how-to,configuration,0.75,True,"Focuses on configuring node pools for AKS NAP, including SKU selectors, resource limits, and priority weights. 
This implies product-specific configuration parameters and examples (such as selector fields, limit fields, and weight values) that qualify as expert configuration knowledge beyond generic Kubernetes concepts.",unchanged https://learn.microsoft.com/en-us/azure/aks/node-auto-provisioning-upgrade-image,Update NAP node images,Node Image Updates for Node Auto-Provisioning (NAP) in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Plan node image updates for NAP in AKS,"Learn about node image updates for NAP in AKS, including how it works, recommended maintenance windows, and examples to get started.","This article provides an overview of node image updates for node auto-provisioning (NAP) in Azure Kubernetes Service (AKS), including how it works, recommended maintenance windows, and examples to get started.",2025-10-30T22:08:00.000Z,overview,best-practices,0.62,True,Discusses node image updates for NAP with recommended maintenance windows and examples; provides operational guidance and timing considerations specific to NAP.,unchanged https://learn.microsoft.com/en-us/azure/aks/node-auto-repair,Node auto-repair,Automatically repair Azure Kubernetes Service (AKS) nodes - Azure Kubernetes Service,Use AKS node auto-repair for unhealthy nodes,Learn about node auto-repair functionality and how AKS fixes broken worker nodes.,"Azure Kubernetes Service (AKS) continuously monitors the health state of worker nodes and performs automatic node repair if they become unhealthy. The Azure virtual machine (VM) platform performs maintenance on VMs experiencing issues. AKS and Azure VMs work together to minimize service disruptions for clusters. 
In this article, you learn how the automatic node repair functionality behaves for Windows and Linux nodes.",2025-04-02T17:01:00.000Z,concept-article,best-practices,0.65,True,"Describes AKS-specific health checks, thresholds, and repair actions for Linux/Windows nodes, including behavior during platform maintenance—concrete product behavior and recommendations.",unchanged https://learn.microsoft.com/en-us/azure/aks/node-images,Node images,Node Images in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Choose and manage AKS node OS images,Learn about the different node images available in Azure Kubernetes Service (AKS).,"This article describes the node images available for Azure Kubernetes Service (AKS) nodes. Important Starting on March 17, 2027, Azure Kubernetes Service (AKS) no longer supports or provides security updates for Ubuntu 20.04. Any existing node images will be deleted, and you'll be unable to scale any node pools running Ubuntu 20.04. Migrate to a supported Ubuntu version by upgrading your node pools to Kubernetes version 1.35+. For more information on this retirement, see the Retirement GitHub issue a",2026-04-09T06:03:00.000Z,overview,decision-making,0.68,True,"The page contains product-specific, time-bound retirement details for specific AKS node images (for example, exact retirement date for Ubuntu 20.04, required Kubernetes version to migrate, and supported image options). This is expert, versioned knowledge that changes over time and is unlikely to be fully known from pretraining. 
The content helps users decide which node image/OS version to use and when to migrate based on support status, fitting best under decision-making rather than generic concepts.",unchanged @@ -367,8 +374,7 @@ https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-advanced-sch https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-cluster-isolation,Multi-tenancy and cluster isolation,Cluster isolation best practices for Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Implement cluster isolation strategies in Azure Kubernetes Service,Learn the cluster operator best practices for isolation in Azure Kubernetes Service (AKS).,"As you manage clusters in Azure Kubernetes Service (AKS), you often need to isolate teams and workloads. AKS allows flexibility in how you run multi-tenant clusters and isolate resources. To maximize your investment in Kubernetes, it's important you understand AKS multi-tenancy and isolation features. This best practices article focuses on isolation for cluster operators. In this article, you learn how to:",2024-08-01T20:29:00.000Z,best-practice,best-practices,0.8,True,"Best-practices for multi-tenancy and isolation in AKS; expected to include concrete patterns (e.g., namespace, node pool, network policy configurations) and AKS-specific guidance.",unchanged https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-cluster-security,Cluster security best practices,Best practices for cluster security - Azure Kubernetes Service,Apply AKS cluster security and upgrade best practices,Learn the cluster operator best practices for how to manage cluster security and upgrades in Azure Kubernetes Service (AKS),"As you manage clusters in Azure Kubernetes Service (AKS), workload and data security is a key consideration. When you run multi-tenant clusters using logical isolation, you especially need to secure resource and workload access. 
Minimize the risk of attack by applying the latest Kubernetes and node OS security updates. This article focuses on how to secure your AKS cluster. You learn how to: You can also read the best practices for container image management and for pod security.",2024-11-10T12:11:00.000Z,best-practice,best-practices,0.8,True,"Cluster security best-practices; likely includes concrete AKS guidance (upgrade channels, node OS versions, RBAC patterns) and product-specific gotchas.",unchanged https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-container-image-management,Container image management best practices,Operator best practices - Container image management in Azure Kubernetes Services (AKS) - Azure Kubernetes Service,Implement container image security best practices in AKS,Learn the cluster operator best practices for how to manage and secure container images in Azure Kubernetes Service (AKS).,"Container and container image security is a major priority when developing and running applications in Azure Kubernetes Service (AKS). Containers with outdated base images or unpatched application runtimes introduce security risks and possible attack vectors. You can minimize these risks by integrating and running scan and remediation tools in your containers at build and runtime. The earlier you catch the vulnerability or outdated base image, the more secure your application is. 
In this article",2024-08-01T20:29:00.000Z,best-practice,best-practices,0.75,True,"Operator best-practices for image management; likely includes AKS-specific scanning, registry, and pipeline recommendations beyond generic advice.",unchanged -https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-identity,Authentication and authorization best practices,Best practices for managing authentication and authorization - Azure Kubernetes Service,Best practices for AKS authentication and authorization,Learn the cluster operator best practices for how to manage authentication and authorization for clusters in Azure Kubernetes Service (AKS),"As you deploy and maintain clusters in Azure Kubernetes Service (AKS), you implement ways to manage access to resources and services. Without these controls: In this article, we discuss what recommended practices a cluster operator can follow to manage access and identity for AKS clusters. You'll learn how to: Warning The open source Microsoft Entra pod-managed identity (preview) in Azure Kubernetes Service has been deprecated as of 10/24/2022. If you haveMicrosoft Entra pod-managed identityenab",2024-08-01T20:29:00.000Z,best-practice,best-practices,0.74,True,"Provides concrete AKS-specific recommendations (e.g., which identity features to use, deprecation notes, and configuration patterns) beyond generic security advice.",unchanged -https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-network,Best practices for network connectivity and security,Best practices for network resources in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Apply AKS network configuration best practices,Learn the cluster operator best practices for virtual network resources and connectivity in Azure Kubernetes Service (AKS).,"Important On31 March 2028, kubenet networking for Azure Kubernetes Service (AKS) will be retired. 
To avoid service disruptions,you'll need toupgrade to Azure Container Networking Interface (CNI) overlaybefore that date, when workloads running on kubenet for AKS will no longer be supported. As you create and manage clusters in Azure Kubernetes Service (AKS), you provide network connectivity for your nodes and applications. These network resources include IP address ranges, load balancers, and ing",2026-04-13T22:08:00.000Z,best-practice,best-practices,0.78,True,"The page is explicitly framed as operator best practices for AKS networking and typically includes product-specific recommendations (for example, concrete guidance on IP range sizing, load balancer configuration, and AKS-specific networking choices like kubenet vs CNI and migration considerations). These are actionable, AKS-specific DO/DON'T patterns rather than generic networking theory, matching the best-practices category.",updated +https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-network,Best practices for network connectivity and security,Best practices for network resources in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Apply AKS network configuration best practices,Learn the cluster operator best practices for virtual network resources and connectivity in Azure Kubernetes Service (AKS).,"Important On 31 March 2028, kubenet networking for Azure Kubernetes Service (AKS) will be retired. To avoid service disruptions, you'll need to upgrade to Azure Container Networking Interface (CNI) overlay before that date, when workloads running on kubenet for AKS will no longer be supported. As you create and manage clusters in Azure Kubernetes Service (AKS), you provide network connectivity for your nodes and applications. 
These network resources include IP address ranges, load balancers, and ing",2026-04-13T22:08:00.000Z,best-practice,best-practices,0.78,True,"The page is explicitly framed as operator best practices for AKS networking and typically includes product-specific recommendations (for example, concrete guidance on IP range sizing, load balancer configuration, and AKS-specific networking choices like kubenet vs CNI and migration considerations). These are actionable, AKS-specific DO/DON'T patterns rather than generic networking theory, matching the best-practices category.",unchanged https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-scheduler,Basic scheduler features,Operator best practices - Basic scheduler features in Azure Kubernetes Services (AKS) - Azure Kubernetes Service,Apply basic scheduler best practices in AKS,Learn the cluster operator best practices for using basic scheduler features such as resource quotas and pod disruption budgets in Azure Kubernetes Service (AKS),"As you manage clusters in Azure Kubernetes Service (AKS), you often need to isolate teams and workloads. The Kubernetes scheduler lets you control the distribution of compute resources, or limit the impact of maintenance events. This best practices article focuses on basic Kubernetes scheduling features for cluster operators. 
In this article, you learn how to:",2024-08-01T20:29:00.000Z,best-practice,best-practices,0.7,True,"Operator best-practices article for quotas and PDBs; likely includes AKS-specific examples, recommended configurations, and edge cases.",unchanged https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-storage,Best practices for storage and backups,Best practices for storage and backup - Azure Kubernetes Service,Apply AKS storage and backup operator best practices,"Learn the cluster operator best practices for storage, data encryption, and backups in Azure Kubernetes Service (AKS)","As you create and manage clusters in Azure Kubernetes Service (AKS), your applications often need storage. Make sure you understand pod performance needs and access methods so that you can select the best storage for your application. The AKS node size may impact your storage choices. Plan for ways to back up and test the restore process for attached storage. This best practices article focuses on storage considerations for cluster operators. In this article, you learn:",2025-10-22T05:06:00.000Z,best-practice,best-practices,0.78,True,"Operator-focused storage, encryption, and backup best practices with AKS-specific recommendations (node size impact, backup strategies).",unchanged https://learn.microsoft.com/en-us/azure/aks/optimize-aks-costs,Optimize AKS usage and costs,Optimize Azure Kubernetes Service (AKS) usage and costs - Azure Kubernetes Service,Choose strategies to optimize AKS usage and costs,Learn different ways to optimize your Azure Kubernetes Service (AKS) usage and costs.,This article provides guidance on how to optimize your Azure Kubernetes Service (AKS) usage and costs. 
It covers guidance on the following topics:,2025-02-27T23:00:00.000Z,how-to,decision-making,0.65,True,"Guidance on ways to optimize AKS usage and costs; likely includes scenario-based recommendations and trade-offs between options (e.g., node types, autoscaling, add-ons).",unchanged @@ -398,7 +404,7 @@ https://learn.microsoft.com/en-us/azure/aks/plan-control-plane-networking,Plan c https://learn.microsoft.com/en-us/azure/aks/plan-networking,Overview,Plan Networking for Azure Kubernetes Service (AKS) Workloads - Azure Kubernetes Service,Plan networking architecture for Azure Kubernetes Service workloads,This article provides an overview of the networking components you need to consider when planning your Azure Kubernetes Service (AKS) workloads.,This article provides an overview of the networking components you need to consider when planning your Azure Kubernetes Service (AKS) workloads.,2025-12-15T18:09:00.000Z,overview,decision-making,0.65,True,"Planning article for AKS networking; likely compares options (kubenet vs CNI, load balancer choices) with criteria and recommendations, supporting technology selection decisions.",unchanged https://learn.microsoft.com/en-us/azure/aks/plan-node-networking,Plan node networking,Plan Node Networking for Azure Kubernetes Service (AKS) Workloads - Azure Kubernetes Service,Select node networking models for Azure Kubernetes Service,This article provides an overview of the networking components you need to consider for Azure Kubernetes Service (AKS) nodes.,"In this article, you learn node networking options for Azure Kubernetes Service (AKS). 
We first pose a question to help guide your planning, and then provide options, recommendations, and best practices.",2025-12-15T18:09:00.000Z,overview,decision-making,0.7,True,"Node networking planning article; expected to compare node networking options and trade-offs, guiding SKU/architecture choices.",unchanged https://learn.microsoft.com/en-us/azure/aks/plan-pod-networking,Plan pod networking,Plan Pod Networking for Azure Kubernetes Service (AKS) Workloads - Azure Kubernetes Service,Plan pod networking strategies for Azure Kubernetes Service,This article provides an overview of the networking components you need to consider for Azure Kubernetes Service (AKS) pods.,"In this article, you learn pod networking options for Azure Kubernetes Service (AKS). We first pose a question to help guide your planning, and then provide options, recommendations, and best practices.",2025-12-15T18:09:00.000Z,overview,decision-making,0.7,True,Pod networking planning with options and recommendations; supports decisions between CNI modes and pod IP strategies with AKS-specific guidance.,unchanged -https://learn.microsoft.com/en-us/azure/aks/planned-maintenance,Use Planned Maintenance to schedule and control upgrades,Use Planned Maintenance to Schedule and Control Upgrades for Azure Kubernetes Service (AKS) Clusters - Azure Kubernetes Service,Configure planned maintenance windows for AKS upgrades,Learn how to use planned maintenance to schedule and control cluster and node image upgrades in Azure Kubernetes Service (AKS).,"This article shows you how to use planned maintenance to schedule and control cluster and node image upgrades in Azure Kubernetes Service (AKS). Regular maintenance is performed on your AKS cluster automatically. There are two types of maintenance operations: When you use the feature of planned maintenance in AKS, you can run both types of maintenance in a cadence of your choice to minimize workload impact. 
Note You can use planned maintenance to schedule the timing of automatic upgrades, but en",2026-04-14T06:03:00.000Z,how-to,configuration,0.7,True,"The page describes detailed, product-specific configuration of AKS planned maintenance for cluster and node image upgrades (for example, maintenance windows, schedules, and how they affect automatic upgrades). These are concrete AKS feature settings rather than generic concepts, fitting the configuration category best. It does not primarily focus on limits, troubleshooting, or architecture patterns.",updated +https://learn.microsoft.com/en-us/azure/aks/planned-maintenance,Use Planned Maintenance to schedule and control upgrades,Use Planned Maintenance to Schedule and Control Upgrades for Azure Kubernetes Service (AKS) Clusters - Azure Kubernetes Service,Configure planned maintenance windows for AKS upgrades,Learn how to use planned maintenance to schedule and control cluster and node image upgrades in Azure Kubernetes Service (AKS).,"This article shows you how to use planned maintenance to schedule and control cluster and node image upgrades in Azure Kubernetes Service (AKS). Regular maintenance is performed on your AKS cluster automatically. There are two types of maintenance operations: When you use the feature of planned maintenance in AKS, you can run both types of maintenance in a cadence of your choice to minimize workload impact. Note You can use planned maintenance to schedule the timing of automatic upgrades, but en",2026-04-14T06:03:00.000Z,how-to,configuration,0.7,True,"The page describes detailed, product-specific configuration of AKS planned maintenance for cluster and node image upgrades (for example, maintenance windows, schedules, and how they affect automatic upgrades). These are concrete AKS feature settings rather than generic concepts, fitting the configuration category best. 
It does not primarily focus on limits, troubleshooting, or architecture patterns.",unchanged https://learn.microsoft.com/en-us/azure/aks/policy-reference,Azure Policy built-ins,Built-in policy definitions for Azure Kubernetes Service - Azure Kubernetes Service,Use built-in Azure Policy definitions for AKS,Lists Azure Policy built-in policy definitions for Azure Kubernetes Service. These built-in policy definitions provide common approaches to managing your Azure resources.,"This page is an index of Azure Policy built-in policy definitions for Azure Kubernetes Service. For additional Azure Policy built-ins for other services, see Azure Policy built-in definitions. The name of each built-in policy definition links to the policy definition in the Azure portal. Use
This article shows you how to bridge that gap by fronting the API server with a Pri",2026-03-11T22:11:00.000Z,how-to,configuration,0.7,True,"The article gives product-specific, stepwise configuration for exposing an AKS cluster with API Server VNet Integration via Private Link to another VNet. It includes concrete Azure resource types and settings (Private Link service, private endpoint, VNet integration specifics) that represent expert, implementation-level knowledge rather than generic concepts, but it does not focus on limits, troubleshooting, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/aks/private-cluster-connect,Establish connectivity to a private cluster,Establish network connectivity to a private Azure Kubernetes Service (AKS) cluster - Azure Kubernetes Service,Configure network access paths to private AKS clusters,"Learn about the options for connecting to a private AKS cluster, including using Azure Cloud Shell, Azure Bastion, virtual network peering, and private endpoints.","In private AKS clusters, the API server endpoint has no public IP address. To manage the API server, you need to use a virtual machine (VM) or container that has access to the virtual network (VNet) of the AKS cluster. 
There are several options for establishing network connectivity to the private cluster:",2025-12-16T23:08:00.000Z,how-to,security,0.65,True,"Covers concrete options (Cloud Shell, Bastion, peering, private endpoints) and their configuration for accessing private clusters, which are product-specific secure connectivity patterns.",unchanged https://learn.microsoft.com/en-us/azure/aks/private-clusters,Create a private cluster,Create a Private Azure Kubernetes Service (AKS) Cluster - Azure Kubernetes Service,Configure and deploy private AKS clusters with CLI/Terraform,Learn how to create a private Azure Kubernetes Service (AKS) cluster with enhanced security and network control using Azure CLI or Terraform.,"This article helps you deploy a private link-based AKS cluster using Azure CLI or Terraform. If you're interested in creating an AKS cluster without required private link or tunnel, see Create an Azure Kubernetes Service (AKS) cluster with API Server VNet integration.",2026-04-02T22:13:00.000Z,how-to,configuration,0.7,True,"Private AKS cluster creation involves product-specific network and API server configuration (e.g., private link, VNet integration, required parameters for CLI/Terraform). 
These are concrete configuration patterns and settings unique to AKS private clusters rather than generic Kubernetes knowledge, so it best fits the configuration sub-skill.",unchanged -https://learn.microsoft.com/en-us/azure/aks/privileged-identity-management,Cluster and node access control with Privileged Identity Management,Control cluster and node access using Privileged Identity Management (PIM) with AKS-managed Microsoft Entra integration - Azure Kubernetes Service,Configure PIM-based just-in-time access to AKS,Learn how to access clusters and nodes using Privileged Identity Management (PIM) when integrating Microsoft Entra ID in your Azure Kubernetes Service (AKS) clusters.,"When setting up permissions for different teams, you might want to set default permissions for specified teams, then grant privileged access to specific users when needed. Using Azure Kubernetes Service (AKS) with Microsoft Entra ID allows you to set up Privileged Identity Management (PIM) for just-in-time (JIT) requests for both cluster control plane access and SSH access to nodes. 
In this article, you learn how to: Note Microsoft Entra Privileged Identity Management (PIM) has Microsoft Entra I",2025-12-10T12:02:00.000Z,how-to,security,0.76,True,Explains how to use Entra PIM for JIT access to AKS control plane and nodes with specific role assignments and flows.,unchanged +https://learn.microsoft.com/en-us/azure/aks/privileged-identity-management,Privileged Identity Management for cluster and node access,Control cluster and node access using Privileged Identity Management (PIM) with Microsoft Entra integration - Azure Kubernetes Service,Set up PIM-based just-in-time access for AKS,Learn how to access clusters and nodes using Privileged Identity Management (PIM) when integrating Microsoft Entra ID in your Azure Kubernetes Service (AKS) clusters.,"When setting up permissions for different teams, you might want to set default permissions for specified teams, then grant privileged access to specific users when needed. Using Azure Kubernetes Service (AKS) with Microsoft Entra ID allows you to set up Privileged Identity Management (PIM) for just-in-time (JIT) requests for both cluster control plane access and SSH access to nodes. In this article, you learn how to: Note Microsoft Entra Privileged Identity Management (PIM) has Microsoft Entra I",2025-12-10T12:02:00.000Z,how-to,security,0.8,True,"Describes configuring Privileged Identity Management for AKS cluster and node access; includes specific role assignments, activation settings, and scopes, which are expert security configuration details.",new https://learn.microsoft.com/en-us/azure/aks/quickstart-dapr,Develop with Dapr,Deploy an App with Dapr Extension for Kubernetes - Azure Kubernetes Service,,Use the Dapr extension for Azure Kubernetes Service (AKS) or Arc-enabled Kubernetes to deploy an application.,"In this quickstart, you use the Dapr extension in an AKS or Arc-enabled Kubernetes cluster. 
You deploy a hello world example, which consists of a Python application that generates messages and a Node.js application that consumes and persists the messages.",2026-02-12T23:08:00.000Z,quickstart,,0.35,False,"Quickstart deploying a sample app with the Dapr extension; primarily step-by-step example, not deep configuration or limits content.",unchanged https://learn.microsoft.com/en-us/azure/aks/quickstart-event-grid,Subscribe to AKS events with Event Grid,Subscribe to Azure Kubernetes Service events with Azure Event Grid - Azure Kubernetes Service,,Use Azure Event Grid to subscribe to Azure Kubernetes Service events,"Azure Event Grid is a fully managed event routing service that provides uniform event consumption using a publish-subscribe model. In this quickstart, you create an AKS cluster and subscribe to AKS events.",2026-01-22T23:12:00.000Z,how-to,,0.4,False,Quickstart tutorial for subscribing to events; likely basic steps without deep config tables or error mappings.,unchanged https://learn.microsoft.com/en-us/azure/aks/quickstart-helm,Develop with Helm,Develop on Azure Kubernetes Service (AKS) with Helm - Azure Kubernetes Service,,Use Helm with AKS and Azure Container Registry to package and run application containers in a cluster.,"Helm is an open-source packaging tool that helps you install and manage the lifecycle of Kubernetes applications. Similar to Linux package managers like APT and Yum, Helm manages Kubernetes charts, which are packages of pre-configured Kubernetes resources. In this quickstart, you use Helm to package and run an application on AKS. 
For information on installing an existing application using Helm, see Install existing applications with Helm in AKS.",2024-08-01T20:29:00.000Z,concept-article,,0.35,False,"Helm quickstart for AKS; standard tutorial on packaging and deploying an app, unlikely to contain detailed AKS-specific configuration matrices.",unchanged
@@ -427,8 +433,8 @@ https://learn.microsoft.com/en-us/azure/aks/roll-back-node-pool-version,Roll bac
https://learn.microsoft.com/en-us/azure/aks/scale-cluster,Scale an AKS cluster,Manually Scale Nodes in an Azure Kubernetes Service (AKS) Cluster - Azure Kubernetes Service,,Manually scale the node count in your Azure Kubernetes Service (AKS) cluster with Azure CLI or PowerShell to match demand and optimize capacity.,"If the resource needs of your applications change, your cluster performance may be impacted due to low capacity on CPU, memory, PID space, or disk sizes. To address these changes, you can manually scale your AKS cluster to run a different number of nodes. When you scale in, nodes are carefully cordoned and drained to minimize disruption to running applications. When you scale out, AKS waits until nodes are marked Ready by the Kubernetes cluster before pods are scheduled on them. This article describ",2026-03-17T22:13:00.000Z,how-to,,0.3,False,"Primarily a how-to for manually scaling AKS nodes via CLI/PowerShell; summary does not indicate numeric limits, config tables, or product-specific thresholds beyond generic scaling behavior.",unchanged https://learn.microsoft.com/en-us/azure/aks/scale-down-mode,Stop/deallocate nodes with Scale-down Mode,Use Scale-down Mode for your Azure Kubernetes Service (AKS) cluster - Azure Kubernetes Service,Choose and configure AKS scale-down mode,Learn how to use Scale-down Mode in Azure Kubernetes Service (AKS).,"By default, scale-up operations performed manually or by the cluster autoscaler require the allocation and provisioning of new nodes, and scale-down operations delete nodes. 
Scale-down Mode allows you to decide whether you would like to delete or deallocate the nodes in your Azure Kubernetes Service (AKS) cluster upon scaling down. When an Azure VM is in the Stopped (deallocated) state, you will not be charged for the VM compute resources. However, you'll still need to pay for any OS and data stor",2024-08-01T20:29:00.000Z,how-to,decision-making,0.7,True,"Explains when to delete vs deallocate nodes, cost implications, and how to configure scale-down mode on AKS. Contains AKS-specific behavior and trade-offs for different scenarios.",unchanged https://learn.microsoft.com/en-us/azure/aks/scale-node-pools,Scale node pools,Scale Node Pools in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Manually and automatically scale AKS node pools,Learn how to manually and automatically scale node pools in Azure Kubernetes Service (AKS).,"As your application workload demands change, you might need to scale the number of nodes in a node pool in Azure Kubernetes Service (AKS). 
In this article, you learn how to manually and automatically scale node pools in AKS.",2025-12-02T06:03:00.000Z,how-to,configuration,0.7,True,"Provides AKS-specific scaling commands, autoscaler settings, and behavior for node pools, which are concrete configuration patterns beyond generic Kubernetes autoscaling.",unchanged -https://learn.microsoft.com/en-us/azure/aks/secure-container-access,Secure container access to resources,Secure Container Access to Resources in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,"Harden AKS containers with namespaces, AppArmor, seccomp","Learn how to limit access to actions that containers can perform, provide the least number of permissions, and avoid the use of root access or privileged escalation.","In this article, you learn how to secure container access to resources for your Azure Kubernetes Service (AKS) workloads using theuser namespaces,AppArmor, andseccompbuilt-in Linux security features.",2026-01-29T23:06:00.000Z,how-to,security,0.8,True,"Gives concrete AKS-specific guidance and configuration snippets for user namespaces, AppArmor, and seccomp to restrict container capabilities.",unchanged -https://learn.microsoft.com/en-us/azure/aks/security-bulletins/overview,Security Bulletins,Security bulletins for Azure Kubernetes Service (AKS) - Azure Kubernetes Service,AKS security bulletins and vulnerability troubleshooting,This article provides security/vulnerability related updates and troubleshooting guides for Azure Kubernetes Services (AKS).,This page provides up-to-date information on security vulnerabilities affecting Azure Kubernetes Service(AKS) and its components. This information includes details on: These updates cover security information related to the following AKS components:,2026-03-26T22:14:00.000Z,concept-article,troubleshooting,0.65,True,"The page aggregates AKS security/vulnerability updates and related troubleshooting guidance. 
Security bulletins usually include specific CVEs, component versions, and remediation steps, mapping symptoms or vulnerabilities to fixes. This aligns with troubleshooting, as it provides diagnosis and resolution information specific to AKS security issues.",unchanged
+https://learn.microsoft.com/en-us/azure/aks/secure-container-access,Container privilege hardening (Linux),Secure Container Access to Resources in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,"Harden AKS containers with user namespaces, AppArmor, seccomp","Learn how to limit access to actions that containers can perform, provide the least number of permissions, and avoid the use of root access or privileged escalation.","In this article, you learn how to secure container access to resources for your Azure Kubernetes Service (AKS) workloads using the user namespaces, AppArmor, and seccomp built-in Linux security features.",2026-01-29T23:06:00.000Z,how-to,security,0.7,True,"Describes concrete use of Linux security features (user namespaces, AppArmor, seccomp) in AKS; likely includes specific profile names, configuration snippets, and AKS-specific security guidance beyond generic concepts.",new
+https://learn.microsoft.com/en-us/azure/aks/security-bulletins/overview,Subscribe to AKS security bulletins,Security bulletins for Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Use AKS security bulletins for vulnerability impact and mitigation,This article provides security/vulnerability related updates and troubleshooting guides for Azure Kubernetes Service (AKS).,This page provides up-to-date information on security vulnerabilities affecting Azure Kubernetes Service (AKS) and its components. 
This information includes details on: These updates cover security information related to the following AKS components:,2026-03-26T22:14:00.000Z,concept-article,troubleshooting,0.7,True,"Security bulletins page aggregates vulnerability details and troubleshooting guidance for AKS components; typically includes CVEs, affected versions, and mitigation steps, which are product-specific troubleshooting knowledge.",new https://learn.microsoft.com/en-us/azure/aks/security-controls-policy,Security controls by Azure Policy,Azure Policy Regulatory Compliance controls for Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Use Azure Policy regulatory controls for AKS,Lists Azure Policy Regulatory Compliance controls available for Azure Kubernetes Service (AKS). These built-in policy definitions provide common approaches to managing the compliance of your Azure res,"Regulatory Compliance in Azure Policy provides initiative definitions (built-ins) created and managed by Microsoft, for the compliance domains and security controls related to different compliance standards. This page lists the Azure Kubernetes Service (AKS) compliance domains and security controls. You can assign the built-ins for a security control individually to help make your Azure resources compliant with the specific standard. The title of each built-in policy definition links to the policy ",2024-12-12T05:31:00.000Z,sample,security,0.8,True,Lists specific built-in policy definitions and compliance controls; includes policy names and mappings that are product-specific security configuration artifacts.,unchanged https://learn.microsoft.com/en-us/azure/aks/servicemesh-about,Service meshes overview,About service meshes - Azure Kubernetes Service,,"Obtain an overview of service meshes, supported scenarios, selection criteria, and next steps to explore.","A service mesh is an infrastructure layer in your application that facilitates communication between services. 
Service meshes provide capabilities like traffic management, resiliency, policy, security, strong identity, and observability to your workloads. Your application is decoupled from these operational capabilities, while the service mesh moves them out of the application layer and down to the infrastructure layer.",2024-08-01T20:29:00.000Z,concept-article,,0.3,False,"Conceptual overview of service meshes and capabilities; lacks product-specific numeric thresholds, configs, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/aks/shared-health-probes,Use shared health probes,Use Shared Health Probes for externalTrafficPolicy Cluster Services in Azure Kubernetes Service (AKS) (preview) - Azure Kubernetes Service,Enable shared health probes for AKS Services,Learn how to use shared health probes for Services with externalTrafficPolicy set to Cluster in Azure Kubernetes Service (AKS) to improve load balancer efficiency and reduce configuration complexity.,"This article describes how to enable shared health probe mode (preview) for Services with externalTrafficPolicy: Cluster in Azure Kubernetes Service (AKS). 
Shared probe mode improves load balancer efficiency, reduces configuration complexity, and provides more accurate node health monitoring.",2025-11-04T23:06:00.000Z,how-to,configuration,0.66,True,Describes shared health probe mode for externalTrafficPolicy=Cluster; includes AKS-specific configuration flags and behavior.,unchanged
@@ -442,10 +448,10 @@ https://learn.microsoft.com/en-us/azure/aks/stop-cluster-upgrade-api-breaking-ch
https://learn.microsoft.com/en-us/azure/aks/support-policies,Support policies,Support policies for Azure Kubernetes Service (AKS) - Azure Kubernetes Service,AKS support policies and platform limitations,"Learn about Azure Kubernetes Service (AKS) support policies, shared responsibility, and features that are in preview (or alpha or beta).","This article describes technical support policies and limitations for Azure Kubernetes Service (AKS). It also details agent node management, managed control plane components, third-party open-source components, and security or patch management.",2026-01-29T23:06:00.000Z,concept-article,limits-quotas,0.75,True,"Support policies document concrete limitations (supported versions, node OS support windows, preview feature constraints) that function as service limits and constraints.",unchanged https://learn.microsoft.com/en-us/azure/aks/supported-kubernetes-versions,Supported Kubernetes versions,Supported Kubernetes Versions in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,AKS supported Kubernetes versions and lifecycle rules,Learn the Kubernetes version support policy and lifecycle of clusters in Azure Kubernetes Service (AKS).,"The Kubernetes community releases minor versions roughly every four months. Minor version releases include new features and improvements. Patch releases are more frequent (sometimes weekly) and are intended for critical bug fixes within a minor version. Patch releases include fixes for security vulnerabilities or major bugs. 
Important Starting on November 30, 2025, Azure Kubernetes Service (AKS) no longer supports or provides security updates for Azure Linux 2.0. The Azure Linux 2.0 node image is f",2026-03-24T02:09:00.000Z,concept-article,limits-quotas,0.7,True,"Version support and lifecycle pages for AKS typically include exact support windows, deprecation dates, and version-specific constraints (for example, how many minor versions are supported, exact end-of-support dates like November 30, 2025 for Azure Linux 2.0). These are concrete, time-bound limits that change over time and are not reliably known from model pretraining, fitting the limits-quotas category best.",unchanged https://learn.microsoft.com/en-us/azure/aks/system-assigned-managed-identity,Use a system-assigned managed identity,Use a System-Assigned Managed Identity in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure system-assigned managed identity for AKS clusters,"This article explains how to enable a system-assigned managed identity on a new or existing AKS cluster, get the principal ID of the system-assigned managed identity, and add a role assignment for the","This article explains how to enable a system-assigned managed identity on a new or existing AKS cluster, get the principal ID of the system-assigned managed identity, and add a role assignment for the system-assigned managed identity.",2026-04-01T22:09:00.000Z,how-to,security,0.7,True,"Page explains how to enable and use a system-assigned managed identity for AKS, including obtaining the principal ID and adding role assignments. This is product-specific security/identity configuration (managed identity usage and RBAC role assignment steps) that goes beyond generic concepts. 
It fits the security sub-skill because it covers identity configuration and role assignment details for AKS.",unchanged
-https://learn.microsoft.com/en-us/azure/aks/troubleshoot-container-network-insights-agent,Troubleshoot Container Network Insights Agent issues,Troubleshoot Container Network Insights Agent on AKS - Azure Kubernetes Service,Troubleshoot Container Network Insights Agent issues on AKS,"Troubleshoot common issues you might encounter when deploying, configuring, or using Container Network Insights Agent on Azure Kubernetes Service (AKS).","This article covers common issues you might encounter when deploying, setting up, or using Container Network Insights Agent on AKS. Each section follows aSymptom → Cause → Resolutionformat. For deployment instructions, seeDeploy and use Container Network Insights Agent on AKS.",2026-04-16T11:03:00.000Z,troubleshooting,troubleshooting,0.95,True,"Explicitly described as organized in Symptom → Cause → Resolution format for common issues when deploying/configuring/using the agent, which matches the troubleshooting criteria with product-specific diagnostics and resolutions.",new
+https://learn.microsoft.com/en-us/azure/aks/troubleshoot-container-network-insights-agent,Troubleshoot Container Network Insights Agent issues,Troubleshoot Container Network Insights Agent on AKS - Azure Kubernetes Service,Troubleshoot Container Network Insights Agent issues on AKS,"Troubleshoot common issues you might encounter when deploying, configuring, or using Container Network Insights Agent on Azure Kubernetes Service (AKS).","This article covers common issues you might encounter when deploying, setting up, or using Container Network Insights Agent on AKS. Each section follows a Symptom → Cause → Resolution format. 
For deployment instructions, see Deploy and use Container Network Insights Agent on AKS.",2026-04-16T11:03:00.000Z,troubleshooting,troubleshooting,0.95,True,"Explicitly described as organized in Symptom → Cause → Resolution format for common issues when deploying/configuring/using the agent, which matches the troubleshooting criteria with product-specific diagnostics and resolutions.",unchanged https://learn.microsoft.com/en-us/azure/aks/troubleshoot-source-network-address-translation,Troubleshoot SNAT issues,Troubleshoot SNAT Port Exhaustion for a Standard Load Balancer in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Troubleshoot SNAT port exhaustion for AKS load balancers,Learn how to troubleshoot SNAT port exhaustion issues for a Standard Load Balancer in Azure Kubernetes Service (AKS).,"If you know that you're starting many outbound TCP or UDP connections to the same destination IP address and port, and you observe failing outbound connections or support notifies you that you're exhausting SNAT ports, you have several general mitigation options. Review these options and decide what's best for your scenario. It's possible that one or more can help manage your scenario. For detailed information, review the outbound connections troubleshooting guide. The root cause of SNAT exhausti",2025-11-04T23:06:00.000Z,troubleshooting-general,troubleshooting,0.8,True,Troubleshooting guide for SNAT exhaustion with Standard Load Balancer; includes symptom → cause → mitigation mappings and AKS-specific configuration options.,unchanged https://learn.microsoft.com/en-us/azure/aks/troubleshoot-udp-packet-drops,Diagnose and solve UDP packet drops,Diagnose and solve UDP packet drops in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Troubleshoot UDP packet drops in AKS clusters,Learn how to diagnose and solve UDP packet drops in Azure Kubernetes Service (AKS).,"User Datagram Protocol (UDP) is a connectionless protocol used within managed AKS clusters. 
UDP packets are sent without any guarantee of delivery, reliability, or order, as they don’t establish a connection before data transfer. This means that UDP packets can be lost, duplicated, or arrive out of order at the destination because of multiple reasons. This article describes how to diagnose and solve UDP packet drop issues caused by a small read buffer which could overflow in cases where you have",2024-08-01T20:29:00.000Z,troubleshooting,troubleshooting,0.9,True,"Explicitly a diagnose-and-solve article; likely maps UDP packet loss symptoms to causes (small read buffer, node/network settings) and AKS-specific remediation steps.",unchanged -https://learn.microsoft.com/en-us/azure/aks/trusted-access-feature,Enable access to AKS clusters using Trusted Access,Get secure resource access to Azure Kubernetes Service (AKS) using Trusted Access - Azure Kubernetes Service,Configure Trusted Access for secure AKS API access,Learn how to use the Trusted Access feature to give Azure resources access to Azure Kubernetes Service (AKS) clusters.,"This article shows you how to get secure access for your Azure services to your Kubernetes API server in Azure Kubernetes Service (AKS) using Trusted Access. The Trusted Access feature gives services secure access to AKS API server by using the Azure back end without requiring a private endpoint. 
Instead of relying on identities that haveMicrosoft Entrapermissions, this feature can use your system-assigned managed identity to authenticate with the managed services and applications that you want ",2024-11-05T22:59:00.000Z,how-to,security,0.8,True,"Describes Trusted Access feature; likely includes configuration of bindings between AKS and other Azure services, with specific roles and parameters.",unchanged +https://learn.microsoft.com/en-us/azure/aks/trusted-access-feature,Trusted Access for Azure services,Get secure resource access to Azure Kubernetes Service (AKS) using Trusted Access - Azure Kubernetes Service,Configure Trusted Access from Azure services to AKS,Learn how to use the Trusted Access feature to give Azure resources access to Azure Kubernetes Service (AKS) clusters.,"This article shows you how to get secure access for your Azure services to your Kubernetes API server in Azure Kubernetes Service (AKS) using Trusted Access. The Trusted Access feature gives services secure access to AKS API server by using the Azure back end without requiring a private endpoint. 
Instead of relying on identities that have Microsoft Entra permissions, this feature can use your system-assigned managed identity to authenticate with the managed services and applications that you want ",2024-11-05T22:59:00.000Z,how-to,security,0.78,True,Shows how to use Trusted Access so Azure services can securely reach the AKS API server via backend integration and managed identities; includes product-specific security configuration patterns.,new https://learn.microsoft.com/en-us/azure/aks/tutorial-kubernetes-deploy-application,5 - Deploy containerized application,Kubernetes on Azure tutorial - Deploy an application to Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,"In this Azure Kubernetes Service (AKS) tutorial, you deploy a multi-container application to your cluster using images stored in Azure Container Registry.","Kubernetes provides a distributed platform for containerized applications. You build and deploy your own applications and services into a Kubernetes cluster and let the cluster manage the availability and connectivity. In this tutorial, you deploy a sample application into a Kubernetes cluster. You learn how to: Tip With AKS, you can use the following approaches for configuration management: GitOps: Enables declarations of your cluster's state to automatically apply to the cluster. 
To learn how ",2024-11-08T22:59:00.000Z,tutorial,,0.3,False,Application deployment tutorial; mentions GitOps conceptually but does not indicate detailed configuration matrices or quotas.,unchanged https://learn.microsoft.com/en-us/azure/aks/tutorial-kubernetes-deploy-azure-container-storage,4 - Deploy Azure Container Storage,Kubernetes on Azure tutorial - Use Azure Container Storage - Azure Kubernetes Service,,"In this Azure Kubernetes Service (AKS) tutorial, you learn how to deploy Azure Container Storage on an AKS cluster and create volumes.","This tutorial introduces Azure Container Storage and demonstrates how to deploy and manage container-native storage for applications running on Azure Kubernetes Service (AKS). If you don't want to deploy Azure Container Storage now, you can skip this tutorial and proceed directly to Deploy an application in AKS. You won't need Azure Container Storage for the basic storefront application in this tutorial series. Azure Container Storage simplifies the management of stateful applications in Kubernet",2026-02-18T18:36:00.000Z,tutorial,,0.3,False,"Tutorial introducing Azure Container Storage on AKS; summary suggests basic usage, not detailed limits or config parameter tables.",unchanged https://learn.microsoft.com/en-us/azure/aks/tutorial-kubernetes-deploy-cluster,3 - Create Kubernetes cluster,Kubernetes on Azure tutorial - Create an Azure Kubernetes Service (AKS) cluster - Azure Kubernetes Service,,"In this Azure Kubernetes Service (AKS) tutorial, you learn how to create an AKS cluster and use kubectl to connect to the Kubernetes main node.","Kubernetes provides a distributed platform for containerized applications. With Azure Kubernetes Service (AKS), you can quickly create a production ready Kubernetes cluster. In this tutorial, you deploy a Kubernetes cluster in AKS. 
You learn how to:",2025-03-12T22:00:00.000Z,tutorial,,0.2,False,"Tutorial for creating an AKS cluster; uses default settings and basic commands, not deep configuration or decision guidance.",unchanged @@ -457,10 +463,10 @@ https://learn.microsoft.com/en-us/azure/aks/tutorial-kubernetes-upgrade-cluster, https://learn.microsoft.com/en-us/azure/aks/understand-aks-costs,Understand AKS usage and costs,Understand Azure Kubernetes Service (AKS) usage and costs - Azure Kubernetes Service,,"Understand Azure Kubernetes Service (AKS) usage and costs, including allocation, monitoring, analytics, and anomaly management.",This article provides resources you can use to better understand your Azure Kubernetes Service (AKS) usage and costs and identify cost optimization opportunities.,2025-01-30T23:02:00.000Z,how-to,,0.45,False,High-level guidance and links about understanding AKS usage and costs; appears more like a resource index than detailed configuration or decision matrices.,unchanged https://learn.microsoft.com/en-us/azure/aks/update-azure-cni,Update Azure CNI IPAM and dataplane,Update Azure CNI IP Address Management (IPAM) Mode and Data Plane Technology - Azure Kubernetes Service,Update AKS clusters to new Azure CNI modes,Learn how to update existing Azure Kubernetes Service (AKS) clusters to use the latest Azure CNI IPAM modes and data plane technologies.,Existing Azure Kubernetes Service (AKS) clusters inevitably need an update to newer IP assignment management (IPAM) modes and data plane technologies to access the latest features and supportability. This article provides guidance on updating an existing AKS cluster to use Azure CNI Overlay for the IPAM mode and Azure CNI Powered by Cilium as the data plane.,2026-03-22T17:03:00.000Z,how-to,configuration,0.7,True,"Guidance on updating existing AKS clusters to Azure CNI Overlay and Azure CNI powered by Cilium. 
Such an article typically includes specific CLI flags, configuration parameters, and supported combinations for IPAM modes and data planes, which are product-specific configuration details rather than generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/aks/update-azure-cni,Upgrade AKS IPAM and dataplane,Update Azure CNI IP Address Management (IPAM) Mode and Data Plane Technology - Azure Kubernetes Service,Update AKS clusters to new Azure CNI modes (duplicate),Learn how to update existing Azure Kubernetes Service (AKS) clusters to use the latest Azure CNI IPAM modes and data plane technologies.,Existing Azure Kubernetes Service (AKS) clusters inevitably need an update to newer IP assignment management (IPAM) modes and data plane technologies to access the latest features and supportability. This article provides guidance on updating an existing AKS cluster to use Azure CNI Overlay for the IPAM mode and Azure CNI Powered by Cilium as the data plane.,2026-03-22T17:03:00.000Z,how-to,configuration,0.7,True,Same URL and description as index 3; contains the same product-specific configuration guidance for changing Azure CNI IPAM mode and data plane technology in AKS.,unchanged -https://learn.microsoft.com/en-us/azure/aks/update-credentials,Update cluster credentials,Update or rotate the credentials for an Azure Kubernetes Service (AKS) cluster - Azure Kubernetes Service,Rotate AKS service principal and Entra credentials,Learn how update or rotate the service principal or Microsoft Entra Application credentials for an Azure Kubernetes Service (AKS) cluster.,"AKS clusters created with a service principal have a one-year expiration time. As you near the expiration date, you can reset the credentials to extend the service principal for an additional period of time. You might also want to update, or rotate, the credentials as part of a defined security policy. 
AKS clusters integrated with Microsoft Entra ID as an authentication provider have two more identities: the Microsoft Entra Server App and the Microsoft Entra Client App. This article details how to",2024-08-01T20:29:00.000Z,how-to,security,0.78,True,"Details rotation procedures and timing for AKS service principals and Entra apps, including expiration behavior and commands.",unchanged -https://learn.microsoft.com/en-us/azure/aks/update-kms-key-vault,Update key vault mode,Update the Key Vault Mode for an Azure Kubernetes Service (AKS) Cluster with KMS Etcd Encryption (legacy) - Azure Kubernetes Service,Update Key Vault mode for AKS KMS etcd,Learn how to update the key vault mode from public to private or private to public for an AKS cluster with Key Management Service (KMS) etcd encryption.,This article shows you how to update the key vault mode from public to private or private to public for an Azure Kubernetes Service (AKS) cluster with Key Management Service (KMS) etcd encryption.,2025-12-22T08:00:00.000Z,how-to,security,0.7,True,"Shows how to switch Key Vault mode between public and private for AKS KMS etcd encryption, including specific commands and parameters unique to this feature.",unchanged +https://learn.microsoft.com/en-us/azure/aks/update-credentials,Rotate cluster credentials,Update or rotate the credentials for an Azure Kubernetes Service (AKS) cluster - Azure Kubernetes Service,Rotate AKS service principal and Entra app credentials,Learn how to update or rotate the service principal or Microsoft Entra Application credentials for an Azure Kubernetes Service (AKS) cluster.,"AKS clusters created with a service principal have a one-year expiration time. As you near the expiration date, you can reset the credentials to extend the service principal for an additional period of time. You might also want to update, or rotate, the credentials as part of a defined security policy.
AKS clusters integrated with Microsoft Entra ID as an authentication provider have two more identities: the Microsoft Entra Server App and the Microsoft Entra Client App. This article details how to",2024-08-01T20:29:00.000Z,how-to,security,0.72,True,Details how to update or rotate service principal and Entra application credentials for AKS clusters; includes identity objects and security-focused operational steps specific to AKS.,new +https://learn.microsoft.com/en-us/azure/aks/update-kms-key-vault,Update key vault mode for legacy KMS,Update the Key Vault Mode for an Azure Kubernetes Service (AKS) Cluster with KMS Etcd Encryption (legacy) - Azure Kubernetes Service,Change Key Vault connectivity mode for AKS KMS,Learn how to update the key vault mode from public to private or private to public for an AKS cluster with Key Management Service (KMS) etcd encryption.,This article shows you how to update the key vault mode from public to private or private to public for an Azure Kubernetes Service (AKS) cluster with Key Management Service (KMS) etcd encryption.,2025-12-22T08:00:00.000Z,how-to,security,0.7,True,Operational guide to switch Key Vault mode (public/private) for AKS clusters using KMS etcd encryption; contains specific security configuration steps tied to AKS and Key Vault.,new https://learn.microsoft.com/en-us/azure/aks/upgrade-aks-control-plane,Upgrade cluster control plane,Upgrade the Azure Kubernetes Service (AKS) cluster control plane - Azure Kubernetes Service,Upgrade the AKS control plane independently,Learn how to upgrade the control plane of an Azure Kubernetes Service (AKS) cluster to get the latest Kubernetes version features and security updates.,"Azure Kubernetes Service (AKS) clusters consist of two main components: the control plane managed by Azure and the node pools where your workloads run.
This article focuses on upgrading the control plane independently, which allows you to adopt new Kubernetes versions for API server features while separately managing node pool upgrades.",2025-12-23T23:05:00.000Z,how-to,deployment,0.7,True,"Explains how to upgrade the control plane separately from node pools, including AKS-specific upgrade behavior and constraints.",unchanged -https://learn.microsoft.com/en-us/azure/aks/upgrade-aks-faq,Upgrade FAQ,AKS Upgrades Frequently Asked Questions,Resolve common AKS cluster upgrade issues (FAQ),Frequently asked questions about Azure Kubernetes Service (AKS),Answers to frequently asked questions about Azure Kubernetes Service (AKS) upgrades.,2026-01-07T23:08:00Z,faq,troubleshooting,0.7,True,"Upgrade FAQ typically maps specific upgrade symptoms and questions to causes and resolutions, which is troubleshooting-style expert knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/aks/upgrade-aks-faq,Upgrade FAQ,AKS Upgrades Frequently Asked Questions,Resolve common Azure AKS upgrade issues and questions,Frequently asked questions about Azure Kubernetes Service (AKS),Answers to frequently asked questions about Azure Kubernetes Service (AKS) upgrades.,2026-04-23T06:10:00Z,faq,troubleshooting,0.7,True,"An upgrade FAQ for AKS typically includes specific error messages, unsupported scenarios, version skew rules, and step-by-step guidance for resolving upgrade problems. 
This is organized around common symptoms and questions unique to AKS upgrades, which fits the troubleshooting category better than general guidance.",updated https://learn.microsoft.com/en-us/azure/aks/upgrade-aks-node-pools-rolling,Customize node surge upgrade,Configure rolling upgrades for Azure Kubernetes Service (AKS) node pools - Azure Kubernetes Service,Configure AKS node pool rolling upgrade settings,"Learn how to configure and customize rolling upgrades for Azure Kubernetes Service (AKS) node pools, including surge settings, drain timeout, and soak time.","A rolling upgrade strategy upgrades nodes one at a time (or a few at a time), minimizing workload disruption while ensuring the node pool remains available throughout the upgrade process. This article explains how to configure rolling upgrades for AKS node pools, including surge settings, drain timeout, and soak time.",2025-12-23T23:05:00.000Z,how-to,configuration,0.85,True,"Covers surge settings, drain timeout, soak time, and how to configure them—concrete AKS configuration parameters and ranges.",unchanged https://learn.microsoft.com/en-us/azure/aks/upgrade-basic-load-balancer-on-aks,Upgrade Basic Load Balancer,Upgrade from Basic Load Balancer on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Migrate AKS from Basic to Standard Load Balancer,Upgrade guidance for migrating Basic Load Balancer to Standard Load Balancer on AKS.,"In this article, you learn how to upgrade your Basic Load Balancer instances to Standard Load Balancer on Azure Kubernetes Service (AKS). We recommend using Standard Load Balancer for all production instances. It provides many key differences to your infrastructure. For guidance on upgrading from Basic Load Balancer to Standard Load Balancer outside of AKS, see the official guidance for Basic Load Balancer upgrade.
Important Starting on September 30, 2025, Azure Kubernetes Service (AKS) no longer s",2026-01-28T18:10:00.000Z,how-to,decision-making,0.7,True,Upgrade guidance between Basic and Standard SKUs with AKS-specific steps and recommendations; helps decide and execute migration path.,unchanged https://learn.microsoft.com/en-us/azure/aks/upgrade-cluster-components,Maintain and upgrade cluster components,Overview of upgrading Azure Kubernetes Service (AKS) clusters and components - Azure Kubernetes Service,Plan and manage AKS cluster and component upgrades,Learn about the various upgradeable components of an Azure Kubernetes Service (AKS) cluster and how to maintain them.,"An Azure Kubernetes Service (AKS) cluster needs to be periodically updated to ensure security and compatibility with the latest features. There are two components of an AKS cluster that are necessary to maintain: For Linux nodes, node image security patches and hotfixes may be performed without your initiation as unattended updates. These updates are automatically applied, but AKS doesn't automatically reboot your Linux nodes to complete the update process.
You're required to use a tool like kured",2025-12-23T23:05:00.000Z,overview,deployment,0.7,True,"Describes upgradeable components, unattended updates, and AKS-specific upgrade behavior, which are product-specific deployment lifecycle details.",unchanged @@ -478,19 +484,19 @@ https://learn.microsoft.com/en-us/azure/aks/upgrade-windows-os,Upgrade Windows O https://learn.microsoft.com/en-us/azure/aks/use-advanced-container-networking-services,Enable or disable ACNS,Enable Advanced Container Networking Services on Azure Kubernetes Service (AKS) Clusters - Azure Kubernetes Service,Configure Advanced Container Networking Services on AKS,Learn how to set up Advanced Container Networking Services on your Azure Kubernetes Service (AKS) clusters to enhance network observability and security for your containerized applications.,"This article describes how to enable and disable Advanced Container Networking Services, including Container Network Observability and Container Network Security, on your AKS clusters.",2026-03-02T23:10:00.000Z,how-to,configuration,0.68,True,"The page is a how-to for enabling/disabling Advanced Container Networking Services (Container Network Observability and Container Network Security) on AKS clusters. It likely includes AKS- and CNI-specific configuration flags, feature names, and parameter values unique to this product rather than just conceptual networking guidance, fitting the configuration sub-skill.
It does not focus on limits, troubleshooting, or architecture trade-offs.",unchanged https://learn.microsoft.com/en-us/azure/aks/use-amd-gpus,Use AMD GPUs,Use AMD GPUs on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure AMD GPU node pools on AKS,Learn how to use AMD GPUs for high performance compute or AI workloads on Azure Kubernetes Service (AKS).,"AMD GPU Virtual Machine (VM) sizes on Azure can provide flexibility in performance and cost, offering high compute capacity while allowing you to choose the right configuration for your workload requirements. AKS supports AMD GPU-enabled Linux node pools to run compute-intensive Kubernetes workloads. This article helps you provision nodes with schedulable AMD GPUs on new and existing AKS clusters.",2025-06-20T22:11:00.000Z,how-to,configuration,0.68,True,"Focuses on AMD GPU-enabled Linux node pools; likely includes specific VM sizes, driver setup, and AKS configuration parameters for AMD GPUs.",unchanged https://learn.microsoft.com/en-us/azure/aks/use-arm64-vms,Use Arm64 VMs for cost optimization,Use Arm64 Virtual Machines in Azure Kubernetes Service (AKS) for cost effectiveness - Azure Kubernetes Service,Use Arm64 node pools in AKS for cost efficiency,Learn how to create node pools using Arm64 Virtual Machines with Azure Kubernetes Service (AKS) for cost effectiveness.,"Arm-based processors (Arm64) are power-efficient and cost-effective, but don't compromise on performance. These Arm64 VMs are engineered to efficiently run dynamic, scalable workloads and can deliver up to 50% better price-performance than comparable x86-based VMs for scale-out workloads.
Because of their ability to scale workloads efficiently, Arm64 VMs are well-suited for web or application servers, open-source databases, cloud-native applications, gaming servers, and other high traffic applica",2025-12-17T12:02:00.000Z,how-to,decision-making,0.7,True,Discusses when to use Arm64 VMs for AKS with quantified price-performance claims and workload suitability; helps decide between Arm64 and x86 for specific scenarios.,unchanged -https://learn.microsoft.com/en-us/azure/aks/use-azure-ad-pod-identity,Microsoft Entra pod identity,Use Microsoft Entra pod-managed identities in AKS (Preview) - Azure Kubernetes Service,Use Entra pod-managed identities in AKS,Learn how to use Microsoft Entra pod-managed identities in Azure Kubernetes Service (AKS),Microsoft Entra pod-managed identities use Azure Kubernetes Service (AKS) primitives to associate managed identities for Azure resources and identities in Microsoft Entra ID with pods. Administrators create identities and bindings as Kubernetes primitives that allow pods to access Azure resources that rely on Microsoft Entra ID as an identity provider. Microsoft Entra pod-managed identities in AKS have the following limitations: Important We recommend you review Microsoft Entra Workload ID. This au,2025-12-17T06:03:00.000Z,how-to,security,0.78,True,"Describes AKS primitives for pod-managed identities, including limitations and configuration objects unique to this feature.",unchanged -https://learn.microsoft.com/en-us/azure/aks/use-azure-dedicated-hosts,Use Dedicated Hosts with AKS,Use Azure Dedicated Hosts in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Use Azure Dedicated Hosts for AKS nodes,Learn how to create an Azure Dedicated Hosts Group and associate it with Azure Kubernetes Service (AKS),"Azure Dedicated Host is a service that provides physical servers - able to host one or more virtual machines - dedicated to one Azure subscription.
Dedicated hosts are the same physical servers used in our data centers, provided as a resource. You can provision dedicated hosts within a region, availability zone, and fault domain. Then, you can place VMs directly into your provisioned hosts, in whatever configuration best meets your needs. Using Azure Dedicated Hosts for nodes with your AKS clust",2024-08-01T20:29:00.000Z,how-to,deployment,0.66,True,Covers how to bind AKS node pools to Dedicated Hosts with specific deployment constraints and configuration parameters.,unchanged +https://learn.microsoft.com/en-us/azure/aks/use-azure-ad-pod-identity,Microsoft Entra pod identity (legacy),Use Microsoft Entra pod-managed identities in AKS (Preview) - Azure Kubernetes Service,Use Entra pod-managed identities in AKS,Learn how to use Microsoft Entra pod-managed identities in Azure Kubernetes Service (AKS),Microsoft Entra pod-managed identities use Azure Kubernetes Service (AKS) primitives to associate managed identities for Azure resources and identities in Microsoft Entra ID with pods. Administrators create identities and bindings as Kubernetes primitives that allow pods to access Azure resources that rely on Microsoft Entra ID as an identity provider. Microsoft Entra pod-managed identities in AKS have the following limitations: Important We recommend you review Microsoft Entra Workload ID. This au,2025-12-17T06:03:00.000Z,how-to,limits-quotas,0.7,True,"Describes Microsoft Entra pod-managed identities and explicitly calls out limitations; these limits are product-specific expert knowledge, aligning with limits-quotas.
(Even though summary truncates, the limitations section is central to this article.)",new +https://learn.microsoft.com/en-us/azure/aks/use-azure-dedicated-hosts,Azure Dedicated Hosts for AKS nodes,Use Azure Dedicated Hosts in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Run AKS node pools on Azure Dedicated Hosts,Learn how to create an Azure Dedicated Hosts Group and associate it with Azure Kubernetes Service (AKS),"Azure Dedicated Host is a service that provides physical servers - able to host one or more virtual machines - dedicated to one Azure subscription. Dedicated hosts are the same physical servers used in our data centers, provided as a resource. You can provision dedicated hosts within a region, availability zone, and fault domain. Then, you can place VMs directly into your provisioned hosts, in whatever configuration best meets your needs. Using Azure Dedicated Hosts for nodes with your AKS clust",2024-08-01T20:29:00.000Z,how-to,configuration,0.64,True,"Covers creating a Dedicated Host group and associating it with AKS; includes AKS-specific configuration of host groups, regions, zones, and fault domains, which are configuration details beyond generic knowledge.",new https://learn.microsoft.com/en-us/azure/aks/use-azure-linux,Use Azure Linux for AKS,Use the Azure Linux container host on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Decide and use Azure Linux node pools on AKS,Learn how to use the Azure Linux container host on Azure Kubernetes Service (AKS),"The Azure Linux container host for AKS is an open-source Linux distribution created by Microsoft, and it's generally available as a container host on Azure Kubernetes Service (AKS). The Azure Linux container host provides reliability and consistency from cloud to edge across the AKS, AKS-HCI, and Arc products. You can deploy Azure Linux node pools in a new cluster, add Azure Linux node pools to your existing Ubuntu clusters, or migrate your Ubuntu nodes to Azure Linux nodes. 
To learn more about ",2025-12-17T12:02:00.000Z,how-to,decision-making,0.65,True,"Covers when and how to use Azure Linux container host, including migration from Ubuntu and node pool options; this is product-specific selection and migration guidance.",unchanged https://learn.microsoft.com/en-us/azure/aks/use-azure-linux-os-guard,Use Azure Linux with OS Guard for AKS,Use Azure Linux with OS Guard (preview) for Azure Kubernetes Service (AKS) - Azure Kubernetes Service,,"Learn about Azure Linux with OS Guard (preview) on Azure Kubernetes Service (AKS), including key features, region availability, and resources to get started.","This article provides an overview of Azure Linux with OS Guard (preview) on Azure Kubernetes Service (AKS), including key features, region availability, and resources to get started.",2025-11-05T23:12:00.000Z,overview,,0.4,False,"Preview feature overview (OS Guard) with regions and key features; summary suggests high-level info without detailed configs, limits, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/aks/use-azure-policy,Use Azure Policy,Use Azure Policy to secure your Azure Kubernetes Service (AKS) clusters - Azure Kubernetes Service,Secure AKS clusters using Azure Policy add-on,Learn how to use Azure Policy to secure your Azure Kubernetes Service (AKS) clusters.,"You can apply and enforce built-in security policies on your Azure Kubernetes Service (AKS) clusters using Azure Policy. Azure Policy helps enforce organizational standards and assess compliance at-scale. After you install the Azure Policy add-on for AKS, you can apply individual policy definitions or groups of policy definitions called initiatives (sometimes called policysets) to your cluster.
See Azure Policy built-in definitions for AKS for a complete list of AKS policy and initiative definitions",2024-08-01T20:29:00.000Z,how-to,security,0.8,True,"How-to for using Azure Policy with AKS; includes specific policy initiatives, add-on configuration, and enforcement behavior unique to AKS.",unchanged -https://learn.microsoft.com/en-us/azure/aks/use-byo-cni,Bring your own CNI,Bring Your Own Container Network Interface (CNI) Plugin with Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure custom CNI plugins for Azure Kubernetes Service,Learn how to bring your own Container Network Interface (CNI) plugin with Azure Kubernetes Service (AKS).,"Kubernetes doesn't provide a network interface system by default. Instead, network plugins provide this functionality. Azure Kubernetes Service (AKS) provides several supported Container Network Interface (CNI) plugins. For information on supported plugins, see Networking concepts for applications in Azure Kubernetes Service. The supported plugins meet most networking needs in Kubernetes. However, advanced AKS users might want the same CNI plugin that they used in on-premises Kubernetes environment",2026-04-16T08:00:00.000Z,how-to,configuration,0.7,True,"A BYO-CNI guide for AKS typically includes AKS-specific configuration parameters (for example, cluster creation flags, required daemonset settings, and node/network prerequisites) that are unique to AKS and not just generic Kubernetes CNI usage.
These concrete settings and constraints qualify as product-specific configuration expert knowledge rather than a conceptual overview.",updated +https://learn.microsoft.com/en-us/azure/aks/use-azure-policy,Enforce compliance with Azure Policy,Use Azure Policy to secure your Azure Kubernetes Service (AKS) clusters - Azure Kubernetes Service,Secure AKS clusters with Azure Policy definitions,Learn how to use Azure Policy to secure your Azure Kubernetes Service (AKS) clusters.,"You can apply and enforce built-in security policies on your Azure Kubernetes Service (AKS) clusters using Azure Policy. Azure Policy helps enforce organizational standards and assess compliance at-scale. After you install the Azure Policy add-on for AKS, you can apply individual policy definitions or groups of policy definitions called initiatives (sometimes called policysets) to your cluster. See Azure Policy built-in definitions for AKS for a complete list of AKS policy and initiative definitions",2024-08-01T20:29:00.000Z,how-to,security,0.76,True,"Describes installing the Azure Policy add-on for AKS and applying built-in security policies/initiatives; includes product-specific policy definitions and enforcement configuration, which are security settings.",new +https://learn.microsoft.com/en-us/azure/aks/use-byo-cni,Bring your own CNI,Bring Your Own Container Network Interface (CNI) Plugin with Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure custom CNI plugins for Azure Kubernetes Service,Learn how to bring your own Container Network Interface (CNI) plugin with Azure Kubernetes Service (AKS).,"Kubernetes doesn't provide a network interface system by default. Instead, network plugins provide this functionality. Azure Kubernetes Service (AKS) provides several supported Container Network Interface (CNI) plugins. For information on supported plugins, see Networking concepts for applications in Azure Kubernetes Service. The supported plugins meet most networking needs in Kubernetes.
However, advanced AKS users might want the same CNI plugin that they used in on-premises Kubernetes environment",2026-04-16T08:00:00.000Z,how-to,configuration,0.7,True,"A BYO-CNI guide for AKS typically includes AKS-specific configuration parameters (for example, cluster creation flags, required daemonset settings, and node/network prerequisites) that are unique to AKS and not just generic Kubernetes CNI usage. These concrete settings and constraints qualify as product-specific configuration expert knowledge rather than a conceptual overview.",unchanged https://learn.microsoft.com/en-us/azure/aks/use-capacity-reservation-groups,Assign capacity reservation groups to node pools,Assign Capacity Reservation Groups to Node Pools in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Use capacity reservation groups with AKS node pools,Learn how to use capacity reservation groups with node pools in Azure Kubernetes Service (AKS) to guarantee allocated capacity for your node pools.,"As your workload demands change, you can associate existing capacity reservation groups (CRGs) to your Azure Kubernetes Service (AKS) node pools to guarantee allocated capacity for them. Capacity reservation groups allow you to reserve compute capacity in an Azure region or availability zone for any duration of time. This feature is useful for workloads that require guaranteed capacity, such as those with predictable traffic patterns or those that need to meet specific performance requirements.
In",2025-12-02T06:03:00.000Z,how-to,decision-making,0.7,True,"Explains when and how to associate CRGs with node pools, including scenarios requiring guaranteed capacity and performance, which are AKS-specific capacity-planning decisions.",unchanged -https://learn.microsoft.com/en-us/azure/aks/use-cvm,Use Confidential Virtual Machines,Use Confidential Virtual Machines (CVM) in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Run AKS workloads on Confidential VMs,Learn how to create Confidential Virtual Machines (CVM) node pools with Azure Kubernetes Service (AKS),"Confidential Virtual Machines (CVM) offer strong security and confidentiality for tenants. CVMs offer VM based Hardware Trusted Execution Environment (TEE) that leverage SEV-SNP security features to deny the hypervisor and other host management code access to VM memory and state, providing defense in depth protections against operator access. These features enable node pools with CVM to target the migration of highly sensitive container workloads to AKS without any code refactoring while benefiti",2025-12-09T23:11:00.000Z,how-to,security,0.68,True,Describes how to create CVM-backed node pools with AKS-specific configuration and constraints tied to SEV-SNP and TEE usage.,unchanged +https://learn.microsoft.com/en-us/azure/aks/use-cvm,Confidential VM nodes,Use Confidential Virtual Machines (CVM) in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Create and configure AKS node pools with Confidential VMs,Learn how to create Confidential Virtual Machines (CVM) node pools with Azure Kubernetes Service (AKS),"Confidential Virtual Machines (CVM) offer strong security and confidentiality for tenants. CVMs offer VM based Hardware Trusted Execution Environment (TEE) that leverage SEV-SNP security features to deny the hypervisor and other host management code access to VM memory and state, providing defense in depth protections against operator access.
These features enable node pools with CVM to target the migration of highly sensitive container workloads to AKS without any code refactoring while benefiti",2025-12-09T23:11:00.000Z,how-to,configuration,0.7,True,"How to create CVM node pools in AKS; typically includes specific CLI flags, SKU requirements, and node pool settings that are product-specific configuration details.",new https://learn.microsoft.com/en-us/azure/aks/use-etags,Use eTags for Concurrency Control,Enhancing Concurrency Control with Entity Tags (eTags) in Azure Kubernetes Service - Azure Kubernetes Service,Use eTags for concurrency control in AKS APIs,Learn how to use eTags (Entity Tags) to enable concurrency control and avoid racing conditions or overwriting scenarios.,"To prevent conflicting requests in Azure Kubernetes Service (AKS), eTags (Entity Tags) serve as unique identifiers that enable concurrency control. When a request to the cluster is made, the system checks whether the provided eTag matches the latest version stored in the database. If there is a mismatch, the request fails early, ensuring that no unintended overwrites occur.",2025-07-01T05:08:00.000Z,how-to,configuration,0.7,True,"Explains AKS control-plane behavior with eTags, including request/response patterns and failure semantics when tags mismatch, which are specific to AKS management APIs.",unchanged https://learn.microsoft.com/en-us/azure/aks/use-flyte,Deploy data and ML pipelines with Flyte,Build and deploy data and machine learning pipelines with Flyte on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Run Flyte data and ML pipelines on AKS,"Learn about Flyte, an open-source platform for building and deploying data and machine learning pipelines on Azure Kubernetes Service (AKS).","This article shows you how to use Flyte on Azure Kubernetes Service (AKS). 
Flyte is an open-source workflow orchestrator that unifies machine learning, data engineering, and data analytics stacks to help you build robust and reliable applications. When using Flyte as a Kubernetes-native workflow automation tool, you can focus on experimentation and providing business value without increasing your scope to infrastructure and resource management. Keep in mind that Flyte isn't officially supported ",2024-08-09T22:01:00.000Z,how-to,deployment,0.7,True,"Covers AKS-specific deployment and configuration of Flyte, including manifests and integration details that are not generic to Kubernetes.",unchanged -https://learn.microsoft.com/en-us/azure/aks/use-group-managed-service-accounts,Enable GMSA integration,Enable Group Managed Service Accounts (GMSA) for your Windows Server nodes on your Azure Kubernetes Service (AKS) cluster - Azure Kubernetes Service,Enable GMSA for Windows pods on AKS,Learn how to enable Group Managed Service Accounts (GMSA) for your Windows Server nodes on your Azure Kubernetes Service (AKS) cluster to secure your pods.,"Group Managed Service Accounts (GMSA) is a managed domain account for multiple servers that provides automatic password management, simplified service principal name (SPN) management, and the ability to delegate management to other administrators.
With Azure Kubernetes Service (AKS), you can enable GMSA on your Windows Server nodes, which allows containers running on Windows Server nodes to integrate with and be managed by GMSA.",2025-12-03T08:00:00.000Z,how-to,security,0.72,True,"Provides AKS- and Windows-specific configuration steps for Group Managed Service Accounts, including domain and node settings.",unchanged +https://learn.microsoft.com/en-us/azure/aks/use-group-managed-service-accounts,Use Group Managed Service Accounts,Enable Group Managed Service Accounts (GMSA) for your Windows Server nodes on your Azure Kubernetes Service (AKS) cluster - Azure Kubernetes Service,Enable GMSA for Windows nodes in AKS,Learn how to enable Group Managed Service Accounts (GMSA) for your Windows Server nodes on your Azure Kubernetes Service (AKS) cluster to secure your pods.,"Group Managed Service Accounts (GMSA) is a managed domain account for multiple servers that provides automatic password management, simplified service principal name (SPN) management, and the ability to delegate management to other administrators. With Azure Kubernetes Service (AKS), you can enable GMSA on your Windows Server nodes, which allows containers running on Windows Server nodes to integrate with and be managed by GMSA.",2025-12-03T08:00:00.000Z,how-to,security,0.74,True,"This page describes how to securely configure Group Managed Service Accounts for Windows Server nodes in AKS, including product-specific security configuration steps and parameters (GMSA-related settings, domain integration details, and how pods use these accounts).
This is concrete security configuration guidance unique to AKS + Windows + GMSA, matching the security sub-skill.",new https://learn.microsoft.com/en-us/azure/aks/use-kms-etcd-encryption,Use KMS etcd encryption (legacy),Use Key Management Service (KMS) Etcd Encryption in Azure Kubernetes Service (AKS) (legacy) - Azure Kubernetes Service,Configure legacy KMS etcd encryption in AKS,Learn how to use Key Management Service (KMS) etcd encryption for a public or private key vault with AKS using the legacy KMS experience.,"Important This article describes the legacy KMS experience for AKS. For new clusters running Kubernetes version 1.33 or later, we recommend using the new KMS data encryption experience, which offers platform-managed keys, customer-managed keys with automatic key rotation, and a simplified configuration experience. For conceptual information about data encryption options, see Data encryption at rest concepts for AKS. This article shows you how to turn on encryption at rest for a public or private ke",2026-01-20T23:08:00.000Z,how-to,security,0.76,True,"Legacy KMS etcd encryption setup with specific AKS and Key Vault configuration flags and modes (public/private), which are product-specific security settings.",unchanged -https://learn.microsoft.com/en-us/azure/aks/use-kms-v2,Migrate to KMS v2,Migrate to Key Management Service (KMS) v2 in Azure Kubernetes Service (AKS) (legacy) - Azure Kubernetes Service,Migrate AKS clusters from KMS v1 to v2,Learn how to migrate to KMS v2 for clusters with versions older than 1.27.,"Important This article applies to clusters using the legacy KMS experience that need to migrate from KMS v1 to KMS v2. For clusters running Kubernetes version 1.33 or later, we recommend using the new KMS data encryption experience, which offers platform-managed keys, customer-managed keys with automatic key rotation, and a simplified configuration experience.
In this article, you learn how to migrate to KMS v2 for clusters with versions older than 1.27. Beginning in AKS version 1.27, turning on t",2026-01-20T23:08:00.000Z,how-to,deployment,0.65,True,"Describes version-specific migration steps and constraints (e.g., Kubernetes version thresholds, behavior changes when enabling encryption) that are deployment- and product-specific.",unchanged +https://learn.microsoft.com/en-us/azure/aks/use-kms-v2,Migrate to KMS v2 (legacy),Migrate to Key Management Service (KMS) v2 in Azure Kubernetes Service (AKS) (legacy) - Azure Kubernetes Service,Migrate AKS clusters from KMS v1 to KMS v2,Learn how to migrate to KMS v2 for clusters with versions older than 1.27.,"Important This article applies to clusters using the legacy KMS experience that need to migrate from KMS v1 to KMS v2. For clusters running Kubernetes version 1.33 or later, we recommend using the new KMS data encryption experience, which offers platform-managed keys, customer-managed keys with automatic key rotation, and a simplified configuration experience. In this article, you learn how to migrate to KMS v2 for clusters with versions older than 1.27. Beginning in AKS version 1.27, turning on t",2026-01-20T23:08:00.000Z,how-to,decision-making,0.68,True,"Guides when and how to migrate from KMS v1 to v2 based on Kubernetes version and feature differences; includes recommendations and constraints for migration, fitting decision-making around upgrade paths.",new https://learn.microsoft.com/en-us/azure/aks/use-labels,Use labels,Use labels in an Azure Kubernetes Service (AKS) cluster - Azure Kubernetes Service,Use Kubernetes node pool labels effectively in AKS,Learn how to use labels in an Azure Kubernetes Service (AKS) cluster.,"Deploy and Explore If you have multiple node pools, you may want to add a label during node pool creation. Kubernetes labels handle the scheduling rules for nodes. You can add labels to a node pool anytime and apply them to all nodes in the node pool. 
In this how-to guide, you learn how to use labels in an Azure Kubernetes Service (AKS) cluster.",2025-11-18T19:02:00.000Z,how-to,configuration,0.6,True,"Describes how labels are applied at node pool creation and later, and how AKS handles them for scheduling; these are concrete, AKS-specific configuration patterns rather than generic label theory.",unchanged https://learn.microsoft.com/en-us/azure/aks/use-metrics-server-vertical-pod-autoscaler,Configure Metrics Server VPA,Configure Metrics Server VPA in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure Metrics Server VPA on AKS clusters,Learn how to vertically autoscale your Metrics Server pods on an Azure Kubernetes Service (AKS) cluster.,"Metrics Server is a scalable, efficient source of container resource metrics for Kubernetes built-in autoscaling pipelines. With Azure Kubernetes Service (AKS), vertical pod autoscaling is enabled for the Metrics Server. The Metrics Server is commonly used by other Kubernetes add-ons, like the Horizontal Pod Autoscaler. Vertical Pod Autoscaler (VPA) enables you to adjust the resource limit when the Metrics Server is experiencing consistent CPU and memory resource constraints.",2025-08-12T08:00:00.000Z,how-to,configuration,0.72,True,"Shows how to vertically autoscale Metrics Server pods; likely includes specific VPA objects, resource settings, and AKS-specific configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/aks/use-multiple-standard-load-balancer,Use multiple Standard Load Balancers,Use multiple load balancers in Azure Kubernetes Service (preview) - Azure Kubernetes Service,Scale AKS with multiple Standard Load Balancers,Learn how to configure multiple Standard SKU public load balancers to scale LoadBalancer Services in Azure Kubernetes Service (AKS).,"Azure Kubernetes Service (AKS) normally provisions one Standard Load Balancer (SLB) for all LoadBalancer Services in a cluster. 
Because each node NIC is limited to 300 inbound load‑balancing rules and 8 private‑link services, large clusters or port‑heavy workloads can quickly exhaust these limits. The multiple SLB preview removes that bottleneck by letting you create several SLBs inside the same cluster and shard nodes and Services across them. You define load‑balancer configurations, each tied to a pri",2025-07-16T17:07:00.000Z,how-to,limits-quotas,0.88,True,Explicitly mentions node NIC limits of 300 inbound rules and 8 private-link services; these numeric limits drive the multi-SLB pattern and are expert quota knowledge.,unchanged
Pod Sandboxing complements other security measures or data protection controls with your ",2025-11-18T16:04:00.000Z,how-to,security,0.74,True,How-to for enabling pod sandboxing with AKS-specific configuration and security implications for isolated pod VMs.,unchanged https://learn.microsoft.com/en-us/azure/aks/use-premium-v2-disks,Use Azure Premium SSD v2 disks,Enable Premium SSD v2 support on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure Premium SSD v2 disks for AKS workloads,Learn how to enable and configure Premium SSD v2 disks in an Azure Kubernetes Service (AKS) cluster.,"Azure Premium SSD v2 disks offer IO-intense enterprise workloads, a consistent submillisecond disk latency, and high IOPS and throughput. The performance (capacity, throughput, and IOPS) of Premium SSD v2 disks can be independently configured at any time, making it easier for more scenarios to be cost efficient while meeting performance needs. This article describes how to configure a new or existing AKS cluster to use Azure Premium SSD v2 disks.",2024-08-01T20:29:00.000Z,how-to,configuration,0.7,True,"Contains AKS-specific StorageClass and disk configuration parameters for Premium SSD v2, including required annotations/fields that are unique to AKS CSI integration.",unchanged https://learn.microsoft.com/en-us/azure/aks/use-psa,Pod Security Admission,Use Pod Security Admission in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure Pod Security Admission in AKS clusters,Learn how to enable and use Pod Security Admission with Azure Kubernetes Service (AKS).,"Pod Security Admission (PSA) uses labels to enforce Pod Security Standards policies on pods running in a namespace. In AKS, Pod Security Admission is enabled by default. For more information about Pod Security Admission and Pod Security Standards, see Enforce Pod Security Standards with namespace labels and Pod Security Standards. 
Pod Security Admission is a built-in policy solution for single cluster implementations. If you want to use an enterprise-grade policy, we recommend you use Azure policy.",2024-08-01T20:29:00.000Z,how-to,configuration,0.7,True,Duplicate of index 2; same AKS-specific PSA configuration guidance and labels usage.,unchanged -https://learn.microsoft.com/en-us/azure/aks/use-psa,Pod security policies,Use Pod Security Admission in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure Pod Security Admission in AKS clusters,Learn how to enable and use Pod Security Admission with Azure Kubernetes Service (AKS).,"Pod Security Admission (PSA) uses labels to enforce Pod Security Standards policies on pods running in a namespace. In AKS, Pod Security Admission is enabled by default. For more information about Pod Security Admission and Pod Security Standards, seeEnforce Pod Security Standards with namespace labelsandPod Security Standards. Pod Security Admission is a built-in policy solution for single cluster implementations. If you want to use an enterprise-grade policy, we recommend you useAzure policy.",2024-08-01T20:29:00.000Z,how-to,configuration,0.7,True,Covers enabling and using PSA in AKS with AKS-specific configuration details and namespace label usage beyond generic PSA concepts.,unchanged -https://learn.microsoft.com/en-us/azure/aks/use-system-pools,Use system node pools,Use system node pools in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Design and use system vs user node pools in AKS,Learn how to create and manage system node pools in Azure Kubernetes Service (AKS),"In Azure Kubernetes Service (AKS), nodes of the same configuration are grouped together intonode pools. Node pools contain the underlying VMs that run your applications. System node pools and user node pools are two different node pool modes for your AKS clusters. System node pools serve the primary purpose of hosting critical system pods such asCoreDNSandmetrics-server. 
User node pools serve the primary purpose of hosting your application pods. However, application pods can be scheduled on syst",2025-06-10T08:00:00.000Z,how-to,architecture-patterns,0.72,True,"Provides AKS-specific rules and recommendations for system node pools (what must run there, sizing, count) vs user pools, a core cluster design pattern.",unchanged +https://learn.microsoft.com/en-us/azure/aks/use-psa,Pod security policies (legacy),Use Pod Security Admission in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure Pod Security Admission policies in AKS,Learn how to enable and use Pod Security Admission with Azure Kubernetes Service (AKS).,"Pod Security Admission (PSA) uses labels to enforce Pod Security Standards policies on pods running in a namespace. In AKS, Pod Security Admission is enabled by default. For more information about Pod Security Admission and Pod Security Standards, see Enforce Pod Security Standards with namespace labels and Pod Security Standards. Pod Security Admission is a built-in policy solution for single cluster implementations. If you want to use an enterprise-grade policy, we recommend you use Azure policy.",2024-08-01T20:29:00.000Z,how-to,security,0.7,True,"Implementation-focused PSA article for AKS; likely includes specific namespace labels, policy levels, and AKS-specific behavior for enforcing Pod Security Standards, which are product-specific security configuration details.",new +https://learn.microsoft.com/en-us/azure/aks/use-system-pools,Use system node pools,Use system node pools in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Manage and configure AKS system node pools,Learn how to create and manage system node pools in Azure Kubernetes Service (AKS),"In Azure Kubernetes Service (AKS), nodes of the same configuration are grouped together into node pools. Node pools contain the underlying virtual machines (VM) that run your applications. 
System node pools and user node pools are two different node pool modes for your AKS clusters. This article explains how to manage system node pools in AKS. For information about how to use multiple node pools, see create node pools. A production AKS cluster with a single system node pool must contain at least t",2026-04-21T06:03:00.000Z,how-to,best-practices,0.68,True,"The article goes beyond concepts and includes AKS-specific operational guidance for system node pools (for example, requirements like a production cluster with a single system node pool must contain at least a certain number of nodes, and constraints on what can run on system vs. user pools). These are product-specific DO/DON'T style recommendations and constraints that an LLM is unlikely to know precisely from training, fitting best under best-practices.",updated https://learn.microsoft.com/en-us/azure/aks/use-tags,Use tags,Use Azure tags in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure and apply Azure tags in AKS,Learn how to use Azure provider tags to track resources in Azure Kubernetes Service (AKS).,"With Azure Kubernetes Service (AKS), you can set Azure tags on an AKS cluster and its related resources using Azure Resource Manager and the Azure CLI. You can also use Kubernetes manifests to set Azure tags for certain resources. Azure tags are a useful tracking resource for certain business processes, such as chargeback. This article explains how to set Azure tags for AKS clusters and related resources.",2026-04-10T06:03:00.000Z,concept-article,configuration,0.7,True,"The page explains how to set Azure tags on AKS clusters and related resources using Azure Resource Manager, Azure CLI, and Kubernetes manifests. 
It focuses on concrete, product-specific configuration actions (which resources can be tagged and how), rather than conceptual tagging theory, so it fits the configuration sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/aks/use-telepresence-aks,Use Telepresence to develop and test microservices locally,Use Telepresence to develop and test microservices locally - Azure Kubernetes Service,Use Telepresence with AKS for local microservice debugging,Learn how to use Telepresence to debug microservices on AKS,"Telepresence is a Cloud Native Computing Foundation (CNCF) Sandbox project created by the team at Ambassador Labs. Telepresence allows developers to run services locally on their development machine while connected to a remote Kubernetes cluster. This setup makes it easier to develop, debug, and test applications that interact with other services in the cluster without having to redeploy or rebuild the entire application in Kubernetes every time changes are made. Note Telepresence is an open-sour",2025-01-22T05:30:00.000Z,tutorial,integrations,0.7,True,Details how to integrate Telepresence with AKS clusters for local development and debugging; includes product-specific commands and setup.,unchanged https://learn.microsoft.com/en-us/azure/aks/use-trusted-launch,Deploy trusted launch on AKS,Trusted Launch with Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Enable Trusted Launch security for AKS nodes,"Learn how Trusted Launch protects the Azure Kubernetes Cluster (AKS) nodes against boot kits, rootkits, and kernel-level malware.","Trusted Launch improves the security of generation 2 virtual machines (VMs) by protecting against advanced and persistent attack techniques. It enables administrators to deploy AKS nodes, which contain the underlying virtual machines, with verified and signed bootloaders, OS kernels, and drivers. By using secure and measured boot, administrators gain insights and confidence of the entire boot chain's integrity. 
This article helps you understand this new feature, and how to implement it. Important",2025-08-05T22:07:00.000Z,how-to,security,0.76,True,"Describes how AKS uses Trusted Launch, including requirements around Gen2 VMs, secure/measured boot, and how to enable it for node pools—product-specific security configuration details.",unchanged @@ -536,12 +542,12 @@ https://learn.microsoft.com/en-us/azure/aks/windows-faq,Windows Server container https://learn.microsoft.com/en-us/azure/aks/windows-vs-linux-containers,Windows container considerations,Windows Container Considerations in Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Plan Windows vs Linux container workloads on AKS,See the Windows container considerations with Azure Kubernetes Service (AKS).,"When you create deployments that use Windows Server containers on Azure Kubernetes Service (AKS), there are a few differences relative to Linux deployments you should keep in mind. For a detailed comparison of the differences between Windows and Linux in upstream Kubernetes, seeWindows containers in Kubernetes. Some of the major differences include: Important Starting on March 01, 2026, Azure Kubernetes Service (AKS) no longer supports Windows Server 2019 node pools. Node pools running Kubernete",2026-01-28T18:10:00.000Z,overview,decision-making,0.7,True,"Lists AKS-specific differences and constraints for Windows containers (supported versions, features, timelines), guiding workload placement decisions.",unchanged https://learn.microsoft.com/en-us/azure/aks/workload-identity-cross-tenant,Configure cross-tenant workload identity,Configure cross-tenant workload identity on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Configure cross-tenant workload identity for AKS,Learn how to configure cross-tenant workload identity on Azure Kubernetes Service (AKS).,"In this article, you learn how to configure cross-tenant workload identity on Azure Kubernetes Service (AKS). 
Cross-tenant workload identity allows you to access resources in another tenant from your AKS cluster. In this example, you create an Azure Service Bus in one tenant and send messages to it from a workload running in an AKS cluster in another tenant. For more information on workload identity, see the Workload identity overview.",2024-08-01T20:29:00.000Z,how-to,security,0.78,True,Shows how to access resources in another tenant using AKS workload identity with specific cross-tenant configuration steps.,unchanged https://learn.microsoft.com/en-us/azure/aks/workload-identity-deploy-cluster,Deploy and configure workload identity for a cluster,Deploy and Configure an Azure Kubernetes Service (AKS) Cluster with Microsoft Entra Workload ID - Azure Kubernetes Service,Configure AKS cluster with Entra Workload ID,"This article shows you how to deploy an AKS cluster and configure it with Microsoft Entra Workload ID, including creating a managed identity, Kubernetes service account, and federated identity credent","In this article, you learn how to deploy and configure an Azure Kubernetes Service (AKS) cluster with Microsoft Entra Workload ID. The steps in this article include:",2026-04-07T22:09:00.000Z,how-to,security,0.75,True,"Article walks through configuring AKS with Microsoft Entra Workload ID, including creating a managed identity, Kubernetes service account, and federated identity credential. 
This implies specific RBAC/identity objects, configuration parameters, and bindings unique to AKS + Entra, which are product-specific security and authentication settings.",unchanged -https://learn.microsoft.com/en-us/azure/aks/workload-identity-migrate-from-pod-identity,Migrate your app from pod identity to workload identity,Migrate Azure Kubernetes Service (AKS) Pods to Microsoft Entra Workload ID - Azure Kubernetes Service,Migrate AKS pods from pod identity to Workload ID,Migrate AKS pods from pod-managed identities to Microsoft Entra Workload ID using Azure Identity SDK versions or migration sidecar approaches.,"Migrate AKS pods from pod-managed identities toMicrosoft Entra Workload ID(workload identity) using one of three approaches based on your currentAzure Identity SDKversion: latest SDK parallel deployment, migration sidecar proxy (Linux only), or SDK rewrite.",2026-01-06T23:11:00.000Z,how-to,security,0.76,True,"Provides migration strategies and patterns (parallel deployment, sidecar proxy, SDK rewrite) specific to AKS identity features.",unchanged +https://learn.microsoft.com/en-us/azure/aks/workload-identity-migrate-from-pod-identity,Migrate from pod identity to workload identity,Migrate Azure Kubernetes Service (AKS) Pods to Microsoft Entra Workload ID - Azure Kubernetes Service,Choose migration approach to Entra Workload ID for AKS,Migrate AKS pods from pod-managed identities to Microsoft Entra Workload ID using Azure Identity SDK versions or migration sidecar approaches.,"Migrate AKS pods from pod-managed identities to Microsoft Entra Workload ID (workload identity) using one of three approaches based on your current Azure Identity SDK version: latest SDK parallel deployment, migration sidecar proxy (Linux only), or SDK rewrite.",2026-01-06T23:11:00.000Z,how-to,decision-making,0.65,True,"Provides multiple migration approaches (parallel deployment, sidecar proxy, SDK rewrite) based on SDK version and scenario; helps decide which path to use with concrete 
guidance, fitting decision-making around migration.",new https://learn.microsoft.com/en-us/azure/aks/workload-identity-overview,Workload identity overview,Use a Microsoft Entra Workload ID on Azure Kubernetes Service (AKS) - Azure Kubernetes Service,Use Microsoft Entra Workload ID with AKS workloads,Learn about Microsoft Entra Workload ID for Azure Kubernetes Service (AKS) and how to migrate your application to authenticate using this identity.,"Workloads deployed on an AKS cluster require Microsoft Entra application credentials or managed identities to access Microsoft Entra protected resources, such as Azure Key Vault and Microsoft Graph. Microsoft Entra Workload ID integrates with the capabilities native to Kubernetes to federate with external identity providers, allowing you to assign workload identities to your workloads to authenticate and access other services and resources. Microsoft Entra Workload ID uses Service Account Token Vo",2025-12-10T23:07:00.000Z,overview,security,0.78,True,"Describes AKS-specific workload identity model, including federation with Entra and Kubernetes service account token usage.",unchanged
In this article, you learn how to access the Kubernetes API for a Kubernetes Fleet hub cluster.",2025-11-18T19:02:00.000Z,how-to,configuration,0.65,True,"Explains how to configure access (kubeconfig, auth) to the hub cluster API; product-specific access configuration steps and parameters.",unchanged https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-automated-deployments,Automated deployments overview,Azure Kubernetes Fleet Manager Automated Deployments conceptual overview,,Learn about the concepts behind Fleet Manager Automated Deployments which simplify the process of building and deploying your application from Git.,"This article provides a conceptual overview of Fleet Manager's Automated Deployments capability. Fleet Manager Automated Deployments simplify the process of taking your application source code from a GitHub repository and deploying it across one or more AKS clusters in your fleet. Once configured, every new commit you make runs the pipeline, resulting in updates to your application wherever it's deployed in your fleet. 
Important Azure Kubernetes Fleet Manager preview features are available on a ",2025-05-12T11:07:00.000Z,overview,,0.45,False,"Automated Deployments conceptual overview; focuses on high-level pipeline concept from Git, not detailed configuration matrices or limits.",unchanged https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-choosing-fleet,Choosing a fleet type,Choosing an Azure Kubernetes Fleet Manager option,Choose the right Azure Kubernetes Fleet Manager configuration,This article provides an overview of the various Azure Kubernetes Fleet Manager options and why you may choose a specific configuration.,This article provides an overview of the various Azure Kubernetes Fleet Manager options and the considerations you should use to guide your selection of a specific configuration.,2025-12-11T06:03:00.000Z,concept-article,decision-making,0.7,True,"Explicitly about choosing between Fleet Manager options; likely includes scenario-based guidance and trade-offs for different configurations, fitting decision-making.",unchanged -https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-cross-cluster-networking,Cross-cluster networking for Fleet Manager,Cross-cluster networking for Azure Kubernetes Fleet Manager,,This article provides a conceptual overview of Cross-cluster networking for Azure Kubernetes Fleet Manager.,Azure Kubernetes Fleet Manager provides a dedicated cross-cluster networking solution that extends the Kubernetes datapath across multiple clusters. Using cross-cluster networking enables any connected cluster to communicate directly with endpoints on any other connected cluster with full network‑policy enforcement. Using cross-cluster networking allowing clusters to publish services such that any connected cluster can call them as if they were local. 
Multiple cross-cluster network profiles can ,2026-04-14T08:00:00.000Z,concept-article,,0.1,False,"Described explicitly as a conceptual overview of cross-cluster networking for Azure Kubernetes Fleet Manager. Overviews of concepts and capabilities without configuration tables, numeric thresholds, or error mappings do not meet any expert-knowledge criteria.",updated +https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-cross-cluster-networking,Cross-cluster networking for Fleet Manager,Cross-cluster networking for Azure Kubernetes Fleet Manager,,This article provides a conceptual overview of Cross-cluster networking for Azure Kubernetes Fleet Manager.,Azure Kubernetes Fleet Manager provides a dedicated cross-cluster networking solution that extends the Kubernetes datapath across multiple clusters. Using cross-cluster networking enables any connected cluster to communicate directly with endpoints on any other connected cluster with full network‑policy enforcement. Using cross-cluster networking allows clusters to publish services such that any connected cluster can call them as if they were local. Multiple cross-cluster network profiles can ,2026-04-14T08:00:00.000Z,concept-article,,0.1,False,"Described explicitly as a conceptual overview of cross-cluster networking for Azure Kubernetes Fleet Manager. Overviews of concepts and capabilities without configuration tables, numeric thresholds, or error mappings do not meet any expert-knowledge criteria.",unchanged
Important Azure Kubernetes Fleet Manager preview features are available on a self-service, opt-in basis. Previews are provided ""as is"" and ""as available,"" and they're excluded from the service-level agreements and limited warranty. Azure Kubernetes Fleet Manager previews are partially covered by customer support on a best-effort basis",2025-05-12T11:07:00.000Z,concept-article,,0.3,False,Explains DNS-based multi-cluster load balancing conceptually; preview note and overview style suggest it lacks detailed configuration tables or numeric thresholds in the provided summary.,unchanged https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-eviction-disruption,Controlling eviction and disruption,Controlling eviction and disruption budgets for Azure Kubernetes Fleet Manager cluster resource placement,Control eviction and disruption budgets for Fleet Manager workloads,This article describes how to manage evictions and voluntary disruption for workloads placed by Fleet Manager's cluster resource placement.,"Administrators using Fleet Manager's cluster resource placement (CRP) can find they need to remove resources previously placed on member clusters, while ensuring that key resource placements aren't disrupted. In this article, we explore how you can use Fleet Manager's ClusterResourcePlacementEviction and ClusterResourcePlacementDisruptionBudget objects to achieve these goals. 
Note If you aren't already familiar with Fleet Manager's cluster resource placement (CRP), read the conceptual overview of res",2025-04-24T05:31:00.000Z,concept-article,best-practices,0.65,True,Describes specific CRP eviction and disruption budget objects and how to use them to avoid disruption; product-specific operational guidance.,unchanged https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-fleet,Fleets and member clusters,Azure Kubernetes Fleet Manager fleets and member clusters,,This article provides a conceptual overview of Azure Kubernetes Fleet Manager and member clusters.,This article provides a conceptual overview of fleets and member clusters in Azure Kubernetes Fleet Manager.,2026-04-02T11:02:00.000Z,concept-article,,0.0,False,"Explicitly a conceptual overview of fleets and member clusters; no detailed limits, configs, or troubleshooting content.",unchanged
In this article, we look at how to use the",2025-12-04T06:03:00.000Z,concept-article,architecture-patterns,0.7,True,"Describes how to use the whenToTakeOver property to manage conflicts when adding clusters with existing workloads. This is a product-specific decision pattern (how and when Fleet should assume control) with clear scenario-based guidance, aligning with architecture-patterns.",unchanged https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-preview-api-lifecycle,Preview API lifecycle,Azure Kubernetes Fleet Manager Preview API lifecycle,Understand lifecycle limits for Fleet Manager preview APIs,Learn about the Azure Kubernetes Fleet Manager preview API lifecycle.,"The Azure Kubernetes Fleet Manager preview Azure Resource Manager (ARM) REST APIs (APIs that end in-preview) have a lifespan of approximately one year from their release date. This means that you can expect the 2024-05-02-preview API to be deprecated around May 3, 2025. We love when people try our preview features and give us feedback, so we encourage you to use the preview APIs and the tools built on them. After an API version is deprecated, it will no longer function. 
We recommend you routinel",2026-03-09T08:00:00.000Z,concept-article,limits-quotas,0.7,True,"Contains a specific, time-bound lifecycle rule for preview ARM API versions (approximately one-year lifespan with concrete example dates), which is a product-specific limit/constraint not generally known from training.",unchanged https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-rbac,Azure RBAC and Fleet Manager,Grant access to Azure Kubernetes Fleet Manager resources with Azure role-based access control,Configure RBAC roles for Azure Kubernetes Fleet Manager,This article provides an overview of the Azure role-based access control roles that can be used to access Azure Kubernetes Fleet Manager resources.,This article provides an overview of the Azure RBAC roles that you can use with Azure Kubernetes Fleet Manager. Azure role-based access control (Azure RBAC)is an authorization system built on Azure Resource Manager that provides fine-grained access management to Azure resources.,2026-04-08T06:03:00.000Z,concept-article,security,0.7,True,"RBAC articles for a specific Azure service typically list concrete built-in role names, their exact permissions/scopes, and how they apply to that resource type. Those role definitions and scope mappings are product-specific security configuration details that qualify as expert knowledge under the security category.",unchanged -https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-resource-placement,Introduction to resource placement,Introducing Azure Kubernetes Fleet Manager intelligent resource placement,,This article describes the concepts of Azure Kubernetes Fleet Manager intelligent resource placement,"Applies to✔️ Fleet Manager with hub cluster Managing Kubernetes resources across multiple clusters presents significant challenges for both platform administrators and application developers. 
As organizations scale their Kubernetes infrastructure beyond a single cluster, they often encounter complexities related to resource distribution, consistency, and manual management overhead. The traditional approach of managing each cluster independently creates operational silos that become increasingly ",2026-04-07T08:00:00.000Z,concept-article,,0.2,False,"Introduced as a concepts article for intelligent resource placement across multiple clusters. The summary focuses on challenges and high-level capabilities, without evidence of configuration tables, limits, or detailed decision matrices. This is likely conceptual guidance rather than the kind of specific, product-level expert knowledge required.",unchanged +https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-resource-placement,Introduction to resource placement,Introducing Azure Kubernetes Fleet Manager intelligent resource placement,,This article describes the concepts of Azure Kubernetes Fleet Manager intelligent resource placement,"Applies to✔️ Fleet Manager with hub cluster Managing Kubernetes resources across multiple clusters presents significant challenges for both platform administrators and application developers. As organizations scale their Kubernetes infrastructure beyond a single cluster, they often encounter complexities related to resource distribution, consistency, and manual management overhead. 
The traditional approach of managing each cluster independently creates operational silos that become increasingly ",2026-04-23T08:00:00.000Z,concept-article,,0.2,False,"Conceptual description of intelligent resource placement in Azure Kubernetes Fleet Manager; no concrete limits, configuration tables, error codes, or decision matrices with quantified trade-offs are evident from the summary.",updated https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-rollout-strategy,Controlling placement rollout order,Defining a rollout strategy for Azure Kubernetes Fleet Manager resource placement,Design rollout strategies for Fleet resource placement,This article describes how to define a rollout strategy for Fleet Manager's resource placement.,"Applies to:✔️ Fleet Manager with hub cluster During the lifetime of a resource placement (cluster-scoped ClusterResourcePlacement or namespace-scoped ResourcePlacement), changes might be made which can result in one of the following scenarios: Most scenarios can lead to service interruptions as workloads running on member clusters might temporarily become unavailable as Fleet Manager dispatches updated resources. Clusters that are no longer selected lose all placed resources, resulting in lost traf",2026-03-06T06:07:00.000Z,concept-article,architecture-patterns,0.65,True,"Focuses on defining rollout strategies for resource placement to avoid service interruptions and manage cluster selection changes. 
This is a product-specific pattern for how and when to roll out changes across clusters, with trade-offs around availability and placement behavior, fitting architecture-patterns better than generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-update-orchestration,Safe multi-cluster upgrades,Update Kubernetes and node images across multiple member clusters,,This article describes the concept of update orchestration across multiple clusters.,"Applies to:✔️ Fleet Manager ✔️ Fleet Manager with hub cluster Platform admins managing a large number of clusters often have problems with staging the updates of multiple clusters (for example, upgrading node OS image or Kubernetes versions) in a safe and predictable way. To address this challenge, Azure Kubernetes Fleet Manager allows you to orchestrate updates across multiple clusters using update runs. Update runs consist of stages, groups, and strategies and can be applied either manually, for",2026-04-08T08:00:00.000Z,concept-article,,0.2,False,"Described as a concepts article about update orchestration (stages, groups, strategies) across clusters. This is primarily conceptual behavior and workflow; there is no indication of numeric limits, config tables, error codes, or detailed settings that would qualify as expert knowledge in the defined sub-skill types.",unchanged https://learn.microsoft.com/en-us/azure/kubernetes-fleet/faq,FAQ,Frequently asked questions - Azure Kubernetes Fleet Manager,,This article covers the frequently asked questions for Azure Kubernetes Fleet Manager,Applies to:✔️ Fleet Manager ✔️ Fleet Manager with hub cluster This article covers the frequently asked questions for Azure Kubernetes Fleet Manager.,2026-03-25T08:00:00.000Z,concept-article,,0.3,False,"The FAQ page is likely a mix of high-level Q&A and general guidance. The summary does not indicate structured limits, configuration tables, or detailed error mappings. 
Without clear evidence of specific expert-only details (limits, config matrices, or error codes), it should not be classified as an expert-knowledge sub-skill.",unchanged @@ -596,4 +602,3 @@ https://learn.microsoft.com/en-us/azure/kubernetes-fleet/view-fleet-agent-logs,V If you have a Fleet with the hub cluster mode enabled, Azure Kubernetes Fleet Manager installs Fleet agents in both the hub cluster and the member clusters to facilitate communications and orchestrate operations across the Fleet, in support of Fleet's workload orchestration and load balancing capabilities. These agents generate logs that provide insights into: And you can retr",2025-06-17T05:16:00.000Z,how-to,troubleshooting,0.7,True,Shows how to configure and view Fleet agent logs on hub and member clusters; includes log locations and patterns for monitoring/troubleshooting.,unchanged -https://learn.microsoft.com/en-us/previous-versions/azure/aks/azure-ad-integration-cli,Microsoft Entra integration (legacy),Integrate Microsoft Entra ID with Azure Kubernetes Service (AKS) (legacy) - Azure Kubernetes Service,Configure legacy Entra integration for AKS via CLI,Learn how to use the Azure CLI to create a Microsoft Entra ID-enabled Azure Kubernetes Service (AKS) cluster (legacy),"Warning The feature described in this document, Microsoft Entra Integration (legacy) was deprecated on June 1st, 2023. At this time, no new clusters can be created with Microsoft Entra Integration (legacy). AKS has a new improved AKS-managed Microsoft Entra ID experience that doesn't require you to manage server or client applications. If you want to migrate, follow the instructions here. Azure Kubernetes Service (AKS) can be configured to use Microsoft Entra ID for user authentication. 
In this confi",2024-02-21T00:00:00.000Z,concept-article,security,0.7,True,"Legacy but detailed instructions for setting up Entra integration, including app registrations, roles, and AKS CLI flags.",unchanged diff --git a/products/azure-kubernetes-service/report.md b/products/azure-kubernetes-service/report.md index 6c36a289..c150abcb 100644 --- a/products/azure-kubernetes-service/report.md +++ b/products/azure-kubernetes-service/report.md @@ -1,44 +1,44 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: - security: 'Securing AKS clusters: identity and access (Entra, RBAC, workload identity), - network and node hardening, encryption, policies, PCI compliance, and secure integrations - with other Azure services.' + security: 'Securing AKS clusters: identity/RBAC, Entra auth, network policies, disk/secret + encryption, node hardening, PCI compliance, Istio/mTLS, image integrity, and Azure + Policy-based controls.' architecture-patterns: 'Designing AKS architectures: HA/DR patterns, multi-region and PCI designs, upgrade/rollout strategies, node pool/network isolation, virtual nodes, and Fleet-based multi-cluster management.' - configuration: 'Configuring AKS clusters and fleets: networking, ingress/egress, - storage, GPUs, autoscaling, cost/monitoring, security, add-ons (Istio, Dapr, ACNS), - and infrastructure for common data/AI workloads.' + configuration: Configuring AKS clusters, networking, storage, security, autoscaling, + service mesh, databases, costs, and multi-cluster/Fleet features, plus extensions + and advanced networking (ACNS, CNI, ingress). troubleshooting: 'Diagnosing and fixing AKS cluster issues: networking, DNS, kubelet, - GPUs, security/vulns, upgrades, Windows nodes, Fleet CRDs, and agent/log-based - troubleshooting.' - deployment: Deploying and upgrading AKS clusters and apps, including CI/CD, service - meshes, autoscaling, storage/migration, AI/ML and Wasm workloads, and production-ready - infrastructure setup. 
- integrations: Patterns and how-tos for integrating AKS with AI agents, KAITO, storage, - security (Key Vault/CSI), GPUs, monitoring, GitOps/Fleet, and external Azure/OSS - services and tools. - decision-making: 'Guidance for planning, choosing, and migrating AKS architectures: - VM/node pools, networking, upgrades, pricing, cost optimization, PCI, multi‑cloud - comparisons, and Fleet Manager workflows.' - best-practices: 'AKS reliability, performance, security, and cost best practices: - cluster design, upgrades, networking, storage, scheduling, GPU/ML, PCI, and workload + GPUs, security/KMS, upgrades, Windows containers, Fleet, and troubleshooting tools + like Insights, events, and logs.' + deployment: Deploying and upgrading AKS clusters and add-ons, setting up CI/CD and + IaC (ARM/Bicep/Terraform), and running apps/workloads (AI, Ray, Dapr, service + mesh, Wasm, OpenFaaS, KEDA). + integrations: Patterns and how-tos for integrating AKS with AI agents/KAITO, storage, + secrets, GPUs, monitoring, KEDA, Fleet, Istio/OSM, and external services like + ACR, MongoDB, and GitHub Actions. + decision-making: 'Guidance for AKS design and migration decisions: cluster/network/VM + sizing, pricing and cost optimization, security/PCI, OS and node pools, upgrades, + and comparisons with other platforms.' + best-practices: 'AKS operational best practices: reliability, security, cost, performance, + scheduling, storage/backup, networking, upgrades, GPU/ML, PCI, and workload-specific resiliency patterns.' - limits-quotas: 'AKS limits, quotas, SLAs, and lifecycle: node/pod/identity caps, - CNI/NAT/load balancer scaling, Istio performance, version support/LTS, Fleet limits, - and platform support constraints.' + limits-quotas: AKS platform limits, SLAs, quotas, version lifecycle, performance/capacity + (Istio, NAT, load balancers), identity scaling, networking/subnet limits, and + support policies. 
skill_description: Expert knowledge for Azure Kubernetes Service (AKS) development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, - and deployment. Use when designing AKS clusters with Fleet, Istio, Dapr, GPUs, GitOps/CI-CD, - or AI/ML and Wasm workloads, and other Azure Kubernetes Service (AKS) related development - tasks. Not for Azure Container Apps (use azure-container-apps), Azure Container - Instances (use azure-container-instances), Azure Red Hat OpenShift (use azure-redhat-openshift), - Azure Virtual Machine Scale Sets (use azure-vm-scalesets). -use_when: Use when designing AKS clusters with Fleet, Istio, Dapr, GPUs, GitOps/CI-CD, - or AI/ML and Wasm workloads, and other Azure Kubernetes Service (AKS) related development - tasks. + and deployment. Use when configuring AKS networking/storage, Istio or service mesh, + Fleet multi-cluster, GPUs/AI workloads, or KEDA autoscaling, and other Azure Kubernetes + Service (AKS) related development tasks. Not for Azure Container Apps (use azure-container-apps), + Azure Container Instances (use azure-container-instances), Azure Red Hat OpenShift + (use azure-redhat-openshift), Azure Virtual Machine Scale Sets (use azure-vm-scalesets). +use_when: Use when configuring AKS networking/storage, Istio or service mesh, Fleet + multi-cluster, GPUs/AI workloads, or KEDA autoscaling, and other Azure Kubernetes + Service (AKS) related development tasks. confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azure Container Instances (use azure-container-instances), Azure Red Hat OpenShift (use azure-redhat-openshift), Azure Virtual Machine Scale Sets (use azure-vm-scalesets). 
@@ -47,84 +47,110 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu ## Summary -- **Total Pages**: 588 -- **Fetched**: 588 +- **Total Pages**: 593 +- **Fetched**: 593 - **Fetch Failed**: 0 -- **Classified**: 443 -- **Unclassified**: 145 +- **Classified**: 444 +- **Unclassified**: 149 ### Incremental Update -- **New Pages**: 3 -- **Updated Pages**: 20 -- **Unchanged**: 565 -- **Deleted Pages**: 0 +- **New Pages**: 55 +- **Updated Pages**: 12 +- **Unchanged**: 526 +- **Deleted Pages**: 50 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-kubernetes-service/azure-kubernetes-service.csv` ## Classification Statistics | Type | Count | Percentage | |------|-------|------------| -| architecture-patterns | 21 | 3.6% | -| best-practices | 41 | 7.0% | -| configuration | 149 | 25.3% | -| decision-making | 43 | 7.3% | -| deployment | 52 | 8.8% | -| integrations | 21 | 3.6% | -| limits-quotas | 19 | 3.2% | -| security | 80 | 13.6% | -| troubleshooting | 17 | 2.9% | -| *(Unclassified)* | 145 | 24.7% | +| architecture-patterns | 20 | 3.4% | +| best-practices | 41 | 6.9% | +| configuration | 148 | 25.0% | +| decision-making | 46 | 7.8% | +| deployment | 50 | 8.4% | +| integrations | 21 | 3.5% | +| limits-quotas | 21 | 3.5% | +| security | 77 | 13.0% | +| troubleshooting | 20 | 3.4% | +| *(Unclassified)* | 149 | 25.1% | ## Changes ### New Pages -- [Overview](https://learn.microsoft.com/en-us/azure/aks/container-network-insights-agent-overview) -- [Install and use the Container Network Insights Agent](https://learn.microsoft.com/en-us/azure/aks/how-to-configure-container-network-insights-agent) -- [Troubleshoot Container Network Insights Agent issues](https://learn.microsoft.com/en-us/azure/aks/troubleshoot-container-network-insights-agent) +- [Quickstart](https://learn.microsoft.com/en-us/azure/aks/aks-desktop-quickstart-auto) +- [Projects in AKS desktop](https://learn.microsoft.com/en-us/azure/aks/aks-desktop-projects) +- [Set 
up a cluster for AKS desktop](https://learn.microsoft.com/en-us/azure/aks/aks-desktop-install-cluster-setup) +- [Deploy an application](https://learn.microsoft.com/en-us/azure/aks/aks-desktop-app) +- [Set up permissions](https://learn.microsoft.com/en-us/azure/aks/aks-desktop-permissions) +- [Use Group Managed Service Accounts](https://learn.microsoft.com/en-us/azure/aks/use-group-managed-service-accounts) +- [Troubleshoot an application](https://learn.microsoft.com/en-us/azure/aks/aks-desktop-deploy-troubleshooting) +- [Use AI with AKS desktop](https://learn.microsoft.com/en-us/azure/aks/aks-desktop-deploy-ai-assistant) +- [External identity provider authentication to cluster (overview)](https://learn.microsoft.com/en-us/azure/aks/external-identity-provider-authentication-overview) +- [Set up external identity provider authentication to cluster](https://learn.microsoft.com/en-us/azure/aks/external-identity-provider-authentication-configure) +- [Cluster authorization concepts](https://learn.microsoft.com/en-us/azure/aks/concepts-cluster-authorization) +- [Use Microsoft Entra ID authorization for the Kubernetes API](https://learn.microsoft.com/en-us/azure/aks/entra-id-authorization) +- [Use Kubernetes RBAC with Microsoft Entra integration](https://learn.microsoft.com/en-us/azure/aks/kubernetes-rbac-entra-id) +- [Managed identities overview](https://learn.microsoft.com/en-us/azure/aks/managed-identity-overview) +- [Service principal as cluster identity](https://learn.microsoft.com/en-us/azure/aks/kubernetes-service-principal) +- [Rotate cluster credentials](https://learn.microsoft.com/en-us/azure/aks/update-credentials) +- [Identity Bindings for scalable workload identity (overview)](https://learn.microsoft.com/en-us/azure/aks/identity-bindings-concepts) +- [Set up identity bindings for scalable workload identity](https://learn.microsoft.com/en-us/azure/aks/identity-bindings) +- [Migrate from pod identity to workload 
identity](https://learn.microsoft.com/en-us/azure/aks/workload-identity-migrate-from-pod-identity) +- [Microsoft Entra pod identity (legacy)](https://learn.microsoft.com/en-us/azure/aks/use-azure-ad-pod-identity) +- *...and 35 more* ### Updated Pages -- [Compare AKS Standard and AKS Automatic](https://learn.microsoft.com/en-us/azure/aks/intro-aks-automatic) - - Updated: 2025-12-17T12:02:00.000Z → 2026-04-14T06:03:00.000Z -- [From a code repository](https://learn.microsoft.com/en-us/azure/aks/automatic/quick-automatic-from-code) - - Updated: 2026-01-29T23:06:00.000Z → 2026-04-14T06:03:00.000Z -- [Inside a managed virtual network](https://learn.microsoft.com/en-us/azure/aks/automatic/quick-automatic-managed-network) - - Updated: 2026-04-01T06:03:00.000Z → 2026-04-15T06:34:00.000Z -- [Public cluster](https://learn.microsoft.com/en-us/azure/aks/automatic/quick-automatic-custom-network) - - Updated: 2025-11-04T23:06:00.000Z → 2026-04-14T06:03:00.000Z -- [Private cluster](https://learn.microsoft.com/en-us/azure/aks/automatic/quick-automatic-private-custom-network) - - Updated: 2025-10-10T08:00:00.000Z → 2026-04-14T06:03:00.000Z -- [Use Planned Maintenance to schedule and control upgrades](https://learn.microsoft.com/en-us/azure/aks/planned-maintenance) - - Updated: 2025-12-19T08:00:00.000Z → 2026-04-14T06:03:00.000Z -- [Best practices for network connectivity and security](https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-network) - - Updated: 2025-11-10T08:00:00.000Z → 2026-04-13T22:08:00.000Z -- [Use API Server VNet integration](https://learn.microsoft.com/en-us/azure/aks/api-server-vnet-integration) - - Updated: 2026-01-16T18:11:00.000Z → 2026-04-15T08:00:00.000Z -- [Bring your own CNI](https://learn.microsoft.com/en-us/azure/aks/use-byo-cni) - - Updated: 2026-02-04T23:11:00.000Z → 2026-04-16T08:00:00.000Z -- [Overview](https://learn.microsoft.com/en-us/azure/aks/concepts-network-azure-cni-overlay) - - Updated: 2026-01-15T23:03:00.000Z → 
2026-04-16T08:00:00.000Z -- [Use kubenet](https://learn.microsoft.com/en-us/azure/aks/configure-kubenet) - - Updated: 2024-08-01T20:29:00.000Z → 2026-04-16T08:00:00.000Z -- [Legacy CNI options](https://learn.microsoft.com/en-us/azure/aks/concepts-network-legacy-cni) - - Updated: 2025-04-01T11:11:00.000Z → 2026-04-16T08:00:00.000Z -- [For large workloads](https://learn.microsoft.com/en-us/azure/aks/best-practices-performance-scale-large) - - Updated: 2024-11-21T05:31:00.000Z → 2026-04-17T22:07:00.000Z -- [Networking concepts](https://learn.microsoft.com/en-us/azure/aks/concepts-network) - - Updated: 2025-11-04T23:06:00.000Z → 2026-04-13T22:08:00.000Z -- [IP address planning](https://learn.microsoft.com/en-us/azure/aks/concepts-network-ip-address-planning) - - Updated: 2024-09-09T22:10:00.000Z → 2026-04-16T08:00:00.000Z -- [Multi-instance GPU node pool](https://learn.microsoft.com/en-us/azure/aks/gpu-multi-instance) - - Updated: 2025-03-20T05:32:00.000Z → 2026-04-17T22:07:00.000Z -- [Overview](https://learn.microsoft.com/en-us/azure/aks/node-auto-provisioning) - - Updated: 2026-02-14T06:03:00.000Z → 2026-04-14T06:03:00.000Z -- [Configure NodePool custom resources](https://learn.microsoft.com/en-us/azure/aks/node-auto-provisioning-node-pools) - - Updated: 2025-10-30T22:08:00.000Z → 2026-04-13T17:13:00.000Z -- [FAQ](https://learn.microsoft.com/en-us/azure/aks/faq) - - Updated: 2026-01-28T18:10:00Z → 2026-04-15T06:34:00Z -- [Cross-cluster networking for Fleet Manager](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-cross-cluster-networking) - - Updated: 2026-03-23T03:00:00.000Z → 2026-04-14T08:00:00.000Z +- [Ingress with Kubernetes Gateway API](https://learn.microsoft.com/en-us/azure/aks/istio-gateway-api) + - Updated: 2026-03-19T11:02:00.000Z → 2026-04-21T17:10:00.000Z +- [Use the Azure CLI](https://learn.microsoft.com/en-us/azure/aks/learn/quick-kubernetes-deploy-cli) + - Updated: 2026-03-13T22:09:00.000Z → 2026-04-22T22:13:00.000Z +- [Upgrade 
FAQ](https://learn.microsoft.com/en-us/azure/aks/upgrade-aks-faq) + - Updated: 2026-01-07T23:08:00Z → 2026-04-23T06:10:00Z +- [AKS desktop overview](https://learn.microsoft.com/en-us/azure/aks/aks-desktop-overview) + - Updated: 2026-01-12T23:20:00.000Z → 2026-04-23T22:11:00.000Z +- [Use system node pools](https://learn.microsoft.com/en-us/azure/aks/use-system-pools) + - Updated: 2025-06-10T08:00:00.000Z → 2026-04-21T06:03:00.000Z +- [Use a NAT Gateway](https://learn.microsoft.com/en-us/azure/aks/nat-gateway) + - Updated: 2026-04-10T22:06:00.000Z → 2026-04-15T08:00:00.000Z +- [Expand pod CIDR space in Azure CNI Overlay clusters](https://learn.microsoft.com/en-us/azure/aks/azure-cni-overlay-pod-expand) + - Updated: 2026-01-15T23:03:00.000Z → 2026-04-20T22:08:00.000Z +- [Overview](https://learn.microsoft.com/en-us/azure/aks/container-network-security-wireguard-encryption-concepts) + - Updated: 2026-04-08T11:02:00.000Z → 2026-04-24T17:17:00.000Z +- [Set up WireGuard encryption](https://learn.microsoft.com/en-us/azure/aks/how-to-apply-wireguard) + - Updated: 2026-04-08T11:02:00.000Z → 2026-04-24T17:17:00.000Z +- [Overview](https://learn.microsoft.com/en-us/azure/aks/container-network-performance-ebpf-host-routing) + - Updated: 2025-10-31T17:08:00.000Z → 2026-04-20T22:08:00.000Z +- [Set up eBPF Host routing](https://learn.microsoft.com/en-us/azure/aks/how-to-enable-ebpf-host-routing) + - Updated: 2026-01-08T18:15:00.000Z → 2026-04-20T22:08:00.000Z +- [Introduction to resource placement](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-resource-placement) + - Updated: 2026-04-07T08:00:00.000Z → 2026-04-23T08:00:00.000Z + +### Deleted Pages + +- ~~Cluster and node access control with Conditional Access~~ (https://learn.microsoft.com/en-us/azure/aks/access-control-managed-azure-ad) +- ~~Deploy an application using AKS desktop~~ (https://learn.microsoft.com/en-us/azure/aks/aks-desktop-app) +- ~~Set up permissions in AKS desktop~~ 
(https://learn.microsoft.com/en-us/azure/aks/aks-desktop-permissions) +- ~~Use Kubernetes RBAC with Microsoft Entra integration~~ (https://learn.microsoft.com/en-us/azure/aks/azure-ad-rbac) +- ~~BYOK for Azure managed disks~~ (https://learn.microsoft.com/en-us/azure/aks/azure-disk-customer-managed-keys) +- ~~Access and identity concepts~~ (https://learn.microsoft.com/en-us/azure/aks/concepts-identity) +- ~~Security concepts~~ (https://learn.microsoft.com/en-us/azure/aks/concepts-security) +- ~~Security vulnerability management~~ (https://learn.microsoft.com/en-us/azure/aks/concepts-vulnerability-management) +- ~~Confidential Containers overview~~ (https://learn.microsoft.com/en-us/azure/aks/confidential-containers-overview) +- ~~Limit access to cluster configuration file~~ (https://learn.microsoft.com/en-us/azure/aks/control-kubeconfig-access) +- ~~3 - Apply extra configurations or perform troubleshooting~~ (https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-configuration-options) +- ~~1 - Configure Secrets Store CSI Driver~~ (https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-driver) +- ~~2 - Provide Azure Key Vault access~~ (https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-identity-access) +- ~~Configure TLS for NGINX Ingress controller~~ (https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-nginx-tls) +- ~~Use custom certificate authorities~~ (https://learn.microsoft.com/en-us/azure/aks/custom-certificate-authority) +- ~~Deploy Confidential Containers with default policy~~ (https://learn.microsoft.com/en-us/azure/aks/deploy-confidential-containers-default-policy) +- ~~Enable AKS-managed Microsoft Entra integration~~ (https://learn.microsoft.com/en-us/azure/aks/enable-authentication-microsoft-entra-id) +- ~~Enable FIPS~~ (https://learn.microsoft.com/en-us/azure/aks/enable-fips-nodes) +- ~~Enable host-based encryption~~ (https://learn.microsoft.com/en-us/azure/aks/enable-host-encryption) +- ~~Configure external identity 
providers~~ (https://learn.microsoft.com/en-us/azure/aks/external-identity-provider-authentication-configure) +- *...and 30 more* ## Classified Pages @@ -134,9 +160,10 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Troubleshoot Container Network Insights Agent issues](https://learn.microsoft.com/en-us/azure/aks/troubleshoot-container-network-insights-agent) | troubleshooting | 0.95 | Explicitly described as organized in Symptom → Cause → Resolution format for common issues when deploying/configuring/using the agent, which matches the troubleshooting criteria with product-specific diagnostics and resolutions. | | [Diagnose and solve UDP packet drops](https://learn.microsoft.com/en-us/azure/aks/troubleshoot-udp-packet-drops) | troubleshooting | 0.90 | Explicitly a diagnose-and-solve article; likely maps UDP packet loss symptoms to causes (small read buffer, node/network settings) and AKS-specific remediation steps. | | [Use multiple Standard Load Balancers](https://learn.microsoft.com/en-us/azure/aks/use-multiple-standard-load-balancer) | limits-quotas | 0.88 | Explicitly mentions node NIC limits of 300 inbound rules and 8 private-link services; these numeric limits drive the multi-SLB pattern and are expert quota knowledge. | -| [3 - Apply extra configurations or perform troubleshooting](https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-configuration-options) | configuration | 0.86 | Explicitly about configuration options (autorotation, secret sync, metrics) for the Azure Key Vault provider; contains product-specific settings and behaviors. | +| [Identity Bindings for scalable workload identity (overview)](https://learn.microsoft.com/en-us/azure/aks/identity-bindings-concepts) | limits-quotas | 0.86 | Explicitly documents a concrete limit (a single UAMI can't have more than 20 federated identity credentials) and how identity bindings address this scale constraint; this is expert limit/quota knowledge. 
| +| [AKS service permissions reference](https://learn.microsoft.com/en-us/azure/aks/aks-service-permissions) | security | 0.85 | Permissions reference for AKS; will enumerate specific Azure RBAC roles, actions, and scopes required for cluster creation and runtime, which is product-specific security configuration knowledge. | | [Advanced application routing add-on configurations (NGINX)](https://learn.microsoft.com/en-us/azure/aks/app-routing-nginx-configuration) | configuration | 0.85 | Explicitly about advanced configuration options and NGINX ingress annotations; contains detailed config keys and supported values unique to the AKS add-on. | -| [Configure external identity providers](https://learn.microsoft.com/en-us/azure/aks/external-identity-provider-authentication-configure) | security | 0.85 | Provides concrete, product-specific security configuration steps for AKS structured authentication: how to define JWT authenticators, configure claim validation and mapping, and wire GitHub/Google Identity into AKS control plane auth. These are detailed configuration patterns and parameters unique to AKS external IdP integration, fitting the security sub-skill. | +| [Configure and troubleshoot the Azure Key Vault provider for Secrets Store CSI Driver](https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-configuration-options) | configuration | 0.85 | Explicitly about configuration options for the Key Vault provider in CSI Driver; likely includes parameter tables, autorotation settings, sync options, and metrics configuration with specific names and allowed values. | | [Configure the Dapr extension](https://learn.microsoft.com/en-us/azure/aks/dapr-settings) | configuration | 0.85 | Focuses on Dapr extension configuration via --configuration-settings and Bicep configurationSettings; includes specific parameter names and options, matching configuration criteria. 
| | [Customize node surge upgrade](https://learn.microsoft.com/en-us/azure/aks/upgrade-aks-node-pools-rolling) | configuration | 0.85 | Covers surge settings, drain timeout, soak time, and how to configure them—concrete AKS configuration parameters and ranges. | | [Deployment and cluster reliability best practices](https://learn.microsoft.com/en-us/azure/aks/best-practices-app-cluster-reliability) | best-practices | 0.85 | Explicit best-practices article for AKS reliability; typically includes concrete AKS-specific recommendations (e.g., node pool patterns, upgrade strategies, configuration values) and gotchas beyond generic Kubernetes advice. | @@ -144,56 +171,51 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Service account and workload identity setup](https://learn.microsoft.com/en-us/azure/aks/agentic-cli-for-aks-service-account-workload-identity-setup) | security | 0.85 | Covers creating a required service account and configuring workload identity for the Agentic CLI. This typically includes specific Kubernetes service account names, Azure AD/workload identity bindings, and RBAC/permission scopes, which are product-specific security configuration details. | | [Troubleshoot Open Service Mesh](https://learn.microsoft.com/en-us/azure/aks/open-service-mesh-troubleshoot) | troubleshooting | 0.85 | Explicit troubleshooting article with common errors and resolutions for the OSM add-on, matching the symptom→cause→solution pattern with product-specific error details. | | [Troubleshooting](https://learn.microsoft.com/en-us/azure/aks/agentic-cli-for-aks-troubleshoot) | troubleshooting | 0.85 | Explicitly a troubleshooting article; likely organized around specific error messages or behaviors of the Agentic CLI, with causes and resolutions unique to this tool, matching the troubleshooting criteria. 
| -| [Use a NAT Gateway](https://learn.microsoft.com/en-us/azure/aks/nat-gateway) | limits-quotas | 0.85 | The summary explicitly states a precise numeric limit: Azure NAT Gateway allows up to 64,512 outbound UDP and TCP flows per IP address with a maximum of 16 IP addresses. These are concrete, product-specific limits and quotas that an LLM is unlikely to know reliably from training. The rest of the page describes NAT gateway outbound types for AKS, but the standout expert knowledge is the exact flow and IP limits, which matches the limits-quotas sub-skill definition. | -| [2 - Provide Azure Key Vault access](https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-identity-access) | security | 0.82 | Covers identity-based access methods (Azure RBAC, OIDC) for Secrets Store CSI Driver with AKS, including security model choices and best practices specific to this integration. | | [Availability Sets deprecation](https://learn.microsoft.com/en-us/azure/aks/availability-sets-on-aks) | decision-making | 0.82 | The page provides concrete deprecation timelines (for example, the September 30, 2025 end-of-support date) and prescriptive guidance to move from Availability Sets to Virtual Machines node pools. It includes AKS-specific migration and support considerations that inform technology and configuration choices, aligning with decision-making. | | [Identity and access management](https://learn.microsoft.com/en-us/azure/aks/pci-identity) | security | 0.82 | Identity and access management guidance with Kubernetes and Azure RBAC roles and patterns specific to AKS under PCI DSS. | -| [Identity bindings overview](https://learn.microsoft.com/en-us/azure/aks/identity-bindings-concepts) | limits-quotas | 0.82 | Explicitly documents the 20 FIC per UAMI limit and how identity bindings address this, which is a concrete product limit. 
| | [Pod security best practices](https://learn.microsoft.com/en-us/azure/aks/developer-best-practices-pod-security) | best-practices | 0.82 | Developer-focused pod security DOs and DON'Ts with AKS/Kubernetes-specific recommendations (secrets handling, privilege settings) that go beyond generic security theory. | | [AKS MCP server](https://learn.microsoft.com/en-us/azure/aks/aks-model-context-protocol-server) | integrations | 0.80 | Describes MCP server installation and tool definitions that bridge AI assistants and AKS, with specific operations and configuration parameters. | | [Automatically upgrade node OS images](https://learn.microsoft.com/en-us/azure/aks/auto-upgrade-node-os-image) | deployment | 0.80 | Describes multiple autoupgrade channels, their purpose, and OS image lifecycle constraints; these are AKS-specific deployment and maintenance settings. | -| [Azure kubelogin](https://learn.microsoft.com/en-us/azure/aks/kubelogin-authentication) | security | 0.80 | Shows kubelogin usage for multiple Entra auth methods; likely includes specific command flags, environment variables, and AKS kubeconfig integration details. | | [Best practices for ephemeral NVMe data disks](https://learn.microsoft.com/en-us/azure/aks/best-practices-storage-nvme) | best-practices | 0.80 | Contains AKS- and VM-size-specific recommendations, caveats, and usage patterns for ephemeral NVMe data disks that go beyond generic storage advice. | | [CI/CD with Azure Pipelines](https://learn.microsoft.com/en-us/azure/aks/devops-pipeline) | deployment | 0.80 | Provides concrete Azure Pipelines YAML and configuration to build, push to ACR, and deploy to AKS; product-specific CI/CD deployment patterns. 
| | [Cluster security best practices](https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-cluster-security) | best-practices | 0.80 | Cluster security best-practices; likely includes concrete AKS guidance (upgrade channels, node OS versions, RBAC patterns) and product-specific gotchas. | +| [Conditional Access for cluster and node access](https://learn.microsoft.com/en-us/azure/aks/access-control-managed-azure-ad) | security | 0.80 | Shows how to use Microsoft Entra Conditional Access with AKS; will contain specific policy settings, scopes, and role requirements, which are detailed security configuration instructions. | | [Configure Azure App Configuration AKS extension](https://learn.microsoft.com/en-us/azure/aks/azure-app-configuration-settings) | configuration | 0.80 | Explicitly about configuring the extension using --configuration-settings and references Helm values; likely contains parameter names, allowed values, and defaults, which are expert configuration details. | | [Configure a scheduler profile](https://learn.microsoft.com/en-us/azure/aks/configure-aks-scheduler) | configuration | 0.80 | How-to for scheduler profiles; likely includes specific profile names, plugin configuration fields, and AKS-specific constraints. | | [Configure and deploy a Ray cluster](https://learn.microsoft.com/en-us/azure/aks/deploy-ray) | deployment | 0.80 | Provides concrete manifests and AKS-specific deployment steps for Ray clusters, which are specialized integration/deployment knowledge. | | [Configure and deploy a Ray cluster for tuning](https://learn.microsoft.com/en-us/azure/aks/deploy-ray-tuning) | deployment | 0.80 | Combines Ray, AKS, and BlobFuse with specific configuration parameters and manifests, representing detailed integration and deployment patterns. 
| | [Configure blue-green node pool upgrades](https://learn.microsoft.com/en-us/azure/aks/blue-green-node-pool-upgrade) | architecture-patterns | 0.80 | Explains when to use blue-green upgrades, how they work, and configuration options specific to AKS node pools—an AKS-specific upgrade pattern. | | [Configure container network metrics filtering](https://learn.microsoft.com/en-us/azure/aks/how-to-configure-container-network-metrics-filtering) | configuration | 0.80 | Explicitly about configuring metrics filtering via CRDs; likely contains CRD field names, allowed values, and examples for controlling Hubble metrics cardinality and dimensions. | +| [Configure the Azure Key Vault provider for Secrets Store CSI Driver](https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-driver) | integrations | 0.80 | Covers Azure Key Vault provider for CSI Driver with AKS; typically includes SecretProviderClass schema, parameter names, and provider-specific options that are product-specific integration details. | | [Configure tool calling with LLM inference](https://learn.microsoft.com/en-us/azure/aks/ai-toolchain-operator-tool-calling) | integrations | 0.80 | Provides OpenAI-style tool calling configuration for KAITO inference services, including specific schema and metrics integration. | +| [Control access to kubeconfig](https://learn.microsoft.com/en-us/azure/aks/control-kubeconfig-access) | security | 0.80 | Shows how to assign Azure roles to control kubeconfig access; contains specific RBAC role names and scope usage, which are concrete security configuration details. | | [Create Azure AKS infrastructure](https://learn.microsoft.com/en-us/azure/aks/create-aks-infrastructure) | deployment | 0.80 | Provides a concrete, parameterized script-based workflow to create ACR, AKS, node pools, and permissions, which is detailed deployment knowledge. 
| -| [Create a service principal](https://learn.microsoft.com/en-us/azure/aks/kubernetes-service-principal) | security | 0.80 | How-to for creating and attaching service principals to AKS; includes specific role assignments, scopes, and AKS-specific identity behavior. | +| [Custom certificate authorities on nodes](https://learn.microsoft.com/en-us/azure/aks/custom-certificate-authority) | limits-quotas | 0.80 | States a concrete numeric limit (up to 10 base64-encoded certificates per node trust store) and explains behavior; this is specific limit/constraint information. | | [Custom domain name and SSL certificate configuration](https://learn.microsoft.com/en-us/azure/aks/app-routing-dns-ssl) | configuration | 0.80 | Focuses on DNS and SSL configuration with the application routing add-on; expected to include specific annotation names, certificate settings, and DNS configuration parameters. | | [Define API server authorized IP ranges](https://learn.microsoft.com/en-us/azure/aks/api-server-authorized-ip-ranges) | security | 0.80 | How-to for configuring authorized IP ranges; includes specific CLI/ARM parameters and constraints for AKS control plane endpoints. | | [Deploy an AI model using the Azure CLI](https://learn.microsoft.com/en-us/azure/aks/ai-toolchain-operator) | deployment | 0.80 | Shows how to enable and use the KAITO add-on with AKS-specific CRDs and configuration fields for model deployment, which is highly product-specific. | -| [Enable AKS-managed Microsoft Entra integration](https://learn.microsoft.com/en-us/azure/aks/enable-authentication-microsoft-entra-id) | security | 0.80 | Step-by-step configuration of AKS-managed Entra integration with specific parameters, roles, and kubelogin usage unique to AKS. 
| -| [Enable access to AKS clusters using Trusted Access](https://learn.microsoft.com/en-us/azure/aks/trusted-access-feature) | security | 0.80 | Describes Trusted Access feature; likely includes configuration of bindings between AKS and other Azure services, with specific roles and parameters. | +| [Disable local cluster admin accounts](https://learn.microsoft.com/en-us/azure/aks/local-accounts) | security | 0.80 | Operational guide for disabling/enabling local accounts on AKS clusters; includes specific flags, API fields, and configuration options that control local admin access, which are product-specific security settings. | | [Fine-tune an AI model for inferencing](https://learn.microsoft.com/en-us/azure/aks/ai-toolchain-operator-fine-tune) | deployment | 0.80 | Details KAITO version requirements, CRD fields, and workflow for fine-tuning and deploying models on AKS, which is unique operational knowledge. | | [For small to medium workloads](https://learn.microsoft.com/en-us/azure/aks/best-practices-performance-scale) | best-practices | 0.80 | Best-practices article with AKS-specific performance and scaling guidance; expected to include concrete recommendations (e.g., node sizes, pod density, autoscaler settings) that are product-specific. | | [GitHub Actions for Kubernetes](https://learn.microsoft.com/en-us/azure/aks/kubernetes-action) | deployment | 0.80 | Includes GitHub Actions workflow configuration for building and deploying containers from ACR to AKS; AKS-specific deployment automation details. | | [Integrate an MCP server](https://learn.microsoft.com/en-us/azure/aks/ai-toolchain-operator-mcp) | integrations | 0.80 | Describes how to connect MCP-compliant tool servers to KAITO inference workspaces with concrete configuration and validation steps. 
| | [Internal NGINX controller and private DNS zone](https://learn.microsoft.com/en-us/azure/aks/create-nginx-ingress-private-controller) | configuration | 0.80 | Describes configuring ingress with private IP and Azure private DNS; likely includes AKS-specific annotations, DNS zone settings, and ingress configuration fields. | | [Latency comparison](https://learn.microsoft.com/en-us/azure/aks/istio-latency) | limits-quotas | 0.80 | Focuses on P90/P99 latency differences across versions and Kubernetes releases, providing concrete numeric performance data that LLMs would not know from training. | -| [Limit access to cluster configuration file](https://learn.microsoft.com/en-us/azure/aks/control-kubeconfig-access) | security | 0.80 | Shows how to assign Azure roles to limit kubeconfig retrieval; includes specific RBAC role names and scope patterns unique to AKS. | | [Long-term support](https://learn.microsoft.com/en-us/azure/aks/long-term-support) | limits-quotas | 0.80 | Describes extended support windows and version lifecycles beyond community support, with concrete timelines and constraints that are effectively support limits. | +| [Microsoft Entra ID authentication to cluster control plane](https://learn.microsoft.com/en-us/azure/aks/entra-id-control-plane-authentication) | security | 0.80 | How-to for configuring Microsoft Entra authentication for the AKS API server; will include specific configuration parameters, flags, and identity settings unique to AKS security. | | [Multi-tenancy and cluster isolation](https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-cluster-isolation) | best-practices | 0.80 | Best-practices for multi-tenancy and isolation in AKS; expected to include concrete patterns (e.g., namespace, node pool, network policy configurations) and AKS-specific guidance. 
| | [Performance](https://learn.microsoft.com/en-us/azure/aks/istio-scale) | limits-quotas | 0.80 | Explicitly presents performance metrics, sidecar capacity, and resource consumption, likely with numeric limits and thresholds, plus recommendations for handling heavy load—matching limits/quotas and best-practices characteristics; primary value is numeric capacity guidance. | | [Pricing tiers for AKS](https://learn.microsoft.com/en-us/azure/aks/free-standard-pricing-tiers) | decision-making | 0.80 | Explains differences between AKS pricing tiers and when to use each; likely includes comparison criteria and guidance for choosing tiers, fitting decision-making. | +| [Privileged Identity Management for cluster and node access](https://learn.microsoft.com/en-us/azure/aks/privileged-identity-management) | security | 0.80 | Describes configuring Privileged Identity Management for AKS cluster and node access; includes specific role assignments, activation settings, and scopes, which are expert security configuration details. | +| [Provide access to Azure Key Vault from Secrets Store CSI Driver](https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-identity-access) | security | 0.80 | Focuses on identity-based access (Azure RBAC vs OIDC) for Key Vault via CSI Driver; likely includes specific roles, scopes, and configuration for AKS and Key Vault, which are product-specific security settings and decision guidance. | | [Requirement mapping matrix](https://learn.microsoft.com/en-us/azure/aks/pci-requirement-mapping-matrix) | decision-making | 0.80 | Requirement mapping matrix that links AKS controls/docs to PCI requirements, aiding compliance decision-making and gap analysis. 
| | [Secure access to Azure OpenAI from Azure Kubernetes Service (AKS)](https://learn.microsoft.com/en-us/azure/aks/open-ai-secure-access-quickstart) | security | 0.80 | Includes concrete Microsoft Entra Workload ID and managed identity configuration for securing OpenAI access from AKS, with specific roles and scopes. | -| [Secure container access to resources](https://learn.microsoft.com/en-us/azure/aks/secure-container-access) | security | 0.80 | Gives concrete AKS-specific guidance and configuration snippets for user namespaces, AppArmor, and seccomp to restrict container capabilities. | | [Security controls by Azure Policy](https://learn.microsoft.com/en-us/azure/aks/security-controls-policy) | security | 0.80 | Lists specific built-in policy definitions and compliance controls; includes policy names and mappings that are product-specific security configuration artifacts. | -| [Set up permissions in AKS desktop](https://learn.microsoft.com/en-us/azure/aks/aks-desktop-permissions) | security | 0.80 | Describes Azure RBAC roles and required permissions for operators vs developers in AKS desktop; product-specific security and access configuration. | +| [Set up permissions](https://learn.microsoft.com/en-us/azure/aks/aks-desktop-permissions) | security | 0.80 | Focused on setting up permissions and RBAC for AKS Desktop, distinguishing cluster operator and developer roles. This will necessarily include specific RBAC role names, scopes, and permission configurations unique to AKS Desktop usage, which fits the security sub-skill. | | [Troubleshoot CoreDNS](https://learn.microsoft.com/en-us/azure/aks/coredns-troubleshoot) | troubleshooting | 0.80 | Explicit troubleshooting article; will map DNS symptoms to causes and fixes, with AKS-specific commands and log locations. 
| | [Troubleshoot SNAT issues](https://learn.microsoft.com/en-us/azure/aks/troubleshoot-source-network-address-translation) | troubleshooting | 0.80 | Troubleshooting guide for SNAT exhaustion with Standard Load Balancer; includes symptom → cause → mitigation mappings and AKS-specific configuration options. | | [Upgrade scenarios hub](https://learn.microsoft.com/en-us/azure/aks/upgrade-scenarios-hub) | decision-making | 0.80 | Decision-tree style hub that maps business constraints to specific upgrade strategies; clearly focused on helping choose between options. | | [Use Azure CNI Pod Subnet - Static Block Allocation](https://learn.microsoft.com/en-us/azure/aks/configure-azure-cni-static-block-allocation) | limits-quotas | 0.80 | Discusses /16 subnet scalability and a 65k pod limit due to Azure address mapping; these are explicit numeric limits plus configuration for static block allocation. | -| [Use Azure Policy](https://learn.microsoft.com/en-us/azure/aks/use-azure-policy) | security | 0.80 | How-to for using Azure Policy with AKS; includes specific policy initiatives, add-on configuration, and enforcement behavior unique to AKS. | -| [Use Azure RBAC for Kubernetes authorization](https://learn.microsoft.com/en-us/azure/aks/manage-azure-rbac) | security | 0.80 | Contains specific Azure role definitions, scopes, and AKS configuration to unify Azure RBAC with Kubernetes authorization. | -| [Use Kubernetes RBAC with Microsoft Entra integration](https://learn.microsoft.com/en-us/azure/aks/azure-ad-rbac) | security | 0.80 | Shows how to wire Entra group membership into Kubernetes RBAC with concrete role/rolebinding YAML and AKS-specific settings. | -| [Use custom certificate authorities](https://learn.microsoft.com/en-us/azure/aks/custom-certificate-authority) | security | 0.80 | Describes Custom CA feature with explicit limit (up to 10 certificates) and how they are applied to node trust stores; includes AKS-specific configuration steps. 
| | [Best practices for network connectivity and security](https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-network) | best-practices | 0.78 | The page is explicitly framed as operator best practices for AKS networking and typically includes product-specific recommendations (for example, concrete guidance on IP range sizing, load balancer configuration, and AKS-specific networking choices like kubenet vs CNI and migration considerations). These are actionable, AKS-specific DO/DON'T patterns rather than generic networking theory, matching the best-practices category. | | [Best practices for storage and backups](https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-storage) | best-practices | 0.78 | Operator-focused storage, encryption, and backup best practices with AKS-specific recommendations (node size impact, backup strategies). | | [CIS Azure Linux 2.0 benchmark](https://learn.microsoft.com/en-us/azure/aks/cis-azure-linux) | security | 0.78 | Details how CIS Azure Linux 2.0 benchmark is applied to AKS node images, including specific OS security configuration guidance and applicability notes. | @@ -208,35 +230,32 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Enhanced MFA implementation](https://learn.microsoft.com/en-us/azure/aks/pci-enhanced-mfa-implementation) | security | 0.78 | PCI-focused MFA implementation details for AKS admin and CDE access, with AKS/Azure-specific security configuration patterns. | | [For large workloads](https://learn.microsoft.com/en-us/azure/aks/best-practices-performance-scale-large) | best-practices | 0.78 | The page provides AKS-specific performance and scaling recommendations for large clusters/workloads (for example, guidance on node counts, pod densities, control plane considerations, and configuration patterns unique to AKS). 
These are concrete, product-specific DO/DON'T guidelines rather than generic Kubernetes theory, fitting the best-practices category. | | [GPU best practices](https://learn.microsoft.com/en-us/azure/aks/best-practices-gpu) | best-practices | 0.78 | Explicitly a best-practices article; for AKS GPU workloads it likely includes concrete recommendations, diagnostic commands, and configuration patterns specific to AKS GPU nodes. | -| [Microsoft Entra pod identity](https://learn.microsoft.com/en-us/azure/aks/use-azure-ad-pod-identity) | security | 0.78 | Describes AKS primitives for pod-managed identities, including limitations and configuration objects unique to this feature. | | [Network policy best practices](https://learn.microsoft.com/en-us/azure/aks/network-policy-best-practices) | best-practices | 0.78 | Best-practices article specific to AKS network policies with concrete DO/DON'T guidance and AKS-specific patterns beyond generic Kubernetes concepts. | | [Network security](https://learn.microsoft.com/en-us/azure/aks/pci-network) | security | 0.78 | Network security guidance for AKS under PCI DSS with specific topology (hub-spoke) and segmentation considerations; security-focused and AKS-specific. | | [Outbound network and FQDN rules for AKS clusters](https://learn.microsoft.com/en-us/azure/aks/outbound-rules-control-egress) | configuration | 0.78 | Outbound/FQDN rules articles list required ports, protocols, and specific endpoint/FQDN dependencies for AKS and its add-ons. These are product-specific configuration details (what must be opened, where) that an LLM won't reliably know from training. | | [Resource management best practices](https://learn.microsoft.com/en-us/azure/aks/developer-best-practices-resource-management) | best-practices | 0.78 | Developer best practices for AKS resource management with concrete recommendations on deployment patterns and resource usage specific to AKS. 
| | [Restrict access to IMDS](https://learn.microsoft.com/en-us/azure/aks/imds-restriction) | security | 0.78 | Explains enabling IMDS restriction with AKS-specific preview settings and flags to control pod access to metadata service. | -| [Update cluster credentials](https://learn.microsoft.com/en-us/azure/aks/update-credentials) | security | 0.78 | Details rotation procedures and timing for AKS service principals and Entra apps, including expiration behavior and commands. | +| [Set up external identity provider authentication to cluster](https://learn.microsoft.com/en-us/azure/aks/external-identity-provider-authentication-configure) | security | 0.78 | How-to configuration for GitHub and Google Identity as external providers using structured authentication; includes product-specific JWT authenticator settings, claim validation and mapping details that are implementation-specific security configuration. | +| [Trusted Access for Azure services](https://learn.microsoft.com/en-us/azure/aks/trusted-access-feature) | security | 0.78 | Shows how to use Trusted Access so Azure services can securely reach the AKS API server via backend integration and managed identities; includes product-specific security configuration patterns. | +| [Use a NAT Gateway](https://learn.microsoft.com/en-us/azure/aks/nat-gateway) | limits-quotas | 0.78 | Contains concrete outbound flow limits for Azure NAT Gateway (64,512 flows per IP, up to 16 IPs) and product-specific outbound type options for AKS, which are numeric constraints not generally known from training. | | [Use an HTTP proxy](https://learn.microsoft.com/en-us/azure/aks/http-proxy) | configuration | 0.78 | How-to article for configuring AKS clusters to use an HTTP proxy. These pages typically include node/cluster-level environment variable names (HTTP_PROXY, HTTPS_PROXY, NO_PROXY), AKS-specific configuration flags, and sometimes required endpoint lists or certificate handling steps that are product-specific. 
This is concrete configuration guidance rather than conceptual networking theory. | | [Vertical Pod Autoscaler API reference](https://learn.microsoft.com/en-us/azure/aks/vertical-pod-autoscaler-api-reference) | configuration | 0.78 | Explicit API reference for VPA on AKS; contains schema details, field names, types, and allowed values—classic configuration reference content. | | [Windows container best practices](https://learn.microsoft.com/en-us/azure/aks/windows-best-practices) | best-practices | 0.78 | Provides concrete, AKS-specific recommendations and gotchas for running Windows containers, including OS version timelines and configuration guidance. | | [Workload identity overview](https://learn.microsoft.com/en-us/azure/aks/workload-identity-overview) | security | 0.78 | Describes AKS-specific workload identity model, including federation with Entra and Kubernetes service account token usage. | -| [Manage local accounts](https://learn.microsoft.com/en-us/azure/aks/manage-local-accounts-managed-azure-ad) | security | 0.77 | Shows how to disable/enable local admin accounts on AKS clusters with specific commands and flags affecting security posture. | | [CIS Ubuntu benchmark](https://learn.microsoft.com/en-us/azure/aks/cis-ubuntu) | security | 0.76 | Describes security OS configuration for Ubuntu 24.04 in AKS per CIS benchmark, including version-specific guidance. | | [CIS Windows Server benchmark](https://learn.microsoft.com/en-us/azure/aks/cis-windows) | security | 0.76 | Explains Windows Server 2022 image security configuration in AKS based on Azure security baseline and CIS, with AKS-specific implications. | -| [Cluster and node access control with Conditional Access](https://learn.microsoft.com/en-us/azure/aks/access-control-managed-azure-ad) | security | 0.76 | Details configuration of Conditional Access for AKS API and SSH, including specific Entra policies and AKS integration steps. 
| -| [Cluster and node access control with Privileged Identity Management](https://learn.microsoft.com/en-us/azure/aks/privileged-identity-management) | security | 0.76 | Explains how to use Entra PIM for JIT access to AKS control plane and nodes with specific role assignments and flows. | | [Configure LocalDNS](https://learn.microsoft.com/en-us/azure/aks/localdns-custom) | configuration | 0.76 | Describes LocalDNS feature, configuration options, and enable/verify steps; includes specific AKS settings and deployment patterns. | -| [Configure TLS for NGINX Ingress controller](https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-nginx-tls) | integrations | 0.76 | Detailed integration pattern between Secrets Store CSI Driver, Azure Key Vault, and NGINX Ingress TLS on AKS with product-specific configuration steps and parameters. | | [Continuous security monitoring](https://learn.microsoft.com/en-us/azure/aks/pci-continuous-security-monitoring) | security | 0.76 | Continuous security and threat detection guidance specific to AKS under PCI DSS, with ongoing monitoring patterns. | | [Create and configure Managed Fleet Namespaces](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/howto-managed-namespaces) | configuration | 0.76 | Provides concrete steps and CRD/UI fields for Managed Fleet Namespaces, including resource quotas, network policies, and delegated access configuration specific to Fleet Manager. | | [Customize CoreDNS](https://learn.microsoft.com/en-us/azure/aks/coredns-custom) | configuration | 0.76 | Uses ConfigMaps and specific CoreDNS options for AKS; includes concrete config keys and patterns unique to AKS-managed CoreDNS. 
| | [Customize resource configuration with cost optimized autoscaling](https://learn.microsoft.com/en-us/azure/aks/customize-resource-configuration) | configuration | 0.76 | Explains how to customize resource configuration for managed add-ons with cost-optimized scaling; likely includes parameter names, ranges, and examples. | | [Data protection](https://learn.microsoft.com/en-us/azure/aks/pci-data) | best-practices | 0.76 | Data protection best practices for AKS in PCI context, with concrete design guidance beyond generic data security theory. | | [Deploy trusted launch on AKS](https://learn.microsoft.com/en-us/azure/aks/use-trusted-launch) | security | 0.76 | Describes how AKS uses Trusted Launch, including requirements around Gen2 VMs, secure/measured boot, and how to enable it for node pools—product-specific security configuration details. | -| [Migrate your app from pod identity to workload identity](https://learn.microsoft.com/en-us/azure/aks/workload-identity-migrate-from-pod-identity) | security | 0.76 | Provides migration strategies and patterns (parallel deployment, sidecar proxy, SDK rewrite) specific to AKS identity features. | +| [Enforce compliance with Azure Policy](https://learn.microsoft.com/en-us/azure/aks/use-azure-policy) | security | 0.76 | Describes installing the Azure Policy add-on for AKS and applying built-in security policies/initiatives; includes product-specific policy definitions and enforcement configuration, which are security settings. | | [Monitoring and logging](https://learn.microsoft.com/en-us/azure/aks/pci-monitor) | security | 0.76 | Monitoring and logging design for AKS PCI workloads, tied to hub-spoke baseline and regulated environment requirements. | -| [Set up identity bindings](https://learn.microsoft.com/en-us/azure/aks/identity-bindings) | security | 0.76 | Configuration-focused article on mapping UAMIs across clusters via identity bindings with specific AKS settings. 
| | [Use KMS etcd encryption (legacy)](https://learn.microsoft.com/en-us/azure/aks/use-kms-etcd-encryption) | security | 0.76 | Legacy KMS etcd encryption setup with specific AKS and Key Vault configuration flags and modes (public/private), which are product-specific security settings. | -| [Use KMS platform-managed keys or customer-managed keys](https://learn.microsoft.com/en-us/azure/aks/kms-data-encryption) | security | 0.76 | How-to for enabling KMS encryption with platform-managed or customer-managed keys, including AKS- and Key Vault–specific configuration parameters and options. | +| [Use Microsoft Entra ID authorization for the Kubernetes API](https://learn.microsoft.com/en-us/azure/aks/entra-id-authorization) | security | 0.76 | Describes using Microsoft Entra ID for Kubernetes API authorization with Azure RBAC and ABAC; includes specific AKS built-in role names and guidance on assigning roles and conditions, which are product-specific security configurations. | +| [Authenticate with kubelogin](https://learn.microsoft.com/en-us/azure/aks/kubelogin-authentication) | security | 0.75 | Provides concrete examples of kubelogin usage for AKS; includes command options, modes, and parameters specific to Entra authentication with AKS, which are product-specific security/auth configuration details. | | [Automate approvals using events](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/howto-configure-events-for-gates) | integrations | 0.75 | Shows how approval gate events are published via Event Grid System Topic and how to configure subscriptions; includes event schema and integration parameters. | | [Configure Azure NetApp Files for AKS](https://learn.microsoft.com/en-us/azure/aks/azure-netapp-files) | configuration | 0.75 | Shows AKS-specific setup for Azure NetApp Files, including driver configuration, StorageClass and PV parameters unique to this integration. 
| | [Configure NodePool custom resources](https://learn.microsoft.com/en-us/azure/aks/node-auto-provisioning-node-pools) | configuration | 0.75 | Focuses on configuring node pools for AKS NAP, including SKU selectors, resource limits, and priority weights. This implies product-specific configuration parameters and examples (such as selector fields, limit fields, and weight values) that qualify as expert configuration knowledge beyond generic Kubernetes concepts. | @@ -247,6 +266,7 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Deploy Strimzi and Kafka components](https://learn.microsoft.com/en-us/azure/aks/kafka-deploy) | configuration | 0.75 | Deployment of Strimzi operator and Kafka cluster; includes CRD specs, broker configuration, and AKS-specific YAML unique to this integration. | | [Deploy an AI model using the Azure portal](https://learn.microsoft.com/en-us/azure/aks/ai-toolchain-operator-azure-portal) | deployment | 0.75 | Portal-based workflow for KAITO on AKS with preview-specific constraints and configuration steps is specialized deployment knowledge. | | [Deploy and configure workload identity for a cluster](https://learn.microsoft.com/en-us/azure/aks/workload-identity-deploy-cluster) | security | 0.75 | Article walks through configuring AKS with Microsoft Entra Workload ID, including creating a managed identity, Kubernetes service account, and federated identity credential. This implies specific RBAC/identity objects, configuration parameters, and bindings unique to AKS + Entra, which are product-specific security and authentication settings. | +| [Deploy with default policy](https://learn.microsoft.com/en-us/azure/aks/deploy-confidential-containers-default-policy) | deployment | 0.75 | CLI-based deployment of AKS with Confidential Containers and auto-generated security policy; likely includes required flags, runtime options, and constraints specific to this deployment scenario. 
| | [Kubernetes version support](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-fleet-kubernetes-version-support) | limits-quotas | 0.75 | Version support policies typically include specific supported versions, windows, and feature availability constraints, which are numeric and time-bound limits. | | [Manage cluster certificates and rotation](https://learn.microsoft.com/en-us/azure/aks/certificate-rotation) | security | 0.75 | Explains certificate rotation behavior and how to configure it; includes AKS-specific commands, flags, and deprecation details for node pool tags. | | [Mesh configuration](https://learn.microsoft.com/en-us/azure/aks/istio-meshconfig) | configuration | 0.75 | Details which MeshConfig properties are supported/allowed/blocked for the AKS Istio add-on, a clear product-specific configuration matrix. | @@ -264,9 +284,8 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Upgrade options and recommendations](https://learn.microsoft.com/en-us/azure/aks/upgrade-options) | decision-making | 0.75 | Scenario-based recommendations and options for AKS upgrades constitute decision guidance with trade-offs for different upgrade strategies. | | [Use managed identities](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/use-managed-identity) | security | 0.75 | Details using system- and user-assigned managed identities, plus Azure RBAC role assignment for Fleet; product-specific security configuration. | | [Use service tags for API server authorized IP ranges](https://learn.microsoft.com/en-us/azure/aks/api-server-service-tags) | security | 0.75 | Preview feature configuration; likely includes specific service tag names and usage patterns for AKS API server security. 
| -| [Validate signed images with Image Integrity](https://learn.microsoft.com/en-us/azure/aks/image-integrity) | security | 0.75 | How-to for Image Integrity; likely includes specific admission configuration, policy settings, and integration details unique to AKS. | -| [Authentication and authorization best practices](https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-identity) | best-practices | 0.74 | Provides concrete AKS-specific recommendations (e.g., which identity features to use, deprecation notes, and configuration patterns) beyond generic security advice. | | [CIS Kubernetes benchmark](https://learn.microsoft.com/en-us/azure/aks/cis-kubernetes) | security | 0.74 | Maps CIS Kubernetes benchmark to AKS implementation with concrete control behaviors and AKS-specific security configuration details. | +| [Configure KMS encryption at rest of K8s API resources](https://learn.microsoft.com/en-us/azure/aks/kms-data-encryption) | security | 0.74 | Step-by-step configuration of KMS data encryption with platform-managed or customer-managed keys; includes product-specific security and key management settings. | | [Customized approach guidance](https://learn.microsoft.com/en-us/azure/aks/pci-customized-approach-guidance) | decision-making | 0.74 | Explains how to design and justify customized PCI controls in AKS, supporting structured decisions on alternative implementations. | | [Deploy Pod Sandboxing](https://learn.microsoft.com/en-us/azure/aks/use-pod-sandboxing) | security | 0.74 | How-to for enabling pod sandboxing with AKS-specific configuration and security implications for isolated pod VMs. | | [Introduction to PCI DSS v4.0.1 on AKS](https://learn.microsoft.com/en-us/azure/aks/pci-intro) | architecture-patterns | 0.74 | Reference architecture for regulated AKS clusters with PCI-specific design considerations and trade-offs, beyond generic architecture concepts. 
| @@ -274,6 +293,7 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Manage SSH access to nodes](https://learn.microsoft.com/en-us/azure/aks/manage-ssh-node-access) | security | 0.74 | Provides concrete SSH configuration options and key management behavior for AKS clusters/node pools, including preview constraints and supported patterns, which are AKS-specific security configurations. | | [Securely scale your applications using the Kubernetes Event-driven Autoscaling (KEDA) add-on and workload identity](https://learn.microsoft.com/en-us/azure/aks/keda-workload-identity) | security | 0.74 | Focuses on secure scaling using workload identity; likely includes specific Azure AD / workload identity configuration, role assignments, and KEDA identity bindings. | | [Security policies](https://learn.microsoft.com/en-us/azure/aks/pci-policy) | security | 0.74 | Security policy considerations and patterns for AKS in PCI DSS context, including control design and enforcement. | +| [Use Group Managed Service Accounts](https://learn.microsoft.com/en-us/azure/aks/use-group-managed-service-accounts) | security | 0.74 | Describes how to securely configure Group Managed Service Accounts for Windows Server nodes in AKS, including GMSA-related settings, domain integration details, and how pods use these accounts; concrete security configuration guidance unique to AKS, Windows, and GMSA, matching the security sub-skill. | | [About node resource reservations](https://learn.microsoft.com/en-us/azure/aks/node-resource-reservations) | limits-quotas | 0.72 | This article typically includes tables of CPU/memory reservations per node size and Kubernetes version, which are numeric, product-specific limits that affect schedulable capacity. 
| | [Configure Metrics Server VPA](https://learn.microsoft.com/en-us/azure/aks/use-metrics-server-vertical-pod-autoscaler) | configuration | 0.72 | Shows how to vertically autoscale Metrics Server pods; likely includes specific VPA objects, resource settings, and AKS-specific configuration parameters. | | [Configure and deploy Airflow](https://learn.microsoft.com/en-us/azure/aks/airflow-deploy) | configuration | 0.72 | Deployment configuration using Helm; likely includes values.yaml parameters, secret management, and storage settings specific to Airflow on AKS. | @@ -281,17 +301,15 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Create Valkey infrastructure resources](https://learn.microsoft.com/en-us/azure/aks/create-valkey-infrastructure) | configuration | 0.72 | Infrastructure setup including Key Vault, ACR, workload identity, and dedicated node pools; contains specific resource and identity configuration patterns unique to this solution. | | [Create an Azure CNI Overlay cluster](https://learn.microsoft.com/en-us/azure/aks/azure-cni-overlay) | configuration | 0.72 | Explains setup, dual-stack configuration, and example workloads; includes AKS-specific configuration parameters and supported combinations. | | [Deploy PostgreSQL](https://learn.microsoft.com/en-us/azure/aks/deploy-postgresql-ha) | configuration | 0.72 | Deployment article using CloudNativePG; likely includes CRD specs, storage and replica settings, and AKS-specific YAML configuration. | -| [Enable GMSA integration](https://learn.microsoft.com/en-us/azure/aks/use-group-managed-service-accounts) | security | 0.72 | Provides AKS- and Windows-specific configuration steps for Group Managed Service Accounts, including domain and node settings. 
| -| [Enable host-based encryption](https://learn.microsoft.com/en-us/azure/aks/enable-host-encryption) | security | 0.72 | Explains how to turn on host-based encryption for AKS nodes with specific configuration options and interactions with disk encryption types. | | [Proactive monitoring best practices](https://learn.microsoft.com/en-us/azure/aks/best-practices-monitoring-proactive) | best-practices | 0.72 | Explicit best-practices article with recommended key signals and proactive monitoring patterns specific to AKS; likely includes concrete metric names and thresholds. | +| [Rotate cluster credentials](https://learn.microsoft.com/en-us/azure/aks/update-credentials) | security | 0.72 | Details how to update or rotate service principal and Entra application credentials for AKS clusters; includes identity objects and security-focused operational steps specific to AKS. | +| [Troubleshoot an application](https://learn.microsoft.com/en-us/azure/aks/aks-desktop-deploy-troubleshooting) | troubleshooting | 0.72 | Describes a product-specific troubleshooting suite (Insights powered by Inspektor Gadget) for diagnosing issues such as network traffic, DNS failures, and process anomalies in AKS Desktop; concrete, tool-specific guidance that goes beyond generic debugging, fitting the troubleshooting sub-skill. | | [Upgrade node pools](https://learn.microsoft.com/en-us/azure/aks/upgrade-node-pools) | best-practices | 0.72 | Includes AKS-specific upgrade flows, ordering (control plane vs node pools), and best-practice guidance (e.g., version alignment, rolling upgrades) that are unique to AKS. 
| | [Use NAP in a custom VNet](https://learn.microsoft.com/en-us/azure/aks/node-auto-provisioning-custom-vnet) | configuration | 0.72 | Shows creating VNets, identities, and NAP-enabled clusters; includes networking parameters, identity roles, and AKS-specific NAP settings. | | [Use a user-defined route for egress](https://learn.microsoft.com/en-us/azure/aks/egress-udr) | configuration | 0.72 | Describes using UDRs with AKS, including route table settings and supported patterns; these are detailed, product-specific network configuration patterns. | -| [Use system node pools](https://learn.microsoft.com/en-us/azure/aks/use-system-pools) | architecture-patterns | 0.72 | Provides AKS-specific rules and recommendations for system node pools (what must run there, sizing, count) vs user pools, a core cluster design pattern. | | [Use the Azure CLI](https://learn.microsoft.com/en-us/azure/aks/virtual-nodes-cli) | configuration | 0.72 | Includes specific requirements (advanced networking/Azure CNI, delegated subnet) and CLI parameters to enable virtual nodes—detailed integration/configuration guidance. | | [Use the Azure portal](https://learn.microsoft.com/en-us/azure/aks/virtual-nodes-portal) | configuration | 0.72 | Shows portal-based configuration of virtual nodes, including required subnet delegation and networking mode, which are AKS-specific settings. | | [Use the cluster autoscaler on AKS](https://learn.microsoft.com/en-us/azure/aks/cluster-autoscaler) | configuration | 0.72 | How-to for enabling and configuring cluster autoscaler; typically includes min/max node counts, flags, and AKS-specific configuration parameters. | -| [1 - Configure Secrets Store CSI Driver](https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-driver) | integrations | 0.70 | The article describes using the Azure Key Vault provider for the Secrets Store CSI Driver with AKS, which is a concrete integration pattern between AKS and Key Vault. 
It typically includes provider-specific configuration objects, parameters, and Kubernetes resource specs unique to this integration, matching the integrations & coding patterns category. | | [AKS Communication Manager](https://learn.microsoft.com/en-us/azure/aks/aks-communication-manager) | configuration | 0.70 | Describes how to set up notifications using Azure Resource Notifications and Resource Graph for AKS maintenance events—product-specific configuration. | | [AKS component versioning](https://learn.microsoft.com/en-us/azure/aks/aks-component-versioning) | configuration | 0.70 | Explains how different AKS components are versioned and patched, including product-specific versioning behavior and upgrade mechanics. | | [AKS monitoring data reference](https://learn.microsoft.com/en-us/azure/aks/monitor-aks-reference) | configuration | 0.70 | Monitoring data reference includes specific table names, metric names, and schema details for AKS telemetry in Azure Monitor; configuration-level reference for monitoring. | @@ -311,7 +329,6 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Azure HPC Cache](https://learn.microsoft.com/en-us/azure/aks/azure-hpc-cache) | integrations | 0.70 | Describes AKS-specific integration steps and configuration parameters to mount and use Azure HPC Cache from pods, which is not generic knowledge. | | [Azure Policy built-ins](https://learn.microsoft.com/en-us/azure/aks/policy-reference) | security | 0.70 | Lists specific Azure Policy definitions for AKS, which encode security and compliance controls with concrete policy names and scopes; product-specific security configuration reference. | | [Azure RBAC and Fleet Manager](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-rbac) | security | 0.70 | RBAC articles for a specific Azure service typically list concrete built-in role names, their exact permissions/scopes, and how they apply to that resource type. 
Those role definitions and scope mappings are product-specific security configuration details that qualify as expert knowledge under the security category. | -| [BYOK for Azure managed disks](https://learn.microsoft.com/en-us/azure/aks/azure-disk-customer-managed-keys) | security | 0.70 | Page is a how-to for enabling customer-managed keys (CMK) for AKS-managed OS and data disks. It typically includes AKS- and disk-specific configuration details such as required Azure roles, key vault access policies, specific resource/parameter names, and wiring between AKS cluster, managed disks, and Key Vault. These are product-specific security configuration steps rather than just conceptual encryption info. | | [Basic scheduler features](https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-scheduler) | best-practices | 0.70 | Operator best-practices article for quotas and PDBs; likely includes AKS-specific examples, recommended configurations, and edge cases. | | [Best practices](https://learn.microsoft.com/en-us/azure/aks/best-practices-ml-ops) | best-practices | 0.70 | Contains AKS-specific MLOps recommendations, including concrete patterns and configurations for CI/CD, scaling, and operations of ML workloads. | | [Bring your own CNI](https://learn.microsoft.com/en-us/azure/aks/use-byo-cni) | configuration | 0.70 | A BYO-CNI guide for AKS typically includes AKS-specific configuration parameters (for example, cluster creation flags, required daemonset settings, and node/network prerequisites) that are unique to AKS and not just generic Kubernetes CNI usage. These concrete settings and constraints qualify as product-specific configuration expert knowledge rather than a conceptual overview. 
| @@ -319,6 +336,7 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Cluster Health Monitor checker (preview)](https://learn.microsoft.com/en-us/azure/aks/cluster-health-monitor) | configuration | 0.70 | Describes how to deploy and configure the AKS Cluster Health Monitor, including product-specific behavior (periodic checks for CoreDNS, metrics server, API server, Prometheus metrics exposure, and automatic remediation scenarios). This is expert configuration knowledge for an AKS-managed checker rather than a conceptual overview. | | [Collect metrics in Azure Managed Prometheus](https://learn.microsoft.com/en-us/azure/aks/istio-metrics-managed-prometheus) | integrations | 0.70 | Shows how to configure scraping and metrics export from Istio workloads to Azure Managed Prometheus, including config parameters and labels specific to this integration. | | [Compare AKS with other Azure container options](https://learn.microsoft.com/en-us/azure/aks/compare-container-options-with-aks) | decision-making | 0.70 | Explicitly compares AKS with ACI, App Service, Functions, and Container Apps to guide service selection; provides scenario-based recommendations and criteria for when to use each option. | +| [Confidential VM nodes](https://learn.microsoft.com/en-us/azure/aks/use-cvm) | configuration | 0.70 | How to create CVM node pools in AKS; typically includes specific CLI flags, SKU requirements, and node pool settings that are product-specific configuration details. | | [Configure Container Network Metrics](https://learn.microsoft.com/en-us/azure/aks/container-network-observability-how-to) | configuration | 0.70 | Step-by-step setup for Container Network Observability with managed/BYO Prometheus and Grafana; likely includes specific configuration flags, CRDs, and parameter values for scraping and dashboards. 
| | [Configure Static Egress Gateway](https://learn.microsoft.com/en-us/azure/aks/configure-static-egress-gateway) | configuration | 0.70 | The article describes how to configure Static Egress Gateway in AKS, which typically involves product-specific configuration fields (for gateway node pools, routing, IP settings) and step-by-step setup with concrete parameter names and values. This is detailed configuration guidance rather than just conceptual networking overview. | | [Configure autoscaling for CoreDNS](https://learn.microsoft.com/en-us/azure/aks/coredns-autoscale) | configuration | 0.70 | Explains how to tune CoreDNS autoscaler with specific parameters and thresholds; these are product-specific configuration values. | @@ -326,6 +344,7 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Configure outbound type for AKS](https://learn.microsoft.com/en-us/azure/aks/egress-outboundtype) | configuration | 0.70 | Page is about customizing AKS cluster egress via outbound types. It contains product-specific settings such as outboundType values (for example, managedNATGateway, userAssignedNATGateway, loadBalancer, userDefinedRouting), flags like defaultOutboundAccess, and how these affect routing. These are concrete AKS configuration parameters and behaviors that are not generic Kubernetes knowledge, fitting the configuration sub-skill. While it may mention limits conceptually, the core is about specific outbound configuration options rather than numeric quotas. | | [Connect remotely](https://learn.microsoft.com/en-us/azure/aks/rdp) | security | 0.70 | Explains how to securely RDP/SSH into Windows nodes that are not internet-exposed, including key-pair requirements and access paths—AKS-specific access control patterns. 
| | [Connect securely to cluster nodes](https://learn.microsoft.com/en-us/azure/aks/node-access) | security | 0.70 | The page describes concrete, product-specific methods to securely access AKS Linux and Windows nodes, including use of Kubernetes API vs AKS ARM API, and emphasizes security constraints around node exposure. While it’s partly troubleshooting-oriented, the core expert content is about secure access patterns and configuration for node connectivity, which aligns best with the security sub-skill. | +| [Container privilege hardening (Linux)](https://learn.microsoft.com/en-us/azure/aks/secure-container-access) | security | 0.70 | Describes concrete use of Linux security features (user namespaces, AppArmor, seccomp) in AKS; likely includes specific profile names, configuration snippets, and AKS-specific security guidance beyond generic concepts. | | [Cost optimization best practices](https://learn.microsoft.com/en-us/azure/aks/best-practices-cost) | best-practices | 0.70 | Explicit cost optimization best-practices article; typically includes concrete AKS-specific recommendations (e.g., node pool types, autoscaler settings) rather than only theory. | | [Create Airflow infrastructure resources](https://learn.microsoft.com/en-us/azure/aks/airflow-create-infrastructure) | configuration | 0.70 | Infrastructure setup including identity and storage for production Airflow; likely includes specific Azure resource and identity configuration details. | | [Create Windows Server node pools with `containerd`](https://learn.microsoft.com/en-us/azure/aks/windows-containerd) | configuration | 0.70 | Documents supported Kubernetes versions, default runtime behavior, and how to specify containerd for Windows node pools—AKS-specific runtime configuration. 
| @@ -341,7 +360,6 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Create persistent volumes with Azure Files CSI driver](https://learn.microsoft.com/en-us/azure/aks/create-volume-azure-files) | integrations | 0.70 | How-to for dynamically and statically creating Azure Files shares via the CSI driver for AKS. These pages typically include StorageClass, PersistentVolume, and PersistentVolumeClaim YAML with driver names, parameters, and mount options that are specific to the Azure File CSI driver (e.g., protocol selection, secrets, volumeHandle formats). That constitutes product-specific integration and configuration patterns beyond generic Kubernetes knowledge. | | [Customize cluster-scoped resources with overrides](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/howto-use-overrides-customize-resources-placement) | configuration | 0.70 | The page covers Fleet Manager resource override APIs, including how to define and apply overrides to customize resources per environment. It necessarily includes specific API shapes, fields, and configuration patterns analogous to Helm/Kustomize but unique to Fleet, which qualifies as expert configuration knowledge. | | [Customize namespace-scoped resources with overrides](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/howto-use-overrides-customize-resources-placement) | configuration | 0.70 | This is a duplicate of index 3 with the same URL and summary. It similarly provides detailed guidance on using Fleet resource override APIs, which is product-specific configuration knowledge. | -| [Deploy Confidential Containers with default policy](https://learn.microsoft.com/en-us/azure/aks/deploy-confidential-containers-default-policy) | deployment | 0.70 | CLI-based deployment of Confidential Containers; likely includes required node pool types, flags, and constraints specific to this preview runtime. 
| | [Deploy Kubernetes applications from Azure Marketplace](https://learn.microsoft.com/en-us/azure/aks/deploy-marketplace) | deployment | 0.70 | Explains how to select, deploy, and manage Kubernetes applications from Azure Marketplace onto AKS; includes Azure-specific deployment flow. | | [Deploy Kubernetes applications with ARM template](https://learn.microsoft.com/en-us/azure/aks/deploy-application-template) | deployment | 0.70 | Shows how to generate and use ARM templates to deploy Kubernetes applications programmatically to AKS; product-specific deployment automation. | | [Deploy Kubernetes applications with Azure CLI](https://learn.microsoft.com/en-us/azure/aks/deploy-application-az-cli) | deployment | 0.70 | Provides Azure CLI commands and flow to deploy Kubernetes applications from Azure Marketplace to AKS; AKS-specific deployment procedure. | @@ -355,12 +373,12 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Deploy to Azure](https://learn.microsoft.com/en-us/azure/aks/eks-edw-deploy) | deployment | 0.70 | Step-by-step deployment and validation of the migrated EDW workload on AKS; includes Azure-specific deployment details beyond generic Kubernetes knowledge. | | [Deploy wasmCloud to AKS to run distributed Wasm workloads](https://learn.microsoft.com/en-us/azure/aks/wasmcloud) | deployment | 0.70 | Covers deploying and using wasmCloud on AKS with Azure-specific configuration steps for distributed WebAssembly applications. | | [Detecting and managing resource drift](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-placement-drift) | architecture-patterns | 0.70 | Explains using the applyStrategy property to detect and handle drift between Fleet-managed definitions and placed workloads. This is a Fleet-specific control pattern for multi-cluster management, mapping scenarios (drift) to strategy choices, fitting architecture-patterns. 
| -| [Enable FIPS](https://learn.microsoft.com/en-us/azure/aks/enable-fips-nodes) | security | 0.70 | Page is a how-to for enabling FIPS 140-2 on AKS Linux/Windows node pools, tied to FedRAMP compliance. It likely includes AKS- and OS-specific configuration flags/parameters (for example, node pool creation options, image types, or CLI arguments) that are product-specific security settings rather than generic FIPS concepts, fitting the security sub-skill. | | [Enable cost optimized autoscaling](https://learn.microsoft.com/en-us/azure/aks/optimized-addon-scaling) | configuration | 0.70 | Overview of cost-optimized add-on scaling; expected to describe specific configuration options and override parameters for add-on CPU/memory and autoscaling. | | [Enable or disable NAP](https://learn.microsoft.com/en-us/azure/aks/use-node-auto-provisioning) | configuration | 0.70 | How-to for enabling/disabling NAP via CLI/ARM; includes specific parameters, flags, and template fields unique to NAP configuration. | -| [Expand pod CIDR space in Azure CNI Overlay clusters](https://learn.microsoft.com/en-us/azure/aks/azure-cni-overlay-pod-expand) | configuration | 0.70 | Uses az aks update with specific options to expand pod CIDR; this is detailed configuration and operational guidance unique to AKS. | -| [External identity provider authentication overview](https://learn.microsoft.com/en-us/azure/aks/external-identity-provider-authentication-overview) | security | 0.70 | Describes AKS-specific structured authentication behavior and how AKS implements upstream Kubernetes structured authentication with JWT authenticators for external identity providers. This is product-specific security/auth configuration knowledge (control plane auth model, how tokens are validated) that goes beyond generic identity concepts, but it’s primarily an overview so details are moderate. 
| +| [Encrypt Azure managed disks using customer-managed keys](https://learn.microsoft.com/en-us/azure/aks/azure-disk-customer-managed-keys) | security | 0.70 | Describes how to use BYOK for OS and data disks in AKS; involves product-specific key and disk encryption configuration, which is security-focused. | | [FAQ](https://learn.microsoft.com/en-us/azure/aks/faq) | limits-quotas | 0.70 | AKS FAQ pages typically include concrete platform constraints such as maximum node counts, pod density per node, supported Kubernetes version windows, and other numeric service limits that are not inferable from general training data. These are expert, product-specific values that fit the limits-quotas category. | +| [FIPS-compliant node pools](https://learn.microsoft.com/en-us/azure/aks/enable-fips-nodes) | security | 0.70 | Shows how to enable FIPS 140-2 for AKS node pools; includes product-specific security and compliance configuration steps. | +| [Host-based encryption](https://learn.microsoft.com/en-us/azure/aks/enable-host-encryption) | security | 0.70 | Explains configuring host-based encryption for AKS nodes, including how temp disks and caches are encrypted with platform or customer-managed keys; product-specific security configuration. | | [Install Azure App Configuration AKS extension](https://learn.microsoft.com/en-us/azure/aks/azure-app-configuration) | configuration | 0.70 | Covers installing and configuring the Azure App Configuration Kubernetes Provider as an AKS extension; expected to include extension resource properties, parameter names, and configuration options unique to this integration. | | [Install and use the Container Network Insights Agent](https://learn.microsoft.com/en-us/azure/aks/how-to-configure-container-network-insights-agent) | configuration | 0.70 | How-to article for deploying and configuring the agent, including authentication and identity. 
Likely contains product-specific configuration parameters and settings (extension configuration, identity/auth options, possibly CLI/ARM parameters) that go beyond generic deployment steps. | | [Install the agentic CLI for AKS](https://learn.microsoft.com/en-us/azure/aks/agentic-cli-for-aks-install) | configuration | 0.70 | An install/use article for a preview CLI typically includes concrete configuration steps, flags, environment variables, and parameter values specific to the Agentic CLI (for client vs cluster mode). These are product-specific configuration details that go beyond generic CLI usage. | @@ -368,7 +386,7 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Kubernetes Event-driven Autoscaler (KEDA) integrations](https://learn.microsoft.com/en-us/azure/aks/keda-integrations) | integrations | 0.70 | Describes KEDA integrations with Azure and open-source projects; expected to list specific scalers, configuration parameters, and integration patterns unique to AKS KEDA add-on. | | [Learn about Kueue for batch scheduling](https://learn.microsoft.com/en-us/azure/aks/kueue-overview) | configuration | 0.70 | Covers installation methods and enabling advanced Kueue features; likely includes CRD versions, Helm values, and AKS-specific setup steps. | | [Maintain and upgrade cluster components](https://learn.microsoft.com/en-us/azure/aks/upgrade-cluster-components) | deployment | 0.70 | Describes upgradeable components, unattended updates, and AKS-specific upgrade behavior, which are product-specific deployment lifecycle details. | -| [Microsoft Entra integration (legacy)](https://learn.microsoft.com/en-us/previous-versions/azure/aks/azure-ad-integration-cli) | security | 0.70 | Legacy but detailed instructions for setting up Entra integration, including app registrations, roles, and AKS CLI flags. 
| +| [Microsoft Entra pod identity (legacy)](https://learn.microsoft.com/en-us/azure/aks/use-azure-ad-pod-identity) | limits-quotas | 0.70 | Describes Microsoft Entra pod-managed identities and explicitly calls out limitations; these limits are product-specific expert knowledge, aligning with limits-quotas. (Even though the summary is truncated, the limitations section is central to this article.) | | [Migrate from Dapr OSS to the Dapr extension](https://learn.microsoft.com/en-us/azure/aks/dapr-migration) | decision-making | 0.70 | Guides choosing and executing a migration path from Dapr OSS to the managed Dapr extension, including options for reusing existing Kubernetes resources. | | [Migrate preview instances to supported instances](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/howto-migrate-preview-to-ga-fleets) | troubleshooting | 0.70 | Helps detect instances affected by CRD API changes and migrate them; includes specific API versions and mappings, used to resolve compatibility issues. | | [Migrate to KMS with platform-managed or customer-managed keys](https://learn.microsoft.com/en-us/azure/aks/migrate-key-management-service-platform-managed-key-customer-managed-key) | deployment | 0.70 | Provides migration path and constraints between KMS v2 and new infrastructure encryption with PMK/CMK, including version requirements and unsupported paths. | @@ -394,7 +412,7 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Plan pod networking](https://learn.microsoft.com/en-us/azure/aks/plan-pod-networking) | decision-making | 0.70 | Pod networking planning with options and recommendations; supports decisions between CNI modes and pod IP strategies with AKS-specific guidance. 
| | [Pod Sandboxing considerations](https://learn.microsoft.com/en-us/azure/aks/considerations-pod-sandboxing) | best-practices | 0.70 | Discusses concrete considerations and recommendations for resource, memory, CPU, and security management when using pod sandboxing in AKS. | | [Pod Security Admission](https://learn.microsoft.com/en-us/azure/aks/use-psa) | configuration | 0.70 | Duplicate of index 2; same AKS-specific PSA configuration guidance and labels usage. | -| [Pod security policies](https://learn.microsoft.com/en-us/azure/aks/use-psa) | configuration | 0.70 | Covers enabling and using PSA in AKS with AKS-specific configuration details and namespace label usage beyond generic PSA concepts. | +| [Pod security policies (legacy)](https://learn.microsoft.com/en-us/azure/aks/use-psa) | security | 0.70 | Implementation-focused PSA article for AKS; likely includes specific namespace labels, policy levels, and AKS-specific behavior for enforcing Pod Security Standards, which are product-specific security configuration details. | | [Prepare GitHub Actions infrastructure](https://learn.microsoft.com/en-us/azure/aks/github-actions-azure-files-create-infrastructure) | configuration | 0.70 | Infrastructure creation for ARC with Azure Files and Helm; likely includes storage class settings, ARC configuration values, and AKS-specific resource setup. | | [Prepare Kafka infrastructure](https://learn.microsoft.com/en-us/azure/aks/kafka-infrastructure) | configuration | 0.70 | Infrastructure preparation for Kafka; likely includes specific networking, storage, and AKS cluster settings tailored to Strimzi. | | [Prepare for deployment](https://learn.microsoft.com/en-us/azure/aks/eks-edw-prepare) | deployment | 0.70 | Describes concrete changes to deployment scripts and manifests (KEDA scaler for Azure Storage Queue, AKS infra) required to deploy the EDW workload on Azure. 
| @@ -403,7 +421,7 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Production upgrade strategies](https://learn.microsoft.com/en-us/azure/aks/aks-production-upgrade-strategies) | architecture-patterns | 0.70 | Provides proven patterns for production upgrades with trade-offs for downtime and safety; these are AKS-specific operational patterns. | | [Reduce image pull time with Artifact Streaming on AKS](https://learn.microsoft.com/en-us/azure/aks/artifact-streaming) | best-practices | 0.70 | Describes AKS-specific feature behavior, configuration steps, and quantified impact (e.g., >15% faster readiness) for large images, which is concrete product guidance. | | [Refactor app](https://learn.microsoft.com/en-us/azure/aks/eks-edw-refactor) | integrations | 0.70 | Details code changes and Azure SDK usage to adapt the EDW workload from AWS to Azure; includes product-specific integration patterns. | -| [Remove vulnerable images with ImageCleaner](https://learn.microsoft.com/en-us/azure/aks/image-cleaner) | configuration | 0.70 | Describes using Image Cleaner; likely includes configuration parameters (intervals, thresholds, labels) and AKS-specific deployment patterns. | +| [Remove vulnerable images with Image Cleaner](https://learn.microsoft.com/en-us/azure/aks/image-cleaner) | configuration | 0.70 | Describes using Image Cleaner on AKS; likely includes configuration options, schedules, labels/annotations, and operational parameters that are specific to this AKS feature. | | [Resilient public DNS load balancing](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/howto-dns-load-balancing) | configuration | 0.70 | The article walks through configuring DNS-based L4/L7 load balancing across Fleet member clusters. It will include specific Fleet and DNS configuration objects, annotations, and settings that are unique to this feature, making it expert configuration knowledge rather than a generic tutorial. 
| | [Resize node pools](https://learn.microsoft.com/en-us/azure/aks/resize-node-pool) | configuration | 0.70 | Explains AKS constraints on resizing VM sizes, and the supported pattern (new pool + cordon/drain) with AKS-specific commands and behavior. | | [Restrict and control cluster egress traffic](https://learn.microsoft.com/en-us/azure/aks/limit-egress-traffic) | configuration | 0.70 | Uses AKS outbound/FQDN rules and the AzureKubernetesService FQDN tag; this is specific configuration knowledge about tags and rule setup. | @@ -413,16 +431,21 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Secure application routing Gateway API ingress traffic](https://learn.microsoft.com/en-us/azure/aks/app-routing-gateway-api-tls) | security | 0.70 | Focuses on securing ingress traffic using AKS application routing, Gateway API, Azure Key Vault, and Secrets Store CSI Driver; likely includes product-specific security configuration steps, secret syncing behavior, and TLS termination settings that qualify as expert security configuration knowledge. | | [Secure ingress gateways](https://learn.microsoft.com/en-us/azure/aks/istio-secure-gateway) | security | 0.70 | Shows how to expose secure HTTPS services with simple or mutual TLS, including certificate and gateway configuration specific to the Istio add-on. | | [Secure pod traffic with network policies](https://learn.microsoft.com/en-us/azure/aks/use-network-policies) | security | 0.70 | The page is a product-specific guide on implementing network policies in AKS to secure pod-to-pod traffic. It focuses on security configuration (policy types, how to apply them in AKS, and platform-specific considerations like NPM support changes), which falls under security. It goes beyond conceptual networking and provides AKS-specific secure configuration patterns. 
| +| [Serve NGINX TLS certs from Key Vault](https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-nginx-tls) | integrations | 0.70 | How-to for integrating Secrets Store CSI Driver, Azure Key Vault, and NGINX Ingress TLS on AKS; likely includes YAML specs, driver parameters, and configuration values unique to this integration. | +| [Service principal as cluster identity](https://learn.microsoft.com/en-us/azure/aks/kubernetes-service-principal) | security | 0.70 | Explains creating and attaching a Microsoft Entra service principal to AKS; includes product-specific identity and role assignment configuration steps, which fall under security configuration. | | [Set up Cilium mTLS encryption](https://learn.microsoft.com/en-us/azure/aks/container-network-security-cilium-mutual-tls-how-to) | configuration | 0.70 | A how-to article for deploying Cilium mTLS encryption on AKS with Advanced Container Networking Services is likely to include concrete configuration steps: specific CLI flags, YAML fields, configuration parameter names, and possibly default/allowed values for enabling mTLS at the cluster or CNI level. These are product-specific configuration details that qualify as expert knowledge under the configuration sub-skill type. | | [Set up FQDN filtering](https://learn.microsoft.com/en-us/azure/aks/how-to-apply-fqdn-filtering-policies) | configuration | 0.70 | A setup guide for FQDN filtering with Container Network Security and Azure-managed Cilium Network Policies will include specific policy schema fields, configuration parameters, and allowed values for enabling and tuning FQDN filtering on AKS, which are product-specific configuration details. 
| | [Set up Layer 7 policies](https://learn.microsoft.com/en-us/azure/aks/how-to-apply-l7-policies) | configuration | 0.70 | How-to for setting up L7 policies with Azure managed Cilium on AKS will necessarily include product-specific policy objects, fields, and configuration parameters (e.g., CRD names, YAML spec fields, required annotations) that are unique to ACNS and AKS networking. This is concrete configuration guidance rather than just conceptual networking content. | | [Set up NVIDIA GPU Operator](https://learn.microsoft.com/en-us/azure/aks/nvidia-gpu-operator) | integrations | 0.70 | Covers using NVIDIA GPU Operator on AKS; expected to include Helm values, CRD parameters, and AKS-specific configuration/constraints for the operator. | -| [Set up WireGuard encryption](https://learn.microsoft.com/en-us/azure/aks/how-to-apply-wireguard) | configuration | 0.70 | A 'how to deploy' feature article for WireGuard encryption in ACNS typically includes AKS- and ACNS-specific configuration steps, such as CLI flags, YAML fields, and parameter names/values required to enable the feature. These are product-specific configuration details that go beyond generic deployment commands. | -| [Set up eBPF Host routing](https://learn.microsoft.com/en-us/azure/aks/how-to-enable-ebpf-host-routing) | configuration | 0.70 | How-to enable feature article; likely includes AKS CLI flags, required versions, and configuration parameters specific to this preview feature. | +| [Set up WireGuard encryption](https://learn.microsoft.com/en-us/azure/aks/how-to-apply-wireguard) | security | 0.70 | How-to deployment/configuration article for WireGuard on AKS; likely includes specific CLI flags, configuration fields, and required settings unique to AKS networking, which are product-specific security configuration details. 
|
+| [Set up a cluster for AKS desktop](https://learn.microsoft.com/en-us/azure/aks/aks-desktop-install-cluster-setup) | configuration | 0.70 | Described as covering recommended AKS cluster configurations and add-ons for AKS Desktop, and differentiating between Standard and Automatic clusters. This implies concrete configuration options, required add-ons, and possibly tier-specific settings unique to AKS Desktop, matching the configuration sub-skill. |
+| [Set up eBPF Host routing](https://learn.microsoft.com/en-us/azure/aks/how-to-enable-ebpf-host-routing) | configuration | 0.70 | Step-by-step enablement article for eBPF host routing; likely contains AKS-specific configuration flags, parameters, and required settings that qualify as expert configuration knowledge. |
+| [Set up identity bindings for scalable workload identity](https://learn.microsoft.com/en-us/azure/aks/identity-bindings) | security | 0.70 | How-to guide for configuring identity bindings to map a user-assigned managed identity (UAMI) across clusters using a single federated identity credential (FIC); contains product-specific identity configuration steps, which are security-related. |
 | [Start and stop node pools](https://learn.microsoft.com/en-us/azure/aks/start-stop-nodepools) | configuration | 0.70 | Documents AKS-specific start/stop semantics for node pools, including supported states and impact on workloads and billing. |
 | [Stateful workload upgrade patterns](https://learn.microsoft.com/en-us/azure/aks/stateful-workload-upgrades) | architecture-patterns | 0.70 | Describes zero-downtime upgrade strategies for stateful workloads on AKS, which are specialized patterns with product-specific considerations. |
 | [Stop cluster upgrades on API breaking changes](https://learn.microsoft.com/en-us/azure/aks/stop-cluster-upgrade-api-breaking-changes) | configuration | 0.70 | Describes how to configure AKS to automatically halt upgrades when breaking API changes are detected—product-specific safety configuration.
| | [Stop/deallocate nodes with Scale-down Mode](https://learn.microsoft.com/en-us/azure/aks/scale-down-mode) | decision-making | 0.70 | Explains when to delete vs deallocate nodes, cost implications, and how to configure scale-down mode on AKS. Contains AKS-specific behavior and trade-offs for different scenarios. | +| [Subscribe to AKS security bulletins](https://learn.microsoft.com/en-us/azure/aks/security-bulletins/overview) | troubleshooting | 0.70 | Security bulletins page aggregates vulnerability details and troubleshooting guidance for AKS components; typically includes CVEs, affected versions, and mitigation steps, which are product-specific troubleshooting knowledge. | | [Support options for AKS](https://learn.microsoft.com/en-us/azure/aks/aks-support-help) | troubleshooting | 0.70 | Central troubleshooting article likely maps common AKS issues and error conditions to diagnostic steps and solutions, including product-specific commands and guidance. | | [Supported Kubernetes versions](https://learn.microsoft.com/en-us/azure/aks/supported-kubernetes-versions) | limits-quotas | 0.70 | Version support and lifecycle pages for AKS typically include exact support windows, deprecation dates, and version-specific constraints (for example, how many minor versions are supported, exact end-of-support dates like November 30, 2025 for Azure Linux 2.0). These are concrete, time-bound limits that change over time and are not reliably known from model pretraining, fitting the limits-quotas category best. | | [Taking over existing resources](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-placement-takeover) | architecture-patterns | 0.70 | Describes how to use the whenToTakeOver property to manage conflicts when adding clusters with existing workloads. This is a product-specific decision pattern (how and when Fleet should assume control) with clear scenario-based guidance, aligning with architecture-patterns. 
| @@ -432,10 +455,10 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Understanding placement status](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/howto-understand-placement) | troubleshooting | 0.70 | The page focuses on understanding status fields for ClusterResourcePlacement and ResourcePlacement custom resources, which is used to monitor deployment progress and troubleshoot issues. Interpreting these status fields is product-specific expert knowledge that helps map symptoms (status values/conditions) to causes and next actions, fitting the troubleshooting category. | | [Update Azure CNI IPAM and dataplane](https://learn.microsoft.com/en-us/azure/aks/update-azure-cni) | configuration | 0.70 | Guidance on updating existing AKS clusters to Azure CNI Overlay and Azure CNI powered by Cilium. Such an article typically includes specific CLI flags, configuration parameters, and supported combinations for IPAM modes and data planes, which are product-specific configuration details rather than generic concepts. | | [Update Kubernetes and node images](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/update-orchestration) | configuration | 0.70 | How-to for update runs with stages, groups, and strategies; includes specific resource properties and allowed values, making it configuration-focused. | -| [Update key vault mode](https://learn.microsoft.com/en-us/azure/aks/update-kms-key-vault) | security | 0.70 | Shows how to switch Key Vault mode between public and private for AKS KMS etcd encryption, including specific commands and parameters unique to this feature. | +| [Update key vault mode for legacy KMS](https://learn.microsoft.com/en-us/azure/aks/update-kms-key-vault) | security | 0.70 | Operational guide to switch Key Vault mode (public/private) for AKS clusters using KMS etcd encryption; contains specific security configuration steps tied to AKS and Key Vault. 
| | [Upgrade AKS IPAM and dataplane](https://learn.microsoft.com/en-us/azure/aks/update-azure-cni) | configuration | 0.70 | Same URL and description as index 3; contains the same product-specific configuration guidance for changing Azure CNI IPAM mode and data plane technology in AKS. | | [Upgrade Basic Load Balancer](https://learn.microsoft.com/en-us/azure/aks/upgrade-basic-load-balancer-on-aks) | decision-making | 0.70 | Upgrade guidance between Basic and Standard SKUs with AKS-specific steps and recommendations; helps decide and execute migration path. | -| [Upgrade FAQ](https://learn.microsoft.com/en-us/azure/aks/upgrade-aks-faq) | troubleshooting | 0.70 | Upgrade FAQ typically maps specific upgrade symptoms and questions to causes and resolutions, which is troubleshooting-style expert knowledge. | +| [Upgrade FAQ](https://learn.microsoft.com/en-us/azure/aks/upgrade-aks-faq) | troubleshooting | 0.70 | An upgrade FAQ for AKS typically includes specific error messages, unsupported scenarios, version skew rules, and step-by-step guidance for resolving upgrade problems. This is organized around common symptoms and questions unique to AKS upgrades, which fits the troubleshooting category better than general guidance. | | [Upgrade OS version](https://learn.microsoft.com/en-us/azure/aks/upgrade-os-version) | best-practices | 0.70 | Contains AKS-specific OS support timelines and best practices for testing, upgrading, and rollback of node OS versions. | | [Upgrade Windows OS version](https://learn.microsoft.com/en-us/azure/aks/upgrade-windows-os) | configuration | 0.70 | Provides AKS-specific procedure (new node pool, workload migration) to upgrade Windows OS versions while maintaining version alignment, a concrete operational pattern. 
| | [Upgrade cluster control plane](https://learn.microsoft.com/en-us/azure/aks/upgrade-aks-control-plane) | deployment | 0.70 | Explains how to upgrade the control plane separately from node pools, including AKS-specific upgrade behavior and constraints. | @@ -446,6 +469,7 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Use Azure Premium SSD v2 disks](https://learn.microsoft.com/en-us/azure/aks/use-premium-v2-disks) | configuration | 0.70 | Contains AKS-specific StorageClass and disk configuration parameters for Premium SSD v2, including required annotations/fields that are unique to AKS CSI integration. | | [Use Azure Ultra Disks](https://learn.microsoft.com/en-us/azure/aks/use-ultra-disks) | configuration | 0.70 | Provides AKS-specific configuration for Ultra Disks (StorageClass fields, constraints, and usage patterns) that are not generic Kubernetes knowledge. | | [Use Flatcar Container Linux for AKS](https://learn.microsoft.com/en-us/azure/aks/flatcar-container-linux-for-aks) | decision-making | 0.70 | The page contains time-bound, product-specific deprecation details (exact support end dates, what operations will fail when images are removed, and required migration actions). These are expert, time-sensitive details not inferable from general training data and are used to decide when and how to move away from Flatcar on AKS. It is not primarily limits, configuration, or troubleshooting content. | +| [Use Kubernetes RBAC with Microsoft Entra integration](https://learn.microsoft.com/en-us/azure/aks/kubernetes-rbac-entra-id) | security | 0.70 | Shows how to wire Entra group membership into Kubernetes RBAC for AKS; includes concrete RBAC role/clusterrole and binding patterns tied to Entra groups, which are product-specific security configuration details. 
| | [Use OpenFaaS](https://learn.microsoft.com/en-us/azure/aks/openfaas) | deployment | 0.70 | Provides AKS-specific instructions to install and operate OpenFaaS, including configuration steps unique to this environment. | | [Use Planned Maintenance to schedule and control upgrades](https://learn.microsoft.com/en-us/azure/aks/planned-maintenance) | configuration | 0.70 | The page describes detailed, product-specific configuration of AKS planned maintenance for cluster and node image upgrades (for example, maintenance windows, schedules, and how they affect automatic upgrades). These are concrete AKS feature settings rather than generic concepts, fitting the configuration category best. It does not primarily focus on limits, troubleshooting, or architecture patterns. | | [Use Telepresence to develop and test microservices locally](https://learn.microsoft.com/en-us/azure/aks/use-telepresence-aks) | integrations | 0.70 | Details how to integrate Telepresence with AKS clusters for local development and debugging; includes product-specific commands and setup. | @@ -465,6 +489,7 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Use the Azure CLI](https://learn.microsoft.com/en-us/azure/aks/deploy-extensions-az-cli) | configuration | 0.70 | Shows CLI parameters for creating, updating, and deleting extension instances, including required/optional settings—product-specific configuration commands. | | [Use the Vertical Pod Autoscaler on AKS](https://learn.microsoft.com/en-us/azure/aks/use-vertical-pod-autoscaler) | configuration | 0.70 | A how-to article for using VPA on AKS typically includes specific YAML fields, configuration options, and possibly mode settings (e.g., Off/Auto/Initial) and resource annotations unique to AKS VPA integration, which are product-specific configuration details. 
| | [Use virtual nodes](https://learn.microsoft.com/en-us/azure/aks/virtual-nodes) | architecture-patterns | 0.70 | Describes AKS virtual nodes feature, region support, Linux-only constraints, and when to use ACI-backed pods vs regular nodes—an AKS-specific scaling pattern. | +| [Verify signed container images at deploy time](https://learn.microsoft.com/en-us/azure/aks/image-integrity) | security | 0.70 | Covers validating signed images before deployment; likely includes policy configuration, admission controls, and verification settings specific to AKS Image Integrity, which are product-specific security configurations. | | [View Fleet agent logs](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/view-fleet-agent-logs) | troubleshooting | 0.70 | Shows how to configure and view Fleet agent logs on hub and member clusters; includes log locations and patterns for monitoring/troubleshooting. | | [View kubelet logs](https://learn.microsoft.com/en-us/azure/aks/kubelet-logs) | troubleshooting | 0.70 | Shows how to use journalctl and Container insights syslog collection to get kubelet logs; includes specific commands and log locations for AKS nodes. | | [Windows container considerations](https://learn.microsoft.com/en-us/azure/aks/windows-vs-linux-containers) | decision-making | 0.70 | Lists AKS-specific differences and constraints for Windows containers (supported versions, features, timelines), guiding workload placement decisions. | @@ -476,26 +501,27 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Enable or disable ACNS](https://learn.microsoft.com/en-us/azure/aks/use-advanced-container-networking-services) | configuration | 0.68 | The page is a how-to for enabling/disabling Advanced Container Networking Services (Container Network Observability and Container Network Security) on AKS clusters. 
It likely includes AKS- and CNI-specific configuration flags, feature names, and parameter values unique to this product rather than just conceptual networking guidance, fitting the configuration sub-skill. It does not focus on limits, troubleshooting, or architecture trade-offs. | | [Enforce best practices with deployment safeguards](https://learn.microsoft.com/en-us/azure/aks/deployment-safeguards) | best-practices | 0.68 | Describes AKS Deployment Safeguards with concrete, product-specific best-practice rules (e.g., required settings, validations enforced on deployments) that go beyond generic Kubernetes knowledge. | | [GPU health checking with NPD](https://learn.microsoft.com/en-us/azure/aks/gpu-health-monitoring) | troubleshooting | 0.68 | The page explains how AKS uses Node Problem Detector specifically for GPU-enabled nodes, which typically includes concrete node-level issue types, GPU-related error conditions, and how they are surfaced for diagnosis. This is product- and scenario-specific troubleshooting knowledge (GPU hardware/driver/connectivity issues on AKS nodes) that goes beyond generic Kubernetes concepts, mapping symptoms on GPU nodes to detection and remediation steps. | -| [Ingress with Kubernetes Gateway API](https://learn.microsoft.com/en-us/azure/aks/istio-gateway-api) | configuration | 0.68 | The page describes how to configure ingress for the Istio service mesh add-on on AKS using the Kubernetes Gateway API, including product-specific configuration objects and fields (Gateway, HTTPRoute, listeners, references to Istio add-on behavior). This is concrete, product-specific configuration guidance rather than a conceptual overview, matching the configuration sub-skill. It does not primarily focus on limits, troubleshooting, or architecture trade-offs. 
| +| [Ingress with Kubernetes Gateway API](https://learn.microsoft.com/en-us/azure/aks/istio-gateway-api) | configuration | 0.68 | The page describes how to configure ingress for the Istio service mesh add-on on AKS using the Kubernetes Gateway API, including product-specific resource kinds, fields, and configuration patterns that are unique to this integration. These are concrete configuration details rather than just conceptual explanations, so it best fits the configuration sub-skill. | | [Migrate from Cluster Autoscaler to NAP](https://learn.microsoft.com/en-us/azure/aks/migrate-from-autoscaler-to-node-auto-provisioning) | deployment | 0.68 | Provides product-specific migration steps and constraints for moving from AKS cluster autoscaler to node auto-provisioning (Karpenter-based). While not a generic tutorial, it contains AKS-specific operational guidance and requirements for how to transition between autoscaling mechanisms in production, which is deployment-focused expert knowledge. | +| [Migrate to KMS v2 (legacy)](https://learn.microsoft.com/en-us/azure/aks/use-kms-v2) | decision-making | 0.68 | Guides when and how to migrate from KMS v1 to v2 based on Kubernetes version and feature differences; includes recommendations and constraints for migration, fitting decision-making around upgrade paths. | | [Node auto-drain](https://learn.microsoft.com/en-us/azure/aks/node-auto-drain) | best-practices | 0.68 | Documents AKS behavior on scheduled events (e.g., spot eviction), taints used, and how cordon/drain is triggered, which are specific operational patterns and gotchas for AKS. | | [Node images](https://learn.microsoft.com/en-us/azure/aks/node-images) | decision-making | 0.68 | The page contains product-specific, time-bound retirement details for specific AKS node images (for example, exact retirement date for Ubuntu 20.04, required Kubernetes version to migrate, and supported image options). 
This is expert, versioned knowledge that changes over time and is unlikely to be fully known from pretraining. The content helps users decide which node image/OS version to use and when to migrate based on support status, fitting best under decision-making rather than generic concepts. | | [Placing namespace-scoped resources](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/quickstart-namespace-scoped-resource-propagation) | configuration | 0.68 | Quickstart for ResourcePlacement includes concrete CRD field usage, YAML specs, and product-specific configuration patterns for propagating namespace-scoped resources across clusters, which are implementation details not inferable from general Kubernetes knowledge. | | [Security awareness training](https://learn.microsoft.com/en-us/azure/aks/pci-security-awareness-training) | security | 0.68 | PCI-driven security awareness requirements applied to AKS admin/developer personnel with environment-specific considerations. | +| [Use AI with AKS desktop](https://learn.microsoft.com/en-us/azure/aks/aks-desktop-deploy-ai-assistant) | troubleshooting | 0.68 | The article explains how to enable and use an AI-powered troubleshooting assistant that analyzes AKS logs, events, and metrics to identify issues and recommend resolutions. This is a product-specific troubleshooting workflow (symptom to diagnosis to resolution) using a unique AKS Desktop feature, which aligns with the troubleshooting sub-skill rather than generic conceptual content. | | [Use AMD GPUs](https://learn.microsoft.com/en-us/azure/aks/use-amd-gpus) | configuration | 0.68 | Focuses on AMD GPU-enabled Linux node pools; likely includes specific VM sizes, driver setup, and AKS configuration parameters for AMD GPUs. 
| | [Use Azure CNI Powered by Cilium](https://learn.microsoft.com/en-us/azure/aks/azure-cni-powered-by-cilium) | configuration | 0.68 | Covers how to deploy and configure the Cilium-based data plane with Azure CNI; includes AKS-specific flags and settings. | -| [Use Confidential Virtual Machines](https://learn.microsoft.com/en-us/azure/aks/use-cvm) | security | 0.68 | Describes how to create CVM-backed node pools with AKS-specific configuration and constraints tied to SEV-SNP and TEE usage. | | [Use HostProcess containers](https://learn.microsoft.com/en-us/azure/aks/use-windows-hpc) | configuration | 0.68 | Describes AKS support, configuration, and security implications of HostProcess/privileged containers on Windows, which are detailed product-specific behaviors. | | [Use NVIDIA GPUs](https://learn.microsoft.com/en-us/azure/aks/use-nvidia-gpu) | configuration | 0.68 | How-to for enabling GPUs on AKS typically includes VM SKU requirements, driver/daemonset settings, taints/labels, and Kubernetes-specific configuration flags that are product-specific and parameterized. | | [Use Windows GPUs](https://learn.microsoft.com/en-us/azure/aks/use-windows-gpu) | configuration | 0.68 | Guides provisioning Windows GPU-enabled node pools; expected to contain AKS- and Windows-specific parameters, image types, and configuration flags not generally known. | | [Use an ARM template](https://learn.microsoft.com/en-us/azure/aks/keda-deploy-add-on-arm) | deployment | 0.68 | Shows deploying KEDA add-on via ARM; expected to include template schema, parameters, and AKS-version-to-KEDA-version mapping relevant to deployment. | | [Use instance-level public IP addresses](https://learn.microsoft.com/en-us/azure/aks/use-node-public-ips) | configuration | 0.68 | Shows how to attach public IPs to node pools; involves AKS-specific properties and Azure resource wiring, which are concrete configuration details. 
|
+| [Use system node pools](https://learn.microsoft.com/en-us/azure/aks/use-system-pools) | best-practices | 0.68 | The article goes beyond concepts and includes AKS-specific operational guidance for system node pools (for example, a minimum node count requirement for a production cluster that runs a single system node pool, and constraints on what can run on system vs. user pools). These are product-specific DO/DON'T style recommendations and constraints that an LLM is unlikely to know precisely from training, fitting best under best-practices. |
 | [Use the Azure CLI](https://learn.microsoft.com/en-us/azure/aks/keda-deploy-add-on-cli) | deployment | 0.68 | CLI-based deployment of KEDA add-on; includes specific az commands, flags, and version behavior tied to AKS Kubernetes versions. |
 | [Enable cost analysis on your cluster](https://learn.microsoft.com/en-us/azure/aks/cost-analysis) | configuration | 0.66 | How-to for enabling cost analysis add-on; expected to include specific configuration steps, parameters, and possibly tagging/labeling schemes unique to AKS cost analysis. |
 | [Enable on an existing cluster using the Azure portal](https://learn.microsoft.com/en-us/azure/aks/enable-keda-existing-cluster) | deployment | 0.66 | Portal-based enablement of KEDA on existing clusters; includes AKS-version-to-KEDA-version mapping and portal configuration steps specific to this deployment path. |
 | [Monitor and optimize idle costs](https://learn.microsoft.com/en-us/azure/aks/cost-analysis-idle-costs) | configuration | 0.66 | Explains idle costs in the AKS cost analysis add-on and how to monitor/reduce them; likely includes specific metrics, UI fields, and configuration steps unique to the add-on.
| | [Test PostgreSQL](https://learn.microsoft.com/en-us/azure/aks/validate-postgresql-ha) | best-practices | 0.66 | Covers validation, connectivity, and failover testing; includes day-2 operations and specific steps to test HA behavior, which are product-specific operational practices. | -| [Use Dedicated Hosts with AKS](https://learn.microsoft.com/en-us/azure/aks/use-azure-dedicated-hosts) | deployment | 0.66 | Covers how to bind AKS node pools to Dedicated Hosts with specific deployment constraints and configuration parameters. | | [Use generation 2 VMs](https://learn.microsoft.com/en-us/azure/aks/generation-2-vms) | decision-making | 0.66 | Details which VM sizes support Gen2, how to create/migrate node pools, and considerations when moving from Gen1 to Gen2—AKS-specific migration and selection guidance. | | [Use shared health probes](https://learn.microsoft.com/en-us/azure/aks/shared-health-probes) | configuration | 0.66 | Describes shared health probe mode for externalTrafficPolicy=Cluster; includes AKS-specific configuration flags and behavior. | | [Access Fleet Manager hub cluster](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/access-fleet-hub-cluster-kubernetes-api) | configuration | 0.65 | Explains how to configure access (kubeconfig, auth) to the hub cluster API; product-specific access configuration steps and parameters. | @@ -523,7 +549,7 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Member cluster types](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-member-cluster-types) | decision-making | 0.65 | Describes supported member cluster types and includes a capability support table; helps decide which type to use for scenarios, aligning with decision-making. 
| | [Migrate Kubernetes updates to Fleet Manager from Terragrunt and Terraform](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/howto-migrate-updates-from-terraform) | decision-making | 0.65 | Guides migration from Terragrunt/Terraform-based updates to Fleet Update Runs; includes trade-offs and mapping of existing patterns to Fleet. | | [Migrate from in-tree to CSI driver](https://learn.microsoft.com/en-us/azure/aks/csi-migrate-in-tree-volumes) | deployment | 0.65 | Migration runbooks and scripts for moving AKS in-tree PVs to CSI drivers are product- and version-specific operational knowledge that goes beyond generic Kubernetes concepts. | -| [Migrate to KMS v2](https://learn.microsoft.com/en-us/azure/aks/use-kms-v2) | deployment | 0.65 | Describes version-specific migration steps and constraints (e.g., Kubernetes version thresholds, behavior changes when enabling encryption) that are deployment- and product-specific. | +| [Migrate from pod identity to workload identity](https://learn.microsoft.com/en-us/azure/aks/workload-identity-migrate-from-pod-identity) | decision-making | 0.65 | Provides multiple migration approaches (parallel deployment, sidecar proxy, SDK rewrite) based on SDK version and scenario; helps decide which path to use with concrete guidance, fitting decision-making around migration. | | [Monitor Kubernetes object events](https://learn.microsoft.com/en-us/azure/aks/events) | troubleshooting | 0.65 | How-to for using Kubernetes events for troubleshooting; likely maps event types/reasons to causes and actions in AKS, which is symptom→solution guidance. | | [Monitor control plane metrics](https://learn.microsoft.com/en-us/azure/aks/control-plane-metrics-monitor) | configuration | 0.65 | How-to for enabling control plane metrics in Azure Monitor; likely includes metric names, namespaces, and configuration parameters specific to AKS control plane monitoring. 
| | [Node auto-repair](https://learn.microsoft.com/en-us/azure/aks/node-auto-repair) | best-practices | 0.65 | Describes AKS-specific health checks, thresholds, and repair actions for Linux/Windows nodes, including behavior during platform maintenance—concrete product behavior and recommendations. | @@ -537,7 +563,6 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Quickstart with Azure App Configuration](https://learn.microsoft.com/en-us/azure/aks/azure-app-configuration-quickstart) | configuration | 0.65 | Quickstart for generating ConfigMaps from Azure App Configuration via the AKS extension; likely includes provider-specific settings, CLI flags, and connection parameters that qualify as product-specific configuration. | | [Refactor app](https://learn.microsoft.com/en-us/azure/aks/eks-web-refactor) | deployment | 0.65 | Provides concrete steps to move the Yelb app from AWS EKS to AKS, including AKS-specific deployment instructions; focused on practical migration deployment. | | [Roll back node pool versions](https://learn.microsoft.com/en-us/azure/aks/roll-back-node-pool-version) | best-practices | 0.65 | A rollback feature article for AKS node pools that explains when and how to use rollback, its capabilities and limitations, and post-rollback actions will contain product-specific guidance, constraints, and gotchas (for example, which upgrade paths can be rolled back, image/version combinations that are supported, and recommended operational steps). This is actionable, AKS-specific operational guidance rather than generic theory, so it best fits best-practices. | -| [Security Bulletins](https://learn.microsoft.com/en-us/azure/aks/security-bulletins/overview) | troubleshooting | 0.65 | The page aggregates AKS security/vulnerability updates and related troubleshooting guidance. Security bulletins usually include specific CVEs, component versions, and remediation steps, mapping symptoms or vulnerabilities to fixes. 
This aligns with troubleshooting, as it provides diagnosis and resolution information specific to AKS security issues. | | [Small and large language models](https://learn.microsoft.com/en-us/azure/aks/concepts-ai-ml-language-models) | decision-making | 0.65 | Provides AKS-focused guidance on when to use small vs large models in different scenarios, likely with concrete criteria for selection. | | [Understand platform differences](https://learn.microsoft.com/en-us/azure/aks/eks-edw-understand) | decision-making | 0.65 | Explains operational differences between AWS and Azure relevant to EDW scaling; supports choosing equivalent Azure services and approaches for migration. | | [Understand platform differences](https://learn.microsoft.com/en-us/azure/aks/eks-web-understand) | decision-making | 0.65 | Explains operational differences between AWS and Azure for hosting web apps on managed Kubernetes; supports choosing equivalent Azure components and approaches. | @@ -552,20 +577,22 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [VM sizes, generations, and features](https://learn.microsoft.com/en-us/azure/aks/aks-virtual-machine-sizes) | decision-making | 0.65 | Discusses available VM sizes, generations, and reasons some sizes are unavailable or retired; used to decide which VM SKUs to use for AKS, aligning with decision-making on capacity and SKU selection. | | [What is Azure Kubernetes Fleet Manager?](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/overview) | architecture-patterns | 0.65 | Describes multi-cluster management scenarios, safe rollout patterns, and resource placement strategies unique to Fleet Manager; provides product-specific architectural guidance. 
| | [Add a node pool with a unique subnet](https://learn.microsoft.com/en-us/azure/aks/node-pool-unique-subnet) | architecture-patterns | 0.64 | Shows how and when to split node pools into separate subnets for isolation and address-space constraints, an AKS-specific network/cluster design pattern. | +| [Azure Dedicated Hosts for AKS nodes](https://learn.microsoft.com/en-us/azure/aks/use-azure-dedicated-hosts) | configuration | 0.64 | Covers creating a Dedicated Host group and associating it with AKS; includes AKS-specific configuration of host groups, regions, zones, and fault domains, which are configuration details beyond generic knowledge. | | [Delete node pools](https://learn.microsoft.com/en-us/azure/aks/delete-node-pool) | configuration | 0.64 | Describes what happens to workloads and cluster resources when a node pool is deleted in AKS, including product-specific behavior and constraints. | | [Get AKS cost recommendations in Azure Advisor](https://learn.microsoft.com/en-us/azure/aks/cost-advisors) | decision-making | 0.64 | Describes AKS cost recommendations in Azure Advisor; includes how Advisor evaluates configurations and suggests changes, supporting decision-making on cost vs reliability. | | [Load test Valkey with Locust](https://learn.microsoft.com/en-us/azure/aks/validate-valkey-cluster) | best-practices | 0.64 | Resiliency validation using Locust and simulated failures; provides concrete patterns and expected behaviors for Valkey on AKS under failure. | -| [Observability for KMS etcd encryption](https://learn.microsoft.com/en-us/azure/aks/kms-observability) | configuration | 0.64 | Covers observability metrics and configuration for KMS etcd encryption, including specific metric names and monitoring settings unique to AKS. 
| | [Operate cost-optimized AKS at scale](https://learn.microsoft.com/en-us/azure/aks/operate-cost-optimized-scale) | best-practices | 0.64 | Operational guidance for cost-optimized AKS at scale; likely includes concrete patterns and configurations (autoscaling, node pools, add-ons) rather than only conceptual advice. | | [Summary](https://learn.microsoft.com/en-us/azure/aks/pci-summary) | architecture-patterns | 0.64 | Summarizes architectural choices and patterns for AKS PCI baseline, tying together well-architected decisions for regulated workloads. | | [Test resiliency during an AKS node pool upgrade](https://learn.microsoft.com/en-us/azure/aks/upgrade-valkey-aks-nodepool) | best-practices | 0.64 | Guidance on validating Valkey behavior during node pool upgrades with Locust monitoring; includes specific operational recommendations for minimizing disruption. | | [Validate resiliency during node pool upgrade](https://learn.microsoft.com/en-us/azure/aks/upgrade-mongodb-cluster) | best-practices | 0.64 | Focuses on behavior during node pool upgrades with Locust monitoring; contains product-specific guidance on upgrade impact and validation steps. | | [View and access Managed Fleet Namespaces](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/howto-managed-namespaces-access) | configuration | 0.64 | Gives product-specific procedures and parameters for locating and accessing managed namespaces, including how Fleet exposes them to users; these are concrete operational details. | | [Windows Server containers FAQ](https://learn.microsoft.com/en-us/azure/aks/windows-faq) | troubleshooting | 0.64 | FAQ typically includes specific error messages, supportability constraints, and resolutions for Windows on AKS, which are product-specific troubleshooting details. 
| +| [KMS observability (legacy)](https://learn.microsoft.com/en-us/azure/aks/kms-observability) | troubleshooting | 0.62 | Focuses on observability metrics and how to improve observability for KMS etcd encryption; likely includes specific metrics, their meanings, and actions, which are troubleshooting/diagnostic patterns unique to this feature. | | [Migrate from Network Policy Manager to Cilium Network policy](https://learn.microsoft.com/en-us/azure/aks/migrate-from-npm-to-cilium-network-policy) | decision-making | 0.62 | Stepwise migration guide with planning, execution, and validation; helps decide and carry out migration between two network policy engines with AKS-specific constraints. | | [Test resiliency](https://learn.microsoft.com/en-us/azure/aks/resiliency-mongodb-cluster) | best-practices | 0.62 | Resiliency testing scenarios using Locust and Percona operator; likely includes concrete patterns and gotchas for MongoDB behavior under failure on AKS. | | [Update NAP node images](https://learn.microsoft.com/en-us/azure/aks/node-auto-provisioning-upgrade-image) | best-practices | 0.62 | Discusses node image updates for NAP with recommended maintenance windows and examples; provides operational guidance and timing considerations specific to NAP. | | [Azure portal Kubernetes resource view](https://learn.microsoft.com/en-us/azure/aks/kubernetes-portal) | configuration | 0.60 | Portal-based AKS resource management typically documents specific blades, actions, and constraints unique to AKS integration in the portal, which are product-specific configuration/management details. | +| [Confidential Containers (preview)](https://learn.microsoft.com/en-us/azure/aks/confidential-containers-overview) | decision-making | 0.60 | Overview of Confidential Containers preview including sunset timelines and suggested alternatives; provides product-specific guidance on when to use or migrate away, which is decision-focused and time-bound expert knowledge. 
| | [Configure networking](https://learn.microsoft.com/en-us/azure/aks/node-auto-provisioning-networking) | security | 0.60 | Covers networking requirements, supported configurations, RBAC setup, and CIDR considerations; includes role definitions and network configuration details specific to NAP. | | [Overview](https://learn.microsoft.com/en-us/azure/aks/ray-overview) | deployment | 0.60 | Describes AKS-specific considerations and components (KubeRay, dashboard, cluster layout) for Ray deployments, beyond generic Ray concepts. | | [Support policy](https://learn.microsoft.com/en-us/azure/aks/istio-support-policy) | decision-making | 0.60 | Support policy articles typically include version support windows, timelines, and constraints that guide decisions about when to upgrade or adopt specific revisions. | @@ -579,10 +606,10 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | TOC Title | Confidence | Reason | |-----------|------------|--------| | [About Kubernetes Event-driven Autoscaler (KEDA)](https://learn.microsoft.com/en-us/azure/aks/keda-about) | 0.55 | Conceptual overview of KEDA add-on with a noted limitation; mostly descriptive, with only a single high-level limitation and no detailed configuration tables. | -| [AKS desktop overview](https://learn.microsoft.com/en-us/azure/aks/aks-desktop-overview) | 0.50 | Overview of AKS desktop experience; summary mentions best practices but not concrete parameters, limits, or decision matrices. | | [Cluster autoscaler overview](https://learn.microsoft.com/en-us/azure/aks/cluster-autoscaler-overview) | 0.48 | Cluster autoscaler overview; primarily conceptual explanation of how autoscaling works without detailed configuration matrices or limits. 
| | [Automated deployments overview](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-automated-deployments) | 0.45 | Automated Deployments conceptual overview; focuses on high-level pipeline concept from Git, not detailed configuration matrices or limits. | -| [Deploy an application using AKS desktop](https://learn.microsoft.com/en-us/azure/aks/aks-desktop-app) | 0.45 | Step-by-step app deployment via AKS desktop; likely a basic tutorial without detailed config matrices or limits. | +| [Cluster authorization concepts](https://learn.microsoft.com/en-us/azure/aks/concepts-cluster-authorization) | 0.45 | Conceptual explanation of AKS authorization models (Kubernetes RBAC vs Entra ID with ABAC); no detailed role tables, config parameters, or error/diagnostic mappings. | +| [KMS encryption at rest of K8s API resources (overview)](https://learn.microsoft.com/en-us/azure/aks/kms-data-encryption-concepts) | 0.45 | Conceptual overview of data encryption at rest and KMS provider integration; does not emphasize concrete configuration parameters, limits, or decision matrices. | | [Overview of add-ons, extensions, and integrations](https://learn.microsoft.com/en-us/azure/aks/integrations) | 0.45 | High-level overview of add-ons and integrations; summary does not indicate detailed config tables or error/limit specifics. | | [Resize an AKS cluster](https://learn.microsoft.com/en-us/azure/aks/resize-cluster) | 0.45 | Describes resizing AKS clusters and autoscaling; mostly conceptual and procedural scaling guidance, not explicit limits tables or config parameter references. | | [Understand AKS usage and costs](https://learn.microsoft.com/en-us/azure/aks/understand-aks-costs) | 0.45 | High-level guidance and links about understanding AKS usage and costs; appears more like a resource index than detailed configuration or decision matrices. 
| @@ -596,8 +623,10 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [DNS Resolution](https://learn.microsoft.com/en-us/azure/aks/dns-concepts) | 0.40 | DNS concepts and LocalDNS benefits; summary emphasizes understanding and troubleshooting, not specific config parameters or error-code mappings. | | [Delete an AKS cluster](https://learn.microsoft.com/en-us/azure/aks/delete-cluster) | 0.40 | Explains what happens when deleting a cluster and how to delete it; likely procedural without detailed configuration matrices or limits. | | [Deploy Open-WebSearch MCP Server on AKS](https://learn.microsoft.com/en-us/azure/aks/openwebsearch-on-aks) | 0.40 | End-to-end deployment walkthrough for Open-WebSearch MCP on AKS; primarily procedural without configuration tables, limits, or product-specific decision matrices. | +| [External identity provider authentication to cluster (overview)](https://learn.microsoft.com/en-us/azure/aks/external-identity-provider-authentication-overview) | 0.40 | High-level overview of structured authentication and external identity providers for AKS; primarily conceptual without detailed configuration parameters, limits, or error mappings. | | [Fleet and Arc integration](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-fleet-arc-integration) | 0.40 | Conceptual overview of Arc integration; likely descriptive without detailed config parameter tables or numeric constraints. | | [Fleet hub cluster overview](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-lifecycle) | 0.40 | Hub cluster overview; mostly conceptual description of lifecycle and role, not detailed decision matrices or config tables. | +| [Managed identities overview](https://learn.microsoft.com/en-us/azure/aks/managed-identity-overview) | 0.40 | Overview of managed identities in AKS (system-assigned, user-assigned, kubelet identity); largely conceptual without detailed role/permission tables or config parameters. 
| | [Monitor GPU metrics](https://learn.microsoft.com/en-us/azure/aks/monitor-gpu-metrics) | 0.40 | Conceptual overview of GPU metrics and observability; likely describes metric types but not detailed configuration tables or limits. | | [Overview](https://learn.microsoft.com/en-us/azure/aks/advanced-container-networking-services-overview) | 0.40 | Overview of Advanced Container Networking Services; marketing/feature description without clear indication of detailed configuration or limits. | | [Overview](https://learn.microsoft.com/en-us/azure/aks/automatic/aks-automatic-managed-system-node-pools-about) | 0.40 | Overview of managed system node pools for AKS Automatic; describes feature, benefits, and restrictions at a high level without detailed numeric limits or config tables in the summary. | @@ -617,6 +646,7 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Use the Azure CLI](https://learn.microsoft.com/en-us/azure/aks/learn/quick-windows-container-deploy-cli) | 0.40 | Quickstart for Windows Server containers on AKS via CLI; procedural deployment with defaults, not detailed configuration or limits. | | [Use the Azure portal](https://learn.microsoft.com/en-us/azure/aks/learn/quick-windows-container-deploy-portal) | 0.40 | Quickstart for Windows Server containers on AKS via portal; step-by-step UI guide without detailed configuration matrices or limits. | | [Vertical Pod Autoscaler overview](https://learn.microsoft.com/en-us/azure/aks/vertical-pod-autoscaler) | 0.40 | Described as an overview of Vertical Pod Autoscaler in AKS. Likely focuses on concepts and benefits rather than detailed configuration parameters, limits, or decision matrices. 
| +| [Vulnerability management overview](https://learn.microsoft.com/en-us/azure/aks/concepts-vulnerability-management) | 0.40 | Conceptual description of vulnerability management responsibilities and patching for AKS; summary suggests high-level process rather than specific error codes, configuration parameters, or limits. | | [Configure AKS diagnostics](https://learn.microsoft.com/en-us/azure/aks/aks-diagnostics) | 0.35 | Overview of the Diagnose and Solve Problems experience; summary does not indicate detailed error-code mappings or configuration tables. | | [Create a GitHub Workflow](https://learn.microsoft.com/en-us/azure/aks/aks-extension-draft-github-workflow) | 0.35 | How-to for generating a GitHub workflow from the AKS extension; likely generic CI/CD steps without AKS-specific config parameter tables or quotas. | | [Deploy and test inference models with the AI toolchain operator (KAITO) in Visual Studio Code](https://learn.microsoft.com/en-us/azure/aks/aks-extension-kaito) | 0.35 | Scenario tutorial for deploying/testing inference models with KAITO; likely step-by-step without detailed config matrices or limits. | @@ -624,11 +654,9 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Develop with Helm](https://learn.microsoft.com/en-us/azure/aks/quickstart-helm) | 0.35 | Helm quickstart for AKS; standard tutorial on packaging and deploying an app, unlikely to contain detailed AKS-specific configuration matrices. | | [Overview](https://learn.microsoft.com/en-us/azure/aks/concepts-managed-namespaces) | 0.35 | Conceptual overview of managed namespaces and logical isolation; mostly descriptive without detailed configuration parameter tables. | | [Overview](https://learn.microsoft.com/en-us/azure/aks/container-network-observability-metrics) | 0.35 | Described as an overview of container network metrics; likely conceptual description of available metrics rather than detailed config or limits. 
| -| [Overview](https://learn.microsoft.com/en-us/azure/aks/container-network-performance-ebpf-host-routing) | 0.35 | eBPF host routing overview; summary suggests conceptual performance/security discussion rather than detailed configuration or limits. | | [Overview](https://learn.microsoft.com/en-us/azure/aks/container-network-security-l7-policy-concepts) | 0.35 | L7 policy capabilities overview; appears conceptual description of features and benefits without detailed configuration tables. | | [Overview](https://learn.microsoft.com/en-us/azure/aks/github-actions-azure-files-overview) | 0.35 | Solution overview for GitHub Actions on AKS; summary indicates high-level architecture and prerequisites, not detailed configuration parameters. | | [Overview](https://learn.microsoft.com/en-us/azure/aks/kafka-overview) | 0.35 | Solution overview for Kafka on AKS with Strimzi; mostly prerequisites and architecture considerations without clear evidence of detailed config tables. | -| [Overview](https://learn.microsoft.com/en-us/azure/aks/managed-identity-overview) | 0.35 | Managed identities overview; primarily conceptual explanation of identity types and behavior, not detailed configuration steps or role tables. | | [Overview](https://learn.microsoft.com/en-us/azure/aks/postgresql-ha-overview) | 0.35 | Overview of HA PostgreSQL deployment on AKS; summary suggests architecture and prerequisites, not detailed config tables or troubleshooting mappings. | | [Overview](https://learn.microsoft.com/en-us/azure/aks/valkey-overview) | 0.35 | Solution overview for Valkey on AKS; primarily summarizes steps and architecture without clear indication of detailed configuration tables. | | [Pod Sandboxing overview](https://learn.microsoft.com/en-us/azure/aks/concepts-pod-sandboxing) | 0.35 | High-level overview of pod sandboxing and isolation; lacks detailed configuration parameters or numeric thresholds. 
| @@ -641,7 +669,7 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [About cluster extensions](https://learn.microsoft.com/en-us/azure/aks/cluster-extensions) | 0.30 | High-level description of AKS cluster extensions and lifecycle management; based on the summary it appears to be conceptual/overview without specific configuration tables, limits, or troubleshooting details. | | [Application routing add-on overview](https://learn.microsoft.com/en-us/azure/aks/app-routing) | 0.30 | Application routing add-on with NGINX ingress description and deprecation notice; summary suggests high-level guidance and support timelines, not detailed limits, config matrices, or troubleshooting content. | | [Automatically upgrade an AKS cluster](https://learn.microsoft.com/en-us/azure/aks/auto-upgrade-cluster) | 0.30 | The page describes automatic upgrades for AKS clusters and notes that any upgrade operation also upgrades the node image if not already on the latest version. Based on the summary, it is primarily lifecycle and conceptual/operational guidance without explicit configuration parameter tables, error codes, or quantified decision matrices, so it does not clearly meet the expert-knowledge criteria for any sub-skill type. | -| [Confidential Containers overview](https://learn.microsoft.com/en-us/azure/aks/confidential-containers-overview) | 0.30 | Confidential Containers overview; mostly conceptual plus deprecation note, not focused on detailed configuration or security role mappings. | +| [Cluster authentication concepts](https://learn.microsoft.com/en-us/azure/aks/concepts-cluster-authentication) | 0.30 | Cluster authentication concepts article; describes how AKS authenticates callers and recommended patterns, but summary suggests conceptual guidance rather than detailed role/parameter references. 
| | [Core AKS concepts](https://learn.microsoft.com/en-us/azure/aks/core-aks-concepts) | 0.30 | Core concepts article; mostly conceptual explanation of AKS constructs. The deprecation dates for Azure Linux 2.0 are specific but not within any defined sub-skill category. | | [Create a Dockerfile](https://learn.microsoft.com/en-us/azure/aks/aks-extension-draft-dockerfile) | 0.30 | Task-focused tutorial on creating a Dockerfile via the AKS VS Code extension; likely step-by-step without config tables, limits, or product-specific best-practice matrices. | | [Create a Kubernetes deployment](https://learn.microsoft.com/en-us/azure/aks/aks-extension-draft-deployment) | 0.30 | Tutorial on creating a Kubernetes deployment using the AKS VS Code extension; primarily workflow guidance, not detailed configuration references or limits. | @@ -663,7 +691,6 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Overview](https://learn.microsoft.com/en-us/azure/aks/mongodb-overview) | 0.30 | High-level overview of deploying MongoDB on AKS; summary does not indicate detailed config parameters or troubleshooting content. | | [Overview](https://learn.microsoft.com/en-us/azure/aks/node-auto-provisioning) | 0.30 | Described as an overview of AKS Node Auto-Provisioning covering how it works, prerequisites, upgrade behavior, and limitations. This is primarily conceptual/behavioral documentation; the summary does not indicate concrete numeric limits, configuration tables, error codes, or decision matrices with thresholds. Likely no detailed expert-only parameters or quotas. | | [Overview of CSI storage drivers](https://learn.microsoft.com/en-us/azure/aks/csi-storage-drivers) | 0.30 | Appears to be a conceptual/overview page about CSI drivers on AKS and how to use Azure Disks/Files/Blob via CSI. 
The summary does not indicate detailed limits, configuration parameter tables, error codes, or decision matrices; it reads like a capability/feature explanation and high-level guidance. | -| [Overview of KMS etcd encryption](https://learn.microsoft.com/en-us/azure/aks/kms-data-encryption-concepts) | 0.30 | Conceptual explanation of encryption at rest and KMS models; no detailed configuration tables, limits, or product-specific parameters. | | [Public endpoint DNS-based load balancing](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-dns-load-balancing) | 0.30 | Explains DNS-based multi-cluster load balancing conceptually; preview note and overview style suggest it lacks detailed configuration tables or numeric thresholds in the provided summary. | | [Scale an AKS cluster](https://learn.microsoft.com/en-us/azure/aks/scale-cluster) | 0.30 | Primarily a how-to for manually scaling AKS nodes via CLI/PowerShell; summary does not indicate numeric limits, config tables, or product-specific thresholds beyond generic scaling behavior. | | [Scaling concepts](https://learn.microsoft.com/en-us/azure/aks/concepts-scale) | 0.30 | Conceptual overview of scaling options (HPA, cluster autoscaler, ACI); high-level without clear evidence of numeric thresholds or config tables. | @@ -683,13 +710,10 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [Windows AKS partner solutions](https://learn.microsoft.com/en-us/azure/aks/windows-aks-partner-solutions) | 0.30 | Primarily a partner solutions listing and marketing-style introductions; does not focus on detailed AKS configuration, limits, or troubleshooting content. | | [Dapr extension](https://learn.microsoft.com/en-us/azure/aks/dapr-overview) | 0.25 | Dapr extension overview describing capabilities and concepts; lacks indication of detailed config tables, limits, or troubleshooting mappings. 
| | [Overview](https://learn.microsoft.com/en-us/azure/aks/stateful-workloads-overview) | 0.25 | Overview of stateful workloads on AKS; likely conceptual without detailed configuration tables or limits. | -| [Security concepts](https://learn.microsoft.com/en-us/azure/aks/concepts-security) | 0.25 | Security concepts overview; high-level description of components and services without detailed RBAC roles or config parameters. | -| [Security vulnerability management](https://learn.microsoft.com/en-us/azure/aks/concepts-vulnerability-management) | 0.25 | Vulnerability management overview; describes shared responsibility and process, not specific configuration or error mappings. | | [1 - Prepare application for AKS](https://learn.microsoft.com/en-us/azure/aks/tutorial-kubernetes-prepare-app) | 0.20 | Tutorial on preparing a multi-container app with Docker Compose for AKS; primarily step-by-step guidance without product-specific limits, configuration matrices, or specialized troubleshooting/error-code content. | | [2 - Create container registry](https://learn.microsoft.com/en-us/azure/aks/tutorial-kubernetes-prepare-acr) | 0.20 | Tutorial on creating an Azure Container Registry and pushing images; focuses on basic setup and usage, not on limits, quotas, configuration parameter tables, or decision/troubleshooting matrices. | | [3 - Create Kubernetes cluster](https://learn.microsoft.com/en-us/azure/aks/tutorial-kubernetes-deploy-cluster) | 0.20 | Tutorial for creating an AKS cluster; uses default settings and basic commands, not deep configuration or decision guidance. | | [AKS best practices](https://learn.microsoft.com/en-us/azure/aks/best-practices) | 0.20 | High-level collection/landing page for best-practices articles; does not itself contain detailed recommendations or product-specific configurations. 
| -| [Access and identity concepts](https://learn.microsoft.com/en-us/azure/aks/concepts-identity) | 0.20 | Conceptual overview of identity and access in AKS; lacks detailed configuration parameters or error mappings. | | [Advanced AKS concepts](https://learn.microsoft.com/en-us/azure/aks/advanced-aks-concepts) | 0.20 | Advanced concepts overview that links to deeper docs; lacks concrete limits, configs, or decision matrices in the summary. | | [Azure Linux AKS partner solutions](https://learn.microsoft.com/en-us/azure/aks/azure-linux-aks-partner-solutions) | 0.20 | Partner solutions listing and marketing-style guidance; no indication of concrete technical parameters or patterns. | | [Build Java app with Quarkus](https://learn.microsoft.com/en-us/azure/aks/howto-deploy-java-quarkus-app) | 0.20 | How-to guide for deploying a Quarkus CRUD app on AKS with PostgreSQL; summary suggests a basic tutorial without detailed configuration parameter tables, limits, or specialized troubleshooting guidance. | @@ -698,27 +722,34 @@ confusable_not_for: Not for Azure Container Apps (use azure-container-apps), Azu | [CNI networking overview](https://learn.microsoft.com/en-us/azure/aks/concepts-network-cni-overview) | 0.20 | Conceptual overview of CNI networking options in AKS (what CNI is, high-level behavior). The summary does not indicate specific numeric limits, configuration tables, or decision matrices; it is primarily explanatory, so it does not meet the expert-knowledge criteria. | | [Compare AKS Standard and AKS Automatic](https://learn.microsoft.com/en-us/azure/aks/intro-aks-automatic) | 0.20 | Introductory/marketing-style overview of AKS Automatic features and benefits; no concrete limits, configuration tables, error codes, or decision matrices. | | [Concepts](https://learn.microsoft.com/en-us/azure/aks/concepts-machine-learning-ops) | 0.20 | Conceptual overview of MLOps; lacks product-specific parameters, limits, or detailed configurations. 
| +| [Deploy an application](https://learn.microsoft.com/en-us/azure/aks/aks-desktop-app) | 0.20 | How-to guide for signing in, adding a cluster, creating a project, and deploying an app. It reads as a basic usage tutorial without explicit mention of configuration tables, limits, RBAC roles, or troubleshooting mappings. | +| [Expand pod CIDR space in Azure CNI Overlay clusters](https://learn.microsoft.com/en-us/azure/aks/azure-cni-overlay-pod-expand) | 0.20 | Primarily describes that pod CIDR space can be expanded using az aks update; summary does not indicate detailed configuration tables, limits, or product-specific error mappings, so it reads as a procedural capability/tutorial rather than expert-knowledge configuration or limits content. | | [Fine-tuning language models](https://learn.microsoft.com/en-us/azure/aks/concepts-fine-tune-language-models) | 0.20 | Conceptual overview of fine-tuning language models on AKS; describes methods and benefits but does not include product-specific limits, configuration tables, error codes, or decision matrices with quantified trade-offs. | | [From a code repository](https://learn.microsoft.com/en-us/azure/aks/automatic/quick-automatic-from-code) | 0.20 | Quickstart workflow for deploying from source using automated deployments; appears to be step-by-step tutorial without detailed config tables, limits, or troubleshooting mappings. | | [Ingress with the Kubernetes Gateway API (preview)](https://learn.microsoft.com/en-us/azure/aks/managed-gateway-api) | 0.20 | Describes installing Gateway API CRDs on AKS; from the summary it looks like a conceptual/installation overview without detailed config tables, limits, or troubleshooting mappings. | -| [Introduction to resource placement](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-resource-placement) | 0.20 | Introduced as a concepts article for intelligent resource placement across multiple clusters. 
The summary focuses on challenges and high-level capabilities, without evidence of configuration tables, limits, or detailed decision matrices. This is likely conceptual guidance rather than the kind of specific, product-level expert knowledge required. | +| [Introduction to resource placement](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-resource-placement) | 0.20 | Conceptual description of intelligent resource placement in Azure Kubernetes Fleet Manager; no concrete limits, configuration tables, error codes, or decision matrices with quantified trade-offs are evident from the summary. | | [Migration and modernization solutions for Windows containers on AKS](https://learn.microsoft.com/en-us/azure/aks/windows-aks-migration-modernization-solutions) | 0.20 | Partner migration walkthrough overview; no concrete limits, configs, or error mappings in summary. | | [Monitor kube-audit events](https://learn.microsoft.com/en-us/azure/aks/monitor-aks) | 0.20 | Duplicate of index 1; high-level monitoring overview without clear evidence of detailed configuration or limits in the summary. | | [Monitoring concepts and resources](https://learn.microsoft.com/en-us/azure/aks/monitor-aks) | 0.20 | High-level monitoring overview and integrations; summary does not indicate specific config tables, limits, or error mappings. | | [Multi-cluster networking overview](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-multi-cluster-networking-overview) | 0.20 | Multi-cluster networking concepts article; summary indicates high-level explanation of north-south/east-west patterns and challenges, without concrete limits, configs, or decision matrices. 
| | [Overview](https://learn.microsoft.com/en-us/azure/aks/concepts-network-azure-cni-overlay) | 0.20 | Described as an overview of Azure CNI Overlay architecture and IP planning; from the summary it appears conceptual (architecture, differences from kubenet) without clear evidence of concrete configuration tables, limits, or decision matrices. | | [Overview](https://learn.microsoft.com/en-us/azure/aks/container-network-insights-agent-overview) | 0.20 | Overview/marketing-style description of the Container Network Insights Agent; no detailed error codes, configuration tables, limits, or decision matrices are evident in the summary. | -| [Overview](https://learn.microsoft.com/en-us/azure/aks/container-network-security-wireguard-encryption-concepts) | 0.20 | Primarily a conceptual/overview article describing WireGuard encryption capabilities in ACNS for AKS. The summary suggests high-level security and feature explanation without specific RBAC roles, configuration parameter tables, limits, or error codes. | +| [Overview](https://learn.microsoft.com/en-us/azure/aks/container-network-performance-ebpf-host-routing) | 0.20 | Overview of eBPF host routing; summary indicates conceptual performance discussion without detailed configuration tables, limits, or decision matrices. | +| [Overview](https://learn.microsoft.com/en-us/azure/aks/container-network-security-wireguard-encryption-concepts) | 0.20 | Conceptual overview of WireGuard encryption on AKS; summary suggests high-level explanation without concrete configuration parameters, limits, or role definitions. | +| [Quickstart](https://learn.microsoft.com/en-us/azure/aks/aks-desktop-quickstart-auto) | 0.20 | Quickstart walkthrough for deploying an app with AKS Desktop. Primarily step-by-step tutorial; no evidence of detailed configuration matrices, limits, or product-specific troubleshooting/error-code mappings. 
| | [Safe multi-cluster upgrades](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-update-orchestration) | 0.20 | Described as a concepts article about update orchestration (stages, groups, strategies) across clusters. This is primarily conceptual behavior and workflow; there is no indication of numeric limits, config tables, error codes, or detailed settings that would qualify as expert knowledge in the defined sub-skill types. | | [Storage concepts](https://learn.microsoft.com/en-us/azure/aks/concepts-storage) | 0.20 | Conceptual overview of AKS storage (volumes, PVs, storage classes) without detailed configuration tables or product-specific limits. | | [Sustainable software engineering best practices](https://learn.microsoft.com/en-us/azure/aks/concepts-sustainable-software-engineering) | 0.20 | Conceptual sustainability principles and high-level guidance; lacks concrete AKS-specific configuration values, limits, or patterns. | -| [Use the Azure CLI](https://learn.microsoft.com/en-us/azure/aks/learn/quick-kubernetes-deploy-cli) | 0.20 | Quickstart tutorial for creating an AKS cluster and deploying an app via Azure CLI with default settings; it does not focus on limits, configuration matrices, error codes, or product-specific best practices beyond generic deployment steps. | +| [Use the Azure CLI](https://learn.microsoft.com/en-us/azure/aks/learn/quick-kubernetes-deploy-cli) | 0.20 | Quickstart tutorial for creating an AKS cluster and deploying an app via Azure CLI. It shows basic commands and default settings but does not provide configuration parameter tables, limits, quotas, error-code-based troubleshooting, or decision matrices. Content is procedural, not expert reference material. 
| | [Use the Azure portal](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/quickstart-create-fleet-and-members-portal) | 0.20 | Quickstart focused on creating a Fleet Manager and joining clusters via portal; primarily step-by-step UI guidance without configuration tables, limits, or product-specific diagnostic details. | | [AKS basics](https://learn.microsoft.com/en-us/azure/aks/get-started-aks) | 0.10 | Basic AKS learning overview; primarily a resource index without detailed expert configuration or troubleshooting content. | +| [AKS desktop overview](https://learn.microsoft.com/en-us/azure/aks/aks-desktop-overview) | 0.10 | High-level overview of AKS Desktop, its purpose, and benefits. No indication of numeric limits, configuration tables, RBAC role details, or troubleshooting content. | +| [Access and identity overview](https://learn.microsoft.com/en-us/azure/aks/concepts-identity) | 0.10 | Identity concepts overview; summary indicates orientation across scenarios and links to deep dives, not detailed configuration or role definitions. | | [Cross-cluster networking for Fleet Manager](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-cross-cluster-networking) | 0.10 | Described explicitly as a conceptual overview of cross-cluster networking for Azure Kubernetes Fleet Manager. Overviews of concepts and capabilities without configuration tables, numeric thresholds, or error mappings do not meet any expert-knowledge criteria. | | [Kubernetes basics](https://learn.microsoft.com/en-us/azure/aks/get-started-kubernetes) | 0.10 | Introductory Kubernetes learning guide; conceptual and navigational without product-specific limits, configs, or decision matrices. | | [Networking concepts](https://learn.microsoft.com/en-us/azure/aks/concepts-network) | 0.10 | This is a conceptual networking overview for AKS (kubenet vs Azure CNI, ingress, load balancers). 
It explains what the components are but does not provide detailed configuration tables, limits, or decision matrices with quantified trade-offs, so it lacks the required expert-knowledge signals. | -| [Use managed identities](https://learn.microsoft.com/en-us/azure/aks/managed-identity-overview) | 0.10 | Described as an overview of managed identities in AKS (system-assigned, user-assigned, kubelet identity) and how they work. This is conceptual/architectural content without clear indication of concrete configuration tables, parameter values, or role/permission matrices that go beyond what an LLM would know from training. | +| [Projects in AKS desktop](https://learn.microsoft.com/en-us/azure/aks/aks-desktop-projects) | 0.10 | Conceptual explanation of Projects in AKS Desktop and how they relate to namespaces. No specific configuration parameters, limits, or decision matrices are indicated. | +| [Security overview](https://learn.microsoft.com/en-us/azure/aks/concepts-security) | 0.10 | Security concepts overview for AKS; primarily conceptual description of components and services without detailed RBAC role lists or configuration parameters. | | [What is AKS?](https://learn.microsoft.com/en-us/azure/aks/what-is-aks) | 0.10 | High-level overview of AKS features and benefits without detailed limits, configs, or error mappings. | | [Fleets and member clusters](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-fleet) | - | Explicitly a conceptual overview of fleets and member clusters; no detailed limits, configs, or troubleshooting content. | | [Multi-tenancy with Managed Fleet Namespaces](https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-fleet-managed-namespace) | - | Conceptual overview of Managed Fleet Namespaces and multi-cluster multi-tenancy; no concrete configuration parameter tables or limits indicated. 
| diff --git a/products/azure-language-service/azure-language-service.csv b/products/azure-language-service/azure-language-service.csv index 0240ef31..6329a3c7 100644 --- a/products/azure-language-service/azure-language-service.csv +++ b/products/azure-language-service/azure-language-service.csv @@ -107,7 +107,6 @@ https://learn.microsoft.com/en-us/azure/ai-services/language-service/named-entit https://learn.microsoft.com/en-us/azure/ai-services/language-service/named-entity-recognition/quickstart,Quickstart,Quickstart: Use the NER client library - Foundry Tools,Use the NER client library to extract entities,Use this quickstart to start using the Named Entity Recognition (NER) API.,"Reference documentation|More samples|Package (NuGet)|Library source code Use this quickstart to create a Named Entity Recognition (NER) application with the client library for .NET. In the following example, you create a C# application that can identify recognized entities in text. Tip You can use Microsoft Foundry to try Azure Language features without needing to write code.",2025-11-18T08:00:00.000Z,quickstart,integrations,0.7,True,"Quickstart for NER client library; includes API calls, parameters, and response structures specific to NER.",unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/named-entity-recognition/tutorials/extract-excel-information,Extract information in Excel using Power Automate,Extract information in Excel using Power Automate - Foundry Tools,,"Learn how to Extract Excel text without having to write code, using Named Entity Recognition and Power Automate.","In this tutorial, you create a Power Automate flow to extract text in an Excel spreadsheet without having to write code. This flow uses a spreadsheet consisting of issues reported about an apartment complex, and classifies them into two categories: plumbing and other. It also extracts the names and phone numbers of the tenants who sent them.
Lastly, the flow appends this information to the Excel sheet. In this tutorial, you learn how to:",2025-11-18T08:00:00.000Z,tutorial,,0.3,False,Power Automate tutorial using NER and Excel; mainly workflow steps without detailed configuration matrices or limits.,unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/native-document-support/managed-identities,Create a managed identity for storage containers,Managed identities for storage blobs - Foundry Tools,Use managed identities for Language Blob access,Create managed identities for containers and blobs with Azure portal.,"Managed identities for Azure resources are service principals that create a Microsoft Entra identity and specific permissions for Azure managed resources. Managed identities are a safer way to grant access to storage data and replace the requirement for you to include shared access signature tokens (SAS) with your source and target container URLs. You can use managed identities to grant access to any resource that supports Microsoft Entra authentication, including your own applications. To grant",2025-11-18T08:00:00.000Z,how-to,security,0.75,True,"Describes granting Blob access via managed identities; includes role assignments, identity types, and Entra configuration specific to this scenario.",unchanged -https://learn.microsoft.com/en-us/azure/ai-services/language-service/native-document-support/overview,Native documents for language processing,Native document support for Azure Language in Foundry Tools (preview) - Foundry Tools,Use native document support with Language APIs,How to use native document with Azure Language Personally Identifiable Information and Summarization capabilities.,"Important Language is a cloud-based service that applies Natural Language Processing (NLP) features to text-based data. 
The native document support capability enables you to send API requests asynchronously, using an HTTP POST request body to send your data and HTTP GET request query string to retrieve the status results. Your processed documents are located in your Azure Blob Storage target container. A native document refers to the file format used to create the original document such as Micro",2025-11-18T08:00:00.000Z,how-to,integrations,0.65,True,"Describes asynchronous native document processing with Blob Storage; will include request formats, storage URL patterns, and status retrieval parameters unique to this feature.",unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/native-document-support/shared-access-signatures,Create SAS tokens for storage containers,Shared access signature (SAS) tokens for storage blobs - Foundry Tools,Create SAS tokens for Language Blob access,Create shared access signature tokens (SAS) for containers and blobs with Azure portal.,"Learn to create user delegation, shared access signature (SAS) tokens, using the Azure portal. User delegation SAS tokens are secured with Microsoft Entra credentials. SAS tokens provide secure, delegated access to resources in your Azure storage account. Tip Role-based access control (managed identities) provide an alternate method for granting access to your storage data without the need to include SAS tokens with your HTTP requests.
At a high level, here's how SAS tokens work: Your application",2025-11-18T08:00:00.000Z,how-to,security,0.75,True,"Explains creating user delegation SAS tokens for containers/blobs; includes permission scopes, token parameters, and portal settings that are concrete security configurations.",unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/orchestration-workflow/concepts/data-formats,Data formats,Orchestration workflow data formats - Foundry Tools,,Learn about the data formats accepted by orchestration workflow.,"When data is used by your model for learning, it expects the data to be in a specific format. When you tag your data, it gets converted to the JSON format described in this article. You can also manually tag your files.",2025-12-18T23:08:00.000Z,concept-article,,0.4,False,Data formats overview; summary doesn’t expose detailed schema or parameter constraints.,unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/orchestration-workflow/concepts/evaluation-metrics,Evaluation metrics,Orchestration workflow model evaluation metrics - Foundry Tools,,Learn about evaluation metrics in orchestration workflow,"Your dataset is split into two parts: a set for training, and a set for testing. The training set is used to train the model, while the testing set is used as a test for model after training to calculate the model performance and evaluation. The testing set isn't introduced to the model through the training process, to make sure that the model is tested on new data. Model evaluation is triggered automatically after training is completed successfully. 
The evaluation process starts by using the tr",2025-11-18T08:00:00.000Z,concept-article,,0.2,False,Conceptual evaluation metrics; no product-specific thresholds or decision matrices.,unchanged @@ -133,31 +132,34 @@ https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally- https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/concepts/conversations-entity-categories,Extended format,Entity categories recognized by Conversational Personally Identifiable Information (PII) detection in Azure Language in Foundry Tools - Foundry Tools,,Learn about the types of entities the conversational PII feature can detect and identify within conversation inputs.,The conversational Personally Identifiable Information (PII) detection API uses AI and machine learning to detect and redact sensitive information in conversation inputs. The API categorizes personal details into predefined entity types for precise identification and removal. Detecting and redacting PII helps ensure compliance with privacy regulations and enables your applications to handle data securely. 
Tip Try PII detection in text or conversations using the Microsoft Foundry language playgroun,2026-02-21T06:16:00.000Z,concept-article,,0.2,False,Conceptual description of conversational PII entity categories; no numeric limits or config parameters indicated.,unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/concepts/entity-categories,Extended format,Entity categories recognized by Personally Identifiable Information (PII) and Protected Health Information (PHI) detection in Azure Language in Foundry Tools - Foundry Tools,,Learn about the types of entities the PII feature can detect and identify within unstructured text.,"The Personally Identifiable Information (PII) and Protected Health Information (PHI) detection APIs are cloud-based solutions that use artificial intelligence (AI) and machine learning to help you create smart applications with advanced natural language processing. The PII and PHI APIs effectively detect and removes sensitive information from input data by categorizing personal details into specific, predefined entity types. This comprehensive approach not only safeguards sensitive data to ensure fu",2025-11-18T15:37:00.000Z,concept-article,,0.2,False,"Conceptual description of entity categories; no numeric limits, configs, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/concepts/entity-categories-list,List format,Entity categories list recognized by Personally Identifiable Information (PII) detection in Azure Language in Foundry Tools - Foundry Tools,,View a list of entity types the PII feature can detect and identify within unstructured text.,Note Access to this page requires authorization. You can try signing in or changing directories. Access to this page requires authorization.
You can try changing directories.,2026-02-21T06:16:00.000Z,concept-article,,0.5,False,"Entity list behind authorization; likely catalog of categories, not limits or configuration per the defined sub-skills.",unchanged +https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/conversation-pii-overview,Overview,Conversation Personally Identifiable Information (PII) redaction overview - Foundry Tools,,Learn how conversation PII redaction in Azure Language detects and redacts sensitive data in turn-based conversational inputs.,"Conversation PII redaction in Azure AI Language helps you detect and redact sensitive data in turn-based conversational input. You can use this feature for chat and transcript workflows such as customer support conversations, call transcripts, and meeting transcripts. Conversation PII is optimized for asynchronous conversation jobs and conversation-level context, so you can redact sensitive data across multiple speakers and turns.",2026-04-21T16:56:00.000Z,overview,,0.2,False,"Overview of conversation PII redaction and use cases; summary does not indicate detailed configuration tables, limits, or error mappings.",new +https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/document-based-pii-overview,Overview,Document-based PII overview - Foundry Tools,,Learn how document-based PII redaction in Azure Language detects and redacts sensitive data from native documents while preserving file structure.,"Document-based PII is a preview feature in Azure AI Language Personally Identifiable Information (PII) detection. It helps you detect and redact sensitive data directly in native document files, including Microsoft Word and PDF files, without building your own text extraction and reconstruction pipeline. This feature uses an asynchronous API workflow and returns redacted output that preserves document structure and formatting.
You can use it when document fidelity is important for compliance revi",2026-04-21T16:56:00.000Z,overview,,0.25,False,"Preview feature overview for document-based PII; description focuses on what it does and scenarios, not on concrete configuration parameters, limits, or troubleshooting mappings.",new https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/adapt-to-domain-pii,Adapt PII to your domain,Adapt Personally Identifying Information (PII) to your domain - Foundry Tools,,This article shows you how to adapt Personally Identifying Information (PII) to your domain.,"To support using your own terminology to identify entities (also known as context), the entitySynonyms feature enables you to define custom synonyms for specific entity types. This feature helps the system detect entities that appear in your inputs using terms or vocabulary that the model doesn't recognize by default. Aligning your specific terms with standard entities allows the model to accurately recognize and link these terms during entity detection. This custom vocabulary support enhances the ",2025-11-18T08:00:00.000Z,how-to,,0.4,False,Domain adaptation feature description; summary doesn’t show concrete config parameter tables or numeric thresholds.,unchanged -https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/redact-conversation-pii,Redact PII from conversations,Identify and extract Personally Identifying Information (PII) from conversations - Foundry Tools,,"This article shows you how to detect and redact Personally Identifying Information (PII) from speech, chat, and spoken-word transcriptions and call transcripts.","Azure Language in Foundry Tools conversation PII API analyzes audio discourse to identify and redact sensitive information (PII) using various predefined categories. This API works on both transcribed text (referred to as transcripts) and chats.
For transcripts, it also facilitates the redaction of audio segments containing PII by providing the timing information for those segments.",2026-04-04T06:03:00.000Z,how-to,,0.4,False,"How-to/tutorial style description of using conversation PII API; summary does not indicate detailed configuration tables, limits, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/redact-document-pii,Redact PII from native documents,Identify and extract Personally Identifying Information (PII) from native documents - Foundry Tools,,This article shows you how to redact Personally Identifying Information (PII) from native documents.,"Important Language is a cloud-based service that applies Natural Language Processing (NLP) features to text-based data. The native document support capability enables you to send API requests asynchronously, using an HTTP POST request body to send your data and HTTP GET request query string to retrieve the status results. Your processed documents are located in your Azure Blob Storage target container. A native document refers to the file format used to create the original document such as Micro",2025-11-18T08:00:00.000Z,how-to,,0.4,False,Native document PII how-to; asynchronous pattern description but no visible limits or config parameter tables.,unchanged -https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/redact-text-pii,Redact PII from text,Identify and extract Personally Identifying Information (PII) from text - Foundry Tools,,"This article shows you how to identify, extract, and redact Personally Identifying Information (PII) from text.","Azure Language in Foundry Tools is a cloud-based service that applies Natural Language Processing (NLP) features to text-based data. 
The PII feature can evaluate unstructured text, extract, and redact sensitive information (PII) and health information (PHI) in text across several predefined categories.",2026-03-17T08:00:00.000Z,how-to,,0.2,False,"Appears to be a how-to/tutorial for using PII redaction in Azure Language/Foundry Tools, likely focused on calling the service and basic usage. No indication of detailed limits, configuration parameter tables, error-code troubleshooting, or security/RBAC specifics; more of a feature usage guide than expert reference content.",unchanged +https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/redact-conversation-pii,Detect and redact PII in conversations,Identify and extract Personally Identifiable Information (PII) from conversations - Foundry Tools,Use conversation PII API for chats and transcripts,"This article shows you how to detect and redact Personally Identifiable Information (PII) from speech, chat, and spoken-word transcriptions and call transcripts.","Azure Language in Foundry Tools conversation PII API analyzes audio discourse to identify and redact sensitive information (PII) using various predefined categories. This API works on both transcribed text (referred to as transcripts) and chats.
For transcripts, it also facilitates the redaction of audio segments containing PII by providing the timing information for those segments.",2026-04-04T06:03:00.000Z,how-to,integrations,0.6,True,"How-to article for conversation PII typically documents API request structure, parameters (such as conversation jobs, timing info), and service-specific behaviors for transcripts and chats, which are integration-focused expert details.",new +https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/redact-document-pii,Detect and redact PII in native documents,Identify and extract Personally Identifiable Information (PII) from native documents - Foundry Tools,Integrate document-based PII redaction with Blob Storage,This article shows you how to redact Personally Identifiable Information (PII) from native documents.,"Important Language is a cloud-based service that applies Natural Language Processing (NLP) features to text-based data. The native document support capability enables you to send API requests asynchronously, using an HTTP POST request body to send your data and HTTP GET request query string to retrieve the status results. Your processed documents are located in your Azure Blob Storage target container. 
A native document refers to the file format used to create the original document such as Micro",2025-11-18T08:00:00.000Z,how-to,integrations,0.65,True,"How-to for native document PII redaction mentions asynchronous POST/GET workflow and Blob Storage containers, implying detailed API request formats, parameters, and storage integration patterns that qualify as product-specific integration knowledge.",new +https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/redact-text-pii,Detect and redact PII in text,Identify and extract Personally Identifiable Information (PII) from text - Foundry Tools,Call Azure Text PII redaction APIs from code,"This article shows you how to identify, extract, and redact Personally Identifiable Information (PII) from text.","Azure Language in Foundry Tools is a cloud-based service that applies Natural Language Processing (NLP) features to text-based data. The PII feature can evaluate unstructured text, extract, and redact sensitive information (PII) and health information (PHI) in text across several predefined categories.",2026-03-17T08:00:00.000Z,how-to,integrations,0.6,True,"How-to for identifying and redacting PII from text is likely to include request/response schemas, API parameters, and configuration options (such as categories, redaction modes) that are specific to this service’s SDK/REST API.",new https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/use-containers,Install and run containers,Use personally identifiable information (PII) detection Docker containers on-premises - Foundry Tools,Apply PII container per-call character and document limits,"Use Docker containers for the Personally Identifiable Information (PII) detection API to determine the language of written text, on-premises.","Note The data limits in a single synchronous API call for the PII container are 5,120 characters per document and up to 10
documents per call. Containers enable you to host the PII detection API on your own infrastructure. If you have security or data governance requirements that can't be fulfilled by calling PII detection remotely, then containers might be a good option. If you don't have an Azure subscription, create a free account before you begin.",2025-12-05T08:00:00.000Z,how-to,limits-quotas,0.85,True,"Explicitly states data limits per synchronous API call (5,120 characters per document and 10 documents per call), which are concrete numeric quotas.",unchanged -https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/language-support,Language support,Personally Identifiable Information (PII) detection language support - Foundry Tools,,This article explains which natural languages the PII detection feature supports of Azure Language in Foundry Tools.,"Use this article to learn which natural languages text PII, document PII, and conversation PII features support.",2025-11-18T08:00:00.000Z,concept-article,,0.4,False,"Language support list; catalog-like, not configuration or limits per the defined categories.",unchanged -https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/overview,Overview,What is the Personally Identifying Information (PII) detection feature in Azure Language? - Foundry Tools,,"An overview of the PII detection feature in Azure Language, which helps you extract entities and sensitive information (PII) in text.","Personally Identifiable Information (PII) detection is an Azure Language core capability. The PII detection service is a cloud-based API that uses machine learning to identify and redact sensitive information from input data. The service classifies sensitive personal data into predefined categories such as phone numbers, email addresses, and identification documents. Tip Try PII detection in Microsoft Foundry portal.
There you can utilize a currently existing Language Studio resource or create a new ",2026-03-30T08:00:00.000Z,overview,,0.2,False,"Conceptual overview of PII detection capability; no detailed limits, configs, or troubleshooting content indicated.",unchanged +https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/language-support,Language support,Personally Identifiable Information (PII) detection language support - Foundry Tools,Check language support for Azure PII detection,This article explains which natural languages the PII detection feature in Azure Language in Foundry Tools supports.,Use this article to learn which natural languages each PII feature type supports:,2026-04-21T16:56:00.000Z,concept-article,configuration,0.65,True,"Language support pages typically enumerate exact supported languages and feature coverage per language in table form, which is product-specific configuration-like reference data not inferable from general training.",updated +https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/overview,Overview,What is the Personally Identifiable Information (PII) detection feature in Azure Language? - Foundry Tools,,"An overview of the PII detection feature in Azure Language, which helps you extract entities and sensitive information (PII) in text.","Personally Identifiable Information (PII) detection is an Azure Language core capability that helps you identify, classify, and redact sensitive data across text, conversations, and native documents. Submit input text to the service and receive structured output with entity categories, confidence scores, and redacted results based on your API configuration.
You can use this capability to implement privacy controls, reduce sensitive data exposure, and support compliance requirements in application ",2026-04-21T16:56:00.000Z,overview,,0.2,False,"High-level overview of PII detection capability; no specific limits, configuration tables, error codes, or product-specific decision matrices.",updated https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/quickstart,Quickstart,Quickstart: Detect personally identifiable information (PII) in text - Foundry Tools,,"Use Azure Language in Foundry Tools to detect and redact personally identifiable information (PII) in text with client libraries, the REST API, or the Microsoft Foundry portal.","In this quickstart, you use the Azure Language in Foundry Tools PII detection feature to identify and redact personally identifiable information in text. You can get started using your preferred client library, the REST API, or the Microsoft Foundry portal. If you don't have an Azure subscription, create a free account before you begin. Note This quickstart covers PII detection in documents. To detect PII in conversations, see How to detect and redact PII in conversations. To detect PII in text, se",2026-02-06T08:00:00.000Z,quickstart,,0.2,False,Quickstart; step-by-step usage without explicit limits tables or config matrices in summary.,unchanged +https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/text-pii-overview,Overview,Text Personally Identifiable Information (PII) redaction overview - Foundry Tools,,Learn how text PII redaction in Azure Language detects and redacts sensitive data in raw text for applications and workflows.,"Text PII redaction in Azure AI Language helps you detect and redact sensitive data in raw text strings. You can use this feature when your application handles logs, prompts, forms, chat messages, or other text content directly.
Text PII is optimized for synchronous request/response integration and configurable redaction behavior, so you can apply PII controls inline in application and data-processing workflows.",2026-04-21T16:56:00.000Z,overview,,0.2,False,"Conceptual overview of text PII redaction and scenarios; no detailed settings, limits, or troubleshooting content indicated.",new https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/concepts/azure-resources,Resource planning,Azure resources - custom question answering - Foundry Tools,Choose and manage Azure resources for CQA,"Question answering uses several Azure sources, each with a different purpose. Understanding how they're used individually allows you to plan for and select the correct pricing tier or know when to cha","Custom question answering uses several Azure sources, each with a different purpose. Understanding how they're used individually allows you to plan for and select the correct pricing tier or know when to change your pricing tier. 
Understanding how resources are used in combination allows you to find and fix problems when they occur.",2025-12-11T08:00:00.000Z,get-started,decision-making,0.7,True,Explains how different Azure resources and pricing tiers are used and when to change tiers; supports planning and tier selection decisions.,unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/concepts/best-practices,Best practices,Best practices - custom question answering - Foundry Tools,Implement best practices for CQA project quality,Use these best practices to improve your project and provide better results to your application/chat bot's end users.,Use these best practices to improve your project and provide better results to your client application or chat bot's end users.,2025-12-12T08:00:00.000Z,best-practice,best-practices,0.8,True,Explicit best-practices article for improving project results; contains product-specific recommendations and patterns.,unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/concepts/confidence-score,Confidence score,Confidence score - Custom question answering - Foundry Tools,Understand and configure confidence scores in CQA,"When a user query is matched against a knowledge base, Custom question answering returns relevant answers, along with a confidence score.","When a user query is matched against a project (also known as a knowledge base), Custom question answering returns relevant answers, along with a confidence score. This score indicates the confidence that the answer is the right match for the given user query. The confidence score is a number between 0 and 100. A score of 100 is likely an exact match, while a score of 0 means that no matching answer was found. The higher the score, the greater the confidence in the answer.
For a given query, th",2025-11-18T08:00:00.000Z,concept-article,configuration,0.65,True,Defines confidence score range (0–100) and how it affects answer selection; likely includes threshold configuration guidance specific to CQA.,unchanged
-https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/concepts/limits,Limits,Limits and boundaries - custom question answering - Foundry Tools,Service limits and boundaries for CQA projects,Custom question answering has meta-limits for parts of the knowledge base and service. It's important to keep your knowledge base within those limits in order to test and publish.,The following custom question answering limits are a combination of the Azure AI Search pricing tier limits and custom question answering limits. Both sets of limits affect how many projects you can create per resource and how large each project can grow.,2025-12-11T08:00:00.000Z,limits-and-quotas,limits-quotas,0.95,True,Dedicated limits article combining Azure AI Search tier limits and CQA limits; will list specific numeric limits and constraints.,unchanged
+https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/concepts/limits,Limits,Limits and boundaries - custom question answering - Foundry Tools,Custom question answering service limits and quotas,Custom question answering has meta-limits for parts of the knowledge base and service. It's important to keep your knowledge base within those limits in order to test and publish.,The following custom question answering limits are a combination of the Azure AI Search pricing tier limits and custom question answering limits. 
Both sets of limits affect how many projects you can create per resource and how large each project can grow.,2026-04-20T08:00:00.000Z,limits-and-quotas,limits-quotas,0.95,True,"The page explicitly documents concrete numerical limits and boundaries for custom question answering, including how many projects per resource and how large each project can be, derived from Azure AI Search pricing tier limits and service-specific limits. These are product-specific quota values that an LLM would not reliably know from training.",updated https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/concepts/plan,App planning,Plan your app - custom question answering - Foundry Tools,,Learn how to plan your custom question answering app. Understand how custom question answering works and interacts with other Azure services and some project concepts.,"To plan your custom question answering app, you need to understand how custom question answering works and interacts with other Azure services. You should also have a solid grasp of project concepts.",2025-12-13T06:19:00.000Z,get-started,,0.4,False,Planning article is likely conceptual (how CQA interacts with other services) without concrete thresholds or decision matrices.,unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/concepts/precise-answering,Precise answering,Precise answering using answer span detection - custom question answering - Foundry Tools,,Understand the precise answering feature available in custom question answering.,"The precise answering feature allows you to get the precise short answer from the best candidate answer passage present in the project for any user query. This feature uses a deep learning model at runtime, which understands the intent of the user query and detects the precise short answer from the answer passage, if there is a short answer present as a fact in the answer passage. 
This feature is beneficial for both content developers and end users. Now, content developers don",2025-12-13T06:19:00.000Z,feature-guide,,0.4,False,"Explains precise answering conceptually; summary does not indicate configuration tables, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/concepts/project-development-lifecycle,Development lifecycle,Project lifecycle - custom question answering - Foundry Tools,,"Custom question answering learns best in an iterative cycle of model changes, utterance examples, deployment, and gathering data from endpoint queries.","Custom question answering learns best in an iterative cycle of model changes, utterance examples, deployment, and gathering data from endpoint queries.",2025-12-13T06:19:00.000Z,lifecycle,,0.3,False,Conceptual project lifecycle guidance; likely high-level process without product-specific numeric thresholds or configs.,unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/analytics,Analytics,Analytics on projects - custom question answering - Foundry Tools,Enable diagnostics and run analytics for CQA projects,Custom question answering uses Azure diagnostic logging to store the telemetry data and chat logs,"Custom question answering uses Azure diagnostic logging to store the telemetry data and chat logs. Follow the steps below to run sample queries to get analytics on the usage of your custom question answering project. Enable diagnostics logging for your language resource with custom question answering enabled. 
In the previous step, select Trace in addition to Audit, RequestResponse and AllMetrics for logging",2025-11-18T08:00:00.000Z,how-to,configuration,0.7,True,"Uses Azure diagnostic logging with specific categories (Trace, Audit, RequestResponse, AllMetrics) and sample queries; these are product-specific telemetry settings.",unchanged
-https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/authoring,Authoring API,Authoring API - custom question answering - Foundry Tools,Use the CQA Authoring API for automated management,"Use the custom question answering Authoring API to automate common tasks like adding new question answer pairs and creating and publishing projects.","The custom question answering Authoring API is used to automate common tasks like adding new question answer pairs and creating, publishing, and maintaining projects. Note Authoring functionality is available via the REST API and Authoring SDK (preview). This article provides examples of using the REST API with cURL. For full documentation of all parameters and functionality available, consult the REST API reference content.",2025-12-13T06:19:00.000Z,how-to,integrations,0.8,True,Authoring API reference with REST examples and parameters is an integration/coding pattern; includes product-specific endpoints and parameter usage.,unchanged
+https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/authoring,Authoring API,Authoring API - custom question answering - Foundry Tools,Automate CQA projects with Authoring REST API,"Use the custom question answering Authoring API to automate common tasks like adding new question answer pairs and creating and publishing projects.","The custom question answering Authoring API is used to automate common tasks like adding new question answer pairs and creating, publishing, and maintaining projects. Note Authoring functionality is available via the REST API and Authoring SDK (preview). 
This article provides examples of using the REST API with cURL. For full documentation of all parameters and functionality available, consult the REST API reference content.",2025-12-15T08:00:00.000Z,how-to,integrations,0.7,True,"Describes the custom question answering Authoring API with concrete REST examples and parameters. This is product-specific API usage that includes parameter names and request patterns unique to this service, fitting the integrations & coding patterns category.",updated https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/best-practices,Project Best practices,Project best practices - Foundry Tools,Apply project authoring best practices in CQA,Best practices for custom question answering,The following list of QnA pairs is used to represent a project to highlight best practices when authoring in custom question answering.,2025-11-18T08:00:00.000Z,how-to,best-practices,0.9,True,Explicitly a best-practices article with QnA examples showing how to structure projects; contains product-specific authoring guidance and gotchas.,unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/change-default-answer,Change default answer,Get default answer - custom question answering - Foundry Tools,Customize default answer behavior in CQA projects,The default answer is returned when there is no match to the question. You might want to change the default answer from the standard default answer in custom question answering.,"The default answer for a project is meant to be returned when an answer is not found. 
If you're using a client application, such as the Azure AI Bot Service, it may also have a separate default answer, indicating no answer met the score threshold.",2025-12-13T06:19:00.000Z,how-to,configuration,0.7,True,Explains how default answers work with score thresholds and client apps like Azure AI Bot Service; likely includes product-specific settings and interactions.,unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/chit-chat,Add chit-chat,Adding chitchat to a custom question answering project - Foundry Tools,Add and configure chitchat personas in CQA,"Adding personal chitchat to your bot makes it more conversational and engaging when you create a project. Custom question answering allows you to easily add a prepopulated set of the top chitchat, int","Adding chitchat to your bot makes it more conversational and engaging. The chitchat feature in custom question answering allows you to easily add a prepopulated set of the top chitchat into your project. This can be a starting point for your bot's personality, and it will save you the time and cost of writing them from scratch. This dataset has about 100 scenarios of chitchat in the voice of multiple personas, like Professional, Friendly and Witty. 
Choose the persona that most closely resembles",2025-11-18T15:37:00.000Z,how-to,configuration,0.7,True,"Describes a CQA-specific chitchat dataset (~100 scenarios, personas like Professional/Friendly/Witty) and how to apply it; this is product-specific configuration behavior.",unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/configure-azure-resources,Configure your environment and Azure resources,Configure the custom question answering service for fine-tune models - Foundry Tools,Configure Azure resources and permissions for CQA fine-tuning,This article details Azure AI resource configurations for custom question answering fine-tune models.,"In this guide, we walk you through configuring your Azure AI resources and permissions for custom question answering projects, enabling you to fine-tune models with Azure AI Search and Custom Question Answering (CQA). Completing this setup is essential for fully integrating your environment with Foundry Tools resources. You only need to perform this setup once—afterward, you have seamless access to advanced, AI-powered question answering capabilities. In addition, we show you how to assign t",2025-11-18T15:37:00.000Z,how-to,configuration,0.85,True,"Details Azure AI resource configuration and RBAC assignments for fine-tune models; includes resource types, roles, and settings unique to this integration.",unchanged
-https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/create-test-deploy,"Create, test, and deploy a knowledge base","Create, test, and deploy your custom question answering (CQA) knowledge base - Foundry Tools",,"You can create a custom question answering knowledge from your own content, such as FAQs or product manuals. 
This article includes an example of creating a custom question answering knowledge base.","This guide walks you through the essential steps needed to create, test, and deploy a custom question answering (CQA) knowledge base in the Microsoft Foundry. Whether you're transitioning from Language Studio or starting from scratch, this guide is for you. It provides clear and actionable instructions to achieve a fast and successful CQA deployment in the Foundry. Note",2025-12-15T08:00:00.000Z,how-to,,0.4,False,"How-to create/test/deploy CQA knowledge base; appears as a workflow guide without explicit limits, config tables, or troubleshooting mappings.",unchanged +https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/create-test-deploy,"Create, test, and deploy a knowledge base","Create, test, and deploy your custom question answering (CQA) knowledge base - Foundry Tools",,"You can create a custom question answering knowledge from your own content, such as FAQs or product manuals. This article includes an example of creating a custom question answering knowledge base.","This guide walks you through the essential steps needed to create, test, and deploy a custom question answering (CQA) knowledge base in the Microsoft Foundry. Whether you're transitioning from Language Studio or starting from scratch, this guide is for you. It provides clear and actionable instructions to achieve a fast and successful CQA deployment in the Foundry. Note",2026-04-21T22:14:00.000Z,how-to,,0.2,False,"Step-by-step guide for creating, testing, and deploying a custom question answering knowledge base in Foundry. 
From the summary it appears to be a procedural tutorial without detailed configuration tables, limits, error-code mappings, or product-specific decision matrices, so it doesn't meet any expert-knowledge sub-skill criteria.",updated https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/deploy-agent,Deploy a CQA agent,Create and deploy a custom question and answering (CQA) agent in Microsoft Foundry - Foundry Tools,,Use this guide to create a CQA Microsoft Foundry agent.,"This article gives you clear steps and important tips for building and deploying a CQA agent. Whether you're new to this process or updating your skills, this guide helps you set up and launch your agent successfully. Note",2025-11-18T15:37:00.000Z,overview,,0.2,False,"From the summary, this is a step-by-step guide for creating and deploying a custom question answering agent in Microsoft Foundry. It appears to be procedural/tutorial content without explicit mention of numeric limits, configuration parameter tables, RBAC role lists, error-code-based troubleshooting, or decision matrices. Without evidence of such expert details, it does not meet the criteria for any sub-skill type.",unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/encrypt-data-at-rest,Encrypt data at rest,Custom question answering encryption of data at rest - Foundry Tools,Configure data-at-rest encryption and CMK for CQA,"Microsoft offers Microsoft-managed encryption keys, and also lets you manage your Azure AI services subscriptions with your own keys, called customer-managed keys (CMK). 
This article covers data encry","Custom question answering automatically encrypts your data when persisted to the cloud, helping to meet your organizational security and compliance goals.",2025-12-15T08:00:00.000Z,how-to,security,0.85,True,"Covers enabling and managing customer-managed keys for this specific service; will include resource types, key scopes, and configuration steps unique to CQA.",unchanged
-https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/export-import-refresh,Export/import/refresh,Export/import/refresh - Foundry Tools,,Learn about backing up your custom question answering projects,You might want to create a copy of your custom question answering project or related question and answer pairs for several reasons:,2025-12-13T06:19:00.000Z,how-to,,0.3,False,"High-level export/import/refresh description; no indication of detailed parameters, limits, or product-specific configuration tables.",unchanged
+https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/export-import-refresh,Export/import/refresh,Export/import/refresh - Foundry Tools,,Learn about backing up your custom question answering projects,You might want to create a copy of your custom question answering project or related question and answer pairs for several reasons:,2026-04-20T08:00:00.000Z,how-to,,0.2,False,"Covers export/import/refresh of custom question answering projects, likely as a how-to/backup tutorial. 
The summary does not indicate the presence of configuration parameter tables, limits, error codes, or decision matrices; it appears to be operational guidance rather than expert-knowledge content.",updated https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/manage-knowledge-base,Manage projects,Manage projects - custom question answering - Foundry Tools,,Custom question answering allows you to manage projects by providing access to the project settings and content.,Custom question answering allows you to manage your projects by providing access to the project settings and data sources. If you haven't created a custom question answering project before, we recommend starting with the getting started article.,2025-11-18T08:00:00.000Z,how-to,,0.4,False,"General project management overview; summary does not indicate detailed configuration tables, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/migrate-knowledge-base,Move projects,Move projects - custom question answering - Foundry Tools,Move custom question answering projects between resources,"Moving a custom question answering project requires exporting a project from one resource, and then importing it into another.",Note This article deals with the process to export and move projects and sources from one Language resource to another. 
You might want to create copies of your projects or sources for several reasons:,2025-11-18T15:37:00.000Z,how-to,deployment,0.6,True,Describes exporting and moving projects between Language resources; contains product-specific migration steps and constraints beyond generic export/import concepts.,unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/network-isolation,Network isolation and private endpoints,Network isolation and Private Link - custom question answering - Foundry Tools,Configure network isolation and Private Link for CQA,Users can restrict public access to custom question answering resources.,The following steps describe how to restrict public access to custom question answering resources as well as how to enable Azure Private Link. Protect a Foundry resource from public access by configuring the virtual network.,2025-12-15T08:00:00.000Z,how-to,security,0.8,True,Describes restricting public access and enabling Private Link for this service; will include service-specific network configuration steps and requirements.,unchanged @@ -168,7 +170,7 @@ https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-an https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/overview,Connect to Azure OpenAI,What is custom question answering? - Foundry Tools,,Custom question answering is a cloud-based NLP service that creates conversational layers over your data to deliver accurate answers for natural language queries.,"Important Custom question answering is retiring from Azure Language effective March 31, 2029. After this date, the CQA feature is no longer supported. During the support window, we recommend that users migrate existing workloads and direct all new projects to Microsoft Foundry models, which offer enhanced capabilities for natural language understanding and can be easily integrated into your applications. 
Custom question answering (CQA) is a cloud-based Natural Language Processing (NLP) service tha",2026-03-30T16:59:00.000Z,overview,,0.1,False,"Duplicate of index 0: conceptual overview of Custom Question Answering with retirement date, but no detailed limits, configuration parameters, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/overview,Overview,What is custom question answering? - Foundry Tools,,Custom question answering is a cloud-based NLP service that creates conversational layers over your data to deliver accurate answers for natural language queries.,"Important Custom question answering is retiring from Azure Language effective March 31, 2029. After this date, the CQA feature is no longer supported. During the support window, we recommend that users migrate existing workloads and direct all new projects to Microsoft Foundry models, which offer enhanced capabilities for natural language understanding and can be easily integrated into your applications. Custom question answering (CQA) is a cloud-based Natural Language Processing (NLP) service tha",2026-03-30T16:59:00.000Z,overview,,0.1,False,"Overview/retirement notice for Custom Question Answering; no evidence of numeric limits, configuration tables, error-code mappings, or other product-specific expert details.",unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/quickstart/sdk,Quickstart,Quickstart: custom question answering - Foundry Tools,,This quickstart shows you how to create and manage your custom question answering projects.,"This quickstart guides you through the essential steps needed to create, test, and deploy a custom question answering (CQA) project in the Microsoft Foundry. Whether you're transitioning from Language Studio or starting from scratch, this quickstart is for you. 
It provides clear and actionable instructions to achieve a fast and successful CQA project deployment. Note",2025-11-18T15:37:00.000Z,quickstart,,0.4,False,"Quickstart for CQA; primarily procedural, not focused on limits, configuration matrices, or error codes.",unchanged -https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/reference/document-format-guidelines,Format guidelines,Import document format guidelines - custom question answering - Foundry Tools,Apply document format guidelines for CQA imports,Use these guidelines for importing documents to get the best results for your content with custom question answering.,Review these formatting guidelines to get the best results for your content.,2025-12-15T08:00:00.000Z,reference,best-practices,0.7,True,Guidelines for document formatting to get best results; includes product-specific recommendations and edge cases for ingestion.,unchanged +https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/reference/document-format-guidelines,Format guidelines,Import document format guidelines - custom question answering - Foundry Tools,Format documents for custom question answering imports,Use these guidelines for importing documents to get the best results for your content with custom question answering.,Review these formatting guidelines to get the best results for your content.,2026-04-21T22:14:00.000Z,reference,best-practices,0.7,True,"The page provides product-specific document formatting guidelines to optimize imports for custom question answering. 
These are concrete, service-tailored recommendations (how to structure and format content for best results) that go beyond generic theory and qualify as best-practices-style expert knowledge.",updated https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/reference/markdown-format,Markdown format,Markdown format - custom question answering - Foundry Tools,Use supported markdown formats in CQA answers,Following is the list of markdown formats that you can use in your answer text.,"Custom question answering stores answer text as markdown. There are many flavors of markdown. In order to make sure the answer text is returned and displayed correctly, use this reference. Use the CommonMark tutorial to validate your markdown. The tutorial has a Try it feature for quick copy/paste validation.",2025-12-15T08:00:00.000Z,reference,configuration,0.7,True,Reference list of supported markdown formats for answer text; product-specific rendering behavior and allowed syntax are configuration details.,unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/tutorials/active-learning,Active learning,Enrich your project with active learning - Foundry Tools,,"In this tutorial, learn how to enrich your Custom question answering projects with active learning","In this tutorial, you learn how to: This tutorial shows you how to enhance your Custom question answering project with active learning. If you notice that customers are asking questions that are not covered in your project, they may be paraphrased variations of questions. These variations, when added as alternate questions to the relevant question answer pair, help to optimize the project to answer real world user queries. 
You can manually add alternate questions to question answer pairs through",2025-11-18T08:00:00.000Z,tutorial,,0.45,False,"Tutorial on active learning; likely step-by-step usage rather than configuration tables, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/tutorials/adding-synonyms,Adding synonyms,Improve the quality of responses with synonyms - Foundry Tools,,"In this tutorial, learn how to improve responses with synonyms and alternate words","In this tutorial, you learn how to: This tutorial shows you how you can improve the quality of your responses by using synonyms. Let's assume that users aren't getting an accurate response to their queries when they use alternate forms, synonyms or acronyms of a word. So, they decide to improve the quality of the response by using the Authoring API to add synonyms for keywords.",2025-11-18T08:00:00.000Z,tutorial,,0.45,False,"Tutorial on synonyms; mostly workflow guidance using Authoring API, but summary does not show detailed parameter tables or limits.",unchanged diff --git a/products/azure-language-service/report.md b/products/azure-language-service/report.md index 8593f99e..3038be07 100644 --- a/products/azure-language-service/report.md +++ b/products/azure-language-service/report.md @@ -1,18 +1,18 @@ --- -generated_at: '2026-04-12' +generated_at: '2026-04-26' category_descriptions: - configuration: 'Configuring Azure AI Language projects and containers: CLU, custom - NER, text classification, CQA, sentiment, summarization, health, data formats, - resources, and runtime settings.' + configuration: 'Configuring Azure AI Language projects and containers: resources, + data formats, training, evaluation, entities/NER, PII, CQA behavior, diagnostics, + and on-prem Docker setups.' 
deployment: How to deploy and run Azure AI Language models (custom classification, NER, QnA, key phrases, language detection) across regions, containers, AKS, and migrate projects/resources. limits-quotas: Limits, quotas, and language/region support for Azure AI Language - features (CLU, NER, classification, PII, CQA), including data size, rate, throughput, - and container request limits. - integrations: How to call Azure Language/CLU/Health/Summarization/CQA APIs and SDKs, - wire them into bots, Power Automate, and Foundry, and correctly handle async, - parameters, and outputs + features (CLU, NER, classification, QnA, PII, containers), including data size, + rate, and throughput constraints. + integrations: Coding examples and API guides for calling Azure Language features + (CLU, NER, PII, sentiment, summarization, health, CQA) via SDK/REST and integrating + with bots, storage, and Power Automate. security: 'Security for Azure AI Language: encryption at rest, customer-managed keys, RBAC, managed identities, SAS tokens, and network isolation/Private Link for CQA resources.' @@ -20,8 +20,8 @@ category_descriptions: and migration paths from LUIS, QnA Maker, Text Analytics, and Language Studio to Azure Language and Microsoft Foundry best-practices: Best practices for designing, labeling, and evaluating CLU, custom - NER, text classification, and CQA projects, including multilingual handling, emojis, - schemas, and autolabeling. + NER, text classification, and custom question answering projects, including multilingual + handling, emojis, and data/schema prep. architecture-patterns: 'Architectural guidance for CLU and custom text classification: choosing CLU vs orchestration workflows, and designing regional backup, redundancy, and failover strategies.' 
@@ -31,32 +31,32 @@ category_descriptions: skill_description: Expert knowledge for Azure AI Language development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when - building CLU intents, custom NER, text classification, CQA, sentiment/summarization, - or health text solutions, and other Azure AI Language related development tasks. - Not for Azure AI Search (use azure-cognitive-search), Azure AI Document Intelligence - (use azure-document-intelligence), Azure AI Speech (use azure-speech), Azure Translator - (use azure-translator). -use_when: Use when building CLU intents, custom NER, text classification, CQA, sentiment/summarization, - or health text solutions, and other Azure AI Language related development tasks. + building CLU apps, custom NER/PII, sentiment/summarization, CQA/QnA, or containerized + Language workloads, and other Azure AI Language related development tasks. Not for + Azure AI Search (use azure-cognitive-search), Azure AI Speech (use azure-speech), + Azure Translator (use azure-translator), Azure AI Bot Service (use azure-bot-service). +use_when: Use when building CLU apps, custom NER/PII, sentiment/summarization, CQA/QnA, + or containerized Language workloads, and other Azure AI Language related development + tasks. confusable_not_for: Not for Azure AI Search (use azure-cognitive-search), Azure AI - Document Intelligence (use azure-document-intelligence), Azure AI Speech (use azure-speech), - Azure Translator (use azure-translator). + Speech (use azure-speech), Azure Translator (use azure-translator), Azure AI Bot + Service (use azure-bot-service). 
--- # Azure AI Language Crawl Report ## Summary -- **Total Pages**: 197 -- **Fetched**: 197 +- **Total Pages**: 199 +- **Fetched**: 199 - **Fetch Failed**: 0 -- **Classified**: 104 -- **Unclassified**: 93 +- **Classified**: 107 +- **Unclassified**: 92 ### Incremental Update -- **New Pages**: 0 -- **Updated Pages**: 0 -- **Unchanged**: 197 -- **Deleted Pages**: 0 +- **New Pages**: 6 +- **Updated Pages**: 7 +- **Unchanged**: 186 +- **Deleted Pages**: 4 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-language-service/azure-language-service.csv` ## Classification Statistics @@ -64,23 +64,56 @@ confusable_not_for: Not for Azure AI Search (use azure-cognitive-search), Azure | Type | Count | Percentage | |------|-------|------------| | architecture-patterns | 3 | 1.5% | -| best-practices | 14 | 7.1% | -| configuration | 24 | 12.2% | +| best-practices | 14 | 7.0% | +| configuration | 25 | 12.6% | | decision-making | 6 | 3.0% | -| deployment | 7 | 3.6% | -| integrations | 28 | 14.2% | -| limits-quotas | 14 | 7.1% | +| deployment | 7 | 3.5% | +| integrations | 30 | 15.1% | +| limits-quotas | 14 | 7.0% | | security | 6 | 3.0% | | troubleshooting | 2 | 1.0% | -| *(Unclassified)* | 93 | 47.2% | +| *(Unclassified)* | 92 | 46.2% | ## Changes +### New Pages + +- [Overview](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/text-pii-overview) +- [Detect and redact PII in text](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/redact-text-pii) +- [Overview](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/conversation-pii-overview) +- [Detect and redact PII in conversations](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/redact-conversation-pii) +- 
[Overview](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/document-based-pii-overview) +- [Detect and redact PII in native documents](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/redact-document-pii) + +### Updated Pages + +- [Limits](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/concepts/limits) + - Updated: 2025-12-11T08:00:00.000Z → 2026-04-20T08:00:00.000Z +- [Format guidelines](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/reference/document-format-guidelines) + - Updated: 2025-12-15T08:00:00.000Z → 2026-04-21T22:14:00.000Z +- [Overview](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/overview) + - Updated: 2026-03-30T08:00:00.000Z → 2026-04-21T16:56:00.000Z +- [Language support](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/language-support) + - Updated: 2025-11-18T08:00:00.000Z → 2026-04-21T16:56:00.000Z +- [Create, test, and deploy a knowledge base](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/create-test-deploy) + - Updated: 2025-12-15T08:00:00.000Z → 2026-04-21T22:14:00.000Z +- [Export/import/refresh](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/export-import-refresh) + - Updated: 2025-12-13T06:19:00.000Z → 2026-04-20T08:00:00.000Z +- [Authoring API](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/authoring) + - Updated: 2025-12-13T06:19:00.000Z → 2025-12-15T08:00:00.000Z + +### Deleted Pages + +- ~~Native documents for language processing~~ (https://learn.microsoft.com/en-us/azure/ai-services/language-service/native-document-support/overview) +- ~~Redact PII from conversations~~ 
(https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/redact-conversation-pii) +- ~~Redact PII from native documents~~ (https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/redact-document-pii) +- ~~Redact PII from text~~ (https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/redact-text-pii) + ## Classified Pages | TOC Title | Type | Confidence | Reason | |-----------|------|------------|--------| -| [Limits](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/concepts/limits) | limits-quotas | 0.95 | Dedicated limits article combining Azure AI Search tier limits and CQA limits; will list specific numeric limits and constraints. | +| [Limits](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/concepts/limits) | limits-quotas | 0.95 | The page explicitly documents concrete numerical limits and boundaries for custom question answering, including how many projects per resource and how large each project can be, derived from Azure AI Search pricing tier limits and service-specific limits. These are product-specific quota values that an LLM would not reliably know from training. | | [Quotas and limits](https://learn.microsoft.com/en-us/azure/ai-services/language-service/concepts/data-limits) | limits-quotas | 0.95 | Explicitly described as data and service limitations; such pages list request sizes, payload limits, and rate constraints with exact numeric values and units. | | [Use Docker Containers](https://learn.microsoft.com/en-us/azure/ai-services/language-service/conversational-language-understanding/how-to/use-containers) | limits-quotas | 0.95 | Explicitly states data limits per API call (5,120 characters per document, 10 documents per call); clear numeric quotas for the container. 
| | [Project Best practices](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/best-practices) | best-practices | 0.90 | Explicitly a best-practices article with QnA examples showing how to structure projects; contains product-specific authoring guidance and gotchas. | @@ -92,7 +125,6 @@ confusable_not_for: Not for Azure AI Search (use azure-cognitive-search), Azure | [Encrypt data at rest](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/encrypt-data-at-rest) | security | 0.85 | Covers enabling and managing customer-managed keys for this specific service; will include resource types, key scopes, and configuration steps unique to CQA. | | [Install and run containers](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/use-containers) | limits-quotas | 0.85 | Explicitly states data limits per synchronous API call (5,120 characters per document and 10 documents per call), which are concrete numeric quotas. | | [Use skill parameters](https://learn.microsoft.com/en-us/azure/ai-services/language-service/named-entity-recognition/how-to/skill-parameters) | configuration | 0.85 | Explicitly about API parameters like overlapPolicy and inferenceOptions; describes allowed values and behavior, which are product-specific configuration details. | -| [Authoring API](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/authoring) | integrations | 0.80 | Authoring API reference with REST examples and parameters is an integration/coding pattern; includes product-specific endpoints and parameter usage. | | [Best practices](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/concepts/best-practices) | best-practices | 0.80 | Explicit best-practices article for improving project results; contains product-specific recommendations and patterns. 
| | [Configure Docker containers](https://learn.microsoft.com/en-us/azure/ai-services/language-service/text-analytics-for-health/how-to/configure-containers) | configuration | 0.80 | Explicitly about configuration framework for containers, including storage, logging, telemetry, and security settings; will contain parameter names and allowed values. | | [Configure containers](https://learn.microsoft.com/en-us/azure/ai-services/language-service/concepts/configure-containers) | configuration | 0.80 | Describes common configuration framework for containers including storage, logging, and security; likely includes specific environment variables and settings. | @@ -108,6 +140,7 @@ confusable_not_for: Not for Azure AI Search (use azure-cognitive-search), Azure | [API version mapping](https://learn.microsoft.com/en-us/azure/ai-services/language-service/named-entity-recognition/concepts/ga-preview-mapping) | configuration | 0.70 | Version-based mapping of entity types/tags vs category/subcategory; provides concrete mapping tables and version-specific behavior, which are configuration/compatibility details. | | [Add chit-chat](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/chit-chat) | configuration | 0.70 | Describes a CQA-specific chitchat dataset (~100 scenarios, personas like Professional/Friendly/Witty) and how to apply it; this is product-specific configuration behavior. | | [Analytics](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/analytics) | configuration | 0.70 | Uses Azure diagnostic logging with specific categories (Trace, Audit, RequestResponse, AllMetrics) and sample queries; these are product-specific telemetry settings. | +| [Authoring API](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/authoring) | integrations | 0.70 | Describes the custom question answering Authoring API with concrete REST examples and parameters. 
This is product-specific API usage that includes parameter names and request patterns unique to this service, fitting the integrations & coding patterns category. | | [Auto label your data (preview)](https://learn.microsoft.com/en-us/azure/ai-services/language-service/custom-named-entity-recognition/how-to/use-autolabeling) | best-practices | 0.70 | Explains how to run autolabeling jobs using trained models or GPT models; includes workflow-specific guidance unique to custom NER. | | [Call Sentiment Analysis and Opinion Mining](https://learn.microsoft.com/en-us/azure/ai-services/language-service/sentiment-opinion-mining/how-to/call-api) | integrations | 0.70 | How-to for calling the API will include request bodies, parameters, and response fields unique to this service, which are product-specific integration patterns. | | [Call key phrase extraction](https://learn.microsoft.com/en-us/azure/ai-services/language-service/key-phrase-extraction/how-to/call-api) | integrations | 0.70 | How-to for calling the API; includes request/response formats and parameters unique to this feature. | @@ -122,7 +155,7 @@ confusable_not_for: Not for Azure AI Search (use azure-cognitive-search), Azure | [Deploy to multiple regions](https://learn.microsoft.com/en-us/azure/ai-services/language-service/concepts/custom-features/multi-region-deployment) | deployment | 0.70 | Covers multi-region deployment capabilities and constraints, including deploying to multiple resources within a region; product-specific deployment behavior. | | [Developer guide](https://learn.microsoft.com/en-us/azure/ai-services/language-service/concepts/developer-guide) | integrations | 0.70 | Developer guide for integrating SDK and REST will include endpoint formats, authentication headers, and request/response patterns specific to Azure Language. 
| | [Encryption of data at rest](https://learn.microsoft.com/en-us/azure/ai-services/language-service/concepts/encryption-data-at-rest) | security | 0.70 | Explains how data is encrypted when persisted; likely includes encryption mechanisms, key management options, and compliance-related settings specific to the service. | -| [Format guidelines](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/reference/document-format-guidelines) | best-practices | 0.70 | Guidelines for document formatting to get best results; includes product-specific recommendations and edge cases for ingestion. | +| [Format guidelines](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/reference/document-format-guidelines) | best-practices | 0.70 | The page provides product-specific document formatting guidelines to optimize imports for custom question answering. These are concrete, service-tailored recommendations (how to structure and format content for best results) that go beyond generic theory and qualify as best-practices-style expert knowledge. | | [How to call the API](https://learn.microsoft.com/en-us/azure/ai-services/language-service/entity-linking/how-to/call-api) | integrations | 0.70 | How-to for calling the API; will describe request structure, parameters, and possibly options unique to entity linking. | | [How to call the API](https://learn.microsoft.com/en-us/azure/ai-services/language-service/text-analytics-for-health/how-to/call-api) | integrations | 0.70 | How-to for calling the health API will include request body formats, options, and response JSON fields specific to this capability. | | [Label data](https://learn.microsoft.com/en-us/azure/ai-services/language-service/custom-named-entity-recognition/how-to/tag-data) | best-practices | 0.70 | Describes how to label data, including entity type creation and import requirements; these are product-specific labeling practices and constraints. 
| @@ -160,9 +193,10 @@ confusable_not_for: Not for Azure AI Search (use azure-cognitive-search), Azure | [Confidence score](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/concepts/confidence-score) | configuration | 0.65 | Defines confidence score range (0–100) and how it affects answer selection; likely includes threshold configuration guidance specific to CQA. | | [Create projects](https://learn.microsoft.com/en-us/azure/ai-services/language-service/custom-named-entity-recognition/how-to/create-project) | configuration | 0.65 | Covers setting up requirements and creating projects/resources; likely includes resource configuration details specific to custom NER. | | [Create projects](https://learn.microsoft.com/en-us/azure/ai-services/language-service/custom-text-classification/how-to/create-project) | configuration | 0.65 | Describes requirements and setup for projects; likely includes resource configuration and project settings specific to this feature. | +| [Detect and redact PII in native documents](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/redact-document-pii) | integrations | 0.65 | How-to for native document PII redaction mentions an asynchronous POST/GET workflow and Blob Storage containers, implying detailed API request formats, parameters, and storage integration patterns that qualify as product-specific integration knowledge. | +| [Language support](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/language-support) | configuration | 0.65 | Language support pages typically enumerate exact supported languages and feature coverage per language in table form, which is product-specific configuration-like reference data not reliably inferable from an LLM's general training. 
| | [Migrate from Text Analytics API to Language Service](https://learn.microsoft.com/en-us/azure/ai-services/language-service/reference/migrate-language-service-latest) | decision-making | 0.65 | A migration guide from Text Analytics API to the latest Azure Language service typically includes version-specific behavior changes, feature mapping, and concrete upgrade paths. This is expert, product-specific guidance that helps decide how and when to move between APIs and adjust implementations, fitting the decision-making category. | | [Model performance (preview)](https://learn.microsoft.com/en-us/azure/ai-services/language-service/custom-text-classification/how-to/view-model-evaluation) | configuration | 0.65 | Explains evaluation behavior including automatic vs manual test splits and their impact; product-specific evaluation configuration details. | -| [Native documents for language processing](https://learn.microsoft.com/en-us/azure/ai-services/language-service/native-document-support/overview) | integrations | 0.65 | Describes asynchronous native document processing with Blob Storage; will include request formats, storage URL patterns, and status retrieval parameters unique to this feature. | | [Prebuilt component reference](https://learn.microsoft.com/en-us/azure/ai-services/language-service/conversational-language-understanding/prebuilt-component-reference) | configuration | 0.65 | Reference for supported prebuilt entities; likely includes lists and constraints that are product-specific configuration options. | | [Quickstart](https://learn.microsoft.com/en-us/azure/ai-services/language-service/custom-named-entity-recognition/quickstart) | integrations | 0.65 | Quickstart for using custom NER with Foundry and REST; includes concrete API usage and configuration steps specific to this feature. 
| | [Region support](https://learn.microsoft.com/en-us/azure/ai-services/language-service/concepts/regional-support) | decision-making | 0.65 | Regional support pages typically include per-feature region tables and explicit support/limitation matrices, which are concrete selection criteria for where features can be used. | @@ -176,6 +210,8 @@ confusable_not_for: Not for Azure AI Search (use azure-cognitive-search), Azure | [Back up and recover your models](https://learn.microsoft.com/en-us/azure/ai-services/language-service/custom-text-classification/fail-over) | architecture-patterns | 0.60 | Discusses designing for regional fail-over using two Language resources and model backup/recovery; this is a product-specific high-availability pattern. | | [Build a fine-tuning schema](https://learn.microsoft.com/en-us/azure/ai-services/language-service/conversational-language-understanding/how-to/build-schema) | best-practices | 0.60 | Schema design guidance is product-specific and likely includes concrete DOs/DON’Ts for intents and entities unique to CLU. | | [Call the API and make predictions](https://learn.microsoft.com/en-us/azure/ai-services/language-service/conversational-language-understanding/how-to/call-api) | integrations | 0.60 | Focuses on sending prediction requests via prediction API and SDK; likely includes request/response schema and parameter details specific to CLU. | +| [Detect and redact PII in conversations](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/redact-conversation-pii) | integrations | 0.60 | How-to article for conversation PII typically documents API request structure, parameters (such as conversation jobs, timing info), and service-specific behaviors for transcripts and chats, which are integration-focused expert details. 
| +| [Detect and redact PII in text](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/redact-text-pii) | integrations | 0.60 | How-to for identifying and redacting PII from text is likely to include request/response schemas, API parameters, and configuration options (such as categories, redaction modes) that are specific to this service’s SDK/REST API. | | [FAQ](https://learn.microsoft.com/en-us/azure/ai-services/language-service/custom-text-classification/faq) | troubleshooting | 0.60 | FAQ for this API likely includes answers to specific error scenarios and constraints; serves as troubleshooting guidance. | | [Label utterances](https://learn.microsoft.com/en-us/azure/ai-services/language-service/conversational-language-understanding/how-to/tag-utterances) | best-practices | 0.60 | Data labeling guidance for CLU projects is product-specific and includes actionable recommendations for tagging utterances. | | [Model lifecycle](https://learn.microsoft.com/en-us/azure/ai-services/language-service/concepts/model-lifecycle) | decision-making | 0.60 | Describes model versioning, deprecation, and retirement timelines; this is decision guidance for when to upgrade models and plan migrations, with service-specific policies. | @@ -204,19 +240,15 @@ confusable_not_for: Not for Azure AI Search (use azure-cognitive-search), Azure | [App planning](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/concepts/plan) | 0.40 | Planning article is likely conceptual (how CQA interacts with other services) without concrete thresholds or decision matrices. | | [Build schema](https://learn.microsoft.com/en-us/azure/ai-services/language-service/orchestration-workflow/how-to/build-schema) | 0.40 | Schema design guidance appears conceptual; no quantified thresholds or product-specific decision matrices visible. 
| | [Create a fine-tuning task project](https://learn.microsoft.com/en-us/azure/ai-services/language-service/conversational-language-understanding/how-to/create-project) | 0.40 | How-to create CLU fine-tuning tasks; mostly workflow, not detailed config tables or limits. | -| [Create, test, and deploy a knowledge base](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/create-test-deploy) | 0.40 | How-to create/test/deploy CQA knowledge base; appears as a workflow guide without explicit limits, config tables, or troubleshooting mappings. | | [Data formats](https://learn.microsoft.com/en-us/azure/ai-services/language-service/orchestration-workflow/concepts/data-formats) | 0.40 | Data formats overview; summary doesn’t expose detailed schema or parameter constraints. | | [Deploy a model](https://learn.microsoft.com/en-us/azure/ai-services/language-service/conversational-language-understanding/how-to/deploy-model) | 0.40 | Deployment how-to for CLU models; summary doesn’t show deployment matrices, tier constraints, or limits. | | [Entity Metadata](https://learn.microsoft.com/en-us/azure/ai-services/language-service/named-entity-recognition/concepts/entity-metadata) | 0.40 | Explains metadata and resolution behavior conceptually; no indication of parameter tables, limits, or configuration ranges. | | [FAQ](https://learn.microsoft.com/en-us/azure/ai-services/language-service/conversational-language-understanding/faq) | 0.40 | FAQ may contain some specifics but summary doesn’t indicate structured error codes, limits, or config tables. | -| [Language support](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/language-support) | 0.40 | Language support list; catalog-like, not configuration or limits per the defined categories. 
| | [Manage projects](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/manage-knowledge-base) | 0.40 | General project management overview; summary does not indicate detailed configuration tables, limits, or error mappings. | | [Multiple domains](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/tutorials/multiple-domains) | 0.40 | No-code FAQ bot tutorial; primarily step-by-step instructions without expert-level configuration matrices or limits. | | [Precise answering](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/concepts/precise-answering) | 0.40 | Explains precise answering conceptually; summary does not indicate configuration tables, limits, or error mappings. | | [Quickstart](https://learn.microsoft.com/en-us/azure/ai-services/language-service/custom-text-classification/quickstart) | 0.40 | Quickstart tutorial; mainly step-by-step creation, not deep configuration matrices or troubleshooting content. | | [Quickstart](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/quickstart/sdk) | 0.40 | Quickstart for CQA; primarily procedural, not focused on limits, configuration matrices, or error codes. | -| [Redact PII from conversations](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/redact-conversation-pii) | 0.40 | How-to/tutorial style description of using conversation PII API; summary does not indicate detailed configuration tables, limits, or error mappings. | -| [Redact PII from native documents](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/redact-document-pii) | 0.40 | Native document PII how-to; asynchronous pattern description but no visible limits or config parameter tables. 
| | [Train a model](https://learn.microsoft.com/en-us/azure/ai-services/language-service/custom-named-entity-recognition/how-to/train-model) | 0.40 | Training how-to; the only numeric detail is a 7‑day job expiry, without broader limits tables or config parameters. | | [FAQ](https://learn.microsoft.com/en-us/azure/ai-services/language-service/custom-named-entity-recognition/faq) | 0.35 | FAQ page; summary suggests conceptual Q&A about scenarios and concepts, not detailed error codes or configuration tables. | | [Language support](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/language-support) | 0.35 | Language support for CQA; summary doesn’t indicate numeric constraints or configuration parameters. | @@ -227,7 +259,6 @@ confusable_not_for: Not for Azure AI Search (use azure-cognitive-search), Azure | [Deploy a model](https://learn.microsoft.com/en-us/azure/ai-services/language-service/orchestration-workflow/how-to/deploy-model) | 0.30 | Deployment overview; no deployment matrices, tier constraints, or timing details in summary. | | [Development lifecycle](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/concepts/project-development-lifecycle) | 0.30 | Conceptual project lifecycle guidance; likely high-level process without product-specific numeric thresholds or configs. | | [Evaluation metrics](https://learn.microsoft.com/en-us/azure/ai-services/language-service/custom-text-classification/concepts/evaluation-metrics) | 0.30 | Explains generic ML evaluation concepts (train/test split, automatic evaluation). No product-specific limits, configs, or error codes. | -| [Export/import/refresh](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/export-import-refresh) | 0.30 | High-level export/import/refresh description; no indication of detailed parameters, limits, or product-specific configuration tables. 
| | [Extract entities from text](https://learn.microsoft.com/en-us/azure/ai-services/language-service/custom-named-entity-recognition/how-to/call-api) | 0.30 | API usage overview; summary doesn’t show parameter tables, limits, or error mappings. | | [Extract information in Excel using Power Automate](https://learn.microsoft.com/en-us/azure/ai-services/language-service/named-entity-recognition/tutorials/extract-excel-information) | 0.30 | Power Automate tutorial using NER and Excel; mainly workflow steps without detailed configuration matrices or limits. | | [FAQ](https://learn.microsoft.com/en-us/azure/ai-services/language-service/orchestration-workflow/faq) | 0.30 | FAQ summary only; no explicit indication of error-code mappings or detailed diagnostics. | @@ -243,24 +274,28 @@ confusable_not_for: Not for Azure AI Search (use azure-cognitive-search), Azure | [View your model's performance](https://learn.microsoft.com/en-us/azure/ai-services/language-service/orchestration-workflow/how-to/view-model-evaluation) | 0.30 | Evaluation how-to; similar to other evaluation pages, no product-specific numeric thresholds or configs. | | [Overview](https://learn.microsoft.com/en-us/azure/ai-services/language-service/custom-named-entity-recognition/overview) | 0.25 | Conceptual overview of custom named entity recognition; describes what it is and general workflow, without specific numeric limits or configuration tables. | | [Overview](https://learn.microsoft.com/en-us/azure/ai-services/language-service/language-detection/overview) | 0.25 | Overview of language detection capability; likely conceptual with feature description, not detailed limits, configs, or decision matrices. 
+| [Overview](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/document-based-pii-overview) | 0.25 | Preview feature overview for document-based PII; the description focuses on what the feature does and its scenarios, not on concrete configuration parameters, limits, or troubleshooting mappings. | | [Azure tools and agents](https://learn.microsoft.com/en-us/azure/ai-services/language-service/concepts/foundry-tools-agents) | 0.20 | Appears to be an overview of Azure Language integrations with Foundry Tools (MCP endpoints, agents, routing, Q&A). Description and summary indicate conceptual integration/intro content, not detailed configuration tables, limits, error codes, or decision matrices. | | [Build a multi-turn conversation model](https://learn.microsoft.com/en-us/azure/ai-services/language-service/conversational-language-understanding/how-to/build-multi-turn-model) | 0.20 | How-to article for building a CLU model with slot filling; description suggests general guidance and workflow steps rather than configuration matrices, limits, or diagnostic mappings. | | [Connect CLU and custom question answering](https://learn.microsoft.com/en-us/azure/ai-services/language-service/orchestration-workflow/tutorials/connect-services) | 0.20 | Tutorial for connecting services; typical integration walkthrough without explicit config tables or limits in summary. | +| [Create, test, and deploy a knowledge base](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/create-test-deploy) | 0.20 | Step-by-step guide for creating, testing, and deploying a custom question answering knowledge base in Foundry. From the summary it appears to be a procedural tutorial without detailed configuration tables, limits, error-code mappings, or product-specific decision matrices, so it does not meet any expert-knowledge sub-skill criteria. 
| | [Deploy a CQA agent](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/deploy-agent) | 0.20 | From the summary, this is a step-by-step guide for creating and deploying a custom question answering agent in Microsoft Foundry. It appears to be procedural/tutorial content without explicit mention of numeric limits, configuration parameter tables, RBAC role lists, error-code-based troubleshooting, or decision matrices. Without evidence of such expert details, it does not meet the criteria for any sub-skill type. | | [Evaluation metrics](https://learn.microsoft.com/en-us/azure/ai-services/language-service/custom-named-entity-recognition/concepts/evaluation-metrics) | 0.20 | Conceptual explanation of evaluation metrics; no product-specific thresholds or config values. | | [Evaluation metrics](https://learn.microsoft.com/en-us/azure/ai-services/language-service/orchestration-workflow/concepts/evaluation-metrics) | 0.20 | Conceptual evaluation metrics; no product-specific thresholds or decision matrices. | +| [Export/import/refresh](https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/export-import-refresh) | 0.20 | Covers export/import/refresh of custom question answering projects, likely as a how-to/backup tutorial. The summary does not indicate the presence of configuration parameter tables, limits, error codes, or decision matrices; it appears to be operational guidance rather than expert-knowledge content. | [Extended format](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/concepts/conversations-entity-categories) | 0.20 | Conceptual description of conversational PII entity categories; no numeric limits or config parameters indicated. 
| | [Extended format](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/concepts/entity-categories) | 0.20 | Conceptual description of entity categories; no numeric limits, configs, or troubleshooting mappings. | | [Glossary](https://learn.microsoft.com/en-us/azure/ai-services/language-service/conversational-language-understanding/glossary) | 0.20 | Glossary of definitions; conceptual terminology without numeric limits, configs, or troubleshooting mappings. | | [Label utterances](https://learn.microsoft.com/en-us/azure/ai-services/language-service/orchestration-workflow/how-to/tag-utterances) | 0.20 | Tagging utterances workflow; no limits, config tables, or error mappings indicated. | | [Language support](https://learn.microsoft.com/en-us/azure/ai-services/language-service/sentiment-opinion-mining/language-support) | 0.20 | Language support list is likely just supported/unsupported languages without numeric limits, configuration parameters, or troubleshooting mappings; does not match any expert-knowledge sub-skill type definitions. | -| [Overview](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/overview) | 0.20 | Conceptual overview of PII detection capability; no detailed limits, configs, or troubleshooting content indicated. | +| [Overview](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/conversation-pii-overview) | 0.20 | Overview of conversation PII redaction and use cases; summary does not indicate detailed configuration tables, limits, or error mappings. | +| [Overview](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/overview) | 0.20 | High-level overview of PII detection capability; no specific limits, configuration tables, error codes, or product-specific decision matrices. 
| +| [Overview](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/text-pii-overview) | 0.20 | Conceptual overview of text PII redaction and scenarios; no detailed settings, limits, or troubleshooting content indicated. | | [Previous service updates](https://learn.microsoft.com/en-us/azure/ai-services/language-service/concepts/previous-updates) | 0.20 | Archive of updates; mostly release notes and dates without structured limits, configs, or troubleshooting mappings in the summary. | | [Quickstart](https://learn.microsoft.com/en-us/azure/ai-services/language-service/orchestration-workflow/quickstart) | 0.20 | Quickstart; likely step-by-step usage without deep limits, config matrices, or error catalogs. | | [Quickstart](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/quickstart) | 0.20 | Quickstart; step-by-step usage without explicit limits tables or config matrices in summary. | | [Recognized entity categories](https://learn.microsoft.com/en-us/azure/ai-services/language-service/named-entity-recognition/concepts/named-entity-categories) | 0.20 | Page describes what entity categories and types are for Azure Language NER but appears to be a conceptual overview of recognized categories, not configuration tables, limits, error codes, or decision matrices. It likely lists category names and descriptions, which are general reference rather than product-specific limits, settings, or troubleshooting guidance. | | [Recognized entity categories](https://learn.microsoft.com/en-us/azure/ai-services/language-service/text-analytics-for-health/concepts/health-entity-categories) | 0.20 | Describes entity categories conceptually; likely a list of categories but not configuration, limits, or troubleshooting mappings. 
| -| [Redact PII from text](https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/redact-text-pii) | 0.20 | Appears to be a how-to/tutorial for using PII redaction in Azure Language/Foundry Tools, likely focused on calling the service and basic usage. No indication of detailed limits, configuration parameter tables, error-code troubleshooting, or security/RBAC specifics; more of a feature usage guide than expert reference content. | | [Triage incoming emails with Power Automate](https://learn.microsoft.com/en-us/azure/ai-services/language-service/custom-text-classification/tutorials/triage-email) | 0.20 | Step-by-step tutorial wiring email, classification, and Teams via Power Automate; lacks parameter tables, limits, or troubleshooting details. | | [Try multi-turn conversations](https://learn.microsoft.com/en-us/azure/ai-services/language-service/conversational-language-understanding/how-to/quickstart-multi-turn-conversations) | 0.20 | Quickstart tutorial for building a CLU model with multi-turn slot filling; likely procedural how-to without detailed configuration tables, limits, or error-code-based troubleshooting. | | [What is Azure Language in Foundry Tools?](https://learn.microsoft.com/en-us/azure/ai-services/language-service/overview) | 0.20 | High-level overview of Azure Language in Foundry Tools; no concrete limits, configs, error codes, or decision matrices. 
| diff --git a/products/azure-load-balancer/azure-load-balancer.csv b/products/azure-load-balancer/azure-load-balancer.csv index 1122bcaf..9378a3d1 100644 --- a/products/azure-load-balancer/azure-load-balancer.csv +++ b/products/azure-load-balancer/azure-load-balancer.csv @@ -19,9 +19,9 @@ https://learn.microsoft.com/en-us/azure/load-balancer/egress-only,Outbound only https://learn.microsoft.com/en-us/azure/load-balancer/gateway-deploy-dual-stack-load-balancer,Deploy a dual-stack gateway load balancer,Deploy a dual-stack Azure Gateway Load Balancer,,"In this tutorial, you deploy IPv6 configurations to an existing IPv4-configured Azure Gateway Load Balancer","In this tutorial, you deploy IPv6 configurations to an existing IPv4-configured Azure Gateway Load Balancer. You learn to: Along with the Gateway Load Balancer, this scenario includes the following already-deployed resources:",2024-09-25T08:00:00.000Z,how-to,,0.3,False,Tutorial for adding IPv6 to an existing Gateway Load Balancer; scenario-specific steps rather than general configuration tables.,unchanged https://learn.microsoft.com/en-us/azure/load-balancer/gateway-overview,Gateway load balancer overview,Gateway load balancer - Azure Load Balancer,,Overview of gateway load balancer SKU for Azure Load Balancer.,"Gateway Load Balancer is a SKU of the Azure Load Balancer portfolio catered for high performance and high availability scenarios with third-party Network Virtual Appliances (NVAs). With the capabilities of Gateway Load Balancer, you can easily deploy, scale, and manage NVAs. Chaining a Gateway Load Balancer to your public endpoint only requires one selection. 
You can insert appliances transparently for different kinds of scenarios such as: With Gateway Load Balancer, you can easily add or remove",2026-01-07T08:00:00.000Z,concept-article,,0.2,False,"Gateway Load Balancer overview focused on scenarios and benefits; lacks numeric limits, config parameter tables, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/load-balancer/gateway-partners,Partners,Azure Gateway Load Balancer partners,,Learn about partners offering their network appliances for use with Azure Gateway Load Balancer.,Azure has a growing ecosystem of partners offering their network appliances for use with Gateway Load Balancer.,2026-01-08T23:15:00.000Z,concept-article,,0.2,False,Partner listing for Gateway Load Balancer; essentially ecosystem/marketing content without technical configuration or limits.,unchanged -https://learn.microsoft.com/en-us/azure/load-balancer/howto-load-balancer-imds,Retrieve metadata using the Azure IMDS,Retrieve load balancer and virtual machine IP metadata using Azure Instance Metadata Service (IMDS) - Azure Load Balancer,Use IMDS to retrieve Azure Load Balancer metadata,Get started learning how to retrieve load balancer metadata using Azure Instance Metadata Service.,,2026-04-14T22:21:00.000Z,how-to,integrations,0.7,True,"A 'how-to' for retrieving load balancer and VM IP metadata via Azure Instance Metadata Service typically includes the exact IMDS endpoint (169.254.169.254), REST paths, required headers, and example requests/responses. 
These are product-specific API integration details and configuration parameters that qualify as expert knowledge under the integrations category.",updated +https://learn.microsoft.com/en-us/azure/load-balancer/howto-load-balancer-imds,Retrieve metadata using the Azure IMDS,Retrieve load balancer and virtual machine IP metadata using Azure Instance Metadata Service (IMDS) - Azure Load Balancer,Use IMDS to retrieve Azure Load Balancer metadata,Get started learning how to retrieve load balancer metadata using Azure Instance Metadata Service.,,2026-04-14T22:21:00.000Z,how-to,integrations,0.7,True,"A 'how-to' for retrieving load balancer and VM IP metadata via Azure Instance Metadata Service typically includes the exact IMDS endpoint (169.254.169.254), REST paths, required headers, and example requests/responses. These are product-specific API integration details and configuration parameters that qualify as expert knowledge under the integrations category.",unchanged https://learn.microsoft.com/en-us/azure/load-balancer/inbound-nat-rules,Inbound NAT rules overview,Inbound NAT rules - Azure Load Balancer,,"Overview of what is inbound NAT rule, why to use inbound NAT rule, and how to use inbound NAT rule.",An inbound NAT rule is used to forward traffic from a load balancer frontend to one or more instances in the backend pool.,2026-01-08T23:15:00.000Z,concept-article,,0.1,False,Conceptual overview of inbound NAT rules; describes what and why without detailed configuration parameters or limits.,unchanged -https://learn.microsoft.com/en-us/azure/load-balancer/instance-metadata-service-load-balancer,Retrieve information using the Azure Instance Metadata Service,Retrieve load balancer and virtual machine IP information by using Azure Instance Metadata Service - Azure Load Balancer,Query load balancer and VM IPs via IMDS,Get started learning about using Azure Instance Metadata Service to retrieve load balancer information.,"IMDS (Azure Instance Metadata Service) provides 
information about currently running virtual machine instances. The service is a REST API that's available at a well-known, nonroutable IP address (169.254.169.254). When you place virtual machine or virtual machine set instances behind an Azure Standard Load Balancer, you can use IMDS to retrieve metadata related to the load balancer and the instances. The metadata includes the following information for the virtual machines or virtual machine scale",2026-04-14T22:21:00.000Z,concept-article,integrations,0.7,True,"Describes using Azure Instance Metadata Service to retrieve load balancer and VM IP information, which generally includes specific REST paths, query structure, and metadata schema fields. These are concrete, product-specific API usage details that fit the integrations sub-skill type rather than a conceptual overview.",updated +https://learn.microsoft.com/en-us/azure/load-balancer/instance-metadata-service-load-balancer,Retrieve information using the Azure Instance Metadata Service,Retrieve load balancer and virtual machine IP information by using Azure Instance Metadata Service - Azure Load Balancer,Query load balancer and VM IPs via IMDS,Get started learning about using Azure Instance Metadata Service to retrieve load balancer information.,"IMDS (Azure Instance Metadata Service) provides information about currently running virtual machine instances. The service is a REST API that's available at a well-known, nonroutable IP address (169.254.169.254). When you place virtual machine or virtual machine set instances behind an Azure Standard Load Balancer, you can use IMDS to retrieve metadata related to the load balancer and the instances. 
The metadata includes the following information for the virtual machines or virtual machine scale",2026-04-14T22:21:00.000Z,concept-article,integrations,0.7,True,"Describes using Azure Instance Metadata Service to retrieve load balancer and VM IP information, which generally includes specific REST paths, query structure, and metadata schema fields. These are concrete, product-specific API usage details that fit the integrations sub-skill type rather than a conceptual overview.",unchanged https://learn.microsoft.com/en-us/azure/load-balancer/ipv6-add-to-existing-vnet-cli,Add IPv6 to an IPv4 application - Azure CLI,Add IPv6 to an IPv4 application in Azure virtual network - Azure CLI,,This article shows how to deploy IPv6 addresses to an existing application in an Azure virtual network for a Standard Load Balancer using Azure CLI.,"This article shows you how to add IPv6 addresses to an application that is using an IPv4 public IP address in an Azure virtual network for a Standard Load Balancer using Azure CLI. The in-place upgrade includes a virtual network and subnet, a Standard Load Balancer with IPv4 + IPv6 frontend configurations, VMs with NICs that have IPv4 + IPv6 configurations, network security group, and public IPs.",2024-09-30T08:00:00.000Z,how-to,,0.4,False,"Azure CLI tutorial for adding IPv6 to an existing IPv4 app; similar to index 28, mostly procedural commands.",unchanged https://learn.microsoft.com/en-us/azure/load-balancer/ipv6-add-to-existing-vnet-powershell,Add IPv6 to an IPv4 application - PowerShell,Add IPv6 to an IPv4 application in Azure Virtual Network - PowerShell,,This article shows how to deploy IPv6 addresses to an existing application in Azure virtual network using Azure PowerShell.,This article shows you how to add IPv6 connectivity to an existing IPv4 application in an Azure virtual network with a Standard Load Balancer and Public IP.
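The IMDS rows above describe a REST API at the well-known, nonroutable address 169.254.169.254 that returns load balancer metadata to VMs placed behind an Azure Standard Load Balancer. A minimal sketch of such a call follows; the `169.254.169.254` address comes from the summary text, but the `/metadata/loadbalancer` path and the `api-version` value shown here are assumptions based on common IMDS conventions, not details given in these rows.

```python
# Sketch: query Azure IMDS for load balancer metadata from inside a VM.
# The 169.254.169.254 address is from the summary above; the path and
# api-version are assumptions and may differ in practice.
import json
import urllib.request

IMDS_LB_URL = "http://169.254.169.254/metadata/loadbalancer"


def build_request(api_version: str = "2020-10-01") -> urllib.request.Request:
    # IMDS requires the "Metadata: true" header on every request.
    url = f"{IMDS_LB_URL}?api-version={api_version}"
    return urllib.request.Request(url, headers={"Metadata": "true"})


def fetch_lb_metadata(timeout: float = 2.0) -> dict:
    # Only succeeds on an Azure VM behind a Standard Load Balancer;
    # elsewhere the nonroutable address simply times out.
    with urllib.request.urlopen(build_request(), timeout=timeout) as resp:
        return json.load(resp)
```

Because the address is nonroutable, `fetch_lb_metadata` is only meaningful from inside an Azure VM; `build_request` can be inspected anywhere.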
The in-place upgrade includes:,2024-09-30T08:00:00.000Z,how-to,,0.4,False,PowerShell tutorial for adding IPv6 to an existing IPv4 app; appears to be a step-by-step upgrade scenario rather than a config reference.,unchanged https://learn.microsoft.com/en-us/azure/load-balancer/ipv6-dual-stack-standard-internal-load-balancer-powershell,Deploy a dual-stack Internal load balancer,Deploy an IPv6 dual stack application using Standard Internal Load Balancer in Azure - PowerShell,,This article shows how to deploy an IPv6 dual stack application with Standard Internal Load Balancer in Azure virtual network using Azure PowerShell.,"This article shows you how to deploy a dual stack (IPv4 + IPv6) application in Azure that includes a dual stack virtual network and subnet, a Standard Internal Load Balancer with dual (IPv4 + IPv6) frontend configurations, VMs with NICs that have a dual IP configuration, network security group, and public IPs. The procedure to create an IPv6-capable Internal Load Balancer is nearly identical to the process for creating an Internet-facing IPv6 Load Balancer described here. The only differences for",2024-06-27T08:00:00.000Z,how-to,,0.4,False,PowerShell-focused tutorial for dual-stack internal load balancer; mostly procedural without configuration matrices or error/limit details.,unchanged
For each best practice, this article explains: These best practices are based on a consensus opinion, and Azure platform capability and features sets, as they exist at the time this article was written.",2026-01-29T08:00:00.000Z,troubleshooting,best-practices,0.85,True,Explicit best-practices article with DO/DON'T guidance derived from field experience; contains product-specific recommendations and gotchas for configuring and deploying Load Balancer.,unchanged https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-custom-probe-overview,Health probes,Azure Load Balancer health probes,Configure Azure Load Balancer health probe settings,"Azure Load Balancer health probes and configuration for detecting application failures, managing load, and planned downtime. Includes probe properties and SKU comparison.","An Azure Load Balancer health probe is a feature that detects the health status of your application instances. It sends a request to the instances to check if they're available and responding to requests. The health probe can be configured to use different protocols such as TCP, HTTP, or HTTPS. It's an important feature because it helps you to detect application failures, manage load, and plan for downtime. Azure Load Balancer rules require a health probe to detect the endpoint status. 
The confi",2026-01-08T08:00:00.000Z,concept-article,configuration,0.7,True,"Describes health probe properties and SKU comparison; typically includes probe configuration parameters (interval, threshold, protocol) and their allowed values, which are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-distribution-mode,Configure distribution mode for Load Balancer,Configure Azure Load Balancer distribution mode - Azure Load Balancer,Configure Azure Load Balancer traffic distribution mode,"In this article, get started configuring the distribution mode for Azure Load Balancer to support source IP affinity.","Azure Load Balancer supports two distribution modes for distributing traffic to your applications: To learn more about the different distribution modes supported by Azure Load Balancer, see Azure Load Balancer distribution modes. In this article, you learn how to configure the distribution mode for your Azure Load Balancer.",2026-01-29T08:00:00.000Z,how-to,configuration,0.65,True,How-to for configuring distribution modes including source IP affinity; involves specific mode settings that are product-specific configuration options.,unchanged -https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-faqs,FAQ,Frequently asked questions - Azure Load Balancer,Azure Load Balancer FAQs on limits and behavior,Answers to frequently asked questions about the Azure Load Balancer.,,2026-04-14T22:21:00Z,faq,limits-quotas,0.78,True,"The FAQ includes product-specific numeric limits and behaviors (for example, SNAT port counts, backend pool size limits, timeout values, and other concrete constraints) that qualify as limits-quotas expert knowledge beyond generic conceptual information.",updated +https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-faqs,FAQ,Frequently asked questions - Azure Load Balancer,Azure Load Balancer FAQs on limits and behavior,Answers to frequently asked questions
about the Azure Load Balancer.,,2026-04-14T22:21:00Z,faq,limits-quotas,0.78,True,"The FAQ includes product-specific numeric limits and behaviors (for example, SNAT port counts, backend pool size limits, timeout values, and other concrete constraints) that qualify as limits-quotas expert knowledge beyond generic conceptual information.",unchanged https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-floating-ip,Floating IP configuration,Azure Load Balancer Floating IP configuration,,Overview of Azure Load Balancer Floating IP.,Load balancer provides several capabilities for both UDP and TCP applications.,2025-10-28T08:00:00.000Z,how-to,,0.3,False,High-level overview of floating IP configuration; summary suggests conceptual explanation without detailed configuration tables or limits.,unchanged https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-ha-ports-overview,High Availability ports,High availability ports overview in Azure - Azure Load Balancer,,Learn about high availability ports load balancing on an internal load balancer.,"Azure Standard Load Balancer helps you load-balance all protocol flows on all ports simultaneously when you're using an internal load balancer via HA Ports. High availability (HA) ports are a type of load balancing rule that provides an easy way to load-balance all flows that arrive on all ports of an internal standard load balancer. The load-balancing decision is made per flow.
This action is based on the following five-tuple connection: source IP address, source port, destination IP address, destinati",2024-06-26T08:00:00.000Z,concept-article,,0.4,False,"Overview of HA ports; mostly conceptual description of behavior, not a configuration reference with parameters or limits.",unchanged https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-health-event-logs,Load Balancer Health Event Logs,Azure Load Balancer health event logs - Azure Load Balancer,Interpret Azure Load Balancer health event logs,"Learn what health event logs are available for Azure Load Balancer including severity definitions, health event types, and publishing frequency.",Azure Load Balancer supports health event logs to help you identify and troubleshoot ongoing issues affecting your load balancer resource’s health. These events are provided through the Azure Monitor resource log category LoadBalancerHealthEvent. These logs are supported for Standard (regional and global tier) and Gateway Load Balancers.,2025-12-13T06:24:00.000Z,concept-article,troubleshooting,0.8,True,"Explicitly about health event logs, including severity definitions, event types, and publishing frequency. This is a symptom→event→meaning mapping for a specific log category (LoadBalancerHealthEvent), which is product-specific troubleshooting information.",unchanged @@ -47,7 +47,7 @@ https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-overview,Wha https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-query-metrics-rest-api,Get Load Balancer metrics with REST,Retrieve metrics with the Azure REST API - Azure Load Balancer,Query Azure Load Balancer metrics via REST API,"In this article, get started using the Azure REST APIs to collect health and usage metrics for Azure Load Balancer.",Collect the number of bytes processed by a Standard Load Balancer for an interval of time using the Azure REST API.
Complete reference documentation and more samples for the REST API are available in the Azure Monitor REST reference.,2024-06-26T08:00:00.000Z,how-to,integrations,0.7,True,"Shows how to collect bytes-processed metrics for Standard Load Balancer using Azure Monitor REST API; includes endpoint usage and parameters, which are integration-specific.",unchanged https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-standard-diagnostics,Standard Load Balancer metrics and diagnostics,"Diagnostics with metrics, alerts, and resource health - Azure Load Balancer","Use metrics, alerts, and health to diagnose Azure Load Balancer","Use the available metrics, alerts, and resource health information to diagnose your load balancer.","Azure Load Balancer exposes the following diagnostic capabilities: Multi-dimensional metrics and alerts: Provides multi-dimensional diagnostic capabilities through Azure Monitor for Azure Load Balancer configurations. You can monitor, manage, and troubleshoot your standard load balancer resources. Resource health: The Resource Health status of your load balancer is available in the Resource health page under Monitor. This automatic check informs you of the current availability of your load balancer r",2025-03-10T22:02:00.000Z,concept-article,troubleshooting,0.65,True,"Focused on using metrics, alerts, and resource health to diagnose issues.
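The metrics row above describes collecting bytes-processed counts for a Standard Load Balancer through the Azure REST API. A hedged sketch of assembling such a query URL follows; the resource-ID segments, the `ByteCount` metric name, and the `api-version` shown are assumptions based on general Azure Monitor conventions, not values stated in this summary.

```python
# Sketch: build an Azure Monitor metrics query URL for a load balancer.
# Resource path segments, metric name, and api-version are assumptions.

def build_metrics_url(
    subscription: str,
    resource_group: str,
    lb_name: str,
    metric: str = "ByteCount",
    api_version: str = "2018-01-01",
) -> str:
    # ARM resource ID of the load balancer being queried.
    resource_id = (
        f"/subscriptions/{subscription}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.Network/loadBalancers/{lb_name}"
    )
    # Metrics are exposed under the microsoft.insights resource provider.
    return (
        "https://management.azure.com"
        f"{resource_id}/providers/microsoft.insights/metrics"
        f"?api-version={api_version}&metricnames={metric}"
    )
```

A real call would attach an Azure AD bearer token and a `timespan` filter; the sketch only shows how the pieces fit together.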
Such diagnostic guides for a specific service usually map symptoms/metrics to causes and resolutions, which is product-specific troubleshooting knowledge beyond generic monitoring concepts.",unchanged https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-standard-virtual-machine-scale-sets,Standard Load Balancer and Virtual Machine Scale Sets,Guidance for virtual machine scale sets with Azure Standard Load Balancer,Use Azure Standard Load Balancer with virtual machine scale sets,Learn about working with virtual machine scale sets and Azure Standard Load Balancer.,"When you work with Virtual Machine Scale Sets and Azure Load Balancer, consider the following guidelines.",2024-06-26T08:00:00.000Z,concept-article,best-practices,0.6,True,Guidance article with specific recommendations for combining VM scale sets and Standard Load Balancer; contains product-specific behavioral guidance and gotchas.,unchanged -https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-support-help,Support and troubleshooting for Azure Load Balancer,Azure Load Balancer support and help options,,How to obtain help and support for questions or problems when you create solutions using Azure Load Balancer.,Here are suggestions for where you can get help when developing your Azure Load Balancer solutions.,2026-04-14T22:21:00.000Z,troubleshooting,,0.1,False,"Support/help options page; no technical limits, configuration parameters, error codes, or product-specific patterns. 
Primarily guidance on where to get assistance, not expert implementation knowledge.",new +https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-support-help,Support and troubleshooting for Azure Load Balancer,Azure Load Balancer support and help options,,How to obtain help and support for questions or problems when you create solutions using Azure Load Balancer.,Here are suggestions for where you can get help when developing your Azure Load Balancer solutions.,2026-04-14T22:21:00.000Z,troubleshooting,,0.1,False,"Support/help options page; no technical limits, configuration parameters, error codes, or product-specific patterns. Primarily guidance on where to get assistance, not expert implementation knowledge.",unchanged https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-tcp-idle-timeout,Configure TCP reset and idle timeout,Configure load balancer TCP reset and idle timeout - Azure Load Balancer,Configure Azure Load Balancer TCP idle timeout and reset,"In this article, learn how to configure Azure Load Balancer TCP idle timeout and reset.","Azure Load Balancer rules have a default timeout range of 4 minutes to 100 minutes for Load Balancer rules, Outbound Rules, and Inbound NAT rules. The default setting is 4 minutes. If a period of inactivity is longer than the timeout value, there's no guarantee that the TCP or HTTP session is maintained between the client and your service. 
The following sections describe how to change idle timeout and TCP reset settings for load balancer resources.",2025-02-12T08:00:00.000Z,how-to,limits-quotas,0.95,True,Explicitly states default timeout (4 minutes) and range (4–100 minutes) for multiple rule types; these numeric limits and ranges are classic limits-quotas content.,unchanged https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-tcp-reset,TCP reset on idle timeout,Load Balancer TCP Reset and idle timeout in Azure - Azure Load Balancer,Understand Azure Load Balancer TCP reset behavior,"With this article, learn about Azure Load Balancer with bidirectional TCP Reset packets on idle timeout.",You can use Standard Load Balancer to create a more predictable application behavior for your scenarios by enabling TCP Reset on Idle for a given rule. Load Balancer's default behavior is to silently drop flows when the idle timeout of a flow is reached. Enabling TCP reset causes Load Balancer to send bidirectional TCP Resets (TCP reset packets) on idle timeout to inform your application endpoints that the connection timed out and is no longer usable. Endpoints can immediately establish a new conn,2025-09-25T22:12:00.000Z,concept-article,limits-quotas,0.7,True,Explains TCP reset on idle timeout; likely includes specific idle timeout behavior and packet handling details that are product-specific and not generic knowledge.,unchanged https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-test-frontend-reachability,Testing inbound frontend IP address reachability,Test reachability of Azure Public Load Balancer frontends with ping and traceroute,Test Azure Public Load Balancer frontend reachability,Learn how to test Azure Public Load Balancer frontend IPv4 and IPv6 addresses for reachability from an Azure VM or an external device. Supports ping and traceroute.,"Standard Public Azure Load Balancer frontend IPv4 and IPv6 addresses support testing reachability using ping and traceroute.
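The idle-timeout row above states a default of 4 minutes and an allowed range of 4 to 100 minutes for Load Balancer rules, Outbound Rules, and Inbound NAT rules. Those documented numbers can be captured in a small validator; the helper name and error wording here are ours, only the constants come from the summary.

```python
# Sketch: validate an Azure Load Balancer idle timeout against the
# documented range (default 4 minutes, allowed 4-100 minutes).
DEFAULT_IDLE_TIMEOUT_MINUTES = 4
IDLE_TIMEOUT_RANGE_MINUTES = (4, 100)


def validate_idle_timeout(minutes: int = DEFAULT_IDLE_TIMEOUT_MINUTES) -> int:
    lo, hi = IDLE_TIMEOUT_RANGE_MINUTES
    if not lo <= minutes <= hi:
        raise ValueError(
            f"idle timeout must be {lo}-{hi} minutes, got {minutes}"
        )
    return minutes
```

Such a check is useful in deployment tooling: rejecting an out-of-range value locally is cheaper than a failed ARM deployment.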
Testing reachability of a load balancer frontend is useful for troubleshooting inbound connectivity issues to Azure resources. In this article, you learn how to use ping and traceroute for testing a frontend of an existing Standard public load balancer. It can be completed from an Azure Virtual Machine or from a device outside of Azure.",2024-06-26T08:00:00.000Z,how-to,troubleshooting,0.7,True,"Shows how to use ping and traceroute for diagnosing inbound connectivity issues to Standard Public Load Balancer frontends. While somewhat procedural, it is clearly framed as a troubleshooting technique for a specific product scenario.",unchanged diff --git a/products/azure-load-balancer/report.md b/products/azure-load-balancer/report.md index 33b58a0b..13f7bdbd 100644 --- a/products/azure-load-balancer/report.md +++ b/products/azure-load-balancer/report.md @@ -51,10 +51,10 @@ confusable_not_for: Not for Azure Application Gateway (use azure-application-gat - **Unclassified**: 46 ### Incremental Update -- **New Pages**: 1 -- **Updated Pages**: 3 -- **Unchanged**: 83 -- **Deleted Pages**: 8 +- **New Pages**: 0 +- **Updated Pages**: 0 +- **Unchanged**: 87 +- **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-load-balancer/azure-load-balancer.csv` ## Classification Statistics @@ -74,30 +74,6 @@ confusable_not_for: Not for Azure Application Gateway (use azure-application-gat ## Changes -### New Pages - -- [Support and troubleshooting for Azure Load Balancer](https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-support-help) - -### Updated Pages - -- [Retrieve metadata using the Azure IMDS](https://learn.microsoft.com/en-us/azure/load-balancer/howto-load-balancer-imds) - - Updated: 2026-01-29T08:00:00.000Z → 2026-04-14T22:21:00.000Z -- [Retrieve information using the Azure Instance Metadata Service](https://learn.microsoft.com/en-us/azure/load-balancer/instance-metadata-service-load-balancer) - - Updated: 
2026-01-29T08:00:00.000Z → 2026-04-14T22:21:00.000Z -- [FAQ](https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-faqs) - - Updated: 2026-02-23T23:22:00Z → 2026-04-14T22:21:00Z - -### Deleted Pages - -- ~~Common deployment errors~~ (https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-common-deployment-errors) -- ~~Troubleshoot Azure Load Balancer~~ (https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-troubleshoot) -- ~~Backend pool (VM) traffic~~ (https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-troubleshoot-backend-traffic) -- ~~Troubleshoot load balancer health event logs~~ (https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-troubleshoot-health-event-logs) -- ~~Health Probe status~~ (https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-troubleshoot-health-probe-status) -- ~~Common error codes for Azure Instance Metadata Service (IMDS)~~ (https://learn.microsoft.com/en-us/azure/load-balancer/troubleshoot-load-balancer-imds) -- ~~Troubleshoot SNAT exhaustion and connection timeouts~~ (https://learn.microsoft.com/en-us/azure/load-balancer/troubleshoot-outbound-connection) -- ~~Resource health and inbound availability issues~~ (https://learn.microsoft.com/en-us/azure/load-balancer/troubleshoot-rhc) - ## Classified Pages | TOC Title | Type | Confidence | Reason | diff --git a/products/azure-local/azure-local.csv b/products/azure-local/azure-local.csv index 00a89adc..6fd09890 100644 --- a/products/azure-local/azure-local.csv +++ b/products/azure-local/azure-local.csv @@ -1,358 +1,377 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type -https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-fedramp-guidance?view=azloc-2603,FedRAMP guidance,FedRAMP guidance for Azure Local - Azure Local,,Learn about FedRAMP compliance using Azure Local.,Applies to: 
Hyperconverged deployments of Azure Local This article explains the relationship between Azure Local and FedRAMP and how organizations can stay compliant with FedRAMP with Azure Local solutions.,2026-01-28T23:03:00.000Z,concept-article,,0.2,False,"FedRAMP guidance; focuses on relationship and compliance, not on concrete configuration or limits.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-hipaa-guidance?view=azloc-2603,HIPAA guidance,HIPAA guidance for Azure Local - Azure Local,,Learn about HIPAA compliance using Azure Local.,Applies to: Hyperconverged deployments of Azure Local This article provides guidance on how organizations can most efficiently navigate HIPAA compliance for solutions built with Azure Local.,2026-01-28T23:03:00.000Z,concept-article,,0.2,False,"HIPAA guidance; summary suggests general compliance guidance, not detailed product-specific settings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-iso27001-guidance?view=azloc-2603,ISO/IEC 27001 guidance,ISO 27001 guidance for Azure Local - Azure Local,,Learn about ISO 27001 compliance using Azure Local.,"Applies to: Hyperconverged deployments of Azure Local This article outlines how Azure Local helps organizations meet the security control requirements of ISO/IEC 27001:2022, both in cloud and on premises. 
Learn more about Azure Local and other security standards at Azure Local and security standards.",2026-01-28T23:03:00.000Z,concept-article,,0.2,False,ISO 27001 guidance at a conceptual/compliance level; summary does not show concrete security settings or RBAC details.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-pci-dss-guidance?view=azloc-2603,PCI DSS guidance,PCI DSS guidance for Azure Local - Azure Local,,Learn about PCI DSS compliance using Azure Local.,"Applies to: Hyperconverged deployments of Azure Local This article explains how Microsoft Azure Local security features can help organizations in the payment card industry achieve the security control requirements of PCI DSS, both in the cloud and in their on-premises environments.",2026-01-28T23:03:00.000Z,concept-article,,0.2,False,PCI DSS guidance; appears compliance-oriented without specific configuration parameters or numeric limits.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-security-standards?view=azloc-2603,Azure Local and security standards,Azure Local and security standards - Azure Local,,"Learn about Azure Local, security standards, and security assurance.","Applies to: Hyperconverged deployments of Azure Local This article provides information about security standards related to Azure Local. The resources detailed in this article, including certifications and evaluation reports, could be used as sources to help you in your compliance planning. 
Each section in this article provides information on Azure Local and a particular security standard, together with any completed certifications.",2026-03-02T23:03:00.000Z,concept-article,,0.2,False,Describes security standards and certifications; compliance/assurance overview without product-specific configuration or limits.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/azure-hybrid-benefit?view=azloc-2603,Azure Hybrid Benefit,Azure Hybrid Benefit for Azure Local - Azure Local,,Learn about Azure Hybrid Benefit for Azure Local.,"Applies to: Azure Local 2311.2 and later This article describes Azure Hybrid Benefit and how to use it for Azure Local. Azure Hybrid Benefit is a program that helps you reduce the costs of running workloads in the cloud. With Azure Hybrid Benefit for Azure Local, you can maximize the value of your on-premises licenses and modernize your existing infrastructure to Azure Local at no extra cost.",2026-04-05T08:00:00.000Z,how-to,,0.2,False,"High-level cost/benefit description of Azure Hybrid Benefit for Azure Local; summary indicates conceptual and marketing-style explanation without specific limits, configuration parameters, decision matrices, or error codes.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/billing?view=azloc-2603,Billing and payment,Azure Local billing and payment - Azure Local,Understand billing and payment model for Azure Local,How billing and payment works in Azure Local.,"Applies to: Azure Local 2311.2 and later Azure Local is an Azure service that goes on your Azure subscription bill just like any other Azure service. It's priced on a per core basis on your on-premises servers. For current pricing, see Azure Local pricing. Currencies and discounts are handled centrally by the Azure Commerce billing platform, and the customer gets one unified, itemized bill at the end of the month. 
No traditional on-premises software license is required for Azure Local, although g",2026-01-07T23:02:00.000Z,overview,decision-making,0.6,True,Provides product-specific billing behavior and pricing model details that inform cost planning and service selection.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/compare-vm-management-capabilities?view=azloc-2603,Compare VM management capabilities,Compare Management Capabilities of VMs on Azure Local - Azure Local,Choose Azure Local VM types and management capabilities,Learn about the kinds of virtual machines (VMs) that can run on Azure Local and compare their management capabilities.,Applies to: Hyperconverged deployments of Azure Local This article describes the types of virtual machines (VMs) available on Azure Local. It also compares their management capabilities in Azure.,2026-01-07T23:02:00.000Z,product-comparison,decision-making,0.65,True,Comparison of VM types and their management capabilities; likely includes comparison tables that guide which VM type to use for different scenarios.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/compare-windows-server?view=azloc-2603,Compare to Windows Server,Compare Azure Local to Windows Server - Key Differences and Benefits - Azure Local,Choose between Azure Local and Windows Server,"Learn the key differences between Azure Local and Windows Server to choose the best solution for your organization. Compare features, benefits, and scenarios.","Applies to: Azure Local 2311.2 and later; Windows Server 2025 This article compares Azure Local and Windows Server, and highlights key differences between the two products. It helps you learn when to use each product, and how they can work together. Azure Local and Windows Server share many similarities, such as letting you run virtual machines and container-based workloads. But they're designed for different scenarios and use cases. 
Azure Local is a cloud-connected hyperconverged solution that ",2026-03-02T08:00:00.000Z,product-comparison,decision-making,0.7,True,"Explicitly compares Azure Local vs Windows Server and explains when to use each; this is product-specific selection guidance. Even if numbers are sparse, it provides scenario-based recommendations and trade-offs.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/datacenter-firewall-overview?view=azloc-2603,Datacenter Firewall overview,Overview of Datacenter Firewall in Azure Local and Windows Server - Azure Local,,Use this article to learn about Datacenter Firewall in Azure Local and Windows Server.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 Datacenter Firewall is a network layer, 5-tuple (protocol, source and destination port numbers, source and destination IP addresses), stateful, multitenant Software Defined Networking (SDN) firewall. The Datacenter Firewall protects east-west and north-south traffic flows across the network layer of virtual networks and traditional VLAN networks.",2025-12-22T23:06:00.000Z,overview,,0.3,False,"Overview of Datacenter Firewall; conceptual description of capabilities, not detailed security configuration.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/external-storage-support?view=azloc-2603,External storage support,External storage support for Azure Local (preview) - Azure Local,Design Azure Local deployments with external SAN storage,"Learn how Azure Local supports external SAN integration, enabling high-performance storage for VMs, AKS clusters, and AVD instances without re-architecture (preview).","This article explains external storage support for Azure Local, its benefits, supported configurations, and other essential information. Important This feature is currently in PREVIEW. 
-See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-02-19T18:05:00.000Z,concept-article,architecture-patterns,0.65,True,"Details supported configurations and patterns for integrating external SAN with Azure Local, including when and how to use external storage in this product.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/firewall-requirements?view=azloc-2603,Firewall requirements,Firewall requirements for Azure Local - Azure Local,Configure firewall rules and endpoints for Azure Local,This article provides guidance on firewall requirements for the Azure Stack HCI operating system.,Applies to: Azure Local 2311.2 and later This article provides guidance on how to configure firewalls for the Azure Stack HCI operating system. It includes firewall requirements for outbound endpoints and internal rules and ports. The article also provides information on how to use Azure service tags with Microsoft Defender firewall. 
This article also describes how to optionally use a highly locked-down firewall configuration to block all traffic to all destinations except those included in your,2026-01-26T23:10:00.000Z,how-to,security,0.85,True,"Provides outbound endpoints, internal ports/rules, and use of service tags; detailed security configuration parameters specific to Azure Local.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/gateway-overview?view=azloc-2603,RAS Gateway overview,RAS Gateway for Software Defined Networking managed by on-premises tools - Azure Local,,Learn about Remote Access Service (RAS) Gateway for Software Defined Networking managed by on-premises tools in Azure Local and Windows Server.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This article provides an overview of Remote Access Service (RAS) Gateway for Software Defined Networking (SDN) in Azure Local and Windows Server. RAS Gateway is a software-based Border Gateway Protocol (BGP) capable router designed for cloud service providers (CSPs) and enterprises that host multitenant virtual networks using Hyper-V Network Virtualization (HNV). You can use RAS Gateway to rou",2025-12-22T23:06:00.000Z,overview,,0.3,False,Overview of RAS Gateway; high-level description of capabilities without detailed configuration tables in the summary.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/host-network-requirements?view=azloc-2603,Host network requirements,Host network requirements for Azure Local - Azure Local,Configure host networking for Azure Local clusters,Learn the host network requirements for Azure Local,"Applies to: Azure Local 2311.2 and later This topic discusses host networking considerations and requirements for Azure Local. For information on datacenter architectures and the physical connections between machines, see Physical network requirements. 
For information on how to simplify host networking using Network ATC, see Simplify host networking with Network ATC.",2026-01-26T23:10:00.000Z,how-to,configuration,0.8,True,Host networking considerations and requirements; includes specific network settings and patterns for Azure Local hosts.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/microsoft-365-local-overview?view=azloc-2603,Microsoft 365 Local,Overview of Microsoft 365 Local on Azure Local Infrastructure - Azure Local,,"Learn how Microsoft 365 Local enables private cloud productivity with Exchange, SharePoint, and Skype for Business on customer-managed Azure Local infrastructure.",This article provides an overview of Microsoft 365 Local on Azure Local infrastructure and how it helps organizations meet sovereignty requirements while maintaining productivity in a private cloud environment.,2026-04-09T22:04:00.000Z,concept-article,,0.1,False,"Described as an overview of Microsoft 365 Local on Azure Local infrastructure, focused on what it is and how it helps with sovereignty and productivity. This is conceptual/marketing-style content without clear indication of numeric limits, configuration tables, or detailed decision matrices, so it doesn't meet the expert-knowledge criteria.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/monitoring-overview?view=azloc-2603,Overview,Overview of Azure Local monitoring - Azure Local,,This article provides an overview of the Azure Local monitoring solution.,"Applies to: Hyperconverged deployments of Azure Local This article provides an overview of monitoring in hyperconverged deployments of Azure Local (formerly Azure Stack HCI). Monitoring Azure Local involves the regular collection and analysis of data from all components of your system to promptly identify and address any potential issues. Routine monitoring is crucial for maintaining the health and functionality of your Azure Local system. 
To understand the current performance patterns, identify ",2025-12-12T23:06:00.000Z,concept-article,,0.2,False,Monitoring overview; conceptual explanation of monitoring approach without detailed metrics tables or configuration parameters.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/network-atc-overview?view=azloc-2603,Network ATC overview,Network ATC overview - Azure Local,Apply Network ATC best practices for Azure Local networking,This article introduces Network ATC for Azure Local and Windows Server.,"Applies to: Azure Local 2311.2 and later Deployment and operation of Azure Local networking can be a complex and error-prone process. Due to the configuration flexibility provided with the host networking stack, there are many moving parts that can be easily misconfigured or overlooked. Staying up to date with the latest best practices is also a challenge as improvements are continuously made to the underlying technologies. Additionally, configuration consistency across Azure Local machines is i",2026-01-26T23:10:00.000Z,overview,best-practices,0.6,True,Network ATC overview for Azure Local/Windows Server likely embeds product-specific recommended configurations and patterns to avoid misconfiguration.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/network-controller-overview?view=azloc-2603,Network Controller overview,Overview of Network Controller in Azure Local and Windows Server - Azure Local,,Use this article to learn about Network Controller for Software Defined Networking managed by on-premises tools in Azure Local and Windows Server.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 Network Controller is the cornerstone of Software Defined Networking (SDN) management. It's a highly scalable server role that provides a centralized, programmable point of automation to manage, configure, monitor, and troubleshoot virtual network infrastructure. 
Using Network Controller, you can automate the configuration and management of network infrastructure instead of performing manual c",2025-12-22T23:06:00.000Z,overview,,0.3,False,"Overview of Network Controller; appears conceptual, not focused on specific configuration parameters or limits.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/observability?view=azloc-2603,Azure Local observability,Azure Local observability - Azure Local,,Learn about observability in Azure Local.,Applies to: Hyperconverged deployments of Azure Local This article describes observability in Azure Local and the data sources through which it is achieved.,2025-07-31T22:03:00.000Z,how-to,,0.3,False,Conceptual overview of observability and data sources; mostly high-level without detailed config tables or limits.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/physical-network-requirements?view=azloc-2603,Physical network requirements,Physical network requirements for Azure Local - Azure Local,Configure physical network requirements for Azure Local,"Learn about physical network requirements for Azure Local, including network switches, to ensure optimal performance.","Applies to: Azure Local 2311.2 and later This article discusses physical (fabric) network considerations and requirements for Azure Local, particularly for network switches. Note Requirements for future Azure Local versions may change.",2026-03-30T22:03:00.000Z,concept-article,configuration,0.7,True,"Describes physical (fabric) network requirements and switch considerations for Azure Local. 
Such pages typically include specific network configuration parameters (e.g., VLANs, MTU, port speeds, required features) and version applicability, which are product-specific configuration details not generally known from training.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/plan-network-controller-deployment?view=azloc-2603,Network Controller,"Plan to deploy Network Controller on Azure Local, version 23H2 - Azure Local",Plan Network Controller deployment topology on Azure Local 23H2,This article covers how to plan to deploy Network Controller on Azure Local via Windows Admin Center on a set of virtual machines (VMs).,Applies to: Hyperconverged deployments of Azure Local This article describes how to plan to deploy Network Controller on Azure Local via Windows Admin Center on a set of virtual machines (VMs). Planning to deploy Network Controller via Windows Admin Center requires a set of VMs running the Azure Stack HCI operating system. Network Controller is a highly available and scalable server role that requires a minimum of three VMs to provide high availability on your network. 
Note We recommend that you,2025-12-22T23:06:00.000Z,install-set-up-deploy,architecture-patterns,0.6,True,"Planning article for Network Controller deployment; includes requirements like minimum three VMs and recommendations, which are product-specific architectural decisions.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/plan-software-defined-networking-infrastructure-23h2?view=azloc-2603,SDN infrastructure,"Plan infrastructure for Software Defined Networking managed by on-premises tools in Azure Local, version 23H2 - Azure Local",Plan SDN infrastructure for Azure Local 23H2,"This topic provides information on how to plan a Software Defined Network (SDN) infrastructure deployment, managed by on-premises tools, for Azure Local, version 23H2.","Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This article describes how to plan an infrastructure deployment for Software Defined Networking (SDN) managed by on-premises tools, including hardware and software prerequisites. It outlines planning requirements for both physical and logical network configuration, routing, gateways, network hardware, and more. 
This article also includes considerations on extending an SDN infrastructure and usi",2026-01-28T23:03:00.000Z,concept-article,architecture-patterns,0.68,True,"Planning guide includes product-specific SDN topology guidance, hardware/software prerequisites, and concrete design considerations (physical/logical networks, routing, gateways, extension scenarios) that go beyond generic networking concepts and are specific to Azure Local SDN.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-add-server?view=azloc-2603,Add and repair nodes,Add or repair a node to a rack aware cluster on Azure Local - Azure Local,Add or repair nodes in Azure Local rack aware clusters,Learn how to add or repair a node to a rack aware cluster on Azure Local.,This article explains how to add or repair servers (nodes) for your Azure Local rack aware cluster.,2026-01-22T23:03:00.000Z,how-to,deployment,0.65,True,Node add/repair procedures are product-specific operational deployment/scale-out steps with concrete commands and constraints.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-aks-nodes?view=azloc-2603,Spread AKS nodes,Spread Azure Kubernetes Service (AKS) nodes in rack aware cluster - Azure Local,Configure AKS node distribution in rack aware clusters,Learn how to deploy AKS clusters with rack aware cluster support to ensure fault tolerance and evenly distribute nodes across Azure Local zones.,This article explains how to deploy Azure Kubernetes Service (AKS) clusters with rack aware cluster support. 
You'll learn how to ensure fault tolerance and evenly distribute nodes across Azure Local zones for improved reliability.,2026-01-22T23:03:00.000Z,concept-article,configuration,0.7,True,Describes how to deploy AKS clusters with rack aware support and distribute nodes across zones; involves product-specific configuration parameters.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-overview?view=azloc-2603,What's a rack aware cluster?,Overview of Azure Local rack aware clustering - Azure Local,Understand Azure Local rack aware clustering architecture,Use this article to learn about Azure Local rack aware clustering.,This article gives a high-level overview of the Azure Local rack aware clustering feature including its benefits and use cases. The article also details the supported configurations and deployment requirements for rack aware clusters. This article applies only to new deployments of Azure Local.,2026-01-22T23:03:00.000Z,overview,architecture-patterns,0.62,True,Overview includes supported configurations and deployment requirements for rack aware clusters; likely describes product-specific clustering patterns and when to use them.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-provision-vm-local-availability-zone?view=azloc-2603,Provision and place VMs,Provision VMs in local availability zone for Azure Local - Azure Local,Provision Azure Local VMs in local availability zones,Learn about how to provision VMs in local availability zone for Azure Local.,"This article explains how to create Azure Local virtual machines (VMs) in a local availability zone to reduce latency, improve performance, ensure redundancy, and meet compliance requirements. 
Important Updating placement configuration of existing virtual machines (VMs) is not supported.",2026-01-28T23:03:00.000Z,how-to,configuration,0.7,True,Explains how to configure VM placement in local availability zones; includes constraints like non-support for updating placement of existing VMs.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-reference-architecture?view=azloc-2603,Reference architecture,Azure Local rack aware cluster reference architecture - Azure Local,Reference network architecture for Azure Local rack aware clusters,Learn about the network design and configuration of an Azure Local rack aware cluster,"This article contains information about the network design and configuration of an Azure Local rack aware cluster. This configuration involves a single cluster where nodes are placed in different physical locations within a building. The primary intent is to support factory environments where hardware must be isolated in different rooms due to regulatory requirements, safety protocols, or operational constraints. 
This isolation provides fault domain separation while maintaining cluster functiona",2026-01-22T23:03:00.000Z,concept-article,architecture-patterns,0.7,True,Describes network design and configuration for rack aware clusters in specific factory/room-isolation scenarios; this is a product-specific reference architecture pattern.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-requirements?view=azloc-2603,Review requirements,Requirements and supported configurations for rack aware clusters - Azure Local,Rack aware cluster requirements and supported configurations,Learn about requirements and supported configurations for rack aware clusters.,This article provides the requirements and supported configurations for rack aware clusters.,2026-02-24T23:03:00.000Z,how-to,limits-quotas,0.7,True,"Requirements/supported configurations typically include numeric constraints (node counts, network requirements, supported topologies) that function as product-specific limits.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-room-to-room-connectivity?view=azloc-2603,Room-to-room connections,Azure Local rack aware cluster room-to-room connectivity - Azure Local,Design room-to-room connectivity for rack aware clusters,Learn about Azure Local rack aware cluster room-to-room connectivity.,"Azure Local rack aware clusters require specialized room-to-room connectivity to enable storage replication and failover across availability zones. This article outlines four distinct configuration options (A, B, C, and D) for implementing room-to-room links, each optimized for different resilience, cost, and complexity requirements. 
Review the following key concepts: Room-to-room links: Physical network connections that span between separate rooms or availability zones, enabling RDMA (Remote Di",2026-02-24T23:03:00.000Z,concept-article,architecture-patterns,0.7,True,"Outlines four configuration options with different resilience, cost, and complexity; this is explicit pattern/trade-off guidance for Azure Local rack aware connectivity.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/route-reflector-overview?view=azloc-2603,Route reflector overview,Overview of BGP Route Reflector in Azure Local and Windows Server - Azure Local,,Use this topic to learn about BGP Route Reflector for Software Defined Networking managed by on-premises tools in Azure Local and Windows Server.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This article provides an overview of Border Gateway Protocol (BGP) Route Reflector in Azure Local and Windows Server. BGP Route Reflector is included with Remote Access Service (RAS) Gateway and provides an alternative to BGP full mesh topology that is required for route synchronization between routers. A Route Reflector in a Software Defined Networking deployment is a logical entity that sits o
This feature is available in Azure Local 2506 or later with OS build 26100.xxxx.,2025-12-22T23:06:00Z,faq,,0.4,False,"FAQ for SDN; summary does not indicate detailed error codes, config tables, or numeric limits—likely general Q&A.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/sdn-multisite-overview?view=azloc-2603,SDN Multisite overview,Overview of SDN Multisite in Azure Local and Windows Server - Azure Local,Plan SDN Multisite topology and disaster recovery on Azure Local,This article provides an overview of the SDN Multisite solution.,"Applies to: Hyperconverged deployments of Azure Local Applies to: Windows Server 2025 This article provides an overview of SDN Multisite, including its benefits and current limitations. You can use it as a guide to help design your network topology and disaster recovery plan. SDN Multisite allows you to expand the capabilities of traditional SDN deployed at different physical locations. SDN Multisite enables native Layer 2 and Layer 3 connectivity across different physical locations for virtuali",2025-12-22T23:06:00.000Z,concept-article,architecture-patterns,0.65,True,Provides overview of SDN Multisite benefits and limitations and is explicitly for designing network topology and DR; this is product-specific architecture and pattern guidance.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/sdn-overview?view=azloc-2603,Overview,Software Defined Networking (SDN) enabled by Azure Arc on Azure Local - Azure Local,Choose SDN management methods enabled by Azure Arc on Azure Local,"Software Defined Networking enabled by Arc provides a way to centrally configure and manage logical networks, network security groups, network security rules via the Azure portal and Azure CLI in Azur","This article explains Software Defined Networking (SDN) enabled by Azure Arc on Azure Local. It covers SDN management methods, when to use each method, and supported and unsupported SDN scenarios. 
SDN offers a centralized way to configure and manage networks and network services such as switching, routing, and load balancing in your datacenter. SDN enables you to dynamically create, secure, and connect your network to meet the evolving needs of your applications.",2026-02-24T23:03:00.000Z,concept-article,architecture-patterns,0.6,True,"Explains SDN management methods, when to use each, and supported/unsupported scenarios; this is product-specific pattern selection guidance with trade-offs.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/security-features?view=azloc-2603,About security features,"Security features for Azure Local, version 23H2. - Azure Local",,"Learn about security features available for new deployments of Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local Azure Local (formerly Azure Stack HCI) is a secure-by-default product that has more than 300 security settings enabled right from the start. Default security settings provide a consistent security baseline to ensure that devices start in a known good state. This article provides a brief conceptual overview of the various security features associated with your Azure Local instance. 
Features include security defaults, Application Control, volum",2026-01-28T23:03:00.000Z,concept-article,,0.2,False,"Conceptual overview of security features; no specific RBAC roles, parameters, or config values indicated.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/software-defined-networking-23h2?view=azloc-2603,Software-defined networking overview,"Software defined networking (SDN) managed by on-premises tools in Azure Local, version 23H2 - Azure Local",,"Software defined networking (SDN) managed by on-premises tools provides a way to centrally configure and manage networks and network services such as switching, routing, and load balancing in Azure Lo","Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 Software defined networking (SDN) provides a way to centrally configure and manage networks and network services such as switching, routing, and load balancing in your data center. You can use SDN to dynamically create, secure, and connect your network to meet the evolving needs of your apps. 
Operating global-scale datacenter networks for services like Microsoft Azure, which efficiently perfor",2026-01-28T23:03:00.000Z,concept-article,,0.3,False,Conceptual overview of SDN managed by on-premises tools; mostly descriptive without concrete configuration matrices in the summary.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/software-load-balancer?view=azloc-2603,Load balancer overview,Software Load Balancer (SLB) for SDN managed by on-premises tools in Azure Local and Windows Server - Azure Local,,Use this article to learn about Software Load Balancer for Software Defined Networking managed by on-premises tools in Azure Local and Windows Server.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 Cloud Service Providers (CSPs) and enterprises that are deploying Software Defined Networking (SDN) can use Software Load Balancer (SLB) to evenly distribute tenant and tenant customer network traffic among virtual network resources. SLB enables multiple servers to host the same workload, providing high availability and scalability. Software Load Balancer can provide a multitenant, unified edge ",2025-12-22T23:06:00.000Z,overview,,0.3,False,Overview of Software Load Balancer; summary is conceptual and does not show specific configuration or limits.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/system-requirements-23h2?view=azloc-2603,System requirements,"System requirements for Azure Local, version 23H2 - Azure Local",Configure system requirements for Azure Local 23H2,"How to choose machines, storage, and networking components for Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local This article discusses Azure, machine and storage, networking, and other requirements for hyperconverged deployments of Azure Local (formerly Azure Stack HCI). 
If you purchased Integrated System solution hardware from theAzure Local Catalog, you can skip to theNetworking requirementssince the hardware already adheres to machine and storage requirements.",2026-01-30T18:03:00.000Z,how-to,configuration,0.85,True,"Details machine, storage, networking, and Azure requirements; likely includes specific hardware/network parameters and ranges, which are product-specific configuration guidance.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/system-requirements-small-23h2?view=azloc-2603,System requirements for low capacity class,System requirements for low capacity deployments of Azure Local (preview) - Azure Local,Configure low-capacity Azure Local deployment hardware,"How to choose machines, storage, and networking components for low capacity deployments of Azure Local (preview).","Applies to: Hyperconverged deployments of Azure Local This article describes the requirements for machines, storage, and networking for building solutions of Azure Local that use lower capacity hardware. If you purchase lower capacity hardware from theAzure Local Catalog, ensure that these requirements are met before you deploy the Azure Local solutions. Important This feature is currently in PREVIEW. 
-See theSupplemental Terms of Use for Microsoft Azure Previewsfor legal terms that apply to Azur",2026-02-13T18:03:00.000Z,how-to,configuration,0.85,True,"Defines requirements for machines, storage, and networking for low-capacity deployments; contains specific configuration thresholds unique to this product scenario.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/concepts/telemetry-and-diagnostics-overview?view=azloc-2603,Telemetry and diagnostics extension,Azure Local telemetry and diagnostics extension - Azure Local,,This article describes the telemetry and diagnostics extension in Azure Local.,"Applies to: Azure Local 2311.2 and later This article gives an overview, lists benefits, and describes options for the telemetry and diagnostics extension in Azure Local.",2025-08-04T17:04:00.000Z,how-to,,0.3,False,"Telemetry and diagnostics extension overview and benefits; lacks detailed settings tables, limits, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/about-private-endpoints?view=azloc-2603,About private endpoint scenarios,About using Private Endpoints to Connect with Azure Local - Azure Local,Plan Azure Local connectivity using private endpoints,"Review how Azure Private Endpoints can be used when deploying Azure Local, with and without Arc gateway, and with and without Proxy.","This article provides an overview of Azure private endpoints on Azure Local including the supported and unsupported scenarios, and key requirements for successful connectivity.",2026-03-11T22:07:00.000Z,concept-article,decision-making,0.7,True,Overview of supported/unsupported scenarios and key requirements for private endpoints with Azure Local; helps choose connectivity approach across scenarios.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/azure-verification?view=azloc-2603,Azure verification for VMs,Azure verification for VMs on Azure Local - Azure Local,Configure Azure verification 
attestation for Azure Local VMs,Learn about the Azure verification for VMs feature on Azure Local.,"Applies to: Hyperconverged deployments of Azure Local Microsoft Azure offers a range of differentiated workloads and capabilities that are designed to run only on Azure. Azure Local extends many of the same benefits you get from Azure, while running on the same familiar and high-performance on-premises or edge environments. Azure verification for VMsmakes it possible for supported Azure-exclusive workloads to work outside of the cloud. This feature, modeled after theIMDS attestationservice in Az",2025-12-19T23:03:00.000Z,overview,security,0.6,True,Azure verification for VMs is modeled after IMDS attestation; configuring this feature likely involves security/attestation settings unique to Azure Local.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-no-proxy-no-gateway?view=azloc-2603,"Private endpoints - no proxy, no gateway",Use Azure Private Endpoints with Azure Local for No Proxy No Arc Gateway Scenario - Azure Local,Configure private endpoints for Azure Local without proxy or gateway,"Review how Azure private endpoints can be used when deploying Azure Local, without an enterprise proxy and without an Arc gateway.","This article provides an overview of how you can integrate both Azure private endpoints with Azure Local in a scenario without an enterprise proxy and without an Arc gateway. 
-For more information about Azure private endpoints on Azure Local and the supported scenarios, seeAbout Azure private endpoints on Azure Local.",2026-03-11T22:07:00.000Z,concept-article,configuration,0.7,True,"Scenario-specific integration of private endpoints with Azure Local (no proxy, no gateway); includes product-specific configuration steps and requirements.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-no-proxy-with-gateway?view=azloc-2603,"Private endpoints - no proxy, with gateway",Use Private Endpoints with Azure Local for No Proxy with Arc Gateway - Azure Local,Configure private endpoints for Azure Local with Arc gateway,"Review how Azure Private Endpoints can be used when deploying Azure Local, without an enterprise proxy but with an Arc gateway.","This article provides an overview of how you can integrate both existing and new Azure private endpoints with Azure Local in a scenario without an enterprise proxy but with an Arc gateway. -For more information about Azure private endpoints on Azure Local and the supported scenarios, seeAbout Azure private endpoints on Azure Local.",2026-03-11T22:07:00.000Z,concept-article,configuration,0.7,True,"Details how to integrate existing and new private endpoints in a no-proxy, with-gateway scenario; product-specific configuration guidance.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-with-proxy-no-gateway?view=azloc-2603,"Private endpoints - with proxy, no gateway",Use Private Endpoints with Azure Local for with Proxy no Arc Gateway - Azure Local,Configure private endpoints for Azure Local with proxy only,"Review how Azure Private Endpoints can be used when deploying Azure Local, with an enterprise proxy but without an Arc gateway.","This article provides an overview of how you can integrate both existing and new Azure private endpoints with Azure Local in a scenario with enterprise proxy but without an Arc gateway. 
-For more information about Azure private endpoints on Azure Local and the supported scenarios, seeAbout Azure private endpoints on Azure Local.",2026-03-11T22:07:00.000Z,concept-article,configuration,0.7,True,Describes configuration of private endpoints when using an enterprise proxy but no Arc gateway; scenario-specific connectivity configuration.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-with-proxy-with-gateway?view=azloc-2603,"Private endpoints - with proxy, with gateway",Use Private Endpoints with Azure Local for Proxy with Arc Gateway - Azure Local,Configure private endpoints for Azure Local with proxy and gateway,"Review how Azure Private Endpoints can be used when deploying Azure Local, with an enterprise proxy and with an Arc gateway.","This article describes the scenario where Azure Local is deployed with both an enterprise proxy and an Arc gateway and private endpoints are used. Currently, Azure Local offers the following distinct methods for outbound connectivity: Deploy Azure Local without an enterprise proxy and without an Arc gateway. Deploy Azure Local with an enterprise proxy but without an Arc gateway. Deploy Azure Local without an enterprise proxy but with an Arc gateway. Deploy Azure Local with both an enterprise pro",2026-03-11T22:07:00.000Z,concept-article,configuration,0.75,True,"Covers deployment scenarios combining enterprise proxy, Arc gateway, and private endpoints; product-specific configuration matrix for outbound connectivity.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-via-portal?view=azloc-2603,6A. 
Deploy via Azure portal,Deploy an Azure Local instance using the Azure portal - Azure Local,,Learn how to deploy an Azure Local instance from the Azure portal,This article helps you deploy an Azure Local instance using the Azure portal.,2026-03-23T08:00:00.000Z,how-to,,0.3,False,"High-level deployment walkthrough via Azure portal; typically step-by-step UI guidance without detailed configuration matrices, limits, or product-specific parameters beyond what an LLM would already know.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-arc-register-server-permissions?view=azloc-2603,4. Set up subscription permissions,Register your Azure Local machines with Azure Arc and assign permissions for deployment - Azure Local,Assign Azure Arc registration and deployment permissions for Azure Local,Learn how to register your Azure Local machines with Azure Arc and assign permissions for deployment.,Applies to: Hyperconverged deployments of Azure Local This article describes how to set up the required permissions on your subscription to deploy Azure Local.,2025-12-22T23:06:00.000Z,how-to,security,0.7,True,"Focuses on setting up required permissions on the subscription; likely lists specific roles, scopes, and identity settings, which are product-specific security configuration.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-azure-arc-gateway-overview?view=azloc-2603,Review Azure Arc gateway for Azure Local,Overview of Azure Arc gateway for Azure Local - Azure Local,Deploy and manage Azure Arc gateway for Azure Local,Learn what is Azure Arc gateway for Azure Local.,"This article provides an overview of the Azure Arc gateway for Azure Local (formerly known as Azure Stack HCI). You can enable the Arc gateway on new deployments of Azure Local running software version 2506 and later. This article also describes how to create and delete the Arc gateway resource in Azure. 
Use the Arc gateway to significantly reduce the number of required endpoints needed to deploy and manage Azure Local instances. When you create the Arc gateway, connect to and use it for new dep",2025-12-22T23:06:00.000Z,how-to,deployment,0.75,True,"Explains enabling Arc gateway, version requirements, and how it reduces required endpoints; product-specific deployment configuration and constraints.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-azure-resource-manager-template?view=azloc-2603,6B. Deploy via ARM template,"Azure Resource Manager template deployment for Azure Local, version 23H2 - Azure Local",Deploy Azure Local using Azure Resource Manager templates,"Learn how to prepare and then deploy Azure Local instance, version 23H2 using the Azure Resource Manager template.",This article details how to use an Azure Resource Manager (ARM) template in the Azure portal to deploy an Azure Local in your environment. The article also contains the prerequisites and the preparation steps required to begin the deployment. Important ARM template deployment of Azure Local systems is targeted for deployments-at-scale. The intended audience for this deployment is IT administrators who have experience deploying Azure Local instances. We recommend that you deploy a system via the ,2025-12-22T23:06:00.000Z,how-to,deployment,0.7,True,"ARM template deployment article with prerequisites and preparation; typically includes template parameters, supported scenarios, and deployment constraints specific to this product.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-install-os?view=azloc-2603,3A. 
Install manually via ISO,"Install Azure Stack HCI operating system, version 23H2 using SConfig - Azure Local",,"Learn how to install the Azure Stack HCI operating system, version 23H2 on each machine of your system using SConfig.","Applies to: Hyperconverged deployments of Azure Local There are two distinct ways of installing the OS on your Azure Local machines. You can use theInstall Azure Stack HCIwizard and SConfig or you can install and register the OS using simplified machine provisioning (preview). This article describes the OS install using the wizard only. To use simplified machine provisioning, seeInstall and register the OS using simplified machine provisioning.",2026-02-26T18:04:00.000Z,how-to,,0.4,False,Step-by-step OS installation using SConfig; tutorial-style deployment steps rather than deployment constraints or config tables.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-introduction?view=azloc-2603,Read overview,"Azure Local, version 23H2 deployment overview - Azure Local",,"Learn about the deployment methods for Azure Local, version 23H2.",Applies to: Hyperconverged deployments of Azure Local This article is the first in the series of deployment articles that describe how to deploy Azure Local. This article applies to both single and multi-node deployments. 
The target audience for this article is IT administrators who are responsible for deploying Azure Local in their organization.,2026-02-26T18:04:00.000Z,overview,,0.3,False,Deployment overview article; likely conceptual and navigational without matrices or constraints.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-local-identity-with-key-vault-template?view=azloc-2603,Deploy via ARM template,Deploy Azure Local using local identity with Azure Key Vault via an Azure Resource Manager Template (preview) - Azure Local,Deploy Azure Local with local identity and Key Vault via ARM template,Learn how to prepare and then deploy Azure Local using local identity with Azure Key Vault using an Azure Resource Manager (ARM) template (preview).,"This article describes how to deploy Azure Local using local identity with Azure Key Vault by using an Azure Resource Manager (ARM) template configured for external DNS. The article also describes the prerequisites and the preparation steps required to begin the deployment. Important Use ARM template deployment for Azure Local systems at scale. This approach is intended for experienced IT administrators. Deploy asystem via the Azure portalfirst, then use the ARM template for subsequent deploymen",2026-02-26T18:04:00.000Z,how-to,deployment,0.7,True,"ARM template-based deployment with local identity and Key Vault; includes template parameters and deployment constraints, fitting deployment patterns.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-local-identity-with-key-vault?view=azloc-2603,Deploy via Azure portal,Deploy Azure Local Using Local Identity with Azure Key Vault - Azure Local,Use local identity and Key Vault for Azure Local deployment,Learn how to use local identity with Azure Key Vault for Azure Local deployment.,"This article describes how to use local identity with Azure Key Vault for Azure Local deployment. Important This feature is currently in PREVIEW. 
-See theSupplemental Terms of Use for Microsoft Azure Previewsfor legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-04-16T08:00:00.000Z,how-to,security,0.64,True,"The page covers using local identity with Azure Key Vault specifically for Azure Local deployment, which implies product-specific identity and secret-access configuration. This is security-focused (identity + Key Vault) and contains expert configuration details unique to this scenario rather than generic deployment or conceptual content.",updated -https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-prep-active-directory?view=azloc-2603,1. Prepare Active Directory,"Prepare Active Directory for Azure Local, version 23H2 deployment - Azure Local",Prepare Active Directory for Azure Local deployment,"Learn how to prepare Active Directory before you deploy Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local This article describes how to prepare your Active Directory environment before you deploy Azure Local. Active Directory requirements for Azure Local include: Note To manually assign the required permissions for Active Directory, create an OU, and block GPO inheritance, seeCustom Active Directory configuration for your Azure Local.",2026-03-26T17:03:00.000Z,how-to,security,0.7,True,"Preparation of AD for Azure Local typically includes product-specific OU structure, required permissions, and GPO configuration details (for example, which rights the deployment account needs and how to block inheritance). 
These are concrete, service-specific security/identity configuration steps rather than generic AD concepts.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-prerequisites?view=azloc-2603,Complete prerequisites,"Prerequisites to deploy Azure Local, version 23H2 - Azure Local","Review security, software, hardware, and network prerequisites for Azure Local deployment","Learn about the prerequisites to deploy Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local This article discusses the security, software, hardware, and networking prerequisites, and the deployment checklist in order to deploy Azure Local instance.",2025-12-22T23:06:00.000Z,install-set-up-deploy,configuration,0.65,True,"Prerequisites and checklist typically enumerate specific OS versions, ports, hardware specs, and network requirements, which are product-specific configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-virtual?view=azloc-2603,Virtual deployment,"Deploy a virtual Azure Local, version 23H2 and 24H2 system - Azure Local",Deploy a virtualized Azure Local instance,"Describes how to perform an Azure Local, version 23H2 virtualized deployment.","Applies to: Hyperconverged deployments of Azure Local This article describes how to deploy a virtualized Azure Local (formerly Azure Stack HCI) instance on a host system running Windows Server 2022, Windows 11, or later operating system (OS). The host must have Hyper-V enabled for the deployment. You need administrator privileges for the Azure Local virtual deployment and should be familiar with the existing Azure Local solution. The deployment can take around 2.5 hours to complete. 
Important A ",2026-01-12T23:03:00.000Z,how-to,deployment,0.7,True,"Describes how to deploy a virtual Azure Local system, including host OS requirements, Hyper-V usage, and deployment duration; product-specific deployment procedure and constraints.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-with-azure-arc-gateway?view=azloc-2603,With Arc gateway,Register Azure Local with Azure Arc using Arc Gateway - Azure Local,Configure Azure Local registration via Arc gateway and proxy,Learn how to register Azure Local using Azure Arc gateway Arc proxy. Both scenarios with and without proxy are configured.,"This article details how to register Azure Local using Azure Arc gateway and with the proxy configuration enabled. Once you create an Arc gateway resource in your Azure subscription, you can enable the Arc gateway features. For an overview of the Arc gateway, seeAbout Azure Arc gateway for Azure Local. Configure proxy with a script: Using this method, you can configure Arc proxy with a script. This method is useful as you don't need to configure the Arc proxy across WinInet, WinHttp, or environm",2026-04-10T08:00:00.000Z,how-to,configuration,0.68,True,"The article describes concrete, product-specific configuration steps for registering Azure Local with Azure Arc using an Arc gateway, including proxy configuration via script and environment-specific settings. This goes beyond generic deployment guidance and focuses on how to configure the Arc proxy and gateway for this product, which is expert operational knowledge. It does not emphasize limits, troubleshooting, or architecture trade-offs.",updated -https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-without-azure-arc-gateway?view=azloc-2603,Without Arc gateway,Register Azure Local with Azure Arc without using Arc Gateway - Azure Local,Configure Azure Local Arc registration without gateway,Learn how to register Azure Local with Azure Arc with and without proxy setup. 
The proxy configuration can be done via an Arc script or via the Configurator app on Azure Local.,"This article details how to register Azure Local machines with Azure Arc without using an Arc gateway and with proxy configuration enabled. The proxy configuration can be done via an Arc script or via the Configurator app for Azure Local. Configure with a script: You can use an Arc script to configure registration settings. Set up via the Configurator app (Preview): Using this method, you can configure Azure Local registration via a user interface. This method is useful if you prefer not to use ",2026-04-09T08:00:00.000Z,how-to,configuration,0.68,True,"The article describes concrete, product-specific registration and proxy configuration for Azure Local with Azure Arc, including use of an Arc script and the Configurator app. It focuses on how to set registration settings and proxy options rather than generic concepts, which aligns best with configuration. It does not emphasize deployment matrices, limits, or troubleshooting error codes.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/download-23h2-software?view=azloc-2603,2. Download the software,"Download Azure Stack HCI Operating System, version 23H2 software for Azure Local deployment - Azure Local",,"Learn how to download Azure Local, version 23H2 software from the Azure portal to deploy an Azure Local instance.","Applies to: Hyperconverged deployments of Azure Local This article describes how to download the operating system (OS) software from the Azure portal to deploy an Azure Local instance. The first step in deploying Azure Local is to download the OS from the Azure portal. The software download includes a free 60-day trial. However, if you've purchased Integrated System solution hardware from theAzure Local Catalogthrough your preferred Microsoft hardware partner, the OS should be preinstalled. 
In t",2025-12-22T23:06:00.000Z,how-to,,0.4,False,How to download OS software and notes about trial; mostly procedural without configuration matrices or limits.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/enable-sdn-integration?view=azloc-2603,Enable SDN integration,Enable Software-Defined Networking (SDN) enabled by Azure Arc on Azure Local using a PowerShell Action Plan - Azure Local,Enable SDN integration on Azure Local using PowerShell action plans,Describes how to enable integration of SDN enabled by Azure Arc using a PowerShell action plan on Azure Local.,This article describes how to enable and integrate software defined networking (SDN) on your existing Azure Local instance. You use a PowerShell action plan to enable SDN.,2026-03-02T18:16:00.000Z,how-to,configuration,0.7,True,"Describes enabling SDN via a PowerShell action plan; likely includes specific cmdlets, parameters, and configuration steps unique to Azure Local SDN.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-deploy-portal?view=azloc-2603,Via Azure portal,Deploy rack aware cluster using the Azure portal - Azure Local,Deploy Azure Local rack aware clusters via Azure portal,"Learn how to deploy a rack aware cluster via the Azure portal with step-by-step guidance, including configuration, networking, and validation processes.",This article describes the steps to deploy Azure Local rack aware clusters using the Azure portal.,2026-01-22T23:03:00.000Z,how-to,deployment,0.7,True,Step-by-step deployment for a specific platform; likely includes Azure Local-specific deployment constraints and supported options in the portal.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-deploy-prep?view=azloc-2603,Prepare to deploy,Prepare to deploy rack aware cluster via the Azure portal - Azure Local,Prepare network and hardware for rack aware cluster deployment,Learn how to deploy Azure Local rack aware 
clusters with high resiliency using ToR switches and VLAN isolation for optimal network configurations.,"This article describes the preparation steps to deploy Azure Local rack aware clusters. It includes network design recommendations, machine configuration guidelines, and best practices for deployment.",2026-01-22T23:03:00.000Z,how-to,best-practices,0.7,True,"Includes network design recommendations, machine configuration guidelines, and best practices; these are product-specific deployment recommendations.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-deployment-via-template?view=azloc-2603,Via ARM template,Azure Resource Manager template deployment for Azure Local rack aware cluster - Azure Local,Deploy rack aware clusters using ARM templates at scale,Learn how to prepare and then deploy Azure Local rack aware cluster using the Azure Resource Manager template.,"This article describes how to use an Azure Resource Manager (ARM) template in the Azure portal to deploy a rack aware cluster. Important ARM template deployment of rack aware cluster is targeted for deployments-at-scale. The intended audience for this deployment is IT administrators who have experience deploying rack aware clusters. 
We recommend that you deploy a system via the Azure portal first, and then perform subsequent deployments via the ARM template.",2026-01-22T23:03:00.000Z,how-to,deployment,0.75,True,"ARM template deployment for rack aware clusters is a product-specific deployment method, including template parameters and constraints for scale deployments.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-post-deployment?view=azloc-2603,Complete post-deployment tasks,Perform post deployment tasks on rack aware clusters - Azure Local,Perform post-deployment tasks for rack aware clusters,Learn about the post deployment tasks that you need to perform on your newly deployed rack aware cluster.,"After deploying rack aware cluster, either through theAzure portalor using theAzure Resource Manager deployment template, you need to complete a set of post-deployment tasks. This article describes the typical tasks required once your rack aware cluster is successfully deployed and all machines are up and running.",2026-01-22T23:03:00.000Z,how-to,deployment,0.65,True,Post-deployment tasks are specific operational steps required after deploying rack aware clusters; these are product-specific deployment/ops procedures.,unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-readiness-check?view=azloc-2603,Assess network readiness via LLDP,Use LLDP validator to assess deployment readiness for Azure Local rack aware cluster - Azure Local,Validate rack aware cluster readiness with LLDP tool,How to use the LLDP validator to assess if your environment is ready for deploying Azure Local rack aware cluster.,This article describes how to use the Link Layer Discovery Protocol (LLDP) validator in a standalone mode to assess how ready your environment is for deploying rack aware cluster.,2026-01-22T23:03:00.000Z,how-to,troubleshooting,0.7,True,"Using LLDP validator to assess readiness is diagnostic; likely includes specific commands, 
outputs, and interpretations for deployment issues.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/sdn-express-23h2?view=azloc-2603,Using Express scripts,"Deploy infrastructure for Software Defined Networking managed by on-premises tools using SDN Express for Azure Local, version 23H2 - Azure Local",,"Learn how to deploy infrastructure for Software Defined Networking managed by on-premises tools using SDN Express for Azure Local, version 23h2.","Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 In this article, you deploy an end-to-end Software Defined Network (SDN) infrastructure for Azure Local using SDN Express PowerShell scripts. The infrastructure includes a highly available (HA) Network Controller (NC), and optionally, a highly available Software Load Balancer (SLB), and a highly available Gateway (GW). The scripts support a phased deployment, where you can deploy just the Net",2025-12-22T23:06:00.000Z,how-to,,0.4,False,"Primarily a deployment walkthrough using SDN Express scripts; no clear configuration matrices, limits, or product-specific troubleshooting/error-code mapping visible from the summary.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/sdn-wizard-23h2?view=azloc-2603,Using Windows Admin Center,Deploy Software Defined Networking managed by on-premises tools with Windows Admin Center for Azure Local - Azure Local,,Learn how to deploy infrastructure for Software Defined Networking managed by on-premises tools with Windows Admin Center for Azure Local,"Applies to: Hyperconverged deployments of Azure Local This article describes how to deploy Software Defined Networking (SDN) managed by on-premises tools through Windows Admin Center after you deployed your Azure Local via the Azure portal. 
Windows Admin Center enables you to deploy all the SDN infrastructure components on your existing Azure Local, in the following deployment order: Alternatively, you can deploy the entire SDN infrastructure through theSDN Expressscripts. You can also deploy an",2025-12-22T23:06:00.000Z,how-to,,0.35,False,"Step-by-step deployment via Windows Admin Center; appears procedural without detailed config tables, limits, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/simplified-machine-provisioning?view=azloc-2603,3B. Install and register via simplified machine provisioning,Install and Register Azure Local Machines using Simplified Machine Provisioning (preview) - Azure Local,,Install and register Azure Local machines using simplified machine provisioning (preview).,"Applies to: Hyperconverged deployments of Azure Local This article describes how to use simplified machine provisioning to set up machines for an Azure Local instance. You can install the OS on your Azure Local machines in two distinct ways: you can manually install the OS using ISO images, or you can use the simplified machine provisioning process. This article covers only the installation and registration process by using simplified machine provisioning, which is currently in preview. 
To insta",2026-02-26T18:04:00.000Z,how-to,,0.45,False,Describes simplified machine provisioning process; appears as a procedural tutorial without explicit configuration matrices or limits.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/sql-server-23h2?view=azloc-2603,Run SQL Server,Deploy SQL Server on Azure Local Version 23H2 - Azure Local,Deploy SQL Server workloads on Azure Local 23H2,"This article provides guidance on how to deploy SQL Server on Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local This article provides guidance on how to deploy SQL Server on Azure Local, version 23H2.",2026-02-04T23:07:00.000Z,how-to,deployment,0.65,True,Guidance on deploying SQL Server on Azure Local; likely includes product-specific deployment requirements and configuration patterns for this workload.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/deploy/troubleshoot-simplified-machine-provisioning?view=azloc-2603,Troubleshoot simplified machine provisioning,Troubleshoot Simplified Machine Provisioning for Azure Local (preview) - Azure Local,Troubleshoot simplified machine provisioning in Azure Local,Learn how to troubleshoot simplified machine provisioning for Azure Local (preview).,"Important This feature is currently in PREVIEW. -See theSupplemental Terms of Use for Microsoft Azure Previewsfor legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability. This article describes how to troubleshoot simplified machine provisioning. 
You can use the following methods to troubleshoot:",2026-04-02T17:05:00.000Z,how-to,troubleshooting,0.78,True,"Troubleshooting article for simplified machine provisioning; likely organized by specific provisioning failures, includes concrete error messages, logs, and remediation steps unique to Azure Local simplified provisioning.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/faq?view=azloc-2603,FAQ,Azure Local FAQ - Azure Local,,"The Azure Local FAQ provides information about deployment types, Azure connectivity, data handling, and supported services.","The Azure Local FAQ provides information about deployment types, Azure connectivity, data handling, and supported services.",2026-02-23T23:04:00Z,faq,,0.3,False,"FAQ about deployment, connectivity, and data handling; summary suggests general Q&A without detailed error codes, configs, or numeric limits.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/hybrid-capabilities-with-azure-services-23h2?view=azloc-2603,Hybrid capabilities with Azure services,"Hybrid capabilities with Azure services in Azure Local, version 23H2 - Azure Local",,"This article describes the cloud service components of Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local Your on-premises Azure Local solution integrates with Azure cloud using several cloud service components, such as Azure Local cloud service, Azure Arc, and other Azure hybrid services. 
This article describes the functionality provided by these cloud service components, and how they help provide hybrid capabilities to your Azure Local deployment.",2026-02-13T18:03:00.000Z,overview,,0.2,False,"High-level description of hybrid capabilities and cloud service components; no concrete limits, configs, or decision matrices.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/known-issues?view=azloc-2603,Known issues,Release notes with fixed and known issues in Azure Local - Azure Local,Resolve known issues in Azure Local releases,Read about the known issues and fixed issues in Azure Local.,"This article identifies critical known issues and their workarounds in Azure Local. These release notes are continuously updated, and as critical issues requiring a workaround are discovered, they're added. Before you deploy your Azure Local instance, carefully review the information contained here. Important For information about supported update paths for this release, see Release information. 
For more information about new features in this release, see What's new for Azure Local.",2026-03-17T22:09:00.000Z,troubleshooting-general,troubleshooting,0.7,True,"Release notes of critical known issues and workarounds; typically organized by symptom and workaround, providing product-specific troubleshooting guidance.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/license-billing?view=azloc-2603,FAQ,OEM license and billing FAQ for Azure Local - Azure Local,,The FAQ provides information on the OEM license and billing for Azure Local.,This FAQ provides information on the OEM license and billing for Azure Local.,2026-04-06T17:04:00Z,faq,,0.2,False,"OEM license and billing FAQ is commercial/administrative, not technical configuration, limits, or troubleshooting content.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/add-server?view=azloc-2603,Add a node,"Manage Capacity by Adding a Node on Azure Local, Version 23H2 - Azure Local",Scale Azure Local capacity by adding cluster nodes,"Learn how to manage capacity on your Azure Local, version 23H2 system by adding a node.","Applies to: Hyperconverged deployments of Azure Local This article describes how to manage capacity by adding a node (often called scale-out) to your Azure Local instance. 
In this article, each server is referred to as a node.",2025-12-23T23:03:00.000Z,how-to,deployment,0.65,True,Adding nodes to manage capacity is a scale-out deployment operation with specific steps and constraints for Azure Local 23H2.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/arc-extension-management?view=azloc-2603,Azure Arc extension management,Azure Arc extension management on Azure Local - Azure Local,Configure and manage Azure Arc extensions on Azure Local,This article describes how to manage Azure Arc extensions on Azure Local.,"Applies to: Azure Local 2311.2 and later This article explains how to install, upgrade, and manage Azure Arc extensions on Azure Local.",2025-12-23T23:03:00.000Z,how-to,configuration,0.7,True,"Covers installing, upgrading, and managing Arc extensions on Azure Local with product-specific extension configuration steps and parameters.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/assign-public-ip-to-vm?view=azloc-2603,Assign public IP address to a VM,Assign a public IP address to a virtual machine - Azure Local,,Learn how to assign a public IP address to a virtual machine in Software Defined Networking managed by on-premises tools.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This article describes how to use Windows Admin Center to assign a Software Defined Networking (SDN) public IP address to a virtual machine (VM) in Azure Local. 
By assigning a public IP address to a VM, you enable the VM to communicate with external networks, thereby enhancing its capabilities and extending its connectivity.",2025-12-22T23:06:00.000Z,how-to,,0.4,False,"Assigning a public IP via Windows Admin Center is a straightforward how-to; summary doesn’t indicate detailed config tables, limits, or troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/assign-vm-rbac-roles?view=azloc-2603,Assign RBAC role,Use built-in RBAC roles for Azure Local VM to manage Azure Local VMs enabled by Azure Arc - Azure Local,Assign built-in RBAC roles for Azure Local VMs,Learn how to use RBAC built-in roles to manage Azure Local VMs enabled by Azure Arc.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to use the Role-based Access Control (RBAC) to control access to Azure Local virtual machines (VMs) enabled by Azure Arc. You can use the built-in RBAC roles to control access to VMs and VM resources such as virtual disks, network interfaces, VM images, logical networks, and storage paths. You can assign these roles to users, groups, service principals, and managed identities.",2026-04-07T22:03:00.000Z,how-to,security,0.82,True,"The article explains how to use built-in RBAC roles for Azure Local VMs enabled by Azure Arc. 
This typically includes specific role names, scopes, and which permissions they grant for VM disks, NICs, images, etc., which matches the security category’s requirement for product-specific RBAC configuration details.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/attach-gpu-to-linux-vm?view=azloc-2603,Attach GPU to Linux VM,Attach a GPU to a Linux VM in Azure Local - Azure Local,Attach and configure GPUs for Linux VMs on Azure Local,How to use a GPU with AI workloads running in an Ubuntu Linux VM on Azure Local.,"Applies to: Azure Local 2311.2 and later Note The recommended way to create and manage VMs on Azure Local is using the Azure Arc control plane. However, since the functionality described in this article isn't yet provided by Azure Arc, you can use Windows Admin Center or PowerShell as described in this article. The VMs created this way aren't enabled by Azure Arc, have limited manageability from the Azure Arc control plane, and fewer Azure Hybrid Benefits, including usage of Azure Update Manager ",2025-12-23T23:03:00.000Z,how-to,configuration,0.7,True,"Describes attaching GPU to Ubuntu VM for AI workloads; likely includes driver versions, VM settings, and GPU configuration parameters specific to Azure Local.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-arc-vm-management-overview?view=azloc-2603,What is Azure Local VM management?,What is Azure Local VM management - Azure Local,,Learn about using Azure Local VM management to provision and manage on-premises Windows and Linux virtual machines (VMs) running on Azure Local.,"Applies to: Hyperconverged deployments of Azure Local This article provides an overview of virtual machine (VM) management in hyperconverged deployments of Azure Local (formerly Azure Stack HCI), including its benefits, components, and a high-level workflow. 
Azure Local VM management enables IT admins to provision and manage Windows and Linux VMs hosted in an on-premises Azure Local environment. IT admins can use the feature to create, modify, delete, and assign permissions and roles to app owne",2026-04-15T08:00:00.000Z,how-to,,0.2,False,"High-level overview of Azure Local VM management benefits, components, and workflow without specific limits, configuration tables, error codes, or decision matrices.",updated -https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-arc-vm-management-prerequisites?view=azloc-2603,Review prerequisites,Azure Local VM management prerequisites - Azure Local,Configure prerequisites for Azure Local Arc-enabled VMs,Learn about the prerequisites for deploying Azure Local VMs enabled by Azure Arc.,Applies to: Hyperconverged deployments of Azure Local This article lists the requirements and prerequisites for Azure Local virtual machines (VMs) enabled by Azure Arc. We recommend that you review the requirements and complete the prerequisites before you manage your Azure Local VMs.,2026-03-02T18:16:00.000Z,how-to,configuration,0.6,True,"Prerequisites article typically lists required versions, settings, and environment conditions specific to Azure Local Arc-enabled VMs, functioning as configuration requirements.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-arc-vms-faq?view=azloc-2603,FAQs,Azure Local VMs Enabled by Azure Arc FAQ - Azure Local,,This article provides answers to questions related to Azure Local VM management.,Frequently asked questions about Azure Local VMs enabled by Azure Arc for version 2311.2 and later.,2025-12-23T23:03:00Z,faq,,0.3,False,"FAQ content is usually high-level Q&A; summary does not mention error codes, config tables, or numeric limits.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-benefits-esu?view=azloc-2603,Extended Security Updates (ESUs),Extended Security Updates (ESU) on Azure Local - Azure 
Local,Use Extended Security Updates on Azure Local VMs,Learn how to get free extended security updates (ESUs) with Azure VM verification on Azure Local.,"Applies to: Azure Local 2311.2 and later The Extended Security Update (ESU) program enables you to get important security patches for legacy Microsoft products that are past the end of support. Getting ESU through Azure Local comes with additional benefits and implementation steps. This article explains the specifics for Azure Local. To get general information about the ESU program, products that are covered, and support dates, see the Product Lifecycle FAQ. For detailed steps to set up legacy OS",2025-12-23T23:03:00.000Z,overview,decision-making,0.65,True,"Explains specifics of ESU on Azure Local, including benefits and implementation steps; likely includes which products/versions qualify and how Azure verification affects ESU, guiding ESU usage decisions.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-enhanced-management-managed-identity?view=azloc-2603,Enhanced management from Azure,Enhanced management of Azure Local from Azure - Azure Local,Configure managed identity for enhanced Azure Local management,Learn how to use enhanced Azure management for Azure Local. 
This enhanced management is enabled via Managed Identity created for your Azure Local.,"Applies to: Azure Local 2311.2 and later This guide describes the feature in the May 2023 cumulative update for Azure Local, version 22H2, that enables enhanced management from Azure.",2026-01-28T23:03:00.000Z,concept-article,security,0.7,True,"Describes enabling enhanced management via a specific managed identity for Azure Local, including Azure-side configuration details that are product-specific security settings.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-site-recovery?view=azloc-2603,Use Azure Site Recovery,Protect your Hyper-V Virtual Machine workloads on Azure Local with Azure Site Recovery (preview) - Azure Local,,Use Azure Site Recovery to protect Hyper-V VM workloads running on Azure Local. (preview),"Applies to: Azure Local 2311.2 and later This guide describes how to protect Windows and Linux VM workloads running on your Azure Local if there's a disaster. You can use Azure Site Recovery to replicate your on-premises Azure Local virtual machines (VMs) into Azure and protect your business critical workloads. This feature is enabled on Azure Local running the May 2023 cumulative update of version 22H2 and later. Important This feature is currently in PREVIEW.
-See the Supplemental Terms of Use f",2026-04-05T08:00:00.000Z,how-to,,0.4,False,"Described as a guide to protect Azure Local Hyper-V VMs with Azure Site Recovery. 
From the summary it appears to be a scenario/tutorial guide; there’s no explicit indication of detailed limits, configuration matrices, or error-code-based troubleshooting that would qualify as expert knowledge under the defined sub-skill types.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/collect-log-files-arc-enabled-vms?view=azloc-2603,Collect log files for Azure Local VM,Collect log files for Azure Local VMs enabled by Azure Arc - Azure Local,Collect diagnostic logs for Azure Local Arc-enabled VMs,Learn how to collect log files for an Azure Local VM enabled by Azure Arc.,Applies to: Hyperconverged deployments of Azure Local Collect logs and other files to identify and troubleshoot issues with Azure Local virtual machines (VMs) enabled by Azure Arc.,2025-12-23T23:03:00.000Z,how-to,troubleshooting,0.7,True,"Log collection article is part of troubleshooting; likely specifies exact log locations, commands, and files needed to diagnose Azure Local VM issues.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/collect-logs?view=azloc-2603,Collect logs,Collect diagnostic logs for Azure Local - Azure Local,Collect and upload Azure Local diagnostic logs to Microsoft,Learn how to collect diagnostic logs and share them with Microsoft.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to collect diagnostic logs for Azure Local and send them to Microsoft via the Azure portal or PowerShell. These diagnostic logs help identify and fix any issues with your Azure Local solution. Important This feature is currently in PREVIEW. 
-See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general av",2026-02-11T18:03:00.000Z,how-to,configuration,0.6,True,"Describes Azure Local–specific log collection commands and portal flows, including what logs are gathered and how they’re packaged for support.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/configure-network-security-groups-with-tags?view=azloc-2603,Configure network security groups with tags,Configure network security groups with tags in Windows Admin Center - Azure Local,Use tags with SDN network security groups in WAC,Learn how to configure network security groups with tags in Windows Admin Center.,"Applies to: Azure Local 2311.2 and later Applies to: Windows Server 2025 This article describes how to configure network security groups with network security tags in Windows Admin Center. With network security tags, you can create custom user-defined tags, attach those tags to your virtual machine (VM) network interfaces, and apply network access policies (with network security groups) based on these tags.",2025-12-22T23:06:00.000Z,how-to,security,0.76,True,Describes network security tags and how to apply NSG policies based on tags; includes product-specific tag behavior and configuration steps for Azure Local.,unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/configure-proxy-settings-23h2?view=azloc-2603,Configure proxy,"Configure proxy settings for Azure Local, version 23H2 - Azure Local",Configure proxy settings for Azure Local 23H2 environments,"Learn how to configure proxy settings for Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local Important Since the release of Azure Local 2506, it is not required to configure the proxy settings manually on your Azure Local Machines. 
Azure Local machines proxy configuration is done automatically during Arc registration. Make sure you follow the guidance for proxy environment deployments based on your registration method: Register Azure Local with Arc using proxy. Register Azure Local with Arc using proxy and Arc gateway. This article de",2025-12-23T23:03:00.000Z,how-to,configuration,0.7,True,Proxy configuration article includes guidance for different registration methods and version-specific behavior; involves concrete network/proxy settings.,unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/configure-software-load-balancer?view=azloc-2603,Configure load balancer for high availability ports,Configure Software Load Balancer for high availability ports - Azure Local,Configure SLB high availability ports and understand limits,Learn how to configure Software Load Balancer for high availability ports.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019 This article provides an overview of high availability ports rules and their purpose. 
It also describes the prerequisites for setting up such rules, the configuration steps, supported configurations, and associated limitations.",2025-12-22T23:06:00.000Z,how-to,limits-quotas,0.64,True,"Article explicitly mentions supported configurations and associated limitations for high availability ports; these are typically expressed as concrete constraints (supported scenarios, protocol/port rules) that qualify as limits/quotas.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/connect-arc-vm-using-ssh?view=azloc-2603,Connect to Azure Local VMs,Connect to an Azure Local Virtual Machine using SSH or Remote Desktop Protocol over SSH or VM Connect (Preview) - Azure Local,,Learn how to use SSH or RDP over SSH or VM Connect feature (preview) to connect to an Azure Local VM enabled by Azure Arc.,Applies to: Hyperconverged deployments of Azure Local This article describes how to connect to an Azure Local VM in two scenarios:,2026-01-23T18:03:00.000Z,how-to,,0.45,False,"Connection methods (SSH, RDP over SSH, VM Connect) are procedural; summary doesn’t indicate detailed configuration tables or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/create-arc-virtual-machines?view=azloc-2603,Run Azure Local VMs,Create Azure Local Virtual Machines Enabled by Azure Arc - Azure Local,,Learn how to view your Azure Local instance in the Azure portal and create Azure Local VMs enabled by Azure Arc.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to create Azure Local virtual machines (VMs) enabled by Azure Arc, starting with the VM images that you created on your Azure Local instance. You can create Azure Local VMs by using the Azure CLI, the Azure portal, or an Azure Resource Manager template (ARM template).",2026-04-07T08:00:00.000Z,how-to,,0.3,False,"The page is primarily a how-to guide for creating Azure Local VMs enabled by Azure Arc using CLI, portal, or ARM templates. 
From the summary, it appears to be procedural tutorial content without detailed configuration parameter tables, limits, or troubleshooting mappings. It likely reuses standard Azure VM creation patterns that an LLM already knows, so it does not clearly meet the expert-knowledge criteria for any sub-skill type.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/create-logical-networks?view=azloc-2603,3. Create logical networks,Create logical networks for Azure Local virtual machines enabled by Azure Arc - Azure Local,Configure logical networks for Azure Local Arc VMs,Learn how to create logical networks on Azure Local. The Azure Local VMs enabled by Azure Arc running on your system use this logical network.,Applies to: Hyperconverged deployments of Azure Local This article describes how to create or add logical networks for application workloads running on your Azure Local instance. Any Azure Local virtual machines (VMs) that you create use these logical networks. A logical network is a logical representation of a physical network where Azure Local VMs can be provisioned. It defines how VM network interfaces connect to the underlying network. Logical networks allow you to configure networking sett,2026-01-22T23:03:00.000Z,how-to,configuration,0.65,True,"Logical network creation defines networking settings for VMs; such articles typically include specific configuration parameters (subnets, VLANs, address spaces) that are product-specific.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/create-network-interfaces?view=azloc-2603,4. Create network interfaces,Create network interfaces for virtual machines on Azure Local - Azure Local,,Learn how to create network interfaces on an existing logical network associated with your Azure Local. 
The Azure Local VM enabled by Azure Arc uses these network interfaces.,Applies to: Hyperconverged deployments of Azure Local This article describes how to create network interfaces that you can associate with an Azure Local virtual machine (VM). You can create network interfaces using the Azure portal or Azure Command-Line Interface (CLI).,2026-04-07T17:03:00.000Z,how-to,,0.35,False,"Describes creating network interfaces for Azure Local VMs via portal or CLI. The summary suggests a step-by-step creation guide rather than detailed configuration matrices, limits, or troubleshooting content, so it doesn’t clearly meet any expert-knowledge sub-skill criteria.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/create-network-security-groups?view=azloc-2603,Create Network Security Groups,"Create network security groups, network security rules, default network access policies on Azure Local VMs - Azure Local","Create NSGs, rules, and default network access policies for Azure Local VMs","Learn how to create network security groups, network security rules, and default network access policies on Azure Local VMs using the Azure CLI or the Azure portal.",This article explains how to create and configure network security groups (NSGs) to manage data traffic flow after you enable network controller on your Azure Local.,2025-12-22T23:06:00.000Z,how-to,security,0.75,True,Covers creation and configuration of NSGs and rules; includes product-specific security configuration parameters and possibly defaults.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/create-storage-path?view=azloc-2603,1. Create a storage path,Create storage path for Azure Local virtual machines images - Azure Local,,Learn how to create storage path for use with VM images for your Azure Local instance.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to create storage path for VM images used on your Azure Local instance. 
Storage paths are an Azure resource and are used to provide a path to store VM configuration files, VM image, and VHDs on your system. You can create a storage path using the Azure CLI or Azure portal.",2026-04-08T22:04:00.000Z,how-to,,0.4,False,"Focuses on how to create a storage path for VM images using CLI or portal. From the summary it appears to be a procedural how-to without explicit mention of configuration parameter tables, limits, or specialized patterns; likely standard tutorial-style guidance.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/disaster-recovery-infrastructure-resiliency?view=azloc-2603,Infrastructure resiliency,Infrastructure Resiliency for Azure Local - Azure Local,Design resilient infrastructure for Azure Local deployments,Infrastructure resiliency considerations for Azure Local.,"This article explores the key infrastructure elements that contribute to a resilient Azure Local deployment and how they support continuity in the face of hardware faults, network outages, and site-level disasters. Infrastructure resiliency is the foundation of a robust disaster recovery strategy for Azure Local deployments. Before virtual machines (VMs) and applications can be protected, the underlying physical and logical infrastructure must be designed to withstand failures and disruptions. 
T",2026-01-28T23:03:00.000Z,concept-article,architecture-patterns,0.65,True,"Infrastructure resiliency article discusses key elements and design considerations specific to Azure Local, constituting product-specific architecture patterns.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/disaster-recovery-overview?view=azloc-2603,Overview,Disaster Recovery for Azure Local Virtual Machines - Azure Local,Plan disaster recovery architecture for Azure Local VMs,Disaster recovery considerations for Azure Local virtual machines.,"Disaster recovery planning is strategic for any organization using IT infrastructure, and its importance is magnified in hybrid environments that use Azure Local. Ensuring business systems and services are resilient and can recover from disruptions, ranging from localized hardware failures to site-wide disasters, is vital for maintaining business operations, safeguarding data, and preserving trust. For Azure Local instance deployments that blend edge infrastructure with Azure cloud services a ca",2026-01-28T23:03:00.000Z,concept-article,architecture-patterns,0.6,True,Disaster recovery considerations for Azure Local VMs likely include product-specific DR patterns and when to use them in hybrid/edge scenarios.,unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/disaster-recovery-vm-resiliency?view=azloc-2603,Overview,Virtual Machine Resiliency for Azure Local - Azure Local,Design resilient virtual machines on Azure Local,Virtual machine resiliency considerations for Azure Local.,"After reviewing and implementing the design considerations for infrastructure resiliency at the platform level, it's essential to understand how your virtual machines (VMs) and applications are resilient to failures. This understanding helps you enable them to detect, withstand, and recover from failures within an acceptable time period. Resiliency is fundamental to maintaining continuous operations for business-critical applications. 
By default, all virtual machines (VMs) on Azure Local are desig",2026-03-11T22:07:00.000Z,reliability-article,architecture-patterns,0.65,True,"VM resiliency considerations describe how to design VMs and apps to withstand failures on Azure Local, which are product-specific resiliency patterns.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/disaster-recovery-workloads-resiliency?view=azloc-2603,Workloads resiliency,Workloads Resiliency for Azure Local - Azure Local,,Disaster recovery workloads for Azure Local.,"Disaster recovery for workloads running on Azure Local virtual machines (VMs) requires a layered approach that aligns infrastructure-level protections with application-specific continuity strategies. Whether you're hosting SQL Server databases or delivering virtual desktops through Azure Virtual Desktop, each workload type has unique recovery requirements and dependencies. Azure Local supports a wide range of disaster recovery technologies allowing you to tailor recovery plans to meet business c",2026-01-28T23:03:00.000Z,concept-article,,0.4,False,"Disaster recovery and resiliency guidance for workloads on Azure Local VMs is architectural and conceptual; summary does not indicate concrete limits, configs, or product-specific error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-acquire?view=azloc-2603,Acquire disconnected operations,Acquire Disconnected Operations for Azure Local - Azure Local,Acquire and set up disconnected operations appliance for Azure Local,Learn how to set up disconnected operations for Azure Local. Create a resource in the Azure portal and download the necessary files.,"This article explains how to acquire disconnected operations for Azure Local. 
Learn how to create a virtual appliance resource in the Azure portal, download installation files, and get support from Microsoft for your deployment.",2026-03-04T18:05:00.000Z,how-to,deployment,0.65,True,"Explains how to create the virtual appliance resource and download installation files. This is part of a product-specific deployment flow for disconnected operations, including prerequisites and platform-specific steps.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-arc-vm?view=azloc-2603,Azure Local VMs,Disconnected operations with Azure Local VMs enabled by Azure Arc - Azure Local,Manage Azure Local VMs in disconnected mode with Azure Arc,Learn how to manage Azure Local VMs running disconnected.,"This article provides a brief overview of management features for Azure Local virtual machine (VM) for disconnected operations. It covers the benefits, components, and high-level workflow. This feature closely mirrors Azure Local VM capabilities and references many Azure Local VM articles for connected operations. You learn about key differences and limitations of disconnected operations.",2026-03-09T22:15:00.000Z,concept-article,configuration,0.6,True,"Describes management features, differences, and limitations for VMs in disconnected operations. Likely includes product-specific configuration differences and constraints compared to connected mode.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-azure-container-registry?view=azloc-2603,Azure Container Registry,Deploy Azure Container Registry with Disconnected Operations for Azure Local - Azure Local,Deploy Azure Container Registry on Azure Local disconnected operations,Learn how to deploy and manage Azure Container Registry with disconnected operations for Azure Local.,"This article explains how to deploy and manage Azure Container Registry on disconnected operations running on Azure Local. 
It provides an overview of the service, prerequisites, deployment steps, and how to manage images in the registry.",2026-02-23T23:04:00.000Z,how-to,deployment,0.7,True,"Explains prerequisites, deployment steps, and image management for ACR in disconnected Azure Local. This is a product-specific deployment/integration pattern with concrete steps and constraints.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-back-up-restore?view=azloc-2603,Back up disconnected operations,Backup for Disconnected Operations for Azure Local - Azure Local,Configure backup parameters for disconnected Azure Local,Learn how to back up Azure Local environments running disconnected. Configure parameters and trigger backups.,"This article explains the backup process for disconnected operations for Azure Local environments. It provides practical steps to trigger a backup and parameter configurations to customize it. Operators need access to the Operator subscription and role-based access control (RBAC) permissions. For more information, see Disconnected operations for Azure Local. Important The restore feature is currently in development. Documentation for the restore process will be available once the feature is stable",2026-03-27T22:03:00.000Z,concept-article,configuration,0.7,True,"The page describes how to configure and trigger backups for disconnected Azure Local environments, including parameter configurations. This is product-specific operational configuration detail (parameters and how they’re used) that goes beyond generic backup concepts, fitting the configuration sub-skill. 
It is not just a tutorial; it focuses on parameterized backup behavior.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-billing?view=azloc-2603,Billing,Billing for Disconnected Operations for Azure Local - Azure Local,Understand billing and capacity pricing for disconnected Azure Local,Learn how disconnected operations for Azure Local are billed.,This article explains billing for disconnected operations for Azure Local. The article also covers the capacity-based pricing model that uses physical processor cores and the licensing requirements for Windows Server virtual machines (VMs).,2026-03-26T22:08:00.000Z,concept-article,decision-making,0.65,True,"The page explains billing for disconnected Azure Local, including a capacity-based pricing model using physical processor cores and Windows Server VM licensing requirements. This is specialized guidance to understand cost and licensing implications for a specific operating mode, helping users make deployment and capacity decisions, which aligns with decision-making.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-cli?view=azloc-2603,Azure CLI,Use Azure CLI for Disconnected Operations for Azure Local - Azure Local,Configure Azure CLI for Azure Local disconnected operations,"Learn how to configure Azure CLI for disconnected operations on Azure Local, including cloud setup, certificate trust, and extension installation.","This article explains how to install and configure the Azure Command-Line Interface (CLI) and its extensions for disconnected operations for Azure Local. It provides an overview of CLI, supported versions, installation steps, and how to set up the CLI for disconnected operations.",2026-02-23T23:04:00.000Z,how-to,configuration,0.75,True,"Explains supported CLI versions, extensions, cloud setup, and certificate trust for disconnected operations. 
This involves specific configuration parameters, extension names, and environment settings unique to this product.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-deploy?view=azloc-2603,Deploy disconnected operations,Deploy Disconnected Operations for Azure Local - Azure Local,Deploy disconnected Azure Local instances in datacenters,Learn how to deploy disconnected operations for Azure Local in your datacenter.,"This article explains how to deploy disconnected operations for Azure Local in your datacenter. This step is key to deploying and operating Azure Local without any outbound network connection. After you deploy the management cluster (control plane), you deploy your first Azure Local instance.",2026-04-09T08:00:00.000Z,how-to,deployment,0.65,True,"A deployment-focused article for Azure Local in fully disconnected environments is likely to include product-specific deployment requirements, sequencing constraints (management cluster then first instance), and possibly environment prerequisites unique to disconnected operations. These are implementation details an LLM wouldn't reliably know from training, fitting the deployment sub-skill better than generic configuration.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-fallback?view=azloc-2603,Fallback log collection,Appliance Fallback Log Collection for Disconnected Operations with Azure Local VMs Enabled by Azure Arc - Azure Local,Use appliance fallback log collection for Azure Local disconnected VMs,Export and send logs for disconnected operations with Azure Local VMs enabled by Azure Arc.,This article explains how to use appliance fallback logging to export and send logs to Microsoft when Azure Local VMs operate in disconnected mode. 
This process helps you troubleshoot issues when standard log collection isn't available.,2026-02-23T23:04:00.000Z,how-to,troubleshooting,0.7,True,Describes fallback logging to export and send logs when standard collection fails. This is a troubleshooting pattern specific to Azure Local VMs in disconnected mode.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-identity?view=azloc-2603,Identity,Plan your Identity for Disconnected Operations on Azure Local - Azure Local,Plan identity architecture for Azure Local disconnected operations,Plan and integrate your identity on disconnected operations for Azure Local.,"This article explains how to plan and integrate your identity for disconnected operations on Azure Local. Learn how to set up your identity solution, and understand the actions and roles available to operators.",2026-02-23T23:04:00.000Z,concept-article,security,0.7,True,"Explains how to plan and integrate identity, including actions and roles for operators. Likely includes product-specific identity/role mappings and configuration guidance, fitting security (identity and access management).",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-known-issues?view=azloc-2603,Known issues,Release Notes for Disconnected Operations for Azure Local - Azure Local,Known issues and workarounds for disconnected operations,Read about the known issues and fixed issues for disconnected operations for Azure Local.,This article identifies critical known issues and their workarounds in disconnected operations for Azure Local. These release notes are updated continuously to include critical issues and required workarounds. Review this information carefully before you deploy disconnected operations for Azure Local.,2026-03-04T18:05:00.000Z,concept-article,troubleshooting,0.7,True,"Release notes of critical known issues and workarounds. 
Such pages typically map specific symptoms/conditions to causes and workarounds, which is expert troubleshooting knowledge unique to this product.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-monitoring?view=azloc-2603,Overview,Monitor Disconnected Operations in Azure Local - Azure Local,Integrate monitoring solutions with Azure Local disconnected operations,Learn how to monitor disconnected operations for Azure Local to ensure system reliability and performance.,"This article explains how to monitor disconnected operations for Azure Local by integrating with external monitoring solutions. Learn how to use Microsoft, non-Microsoft, and open-source monitoring systems to ensure the reliability and performance of your infrastructure and workloads.",2026-02-23T23:04:00.000Z,concept-article,configuration,0.6,True,"Describes integrating Microsoft and non-Microsoft monitoring systems. Likely includes product-specific endpoints, data flows, or configuration steps for monitoring in disconnected environments.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-network?view=azloc-2603,Network,Plan Your Network for Disconnected Operations on Azure Local - Azure Local,Plan networking for Azure Local disconnected operations,Plan and integrate your network for disconnected operations on Azure Local.,This article describes how to plan your network for disconnected operations on Azure Local. It covers key design considerations and requirements to help ensure reliable integration and performance in a disconnected environment.,2026-02-23T23:04:00.000Z,concept-article,decision-making,0.65,True,"Covers key design considerations and requirements for network planning in disconnected environments. 
This is environment- and product-specific decision guidance (topology, integration choices) rather than generic networking theory.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-on-demand-logs?view=azloc-2603,On-demand log collection,Collect Logs On-Demand with Azure Local Disconnected Operations - Azure Local,Collect on-demand logs via PowerShell for Azure Local disconnected operations,Learn how to use the PowerShell module to collect logs on-demand with disconnected operations for Azure Local.,This article explains how to collect logs on-demand for disconnected operations for Azure Local by using the PowerShell module. You learn how to provide logs for troubleshooting and support when Azure Local operates in disconnected mode.,2026-03-09T17:02:00.000Z,how-to,troubleshooting,0.7,True,"Explains using a PowerShell module to collect logs on-demand for troubleshooting and support. Contains product-specific commands, parameters, and log collection patterns, mapping symptoms to diagnostic data.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-overview?view=azloc-2603,Overview,Disconnected operations for Azure Local overview - Azure Local,Plan and use disconnected operations for Azure Local,"Learn how to use disconnected operations to deploy and manage Azure Local. Build secure, compliant private clouds for remote or sovereign environments.","Disconnected operations enable you to deploy and manage Azure Local instances to build sovereign private clouds. This article explains how this feature supports compliance, security, and remote deployments.",2026-02-25T18:04:00.000Z,overview,decision-making,0.6,True,"Explains how disconnected operations support compliance, security, and remote deployments. 
Likely includes scenario-based guidance on when to use disconnected operations and considerations for sovereign/private clouds, fitting decision-making for deployment mode selection.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-pki?view=azloc-2603,Public key infrastructure,Understand Public Key Infrastructure (PKI) requirements for disconnected operations on Azure Local - Azure Local,Configure PKI and certificates for Azure Local disconnected operations,Learn about public key infrastructure (PKI) requirements for disconnected operations on Azure Local. Discover how to create certificates to secure endpoints and ensure a trusted deployment.,This article explains the public key infrastructure (PKI) requirements for disconnected operations on Azure Local. You learn how to create certificates to secure appliance endpoints and ensure secure communication in your environment.,2026-02-25T18:04:00.000Z,concept-article,security,0.7,True,"Details PKI requirements and how to create certificates to secure appliance endpoints. This is product-specific security configuration (certificate types, usage, and trust requirements).",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-policy?view=azloc-2603,Azure Policy,Use Azure Policy in a Disconnected Azure Local Environment - Azure Local,Configure Azure Policy in disconnected Azure Local environments,Learn how to use Azure Policy to enforce compliance and manage resources in a disconnected Azure Local environment.,"This article explains how to use Azure Policy in a disconnected Azure Local environment to enforce compliance and manage resources at scale. 
Azure Policy helps organizations meet standards by checking resource properties against business rules, even when disconnected from the Azure cloud.",2026-02-23T23:04:00.000Z,concept-article,configuration,0.7,True,"Describes using Azure Policy when disconnected, which requires specific configuration steps (policy sync, assignment, evaluation) unique to this scenario.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-powershell?view=azloc-2603,Azure PowerShell,Use Azure PowerShell for Disconnected Operations on Azure Local - Azure Local,Configure Azure PowerShell for disconnected Azure Local,Learn how to use Azure PowerShell for disconnected operations on Azure Local.,This article explains how to configure Azure PowerShell for disconnected operations on Azure Local.,2026-04-05T08:00:00.000Z,how-to,configuration,0.7,True,"Configuring Azure PowerShell for disconnected Azure Local operations is likely to involve specific module versions, context settings, environment variables, or endpoint/URI overrides unique to this scenario. Those are concrete configuration parameters and patterns that qualify as expert knowledge and align with the configuration sub-skill.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-prepare?view=azloc-2603,Prepare Azure Local nodes,Prepare Azure Local for Disconnected Deployments - Azure Local,Prepare Azure Local nodes for disconnected deployments,"Prepare your Azure Local environment for disconnected deployments. Learn how to set up nodes, configure networking, and ensure deployment readiness.",This article explains how to prepare an Azure Local node for deployment in disconnected environments. Prepare your environment to ensure a successful setup of your Azure Local instance using a local control plane. 
Complete these steps before setting up any Azure Local instances in your disconnected environment.,2026-02-23T23:04:00.000Z,how-to,deployment,0.7,True,"Describes preparing nodes, networking, and readiness for disconnected deployments. Contains deployment-specific requirements and ordering that are unique to this product’s disconnected mode.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-registration?view=azloc-2603,Register disconnected operations,Register Disconnected Operations for Azure Local - Azure Local,Register Azure Local disconnected operations for compliance,Learn how to register disconnected operations for Azure Local to ensure compliance with deployment requirements.,This article explains how to register disconnected operations for Azure Local after deploying your management cluster with the control plane. Learn how to ensure compliance with Azure Local requirements through proper registration.,2026-03-04T18:05:00.000Z,how-to,deployment,0.65,True,Covers registration of disconnected operations to meet Azure Local requirements. Registration flows and constraints are product-specific deployment/compliance steps.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-security?view=azloc-2603,Security,Security Controls with Disconnected Operations for Azure Local - Azure Local,Apply security controls for Azure Local disconnected VMs,Learn about the security considerations and compliance regulations for disconnected operations for Azure Local.,This article explains the security considerations and compliance regulations for disconnected operations with Azure Local VMs enabled by Azure Arc. Learn how to protect your environment and meet regulatory standards when running Azure Local VMs disconnected.,2026-02-23T23:04:00.000Z,concept-article,security,0.65,True,Focuses on security considerations and compliance regulations for disconnected operations with Azure Local VMs. 
Contains product-specific security and compliance configuration guidance beyond generic security concepts.,unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-update?view=azloc-2603,Update disconnected operations,Update Disconnected Operations for Azure Local - Azure Local,,Learn how to update disconnected operations for Azure Local.,This article explains how to update disconnected operations for Azure Local. Learn how to apply updates to the appliance to ensure optimal performance and reliability in disconnected environments.,2026-04-14T22:03:00.000Z,how-to,,0.3,False,"The page appears to be a procedural guide on updating disconnected operations for Azure Local appliances. From the description and summary, it likely focuses on step-by-step update instructions rather than detailed configuration parameter tables, limits/quotas, error-code-based troubleshooting, or decision matrices. Without evidence of specific numeric limits, configuration option tables, or error-code mappings, it does not meet the criteria for any expert-knowledge sub-skill type.",updated
-https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-whats-new?view=azloc-2603,What's new in disconnected operations?,What's New in Disconnected Operations for Azure Local - Azure Local,,Find out about the new features and enhancements in disconnected operations for Azure Local.,"This article describes new features and improvements in disconnected operations for Azure Local. Before you deploy disconnected operations with Azure Local, review the Known issues to understand current limitations and available workarounds.",2026-02-23T23:04:00.000Z,concept-article,,0.3,False,"What's new/release info; mostly change log and feature descriptions. 
Does not clearly map to limits, configuration tables, or troubleshooting patterns per the defined categories.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/drift-detection?view=azloc-2603,Drift detection,Drift Detection for Azure Local - Azure Local,Apply drift detection to maintain Azure Local configuration health,Learn how Azure Local's Drift Detection framework ensures system reliability by continuously validating component states against a baseline.,"This article explains how the drift detection framework identifies configuration deviations, improves troubleshooting, and helps reduce configuration-related issues in your Azure Local environment.",2026-02-19T23:06:00.000Z,overview,best-practices,0.65,True,"Explains Azure Local’s drift detection framework and how to use it to identify configuration deviations, providing product-specific operational guidance.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/enable-nested-virtualization?view=azloc-2603,Enable nested virtualization,Enable nested virtualization in Azure Local - Azure Local,Enable nested virtualization on Azure Local hosts,Learn how to enable nested virtualization in Azure Local.,"This article provides an overview of nested virtualization in Azure Local and how to enable it. Nested virtualization lets you enable virtualization capabilities inside a Hyper-V virtual machine (VM). This allows you to maximize your hardware investments and gain flexibility in evaluation and testing scenarios. Other use cases include enabling security features, such as Virtualization based security (VBS). Important Azure Local provides virtualization capabilities to run workloads in VMs. 
Running",2026-01-28T23:03:00.000Z,how-to,configuration,0.7,True,Covers enabling nested virtualization; typically involves specific host/VM configuration flags and PowerShell settings unique to Azure Local/Hyper-V.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/gateway-connections?view=azloc-2603,Manage gateway connections,Manage Azure Local gateway connections using Windows Admin Center - Azure Local,,"Learn how to create, delete, and update gateway connections using Windows Admin Center after you deploy Software Defined Networking (SDN) managed by on-premises tools.","Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This article describes how to create, delete, and update gateway connections using Windows Admin Center after you deploy Software Defined Networking (SDN) managed by on-premises tools. Gateways are used for routing network traffic between a virtual network and another network, either local or remote. There are three types of gateway connections – Internet Protocol Security (IPsec), Generic Rou",2025-12-22T23:06:00.000Z,how-to,,0.45,False,Gateway connection management how-to; summary suggests basic create/update/delete operations without detailed configuration tables or troubleshooting content.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/get-remote-support?view=azloc-2603,Get remote support,Get remote support for Azure Local - Azure Local,Enable and manage remote support for Azure Local OS,Learn how to get remote support for the Azure Stack HCI Operating System.,"Applies to: Azure Local 2311.2 and later This article explains how to get remote support for the Azure Stack HCI operating system for Azure Local. -It gives an overview of remote support, the terms and conditions, and the steps to enable remote support on your Azure Local. 
It also covers setting up proxy settings, submitting a support request, and other remote support tasks.",2025-12-23T23:03:00.000Z,how-to,configuration,0.6,True,"Covers steps to enable remote support, configure proxy settings, and submit support requests, including Azure Local–specific remote support configuration.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/get-support-for-deployment-issues?view=azloc-2603,Get support for deployment issues,Get support for Azure Local deployment issues - Azure Local,,"Learn how to get Microsoft support for Azure Local deployment issues, including log collection and remote support.","Applies to: Hyperconverged deployments of Azure Local This article describes how to get Microsoft support for Azure Local deployment issues, including log collection and remote support.",2026-01-16T23:02:00.000Z,how-to,,0.3,False,Describes how to contact Microsoft support and use remote support at a high level; lacks detailed technical troubleshooting mappings or configs.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/get-support?view=azloc-2603,Get support for Azure Local,Get support for Azure Local - Azure Local,,This article provides guidance on how to get support for Azure Local.,"Applies to: Azure Local 2311.2 and later This article provides guidance on how to get support for Azure Local. Azure Local follows the same support process as Azure. Enterprise customers can follow the process described inCreate an Azure support request. If you're a customer of a Cloud Solution Provider (CSP), contact your CSP for support. Updates for Azure Local are released monthly to enhance customer experience. 
To keep your Azure Local instance in a supported state, you have up to six months",2025-12-23T23:03:00.000Z,how-to,,0.3,False,"General support process and lifecycle guidance; no product-specific configuration, limits, or error-resolution mappings.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/gpu-manage-via-device?view=azloc-2603,Manage GPUs using Discrete Device Assignment,Manage GPUs via Discrete Device Assignment for Azure Local (preview) - Azure Local,Configure GPU Discrete Device Assignment on Azure Local,Learn how to manage GPUs via Discrete Device Assignment for Azure Local (preview).,"Applies to: Hyperconverged deployments of Azure Local This article describes how to manage GPUs using Discrete Device Assignment (DDA) for Azure Local VMs enabled by Azure Arc. For GPU DDA management on Azure Kubernetes Service (AKS) enabled by Azure Arc, see Use GPUs for compute-intensive workloads. DDA allows you to dedicate a physical graphical processing unit (GPU) to your workload. In a DDA deployment, virtualized workloads run on the native driver and typically have full access to the GPU's",2025-12-23T23:03:00.000Z,how-to,configuration,0.75,True,"Managing GPUs via DDA requires detailed configuration (device IDs, assignment settings) specific to Azure Local and Arc-enabled VMs.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/gpu-manage-via-partitioning?view=azloc-2603,Manage GPUs via partitioning,Manage GPUs using partitioning for Azure Local (preview) - Azure Local,Configure GPU partitioning (GPU-P) for Azure Local VMs,Learn how to manage GPUs using partitioning for Azure Local (preview).,"Applies to: Hyperconverged deployments of Azure Local This article describes how to manage GPUs using partitioning (GPU-P) for Azure Local virtual machines (VMs) enabled by Azure Arc. GPU-P allows you to share a GPU with multiple workloads by splitting the GPU into dedicated fractional partitions. 
Important This feature is currently in PREVIEW. See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet",2025-12-23T23:03:00.000Z,how-to,configuration,0.75,True,GPU-P partitioning involves product-specific configuration parameters and constraints for sharing GPUs across workloads.,unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/gpu-preparation?view=azloc-2603,Prepare GPUs,Prepare GPUs for Azure Local instance - Azure Local,Configure GPUs for Azure Local hyperconverged hosts,Learn how to prepare GPUs for an Azure Local instance.,Applies to: Hyperconverged deployments of Azure Local This article describes how to prepare graphical processing units (GPUs) on your Azure Local instance for workloads running on Azure Local virtual machines (VMs) enabled by Azure Arc and on Azure Kubernetes Service (AKS) enabled by Azure Arc. GPUs are used for computation-intensive workloads such as machine learning and deep learning.,2026-03-19T17:11:00.000Z,how-to,configuration,0.7,True,"The page describes product-specific steps and settings to prepare and configure GPUs for Azure Local instances (for Arc-enabled VMs and AKS). This includes concrete configuration actions and parameters tied to this platform, which an LLM is unlikely to know from training. 
It is not just a conceptual overview but a how-to with environment-specific configuration details, fitting best under configuration.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/health-alerts-via-azure-monitor-alerts?view=azloc-2603,Health alerts,Use Azure Monitor alerts for Azure Local health alerts - Azure Local,Integrate Azure Local health service with Azure Monitor alerts,Learn how to use the Azure Monitor alerts to respond to Azure Local health alerts.,"Applies to: Hyperconverged deployments of Azure Local The OS health service for Azure Local continuously monitors your Azure Local system to detect over 80 health issues across various components, such as physical and virtual disk, storage pool capacity, volume capacity, network interface, storage QoS, virtual machines (VMs), and VHDs. It provides information about the affected component, including the cause, time of the issue, and recommendations to mitigate it. You can view health issues like ",2025-12-23T23:03:00.000Z,how-to,configuration,0.65,True,Explains mapping of Azure Local OS health issues into Azure Monitor alerts with product-specific alert configuration steps.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-actions?view=azloc-2603,Track Health Service actions,Track Health Service actions - Azure Local,Track automated Health Service actions in Azure Local,Learn more about: Health Service actions,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 The Health Service, first released in Windows Server 2016, improves the day-to-day monitoring and operational experience for clusters running Storage Spaces Direct. This topic describes workflows that the Health Service automates. The Health Service generates ""Actions"" to verify that they are taken autonomously, or to track their progress or outcome. 
Unlike logs, Actions disappear shortly afte",2026-01-28T23:03:00.000Z,concept-article,configuration,0.6,True,"Describes Health Service workflows and actions, including how actions are generated and tracked, which is product-specific operational configuration.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-cluster-performance-history?view=azloc-2603,Get cluster performance history,Cluster performance history - Azure Local,Retrieve Azure Local cluster performance history via Health Service,Learn more about: Cluster performance history,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019 The Health Service reduces the work required to get live performance and capacity information from your Storage Spaces Direct cluster. One cmdlet provides a curated list of essential metrics, which are collected efficiently and aggregated dynamically across nodes, with built-in logic to detect cluster membership. All values are real-time and point-in-time only. 
For additional information, see Performance history fo",2025-12-23T23:03:00.000Z,how-to,configuration,0.6,True,"Explains cmdlets and metrics for cluster performance history, including product-specific metric aggregation and retrieval commands.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-faults?view=azloc-2603,View Health Service faults,View Health Service faults - Azure Local,Interpret and resolve Azure Local Health Service faults,Learn more about Health Service faults,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019 This article provides detailed information about Health Service faults in Azure Local and Windows Server.",2025-12-23T23:03:00.000Z,how-to,troubleshooting,0.7,True,"Provides detailed information on Health Service faults, mapping fault types to causes and likely resolutions for Azure Local and Windows Server clusters.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-overview?view=azloc-2603,Monitor clusters with Health Service,Monitor clusters with the Health Service - Azure Local,Use Health Service to monitor Azure Local clusters,Learn more about how to use the Health Service to monitor clusters,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 The Health Service, first released in Windows Server 2016, improves the day-to-day monitoring and operational experience for clusters running Storage Spaces Direct.",2026-02-13T18:03:00.000Z,how-to,configuration,0.6,True,"Describes Health Service usage and configuration for clusters running Storage Spaces Direct, including Azure Local–specific monitoring behavior.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-settings?view=azloc-2603,Modify Health Service settings,Modify Health Service settings - Azure Local,Tune Azure Local Health Service settings and thresholds,Learn more about: Health Service 
settings,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 The Health Service, first released in Windows Server 2016, improves the day-to-day monitoring and operational experience for clusters running Storage Spaces Direct. Many of the parameters which govern the behavior of the Health Service are exposed as settings. You can modify these to tune the aggressiveness of faults or actions, turn certain behaviors on/off, and more. For a detailed overview ",2025-12-23T23:03:00.000Z,how-to,configuration,0.75,True,"Details Health Service parameters that govern behavior, including settings to adjust aggressiveness of faults and actions—explicit configuration knobs unique to this product.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/kerberos-with-spn?view=azloc-2603,Authenticate with Kerberos,Use Kerberos authentication with Service Principal Name (SPN) - Azure Local,Use Kerberos SPN authentication with Network Controller,Learn how to use Kerberos authentication with SPN.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019 This article describes how to use Kerberos authentication with Service Principal Name (SPN). Network Controller supports multiple authentication methods for communication with management clients. You can use Kerberos-based authentication or X509 certificate-based authentication. You also have the option to use no authentication for test deployments. 
System Center Virtual Machine Manager uses Kerberos-based authentic",2025-12-22T23:06:00.000Z,how-to,security,0.82,True,"Covers Kerberos with SPN for Network Controller and SCVMM; includes SPN formats, account requirements, and configuration steps that are product-specific security details.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/load-balance-multiple-networks?view=azloc-2603,Load balance multiple logical networks,Load balance multiple logical networks for Azure Local - Azure Local,Load balance multiple logical networks in Azure Local SDN,Learn how to load balance multiple logical networks in Software Defined Networking (SDN) managed by on-premises tools for Azure Local.,"Applies to: Hyperconverged deployments of Azure Local This article provides guidance on how to load balance multiple logical networks in Software Defined Networking (SDN) managed by on-premises tools for Azure Local. By using multiple logical networks for load balancing, you have more control over isolating workloads from each other. For information about how to create and manage logical networks, see Manage tenant logical networks.",2025-12-22T23:06:00.000Z,how-to,architecture-patterns,0.63,True,Provides guidance on using multiple logical networks for load balancing and isolation; this is a product-specific design pattern for Azure Local SDN rather than a generic concept.,unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/load-balancers?view=azloc-2603,Manage software load balancers,Manage Software Load Balancer for SDN managed by on-premises tools - Azure Local,,Learn how to manage Software Load Balancer for SDN managed by on-premises tools,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 In this article, learn how to manage Software Load Balancer (SLB) policies using Windows Admin Center after you deploy Software Defined Networking (SDN). 
SLBs are used to evenly distribute network traffic among multiple resources. SLB enables multiple machines to host the same workload, providing high availability and scalability. You can create load balancers for your workloads hosted on trad",2025-12-22T23:06:00.000Z,how-to,,0.45,False,"Managing SLB policies via Windows Admin Center; appears as a standard management tutorial without explicit limits, config matrices, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-arc-virtual-machine-resources?view=azloc-2603,Manage Azure Local VM resources,Manage resources for Azure Local VMs enabled by Azure Arc - Azure Local,,Learn how to manage resources like data disks and network interfaces on an Azure Local VM enabled by Azure Arc.,"Applies to: Hyperconverged deployments of Azure Local After you deploy Azure Local virtual machines (VMs) enabled by Azure Arc, you need to add or delete resources like data disks and network interfaces. -This article describes how to manage these VM resources for an Azure Local VM running on your Azure Local instance. Add or delete resources using the Azure portal. To add a data disk, use the Azure CLI.",2026-01-30T18:03:00.000Z,how-to,,0.5,False,"Managing VM resources (disks, NICs) is operational; summary doesn’t clearly indicate detailed configuration parameter tables or limits beyond basic add/delete steps.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-arc-virtual-machines?view=azloc-2603,Manage Azure Local VMs,"Manage including restart, start, stop or delete Azure Local VMs enabled by Azure Arc - Azure Local",,"Learn how to manage Azure Local VMs enabled by Azure Arc. This includes operations such as start, stop, restart, view properties of Azure Local VMs.","Applies to: Hyperconverged deployments of Azure Local This article describes how to manage Azure Local virtual machines (VMs) enabled by Azure Arc. 
The procedures to enable guest management and to start, stop, restart, pause, save, or delete an Azure Local VM are detailed.",2026-04-07T08:00:00.000Z,how-to,,0.35,False,"Covers basic VM lifecycle operations (start, stop, restart, delete) and enabling guest management. This is largely procedural and generic VM management behavior, without clear indication of product-specific limits, configuration tables, or error-code-based troubleshooting.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-at-scale-dashboard?view=azloc-2603,Use dashboard to manage at-scale,Monitor at scale using the Azure Local overview and All systems page - Azure Local,Monitor Azure Local at scale using portal dashboards,Learn to monitor your Azure Local using dashboards in Azure portal. You can view the status of Azure Local as charts or lists.,Applies to: Hyperconverged deployments of Azure Local This article details how to manage your Azure Local at scale via the dashboard in the Azure portal. 
You can view the status of the systems as charts or lists.,2025-12-23T23:03:00.000Z,how-to,configuration,0.65,True,Managing at scale via overview and All systems pages involves product-specific monitoring configuration and dashboard usage.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-bitlocker?view=azloc-2603,Manage BitLocker encryption,"Manage BitLocker encryption on Azure Local, version 23H2 - Azure Local",Manage BitLocker encryption and recovery keys on Azure Local,"Learn how to manage BitLocker encryption on your Azure Local, version 23H2 system.","Applies to: Hyperconverged deployments of Azure Local This article describes how to view and enable BitLocker encryption, and retrieve BitLocker recovery keys on your Azure Local instance.",2026-01-05T23:02:00.000Z,how-to,security,0.78,True,BitLocker management on Azure Local includes specific steps to enable encryption and retrieve recovery keys; these are concrete security configuration procedures.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-data-disks?view=azloc-2603,Download managed disks from Azure,Download Azure managed disk to Azure Local - Azure Local,,Learn how to download Azure managed disk to Azure Local.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to download an Azure managed disk from Azure to your Azure Local instance. 
You can then use the disk to create an image or to attach it to your Azure Local virtual machines (VMs) enabled by Arc, as needed.",2026-01-16T23:02:00.000Z,how-to,,0.35,False,Downloading managed disks to Azure Local is a step-by-step integration scenario but described at tutorial level; no clear evidence of config tables or limits.,unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-default-network-access-policies-virtual-machines-23h2?view=azloc-2603,Manage default network access policies,"Enable and assign default network access policies on Azure Local, version 23H2 VMs - Azure Local",Enable default VM network access policies on Azure Local,"Learn how to enable and assign default network access policies on VMs running on Azure Local, version 23H2 via the Windows Admin Center.",Applies to: Hyperconverged deployments of Azure Local Applies to: Windows Server 2025 This article describes how to enable default network access policies and assign them to virtual machines (VMs). Default network policies can be used to protect running virtual machines from external unauthorized attacks. These policies block all inbound access to virtual machines (except the specified management ports you want enabled) while allowing all outbound access. 
Use these policies to ensure that your w,2025-12-22T23:06:00.000Z,how-to,security,0.7,True,Describes default network access policies that block inbound traffic except specific management ports; likely includes concrete policy settings and port lists specific to Azure Local VMs.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-logical-networks?view=azloc-2603,Manage logical networks,Manage logical networks for Azure Local VMs enabled by Azure Arc - Azure Local,Manage logical network configuration for Azure Local VMs,Learn how to manage logical networks for Azure Local VMs enabled by Azure Arc.,"Applies to: Hyperconverged deployments of Azure Local To deploy Azure Local virtual machines (VMs) enabled by Azure Arc, you need to create logical networks. Once these networks are provisioned, you may need to manage them. This article describes how to manage these logical networks for Azure Local VMs deployed on your Local instance.",2026-01-22T23:03:00.000Z,how-to,configuration,0.6,True,"Managing logical networks implies editing network configuration parameters (subnets, IP pools, etc.) specific to Azure Local, fitting configuration sub-skill.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-network-security-groups?view=azloc-2603,Manage Network Security Groups,Manage network security groups and network security rules on Azure Local VMs - Azure Local,Manage NSGs and security rules for Azure Local Arc-enabled VMs,Learn how to manage network security groups and network security rules for Azure Local virtual machines.,"This article describes how to manage network security groups (NSGs) on your Azure Local virtual machines (VMs) enabled by Azure Arc. Once you create network security groups on your Azure Local VMs, you can then list, show details, associate, dissociate, update, and delete these resources. Note The only VMs that are in scope for using NSGs with this feature are Azure Local VMs. 
These are VMs that were deployed from Azure client interfaces (Azure CLI, Azure portal, Azure Resource Manager). Do not ",2025-12-22T23:06:00.000Z,how-to,security,0.7,True,"Describes listing, associating, updating, and deleting NSGs for Azure Local VMs; product-specific security configuration operations.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-sdn-multisite?view=azloc-2603,Manage SDN Multisite,Manage SDN Multisite for Azure Local and Windows Server - Azure Local,Deploy and manage SDN Multisite for Azure Local,Learn how to manage a multisite SDN solution for Azure Local and Windows Server.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to deploy and manage the Software Defined Networking (SDN) Multisite solution for Azure Local using Windows Admin Center. Applies to: Windows Server 2025 This article describes how to deploy and manage the Software Defined Networking (SDN) Multisite solution for Windows Server using Windows Admin Center. For an overview of SDN Multisite, its current capabilities and limitations, see Overview of SDN Multisite. 
For an",2025-12-22T23:06:00.000Z,how-to,configuration,0.62,True,"Multisite SDN management typically involves specific configuration parameters (site definitions, link settings, routing policies) unique to Azure Local/Windows Server SDN; article is focused on deployment and management details.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-secrets-rotation?view=azloc-2603,Manage secrets rotation,"Change deployment user password on Azure Local, version 23H2 - Azure Local",Rotate deployment user password and internal secrets on Azure Local,"This article describes how to manage internal secret rotation on Azure Local, version 23H2.",Applies to: Hyperconverged deployments of Azure Local This article describes how you can change the password associated with the deployment user on Azure Local.,2025-12-23T23:03:00.000Z,how-to,security,0.7,True,Changing deployment user password and managing internal secret rotation is product-specific security/identity configuration.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-secure-baseline?view=azloc-2603,Manage security defaults,"Manage security defaults on Azure Local, version 23H2 - Azure Local",Manage Azure Local 23H2 security default settings,"Learn how to manage security default settings available for Azure Local, version 23H2.",Applies to: Hyperconverged deployments of Azure Local This article describes how to manage default security settings for your Azure Local instance. 
You can also modify drift control and protected security settings defined during deployment so your device starts in a known good state.",2025-12-23T23:03:00.000Z,how-to,security,0.78,True,"Managing security defaults, drift control, and protected settings is product-specific security configuration, likely with named settings and allowed values.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-secure-boot-updates?view=azloc-2603,Manage Secure Boot updates,Manage Secure Boot Updates - Azure Local,Manage Secure Boot certificate updates in Azure Local,"Manage Secure Boot updates, including 2023 certificate rollout and CVE-2023-24932 mitigations, for Azure Local clusters.","This article describes how Azure Local manages the transition from the 2011 Secure Boot certificates, which expire in June 2026, to the 2023 Secure Boot certificates, including how it mitigates CVE-2023-24932 and why the changes are delivered through a phased rollout. It also covers how Azure Local orchestrates Secure Boot updates alongside OEM and hardware updates including Solution Builder Extension (SBE) packages and provides guidance for monitoring and validating each stage of the update proce",2026-04-17T22:03:00.000Z,how-to,security,0.7,True,"Page contains product-specific Secure Boot behavior for Azure Local, including how 2011 vs 2023 Secure Boot certificates are rolled out, CVE-2023-24932 mitigation details, and ordered update orchestration with OEM/SBE packages plus monitoring/validation steps. These are security-configuration and platform-specific operational details that go beyond generic Secure Boot concepts.",new
-https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-security-post-upgrade?view=azloc-2603,Manage security post upgrade,"Manage security after upgrading your Azure Local from an Azure Stack HCI, version 22H2. 
- Azure Local",Manage security settings after upgrading to Azure Local from Azure Stack HCI,"Learn how to manage security posture after you upgrade Azure Local from an Azure Stack HCI, version 22H2.","Applies to: Hyperconverged deployments of Azure Local This article describes how to manage security settings on an Azure Local that was upgraded from an Azure Stack HCI, version 22H2.",2026-03-12T22:05:00.000Z,how-to,security,0.68,True,Post-upgrade security posture management involves product-specific settings and migration considerations for security configuration.,unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-security-with-defender-for-cloud?view=azloc-2603,Manage with Microsoft Defender for Cloud,Manage system security with Microsoft Defender for Cloud (preview) - Azure Local,Secure Azure Local with Microsoft Defender for Cloud,This article describes how to use Microsoft Defender for Cloud to secure Azure Local (preview).,"Applies to: Azure Local 2311.2 and later This article discusses how to use Microsoft Defender for Cloud to protect Azure Local from various cyber threats and vulnerabilities. Defender for Cloud helps improve the security posture of Azure Local, and can protect against existing and evolving threats. For more information about Microsoft Defender for Cloud, see Microsoft Defender for Cloud documentation. Important This feature is currently in PREVIEW. 
-See the Supplemental Terms of Use for Microsoft A",2026-02-13T18:03:00.000Z,how-to,security,0.7,True,"Describes using Defender for Cloud to protect Azure Local; likely includes enabling plans, scopes, and security settings specific to this integration.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-syslog-forwarding?view=azloc-2603,Manage syslog forwarding,Manage syslog forwarding for Azure Local - Azure Local,Configure syslog forwarding from Azure Local to SIEM,Learn how to configure syslog forwarding for Azure Local security information and event management (SIEM).,"Applies to: Hyperconverged deployments of Azure Local This article describes how to configure security events to be forwarded to a customer-managed security information and event management (SIEM) system using syslog protocol for Azure Local. Use syslog forwarding to integrate with security monitoring solutions and to retrieve relevant security event logs to store them for retention on your own SIEM platform. For more information about security features in this release, see Security features for ",2025-12-23T23:03:00.000Z,how-to,security,0.8,True,"Syslog forwarding configuration for security events includes protocol settings, event types, and integration details with SIEM; clearly security configuration.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-thin-provisioning-23h2?view=azloc-2603,Storage thin provisioning,"Storage thin provisioning in Azure Local, version 23H2 - Azure Local",Configure storage thin provisioning on Azure Local 23H2,"How to use storage thin provisioning on Azure Local, version 23H2 by using Windows PowerShell.","Applies to: Hyperconverged deployments of Azure Local This article describes how thin provisioning works on your Azure Local instance. Traditionally, volumes are fixed provisioned, meaning that all storage is allocated from the storage pool when a volume is created. 
Despite the volume being empty, a portion of the storage pool's resources is depleted. Other volumes can't make use of this storage, which impacts storage efficiency and requires more maintenance.",2025-12-23T23:03:00.000Z,how-to,configuration,0.7,True,Explains how to enable and manage thin provisioning via PowerShell with Azure Local–specific storage configuration behavior.,unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-wdac?view=azloc-2603,Manage Application Control,"Manage Application Control for Azure Local, version 23H2 - Azure Local",Configure Application Control policies on Azure Local 23H2,"This article describes how to use Application Control on Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local This article describes how to use Application Control to reduce the attack surface of Azure Local. For more information, see Manage baseline security settings on Azure Local.",2025-12-23T23:03:00.000Z,how-to,security,0.75,True,"Application Control configuration to reduce attack surface is security-focused and product-specific, likely including policy settings and scopes.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-cluster-with-metrics?view=azloc-2603,Metrics,Monitor Azure Local with Azure Monitor Metrics - Azure Local,Configure Azure Monitor Metrics for Azure Local clusters,Learn how to monitor Azure Local with Azure Monitor Metrics.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to monitor your Azure Local system with Azure Monitor Metrics. It also describes the Performance Metrics dashboard and lists metrics collected for compute, storage, and network resources in Azure Local. When you have critical applications and business processes that rely on Azure resources, it's important to monitor those resources for their availability, performance, and operation. 
The integration of Azure Monitor M",2026-01-30T23:03:00.000Z,how-to,configuration,0.7,True,"Lists metrics collected for compute, storage, and network resources and describes Azure Local–specific metrics configuration and dashboards.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-features?view=azloc-2603,Monitor feature workbooks,Monitor Azure Local features with Insights - Azure Local,Monitor Azure Local features with Insights workbooks,Monitor Azure Local features with Insights.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to use Insights to monitor key Azure Local features, such as Resilient File System (ReFS) deduplication and compression. To monitor Azure Local systems with Insights, see Monitor a single Azure Local system with Insights and Monitor multiple Azure Local systems with Insights.",2025-12-23T23:03:00.000Z,how-to,configuration,0.6,True,"Shows how to monitor specific Azure Local features like ReFS deduplication and compression using Insights, involving feature-specific configuration.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-multi-23h2?view=azloc-2603,Monitor multiple systems,"Monitor multiple Azure Local, version 23H2 systems with Insights - Azure Local",Configure Insights to monitor multiple Azure Local systems,"How to use Insights to monitor the health, performance, and usage of multiple Azure Local, version 23H2 systems.","Applies to: Hyperconverged deployments of Azure Local This article explains how to use Insights to monitor multiple Azure Local systems. For a single Azure Local system, see Monitor a single Azure Local system with Insights. For information about the benefits, prerequisites, and how to enable Insights on each Azure Local system, see Benefits, Prerequisites, and Enable Insights. To monitor multiple Azure Local systems with Insights, you need to enable Insights on each system individually. 
Instead, you",2025-12-23T23:03:00.000Z,how-to,configuration,0.65,True,"Explains configuration for multi-system monitoring with Insights, including Azure Local–specific setup patterns at scale.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-multi-azure-policies?view=azloc-2603,Enable Insights at scale using Azure policies,Enable Insights for Azure Local at scale using Azure policies - Azure Local,Enable Azure Local Insights at scale using Azure Policy,How to enable Insights for Azure Local systems at scale using Azure policies.,"Applies to: Azure Local 2311.2 and later This document describes how to enable Insights for Azure Local systems at scale using Azure policies. To enable Insights for a single Azure Local system, see Monitor a single Azure Local system with Insights. For an overview of Azure Policy, see What is Azure Policy?",2025-12-23T23:03:00.000Z,how-to,configuration,0.7,True,"Describes Azure Policy definitions and assignments to enable Insights across Azure Local systems, including product-specific policy configuration.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-single-23h2?view=azloc-2603,Monitor a single system,"Monitor a single Azure Local, version 23H2 system with Insights - Azure Local",Configure Insights to monitor a single Azure Local system,"Enable logging and monitoring capabilities to monitor a single Azure Local, version 23H2 system using Insights.","Applies to: Hyperconverged deployments of Azure Local This article describes how to use Insights to monitor a single Azure Local system. For multiple Azure Local systems, see Monitor multiple Azure Local systems with Insights. Insights is a feature of Azure Monitor that quickly gets you started monitoring your Azure Local system. You can view key metrics, health, and usage information regarding cluster, nodes, virtual machines, and storage. 
Take a few moments to watch the video walkthrough on Ins",2026-02-13T18:03:00.000Z,how-to,configuration,0.65,True,Describes enabling Insights and configuring monitoring for a single Azure Local system with product-specific setup steps and options.,unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/nc-security?view=azloc-2603,Secure the Network Controller,Network Controller Security - Azure Local,Configure security for Azure Local Network Controller traffic,Learn how to configure security for all communication between Network Controller and other software and devices.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This article describes how to configure security for all communication between Network Controller and other software and devices. The communication paths that you can secure include Northbound communication on the management plane, cluster communication between Network Controller virtual machines (VMs) in a cluster, and Southbound communication on the data plane. Northbound Communication. 
Networ",2025-12-22T23:06:00.000Z,how-to,security,0.84,True,"Explains securing Northbound, cluster, and Southbound communication paths; likely includes supported authentication methods, certificate requirements, and port/protocol details specific to Network Controller.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/refs-deduplication-and-compression?view=azloc-2603,Use ReFS deduplication and compression,Optimize storage with ReFS deduplication and compression in Azure Local - Azure Local,Enable ReFS deduplication and compression on Azure Local,Learn how to use ReFS deduplication and compression in Azure Local to optimize storage.,Applies to: Hyperconverged deployments of Azure Local This article describes the Resilient File System (ReFS) deduplication and compression feature and how to use this feature in Azure Local to optimize storage.,2025-12-23T23:03:00.000Z,how-to,configuration,0.7,True,Provides Azure Local–specific steps and parameters to configure ReFS deduplication and compression to optimize storage.,unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/remediate-support-tool-infrastructure?view=azloc-2603,Use the Support tool for infrastructure issues,Remediation Support Tool for Azure Local infrastructure component issues - Azure Local,Use AKS Arc Support Tool to remediate Azure Local infrastructure,Learn how to run commands in the Support.AksArc PowerShell module to remediate issues in Azure Local infrastructure components.,The Support tool (also known as AKS Arc Support Tool) is a PowerShell module that provides diagnostic and remediation capabilities for the infrastructure components used by Azure Local VMs and Azure Kubernetes Service (AKS) enabled by Azure Arc on Azure Local.,2026-01-09T23:02:00.000Z,troubleshooting,troubleshooting,0.7,True,"Describes diagnostic and remediation commands in the Support.AksArc PowerShell module, mapping infrastructure issues to specific remediation actions.",unchanged 
-https://learn.microsoft.com/en-us/azure/azure-local/manage/remote-support-arc-extension?view=azloc-2603,Remote support extension,Azure Local Remote Support Arc extension and remote support overview - Azure Local,Use Azure Local Remote Support Arc extension for diagnostics,This article describes the remote support arc extension and remote support in Azure Local.,"Applies to: Hyperconverged deployments of Azure Local This article gives an overview of the Remote Support Arc extension and remote support in Azure Local. Learn about the benefits, scenarios, and commands that Microsoft support uses during a remote support session.",2025-12-23T23:03:00.000Z,overview,troubleshooting,0.65,True,"Describes remote support scenarios and specific commands Microsoft support uses during sessions, which are product-specific diagnostic patterns.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/repair-server?view=azloc-2603,Repair a node,"Repair a node on Azure Local, version 23H2 - Azure Local",Repair Azure Local 23H2 cluster nodes safely,"Learn how to repair a node on your Azure Local, version 23H2 system.","Applies to: Hyperconverged deployments of Azure Local This article describes how to repair a node on your Azure Local instance. 
In this article, each server is referred to as a node.",2025-12-23T23:03:00.000Z,how-to,deployment,0.65,True,Node repair procedures are operational/deployment tasks with specific steps and constraints for Azure Local clusters.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-log-collection?view=azloc-2603,Collect SDN logs,Collect logs for Software Defined Networking managed by on-premises tools on Azure Local - Azure Local,Collect SDN logs on Azure Local for troubleshooting,Learn how to collect logs to troubleshoot Software Defined Networking (SDN) managed by on-premises tools in Azure Local.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019 This article describes how to collect logs for Software Defined Networking (SDN) managed by on-premises tools on Azure Local. The SDN logs help you identify and troubleshoot advanced issues in your SDN environment. Use these logs to gather key information before you contact Microsoft support. Use SDN logs to also test a recently deployed SDN environment or retest an existing SDN deployment. 
In addition to log coll",2025-12-22T23:06:00.000Z,how-to,troubleshooting,0.78,True,"Describes how to collect SDN logs; includes specific log locations, commands, and collection procedures unique to Azure Local SDN.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-manage-certs?view=azloc-2603,Manage certificates for SDN,Manage certificates for Software Defined Networking (SDN) managed by on-premises tools - Azure Local,Manage certificates for Azure Local SDN Network Controller,Learn how to manage certificates for Network Controller Northbound and Southbound communications when you deploy Software Defined Networking (SDN) managed by on-premises tools.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This article describes how to manage certificates for Network Controller Northbound and Southbound communications when you deploy Software Defined Networking (SDN) and when you use System Center Virtual Machine Manager (SCVMM) as your SDN management client. Note For overview information about Network Controller, see Network Controller. If you aren't using Kerberos for securing the Network Contr",2025-12-22T23:06:00.000Z,how-to,security,0.78,True,"Covers certificate management for Northbound/Southbound communications and SCVMM; includes certificate types, usage, and configuration steps specific to SDN security.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-technical-reference?view=azloc-2603,SDN technical reference,Technical reference for Software Defined Networking (SDN) managed by on-premises tools in Azure Local. - Azure Local,,This technical reference serves as a one-stop-shop to access learning resources available for SDN managed by on-premises tools.,"Applies to: Azure Stack HCI, version 22H2 This technical reference provides comprehensive learning resources to enhance your understanding of Software Defined Networking (SDN) in Azure Local. 
We continuously update these resources based on the latest information available and user feedback. Be sure to check back here often for fresh content.",2026-01-07T23:02:00.000Z,overview,,0.2,False,Technical reference index/landing page aggregating SDN resources; no direct expert configuration or troubleshooting content indicated.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-troubleshooting?view=azloc-2603,Troubleshoot SDN,Troubleshoot Azure Local SDN enabled by Azure Arc - Azure Local,"Troubleshoot Azure Local SDN deployment, connectivity, and NSG issues","Troubleshoot and resolve common Azure Local SDN network controller deployment errors, VM connectivity issues, and NSG configuration problems. Learn how to fix DNS, downtime, and network policy errors.","This article provides troubleshooting steps for common issues encountered when you deploy and manage Software-Defined Networking (SDN) enabled by Azure Arc on your Azure Local VMs. The article covers errors during the action plan deployment, VM connectivity issues, and network security group (NSG) configurations.",2025-12-22T23:06:00.000Z,concept-article,troubleshooting,0.85,True,"Explicit troubleshooting article for SDN; will map specific errors and symptoms (deployment errors, DNS, downtime, policy issues) to causes and resolutions.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/set-up-recommended-alert-rules?view=azloc-2603,Recommended alert rules,Enable recommended alert rules for Azure Local - Azure Local,Enable recommended Azure Local alert rules for baseline monitoring,How to enable recommended alert rules for Azure Local.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to enable recommended alert rules for Azure Local. A metric alert rule monitors a resource by evaluating conditions on the resource metrics at regular intervals. If the conditions are met, an alert is fired. 
Recommended alerts are predefined metric-based alerts for your Azure Local system resource. These alerts provide you with initial monitoring for a common set of metrics including CPU percentage and available mem",2025-12-23T23:03:00.000Z,how-to,best-practices,0.65,True,"Provides predefined, recommended metric-based alert rules (CPU, memory, etc.) tailored to Azure Local, representing product-specific monitoring best practices.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/setup-metric-alerts?view=azloc-2603,Metric alerts,Set up metric alerts for Azure Local - Azure Local,Configure metric alerts for Azure Local resources,How to set up metric alerts for Azure Local.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to set up metric alerts for Azure Local. For information about how to set up log alerts, see Set up log alerts for Azure Local systems.",2025-12-23T23:03:00.000Z,how-to,configuration,0.7,True,Describes how to create metric alerts for Azure Local with product-specific metrics and alert rule configuration.,unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/manage/setup-system-alerts?view=azloc-2603,Log alerts,Set up log alerts for Azure Local - Azure Local,Set up log alerts for Azure Local using Insights queries,How to set up log alerts for various Azure Local system resources using Insights for Azure Local and sample log queries.,"Applies to: Azure Local 2311.2 and later This article describes how to set up log alerts for Azure Local systems: using Insights for Azure Local and using preexisting sample log queries, such as average node CPU, available memory, available volume capacity, and more. For information about how to set up metric alerts, see Set up metric alerts for Azure Local. 
Take a few moments to watch the video walkthrough on collecting new logs, customizing the Insights workbooks, and creating alerts using logs",2025-12-23T23:03:00.000Z,how-to,configuration,0.7,True,"Provides sample log queries and alert configuration for Azure Local resources, including concrete query patterns and alert settings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/support-tools?view=azloc-2603,Use the Diagnostic Support tool,Support Tool for Azure Local Hyperconverged Deployments - Azure Local,Use Azure Local Support Diagnostic Tool for troubleshooting,This article provides guidance on the Support Diagnostic Tool for Azure Local.,"Applies to: Hyperconverged deployments of Azure Local This article provides information to download and use the Azure Local Support Diagnostic Tool. The tool is a set of PowerShell commands to simplify data collection, troubleshooting, and resolution of common issues. This tool isn't a substitute for expert knowledge. If you encounter any issues, contact Microsoft Support for assistance.",2026-04-05T08:00:00.000Z,how-to,troubleshooting,0.7,True,"Page focuses on a product-specific diagnostic tool for Azure Local hyperconverged deployments, likely detailing concrete PowerShell commands and data collection steps for resolving issues, which fits the troubleshooting pattern of product-specific diagnosis and resolution guidance.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/suspend-resume-cluster-maintenance?view=azloc-2603,Suspend and resume,Suspend and resume Azure Local machines for planned maintenance operations - Azure Local,Suspend and resume Azure Local machines for maintenance,Learn how to suspend and resume Azure Local machines for planned maintenance operations.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to suspend an Azure Local machine (physical host) for planned maintenance, such as powering off the machine to replace 
non-hot-pluggable components. It also provides instructions on how to resume the machine once maintenance is complete.",2026-01-23T18:03:00.000Z,how-to,deployment,0.65,True,Describes operational procedure to suspend/resume physical hosts for maintenance; product-specific maintenance/deployment workflow.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/tenant-logical-networks?view=azloc-2603,Manage tenant logical networks,Manage tenant logical networks - Azure Local,,"This topic provides step-by-step instructions on how to use Windows Admin Center to create, update, and delete logical networks after you have deployed Network Controller.","Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This topic provides step-by-step instructions on how to use Windows Admin Center to create, update, and delete logical networks after you have deployed Network Controller. A Software Defined Networking (SDN) logical network is a traditional VLAN-based network. 
By modeling a VLAN-based network as an SDN logical network, you can apply network policies to workloads that are attached to these netw",2025-12-22T23:06:00.000Z,how-to,,0.45,False,How-to for managing logical networks via Windows Admin Center; likely mostly UI steps without detailed config tables or limits.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/tenant-virtual-networks?view=azloc-2603,Manage tenant virtual networks,Manage tenant virtual networks - Azure Local,,"This topic provides step-by-step instructions on how to use Windows Admin Center to create, update, and delete Hyper-V Network Virtualization (HNV) virtual networks after you have deployed Software De","Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This topic provides step-by-step instructions on how to use Windows Admin Center to create, update, and delete Hyper-V Network Virtualization (HNV) virtual networks after you have deployed Software Defined Networking (SDN). HNV helps you isolate tenant networks so that each tenant network is a separate entity. Each entity has no cross-connection possibility, unless you either configure public ",2025-12-22T23:06:00.000Z,how-to,,0.45,False,How-to for tenant virtual networks; summary indicates procedural steps rather than deep config references or troubleshooting.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-arc-enabled-vms?view=azloc-2603,Troubleshoot,Troubleshoot Azure Local Virtual Machines enabled by Azure Arc - Azure Local,Troubleshoot Azure Local Arc-enabled virtual machines,Learn how to troubleshoot issues you experience with Azure Local Virtual Machines (VMs).,"Applies to: Hyperconverged deployments of Azure Local This article describes how to collect logs and troubleshoot problems with Azure Local Virtual Machines enabled by Azure Arc. 
It also lists the current limitations and known problems with Azure Local VM management, along with recommended resolutions.",2026-01-22T23:03:00.000Z,how-to,troubleshooting,0.78,True,"Explicit troubleshooting article for Azure Local VMs enabled by Azure Arc; likely includes specific log locations, commands, and known issues/limitations with resolutions.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-common-sdn-issues?view=azloc-2603,Troubleshoot common SDN issues,Collect traces and logs to troubleshoot common issues in SDN managed by on-premises tools - Azure Local,Collect SDN traces and logs on Azure Local,Learn how to collect network traces and logs to troubleshoot common issues in Software Defined Networking (SDN) managed by on-premises tools.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019 This article describes what data to collect to troubleshoot common issues in Software Defined Networking (SDN) managed by on-premises tools on Azure Local. Use this information to perform initial troubleshooting before contacting Microsoft Support.",2026-03-27T17:04:00.000Z,how-to,troubleshooting,0.78,True,"Troubleshooting-focused article for SDN on Azure Local that describes exactly what data to collect and from where (specific log locations, trace types, and commands) before contacting Microsoft Support. 
This is product- and version-specific diagnostic guidance that an LLM is unlikely to know from training, matching the troubleshooting criteria of symptom → data to gather → next steps.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-deployment-configurator-app?view=azloc-2603,Troubleshoot registration issues for Configurator app,Troubleshoot registration issues using Configurator app in Azure Local - Azure Local,Troubleshoot Azure Local registration failures in Configurator app,Learn how to troubleshoot the registration failures for Azure Local when using the Configurator app.,Applies to: Azure Local 2508 and later This article provides guidance on how to collect logs and troubleshoot issues experienced during the registration of Azure Local via the Configurator app.,2025-12-23T23:03:00.000Z,how-to,troubleshooting,0.8,True,"Focused on diagnosing registration failures with the Configurator app, including log collection and symptom-based guidance specific to Azure Local deployment.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-deployment?view=azloc-2603,Troubleshoot deployment validation issues,"Troubleshoot deployment validation issues in Azure Local, version 23H2 via Azure portal - Azure Local",Troubleshoot Azure Local 23H2 deployment validation issues,"Learn how to troubleshoot the deployment validation failures for Azure Local, version 23H2 when deployed via the Azure portal.",Applies to: Azure Local 2405 and later This article provides guidance on how to troubleshoot deployment validation issues experienced during the deployment of Azure Local via the Azure portal.,2026-01-28T18:04:00.000Z,how-to,troubleshooting,0.8,True,"Provides symptom-to-resolution guidance for deployment validation failures via Azure portal, including Azure Local–specific checks and remediation steps.",unchanged 
-https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-sdn-deployment?view=azloc-2603,Troubleshoot SDN deployment,"Troubleshoot deployment of Software Defined Networking managed by on-premises tools in Azure Local, version 23H2 via Windows Admin Center - Azure Local",Troubleshoot SDN deployment via Windows Admin Center,"Learn how to troubleshoot the deployment of Software Defined Networking (SDN) managed by on-premises tools in Azure Local, version 23H2 via Windows Admin Center.","Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019 This article provides guidance on how to troubleshoot issues that you may encounter while deploying the Software Defined Networking (SDN) components using Windows Admin Center. Use this guidance to troubleshoot the issues before creating a support ticket. This article also provides instructions on how to collect logs after successfully troubleshooting the issues to help diagnose the cause of deployment failure.",2025-12-22T23:06:00.000Z,how-to,troubleshooting,0.86,True,"Explicit troubleshooting article for SDN deployment; likely organized by deployment failures, includes logs to check and remediation steps specific to Azure Local SDN.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-software-load-balancer?view=azloc-2603,Troubleshoot Software Load Balancer for SDN,Troubleshoot Software Load Balancer (SLB) for SDN managed by on-premises tools in Azure Local and Windows Server - Azure Local,Troubleshoot Azure Local SDN Software Load Balancer,Learn how to troubleshoot Software Load Balancer for SDN managed by on-premises tools in Azure Local and Windows Server.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019 If you've set up Software Load Balancer (SLB) for Software Defined Networking (SDN) and your data path isn't working through SLB, there could be several reasons behind it. 
This article helps you identify and troubleshoot some common issues in SLB for SDN managed by on-premises tools. For an overview of SLB and how to manage it, see What is Software Load Balancer (SLB) for SDN? and Manage Software Load Balancer for SD",2025-12-22T23:06:00.000Z,how-to,troubleshooting,0.84,True,"Troubleshooting SLB data path issues; likely includes specific checks, counters, and configuration validations unique to SLB for Azure Local/Windows Server.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/trusted-launch-automatic-state-transfer?view=azloc-2603,Automatic virtual TPM state transfer,Automatic virtual TPM state transfer for Azure Local - Azure Local,,Learn how automatic virtual TPM state transfer works for Azure Local.,"Applies to: Hyperconverged deployments of Azure Local This article uses an example to illustrate the automatic transfer of virtual TPM (vTPM) state for Trusted launch for Azure Local VM, even as the VM migrates or fails over to another machine in the system. This operation allows the applications that use the vTPM to function normally during VM migration or failover.",2025-12-23T23:03:00.000Z,how-to,,0.45,False,"Describes automatic vTPM state transfer behavior conceptually; summary does not indicate configuration parameters, limits, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/trusted-launch-guest-attestation?view=azloc-2603,Enable guest attestation,Guest attestation for Trusted launch for Azure Local VMs (preview) - Azure Local,Enable guest attestation for Azure Local Trusted launch VMs,Learn how guest attestation works for Trusted launch for Azure Local VMs (preview).,"Applies to: Azure Local 2509 and later This article describes how to enable guest attestation for Trusted launch for Azure Local virtual machines (VMs) enabled by Azure Arc. 
Guest attestation, also called boot integrity verification, is a new feature you can preview starting with Azure Local version 2509. Guest attestation allows you to verify if the VM started in a well-known good state – specifically, verify integrity of the entire boot chain. This helps detect any unexpected changes to the bo",2025-12-23T23:03:00.000Z,how-to,security,0.7,True,"Covers enabling guest attestation (boot integrity verification) for Trusted launch VMs; likely includes specific security settings, parameters, and version requirements, which are product-specific security configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/trusted-launch-vm-import-key?view=azloc-2603,Manual backup and recovery,Manual backup and recovery of guest state protection keys for Trusted launch Azure Local VMs enabled by Azure Arc - Azure Local,Back up and restore Trusted launch guest state keys,Learn how to perform a manual backup and recovery of guest state protection keys for Trusted launch Azure Local VMs enabled by Azure Arc.,Applies to: Hyperconverged deployments of Azure Local This article describes how to manually back up and restore guest state protection keys for Trusted launch Azure Local virtual machines (VMs) enabled by Azure Arc. For Azure Local release 2505 and later: Back up and restore Azure Local VM guest state protection keys to and from a file system folder. 
For Azure Local releases prior to 2505: Back up and restore Azure Local VM guest state protection keys to and from a key vault in another Azure Lo,2025-12-23T23:03:00.000Z,how-to,security,0.78,True,"Describes manual backup and recovery of guest state protection keys, including different procedures for specific Azure Local releases and storage targets; this is product-specific security key management configuration.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/trusted-launch-vm-overview?view=azloc-2603,What is Trusted launch for Azure Local VMs?,Overview for Trusted launch for Azure Local VMs enabled by Azure Arc - Azure Local,,Learn about Trusted launch for Azure Local VMs enabled by Azure Arc.,Applies to: Hyperconverged deployments of Azure Local This article introduces Trusted launch for Azure Local virtual machines (VMs) enabled by Azure Arc. You can create a Trusted launch for an Azure Local VM using the Azure portal or by using Azure Command-Line Interface (CLI).,2026-04-06T22:03:00.000Z,concept-article,,0.2,False,"High-level overview of Trusted launch for Azure Local VMs; description suggests conceptual introduction and how to create VMs via portal/CLI, but no indication of specific limits, configuration parameter tables, RBAC role lists, or error-code-based troubleshooting content.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/unregister-register-machine?view=azloc-2603,Unregister and reregister machines,Unregister and re-register Azure Local machines - Azure Local,Unregister and re-register Azure Local machines using PowerShell,Learn how to unregister and re-register Azure Local machines without having to install the operating system again.,Applies to: Azure Local 2508 and later This article provides guidance on how to unregister and re-register Azure Local machines without having to install the operating system (OS) again. 
This method uses PowerShell cmdlets and applies to registration with and without Azure Arc gateway. Important This guidance applies only to Azure Local clusters that haven't been deployed yet.,2026-03-03T23:11:00.000Z,how-to,configuration,0.65,True,Uses specific PowerShell cmdlets and flows for registration state management; these are product-specific configuration/management commands.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/update-network-controller-certificates?view=azloc-2603,Update Network Controller certificates,Renew certificates for Network Controller - Azure Local,Renew Azure Local Network Controller certificates,This article describes how to renew Network Controller certificates.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022 and Windows Server 2019 This article provides instructions on how to renew or change Network Controller certificates, both automatically and manually. If you face any issues in renewing your Network Controller certificates, contact Microsoft Support. 
In your Software Defined Networking (SDN) managed by on-premises tools infrastructure, the Network Controller uses certificate-based authentication to secure Northbound communication chan",2025-12-22T23:06:00.000Z,how-to,security,0.8,True,"Details automatic and manual renewal of Network Controller certificates; includes product-specific certificate roles, stores, and renewal procedures.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/update-sdn-infrastructure-certificates?view=azloc-2603,Update SDN infrastructure certificates,Renew certificates for Software Defined Networking managed by on-premises tools infrastructure - Azure Local,Renew SDN infrastructure and SLB MUX certificates,This article describes how to renew or change SDN server and Software Load Balancer multiplexer certificates.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022 and Windows Server 2019 This article provides instructions on how to renew or change Software Defined Networking (SDN) server and Software Load Balancer (SLB) multiplexer (MUX) certificates. If you face any issues in renewing your certificates, contact Microsoft Support. For information about how to renew Network Controller certificates, see Renew Network Controller certificates before they expire. 
In your SDN infrastructure, the Netwo",2025-12-22T23:06:00.000Z,how-to,security,0.8,True,Describes renewal of SDN server and SLB MUX certificates; contains specific certificate usage and renewal steps unique to Azure Local SDN.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/update-sdn?view=azloc-2603,Update SDN infrastructure,Update infrastructure for SDN managed by on-premises tools for Azure Local - Azure Local,,Learn to update infrastructure for SDN managed by on-premises tools.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 Software Defined Networking (SDN) infrastructure components include Network Controller and optionally, Software Load Balancers (SLBs), and SDN gateways that run on virtual machines (VMs). When you update each component, you use any of the standard methods for installing Windows updates, and you also use Windows PowerShell. You can update the SDN infrastructure in any order, but we recommend th",2025-12-22T23:06:00.000Z,how-to,,0.45,False,"Update procedure for SDN components; summary suggests generic update flow without detailed config parameters, limits, or error-code-based troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/upgrade-sdn?view=azloc-2603,Upgrade SDN infrastructure,Upgrade Infrastructure for Software Defined Networking (SDN) Managed by On-Premises Tools - Azure Local,Upgrade SDN infrastructure and fix upgrade issues,Learn how to upgrade infrastructure for SDN managed by on-premises tools.,"Applies to: Hyperconverged deployments of Azure Local running 2311.2 and later; Windows Server 2025, Windows Server 2022 This article provides guidance on safely and securely upgrading infrastructure for Software Defined Networking (SDN) managed by on-premises tools. It also provides troubleshooting guidance to help remediate issues that might occur during the upgrade process. 
Important Do not use this article for upgrading SDN enabled by Azure Arc on Azure Local.",2026-01-16T23:02:00.000Z,how-to,troubleshooting,0.62,True,"Upgrade guide explicitly includes troubleshooting guidance for SDN upgrade issues, which typically maps symptoms to causes and resolutions specific to Azure Local SDN.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/use-datacenter-firewall-powershell?view=azloc-2603,Configure network security groups with PowerShell,Configure network security groups with PowerShell - Azure Local,Configure SDN network security groups with PowerShell,Configure network security groups with PowerShell.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This article provides instructions for configuring network security groups (NSGs) to manage data traffic flow using Datacenter Firewall for Software Defined Networking (SDN) in Azure Local using Windows PowerShell. You enable and configure Datacenter Firewall by creating network security groups that get applied to a subnet or a network interface. 
The example scripts in this article use Windows P",2025-12-22T23:06:00.000Z,how-to,security,0.8,True,"PowerShell-based NSG configuration for Datacenter Firewall; likely includes cmdlet names, parameter sets, and example scripts that are product-specific configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/use-datacenter-firewall-windows-admin-center?view=azloc-2603,Configure network security groups with Windows Admin Center,Configure network security groups with Windows Admin Center - Azure Local,Configure SDN network security groups in Windows Admin Center,Configure network security groups with Windows Admin Center,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This topic provides step-by-step instructions on how to use Windows Admin Center to create and configure network security groups (NSGs) to manage data traffic flow using Datacenter Firewall. It also provides instructions on managing network security groups on Software Defined Network (SDN) virtual and logical networks. You enable and configure Datacenter Firewall by creating network security g",2025-12-22T23:06:00.000Z,how-to,security,0.78,True,"Covers Datacenter Firewall and NSG configuration for SDN, including NSG objects applied to subnets/NICs; typically includes rule properties, directions, and scopes specific to Azure Local SDN.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/use-environment-checker?view=azloc-2603,Assess environment readiness,"Use Azure Local Environment Checker to assess deployment readiness for Azure Local, version 23H2. 
- Azure Local",Use Azure Local Environment Checker for readiness validation,"How to use the Environment Checker to assess if your environment is ready for deploying Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local This article describes how to use the Azure Local Environment Checker in a standalone mode to assess how ready your environment is for deploying the Azure Local solution. For a smooth deployment of the Azure Local solution, your IT environment must meet certain requirements for connectivity, hardware, networking, and Active Directory. The Azure Local Environment Checker is a readiness assessment tool that checks these minimum requirements and",2026-01-27T18:04:00.000Z,how-to,configuration,0.7,True,"Readiness tool checks minimum requirements for connectivity, hardware, networking, and AD; such pages typically list specific parameters, thresholds, and checks that are product-specific configuration knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-azure-marketplace-red-hat?view=azloc-2603,Using Red Hat Enterprise Azure Marketplace image,Prepare Red Hat Enterprise Azure Marketplace Image for Azure Local VM Deployment - Azure Local,,Learn how to prepare and export a Red Hat Enterprise Azure Marketplace VM image for use with Azure Local clusters.,"This article explains how to prepare a Red Hat Enterprise Linux (RHEL) Azure Marketplace image for use with Azure Local virtual machines (VMs). 
By following these steps, you ensure your VM has the latest security updates, support, and integration features.",2026-01-16T18:04:00.000Z,how-to,,0.4,False,"Preparing RHEL Marketplace images is a how-to; focus is on steps, not on limits, security roles, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-azure-marketplace-ubuntu?view=azloc-2603,Using Ubuntu Azure Marketplace image,Prepare Ubuntu Azure Marketplace Image for Azure Local VM Deployment - Azure Local,,Learn how to prepare and export a Ubuntu Azure Marketplace VM image for use with Azure Local clusters.,"This article explains how to prepare an Ubuntu Azure Marketplace image for use with Azure Local virtual machines (VMs). By following these steps, you ensure your VM has the latest security updates, support, and integration features.",2026-01-12T23:03:00.000Z,how-to,,0.4,False,"Preparing Ubuntu Marketplace images is a step-by-step preparation guide; summary emphasizes updates and integration, not limits or config matrices.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-azure-compute-gallery?view=azloc-2603,Using Azure Compute Gallery images,Create Azure Local VM from Azure Compute Gallery Images via Azure CLI - Azure Local,Create Azure Local VMs from Azure Compute Gallery images,Learn how to create Azure Local VM images using Azure Compute Gallery images.,Applies to: Hyperconverged deployments of Azure Local This article describes how to create Azure Local virtual machines (VMs) enabled by Azure Arc using source images from the Azure Compute Gallery. You can create VM images on Azure CLI using the instructions in this article and then use these VM images to create Azure Local VMs.,2026-03-31T17:04:00.000Z,how-to,integrations,0.65,True,"How-to article for creating Azure Local VMs from Azure Compute Gallery via Azure CLI. 
Likely includes product-specific CLI commands, parameters, and image configuration details that qualify as integration/coding patterns rather than generic tutorial content.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-azure-marketplace?view=azloc-2603,Using Azure Marketplace images,Create Azure Local VM from Azure Marketplace Images via Azure CLI - Azure Local,,Learn how to create Azure Local VM images using source images from Azure Marketplace.,"Applies to: Hyperconverged deployments of Azure Local This article explains how to create Windows virtual machine (VM) images for Azure Local by using source images from Azure Marketplace, either through the Azure portal or Azure CLI. To create Linux VM images from Azure Marketplace, choose: Prepare RHEL Azure Marketplace image for Azure Local VM deployment. Prepare Ubuntu Azure Marketplace image for Azure Local VMs.",2026-01-16T18:04:00.000Z,how-to,,0.35,False,Creating VM images from Azure Marketplace is a procedural guide; summary doesn’t show detailed configuration matrices or quotas.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-centos?view=azloc-2603,Using CentOS VM image,Prepare CentOS Linux image via Azure CLI for Azure Local VMs enabled by Azure Arc (preview) - Azure Local,,Learn how to prepare CentOS Linux images to create an Azure Local VM image (preview).,"Caution This article references CentOS, a Linux distribution that's reached end-of-life (EOL). Consider your use of CentOS and plan accordingly. For more information, see CentOS end-of-life guidance. 
Applies to: Hyperconverged deployments of Azure Local This article describes how to use Azure CLI to prepare a CentOS Linux image and create an Azure Local virtual machine (VM).",2025-12-23T23:03:00.000Z,how-to,,0.4,False,"Preparing CentOS images is a CLI tutorial; EOL warning is conceptual, and there’s no indication of detailed configuration tables or quotas.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-existing-arc-vm?view=azloc-2603,Using an existing Azure Local VM,Create Azure Local VM image from an existing Azure Local VM enabled by Azure Arc - Azure Local,,Learn how to create Azure Local VM images using an existing Azure Local VM enabled by Azure Arc via Azure CLI.,Applies to: Hyperconverged deployments of Azure Local 2408.2 and later This article describes how to create Azure Local VM images using Azure Command-Line Interface (CLI) and existing Azure Local VMs. Learn to use the operating system (OS) disk of an Azure Local VM to create a gallery image on your Azure Local.,2026-02-11T23:03:00.000Z,how-to,,0.4,False,Creating VM images from existing VMs is a CLI-based how-to; summary doesn’t show configuration matrices or limits beyond version applicability.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-linux-sysprep?view=azloc-2603,Using Ubuntu VM image,Prepare Ubuntu image via Azure CLI for Azure Local VMs enabled by Azure Arc - Azure Local,,Learn how to prepare Ubuntu images to create an Azure Local VM image by using Azure CLI.,Applies to: Hyperconverged deployments of Azure Local This article describes how to use Azure CLI to prepare an Ubuntu image and create an Azure Local virtual machine (VM).,2025-12-29T18:05:00.000Z,how-to,,0.4,False,"Preparing Ubuntu images via CLI is a tutorial; while it may have commands, it’s not primarily a configuration reference or troubleshooting guide.",unchanged 
-https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-local-share?view=azloc-2603,Using images in local share,Create Azure Local VM from local share images via Azure CLI - Azure Local,,Learn how to create Azure Local VM images using source images from a local share on your system.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to create virtual machine (VM) images for Azure Local using source images from a local share. You can create VM images by using the Azure portal or Azure CLI. Then, use these VM images to create Azure Local VMs.",2025-12-23T23:03:00.000Z,how-to,,0.35,False,Using local share images is a procedural article; no evidence of detailed configuration option tables or quotas.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-red-hat-enterprise?view=azloc-2603,Using Red Hat Enterprise VM image,Prepare Red Hat Enterprise Linux image via Azure CLI for Azure Local VMs enabled by Azure Arc - Azure Local,,Learn how to prepare a Red Hat Enterprise Linux image to create an Azure Local VM image (preview).,Applies to: Hyperconverged deployments of Azure Local This article describes how to use Azure CLI to prepare a Red Hat Enterprise Linux image and create an Azure Local virtual machine (VM).,2026-01-16T18:04:00.000Z,how-to,,0.4,False,Preparing RHEL images is a procedural article; summary doesn’t show expert-level configuration or troubleshooting content.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-storage-account?view=azloc-2603,Using images in Azure Storage account,Create Azure Local VMs enabled by Azure Arc in Azure Storage account - Azure Local,,Learn how to create Azure Local VMs enabled by Azure Arc using source images from Azure Storage account via Azure portal and Azure CLI.,Applies to: Hyperconverged deployments of Azure Local This article describes how to create Azure Local virtual machines (VMs) 
enabled by Azure Arc using source images from the Azure Storage account. You can create VM images by using the Azure portal or Azure CLI and then use these VM images to create Azure Local VMs.,2025-12-23T23:03:00.000Z,how-to,,0.35,False,"Creating VMs from Storage account images is a tutorial; summary doesn’t indicate expert-level limits, security roles, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-suse?view=azloc-2603,Using SUSE VM image,Prepare SUSE Linux image via Azure CLI for Azure Local VMs enabled by Azure Arc - Azure Local,,Learn how to prepare SUSE Linux images to create an Azure Local VM image (preview).,Applies to: Hyperconverged deployments of Azure Local This article describes how to use Azure CLI to prepare a SUSE Linux image and create an Azure Local virtual machine (VM).,2026-02-13T18:03:00.000Z,how-to,,0.4,False,Preparing SUSE images is a CLI tutorial; no clear indication of configuration parameter reference tables or quotas.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-manage-extension?view=azloc-2603,Manage VM extensions,Manage VM Extensions on Azure Local VMs enabled by Azure Arc on Azure Local - Azure Local,,Learn how to enable guest management and then install and manage extensions on Azure Local VMs via the Azure portal.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to install and manage virtual machine (VM) extensions on Azure Local via the Azure portal. The VM extensions on your Azure Local VMs enabled by Azure Arc are useful for post-deployment configuration, software installation, or other management tasks. To install VM extensions, you must enable Azure guest management on your Azure Local VMs.",2026-04-05T08:00:00.000Z,how-to,,0.45,False,"Explains how to install and manage VM extensions on Azure Local VMs via the portal. 
The summary suggests a how-to tutorial; it doesn’t clearly indicate detailed extension configuration parameter tables, limits, or error mappings that would qualify as expert configuration, integration, or troubleshooting knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-manage-image?view=azloc-2603,Manage Azure Local VM Images,Manage Azure Local VM Images via CLI or Portal - Azure Local,,"Learn how to list, view properties, and delete virtual machine images on Azure Local using Azure CLI or Azure portal.","Applies to: Hyperconverged deployments of Azure Local This article describes how to manage virtual machine (VM) images on your Azure Local instance. You can create VM images from various sources like Azure Marketplace, Azure Compute Gallery, Azure Storage accounts, or local shares. After you create images, you can list, view properties, and delete them using Azure CLI or the Azure portal.",2025-12-23T23:03:00.000Z,how-to,,0.45,False,"Managing VM images (list, view, delete) is operational guidance; likely simple CLI/portal steps without detailed config matrices or limits.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-operations?view=azloc-2603,Supported operations for VMs,Supported Operations for Azure Local Virtual Machines (VMs) Enabled by Azure Arc - Azure Local,Use supported operations for Azure Local Arc VMs,Learn about the supported virtual machine (VM) operations for Azure Local VMs enabled by Azure Arc.,"Applies to: Azure Local 2504 or later This article discusses the most common operations for Azure Local virtual machines (VMs) enabled by Azure Arc. 
The article identifies the operations that are supported on Azure Local VMs, along with the operations that you need to avoid to prevent complications.",2026-02-24T23:03:00.000Z,concept-article,best-practices,0.65,True,"Article explicitly calls out supported and unsupported VM operations to avoid complications, which are product-specific DO/DON’T guidelines.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-activate?view=azloc-2603,Activate Windows Server VMs,Activate Windows Server VMs on Azure Local - Azure Local,Configure Windows Server VM activation on Azure Local,This article explains the benefits of using Automatic Virtual Machine Activation and provides instructions on setting up this optional feature on Azure Local.,"Applies to: Azure Local 2311.1 and later; Windows Server 2025, Windows Server 2022, Windows Server 2019 Datacenter Edition and later Windows Server virtual machines (VMs) must be activated before you can use them on Azure Local. You can use any existing Windows Server activation methods that you already have. Optionally, Azure Local offers an addon subscription and tools to help simplify this process. 
This article describes Windows Server VM activation concepts and the options that are available",2026-01-05T23:02:00.000Z,how-to,configuration,0.6,True,Activation article describes specific activation options and setup steps (including optional subscription add-on) that are product-specific configuration patterns.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-affinity?view=azloc-2603,Create VM affinity rules,Set up VM affinity rules using Windows PowerShell - Azure Local,Configure VM affinity and anti-affinity rules on Azure Local,Learn how to set up VM affinity rules using Windows PowerShell,"Applies to: Azure Local 2311.2 and later Using either Windows Admin Center or Windows PowerShell, you can easily create affinity and anti-affinity rules for virtual machines (VMs) in your Azure Local instance. Note The recommended way to create and manage VMs on Azure Local is using the Azure Arc control plane. However, since the functionality described in this article isn't yet provided by Azure Arc, you can use Windows Admin Center or PowerShell as described in this article. The VMs created thi",2025-12-23T23:03:00.000Z,how-to,configuration,0.68,True,"Affinity rules via PowerShell/Admin Center are configuration-focused and product-specific; likely includes setting names, allowed values, and examples for Azure Local clusters.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-load-balancing?view=azloc-2603,VM load balancing,Virtual machine load balancing - Azure Local,Configure virtual machine load balancing on Azure Local,Use this article to learn how to configure the VM load balancing feature in Azure Local and Windows Server.,"Applies to: Azure Local 2311.2 and later; Windows Server 2025, Windows Server 2022, Windows Server 2019, Windows Server 2016 Note The recommended way to create and manage VMs on Azure Local is using the Azure Arc control plane. 
However, since the functionality described in this article isn't yet provided by Azure Arc, you can use Windows Admin Center or PowerShell as described in this article. The VMs created this way aren't enabled by Azure Arc, have limited manageability from the Azure Arc cont",2025-12-23T23:03:00.000Z,how-to,configuration,0.7,True,"Article is about configuring VM load balancing; likely includes specific settings, parameters, and possibly tables for Azure Local/Windows Server load balancing behavior.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-powershell?view=azloc-2603,Manage VMs with PowerShell,Manage VMs using Windows PowerShell on Azure Local - Azure Local,,How to manage virtual machines on Azure Local using Windows PowerShell,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019 This article describes how to create and manage virtual machines (VMs) on Azure Local using Windows PowerShell. Note The recommended way to create and manage VMs on Azure Local is using the Azure Arc control plane. However, since the functionality described in this article isn't yet provided by Azure Arc, you can use Windows Admin Center or PowerShell as described in this article. The VMs created this way aren't en",2026-04-05T08:00:00.000Z,how-to,,0.3,False,"Article describes how to create and manage VMs on Azure Local using PowerShell; likely a procedural tutorial with commands, but prompt summary does not indicate structured configuration tables, limits/quotas, or product-specific troubleshooting mappings. 
Appears to be general how-to guidance rather than expert reference data.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/vm?view=azloc-2603,Manage VMs,Manage VMs with Windows Admin Center on Azure Local - Azure Local,,Learn how to create and manage virtual machines on Azure Local using Windows Admin Center.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019 Windows Admin Center can be used to create and manage your virtual machines (VMs) on Azure Local. Note The recommended way to create and manage VMs on Azure Local is using the Azure Arc control plane. However, since the functionality described in this article isn't yet provided by Azure Arc, you can use Windows Admin Center or PowerShell as described in this article. The VMs created this way aren't enabled by Azure",2025-12-23T23:03:00.000Z,how-to,,0.45,False,How-to for creating/managing VMs via Windows Admin Center; likely step-by-step UI operations without detailed config parameter tables or numeric limits.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/manage/windows-server-azure-edition-23h2?view=azloc-2603,Deploy Windows Server Azure Edition VMs,"Deploy Windows Server Azure Edition VMs on Azure Local, version 23H2 - Azure Local",,"Learn how to deploy Windows Server Azure Edition VMs on Azure Local, version 23H2 starting with an image in Azure Local Marketplace or Azure Marketplace.",Applies to: Hyperconverged deployments of Azure Local The Windows Server Azure Edition operating system can be deployed as a guest virtual machine (VM) on Azure Local 2311.2 or later. This article describes how to deploy and hotpatch Windows Server Azure Edition VMs starting with an image in Azure Local marketplace or an image in Azure Marketplace. Note Both Azure Local VMs enabled by Azure Arc and unmanaged VMs are supported. 
Azure Local is the only on-premises platform to run Windows Server Az,2025-12-23T23:03:00.000Z,install-set-up-deploy,,0.5,False,"Deploying Windows Server Azure Edition VMs is a deployment/tutorial article; summary doesn’t clearly show limits, config matrices, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-azure-migrate?view=azloc-2603,"Migrate, verify",Migrate Hyper V VMs to Azure Local using Azure Migrate (preview) - Azure Local,,Learn about how to migrate Windows and Linux VMs to your Azure Local instance using Azure Migrate (preview).,"Applies to: Azure Local 2503 and later This article describes how to migrate Hyper-V virtual machines (VMs) to Azure Local using Azure Migrate and includes the steps to verify the migration. Important This feature is currently in PREVIEW. -See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-01-22T23:03:00.000Z,how-to,,0.4,False,"Step-by-step migration walkthrough for Hyper-V VMs; likely procedural without detailed limits, configs tables, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-enable-guest-management?view=azloc-2603,Enable guest management,Enable guest management for migrated VMs (preview) - Azure Local,Enable guest management on Azure Local migrated VMs,Learn how to enable guest management for migrated VMs (preview).,"Applies to: Azure Local 2503 and later This article describes how to enable guest management for VMs that have been migrated to Azure Local using Azure Migrate. Enabling guest management allows you to manage in guest OS settings and install and manage Azure extensions on these VMs. This article is only for VMs that have been migrated using Azure Migrate. For more information on how to enable guest management in other scenarios, see Manage Azure Local VMs. 
The output properties may vary depending o",2026-01-22T23:03:00.000Z,how-to,configuration,0.7,True,Enabling guest management involves specific settings and possibly extension configuration parameters that are product-specific configuration knowledge.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-faq?view=azloc-2603,FAQ,Azure Migrate FAQ - Azure Local,Troubleshoot Azure Local VM migration with Azure Migrate,This FAQ provides answers to common questions about the migration of a Hyper-V virtual machine (VM) to an Azure Local instance using Azure Migrate.,"The Azure Migrate based solution enables you to migrate VMs from Hyper-V (Preview) and VMware to an Azure Local instance. This FAQ answers questions you might have about the migration of a VM from a Hyper-V or a VMware VM to an Azure Local instance using Azure Migrate. -Tabs have questions about VMware and Hyper-V VMs, VMware VMs only, and Hyper-V VMs only. The migration of Hyper-V VMs is in preview.",2026-03-26T17:03:00Z,faq,troubleshooting,0.7,True,"An FAQ for a specific migration scenario (Hyper-V and VMware to Azure Local) typically includes concrete answers about supported/unsupported scenarios, specific error messages or behaviors, and how to resolve them. 
While not labeled as troubleshooting, such FAQs often map symptoms and questions to causes and resolutions, which aligns best with the troubleshooting sub-skill.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-hyperv-prerequisites?view=azloc-2603,Complete prerequisites,Prerequisites for Hyper-V VM migration to Azure Local using Azure Migrate (preview) - Azure Local,Complete prerequisites for Hyper-V migration to Azure Local,Learn prerequisites for Hyper-V migration to Azure Local using Azure Migrate (preview).,"Applies to: Azure Local 2503 and later This article describes the prerequisite tasks you need to complete before you begin the process to migrate Hyper-V virtual machines (VMs) to Azure Local. Make sure to review the requirements for migration if you haven't already. Important This feature is currently in PREVIEW. -See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availabili",2026-03-26T17:03:00.000Z,how-to,configuration,0.65,True,"Prerequisites article for a migration workflow usually lists concrete configuration steps, required settings, and environment preparations (accounts, permissions, ports, agents) that are specific to Azure Migrate and Azure Local.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-hyperv-replicate?view=azloc-2603,"Discover, replicate",Discover and replicate Hyper-V VMs for migration to Azure Local using Azure Migrate (preview) - Azure Local,Discover and replicate Hyper-V VMs with Azure Migrate to Azure Local,Learn the discovery and replication process for Hyper-V VMs to Azure Local using Azure Migrate (preview).,"Applies to: Azure Local 2503 and later This article describes the discovery and replication phase for Hyper-V virtual machine (VM) migration to Azure Local using Azure Migrate. Important This feature is currently in PREVIEW. 
-See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability. For more information on appliances for Azure Migrate and how to manage them, see Azure",2026-03-25T08:00:00.000Z,how-to,integrations,0.66,True,"Discovery and replication phase documentation generally includes appliance configuration, connection settings, and replication parameters (frequency, retention, credentials) that are specific to integrating Hyper-V with Azure Migrate and Azure Local.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-hyperv-requirements?view=azloc-2603,Review requirements,Review requirements for Hyper-V VM migration to Azure Local using Azure Migrate (preview) - Azure Local,System requirements for Hyper-V migration to Azure Local,Learn the system requirements for Hyper-V migration to Azure Local using Azure Migrate (preview).,"Applies to: Azure Local 2503 and later This article lists the system requirements for migrating Hyper-V virtual machines (VMs) to Azure Local using Azure Migrate. Important This feature is currently in PREVIEW. 
-See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-03-26T17:03:00.000Z,how-to,configuration,0.68,True,"Requirements article typically includes specific supported versions, network/port requirements, capacity constraints, and configuration parameters for Azure Migrate Hyper-V migration, which are product-specific expert details.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-maintain-ip-addresses?view=azloc-2603,Maintain static IP addresses,Maintain static IP addresses during migration - Azure Local,Preserve static IP addresses during Azure Local VM migration,Learn how to maintain static IP addresses for VMs during migration.,Applies to: Azure Local 2503 and later This article explains how to preserve static IP addresses during virtual machine (VM) migration to Azure Local using Azure Migrate. It provides detailed instructions for both Software Defined Networking (SDN) enabled and non-SDN enabled Azure Local environments. 
This article applies to migration of Hyper-V VMs (Preview) and VMware VMs.,2025-12-23T23:03:00.000Z,how-to,configuration,0.7,True,Explains how to preserve static IPs in SDN and non-SDN environments; likely includes concrete network configuration steps and parameter values.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-troubleshoot?view=azloc-2603,Troubleshoot,Troubleshoot issues when migrating VMs to Azure Local using Azure Migrate - Azure Local,Troubleshoot Azure Local VM migrations with Azure Migrate,Learn about how to troubleshoot issues when migrating Windows VMs to your Azure Local instance using Azure Migrate.,Applies to: Azure Local 2503 and later This article describes how to troubleshoot any potential issues that you may experience when migrating Hyper-V (Preview) and VMware VMs to your Azure Local using Azure Migrate.,2025-12-23T23:03:00.000Z,how-to,troubleshooting,0.9,True,"Explicit troubleshooting article; likely organized by symptom and includes specific errors, causes, and resolutions for Azure Local migrations.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-via-powershell?view=azloc-2603,Migrate via PowerShell,Migrate VMs to Azure Local with Azure Migrate using PowerShell - Azure Local,Migrate Azure Local VMs with Azure Migrate PowerShell,Learn how to migrate VMs to Azure Local with Azure Migrate using PowerShell.,Applies to: Azure Local 2503 and later This article describes how to migrate virtual machines (VMs) to Azure Local with Azure Migrate using PowerShell. 
This article applies to migration of Hyper-V VMs (Preview) and VMware VMs.,2026-01-22T18:08:00.000Z,how-to,integrations,0.65,True,"PowerShell-based migration article will reference specific cmdlets, parameters, and required values unique to Azure Migrate and Azure Local, fitting integration/coding patterns.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-vmware-migrate?view=azloc-2603,"Migrate, verify",Migrate VMware VMs to Azure Local using Azure Migrate - Azure Local,,Learn about how to migrate VMware VMs to your Azure Local instance using Azure Migrate.,Applies to: Azure Local 2503 and later This article describes how to migrate the VMware virtual machines (VMs) to Azure Local using Azure Migrate and includes the steps to verify the migration.,2026-01-22T23:03:00.000Z,how-to,,0.45,False,"Migration how-to for VMware VMs; mostly step-by-step operations rather than expert-only limits, configs, or error catalogs.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-vmware-prerequisites?view=azloc-2603,Complete prerequisites,Prerequisites for VMware VM migration to Azure Local using Azure Migrate - Azure Local,,Learn prerequisites for VMware migration to Azure Local using Azure Migrate.,Applies to: Azure Local 2503 and later This article describes the prerequisite tasks you need to complete before you begin the process to migrate VMware virtual machines (VMs) to Azure Local. Make sure to review the requirements for migration if you haven't already.,2026-03-26T17:03:00.000Z,how-to,,0.4,False,"A 'prerequisites' article for a migration workflow usually lists tasks and high-level setup steps (permissions, accounts, enabling features). 
While it may reference some specifics, it is primarily procedural and not a structured list of limits, configuration matrices, or error mappings that match any sub-skill definition.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-vmware-replicate?view=azloc-2603,"Discover, replicate",Discover and replicate VMware VMs for migration to Azure Local using Azure Migrate - Azure Local,,Learn the discovery and replication process for VMware VMs to Azure Local using Azure Migrate.,"Applies to: Azure Local 2503 and later This article describes the discovery and replication phase for VMware virtual machine (VM) migration to Azure Local using Azure Migrate. For more information on appliances for Azure Migrate and how to manage them, see Azure Migrate appliance.",2026-03-26T17:03:00.000Z,how-to,,0.45,False,"A 'discovery and replication process' article is likely a step-by-step workflow/tutorial for using Azure Migrate with VMware VMs. The summary does not indicate structured limits, configuration parameter tables, or troubleshooting mappings; it appears more procedural than expert-reference in nature.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-vmware-requirements?view=azloc-2603,Review requirements,Review requirements for VMware VM migration to Azure Local using Azure Migrate - Azure Local,System requirements for VMware migration to Azure Local,Learn the system requirements for VMware migration to Azure Local using Azure Migrate.,Applies to: Azure Local 2503 and later This article lists the system requirements for migrating VMware virtual machines (VMs) to Azure Local by using Azure Migrate.,2026-03-26T17:03:00.000Z,how-to,limits-quotas,0.78,True,"A 'requirements' article for a specific migration path typically enumerates exact supported versions, minimum CPU/RAM, network ports, and other hard constraints that function as product-specific limits/quotas (for example, supported vCenter/ESXi versions, minimum 
appliance sizing, port numbers). These are concrete numeric and version limits that an LLM would not reliably know from training.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-whats-new?view=azloc-2603,What's new in VM migration?,What's new in Azure Migrate for Azure Local - Azure Local,,Learn about new features in Azure Migrate for Azure Local.,This article lists the various features and improvements that are available in virtual machine (VM) migration to Azure Local (formerly Azure Stack HCI). This article applies to both Hyper-V (Preview) and VMware VM migrations. Applies to: Azure Local 2503 and later,2026-03-26T17:03:00.000Z,how-to,,0.2,False,"“What’s new” feature list for Azure Migrate; primarily release notes/overview of new capabilities without detailed limits, configs, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/migrate/migration-azure-migrate-overview?view=azloc-2603,Overview,Use Azure Migrate to move Hyper-V VMs to Azure Local (preview) - Azure Local,,Learn about how to use Azure Migrate to migrate Windows and Linux VMs to your Azure Local instance (preview).,"Applies to: Azure Local 2503 and later This article provides an overview of how to migrate Hyper-V virtual machines (VMs) to your Azure Local instance using Azure Migrate. Azure Migrate is a central hub for tools to discover, assess, and migrate on-premises servers, apps, and data to the Microsoft Azure cloud. Azure Local is a hyperconverged infrastructure (HCI) system solution that hosts virtualized Windows and Linux workloads in a hybrid environment. 
You can use the Azure Migrate platform to m",2025-12-23T23:03:00.000Z,overview,,0.4,False,"High-level overview of using Azure Migrate with Azure Local; detailed steps and configs are likely in linked articles, not here.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/migrate/migration-azure-migrate-vmware-overview?view=azloc-2603,Overview,Use Azure Migrate to move VMware VMs to Azure Local - Azure Local,,Learn about how to use Azure Migrate to migrate VMware VMs to your Azure Local instance.,"Applies to: Azure Local 2503 and later This article provides an overview of how to migrate VMware virtual machines (VMs) to your Azure Local instance using Azure Migrate. Azure Migrate is a central hub for tools to discover, assess, and migrate on-premises servers, apps, and data to the Microsoft Azure cloud. Azure Local is a hyperconverged infrastructure system solution that hosts virtualized Windows and Linux workloads in a hybrid environment. You can use the Azure Migrate platform to move on-",2025-12-23T23:03:00.000Z,overview,,0.2,False,High-level overview of using Azure Migrate with VMware; primarily conceptual and descriptive.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/migrate/migration-options-overview?view=azloc-2603,Migration overview,Options for migrating virtual machines to Azure Local - Azure Local,Choose migration options for VMs to Azure Local,Learn about the available migration options for migrating VM workloads to your Azure Local.,Applies to: Azure Local 2503 and later This article provides an overview of the options available for migrating virtual machine (VM) workloads to your Azure Local (formerly Azure Stack HCI) instance.,2025-12-23T23:03:00.000Z,overview,decision-making,0.65,True,"Compares available VM migration options to Azure Local and guides selection based on scenarios, representing product-specific migration decision guidance.",unchanged 
-https://learn.microsoft.com/en-us/azure/azure-local/migrate/monitor-migration?view=azloc-2603,Monitor Azure Local migrations,Monitor Azure Local Migrations Using Diagnostic Settings in Azure Migrate - Azure Local,Configure diagnostic settings to monitor Azure Local migrations,Learn how to monitor Azure Local migrations using diagnostic settings in Azure Migrate.,"Applies to: Azure Local 2503 and later This article describes how to enable diagnostic settings in Azure Migrate via the Azure portal to help monitor Azure Local migrations. Diagnostic logs provide detailed and frequent data about resource operations, helping in monitoring, troubleshooting, and auditing. For more information, see Diagnostic settings in Azure Monitor. To enable diagnostic settings in Azure Migrate via PowerShell or the Azure CLI, see Collect and consume log data from your Azure res",2025-12-23T23:03:00.000Z,how-to,configuration,0.65,True,"Covers enabling diagnostic settings with specific categories, destinations, and configuration options for Azure Migrate, which are product-specific config details.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-assign-vm-rbac-roles?view=azloc-2603,Assign RBAC roles,Use built-in RBAC roles to manage Azure Local VMs for multi-rack deployments (preview) - Azure Local,Use built-in RBAC roles for Azure Local multi-rack VMs,Learn how to use RBAC built-in roles to manage Azure Local VMs for multi-rack deployments (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to use Role-based Access Control (RBAC) to control access to Azure Local virtual machines (VMs) for multi-rack deployments. You can use the RBAC roles to control access to VMs and VM resources such as virtual disks, network interfaces, VM images, logical networks, and virtual networks. You can assign these roles to users, groups, service principals, and managed identities. 
Important This feature is curren",2025-12-23T23:03:00.000Z,how-to,security,0.85,True,"Describes RBAC role usage for VM and resource access; will list specific built-in role names and scopes, which is product-specific security configuration.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-azure-arc-vm-management-overview?view=azloc-2603,Azure Local VMs for multi-rack deployments,What is Azure Local VM Management for Multi-rack Deployments (preview)? - Azure Local,,Learn about using Azure Local VM management to provision and manage on-premises Windows and Linux virtual machines (VMs) in Azure Local multi-rack deployments (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article provides a brief overview of the Azure Local virtual machine (VM) management feature on Azure Local for multi-rack deployments, including benefits, components, and a high-level workflow. Azure Local VM management enables IT admins to provision and manage Windows and Linux VMs hosted in an on-premises Azure Local environment. IT admins can use the feature to create, modify, delete, and assign permissions and roles to ap",2025-12-23T23:03:00.000Z,how-to,,0.25,False,"High-level overview of Azure Local VM management; focuses on benefits, components, and workflow, not on specific configuration parameters or limits.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-cli-extensions?view=azloc-2603,Install Azure CLI extensions,Install CLI extensions for multi-rack deployments of Azure Local (preview) - Azure Local,Install Azure CLI extensions for Azure Local multi-rack,Learn how to install the needed Azure CLI extensions for multi-rack deployments of Azure Local (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article explains how to install the required Azure CLI extensions for multi-rack deployments of Azure Local. Important This feature is currently in PREVIEW. 
-See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-02-25T23:03:00.000Z,how-to,configuration,0.65,True,Explains required CLI extensions and versions for multi-rack; includes specific extension names and installation commands.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-concepts-compute?view=azloc-2603,Compute for multi-rack deployments,Compute for Multi-rack Deployments of Azure Local (preview) - Azure Local,,Get an overview of compute resources for multi-rack deployments of Azure Local (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article provides an overview of compute resources for multi-rack deployments of Azure Local. Important This feature is currently in PREVIEW. -See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-02-25T23:03:00.000Z,concept-article,,0.2,False,"Compute overview for multi-rack deployments; appears conceptual without explicit limits tables, configuration parameter tables, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-configure-layer-3-isolation-domain?view=azloc-2603,Create and manage L3 isolation domains,Manage Layer 3 Isolation Domains for Azure Local Multi-rack Deployments (preview) - Azure Local,Configure Layer 3 isolation domains for Azure Local multi-rack,Learn how to manage Layer 3 isolation domains for Azure Local multi-rack deployments (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to create, modify, or delete Layer 3 (L3) isolation domains for multi-rack deployments of Azure Local. 
Isolation domains enable network connectivity between workloads hosted in the same rack (intra-rack communication) or different racks (inter-rack communication) and with endpoints external to Azure Local. You can create, update, delete, and check the status of your L3 isolation domains by using the Azure",2026-02-02T18:06:00.000Z,how-to,configuration,0.75,True,"Managing L3 isolation domains involves specific network configuration parameters and allowed values, which are product-specific configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-connect-arc-vm-using-ssh?view=azloc-2603,Connect to Azure Local VMs via SSH,Connect to an Azure Local virtual machine (VM) via SSH for multi-rack deployments (preview) - Azure Local,Connect to Azure Local multi-rack Windows VMs via SSH,Learn how to use SSH to connect to an Azure Local VM for multi-rack deployments (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to connect to a Windows-based Azure Local virtual machine (VM) using Secure Shell (SSH) and Remote Desktop (RDP) over SSH for multi-rack deployments. The example demonstrates how to enable the OpenSSH Server via the Azure Arc extension using the Azure portal and Azure CLI. This article applies only to Windows VMs. Important This feature is currently in PREVIEW. 
-See the Supplemental Terms of Use for Microso",2026-02-25T23:03:00.000Z,how-to,integrations,0.7,True,"Details enabling OpenSSH via Arc extension and using SSH/RDP over SSH; includes extension configuration and SSH parameters, an integration pattern.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-arc-virtual-machines?view=azloc-2603,Create Azure Local VMs,Create Azure Local Virtual Machines Enabled by Azure Arc on Multi-rack Deployments (preview) - Azure Local,Create Azure Arc-enabled VMs on Azure Local multi-rack,Learn how to view your Azure Local multi-rack deployment in the Azure portal and create Azure Local virtual machines enabled by Azure Arc (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to create Azure Local virtual machines (VMs) enabled by Azure Arc, using the VM images that you created on multi-rack deployments of Azure Local. You can create Azure Local VMs using the Azure CLI, Azure portal, or Azure Resource Manager (ARM) template. Note Arc gateway isn't supported on Azure Local VMs. Important This feature is currently in PREVIEW. 
-See the Supplemental Terms of Use for Microsoft Azure ",2026-02-25T23:03:00.000Z,how-to,configuration,0.7,True,"Covers creating VMs via CLI/portal/ARM with Arc enablement; includes specific resource types, properties, and configuration fields.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-internal-load-balancer-virtual-networks?view=azloc-2603,Create internal load balancer for VNETs,Create and Manage an Internal Load Balancer on Multi-Rack Deployments for Azure Local (Preview) - Azure Local,Create internal load balancer via Azure CLI for Azure Local multi-rack,Learn to create and configure internal load balancers for Azure Local multi-rack deployments (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to create an internal load balancer on multi-rack deployments for Azure Local using Azure CLI. Important This feature is currently in PREVIEW. -See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2025-12-23T23:03:00.000Z,concept-article,integrations,0.65,True,"How-to article for creating/configuring internal load balancers using Azure CLI on Azure Local multi-rack. 
Likely includes specific CLI parameters, flags, and resource property names unique to this product, which qualify as product-specific integration/coding patterns rather than just conceptual guidance.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-load-balancer-logical-network?view=azloc-2603,Create public load balancer for LNETs,Create Load Balancers on Logical Networks using Azure CLI in Multi-Rack Deployments of Azure Local (preview) - Azure Local,Create load balancers on logical networks with Azure CLI,Learn how to create load balancers on logical networks using Azure CLI in multi-rack deployments of Azure Local (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to create load balancers on logical networks using Azure CLI in multi-rack deployments of Azure Local. Important This feature is currently in PREVIEW. -See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-01-28T23:03:00.000Z,how-to,integrations,0.65,True,"Describes creating load balancers on logical networks in Azure Local multi-rack using Azure CLI. This typically involves product-specific CLI commands, parameter names, and configuration patterns that go beyond generic knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-logical-networks?view=azloc-2603,Create logical network,Create logical networks for Azure Local VMs for multi-rack deployments (preview) - Azure Local,Create logical networks for Azure Local multi-rack VMs,Learn how to create logical networks on Azure Local VMs for multi-rack deployments (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to create or add logical networks for Azure Local virtual machines (VMs) for multi-rack deployments. 
Azure Local VMs that you create use these logical networks. Note Azure Local VMs support only IPv4 addresses. IPv6 addresses aren't supported. Important This feature is currently in PREVIEW. -See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are i",2025-12-23T23:03:00.000Z,how-to,configuration,0.75,True,"Logical network creation for multi-rack deployments includes specific network settings (address spaces, subnets, constraints) that are configuration knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-network-interfaces?view=azloc-2603,Create network interfaces,Create network interfaces for Azure Local VMs for multi-rack deployments (preview) - Azure Local,Create network interfaces for Azure Local multi-rack VMs,Learn how to create network interfaces on an existing logical network associated with your Azure Local for multi-rack deployments. The Azure Local VM enabled by Azure Arc uses these network interfaces,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to create network interfaces that you can associate with an Azure Local virtual machine (VM) for multi-rack deployments. You can create network interfaces using the Azure portal or Azure Command-Line Interface (CLI). Important This feature is currently in PREVIEW. 
-See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwi",2026-02-25T23:03:00.000Z,how-to,configuration,0.7,True,"NIC creation on logical networks involves specific configuration parameters (IP allocation, subnet, NSG association) unique to this environment.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-network-security-groups?view=azloc-2603,Create NSGs,Create network security groups on Azure Local VMs on multi-rack deployments (preview) - Azure Local,Configure network security groups for Azure Local multi-rack VMs,Learn how to create network security groups on Azure Local VMs on multi-rack deployments using the Azure CLI (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article explains how to create and configure network security groups (NSGs) to manage data traffic flow on a multi-rack deployment of your Azure Local. Important This feature is currently in PREVIEW. -See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-02-25T23:03:00.000Z,how-to,security,0.75,True,"NSG creation and configuration is security-focused; article will list rule properties, directions, and ports specific to Azure Local multi-rack networking.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-public-ip?view=azloc-2603,Create public IP addresses,Create Public IP Addresses on Multi-rack Deployments of Azure Local (preview) - Azure Local,Create public IP resources on Azure Local multi-rack,Learn how to create public IP resources on multi-rack deployments of Azure Local (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to create public IP resources on multi-rack deployments of Azure Local. 
A public IP in a multi-rack deployment of Azure Local represents an externally routable IP address resource. Unlike Azure public IP addresses, which are always internet-routable, a public IP on multi-rack deployments can be configured with any IP address that's routable within your network or, optionally, internet-facing. This resourc",2025-12-23T23:03:00.000Z,how-to,integrations,0.65,True,"Covers creating public IP resources in Azure Local multi-rack, which differ from standard Azure public IPs. Likely includes specific CLI/API parameters and configuration options unique to this environment, fitting integrations/coding patterns.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-public-load-balancer-virtual-networks?view=azloc-2603,Create public load balancer for VNETs,Create Public Load Balancer on Virtual Networks for Multi-Rack Deployments of Azure Local - Azure Local,Create public load balancers on Azure Local multi-rack virtual networks,Learn how to create and manage a public Load Balancer on Azure Local for multi-rack deployments. Distribute inbound traffic efficiently across virtual machines.,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to create a public load balancer on a multi-rack deployment of Azure Local using the Azure Command-line Interface (CLI). Important This feature is currently in PREVIEW. 
-See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2025-12-23T23:03:00.000Z,how-to,configuration,0.7,True,"Describes creating load balancers via CLI with specific configuration parameters (frontend IPs, backend pools, rules) unique to Azure Local multi-rack.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-virtual-networks?view=azloc-2603,Create virtual network,Create virtual networks for multi-rack deployments of Azure Local (preview) - Azure Local,Configure virtual networks for Azure Local multi-rack deployments,Learn how to create virtual networks for multi-rack deployments of Azure Local (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article provides an overview of virtual networks (VNets) for multi-rack deployments of Azure Local and shows how to create one. Important This feature is currently in PREVIEW. -See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2025-12-23T23:03:00.000Z,how-to,configuration,0.7,True,"VNet creation article will detail parameters like address ranges, associations, and constraints specific to multi-rack Azure Local.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-disk-snapshot?view=azloc-2603,Create and restore data disk snapshots,Create and Restore Data Disk Snapshots of Azure Local - Azure Local,Create and restore data disk snapshots on Azure Local multi-rack,Learn how to create snapshots from a data disk and restore a new disk from a snapshot on multi-rack deployments of Azure Local.,Disk snapshots let you capture a point-in-time copy of a data disk so that you can recover data or quickly provision new disks from a known-good state. 
This article shows you how to create a snapshot from an existing data disk and restore a new disk from that snapshot on Azure Local. Note This article covers data disk snapshots only. This release doesn't include OS disk snapshot and restore.,2026-03-12T22:05:00.000Z,how-to,configuration,0.7,True,"Snapshot operations include product-specific capabilities and constraints (data disk only, no OS disk) and likely CLI/portal parameters.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-load-balancer-overview?view=azloc-2603,Load balancer for multi-rack deployments,About Load Balancers in Multi-Rack Deployments of Azure Local (preview) - Azure Local,,Learn about the types of load balancers you can use in multi-rack deployments of Azure Local (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes the different types of load balancers you can use in multi-rack deployments of Azure Local. Important This feature is currently in PREVIEW. -See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2025-12-23T23:03:00.000Z,concept-article,,0.25,False,Load balancer overview for multi-rack; describes types of load balancers. 
Summary suggests conceptual description rather than detailed configuration or limits.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-manage-arc-virtual-machine-resources?view=azloc-2603,Manage Azure Local VM resources,Manage resources for Azure Local VMs for multi-rack deployments (preview) - Azure Local,Manage disks and NIC resources for Azure Local multi-rack VMs,Learn how to manage resources like data disks and network interfaces on an Azure Local VM for multi-rack deployments (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to manage the VM resources for an Azure Local VM for multi-rack deployments. After you deploy Azure Local virtual machines (VMs) enabled by Azure Arc, you may need to add or delete resources like data disks. Note You can't add or delete network interfaces after the VM is created. If more than one network interface is needed, make sure to add them during VM creation. Important This feature is currently in ",2026-02-25T23:03:00.000Z,how-to,configuration,0.7,True,"Resource management article with constraints (e.g., cannot add NICs after creation) and disk operations; these are product-specific configuration behaviors.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-manage-arc-virtual-machines?view=azloc-2603,Manage Azure Local VMs,Manage Azure Local VMs for Multi-Rack Deployments (preview) - Azure Local,Manage Azure Arc-enabled VMs on Azure Local multi-rack,"Learn how to manage Azure Local VMs enabled by Azure Arc. This includes operations such as start, stop, restart, view properties of Azure Local VMs for multi-rack deployments (preview).","Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to manage Azure Local virtual machines (VMs) enabled by Azure Arc for multi-rack deployments. 
The procedures to start, stop, restart, and delete an Azure Local VM, and manage guest management are detailed. Important This feature is currently in PREVIEW. -See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet ",2025-12-23T23:03:00.000Z,how-to,configuration,0.65,True,Describes VM lifecycle operations and guest management; includes specific management options and possibly required settings.,unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-monitor-cluster-with-metrics?view=azloc-2603,Monitor cluster metrics,Monitor Multi-rack Deployments of Azure Local with Azure Monitor Metrics (preview) - Azure Local,Configure Azure Monitor metrics for Azure Local multi-rack,Learn how to monitor multi-rack deployments of Azure Local with Azure Monitor Metrics (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to monitor your multi-rack deployments of Azure Local with Azure Monitor Metrics. It also describes the Performance Metrics dashboard and lists metrics collected for compute, storage, and network resources in multi-rack deployments of Azure Local. When you have critical applications and business processes that rely on Azure resources, it's important to monitor those resources for their availability, perfor",2025-12-23T23:03:00.000Z,how-to,configuration,0.7,True,"Describes monitoring with Azure Monitor Metrics and lists metrics collected for compute, storage, and network. 
Such metric listings typically include metric names, dimensions, and possibly units, which are product-specific configuration/monitoring details.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-monitor-overview?view=azloc-2603,Monitoring overview,Overview of Azure Local Monitoring for Multi-rack Deployments (preview) - Azure Local,,This article provides an overview of the Azure Local monitoring solution for multi-rack deployments (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article provides an overview of monitoring for multi-rack deployments of Azure Local. Monitoring multi-rack deployments of Azure Local involves regular collection and analysis of data from all components of your system to promptly identify and address any potential issues. Routine monitoring is crucial for maintaining the health and functionality of your system. To understand current performance patterns, identify performance ",2026-02-25T23:03:00.000Z,overview,,0.2,False,"Monitoring overview article; describes concepts and importance of monitoring without clear indication of specific metrics tables, limits, or configuration parameters that meet the expert-knowledge criteria.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-nat-gateway-overview?view=azloc-2603,NAT gateway for multi-rack deployments,Overview of NAT Gateway on Multi-Rack Deployments of Azure Local (preview) - Azure Local,,Learn about NAT gateway on multi-rack deployments of Azure Local (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article provides an overview of Network Address Translation (NAT) gateway on multi-rack deployments of Azure Local. Important This feature is currently in PREVIEW. 
-See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-01-07T23:02:00.000Z,overview,,0.25,False,NAT gateway overview; likely conceptual explanation of NAT gateway behavior in multi-rack deployments without detailed numeric limits or config tables.,unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-network-fabric-overview?view=azloc-2603,Network fabric for multi-rack deployments,Network Fabric Overview For Azure Local Multi-Rack Deployments (preview) - Azure Local,,Learn about network fabric resources for Azure Local multi-rack deployments (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes the capabilities of the network fabric used for infrastructure management for multi-rack deployments of Azure Local. The article also covers the workload networking required for these deployments. The network fabric instance is a single deployed physical network infrastructure - including racks, switches, terminal server connections, and cabling - that Azure represents and manages as a Network Fabric (NF) res",2025-12-23T23:03:00.000Z,concept-article,,0.25,False,"Network fabric overview; primarily describes capabilities and concepts. No clear evidence of numeric limits, config parameter tables, or decision matrices.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-overview?view=azloc-2603,About multi-rack deployments,What are multi-rack deployments of Azure Local? (preview) - Azure Local,Evaluate multi-rack deployments for large Azure Local datacenters,"Discover Azure Local multi-rack deployments, a new capability for deploying large on-premises datacenters with over 100 machines and 8,000 cores. 
Learn how to get started (preview).","Applies to: Multi-rack deployments of Azure Local 2511 and later This article provides an overview of multi-rack deployments of Azure Local. The overview also details the benefits, key features, use cases, and how to get started with the preview release. Multi-rack deployments extend the scale of Azure Local, supporting hundreds of servers across multiple racks in a single instance. Microsoft currently offers multi-rack deployments in preview for qualified opportunities. Important This feature i",2025-12-23T23:03:00.000Z,overview,decision-making,0.6,True,"Overview of multi-rack deployments with scale thresholds (over 100 machines, 8,000 cores) and use cases, helping decide when to use multi-rack.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-prerequisites?view=azloc-2603,Complete deployment prerequisites,Prerequisites for multi-rack deployments of Azure Local (preview) - Azure Local,Prerequisites for Azure Local multi-rack deployments,Review the prerequisites for multi-rack deployments of Azure Local (preview).,Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes the prerequisites for multi-rack deployments of Azure Local.,2026-01-07T23:02:00.000Z,how-to,configuration,0.7,True,"Prerequisites article for multi-rack likely lists specific hardware, network, and software configuration requirements unique to this deployment model.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-security?view=azloc-2603,Security concepts,Security Overview for Multi-rack Deployments of Azure Local (preview) - Azure Local,,Read an overview of security features for multi-rack deployments of Azure Local (preview).,This article provides an overview of security for multi-rack deployments of Azure Local. Multi-rack deployments of Azure Local are designed and built to detect and defend against the latest security threats. 
These deployments also comply with the strict requirements of government and industry security standards. The security posture of Azure Local is based on the following two principles: Use Microsoft cloud-native security tools to improve your cloud security posture and protect your workloads.,2025-12-23T23:03:00.000Z,concept-article,,0.25,False,"Security overview; describes posture and principles but does not indicate specific RBAC roles, permission scopes, or configuration values.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-serial-console?view=azloc-2603,Connect to Azure Local VMs via serial console,Connect to VM Serial Console Using Azure CLI on Azure Local - Azure Local,Use Azure CLI serial console for Azure Local multi-rack VM recovery,Learn how to connect to the serial console of an Azure Local Multi Rack VM using Azure Command-line Interface (CLI).,"This article describes how to connect to the serial console of an Azure Local virtual machine (VM) in a multi-rack deployment using the Azure CLI. Serial console provides access to a text-based console for VMs running Linux or Windows Server. It connects to VM's COM1 serial port, giving you direct console access independent of the VM's network state. This is useful for troubleshooting boot issues, fixing misconfigured networking, or recovering a VM that is otherwise unreachable via RDP or SSH. 
O",2026-03-12T22:05:00.000Z,how-to,troubleshooting,0.7,True,Serial console usage for boot/network issues is a troubleshooting technique; article likely maps certain failure modes to using serial console commands.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-virtual-machine-image-storage-account?view=azloc-2603,Create Azure Local VM images,Create Azure Local VM images for multi-rack deployments using Azure Storage account (preview) - Azure Local,Create Azure Local multi-rack VM images from Azure Storage,Learn how to create Azure Local VMs for multi-rack deployments using source images from Azure Storage account via Azure portal and Azure CLI (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to create Azure Local virtual machines (VMs) for multi-rack deployments using source images from the Azure Storage account. You can create VM images using Azure Command Line Interface (CLI) and then use these images to create Azure Local VMs. Important This feature is currently in PREVIEW. 
-See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in",2026-02-25T23:03:00.000Z,how-to,integrations,0.7,True,Shows how to use Azure Storage and CLI to create VM images; includes specific CLI parameters and integration patterns between Azure Local and Azure Storage.,unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-virtual-machine-manage-extension?view=azloc-2603,Manage Azure Local VM extensions,Manage VM extensions on Azure Local VMs for multi-rack deployments (preview) - Azure Local,Manage VM extensions on Azure Local multi-rack VMs,Learn how to enable guest management and then install and manage extensions on Azure Local VMs via Azure portal for multi-rack deployments (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to install and manage Azure Local virtual machine (VM) extensions for multi-rack deployments via Azure portal. VM extensions for Azure Local VMs enabled by Azure Arc are useful for post-deployment configuration, software installation, or other management tasks. To install VM extensions, you must enable Azure guest management on your Azure Local VMs. Important This feature is currently in PREVIEW. 
-See theS",2025-12-23T23:03:00.000Z,how-to,configuration,0.7,True,"Covers enabling guest management and installing extensions; includes extension types, settings, and constraints specific to Azure Local/Arc.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-vm-management-prerequisites?view=azloc-2603,Complete Azure Local VM prerequisites,Review prerequisites for Azure Local VMs for multi-rack deployments (preview) - Azure Local,VM requirements for Azure Local multi-rack deployments,Learn about the prerequisites for deploying Azure Local VMs for multi-rack deployments (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article lists the requirements and prerequisites for Azure Local virtual machines (VMs) for multi-rack deployments. Review the requirements and complete the prerequisites before you manage your Azure Local VMs. Important This feature is currently in PREVIEW. -See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into gen",2026-02-25T23:03:00.000Z,how-to,configuration,0.7,True,"Lists VM requirements and prerequisites (sizes, images, network constraints) for multi-rack; these are product-specific configuration details.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/oem-license?view=azloc-2603,OEM license information,OEM license for Azure Local overview - Azure Local,,"Learn about the OEM license for Azure Local, its benefits, license requirements, activation, and more.","Applies to: Hyperconverged deployments of Azure Local This article covers the OEM license for Azure Local, its benefits, license requirements, activation, and more.",2026-02-13T18:03:00.000Z,overview,,0.3,False,"OEM license overview and benefits; licensing/marketing/legal focus, not technical configuration or limits.",unchanged 
-https://learn.microsoft.com/en-us/azure/azure-local/overview/hyperconverged-overview?view=azloc-2603,About hyperconverged deployments,Overview of Hyperconverged Deployments for Azure Local - Azure Local,,"Learn the benefits, features, and use cases of Azure Local hyperconverged deployments, designed to accelerate cloud and AI innovation from edge to core.","Applies to: Hyperconverged deployments of Azure Local This article provides an overview of hyperconverged deployments of Azure Local (formerly Azure Stack HCI). The overview details the benefits, key features, use cases, and how to get started with this generally available solution. Hyperconverged deployments come in different sizes, from a single machine footprint to a maximum of 16 machines that use hyperconverged storage. They offer a unified management control plane and support a wide range ",2025-12-22T23:06:00.000Z,overview,,0.4,False,"Overview of hyperconverged deployments with benefits and use cases; mentions max 16 machines but primarily conceptual, not a focused limits/config/decision guide.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/overview?view=azloc-2603,What is Azure Local?,What Is Azure Local? Overview and Key Benefits - Azure Local,,"Learn how Azure Local accelerates cloud and AI innovation by delivering applications, workloads, and services from cloud to edge with Azure Arc as the control plane.","Azure Local is Microsoft’s distributed infrastructure solution that extends Azure capabilities to customer-owned environments. It facilitates the local deployment of both modern and legacy applications across distributed or sovereign locations. Azure Local accelerates cloud and AI innovation by seamlessly delivering new applications, workloads, and services from cloud to edge, using Azure Arc as the unifying control plane. 
The solution offers a cloud-native management experience and supports dep",2025-11-19T18:04:00.000Z,overview,,0.2,False,"High-level overview and benefits of Azure Local; no detailed limits, configs, or decision matrices.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/plan/choose-network-pattern?view=azloc-2603,Choose network reference pattern,Azure Local deployment network reference patterns - Azure Local,Choose Azure Local deployment network pattern,Select a network reference pattern for single-node and two-node Azure Local deployments.,"Applies to: Azure Local 2311.2 and later This article describes a set of network pattern references to architect, deploy, and configure Azure Local using either one, two, or three physical hosts. Depending on your needs or scenarios, you can go directly to your pattern of interest. Each pattern is described as a standalone entity and includes all the network components for specific scenarios.",2026-01-05T23:02:00.000Z,overview,architecture-patterns,0.8,True,"Helps select among network reference patterns for one to three hosts; provides scenario-based guidance on which pattern to use, a product-specific architecture decision.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/plan/cloud-deployment-network-considerations?view=azloc-2603,Review cloud deployment network considerations,"Network considerations for cloud deployment for Azure Local, version 23H2 - Azure Local",,"This article introduces network considerations for cloud deployments of Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local This article discusses how to design and plan an Azure Local system network for cloud deployment. 
Before you continue, familiarize yourself with the various Azure Local networking patterns and available configurations.",2026-01-21T18:04:00.000Z,how-to,,0.3,False,"Planning/overview of network considerations; summary does not indicate concrete limits, config tables, or decision matrices.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/plan/configure-custom-settings-active-directory?view=azloc-2603,Configure advanced Active Directory settings,"Custom or advanced Active Directory configuration for Azure Local, version 23H2 - Azure Local",Configure custom Active Directory permissions and DNS for Azure Local,"Learn how to assign the required permissions and create the required DNS records for use by Active Directory for your Azure Local, version 23H2 system.",Applies to: Hyperconverged deployments of Azure Local This article describes the permissions and the DNS records required for the Azure Local instance deployment. The article also uses examples with detailed steps on how to manually assign permissions and create DNS records for your Active Directory environment. The Azure Local solution is deployed in large Active Directories with established processes and tools for assigning permissions. 
Microsoft provides an Active Directory preparation scriptt,2026-01-05T23:02:00.000Z,how-to,security,0.75,True,Describes required AD permissions and DNS records with detailed steps; this is product-specific security/identity configuration with concrete permission assignments and record types.,unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/plan/four-node-switchless-two-switches-two-links?view=azloc-2603,"Storage switchless, dual TOR, dual link","Azure Local four-node storage switchless, dual TOR, dual link deployment network reference pattern - Azure Local",Plan Azure Local four-node switchless dual-link network,"Plan to deploy an Azure Local four-node storage switchless, dual TOR, dual link network reference pattern.",Applies to: Azure Local 2411.1 and later This article describes how you can use a four-node storage switchless network reference pattern with two TOR L3 switches and two full-mesh links to deploy your Azure Local solution. Note Microsoft has tested and validated the four-node switchless network reference patterns described in this article.,2026-04-05T08:00:00.000Z,concept-article,architecture-patterns,0.75,True,"Four-node, dual TOR, dual-link switchless pattern is a specific, Microsoft-tested network reference architecture for Azure Local deployments.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-local/plan/network-patterns-overview?view=azloc-2603,Network reference patterns overview,Network reference patterns overview for Azure Local - Azure Local,Understand Azure Local network reference patterns,Learn about the different supported network reference patterns for Azure Local.,Applies to: Azure Local 2311.2 and later This article provides an overview of deploying network reference patterns in hyperconverged deployments of Azure Local (formerly Azure Stack HCI). A deployment consists of single-node or multiple node systems (up to 16 machines per system) that connect to one or two Top of Rack (TOR) switches. 
Those environments have the following characteristics: At least two network adapter ports dedicated for storage traffic intent. The only exception to this rule is s,2025-12-22T23:06:00.000Z,concept-article,architecture-patterns,0.7,True,"Overview of supported network reference patterns with specific deployment characteristics (nodes, TOR switches, adapter requirements); product-specific network architecture patterns.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/plan/network-patterns-sdn-considerations?view=azloc-2603,SDN considerations,Review SDN considerations for network reference patterns - Azure Local,Apply SDN considerations to Azure Local patterns,Learn about SDN considerations for network reference patterns for Azure Local.,"Applies to: Azure Local 2311.2 and later In this article, you'll review considerations when deploying Software Defined Networking (SDN) in your Azure Local instance.",2026-01-28T23:03:00.000Z,concept-article,architecture-patterns,0.7,True,Discusses SDN considerations when deploying network reference patterns; product-specific architectural guidance on when/how to use SDN with these patterns.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/plan/single-server-components?view=azloc-2603,Pattern components,Review single-server storage reference pattern components for Azure Local - Azure Local,Review single-server Azure Local network components,Learn about single-server storage reference pattern components for Azure Local.,"Applies to: Azure Local 2311.2 and later This article describes which network components are deployed for the single-server reference pattern, as shown in the following diagram:",2026-01-28T23:03:00.000Z,concept-article,configuration,0.75,True,Details which network components are deployed in the single-server pattern; configuration-level breakdown of product-specific components.,unchanged 
-https://learn.microsoft.com/en-us/azure/azure-local/plan/single-server-deployment?view=azloc-2603,Single-node deployment,Azure Local single node storage deployment network reference pattern - Azure Local,Plan single-server storage network pattern for Azure Local,Plan to deploy an Azure Local single-server storage network reference pattern.,"Applies to: Azure Local 2311.2 and later This article describes the single-server storage network reference pattern that you can use to deploy your Azure Local solution. The information in this article also helps you determine if this configuration is viable for your deployment planning needs. This article is targeted towards the IT administrators who deploy and manage Azure Local in their datacenters. For information about other network patterns, see Azure Local network deployment patterns.",2026-01-05T23:02:00.000Z,how-to,architecture-patterns,0.8,True,Describes a specific single-server storage network reference pattern and viability criteria; product-specific architecture pattern with concrete topology guidance.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/plan/single-server-ip-requirements?view=azloc-2603,Pattern IP requirements,Review single-server storage reference pattern IP requirements for Azure Local - Azure Local,Configure IP addressing for single-server Azure Local,Review single-server storage reference pattern IP requirements for Azure Local.,Applies to: Azure Local 2311.2 and later This article describes the IP requirements for deploying a single-server network reference pattern in your environment.,2026-01-07T23:02:00.000Z,feature-availability,configuration,0.85,True,"IP requirements for single-server pattern; includes specific IP ranges/allocations and counts, which are configuration parameters.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-components?view=azloc-2603,Pattern components,Review three-node storage reference pattern components for Azure Local - 
Azure Local,Review three-node Azure Local network components,Learn about three-node storage reference pattern components for Azure Local.,"Applies to: Azure Local 2311.2 and later In this article, you'll learn about which network components get deployed for three-node reference patterns, as shown below:",2026-01-28T23:03:00.000Z,concept-article,configuration,0.75,True,Describes which network components are deployed for three-node patterns; product-specific configuration breakdown.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-ip-requirements?view=azloc-2603,Pattern IP requirements,Review three-node storage reference pattern IP requirements for Azure Local - Azure Local,Configure IP addressing for three-node Azure Local,Review three-node storage reference pattern IP requirements for Azure Local,"Applies to: Azure Local 2311.2 and later In this article, learn about the IP address requirements for deploying a three-node network reference pattern in your environment.",2025-12-22T23:06:00.000Z,article,configuration,0.85,True,IP address requirements for three-node patterns; includes specific IP planning details.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-switchless-two-switches-single-link?view=azloc-2603,"Storage switchless, dual TOR, single link","Azure Local three-node storage switchless, dual TOR, single link deployment network reference pattern - Azure Local",Design Azure Local three-node switchless single-link network,"Plan to deploy an Azure Local three-node storage switchless, dual TOR, single link network reference pattern.","Applies to: Hyperconverged deployments of Azure Local In this article, learn about the three-node storage switchless with two TOR L3 switches and full-mesh single link network reference pattern that you can use to deploy your Azure Local solution. Note Microsoft has tested and validated the three-node switchless network reference patterns described in this article. 
For information on two-node switchless network patterns, see Azure Local network deployment patterns.",2026-04-05T08:00:00.000Z,concept-article,architecture-patterns,0.75,True,"Three-node, dual TOR, single-link switchless pattern is a specific, tested topology pattern for Azure Local deployments, used in planning and design decisions.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-switchless-two-switches-two-links?view=azloc-2603,"Storage switchless, dual TOR, dual link","Azure Local three-node storage switchless, dual TOR, dual link deployment network reference pattern - Azure Local",Design Azure Local three-node switchless dual-link network,"Plan to deploy an Azure Local three-node storage switchless, dual TOR, dual link network reference pattern.","Applies to: Hyperconverged deployments of Azure Local In this article, learn about the three-node storage switchless with two TOR L3 switches and two full-mesh links network reference pattern that you can use to deploy your Azure Local solution. Note Microsoft has tested and validated the three-node switchless network reference patterns described in this article. 
For information on two-node switchless network patterns, see Azure Local network deployment patterns.",2026-04-06T17:04:00.000Z,concept-article,architecture-patterns,0.75,True,"Three-node, dual TOR, dual-link switchless pattern is a validated Azure Local network architecture pattern guiding deployment design choices.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-components?view=azloc-2603,Pattern components,Review two-node storage reference pattern components for Azure Local - Azure Local,Review two-node Azure Local network components,Learn about two-node storage reference pattern components for Azure Local.,"Applies to: Azure Local 2311.2 and later In this article, you'll learn about which network components get deployed for two-node reference patterns, as shown below:",2026-01-28T23:03:00.000Z,concept-article,configuration,0.75,True,Lists which network components are deployed for two-node reference patterns; product-specific component configuration.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-ip-requirements?view=azloc-2603,Pattern IP requirements,Review two-node storage reference pattern IP requirements for Azure Local - Azure Local,Configure IP addressing for two-node Azure Local,Review two-node storage reference pattern IP requirements for Azure Local,"Applies to: Azure Local 2311.2 and later In this article, learn about the IP address requirements for deploying a two-node network reference pattern in your environment.",2026-01-07T23:02:00.000Z,feature-availability,configuration,0.85,True,IP address requirements for two-node patterns; includes specific IP allocations and constraints.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switched-converged?view=azloc-2603,"Storage switched, fully converged","Azure Local two-node storage switched, fully converged deployment network reference pattern - Azure Local",Plan two-node switched converged Azure Local network,"Plan to 
deploy an Azure Local two-node storage switched, fully converged network reference pattern.","Applies to: Azure Local 2311.2 and later In this article, you'll learn about the two-node storage switched, fully converged with two TOR switches network reference pattern that you can use to deploy your Azure Local instance solution. The information in this article will also help you determine if this configuration is viable for your deployment planning needs. This article is targeted towards the IT administrators who deploy and manage Azure Local instances in their datacenters. For information",2026-01-05T23:02:00.000Z,how-to,architecture-patterns,0.8,True,"Two-node switched, fully converged pattern; describes a specific network architecture for Azure Local deployments.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switched-non-converged?view=azloc-2603,"Storage switched, non-converged","Azure Local two-node storage switched, non-converged deployment network reference pattern - Azure Local",Plan two-node switched non-converged Azure Local network,"Plan to deploy an Azure Local two-node storage switched, non-converged network reference pattern.","Applies to: Azure Local 2311.2 and later In this article, you'll learn about the two-node storage switched, non-converged, two-TOR-switch network reference pattern that you can use to deploy your Azure Local solution. The information in this article will also help you determine if this configuration is viable for your deployment planning needs. This article is targeted towards the IT administrators who deploy and manage Azure Local in their datacenters. 
For information on other network patterns,",2026-01-05T23:02:00.000Z,how-to,architecture-patterns,0.8,True,"Two-node switched, non-converged pattern with two TOR switches; product-specific network design pattern and trade-offs.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switchless-single-switch?view=azloc-2603,"Storage switchless, single switch",Azure Local two-node storage switchless deployment network reference pattern - Azure Local,Plan Azure Local two-node switchless single-switch network,Plan to deploy an Azure Local two-node storage switchless network reference pattern.,"Applies to: Azure Local 2311.2 and later This article describes the two-node storage switchless with single TOR switch network reference pattern that you can use to deploy your Azure Local solution. The information in this article also helps you determine if this configuration is viable for your deployment planning needs. This article targets the IT administrators who deploy and manage Azure Local in their datacenters. For information about other network patterns, see Azure Local network deployme",2026-04-05T08:00:00.000Z,how-to,architecture-patterns,0.75,True,"Describes a specific, validated network reference pattern for Azure Local with viability criteria; this is a product-specific architecture pattern for deployment topology.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switchless-two-switches?view=azloc-2603,"Storage switchless, two switches","Azure Local two-node storage switchless, two switches deployment network reference pattern - Azure Local",Plan Azure Local two-node switchless dual-switch network,"Plan to deploy an Azure Local two-node storage switchless, two switches network reference pattern.","Applies to: Azure Local 2311.2 and later In this article, you learn about the two-node storage switchless with two TOR L3 switches network reference pattern that you can use to deploy your Azure Local solution. 
The information in this article also helps you determine if this configuration is viable for your deployment planning needs. This article is targeted towards the IT administrators who deploy and manage Azure Local in their datacenters. For information on other network patterns, see Azure L",2026-04-05T08:00:00.000Z,how-to,architecture-patterns,0.75,True,"Defines a concrete Azure Local network reference pattern (two-node, two TOR L3 switches) used to decide and design deployment topology; fits architecture-patterns.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/known-issues-23?view=azloc-2603,Known issues,Release notes with fixed and known issues in Azure Local 23xx releases - Azure Local,Known issues and workarounds for Azure Local 23xx releases,Read about the known issues and fixed issues in Azure Local 23xx releases.,"This article identifies critical known issues and their workarounds in Azure Local 23xx releases. Note Azure Local 23xx releases are not in a supported state. For more information, see Azure Local release information.",2026-02-18T23:03:00.000Z,troubleshooting-general,troubleshooting,0.65,True,Known/fixed issues article with workarounds is inherently troubleshooting-focused with symptom-to-solution mappings.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/known-issues-24?view=azloc-2603,Known issues,Release notes with fixed and known issues in Azure Local 24xx releases - Azure Local,Known issues and workarounds for Azure Local 24xx releases,Read about the known issues and fixed issues in Azure Local 24xx releases.,"This article identifies critical known issues and their workarounds in Azure Local 24xx releases. Note Azure Local 24xx releases are not in a supported state. 
For more information, see Azure Local release information.",2026-02-18T23:03:00.000Z,troubleshooting-general,troubleshooting,0.65,True,"Release notes with known issues and workarounds typically map specific symptoms/bugs to causes and mitigation steps, which is troubleshooting knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/security-update-23?view=azloc-2603,Security updates,Security updates for Azure Local 23xx releases - Azure Local,Security update reference for Azure Local 23xx releases,Security updates for Azure Local 23xx releases.,"This article lists the various security updates that are available for Azure Local 23xx releases. Note Azure Local 23xx releases are not in a supported state. For more information, see Azure Local release information.",2026-03-25T22:04:00.000Z,release-notes,security,0.7,True,"Similar to index 4, this page lists concrete security updates tied to specific Azure Local 23xx versions. These are detailed, versioned security artifacts (KBs, fixes) that qualify as product-specific security knowledge beyond generic concepts.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/security-update-24?view=azloc-2603,Security updates,Security updates for Azure Local 24xx releases - Azure Local,Security update reference for Azure Local 24xx releases,Security updates for Azure Local 24xx releases.,"This article lists the various security updates that are available for Azure Local 24xx releases. Note Azure Local 24xx releases are not in a supported state. For more information, see Azure Local release information.",2026-03-25T08:00:00.000Z,release-notes,security,0.7,True,"A security update listing for specific Azure Local release trains is product- and version-specific expert knowledge, typically including KB identifiers, affected components, and sometimes CVE mappings or required actions. 
This is security-focused configuration/maintenance information that an LLM would not know from training.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/whats-new-23?view=azloc-2603,What's new in Azure Local?,What's new in Hyperconverged Deployments of Azure Local 23xx releases - Azure Local,,Find out about the new features and enhancements in the Azure Local 23xx releases.,"This article lists the features and improvements that are available in hyperconverged deployments of Azure Local (formerly Azure Stack HCI) 23xx releases. Note Azure Local 23xx releases are not in a supported state. For more information, see Azure Local release information.",2026-03-11T17:06:00.000Z,overview,,0.4,False,What's new feature list for 23xx; similar to other release summaries without deep expert-only technical details.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/whats-new-24?view=azloc-2603,What's new in Azure Local?,What's new in Hyperconverged Deployments of Azure Local 24xx releases - Azure Local,,Find out about the new features and enhancements in the Azure Local 24xx releases.,"This article lists the features and improvements that are available in hyperconverged deployments of Azure Local (formerly Azure Stack HCI) 24xx releases. Note Azure Local 24xx releases are not in a supported state. 
For more information, see Azure Local release information.",2026-03-11T17:06:00.000Z,overview,,0.4,False,"What's new release summary; feature list without deep configuration, limits, or troubleshooting matrices.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/release-information-23h2?view=azloc-2603,Release information,"Azure Local, version 23H2 and 24H2 release information - Azure Local",Plan supported Azure Local release upgrade paths,"Learn about Azure Local releases, including OS builds, supported update paths, and key considerations for staying in a supported state.","Important Azure Local versions 11.2510.1002.93 and 12.2510.1002.531 (supersedes 12.2510.1002.529) are now available. To enhance your Azure Local (formerly known as Azure Stack HCI) experience, Microsoft periodically releases feature updates that introduce new capabilities and improvements. Additionally, Azure Local provides cumulative updates that include monthly quality and security enhancements. These updates are listed for each instance, ensuring your devices remain protected and productive. ",2026-03-17T22:09:00.000Z,release-notes,deployment,0.65,True,Release information with OS builds and supported update paths; contains product-specific upgrade constraints and supported paths relevant to deployment lifecycle.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/rename-to-azure-local?view=azloc-2603,Renaming to Azure Local,Renaming Azure Stack HCI to Azure Local - Azure Local,,This article provides the renaming information for Azure Stack HCI to Azure Local.,Applies to: Azure Local 2311.2 and later Azure Stack HCI is now part of Azure Local. Microsoft renamed Azure Stack HCI to Azure Local to communicate a single brand that unifies the entire distributed infrastructure portfolio. 
This article describes Azure Local as the new name for Azure Stack HCI and answers commonly asked questions related to this rename.,2026-01-07T23:02:00.000Z,concept-article,,0.1,False,Brand rename explanation and FAQ; marketing/terminology content without technical expert configuration or troubleshooting details.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/scalability-deployments?view=azloc-2603,About Azure Local deployments,Azure Local Deployment Types and Scalability - Azure Local,Select Azure Local deployment type and scale,"Discover how Azure Local offers scalable on-premises solutions for critical workloads, from single machines to hundreds of machines, tailored to your needs.","Azure Local offers you a consistent on-premises experience for your critical workloads and Arc services across a wide spectrum of scale points and use cases, from a single machine up to hundreds of machines. This article provides an overview of the different Azure Local deployment types and their scalability options to help you choose the right solution for your organization's needs.",2025-12-22T23:06:00.000Z,concept-article,decision-making,0.7,True,Describes different deployment types and scalability options to help choose the right solution; contains product-specific scale points and guidance for when to use each option.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/security-book/conclusion?view=azloc-2603,Conclusion,Azure Local security book conclusion - Azure Local,,Conclusion for the Azure Local security book.,"Applies to: Hyperconverged deployments of Azure Local We designed Azure Local so it's secure right out of the box. Further, we provide mechanisms to help the system remain secure over time. 
We'll continue to build on our security foundations with innovations that deliver powerful protection now and in the future.",2025-12-23T23:03:00.000Z,concept-article,,0.2,False,"Conclusion chapter; summarises security posture and future direction, not providing new expert configuration, limits, or troubleshooting content.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/security-book/operational-security?view=azloc-2603,Operational security,Operational security for the Azure Local security book - Azure Local,,Learn about operational security for the Azure Local security book.,Applies to: Hyperconverged deployments of Azure Local Operational security in Azure Local means ongoing operations using Windows Admin Center and ongoing compliance using Microsoft Defender for Cloud and other tools.,2025-12-23T23:03:00.000Z,concept-article,,0.3,False,"Operational security overview; mentions tools like Windows Admin Center and Defender for Cloud but appears conceptual, not a detailed configuration or troubleshooting guide.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/security-book/overview?view=azloc-2603,Azure Local security book,Azure Local security book overview - Azure Local,,Overview of the Azure Local security book.,"Applies to: Hyperconverged deployments of Azure Local The Azure Local security book discusses in detail the built-in security layers found in Azure Local, from core to cloud.",2026-01-07T23:02:00.000Z,overview,,0.1,False,"High-level overview of a security book; no concrete settings, roles, or troubleshooting content.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/security-book/security-foundation?view=azloc-2603,Security foundation,Security foundation for the Azure Local security book - Azure Local,,Learn about the security foundation for the Azure Local security book.,"Applies to: Hyperconverged deployments of Azure Local Azure Local is built on a strong security foundation, including the Microsoft Security 
Development Lifecycle (SDL), certifications, and a secure supply chain.",2025-12-23T23:03:00.000Z,concept-article,,0.25,False,"Security foundation overview; discusses SDL, certifications, and supply chain at a high level, without product-specific configuration or numeric constraints.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/security-book/silicon-assisted-security?view=azloc-2603,Silicon-assisted security,Silicon assisted security for the Azure Local security book - Azure Local,,Learn about silicon assisted security for the Azure Local security book.,Applies to: Hyperconverged deployments of Azure Local Silicon assisted security for Azure Local means using secured core hardware and approved Azure Local solutions.,2025-12-23T23:03:00.000Z,concept-article,,0.3,False,"Silicon-assisted security overview; focuses on secured core hardware and approved solutions conceptually, not on detailed configuration or limits.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/security-book/trustworthy-addition?view=azloc-2603,Trustworthy addition,Trustworthy addition for the Azure Local security book - Azure Local,,Learn about trustworthy addition for the Azure Local security book.,"Applies to: Hyperconverged deployments of Azure Local Trustworthy addition for Azure Local means using security by default, application control, credential protection, memory integrity protection, data protection, network security, malware protection, and privacy controls.",2025-12-23T23:03:00.000Z,concept-article,,0.3,False,"High-level description of trustworthy addition; conceptual security principles without specific RBAC roles, config parameters, or compliance settings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/security-book/workload-security?view=azloc-2603,Workload security,Workload security for Azure Local security book - Azure Local,,Learn about workload security for the Azure Local security book.,"Applies to: Hyperconverged deployments of 
Azure Local Workload security for Azure Local means using Trusted launch for Azure Local VMs enabled by Azure Arc and Microsoft Defender for Cloud for continuous monitoring of your workloads.,2025-12-23T23:03:00.000Z,concept-article,,0.3,False,"Workload security overview; describes using Trusted launch and Defender for Cloud conceptually, without explicit configuration parameters or role mappings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/security-update/security-update?view=azloc-2603,Security updates,Security updates for Azure Local - Azure Local,,Security updates for Azure Local.,This article lists the various security updates that are available for Azure Local.,2026-04-08T08:00:00.000Z,release-notes,,0.3,False,"Lists security updates; typically a catalog of patches/KBs rather than RBAC roles, config parameters, or security setting details.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/update/about-updates-23h2?view=azloc-2603,About Updates,"About updates for Azure Local, version 23H2 - Azure Local",,"This article describes the updates feature for this release, benefits, and how to keep various pieces of your Azure Local, version 23H2 solution up to date.","Applies to: Hyperconverged deployments of Azure Local This article describes the new update feature for this release of Azure Local (formerly Azure Stack HCI), the benefits of the feature, and how to keep various components of your solution up to date.",2025-12-22T23:06:00.000Z,overview,,0.3,False,"High-level overview of updates feature and benefits; summary doesn’t indicate detailed config parameters, limits, or troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/update/azure-update-manager-23h2?view=azloc-2603,Update via Azure portal,"Use Azure Update Manager to update your Azure Local, version 23H2 - Azure Local",,"This article describes the Azure Update Manager, its benefits, and ways to use it to update your Azure Local, version 23H2 
system in the Azure portal.","Applies to: Hyperconverged deployments of Azure Local Important The procedure described here applies when updating your existing Azure Local version to a newer version. This article describes how to use Azure Update Manager to find and install available updates on Azure Local. It also describes how to review, track progress, and browse the history of system updates. Important Based on the solution you're using to run Azure Local, latest feature updates might take a week from the availability dat",2026-04-02T17:05:00.000Z,how-to,,0.25,False,"Describes how to use Azure Update Manager to apply updates and view history. From the summary it looks like a usage/how-to article without explicit configuration parameter tables, limits, or troubleshooting error mappings. Does not clearly match any expert-knowledge sub-skill type based on the provided hints.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/update/import-discover-updates-offline-23h2?view=azloc-2603,Update via PowerShell with limited connectivity,Import and Discover Update Packages For Azure Local With Limited Connectivity - Azure Local,Import Azure Local update packages in low-connectivity sites,This article describes how to import and discover update packages for Azure Local with limited connectivity.,"This article explains how to discover and import solution update packages for Azure Local (formerly Azure Stack HCI) deployed in sites with limited bandwidth connections to Azure. You can download Azure Local solution update as a static payload, then copy or transfer it to multiple instances, and import it using PowerShell. Do these actions before you start an update to reduce the amount of data downloaded during the update. 
The static payload for a solution update includes the OS security updat",2026-04-06T08:00:00.000Z,how-to,configuration,0.65,True,"Covers how to discover and import static update payloads for Azure Local with limited connectivity, including using PowerShell and handling static payloads. This scenario is product-specific and likely includes concrete cmdlets, parameters, and steps unique to Azure Local update packages, fitting configuration/integration patterns more than generic tutorial content.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/update/solution-builder-extension?view=azloc-2603,About Solution Builder Extension software updates,"Solution Builder Extension updates on Azure Local, version 23H2 - Azure Local",,This article describes the Solution Builder Extension updates and how to apply them on your Azure Local machines.,"Applies to: Hyperconverged deployments of Azure Local This article provides an overview of the Solution Builder Extension updates and explains how to identify and install them on your Azure Local systems. Additionally, it offers insights into the extension’s advanced capabilities.",2026-04-09T22:04:00.000Z,overview,,0.3,False,"Primarily an overview of Solution Builder Extension updates and how to apply them. The summary does not indicate presence of specific configuration tables, limits, error codes, or other detailed expert-only data; it appears to be a procedural/update overview.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/update/update-best-practices?view=azloc-2603,Review best practices,Best Practices for managing Azure Local Update Management - Azure Local,Best practices for Azure Local update management,Learn the best practices for managing Azure Local environments.,"This article provides an overview of Azure Local update management, including best practices and common pitfalls to help keep Azure Local secure, up to date, and compliant. 
This article is intended for IT decision-makers, infrastructure architects, and operations teams responsible for Azure Local deployments.",2025-12-22T23:06:00.000Z,overview,best-practices,0.7,True,"Explicit best-practices article; likely includes concrete recommendations, pitfalls, and possibly scheduling or configuration guidance specific to Azure Local updates.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/update/update-phases-23h2?view=azloc-2603,Understand update phases,Understand Update Phases of Azure Local - Azure Local,,Understand the various phases of solution updates applied to Azure Local.,"Applies to: Hyperconverged deployments of Azure Local This article describes the preparation and installation phases of the Azure Local update workflow, including how updates are downloaded, validated, health-checked, and installed. It also explains how update progress is reported at various stages. For more detailed information on progress reporting, see Use Azure Update Manager to update Azure Local and Update Azure Local via PowerShell.",2026-04-06T22:03:00.000Z,concept-article,,0.3,False,"Describes phases of the Azure Local update workflow (preparation, download, validation, health checks, installation, progress reporting). 
This is process/behavioral documentation rather than detailed limits, configuration matrices, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/update/update-troubleshooting-23h2?view=azloc-2603,Troubleshoot updates,"Troubleshoot solution updates for Azure Local, version 23H2 - Azure Local",Troubleshoot Azure Local 23H2 solution updates,"Learn how to troubleshoot solution updates applied to Azure Local, version 23H2.",Applies to: Hyperconverged deployments of Azure Local This article describes how to troubleshoot solution updates that are applied to your Azure Local to keep it up-to-date.,2026-03-25T08:00:00.000Z,how-to,troubleshooting,0.8,True,"Explicitly a troubleshooting article for solution updates on Azure Local 23H2. Such pages typically organize by update failures/symptoms and map them to causes and resolutions, often including specific error messages, codes, and diagnostic steps unique to Azure Local updates. This aligns with the troubleshooting criteria of symptom → cause → solution with product-specific guidance.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/update/update-via-powershell-23h2?view=azloc-2603,Update via PowerShell,"Update Azure Local, version 23H2 systems via PowerShell - Azure Local",Apply Azure Local solution updates via PowerShell,"Learn how to use PowerShell to apply operating system, service, and Solution Extension updates to Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local This article describes how to apply a solution update to your Azure Local by using PowerShell. The procedure in this article applies to both single node and multi-node systems that run the latest version of Azure Local with the orchestrator (Lifecycle Manager) installed. If you created your system by deploying Azure Local, the orchestrator was automatically installed as part of the deployment. 
Important The procedure described here applies wh",2026-03-04T18:05:00.000Z,how-to,configuration,0.64,True,"PowerShell-based update procedure for OS, services, and extensions; likely includes specific cmdlets, parameters, and required states unique to Azure Local update orchestration.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/upgrade/about-upgrades-23h2?view=azloc-2603,About Azure Local upgrades,About Azure Local upgrades - Azure Local,Understand upgrade options from Azure Stack HCI 22H2,This article provides an overview of upgrading your cluster to Azure Local.,"Applies to: Azure Local 2311.2 and later This article provides an overview of upgrading your version 22H2 cluster to Azure Local (formerly Azure Stack HCI). Azure Stack HCI OS, version 22H2 is already out of support. To continue receiving updates, we recommend upgrading your operating system to a newer versionvia PowerShell. If you're running OS version 20349.xxxx (22H2) you won't be able to purchase Windows Server Subscription or Extended Security Updates (ESU). If you're running an Azure Stack",2025-12-22T23:06:00.000Z,concept-article,decision-making,0.65,True,Overview of upgrading from 22H2 with support and subscription implications; helps decide when and how to upgrade based on OS version and supportability.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/upgrade/install-enable-network-atc?view=azloc-2603,3. Configure Network ATC,Configure Network ATC on Azure Local - Azure Local,Configure Network ATC on existing Azure Local clusters,Learn how to configure Network ATC on Azure Local.,"This article describes how to configure Network ATC on an existing Azure Local cluster that doesn't already have it configured. Important In Azure Local upgrade scenarios where Network ATC isn't already configured, we recommend upgrading the operating system first, then configuring Network ATC, and then proceeding with the solution upgrade. 
+https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-fedramp-guidance?view=azloc-2604,FedRAMP guidance,FedRAMP guidance for Azure Local - Azure Local,Align Azure Local deployments with FedRAMP,Learn about FedRAMP compliance using Azure Local.,Applies to: Hyperconverged deployments of Azure Local This article explains the relationship between Azure Local and FedRAMP and how organizations can stay compliant with FedRAMP with Azure Local solutions.,2026-04-22T22:07:00.000Z,concept-article,security,0.65,True,Details the relationship between Azure Local and FedRAMP and how to stay compliant; specialized security/compliance guidance beyond generic concepts.,new +https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-hipaa-guidance?view=azloc-2604,HIPAA guidance,HIPAA guidance for Azure Local - Azure Local,Plan HIPAA-compliant solutions with Azure Local,Learn about HIPAA compliance using Azure Local.,Applies to: Hyperconverged deployments of Azure Local This article provides guidance on how organizations can most efficiently navigate HIPAA compliance for solutions built with Azure Local.,2026-04-22T22:07:00.000Z,concept-article,security,0.65,True,"Provides Azure Local–specific guidance for navigating HIPAA compliance, which is specialized security/compliance configuration and responsibility guidance.",new +https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-iso27001-guidance?view=azloc-2604,ISO/IEC 27001 guidance,ISO 27001 guidance for Azure Local - Azure Local,Use Azure Local to support ISO 27001 controls,Learn about ISO 27001 compliance using Azure Local.,"Applies to: Hyperconverged deployments of Azure Local This article outlines how Azure Local helps organizations meet the security control requirements of ISO/IEC 27001:2022, both in cloud and on premises. 
Learn more about Azure Local and other security standards at Azure Local and security standards.",2026-04-22T22:07:00.000Z,concept-article,security,0.7,True,"Provides detailed mapping/guidance on how Azure Local features satisfy ISO 27001:2022 controls, which is product- and standard-specific security guidance.",new +https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-pci-dss-guidance?view=azloc-2604,PCI DSS guidance,PCI DSS guidance for Azure Local - Azure Local,Use Azure Local to support PCI DSS compliance,Learn about PCI DSS compliance using Azure Local.,"Applies to: Hyperconverged deployments of Azure Local This article explains how Microsoft Azure Local security features can help organizations in the payment card industry achieve the security control requirements of PCI DSS, both in the cloud and in their on-premises environments.",2026-04-22T22:07:00.000Z,concept-article,security,0.7,True,Explains how Azure Local security features help meet PCI DSS requirements; product-specific compliance and control mapping information.,new +https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-security-standards?view=azloc-2604,Azure Local and security standards,Azure Local and security standards - Azure Local,Map Azure Local to security standards and certifications,"Learn about Azure Local, security standards, and security assurance.","Applies to: Hyperconverged deployments of Azure Local This article provides information about security standards related to Azure Local. The resources detailed in this article, including certifications and evaluation reports, could be used as sources to help you in your compliance planning. 
Each section in this article provides information on Azure Local and a particular security standard, together with any completed certifications.",2026-04-22T22:07:00.000Z,concept-article,security,0.6,True,Security-assurance article tying Azure Local to specific standards and certifications; contains product-specific compliance positioning that is not generic knowledge.,new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/azure-hybrid-benefit-disaggregated?view=azloc-2604,Azure Hybrid Benefit,Azure Hybrid Benefit for Azure Local Disaggregated Deployments - Azure Local,Use Azure Hybrid Benefit for Azure Local licensing,Learn about Azure Hybrid Benefit for Azure Local disaggregated deployments.,"Applies to: Azure Local 2311.2 and later This article describes Azure Hybrid Benefit and how to use it for Azure Local. Azure Hybrid Benefit is a program that helps you reduce the costs of running workloads in the cloud. With Azure Hybrid Benefit for Azure Local, you can maximize the value of your on-premises licenses and modernize your existing infrastructure to Azure Local at no additional cost.",2026-04-22T22:28:00.000Z,how-to,decision-making,0.6,True,"Explains how to apply Azure Hybrid Benefit to Azure Local disaggregated deployments; likely includes licensing conditions and when it reduces cost, which is product-specific decision guidance.",new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/azure-hybrid-benefit?view=azloc-2604,Azure Hybrid Benefit,Azure Hybrid Benefit for Azure Local - Azure Local,,Learn about Azure Hybrid Benefit for Azure Local.,"Applies to: Azure Local 2311.2 and later This article describes Azure Hybrid Benefit and how to use it for Azure Local. Azure Hybrid Benefit is a program that helps you reduce the costs of running workloads in the cloud. 
With Azure Hybrid Benefit for Azure Local, you can maximize the value of your on-premises licenses and modernize your existing infrastructure to Azure Local at no extra cost.",2026-04-22T22:07:00.000Z,how-to,,0.3,False,"Explains Azure Hybrid Benefit program for Azure Local; mostly licensing and cost concept, not technical configuration or limits.",new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/billing?view=azloc-2604,Billing and payment,Azure Local billing and payment - Azure Local,,How billing and payment works in Azure Local.,"Applies to: Azure Local 2311.2 and later Azure Local is an Azure service that goes on your Azure subscription bill just like any other Azure service. It's priced on a per core basis on your on-premises servers. For current pricing, see Azure Local pricing. Currencies and discounts are handled centrally by the Azure Commerce billing platform, and the customer gets one unified, itemized bill at the end of the month. No traditional on-premises software license is required for Azure Local, although g",2026-04-22T22:07:00.000Z,overview,,0.3,False,Billing and payment overview; pricing is external and not a technical configuration or limit table in this article.,new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/compare-vm-management-capabilities?view=azloc-2604,Compare VM management capabilities,Compare Management Capabilities of VMs on Azure Local - Azure Local,Choose Azure Local VM type and management model,Learn about the kinds of virtual machines (VMs) that can run on Azure Local and compare their management capabilities.,Applies to: Hyperconverged deployments of Azure Local This article describes the types of virtual machines (VMs) available on Azure Local. It also compares their management capabilities in Azure.,2026-04-22T22:07:00.000Z,product-comparison,decision-making,0.65,True,"Compares types of VMs and their management capabilities; likely includes comparison tables and criteria to decide which VM type to use, fitting decision-making guidance.",new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/compare-windows-server?view=azloc-2604,Compare to Windows Server,Compare Azure Local to Windows Server - Key Differences and Benefits - Azure Local,Decide between Azure Local and Windows Server,"Learn the key differences between Azure Local and Windows Server to choose the best solution for your organization. Compare features, benefits, and scenarios.","Applies to: Azure Local 2311.2 and later; Windows Server 2025 This article compares Azure Local and Windows Server, and highlights key differences between the two products. It helps you learn when to use each product, and how they can work together. Azure Local and Windows Server share many similarities, such as letting you run virtual machines and container-based workloads. But they're designed for different scenarios and use cases. 
Azure Local is a cloud-connected hyperconverged solution that ",2026-04-22T22:07:00.000Z,product-comparison,decision-making,0.8,True,Explicit comparison to help choose between Azure Local and Windows Server with scenario-based recommendations.,new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/datacenter-firewall-overview?view=azloc-2604,Datacenter Firewall overview,Overview of Datacenter Firewall in Azure Local and Windows Server - Azure Local,,Use this article to learn about Datacenter Firewall in Azure Local and Windows Server.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 Datacenter Firewall is a network layer, 5-tuple (protocol, source and destination port numbers, source and destination IP addresses), stateful, multitenant Software Defined Networking (SDN) firewall. The Datacenter Firewall protects east-west and north-south traffic flows across the network layer of virtual networks and traditional VLAN networks.",2026-04-22T22:07:00.000Z,overview,,0.4,False,"Overview of Datacenter Firewall; describes what it is and traffic types, but not specific configuration parameters or rules.",new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/external-storage-support?view=azloc-2604,External storage support,External storage support for Azure Local (preview) - Azure Local,Plan and configure external SAN storage support for Azure Local,"Learn how Azure Local supports external SAN integration, enabling high-performance storage for VMs, AKS clusters, and AVD instances without re-architecture (preview).","This article explains external storage support for Azure Local, its benefits, supported configurations, and other essential information. Important This feature is currently in PREVIEW. 
+See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-04-22T22:28:00.000Z,concept-article,decision-making,0.6,True,"Explains external storage support, benefits, and supported configurations; likely includes supported topologies, constraints, and configuration patterns that guide design decisions.",new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/firewall-requirements?view=azloc-2604,Firewall requirements,Firewall requirements for Azure Local - Azure Local,Configure firewall rules and endpoints for Azure Local,This article provides guidance on firewall requirements for the Azure Stack HCI operating system.,Applies to: Azure Local 2311.2 and later This article provides guidance on how to configure firewalls for the Azure Stack HCI operating system. It includes firewall requirements for outbound endpoints and internal rules and ports. The article also provides information on how to use Azure service tags with Microsoft Defender firewall. This article also describes how to optionally use a highly locked-down firewall configuration to block all traffic to all destinations except those included in your,2026-04-22T22:07:00.000Z,how-to,security,0.85,True,"Lists outbound endpoints, internal ports/rules, and use of service tags; product-specific security configuration parameters.",new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/gateway-overview?view=azloc-2604,RAS Gateway overview,RAS Gateway for Software Defined Networking managed by on-premises tools - Azure Local,,Learn about Remote Access Service (RAS) Gateway for Software Defined Networking managed by on-premises tools in Azure Local and Windows Server.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This article provides an overview of Remote Access Service (RAS) Gateway for Software Defined Networking (SDN) in Azure Local and Windows Server. RAS Gateway is a software-based Border Gateway Protocol (BGP) capable router designed for cloud service providers (CSPs) and enterprises that host multitenant virtual networks using Hyper-V Network Virtualization (HNV). You can use RAS Gateway to rou",2026-04-22T22:07:00.000Z,overview,,0.4,False,Overview of RAS Gateway; high-level explanation of role and capabilities without clear product-specific configuration details.,new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/host-network-requirements-disaggregated?view=azloc-2604,Host network requirements,Host Network Requirements for Azure Local Disaggregated Deployments - Azure Local,Host network requirements for Azure Local disaggregated,Learn the host network requirements for Azure Local disaggregated deployments.,"Applies to: Azure Local 2311.2 and later This topic discusses host networking considerations and requirements for Azure Local disaggregated architectures. 
For information on disaggregated architectures and the physical connections between machines, see Physical network requirements Azure Local disaggregated deployments.",2026-04-22T22:28:00.000Z,how-to,limits-quotas,0.7,True,"Host networking requirements usually define supported NIC types, minimum speeds, VLAN/VXLAN constraints, and node limits.",new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/host-network-requirements?view=azloc-2604,Host network requirements,Host network requirements for Azure Local - Azure Local,Configure host networking for Azure Local clusters,Learn the host network requirements for Azure Local,"Applies to: Azure Local 2311.2 and later This topic discusses host networking considerations and requirements for Azure Local. For information on datacenter architectures and the physical connections between machines, see Physical network requirements. For information on how to simplify host networking using Network ATC, see Simplify host networking with Network ATC.",2026-01-26T23:10:00.000Z,how-to,configuration,0.8,True,Host networking considerations and requirements are detailed configuration guidance specific to Azure Local.,new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/microsoft-365-local-overview?view=azloc-2604,Microsoft 365 Local,Overview of Microsoft 365 Local on Azure Local Infrastructure - Azure Local,,"Learn how Microsoft 365 Local enables private cloud productivity with Exchange, SharePoint, and Skype for Business on customer-managed Azure Local infrastructure.",This article provides an overview of Microsoft 365 Local on Azure Local infrastructure and how it helps organizations meet sovereignty requirements while maintaining productivity in a private cloud environment.,2026-04-22T22:07:00.000Z,concept-article,,0.2,False,Overview of Microsoft 365 Local on Azure Local; primarily conceptual and marketing-style explanation of benefits and sovereignty.,new 
+https://learn.microsoft.com/en-us/azure/azure-local/concepts/monitoring-overview?view=azloc-2604,Overview,Overview of Azure Local monitoring - Azure Local,,This article provides an overview of the Azure Local monitoring solution.,"Applies to: Hyperconverged deployments of Azure Local This article provides an overview of monitoring in hyperconverged deployments of Azure Local (formerly Azure Stack HCI). Monitoring Azure Local involves the regular collection and analysis of data from all components of your system to promptly identify and address any potential issues. Routine monitoring is crucial for maintaining the health and functionality of your Azure Local system. To understand the current performance patterns, identify ",2025-12-12T23:06:00.000Z,concept-article,,0.2,False,"High-level monitoring overview; lacks detailed metrics tables, configuration parameters, or troubleshooting mappings.",new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/network-atc-overview?view=azloc-2604,Network ATC overview,Network ATC overview - Azure Local,,This article introduces Network ATC for Azure Local and Windows Server.,"Applies to: Azure Local 2311.2 and later Deployment and operation of Azure Local networking can be a complex and error-prone process. Due to the configuration flexibility provided with the host networking stack, there are many moving parts that can be easily misconfigured or overlooked. Staying up to date with the latest best practices is also a challenge as improvements are continuously made to the underlying technologies. 
Additionally, configuration consistency across Azure Local machines is i",2026-04-22T22:07:00.000Z,overview,,0.35,False,"Network ATC overview; primarily conceptual and benefits-focused, not a detailed configuration or limits reference.",new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/network-controller-overview?view=azloc-2604,Network Controller overview,Overview of Network Controller in Azure Local and Windows Server - Azure Local,,Use this article to learn about Network Controller for Software Defined Networking managed by on-premises tools in Azure Local and Windows Server.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 Network Controller is the cornerstone of Software Defined Networking (SDN) management. It's a highly scalable server role that provides a centralized, programmable point of automation to manage, configure, monitor, and troubleshoot virtual network infrastructure. Using Network Controller, you can automate the configuration and management of network infrastructure instead of performing manual c",2026-04-22T22:07:00.000Z,overview,,0.4,False,Overview of Network Controller; appears conceptual and descriptive rather than configuration- or troubleshooting-focused.,new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/observability?view=azloc-2604,Azure Local observability,Azure Local observability - Azure Local,,Learn about observability in Azure Local.,Applies to: Hyperconverged deployments of Azure Local This article describes observability in Azure Local and the data sources through which it is achieved.,2026-04-22T22:07:00.000Z,how-to,,0.3,False,Conceptual observability overview; describes data sources and concepts rather than detailed configuration or limits.,new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/physical-network-requirements-disaggregated?view=azloc-2604,Physical network requirements,Physical Network Requirements for Azure Local Disaggregated Deployments - Azure Local,Physical network requirements for Azure Local disaggregated,"Learn about physical network requirements for Azure Local disaggregated deployments, including network switches, to ensure optimal performance.","Applies to: Azure Local 2311.2 and later This article discusses physical (fabric) network considerations and requirements for Azure Local disaggregated architectures, particularly for network switches. Note Requirements for future Azure Local versions may change.",2026-04-22T22:28:00.000Z,concept-article,limits-quotas,0.7,True,"Physical network requirements typically include port counts, bandwidth, latency, and topology constraints—concrete numeric limits.",new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/physical-network-requirements?view=azloc-2604,Physical network requirements,Physical network requirements for Azure Local - Azure Local,Meet physical network requirements for Azure Local,"Learn about physical network requirements for Azure Local, including network switches, to ensure optimal performance.","Applies to: Azure Local 2311.2 and later This article discusses physical (fabric) network considerations and requirements for Azure Local, particularly for network switches. 
Note Requirements for future Azure Local versions may change.",2026-04-22T22:07:00.000Z,concept-article,configuration,0.8,True,Physical network and switch requirements are product-specific configuration guidance with concrete constraints.,new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/plan-network-controller-deployment?view=azloc-2604,Network Controller,"Plan to deploy Network Controller on Azure Local, version 23H2 - Azure Local",Plan Network Controller VM deployment on Azure Local,This article covers how to plan to deploy Network Controller on Azure Local via Windows Admin Center on a set of virtual machines (VMs).,Applies to: Hyperconverged deployments of Azure Local This article describes how to plan to deploy Network Controller on Azure Local via Windows Admin Center on a set of virtual machines (VMs). Planning to deploy Network Controller via Windows Admin Center requires a set of VMs running the Azure Stack HCI operating system. Network Controller is a highly available and scalable server role that requires a minimum of three VMs to provide high availability on your network. Note We recommend that you,2026-04-22T22:07:00.000Z,install-set-up-deploy,deployment,0.65,True,"Planning article with concrete, product-specific deployment requirements (for example, minimum three VMs for high availability and Azure Stack HCI OS prerequisites). 
These are deployment constraints rather than generic concepts.",new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/plan-software-defined-networking-infrastructure-23h2?view=azloc-2604,SDN infrastructure,"Plan infrastructure for Software Defined Networking managed by on-premises tools in Azure Local, version 23H2 - Azure Local",Plan SDN infrastructure and topology for Azure Local,"This topic provides information on how to plan a Software Defined Network (SDN) infrastructure deployment, managed by on-premises tools, for Azure Local, version 23H2.","Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This article describes how to plan an infrastructure deployment for Software Defined Networking (SDN) managed by on-premises tools, including hardware and software prerequisites. It outlines planning requirements for both physical and logical network configuration, routing, gateways, network hardware, and more. This article also includes considerations on extending an SDN infrastructure and usi",2026-04-22T22:07:00.000Z,concept-article,architecture-patterns,0.6,True,"Covers detailed SDN infrastructure planning including physical/logical network configuration, routing, gateways, and extension scenarios. 
Contains product-specific design guidance and trade-offs for Azure Local SDN deployments.",new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-add-server?view=azloc-2604,Add and repair nodes,Add or repair a node to a rack aware cluster on Azure Local - Azure Local,Add or repair nodes in Azure Local rack aware clusters,Learn how to add or repair a node to a rack aware cluster on Azure Local.,This article explains how to add or repair servers (nodes) for your Azure Local rack aware cluster.,2026-01-22T23:03:00.000Z,how-to,deployment,0.7,True,"Describes procedures to add/repair nodes in rack aware clusters, including product-specific operational and deployment details.",new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-aks-nodes?view=azloc-2604,Spread AKS nodes,Spread Azure Kubernetes Service (AKS) nodes in rack aware cluster - Azure Local,Distribute AKS nodes across Azure Local rack aware zones,Learn how to deploy AKS clusters with rack aware cluster support to ensure fault tolerance and evenly distribute nodes across Azure Local zones.,This article explains how to deploy Azure Kubernetes Service (AKS) clusters with rack aware cluster support. You'll learn how to ensure fault tolerance and evenly distribute nodes across Azure Local zones for improved reliability.,2026-01-22T23:03:00.000Z,concept-article,configuration,0.7,True,Guides how to deploy AKS clusters with rack aware support and node spreading; product-specific configuration for reliability.,new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-overview?view=azloc-2604,What's a rack aware cluster?,Overview of Azure Local rack aware clustering - Azure Local,Understand Azure Local rack aware clustering capabilities,Use this article to learn about Azure Local rack aware clustering.,This article gives a high-level overview of the Azure Local rack aware clustering feature including its benefits and use cases. The article also details the supported configurations and deployment requirements for rack aware clusters. This article applies only to new deployments of Azure Local.,2026-04-22T22:07:00.000Z,overview,deployment,0.6,True,"Overview but includes supported configurations and deployment requirements for rack aware clusters, which are product-specific deployment characteristics.",new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-provision-vm-local-availability-zone?view=azloc-2604,Provision and place VMs,Provision VMs in local availability zone for Azure Local - Azure Local,Provision Azure Local VMs in local availability zones,Learn about how to provision VMs in local availability zone for Azure Local.,"This article explains how to create Azure Local virtual machines (VMs) in a local availability zone to reduce latency, improve performance, ensure redundancy, and meet compliance requirements. Important Updating placement configuration of existing virtual machines (VMs) is not supported.",2026-01-28T23:03:00.000Z,how-to,configuration,0.75,True,Explains how to place VMs in local availability zones with constraints (such as no updating of existing VM placement) specific to Azure Local.,new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-reference-architecture?view=azloc-2604,Reference architecture,Azure Local rack aware cluster reference architecture - Azure Local,Use reference architecture for Azure Local rack aware clusters,Learn about the network design and configuration of an Azure Local rack aware cluster,"This article contains information about the network design and configuration of an Azure Local rack aware cluster. This configuration involves a single cluster where nodes are placed in different physical locations within a building. 
The primary intent is to support factory environments where hardware must be isolated in different rooms due to regulatory requirements, safety protocols, or operational constraints. This isolation provides fault domain separation while maintaining cluster functiona",2026-04-22T22:07:00.000Z,concept-article,architecture-patterns,0.8,True,Provides a concrete reference architecture with network design and configuration for rack aware clusters in specific factory-like scenarios—product-specific architecture pattern.,new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-requirements?view=azloc-2604,Review requirements,Requirements and supported configurations for rack aware clusters - Azure Local,Review requirements and supported configs for rack aware clusters,Learn about requirements and supported configurations for rack aware clusters.,This article provides the requirements and supported configurations for rack aware clusters.,2026-04-22T22:07:00.000Z,how-to,deployment,0.85,True,Explicitly lists requirements and supported configurations for rack aware clusters—deployment constraints and matrices unique to Azure Local.,new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-room-to-room-connectivity?view=azloc-2604,Room-to-room connections,Azure Local rack aware cluster room-to-room connectivity - Azure Local,Design room-to-room connectivity for rack aware clusters,Learn about Azure Local rack aware cluster room-to-room connectivity.,"Azure Local rack aware clusters require specialized room-to-room connectivity to enable storage replication and failover across availability zones. This article outlines four distinct configuration options (A, B, C, and D) for implementing room-to-room links, each optimized for different resilience, cost, and complexity requirements. 
Review the following key concepts: Room-to-room links: Physical network connections that span between separate rooms or availability zones, enabling RDMA (Remote Di",2026-04-22T22:07:00.000Z,concept-article,architecture-patterns,0.8,True,Outlines four specific configuration options with trade-offs for room-to-room links; this is a product-specific network architecture decision guide.,new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/route-reflector-overview?view=azloc-2604,Route reflector overview,Overview of BGP Route Reflector in Azure Local and Windows Server - Azure Local,,Use this topic to learn about BGP Route Reflector for Software Defined Networking managed by on-premises tools in Azure Local and Windows Server.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This article provides an overview of Border Gateway Protocol (BGP) Route Reflector in Azure Local and Windows Server. BGP Route Reflector is included with Remote Access Service (RAS) Gateway and provides an alternative to BGP full mesh topology that is required for route synchronization between routers. A Route Reflector in a Software Defined Networking deployment is a logical entity that sits o",2026-04-22T22:07:00.000Z,overview,,0.4,False,Overview of BGP Route Reflector; conceptual explanation of topology and purpose without detailed config or limits.,new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/san-requirements?view=azloc-2604,SAN requirements,Supported SAN solutions on Azure Local - Azure Local,Use supported SAN solutions with Azure Local,Describes the supported SAN solutions for Azure Local.,Azure Local supports using Fibre Channel (FC) storage area network (SAN) storage as an alternative to local storage (Storage Spaces Direct). 
This article details the supported SAN solutions from our storage partners.,2026-04-22T22:28:00.000Z,how-to,configuration,0.7,True,Details supported SAN solutions and compatibility; product-specific configuration/compatibility matrix for storage.,new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/sdn-frequently-asked-questions?view=azloc-2604,SDN FAQ,Frequently Asked Questions (FAQ) for Software-Defined Networking (SDN) on Azure Local - Azure Local,,This FAQ provides information about SDN enabled by Azure Arc on Azure Local.,This FAQ provides information about Software-Defined Networking (SDN) enabled by Azure Arc on your Azure Local VMs. This feature is available in Azure Local 2506 or later with OS build 26100.xxxx.,2025-12-22T23:06:00Z,faq,,0.4,False,"FAQ format; summary suggests general Q&A about SDN availability and versions, not detailed error codes, configs, or limits.",new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/sdn-multisite-overview?view=azloc-2604,SDN Multisite overview,Overview of SDN Multisite in Azure Local and Windows Server - Azure Local,Design SDN Multisite topology and disaster recovery with Azure Local,This article provides an overview of the SDN Multisite solution.,"Applies to: Hyperconverged deployments of Azure Local Applies to: Windows Server 2025 This article provides an overview of SDN Multisite, including its benefits and current limitations. You can use it as a guide to help design your network topology and disaster recovery plan. SDN Multisite allows you to expand the capabilities of traditional SDN deployed at different physical locations. 
SDN Multisite enables native Layer 2 and Layer 3 connectivity across different physical locations for virtuali",2026-04-22T22:07:00.000Z,concept-article,architecture-patterns,0.6,True,"Provides guidance on SDN Multisite benefits, limitations, and how to design network topology and DR; product-specific architectural pattern for multi-site SDN.",new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/sdn-overview?view=azloc-2604,Overview,Software Defined Networking (SDN) enabled by Azure Arc on Azure Local - Azure Local,Choose and manage SDN enabled by Azure Arc on Azure Local,"Software Defined Networking enabled by Arc provides a way to centrally configure and manage logical networks, network security groups, network security rules via the Azure portal and Azure CLI in Azur","This article explains Software Defined Networking (SDN) enabled by Azure Arc on Azure Local. It covers SDN management methods, when to use each method, and supported and unsupported SDN scenarios. SDN offers a centralized way to configure and manage networks and network services such as switching, routing, and load balancing in your datacenter. SDN enables you to dynamically create, secure, and connect your network to meet the evolving needs of your applications.",2026-04-22T22:07:00.000Z,concept-article,decision-making,0.65,True,"Explains SDN management methods, when to use each, and supported/unsupported scenarios; provides product-specific decision guidance on SDN usage.",new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/security-features?view=azloc-2604,About security features,"Security features for Azure Local, version 23H2. - Azure Local",,"Learn about security features available for new deployments of Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local Azure Local (formerly Azure Stack HCI) is a secure-by-default product that has more than 300 security settings enabled right from the start. 
Default security settings provide a consistent security baseline to ensure that devices start in a known good state. This article provides a brief conceptual overview of the various security features associated with your Azure Local instance. Features include security defaults, Application Control, volum",2026-04-22T22:07:00.000Z,concept-article,,0.3,False,"Described as a brief conceptual overview of security features; no indication of concrete RBAC roles, parameters, or configuration values.",new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/software-defined-networking-23h2?view=azloc-2604,Software-defined networking overview,"Software defined networking (SDN) managed by on-premises tools in Azure Local, version 23H2 - Azure Local",,"Software defined networking (SDN) managed by on-premises tools provides a way to centrally configure and manage networks and network services such as switching, routing, and load balancing in Azure Lo","Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 Software defined networking (SDN) provides a way to centrally configure and manage networks and network services such as switching, routing, and load balancing in your data center. You can use SDN to dynamically create, secure, and connect your network to meet the evolving needs of your apps. 
Operating global-scale datacenter networks for services like Microsoft Azure, which efficiently perfor",2026-04-22T22:07:00.000Z,concept-article,,0.4,False,Conceptual overview of SDN managed by on-premises tools; primarily descriptive without clear indication of configuration tables or decision matrices.,new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/software-load-balancer?view=azloc-2604,Load balancer overview,Software Load Balancer (SLB) for SDN managed by on-premises tools in Azure Local and Windows Server - Azure Local,,Use this article to learn about Software Load Balancer for Software Defined Networking managed by on-premises tools in Azure Local and Windows Server.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 Cloud Service Providers (CSPs) and enterprises that are deploying Software Defined Networking (SDN) can use Software Load Balancer (SLB) to evenly distribute tenant and tenant customer network traffic among virtual network resources. SLB enables multiple servers to host the same workload, providing high availability and scalability. Software Load Balancer can provide a multitenant, unified edge ",2026-04-22T22:07:00.000Z,overview,,0.4,False,Overview of Software Load Balancer; mainly conceptual description of functionality and benefits.,new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/system-requirements-23h2?view=azloc-2604,System requirements,"System requirements for Azure Local, version 23H2 - Azure Local",Apply system requirements for Azure Local 23H2,"How to choose machines, storage, and networking components for Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local This article discusses Azure, machine and storage, networking, and other requirements for hyperconverged deployments of Azure Local (formerly Azure Stack HCI). 
If you purchased Integrated System solution hardware from the Azure Local Catalog, you can skip to the Networking requirements since the hardware already adheres to machine and storage requirements.",2026-04-22T22:28:00.000Z,how-to,configuration,0.85,True,"System requirements for machines, storage, networking are concrete configuration constraints (hardware/network specs) unique to this product.",new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/system-requirements-disaggregated?view=azloc-2604,System requirements,System Requirements for Azure Local Disaggregated Deployments - Azure Local,System requirements for Azure Local disaggregated deployments,"Learn how to choose machines, storage, and networking components for Azure Local disaggregated deployments.","Applies to: Hyperconverged deployments of Azure Local This article discusses Azure, machine, storage, networking, and other requirements for disaggregated deployments of Azure Local. To acquire the machines and SAN that supports Azure Local disaggregated architectures, you can purchase validated hardware from a Microsoft hardware partner with the operating system preinstalled from the Azure Local Catalog.",2026-04-22T22:28:00.000Z,how-to,limits-quotas,0.75,True,"System requirements article will specify supported machine types, SAN capabilities, and numeric constraints (node counts, capacities).",new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/system-requirements-small-23h2?view=azloc-2604,System requirements for low capacity class,System requirements for low capacity deployments of Azure Local (preview) - Azure Local,Configure low-capacity Azure Local hardware requirements,"How to choose machines, storage, and networking components for low capacity deployments of Azure Local (preview).","Applies to: Hyperconverged deployments of Azure Local This article describes the requirements for machines, storage, and networking for building solutions of Azure Local that use lower capacity hardware. 
If you purchase lower capacity hardware from the Azure Local Catalog, ensure that these requirements are met before you deploy the Azure Local solutions. Important This feature is currently in PREVIEW. +See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azur",2026-04-22T22:07:00.000Z,how-to,configuration,0.8,True,"Low-capacity deployment system requirements for machines, storage, and networking are concrete configuration constraints.",new +https://learn.microsoft.com/en-us/azure/azure-local/concepts/telemetry-and-diagnostics-overview?view=azloc-2604,Telemetry and diagnostics extension,Azure Local telemetry and diagnostics extension - Azure Local,,This article describes the telemetry and diagnostics extension in Azure Local.,"Applies to: Azure Local 2311.2 and later This article gives an overview, lists benefits, and describes options for the telemetry and diagnostics extension in Azure Local.",2025-08-04T17:04:00.000Z,how-to,,0.3,False,Telemetry and diagnostics overview with benefits and options; appears conceptual without detailed config tables or limits.,new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/about-private-endpoints?view=azloc-2604,About private endpoint scenarios,About using Private Endpoints to Connect with Azure Local - Azure Local,Decide how to use Private Endpoints with Azure Local,"Review how Azure Private Endpoints can be used when deploying Azure Local, with and without Arc gateway, and with and without Proxy.","This article provides an overview of Azure private endpoints on Azure Local including the supported and unsupported scenarios, and key requirements for successful connectivity.",2026-04-22T22:07:00.000Z,concept-article,decision-making,0.7,True,Covers supported/unsupported scenarios and key requirements for private endpoints with/without Arc gateway and proxy; helps choose connectivity approach.,new 
+https://learn.microsoft.com/en-us/azure/azure-local/deploy/azure-verification?view=azloc-2604,Azure verification for VMs,Azure verification for VMs on Azure Local - Azure Local,Use Azure verification for VMs on Azure Local,Learn about the Azure verification for VMs feature on Azure Local.,"Applies to: Hyperconverged deployments of Azure Local Microsoft Azure offers a range of differentiated workloads and capabilities that are designed to run only on Azure. Azure Local extends many of the same benefits you get from Azure, while running on the same familiar and high-performance on-premises or edge environments. Azure verification for VMs makes it possible for supported Azure-exclusive workloads to work outside of the cloud. This feature, modeled after the IMDS attestation service in Az",2026-04-22T22:07:00.000Z,overview,security,0.7,True,Describes Azure verification modeled after IMDS attestation; involves security/attestation configuration and requirements unique to Azure Local.,new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-no-proxy-no-gateway?view=azloc-2604,"Private endpoints - no proxy, no gateway",Use Azure Private Endpoints with Azure Local for No Proxy No Arc Gateway Scenario - Azure Local,Configure Private Endpoints for Azure Local without proxy or gateway,"Review how Azure private endpoints can be used when deploying Azure Local, without an enterprise proxy and without an Arc gateway.","This article provides an overview of how you can integrate both Azure private endpoints with Azure Local in a scenario without an enterprise proxy and without an Arc gateway. 
+For more information about Azure private endpoints on Azure Local and the supported scenarios, see About Azure private endpoints on Azure Local.",2026-04-22T22:07:00.000Z,concept-article,configuration,0.7,True,Scenario-specific integration of private endpoints with Azure Local; product-specific connectivity configuration details.,new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-no-proxy-with-gateway?view=azloc-2604,"Private endpoints - no proxy, with gateway",Use Private Endpoints with Azure Local for No Proxy with Arc Gateway - Azure Local,Configure Private Endpoints with Arc gateway for Azure Local,"Review how Azure Private Endpoints can be used when deploying Azure Local, without an enterprise proxy but with an Arc gateway.","This article provides an overview of how you can integrate both existing and new Azure private endpoints with Azure Local in a scenario without an enterprise proxy but with an Arc gateway. +For more information about Azure private endpoints on Azure Local and the supported scenarios, see About Azure private endpoints on Azure Local.",2026-04-22T22:07:00.000Z,concept-article,configuration,0.7,True,Details how to integrate existing/new private endpoints in a no-proxy but Arc gateway scenario; concrete configuration steps and constraints.,new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-with-proxy-no-gateway?view=azloc-2604,"Private endpoints - with proxy, no gateway",Use Private Endpoints with Azure Local for with Proxy no Arc Gateway - Azure Local,Configure Private Endpoints with proxy for Azure Local,"Review how Azure Private Endpoints can be used when deploying Azure Local, with an enterprise proxy but without an Arc gateway.","This article provides an overview of how you can integrate both existing and new Azure private endpoints with Azure Local in a scenario with enterprise proxy but without an Arc gateway. 
+For more information about Azure private endpoints on Azure Local and the supported scenarios, see About Azure private endpoints on Azure Local.",2026-04-22T22:07:00.000Z,concept-article,configuration,0.7,True,Describes integrating private endpoints when using an enterprise proxy without Arc gateway; scenario-specific connectivity configuration.,new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-with-proxy-with-gateway?view=azloc-2604,"Private endpoints - with proxy, with gateway",Use Private Endpoints with Azure Local for Proxy with Arc Gateway - Azure Local,Configure Azure Local private endpoints with proxy and Arc gateway,"Review how Azure Private Endpoints can be used when deploying Azure Local, with an enterprise proxy and with an Arc gateway.","This article describes the scenario where Azure Local is deployed with both an enterprise proxy and an Arc gateway and private endpoints are used. Currently, Azure Local offers the following distinct methods for outbound connectivity: Deploy Azure Local without an enterprise proxy and without an Arc gateway. Deploy Azure Local with an enterprise proxy but without an Arc gateway. Deploy Azure Local without an enterprise proxy but with an Arc gateway. Deploy Azure Local with both an enterprise pro",2026-04-22T22:07:00.000Z,concept-article,configuration,0.7,True,Describes a specific outbound connectivity scenario (enterprise proxy + Arc gateway + private endpoints) with product-specific configuration flows and constraints that go beyond generic networking knowledge.,new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-via-portal-disaggregated?view=azloc-2604,6A. 
Deploy via Azure portal,Deploy Azure Local Using the Azure Portal for Disaggregated Deployments - Azure Local,Deploy Azure Local disaggregated clusters via Azure portal,Learn how to deploy an Azure Local instance from the Azure portal for disaggregated deployments.,This article helps you deploy a disaggregated Azure Local instance using the Azure portal.,2026-04-22T22:28:00.000Z,how-to,deployment,0.7,True,Portal-based deployment article for a specific deployment type; includes product-specific deployment steps and constraints.,new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-via-portal?view=azloc-2604,6A. Deploy via Azure portal,Deploy an Azure Local instance using the Azure portal - Azure Local,Deploy an Azure Local instance from the Azure portal,Learn how to deploy an Azure Local instance from the Azure portal,This article helps you deploy an Azure Local instance using the Azure portal.,2026-04-22T22:07:00.000Z,how-to,deployment,0.65,True,Portal-based deployment flow for Azure Local with product-specific steps and options beyond generic deployment commands.,new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-arc-register-server-permissions?view=azloc-2604,4. 
Set up subscription permissions,Register your Azure Local machines with Azure Arc and assign permissions for deployment - Azure Local,Assign Azure Arc permissions to register Azure Local machines,Learn how to register your Azure Local machines with Azure Arc and assign permissions for deployment.,Applies to: Hyperconverged deployments of Azure Local This article describes how to set up the required permissions on your subscription to deploy Azure Local.,2026-04-22T22:07:00.000Z,how-to,security,0.7,True,Focuses on setting up required subscription permissions and registration steps; includes product-specific RBAC/permission configuration for deployment.,new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-azure-arc-gateway-overview?view=azloc-2604,Review Azure Arc gateway for Azure Local,Overview of Azure Arc gateway for Azure Local - Azure Local,Configure Azure Arc gateway for Azure Local deployments,Learn what is Azure Arc gateway for Azure Local.,"This article provides an overview of the Azure Arc gateway for Azure Local (formerly known as Azure Stack HCI). You can enable the Arc gateway on new deployments of Azure Local running software version 2506 and later. This article also describes how to create and delete the Arc gateway resource in Azure. Use the Arc gateway to significantly reduce the number of required endpoints needed to deploy and manage Azure Local instances. When you create the Arc gateway, connect to and use it for new dep",2026-04-22T22:07:00.000Z,how-to,configuration,0.7,True,"Explains enabling Arc gateway, supported software versions, and resource creation; product-specific configuration and deployment behavior.",new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-azure-resource-manager-template-disaggregated?view=azloc-2604,6B. 
Deploy via ARM template,Deploy Azure Local Using an Azure Resource Manager Template for Disaggregated Deployments - Azure Local,Deploy Azure Local disaggregated using ARM templates,Learn how to prepare and deploy an Azure Local instance using an Azure Resource Manager template for disaggregated deployments.,This article details how to use an Azure Resource Manager (ARM) template in the Azure portal to deploy a disaggregated Azure Local in your environment. The article also contains the prerequisites and the preparation steps required to begin the deployment. Important ARM template deployment of Azure Local systems is targeted for deployments-at-scale. The intended audience for this deployment is IT administrators who have experience deploying Azure Local instances. We recommend that you deploy a sy,2026-04-22T22:28:00.000Z,how-to,deployment,0.8,True,"ARM template deployment for disaggregated Azure Local is a product-specific deployment pattern, likely with parameter and scale constraints.",new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-azure-resource-manager-template?view=azloc-2604,6B. Deploy via ARM template,"Azure Resource Manager template deployment for Azure Local, version 23H2 - Azure Local",Deploy Azure Local 23H2 using ARM templates at scale,"Learn how to prepare and then deploy Azure Local instance, version 23H2 using the Azure Resource Manager template.",This article details how to use an Azure Resource Manager (ARM) template in the Azure portal to deploy an Azure Local in your environment. The article also contains the prerequisites and the preparation steps required to begin the deployment. Important ARM template deployment of Azure Local systems is targeted for deployments-at-scale. The intended audience for this deployment is IT administrators who have experience deploying Azure Local instances. 
We recommend that you deploy a system via the ,2026-04-22T22:07:00.000Z,how-to,deployment,0.75,True,ARM template deployment guidance including prerequisites and scale-focused recommendations; product-specific deployment pattern and constraints.,new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-install-os-disaggregated?view=azloc-2604,3A. Install manually via ISO,Install the Azure Local Operating System for Disaggregated Deployments - Azure Local,Install Azure Local OS for disaggregated deployments with SConfig,Learn how to install the Azure Local operating system on each machine of your disaggregated deployment using SConfig.,"Applies to: Hyperconverged deployments of Azure Local There are two distinct ways of installing the OS on your Azure Local machines. You can use the Install Azure Stack HCI wizard and SConfig or you can install and register the OS using simplified machine provisioning (preview). This article describes the OS install using the wizard only. To use simplified machine provisioning, see Install and register the OS using simplified machine provisioning.",2026-04-22T22:28:00.000Z,how-to,deployment,0.7,True,OS installation method for this deployment mode; likely includes product-specific deployment requirements and supported methods.,new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-install-os?view=azloc-2604,3A. Install manually via ISO,"Install Azure Stack HCI operating system, version 23H2 using SConfig - Azure Local",Install Azure Stack HCI 23H2 OS using SConfig,"Learn how to install the Azure Stack HCI operating system, version 23H2 on each machine of your system using SConfig.","Applies to: Hyperconverged deployments of Azure Local There are two distinct ways of installing the OS on your Azure Local machines. You can use the Install Azure Stack HCI wizard and SConfig or you can install and register the OS using simplified machine provisioning (preview). 
This article describes the OS install using the wizard only. To use simplified machine provisioning, see Install and register the OS using simplified machine provisioning.",2026-04-22T22:28:00.000Z,how-to,deployment,0.65,True,Step-by-step OS installation using a specific wizard and SConfig for Azure Local; includes product-specific deployment steps and options.,new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-introduction?view=azloc-2604,Read overview,"Azure Local, version 23H2 deployment overview - Azure Local",,"Learn about the deployment methods for Azure Local, version 23H2.",Applies to: Hyperconverged deployments of Azure Local This article is the first in the series of deployment articles that describe how to deploy Azure Local. This article applies to both single and multi-node deployments. The target audience for this article is IT administrators who are responsible for deploying Azure Local in their organization.,2026-04-22T22:07:00.000Z,overview,,0.3,False,"Deployment overview article; likely describes methods and flow without detailed matrices, limits, or configuration tables.",new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-local-identity-with-key-vault-template?view=azloc-2604,Deploy via ARM template,Deploy Azure Local using local identity with Azure Key Vault via an Azure Resource Manager Template - Azure Local,Deploy Azure Local with local identity and Key Vault via ARM template,Learn how to prepare and then deploy Azure Local using local identity with Azure Key Vault using an Azure Resource Manager (ARM) template.,"This article describes how to deploy Azure Local using local identity with Azure Key Vault by using an Azure Resource Manager (ARM) template configured for external DNS. The article also describes the prerequisites and the preparation steps required to begin the deployment. Important Use ARM template deployment for Azure Local systems at scale. 
This approach is intended for experienced IT administrators. Deploy a system via the Azure portal first, then use the ARM template for subsequent deploymen",2026-04-23T08:00:00.000Z,how-to,deployment,0.75,True,"Combines local identity, Key Vault, and ARM templates with external DNS; detailed, product-specific deployment configuration for scaled scenarios.",new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-local-identity-with-key-vault?view=azloc-2604,Deploy via Azure portal,Deploy Azure Local Using Local Identity with Azure Key Vault - Azure Local,Deploy Azure Local using local identity with Azure Key Vault,Learn how to use local identity with Azure Key Vault for Azure Local deployment.,This article describes how to use local identity with Azure Key Vault for Azure Local deployment.,2026-04-21T17:05:00.000Z,how-to,security,0.7,True,Describes using local identity integrated with Key Vault for deployment; product-specific identity and secret management configuration.,new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-prep-active-directory?view=azloc-2604,1. Prepare Active Directory,"Prepare Active Directory for Azure Local, version 23H2 deployment - Azure Local",Prepare Active Directory environment for Azure Local deployment,"Learn how to prepare Active Directory before you deploy Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local This article describes how to prepare your Active Directory environment before you deploy Azure Local. 
Active Directory requirements for Azure Local include: Note To manually assign the required permissions for Active Directory, create an OU, and block GPO inheritance, see Custom Active Directory configuration for your Azure Local.",2026-04-22T22:07:00.000Z,how-to,security,0.65,True,"Details AD requirements and preparation steps, including OU and GPO considerations; product-specific identity and permission configuration.",new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-prerequisites?view=azloc-2604,Complete prerequisites,"Prerequisites to deploy Azure Local, version 23H2 - Azure Local","Verify security, hardware, and network prerequisites for Azure Local deployment","Learn about the prerequisites to deploy Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local This article discusses the security, software, hardware, and networking prerequisites, and the deployment checklist in order to deploy Azure Local instance.",2026-04-22T22:07:00.000Z,install-set-up-deploy,deployment,0.6,True,"Prerequisites and checklist for deploying Azure Local, including concrete requirements for hardware, networking, and security that are product-specific.",new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-virtual?view=azloc-2604,Virtual deployment,"Deploy a virtual Azure Local, version 23H2 and 24H2 system - Azure Local",Deploy virtualized Azure Local 23H2/24H2 systems,"Describes how to perform an Azure Local, version 23H2 virtualized deployment.","Applies to: Hyperconverged deployments of Azure Local This article describes how to deploy a virtualized Azure Local (formerly Azure Stack HCI) instance on a host system running Windows Server 2022, Windows 11, or later operating system (OS). The host must have Hyper-V enabled for the deployment. You need administrator privileges for the Azure Local virtual deployment and should be familiar with the existing Azure Local solution. 
The deployment can take around 2.5 hours to complete. Important A ",2026-04-22T22:07:00.000Z,how-to,deployment,0.7,True,"Product-specific deployment article with OS version requirements, Hyper-V requirement, and approximate deployment time; focused on how to deploy this SKU.",new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-with-azure-arc-gateway?view=azloc-2604,With Arc gateway,Register Azure Local with Azure Arc using Arc Gateway - Azure Local,Register Azure Local with Azure Arc gateway and proxy,Learn how to register Azure Local using Azure Arc gateway Arc proxy. Both scenarios with and without proxy are configured.,"This article details how to register Azure Local using Azure Arc gateway and with the proxy configuration enabled. Once you create an Arc gateway resource in your Azure subscription, you can enable the Arc gateway features. For an overview of the Arc gateway, see About Azure Arc gateway for Azure Local. Configure proxy with a script: Using this method, you can configure Arc proxy with a script. This method is useful as you don't need to configure the Arc proxy across WinInet, WinHttp, or environm",2026-04-22T22:07:00.000Z,how-to,configuration,0.7,True,Details configuration of Arc gateway and proxy via scripts; product-specific connectivity and registration configuration.,new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-without-azure-arc-gateway?view=azloc-2604,Without Arc gateway,Register Azure Local with Azure Arc without using Arc Gateway - Azure Local,Register Azure Local with Azure Arc without gateway and configure proxy,Learn how to register Azure Local with Azure Arc with and without proxy setup. The proxy configuration can be done via an Arc script or via the Configurator app on Azure Local.,"This article details how to register Azure Local machines with Azure Arc without using an Arc gateway and with proxy configuration enabled. 
The proxy configuration can be done via an Arc script or via the Configurator app for Azure Local. Configure with a script: You can use an Arc script to configure registration settings. Set up via the Configurator app (Preview): Using this method, you can configure Azure Local registration via a user interface. This method is useful if you prefer not to use ",2026-04-22T22:07:00.000Z,how-to,configuration,0.7,True,Covers registration flows and proxy configuration via script and Configurator app; includes product-specific parameters and options.,new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/download-23h2-software?view=azloc-2604,2. Download the software,"Download Azure Stack HCI Operating System, version 23H2 software for Azure Local deployment - Azure Local",Download Azure Local 23H2 OS from Azure portal,"Learn how to download Azure Local, version 23H2 software from the Azure portal to deploy an Azure Local instance.","Applies to: Hyperconverged deployments of Azure Local This article describes how to download the operating system (OS) software from the Azure portal to deploy an Azure Local instance. The first step in deploying Azure Local is to download the OS from the Azure portal. The software download includes a free 60-day trial. However, if you've purchased Integrated System solution hardware from the Azure Local Catalog through your preferred Microsoft hardware partner, the OS should be preinstalled. 
In t",2026-04-22T22:07:00.000Z,how-to,deployment,0.6,True,"Explains how to obtain the specific OS image, including trial details and catalog-based preinstallation; product-specific deployment acquisition flow.",new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/enable-external-storage?view=azloc-2604,Connect External Storage (SAN),Enable External Storage (SAN) on Azure Local - Azure Local,Enable external SAN storage integration with Azure Local,Describes how to enable integration of external Storage from various SAN vendors to Azure Local.,,2026-04-22T22:28:00.000Z,how-to,integrations,0.65,True,"Describes integrating external SAN storage from various vendors with Azure Local, including product-specific integration steps and constraints.",new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/enable-sdn-integration?view=azloc-2604,Enable SDN integration,Enable Software-Defined Networking (SDN) enabled by Azure Arc on Azure Local using a PowerShell Action Plan - Azure Local,Enable SDN integration on Azure Local using PowerShell action plan,Describes how to enable integration of SDN enabled by Azure Arc using a PowerShell action plan on Azure Local.,This article describes how to enable and integrate software defined networking (SDN) on your existing Azure Local instance. 
You use a PowerShell action plan to enable SDN.,2026-04-22T22:07:00.000Z,how-to,configuration,0.7,True,Describes enabling SDN via a specific PowerShell action plan; includes product-specific configuration steps and scripts.,new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-deploy-portal?view=azloc-2604,Via Azure portal,Deploy rack aware cluster using the Azure portal - Azure Local,Deploy Azure Local rack aware clusters via Azure portal,"Learn how to deploy a rack aware cluster via the Azure portal with step-by-step guidance, including configuration, networking, and validation processes.",This article describes the steps to deploy Azure Local rack aware clusters using the Azure portal.,2026-04-22T22:07:00.000Z,how-to,deployment,0.8,True,"Step-by-step deployment of rack aware clusters through the portal, including configuration and validation processes unique to this feature.",new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-deploy-prep?view=azloc-2604,Prepare to deploy,Prepare to deploy rack aware cluster via the Azure portal - Azure Local,Prepare network and hardware for rack aware cluster deployment,Learn how to deploy Azure Local rack aware clusters with high resiliency using ToR switches and VLAN isolation for optimal network configurations.,"This article describes the preparation steps to deploy Azure Local rack aware clusters. 
It includes network design recommendations, machine configuration guidelines, and best practices for deployment.",2026-04-22T22:07:00.000Z,how-to,best-practices,0.75,True,"Includes network design recommendations, machine configuration guidelines, and deployment best practices specific to Azure Local rack aware clusters.",new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-deployment-via-template?view=azloc-2604,Via ARM template,Azure Resource Manager template deployment for Azure Local rack aware cluster - Azure Local,Deploy rack aware clusters using ARM templates,Learn how to prepare and then deploy Azure Local rack aware cluster using the Azure Resource Manager template.,"This article describes how to use an Azure Resource Manager (ARM) template in the Azure portal to deploy a rack aware cluster. Important ARM template deployment of rack aware cluster is targeted for deployments-at-scale. The intended audience for this deployment is IT administrators who have experience deploying rack aware clusters. We recommend that you deploy a system via the Azure portal first, and then perform subsequent deployments via the ARM template.",2026-04-22T22:07:00.000Z,how-to,deployment,0.85,True,ARM template-based deployment for rack aware clusters with guidance for scale deployments—product-specific deployment method and constraints.,new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-post-deployment?view=azloc-2604,Complete post-deployment tasks,Perform post deployment tasks on rack aware clusters - Azure Local,Complete post-deployment tasks for rack aware clusters,Learn about the post deployment tasks that you need to perform on your newly deployed rack aware cluster.,"After deploying rack aware cluster, either through the Azure portal or using the Azure Resource Manager deployment template, you need to complete a set of post-deployment tasks. 
This article describes the typical tasks required once your rack aware cluster is successfully deployed and all machines are up and running.",2026-04-22T22:07:00.000Z,how-to,deployment,0.7,True,Lists required post-deployment tasks after rack aware cluster deployment; these are operational deployment steps specific to Azure Local.,new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-readiness-check?view=azloc-2604,Assess network readiness via LLDP,Use LLDP validator to assess deployment readiness for Azure Local rack aware cluster - Azure Local,Use LLDP validator to check rack aware cluster readiness,How to use the LLDP validator to assess if your environment is ready for deploying Azure Local rack aware cluster.,This article describes how to use the Link Layer Discovery Protocol (LLDP) validator in a standalone mode to assess how ready your environment is for deploying rack aware cluster.,2026-04-22T22:07:00.000Z,how-to,configuration,0.7,True,Describes using an LLDP validator tool with product-specific steps and parameters to assess deployment readiness.,new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/sdn-express-23h2?view=azloc-2604,Using Express scripts,"Deploy infrastructure for Software Defined Networking managed by on-premises tools using SDN Express for Azure Local, version 23H2 - Azure Local",Deploy SDN infrastructure with SDN Express scripts,"Learn how to deploy infrastructure for Software Defined Networking managed by on-premises tools using SDN Express for Azure Local, version 23h2.","Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 In this article, you deploy an end-to-end Software Defined Network (SDN) infrastructure for Azure Local using SDN Express PowerShell scripts. The infrastructure includes a highly available (HA) Network Controller (NC), and optionally, a highly available Software Load Balancer (SLB), and a highly available Gateway (GW). 
The scripts support a phased deployment, where you can deploy just the Net",2026-04-22T22:07:00.000Z,how-to,deployment,0.7,True,"Stepwise deployment of SDN infrastructure using SDN Express PowerShell, including HA Network Controller, SLB, and gateways with phased deployment options. Contains product-specific deployment patterns and constraints.",new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/sdn-wizard-23h2?view=azloc-2604,Using Windows Admin Center,Deploy Software Defined Networking managed by on-premises tools with Windows Admin Center for Azure Local - Azure Local,Deploy SDN via Windows Admin Center on Azure Local,Learn how to deploy infrastructure for Software Defined Networking managed by on-premises tools with Windows Admin Center for Azure Local,"Applies to: Hyperconverged deployments of Azure Local This article describes how to deploy Software Defined Networking (SDN) managed by on-premises tools through Windows Admin Center after you deployed your Azure Local via the Azure portal. Windows Admin Center enables you to deploy all the SDN infrastructure components on your existing Azure Local, in the following deployment order: Alternatively, you can deploy the entire SDN infrastructure through the SDN Express scripts. You can also deploy an",2026-04-22T22:07:00.000Z,how-to,deployment,0.65,True,"Describes deploying all SDN components on Azure Local through Windows Admin Center, including supported deployment order and alternative SDN Express path. Product-specific deployment workflow.",new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/simplified-machine-provisioning?view=azloc-2604,3B. 
Install and register via simplified machine provisioning,Install and Register Azure Local Machines using Simplified Machine Provisioning (preview) - Azure Local,Provision Azure Local machines with simplified machine provisioning,Install and register Azure Local machines using simplified machine provisioning (preview).,"Applies to: Hyperconverged deployments of Azure Local This article describes how to use simplified machine provisioning to set up machines for an Azure Local instance. You can install the OS on your Azure Local machines in two distinct ways: you can manually install the OS using ISO images, or you can use the simplified machine provisioning process. This article covers only the installation and registration process by using simplified machine provisioning, which is currently in preview. To insta",2026-04-02T17:05:00.000Z,how-to,deployment,0.7,True,Describes a preview deployment mechanism with specific registration and installation flow unique to Azure Local.,new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/sql-server-23h2?view=azloc-2604,Run SQL Server,Deploy SQL Server on Azure Local Version 23H2 - Azure Local,Deploy SQL Server workloads on Azure Local 23H2,"This article provides guidance on how to deploy SQL Server on Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local This article provides guidance on how to deploy SQL Server on Azure Local, version 23H2.",2026-04-22T22:07:00.000Z,how-to,deployment,0.65,True,"Guidance for deploying SQL Server specifically on Azure Local, likely including configuration and deployment patterns unique to this platform.",new +https://learn.microsoft.com/en-us/azure/azure-local/deploy/troubleshoot-simplified-machine-provisioning?view=azloc-2604,Troubleshoot simplified machine provisioning,Troubleshoot Simplified Machine Provisioning for Azure Local (preview) - Azure Local,Troubleshoot simplified machine provisioning for Azure Local (preview),Learn how to troubleshoot 
simplified machine provisioning for Azure Local (preview).,"Important This feature is currently in PREVIEW. +See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability. This article describes how to troubleshoot simplified machine provisioning. You can use the following methods to troubleshoot:",2026-04-22T22:07:00.000Z,how-to,troubleshooting,0.7,True,Troubleshooting article for a specific provisioning feature; includes methods and likely error/solution mappings unique to this workflow.,new +https://learn.microsoft.com/en-us/azure/azure-local/faq?view=azloc-2604,FAQ,Azure Local FAQ - Azure Local,,"The Azure Local FAQ provides information about deployment types, Azure connectivity, data handling, and supported services.","The Azure Local FAQ provides information about deployment types, Azure connectivity, data handling, and supported services.",2026-02-23T23:04:00Z,faq,,0.3,False,FAQ summary only; likely mixed conceptual and policy answers without clear indication of detailed limits/configs in the snippet.,new +https://learn.microsoft.com/en-us/azure/azure-local/hybrid-capabilities-with-azure-services-23h2?view=azloc-2604,Hybrid capabilities with Azure services,"Hybrid capabilities with Azure services in Azure Local, version 23H2 - Azure Local",,"This article describes the cloud service components of Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local Your on-premises Azure Local solution integrates with Azure cloud using several cloud service components, such as Azure Local cloud service, Azure Arc, and other Azure hybrid services. 
This article describes the functionality provided by these cloud service components, and how they help provide hybrid capabilities to your Azure Local deployment.",2026-04-22T22:07:00.000Z,overview,,0.2,False,"Conceptual overview of hybrid capabilities and components; no concrete limits, configs, or decision matrices.",new +https://learn.microsoft.com/en-us/azure/azure-local/known-issues?view=azloc-2604,Known issues,Release notes with fixed and known issues in Azure Local - Azure Local,Resolve known issues in Azure Local releases,Read about the known issues and fixed issues in Azure Local.,"This article identifies critical known issues and their workarounds in Azure Local. These release notes are continuously updated, and as critical issues requiring a workaround are discovered, they're added. Before you deploy your Azure Local instance, carefully review the information contained here. Important For information about supported update paths for this release, see Release information. For more information about new features in this release, see What's new for Azure Local.",2026-04-23T17:08:00.000Z,troubleshooting-general,troubleshooting,0.7,True,Known issues and workarounds imply symptom→cause→solution mappings specific to Azure Local versions.,new +https://learn.microsoft.com/en-us/azure/azure-local/license-billing?view=azloc-2604,FAQ,OEM license and billing FAQ for Azure Local - Azure Local,,The FAQ provides information on the OEM license and billing for Azure Local.,This FAQ provides information on the OEM license and billing for Azure Local.,2026-04-06T17:04:00Z,faq,,0.2,False,"Billing and license FAQ; commercial details, not technical expert knowledge per defined categories.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/add-network-adapters-to-network-intents?view=azloc-2604,Add NICs to a Network ATC intent,Add NICs to an existing Network ATC intent on Azure Local - Azure Local,Add physical NICs to existing Network ATC intents on Azure 
Local,Learn how to add new physical network adapters to an existing Network ATC intent on an Azure Local instance.,"Applies to: Hyperconverged deployments of Azure Local This article shows you how to add new physical network adapters (NICs) to an existing Network ATC intent on an Azure Local instance. Use this procedure to expand an intent for more bandwidth, better redundancy, or to match new hardware standards in your cluster. For the related repair workflow, see Replace a failed NIC in an existing Network ATC intent.",2026-04-23T17:08:00.000Z,how-to,configuration,0.7,True,"Shows how to modify Network ATC intents by adding NICs; likely includes specific intent properties, adapter mappings, and configuration commands unique to Azure Local networking.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/add-server-disaggregated?view=azloc-2604,Add a node,Add a Node on Azure Local for Disaggregated Deployments - Azure Local,,Learn how to manage capacity on your Azure Local disaggregated deployment by adding a node.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to manage capacity by adding a node (often called scale-out) to your Azure Local instance. In this article, each server is referred to as a node.",2026-04-22T22:28:00.000Z,how-to,,0.3,False,"Adding a node is a capacity/scale-out how-to; summary suggests step-by-step operations without explicit limits, config matrices, or decision criteria.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/add-server?view=azloc-2604,Add a node,"Manage Capacity by Adding a Node on Azure Local, Version 23H2 - Azure Local",,"Learn how to manage capacity on your Azure Local, version 23H2 system by adding a node.","Applies to: Hyperconverged deployments of Azure Local This article describes how to manage capacity by adding a node (often called scale-out) to your Azure Local instance. 
In this article, each server is referred to as a node.",2026-04-22T22:07:00.000Z,how-to,,0.4,False,Operational guide to add a node; mostly procedural without detailed configuration parameter tables or limits.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/arc-extension-management?view=azloc-2604,Azure Arc extension management,Azure Arc extension management on Azure Local - Azure Local,Configure and manage Azure Arc extensions on Azure Local,This article describes how to manage Azure Arc extensions on Azure Local.,"Applies to: Azure Local 2311.2 and later This article explains how to install, upgrade, and manage Azure Arc extensions on Azure Local.",2025-12-23T23:03:00.000Z,how-to,configuration,0.65,True,"Covers installing, upgrading, and managing Arc extensions on Azure Local; likely includes extension types, parameters, and management settings specific to this platform.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/assign-public-ip-to-vm?view=azloc-2604,Assign public IP address to a VM,Assign a public IP address to a virtual machine - Azure Local,Assign SDN public IP addresses to Azure Local VMs,Learn how to assign a public IP address to a virtual machine in Software Defined Networking managed by on-premises tools.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This article describes how to use Windows Admin Center to assign a Software Defined Networking (SDN) public IP address to a virtual machine (VM) in Azure Local. By assigning a public IP address to a VM, you enable the VM to communicate with external networks, thereby enhancing its capabilities and extending its connectivity.",2026-04-22T22:07:00.000Z,how-to,configuration,0.7,True,"Explains how to assign SDN public IPs to VMs via Windows Admin Center, enabling external connectivity. 
Product-specific configuration steps and behavior.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/assign-vm-rbac-roles?view=azloc-2604,Assign RBAC role,Use built-in RBAC roles for Azure Local VM to manage Azure Local VMs enabled by Azure Arc - Azure Local,Assign RBAC roles for Azure Local Arc VMs,Learn how to use RBAC built-in roles to manage Azure Local VMs enabled by Azure Arc.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to use the Role-based Access Control (RBAC) to control access to Azure Local virtual machines (VMs) enabled by Azure Arc. You can use the built-in RBAC roles to control access to VMs and VM resources such as virtual disks, network interfaces, VM images, logical networks, and storage paths. You can assign these roles to users, groups, service principals, and managed identities.",2026-04-22T22:07:00.000Z,how-to,security,0.8,True,"Describes using built-in RBAC roles to control access to VM resources; will list specific role names and scopes, matching security criteria.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/attach-gpu-to-linux-vm?view=azloc-2604,Attach GPU to Linux VM,Attach a GPU to a Linux VM in Azure Local - Azure Local,Attach and use GPUs on Linux VMs in Azure Local,How to use a GPU with AI workloads running in an Ubuntu Linux VM on Azure Local.,"Applies to: Azure Local 2311.2 and later Note The recommended way to create and manage VMs on Azure Local is using the Azure Arc control plane. However, since the functionality described in this article isn't yet provided by Azure Arc, you can use Windows Admin Center or PowerShell as described in this article. 
The VMs created this way aren't enabled by Azure Arc, have limited manageability from the Azure Arc control plane, and fewer Azure Hybrid Benefits, including usage of Azure Update Manager ",2026-04-22T22:07:00.000Z,how-to,configuration,0.75,True,Shows how to attach GPUs to Ubuntu VMs on Azure Local with product-specific steps and constraints for AI workloads.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-arc-vm-management-overview?view=azloc-2604,What is Azure Local VM management?,What is Azure Local VM management - Azure Local,,Learn about using Azure Local VM management to provision and manage on-premises Windows and Linux virtual machines (VMs) running on Azure Local.,"Applies to: Hyperconverged deployments of Azure Local This article provides an overview of virtual machine (VM) management in hyperconverged deployments of Azure Local (formerly Azure Stack HCI), including its benefits, components, and a high-level workflow. Azure Local VM management enables IT admins to provision and manage Windows and Linux VMs hosted in an on-premises Azure Local environment. IT admins can use the feature to create, modify, delete, and assign permissions and roles to app owne",2026-04-22T22:07:00.000Z,how-to,,0.2,False,"High-level overview of VM management; conceptual description of benefits and components, not detailed configs or limits.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-arc-vm-management-prerequisites?view=azloc-2604,Review prerequisites,Azure Local VM management prerequisites - Azure Local,Satisfy prerequisites for Azure Local Arc VMs,Learn about the prerequisites for deploying Azure Local VMs enabled by Azure Arc.,Applies to: Hyperconverged deployments of Azure Local This article lists the requirements and prerequisites for Azure Local virtual machines (VMs) enabled by Azure Arc. 
We recommend that you review the requirements and complete the prerequisites before you manage your Azure Local VMs.,2026-04-22T22:07:00.000Z,how-to,configuration,0.6,True,"Prerequisites article typically enumerates specific requirements (versions, SKUs, settings); these are product-specific configuration and environment requirements.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-arc-vms-faq?view=azloc-2604,FAQs,Azure Local VMs Enabled by Azure Arc FAQ - Azure Local,,This article provides answers to questions related to Azure Local VM management.,Frequently asked questions about Azure Local VMs enabled by Azure Arc for version 2311.2 and later.,2025-12-23T23:03:00Z,faq,,0.3,False,"FAQ format; summary doesn’t indicate detailed error codes, configuration tables, or numeric limits—likely general Q&A rather than deep expert data.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-benefits-esu?view=azloc-2604,Extended Security Updates (ESUs),Extended Security Updates (ESU) on Azure Local - Azure Local,Configure Extended Security Updates on Azure Local,Learn how to get free extended security updates (ESUs) with Azure VM verification on Azure Local.,"Applies to: Azure Local 2311.2 and later The Extended Security Update (ESU) program enables you to get important security patches for legacy Microsoft products that are past the end of support. Getting ESU through Azure Local comes with additional benefits and implementation steps. This article explains the specifics for Azure Local. To get general information about the ESU program, products that are covered, and support dates, see the Product Lifecycle FAQ. 
For detailed steps to set up legacy OS",2026-04-22T22:07:00.000Z,overview,security,0.7,True,Details ESU specifics for Azure Local and Azure verification; includes product-specific security/compliance configuration steps.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-enhanced-management-managed-identity?view=azloc-2604,Enhanced management from Azure,Enhanced management of Azure Local from Azure - Azure Local,Configure enhanced Azure management using managed identity for Azure Local,Learn how to use enhanced Azure management for Azure Local. This enhanced management is enabled via Managed Identity created for your Azure Local.,"Applies to: Azure Local 2311.2 and later This guide describes the feature in the May 2023 cumulative update for Azure Local, version 22H2, that enables enhanced management from Azure.",2026-04-22T22:07:00.000Z,concept-article,security,0.7,True,"Describes enabling enhanced management via a specific managed identity for Azure Local, likely including role assignments, identity scopes, and Azure-side security configuration details unique to this product.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-site-recovery?view=azloc-2604,Use Azure Site Recovery,Protect your Hyper-V Virtual Machine workloads on Azure Local with Azure Site Recovery (preview) - Azure Local,Protect Azure Local Hyper-V VMs with Site Recovery,Use Azure Site Recovery to protect Hyper-V VM workloads running on Azure Local. (preview),"Applies to: Azure Local 2311.2 and later This guide describes how to protect Windows and Linux VM workloads running on your Azure Local if there's a disaster. You can use Azure Site Recovery to replicate your on-premises Azure Local virtual machines (VMs) into Azure and protect your business critical workloads. This feature is enabled on Azure Local running the May 2023 cumulative update of version 22H2 and later. Important This feature is currently in PREVIEW. 
+See the Supplemental Terms of Use f",2026-04-22T22:07:00.000Z,how-to,deployment,0.7,True,"Step-by-step guidance for using Azure Site Recovery with Azure Local Hyper-V workloads; includes product-specific deployment requirements and version applicability, going beyond generic disaster recovery tutorials.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/collect-log-files-arc-enabled-vms?view=azloc-2604,Collect log files for Azure Local VM,Collect log files for Azure Local VMs enabled by Azure Arc - Azure Local,Collect diagnostic logs for Azure Local Arc VMs,Learn how to collect log files for an Azure Local VM enabled by Azure Arc.,Applies to: Hyperconverged deployments of Azure Local Collect logs and other files to identify and troubleshoot issues with Azure Local virtual machines (VMs) enabled by Azure Arc.,2026-04-22T22:07:00.000Z,how-to,troubleshooting,0.7,True,"Focused on collecting logs to troubleshoot VM issues; likely specifies log locations and commands, matching troubleshooting criteria.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/collect-logs?view=azloc-2604,Collect logs,Collect diagnostic logs for Azure Local - Azure Local,Collect and upload Azure Local diagnostic logs via portal and PowerShell,Learn how to collect diagnostic logs and share them with Microsoft.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to collect diagnostic logs for Azure Local and send them to Microsoft via the Azure portal or PowerShell. These diagnostic logs help identify and fix any issues with your Azure Local solution. Important This feature is currently in PREVIEW. 
+See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general av",2026-04-22T22:07:00.000Z,how-to,configuration,0.6,True,"Describes log collection mechanisms and commands; includes specific log packages, locations, and PowerShell usage unique to Azure Local.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/configure-network-security-groups-with-tags?view=azloc-2604,Configure network security groups with tags,Configure network security groups with tags in Windows Admin Center - Azure Local,Use tags with SDN network security groups in WAC,Learn how to configure network security groups with tags in Windows Admin Center.,"Applies to: Azure Local 2311.2 and later Applies to: Windows Server 2025 This article describes how to configure network security groups with network security tags in Windows Admin Center. With network security tags, you can create custom user-defined tags, attach those tags to your virtual machine (VM) network interfaces, and apply network access policies (with network security groups) based on these tags.",2026-04-22T22:07:00.000Z,how-to,security,0.7,True,"Describes creating custom network security tags, attaching them to VM NICs, and applying NSG policies based on tags. Product-specific security tagging model.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/configure-proxy-settings-23h2?view=azloc-2604,Configure proxy,"Configure proxy settings for Azure Local, version 23H2 - Azure Local",Configure proxy settings for Azure Local 23H2,"Learn how to configure proxy settings for Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local Important Since the release of Azure Local 2506, it is not required to configure the proxy settings manually on your Azure Local Machines. Azure Local machines proxy configuration is done automatically during Arc registration. 
Make sure you follow the guidance for proxy environment deployments based on your registration method: Register Azure Local with Arc using proxy. Register Azure Local with Arc using proxy and Arc gateway. This article de",2026-04-22T22:07:00.000Z,how-to,configuration,0.8,True,"Explains proxy configuration behavior, including version-specific automation (post-2506) and guidance for different Arc registration methods—product-specific configuration details.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/configure-software-load-balancer?view=azloc-2604,Configure load balancer for high availability ports,Configure Software Load Balancer for high availability ports - Azure Local,Configure SLB high availability ports on Azure Local,Learn how to configure Software Load Balancer for high availability ports.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019 This article provides an overview of high availability ports rules and their purpose. It also describes the prerequisites for setting up such rules, the configuration steps, supported configurations, and associated limitations.",2026-04-22T22:07:00.000Z,how-to,configuration,0.7,True,"Describes HA ports rules, prerequisites, supported configurations, and limitations for SLB. 
Product-specific configuration and constraints for HA ports.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/connect-arc-vm-using-ssh?view=azloc-2604,Connect to Azure Local VMs,Connect to an Azure Local Virtual Machine using SSH or Remote Desktop Protocol over SSH or VM Connect (Preview) - Azure Local,Connect to Azure Local VMs using SSH and RDP,Learn how to use SSH or RDP over SSH or VM Connect feature (preview) to connect to an Azure Local VM enabled by Azure Arc.,Applies to: Hyperconverged deployments of Azure Local This article describes how to connect to an Azure Local VM in two scenarios:,2026-04-22T22:07:00.000Z,how-to,integrations,0.6,True,"Covers SSH/RDP over SSH and VM Connect; likely includes connection parameters, ports, and feature-specific settings, which are integration/coding patterns for remote access.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/create-arc-virtual-machines?view=azloc-2604,Run Azure Local VMs,Create Azure Local Virtual Machines Enabled by Azure Arc - Azure Local,Create Azure Local VMs enabled by Azure Arc,Learn how to view your Azure Local instance in the Azure portal and create Azure Local VMs enabled by Azure Arc.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to create Azure Local virtual machines (VMs) enabled by Azure Arc, starting with the VM images that you created on your Azure Local instance. You can create Azure Local VMs by using the Azure CLI, the Azure portal, or an Azure Resource Manager template (ARM template).",2026-04-22T22:07:00.000Z,how-to,integrations,0.7,True,"Shows how to create Arc-enabled VMs via CLI, portal, and ARM templates; includes Azure Local–specific parameters and integration patterns.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/create-logical-networks?view=azloc-2604,3. 
Create logical networks,Create logical networks for Azure Local virtual machines enabled by Azure Arc - Azure Local,Configure logical networks for Azure Local Arc VMs,Learn how to create logical networks on Azure Local. The Azure Local VMs enabled by Azure Arc running on your system use this logical network.,Applies to: Hyperconverged deployments of Azure Local This article describes how to create or add logical networks for application workloads running on your Azure Local instance. Any Azure Local virtual machines (VMs) that you create use these logical networks. A logical network is a logical representation of a physical network where Azure Local VMs can be provisioned. It defines how VM network interfaces connect to the underlying network. Logical networks allow you to configure networking sett,2026-04-22T22:07:00.000Z,how-to,configuration,0.7,True,"Defines logical networks and their properties for Azure Local; likely includes network settings (address spaces, VLANs, etc.), which are product-specific configuration details.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/create-network-interfaces?view=azloc-2604,4. Create network interfaces,Create network interfaces for virtual machines on Azure Local - Azure Local,Configure network interfaces for Azure Local VMs,Learn how to create network interfaces on an existing logical network associated with your Azure Local. The Azure Local VM enabled by Azure Arc uses these network interfaces.,Applies to: Hyperconverged deployments of Azure Local This article describes how to create network interfaces that you can associate with an Azure Local virtual machine (VM). 
You can create network interfaces using the Azure portal or Azure Command-Line Interface (CLI).,2026-04-22T22:07:00.000Z,how-to,configuration,0.65,True,"Describes creating NICs on logical networks; includes NIC properties and settings specific to Azure Local, fitting configuration.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/create-network-security-groups?view=azloc-2604,Create Network Security Groups,"Create network security groups, network security rules, default network access policies on Azure Local VMs - Azure Local","Create NSGs, rules, and default access policies for Azure Local VMs","Learn how to create network security groups, network security rules, and default network access policies on Azure Local VMs using the Azure CLI or the Azure portal.",This article explains how to create and configure network security groups (NSGs) to manage data traffic flow after you enable network controller on your Azure Local.,2026-04-22T22:07:00.000Z,how-to,security,0.75,True,Covers NSG and rule creation plus default access policies via CLI/portal; product-specific network security configuration details.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/create-storage-path?view=azloc-2604,1. Create a storage path,Create storage path for Azure Local virtual machines images - Azure Local,Configure storage paths for Azure Local VM images,Learn how to create a storage path for use with VM images for your Azure Local instance.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to create a storage path for VM images used on your Azure Local instance. Storage paths are an Azure resource and are used to provide a path to store VM configuration files, VM images, and VHDs on your system. 
You can create a storage path using the Azure CLI or Azure portal.",2026-04-22T22:07:00.000Z,how-to,configuration,0.65,True,"Defines storage path resource and shows how to configure it via CLI/portal; includes product-specific resource properties and settings, fitting configuration.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disaster-recovery-infrastructure-resiliency?view=azloc-2604,Infrastructure resiliency,Infrastructure Resiliency for Azure Local - Azure Local,Design resilient Azure Local infrastructure for DR,Infrastructure resiliency considerations for Azure Local.,"This article explores the key infrastructure elements that contribute to a resilient Azure Local deployment and how they support continuity in the face of hardware faults, network outages, and site-level disasters. Infrastructure resiliency is the foundation of a robust disaster recovery strategy for Azure Local deployments. Before virtual machines (VMs) and applications can be protected, the underlying physical and logical infrastructure must be designed to withstand failures and disruptions. T",2026-01-28T23:03:00.000Z,concept-article,architecture-patterns,0.65,True,"Explores infrastructure elements for resiliency and continuity; provides Azure Local–specific resiliency patterns and design trade-offs, fitting architecture-patterns.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disaster-recovery-overview?view=azloc-2604,Overview,Disaster Recovery for Azure Local Virtual Machines - Azure Local,Plan disaster recovery strategy for Azure Local VMs,Disaster recovery considerations for Azure Local virtual machines.,"Disaster recovery planning is strategic for any organization using IT infrastructure, and its importance is magnified in hybrid environments that use Azure Local. 
Ensuring business systems and services are resilient and can recover from disruptions, ranging from localized hardware failures to site-wide disasters, is vital for maintaining business operations, safeguarding data, and preserving trust. For Azure Local instance deployments that blend edge infrastructure with Azure cloud services a ca",2026-01-28T23:03:00.000Z,concept-article,architecture-patterns,0.6,True,"Disaster recovery considerations for Azure Local VMs; likely includes product-specific DR patterns and when to use them, which is architecture/DR pattern guidance.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disaster-recovery-vm-resiliency?view=azloc-2604,Overview,Virtual Machine Resiliency for Azure Local - Azure Local,,Virtual machine resiliency considerations for Azure Local.,"After reviewing and implementing the design considerations for infrastructure resiliency at the platform level, it's essential to understand how your virtual machines (VMs) and applications are resilient to failures. This understanding helps you enable them to detect, withstand, and recover from failures within an acceptable time period. Resiliency is fundamental to maintaining continuous operations for business-critical applications. By default, all virtual machines (VMs) on Azure Local are desig",2026-03-11T22:07:00.000Z,reliability-article,,0.4,False,"Resiliency considerations for VMs are largely conceptual design guidance without concrete numeric thresholds, configuration tables, or product-specific limits.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disaster-recovery-workloads-resiliency?view=azloc-2604,Workloads resiliency,Workloads Resiliency for Azure Local - Azure Local,,Disaster recovery workloads for Azure Local.,"Disaster recovery for workloads running on Azure Local virtual machines (VMs) requires a layered approach that aligns infrastructure-level protections with application-specific continuity strategies. 
Whether you're hosting SQL Server databases or delivering virtual desktops through Azure Virtual Desktop, each workload type has unique recovery requirements and dependencies. Azure Local supports a wide range of disaster recovery technologies allowing you to tailor recovery plans to meet business c",2026-01-28T23:03:00.000Z,concept-article,,0.4,False,"Layered DR strategy for workloads is high-level resiliency guidance without specific numeric RPO/RTO thresholds, configuration tables, or product-specific limits.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-acquire?view=azloc-2604,Acquire disconnected operations,Acquire Disconnected Operations for Azure Local - Azure Local,Acquire and provision Azure Local disconnected operations appliance,Learn how to set up disconnected operations for Azure Local. Create a resource in the Azure portal and download the necessary files.,"This article explains how to acquire disconnected operations for Azure Local. Learn how to create a virtual appliance resource in the Azure portal, download installation files, and get support from Microsoft for your deployment.",2026-03-26T22:08:00.000Z,how-to,deployment,0.65,True,"Covers how to create the virtual appliance resource and download installation files; this is product-specific deployment and acquisition procedure, not generic deployment.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-arc-vm?view=azloc-2604,Azure Local VMs,Disconnected operations with Azure Local VMs enabled by Azure Arc - Azure Local,,Learn how to manage Azure Local VMs running disconnected.,"This article provides a brief overview of management features for Azure Local virtual machine (VM) for disconnected operations. It covers the benefits, components, and high-level workflow. This feature closely mirrors Azure Local VM capabilities and references many Azure Local VM articles for connected operations. 
You learn about key differences and limitations of disconnected operations.",2026-03-09T22:15:00.000Z,concept-article,,0.3,False,Described as a brief overview of management features and differences; likely conceptual with references to other articles rather than detailed configs or error mappings.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-azure-container-registry?view=azloc-2604,Azure Container Registry,Deploy Azure Container Registry with Disconnected Operations for Azure Local - Azure Local,Deploy and manage Azure Container Registry on Azure Local disconnected,Learn how to deploy and manage Azure Container Registry with disconnected operations for Azure Local.,"This article explains how to deploy and manage Azure Container Registry on disconnected operations running on Azure Local. It provides an overview of the service, prerequisites, deployment steps, and how to manage images in the registry.",2026-02-23T23:04:00.000Z,how-to,deployment,0.65,True,Service-specific deployment and management of ACR in a disconnected Azure Local environment is specialized deployment/integration knowledge.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-back-up-restore?view=azloc-2604,Back up disconnected operations,Back up Azure Local Disconnected Environments - Azure Local,Configure and run backups for Azure Local disconnected environments,Learn how to back up Azure Local environments running disconnected. Configure parameters and trigger backups.,"This article explains the backup process for disconnected operations for Azure Local environments. It provides practical steps to trigger a backup and parameter configurations to customize it. Operators need access to the Operator subscription and role-based access control (RBAC) permissions. 
For more information, see Disconnected operations for Azure Local.",2026-04-23T22:03:00.000Z,concept-article,configuration,0.7,True,"Includes backup parameters, how to trigger backups, and RBAC requirements; these are concrete configuration details for a specific product scenario.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-billing?view=azloc-2604,Billing,Billing for Disconnected Operations for Azure Local - Azure Local,Understand billing and capacity-based pricing for Azure Local disconnected operations,Learn how disconnected operations for Azure Local are billed.,This article explains billing for disconnected operations for Azure Local. The article also covers the capacity-based pricing model that uses physical processor cores and the licensing requirements for Windows Server virtual machines (VMs).,2026-03-26T22:08:00.000Z,concept-article,decision-making,0.65,True,Explains capacity-based pricing using physical cores and Windows Server VM licensing; likely includes concrete billing rules and trade-offs for planning.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-cli?view=azloc-2604,Azure CLI,Use Azure CLI for Disconnected Operations for Azure Local - Azure Local,Configure Azure CLI for Azure Local disconnected environments,"Learn how to configure Azure CLI for disconnected operations on Azure Local, including cloud setup, certificate trust, and extension installation.","This article explains how to install and configure the Azure Command-Line Interface (CLI) and its extensions for disconnected operations for Azure Local. 
It provides an overview of the CLI, supported versions, installation steps, and how to set up the CLI for disconnected operations.",2026-02-23T23:04:00.000Z,how-to,configuration,0.7,True,"Details on supported CLI versions, cloud setup, certificate trust, and extensions for disconnected mode are product-specific configuration parameters.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-deploy?view=azloc-2604,Deploy disconnected operations,Deploy Disconnected Operations for Azure Local - Azure Local,Deploy Azure Local management cluster for disconnected operations,Learn how to deploy disconnected operations for Azure Local in your datacenter.,"This article explains how to deploy disconnected operations for Azure Local in your datacenter. This step is key to deploying and operating Azure Local without any outbound network connection. After you deploy the management cluster (control plane), you deploy your first Azure Local instance.",2026-04-09T22:04:00.000Z,how-to,deployment,0.7,True,Explains deploying the management cluster and first instance in a no-outbound-connectivity scenario; includes product-specific deployment flow and constraints.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-fallback?view=azloc-2604,Fallback log collection,Appliance Fallback Log Collection for Disconnected Operations with Azure Local VMs Enabled by Azure Arc - Azure Local,Use appliance fallback log collection for Azure Local Arc-enabled VMs,Export and send logs for disconnected operations with Azure Local VMs enabled by Azure Arc.,This article explains how to use appliance fallback logging to export and send logs to Microsoft when Azure Local VMs operate in disconnected mode. 
This process helps you troubleshoot issues when standard log collection isn't available.,2026-02-23T23:04:00.000Z,how-to,troubleshooting,0.7,True,"Describes fallback logging when standard collection fails, including how to export and send logs; this is specialized troubleshooting/logging guidance.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-identity?view=azloc-2604,Identity,Plan your Identity for Disconnected Operations on Azure Local - Azure Local,Plan identity architecture for Azure Local disconnected environments,Plan and integrate your identity on disconnected operations for Azure Local.,"This article explains how to plan and integrate your identity for disconnected operations on Azure Local. Learn how to set up your identity solution, and understand the actions and roles available to operators.",2026-02-23T23:04:00.000Z,concept-article,security,0.65,True,"Identity planning for disconnected Azure Local will include product-specific roles, actions, and identity integration patterns that qualify as security/identity configuration guidance.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-known-issues?view=azloc-2604,Known issues,Release Notes for Disconnected Operations for Azure Local - Azure Local,Resolve known issues in Azure Local disconnected operations,Read about the known issues and fixed issues for disconnected operations for Azure Local.,This article identifies critical known issues and their workarounds in disconnected operations for Azure Local. These release notes are updated continuously to include critical issues and required workarounds. 
Review this information carefully before you deploy disconnected operations for Azure Local.,2026-03-26T22:08:00.000Z,concept-article,troubleshooting,0.7,True,Release notes for known issues and workarounds typically map specific symptoms to causes and fixes; this is product- and version-specific troubleshooting knowledge.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-monitoring?view=azloc-2604,Overview,Monitor Disconnected Operations in Azure Local - Azure Local,Integrate monitoring solutions with Azure Local disconnected operations,Learn how to monitor disconnected operations for Azure Local to ensure system reliability and performance.,"This article explains how to monitor disconnected operations for Azure Local by integrating with external monitoring solutions. Learn how to use Microsoft, non-Microsoft, and open-source monitoring systems to ensure the reliability and performance of your infrastructure and workloads.",2026-02-23T23:04:00.000Z,concept-article,configuration,0.6,True,Describes integrating Microsoft and non-Microsoft monitoring systems in a disconnected scenario; likely includes product-specific configuration endpoints and patterns.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-network?view=azloc-2604,Network,Plan Your Network for Disconnected Operations on Azure Local - Azure Local,Design network architecture for Azure Local disconnected operations,Plan and integrate your network for disconnected operations on Azure Local.,This article describes how to plan your network for disconnected operations on Azure Local. 
It covers key design considerations and requirements to help ensure reliable integration and performance in a disconnected environment.,2026-02-23T23:04:00.000Z,concept-article,architecture-patterns,0.65,True,"Network planning article for a niche disconnected scenario likely includes product-specific topology patterns, requirements, and trade-offs beyond generic networking concepts.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-on-demand-logs?view=azloc-2604,On-demand log collection,Collect Logs On-Demand with Azure Local Disconnected Operations - Azure Local,Collect on-demand logs via PowerShell for Azure Local disconnected operations,Learn how to use the PowerShell module to collect logs on-demand with disconnected operations for Azure Local.,This article explains how to collect logs on-demand for disconnected operations for Azure Local by using the PowerShell module. You learn how to provide logs for troubleshooting and support when Azure Local operates in disconnected mode.,2026-04-16T22:02:00.000Z,how-to,troubleshooting,0.7,True,Focused on log collection for troubleshooting and support using a specific PowerShell module; this is a product-specific troubleshooting/logging procedure.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-overview?view=azloc-2604,Overview,Disconnected operations for Azure Local overview - Azure Local,,"Learn how to use disconnected operations to deploy and manage Azure Local. Build secure, compliant private clouds for remote or sovereign environments.","Disconnected operations enable you to deploy and manage Azure Local instances to build sovereign private clouds. 
This article explains how this feature supports compliance, security, and remote deployments.",2026-02-25T18:04:00.000Z,overview,,0.2,False,"High-level overview of disconnected operations, focused on concepts, benefits, and compliance; no concrete limits, configs, or error mappings.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-pki?view=azloc-2604,Public key infrastructure,Understand Public Key Infrastructure (PKI) requirements for disconnected operations on Azure Local - Azure Local,Configure PKI and certificates for Azure Local disconnected operations,Learn about public key infrastructure (PKI) requirements for disconnected operations on Azure Local. Discover how to create certificates to secure endpoints and ensure a trusted deployment.,This article explains the public key infrastructure (PKI) requirements for disconnected operations on Azure Local. You learn how to create certificates to secure appliance endpoints and ensure secure communication in your environment.,2026-02-25T18:04:00.000Z,concept-article,security,0.7,True,PKI requirements and certificate details for securing Azure Local appliance endpoints are product-specific security configuration knowledge.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-policy?view=azloc-2604,Azure Policy,Use Azure Policy in a Disconnected Azure Local Environment - Azure Local,Configure Azure Policy in disconnected Azure Local environments,Learn how to use Azure Policy to enforce compliance and manage resources in a disconnected Azure Local environment.,"This article explains how to use Azure Policy in a disconnected Azure Local environment to enforce compliance and manage resources at scale. 
Azure Policy helps organizations meet standards by checking resource properties against business rules, even when disconnected from the Azure cloud.",2026-02-23T23:04:00.000Z,concept-article,configuration,0.65,True,Using Azure Policy in a disconnected control plane requires product-specific configuration steps and parameters for policy evaluation and enforcement.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-powershell?view=azloc-2604,Azure PowerShell,Use Azure PowerShell for Disconnected Operations on Azure Local - Azure Local,Set up Azure PowerShell for Azure Local disconnected operations,Learn how to use Azure PowerShell for disconnected operations on Azure Local.,This article explains how to configure Azure PowerShell for disconnected operations on Azure Local.,2026-04-06T17:04:00.000Z,how-to,configuration,0.65,True,"Covers configuring Azure PowerShell specifically for disconnected Azure Local, including modules and environment settings; this is product-specific configuration.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-prepare?view=azloc-2604,Prepare Azure Local nodes,Prepare Azure Local for Disconnected Deployments - Azure Local,Prepare Azure Local nodes for disconnected deployment,"Prepare your Azure Local environment for disconnected deployments. Learn how to set up nodes, configure networking, and ensure deployment readiness.",This article explains how to prepare an Azure Local node for deployment in disconnected environments. Prepare your environment to ensure a successful setup of your Azure Local instance using a local control plane. 
Complete these steps before setting up any Azure Local instances in your disconnected environment.,2026-02-23T23:04:00.000Z,how-to,deployment,0.7,True,Describes preparing nodes and networking for a specific disconnected deployment model; contains product-specific deployment requirements and readiness steps.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-registration?view=azloc-2604,Register disconnected operations,Register Disconnected Operations for Azure Local - Azure Local,Register Azure Local disconnected operations for compliance,Learn how to register disconnected operations for Azure Local to ensure compliance with deployment requirements.,This article explains how to register disconnected operations for Azure Local after deploying your management cluster with the control plane. Learn how to ensure compliance with Azure Local requirements through proper registration.,2026-03-04T18:05:00.000Z,how-to,configuration,0.65,True,Registration steps and requirements for disconnected operations are product-specific configuration/provisioning details needed for compliance.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-restore?view=azloc-2604,Restore disconnected operations,Restore Azure Local Disconnected Environments - Azure Local,Configure and execute restores for Azure Local disconnected environments,Learn how to restore an Azure Local environment running in disconnected mode. Configure restore parameters and trigger a restore operation.,"This article explains the restore process for disconnected operations for Azure Local environments. It provides practical steps to trigger a restore and parameter configurations to customize it. Operators need access to the Operator subscription and role-based access control (RBAC) permissions. For more information, see Disconnected operations for Azure Local. 
Important The restore operation supports restoring the backup to the same version of the Azure Local disconnected environment.",2026-04-23T22:03:00.000Z,concept-article,configuration,0.7,True,"Covers restore parameters, triggering restore, and version constraints (same-version restore); these are specific configuration and operational details.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-security?view=azloc-2604,Security,Security Controls with Disconnected Operations for Azure Local - Azure Local,Apply security controls for Azure Local disconnected VMs,Learn about the security considerations and compliance regulations for disconnected operations for Azure Local.,This article explains the security considerations and compliance regulations for disconnected operations with Azure Local VMs enabled by Azure Arc. Learn how to protect your environment and meet regulatory standards when running Azure Local VMs disconnected.,2026-02-23T23:04:00.000Z,concept-article,security,0.7,True,Focuses on security considerations and compliance regulations for Azure Local VMs in disconnected mode; likely includes product-specific security settings and compliance mappings.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-update?view=azloc-2604,Update disconnected operations,Update Disconnected Operations for Azure Local - Azure Local,Update Azure Local disconnected operations appliance safely,Learn how to update disconnected operations for Azure Local.,This article explains how to update disconnected operations for Azure Local. 
Learn how to apply updates to the appliance to ensure optimal performance and reliability in disconnected environments.,2026-04-14T22:03:00.000Z,how-to,deployment,0.65,True,Describes how to apply updates to the appliance in disconnected mode; update flows and constraints are product-specific deployment/operations knowledge.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-whats-new?view=azloc-2604,What's new in disconnected operations?,What's New in Disconnected Operations for Azure Local - Azure Local,,Find out about the new features and enhancements in disconnected operations for Azure Local.,"This article describes new features and improvements in disconnected operations for Azure Local. Before you deploy disconnected operations with Azure Local, review the Known issues to understand current limitations and available workarounds.",2026-02-23T23:04:00.000Z,concept-article,,0.2,False,"What's new/release highlights; does not indicate detailed configuration, limits, or troubleshooting content.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/drift-detection?view=azloc-2604,Drift detection,Drift Detection for Azure Local - Azure Local,Use drift detection to maintain Azure Local configuration consistency,Learn how Azure Local's Drift Detection framework ensures system reliability by continuously validating component states against a baseline.,"This article explains how the drift detection framework identifies configuration deviations, improves troubleshooting, and helps reduce configuration-related issues in your Azure Local environment.",2026-02-19T23:06:00.000Z,overview,best-practices,0.6,True,"Explains drift detection framework, how it validates component states, and how it aids troubleshooting; includes product-specific guidance on managing configuration drift.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/enable-nested-virtualization?view=azloc-2604,Enable nested virtualization,Enable nested 
virtualization in Azure Local - Azure Local,Enable nested virtualization on Azure Local clusters,Learn how to enable nested virtualization in Azure Local.,"This article provides an overview of nested virtualization in Azure Local and how to enable it. Nested virtualization lets you enable virtualization capabilities inside a Hyper-V virtual machine (VM). This allows you to maximize your hardware investments and gain flexibility in evaluation and testing scenarios. Other use cases include enabling security features, such as Virtualization based security (VBS). Important Azure Local provides virtualization capabilities to run workloads in VMs. Running",2026-04-22T22:07:00.000Z,how-to,configuration,0.7,True,"Explains enabling nested virtualization with Azure Local-specific constraints and configuration steps, including security-related use cases like VBS.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/gateway-connections?view=azloc-2604,Manage gateway connections,Manage Azure Local gateway connections using Windows Admin Center - Azure Local,Manage SDN gateway connections in Azure Local,"Learn how to create, delete, and update gateway connections using Windows Admin Center after you deploy Software Defined Networking (SDN) managed by on-premises tools.","Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This article describes how to create, delete, and update gateway connections using Windows Admin Center after you deploy Software Defined Networking (SDN) managed by on-premises tools. Gateways are used for routing network traffic between a virtual network and another network, either local or remote. There are three types of gateway connections – Internet Protocol Security (IPsec), Generic Rou
Includes product-specific connection types and configuration steps.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/get-remote-support?view=azloc-2604,Get remote support,Get remote support for Azure Local - Azure Local,,Learn how to get remote support for the Azure Stack HCI Operating System.,"Applies to: Azure Local 2311.2 and later This article explains how to get remote support for the Azure Stack HCI operating system for Azure Local. +It gives an overview of remote support, the terms and conditions, and the steps to enable remote support on your Azure Local. It also covers setting up proxy settings, submitting a support request, and other remote support tasks.",2025-12-23T23:03:00.000Z,how-to,,0.3,False,"Procedural overview for enabling remote support; likely step-by-step UI/portal flow without detailed config tables, limits, or error-code mappings.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/get-support-for-deployment-issues?view=azloc-2604,Get support for deployment issues,Get support for Azure Local deployment issues - Azure Local,,"Learn how to get Microsoft support for Azure Local deployment issues, including log collection and remote support.","Applies to: Hyperconverged deployments of Azure Local This article describes how to get Microsoft support for Azure Local deployment issues, including log collection and remote support.",2026-04-22T22:07:00.000Z,how-to,,0.4,False,"Explains how to get support, including log collection and remote support, but primarily process-oriented rather than technical configuration or error mapping.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/get-support?view=azloc-2604,Get support for Azure Local,Get support for Azure Local - Azure Local,,This article provides guidance on how to get support for Azure Local.,"Applies to: Azure Local 2311.2 and later This article provides guidance on how to get support for Azure Local. 
Azure Local follows the same support process as Azure. Enterprise customers can follow the process described in Create an Azure support request. If you're a customer of a Cloud Solution Provider (CSP), contact your CSP for support. Updates for Azure Local are released monthly to enhance customer experience. To keep your Azure Local instance in a supported state, you have up to six months",2026-04-22T22:07:00.000Z,how-to,,0.3,False,General guidance on obtaining support and staying in a supported state; lacks detailed technical configuration or troubleshooting content.,new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/gpu-manage-via-device?view=azloc-2604,Manage GPUs using Discrete Device Assignment,Manage GPUs via Discrete Device Assignment for Azure Local (preview) - Azure Local,Configure GPU Discrete Device Assignment on Azure Local,Learn how to manage GPUs via Discrete Device Assignment for Azure Local (preview).,"Applies to: Hyperconverged deployments of Azure Local This article describes how to manage GPUs using Discrete Device Assignment (DDA) for Azure Local VMs enabled by Azure Arc. For GPU DDA management on Azure Kubernetes Service (AKS) enabled by Azure Arc, see Use GPUs for compute-intensive workloads. DDA allows you to dedicate a physical graphical processing unit (GPU) to your workload. 
In a DDA deployment, virtualized workloads run on the native driver and typically have full access to the GPU's",2026-04-22T22:28:00.000Z,how-to,configuration,0.8,True,Covers DDA setup for GPUs on Azure Local Arc-enabled VMs with product-specific parameters and constraints for direct device assignment.,new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/gpu-manage-via-partitioning?view=azloc-2604,Manage GPUs via partitioning,Manage GPUs using partitioning for Azure Local (preview) - Azure Local,Configure GPU partitioning (GPU-P) for Azure Local VMs,Learn how to manage GPUs using partitioning for Azure Local (preview).,Applies to: Hyperconverged deployments of Azure Local This article describes how to manage GPUs using partitioning (GPU-P) for Azure Local virtual machines (VMs) enabled by Azure Arc. GPU-P allows you to share a GPU with multiple workloads by splitting the GPU into dedicated fractional partitions.,2026-04-22T22:28:00.000Z,how-to,configuration,0.8,True,Describes GPU-P configuration to share GPUs across workloads with Azure Local-specific steps and partitioning options.,new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/gpu-preparation?view=azloc-2604,Prepare GPUs,Prepare GPUs for Azure Local instance - Azure Local,Prepare GPUs for Azure Local VMs and AKS workloads,Learn how to prepare GPUs for an Azure Local instance.,Applies to: Hyperconverged deployments of Azure Local This article describes how to prepare graphical processing units (GPUs) on your Azure Local instance for workloads running on Azure Local virtual machines (VMs) enabled by Azure Arc and on Azure Kubernetes Service (AKS) enabled by Azure Arc. 
GPUs are used for computation-intensive workloads such as machine learning and deep learning.,2026-04-22T22:07:00.000Z,how-to,configuration,0.8,True,Details GPU preparation steps and requirements for Azure Local VMs and AKS; includes product-specific configuration and environment setup.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/health-alerts-via-azure-monitor-alerts?view=azloc-2604,Health alerts,Use Azure Monitor alerts for Azure Local health alerts - Azure Local,Use Azure Monitor alerts to act on Azure Local health issues,Learn how to use the Azure Monitor alerts to respond to Azure Local health alerts.,"Applies to: Hyperconverged deployments of Azure Local The OS health service for Azure Local continuously monitors your Azure Local system to detect over 80 health issues across various components, such as physical and virtual disk, storage pool capacity, volume capacity, network interface, storage QoS, virtual machines (VMs), and VHDs. It provides information about the affected component, including the cause, time of the issue, and recommendations to mitigate it. You can view health issues like ",2026-04-22T22:07:00.000Z,how-to,configuration,0.65,True,Explains mapping OS health service issues to Azure Monitor alerts; includes alert rule configuration and health signal specifics unique to Azure Local.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-actions?view=azloc-2604,Track Health Service actions,Track Health Service actions - Azure Local,Track and understand Health Service automated actions on Azure Local,Learn more about: Health Service actions,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 The Health Service, first released in Windows Server 2016, improves the day-to-day monitoring and operational experience for clusters running Storage Spaces Direct. This topic describes workflows that the Health Service automates. 
The Health Service generates ""Actions"" to verify that they are taken autonomously, or to track their progress or outcome. Unlike logs, Actions disappear shortly afte",2026-01-28T23:03:00.000Z,concept-article,configuration,0.6,True,Describes Health Service workflows and actions; includes action types and behavior details specific to Azure Local clusters.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-cluster-performance-history?view=azloc-2604,Get cluster performance history,Cluster performance history - Azure Local,Retrieve cluster performance history with Health Service on Azure Local,Learn more about: Cluster performance history,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019 The Health Service reduces the work required to get live performance and capacity information from your Storage Spaces Direct cluster. One cmdlet provides a curated list of essential metrics, which are collected efficiently and aggregated dynamically across nodes, with built-in logic to detect cluster membership. All values are real-time and point-in-time only. 
For additional information, see Performance history fo",2025-12-23T23:03:00.000Z,how-to,configuration,0.6,True,Explains cmdlets and metrics for performance history; includes metric sets and usage patterns unique to Azure Local/Storage Spaces Direct.,new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-faults?view=azloc-2604,View Health Service faults,View Health Service faults - Azure Local,Interpret and resolve Azure Local Health Service faults,Learn more about Health Service faults,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019 This article provides detailed information about Health Service faults in Azure Local and Windows Server.",2025-12-23T23:03:00.000Z,how-to,troubleshooting,0.7,True,Provides detailed information about Health Service faults; likely maps fault IDs/messages to causes and remediation steps.,new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-overview?view=azloc-2604,Monitor clusters with Health Service,Monitor clusters with the Health Service - Azure Local,Monitor Azure Local clusters using Health Service,Learn more about how to use the Health Service to monitor clusters,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 The Health Service, first released in Windows Server 2016, improves the day-to-day monitoring and operational experience for clusters running Storage Spaces Direct.",2026-02-13T18:03:00.000Z,how-to,configuration,0.6,True,"Describes using Health Service for Storage Spaces Direct clusters; includes cmdlets, health policies, and configuration options specific to this service.",new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-settings?view=azloc-2604,Modify Health Service settings,Modify Health Service settings - Azure Local,Modify Azure Local Health Service settings and behavior,Learn more about: Health Service settings,"Applies to: Azure Local 2311.2 and 
later; Windows Server 2022, Windows Server 2019, Windows Server 2016 The Health Service, first released in Windows Server 2016, improves the day-to-day monitoring and operational experience for clusters running Storage Spaces Direct. Many of the parameters which govern the behavior of the Health Service are exposed as settings. You can modify these to tune the aggressiveness of faults or actions, turn certain behaviors on/off, and more. For a detailed overview ",2025-12-23T23:03:00.000Z,how-to,configuration,0.7,True,"Covers Health Service parameters and settings; includes setting names, allowed values, and their effects, matching configuration criteria.",new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/kerberos-with-spn?view=azloc-2604,Authenticate with Kerberos,Use Kerberos authentication with Service Principal Name (SPN) - Azure Local,Use Kerberos SPN authentication with Network Controller,Learn how to use Kerberos authentication with SPN.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019 This article describes how to use Kerberos authentication with Service Principal Name (SPN). Network Controller supports multiple authentication methods for communication with management clients. You can use Kerberos-based authentication or X509 certificate-based authentication. You also have the option to use no authentication for test deployments. System Center Virtual Machine Manager uses Kerberos-based authentic",2025-12-22T23:06:00.000Z,how-to,security,0.75,True,"Explains using Kerberos with SPN for Network Controller management clients and SCVMM, including supported authentication modes. 
Product-specific security/auth configuration.",new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/load-balance-multiple-networks?view=azloc-2604,Load balance multiple logical networks,Load balance multiple logical networks for Azure Local - Azure Local,Load balance multiple logical networks in SDN,Learn how to load balance multiple logical networks in Software Defined Networking (SDN) managed by on-premises tools for Azure Local.,"Applies to: Hyperconverged deployments of Azure Local This article provides guidance on how to load balance multiple logical networks in Software Defined Networking (SDN) managed by on-premises tools for Azure Local. By using multiple logical networks for load balancing, you have more control over isolating workloads from each other. For information about how to create and manage logical networks, see Manage tenant logical networks.",2026-04-22T22:07:00.000Z,how-to,architecture-patterns,0.6,True,Guidance on using multiple logical networks for load balancing and workload isolation. Product-specific pattern for structuring SDN logical networks.,new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/load-balancers?view=azloc-2604,Manage software load balancers,Manage Software Load Balancer for SDN managed by on-premises tools - Azure Local,Configure and manage Software Load Balancer policies,Learn how to manage Software Load Balancer for SDN managed by on-premises tools,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 In this article, learn how to manage Software Load Balancer (SLB) policies using Windows Admin Center after you deploy Software Defined Networking (SDN). SLBs are used to evenly distribute network traffic among multiple resources. SLB enables multiple machines to host the same workload, providing high availability and scalability. 
You can create load balancers for your workloads hosted on trad",2026-04-22T22:07:00.000Z,how-to,configuration,0.7,True,"Explains how to manage SLB policies for SDN workloads via Windows Admin Center, including scenarios for traditional and virtual networks. Product-specific SLB configuration.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-arc-virtual-machine-resources?view=azloc-2604,Manage Azure Local VM resources,Manage resources for Azure Local VMs enabled by Azure Arc - Azure Local,Manage disks and NIC resources for Azure Local VMs,Learn how to manage resources like data disks and network interfaces on an Azure Local VM enabled by Azure Arc.,"Applies to: Hyperconverged deployments of Azure Local After you deploy Azure Local virtual machines (VMs) enabled by Azure Arc, you need to add or delete resources like data disks and network interfaces. +This article describes how to manage these VM resources for an Azure Local VM running on your Azure Local instance. Add or delete resources using the Azure portal. To add a data disk, use the Azure CLI.",2026-04-22T22:07:00.000Z,how-to,configuration,0.6,True,Describes adding/removing data disks and NICs; involves VM resource configuration specific to Azure Local and Azure Arc.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-arc-virtual-machines?view=azloc-2604,Manage Azure Local VMs,"Manage including restart, start, stop or delete Azure Local VMs enabled by Azure Arc - Azure Local",,"Learn how to manage Azure Local VMs enabled by Azure Arc. This includes operations such as start, stop, restart, view properties of Azure Local VMs.","Applies to: Hyperconverged deployments of Azure Local This article describes how to manage Azure Local virtual machines (VMs) enabled by Azure Arc. 
The procedures to enable guest management, start, stop, restart, pause, save, or delete an Azure Local VM are detailed.",2026-04-22T22:07:00.000Z,how-to,,0.5,False,"Manage lifecycle of Arc-enabled VMs (start/stop/etc.); operational tutorial without structured best-practices, limits, or config tables.",new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-at-scale-dashboard?view=azloc-2604,Use dashboard to manage at-scale,Monitor at scale using the Azure Local overview and All systems page - Azure Local,,Learn to monitor your Azure Local using dashboards in Azure portal. You can view the status of Azure Local as charts or lists.,Applies to: Hyperconverged deployments of Azure Local This article details how to manage at-scale your Azure Local via the dashboard in the Azure portal. You can view the status of the systems as charts or lists.,2026-04-22T22:07:00.000Z,how-to,,0.3,False,"Dashboard usage/how-to for monitoring at scale; lacks numeric limits, config matrices, or deep troubleshooting content.",new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-bitlocker?view=azloc-2604,Manage BitLocker encryption,"Manage BitLocker encryption on Azure Local, version 23H2 - Azure Local",Manage BitLocker encryption on Azure Local 23H2,"Learn how to manage BitLocker encryption on your Azure Local, version 23H2 system.","Applies to: Hyperconverged deployments of Azure Local This article describes how to view and enable BitLocker encryption, and retrieve BitLocker recovery keys on your Azure Local instance.",2026-04-22T22:07:00.000Z,how-to,security,0.8,True,Provides procedures to enable/view BitLocker and retrieve recovery keys on Azure Local—product-specific security configuration.,new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-data-disks?view=azloc-2604,Download managed disks from Azure,Download Azure managed disk to Azure Local - Azure Local,,Learn how to download Azure managed disk to Azure Local.,"Applies to: 
Hyperconverged deployments of Azure Local This article describes how to download an Azure managed disk from Azure to your Azure Local instance. You can then use the disk to create an image or to attach it to your Azure Local virtual machines (VMs) enabled by Arc, as needed.",2026-04-22T22:07:00.000Z,how-to,,0.45,False,How-to download a managed disk to Azure Local; likely a linear procedure without detailed config tables or limits.,new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-default-network-access-policies-virtual-machines-23h2?view=azloc-2604,Manage default network access policies,"Enable and assign default network access policies on Azure Local, version 23H2 VMs - Azure Local",Enable default network access policies for Azure Local VMs,"Learn how to enable and assign default network access policies on VMs running on Azure Local, version 23H2 via the Windows Admin Center.",Applies to: Hyperconverged deployments of Azure Local Applies to: Windows Server 2025 This article describes how to enable default network access policies and assign them to virtual machines (VMs). Default network policies can be used to protect virtual machines from external unauthorized attacks. These policies block all inbound access to virtual machines (except the specified management ports you want enabled) while allowing all outbound access. Use these policies to ensure that your w,2026-04-22T22:07:00.000Z,how-to,security,0.75,True,Security-focused configuration of default network policies that block inbound traffic except specified management ports. 
Product-specific security behavior and configuration steps.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-logical-networks?view=azloc-2604,Manage logical networks,Manage logical networks for Azure Local VMs enabled by Azure Arc - Azure Local,Manage logical network configuration for Azure Local VMs,Learn how to manage logical networks for Azure Local VMs enabled by Azure Arc.,"Applies to: Hyperconverged deployments of Azure Local To deploy Azure Local virtual machines (VMs) enabled by Azure Arc, you need to create logical networks. Once these networks are provisioned, you may need to manage them. This article describes how to manage these logical networks for Azure Local VMs deployed on your Local instance.",2026-04-22T22:07:00.000Z,how-to,configuration,0.65,True,"Focuses on managing existing logical networks; involves editing network settings and properties, which are product-specific configuration details.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-network-security-groups?view=azloc-2604,Manage Network Security Groups,Manage network security groups and network security rules on Azure Local VMs - Azure Local,Manage NSGs and security rules for Azure Local Arc-enabled VMs,Learn how to manage network security groups and network security rules for Azure Local virtual machines.,"This article describes how to manage network security groups (NSGs) on your Azure Local virtual machines (VMs) enabled by Azure Arc. Once you create network security groups on your Azure Local VMs, you can then list, show details, associate, dissociate, update, and delete these resources. Note The only VMs that are in scope for using NSGs with this feature are Azure Local VMs. These are VMs that were deployed from Azure client interfaces (Azure CLI, Azure portal, Azure Resource Manager). 
Do not ",2026-04-22T22:07:00.000Z,how-to,security,0.7,True,Describes managing NSGs for Azure Local VMs with scope constraints; product-specific security management behavior and commands.,new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-sdn-multisite?view=azloc-2604,Manage SDN Multisite,Manage SDN Multisite for Azure Local and Windows Server - Azure Local,Deploy and manage SDN Multisite for Azure Local,Learn how to manage a multisite SDN solution for Azure Local and Windows Server.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to deploy and manage the Software Defined Networking (SDN) Multisite solution for Azure Local using Windows Admin Center. Applies to: Windows Server 2025 This article describes how to deploy and manage the Software Defined Networking (SDN) Multisite solution for Windows Server using Windows Admin Center. For an overview of SDN Multisite, its current capabilities and limitations, see Overview of SDN Multisite. For an",2026-04-22T22:07:00.000Z,how-to,architecture-patterns,0.65,True,"Covers deployment and management of SDN Multisite, a specific topology pattern with capabilities and limitations. 
Product-specific multi-site networking design and operations.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-secrets-rotation?view=azloc-2604,Manage secrets rotation,"Change deployment user password on Azure Local, version 23H2 - Azure Local",Rotate deployment user password on Azure Local 23H2,"This article describes how to manage internal secret rotation on Azure Local, version 23H2.",Applies to: Hyperconverged deployments of Azure Local This article describes how you can change the password associated with the deployment user on Azure Local.,2026-04-22T22:07:00.000Z,how-to,security,0.7,True,"Describes internal secret rotation for the deployment user, including product-specific security operations.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-secure-baseline?view=azloc-2604,Manage security defaults,"Manage security defaults on Azure Local, version 23H2 - Azure Local",Manage default security settings on Azure Local 23H2,"Learn how to manage security default settings available for Azure Local, version 23H2.",Applies to: Hyperconverged deployments of Azure Local This article describes how to manage default security settings for your Azure Local instance. 
You can also modify drift control and protected security settings defined during deployment so your device starts in a known good state.,2026-04-22T22:07:00.000Z,how-to,security,0.8,True,"Covers default security settings, drift control, and protected settings for Azure Local with concrete configuration behavior.",new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-secure-boot-updates?view=azloc-2604,Manage Secure Boot updates,Manage Secure Boot Updates - Azure Local,Manage Secure Boot certificate updates on Azure Local,"Manage Secure Boot updates, including 2023 certificate rollout and CVE-2023-24932 mitigations, for Azure Local clusters.","This article describes how Azure Local manages the transition from the 2011 Secure Boot certificates, which expire in June 2026, to the 2023 Secure Boot certificates, including how it mitigates CVE-2023-24932 and why the changes are delivered through a phased rollout. It also covers how Azure Local orchestrates Secure Boot updates alongside OEM and hardware updates including Solution Builder Extension (SBE) packages and provides guidance for monitoring and validating each stage of the update proce",2026-04-22T22:07:00.000Z,how-to,security,0.85,True,"Details how Azure Local transitions Secure Boot certificates, mitigates CVE-2023-24932, and orchestrates updates with OEM packages—highly product-specific security operations.",new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-security-post-upgrade?view=azloc-2604,Manage security post upgrade,"Manage security after upgrading your Azure Local from an Azure Stack HCI, version 22H2. 
- Azure Local",Manage security after upgrading Azure Local from 22H2,"Learn how to manage security posture after you upgrade Azure Local from an Azure Stack HCI, version 22H2.","Applies to: Hyperconverged deployments of Azure Local This article describes how to manage security settings on an Azure Local that was upgraded from an Azure Stack HCI, version 22H2.",2026-04-22T22:07:00.000Z,how-to,security,0.7,True,"Covers managing security settings on upgraded systems, including migration-specific security considerations and configurations.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-security-with-defender-for-cloud?view=azloc-2604,Manage with Microsoft Defender for Cloud,Manage system security with Microsoft Defender for Cloud (preview) - Azure Local,Secure Azure Local with Microsoft Defender for Cloud,This article describes how to use Microsoft Defender for Cloud to secure Azure Local (preview).,"Applies to: Azure Local 2311.2 and later This article discusses how to use Microsoft Defender for Cloud to protect Azure Local from various cyber threats and vulnerabilities. Defender for Cloud helps improve the security posture of Azure Local, and can protect against existing and evolving threats. For more information about Microsoft Defender for Cloud, seeMicrosoft Defender for Cloud documentation. Important This feature is currently in PREVIEW. 
+See the Supplemental Terms of Use for Microsoft A",2026-04-22T22:07:00.000Z,how-to,security,0.75,True,"Shows how to integrate and configure Defender for Cloud for Azure Local, including product-specific security posture management.",new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-syslog-forwarding?view=azloc-2604,Manage syslog forwarding,Manage syslog forwarding for Azure Local - Azure Local,Configure syslog forwarding from Azure Local to SIEM,Learn how to configure syslog forwarding for Azure Local security information and event management (SIEM).,"Applies to: Hyperconverged deployments of Azure Local This article describes how to configure security events to be forwarded to a customer-managed security information and event management (SIEM) system using syslog protocol for Azure Local. Use syslog forwarding to integrate with security monitoring solutions and to retrieve relevant security event logs to store them for retention on your own SIEM platform. For more information about security features in this release, see Security features for ",2026-04-22T22:07:00.000Z,how-to,security,0.8,True,Explains how to forward security events via syslog to a SIEM with Azure Local-specific configuration steps and event handling.,new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-thin-provisioning-23h2?view=azloc-2604,Storage thin provisioning,"Storage thin provisioning in Azure Local, version 23H2 - Azure Local",Configure storage thin provisioning on Azure Local 23H2 with PowerShell,"How to use storage thin provisioning on Azure Local, version 23H2 by using Windows PowerShell.","Applies to: Hyperconverged deployments of Azure Local This article describes how thin provisioning works on your Azure Local instance. Traditionally, volumes are fixed provisioned, meaning that all storage is allocated from the storage pool when a volume is created. Despite the volume being empty, a portion of the storage pool's resources is depleted. 
Other volumes can't make use of this storage, which impacts storage efficiency and requires more maintenance.",2026-04-22T22:07:00.000Z,how-to,configuration,0.65,True,Explains how thin provisioning works and how to configure it via PowerShell; likely includes cmdlet parameters and volume settings specific to Azure Local.,new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-wdac?view=azloc-2604,Manage Application Control,"Manage Application Control for Azure Local, version 23H2 - Azure Local",Configure Application Control on Azure Local 23H2,"This article describes how to use Application Control on Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local This article describes how to use Application Control to reduce the attack surface of Azure Local. For more information, see Manage baseline security settings on Azure Local.",2026-04-22T22:07:00.000Z,how-to,security,0.75,True,Describes using Application Control to reduce attack surface with Azure Local-specific configuration steps and policies.,new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-cluster-with-metrics?view=azloc-2604,Metrics,Monitor Azure Local with Azure Monitor Metrics - Azure Local,Monitor Azure Local with Azure Monitor Metrics and performance dashboards,Learn how to monitor Azure Local with Azure Monitor Metrics.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to monitor your Azure Local system with Azure Monitor Metrics. It also describes the Performance Metrics dashboard and lists metrics collected for compute, storage, and network resources in Azure Local. When you have critical applications and business processes that rely on Azure resources, it's important to monitor those resources for their availability, performance, and operation. 
The integration of Azure Monitor M",2026-04-22T22:07:00.000Z,how-to,configuration,0.7,True,"Describes metrics collected for compute, storage, and network and the Performance Metrics dashboard; includes metric names and resource-specific monitoring configuration.",new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-features?view=azloc-2604,Monitor feature workbooks,Monitor Azure Local features with Insights - Azure Local,Monitor Azure Local features like ReFS deduplication using Insights,Monitor Azure Local features with Insights.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to use Insights to monitor key Azure Local features, such as Resilient File System (ReFS) deduplication and compression. To monitor Azure Local systems with Insights, see Monitor a single Azure Local system with Insights and Monitor multiple Azure Local systems with Insights.",2026-04-22T22:07:00.000Z,how-to,configuration,0.6,True,Describes monitoring specific features via Insights; likely includes feature-specific metrics and workbook configuration details.,new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-multi-23h2?view=azloc-2604,Monitor multiple systems,"Monitor multiple Azure Local, version 23H2 systems with Insights - Azure Local",Configure Insights to monitor multiple Azure Local systems,"How to use Insights to monitor the health, performance, and usage of multiple Azure Local, version 23H2 systems.","Applies to: Hyperconverged deployments of Azure Local This article explains how to use Insights to monitor multiple Azure Local systems. For a single Azure Local system, see Monitor a single Azure Local system with Insights. For information about the benefits, prerequisites, and how to enable Insights on each Azure Local system, see Benefits, Prerequisites, and Enable Insights. To monitor multiple Azure Local systems with Insights, you need to enable Insights on each system individually. 
Instead, you",2026-04-22T22:07:00.000Z,how-to,configuration,0.65,True,Explains monitoring multiple systems with Insights; includes configuration patterns for multi-cluster monitoring and possibly workspace and policy settings.,new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-multi-azure-policies?view=azloc-2604,Enable Insights at scale using Azure policies,Enable Insights for Azure Local at scale using Azure policies - Azure Local,Enable Azure Local Insights at scale using Azure Policy,How to enable Insights for Azure Local systems at scale using Azure policies.,"Applies to: Azure Local 2311.2 and later This document describes how to enable Insights for Azure Local systems at scale using Azure policies. To enable Insights for a single Azure Local system, see Monitor a single Azure Local system with Insights. For an overview of Azure Policy, see What is Azure Policy?",2026-04-22T22:07:00.000Z,how-to,configuration,0.7,True,"Shows how to use Azure Policy to enable Insights at scale; includes policy definitions, parameters, and assignment scopes specific to Azure Local monitoring.",new
+https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-single-23h2?view=azloc-2604,Monitor a single system,"Monitor a single Azure Local, version 23H2 system with Insights - Azure Local",Configure Insights to monitor a single Azure Local 23H2 system,"Enable logging and monitoring capabilities to monitor a single Azure Local, version 23H2 system using Insights.","Applies to: Hyperconverged deployments of Azure Local This article describes how to use Insights to monitor a single Azure Local system. For multiple Azure Local systems, see Monitor multiple Azure Local systems with Insights. Insights is a feature of Azure Monitor that quickly gets you started monitoring your Azure Local system. You can view key metrics, health, and usage information regarding cluster, nodes, virtual machines, and storage. 
Take a few moments to watch the video walkthrough on Ins",2026-04-22T22:07:00.000Z,how-to,configuration,0.65,True,"Covers enabling logging and monitoring via Insights for a single system; likely includes workspace, data collection, and configuration settings specific to Azure Local.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/nc-security?view=azloc-2604,Secure the Network Controller,Network Controller Security - Azure Local,Configure Network Controller security channels and protocols,Learn how to configure security for all communication between Network Controller and other software and devices.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This article describes how to configure security for all communication between Network Controller and other software and devices. The communication paths that you can secure include Northbound communication on the management plane, cluster communication between Network Controller virtual machines (VMs) in a cluster, and Southbound communication on the data plane. Northbound Communication. Networ",2025-12-22T23:06:00.000Z,how-to,security,0.75,True,"Describes securing northbound, cluster, and southbound communication paths for Network Controller, including supported authentication and encryption options. 
Product-specific security configuration.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/refs-deduplication-and-compression?view=azloc-2604,Use ReFS deduplication and compression,Optimize storage with ReFS deduplication and compression in Azure Local - Azure Local,Enable and tune ReFS deduplication and compression on Azure Local,Learn how to use ReFS deduplication and compression in Azure Local to optimize storage.,Applies to: Hyperconverged deployments of Azure Local This article describes the Resilient File System (ReFS) deduplication and compression feature and how to use this feature in Azure Local to optimize storage.,2026-04-22T22:07:00.000Z,how-to,configuration,0.65,True,Describes using ReFS deduplication and compression in Azure Local; includes product-specific configuration steps and options.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/remediate-support-tool-infrastructure?view=azloc-2604,Use the Support tool for infrastructure issues,Remediation Support Tool for Azure Local infrastructure component issues - Azure Local,Use Remediation Support Tool to fix Azure Local infrastructure issues,Learn how to run commands in the Support.AksArc PowerShell module to remediate issues in Azure Local infrastructure components.,The Support tool (also known as AKS Arc Support Tool) is a PowerShell module that provides diagnostic and remediation capabilities for the infrastructure components used by Azure Local VMs and Azure Kubernetes Service (AKS) enabled by Azure Arc on Azure Local.,2026-01-09T23:02:00.000Z,troubleshooting,troubleshooting,0.7,True,Describes Support.AksArc PowerShell module for diagnostics and remediation; includes specific commands and remediation flows for infrastructure components.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/remote-support-arc-extension?view=azloc-2604,Remote support extension,Azure Local Remote Support Arc extension and remote support overview - Azure Local,Use Azure Local 
Remote Support Arc extension for assisted diagnostics,This article describes the Remote Support Arc extension and remote support in Azure Local.,"Applies to: Hyperconverged deployments of Azure Local This article gives an overview of the Remote Support Arc extension and remote support in Azure Local. Learn about the benefits, scenarios, and commands that Microsoft support uses during a remote support session.",2025-12-23T23:03:00.000Z,overview,integrations,0.6,True,"Describes Remote Support Arc extension, scenarios, and commands used by Microsoft support; includes product-specific command patterns and integration behavior.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/repair-server-disaggregated?view=azloc-2604,Repair a node,Repair a Node on Azure Local for Disaggregated Deployments - Azure Local,,Learn how to repair a node on your Azure Local disaggregated deployments.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to repair a node on your Azure Local instance. In this article, each server is referred to as a node.",2026-04-22T22:28:00.000Z,how-to,,0.3,False,"Repairing a node is primarily procedural operations content; summary doesn’t indicate detailed limits, config tables, or error-code-based troubleshooting.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/repair-server?view=azloc-2604,Repair a node,"Repair a node on Azure Local, version 23H2 - Azure Local",Repair nodes in Azure Local 23H2 clusters,"Learn how to repair a node on your Azure Local, version 23H2 system.","Applies to: Hyperconverged deployments of Azure Local This article describes how to repair a node on your Azure Local instance. 
In this article, each server is referred to as a node.",2026-04-22T22:07:00.000Z,how-to,deployment,0.7,True,"Provides procedures to repair a node in Azure Local, including cluster-specific operational steps and constraints.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/replace-network-adapter-to-network-intents?view=azloc-2604,Replace a failed NIC in a Network ATC intent,Replace a failed NIC in an existing Network ATC intent on Azure Local - Azure Local,Replace failed NICs in Network ATC intents on Azure Local,Learn how to replace a failed physical network adapter that's part of an existing Network ATC intent on an Azure Local instance.,"Applies to: Hyperconverged deployments of Azure Local This article shows you how to replace a failed physical network adapter (NIC) that belongs to an existing Network ATC intent on an Azure Local instance, without rebuilding the node or recreating the intent. For the related expansion workflow, see Add NICs to an existing Network ATC intent.",2026-04-23T17:08:00.000Z,how-to,configuration,0.7,True,"Covers repair workflow for NICs within Network ATC intents without rebuilding nodes; involves detailed, product-specific network configuration steps.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-log-collection?view=azloc-2604,Collect SDN logs,Collect logs for Software Defined Networking managed by on-premises tools on Azure Local - Azure Local,Collect SDN logs for Azure Local troubleshooting,Learn how to collect logs to troubleshoot Software Defined Networking (SDN) managed by on-premises tools in Azure Local.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019 This article describes how to collect logs for Software Defined Networking (SDN) managed by on-premises tools on Azure Local. The SDN logs help you identify and troubleshoot advanced issues in your SDN environment. Use these logs to gather key information before you contact Microsoft support. 
Use SDN logs to also test a recently deployed SDN environment or retest an existing SDN deployment. In addition to log coll",2025-12-22T23:06:00.000Z,how-to,troubleshooting,0.7,True,Describes specific SDN log types and collection methods to diagnose advanced issues and support cases. Product-specific diagnostic data collection.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-manage-certs?view=azloc-2604,Manage certificates for SDN,Manage certificates for Software Defined Networking (SDN) managed by on-premises tools - Azure Local,Manage certificates for Azure Local SDN communications,Learn how to manage certificates for Network Controller Northbound and Southbound communications when you deploy Software Defined Networking (SDN) managed by on-premises tools.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This article describes how to manage certificates for Network Controller Northbound and Southbound communications when you deploy Software Defined Networking (SDN) and when you use System Center Virtual Machine Manager (SCVMM) as your SDN management client. Note For overview information about Network Controller, see Network Controller. If you aren't using Kerberos for securing the Network Contr",2025-12-22T23:06:00.000Z,how-to,security,0.7,True,Details managing certificates for Network Controller northbound/southbound communication and SCVMM integration. Product-specific certificate roles and security configuration.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-technical-reference?view=azloc-2604,SDN technical reference,Technical reference for Software Defined Networking (SDN) managed by on-premises tools in Azure Local. 
- Azure Local,,This technical reference serves as a one-stop-shop to access learning resources available for SDN managed by on-premises tools.,"Applies to: Azure Stack HCI, version 22H2 This technical reference provides comprehensive learning resources to enhance your understanding of Software Defined Networking (SDN) in Azure Local. We continuously update these resources based on the latest information available and user feedback. Be sure to check back here often for fresh content.",2026-04-22T22:07:00.000Z,overview,,0.2,False,"High-level technical reference hub linking to other SDN resources; does not itself contain detailed configuration, limits, or troubleshooting content.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-troubleshooting?view=azloc-2604,Troubleshoot SDN,Troubleshoot Azure Local SDN enabled by Azure Arc - Azure Local,Troubleshoot Azure Local SDN deployment and connectivity issues,"Troubleshoot and resolve common Azure Local SDN network controller deployment errors, VM connectivity issues, and NSG configuration problems. Learn how to fix DNS, downtime, and network policy errors.","This article provides troubleshooting steps for common issues encountered when you deploy and manage Software-Defined Networking (SDN) enabled by Azure Arc on your Azure Local VMs. 
The article covers errors during the action plan deployment, VM connectivity issues, and network security group (NSG) configurations.",2026-04-22T22:07:00.000Z,concept-article,troubleshooting,0.85,True,"Explicit troubleshooting article with symptom-based guidance for SDN controller deployment, VM connectivity, NSG, DNS, and policy errors; includes product-specific error patterns and fixes.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/set-up-recommended-alert-rules?view=azloc-2604,Recommended alert rules,Enable recommended alert rules for Azure Local - Azure Local,Enable recommended metric alert rules for Azure Local,How to enable recommended alert rules for Azure Local.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to enable recommended alert rules for Azure Local. A metric alert rule monitors a resource by evaluating conditions on the resource metrics at regular intervals. If the conditions are met, an alert is fired. Recommended alerts are predefined metric-based alerts for your Azure Local system resource. These alerts provide you with initial monitoring for a common set of metrics including CPU percentage and available mem",2026-04-22T22:07:00.000Z,how-to,configuration,0.65,True,Covers predefined recommended alerts for metrics like CPU and memory; includes specific alert rule definitions and thresholds tailored to Azure Local.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/setup-metric-alerts?view=azloc-2604,Metric alerts,Set up metric alerts for Azure Local - Azure Local,Configure metric alerts for Azure Local resources,How to set up metric alerts for Azure Local.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to set up metric alerts for Azure Local. 
For information about how to set up log alerts, see Set up log alerts for Azure Local systems.",2026-04-22T22:07:00.000Z,how-to,configuration,0.65,True,"Describes setting up metric alerts; includes metric names, conditions, and alert rule configuration specific to Azure Local.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/setup-system-alerts?view=azloc-2604,Log alerts,Set up log alerts for Azure Local - Azure Local,Set up log alerts for Azure Local using Insights and sample queries,How to set up log alerts for various Azure Local system resources using Insights for Azure Local and sample log queries.,"Applies to: Azure Local 2311.2 and later This article describes how to set up log alerts for Azure Local systems: using Insights for Azure Local and using preexisting sample log queries, such as average node CPU, available memory, available volume capacity, and more. For information about how to set up metric alerts, see Set up metric alerts for Azure Local. Take a few moments to watch the video walkthrough on collecting new logs, customizing the Insights workbooks, and creating alerts using logs",2025-12-23T23:03:00.000Z,how-to,configuration,0.7,True,Provides sample log queries and alert setup steps; includes KQL examples and alert rule parameters specific to Azure Local telemetry.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/support-tools?view=azloc-2604,Use the Diagnostic Support tool,Support Tool for Azure Local Hyperconverged Deployments - Azure Local,Use Azure Local Support Diagnostic Tool for data collection and issue resolution,This article provides guidance on the Support Diagnostic Tool for Azure Local.,"Applies to: Hyperconverged deployments of Azure Local This article provides information to download and use the Azure Local Support Diagnostic Tool. The tool is a set of PowerShell commands to simplify data collection, troubleshooting, and resolution of common issues. This tool isn't a substitute for expert knowledge. 
If you encounter any issues, contact Microsoft Support for assistance.",2026-04-22T22:07:00.000Z,how-to,troubleshooting,0.7,True,Provides PowerShell-based diagnostic tool usage; includes commands and workflows tailored to Azure Local troubleshooting.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/suspend-resume-cluster-maintenance?view=azloc-2604,Suspend and resume,Suspend and resume Azure Local machines for planned maintenance operations - Azure Local,,Learn how to suspend and resume Azure Local machines for planned maintenance operations.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to suspend an Azure Local machine (physical host) for planned maintenance, such as powering off the machine to replace non-hot-pluggable components. It also provides instructions on how to resume the machine once maintenance is complete.",2026-01-23T18:03:00.000Z,how-to,,0.4,False,"Step-by-step maintenance procedure (suspend/resume host) but no detailed config tables, limits, or product-specific troubleshooting mappings.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/tenant-logical-networks?view=azloc-2604,Manage tenant logical networks,Manage tenant logical networks - Azure Local,Manage tenant logical networks in Azure Local SDN,"This topic provides step-by-step instructions on how to use Windows Admin Center to create, update, and delete logical networks after you have deployed Network Controller.","Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This topic provides step-by-step instructions on how to use Windows Admin Center to create, update, and delete logical networks after you have deployed Network Controller. A Software Defined Networking (SDN) logical network is a traditional VLAN-based network. 
By modeling a VLAN-based network as an SDN logical network, you can apply network policies to workloads that are attached to these netw",2026-04-22T22:07:00.000Z,how-to,configuration,0.7,True,"Step-by-step configuration of SDN logical (VLAN-based) networks via Windows Admin Center, including how policies are applied. Product-specific configuration procedures.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/tenant-virtual-networks?view=azloc-2604,Manage tenant virtual networks,Manage tenant virtual networks - Azure Local,Manage tenant virtual networks with Hyper-V Network Virtualization,"This topic provides step-by-step instructions on how to use Windows Admin Center to create, update, and delete Hyper-V Network Virtualization (HNV) virtual networks after you have deployed Software De","Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This topic provides step-by-step instructions on how to use Windows Admin Center to create, update, and delete Hyper-V Network Virtualization (HNV) virtual networks after you have deployed Software Defined Networking (SDN). HNV helps you isolate tenant networks so that each tenant network is a separate entity. Each entity has no cross-connection possibility, unless you either configure public ",2026-04-22T22:07:00.000Z,how-to,configuration,0.7,True,"Detailed instructions for creating/updating/deleting HNV virtual networks and isolation behavior, using Windows Admin Center. 
Product-specific SDN configuration.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-arc-enabled-vms?view=azloc-2604,Troubleshoot,Troubleshoot Azure Local Virtual Machines enabled by Azure Arc - Azure Local,Troubleshoot Azure Local Arc-enabled virtual machines,Learn how to troubleshoot issues you experience with Azure Local Virtual Machines (VMs).,"Applies to: Hyperconverged deployments of Azure Local This article describes how to collect logs and troubleshoot problems with Azure Local Virtual Machines enabled by Azure Arc. It also lists the current limitations and known problems with Azure Local VM management, along with recommended resolutions.",2026-04-22T22:07:00.000Z,how-to,troubleshooting,0.85,True,"Explicit troubleshooting article with log collection steps, known limitations, and likely error/symptom-to-resolution mappings specific to Azure Local Arc-enabled VMs.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-common-sdn-issues?view=azloc-2604,Troubleshoot common SDN issues,Collect traces and logs to troubleshoot common issues in SDN managed by on-premises tools - Azure Local,Collect traces and logs for common SDN issues,Learn how to collect network traces and logs to troubleshoot common issues in Software Defined Networking (SDN) managed by on-premises tools.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019 This article describes what data to collect to troubleshoot common issues in Software Defined Networking (SDN) managed by on-premises tools on Azure Local. Use this information to perform initial troubleshooting before contacting Microsoft Support.",2026-03-27T17:04:00.000Z,how-to,troubleshooting,0.7,True,Guides what traces/logs to collect for common SDN problems before contacting support. 
Product-specific troubleshooting data guidance.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-deployment-configurator-app?view=azloc-2604,Troubleshoot registration issues for Configurator app,Troubleshoot registration issues using Configurator app in Azure Local - Azure Local,Troubleshoot Azure Local registration issues using Configurator app,Learn how to troubleshoot the registration failures for Azure Local when using the Configurator app.,Applies to: Azure Local 2508 and later This article provides guidance on how to collect logs and troubleshoot issues experienced during the registration of Azure Local via the Configurator app.,2025-12-23T23:03:00.000Z,how-to,troubleshooting,0.75,True,Explicit troubleshooting guide for registration failures; likely organized by symptoms and includes log collection steps and resolution paths.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-deployment?view=azloc-2604,Troubleshoot deployment validation issues,"Troubleshoot deployment validation issues in Azure Local, version 23H2 via Azure portal - Azure Local",Troubleshoot Azure Local 23H2 deployment validation failures via Azure portal,"Learn how to troubleshoot the deployment validation failures for Azure Local, version 23H2 when deployed via the Azure portal.",Applies to: Azure Local 2405 and later This article provides guidance on how to troubleshoot deployment validation issues experienced during the deployment of Azure Local via the Azure portal.,2026-01-28T18:04:00.000Z,how-to,troubleshooting,0.75,True,"Focused on diagnosing deployment validation issues; likely includes specific error messages, causes, and remediation steps.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-sdn-deployment?view=azloc-2604,Troubleshoot SDN deployment,"Troubleshoot deployment of Software Defined Networking managed by on-premises tools in Azure Local, version 23H2 via Windows Admin Center - Azure 
Local",Troubleshoot SDN deployment via Windows Admin Center,"Learn how to troubleshoot the deployment of Software Defined Networking (SDN) managed by on-premises tools in Azure Local, version 23H2 via Windows Admin Center.","Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019 This article provides guidance on how to troubleshoot issues that you may encounter while deploying the Software Defined Networking (SDN) components using Windows Admin Center. Use this guidance to troubleshoot the issues before creating a support ticket. This article also provides instructions on how to collect logs after successfully troubleshooting the issues to help diagnose the cause of deployment failure.",2025-12-22T23:06:00.000Z,how-to,troubleshooting,0.8,True,"Focused on diagnosing and resolving SDN deployment issues, including guidance on symptoms, causes, and log collection. Product-specific troubleshooting workflow.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-software-load-balancer?view=azloc-2604,Troubleshoot Software Load Balancer for SDN,Troubleshoot Software Load Balancer (SLB) for SDN managed by on-premises tools in Azure Local and Windows Server - Azure Local,Troubleshoot Software Load Balancer data path issues,Learn how to troubleshoot Software Load Balancer for SDN managed by on-premises tools in Azure Local and Windows Server.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019 If you've set up Software Load Balancer (SLB) for Software Defined Networking (SDN) and your data path isn't working through SLB, there could be several reasons behind it. This article helps you identify and troubleshoot some common issues in SLB for SDN managed by on-premises tools. 
For an overview of SLB and how to manage it, see What is Software Load Balancer (SLB) for SDN? and Manage Software Load Balancer for SD",2025-12-22T23:06:00.000Z,how-to,troubleshooting,0.75,True,Helps diagnose SLB data path failures with common causes and resolutions for Azure Local/Windows Server SDN. Product-specific troubleshooting mappings.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/trusted-launch-automatic-state-transfer?view=azloc-2604,Automatic virtual TPM state transfer,Automatic virtual TPM state transfer for Azure Local - Azure Local,Understand automatic vTPM state transfer for Azure Local,Learn how automatic virtual TPM state transfer works for Azure Local.,"Applies to: Hyperconverged deployments of Azure Local This article uses an example to illustrate the automatic transfer of virtual TPM (vTPM) state for Trusted launch for Azure Local VM, even as the VM migrates or fails over to another machine in the system. This operation allows the applications that use the vTPM to function normally during VM migration or failover.",2026-04-22T22:07:00.000Z,how-to,architecture-patterns,0.65,True,"Describes how vTPM state is handled during VM migration/failover in Azure Local Trusted launch; this is a product-specific resiliency pattern for secure workloads, not generic virtualization behavior.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/trusted-launch-guest-attestation?view=azloc-2604,Enable guest attestation,Guest attestation for Trusted launch for Azure Local VMs (preview) - Azure Local,Enable guest attestation for Azure Local Trusted launch VMs,Learn how guest attestation works for Trusted launch for Azure Local VMs (preview).,"Applies to: Azure Local 2509 and later This article describes how to enable guest attestation for Trusted launch for Azure Local virtual machines (VMs) enabled by Azure Arc. 
Guest attestation, also called boot integrity verification, is a new feature you can preview starting with Azure Local version 2509. Guest attestation allows you to verify if the VM started in a well-known good state – specifically, verify integrity of the entire boot chain. This helps detect any unexpected changes to the bo",2026-04-22T22:07:00.000Z,how-to,security,0.8,True,Covers enabling boot integrity verification for Trusted launch VMs with version-specific requirements and security configuration steps unique to Azure Local.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/trusted-launch-vm-import-key?view=azloc-2604,Manual backup and recovery,Manual backup and recovery of guest state protection keys for Trusted launch Azure Local VMs enabled by Azure Arc - Azure Local,Back up and restore Trusted launch guest state keys,Learn how to perform a manual backup and recovery of guest state protection keys for Trusted launch Azure Local VMs enabled by Azure Arc.,Applies to: Hyperconverged deployments of Azure Local This article describes how to manually back up and restore guest state protection keys for Trusted launch Azure Local virtual machines (VMs) enabled by Azure Arc. For Azure Local release 2505 and later: Back up and restore Azure Local VM guest state protection keys to and from a file system folder. 
For Azure Local releases prior to 2505: Back up and restore Azure Local VM guest state protection keys to and from a key vault in another Azure Lo,2026-04-22T22:07:00.000Z,how-to,security,0.85,True,"Provides concrete procedures and storage options (file system vs key vault) for guest state protection keys, including version-specific behavior—product-specific security key management.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/trusted-launch-vm-overview?view=azloc-2604,What is Trusted launch for Azure Local VMs?,Overview for Trusted launch for Azure Local VMs enabled by Azure Arc - Azure Local,,Learn about Trusted launch for Azure Local VMs enabled by Azure Arc.,Applies to: Hyperconverged deployments of Azure Local This article introduces Trusted launch for Azure Local virtual machines (VMs) enabled by Azure Arc. You can create a Trusted launch for an Azure Local VM using the Azure portal or by using Azure Command-Line Interface (CLI).,2026-04-22T22:07:00.000Z,concept-article,,0.4,False,Overview of Trusted launch; primarily conceptual and feature-intro content without detailed configuration parameter tables or limits.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/unregister-register-machine?view=azloc-2604,Unregister and reregister machines,Unregister and re-register Azure Local machines - Azure Local,Unregister and re-register Azure Local machines via PowerShell,Learn how to unregister and re-register Azure Local machines without having to install the operating system again.,Applies to: Azure Local 2508 and later This article provides guidance on how to unregister and re-register Azure Local machines without having to install the operating system (OS) again. This method uses PowerShell cmdlets and applies to registration with and without Azure Arc gateway. 
Important This guidance applies only to Azure Local clusters that haven't been deployed yet.,2026-03-03T23:11:00.000Z,how-to,configuration,0.65,True,Provides specific PowerShell cmdlets and constraints for re-registration without OS reinstall; product-specific operational configuration.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/update-network-controller-certificates?view=azloc-2604,Update Network Controller certificates,Renew certificates for Network Controller - Azure Local,Renew Network Controller certificates in Azure Local,This article describes how to renew Network Controller certificates.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022 and Windows Server 2019 This article provides instructions on how to renew or change Network Controller certificates, both automatically and manually. If you face any issues in renewing your Network Controller certificates, contact Microsoft Support. In your Software Defined Networking (SDN) managed by on-premises tools infrastructure, the Network Controller uses certificate-based authentication to secure Northbound communication chan",2026-04-22T22:07:00.000Z,how-to,security,0.7,True,"Provides automatic and manual renewal procedures for Network Controller certificates, including channel-specific usage. 
Product-specific security and certificate lifecycle steps.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/update-sdn-infrastructure-certificates?view=azloc-2604,Update SDN infrastructure certificates,Renew certificates for Software Defined Networking managed by on-premises tools infrastructure - Azure Local,Renew SDN infrastructure and SLB MUX certificates,This article describes how to renew or change SDN server and Software Load Balancer multiplexer certificates.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022 and Windows Server 2019 This article provides instructions on how to renew or change Software Defined Networking (SDN) server and Software Load Balancer (SLB) multiplexer (MUX) certificates. If you face any issues in renewing your certificates, contact Microsoft Support. For information about how to renew Network Controller certificates, see Renew Network Controller certificates before they expire. In your SDN infrastructure, the Netwo",2026-04-22T22:07:00.000Z,how-to,security,0.7,True,Explains how to renew/change SDN server and SLB MUX certificates and their relationship to Network Controller. Product-specific certificate management.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/update-sdn?view=azloc-2604,Update SDN infrastructure,Update infrastructure for SDN managed by on-premises tools for Azure Local - Azure Local,Update SDN infrastructure components on Azure Local,Learn to update infrastructure for SDN managed by on-premises tools.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 Software Defined Networking (SDN) infrastructure components include Network Controller and optionally, Software Load Balancers (SLBs), and SDN gateways that run on virtual machines (VMs). When you update each component, you use any of the standard methods for installing Windows updates, and you also use Windows PowerShell. 
You can update the SDN infrastructure in any order, but we recommend th",2026-04-22T22:07:00.000Z,how-to,configuration,0.6,True,"Gives concrete, product-specific steps and ordering guidance for updating Network Controller, SLB, and gateways using Windows Update and PowerShell. Focused on configuration/maintenance of SDN components.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/upgrade-sdn-gateways?view=azloc-2604,Upgrade SDN gateway VMs,Upgrade gateway VMs in SDN managed by on-premises tools - Azure Local,Upgrade SDN gateway VMs with minimal disruption,Learn how to upgrade gateway VMs in SDN managed by on-premises tools.,"Applies to: Hyperconverged deployments of Azure Local running 2311.2 and later; Windows Server 2025, Windows Server 2022 This article explains how to upgrade Software Defined Networking (SDN) gateway virtual machines (VMs) with minimal disruption to network connectivity. The process updates redundant and active gateways in a controlled sequence to keep services available.",2026-04-24T17:07:00.000Z,how-to,deployment,0.65,True,Describes a controlled sequence for upgrading redundant and active SDN gateways to preserve connectivity. This is a deployment/upgrade pattern with product-specific ordering constraints.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/upgrade-sdn?view=azloc-2604,Upgrade SDN infrastructure,Upgrade Infrastructure for Software Defined Networking (SDN) Managed by On-Premises Tools - Azure Local,Upgrade SDN infrastructure managed by on-premises tools,Learn how to upgrade infrastructure for SDN managed by on-premises tools.,"Applies to: Hyperconverged deployments of Azure Local running 2311.2 and later; Windows Server 2025, Windows Server 2022 This article provides guidance on safely and securely upgrading infrastructure for Software Defined Networking (SDN) managed by on-premises tools. It also provides troubleshooting guidance to help remediate issues that might occur during the upgrade process. 
Important Do not use this article for upgrading SDN enabled by Azure Arc on Azure Local.",2026-04-24T17:07:00.000Z,how-to,best-practices,0.65,True,"Provides safe and secure upgrade guidance plus troubleshooting for SDN infrastructure, including do/don’t style recommendations and product-specific gotchas (and explicit exclusion of Arc-enabled SDN).",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/use-datacenter-firewall-powershell?view=azloc-2604,Configure network security groups with PowerShell,Configure network security groups with PowerShell - Azure Local,Configure SDN network security groups with PowerShell,Configure network security groups with PowerShell.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This article provides instructions for configuring network security groups (NSGs) to manage data traffic flow using Datacenter Firewall for Software Defined Networking (SDN) in Azure Local using Windows PowerShell. You enable and configure Datacenter Firewall by creating network security groups that get applied to a subnet or a network interface.
The example scripts in this article use Windows P",2026-04-22T22:07:00.000Z,how-to,security,0.75,True,"Provides PowerShell-based configuration of NSGs and Datacenter Firewall, including specific cmdlets and parameter usage unique to Azure Local SDN.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/use-datacenter-firewall-windows-admin-center?view=azloc-2604,Configure network security groups with Windows Admin Center,Configure network security groups with Windows Admin Center - Azure Local,Configure SDN network security groups with Windows Admin Center,Configure network security groups with Windows Admin Center,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019, Windows Server 2016 This topic provides step-by-step instructions on how to use Windows Admin Center to create and configure network security groups (NSGs) to manage data traffic flow using Datacenter Firewall. It also provides instructions on managing network security groups on Software Defined Network (SDN) virtual and logical networks. You enable and configure Datacenter Firewall by creating network security g",2026-04-22T22:07:00.000Z,how-to,security,0.75,True,"Details configuring NSGs and Datacenter Firewall for SDN logical and virtual networks, including how NSGs are applied. Product-specific security configuration.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/use-environment-checker?view=azloc-2604,Assess environment readiness,"Use Azure Local Environment Checker to assess deployment readiness for Azure Local, version 23H2. 
- Azure Local",Run Azure Local Environment Checker for deployment readiness,"How to use the Environment Checker to assess if your environment is ready for deploying Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local This article describes how to use the Azure Local Environment Checker in a standalone mode to assess how ready your environment is for deploying the Azure Local solution. For a smooth deployment of the Azure Local solution, your IT environment must meet certain requirements for connectivity, hardware, networking, and Active Directory. The Azure Local Environment Checker is a readiness assessment tool that checks these minimum requirements and",2026-04-22T22:07:00.000Z,how-to,configuration,0.7,True,"Describes a product-specific tool with checks for connectivity, hardware, networking, and AD; includes concrete readiness rules and outputs unique to Azure Local.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/use-external-storage-for-containerized-workloads?view=azloc-2604,External Storage support with AKS,Using Custom Storage Classes in AKS to Consume External Storage (SAN) on Azure Local - Azure Local,Use custom AKS storage classes to consume external SAN on Azure Local,Describes how to leverage external storage for containerized workloads on Azure Local.,,2026-04-22T22:28:00.000Z,how-to,integrations,0.7,True,Describes using custom StorageClasses in AKS to consume external SAN; involves Kubernetes storage class parameters and Azure Local-specific integration details.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-azure-marketplace-red-hat?view=azloc-2604,Using Red Hat Enterprise Azure Marketplace image,Prepare Red Hat Enterprise Azure Marketplace Image for Azure Local VM Deployment - Azure Local,,Learn how to prepare and export a Red Hat Enterprise Azure Marketplace VM image for use with Azure Local clusters.,"This article explains how to prepare a Red Hat Enterprise
Linux (RHEL) Azure Marketplace image for use with Azure Local virtual machines (VMs). By following these steps, you ensure your VM has the latest security updates, support, and integration features.",2026-01-16T18:04:00.000Z,how-to,,0.45,False,Prepare RHEL Marketplace image; tutorial-style steps without structured configuration or decision-making content.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-azure-marketplace-ubuntu?view=azloc-2604,Using Ubuntu Azure Marketplace image,Prepare Ubuntu Azure Marketplace Image for Azure Local VM Deployment - Azure Local,,Learn how to prepare and export an Ubuntu Azure Marketplace VM image for use with Azure Local clusters.,"This article explains how to prepare an Ubuntu Azure Marketplace image for use with Azure Local virtual machines (VMs). By following these steps, you ensure your VM has the latest security updates, support, and integration features.",2026-01-12T23:03:00.000Z,how-to,,0.45,False,"Prepare Ubuntu Marketplace image; stepwise preparation and export, not a config/limits/troubleshooting reference.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-azure-compute-gallery?view=azloc-2604,Using Azure Compute Gallery images,Create Azure Local VM from Azure Compute Gallery Images via Azure CLI - Azure Local,,Learn how to create Azure Local VM images using Azure Compute Gallery images.,Applies to: Hyperconverged deployments of Azure Local This article describes how to create Azure Local virtual machines (VMs) enabled by Azure Arc using source images from the Azure Compute Gallery.
You can create VM images on Azure CLI using the instructions in this article and then use these VM images to create Azure Local VMs.,2026-04-22T22:07:00.000Z,how-to,,0.45,False,Tutorial for creating VM images from Azure Compute Gallery; procedural content without expert-only configuration matrices.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-azure-marketplace?view=azloc-2604,Using Azure Marketplace images,Create Azure Local VM from Azure Marketplace Images via Azure CLI - Azure Local,,Learn how to create Azure Local VM images using source images from Azure Marketplace.,"Applies to: Hyperconverged deployments of Azure Local This article explains how to create Windows virtual machine (VM) images for Azure Local by using source images from Azure Marketplace, either through the Azure portal or Azure CLI. To create Linux VM images from Azure Marketplace, choose: Prepare RHEL Azure Marketplace image for Azure Local VM deployment. Prepare Ubuntu Azure Marketplace image for Azure Local VMs.",2026-04-22T22:07:00.000Z,how-to,,0.45,False,"Tutorial for creating VM images from Azure Marketplace; mostly step-by-step usage of portal/CLI, not deep config or limits.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-centos?view=azloc-2604,Using CentOS VM image,Prepare CentOS Linux image via Azure CLI for Azure Local VMs enabled by Azure Arc (preview) - Azure Local,,Learn how to prepare CentOS Linux images to create an Azure Local VM image (preview).,"Caution This article references CentOS, a Linux distribution that's reached end-of-life (EOL). Consider your use of CentOS and plan accordingly. For more information, see CentOS end-of-life guidance.
Applies to: Hyperconverged deployments of Azure Local This article describes how to use Azure CLI to prepare a CentOS Linux image and create an Azure Local virtual machine (VM).",2026-04-22T22:07:00.000Z,how-to,,0.45,False,"Prepare CentOS image; similar to other image-prep tutorials, mostly commands and steps without structured expert reference content.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-existing-arc-vm?view=azloc-2604,Using an existing Azure Local VM,Create Azure Local VM image from an existing Azure Local VM enabled by Azure Arc - Azure Local,,Learn how to create Azure Local VM images using an existing Azure Local VM enabled by Azure Arc via Azure CLI.,Applies to: Hyperconverged deployments of Azure Local 2408.2 and later This article describes how to create Azure Local VM images using Azure Command-Line Interface (CLI) and existing Azure Local VMs. Learn to use the operating system (OS) disk of an Azure Local VM to create a gallery image on your Azure Local.,2026-04-22T22:07:00.000Z,how-to,,0.5,False,"Create image from existing Arc-enabled VM; mostly CLI steps, not a configuration or troubleshooting reference.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-linux-sysprep?view=azloc-2604,Using Ubuntu VM image,Prepare Ubuntu image via Azure CLI for Azure Local VMs enabled by Azure Arc - Azure Local,,Learn how to prepare Ubuntu images to create an Azure Local VM image by using Azure CLI.,Applies to: Hyperconverged deployments of Azure Local This article describes how to use Azure CLI to prepare an Ubuntu image and create an Azure Local virtual machine (VM).,2026-04-22T22:07:00.000Z,how-to,,0.45,False,"Prepare Ubuntu image via CLI; image-prep tutorial with commands, but not organized as config reference or best-practices with quantified impact.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-local-share?view=azloc-2604,Using images in 
local share,Create Azure Local VM from local share images via Azure CLI - Azure Local,,Learn how to create Azure Local VM images using source images from a local share on your system.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to create virtual machine (VM) images for Azure Local using source images from a local share. You can create VM images by using the Azure portal or Azure CLI. Then, use these VM images to create Azure Local VMs.",2026-04-22T22:07:00.000Z,how-to,,0.45,False,Create VM images from local share; procedural guide without detailed parameter tables or constraints.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-red-hat-enterprise?view=azloc-2604,Using Red Hat Enterprise VM image,Prepare Red Hat Enterprise Linux image via Azure CLI for Azure Local VMs enabled by Azure Arc - Azure Local,,Learn how to prepare a Red Hat Enterprise Linux image to create an Azure Local VM image (preview).,Applies to: Hyperconverged deployments of Azure Local This article describes how to use Azure CLI to prepare a Red Hat Enterprise Linux image and create an Azure Local virtual machine (VM).,2026-01-16T18:04:00.000Z,how-to,,0.45,False,"Prepare RHEL image; procedural CLI guide, not focused on limits, security roles, or troubleshooting mappings.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-storage-account?view=azloc-2604,Using images in Azure Storage account,Create Azure Local VMs enabled by Azure Arc in Azure Storage account - Azure Local,,Learn how to create Azure Local VMs enabled by Azure Arc using source images from Azure Storage account via Azure portal and Azure CLI.,Applies to: Hyperconverged deployments of Azure Local This article describes how to create Azure Local virtual machines (VMs) enabled by Azure Arc using source images from the Azure Storage account. 
You can create VM images by using the Azure portal or Azure CLI and then use these VM images to create Azure Local VMs.,2026-04-22T22:07:00.000Z,how-to,,0.45,False,"How-to create VMs from images in a storage account; standard portal/CLI steps, not configuration reference or limits.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-suse?view=azloc-2604,Using SUSE VM image,Prepare SUSE Linux image via Azure CLI for Azure Local VMs enabled by Azure Arc (preview) - Azure Local,,Learn how to prepare SUSE Linux images to create an Azure Local VM image (preview).,Applies to: Hyperconverged deployments of Azure Local This article describes how to use Azure CLI to prepare a SUSE Linux image and create an Azure Local virtual machine (VM).,2026-02-13T18:03:00.000Z,how-to,,0.45,False,"Prepare SUSE image; similar to other Linux image-prep guides, mainly commands and steps.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-manage-extension?view=azloc-2604,Manage VM extensions,Manage VM Extensions on Azure Local VMs enabled by Azure Arc on Azure Local - Azure Local,Configure and manage VM extensions on Azure Local,Learn how to enable guest management and then install and manage extensions on Azure Local VMs via the Azure portal.,"Applies to: Hyperconverged deployments of Azure Local This article describes how to install and manage virtual machine (VM) extensions on Azure Local via the Azure portal. The VM extensions on your Azure Local VMs enabled by Azure Arc are useful for post-deployment configuration, software installation, or other management tasks.
To install VM extensions, you must enable Azure guest management on your Azure Local VMs.",2026-04-22T22:07:00.000Z,how-to,configuration,0.6,True,"Covers enabling guest management and installing extensions; includes extension-related settings and requirements, which are product-specific configuration details.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-manage-image?view=azloc-2604,Manage Azure Local VM Images,Manage Azure Local VM Images via CLI or Portal - Azure Local,,"Learn how to list, view properties, and delete virtual machine images on Azure Local using Azure CLI or Azure portal.","Applies to: Hyperconverged deployments of Azure Local This article describes how to manage virtual machine (VM) images on your Azure Local instance. You can create VM images from various sources like Azure Marketplace, Azure Compute Gallery, Azure Storage accounts, or local shares. After you create images, you can list, view properties, and delete them using Azure CLI or the Azure portal.",2025-12-23T23:03:00.000Z,how-to,,0.5,False,"Manage VM images (list, view, delete); operational how-to without detailed configuration parameter tables or limits.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-operations?view=azloc-2604,Supported operations for VMs,Supported Operations for Azure Local Virtual Machines (VMs) Enabled by Azure Arc - Azure Local,Use supported operations for Azure Local Arc VMs,Learn about the supported virtual machine (VM) operations for Azure Local VMs enabled by Azure Arc.,"Applies to: Azure Local 2504 or later This article discusses the most common operations for Azure Local virtual machines (VMs) enabled by Azure Arc. 
The article identifies the operations that are supported on Azure Local VMs, along with the operations that you need to avoid to prevent complications.",2026-04-22T22:07:00.000Z,concept-article,best-practices,0.65,True,Lists supported and unsupported VM operations and warns about operations to avoid to prevent complications; product-specific DO/DON'T guidance qualifies as best practices.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-activate?view=azloc-2604,Activate Windows Server VMs,Activate Windows Server VMs on Azure Local - Azure Local,Configure Windows Server VM activation on Azure Local,This article explains the benefits of using Automatic Virtual Machine Activation and provides instructions on setting up this optional feature on Azure Local.,"Applies to: Azure Local 2311.1 and later; Windows Server 2025, Windows Server 2022, Windows Server 2019 Datacenter Edition and later Windows Server virtual machines (VMs) must be activated before you can use them on Azure Local. You can use any existing Windows Server activation methods that you already have. Optionally, Azure Local offers an addon subscription and tools to help simplify this process. This article describes Windows Server VM activation concepts and the options that are available",2026-04-22T22:07:00.000Z,how-to,configuration,0.7,True,Explains activation options including Automatic VM Activation and Azure Local add-on; includes product-specific activation settings and steps.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-affinity?view=azloc-2604,Create VM affinity rules,Set up VM affinity rules using Windows PowerShell - Azure Local,Configure VM affinity and anti-affinity rules on Azure Local,Learn how to set up VM affinity rules using Windows PowerShell,"Applies to: Azure Local 2311.2 and later Using either Windows Admin Center or Windows PowerShell, you can easily create affinity and anti-affinity rules for virtual machines (VMs) in your Azure Local instance. 
Note The recommended way to create and manage VMs on Azure Local is using the Azure Arc control plane. However, since the functionality described in this article isn't yet provided by Azure Arc, you can use Windows Admin Center or PowerShell as described in this article. The VMs created thi",2026-04-22T22:07:00.000Z,how-to,configuration,0.7,True,"Describes how to set affinity rules via Admin Center/PowerShell; likely includes specific cmdlets, parameters, and constraints unique to Azure Local clusters.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-load-balancing?view=azloc-2604,VM load balancing,Virtual machine load balancing - Azure Local,Configure virtual machine load balancing on Azure Local,Use this article to learn how to configure the VM load balancing feature in Azure Local and Windows Server.,"Applies to: Azure Local 2311.2 and later; Windows Server 2025, Windows Server 2022, Windows Server 2019, Windows Server 2016 Note The recommended way to create and manage VMs on Azure Local is using the Azure Arc control plane. However, since the functionality described in this article isn't yet provided by Azure Arc, you can use Windows Admin Center or PowerShell as described in this article.
The VMs created this way aren't enabled by Azure Arc, have limited manageability from the Azure Arc cont",2026-04-22T22:07:00.000Z,how-to,configuration,0.7,True,"Product-specific VM load balancing configuration for Azure Local/Windows Server, including settings and behaviors not covered by generic load balancing knowledge.",new +https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-powershell?view=azloc-2604,Manage VMs with PowerShell,Manage VMs using Windows PowerShell on Azure Local - Azure Local,Manage Azure Local virtual machines using PowerShell,How to manage virtual machines on Azure Local using Windows PowerShell,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019 This article describes how to create and manage virtual machines (VMs) on Azure Local using Windows PowerShell. Note The recommended way to create and manage VMs on Azure Local is using the Azure Arc control plane. However, since the functionality described in this article isn't yet provided by Azure Arc, you can use Windows Admin Center or PowerShell as described in this article. The VMs created this way aren't en",2026-04-22T22:07:00.000Z,how-to,configuration,0.75,True,PowerShell-based VM management with Azure Local-specific cmdlets/parameters and behaviors that go beyond generic Hyper-V or Azure VM scripting.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/vm?view=azloc-2604,Manage VMs,Manage VMs with Windows Admin Center on Azure Local - Azure Local,Create and manage Azure Local VMs with Windows Admin Center,Learn how to create and manage virtual machines on Azure Local using Windows Admin Center.,"Applies to: Azure Local 2311.2 and later; Windows Server 2022, Windows Server 2019 Windows Admin Center can be used to create and manage your virtual machines (VMs) on Azure Local. Note The recommended way to create and manage VMs on Azure Local is using the Azure Arc control plane.
However, since the functionality described in this article isn't yet provided by Azure Arc, you can use Windows Admin Center or PowerShell as described in this article. The VMs created this way aren't enabled by Azure",2025-12-23T23:03:00.000Z,how-to,configuration,0.7,True,How-to for VM lifecycle management via Windows Admin Center on Azure Local; likely includes specific UI options and parameters unique to this product combination.,new +https://learn.microsoft.com/en-us/azure/azure-local/manage/windows-server-azure-edition-23h2?view=azloc-2604,Deploy Windows Server Azure Edition VMs,"Deploy Windows Server Azure Edition VMs on Azure Local, version 23H2 - Azure Local",,"Learn how to deploy Windows Server Azure Edition VMs on Azure Local, version 23H2 starting with an image in Azure Local Marketplace or Azure Marketplace.",Applies to: Hyperconverged deployments of Azure Local The Windows Server Azure Edition operating system can be deployed as a guest virtual machine (VM) on Azure Local 2311.2 or later. This article describes how to deploy and hotpatch Windows Server Azure Edition VMs starting with an image in Azure Local marketplace or an image in Azure Marketplace. Note Both Azure Local VMs enabled by Azure Arc and unmanaged VMs are supported. 
Azure Local is the only on-premises platform to run Windows Server Az,2026-04-22T22:07:00.000Z,install-set-up-deploy,,0.5,False,"Deploy Windows Server Azure Edition VMs; deployment tutorial without explicit tier matrices, limits, or config reference tables.",new +https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-azure-migrate?view=azloc-2604,"Migrate, verify",Migrate Hyper V VMs to Azure Local using Azure Migrate (preview) - Azure Local,,Learn about how to migrate Windows and Linux VMs to your Azure Local instance using Azure Migrate (preview).,"Applies to: Azure Local 2503 and later This article describes how to migrate Hyper-V virtual machines (VMs) to Azure Local using Azure Migrate and includes the steps to verify the migration. Important This feature is currently in PREVIEW. +See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-04-22T22:07:00.000Z,how-to,,0.4,False,"End-to-end migration how-to; mostly step-by-step instructions, not focused on limits, configuration matrices, or troubleshooting catalogs.",new +https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-enable-guest-management?view=azloc-2604,Enable guest management,Enable guest management for migrated VMs (preview) - Azure Local,Configure guest management for Azure Local migrated VMs,Learn how to enable guest management for migrated VMs (preview).,"Applies to: Azure Local 2503 and later This article describes how to enable guest management for VMs that have been migrated to Azure Local using Azure Migrate. Enabling guest management allows you to manage in guest OS settings and install and manage Azure extensions on these VMs. This article is only for VMs that have been migrated using Azure Migrate. For more information on how to enable guest management in other scenarios, see Manage Azure Local VMs.
The output properties may vary depending o",2026-04-22T22:07:00.000Z,how-to,configuration,0.65,True,Enabling guest management typically involves specific Azure Local and Azure Migrate settings/flags and possibly extension configuration parameters.,new +https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-faq?view=azloc-2604,FAQ,Azure Migrate FAQ - Azure Local,,This FAQ provides answers to common questions about the migration of a Hyper-V virtual machine (VM) to an Azure Local instance using Azure Migrate.,"The Azure Migrate based solution enables you to migrate VMs from Hyper-V (Preview) and VMware to an Azure Local instance. This FAQ answers questions you might have about the migration of a VM from a Hyper-V or a VMware VM to an Azure Local instance using Azure Migrate. +Tabs have questions about VMware and Hyper-V VMs, VMware VMs only, and Hyper-V VMs only. The migration of Hyper-V VMs is in preview.",2026-03-26T17:03:00Z,faq,,0.35,False,FAQ format; likely high-level Q&A without structured error-code mappings or detailed configuration tables.,new +https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-hyperv-prerequisites?view=azloc-2604,Complete prerequisites,Prerequisites for Hyper-V VM migration to Azure Local using Azure Migrate (preview) - Azure Local,,Learn prerequisites for Hyper-V migration to Azure Local using Azure Migrate (preview).,"Applies to: Azure Local 2503 and later This article describes the prerequisite tasks you need to complete before you begin the process to migrate Hyper-V virtual machines (VMs) to Azure Local. Make sure to review the requirements for migration if you haven't already. Important This feature is currently in PREVIEW.
+See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availabili",2026-03-26T17:03:00.000Z,how-to,,0.4,False,"Prerequisites article is usually task/checklist oriented (enable permissions, create resources) rather than tables of config parameters or limits.",new +https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-hyperv-replicate?view=azloc-2604,"Discover, replicate",Discover and replicate Hyper-V VMs for migration to Azure Local using Azure Migrate (preview) - Azure Local,,Learn the discovery and replication process for Hyper-V VMs to Azure Local using Azure Migrate (preview).,"Applies to: Azure Local 2503 and later This article describes the discovery and replication phase for Hyper-V virtual machine (VM) migration to Azure Local using Azure Migrate. Important This feature is currently in PREVIEW. +See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability. For more information on appliances for Azure Migrate and how to manage them, see Azure",2026-03-26T17:03:00.000Z,how-to,,0.4,False,Describes discovery and replication phase; likely procedural tutorial steps without detailed config tables or error-code mappings.,new +https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-hyperv-requirements?view=azloc-2604,Review requirements,Review requirements for Hyper-V VM migration to Azure Local using Azure Migrate (preview) - Azure Local,System requirements for Hyper-V migration to Azure Local,Learn the system requirements for Hyper-V migration to Azure Local using Azure Migrate (preview).,"Applies to: Azure Local 2503 and later This article lists the system requirements for migrating Hyper-V virtual machines (VMs) to Azure Local using Azure Migrate.
Important This feature is currently in PREVIEW. +See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-03-26T17:03:00.000Z,how-to,limits-quotas,0.7,True,"A ‘system requirements’ article for a specific migration feature typically lists exact supported versions, CPU, RAM, network, and other numeric constraints that function as hard limits.",new +https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-maintain-ip-addresses?view=azloc-2604,Maintain static IP addresses,Maintain static IP addresses during migration - Azure Local,Preserve static IP addresses during Azure Local VM migration,Learn how to maintain static IP addresses for VMs during migration.,Applies to: Azure Local 2503 and later This article explains how to preserve static IP addresses during virtual machine (VM) migration to Azure Local using Azure Migrate. It provides detailed instructions for both Software Defined Networking (SDN) enabled and non-SDN enabled Azure Local environments.
This article applies to migration of Hyper-V VMs (Preview) and VMware VMs.,2026-04-22T22:07:00.000Z,how-to,best-practices,0.7,True,"Explains how to maintain static IPs across SDN and non-SDN environments; likely includes product-specific steps, gotchas, and configuration patterns.",new +https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-troubleshoot?view=azloc-2604,Troubleshoot,Troubleshoot issues when migrating VMs to Azure Local using Azure Migrate - Azure Local,Troubleshoot Azure Local VM migrations with Azure Migrate,Learn about how to troubleshoot issues when migrating Windows VMs to your Azure Local instance using Azure Migrate.,Applies to: Azure Local 2503 and later This article describes how to troubleshoot any potential issues that you may experience when migrating Hyper-V (Preview) and VMware VMs to your Azure Local using Azure Migrate.,2026-04-22T22:07:00.000Z,how-to,troubleshooting,0.85,True,Explicit troubleshooting article; typically organized by symptom and includes specific error messages/codes and resolutions.,new +https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-via-powershell?view=azloc-2604,Migrate via PowerShell,Migrate VMs to Azure Local with Azure Migrate using PowerShell - Azure Local,Migrate VMs to Azure Local with Azure Migrate PowerShell,Learn how to migrate VMs to Azure Local with Azure Migrate using PowerShell.,Applies to: Azure Local 2503 and later This article describes how to migrate virtual machines (VMs) to Azure Local with Azure Migrate using PowerShell. 
This article applies to migration of Hyper-V VMs (Preview) and VMware VMs.,2026-01-22T18:08:00.000Z,how-to,integrations,0.65,True,"PowerShell-based migration article likely documents specific cmdlets, parameters, and required values unique to Azure Migrate + Azure Local integration.",new +https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-vmware-migrate?view=azloc-2604,"Migrate, verify",Migrate VMware VMs to Azure Local using Azure Migrate - Azure Local,,Learn about how to migrate VMware VMs to your Azure Local instance using Azure Migrate.,Applies to: Azure Local 2503 and later This article describes how to migrate VMware virtual machines (VMs) to Azure Local using Azure Migrate and includes the steps to verify the migration.,2026-04-22T22:07:00.000Z,how-to,,0.4,False,"Migration procedure article; step-by-step guidance without strong indication of limits, config matrices, or error-code catalogs.",new +https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-vmware-prerequisites?view=azloc-2604,Complete prerequisites,Prerequisites for VMware VM migration to Azure Local using Azure Migrate - Azure Local,,Learn prerequisites for VMware migration to Azure Local using Azure Migrate.,Applies to: Azure Local 2503 and later This article describes the prerequisite tasks you need to complete before you begin the process to migrate VMware virtual machines (VMs) to Azure Local.
Make sure to review the requirements for migration if you haven't already.,2026-03-26T17:03:00.000Z,how-to,,0.4,False,"Prerequisites checklist for VMware migration; likely procedural (create projects, permissions) rather than detailed config or limits tables.",new +https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-vmware-replicate?view=azloc-2604,"Discover, replicate",Discover and replicate VMware VMs for migration to Azure Local using Azure Migrate - Azure Local,,Learn the discovery and replication process for VMware VMs to Azure Local using Azure Migrate.,"Applies to: Azure Local 2503 and later This article describes the discovery and replication phase for VMware virtual machine (VM) migration to Azure Local using Azure Migrate. For more information on appliances for Azure Migrate and how to manage them, see Azure Migrate appliance.",2026-03-26T17:03:00.000Z,how-to,,0.4,False,"Discovery and replication how-to for VMware; mostly workflow steps, not structured troubleshooting or configuration reference.",new +https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-vmware-requirements?view=azloc-2604,Review requirements,Review requirements for VMware VM migration to Azure Local using Azure Migrate - Azure Local,System requirements for VMware migration to Azure Local,Learn the system requirements for VMware migration to Azure Local using Azure Migrate.,Applies to: Azure Local 2503 and later This article lists the system requirements for migrating VMware virtual machines (VMs) to Azure Local by using Azure Migrate.,2026-03-26T17:03:00.000Z,how-to,limits-quotas,0.7,True,"System requirements for VMware migration will enumerate supported ESXi/vCenter versions, network and resource constraints, which are concrete numeric and version limits.",new +https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-whats-new?view=azloc-2604,What's new in VM migration?,What's new in Azure Migrate for Azure Local - Azure Local,,Learn about new
features in Azure Migrate for Azure Local.,This article lists the various features and improvements that are available in virtual machine (VM) migration to Azure Local (formerly Azure Stack HCI). This article applies to both Hyper-V (Preview) and VMware VM migrations. Applies to: Azure Local 2503 and later,2026-04-22T22:07:00.000Z,how-to,,0.2,False,"“What’s new” feature list; usually release/feature bullets without deep config, limits, or troubleshooting matrices.",new +https://learn.microsoft.com/en-us/azure/azure-local/migrate/migration-azure-migrate-overview?view=azloc-2604,Overview,Use Azure Migrate to move Hyper-V VMs to Azure Local (preview) - Azure Local,,Learn about how to use Azure Migrate to migrate Windows and Linux VMs to your Azure Local instance (preview).,"Applies to: Azure Local 2503 and later This article provides an overview of how to migrate Hyper-V virtual machines (VMs) to your Azure Local instance using Azure Migrate. Azure Migrate is a central hub for tools to discover, assess, and migrate on-premises servers, apps, and data to the Microsoft Azure cloud. Azure Local is a hyperconverged infrastructure (HCI) system solution that hosts virtualized Windows and Linux workloads in a hybrid environment. You can use the Azure Migrate platform to m",2026-04-22T22:07:00.000Z,overview,,0.3,False,"High-level overview of using Azure Migrate for Hyper-V; likely conceptual and workflow-focused, not detailed config or decision matrices.",new +https://learn.microsoft.com/en-us/azure/azure-local/migrate/migration-azure-migrate-vmware-overview?view=azloc-2604,Overview,Use Azure Migrate to move VMware VMs to Azure Local - Azure Local,,Learn about how to use Azure Migrate to migrate VMware VMs to your Azure Local instance.,"Applies to: Azure Local 2503 and later This article provides an overview of how to migrate VMware virtual machines (VMs) to your Azure Local instance using Azure Migrate. 
Azure Migrate is a central hub for tools to discover, assess, and migrate on-premises servers, apps, and data to the Microsoft Azure cloud. Azure Local is a hyperconverged infrastructure system solution that hosts virtualized Windows and Linux workloads in a hybrid environment. You can use the Azure Migrate platform to move on-",2026-04-22T22:07:00.000Z,overview,,0.3,False,Overview of using Azure Migrate for VMware; conceptual and workflow-focused rather than detailed expert configuration or decision matrices.,new +https://learn.microsoft.com/en-us/azure/azure-local/migrate/migration-options-overview?view=azloc-2604,Migration overview,Options for migrating virtual machines to Azure Local - Azure Local,Choose VM migration options to Azure Local,Learn about the available migration options for migrating VM workloads to your Azure Local.,Applies to: Azure Local 2503 and later This article provides an overview of the options available for migrating virtual machine (VM) workloads to your Azure Local (formerly Azure Stack HCI) instance.,2026-04-22T22:07:00.000Z,overview,decision-making,0.65,True,"Overview of migration options typically includes comparison of approaches (Azure Migrate, PowerShell, etc.) and guidance on when to use which; this is product-specific decision guidance.",new +https://learn.microsoft.com/en-us/azure/azure-local/migrate/monitor-migration?view=azloc-2604,Monitor Azure Local migrations,Monitor Azure Local Migrations Using Diagnostic Settings in Azure Migrate - Azure Local,Configure diagnostic settings to monitor Azure Local migrations,Learn how to monitor Azure Local migrations using diagnostic settings in Azure Migrate.,"Applies to: Azure Local 2503 and later This article describes how to enable diagnostic settings in Azure Migrate via the Azure portal to help monitor Azure Local migrations. Diagnostic logs provide detailed and frequent data about resource operations, helping in monitoring, troubleshooting, and auditing. 
For more information, see Diagnostic settings in Azure Monitor. To enable diagnostic settings in Azure Migrate via PowerShell or the Azure CLI, see Collect and consume log data from your Azure res",2025-12-23T23:03:00.000Z,how-to,configuration,0.7,True,"Diagnostic settings article will specify log categories, destinations, and configuration parameters for Azure Migrate resources.",new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-assign-vm-rbac-roles?view=azloc-2604,Assign RBAC roles,Use built-in RBAC roles to manage Azure Local VMs for multi-rack deployments (preview) - Azure Local,Assign built-in RBAC roles for Azure Local multi-rack VMs,Learn how to use RBAC built-in roles to manage Azure Local VMs for multi-rack deployments (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to use Role-based Access Control (RBAC) to control access to Azure Local virtual machines (VMs) for multi-rack deployments. You can use the RBAC roles to control access to VMs and VM resources such as virtual disks, network interfaces, VM images, logical networks, and virtual networks. You can assign these roles to users, groups, service principals, and managed identities.",2026-04-23T17:08:00.000Z,how-to,security,0.85,True,"Describes RBAC role usage for VM and resource access; likely lists specific built-in role names and scopes, which are product-specific security configuration details.",new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-azure-arc-vm-management-overview?view=azloc-2604,Azure Local VMs for multi-rack deployments,What is Azure Local VM Management for Multi-rack Deployments (preview)? 
- Azure Local,,Learn about using Azure Local VM management to provision and manage on-premises Windows and Linux virtual machines (VMs) in Azure Local multi-rack deployments (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article provides a brief overview of the Azure Local virtual machine (VM) management feature on Azure Local for multi-rack deployments, including benefits, components, and a high-level workflow. Azure Local VM management enables IT admins to provision and manage Windows and Linux VMs hosted in an on-premises Azure Local environment. IT admins can use the feature to create, modify, delete, and assign permissions and roles to ap",2026-04-23T17:08:00.000Z,how-to,,0.25,False,"VM management overview is high-level (benefits, components, workflow); lacks explicit expert-level configuration or troubleshooting details.",new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-cli-extensions?view=azloc-2604,Install Azure CLI extensions,Install CLI extensions for multi-rack deployments of Azure Local (preview) - Azure Local,Install Azure CLI extensions for Azure Local multi-rack,Learn how to install the needed Azure CLI extensions for multi-rack deployments of Azure Local (preview).,Applies to: Multi-rack deployments of Azure Local 2511 and later This article explains how to install the required Azure CLI extensions for multi-rack deployments of Azure Local.,2026-04-23T17:08:00.000Z,how-to,configuration,0.7,True,"Describes required Azure CLI extensions for multi-rack; includes extension names, versions, and install commands that are product-specific configuration details.",new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-concepts-compute?view=azloc-2604,Compute for multi-rack deployments,Compute for Multi-rack Deployments of Azure Local (preview) - Azure Local,,Get an overview of compute resources for multi-rack deployments of Azure Local (preview).,Applies to: Multi-rack 
deployments of Azure Local 2511 and later This article provides an overview of compute resources for multi-rack deployments of Azure Local.,2026-04-23T17:08:00.000Z,concept-article,,0.25,False,"Compute overview is conceptual; no clear indication of numeric limits, config tables, or decision matrices in the summary.",new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-configure-layer-3-isolation-domain?view=azloc-2604,Create and manage L3 isolation domains,Manage Layer 3 Isolation Domains for Azure Local Multi-rack Deployments (preview) - Azure Local,Configure Layer 3 isolation domains for Azure Local multi-rack,Learn how to manage Layer 3 isolation domains for Azure Local multi-rack deployments (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to create, modify, or delete Layer 3 (L3) isolation domains for multi-rack deployments of Azure Local. Isolation domains enable network connectivity between workloads hosted in the same rack (intra-rack communication) or different racks (inter-rack communication) and with endpoints external to Azure Local. 
You can create, update, delete, and check the status of your L3 isolation domains by using the Azure",2026-04-23T17:08:00.000Z,how-to,configuration,0.8,True,"Managing L3 isolation domains uses Azure resources and CLI/portal parameters (resource types, properties, allowed values) that are specific configuration knowledge.",new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-connect-arc-vm-using-ssh?view=azloc-2604,Connect to Azure Local VMs via SSH,Connect to an Azure Local VM using SSH or RDP over SSH for multi-rack deployments (preview) - Azure Local,Connect to Azure Local multi-rack VMs via SSH and RDP over SSH,Learn how to use SSH or RDP over SSH to connect to an Azure Local VM for multi-rack deployments (preview).,Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to connect to an Azure Local virtual machine (VM) using Secure Shell (SSH) and Remote Desktop Protocol (RDP) over SSH for multi-rack deployments.,2026-04-23T17:08:00.000Z,how-to,configuration,0.7,True,"Shows how to configure SSH/RDP over SSH connectivity; includes port, user, and CLI/portal settings specific to Azure Local multi-rack.",new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-arc-virtual-machines?view=azloc-2604,Create Azure Local VMs,Create Azure Local Virtual Machines Enabled by Azure Arc on Multi-rack Deployments - Azure Local,Create Azure Arc-enabled VMs on Azure Local multi-rack,Learn how to view your Azure Local multi-rack deployment in the Azure portal and create Azure Local virtual machines enabled by Azure Arc (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to create Azure Local virtual machines (VMs) enabled by Azure Arc, using the VM images that you created on multi-rack deployments of Azure Local. You can create Azure Local VMs using the Azure CLI, Azure portal, or Azure Resource Manager (ARM) template. 
Note Arc gateway isn't supported on Azure Local VMs.",2026-04-23T17:08:00.000Z,how-to,configuration,0.65,True,Describes creating VMs via CLI/portal/ARM templates; includes resource definitions and parameters specific to Azure Local multi-rack VM management.,new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-internal-load-balancer-virtual-networks?view=azloc-2604,Create internal load balancer for VNETs,Create and Manage an Internal Load Balancer on Multi-Rack Deployments for Azure Local (Preview) - Azure Local,Configure internal load balancers on Azure Local multi-rack,Learn to create and configure internal load balancers for Azure Local multi-rack deployments (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to create an internal load balancer on multi-rack deployments for Azure Local. Important This feature is currently in PREVIEW. +See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-04-23T17:08:00.000Z,concept-article,configuration,0.75,True,Internal load balancer setup uses Azure Local-specific resource settings and CLI parameters for multi-rack deployments.,new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-load-balancer-logical-network?view=azloc-2604,Create public load balancer for LNETs,Create Load Balancers on Logical Networks using Azure CLI in Multi-Rack Deployments of Azure Local (preview) - Azure Local,Create load balancers on logical networks in Azure Local multi-rack,Learn how to create load balancers on logical networks using Azure CLI in multi-rack deployments of Azure Local (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to create load balancers on logical networks using Azure CLI in multi-rack deployments of Azure Local. 
Important This feature is currently in PREVIEW. +See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-04-22T22:07:00.000Z,how-to,configuration,0.75,True,Focuses on LB creation on logical networks via CLI; includes configuration parameters and constraints unique to Azure Local multi-rack.,new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-logical-networks?view=azloc-2604,Create logical network,Create logical networks for Azure Local VMs for multi-rack deployments - Azure Local,Create logical networks for Azure Local multi-rack VMs,Learn how to create logical networks on Azure Local VMs for multi-rack deployments.,Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to create or add logical networks for Azure Local virtual machines (VMs) for multi-rack deployments. Azure Local VMs that you create use these logical networks. Note Azure Local VMs support only IPv4 addresses. IPv6 addresses aren't supported.,2026-04-23T17:08:00.000Z,how-to,configuration,0.75,True,"Creating logical networks involves specifying address spaces, subnets, and flags; these are product-specific configuration parameters and constraints.",new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-network-interfaces?view=azloc-2604,Create network interfaces,Create network interfaces for Azure Local VMs for multi-rack deployments - Azure Local,Configure network interfaces for Azure Local multi-rack VMs,Learn how to create network interfaces on an existing logical network associated with your Azure Local for multi-rack deployments. 
The Azure Local VM enabled by Azure Arc uses these network interfaces,Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to create network interfaces that you can associate with an Azure Local virtual machine (VM) for multi-rack deployments. You can create network interfaces using the Azure portal or Azure Command-Line Interface (CLI).,2026-04-23T17:08:00.000Z,how-to,configuration,0.7,True,"Creating NICs on logical networks uses Azure resource properties (IP allocation, subnet, NIC settings) that are product-specific configuration details.",new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-network-security-groups?view=azloc-2604,Create NSGs,Create network security groups on Azure Local VMs on multi-rack deployments - Azure Local,Create network security groups for Azure Local multi-rack VMs,Learn how to create network security groups on Azure Local VMs on multi-rack deployments using the Azure CLI.,Applies to: Multi-rack deployments of Azure Local 2511 and later This article explains how to create and configure network security groups (NSGs) to manage data traffic flow on a multi-rack deployment of your Azure Local.,2026-04-23T17:08:00.000Z,how-to,security,0.8,True,"Describes NSG creation and rule configuration via CLI; includes security rule parameters (ports, protocols, priorities) specific to Azure Local multi-rack.",new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-public-ip?view=azloc-2604,Create public IP addresses,Create Public IP Addresses on Multi-rack Deployments of Azure Local (preview) - Azure Local,Create public IP resources on Azure Local multi-rack deployments,Learn how to create public IP resources on multi-rack deployments of Azure Local (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to create public IP resources on multi-rack deployments of Azure Local. 
A public IP in a multi-rack deployment of Azure Local represents an externally routable IP address resource. Unlike Azure public IP addresses, which are always internet-routable, a public IP on multi-rack deployments can be configured with any IP address that's routable within your network or, optionally, internet-facing. This resourc",2026-04-22T22:07:00.000Z,how-to,configuration,0.75,True,Defines how to configure public IPs that are routable within on-prem or internet; includes resource properties and constraints specific to Azure Local multi-rack.,new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-public-load-balancer-virtual-networks?view=azloc-2604,Create public load balancer for VNETs,Create Public Load Balancer on Virtual Networks for Multi-Rack Deployments of Azure Local (preview) - Azure Local,Configure public load balancers on Azure Local multi-rack VNets,Learn how to create and manage a public Load Balancer on Azure Local for multi-rack deployments. Distribute inbound traffic efficiently across virtual machines (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to create a public load balancer on a multi-rack deployment of Azure Local using the Azure Command-line Interface (CLI). Important This feature is currently in PREVIEW. 
+See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-04-23T17:08:00.000Z,how-to,configuration,0.75,True,"Creating public load balancers involves resource properties (frontend IPs, backend pools, rules) that are product-specific configuration details.",new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-virtual-networks?view=azloc-2604,Create virtual network,Create virtual networks for multi-rack deployments of Azure Local (preview) - Azure Local,Create virtual networks for Azure Local multi-rack deployments,Learn how to create virtual networks for multi-rack deployments of Azure Local (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article provides an overview of virtual networks (VNets) for multi-rack deployments of Azure Local and shows how to create one. Important This feature is currently in PREVIEW. 
+See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-04-23T17:08:00.000Z,how-to,configuration,0.7,True,VNet creation for multi-rack uses Azure resource properties and CLI parameters; article is configuration-focused with product-specific settings.,new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-disk-snapshot?view=azloc-2604,Create and restore data disk snapshots,Create and Restore Data Disk Snapshots of Azure Local (Preview) - Azure Local,Create and restore data disk snapshots on Azure Local multi-rack,Learn how to create snapshots from a data disk and restore a new disk from a snapshot on multi-rack deployments of Azure Local (preview).,"Disk snapshots let you capture a point-in-time copy of a data disk so that you can recover data or quickly provision new disks from a known-good state. This article shows you how to create a snapshot from an existing data disk and restore a new disk from that snapshot on Azure Local. Note This article covers data disk snapshots only. This release doesn't include OS disk snapshot and restore. Important This feature is currently in PREVIEW. 
+See the Supplemental Terms of Use for Microsoft Azure Prev",2026-04-23T17:08:00.000Z,how-to,configuration,0.6,True,"Shows how to snapshot and restore data disks with Azure Local-specific commands and constraints (data disk only, no OS disk).",new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-load-balancer-overview?view=azloc-2604,Load balancer for multi-rack deployments,About Load Balancers in Multi-Rack Deployments of Azure Local (preview) - Azure Local,,Learn about the types of load balancers you can use in multi-rack deployments of Azure Local (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes the different types of load balancers you can use in multi-rack deployments of Azure Local. Important This feature is currently in PREVIEW. +See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-04-22T22:07:00.000Z,concept-article,,0.25,False,"Load balancer overview is descriptive; no explicit mention of configuration tables, limits, or decision matrices.",new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-manage-arc-virtual-machine-resources?view=azloc-2604,Manage Azure Local VM resources,Manage resources for Azure Local VMs for multi-rack deployments - Azure Local,Manage disks and NIC resources for Azure Local multi-rack VMs,Learn how to manage resources like data disks and network interfaces on an Azure Local VM for multi-rack deployments.,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to manage the VM resources for an Azure Local VM for multi-rack deployments. After you deploy Azure Local virtual machines (VMs) enabled by Azure Arc, you may need to add or delete resources like data disks. Note You can add network interfaces when the VM is stopped. 
If you add a network interface with a static IP after the VM is provisioned, the IP isn't automatically configured inside the guest OS. You ",2026-04-23T17:08:00.000Z,how-to,configuration,0.7,True,"Covers adding/removing data disks and NICs with constraints (for example, NICs only when stopped, static IP behavior), which are product-specific configuration behaviors.",new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-manage-arc-virtual-machines?view=azloc-2604,Manage Azure Local VMs,Manage Azure Local VMs for Multi-Rack Deployments - Azure Local,Manage lifecycle of Azure Local multi-rack Arc-enabled VMs,"Learn how to manage Azure Local VMs enabled by Azure Arc. This includes operations such as start, stop, restart, view properties of Azure Local VMs for multi-rack deployments.","Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to manage Azure Local virtual machines (VMs) enabled by Azure Arc for multi-rack deployments. The procedures to start, stop, restart, and delete an Azure Local VM, and manage guest management are detailed.",2026-04-23T17:08:00.000Z,how-to,configuration,0.6,True,Details start/stop/restart/delete and guest management operations; uses Azure-specific commands and settings for VM management.,new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-manage-data-disks?view=azloc-2604,Download managed disks from Azure,Download Azure managed disk to Azure Local multi-rack - Azure Local,Download Azure managed disks to Azure Local multi-rack,Learn how to download an Azure managed disk to Azure Local multi-rack deployments.,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to download an Azure managed disk from Azure to your Azure Local multi-rack instance. 
You can then use the disk to create an image or to attach it to your Azure Local virtual machines (VMs) enabled by Arc, as needed.",2026-04-23T17:08:00.000Z,how-to,integrations,0.7,True,"Covers moving managed disks from Azure to Azure Local; involves specific commands, URIs, and parameters for this cross-service integration.",new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-manage-logical-networks?view=azloc-2604,Manage logical networks,Manage logical networks for Azure Local multi-rack VMs enabled by Azure Arc - Azure Local,Manage logical networks for Azure Local multi-rack Arc VMs,Learn how to manage logical networks for Azure Local multi-rack VMs enabled by Azure Arc.,"Applies to: Multi-rack deployments of Azure Local 2511 and later To deploy Azure Local virtual machines (VMs) enabled by Azure Arc, you need to create logical networks. Once these networks are provisioned, you might need to manage them. This article describes how to manage these logical networks for Azure Local VMs deployed on your multi-rack instance.",2026-04-23T17:08:00.000Z,how-to,configuration,0.65,True,Managing logical networks involves editing network properties and associations; uses Azure Local-specific resource configuration.,new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-manage-network-security-groups?view=azloc-2604,Manage NSGs,Manage network security groups on Azure Local multi-rack deployments - Azure Local,Manage NSGs and security rules on Azure Local multi-rack,Learn how to manage network security groups and network security rules on Azure Local multi-rack deployments.,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to manage network security groups (NSGs) on your Azure Local multi-rack deployment. 
Once you create network security groups, you can then list, show details, associate, dissociate, update, and delete these resources.",2026-04-23T17:08:00.000Z,how-to,security,0.75,True,"Managing NSGs and rules is security-focused; involves specific rule properties, associations, and behaviors for Azure Local multi-rack.",new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-monitor-cluster-with-metrics?view=azloc-2604,Monitor cluster metrics,Monitor Multi-rack Deployments of Azure Local with Azure Monitor Metrics - Azure Local,Monitor Azure Local multi-rack with Azure Monitor Metrics,Learn how to monitor multi-rack deployments of Azure Local with Azure Monitor Metrics.,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to monitor your multi-rack deployments of Azure Local with Azure Monitor Metrics. It also describes the Performance Metrics dashboard and lists metrics collected for compute, storage, and network resources in multi-rack deployments of Azure Local. When you have critical applications and business processes that rely on Azure resources, it's important to monitor those resources for their availability, perfor",2026-04-23T17:08:00.000Z,how-to,configuration,0.7,True,"Lists specific metrics for compute, storage, and network and how to configure dashboards; these metric names and dimensions are product-specific configuration knowledge.",new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-monitor-overview?view=azloc-2604,Monitoring overview,Overview of Azure Local Monitoring for Multi-rack Deployments - Azure Local,,This article provides an overview of the Azure Local monitoring solution for multi-rack deployments.,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article provides an overview of monitoring for multi-rack deployments of Azure Local. 
Monitoring multi-rack deployments of Azure Local involves regular collection and analysis of data from all components of your system to promptly identify and address any potential issues. Routine monitoring is crucial for maintaining the health and functionality of your system. To understand current performance patterns, identify performance ",2026-04-23T17:08:00.000Z,overview,,0.3,False,Monitoring overview is conceptual; summary doesn’t indicate metric tables with numeric thresholds or configuration parameters.,new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-nat-gateway-overview?view=azloc-2604,NAT gateway for multi-rack deployments,Overview of NAT Gateway on Multi-Rack Deployments of Azure Local (preview) - Azure Local,,Learn about NAT gateway on multi-rack deployments of Azure Local (preview).,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article provides an overview of Network Address Translation (NAT) gateway on multi-rack deployments of Azure Local. Important This feature is currently in PREVIEW. +See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-04-22T22:07:00.000Z,overview,,0.25,False,NAT gateway overview is conceptual; summary doesn’t indicate detailed configuration parameters or numeric constraints.,new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-network-fabric-overview?view=azloc-2604,Network fabric for multi-rack deployments,Network Fabric Overview For Azure Local Multi-Rack Deployments - Azure Local,,Learn about network fabric resources for Azure Local multi-rack deployments.,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes the capabilities of the network fabric used for infrastructure management for multi-rack deployments of Azure Local. 
The article also covers the workload networking required for these deployments. The network fabric instance is a single deployed physical network infrastructure - including racks, switches, terminal server connections, and cabling - that Azure represents and manages as a Network Fabric (NF) res",2026-04-23T17:08:00.000Z,concept-article,,0.25,False,Network fabric overview describes capabilities and concepts; summary doesn’t show detailed configuration parameters or limits.,new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-overview?view=azloc-2604,About multi-rack deployments,What are multi-rack deployments of Azure Local? - Azure Local,,"Discover Azure Local multi-rack deployments, a new capability for deploying large on-premises datacenters with over 100 machines and 8,000 cores. Learn how to get started.","Applies to: Multi-rack deployments of Azure Local 2511 and later This article provides an overview of multi-rack deployments of Azure Local. The overview details the benefits, key features, use cases, and how to get started with multi-rack deployments. 
Multi-rack deployments extend the scale of Azure Local, supporting hundreds of servers across multiple racks in a single instance.",2026-04-23T17:08:00.000Z,overview,,0.2,False,"Multi-rack overview is high-level benefits and features; no indication of numeric limits, config parameters, or decision matrices.",new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-prerequisites?view=azloc-2604,Complete deployment prerequisites,Prerequisites for multi-rack deployments of Azure Local - Azure Local,Meet prerequisites for Azure Local multi-rack deployments,Review the prerequisites for multi-rack deployments of Azure Local.,Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes the prerequisites for multi-rack deployments of Azure Local.,2026-04-23T17:08:00.000Z,how-to,configuration,0.7,True,"Prerequisites pages typically list concrete requirements (supported versions, network ranges, hardware specs) that are product-specific configuration knowledge.",new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-security?view=azloc-2604,Security concepts,Security Overview for Multi-rack Deployments of Azure Local - Azure Local,,Read an overview of security features for multi-rack deployments of Azure Local.,This article provides an overview of security for multi-rack deployments of Azure Local. Multi-rack deployments of Azure Local are designed and built to detect and defend against the latest security threats. These deployments also comply with the strict requirements of government and industry security standards. 
The security posture of Azure Local is based on the following two principles: Use Microsoft cloud-native security tools to improve your cloud security posture and protect your workloads.,2026-04-23T17:08:00.000Z,concept-article,,0.25,False,"Security overview is conceptual and principle-based; summary doesn’t show specific RBAC roles, settings, or compliance configuration details.",new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-serial-console?view=azloc-2604,Connect to Azure Local VMs via serial console,Connect to VM Serial Console Using Azure CLI on Azure Local (Preview) - Azure Local,Use Azure CLI serial console for Azure Local multi-rack VM recovery,Learn how to connect to the serial console of an Azure Local Multi Rack VM using Azure Command-line Interface (CLI) (preview).,"This article describes how to connect to the serial console of an Azure Local virtual machine (VM) in a multi-rack deployment using the Azure CLI. Serial console provides access to a text-based console for VMs running Linux or Windows Server. It connects to VM's COM1 serial port, giving you direct console access independent of the VM's network state. This is useful for troubleshooting boot issues, fixing misconfigured networking, or recovering a VM that is otherwise unreachable via RDP or SSH. 
O",2026-04-23T17:08:00.000Z,how-to,troubleshooting,0.65,True,Serial console usage is primarily for troubleshooting boot and network issues; article likely maps symptoms to using specific CLI commands and console actions.,new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-troubleshoot-arc-enabled-vms?view=azloc-2604,Troubleshoot Azure Local VMs,Troubleshoot Azure Local Multi-rack Virtual Machines Enabled by Azure Arc - Azure Local,Troubleshoot Azure Local multi-rack Arc-enabled VM issues,Learn how to troubleshoot issues you experience with Azure Local multi-rack Virtual Machines (VMs) enabled by Azure Arc.,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to troubleshoot problems with Azure Local Virtual Machines enabled by Azure Arc on multi-rack deployments. It also lists the current limitations and known problems with Azure Local VM management, along with recommended resolutions.",2026-04-23T17:08:00.000Z,how-to,troubleshooting,0.9,True,"Explicit troubleshooting article with limitations, known problems, and recommended resolutions; likely includes error patterns and product-specific fixes.",new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-virtual-machine-image-storage-account?view=azloc-2604,Create Azure Local VM images,Create Azure Local VM images for multi-rack deployments using Azure Storage account - Azure Local,Create Azure Local multi-rack VM images from Azure Storage,Learn how to create Azure Local VMs for multi-rack deployments using source images from Azure Storage account via Azure portal and Azure CLI.,Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to create Azure Local virtual machines (VMs) for multi-rack deployments using source images from the Azure Storage account. 
You can create VM images using Azure Command Line Interface (CLI) and then use these images to create Azure Local VMs.,2026-04-23T17:08:00.000Z,how-to,integrations,0.7,True,Shows how to use Azure Storage and CLI to build VM images for Azure Local; includes API/CLI parameters and integration-specific steps between services.,new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-virtual-machine-manage-extension?view=azloc-2604,Manage Azure Local VM extensions,Manage VM extensions on Azure Local VMs for multi-rack deployments - Azure Local,Install and manage VM extensions on Azure Local multi-rack VMs,Learn how to enable guest management and then install and manage extensions on Azure Local VMs via Azure portal for multi-rack deployments.,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to install and manage Azure Local virtual machine (VM) extensions for multi-rack deployments via Azure portal. VM extensions for Azure Local VMs enabled by Azure Arc are useful for post-deployment configuration, software installation, or other management tasks. 
To install VM extensions, you must enable Azure guest management on your Azure Local VMs.",2026-04-23T17:08:00.000Z,how-to,configuration,0.65,True,VM extensions and guest management require specific configuration flags and extension names; these are product-specific settings.,new +https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-virtual-machine-manage-image?view=azloc-2604,Manage Azure Local VM images,Manage VM images for Azure Local multi-rack VMs enabled by Azure Arc (preview) - Azure Local,Manage VM images on Azure Local multi-rack deployments,"Learn how to list, view properties, and delete virtual machine images on Azure Local multi-rack deployments using Azure CLI or Azure portal.","Applies to: Multi-rack deployments of Azure Local 2511 and later This article describes how to manage virtual machine (VM) images on your Azure Local multi-rack instance. You can create VM images from an Azure Storage account. After you create images, you can list, view properties, and delete them using Azure CLI or the Azure portal. Important This feature is currently in PREVIEW. 
+See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are i",2026-04-22T17:03:00.000Z,how-to,configuration,0.65,True,"Describes listing, viewing, and deleting VM images via CLI/portal; includes resource types and parameters specific to Azure Local multi-rack.",new
+https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-virtual-machine-operations?view=azloc-2604,Supported VM operations,Supported Operations for Azure Local Multi-rack Virtual Machines (VMs) Enabled by Azure Arc - Azure Local,Use supported operations for Azure Local multi-rack Arc VMs,Learn about the supported virtual machine (VM) operations for Azure Local multi-rack VMs enabled by Azure Arc.,"Applies to: Multi-rack deployments of Azure Local 2511 and later This article discusses the most common operations for Azure Local multi-rack virtual machines (VMs) enabled by Azure Arc. The article identifies the operations that are supported on Azure Local multi-rack VMs, along with the operations that you need to avoid to prevent complications.",2026-04-23T17:08:00.000Z,concept-article,best-practices,0.7,True,Explicitly lists which VM operations are supported and which to avoid; this is product-specific DO/DON'T guidance and gotchas.,new
+https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-vm-management-prerequisites?view=azloc-2604,Complete Azure Local VM prerequisites,Review prerequisites for Azure Local VMs for multi-rack deployments - Azure Local,Satisfy VM prerequisites for Azure Local multi-rack,Learn about the prerequisites for deploying Azure Local VMs for multi-rack deployments.,Applies to: Multi-rack deployments of Azure Local 2511 and later This article lists the requirements and prerequisites for Azure Local virtual machines (VMs) for multi-rack deployments.
Review the requirements and complete the prerequisites before you manage your Azure Local VMs.,2026-04-23T17:08:00.000Z,how-to,configuration,0.7,True,"Lists VM requirements (sizes, images, networking, storage) that are specific to Azure Local multi-rack; these are configuration constraints and settings.",new +https://learn.microsoft.com/en-us/azure/azure-local/oem-license?view=azloc-2604,OEM license information,OEM license for Azure Local overview - Azure Local,,"Learn about the OEM license for Azure Local, its benefits, license requirements, activation, and more.","Applies to: Hyperconverged deployments of Azure Local This article covers the OEM license for Azure Local, its benefits, license requirements, activation, and more.",2026-04-22T22:07:00.000Z,overview,,0.2,False,"OEM license overview; licensing/benefits rather than technical limits, configs, or patterns.",new +https://learn.microsoft.com/en-us/azure/azure-local/overview/disaggregated-overview?view=azloc-2604,About disaggregated deployments,Overview of Disaggregated Deployments for Azure Local - Azure Local,,"Learn the benefits, features, and use cases of Azure Local disaggregated deployments, designed to accelerate cloud and AI innovation from edge to core.","Applies to: Hyperconverged deployments of Azure Local This article provides an overview of disaggregated deployments of Azure Local (formerly Azure Stack HCI). The overview details the benefits, key features, use cases, and how to get started with this generally available solution. Disaggregated deployments come in different sizes, from a single machine footprint to a maximum of 64 machines that use SAN storage. 
They offer a unified management control plane and support a wide range of validated ",2026-04-23T08:00:00.000Z,overview,,0.3,False,"Overview of disaggregated deployments; benefits and use cases, not detailed requirements or decision matrices with thresholds.",new +https://learn.microsoft.com/en-us/azure/azure-local/overview/hyperconverged-overview?view=azloc-2604,About hyperconverged deployments,Overview of Hyperconverged Deployments for Azure Local - Azure Local,,"Learn the benefits, features, and use cases of Azure Local hyperconverged deployments, designed to accelerate cloud and AI innovation from edge to core.","Applies to: Hyperconverged deployments of Azure Local This article provides an overview of hyperconverged deployments of Azure Local (formerly Azure Stack HCI). The overview details the benefits, key features, use cases, and how to get started with this generally available solution. Hyperconverged deployments come in different sizes, from a single machine footprint to a maximum of 16 machines that use hyperconverged storage. They offer a unified management control plane and support a wide range ",2026-04-22T22:07:00.000Z,overview,,0.3,False,Overview of hyperconverged deployments and benefits; no clear evidence of detailed configuration tables or decision matrices.,new +https://learn.microsoft.com/en-us/azure/azure-local/overview?view=azloc-2604,What is Azure Local?,What Is Azure Local? Overview and Key Benefits - Azure Local,,"Learn how Azure Local accelerates cloud and AI innovation by delivering applications, workloads, and services from cloud to edge with Azure Arc as the control plane.","Azure Local is Microsoft’s distributed infrastructure solution that extends Azure capabilities to customer-owned environments. It facilitates the local deployment of both modern and legacy applications across distributed or sovereign locations. 
Azure Local accelerates cloud and AI innovation by seamlessly delivering new applications, workloads, and services from cloud to edge, using Azure Arc as the unifying control plane. The solution offers a cloud-native management experience and supports dep",2026-04-22T22:07:00.000Z,overview,,0.2,False,"High-level product overview and benefits; no concrete limits, configs, or decision matrices.",new
+https://learn.microsoft.com/en-us/azure/azure-local/plan/choose-network-pattern-disaggregated?view=azloc-2604,Choose disaggregated network reference pattern,Azure Local disaggregated deployment network reference patterns - Azure Local,Select a network reference pattern for Azure Local disaggregated,Select a network reference pattern for disaggregated Azure Local deployments.,"This article helps you choose a network reference pattern for disaggregated Azure Local deployments where storage is provided by an external Storage Area Network (SAN). Each pattern is described as a standalone article and includes all the network components for that specific scenario. For an overview of the leaf-spine fabric architecture, traffic flow, and key concepts, see Network reference patterns overview for disaggregated deployments.",2026-04-22T22:28:00.000Z,overview,decision-making,0.75,True,Explicitly helps choose between supported network patterns; likely includes scenario-based recommendations and trade-offs.,new
+https://learn.microsoft.com/en-us/azure/azure-local/plan/choose-network-pattern?view=azloc-2604,Choose network reference pattern,Azure Local deployment network reference patterns - Azure Local,Choose Azure Local deployment network reference pattern,Select a network reference pattern for single-node and two-node Azure Local deployments.,"Applies to: Azure Local 2311.2 and later This article describes a set of network pattern references to architect, deploy, and configure Azure Local using either one, two, or three physical hosts.
Depending on your needs or scenarios, you can go directly to your pattern of interest. Each pattern is described as a standalone entity and includes all the network components for specific scenarios.",2026-04-22T22:07:00.000Z,overview,decision-making,0.8,True,Helps select among network patterns for one to three hosts based on scenarios; explicit decision guidance between patterns.,new +https://learn.microsoft.com/en-us/azure/azure-local/plan/cloud-deployment-network-considerations?view=azloc-2604,Review cloud deployment network considerations,"Network considerations for cloud deployment for Azure Local, version 23H2 - Azure Local",Plan Azure Local cloud deployment network topology,"This article introduces network considerations for cloud deployments of Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local This article discusses how to design and plan an Azure Local system network for cloud deployment. Before you continue, familiarize yourself with the variousAzure Local networking patternsand available configurations.",2026-04-22T22:07:00.000Z,how-to,architecture-patterns,0.65,True,Network planning guidance for Azure Local cloud deployments with product-specific patterns and considerations; focuses on how to design the network rather than just concepts.,new +https://learn.microsoft.com/en-us/azure/azure-local/plan/configure-custom-settings-active-directory?view=azloc-2604,Configure advanced Active Directory settings,"Custom or advanced Active Directory configuration for Azure Local, version 23H2 - Azure Local",Configure custom Active Directory permissions and DNS for Azure Local,"Learn how to assign the required permissions and create the required DNS records for use by Active Directory for your Azure Local, version 23H2 system.",Applies to: Hyperconverged deployments of Azure Local This article describes the permissions and the DNS records required for the Azure Local instance deployment. 
The article also uses examples with detailed steps on how to manually assign permissions and create DNS records for your Active Directory environment. The Azure Local solution is deployed in large Active Directories with established processes and tools for assigning permissions. Microsoft provides an Active Directory preparation script t,2026-04-22T22:07:00.000Z,how-to,security,0.7,True,Gives detailed steps to assign required AD permissions and create specific DNS records; this is product-specific identity and access configuration.,new
+https://learn.microsoft.com/en-us/azure/azure-local/plan/fiber-channel-no-backup-disaggregated-pattern?view=azloc-2604,Fiber Channel disaggregated pattern without backup network,Fiber Channel disaggregated pattern without backup network - Azure Local,Plan Fiber Channel disaggregated pattern without backup network,Plan to deploy an Azure Local disaggregated cluster using Fiber Channel SAN without a backup network.,"This article describes the network reference pattern for disaggregated Azure Local clusters that use Fiber Channel (FC) Storage Area Network (SAN) external storage without a dedicated backup network. Cluster sizes can range from a single node up to 64 nodes across multiple racks. For an overview of the leaf-spine fabric architecture, traffic flow, and key concepts such as Virtual Routing and Forwarding (VRF), Virtual Extensible LAN (VXLAN), and the role of compute leaf switches vs.
service leaf sw",2026-04-22T22:28:00.000Z,how-to,architecture-patterns,0.8,True,"Describes a specific network reference pattern (FC SAN, no backup network) including topology and constraints unique to Azure Local.",new
+https://learn.microsoft.com/en-us/azure/azure-local/plan/fiber-channel-with-backup-disaggregated-pattern?view=azloc-2604,Fiber Channel disaggregated pattern with backup network,Fiber Channel disaggregated pattern with backup network - Azure Local,Plan Fiber Channel disaggregated pattern with backup network,Plan to deploy an Azure Local disaggregated cluster using Fiber Channel SAN with a dedicated backup network.,"This article describes the network reference pattern for disaggregated Azure Local clusters that use Fiber Channel (FC) Storage Area Network (SAN) external storage with a dedicated backup network for virtual machine backup traffic. Cluster sizes can range from a single node up to 64 nodes across multiple racks. For an overview of the leaf-spine fabric architecture, traffic flow, and key concepts such as Virtual Routing and Forwarding (VRF), Virtual Extensible LAN (VXLAN), and the role of compute l",2026-04-22T22:28:00.000Z,how-to,architecture-patterns,0.8,True,Another concrete network pattern (FC SAN with dedicated backup network) with product-specific architecture details and trade-offs.,new
+https://learn.microsoft.com/en-us/azure/azure-local/plan/four-node-switchless-two-switches-two-links?view=azloc-2604,"Storage switchless, dual TOR, dual link","Azure Local four-node storage switchless, dual TOR, dual link deployment network reference pattern - Azure Local",Plan four-node switchless dual-link Azure Local network,"Plan to deploy an Azure Local four-node storage switchless, dual TOR, dual link network reference pattern.",Applies to: Azure Local 2411.1 and later This article describes how you can use a four-node storage switchless network reference pattern with two TOR L3 switches and two full-mesh links to deploy your Azure Local
solution. Note Microsoft has tested and validated the four-node switchless network reference patterns described in this article.,2026-04-22T22:07:00.000Z,concept-article,architecture-patterns,0.8,True,"Four-node switchless dual-link pattern with dual TOR switches is a specific, validated network architecture pattern.",new +https://learn.microsoft.com/en-us/azure/azure-local/plan/network-patterns-overview-disaggregated?view=azloc-2604,Disaggregated network reference patterns overview,Network reference patterns overview for Azure Local disaggregated deployments - Azure Local,Understand network reference patterns for Azure Local disaggregated,Learn about the different supported network reference patterns for Azure Local disaggregated deployments.,"This article provides an overview of network reference patterns for disaggregated deployments of Azure Local. Use this guide to understand the leaf-spine fabric architecture, how traffic flows between racks, and how external connectivity is managed through service leaf switches.",2026-04-22T22:28:00.000Z,concept-article,architecture-patterns,0.65,True,"Overview of specific network reference patterns (leaf-spine, traffic flows) for this product; these are product-specific architecture patterns.",new +https://learn.microsoft.com/en-us/azure/azure-local/plan/network-patterns-overview?view=azloc-2604,Network reference patterns overview,Network reference patterns overview for Azure Local - Azure Local,Understand Azure Local network reference pattern options,Learn about the different supported network reference patterns for Azure Local.,Applies to: Azure Local 2311.2 and later This article provides an overview of deploying network reference patterns in hyperconverged deployments of Azure Local (formerly Azure Stack HCI). A deployment consists of single-node or multiple node systems (up to 16 machines per system) that connect to one or two Top of Rack (TOR) switches. 
Those environments have the following characteristics: At least two network adapter ports dedicated for storage traffic intent. The only exception to this rule is s,2026-04-22T22:07:00.000Z,concept-article,architecture-patterns,0.7,True,Defines supported network reference patterns with node counts and topology constraints; product-specific architecture patterns for networking.,new +https://learn.microsoft.com/en-us/azure/azure-local/plan/network-patterns-sdn-considerations?view=azloc-2604,SDN considerations,Review SDN considerations for network reference patterns - Azure Local,Apply SDN design considerations to Azure Local patterns,Learn about SDN considerations for network reference patterns for Azure Local.,"Applies to: Azure Local 2311.2 and later In this article, you'll review considerations when deploying Software Defined Networking (SDN) in your Azure Local instance.",2026-04-22T22:07:00.000Z,concept-article,architecture-patterns,0.7,True,Discusses SDN considerations when deploying network reference patterns; product-specific architectural guidance for SDN usage.,new +https://learn.microsoft.com/en-us/azure/azure-local/plan/single-server-components?view=azloc-2604,Pattern components,Review single-server storage reference pattern components for Azure Local - Azure Local,Review single-server network components for Azure Local,Learn about single-server storage reference pattern components for Azure Local.,"Applies to: Azure Local 2311.2 and later This article describes which network components are deployed for the single-server reference pattern, as shown in the following diagram:",2026-04-22T22:07:00.000Z,concept-article,configuration,0.75,True,Details which network components are deployed in the single-server pattern; concrete configuration of components.,new +https://learn.microsoft.com/en-us/azure/azure-local/plan/single-server-deployment?view=azloc-2604,Single-node deployment,Azure Local single node storage deployment network reference pattern - Azure 
Local,Plan single-server storage network pattern for Azure Local,Plan to deploy an Azure Local single-server storage network reference pattern.,"Applies to: Azure Local 2311.2 and later This article describes the single-server storage network reference pattern that you can use to deploy your Azure Local solution. The information in this article also helps you determine if this configuration is viable for your deployment planning needs. This article is targeted towards the IT administrators who deploy and manage Azure Local in their datacenters. For information about other network patterns, see Azure Local network deployment patterns.",2026-04-22T22:07:00.000Z,how-to,architecture-patterns,0.8,True,Describes a specific single-server storage network reference pattern and viability criteria; product-specific architecture pattern.,new
+https://learn.microsoft.com/en-us/azure/azure-local/plan/single-server-ip-requirements?view=azloc-2604,Pattern IP requirements,Review single-server storage reference pattern IP requirements for Azure Local - Azure Local,Apply IP address requirements for single-server Azure Local,Review single-server storage reference pattern IP requirements for Azure Local.,Applies to: Azure Local 2311.2 and later This article describes the IP requirements for deploying a single-server network reference pattern in your environment.,2026-04-22T22:07:00.000Z,feature-availability,configuration,0.85,True,"IP requirements for the single-server pattern are specific configuration parameters (IP ranges, counts) unique to this product.",new
+https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-components?view=azloc-2604,Pattern components,Review three-node storage reference pattern components for Azure Local - Azure Local,Review three-node network components for Azure Local,Learn about three-node storage reference pattern components for Azure Local.,"Applies to: Azure Local 2311.2 and later In this article, you'll learn about which network components
get deployed for three-node reference patterns, as shown below:",2026-04-22T22:07:00.000Z,concept-article,configuration,0.75,True,Describes which network components are deployed in three-node patterns; concrete configuration details.,new +https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-ip-requirements?view=azloc-2604,Pattern IP requirements,Review three-node storage reference pattern IP requirements for Azure Local - Azure Local,Apply IP address requirements for three-node Azure Local,Review three-node storage reference pattern IP requirements for Azure Local,"Applies to: Azure Local 2311.2 and later In this article, learn about the IP address requirements for deploying a three-node network reference pattern in your environment.",2026-04-22T22:07:00.000Z,article,configuration,0.85,True,Three-node IP address requirements are detailed configuration parameters specific to Azure Local deployments.,new +https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-switchless-two-switches-single-link?view=azloc-2604,"Storage switchless, dual TOR, single link","Azure Local three-node storage switchless, dual TOR, single link deployment network reference pattern - Azure Local",Plan three-node switchless single-link Azure Local network,"Plan to deploy an Azure Local three-node storage switchless, dual TOR, single link network reference pattern.","Applies to: Hyperconverged deployments of Azure Local In this article, learn about the three-node storage switchless with two TOR L3 switches and full-mesh single link network reference pattern that you can use to deploy your Azure Local solution. Note Microsoft has tested and validated the three-node switchless network reference patterns described in this article. 
For information on two-node switchless network patterns, see Azure Local network deployment patterns.",2026-04-22T22:07:00.000Z,concept-article,architecture-patterns,0.8,True,Three-node switchless dual TOR single-link pattern is a validated network architecture pattern unique to Azure Local.,new
+https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-switchless-two-switches-two-links?view=azloc-2604,"Storage switchless, dual TOR, dual link","Azure Local three-node storage switchless, dual TOR, dual link deployment network reference pattern - Azure Local",Plan three-node switchless dual-link Azure Local network,"Plan to deploy an Azure Local three-node storage switchless, dual TOR, dual link network reference pattern.","Applies to: Hyperconverged deployments of Azure Local In this article, learn about the three-node storage switchless with two TOR L3 switches and two full-mesh links network reference pattern that you can use to deploy your Azure Local solution. Note Microsoft has tested and validated the three-node switchless network reference patterns described in this article.
For information on two-node switchless network patterns, see Azure Local network deployment patterns.",2026-04-22T22:07:00.000Z,concept-article,architecture-patterns,0.8,True,"Three-node switchless dual-link pattern with dual TOR switches is a specific, tested architecture pattern.",new
+https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-components?view=azloc-2604,Pattern components,Review two-node storage reference pattern components for Azure Local - Azure Local,Review two-node network components for Azure Local,Learn about two-node storage reference pattern components for Azure Local.,"Applies to: Azure Local 2311.2 and later In this article, you'll learn about which network components get deployed for two-node reference patterns, as shown below:",2026-04-22T22:07:00.000Z,concept-article,configuration,0.75,True,Lists which network components are deployed in two-node reference patterns; concrete configuration details.,new
+https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-ip-requirements?view=azloc-2604,Pattern IP requirements,Review two-node storage reference pattern IP requirements for Azure Local - Azure Local,Apply IP address requirements for two-node Azure Local,Review two-node storage reference pattern IP requirements for Azure Local,"Applies to: Azure Local 2311.2 and later In this article, learn about the IP address requirements for deploying a two-node network reference pattern in your environment.",2026-04-22T22:07:00.000Z,feature-availability,configuration,0.85,True,"Two-node IP address requirements are specific configuration parameters (counts, roles) for this product.",new
+https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switched-converged?view=azloc-2604,"Storage switched, fully converged","Azure Local two-node storage switched, fully converged deployment network reference pattern - Azure Local",Plan two-node switched converged Azure Local network,"Plan to deploy an Azure Local two-node storage
switched, fully converged network reference pattern.","Applies to: Azure Local 2311.2 and later In this article, you'll learn about the two-node storage switched, fully converged with two TOR switches network reference pattern that you can use to deploy your Azure Local instance solution. The information in this article will also help you determine if this configuration is viable for your deployment planning needs. This article is targeted towards the IT administrators who deploy and manage Azure Local instances in their datacenters. For information",2026-04-22T22:07:00.000Z,how-to,architecture-patterns,0.8,True,"Two-node switched, fully converged pattern is a product-specific network architecture pattern with defined components.",new +https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switched-non-converged?view=azloc-2604,"Storage switched, non-converged","Azure Local two-node storage switched, non-converged deployment network reference pattern - Azure Local",Plan two-node switched non-converged Azure Local network,"Plan to deploy an Azure Local two-node storage switched, non-converged network reference pattern.","Applies to: Azure Local 2311.2 and later In this article, you'll learn about the two-node storage switched, non-converged, two-TOR-switch network reference pattern that you can use to deploy your Azure Local solution. The information in this article will also help you determine if this configuration is viable for your deployment planning needs. This article is targeted towards the IT administrators who deploy and manage Azure Local in their datacenters. 
For information on other network patterns,",2026-04-22T22:07:00.000Z,how-to,architecture-patterns,0.8,True,"Two-node switched, non-converged pattern with two TOR switches is a specific network architecture pattern for this product.",new
+https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switchless-single-switch?view=azloc-2604,"Storage switchless, single switch",Azure Local two-node storage switchless deployment network reference pattern - Azure Local,Plan two-node switchless single-switch Azure Local network,Plan to deploy an Azure Local two-node storage switchless network reference pattern.,"Applies to: Azure Local 2311.2 and later This article describes the two-node storage switchless with single TOR switch network reference pattern that you can use to deploy your Azure Local solution. The information in this article also helps you determine if this configuration is viable for your deployment planning needs. This article targets the IT administrators who deploy and manage Azure Local in their datacenters. For information about other network patterns, see Azure Local network deployme",2026-04-22T22:07:00.000Z,how-to,architecture-patterns,0.8,True,Defines a specific two-node switchless pattern with single TOR switch and when it’s viable; product-specific network architecture pattern.,new
+https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switchless-two-switches?view=azloc-2604,"Storage switchless, two switches","Azure Local two-node storage switchless, two switches deployment network reference pattern - Azure Local",Plan two-node switchless dual-switch Azure Local network,"Plan to deploy an Azure Local two-node storage switchless, two switches network reference pattern.","Applies to: Azure Local 2311.2 and later In this article, you learn about the two-node storage switchless with two TOR L3 switches network reference pattern that you can use to deploy your Azure Local solution.
The information in this article also helps you determine if this configuration is viable for your deployment planning needs. This article is targeted towards the IT administrators who deploy and manage Azure Local in their datacenters. For information on other network patterns, see Azure L",2026-04-22T22:07:00.000Z,how-to,architecture-patterns,0.8,True,Describes two-node switchless with dual TOR switches pattern and deployment viability; concrete architecture pattern.,new +https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/known-issues-23?view=azloc-2604,Known issues,Release notes with fixed and known issues in Azure Local 23xx releases - Azure Local,Known issues and workarounds for Azure Local 23xx,Read about the known issues and fixed issues in Azure Local 23xx releases.,"This article identifies critical known issues and their workarounds in Azure Local 23xx releases. Note Azure Local 23xx releases are not in a supported state. For more information, see Azure Local release information.",2026-04-22T22:07:00.000Z,troubleshooting-general,troubleshooting,0.7,True,Known/fixed issues article with workarounds is effectively a troubleshooting catalog mapping issues to resolutions.,new +https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/known-issues-24?view=azloc-2604,Known issues,Release notes with fixed and known issues in Azure Local 24xx releases - Azure Local,Known issues and workarounds for Azure Local 24xx,Read about the known issues and fixed issues in Azure Local 24xx releases.,"This article identifies critical known issues and their workarounds in Azure Local 24xx releases. Note Azure Local 24xx releases are not in a supported state.
For more information, see Azure Local release information.",2026-04-22T22:07:00.000Z,troubleshooting-general,troubleshooting,0.7,True,Release notes with known issues and workarounds typically map specific symptoms/bugs to causes and mitigation steps.,new +https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/security-update-23?view=azloc-2604,Security updates,Security updates for Azure Local 23xx releases - Azure Local,Security update catalog for Azure Local 23xx releases,Security updates for Azure Local 23xx releases.,"This article lists the various security updates that are available for Azure Local 23xx releases. Note Azure Local 23xx releases are not in a supported state. For more information, see Azure Local release information.",2026-04-22T22:07:00.000Z,release-notes,security,0.65,True,Lists specific security updates for these releases; product-specific security patch information.,new +https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/security-update-24?view=azloc-2604,Security updates,Security updates for Azure Local 24xx releases - Azure Local,Security update catalog for Azure Local 24xx releases,Security updates for Azure Local 24xx releases.,"This article lists the various security updates that are available for Azure Local 24xx releases. Note Azure Local 24xx releases are not in a supported state.
For more information, see Azure Local release information.",2026-04-22T22:07:00.000Z,release-notes,security,0.65,True,"Security update listing is product-specific security information (KBs, CVEs, affected components) not generally known from training.",new +https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/whats-new-23?view=azloc-2604,What's new in Azure Local?,What's new in Hyperconverged Deployments of Azure Local 23xx releases - Azure Local,,Find out about the new features and enhancements in the Azure Local 23xx releases.,"This article lists the features and improvements that are available in hyperconverged deployments of Azure Local (formerly Azure Stack HCI) 23xx releases. Note Azure Local 23xx releases are not in a supported state. For more information, see Azure Local release information.",2026-04-22T22:07:00.000Z,overview,,0.25,False,“What’s new” feature list for 23xx; similar to other release feature summaries without deep technical matrices.,new +https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/whats-new-24?view=azloc-2604,What's new in Azure Local?,What's new in Hyperconverged Deployments of Azure Local 24xx releases - Azure Local,,Find out about the new features and enhancements in the Azure Local 24xx releases.,"This article lists the features and improvements that are available in hyperconverged deployments of Azure Local (formerly Azure Stack HCI) 24xx releases. Note Azure Local 24xx releases are not in a supported state.
For more information, see Azure Local release information.",2026-04-22T22:07:00.000Z,overview,,0.25,False,"“What’s new” release feature list; not focused on configuration matrices, limits, or troubleshooting catalogs.",new +https://learn.microsoft.com/en-us/azure/azure-local/release-information-23h2?view=azloc-2604,Release information,"Azure Local, version 23H2 and 24H2 release information - Azure Local",Plan supported Azure Local release upgrade paths,"Learn about Azure Local releases, including OS builds, supported update paths, and key considerations for staying in a supported state.","Important Azure Local versions 11.2510.1002.93 and 12.2510.1002.531 (supersedes 12.2510.1002.529) are now available. To enhance your Azure Local (formerly known as Azure Stack HCI) experience, Microsoft periodically releases feature updates that introduce new capabilities and improvements. Additionally, Azure Local provides cumulative updates that include monthly quality and security enhancements. These updates are listed for each instance, ensuring your devices remain protected and productive. ",2026-04-22T22:07:00.000Z,release-notes,deployment,0.65,True,Release information with supported update paths and version/build specifics is deployment-focused expert knowledge about upgrade constraints.,new +https://learn.microsoft.com/en-us/azure/azure-local/rename-to-azure-local?view=azloc-2604,Renaming to Azure Local,Renaming Azure Stack HCI to Azure Local - Azure Local,,This article provides the renaming information for Azure Stack HCI to Azure Local.,Applies to: Azure Local 2311.2 and later Azure Stack HCI is now part of Azure Local. Microsoft renamed Azure Stack HCI to Azure Local to communicate a single brand that unifies the entire distributed infrastructure portfolio.
This article describes Azure Local as the new name for Azure Stack HCI and answers commonly asked questions related to this rename.,2026-04-22T22:07:00.000Z,concept-article,,0.1,False,Brand rename explanation and FAQ; marketing/communication content without technical expert details.,new +https://learn.microsoft.com/en-us/azure/azure-local/scalability-deployments?view=azloc-2604,About Azure Local deployments,Azure Local Deployment Types and Scalability - Azure Local,Select Azure Local deployment type and scale,"Discover how Azure Local offers scalable on-premises solutions for critical workloads, from single machines to hundreds of machines, tailored to your needs.","Azure Local offers you a consistent on-premises experience for your critical workloads and Arc services across a wide spectrum of scale points and use cases, from a single machine up to hundreds of machines. This article provides an overview of the different Azure Local deployment types and their scalability options to help you choose the right solution for your organization's needs.",2026-04-22T22:07:00.000Z,concept-article,decision-making,0.7,True,Explicitly helps choose between deployment types and scalability options across scale points; decision-focused guidance beyond generic concepts.,new +https://learn.microsoft.com/en-us/azure/azure-local/security-book/conclusion?view=azloc-2604,Conclusion,Azure Local security book conclusion - Azure Local,,Conclusion for the Azure Local security book.,"Applies to: Hyperconverged deployments of Azure Local We designed Azure Local so it's secure right out of the box. Further, we provide mechanisms to help the system remain secure over time. 
We'll continue to build on our security foundations with innovations that deliver powerful protection now and in the future.",2026-04-22T22:07:00.000Z,concept-article,,0.1,False,Conclusion chapter summarizing security posture and future direction; no expert-level technical details.,new +https://learn.microsoft.com/en-us/azure/azure-local/security-book/operational-security?view=azloc-2604,Operational security,Operational security for the Azure Local security book - Azure Local,,Learn about operational security for the Azure Local security book.,Applies to: Hyperconverged deployments of Azure Local Operational security in Azure Local means ongoing operations using Windows Admin Center and ongoing compliance using Microsoft Defender for Cloud and other tools.,2026-04-22T22:07:00.000Z,concept-article,,0.3,False,Operational security chapter summary; likely conceptual (tools mentioned like WAC and Defender for Cloud) without detailed configuration or error mappings.,new +https://learn.microsoft.com/en-us/azure/azure-local/security-book/overview?view=azloc-2604,Azure Local security book,Azure Local security book overview - Azure Local,,Overview of the Azure Local security book.,"Applies to: Hyperconverged deployments of Azure Local The Azure Local security book discusses in detail the built-in security layers found in Azure Local, from core to cloud.",2026-04-22T22:07:00.000Z,overview,,0.1,False,High-level overview of a security book; navigation/summary content without concrete technical details.,new +https://learn.microsoft.com/en-us/azure/azure-local/security-book/security-foundation?view=azloc-2604,Security foundation,Security foundation for the Azure Local security book - Azure Local,,Learn about the security foundation for the Azure Local security book.,"Applies to: Hyperconverged deployments of Azure Local Azure Local is built on a strong security foundation, including the Microsoft Security Development Lifecycle (SDL), certifications, and a secure supply
chain.",2026-04-22T22:07:00.000Z,concept-article,,0.2,False,"Security foundation overview (SDL, certifications, supply chain); conceptual background, not actionable configuration or troubleshooting content.",new +https://learn.microsoft.com/en-us/azure/azure-local/security-book/silicon-assisted-security?view=azloc-2604,Silicon-assisted security,Silicon assisted security for the Azure Local security book - Azure Local,,Learn about silicon assisted security for the Azure Local security book.,Applies to: Hyperconverged deployments of Azure Local Silicon assisted security for Azure Local means using secured core hardware and approved Azure Local solutions.,2026-04-22T22:07:00.000Z,concept-article,,0.3,False,"Silicon-assisted security overview (secured core hardware, approved solutions); likely high-level without detailed configuration or thresholds.",new +https://learn.microsoft.com/en-us/azure/azure-local/security-book/trustworthy-addition?view=azloc-2604,Trustworthy addition,Trustworthy addition for the Azure Local security book - Azure Local,,Learn about trustworthy addition for the Azure Local security book.,"Applies to: Hyperconverged deployments of Azure Local Trustworthy addition for Azure Local means using security by default, application control, credential protection, memory integrity protection, data protection, network security, malware protection, and privacy controls.",2026-04-22T22:07:00.000Z,concept-article,,0.3,False,"High-level description of 'trustworthy addition' and security-by-default concepts; no indication of specific roles, parameters, or configs.",new +https://learn.microsoft.com/en-us/azure/azure-local/security-book/workload-security?view=azloc-2604,Workload security,Workload security for Azure Local security book - Azure Local,,Learn about workload security for the Azure Local security book.,Applies to: Hyperconverged deployments of Azure Local Workload security for Azure Local means using Trusted launch for Azure Local VMs enabled by Azure
Arc and Microsoft Defender for Cloud for continuous monitoring of your workloads.,2026-04-22T22:07:00.000Z,concept-article,,0.3,False,Workload security overview referencing Trusted launch and Defender for Cloud; appears conceptual rather than parameter- or role-level guidance.,new +https://learn.microsoft.com/en-us/azure/azure-local/security-update/security-update?view=azloc-2604,Security updates,Security updates for Azure Local - Azure Local,,Security updates for Azure Local.,This article lists the various security updates that are available for Azure Local.,2026-04-22T22:07:00.000Z,release-notes,,0.3,False,"Security update listing; likely just KB references and dates, not configuration, RBAC, or troubleshooting guidance.",new +https://learn.microsoft.com/en-us/azure/azure-local/update/about-updates-23h2?view=azloc-2604,About Updates,"About updates for Azure Local, version 23H2 - Azure Local",,"This article describes the updates feature for this release, benefits, and how to keep various pieces of your Azure Local, version 23H2 solution up to date.","Applies to: Hyperconverged deployments of Azure Local This article describes the new update feature for this release of Azure Local (formerly Azure Stack HCI), the benefits of the feature, and how to keep various components of your solution up to date.",2026-04-22T22:28:00.000Z,overview,,0.4,False,"Overview of the updates feature and benefits without detailed configuration parameters, limits, or troubleshooting specifics.",new +https://learn.microsoft.com/en-us/azure/azure-local/update/azure-update-manager-23h2?view=azloc-2604,Update via Azure portal,"Use Azure Update Manager to update your Azure Local, version 23H2 - Azure Local",Use Azure Update Manager to update Azure Local,"This article describes the Azure Update Manager, its benefits, and ways to use it to update your Azure Local, version 23H2 system in the Azure portal.","Applies to: Hyperconverged deployments of Azure Local Important The procedure described 
here applies when updating your existing Azure Local version to a newer version. This article describes how to use Azure Update Manager to find and install available updates on Azure Local. It also describes how to review, track progress, and browse the history of system updates. Important Based on the solution you're using to run Azure Local, latest feature updates might take a week from the availability dat",2026-04-02T17:05:00.000Z,how-to,deployment,0.7,True,"Explains using Azure Update Manager to find, install, and track updates for Azure Local, including versioning behavior. Product-specific update deployment workflow.",new +https://learn.microsoft.com/en-us/azure/azure-local/update/import-discover-updates-offline-23h2?view=azloc-2604,Opt in to update to 12.25xx from 11.xxxx,Import and Discover Update Packages For Azure Local With Limited Connectivity - Azure Local,Import Azure Local update packages in offline environments,This article describes how to import and discover update packages for Azure Local with limited connectivity.,"This article explains how to discover and import solution update packages for Azure Local (formerly Azure Stack HCI) deployed in sites with limited bandwidth connections to Azure. You can download Azure Local solution update as a static payload, then copy or transfer it to multiple instances, and import it using PowerShell. Do these actions before you start an update to reduce the amount of data downloaded during the update. 
The static payload for a solution update includes the OS security updat",2026-04-22T22:07:00.000Z,how-to,configuration,0.65,True,Procedural article for importing static update payloads via PowerShell for limited-connectivity Azure Local sites; includes product-specific cmd usage and parameters that function as configuration steps.,new +https://learn.microsoft.com/en-us/azure/azure-local/update/import-discover-updates-offline-23h2?view=azloc-2604,Update via PowerShell with limited connectivity,Import and Discover Update Packages For Azure Local With Limited Connectivity - Azure Local,Import and discover Azure Local updates in limited connectivity,This article describes how to import and discover update packages for Azure Local with limited connectivity.,"This article explains how to discover and import solution update packages for Azure Local (formerly Azure Stack HCI) deployed in sites with limited bandwidth connections to Azure. You can download Azure Local solution update as a static payload, then copy or transfer it to multiple instances, and import it using PowerShell. Do these actions before you start an update to reduce the amount of data downloaded during the update. The static payload for a solution update includes the OS security updat",2026-04-22T22:07:00.000Z,how-to,deployment,0.7,True,"Details how to download static payloads, transfer, and import update packages via PowerShell for low-bandwidth sites. 
Product-specific offline update deployment pattern.",new +https://learn.microsoft.com/en-us/azure/azure-local/update/solution-builder-extension?view=azloc-2604,About Solution Builder Extension software updates,"Solution Builder Extension updates on Azure Local, version 23H2 - Azure Local",Manage Solution Builder Extension updates on Azure Local,This article describes the Solution Builder Extension updates and how to apply them on your Azure Local machines.,"Applies to: Hyperconverged deployments of Azure Local This article provides an overview of the Solution Builder Extension updates and explains how to identify and install them on your Azure Local systems. Additionally, it offers insights into the extension’s advanced capabilities.",2026-04-09T22:04:00.000Z,overview,configuration,0.6,True,Explains how to identify and install Solution Builder Extension updates and use advanced capabilities. Contains product-specific update configuration steps.,new +https://learn.microsoft.com/en-us/azure/azure-local/update/update-best-practices?view=azloc-2604,Review best practices,Best Practices for managing Azure Local Updates - Azure Local,Best practices for managing Azure Local updates,Learn the best practices for managing Azure Local updates.,"This article provides an overview of Azure Local update management, including best practices and common pitfalls to help keep Azure Local secure, up to date, and compliant. This article is intended for IT decision-makers, infrastructure architects, and operations teams responsible for Azure Local deployments.",2026-04-22T22:28:00.000Z,overview,best-practices,0.75,True,"Provides do/don’t style guidance and common pitfalls for update management, tailored to Azure Local. 
Product-specific operational best practices.",new +https://learn.microsoft.com/en-us/azure/azure-local/update/update-phases-23h2?view=azloc-2604,Understand update phases,Understand Update Phases of Azure Local - Azure Local,,Understand the various phases of solution updates applied to Azure Local.,"Applies to: Hyperconverged deployments of Azure Local This article describes the preparation and installation phases of the Azure Local update workflow, including how updates are downloaded, validated, health-checked, and installed. It also explains how update progress is reported at various stages. For more detailed information on progress reporting, see Use Azure Update Manager to update Azure Local and Update Azure Local via PowerShell.",2026-04-06T22:03:00.000Z,concept-article,,0.4,False,"Describes phases of the update workflow conceptually (preparation, validation, health checks, reporting) without detailed parameters or troubleshooting mappings.",new +https://learn.microsoft.com/en-us/azure/azure-local/update/update-settings?view=azloc-2604,Manage update settings,Update Settings - Azure Local,Configure Azure Local update settings and behavior,Manage update settings for Azure Local,This article provides an overview of different Azure Local update settings that control how updates are applied. The default settings provide a balanced experience of update run time versus potential workload impact.
Use the available controls to fine tune and optimize the update behavior to your servicing requirements.,2026-04-22T22:28:00.000Z,overview,configuration,0.7,True,Update settings article will enumerate specific configuration options and their effects on update timing and workload impact.,new +https://learn.microsoft.com/en-us/azure/azure-local/update/update-troubleshooting-23h2?view=azloc-2604,Troubleshoot updates,"Troubleshoot solution updates for Azure Local, version 23H2 - Azure Local",Troubleshoot Azure Local solution update failures,"Learn how to troubleshoot solution updates applied to Azure Local, version 23H2.",Applies to: Hyperconverged deployments of Azure Local This article describes how to troubleshoot solution updates that are applied to your Azure Local to keep it up-to-date.,2026-04-22T22:07:00.000Z,how-to,troubleshooting,0.75,True,"Focused on diagnosing and resolving issues with solution updates, likely including error conditions and corrective actions. Product-specific troubleshooting.",new +https://learn.microsoft.com/en-us/azure/azure-local/update/update-via-powershell-23h2?view=azloc-2604,Update via PowerShell,"Update Azure Local, version 23H2 systems via PowerShell - Azure Local",Apply Azure Local solution updates via PowerShell,"Learn how to use PowerShell to apply operating system, service, and Solution Extension updates to Azure Local, version 23H2.","Applies to: Hyperconverged deployments of Azure Local This article describes how to apply a solution update to your Azure Local by using PowerShell. The procedure in this article applies to both single node and multi-node systems that run the latest version of Azure Local with the orchestrator (Lifecycle Manager) installed. If you created your system by deploying Azure Local, the orchestrator was automatically installed as part of the deployment. 
Important The procedure described here applies wh",2026-04-22T22:07:00.000Z,how-to,deployment,0.7,True,Provides concrete PowerShell procedures and requirements for updating single and multi-node systems with the orchestrator. Product-specific update deployment method.,new +https://learn.microsoft.com/en-us/azure/azure-local/upgrade/about-upgrades-23h2?view=azloc-2604,About Azure Local upgrades,About Azure Local upgrades - Azure Local,Plan upgrade from Azure Stack HCI 22H2 to Azure Local,This article provides an overview of upgrading your cluster to Azure Local.,"Applies to: Azure Local 2311.2 and later This article provides an overview of upgrading your version 22H2 cluster to Azure Local (formerly Azure Stack HCI). Azure Stack HCI OS, version 22H2 is already out of support. To continue receiving updates, we recommend upgrading your operating system to a newer versionvia PowerShell. If you're running OS version 20349.xxxx (22H2) you won't be able to purchase Windows Server Subscription or Extended Security Updates (ESU). If you're running an Azure Stack",2026-04-22T22:07:00.000Z,concept-article,decision-making,0.6,True,"Provides guidance on when and why to upgrade, support implications, and paths (including ESU and subscription constraints). Helps decide upgrade timing and approach.",new +https://learn.microsoft.com/en-us/azure/azure-local/upgrade/install-enable-network-atc?view=azloc-2604,3. Configure Network ATC,Configure Network ATC on Azure Local - Azure Local,,Learn how to configure Network ATC on Azure Local.,"This article describes how to configure Network ATC on an existing Azure Local cluster that doesn't already have it configured. Important In Azure Local upgrade scenarios where Network ATC isn't already configured, we recommend upgrading the operating system first, then configuring Network ATC, and then proceeding with the solution upgrade. If Network ATC is already configured on your cluster, verify the configuration. 
Skip this article if everything works as expected. -For more information on up",2025-12-22T23:06:00.000Z,how-to,configuration,0.7,True,Network ATC configuration article; contains product-specific configuration sequences and recommendations tied to upgrade scenarios.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/upgrade/install-solution-upgrade-azure-resource-manager-template?view=azloc-2603,Upgrade via ARM template,Install solution upgrade on Azure Local using Azure Resource Manager template - Azure Local,,Learn how to install the solution upgrade on your Azure Local instance using Azure Resource Manager template.,"Applies to: Azure Local 2311.2 and later This article describes how to install the solution upgrade on your Azure Local instance using Azure Resource Manager (ARM) template, after upgrading the operating system (OS) build from 20349.xxxx (22H2) to 25398.xxxx (23H2). Important",2026-02-04T18:06:00.000Z,how-to,,0.4,False,ARM template-based upgrade article is a how-to deployment/tutorial; summary doesn’t show parameter tables or tier constraints beyond generic version numbers.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/upgrade/install-solution-upgrade?view=azloc-2603,Upgrade via Azure portal,Install solution upgrade on Azure Local - Azure Local,,Learn how to install the solution upgrade on your Azure Local instance.,"Applies to: Azure Local 2311.2 and later Important Azure Stack HCI OS, version 22H2 is already out of support. To continue receiving updates, we recommend upgrading your operating system to a newer versionvia PowerShell. If you're running OS version 20349.xxxx (22H2) you won't be able to purchase Windows Server Subscription or Extended Security Updates (ESU). 
If you're running an Azure Stack HCI, version 22H2 stretch cluster or managing the cluster via System Center Virtual Machine Manager, revi",2026-02-03T23:02:00.000Z,how-to,,0.4,False,"Solution upgrade install guide is primarily step-by-step procedure; summary doesn’t indicate limits, config matrices, or detailed troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/upgrade/post-upgrade-steps?view=azloc-2603,2. Perform post-OS upgrade tasks,Post-upgrade steps on Azure Local via PowerShell - Azure Local,Run post-upgrade tasks for Azure Local via PowerShell,Learn how to perform the post-upgrade tasks for Azure Local using PowerShell.,"Applies to: Azure Local 2311.2 and later This article describes how to perform the post-OS upgrade tasks after you upgraded the operating system (OS) to the new version. The post-upgrade tasks described in this article are required for the stability of the Azure Local instance. Throughout this article, we refer to OS version 24H2 or 23H2 as thenewversion, and version 22H2 as theoldversion.",2025-12-22T23:06:00.000Z,how-to,deployment,0.68,True,Post-upgrade tasks required for stability; includes specific commands and configuration steps that are unique to Azure Local upgrade flows.,unchanged -https://learn.microsoft.com/en-us/azure/azure-local/upgrade/troubleshoot-upgrade-to-23h2?view=azloc-2603,Troubleshoot upgrades,Troubleshoot Azure Local upgrade - Azure Local,Diagnose and fix Azure Local 23H2 upgrade issues,Learn how to troubleshoot upgrades on your Azure Local.,Applies to: Hyperconverged deployments of Azure Local This article describes how to identify and troubleshoot common Azure Local upgrade issues.,2026-04-06T17:04:00.000Z,how-to,troubleshooting,0.86,True,"The article is explicitly a troubleshooting guide for Azure Local upgrades, likely organized by specific symptoms and including product-specific steps, commands, and error conditions that map to causes and resolutions. 
This is detailed, version-specific operational knowledge that goes beyond generic debugging concepts.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/upgrade/upgrade-22h2-to-23h2-powershell?view=azloc-2603,1. Upgrade OS via PowerShell,"Upgrade Azure Stack HCI OS, version 22H2 to version 23H2 via PowerShell - Azure Local",Upgrade Azure Stack HCI OS 22H2 to 23H2 or 24H2 via PowerShell,"Learn how to use PowerShell to upgrade Azure Stack HCI OS, version 22H2 to version 23H2.","This article describes how to upgrade the Azure Stack HCI operating system (OS) from version 20349.xxxx (22H2) to version 25398.xxxx (23H2), via PowerShell. This is the first step in the upgrade process, which upgrades only the OS. This article describes how to upgrade the Azure Stack HCI operating system (OS) to version 26100.xxxx (24H2), via PowerShell. There are two upgrade paths available: With the 2505 release, a direct upgrade path from version 20349.xxxx (22H2) to version 26100.xxxx (24H2",2026-02-27T18:05:00.000Z,how-to,deployment,0.7,True,"Details specific upgrade paths, OS build numbers, and PowerShell steps; includes product-specific upgrade constraints and commands.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/upgrade/upgrade-stretched-cluster-to-23h2?view=azloc-2603,Upgrade stretched clusters to 23H2,"Upgrade stretched clusters from Azure Stack HCI OS, version 22H2 to 23H2 - Azure Local",,"Learn how to upgrade stretched clusters from Azure Stack HCI OS, version 22H2 to 23H2, including prerequisites, PowerShell steps, and post-upgrade verification.","Applies to: Azure Stack HCI, version 22H2 Important Azure Stack HCI OS, version 22H2 is already out of support. Monthly security and quality updates have stopped. Your system continues to work, including registration and repair. Billing has continued. Microsoft Support is available only for upgrade assistance. 
These steps in this article are the only supported method to upgrade the OS for Azure Stack HCI stretched clusters from version 20349.xxxx (22H2) to version 25398.xxxx (23H2). This version",2025-12-22T23:06:00.000Z,how-to,,0.45,False,"Stretched cluster upgrade article is a prescriptive procedure; no indication of numeric limits, decision matrices, or error-code-based troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/upgrade/validate-solution-upgrade-readiness?view=azloc-2603,4. Validate solution upgrade readiness,"Validate solution upgrade readiness for Azure Local, version 23H2 - Azure Local",,"Learn how to assess upgrade readiness for Azure Local, version 23H2 that already had its operating system upgraded from version 22H2.","Applies to: Azure Local 2311.2 and later This article describes how to assess the upgrade readiness of your Azure Local after the operating system (OS) upgrade. Throughout this article, we refer to OS version 24H2 or 23H2 as thenewversion, and version 22H2 as theoldversion.",2026-02-27T23:05:00.000Z,how-to,,0.4,False,"Upgrade readiness assessment article appears to be a procedural checklist without specific limits, error-code mappings, or configuration tables.",unchanged -https://learn.microsoft.com/en-us/azure/azure-local/whats-new?view=azloc-2603,What's new in Azure Local?,What's new in Hyperconverged Deployments of Azure Local latest release - Azure Local,,Find out about the new features and enhancements in the latest Azure Local release for hyperconverged deployments.,"This article lists the features and improvements that are available in hyperconverged deployments of Azure Local (formerly Azure Stack HCI). 
The latest version of Azure Local solution focuses on cloud-based deployment and updates, cloud-based monitoring, a new and simplified experience for Azure Local virtual machine (VM) management, security, and more.",2026-04-09T08:00:00.000Z,overview,,0.2,False,"Release 'what's new' summary; likely high-level feature descriptions without detailed limits, configs, or troubleshooting matrices.",unchanged +For more information on up",2026-04-22T22:07:00.000Z,how-to,,0.4,False,"How-to configure Network ATC on Azure Local; mostly stepwise guidance without config parameter tables, limits, or troubleshooting matrices.",new +https://learn.microsoft.com/en-us/azure/azure-local/upgrade/install-solution-upgrade-azure-resource-manager-template?view=azloc-2604,Upgrade via ARM template,Install solution upgrade on Azure Local using Azure Resource Manager template - Azure Local,,Learn how to install the solution upgrade on your Azure Local instance using Azure Resource Manager template.,"Applies to: Azure Local 2311.2 and later This article describes how to install the solution upgrade on your Azure Local instance using Azure Resource Manager (ARM) template, after upgrading the operating system (OS) build from 20349.xxxx (22H2) to 25398.xxxx (23H2). Important",2026-04-22T22:07:00.000Z,how-to,,0.4,False,"ARM template-based solution upgrade; deployment tutorial without tier matrices, limits, or config parameter tables.",new +https://learn.microsoft.com/en-us/azure/azure-local/upgrade/install-solution-upgrade?view=azloc-2604,Upgrade via Azure portal,Install solution upgrade on Azure Local - Azure Local,,Learn how to install the solution upgrade on your Azure Local instance.,"Applies to: Azure Local 2311.2 and later Important Azure Stack HCI OS, version 22H2 is already out of support. To continue receiving updates, we recommend upgrading your operating system to a newer version via PowerShell.
If you're running OS version 20349.xxxx (22H2) you won't be able to purchase Windows Server Subscription or Extended Security Updates (ESU). If you're running an Azure Stack HCI, version 22H2 stretch cluster or managing the cluster via System Center Virtual Machine Manager, revi",2026-04-22T22:07:00.000Z,how-to,,0.4,False,"Solution upgrade installation guide; step-by-step upgrade but no expert-only limits, configs, or troubleshooting mappings.",new +https://learn.microsoft.com/en-us/azure/azure-local/upgrade/post-upgrade-steps?view=azloc-2604,2. Perform post-OS upgrade tasks,Post-upgrade steps on Azure Local via PowerShell - Azure Local,,Learn how to perform the post-upgrade tasks for Azure Local using PowerShell.,"Applies to: Azure Local 2311.2 and later This article describes how to perform the post-OS upgrade tasks after you upgraded the operating system (OS) to the new version. The post-upgrade tasks described in this article are required for the stability of the Azure Local instance. 
Throughout this article, we refer to OS version 24H2 or 23H2 as thenewversion, and version 22H2 as theoldversion.",2026-04-22T22:07:00.000Z,how-to,,0.4,False,"Post-upgrade task walkthrough for Azure Local via PowerShell; procedural steps but no detailed limits, configs tables, or error-code mappings.",new +https://learn.microsoft.com/en-us/azure/azure-local/upgrade/troubleshoot-upgrade-to-23h2?view=azloc-2604,Troubleshoot upgrades,Troubleshoot Azure Local upgrade - Azure Local,Troubleshoot Azure Local upgrade failures and issues,Learn how to troubleshoot upgrades on your Azure Local.,Applies to: Hyperconverged deployments of Azure Local This article describes how to identify and troubleshoot common Azure Local upgrade issues.,2026-04-22T22:07:00.000Z,how-to,troubleshooting,0.7,True,"Explicit troubleshooting article for Azure Local upgrades; likely organized by common upgrade issues with specific symptoms and resolutions, matching symptom→cause→solution troubleshooting criteria.",new +https://learn.microsoft.com/en-us/azure/azure-local/upgrade/upgrade-22h2-to-23h2-powershell?view=azloc-2604,1. Upgrade OS via PowerShell,"Upgrade Azure Stack HCI OS, version 22H2 to version 23H2 via PowerShell - Azure Local",Upgrade Azure Stack HCI OS 22H2 to 23H2 or 24H2 via PowerShell,"Learn how to use PowerShell to upgrade Azure Stack HCI OS, version 22H2 to version 23H2.","This article describes how to upgrade the Azure Stack HCI operating system (OS) from version 20349.xxxx (22H2) to version 25398.xxxx (23H2), via PowerShell. This is the first step in the upgrade process, which upgrades only the OS. This article describes how to upgrade the Azure Stack HCI operating system (OS) to version 26100.xxxx (24H2), via PowerShell. 
There are two upgrade paths available: With the 2505 release, a direct upgrade path from version 20349.xxxx (22H2) to version 26100.xxxx (24H2",2026-04-22T22:07:00.000Z,how-to,deployment,0.7,True,Gives concrete PowerShell-based OS upgrade procedures and supported paths (including direct 22H2→24H2). Product-specific upgrade deployment details.,new +https://learn.microsoft.com/en-us/azure/azure-local/upgrade/upgrade-stretched-cluster-to-23h2?view=azloc-2604,Upgrade stretched clusters to 23H2,"Upgrade stretched clusters from Azure Stack HCI OS, version 22H2 to 23H2 - Azure Local",,"Learn how to upgrade stretched clusters from Azure Stack HCI OS, version 22H2 to 23H2, including prerequisites, PowerShell steps, and post-upgrade verification.","Applies to: Azure Stack HCI, version 22H2 Important Azure Stack HCI OS, version 22H2 is already out of support. Monthly security and quality updates have stopped. Your system continues to work, including registration and repair. Billing has continued. Microsoft Support is available only for upgrade assistance. These steps in this article are the only supported method to upgrade the OS for Azure Stack HCI stretched clusters from version 20349.xxxx (22H2) to version 25398.xxxx (23H2). This version",2025-12-22T23:06:00.000Z,how-to,,0.45,False,"Upgrade stretched clusters 22H2→23H2; specific build numbers but mainly procedural upgrade steps, not limits, configs, or troubleshooting tables.",new +https://learn.microsoft.com/en-us/azure/azure-local/upgrade/validate-solution-upgrade-readiness?view=azloc-2604,4. Validate solution upgrade readiness,"Validate solution upgrade readiness for Azure Local, version 23H2 - Azure Local",,"Learn how to assess upgrade readiness for Azure Local, version 23H2 that already had its operating system upgraded from version 22H2.","Applies to: Azure Local 2311.2 and later This article describes how to assess the upgrade readiness of your Azure Local after the operating system (OS) upgrade. 
Throughout this article, we refer to OS version 24H2 or 23H2 as thenewversion, and version 22H2 as theoldversion.",2026-04-22T22:07:00.000Z,how-to,,0.4,False,"Upgrade readiness validation steps; procedural checks but no numeric thresholds, limits, or decision matrices.",new +https://learn.microsoft.com/en-us/azure/azure-local/whats-new?view=azloc-2604,What's new in Azure Local?,What's new in Hyperconverged Deployments of Azure Local latest release - Azure Local,,Find out about the new features and enhancements in the latest Azure Local release for hyperconverged deployments.,"This article lists the features and improvements that are available in hyperconverged deployments of Azure Local (formerly Azure Stack HCI). The latest version of Azure Local solution focuses on cloud-based deployment and updates, cloud-based monitoring, a new and simplified experience for Azure Local virtual machine (VM) management, security, and more.",2026-04-23T17:08:00.000Z,overview,,0.2,False,"What's new / release highlights; feature list rather than limits, configs, or troubleshooting mappings.",new diff --git a/products/azure-local/report.md b/products/azure-local/report.md index e938dfc1..ae52a892 100644 --- a/products/azure-local/report.md +++ b/products/azure-local/report.md @@ -1,423 +1,484 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: - decision-making: 'Planning Azure Local deployments: billing/pricing, VM types, connectivity - and networking (incl. disconnected), migration, multi‑rack scale, deployment types, - and upgrade paths.' - architecture-patterns: 'Network, storage, and resiliency design for Azure Local: - SDN topologies, rack-aware clustering, SAN use, room-to-room links, load balancing, - and VM/DR architecture patterns.' 
- security: 'Securing Azure Local: identity/RBAC, AD/PKI, certificates and rotation, - firewall/NSGs, BitLocker, Trusted launch/attestation, Defender/SIEM, and security - defaults for connected/disconnected setups.' - configuration: 'Configuring Azure Local infrastructure: networking, hardware/GPUs, - registration/proxy, monitoring/Health, backups, disconnected ops, migrations, - multi-rack, and VM/network policies.' - best-practices: Best practices for Azure Local networking, rack-aware deployment - prep, drift detection, alerting, Arc VM operations, and safe, reliable update - management. - deployment: Deploying, scaling, upgrading, and repairing Azure Local clusters (including - rack-aware and disconnected), plus ARM/portal deployment, Arc gateway, ACR, SQL - workloads, and post-deploy tasks - limits-quotas: Cluster rack/layout limits, SLB HA port scale constraints, and VMware-to–Azure - Local migration system requirements and supported configurations. - troubleshooting: 'Diagnosing and fixing Azure Local issues: deployment/upgrade failures, - SDN and networking, VM/Arc/AKS problems, registration, disconnected ops, and collecting/using - logs and support tools.' - integrations: VM networking, load balancing, image/VM creation, and migration/replication - patterns for Azure Local (Hyper-V, Azure Migrate, Site Recovery, CLI/PowerShell, - SSH, public IPs). + security: 'Security, compliance, and identity for Azure Local: mapping to standards + (FedRAMP, HIPAA, PCI, ISO), RBAC/identity, certificates/PKI, NSGs/firewalls, encryption, + Defender, logging, and security updates.' + decision-making: Guidance on choosing Azure Local vs alternatives, VM types, networking, + storage, licensing, pricing, migration, deployment patterns/scale, and upgrade + planning. + limits-quotas: Hardware, host, and physical network prerequisites for Azure Local + disaggregated deployments, plus system requirements for migrating workloads from + Hyper-V and VMware. 
+ configuration: 'Configuring Azure Local infrastructure: networking, storage, GPUs, + SDN, private endpoints, monitoring, updates, and managing VMs (single/multi-rack, + connected or disconnected) via Azure/Arc tools.' + deployment: End-to-end Azure Local deployment, clustering, SDN, disconnected setups, + workload provisioning, and update/upgrade procedures, including rack-aware and + disaggregated scenarios. + architecture-patterns: 'Designing Azure Local network and SDN architectures: rack-aware, + multisite/DR, disconnected, vTPM, and detailed 2–4 node/switchless/Fibre Channel + reference network patterns.' + integrations: 'VM/AKS integration patterns for Azure Local: SAN storage, Arc-enabled + VMs, SSH/RDP access, remote diagnostics, VM migration, managed disk download, + and multi-rack image creation.' + best-practices: Guidance on networking and hardware prep, SDN upgrades, supported + VM operations (single/multi‑rack), preserving IPs on VM moves, drift detection, + and managing Azure Local updates. + troubleshooting: 'Diagnosing and fixing Azure Local issues: provisioning, SDN/connectivity, + Arc VMs, registration, upgrades/updates, multi‑rack, and collecting logs/diagnostics + for known errors.' skill_description: Expert knowledge for Azure Local development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when - planning Azure Local racks/SDN, configuring disconnected clusters, securing RBAC/PKI, - or migrating VMs/AKS, and other Azure Local related development tasks. Not for Microsoft - Foundry Local (use microsoft-foundry-local), Azure Stack Edge (use azure-stack-edge), - Azure Kubernetes Service Edge Essentials (use azure-aks-edge-essentials), Azure - IoT Edge (use azure-iot-edge). 
-use_when: Use when planning Azure Local racks/SDN, configuring disconnected clusters, - securing RBAC/PKI, or migrating VMs/AKS, and other Azure Local related development + planning Azure Local racks/SDN, AKS/VM workloads, Arc integration, SAN/Fibre Channel, + or disconnected sites, and other Azure Local related development tasks. Not for + Microsoft Foundry Local (use microsoft-foundry-local), Azure Stack Edge (use azure-stack-edge), + Azure Arc (use azure-arc). +use_when: Use when planning Azure Local racks/SDN, AKS/VM workloads, Arc integration, + SAN/Fibre Channel, or disconnected sites, and other Azure Local related development tasks. confusable_not_for: Not for Microsoft Foundry Local (use microsoft-foundry-local), - Azure Stack Edge (use azure-stack-edge), Azure Kubernetes Service Edge Essentials - (use azure-aks-edge-essentials), Azure IoT Edge (use azure-iot-edge). + Azure Stack Edge (use azure-stack-edge), Azure Arc (use azure-arc). --- # Azure Local Crawl Report ## Summary -- **Total Pages**: 321 -- **Fetched**: 321 +- **Total Pages**: 350 +- **Fetched**: 350 - **Fetch Failed**: 0 -- **Classified**: 216 -- **Unclassified**: 105 +- **Classified**: 257 +- **Unclassified**: 93 ### Incremental Update -- **New Pages**: 1 -- **Updated Pages**: 4 -- **Unchanged**: 316 -- **Deleted Pages**: 0 +- **New Pages**: 350 +- **Updated Pages**: 0 +- **Unchanged**: 0 +- **Deleted Pages**: 321 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-local/azure-local.csv` ## Classification Statistics | Type | Count | Percentage | |------|-------|------------| -| architecture-patterns | 23 | 7.2% | -| best-practices | 6 | 1.9% | -| configuration | 81 | 25.2% | -| decision-making | 12 | 3.7% | -| deployment | 20 | 6.2% | -| integrations | 8 | 2.5% | -| limits-quotas | 3 | 0.9% | -| security | 36 | 11.2% | -| troubleshooting | 27 | 8.4% | -| *(Unclassified)* | 105 | 32.7% | +| architecture-patterns | 24 | 6.9% | +| best-practices | 7 | 2.0% | +| configuration | 
100 | 28.6% | +| decision-making | 12 | 3.4% | +| deployment | 34 | 9.7% | +| integrations | 8 | 2.3% | +| limits-quotas | 5 | 1.4% | +| security | 43 | 12.3% | +| troubleshooting | 24 | 6.9% | +| *(Unclassified)* | 93 | 26.6% | ## Changes ### New Pages -- [Manage Secure Boot updates](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-secure-boot-updates?view=azloc-2603) +- [What is Azure Local?](https://learn.microsoft.com/en-us/azure/azure-local/overview?view=azloc-2604) +- [About Azure Local deployments](https://learn.microsoft.com/en-us/azure/azure-local/scalability-deployments?view=azloc-2604) +- [FAQ](https://learn.microsoft.com/en-us/azure/azure-local/faq?view=azloc-2604) +- [About hyperconverged deployments](https://learn.microsoft.com/en-us/azure/azure-local/overview/hyperconverged-overview?view=azloc-2604) +- [What's new in Azure Local?](https://learn.microsoft.com/en-us/azure/azure-local/whats-new?view=azloc-2604) +- [Release information](https://learn.microsoft.com/en-us/azure/azure-local/release-information-23h2?view=azloc-2604) +- [Known issues](https://learn.microsoft.com/en-us/azure/azure-local/known-issues?view=azloc-2604) +- [Security updates](https://learn.microsoft.com/en-us/azure/azure-local/security-update/security-update?view=azloc-2604) +- [OEM license information](https://learn.microsoft.com/en-us/azure/azure-local/oem-license?view=azloc-2604) +- [FAQ](https://learn.microsoft.com/en-us/azure/azure-local/license-billing?view=azloc-2604) +- [Compare to Windows Server](https://learn.microsoft.com/en-us/azure/azure-local/concepts/compare-windows-server?view=azloc-2604) +- [Virtual deployment](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-virtual?view=azloc-2604) +- [System requirements](https://learn.microsoft.com/en-us/azure/azure-local/concepts/system-requirements-23h2?view=azloc-2604) +- [SAN requirements](https://learn.microsoft.com/en-us/azure/azure-local/concepts/san-requirements?view=azloc-2604) +- 
[System requirements for low capacity class](https://learn.microsoft.com/en-us/azure/azure-local/concepts/system-requirements-small-23h2?view=azloc-2604) +- [Physical network requirements](https://learn.microsoft.com/en-us/azure/azure-local/concepts/physical-network-requirements?view=azloc-2604) +- [Host network requirements](https://learn.microsoft.com/en-us/azure/azure-local/concepts/host-network-requirements?view=azloc-2604) +- [Firewall requirements](https://learn.microsoft.com/en-us/azure/azure-local/concepts/firewall-requirements?view=azloc-2604) +- [Network reference patterns overview](https://learn.microsoft.com/en-us/azure/azure-local/plan/network-patterns-overview?view=azloc-2604) +- [Choose network reference pattern](https://learn.microsoft.com/en-us/azure/azure-local/plan/choose-network-pattern?view=azloc-2604) +- *...and 330 more* -### Updated Pages +### Deleted Pages -- [With Arc gateway](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-with-azure-arc-gateway?view=azloc-2603) - - Updated: 2026-03-26T08:00:00.000Z → 2026-04-10T08:00:00.000Z -- [Deploy via Azure portal](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-local-identity-with-key-vault?view=azloc-2603) - - Updated: 2026-02-26T18:04:00.000Z → 2026-04-16T08:00:00.000Z -- [What is Azure Local VM management?](https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-arc-vm-management-overview?view=azloc-2603) - - Updated: 2026-04-08T17:05:00.000Z → 2026-04-15T08:00:00.000Z -- [Update disconnected operations](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-update?view=azloc-2603) - - Updated: 2026-02-23T23:04:00.000Z → 2026-04-14T22:03:00.000Z +- ~~FedRAMP guidance~~ (https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-fedramp-guidance?view=azloc-2603) +- ~~HIPAA guidance~~ (https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-hipaa-guidance?view=azloc-2603) +- ~~ISO/IEC 
27001 guidance~~ (https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-iso27001-guidance?view=azloc-2603) +- ~~PCI DSS guidance~~ (https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-pci-dss-guidance?view=azloc-2603) +- ~~Azure Local and security standards~~ (https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-security-standards?view=azloc-2603) +- ~~Azure Hybrid Benefit~~ (https://learn.microsoft.com/en-us/azure/azure-local/concepts/azure-hybrid-benefit?view=azloc-2603) +- ~~Billing and payment~~ (https://learn.microsoft.com/en-us/azure/azure-local/concepts/billing?view=azloc-2603) +- ~~Compare VM management capabilities~~ (https://learn.microsoft.com/en-us/azure/azure-local/concepts/compare-vm-management-capabilities?view=azloc-2603) +- ~~Compare to Windows Server~~ (https://learn.microsoft.com/en-us/azure/azure-local/concepts/compare-windows-server?view=azloc-2603) +- ~~Datacenter Firewall overview~~ (https://learn.microsoft.com/en-us/azure/azure-local/concepts/datacenter-firewall-overview?view=azloc-2603) +- ~~External storage support~~ (https://learn.microsoft.com/en-us/azure/azure-local/concepts/external-storage-support?view=azloc-2603) +- ~~Firewall requirements~~ (https://learn.microsoft.com/en-us/azure/azure-local/concepts/firewall-requirements?view=azloc-2603) +- ~~RAS Gateway overview~~ (https://learn.microsoft.com/en-us/azure/azure-local/concepts/gateway-overview?view=azloc-2603) +- ~~Host network requirements~~ (https://learn.microsoft.com/en-us/azure/azure-local/concepts/host-network-requirements?view=azloc-2603) +- ~~Microsoft 365 Local~~ (https://learn.microsoft.com/en-us/azure/azure-local/concepts/microsoft-365-local-overview?view=azloc-2603) +- ~~Overview~~ (https://learn.microsoft.com/en-us/azure/azure-local/concepts/monitoring-overview?view=azloc-2603) +- ~~Network ATC overview~~ (https://learn.microsoft.com/en-us/azure/azure-local/concepts/network-atc-overview?view=azloc-2603) +- 
~~Network Controller overview~~ (https://learn.microsoft.com/en-us/azure/azure-local/concepts/network-controller-overview?view=azloc-2603) +- ~~Azure Local observability~~ (https://learn.microsoft.com/en-us/azure/azure-local/concepts/observability?view=azloc-2603) +- ~~Physical network requirements~~ (https://learn.microsoft.com/en-us/azure/azure-local/concepts/physical-network-requirements?view=azloc-2603) +- *...and 301 more* ## Classified Pages | TOC Title | Type | Confidence | Reason | |-----------|------|------------|--------| -| [Troubleshoot](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-troubleshoot?view=azloc-2603) | troubleshooting | 0.90 | Explicit troubleshooting article; likely organized by symptom and includes specific errors, causes, and resolutions for Azure Local migrations. | -| [Troubleshoot SDN deployment](https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-sdn-deployment?view=azloc-2603) | troubleshooting | 0.86 | Explicit troubleshooting article for SDN deployment; likely organized by deployment failures, includes logs to check and remediation steps specific to Azure Local SDN. | -| [Troubleshoot upgrades](https://learn.microsoft.com/en-us/azure/azure-local/upgrade/troubleshoot-upgrade-to-23h2?view=azloc-2603) | troubleshooting | 0.86 | The article is explicitly a troubleshooting guide for Azure Local upgrades, likely organized by specific symptoms and including product-specific steps, commands, and error conditions that map to causes and resolutions. This is detailed, version-specific operational knowledge that goes beyond generic debugging concepts. | -| [Assign RBAC roles](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-assign-vm-rbac-roles?view=azloc-2603) | security | 0.85 | Describes RBAC role usage for VM and resource access; will list specific built-in role names and scopes, which is product-specific security configuration. 
| -| [Firewall requirements](https://learn.microsoft.com/en-us/azure/azure-local/concepts/firewall-requirements?view=azloc-2603) | security | 0.85 | Provides outbound endpoints, internal ports/rules, and use of service tags; detailed security configuration parameters specific to Azure Local. | -| [Pattern IP requirements](https://learn.microsoft.com/en-us/azure/azure-local/plan/single-server-ip-requirements?view=azloc-2603) | configuration | 0.85 | IP requirements for single-server pattern; includes specific IP ranges/allocations and counts, which are configuration parameters. | -| [Pattern IP requirements](https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-ip-requirements?view=azloc-2603) | configuration | 0.85 | IP address requirements for three-node patterns; includes specific IP planning details. | -| [Pattern IP requirements](https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-ip-requirements?view=azloc-2603) | configuration | 0.85 | IP address requirements for two-node patterns; includes specific IP allocations and constraints. | -| [System requirements](https://learn.microsoft.com/en-us/azure/azure-local/concepts/system-requirements-23h2?view=azloc-2603) | configuration | 0.85 | Details machine, storage, networking, and Azure requirements; likely includes specific hardware/network parameters and ranges, which are product-specific configuration guidance. | -| [System requirements for low capacity class](https://learn.microsoft.com/en-us/azure/azure-local/concepts/system-requirements-small-23h2?view=azloc-2603) | configuration | 0.85 | Defines requirements for machines, storage, and networking for low-capacity deployments; contains specific configuration thresholds unique to this product scenario. 
| -| [Troubleshoot SDN](https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-troubleshooting?view=azloc-2603) | troubleshooting | 0.85 | Explicit troubleshooting article for SDN; will map specific errors and symptoms (deployment errors, DNS, downtime, policy issues) to causes and resolutions. | -| [Secure the Network Controller](https://learn.microsoft.com/en-us/azure/azure-local/manage/nc-security?view=azloc-2603) | security | 0.84 | Explains securing Northbound, cluster, and Southbound communication paths; likely includes supported authentication methods, certificate requirements, and port/protocol details specific to Network Controller. | -| [Troubleshoot Software Load Balancer for SDN](https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-software-load-balancer?view=azloc-2603) | troubleshooting | 0.84 | Troubleshooting SLB data path issues; likely includes specific checks, counters, and configuration validations unique to SLB for Azure Local/Windows Server. | -| [Assign RBAC role](https://learn.microsoft.com/en-us/azure/azure-local/manage/assign-vm-rbac-roles?view=azloc-2603) | security | 0.82 | The article explains how to use built-in RBAC roles for Azure Local VMs enabled by Azure Arc. This typically includes specific role names, scopes, and which permissions they grant for VM disks, NICs, images, etc., which matches the security category’s requirement for product-specific RBAC configuration details. | -| [Authenticate with Kerberos](https://learn.microsoft.com/en-us/azure/azure-local/manage/kerberos-with-spn?view=azloc-2603) | security | 0.82 | Covers Kerberos with SPN for Network Controller and SCVMM; includes SPN formats, account requirements, and configuration steps that are product-specific security details. 
| -| [Choose network reference pattern](https://learn.microsoft.com/en-us/azure/azure-local/plan/choose-network-pattern?view=azloc-2603) | architecture-patterns | 0.80 | Helps select among network reference patterns for one to three hosts; provides scenario-based guidance on which pattern to use, a product-specific architecture decision. | -| [Configure network security groups with PowerShell](https://learn.microsoft.com/en-us/azure/azure-local/manage/use-datacenter-firewall-powershell?view=azloc-2603) | security | 0.80 | PowerShell-based NSG configuration for Datacenter Firewall; likely includes cmdlet names, parameter sets, and example scripts that are product-specific configuration details. | -| [Host network requirements](https://learn.microsoft.com/en-us/azure/azure-local/concepts/host-network-requirements?view=azloc-2603) | configuration | 0.80 | Host networking considerations and requirements; includes specific network settings and patterns for Azure Local hosts. | -| [Manage syslog forwarding](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-syslog-forwarding?view=azloc-2603) | security | 0.80 | Syslog forwarding configuration for security events includes protocol settings, event types, and integration details with SIEM; clearly security configuration. | -| [Single-node deployment](https://learn.microsoft.com/en-us/azure/azure-local/plan/single-server-deployment?view=azloc-2603) | architecture-patterns | 0.80 | Describes a specific single-server storage network reference pattern and viability criteria; product-specific architecture pattern with concrete topology guidance. | -| [Storage switched, fully converged](https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switched-converged?view=azloc-2603) | architecture-patterns | 0.80 | Two-node switched, fully converged pattern; describes a specific network architecture for Azure Local deployments. 
| -| [Storage switched, non-converged](https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switched-non-converged?view=azloc-2603) | architecture-patterns | 0.80 | Two-node switched, non-converged pattern with two TOR switches; product-specific network design pattern and trade-offs. | -| [Troubleshoot deployment validation issues](https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-deployment?view=azloc-2603) | troubleshooting | 0.80 | Provides symptom-to-resolution guidance for deployment validation failures via Azure portal, including Azure Local–specific checks and remediation steps. | -| [Troubleshoot registration issues for Configurator app](https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-deployment-configurator-app?view=azloc-2603) | troubleshooting | 0.80 | Focused on diagnosing registration failures with the Configurator app, including log collection and symptom-based guidance specific to Azure Local deployment. | -| [Troubleshoot updates](https://learn.microsoft.com/en-us/azure/azure-local/update/update-troubleshooting-23h2?view=azloc-2603) | troubleshooting | 0.80 | Explicitly a troubleshooting article for solution updates on Azure Local 23H2. Such pages typically organize by update failures/symptoms and map them to causes and resolutions, often including specific error messages, codes, and diagnostic steps unique to Azure Local updates. This aligns with the troubleshooting criteria of symptom → cause → solution with product-specific guidance. | -| [Update Network Controller certificates](https://learn.microsoft.com/en-us/azure/azure-local/manage/update-network-controller-certificates?view=azloc-2603) | security | 0.80 | Details automatic and manual renewal of Network Controller certificates; includes product-specific certificate roles, stores, and renewal procedures. 
| -| [Update SDN infrastructure certificates](https://learn.microsoft.com/en-us/azure/azure-local/manage/update-sdn-infrastructure-certificates?view=azloc-2603) | security | 0.80 | Describes renewal of SDN server and SLB MUX certificates; contains specific certificate usage and renewal steps unique to Azure Local SDN. | -| [Collect SDN logs](https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-log-collection?view=azloc-2603) | troubleshooting | 0.78 | Describes how to collect SDN logs; includes specific log locations, commands, and collection procedures unique to Azure Local SDN. | -| [Configure network security groups with Windows Admin Center](https://learn.microsoft.com/en-us/azure/azure-local/manage/use-datacenter-firewall-windows-admin-center?view=azloc-2603) | security | 0.78 | Covers Datacenter Firewall and NSG configuration for SDN, including NSG objects applied to subnets/NICs; typically includes rule properties, directions, and scopes specific to Azure Local SDN. | -| [Manage BitLocker encryption](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-bitlocker?view=azloc-2603) | security | 0.78 | BitLocker management on Azure Local includes specific steps to enable encryption and retrieve recovery keys; these are concrete security configuration procedures. | -| [Manage certificates for SDN](https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-manage-certs?view=azloc-2603) | security | 0.78 | Covers certificate management for Northbound/Southbound communications and SCVMM; includes certificate types, usage, and configuration steps specific to SDN security. | -| [Manage security defaults](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-secure-baseline?view=azloc-2603) | security | 0.78 | Managing security defaults, drift control, and protected settings is product-specific security configuration, likely with named settings and allowed values. 
|
-| [Manual backup and recovery](https://learn.microsoft.com/en-us/azure/azure-local/manage/trusted-launch-vm-import-key?view=azloc-2603) | security | 0.78 | Describes manual backup and recovery of guest state protection keys, including different procedures for specific Azure Local releases and storage targets; this is product-specific security key management configuration. |
-| [Review requirements](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-vmware-requirements?view=azloc-2603) | limits-quotas | 0.78 | A 'requirements' article for a specific migration path typically enumerates exact supported versions, minimum CPU/RAM, network ports, and other hard constraints that function as product-specific limits/quotas (for example, supported vCenter/ESXi versions, minimum appliance sizing, port numbers). These are concrete numeric and version limits that an LLM would not reliably know from training. |
-| [Troubleshoot](https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-arc-enabled-vms?view=azloc-2603) | troubleshooting | 0.78 | Explicit troubleshooting article for Azure Local VMs enabled by Azure Arc; likely includes specific log locations, commands, and known issues/limitations with resolutions. |
-| [Troubleshoot common SDN issues](https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-common-sdn-issues?view=azloc-2603) | troubleshooting | 0.78 | Troubleshooting-focused article for SDN on Azure Local that describes exactly what data to collect and from where (specific log locations, trace types, and commands) before contacting Microsoft Support. This is product- and version-specific diagnostic guidance that an LLM is unlikely to know from training, matching the troubleshooting criteria of symptom → data to gather → next steps. |
-| [Troubleshoot simplified machine provisioning](https://learn.microsoft.com/en-us/azure/azure-local/deploy/troubleshoot-simplified-machine-provisioning?view=azloc-2603) | troubleshooting | 0.78 | Troubleshooting article for simplified machine provisioning; likely organized by specific provisioning failures, includes concrete error messages, logs, and remediation steps unique to Azure Local simplified provisioning. |
-| [Configure network security groups with tags](https://learn.microsoft.com/en-us/azure/azure-local/manage/configure-network-security-groups-with-tags?view=azloc-2603) | security | 0.76 | Describes network security tags and how to apply NSG policies based on tags; includes product-specific tag behavior and configuration steps for Azure Local. |
-| [Azure CLI](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-cli?view=azloc-2603) | configuration | 0.75 | Explains supported CLI versions, extensions, cloud setup, and certificate trust for disconnected operations. This involves specific configuration parameters, extension names, and environment settings unique to this product. |
-| [Configure advanced Active Directory settings](https://learn.microsoft.com/en-us/azure/azure-local/plan/configure-custom-settings-active-directory?view=azloc-2603) | security | 0.75 | Describes required AD permissions and DNS records with detailed steps; this is product-specific security/identity configuration with concrete permission assignments and record types. |
-| [Create NSGs](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-network-security-groups?view=azloc-2603) | security | 0.75 | NSG creation and configuration is security-focused; article will list rule properties, directions, and ports specific to Azure Local multi-rack networking. |
-| [Create Network Security Groups](https://learn.microsoft.com/en-us/azure/azure-local/manage/create-network-security-groups?view=azloc-2603) | security | 0.75 | Covers creation and configuration of NSGs and rules; includes product-specific security configuration parameters and possibly defaults. |
-| [Create and manage L3 isolation domains](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-configure-layer-3-isolation-domain?view=azloc-2603) | configuration | 0.75 | Managing L3 isolation domains involves specific network configuration parameters and allowed values, which are product-specific configuration details. |
-| [Create logical network](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-logical-networks?view=azloc-2603) | configuration | 0.75 | Logical network creation for multi-rack deployments includes specific network settings (address spaces, subnets, constraints) that are configuration knowledge. |
-| [Manage Application Control](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-wdac?view=azloc-2603) | security | 0.75 | Application Control configuration to reduce attack surface is security-focused and product-specific, likely including policy settings and scopes. |
-| [Manage GPUs using Discrete Device Assignment](https://learn.microsoft.com/en-us/azure/azure-local/manage/gpu-manage-via-device?view=azloc-2603) | configuration | 0.75 | Managing GPUs via DDA requires detailed configuration (device IDs, assignment settings) specific to Azure Local and Arc-enabled VMs. |
-| [Manage GPUs via partitioning](https://learn.microsoft.com/en-us/azure/azure-local/manage/gpu-manage-via-partitioning?view=azloc-2603) | configuration | 0.75 | GPU-P partitioning involves product-specific configuration parameters and constraints for sharing GPUs across workloads. |
-| [Modify Health Service settings](https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-settings?view=azloc-2603) | configuration | 0.75 | Details Health Service parameters that govern behavior, including settings to adjust aggressiveness of faults and actions; these are explicit configuration knobs unique to this product. |
-| [Pattern components](https://learn.microsoft.com/en-us/azure/azure-local/plan/single-server-components?view=azloc-2603) | configuration | 0.75 | Details which network components are deployed in the single-server pattern; configuration-level breakdown of product-specific components. |
-| [Pattern components](https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-components?view=azloc-2603) | configuration | 0.75 | Describes which network components are deployed for three-node patterns; product-specific configuration breakdown. |
-| [Pattern components](https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-components?view=azloc-2603) | configuration | 0.75 | Lists which network components are deployed for two-node reference patterns; product-specific component configuration. |
-| [Private endpoints - with proxy, with gateway](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-with-proxy-with-gateway?view=azloc-2603) | configuration | 0.75 | Covers deployment scenarios combining enterprise proxy, Arc gateway, and private endpoints; product-specific configuration matrix for outbound connectivity. |
-| [Review Azure Arc gateway for Azure Local](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-azure-arc-gateway-overview?view=azloc-2603) | deployment | 0.75 | Explains enabling Arc gateway, version requirements, and how it reduces required endpoints; product-specific deployment configuration and constraints. |
-| [Storage switchless, dual TOR, dual link](https://learn.microsoft.com/en-us/azure/azure-local/plan/four-node-switchless-two-switches-two-links?view=azloc-2603) | architecture-patterns | 0.75 | Four-node, dual TOR, dual-link switchless pattern is a specific, Microsoft-tested network reference architecture for Azure Local deployments. |
-| [Storage switchless, dual TOR, dual link](https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-switchless-two-switches-two-links?view=azloc-2603) | architecture-patterns | 0.75 | Three-node, dual TOR, dual-link switchless pattern is a validated Azure Local network architecture pattern guiding deployment design choices. |
-| [Storage switchless, dual TOR, single link](https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-switchless-two-switches-single-link?view=azloc-2603) | architecture-patterns | 0.75 | Three-node, dual TOR, single-link switchless pattern is a specific, tested topology pattern for Azure Local deployments, used in planning and design decisions. |
-| [Storage switchless, single switch](https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switchless-single-switch?view=azloc-2603) | architecture-patterns | 0.75 | Describes a specific, validated network reference pattern for Azure Local with viability criteria; this is a product-specific architecture pattern for deployment topology. |
-| [Storage switchless, two switches](https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switchless-two-switches?view=azloc-2603) | architecture-patterns | 0.75 | Defines a concrete Azure Local network reference pattern (two-node, two TOR L3 switches) used to decide and design deployment topology; fits architecture-patterns. |
-| [Via ARM template](https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-deployment-via-template?view=azloc-2603) | deployment | 0.75 | ARM template deployment for rack aware clusters is a product-specific deployment method, including template parameters and constraints for scale deployments. |
-| [1. Prepare Active Directory](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-prep-active-directory?view=azloc-2603) | security | 0.70 | Preparation of AD for Azure Local typically includes product-specific OU structure, required permissions, and GPO configuration details (for example, which rights the deployment account needs and how to block inheritance). These are concrete, service-specific security/identity configuration steps rather than generic AD concepts. |
-| [1. Upgrade OS via PowerShell](https://learn.microsoft.com/en-us/azure/azure-local/upgrade/upgrade-22h2-to-23h2-powershell?view=azloc-2603) | deployment | 0.70 | Details specific upgrade paths, OS build numbers, and PowerShell steps; includes product-specific upgrade constraints and commands. |
-| [3. Configure Network ATC](https://learn.microsoft.com/en-us/azure/azure-local/upgrade/install-enable-network-atc?view=azloc-2603) | configuration | 0.70 | Network ATC configuration article; contains product-specific configuration sequences and recommendations tied to upgrade scenarios. |
-| [4. Set up subscription permissions](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-arc-register-server-permissions?view=azloc-2603) | security | 0.70 | Focuses on setting up required permissions on the subscription; likely lists specific roles, scopes, and identity settings, which are product-specific security configuration. |
-| [6B. Deploy via ARM template](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-azure-resource-manager-template?view=azloc-2603) | deployment | 0.70 | ARM template deployment article with prerequisites and preparation; typically includes template parameters, supported scenarios, and deployment constraints specific to this product. |
-| [About Azure Local deployments](https://learn.microsoft.com/en-us/azure/azure-local/scalability-deployments?view=azloc-2603) | decision-making | 0.70 | Describes different deployment types and scalability options to help choose the right solution; contains product-specific scale points and guidance for when to use each option. |
-| [About private endpoint scenarios](https://learn.microsoft.com/en-us/azure/azure-local/deploy/about-private-endpoints?view=azloc-2603) | decision-making | 0.70 | Overview of supported/unsupported scenarios and key requirements for private endpoints with Azure Local; helps choose connectivity approach across scenarios. |
-| [Assess environment readiness](https://learn.microsoft.com/en-us/azure/azure-local/manage/use-environment-checker?view=azloc-2603) | configuration | 0.70 | Readiness tool checks minimum requirements for connectivity, hardware, networking, and AD; such pages typically list specific parameters, thresholds, and checks that are product-specific configuration knowledge. |
-| [Assess network readiness via LLDP](https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-readiness-check?view=azloc-2603) | troubleshooting | 0.70 | Using LLDP validator to assess readiness is diagnostic; likely includes specific commands, outputs, and interpretations for deployment issues. |
-| [Attach GPU to Linux VM](https://learn.microsoft.com/en-us/azure/azure-local/manage/attach-gpu-to-linux-vm?view=azloc-2603) | configuration | 0.70 | Describes attaching GPU to Ubuntu VM for AI workloads; likely includes driver versions, VM settings, and GPU configuration parameters specific to Azure Local. |
-| [Azure Arc extension management](https://learn.microsoft.com/en-us/azure/azure-local/manage/arc-extension-management?view=azloc-2603) | configuration | 0.70 | Covers installing, upgrading, and managing Arc extensions on Azure Local with product-specific extension configuration steps and parameters. |
-| [Azure Container Registry](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-azure-container-registry?view=azloc-2603) | deployment | 0.70 | Explains prerequisites, deployment steps, and image management for ACR in disconnected Azure Local. This is a product-specific deployment/integration pattern with concrete steps and constraints. |
-| [Azure Policy](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-policy?view=azloc-2603) | configuration | 0.70 | Describes using Azure Policy when disconnected, which requires specific configuration steps (policy sync, assignment, evaluation) unique to this scenario. |
-| [Azure PowerShell](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-powershell?view=azloc-2603) | configuration | 0.70 | Configuring Azure PowerShell for disconnected Azure Local operations is likely to involve specific module versions, context settings, environment variables, or endpoint/URI overrides unique to this scenario. Those are concrete configuration parameters and patterns that qualify as expert knowledge and align with the configuration sub-skill. |
-| [Back up disconnected operations](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-back-up-restore?view=azloc-2603) | configuration | 0.70 | The page describes how to configure and trigger backups for disconnected Azure Local environments, including parameter configurations. This is product-specific operational configuration detail (parameters and how they’re used) that goes beyond generic backup concepts, fitting the configuration sub-skill. It is not just a tutorial; it focuses on parameterized backup behavior. |
-| [Collect log files for Azure Local VM](https://learn.microsoft.com/en-us/azure/azure-local/manage/collect-log-files-arc-enabled-vms?view=azloc-2603) | troubleshooting | 0.70 | Log collection article is part of troubleshooting; likely specifies exact log locations, commands, and files needed to diagnose Azure Local VM issues. |
-| [Compare to Windows Server](https://learn.microsoft.com/en-us/azure/azure-local/concepts/compare-windows-server?view=azloc-2603) | decision-making | 0.70 | Explicitly compares Azure Local vs Windows Server and explains when to use each; this is product-specific selection guidance. Even if numbers are sparse, it provides scenario-based recommendations and trade-offs. |
-| [Complete Azure Local VM prerequisites](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-vm-management-prerequisites?view=azloc-2603) | configuration | 0.70 | Lists VM requirements and prerequisites (sizes, images, network constraints) for multi-rack; these are product-specific configuration details. |
-| [Complete deployment prerequisites](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-prerequisites?view=azloc-2603) | configuration | 0.70 | Prerequisites article for multi-rack likely lists specific hardware, network, and software configuration requirements unique to this deployment model. |
-| [Configure proxy](https://learn.microsoft.com/en-us/azure/azure-local/manage/configure-proxy-settings-23h2?view=azloc-2603) | configuration | 0.70 | Proxy configuration article includes guidance for different registration methods and version-specific behavior; involves concrete network/proxy settings. |
-| [Connect to Azure Local VMs via SSH](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-connect-arc-vm-using-ssh?view=azloc-2603) | integrations | 0.70 | Details enabling OpenSSH via Arc extension and using SSH/RDP over SSH; includes extension configuration and SSH parameters, an integration pattern. |
-| [Connect to Azure Local VMs via serial console](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-serial-console?view=azloc-2603) | troubleshooting | 0.70 | Serial console usage for boot/network issues is a troubleshooting technique; the article likely maps specific failure modes to serial console commands. |
-| [Create Azure Local VM images](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-virtual-machine-image-storage-account?view=azloc-2603) | integrations | 0.70 | Shows how to use Azure Storage and CLI to create VM images; includes specific CLI parameters and integration patterns between Azure Local and Azure Storage. |
-| [Create Azure Local VMs](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-arc-virtual-machines?view=azloc-2603) | configuration | 0.70 | Covers creating VMs via CLI/portal/ARM with Arc enablement; includes specific resource types, properties, and configuration fields. |
-| [Create and restore data disk snapshots](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-disk-snapshot?view=azloc-2603) | configuration | 0.70 | Snapshot operations include product-specific capabilities and constraints (data disk only, no OS disk) and likely CLI/portal parameters. |
-| [Create network interfaces](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-network-interfaces?view=azloc-2603) | configuration | 0.70 | NIC creation on logical networks involves specific configuration parameters (IP allocation, subnet, NSG association) unique to this environment. |
-| [Create public load balancer for VNETs](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-public-load-balancer-virtual-networks?view=azloc-2603) | configuration | 0.70 | Describes creating load balancers via CLI with specific configuration parameters (frontend IPs, backend pools, rules) unique to Azure Local multi-rack. |
-| [Create virtual network](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-virtual-networks?view=azloc-2603) | configuration | 0.70 | VNet creation article will detail parameters like address ranges, associations, and constraints specific to multi-rack Azure Local. |
-| [Deploy via ARM template](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-local-identity-with-key-vault-template?view=azloc-2603) | deployment | 0.70 | ARM template-based deployment with local identity and Key Vault; includes template parameters and deployment constraints, fitting deployment patterns. |
-| [Enable Insights at scale using Azure policies](https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-multi-azure-policies?view=azloc-2603) | configuration | 0.70 | Describes Azure Policy definitions and assignments to enable Insights across Azure Local systems, including product-specific policy configuration. |
-| [Enable SDN integration](https://learn.microsoft.com/en-us/azure/azure-local/deploy/enable-sdn-integration?view=azloc-2603) | configuration | 0.70 | Describes enabling SDN via a PowerShell action plan; likely includes specific cmdlets, parameters, and configuration steps unique to Azure Local SDN. |
-| [Enable guest attestation](https://learn.microsoft.com/en-us/azure/azure-local/manage/trusted-launch-guest-attestation?view=azloc-2603) | security | 0.70 | Covers enabling guest attestation (boot integrity verification) for Trusted launch VMs; likely includes specific security settings, parameters, and version requirements, which are product-specific security configuration details. |
-| [Enable guest management](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-enable-guest-management?view=azloc-2603) | configuration | 0.70 | Enabling guest management involves specific settings and possibly extension configuration parameters that are product-specific configuration knowledge. |
-| [Enable nested virtualization](https://learn.microsoft.com/en-us/azure/azure-local/manage/enable-nested-virtualization?view=azloc-2603) | configuration | 0.70 | Covers enabling nested virtualization; typically involves specific host/VM configuration flags and PowerShell settings unique to Azure Local/Hyper-V. |
-| [Enhanced management from Azure](https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-enhanced-management-managed-identity?view=azloc-2603) | security | 0.70 | Describes enabling enhanced management via a specific managed identity for Azure Local, including Azure-side configuration details that are product-specific security settings. |
-| [FAQ](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-faq?view=azloc-2603) | troubleshooting | 0.70 | An FAQ for a specific migration scenario (Hyper-V and VMware to Azure Local) typically includes concrete answers about supported/unsupported scenarios, specific error messages or behaviors, and how to resolve them. While not labeled as troubleshooting, such FAQs often map symptoms and questions to causes and resolutions, which aligns best with the troubleshooting sub-skill. |
-| [Fallback log collection](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-fallback?view=azloc-2603) | troubleshooting | 0.70 | Describes fallback logging to export and send logs when standard collection fails. This is a troubleshooting pattern specific to Azure Local VMs in disconnected mode. |
-| [Identity](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-identity?view=azloc-2603) | security | 0.70 | Explains how to plan and integrate identity, including actions and roles for operators. Likely includes product-specific identity/role mappings and configuration guidance, fitting security (identity and access management). |
-| [Known issues](https://learn.microsoft.com/en-us/azure/azure-local/known-issues?view=azloc-2603) | troubleshooting | 0.70 | Release notes of critical known issues and workarounds; typically organized by symptom and workaround, providing product-specific troubleshooting guidance. |
-| [Known issues](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-known-issues?view=azloc-2603) | troubleshooting | 0.70 | Release notes of critical known issues and workarounds. Such pages typically map specific symptoms/conditions to causes and workarounds, which is expert troubleshooting knowledge unique to this product. |
-| [Log alerts](https://learn.microsoft.com/en-us/azure/azure-local/manage/setup-system-alerts?view=azloc-2603) | configuration | 0.70 | Provides sample log queries and alert configuration for Azure Local resources, including concrete query patterns and alert settings. |
-| [Maintain static IP addresses](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-maintain-ip-addresses?view=azloc-2603) | configuration | 0.70 | Explains how to preserve static IPs in SDN and non-SDN environments; likely includes concrete network configuration steps and parameter values. |
-| [Manage Azure Local VM extensions](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-virtual-machine-manage-extension?view=azloc-2603) | configuration | 0.70 | Covers enabling guest management and installing extensions; includes extension types, settings, and constraints specific to Azure Local/Arc. |
-| [Manage Azure Local VM resources](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-manage-arc-virtual-machine-resources?view=azloc-2603) | configuration | 0.70 | Resource management article with constraints (e.g., cannot add NICs after creation) and disk operations; these are product-specific configuration behaviors. |
-| [Manage Network Security Groups](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-network-security-groups?view=azloc-2603) | security | 0.70 | Describes listing, associating, updating, and deleting NSGs for Azure Local VMs; product-specific security configuration operations. |
-| [Manage Secure Boot updates](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-secure-boot-updates?view=azloc-2603) | security | 0.70 | Page contains product-specific Secure Boot behavior for Azure Local, including how 2011 vs 2023 Secure Boot certificates are rolled out, CVE-2023-24932 mitigation details, and ordered update orchestration with OEM/SBE packages plus monitoring/validation steps. These are security-configuration and platform-specific operational details that go beyond generic Secure Boot concepts. |
-| [Manage default network access policies](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-default-network-access-policies-virtual-machines-23h2?view=azloc-2603) | security | 0.70 | Describes default network access policies that block inbound traffic except specific management ports; likely includes concrete policy settings and port lists specific to Azure Local VMs. |
-| [Manage secrets rotation](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-secrets-rotation?view=azloc-2603) | security | 0.70 | Changing deployment user password and managing internal secret rotation is product-specific security/identity configuration. |
-| [Manage with Microsoft Defender for Cloud](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-security-with-defender-for-cloud?view=azloc-2603) | security | 0.70 | Describes using Defender for Cloud to protect Azure Local; likely includes enabling plans, scopes, and security settings specific to this integration. |
-| [Metric alerts](https://learn.microsoft.com/en-us/azure/azure-local/manage/setup-metric-alerts?view=azloc-2603) | configuration | 0.70 | Describes how to create metric alerts for Azure Local with product-specific metrics and alert rule configuration. |
-| [Metrics](https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-cluster-with-metrics?view=azloc-2603) | configuration | 0.70 | Lists metrics collected for compute, storage, and network resources and describes Azure Local–specific metrics configuration and dashboards. |
-| [Monitor cluster metrics](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-monitor-cluster-with-metrics?view=azloc-2603) | configuration | 0.70 | Describes monitoring with Azure Monitor Metrics and lists metrics collected for compute, storage, and network. Such metric listings typically include metric names, dimensions, and possibly units, which are product-specific configuration/monitoring details. |
-| [Network reference patterns overview](https://learn.microsoft.com/en-us/azure/azure-local/plan/network-patterns-overview?view=azloc-2603) | architecture-patterns | 0.70 | Overview of supported network reference patterns with specific deployment characteristics (nodes, TOR switches, adapter requirements); product-specific network architecture patterns. |
-| [On-demand log collection](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-on-demand-logs?view=azloc-2603) | troubleshooting | 0.70 | Explains using a PowerShell module to collect logs on-demand for troubleshooting and support. Contains product-specific commands, parameters, and log collection patterns, mapping symptoms to diagnostic data. |
-| [Physical network requirements](https://learn.microsoft.com/en-us/azure/azure-local/concepts/physical-network-requirements?view=azloc-2603) | configuration | 0.70 | Describes physical (fabric) network requirements and switch considerations for Azure Local. Such pages typically include specific network configuration parameters (e.g., VLANs, MTU, port speeds, required features) and version applicability, which are product-specific configuration details not generally known from training. |
-| [Prepare Azure Local nodes](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-prepare?view=azloc-2603) | deployment | 0.70 | Describes preparing nodes, networking, and readiness for disconnected deployments. Contains deployment-specific requirements and ordering that are unique to this product’s disconnected mode. |
-| [Prepare GPUs](https://learn.microsoft.com/en-us/azure/azure-local/manage/gpu-preparation?view=azloc-2603) | configuration | 0.70 | The page describes product-specific steps and settings to prepare and configure GPUs for Azure Local instances (for Arc-enabled VMs and AKS). This includes concrete configuration actions and parameters tied to this platform, which an LLM is unlikely to know from training. It is not just a conceptual overview but a how-to with environment-specific configuration details, fitting best under configuration. |
-| [Prepare to deploy](https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-deploy-prep?view=azloc-2603) | best-practices | 0.70 | Includes network design recommendations, machine configuration guidelines, and best practices; these are product-specific deployment recommendations. |
-| [Private endpoints - no proxy, no gateway](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-no-proxy-no-gateway?view=azloc-2603) | configuration | 0.70 | Scenario-specific integration of private endpoints with Azure Local (no proxy, no gateway); includes product-specific configuration steps and requirements. |
-| [Private endpoints - no proxy, with gateway](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-no-proxy-with-gateway?view=azloc-2603) | configuration | 0.70 | Details how to integrate existing and new private endpoints in a no-proxy, with-gateway scenario; product-specific configuration guidance. |
-| [Private endpoints - with proxy, no gateway](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-with-proxy-no-gateway?view=azloc-2603) | configuration | 0.70 | Describes configuration of private endpoints when using an enterprise proxy but no Arc gateway; scenario-specific connectivity configuration. |
-| [Provision and place VMs](https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-provision-vm-local-availability-zone?view=azloc-2603) | configuration | 0.70 | Explains how to configure VM placement in local availability zones; includes constraints like non-support for updating placement of existing VMs. |
-| [Public key infrastructure](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-pki?view=azloc-2603) | security | 0.70 | Details PKI requirements and how to create certificates to secure appliance endpoints. This is product-specific security configuration (certificate types, usage, and trust requirements). |
-| [Reference architecture](https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-reference-architecture?view=azloc-2603) | architecture-patterns | 0.70 | Describes network design and configuration for rack aware clusters in specific factory/room-isolation scenarios; this is a product-specific reference architecture pattern. |
-| [Review best practices](https://learn.microsoft.com/en-us/azure/azure-local/update/update-best-practices?view=azloc-2603) | best-practices | 0.70 | Explicit best-practices article; likely includes concrete recommendations, pitfalls, and possibly scheduling or configuration guidance specific to Azure Local updates. |
-| [Review requirements](https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-requirements?view=azloc-2603) | limits-quotas | 0.70 | Requirements/supported configurations typically include numeric constraints (node counts, network requirements, supported topologies) that function as product-specific limits. |
-| [Room-to-room connections](https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-room-to-room-connectivity?view=azloc-2603) | architecture-patterns | 0.70 | Outlines four configuration options with different resilience, cost, and complexity; this is explicit pattern/trade-off guidance for Azure Local rack aware connectivity. |
-| [SDN considerations](https://learn.microsoft.com/en-us/azure/azure-local/plan/network-patterns-sdn-considerations?view=azloc-2603) | architecture-patterns | 0.70 | Discusses SDN considerations when deploying network reference patterns; product-specific architectural guidance on when/how to use SDN with these patterns. |
-| [Security updates](https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/security-update-23?view=azloc-2603) | security | 0.70 | Similar to index 4, this page lists concrete security updates tied to specific Azure Local 23xx versions. These are detailed, versioned security artifacts (KBs, fixes) that qualify as product-specific security knowledge beyond generic concepts. |
-| [Security updates](https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/security-update-24?view=azloc-2603) | security | 0.70 | A security update listing for specific Azure Local release trains is product- and version-specific expert knowledge, typically including KB identifiers, affected components, and sometimes CVE mappings or required actions. This is security-focused configuration/maintenance information that an LLM would not know from training. |
-| [Spread AKS nodes](https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-aks-nodes?view=azloc-2603) | configuration | 0.70 | Describes how to deploy AKS clusters with rack aware support and distribute nodes across zones; involves product-specific configuration parameters. |
-| [Storage thin provisioning](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-thin-provisioning-23h2?view=azloc-2603) | configuration | 0.70 | Explains how to enable and manage thin provisioning via PowerShell with Azure Local–specific storage configuration behavior. |
-| [Use ReFS deduplication and compression](https://learn.microsoft.com/en-us/azure/azure-local/manage/refs-deduplication-and-compression?view=azloc-2603) | configuration | 0.70 | Provides Azure Local–specific steps and parameters to configure ReFS deduplication and compression to optimize storage. |
-| [Use the Diagnostic Support tool](https://learn.microsoft.com/en-us/azure/azure-local/manage/support-tools?view=azloc-2603) | troubleshooting | 0.70 | Page focuses on a product-specific diagnostic tool for Azure Local hyperconverged deployments, likely detailing concrete PowerShell commands and data collection steps for resolving issues, which fits the troubleshooting pattern of product-specific diagnosis and resolution guidance. |
-| [Use the Support tool for infrastructure issues](https://learn.microsoft.com/en-us/azure/azure-local/manage/remediate-support-tool-infrastructure?view=azloc-2603) | troubleshooting | 0.70 | Describes diagnostic and remediation commands in the Support.AksArc PowerShell module, mapping infrastructure issues to specific remediation actions. |
-| [VM load balancing](https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-load-balancing?view=azloc-2603) | configuration | 0.70 | Article is about configuring VM load balancing; likely includes specific settings, parameters, and possibly tables for Azure Local/Windows Server load balancing behavior. |
-| [Via Azure portal](https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-deploy-portal?view=azloc-2603) | deployment | 0.70 | Step-by-step deployment for a specific platform; likely includes Azure Local-specific deployment constraints and supported options in the portal. |
-| [View Health Service faults](https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-faults?view=azloc-2603) | troubleshooting | 0.70 | Provides detailed information on Health Service faults, mapping fault types to causes and likely resolutions for Azure Local and Windows Server clusters. |
-| [Virtual deployment](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-virtual?view=azloc-2603) | deployment | 0.70 | Describes how to deploy a virtual Azure Local system, including host OS requirements, Hyper-V usage, and deployment duration; product-specific deployment procedure and constraints. |
-| [2. Perform post-OS upgrade tasks](https://learn.microsoft.com/en-us/azure/azure-local/upgrade/post-upgrade-steps?view=azloc-2603) | deployment | 0.68 | Post-upgrade tasks required for stability; includes specific commands and configuration steps that are unique to Azure Local upgrade flows. |
-| [Create VM affinity rules](https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-affinity?view=azloc-2603) | configuration | 0.68 | Affinity rules via PowerShell/Admin Center are configuration-focused and product-specific; likely includes setting names, allowed values, and examples for Azure Local clusters. |
-| [Manage security post upgrade](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-security-post-upgrade?view=azloc-2603) | security | 0.68 | Post-upgrade security posture management involves product-specific settings and migration considerations for security configuration. |
-| [Review requirements](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-hyperv-requirements?view=azloc-2603) | configuration | 0.68 | Requirements article typically includes specific supported versions, network/port requirements, capacity constraints, and configuration parameters for Azure Migrate Hyper-V migration, which are product-specific expert details. |
-| [SDN infrastructure](https://learn.microsoft.com/en-us/azure/azure-local/concepts/plan-software-defined-networking-infrastructure-23h2?view=azloc-2603) | architecture-patterns | 0.68 | Planning guide includes product-specific SDN topology guidance, hardware/software prerequisites, and concrete design considerations (physical/logical networks, routing, gateways, extension scenarios) that go beyond generic networking concepts and are specific to Azure Local SDN. |
-| [With Arc gateway](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-with-azure-arc-gateway?view=azloc-2603) | configuration | 0.68 | The article describes concrete, product-specific configuration steps for registering Azure Local with Azure Arc using an Arc gateway, including proxy configuration via script and environment-specific settings. This goes beyond generic deployment guidance and focuses on how to configure the Arc proxy and gateway for this product, which is expert operational knowledge. It does not emphasize limits, troubleshooting, or architecture trade-offs. |
-| [Without Arc gateway](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-without-azure-arc-gateway?view=azloc-2603) | configuration | 0.68 | The article describes concrete, product-specific registration and proxy configuration for Azure Local with Azure Arc, including use of an Arc script and the Configurator app. It focuses on how to set registration settings and proxy options rather than generic concepts, which aligns best with configuration. It does not emphasize deployment matrices, limits, or troubleshooting error codes.
| -| [Discover, replicate](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-hyperv-replicate?view=azloc-2603) | integrations | 0.66 | Discovery and replication phase documentation generally includes appliance configuration, connection settings, and replication parameters (frequency, retention, credentials) that are specific to integrating Hyper-V with Azure Migrate and Azure Local. | -| [3. Create logical networks](https://learn.microsoft.com/en-us/azure/azure-local/manage/create-logical-networks?view=azloc-2603) | configuration | 0.65 | Logical network creation defines networking settings for VMs; such articles typically include specific configuration parameters (subnets, VLANs, address spaces) that are product-specific. | -| [About Azure Local upgrades](https://learn.microsoft.com/en-us/azure/azure-local/upgrade/about-upgrades-23h2?view=azloc-2603) | decision-making | 0.65 | Overview of upgrading from 22H2 with support and subscription implications; helps decide when and how to upgrade based on OS version and supportability. | -| [Acquire disconnected operations](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-acquire?view=azloc-2603) | deployment | 0.65 | Explains how to create the virtual appliance resource and download installation files. This is part of a product-specific deployment flow for disconnected operations, including prerequisites and platform-specific steps. | -| [Add a node](https://learn.microsoft.com/en-us/azure/azure-local/manage/add-server?view=azloc-2603) | deployment | 0.65 | Adding nodes to manage capacity is a scale-out deployment operation with specific steps and constraints for Azure Local 23H2. | -| [Add and repair nodes](https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-add-server?view=azloc-2603) | deployment | 0.65 | Node add/repair procedures are product-specific operational deployment/scale-out steps with concrete commands and constraints. 
| -| [Billing](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-billing?view=azloc-2603) | decision-making | 0.65 | The page explains billing for disconnected Azure Local, including a capacity-based pricing model using physical processor cores and Windows Server VM licensing requirements. This is specialized guidance to understand cost and licensing implications for a specific operating mode, helping users make deployment and capacity decisions, which aligns with decision-making. | -| [Compare VM management capabilities](https://learn.microsoft.com/en-us/azure/azure-local/concepts/compare-vm-management-capabilities?view=azloc-2603) | decision-making | 0.65 | Comparison of VM types and their management capabilities; likely includes comparison tables that guide which VM type to use for different scenarios. | -| [Complete post-deployment tasks](https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-post-deployment?view=azloc-2603) | deployment | 0.65 | Post-deployment tasks are specific operational steps required after deploying rack aware clusters; these are product-specific deployment/ops procedures. | -| [Complete prerequisites](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-prerequisites?view=azloc-2603) | configuration | 0.65 | Prerequisites and checklist typically enumerate specific OS versions, ports, hardware specs, and network requirements, which are product-specific configuration details. | -| [Complete prerequisites](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-hyperv-prerequisites?view=azloc-2603) | configuration | 0.65 | Prerequisites article for a migration workflow usually lists concrete configuration steps, required settings, and environment preparations (accounts, permissions, ports, agents) that are specific to Azure Migrate and Azure Local. 
| -| [Create internal load balancer for VNETs](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-internal-load-balancer-virtual-networks?view=azloc-2603) | integrations | 0.65 | How-to article for creating/configuring internal load balancers using Azure CLI on Azure Local multi-rack. Likely includes specific CLI parameters, flags, and resource property names unique to this product, which qualify as product-specific integration/coding patterns rather than just conceptual guidance. | -| [Create public IP addresses](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-public-ip?view=azloc-2603) | integrations | 0.65 | Covers creating public IP resources in Azure Local multi-rack, which differ from standard Azure public IPs. Likely includes specific CLI/API parameters and configuration options unique to this environment, fitting integrations/coding patterns. | -| [Create public load balancer for LNETs](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-load-balancer-logical-network?view=azloc-2603) | integrations | 0.65 | Describes creating load balancers on logical networks in Azure Local multi-rack using Azure CLI. This typically involves product-specific CLI commands, parameter names, and configuration patterns that go beyond generic knowledge. | -| [Deploy disconnected operations](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-deploy?view=azloc-2603) | deployment | 0.65 | A deployment-focused article for Azure Local in fully disconnected environments is likely to include product-specific deployment requirements, sequencing constraints (management cluster then first instance), and possibly environment prerequisites unique to disconnected operations. These are implementation details an LLM wouldn't reliably know from training, fitting the deployment sub-skill better than generic configuration. 
| -| [Drift detection](https://learn.microsoft.com/en-us/azure/azure-local/manage/drift-detection?view=azloc-2603) | best-practices | 0.65 | Explains Azure Local’s drift detection framework and how to use it to identify configuration deviations, providing product-specific operational guidance. | -| [Extended Security Updates (ESUs)](https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-benefits-esu?view=azloc-2603) | decision-making | 0.65 | Explains specifics of ESU on Azure Local, including benefits and implementation steps; likely includes which products/versions qualify and how Azure verification affects ESU, guiding ESU usage decisions. | -| [External storage support](https://learn.microsoft.com/en-us/azure/azure-local/concepts/external-storage-support?view=azloc-2603) | architecture-patterns | 0.65 | Details supported configurations and patterns for integrating external SAN with Azure Local, including when and how to use external storage in this product. | -| [Health alerts](https://learn.microsoft.com/en-us/azure/azure-local/manage/health-alerts-via-azure-monitor-alerts?view=azloc-2603) | configuration | 0.65 | Explains mapping of Azure Local OS health issues into Azure Monitor alerts with product-specific alert configuration steps. | -| [Infrastructure resiliency](https://learn.microsoft.com/en-us/azure/azure-local/manage/disaster-recovery-infrastructure-resiliency?view=azloc-2603) | architecture-patterns | 0.65 | Infrastructure resiliency article discusses key elements and design considerations specific to Azure Local, constituting product-specific architecture patterns. | -| [Install Azure CLI extensions](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-cli-extensions?view=azloc-2603) | configuration | 0.65 | Explains required CLI extensions and versions for multi-rack; includes specific extension names and installation commands. 
| -| [Known issues](https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/known-issues-23?view=azloc-2603) | troubleshooting | 0.65 | Known/fixed issues article with workarounds is inherently troubleshooting-focused with symptom-to-solution mappings. | -| [Known issues](https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/known-issues-24?view=azloc-2603) | troubleshooting | 0.65 | Release notes with known issues and workarounds typically map specific symptoms/bugs to causes and mitigation steps, which is troubleshooting knowledge. | -| [Manage Azure Local VMs](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-manage-arc-virtual-machines?view=azloc-2603) | configuration | 0.65 | Describes VM lifecycle operations and guest management; includes specific management options and possibly required settings. | -| [Migrate via PowerShell](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-via-powershell?view=azloc-2603) | integrations | 0.65 | PowerShell-based migration article will reference specific cmdlets, parameters, and required values unique to Azure Migrate and Azure Local, fitting integration/coding patterns. | -| [Migration overview](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migration-options-overview?view=azloc-2603) | decision-making | 0.65 | Compares available VM migration options to Azure Local and guides selection based on scenarios, representing product-specific migration decision guidance. | -| [Monitor Azure Local migrations](https://learn.microsoft.com/en-us/azure/azure-local/migrate/monitor-migration?view=azloc-2603) | configuration | 0.65 | Covers enabling diagnostic settings with specific categories, destinations, and configuration options for Azure Migrate, which are product-specific config details. 
| -| [Monitor a single system](https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-single-23h2?view=azloc-2603) | configuration | 0.65 | Describes enabling Insights and configuring monitoring for a single Azure Local system with product-specific setup steps and options. | -| [Monitor multiple systems](https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-multi-23h2?view=azloc-2603) | configuration | 0.65 | Explains configuration for multi-system monitoring with Insights, including Azure Local–specific setup patterns at scale. | -| [Network](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-network?view=azloc-2603) | decision-making | 0.65 | Covers key design considerations and requirements for network planning in disconnected environments. This is environment- and product-specific decision guidance (topology, integration choices) rather than generic networking theory. | -| [Overview](https://learn.microsoft.com/en-us/azure/azure-local/manage/disaster-recovery-vm-resiliency?view=azloc-2603) | architecture-patterns | 0.65 | VM resiliency considerations describe how to design VMs and apps to withstand failures on Azure Local, which are product-specific resiliency patterns. | -| [Recommended alert rules](https://learn.microsoft.com/en-us/azure/azure-local/manage/set-up-recommended-alert-rules?view=azloc-2603) | best-practices | 0.65 | Provides predefined, recommended metric-based alert rules (CPU, memory, etc.) tailored to Azure Local, representing product-specific monitoring best practices. | -| [Register disconnected operations](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-registration?view=azloc-2603) | deployment | 0.65 | Covers registration of disconnected operations to meet Azure Local requirements. Registration flows and constraints are product-specific deployment/compliance steps. 
| -| [Release information](https://learn.microsoft.com/en-us/azure/azure-local/release-information-23h2?view=azloc-2603) | deployment | 0.65 | Release information with OS builds and supported update paths; contains product-specific upgrade constraints and supported paths relevant to deployment lifecycle. | -| [Remote support extension](https://learn.microsoft.com/en-us/azure/azure-local/manage/remote-support-arc-extension?view=azloc-2603) | troubleshooting | 0.65 | Describes remote support scenarios and specific commands Microsoft support uses during sessions, which are product-specific diagnostic patterns. | -| [Repair a node](https://learn.microsoft.com/en-us/azure/azure-local/manage/repair-server?view=azloc-2603) | deployment | 0.65 | Node repair procedures are operational/deployment tasks with specific steps and constraints for Azure Local clusters. | -| [Run SQL Server](https://learn.microsoft.com/en-us/azure/azure-local/deploy/sql-server-23h2?view=azloc-2603) | deployment | 0.65 | Guidance on deploying SQL Server on Azure Local; likely includes product-specific deployment requirements and configuration patterns for this workload. | -| [SDN Multisite overview](https://learn.microsoft.com/en-us/azure/azure-local/concepts/sdn-multisite-overview?view=azloc-2603) | architecture-patterns | 0.65 | Provides overview of SDN Multisite benefits and limitations and is explicitly for designing network topology and DR; this is product-specific architecture and pattern guidance. | -| [Security](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-security?view=azloc-2603) | security | 0.65 | Focuses on security considerations and compliance regulations for disconnected operations with Azure Local VMs. Contains product-specific security and compliance configuration guidance beyond generic security concepts. 
| -| [Supported operations for VMs](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-operations?view=azloc-2603) | best-practices | 0.65 | Article explicitly calls out supported and unsupported VM operations to avoid complications, which are product-specific DO/DON’T guidelines. | -| [Suspend and resume](https://learn.microsoft.com/en-us/azure/azure-local/manage/suspend-resume-cluster-maintenance?view=azloc-2603) | deployment | 0.65 | Describes operational procedure to suspend/resume physical hosts for maintenance; product-specific maintenance/deployment workflow. | -| [Unregister and reregister machines](https://learn.microsoft.com/en-us/azure/azure-local/manage/unregister-register-machine?view=azloc-2603) | configuration | 0.65 | Uses specific PowerShell cmdlets and flows for registration state management; these are product-specific configuration/management commands. | -| [Update via PowerShell with limited connectivity](https://learn.microsoft.com/en-us/azure/azure-local/update/import-discover-updates-offline-23h2?view=azloc-2603) | configuration | 0.65 | Covers how to discover and import static update payloads for Azure Local with limited connectivity, including using PowerShell and handling static payloads. This scenario is product-specific and likely includes concrete cmdlets, parameters, and steps unique to Azure Local update packages, fitting configuration/integration patterns more than generic tutorial content. | -| [Use dashboard to manage at-scale](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-at-scale-dashboard?view=azloc-2603) | configuration | 0.65 | Managing at scale via overview and All systems pages involves product-specific monitoring configuration and dashboard usage. 
| -| [Using Azure Compute Gallery images](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-azure-compute-gallery?view=azloc-2603) | integrations | 0.65 | How-to article for creating Azure Local VMs from Azure Compute Gallery via Azure CLI. Likely includes product-specific CLI commands, parameters, and image configuration details that qualify as integration/coding patterns rather than generic tutorial content. | -| [Configure load balancer for high availability ports](https://learn.microsoft.com/en-us/azure/azure-local/manage/configure-software-load-balancer?view=azloc-2603) | limits-quotas | 0.64 | Article explicitly mentions supported configurations and associated limitations for high availability ports; these are typically expressed as concrete constraints (supported scenarios, protocol/port rules) that qualify as limits/quotas. | -| [Deploy via Azure portal](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-local-identity-with-key-vault?view=azloc-2603) | security | 0.64 | The page covers using local identity with Azure Key Vault specifically for Azure Local deployment, which implies product-specific identity and secret-access configuration. This is security-focused (identity + Key Vault) and contains expert configuration details unique to this scenario rather than generic deployment or conceptual content. | -| [Update via PowerShell](https://learn.microsoft.com/en-us/azure/azure-local/update/update-via-powershell-23h2?view=azloc-2603) | configuration | 0.64 | PowerShell-based update procedure for OS, services, and extensions; likely includes specific cmdlets, parameters, and required states unique to Azure Local update orchestration. 
| -| [Load balance multiple logical networks](https://learn.microsoft.com/en-us/azure/azure-local/manage/load-balance-multiple-networks?view=azloc-2603) | architecture-patterns | 0.63 | Provides guidance on using multiple logical networks for load balancing and isolation; this is a product-specific design pattern for Azure Local SDN rather than a generic concept. | -| [Manage SDN Multisite](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-sdn-multisite?view=azloc-2603) | configuration | 0.62 | Multisite SDN management typically involves specific configuration parameters (site definitions, link settings, routing policies) unique to Azure Local/Windows Server SDN; article is focused on deployment and management details. | -| [Upgrade SDN infrastructure](https://learn.microsoft.com/en-us/azure/azure-local/manage/upgrade-sdn?view=azloc-2603) | troubleshooting | 0.62 | Upgrade guide explicitly includes troubleshooting guidance for SDN upgrade issues, which typically maps symptoms to causes and resolutions specific to Azure Local SDN. | -| [What's a rack aware cluster?](https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-overview?view=azloc-2603) | architecture-patterns | 0.62 | Overview includes supported configurations and deployment requirements for rack aware clusters; likely describes product-specific clustering patterns and when to use them. | -| [About multi-rack deployments](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-overview?view=azloc-2603) | decision-making | 0.60 | Overview of multi-rack deployments with scale thresholds (over 100 machines, 8,000 cores) and use cases, helping decide when to use multi-rack. 
| -| [Activate Windows Server VMs](https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-activate?view=azloc-2603) | configuration | 0.60 | Activation article describes specific activation options and setup steps (including optional subscription add-on) that are product-specific configuration patterns. | -| [Azure Local VMs](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-arc-vm?view=azloc-2603) | configuration | 0.60 | Describes management features, differences, and limitations for VMs in disconnected operations. Likely includes product-specific configuration differences and constraints compared to connected mode. | -| [Azure verification for VMs](https://learn.microsoft.com/en-us/azure/azure-local/deploy/azure-verification?view=azloc-2603) | security | 0.60 | Azure verification for VMs is modeled after IMDS attestation; configuring this feature likely involves security/attestation settings unique to Azure Local. | -| [Billing and payment](https://learn.microsoft.com/en-us/azure/azure-local/concepts/billing?view=azloc-2603) | decision-making | 0.60 | Provides product-specific billing behavior and pricing model details that inform cost planning and service selection. | -| [Collect logs](https://learn.microsoft.com/en-us/azure/azure-local/manage/collect-logs?view=azloc-2603) | configuration | 0.60 | Describes Azure Local–specific log collection commands and portal flows, including what logs are gathered and how they’re packaged for support. | -| [Get cluster performance history](https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-cluster-performance-history?view=azloc-2603) | configuration | 0.60 | Explains cmdlets and metrics for cluster performance history, including product-specific metric aggregation and retrieval commands. 
| -| [Get remote support](https://learn.microsoft.com/en-us/azure/azure-local/manage/get-remote-support?view=azloc-2603) | configuration | 0.60 | Covers steps to enable remote support, configure proxy settings, and submit support requests, including Azure Local–specific remote support configuration. | -| [Manage logical networks](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-logical-networks?view=azloc-2603) | configuration | 0.60 | Managing logical networks implies editing network configuration parameters (subnets, IP pools, etc.) specific to Azure Local, fitting configuration sub-skill. | -| [Monitor clusters with Health Service](https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-overview?view=azloc-2603) | configuration | 0.60 | Describes Health Service usage and configuration for clusters running Storage Spaces Direct, including Azure Local–specific monitoring behavior. | -| [Monitor feature workbooks](https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-features?view=azloc-2603) | configuration | 0.60 | Shows how to monitor specific Azure Local features like ReFS deduplication and compression using Insights, involving feature-specific configuration. | -| [Network ATC overview](https://learn.microsoft.com/en-us/azure/azure-local/concepts/network-atc-overview?view=azloc-2603) | best-practices | 0.60 | Network ATC overview for Azure Local/Windows Server likely embeds product-specific recommended configurations and patterns to avoid misconfiguration. | -| [Network Controller](https://learn.microsoft.com/en-us/azure/azure-local/concepts/plan-network-controller-deployment?view=azloc-2603) | architecture-patterns | 0.60 | Planning article for Network Controller deployment; includes requirements like minimum three VMs and recommendations, which are product-specific architectural decisions. 
| -| [Overview](https://learn.microsoft.com/en-us/azure/azure-local/concepts/sdn-overview?view=azloc-2603) | architecture-patterns | 0.60 | Explains SDN management methods, when to use each, and supported/unsupported scenarios; this is product-specific pattern selection guidance with trade-offs. | -| [Overview](https://learn.microsoft.com/en-us/azure/azure-local/manage/disaster-recovery-overview?view=azloc-2603) | architecture-patterns | 0.60 | Disaster recovery considerations for Azure Local VMs likely include product-specific DR patterns and when to use them in hybrid/edge scenarios. | -| [Overview](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-monitoring?view=azloc-2603) | configuration | 0.60 | Describes integrating Microsoft and non-Microsoft monitoring systems. Likely includes product-specific endpoints, data flows, or configuration steps for monitoring in disconnected environments. | -| [Overview](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-overview?view=azloc-2603) | decision-making | 0.60 | Explains how disconnected operations support compliance, security, and remote deployments. Likely includes scenario-based guidance on when to use disconnected operations and considerations for sovereign/private clouds, fitting decision-making for deployment mode selection. | -| [Review prerequisites](https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-arc-vm-management-prerequisites?view=azloc-2603) | configuration | 0.60 | Prerequisites article typically lists required versions, settings, and environment conditions specific to Azure Local Arc-enabled VMs, functioning as configuration requirements. 
| -| [Track Health Service actions](https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-actions?view=azloc-2603) | configuration | 0.60 | Describes Health Service workflows and actions, including how actions are generated and tracked, which is product-specific operational configuration. | +| [Troubleshoot Azure Local VMs](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-troubleshoot-arc-enabled-vms?view=azloc-2604) | troubleshooting | 0.90 | Explicit troubleshooting article with limitations, known problems, and recommended resolutions; likely includes error patterns and product-specific fixes. | +| [Assign RBAC roles](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-assign-vm-rbac-roles?view=azloc-2604) | security | 0.85 | Describes RBAC role usage for VM and resource access; likely lists specific built-in role names and scopes, which are product-specific security configuration details. | +| [Firewall requirements](https://learn.microsoft.com/en-us/azure/azure-local/concepts/firewall-requirements?view=azloc-2604) | security | 0.85 | Lists outbound endpoints, internal ports/rules, and use of service tags; product-specific security configuration parameters. | +| [Manage Secure Boot updates](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-secure-boot-updates?view=azloc-2604) | security | 0.85 | Details how Azure Local transitions Secure Boot certificates, mitigates CVE-2023-24932, and orchestrates updates with OEM packages—highly product-specific security operations. | +| [Manual backup and recovery](https://learn.microsoft.com/en-us/azure/azure-local/manage/trusted-launch-vm-import-key?view=azloc-2604) | security | 0.85 | Provides concrete procedures and storage options (file system vs key vault) for guest state protection keys, including version-specific behavior—product-specific security key management. 
|
+| [Pattern IP requirements](https://learn.microsoft.com/en-us/azure/azure-local/plan/single-server-ip-requirements?view=azloc-2604) | configuration | 0.85 | IP requirements for the single-server pattern are specific configuration parameters (IP ranges, counts) unique to this product. |
+| [Pattern IP requirements](https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-ip-requirements?view=azloc-2604) | configuration | 0.85 | Three-node IP address requirements are detailed configuration parameters specific to Azure Local deployments. |
+| [Pattern IP requirements](https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-ip-requirements?view=azloc-2604) | configuration | 0.85 | Two-node IP address requirements are specific configuration parameters (counts, roles) for this product. |
+| [Review requirements](https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-requirements?view=azloc-2604) | deployment | 0.85 | Explicitly lists requirements and supported configurations for rack aware clusters: deployment constraints and matrices unique to Azure Local. |
+| [System requirements](https://learn.microsoft.com/en-us/azure/azure-local/concepts/system-requirements-23h2?view=azloc-2604) | configuration | 0.85 | System requirements for machines, storage, and networking are concrete configuration constraints (hardware/network specs) unique to this product. |
+| [Troubleshoot](https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-arc-enabled-vms?view=azloc-2604) | troubleshooting | 0.85 | Explicit troubleshooting article with log collection steps, known limitations, and likely error/symptom-to-resolution mappings specific to Azure Local Arc-enabled VMs. |
+| [Troubleshoot](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-troubleshoot?view=azloc-2604) | troubleshooting | 0.85 | Explicit troubleshooting article; typically organized by symptom and includes specific error messages/codes and resolutions. |
+| [Troubleshoot SDN](https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-troubleshooting?view=azloc-2604) | troubleshooting | 0.85 | Explicit troubleshooting article with symptom-based guidance for SDN controller deployment, VM connectivity, NSG, DNS, and policy errors; includes product-specific error patterns and fixes. |
+| [Via ARM template](https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-deployment-via-template?view=azloc-2604) | deployment | 0.85 | ARM template-based deployment for rack aware clusters with guidance for scale deployments; product-specific deployment method and constraints. |
+| [6B. Deploy via ARM template](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-azure-resource-manager-template-disaggregated?view=azloc-2604) | deployment | 0.80 | ARM template deployment for disaggregated Azure Local is a product-specific deployment pattern, likely with parameter and scale constraints. |
+| [Assign RBAC role](https://learn.microsoft.com/en-us/azure/azure-local/manage/assign-vm-rbac-roles?view=azloc-2604) | security | 0.80 | Describes using built-in RBAC roles to control access to VM resources; will list specific role names and scopes, matching security criteria. |
+| [Choose network reference pattern](https://learn.microsoft.com/en-us/azure/azure-local/plan/choose-network-pattern?view=azloc-2604) | decision-making | 0.80 | Helps select among network patterns for one to three hosts based on scenarios; explicit decision guidance between patterns. |
+| [Compare to Windows Server](https://learn.microsoft.com/en-us/azure/azure-local/concepts/compare-windows-server?view=azloc-2604) | decision-making | 0.80 | Explicit comparison to help choose between Azure Local and Windows Server with scenario-based recommendations. |
+| [Configure proxy](https://learn.microsoft.com/en-us/azure/azure-local/manage/configure-proxy-settings-23h2?view=azloc-2604) | configuration | 0.80 | Explains proxy configuration behavior, including version-specific automation (post-2506) and guidance for different Arc registration methods; product-specific configuration details. |
+| [Create NSGs](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-network-security-groups?view=azloc-2604) | security | 0.80 | Describes NSG creation and rule configuration via CLI; includes security rule parameters (ports, protocols, priorities) specific to Azure Local multi-rack. |
+| [Create and manage L3 isolation domains](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-configure-layer-3-isolation-domain?view=azloc-2604) | configuration | 0.80 | Managing L3 isolation domains uses Azure resources and CLI/portal parameters (resource types, properties, allowed values) that are specific configuration knowledge. |
+| [Enable guest attestation](https://learn.microsoft.com/en-us/azure/azure-local/manage/trusted-launch-guest-attestation?view=azloc-2604) | security | 0.80 | Covers enabling boot integrity verification for Trusted launch VMs with version-specific requirements and security configuration steps unique to Azure Local. |
+| [Fiber Channel disaggregated pattern with backup network](https://learn.microsoft.com/en-us/azure/azure-local/plan/fiber-channel-with-backup-disaggregated-pattern?view=azloc-2604) | architecture-patterns | 0.80 | Another concrete network pattern (FC SAN with dedicated backup network) with product-specific architecture details and trade-offs. |
+| [Fiber Channel disaggregated pattern without backup network](https://learn.microsoft.com/en-us/azure/azure-local/plan/fiber-channel-no-backup-disaggregated-pattern?view=azloc-2604) | architecture-patterns | 0.80 | Describes a specific network reference pattern (FC SAN, no backup network) including topology and constraints unique to Azure Local. |
+| [Host network requirements](https://learn.microsoft.com/en-us/azure/azure-local/concepts/host-network-requirements?view=azloc-2604) | configuration | 0.80 | Host networking considerations and requirements are detailed configuration guidance specific to Azure Local. |
+| [Manage BitLocker encryption](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-bitlocker?view=azloc-2604) | security | 0.80 | Provides procedures to enable/view BitLocker and retrieve recovery keys on Azure Local; product-specific security configuration. |
+| [Manage GPUs using Discrete Device Assignment](https://learn.microsoft.com/en-us/azure/azure-local/manage/gpu-manage-via-device?view=azloc-2604) | configuration | 0.80 | Covers DDA setup for GPUs on Azure Local Arc-enabled VMs with product-specific parameters and constraints for direct device assignment. |
+| [Manage GPUs via partitioning](https://learn.microsoft.com/en-us/azure/azure-local/manage/gpu-manage-via-partitioning?view=azloc-2604) | configuration | 0.80 | Describes GPU-P configuration to share GPUs across workloads with Azure Local-specific steps and partitioning options. |
+| [Manage security defaults](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-secure-baseline?view=azloc-2604) | security | 0.80 | Covers default security settings, drift control, and protected settings for Azure Local with concrete configuration behavior. |
+| [Manage syslog forwarding](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-syslog-forwarding?view=azloc-2604) | security | 0.80 | Explains how to forward security events via syslog to a SIEM with Azure Local-specific configuration steps and event handling. |
+| [Physical network requirements](https://learn.microsoft.com/en-us/azure/azure-local/concepts/physical-network-requirements?view=azloc-2604) | configuration | 0.80 | Physical network and switch requirements are product-specific configuration guidance with concrete constraints. |
+| [Prepare GPUs](https://learn.microsoft.com/en-us/azure/azure-local/manage/gpu-preparation?view=azloc-2604) | configuration | 0.80 | Details GPU preparation steps and requirements for Azure Local VMs and AKS; includes product-specific configuration and environment setup. |
+| [Reference architecture](https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-reference-architecture?view=azloc-2604) | architecture-patterns | 0.80 | Provides a concrete reference architecture with network design and configuration for rack aware clusters in specific factory-like scenarios; product-specific architecture pattern. |
+| [Room-to-room connections](https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-room-to-room-connectivity?view=azloc-2604) | architecture-patterns | 0.80 | Outlines four specific configuration options with trade-offs for room-to-room links; this is a product-specific network architecture decision guide. |
+| [Single-node deployment](https://learn.microsoft.com/en-us/azure/azure-local/plan/single-server-deployment?view=azloc-2604) | architecture-patterns | 0.80 | Describes a specific single-server storage network reference pattern and viability criteria; product-specific architecture pattern. |
+| [Storage switched, fully converged](https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switched-converged?view=azloc-2604) | architecture-patterns | 0.80 | Two-node switched, fully converged pattern is a product-specific network architecture pattern with defined components. |
+| [Storage switched, non-converged](https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switched-non-converged?view=azloc-2604) | architecture-patterns | 0.80 | Two-node switched, non-converged pattern with two TOR switches is a specific network architecture pattern for this product. |
+| [Storage switchless, dual TOR, dual link](https://learn.microsoft.com/en-us/azure/azure-local/plan/four-node-switchless-two-switches-two-links?view=azloc-2604) | architecture-patterns | 0.80 | Four-node switchless dual-link pattern with dual TOR switches is a specific, validated network architecture pattern. |
+| [Storage switchless, dual TOR, dual link](https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-switchless-two-switches-two-links?view=azloc-2604) | architecture-patterns | 0.80 | Three-node switchless dual-link pattern with dual TOR switches is a specific, tested architecture pattern. |
+| [Storage switchless, dual TOR, single link](https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-switchless-two-switches-single-link?view=azloc-2604) | architecture-patterns | 0.80 | Three-node switchless dual TOR single-link pattern is a validated network architecture pattern unique to Azure Local. |
+| [Storage switchless, single switch](https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switchless-single-switch?view=azloc-2604) | architecture-patterns | 0.80 | Defines a specific two-node switchless pattern with a single TOR switch and when it's viable; product-specific network architecture pattern. |
+| [Storage switchless, two switches](https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switchless-two-switches?view=azloc-2604) | architecture-patterns | 0.80 | Describes the two-node switchless pattern with dual TOR switches and its deployment viability; concrete architecture pattern. |
+| [System requirements for low capacity class](https://learn.microsoft.com/en-us/azure/azure-local/concepts/system-requirements-small-23h2?view=azloc-2604) | configuration | 0.80 | Low-capacity deployment system requirements for machines, storage, and networking are concrete configuration constraints. |
+| [Troubleshoot SDN deployment](https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-sdn-deployment?view=azloc-2604) | troubleshooting | 0.80 | Focused on diagnosing and resolving SDN deployment issues, including guidance on symptoms, causes, and log collection. Product-specific troubleshooting workflow. |
+| [Via Azure portal](https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-deploy-portal?view=azloc-2604) | deployment | 0.80 | Step-by-step deployment of rack aware clusters through the portal, including configuration and validation processes unique to this feature. |
+| [6B. Deploy via ARM template](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-azure-resource-manager-template?view=azloc-2604) | deployment | 0.75 | ARM template deployment guidance including prerequisites and scale-focused recommendations; product-specific deployment pattern and constraints. |
+| [Attach GPU to Linux VM](https://learn.microsoft.com/en-us/azure/azure-local/manage/attach-gpu-to-linux-vm?view=azloc-2604) | configuration | 0.75 | Shows how to attach GPUs to Ubuntu VMs on Azure Local with product-specific steps and constraints for AI workloads. |
+| [Authenticate with Kerberos](https://learn.microsoft.com/en-us/azure/azure-local/manage/kerberos-with-spn?view=azloc-2604) | security | 0.75 | Explains using Kerberos with SPN for Network Controller management clients and SCVMM, including supported authentication modes. Product-specific security/auth configuration. |
+| [Choose disaggregated network reference pattern](https://learn.microsoft.com/en-us/azure/azure-local/plan/choose-network-pattern-disaggregated?view=azloc-2604) | decision-making | 0.75 | Explicitly helps choose between supported network patterns; likely includes scenario-based recommendations and trade-offs. |
+| [Configure network security groups with PowerShell](https://learn.microsoft.com/en-us/azure/azure-local/manage/use-datacenter-firewall-powershell?view=azloc-2604) | security | 0.75 | Provides PowerShell-based configuration of NSGs and Datacenter Firewall, including specific cmdlets and parameter usage unique to Azure Local SDN. |
+| [Configure network security groups with Windows Admin Center](https://learn.microsoft.com/en-us/azure/azure-local/manage/use-datacenter-firewall-windows-admin-center?view=azloc-2604) | security | 0.75 | Details configuring NSGs and Datacenter Firewall for SDN logical and virtual networks, including how NSGs are applied. Product-specific security configuration. |
+| [Create Network Security Groups](https://learn.microsoft.com/en-us/azure/azure-local/manage/create-network-security-groups?view=azloc-2604) | security | 0.75 | Covers NSG and rule creation plus default access policies via CLI/portal; product-specific network security configuration details. |
+| [Create internal load balancer for VNETs](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-internal-load-balancer-virtual-networks?view=azloc-2604) | configuration | 0.75 | Internal load balancer setup uses Azure Local-specific resource settings and CLI parameters for multi-rack deployments. |
+| [Create logical network](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-logical-networks?view=azloc-2604) | configuration | 0.75 | Creating logical networks involves specifying address spaces, subnets, and flags; these are product-specific configuration parameters and constraints. |
+| [Create public IP addresses](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-public-ip?view=azloc-2604) | configuration | 0.75 | Defines how to configure public IPs that are routable on-premises or on the internet; includes resource properties and constraints specific to Azure Local multi-rack. |
+| [Create public load balancer for LNETs](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-load-balancer-logical-network?view=azloc-2604) | configuration | 0.75 | Focuses on LB creation on logical networks via CLI; includes configuration parameters and constraints unique to Azure Local multi-rack. |
+| [Create public load balancer for VNETs](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-public-load-balancer-virtual-networks?view=azloc-2604) | configuration | 0.75 | Creating public load balancers involves resource properties (frontend IPs, backend pools, rules) that are product-specific configuration details. |
+| [Deploy via ARM template](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-local-identity-with-key-vault-template?view=azloc-2604) | deployment | 0.75 | Combines local identity, Key Vault, and ARM templates with external DNS; detailed, product-specific deployment configuration for scaled scenarios. |
+| [Manage Application Control](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-wdac?view=azloc-2604) | security | 0.75 | Describes using Application Control to reduce attack surface with Azure Local-specific configuration steps and policies. |
+| [Manage NSGs](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-manage-network-security-groups?view=azloc-2604) | security | 0.75 | Managing NSGs and rules is security-focused; involves specific rule properties, associations, and behaviors for Azure Local multi-rack. |
+| [Manage VMs with PowerShell](https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-powershell?view=azloc-2604) | configuration | 0.75 | PowerShell-based VM management with Azure Local-specific cmdlets/parameters and behaviors that go beyond generic Hyper-V or Azure VM scripting. |
+| [Manage default network access policies](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-default-network-access-policies-virtual-machines-23h2?view=azloc-2604) | security | 0.75 | Security-focused configuration of default network policies that block inbound traffic except specified management ports. Product-specific security behavior and configuration steps. |
+| [Manage with Microsoft Defender for Cloud](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-security-with-defender-for-cloud?view=azloc-2604) | security | 0.75 | Shows how to integrate and configure Defender for Cloud for Azure Local, including product-specific security posture management. |
+| [Pattern components](https://learn.microsoft.com/en-us/azure/azure-local/plan/single-server-components?view=azloc-2604) | configuration | 0.75 | Details which network components are deployed in the single-server pattern; concrete configuration of components. |
+| [Pattern components](https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-components?view=azloc-2604) | configuration | 0.75 | Describes which network components are deployed in three-node patterns; concrete configuration details. |
+| [Pattern components](https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-components?view=azloc-2604) | configuration | 0.75 | Lists which network components are deployed in two-node reference patterns; concrete configuration details. |
+| [Prepare to deploy](https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-deploy-prep?view=azloc-2604) | best-practices | 0.75 | Includes network design recommendations, machine configuration guidelines, and deployment best practices specific to Azure Local rack aware clusters. |
+| [Provision and place VMs](https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-provision-vm-local-availability-zone?view=azloc-2604) | configuration | 0.75 | Explains how to place VMs in local availability zones with constraints (such as no updating of existing VM placement) specific to Azure Local. |
+| [Review best practices](https://learn.microsoft.com/en-us/azure/azure-local/update/update-best-practices?view=azloc-2604) | best-practices | 0.75 | Provides do/don't style guidance and common pitfalls for update management, tailored to Azure Local. Product-specific operational best practices. |
+| [Secure the Network Controller](https://learn.microsoft.com/en-us/azure/azure-local/manage/nc-security?view=azloc-2604) | security | 0.75 | Describes securing northbound, cluster, and southbound communication paths for Network Controller, including supported authentication and encryption options. Product-specific security configuration. |
+| [System requirements](https://learn.microsoft.com/en-us/azure/azure-local/concepts/system-requirements-disaggregated?view=azloc-2604) | limits-quotas | 0.75 | System requirements article will specify supported machine types, SAN capabilities, and numeric constraints (node counts, capacities). |
+| [Troubleshoot Software Load Balancer for SDN](https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-software-load-balancer?view=azloc-2604) | troubleshooting | 0.75 | Helps diagnose SLB data path failures with common causes and resolutions for Azure Local/Windows Server SDN. Product-specific troubleshooting mappings. |
+| [Troubleshoot deployment validation issues](https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-deployment?view=azloc-2604) | troubleshooting | 0.75 | Focused on diagnosing deployment validation issues; likely includes specific error messages, causes, and remediation steps. |
+| [Troubleshoot registration issues for Configurator app](https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-deployment-configurator-app?view=azloc-2604) | troubleshooting | 0.75 | Explicit troubleshooting guide for registration failures; likely organized by symptoms and includes log collection steps and resolution paths. |
+| [Troubleshoot updates](https://learn.microsoft.com/en-us/azure/azure-local/update/update-troubleshooting-23h2?view=azloc-2604) | troubleshooting | 0.75 | Focused on diagnosing and resolving issues with solution updates, likely including error conditions and corrective actions. Product-specific troubleshooting. |
+| [1. Upgrade OS via PowerShell](https://learn.microsoft.com/en-us/azure/azure-local/upgrade/upgrade-22h2-to-23h2-powershell?view=azloc-2604) | deployment | 0.70 | Gives concrete PowerShell-based OS upgrade procedures and supported paths (including direct 22H2→24H2). Product-specific upgrade deployment details. |
+| [3. Create logical networks](https://learn.microsoft.com/en-us/azure/azure-local/manage/create-logical-networks?view=azloc-2604) | configuration | 0.70 | Defines logical networks and their properties for Azure Local; likely includes network settings (address spaces, VLANs, etc.), which are product-specific configuration details. |
+| [3A. Install manually via ISO](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-install-os-disaggregated?view=azloc-2604) | deployment | 0.70 | OS installation method for this deployment mode; likely includes product-specific deployment requirements and supported methods. |
+| [3B. Install and register via simplified machine provisioning](https://learn.microsoft.com/en-us/azure/azure-local/deploy/simplified-machine-provisioning?view=azloc-2604) | deployment | 0.70 | Describes a preview deployment mechanism with specific registration and installation flow unique to Azure Local. |
+| [4. Set up subscription permissions](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-arc-register-server-permissions?view=azloc-2604) | security | 0.70 | Focuses on setting up required subscription permissions and registration steps; includes product-specific RBAC/permission configuration for deployment. |
+| [6A. Deploy via Azure portal](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-via-portal-disaggregated?view=azloc-2604) | deployment | 0.70 | Portal-based deployment article for a specific deployment type; includes product-specific deployment steps and constraints. |
+| [About Azure Local deployments](https://learn.microsoft.com/en-us/azure/azure-local/scalability-deployments?view=azloc-2604) | decision-making | 0.70 | Explicitly helps choose between deployment types and scalability options across scale points; decision-focused guidance beyond generic concepts. |
+| [About private endpoint scenarios](https://learn.microsoft.com/en-us/azure/azure-local/deploy/about-private-endpoints?view=azloc-2604) | decision-making | 0.70 | Covers supported/unsupported scenarios and key requirements for private endpoints with/without Arc gateway and proxy; helps choose a connectivity approach. |
+| [Activate Windows Server VMs](https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-activate?view=azloc-2604) | configuration | 0.70 | Explains activation options including Automatic VM Activation and the Azure Local add-on; includes product-specific activation settings and steps. |
+| [Add NICs to a Network ATC intent](https://learn.microsoft.com/en-us/azure/azure-local/manage/add-network-adapters-to-network-intents?view=azloc-2604) | configuration | 0.70 | Shows how to modify Network ATC intents by adding NICs; likely includes specific intent properties, adapter mappings, and configuration commands unique to Azure Local networking. |
+| [Add and repair nodes](https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-add-server?view=azloc-2604) | deployment | 0.70 | Describes procedures to add/repair nodes in rack aware clusters, including product-specific operational and deployment details. |
+| [Assess environment readiness](https://learn.microsoft.com/en-us/azure/azure-local/manage/use-environment-checker?view=azloc-2604) | configuration | 0.70 | Describes a product-specific tool with checks for connectivity, hardware, networking, and AD; includes concrete readiness rules and outputs unique to Azure Local. |
+| [Assess network readiness via LLDP](https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-readiness-check?view=azloc-2604) | configuration | 0.70 | Describes using an LLDP validator tool with product-specific steps and parameters to assess deployment readiness. |
+| [Assign public IP address to a VM](https://learn.microsoft.com/en-us/azure/azure-local/manage/assign-public-ip-to-vm?view=azloc-2604) | configuration | 0.70 | Explains how to assign SDN public IPs to VMs via Windows Admin Center, enabling external connectivity. Product-specific configuration steps and behavior. |
+| [Azure CLI](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-cli?view=azloc-2604) | configuration | 0.70 | Details on supported CLI versions, cloud setup, certificate trust, and extensions for disconnected mode are product-specific configuration parameters. |
+| [Azure verification for VMs](https://learn.microsoft.com/en-us/azure/azure-local/deploy/azure-verification?view=azloc-2604) | security | 0.70 | Describes Azure verification modeled after IMDS attestation; involves security/attestation configuration and requirements unique to Azure Local. |
+| [Back up disconnected operations](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-back-up-restore?view=azloc-2604) | configuration | 0.70 | Includes backup parameters, how to trigger backups, and RBAC requirements; these are concrete configuration details for a specific product scenario. |
+| [Collect SDN logs](https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-log-collection?view=azloc-2604) | troubleshooting | 0.70 | Describes specific SDN log types and collection methods to diagnose advanced issues and support cases. Product-specific diagnostic data collection. |
+| [Collect log files for Azure Local VM](https://learn.microsoft.com/en-us/azure/azure-local/manage/collect-log-files-arc-enabled-vms?view=azloc-2604) | troubleshooting | 0.70 | Focused on collecting logs to troubleshoot VM issues; likely specifies log locations and commands, matching troubleshooting criteria. |
+| [Complete Azure Local VM prerequisites](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-vm-management-prerequisites?view=azloc-2604) | configuration | 0.70 | Lists VM requirements (sizes, images, networking, storage) that are specific to Azure Local multi-rack; these are configuration constraints and settings. |
+| [Complete deployment prerequisites](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-prerequisites?view=azloc-2604) | configuration | 0.70 | Prerequisites pages typically list concrete requirements (supported versions, network ranges, hardware specs) that are product-specific configuration knowledge. |
+| [Complete post-deployment tasks](https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-post-deployment?view=azloc-2604) | deployment | 0.70 | Lists required post-deployment tasks after rack aware cluster deployment; these are operational deployment steps specific to Azure Local. |
+| [Configure advanced Active Directory settings](https://learn.microsoft.com/en-us/azure/azure-local/plan/configure-custom-settings-active-directory?view=azloc-2604) | security | 0.70 | Gives detailed steps to assign required AD permissions and create specific DNS records; this is product-specific identity and access configuration. |
+| [Configure load balancer for high availability ports](https://learn.microsoft.com/en-us/azure/azure-local/manage/configure-software-load-balancer?view=azloc-2604) | configuration | 0.70 | Describes HA ports rules, prerequisites, supported configurations, and limitations for SLB. Product-specific configuration and constraints for HA ports. |
+| [Configure network security groups with tags](https://learn.microsoft.com/en-us/azure/azure-local/manage/configure-network-security-groups-with-tags?view=azloc-2604) | security | 0.70 | Describes creating custom network security tags, attaching them to VM NICs, and applying NSG policies based on tags. Product-specific security tagging model. |
+| [Connect to Azure Local VMs via SSH](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-connect-arc-vm-using-ssh?view=azloc-2604) | configuration | 0.70 | Shows how to configure SSH/RDP over SSH connectivity; includes port, user, and CLI/portal settings specific to Azure Local multi-rack. |
+| [Create Azure Local VM images](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-virtual-machine-image-storage-account?view=azloc-2604) | integrations | 0.70 | Shows how to use Azure Storage and CLI to build VM images for Azure Local; includes API/CLI parameters and integration-specific steps between services. |
+| [Create VM affinity rules](https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-affinity?view=azloc-2604) | configuration | 0.70 | Describes how to set affinity rules via Admin Center/PowerShell; likely includes specific cmdlets, parameters, and constraints unique to Azure Local clusters. |
+| [Create network interfaces](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-network-interfaces?view=azloc-2604) | configuration | 0.70 | Creating NICs on logical networks uses Azure resource properties (IP allocation, subnet, NIC settings) that are product-specific configuration details. |
+| [Create virtual network](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-virtual-networks?view=azloc-2604) | configuration | 0.70 | VNet creation for multi-rack uses Azure resource properties and CLI parameters; the article is configuration-focused with product-specific settings. |
+| [Deploy disconnected operations](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-deploy?view=azloc-2604) | deployment | 0.70 | Explains deploying the management cluster and first instance in a no-outbound-connectivity scenario; includes product-specific deployment flow and constraints. |
+| [Deploy via Azure portal](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-local-identity-with-key-vault?view=azloc-2604) | security | 0.70 | Describes using local identity integrated with Key Vault for deployment; product-specific identity and secret management configuration. |
+| [Download managed disks from Azure](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-manage-data-disks?view=azloc-2604) | integrations | 0.70 | Covers moving managed disks from Azure to Azure Local; involves specific commands, URIs, and parameters for this cross-service integration. |
+| [Enable Insights at scale using Azure policies](https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-multi-azure-policies?view=azloc-2604) | configuration | 0.70 | Shows how to use Azure Policy to enable Insights at scale; includes policy definitions, parameters, and assignment scopes specific to Azure Local monitoring. |
+| [Enable SDN integration](https://learn.microsoft.com/en-us/azure/azure-local/deploy/enable-sdn-integration?view=azloc-2604) | configuration | 0.70 | Describes enabling SDN via a specific PowerShell action plan; includes product-specific configuration steps and scripts. |
+| [Enable nested virtualization](https://learn.microsoft.com/en-us/azure/azure-local/manage/enable-nested-virtualization?view=azloc-2604) | configuration | 0.70 | Explains enabling nested virtualization with Azure Local-specific constraints and configuration steps, including security-related use cases like VBS. |
+| [Enhanced management from Azure](https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-enhanced-management-managed-identity?view=azloc-2604) | security | 0.70 | Describes enabling enhanced management via a specific managed identity for Azure Local, likely including role assignments, identity scopes, and Azure-side security configuration details unique to this product. |
+| [Extended Security Updates (ESUs)](https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-benefits-esu?view=azloc-2604) | security | 0.70 | Details ESU specifics for Azure Local and Azure verification; includes product-specific security/compliance configuration steps. |
+| [External Storage support with AKS](https://learn.microsoft.com/en-us/azure/azure-local/manage/use-external-storage-for-containerized-workloads?view=azloc-2604) | integrations | 0.70 | Describes using custom StorageClasses in AKS to consume an external SAN; involves Kubernetes storage class parameters and Azure Local-specific integration details. |
+| [Fallback log collection](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-fallback?view=azloc-2604) | troubleshooting | 0.70 | Describes fallback logging when standard collection fails, including how to export and send logs; this is specialized troubleshooting/logging guidance. |
+| [Host network requirements](https://learn.microsoft.com/en-us/azure/azure-local/concepts/host-network-requirements-disaggregated?view=azloc-2604) | limits-quotas | 0.70 | Host networking requirements usually define supported NIC types, minimum speeds, VLAN/VXLAN constraints, and node limits. |
+| [ISO/IEC 27001 guidance](https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-iso27001-guidance?view=azloc-2604) | security | 0.70 | Provides detailed mapping/guidance on how Azure Local features satisfy ISO 27001:2022 controls, which is product- and standard-specific security guidance. |
+| [Install Azure CLI extensions](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-cli-extensions?view=azloc-2604) | configuration | 0.70 | Describes required Azure CLI extensions for multi-rack; includes extension names, versions, and install commands that are product-specific configuration details. |
+| [Known issues](https://learn.microsoft.com/en-us/azure/azure-local/known-issues?view=azloc-2604) | troubleshooting | 0.70 | Known issues and workarounds imply symptom→cause→solution mappings specific to Azure Local versions. |
+| [Known issues](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-known-issues?view=azloc-2604) | troubleshooting | 0.70 | Release notes for known issues and workarounds typically map specific symptoms to causes and fixes; this is product- and version-specific troubleshooting knowledge. |
+| [Known issues](https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/known-issues-23?view=azloc-2604) | troubleshooting | 0.70 | A known/fixed issues article with workarounds is effectively a troubleshooting catalog mapping issues to resolutions. |
+| [Known issues](https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/known-issues-24?view=azloc-2604) | troubleshooting | 0.70 | Release notes with known issues and workarounds typically map specific symptoms/bugs to causes and mitigation steps. |
+| [Log alerts](https://learn.microsoft.com/en-us/azure/azure-local/manage/setup-system-alerts?view=azloc-2604) | configuration | 0.70 | Provides sample log queries and alert setup steps; includes KQL examples and alert rule parameters specific to Azure Local telemetry. |
+| [Maintain static IP addresses](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-maintain-ip-addresses?view=azloc-2604) | best-practices | 0.70 | Explains how to maintain static IPs across SDN and non-SDN environments; likely includes product-specific steps, gotchas, and configuration patterns. |
+| [Manage Azure Local VM resources](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-manage-arc-virtual-machine-resources?view=azloc-2604) | configuration | 0.70 | Covers adding/removing data disks and NICs with constraints (for example, NICs only when stopped, static IP behavior), which are product-specific configuration behaviors. |
+| [Manage Network Security Groups](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-network-security-groups?view=azloc-2604) | security | 0.70 | Describes managing NSGs for Azure Local VMs with scope constraints; product-specific security management behavior and commands. |
+| [Manage VMs](https://learn.microsoft.com/en-us/azure/azure-local/manage/vm?view=azloc-2604) | configuration | 0.70 | How-to for VM lifecycle management via Windows Admin Center on Azure Local; likely includes specific UI options and parameters unique to this product combination. |
+| [Manage certificates for SDN](https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-manage-certs?view=azloc-2604) | security | 0.70 | Details managing certificates for Network Controller northbound/southbound communication and SCVMM integration. Product-specific certificate roles and security configuration. |
+| [Manage gateway connections](https://learn.microsoft.com/en-us/azure/azure-local/manage/gateway-connections?view=azloc-2604) | configuration | 0.70 | Covers creating, deleting, and updating IPsec, GRE, and L3 gateway connections via Windows Admin Center. Includes product-specific connection types and configuration steps. |
+| [Manage secrets rotation](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-secrets-rotation?view=azloc-2604) | security | 0.70 | Describes internal secret rotation for the deployment user, including product-specific security operations. |
+| [Manage security post upgrade](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-security-post-upgrade?view=azloc-2604) | security | 0.70 | Covers managing security settings on upgraded systems, including migration-specific security considerations and configurations.
| +| [Manage software load balancers](https://learn.microsoft.com/en-us/azure/azure-local/manage/load-balancers?view=azloc-2604) | configuration | 0.70 | Explains how to manage SLB policies for SDN workloads via Windows Admin Center, including scenarios for traditional and virtual networks. Product-specific SLB configuration. | +| [Manage tenant logical networks](https://learn.microsoft.com/en-us/azure/azure-local/manage/tenant-logical-networks?view=azloc-2604) | configuration | 0.70 | Step-by-step configuration of SDN logical (VLAN-based) networks via Windows Admin Center, including how policies are applied. Product-specific configuration procedures. | +| [Manage tenant virtual networks](https://learn.microsoft.com/en-us/azure/azure-local/manage/tenant-virtual-networks?view=azloc-2604) | configuration | 0.70 | Detailed instructions for creating/updating/deleting HNV virtual networks and isolation behavior, using Windows Admin Center. Product-specific SDN configuration. | +| [Manage update settings](https://learn.microsoft.com/en-us/azure/azure-local/update/update-settings?view=azloc-2604) | configuration | 0.70 | Update settings article will enumerate specific configuration options and their effects on update timing and workload impact. | +| [Metrics](https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-cluster-with-metrics?view=azloc-2604) | configuration | 0.70 | Describes metrics collected for compute, storage, and network and the Performance Metrics dashboard; includes metric names and resource-specific monitoring configuration. | +| [Modify Health Service settings](https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-settings?view=azloc-2604) | configuration | 0.70 | Covers Health Service parameters and settings; includes setting names, allowed values, and their effects, matching configuration criteria. 
| +| [Monitor Azure Local migrations](https://learn.microsoft.com/en-us/azure/azure-local/migrate/monitor-migration?view=azloc-2604) | configuration | 0.70 | Diagnostic settings article will specify log categories, destinations, and configuration parameters for Azure Migrate resources. | +| [Monitor cluster metrics](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-monitor-cluster-with-metrics?view=azloc-2604) | configuration | 0.70 | Lists specific metrics for compute, storage, and network and how to configure dashboards; these metric names and dimensions are product-specific configuration knowledge. | +| [Network reference patterns overview](https://learn.microsoft.com/en-us/azure/azure-local/plan/network-patterns-overview?view=azloc-2604) | architecture-patterns | 0.70 | Defines supported network reference patterns with node counts and topology constraints; product-specific architecture patterns for networking. | +| [On-demand log collection](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-on-demand-logs?view=azloc-2604) | troubleshooting | 0.70 | Focused on log collection for troubleshooting and support using a specific PowerShell module; this is a product-specific troubleshooting/logging procedure. | +| [PCI DSS guidance](https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-pci-dss-guidance?view=azloc-2604) | security | 0.70 | Explains how Azure Local security features help meet PCI DSS requirements; product-specific compliance and control mapping information. | +| [Physical network requirements](https://learn.microsoft.com/en-us/azure/azure-local/concepts/physical-network-requirements-disaggregated?view=azloc-2604) | limits-quotas | 0.70 | Physical network requirements typically include port counts, bandwidth, latency, and topology constraints—concrete numeric limits. 
| +| [Prepare Azure Local nodes](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-prepare?view=azloc-2604) | deployment | 0.70 | Describes preparing nodes and networking for a specific disconnected deployment model; contains product-specific deployment requirements and readiness steps. | +| [Private endpoints - no proxy, no gateway](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-no-proxy-no-gateway?view=azloc-2604) | configuration | 0.70 | Scenario-specific integration of private endpoints with Azure Local; product-specific connectivity configuration details. | +| [Private endpoints - no proxy, with gateway](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-no-proxy-with-gateway?view=azloc-2604) | configuration | 0.70 | Details how to integrate existing/new private endpoints in a no-proxy but Arc gateway scenario; concrete configuration steps and constraints. | +| [Private endpoints - with proxy, no gateway](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-with-proxy-no-gateway?view=azloc-2604) | configuration | 0.70 | Describes integrating private endpoints when using an enterprise proxy without Arc gateway; scenario-specific connectivity configuration. | +| [Private endpoints - with proxy, with gateway](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-with-proxy-with-gateway?view=azloc-2604) | configuration | 0.70 | Describes a specific outbound connectivity scenario (enterprise proxy + Arc gateway + private endpoints) with product-specific configuration flows and constraints that go beyond generic networking knowledge. 
| +| [Public key infrastructure](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-pki?view=azloc-2604) | security | 0.70 | PKI requirements and certificate details for securing Azure Local appliance endpoints are product-specific security configuration knowledge. | +| [Repair a node](https://learn.microsoft.com/en-us/azure/azure-local/manage/repair-server?view=azloc-2604) | deployment | 0.70 | Provides procedures to repair a node in Azure Local, including cluster-specific operational steps and constraints. | +| [Replace a failed NIC in a Network ATC intent](https://learn.microsoft.com/en-us/azure/azure-local/manage/replace-network-adapter-to-network-intents?view=azloc-2604) | configuration | 0.70 | Covers repair workflow for NICs within Network ATC intents without rebuilding nodes; involves detailed, product-specific network configuration steps. | +| [Restore disconnected operations](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-restore?view=azloc-2604) | configuration | 0.70 | Covers restore parameters, triggering restore, and version constraints (same-version restore); these are specific configuration and operational details. | +| [Review Azure Arc gateway for Azure Local](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-azure-arc-gateway-overview?view=azloc-2604) | configuration | 0.70 | Explains enabling Arc gateway, supported software versions, and resource creation; product-specific configuration and deployment behavior. | +| [Review requirements](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-hyperv-requirements?view=azloc-2604) | limits-quotas | 0.70 | A ‘system requirements’ article for a specific migration feature typically lists exact supported versions, CPU, RAM, network, and other numeric constraints that function as hard limits. 
| +| [Review requirements](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-vmware-requirements?view=azloc-2604) | limits-quotas | 0.70 | System requirements for VMware migration will enumerate supported ESXi/vCenter versions, network and resource constraints, which are concrete numeric and version limits. | +| [Run Azure Local VMs](https://learn.microsoft.com/en-us/azure/azure-local/manage/create-arc-virtual-machines?view=azloc-2604) | integrations | 0.70 | Shows how to create Arc-enabled VMs via CLI, portal, and ARM templates; includes Azure Local–specific parameters and integration patterns. | +| [SAN requirements](https://learn.microsoft.com/en-us/azure/azure-local/concepts/san-requirements?view=azloc-2604) | configuration | 0.70 | Details supported SAN solutions and compatibility; product-specific configuration/compatibility matrix for storage. | +| [SDN considerations](https://learn.microsoft.com/en-us/azure/azure-local/plan/network-patterns-sdn-considerations?view=azloc-2604) | architecture-patterns | 0.70 | Discusses SDN considerations when deploying network reference patterns; product-specific architectural guidance for SDN usage. | +| [Security](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-security?view=azloc-2604) | security | 0.70 | Focuses on security considerations and compliance regulations for Azure Local VMs in disconnected mode; likely includes product-specific security settings and compliance mappings. | +| [Spread AKS nodes](https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-aks-nodes?view=azloc-2604) | configuration | 0.70 | Guides how to deploy AKS clusters with rack aware support and node spreading; product-specific configuration for reliability. 
| +| [Supported VM operations](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-virtual-machine-operations?view=azloc-2604) | best-practices | 0.70 | Explicitly lists which VM operations are supported and which to avoid; this is product-specific DO/DON'T guidance and gotchas. | +| [Troubleshoot common SDN issues](https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-common-sdn-issues?view=azloc-2604) | troubleshooting | 0.70 | Guides what traces/logs to collect for common SDN problems before contacting support. Product-specific troubleshooting data guidance. | +| [Troubleshoot simplified machine provisioning](https://learn.microsoft.com/en-us/azure/azure-local/deploy/troubleshoot-simplified-machine-provisioning?view=azloc-2604) | troubleshooting | 0.70 | Troubleshooting article for a specific provisioning feature; includes methods and likely error/solution mappings unique to this workflow. | +| [Troubleshoot upgrades](https://learn.microsoft.com/en-us/azure/azure-local/upgrade/troubleshoot-upgrade-to-23h2?view=azloc-2604) | troubleshooting | 0.70 | Explicit troubleshooting article for Azure Local upgrades; likely organized by common upgrade issues with specific symptoms and resolutions, matching symptom→cause→solution troubleshooting criteria. | +| [Update Network Controller certificates](https://learn.microsoft.com/en-us/azure/azure-local/manage/update-network-controller-certificates?view=azloc-2604) | security | 0.70 | Provides automatic and manual renewal procedures for Network Controller certificates, including channel-specific usage. Product-specific security and certificate lifecycle steps. | +| [Update SDN infrastructure certificates](https://learn.microsoft.com/en-us/azure/azure-local/manage/update-sdn-infrastructure-certificates?view=azloc-2604) | security | 0.70 | Explains how to renew/change SDN server and SLB MUX certificates and their relationship to Network Controller. Product-specific certificate management. 
| +| [Update via Azure portal](https://learn.microsoft.com/en-us/azure/azure-local/update/azure-update-manager-23h2?view=azloc-2604) | deployment | 0.70 | Explains using Azure Update Manager to find, install, and track updates for Azure Local, including versioning behavior. Product-specific update deployment workflow. | +| [Update via PowerShell](https://learn.microsoft.com/en-us/azure/azure-local/update/update-via-powershell-23h2?view=azloc-2604) | deployment | 0.70 | Provides concrete PowerShell procedures and requirements for updating single and multi-node systems with the orchestrator. Product-specific update deployment method. | +| [Update via PowerShell with limited connectivity](https://learn.microsoft.com/en-us/azure/azure-local/update/import-discover-updates-offline-23h2?view=azloc-2604) | deployment | 0.70 | Details how to download static payloads, transfer, and import update packages via PowerShell for low-bandwidth sites. Product-specific offline update deployment pattern. | +| [Use Azure Site Recovery](https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-site-recovery?view=azloc-2604) | deployment | 0.70 | Step-by-step guidance for using Azure Site Recovery with Azure Local Hyper-V workloads; includes product-specific deployment requirements and version applicability, going beyond generic disaster recovery tutorials. | +| [Use the Diagnostic Support tool](https://learn.microsoft.com/en-us/azure/azure-local/manage/support-tools?view=azloc-2604) | troubleshooting | 0.70 | Provides PowerShell-based diagnostic tool usage; includes commands and workflows tailored to Azure Local troubleshooting. | +| [Use the Support tool for infrastructure issues](https://learn.microsoft.com/en-us/azure/azure-local/manage/remediate-support-tool-infrastructure?view=azloc-2604) | troubleshooting | 0.70 | Describes Support.AksArc PowerShell module for diagnostics and remediation; includes specific commands and remediation flows for infrastructure components. 
| +| [Using Express scripts](https://learn.microsoft.com/en-us/azure/azure-local/deploy/sdn-express-23h2?view=azloc-2604) | deployment | 0.70 | Stepwise deployment of SDN infrastructure using SDN Express PowerShell, including HA Network Controller, SLB, and gateways with phased deployment options. Contains product-specific deployment patterns and constraints. | +| [VM load balancing](https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-load-balancing?view=azloc-2604) | configuration | 0.70 | Product-specific VM load balancing configuration for Azure Local/Windows Server, including settings and behaviors not covered by generic load balancing knowledge. | +| [View Health Service faults](https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-faults?view=azloc-2604) | troubleshooting | 0.70 | Provides detailed information about Health Service faults; likely maps fault IDs/messages to causes and remediation steps. | +| [Virtual deployment](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-virtual?view=azloc-2604) | deployment | 0.70 | Product-specific deployment article with OS version requirements, Hyper-V requirement, and approximate deployment time; focused on how to deploy this SKU. | +| [With Arc gateway](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-with-azure-arc-gateway?view=azloc-2604) | configuration | 0.70 | Details configuration of Arc gateway and proxy via scripts; product-specific connectivity and registration configuration. | +| [Without Arc gateway](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-without-azure-arc-gateway?view=azloc-2604) | configuration | 0.70 | Covers registration flows and proxy configuration via script and Configurator app; includes product-specific parameters and options. | +| [1. 
Create a storage path](https://learn.microsoft.com/en-us/azure/azure-local/manage/create-storage-path?view=azloc-2604) | configuration | 0.65 | Defines storage path resource and shows how to configure it via CLI/portal; includes product-specific resource properties and settings, fitting configuration. | +| [1. Prepare Active Directory](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-prep-active-directory?view=azloc-2604) | security | 0.65 | Details AD requirements and preparation steps, including OU and GPO considerations; product-specific identity and permission configuration. | +| [3A. Install manually via ISO](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-install-os?view=azloc-2604) | deployment | 0.65 | Step-by-step OS installation using a specific wizard and SConfig for Azure Local; includes product-specific deployment steps and options. | +| [4. Create network interfaces](https://learn.microsoft.com/en-us/azure/azure-local/manage/create-network-interfaces?view=azloc-2604) | configuration | 0.65 | Describes creating NICs on logical networks; includes NIC properties and settings specific to Azure Local, fitting configuration. | +| [6A. Deploy via Azure portal](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-via-portal?view=azloc-2604) | deployment | 0.65 | Portal-based deployment flow for Azure Local with product-specific steps and options beyond generic deployment commands. | +| [Acquire disconnected operations](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-acquire?view=azloc-2604) | deployment | 0.65 | Covers how to create the virtual appliance resource and download installation files; this is product-specific deployment and acquisition procedure, not generic deployment. 
| +| [Automatic virtual TPM state transfer](https://learn.microsoft.com/en-us/azure/azure-local/manage/trusted-launch-automatic-state-transfer?view=azloc-2604) | architecture-patterns | 0.65 | Describes how vTPM state is handled during VM migration/failover in Azure Local Trusted launch; this is a product-specific resiliency pattern for secure workloads, not generic virtualization behavior. | +| [Azure Arc extension management](https://learn.microsoft.com/en-us/azure/azure-local/manage/arc-extension-management?view=azloc-2604) | configuration | 0.65 | Covers installing, upgrading, and managing Arc extensions on Azure Local; likely includes extension types, parameters, and management settings specific to this platform. | +| [Azure Container Registry](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-azure-container-registry?view=azloc-2604) | deployment | 0.65 | Service-specific deployment and management of ACR in a disconnected Azure Local environment is specialized deployment/integration knowledge. | +| [Azure Policy](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-policy?view=azloc-2604) | configuration | 0.65 | Using Azure Policy in a disconnected control plane requires product-specific configuration steps and parameters for policy evaluation and enforcement. | +| [Azure PowerShell](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-powershell?view=azloc-2604) | configuration | 0.65 | Covers configuring Azure PowerShell specifically for disconnected Azure Local, including modules and environment settings; this is product-specific configuration. | +| [Billing](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-billing?view=azloc-2604) | decision-making | 0.65 | Explains capacity-based pricing using physical cores and Windows Server VM licensing; likely includes concrete billing rules and trade-offs for planning. 
| +| [Compare VM management capabilities](https://learn.microsoft.com/en-us/azure/azure-local/concepts/compare-vm-management-capabilities?view=azloc-2604) | decision-making | 0.65 | Compares types of VMs and their management capabilities; likely includes comparison tables and criteria to decide which VM type to use, fitting decision-making guidance. | +| [Connect External Storage (SAN)](https://learn.microsoft.com/en-us/azure/azure-local/deploy/enable-external-storage?view=azloc-2604) | integrations | 0.65 | Describes integrating external SAN storage from various vendors with Azure Local, including product-specific integration steps and constraints. | +| [Connect to Azure Local VMs via serial console](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-serial-console?view=azloc-2604) | troubleshooting | 0.65 | Serial console usage is primarily for troubleshooting boot and network issues; article likely maps symptoms to using specific CLI commands and console actions. | +| [Create Azure Local VMs](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-arc-virtual-machines?view=azloc-2604) | configuration | 0.65 | Describes creating VMs via CLI/portal/ARM templates; includes resource definitions and parameters specific to Azure Local multi-rack VM management. | +| [Disaggregated network reference patterns overview](https://learn.microsoft.com/en-us/azure/azure-local/plan/network-patterns-overview-disaggregated?view=azloc-2604) | architecture-patterns | 0.65 | Overview of specific network reference patterns (leaf-spine, traffic flows) for this product; these are product-specific architecture patterns. | +| [Enable guest management](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-enable-guest-management?view=azloc-2604) | configuration | 0.65 | Enabling guest management typically involves specific Azure Local and Azure Migrate settings/flags and possibly extension configuration parameters. 
| +| [FedRAMP guidance](https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-fedramp-guidance?view=azloc-2604) | security | 0.65 | Details the relationship between Azure Local and FedRAMP and how to stay compliant; specialized security/compliance guidance beyond generic concepts. | +| [HIPAA guidance](https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-hipaa-guidance?view=azloc-2604) | security | 0.65 | Provides Azure Local–specific guidance for navigating HIPAA compliance, which is specialized security/compliance configuration and responsibility guidance. | +| [Health alerts](https://learn.microsoft.com/en-us/azure/azure-local/manage/health-alerts-via-azure-monitor-alerts?view=azloc-2604) | configuration | 0.65 | Explains mapping OS health service issues to Azure Monitor alerts; includes alert rule configuration and health signal specifics unique to Azure Local. | +| [Identity](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-identity?view=azloc-2604) | security | 0.65 | Identity planning for disconnected Azure Local will include product-specific roles, actions, and identity integration patterns that qualify as security/identity configuration guidance. | +| [Infrastructure resiliency](https://learn.microsoft.com/en-us/azure/azure-local/manage/disaster-recovery-infrastructure-resiliency?view=azloc-2604) | architecture-patterns | 0.65 | Explores infrastructure elements for resiliency and continuity; provides Azure Local–specific resiliency patterns and design trade-offs, fitting architecture-patterns. | +| [Manage Azure Local VM extensions](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-virtual-machine-manage-extension?view=azloc-2604) | configuration | 0.65 | VM extensions and guest management require specific configuration flags and extension names; these are product-specific settings. 
| +| [Manage Azure Local VM images](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-virtual-machine-manage-image?view=azloc-2604) | configuration | 0.65 | Describes listing, viewing, and deleting VM images via CLI/portal; includes resource types and parameters specific to Azure Local multi-rack. | +| [Manage SDN Multisite](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-sdn-multisite?view=azloc-2604) | architecture-patterns | 0.65 | Covers deployment and management of SDN Multisite, a specific topology pattern with capabilities and limitations. Product-specific multi-site networking design and operations. | +| [Manage logical networks](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-logical-networks?view=azloc-2604) | configuration | 0.65 | Focuses on managing existing logical networks; involves editing network settings and properties, which are product-specific configuration details. | +| [Manage logical networks](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-manage-logical-networks?view=azloc-2604) | configuration | 0.65 | Managing logical networks involves editing network properties and associations; uses Azure Local-specific resource configuration. | +| [Metric alerts](https://learn.microsoft.com/en-us/azure/azure-local/manage/setup-metric-alerts?view=azloc-2604) | configuration | 0.65 | Describes setting up metric alerts; includes metric names, conditions, and alert rule configuration specific to Azure Local. | +| [Migrate via PowerShell](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-via-powershell?view=azloc-2604) | integrations | 0.65 | PowerShell-based migration article likely documents specific cmdlets, parameters, and required values unique to Azure Migrate + Azure Local integration. 
| +| [Migration overview](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migration-options-overview?view=azloc-2604) | decision-making | 0.65 | Overview of migration options typically includes comparison of approaches (Azure Migrate, PowerShell, etc.) and guidance on when to use which; this is product-specific decision guidance. | +| [Monitor a single system](https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-single-23h2?view=azloc-2604) | configuration | 0.65 | Covers enabling logging and monitoring via Insights for a single system; likely includes workspace, data collection, and configuration settings specific to Azure Local. | +| [Monitor multiple systems](https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-multi-23h2?view=azloc-2604) | configuration | 0.65 | Explains monitoring multiple systems with Insights; includes configuration patterns for multi-cluster monitoring and possibly workspace and policy settings. | +| [Network](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-network?view=azloc-2604) | architecture-patterns | 0.65 | Network planning article for a niche disconnected scenario likely includes product-specific topology patterns, requirements, and trade-offs beyond generic networking concepts. | +| [Network Controller](https://learn.microsoft.com/en-us/azure/azure-local/concepts/plan-network-controller-deployment?view=azloc-2604) | deployment | 0.65 | Planning article with concrete, product-specific deployment requirements (for example, minimum three VMs for high availability and Azure Stack HCI OS prerequisites). These are deployment constraints rather than generic concepts. 
| +| [Opt in to update to 12.25xx from 11.xxxx](https://learn.microsoft.com/en-us/azure/azure-local/update/import-discover-updates-offline-23h2?view=azloc-2604) | configuration | 0.65 | Procedural article for importing static update payloads via PowerShell for limited-connectivity Azure Local sites; includes product-specific cmd usage and parameters that function as configuration steps. | +| [Overview](https://learn.microsoft.com/en-us/azure/azure-local/concepts/sdn-overview?view=azloc-2604) | decision-making | 0.65 | Explains SDN management methods, when to use each, and supported/unsupported scenarios; provides product-specific decision guidance on SDN usage. | +| [Recommended alert rules](https://learn.microsoft.com/en-us/azure/azure-local/manage/set-up-recommended-alert-rules?view=azloc-2604) | configuration | 0.65 | Covers predefined recommended alerts for metrics like CPU and memory; includes specific alert rule definitions and thresholds tailored to Azure Local. | +| [Register disconnected operations](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-registration?view=azloc-2604) | configuration | 0.65 | Registration steps and requirements for disconnected operations are product-specific configuration/provisioning details needed for compliance. | +| [Release information](https://learn.microsoft.com/en-us/azure/azure-local/release-information-23h2?view=azloc-2604) | deployment | 0.65 | Release information with supported update paths and version/build specifics is deployment-focused expert knowledge about upgrade constraints. | +| [Review cloud deployment network considerations](https://learn.microsoft.com/en-us/azure/azure-local/plan/cloud-deployment-network-considerations?view=azloc-2604) | architecture-patterns | 0.65 | Network planning guidance for Azure Local cloud deployments with product-specific patterns and considerations; focuses on how to design the network rather than just concepts. 
| +| [Run SQL Server](https://learn.microsoft.com/en-us/azure/azure-local/deploy/sql-server-23h2?view=azloc-2604) | deployment | 0.65 | Guidance for deploying SQL Server specifically on Azure Local, likely including configuration and deployment patterns unique to this platform. | +| [Security updates](https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/security-update-23?view=azloc-2604) | security | 0.65 | Lists specific security updates for these releases; product-specific security patch information. | +| [Security updates](https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/security-update-24?view=azloc-2604) | security | 0.65 | Security update listing is product-specific security information (KBs, CVEs, affected components) not generally known from training. | +| [Storage thin provisioning](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-thin-provisioning-23h2?view=azloc-2604) | configuration | 0.65 | Explains how thin provisioning works and how to configure it via PowerShell; likely includes cmdlet parameters and volume settings specific to Azure Local. | +| [Supported operations for VMs](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-operations?view=azloc-2604) | best-practices | 0.65 | Lists supported and unsupported VM operations and warns about operations to avoid to prevent complications; product-specific DO/DON'T guidance qualifies as best practices. | +| [Unregister and reregister machines](https://learn.microsoft.com/en-us/azure/azure-local/manage/unregister-register-machine?view=azloc-2604) | configuration | 0.65 | Provides specific PowerShell cmdlets and constraints for re-registration without OS reinstall; product-specific operational configuration. 
| +| [Update disconnected operations](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-update?view=azloc-2604) | deployment | 0.65 | Describes how to apply updates to the appliance in disconnected mode; update flows and constraints are product-specific deployment/operations knowledge. | +| [Upgrade SDN gateway VMs](https://learn.microsoft.com/en-us/azure/azure-local/manage/upgrade-sdn-gateways?view=azloc-2604) | deployment | 0.65 | Describes a controlled sequence for upgrading redundant and active SDN gateways to preserve connectivity. This is a deployment/upgrade pattern with product-specific ordering constraints. | +| [Upgrade SDN infrastructure](https://learn.microsoft.com/en-us/azure/azure-local/manage/upgrade-sdn?view=azloc-2604) | best-practices | 0.65 | Provides safe and secure upgrade guidance plus troubleshooting for SDN infrastructure, including do/don’t style recommendations and product-specific gotchas (and explicit exclusion of Arc-enabled SDN). | +| [Use ReFS deduplication and compression](https://learn.microsoft.com/en-us/azure/azure-local/manage/refs-deduplication-and-compression?view=azloc-2604) | configuration | 0.65 | Describes using ReFS deduplication and compression in Azure Local; includes product-specific configuration steps and options. | +| [Using Windows Admin Center](https://learn.microsoft.com/en-us/azure/azure-local/deploy/sdn-wizard-23h2?view=azloc-2604) | deployment | 0.65 | Describes deploying all SDN components on Azure Local through Windows Admin Center, including supported deployment order and alternative SDN Express path. Product-specific deployment workflow. | +| [2. Download the software](https://learn.microsoft.com/en-us/azure/azure-local/deploy/download-23h2-software?view=azloc-2604) | deployment | 0.60 | Explains how to obtain the specific OS image, including trial details and catalog-based preinstallation; product-specific deployment acquisition flow. 
| +| [About Azure Local upgrades](https://learn.microsoft.com/en-us/azure/azure-local/upgrade/about-upgrades-23h2?view=azloc-2604) | decision-making | 0.60 | Provides guidance on when and why to upgrade, support implications, and paths (including ESU and subscription constraints). Helps decide upgrade timing and approach. | +| [About Solution Builder Extension software updates](https://learn.microsoft.com/en-us/azure/azure-local/update/solution-builder-extension?view=azloc-2604) | configuration | 0.60 | Explains how to identify and install Solution Builder Extension updates and use advanced capabilities. Contains product-specific update configuration steps. | +| [Azure Hybrid Benefit](https://learn.microsoft.com/en-us/azure/azure-local/concepts/azure-hybrid-benefit-disaggregated?view=azloc-2604) | decision-making | 0.60 | Explains how to apply Azure Hybrid Benefit to Azure Local disaggregated deployments; likely includes licensing conditions and when it reduces cost, which is product-specific decision guidance. | +| [Azure Local and security standards](https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-security-standards?view=azloc-2604) | security | 0.60 | Security-assurance article tying Azure Local to specific standards and certifications; contains product-specific compliance positioning that is not generic knowledge. | +| [Collect logs](https://learn.microsoft.com/en-us/azure/azure-local/manage/collect-logs?view=azloc-2604) | configuration | 0.60 | Describes log collection mechanisms and commands; includes specific log packages, locations, and PowerShell usage unique to Azure Local. | +| [Complete prerequisites](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-prerequisites?view=azloc-2604) | deployment | 0.60 | Prerequisites and checklist for deploying Azure Local, including concrete requirements for hardware, networking, and security that are product-specific. 
| +| [Connect to Azure Local VMs](https://learn.microsoft.com/en-us/azure/azure-local/manage/connect-arc-vm-using-ssh?view=azloc-2604) | integrations | 0.60 | Covers SSH, RDP over SSH, and VM Connect; likely includes connection parameters, ports, and feature-specific settings, which are integration/coding patterns for remote access. | +| [Create and restore data disk snapshots](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-disk-snapshot?view=azloc-2604) | configuration | 0.60 | Shows how to snapshot and restore data disks with Azure Local-specific commands and constraints (data disk only, no OS disk). | +| [Drift detection](https://learn.microsoft.com/en-us/azure/azure-local/manage/drift-detection?view=azloc-2604) | best-practices | 0.60 | Explains the drift detection framework, how it validates component states, and how it aids troubleshooting; includes product-specific guidance on managing configuration drift. | +| [External storage support](https://learn.microsoft.com/en-us/azure/azure-local/concepts/external-storage-support?view=azloc-2604) | decision-making | 0.60 | Explains external storage support, benefits, and supported configurations; likely includes supported topologies, constraints, and configuration patterns that guide design decisions. | +| [Get cluster performance history](https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-cluster-performance-history?view=azloc-2604) | configuration | 0.60 | Explains cmdlets and metrics for performance history; includes metric sets and usage patterns unique to Azure Local/Storage Spaces Direct. | +| [Load balance multiple logical networks](https://learn.microsoft.com/en-us/azure/azure-local/manage/load-balance-multiple-networks?view=azloc-2604) | architecture-patterns | 0.60 | Guidance on using multiple logical networks for load balancing and workload isolation. Product-specific pattern for structuring SDN logical networks. 
| +| [Manage Azure Local VM resources](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-arc-virtual-machine-resources?view=azloc-2604) | configuration | 0.60 | Describes adding/removing data disks and NICs; involves VM resource configuration specific to Azure Local and Azure Arc. | +| [Manage Azure Local VMs](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-manage-arc-virtual-machines?view=azloc-2604) | configuration | 0.60 | Details start/stop/restart/delete and guest management operations; uses Azure-specific commands and settings for VM management. | +| [Manage VM extensions](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-manage-extension?view=azloc-2604) | configuration | 0.60 | Covers enabling guest management and installing extensions; includes extension-related settings and requirements, which are product-specific configuration details. | +| [Monitor clusters with Health Service](https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-overview?view=azloc-2604) | configuration | 0.60 | Describes using Health Service for Storage Spaces Direct clusters; includes cmdlets, health policies, and configuration options specific to this service. | +| [Monitor feature workbooks](https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-features?view=azloc-2604) | configuration | 0.60 | Describes monitoring specific features via Insights; likely includes feature-specific metrics and workbook configuration details. | +| [Overview](https://learn.microsoft.com/en-us/azure/azure-local/manage/disaster-recovery-overview?view=azloc-2604) | architecture-patterns | 0.60 | Disaster recovery considerations for Azure Local VMs; likely includes product-specific DR patterns and when to use them, which is architecture/DR pattern guidance. 
| +| [Overview](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-monitoring?view=azloc-2604) | configuration | 0.60 | Describes integrating Microsoft and non-Microsoft monitoring systems in a disconnected scenario; likely includes product-specific configuration endpoints and patterns. | +| [Remote support extension](https://learn.microsoft.com/en-us/azure/azure-local/manage/remote-support-arc-extension?view=azloc-2604) | integrations | 0.60 | Describes Remote Support Arc extension, scenarios, and commands used by Microsoft support; includes product-specific command patterns and integration behavior. | +| [Review prerequisites](https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-arc-vm-management-prerequisites?view=azloc-2604) | configuration | 0.60 | Prerequisites article typically enumerates specific requirements (versions, SKUs, settings); these are product-specific configuration and environment requirements. | +| [SDN Multisite overview](https://learn.microsoft.com/en-us/azure/azure-local/concepts/sdn-multisite-overview?view=azloc-2604) | architecture-patterns | 0.60 | Provides guidance on SDN Multisite benefits, limitations, and how to design network topology and DR; product-specific architectural pattern for multi-site SDN. | +| [SDN infrastructure](https://learn.microsoft.com/en-us/azure/azure-local/concepts/plan-software-defined-networking-infrastructure-23h2?view=azloc-2604) | architecture-patterns | 0.60 | Covers detailed SDN infrastructure planning including physical/logical network configuration, routing, gateways, and extension scenarios. Contains product-specific design guidance and trade-offs for Azure Local SDN deployments. | +| [Track Health Service actions](https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-actions?view=azloc-2604) | configuration | 0.60 | Describes Health Service workflows and actions; includes action types and behavior details specific to Azure Local clusters. 
| +| [Update SDN infrastructure](https://learn.microsoft.com/en-us/azure/azure-local/manage/update-sdn?view=azloc-2604) | configuration | 0.60 | Gives concrete, product-specific steps and ordering guidance for updating Network Controller, SLB, and gateways using Windows Update and PowerShell. Focused on configuration/maintenance of SDN components. | +| [What's a rack aware cluster?](https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-overview?view=azloc-2604) | deployment | 0.60 | Overview but includes supported configurations and deployment requirements for rack aware clusters, which are product-specific deployment characteristics. | ## Unclassified Pages | TOC Title | Confidence | Reason | |-----------|------------|--------| -| [Deploy Windows Server Azure Edition VMs](https://learn.microsoft.com/en-us/azure/azure-local/manage/windows-server-azure-edition-23h2?view=azloc-2603) | 0.50 | Deploying Windows Server Azure Edition VMs is a deployment/tutorial article; summary doesn’t clearly show limits, config matrices, or troubleshooting mappings. | -| [Manage Azure Local VM resources](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-arc-virtual-machine-resources?view=azloc-2603) | 0.50 | Managing VM resources (disks, NICs) is operational; summary doesn’t clearly indicate detailed configuration parameter tables or limits beyond basic add/delete steps. | -| [3B. Install and register via simplified machine provisioning](https://learn.microsoft.com/en-us/azure/azure-local/deploy/simplified-machine-provisioning?view=azloc-2603) | 0.45 | Describes simplified machine provisioning process; appears as a procedural tutorial without explicit configuration matrices or limits. 
| -| [Automatic virtual TPM state transfer](https://learn.microsoft.com/en-us/azure/azure-local/manage/trusted-launch-automatic-state-transfer?view=azloc-2603) | 0.45 | Describes automatic vTPM state transfer behavior conceptually; summary does not indicate configuration parameters, limits, or troubleshooting mappings. | -| [Connect to Azure Local VMs](https://learn.microsoft.com/en-us/azure/azure-local/manage/connect-arc-vm-using-ssh?view=azloc-2603) | 0.45 | Connection methods (SSH, RDP over SSH, VM Connect) are procedural; summary doesn’t indicate detailed configuration tables or troubleshooting mappings. | -| [Discover, replicate](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-vmware-replicate?view=azloc-2603) | 0.45 | A 'discovery and replication process' article is likely a step-by-step workflow/tutorial for using Azure Migrate with VMware VMs. The summary does not indicate structured limits, configuration parameter tables, or troubleshooting mappings; it appears more procedural than expert-reference in nature. | -| [Manage Azure Local VM Images](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-manage-image?view=azloc-2603) | 0.45 | Managing VM images (list, view, delete) is operational guidance; likely simple CLI/portal steps without detailed config matrices or limits. | -| [Manage VM extensions](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-manage-extension?view=azloc-2603) | 0.45 | Explains how to install and manage VM extensions on Azure Local VMs via the portal. The summary suggests a how-to tutorial; it doesn’t clearly indicate detailed extension configuration parameter tables, limits, or error mappings that would qualify as expert configuration, integration, or troubleshooting knowledge. 
| -| [Manage VMs](https://learn.microsoft.com/en-us/azure/azure-local/manage/vm?view=azloc-2603) | 0.45 | How-to for creating/managing VMs via Windows Admin Center; likely step-by-step UI operations without detailed config parameter tables or numeric limits. | -| [Manage gateway connections](https://learn.microsoft.com/en-us/azure/azure-local/manage/gateway-connections?view=azloc-2603) | 0.45 | Gateway connection management how-to; summary suggests basic create/update/delete operations without detailed configuration tables or troubleshooting content. | -| [Manage software load balancers](https://learn.microsoft.com/en-us/azure/azure-local/manage/load-balancers?view=azloc-2603) | 0.45 | Managing SLB policies via Windows Admin Center; appears as a standard management tutorial without explicit limits, config matrices, or error mappings. | -| [Manage tenant logical networks](https://learn.microsoft.com/en-us/azure/azure-local/manage/tenant-logical-networks?view=azloc-2603) | 0.45 | How-to for managing logical networks via Windows Admin Center; likely mostly UI steps without detailed config tables or limits. | -| [Manage tenant virtual networks](https://learn.microsoft.com/en-us/azure/azure-local/manage/tenant-virtual-networks?view=azloc-2603) | 0.45 | How-to for tenant virtual networks; summary indicates procedural steps rather than deep config references or troubleshooting. | -| [Migrate, verify](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-vmware-migrate?view=azloc-2603) | 0.45 | Migration how-to for VMware VMs; mostly step-by-step operations rather than expert-only limits, configs, or error catalogs. | -| [Update SDN infrastructure](https://learn.microsoft.com/en-us/azure/azure-local/manage/update-sdn?view=azloc-2603) | 0.45 | Update procedure for SDN components; summary suggests generic update flow without detailed config parameters, limits, or error-code-based troubleshooting. 
| -| [Upgrade stretched clusters to 23H2](https://learn.microsoft.com/en-us/azure/azure-local/upgrade/upgrade-stretched-cluster-to-23h2?view=azloc-2603) | 0.45 | Stretched cluster upgrade article is a prescriptive procedure; no indication of numeric limits, decision matrices, or error-code-based troubleshooting. | -| [1. Create a storage path](https://learn.microsoft.com/en-us/azure/azure-local/manage/create-storage-path?view=azloc-2603) | 0.40 | Focuses on how to create a storage path for VM images using CLI or portal. From the summary it appears to be a procedural how-to without explicit mention of configuration parameter tables, limits, or specialized patterns; likely standard tutorial-style guidance. | -| [2. Download the software](https://learn.microsoft.com/en-us/azure/azure-local/deploy/download-23h2-software?view=azloc-2603) | 0.40 | How to download OS software and notes about trial; mostly procedural without configuration matrices or limits. | -| [3A. Install manually via ISO](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-install-os?view=azloc-2603) | 0.40 | Step-by-step OS installation using SConfig; tutorial-style deployment steps rather than deployment constraints or config tables. | -| [4. Validate solution upgrade readiness](https://learn.microsoft.com/en-us/azure/azure-local/upgrade/validate-solution-upgrade-readiness?view=azloc-2603) | 0.40 | Upgrade readiness assessment article appears to be a procedural checklist without specific limits, error-code mappings, or configuration tables. | -| [About hyperconverged deployments](https://learn.microsoft.com/en-us/azure/azure-local/overview/hyperconverged-overview?view=azloc-2603) | 0.40 | Overview of hyperconverged deployments with benefits and use cases; mentions max 16 machines but primarily conceptual, not a focused limits/config/decision guide. 
| -| [Assign public IP address to a VM](https://learn.microsoft.com/en-us/azure/azure-local/manage/assign-public-ip-to-vm?view=azloc-2603) | 0.40 | Assigning a public IP via Windows Admin Center is a straightforward how-to; summary doesn’t indicate detailed config tables, limits, or troubleshooting. | -| [Complete prerequisites](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-vmware-prerequisites?view=azloc-2603) | 0.40 | A 'prerequisites' article for a migration workflow usually lists tasks and high-level setup steps (permissions, accounts, enabling features). While it may reference some specifics, it is primarily procedural and not a structured list of limits, configuration matrices, or error mappings that match any sub-skill definition. | -| [Migrate, verify](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-azure-migrate?view=azloc-2603) | 0.40 | Step-by-step migration walkthrough for Hyper-V VMs; likely procedural without detailed limits, configs tables, or error mappings. | -| [Overview](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migration-azure-migrate-overview?view=azloc-2603) | 0.40 | High-level overview of using Azure Migrate with Azure Local; detailed steps and configs are likely in linked articles, not here. | -| [SDN FAQ](https://learn.microsoft.com/en-us/azure/azure-local/concepts/sdn-frequently-asked-questions?view=azloc-2603) | 0.40 | FAQ for SDN; summary does not indicate detailed error codes, config tables, or numeric limits—likely general Q&A. | -| [Upgrade via ARM template](https://learn.microsoft.com/en-us/azure/azure-local/upgrade/install-solution-upgrade-azure-resource-manager-template?view=azloc-2603) | 0.40 | ARM template-based upgrade article is a how-to deployment/tutorial; summary doesn’t show parameter tables or tier constraints beyond generic version numbers. 
| -| [Upgrade via Azure portal](https://learn.microsoft.com/en-us/azure/azure-local/upgrade/install-solution-upgrade?view=azloc-2603) | 0.40 | Solution upgrade install guide is primarily step-by-step procedure; summary doesn’t indicate limits, config matrices, or detailed troubleshooting mappings. | -| [Use Azure Site Recovery](https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-site-recovery?view=azloc-2603) | 0.40 | Described as a guide to protect Azure Local Hyper-V VMs with Azure Site Recovery. From the summary it appears to be a scenario/tutorial guide; there’s no explicit indication of detailed limits, configuration matrices, or error-code-based troubleshooting that would qualify as expert knowledge under the defined sub-skill types. | -| [Using CentOS VM image](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-centos?view=azloc-2603) | 0.40 | Preparing CentOS images is a CLI tutorial; EOL warning is conceptual, and there’s no indication of detailed configuration tables or quotas. | -| [Using Express scripts](https://learn.microsoft.com/en-us/azure/azure-local/deploy/sdn-express-23h2?view=azloc-2603) | 0.40 | Primarily a deployment walkthrough using SDN Express scripts; no clear configuration matrices, limits, or product-specific troubleshooting/error-code mapping visible from the summary. | -| [Using Red Hat Enterprise Azure Marketplace image](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-azure-marketplace-red-hat?view=azloc-2603) | 0.40 | Preparing RHEL Marketplace images is a how-to; focus is on steps, not on limits, security roles, or decision matrices. | -| [Using Red Hat Enterprise VM image](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-red-hat-enterprise?view=azloc-2603) | 0.40 | Preparing RHEL images is a procedural article; summary doesn’t show expert-level configuration or troubleshooting content. 
| -| [Using SUSE VM image](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-suse?view=azloc-2603) | 0.40 | Preparing SUSE images is a CLI tutorial; no clear indication of configuration parameter reference tables or quotas. | -| [Using Ubuntu Azure Marketplace image](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-azure-marketplace-ubuntu?view=azloc-2603) | 0.40 | Preparing Ubuntu Marketplace images is a step-by-step preparation guide; summary emphasizes updates and integration, not limits or config matrices. | -| [Using Ubuntu VM image](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-linux-sysprep?view=azloc-2603) | 0.40 | Preparing Ubuntu images via CLI is a tutorial; while it may have commands, it’s not primarily a configuration reference or troubleshooting guide. | -| [Using an existing Azure Local VM](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-existing-arc-vm?view=azloc-2603) | 0.40 | Creating VM images from existing VMs is a CLI-based how-to; summary doesn’t show configuration matrices or limits beyond version applicability. | -| [What's new in Azure Local?](https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/whats-new-23?view=azloc-2603) | 0.40 | What's new feature list for 23xx; similar to other release summaries without deep expert-only technical details. | -| [What's new in Azure Local?](https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/whats-new-24?view=azloc-2603) | 0.40 | What's new release summary; feature list without deep configuration, limits, or troubleshooting matrices. 
| -| [Workloads resiliency](https://learn.microsoft.com/en-us/azure/azure-local/manage/disaster-recovery-workloads-resiliency?view=azloc-2603) | 0.40 | Disaster recovery and resiliency guidance for workloads on Azure Local VMs is architectural and conceptual; summary does not indicate concrete limits, configs, or product-specific error mappings. | -| [4. Create network interfaces](https://learn.microsoft.com/en-us/azure/azure-local/manage/create-network-interfaces?view=azloc-2603) | 0.35 | Describes creating network interfaces for Azure Local VMs via portal or CLI. The summary suggests a step-by-step creation guide rather than detailed configuration matrices, limits, or troubleshooting content, so it doesn’t clearly meet any expert-knowledge sub-skill criteria. | -| [Download managed disks from Azure](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-data-disks?view=azloc-2603) | 0.35 | Downloading managed disks to Azure Local is a step-by-step integration scenario but described at tutorial level; no clear evidence of config tables or limits. | -| [Manage Azure Local VMs](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-arc-virtual-machines?view=azloc-2603) | 0.35 | Covers basic VM lifecycle operations (start, stop, restart, delete) and enabling guest management. This is largely procedural and generic VM management behavior, without clear indication of product-specific limits, configuration tables, or error-code-based troubleshooting. | -| [Using Azure Marketplace images](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-azure-marketplace?view=azloc-2603) | 0.35 | Creating VM images from Azure Marketplace is a procedural guide; summary doesn’t show detailed configuration matrices or quotas. 
| -| [Using Windows Admin Center](https://learn.microsoft.com/en-us/azure/azure-local/deploy/sdn-wizard-23h2?view=azloc-2603) | 0.35 | Step-by-step deployment via Windows Admin Center; appears procedural without detailed config tables, limits, or decision matrices. | -| [Using images in Azure Storage account](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-storage-account?view=azloc-2603) | 0.35 | Creating VMs from Storage account images is a tutorial; summary doesn’t indicate expert-level limits, security roles, or troubleshooting mappings. | -| [Using images in local share](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-local-share?view=azloc-2603) | 0.35 | Using local share images is a procedural article; no evidence of detailed configuration option tables or quotas. | -| [6A. Deploy via Azure portal](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-via-portal?view=azloc-2603) | 0.30 | High-level deployment walkthrough via Azure portal; typically step-by-step UI guidance without detailed configuration matrices, limits, or product-specific parameters beyond what an LLM would already know. | -| [About Solution Builder Extension software updates](https://learn.microsoft.com/en-us/azure/azure-local/update/solution-builder-extension?view=azloc-2603) | 0.30 | Primarily an overview of Solution Builder Extension updates and how to apply them. The summary does not indicate presence of specific configuration tables, limits, error codes, or other detailed expert-only data; it appears to be a procedural/update overview. | -| [About Updates](https://learn.microsoft.com/en-us/azure/azure-local/update/about-updates-23h2?view=azloc-2603) | 0.30 | High-level overview of updates feature and benefits; summary doesn’t indicate detailed config parameters, limits, or troubleshooting. 
| -| [Azure Local observability](https://learn.microsoft.com/en-us/azure/azure-local/concepts/observability?view=azloc-2603) | 0.30 | Conceptual overview of observability and data sources; mostly high-level without detailed config tables or limits. | -| [Datacenter Firewall overview](https://learn.microsoft.com/en-us/azure/azure-local/concepts/datacenter-firewall-overview?view=azloc-2603) | 0.30 | Overview of Datacenter Firewall; conceptual description of capabilities, not detailed security configuration. | -| [FAQ](https://learn.microsoft.com/en-us/azure/azure-local/faq?view=azloc-2603) | 0.30 | FAQ about deployment, connectivity, and data handling; summary suggests general Q&A without detailed error codes, configs, or numeric limits. | -| [FAQs](https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-arc-vms-faq?view=azloc-2603) | 0.30 | FAQ content is usually high-level Q&A; summary does not mention error codes, config tables, or numeric limits. | -| [Get support for Azure Local](https://learn.microsoft.com/en-us/azure/azure-local/manage/get-support?view=azloc-2603) | 0.30 | General support process and lifecycle guidance; no product-specific configuration, limits, or error-resolution mappings. | -| [Get support for deployment issues](https://learn.microsoft.com/en-us/azure/azure-local/manage/get-support-for-deployment-issues?view=azloc-2603) | 0.30 | Describes how to contact Microsoft support and use remote support at a high level; lacks detailed technical troubleshooting mappings or configs. | -| [Load balancer overview](https://learn.microsoft.com/en-us/azure/azure-local/concepts/software-load-balancer?view=azloc-2603) | 0.30 | Overview of Software Load Balancer; summary is conceptual and does not show specific configuration or limits. 
| -| [Manage VMs with PowerShell](https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-powershell?view=azloc-2603) | 0.30 | Article describes how to create and manage VMs on Azure Local using PowerShell; likely a procedural tutorial with commands, but prompt summary does not indicate structured configuration tables, limits/quotas, or product-specific troubleshooting mappings. Appears to be general how-to guidance rather than expert reference data. | -| [Network Controller overview](https://learn.microsoft.com/en-us/azure/azure-local/concepts/network-controller-overview?view=azloc-2603) | 0.30 | Overview of Network Controller; appears conceptual, not focused on specific configuration parameters or limits. | -| [OEM license information](https://learn.microsoft.com/en-us/azure/azure-local/oem-license?view=azloc-2603) | 0.30 | OEM license overview and benefits; licensing/marketing/legal focus, not technical configuration or limits. | -| [Operational security](https://learn.microsoft.com/en-us/azure/azure-local/security-book/operational-security?view=azloc-2603) | 0.30 | Operational security overview; mentions tools like Windows Admin Center and Defender for Cloud but appears conceptual, not a detailed configuration or troubleshooting guide. | -| [RAS Gateway overview](https://learn.microsoft.com/en-us/azure/azure-local/concepts/gateway-overview?view=azloc-2603) | 0.30 | Overview of RAS Gateway; high-level description of capabilities without detailed configuration tables in the summary. | -| [Read overview](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-introduction?view=azloc-2603) | 0.30 | Deployment overview article; likely conceptual and navigational without matrices or constraints. 
|
-| [Review cloud deployment network considerations](https://learn.microsoft.com/en-us/azure/azure-local/plan/cloud-deployment-network-considerations?view=azloc-2603) | 0.30 | Planning/overview of network considerations; summary does not indicate concrete limits, config tables, or decision matrices. |
-| [Route reflector overview](https://learn.microsoft.com/en-us/azure/azure-local/concepts/route-reflector-overview?view=azloc-2603) | 0.30 | Overview of BGP Route Reflector; summary suggests conceptual explanation rather than detailed configuration or limits. |
-| [Run Azure Local VMs](https://learn.microsoft.com/en-us/azure/azure-local/manage/create-arc-virtual-machines?view=azloc-2603) | 0.30 | The page is primarily a how-to guide for creating Azure Local VMs enabled by Azure Arc using CLI, portal, or ARM templates. From the summary, it appears to be procedural tutorial content without detailed configuration parameter tables, limits, or troubleshooting mappings. It likely reuses standard Azure VM creation patterns that an LLM already knows, so it does not clearly meet the expert-knowledge criteria for any sub-skill type. |
-| [Security updates](https://learn.microsoft.com/en-us/azure/azure-local/security-update/security-update?view=azloc-2603) | 0.30 | Lists security updates; typically a catalog of patches/KBs rather than RBAC roles, config parameters, or security setting details. |
-| [Silicon-assisted security](https://learn.microsoft.com/en-us/azure/azure-local/security-book/silicon-assisted-security?view=azloc-2603) | 0.30 | Silicon-assisted security overview; focuses on secured core hardware and approved solutions conceptually, not on detailed configuration or limits. |
-| [Software-defined networking overview](https://learn.microsoft.com/en-us/azure/azure-local/concepts/software-defined-networking-23h2?view=azloc-2603) | 0.30 | Conceptual overview of SDN managed by on-premises tools; mostly descriptive without concrete configuration matrices in the summary. |
-| [Telemetry and diagnostics extension](https://learn.microsoft.com/en-us/azure/azure-local/concepts/telemetry-and-diagnostics-overview?view=azloc-2603) | 0.30 | Telemetry and diagnostics extension overview and benefits; lacks detailed settings tables, limits, or troubleshooting mappings. |
-| [Trustworthy addition](https://learn.microsoft.com/en-us/azure/azure-local/security-book/trustworthy-addition?view=azloc-2603) | 0.30 | High-level description of trustworthy addition; conceptual security principles without specific RBAC roles, config parameters, or compliance settings. |
-| [Understand update phases](https://learn.microsoft.com/en-us/azure/azure-local/update/update-phases-23h2?view=azloc-2603) | 0.30 | Describes phases of the Azure Local update workflow (preparation, download, validation, health checks, installation, progress reporting). This is process/behavioral documentation rather than detailed limits, configuration matrices, or troubleshooting mappings. |
-| [Update disconnected operations](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-update?view=azloc-2603) | 0.30 | The page appears to be a procedural guide on updating disconnected operations for Azure Local appliances. From the description and summary, it likely focuses on step-by-step update instructions rather than detailed configuration parameter tables, limits/quotas, error-code-based troubleshooting, or decision matrices. Without evidence of specific numeric limits, configuration option tables, or error-code mappings, it does not meet the criteria for any expert-knowledge sub-skill type. |
-| [What's new in disconnected operations?](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-whats-new?view=azloc-2603) | 0.30 | What's new/release info; mostly change log and feature descriptions. Does not clearly map to limits, configuration tables, or troubleshooting patterns per the defined categories. |
-| [Workload security](https://learn.microsoft.com/en-us/azure/azure-local/security-book/workload-security?view=azloc-2603) | 0.30 | Workload security overview; describes using Trusted launch and Defender for Cloud conceptually, without explicit configuration parameters or role mappings. |
-| [Azure Local VMs for multi-rack deployments](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-azure-arc-vm-management-overview?view=azloc-2603) | 0.25 | High-level overview of Azure Local VM management; focuses on benefits, components, and workflow, not on specific configuration parameters or limits. |
-| [Load balancer for multi-rack deployments](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-load-balancer-overview?view=azloc-2603) | 0.25 | Load balancer overview for multi-rack; describes types of load balancers. Summary suggests conceptual description rather than detailed configuration or limits. |
-| [NAT gateway for multi-rack deployments](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-nat-gateway-overview?view=azloc-2603) | 0.25 | NAT gateway overview; likely conceptual explanation of NAT gateway behavior in multi-rack deployments without detailed numeric limits or config tables. |
-| [Network fabric for multi-rack deployments](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-network-fabric-overview?view=azloc-2603) | 0.25 | Network fabric overview; primarily describes capabilities and concepts. No clear evidence of numeric limits, config parameter tables, or decision matrices. |
-| [Security concepts](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-security?view=azloc-2603) | 0.25 | Security overview; describes posture and principles but does not indicate specific RBAC roles, permission scopes, or configuration values. |
-| [Security foundation](https://learn.microsoft.com/en-us/azure/azure-local/security-book/security-foundation?view=azloc-2603) | 0.25 | Security foundation overview; discusses SDL, certifications, and supply chain at a high level, without product-specific configuration or numeric constraints. |
-| [Update via Azure portal](https://learn.microsoft.com/en-us/azure/azure-local/update/azure-update-manager-23h2?view=azloc-2603) | 0.25 | Describes how to use Azure Update Manager to apply updates and view history. From the summary it looks like a usage/how-to article without explicit configuration parameter tables, limits, or troubleshooting error mappings. Does not clearly match any expert-knowledge sub-skill type based on the provided hints. |
-| [About security features](https://learn.microsoft.com/en-us/azure/azure-local/concepts/security-features?view=azloc-2603) | 0.20 | Conceptual overview of security features; no specific RBAC roles, parameters, or config values indicated. |
-| [Azure Hybrid Benefit](https://learn.microsoft.com/en-us/azure/azure-local/concepts/azure-hybrid-benefit?view=azloc-2603) | 0.20 | High-level cost/benefit description of Azure Hybrid Benefit for Azure Local; summary indicates conceptual and marketing-style explanation without specific limits, configuration parameters, decision matrices, or error codes. |
-| [Azure Local and security standards](https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-security-standards?view=azloc-2603) | 0.20 | Describes security standards and certifications; compliance/assurance overview without product-specific configuration or limits. |
-| [Compute for multi-rack deployments](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-concepts-compute?view=azloc-2603) | 0.20 | Compute overview for multi-rack deployments; appears conceptual without explicit limits tables, configuration parameter tables, or troubleshooting mappings. |
-| [Conclusion](https://learn.microsoft.com/en-us/azure/azure-local/security-book/conclusion?view=azloc-2603) | 0.20 | Conclusion chapter; summarises security posture and future direction, not providing new expert configuration, limits, or troubleshooting content. |
-| [FAQ](https://learn.microsoft.com/en-us/azure/azure-local/license-billing?view=azloc-2603) | 0.20 | OEM license and billing FAQ is commercial/administrative, not technical configuration, limits, or troubleshooting content. |
-| [FedRAMP guidance](https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-fedramp-guidance?view=azloc-2603) | 0.20 | FedRAMP guidance; focuses on relationship and compliance, not on concrete configuration or limits. |
-| [HIPAA guidance](https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-hipaa-guidance?view=azloc-2603) | 0.20 | HIPAA guidance; summary suggests general compliance guidance, not detailed product-specific settings. |
-| [Hybrid capabilities with Azure services](https://learn.microsoft.com/en-us/azure/azure-local/hybrid-capabilities-with-azure-services-23h2?view=azloc-2603) | 0.20 | High-level description of hybrid capabilities and cloud service components; no concrete limits, configs, or decision matrices. |
-| [ISO/IEC 27001 guidance](https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-iso27001-guidance?view=azloc-2603) | 0.20 | ISO 27001 guidance at a conceptual/compliance level; summary does not show concrete security settings or RBAC details. |
-| [Monitoring overview](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-monitor-overview?view=azloc-2603) | 0.20 | Monitoring overview article; describes concepts and importance of monitoring without clear indication of specific metrics tables, limits, or configuration parameters that meet the expert-knowledge criteria. |
-| [Overview](https://learn.microsoft.com/en-us/azure/azure-local/concepts/monitoring-overview?view=azloc-2603) | 0.20 | Monitoring overview; conceptual explanation of monitoring approach without detailed metrics tables or configuration parameters. |
-| [Overview](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migration-azure-migrate-vmware-overview?view=azloc-2603) | 0.20 | High-level overview of using Azure Migrate with VMware; primarily conceptual and descriptive. |
-| [PCI DSS guidance](https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-pci-dss-guidance?view=azloc-2603) | 0.20 | PCI DSS guidance; appears compliance-oriented without specific configuration parameters or numeric limits. |
-| [SDN technical reference](https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-technical-reference?view=azloc-2603) | 0.20 | Technical reference index/landing page aggregating SDN resources; no direct expert configuration or troubleshooting content indicated. |
-| [What is Azure Local VM management?](https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-arc-vm-management-overview?view=azloc-2603) | 0.20 | High-level overview of Azure Local VM management benefits, components, and workflow without specific limits, configuration tables, error codes, or decision matrices. |
-| [What is Azure Local?](https://learn.microsoft.com/en-us/azure/azure-local/overview?view=azloc-2603) | 0.20 | High-level overview and benefits of Azure Local; no detailed limits, configs, or decision matrices. |
-| [What is Trusted launch for Azure Local VMs?](https://learn.microsoft.com/en-us/azure/azure-local/manage/trusted-launch-vm-overview?view=azloc-2603) | 0.20 | High-level overview of Trusted launch for Azure Local VMs; description suggests conceptual introduction and how to create VMs via portal/CLI, but no indication of specific limits, configuration parameter tables, RBAC role lists, or error-code-based troubleshooting content. |
-| [What's new in Azure Local?](https://learn.microsoft.com/en-us/azure/azure-local/whats-new?view=azloc-2603) | 0.20 | Release 'what's new' summary; likely high-level feature descriptions without detailed limits, configs, or troubleshooting matrices. |
-| [What's new in VM migration?](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-whats-new?view=azloc-2603) | 0.20 | “What’s new” feature list for Azure Migrate; primarily release notes/overview of new capabilities without detailed limits, configs, or troubleshooting mappings. |
-| [Azure Local security book](https://learn.microsoft.com/en-us/azure/azure-local/security-book/overview?view=azloc-2603) | 0.10 | High-level overview of a security book; no concrete settings, roles, or troubleshooting content. |
-| [Microsoft 365 Local](https://learn.microsoft.com/en-us/azure/azure-local/concepts/microsoft-365-local-overview?view=azloc-2603) | 0.10 | Described as an overview of Microsoft 365 Local on Azure Local infrastructure, focused on what it is and how it helps with sovereignty and productivity. This is conceptual/marketing-style content without clear indication of numeric limits, configuration tables, or detailed decision matrices, so it doesn't meet the expert-knowledge criteria. |
-| [Renaming to Azure Local](https://learn.microsoft.com/en-us/azure/azure-local/rename-to-azure-local?view=azloc-2603) | 0.10 | Brand rename explanation and FAQ; marketing/terminology content without technical expert configuration or troubleshooting details. |
+| [Deploy Windows Server Azure Edition VMs](https://learn.microsoft.com/en-us/azure/azure-local/manage/windows-server-azure-edition-23h2?view=azloc-2604) | 0.50 | Deploy Windows Server Azure Edition VMs; deployment tutorial without explicit tier matrices, limits, or config reference tables. |
+| [Manage Azure Local VM Images](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-manage-image?view=azloc-2604) | 0.50 | Manage VM images (list, view, delete); operational how-to without detailed configuration parameter tables or limits. |
+| [Manage Azure Local VMs](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-arc-virtual-machines?view=azloc-2604) | 0.50 | Manage lifecycle of Arc-enabled VMs (start/stop/etc.); operational tutorial without structured best-practices, limits, or config tables. |
+| [Using an existing Azure Local VM](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-existing-arc-vm?view=azloc-2604) | 0.50 | Create image from existing Arc-enabled VM; mostly CLI steps, not a configuration or troubleshooting reference. |
+| [Download managed disks from Azure](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-data-disks?view=azloc-2604) | 0.45 | How-to download a managed disk to Azure Local; likely a linear procedure without detailed config tables or limits. |
+| [Upgrade stretched clusters to 23H2](https://learn.microsoft.com/en-us/azure/azure-local/upgrade/upgrade-stretched-cluster-to-23h2?view=azloc-2604) | 0.45 | Upgrade stretched clusters 22H2→23H2; specific build numbers but mainly procedural upgrade steps, not limits, configs, or troubleshooting tables. |
+| [Using Azure Compute Gallery images](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-azure-compute-gallery?view=azloc-2604) | 0.45 | Tutorial for creating VM images from Azure Compute Gallery; procedural content without expert-only configuration matrices. |
+| [Using Azure Marketplace images](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-azure-marketplace?view=azloc-2604) | 0.45 | Tutorial for creating VM images from Azure Marketplace; mostly step-by-step usage of portal/CLI, not deep config or limits. |
+| [Using CentOS VM image](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-centos?view=azloc-2604) | 0.45 | Prepare CentOS image; similar to other image-prep tutorials, mostly commands and steps without structured expert reference content. |
+| [Using Red Hat Enterprise Azure Marketplace image](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-azure-marketplace-red-hat?view=azloc-2604) | 0.45 | Prepare RHEL Marketplace image; tutorial-style steps without structured configuration or decision-making content. |
+| [Using Red Hat Enterprise VM image](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-red-hat-enterprise?view=azloc-2604) | 0.45 | Prepare RHEL image; procedural CLI guide, not focused on limits, security roles, or troubleshooting mappings. |
+| [Using SUSE VM image](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-suse?view=azloc-2604) | 0.45 | Prepare SUSE image; similar to other Linux image-prep guides, mainly commands and steps. |
+| [Using Ubuntu Azure Marketplace image](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-azure-marketplace-ubuntu?view=azloc-2604) | 0.45 | Prepare Ubuntu Marketplace image; stepwise preparation and export, not a config/limits/troubleshooting reference. |
+| [Using Ubuntu VM image](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-linux-sysprep?view=azloc-2604) | 0.45 | Prepare Ubuntu image via CLI; image-prep tutorial with commands, but not organized as config reference or best-practices with quantified impact. |
+| [Using images in Azure Storage account](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-storage-account?view=azloc-2604) | 0.45 | How-to create VMs from images in a storage account; standard portal/CLI steps, not configuration reference or limits. |
+| [Using images in local share](https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-local-share?view=azloc-2604) | 0.45 | Create VM images from local share; procedural guide without detailed parameter tables or constraints. |
+| [2. Perform post-OS upgrade tasks](https://learn.microsoft.com/en-us/azure/azure-local/upgrade/post-upgrade-steps?view=azloc-2604) | 0.40 | Post-upgrade task walkthrough for Azure Local via PowerShell; procedural steps but no detailed limits, config tables, or error-code mappings. |
+| [3. Configure Network ATC](https://learn.microsoft.com/en-us/azure/azure-local/upgrade/install-enable-network-atc?view=azloc-2604) | 0.40 | How-to configure Network ATC on Azure Local; mostly stepwise guidance without config parameter tables, limits, or troubleshooting matrices. |
+| [4. Validate solution upgrade readiness](https://learn.microsoft.com/en-us/azure/azure-local/upgrade/validate-solution-upgrade-readiness?view=azloc-2604) | 0.40 | Upgrade readiness validation steps; procedural checks but no numeric thresholds, limits, or decision matrices. |
+| [About Updates](https://learn.microsoft.com/en-us/azure/azure-local/update/about-updates-23h2?view=azloc-2604) | 0.40 | Overview of the updates feature and benefits without detailed configuration parameters, limits, or troubleshooting specifics. |
+| [Add a node](https://learn.microsoft.com/en-us/azure/azure-local/manage/add-server?view=azloc-2604) | 0.40 | Operational guide to add a node; mostly procedural without detailed configuration parameter tables or limits. |
+| [Complete prerequisites](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-hyperv-prerequisites?view=azloc-2604) | 0.40 | Prerequisites article is usually task/checklist oriented (enable permissions, create resources) rather than tables of config parameters or limits. |
+| [Complete prerequisites](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-vmware-prerequisites?view=azloc-2604) | 0.40 | Prerequisites checklist for VMware migration; likely procedural (create projects, permissions) rather than detailed config or limits tables. |
+| [Datacenter Firewall overview](https://learn.microsoft.com/en-us/azure/azure-local/concepts/datacenter-firewall-overview?view=azloc-2604) | 0.40 | Overview of Datacenter Firewall; describes what it is and traffic types, but not specific configuration parameters or rules. |
+| [Discover, replicate](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-hyperv-replicate?view=azloc-2604) | 0.40 | Describes discovery and replication phase; likely procedural tutorial steps without detailed config tables or error-code mappings. |
+| [Discover, replicate](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-vmware-replicate?view=azloc-2604) | 0.40 | Discovery and replication how-to for VMware; mostly workflow steps, not structured troubleshooting or configuration reference. |
+| [Get support for deployment issues](https://learn.microsoft.com/en-us/azure/azure-local/manage/get-support-for-deployment-issues?view=azloc-2604) | 0.40 | Explains how to get support, including log collection and remote support, but primarily process-oriented rather than technical configuration or error mapping. |
+| [Load balancer overview](https://learn.microsoft.com/en-us/azure/azure-local/concepts/software-load-balancer?view=azloc-2604) | 0.40 | Overview of Software Load Balancer; mainly conceptual description of functionality and benefits. |
+| [Migrate, verify](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-azure-migrate?view=azloc-2604) | 0.40 | End-to-end migration how-to; mostly step-by-step instructions, not focused on limits, configuration matrices, or troubleshooting catalogs. |
+| [Migrate, verify](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-vmware-migrate?view=azloc-2604) | 0.40 | Migration procedure article; step-by-step guidance without strong indication of limits, config matrices, or error-code catalogs. |
+| [Network Controller overview](https://learn.microsoft.com/en-us/azure/azure-local/concepts/network-controller-overview?view=azloc-2604) | 0.40 | Overview of Network Controller; appears conceptual and descriptive rather than configuration- or troubleshooting-focused. |
+| [Overview](https://learn.microsoft.com/en-us/azure/azure-local/manage/disaster-recovery-vm-resiliency?view=azloc-2604) | 0.40 | Resiliency considerations for VMs are largely conceptual design guidance without concrete numeric thresholds, configuration tables, or product-specific limits. |
+| [RAS Gateway overview](https://learn.microsoft.com/en-us/azure/azure-local/concepts/gateway-overview?view=azloc-2604) | 0.40 | Overview of RAS Gateway; high-level explanation of role and capabilities without clear product-specific configuration details. |
+| [Route reflector overview](https://learn.microsoft.com/en-us/azure/azure-local/concepts/route-reflector-overview?view=azloc-2604) | 0.40 | Overview of BGP Route Reflector; conceptual explanation of topology and purpose without detailed config or limits. |
+| [SDN FAQ](https://learn.microsoft.com/en-us/azure/azure-local/concepts/sdn-frequently-asked-questions?view=azloc-2604) | 0.40 | FAQ format; summary suggests general Q&A about SDN availability and versions, not detailed error codes, configs, or limits. |
+| [Software-defined networking overview](https://learn.microsoft.com/en-us/azure/azure-local/concepts/software-defined-networking-23h2?view=azloc-2604) | 0.40 | Conceptual overview of SDN managed by on-premises tools; primarily descriptive without clear indication of configuration tables or decision matrices. |
+| [Suspend and resume](https://learn.microsoft.com/en-us/azure/azure-local/manage/suspend-resume-cluster-maintenance?view=azloc-2604) | 0.40 | Step-by-step maintenance procedure (suspend/resume host) but no detailed config tables, limits, or product-specific troubleshooting mappings. |
+| [Understand update phases](https://learn.microsoft.com/en-us/azure/azure-local/update/update-phases-23h2?view=azloc-2604) | 0.40 | Describes phases of the update workflow conceptually (preparation, validation, health checks, reporting) without detailed parameters or troubleshooting mappings. |
+| [Upgrade via ARM template](https://learn.microsoft.com/en-us/azure/azure-local/upgrade/install-solution-upgrade-azure-resource-manager-template?view=azloc-2604) | 0.40 | ARM template-based solution upgrade; deployment tutorial without tier matrices, limits, or config parameter tables. |
+| [Upgrade via Azure portal](https://learn.microsoft.com/en-us/azure/azure-local/upgrade/install-solution-upgrade?view=azloc-2604) | 0.40 | Solution upgrade installation guide; step-by-step upgrade but no expert-only limits, configs, or troubleshooting mappings. |
+| [What is Trusted launch for Azure Local VMs?](https://learn.microsoft.com/en-us/azure/azure-local/manage/trusted-launch-vm-overview?view=azloc-2604) | 0.40 | Overview of Trusted launch; primarily conceptual and feature-intro content without detailed configuration parameter tables or limits. |
+| [Workloads resiliency](https://learn.microsoft.com/en-us/azure/azure-local/manage/disaster-recovery-workloads-resiliency?view=azloc-2604) | 0.40 | Layered DR strategy for workloads is high-level resiliency guidance without specific numeric RPO/RTO thresholds, configuration tables, or product-specific limits. |
+| [FAQ](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-faq?view=azloc-2604) | 0.35 | FAQ format; likely high-level Q&A without structured error-code mappings or detailed configuration tables. |
+| [Network ATC overview](https://learn.microsoft.com/en-us/azure/azure-local/concepts/network-atc-overview?view=azloc-2604) | 0.35 | Network ATC overview; primarily conceptual and benefits-focused, not a detailed configuration or limits reference. |
+| [About disaggregated deployments](https://learn.microsoft.com/en-us/azure/azure-local/overview/disaggregated-overview?view=azloc-2604) | 0.30 | Overview of disaggregated deployments; benefits and use cases, not detailed requirements or decision matrices with thresholds. |
+| [About hyperconverged deployments](https://learn.microsoft.com/en-us/azure/azure-local/overview/hyperconverged-overview?view=azloc-2604) | 0.30 | Overview of hyperconverged deployments and benefits; no clear evidence of detailed configuration tables or decision matrices. |
+| [About security features](https://learn.microsoft.com/en-us/azure/azure-local/concepts/security-features?view=azloc-2604) | 0.30 | Described as a brief conceptual overview of security features; no indication of concrete RBAC roles, parameters, or configuration values. |
+| [Add a node](https://learn.microsoft.com/en-us/azure/azure-local/manage/add-server-disaggregated?view=azloc-2604) | 0.30 | Adding a node is a capacity/scale-out how-to; summary suggests step-by-step operations without explicit limits, config matrices, or decision criteria. |
+| [Azure Hybrid Benefit](https://learn.microsoft.com/en-us/azure/azure-local/concepts/azure-hybrid-benefit?view=azloc-2604) | 0.30 | Explains Azure Hybrid Benefit program for Azure Local; mostly licensing and cost concept, not technical configuration or limits. |
+| [Azure Local VMs](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-arc-vm?view=azloc-2604) | 0.30 | Described as a brief overview of management features and differences; likely conceptual with references to other articles rather than detailed configs or error mappings. |
+| [Azure Local observability](https://learn.microsoft.com/en-us/azure/azure-local/concepts/observability?view=azloc-2604) | 0.30 | Conceptual observability overview; describes data sources and concepts rather than detailed configuration or limits. |
+| [Billing and payment](https://learn.microsoft.com/en-us/azure/azure-local/concepts/billing?view=azloc-2604) | 0.30 | Billing and payment overview; pricing is external and not a technical configuration or limit table in this article. |
+| [FAQ](https://learn.microsoft.com/en-us/azure/azure-local/faq?view=azloc-2604) | 0.30 | FAQ summary only; likely mixed conceptual and policy answers without clear indication of detailed limits/configs in the snippet. |
+| [FAQs](https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-arc-vms-faq?view=azloc-2604) | 0.30 | FAQ format; summary doesn’t indicate detailed error codes, configuration tables, or numeric limits; likely general Q&A rather than deep expert data. |
+| [Get remote support](https://learn.microsoft.com/en-us/azure/azure-local/manage/get-remote-support?view=azloc-2604) | 0.30 | Procedural overview for enabling remote support; likely step-by-step UI/portal flow without detailed config tables, limits, or error-code mappings. |
+| [Get support for Azure Local](https://learn.microsoft.com/en-us/azure/azure-local/manage/get-support?view=azloc-2604) | 0.30 | General guidance on obtaining support and staying in a supported state; lacks detailed technical configuration or troubleshooting content. |
+| [Monitoring overview](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-monitor-overview?view=azloc-2604) | 0.30 | Monitoring overview is conceptual; summary doesn’t indicate metric tables with numeric thresholds or configuration parameters. |
+| [Operational security](https://learn.microsoft.com/en-us/azure/azure-local/security-book/operational-security?view=azloc-2604) | 0.30 | Operational security chapter summary; likely conceptual (tools mentioned like WAC and Defender for Cloud) without detailed configuration or error mappings. |
+| [Overview](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migration-azure-migrate-overview?view=azloc-2604) | 0.30 | High-level overview of using Azure Migrate for Hyper-V; likely conceptual and workflow-focused, not detailed config or decision matrices. |
+| [Overview](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migration-azure-migrate-vmware-overview?view=azloc-2604) | 0.30 | Overview of using Azure Migrate for VMware; conceptual and workflow-focused rather than detailed expert configuration or decision matrices. |
+| [Read overview](https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-introduction?view=azloc-2604) | 0.30 | Deployment overview article; likely describes methods and flow without detailed matrices, limits, or configuration tables. |
+| [Repair a node](https://learn.microsoft.com/en-us/azure/azure-local/manage/repair-server-disaggregated?view=azloc-2604) | 0.30 | Repairing a node is primarily procedural operations content; summary doesn’t indicate detailed limits, config tables, or error-code-based troubleshooting. |
+| [Security updates](https://learn.microsoft.com/en-us/azure/azure-local/security-update/security-update?view=azloc-2604) | 0.30 | Security update listing; likely just KB references and dates, not configuration, RBAC, or troubleshooting guidance. |
+| [Silicon-assisted security](https://learn.microsoft.com/en-us/azure/azure-local/security-book/silicon-assisted-security?view=azloc-2604) | 0.30 | Silicon-assisted security overview (secured core hardware, approved solutions); likely high-level without detailed configuration or thresholds. |
+| [Telemetry and diagnostics extension](https://learn.microsoft.com/en-us/azure/azure-local/concepts/telemetry-and-diagnostics-overview?view=azloc-2604) | 0.30 | Telemetry and diagnostics overview with benefits and options; appears conceptual without detailed config tables or limits. |
+| [Trustworthy addition](https://learn.microsoft.com/en-us/azure/azure-local/security-book/trustworthy-addition?view=azloc-2604) | 0.30 | High-level description of 'trustworthy addition' and security-by-default concepts; no indication of specific roles, parameters, or configs. |
+| [Use dashboard to manage at-scale](https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-at-scale-dashboard?view=azloc-2604) | 0.30 | Dashboard usage/how-to for monitoring at scale; lacks numeric limits, config matrices, or deep troubleshooting content. |
+| [Workload security](https://learn.microsoft.com/en-us/azure/azure-local/security-book/workload-security?view=azloc-2604) | 0.30 | Workload security overview referencing Trusted launch and Defender for Cloud; appears conceptual rather than parameter- or role-level guidance. |
+| [Azure Local VMs for multi-rack deployments](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-azure-arc-vm-management-overview?view=azloc-2604) | 0.25 | VM management overview is high-level (benefits, components, workflow); lacks explicit expert-level configuration or troubleshooting details. |
+| [Compute for multi-rack deployments](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-concepts-compute?view=azloc-2604) | 0.25 | Compute overview is conceptual; no clear indication of numeric limits, config tables, or decision matrices in the summary. |
+| [Load balancer for multi-rack deployments](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-load-balancer-overview?view=azloc-2604) | 0.25 | Load balancer overview is descriptive; no explicit mention of configuration tables, limits, or decision matrices. |
+| [NAT gateway for multi-rack deployments](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-nat-gateway-overview?view=azloc-2604) | 0.25 | NAT gateway overview is conceptual; summary doesn’t indicate detailed configuration parameters or numeric constraints. |
+| [Network fabric for multi-rack deployments](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-network-fabric-overview?view=azloc-2604) | 0.25 | Network fabric overview describes capabilities and concepts; summary doesn’t show detailed configuration parameters or limits. |
+| [Security concepts](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-security?view=azloc-2604) | 0.25 | Security overview is conceptual and principle-based; summary doesn’t show specific RBAC roles, settings, or compliance configuration details. |
+| [What's new in Azure Local?](https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/whats-new-23?view=azloc-2604) | 0.25 | “What’s new” feature list for 23xx; similar to other release feature summaries without deep technical matrices. |
+| [What's new in Azure Local?](https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/whats-new-24?view=azloc-2604) | 0.25 | “What’s new” release feature list; not focused on configuration matrices, limits, or troubleshooting catalogs. |
+| [About multi-rack deployments](https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-overview?view=azloc-2604) | 0.20 | Multi-rack overview is high-level benefits and features; no indication of numeric limits, config parameters, or decision matrices. |
+| [FAQ](https://learn.microsoft.com/en-us/azure/azure-local/license-billing?view=azloc-2604) | 0.20 | Billing and license FAQ; commercial details, not technical expert knowledge per defined categories. |
+| [Hybrid capabilities with Azure services](https://learn.microsoft.com/en-us/azure/azure-local/hybrid-capabilities-with-azure-services-23h2?view=azloc-2604) | 0.20 | Conceptual overview of hybrid capabilities and components; no concrete limits, configs, or decision matrices. |
+| [Microsoft 365 Local](https://learn.microsoft.com/en-us/azure/azure-local/concepts/microsoft-365-local-overview?view=azloc-2604) | 0.20 | Overview of Microsoft 365 Local on Azure Local; primarily conceptual and marketing-style explanation of benefits and sovereignty. |
+| [OEM license information](https://learn.microsoft.com/en-us/azure/azure-local/oem-license?view=azloc-2604) | 0.20 | OEM license overview; licensing/benefits rather than technical limits, configs, or patterns. |
+| [Overview](https://learn.microsoft.com/en-us/azure/azure-local/concepts/monitoring-overview?view=azloc-2604) | 0.20 | High-level monitoring overview; lacks detailed metrics tables, configuration parameters, or troubleshooting mappings. |
+| [Overview](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-overview?view=azloc-2604) | 0.20 | High-level overview of disconnected operations, focused on concepts, benefits, and compliance; no concrete limits, configs, or error mappings. |
+| [SDN technical reference](https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-technical-reference?view=azloc-2604) | 0.20 | High-level technical reference hub linking to other SDN resources; does not itself contain detailed configuration, limits, or troubleshooting content. |
+| [Security foundation](https://learn.microsoft.com/en-us/azure/azure-local/security-book/security-foundation?view=azloc-2604) | 0.20 | Security foundation overview (SDL, certifications, supply chain); conceptual background, not actionable configuration or troubleshooting content. |
+| [What is Azure Local VM management?](https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-arc-vm-management-overview?view=azloc-2604) | 0.20 | High-level overview of VM management; conceptual description of benefits and components, not detailed configs or limits. |
+| [What is Azure Local?](https://learn.microsoft.com/en-us/azure/azure-local/overview?view=azloc-2604) | 0.20 | High-level product overview and benefits; no concrete limits, configs, or decision matrices. |
+| [What's new in Azure Local?](https://learn.microsoft.com/en-us/azure/azure-local/whats-new?view=azloc-2604) | 0.20 | What's new / release highlights; feature list rather than limits, configs, or troubleshooting mappings. |
+| [What's new in VM migration?](https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-whats-new?view=azloc-2604) | 0.20 | “What’s new” feature list; usually release/feature bullets without deep config, limits, or troubleshooting matrices. |
+| [What's new in disconnected operations?](https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-whats-new?view=azloc-2604) | 0.20 | What's new/release highlights; does not indicate detailed configuration, limits, or troubleshooting content. |
+| [Azure Local security book](https://learn.microsoft.com/en-us/azure/azure-local/security-book/overview?view=azloc-2604) | 0.10 | High-level overview of a security book; navigation/summary content without concrete technical details. |
+| [Conclusion](https://learn.microsoft.com/en-us/azure/azure-local/security-book/conclusion?view=azloc-2604) | 0.10 | Conclusion chapter summarizing security posture and future direction; no expert-level technical details. |
+| [Renaming to Azure Local](https://learn.microsoft.com/en-us/azure/azure-local/rename-to-azure-local?view=azloc-2604) | 0.10 | Brand rename explanation and FAQ; marketing/communication content without technical expert details. |
diff --git a/products/azure-logic-apps/azure-logic-apps.csv b/products/azure-logic-apps/azure-logic-apps.csv
index d4abeea2..45cef504 100644
--- a/products/azure-logic-apps/azure-logic-apps.csv
+++ b/products/azure-logic-apps/azure-logic-apps.csv
@@ -173,7 +173,7 @@ https://learn.microsoft.com/en-us/azure/logic-apps/monitor-track-b2b-transaction
 https://learn.microsoft.com/en-us/azure/logic-apps/monitor-workflows-collect-diagnostic-data,Monitor and collect diagnostic data for workflows,Collect diagnostic data for workflows - Azure Logic Apps,Configure diagnostic logging for Logic Apps with Azure Monitor,Record diagnostic data for workflows in Azure Logic Apps with Azure Monitor Logs.,"Applies to:Azure Logic Apps (Consumption + Standard) To get richer data for debugging and diagnosing your workflows in Azure Logic Apps, you can log workflow runtime data and events, such as trigger events, run events, and action events, that you can send to aLog Analytics workspace, Azurestorage account, Azureevent hub, another partner destination, or all these destinations when you set up and useAzure Monitor Logs. Note Azure Monitor Resource Logs aren't 100% lossless. Resource Logs are based ",2025-09-24T08:00:00.000Z,how-to,configuration,0.7,True,"Describes setting up diagnostic logging destinations (Log Analytics, storage, event hub) and Azure Monitor Resource Logs behavior.
This typically includes specific diagnostic settings and categories, which are product-specific configuration parameters.",unchanged
https://learn.microsoft.com/en-us/azure/logic-apps/move-logic-app-resources,Move logic app resources,"Move Logic Apps to Other Subscriptions, Resource Groups, or Regions - Azure Logic Apps","Move Logic Apps across subscriptions, groups, and regions","Migrate logic apps or integration accounts to other Azure subscriptions, resource groups, or locations (regions).","To migrate your logic app or related resources to another Azure resource group, region, or subscription, you have various ways to complete these tasks, such as the Azure portal, Azure PowerShell, Azure CLI, and REST API. Before you move resources, review these considerations: You can move onlyspecific logic app resource typesbetween Azure resource groups or subscriptions. Check thelimitson the number of logic app resources that you can have in your Azure subscription and in each Azure region. Th",2026-03-11T08:00:00.000Z,how-to,deployment,0.65,True,"Migration guidance mentions specific movable resource types and references subscription/region limits; such pages typically include constraints, supported/unsupported scenarios, and method-specific requirements (portal, PowerShell, CLI), which are product-specific deployment/migration details.",unchanged
https://learn.microsoft.com/en-us/azure/logic-apps/multi-region-disaster-recovery,Multi-region deployments for disaster recovery,Multi-region deployments for disaster recovery in Azure Logic Apps - Azure Logic Apps,Design multi-region disaster recovery for Logic Apps,"Learn how to design your strategy to protect data, recover quickly from disruptive events, restore resources required by critical business functions, and maintain business continuity for Azure Logic A","This article provides guidance and strategies for how to set up a multi-region deployment for your logic app workflows in Azure Logic Apps. 
A multi-region deployment helps you protect data, recover quickly from disasters and other disruptive events, restore resources required by critical business functions, and maintain business continuity. For more information on the reliability features in Azure Logic Apps, including intra-regional resiliency via availability zones, seeReliability in Azure Log",2025-07-17T08:00:00.000Z,concept-article,architecture-patterns,0.6,True,"Provides product-specific DR architecture guidance and strategies for multi-region deployments, which are design/architecture patterns unique to Logic Apps.",unchanged
-https://learn.microsoft.com/en-us/azure/logic-apps/parse-document-chunk-text,Parse or chunk content,Parse Documents and Chunk Text in Workflows - Azure Logic Apps,Configure Parse Document and Chunk Text actions in Logic Apps,Parse documents and chunk text to transform content into tokenized strings for workflows in Azure Logic Apps.,"Applies to:Azure Logic Apps (Consumption + Standard) Sometimes you have to convert content into tokens, which are words or chunks of characters, or divide a large document into smaller pieces before you can use this content with specific actions. For example, theAzure AI SearchorAzure OpenAIactions expect tokenized input and can handle only a limited number of tokens. For these scenarios, use theData Operationsactions namedParse a documentandChunk textin your logic app workflow. These actions re",2026-04-13T08:00:00.000Z,how-to,configuration,0.68,True,"The page describes product-specific behavior and parameters of the 'Parse a document' and 'Chunk text' Data Operations actions in Azure Logic Apps, including how they transform content into tokenized strings for Azure AI Search and Azure OpenAI actions. It likely includes specific action settings/fields and their effects, which are configuration details unique to Logic Apps rather than generic concepts. 
While limits are mentioned conceptually (token limits for downstream actions), the core expert content is how to configure and use these actions within workflows.",updated
+https://learn.microsoft.com/en-us/azure/logic-apps/parse-document-chunk-text,Parse or chunk content,Parse Documents and Chunk Text in Workflows - Azure Logic Apps,Configure Parse Document and Chunk Text actions in Logic Apps,Parse documents and chunk text to transform content into tokenized strings for workflows in Azure Logic Apps.,"Applies to:Azure Logic Apps (Consumption + Standard) Sometimes you have to convert content into tokens, which are words or chunks of characters, or divide a large document into smaller pieces before you can use this content with specific actions. For example, theAzure AI SearchorAzure OpenAIactions expect tokenized input and can handle only a limited number of tokens. For these scenarios, use theData Operationsactions namedParse a documentandChunk textin your logic app workflow. These actions re",2026-04-13T08:00:00.000Z,how-to,configuration,0.68,True,"The page describes product-specific behavior and parameters of the 'Parse a document' and 'Chunk text' Data Operations actions in Azure Logic Apps, including how they transform content into tokenized strings for Azure AI Search and Azure OpenAI actions. It likely includes specific action settings/fields and their effects, which are configuration details unique to Logic Apps rather than generic concepts. 
While limits are mentioned conceptually (token limits for downstream actions), the core expert content is how to configure and use these actions within workflows.",unchanged
https://learn.microsoft.com/en-us/azure/logic-apps/plan-manage-costs,Plan and manage costs,Plan and Manage Costs - Azure Logic Apps,Plan and manage Azure Logic Apps costs,Learn how to plan and manage costs for Azure Logic Apps by using cost analysis in the Azure portal.,"Applies to:Azure Logic Apps (Consumption + Standard) This article helps you plan and manage costs for Azure Logic Apps. Before you create or add any resources using this service, estimate your costs by using the Azure pricing calculator. After you start using Azure Logic Apps resources, you can set budgets and monitor costs by usingMicrosoft Cost Management. To identify areas where you might want to act, you can also review forecasted costs and monitor spending trends. Keep in mind that costs fo",2026-01-13T08:00:00.000Z,how-to,decision-making,0.78,True,"Cost-planning article that likely includes pricing drivers, usage metrics, and guidance on cost trade-offs between tiers and usage patterns—supports decision-making.",unchanged
https://learn.microsoft.com/en-us/azure/logic-apps/policy-reference,Azure Policy built-ins,Built-in policy definitions for Azure Logic Apps - Azure Logic Apps,Use built-in Azure Policy definitions for Logic Apps governance,Lists Azure Policy built-in policy definitions for Azure Logic Apps. These built-in policy definitions provide common approaches to managing your Azure resources.,"This page is an index ofAzure Policybuilt-in policy definitions for Azure Logic Apps. 
For additional Azure Policy built-ins for other services, 
diff --git a/products/azure-logic-apps/report.md b/products/azure-logic-apps/report.md
index 6b2dad31..dc022c29 100644
--- a/products/azure-logic-apps/report.md
+++ b/products/azure-logic-apps/report.md
@@ -53,8 +53,8 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure API Man
### Incremental Update
- **New Pages**: 0
-- **Updated Pages**: 1
-- **Unchanged**: 228
+- **Updated Pages**: 0
+- **Unchanged**: 229
- **Deleted Pages**: 0
- **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-logic-apps/azure-logic-apps.csv`
@@ -75,11 +75,6 @@ confusable_not_for: Not for Azure Functions (use azure-functions), Azure API Man
## Changes
-### Updated Pages
-
-- [Parse or chunk content](https://learn.microsoft.com/en-us/azure/logic-apps/parse-document-chunk-text)
- - Updated: 2026-03-11T08:00:00.000Z → 2026-04-13T08:00:00.000Z
-
## Classified Pages
| TOC Title | Type | Confidence | Reason |
diff --git a/products/azure-machine-learning/azure-machine-learning.csv b/products/azure-machine-learning/azure-machine-learning.csv
index 592fe563..b26bfa41 100644
--- a/products/azure-machine-learning/azure-machine-learning.csv
+++ b/products/azure-machine-learning/azure-machine-learning.csv
@@ -493,23 +493,22 @@ definitions for Azure Machine Learning. Common use cases for Azure Policy includ
governance for resource consistency, regulatory compliance, security, cost, and management. Policy definitions for these common use cases are already available in your Azure environment as built-ins to help you get started. For additional Azure Policy built-ins for other services, seeAzure Policy built-in definitions. 
The name of each built-in policy ",2025-07-04T11:03:00.000Z,concept-article,security,0.78,True,"Lists concrete Azure Policy definition names, parameters, and scopes for AML, which are product-specific governance/security configurations.",unchanged
-https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/community-ecosystem?view=azureml-api-2,Prompt flow Ecosystem,Prompt flow ecosystem - Azure Machine Learning,,"Introduction to the prompt flow ecosystem, which includes the prompt flow open source project, tutorials, SDK, CLI and VS Code extension.","The prompt flow ecosystem provides a comprehensive set of tutorials, tools, and resources for developers who want to leverage the power of prompt flow to experimentally tune their prompts and develop their LLM-based application in a pure local environment, without any dependencies on Azure resources binding. This article provides an overview of the key components within the ecosystem, which include:",2026-02-28T08:00:00.000Z,concept-article,,0.1,False,"High-level overview of the prompt flow ecosystem (tools, tutorials, SDK, CLI, VS Code extension) without detailed configuration parameters, limits, error codes, or product-specific decision matrices.",unchanged
-https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-connections?view=azureml-api-2,Connections,Connections in Azure Machine Learning prompt flow - Azure Machine Learning,Manage API and data source credentials with prompt flow connections,"Learn about how in Azure Machine Learning prompt flow, you can utilize connections to effectively manage credentials or secrets for APIs and data sources.","In Azure Machine Learning prompt flow, use connections to manage credentials or secrets for APIs and data sources.",2025-11-19T18:13:00.000Z,concept-article,security,0.7,True,"Focuses on managing credentials/secrets via connections; likely includes connection types, scopes, and secure usage patterns, which are product-specific security 
configuration details.",unchanged
-https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-flows?view=azureml-api-2,Flows,What are flows in Azure Machine Learning prompt flow - Azure Machine Learning,,Learn about how a flow in prompt flow serves as an executable workflow that streamlines the development of your LLM-based AI application. It provides a comprehensive framework for managing data flow a,"In Azure Machine Learning prompt flow, you can develop an LLM-based AI application by engaging in the stages of developing, testing, tuning, and deploying a flow. This comprehensive workflow allows you to harness the power of Large Language Models (LLMs) and create sophisticated AI applications with ease.",2025-11-19T23:15:00.000Z,concept-article,,0.25,False,"Conceptual description of flows and lifecycle; no indication of numeric limits, config tables, or security/troubleshooting specifics.",unchanged
-https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-llmops-maturity?view=azureml-api-2,Advance your maturity level,Advance your maturity level for GenAIOps - Azure Machine Learning,,Learn about the different stages of Generative Artificial Intelligence Operations (GenAIOps) and how to advance your organization's capabilities.,"Generative Artificial Intelligence Operations, orGenAIOps(sometimes called LLMOps), describes the operational practices and strategies for managing large language models (LLMs) in production. This article provides guidance on how to advance your capabilities in GenAIOps, based on your organization's current maturity level. Use the descriptions below to find yourGenAIOps Maturity Modelranking level. 
These levels provide a general understanding and practical application level of your organization.",2025-11-18T15:37:00.000Z,concept-article,,0.2,False,"GenAIOps maturity model and stages are conceptual guidance without product-specific numeric thresholds, configs, or decision matrices.",unchanged
+https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/community-ecosystem?view=azureml-api-2,Prompt flow Ecosystem,Prompt flow ecosystem - Azure Machine Learning,,"Introduction to the prompt flow ecosystem, which includes the prompt flow open source project, tutorials, SDK, CLI and VS Code extension.","Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action:Migrate your Prompt Flow workloads toMicrosoft Agent Frameworkbefore April 20, 2027. The prompt flow ecosystem provides a comprehensive set of tutorials, tools, and resources for developers who want to leverage the power of prompt flow to exp",2026-04-21T16:56:00.000Z,concept-article,,0.1,False,"Ecosystem/marketing-style overview of tools and resources; not focused on limits, configs, or detailed patterns.",updated
+https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-connections?view=azureml-api-2,Connections,Connections in Azure Machine Learning prompt flow - Azure Machine Learning,,"Learn about how in Azure Machine Learning prompt flow, you can utilize connections to effectively manage credentials or secrets for APIs and data sources.","Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action:Migrate your Prompt Flow workloads toMicrosoft Agent Frameworkbefore April 20, 2027. 
In Azure Machine Learning prompt flow, use connections to manage credentials or secrets for APIs and data sources.",2026-04-21T16:56:00.000Z,concept-article,,0.3,False,"Conceptual explanation of connections and secret management; likely lacks detailed RBAC roles, config tables, or numeric limits.",updated
+https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-flows?view=azureml-api-2,Flows,What are flows in Azure Machine Learning prompt flow - Azure Machine Learning,,Learn about how a flow in prompt flow serves as an executable workflow that streamlines the development of your LLM-based AI application. It provides a comprehensive framework for managing data flow a,"Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action:Migrate your Prompt Flow workloads toMicrosoft Agent Frameworkbefore April 20, 2027. In Azure Machine Learning prompt flow, you can develop an LLM-based AI application by engaging in the stages of developing, testing, tuning, and deploying a ",2026-04-21T16:56:00.000Z,concept-article,,0.2,False,"Conceptual overview of flows and lifecycle; appears descriptive without expert-level configuration, limits, or troubleshooting details.",updated
+https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-llmops-maturity?view=azureml-api-2,Advance your maturity level,Advance your maturity level for GenAIOps - Azure Machine Learning,,Learn about the different stages of Generative Artificial Intelligence Operations (GenAIOps) and how to advance your organization's capabilities.,"Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. 
Recommended action:Migrate your Prompt Flow workloads toMicrosoft Agent Frameworkbefore April 20, 2027. Generative Artificial Intelligence Operations, orGenAIOps(sometimes called LLMOps), describes the operational practices and strategies for managing large lan",2026-04-21T16:56:00.000Z,concept-article,,0.1,False,"Describes GenAIOps/LLMOps maturity stages conceptually; no indication of concrete limits, configs, or product-specific error/decision matrices.",updated
https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-model-monitoring-generative-ai-evaluation-metrics?view=azureml-api-2,Evaluation monitoring metrics,Monitoring evaluation metrics descriptions and use cases (preview) - Azure Machine Learning,Apply generative AI monitoring metrics and recommended practices in Azure ML,Understand the metrics used when monitoring the performance of generative AI models deployed to production on Azure Machine Learning.,"In this article, you learn about the metrics used when monitoring and evaluating generative AI models in Azure Machine Learning, and the recommended practices for using generative AI model monitoring. Important Monitoring is currently in public preview. This preview is provided without a service-level agreement, and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. 
For more information, seeSupplemental Terms of Use for Mic",2025-08-25T08:00:00.000Z,how-to,best-practices,0.7,True,Describes specific evaluation metrics and recommended practices for monitoring generative models; these are product-specific usage guidelines and metric interpretations that qualify as best practices.,unchanged
-https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-session?view=azureml-api-2,Session,Compute session in Azure Machine Learning prompt flow - Azure Machine Learning,,"Learn about how in Azure Machine Learning prompt flow, the execution of flows is facilitated by using compute session.","In Azure Machine Learning prompt flow, you use a compute session to run flows.",2025-11-19T23:15:00.000Z,concept-article,,0.45,False,"Conceptual explanation of compute sessions; summary doesn’t indicate detailed configuration parameters, limits, or troubleshooting mappings.",unchanged
-https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-tools?view=azureml-api-2,Tools,What are tools in Azure Machine Learning prompt flow - Azure Machine Learning,,Learn about how tools are the fundamental building blocks of a flow in Azure Machine Learning prompt flow.,"Tools are the fundamental building blocks of a flow in Azure Machine Learning prompt flow. Each tool is a simple, executable unit with a specific function, allowing users to perform various tasks. 
-By combining different tools, users can create a flow that accomplishes a wide range of goals. One of the key benefit of prompt flow tools is their seamless integration with third-party APIs and python open source packages. 
-This not only improves the functionality of large language models but also make",2025-11-24T08:00:00.000Z,concept-article,,0.45,False,Conceptual explanation of tools as building blocks; integration mention is high-level without clear evidence of parameter tables or SDK config details in the summary.,unchanged
-https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-variants?view=azureml-api-2,Variants,Variants in Azure Machine Learning prompt flow - Azure Machine Learning,,"Learn about how with Azure Machine Learning prompt flow, you can use variants to tune your prompt.","With Azure Machine Learning prompt flow, you can use variants to tune your prompt. In this article, you learn about the prompt flow variants concept.",2025-11-19T23:15:00.000Z,concept-article,,0.3,False,"Conceptual article about variants for prompt tuning; likely guidance and examples but not configuration matrices, limits, or error-code troubleshooting.",unchanged
+https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-session?view=azureml-api-2,Session,Compute session in Azure Machine Learning prompt flow - Azure Machine Learning,,"Learn about how in Azure Machine Learning prompt flow, the execution of flows is facilitated by using compute session.","Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action:Migrate your Prompt Flow workloads toMicrosoft Agent Frameworkbefore April 20, 2027. 
In Azure Machine Learning prompt flow, you use a compute session to run flows.",2026-04-21T16:56:00.000Z,concept-article,,0.3,False,"Conceptual description of compute sessions; no clear indication of numeric limits, config parameter tables, or troubleshooting content.",updated
+https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-tools?view=azureml-api-2,Tools,What are tools in Azure Machine Learning prompt flow - Azure Machine Learning,,Learn about how tools are the fundamental building blocks of a flow in Azure Machine Learning prompt flow.,"Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action:Migrate your Prompt Flow workloads toMicrosoft Agent Frameworkbefore April 20, 2027. Tools are the fundamental building blocks of a flow in Azure Machine Learning prompt flow. Each tool is a simple, executable unit with a specific function, a",2026-04-21T16:56:00.000Z,concept-article,,0.25,False,"Explains what tools are in Prompt Flow; likely conceptual without specific config tables, limits, or error mappings.",updated
+https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-variants?view=azureml-api-2,Variants,Variants in Azure Machine Learning prompt flow - Azure Machine Learning,,"Learn about how with Azure Machine Learning prompt flow, you can use variants to tune your prompt.","Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action:Migrate your Prompt Flow workloads toMicrosoft Agent Frameworkbefore April 20, 2027. With Azure Machine Learning prompt flow, you can use variants to tune your prompt. 
In this article, you learn about the prompt flow variants concept.",2026-04-21T16:56:00.000Z,concept-article,,0.25,False,"Describes the concept of variants for prompt tuning; no evidence of numeric thresholds, config matrices, or product-specific troubleshooting.",updated
https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/get-started-prompt-flow?view=azureml-api-2,Get started in prompt flow,Get started with prompt flow - Azure Machine Learning,,"Learn how to set up, create, evaluate, and deploy a prompt flow in Azure Machine Learning studio.","This article walks you through the main user journey of using prompt flow in Azure Machine Learning studio. You learn how to enable prompt flow in your Azure Machine Learning workspace, create and develop a prompt flow, test and evaluate the flow, and then deploy it to production.",2025-07-17T08:00:00.000Z,how-to,,0.35,False,"Get-started walkthrough for prompt flow; primarily onboarding and basic workflow, not deep configuration, limits, or troubleshooting.",unchanged
https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-bulk-test-evaluate-flow?view=azureml-api-2,Submit batch run and evaluate a flow,Submit a batch run to evaluate a prompt flow - Azure Machine Learning,,Submit a batch run in Azure Machine Learning studio and use evaluation methods to measure how well your prompt flow performs with a large dataset.,"A batch run executes a prompt flow with a large dataset and generates outputs for each data row. To evaluate how well your prompt flow performs with a large dataset, you can submit a batch run and use evaluation methods to generate performance scores and metrics. After the batch flow completes, the evaluation methods automatically execute to calculate the scores and metrics. You can use the evaluation metrics to assess the output of your flow against your performance criteria and goals. 
This art",2026-02-28T08:00:00.000Z,how-to,,0.3,False,"How-to/tutorial style guidance on submitting batch runs and evaluating prompt flows; likely procedural without detailed limits, configuration tables, or error-code-based troubleshooting mappings.",unchanged
-https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-custom-tool-package-creation-and-usage?view=azureml-api-2,Custom tool package creation and usage,Custom tool package creation and usage in prompt flow - Azure Machine Learning,Create and use custom tool packages in Azure ML prompt flow,Learn how to develop your own tool package in prompt flow.,"When developing flows, you can not only use the built-in tools provided by prompt flow, but also develop your own custom tool. In this document, we guide you through the process of developing your own tool package, offering detailed steps and advice on how to utilize the custom tool package. After successful installation, your custom ""tool"" can show up in the tool list:",2024-11-26T08:00:00.000Z,how-to,integrations,0.75,True,"Guides development and installation of custom tool packages; includes package structure, configuration, and registration details unique to prompt flow’s tool system.",unchanged
-https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-customize-session-base-image?view=azureml-api-2,Customize base image for compute session,Customize base image for compute session in prompt flow - Azure Machine Learning,Configure custom base images for prompt flow sessions,Learn how to create a custom base image for a compute session in prompt flow with Azure Machine Learning studio.,"Before you begin, make sure you're familiar withDockerandAzure Machine Learning environments.",2026-03-19T11:04:00.000Z,how-to,configuration,0.7,True,"The page describes product-specific configuration for Azure ML prompt flow compute sessions, including how to define and use a custom Docker/base image as the session environment. 
It likely includes concrete image configuration details and environment settings unique to Azure Machine Learning, which go beyond generic Docker knowledge and qualify as expert, configuration-focused content.",unchanged
+https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-custom-tool-package-creation-and-usage?view=azureml-api-2,Custom tool package creation and usage,Custom tool package creation and usage in prompt flow - Azure Machine Learning,Create and use custom tool packages in prompt flow,Learn how to develop your own tool package in prompt flow.,"Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action:Migrate your Prompt Flow workloads toMicrosoft Agent Frameworkbefore April 20, 2027. When developing flows, you can not only use the built-in tools provided by prompt flow, but also develop your own custom tool. In this document, we guide you",2026-04-21T16:56:00.000Z,how-to,integrations,0.7,True,"How-to for developing custom tool packages likely includes concrete package structure, required files, configuration fields, and code patterns specific to prompt flow tools, which are product-specific integration details beyond generic knowledge.",updated
+https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-customize-session-base-image?view=azureml-api-2,Customize base image for compute session,Customize base image for compute session in prompt flow - Azure Machine Learning,Customize compute session base images for Prompt Flow,Learn how to create a custom base image for a compute session in prompt flow with Azure Machine Learning studio.,"Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. 
Your existing flows will continue to operate until that date. Recommended action:Migrate your Prompt Flow workloads toMicrosoft Agent Frameworkbefore April 20, 2027. Before you begin, make sure you're familiar withDockerandAzure Machine Learning environments.",2026-04-21T16:56:00.000Z,how-to,configuration,0.7,True,How-to for creating and configuring a custom Docker base image for compute sessions; likely includes specific environment settings and image configuration details unique to this product.,updated
https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-deploy-for-real-time-inference?view=azureml-api-2,Deploy a flow to online endpoint for real-time inference with UI,Deploy a flow in prompt flow as a managed online endpoint for real-time inference - Azure Machine Learning,,Learn how to deploy in prompt flow a flow as a managed online endpoint for real-time inference with Azure Machine Learning studio.,"After you build a flow and test it properly, you might want to deploy it as an endpoint so that you can invoke the endpoint for real-time inference. In this article, you'll learn how to deploy a flow as a managed online endpoint for real-time inference. The steps you'll take are: Important Items marked (preview) in this article are currently in public preview. The preview version is provided without a service level agreement, and it's not recommended for production workloads. 
Certain features mi",2026-03-23T08:00:00.000Z,how-to,,0.3,False,"Step-by-step guide to deploy a prompt flow as a managed online endpoint; likely a procedural tutorial without deployment matrices, tier-specific constraints, or detailed configuration parameter tables.",unchanged
+https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-deploy-migrated-agent-framework-workflow?view=azureml-api-2,Deploy and cut over,Deploy and operate your migrated Agent Framework workflow - Azure Machine Learning,Deploy and operate Agent Framework workflows on Azure,"Set up OpenTelemetry tracing, deploy your Microsoft Agent Framework workflow to Azure Container Apps, add a CI/CD quality gate, and cut over production traffic from Prompt Flow.","Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action:Migrate your Prompt Flow workloads toMicrosoft Agent Frameworkbefore April 20, 2027. 
This article covers the operations and cutover steps of the Prompt Flow to Microsoft Agent Framework migration: setting up tracing, deploying to Azure Contai",2026-04-21T16:56:00.000Z,how-to,deployment,0.75,True,"Covers deploying Agent Framework workflows to Azure Container Apps, setting up OpenTelemetry tracing, CI/CD quality gates, and cutover from Prompt Flow—product-specific deployment and operations guidance.",new https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-deploy-to-code?view=azureml-api-2,Deploy a flow to online endpoint for real-time inference with CLI,Deploy a flow in prompt flow to online endpoint for real-time inference with CLI - Azure Machine Learning,Deploy prompt flows to managed or Kubernetes online endpoints with CLI,Learn how to deploy your flow to a managed online endpoint or Kubernetes online endpoint in Azure Machine Learning prompt flow.,"In this article, you learn to deploy your flow to amanaged online endpointor aKubernetes online endpointfor use in real-time inferencing with Azure Machine Learning v2 CLI. Before beginning make sure that you have tested your flow properly, and feel confident that it's ready to be deployed to production. To learn more about testing your flow, seetest your flow. 
After testing your flow you learn how to create managed online endpoint and deployment, and how to use the endpoint for real-time infere",2024-08-28T16:59:00.000Z,how-to,deployment,0.75,True,"CLI-based deployment to managed and Kubernetes endpoints; likely includes endpoint/deployment YAML schemas, constraints, and environment settings specific to Azure ML.",unchanged https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-develop-an-evaluation-flow?view=azureml-api-2,Customize evaluation flow and metrics,Evaluation flow and metrics in prompt flow - Azure Machine Learning,Design and use evaluation flows and metrics in prompt flow,"Use Azure Machine Learning studio to create or customize evaluation flows and metrics, and use a batch run as a prompt flow evaluation method.","Evaluation flows are a special type of prompt flow that calculates metrics to assess how well the outputs of a run meet specific criteria and goals. You can create or customize evaluation flows and metrics tailored to your tasks and objectives, and use them to evaluate other prompt flows. This article explains evaluation flows, how to develop and customize them, and how to use them in prompt flow batch runs to evaluate flow performance.",2026-01-13T08:00:00.000Z,how-to,best-practices,0.7,True,"Explains how to create/customize evaluation flows and metrics; contains product-specific patterns and configurations for evaluating flows, which are actionable best practices.",unchanged https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-develop-flow?view=azureml-api-2,Develop a flow,Develop prompt flow - Azure Machine Learning,,Learn how to develop a prompt flow and a chat flow in Azure Machine Learning studio.,"Prompt flow is a development tool that streamlines the development cycle of AI applications that are powered by Large Language Models (LLMs). 
In this article, you learn how to create and develop a prompt flow and a chat flow in Azure Machine Learning studio. As the momentum for LLM-based AI applications grows, prompt flow provides a comprehensive solution that simplifies the process of prototyping, experimenting, iterating, and deploying AI applications. By using prompt flow, you can:",2025-09-22T08:00:00.000Z,how-to,,0.45,False,"How-to for developing flows; mostly workflow and UI/SDK usage, with no clear indication of configuration tables, limits, or troubleshooting in the summary.",unchanged @@ -517,31 +516,30 @@ https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-enab https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-enable-trace-feedback-for-deployment?view=azureml-api-2,Enable trace and collect feedback for a flow deployment,Enable trace and collect feedback for a flow deployment - Azure Machine Learning,Enable tracing and user feedback collection for prompt flow deployments,Learn how to enable trace and collect feedback during inference time of a flow deployment,"Note This feature is currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. After deploying a Generative AI APP in production, APP developers seek to enhance their understanding and optimize performance. 
Trace data for each request, aggregated metrics, ",2024-08-28T16:59:00.000Z,how-to,configuration,0.7,True,"Describes enabling trace and feedback for deployments; typically involves configuration flags, parameters, and logging settings unique to prompt flow deployments.",unchanged https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-end-to-end-azure-devops-with-prompt-flow?view=azureml-api-2,Set up end-to-end GenAIOps with prompt flow and Azure DevOps,GenAIOps with prompt flow and Azure DevOps - Azure Machine Learning,Implement GenAIOps with prompt flow and Azure DevOps pipelines,Use Azure Machine Learning to set up an end-to-end Azure DevOps and GenAIOps pipeline to run a prompt flow.,"As the demand for LLM-infused applications soars, organizations need a cohesive and streamlined process to manage the end-to-end lifecycle of these apps. Generative Artificial Intelligence Operations (GenAIOps), sometimes calledLLMOps, is a cornerstone of efficient prompt engineering and LLM-infused application development and deployment. 
This article shows how Azure Machine Learning lets you integrate with Azure DevOps to automate the LLM-infused application development lifecycle withprompt flo",2024-10-25T05:34:00.000Z,how-to,deployment,0.8,True,Similar to GitHub article but for Azure DevOps; contains Azure DevOps pipeline configuration and deployment details specific to prompt flow.,unchanged https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-end-to-end-llmops-with-prompt-flow?view=azureml-api-2,Set up end-to-end GenAIOps with prompt flow and GitHub,GenAIOps with prompt flow and GitHub - Azure Machine Learning,Implement GenAIOps with prompt flow and GitHub pipelines,Use Azure Machine Learning to set up an end-to-end GitHub and GenAIOps pipeline to run a prompt flow.,"As the demand for large language model (LLM)-infused applications soars, organizations need a cohesive and streamlined process to manage the end-to-end lifecycle of these apps. Generative Artificial Intelligence Operations (GenAIOps), sometimes calledLLMOps, is a cornerstone of efficient prompt engineering and LLM-infused application development and deployment. 
This article shows how Azure Machine Learning lets you integrate with GitHub to automate the LLM-infused application development lifecyc",2026-01-13T23:15:00.000Z,how-to,deployment,0.8,True,Shows an end-to-end GitHub-based GenAIOps pipeline for prompt flow; includes CI/CD configuration and deployment patterns specific to Azure ML and prompt flow.,unchanged -https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-evaluate-semantic-kernel?view=azureml-api-2,Evaluate your Semantic Kernel with prompt flow,Evaluate Semantic Kernel with prompt flow - Azure Machine Learning,Evaluate Semantic Kernel plugins and planners with prompt flow,Learn how to use a prompt flow to evaluate Semantic Kernel in Azure Machine Learning studio.,"This article describes the seamless integration between prompt flow andSemantic Kernel, and demonstrates how to evaluate Semantic Kernel plugins and planners by using prompt flow. In the rapidly evolving landscape of AI orchestration, a comprehensive evaluation of your plugins and planners is important for optimal performance. Semantic Kernel is an open-source SDK that lets you easily combine Foundry Tools with programming languages like C# and Python to create AI apps that combine the best of b",2024-10-16T05:31:00.000Z,how-to,integrations,0.75,True,Describes integration between Semantic Kernel and prompt flow; includes SDK usage and evaluation patterns unique to this integration.,unchanged -https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-integrate-with-langchain?view=azureml-api-2,Integrate Langchain in prompt flow,Integrate LangChain in prompt flows - Azure Machine Learning,Integrate LangChain workflows with Azure ML prompt flow,Learn how to integrate LangChain into prompt flows in Azure Machine Learning studio.,"TheLangChainPython library is a framework for developing applications powered by large language models (LLMs), agents, and dependency tools. 
This article shows you how to supercharge your LangChain development with Azure Machine Learning prompt flow. The integration of LangChain with prompt flow is a powerful combination that can help you build and test your custom language models with ease. You can use LangChain modules to build the flow, then use the prompt flow process to scale experiments fo",2024-10-25T05:34:00.000Z,how-to,integrations,0.8,True,"Shows how to combine LangChain modules with prompt flow; likely includes SDK usage, configuration parameters, and patterns specific to this integration.",unchanged +https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-evaluate-semantic-kernel?view=azureml-api-2,Evaluate your Semantic Kernel with prompt flow,Evaluate Semantic Kernel with prompt flow - Azure Machine Learning,Evaluate Semantic Kernel plugins using Prompt Flow,Learn how to use a prompt flow to evaluate Semantic Kernel in Azure Machine Learning studio.,"Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action:Migrate your Prompt Flow workloads toMicrosoft Agent Frameworkbefore April 20, 2027. 
This article describes the seamless integration between prompt flow andSemantic Kernel, and demonstrates how to evaluate Semantic Kernel plugins and planners",2026-04-21T16:56:00.000Z,how-to,integrations,0.75,True,"Describes integration between Prompt Flow and Semantic Kernel, including how to evaluate plugins and planners; contains product-specific API and configuration patterns.",updated +https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-integrate-with-langchain?view=azureml-api-2,Integrate Langchain in prompt flow,Integrate LangChain in prompt flows - Azure Machine Learning,Integrate LangChain workflows into Prompt Flow,Learn how to integrate LangChain into prompt flows in Azure Machine Learning studio.,"Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action:Migrate your Prompt Flow workloads toMicrosoft Agent Frameworkbefore April 20, 2027. TheLangChainPython library is a framework for developing applications powered by large language models (LLMs), agents, and dependency tools. 
This article sho",2026-04-21T16:56:00.000Z,how-to,integrations,0.8,True,"Shows how to integrate LangChain with Prompt Flow, including product-specific SDK usage and configuration patterns for combining the two frameworks.",updated https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-integrate-with-llm-app-devops?view=azureml-api-2,Integrate prompt flow with LLM-based application DevOps,Integrate prompt flow with DevOps for LLM-based applications - Azure Machine Learning,Integrate prompt flow with DevOps pipelines for LLM apps,Integrate prompt flow with DevOps to enhance your LLM-based application development workflows in Azure Machine Learning.,"Azure Machine Learning prompt flow is a developer-friendly and easy-to-use code-first method to develop and iterate flows for large language model (LLM)-based application development. Prompt flow provides an SDK and CLI, a Visual Studio Code extension, and a flow authoring UI. These tools facilitate local flow development, local flow run and evaluation run triggering, and transitioning flows between local and cloud workspace environments. You can combine the prompt flow experience and code capab",2024-11-01T21:59:00.000Z,how-to,deployment,0.7,True,Shows how to integrate prompt flow with DevOps workflows; includes CI/CD patterns and product-specific deployment pipeline configurations for LLM applications.,unchanged https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-manage-compute-session?view=azureml-api-2,Manage compute_session,Manage prompt flow compute session - Azure Machine Learning,Configure and manage prompt flow compute sessions in Azure ML,Learn how to manage prompt flow compute session in Azure Machine Learning studio.,"A prompt flow compute session provides the computing resources required for the application to run, including a Docker image with all necessary dependency packages. 
This reliable, scalable environment lets prompt flow efficiently execute tasks and functions, ensuring a seamless user experience.",2025-08-26T05:08:00.000Z,how-to,configuration,0.7,True,"Managing compute sessions typically involves specifying compute types, sizes, images, and lifecycle settings; this is product-specific configuration knowledge beyond generic concepts.",unchanged +https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-migrate-prompt-flow-to-agent-framework?view=azureml-api-2,Rebuild and validate your workflow,"Audit, rebuild, and validate your Prompt Flow workflow in Microsoft Agent Framework - Azure Machine Learning",Rebuild Prompt Flow workflows using Agent Framework,"Export your Prompt Flow, re-implement it with Microsoft Agent Framework WorkflowBuilder and Executor classes, and validate output parity using the Azure AI Evaluation SDK.","Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action:Migrate your Prompt Flow workloads toMicrosoft Agent Frameworkbefore April 20, 2027. 
This article walks you through the first three phases of the Prompt Flow to Microsoft Agent Framework migration: auditing your existing flow, rebuilding it i",2026-04-21T16:56:00.000Z,how-to,integrations,0.65,True,"Walks through exporting Prompt Flow definitions and re-implementing them with Agent Framework WorkflowBuilder/Executor and Azure AI Evaluation SDK; contains product-specific API/SDK usage patterns, fitting integrations & coding patterns.",new https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-monitor-generative-ai-applications?view=azureml-api-2,Monitor generative AI applications in production,Model monitoring for generative AI applications (preview) - Azure Machine Learning,Configure monitoring for Azure ML generative AI apps,Monitor the safety and quality of generative AI applications deployed to production on Azure Machine Learning.,"Monitoring models in production is an essential part of the AI lifecycle. Changes in data and consumer behavior can influence your generative AI application over time, resulting in outdated systems that negatively affect business outcomes and expose organizations to compliance, economic, and reputational risks. Important Model monitoring for generative AI applications is currently in public preview. 
These previews are provided without a service-level agreement, and are not recommended for produc",2025-08-26T05:08:00.000Z,how-to,configuration,0.65,True,"Model monitoring article typically includes Azure ML–specific configuration options, metric names, and parameter settings for safety/quality monitoring that go beyond generic monitoring concepts.",unchanged https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-process-image?view=azureml-api-2,Incorporate images into prompt flow,Incorporate images into prompt flow (preview) - Azure Machine Learning,Incorporate image inputs into Azure ML prompt flows,Learn how to incorporate images into prompt flow.,"Multimodal Large Language Models (LLMs), which can process and interpret diverse forms of data inputs, present a powerful tool that can elevate the capabilities of language-only systems to new heights. Among the various data types, images are important for many real-world applications. The incorporation of image data into AI systems provides an essential layer of visual understanding. In this article, you'll learn: Important Prompt flow image support is currently in public preview. This preview ",2024-08-28T16:59:00.000Z,how-to,integrations,0.65,True,"Describes multimodal image support in prompt flow; likely includes specific parameters, formats, and constraints for image handling that are product-specific integration details.",unchanged https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-secure-prompt-flow?view=azureml-api-2,Network isolation in prompt flow,Network isolation in prompt flow - Azure Machine Learning,Secure prompt flow with virtual network isolation in Azure ML,Learn how to secure prompt flow with virtual network.,You can secure prompt flow using private networks. 
This article explains the requirements to use prompt flow in an environment secured by private networks.,2025-07-10T22:11:00.000Z,how-to,security,0.8,True,"Describes requirements for using prompt flow in private networks; includes VNet, private endpoint, and network configuration specifics, which are product-specific security settings.",unchanged https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-tune-prompts-using-variants?view=azureml-api-2,Tune prompts using variants,Tune prompts using variants in prompt flow - Azure Machine Learning,Tune LLM prompts using variants in Azure ML prompt flow,Learn how to tune prompts using variants in prompt flow with Azure Machine Learning studio.,"Crafting a good prompt is a challenging task that requires a lot of creativity, clarity, and relevance. A good prompt can elicit the desired output from a pretrained language model, while a bad prompt can lead to inaccurate, irrelevant, or nonsensical outputs. Therefore, it's necessary to tune prompts to optimize their performance and robustness for different tasks and domains. So, we introducethe concept of variantswhich can help you test the model's behavior under different conditions, such as",2024-08-28T16:59:00.000Z,how-to,best-practices,0.7,True,Provides concrete techniques for using variants to test model behavior under different conditions; these are product-specific prompt-tuning practices and gotchas.,unchanged -https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/overview-what-is-prompt-flow?view=azureml-api-2,What is prompt flow?,What is Azure Machine Learning prompt flow - Azure Machine Learning,,Azure Machine Learning prompt flow is a development tool designed to streamline the entire development cycle of AI applications powered by Large Language Models (LLMs).,"Azure Machine Learning prompt flow is a development tool designed to streamline the entire development cycle of AI applications powered by Large Language Models (LLMs). 
Prompt flow provides a comprehensive solution that simplifies the process of prototyping, experimenting, iterating, and deploying your AI applications. With Azure Machine Learning prompt flow, you can: Azure Machine Learning prompt flow offers a versatile, intuitive way to streamline your LLM-based AI development.",2025-07-16T08:00:00.000Z,concept-article,,0.2,False,"Conceptual overview of prompt flow; describes capabilities and benefits without detailed configuration, limits, or troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/azure-open-ai-gpt-4v-tool?view=azureml-api-2,Azure OpenAI GPT-4 Turbo with Vision tool,Azure OpenAI GPT-4 Turbo with Vision tool in Azure Machine Learning prompt flow - Azure Machine Learning,Configure Azure OpenAI GPT-4 Turbo with Vision tool,The prompt flow Azure OpenAI GPT-4 Turbo with Vision tool enables you to leverage AzureOpenAI GPT-4 Turbo with Vision model deployment to analyze images and provide textual responses to questions abou,"Azure OpenAI GPT-4 Turbo with Vision tool enables you to leverage your AzureOpenAI GPT-4 Turbo with Vision model deployment to analyze images and provide textual responses to questions about them. Important Azure OpenAI GPT-4 Turbo with Vision tool is currently in public preview. This preview is provided without a service-level agreement, and is not recommended for production workloads. Certain features might not be supported or might have constrained capabilities. 
-For more information, seeSuppl",2024-08-28T16:59:00.000Z,reference,configuration,0.75,True,"Tool reference for GPT-4 Turbo with Vision will document image input formats, parameters, and Azure OpenAI deployment configuration specific to prompt flow.",unchanged -https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/content-safety-text-tool?view=azureml-api-2,Content Safety (text) tool,Content Safety (Text) tool in Azure Machine Learning prompt flow - Azure Machine Learning,Configure Content Safety text tool in prompt flow,"The Content Safety (Text) tool is a wrapper for the Azure AI Content Safety Text API, which you can use to detect text content and get moderation results.","Azure AI Content Safety is a content moderation service developed by Microsoft that helps you detect harmful content from different modalities and languages. The Content Safety (Text) tool is a wrapper for the Azure AI Content Safety Text API, which allows you to detect text content and get moderation results. For more information, seeAzure AI Content Safety.",2024-08-28T16:59:00.000Z,reference,configuration,0.7,True,"Wrapper tool for Azure AI Content Safety Text API will document specific parameters, categories, and settings required to call the API from prompt flow.",unchanged -https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/embedding-tool?view=azureml-api-2,Embedding tool,Embedding tool in Azure Machine Learning prompt flow - Azure Machine Learning,Configure embedding tool for OpenAI in prompt flow,The prompt flow embedding tool uses OpenAI's embedding models to convert text into dense vector representations for various natural language processing tasks.,"OpenAI's embedding models convert text into dense vector representations for various natural language processing tasks. 
For more information, see theOpenAI Embeddings API.",2025-11-18T15:37:00.000Z,reference,configuration,0.65,True,"Embedding tool reference should include model selection, parameter names, and constraints for using OpenAI embeddings via prompt flow.",unchanged -https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/index-lookup-tool?view=azureml-api-2,Index Lookup tool,Index lookup tool for flows in Azure Machine Learning - Azure Machine Learning,Configure Index Lookup tool for RAG in prompt flow,This article introduces the Index Lookup tool for flows in Azure Machine Learning.,"The prompt flowIndex Lookuptool enables the usage of common vector indices (such as Azure AI Search, FAISS, and Pinecone) for retrieval augmented generation (RAG) in prompt flow. The tool automatically detects the indices in the workspace and allows the selection of the index to be used in the flow. Important Index Lookup tool is currently in public preview. This preview is provided without a service-level agreement, and is not recommended for production workloads. Certain features might not be ",2025-10-06T22:10:00.000Z,reference,configuration,0.75,True,"Index Lookup tool reference will describe supported index types, connection parameters, and selection options unique to Azure ML prompt flow.",unchanged -https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/llm-tool?view=azureml-api-2,LLM tool,LLM tool in Azure Machine Learning prompt flow - Azure Machine Learning,Configure LLM tool in Azure ML prompt flow,The prompt flow LLM tool enables you to take advantage of widely used large language models like OpenAI or Azure OpenAI for natural language processing.,"The large language model (LLM) tool in prompt flow enables you to use widely used large language models likeOpenAI,Azure OpenAI in Microsoft Foundry Models, or any language model supported by theAzure AI model inference APIfor natural language processing. 
Prompt flow provides several large language model APIs: TheEmbeddingsAPI isn't available in the LLM tool. Use theembedding toolto generate embeddings with OpenAI or Azure OpenAI. Note The LLM tool in prompt flow does not support reasoning model",2025-07-24T17:12:00.000Z,reference,configuration,0.75,True,"LLM tool reference typically documents tool parameters, supported models, options, and constraints (for example unsupported embeddings, reasoning models) which are product-specific configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/open-model-llm-tool?view=azureml-api-2,Open Model LLM tool,Open Model LLM tool in Azure Machine Learning prompt flow - Azure Machine Learning,Use Open Model LLM tool in Azure ML prompt flow,The prompt flow Open Model LLM tool enables you to utilize various open-source and foundational models.,"The Open Model LLM tool enables the utilization of various Open Model and Foundational Models, such asFalconandLlama 2, for natural language processing in Azure Machine Learning prompt flow. Caution Deprecation notice:The Open Model LLM tool has been deprecated in favor of theLLM tool, which provide support for all the models supported by theAzure AI model inference APIand hence it provider greater flexibility. Here's how it looks in action on the Visual Studio Code prompt flow extension. 
In thi",2024-08-28T16:59:00.000Z,reference,configuration,0.7,True,"Open Model LLM tool reference will include configuration for selecting open-source models, endpoints, and parameters; also includes deprecation behavior specific to this product.",unchanged -https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/openai-gpt-4v-tool?view=azureml-api-2,OpenAI GPT-4V tool,OpenAI GPT-4V (preview) - Azure Machine Learning,Configure OpenAI GPT-4V tool in Azure ML prompt flow,"The prompt flow OpenAI GPT-4V tool enables you to use OpenAI's GPT-4 with vision, also referred to as GPT-4V or gpt-4-vision-preview in the API, to take images as input and answer questions about them","OpenAI GPT-4V tool enables you to use OpenAI's GPT-4 with vision, also referred to as GPT-4V or gpt-4-vision-preview in the API, to take images as input and answer questions about them. Important OpenAI GPT-4V tool is currently in public preview. This preview is provided without a service-level agreement, and is not recommended for production workloads. Certain features might not be supported or might have constrained capabilities. -For more information, seeSupplemental Terms of Use for Microsoft",2025-12-19T23:10:00.000Z,reference,configuration,0.75,True,"GPT-4V tool article will describe how to call gpt-4-vision-preview, including parameters and constraints unique to this integration.",unchanged -https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/overview?view=azureml-api-2,Overview,Overview of tools in prompt flow - Azure Machine Learning,Configure and manage tools in Azure ML prompt flow,This overview of the tools in prompt flow includes an index table for tools and the instructions for custom tool package creation and tool package usage.,This page provides an overview of the tools that are available in prompt flow. 
It also offers instructions on how to create your own custom tool and how to install custom tools.,2024-08-30T17:04:00.000Z,reference,configuration,0.7,True,"Tools overview with index table and custom tool package instructions likely lists tool names, parameters, and usage constraints specific to prompt flow.",unchanged -https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/prompt-tool?view=azureml-api-2,Prompt tool,Prompt tool in Azure Machine Learning prompt flow - Azure Machine Learning,Use and configure prompt templates in prompt flow,The prompt tool in prompt flow offers a collection of textual templates that serve as a starting point for creating prompts.,"The prompt tool in prompt flow offers a collection of textual templates that serve as a starting point for creating prompts. These templates, based on the Jinja2 template engine, facilitate the definition of prompts. The tool proves useful when prompt tuning is required prior to feeding the prompts into the large language model in prompt flow.",2024-08-28T16:59:00.000Z,reference,configuration,0.7,True,"Prompt tool reference is expected to list template parameters, Jinja2 variables, and configuration fields unique to this tool.",unchanged -https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/python-tool?view=azureml-api-2,Python tool,Python tool in Azure Machine Learning prompt flow - Azure Machine Learning,Create and configure Python tools in prompt flow,The Python tool empowers you to offer customized code snippets as self-contained executable nodes in prompt flow.,"The Python tool enables you to create customized code snippets as self-contained executable nodes in prompt flow. 
You can easily create Python tools, edit code, and verify results.",2025-07-24T17:12:00.000Z,reference,configuration,0.7,True,"Python tool article will describe how to define tool inputs/outputs, environment, and execution settings—configuration details specific to Azure ML prompt flow.",unchanged -https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/rerank-tool?view=azureml-api-2,Rerank tool,Rerank tool in Azure Machine Learning prompt flow - Azure Machine Learning,Configure Rerank tool for RAG in prompt flow,The prompt flow rerank tool enables you to rerank documents based on the relevancy to a given query.,"The prompt flow Rerank tool improves search quality of relevant documents given a query for retrieval-augmented generation (RAG) in prompt flow. This tool works best with Index Lookup tool as a ranker after the initial retrieval. Important Rerank tool is currently in public preview. This preview is provided without a service-level agreement, and is not recommended for production workloads. Certain features might not be supported or might have constrained capabilities. -For more information, see Suppl",2025-11-18T15:37:00.000Z,reference,configuration,0.7,True,"Rerank tool article will specify inputs, outputs, and configuration parameters for ranking documents, which are product-specific.",unchanged -https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/serp-api-tool?view=azureml-api-2,SERP API tool,SerpAPI tool in Azure Machine Learning prompt flow - Azure Machine Learning,Configure SerpAPI search tool in Azure ML prompt flow,SerpAPI is a Python tool that provides a wrapper to the SerpAPI Google Search Engine Results API and the SerpAPI Bing Search Engine Results API.,"SerpAPI is a Python tool that provides a wrapper to the SerpAPI Google Search Engine Results API and the SerpAPI Bing Search Engine Results API. 
You can use the tool to retrieve search results from many different search engines, including Google and Bing. You can also specify a range of search parameters, such as the search query, location, and device type.",2024-08-28T16:59:00.000Z,reference,configuration,0.7,True,"SerpAPI tool reference will list supported search parameters, required keys, and configuration fields for integrating SerpAPI with prompt flow.",unchanged -https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/transparency-note?view=azureml-api-2,Transparency note,Transparency Note for auto-generate prompt variants in prompt flow - Azure Machine Learning,,Learn about the feature in prompt flow that automatically generates variations of a base prompt with the help of language models.,,2025-11-24T08:00:00.000Z,concept-article,,0.2,False,"Transparency note describes behavior and limitations of a feature at a high level; generally policy/behavioral, not detailed config, limits, or troubleshooting.",unchanged +https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/migrate-prompt-flow-to-agent-framework?view=azureml-api-2,Migration overview,Migrate from Prompt Flow to Microsoft Agent Framework - Azure Machine Learning,Plan migration from Prompt Flow to Agent Framework,"Learn how to migrate your Prompt Flow workflows to Microsoft Agent Framework with a structured plan covering audit, rebuild, validation, operations, and cutover.","Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action: Migrate your Prompt Flow workloads to Microsoft Agent Framework before April 20, 2027. Prompt Flow is a development tool that streamlines the entire development cycle of AI applications powered by large language models (LLMs). 
Prompt Flow provi",2026-04-21T16:56:00.000Z,overview,decision-making,0.7,True,"Provides structured migration guidance from Prompt Flow to Microsoft Agent Framework with phased recommendations and scenario-based guidance, which is product- and migration-specific decision-making content.",new +https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/overview-what-is-prompt-flow?view=azureml-api-2,What is prompt flow?,What is Azure Machine Learning prompt flow - Azure Machine Learning,,Azure Machine Learning prompt flow is a development tool designed to streamline the entire development cycle of AI applications powered by Large Language Models (LLMs).,"Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action: Migrate your Prompt Flow workloads to Microsoft Agent Framework before April 20, 2027. Azure Machine Learning prompt flow is a development tool designed to streamline the entire development cycle of AI applications powered by Large Language Mod",2026-04-21T16:56:00.000Z,concept-article,,0.2,False,"High-level overview of Prompt Flow; no detailed limits, configs, error codes, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/azure-open-ai-gpt-4v-tool?view=azureml-api-2,Azure OpenAI GPT-4 Turbo with Vision tool,Azure OpenAI GPT-4 Turbo with Vision tool in Azure Machine Learning prompt flow - Azure Machine Learning,Configure Azure OpenAI GPT-4 Turbo with Vision tool in prompt flow,The prompt flow Azure OpenAI GPT-4 Turbo with Vision tool enables you to leverage Azure OpenAI GPT-4 Turbo with Vision model deployment to analyze images and provide textual responses to questions abou,"Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. 
On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action: Migrate your Prompt Flow workloads to Microsoft Agent Framework before April 20, 2027. Azure OpenAI GPT-4 Turbo with Vision tool enables you to leverage your Azure OpenAI GPT-4 Turbo with Vision model deployment to analyze images and provide tex",2026-04-21T16:56:00.000Z,reference,integrations,0.75,True,"Tool for Azure OpenAI GPT-4 Turbo with Vision will document image input handling, model deployment binding, and tool parameters unique to prompt flow, which are expert integration/configuration details.",updated +https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/content-safety-text-tool?view=azureml-api-2,Content Safety (text) tool,Content Safety (Text) tool in Azure Machine Learning prompt flow - Azure Machine Learning,Use Content Safety text tool in prompt flow,"The Content Safety (Text) tool is a wrapper for the Azure AI Content Safety Text API, which you can use to detect text content and get moderation results.","Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action: Migrate your Prompt Flow workloads to Microsoft Agent Framework before April 20, 2027. 
Azure AI Content Safety is a content moderation service developed by Microsoft that helps you detect harmful content from different modalities and languages.",2026-04-21T16:56:00.000Z,reference,integrations,0.7,True,"Wrapper around Azure AI Content Safety Text API; such a tool page typically lists configuration parameters (endpoint, keys, categories, severity levels) and usage patterns specific to prompt flow, fitting integrations.",updated +https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/embedding-tool?view=azureml-api-2,Embedding tool,Embedding tool in Azure Machine Learning prompt flow - Azure Machine Learning,Configure embedding tool for OpenAI models in prompt flow,The prompt flow embedding tool uses OpenAI's embedding models to convert text into dense vector representations for various natural language processing tasks.,"Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action: Migrate your Prompt Flow workloads to Microsoft Agent Framework before April 20, 2027. OpenAI's embedding models convert text into dense vector representations for various natural language processing tasks. 
For more information, see the OpenAI E",2026-04-21T16:56:00.000Z,reference,integrations,0.7,True,"Embedding tool reference likely documents tool-specific parameters (model name, input limits, output schema) and how to wire it into flows, which are concrete integration/configuration details.",updated +https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/index-lookup-tool?view=azureml-api-2,Index Lookup tool,Index lookup tool for flows in Azure Machine Learning - Azure Machine Learning,Configure Index Lookup tool for RAG in prompt flow,This article introduces the Index Lookup tool for flows in Azure Machine Learning.,"Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action: Migrate your Prompt Flow workloads to Microsoft Agent Framework before April 20, 2027. The prompt flow Index Lookup tool enables the usage of common vector indices (such as Azure AI Search, FAISS, and Pinecone) for retrieval augmented generation ",2026-04-21T16:56:00.000Z,reference,integrations,0.75,True,"Index Lookup tool for Azure AI Search, FAISS, Pinecone will have tool parameters (index type, connection settings, query fields) and wiring patterns unique to prompt flow, which are product-specific integration details.",updated +https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/llm-tool?view=azureml-api-2,LLM tool,LLM tool in Azure Machine Learning prompt flow - Azure Machine Learning,Configure and use LLM tool in prompt flow,The prompt flow LLM tool enables you to take advantage of widely used large language models like OpenAI or Azure OpenAI for natural language processing.,"Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. 
On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action: Migrate your Prompt Flow workloads to Microsoft Agent Framework before April 20, 2027. The large language model (LLM) tool in prompt flow enables you to use widely used large language models like OpenAI, Azure OpenAI in Microsoft Foundry Models, ",2026-04-21T16:56:00.000Z,reference,integrations,0.7,True,"Reference for the LLM tool that connects to OpenAI/Azure OpenAI likely documents tool parameters, model selection fields, and request options unique to prompt flow, matching integration/config-style expert details.",updated +https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/open-model-llm-tool?view=azureml-api-2,Open Model LLM tool,Open Model LLM tool in Azure Machine Learning prompt flow - Azure Machine Learning,Use Open Model LLM tool with foundational models in prompt flow,The prompt flow Open Model LLM tool enables you to utilize various open-source and foundational models.,"Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action: Migrate your Prompt Flow workloads to Microsoft Agent Framework before April 20, 2027. 
The Open Model LLM tool enables the utilization of various Open Model and Foundational Models, such as Falcon and Llama 2, for natural language processing in Az",2026-04-21T16:56:00.000Z,reference,integrations,0.7,True,"Open Model LLM tool reference will specify how to configure different open-source models (Falcon, Llama 2) within prompt flow, including parameters and deployment bindings, which are integration-specific details.",updated +https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/openai-gpt-4v-tool?view=azureml-api-2,OpenAI GPT-4V tool,OpenAI GPT-4V (preview) - Azure Machine Learning,Use OpenAI GPT-4V vision tool in prompt flow,"The prompt flow OpenAI GPT-4V tool enables you to use OpenAI's GPT-4 with vision, also referred to as GPT-4V or gpt-4-vision-preview in the API, to take images as input and answer questions about them","Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action: Migrate your Prompt Flow workloads to Microsoft Agent Framework before April 20, 2027. 
OpenAI GPT-4V tool enables you to use OpenAI's GPT-4 with vision, also referred to as GPT-4V or gpt-4-vision-preview in the API, to take images as input and ",2026-04-21T16:56:00.000Z,reference,integrations,0.75,True,"OpenAI GPT-4V tool reference will include configuration for image inputs, model name (gpt-4-vision-preview), and tool parameters specific to prompt flow, matching the integrations sub-skill.",updated +https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/overview?view=azureml-api-2,Overview,Overview of tools in prompt flow - Azure Machine Learning,,This overview of the tools in prompt flow includes an index table for tools and the instructions for custom tool package creation and tool package usage.,"Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action: Migrate your Prompt Flow workloads to Microsoft Agent Framework before April 20, 2027. This page provides an overview of the tools that are available in prompt flow. It also offers instructions on how to create your own custom tool and how to i",2026-04-21T16:56:00.000Z,reference,,0.3,False,"Overview/index page for tools; summary suggests navigation plus high-level instructions, not detailed configuration tables or error mappings.",updated +https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/prompt-tool?view=azureml-api-2,Prompt tool,Prompt tool in Azure Machine Learning prompt flow - Azure Machine Learning,Use prompt templates with the prompt tool in prompt flow,The prompt tool in prompt flow offers a collection of textual templates that serve as a starting point for creating prompts.,"Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. 
On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action: Migrate your Prompt Flow workloads to Microsoft Agent Framework before April 20, 2027. The prompt tool in prompt flow offers a collection of textual templates that serve as a starting point for creating prompts. These templates, based on the Ji",2026-04-21T16:56:00.000Z,reference,integrations,0.65,True,"Describes a specific tool with Jinja2 template usage; such tool references typically include parameter names, template fields, and usage patterns specific to prompt flow, which are product-specific coding/config patterns.",updated +https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/python-tool?view=azureml-api-2,Python tool,Python tool in Azure Machine Learning prompt flow - Azure Machine Learning,Build and configure Python tools in prompt flow,The Python tool empowers you to offer customized code snippets as self-contained executable nodes in prompt flow.,"Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action: Migrate your Prompt Flow workloads to Microsoft Agent Framework before April 20, 2027. The Python tool enables you to create customized code snippets as self-contained executable nodes in prompt flow. 
You can easily create Python tools, edit co",2026-04-21T16:56:00.000Z,reference,integrations,0.75,True,"Python tool reference for self-contained executable nodes will include function signatures, input/output schema, environment constraints, and configuration fields unique to prompt flow, which are expert integration details.",updated +https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/rerank-tool?view=azureml-api-2,Rerank tool,Rerank tool in Azure Machine Learning prompt flow - Azure Machine Learning,Use rerank tool to improve RAG search in prompt flow,The prompt flow rerank tool enables you to rerank documents based on the relevancy to a given query.,"Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action: Migrate your Prompt Flow workloads to Microsoft Agent Framework before April 20, 2027. The prompt flow Rerank tool improves search quality of relevant documents given a query for retrieval-augmented generation (RAG) in prompt flow. This tool work",2026-04-21T16:56:00.000Z,reference,integrations,0.7,True,"Rerank tool reference will describe parameters like top_k, scoring fields, and input/output formats specific to prompt flow tools, which are concrete integration/coding patterns.",updated +https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/serp-api-tool?view=azureml-api-2,SERP API tool,SerpAPI tool in Azure Machine Learning prompt flow - Azure Machine Learning,Integrate SerpAPI search results with prompt flow,SerpAPI is a Python tool that provides a wrapper to the SerpAPI Google Search Engine Results API and the SerpAPI Bing Search Engine Results API.,"Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. 
On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action: Migrate your Prompt Flow workloads to Microsoft Agent Framework before April 20, 2027. SerpAPI is a Python tool that provides a wrapper to the SerpAPI Google Search Engine Results API and the SerpAPI Bing Search Engine Results API. You can use the",2026-04-21T16:56:00.000Z,reference,integrations,0.75,True,"SerpAPI tool wraps external search APIs; documentation will include configuration fields (API key, engine, query params) and tool-specific options, which are detailed integration settings.",updated +https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/transparency-note?view=azureml-api-2,Transparency note,Transparency Note for auto-generate prompt variants in prompt flow - Azure Machine Learning,,Learn about the feature in prompt flow that automatically generates variations of a base prompt with the help of language models.,"Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. 
Recommended action: Migrate your Prompt Flow workloads to Microsoft Agent Framework before April 20, 2027.",2025-11-24T08:00:00.000Z,concept-article,,0.2,False,"Transparency note describes behavior and limitations of a feature at a high level; generally policy/behavioral, not detailed config, limits, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/troubleshoot-guidance?view=azureml-api-2,Troubleshoot Guidance,Troubleshoot guidance - Azure Machine Learning,Troubleshoot Azure ML prompt flow issues,This article addresses frequent questions about prompt flow usage.,This article addresses frequent questions about prompt flow usage.,2024-08-28T16:59:00.000Z,troubleshooting,troubleshooting,0.8,True,Same as index 2; troubleshooting guidance for prompt flow with likely symptom-to-solution mappings and product-specific error behaviors.,unchanged https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/troubleshoot-guidance?view=azureml-api-2,Troubleshoot prompt flow,Troubleshoot guidance - Azure Machine Learning,Troubleshoot Azure ML prompt flow issues,This article addresses frequent questions about prompt flow usage.,This article addresses frequent questions about prompt flow usage.,2024-08-28T16:59:00.000Z,troubleshooting,troubleshooting,0.8,True,"Explicit troubleshooting article for prompt flow; such pages usually map common symptoms and errors to causes and fixes, including product-specific behaviors.",unchanged https://learn.microsoft.com/en-us/azure/machine-learning/quickstart-create-resources?view=azureml-api-2,Create ML resources to get started,Tutorial: Create workspace resources - Azure Machine Learning,,Create an Azure Machine Learning workspace and cloud resources that can be used to train machine learning models.,"In this tutorial, you create the resources you need to start working with Azure Machine Learning. In this tutorial, you create your resources in Azure Machine Learning studio. 
You can also create a workspace using the Azure portal or SDK, the CLI, Azure PowerShell, or the Visual Studio Code extension. For other ways to create a compute instance, see Create a compute instance. This video shows you how to create a workspace and compute instance in Azure Machine Learning studio. The steps are also descri",2025-08-08T22:08:00.000Z,tutorial,,0.3,False,"Step-by-step tutorial to create workspace and compute; no config parameter tables, limits, or tier matrices.",unchanged @@ -633,7 +631,7 @@ You can find the schemas for older extension versions athttps://azuremlschemasprod.azureedge.net/. https://learn.microsoft.com/en-us/azure/machine-learning/reference-yaml-deployment-managed-online?view=azureml-api-2,Managed online (real-time),CLI (v2) managed online deployment YAML schema - Azure Machine Learning,Configure managed online deployments via YAML,Reference documentation for the CLI (v2) managed online deployment YAML schema.,"APPLIES TO: Azure CLI ml extension v2 (current) The source JSON schema can be found at https://azuremlschemas.azureedge.net/latest/managedOnlineDeployment.schema.json. Note The YAML syntax detailed in this document is based on the JSON schema for the latest version of the ML CLI v2 extension. This syntax is guaranteed only to work with the latest version of the ML CLI v2 extension. 
You can find the schemas for older extension versions at https://azuremlschemasprod.azureedge.net/.",2024-08-28T16:59:00.000Z,reference,configuration,0.92,True,Managed online deployment YAML schema reference listing deployment-specific configuration keys and constraints.,unchanged https://learn.microsoft.com/en-us/azure/machine-learning/reference-yaml-deployment-template?view=azureml-api-2,Deployment template,CLI (v2) deployment template YAML schema - Azure Machine Learning,Configure Azure ML deployment template YAML schema,Reference documentation for the CLI (v2) deployment template YAML schema.,"APPLIES TO: Azure CLI ml extension v2 (current) The source JSON schema can be found at https://azuremlschemas.azureedge.net/latest/deploymentTemplate.schema.json. Note The YAML syntax detailed in this document is based on the JSON schema for the latest version of the ML CLI v2 extension. This syntax is guaranteed only to work with the latest version of the ML CLI v2 extension. 
-You can find the schemas for older extension versions at https://azuremlschemasprod.azureedge.net/.",2026-04-15T22:09:00.000Z,reference,configuration,0.86,True,"This is a reference page for the Azure ML CLI v2 deployment template YAML schema, which enumerates specific fields, their allowed values, structures, and constraints. That constitutes detailed configuration parameters unique to this product, matching the configuration sub-skill definition.",updated +https://learn.microsoft.com/en-us/azure/machine-learning/reference-yaml-deployment-template?view=azureml-api-2,Deployment template,CLI (v2) deployment template YAML schema - Azure Machine Learning,Configure Azure ML deployment template YAML schema,Reference documentation for the CLI (v2) deployment template YAML schema.,"APPLIES TO: Azure CLI ml extension v2 (current) The source JSON schema can be found at https://azuremlschemas.azureedge.net/latest/deploymentTemplate.schema.json. Note The YAML syntax detailed in this document is based on the JSON schema for the latest version of the ML CLI v2 extension. This syntax is guaranteed only to work with the latest version of the ML CLI v2 extension. 
You can find the schemas for older extension versions at https://azuremlschemasprod.azureedge.net/.",2026-04-15T22:09:00.000Z,reference,configuration,0.86,True,"This is a reference page for the Azure ML CLI v2 deployment template YAML schema, which enumerates specific fields, their allowed values, structures, and constraints. 
That constitutes detailed configuration parameters unique to this product, matching the configuration sub-skill definition.",unchanged https://learn.microsoft.com/en-us/azure/machine-learning/reference-yaml-endpoint-batch?view=azureml-api-2,Batch,CLI (v2) batch endpoint YAML schema - Azure Machine Learning,Author batch endpoint YAML for Azure ML CLI v2,Reference documentation for the CLI (v2) batch endpoint YAML schema.,"APPLIES TO: Azure CLI ml extension v2 (current) The source JSON schema can be found at https://azuremlschemas.azureedge.net/latest/batchEndpoint.schema.json. Note The YAML syntax detailed in this document is based on the JSON schema for the latest version of the ML CLI v2 extension. This syntax is guaranteed only to work with the latest version of the ML CLI v2 extension. You can find the schemas for older extension versions at https://azuremlschemasprod.azureedge.net/.",2024-08-28T16:59:00.000Z,reference,configuration,0.92,True,Batch endpoint YAML schema reference with detailed configuration parameters and structure.,unchanged https://learn.microsoft.com/en-us/azure/machine-learning/reference-yaml-endpoint-online?view=azureml-api-2,Online (real-time),Online endpoints YAML reference - Azure Machine Learning,Configure Azure ML online endpoints with YAML,Learn about the YAML files used to deploy models as online endpoints,"APPLIES TO: Azure CLI ml extension v2 (current) The source JSON schema can be found at https://azuremlschemas.azureedge.net/latest/managedOnlineEndpoint.schema.json for managed online endpoint, and at https://azuremlschemas.azureedge.net/latest/kubernetesOnlineEndpoint.schema.json for Kubernetes online endpoint. The differences between managed online endpoint and Kubernetes online endpoint are described in the table of properties in this article. 
Sample in this article focuses on managed online endpoi",2024-08-28T16:59:00.000Z,how-to,configuration,0.92,True,Online endpoint YAML reference including property tables and differences between managed and Kubernetes endpoints.,unchanged @@ -663,7 +661,7 @@ You can find the schemas for older extension versions athttps://azuremlschemaspr https://learn.microsoft.com/en-us/azure/machine-learning/reference-yaml-mltable?view=azureml-api-2,MLTable,CLI (v2) MLtable YAML schema - Azure Machine Learning,Reference schema for Azure ML MLTable YAML,Reference documentation for the CLI (v2) MLTable YAML schema.,"APPLIES TO: Azure CLI ml extension v2 (current) You can find the source JSON schema at https://azuremlschemas.azureedge.net/latest/MLTable.schema.json. Note The YAML syntax detailed in this document is based on the JSON schema for the latest version of the ML CLI v2 extension. This syntax is guaranteed only to work with the latest version of the ML CLI v2 extension. You can find the schemas for older extension versions at https://azuremlschemasprod.azureedge.net/.",2024-08-28T16:59:00.000Z,reference,configuration,0.9,True,"MLTable YAML schema reference; defines fields and constraints for MLTable assets, which is highly product-specific configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/machine-learning/reference-yaml-model?view=azureml-api-2,Model,CLI (v2) model YAML schema - Azure Machine Learning,Define Azure ML models with CLI v2 YAML schema,Reference documentation for the CLI (v2) model YAML schema.,"APPLIES TO: Azure CLI ml extension v2 (current) The source JSON schema can be found at https://azuremlschemas.azureedge.net/latest/model.schema.json. Note The YAML syntax detailed in this document is based on the JSON schema for the latest version of the ML CLI v2 extension. This syntax is guaranteed only to work with the latest version of the ML CLI v2 extension. 
-You can find the schemas for older extension versions at https://azuremlschemasprod.azureedge.net/.",2026-04-15T22:09:00.000Z,reference,configuration,0.86,True,"This is a reference page for the Azure Machine Learning CLI v2 model YAML schema. It enumerates specific schema fields, their names, allowed values, types, and sometimes defaults, which are product-specific configuration parameters not generally known from training. This matches the configuration sub-skill definition that focuses on detailed config options and parameter references.",updated +You can find the schemas for older extension versions at https://azuremlschemasprod.azureedge.net/.",2026-04-15T22:09:00.000Z,reference,configuration,0.86,True,"This is a reference page for the Azure Machine Learning CLI v2 model YAML schema. It enumerates specific schema fields, their names, allowed values, types, and sometimes defaults, which are product-specific configuration parameters not generally known from training. This matches the configuration sub-skill definition that focuses on detailed config options and parameter references.",unchanged https://learn.microsoft.com/en-us/azure/machine-learning/reference-yaml-monitor?view=azureml-api-2,Monitor,CLI (v2) schedule YAML schema for model monitoring - Azure Machine Learning,Create model monitoring schedules with Azure ML YAML,Reference documentation for the CLI (v2) schedule YAML schema for model monitoring.,"APPLIES TO: Azure CLI ml extension v2 (current) The YAML syntax detailed in this document is based on the JSON schema for the latest version of the ML CLI v2 extension. This syntax is guaranteed only to work with the latest version of the ML CLI v2 extension. The comprehensive JSON schema can be viewed at https://azuremlschemas.azureedge.net/latest/monitorSchedule.schema.json. 
You can find the schemas for older extension versions at https://azuremlschemasprod.azureedge.net/.",2026-01-30T18:18:00.000Z,reference,configuration,0.9,True,Monitor schedule YAML schema reference with product-specific fields and configuration patterns.,unchanged https://learn.microsoft.com/en-us/azure/machine-learning/reference-yaml-overview?view=azureml-api-2,Overview,CLI (v2) YAML schema overview - Azure Machine Learning,Navigate Azure ML CLI v2 YAML schema references,Overview and index of CLI (v2) YAML schemas.,"APPLIES TO: Azure CLI ml extension v2 (current) The Azure Machine Learning CLI (v2), an extension to the Azure CLI, often uses and sometimes requires YAML files with specific schemas. This article lists reference docs and the source schema for YAML files. Examples are included inline in individual articles.",2024-08-28T16:59:00.000Z,reference,configuration,0.7,True,"Index of YAML schemas; links to detailed schema docs and explains usage patterns, which are specific to Azure ML CLI configuration.",unchanged @@ -698,9 +696,9 @@ https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-create-secure- https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-deploy-model?view=azureml-api-2,Deploy a model,Tutorial: Deploy a model - Azure Machine Learning,,This tutorial covers how to deploy a model to production using Azure Machine Learning Python SDK v2.,"APPLIES TO: Python SDK azure-ai-ml v2 (current) Learn to deploy a model to an online endpoint using Azure Machine Learning Python SDK v2. In this tutorial, you deploy and use a model that predicts the likelihood of a customer defaulting on a credit card payment. The steps you take are: This video shows how to get started in Azure Machine Learning studio so you can follow the steps in the tutorial. The video shows how to create a notebook, create a compute instance, and clone the notebook. 
The step",2025-09-10T08:00:00.000Z,tutorial,,0.4,False,Model deployment tutorial; likely shows basic endpoint creation but not deployment matrices or tier-specific constraints.,unchanged https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-designer-automobile-price-deploy?view=azureml-api-1,2. Deploy the model,Use Designer to Deploy No-Code Models - Azure Machine Learning,,Learn how to deploy a machine learning model to predict car prices with the Azure Machine Learning designer.,"Important This article provides information on using the Azure Machine Learning SDK v1. SDK v1 is deprecated as of March 31, 2025. Support for it will end on June 30, 2026. You can install and use SDK v1 until that date. Your existing workflows using SDK v1 will continue to operate after the end-of-support date. However, they could be exposed to security risks or breaking changes in the event of architectural changes in the product. We recommend that you transition to the SDK v2 before June 30, ",2025-06-09T08:00:00.000Z,tutorial,,0.3,False,Designer deployment tutorial; procedural deployment steps without tier matrices or detailed constraints.,unchanged https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-designer-automobile-price-train-score?view=azureml-api-1,1. Train a regression model,Train a No-Code Regression Model in Designer - Azure Machine Learning,,Train a regression model that predicts car prices using the Azure Machine Learning designer.,"Important This article provides information on using the Azure Machine Learning SDK v1. SDK v1 is deprecated as of March 31, 2025. Support for it will end on June 30, 2026. You can install and use SDK v1 until that date. Your existing workflows using SDK v1 will continue to operate after the end-of-support date. However, they could be exposed to security risks or breaking changes in the event of architectural changes in the product. 
We recommend that you transition to the SDK v2 before June 30, ",2025-06-11T05:27:00.000Z,tutorial,,0.3,False,"Designer regression tutorial; step-by-step modeling, not configuration/limits/troubleshooting content.",unchanged -https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-develop-feature-set-with-custom-source?view=azureml-api-2,Develop a feature set with a custom source,Tutorial 5: Develop a feature set with a custom source - Azure Machine Learning managed feature store - basics,,This is part 5 of the managed feature store tutorial series,"An Azure Machine Learning managed feature store lets you discover, create, and operationalize features. Features serve as the connective tissue in the machine learning lifecycle, starting from the prototyping phase, where you experiment with various features. That lifecycle continues to the operationalization phase, where you deploy your models, and inference steps look up the feature data. For more information about feature stores, visit the feature store concepts resource. Important Azure Cache ",2026-04-13T17:17:00.000Z,tutorial,,0.2,False,"Tutorial on developing a feature set with a custom source; focuses on conceptual and procedural guidance rather than detailed configuration parameters, limits, or error-code-based troubleshooting.",updated -https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-enable-recurrent-materialization-run-batch-inference?view=azureml-api-2,Enable recurrent materialization and run batch inference,Tutorial 3: Enable recurrent materialization and run batch inference - Azure Machine Learning managed feature store - basics,,This is part of a tutorial series on managed feature store.,"This tutorial series shows how features seamlessly integrate all phases of the machine learning lifecycle: prototyping, training, and operationalization. Important Azure Cache for Redis announced its retirement timeline for all SKUs. 
We recommend moving your existing Azure Cache for Redis instances to Azure Managed Redis as soon as you can. Migration guidance: For more details about the retirement: The first tutorial showed how to create a feature set specification with custom transformations. I",2026-04-13T17:17:00.000Z,tutorial,,0.2,False,"Tutorial on enabling recurrent materialization and running batch inference; appears to be step-by-step usage guidance without explicit limits, configuration reference tables, or structured troubleshooting/decision content.",updated -https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-experiment-train-models-using-features?view=azureml-api-2,Experiment and train models using features,Tutorial 2: Experiment and train models by using features - Azure Machine Learning managed feature store - basics,,This is part of a tutorial series about managed feature store.,"This tutorial series shows how features seamlessly integrate all phases of the machine learning lifecycle: prototyping, training, and operationalization. Important Azure Cache for Redis announced its retirement timeline for all SKUs. We recommend moving your existing Azure Cache for Redis instances to Azure Managed Redis as soon as you can. Migration guidance: For more details about the retirement: The first tutorial showed how to create a feature set specification with custom transformations. 
T",2026-04-13T17:17:00.000Z,tutorial,,0.2,False,"Tutorial-style walkthrough for experimenting and training models with managed feature store; no clear tables of limits, config matrices, error-code mappings, or product-specific best-practice/decision content beyond generic tutorial steps.",updated +https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-develop-feature-set-with-custom-source?view=azureml-api-2,Develop a feature set with a custom source,Tutorial 5: Develop a feature set with a custom source - Azure Machine Learning managed feature store - basics,,This is part 5 of the managed feature store tutorial series,"An Azure Machine Learning managed feature store lets you discover, create, and operationalize features. Features serve as the connective tissue in the machine learning lifecycle, starting from the prototyping phase, where you experiment with various features. That lifecycle continues to the operationalization phase, where you deploy your models, and inference steps look up the feature data. For more information about feature stores, visit the feature store concepts resource. Important Azure Cache ",2026-04-13T17:17:00.000Z,tutorial,,0.2,False,"Tutorial on developing a feature set with a custom source; focuses on conceptual and procedural guidance rather than detailed configuration parameters, limits, or error-code-based troubleshooting.",unchanged +https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-enable-recurrent-materialization-run-batch-inference?view=azureml-api-2,Enable recurrent materialization and run batch inference,Tutorial 3: Enable recurrent materialization and run batch inference - Azure Machine Learning managed feature store - basics,,This is part of a tutorial series on managed feature store.,"This tutorial series shows how features seamlessly integrate all phases of the machine learning lifecycle: prototyping, training, and operationalization. 
Important Azure Cache for Redis announced its retirement timeline for all SKUs. We recommend moving your existing Azure Cache for Redis instances to Azure Managed Redis as soon as you can. Migration guidance: For more details about the retirement: The first tutorial showed how to create a feature set specification with custom transformations. I",2026-04-13T17:17:00.000Z,tutorial,,0.2,False,"Tutorial on enabling recurrent materialization and running batch inference; appears to be step-by-step usage guidance without explicit limits, configuration reference tables, or structured troubleshooting/decision content.",unchanged +https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-experiment-train-models-using-features?view=azureml-api-2,Experiment and train models using features,Tutorial 2: Experiment and train models by using features - Azure Machine Learning managed feature store - basics,,This is part of a tutorial series about managed feature store.,"This tutorial series shows how features seamlessly integrate all phases of the machine learning lifecycle: prototyping, training, and operationalization. Important Azure Cache for Redis announced its retirement timeline for all SKUs. We recommend moving your existing Azure Cache for Redis instances to Azure Managed Redis as soon as you can. Migration guidance: For more details about the retirement: The first tutorial showed how to create a feature set specification with custom transformations. 
T",2026-04-13T17:17:00.000Z,tutorial,,0.2,False,"Tutorial-style walkthrough for experimenting and training models with managed feature store; no clear tables of limits, config matrices, error-code mappings, or product-specific best-practice/decision content beyond generic tutorial steps.",unchanged https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-explore-data?view=azureml-api-2,Prepare and explore data,"Tutorial: Upload, access, and explore your data - Azure Machine Learning",,"Upload data to cloud storage, create an Azure Machine Learning data asset, create new versions for data assets, and use the data for interactive development.","APPLIES TO: Python SDK azure-ai-ml v2 (current) In this tutorial, you: A machine learning project typically starts with exploratory data analysis (EDA), data preprocessing (cleaning, feature engineering), and building machine learning model prototypes to validate hypotheses. This prototyping project phase is highly interactive and lends itself to development in an IDE or a Jupyter notebook with a Python interactive console. This tutorial describes these concepts.",2025-08-05T22:08:00.000Z,tutorial,,0.3,False,Tutorial on uploading and exploring data; procedural steps without detailed configuration tables or limits.,unchanged https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-first-experiment-automated-ml?view=azureml-api-2,Create automated ML experiments,Tutorial: AutoML- train no-code classification models - Azure Machine Learning,,"In this tutorial, train a classification model without writing a single line of code using Azure Machine Learning Automated ML in the studio UI.","In this tutorial, you learn how to train a classification model with no-code automated machine learning (AutoML) using Azure Machine Learning in the Azure Machine Learning studio. This classification model predicts whether a client subscribes to a fixed term deposit with a financial institution. 
With Automated ML, you can automate away time intensive tasks. Automated machine learning rapidly iterates over many combinations of algorithms and hyperparameters to help you find the best model based o",2025-10-15T08:00:00.000Z,tutorial,,0.3,False,No-code AutoML classification tutorial; UI-driven steps without deep configuration or limits.,unchanged https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-get-started-with-feature-store?view=azureml-api-2,Develop and register a feature set with managed feature store,Managed feature store tutorial 1: Develop and register a feature set - Azure Machine Learning,,"Learn how to develop and register a feature set, the first tutorial in a series on using managed feature store in Azure Machine Learning.","In this tutorial series, you learn how to use the managed feature store to discover, create, and operationalize Azure Machine Learning features. Features seamlessly integrate the prototyping, training, and operationalization phases of the machine learning lifecycle. In the prototyping phase, you experiment with various features, and in the operationalization phase, you deploy models that use inference steps to look up feature data. Features serve as the connective tissue in the lifecycle. You us",2026-01-29T23:08:00.000Z,tutorial,,0.4,False,"Feature store tutorial on developing/registering feature sets; mostly workflow, not detailed config tables or limits.",unchanged diff --git a/products/azure-machine-learning/report.md b/products/azure-machine-learning/report.md index adaed514..755c0663 100644 --- a/products/azure-machine-learning/report.md +++ b/products/azure-machine-learning/report.md @@ -1,12 +1,12 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: - decision-making: 'Guidance on Azure ML design choices: algorithm selection, training - methods, networking and DR, cost optimization, and detailed migration/upgrade - paths from SDK v1 to v2 and between services.' 
- configuration: 'Configuring Azure ML: designer components, AutoML, compute, networking, - data, monitoring, registries, prompt flow, and full CLI/SDK/YAML setup for training, - deployment, and ops.' + decision-making: 'Guidance on Azure ML design choices and migrations: selecting + algorithms, training and networking options, cost/failover strategies, and upgrading/migrating + from v1, ACI, Prompt Flow to v2/Agent Framework.' + configuration: 'Configuring Azure ML: designer components, AutoML, YAML/CLI jobs, + compute, networking, data, monitoring, registries, environments, and deployment/runtime + settings.' troubleshooting: 'Diagnosing and fixing Azure ML issues: pipelines, AutoML, endpoints (online/batch), networking (VNet/private/Kubernetes), environments/images, prompt flow, feature store, and known bugs.' @@ -23,21 +23,21 @@ category_descriptions: cost and compute optimization, data prep, monitoring, deployment scripts, performance tuning, and ethical data use.' integrations: Integrating Azure ML with data sources, Spark/Databricks/Synapse, - MLflow, REST/HTTP, prompt flow, and batch/online endpoints, plus patterns for - logging, storage, and deployment. - deployment: 'Deploying and operationalizing models and prompt flows: online/batch - endpoints, AKS/ACI, CI/CD, MLOps/GenAIOps, blue‑green rollouts, pipelines, and - cross-workspace consumption.' + MLflow, REST APIs, batch/online endpoints, Prompt Flow, and external tools to + run, track, and deploy ML workflows. + deployment: 'Deploying and operating models and prompt flows: online/batch endpoints, + AKS/ACI, CI/CD and MLOps/GenAIOps pipelines, blue‑green rollouts, cross-workspace + use, and pipeline-based batch scoring.' skill_description: Expert knowledge for Azure Machine Learning development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. 
- Use when using Azure ML workspaces, AutoML, Prompt Flow, online/batch endpoints, - or the Python SDK/CLI, and other Azure Machine Learning related development tasks. - Not for Azure Databricks (use azure-databricks), Azure Synapse Analytics (use azure-synapse-analytics), + Use when using Azure ML jobs/endpoints, AutoML, Prompt Flow, feature store, or CI/CD + MLOps pipelines, and other Azure Machine Learning related development tasks. Not + for Azure Databricks (use azure-databricks), Azure Synapse Analytics (use azure-synapse-analytics), Azure Data Science Virtual Machines (use azure-data-science-vm), Azure HDInsight (use azure-hdinsight). -use_when: Use when using Azure ML workspaces, AutoML, Prompt Flow, online/batch endpoints, - or the Python SDK/CLI, and other Azure Machine Learning related development tasks. +use_when: Use when using Azure ML jobs/endpoints, AutoML, Prompt Flow, feature store, + or CI/CD MLOps pipelines, and other Azure Machine Learning related development tasks. confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synapse Analytics (use azure-synapse-analytics), Azure Data Science Virtual Machines (use azure-data-science-vm), Azure HDInsight (use azure-hdinsight). 
@@ -46,16 +46,16 @@ confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synap ## Summary -- **Total Pages**: 624 -- **Fetched**: 624 +- **Total Pages**: 627 +- **Fetched**: 627 - **Fetch Failed**: 0 -- **Classified**: 482 -- **Unclassified**: 142 +- **Classified**: 483 +- **Unclassified**: 144 ### Incremental Update -- **New Pages**: 0 -- **Updated Pages**: 5 -- **Unchanged**: 619 +- **New Pages**: 3 +- **Updated Pages**: 24 +- **Unchanged**: 600 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-machine-learning/azure-machine-learning.csv` @@ -65,29 +65,66 @@ confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synap |------|-------|------------| | architecture-patterns | 3 | 0.5% | | best-practices | 20 | 3.2% | -| configuration | 269 | 43.1% | -| decision-making | 25 | 4.0% | -| deployment | 39 | 6.2% | -| integrations | 40 | 6.4% | +| configuration | 257 | 41.0% | +| decision-making | 26 | 4.1% | +| deployment | 40 | 6.4% | +| integrations | 52 | 8.3% | | limits-quotas | 5 | 0.8% | -| security | 50 | 8.0% | -| troubleshooting | 31 | 5.0% | -| *(Unclassified)* | 142 | 22.8% | +| security | 49 | 7.8% | +| troubleshooting | 31 | 4.9% | +| *(Unclassified)* | 144 | 23.0% | ## Changes +### New Pages + +- [Migration overview](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/migrate-prompt-flow-to-agent-framework?view=azureml-api-2) +- [Rebuild and validate your workflow](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-migrate-prompt-flow-to-agent-framework?view=azureml-api-2) +- [Deploy and cut over](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-deploy-migrated-agent-framework-workflow?view=azureml-api-2) + ### Updated Pages -- [Experiment and train models using features](https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-experiment-train-models-using-features?view=azureml-api-2) - - 
Updated: 2024-09-30T08:00:00.000Z → 2026-04-13T17:17:00.000Z -- [Enable recurrent materialization and run batch inference](https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-enable-recurrent-materialization-run-batch-inference?view=azureml-api-2) - - Updated: 2024-11-20T08:00:00.000Z → 2026-04-13T17:17:00.000Z -- [Develop a feature set with a custom source](https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-develop-feature-set-with-custom-source?view=azureml-api-2) - - Updated: 2024-11-21T08:00:00.000Z → 2026-04-13T17:17:00.000Z -- [Deployment template](https://learn.microsoft.com/en-us/azure/machine-learning/reference-yaml-deployment-template?view=azureml-api-2) - - Updated: 2026-01-16T06:04:00.000Z → 2026-04-15T22:09:00.000Z -- [Model](https://learn.microsoft.com/en-us/azure/machine-learning/reference-yaml-model?view=azureml-api-2) - - Updated: 2025-02-10T08:00:00.000Z → 2026-04-15T22:09:00.000Z +- [What is prompt flow?](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/overview-what-is-prompt-flow?view=azureml-api-2) + - Updated: 2025-07-16T08:00:00.000Z → 2026-04-21T16:56:00.000Z +- [Connections](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-connections?view=azureml-api-2) + - Updated: 2025-11-19T18:13:00.000Z → 2026-04-21T16:56:00.000Z +- [Session](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-session?view=azureml-api-2) + - Updated: 2025-11-19T23:15:00.000Z → 2026-04-21T16:56:00.000Z +- [Flows](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-flows?view=azureml-api-2) + - Updated: 2025-11-19T23:15:00.000Z → 2026-04-21T16:56:00.000Z +- [Tools](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-tools?view=azureml-api-2) + - Updated: 2025-11-24T08:00:00.000Z → 2026-04-21T16:56:00.000Z +- 
[Variants](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-variants?view=azureml-api-2) + - Updated: 2025-11-19T23:15:00.000Z → 2026-04-21T16:56:00.000Z +- [Prompt flow Ecosystem](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/community-ecosystem?view=azureml-api-2) + - Updated: 2026-02-28T08:00:00.000Z → 2026-04-21T16:56:00.000Z +- [Customize base image for compute session](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-customize-session-base-image?view=azureml-api-2) + - Updated: 2026-03-19T11:04:00.000Z → 2026-04-21T16:56:00.000Z +- [Integrate Langchain in prompt flow](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-integrate-with-langchain?view=azureml-api-2) + - Updated: 2024-10-25T05:34:00.000Z → 2026-04-21T16:56:00.000Z +- [Evaluate your Semantic Kernel with prompt flow](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-evaluate-semantic-kernel?view=azureml-api-2) + - Updated: 2024-10-16T05:31:00.000Z → 2026-04-21T16:56:00.000Z +- [Custom tool package creation and usage](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-custom-tool-package-creation-and-usage?view=azureml-api-2) + - Updated: 2024-11-26T08:00:00.000Z → 2026-04-21T16:56:00.000Z +- [Advance your maturity level](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-llmops-maturity?view=azureml-api-2) + - Updated: 2025-11-18T15:37:00.000Z → 2026-04-21T16:56:00.000Z +- [Overview](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/overview?view=azureml-api-2) + - Updated: 2024-08-30T17:04:00.000Z → 2026-04-21T16:56:00.000Z +- [LLM tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/llm-tool?view=azureml-api-2) + - Updated: 2025-07-24T17:12:00.000Z → 2026-04-21T16:56:00.000Z +- [Prompt 
tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/prompt-tool?view=azureml-api-2) + - Updated: 2024-08-28T16:59:00.000Z → 2026-04-21T16:56:00.000Z +- [Python tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/python-tool?view=azureml-api-2) + - Updated: 2025-07-24T17:12:00.000Z → 2026-04-21T16:56:00.000Z +- [Embedding tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/embedding-tool?view=azureml-api-2) + - Updated: 2025-11-18T15:37:00.000Z → 2026-04-21T16:56:00.000Z +- [Content Safety (text) tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/content-safety-text-tool?view=azureml-api-2) + - Updated: 2024-08-28T16:59:00.000Z → 2026-04-21T16:56:00.000Z +- [Index Lookup tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/index-lookup-tool?view=azureml-api-2) + - Updated: 2025-10-06T22:10:00.000Z → 2026-04-21T16:56:00.000Z +- [Rerank tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/rerank-tool?view=azureml-api-2) + - Updated: 2025-11-18T15:37:00.000Z → 2026-04-21T16:56:00.000Z +- *...and 4 more* ## Classified Pages @@ -206,7 +243,7 @@ confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synap | [How to troubleshoot data access](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-troubleshoot-data-access?view=azureml-api-2) | troubleshooting | 0.80 | Data access troubleshooting guide; expected to map common symptoms to causes (auth, networking, datastore config) and resolutions. 
| | [Hyperparameters for AutoML computer vision tasks](https://learn.microsoft.com/en-us/azure/machine-learning/reference-automl-images-hyperparameters?view=azureml-api-2) | configuration | 0.80 | Lists available hyperparameters, their names, and likely ranges/defaults for AutoML CV tasks, which are product-specific configuration details. | | [Inference image models with ONNX model](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-inference-onnx-automl-image-models?view=azureml-api-2) | integrations | 0.80 | Describes using ONNX for predictions on AutoML image models; includes ONNX-specific configuration and usage patterns unique to Azure ML outputs. | -| [Integrate Langchain in prompt flow](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-integrate-with-langchain?view=azureml-api-2) | integrations | 0.80 | Shows how to combine LangChain modules with prompt flow; likely includes SDK usage, configuration parameters, and patterns specific to this integration. | +| [Integrate Langchain in prompt flow](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-integrate-with-langchain?view=azureml-api-2) | integrations | 0.80 | Shows how to integrate LangChain with Prompt Flow, including product-specific SDK usage and configuration patterns for combining the two frameworks. | | [Network isolation in batch endpoints](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-secure-batch-endpoint?view=azureml-api-2) | security | 0.80 | Explains network isolation for batch endpoints, including use of custom virtual networks, private endpoints, and possibly required subnets, NSG rules, and service tags. These are concrete, product-specific security/network configuration patterns. 
| | [Network isolation in prompt flow](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-secure-prompt-flow?view=azureml-api-2) | security | 0.80 | Describes requirements for using prompt flow in private networks; includes VNet, private endpoint, and network configuration specifics, which are product-specific security settings. | | [Plan for network isolation](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-network-isolation-planning?view=azureml-api-2) | decision-making | 0.80 | Provides planning guidance and recommendations for choosing among network isolation options, including scenario-based architecture decisions for Azure ML. | @@ -251,7 +288,7 @@ confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synap | [Apply SQL Transformation](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference/apply-sql-transformation?view=azureml-api-2) | configuration | 0.75 | Describes using SQLite-based SQL transformation, including query configuration and behavior; product-specific configuration and engine choice. | | [Auto-train a forecasting model (Python, CLI)](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-auto-train-forecast?view=azureml-api-2) | configuration | 0.75 | How-to for setting up AutoML forecasting via CLI/SDK; typically includes forecasting-specific configuration parameters and options. | | [Azure Kubernetes Service](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-deploy-azure-kubernetes-service?view=azureml-api-1) | deployment | 0.75 | Provides AKS-specific deployment configuration and constraints for AML models using v1 tooling, which is product-specific deployment knowledge. 
| -| [Azure OpenAI GPT-4 Turbo with Vision tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/azure-open-ai-gpt-4v-tool?view=azureml-api-2) | configuration | 0.75 | Tool reference for GPT-4 Turbo with Vision will document image input formats, parameters, and Azure OpenAI deployment configuration specific to prompt flow. | +| [Azure OpenAI GPT-4 Turbo with Vision tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/azure-open-ai-gpt-4v-tool?view=azureml-api-2) | integrations | 0.75 | Tool for Azure OpenAI GPT-4 Turbo with Vision will document image input handling, model deployment binding, and tool parameters unique to prompt flow, which are expert integration/configuration details. | | [Bookmark your favorite data assets](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-create-data-assets?view=azureml-api-2) | configuration | 0.75 | Shows how to create/manage data assets, including CLI/SDK fields and datastore URI usage. These are concrete configuration patterns unique to Azure ML. | | [Built-in policy to allow specific models](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-built-in-policy-model-deployment?view=azureml-api-2) | security | 0.75 | Describes built-in Azure Policy definitions for controlling which models can be deployed; such content includes specific policy names, effects, and scopes, which are product-specific security configuration details. | | [Clean Missing Data](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference/clean-missing-data?view=azureml-api-2) | configuration | 0.75 | Component reference for Clean Missing Data; includes specific operation modes and options for handling missing values. 
| @@ -262,13 +299,13 @@ confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synap | [Create & manage registries](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-manage-registries?view=azureml-api-2) | configuration | 0.75 | Describes registry creation, multiregion replication, and configuration properties specific to Azure ML registries. | | [Create Azure Machine Learning Datastores](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-datastore?view=azureml-api-2) | configuration | 0.75 | How-to for connecting to Azure storage via datastores. Typically includes datastore types, required fields, and connection parameter details—product-specific configuration. | | [Create a Datastore with the User Interface](https://learn.microsoft.com/en-us/azure/machine-learning/create-datastore-with-user-interface?view=azureml-api-2) | configuration | 0.75 | Describes how to configure a datastore that links OneLake Lakehouse tables to Azure ML using the UI, including table-type specifics and connection settings. This is detailed, product-specific configuration for a particular data source type. | -| [Custom tool package creation and usage](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-custom-tool-package-creation-and-usage?view=azureml-api-2) | integrations | 0.75 | Guides development and installation of custom tool packages; includes package structure, configuration, and registration details unique to prompt flow’s tool system. | | [Datastore](https://learn.microsoft.com/en-us/azure/machine-learning/migrate-to-v2-resource-datastore?view=azureml-api-2) | decision-making | 0.75 | Details differences in datastore support (e.g., SQL-like sources) and compares scenarios, which is migration decision guidance. 
| | [Deploy a flow to online endpoint for real-time inference with CLI](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-deploy-to-code?view=azureml-api-2) | deployment | 0.75 | CLI-based deployment to managed and Kubernetes endpoints; likely includes endpoint/deployment YAML schemas, constraints, and environment settings specific to Azure ML. | +| [Deploy and cut over](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-deploy-migrated-agent-framework-workflow?view=azureml-api-2) | deployment | 0.75 | Covers deploying Agent Framework workflows to Azure Container Apps, setting up OpenTelemetry tracing, CI/CD quality gates, and cutover from Prompt Flow—product-specific deployment and operations guidance. | | [Deploy locally](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-deploy-online-endpoints?view=azureml-api-2) | deployment | 0.75 | Same content as index 6 (duplicate URL and summary). Contains Azure ML–specific deployment configuration and behavior, which fits deployment expert knowledge. | | [Deploy locally on compute instance](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-deploy-online-endpoints?view=azureml-api-2) | deployment | 0.75 | Same content as index 6 (duplicate URL and summary). Contains Azure ML–specific deployment configuration and behavior, which fits deployment expert knowledge. | | [Edit Metadata](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference/edit-metadata?view=azureml-api-2) | configuration | 0.75 | Explains how to change metadata such as data types, labels, and feature flags; detailed component configuration. 
| -| [Evaluate your Semantic Kernel with prompt flow](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-evaluate-semantic-kernel?view=azureml-api-2) | integrations | 0.75 | Describes integration between Semantic Kernel and prompt flow; includes SDK usage and evaluation patterns unique to this integration. | +| [Evaluate your Semantic Kernel with prompt flow](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-evaluate-semantic-kernel?view=azureml-api-2) | integrations | 0.75 | Describes integration between Prompt Flow and Semantic Kernel, including how to evaluate plugins and planners; contains product-specific API and configuration patterns. | | [Existing Kubernetes compute cannot be updated](https://learn.microsoft.com/en-us/azure/machine-learning/known-issues/inferencing-updating-kubernetes-compute-appears-to-succeed?view=azureml-api-2) | troubleshooting | 0.75 | Describes a specific failure mode when updating Kubernetes compute; includes status and likely workaround steps. | | [Featurization in automated ML (Python)](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-configure-auto-features?view=azureml-api-1) | configuration | 0.75 | Covers how to customize featurization settings; this is a configuration-focused article with product-specific parameter names and options. | | [Filter Based Feature Selection](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference/filter-based-feature-selection?view=azureml-api-2) | configuration | 0.75 | Component reference with algorithm choices and parameters for feature selection; product-specific configuration. 
| @@ -278,16 +315,17 @@ confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synap | [How to manage inputs and outputs in pipeline](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-manage-inputs-outputs-pipeline?view=azureml-api-2) | configuration | 0.75 | Details how to declare and wire inputs/outputs at component and pipeline levels, including types and behaviors; this is product-specific configuration knowledge. | | [How to move a workspace](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-move-workspace?view=azureml-api-2) | decision-making | 0.75 | Details what content moves vs. doesn’t, preview limitations, and implications for planning subscription moves—decision-focused guidance. | | [Identity-based data access to storage](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-datastore?view=azureml-api-2) | integrations | 0.75 | Duplicate of index 3; same datastore configuration and connection details with product-specific parameters. | -| [Index Lookup tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/index-lookup-tool?view=azureml-api-2) | configuration | 0.75 | Index Lookup tool reference will describe supported index types, connection parameters, and selection options unique to Azure ML prompt flow. | +| [Index Lookup tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/index-lookup-tool?view=azureml-api-2) | integrations | 0.75 | The Index Lookup tool reference for Azure AI Search, FAISS, and Pinecone will document tool parameters (index type, connection settings, query fields) and wiring patterns unique to prompt flow, which are product-specific integration details. 
| | [Invalid certificate error during deployment](https://learn.microsoft.com/en-us/azure/machine-learning/known-issues/inferencing-invalid-certificate?view=azureml-api-2) | troubleshooting | 0.75 | Documents a specific invalid certificate error with full error payload; likely includes cause and mitigation steps. | -| [LLM tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/llm-tool?view=azureml-api-2) | configuration | 0.75 | LLM tool reference typically documents tool parameters, supported models, options, and constraints (for example unsupported embeddings, reasoning models) which are product-specific configuration details. | | [Model specification for online deployment](https://learn.microsoft.com/en-us/azure/machine-learning/concept-online-deployment-model-specification?view=azureml-api-2) | configuration | 0.75 | Details the ways to reference models (registry, workspace, paths) and environment variables used; these are concrete configuration mechanisms unique to Azure ML. | | [Network isolation with managed online endpoints](https://learn.microsoft.com/en-us/azure/machine-learning/concept-secure-online-endpoint?view=azureml-api-2) | security | 0.75 | Describes how private endpoints and workspace-managed VNets secure AML managed online endpoints, including service-specific network topology details. | | [Normalize Data](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference/normalize-data?view=azureml-api-2) | configuration | 0.75 | Explains normalization methods and parameters; product-specific configuration for how normalization is applied. | | [Online endpoints](https://learn.microsoft.com/en-us/azure/machine-learning/migrate-to-v2-deploy-endpoints?view=azureml-api-2) | decision-making | 0.75 | Describes how ACI/AKS web services in SDK v1 map to v2 deployment endpoints, including deprecation timelines and recommended migration paths. 
This is concrete, product- and version-specific migration guidance that helps choose new endpoint types and plan upgrades, fitting the decision-making category. | -| [OpenAI GPT-4V tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/openai-gpt-4v-tool?view=azureml-api-2) | configuration | 0.75 | GPT-4V tool article will describe how to call gpt-4-vision-preview, including parameters and constraints unique to this integration. | +| [OpenAI GPT-4V tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/openai-gpt-4v-tool?view=azureml-api-2) | integrations | 0.75 | OpenAI GPT-4V tool reference will include configuration for image inputs, model name (gpt-4-vision-preview), and tool parameters specific to prompt flow, matching the integrations sub-skill. | | [Partition and Sample](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference/partition-and-sample?view=azureml-api-2) | configuration | 0.75 | Describes sampling modes, partition sizes, and randomization options; detailed configuration for this component. | | [Permutation Feature Importance](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference/permutation-feature-importance?view=azureml-api-2) | configuration | 0.75 | Explains how to set metrics and options for permutation importance; detailed configuration of this component. | +| [Python tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/python-tool?view=azureml-api-2) | integrations | 0.75 | Python tool reference for self-contained executable nodes will include function signatures, input/output schema, environment constraints, and configuration fields unique to prompt flow, which are expert integration details. 
| +| [SERP API tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/serp-api-tool?view=azureml-api-2) | integrations | 0.75 | SerpAPI tool wraps external search APIs; documentation will include configuration fields (API key, engine, query params) and tool-specific options, which are detailed integration settings. | | [SMOTE](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference/smote?view=azureml-api-2) | configuration | 0.75 | Component reference for SMOTE; includes parameters controlling oversampling behavior, which are configuration details. | | [Schedule a data import](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-schedule-data-import?view=azureml-api-2) | configuration | 0.75 | Describes creating time-based schedules for data imports and managing them via CLI/SDK/UI. Contains schedule configuration parameters and behavior specific to Azure ML. | | [Secure inference environment](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-secure-inferencing-vnet?view=azureml-api-2) | security | 0.75 | Covers VNet integration for managed online endpoints and AKS with Azure ML–specific network and security configuration. | @@ -355,9 +393,8 @@ confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synap | [Compute](https://learn.microsoft.com/en-us/azure/machine-learning/migrate-to-v2-resource-compute?view=azureml-api-2) | decision-making | 0.70 | Compares compute management scenarios in SDK v1 and v2, helping users decide how to migrate their compute workflows. | | [Compute instance](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-create-compute-instance?view=azureml-api-2) | configuration | 0.70 | Same content as index 33; describes Azure ML-specific configuration for compute instances. 
| | [Configure Apache Spark jobs](https://learn.microsoft.com/en-us/azure/machine-learning/quickstart-spark-jobs?view=azureml-api-2) | integrations | 0.70 | Shows how to submit Spark jobs using serverless Spark compute, ADLS Gen2, and identity passthrough. Contains Azure ML–specific job definitions and parameters. | -| [Connections](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-connections?view=azureml-api-2) | security | 0.70 | Focuses on managing credentials/secrets via connections; likely includes connection types, scopes, and secure usage patterns, which are product-specific security configuration details. | | [Consume serverless API endpoints from a different workspace](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-connect-models-serverless?view=azureml-api-2) | deployment | 0.70 | Explains configuration to call a standard deployment from another workspace; cross-workspace consumption is a deployment/operational configuration pattern. | -| [Content Safety (text) tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/content-safety-text-tool?view=azureml-api-2) | configuration | 0.70 | Wrapper tool for Azure AI Content Safety Text API will document specific parameters, categories, and settings required to call the API from prompt flow. | +| [Content Safety (text) tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/content-safety-text-tool?view=azureml-api-2) | integrations | 0.70 | Wrapper around Azure AI Content Safety Text API; such a tool page typically lists configuration parameters (endpoint, keys, categories, severity levels) and usage patterns specific to prompt flow, fitting integrations. 
| | [Convert to CSV](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference/convert-to-csv?view=azureml-api-2) | configuration | 0.70 | Component reference for converting datasets to CSV; includes options and parameters for export behavior. | | [Convert to Dataset](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference/convert-to-dataset?view=azureml-api-2) | configuration | 0.70 | Describes how to configure conversion into designer’s internal dataset format; product-specific behavior and options. | | [Core syntax](https://learn.microsoft.com/en-us/azure/machine-learning/reference-yaml-core-syntax?view=azureml-api-2) | configuration | 0.70 | Explains core syntax concepts for Azure ML YAML entities; includes product-specific fields and patterns not generally known. | @@ -368,7 +405,8 @@ confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synap | [Create jobs and input data for batch endpoints](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-access-data-batch-endpoints-jobs?view=azureml-api-2) | integrations | 0.70 | Describes AML-specific ways to bind data assets, datastores, and storage accounts to batch jobs via CLI/SDK/REST parameters, which are product-specific integration patterns. | | [Create text labeling projects](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-create-text-labeling-projects?view=azureml-api-2) | configuration | 0.70 | Shows how to define text labeling projects, single vs multi-label, and project parameters. These are specific configuration options in Azure ML. | | [Custom CI/CD and Event-driven workflows](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-event-grid?view=azureml-api-2) | integrations | 0.70 | Covers event types, payloads, and configuration for triggering workflows from Azure ML via Event Grid; includes product-specific event schema and settings. 
| -| [Customize base image for compute session](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-customize-session-base-image?view=azureml-api-2) | configuration | 0.70 | The page describes product-specific configuration for Azure ML prompt flow compute sessions, including how to define and use a custom Docker/base image as the session environment. It likely includes concrete image configuration details and environment settings unique to Azure Machine Learning, which go beyond generic Docker knowledge and qualify as expert, configuration-focused content. | +| [Custom tool package creation and usage](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-custom-tool-package-creation-and-usage?view=azureml-api-2) | integrations | 0.70 | How-to for developing custom tool packages likely includes concrete package structure, required files, configuration fields, and code patterns specific to prompt flow tools, which are product-specific integration details beyond generic knowledge. | +| [Customize base image for compute session](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-customize-session-base-image?view=azureml-api-2) | configuration | 0.70 | How-to for creating and configuring a custom Docker base image for compute sessions; likely includes specific environment settings and image configuration details unique to this product. | | [Customize evaluation flow and metrics](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-develop-an-evaluation-flow?view=azureml-api-2) | best-practices | 0.70 | Explains how to create/customize evaluation flows and metrics; contains product-specific patterns and configurations for evaluating flows, which are actionable best practices. 
| | [Data ingestion with Azure Data Factory](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-data-ingest-adf?view=azureml-api-1) | integrations | 0.70 | Covers building ingestion pipelines with Azure Data Factory for Azure ML. Contains product-specific pipeline activities, linked service settings, and patterns. | | [Data splits & cross-validation (Python)](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-configure-cross-validation-data-splits?view=azureml-api-1) | configuration | 0.70 | Explains how to configure training/validation/test splits and cross-validation for AutoML; likely includes specific parameters and allowed values in SDK v1. | @@ -379,6 +417,7 @@ confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synap | [Deploy models for scoring in batch endpoints](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-batch-model-deployments?view=azureml-api-2) | deployment | 0.70 | Step-by-step deployment of a model to batch endpoints with AML-specific deployment configuration and invocation patterns. | | [Deploy pipelines with batch endpoints](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-batch-pipeline-deployments?view=azureml-api-2) | deployment | 0.70 | Describes deploying pipeline components as batch deployments, with Azure ML–specific endpoint/deployment configuration, component wiring, and invocation details that define how pipelines are operationalized on this platform. | | [Dockerfile extensibility](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-extend-prebuilt-docker-image-inference?view=azureml-api-1) | configuration | 0.70 | How-to for extending prebuilt images; includes Azure ML–specific image names, environment variables, and configuration steps not generally known. 
| +| [Embedding tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/embedding-tool?view=azureml-api-2) | integrations | 0.70 | Embedding tool reference likely documents tool-specific parameters (model name, input limits, output schema) and how to wire it into flows, which are concrete integration/configuration details. | | [Enable trace and collect feedback for a flow deployment](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-enable-trace-feedback-for-deployment?view=azureml-api-2) | configuration | 0.70 | Describes enabling trace and feedback for deployments; typically involves configuration flags, parameters, and logging settings unique to prompt flow deployments. | | [Enterprise security overview](https://learn.microsoft.com/en-us/azure/machine-learning/concept-enterprise-security?view=azureml-api-2) | security | 0.70 | Covers concrete security and governance features (auth, network, encryption, monitoring) with Azure ML–specific configuration guidance. | | [Evaluate Recommender](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference/evaluate-recommender?view=azureml-api-2) | configuration | 0.70 | Evaluate Recommender reference will define input schema expectations and parameter names for evaluation modes and metrics, mapping to specific columns and options. These are detailed configuration semantics unique to this component. | @@ -408,6 +447,7 @@ confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synap | [Integrate prompt flow with LLM-based application DevOps](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-integrate-with-llm-app-devops?view=azureml-api-2) | deployment | 0.70 | Shows how to integrate prompt flow with DevOps workflows; includes CI/CD patterns and product-specific deployment pipeline configurations for LLM applications. 
| | [Intro to Azure Pipelines for CI/CD](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-devops-machine-learning?view=azureml-api-2) | deployment | 0.70 | Describes Azure DevOps pipeline configuration for training and deploying models to Azure ML with product-specific tasks, variables, and constraints. | | [Join Data](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference/join-data?view=azureml-api-2) | configuration | 0.70 | Component reference for Join Data; includes join types and key selection options, which are configuration details. | +| [LLM tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/llm-tool?view=azureml-api-2) | integrations | 0.70 | Reference for the LLM tool that connects to OpenAI/Azure OpenAI likely documents tool parameters, model selection fields, and request options unique to prompt flow, which are product-specific integration details. | | [Latent Dirichlet Allocation](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference/latent-dirichlet-allocation?view=azureml-api-2) | configuration | 0.70 | Component reference for LDA in Azure ML designer, including topic count, iterations, and other parameters, which are detailed configuration settings. | | [Log MLflow models](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-log-mlflow-models?view=azureml-api-2) | configuration | 0.70 | Describes logging models as MLflow models in Azure ML, including options for custom models and autologging. Contains product-specific configuration and API usage patterns. | | [Logging](https://learn.microsoft.com/en-us/azure/machine-learning/reference-migrate-sdk-v1-mlflow-tracking?view=azureml-api-2) | integrations | 0.70 | Reference-style comparison of SDK v1 logging APIs to MLflow tracking; contains product-specific API mappings and usage patterns that are effectively an integration pattern between Azure ML and MLflow. 
| @@ -416,6 +456,7 @@ confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synap | [Manage compute_session](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-manage-compute-session?view=azureml-api-2) | configuration | 0.70 | Managing compute sessions typically involves specifying compute types, sizes, images, and lifecycle settings; this is product-specific configuration knowledge beyond generic concepts. | | [Manage models with MLflow](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-manage-models-mlflow?view=azureml-api-2) | configuration | 0.70 | Covers registering, editing, querying, and deleting models in Azure ML via MLflow. This involves product-specific registry behavior and API usage details. | | [Migrate data import to Microsoft Fabric](https://learn.microsoft.com/en-us/azure/machine-learning/data-import-migration-guide?view=azureml-api-2) | decision-making | 0.70 | Migration guide from Data Import/Data Connections to Fabric. Likely includes recommended paths for different scenarios and comparison of options, which is decision-making guidance. | +| [Migration overview](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/migrate-prompt-flow-to-agent-framework?view=azureml-api-2) | decision-making | 0.70 | Provides structured migration guidance from Prompt Flow to Microsoft Agent Framework with phased recommendations and scenario-based guidance, which is product- and migration-specific decision-making content. | | [Model assets](https://learn.microsoft.com/en-us/azure/machine-learning/migrate-to-v2-assets-model?view=azureml-api-2) | decision-making | 0.70 | Provides detailed comparison of model registration and management patterns between SDK v1 and v2, including how to adapt existing workflows. This is product- and version-specific migration guidance that supports decisions on how to change implementations, fitting decision-making better than other categories. 
| | [Monitor Kubernetes Online Endpoint inference server logs](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-monitor-kubernetes-online-enpoint-inference-server-log?view=azureml-api-2) | configuration | 0.70 | Describes how to route inference server logs to Log Analytics; likely includes diagnostic settings, table names, and configuration parameters. | | [Monitor online endpoints](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-monitor-online-endpoints?view=azureml-api-2) | configuration | 0.70 | Details metrics, log tables, and Application Insights integration for endpoints; includes specific metric names and log schemas, which are configuration-level expert knowledge. | @@ -425,12 +466,11 @@ confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synap | [Neural Network Regression](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference/neural-network-regression?view=azureml-api-2) | configuration | 0.70 | Covers a concrete Azure ML neural network regression component with service-specific hyperparameters and configuration options, which qualify as expert configuration knowledge. | | [One vs. All Multiclass](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference/one-vs-all-multiclass?view=azureml-api-2) | configuration | 0.70 | Covers Azure ML’s One-vs-All Multiclass component with its specific options and parameterization, which are configuration details unique to this environment. | | [One vs. One Multiclass](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference/one-vs-one-multiclass?view=azureml-api-2) | configuration | 0.70 | Explains how to configure the One-vs-One Multiclass component in Azure ML designer, including its parameters and usage, which is expert configuration knowledge. 
| -| [Open Model LLM tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/open-model-llm-tool?view=azureml-api-2) | configuration | 0.70 | Open Model LLM tool reference will include configuration for selecting open-source models, endpoints, and parameters; also includes deprecation behavior specific to this product. | +| [Open Model LLM tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/open-model-llm-tool?view=azureml-api-2) | integrations | 0.70 | Open Model LLM tool reference will specify how to configure different open-source models (Falcon, Llama 2) within prompt flow, including parameters and deployment bindings, which are integration-specific details. | | [Operationalize a scoring pipeline on batch endpoints](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-batch-scoring-pipeline?view=azureml-api-2) | deployment | 0.70 | Covers deployment of inference pipelines that reuse preprocessing components, with AML-specific pipeline and batch endpoint configuration. | | [Operationalize a training pipeline on batch endpoints](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-batch-training-pipeline?view=azureml-api-2) | deployment | 0.70 | Shows how to deploy multi-step training pipelines under batch endpoints with AML-specific pipeline component wiring and invocation. | | [Optimize Checkpoint performance for large models](https://learn.microsoft.com/en-us/azure/machine-learning/reference-checkpoint-performance-for-large-models?view=azureml-api-2) | best-practices | 0.70 | Focuses on performance optimization for checkpoints using Nebula; includes product-specific recommendations and constraints for large-model training. 
| | [Overfitting & imbalanced data](https://learn.microsoft.com/en-us/azure/machine-learning/concept-manage-ml-pitfalls?view=azureml-api-2) | best-practices | 0.70 | Article explicitly focuses on implementing best practices in Automated ML to handle overfitting and imbalanced data, likely including Azure-specific options and behaviors beyond generic ML theory. | -| [Overview](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/overview?view=azureml-api-2) | configuration | 0.70 | Tools overview with index table and custom tool package instructions likely lists tool names, parameters, and usage constraints specific to prompt flow. | | [Overview](https://learn.microsoft.com/en-us/azure/machine-learning/reference-yaml-overview?view=azureml-api-2) | configuration | 0.70 | Index of YAML schemas; links to detailed schema docs and explains usage patterns, which are specific to Azure ML CLI configuration. | | [Parallel run step](https://learn.microsoft.com/en-us/azure/machine-learning/migrate-to-v2-execution-parallel-run-step?view=azureml-api-2) | decision-making | 0.70 | Contains concrete, product-specific migration guidance from v1 parallel run step to v2 parallel job, including how concepts map, what to change in code, and when to use the new constructs. This is expert, version-specific knowledge that helps decide how to upgrade an existing solution, fitting decision-making better than generic configuration or best practices. | | [Perform continuous monitoring](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-monitor-model-performance?view=azureml-api-2) | configuration | 0.70 | How-to for enabling out-of-box, advanced, and custom monitoring; includes Azure ML–specific monitoring configuration, metrics, and alert setup. 
| @@ -439,20 +479,17 @@ confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synap | [Preprocess Text](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference/preprocess-text?view=azureml-api-2) | configuration | 0.70 | Describes configuration options (tokenization, stop words, casing, etc.) for the Preprocess Text component, which are specific to Azure ML’s implementation. | | [Profile models](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-deploy-profile-model?view=azureml-api-1) | best-practices | 0.70 | Shows how to use AML v1 tools to profile resource usage and choose deployment sizing, including product-specific commands and recommendations. | | [Progressive rollout of MLflow models to Online Endpoints](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-deploy-mlflow-models-online-progressive?view=azureml-api-2) | deployment | 0.70 | Describes blue-green/progressive rollout for MLflow models on Azure ML online endpoints with product-specific deployment steps and traffic-weight configuration via MLflow/Azure ML. This is a deployment pattern tied to Azure ML’s endpoint model. | -| [Prompt tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/prompt-tool?view=azureml-api-2) | configuration | 0.70 | Prompt tool reference is expected to list template parameters, Jinja2 variables, and configuration fields unique to this tool. | | [Python extensibility](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-prebuilt-docker-images-inference-python-extensibility?view=azureml-api-1) | configuration | 0.70 | Describes Python package extensibility for prebuilt Docker images, including image-specific configuration steps and parameters unique to Azure ML’s prebuilt inference images. 
| -| [Python tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/python-tool?view=azureml-api-2) | configuration | 0.70 | Python tool article will describe how to define tool inputs/outputs, environment, and execution settings—configuration details specific to Azure ML prompt flow. | | [Query & compare experiments and runs with MLflow](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-track-experiments-mlflow?view=azureml-api-2) | configuration | 0.70 | Explains how to query and compare experiments/runs via MLflow SDK against Azure ML. Likely includes specific query patterns and workspace integration details that are product-specific. | | [RAG cloud to local](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-retrieval-augmented-generation-cloud-to-local?view=azureml-api-2) | deployment | 0.70 | Describes how to transition flows from cloud to local, including environment and tooling requirements—deployment/execution pattern specific to this product. | | [Regenerate storage access keys](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-change-storage-access-key?view=azureml-api-2) | security | 0.70 | Gives product-specific steps and considerations for changing storage keys used by Azure ML workspaces, including impact on services. | | [Region availability for models in Serverless API endpoints](https://learn.microsoft.com/en-us/azure/machine-learning/concept-endpoint-serverless-availability?view=azureml-api-2) | limits-quotas | 0.70 | Region availability per model is effectively a constraints matrix unique to Azure ML standard deployments. This is expert knowledge akin to a limits/availability table that an LLM would not reliably know from training and must be looked up. 
| | [Regression](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference-v2/regression?view=azureml-api-2) | configuration | 0.70 | Describes how to configure AutoML Regression via component parameters (primary metric, training limits, featurization options), which are specific configuration options for Azure ML. | | [Remove Duplicate Rows](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference/remove-duplicate-rows?view=azureml-api-2) | configuration | 0.70 | Explains how to select columns and rules for identifying duplicates; product-specific configuration behavior. | -| [Rerank tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/rerank-tool?view=azureml-api-2) | configuration | 0.70 | Rerank tool article will specify inputs, outputs, and configuration parameters for ranking documents, which are product-specific. | +| [Rerank tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/rerank-tool?view=azureml-api-2) | integrations | 0.70 | Rerank tool reference will describe parameters like top_k, scoring fields, and input/output formats specific to prompt flow tools, which are concrete integration/coding patterns. | | [Run Azure OpenAI models in batch endpoints to compute embeddings](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-batch-model-openai-embeddings?view=azureml-api-2) | deployment | 0.70 | Describes how to deploy Azure OpenAI models (text-embedding-ada-002) as AML batch endpoints with MLflow, including auth and endpoint configuration details. | | [Run a script](https://learn.microsoft.com/en-us/azure/machine-learning/migrate-to-v2-command-job?view=azureml-api-2) | decision-making | 0.70 | Explains how to move from v1 experiments/runs to v2 jobs, with concrete guidance on changing submission code while leaving inner script logic unchanged. 
This is expert migration guidance for a specific product feature, primarily helping users decide and plan how to upgrade their job submission approach. |
 | [Run batch endpoints from Azure Data Factory](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-batch-azure-data-factory?view=azureml-api-2) | integrations | 0.70 | Covers integration between Azure Data Factory and Azure ML batch endpoints, including ADF activity configuration, linked service settings, and parameter mappings that are specific to this cross-service integration. |
-| [SERP API tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/serp-api-tool?view=azureml-api-2) | configuration | 0.70 | SerpAPI tool reference will list supported search parameters, required keys, and configuration fields for integrating SerpAPI with prompt flow. |
 | [Secret injection in online endpoints](https://learn.microsoft.com/en-us/azure/machine-learning/concept-secret-injection?view=azureml-api-2) | security | 0.70 | Explains AML-specific secret injection model (secret stores, injection into user containers) and when to use each approach, which is product-specific security behavior. |
 | [Secured workspace traffic flow](https://learn.microsoft.com/en-us/azure/machine-learning/concept-secure-network-traffic-flow?view=azureml-api-2) | security | 0.70 | Explains detailed traffic flows and security considerations when using VNets with Azure ML; includes product-specific network paths and hardening patterns. |
 | [Select Columns in Dataset](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference/select-columns-in-dataset?view=azureml-api-2) | configuration | 0.70 | Explains how to choose a subset of columns for downstream use; product-specific configuration options.
 |
@@ -527,7 +564,6 @@ confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synap
 | [Deploy an AutoML model to an online endpoint](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-deploy-automl-endpoint?view=azureml-api-2) | deployment | 0.65 | Provides concrete steps and AML-specific configuration for deploying AutoML-trained models as managed online endpoints, beyond generic deployment concepts. |
 | [Designer algorithms & components](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference/component-reference?view=azureml-api-2) | configuration | 0.65 | Algorithm & component reference for designer; typically includes per-component parameters and options, which are product-specific configuration details. |
 | [Detect drift on datasets](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-monitor-datasets?view=azureml-api-1) | configuration | 0.65 | How-to for dataset monitors, drift detection, and alerts. Likely includes monitor configuration parameters, thresholds, and Azure ML–specific monitoring constructs. |
-| [Embedding tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/embedding-tool?view=azureml-api-2) | configuration | 0.65 | Embedding tool reference should include model selection, parameter names, and constraints for using OpenAI embeddings via prompt flow. |
 | [Enable log metrics in designer](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-track-designer-experiments?view=azureml-api-1) | integrations | 0.65 | Describes using Execute Python Script component to emit metrics and view them in studio. Contains Azure ML–specific logging APIs and configuration patterns.
 |
 | [Enter Data Manually](https://learn.microsoft.com/en-us/azure/machine-learning/component-reference/enter-data-manually?view=azureml-api-2) | configuration | 0.65 | Describes how to configure the Enter Data Manually component; includes specific fields and options for defining dataset structure. |
 | [Execute Python code](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-designer-python?view=azureml-api-1) | integrations | 0.65 | Explains how to use the Execute Python Script module with Azure ML designer, including how inputs/outputs are wired; this is a product-specific coding/integration pattern. |
@@ -552,6 +588,8 @@ confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synap
 | [Overview](https://learn.microsoft.com/en-us/azure/machine-learning/concept-train-machine-learning-model?view=azureml-api-2) | decision-making | 0.65 | Helps determine which training method (SDK, Designer, AutoML) to use; provides service selection guidance across options. |
 | [Plan and manage costs](https://learn.microsoft.com/en-us/azure/machine-learning/concept-plan-manage-cost?view=azureml-api-2) | best-practices | 0.65 | Cost-planning guidance for Azure Machine Learning with concrete, product-specific recommendations (for example, which resource types drive cost, how to structure workspaces/compute usage, and how to use Azure cost tools in this context). While mostly conceptual, it includes actionable, service-specific cost-saving practices rather than just generic cloud cost theory. |
 | [Prep image data for computer vision models (Python)](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-prepare-datasets-for-automl-images?view=azureml-api-2) | best-practices | 0.65 | The article gives concrete, product-specific guidance on how to structure and label image data for Azure ML AutoML computer vision tasks (classification, detection, segmentation).
It includes required folder structures, annotation formats, and task-specific constraints that are unique to this feature, which go beyond generic ML knowledge and qualify as best-practices for this product. |
+| [Prompt tool](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/prompt-tool?view=azureml-api-2) | integrations | 0.65 | Describes a specific tool with Jinja template usage; such tool references typically include parameter names, template fields, and usage patterns specific to prompt flow, which are product-specific coding/config patterns. |
+| [Rebuild and validate your workflow](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-migrate-prompt-flow-to-agent-framework?view=azureml-api-2) | integrations | 0.65 | Walks through exporting Prompt Flow definitions and re-implementing them with Agent Framework WorkflowBuilder/Executor and Azure AI Evaluation SDK; contains product-specific API/SDK usage patterns, fitting integrations & coding patterns. |
 | [Registries](https://learn.microsoft.com/en-us/azure/machine-learning/concept-machine-learning-registries-mlops?view=azureml-api-2) | decision-making | 0.65 | Explains when and how to use registries across dev/test/prod with environment-scenario guidance; helps decide registry vs workspace usage for scaling MLOps. |
 | [Responsible AI dashboard overview](https://learn.microsoft.com/en-us/azure/machine-learning/concept-responsible-ai-dashboard?view=azureml-api-2) | configuration | 0.65 | Dashboard concept article also explains how tools integrate and are configured within the dashboard, including component options and usage patterns. |
 | [Run batch endpoints from Event Grid events in storage](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-event-grid-batch?view=azureml-api-2) | integrations | 0.65 | Shows how to integrate batch endpoints with Event Grid and Logic Apps.
Likely includes event schema details, subscription configuration, endpoint URLs, and payload/parameter mappings specific to Azure ML batch endpoints, which are product-specific integration patterns. |
@@ -587,8 +625,6 @@ confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synap
 | [Auto-train a regression (NYC Taxi data)](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-auto-train-models-v1?view=azureml-api-1) | 0.45 | Tutorial for training a regression model with SDK v1; step-by-step example rather than a reference of configuration options or limits. |
 | [Develop a flow](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-develop-flow?view=azureml-api-2) | 0.45 | How-to for developing flows; mostly workflow and UI/SDK usage, with no clear indication of configuration tables, limits, or troubleshooting in the summary. |
 | [Schedule jobs](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-schedule-pipeline-job?view=azureml-api-2) | 0.45 | Scheduling pipeline jobs via CLI/SDK/Studio; primarily procedural how-to without explicit limits, decision matrices, or detailed configuration parameter tables in the summary. |
-| [Session](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-session?view=azureml-api-2) | 0.45 | Conceptual explanation of compute sessions; summary doesn’t indicate detailed configuration parameters, limits, or troubleshooting mappings. |
-| [Tools](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-tools?view=azureml-api-2) | 0.45 | Conceptual explanation of tools as building blocks; integration mention is high-level without clear evidence of parameter tables or SDK config details in the summary.
 |
 | [Train TensorFlow models](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-train-tensorflow?view=azureml-api-2) | 0.45 | End-to-end TensorFlow training and deployment example; primarily tutorial content without detailed configuration matrices or limits. |
 | [Train scikit-learn models](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-train-scikit-learn?view=azureml-api-2) | 0.45 | Tutorial for running scikit-learn training scripts; mostly example code and basic job submission rather than a comprehensive config or limits reference. |
 | [Use Azure portal](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-manage-hub-workspace-portal?view=azureml-api-2) | 0.45 | Portal-based hub workspace management; largely CRUD operations without deep configuration tables or security/limit specifics. |
@@ -641,6 +677,7 @@ confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synap
 | [Automated ML overview](https://learn.microsoft.com/en-us/azure/machine-learning/concept-automated-ml?view=azureml-api-2) | 0.30 | Conceptual overview of automated ML; mostly high-level description of capabilities without detailed config tables or limits. |
 | [Calendar features](https://learn.microsoft.com/en-us/azure/machine-learning/concept-automl-forecasting-calendar-features?view=azureml-api-2) | 0.30 | Explains calendar/holiday feature concepts for AutoML forecasting; lacks detailed configuration parameters or numeric thresholds. |
 | [Compute instances](https://learn.microsoft.com/en-us/azure/machine-learning/concept-compute-instance?view=azureml-api-2) | 0.30 | Conceptual overview of compute instances; no explicit quotas, config parameter tables, or troubleshooting mappings.
 |
+| [Connections](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-connections?view=azureml-api-2) | 0.30 | Conceptual explanation of connections and secret management; likely lacks detailed RBAC roles, config tables, or numeric limits. |
 | [Create ML resources to get started](https://learn.microsoft.com/en-us/azure/machine-learning/quickstart-create-resources?view=azureml-api-2) | 0.30 | Step-by-step tutorial to create workspace and compute; no config parameter tables, limits, or tier matrices. |
 | [Create automated ML experiments](https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-first-experiment-automated-ml?view=azureml-api-2) | 0.30 | No-code AutoML classification tutorial; UI-driven steps without deep configuration or limits. |
 | [Curated environments](https://learn.microsoft.com/en-us/azure/machine-learning/resource-curated-environments?view=azureml-api-2) | 0.30 | Curated environments article is described as an overview of benefits and usage, not a parameter-level reference or limits page. |
@@ -661,23 +698,24 @@ confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synap
 | [Lag features](https://learn.microsoft.com/en-us/azure/machine-learning/concept-automl-forecasting-lags?view=azureml-api-2) | 0.30 | Describes lag and rolling window features conceptually; no detailed product-specific configuration tables or limits. |
 | [Launch VS Code remote](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-launch-vs-code-remote?view=azureml-api-2) | 0.30 | How-to for starting VS Code connected to a compute instance. Primarily connection steps; no clear evidence of detailed configuration tables, limits, or troubleshooting mappings.
 |
 | [MLOps capabilities](https://learn.microsoft.com/en-us/azure/machine-learning/concept-model-management-and-deployment?view=azureml-api-2) | 0.30 | Conceptual MLOps model management overview; focuses on lifecycle concepts and DevOps practices without product-specific numeric thresholds or configs. |
+| [Overview](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/overview?view=azureml-api-2) | 0.30 | Overview/index page for tools; summary suggests navigation plus high-level instructions, not detailed configuration tables or error mappings. |
 | [Pipeline endpoints](https://learn.microsoft.com/en-us/azure/machine-learning/migrate-to-v2-deploy-pipelines?view=azureml-api-2) | 0.30 | Short migration note about published pipelines to v2; lacks detailed configuration, limits, or troubleshooting content. |
 | [Prepare and explore data](https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-explore-data?view=azureml-api-2) | 0.30 | Tutorial on uploading and exploring data; procedural steps without detailed configuration tables or limits. |
 | [Secure code](https://learn.microsoft.com/en-us/azure/machine-learning/concept-secure-code-best-practice?view=azureml-api-2) | 0.30 | Described as secure code best practices and threats/mitigations in general terms. Likely focuses on conceptual secure coding guidance (e.g., only run trusted notebooks) without Azure-ML-specific configuration parameters, roles, or numeric thresholds, so it does not meet the expert-knowledge criteria for any sub-skill type. |
+| [Session](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-session?view=azureml-api-2) | 0.30 | Conceptual description of compute sessions; no clear indication of numeric limits, config parameter tables, or troubleshooting content.
 |
 | [Set up VS Code extension](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-setup-vs-code?view=azureml-api-2) | 0.30 | VS Code extension setup guide; focuses on how to install and connect, not detailed configuration options. |
 | [Share data assets](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-share-data-across-workspaces-with-registries?view=azureml-api-2) | 0.30 | Primarily explains how to share data via Azure ML registries and cross-workspace collaboration. Summary suggests a conceptual/how-to sharing workflow without detailed config tables, limits, or security role mappings. |
 | [Submit batch run and evaluate a flow](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-bulk-test-evaluate-flow?view=azureml-api-2) | 0.30 | How-to/tutorial style guidance on submitting batch runs and evaluating prompt flows; likely procedural without detailed limits, configuration tables, or error-code-based troubleshooting mappings. |
 | [Train & deploy image classification](https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-azure-ml-in-a-day?view=azureml-api-2) | 0.30 | Duplicate quickstart for Azure ML; same reasoning as index 28. |
 | [Train a model](https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-train-model?view=azureml-api-2) | 0.30 | Training tutorial using a sample dataset; no detailed product-specific configuration or limits. |
 | [Transform data](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-designer-transform-data?view=azureml-api-1) | 0.30 | How-to for transforming data in designer; mostly procedural UI steps and generic transformation concepts without detailed product-specific limits, configs, or troubleshooting matrices.
 |
-| [Variants](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-variants?view=azureml-api-2) | 0.30 | Conceptual article about variants for prompt tuning; likely guidance and examples but not configuration matrices, limits, or error-code troubleshooting. |
 | [What is RAG](https://learn.microsoft.com/en-us/azure/machine-learning/concept-retrieval-augmented-generation?view=azureml-api-2) | 0.30 | Conceptual explanation of RAG pattern and its use with prompt flow; primarily overview without detailed configuration tables or numeric thresholds. |
 | [What is Responsible AI?](https://learn.microsoft.com/en-us/azure/machine-learning/concept-responsible-ai?view=azureml-api-2) | 0.30 | High-level Responsible AI overview; conceptual and policy-focused without detailed configuration or numeric thresholds. |
 | [What is managed feature store](https://learn.microsoft.com/en-us/azure/machine-learning/concept-what-is-managed-feature-store?view=azureml-api-2) | 0.30 | High-level conceptual overview of managed feature store; mostly descriptive without detailed config, limits, or troubleshooting. |
 | [Featured models in the model catalog](https://learn.microsoft.com/en-us/azure/machine-learning/concept-models-featured?view=azureml-api-2) | 0.25 | Catalog listing of featured models and deployment options; largely reference/marketing without configuration tables, limits, or decision matrices in the summary. |
-| [Flows](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-flows?view=azureml-api-2) | 0.25 | Conceptual description of flows and lifecycle; no indication of numeric limits, config tables, or security/troubleshooting specifics. |
 | [MLflow models](https://learn.microsoft.com/en-us/azure/machine-learning/concept-mlflow-models?view=azureml-api-2) | 0.25 | Explains MLflow artifacts and models conceptually and how Azure ML uses them. No indication of numeric limits, configuration tables, or troubleshooting mappings.
 |
-| [Advance your maturity level](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-llmops-maturity?view=azureml-api-2) | 0.20 | GenAIOps maturity model and stages are conceptual guidance without product-specific numeric thresholds, configs, or decision matrices. |
+| [Tools](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-tools?view=azureml-api-2) | 0.25 | Explains what tools are in Prompt Flow; likely conceptual without specific config tables, limits, or error mappings. |
+| [Variants](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-variants?view=azureml-api-2) | 0.25 | Describes the concept of variants for prompt tuning; no evidence of numeric thresholds, config matrices, or product-specific troubleshooting. |
 | [Architecture & terms](https://learn.microsoft.com/en-us/azure/machine-learning/concept-azure-machine-learning-v2?view=azureml-api-2) | 0.20 | High-level architecture/concepts for Azure ML v2; lacks quantified thresholds or decision matrices. |
 | [Azure Machine Learning CLI and Python SDK](https://learn.microsoft.com/en-us/azure/machine-learning/concept-v2?view=azureml-api-2) | 0.20 | Conceptual comparison of CLI/SDK v1 vs v2; no concrete limits, configs, or troubleshooting content. |
 | [Azure Machine Learning SDK & CLI v1](https://learn.microsoft.com/en-us/azure/machine-learning/introduction?view=azureml-api-1) | 0.20 | Intro/deprecation notice for SDK & CLI v1; no detailed configuration tables or limits.
 |
@@ -689,6 +727,7 @@ confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synap
 | [Develop a feature set with a custom source](https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-develop-feature-set-with-custom-source?view=azureml-api-2) | 0.20 | Tutorial on developing a feature set with a custom source; focuses on conceptual and procedural guidance rather than detailed configuration parameters, limits, or error-code-based troubleshooting. |
 | [Enable recurrent materialization and run batch inference](https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-enable-recurrent-materialization-run-batch-inference?view=azureml-api-2) | 0.20 | Tutorial on enabling recurrent materialization and running batch inference; appears to be step-by-step usage guidance without explicit limits, configuration reference tables, or structured troubleshooting/decision content. |
 | [Experiment and train models using features](https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-experiment-train-models-using-features?view=azureml-api-2) | 0.20 | Tutorial-style walkthrough for experimenting and training models with managed feature store; no clear tables of limits, config matrices, error-code mappings, or product-specific best-practice/decision content beyond generic tutorial steps. |
+| [Flows](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-flows?view=azureml-api-2) | 0.20 | Conceptual overview of flows and lifecycle; appears descriptive without expert-level configuration, limits, or troubleshooting details. |
 | [MLflow and Azure Machine Learning](https://learn.microsoft.com/en-us/azure/machine-learning/concept-mlflow?view=azureml-api-2) | 0.20 | Conceptual article describing MLflow and Azure ML capabilities. Summary indicates overview of what MLflow does, not detailed configuration parameters or limits.
 |
 | [Manage environments in studio](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-manage-environments-in-studio?view=azureml-api-2) | 0.20 | Studio-focused how-to for creating and managing environments; primarily procedural without detailed parameter tables, limits, or product-specific troubleshooting. |
 | [Monitor Machine Learning](https://learn.microsoft.com/en-us/azure/machine-learning/monitor-azure-machine-learning?view=azureml-api-2) | 0.20 | High-level overview of monitoring Azure Machine Learning using Azure Monitor. Primarily conceptual and navigational, without detailed error codes, configuration parameter tables, or product-specific diagnostic mappings that would qualify as troubleshooting, configuration, or other expert-knowledge categories. |
@@ -703,8 +742,9 @@ confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synap
 | [Use Visual Studio Code](https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-train-deploy-image-classification-model-vscode?view=azureml-api-2) | 0.20 | Tutorial-style walkthrough for training and deploying an image classification model with VS Code; focuses on step-by-step usage rather than product-specific limits, configuration matrices, or troubleshooting details. |
 | [Use a terminal](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-access-terminal?view=azureml-api-2) | 0.20 | How-to guide for using the compute instance terminal (Git, package install, kernels). It’s procedural usage content without product-specific limits, configuration matrices, error-code mappings, or other expert-only details. |
 | [Visualize training results (preview)](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-visualize-jobs?view=azureml-api-2) | 0.20 | Describes using dashboards to visualize experiment results in Azure ML studio; primarily conceptual/UX guidance without numeric limits, configuration parameter tables, or troubleshooting mappings.
 |
-| [What is prompt flow?](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/overview-what-is-prompt-flow?view=azureml-api-2) | 0.20 | Conceptual overview of prompt flow; describes capabilities and benefits without detailed configuration, limits, or troubleshooting. |
+| [What is prompt flow?](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/overview-what-is-prompt-flow?view=azureml-api-2) | 0.20 | High-level overview of Prompt Flow; no detailed limits, configs, error codes, or decision matrices. |
 | [Workspaces](https://learn.microsoft.com/en-us/azure/machine-learning/concept-workspace?view=azureml-api-2) | 0.20 | Conceptual description of workspaces and access; no detailed RBAC role lists, limits, or configuration tables. |
+| [Advance your maturity level](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-llmops-maturity?view=azureml-api-2) | 0.10 | Describes GenAIOps/LLMOps maturity stages conceptually; no indication of concrete limits, configs, or product-specific error/decision matrices. |
 | [Assess errors in ML models](https://learn.microsoft.com/en-us/azure/machine-learning/concept-error-analysis?view=azureml-api-2) | 0.10 | Conceptual explanation of error analysis and assessing model errors across cohorts. Focuses on why error analysis is useful, not on specific configuration values, limits, or troubleshooting mappings. |
 | [Causal analysis](https://learn.microsoft.com/en-us/azure/machine-learning/concept-causal-inference?view=azureml-api-2) | 0.10 | Conceptual overview of causal inference and data-driven decision-making with Responsible AI dashboard and EconML. Describes purpose and high-level use, but no detailed configuration parameters, limits, error codes, or decision matrices with thresholds.
 |
 | [Compute targets](https://learn.microsoft.com/en-us/azure/machine-learning/concept-compute-target?view=azureml-api-2) | 0.10 | Conceptual explanation of compute targets and supported options; lacks numeric limits, tier-specific constraints, or detailed configuration parameters. |
@@ -712,7 +752,7 @@ confusable_not_for: Not for Azure Databricks (use azure-databricks), Azure Synap
 | [Explore Microsoft Foundry Models](https://learn.microsoft.com/en-us/azure/machine-learning/foundry-models-overview?view=azureml-api-2) | 0.10 | Overview of Microsoft Foundry Models and the model catalog; marketing/introductory content (e.g., number of models) rather than detailed limits, configuration, or decision matrices. |
 | [Glossary](https://learn.microsoft.com/en-us/azure/machine-learning/azure-machine-learning-glossary?view=azureml-api-2) | 0.10 | Glossary of terms; definitions only, no expert configuration, limits, or troubleshooting guidance. |
 | [Hub workspaces](https://learn.microsoft.com/en-us/azure/machine-learning/concept-hub-workspace?view=azureml-api-2) | 0.10 | Conceptual overview of hub workspaces, governance, and shared resources; does not include numeric limits, decision matrices, or detailed configuration tables. |
-| [Prompt flow Ecosystem](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/community-ecosystem?view=azureml-api-2) | 0.10 | High-level overview of the prompt flow ecosystem (tools, tutorials, SDK, CLI, VS Code extension) without detailed configuration parameters, limits, error codes, or product-specific decision matrices. |
+| [Prompt flow Ecosystem](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/community-ecosystem?view=azureml-api-2) | 0.10 | Ecosystem/marketing-style overview of tools and resources; not focused on limits, configs, or detailed patterns.
 |
 | [Responsible AI scorecard](https://learn.microsoft.com/en-us/azure/machine-learning/concept-responsible-ai-scorecard?view=azureml-api-2) | 0.10 | Overview of the Responsible AI scorecard concept and its role in sharing insights. Primarily descriptive/marketing-style content without concrete configuration parameters, limits, or troubleshooting details. |
 | [Searching for Assets and Resources](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-search-assets?view=azureml-api-2) | 0.10 | How-to guidance for using search to find assets across workspaces; no specific limits, configuration parameter tables, or troubleshooting mappings. |
 | [What is Azure Machine Learning?](https://learn.microsoft.com/en-us/azure/machine-learning/overview-what-is-azure-machine-learning?view=azureml-api-2) | 0.10 | High-level product overview of Azure Machine Learning; no detailed limits, configs, or error mappings. |
diff --git a/products/azure-managed-grafana/azure-managed-grafana.csv b/products/azure-managed-grafana/azure-managed-grafana.csv
index 2be98692..0ebe051f 100644
--- a/products/azure-managed-grafana/azure-managed-grafana.csv
+++ b/products/azure-managed-grafana/azure-managed-grafana.csv
@@ -1,13 +1,14 @@
 url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type
-https://learn.microsoft.com/en-us/azure/managed-grafana/agent-framework-dashboard,Agent Framework dashboard,Create an Agent Framework dashboard in Azure Managed Grafana,,"Learn how to create and customize an Agent Framework dashboard in Azure Managed Grafana to monitor AI agent performance, token usage, costs, and errors.","In this guide, you learn how to create and customize an Agent Framework dashboard in Azure Managed Grafana. After adding instrumentation to send data to Application Insights, you can use this prebuilt dashboard to visualize and monitor the performance of your Agent Framework applications.
This dashboard helps you track the health, performance, and cost of individual AI agents built with the Microsoft Agent Framework. The Microsoft Agent Framework is an open-source development kit for building AI a",2026-04-14T17:11:00.000Z,how-to,,0.2,False,"From the summary, the page is a how-to guide for creating and customizing an Agent Framework dashboard in Azure Managed Grafana. It describes using a prebuilt dashboard to visualize performance, token usage, costs, and errors, but there is no indication of specific limits, configuration parameter tables, error-code mappings, or other product-specific expert details as defined in the sub-skill types. It appears to be primarily tutorial/usage content rather than expert reference material.",updated
+https://learn.microsoft.com/en-us/azure/managed-grafana/agent-framework-dashboard,Agent Framework dashboard,Create an Agent Framework dashboard in Azure Managed Grafana,,"Learn how to create and customize an Agent Framework dashboard in Azure Managed Grafana to monitor AI agent performance, token usage, costs, and errors.","In this guide, you learn how to create and customize an Agent Framework dashboard in Azure Managed Grafana. After adding instrumentation to send data to Application Insights, you can use this prebuilt dashboard to visualize and monitor the performance of your Agent Framework applications. This dashboard helps you track the health, performance, and cost of individual AI agents built with the Microsoft Agent Framework. The Microsoft Agent Framework is an open-source development kit for building AI a",2026-04-14T17:11:00.000Z,how-to,,0.2,False,"From the summary, the page is a how-to guide for creating and customizing an Agent Framework dashboard in Azure Managed Grafana.
It describes using a prebuilt dashboard to visualize performance, token usage, costs, and errors, but there is no indication of specific limits, configuration parameter tables, error-code mappings, or other product-specific expert details as defined in the sub-skill types. It appears to be primarily tutorial/usage content rather than expert reference material.",unchanged
 https://learn.microsoft.com/en-us/azure/managed-grafana/agent-framework-workflow-dashboard,Agent Framework Workflow dashboard,Create an Agent Framework Workflow dashboard in Azure Managed Grafana,Monitor Agent Framework workflows in Grafana,Learn how to create and customize an Agent Framework Workflow dashboard in Azure Managed Grafana to monitor multi-agent workflow execution and performance.,"In this guide, you learn how to create and customize an Agent Framework Workflow dashboard in Azure Managed Grafana. The Microsoft Agent Framework has built-in OpenTelemetry support. After adding instrumentation to send data to Application Insights, you can use this prebuilt dashboard to visualize and monitor the execution and performance of your Agent Framework workflows. This dashboard is designed specifically for monitoring graph-based workflows that connect multiple agents and functions to p",2025-11-19T12:10:00.000Z,how-to,integrations,0.7,True,Prebuilt dashboard for multi-agent workflows with OpenTelemetry and Application Insights; product-specific metrics and visualization patterns.,unchanged
 https://learn.microsoft.com/en-us/azure/managed-grafana/azure-ai-foundry-dashboard,Microsoft Foundry dashboard,Create a Microsoft Foundry dashboard in Azure Managed Grafana,Build Azure AI Foundry monitoring dashboards in Grafana,Learn how to set up and configure a Microsoft Foundry metrics dashboard in Azure Managed Grafana to monitor AI workload performance and resource utilization.,"In this guide, you learn how to set up a Microsoft Foundry metrics dashboard in Azure Managed Grafana.
This dashboard tracks inference latency, throughput, token usage, and API call success rates to help you optimize costs, identify performance bottlenecks, and maintain the health of your AI resources.",2026-02-12T23:11:00.000Z,how-to,integrations,0.7,True,"Prebuilt dashboard for AI metrics (latency, throughput, token usage); includes specific metrics, queries, and panel configurations tied to Azure AI Foundry.",unchanged https://learn.microsoft.com/en-us/azure/managed-grafana/encryption,Encryption,Encryption in Azure Managed Grafana,Understand data storage and encryption in Managed Grafana,"In this guide, learn basic information about data storage and encryption within Azure Managed Grafana.",This article provides a short description of encryption within Azure Managed Grafana.,2025-02-20T12:33:00.000Z,concept-article,security,0.7,True,Product-specific description of how data is stored and encrypted; includes implementation details not obvious from generic knowledge.,unchanged https://learn.microsoft.com/en-us/azure/managed-grafana/faq,FAQ,Azure Managed Grafana FAQ,,Frequently asked questions about Azure Managed Grafana,This article answers frequently asked questions about Azure Managed Grafana.,2025-05-19T15:23:00.000Z,reference,,0.3,False,"FAQ pages are often mixed, but based on the description this appears to be general Q&A about Azure Managed Grafana (what it is, availability, support, pricing, etc.) rather than detailed limits, configuration tables, or error-code-based troubleshooting. 
Without evidence of specific numeric limits, config parameter tables, or error mappings, it doesn't meet the expert-knowledge criteria for any sub-skill type.",unchanged https://learn.microsoft.com/en-us/azure/managed-grafana/find-help-open-support-ticket,Support,Find help or open a ticket for Azure Managed Grafana,,"Learn how to find help, get technical information or open a support ticket for Azure Managed Grafana","In the page below, find out how you can get technical information about Azure Managed Grafana, look up answers to your questions or open a support ticket.",2025-02-06T18:03:00.000Z,troubleshooting,,0.1,False,"This page describes how to find help and open a support ticket, which is process/navigation content. It doesn't contain product-specific limits, configuration parameters, error-code mappings, or decision matrices, so it doesn't qualify as expert knowledge for any sub-skill type.",unchanged https://learn.microsoft.com/en-us/azure/managed-grafana/grafana-app-ui,Grafana UI,Grafana UI,,"Learn about the Grafana UI components--panels, visualizations and dashboards.","This reference covers the Grafana web application's main UI components, including panels, visualizations, and dashboards. For consistency, this document links to the corresponding topics in the Grafana documentation.",2023-03-24T00:00:00.000Z,reference,,0.3,False,Reference for Grafana UI components linking to upstream Grafana docs; mostly conceptual UI overview without Azure-specific expert configuration or limits.,unchanged -https://learn.microsoft.com/en-us/azure/managed-grafana/grafana-mcp-server,Remote MCP server,Configure an Azure Managed Grafana remote MCP server,Configure Azure Managed Grafana AMG-MCP server access,"Discover MCP tools for Azure Managed Grafana. Query Application Insights, Azure Data Explorer, and more with secure authentication and easy setup.","Every Azure Managed Grafana instance includes a built-in Model Context Protocol (MCP) server endpoint called AMG-MCP. 
The AMG-MCP endpoint allows tools and applications to interact programmatically with the Grafana instance using the MCP. The AMG-MCP endpoint uses the same authentication mechanism as the Grafana instance, supporting both Entra ID and the Grafana service account token.",2026-04-09T17:25:00.000Z,concept-article,configuration,0.68,True,"The page describes product-specific configuration for the built-in AMG-MCP endpoint on Azure Managed Grafana, including how it uses the same authentication mechanisms as the Grafana instance (Entra ID and service account tokens). This is concrete, service-specific configuration and integration behavior that isn't generic knowledge, but it doesn't focus on limits, troubleshooting, or architecture patterns.",unchanged -https://learn.microsoft.com/en-us/azure/managed-grafana/grafana-settings,Grafana settings,How to Configure Grafana Settings - Azure Managed Grafana,Configure Azure Managed Grafana instance settings,"Learn how to configure Grafana settings in Azure Managed Grafana, including enabling Viewers can Edit and External Enabled.","This article provides step-by-step instructions on how to configure Grafana settings in Azure Managed Grafana. These settings allow you to customize your Grafana instance by enabling or disabling specific options. These Grafana settings are also referenced in Grafana's documentation, underGrafana configuration.",2025-11-18T18:43:00.000Z,how-to,configuration,0.8,True,How-to for Grafana settings like 'Viewers can Edit' and 'External Enabled'; likely includes specific setting names and allowed values.,unchanged +https://learn.microsoft.com/en-us/azure/managed-grafana/grafana-mcp-server,Remote MCP server,Configure an Azure Managed Grafana Remote MCP server,Use Azure Managed Grafana MCP server programmatically,"Discover MCP tools for Azure Managed Grafana. 
Query Application Insights, Azure Data Explorer, and more with secure authentication and easy setup.","Every Azure Managed Grafana instance includes a built-in Model Context Protocol (MCP) server endpoint. The Azure Managed Grafana MCP endpoint allows tools and applications to interact programmatically with the Grafana instance that's using the MCP. The Azure Managed Grafana MCP endpoint uses the same authentication mechanism as the Grafana instance, supporting both Microsoft Entra ID and the Grafana service account token.",2026-04-23T17:12:00.000Z,concept-article,integrations,0.7,True,"Describes the built-in Model Context Protocol (MCP) server endpoint, its authentication mechanisms, and how tools interact with it. This is an integration-focused pattern with product-specific endpoint/auth details, fitting the integrations category.",updated +https://learn.microsoft.com/en-us/azure/managed-grafana/grafana-opentelemetry-app-insights,Ingest data via OpenTelemetry Collector,Ingest data into Application Insights via OpenTelemetry Collector for Azure Managed Grafana dashboards,Ingest telemetry into App Insights for Grafana,"Learn how to run an OpenTelemetry Collector with the Azure Monitor Exporter to ingest telemetry from GitHub Copilot, Claude Code, and OpenClaw into Application Insights for Azure Managed Grafana dashb","Several Azure Managed Grafana dashboards, including GitHub Copilot, Claude Code, and OpenClaw, visualize telemetry that flows into Azure Application Insights via OpenTelemetry (OTLP). This guide walks through the end-to-end ingestion pipeline: running an OpenTelemetry Collector with the Azure Monitor Exporter, then pointing each source application at it.",2026-04-21T22:10:00.000Z,how-to,integrations,0.76,True,"Describes an end-to-end ingestion pipeline using OpenTelemetry Collector with Azure Monitor Exporter and specific source applications. 
This is a concrete integration pattern with product-specific configuration and flow, matching the integrations category.",new +https://learn.microsoft.com/en-us/azure/managed-grafana/grafana-settings,Grafana settings,Configure Grafana Settings - Azure Managed Grafana,Configure Azure Managed Grafana instance settings,Learn how to configure Grafana settings in Azure Managed Grafana.,"This article provides step-by-step instructions on how to configure Grafana settings in Azure Managed Grafana. These settings allow you to customize your Grafana instance by enabling or disabling specific options. You can also reference these settings in Grafana documentation, under Grafana configuration.",2026-04-23T17:12:00.000Z,how-to,configuration,0.78,True,"A Grafana settings page for a managed service typically lists concrete configuration keys, allowed values, and sometimes defaults that are specific to Azure Managed Grafana. This matches the configuration category (named settings with specific values/ranges) rather than a generic tutorial.",updated https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-authentication-permissions,Configure Grafana resource authentication and permissions,How to configure authentication and permissions in Azure Managed Grafana,Configure authentication and permissions for Managed Grafana,Learn how to configure Azure Managed Grafana authentication permissions using a system-assigned managed identity or a service principal,"To process data, Azure Managed Grafana needs permission to access data sources. In this guide, learn how to set up authentication in an Azure Managed Grafana instance, so that Grafana can access data sources using a managed identity or a service principal. 
This guide also introduces the action of adding a Monitoring Reader role assignment on the target subscription to provide read-only access to monitoring data across all resources within the subscription.",2025-10-01T11:10:00.000Z,how-to,security,0.85,True,"Covers managed identity vs service principal, and specific RBAC role (Monitoring Reader) assignments and scopes for data access.",unchanged https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-bundled-prometheus,Configure bundled Prometheus,Configure Bundled Prometheus (preview) in Azure Managed Grafana,Configure bundled Prometheus integration in Managed Grafana,"Learn how to configure bundled Prometheus in Azure Managed Grafana. This guide covers enabling the integration, setting up Grafana-managed recording rules, and visualizing Prometheus metrics.","Azure Managed Grafana offers a bundled Prometheus integration in preview that lets you connect a selected Azure Monitor workspace (managed Prometheus) to your Grafana instance and immediately use it as both a read and remote-write backend for Grafana-managed recording rules. 
By connecting your chosen Azure Monitor workspace to Grafana, you can periodically pre-compute frequently used or computationally expensive queries, saving the results as a new time series metric back into the same workspace",2025-07-18T11:10:00.000Z,how-to,integrations,0.8,True,"Describes preview bundled Prometheus integration, Grafana-managed recording rules, and remote-write backend configuration; includes product-specific settings and behavior.",unchanged https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-configure-mcp-for-ai-foundry,Configure MCP for AI Foundry agents,Configure Azure Managed Grafana MCP for Azure AI Foundry agents,Integrate Azure Managed Grafana MCP with Azure AI Foundry agents,"Learn how to configure Azure Managed Grafana MCP in Azure AI Foundry so your agent can query Azure resources, metrics, and logs.","This article shows how to configure the Azure Managed Grafana MCP endpoint in an Azure AI Foundry agent. After configuration, your agent can use MCP tools to query Azure resources, metrics, logs, and dashboards through your Azure Managed Grafana workspace.",2026-03-10T22:11:00.000Z,how-to,integrations,0.7,True,"Covers how to wire the Azure Managed Grafana MCP endpoint into Azure AI Foundry agents so they can query Azure resources, metrics, logs, and dashboards. 
This is a product-specific integration pattern between two Azure services with concrete configuration steps and parameters.",unchanged @@ -19,7 +20,7 @@ https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-data-source-plugi https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-deterministic-ip,Use deterministic outbound IPs,How to set up and use deterministic outbound IPs - Azure Managed Grafana,Set up deterministic outbound IPs for Managed Grafana,Learn how to activate deterministic outbound IP support used by Azure Managed Grafana to communicate with data sources.,"In this guide, learn how to activate deterministic outbound IP support used by Azure Managed Grafana to communicate with data sources, disable public access and set up a firewall rule to allow inbound requests from your Grafana workspace. Note The deterministic outbound IPs feature is only accessible for customers with a Standard plan. For more information about plans, go to pricing plans.",2025-02-20T12:33:00.000Z,how-to,security,0.75,True,"Describes deterministic outbound IP feature, plan requirement (Standard), and firewall rule configuration; product-specific networking and security behavior.",unchanged https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-enable-zone-redundancy,Enable zone redundancy,How to enable zone redundancy in Azure Managed Grafana,Enable zone-redundant Azure Managed Grafana workspaces,Learn how to create a zone-redundant Azure Managed Grafana workspace for protection against datacenter failure.,"Azure Managed Grafana offers a zone-redundant option to protect your workspace against datacenter failure. Enabling zone redundancy for Azure Managed Grafana lets you deploy your Azure Managed Grafana resources across a minimum of three Azure availability zones within the same Azure region. In this how-to guide, learn how to enable zone redundancy for Azure Managed Grafana during the creation of your Azure Managed Grafana workspace. 
Note Zone redundancy for Azure Managed Grafana is a billable opti",2025-02-20T12:33:00.000Z,how-to,deployment,0.7,True,"Describes zone redundancy option, availability zone usage, and billing implications; product-specific deployment/reliability configuration.",unchanged https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-grafana-enterprise,Enable Grafana Enterprise,Subscribe to Grafana Enterprise,Activate and manage Grafana Enterprise plans in Azure,Activate Grafana Enterprise to access Grafana Enterprise plugins within Azure Managed Grafana,"In this guide, learn how to activate the Grafana Enterprise add-on in Azure Managed Grafana, update your Grafana Enterprise plan, and access Grafana Enterprise plugins. The Grafana Enterprise plans offered through Azure Managed Grafana enable users to access Grafana Enterprise plugins to do more with Azure Managed Grafana. Note To activate the Grafana Enterprise option, your Azure Managed Grafana workspace must be using the Standard plan. For more information about plans, go to pricing plans. Graf",2025-09-17T11:11:00.000Z,how-to,decision-making,0.7,True,"Describes prerequisites (Standard plan), plan options, and how to update plans; supports decision-making about Enterprise add-on usage and access to plugins.",unchanged -https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-manage-access-permissions-users-identities,Manage access and permissions for users and identities,Manage access and permissions for users and identities - Azure Managed Grafana,Manage user and identity roles in Managed Grafana,"Learn how you can manage access permissions to Azure Managed Grafana by assigning a Grafana role to a user, group, service principal, or a managed identity.","In today's collaborative work environments, multiple teams often need to access and manage the same monitoring dashboards. 
Whether it's a DevOps team monitoring application performance or a support team troubleshooting customer issues, having the right access permissions is crucial. Azure Managed Grafana simplifies this process by allowing you to set varying levels of permissions for your team members and identities. This guide walks you through the supported Grafana roles and shows you how to u",2025-02-20T12:33:00.000Z,how-to,security,0.8,True,"Explains supported Grafana roles and how to assign them to users, groups, service principals, and managed identities; product-specific access model.",unchanged +https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-manage-access-permissions-users-identities,Manage access and permissions for users and identities,Manage access and permissions for users and identities - Azure Managed Grafana,Manage Azure Managed Grafana roles and access,"Learn how you can manage access permissions to Azure Managed Grafana by assigning a Grafana role to a user, group, service principal, or a managed identity.","In today's collaborative work environments, multiple teams often need to access and manage the same monitoring dashboards. Whether it's a DevOps team monitoring application performance or a support team troubleshooting customer issues, having the right access permissions is crucial. Azure Managed Grafana simplifies this process by allowing you to set varying levels of permissions for your team members and identities. This guide walks you through the supported Grafana roles and shows you how to u",2026-04-20T22:11:00.000Z,how-to,security,0.8,True,"Covers managing access permissions, Grafana roles, and assignment to users, groups, service principals, and managed identities. 
This is product-specific IAM/RBAC guidance with concrete role/permission behavior, fitting the security category.",updated https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-manage-plugins,Manage plugins,How to manage plugins in Azure Managed Grafana,Manage Grafana plugins in Azure Managed Grafana,"In this how-to guide, discover how you can add a Grafana plugin or remove a Grafana plugin you no longer need.","Grafana supports data source, panel, and app plugins. When you create a new Grafana instance, some plugins, such as Azure Monitor, are installed by default. In the following guide, learn how you can add or remove optional plugins. Note Installing and removing plugins isn't available from the Grafana UI or the Azure CLI at this stage. Plugin management is done from the Azure Managed Grafana workspace in the Azure portal.",2024-12-20T12:09:00.000Z,how-to,configuration,0.7,True,Explains how to add/remove plugins via Azure portal rather than Grafana UI/CLI; product-specific configuration path and constraints.,unchanged https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-migrate,Migrate to Azure Managed Grafana,Migrate to Azure Managed Grafana,Migrate self-hosted or cloud Grafana to Azure Managed Grafana,Learn how to migrate content from a self-hosted or a cloud-managed Grafana to Azure Managed Grafana using the Grafana UI or the Azure CLI.,This guide shows how to migrate content from a local or a cloud-managed Grafana to Azure Managed Grafana using the Azure CLI. 
The following elements can be migrated automatically:,2025-10-30T08:00:00.000Z,how-to,decision-making,0.7,True,Migration guide with what can be migrated automatically and how; supports decision-making and concrete migration steps between environments.,unchanged https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-migrate-essential-service-tier,Migrate from Essential service tier,Migrate from Azure Managed Grafana Essential Service Tier,Migrate from Essential to Standard or Azure Monitor dashboards,Migrate from Azure Managed Grafana Essential service tier before retirement. Step-by-step guide to upgrade to Standard service tier or move to Azure Monitor dashboards with Grafana.,"The Azure Managed Grafana Essential service tier is being transitioned to Azure Monitor dashboards with Grafana (available for free in Azure portal) and/or to Azure Managed Grafana Standard service tier. With this transition, creation of new Essential tier workspaces is now disabled. This article outlines the transition timeline, guides you through the transition options. Two migration paths are available to ensure continuity of your Grafana dashboards and monitoring capabilities: Choose the opt",2026-01-28T06:18:00.000Z,how-to,decision-making,0.8,True,Describes retirement timeline for Essential tier and two migration paths with guidance; tier-selection and migration decision content.,unchanged diff --git a/products/azure-managed-grafana/report.md b/products/azure-managed-grafana/report.md index 6e317883..22adc7e2 100644 --- a/products/azure-managed-grafana/report.md +++ b/products/azure-managed-grafana/report.md @@ -1,15 +1,14 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: - integrations: Integrating Managed Grafana with Azure AI/Agent Framework, Prometheus, - AKS, Azure Monitor, and Data Explorer, plus configuring data sources, alerts, - and private/managed connections. 
- security: 'Securing Managed Grafana: auth/permissions, Entra/Team Sync, roles, service - accounts/tokens, private access/endpoints, outbound IPs, data encryption, and + integrations: Patterns and code to integrate Managed Grafana with Azure Monitor, + Prometheus, App Insights, AKS, Azure AI Foundry, MCP, and Data Explorer, plus + managing data source plugins and alerts. + security: 'Securing Managed Grafana: auth and RBAC, Entra/Team Sync, service accounts/tokens, + private endpoints and outbound IPs, data encryption, Azure Monitor access, and security best practices.' - configuration: 'Configuring Azure Managed Grafana workspaces: access control, instance - settings, plugins, metrics/diagnostics via Azure Monitor, and SMTP email alert - setup.' + configuration: 'Configuring Managed Grafana workspaces: instance settings, plugins, + metrics via Azure Monitor, diagnostic logging, and SMTP email alert setup.' deployment: Designing highly available Azure Managed Grafana workspaces, including reliability features, SLAs, and enabling zone-redundant deployments for resiliency. decision-making: Guidance on choosing and managing Grafana Enterprise plans, migrating @@ -21,31 +20,29 @@ category_descriptions: access, configuration, and private endpoint connectivity and DNS problems. skill_description: Expert knowledge for Azure Managed Grafana development including troubleshooting, decision making, limits & quotas, security, configuration, integrations - & coding patterns, and deployment. Use when integrating Azure Managed Grafana with - Azure Monitor/AKS, Entra auth, plugins, email alerts, or zone-redundant setups, - and other Azure Managed Grafana related development tasks. Not for Azure Monitor - (use azure-monitor), Azure Synapse Analytics (use azure-synapse-analytics), Azure - Data Explorer (use azure-data-explorer). 
-use_when: Use when integrating Azure Managed Grafana with Azure Monitor/AKS, Entra - auth, plugins, email alerts, or zone-redundant setups, and other Azure Managed Grafana - related development tasks. -confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Synapse Analytics - (use azure-synapse-analytics), Azure Data Explorer (use azure-data-explorer). + & coding patterns, and deployment. Use when connecting Azure Monitor/Prometheus, + configuring Entra auth/RBAC, zone-redundant workspaces, alerts, or image/report + rendering, and other Azure Managed Grafana related development tasks. Not for Azure + Monitor (use azure-monitor). +use_when: Use when connecting Azure Monitor/Prometheus, configuring Entra auth/RBAC, + zone-redundant workspaces, alerts, or image/report rendering, and other Azure Managed + Grafana related development tasks. +confusable_not_for: Not for Azure Monitor (use azure-monitor). --- # Azure Managed Grafana Crawl Report ## Summary -- **Total Pages**: 43 -- **Fetched**: 43 +- **Total Pages**: 44 +- **Fetched**: 44 - **Fetch Failed**: 0 -- **Classified**: 34 +- **Classified**: 35 - **Unclassified**: 9 ### Incremental Update -- **New Pages**: 0 -- **Updated Pages**: 1 -- **Unchanged**: 42 +- **New Pages**: 1 +- **Updated Pages**: 3 +- **Unchanged**: 40 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-managed-grafana/azure-managed-grafana.csv` @@ -53,21 +50,29 @@ confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Synapse Ana | Type | Count | Percentage | |------|-------|------------| -| configuration | 6 | 14.0% | -| decision-making | 4 | 9.3% | +| configuration | 5 | 11.4% | +| decision-making | 4 | 9.1% | | deployment | 1 | 2.3% | -| integrations | 9 | 20.9% | -| limits-quotas | 2 | 4.7% | -| security | 10 | 23.3% | -| troubleshooting | 2 | 4.7% | -| *(Unclassified)* | 9 | 20.9% | +| integrations | 11 | 25.0% | +| limits-quotas | 2 | 4.5% | +| security | 10 | 22.7% | +| 
troubleshooting | 2 | 4.5% | +| *(Unclassified)* | 9 | 20.5% | ## Changes +### New Pages + +- [Ingest data via OpenTelemetry Collector](https://learn.microsoft.com/en-us/azure/managed-grafana/grafana-opentelemetry-app-insights) + ### Updated Pages -- [Agent Framework dashboard](https://learn.microsoft.com/en-us/azure/managed-grafana/agent-framework-dashboard) - - Updated: 2025-11-19T12:10:00.000Z → 2026-04-14T17:11:00.000Z +- [Grafana settings](https://learn.microsoft.com/en-us/azure/managed-grafana/grafana-settings) + - Updated: 2025-11-18T18:43:00.000Z → 2026-04-23T17:12:00.000Z +- [Remote MCP server](https://learn.microsoft.com/en-us/azure/managed-grafana/grafana-mcp-server) + - Updated: 2026-04-09T17:25:00.000Z → 2026-04-23T17:12:00.000Z +- [Manage access and permissions for users and identities](https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-manage-access-permissions-users-identities) + - Updated: 2025-02-20T12:33:00.000Z → 2026-04-20T22:11:00.000Z ## Classified Pages @@ -82,11 +87,12 @@ confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Synapse Ana | [Add Azure Data Explorer](https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-connect-azure-data-explorer) | integrations | 0.80 | Details ADX data source configuration and authentication options; includes specific parameters and auth flows unique to this integration. | | [Configure SMTP settings](https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-smtp-settings) | configuration | 0.80 | Details SMTP server settings and how to enable email alerts via portal/CLI; includes specific configuration parameters and constraints (e.g., not available at creation time). 
| | [Configure bundled Prometheus](https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-bundled-prometheus) | integrations | 0.80 | Describes preview bundled Prometheus integration, Grafana-managed recording rules, and remote-write backend configuration; includes product-specific settings and behavior. | -| [Grafana settings](https://learn.microsoft.com/en-us/azure/managed-grafana/grafana-settings) | configuration | 0.80 | How-to for Grafana settings like 'Viewers can Edit' and 'External Enabled'; likely includes specific setting names and allowed values. | -| [Manage access and permissions for users and identities](https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-manage-access-permissions-users-identities) | security | 0.80 | Explains supported Grafana roles and how to assign them to users, groups, service principals, and managed identities; product-specific access model. | +| [Manage access and permissions for users and identities](https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-manage-access-permissions-users-identities) | security | 0.80 | Covers managing access permissions, Grafana roles, and assignment to users, groups, service principals, and managed identities. This is product-specific IAM/RBAC guidance with concrete role/permission behavior, fitting the security category. | | [Migrate from Essential service tier](https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-migrate-essential-service-tier) | decision-making | 0.80 | Describes retirement timeline for Essential tier and two migration paths with guidance; tier-selection and migration decision content. | | [Set up private access](https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-set-up-private-access) | security | 0.80 | Covers disabling public access and configuring private endpoints; product-specific network security configuration. 
| | [Use Grafana Team Sync](https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-sync-teams-with-entra-groups) | security | 0.80 | Describes mapping Microsoft Entra groups to Grafana Teams and interaction with Azure RBAC roles; product-specific permission configuration. | +| [Grafana settings](https://learn.microsoft.com/en-us/azure/managed-grafana/grafana-settings) | configuration | 0.78 | A Grafana settings page for a managed service typically lists concrete configuration keys, allowed values, and sometimes defaults that are specific to Azure Managed Grafana. This matches the configuration category (named settings with specific values/ranges) rather than a generic tutorial. | +| [Ingest data via OpenTelemetry Collector](https://learn.microsoft.com/en-us/azure/managed-grafana/grafana-opentelemetry-app-insights) | integrations | 0.76 | Describes an end-to-end ingestion pipeline using OpenTelemetry Collector with Azure Monitor Exporter and specific source applications. This is a concrete integration pattern with product-specific configuration and flow, matching the integrations category. | | [Configure data sources](https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-data-source-plugins-managed-identity) | integrations | 0.75 | Covers supported data sources per plan and how to add/configure/remove them; includes plan-specific support matrix and data source configuration details. | | [Connect to a data source privately](https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-connect-to-data-source-privately) | security | 0.75 | Explains managed private endpoints in a managed VNet and how they link to Azure data sources; product-specific private connectivity configuration. 
| | [Use deterministic outbound IPs](https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-deterministic-ip) | security | 0.75 | Describes deterministic outbound IP feature, plan requirement (Standard), and firewall rule configuration; product-specific networking and security behavior. | @@ -100,9 +106,9 @@ confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Synapse Ana | [Manage plugins](https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-manage-plugins) | configuration | 0.70 | Explains how to add/remove plugins via Azure portal rather than Grafana UI/CLI; product-specific configuration path and constraints. | | [Microsoft Foundry dashboard](https://learn.microsoft.com/en-us/azure/managed-grafana/azure-ai-foundry-dashboard) | integrations | 0.70 | Prebuilt dashboard for AI metrics (latency, throughput, token usage); includes specific metrics, queries, and panel configurations tied to Azure AI Foundry. | | [Migrate to Azure Managed Grafana](https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-migrate) | decision-making | 0.70 | Migration guide with what can be migrated automatically and how; supports decision-making and concrete migration steps between environments. | +| [Remote MCP server](https://learn.microsoft.com/en-us/azure/managed-grafana/grafana-mcp-server) | integrations | 0.70 | Describes the built-in Model Context Protocol (MCP) server endpoint, its authentication mechanisms, and how tools interact with it. This is an integration-focused pattern with product-specific endpoint/auth details, fitting the integrations category. | | [Use Azure Monitor alerts with Grafana](https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-use-azure-monitor-alerts) | integrations | 0.70 | Explains how Azure Monitor and Grafana alerts interact, including plan-specific availability (Essential lacks Grafana alerts) and shared compute/query throttling limits. 
| | [Use service accounts](https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-service-accounts) | security | 0.70 | Explains enabling service accounts and creating tokens for API access; product-specific identity and access configuration. | -| [Remote MCP server](https://learn.microsoft.com/en-us/azure/managed-grafana/grafana-mcp-server) | configuration | 0.68 | The page describes product-specific configuration for the built-in AMG-MCP endpoint on Azure Managed Grafana, including how it uses the same authentication mechanisms as the Grafana instance (Entra ID and service account tokens). This is concrete, service-specific configuration and integration behavior that isn't generic knowledge, but it doesn't focus on limits, troubleshooting, or architecture patterns. | | [Monitor using diagnostic settings](https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-monitor-managed-grafana-workspace) | configuration | 0.65 | Details diagnostic settings and event log categories for the service; product-specific logging configuration. | | [Monitor using metrics](https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-monitor-managed-grafana-metrics) | configuration | 0.65 | Shows which workspace metrics are exposed and how to configure metric charts; includes specific metric names and usage patterns. | | [Upgrade to Grafana 12](https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-upgrade-grafana-12) | decision-making | 0.65 | Provides product-specific guidance on upgrading workspaces from Grafana 11 to 12, including retirement and automatic-upgrade dates and conditions. This is expert decision and migration guidance (when and how to upgrade, implications of missing the deadline) that goes beyond generic upgrade concepts. 
| diff --git a/products/azure-managed-lustre/azure-managed-lustre.csv b/products/azure-managed-lustre/azure-managed-lustre.csv index 7b1c5a8a..1ff6a0b1 100644 --- a/products/azure-managed-lustre/azure-managed-lustre.csv +++ b/products/azure-managed-lustre/azure-managed-lustre.csv @@ -26,6 +26,6 @@ https://learn.microsoft.com/en-us/azure/azure-managed-lustre/optimize-file-layou https://learn.microsoft.com/en-us/azure/azure-managed-lustre/optimize-performance,Optimize Azure Managed Lustre performance,Optimize Azure Managed Lustre performance - Azure Managed Lustre File System,Optimize Azure Managed Lustre performance with network configuration,"Learn how to optimize Azure Managed Lustre (AMLFS) performance with recommended network configuration, including accelerated networking, availability zone placement, and routing.","This reference describes how network configuration for your client virtual machines (VMs) and Azure Managed Lustre (AMLFS) file systems affects overall performance. Network throughput and latency between AMLFS and your clients directly affects job completion times. To get predictable, high performance, follow these design principles:",2026-02-17T18:04:00.000Z,reference,best-practices,0.8,True,"Provides product-specific performance tuning guidance (e.g., VM sizes, accelerated networking, AZ placement, routing) with concrete configuration recommendations.",unchanged https://learn.microsoft.com/en-us/azure/azure-managed-lustre/root-squash-configure-settings,Configure root squash settings,Configure root squash settings for Azure Managed Lustre file systems - Azure Managed Lustre File System,Configure root squash security settings for Azure Managed Lustre,Learn how to configure root squash settings for Azure Managed Lustre file systems.,"Root squash is a security feature that prevents a user with root privileges on a client from accessing files on the remote Managed Lustre file system. 
This functionality is achieved using the Lustre nodemap feature, and is an important part of protecting user data and system settings from manipulation by untrusted or compromised clients. In this article, you learn how to configure root squash settings for Azure Managed Lustre file systems. You can configure root squash settings via REST API requ",2024-11-11T08:00:00.000Z,how-to,security,0.7,True,Describes nodemap-based root squash options and how to set them via REST or other APIs; these are product-specific security configuration details.,unchanged https://learn.microsoft.com/en-us/azure/azure-managed-lustre/troubleshoot-deployment,Troubleshoot cluster deployment failures,Troubleshoot Azure Managed Lustre cluster deployment issues - Azure Managed Lustre File System,Troubleshoot Azure Managed Lustre cluster deployment issues,Learn how to troubleshoot common cluster deployment issues in Azure Managed Lustre,"In this article, you learn how to troubleshoot common issues that you might encounter when deploying an Azure Managed Lustre file system.",2024-11-01T17:02:00.000Z,troubleshooting-general,troubleshooting,0.85,True,Explicit troubleshooting guide for deployment; likely organized by error codes/messages and causes with specific resolutions unique to this service.,unchanged -https://learn.microsoft.com/en-us/azure/azure-managed-lustre/troubleshoot-performance,Troubleshoot cluster performance issues,Troubleshoot Azure Managed Lustre cluster performance issues - Azure Managed Lustre File System,Diagnose and fix Azure Managed Lustre performance issues,Learn how to troubleshoot common cluster performance issues in Azure Managed Lustre,"In this article, you learn how to troubleshoot common issues that you might encounter when deploying an Azure Managed Lustre file system.",2026-02-27T18:05:00.000Z,troubleshooting-general,troubleshooting,0.86,True,"The page is explicitly a troubleshooting guide for Azure Managed Lustre performance, organized around common performance 
issues and how to resolve them. It likely includes product-specific diagnostics (for example, how to collect performance metrics, which logs or counters to inspect, and how to interpret them) and concrete remediation steps tailored to this service, which qualifies as expert troubleshooting knowledge beyond generic debugging advice.",unchanged +https://learn.microsoft.com/en-us/azure/azure-managed-lustre/troubleshoot-performance,Troubleshoot cluster performance issues,Troubleshoot Azure Managed Lustre cluster performance issues - Azure Managed Lustre File System,Diagnose and fix Azure Managed Lustre performance issues,Learn how to troubleshoot common cluster performance issues in Azure Managed Lustre,"In this article, you learn how to troubleshoot common issues that you might encounter when deploying an Azure Managed Lustre file system.",2026-04-24T17:07:00.000Z,troubleshooting-general,troubleshooting,0.86,True,"The page is explicitly a troubleshooting guide for Azure Managed Lustre performance, organized around specific performance symptoms and their causes with product-specific remediation steps. It focuses on diagnosing cluster performance issues unique to Azure Managed Lustre rather than generic performance tuning, matching the troubleshooting criteria of symptom → cause → solution for this service.",updated https://learn.microsoft.com/en-us/azure/azure-managed-lustre/use-csi-driver-kubernetes,Use Azure Lustre CSI driver for Kubernetes,Use the Azure Managed Lustre CSI Driver with Azure Kubernetes Service - Azure Managed Lustre File System,Use Azure Managed Lustre with AKS via the CSI driver,Learn how to use an Azure Managed Lustre storage system with your Kubernetes containers in Azure Kubernetes Service (AKS).,"In this article, you learn how to plan, install, and use Azure Managed Lustre in Azure Kubernetes Service (AKS) with the Azure Lustre CSI Driver for Kubernetes. This driver is based on the Container Storage Interface (CSI) specification. 
You can use the Azure Lustre CSI Driver for Kubernetes to access Azure Managed Lustre storage as persistent storage volumes from Kubernetes containers deployed in AKS.",2025-11-11T08:00:00.000Z,overview,integrations,0.75,True,"CSI driver usage involves StorageClass, PV, and PVC specs with driver-specific parameters and options, which are detailed integration and configuration patterns.",unchanged https://learn.microsoft.com/en-us/azure/azure-managed-lustre/vnet-encryption,Enable VNet encryption,Enable and Validate Virtual Network Encryption - Azure Managed Lustre File System,Enable and validate virtual network encryption for Azure Managed Lustre,Learn how to enable and test virtual network encryption for the Azure Managed Lustre file system.,"Azure Managed Lustre supports virtual network encryption to encrypt data in transit between Managed Lustre and client virtual machines (VMs). This feature is valuable for customers in regulated industries, such as finance, healthcare, and government, where data confidentiality is paramount.",2025-11-07T23:02:00.000Z,how-to,security,0.75,True,"Covers enabling and testing VNet encryption for this service, including specific configuration steps and validation commands/logs.",unchanged diff --git a/products/azure-managed-lustre/report.md b/products/azure-managed-lustre/report.md index 81ac52fc..bce61971 100644 --- a/products/azure-managed-lustre/report.md +++ b/products/azure-managed-lustre/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-05' +generated_at: '2026-04-26' category_descriptions: configuration: 'Configuring Azure Managed Lustre: network/storage prerequisites, client install/upgrade, mounting (fstab), ARM/Bicep deployment, and monitoring/alerts 
- troubleshooting: Diagnosing and resolving Azure Managed Lustre deployment failures - and performance issues, including cluster provisioning errors, throughput/latency - problems, and tuning guidance. + troubleshooting: Diagnosing and resolving Azure Managed Lustre issues, including + cluster deployment failures, configuration problems, and performance bottlenecks + or slow I/O. skill_description: Expert knowledge for Azure Managed Lustre development including troubleshooting, best practices, architecture & design patterns, limits & quotas, - security, configuration, and integrations & coding patterns. Use when mounting AMLFS, - integrating with Blob auto-import/export, using AKS CSI, setting CMK/root squash, - or tuning performance, and other Azure Managed Lustre related development tasks. - Not for Azure HPC Cache (use azure-hpc-cache), Azure NetApp Files (use azure-netapp-files), - Azure Blob Storage (use azure-blob-storage), Azure Elastic SAN (use azure-elastic-san). -use_when: Use when mounting AMLFS, integrating with Blob auto-import/export, using - AKS CSI, setting CMK/root squash, or tuning performance, and other Azure Managed - Lustre related development tasks. + security, configuration, and integrations & coding patterns. Use when deploying + AMLFS with Blob auto-import/export, Terraform, AKS CSI, quotas, or HA/DR failover, + and other Azure Managed Lustre related development tasks. Not for Azure HPC Cache + (use azure-hpc-cache), Azure NetApp Files (use azure-netapp-files), Azure Blob Storage + (use azure-blob-storage), Azure Elastic SAN (use azure-elastic-san). +use_when: Use when deploying AMLFS with Blob auto-import/export, Terraform, AKS CSI, + quotas, or HA/DR failover, and other Azure Managed Lustre related development tasks. confusable_not_for: Not for Azure HPC Cache (use azure-hpc-cache), Azure NetApp Files (use azure-netapp-files), Azure Blob Storage (use azure-blob-storage), Azure Elastic SAN (use azure-elastic-san). 
@@ -47,8 +46,8 @@ confusable_not_for: Not for Azure HPC Cache (use azure-hpc-cache), Azure NetApp ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 0 -- **Unchanged**: 30 +- **Updated Pages**: 1 +- **Unchanged**: 29 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-managed-lustre/azure-managed-lustre.csv` @@ -67,11 +66,16 @@ confusable_not_for: Not for Azure HPC Cache (use azure-hpc-cache), Azure NetApp ## Changes +### Updated Pages + +- [Troubleshoot cluster performance issues](https://learn.microsoft.com/en-us/azure/azure-managed-lustre/troubleshoot-performance) + - Updated: 2026-02-27T18:05:00.000Z → 2026-04-24T17:07:00.000Z + ## Classified Pages | TOC Title | Type | Confidence | Reason | |-----------|------|------------|--------| -| [Troubleshoot cluster performance issues](https://learn.microsoft.com/en-us/azure/azure-managed-lustre/troubleshoot-performance) | troubleshooting | 0.86 | The page is explicitly a troubleshooting guide for Azure Managed Lustre performance, organized around common performance issues and how to resolve them. It likely includes product-specific diagnostics (for example, how to collect performance metrics, which logs or counters to inspect, and how to interpret them) and concrete remediation steps tailored to this service, which qualifies as expert troubleshooting knowledge beyond generic debugging advice. | +| [Troubleshoot cluster performance issues](https://learn.microsoft.com/en-us/azure/azure-managed-lustre/troubleshoot-performance) | troubleshooting | 0.86 | The page is explicitly a troubleshooting guide for Azure Managed Lustre performance, organized around specific performance symptoms and their causes with product-specific remediation steps. It focuses on diagnosing cluster performance issues unique to Azure Managed Lustre rather than generic performance tuning, matching the troubleshooting criteria of symptom → cause → solution for this service. 
| | [Troubleshoot cluster deployment failures](https://learn.microsoft.com/en-us/azure/azure-managed-lustre/troubleshoot-deployment) | troubleshooting | 0.85 | Explicit troubleshooting guide for deployment; likely organized by error codes/messages and causes with specific resolutions unique to this service. | | [Configure a network security group](https://learn.microsoft.com/en-us/azure/azure-managed-lustre/configure-network-security-group) | security | 0.80 | Describes specific NSG rules (ports, protocols, directions) required for Managed Lustre, which are detailed security configuration settings. | | [Monitoring reference for metrics and logs](https://learn.microsoft.com/en-us/azure/azure-managed-lustre/monitor-file-system-reference) | configuration | 0.80 | Monitoring data reference will list metric names, dimensions, and log schemas unique to this service, which are detailed configuration/telemetry references. | diff --git a/products/azure-managed-redis/azure-managed-redis.csv b/products/azure-managed-redis/azure-managed-redis.csv index 45900647..d4824a48 100644 --- a/products/azure-managed-redis/azure-managed-redis.csv +++ b/products/azure-managed-redis/azure-managed-redis.csv @@ -14,18 +14,18 @@ https://learn.microsoft.com/en-us/azure/redis/best-practices-server-load,Server https://learn.microsoft.com/en-us/azure/redis/configure,Configure in Azure portal,How to configure Azure Managed Redis - Azure Managed Redis,Configure Azure Managed Redis instance settings,Understand the default Redis configuration for Azure Managed Redis and learn how to configure your Azure Managed Redis instances.,This article describes the configurations available for your Azure Managed Redis instances.,2025-05-19T17:08:00.000Z,conceptual,configuration,0.8,True,"Central configuration article describing Redis and platform settings, including parameter names and allowed values.",unchanged https://learn.microsoft.com/en-us/azure/redis/development-faq,Development FAQs,Azure Managed Redis 
development FAQs - Azure Managed Redis,Development guidance for Azure Managed Redis applications,Learn the answers to common questions that help you develop for Azure Managed Redis.,This article provides answers to common questions about how to develop for Azure Managed Redis.,2026-02-01T12:13:00Z,faq,best-practices,0.6,True,"Development FAQ likely includes product-specific coding recommendations, edge cases, and gotchas beyond generic Redis usage.",unchanged https://learn.microsoft.com/en-us/azure/redis/dotnet,.NET app,Quickstart: Use Azure Managed Redis in .NET Core - Azure Managed Redis,Connect .NET apps to Azure Managed Redis with Entra ID,"In this quickstart, learn how to use Azure Managed Redis in a .NET console app","This .NET 8 console application demonstrates how to connect to Azure Managed Redis by using Microsoft Entra ID authentication. The core value proposition is passwordless authentication with automatic token refresh, providing a secure and modern approach to Redis connectivity.",2026-02-01T12:13:00.000Z,quickstart,integrations,0.7,True,Quickstart includes concrete .NET connection code and Entra ID auth parameters specific to Azure Managed Redis.,unchanged -https://learn.microsoft.com/en-us/azure/redis/enable-redis-keyspace-notifications,Enable Redis keyspace notifications,Enable Redis Keyspace Notifications in Azure Managed Redis (preview) - Azure Managed Redis,Use Redis keyspace notifications in Azure Managed Redis,Enable Redis keyspace notifications (preview) in Azure Managed Redis so clients can monitor changes to cache keys and values in real time.,"Redis keyspace notifications allow clients to subscribe to Pub/Sub channels to receive events that affect the Redis data set in some way. Use keyspace notifications (preview) to monitor changes to keys and values in your Azure Managed Redis cache. 
This article shows how to deploy a cache with keyspace notifications enabled, connect clients using Redis commands, subscribe to notification channels, and test the resulting events.",2026-04-15T22:11:00.000Z,how-to,integrations,0.65,True,"Describes enabling and using Redis keyspace notifications in Azure Managed Redis with Redis commands and subscription patterns. This is a product-specific integration/coding pattern (Pub/Sub channels, notification channels, client commands) rather than a generic concept, and likely includes concrete command usage and configuration flags.",new +https://learn.microsoft.com/en-us/azure/redis/enable-redis-keyspace-notifications,Enable Redis keyspace notifications,Enable Redis Keyspace Notifications in Azure Managed Redis (preview) - Azure Managed Redis,Use Redis keyspace notifications in Azure Managed Redis,Enable Redis keyspace notifications (preview) in Azure Managed Redis so clients can monitor changes to cache keys and values in real time.,"Redis keyspace notifications allow clients to subscribe to Pub/Sub channels to receive events that affect the Redis data set in some way. Use keyspace notifications (preview) to monitor changes to keys and values in your Azure Managed Redis cache. This article shows how to deploy a cache with keyspace notifications enabled, connect clients using Redis commands, subscribe to notification channels, and test the resulting events.",2026-04-15T22:11:00.000Z,how-to,integrations,0.65,True,"Describes enabling and using Redis keyspace notifications in Azure Managed Redis with Redis commands and subscription patterns. 
This is a product-specific integration/coding pattern (Pub/Sub channels, notification channels, client commands) rather than a generic concept, and likely includes concrete command usage and configuration flags.",unchanged https://learn.microsoft.com/en-us/azure/redis/entra-for-authentication,Microsoft Entra ID for authentication,Use Microsoft Entra for cache authentication with Azure Managed Redis - Azure Managed Redis,Use Microsoft Entra authentication for Azure Managed Redis,Learn how to use Microsoft Entra with Azure Managed Redis.,"Azure Managed Redis offers a password-free authentication mechanism by integrating with Microsoft Entra ID. Azure Managed Redis caches use Microsoft Entra ID by default. When you create a new cache, managed identity is enabled. Although access key authentication is still available, it comes with a set of challenges around security and password management. For contrast, in this article, you learn how to use a Microsoft Entra token for cache authentication. In this article, you learn how to use you",2026-02-14T06:11:00.000Z,how-to,security,0.8,True,"Details Entra ID integration, token usage, and managed identity configuration for Redis authentication.",unchanged https://learn.microsoft.com/en-us/azure/redis/failover,Failover and patching,Failover and patching - Azure Managed Redis,Handle failover and patching for Azure Managed Redis,"Learn about failover, patching, and the update process for Azure Managed Redis.","To build resilient and successful client applications, it's critical to understand failover in the Azure Managed Redis service. A failover can be a part of planned management operations, or it might be caused by unplanned hardware or network failures. A common use of cache failover comes when the management service patches the Azure Managed Redis binaries. 
In this article, you find this information:",2026-03-03T23:37:00.000Z,conceptual,best-practices,0.68,True,"The page describes product-specific behavior of Azure Managed Redis during failover and patching, including how planned vs. unplanned failovers occur and how clients should behave to remain resilient. This is actionable, service-specific guidance on handling failover scenarios rather than generic concepts, fitting best under best-practices. It does not primarily focus on limits, configuration tables, or deployment matrices.",unchanged https://learn.microsoft.com/en-us/azure/redis/faq,Azure Redis FAQ,Azure Redis Cache FAQ - Azure Managed Redis,,"Learn the answers to common questions, patterns, and best practices for Azure Managed Redis and Azure Cache for Redis","This article covers some basic questions about Azure Managed Redis and Azure Cache for Redis, and how the latest open-source Redis licensing changes affect Azure Redis.",2025-07-09T17:43:00Z,faq,,0.4,False,"General FAQ with basic questions, patterns, and best practices; summary doesn’t indicate detailed numeric limits, config tables, or error mappings beyond conceptual guidance.",unchanged https://learn.microsoft.com/en-us/azure/redis/go-get-started,Go app,Quickstart: Connect to Azure Managed Redis with Go - Azure Managed Redis,Integrate Go applications with Azure Managed Redis,"In this quickstart, you learn how to create a Go app that uses Azure Managed Redis.","In this article, you learn how to use an Azure Redis cache with the Go language and connect using Microsoft Entra ID.",2025-07-18T08:00:00.000Z,quickstart,integrations,0.7,True,"Go quickstart shows concrete connection code and auth configuration for Azure Managed Redis, which is product-specific.",unchanged -https://learn.microsoft.com/en-us/azure/redis/grafana-dashboards,Dashboards with Grafana,Dashboards with Grafana - Azure Managed Redis,Configure built-in Grafana dashboards for Azure Managed Redis,"Learn how to use the built-in Grafana 
experience in Azure Managed Redis to monitor cache performance, memory, operations, and connectivity with prebuilt dashboards and ad-hoc queries.","Dashboards with Grafana in Azure Managed Redis bring Azure Monitor's built-in Grafana experience directly into the Azure portal. You can create and customize Grafana dashboards by using your Azure Managed Redis metrics and logs without deploying a separate Azure Managed Grafana instance. Built-in Grafana controls support a wide range of visualization panels and client-side transformations. Note This feature uses the Grafana experience built into Azure Monitor. It is separate from Azure Managed Grafana, ",2026-04-15T06:10:00.000Z,how-to,configuration,0.6,True,"Explains using the built-in Grafana experience in Azure Monitor specifically for Azure Managed Redis metrics and logs. Likely includes product-specific dashboard configuration options, metric/log selection, and panel settings, which fits configuration of monitoring/visualization rather than generic tutorial content.",new +https://learn.microsoft.com/en-us/azure/redis/grafana-dashboards,Dashboards with Grafana,Dashboards with Grafana - Azure Managed Redis,Configure built-in Grafana dashboards for Azure Managed Redis,"Learn how to use the built-in Grafana experience in Azure Managed Redis to monitor cache performance, memory, operations, and connectivity with prebuilt dashboards and ad-hoc queries.","Dashboards with Grafana in Azure Managed Redis bring Azure Monitor's built-in Grafana experience directly into the Azure portal. You can create and customize Grafana dashboards by using your Azure Managed Redis metrics and logs without deploying a separate Azure Managed Grafana instance. Built-in Grafana controls support a wide range of visualization panels and client-side transformations. Note This feature uses the Grafana experience built into Azure Monitor. 
It is separate from Azure Managed Grafana, ",2026-04-15T06:10:00.000Z,how-to,configuration,0.6,True,"Explains using the built-in Grafana experience in Azure Monitor specifically for Azure Managed Redis metrics and logs. Likely includes product-specific dashboard configuration options, metric/log selection, and panel settings, which fits configuration of monitoring/visualization rather than generic tutorial content.",unchanged https://learn.microsoft.com/en-us/azure/redis/how-to-active-geo-replication,Set up active geo-replication,Configure active geo-replication for Azure Managed Redis instances - Azure Managed Redis,Set up active geo-replication for Azure Managed Redis,Learn how to replicate your Azure Managed Redis instances across Azure regions.,"In this article, you learn how to configure an active geo-replicated cache using the Azure portal. Active geo-replication groups up to five instances of Azure Managed Redis into a single cache that spans across Azure regions. All instances act as the local, primary caches. An application decides which instance or instances to use for read and write requests. Note Using active geo-replication produces data transfer between Azure regions. These bandwidth charges are currently absorbed by Azure Man",2025-05-18T08:00:00.000Z,conceptual,configuration,0.7,True,Describes configuring active geo-replication groups (up to five instances) and portal settings; this is product-specific configuration with concrete limits and options.,unchanged https://learn.microsoft.com/en-us/azure/redis/how-to-encryption,Configure disk encryption,Configure disk encryption in Azure Managed Redis - Azure Managed Redis,Configure disk encryption for Azure Managed Redis data,Learn about disk encryption when using Azure Managed Redis.,"Data in a Redis server is stored in memory by default. This data isn't encrypted. You can implement your own encryption on the data before writing it to the cache. 
In some cases, data can reside on-disk, either due to the operations of the operating system, or because of deliberate actions to persist data using export or data persistence. Azure Managed Redis offers platform-managed keys (PMKs), also known as Microsoft-managed keys (MMKs), by default to encrypt data on-disk in all tiers. Azure Manage",2025-05-18T08:00:00.000Z,how-to,security,0.8,True,Explains PMK/CMK usage and disk encryption behavior specific to this service.,unchanged https://learn.microsoft.com/en-us/azure/redis/how-to-import-export-data,Import/Export data,Import and Export data in Azure Managed Redis - Azure Managed Redis,Import and export Azure Managed Redis data via Blob storage,Learn how to import and export data to and from blob storage with your Azure Managed Redis instances,Use the import and export functionality in Azure Managed Redis as a data management operation. You import data into your cache instance or export data from a cache instance using a Redis Database (RDB) snapshot. The snapshots are imported or exported using a blob in an Azure Storage Account. You can use Import/Export to migrate between different Azure Managed Redis instances or populate the cache with data before use. 
You can also export data from an older Azure Cache for Redis instance to migra,2025-05-20T05:24:00.000Z,how-to,configuration,0.7,True,"Describes import/export operations, including storage account, container, and snapshot configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/redis/how-to-manage-redis-cache-powershell,Create and manage with Azure PowerShell,Manage Azure Managed Redis with Azure PowerShell - Azure Managed Redis,Administer Azure Managed Redis using PowerShell,Learn how to create and perform administrative tasks for Azure Managed Redis using Azure PowerShell.,"This article shows you how to create, manage, and delete your Azure Redis instances by using Azure PowerShell.",2026-01-28T08:00:00.000Z,how-to,configuration,0.65,True,"PowerShell management article typically documents specific cmdlets and parameters for creating and managing Redis instances, which are product-specific configuration interfaces.",unchanged https://learn.microsoft.com/en-us/azure/redis/how-to-persistence,Persist your cache with Redis data persistence,Configure data persistence - Azure Managed Redis,Configure persistence options for Azure Managed Redis,Learn how to configure and manage data persistence for your Azure Managed Redis instances,"Redis persistence allows you to persist data stored in a cache instance. If there's a hardware failure, the cache instance is rehydrated with data from the persistence file when it comes back online. The ability to persist data is an important way to boost the durability of a cache instance because all cache data is stored in memory. Data loss is possible if a failure occurs when cache nodes are down. 
Persistence should be a key part of your high availability and disaster recovery strategy with Az",2025-10-22T08:00:00.000Z,conceptual,configuration,0.7,True,"How-to for configuring Redis persistence; such articles typically include product-specific settings (AOF/RDB modes, frequency, parameters) and concrete configuration values unique to Azure Managed Redis.",unchanged -https://learn.microsoft.com/en-us/azure/redis/how-to-redis-access-data,Manage data with client tools,Use Client Tools to Access Data in Azure Managed Redis - Azure Managed Redis,Use Redis Insight and redis-cli with Azure Managed Redis,Learn how to use *Redis Insight* and *redis-cli* as client tools to access data and for troubleshooting and debugging Azure Managed Redis.,You can use the following tools to access and manage data in Azure Managed Redis as a client. Use these tools to directly interact with your Azure Managed Redis instance and for debugging and troubleshooting.,2026-04-15T22:11:00.000Z,concept-article,troubleshooting,0.65,True,"Page focuses on using Redis Insight and redis-cli specifically with Azure Managed Redis for accessing data and for debugging/troubleshooting. This is product-specific, tool-based guidance that likely includes concrete commands, connection details, and diagnostic usage patterns that go beyond generic Redis knowledge, fitting the troubleshooting sub-skill best among the available categories.",updated +https://learn.microsoft.com/en-us/azure/redis/how-to-redis-access-data,Manage data with client tools,Use Client Tools to Access Data in Azure Managed Redis - Azure Managed Redis,Use Redis Insight and redis-cli with Azure Managed Redis,Learn how to use *Redis Insight* and *redis-cli* as client tools to access data and for troubleshooting and debugging Azure Managed Redis.,You can use the following tools to access and manage data in Azure Managed Redis as a client. 
Use these tools to directly interact with your Azure Managed Redis instance and for debugging and troubleshooting.,2026-04-15T22:11:00.000Z,concept-article,troubleshooting,0.65,True,"Page focuses on using Redis Insight and redis-cli specifically with Azure Managed Redis for accessing data and for debugging/troubleshooting. This is product-specific, tool-based guidance that likely includes concrete commands, connection details, and diagnostic usage patterns that go beyond generic Redis knowledge, fitting the troubleshooting sub-skill best among the available categories.",unchanged https://learn.microsoft.com/en-us/azure/redis/how-to-scale,Change the size and tier of a cache,Scale an Azure Managed Redis instance - Azure Managed Redis,Scale Azure Managed Redis instances across SKUs,"Learn how to scale your Azure Managed Redis instances using the Azure portal, and tools such as Azure PowerShell, and Azure CLI","Azure Managed Redis has different SKU and tier offerings that provide flexibility in the choice of cache size and performance. You can scale to a larger memory size or change to a tier with more compute performance. You can also scale down to a smaller or more appropriate tier. This article shows you how to scale your cache using the Azure portal, plus tools such as Azure PowerShell and Azure CLI. Note Because each tier of Azure Managed Redis has almost the same features, scaling is typically us",2025-11-10T08:00:00.000Z,how-to,deployment,0.65,True,Scaling article includes product-specific scaling operations and constraints across SKUs and tiers.,unchanged https://learn.microsoft.com/en-us/azure/redis/how-to-upgrade,Upgrade to a new version,Upgrade Your Redis Version in Azure Managed Redis - Azure Managed Redis,Plan and execute Azure Managed Redis version upgrades,Learn how to upgrade your Redis version in Azure Managed Redis.,"New versions of Redis server software are frequently released with new features, more commands, and stability improvements. 
By maintaining your Azure Managed Redis instance with the latest version of Redis, you can ensure the best possible experience. This article details how to upgrade your Redis instance in Azure Managed Redis to the latest available version. Important Following standard Redis versioning, this article only covers upgrades to the major version of Redis, not the minor or patch versions",2026-04-01T06:12:00.000Z,how-to,deployment,0.7,True,"An upgrade article for Redis major versions is part of deployment/operations lifecycle. These pages usually include product-specific upgrade constraints, supported paths, timing/availability behavior, and possibly tier-specific requirements—details that are not generic and are critical for production deployment planning.",unchanged https://learn.microsoft.com/en-us/azure/redis/management-faq,Management FAQs,Azure Managed Redis management FAQs - Azure Managed Redis,,Learn the answers to common questions that help you manage Azure Managed Redis,This article provides answers to common questions about how to manage Azure Managed Redis.,2025-05-20T11:13:00Z,faq,,0.4,False,"General management FAQ; likely mixed conceptual and high-level operational Q&A without clear evidence of numeric limits, config tables, or detailed error mappings.",unchanged @@ -70,4 +70,4 @@ https://learn.microsoft.com/en-us/azure/redis/troubleshoot-server,Troubleshoot R https://learn.microsoft.com/en-us/azure/redis/troubleshoot-timeouts,Troubleshoot latency and timeouts,Troubleshoot Azure Managed Redis latency and timeouts - Azure Managed Redis,Diagnose latency and timeout problems in Azure Managed Redis,"Learn how to resolve common latency and timeout issues with Azure Managed Redis, such as Redis server patching and timeout exceptions.",A client operation that doesn't receive a timely response can result in a high latency or timeout exception. An operation could time out at various stages. Where the timeout comes from helps to determine the cause and the mitigation. 
This section discusses troubleshooting for latency and timeout issues that occur when connecting to Azure Managed Redis. Note Several of the troubleshooting steps in this guide include instructions to run Redis commands and monitor various performance metrics. For m,2025-05-18T08:00:00.000Z,troubleshooting-general,troubleshooting,0.85,True,"Focused on latency and timeout exceptions; such docs map symptoms to causes (patching, client timeouts) and mitigations, often with specific timeout settings and commands.",unchanged https://learn.microsoft.com/en-us/azure/redis/tutorial-active-replication,Use active geo-replication with an AKS-hosted application,Tutorial: Get started using Azure Cache for Redis Enterprise active replication with an AKS-hosted application - Azure Managed Redis,,"In this tutorial, you learn how to connect your AKS hosted application to a cache that uses active geo-replication.","In this tutorial, you will host an inventory application on Azure Kubernetes Service (AKS) and find out how you can use active geo-replication to replicate data in your Azure Cache for Redis Enterprise or Azure Managed Redis instances across Azure regions.",2025-03-31T22:36:00.000Z,tutorial,,0.45,False,Tutorial for using active geo-replication with AKS; primarily a guided example rather than a detailed configuration or troubleshooting reference.,unchanged https://learn.microsoft.com/en-us/azure/redis/tutorial-aks-get-started,Connect an AKS application to a cache,Tutorial: Get started connecting an AKS application to a cache - Azure Managed Redis,,"In this tutorial, you learn how to connect your AKS-hosted application to an Azure Cache for Redis instance.","In this tutorial, you use this sample to connect with an Azure Cache for Redis or Azure Managed Redis instance.",2025-03-31T22:36:00.000Z,tutorial,,0.45,False,Tutorial for connecting an AKS app to Redis; likely step-by-step with sample code but not a comprehensive configuration reference or troubleshooting 
matrix.,unchanged -https://learn.microsoft.com/en-us/azure/redis/whats-new,What's new,What's new in Azure Managed Redis - Azure Managed Redis,,Recent updates for Azure Managed Redis,Find out what's new in Azure Managed Redis.,2026-04-15T22:11:00.000Z,whats-new,,0.2,False,"A 'what's new' changelog-style page; likely high-level feature announcements and dates without detailed limits, configs, or troubleshooting matrices.",updated +https://learn.microsoft.com/en-us/azure/redis/whats-new,What's new,What's new in Azure Managed Redis - Azure Managed Redis,,Recent updates for Azure Managed Redis,Find out what's new in Azure Managed Redis.,2026-04-15T22:11:00.000Z,whats-new,,0.2,False,"A 'what's new' changelog-style page; likely high-level feature announcements and dates without detailed limits, configs, or troubleshooting matrices.",unchanged diff --git a/products/azure-managed-redis/report.md b/products/azure-managed-redis/report.md index 3efb602f..050ad67b 100644 --- a/products/azure-managed-redis/report.md +++ b/products/azure-managed-redis/report.md @@ -42,9 +42,9 @@ confusable_not_for: Not for Azure Cache for Redis (use azure-cache-redis). - **Unclassified**: 12 ### Incremental Update -- **New Pages**: 2 -- **Updated Pages**: 2 -- **Unchanged**: 64 +- **New Pages**: 0 +- **Updated Pages**: 0 +- **Unchanged**: 68 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-managed-redis/azure-managed-redis.csv` @@ -63,18 +63,6 @@ confusable_not_for: Not for Azure Cache for Redis (use azure-cache-redis). 
## Changes -### New Pages - -- [Enable Redis keyspace notifications](https://learn.microsoft.com/en-us/azure/redis/enable-redis-keyspace-notifications) -- [Dashboards with Grafana](https://learn.microsoft.com/en-us/azure/redis/grafana-dashboards) - -### Updated Pages - -- [What's new](https://learn.microsoft.com/en-us/azure/redis/whats-new) - - Updated: 2026-04-01T22:41:00.000Z → 2026-04-15T22:11:00.000Z -- [Manage data with client tools](https://learn.microsoft.com/en-us/azure/redis/how-to-redis-access-data) - - Updated: 2026-03-17T11:12:00.000Z → 2026-04-15T22:11:00.000Z - ## Classified Pages | TOC Title | Type | Confidence | Reason | diff --git a/products/azure-microsoft-discovery/azure-microsoft-discovery.csv b/products/azure-microsoft-discovery/azure-microsoft-discovery.csv new file mode 100644 index 00000000..8a0d5a08 --- /dev/null +++ b/products/azure-microsoft-discovery/azure-microsoft-discovery.csv @@ -0,0 +1,67 @@ +url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-advanced-investigation-patterns,Advanced investigation patterns,Advanced investigation patterns in Microsoft Discovery,Use advanced investigation patterns with Discovery Engine,"Learn advanced patterns for structuring investigations with the Discovery Engine, including fully deterministic investigations, guided exploration, and fully autonomous research.","The trust and basic investigation patterns article introduced the spectrum from structured tasks to broad objectives. This article goes deeper into three advanced patterns that represent distinct approaches to working with the Discovery Engine. Cognition is goal-seeking: it continuously plans, executes, and adapts to drive an investigation toward its research objective. Each pattern represents a different way to balance your control against cognition's autonomy. 
The right pattern depends on how wel",2026-04-20T15:52:00.000Z,concept-article,architecture-patterns,0.75,True,"Defines advanced patterns (deterministic, guided, autonomous) and when to use each, representing product-specific architecture/pattern guidance.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-azure-container-registry,Azure Container Registry,Azure Container Registry in Microsoft Discovery,Configure Azure Container Registry for Microsoft Discovery tools,Learn how Azure Container Registry integrates with Microsoft Discovery to store and manage containerized tool images.,"Azure Container Registry (ACR) is a private container registry service that serves as the image repository for containerized components in Microsoft Discovery. This article explains the role ACR plays in the platform, the SKU and networking options available, and how to create and configure a registry using the Azure portal or Azure CLI. Note You'll require an ACR if you're planning to publish a tool for an agentic workflow.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-bookshelf-knowledge-bases,Bookshelf & Knowledge Bases,Microsoft Discovery Bookshelf & Knowledge Bases,,Conceptual overview of Microsoft Discovery Bookshelf service and Knowledge Bases.,"Microsoft Discovery includes the Bookshelf, a service that enables customers to convert their data into curated graphs known as Knowledge Bases (KBs). The key components of the Bookshelf service are the Bookshelf resource and Knowledge Bases within each Bookshelf. A Knowledge Base contains a vector database and knowledge graph of your indexed artifacts. 
KBs can be used by Discovery agents as grounding skills and queried by Discovery agents for various use cases, including answering questions, su",2026-04-22T06:17:00.000Z,concept-article,,0.25,False,"Conceptual overview of Bookshelf and Knowledge Bases; no indication of numeric limits, config tables, or troubleshooting mappings.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-cognition-overview,Cognition overview,Cognition overview in Microsoft Discovery,,"Learn how cognition works in Microsoft Discovery. Understand the reasoning loop, task management, agent selection, and how to interact with the autonomous research process.","Cognition is the goal-seeking reasoning process that powers investigations in Microsoft Discovery. Rather than executing a fixed sequence of steps, it continuously assesses the current state of your investigation, decides what to do next, and adapts as results come in. When you enable Discovery Mode, cognition starts running in the background. It reads your tasks, selects agents, executes work, validates results, and creates new tasks when it identifies gaps. Cognition continues the cycle until ",2026-04-20T15:52:00.000Z,concept-article,,0.3,False,"Cognition overview describes reasoning loops and task management conceptually, not specific numeric thresholds or config parameters.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-data-encryption-at-rest,Data encryption at rest,Data encryption at rest in Microsoft Discovery,Manage data encryption at rest in Microsoft Discovery,"Learn how Microsoft Discovery encrypts data at rest, when Microsoft-managed keys are used by default, and when customer-managed keys are available.","Microsoft Discovery encrypts customer and system data at rest by using Azure platform encryption capabilities. Encryption at rest helps protect stored data from unauthorized access and is enabled automatically for Microsoft Discovery resources. 
This article explains what data is encrypted, the available key management models, and when you might use customer-managed keys instead of the default Microsoft-managed keys.",2026-04-20T15:52:00.000Z,concept-article,security,0.7,True,"Covers what data is encrypted, key management models, and when customer-managed keys are available, which are concrete security configuration patterns.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-discovery-agent,Discovery Agent concepts,Microsoft Discovery Agent concepts,,"Learn about Discovery Agents, AI assistants that execute scientific research tasks with tool-augmented reasoning and multi-agent orchestration.","Microsoft Discovery Agents are AI assistants that execute scientific research tasks on your behalf within the Microsoft Discovery platform. Discovery agents build on Microsoft Foundry Agent Service. They add scientific research features to conversational AI. These features include reasoning loops, agent teams, research tools, and knowledge bases. Agents serve as the fundamental building blocks for automating complex scientific workflows. You define their behavior through natural language instruct",2026-04-20T15:52:00.000Z,concept-article,,0.35,False,"Conceptual description of Discovery Agents and capabilities; no clear indication of config tables, limits, or troubleshooting mappings.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-discovery-agent-types,Discovery Agent types,Agent types in Microsoft Discovery,Choose between prompt and workflow agents in Discovery,"Learn about prompt agents and workflow agents in Microsoft Discovery, including when to use each type and how they're composed.","Microsoft Discovery supports two agent types: prompt agents and workflow agents. Both are managed through the same Discovery APIs, but they serve different purposes in a scientific workflow. 
Use this article to understand how the two agent types differ, when to choose each one, and which components define their behavior.",2026-04-20T15:52:00.000Z,concept-article,decision-making,0.7,True,"Explains when to use each agent type and how they differ, providing concrete guidance for selecting the right agent type for scenarios.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-discovery-billing,Billing overview,Microsoft Discovery billing overview,Understand Microsoft Discovery billing model and metering,"Learn about the Microsoft Discovery pricing and billing model, including what counts as a User Message, which operations are billable, and how charges appear in Azure.",Microsoft Discovery uses a billing model with two components: This model lets you pay only for what you use and track spending with precision through Azure.,2026-04-20T15:52:00.000Z,concept-article,decision-making,0.6,True,"Explains what counts as a User Message, which operations are billable, and how charges appear—service-specific cost model that informs usage and capacity decisions.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-discovery-engine,Discovery Engine overview,Discovery Engine overview in Microsoft Discovery,,"Learn about the Discovery Engine, the cognitive backbone of Microsoft Discovery that autonomously plans, executes, and manages complex scientific research through tasks and continuous reasoning.","The Discovery Engine is the cognitive backbone of Microsoft Discovery. It operates as an autonomous research partner that plans work, delegates to specialized agents, monitors progress, and adapts when results come back differently than expected. Instead of responding to individual prompts in a chat window, the engine runs continuously in the background while you focus on the scientific decisions that matter most. Traditional AI assistants work in a question-and-answer cycle. 
You ask, they respo",2026-04-20T15:52:00.000Z,concept-article,,0.3,False,Overview of Discovery Engine behavior; focuses on conceptual cognition and orchestration rather than concrete configuration or limits.,new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-files-storage-assets,Files and storage assets,Files and storage assets in Microsoft Discovery,Understand file handling and limitations in Discovery investigations,"Learn how files work in Microsoft Discovery investigations, including how agents create and read files, supported file types, how files move between tasks, and current limitations.","When agents work on tasks in a Microsoft Discovery investigation, they often produce files such as research reports, datasets, configuration files, or computational results. These files are stored as storage assets in Azure Blob Storage and tracked as part of the task results. Understanding how files flow through an investigation helps you design tasks that produce useful outputs and write validation requirements that verify file content.",2026-04-20T15:52:00.000Z,concept-article,limits-quotas,0.6,True,"Covers supported file types, how files move between tasks, and current limitations, which likely include explicit constraints unique to Discovery.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-managed-identities,Managed identities,Managed identities in Microsoft Discovery,Use managed identities with Microsoft Discovery resources,"Understand how Microsoft Discovery uses user-assigned managed identities (UAMI) for authentication across workspaces, supercomputers, and bookshelves.","Microsoft Discovery uses user-assigned managed identities (UAMI) to authenticate against Azure resources on your behalf. Rather than managing secrets or connection strings, you create a managed identity, grant it the necessary Azure roles, and provide its resource ID when you create Discovery resources. 
The Discovery platform then uses that identity to access storage accounts, container registries, AI services, and managed resource group resources.",2026-04-20T15:52:00.000Z,concept-article,security,0.75,True,"Details how Discovery uses user-assigned managed identities across resources, including role grants and identity usage for secure access.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-network-security,Network security,Network security for Microsoft Discovery,Configure network security for Microsoft Discovery workspaces,Understand how Microsoft Discovery uses Network Security Perimeters and private endpoints to protect managed resources and data-plane traffic.,Microsoft Discovery provides two layers of network security to protect your workspace resources and data-plane traffic: Network hardening is enabled by default for all workspaces and bookshelves managed with the 2026-02-01-preview API version and later. Private endpoints for data-plane access are optional and can be configured separately.,2026-04-20T15:52:00.000Z,concept-article,security,0.7,True,"Describes network security perimeters and private endpoints, including API-version-specific behavior and default hardening, which are product-specific security details.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-observability,Observability overview,Observability in Microsoft Discovery,,"Learn about the observability capabilities available in Microsoft Discovery, including application logs stored in Managed Resource Group Log Analytics workspaces, Azure activity logs for control plane","Microsoft Discovery integrates with Azure Monitor to provide comprehensive observability across all platform resources. 
You can monitor and troubleshoot workspaces, supercomputers, and bookshelves by querying application logs in dedicated Log Analytics workspaces and reviewing activity logs for control plane operations.",2026-04-20T15:52:00.000Z,concept-article,,0.3,False,"High-level observability overview; conceptual description of logs and monitoring without detailed tables, schemas, or configuration parameters.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-projects-investigations,Projects and Investigations,Microsoft Discovery projects and investigations,Organize Microsoft Discovery projects and investigations effectively,"Learn about projects and investigations in Microsoft Discovery, how they organize scientific research, and how they relate to other platform resources.","Microsoft Discovery organizes scientific research through two key concepts: projects and investigations. Projects define the functional boundary for your research resources, while investigations are where you interact with agents and conduct research. This article explains how both concepts work, how they relate to other platform resources, and best practices for using them.",2026-04-20T15:52:00.000Z,concept-article,best-practices,0.65,True,"Includes best practices for using projects and investigations specific to Discovery’s resource model, beyond generic project organization advice.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-quota-reservation,Quota reservations,Quota reservations for Microsoft Discovery,Plan Azure quotas and reservations for Microsoft Discovery,"Learn about the Azure quotas and capacity reservations required for Microsoft Discovery deployments, including VM SKUs, storage, database, and AI model quotas.",Microsoft Discovery requires specific quotas across multiple Azure services to function effectively. These quotas must be secured before deploying Microsoft Discovery infrastructure components. 
Proper quota planning ensures optimal performance and prevents deployment failures during infrastructure setup. The primary quota categories include:,2026-04-20T15:52:00.000Z,concept-article,limits-quotas,0.85,True,"Explicitly about required quotas and capacity reservations (VM SKUs, storage, database, AI models) with specific numeric requirements for deployments.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-resource-naming,Resource Naming,Resource naming guidelines for Microsoft Discovery,Apply Microsoft Discovery resource naming rules,"Learn the naming rules and best practices for Microsoft Discovery resources, including character limits, allowed characters, and patterns for each resource type.","Microsoft Discovery includes multiple resource types, some with dependencies on resources in other resource providers. By understanding resource naming constraints, you can improve clarity, prevent conflicts, enable automation, and support governance and security. In this article, you learn the naming rules for all resource types within Microsoft Discovery. This guidance helps you make informed decisions and avoid deployment errors.",2026-04-20T15:52:00.000Z,concept-article,limits-quotas,0.8,True,"Covers naming rules including character limits and allowed characters per resource type, which are numeric constraints and patterns unique to this service.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-resource-provider-registration,Resource provider registration,Microsoft Discovery Resource Provider Registration,Register Microsoft Discovery resource provider in Azure,Learn how to register the Microsoft Discovery resource provider and its dependencies in your Azure subscription to enable Discovery services.,"Registering the Microsoft Discovery resource provider is a prerequisite for creating and using Microsoft Discovery resources in your Azure subscription. 
This article explains what resource provider registration means, what permissions you need, and how to complete registration using the Azure portal, Azure CLI, Azure PowerShell, or the REST API.",2026-04-20T15:52:00.000Z,concept-article,configuration,0.7,True,"Explains provider registration, required permissions, and how to register via portal/CLI/PowerShell/REST, which involves specific commands and parameters.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-responsible-ai,Responsible AI in Microsoft Discovery,Responsible AI in Microsoft Discovery,Apply responsible AI practices in Microsoft Discovery research,"Learn about responsible AI principles, safety components, limitations, and best practices for using Microsoft Discovery in scientific research.","Microsoft Discovery is an enterprise agentic AI platform for scientific research and development. It uses large language models (LLMs), multi-agent orchestration, and high-performance computing (HPC) to help researchers reason through complex problems, generate hypotheses, and analyze results. Like all AI systems, Discovery has limitations and potential risks. This article describes the responsible AI principles, safety components, known limitations, and best practices that help you use the plat",2026-04-20T15:52:00.000Z,concept-article,best-practices,0.7,True,"Includes Discovery-specific limitations, safety components, and best practices for using the platform responsibly in scientific contexts.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-role-assignments,Role assignments,Role assignments in Microsoft Discovery,Configure RBAC role assignments for Microsoft Discovery,"Learn about Azure role-based access control (RBAC) for Microsoft Discovery, including the three built-in roles, their permissions, and how to assign them.","Microsoft Discovery uses Azure role-based access control permissions to control who can access resources and what actions they can perform. 
This article explains the three built-in Microsoft Discovery roles, the other Azure roles commonly required alongside them, and how to assign roles using the Azure portal, Azure CLI, or Azure PowerShell.",2026-04-20T15:52:00.000Z,concept-article,security,0.85,True,"Describes three built-in Discovery roles and their permissions plus related Azure roles, which are product-specific RBAC details.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-storage-account,Azure Blob Storage Account,Azure Blob Storage in Microsoft Discovery,Configure Azure Blob Storage for Microsoft Discovery data,"Learn how Azure Blob Storage is used in Microsoft Discovery to store investigation input and output data, and how to configure networking, CORS, and identity access for the storage account.","Microsoft Discovery uses Azure Blob Storage as the backing store for investigation input and output data. This article explains the role a storage account plays in the platform, and what networking, CORS, and identity access configuration is required before you create Discovery resources.",2026-04-20T15:52:00.000Z,concept-article,configuration,0.75,True,"Explains required networking, CORS, and identity access configuration for the storage account used by Discovery, which are concrete settings.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-storage-containers-assets,Storage containers and storage assets,Storage containers and storage assets in Microsoft Discovery,,"Understand how storage containers and storage assets organize data in Microsoft Discovery workspaces, including the relationship to Azure Blob Storage and how agents use them.","Microsoft Discovery uses storage containers and storage assets to organize data for workspaces. A storage container creates a logical reference to an Azure Blob Storage account (or Azure NetApp Files volume), while storage assets point to specific blob paths within that account. 
Together, they provide the data layer that agents, tools, and investigations use to read input and write output.",2026-04-20T15:52:00.000Z,concept-article,,0.35,False,Explains concepts of storage containers and assets; likely structural description rather than detailed configuration matrices.,new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-studio,Microsoft Discovery Studio,What is Microsoft Discovery Studio?,,"Learn about Microsoft Discovery Studio, a web-based research environment built on Visual Studio Code for the web that enables scientists and engineers to work with AI agents, investigations, and scien","Microsoft Discovery Studio is the web-based, unified research environment for Microsoft Discovery. You use Discovery Studio to create and manage workspaces, projects, agents, investigations, knowledge bases, tools, and data—all from a single browser-based interface. You can customize the layout and tailor the environment to match your specific research workflows—all without leaving the browser. You can access Microsoft Discovery Studio at studio.discovery.microsoft.com using any supported modern br",2026-04-20T15:52:00.000Z,concept-article,,0.2,False,"Describes what Discovery Studio is and its UI; no specific settings, limits, or troubleshooting content.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-tasks-investigations,Tasks and investigations,Tasks and investigations in Microsoft Discovery,,"Learn how tasks work in Microsoft Discovery. Understand task structure, the status lifecycle, dependencies, and how cognition uses tasks to organize and execute research.","Tasks are how you define the work you want the Discovery Engine to carry out. Each task represents a discrete piece of work with a clear objective and success criteria. When cognition is enabled, it reads your tasks, understands their relationships, and orchestrates execution across agents and tools. Tasks serve two purposes. 
First, they organize your work into manageable pieces, whether you created those pieces yourself or cognition decomposed them from a broader objective. Second, they provide an ",2026-04-20T15:52:00.000Z,concept-article,,0.4,False,Explains task structure and lifecycle conceptually; summary doesn’t indicate detailed config tables or numeric constraints.,new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-tools-model-integration,Tools and model integration,Microsoft Discovery tools and model integration,Integrate tools and models into Microsoft Discovery workflows,"Learn about tools in Microsoft Discovery, including tool types, how tools are deployed, and how to integrate models into agent workflows using common integration patterns.","This guide covers how tools work in Microsoft Discovery and how to integrate your models into agent workflows. It explains tool types, integration patterns, and how to get started.",2026-04-20T15:52:00.000Z,conceptual,integrations,0.7,True,"Describes tool types, deployment, and model integration patterns, which are product-specific integration patterns for Discovery agents.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-trust-basic-investigation-patterns,Trust relationship and basic investigation patterns,Trust relationship and basic investigation patterns in Microsoft Discovery,Apply trust and basic investigation patterns in Discovery,"Learn how to calibrate the level of autonomy you give the Discovery Engine, how validation requirements shape cognition's behavior, and basic investigation patterns for getting started.","Working with the Discovery Engine is a collaboration. You set direction and define what success looks like. Cognition handles the execution. The quality of this collaboration depends on how clearly you express your expectations and how much structure you provide. 
This article covers how validation requirements shape cognition's behavior, how to calibrate the level of detail in your tasks, and the basic investigation patterns for getting started.",2026-04-20T15:52:00.000Z,concept-article,best-practices,0.8,True,"Provides concrete patterns and guidance on structuring tasks, validation requirements, and autonomy levels specific to Discovery Engine.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-v1-to-v2-transition-guide,v1 to v2 transition guidance,Microsoft Discovery v1 to v2 transition: Resource retention and recreation guidance,Plan Microsoft Discovery v1 to v2 resource transition,Understand which Microsoft Discovery resources can be retained and which must be recreated when transitioning from v1 to v2 APIs during the coexistence window.,"As Microsoft Discovery transitions from v1 to v2, there's a coexistence window where both versions are supported simultaneously. This article explains which resources you can keep, which must be recreated under v2, and what is deprecated, so you can plan your transition with confidence. The guiding principles for the transition are:",2026-04-20T15:52:00.000Z,concept-article,decision-making,0.7,True,"Transition guide explains which resources can be retained vs. must be recreated and what is deprecated, providing concrete migration decisions and guidance.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-virtual-networks,Virtual Networks and Subnets,Virtual Networks in Microsoft Discovery,,Conceptual Architecture Overview of Virtual Networks in Microsoft Discovery,"Microsoft Discovery uses Azure Virtual Networks (VNets) to provide secure, isolated networking for Discovery resources deployed in a customer’s Azure subscription. VNets form the foundational network boundary that enables secure communication between Discovery components while aligning with enterprise security and compliance expectations. 
This document provides a high-level, conceptual overview of how VNets are used in Microsoft Discovery. It focuses on intent and principles rather than configur",2026-04-20T15:52:00.000Z,concept-article,,0.2,False,"Conceptual VNet overview focuses on intent and principles, not specific network configuration parameters or limits.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-access-resource-logs,Access resource logs,Access resource logs for Microsoft Discovery resources,Access Discovery resource application logs in Managed Resource Groups,"Learn how to navigate to the Log Analytics workspace in the Managed Resource Group for Microsoft Discovery workspaces, supercomputers, and bookshelves to query application logs.","Microsoft Discovery application logs are stored in a dedicated Log Analytics workspace inside the Managed Resource Group (MRG) that Azure provisions for each Discovery resource. This article explains how to navigate to the MRG and open the Log Analytics workspace for a workspace, supercomputer, or bookshelf. Note This article covers access to application logs in MRG-based Log Analytics workspaces. For control plane audit logs, see View activity logs for Microsoft Discovery resources. For configurin",2026-04-20T15:52:00.000Z,how-to,configuration,0.7,True,"Explains how to locate the MRG, open the Log Analytics workspace, and access Discovery-specific log tables—concrete observability configuration for this product.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-agent-creation,Create agents,Create agents in Microsoft Discovery,Create prompt and workflow agents in Microsoft Discovery,Learn how to create prompt agents and workflow agents in Microsoft Discovery using Discovery Studio and the Foundry UI.,Microsoft Discovery supports two agent types: prompt agents and workflow agents. You can author agents through the Discovery Studio UI or copy them from other projects. 
This article walks you through each authoring method. Choose the approach that best fits your workflow and team needs.,2026-04-20T15:52:00.000Z,how-to,configuration,0.65,True,"Walks through agent authoring in Discovery Studio/Foundry UI, likely including specific agent configuration fields and options.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-build-investigations-cognition,Build investigations with cognition,Build investigations with cognition in Microsoft Discovery,,"Step-by-step guide to setting up investigations with the Discovery Engine, from creating your first task to monitoring an autonomous research investigation.","This guide walks you through setting up an investigation that uses the Discovery Engine to execute research autonomously. By the end, you have a working investigation with tasks, validation requirements, and cognition running in the background.",2026-04-20T15:52:00.000Z,how-to,,0.4,False,"Step-by-step investigation setup is procedural tutorial content without structured configuration tables, limits, or error mappings.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-collect-v1-configurations,Collect v1 resource configurations,Collect Microsoft Discovery v1 resource configurations,Export Microsoft Discovery v1 configurations for migration,"Learn how to export and document your v1 resource configurations for tools, agents, workflows, and knowledge bases before transitioning to Microsoft Discovery v2.","Before you transition from Microsoft Discovery v1 to v2, you must collect and export your existing resource configurations. There's no in-place migration path, so all resources must be recreated under v2. This article walks you through exporting the definitions you need. Review the v1 to v2 transition guide before you begin. 
It explains which resources require recreation and which are deprecated.",2026-04-20T15:52:00.000Z,how-to,decision-making,0.7,True,"Guides which v1 configurations to collect and export before v2 migration, supporting concrete migration planning decisions.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-configure-managed-identity,Configure managed identities,Configure managed identities for Microsoft Discovery,Configure user-assigned managed identities for Discovery,"Learn how to create and configure user-assigned managed identities (UAMI) for Microsoft Discovery workspaces, supercomputers, and bookshelves, including the required Azure role assignments.","Microsoft Discovery uses user-assigned managed identities (UAMI) to authenticate against Azure resources on your behalf. Every workspace and supercomputer requires a UAMI at creation time. This article explains how to create a UAMI, assign the required roles, and attach it to Discovery resources.",2026-04-20T15:52:00.000Z,how-to,security,0.8,True,"Step-by-step creation and configuration of UAMIs, including required role assignments and attachment to resources, is concrete security configuration.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-configure-network-security,Configure network security,Configure network security for Microsoft Discovery workspaces,Configure private endpoints and network security for Discovery,"Learn how to assign NSP roles, configure subnets, create private endpoints, and configure DNS for Microsoft Discovery workspaces.","This article walks you through the prerequisites for network hardening and how to create private endpoints for Microsoft Discovery workspaces and bookshelves. Network hardening is enabled by default: the Discovery control plane automatically deploys Network Security Perimeters, private endpoints, and virtual network injection for managed resources. 
For an overview of what these features are and why they matter, see Network security for Microsoft Discovery.",2026-04-20T15:52:00.000Z,how-to,security,0.8,True,"Details assigning NSP roles, configuring subnets, private endpoints, and DNS for Discovery workspaces/bookshelves—product-specific network security configuration with Azure constructs.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-create-tool-definition,Create a tool definition,Create a tool definition for Microsoft Discovery,Author tool definition YAML for Discovery tools,"Learn how to write a tool definition YAML file that describes how Microsoft Discovery deploys, configures, and invokes your containerized tool.","A tool definition is a YAML file that serves as the integration contract between your containerized tool and Microsoft Discovery. It tells the platform where your container image is, what compute resources the tool needs, and how to invoke each operation the tool exposes. This article explains each section of a tool definition and provides complete examples for the three supported tool types: action-based, code environment, and hybrid. Note This article assumes your container image is already pu",2026-04-20T15:52:00.000Z,how-to,configuration,0.85,True,"Explains each section of the Discovery tool definition YAML, including fields for image location, compute resources, and operations—detailed configuration schema unique to this product.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-create-tool-docker-file,Create a Dockerfile for a tool,Create a Dockerfile for a Microsoft Discovery tool,,"Learn how to containerize a tool for Microsoft Discovery by creating a Dockerfile, organizing the project structure, building and testing the container image locally.","Containerizing your tool with Docker ensures it runs consistently across different hardware, compute pools, and environments within Microsoft Discovery. 
This article walks through organizing your tool's project, writing a Dockerfile, and validating the container image locally. Note This article assumes you understand your tool's requirements and have written any action scripts. See Plan tool requirements and Write action scripts.",2026-04-20T15:52:00.000Z,how-to,,0.45,False,Dockerfile creation and local validation are mostly generic containerization steps; unlikely to contain Discovery-specific configuration tables or constraints.,new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-data-handling-with-tools-agents,Data handling with tools and agents,Manage data handling with tools and agents in Microsoft Discovery,Configure data handling and storage assets for Discovery agents,"Learn how to configure Microsoft Discovery agents to manage storage assets produced and consumed by tools, including resource URIs, built-in resource management tools, storage asset promotion, and inp","Microsoft Discovery uses a resource-based data model to manage all data exchanged between agents and tools. Every file, directory, or dataset produced during a conversation is a resource in your workspace that agents can inspect, describe, and share with you. In this article, you learn how to manage data in Discovery conversations, control what outputs you see, and set up tools to work with your data. 
Note This article applies to agents using API version 2026-02-01-preview.",2026-04-20T15:52:00.000Z,how-to,configuration,0.7,True,"Describes Discovery’s resource-based data model, resource URIs, built-in resource management tools, and how to configure input/output mounts for tools—product-specific configuration behavior not inferable from general LLM knowledge.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-debug-task-execution,Debug task execution,Debug task execution in Microsoft Discovery,Troubleshoot and debug Discovery task execution issues,"Learn how to identify and resolve common issues with task execution in the Discovery Engine - including stuck tasks, validation failures, agent errors, and cognition behavior.","When tasks in your investigation aren't progressing as expected, this guide helps you identify the cause and take action. Most issues fall into a few categories:",2026-04-20T15:52:00.000Z,how-to,troubleshooting,0.8,True,"Explicitly focuses on diagnosing stuck tasks, validation failures, agent errors, and cognition behavior; likely organized by symptom-to-cause-to-solution for Discovery-specific failures.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-deploy-network-hardened-stack,End-to-end network-hardened deployment,End-to-end network-hardened deployment for Microsoft Discovery,Deploy a fully network-hardened Microsoft Discovery stack,"Learn how to deploy a fully network-hardened Microsoft Discovery stack where all traffic stays within your virtual network, including workspace, bookshelf, supercomputer, and customer storage.","This guide walks you through deploying a complete Microsoft Discovery stack where all traffic stays within your virtual network. 
By the end, your workspace data-plane APIs, bookshelf search, supercomputer jobs, and customer blob storage are all accessible exclusively through private endpoints - with zero public internet exposure.",2026-04-20T15:52:00.000Z,how-to,deployment,0.7,True,"End-to-end guide for deploying all Discovery components (workspace, bookshelf, supercomputer, storage) with only private endpoints; describes a product-specific deployment pattern and constraints.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-enable-audit-logging,Enable audit logging,Enable audit logging for Microsoft Discovery resources,Enable and export audit logs for Microsoft Discovery,"Learn how to configure Azure Monitor diagnostic settings to export audit logs from Microsoft Discovery workspaces, bookshelves, and supercomputers to an Azure Storage account or Log Analytics workspac","Microsoft Discovery supports customer-configurable audit logging through Azure Monitor diagnostic settings. When enabled, audit and platform logs are exported from Discovery resources to an Azure Storage account or a Log Analytics workspace that you control, where they can be retained for compliance, security auditing, and long-term analysis. Audit logs are distinct from the application logs that Discovery automatically collects in Managed Resource Group (MRG) Log Analytics workspaces. 
Audit logs ",2026-04-20T15:52:00.000Z,how-to,security,0.7,True,Covers configuring Azure Monitor diagnostic settings to export Discovery audit logs to Storage or Log Analytics for compliance—security/audit configuration details.,new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-index-bookshelf-knowledgebase,Create Bookshelf and index a Knowledgebase,Create Bookshelf and index a Knowledgebase for Microsoft Discovery,Configure Bookshelf resources and index Discovery knowledgebases,"Learn how to create a Bookshelf resource, configure storage, create a knowledgebase, and index documents using the Microsoft Discovery Bookshelf service.","This article walks you through creating a Bookshelf resource, uploading documents, and indexing a knowledgebase. A Bookshelf knowledgebase enables customers to convert an unstructured private dataset in Azure Blob Storage into a rich, summary-based index, enabling graph-enabled retrieval-augmented generation (RAG) with rich citations. The knowledgebase can answer global queries that address the entire dataset, such as ""what are the main themes in the data?"", or ""what are the most important implicat",2026-04-20T15:52:00.000Z,how-to,configuration,0.65,True,"Walks through creating Bookshelf resources, wiring storage, and indexing knowledgebases with Discovery-specific resource types and flows, which are concrete configuration patterns.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-manage-storage-containers,Manage storage containers,Manage storage containers in Microsoft Discovery,Configure storage containers and assets in Discovery workspaces,"Learn how to create, list, and manage storage containers and storage assets in a Microsoft Discovery workspace for data ingestion and output.","A storage container is a Discovery resource that creates a reference to an Azure Blob Storage account or Azure NetApp Files volume. 
Storage containers enable data ingestion, tool output, and agent data access across your Discovery workspace. Each storage container can hold multiple storage assets that point to specific blob paths within the account.",2026-04-20T15:52:00.000Z,how-to,configuration,0.7,True,"Covers how Discovery maps storage containers and assets to Azure Blob/NetApp paths with product-specific resource semantics and management steps, which are configuration details unique to this service.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-manage-supercomputers,Manage Supercomputer & Nodepools,Manage Supercomputer and Nodepools in Microsoft Discovery,Create and manage Microsoft Discovery supercomputers and nodepools,How to create and manage supercomputer and Nodepools,Applies to: Microsoft Discovery (Public Preview) This article describes how to create a Supercomputer and NodePools using the Azure portal. It follows Learn.microsoft.com conventions and is safe for public preview documentation.,2026-04-20T15:52:00.000Z,how-to,configuration,0.65,True,Managing supercomputers and nodepools in the portal will involve specific settings and constraints unique to Discovery’s compute resources.,new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-manage-supercomputers-rest-api,Manage Supercomputer using REST APIs,How to manage Microsoft Discovery Supercomputers using REST API,Manage Microsoft Discovery supercomputers via REST API,"Learn how to create, retrieve, update, list, and delete Supercomputer resources in Azure Discovery using the REST API.","This article shows you how to manage Azure Discovery Supercomputer resources by using the REST API. You learn how to create, retrieve, update, list, and delete Supercomputer resources in your Azure subscription. A Supercomputer provides dedicated compute infrastructure for running workloads on the Azure Discovery platform. 
It manages an AKS-backed cluster with configurable networking, identity, and encryption settings.",2026-04-20T15:52:00.000Z,how-to,integrations,0.65,True,"Shows REST operations and parameters for supercomputer resources, which are product-specific API integration details.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-manage-workspaces,Manage Workspaces,Manage workspaces in Microsoft Discovery,Configure and manage Microsoft Discovery workspaces,"Learn how to create, update, browse, and delete Microsoft Discovery workspaces in the Azure portal, including networking configuration and supercomputer management.","Applies to: Microsoft Discovery (Public Preview) This article describes how to create, update, browse, and delete a Microsoft Discovery workspace using the Azure portal.",2026-04-20T15:52:00.000Z,how-to,configuration,0.7,True,"Includes workspace creation, updates, networking configuration, and supercomputer management, which are concrete configuration steps.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-plan-tool-requirements,Plan tool requirements,Plan tool requirements for Microsoft Discovery,Plan compute and functional requirements for Discovery tools,"Learn how to identify functionalities, compute needs, tool type, and dependencies before building and publishing a tool to Microsoft Discovery.","Before containerizing and publishing a tool to Microsoft Discovery, carefully plan the tool's functionality, compute needs, and dependencies. A thorough planning phase reduces issues during containerization and helps ensure your tool works reliably within Discovery investigations. 
Note This article applies to tools targeting Microsoft Discovery API version 2026-02-01-preview.",2026-04-20T15:52:00.000Z,how-to,decision-making,0.65,True,"Guides selection of tool type, compute needs, and dependencies before publishing; this is product-specific planning and trade-off guidance for tool design decisions.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-prompt-engineering,Write effective prompts for agents,Write effective prompts for agents in Microsoft Discovery,,"Learn how to write agent instructions and user prompts for Microsoft Discovery agents to get accurate, well-structured responses for scientific research tasks.","Prompt engineering is the practice of writing clear instructions that guide a large language model (LLM) to produce the output you need. In Microsoft Discovery, you write prompts in two places: agent instructions that define the agent's behavior, and user prompts that you type during investigations. This article covers techniques for both instruction authoring and user prompt construction. All examples focus on scientific research scenarios relevant to Discovery workflows.",2026-04-20T15:52:00.000Z,how-to,,0.2,False,"Prompt engineering guidance is conceptual and pattern-based without product-specific configuration tables, limits, or error mappings.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-publish-tool-to-acr,Publish a tool to Azure Container Registry,Publish a tool container image to Azure Container Registry for Microsoft Discovery,Publish Discovery tool container images to Azure Container Registry,"Learn how to build, validate, tag, and push a tool container image to Azure Container Registry for use with Microsoft Discovery.","After you have built and tested your tool's container image locally, the next step is to publish it to Azure Container Registry (ACR). The Microsoft Discovery platform pulls tool images from ACR when it deploys tools within investigations. 
Note This article assumes you have a working, locally tested container image. See Create a Dockerfile for a Discovery tool before proceeding.",2026-04-20T15:52:00.000Z,how-to,deployment,0.65,True,"Details how Discovery expects tool images to be built, tagged, and pushed to ACR for deployment; this is a product-specific deployment pipeline step.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-query-bookshelf-indexing-logs,Query Bookshelf indexing logs,Query bookshelf indexing logs for Microsoft Discovery,Query bookshelf indexing logs from Discovery supercomputers,"Learn how to query bookshelf indexing job logs for Microsoft Discovery in the Log Analytics workspace within the Supercomputer's Managed Resource Group, to track indexing execution and diagnose failur","Microsoft Discovery bookshelf indexing job logs capture the stdout/stderr output from indexing jobs and provide error diagnostics for failures that occur during knowledgebase indexing. Because indexing jobs run on the supercomputer's compute infrastructure, these logs are stored in the DiscoveryBookshelfLogs_CL table in the Log Analytics workspace within the supercomputer's Managed Resource Group (MRG). Note Bookshelf indexing logs are in the Supercomputer's MRG, not the bookshelf's MRG. 
The bookshe",2026-04-20T15:52:00.000Z,how-to,configuration,0.75,True,Describes DiscoveryBookshelfLogs_CL location (supercomputer MRG) and how to query indexing job stdout/stderr—product-specific log placement and query behavior.,new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-query-bookshelf-logs,Query Bookshelf query logs,Query bookshelf knowledgebase query logs for Microsoft Discovery,Query bookshelf knowledgebase query logs in Discovery,"Learn how to query knowledgebase query logs for a Microsoft Discovery bookshelf in the Log Analytics workspace within the Managed Resource Group, to investigate query execution and diagnose failures.",Microsoft Discovery bookshelf knowledgebase query logs capture query execution traces and error diagnostics for queries processed by the knowledgebase agent. These logs are automatically collected and stored in the DiscoveryLogs_CL table in the Log Analytics workspace within the bookshelf's Managed Resource Group (MRG).,2026-04-20T15:52:00.000Z,how-to,configuration,0.75,True,Covers DiscoveryLogs_CL usage for bookshelf query traces and diagnostics in the bookshelf’s MRG—detailed log schema and query configuration.,new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-query-cognitive-loop-logs,Query CogLoop logs,Query CogLoop logs for Microsoft Discovery,Query CogLoop orchestration logs for Discovery investigations,"Learn how to query CogLoop AI orchestration logs for a Microsoft Discovery workspace in the Log Analytics workspace within the Managed Resource Group, to investigate investigation progress, errors, an",Microsoft Discovery CogLoop is the AI orchestration engine that drives investigation progress. Cognition Engine logs capture: It continuously runs two subloops - Act and Cognition - to plan and execute research tasks on your behalf. CogLoop logs are automatically stored in the DiscoveryCogLoopLogs_CL table in the Log Analytics workspace inside the workspace's Managed Resource Group (MRG). 
Important DiscoveryCogLoopLogs_CL is an Auxiliary tier table. Each query must target this table only (cross-table joins,2026-04-20T15:52:00.000Z,how-to,configuration,0.8,True,"Documents the DiscoveryCogLoopLogs_CL table, its role in tracing Act/Cognition loops, and query constraints—product-specific log configuration and usage.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-query-supercomputer-logs,Query Supercomputer logs,Query supercomputer logs for Microsoft Discovery,Query supercomputer platform and tool logs in Discovery,"Learn how to query platform and application logs for a Microsoft Discovery supercomputer in the Log Analytics workspace within the Managed Resource Group, including Kubernetes events, pod inventory, s","Microsoft Discovery Supercomputer logs provide visibility into platform orchestration, Kubernetes cluster activity, system health, tool execution, and bookshelf indexing operations. These logs are automatically collected and stored in a dedicated Log Analytics workspace within the Supercomputer's Managed Resource Group (MRG). 
This article describes the available log tables, their schemas, and example queries for common diagnostic scenarios.",2026-04-20T15:52:00.000Z,how-to,configuration,0.8,True,"Explains supercomputer log tables, schemas, and example queries for Kubernetes events, system health, and tool execution—detailed observability configuration.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-query-workspace-logs,Query workspace logs,Query workspace logs for Microsoft Discovery,Query Discovery workspace logs with Kusto and correlation IDs,"Learn how to query application logs for a Microsoft Discovery workspace in the Log Analytics workspace within the Managed Resource Group, including how to use correlation IDs for end-to-end request tr","Microsoft Discovery workspace logs capture agent execution traces, tool invocations, workflow steps, and error diagnostics for all investigations run within a workspace. These logs are automatically stored in the DiscoveryLogs_CL table in the Log Analytics workspace inside the workspace's Managed Resource Group (MRG). Important DiscoveryLogs_CL is an Auxiliary tier table. Each query must target this table only (cross-table joins aren't supported). 
This article explains how to query workspace logs and",2026-04-20T15:52:00.000Z,how-to,configuration,0.8,True,"Describes the DiscoveryLogs_CL table, Auxiliary tier constraints (no cross-table joins), and query patterns—detailed, product-specific log schema and query configuration.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-recreate-v2-resources,Recreate resources in v2,Recreate Microsoft Discovery resources in v2,Recreate Microsoft Discovery resources on v2 infrastructure,"Learn how to set up v2 infrastructure and recreate tools, bookshelves, agents, and workflows in Microsoft Discovery v2 using your exported v1 configurations.","After you collect your v1 resource configurations, use this article to set up v2 infrastructure and recreate your tools, bookshelves, agents, and workflows in Microsoft Discovery v2. Review the v1 to v2 transition guide before you begin. It explains which resources require recreation and which are deprecated.",2026-04-20T15:52:00.000Z,how-to,deployment,0.7,True,"Describes how to set up v2 infrastructure and recreate resources using exported configs, which is product-specific deployment/migration procedure.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-select-models-for-agents,Select models for agents,Select models for agents in Microsoft Discovery,Select optimal OpenAI models for Microsoft Discovery agents,"Learn how to choose the right OpenAI model for your Microsoft Discovery agents based on use case, output quality, cost, and response time.","Microsoft Discovery is built on Microsoft Foundry Agent Service. All models available in the Foundry model catalog are accessible for Discovery agents. During public preview, we recommend OpenAI GPT-5.x series models for the best experience with Discovery agents. 
This article helps you choose the right model for your agents based on task complexity, output quality, cost, and response time.",2026-04-20T15:52:00.000Z,how-to,decision-making,0.8,True,"Provides recommendations on choosing models based on task complexity, quality, cost, and latency, including preview-specific guidance (GPT-5.x).",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-task-addition-execution,Task addition and execution,Task addition and execution in Microsoft Discovery,,"Learn how to add tasks to an investigation, manage task relationships, monitor execution progress, and handle results in the Discovery Engine.","This guide covers the practical steps for creating tasks, setting up relationships between them, monitoring execution, and managing results in the Discovery Engine. For conceptual background on task structure and statuses, see Tasks and investigations.",2026-04-20T15:52:00.000Z,how-to,,0.4,False,"Task creation and execution steps are workflow instructions rather than expert configuration, limits, or troubleshooting mappings.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-view-activity-logs,View activity logs,View activity logs for Microsoft Discovery resources,View and filter Azure activity logs for Discovery resources,"Learn how to view and filter Azure activity logs for Microsoft Discovery control plane operations including workspace, supercomputer, and bookshelf create, update, and delete events.","Azure Activity Logs record all control plane operations performed on Microsoft Discovery resources through the Azure Resource Manager (ARM) API. These logs let you audit who made changes, what was changed, and when, covering operations such as creating a workspace, updating a supercomputer, or deleting a bookshelf. Activity logs are available in Azure Monitor and are separate from the application logs that Discovery stores in MRG-based Log Analytics workspaces. 
For information on application logs, ",2026-04-20T15:52:00.000Z,how-to,configuration,0.6,True,"Shows how to access and filter control plane activity logs for Discovery workspaces, supercomputers, and bookshelves; product-specific logging configuration and usage.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-write-tool-action-scripts,Write action scripts for a tool,Write action scripts for a Microsoft Discovery tool,Implement action scripts for Discovery action-based tools,"Learn how to implement action scripts for action-based tools in Microsoft Discovery, including entrypoint structure, input format handling, batch processing, and output conventions.","Action scripts implement the operations that your tool exposes to Discovery agents. Each action maps to a script (or a command dispatched by a central entrypoint) that the Discovery platform calls when an agent invokes that action. This article describes how to structure action scripts, handle multiple input formats, implement batch processing, and produce consistent output, using a molecular analysis tool as a reference example. Note This article applies to action-based and hybrid tools. If you're ",2026-04-20T15:52:00.000Z,how-to,integrations,0.7,True,"Describes entrypoint structure, input formats, batch processing, and output conventions for Discovery action-based/hybrid tools—product-specific integration contract and coding patterns.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/howto-data-encryption-at-rest,Configure customer-managed keys,Configure customer-managed keys for Microsoft Discovery resources,Configure customer-managed keys for Discovery resources,"Learn how to configure customer-managed keys for supported Microsoft Discovery resources by using Azure Key Vault, managed identities, and resource-specific settings.","This article shows how to configure customer-managed keys (CMK) for supported Microsoft Discovery resources. 
Use this article when you need to create a Bookshelf, Supercomputer, or Workspace resource that uses a key you manage in Azure Key Vault. For background on encryption at rest and the difference between Microsoft-managed keys and customer-managed keys, see Data encryption at rest in Microsoft Discovery.",2026-04-20T15:52:00.000Z,how-to,security,0.8,True,"Explains how to wire CMK from Key Vault to Discovery workspaces, bookshelves, and supercomputers with resource-specific settings—detailed security configuration.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/overview-key-scenarios,Key scenarios,Key scenarios & Use Cases for Microsoft Discovery,,Key scenarios and use cases for Microsoft Discovery.,"Note Microsoft Discovery is currently in Public Preview. Features, capabilities, and availability are subject to change and results may vary. Microsoft Discovery is an AI-powered platform for scientific research and engineering, designed to accelerate discovery across pharmaceuticals, materials science, chemicals, semiconductors, energy, and advanced manufacturing. The platform supports the full research lifecycle, from understanding existing knowledge to designing, testing, and refining new ideas",2026-04-20T15:52:00.000Z,overview,,0.1,False,"Key scenarios and use cases are descriptive and marketing-like, not configuration- or limit-focused.",new
+https://learn.microsoft.com/en-us/azure/microsoft-discovery/overview-service-architecture,Service architecture overview,Microsoft Discovery service architecture overview,,An overview of Microsoft Discovery's Service Architecture,"The Microsoft Discovery resource provider introduces several new Azure Resource Manager (ARM) object types within your subscription. 
These resources provide the foundation for Microsoft Discovery's user experience, agentic AI, and computational layers, and can be deployed and managed like any other Azure resource in your subscription.",2026-04-20T15:52:00.000Z,overview,,0.2,False,"Service architecture overview is conceptual; no quantified thresholds, config matrices, or detailed patterns are indicated.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/overview-what-is-microsoft-discovery,What is Microsoft Discovery?,What is Microsoft Discovery?,,Learn about Microsoft Discovery in Azure.,"Microsoft Discovery is an extensible platform that brings together agentic orchestration, advanced reasoning, a graph-based knowledge foundation, and high-performance computing. It helps drive the three principles for effective agentic discovery - enabling agent empowerment, discovery loop automation, and quality at scale. Because it's built on Azure’s enterprise cloud infrastructure, Microsoft Discovery is designed to operate within the security, compliance, transparency, and governance framewo",2026-04-20T15:52:00.000Z,overview,,0.1,False,"High-level product overview of Microsoft Discovery without numeric limits, config tables, or concrete patterns.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/quickstart-agents-bundles,Quickstart - Add agents using bundles,Quickstart: Add agents using bundles in Microsoft Discovery Studio,,Use agent bundles to quickly deploy a preconfigured set of agents to your Microsoft Discovery project and run your first AI-powered scientific investigation.,"In this quickstart, you use agent bundles to quickly deploy a preconfigured set of agents to your Microsoft Discovery project and run your first AI-powered scientific investigation. 
You will complete the following tasks:",2026-04-20T15:52:00.000Z,quickstart,,0.3,False,"Quickstart for agent bundles is a usage tutorial, not a configuration reference or limits document.",new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/quickstart-agents-studio,Quickstart - First set of Agent and investigation,Quickstart: Get started with agents and investigations in Microsoft Discovery Studio,,Create Microsoft Discovery agents and investigations to run your first AI-powered scientific research.,"In this quickstart, you will set up resources for your Microsoft Discovery project, such as agents and investigations to run your first AI-powered scientific research. You will complete the following tasks:",2026-04-20T15:52:00.000Z,quickstart,,0.3,False,Quickstart for agents and investigations is task-focused and unlikely to enumerate product-specific limits or config tables.,new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/quickstart-infrastructure-bicep,Quickstart - Deploy infrastructure using Bicep,Quickstart: Deploy Microsoft Discovery infrastructure using Bicep,Deploy Microsoft Discovery infrastructure using Bicep,This quickstart shows you how to deploy the prerequisite infrastructure for Microsoft Discovery using Bicep.,"This quickstart describes how to use Bicep to deploy the prerequisite infrastructure for Microsoft Discovery. The deployment creates the foundational Azure resources required before you can create a Discovery workspace, supercomputer, and projects. Bicep is a domain-specific language (DSL) that uses declarative syntax to deploy Azure resources. It provides concise syntax, reliable type safety, and support for code reuse.
Bicep offers the best authoring experience for your infrastructure-as-code s",2026-04-20T15:52:00.000Z,quickstart,deployment,0.65,True,Bicep-based infrastructure deployment for Discovery will include specific resource definitions and constraints unique to this product’s deployment.,new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/quickstart-infrastructure-portal,Quickstart - Deploy infrastructure using Azure portal,Quickstart: Get started with Microsoft Discovery Infrastructure,,"Set up Microsoft Discovery infrastructure by creating a supercomputer, workspace, project, agents, investigations, and then run your first AI-powered scientific investigation.","In this quickstart, you set up your Microsoft Discovery environment to run your first AI-powered scientific investigation. You complete the following tasks:",2026-04-20T15:52:00.000Z,quickstart,,0.3,False,Quickstart for initial setup is procedural and unlikely to contain detailed config matrices or limits beyond basic steps.,new +https://learn.microsoft.com/en-us/azure/microsoft-discovery/tutorial-discovery-mode,Tutorial: Discovery Mode,Tutorial: Run your first investigation with Discovery Mode in Microsoft Discovery,,"Walk through an end-to-end investigation using the Discovery Engine, from creating tasks to reviewing results produced by cognition.","In this tutorial, you create an investigation, set up a small set of tasks, enable Discovery Mode, and observe how the Discovery Engine executes your work autonomously. By the end, you have a completed investigation with results that were planned, executed, and validated using cognition.
Time to complete: 30-45 minutes (including wait time for cognition to execute)",2026-04-20T15:52:00.000Z,tutorial,,0.35,False,"Tutorial for running a first investigation; primarily step-by-step usage without structured configuration, limits, or troubleshooting mappings.",new diff --git a/products/azure-microsoft-discovery/report.md b/products/azure-microsoft-discovery/report.md new file mode 100644 index 00000000..b1899c44 --- /dev/null +++ b/products/azure-microsoft-discovery/report.md @@ -0,0 +1,137 @@ +--- +generated_at: '2026-04-26' +category_descriptions: + architecture-patterns: 'Advanced investigation workflows in Discovery Engine: multi-step + queries, pattern-based searches, correlation strategies, and best practices for + complex data exploration.' + configuration: Configuring Discovery workspaces, storage, tools, agents, and supercomputers, + plus accessing and querying logs for Bookshelf, CogLoop, and other Discovery resources + security: 'Securing Discovery resources: encryption at rest, customer-managed keys, + managed identities, RBAC, private endpoints, network rules, and enabling/exporting + audit logs.' + decision-making: Guidance on choosing agent types and models, planning compute and + billing, and migrating/transitioning Discovery resources and configurations between + v1 and v2. + limits-quotas: File size/type limits, investigation file handling, required Azure + quotas/reservations, and naming rules for Microsoft Discovery resources and deployments + best-practices: Guidance on structuring Discovery projects, running trustworthy + investigations, and applying responsible AI and trust patterns in Microsoft Discovery + research workflows + integrations: Integrating external tools/models into Discovery workflows, managing + Discovery supercomputers via REST, and writing action scripts for action-based + tools. 
+ troubleshooting: Diagnosing and fixing Discovery task execution problems, including + common failures, debugging steps, logs, and configuration issues. + deployment: 'Deploying and hardening Discovery in Azure: Bicep-based infra setup, + v2 resource recreation, container image publishing to ACR, and secure networked + stack deployment.' +skill_description: Expert knowledge for Azure Microsoft Discovery development including + troubleshooting, best practices, decision making, architecture & design patterns, + limits & quotas, security, configuration, integrations & coding patterns, and deployment. + Use when building Discovery workspaces, supercomputer agents, Bookshelf/CogLoop + logs, REST control, or v1→v2 migration, and other Azure Microsoft Discovery related + development tasks. +use_when: Use when building Discovery workspaces, supercomputer agents, Bookshelf/CogLoop + logs, REST control, or v1→v2 migration, and other Azure Microsoft Discovery related + development tasks. +--- +# Azure Microsoft Discovery Crawl Report + +## Summary + +- **Total Pages**: 66 +- **Fetched**: 66 +- **Fetch Failed**: 0 +- **Classified**: 46 +- **Unclassified**: 20 + +## Classification Statistics + +| Type | Count | Percentage | +|------|-------|------------| +| architecture-patterns | 1 | 1.5% | +| best-practices | 3 | 4.5% | +| configuration | 17 | 25.8% | +| decision-making | 6 | 9.1% | +| deployment | 4 | 6.1% | +| integrations | 3 | 4.5% | +| limits-quotas | 3 | 4.5% | +| security | 8 | 12.1% | +| troubleshooting | 1 | 1.5% | +| *(Unclassified)* | 20 | 30.3% | + +## Classified Pages + +| TOC Title | Type | Confidence | Reason | +|-----------|------|------------|--------| +| [Create a tool definition](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-create-tool-definition) | configuration | 0.85 | Explains each section of the Discovery tool definition YAML, including fields for image location, compute resources, and operations—detailed configuration schema 
unique to this product. | +| [Quota reservations](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-quota-reservation) | limits-quotas | 0.85 | Explicitly about required quotas and capacity reservations (VM SKUs, storage, database, AI models) with specific numeric requirements for deployments. | +| [Role assignments](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-role-assignments) | security | 0.85 | Describes three built-in Discovery roles and their permissions plus related Azure roles, which are product-specific RBAC details. | +| [Configure customer-managed keys](https://learn.microsoft.com/en-us/azure/microsoft-discovery/howto-data-encryption-at-rest) | security | 0.80 | Explains how to wire CMK from Key Vault to Discovery workspaces, bookshelves, and supercomputers with resource-specific settings—detailed security configuration. | +| [Configure managed identities](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-configure-managed-identity) | security | 0.80 | Step-by-step creation and configuration of UAMIs, including required role assignments and attachment to resources, is concrete security configuration. | +| [Configure network security](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-configure-network-security) | security | 0.80 | Details assigning NSP roles, configuring subnets, private endpoints, and DNS for Discovery workspaces/bookshelves—product-specific network security configuration with Azure constructs. | +| [Debug task execution](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-debug-task-execution) | troubleshooting | 0.80 | Explicitly focuses on diagnosing stuck tasks, validation failures, agent errors, and cognition behavior; likely organized by symptom-to-cause-to-solution for Discovery-specific failures. 
| +| [Query CogLoop logs](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-query-cognitive-loop-logs) | configuration | 0.80 | Documents the DiscoveryCogLoopLogs_CL table, its role in tracing Act/Cognition loops, and query constraints—product-specific log configuration and usage. | +| [Query Supercomputer logs](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-query-supercomputer-logs) | configuration | 0.80 | Explains supercomputer log tables, schemas, and example queries for Kubernetes events, system health, and tool execution—detailed observability configuration. | +| [Query workspace logs](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-query-workspace-logs) | configuration | 0.80 | Describes the DiscoveryLogs_CL table, Auxiliary tier constraints (no cross-table joins), and query patterns—detailed, product-specific log schema and query configuration. | +| [Resource Naming](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-resource-naming) | limits-quotas | 0.80 | Covers naming rules including character limits and allowed characters per resource type, which are numeric constraints and patterns unique to this service. | +| [Select models for agents](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-select-models-for-agents) | decision-making | 0.80 | Provides recommendations on choosing models based on task complexity, quality, cost, and latency, including preview-specific guidance (GPT-5.x). | +| [Trust relationship and basic investigation patterns](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-trust-basic-investigation-patterns) | best-practices | 0.80 | Provides concrete patterns and guidance on structuring tasks, validation requirements, and autonomy levels specific to Discovery Engine. 
| +| [Advanced investigation patterns](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-advanced-investigation-patterns) | architecture-patterns | 0.75 | Defines advanced patterns (deterministic, guided, autonomous) and when to use each, representing product-specific architecture/pattern guidance. | +| [Azure Blob Storage Account](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-storage-account) | configuration | 0.75 | Explains required networking, CORS, and identity access configuration for the storage account used by Discovery, which are concrete settings. | +| [Managed identities](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-managed-identities) | security | 0.75 | Details how Discovery uses user-assigned managed identities across resources, including role grants and identity usage for secure access. | +| [Query Bookshelf indexing logs](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-query-bookshelf-indexing-logs) | configuration | 0.75 | Describes DiscoveryBookshelfLogs_CL location (supercomputer MRG) and how to query indexing job stdout/stderr—product-specific log placement and query behavior. | +| [Query Bookshelf query logs](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-query-bookshelf-logs) | configuration | 0.75 | Covers DiscoveryLogs_CL usage for bookshelf query traces and diagnostics in the bookshelf’s MRG—detailed log schema and query configuration. | +| [Access resource logs](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-access-resource-logs) | configuration | 0.70 | Explains how to locate the MRG, open the Log Analytics workspace, and access Discovery-specific log tables—concrete observability configuration for this product. 
| +| [Azure Container Registry](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-azure-container-registry) | configuration | 0.70 | Describes ACR SKU and networking options and how to create/configure a registry for Discovery, implying product-specific configuration parameters. | +| [Collect v1 resource configurations](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-collect-v1-configurations) | decision-making | 0.70 | Guides which v1 configurations to collect and export before v2 migration, supporting concrete migration planning decisions. | +| [Data encryption at rest](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-data-encryption-at-rest) | security | 0.70 | Covers what data is encrypted, key management models, and when customer-managed keys are available, which are concrete security configuration patterns. | +| [Data handling with tools and agents](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-data-handling-with-tools-agents) | configuration | 0.70 | Describes Discovery’s resource-based data model, resource URIs, built-in resource management tools, and how to configure input/output mounts for tools—product-specific configuration behavior not inferable from general LLM knowledge. | +| [Discovery Agent types](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-discovery-agent-types) | decision-making | 0.70 | Explains when to use each agent type and how they differ, providing concrete guidance for selecting the right agent type for scenarios. | +| [Enable audit logging](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-enable-audit-logging) | security | 0.70 | Covers configuring Azure Monitor diagnostic settings to export Discovery audit logs to Storage or Log Analytics for compliance—security/audit configuration details. 
| +| [End-to-end network-hardened deployment](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-deploy-network-hardened-stack) | deployment | 0.70 | End-to-end guide for deploying all Discovery components (workspace, bookshelf, supercomputer, storage) with only private endpoints; describes a product-specific deployment pattern and constraints. | +| [Manage Workspaces](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-manage-workspaces) | configuration | 0.70 | Includes workspace creation, updates, networking configuration, and supercomputer management, which are concrete configuration steps. | +| [Manage storage containers](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-manage-storage-containers) | configuration | 0.70 | Covers how Discovery maps storage containers and assets to Azure Blob/NetApp paths with product-specific resource semantics and management steps, which are configuration details unique to this service. | +| [Network security](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-network-security) | security | 0.70 | Describes network security perimeters and private endpoints, including API-version-specific behavior and default hardening, which are product-specific security details. | +| [Recreate resources in v2](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-recreate-v2-resources) | deployment | 0.70 | Describes how to set up v2 infrastructure and recreate resources using exported configs, which is product-specific deployment/migration procedure. | +| [Resource provider registration](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-resource-provider-registration) | configuration | 0.70 | Explains provider registration, required permissions, and how to register via portal/CLI/PowerShell/REST, which involves specific commands and parameters. 
| +| [Responsible AI in Microsoft Discovery](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-responsible-ai) | best-practices | 0.70 | Includes Discovery-specific limitations, safety components, and best practices for using the platform responsibly in scientific contexts. | +| [Tools and model integration](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-tools-model-integration) | integrations | 0.70 | Describes tool types, deployment, and model integration patterns, which are product-specific integration patterns for Discovery agents. | +| [Write action scripts for a tool](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-write-tool-action-scripts) | integrations | 0.70 | Describes entrypoint structure, input formats, batch processing, and output conventions for Discovery action-based/hybrid tools—product-specific integration contract and coding patterns. | +| [v1 to v2 transition guidance](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-v1-to-v2-transition-guide) | decision-making | 0.70 | Transition guide explains which resources can be retained vs. must be recreated and what is deprecated, providing concrete migration decisions and guidance. | +| [Create Bookshelf and index a Knowledgebase](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-index-bookshelf-knowledgebase) | configuration | 0.65 | Walks through creating Bookshelf resources, wiring storage, and indexing knowledgebases with Discovery-specific resource types and flows, which are concrete configuration patterns. | +| [Create agents](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-agent-creation) | configuration | 0.65 | Walks through agent authoring in Discovery Studio/Foundry UI, likely including specific agent configuration fields and options. 
| +| [Manage Supercomputer & Nodepools](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-manage-supercomputers) | configuration | 0.65 | Managing supercomputers and nodepools in the portal will involve specific settings and constraints unique to Discovery’s compute resources. | +| [Manage Supercomputer using REST APIs](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-manage-supercomputers-rest-api) | integrations | 0.65 | Shows REST operations and parameters for supercomputer resources, which are product-specific API integration details. | +| [Plan tool requirements](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-plan-tool-requirements) | decision-making | 0.65 | Guides selection of tool type, compute needs, and dependencies before publishing; this is product-specific planning and trade-off guidance for tool design decisions. | +| [Projects and Investigations](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-projects-investigations) | best-practices | 0.65 | Includes best practices for using projects and investigations specific to Discovery’s resource model, beyond generic project organization advice. | +| [Publish a tool to Azure Container Registry](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-publish-tool-to-acr) | deployment | 0.65 | Details how Discovery expects tool images to be built, tagged, and pushed to ACR for deployment; this is a product-specific deployment pipeline step. | +| [Quickstart - Deploy infrastructure using Bicep](https://learn.microsoft.com/en-us/azure/microsoft-discovery/quickstart-infrastructure-bicep) | deployment | 0.65 | Bicep-based infrastructure deployment for Discovery will include specific resource definitions and constraints unique to this product’s deployment. 
| +| [Billing overview](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-discovery-billing) | decision-making | 0.60 | Explains what counts as a User Message, which operations are billable, and how charges appear—service-specific cost model that informs usage and capacity decisions. | +| [Files and storage assets](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-files-storage-assets) | limits-quotas | 0.60 | Covers supported file types, how files move between tasks, and current limitations, which likely include explicit constraints unique to Discovery. | +| [View activity logs](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-view-activity-logs) | configuration | 0.60 | Shows how to access and filter control plane activity logs for Discovery workspaces, supercomputers, and bookshelves; product-specific logging configuration and usage. | + +## Unclassified Pages + +| TOC Title | Confidence | Reason | +|-----------|------------|--------| +| [Create a Dockerfile for a tool](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-create-tool-docker-file) | 0.45 | Dockerfile creation and local validation are mostly generic containerization steps; unlikely to contain Discovery-specific configuration tables or constraints. | +| [Build investigations with cognition](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-build-investigations-cognition) | 0.40 | Step-by-step investigation setup is procedural tutorial content without structured configuration tables, limits, or error mappings. | +| [Task addition and execution](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-task-addition-execution) | 0.40 | Task creation and execution steps are workflow instructions rather than expert configuration, limits, or troubleshooting mappings. 
| +| [Tasks and investigations](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-tasks-investigations) | 0.40 | Explains task structure and lifecycle conceptually; summary doesn’t indicate detailed config tables or numeric constraints. | +| [Discovery Agent concepts](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-discovery-agent) | 0.35 | Conceptual description of Discovery Agents and capabilities; no clear indication of config tables, limits, or troubleshooting mappings. | +| [Storage containers and storage assets](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-storage-containers-assets) | 0.35 | Explains concepts of storage containers and assets; likely structural description rather than detailed configuration matrices. | +| [Tutorial: Discovery Mode](https://learn.microsoft.com/en-us/azure/microsoft-discovery/tutorial-discovery-mode) | 0.35 | Tutorial for running a first investigation; primarily step-by-step usage without structured configuration, limits, or troubleshooting mappings. | +| [Cognition overview](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-cognition-overview) | 0.30 | Cognition overview describes reasoning loops and task management conceptually, not specific numeric thresholds or config parameters. | +| [Discovery Engine overview](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-discovery-engine) | 0.30 | Overview of Discovery Engine behavior; focuses on conceptual cognition and orchestration rather than concrete configuration or limits. | +| [Observability overview](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-observability) | 0.30 | High-level observability overview; conceptual description of logs and monitoring without detailed tables, schemas, or configuration parameters. 
| +| [Quickstart - Add agents using bundles](https://learn.microsoft.com/en-us/azure/microsoft-discovery/quickstart-agents-bundles) | 0.30 | Quickstart for agent bundles is a usage tutorial, not a configuration reference or limits document. | +| [Quickstart - Deploy infrastructure using Azure portal](https://learn.microsoft.com/en-us/azure/microsoft-discovery/quickstart-infrastructure-portal) | 0.30 | Quickstart for initial setup is procedural and unlikely to contain detailed config matrices or limits beyond basic steps. | +| [Quickstart - First set of Agent and investigation](https://learn.microsoft.com/en-us/azure/microsoft-discovery/quickstart-agents-studio) | 0.30 | Quickstart for agents and investigations is task-focused and unlikely to enumerate product-specific limits or config tables. | +| [Bookshelf & Knowledge Bases](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-bookshelf-knowledge-bases) | 0.25 | Conceptual overview of Bookshelf and Knowledge Bases; no indication of numeric limits, config tables, or troubleshooting mappings. | +| [Microsoft Discovery Studio](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-studio) | 0.20 | Describes what Discovery Studio is and its UI; no specific settings, limits, or troubleshooting content. | +| [Service architecture overview](https://learn.microsoft.com/en-us/azure/microsoft-discovery/overview-service-architecture) | 0.20 | Service architecture overview is conceptual; no quantified thresholds, config matrices, or detailed patterns are indicated. | +| [Virtual Networks and Subnets](https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-virtual-networks) | 0.20 | Conceptual VNet overview focuses on intent and principles, not specific network configuration parameters or limits. 
| +| [Write effective prompts for agents](https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-prompt-engineering) | 0.20 | Prompt engineering guidance is conceptual and pattern-based without product-specific configuration tables, limits, or error mappings. | +| [Key scenarios](https://learn.microsoft.com/en-us/azure/microsoft-discovery/overview-key-scenarios) | 0.10 | Key scenarios and use cases are descriptive and marketing-like, not configuration- or limit-focused. | +| [What is Microsoft Discovery?](https://learn.microsoft.com/en-us/azure/microsoft-discovery/overview-what-is-microsoft-discovery) | 0.10 | High-level product overview of Microsoft Discovery without numeric limits, config tables, or concrete patterns. | diff --git a/products/azure-migrate/azure-migrate.csv b/products/azure-migrate/azure-migrate.csv index 1e89a5a7..e625c354 100644 --- a/products/azure-migrate/azure-migrate.csv +++ b/products/azure-migrate/azure-migrate.csv @@ -19,7 +19,7 @@ https://learn.microsoft.com/en-us/azure/migrate/assessment-prerequisites?view=mi https://learn.microsoft.com/en-us/azure/migrate/assessment-properties?view=migrate,Customize assessment settings,Assessment Properties - Azure Migrate,Configure general assessment properties in Azure Migrate,Describes the components of an assessment in Azure Migrate.,This article explains the assessment properties on theGeneraltab that you can use when creating an assessment. These general properties apply to all workloads in an application or for cross-workload assessments. 
They also apply to individual workload assessments.,2025-09-30T22:16:00.000Z,concept-article,configuration,0.7,True,Explains specific assessment property names and options on the General tab; these are product-specific configuration parameters.,unchanged https://learn.microsoft.com/en-us/azure/migrate/assessment-report?view=migrate,Review assessment report,Overview of Azure Migrate Assessment Report - Azure Migrate,Interpret Azure Migrate assessment report outputs,"Learn about assessment report, Azure readiness, and recommendations.","Each assessment provides four key outputs: Azure readiness, right sized target recommendations, cost details, and migration guidance.",2025-10-29T22:11:00.000Z,concept-article,decision-making,0.65,True,"Describes Azure readiness, right-sized recommendations, cost details, and migration guidance; helps users decide migration paths based on report metrics.",unchanged https://learn.microsoft.com/en-us/azure/migrate/assessment-rules-for-postgresql?view=migrate,Assessment rules for PostgreSQL,PostgreSQL Assessment Rules to Detect Blockers and Compatibility Issues - Azure Migrate,Apply PostgreSQL assessment rules for Azure migration,"Helps detect migration blockers and compatibility issues when moving PostgreSQL databases to Azure Database for PostgreSQL Flexible Server, ensuring a smooth and successful cloud transition.","The assessment rules help identify compatibility issues and migration blockers when you move PostgreSQL instances to Azure Database for PostgreSQL flexible server. You can evaluate the source environment for resource limits, feature support, security settings, and configuration gaps. 
This evaluation categorizes the findings as Issues (blockers you must fix) or Warnings (items you should address), and recommend changes needed for the database, application, and architecture to ensure a successful ",2025-09-19T08:00:00.000Z,concept-article,best-practices,0.8,True,"PostgreSQL assessment rules for blockers/compatibility; likely lists specific checks, configuration thresholds, and recommended changes, which are product-specific best practices.",unchanged -https://learn.microsoft.com/en-us/azure/migrate/azure-copilot-migration-agent?view=migrate,What is Azure Copilot Migration Agent?,Azure Copilot Migration Agent (preview) - Azure Migrate,,"Azure Copilot migration agent is a planning‑focused Copilot experience that helps you analyze migrations using Azure Migrate data, including readiness, strategy, ROI, and landing zone insights (previe","Azure Copilot migration agent is a planning‑focused experience that helps you plan and analyze migrations by reasoning over Azure Migrate data. The migration agent supports migration planning, analysis, and decision making, but not migration execution. You can interact with the Agent using natural language prompts to explore inventory, migration readiness, strategies, ROI considerations, and landing zone requirements. 
Migration planning and analysis capabilities: Azure Copilot migration agent pr",2026-04-15T08:00:00.000Z,overview,,0.2,False,"Planning-focused overview of Azure Copilot migration agent capabilities; no detailed limits, configuration tables, error codes, or product-specific numeric thresholds.",updated +https://learn.microsoft.com/en-us/azure/migrate/azure-copilot-migration-agent?view=migrate,What is Azure Copilot Migration Agent?,Azure Copilot Migration Agent (preview) - Azure Migrate,,"Azure Copilot migration agent is a planning‑focused Copilot experience that helps you analyze migrations using Azure Migrate data, including readiness, strategy, ROI, and landing zone insights (previe","Azure Copilot migration agent is a planning‑focused experience that helps you plan and analyze migrations by reasoning over Azure Migrate data. The migration agent supports migration planning, analysis, and decision making, but not migration execution. You can interact with the Agent using natural language prompts to explore inventory, migration readiness, strategies, ROI considerations, and landing zone requirements. Migration planning and analysis capabilities: Azure Copilot migration agent pr",2026-04-15T08:00:00.000Z,overview,,0.2,False,"Planning-focused overview of Azure Copilot migration agent capabilities; no detailed limits, configuration tables, error codes, or product-specific numeric thresholds.",unchanged https://learn.microsoft.com/en-us/azure/migrate/azure-migrate-unsupported-regions?view=migrate,Azure Migrate in unsupported regions,Use Azure Migrate in Unsupported Regions - Azure Migrate,Use Azure Migrate in unsupported regions,Describes and lists the regions that Azure Migrate doesn't support,"Azure Migrate is a geo-level service that is deployed in at least one region of each geography. The service depends on other Azure services that need to be available before it can be deployed in a new region. 
As a result, it may not be available immediately when a region is launched. However, customers can still migrate their workloads to the new regions using the Azure Migrate service from a nearby region within the same geography.",2025-09-08T08:00:00.000Z,troubleshooting,limits-quotas,0.65,True,"Region support/unsupported list is a product-specific capability matrix; while not numeric quotas, it’s concrete availability constraints that LLMs won’t reliably know.",unchanged
https://learn.microsoft.com/en-us/azure/migrate/azure-monitor-agent-migration?view=migrate,Migrate to Azure Monitor agent,Migrate to Azure Monitor Agent from Log Analytics Agent - Azure Migrate,Migrate dependency analysis from MMA to Azure Monitor Agent,Procedure to migrate to Azure Monitor Agent from MMA,"Dependency analysis helps you to identify and understand dependencies across servers that you want to assess and migrate to Azure. We currently perform agent-based dependency analysis by downloading the MMA agent and associating a Log Analytics workspace with the Azure Migrate project. Azure Monitor Agent (AMA) replaces the Log Analytics agent, also known as Microsoft Monitor Agent (MMA) and OMS, for Windows and Linux machines, in Azure and non-Azure environments, on-premises, and other clouds. 
Thi",2025-04-23T05:33:00.000Z,how-to,configuration,0.65,True,"Migration from MMA to AMA for dependency analysis will include agent configuration details, workspace settings, and product-specific parameters.",unchanged https://learn.microsoft.com/en-us/azure/migrate/best-practices-least-privileged-account?view=migrate,Credentials:Security best practices,Set Up Least Privileged Accounts - Azure Migrate,Configure least-privilege VMware roles for Azure Migrate,Learn how to configure the Azure Migrate appliance with least privileged access by setting up read-only VMware roles with guest operations and scoped permissions.,"The Azure Migrate appliance is a lightweight tool that discovers on-premises servers and sends their configuration and performance data to Azure. It also performs software inventory, performs agentless dependency analysis, and detects workloads like web apps and instances of SQL Server or MySQL Server. To use these features, you add server and guest credentials in the appliance configuration manager. Following the principle of least privilege helps keep the setup secure and efficient. Important ",2025-09-04T08:00:00.000Z,concept-article,security,0.85,True,"Details read-only VMware roles with guest operations and scoped permissions for the appliance; includes specific role/permission configurations, a product-specific security pattern.",unchanged @@ -38,7 +38,7 @@ https://learn.microsoft.com/en-us/azure/migrate/concepts-arc-resource-discovery? 
https://learn.microsoft.com/en-us/azure/migrate/concepts-business-case-calculation?view=migrate,Business case overview,Business Case in Azure Migrate - Azure Migrate,Interpret Azure Migrate business case calculations,"Learn what a business case in Azure Migrate is, what reports it contains, and the core concepts and formulas that are used.","Azure Migrate helps you plan and execute migration and modernization to Azure through a centralized hub for discovery, assessment, and migration.",2026-04-10T22:10:00.000Z,concept-article,decision-making,0.7,True,"Business case content typically includes concrete cost formulas, calculation logic, and parameterized assumptions (for example, how on-premises costs, Azure pricing, and utilization are modeled). These quantified formulas and report structures are product-specific and help decide whether and how to migrate, fitting decision-making. It goes beyond conceptual overview by detailing how Azure Migrate computes the business case.",unchanged https://learn.microsoft.com/en-us/azure/migrate/concepts-dependency-visualization?view=migrate,Dependency analysis,Dependency analysis in Azure Migrate Discovery and assessment - Azure Migrate,,Describes how to use dependency analysis for assessment using Azure Migrate Discovery and assessment.,This article describes dependency analysis in Azure Migrate: Discovery and assessment. Dependency analysis identifies dependencies between discovered on-premises or Azure VMware Solution servers. It provides these advantages:,2026-03-26T22:23:00.000Z,concept-article,,0.2,False,"Conceptual description of dependency analysis in Azure Migrate (what it is, advantages, and how it helps assessments). Does not expose numeric limits, configuration parameter tables, error-code-based troubleshooting, or decision matrices with quantified trade-offs. 
Primarily an overview of the feature’s purpose and usage, so it does not meet the expert-knowledge criteria for any sub-skill type.",unchanged https://learn.microsoft.com/en-us/azure/migrate/concepts-overview?view=migrate,Overview,Overview Assessment - Azure Migrate,,Learn about types of assessments in Azure Migrate.,This article provides an overview of Azure Migrate assessments. An Azure Migrate assessment evaluates your workloads hosted on your on-premises datacenter or other public clouds for migration to Azure. Each Azure Migrate assessment analyzes your source workloads for:,2026-04-10T22:10:00.000Z,concept-article,,0.2,False,"An overview of Azure Migrate assessments; overviews generally describe types and concepts without detailed limits, configuration tables, or decision matrices. No indication of specific numeric thresholds or product-specific patterns.",unchanged -https://learn.microsoft.com/en-us/azure/migrate/concepts-vmware-agentless-migration?view=migrate,Agentless migration architecture,Agentless Replication of VMware Virtual Machines - Azure Migrate,,This article describes concepts related to agentless migration of VMware VMs in Azure Migrate.,This article describes the replication concepts when you're migrating VMware virtual machines (VMs) by using the agentless migration method in Azure Migrate.,2026-04-14T11:11:00.000Z,concept-article,,0.3,False,"Conceptual explanation of agentless VMware VM replication in Azure Migrate; likely describes flow and concepts without numeric limits, config parameter tables, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/migrate/concepts-vmware-agentless-migration?view=migrate,Agentless migration architecture,Agentless Replication of VMware Virtual Machines - Azure Migrate,,This article describes concepts related to agentless migration of VMware VMs in Azure Migrate.,This article describes the replication concepts when you're migrating VMware virtual machines (VMs) by using the 
agentless migration method in Azure Migrate.,2026-04-14T11:11:00.000Z,concept-article,,0.3,False,"Conceptual explanation of agentless VMware VM replication in Azure Migrate; likely describes flow and concepts without numeric limits, config parameter tables, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/migrate/confidence-ratings?view=migrate,Performance coverage,Performance coverage - Azure Migrate,Evaluate performance coverage in Azure Migrate,Describes how performance coverage is calculated in an assessment.,"Each performance-based Azure VM assessment in Azure Migrate is associated with a performance coverage. The coverage ranges from one (lowest) to five (highest) stars. The performance coverage helps you estimate the reliability of the size recommendations Azure Migrate provides. The performance coverage is assigned to an assessment. The coverage is based on the availability of data points that are needed to compute the assessment. For performance-based sizing, the assessment needs: If any of these",2025-10-29T22:11:00.000Z,concept-article,decision-making,0.7,True,"Defines the star-based performance coverage metric, its calculation from data points, and its impact on recommendation reliability; product-specific quantified decision signal.",unchanged https://learn.microsoft.com/en-us/azure/migrate/cost-estimation?view=migrate,Cost and savings calculations,Cost estimation of assessments in Azure Migrate - Azure Migrate,,Learn about cost estimation in assessments in Azure Migrate,"Azure Migrate assessments provide you with an estimated cost of hosting the recommended targets on Azure. These costs are identified for each right-sized target on Azure. Note The cost estimates are dependent on the rates in the specified region, any applicable offers, and the licensing program selected by you. 
This article describes Azure Migrate assessments, which estimate hosting costs for recommended targets on Azure, based on region rates, applicable offers, and selected licensing programs.",2026-04-01T17:25:00.000Z,concept-article,,0.3,False,"Explains cost estimation behavior in assessments; summary suggests conceptual description of how costs are derived rather than concrete limits, parameter tables, or decision matrices with thresholds.",unchanged https://learn.microsoft.com/en-us/azure/migrate/create-application-assessment?view=migrate,Create assessment,Create an application assessment - Azure Migrate,,Learn how to create an application assessment using Azure Migrate.,"This article shows you how to create an application assessment to migrate or modernize your application workloads using Azure Migrate. Creating an application assessment for your application provides you with multiple migration strategies that you can use to migrate your workloads, identify the recommended as well as alternative targets, and key insights such as readiness, target right-sizing, and cost to host and run these workloads on Azure month over month. 
You can also create a cross-workload asse",2025-04-23T05:33:00.000Z,concept-article,,0.4,False,Step-by-step tutorial for creating an application assessment; mostly workflow/UI without deep config tables or numeric thresholds.,unchanged @@ -81,8 +81,8 @@ https://learn.microsoft.com/en-us/azure/migrate/how-to-review-discovered-invento https://learn.microsoft.com/en-us/azure/migrate/how-to-scale-out-for-migration?view=migrate,How to migrate VMware VMs at scale using the agentless replication method,Set up an Azure Migrate scale-out appliance for agentless VMware migration - Azure Migrate,Scale-out Azure Migrate appliance capacity limits,Learn how to set up an Azure Migrate scale-out appliance to migrate Hyper-V VMs.,"This article helps you understand how to use a scale-out appliance to migrate a large number of VMware virtual machines (VMs) to Azure using the Migration and modernization tool's agentless method for migration of VMware VMs. Using the agentless migration method for VMware virtual machines you can: In this article, you learn how to: Note While you can schedule replication for up to 300 VMs on a single appliance and up to 500 VMs using a scale-out appliance, the replication itself is limited by e",2024-12-31T12:14:00.000Z,how-to,limits-quotas,0.7,True,"Mentions concrete VM count limits for a single vs. scale-out appliance (for example, up to 300 vs. 500 VMs), which are product-specific numerical limits.",unchanged https://learn.microsoft.com/en-us/azure/migrate/how-to-set-up-appliance-physical?view=migrate,Deploy appliance with template,Set up an Azure Migrate appliance for physical servers - Azure Migrate,Configure Azure Migrate appliance for physical servers,Learn how to set up an Azure Migrate appliance for physical server discovery and assessment.,"This article describes how to set up the Azure Migrate appliance if you're assessing physical servers with the Azure Migrate: Discovery and assessment tool. 
The Azure Migrate appliance is a lightweight appliance, used by Azure Migrate: Discovery and assessment to do the following: Learn more about the Azure Migrate appliance. After creating the appliance, you check that it can connect to Azure Migrate: Discovery and assessment, configure it for the first time, and register it with the project. No",2025-12-12T12:11:00.000Z,how-to,configuration,0.7,True,"Details appliance setup, connectivity checks, and registration; likely includes specific ports, URLs, and configuration options.",unchanged https://learn.microsoft.com/en-us/azure/migrate/how-to-test-replicating-virtual-machines?view=migrate,Test migration,Test migrate replicating virtual machines - Azure Migrate,Best practices for test migrations of virtual machines,Learn best practices for testing replicating virtual machines,This article helps you understand how to test replicating virtual machines. Test migration provides a way to test and validate migrations prior to the actual migration.,2025-05-13T05:03:00.000Z,how-to,best-practices,0.7,True,Explicitly about understanding how to test migrations; likely includes concrete recommendations and gotchas for test migration scenarios.,unchanged
-https://learn.microsoft.com/en-us/azure/migrate/how-to-upgrade-windows?view=migrate,Upgrade Windows OS,Upgrade Windows Operating System - Azure Migrate,Decide and execute Windows Server OS upgrades during Azure Migrate,Learn how to upgrade Windows OS during migration.,"This article describes how to upgrade Windows Server OS while migrating to Azure. Azure Migrate OS upgrade allows you to move from an older operating system to a newer one while keeping your settings, server roles, and data intact. You can move your on-premises server to Azure with an upgraded OS version of Windows Server using Windows upgrade. Note",2026-04-14T17:11:00.000Z,how-to,decision-making,0.68,True,"The article is about upgrading Windows Server OS as part of Azure Migrate. 
These docs typically include which source Windows Server versions can be upgraded in-place to which target versions during migration, supported/unsupported upgrade paths, and constraints specific to Azure Migrate tooling. That constitutes product-specific migration and upgrade-path guidance (what paths are allowed, when to choose upgrade vs other options), which fits the decision-making category more than a generic tutorial. It is not just conceptual; it encodes concrete support rules and migration considerations that an LLM would not reliably know from training.",updated
-https://learn.microsoft.com/en-us/azure/migrate/how-to-use-azure-migrate-with-private-endpoints?view=migrate,Requirements for Private endpoints,Use Private Endpoints - Azure Migrate,Use Azure Migrate over Private Link with private endpoints,"Use Azure Migrate to discover, assess, and migrate servers by using Azure Private Link.","This article describes how to use Azure Migrate to discover, assess, and migrate servers over a private network by using Azure Private Link. You can use the tools in Azure Migrate to connect to the service over an Azure ExpressRoute private peering connection or a site-to-site VPN connection by using Private Link. For more information about these tools, see What is Azure Migrate? 
We recommend the method of private endpoint connectivity when there's an organizational requirement to access Azure Mig",2026-02-10T12:11:00.000Z,concept-article,configuration,0.7,True,Describes configuring Azure Private Link/private endpoints for Azure Migrate; likely includes endpoint/resource configuration details specific to this service.,unchanged +https://learn.microsoft.com/en-us/azure/migrate/how-to-upgrade-windows?view=migrate,Upgrade Windows OS,Upgrade Windows Operating System - Azure Migrate,Decide and execute Windows Server OS upgrades during Azure Migrate,Learn how to upgrade Windows OS during migration.,"This article describes how to upgrade Windows Server OS while migrating to Azure. Azure Migrate OS upgrade allows you to move from an older operating system to a newer one while keeping your settings, server roles, and data intact. You can move your on-premises server to Azure with an upgraded OS version of Windows Server using Windows upgrade. Note",2026-04-14T17:11:00.000Z,how-to,decision-making,0.68,True,"The article is about upgrading Windows Server OS as part of Azure Migrate. These docs typically include which source Windows Server versions can be upgraded in-place to which target versions during migration, supported/unsupported upgrade paths, and constraints specific to Azure Migrate tooling. That constitutes product-specific migration and upgrade-path guidance (what paths are allowed, when to choose upgrade vs other options), which fits the decision-making category more than a generic tutorial. 
It is not just conceptual; it encodes concrete support rules and migration considerations that an LLM would not reliably know from training.",unchanged
+https://learn.microsoft.com/en-us/azure/migrate/how-to-use-azure-migrate-with-private-endpoints?view=migrate,Requirements for Private endpoints,Use Private Endpoints - Azure Migrate,Configure Azure Migrate with Private Endpoints and Private Link,"Use Azure Migrate to discover, assess, and migrate servers by using Azure Private Link.","This article describes how to use Azure Migrate to discover, assess, and migrate servers over a private network by using Azure Private Link. You can use the tools in Azure Migrate to connect to the service over an Azure ExpressRoute private peering connection or a site-to-site VPN connection by using Private Link. For more information about these tools, see What is Azure Migrate? We recommend the method of private endpoint connectivity when there's an organizational requirement to access Azure Mig",2026-04-19T17:12:00.000Z,concept-article,configuration,0.7,True,"The article describes product-specific network and Private Link configuration for Azure Migrate, including how to set up private endpoints over ExpressRoute or VPN and required settings. This is concrete, service-specific configuration guidance rather than a conceptual overview.",updated https://learn.microsoft.com/en-us/azure/migrate/how-to-view-a-business-case?view=migrate,Review business case,Review a Business Case with Azure Migrate - Azure Migrate,,This article describes how to review a business case with Azure Migrate.,"This article describes how to review the business case reports for on-premises applications and workloads in your datacenter with Azure Migrate. Azure Migrate helps you to plan and execute migration and modernization projects to Azure. 
Azure Migrate provides a centralized hub to track discovery, assessment, and migration of on-premises infrastructure, applications, and data to Azure.",2026-04-10T22:10:00.000Z,how-to,,0.3,False,"Described as how to review business case reports, likely a UI walkthrough of viewing existing reports rather than exposing underlying formulas, thresholds, or comparison matrices. Appears procedural, not expert decision logic or configuration tables.",unchanged https://learn.microsoft.com/en-us/azure/migrate/hydration-process?view=migrate,Hydration process workflow,Hydration process - Azure Migrate,Use Azure Migrate hydration process for VM configuration,Learn about the hydration process in Azure Migrate.,"You have to make some changes to the VMs' configuration before the migration to ensure that the migrated VMs function properly on Azure. Azure Migrate handles these configuration changes via the hydration process. The hydration process is only performed for the versions of Azure supported operating systems given above. Before you migrate, you may need to perform the required changes manually for other operating system versions that aren't listed above. If the VM is migrated without the required cha",2025-05-27T17:04:00.000Z,concept-article,best-practices,0.7,True,Explains hydration process and required configuration changes for supported OS versions; includes product-specific behavior and requirements to avoid failures.,unchanged https://learn.microsoft.com/en-us/azure/migrate/hyper-v-migration-architecture?view=migrate,Hyper-V migration architecture,How does Hyper-V migration work in Azure Migrate? - Azure Migrate,Understand Hyper-V migration architecture in Azure Migrate,Learn about Hyper-V migration with Azure Migrate,"This article provides an overview of the architecture and processes used when you migrate Hyper-V VMs with the Migration and modernization tool. 
Azure Migrate provides a central hub to track discovery, assessment, and migration of your on-premises apps and workloads, and private/public cloud VMs, to Azure. The hub provides Azure Migrate tools for assessment and migration, as well as third-party independent software vendor (ISV) offerings.",2024-11-29T18:04:00.000Z,concept-article,architecture-patterns,0.65,True,Architecture and process overview for Hyper-V migration using Migration and modernization tool; likely includes Azure Migrate–specific architectural patterns and flows.,unchanged
After the tool is adde",2025-12-30T06:12:00.000Z,concept-article,limits-quotas,0.8,True,"Summarizes prerequisites and support requirements for Hyper-V assessment; likely includes supported versions and constraints, a support/limits matrix.",unchanged https://learn.microsoft.com/en-us/azure/migrate/migrate-support-matrix-physical?view=migrate,Support matrix for Physical server discovery,Support for physical discovery and assessment in Azure Migrate and Modernize - Azure Migrate,Review physical server discovery support in Azure Migrate,Learn about support for physical discovery and assessment with Azure Migrate: Discovery and assessment.,"This article summarizes prerequisites and support requirements when you assess physical servers for migration to Azure by using theAzure Migrate: Discovery and assessmenttool. If you want to migrate physical servers to Azure, see themigration support matrix. To assess physical servers, you create a project and add the Azure Migrate: Discovery and assessment tool to the project. After you add the tool, you deploy theAzure Migrate appliance. The appliance continuously discovers on-premises servers",2025-12-30T06:12:00.000Z,concept-article,limits-quotas,0.8,True,"Support matrix for physical discovery and assessment; includes prerequisites and supported configurations, which are expert support/limit details.",unchanged https://learn.microsoft.com/en-us/azure/migrate/migrate-support-matrix-vmware-migration?view=migrate,VMware vSphere migration requirements,Support for VMware vSphere migration in Azure Migrate - Azure Migrate,Check VMware vSphere migration support and limits in Azure Migrate,Learn about support for VMware vSphere VM migration in Azure Migrate.,"This article summarizes support settings and limitations for migrating VMware vSphere VMs withMigration and modernization. If you're looking for information about assessing VMware vSphere VMs for migration to Azure, review theassessment support matrix. 
Caution This article references CentOS, a Linux distribution that is End Of Life (EOL) status. Please consider your use and planning accordingly. For more information, see the CentOS End Of Life guidance. Caution This article references Windows Ser",2026-04-10T11:20:00.000Z,concept-article,limits-quotas,0.86,True,"A 'support matrix' article for VMware vSphere migration typically enumerates supported/unsupported OS versions, configurations, and explicit limitations and constraints for Azure Migrate. These matrices include product-specific support boundaries and limits that function as expert knowledge not inferable from general training data, fitting the limits-quotas category best among the available options.",unchanged
-https://learn.microsoft.com/en-us/azure/migrate/migrate-support-matrix-vmware?view=migrate,Support matrix for VMware discovery,VMware server discovery support in Azure Migrate and Modernize - Azure Migrate,Check VMware discovery prerequisites and support in Azure Migrate,Learn about Azure Migrate and Modernize discovery and assessment support for servers in a VMware environment.,"This article summarizes prerequisites and support requirements for using the Azure Migrate: Discovery and assessment tool to discover and assess servers in a VMware environment for migration to Azure. To assess servers, first, create an Azure Migrate project. The Azure Migrate: Discovery and assessment tool is automatically added to the project. Then, deploy the Azure Migrate appliance. 
The appliance continuously discovers on-premises or Azure VMware Solution servers and sends configuration and pe",2026-04-16T11:14:00.000Z,concept-article,limits-quotas,0.75,True,"Support matrix/prerequisites page for VMware discovery typically includes detailed version support, environment constraints, and other matrix-style requirements that function as product-specific limits and compatibility rules not known from training.",updated
+https://learn.microsoft.com/en-us/azure/migrate/migrate-support-matrix-vmware?view=migrate,Support matrix for VMware discovery,VMware server discovery support in Azure Migrate and Modernize - Azure Migrate,Check VMware discovery prerequisites and support in Azure Migrate,Learn about Azure Migrate and Modernize discovery and assessment support for servers in a VMware environment.,"This article summarizes prerequisites and support requirements for using the Azure Migrate: Discovery and assessment tool to discover and assess servers in a VMware environment for migration to Azure. To assess servers, first, create an Azure Migrate project. The Azure Migrate: Discovery and assessment tool is automatically added to the project. Then, deploy the Azure Migrate appliance. 
The appliance continuously discovers on-premises or Azure VMware Solution servers and sends configuration and pe",2026-04-16T11:14:00.000Z,concept-article,limits-quotas,0.75,True,"Support matrix/prerequisites page for VMware discovery typically includes detailed version support, environment constraints, and other matrix-style requirements that function as product-specific limits and compatibility rules not known from training.",unchanged https://learn.microsoft.com/en-us/azure/migrate/migrate-support-matrix?view=migrate,Support Matrix,Support Matrix - Azure Migrate,Review Azure Migrate support matrix and limitations,This article provides a summary of support settings and limitations for the Azure Migrate service.,You can use the Azure Migrate service to assess and migrate servers to the Microsoft Azure cloud platform. This article summarizes general support settings and limitations for Azure Migrate scenarios and deployments.,2026-01-06T06:10:00.000Z,concept-article,limits-quotas,0.85,True,"Support matrix summarizing settings and limitations for scenarios; typically includes explicit supported/unsupported combinations and constraints, which are expert limits.",unchanged https://learn.microsoft.com/en-us/azure/migrate/migrate-to-trusted-launch-virtual-machines-with-azure-migrate?view=migrate,Migrate Generation 2 Virtual Machines to Azure Trusted Launch Virtual Machines with Azure Migrate,Migrate Generation 2 Virtual Machines to Azure Trusted Launch Virtual Machines with Azure Migrate - Azure Migrate,Migrate Gen2 VMs to Azure Trusted Launch securely,Use Azure Migrate to migrate on premises Generation 2 Virtual Machines to Azure Trusted Launch Virtual Machines,"Azure Migrate now supports migrating Generation 2 virtual machines to Azure Virtual Machines with Trusted Launch. Trusted Launch uses UEFI-based Secure Boot and a virtual Trusted Platform Module (vTPM) to establish a trusted boot chain. 
This helps ensure that only approved and signed components are loaded during startup, reducing the risk of bootkits, rootkits, and other low-level malware. Trusted Launch is the default security type for supported Generation 2 Virtual Machines and virtual machine",2026-03-18T06:15:00.000Z,how-to,security,0.68,True,"The page is focused on migrating Generation 2 VMs specifically to Azure Trusted Launch VMs, which involves product-specific security configuration (Trusted Launch as default security type, use of UEFI Secure Boot and vTPM). While the summary doesn’t show tables, this scenario typically includes concrete steps and settings unique to Azure Migrate and Trusted Launch security configuration, which qualify as product-specific security guidance rather than generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/migrate/migrate-virtual-machine-extension-reference?view=migrate,Azure Migrate VM Extension for Arc-enabled servers,Azure Migrate Collector virtual machine extension reference - Azure Migrate,Reference for Azure Migrate Collector VM extension settings,"Technical reference for the Azure Migrate Collector VM extension including settings schema, endpoints, and configuration options.",This article provides technical reference information for the Azure Migrate Collector Virtual Machine (VM) extension used with Arc-enabled servers.,2025-11-05T12:23:00.000Z,reference,configuration,0.9,True,"Technical reference with settings schema, endpoints, and configuration options; clearly a configuration parameter reference with defaults and constraints.",unchanged @@ -182,13 +182,13 @@ https://learn.microsoft.com/en-us/azure/migrate/whats-new?view=migrate,What's ne https://learn.microsoft.com/en-us/azure/migration/migrate-compute-from-aws,Compute,Migrate Compute from Amazon Web Services to Azure,Choose Azure compute equivalents for AWS workloads,"Learn how to migrate AWS compute services to Azure, including maintaining feature parity and exploring scenarios 
like VMs, web apps, and serverless functions.","This article describes scenarios that you can use to migrate Amazon Web Services (AWS) compute services to Azure. These cloud services provide the processing power, memory, and storage necessary to run computational tasks. The migration process involves transitioning these services from AWS to Azure, with a focus on maintaining feature parity. The scenarios cover common compute types, such as virtual machines (VMs), web applications, and serverless services. For example, a typical migration scen",2026-03-10T08:00:00.000Z,concept-article,decision-making,0.7,True,"Migration guidance from AWS compute to Azure typically includes concrete service mapping tables (for example, EC2 → Azure VMs, Lambda → Azure Functions, ECS/EKS → AKS/App Service) and scenario-based recommendations on which Azure service to choose for specific AWS usage patterns. This is product- and provider-specific decision guidance that helps select Azure services and approaches for different workloads, which fits the decision-making sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/migration/migrate-databases-from-aws,Databases,Migrate Databases from Amazon Web Services (AWS) to Azure,Select Azure database services for AWS migrations,"Learn how to migrate databases from AWS to Azure. See example scenarios for relational database, NoSQL database, and data warehouse migration.","Data migration is critical when you move from Amazon Web Services (AWS) to Microsoft Azure. You need to transition your databases and make sure that they work similarly in the new environment. The scope of this migration includes various database types, such as relational databases, NoSQL databases, and data warehouses. 
For example, you might have a workload that uses an Amazon Relational Database Service (RDS) for PostgreSQL database that you need to migrate to Azure Database for PostgreSQL.",2026-03-10T08:00:00.000Z,concept-article,decision-making,0.7,True,"The page covers how to migrate various AWS database types (RDS engines, DynamoDB, Redshift, etc.) to Azure equivalents (Azure SQL, PostgreSQL, Cosmos DB, Synapse, etc.) and provides example scenarios. These cross-cloud service mappings and scenario-based recommendations are concrete migration decision guidance, matching the decision-making sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/migration/migrate-databases-from-google-cloud,Migrate databases,Migrate Databases from Google Cloud to Azure,Plan migration of GCP databases to Azure,"Learn how to migrate databases from Google Cloud Platform (GCP) to Azure with concepts, best practices, and step-by-step guides.","Migrating data from Google Cloud Platform (GCP) to Azure is a critical step in ensuring a seamless transition to Azure. This article covers how to migrate relational databases, NoSQL databases, and data warehouses effectively. For example, learn how to migrate a Google Cloud SQL for PostgreSQL database to Azure Database for PostgreSQL.",2026-02-18T12:10:00.000Z,concept-article,decision-making,0.63,True,Covers concrete scenarios like Cloud SQL PostgreSQL to Azure Database for PostgreSQL and likely includes product-specific target selection and migration approach guidance.,unchanged -https://learn.microsoft.com/en-us/azure/migration/migrate-from-aws,Getting started,Migrate Amazon Web Services (AWS) to Microsoft Azure,Choose Azure services when migrating from AWS,Learn about resources that can help you transition your workload components from Amazon Web Services (AWS) to Azure.,"Replatforming an entire workload from Amazon Web Services (AWS) to Azure involves migrating various components, including compute, databases, storage, and applications. 
This section offers service comparisons between both platforms and example scenarios to help integrate components for transition to Azure.",2026-04-15T11:11:00Z,landing-page,decision-making,0.7,True,"The page focuses on migrating workloads from AWS to Azure and explicitly mentions service comparisons and example scenarios. This is migration and technology selection guidance that helps decide which Azure services map to AWS components, fitting the decision-making category. It goes beyond conceptual overview by providing concrete comparison-based guidance for migration choices.",updated +https://learn.microsoft.com/en-us/azure/migration/migrate-from-aws,Getting started,Migrate Amazon Web Services (AWS) to Microsoft Azure,Choose Azure services when migrating from AWS,Learn about resources that can help you transition your workload components from Amazon Web Services (AWS) to Azure.,"Replatforming an entire workload from Amazon Web Services (AWS) to Azure involves migrating various components, including compute, databases, storage, and applications. This section offers service comparisons between both platforms and example scenarios to help integrate components for transition to Azure.",2026-04-15T11:11:00Z,landing-page,decision-making,0.7,True,"The page focuses on migrating workloads from AWS to Azure and explicitly mentions service comparisons and example scenarios. This is migration and technology selection guidance that helps decide which Azure services map to AWS components, fitting the decision-making category. 
It goes beyond conceptual overview by providing concrete comparison-based guidance for migration choices.",unchanged https://learn.microsoft.com/en-us/azure/migration/migrate-from-google-cloud,Getting started,Migrate from Google Cloud Platform (GCP) to Microsoft Azure,Map GCP services to Azure for full workload migration,Learn about resources that can help you transition your workload components from Google Cloud Platform (GCP) to Azure.,"Replatforming an entire workload from Google Cloud Platform (GCP) to Azure involves migrating various components, including compute, databases, storage, and applications. This section offers service comparisons between both platforms and example scenarios to help integrate components for transition to Azure.",2026-03-10T16:12:00Z,landing-page,decision-making,0.65,True,"The page provides service comparisons between GCP and Azure and example scenarios for transitioning components (compute, databases, storage, applications). These comparisons and recommendations for how to replatform components are concrete technology selection and migration decisions, fitting the decision-making sub-skill.",unchanged -https://learn.microsoft.com/en-us/azure/migration/migrate-from-on-premises,Migrate from on-premises,Migrate on-premises to Microsoft Azure,Plan on-premises workload migration to Azure,Learn about resources that can help you transition your workload components from Amazon Web Services (AWS) to Azure.,"Replatforming an entire workload from on-premises to Azure involves migrating various components, including compute, databases, storage, and applications. This section offers service comparisons between both platforms and example scenarios to help integrate components for transition to Azure.",2026-04-15T11:11:00Z,landing-page,decision-making,0.65,True,"The page covers replatforming on-premises workloads to Azure and includes service comparisons and example scenarios for transitioning components. 
This is migration planning and service selection guidance, helping users decide which Azure services to use for different on-premises components, which aligns with the decision-making sub-skill.",updated +https://learn.microsoft.com/en-us/azure/migration/migrate-from-on-premises,Migrate from on-premises,Migrate on-premises to Microsoft Azure,Plan on-premises workload migration to Azure,Learn about resources that can help you transition your workload components from Amazon Web Services (AWS) to Azure.,"Replatforming an entire workload from on-premises to Azure involves migrating various components, including compute, databases, storage, and applications. This section offers service comparisons between both platforms and example scenarios to help integrate components for transition to Azure.",2026-04-15T11:11:00Z,landing-page,decision-making,0.65,True,"The page covers replatforming on-premises workloads to Azure and includes service comparisons and example scenarios for transitioning components. This is migration planning and service selection guidance, helping users decide which Azure services to use for different on-premises components, which aligns with the decision-making sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/migration/migrate-networking-from-aws,Networking,Migrate Networking from Amazon Web Services (AWS) to Azure,Map AWS networking services to Azure networking,"Learn about concepts, how-tos, and best practices for migrating networking services from Amazon Web Services (AWS) to Azure.","The articles listed on this page outline scenarios for how to migrate networking services from Amazon Web Services (AWS) to Azure networking services. Networking services are foundational components of most enterprise workloads. The migration process moves these services while ensuring that they retain the same capabilities. 
Examples of networking services include virtual networks, load balancers, and firewalls that connect and protect critical data for various purposes. Whether it's to support ",2025-11-10T23:18:00.000Z,concept-article,decision-making,0.64,True,"Outlines scenarios for migrating AWS networking (VPC, load balancers, firewalls) to Azure networking services, providing concrete service mapping and choice guidance that is product-specific.",unchanged https://learn.microsoft.com/en-us/azure/migration/migrate-security-from-aws,Security,Migrate security services from Amazon Web Services (AWS),Replatform AWS security services to Microsoft Azure,Learn about replatforming security services from AWS to Microsoft Cloud to support the security requirements of the workload. Discover key similarities and differences between AWS and Microsoft.,This article describes scenarios that you can use to migrate Amazon Web Services (AWS) security services to Microsoft Azure. These cloud services provide foundational security elements necessary for monitoring other services and applications built in the cloud. The migration process involves transitioning services while focusing on maintaining or enhancing functionality. 
These scenarios might cover tasks like migrating your security information and event management (SIEM) solution to Azure.,2025-08-19T05:19:00.000Z,concept-article,decision-making,0.62,True,"Focuses on migrating AWS security services (such as SIEM) to Azure/Microsoft security offerings, with platform-specific similarities/differences and migration scenarios that guide technology selection.",unchanged https://learn.microsoft.com/en-us/azure/migration/migrate-storage-from-aws,Storage,Migrate Storage from Amazon Web Services (AWS) to Azure,Plan Azure storage targets for AWS storage migration,"Learn about concepts, how-tos, and best practices for migrating storage services from Amazon Web Services (AWS) to Azure.","The articles listed on this page outline scenarios for how to migrate storage services from Amazon Web Services (AWS) to Azure storage services. Storage services are foundational components of most enterprise workloads. The migration process moves these services while ensuring that they retain the same capabilities. Examples of storage services include file storage, blob storage, data lakes, virtual machine images, and file shares that store critical data for various purposes. Whether it's to su",2026-03-10T08:00:00.000Z,concept-article,decision-making,0.65,True,"The article aggregates scenarios for migrating AWS storage services (S3, EFS, EBS, etc.) to Azure storage offerings (Blob, Files, Disks, Data Lake). Such provider-specific mapping and scenario guidance help users decide which Azure storage service to use for each AWS source, which is specialized migration decision knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/migration/migrate-to-azure,Migrate to Azure,Migrate Workloads to Azure,,Learn about migration resources that might help you transition workloads from AWS and Google Cloud to Azure.,"This content collection is curated to help workload teams plan and implement their workload migration. 
It covers migrations from cloud platforms, like Amazon Web Services (AWS) and Google Cloud Platform (GCP), to Microsoft Azure. The expected outcome is that, after you complete your migration to Azure, you decommission the workload on the source platform. Important Some migration scenarios are out of scope for this collection. It doesn't cover on-premises to Azure migrations, full datacenter mig",2025-03-31T17:03:00.000Z,concept-article,,0.2,False,"High-level migration content collection; primarily navigation/overview without detailed limits, configs, or troubleshooting.",unchanged +https://learn.microsoft.com/en-us/azure/migration/migrate-to-azure,Migrate to Azure,Migrate Workloads to Azure,,"Learn about migration resources that might help you transition workloads from AWS, Google Cloud Platform, and on-premises to Azure.","The Azure Migration Hub provides prescriptive, opinionated guidance to help workload teams plan and implement their migration to Azure. It covers migrations from on-premises environments and cloud platforms such as Amazon Web Services (AWS) and Google Cloud Platform (GCP). Important This content covers single-workload migrations. It doesn't cover full datacenter migrations, region relocations, or hybrid workloads that run concurrently on multiple clouds. Migration to Azure involves networking, i",2026-04-23T17:12:00.000Z,concept-article,,0.2,False,"High-level migration guidance and resource overview for moving workloads to Azure; no specific limits, configuration tables, error codes, or quantified decision matrices that meet the expert-knowledge criteria.",updated https://learn.microsoft.com/en-us/azure/migration/migrate-workload-from-aws-conclusion,Conclusion,Conclude Your Workload Migration from Amazon Web Services (AWS) to Azure,,Learn about the next steps after you migrate a single workload from AWS to Azure. 
Access training resources and explore example workload migrations.,"This article is part of a series about how to migrate a workload from Amazon Web Services (AWS) to Azure. Workload migration, especially if it's your first migration or if the workload has many dependencies, is an intensive project. Use this series and the right approach to migrate the workload confidently and prepare to handle problems. This series covers the following items that you need for success: Tip Remember that you can rely on experts. When possible, work with colleagues or Microsoft exp",2026-02-16T18:16:00.000Z,concept-article,,0.5,False,Conclusion/next steps article; mostly meta-guidance and links to further resources rather than detailed technical decision criteria.,unchanged https://learn.microsoft.com/en-us/azure/migration/migrate-workload-from-aws-decommission,5. Decommission,Decommission Your Amazon Web Services (AWS) Workload After You Migrate to Azure,Decommission AWS resources after Azure migration,"Learn how to decommission AWS resources after you migrate a single workload from AWS to Azure. Create final backups, delete resources, and establish new baselines.","This article is part of a series about how to migrate a workload from Amazon Web Services (AWS) to Azure. This step is the final step in the workload migration. Proceed after you complete the evaluation phase and confirm that your workload operates as expected in Azure. The goal of this phase is to retire AWS dependencies safely, remove redundant resources, and complete the transition to Azure. Important If you prematurely delete AWS resources, overlook hidden dependencies, or skip final data and",2026-02-16T18:16:00.000Z,concept-article,decision-making,0.6,True,Guides decisions on when and how to retire AWS dependencies and delete resources safely; includes risk-focused recommendations.,unchanged https://learn.microsoft.com/en-us/azure/migration/migrate-workload-from-aws-evaluate,4. 
Evaluate,Evaluate your Workload from Amazon Web Services (AWS) After You Migrate to Azure,Evaluate workload health after AWS to Azure migration,"Learn how to evaluate the migration of a single workload from AWS to Azure. Monitor performance, verify baselines, and confirm data cutover.","This article is part of a series about how to migrate a workload from Amazon Web Services (AWS) to Azure. Congratulations, your workload now serves its users from Azure! The evaluation phase consists of these steps: The goal of this phase is to confirm that your workload in Azure meets the functional, performance, reliability, security, and cost baselines that you establish in the planning phase on AWS. Important Incomplete monitoring, insufficient performance testing, or ineffective cost and sec",2026-02-16T18:16:00.000Z,concept-article,decision-making,0.6,True,Evaluation phase focuses on verifying baselines and monitoring; offers structured criteria to decide if migration meets goals.,unchanged diff --git a/products/azure-migrate/report.md b/products/azure-migrate/report.md index 35b42ea6..38d8b8b0 100644 --- a/products/azure-migrate/report.md +++ b/products/azure-migrate/report.md @@ -1,12 +1,12 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: integrations: 'Code-level integration patterns: using AppCAT CLI, CAST Highlight, GitHub Copilot insights, and Site Recovery REST APIs to assess and automate VMware-to-Azure app migrations.' - configuration: Configuring Azure Migrate projects, appliances, assessments, dependencies, - Arc/agents, networking, and Resource Mover settings for server, SQL, .NET/Java, - and PostgreSQL migrations + configuration: Configuring Azure Migrate appliances, assessments, dependencies, + Arc/agents, networking (Private Link), and Resource Mover settings for discovering, + assessing, and preparing servers/DBs for migration. 
decision-making: Guidance for assessing migration readiness, sizing, costs, and tooling, and for planning, sequencing, and executing workload moves from on-prem, AWS, GCP, and VMware to Azure @@ -31,13 +31,14 @@ category_descriptions: skill_description: Expert knowledge for Azure Migrate development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when - using AppCAT/CAST/Copilot, Site Recovery APIs, Azure Migrate appliances, Arc discovery, + using AppCAT/CAST, Site Recovery REST APIs, Azure Migrate appliance, Arc-based discovery, or Resource Mover, and other Azure Migrate related development tasks. Not for Azure Database Migration service (use azure-database-migration), Azure Site Recovery (use azure-site-recovery), Azure Virtual Machines (use azure-virtual-machines), SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines). -use_when: Use when using AppCAT/CAST/Copilot, Site Recovery APIs, Azure Migrate appliances, - Arc discovery, or Resource Mover, and other Azure Migrate related development tasks. +use_when: Use when using AppCAT/CAST, Site Recovery REST APIs, Azure Migrate appliance, + Arc-based discovery, or Resource Mover, and other Azure Migrate related development + tasks. confusable_not_for: Not for Azure Database Migration service (use azure-database-migration), Azure Site Recovery (use azure-site-recovery), Azure Virtual Machines (use azure-virtual-machines), SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines). 
@@ -54,8 +55,8 @@ confusable_not_for: Not for Azure Database Migration service (use azure-database ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 6 -- **Unchanged**: 200 +- **Updated Pages**: 2 +- **Unchanged**: 204 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-migrate/azure-migrate.csv` @@ -78,18 +79,10 @@ confusable_not_for: Not for Azure Database Migration service (use azure-database ### Updated Pages -- [Upgrade Windows OS](https://learn.microsoft.com/en-us/azure/migrate/how-to-upgrade-windows?view=migrate) - - Updated: 2025-05-08T08:00:00.000Z → 2026-04-14T17:11:00.000Z -- [What is Azure Copilot Migration Agent?](https://learn.microsoft.com/en-us/azure/migrate/azure-copilot-migration-agent?view=migrate) - - Updated: 2026-03-18T11:13:00.000Z → 2026-04-15T08:00:00.000Z -- [Agentless migration architecture](https://learn.microsoft.com/en-us/azure/migrate/concepts-vmware-agentless-migration?view=migrate) - - Updated: 2025-08-08T17:09:00.000Z → 2026-04-14T11:11:00.000Z -- [Support matrix for VMware discovery](https://learn.microsoft.com/en-us/azure/migrate/migrate-support-matrix-vmware?view=migrate) - - Updated: 2025-05-13T05:03:00.000Z → 2026-04-16T11:14:00.000Z -- [Getting started](https://learn.microsoft.com/en-us/azure/migration/migrate-from-aws) - - Updated: 2026-03-10T16:12:00Z → 2026-04-15T11:11:00Z -- [Migrate from on-premises](https://learn.microsoft.com/en-us/azure/migration/migrate-from-on-premises) - - Updated: 2026-03-10T16:12:00Z → 2026-04-15T11:11:00Z +- [Requirements for Private endpoints](https://learn.microsoft.com/en-us/azure/migrate/how-to-use-azure-migrate-with-private-endpoints?view=migrate) + - Updated: 2026-02-10T12:11:00.000Z → 2026-04-19T17:12:00.000Z +- [Migrate to Azure](https://learn.microsoft.com/en-us/azure/migration/migrate-to-azure) + - Updated: 2025-03-31T17:03:00.000Z → 2026-04-23T17:12:00.000Z ## Classified Pages @@ -157,7 +150,7 @@ confusable_not_for: Not for Azure 
Database Migration service (use azure-database | [Prepare Azure accounts using built-in roles](https://learn.microsoft.com/en-us/azure/migrate/prepare-azure-accounts?view=migrate) | security | 0.70 | Quickstart for setting up Azure RBAC for Azure Migrate projects; likely lists specific built-in role names and scopes, which are product-specific security configuration details. | | [Provide server credentials](https://learn.microsoft.com/en-us/azure/migrate/add-server-credentials?view=migrate) | configuration | 0.70 | Details how to add multiple server credentials for inventory and dependency analysis; likely includes credential types, scopes, and constraints specific to the appliance. | | [Replicate using ExpressRoute](https://learn.microsoft.com/en-us/azure/migrate/discover-and-assess-using-private-endpoints?view=migrate) | security | 0.70 | Same as index 20; focuses on Private Link-based secure connectivity with specific network/security configuration steps. | -| [Requirements for Private endpoints](https://learn.microsoft.com/en-us/azure/migrate/how-to-use-azure-migrate-with-private-endpoints?view=migrate) | configuration | 0.70 | Describes configuring Azure Private Link/private endpoints for Azure Migrate; likely includes endpoint/resource configuration details specific to this service. | +| [Requirements for Private endpoints](https://learn.microsoft.com/en-us/azure/migrate/how-to-use-azure-migrate-with-private-endpoints?view=migrate) | configuration | 0.70 | The article describes product-specific network and Private Link configuration for Azure Migrate, including how to set up private endpoints over ExpressRoute or VPN and required settings. This is concrete, service-specific configuration guidance rather than a conceptual overview. 
| | [Scoped discovery of VMware hosted VMs](https://learn.microsoft.com/en-us/azure/migrate/set-discovery-scope?view=migrate) | security | 0.70 | Describes limiting discovery scope via vCenter permissions; includes specific permission assignments and scoping patterns, which are security/permission configurations. | | [Support-Move Extension resource types](https://learn.microsoft.com/en-us/azure/resource-mover/support-matrix-extension-resource-types) | deployment | 0.70 | Summarizes all extension resource types currently supported for moves. This is a capability matrix (what can be moved) that is critical deployment constraint knowledge. | | [Supported Scenarios](https://learn.microsoft.com/en-us/azure/migrate/troubleshoot-assessment-supported-scenarios?view=migrate) | troubleshooting | 0.70 | Supported scenarios for troubleshooting assessments; scenario-based guidance for resolving specific assessment problems. | @@ -294,7 +287,7 @@ confusable_not_for: Not for Azure Database Migration service (use azure-database | [Generate and deploy a platform landing zone](https://learn.microsoft.com/en-us/azure/migrate/platform-landing-zone?view=migrate) | 0.20 | Primarily a how-to/tutorial for generating and deploying a Platform Landing Zone with Azure Migrate. From the summary, it focuses on process (generate, iterate, deploy with VS Code, GitHub Copilot Chat, MCP) and conceptual description of PLZ capabilities (governance, identity, networking). It does not indicate presence of numeric limits, detailed configuration parameter tables, error-code-based troubleshooting, or decision matrices with quantified trade-offs. 
| | [Insights](https://learn.microsoft.com/en-us/azure/migrate/insights-overview?view=migrate) | 0.20 | Page is an overview of Azure Migrate Insights (preview) describing what it does conceptually (security assessment, vulnerabilities, end-of-support software, missing security tools) without detailed configuration parameters, limits, error codes, or decision matrices. It lacks the concrete numeric limits, settings tables, or symptom→solution mappings required for any of the expert-knowledge sub-skill types. | | [Manage projects](https://learn.microsoft.com/en-us/azure/migrate/create-manage-projects?view=migrate) | 0.20 | How-to for creating and managing projects; no indication of numeric limits, config tables, or error-code-based troubleshooting. | -| [Migrate to Azure](https://learn.microsoft.com/en-us/azure/migration/migrate-to-azure) | 0.20 | High-level migration content collection; primarily navigation/overview without detailed limits, configs, or troubleshooting. | +| [Migrate to Azure](https://learn.microsoft.com/en-us/azure/migration/migrate-to-azure) | 0.20 | High-level migration guidance and resource overview for moving workloads to Azure; no specific limits, configuration tables, error codes, or quantified decision matrices that meet the expert-knowledge criteria. | | [Overview](https://learn.microsoft.com/en-us/azure/migrate/concepts-overview?view=migrate) | 0.20 | An overview of Azure Migrate assessments; overviews generally describe types and concepts without detailed limits, configuration tables, or decision matrices. No indication of specific numeric thresholds or product-specific patterns. | | [Overview of Web App Migration and Modernization](https://learn.microsoft.com/en-us/azure/migrate/web-app-migration-modernization?view=migrate) | 0.20 | Conceptual overview of web app migration/modernization; no detailed limits, configs, or troubleshooting content indicated. 
| | [Overview reports](https://learn.microsoft.com/en-us/azure/migrate/reports-overview?view=migrate) | 0.20 | Overview of Azure Migrate reports and their purpose; lacks concrete numeric limits, configuration tables, or detailed decision criteria beyond high-level descriptions. | diff --git a/products/azure-monitor/azure-monitor.csv b/products/azure-monitor/azure-monitor.csv index 4bd8dc54..61bed0bd 100644 --- a/products/azure-monitor/azure-monitor.csv +++ b/products/azure-monitor/azure-monitor.csv @@ -36,7 +36,7 @@ https://learn.microsoft.com/en-us/azure/azure-monitor/agents/diagnostics-extensi https://learn.microsoft.com/en-us/azure/azure-monitor/agents/diagnostics-extension-troubleshooting,Troubleshooting,Troubleshooting Azure Diagnostics extension - Azure Monitor,Troubleshoot Azure Diagnostics extension issues on Azure resources,"Troubleshoot problems when you use Azure Diagnostics in Azure Virtual Machines, Azure Service Fabric, or Azure Cloud Services.","This article describes troubleshooting information that's relevant to using Azure Diagnostics. For more information about Diagnostics, see Azure Diagnostics overview. Important Azure Diagnostics extension will be deprecated on March 31, 2026. After this date, Microsoft will no longer provide support for the Azure Diagnostics extensions. 
To ensure continued support and access to new features, you should migrate to alternative solutions recommended here.",2026-02-24T08:00:00.000Z,troubleshooting-general,troubleshooting,0.86,True,"Explicit troubleshooting article; these typically map specific WAD errors/symptoms to causes and resolutions, including error messages, log locations, and diagnostic steps unique to Azure Diagnostics.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/agents/diagnostics-extension-versions,Version history,Microsoft Azure Diagnostics (WAD) extension configuration schema version history - Azure Monitor,Azure Diagnostics extension configuration schema version history,"Relevant to collecting perf counters in Azure Virtual Machines, Virtual Machine Scale Sets, Service Fabric, and Cloud Services.","This article provides the version history of the Azure Diagnostics extension for Windows (WAD) schema versions shipped as part of the Microsoft Azure SDK. Important Azure Diagnostics extension will be deprecated on March 31, 2026. After this date, Microsoft will no longer provide support for the Azure Diagnostics extensions. 
To ensure continued support and access to new features, you should migrate to alternative solutions recommended here.",2026-02-24T08:00:00.000Z,reference,configuration,0.82,True,"A schema version history page for WAD will list schema versions and their configuration elements/changes, including field names and allowed values, which are detailed configuration references not inferable from general knowledge.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/agents/diagnostics-extension-windows-install,Installation options,Install and configure the Azure Diagnostics extension for Windows (WAD) - Azure Monitor,Install and configure Azure Diagnostics extension for Windows VMs,Learn about installing and configuring the Azure Diagnostics extension for Windows and how the data is stored in an Azure Storage account.,"The Azure Diagnostics extension is an agent in Azure Monitor that collects monitoring data from the guest operating system and workloads of Azure virtual machines and other compute resources. This article provides information on how to install and configure the Azure Diagnostics extension for Windows and describes how the data is stored in an Azure Storage account. Important Azure Diagnostics extension will be deprecated on March 31, 2026. 
After this date, Microsoft will no longer provide support f",2026-02-24T08:00:00.000Z,install-set-up-deploy,configuration,0.78,True,"Installation and configuration article for the WAD extension typically includes extension configuration schema elements, setting names, and parameter values (for storage accounts, data collection options, etc.), which are product-specific configuration details not generally known from training.",unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/agents/gateway,Log Analytics gateway,Connect computers by using the Log Analytics gateway - Azure Monitor,Configure Log Analytics gateway for Azure Monitor connectivity,Connect your devices and Operations Manager-monitored computers by using the Log Analytics gateway to send data to the Azure Automation and Log Analytics service when they don't have internet access.,This article describes how to configure communication with Azure Automation and Azure Monitor by using the Log Analytics gateway when computers that are directly connected or that are monitored by Operations Manager have no internet access. The Log Analytics gateway is an HTTP forward proxy that supports HTTP tunneling using the HTTP CONNECT command. 
This gateway sends data to Azure Automation and a Log Analytics workspace in Azure Monitor on behalf of the computers that can't directly connect t,2026-04-09T08:00:00.000Z,how-to,configuration,0.75,True,"Explains configuring communication via Log Analytics gateway, including proxy behavior, endpoints, and possibly port and URL settings; these are concrete configuration parameters.",unchanged +https://learn.microsoft.com/en-us/azure/azure-monitor/agents/gateway,Log Analytics gateway,Connect computers by using the Log Analytics gateway - Azure Monitor,Configure Log Analytics gateway for Azure Monitor,Connect your devices and Operations Manager-monitored computers by using the Log Analytics gateway to send data to the Azure Automation and Log Analytics service when they don't have internet access.,This article describes how to configure communication with Azure Automation and Azure Monitor by using the Log Analytics gateway when computers that are directly connected or that are monitored by Operations Manager have no internet access. The Log Analytics gateway is an HTTP forward proxy that supports HTTP tunneling using the HTTP CONNECT command. This gateway sends data to Azure Automation and a Log Analytics workspace in Azure Monitor on behalf of the computers that can't directly connect t,2026-04-22T08:00:00.000Z,how-to,configuration,0.7,True,"Gateway setup articles typically include product-specific configuration details such as proxy modes, required ports/URLs, and parameter settings for connecting on-premises or isolated machines to Azure Automation and Log Analytics. 
These are concrete configuration values and connection settings that qualify as expert knowledge beyond generic proxy usage.",updated https://learn.microsoft.com/en-us/azure/azure-monitor/agents/resource-manager-agent,ARM templates,Resource Manager template samples for agents - Azure Monitor,Deploy Azure Monitor and Log Analytics agents with ARM templates,Sample Azure Resource Manager templates to deploy and configure virtual machine agents in Azure Monitor.,"This article includes sample Azure Resource Manager templates to deploy and configure the Azure Monitor agent, the legacy Log Analytics agent, and diagnostic extension for virtual machines in Azure Monitor. Each sample includes a template file and a parameters file with sample values to provide to the template. Note See Azure Resource Manager samples for Azure Monitor for a list of samples that are available and guidance on deploying them in your Azure subscription.",2026-04-07T08:00:00.000Z,sample,deployment,0.8,True,"Provides ARM template samples and parameter files for deploying agents; this is product-specific deployment configuration, including resource types, properties, and constraints.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/agents/troubleshooter-ama-linux,Linux AMA Troubleshooter,Azure Monitor Agent troubleshooter for Linux - Azure Monitor,Use AMA Linux troubleshooter to diagnose agent issues,Detailed instructions on using the Linux agent troubleshooter tool to diagnose potential issues.,"Caution This article references CentOS, a Linux distribution that is End Of Life (EOL) status. Please consider your use and planning accordingly. For more information, see the CentOS End Of Life guidance. The Azure Monitor Agent (AMA) Troubleshooter is designed to help identify issues with the agent and perform general health assessments. 
It can perform various checks to ensure that the agent is properly installed and connected, and can also gather AMA-related logs from the machine being diagnose",2026-04-09T08:00:00.000Z,troubleshooting-general,troubleshooting,0.8,True,"Linux AMA troubleshooter article; expected to include specific diagnostic commands, log paths, and symptom → cause → fix mappings for the agent.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/agents/troubleshooter-ama-windows,Windows AMA Troubleshooter,Azure Monitor Agent troubleshooter for Windows - Azure Monitor,Use AMA Windows troubleshooter to diagnose agent issues,Detailed instructions on using the Windows agent troubleshooter tool to diagnose potential issues.,"The Azure Monitor Agent (AMA) Troubleshooter is designed to help identify issues with the agent and perform general health assessments. It can perform various checks to ensure that the agent is properly installed and connected, and can also gather AMA-related logs from the machine being diagnosed. Note The Windows AMA Troubleshooter is a command line executable that is shipped with the agent for all versions newer than 1.12.0.0.",2026-04-09T08:00:00.000Z,troubleshooting-general,troubleshooting,0.8,True,"Described as a detailed instructions article for a Windows AMA troubleshooter tool; such pages typically list specific checks, log locations, and symptom-to-solution flows unique to the product.",unchanged @@ -125,7 +125,7 @@ https://learn.microsoft.com/en-us/azure/azure-monitor/alerts/test-action-group-e https://learn.microsoft.com/en-us/azure/azure-monitor/alerts/tutorial-log-alert,Create a log search alert,Tutorial - Create a log search alert for an Azure resource - Azure Monitor,,Tutorial to create a log search alert for an Azure resource.,"Azure Monitor alerts proactively notify you when important conditions are found in your monitoring data. Log search alert rules create an alert when a log query returns a particular result. 
For example, receive an alert when a particular event is created on a virtual machine, or send a warning when excessive anonymous requests are made to a storage account. In this tutorial, you learn how to:",2026-04-24T08:00:00.000Z,tutorial,,0.2,False,"Tutorial for creating a log search alert; appears to be basic how-to content rather than reference-style expert knowledge with limits, configs, or error codes.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/alerts/tutorial-metric-alert,Create a metric alert,Tutorial - Create a metric alert for an Azure resource - Azure Monitor,,Learn how to create a metric chart with Azure metrics explorer.,"Azure Monitor alerts proactively notify you when important conditions are found in your monitoring data. Metric alert rules create an alert when a metric value from an Azure resource exceeds a threshold. In this tutorial, you learn how to:",2026-04-24T08:00:00.000Z,tutorial,,0.2,False,"Tutorial for creating a metric alert; typical step-by-step example without evidence of detailed limits, configuration matrices, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/app/agents-view,Agent details,Monitor AI Agents with Application Insights - Azure Monitor,Monitor and troubleshoot AI agents with Application Insights,Learn how to monitor AI agents across multiple sources with Application Insights for performance tracking and troubleshooting.,"TheAgent detailsview in Application Insights provides a unified experience for monitoring AI agents across multiple sources, includingMicrosoft Foundry,Copilot Studio, and third-party agents. This feature consolidates telemetry and diagnostics, enabling customers to track agent performance, analyze token usage and costs, troubleshoot errors, and optimize your agent's behavior. 
Note Azure Monitor Agent Observability is based onOpenTelemetry Generative AI Semantics.",2026-03-05T08:00:00.000Z,overview,troubleshooting,0.65,True,"Described as enabling customers to track agent performance, analyze token usage and costs, and troubleshoot errors; AI-agent monitoring views typically surface specific telemetry fields and diagnostic flows that map symptoms (errors, high token usage) to analysis steps, which is product-specific troubleshooting knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/app/app-center-migration,App Center migration (Preview),Migrate App Center telemetry to Azure Monitor (Preview) - Azure Monitor,,Migrate Visual Studio App Center telemetry to Azure Monitor by using community OpenTelemetry (OTel) software development kits (SDKs) and a telemetry gateway.,"This article provides high-level linked guidance for sending mobile app telemetry toAzure Monitorusing OpenTelemetry (OTel). OTel is a vendor-neutral, open-source standard for collecting and exporting telemetry across languages and platforms, including mobile apps. Important",2026-04-14T22:14:00.000Z,how-to,,0.2,False,"High-level migration overview for moving App Center telemetry to Azure Monitor using OpenTelemetry; no detailed limits, configuration tables, error codes, or decision matrices with quantified trade-offs are evident from the summary.",updated +https://learn.microsoft.com/en-us/azure/azure-monitor/app/app-center-migration,App Center migration (Preview),Migrate App Center telemetry to Azure Monitor (Preview) - Azure Monitor,,Migrate Visual Studio App Center telemetry to Azure Monitor by using community OpenTelemetry (OTel) software development kits (SDKs) and a telemetry gateway.,"This article provides high-level linked guidance for sending mobile app telemetry toAzure Monitorusing OpenTelemetry (OTel). 
OTel is a vendor-neutral, open-source standard for collecting and exporting telemetry across languages and platforms, including mobile apps. Important",2026-04-14T22:14:00.000Z,how-to,,0.2,False,"High-level migration overview for moving App Center telemetry to Azure Monitor using OpenTelemetry; no detailed limits, configuration tables, error codes, or decision matrices with quantified trade-offs are evident from the summary.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/app/app-insights-overview,Application Insights overview,Application Insights OpenTelemetry observability overview - Azure Monitor,,Learn how Azure Monitor Application Insights integrates with OpenTelemetry (OTel) for comprehensive application observability.,"Azure Monitor Application Insights is an application performance monitoring (APM) feature ofAzure Monitor. For supported scenarios, you can use OpenTelemetry (OTel), a vendor-neutral observability framework, to instrument your applications and collect telemetry data, then analyze that telemetry in Application Insights.",2026-03-18T17:43:00.000Z,overview,,0.1,False,"High-level overview of Application Insights with OpenTelemetry; describes concepts and capabilities but does not include numeric limits, configuration parameter tables, error codes, or detailed troubleshooting/decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/app/app-map,Application map,Application map in Azure Application Insights - Azure Monitor,,"Discover how to monitor distributed systems with Application map in Azure Application Insights. Gain insights into dependencies, failures, and performance.",Developers use application maps to represent the logical structure of their distributed applications. A map is produced by identifying the individual application components with theirroleNameornameproperty in recorded telemetry. 
Circles (ornodes) on the map represent the components and directional lines (connectorsoredges) show the HTTP calls fromsourcenodes totargetnodes. Azure Monitor provides theApplication mapfeature to help you quickly implement a map and spot performance bottlenecks or fai,2026-03-06T08:00:00.000Z,how-to,,0.3,False,"Describes what Application Map is and how it visualizes components; no product-specific limits, configuration tables, or detailed troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/app/application-insights-faq,Application Insights FAQ,Application Insights FAQ - Frequently Asked Questions - Azure Monitor,,Official FAQ for Azure Monitor Application Insights. Find answers to questions about using Application Insights with Azure Monitor.,Official FAQ for Azure Monitor Application Insights. Find answers to questions about using Application Insights with Azure Monitor.,2026-04-05T17:03:00Z,faq,,0.1,False,"FAQ pages can contain expert details, but the summary here is generic and does not indicate specific limits, configuration parameters, error codes, or other concrete technical mappings. Without evidence of such specifics, it is treated as general Q&A/overview rather than expert-knowledge content.",unchanged @@ -139,9 +139,9 @@ https://learn.microsoft.com/en-us/azure/azure-monitor/app/dependencies,Dependenc https://learn.microsoft.com/en-us/azure/azure-monitor/app/failures-performance-transactions,"Failures, performance, and transactions","Investigate failures, performance, and transactions with Application Insights - Azure Monitor",Investigate failures and performance with Application Insights views,"Monitor and investigate application performance, failures, and transactions with Application Insights.","Application Insightscollects telemetry from your application to help diagnosing failures and investigating slow transactions. 
It includes four essential tools: Failures view- Tracks errors, exceptions, and faults, offering clear insights for fast problem-solving and enhanced stability. Performance view- Quickly identifies and helps resolve application bottlenecks by displaying response times and operation counts. Search view- Enables users to locate and examine individual telemetry items such as",2026-03-15T08:00:00.000Z,how-to,troubleshooting,0.7,True,"Explains how to use Failures, Performance, and Search views to diagnose errors and slow transactions; these views usually have specific fields, filters, and workflows that map symptoms to causes and resolutions, constituting product-specific troubleshooting guidance.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/app/grafana-dashboards,Grafana dashboards,Dashboards with Grafana in Application Insights - Azure Monitor,Build Grafana dashboards from Application Insights data in Azure,"Create, customize, and share Grafana dashboards for Application Insights directly in the Azure portal.","Dashboards withGrafanainApplication InsightsintegratesAzure Monitor’sGrafana experience directly into the Azure portal. You create and customize Grafana dashboards by using your Application Insights data without running your own Grafana instance or using a separate managed Grafana service. 
Built‑in Grafana controls support a wide range of visualization panels and client‑side transformations across metrics, logs, and traces.",2025-11-07T23:11:00.000Z,how-to,integrations,0.65,True,Covers using Azure Monitor’s built-in Grafana experience with Application Insights; likely includes product-specific configuration options and parameters for this integrated Grafana environment.,unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/app/ip-collection,Geolocation and IP address handling,Application Insights IP address collection - Azure Monitor,Configure IP address handling in Application Insights,Understand how Application Insights handles IP addresses and geolocation.,This article explains how geolocation lookup and IP address handling work inApplication Insights.,2026-03-15T17:05:00.000Z,how-to,security,0.7,True,"IP address collection and geolocation handling for Application Insights is security- and privacy-related behavior that is product-specific. This page typically includes concrete details on how IPs are stored or masked, what configuration options exist, and how to control collection, which aligns with security-focused configuration and compliance requirements.",unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/app/java-get-started-supplemental,Get started (supplemental),Application Insights with containers - Azure Monitor,Configure Application Insights for Java in containers,This article shows you how to set up Application Insights.,"In the following sections, learn how to get Java autoinstrumentation for specific technical environments.",2026-04-08T08:00:00.000Z,how-to,configuration,0.65,True,"Container-focused Java autoinstrumentation setup typically documents environment variables, agent paths, and container-specific configuration flags. 
These are product-specific configuration details rather than generic container or Java concepts.",updated -https://learn.microsoft.com/en-us/azure/azure-monitor/app/java-spring-boot,Spring Boot,Configure Azure Monitor Application Insights for Spring Boot - Azure Monitor,Configure Application Insights for Spring Boot apps,How to configure Azure Monitor Application Insights for Spring Boot applications,"Note WithSpring Boot native image applications, you can usethis project. There are two options for enabling Application Insights Java with Spring Boot: Java Virtual Machine (JVM) argument and programmatically.",2026-04-08T08:00:00.000Z,how-to,configuration,0.7,True,"Spring Boot configuration for Application Insights commonly includes JVM argument names, programmatic configuration APIs, and property keys specific to the Java agent and Spring Boot integration. These are concrete configuration parameters and patterns unique to this product integration.",updated -https://learn.microsoft.com/en-us/azure/azure-monitor/app/java-standalone-config,Configuration (supplemental),Configure Azure Monitor Application Insights for Java - Azure Monitor,Advanced Java configuration for Application Insights,"Learn how to configure Azure Monitor Application Insights for Java, including connection strings, JSON configuration, sampling overrides, JMX metrics, telemetry processors, logging, Micrometer metrics","This article shows you how to configure Azure Monitor Application Insights for Java. For more information, seeGet started with OpenTelemetry.",2026-04-08T08:00:00.000Z,how-to,configuration,0.9,True,"The article explicitly covers connection strings, JSON configuration, sampling overrides, JMX metrics, telemetry processors, logging, Micrometer metrics, and runtime settings. 
This is a catalog of configuration options and parameters (often with defaults and allowed values), which squarely fits the configuration sub-skill.",updated +https://learn.microsoft.com/en-us/azure/azure-monitor/app/java-get-started-supplemental,Get started (supplemental),Application Insights with containers - Azure Monitor,Configure Application Insights for Java in containers,This article shows you how to set up Application Insights.,"In the following sections, learn how to get Java autoinstrumentation for specific technical environments.",2026-04-08T08:00:00.000Z,how-to,configuration,0.65,True,"Container-focused Java autoinstrumentation setup typically documents environment variables, agent paths, and container-specific configuration flags. These are product-specific configuration details rather than generic container or Java concepts.",unchanged +https://learn.microsoft.com/en-us/azure/azure-monitor/app/java-spring-boot,Spring Boot,Configure Azure Monitor Application Insights for Spring Boot - Azure Monitor,Configure Application Insights for Spring Boot apps,How to configure Azure Monitor Application Insights for Spring Boot applications,"Note WithSpring Boot native image applications, you can usethis project. There are two options for enabling Application Insights Java with Spring Boot: Java Virtual Machine (JVM) argument and programmatically.",2026-04-08T08:00:00.000Z,how-to,configuration,0.7,True,"Spring Boot configuration for Application Insights commonly includes JVM argument names, programmatic configuration APIs, and property keys specific to the Java agent and Spring Boot integration. 
These are concrete configuration parameters and patterns unique to this product integration.",unchanged +https://learn.microsoft.com/en-us/azure/azure-monitor/app/java-standalone-config,Configuration (supplemental),Configure Azure Monitor Application Insights for Java - Azure Monitor,Advanced Java configuration for Application Insights,"Learn how to configure Azure Monitor Application Insights for Java, including connection strings, JSON configuration, sampling overrides, JMX metrics, telemetry processors, logging, Micrometer metrics","This article shows you how to configure Azure Monitor Application Insights for Java. For more information, seeGet started with OpenTelemetry.",2026-04-08T08:00:00.000Z,how-to,configuration,0.9,True,"The article explicitly covers connection strings, JSON configuration, sampling overrides, JMX metrics, telemetry processors, logging, Micrometer metrics, and runtime settings. This is a catalog of configuration options and parameters (often with defaults and allowed values), which squarely fits the configuration sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/app/java-standalone-profiler,Profiler,Azure Monitor Application Insights Profiler for Java - Azure Monitor,Configure Application Insights Profiler for Java,How to configure the Azure Monitor Application Insights Profiler for Java,"Note The Java Profiler feature is in preview, starting from 3.4.0. 
The Java Profiler provides a system for:",2026-03-06T08:00:00.000Z,how-to,configuration,0.7,True,Profiler setup for Java with version-specific notes and configuration parameters unique to this feature.,unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/app/javascript-feature-extensions,Enable Click Analytics,Feature extensions for Application Insights JavaScript SDK (Click Analytics) - Azure Monitor,Enable Click Analytics feature extension for JS SDK,Learn how to install and use JavaScript feature extensions (Click Analytics) for the Application Insights JavaScript SDK.,"Application Insights JavaScript SDK feature extensions are extra features that can be added to the Application Insights JavaScript SDK to enhance its functionality. In this article, we cover the Click Analytics plug-in, which automatically tracks click events on webpages and usesdata-*attributes or customized tags on HTML elements to populate event telemetry.",2026-03-06T08:00:00.000Z,how-to,integrations,0.75,True,Feature extension configuration using data-* attributes and custom tags; product-specific integration behavior.,unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/app/javascript-framework-extensions,"Enable React, React Native, Angular",Enable a framework extension for Application Insights JavaScript SDK - Azure Monitor,"Integrate Application Insights JS SDK with React, Angular, and React Native",Learn how to install and use JavaScript framework extensions for the Application Insights JavaScript SDK.,"In addition to the core SDK, there are also plugins available for specific frameworks, such as the React plugin, the React Native plugin, and the Angular plugin. 
These plugins provide extra functionality and integration with the specific framework.",2026-03-06T08:00:00.000Z,how-to,integrations,0.75,True,Framework-specific plugins and usage patterns; includes integration code and options unique to these frameworks.,unchanged @@ -150,11 +150,11 @@ https://learn.microsoft.com/en-us/azure/azure-monitor/app/javascript-sdk-configu https://learn.microsoft.com/en-us/azure/azure-monitor/app/live-stream,Live metric stream,Live Metrics: Real-Time Monitoring in Application Insights - Azure Monitor,,"Monitor your web app in real time with Azure Application Insights. Diagnose issues instantly using live metrics, custom filters, and failure traces.","Use live metrics fromApplication Insightsto monitor web applications. Select and filter metrics and performance counters to watch in real time and inspect stack traces from sample failed requests and exceptions. With live metrics, you can:",2026-03-06T08:00:00.000Z,how-to,,0.3,False,"Explains Live Metrics conceptually (real-time monitoring, filters, traces) without exposing detailed config tables, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/app/managed-workspaces,Managed workspaces,Application Insights managed workspaces - Azure Monitor,Use Application Insights managed Log Analytics workspaces,This article explains automatically created managed workspaces,"Azure MonitorApplication Insightsrequires a connection to aLog Analyticsworkspace to store and analyze telemetry data. 
To simplify deployment, Application Insights automatically creates a managed workspace when you don't specify one during resource creation.",2025-08-28T22:12:00.000Z,how-to,configuration,0.6,True,"Explains behavior of automatically created managed workspaces and how AI resources bind to them, which is product-specific configuration behavior not obvious from general knowledge.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/app/metrics-overview,Metrics,Metrics in Application Insights - Azure Monitor - Azure Monitor,,This article explains the difference between log-based and standard/preaggregated metrics in Application Insights.,"Application Insights supports three different types of metrics: standard (preaggregated), log-based, and custom metrics. Each one brings a unique value in monitoring application health, diagnostics, and analytics. Developers who are instrumenting applications can decide which type of metric is best suited to a particular scenario. Decisions are based on the size of the application, expected volume of telemetry, and business requirements for metrics precision and alerting. 
This article explains t",2025-11-07T06:04:00.000Z,how-to,,0.3,False,"Metrics overview describing types (standard, log-based, custom) and when to use them; sounds conceptual without concrete limits, configs, or decision matrices with thresholds.",unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/app/migrate-to-opentelemetry,Migrate to OpenTelemetry,Migrate Application Insights Classic API Software Development Kits (SDKs) to Azure Monitor OpenTelemetry - Azure Monitor,Migrate from Application Insights SDKs to OpenTelemetry,"This article provides guidance on how to migrate .NET, Java, Node.js, and Python applications from the Application Insights Classic API SDKs to Azure Monitor OpenTelemetry.","This guide provides step-by-step instructions to migrate applications from using Application Insights SDKs (Classic API) to Azure Monitor OpenTelemetry. You get a similar experience with Azure Monitor OpenTelemetry instrumentation as with the Application Insights SDKs. For more information and a feature-by-feature comparison, seerelease state of features. Tip To review archived .NET or Node.js classic API SDK information, seeAPI 2.x. Use Application Insights .NET software development kit (SDK) 3",2026-04-08T08:00:00.000Z,how-to,decision-making,0.65,True,"Migration guidance from classic SDKs to Azure Monitor OpenTelemetry usually includes feature-by-feature comparisons, trade-offs, and recommendations on when/how to move. 
That aligns with decision-making: helping choose migration paths and understand capability differences, beyond simple how-to steps.",updated +https://learn.microsoft.com/en-us/azure/azure-monitor/app/migrate-to-opentelemetry,Migrate to OpenTelemetry,Migrate Application Insights Classic API Software Development Kits (SDKs) to Azure Monitor OpenTelemetry - Azure Monitor,Migrate from Application Insights SDKs to OpenTelemetry,"This article provides guidance on how to migrate .NET, Java, Node.js, and Python applications from the Application Insights Classic API SDKs to Azure Monitor OpenTelemetry.","This guide provides step-by-step instructions to migrate applications from using Application Insights SDKs (Classic API) to Azure Monitor OpenTelemetry. You get a similar experience with Azure Monitor OpenTelemetry instrumentation as with the Application Insights SDKs. For more information and a feature-by-feature comparison, seerelease state of features. Tip To review archived .NET or Node.js classic API SDK information, seeAPI 2.x. Use Application Insights .NET software development kit (SDK) 3",2026-04-08T08:00:00.000Z,how-to,decision-making,0.65,True,"Migration guidance from classic SDKs to Azure Monitor OpenTelemetry usually includes feature-by-feature comparisons, trade-offs, and recommendations on when/how to move. That aligns with decision-making: helping choose migration paths and understand capability differences, beyond simple how-to steps.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/app/opentelemetry-add-modify,Add and modify OpenTelemetry,Add and Modify OpenTelemetry in Application Insights - Azure Monitor,Customize OpenTelemetry integration with Application Insights,"Learn how to add and modify OpenTelemetry (OTel) in Application Insights. 
Includes .NET, Java, Node.js, and Python applications, custom attributes, telemetry processors, and log and trace modification","This guide provides instructions on integrating and customizing OpenTelemetry (OTel) instrumentation withinAzure Monitor Application Insights. To learn more about OpenTelemetry concepts, see theOpenTelemetry overview. Note For Azure Function Apps, seeUse OpenTelemetry with Azure Functions.",2026-04-11T06:04:00.000Z,how-to,integrations,0.7,True,"Page focuses on adding and modifying OpenTelemetry instrumentation for Application Insights across .NET, Java, Node.js, and Python. It likely includes product-specific SDK/API usage, configuration parameters, and patterns for custom attributes, telemetry processors, and log/trace modifications, which fits the integrations & coding patterns category.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/app/opentelemetry-collect-detect,Autocollect OpenTelemetry,Data Collection and Resource Detectors for Azure Monitor OpenTelemetry - Azure Monitor,,"Learn how Azure Monitor OpenTelemetry automatically collects telemetry and how resource detectors enrich telemetry with consistent service, host, and cloud metadata in Application Insights.","This article explains how Azure Monitor OpenTelemetry collects telemetry automatically and how resource detectors enrich telemetry with consistent metadata. You learn what signals are collected by default and how resource detectors populate attributes like service identity and environment details so your Application Insights data is easier to filter, correlate, and troubleshoot across .NET, Java, Node.js, and Python applications. 
To learn more about OpenTelemetry concepts, see theOpenTelemetry o",2026-04-11T06:04:00.000Z,how-to,,0.3,False,"Explains what telemetry is collected and how resource detectors work, but is primarily conceptual/how-it-works guidance without detailed config tables, limits, or error-code-based troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/app/opentelemetry-configuration,Configure OpenTelemetry,Configuring OpenTelemetry in Application Insights - Azure Monitor,Configure OpenTelemetry settings in Application Insights,"Learn how to configure OpenTelemetry (OTel) settings in Application Insights for .NET, Java, Node.js, and Python applications, including connection strings and sampling options.","This guide explains how to configure OpenTelemetry (OTel) inAzure Monitor Application Insightsusing the Azure Monitor OpenTelemetry distro. Proper configuration ensures consistent telemetry data collection across .NET, Java, Node.js, and Python applications, allowing for more reliable monitoring and diagnostics. Note For Azure Function Apps, seeUse OpenTelemetry with Azure Functions.",2026-04-17T22:09:00.000Z,how-to,configuration,0.85,True,"Explicitly about configuring OTel in Application Insights, likely listing settings such as connection string keys, sampling configuration options, environment variables, and their allowed values. 
This matches configuration: product-specific parameters and options, not just conceptual guidance.",updated -https://learn.microsoft.com/en-us/azure/azure-monitor/app/opentelemetry-enable,Get started with OpenTelemetry,Enable OpenTelemetry in Application Insights - Azure Monitor,Enable Azure Monitor OpenTelemetry for Application Insights,"Learn how to enable OpenTelemetry (OTel) data collection in Application Insights for .NET, Java, Node.js, and Python applications using the Azure Monitor OpenTelemetry Distro.","This article describes how to enable and configure OpenTelemetry-based data collection withinAzure Monitor Application Insights. The Azure Monitor OpenTelemetry Distro: For more information about the advantages of using the Azure Monitor OpenTelemetry Distro, seeWhy should I use the Azure Monitor OpenTelemetry Distro. To learn more about collecting data using OpenTelemetry, check outCollect OpenTelemetry (OTel) for Application Insights experiencesor theOpenTelemetry FAQ. OpenTelemetry offerings ",2026-04-08T08:00:00.000Z,how-to,configuration,0.7,True,"Enablement article for OTel in Application Insights typically includes concrete, product-specific configuration steps such as required environment variables, connection string formats, and SDK/distro flags for .NET, Java, Node.js, and Python. These are configuration parameters unique to Azure Monitor OpenTelemetry rather than generic OTel concepts.",updated +https://learn.microsoft.com/en-us/azure/azure-monitor/app/opentelemetry-configuration,Configure OpenTelemetry,Configuring OpenTelemetry in Application Insights - Azure Monitor,Configure OpenTelemetry settings in Application Insights,"Learn how to configure OpenTelemetry (OTel) settings in Application Insights for .NET, Java, Node.js, and Python applications, including connection strings and sampling options.","This guide explains how to configure OpenTelemetry (OTel) inAzure Monitor Application Insightsusing the Azure Monitor OpenTelemetry distro. 
Proper configuration ensures consistent telemetry data collection across .NET, Java, Node.js, and Python applications, allowing for more reliable monitoring and diagnostics. Note For Azure Function Apps, seeUse OpenTelemetry with Azure Functions.",2026-04-17T22:09:00.000Z,how-to,configuration,0.85,True,"Explicitly about configuring OTel in Application Insights, likely listing settings such as connection string keys, sampling configuration options, environment variables, and their allowed values. This matches configuration: product-specific parameters and options, not just conceptual guidance.",unchanged +https://learn.microsoft.com/en-us/azure/azure-monitor/app/opentelemetry-enable,Get started with OpenTelemetry,Enable OpenTelemetry in Application Insights - Azure Monitor,Enable Azure Monitor OpenTelemetry for Application Insights,"Learn how to enable OpenTelemetry (OTel) data collection in Application Insights for .NET, Java, Node.js, and Python applications using the Azure Monitor OpenTelemetry Distro.","This article describes how to enable and configure OpenTelemetry-based data collection withinAzure Monitor Application Insights. The Azure Monitor OpenTelemetry Distro: For more information about the advantages of using the Azure Monitor OpenTelemetry Distro, seeWhy should I use the Azure Monitor OpenTelemetry Distro. To learn more about collecting data using OpenTelemetry, check outCollect OpenTelemetry (OTel) for Application Insights experiencesor theOpenTelemetry FAQ. OpenTelemetry offerings ",2026-04-08T08:00:00.000Z,how-to,configuration,0.7,True,"Enablement article for OTel in Application Insights typically includes concrete, product-specific configuration steps such as required environment variables, connection string formats, and SDK/distro flags for .NET, Java, Node.js, and Python. 
These are configuration parameters unique to Azure Monitor OpenTelemetry rather than generic OTel concepts.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/app/opentelemetry-filter,Filter OpenTelemetry,Filtering OpenTelemetry in Application Insights - Azure Monitor,Filter OpenTelemetry telemetry in Application Insights,"Learn how to filter OpenTelemetry (OTel) data in Application Insights for .NET, Java, Node.js, and Python applications. Exclude unwanted telemetry and protect sensitive information.","Use this guide to filter OpenTelemetry (OTel) data inAzure Monitor Application Insights. Filtering helps you exclude unnecessary telemetry and prevent collection of sensitive data to optimize performance and support compliance. Reasons to filter out telemetry include: To learn more about OpenTelemetry concepts, review theOpenTelemetry overview. Note For Azure Function Apps, seeUse OpenTelemetry with Azure Functions.",2026-04-01T06:03:00.000Z,how-to,configuration,0.65,True,"Focuses on filtering and excluding telemetry and sensitive data; likely documents specific filter configuration options, parameter names, and patterns for Azure Monitor OpenTelemetry rather than generic advice.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/app/opentelemetry-sampling,OpenTelemetry sampling,Sampling in Azure Application Insights with OpenTelemetry - Azure Monitor,Configure OpenTelemetry sampling for Azure Application Insights,"Learn how OpenTelemetry sampling in Application Insights reduces telemetry volume, controls costs, and preserves key diagnostic data.","Application Insightsincludes a custom sampler and integrates withOpenTelemetryto reduce telemetry volume, lower costs, and retain the diagnostic data you care about. 
Important For information on sampling when using the Application Insights Classic API Software Development Kits (SDKs), seeClassic API Sampling.",2026-02-17T13:02:00.000Z,how-to,configuration,0.7,True,"Sampling article will include sampler types, configuration parameters, and possibly default/allowed values specific to Application Insights’ OpenTelemetry integration.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/app/overview-dashboard,Application dashboard,Application Insights Dashboard Setup and Customization - Azure Monitor,Customize the Application Insights overview dashboard,Monitor your application's health and performance with the Application Insights Overview dashboard. Learn how to customize tiles and create actionable insights.,"Application Insights provides a summary in the overview pane to allow at-a-glance assessment of your application's health and performance. A time range selection is available at the top of the interface. Each tile can be selected to navigate to the corresponding experience. As an example, selecting theFailed requeststile opens theFailuresexperience.",2026-03-06T08:00:00.000Z,how-to,configuration,0.6,True,Explains how to configure tiles and time ranges for the dashboard; product-specific UI configuration options.,unchanged @@ -201,7 +201,7 @@ https://learn.microsoft.com/en-us/azure/azure-monitor/containers/container-insig https://learn.microsoft.com/en-us/azure/azure-monitor/containers/container-insights-throttling,Throttling,Configure throttling for Container Insights - Azure Monitor,Configure throttling parameters and monitor log loss in Container Insights,Configure throttling parameters and monitor for log loss in Container Insights,"Note Container Insights logs are only throttled whenContainer Network Logsare being collected. If you have not enabled the collection of Container Network Logs, throttling is not enabled on your cluster. 
Azure Monitor - Container Insights allows customers to collect logs generated in their Azure Kubernetes Service (AKS) cluster. Depending on workload and logging configuration, the volume of logs generated can be substantial, leading to throttling and log loss. This article discusses the default v",2025-06-16T22:02:00.000Z,how-to,configuration,0.84,True,"Describes default throttling values and configurable parameters for Container Network Logs, including product-specific settings and behaviors; may also include numeric defaults but primarily configuration-focused.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/containers/container-insights-transformations,Data transformations,Advanced filtering and transformations for Kubernetes logs in Azure Monitor - Azure Monitor,Configure DCR transformations for Kubernetes container logs,Describes how to transform data using a DCR transformation with container logs from your Kubernetes cluster using Azure Monitor.,"This article describes how to implement data transformations with container log data from your Kubernetes cluster. Transformations in Azure Monitor allow you to modify or filter data before it's ingested in your Log Analytics workspace. They allow you to perform such actions as filtering out data collected from your cluster to save costs or processing incoming data to assist in your data queries. 
Tip While transformations are a powerful and reliable feature, they should be used only after using ot",2025-12-19T05:32:00.000Z,how-to,configuration,0.78,True,Describes how to implement DCR transformations for container logs with specific transformation configuration patterns and fields unique to Azure Monitor.,unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/containers/container-insights-transition-solution,Container monitoring solution,Transition from the Container Monitoring Solution to using Container Insights - Azure Monitor,Transition from Container Monitoring Solution to Container Insights,Learn how to migrate from using the legacy solution to monitoring your containers using Container Insights,"With both the underlying platform and agent deprecations, on August 31, 2024 the Container Monitoring Solution will be retired. If you use the Container Monitoring Solution to ingest data to your Log Analytics workspace, make sure to transition to using Container Insights prior to that date. 
Warning The Container Monitoring Solution is no longer supported as of August 31, 2024.",2025-04-23T08:00:00.000Z,how-to,decision-making,0.7,True,Migration-focused guidance with timelines and steps to move from a legacy solution to Container Insights; helps decide and plan migration paths.,unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/containers/container-insights-troubleshoot,Container insights,Troubleshoot collection of container logs in Azure Monitor - Azure Monitor,Troubleshoot Container Insights container log collection issues,This article describes how you can troubleshoot and resolve issues with Container insights.,This article discusses some common issues and troubleshooting steps when using Container insights to monitor your Kubernetes cluster.,2025-01-29T08:00:00.000Z,troubleshooting-general,troubleshooting,0.86,True,"Provides symptom-based troubleshooting for Container Insights with product-specific checks, commands, and causes/solutions.",unchanged +https://learn.microsoft.com/en-us/azure/azure-monitor/containers/container-insights-troubleshoot,Container insights,Troubleshoot collection of container logs in Azure Monitor - Azure Monitor,Diagnose and fix Azure Monitor container log collection,This article describes how you can troubleshoot and resolve issues with Container insights.,Use this article to troubleshoot common issues when you use Container insights to monitor your Kubernetes cluster.,2026-04-24T22:11:00.000Z,troubleshooting-general,troubleshooting,0.86,True,"The page is explicitly a troubleshooting guide for Container insights log collection, organized around specific issues and their resolutions. 
It typically includes product-specific error symptoms, causes, and remediation steps for Azure Monitor and Kubernetes clusters, which qualify as expert troubleshooting knowledge beyond generic debugging advice.",updated https://learn.microsoft.com/en-us/azure/azure-monitor/containers/control-plane-transformations,Filter control plane logs,Filter AKS control plane logs using workspace transformations in Azure Monitor - Azure Monitor,Configure workspace transformations for AKS control plane logs,Step by step guide to filter AKS control plane logs using workspace transformations in Azure Monitor.,This tutorial walks through configuration of a sample transformation in a workspace data collection rule (DCR) using the Azure portal. Transformations in Azure Monitor allow you to filter or modify incoming data before it's sent to its destination. Workspace transformations provide support for ingestion-time transformations for workflows that don't yet use DCRs.,2025-08-04T17:16:00.000Z,how-to,configuration,0.7,True,Step-by-step configuration of workspace DCR transformations with concrete transformation definitions and fields for AKS control plane logs; includes product-specific parameters and structure.,unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/containers/cost-effective-alerting,Cost-effective Alerts,Cost effective alerting strategies for AKS clusters in Azure Monitor - Azure Monitor,Design cost-effective alerting for AKS in Azure Monitor,Describes different strategies for cost effective alerting from AKS clusters in Azure Monitor.,"Alerting is a critical part of monitoring workloads on Azure Kubernetes Service (AKS). Advanced alerting requires Analytics-tier logs in your Log Analytics workspace, but this can be cost-prohibitive for high-volume environments or certain types of logs such as audit logs. 
You can significantly reduce your data ingestion costs by converting tables holding container logs to Basic logs and leveraging other cost effective strategies of the Log Analytics platform. Azure Monitor provides options for even",2025-05-29T22:03:00.000Z,concept-article,best-practices,0.8,True,"Gives specific strategies (e.g., using Basic vs Analytics logs, table conversions) to reduce alerting costs with AKS; these are product-specific best practices beyond generic alerting advice.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/containers/integrate-keda,Scale using KEDA,Integrate KEDA with your Azure Kubernetes Service cluster - Azure Monitor,Integrate KEDA autoscaling with Prometheus metrics from Azure Monitor,How to integrate KEDA with your Azure Kubernetes Service cluster.,"KEDA is a Kubernetes-based Event Driven Autoscaler. KEDA lets you drive the scaling of any container in Kubernetes based on the load to be processed, by querying metrics from systems such as Prometheus. Integrate KEDA with your Azure Kubernetes Service (AKS) or Azure Arc-enabled Kubernetes cluster to scale your workloads based on Prometheus metrics from your Azure Monitor workspace. 
To integrate KEDA into your Azure Kubernetes Service, you have to deploy and configure a workload identity or pod ",2025-03-20T22:01:00.000Z,how-to,integrations,0.8,True,"Shows how to configure KEDA to scale based on Prometheus metrics from Azure Monitor; includes scaler configuration and identity setup, which are integration-specific.",unchanged @@ -211,7 +211,7 @@ https://learn.microsoft.com/en-us/azure/azure-monitor/containers/kubernetes-data https://learn.microsoft.com/en-us/azure/azure-monitor/containers/kubernetes-data-collection-configure,Overview,Filter and customize data collection for Kubernetes clusters - Azure Monitor,Customize and filter Azure Monitor data collection for Kubernetes,Describes the different types of customization you can perform for data collection from your Kubernetes clusters and the methods required to implement each.,"The data that Azure Monitor initially collects from your Kubernetes clusters depends on the options you chose when you enabled collection of logs and metrics. After this initial onboarding, you can further customize the data collection either to add data that you require to properly monitor your Kubernetes environment, or to filter out data that you don't need to reduce your monitoring costs. 
You can further optimize costs with advanced filtering and sending different types of data to different s",2025-12-19T05:32:00.000Z,how-to,configuration,0.85,True,"Describes customization of data collection, advanced filtering, and routing to different sinks; expected to include data collection rule parameters and configuration options.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/containers/kubernetes-metric-alerts,Metric alert rules,Recommended alert rules for Kubernetes clusters - Azure Monitor,Enable recommended metric alert rules for Kubernetes clusters,Describes how to enable recommended metric alert rules for a Kubernetes cluster in Azure Monitor.,Alerts in Azure Monitor proactively identify issues related to the health and performance of your Azure resources. This article describes how to enable and edit a set of recommended metric alert rules that are predefined for your Kubernetes clusters.,2025-08-25T08:00:00.000Z,how-to,configuration,0.68,True,Describes predefined metric alert rules and how to enable/edit them; includes specific rule names/metrics and thresholds tied to Azure Monitor’s implementation.,unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/containers/kubernetes-monitoring-disable,Disable monitoring,Disable monitoring of your Kubernetes cluster - Azure Monitor,,Describes how to disable scraping of Prometheus metrics and collection of logs from your Kubernetes cluster.,Use the following methods to disable collection of Prometheus metrics or log collection from your Kubernetes cluster.,2025-09-16T22:14:00.000Z,how-to,,0.35,False,Describes methods to disable metrics/log collection; likely simple procedural steps without extensive config matrices or limits.,unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/containers/kubernetes-monitoring-enable,AKS clusters,Enable Monitoring for Azure Kubernetes Service (AKS) Clusters - Azure Monitor,Configure Azure Monitor for AKS cluster monitoring,Learn how to 
enable monitoring for Azure Kubernetes Service (AKS) cluster with Azure Monitor.","As described in Kubernetes monitoring in Azure Monitor, multiple features of Azure Monitor work together to provide complete monitoring of your Azure Kubernetes Service (AKS) clusters. This article describes how to enable the following features for AKS clusters:",2026-04-14T06:09:00.000Z,how-to,configuration,0.68,True,"The page describes how to enable multiple Azure Monitor features for AKS, which typically involves product-specific configuration steps and settings (e.g., enabling insights, selecting workspaces, toggling features). While it’s a how-to article, enabling monitoring for AKS in Azure Monitor requires concrete configuration choices unique to this integration, so it best fits the configuration sub-skill. It does not focus on limits, troubleshooting, or architecture patterns.",updated +https://learn.microsoft.com/en-us/azure/azure-monitor/containers/kubernetes-monitoring-enable,AKS clusters,Enable Monitoring for Azure Kubernetes Service (AKS) Clusters - Azure Monitor,Configure Azure Monitor for AKS cluster monitoring,Learn how to enable monitoring for Azure Kubernetes Service (AKS) cluster with Azure Monitor.,"As described in Kubernetes monitoring in Azure Monitor, multiple features of Azure Monitor work together to provide complete monitoring of your Azure Kubernetes Service (AKS) clusters. This article describes how to enable the following features for AKS clusters:",2026-04-14T06:09:00.000Z,how-to,configuration,0.68,True,"The page describes how to enable multiple Azure Monitor features for AKS, which typically involves product-specific configuration steps and settings (e.g., enabling insights, selecting workspaces, toggling features). While it’s a how-to article, enabling monitoring for AKS in Azure Monitor requires concrete configuration choices unique to this integration, so it best fits the configuration sub-skill. 
It does not focus on limits, troubleshooting, or architecture patterns.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/containers/kubernetes-monitoring-enable-arc,Arc-enabled clusters,Enable monitoring for Arc-enabled Kubernetes clusters - Azure Monitor,,Learn how to enable monitoring for Arc-enabled Kubernetes clusters with Azure Monitor.,"As described in Kubernetes monitoring in Azure Monitor, multiple features of Azure Monitor work together to provide complete monitoring of your Kubernetes clusters. This article describes how to enable the following features for Arc-enabled clusters running in other clouds and on-premises:",2025-10-01T05:08:00.000Z,how-to,,0.3,False,Enablement steps for Arc-enabled clusters; mostly tutorial-style instructions rather than deep configuration catalogs or troubleshooting mappings.,unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/containers/kubernetes-monitoring-firewall,Firewall requirements,Network Firewall Requirements for Monitoring Kubernetes Clusters - Azure Monitor,Configure firewall and proxy for Kubernetes monitoring agents,This article shows proxy and firewall configuration information required for the containerized agent to communicate with Managed Prometheus and Container insights.,"The tables in the following sections specify proxy and firewall configuration information for the Azure public cloud, Azure operated by 21Vianet cloud, and Azure Government cloud. This information is required for the containerized agent to communicate with managed service for Prometheus and Container insights. 
All network traffic from the agent is outbound to Azure Monitor.",2026-01-28T23:13:00.000Z,concept-article,configuration,0.92,True,"Contains tables of required endpoints, ports, and URLs for different Azure clouds; these network requirements are detailed configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/containers/kubernetes-monitoring-overview,Overview,Kubernetes monitoring in Azure Monitor - Azure Monitor,,"Describes Container Insights and Managed Prometheus in Azure Monitor, which work together to monitor your Kubernetes clusters.","Azure provides a complete set of services based on Azure Monitor for monitoring the health and performance of different layers of your Kubernetes infrastructure and the applications that depend on it. These services work in conjunction with each other to provide a complete monitoring solution for your clusters running in Azure Kubernetes Service (AKS) or other clouds such as AWS and GCP. You may have an existing investment in cloud native technologies endorsed by the Cloud Native Computing Foundation, o",2025-09-16T22:14:00.000Z,concept-article,,0.2,False,"High-level overview of Kubernetes monitoring options (Container Insights, Managed Prometheus) without detailed limits, configs, or decision matrices.",unchanged @@ -219,7 +219,7 @@ https://learn.microsoft.com/en-us/azure/azure-monitor/containers/kubernetes-moni https://learn.microsoft.com/en-us/azure/azure-monitor/containers/kubernetes-open-protocol,Monitor AKS applications with OpenTelemetry Protocol (Preview),Monitor AKS applications with OpenTelemetry Protocol (OTLP) Preview - Azure Monitor,,Enable application monitoring for Azure Kubernetes Service (AKS) namespaces and deployments and send OpenTelemetry Protocol (OTLP) telemetry to Application Insights using Azure Monitor.,"OpenTelemetry provides a standardized way to emit traces, logs, and metrics. 
Azure Monitor adds Preview support for monitoring applications that run on Azure Kubernetes Service (AKS) by using the OpenTelemetry Protocol (OTLP) for instrumentation and data collection. Important This feature is a preview. Preview features are provided without a service-level agreement and aren't recommended for production workloads. For more information, see Supplemental Terms of Use for Microsoft Azure Previews.",2026-04-08T22:12:00.000Z,how-to,,0.3,False,"Appears to be a how-to/tutorial for enabling OTLP monitoring on AKS rather than a reference of limits, configs, or decision matrices. No clear indication of detailed configuration tables, error mappings, or tier comparisons.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/containers/kubernetes-workbooks,Workbooks,Kubernetes workbooks in Azure Monitor - Azure Monitor,,This article describes workbooks in Azure Monitor available to analyze monitoring data for Kubernetes clusters.,A variety of Azure Monitor workbooks are available to analyze the data collected by your Kubernetes clusters. This article describes the different workbooks that are available and how to access them. 
Note Workbooks may not function as expected if you've filtered data collected for your cluster.,2025-09-16T22:14:00.000Z,concept-article,,0.35,False,Overview of available workbooks and how to access them; likely descriptive rather than configuration-heavy or decision-focused.,unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/containers/monitor-kubernetes,Deployment guidance,Monitor Kubernetes clusters using Azure Monitor and cloud native tools - Azure Monitor,Apply Azure Monitor best practices for Kubernetes layers,Describes how to monitor the health and performance of the different layers of your Kubernetes environment using Azure Monitor and cloud native services in Azure.,"Kubernetes monitoring in Azure Monitor describes the Azure Monitor services used to provide complete monitoring of your Kubernetes environment and the workloads that run on it. This article provides best practices for how to leverage these services to monitor the different layers of your Kubernetes environment based on the typical roles that manage them. Following is an illustration of a common model of a typical Kubernetes environment, starting from the infrastructure layer up through applicatio",2025-09-16T22:14:00.000Z,best-practice,best-practices,0.7,True,"Explicitly described as best practices for monitoring different Kubernetes layers with Azure Monitor and cloud-native tools; likely includes concrete, product-specific recommendations and patterns.",unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/containers/opentelemetry-ingest-agent,Ingest OTLP signals with AMA (Preview),Ingest OTLP Data into Azure Monitor with AMA (Preview) - Azure Monitor,Configure Azure Monitor Agent for OTLP ingestion,"Learn how to send OpenTelemetry Protocol (OTLP) telemetry data to Azure Monitor using Azure Monitor Agent on VMs, Scale Sets, and Arc-enabled servers.",Azure Monitor now supports native ingestion of OpenTelemetry Protocol (OTLP) signals. 
You can send telemetry data directly from OpenTelemetry-instrumented applications to Azure Monitor. Important,2026-04-15T22:11:00.000Z,how-to,configuration,0.7,True,"The page describes how to ingest OpenTelemetry Protocol data into Azure Monitor using Azure Monitor Agent on VMs, scale sets, and Arc-enabled servers. This scenario typically includes product-specific configuration details such as AMA data collection rules, endpoint/port settings, and OTLP-specific parameters that are unique to Azure Monitor’s OTLP ingestion path, which qualify as expert configuration knowledge beyond generic OpenTelemetry usage.",updated +https://learn.microsoft.com/en-us/azure/azure-monitor/containers/opentelemetry-ingest-agent,Ingest OTLP signals with AMA (Preview),Ingest OTLP Data into Azure Monitor with AMA (Preview) - Azure Monitor,Configure Azure Monitor Agent for OTLP ingestion,"Learn how to send OpenTelemetry Protocol (OTLP) telemetry data to Azure Monitor using Azure Monitor Agent on VMs, Scale Sets, and Arc-enabled servers.",Azure Monitor now supports native ingestion of OpenTelemetry Protocol (OTLP) signals. You can send telemetry data directly from OpenTelemetry-instrumented applications to Azure Monitor. Important,2026-04-15T22:11:00.000Z,how-to,configuration,0.7,True,"The page describes how to ingest OpenTelemetry Protocol data into Azure Monitor using Azure Monitor Agent on VMs, scale sets, and Arc-enabled servers. 
This scenario typically includes product-specific configuration details such as AMA data collection rules, endpoint/port settings, and OTLP-specific parameters that are unique to Azure Monitor’s OTLP ingestion path, which qualify as expert configuration knowledge beyond generic OpenTelemetry usage.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/containers/opentelemetry-protocol-ingestion,Ingest OTLP signals with OTel Collector (Preview),Ingest OTLP Data into Azure Monitor with OTel Collector (Preview) - Azure Monitor,Configure OpenTelemetry Collector to send OTLP to Azure Monitor,Learn how to send OpenTelemetry Protocol (OTLP) telemetry data directly to Azure Monitor cloud ingestion endpoints using the OpenTelemetry Collector.,Azure Monitor now supports native ingestion of OpenTelemetry Protocol (OTLP) signals. You can send telemetry data directly from OpenTelemetry-instrumented applications to Azure Monitor. Important,2026-04-08T22:12:00.000Z,how-to,configuration,0.7,True,"Page is about sending OTLP data directly to Azure Monitor via OTel Collector; such docs typically include endpoint URLs, exporter/receiver names, required headers, and configuration snippets specific to Azure Monitor ingestion, which are product-specific configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/containers/opentelemetry-summary,OpenTelemetry Protocol ingestion overview (Preview),OpenTelemetry ingestion options (preview) - Azure Monitor,Choose Azure Monitor OTLP ingestion option for AKS and VMs,"Compare OpenTelemetry Protocol (OTLP) ingestion options in Azure Monitor, including AKS monitoring, Azure Monitor Agent, and the OpenTelemetry Collector.","OpenTelemetry Protocol (OTLP) is an open standard for transmitting traces, metrics, and logs from applications and infrastructure to observability backends. Azure Monitor supports OTLP ingestion so you can collect telemetry from your workloads without proprietary instrumentation. 
Different compute environments—Azure Kubernetes Service (AKS), virtual machines, Azure Arc-enabled servers, hybrid clouds—call for different ingestion approaches. Choosing the wrong path can mean extra infrastructure to",2026-04-08T22:12:00.000Z,concept-article,decision-making,0.7,True,"Comparison page that helps select between AKS monitoring, Azure Monitor Agent, and OpenTelemetry Collector for OTLP ingestion across environments. It contains product-specific decision guidance and trade-offs for different compute scenarios, going beyond a conceptual overview.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/containers/prometheus-argo-cd-integration,Monitor Argo CD,Configure Argo CD Integration for Prometheus Metrics in Azure Monitor - Azure Monitor,Configure Argo CD monitoring with Azure Managed Prometheus,This article describes how to configure Argo CD monitoring by using Prometheus metrics in Azure Monitor to a Kubernetes cluster.,"Argo CD is a declarative, GitOps continuous delivery tool for Kubernetes. Argo CD follows the GitOps pattern of using Git repositories as the source of truth for defining the desired application states. It automates the deployment of the desired application states in the specified target environments. Application deployments can track updates to branches or tags, or they can be pinned to a specific version of manifests at a Git commit. 
This article describes how to configure the Azure Monitorman",2025-04-13T11:05:00.000Z,how-to,integrations,0.85,True,"Describes configuring Prometheus metrics scraping for Argo CD; likely includes exporter setup and CRD/ConfigMap parameters, which are integration-specific.",unchanged @@ -238,8 +238,8 @@ https://learn.microsoft.com/en-us/azure/azure-monitor/containers/prometheus-metr https://learn.microsoft.com/en-us/azure/azure-monitor/containers/prometheus-metrics-scrape-scale,High scale,Scrape Prometheus metrics at scale in Azure Monitor - Azure Monitor,Plan Prometheus scraping performance and scale in Azure Monitor,Guidance on performance that can be expected when collecting metrics at high scale for Azure Monitor managed service for Prometheus.,This article provides guidance on performance that can be expected when collecting metrics at high scale for Azure Monitor managed service for Prometheus.,2025-05-25T08:00:00.000Z,concept-article,limits-quotas,0.8,True,"Performance guidance for high-scale metrics collection typically includes concrete throughput, series count, and resource limits, fitting limits-quotas and best-practices; primary focus is quantitative performance expectations.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/containers/prometheus-metrics-troubleshoot,Prometheus,Troubleshoot collection of Prometheus metrics in Azure Monitor - Azure Monitor,Troubleshoot Prometheus metrics collection in Azure Monitor,Steps that you can take if you aren't collecting Prometheus metrics as expected.,"Follow the steps in this article to determine the cause of Prometheus metrics not being collected as expected in Azure Monitor. Replica pod scrapes metrics from kube-state-metrics, custom scrape targets in the ama-metrics-prometheus-config configmap and custom scrape targets defined in the Custom Resources. 
DaemonSet pods scrape metrics from the following targets on their respective node: kubelet, cAdvisor, node-exporter, and custom scrape targets in the ama-metrics-prometheus-config-node configmap. The ",2024-10-13T08:00:00.000Z,troubleshooting-general,troubleshooting,0.9,True,"Organized around missing metrics symptoms and includes product-specific scrape targets, configmap names, and diagnostic steps for Prometheus collection.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/containers/prometheus-remote-write-active-directory,Entra authentication,Set up Prometheus remote write using Microsoft Entra authentication - Azure Monitor,Configure Prometheus remote write with Entra ID authentication,Learn how to set up remote write in Azure Monitor managed service for Prometheus. Use Microsoft Entra authentication to send data from a self-managed Prometheus server running in your Azure Kubernetes,Important This article describes how to set up remote write in Azure Monitor managed service for Prometheus using Entra ID authentication and a sidecar container provided by Azure Monitor. You can use remote write with Entra ID without using a sidecar using the guidance at Connect self-managed Prometheus to Azure Monitor managed service for Prometheus. This article describes how to set up remote write to send data from a self-managed Prometheus server running in your Azure Kubernetes Service (AKS) c,2025-10-01T05:08:00.000Z,how-to,integrations,0.9,True,"Similar to 37 but with Entra ID; includes specific auth flows, scopes, and configuration parameters for remote write to Azure Monitor.",unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/containers/prometheus-remote-write-azure-workload-identity,Entra Workload ID authentication,Set up Prometheus remote write using Microsoft Entra Workload ID authentication - Azure Monitor,Configure Prometheus remote write with Entra Workload ID,Learn how to set up remote write in Azure Monitor managed service for Prometheus. 
Use Microsoft Entra Workload ID (preview) authentication to send data from a self-managed Prometheus server to your Az,"Important You can use remote write with workload identity by configuring Prometheus by following the guidance at Connect self-managed Prometheus to Azure Monitor managed service for Prometheus. Effective March 31, 2027, Azure Monitor will deprecate the sidecar-based remote-write solution for sending Prometheus metrics to Azure Monitor Workspace. This article describes how to set up remote write to send data from your Azure Monitor managed Prometheus cluster by using Microsoft Entra Workload ID auth",2026-04-16T22:09:00.000Z,how-to,configuration,0.68,True,"The article describes detailed, product-specific setup for Prometheus remote write using Microsoft Entra Workload ID, including precise configuration steps and parameters unique to Azure Monitor managed service for Prometheus. This is configuration-focused expert knowledge rather than generic concepts or simple how-to, and it includes deprecation/timing details and auth-specific settings that an LLM wouldn't reliably know from training.",updated -https://learn.microsoft.com/en-us/azure/azure-monitor/containers/prometheus-remote-write-managed-identity,Managed identity authentication,Set up Prometheus remote write using managed identity authentication - Azure Monitor,Configure Prometheus remote write with managed identity,Learn how to set up remote write in Azure Monitor managed service for Prometheus. Use managed identity authentication to send data from a self-managed Prometheus server running in your Azure Kubernete,"Important Effective March 31, 2027, Azure Monitor will not support the sidecar-based remote-write solution for sending Prometheus metrics to Azure Monitor Workspace. Configure self-hosted Prometheus or Prometheus Operator to remote-write directly to Azure Monitor Workspace without using a sidecar. 
For guidance, see Connect self-managed Prometheus to Azure Monitor managed service for Prometheus. This article describes how to set up remote write to send data from a self-managed Prometheus server runn",2026-04-16T22:09:00.000Z,how-to,configuration,0.78,True,"The article explains how to set up Prometheus remote write to Azure Monitor using managed identity, which involves specific endpoint URLs, authentication parameters, and configuration blocks for Prometheus and Azure Monitor. These are product-specific configuration details and parameters rather than generic concepts, so it fits the configuration sub-skill. It also includes deprecation timing for the sidecar-based solution, but the core value is the concrete configuration for remote write.",updated +https://learn.microsoft.com/en-us/azure/azure-monitor/containers/prometheus-remote-write-azure-workload-identity,Entra Workload ID authentication,Set up Prometheus remote write using Microsoft Entra Workload ID authentication - Azure Monitor,Configure Prometheus remote write with Entra Workload ID,Learn how to set up remote write in Azure Monitor managed service for Prometheus. Use Microsoft Entra Workload ID (preview) authentication to send data from a self-managed Prometheus server to your Az,"Important You can use remote write with workload identity by configuring Prometheus by following the guidance at Connect self-managed Prometheus to Azure Monitor managed service for Prometheus. Effective March 31, 2027, Azure Monitor will deprecate the sidecar-based remote-write solution for sending Prometheus metrics to Azure Monitor Workspace. 
This article describes how to set up remote write to send data from your Azure Monitor managed Prometheus cluster by using Microsoft Entra Workload ID auth",2026-04-16T22:09:00.000Z,how-to,configuration,0.68,True,"The article describes detailed, product-specific setup for Prometheus remote write using Microsoft Entra Workload ID, including precise configuration steps and parameters unique to Azure Monitor managed service for Prometheus. This is configuration-focused expert knowledge rather than generic concepts or simple how-to, and it includes deprecation/timing details and auth-specific settings that an LLM wouldn't reliably know from training.",unchanged +https://learn.microsoft.com/en-us/azure/azure-monitor/containers/prometheus-remote-write-managed-identity,Managed identity authentication,Set up Prometheus remote write using managed identity authentication - Azure Monitor,Configure Prometheus remote write with managed identity,Learn how to set up remote write in Azure Monitor managed service for Prometheus. Use managed identity authentication to send data from a self-managed Prometheus server running in your Azure Kubernete,"Important Effective March 31, 2027, Azure Monitor will not support the sidecar-based remote-write solution for sending Prometheus metrics to Azure Monitor Workspace. Configure self-hosted Prometheus or Prometheus Operator to remote-write directly to Azure Monitor Workspace without using a sidecar. For guidance, see Connect self-managed Prometheus to Azure Monitor managed service for Prometheus. This article describes how to set up remote write to send data from a self-managed Prometheus server runn",2026-04-16T22:09:00.000Z,how-to,configuration,0.78,True,"The article explains how to set up Prometheus remote write to Azure Monitor using managed identity, which involves specific endpoint URLs, authentication parameters, and configuration blocks for Prometheus and Azure Monitor. 
These are product-specific configuration details and parameters rather than generic concepts, so it fits the configuration sub-skill. It also includes deprecation timing for the sidecar-based solution, but the core value is the concrete configuration for remote write.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-endpoint-overview,Data collection endpoints,Data collection endpoints in Azure Monitor - Azure Monitor,Configure Azure Monitor data collection endpoints,Overview of how data collection endpoints work and how to create and set them up based on your deployment.,"A data collection endpoint (DCE) is an Azure resource that defines a unique set of endpoints related to data collection, configuration, and ingestion in Azure Monitor. This article provides an overview of data collection endpoints and explains how to create and set them up based on your deployment. Note This article only relates to data collection scenarios in Azure Monitor that use a data collection rule (DCR). Legacy data collection scenarios such as collecting resource logs with diagnostic set",2025-05-21T08:00:00.000Z,how-to,configuration,0.7,True,"Explains how to create and set up DCE resources with deployment-based configuration details, which are product-specific settings.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-monitor,Monitor DCRs,Monitor DCR data collection in Azure Monitor - Azure Monitor,Monitor and troubleshoot DCR-based data collection in Azure Monitor,Configure log collection for monitoring of DCR-based data collection in Azure Monitor.,"This article provides detailed metrics and logs that you can use to monitor performance and troubleshoot any issues related to data collection in Azure Monitor. This telemetry is currently available for data collection scenarios defined by a data collection rule (DCR) such as Azure Monitor agent and Logs ingestion API. 
Important This article only refers to data collection scenarios that use DCRs, including the following: See the documentation for other scenarios for any monitoring and troubleshoo",2026-01-20T08:00:00.000Z,how-to,troubleshooting,0.7,True,"Provides detailed metrics and logs to monitor performance and troubleshoot DCR-based collection, mapping telemetry to issues—product-specific troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-rule-associations,Manage associations,Manage data collection rule associations in Azure Monitor - Azure Monitor,,Describes different options for viewing data collection rules (DCRs) and data collection rule associations (DCRA) in Azure Monitor.,Data collection rule associations (DCRAs) associate DCRs with monitored resources in Azure Monitor as described in Using a DCR. This article describes different methods for viewing and creating DCRAs and their related resources. Important Not all data collection scenarios with DCRs use DCRAs. 
See Using a DCR for an explanation and comparison of how DCRs are specified in different data collection scenarios.,2026-01-20T08:00:00.000Z,how-to,,0.3,False,"Primarily describes ways to view and create DCR associations; no detailed config tables, limits, or error-code-based troubleshooting.",unchanged @@ -251,24 +251,26 @@ https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-colle https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-rule-structure,DCR structure,Structure of a data collection rule (DCR) in Azure Monitor - Azure Monitor,Understand Azure Monitor data collection rule JSON schema,Details on the structure of different kinds of data collection rule in Azure Monitor.,This article describes the JSON structure of DCRs for those cases where you need to work directly with their definition.,2025-11-18T12:03:00.000Z,reference,configuration,0.8,True,"Explains the JSON structure of DCRs, which is product-specific configuration with schema fields and allowed values that qualify as expert configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-rule-view,View DCRs,Viewing data collection rules (DCRs) in Azure Monitor - Azure Monitor,View and inspect data collection rule definitions in Azure Monitor,Options for viewing data collection rules (DCRs) and their definitions in Azure Monitor.,Data collection rules (DCRs) are stored in Azure so they can be centrally deployed and managed like any other Azure resource. 
They provide a consistent and centralized way to define and customize different data collection scenarios.,2025-10-15T05:09:00.000Z,concept-article,configuration,0.8,True,"Shows options to view DCRs and their definitions, including where and how they’re stored and accessed; product-specific resource configuration details.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-transformations,Overview,Transformations Azure Monitor - Azure Monitor,Configure Azure Monitor data transformations in DCRs,Use transformations in a data collection rule in Azure Monitor to filter and modify incoming data.,"Transformations in Azure Monitor allow you to filter or modify incoming data before it's sent to a Log Analytics workspace. Transformations are run after the data source delivers the data and before it's sent to the destination. They're defined in a data collection rule (DCR) and use a Kusto Query Language (KQL) statement that's applied individually to each entry in the incoming data. The following diagram illustrates the transformation process for incoming data and shows a sample query that might b,2026-02-03T23:12:00.000Z,concept-article,configuration,0.7,True,Details how transformations are defined in DCRs using KQL; includes product-specific configuration fields and behavior for transformation processing.,unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-transformations-create,Create,Create a transformation in Azure Monitor - Azure Monitor,Create and test Azure Monitor transformation queries,Create a transformation in Azure Monitor and add it to a data collection rule (DCR).,"Transformations in Azure Monitor allow you to filter or modify incoming data before it's stored in a Log Analytics workspace. They're implemented as a Kusto Query Language (KQL) statement in a data collection rule (DCR). 
This article provides guidance on creating and testing a transformation query and adding it to a DCR. Note If you're not familiar with KQL or creating log queries for Azure Monitor data, start with Overview of Log Analytics in Azure Monitor and Log queries in Azure Monitor.",2026-01-20T08:00:00.000Z,how-to,configuration,0.75,True,"Provides concrete guidance on creating and adding transformation queries to DCRs, including product-specific configuration steps and parameters.",unchanged +https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-transformations-create,Create,Create a transformation in Azure Monitor - Azure Monitor,,Create a transformation in Azure Monitor and add it to a data collection rule (DCR).,"Transformations in Azure Monitor allow you to filter or modify incoming data before it's stored in a Log Analytics workspace. They're implemented as a Kusto Query Language (KQL) statement in a data collection rule (DCR). This article provides guidance on creating and testing a transformation query and adding it to a DCR. Note If you're not familiar with KQL or creating log queries for Azure Monitor data, start with Overview of Log Analytics in Azure Monitor and Log queries in Azure Monitor.",2026-04-22T08:00:00.000Z,how-to,,0.3,False,"Primarily a how-to for creating and attaching KQL-based transformations to data collection rules. 
From the summary, it doesn't clearly expose product-specific limits, configuration tables, or error-code-based troubleshooting; it looks like general procedural guidance that an LLM can approximate without hidden numeric constraints or specialized config matrices.",updated https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-transformations-kql,Supported KQL,Supported KQL features in Azure Monitor transformations - Azure Monitor,Use supported KQL features in Azure Monitor transformations,Supported KQL features in Azure Monitor transformations,"Transformations in Azure Monitor allow you to run a KQL query against incoming Azure Monitor data to filter or modify incoming data before it's stored in a Log Analytics workspace. This article details KQL considerations and supported features in transformation queries in addition to special operators that are only available in transformations. Since transformations are applied to each record individually, they can't use any KQL operators that act on multiple records. Only operators that take a s",2026-01-20T08:00:00.000Z,reference,integrations,0.8,True,Lists which KQL operators/features are supported or restricted in transformations and introduces special operators unique to this feature—product-specific query/integration behavior.,unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-transformations-samples,Best practices and samples,Sample transformations in Azure Monitor - Azure Monitor,Reuse sample KQL transformations for Azure Monitor,Sample transformations for common scenarios in Azure Monitor.,Transformations in Azure Monitor allow you to filter or modify incoming data before it's sent to a Log Analytics workspace. This article provides sample queries for common scenarios that you can use to get started creating your own transformations. 
See Create a transformation in Azure Monitor for details on testing these transformations and adding them to a data collection rule (DCR).,2026-01-20T08:00:00.000Z,how-to,integrations,0.65,True,Provides concrete sample transformation queries for common scenarios; these are product-specific code patterns for integrating data via transformations.,unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-plane-versus-metrics-export,Compare metrics strategies,Metrics export feature comparison - Azure Monitor,Choose between Azure Monitor metrics export and data plane API,A comparison of Data plane API or metrics batch query and metrics export.,"Azure Monitor provides two ways to access metrics data at scale: Azure Monitor Metrics Data plane API and metrics export. Although both collect metrics data, they're more effective for different use cases. This article provides a scenario comparison for using these features and recommendations on when to use each.",2026-01-21T13:03:00.000Z,concept-article,decision-making,0.8,True,"Scenario-based comparison of two metrics access methods with recommendations on when to use each, fitting decision-making guidance.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/metrics-export-create,Metrics export using DCRs,Metrics export using data collection rules (Preview) - Azure Monitor,Configure DCR-based metrics export in Azure Monitor,Learn how to create data collection rules for metrics.,"Platform metrics measure the performance of different aspects of your Azure resources. Diagnostic settings are used to collect and export platform metrics from all Azure resources that support them. Data collection rules (DCRs) can also be used to collect and export platform metrics from supported Azure resources. This article describes how to use DCRs to export metrics. 
Note While you can use DCRs and diagnostic settings at the same time, you should disable any diagnostic settings for metrics when u",2026-01-21T13:03:00.000Z,concept-article,configuration,0.7,True,"Describes how to define DCRs for metrics export, which involves product-specific configuration fields and settings beyond generic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/metrics-export-structure,Metrics export DCR structure,Data collection rule (DCR) structure for metrics export - Azure Monitor,Author DCR JSON for Azure Monitor metrics export,Details on DCR structure for metrics export in Azure Monitor.,"Metrics export in Azure Monitor uses data collection rules (DCRs) to define which metrics to collect from which resources and where to send them. When you use the Azure portal to configure this feature, you don't need to understand the structure of DCR. Using other methods though, you may need to understand the structure so you can modify it for your requirements. This article describes the details of DCRs used for metrics export.",2026-01-21T13:03:00.000Z,how-to,configuration,0.85,True,"Focuses on the detailed DCR structure for metrics export, including schema and fields—core configuration reference content.",unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure,Configuration overview,Configure Azure Monitor pipeline - Azure Monitor,Configure Azure Monitor pipeline in your environment,Configure Azure Monitor pipeline which extends Azure Monitor data collection into your data center.,The Azure Monitor pipeline extends the data collection capabilities of Azure Monitor to edge and multicloud environments. 
This article provides details on enabling and configuring the Azure Monitor pipeline in your environment.,2026-02-25T23:09:00.000Z,how-to,configuration,0.75,True,"Details enabling and configuring the pipeline, likely with specific extension settings, cache options, and routing parameters.",unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure-cli,CLI and templates,Configure Azure Monitor pipeline using CLI or ARM templates - Azure Monitor,Configure Azure Monitor pipeline using CLI or ARM templates,Use CLI or ARM templates to configure Azure Monitor pipeline which extends Azure Monitor data collection into your data center.,"The Azure Monitor pipeline extends the data collection capabilities of Azure Monitor to edge and multicloud environments. This article provides details on enabling and configuring the Azure Monitor pipeline in your environment. Depending on the method you use, you may not require all the details in this article.",2026-02-25T23:09:00.000Z,how-to,configuration,0.75,True,CLI/ARM-based configuration with detailed resource definitions and parameters; highly product-specific configuration knowledge.,unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure-clients,Configure clients,Configure clients to use Azure Monitor pipeline - Azure Monitor,Configure Azure Monitor pipeline client connections,Configure clients to use the Azure Monitor pipeline which extends Azure Monitor data collection into your data center.,"The Azure Monitor pipeline extends the data collection capabilities of Azure Monitor to edge and multicloud environments. This article describes how to configure your clients to send data to the pipeline after it's been deployed. Each client requires the external IP address of the Azure Monitor pipeline service. 
Use the following command to retrieve this address: Note If the external-ip field is set to pending, you need to configure an external IP for this ingress manually according to your cluster",2026-02-19T23:12:00.000Z,how-to,configuration,0.78,True,"Client configuration for Azure Monitor pipeline will include concrete commands, endpoint formats, and parameter values (for example, how to retrieve and use the external IP and specific flags), which are product-specific configuration details rather than generic concepts.",unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure-portal,Azure portal,Configure Azure Monitor pipeline using the Azure portal - Azure Monitor,Configure Azure Monitor pipeline using Azure portal,Use the Azure portal to configure Azure Monitor pipeline which extends Azure Monitor data collection into your data center.,"The Azure Monitor pipeline extends the data collection capabilities of Azure Monitor to edge and multicloud environments. This article describes how to enable and configure the Azure Monitor pipeline using the Azure portal. Using this method, you don't need to understand the individual components that make up the pipeline, but you may need to use other methods for more advanced functionality such as enabling the cache. To use CLI or ARM templates to configure the pipeline, see Configure Azure Monit",2026-02-25T23:09:00.000Z,how-to,configuration,0.7,True,Portal-based configuration of pipeline components; includes specific setting names and allowed values for pipeline behavior.,unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-extension-versions,Extension versions,Azure Monitor pipeline extension versions - Azure Monitor,,Extension versions and release notes for Azure Monitor pipeline.,"This article describes the version details for the Azure Monitor pipeline Arc-enabled Kubernetes extension. 
This extension deploys the pipeline on Arc-enabled Kubernetes clusters in your on-premises, edge, hybrid or multicloud environments.",2026-03-18T06:05:00.000Z,how-to,,0.3,False,"Page is primarily version history and release notes for the Azure Monitor pipeline Arc-enabled Kubernetes extension. It does not focus on limits/quotas, configuration parameter tables, deployment matrices, security roles, or troubleshooting mappings with error codes. While it has product-specific details (version numbers, fixes, and changes), these are release-note style rather than reusable expert configuration/architecture/troubleshooting knowledge as defined by the sub-skill types.",unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-kubernetes-gateway,Sample gateway setup,Azure Monitor pipeline - Gateway for Kubernetes deployment - Azure Monitor,Expose Azure Monitor pipeline via Traefik gateway,Secure the connection from your remote clients to Azure Monitor pipeline using a gateway.,"Azure Monitor pipeline extension gets deployed with ClusterIP services, which are only accessible within the Kubernetes cluster. To expose these pipelines to remote clients external to the cluster (e.g. network switches, firewall devices), you need to deploy a gateway solution. This guide shows how to expose an Azure Monitor Pipeline receiver to external clients using a Traefik gateway. 
Note Traefik gateway will only work in environments where Kubernetes Load Balancers can be deployed successfull",2026-03-03T13:04:00.000Z,article,configuration,0.7,True,"Describes a concrete, product-specific Kubernetes gateway setup (Traefik) to expose Azure Monitor pipeline receivers, likely including specific service types, annotations, ports, and configuration snippets that go beyond generic knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-overview,Overview,Azure Monitor pipeline overview - Azure Monitor,,"Overview of the Azure Monitor pipeline, which extends Azure Monitor data collection into your own data center","The Azure Monitor pipeline extends the data collection capabilities of Azure Monitor to your local data center and multicloud environments. It enables at-scale collection, transformation, and routing of telemetry data before sending it to Azure Monitor in the cloud. The pipeline can cache data locally, sync with the cloud when connectivity is restored, and route telemetry to Azure Monitor in cases where clients can't send data directly to the cloud.",2026-03-05T06:34:00.000Z,article,,0.1,False,"Described as an overview of the Azure Monitor pipeline and its capabilities (extending data collection, caching, syncing, routing). 
This is conceptual/architectural explanation without clear indication of specific configuration parameters, limits, or decision matrices, so it does not meet the expert-knowledge criteria for any sub-skill type.",unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-pod-placement,Pod placement,Azure Monitor pipeline pod placement - Azure Monitor,Configure pod placement for Azure Monitor pipeline,Manage how Azure Monitor pipeline instances are scheduled across Kubernetes cluster nodes by configuring pod placement.,"As Azure Monitor pipeline scales, default scheduling behavior in your Kubernetes environment may not meet your performance, isolation, or compliance needs. Pod placement allows you to manage how your Azure Monitor pipeline instances are scheduled across Kubernetes cluster nodes. This feature allows you to target specific nodes based on their capabilities, control instance distribution to prevent resource contention, and enforce isolation policies for high-scale deployments.",2026-02-25T23:09:00.000Z,article,configuration,0.7,True,"Pod placement across Kubernetes nodes typically involves specific Kubernetes constructs (node selectors, taints, tolerations, affinity rules) and product-specific labels/annotations, which are concrete configuration details unique to this service.",unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-tls,Configure TLS,Azure Monitor pipeline TLS configuration - Azure Monitor,Configure TLS and mTLS for Azure Monitor pipeline,Secure the connection from your Azure Monitor pipeline to Azure Monitor by configuring TLS.,"The Azure Monitor pipeline extends the data collection capabilities of Azure Monitor to your local data center and multicloud environments. 
It supports both TLS and mutual TLS (mTLS) for TCP‑based receivers through two certificate management approaches: With these options, you can: This article explains how to secure data ingestion into Azure Monitor pipeline using TLS encryption, and additionally secure intra-cluster traffic using mTLS client authentication. Using the options below, you can choose ",2026-03-03T13:04:00.000Z,article,security,0.8,True,"Focuses on TLS/mTLS for Azure Monitor pipeline with product-specific security configuration, likely including certificate types, configuration parameters, and options for securing intra-cluster traffic that qualify as detailed security guidance.",unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-tls-automated,Automated certificate management (Default TLS),Azure Monitor pipeline TLS configuration (Automated certificate management) - Azure Monitor,Use automated TLS certificate management for Azure Monitor pipeline,Secure the connection from your Azure Monitor pipeline to Azure Monitor by configuring TLS (Automated certificate management).,The Azure Monitor pipeline supports both TLS and mutual TLS (mTLS) for TCP‑based receivers through two certificate management approaches: This article provides detailed guidance for the Default TLS option. 
See Using your own certificate management (Customer managed or BYOC) for the BYOC option.,2026-03-03T13:04:00.000Z,article,security,0.85,True,"Provides detailed guidance for the default TLS option with automated certificate management, including product-specific security settings and configuration parameters for Azure Monitor pipeline.",unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-tls-custom,Customer-managed certificates (BYOC),Azure Monitor pipeline TLS configuration (Customer managed) - Azure Monitor,Configure customer-managed TLS certificates for Azure Monitor pipeline,Secure the connection from your Azure Monitor pipeline to Azure Monitor by configuring TLS (Customer managed).,"The Azure Monitor pipeline supports both TLS and mutual TLS (mTLS) for TCP‑based receivers through two certificate management approaches: This article provides detailed guidance for the BYOC option. See Using automated certificate management for the Default TLS option. You can provide your own certificates to meet compliance, security, and custom PKI requirements. With BYOC, you can:",2026-03-03T13:04:00.000Z,article,security,0.85,True,"Covers BYOC/customer-managed certificates for TLS/mTLS with Azure Monitor pipeline, including product-specific certificate requirements and configuration steps that constitute expert security configuration knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-transformations,Transformations,Configure Azure Monitor pipeline transformations - Azure Monitor,,Configure Azure Monitor pipeline transformations to filter and manipulate log data before it's sent to Azure Monitor in the cloud.,"Azure Monitor pipeline data transformations allow you to filter and manipulate log data before it's sent to Azure Monitor in the cloud. 
Transformations enable you to structure incoming data according to your analytics needs, ensuring that only relevant information is sent to Azure Monitor and that it's in an appropriate format to be processed. Benefits of using pipeline transformations include: Azure Monitor pipeline solves the challenges of high ingestion costs and complex analytics by enabling",2026-02-03T23:12:00.000Z,article,,0.3,False,"Described as benefits and conceptual explanation of transformations; summary does not indicate concrete parameter tables, limits, or product-specific gotchas.",unchanged +https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure,Configure Azure Monitor pipeline,Configure Azure Monitor pipeline - Azure Monitor,Prepare and configure clusters for Azure Monitor pipeline,"Learn how to prepare your cluster, install cert-manager, and choose a configuration method for Azure Monitor pipeline.",This article describes the overall setup process for Azure Monitor pipeline and provides details for the initial common setup to prepare your Arc-enabled Kubernetes cluster for the pipeline.,2026-04-21T17:12:00.000Z,how-to,configuration,0.7,True,"Described as overall setup including preparing Arc-enabled Kubernetes cluster and installing cert-manager; such setup docs typically list specific extension names, parameters, and configuration steps unique to this product.",new +https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure-cli,Configure with CLI or ARM templates,Configure Azure Monitor pipeline with CLI or ARM templates - Azure Monitor,Configure Azure Monitor pipeline with CLI and ARM,Learn how to configure Azure Monitor pipeline with CLI or ARM templates for automation and advanced scenarios.,"Use this article after you complete the shared setup in Configure Azure Monitor pipeline. 
This method is best for automation, custom tables, buffering, and other advanced scenarios.",2026-04-21T17:12:00.000Z,how-to,configuration,0.75,True,"CLI/ARM configuration implies parameter names, JSON schema, and flags specific to Azure Monitor pipeline, matching configuration sub-skill criteria.",new +https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure-portal,Configure with the Azure portal,Configure Azure Monitor pipeline with the Azure portal - Azure Monitor,Configure Azure Monitor pipeline and dataflows in portal,Learn how to create an Azure Monitor pipeline and dataflows with the Azure portal after you complete the shared setup.,"Use this article after you complete the shared setup in Configure Azure Monitor pipeline. The Azure portal is the quickest way to create a pipeline and its dataflows because it creates the required pipeline resources for you. If you need automation, buffering, or more control over the deployed resources, use Configure Azure Monitor pipeline with CLI or ARM templates.",2026-04-21T17:12:00.000Z,how-to,configuration,0.7,True,"Portal configuration article will contain concrete resource names, fields, and settings for pipeline and dataflows that are product-specific configuration knowledge.",new
This extension deploys the pipeline on Arc-enabled Kubernetes clusters in your on-premises, edge, hybrid, or multicloud environments.",2026-04-21T17:12:00.000Z,how-to,limits-quotas,0.65,True,"Version and release notes typically include specific version numbers, supported features, and sometimes constraints tied to versions; this is expert, version-specific operational knowledge, closest to limits/quotas among available categories.",updated +https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-faq,FAQ,Azure Monitor pipeline FAQ - Azure Monitor,,"Frequently asked questions for Azure Monitor pipeline, including architecture, deployment, data collection, and configuration.","Frequently asked questions for Azure Monitor pipeline, including architecture, deployment, data collection, and configuration.",2026-04-21T17:12:00Z,faq,,0.2,False,"FAQ description is generic; likely high-level Q&A about architecture and deployment without structured error codes, limits, or decision matrices.",new +https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-kubernetes-gateway,Configure a Kubernetes gateway,Configure a Kubernetes gateway for Azure Monitor pipeline - Azure Monitor,Deploy and manage Kubernetes gateway for Azure Monitor pipeline,Deploy a dedicated Kubernetes gateway for a new Azure Monitor pipeline receiver.,"Azure Monitor pipeline services deploy as Kubernetes ClusterIP services, so clients outside the cluster can't reach them directly. A gateway exposes the receiver endpoint to external clients such as network devices, firewalls, and other telemetry sources. This article shows how to deploy and manage a Traefik gateway for Azure Monitor pipeline. 
It covers a first deployment, adding a receiver to an existing gateway, onboarding another client to an existing receiver, and deploying another pipeline gr",2026-04-21T17:12:00.000Z,how-to,configuration,0.7,True,"Describes deploying a Traefik gateway and managing receivers/clients; likely includes specific Kubernetes manifests, service types, and annotations unique to this integration pattern, fitting configuration.",new +https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-overview,Overview,What is Azure Monitor pipeline? - Azure Monitor,Decide when and how to use Azure Monitor pipeline,"Learn what Azure Monitor pipeline is, when to use it, and the recommended setup sequence for edge and multicloud data collection.","Azure Monitor can already collect telemetry from on-premises, edge, and multicloud environments. In many enterprise environments, sending large volumes of telemetry directly to the cloud can increase ingestion costs, introduce reliability risks during connectivity loss, and limit your control over what data is collected. Azure Monitor pipeline builds on existing Azure Monitor collection capabilities for these scenarios. 
Azure Monitor pipeline provides centralized governance and a single point of",2026-04-21T17:12:00.000Z,overview,decision-making,0.65,True,"Overview focuses on when to use Azure Monitor pipeline for edge and multicloud data collection and recommends a setup sequence; likely includes scenario-based guidance and trade-offs versus direct ingestion, which is product-specific decision guidance.",updated +https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-pod-placement,Pod placement,Azure Monitor pipeline pod placement - Azure Monitor,Configure pod placement for Azure Monitor pipeline on Kubernetes,Manage how Azure Monitor pipeline instances are scheduled across Kubernetes cluster nodes by configuring pod placement.,"As Azure Monitor pipeline scales, default scheduling behavior in your Kubernetes environment might not meet your performance, isolation, or compliance needs. Pod placement helps you manage how your Azure Monitor pipeline instances are scheduled across Kubernetes cluster nodes. This feature helps you target specific nodes based on their capabilities, control instance distribution to prevent resource contention, and enforce isolation policies for high-scale deployments.",2026-04-21T17:12:00.000Z,article,configuration,0.7,True,"Pod placement feature involves configuring scheduling rules, node selectors, and policies specific to Azure Monitor pipeline instances, which is detailed configuration knowledge.",updated
This article provides throughput baselines, scaling strategies, and capacity planning examples to help you size a pipeline deployment for your environment. Use these baselines alongside the Azure Monitor pipeline overview and Configure Azure Monitor pipeline to plan your deployment. Note The performance data in this article was collected using pipeline version v0.158 in March 2026 with TCP transport, ~1.2 KB payloads, an",2026-04-21T17:12:00.000Z,best-practice,best-practices,0.85,True,"Explicitly described as best practices for sizing with per-vCPU baselines, scaling strategies, and capacity planning examples; includes performance data and quantitative guidance unique to this product.",new
+https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-tls,Configure TLS,Azure Monitor pipeline TLS configuration - Azure Monitor,Choose TLS and mTLS options for Azure Monitor pipeline,Secure data ingestion to Azure Monitor pipeline by configuring TLS.,Use this article to choose how to secure TCP-based ingestion for Azure Monitor pipeline. 
The pipeline supports TLS and mutual TLS (mTLS) for TCP-based receivers through two certificate management approaches: Choose one of the following approaches: Use the following sections to choose the approach that fits your deployment.,2026-04-21T17:12:00.000Z,article,security,0.8,True,"TLS configuration article focuses on securing ingestion, choosing between TLS/mTLS and certificate management approaches; this is product-specific security configuration guidance.",updated
+https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-tls-automated,Automated certificate management (default TLS),Azure Monitor pipeline TLS configuration (Automated certificate management) - Azure Monitor,Configure automated TLS certificate management for Azure Monitor pipeline,Secure data ingestion to Azure Monitor pipeline by using automated certificate management.,"Use this article when you want the Certificate Manager extension to manage certificates for TLS-enabled Azure Monitor pipeline receivers. For the customer-managed option, see Using your own certificate management.",2026-04-21T17:12:00.000Z,article,security,0.8,True,"Details using the Certificate Manager extension for TLS-enabled receivers, including specific configuration parameters and flows, which are security-focused and product-specific.",new
+https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-tls-custom,Customer-managed certificates (BYOC),Azure Monitor pipeline TLS configuration (Customer managed) - Azure Monitor,Configure customer-managed TLS certificates for Azure Monitor pipeline,Secure data ingestion to Azure Monitor pipeline by using customer-managed certificates.,"Use this article when you want to manage certificates for Azure Monitor pipeline receivers yourself. For the automated option, see Using automated certificate management. 
Advantages of BYOC:",2026-04-21T17:12:00.000Z,article,security,0.8,True,"Customer-managed TLS configuration will include certificate requirements, secret formats, and binding settings unique to Azure Monitor pipeline, fitting security configuration.",updated +https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-transformations,Transformations,Configure Azure Monitor pipeline transformations - Azure Monitor,Configure Azure Monitor pipeline log transformations,Configure Azure Monitor pipeline transformations to filter and manipulate log data before it's sent to Azure Monitor in the cloud.,"If you need to reduce the volume of data sent to Azure Monitor, clean up incoming records, or change data into a format that works better for analysis, transform the data before it leaves your cluster. Azure Monitor pipeline transformations let you filter, aggregate, and modify incoming log data in the pipeline before sending it to the cloud. This article shows how to configure transformations in the Azure portal or in ARM templates. 
It also explains how aggregations work and which KQL operators",2026-04-21T17:12:00.000Z,article,configuration,0.7,True,"Covers how to configure transformations, including which KQL operators are supported and how aggregations work; this is product-specific configuration behavior and constraints.",updated
+https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-troubleshoot,Troubleshoot,Troubleshoot Azure Monitor pipeline - Azure Monitor,Troubleshoot Azure Monitor pipeline deployment and data issues,"Guidance for troubleshooting issues with Azure Monitor pipeline deployment, configuration, data collection, and connectivity.",This article provides guidance for common issues encountered when deploying and using Azure Monitor pipeline.,2026-04-21T17:12:00.000Z,troubleshooting-general,troubleshooting,0.85,True,"Explicit troubleshooting article for deployment, configuration, data collection, and connectivity; likely organized by symptoms with specific causes and resolutions unique to Azure Monitor pipeline.",new
https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/azure-monitor-network-access,Azure Monitor network access requirements,Azure Monitor endpoint access and firewall configuration - Azure Monitor,Configure network and firewall access to Azure Monitor,Ensure your Azure resources can connect to Azure Monitor by configuring firewall rules and understanding endpoint access requirements.,"If your monitored application or infrastructure is behind a firewall, you need to configure network access to allow communication with Azure Monitor services. Azure Monitor uses service tags, which provide a dynamic way to manage network access, especially if you're using Azure network security groups, Azure Firewall, or next generation firewalls (NGFW) that can implement service tags. 
For hybrid or on-premises resources, or network perimeter controls that don't support service tags, retrieve the eq",2025-08-19T05:14:00.000Z,reference,security,0.74,True,"Details Azure Monitor endpoint access requirements, service tags, and firewall configuration specifics, which are product-specific security/network configuration settings.",unchanged
https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/azure-monitor-operations-manager,Migrate from Operations Manager,Migrate from System Center Operations Manager (SCOM) to Azure Monitor - Azure Monitor,Decide how to migrate SCOM monitoring to Azure Monitor,Guidance for existing users of System Center Operations Manager to transition monitoring of workloads to Azure Monitor as part of a transition to the cloud.,This article provides guidance for customers who use System Center Operations Manager and are planning a transition to cloud-based monitoring with Azure Monitor as they migrate business applications and other resources into Azure. There's no standard process for migrating from System Center Operations Manager. You may rely on SCOM management packs for an extended time as opposed to performing a quick migration. This article describes the different options available and decision criteria you can use ,2025-08-28T08:00:00.000Z,upgrade-and-migration-article,decision-making,0.7,True,Migration guidance from SCOM to Azure Monitor with explicit options and decision criteria for different scenarios qualifies as decision-making. 
It goes beyond overview and provides product-specific migration paths and trade-offs.,unchanged
https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/azure-monitor-rest-api-index,Azure Monitor REST API index,Azure Monitor REST API index - Azure Monitor,Index of Azure Monitor REST API operation groups,"Lists the operation groups for the Azure Monitor REST API, which includes Application Insights, Log Analytics, and Monitor.","Note This page lists APIs for native Azure Monitor features, including Managed Prometheus. Other managed services, such as Azure Managed Grafana and Azure Monitor SCOM Managed Instance, use their own REST APIs and aren't included in this index. For more information, see Azure Managed Grafana REST API Reference and System Center Operations Manager REST API Reference. Organized by subject area.",2025-12-11T08:00:00.000Z,reference,configuration,0.7,True,"Provides a structured index of REST API operation groups for Azure Monitor, which is a product-specific API surface reference useful for configuration and integration.",unchanged
@@ -334,7 +336,7 @@ https://learn.microsoft.com/en-us/azure/azure-monitor/logs/create-custom-table,A
https://learn.microsoft.com/en-us/azure/azure-monitor/logs/create-custom-table-auxiliary,Set up a table with the Auxiliary plan,Set up a table with the Auxiliary plan for low-cost data ingestion and retention in your Log Analytics workspace - Azure Monitor,Use Auxiliary table plan for low-cost Azure Monitor log retention,Create a custom table with the Auxiliary table plan in your Log Analytics workspace for low-cost ingestion and retention of log data.,The Auxiliary table plan lets you ingest and retain data in your Log Analytics workspace at a low cost. 
Here's a video that explains some of the uses and benefits of the Auxiliary table plan: Azure Monitor Logs currently supports the Auxiliary table plan on data collection rule (DCR)-based custom tables to which you send data you collect using Azure Monitor Agent or the Logs ingestion API. This article explains how to create a new custom table with the Auxiliary plan in your Log Analytics workspace and,2026-03-13T12:02:00.000Z,how-to,decision-making,0.65,True,"Describes when and how to use the Auxiliary table plan for DCR-based custom tables, including plan-specific behavior and constraints for ingestion and retention. This is concrete, SKU/plan-specific guidance that helps choose and configure a cost-optimized option.",unchanged
https://learn.microsoft.com/en-us/azure/azure-monitor/logs/cross-workspace-query,Query data across workspaces and resources,Query across resources with Azure Monitor - Azure Monitor,Run cross-resource queries in Azure Monitor Logs,"Query and correlate data from multiple Log Analytics workspaces, applications, or resources using the `workspace()`, `app()`, and `resource()` Kusto Query Language (KQL) expressions.","There are two ways to query data from multiple workspaces, applications, and resources: This article explains how to use the explicit workspace(), app(), and resource() expressions to query data from multiple Log Analytics workspaces, applications, and resources. When you manage subscriptions in other Microsoft Entra tenants with Azure Lighthouse, Log Analytics workspaces created in those customer tenants are available to use in your cross-workspace queries. 
Important If you're using a workspace-based A",2025-10-07T17:17:00.000Z,how-to,configuration,0.75,True,"Documents workspace(), app(), resource() KQL expressions and cross-tenant behavior; includes a hard limit on number of resources per query.",unchanged
https://learn.microsoft.com/en-us/azure/azure-monitor/logs/custom-logs-migrate,Migrate from Data Collector API,Migrate from the HTTP Data Collector API to the Log Ingestion API - Azure Monitor,Migrate from HTTP Data Collector API to Logs Ingestion API,"Migrate from the legacy Azure Monitor Data Collector API to the Log Ingestion API, which provides more processing power and greater flexibility.","The Azure Monitor Log Ingestion API provides more processing power and greater flexibility in ingesting logs and managing tables than the legacy HTTP Data Collector API. This article describes the differences between the Data Collector API and the Log Ingestion API and provides guidance and best practices for migrating to the new Log Ingestion API. Note As a Microsoft MVP, Morten Waltorp Knudsen contributed to and provided material feedback for this article. For an example of how you can automate the s",2024-09-11T16:50:00.000Z,how-to,decision-making,0.8,True,Compares legacy Data Collector vs Logs Ingestion API with migration guidance and best practices; helps choose and plan migration with product-specific trade-offs.,unchanged
-https://learn.microsoft.com/en-us/azure/azure-monitor/logs/customer-managed-keys,Customer-managed keys,Azure Monitor customer-managed keys - Azure Monitor,Configure customer-managed keys for Azure Monitor Logs,Information and steps to configure Customer-managed key to encrypt data in your Log Analytics workspaces using an Azure Key Vault key.,"Azure Monitor encrypts data by using Microsoft-managed keys. You can use your own encryption key to protect data in your workspaces. By using customer-managed keys in Azure Monitor, you control the encryption key lifecycle and access to logs. 
After you set up customer-managed keys, new data ingested to linked workspaces is encrypted by using your key inAzure Key VaultorAzure Key Vault Managed HSM(Hardware Security Module).",2026-03-10T08:00:00.000Z,how-to,security,0.8,True,"Provides product-specific steps and settings to enable CMK encryption for Log Analytics workspaces using Azure Key Vault or Managed HSM, including workspace linkage behavior and key management details. This is concrete security configuration, not just conceptual.",unchanged
+https://learn.microsoft.com/en-us/azure/azure-monitor/logs/customer-managed-keys,Customer-managed keys,Azure Monitor customer-managed keys - Azure Monitor,Configure customer-managed keys for Azure Monitor Logs,Information and steps to configure Customer-managed key to encrypt data in your Log Analytics workspaces using an Azure Key Vault key.,"Azure Monitor encrypts data by using Microsoft-managed keys. You can use your own encryption key to protect data in your workspaces. By using customer-managed keys in Azure Monitor, you control the encryption key lifecycle and access to logs. After you set up customer-managed keys, new data ingested to linked workspaces is encrypted by using your key in Azure Key Vault or Azure Key Vault Managed HSM (Hardware Security Module).",2026-04-09T08:00:00.000Z,how-to,security,0.8,True,"Customer-managed keys setup for Azure Monitor Logs involves specific Azure roles, Key Vault/Managed HSM configuration parameters, workspace linkage behavior, and encryption-scope details. 
These are product-specific security and encryption configuration steps (RBAC roles, key URIs, workspace settings) that qualify as expert security knowledge beyond generic concepts.",updated
https://learn.microsoft.com/en-us/azure/azure-monitor/logs/daily-cap,Set a daily cap,Set daily cap on Log Analytics workspace - Azure Monitor,Configure daily ingestion caps for Log Analytics workspaces,Set a,A daily cap on a Log Analytics workspace allows you to reduce unexpected increases in charges for data ingestion by stopping collection of billable log data for tables in the Analytics or Basic table plans for the rest of a 24-hour period whenever your specified threshold is reached. Tables in the Auxiliary table plan are not subject to any daily cap. Important The daily cap feature in Azure Monitor should not be used as a primary mechanism to filter or reduce data before ingestion into a Log Analyt,2025-08-29T08:00:00.000Z,how-to,limits-quotas,0.88,True,"Daily cap configuration involves specific numeric thresholds, behavior when limits are reached, and interactions with table plans, which are concrete limit/quota details.",unchanged
https://learn.microsoft.com/en-us/azure/azure-monitor/logs/data-collection-troubleshoot,Troubleshoot,Troubleshoot why data is no longer being collected in Azure Monitor - Azure Monitor,Troubleshoot stopped data collection in Azure Monitor Logs,Steps to take if data is no longer being collected in Log Analytics workspace in Azure Monitor.,"This article explains how to detect when data collection in Azure Monitor stops and details steps you can take to address data collection issues. 
Important If you're troubleshooting data collection for a scenario that uses a data collection rule (DCR) such as Azure Monitor agent or Logs ingestion API, see Monitor and troubleshoot DCR data collection in Azure Monitor for additional troubleshooting information.",2024-09-11T16:50:00.000Z,troubleshooting-general,troubleshooting,0.88,True,"Explicit troubleshooting article; these typically map symptoms to causes and resolutions, and include product-specific checks, error messages, and diagnostic steps.",unchanged
https://learn.microsoft.com/en-us/azure/azure-monitor/logs/data-ingestion-time,Log data ingestion time,Log data ingestion time in Azure Monitor - Azure Monitor,Understand Azure Monitor log data ingestion latency,This article explains the different factors that affect latency in collecting log data in Azure Monitor.,"Azure Monitor is a high-scale data service that sends terabytes of data each month and continues to grow. Under normal service operations, the time it takes for log data to become available after collection is predictable and consistent. This article explains the factors that affect this latency.",2026-03-05T18:17:00.000Z,concept-article,limits-quotas,0.7,True,"Explains ingestion time characteristics and factors affecting latency for Azure Monitor Logs. 
Such an article typically includes concrete latency ranges/targets and behavior under normal operations, which are numeric service characteristics not known generically.",unchanged
@@ -359,7 +361,7 @@ https://learn.microsoft.com/en-us/azure/azure-monitor/logs/log-powerbi,Power BI,
https://learn.microsoft.com/en-us/azure/azure-monitor/logs/log-query-overview,Overview,Log queries in Azure Monitor - Azure Monitor,Azure Monitor–specific KQL query reference,This reference information for Kusto Query Language used by Azure Monitor includes elements specific to Azure Monitor and elements not supported in Azure Monitor log queries.,"Azure Monitor Logs is based on Azure Data Explorer and uses the same Kusto Query Language (KQL) to write log queries. This rich language is designed to be easy to read and author, which allows you to start writing queries with minimal guidance. Areas in Azure Monitor where you use queries include: Important Since July 1, 2025, querying log data and events requires TLS 1.2 or higher when using query API endpoints for Log Analytics or Application Insights. For more information, see Secure data in transit",2025-08-29T08:00:00.000Z,concept-article,configuration,0.65,True,"Reference for KQL elements specific to Azure Monitor, including unsupported features and TLS requirements; product-specific query behavior.",unchanged
https://learn.microsoft.com/en-us/azure/azure-monitor/logs/log-standard-columns,Standard columns,Standard columns in Azure Monitor log records - Azure Monitor,Use standard columns in Azure Monitor log records,Describes columns that are common to multiple data types in Azure Monitor logs.,"Data in Azure Monitor Logs is stored as a set of records in either a Log Analytics workspace or Application Insights application, each with a particular data type that has a unique set of columns. Many data types have standard columns that are common across multiple types. 
This article describes these columns and provides examples of how you can use them in queries. Workspace-based applications in Application Insights store their data in a Log Analytics workspace and use the same standard columns",2024-09-11T16:50:00.000Z,concept-article,configuration,0.7,True,Documents standard column names and semantics across data types; essential schema-level configuration knowledge unique to Azure Monitor.,unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-data-export,Workspace data export,Log Analytics workspace data export in Azure Monitor - Azure Monitor,Configure continuous data export from Log Analytics,Log Analytics workspace data export in Azure Monitor lets you continuously export data per selected tables in your workspace. You can export to an Azure Storage Account or Azure Event Hubs as it's col,Data export in a Log Analytics workspace lets you continuously export data per selected tables in your workspace. You can export to an Azure Storage Account or Azure Event Hubs as the data arrives to an Azure Monitor pipeline. This article provides details on this feature and steps to configure data export in your workspaces.,2026-02-16T08:00:00.000Z,how-to,configuration,0.75,True,"Details export configuration per table to Storage/Event Hubs, including settings and constraints unique to Log Analytics data export.",unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-dedicated-clusters,Dedicated clusters,Azure Monitor Logs Dedicated Clusters - Azure Monitor,Plan and use Azure Monitor Logs dedicated clusters,Customers meeting the minimum commitment tier could use dedicated clusters,"A dedicated cluster in Azure Monitor provides advanced security and control capabilities, and cost optimization. 
You can link new or existing workspaces to a dedicated cluster without interrupting ingestion and query operations.",2026-03-05T13:06:00.000Z,how-to,decision-making,0.65,True,"Covers when to use dedicated clusters for Azure Monitor Logs, including minimum commitment tier and workspace-linking behavior. This is service- and SKU-specific selection guidance beyond generic concepts.",unchanged
+https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-dedicated-clusters,Dedicated clusters,Azure Monitor Logs Dedicated Clusters - Azure Monitor,Decide and configure Azure Monitor dedicated clusters,Customers meeting the minimum commitment tier could use dedicated clusters,"A dedicated cluster in Azure Monitor provides advanced security and control capabilities, and cost optimization. You can link new or existing workspaces to a dedicated cluster without interrupting ingestion and query operations.",2026-04-09T08:00:00.000Z,how-to,decision-making,0.7,True,"Dedicated clusters documentation typically includes minimum commitment tier values, cost and capacity thresholds, and guidance on when to use a dedicated cluster versus standard workspaces. This is concrete, product-specific decision guidance (when to choose dedicated clusters, how commitments affect cost and behavior), which an LLM is unlikely to know in detail from training.",updated
https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-export-logic-app,Export with Logic Apps,Export data from a Log Analytics workspace to a storage account by using Logic Apps - Azure Monitor,Export Log Analytics data to Storage using Logic Apps,This article describes a method to use Azure Logic Apps to query data from a Log Analytics workspace and send it to Azure Storage.,This article describes a method to use Azure Logic Apps to query data from a Log Analytics workspace in Azure Monitor and send it to Azure Storage. 
Use this process when you need to export your Azure Monitor Logs data for auditing and compliance scenarios or to allow another service to retrieve this data.,2024-09-11T16:50:00.000Z,how-to,integrations,0.7,True,"Integration pattern with Logic Apps, including connectors, query configuration, and payload handling specific to Log Analytics and Azure Storage.",unchanged
https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-ingestion-api-overview,Overview,Logs Ingestion API in Azure Monitor - Azure Monitor,,Send data to a Log Analytics workspace using REST API or client libraries.,"The Logs Ingestion API in Azure Monitor lets you send data to a Log Analytics workspace using either a REST API call or client libraries. The API allows you to send data to supported Azure tables or to custom tables that you create. You can also extend the schema of Azure tables with custom columns to accept additional data. Important Starting March 1, 2026, the Logs Ingestion API will enforce TLS 1.2 or higher connections. For more information, see Secure Logs data in transit.",2026-03-19T06:04:00.000Z,concept-article,,0.2,False,"High-level overview of the Logs Ingestion API without visible numeric limits, configuration tables, error-code mappings, or other product-specific expert details; primarily conceptual description of capabilities.",unchanged
https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-table-plans,Select a table plan,Select a table plan based on data usage in a Log Analytics workspace - Azure Monitor,Choose Azure Monitor Logs table plans by usage,"Use the Auxiliary, Basic, and Analytics Logs plans to reduce costs and take advantage of advanced analytics capabilities in Azure Monitor Logs.","You can use one Log Analytics workspace to store any type of log required for any purpose. For example: Table plans let you manage data costs based on how often you use the data in a table and the type of analysis you need the data for. 
This article explains how to set a table's plan. For information about what each table plan offers and which use cases it's optimal for, see Table plans.",2025-08-29T08:00:00.000Z,how-to,decision-making,0.7,True,"Table-plan selection guidance with plan-specific behaviors and use cases; likely includes comparison details and concrete criteria for when to use Auxiliary, Basic, or Analytics.",unchanged
@@ -2011,7 +2013,7 @@ https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/microsoft
https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/microsoft-storage_storageaccounts,microsoft.storage/storageaccounts,Azure Monitor tables for microsoft.storage/storageaccounts - Azure Monitor,Understand Azure Storage accounts logs and metrics schema,Azure Monitor tables for resource type microsoft.storage/storageaccounts,Entries from the Azure Activity log that provides insight into any subscription-level or management group level events that have occurred in Azure. Metric data emitted by Azure services that measure their health and performance. Storage Blob Service Logs Schema Storage File Service Logs Schema Storage Queue Service Logs Schema Storage Table Service Logs Schema,2026-01-27T08:00:00.000Z,generated-reference,configuration,0.84,True,"Covers multiple schemas (activity, metrics, blob/file/queue/table logs) for microsoft.storage/storageaccounts, including service-specific log structures.",unchanged
https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/microsoft-storagecache_amlfilesytems,microsoft.storagecache/amlfilesytems,Azure Monitor tables for microsoft.storagecache/amlfilesytems - Azure Monitor,Use Azure Monitor tables for Azure Managed Lustre logs,Azure Monitor tables for resource type microsoft.storagecache/amlfilesytems,This table contains audit logs retrieved from your Azure Managed Lustre filesystem resource. These logs capture all privileged operations performed on each Azure Managed Lustre resource. 
They can be used to monitor events and configure alerts on your resource. Entries from the Azure Activity log that provides insight into any subscription-level or management group level events that have occurred in Azure. Metric data emitted by Azure services that measure their health and performance.,2026-01-27T08:00:00.000Z,generated-reference,configuration,0.78,True,"Defines audit, activity, and metrics table schemas for microsoft.storagecache/amlfilesytems.",unchanged https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/microsoft-storagecache_caches,microsoft.storagecache/caches,Azure Monitor tables for microsoft.storagecache/caches - Azure Monitor,Interpret Azure HPC Cache API and event logs schema,Azure Monitor tables for resource type microsoft.storagecache/caches,Logs for Azure HPC Cache API requests. Logs for Azure HPC Cache firmware upgrade events. Logs for Azure HPC Cache warning events.,2026-01-27T08:00:00.000Z,generated-reference,configuration,0.78,True,"Documents log tables for microsoft.storagecache/caches, including API requests, firmware upgrades, and warnings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/microsoft-storagemover_storagemovers,microsoft.storagemover/storagemovers,Azure Monitor tables for microsoft.storagemover/storagemovers - Azure Monitor,Use Azure Monitor tables for Storage Mover logs,Azure Monitor tables for resource type microsoft.storagemover/storagemovers,Entries from the Azure Activity log that provides insight into any subscription-level or management group level events that have occurred in Azure. Metric data emitted by Azure services that measure their health and performance. Audit logs for storage mover and its child resources. The result logs generated during the execution of Storage Mover job runs where the transfer result is 'Failed'. The logs include the details of the scanned items and their transfer result. 
The result logs generated du,2026-01-27T23:14:00.000Z,generated-reference,configuration,0.78,True,"Azure Monitor table reference pages list product-specific schema details (table names, column names, data types, and semantics) for microsoft.storagemover/storagemovers logs and metrics. This is expert configuration/telemetry knowledge that an LLM wouldn't reliably know from training and is used to correctly configure queries and monitoring for this resource type.",updated +https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/microsoft-storagemover_storagemovers,microsoft.storagemover/storagemovers,Azure Monitor tables for microsoft.storagemover/storagemovers - Azure Monitor,Use Azure Monitor tables for Storage Mover logs,Azure Monitor tables for resource type microsoft.storagemover/storagemovers,Entries from the Azure Activity log that provides insight into any subscription-level or management group level events that have occurred in Azure. Metric data emitted by Azure services that measure their health and performance. Audit logs for storage mover and its child resources. The result logs generated during the execution of Storage Mover job runs where the transfer result is 'Failed'. The logs include the details of the scanned items and their transfer result. The result logs generated du,2026-01-27T23:14:00.000Z,generated-reference,configuration,0.78,True,"Azure Monitor table reference pages list product-specific schema details (table names, column names, data types, and semantics) for microsoft.storagemover/storagemovers logs and metrics. 
This is expert configuration/telemetry knowledge that an LLM wouldn't reliably know from training and is used to correctly configure queries and monitoring for this resource type.",unchanged
https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/microsoft-streamanalytics_streamingjobs,microsoft.streamanalytics/streamingjobs,Azure Monitor tables for microsoft.streamanalytics/streamingjobs - Azure Monitor,Monitor Stream Analytics jobs with Azure Monitor tables,Azure Monitor tables for resource type microsoft.streamanalytics/streamingjobs,Entries from the Azure Activity log that provides insight into any subscription-level or management group level events that have occurred in Azure. Metric data emitted by Azure services that measure their health and performance.,2026-01-27T08:00:00.000Z,generated-reference,configuration,0.74,True,Provides schema for activity and metrics tables for microsoft.streamanalytics/streamingjobs.,unchanged
https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/microsoft-synapse_workspaces,microsoft.synapse/workspaces,Azure Monitor tables for microsoft.synapse/workspaces - Azure Monitor,Use Azure Monitor tables for Synapse workspace logs,Azure Monitor tables for resource type microsoft.synapse/workspaces,"Entries from the Azure Activity log that provides insight into any subscription-level or management group level events that have occurred in Azure. Metric data emitted by Azure services that measure their health and performance. Azure Synapse SQL Audit Log. Information about ended Apache Spark applications. Ended Azure Synapse built-in serverless SQL requests. Azure data explorer synapse command execution summary. Logs include DatabaseName, State, Duration that can be used for monitoring the com",2026-01-27T08:00:00.000Z,generated-reference,configuration,0.82,True,"Covers multiple schemas (activity, metrics, SQL audit, Spark apps, serverless SQL, Kusto commands) for microsoft.synapse/workspaces, which is detailed, product-specific configuration.",unchanged
https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/microsoft-timeseriesinsights_environments,microsoft.timeseriesinsights/environments,Azure Monitor tables for microsoft.timeseriesinsights/environments - Azure Monitor,Use Azure Monitor tables for Time Series Insights environments,Azure Monitor tables for resource type microsoft.timeseriesinsights/environments,Entries from the Azure Activity log that provides insight into any subscription-level or management group level events that have occurred in Azure. Metric data emitted by Azure services that measure their health and performance. The Ingress category tracks errors that occur in the ingress pipeline. This category includes errors that occur when receiving events (such as failures to connect to an Event Source) and processing events (such as errors when parsing an event payload).,2026-01-27T08:00:00.000Z,generated-reference,configuration,0.78,True,"Azure Monitor table reference pages list schema columns, data types, and semantics for logs specific to a resource type. This is product-specific configuration/telemetry knowledge (table/field names and meanings) that isn't generally known from training and is needed to correctly query and interpret microsoft.timeseriesinsights/environments logs.",unchanged
diff --git a/products/azure-monitor/report.md b/products/azure-monitor/report.md
index 44bcf83f..9eb19a10 100644
--- a/products/azure-monitor/report.md
+++ b/products/azure-monitor/report.md
@@ -1,29 +1,29 @@
 ---
-generated_at: '2026-04-19'
+generated_at: '2026-04-26'
 category_descriptions:
-  troubleshooting: 'Diagnosing and fixing Azure Monitor issues: agent/extension health,
-    data collection and ingestion failures, alerts/ITSM, Application Insights, containers/Prometheus,
-    workbooks, and VM performance.'
-  configuration: 'Configuring Azure Monitor end to end: agents, DCRs, pipelines, networking,
-    alerts, autoscale, Application Insights, Kubernetes/Prometheus, Private Link,
-    logs/metrics schemas, and resource‑specific metrics/logs.'
+  troubleshooting: Diagnosing and fixing Azure Monitor data collection, agent, alert,
+    ingestion, and workbook issues across VMs, containers, pipelines, Application
+    Insights, ITSM, and Copilot observability.
+  configuration: 'Configuring Azure Monitor end to end: agents, DCRs, pipelines, alerts,
+    autoscale, workbooks, Private Link, pricing, and detailed schemas for logs, metrics,
+    and tables across Azure and partner services.'
   deployment: Deploying and migrating Azure Monitor agents/resources at scale (VMs,
     Arc, diagnostics, alerts, Profiler, workspaces, Grafana) using Policy, ARM, CLI,
     and PowerShell
-  decision-making: Guidance for choosing Azure Monitor options and planning migrations
-    (agents, alerts, metrics, logs, SCOM, Prometheus, Splunk), plus cost, billing,
-    and visualization decisions.
-  limits-quotas: Limits, performance, and scaling behavior for Azure Monitor logs,
-    metrics, agents, autoscale, Prometheus, Container Insights, Workbooks, and per‑resource
-    metric definitions.
+  decision-making: Guidance for choosing Azure Monitor features, costs, and tools,
+    and step‑by‑step migrations from legacy agents, alerts, APIs, SCOM, Splunk, Prometheus,
+    and VM monitoring to modern Azure Monitor options
+  limits-quotas: Limits, quotas, performance, and scaling behavior for Azure Monitor
+    logs, metrics, agents, autoscale, Prometheus, Container Insights, Workbooks, and
+    resource‑specific metric coverage.
   integrations: Integrating Azure Monitor with apps, alerts, ITSM, Prometheus/Grafana,
     REST/CLI, and using KQL patterns to query, export, and analyze logs/metrics across
     many Azure and third‑party services
-  best-practices: Best practices for configuring, scaling, querying, and optimizing
-    Azure Monitor (logs, metrics, alerts, autoscale, AKS/VMs, Prometheus, multicloud)
-    for performance, reliability, and cost.
-  security: 'Securing Azure Monitor: auth (Entra, RBAC, keys), network (NSP, firewalls,
-    Private Link, TLS), ITSM/webhooks, Container/Prometheus/Grafana access, and security/audit
+  best-practices: Best practices for Azure Monitor alerts, autoscale, AKS/VM monitoring,
+    logs/metrics performance and cost, telemetry, Prometheus/PromQL, multicloud, and
+    operational excellence.
+  security: 'Securing Azure Monitor and related services: auth (Entra, RBAC, keys),
+    network/TLS/NSP/Private Link, secure webhooks, ITSM, workbooks, and detailed security/audit
     log schemas and analysis.'
   architecture-patterns: 'Designing Azure Monitor architectures: enterprise-wide layouts,
     Private Link network patterns, choosing single vs multiple workspaces, and using
@@ -31,32 +31,32 @@ category_descriptions:
 skill_description: Expert knowledge for Azure Monitor development including troubleshooting,
   best practices, decision making, architecture & design patterns, limits & quotas,
   security, configuration, integrations & coding patterns, and deployment. Use when
-  configuring Log Analytics, Application Insights, DCRs/agents, Prometheus/Grafana,
-  or Azure Monitor alerts, and other Azure Monitor related development tasks. Not
-  for Azure Network Watcher (use azure-network-watcher), Azure Service Health (use
-  azure-service-health), Azure Defender For Cloud (use azure-defender-for-cloud),
-  Azure Security (use azure-security).
-use_when: Use when configuring Log Analytics, Application Insights, DCRs/agents, Prometheus/Grafana,
-  or Azure Monitor alerts, and other Azure Monitor related development tasks.
-confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), Azure
-  Service Health (use azure-service-health), Azure Defender For Cloud (use azure-defender-for-cloud),
-  Azure Security (use azure-security).
+  configuring Log Analytics, Application Insights, DCRs/agents, alerts/autoscale,
+  or Prometheus/Grafana monitoring, and other Azure Monitor related development tasks.
+  Not for Azure AI Metrics Advisor (use azure-metrics-advisor), Azure Network Watcher
+  (use azure-network-watcher), Azure Service Health (use azure-service-health), Azure
+  Defender For Cloud (use azure-defender-for-cloud).
+use_when: Use when configuring Log Analytics, Application Insights, DCRs/agents, alerts/autoscale,
+  or Prometheus/Grafana monitoring, and other Azure Monitor related development tasks.
+confusable_not_for: Not for Azure AI Metrics Advisor (use azure-metrics-advisor),
+  Azure Network Watcher (use azure-network-watcher), Azure Service Health (use azure-service-health),
+  Azure Defender For Cloud (use azure-defender-for-cloud).
 ---
 
 # Azure Monitor Crawl Report
 
 ## Summary
 
-- **Total Pages**: 2371
-- **Fetched**: 2371
+- **Total Pages**: 2373
+- **Fetched**: 2373
 - **Fetch Failed**: 0
-- **Classified**: 1839
-- **Unclassified**: 532
+- **Classified**: 1842
+- **Unclassified**: 531
 
 ### Incremental Update
 
-- **New Pages**: 0
-- **Updated Pages**: 12
-- **Unchanged**: 2359
-- **Deleted Pages**: 0
+- **New Pages**: 8
+- **Updated Pages**: 11
+- **Unchanged**: 2354
+- **Deleted Pages**: 6
 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-monitor/azure-monitor.csv`
 
 ## Classification Statistics
 
@@ -64,44 +64,62 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A
 | Type | Count | Percentage |
 |------|-------|------------|
 | architecture-patterns | 4 | 0.2% |
-| best-practices | 33 | 1.4% |
-| configuration | 1281 | 54.0% |
-| decision-making | 34 | 1.4% |
+| best-practices | 34 | 1.4% |
+| configuration | 1280 | 53.9% |
+| decision-making | 35 | 1.5% |
 | deployment | 17 | 0.7% |
 | integrations | 308 | 13.0% |
-| limits-quotas | 64 | 2.7% |
+| limits-quotas | 65 | 2.7% |
 | security | 52 | 2.2% |
-| troubleshooting | 46 | 1.9% |
-| *(Unclassified)* | 532 | 22.4% |
+| troubleshooting | 47 | 2.0% |
+| *(Unclassified)* | 531 | 22.4% |
 
 ## Changes
 
+### New Pages
+
+- [FAQ](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-faq)
+- [Configure Azure Monitor pipeline](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure)
+- [Configure with the Azure portal](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure-portal)
+- [Configure with CLI or ARM templates](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure-cli)
+- [Configure a Kubernetes gateway](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-kubernetes-gateway)
+- [Automated certificate management (default TLS)](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-tls-automated)
+- [Sizing best practices](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-sizing)
+- [Troubleshoot](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-troubleshoot)
+
 ### Updated Pages
 
-- [Get started with OpenTelemetry](https://learn.microsoft.com/en-us/azure/azure-monitor/app/opentelemetry-enable)
-  - Updated: 2026-03-13T05:06:00.000Z → 2026-04-08T08:00:00.000Z
-- [Configure OpenTelemetry](https://learn.microsoft.com/en-us/azure/azure-monitor/app/opentelemetry-configuration)
-  - Updated: 2026-04-01T06:03:00.000Z → 2026-04-17T22:09:00.000Z
-- [Migrate to OpenTelemetry](https://learn.microsoft.com/en-us/azure/azure-monitor/app/migrate-to-opentelemetry)
-  - Updated: 2026-03-30T22:11:00.000Z → 2026-04-08T08:00:00.000Z
-- [Spring Boot](https://learn.microsoft.com/en-us/azure/azure-monitor/app/java-spring-boot)
-  - Updated: 2025-09-26T08:00:00.000Z → 2026-04-08T08:00:00.000Z
-- [Get started (supplemental)](https://learn.microsoft.com/en-us/azure/azure-monitor/app/java-get-started-supplemental)
-  - Updated: 2025-09-26T17:15:00.000Z → 2026-04-08T08:00:00.000Z
-- [Configuration (supplemental)](https://learn.microsoft.com/en-us/azure/azure-monitor/app/java-standalone-config)
-  - Updated: 2026-04-01T06:03:00.000Z → 2026-04-08T08:00:00.000Z
-- [AKS clusters](https://learn.microsoft.com/en-us/azure/azure-monitor/containers/kubernetes-monitoring-enable)
-  - Updated: 2025-09-16T22:14:00.000Z → 2026-04-14T06:09:00.000Z
-- [Managed identity authentication](https://learn.microsoft.com/en-us/azure/azure-monitor/containers/prometheus-remote-write-managed-identity)
-  - Updated: 2025-10-01T05:08:00.000Z → 2026-04-16T22:09:00.000Z
-- [Ingest OTLP signals with AMA (Preview)](https://learn.microsoft.com/en-us/azure/azure-monitor/containers/opentelemetry-ingest-agent)
-  - Updated: 2026-04-08T22:12:00.000Z → 2026-04-15T22:11:00.000Z
-- [Entra Workload ID authentication](https://learn.microsoft.com/en-us/azure/azure-monitor/containers/prometheus-remote-write-azure-workload-identity)
-  - Updated: 2026-04-07T22:14:00.000Z → 2026-04-16T22:09:00.000Z
-- [App Center migration (Preview)](https://learn.microsoft.com/en-us/azure/azure-monitor/app/app-center-migration)
-  - Updated: 2026-03-18T06:05:00.000Z → 2026-04-14T22:14:00.000Z
-- [microsoft.storagemover/storagemovers](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/microsoft-storagemover_storagemovers)
-  - Updated: 2026-01-27T08:00:00.000Z → 2026-01-27T23:14:00.000Z
+- [Create](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-transformations-create)
+  - Updated: 2026-01-20T08:00:00.000Z → 2026-04-22T08:00:00.000Z
+- [Log Analytics gateway](https://learn.microsoft.com/en-us/azure/azure-monitor/agents/gateway)
+  - Updated: 2026-04-09T08:00:00.000Z → 2026-04-22T08:00:00.000Z
+- [Container insights](https://learn.microsoft.com/en-us/azure/azure-monitor/containers/container-insights-troubleshoot)
+  - Updated: 2025-01-29T08:00:00.000Z → 2026-04-24T22:11:00.000Z
+- [Overview](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-overview)
+  - Updated: 2026-03-05T06:34:00.000Z → 2026-04-21T17:12:00.000Z
+- [Transformations](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-transformations)
+  - Updated: 2026-02-03T23:12:00.000Z → 2026-04-21T17:12:00.000Z
+- [Configure TLS](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-tls)
+  - Updated: 2026-03-03T13:04:00.000Z → 2026-04-21T17:12:00.000Z
+- [Customer-managed certificates (BYOC)](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-tls-custom)
+  - Updated: 2026-03-03T13:04:00.000Z → 2026-04-21T17:12:00.000Z
+- [Pod placement](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-pod-placement)
+  - Updated: 2026-02-25T23:09:00.000Z → 2026-04-21T17:12:00.000Z
+- [Extension versions](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-extension-versions)
+  - Updated: 2026-03-18T06:05:00.000Z → 2026-04-21T17:12:00.000Z
+- [Dedicated clusters](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-dedicated-clusters)
+  - Updated: 2026-03-05T13:06:00.000Z → 2026-04-09T08:00:00.000Z
+- [Customer-managed keys](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/customer-managed-keys)
+  - Updated: 2026-03-10T08:00:00.000Z → 2026-04-09T08:00:00.000Z
+
+### Deleted Pages
+
+- ~~Configuration overview~~ (https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure)
+- ~~CLI and templates~~ (https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure-cli)
+- ~~Configure clients~~ (https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure-clients)
+- ~~Azure portal~~ (https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure-portal)
+- ~~Sample gateway setup~~ (https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-kubernetes-gateway)
+- ~~Automated certificate management (Default TLS)~~ (https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-tls-automated)
 
 ## Classified Pages
 
@@ -146,7 +164,7 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A
 | [AWSELBFlowLogs](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/awselbflowlogs) | configuration | 0.86 | The AWSELBFlowLogs table reference will enumerate the precise columns, data types, and field semantics for ELB flow logs ingested into Azure Monitor/Microsoft Sentinel. These details are specific to this connector and table and are not generic knowledge.
This aligns best with configuration, as it documents the exact log schema used to configure queries, alerts, and analytics. |
 | [AWSNLBAccessLogs](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/awsnlbaccesslogs) | configuration | 0.86 | The AWSNLBAccessLogs table reference will provide the detailed schema (column names, types, and meanings) for NLB access logs ingested into Azure Monitor/Sentinel. This is expert, product-specific information used to configure and write correct KQL queries, fitting the configuration sub-skill rather than limits, troubleshooting, or decision-making. |
 | [Agent management](https://learn.microsoft.com/en-us/azure/azure-monitor/containers/container-insights-manage-agent) | configuration | 0.86 | Details maintenance operations, upgrade steps, and settings (like disabling env var collection) for the containerized Log Analytics agent; highly product-specific configuration. |
-| [Container insights](https://learn.microsoft.com/en-us/azure/azure-monitor/containers/container-insights-troubleshoot) | troubleshooting | 0.86 | Provides symptom-based troubleshooting for Container Insights with product-specific checks, commands, and causes/solutions. |
+| [Container insights](https://learn.microsoft.com/en-us/azure/azure-monitor/containers/container-insights-troubleshoot) | troubleshooting | 0.86 | The page is explicitly a troubleshooting guide for Container insights log collection, organized around specific issues and their resolutions. It typically includes product-specific error symptoms, causes, and remediation steps for Azure Monitor and Kubernetes clusters, which qualify as expert troubleshooting knowledge beyond generic debugging advice. |
 | [ContainerNetworkLogs](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/containernetworklogs) | configuration | 0.86 | Azure Monitor Logs reference pages for specific tables typically list all columns, data types, and semantics for that table. This is product-specific schema/configuration knowledge (column names, types, meanings) that an LLM is unlikely to know from training and is needed to correctly query and interpret ContainerNetworkLogs. It fits best under configuration because it defines structured fields and their allowed/expected values rather than limits, patterns, or troubleshooting. |
 | [CopilotActivity](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/copilotactivity) | configuration | 0.86 | The CopilotActivity table reference will enumerate columns, data types, and field semantics for AI/Copilot audit logs. These are detailed, product-specific schema definitions that qualify as expert configuration knowledge for building KQL queries and analytics. This aligns with configuration (parameter/field definitions) rather than limits, patterns, or troubleshooting. |
 | [CrowdStrikeCases](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/crowdstrikecases) | configuration | 0.86 | The CrowdStrikeCases table reference describes the exact schema (column names, types, meanings) for logs ingested from the CrowdStrike Cases API into Sentinel. This is specialized schema/configuration information needed to correctly query and correlate these logs, and is unlikely to be known generically by an LLM. It best fits configuration because it documents structured fields and their usage. |
@@ -174,12 +192,10 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A
 | [Troubleshooting](https://learn.microsoft.com/en-us/azure/azure-monitor/agents/diagnostics-extension-troubleshooting) | troubleshooting | 0.86 | Explicit troubleshooting article; these typically map specific WAD errors/symptoms to causes and resolutions, including error messages, log locations, and diagnostic steps unique to Azure Diagnostics. |
 | [Windows diagnostics extension schema](https://learn.microsoft.com/en-us/azure/azure-monitor/agents/diagnostics-extension-schema-windows) | configuration | 0.86 | Details the configuration schema for the Windows diagnostics extension, including element/attribute names and allowed values. This is product-specific configuration reference that an LLM would not know without the schema documentation. |
 | [WireData](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/wiredata) | configuration | 0.86 | WireData table reference defines the schema for network data collected by the Dependency and Log Analytics agents. Column-level details are expert configuration knowledge. |
-| [Automated certificate management (Default TLS)](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-tls-automated) | security | 0.85 | Provides detailed guidance for the default TLS option with automated certificate management, including product-specific security settings and configuration parameters for Azure Monitor pipeline. |
 | [Best practices](https://learn.microsoft.com/en-us/azure/azure-monitor/aiops/observability-agent-best-practices) | best-practices | 0.85 | Explicit best-practices article for configuring Application Insights and OpenTelemetry so investigations are accurate; contains product-specific recommendations and likely concrete configuration patterns. |
 | [Best practices](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-rule-best-practices) | best-practices | 0.85 | Explicitly a best-practices article for DCR creation/management with product-specific recommendations and likely concrete configuration patterns. |
 | [Configure BYOS (Bring Your Own Storage)](https://learn.microsoft.com/en-us/azure/azure-monitor/profiler/profiler-bring-your-own-storage) | configuration | 0.85 | BYOS setup requires specific storage account settings, connection strings, and possibly required permissions; these are concrete configuration parameters and security-related details. |
 | [Configure OpenTelemetry](https://learn.microsoft.com/en-us/azure/azure-monitor/app/opentelemetry-configuration) | configuration | 0.85 | Explicitly about configuring OTel in Application Insights, likely listing settings such as connection string keys, sampling configuration options, environment variables, and their allowed values. This matches configuration: product-specific parameters and options, not just conceptual guidance. |
-| [Customer-managed certificates (BYOC)](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-tls-custom) | security | 0.85 | Covers BYOC/customer-managed certificates for TLS/mTLS with Azure Monitor pipeline, including product-specific certificate requirements and configuration steps that constitute expert security configuration knowledge. |
 | [Default configuration](https://learn.microsoft.com/en-us/azure/azure-monitor/containers/prometheus-metrics-scrape-default) | configuration | 0.85 | Lists default targets, dashboards, and recording rules; these are detailed configuration defaults and metric rules unique to the product. |
 | [Enable Snapshot Debugger for a Function App](https://learn.microsoft.com/en-us/azure/azure-monitor/snapshot-debugger/snapshot-debugger-function-app) | configuration | 0.85 | Requires editing host.json and possibly adding specific configuration keys/values; also includes tier recommendations (Basic or higher), which are concrete product-specific settings. |
 | [Heartbeat](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/heartbeat) | integrations | 0.85 | Includes multiple Kusto queries over Heartbeat for counts, last heartbeat, latency spikes, and stopped resources, using detailed schema fields and time filters. |
@@ -197,6 +213,8 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A
 | [Profiler](https://learn.microsoft.com/en-us/azure/azure-monitor/profiler/profiler-troubleshooting) | troubleshooting | 0.85 | Dedicated troubleshooting guide; expected to list Profiler-specific symptoms, diagnostic steps, and fixes, possibly including error messages and configuration checks. |
 | [Response caching](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/api/cache) | limits-quotas | 0.85 | Explicitly states default cache duration of 2 minutes for responses; numeric limit relevant to query behavior. |
 | [Send to multiple metric workspaces](https://learn.microsoft.com/en-us/azure/azure-monitor/containers/prometheus-metrics-multiple-workspaces) | configuration | 0.85 | Describes data collection rules for sending different metric sets to different workspaces; includes DCR schema and routing configuration parameters. |
+| [Sizing best practices](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-sizing) | best-practices | 0.85 | Explicitly described as best practices for sizing with per-vCPU baselines, scaling strategies, and capacity planning examples; includes performance data and quantitative guidance unique to this product. |
+| [Troubleshoot](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-troubleshoot) | troubleshooting | 0.85 | Explicit troubleshooting article for deployment, configuration, data collection, and connectivity; likely organized by symptoms with specific causes and resolutions unique to Azure Monitor pipeline. |
 | [Troubleshoot](https://learn.microsoft.com/en-us/azure/azure-monitor/vm/vm-enable-troubleshoot) | troubleshooting | 0.85 | Explicit troubleshooting article for VM monitoring enablement; likely organized by symptoms and includes specific error conditions and resolutions unique to Azure Monitor agents. |
 | [Troubleshoot Copilot for Workbooks](https://learn.microsoft.com/en-us/azure/azure-monitor/visualize/workbooks-copilot-troubleshoot) | troubleshooting | 0.85 | Provides resolutions for common Copilot errors and known limitations; product-specific troubleshooting mappings. |
 | [Troubleshoot insights](https://learn.microsoft.com/en-us/azure/azure-monitor/visualize/troubleshoot-workbooks) | troubleshooting | 0.85 | Explicit troubleshooting guide for workbook-based insights with symptom/diagnosis/resolution patterns for specific services. |
@@ -313,6 +331,7 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A
 | [AppPerformanceCounters](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/appperformancecounters) | configuration | 0.80 | Table reference for performance counters with detailed column definitions and types. |
 | [AppRequests](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/apprequests) | configuration | 0.80 | Application Insights requests table reference with exact field definitions for request telemetry. |
 | [AuditLogs](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/auditlogs) | security | 0.80 | Azure AD AuditLogs schema is security/audit focused with specific fields and meanings; this is expert security telemetry knowledge. |
+| [Automated certificate management (default TLS)](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-tls-automated) | security | 0.80 | Details using the Certificate Manager extension for TLS-enabled receivers, including specific configuration parameters and flows, which are security-focused and product-specific. |
 | [Autoscale diagnostics](https://learn.microsoft.com/en-us/azure/azure-monitor/autoscale/autoscale-diagnostics) | configuration | 0.80 | Describes autoscale log categories, metrics, and destinations; includes specific diagnostic settings and categories, which are configuration details. |
 | [AzureActivity](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/azureactivity) | configuration | 0.80 | AzureActivity table schema (operation names, categories, status fields, etc.) is detailed, product-specific telemetry configuration not generally known. |
 | [Best practices](https://learn.microsoft.com/en-us/azure/azure-monitor/autoscale/autoscale-best-practices) | best-practices | 0.80 | Explicit best-practices article for autoscale patterns in specific Azure services; likely includes concrete recommendations and product-specific gotchas. |
@@ -326,7 +345,7 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A
 | [Common alert schema](https://learn.microsoft.com/en-us/azure/azure-monitor/alerts/alerts-common-schema) | configuration | 0.80 | Defines the standardized common alert schema and how to enable it, including schema fields and integration behavior, which are detailed configuration and payload definitions. |
 | [Compare metrics strategies](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-plane-versus-metrics-export) | decision-making | 0.80 | Scenario-based comparison of two metrics access methods with recommendations on when to use each, fitting decision-making guidance. |
 | [Configure Profiler](https://learn.microsoft.com/en-us/azure/azure-monitor/profiler/profiler-settings) | configuration | 0.80 | Explains Profiler settings pane, including specific toggles, modes, and options that control profiling sessions—product-specific configuration not derivable from general knowledge. |
-| [Configure TLS](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-tls) | security | 0.80 | Focuses on TLS/mTLS for Azure Monitor pipeline with product-specific security configuration, likely including certificate types, configuration parameters, and options for securing intra-cluster traffic that qualify as detailed security guidance. |
+| [Configure TLS](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-tls) | security | 0.80 | TLS configuration article focuses on securing ingestion, choosing between TLS/mTLS and certificate management approaches; this is product-specific security configuration guidance. |
 | [Configure granular RBAC](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/granular-rbac-use-case) | security | 0.80 | Step-by-step configuration of row-level security with concrete examples (roles, departments, locations); detailed, product-specific security configuration. |
 | [ContainerAppConsoleLogs](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/containerappconsolelogs) | configuration | 0.80 | ContainerAppConsoleLogs table reference with fields for stdout/stderr and Dapr sidecar logs is product-specific log configuration. |
 | [ContainerAppSystemLogs](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/containerappsystemlogs) | configuration | 0.80 | Describes schema for system/platform logs (revision management, Dapr, Keda, Envoy) which is detailed configuration of monitoring data. |
@@ -335,7 +354,8 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A
 | [Cost-effective Alerts](https://learn.microsoft.com/en-us/azure/azure-monitor/containers/cost-effective-alerting) | best-practices | 0.80 | Gives specific strategies (e.g., using Basic vs Analytics logs, table conversions) to reduce alerting costs with AKS; these are product-specific best practices beyond generic alerting advice. |
 | [Create or edit](https://learn.microsoft.com/en-us/azure/azure-monitor/platform/diagnostic-settings) | configuration | 0.80 | Details diagnostic settings options, including specific setting names, selectable categories, and destination behaviors for Azure Monitor, which are product-specific configuration details. |
 | [Cross-resource queries](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/api/cross-workspace-queries) | limits-quotas | 0.80 | Describes implicit vs explicit cross-workspace queries and states a hard limit of 10 resources per cross-resource query. |
-| [Customer-managed keys](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/customer-managed-keys) | security | 0.80 | Provides product-specific steps and settings to enable CMK encryption for Log Analytics workspaces using Azure Key Vault or Managed HSM, including workspace linkage behavior and key management details. This is concrete security configuration, not just conceptual. |
+| [Customer-managed certificates (BYOC)](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-tls-custom) | security | 0.80 | Customer-managed TLS configuration will include certificate requirements, secret formats, and binding settings unique to Azure Monitor pipeline, fitting security configuration. |
+| [Customer-managed keys](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/customer-managed-keys) | security | 0.80 | Customer-managed keys setup for Azure Monitor Logs involves specific Azure roles, Key Vault/Managed HSM configuration parameters, workspace linkage behavior, and encryption-scope details. These are product-specific security and encryption configuration steps (RBAC roles, key URIs, workspace settings) that qualify as expert security knowledge beyond generic concepts. |
 | [Customize](https://learn.microsoft.com/en-us/azure/azure-monitor/vm/metrics-opentelemetry-guest-modify) | configuration | 0.80 | Details modifying DCRs to add per-process and advanced performance counters; includes product-specific metric and DCR configuration patterns. |
 | [DCR structure](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-rule-structure) | configuration | 0.80 | Explains the JSON structure of DCRs, which is product-specific configuration with schema fields and allowed values that qualify as expert configuration knowledge. |
 | [DatabricksAccounts](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/databricksaccounts) | configuration | 0.80 | DatabricksAccounts Azure Monitor table reference provides the exact schema for accounts audit logs, which is product-specific configuration/reference information. |
@@ -599,7 +619,6 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A
 | [CloudAppEvents](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/cloudappevents) | configuration | 0.78 | Describes the exact schema of CloudAppEvents in Azure Monitor Logs, which is expert, product-specific configuration data. |
 | [CloudHsmServiceOperationAuditLogs](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/cloudhsmserviceoperationauditlogs) | configuration | 0.78 | Provides the schema and field definitions for Cloud HSM operation audit logs, which are detailed configuration/log-structure specifics. |
 | [CommonSecurityLog](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/commonsecuritylog) | configuration | 0.78 | CommonSecurityLog table reference defines columns and formats for CEF security appliance events, which is expert schema/configuration knowledge. |
-| [Configure clients](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure-clients) | configuration | 0.78 | Client configuration for Azure Monitor pipeline will include concrete commands, endpoint formats, and parameter values (for example, how to retrieve and use the external IP and specific flags), which are product-specific configuration details rather than generic concepts. |
 | [Connection strings](https://learn.microsoft.com/en-us/azure/azure-monitor/app/connection-strings) | configuration | 0.78 | Page is focused on how to configure Application Insights connection strings, describing that they are composed of multiple key-value settings (including instrumentation key and ApplicationId) and showing schema/examples. This is product-specific configuration detail (exact setting names and structure) rather than a generic concept, matching the configuration sub-skill. It is not just a tutorial; it defines the specific configuration mechanism for this service. |
 | [ContainerRegistryLoginEvents](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/containerregistryloginevents) | configuration | 0.78 | ACR login auditing table schema is specific to this service and needed for precise monitoring queries. |
 | [ContainerRegistryRepositoryEvents](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/containerregistryrepositoryevents) | configuration | 0.78 | Repository auditing log schema is product-specific configuration for Azure Container Registry monitoring. |
@@ -1060,19 +1079,17 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A
 | [AppPlatformLogsforSpring](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/appplatformlogsforspring) | configuration | 0.75 | Schema reference for App Platform Logs for Spring, detailing columns and their meanings. |
 | [AppPlatformSystemLogs](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/appplatformsystemlogs) | configuration | 0.75 | Azure Spring Cloud system logs table schema, which is detailed configuration/contract information. |
 | [BMC Helix](https://learn.microsoft.com/en-us/azure/azure-monitor/alerts/itsmc-secure-webhook-connections-bmc) | integrations | 0.75 | Product-specific integration guide between BMC Helix and Secure Webhook in Azure; likely includes endpoint URLs, payload expectations, and configuration parameters unique to this integration. |
-| [CLI and templates](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure-cli) | configuration | 0.75 | CLI/ARM-based configuration with detailed resource definitions and parameters; highly product-specific configuration knowledge. |
 | [Collect logs from Event Hubs](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/ingest-logs-event-hub) | integrations | 0.75 | Explains DCR and table setup plus template requirements to pull from Event Hubs; includes product-specific configuration and constraints. |
 | [Configuration](https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/private-link-configure) | configuration | 0.75 | Step-by-step configuration of AMPLS and private link; includes specific settings and resource relationships unique to Azure Monitor. |
-| [Configuration overview](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure) | configuration | 0.75 | Details enabling and configuring the pipeline, likely with specific extension settings, cache options, and routing parameters. |
 | [Configuration settings](https://learn.microsoft.com/en-us/azure/azure-monitor/autoscale/autoscale-understanding-settings) | configuration | 0.75 | Explains autoscale settings and how they work across resource types; likely details specific setting names, behaviors, and allowed values, fitting configuration. |
 | [Configure Azure for a Secure Webhook](https://learn.microsoft.com/en-us/azure/azure-monitor/alerts/itsm-connector-secure-webhook-connections-azure-configuration) | security | 0.75 | Describes required Azure configurations for Secure Webhook; likely includes specific identity, permission, and endpoint settings, fitting security-focused configuration patterns. |
 | [Configure with ARM templates](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/tutorial-logs-ingestion-api) | integrations | 0.75 | ARM template-based setup plus sample code for multiple SDKs; includes API parameters and product-specific integration patterns. |
+| [Configure with CLI or ARM templates](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure-cli) | configuration | 0.75 | CLI/ARM configuration implies parameter names, JSON schema, and flags specific to Azure Monitor pipeline, matching configuration sub-skill criteria.
| | [Configure with PowerShell](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/set-up-logs-ingestion-api-prerequisites) | configuration | 0.75 | PowerShell script that provisions all required resources for Logs Ingestion API; includes specific resource types, parameters, and dependencies. | | [Connect ServiceNow with an ITSM Connector](https://learn.microsoft.com/en-us/azure/azure-monitor/alerts/itsmc-connections-servicenow) | integrations | 0.75 | Details configuring ServiceNow with ITSMC in Log Analytics; integration-focused with product-specific connection settings and possibly field mappings. | | [ContainerLog](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/containerlog) | integrations | 0.75 | Includes parameterized search, 7-day billable data breakdown by log-type, and namespace-wide queries, which are detailed, product-specific Kusto patterns. | | [CopilotActivity](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/copilotactivity) | integrations | 0.75 | Contains multiple concrete queries for Copilot interactions, plugins, PromptBooks, security events, and model usage over specific time ranges, which are detailed, product-specific patterns. | | [Cost optimization](https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/best-practices-cost) | best-practices | 0.75 | Cost optimization guidance with Azure Monitor–specific configuration choices and data collection strategies; actionable product-specific recommendations. | -| [Create](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-transformations-create) | configuration | 0.75 | Provides concrete guidance on creating and adding transformation queries to DCRs, including product-specific configuration steps and parameters. 
| | [EmailAttachmentInfo](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/emailattachmentinfo) | integrations | 0.75 | Contains KQL queries over EmailAttachmentInfo to find first appearance of malicious files and external emails with attachments, using detailed schema fields. | | [EmailEvents](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/emailevents) | integrations | 0.75 | Provides concrete Kusto queries over EmailEvents to count phishing emails by sender domain, leveraging product-specific fields. | | [EmailPostDeliveryEvents](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/emailpostdeliveryevents) | integrations | 0.75 | Shows KQL examples over EmailPostDeliveryEvents for admin actions and unremediated phishing detections, using schema-specific columns. | @@ -1085,7 +1102,6 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A | [IdentityLogonEvents](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/identitylogonevents) | integrations | 0.75 | Provides Kusto queries over IdentityLogonEvents to find processes performing LDAP auth with cleartext passwords, using detailed schema filters. | | [IdentityQueryEvents](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/identityqueryevents) | integrations | 0.75 | Contains KQL examples over IdentityQueryEvents to find processes sending SAMR queries to AD, using specific event fields. | | [Integrate with Logic apps](https://learn.microsoft.com/en-us/azure/azure-monitor/alerts/alerts-logic-apps) | integrations | 0.75 | Shows how to build Logic Apps workflows that process Azure Monitor alerts, including how to access alert metadata and query results, which are concrete integration patterns. 
| -| [Log Analytics gateway](https://learn.microsoft.com/en-us/azure/azure-monitor/agents/gateway) | configuration | 0.75 | Explains configuring communication via Log Analytics gateway, including proxy behavior, endpoints, and possibly port and URL settings; these are concrete configuration parameters. | | [MICROSOFT.OPENENERGYPLATFORM/ENERGYSERVICES](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/supported-logs/microsoft-openenergyplatform-energyservices-logs) | configuration | 0.75 | Defines concrete diagnostic, audit, and HTTP indexer log categories for MICROSOFT.OPENENERGYPLATFORM/ENERGYSERVICES. | | [Manage table-level read access](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/manage-table-access) | security | 0.75 | Describes table-level access methods and granular RBAC; likely lists specific roles/permissions and scope behaviors unique to Log Analytics. | | [Microsoft.AppConfiguration/configurationStores](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/supported-logs/microsoft-appconfiguration-configurationstores-logs) | configuration | 0.75 | Defines concrete Azure Monitor log categories (audit, request logs) for Microsoft.AppConfiguration/configurationStores, which is product-specific logging configuration. | @@ -1408,7 +1424,6 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A | [Azure Policy](https://learn.microsoft.com/en-us/azure/azure-monitor/agents/azure-monitor-agent-policy) | deployment | 0.70 | Describes built-in policies/initiatives to auto-install AMA and associate DCRs; includes specific policy definitions and assignment patterns, which are deployment/management mechanics unique to this product. 
| | [Azure Policy](https://learn.microsoft.com/en-us/azure/azure-monitor/vm/vminsights-enable-policy) | configuration | 0.70 | Details VM insights policy initiatives and how they install agents and enable monitoring; includes product-specific policy definitions and behavior beyond generic concepts. | | [Azure portal](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-rule-create-portal) | configuration | 0.70 | Describes detailed DCR configuration options and fields in the Azure portal (sources, destinations, rule structure). While portal-focused, it exposes product-specific configuration elements and constraints that go beyond generic knowledge. | -| [Azure portal](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure-portal) | configuration | 0.70 | Portal-based configuration of pipeline components; includes specific setting names and allowed values for pipeline behavior. | | [Azure resource queries](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/api/azure-resource-queries) | configuration | 0.70 | Documents querying logs in the context of Azure resources instead of workspaces; product-specific API behavior and patterns. | | [AzureActivity](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/azureactivity) | integrations | 0.70 | Provides concrete example queries over AzureActivity with time filters and projections; these are product-specific query patterns for this table. | | [AzureAssessmentRecommendation](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/azureassessmentrecommendation) | configuration | 0.70 | Defines schema for recommendations from Azure assessments, including fields and structure; also notes default 7-day schedule, which is specific operational detail. 
| @@ -1433,7 +1448,10 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A | [ConfidentialWatchlist](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/confidentialwatchlist) | integrations | 0.70 | Demonstrates treating ConfidentialWatchlist as a table in Kusto joins/lookups and getting distinct aliases, which is a specific integration pattern. | | [Configuration](https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/network-security-perimeter) | security | 0.70 | Details how to add Azure Monitor resources to a network security perimeter, including perimeter rules and communication constraints. | | [ConfigurationChange](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/configurationchange) | integrations | 0.70 | Contains multiple concrete queries with time windows (last 30 minutes) and change categories, specific to ConfigurationChange schema. | +| [Configure Azure Monitor pipeline](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure) | configuration | 0.70 | Described as overall setup including preparing Arc-enabled Kubernetes cluster and installing cert-manager; such setup docs typically list specific extension names, parameters, and configuration steps unique to this product. | +| [Configure a Kubernetes gateway](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-kubernetes-gateway) | configuration | 0.70 | Describes deploying a Traefik gateway and managing receivers/clients; likely includes specific Kubernetes manifests, service types, and annotations unique to this integration pattern, fitting configuration. | | [Configure with Azure portal](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/tutorial-logs-ingestion-portal) | integrations | 0.70 | Tutorial configuring tables and sample app to send logs via API; includes concrete API usage and configuration details. 
| +| [Configure with the Azure portal](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure-portal) | configuration | 0.70 | Portal configuration article will contain concrete resource names, fields, and settings for pipeline and dataflows that are product-specific configuration knowledge. | | [Container monitoring solution](https://learn.microsoft.com/en-us/azure/azure-monitor/containers/container-insights-transition-solution) | decision-making | 0.70 | Migration-focused guidance with timelines and steps to move from a legacy solution to Container Insights; helps decide and plan migration paths. | | [ContainerNetworkLogs](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/containernetworklogs) | integrations | 0.70 | Shows queries for dropped network flows and top IPs by bytes, which are specific to ContainerNetworkLogs fields. | | [Convert ITSM ServiceNow actions to secure webhook actions](https://learn.microsoft.com/en-us/azure/azure-monitor/alerts/itsm-convert-servicenow-to-webhook) | integrations | 0.70 | Describes converting ITSM actions to Secure Webhook actions for ServiceNow; migration/integration guidance with specific steps and configuration details for the new webhook-based integration. | @@ -1456,6 +1474,7 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A | [DatabricksRemoteHistoryService](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/databricksremotehistoryservice) | security | 0.70 | Logs for adding/deleting credentials in the remote history service are security-centric; the table schema and field semantics are product-specific security configuration knowledge. | | [DatabricksSSH](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/databricksssh) | security | 0.70 | SSH audit logs capture secure access events; the detailed table schema is product-specific security and logging knowledge. 
| | [DataverseActivity](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/dataverseactivity) | integrations | 0.70 | Contains specific Kusto queries over DataverseActivity with table/column usage that is product-specific and serves as concrete integration/query patterns. | +| [Dedicated clusters](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-dedicated-clusters) | decision-making | 0.70 | Dedicated clusters documentation typically includes minimum commitment tier values, cost and capacity thresholds, and guidance on when to use a dedicated cluster versus standard workspaces. This is concrete, product-specific decision guidance (when to choose dedicated clusters, how commitments affect cost and behavior), which an LLM is unlikely to know in detail from training. | | [Delete and recover a workspace](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/delete-workspace) | configuration | 0.70 | Explains soft-delete behavior, recovery windows, and permanent deletion options, which are product-specific configuration and lifecycle details. | | [Deploy agents](https://learn.microsoft.com/en-us/azure/azure-monitor/vm/monitor-virtual-machine-agent) | deployment | 0.70 | Describes deploying AMA across environments as part of a monitoring guide; includes product-specific deployment requirements and patterns. | | [Deployment guidance](https://learn.microsoft.com/en-us/azure/azure-monitor/containers/monitor-kubernetes) | best-practices | 0.70 | Explicitly described as best practices for monitoring different Kubernetes layers with Azure Monitor and cloud-native tools; likely includes concrete, product-specific recommendations and patterns. 
| @@ -1507,6 +1526,7 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A | [KubeMonAgentEvents](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/kubemonagentevents) | integrations | 0.70 | Provides KQL examples over KubeMonAgentEvents including a generic search pattern, reflecting product-specific log schema usage. | | [KubeServices](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/kubeservices) | integrations | 0.70 | Page is a catalog of concrete Kusto queries against the KubeServices table, including field names and query patterns that are product-specific and not general knowledge. | | [Link actions](https://learn.microsoft.com/en-us/azure/azure-monitor/visualize/workbooks-link-actions) | configuration | 0.70 | Describes how link actions work from link components and column settings; includes product-specific behavior and options for navigation/actions. | +| [Log Analytics gateway](https://learn.microsoft.com/en-us/azure/azure-monitor/agents/gateway) | configuration | 0.70 | Gateway setup articles typically include product-specific configuration details such as proxy modes, required ports/URLs, and parameter settings for connecting on-premises or isolated machines to Azure Automation and Log Analytics. These are concrete configuration values and connection settings that qualify as expert knowledge beyond generic proxy usage. | | [Log data ingestion time](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/data-ingestion-time) | limits-quotas | 0.70 | Explains ingestion time characteristics and factors affecting latency for Azure Monitor Logs. Such an article typically includes concrete latency ranges/targets and behavior under normal operations, which are numeric service characteristics not known generically. 
| | [Log queries](https://learn.microsoft.com/en-us/azure/azure-monitor/containers/container-insights-log-query) | configuration | 0.70 | Describes record schemas and provides sample Kusto queries; table/field names and query patterns are product-specific log configuration/usage details. | | [Log search alert](https://learn.microsoft.com/en-us/azure/azure-monitor/alerts/alerts-create-log-alert-rule) | configuration | 0.70 | Explains creating and editing log search alert rules, including payload structure and rule parameters, which are concrete configuration details. | @@ -1565,7 +1585,7 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A | [PGSQLServerLogs](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/pgsqlserverlogs) | integrations | 0.70 | Contains concrete Kusto queries targeting PGSQLServerLogs for errors, deadlocks, shutdowns, and connection events, reflecting product-specific log schema and query patterns. | | [Parse text data](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/parse-text) | best-practices | 0.70 | Compares ingestion-time vs query-time parsing options with Azure Monitor–specific trade-offs and patterns. | | [Perf](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/perf) | integrations | 0.70 | Provides specific Kusto queries over the Perf table (RDMA, CPU percentiles, disk space) using Azure Monitor’s Perf schema, which are product-specific query/integration patterns. | -| [Pod placement](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-pod-placement) | configuration | 0.70 | Pod placement across Kubernetes nodes typically involves specific Kubernetes constructs (node selectors, taints, tolerations, affinity rules) and product-specific labels/annotations, which are concrete configuration details unique to this service. 
| +| [Pod placement](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-pod-placement) | configuration | 0.70 | Pod placement feature involves configuring scheduling rules, node selectors, and policies specific to Azure Monitor pipeline instances, which is detailed configuration knowledge. | | [Predictive autoscale](https://learn.microsoft.com/en-us/azure/azure-monitor/autoscale/autoscale-predictive) | limits-quotas | 0.70 | Mentions minimum seven days of history and a maximum (truncated) history window; predictive autoscale behavior is governed by specific numeric requirements, qualifying as limits/quotas. | | [Prefer headers](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/api/prefer-options) | configuration | 0.70 | Describes specific request/response options exposed via the Prefer HTTP header, including option names and allowed values for the Logs query API, which are product-specific configuration parameters. | | [Profiler](https://learn.microsoft.com/en-us/azure/azure-monitor/app/java-standalone-profiler) | configuration | 0.70 | Profiler setup for Java with version-specific notes and configuration parameters unique to this feature. | @@ -1581,7 +1601,6 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A | [Resource logs](https://learn.microsoft.com/en-us/azure/azure-monitor/platform/resource-logs) | configuration | 0.70 | Describes how to enable and send resource logs to various destinations, with resource-type-specific behaviors and configuration steps. | | [SDK Stats](https://learn.microsoft.com/en-us/azure/azure-monitor/app/sdk-stats) | troubleshooting | 0.70 | Focuses on SDK stats custom metrics, including drop codes and retry codes that explain causes and next steps; these are product-specific error/diagnostic codes and mappings. 
| | [SQLAssessmentRecommendation](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/sqlassessmentrecommendation) | integrations | 0.70 | Contains queries counting SQL assessment recommendations by focus area, computer, instance, database, and security priority, using this assessment table’s schema. | -| [Sample gateway setup](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-kubernetes-gateway) | configuration | 0.70 | Describes a concrete, product-specific Kubernetes gateway setup (Traefik) to expose Azure Monitor pipeline receivers, likely including specific service types, annotations, ports, and configuration snippets that go beyond generic knowledge. | | [Scale with custom metrics](https://learn.microsoft.com/en-us/azure/azure-monitor/autoscale/autoscale-custom-metric) | configuration | 0.70 | Shows how to set up autoscale on custom metrics with specific namespace constraints (Standard and Azure.ApplicationInsights); includes product-specific configuration details and constraints. | | [Scenarios](https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/network-security-perimeter-scenarios) | security | 0.70 | Scenario-based configurations for NSP with Azure Monitor resources; product-specific network isolation patterns and settings. | | [Security controls by Azure Policy](https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/security-controls-policy) | security | 0.70 | Lists specific Azure Policy built-in definitions and compliance controls for Azure Monitor, including policy names and scopes that are product-specific security/compliance configuration. 
| @@ -1608,6 +1627,7 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A | [Telemetry data model](https://learn.microsoft.com/en-us/azure/azure-monitor/app/data-model-complete) | configuration | 0.70 | A complete telemetry data model page typically enumerates telemetry types and their exact schema (field names, types, and relationships). This is product-specific configuration/structure knowledge that an LLM is unlikely to fully know from training and is needed to correctly configure queries, alerts, and integrations. It is not conceptual, but a detailed reference of the data model. | | [Text](https://learn.microsoft.com/en-us/azure/azure-monitor/visualize/workbooks-text-visualizations) | configuration | 0.70 | Details workbook-specific text visualization behavior (first cell only, styling options) and settings, which are configuration details unique to this product. | | [Time range brushing](https://learn.microsoft.com/en-us/azure/azure-monitor/visualize/workbooks-time-brushing) | configuration | 0.70 | Explains time range brushing behavior and parameter export options; product-specific configuration of interactive time selection. | +| [Transformations](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-transformations) | configuration | 0.70 | Covers how to configure transformations, including which KQL operators are supported and how aggregations work; this is product-specific configuration behavior and constraints. | | [Troubleshooting metric charts](https://learn.microsoft.com/en-us/azure/azure-monitor/metrics/metrics-troubleshoot) | troubleshooting | 0.70 | Explicit troubleshooting article for metric charts; likely organized by symptoms and Azure Monitor–specific causes/solutions. 
| | [Types of alert rules](https://learn.microsoft.com/en-us/azure/azure-monitor/alerts/alerts-types) | decision-making | 0.70 | Explains different alert types and when to use each, helping users decide between options based on scenarios, which is explicit decision guidance. | | [UCDOStatus](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/ucdostatus) | integrations | 0.70 | Page provides concrete Kusto query patterns against the UCDOStatus table (field names, filters, aggregations) that are product-specific usage patterns rather than generic concepts. | @@ -1748,7 +1768,6 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A | [Data sources](https://learn.microsoft.com/en-us/azure/azure-monitor/visualize/workbooks-data-sources) | configuration | 0.65 | Lists supported workbook data sources and how they’re wired into workbooks; product-specific data source options and behaviors constitute configuration knowledge. | | [DataSetOutput](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/datasetoutput) | integrations | 0.65 | Shows how to display output of the latest successful DCR-based conditional data set from DataSetOutput, which is a specific integration pattern. | | [DataSetRuns](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/datasetruns) | integrations | 0.65 | Provides a query pattern to display output of the latest successful DCR-based conditional data set from DataSetRuns, which is table-specific. | -| [Dedicated clusters](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-dedicated-clusters) | decision-making | 0.65 | Covers when to use dedicated clusters for Azure Monitor Logs, including minimum commitment tier and workspace-linking behavior. This is service- and SKU-specific selection guidance beyond generic concepts. 
| | [Deploy with Azure Resource Manager](https://learn.microsoft.com/en-us/azure/azure-monitor/visualize/workbooks-automate) | configuration | 0.65 | Describes managing workbooks programmatically with ARM templates, including workbook resource types and template-based deployment. This implies product-specific resource definitions and parameters for workbook configuration, fitting the configuration category (config parameters and resource settings) rather than a generic tutorial. | | [DeviceCalendar](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/devicecalendar) | integrations | 0.65 | Provides example queries over DeviceCalendar for SurfaceHub Exchange errors, using specific table/column patterns unique to this log. | | [DeviceCleanup](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/devicecleanup) | integrations | 0.65 | Contains Kusto queries over DeviceCleanup for SurfaceHub cleanup failures, reflecting concrete schema and filter usage. | @@ -1758,6 +1777,7 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A | [Dynamic thresholds](https://learn.microsoft.com/en-us/azure/azure-monitor/alerts/alerts-dynamic-thresholds) | configuration | 0.65 | Describes how to configure dynamic thresholds, including where they can be used and how to set them, which is product-specific alert configuration guidance. | | [E-mail notification](https://learn.microsoft.com/en-us/azure/azure-monitor/alerts/proactive-email-notification) | configuration | 0.65 | Describes changes to default notification recipients and how to manage them; involves product-specific notification configuration settings. | | [EdgeActionConsoleLog](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/edgeactionconsolelog) | integrations | 0.65 | Provides Kusto queries over EdgeActionConsoleLog to get top log messages by action version, using product-specific schema. 
| +| [Extension versions](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-extension-versions) | limits-quotas | 0.65 | Version and release notes typically include specific version numbers, supported features, and sometimes constraints tied to versions; this is expert, version-specific operational knowledge, closest to limits/quotas among available categories. | | [Filter OpenTelemetry](https://learn.microsoft.com/en-us/azure/azure-monitor/app/opentelemetry-filter) | configuration | 0.65 | Focuses on filtering and excluding telemetry and sensitive data; likely documents specific filter configuration options, parameter names, and patterns for Azure Monitor OpenTelemetry rather than generic advice. | | [GCPAuditLogs](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/gcpauditlogs) | integrations | 0.65 | Provides KQL examples over GCPAuditLogs, including Pub/Sub subscription logs with severity, reflecting cross-cloud log schema usage. | | [Get started (supplemental)](https://learn.microsoft.com/en-us/azure/azure-monitor/app/java-get-started-supplemental) | configuration | 0.65 | Container-focused Java autoinstrumentation setup typically documents environment variables, agent paths, and container-specific configuration flags. These are product-specific configuration details rather than generic container or Java concepts. | @@ -1801,6 +1821,7 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A | [NGXOperationLogs](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/ngxoperationlogs) | integrations | 0.65 | Shows Kusto queries over NGXOperationLogs to list access and error logs sorted by time, using table-specific schema. 
| | [NGXSecurityLogs](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/ngxsecuritylogs) | integrations | 0.65 | Provides a Kusto query over NGXSecurityLogs listing security logs sorted by time, relying on product-specific table and fields. | | [NginxUpstreamUpdateLogs](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/nginxupstreamupdatelogs) | integrations | 0.65 | Shows a Kusto query over NginxUpstreamUpdateLogs listing upstream update logs sorted by time, using table-specific schema. | +| [Overview](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-overview) | decision-making | 0.65 | Overview focuses on when to use Azure Monitor pipeline for edge and multicloud data collection and recommends a setup sequence; likely includes scenario-based guidance and trade-offs versus direct ingestion, which is product-specific decision guidance. | | [Overview](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/log-query-overview) | configuration | 0.65 | Reference for KQL elements specific to Azure Monitor, including unsupported features and TLS requirements; product-specific query behavior. | | [Overview](https://learn.microsoft.com/en-us/azure/azure-monitor/vm/monitor-virtual-machine) | best-practices | 0.65 | Guide for complete monitoring of Azure and hybrid VMs; likely includes concrete recommendations on what telemetry to collect and how to configure alerts, beyond generic monitoring advice. | | [PerfInsightsImpactedResources](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/perfinsightsimpactedresources) | integrations | 0.65 | Shows how to unpack and query PerfInsightsImpactedResources using Kusto, relying on table-specific fields and structures that are unique to this Azure Monitor table. 
| @@ -2298,6 +2319,7 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A | [Concepts](https://learn.microsoft.com/en-us/azure/azure-monitor/health-models/concepts) | 0.30 | Concepts article describing health model components and relationships; lacks concrete configuration parameters or numeric thresholds. | | [Configure Azure for an ITSM Connector](https://learn.microsoft.com/en-us/azure/azure-monitor/alerts/itsmc-definition) | 0.30 | Overview of ITSMC and how to configure it conceptually; summary suggests high-level description rather than detailed config tables or troubleshooting mappings. | | [Connector deletion](https://learn.microsoft.com/en-us/azure/azure-monitor/alerts/itsmc-connector-deletion) | 0.30 | How-to deletion steps for ITSM connectors and associated actions; no product-specific limits, configs, or error-code mappings beyond generic behavior. | +| [Create](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-transformations-create) | 0.30 | Primarily a how-to for creating and attaching KQL-based transformations to data collection rules. From the summary, it doesn't clearly expose product-specific limits, configuration tables, or error-code-based troubleshooting; it looks like general procedural guidance that an LLM can approximate without hidden numeric constraints or specialized config matrices. | | [Create interactive reports](https://learn.microsoft.com/en-us/azure/azure-monitor/visualize/workbooks-interactive-reports) | 0.30 | Describes ways to create interactive reports in workbooks, likely as a how-to/tutorial. No indication of specific configuration tables, error codes, or quantified best practices unique to the product. | | [Dashboard using log data](https://learn.microsoft.com/en-us/azure/azure-monitor/visualize/tutorial-logs-dashboards) | 0.30 | Step-by-step tutorial for creating a dashboard; mostly procedural without deep config matrices, limits, or troubleshooting codes. 
| | [DeviceHardwareHealth](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/devicehardwarehealth) | 0.30 | Content is access-restricted; from the visible summary we can’t confirm presence of detailed schema or other expert data. | @@ -2312,7 +2334,6 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A | [Enable monitoring](https://learn.microsoft.com/en-us/azure/azure-monitor/vm/tutorial-scale-set-enable-monitoring) | 0.30 | Tutorial for enabling monitoring on VM scale sets; procedural content without deep configuration parameter references or limits. | | [Enable recommended alerts](https://learn.microsoft.com/en-us/azure/azure-monitor/vm/tutorial-alerts) | 0.30 | Tutorial for enabling recommended alerts; focuses on portal steps, not detailed alert rule definitions, limits, or error mappings. | | [Enable recommended alerts](https://learn.microsoft.com/en-us/azure/azure-monitor/vm/tutorial-scale-set-alerts) | 0.30 | Tutorial for enabling recommended alerts on VM scale sets; no expert-only limits, config matrices, or troubleshooting mappings. | -| [Extension versions](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-extension-versions) | 0.30 | Page is primarily version history and release notes for the Azure Monitor pipeline Arc-enabled Kubernetes extension. It does not focus on limits/quotas, configuration parameter tables, deployment matrices, security roles, or troubleshooting mappings with error codes. While it has product-specific details (version numbers, fixes, and changes), these are release-note style rather than reusable expert configuration/architecture/troubleshooting knowledge as defined by the sub-skill types. | | [Install and run reports](https://learn.microsoft.com/en-us/azure/azure-monitor/vm/performance-diagnostics-run) | 0.30 | Primarily a how-to guide for installing and running performance diagnostics reports. 
These run/installation articles are usually procedural without detailed error-code mappings, configuration tables, or limits; they don’t meet the thresholds for the defined expert-knowledge categories. | | [Live metric stream](https://learn.microsoft.com/en-us/azure/azure-monitor/app/live-stream) | 0.30 | Explains Live Metrics conceptually (real-time monitoring, filters, traces) without exposing detailed config tables, limits, or error mappings. | | [Log Analytics workspace insights](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/log-analytics-workspace-insights-overview) | 0.30 | High-level overview of Workspace Insights and onboarding; primarily conceptual and navigational without detailed limits, configs, or error mappings. | @@ -2368,7 +2389,6 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A | [Overview](https://learn.microsoft.com/en-us/azure/azure-monitor/optimization-insights/code-optimizations-profiler-overview) | 0.30 | Appears to be an overview of Code Optimizations and Profiler capabilities and benefits; summary does not indicate concrete configuration tables, limits, or troubleshooting mappings. | | [Query scope](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/scope) | 0.30 | Describes query scope and time range conceptually; no evidence of numeric limits or product-specific config parameters. | | [StorageMoverCopyLogsTransferred](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries/storagemovercopylogstransferred) | 0.30 | Page provides example Kusto queries for a specific log table but is primarily usage examples, not structured troubleshooting mappings, configuration parameter tables, limits, or decision matrices. It lacks error-code-based diagnosis, config defaults, or product-specific constraints that meet the expert-knowledge criteria for any sub-skill type. 
| -| [Transformations](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-transformations) | 0.30 | Described as benefits and conceptual explanation of transformations; summary does not indicate concrete parameter tables, limits, or product-specific gotchas. | | [Use issues](https://learn.microsoft.com/en-us/azure/azure-monitor/aiops/aiops-issue-and-investigation-how-to) | 0.30 | How-to article for creating and managing Azure Monitor issues; appears procedural without strong signals of detailed configuration matrices, limits, or error-code-based troubleshooting. | | [View .NET Profiler data](https://learn.microsoft.com/en-us/azure/azure-monitor/profiler/profiler-data) | 0.30 | Describes generating load and viewing Profiler data; appears to be procedural usage guidance without product-specific configuration tables or error/limit details. | | [View insights](https://learn.microsoft.com/en-us/azure/azure-monitor/optimization-insights/view-code-optimizations) | 0.30 | High-level guidance on viewing Code Optimizations results; likely UI navigation without detailed configuration tables, limits, or error mappings. | @@ -2424,6 +2444,7 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A | [Custom metrics in Azure Monitor (preview)](https://learn.microsoft.com/en-us/azure/azure-monitor/metrics/metrics-custom-overview) | 0.20 | Overview of custom metrics; summary does not show numeric limits, configuration tables, or detailed API parameters. | | [Dependencies](https://learn.microsoft.com/en-us/azure/azure-monitor/app/dependencies) | 0.20 | Describes what dependency tracking is and what telemetry is collected, but from the summary it appears to be a conceptual/feature overview without specific limits, configuration parameter tables, error-code-based troubleshooting, or other product-specific numeric or configuration details. 
| | [Enable monitoring for a cluster](https://learn.microsoft.com/en-us/azure/azure-monitor/containers/kubernetes-monitoring-tutorial) | 0.20 | Quickstart/tutorial for enabling AKS monitoring in Azure Monitor. Primarily step-by-step portal guidance and overview of default monitoring experience; no clear evidence of numeric limits, detailed configuration parameter tables, error-code-based troubleshooting, or decision matrices. | +| [FAQ](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-faq) | 0.20 | FAQ description is generic; likely high-level Q&A about architecture and deployment without structured error codes, limits, or decision matrices. | | [Functions](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/functions) | 0.20 | Article explains how to use and create functions in Azure Monitor log queries. It appears to be conceptual/how-to guidance without specific limits, quotas, configuration parameter tables, error-code-based troubleshooting, or decision matrices. No strong product-specific numeric thresholds or configuration reference details are indicated in the summary. | | [Issues overview](https://learn.microsoft.com/en-us/azure/azure-monitor/aiops/aiops-issue-and-investigation-overview) | 0.20 | Conceptual overview of Azure Monitor issues and investigations; describes what issues are and how they help, without clear indication of numeric limits, configuration tables, or troubleshooting mappings. | | [Log Analytics sample queries](https://learn.microsoft.com/en-us/azure/azure-monitor/reference/queries-by-table) | 0.20 | A high-level page describing that there are sample queries by table is primarily navigational/organizational. It’s unlikely to contain detailed limits, configuration parameters, or error mappings itself; those are in the linked per-table articles, so this page does not meet the expert-knowledge criteria. 
| @@ -2471,7 +2492,6 @@ confusable_not_for: Not for Azure Network Watcher (use azure-network-watcher), A | [Azure Monitor overview](https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/overview) | 0.10 | High-level Azure Monitor overview; no detailed limits, configs, or product-specific patterns. | | [Insights and curated visualizations](https://learn.microsoft.com/en-us/azure/azure-monitor/visualize/insights-overview) | 0.10 | High-level overview of Azure Monitor Insights and curated visualizations without concrete limits, configuration tables, error codes, or decision matrices. | | [Overview](https://learn.microsoft.com/en-us/azure/azure-monitor/aiops/observability-agent-overview) | 0.10 | Overview of Azure Copilot observability agent; primarily conceptual description of capabilities without detailed configuration, limits, or troubleshooting content. | -| [Overview](https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-overview) | 0.10 | Described as an overview of the Azure Monitor pipeline and its capabilities (extending data collection, caching, syncing, routing). This is conceptual/architectural explanation without clear indication of specific configuration parameters, limits, or decision matrices, so it does not meet the expert-knowledge criteria for any sub-skill type. | | [Overview](https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/data-platform) | 0.10 | High-level overview of the Azure Monitor data platform; no indication of numeric limits, configuration parameters, or troubleshooting mappings. | | [Overview](https://learn.microsoft.com/en-us/azure/azure-monitor/metrics/prometheus-metrics-overview) | 0.10 | Overview of managed Prometheus service; summary is conceptual without detailed configuration or limits. 
| | [Overview of Azure Monitor Metrics](https://learn.microsoft.com/en-us/azure/azure-monitor/metrics/data-platform-metrics) | 0.10 | Conceptual overview of metrics in Azure Monitor; summary does not suggest detailed limits, configuration tables, or error handling. | diff --git a/products/azure-nat-gateway/azure-nat-gateway.csv b/products/azure-nat-gateway/azure-nat-gateway.csv index f56e7f72..929fb07a 100644 --- a/products/azure-nat-gateway/azure-nat-gateway.csv +++ b/products/azure-nat-gateway/azure-nat-gateway.csv @@ -1,5 +1,5 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type -https://learn.microsoft.com/en-us/azure/nat-gateway/faq,FAQ,Azure NAT Gateway frequently asked questions,Azure NAT Gateway FAQs with limits and behavior,Answers to common questions about using Azure NAT Gateway.,Here are some answers to common questions about using Azure NAT Gateway.,2026-04-16T17:19:00Z,faq,limits-quotas,0.7,True,"FAQ pages for networking services typically include concrete expert details such as SNAT port counts, IP limits per gateway, and other numeric constraints and behaviors that are not obvious from general knowledge, aligning best with limits-quotas.",updated +https://learn.microsoft.com/en-us/azure/nat-gateway/faq,FAQ,Azure NAT Gateway frequently asked questions,Azure NAT Gateway FAQs with limits and behavior,Answers to common questions about using Azure NAT Gateway.,Here are some answers to common questions about using Azure NAT Gateway.,2026-04-16T17:19:00Z,faq,limits-quotas,0.7,True,"FAQ pages for networking services typically include concrete expert details such as SNAT port counts, IP limits per gateway, and other numeric constraints and behaviors that are not obvious from general knowledge, aligning best with limits-quotas.",unchanged https://learn.microsoft.com/en-us/azure/nat-gateway/manage-nat-gateway,Manage a Standard NAT gateway,Manage a NAT 
gateway - Azure NAT Gateway,Manage Azure NAT Gateway configuration and IPs,Learn how to create and remove a NAT gateway resource from a virtual network subnet. Add and remove public IP addresses and prefixes used for outbound connectivity.,Learn how to create and remove a NAT gateway resource from a virtual network subnet. A NAT gateway enables outbound connectivity for resources in an Azure Virtual Network. You can change the public IP addresses and public IP address prefixes associated with the NAT gateway after deployment. This article explains how to manage the following aspects of NAT gateway: Create a NAT gateway and associate it with an existing subnet. Remove a NAT gateway from an existing subnet and delete the NAT,2024-10-08T17:03:00.000Z,how-to,configuration,0.7,True,"Explains how to create/remove NAT gateway, associate subnets, and manage public IPs/prefixes; contains concrete configuration operations and parameters.",unchanged https://learn.microsoft.com/en-us/azure/nat-gateway/manage-nat-gateway-v2,Manage a Standard V2 NAT gateway,Manage a Standard V2 NAT Gateway - Azure NAT Gateway,,Learn how to create and remove a NAT gateway v2 resource from a virtual network and virtual network subnet. Add and remove public IP addresses and prefixes used for outbound connectivity.,Learn how to create and remove a NAT gateway resource from a virtual network subnet. A NAT gateway enables outbound connectivity for resources in an Azure Virtual Network. You can change the public IP addresses and public IP address prefixes associated with the NAT gateway after deployment. This article explains how to manage the following aspects of NAT gateway: Create a NAT gateway and associate it with an existing subnet. Remove a NAT gateway from an existing subnet and delete the NAT,2026-04-09T06:11:00.000Z,how-to,,0.2,False,"Page is a how-to guide for creating/removing and associating a NAT Gateway v2 with subnets and IPs.
From the description it appears procedural without detailed configuration parameter tables, limits, quotas, or product-specific troubleshooting/error-code mappings. It reads as standard tutorial content rather than expert reference material.",unchanged https://learn.microsoft.com/en-us/azure/nat-gateway/monitor-nat-gateway-flow-logs,Monitor Standard V2 NAT gateway flow logs,Monitor with StandardV2 NAT Gateway Flow Logs - Azure NAT Gateway,Monitor and troubleshoot with NAT Gateway flow logs,"Learn how to set up, monitor, and troubleshoot with StandardV2 NAT Gateway Flow Logs.","In this article, you learn how to set up, monitor, and troubleshoot with Azure StandardV2 NAT Gateway flow logs. These logs can help you monitor and analyze the traffic flows going through your NAT gateway resource. The health event logs are provided through the Azure Monitor resource log category NatGatwayFlowlogsV1, which is enabled through Diagnostic Settings.",2026-01-22T06:12:00.000Z,how-to,troubleshooting,0.7,True,Shows how to use flow logs for monitoring and troubleshooting traffic; includes product-specific log categories and analysis patterns for diagnosing issues.,unchanged @@ -11,15 +11,15 @@ https://learn.microsoft.com/en-us/azure/nat-gateway/nat-gateway-snat,SNAT with N https://learn.microsoft.com/en-us/azure/nat-gateway/nat-gateway-support-help,Support and troubleshooting,Azure NAT Gateway Support and Help Options,,How to obtain help and support for questions or problems when you create solutions using Azure NAT Gateway.,Here are suggestions for where you can get help when developing your Azure NAT Gateway solutions.,2026-04-09T22:25:00.000Z,troubleshooting,,0.0,False,"Support/help options page that points to where to get assistance. 
Contains no technical configuration details, limits, error codes, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/nat-gateway/nat-gateway-v2-migrate,Migrate NAT Gateway to Standard V2,Migrate Azure NAT Gateway from Standard to StandardV2 - Guidance,Migrate Azure NAT Gateway to StandardV2,Upgrade guidance for migrating Standard NAT Gateway to StandardV2 NAT Gateway.,"StandardV2 NAT Gateway offers enhanced data processing limits and high availability through zone redundancy. StandardV2 NAT Gateway is recommended for production workloads requiring resiliency to zone outages. In this article, we discuss guidance for how to migrate your subnets from Standard NAT gateway to StandardV2 NAT gateway. In place migration to StandardV2 NAT Gateway isn't available. Important Migration from Standard to StandardV2 NAT Gateway involves downtime and impact to existing connec",2026-01-22T06:12:00.000Z,concept-article,decision-making,0.75,True,"Provides migration guidance, including downtime impact and lack of in-place upgrade; this is SKU/upgrade decision and migration-path guidance.",unchanged https://learn.microsoft.com/en-us/azure/nat-gateway/nat-metrics,Metrics and alerts,Metrics and alerts for Azure NAT Gateway - Azure NAT Gateway,Configure metrics and alerts for Azure NAT Gateway,Get started learning about Azure Monitor metrics and alerts available for monitoring Azure NAT Gateway.,"This article provides an overview of all NAT gateway metrics and diagnostic capabilities. This article provides general guidance on how to use metrics and alerts to monitor, manage, and troubleshoot your NAT gateway resource. All metrics are the same for Standard and StandardV2 NAT Gateway.
Figure: Azure NAT Gateway for outbound to Internet",2025-11-18T17:01:00.000Z,how-to,configuration,0.65,True,"Details available metrics and diagnostic capabilities for NAT Gateway; likely includes metric names, dimensions, and usage guidance, which are configuration/monitoring specifics.",unchanged -https://learn.microsoft.com/en-us/azure/nat-gateway/nat-overview,What is Azure NAT Gateway?,What is Azure NAT Gateway?,,"Overview of Azure NAT Gateway features, resources, architecture, and implementation. Learn about what NAT Gateway is and how to use it.",Azure NAT Gateway is a fully managed and highly resilient Network Address Translation (NAT) service. Use Azure NAT Gateway to let all instances in a subnet connect outbound to the internet while remaining fully private. A NAT Gateway doesn't permit unsolicited inbound connections from the internet. Only packets arriving as response packets to an outbound connection can pass through a NAT Gateway. NAT Gateway dynamically allocates SNAT ports to automatically scale outbound connectivity and minimi,2026-04-15T17:15:00.000Z,overview,,0.2,False,"High-level overview of Azure NAT Gateway features and behavior; no detailed limits, configuration tables, error codes, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/nat-gateway/nat-overview,What is Azure NAT Gateway?,What is Azure NAT Gateway?,,"Overview of Azure NAT Gateway features, resources, architecture, and implementation. Learn about what NAT Gateway is and how to use it.",Azure NAT Gateway is a fully managed and highly resilient Network Address Translation (NAT) service. Use Azure NAT Gateway to let all instances in a subnet connect outbound to the internet while remaining fully private. A NAT Gateway doesn't permit unsolicited inbound connections from the internet. Only packets arriving as response packets to an outbound connection can pass through a NAT Gateway. 
NAT Gateway dynamically allocates SNAT ports to automatically scale outbound connectivity and minimi,2026-04-15T17:15:00.000Z,overview,,0.2,False,"High-level overview of Azure NAT Gateway features and behavior; no detailed limits, configuration tables, error codes, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/nat-gateway/nat-sku,NAT Gateway SKUs,Azure NAT Gateway SKUs,Choose between Azure NAT Gateway SKUs,Overview of available NAT Gateway SKUs and their differences.,Azure Network Address Translation (NAT) Gateway has two stock-keeping units (SKUs). This article provides an overview of these different SKUs and their differences.,2026-01-23T23:11:00.000Z,article,decision-making,0.7,True,Compares Standard and StandardV2 SKUs and their differences; SKU selection guidance and trade-offs qualify as decision-making content.,unchanged https://learn.microsoft.com/en-us/azure/nat-gateway/quickstart-create-nat-gateway,Create and validate a Standard Azure NAT Gateway,Create an Azure NAT Gateway,,This quickstart shows how to create a NAT gateway by using the Azure portal.,"In this quickstart, learn how to create a NAT gateway by using the Azure portal, Azure PowerShell, or Azure CLI. The NAT Gateway service provides scalable outbound connectivity for virtual machines in Azure.",2026-02-20T06:21:00.000Z,quickstart,,0.2,False,Quickstart/tutorial for creating a NAT gateway; primarily step-by-step portal/CLI instructions without expert-only configuration matrices or limits.,unchanged -https://learn.microsoft.com/en-us/azure/nat-gateway/quickstart-create-nat-gateway-v2,Create and validate a Standard V2 Azure NAT Gateway,Create a Standard V2 Azure NAT Gateway,,This quickstart shows how to create a Standard V2 Azure NAT Gateway by using the Azure portal.,"In this quickstart, learn how to create a Standard V2 Azure NAT Gateway by using the Azure portal, and PowerShell. 
The NAT Gateway service provides scalable outbound connectivity for virtual machines in Azure.",2026-04-16T17:19:00.000Z,quickstart,,0.1,False,"Quickstart/tutorial for creating a NAT Gateway via portal/PowerShell; focuses on step-by-step creation, not detailed configuration matrices, limits, or troubleshooting content.",updated -https://learn.microsoft.com/en-us/azure/nat-gateway/quickstart-create-nat-gateway-v2-templates,Use deployment templates to create StandardV2 NAT Gateway,Quickstart: Create a Standard V2 Azure NAT Gateway - Deployment templates,Deploy Standard V2 NAT Gateway with ARM/Bicep,This quickstart shows how to create a NAT gateway by using an Azure Resource Manager template (ARM template) and Bicep template.,"Get started with NAT Gateway V2 by using an Azure Resource Manager template (ARM template) or Bicep template. The templates deploy a NAT gateway, virtual network, subnet, and Ubuntu virtual machine for testing NAT gateway functionality. The NAT gateway is assigned to a subnet of the virtual network. An Azure Resource Manager template is a JavaScript Object Notation (JSON) file that defines the infrastructure and configuration for your project. The template uses declarative syntax. You describe you",2026-01-22T06:12:00.000Z,quickstart,deployment,0.65,True,Provides ARM/Bicep templates and parameters for deploying NAT Gateway V2 and related resources; template-based deployment details are product-specific.,unchanged +https://learn.microsoft.com/en-us/azure/nat-gateway/quickstart-create-nat-gateway-v2,Create and validate a Standard V2 Azure NAT Gateway,Create a Standard V2 Azure NAT Gateway,,This quickstart shows how to create a Standard V2 Azure NAT Gateway by using the Azure portal.,"In this quickstart, learn how to create a Standard V2 Azure NAT Gateway by using the Azure portal, and PowerShell.
The NAT Gateway service provides scalable outbound connectivity for virtual machines in Azure.",2026-04-16T17:19:00.000Z,quickstart,,0.1,False,"Quickstart/tutorial for creating a NAT Gateway via portal/PowerShell; focuses on step-by-step creation, not detailed configuration matrices, limits, or troubleshooting content.",unchanged +https://learn.microsoft.com/en-us/azure/nat-gateway/quickstart-create-nat-gateway-v2-templates,Use deployment templates to create StandardV2 NAT Gateway,Quickstart: Create a Standard V2 Azure NAT Gateway - Deployment templates,Deploy Azure NAT Gateway V2 with IaC templates,"This quickstart shows how to create a NAT gateway by using an Azure Resource Manager template (ARM template), Bicep template, or Terraform.","Get started with NAT Gateway V2 by using an Azure Resource Manager template (ARM template), Bicep template, or Terraform. The templates deploy a NAT gateway, virtual network, subnet, and Ubuntu virtual machine for testing NAT gateway functionality. The NAT gateway is assigned to a subnet of the virtual network. An Azure Resource Manager template is a JavaScript Object Notation (JSON) file that defines the infrastructure and configuration for your project. The template uses declarative syntax. You ",2026-04-22T17:34:00.000Z,quickstart,integrations,0.64,True,"The quickstart uses ARM, Bicep, and Terraform templates to provision a NAT Gateway V2, VNet, subnet, and VM. These templates typically include product-specific resource types, properties, and configuration parameters (for example, NAT gateway SKU, public IP association, subnet bindings) that represent concrete integration patterns between Azure NAT Gateway and other Azure resources.
While it is a quickstart, the presence of full infrastructure-as-code templates with specific resource schemas and parameter names constitutes expert, product-specific integration knowledge beyond generic deployment commands.",updated https://learn.microsoft.com/en-us/azure/nat-gateway/region-move-nat-gateway,Create and configure NAT gateway after region move,Deploy a NAT gateway after moving resources between regions - Azure NAT gateway,Redeploy NAT Gateway after cross-region resource move,Get started learning how to deploy and configure a new Azure NAT Gateway for resources moved to another region.,"In this article, you'll learn how to set up a NAT gateway after moving resources to a different region. You might want to move resources to a new Azure region that better suits your customers' location or meets your organization's needs and policies. Note NAT gateway instances can't directly be moved from one region to another. A workaround is to use Azure Resource Mover to move all the resources associated with the existing NAT gateway to the new region. You then create a new instance of NAT ga",2026-01-22T06:12:00.000Z,how-to,deployment,0.7,True,Covers constraints (cannot move NAT Gateway across regions) and the required deployment steps after using Azure Resource Mover; product-specific deployment constraint and workaround.,unchanged https://learn.microsoft.com/en-us/azure/nat-gateway/tutorial-hub-spoke-nat-firewall,Use a NAT gateway with Azure Firewall,Integrate NAT Gateway with Azure Firewall in Hub and Spoke Network - Azure NAT Gateway,Scale outbound traffic with NAT Gateway and Azure Firewall,"Learn to integrate NAT gateway with Azure Firewall in a hub and spoke network for scalable outbound connectivity. Step-by-step tutorial with Portal, PowerShell, and CLI examples.","In this tutorial, you learn how to integrate a NAT gateway with Azure Firewall in a hub and spoke network for enhanced outbound connectivity and scalability. 
Azure Firewall provides 2,496 SNAT ports per public IP address configured per backend Virtual Machine Scale Set instance (minimum of two instances). You can associate up to 250 public IP addresses to Azure Firewall. Depending on your architecture requirements and traffic patterns, you might require more SNAT ports than what Azure Firewall can",2025-11-18T17:01:00.000Z,tutorial,architecture-patterns,0.7,True,"Describes integration of NAT Gateway with Azure Firewall in hub-spoke; includes specific SNAT port counts and IP limits, making it an architecture pattern with numeric thresholds.",unchanged https://learn.microsoft.com/en-us/azure/nat-gateway/tutorial-hub-spoke-route-nat,Use a NAT gateway with a hub and spoke network,Tutorial: Use a NAT gateway with a hub and spoke network - Azure NAT Gateway,Integrate NAT Gateway in hub-spoke with NVA,Learn how to integrate a NAT gateway into a hub and spoke network with a network virtual appliance.,"A hub and spoke network is one of the building blocks of a highly available multiple location network infrastructure. The most common deployment of a hub and spoke network is done with the intention of routing all inter-spoke and outbound internet traffic through the central hub. The purpose is to inspect all of the traffic traversing the network with a Network Virtual Appliance (NVA) for security scanning and packet inspection.
For outbound traffic to the internet, the network virtual appliance",2025-11-18T17:01:00.000Z,tutorial,architecture-patterns,0.65,True,Shows how to use NAT Gateway in a hub-and-spoke architecture with a network virtual appliance; focuses on a specific architecture pattern.,unchanged https://learn.microsoft.com/en-us/azure/nat-gateway/tutorial-migrate-ilip-nat,Migrate a virtual machine public IP address,Tutorial: Migrate a virtual machine public IP address to a NAT gateway - Azure NAT Gateway,Move VM public IP outbound traffic to NAT Gateway,Use this tutorial to learn how to migrate your virtual machine public IP address to an Azure NAT Gateway.,"In this tutorial, you learn how to migrate your virtual machine's public IP address to a NAT gateway. You learn how to remove the IP address from the virtual machine. You reuse the IP address from the virtual machine for the NAT gateway. Azure NAT Gateway is the recommended method for outbound connectivity. Azure NAT Gateway is a fully managed and highly resilient Network Address Translation (NAT) service. A NAT gateway doesn't have the same limitations of Source Network Address Translation (SNA",2025-11-18T17:01:00.000Z,tutorial,deployment,0.6,True,Shows how to migrate from a VM’s direct public IP to NAT Gateway while reusing the IP; product-specific deployment/migration guidance.,unchanged -https://learn.microsoft.com/en-us/azure/nat-gateway/tutorial-migrate-outbound-nat,Migrate outbound access,Tutorial: Migrate outbound access to NAT gateway,Migrate Azure outbound access to NAT Gateway,Use this tutorial to learn how to migrate outbound access in your virtual network to an Azure NAT gateway.,"In this tutorial, you learn how to migrate your outbound connectivity from default outbound access to a NAT gateway. You learn how to change your outbound connectivity from load balancer outbound rules to a NAT gateway. You reuse the IP address from the outbound rule configuration for the NAT gateway.
Azure NAT Gateway is the recommended method for outbound connectivity. A NAT gateway is a fully managed and highly resilient Network Address Translation (NAT) service. A NAT gateway doesn't have the ",2026-04-16T22:31:00.000Z,tutorial,decision-making,0.65,True,"Tutorial on migrating outbound connectivity from default outbound access or load balancer outbound rules to NAT Gateway, including when and how to switch and reuse IPs. This is migration/selection guidance between outbound options, fitting decision-making.",updated +https://learn.microsoft.com/en-us/azure/nat-gateway/tutorial-migrate-outbound-nat,Migrate outbound access,Tutorial: Migrate outbound access to NAT gateway,Migrate Azure outbound access to NAT Gateway,Use this tutorial to learn how to migrate outbound access in your virtual network to an Azure NAT gateway.,"In this tutorial, you learn how to migrate your outbound connectivity from default outbound access to a NAT gateway. You learn how to change your outbound connectivity from load balancer outbound rules to a NAT gateway. You reuse the IP address from the outbound rule configuration for the NAT gateway. Azure NAT Gateway is the recommended method for outbound connectivity. A NAT gateway is a fully managed and highly resilient Network Address Translation (NAT) service. A NAT gateway doesn't have the 
This is migration/selection guidance between outbound options, fitting decision-making.",unchanged https://learn.microsoft.com/en-us/azure/nat-gateway/tutorial-nat-gateway-load-balancer-internal-portal,Integrate NAT gateway internal load balancer,Tutorial: Integrate NAT gateway with an internal load balancer - Azure portal - Azure NAT Gateway,Use NAT Gateway with internal load balancer,"In this tutorial, learn how to integrate a NAT gateway with an internal load Balancer using the Azure portal.","In this tutorial, you learn how to integrate a NAT gateway with an internal load balancer. By default, an Azure Standard Load Balancer is secure. Outbound connectivity is explicitly defined by enabling outbound SNAT (Source Network Address Translation). SNAT is enabled for an internal backend pool via another public load balancer, network routing, or a public IP defined on a virtual machine. The NAT gateway integration replaces the need for the deployment of a public load balancer, network routi",2025-11-18T17:01:00.000Z,tutorial,architecture-patterns,0.6,True,Describes replacing public load balancer or routing with NAT Gateway for internal load balancer backends; specific outbound connectivity pattern.,unchanged https://learn.microsoft.com/en-us/azure/nat-gateway/tutorial-nat-gateway-load-balancer-public-portal,Integrate NAT gateway public load balancer,Tutorial: Integrate NAT gateway with a public load balancer - Azure portal - Azure NAT Gateway,Use NAT Gateway with public load balancer,"In this tutorial, learn how to integrate a NAT gateway with a public load Balancer using the Azure portal.","In this tutorial, you learn how to integrate a NAT gateway with a public load balancer. By default, an Azure Standard Load Balancer is secure. Outbound connectivity is explicitly defined by enabling outbound SNAT (Source Network Address Translation). SNAT is enabled in a load-balancing rule or outbound rules. 
The NAT gateway integration replaces the need for outbound rules for backend pool outbound SNAT. In this tutorial, you learn how to:",2025-11-18T17:01:00.000Z,tutorial,architecture-patterns,0.6,True,Shows how NAT Gateway replaces outbound rules on a public load balancer; this is a product-specific integration pattern for outbound connectivity design.,unchanged diff --git a/products/azure-nat-gateway/report.md b/products/azure-nat-gateway/report.md index 99c4299b..36ad36b8 100644 --- a/products/azure-nat-gateway/report.md +++ b/products/azure-nat-gateway/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: limits-quotas: NAT Gateway limits, SNAT port quotas, connection scaling behavior, per-resource caps, and FAQs on throughput, IPs, and troubleshooting limit-related @@ -17,21 +17,24 @@ category_descriptions: connectivity patterns when using Azure NAT Gateway. decision-making: Guidance on choosing NAT Gateway SKUs, migrating existing NAT Gateways to StandardV2, and moving outbound internet access from other methods to NAT Gateway. + integrations: Guidance and templates for deploying Azure NAT Gateway V2 using infrastructure-as-code + (ARM/Bicep/Terraform), including configuration patterns and automation. deployment: How to deploy and redeploy NAT Gateway (ARM/Bicep), migrate or move outbound traffic from VMs/public IPs, and transition existing outbound access to Azure NAT Gateway. skill_description: Expert knowledge for Azure NAT Gateway development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, - configuration, and deployment. Use when managing SNAT ports, outbound IPs, flow - logs, hub-spoke egress, or Azure Firewall integration, and other Azure NAT Gateway - related development tasks. 
Not for Azure Virtual Network (use azure-virtual-network), - Azure Virtual Network Manager (use azure-virtual-network-manager), Azure Virtual - WAN (use azure-virtual-wan), Azure VPN Gateway (use azure-vpn-gateway). -use_when: Use when managing SNAT ports, outbound IPs, flow logs, hub-spoke egress, - or Azure Firewall integration, and other Azure NAT Gateway related development tasks. + configuration, integrations & coding patterns, and deployment. Use when configuring + SNAT ports, scaling outbound IPs, reading NAT flow logs, or deploying NAT Gateway + via IaC, and other Azure NAT Gateway related development tasks. Not for Azure Virtual + Network (use azure-virtual-network), Azure Load Balancer (use azure-load-balancer), + Azure Virtual WAN (use azure-virtual-wan), Azure VPN Gateway (use azure-vpn-gateway). +use_when: Use when configuring SNAT ports, scaling outbound IPs, reading NAT flow + logs, or deploying NAT Gateway via IaC, and other Azure NAT Gateway related development + tasks. confusable_not_for: Not for Azure Virtual Network (use azure-virtual-network), Azure - Virtual Network Manager (use azure-virtual-network-manager), Azure Virtual WAN (use - azure-virtual-wan), Azure VPN Gateway (use azure-vpn-gateway). + Load Balancer (use azure-load-balancer), Azure Virtual WAN (use azure-virtual-wan), + Azure VPN Gateway (use azure-vpn-gateway). 
--- # Azure NAT Gateway Crawl Report @@ -45,8 +48,8 @@ confusable_not_for: Not for Azure Virtual Network (use azure-virtual-network), A ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 4 -- **Unchanged**: 20 +- **Updated Pages**: 1 +- **Unchanged**: 23 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-nat-gateway/azure-nat-gateway.csv` @@ -58,7 +61,8 @@ confusable_not_for: Not for Azure Virtual Network (use azure-virtual-network), A | best-practices | 1 | 4.2% | | configuration | 5 | 20.8% | | decision-making | 3 | 12.5% | -| deployment | 3 | 12.5% | +| deployment | 2 | 8.3% | +| integrations | 1 | 4.2% | | limits-quotas | 1 | 4.2% | | troubleshooting | 1 | 4.2% | | *(Unclassified)* | 5 | 20.8% | @@ -67,14 +71,8 @@ confusable_not_for: Not for Azure Virtual Network (use azure-virtual-network), A ### Updated Pages -- [What is Azure NAT Gateway?](https://learn.microsoft.com/en-us/azure/nat-gateway/nat-overview) - - Updated: 2025-11-18T17:01:00.000Z → 2026-04-15T17:15:00.000Z -- [Create and validate a Standard V2 Azure NAT Gateway](https://learn.microsoft.com/en-us/azure/nat-gateway/quickstart-create-nat-gateway-v2) - - Updated: 2025-11-18T18:43:00.000Z → 2026-04-16T17:19:00.000Z -- [Migrate outbound access](https://learn.microsoft.com/en-us/azure/nat-gateway/tutorial-migrate-outbound-nat) - - Updated: 2025-11-18T17:01:00.000Z → 2026-04-16T22:31:00.000Z -- [FAQ](https://learn.microsoft.com/en-us/azure/nat-gateway/faq) - - Updated: 2026-01-21T06:14:00Z → 2026-04-16T17:19:00Z +- [Use deployment templates to create StandardV2 NAT Gateway](https://learn.microsoft.com/en-us/azure/nat-gateway/quickstart-create-nat-gateway-v2-templates) + - Updated: 2026-01-22T06:12:00.000Z → 2026-04-22T17:34:00.000Z ## Classified Pages @@ -95,7 +93,7 @@ confusable_not_for: Not for Azure Virtual Network (use azure-virtual-network), A | [Migrate outbound 
access](https://learn.microsoft.com/en-us/azure/nat-gateway/tutorial-migrate-outbound-nat) | decision-making | 0.65 | Tutorial on migrating outbound connectivity from default outbound access or load balancer outbound rules to NAT Gateway, including when and how to switch and reuse IPs. This is migration/selection guidance between outbound options, fitting decision-making. | | [NAT gateway resource](https://learn.microsoft.com/en-us/azure/nat-gateway/nat-gateway-resource) | configuration | 0.65 | Describes key components of the NAT gateway resource and how they are configured; likely includes resource properties and settings specific to NAT Gateway beyond generic knowledge. | | [Use a NAT gateway with a hub and spoke network](https://learn.microsoft.com/en-us/azure/nat-gateway/tutorial-hub-spoke-route-nat) | architecture-patterns | 0.65 | Shows how to use NAT Gateway in a hub-and-spoke architecture with a network virtual appliance; focuses on a specific architecture pattern. | -| [Use deployment templates to create StandardV2 NAT Gateway](https://learn.microsoft.com/en-us/azure/nat-gateway/quickstart-create-nat-gateway-v2-templates) | deployment | 0.65 | Provides ARM/Bicep templates and parameters for deploying NAT Gateway V2 and related resources; template-based deployment details are product-specific. | +| [Use deployment templates to create StandardV2 NAT Gateway](https://learn.microsoft.com/en-us/azure/nat-gateway/quickstart-create-nat-gateway-v2-templates) | integrations | 0.64 | The quickstart uses ARM, Bicep, and Terraform templates to provision a NAT Gateway V2, VNet, subnet, and VM. These templates typically include product-specific resource types, properties, and configuration parameters (for example, NAT gateway SKU, public IP association, subnet bindings) that represent concrete integration patterns between Azure NAT Gateway and other Azure resources. 
While it is a quickstart, the presence of full infrastructure-as-code templates with specific resource schemas and parameter names constitutes expert, product-specific integration knowledge beyond generic deployment commands. | | [Integrate NAT gateway internal load balancer](https://learn.microsoft.com/en-us/azure/nat-gateway/tutorial-nat-gateway-load-balancer-internal-portal) | architecture-patterns | 0.60 | Describes replacing public load balancer or routing with NAT Gateway for internal load balancer backends; specific outbound connectivity pattern. | | [Integrate NAT gateway public load balancer](https://learn.microsoft.com/en-us/azure/nat-gateway/tutorial-nat-gateway-load-balancer-public-portal) | architecture-patterns | 0.60 | Shows how NAT Gateway replaces outbound rules on a public load balancer; this is a product-specific integration pattern for outbound connectivity design. | | [Migrate a virtual machine public IP address](https://learn.microsoft.com/en-us/azure/nat-gateway/tutorial-migrate-ilip-nat) | deployment | 0.60 | Shows how to migrate from a VM’s direct public IP to NAT Gateway while reusing the IP; product-specific deployment/migration guidance. 
| diff --git a/products/azure-netapp-files/azure-netapp-files.csv b/products/azure-netapp-files/azure-netapp-files.csv index 7c036145..f1eb1c88 100644 --- a/products/azure-netapp-files/azure-netapp-files.csv +++ b/products/azure-netapp-files/azure-netapp-files.csv @@ -53,7 +53,7 @@ https://learn.microsoft.com/en-us/azure/azure-netapp-files/azure-netapp-files-pe https://learn.microsoft.com/en-us/azure/azure-netapp-files/azure-netapp-files-quickstart-set-up-account-create-volumes,Set up Azure NetApp Files and create an NFS volume,Quickstart: Set up Azure NetApp Files and NFS volume,,Quickstart - Describes how to quickly set up Azure NetApp Files and create a volume using the Azure portal and various command-line tools,"This article shows you how to quickly set up Azure NetApp Files and create an NFS volume. In this quickstart, you set up the following items: If you don't have an Azure subscription, create afree accountbefore you begin. To see all features that you can enable for an NFS volume and relevant considerations, seeCreate an NFS volume.",2026-01-07T08:00:00.000Z,quickstart,,0.2,False,"Quickstart tutorial for setting up an account and volume; step-by-step usage, not configuration reference or limits.",unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/azure-netapp-files-register,Register for NetApp Resource Provider,Register for NetApp Resource Provider to use with Azure NetApp Files,Register NetApp Resource Provider in Azure,Learn how to register the NetApp Resource Provider for Azure NetApp Files.,"To use Azure NetApp Files, you need to register the NetApp Resource Provider. 
From the Azure portal, select the Azure Cloud Shell icon on the upper right-hand corner: If you have multiple subscriptions on your Azure account, select the one that you want to configure for Azure NetApp Files: In the Azure Cloud Shell console, enter the following command to register the Azure Resource Provider: The--waitparameter instructs the console to wait for the registration to complete. The registration proces",2025-07-15T08:00:00.000Z,how-to,configuration,0.6,True,"Shows exact Azure CLI/PowerShell commands and parameters (including --wait) to register the provider, which are concrete configuration steps.",unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/azure-netapp-files-resize-capacity-pools-or-volumes,Resize a capacity pool or a volume,Resize the capacity pool or a volume for Azure NetApp Files,,Learn how to change the size of a capacity pool or a volume. Resizing the capacity pool changes the purchased Azure NetApp Files capacity.,"You can change the size of a capacity pool or a volume as necessary, for example, when a volume or capacity pool fills up. 
For information about monitoring a volume’s capacity, seeMonitor the capacity of a volume.",2025-05-20T08:00:00.000Z,how-to,,0.45,False,"Resize capacity pools/volumes tutorial; summary doesn’t show numeric limits, constraints, or configuration matrices.",unchanged -https://learn.microsoft.com/en-us/azure/azure-netapp-files/azure-netapp-files-resource-limits,Resource limits for Azure NetApp Files,Resource limits for Azure NetApp Files,Manage Azure NetApp Files resource limits and quotas,Describes limits for Azure NetApp Files resources and how to request resource limit increase.,Understanding resource limits for Azure NetApp Files helps you manage your volumes.,2026-04-14T08:00:00.000Z,concept-article,limits-quotas,0.95,True,Page explicitly describes Azure NetApp Files resource limits and how to request increases; this is product-specific numeric quota information that changes over time and is unlikely to be known from training.,updated +https://learn.microsoft.com/en-us/azure/azure-netapp-files/azure-netapp-files-resource-limits,Resource limits for Azure NetApp Files,Resource limits for Azure NetApp Files,Manage Azure NetApp Files resource limits and quotas,Describes limits for Azure NetApp Files resources and how to request resource limit increase.,Understanding resource limits for Azure NetApp Files helps you manage your volumes.,2026-04-14T08:00:00.000Z,concept-article,limits-quotas,0.95,True,Page explicitly describes Azure NetApp Files resource limits and how to request increases; this is product-specific numeric quota information that changes over time and is unlikely to be known from training.,unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/azure-netapp-files-sdk-cli,"SDKs, CLI tools, and ARM templates",Azure NetApp Files SDKs and CLI tools,,"Learn about supported SDKs for Azure NetApp Files and their published locations in GitHub, and about supported command-line tools: Azure CLI and PowerShell.","Learn about the SDKs, command-line (CLI) 
tools, and Azure Resource Manager (ARM) templates supported by Azure NetApp Files.",2025-06-30T08:00:00.000Z,concept-article,,0.4,False,Catalog of SDKs/CLI tools and locations; mostly navigational/overview without deep configuration or limits.,unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/azure-netapp-files-service-levels,Understand service levels,Service levels for Azure NetApp Files,Choose Azure NetApp Files service levels by throughput,Describes throughput performance for the service levels of Azure NetApp Files.,Service levels are an attribute of a capacity pool. Service levels are defined and differentiated by the allowed maximum throughput for a volume in the capacity pool based on the quota assigned to the volume. Throughput is a combination of read and write speed.,2026-02-02T08:00:00.000Z,concept-article,decision-making,0.7,True,"Service levels for Azure NetApp Files are defined by specific throughput per volume based on quota and pool service level. This page typically includes concrete MB/s per TiB values and comparisons between service levels, enabling selection decisions based on quantified performance characteristics.",unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/azure-netapp-files-set-up-capacity-pool,Set up a capacity pool,Create a capacity pool for Azure NetApp Files,Create capacity pool for Azure NetApp Files,Describes how to create a capacity pool so that you can create volumes within it.,"Creating a capacity pool enables you to create volumes within it. 
Important For a NetApp Elastic account, seeCreate an Elastic zone-redundant capacity pool.",2025-10-02T08:00:00.000Z,how-to,configuration,0.7,True,"Capacity pool setup defines size, service level, and other parameters that directly affect quotas and throughput.",unchanged @@ -81,7 +81,7 @@ https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-application https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-application-volume-oracle-azure-resource-manager,Deploy application volume group for Oracle using Azure Resource Manager,Deploy Azure NetApp Files application volume group for Oracle using Azure Resource Manager,Deploy Oracle AVGs on Azure NetApp Files with ARM,Describes how to use an Azure Resource Manager (ARM) template to deploy Azure NetApp Files application volume group for Oracle.,"This article describes how to use an Azure Resource Manager (ARM) template to deploy Azure NetApp Files application volume group for Oracle. For detailed documentation on how to use the ARM template, seeORACLE Azure NetApp Files storage.",2025-05-19T17:08:00.000Z,concept-article,deployment,0.65,True,ARM template-based deployment for Oracle AVGs; likely includes template parameters and deployment constraints specific to this product and scenario.,unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-cache-volumes,Manage cache volumes,Configure a cache volume for Azure NetApp Files,,This article shows you how to create a cache volume in Azure NetApp Files.,"The purpose of this article is to provide users of Azure NetApp Files with cache volumes that simplify file distribution, reduce WAN latency, and lower WAN/ExpressRoute bandwidth costs. Azure NetApp Files cache volumes are currently designed to be peered with external sources—origin volumes in on-premises ONTAP, Cloud Volumes ONTAP, or Amazon FSx for NetApp. 
Azure NetApp Files cache volumes are cloud-based caches of an external origin volume, containing only the most actively accessed data on th",2026-03-13T17:21:00.000Z,how-to,,0.3,False,"How-to for configuring cache volumes; summary focuses on purpose and supported origins, without evidence of detailed configuration parameter tables or quantified best practices.",unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-customer-managed-keys,Configure customer-managed keys,Configure customer-managed keys for Azure NetApp Files volume encryption,Configure customer-managed keys for Azure NetApp Files encryption,Describes how to configure customer-managed keys for Azure NetApp Files volume encryption.,"Customer-managed keys for Azure NetApp Files volume encryption enable you to use your own keys rather than a platform-managed key when creating a new volume. With customer-managed keys, you can fully manage the relationship between a key's life cycle, key usage permissions, and auditing operations on keys. The following diagram demonstrates how customer-managed keys work with Azure NetApp Files: Azure NetApp Files grants permissions to encryption keys to a managed identity. 
The managed identity ",2025-06-20T08:00:00.000Z,how-to,security,0.8,True,Details CMK setup with managed identity and Key Vault for ANF volume encryption; includes product-specific security configuration and key lifecycle relationships.,unchanged -https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-customer-managed-keys-hardware,Configure customer-managed keys with managed HSM,Configure customer-managed keys with managed Hardware Security Module for Azure NetApp Files volume encryption,Use HSM-backed customer-managed keys for Azure NetApp Files,Learn how to encrypt data in Azure NetApp Files with customer-managed keys using the Hardware Security Module,Azure NetApp Files volume encryption with customer-managed keys with the managed Hardware Security Module (HSM) is an extension tocustomer-managed keys for Azure NetApp Files volumes encryption feature. Customer-managed keys with HSM allows you to store your encryptions keys in a more secure FIPS 140-2 Level 3 HSM instead of the FIPS 140-2 Level 1 or Level 2 service used by Azure Key Vault (AKV).,2025-11-12T08:00:00.000Z,how-to,security,0.8,True,Describes using managed HSM (FIPS 140-2 Level 3) instead of standard Key Vault for ANF CMK; product-specific encryption configuration.,unchanged +https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-customer-managed-keys-hardware,Configure customer-managed keys with managed HSM,Configure customer-managed keys with managed Hardware Security Module for Azure NetApp Files volume encryption,Set up HSM-backed customer-managed keys for Azure NetApp Files,Learn how to encrypt data in Azure NetApp Files with customer-managed keys using the Hardware Security Module,Azure NetApp Files volume encryption with customer-managed keys with the managed Hardware Security Module (HSM) is an extension tocustomer-managed keys for Azure NetApp Files volumes encryption feature. 
Customer-managed keys with HSM allows you to store your encryptions keys in a more secure FIPS 140-2 Level 3 HSM instead of the FIPS 140-2 Level 1 or Level 2 service used by Azure Key Vault (AKV).,2026-04-22T17:34:00.000Z,how-to,security,0.7,True,"Describes concrete steps and parameters to configure customer-managed keys using a managed HSM for Azure NetApp Files volume encryption, including product-specific security configuration and key management details beyond generic concepts.",updated https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-kerberos-encryption,Configure NFSv4.1 Kerberos encryption,Configure NFSv4.1 Kerberos encryption for Azure NetApp Files,Configure NFSv4.1 Kerberos encryption for Azure NetApp Files,Describes how to configure NFSv4.1 Kerberos encryption for Azure NetApp Files and the performance impact.,"Azure NetApp Files supports NFS client encryption in Kerberos modes (krb5, krb5i, and krb5p) with AES-256 encryption. This article describes the required configurations for using an NFSv4.1 volume with Kerberos encryption.",2025-04-16T08:00:00.000Z,how-to,security,0.75,True,"Covers Kerberos modes (krb5, krb5i, krb5p), AES-256 support, and required ANF-specific configuration steps; security-focused with concrete mode values.",unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-ldap-extended-groups,Configure AD DS LDAP with extended groups for NFS,Enable Active Directory Domain Services (AD DS) LDAP authentication for NFS volumes,Enable LDAP extended groups for Azure NetApp Files NFS volumes,Describes the considerations and steps for enabling LDAP with extended groups when you create an NFS volume by using Azure NetApp Files.,"When youcreate an NFS volume, you can enable the LDAP with extended groups feature (theLDAPoption) for the volume. This feature enables Active Directory LDAP users and extended groups (up to 1,024 groups) to access files and directories in the volume. 
You can use the LDAP with extended groups feature with both NFSv4.1 and NFSv3 volumes. Note By default, in Active Directory LDAP servers, theMaxPageSizeattribute is set to a default of 1,000. This setting means that groups beyond 1,000 are truncate",2025-02-21T08:00:00.000Z,how-to,limits-quotas,0.8,True,"Includes explicit numeric limits (up to 1,024 groups, MaxPageSize default 1,000) and behavior when limits are exceeded; these are concrete quotas.",unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-ldap-over-tls,Configure AD DS LDAP authentication for NFS volumes,Configure AD DS LDAP over TLS for Azure NetApp Files,Configure AD DS LDAP over TLS for NetApp Files,"Describes how to configure AD DS LDAP over TLS for Azure NetApp Files, including root CA certificate management.","You can use Lightweight Directory Access Protocol (LDAP) over TLS to secure communication between an Azure NetApp Files volume and the Active Directory LDAP server. You can enable LDAP over TLS for NFS, SMB, and dual-protocol volumes of Azure NetApp Files.",2025-11-28T08:00:00.000Z,how-to,security,0.9,True,"Covers LDAP over TLS setup, including root CA certificate management and protocol-specific settings, which are detailed security configurations.",unchanged @@ -101,7 +101,7 @@ https://learn.microsoft.com/en-us/azure/azure-netapp-files/cross-region-replicat https://learn.microsoft.com/en-us/azure/azure-netapp-files/cross-region-replication-display-health-status,Display health and monitor status of replication relationship,Display health status of Azure NetApp Files replication relationship,,Describes how to view replication status on the source volume or the destination volume of Azure NetApp Files.,You can view replication status on the source volume or the destination volume. 
You can also set alert rules in Azure Monitor to help you monitor the replication status.,2025-03-03T08:00:00.000Z,how-to,,0.4,False,"Viewing replication health and setting alerts; likely UI steps and generic Azure Monitor usage, not deep product-specific troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/cross-region-replication-manage-disaster-recovery,Manage disaster recovery,Manage disaster recovery using Azure NetApp Files,,Describes how to manage disaster recovery by using Azure NetApp Files cross-region replication.,"An ongoing replication (withcross-zoneorcross-region replication) between the source and the destination volumes prepares you for a disaster recovery event. When such an event occurs, you canfail over to the destination volume, enabling the client to read and write to the destination volume. After disaster recovery, you can perform aresyncoperation to fail back to the source volume. You thenreestablish the source-to-destination replicationand remount the source volume for the client to access. 
N",2025-05-13T17:02:00.000Z,how-to,,0.45,False,"Disaster recovery workflow using replication; mostly operational runbook style, not focused on numeric thresholds, configs, or error codes.",unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/cross-zone-region-replication-configure,Create cross-zone-region replication,Manage cross-zone-region replication for Azure NetApp Files,,Describes how to manage disaster recovery by using Azure NetApp Files cross-zone-region replication.,Azure NetApp Files supports volume cross-zone and cross-region replication on the same source volume.,2025-12-11T18:27:00.000Z,how-to,,0.3,False,"Covers managing cross-zone-region replication for DR; appears to be operational guidance rather than detailed limits, configuration tables, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/azure-netapp-files/customer-managed-keys-cross-tenant,Configure cross-tenant customer-managed keys,Configure cross-tenant customer-managed keys for Azure NetApp Files volume encryption,Configure cross-tenant CMK encryption for Azure NetApp Files,Learn how to configure cross-tenant customer-managed keys for Azure NetApp Files volume encryption.,"Cross-tenant customer-managed keys (CMK) for Azure NetApp Files volume encryption allows service providers based on Azure to offercustomer-managed key encryption. In the cross-tenant scenario, the NetApp account resides in a tenant managed by an independent software vendor, while the key used for encryption of volumes in that NetApp account resides in a key vault in a tenant that you manage. Cross-tenant customer-managed keys is available in all Azure NetApp Files supported regions.",2026-03-03T18:22:00.000Z,how-to,security,0.72,True,"The page describes detailed, product-specific steps and requirements for configuring cross-tenant customer-managed keys for Azure NetApp Files volume encryption, including tenant relationships, Key Vault usage, and encryption configuration. 
This is security-focused configuration (key management and access across tenants) rather than generic concepts, and contains implementation details that are unlikely to be fully known from pretraining.",unchanged +https://learn.microsoft.com/en-us/azure/azure-netapp-files/customer-managed-keys-cross-tenant,Configure cross-tenant customer-managed keys,Configure cross-tenant customer-managed keys for Azure NetApp Files volume encryption,Configure cross-tenant customer-managed keys for Azure NetApp Files,Learn how to configure cross-tenant customer-managed keys for Azure NetApp Files volume encryption.,"Cross-tenant customer-managed keys (CMK) for Azure NetApp Files volume encryption allows service providers based on Azure to offercustomer-managed key encryption. In the cross-tenant scenario, the NetApp account resides in a tenant managed by an independent software vendor, while the key used for encryption of volumes in that NetApp account resides in a key vault in a tenant that you manage. 
Cross-tenant customer-managed keys is available in all Azure NetApp Files supported regions.",2026-04-23T06:20:00.000Z,how-to,security,0.7,True,"Provides detailed configuration steps for cross-tenant CMK setup, including tenant, key vault, and encryption configuration specific to Azure NetApp Files, which qualifies as product-specific security and identity configuration guidance.",updated https://learn.microsoft.com/en-us/azure/azure-netapp-files/data-plane-security,Understand data plane security,Understand Azure NetApp Files data plane security,Configure Azure NetApp Files data plane security,Learn about the different data plane security features in Azure NetApp Files,Learn about the different data plane security features in Azure NetApp Files to understand what is available to best serve your needs.,2024-10-25T08:00:00.000Z,concept-article,security,0.7,True,Data plane security feature description will include specific security options and possibly role/permission scopes unique to Azure NetApp Files.,unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/data-protection-disaster-recovery-options,Understand data protection and disaster recovery options,Understand data protection and disaster recovery options in Azure NetApp Files,Choose Azure NetApp Files data protection options,"Learn about data protection and disaster recovery options available in Azure NetApp Files, including snapshots, backups, cross-zone replication, and cross-region replication.",Learn about the different data protection and disaster recovery features in Azure NetApp Files and understand what solutions best serve your needs.,2026-03-03T08:00:00.000Z,concept-article,decision-making,0.68,True,"The page compares Azure NetApp Files data protection and DR mechanisms (snapshots, backups, cross-zone and cross-region replication) and helps select which solution best fits different needs. 
It goes beyond conceptual overview by providing product-specific guidance on when to use each option for various scenarios, which aligns with decision-making. It does not primarily focus on limits, configuration tables, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/default-individual-user-group-quotas-introduction,Understand default and individual user and group quotas,Understand default and individual user and group quotas for Azure NetApp Files volumes,Configure user and group quotas on Azure NetApp Files volumes,Helps you understand the use cases of managing default and individual user and group quotas for Azure NetApp Files volumes.,User and group quotas enable you to restrict the logical space that a user or group can consume in a volume. User and group quotas apply to a specific Azure NetApp Files volume.,2025-03-21T08:00:00.000Z,concept-article,limits-quotas,0.65,True,"User/group quota documentation typically includes specific quota units, enforcement behavior, and possibly default values, which are detailed limit semantics beyond generic knowledge.",unchanged @@ -133,11 +133,11 @@ https://learn.microsoft.com/en-us/azure/azure-netapp-files/faq-nfs,NFS FAQs,NFS https://learn.microsoft.com/en-us/azure/azure-netapp-files/faq-performance,Performance FAQs,Performance FAQs for Azure NetApp Files,,Answers frequently asked questions (FAQs) about Azure NetApp Files Performance.,This article answers frequently asked questions (FAQs) about Azure NetApp Files Performance.,2024-09-10T22:03:00.000Z,concept-article,,0.4,False,"Performance FAQ likely includes some product-specific behaviors, but the description doesn’t clearly indicate numeric limits, decision matrices, or config tables; classified as general FAQ.",unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/faq-security,Security FAQs,Security FAQs for Azure NetApp Files,,Answers frequently asked questions (FAQs) about Azure NetApp Files security.,This 
article answers frequently asked questions (FAQs) about Azure NetApp Files security.,2025-04-24T17:06:00.000Z,concept-article,,0.3,False,"Security FAQ is summarized generically; without explicit RBAC role lists, parameter tables, or detailed security configs in the description, it’s treated as high-level FAQ content.",unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/faq-smb,SMB FAQs,SMB FAQs for Azure NetApp Files,Resolve common Azure NetApp Files SMB issues,Answers frequently asked questions (FAQs) about the SMB protocol of Azure NetApp Files.,This article answers frequently asked questions (FAQs) about the SMB protocol of Azure NetApp Files.,2026-04-09T08:00:00.000Z,concept-article,troubleshooting,0.68,True,"As an SMB-specific FAQ for Azure NetApp Files, this page typically maps concrete symptoms and behaviors (for example, SMB feature support, access/permission issues, protocol/version behaviors, client compatibility) to product-specific explanations and resolutions. These are not generic SMB concepts but Azure NetApp Files–specific behaviors and constraints, which function as troubleshooting guidance (symptom → cause → resolution). While formatted as FAQ rather than a classic runbook, the content is product- and protocol-specific enough to qualify as expert troubleshooting knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/azure-netapp-files/generate-user-group-quota-reports,Generate user and group quota reports,Generate user and group quota reports for Azure NetApp Files volumes,,Learn how to generate user and group quota reports for Azure NetApp Files volumes.,"To help with capacity management on volumes shared among multiple users, individual user and group quotas restrict capacity usage on NFS, SMB, and dual-protocol volumes. Quota reporting in Azure NetApp Files allows administrators to generate usage reports for an existing volume with quota rules, independent of host-based tooling and without having to mount the volume. 
For more information and considerations related to capacity management, see Understand default and individual user and group quota",2026-04-14T17:11:00.000Z,how-to,,0.3,False,"Appears to be a how-to guide for generating quota reports, focused on operational steps rather than listing specific numeric limits, configuration parameter tables, or error-code-based troubleshooting. The description mentions quotas conceptually and capacity management but does not indicate concrete limits, settings matrices, or decision criteria that meet the expert-knowledge thresholds for any sub-skill type.",updated +https://learn.microsoft.com/en-us/azure/azure-netapp-files/generate-user-group-quota-reports,Generate user and group quota reports,Generate user and group quota reports for Azure NetApp Files volumes,,Learn how to generate user and group quota reports for Azure NetApp Files volumes.,"To help with capacity management on volumes shared among multiple users, individual user and group quotas restrict capacity usage on NFS, SMB, and dual-protocol volumes. Quota reporting in Azure NetApp Files allows administrators to generate usage reports for an existing volume with quota rules, independent of host-based tooling and without having to mount the volume. For more information and considerations related to capacity management, see Understand default and individual user and group quota",2026-04-14T17:11:00.000Z,how-to,,0.3,False,"Appears to be a how-to guide for generating quota reports, focused on operational steps rather than listing specific numeric limits, configuration parameter tables, or error-code-based troubleshooting. 
The description mentions quotas conceptually and capacity management but does not indicate concrete limits, settings matrices, or decision criteria that meet the expert-knowledge thresholds for any sub-skill type.",unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/join-active-directory-domain,Join a Linux VM to an Active Directory Domain,Join a Linux VM to a Microsoft Entra Domain,Join Linux VM to Microsoft Entra Domain,Describes how to join a Linux VM to a Microsoft Entra Domain,"Joining a Linux virtual machine (VM) to a Microsoft Entra Domain Services managed domain enables users to sign in to VMs with one set of credentials. Once joined, the user accounts and credentials can be used to sign in, access, and manage servers. Refer to Understand guidelines for Active Directory Domain Services site design and planning to learn more about using Active Directory in Azure NetApp Files.",2025-03-01T08:00:00.000Z,how-to,security,0.6,True,"Step-by-step domain join for Linux VMs to Entra Domain Services, including commands and configuration values for secure domain integration.",unchanged
Key Distribution Centers (KDCs) such as Windows Active Directory maintain a database of Kerberos principals and their ha,2025-09-03T08:00:00.000Z,concept-article,security,0.7,True,"Product-specific description of how Kerberos is applied, including KDC interactions and protocol behavior for Azure NetApp Files.",unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/large-volumes,Understand large volumes,Understand large volumes in Azure NetApp Files,Understand size and performance limits for large Azure NetApp Files volumes,"Learn about the benefits, use cases, and requirements for using large volumes in Azure NetApp Files.","Volumes in Azure NetApp Files are the way you present high performance, cost-effective storage to your network attached storage (NAS) clients in the Azure cloud. Volumes act as independent file systems with their own capacity, file counts, ACLs, snapshots, and file system IDs. These qualities provide a way to separate datasets into individual secure tenants. All resources in Azure NetApp Files have limits. Regular volumes have the following limits: Large volumes have the following limits. With cool",2025-11-13T18:15:00.000Z,concept-article,limits-quotas,0.85,True,Explicitly contrasts regular vs large volume limits; includes numeric limits and behaviors for large volumes.,unchanged -https://learn.microsoft.com/en-us/azure/azure-netapp-files/large-volumes-requirements-considerations,Requirements and considerations for large volumes,Requirements and considerations for Azure NetApp Files large volumes,Use Azure NetApp Files large volumes within size limits,Describes the requirements and considerations you need to be aware of before using large volumes.,"Azure NetApp Files large volumes support sizes between 50 TiB and 1,024 TiB. With breakthrough mode, you can create large volumes at sizes between 2,400 GiB and 2,400 TiB. You must request the feature before using it for the first time. 
With cool access enabled, large volumes can scale to 7.2 PiB in certain situations; for more information, see large volumes up to 7.2 PiB. There are requirements and considerations you need to be aware of before using large volumes.",2026-04-15T11:11:00.000Z,concept-article,limits-quotas,0.8,True,"Page includes precise supported size ranges for large volumes (e.g., 50 TiB–1,024 TiB, 2,400 GiB–2,400 TiB, scaling to 7.2 PiB with cool access), which are concrete product limits and requirements not inferable from general knowledge.",updated +https://learn.microsoft.com/en-us/azure/azure-netapp-files/large-volumes-requirements-considerations,Requirements and considerations for large volumes,Requirements and considerations for Azure NetApp Files large volumes,Use Azure NetApp Files large volumes within size limits,Describes the requirements and considerations you need to be aware of before using large volumes.,"Azure NetApp Files large volumes support sizes between 50 TiB and 1,024 TiB. With breakthrough mode, you can create large volumes at sizes between 2,400 GiB and 2,400 TiB. You must request the feature before using it for the first time. With cool access enabled, large volumes can scale to 7.2 PiB in certain situations; for more information, see large volumes up to 7.2 PiB. 
There are requirements and considerations you need to be aware of before using large volumes.",2026-04-15T11:11:00.000Z,concept-article,limits-quotas,0.8,True,"Page includes precise supported size ranges for large volumes (e.g., 50 TiB–1,024 TiB, 2,400 GiB–2,400 TiB, scaling to 7.2 PiB with cool access), which are concrete product limits and requirements not inferable from general knowledge.",unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/lightweight-directory-access-protocol,Understand LDAP basics,Understand lightweight directory access protocol (LDAP) basics in Azure NetApp Files,Understand LDAP usage and directory access for Azure NetApp Files,This article helps you understand how Azure NetApp Files uses lightweight directory access protocol (LDAP).,"Lightweight directory access protocol (LDAP) is a standard directory access protocol that was developed by an international committee called the Internet Engineering Task Force (IETF). LDAP is intended to provide a general-purpose, network-based directory service that you can use across heterogeneous platforms to locate network objects. 
LDAP models define how to communicate with the LDAP directory store, how to find an object in the directory, how to describe the objects in the store, and the se",2025-10-07T08:00:00.000Z,concept-article,security,0.6,True,Explains how Azure NetApp Files uses LDAP for directory lookups and identity-related operations; tied to authentication/authorization behavior.,unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/lightweight-directory-access-protocol-local-users,Understand the allow local NFS users with LDAP option,Understand the allow local NFS users with LDAP option with LDAP in Azure NetApp Files,Use local NFS users with LDAP in Azure NetApp Files,This article helps you understand the allow local NFS users option in the lightweight directory access protocol (LDAP).,"When a user attempts to access an Azure NetApp Files volume via NFS, the request comes in as a numeric ID. By default, Azure NetApp Files supports extended group memberships for NFS users (to go beyond the standard 16 group limit). As a result, Azure NetApp Files attempts to take that numeric ID and look it up in lightweight directory access protocol (LDAP) in an attempt to resolve the group memberships for the user rather than passing the group memberships in an RPC packet. 
Due to this behavior, if ",2025-02-19T18:02:00.000Z,concept-article,security,0.75,True,Details how Azure NetApp Files resolves NFS user IDs via LDAP and the impact of extended group memberships; product-specific authentication behavior and options.,unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/lightweight-directory-access-protocol-name-mapping,Understand name mapping using LDAP,Understand name mapping using LDAP in Azure NetApp Files,Configure LDAP name mapping rules for Azure NetApp Files,This article helps you understand how Azure NetApp Files uses name mapping with lightweight directory access protocol (LDAP).,"Name mapping rules with lightweight directory access protocol (LDAP) can be broken down into two main types: symmetric and asymmetric. By default, Azure NetApp Files uses symmetric name mapping rules. If asymmetric name mapping rules are required, consider configuring the LDAP user objects to use them.",2025-02-19T18:02:00.000Z,concept-article,security,0.7,True,Describes symmetric vs asymmetric LDAP name mapping and configuration implications; product-specific identity mapping behavior.,unchanged
Each offers its own distinct methods for communication, but at their root, they operate mostly in the same way.",2025-01-28T08:00:00.000Z,concept-article,,0.2,False,Conceptual overview of NAS protocols (SMB/NFS); lacks detailed configuration parameters or limits.,unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/network-file-system-group-memberships,Understand NFS group memberships and supplemental groups,Understand NFS group memberships and supplemental groups for Azure NetApp Files,Manage NFS group memberships and supplemental groups for Azure NetApp Files,This article helps you understand NFS group memberships and supplemental groups as they apply to Azure NetApp Files.,You can use LDAP to control group membership and to return supplemental groups for NFS users. This behavior is controlled through schema attributes in the LDAP server.,2025-01-02T08:00:00.000Z,concept-article,security,0.65,True,Describes how LDAP-controlled group memberships and supplemental groups are used for NFS access; product-specific authorization behavior.,unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/nfs-access-control-lists,Understand NFSv4.x access control lists,Understand NFSv4.x access control lists in Azure NetApp Files,Use NFSv4.x ACLs for access control in Azure NetApp Files,Learn about using NFSv4.x access control lists in Azure NetApp Files.,"The NFSv4.x protocol can provide access control in the form of access control lists (ACLs), which are conceptually similar to ACLs used in SMB via Windows NTFS permissions. An NFSv4.x ACL consists of individual Access Control Entries (ACEs), each of which provides an access control directive to the server. Each NFSv4.x ACL uses the format of type:flags:principal:permissions. A:g:group1@contoso.com:rwatTnNcCy is an example of a valid ACL, following the type:flags:principal:permissions format. 
The example AC",2025-07-10T08:00:00.000Z,concept-article,security,0.7,True,Product-specific explanation of NFSv4.x ACL format and usage (type:flags:principal:permissions) for access control; clearly security configuration.,unchanged -https://learn.microsoft.com/en-us/azure/azure-netapp-files/object-rest-api-access-configure,Configure object REST API,Configure object REST API in Azure NetApp Files,Configure Azure NetApp Files object REST API securely,Learn how to configure object REST API to manage S3 objects in Azure NetApp Files.,"Azure NetApp Files supports access to objects with the object REST API feature. With the object REST API, you can connect to services such as Azure AI Search, Microsoft Fabric, Microsoft Foundry, Azure Databricks, OneLake, and other S3‑compatible clients. This article describes how to configure object REST API access and walks you through the two supported certificate workflows. Choose the workflow that best matches your security and operational requirements.",2026-04-08T11:11:00.000Z,how-to,security,0.7,True,"The article is a configuration-focused, security-driven guide for enabling the object REST API on Azure NetApp Files, including choosing between two certificate workflows based on security and operational requirements. It likely details certificate handling, trust/identity setup, and product-specific security configuration steps for S3-compatible access. This is expert, product-specific security configuration rather than a generic overview.",unchanged +https://learn.microsoft.com/en-us/azure/azure-netapp-files/object-rest-api-access-configure,Configure object REST API,Configure object REST API in Azure NetApp Files,Configure S3 object REST API access for Azure NetApp Files,Learn how to configure object REST API to manage S3 objects in Azure NetApp Files.,"Azure NetApp Files supports access to objects with the object REST API feature. 
With the object REST API, you can connect to services such as Azure AI Search, Microsoft Fabric, Microsoft Foundry, Azure Databricks, OneLake, and other S3‑compatible clients. This article describes how to configure object REST API access and walks you through the two supported certificate workflows. Choose the workflow that best matches your security and operational requirements.",2026-04-20T15:52:00.000Z,how-to,security,0.68,True,"How-to configuration article for enabling and securing object REST API access, including certificate-based workflows and security-related configuration steps specific to Azure NetApp Files and S3-compatible clients. Contains product-specific security configuration details rather than just conceptual guidance.",updated https://learn.microsoft.com/en-us/azure/azure-netapp-files/object-rest-api-browser,Access volumes with an S3-compatible client,Access an Azure NetApp Files object REST API-enabled volume with S3-compatible clients,Access Azure NetApp Files object REST API volumes with S3 clients,Learn how to access Azure NetApp Files object REST API-enabled volumes from S3-compatible clients,"You can use Azure NetApp Files' object REST API with an S3-compatible client, taking advantage of secure SSL communication and seamless data management. You must install a certificate on your machine before accessing the bucket with S3-compatible clients. 
This document covers accessing the bucket with the S3 Browser and AWS CLI.",2025-10-14T11:20:00.000Z,how-to,integrations,0.75,True,"Details using S3 Browser and AWS CLI with ANF object REST API, including certificate installation and likely endpoint/parameter specifics.",unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/object-rest-api-databricks,Connect to Azure Databricks,Connect Azure Databricks to an Azure NetApp Files object REST API-enabled volume,Connect Azure Databricks to Azure NetApp Files via object REST API,Learn how to connect Azure Databricks to an Azure NetApp Files volume using object REST API,"The object REST API feature enables Azure Databricks to read and write data to Azure NetApp Files volumes, supporting end-to-end data science workflows from ingestion to model deployment. To connect to Azure Databricks, you configure an initialization (init) script to load the SSL certificate on the Databricks compute endpoints. Using this setup ensures secure communication between Azure Databricks and your Azure NetApp Files object REST API-enabled volume.",2025-10-14T11:20:00.000Z,how-to,integrations,0.75,True,Describes Databricks init script configuration to load SSL certificates for ANF object REST API; includes integration-specific parameters and patterns.,unchanged
To do so, object REST API creates buckets that allow S3 clients to read, write, and enumerate",2026-03-11T11:05:00.000Z,concept-article,,0.2,False,"Introductory overview of Azure NetApp Files Object REST API and S3 access; summary does not indicate concrete limits, configuration tables, error codes, or other expert-only details.",unchanged @@ -184,15 +184,15 @@ https://learn.microsoft.com/en-us/azure/azure-netapp-files/performance-linux-nfs https://learn.microsoft.com/en-us/azure/azure-netapp-files/performance-oracle-multiple-volumes,Oracle database performance on Azure NetApp Files multiple volumes,Oracle database performance on Azure NetApp Files multiple volumes,Architect high-performance Oracle on multiple Azure NetApp Files volumes,Migrating highly performant Exadata grade databases to the cloud is increasingly becoming an imperative for Microsoft customers.,Migrating highly performant Exadata grade databases to the cloud is increasingly becoming an imperative for Microsoft customers. Supply chain software suites typically set the bar high due to the intense demands on storage I/O with a mixed read and write workload driven by a single compute node. Azure infrastructure in combination with Azure NetApp Files is able to meet the needs of this highly demanding workload. 
This article presents an example of how this demand was met for one customer and h,2025-07-20T08:00:00.000Z,concept-article,architecture-patterns,0.65,True,"Describes how to migrate Exadata-grade workloads using multiple volumes; likely includes layout patterns, volume counts, and sizing guidance specific to this architecture.",unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/performance-oracle-single-volumes,Oracle database performance on Azure NetApp Files single volumes,Oracle database performance on Azure NetApp Files single volume,Assess Oracle performance on a single Azure NetApp Files volume,Describes performance test results of an Azure NetApp Files single volume on Oracle database.,"This article addresses the following topics about Oracle in the cloud. These topics might be of particular interest to a database administrator, cloud architect, or storage architect: Important For correct and optimal deployment of Oracle dNFS, follow the patching guidelines outlined here.",2025-06-04T08:00:00.000Z,concept-article,limits-quotas,0.65,True,"Oracle performance results on a single volume will include measured throughput/latency and configuration details, effectively documenting tested performance boundaries for that scenario.",unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/performance-virtual-machine-sku,Azure virtual machine SKU best practices,Azure virtual machine stock-keeping unit (SKUs) best practices for Azure NetApp Files,Select Azure VM SKUs for Azure NetApp Files performance,"Describes Azure NetApp Files best practices about Azure virtual machine stock-keeping units (SKUs), including differences within and between SKUs.","This article describes Azure NetApp Files best practices about Azure virtual machine stock-keeping units (SKUs), including differences within and between SKUs.",2025-07-02T08:00:00.000Z,concept-article,best-practices,0.7,True,"VM SKU best practices will map specific SKUs and network/storage 
characteristics to Azure NetApp Files performance, providing concrete recommendations and comparisons.",unchanged -https://learn.microsoft.com/en-us/azure/azure-netapp-files/ransomware-configure,Manage advanced ransomware protection,Configure advanced ransomware protection for Azure NetApp Files volumes,Configure advanced ransomware protection for Azure NetApp Files,"Configuring ransomware protection for your Azure NetApp Files creates an added layer of security at the data storage level, alerting you to suspected ransomware attacks based on AI-generated profiles ","Ransomware attacks pose a huge threat to the integrity and reliability of data. Azure NetApp Files' advanced ransomware protection adds a line of defense at the storage level for your data. Advanced ransomware protection uses machine learning to develop a profile of your volumes, alerting you of perceived threats. Advanced ransomware protection is available to Azure NetApp Files at no additional cost. Advanced ransomware protection builds its profile based on many inputs, including but not limit",2026-04-14T17:11:00.000Z,how-to,security,0.7,True,Configuration-focused security content for Azure NetApp Files ransomware protection; likely includes product-specific security settings and parameters for enabling and tuning advanced ransomware protection beyond generic security concepts.,updated -https://learn.microsoft.com/en-us/azure/azure-netapp-files/ransomware-protection-requirements,Requirements and considerations for Azure NetApp Files advanced ransomware protection,Requirements and Considerations for Azure NetApp Files advanced ransomware protection,Satisfy requirements for ANF ransomware protection,Understand the considerations and requirements for Azure NetApp Files advanced ransomware protection.,"Before you configure Azure NetApp Files advanced ransomware protection, make sure that you understand the requirements.",2026-03-13T17:21:00.000Z,concept-article,configuration,0.65,True,"Requirements and 
considerations for configuring advanced ransomware protection are typically product-specific configuration constraints and prerequisites, which qualify as expert configuration knowledge rather than general concepts.",unchanged +https://learn.microsoft.com/en-us/azure/azure-netapp-files/ransomware-configure,Manage advanced ransomware protection,Configure advanced ransomware protection for Azure NetApp Files volumes,Configure advanced ransomware protection for Azure NetApp Files,"Configuring ransomware protection for your Azure NetApp Files creates an added layer of security at the data storage level, alerting you to suspected ransomware attacks based on AI-generated profiles ","Ransomware attacks pose a huge threat to the integrity and reliability of data. Azure NetApp Files' advanced ransomware protection adds a line of defense at the storage level for your data. Advanced ransomware protection uses machine learning to develop a profile of your volumes, alerting you of perceived threats. Advanced ransomware protection is available to Azure NetApp Files at no additional cost. 
Advanced ransomware protection builds its profile based on many inputs, including but not limit",2026-04-14T17:11:00.000Z,how-to,security,0.7,True,Configuration-focused security content for Azure NetApp Files ransomware protection; likely includes product-specific security settings and parameters for enabling and tuning advanced ransomware protection beyond generic security concepts.,unchanged +https://learn.microsoft.com/en-us/azure/azure-netapp-files/ransomware-protection-requirements,Requirements and considerations for Azure NetApp Files advanced ransomware protection,Requirements and Considerations for Azure NetApp Files advanced ransomware protection,Meet requirements for Azure NetApp Files ransomware protection,Understand the considerations and requirements for Azure NetApp Files advanced ransomware protection.,"Before you configure Azure NetApp Files advanced ransomware protection, make sure that you understand the requirements.",2026-04-23T11:10:00.000Z,concept-article,security,0.78,True,"The page describes concrete, product-specific requirements and considerations for enabling Azure NetApp Files advanced ransomware protection (for example, required account and volume configurations, supported regions/SKUs, and prerequisite security-related settings). These are detailed, service-specific security configuration requirements that go beyond generic security concepts, so they qualify as expert knowledge under the security sub-skill.",updated
If the destination volume remains operational and no snapshots were deleted, the replication re-establish operation uses the last common snapshot. The operation incrementally synchronizes the destination volume based on the last known good snapshot. A baseline snapshot isn't required.",2025-08-07T05:10:00.000Z,how-to,,0.45,False,Re-establish replication using last common snapshot; describes behavior but not detailed configuration parameters or error mappings.,unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/regional-capacity-quota,Regional capacity quota,Regional capacity quota for Azure NetApp Files,Understand Azure NetApp Files regional capacity quotas,Explains regional capacity quota of Azure NetApp Files.,This article explains regional capacity quota of Azure NetApp Files.,2025-05-08T08:00:00.000Z,concept-article,limits-quotas,0.8,True,"A page dedicated to regional capacity quota is expected to list concrete per-region capacity limits and quota behaviors, which are numeric constraints unique to the service.",unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/replication,Understand Azure NetApp Files replication,Understand Azure NetApp Files Replication,Choose Azure NetApp Files replication model,"Learn about cross-zone, cross-region, and cross-zone-region replication options in Azure NetApp Files to decide which options suit your reliability plan.",Azure NetApp Files supports the following models for replication: Learn about all three models to decide which options best suit your reliability plan.,2025-12-09T08:00:00.000Z,concept-article,decision-making,0.7,True,"Page is explicitly about understanding cross-zone, cross-region, and cross-zone-region replication options to decide which fits a reliability plan. 
This is product-specific decision guidance between options, likely with scenario-based recommendations and trade-offs, but not primarily limits tables or configuration parameters.",unchanged -https://learn.microsoft.com/en-us/azure/azure-netapp-files/replication-requirements,Requirements and considerations for Azure NetApp Files replication,Requirements and Considerations for Azure NetApp Files Replication,Check Azure NetApp Files replication requirements and limits,Understand the considerations and individual and shared requirements for configuring Azure NetApp Files cross-zone and cross-region replication.,"Before you configure cross-zone or cross-region replication, make sure that you understand the requirements for each option. If you use cross-zone-region replication, you must adhere to all the requirements.",2026-04-09T06:11:00.000Z,concept-article,limits-quotas,0.7,True,"A replication requirements page for a specific Azure storage service typically includes concrete constraints such as supported regions and zones, minimum/maximum volume sizes, supported replication intervals, allowed combinations of capacity pools, and other numeric or tightly specified requirements that an LLM would not reliably know from training. These are product-specific limits/constraints rather than just conceptual guidance, so it best fits the limits-quotas category.",updated +https://learn.microsoft.com/en-us/azure/azure-netapp-files/replication-requirements,Requirements and considerations for Azure NetApp Files replication,Requirements and Considerations for Azure NetApp Files Replication,Check Azure NetApp Files replication requirements and limits,Understand the considerations and individual and shared requirements for configuring Azure NetApp Files cross-zone and cross-region replication.,"Before you configure cross-zone or cross-region replication, make sure that you understand the requirements for each option. 
If you use cross-zone-region replication, you must adhere to all the requirements.",2026-04-09T06:11:00.000Z,concept-article,limits-quotas,0.7,True,"A replication requirements page for a specific Azure storage service typically includes concrete constraints such as supported regions and zones, minimum/maximum volume sizes, supported replication intervals, allowed combinations of capacity pools, and other numeric or tightly specified requirements that an LLM would not reliably know from training. These are product-specific limits/constraints rather than just conceptual guidance, so it best fits the limits-quotas category.",unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/request-region-access,Request region access,Request region access for Azure NetApp Files,Request Azure NetApp Files regional access,Describes how to request access to a region for using Azure NetApp Files.,"In special situations, you need to explicitly request access to a region. Learn how to submit a request.",2025-05-10T08:00:00.000Z,how-to,deployment,0.6,True,"Describes process and constraints for enabling service in specific regions, which is a deployment prerequisite unique to this service.",unchanged
How much you s",2025-03-25T08:00:00.000Z,concept-article,decision-making,0.7,True,"Reservation pages generally include term options, discount behaviors, and usage matching rules that help decide when and how much capacity to reserve, which is concrete cost-planning guidance.",unchanged -https://learn.microsoft.com/en-us/azure/azure-netapp-files/restore-single-file-backup,Restore individual files from a backup,Restore individual files with single-file restore from backups in Azure NetApp Files,Restore individual Azure NetApp Files from backup,"With single-file restore from backup, you can restore up to eight files to a directory.","You can rely on your Azure NetApp Files backup to restore individual files that aren't available in an online snapshot. With single-file restore from backup, you can restore a single file to a specific location in a volume or up to eight files to a specific directory in the volume.",2026-04-15T11:11:00.000Z,how-to,limits-quotas,0.7,True,"Describes a concrete operational limit (restore up to eight files to a directory) for single-file restore from backups in Azure NetApp Files, which is a product-specific quota-style constraint not generally known from training.",updated +https://learn.microsoft.com/en-us/azure/azure-netapp-files/restore-single-file-backup,Restore individual files from a backup,Restore individual files with single-file restore from backups in Azure NetApp Files,Restore individual Azure NetApp Files from backup,"With single-file restore from backup, you can restore up to eight files to a directory.","You can rely on your Azure NetApp Files backup to restore individual files that aren't available in an online snapshot. 
With single-file restore from backup, you can restore a single file to a specific location in a volume or up to eight files to a specific directory in the volume.",2026-04-15T11:11:00.000Z,how-to,limits-quotas,0.7,True,"Describes a concrete operational limit (restore up to eight files to a directory) for single-file restore from backups in Azure NetApp Files, which is a product-specific quota-style constraint not generally known from training.",unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/sever-message-block-support,Understand SMB support,Understand Server Message Block support in Azure NetApp Files,,SMB with Azure NetApp Files provides many features and configuration constants when an SMB server is created.,"Azure NetApp Files provides cloud-resident storage through a volumes as a service offering, using NAS protocols as the delivery mechanism to end users. SMB is one of the supported NAS protocols offered for use with Azure NetApp Files volumes and includes the following general capabilities:",2025-01-30T23:06:00.000Z,how-to,,0.3,False,Describes SMB support and capabilities at a general level; summary does not indicate detailed config tables or numeric constraints.,unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/snapshots-delete,Delete snapshots,Delete snapshots using Azure NetApp Files,,Describes how to delete snapshots by using Azure NetApp Files.,You can delete snapshots that you no longer need. Important You can't undo the snapshot deletion. 
You can't recover a deleted snapshot.,2026-02-24T12:29:00.000Z,how-to,,0.3,False,"Simple delete operation with an irreversibility warning; no evidence of detailed quotas, configs, or error code mappings.",unchanged https://learn.microsoft.com/en-us/azure/azure-netapp-files/snapshots-introduction,Understand Azure NetApp Files snapshot-based data protection,How Azure NetApp Files snapshots work,,"Explains how Azure NetApp Files snapshots work, including ways to create snapshots, ways to restore snapshots, how to use snapshots in cross-region replication settings.","This article explains how Azure NetApp Files snapshots work. Azure NetApp Files snapshot technology delivers stability, scalability, and faster recoverability, with no impact to performance. Snapshots provide the foundation for data protection solutions, including single-file restores, volume restores and clones, cross-region replication, cross-zone replication, and long-term retention. To create volume snapshots, seeManage snapshots using Azure NetApp Files. For considerations about snapshot ma",2025-11-14T08:00:00.000Z,concept-article,,0.2,False,"The article is an introduction to how Azure NetApp Files snapshots work and their role in data protection features. Based on the description, it is largely conceptual and explanatory, without clear evidence of detailed limits, configuration parameter tables, error codes, or decision matrices. 
It therefore does not meet the threshold for expert-knowledge sub-skill types defined here.",unchanged diff --git a/products/azure-netapp-files/report.md b/products/azure-netapp-files/report.md index bdc3f61c..4093863c 100644 --- a/products/azure-netapp-files/report.md +++ b/products/azure-netapp-files/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: limits-quotas: 'Limits, quotas, and performance caps for Azure NetApp Files: volume size/perf limits, user/group/inode quotas, file/path/charset constraints, regional @@ -25,19 +25,19 @@ category_descriptions: decision-making: Guidance on sizing, performance tiers, volume types, data protection/backup/replication, SMB CA, cool access trade-offs, and cost optimization/TCO for Azure NetApp Files workloads - security: 'Security configuration for Azure NetApp Files: encryption (CMK/HSM, cross-tenant), - Kerberos/LDAP/AD, NFS/SMB permissions and ACLs, ransomware protection, and secure - control/data plane and API.' + security: 'Security, encryption, and access control for Azure NetApp Files: keys/CMK, + Kerberos, LDAP/AD, ACLs/permissions (NFS/SMB/S3), ransomware protection, and control/data + plane hardening.' skill_description: Expert knowledge for Azure NetApp Files development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when - deploying ANF for SAP HANA/Oracle, AzAcSnap, ZRS, AVS, S3/OneLake object REST API, - or Databricks, and other Azure NetApp Files related development tasks. Not for Azure - Files (use azure-files), Azure Blob Storage (use azure-blob-storage), Azure Elastic - SAN (use azure-elastic-san), Azure Managed Lustre (use azure-managed-lustre). 
-use_when: Use when deploying ANF for SAP HANA/Oracle, AzAcSnap, ZRS, AVS, S3/OneLake - object REST API, or Databricks, and other Azure NetApp Files related development - tasks. + deploying SAP HANA/Oracle on ANF, using AzAcSnap, REST/PowerShell APIs, S3/object + access, or AVS/Databricks, and other Azure NetApp Files related development tasks. + Not for Azure Files (use azure-files), Azure Blob Storage (use azure-blob-storage), + Azure Elastic SAN (use azure-elastic-san), Azure Managed Lustre (use azure-managed-lustre). +use_when: Use when deploying SAP HANA/Oracle on ANF, using AzAcSnap, REST/PowerShell + APIs, S3/object access, or AVS/Databricks, and other Azure NetApp Files related + development tasks. confusable_not_for: Not for Azure Files (use azure-files), Azure Blob Storage (use azure-blob-storage), Azure Elastic SAN (use azure-elastic-san), Azure Managed Lustre (use azure-managed-lustre). @@ -54,8 +54,8 @@ confusable_not_for: Not for Azure Files (use azure-files), Azure Blob Storage (u ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 6 -- **Unchanged**: 221 +- **Updated Pages**: 4 +- **Unchanged**: 223 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-netapp-files/azure-netapp-files.csv` @@ -65,12 +65,12 @@ confusable_not_for: Not for Azure Files (use azure-files), Azure Blob Storage (u |------|-------|------------| | architecture-patterns | 8 | 3.5% | | best-practices | 17 | 7.5% | -| configuration | 30 | 13.2% | +| configuration | 29 | 12.8% | | decision-making | 12 | 5.3% | | deployment | 10 | 4.4% | | integrations | 11 | 4.8% | | limits-quotas | 21 | 9.3% | -| security | 32 | 14.1% | +| security | 33 | 14.5% | | troubleshooting | 12 | 5.3% | | *(Unclassified)* | 74 | 32.6% | @@ -78,18 +78,14 @@ confusable_not_for: Not for Azure Files (use azure-files), Azure Blob Storage (u ### Updated Pages -- [Generate user and group quota 
reports](https://learn.microsoft.com/en-us/azure/azure-netapp-files/generate-user-group-quota-reports) - - Updated: 2025-11-13T18:15:00.000Z → 2026-04-14T17:11:00.000Z -- [Resource limits for Azure NetApp Files](https://learn.microsoft.com/en-us/azure/azure-netapp-files/azure-netapp-files-resource-limits) - - Updated: 2026-03-24T06:16:00.000Z → 2026-04-14T08:00:00.000Z -- [Requirements and considerations for large volumes](https://learn.microsoft.com/en-us/azure/azure-netapp-files/large-volumes-requirements-considerations) - - Updated: 2026-03-09T17:11:00.000Z → 2026-04-15T11:11:00.000Z -- [Requirements and considerations for Azure NetApp Files replication](https://learn.microsoft.com/en-us/azure/azure-netapp-files/replication-requirements) - - Updated: 2026-01-20T08:00:00.000Z → 2026-04-09T06:11:00.000Z -- [Restore individual files from a backup](https://learn.microsoft.com/en-us/azure/azure-netapp-files/restore-single-file-backup) - - Updated: 2025-11-17T08:00:00.000Z → 2026-04-15T11:11:00.000Z -- [Manage advanced ransomware protection](https://learn.microsoft.com/en-us/azure/azure-netapp-files/ransomware-configure) - - Updated: 2026-03-13T17:21:00.000Z → 2026-04-14T17:11:00.000Z +- [Configure object REST API](https://learn.microsoft.com/en-us/azure/azure-netapp-files/object-rest-api-access-configure) + - Updated: 2026-04-08T11:11:00.000Z → 2026-04-20T15:52:00.000Z +- [Configure customer-managed keys with managed HSM](https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-customer-managed-keys-hardware) + - Updated: 2025-11-12T08:00:00.000Z → 2026-04-22T17:34:00.000Z +- [Configure cross-tenant customer-managed keys](https://learn.microsoft.com/en-us/azure/azure-netapp-files/customer-managed-keys-cross-tenant) + - Updated: 2026-03-03T18:22:00.000Z → 2026-04-23T06:20:00.000Z +- [Requirements and considerations for Azure NetApp Files advanced ransomware 
protection](https://learn.microsoft.com/en-us/azure/azure-netapp-files/ransomware-protection-requirements) + - Updated: 2026-03-13T17:21:00.000Z → 2026-04-23T11:10:00.000Z ## Classified Pages @@ -118,7 +114,6 @@ confusable_not_for: Not for Azure Files (use azure-files), Azure Blob Storage (u | [Configure application volume group for Oracle using REST API](https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-application-volume-oracle-api) | integrations | 0.80 | REST API creation of Oracle AVGs with selected parameters, properties, constraints, and typical values; matches integration pattern with product-specific parameter details. | | [Configure customer-managed keys](https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-customer-managed-keys) | security | 0.80 | Details CMK setup with managed identity and Key Vault for ANF volume encryption; includes product-specific security configuration and key lifecycle relationships. | | [Configure customer-managed keys](https://learn.microsoft.com/en-us/azure/azure-netapp-files/elastic-customer-managed-keys) | security | 0.80 | Customer-managed keys setup involves Key Vault integration, key URIs, and permission scopes specific to Azure NetApp Files Elastic ZRS. | -| [Configure customer-managed keys with managed HSM](https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-customer-managed-keys-hardware) | security | 0.80 | Describes using managed HSM (FIPS 140-2 Level 3) instead of standard Key Vault for ANF CMK; product-specific encryption configuration. | | [Configure export policy for NFS or dual protocol](https://learn.microsoft.com/en-us/azure/azure-netapp-files/azure-netapp-files-configure-export-policy) | configuration | 0.80 | Includes explicit rule limits (up to five export policy rules) and modifiable fields; export policy behavior is ANF-specific configuration. 
| | [Create and manage Active Directory connections](https://learn.microsoft.com/en-us/azure/azure-netapp-files/create-active-directory-connections) | security | 0.80 | Active Directory connection configuration for SMB, NFSv4.1 Kerberos, and dual-protocol volumes includes domain, DNS, and credential settings that are security-specific. | | [Linux concurrency best practices](https://learn.microsoft.com/en-us/azure/azure-netapp-files/performance-linux-concurrency-session-slots) | best-practices | 0.80 | Concurrency best practices around session slots/slot tables will include specific numeric recommendations and tuning steps tailored to Azure NetApp Files behavior. | @@ -129,6 +124,7 @@ confusable_not_for: Not for Azure Files (use azure-files), Azure Blob Storage (u | [Troubleshoot capacity pool errors](https://learn.microsoft.com/en-us/azure/azure-netapp-files/troubleshoot-capacity-pools) | troubleshooting | 0.80 | Explicit troubleshooting for capacity pool issues; expected to list specific error messages and corresponding resolutions. | | [Troubleshoot volume errors](https://learn.microsoft.com/en-us/azure/azure-netapp-files/troubleshoot-volumes) | troubleshooting | 0.80 | Troubleshooting volume errors; likely includes specific error messages, causes, and recommended checks for terminal state and async operations. | | [Understand dual-protocol security style and permission behaviors](https://learn.microsoft.com/en-us/azure/azure-netapp-files/dual-protocol-permission-behaviors) | security | 0.80 | Guidance on selecting UNIX vs NTFS security styles and how permissions behave in dual-protocol scenarios; product-specific security decision and behavior. 
| +| [Requirements and considerations for Azure NetApp Files advanced ransomware protection](https://learn.microsoft.com/en-us/azure/azure-netapp-files/ransomware-protection-requirements) | security | 0.78 | The page describes concrete, product-specific requirements and considerations for enabling Azure NetApp Files advanced ransomware protection (for example, required account and volume configurations, supported regions/SKUs, and prerequisite security-related settings). These are detailed, service-specific security configuration requirements that go beyond generic security concepts, so they qualify as expert knowledge under the security sub-skill. | | [--runbefore or --runafter](https://learn.microsoft.com/en-us/azure/azure-netapp-files/azacsnap-cmd-ref-runbefore-runafter) | integrations | 0.75 | Documents the --runbefore and --runafter options of azacsnap, which are specific integration parameters and patterns for coordinating snapshots with workloads. | | [-c restore](https://learn.microsoft.com/en-us/azure/azure-netapp-files/azacsnap-cmd-ref-restore) | integrations | 0.75 | Restore command reference for azacsnap, including product-specific behavior (only available for Azure Large Instance and Azure NetApp Files) and likely detailed parameters, making it an integration-focused command spec. | | [Access volumes with an S3-compatible client](https://learn.microsoft.com/en-us/azure/azure-netapp-files/object-rest-api-browser) | integrations | 0.75 | Details using S3 Browser and AWS CLI with ANF object REST API, including certificate installation and likely endpoint/parameter specifics. 
| @@ -140,7 +136,6 @@ confusable_not_for: Not for Azure Files (use azure-files), Azure Blob Storage (u | [Large volume performance benchmarks for Linux](https://learn.microsoft.com/en-us/azure/azure-netapp-files/performance-large-volumes-linux) | limits-quotas | 0.75 | Large volume benchmark documentation provides specific maximum throughput/IOPS and scaling behavior, which are numeric performance limits unique to the service. | | [SMB performance best practices](https://learn.microsoft.com/en-us/azure/azure-netapp-files/azure-netapp-files-smb-performance) | best-practices | 0.75 | SMB performance tuning for this service will include specific client/server settings, registry or mount options, and patterns unique to Azure NetApp Files SMB workloads. | | [Understand the allow local NFS users with LDAP option](https://learn.microsoft.com/en-us/azure/azure-netapp-files/lightweight-directory-access-protocol-local-users) | security | 0.75 | Details how Azure NetApp Files resolves NFS user IDs via LDAP and the impact of extended group memberships; product-specific authentication behavior and options. | -| [Configure cross-tenant customer-managed keys](https://learn.microsoft.com/en-us/azure/azure-netapp-files/customer-managed-keys-cross-tenant) | security | 0.72 | The page describes detailed, product-specific steps and requirements for configuring cross-tenant customer-managed keys for Azure NetApp Files volume encryption, including tenant relationships, Key Vault usage, and encryption configuration. This is security-focused configuration (key management and access across tenants) rather than generic concepts, and contains implementation details that are unlikely to be fully known from pretraining. | | [-c delete](https://learn.microsoft.com/en-us/azure/azure-netapp-files/azacsnap-cmd-ref-delete) | integrations | 0.70 | Provides a guide for the delete command of azacsnap, which should list command options and parameters specific to Azure NetApp Files snapshot integration. 
| | [-c details](https://learn.microsoft.com/en-us/azure/azure-netapp-files/azacsnap-cmd-ref-details) | integrations | 0.70 | Command reference for the Azure Application Consistent Snapshot tool’s details command; likely includes specific CLI parameters, options, and behaviors unique to this integration. | | [Add hosts to a multiple-host SAP HANA system](https://learn.microsoft.com/en-us/azure/azure-netapp-files/application-volume-group-add-hosts) | limits-quotas | 0.70 | Includes explicit sizing guidance (HANA shared volume sized to 1× RAM per four hosts) and adjustments; these are numeric thresholds for capacity planning. | @@ -158,7 +153,8 @@ confusable_not_for: Not for Azure Files (use azure-files), Azure Blob Storage (u | [Configure access control lists for NFSv4.1](https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-access-control-lists) | security | 0.70 | ACL support and configuration on ANF NFSv4.1 volumes is product-specific security behavior, including ACE semantics and identity formats. | | [Configure an NFS client for Azure NetApp Files](https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-nfs-clients) | configuration | 0.70 | Provides distro-specific NFS client configuration for RHEL and Ubuntu tied to ANF scenarios (Kerberos, dual protocol, LDAP); likely includes concrete mount options and config parameters. | | [Configure application volume groups for SAP HANA using REST API](https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-application-volume-group-sap-hana-api) | integrations | 0.70 | REST API configuration for SAP HANA AVGs with special requirements; likely includes specific parameters, required values, and product-specific constraints not known generically. 
| -| [Configure object REST API](https://learn.microsoft.com/en-us/azure/azure-netapp-files/object-rest-api-access-configure) | security | 0.70 | The article is a configuration-focused, security-driven guide for enabling the object REST API on Azure NetApp Files, including choosing between two certificate workflows based on security and operational requirements. It likely details certificate handling, trust/identity setup, and product-specific security configuration steps for S3-compatible access. This is expert, product-specific security configuration rather than a generic overview. | +| [Configure cross-tenant customer-managed keys](https://learn.microsoft.com/en-us/azure/azure-netapp-files/customer-managed-keys-cross-tenant) | security | 0.70 | Provides detailed configuration steps for cross-tenant CMK setup, including tenant, key vault, and encryption configuration specific to Azure NetApp Files, which qualifies as product-specific security and identity configuration guidance. | +| [Configure customer-managed keys with managed HSM](https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-customer-managed-keys-hardware) | security | 0.70 | Describes concrete steps and parameters to configure customer-managed keys using a managed HSM for Azure NetApp Files volume encryption, including product-specific security configuration and key management details beyond generic concepts. | | [Connect to OneLake](https://learn.microsoft.com/en-us/azure/azure-netapp-files/object-rest-api-onelake) | integrations | 0.70 | Covers creating OneLake shortcuts to ANF via object REST API with Fabric; product-specific integration steps and networking/security considerations. 
| | [Cost model for Azure NetApp Files](https://learn.microsoft.com/en-us/azure/azure-netapp-files/azure-netapp-files-cost-model) | decision-making | 0.70 | Cost model pages usually include SKU-specific pricing behaviors, billing units, and cost-impacting parameters that guide capacity and tier choices, which is concrete decision guidance beyond generic knowledge. | | [Create an Active Directory connection](https://learn.microsoft.com/en-us/azure/azure-netapp-files/elastic-active-directory) | security | 0.70 | Active Directory connection setup for Elastic ZRS SMB volumes involves domain, OU, and credential settings that are security-related configuration. | @@ -207,6 +203,7 @@ confusable_not_for: Not for Azure Files (use azure-files), Azure Blob Storage (u | [Understand workload types](https://learn.microsoft.com/en-us/azure/azure-netapp-files/workload-types) | decision-making | 0.70 | Workload-type guidance for a specific service typically maps workloads to volume types/service levels with concrete criteria, helping users decide which configuration to choose. | | [Update Terraform-managed volume](https://learn.microsoft.com/en-us/azure/azure-netapp-files/terraform-manage-volume) | best-practices | 0.70 | Focuses on safe update patterns for Terraform-managed resources, including state handling and avoiding data loss; product-specific DO/DON’T guidance. | | [Azure Managed Disk (PREVIEW)](https://learn.microsoft.com/en-us/azure/azure-netapp-files/azacsnap-preview) | configuration | 0.68 | Preview feature guide for AzAcSnap typically includes product-specific setup and usage details such as parameters, options, and version-specific behaviors that are not broadly known or stable. These are configuration-focused expert details for the Azure Application Consistent Snapshot tool rather than general concepts. 
| +| [Configure object REST API](https://learn.microsoft.com/en-us/azure/azure-netapp-files/object-rest-api-access-configure) | security | 0.68 | How-to configuration article for enabling and securing object REST API access, including certificate-based workflows and security-related configuration steps specific to Azure NetApp Files and S3-compatible clients. Contains product-specific security configuration details rather than just conceptual guidance. | | [Guidelines for Azure NetApp Files network planning](https://learn.microsoft.com/en-us/azure/azure-netapp-files/azure-netapp-files-network-topologies) | architecture-patterns | 0.68 | The page provides product-specific network architecture guidance for Azure NetApp Files, including how and when to use delegated subnets, VNet peering, and on-premises connectivity. It focuses on designing effective network topologies for this service rather than generic networking concepts, which aligns best with architecture-patterns. | | [SMB FAQs](https://learn.microsoft.com/en-us/azure/azure-netapp-files/faq-smb) | troubleshooting | 0.68 | As an SMB-specific FAQ for Azure NetApp Files, this page typically maps concrete symptoms and behaviors (for example, SMB feature support, access/permission issues, protocol/version behaviors, client compatibility) to product-specific explanations and resolutions. These are not generic SMB concepts but Azure NetApp Files–specific behaviors and constraints, which function as troubleshooting guidance (symptom → cause → resolution). While formatted as FAQ rather than a classic runbook, the content is product- and protocol-specific enough to qualify as expert troubleshooting knowledge. 
| | [Understand data protection and disaster recovery options](https://learn.microsoft.com/en-us/azure/azure-netapp-files/data-protection-disaster-recovery-options) | decision-making | 0.68 | The page compares Azure NetApp Files data protection and DR mechanisms (snapshots, backups, cross-zone and cross-region replication) and helps select which solution best fits different needs. It goes beyond conceptual overview by providing product-specific guidance on when to use each option for various scenarios, which aligns with decision-making. It does not primarily focus on limits, configuration tables, or troubleshooting. | @@ -224,7 +221,6 @@ confusable_not_for: Not for Azure Files (use azure-files), Azure Blob Storage (u | [Migrate volumes to Azure NetApp Files](https://learn.microsoft.com/en-us/azure/azure-netapp-files/migrate-volumes) | deployment | 0.65 | Describes ANF migration assistant for peering and migrating from on-prem or Cloud Volumes ONTAP; migration workflow and constraints are product-specific deployment knowledge. | | [Oracle database performance on Azure NetApp Files multiple volumes](https://learn.microsoft.com/en-us/azure/azure-netapp-files/performance-oracle-multiple-volumes) | architecture-patterns | 0.65 | Describes how to migrate Exadata-grade workloads using multiple volumes; likely includes layout patterns, volume counts, and sizing guidance specific to this architecture. | | [Oracle database performance on Azure NetApp Files single volumes](https://learn.microsoft.com/en-us/azure/azure-netapp-files/performance-oracle-single-volumes) | limits-quotas | 0.65 | Oracle performance results on a single volume will include measured throughput/latency and configuration details, effectively documenting tested performance boundaries for that scenario. 
| -| [Requirements and considerations for Azure NetApp Files advanced ransomware protection](https://learn.microsoft.com/en-us/azure/azure-netapp-files/ransomware-protection-requirements) | configuration | 0.65 | Requirements and considerations for configuring advanced ransomware protection are typically product-specific configuration constraints and prerequisites, which qualify as expert configuration knowledge rather than general concepts. | | [Requirements and considerations for Azure NetApp Files cache volumes](https://learn.microsoft.com/en-us/azure/azure-netapp-files/cache-requirements) | configuration | 0.65 | Described as requirements and considerations for cache volumes, which usually includes specific product constraints and setup conditions. This is configuration-focused expert knowledge, not just conceptual explanation. | | [Troubleshoot using diagnose and solve problems tool](https://learn.microsoft.com/en-us/azure/azure-netapp-files/troubleshoot-diagnose-solve-problems) | troubleshooting | 0.65 | Troubleshooting-focused article on using a specific tool; likely maps common issues to diagnostics and resolutions unique to this service. | | [Understand DNS](https://learn.microsoft.com/en-us/azure/azure-netapp-files/domain-name-system-concept) | best-practices | 0.65 | Gives concrete recommendations on DNS configuration and connectivity to avoid timeouts and access interruptions; product-specific operational best practices. 
| diff --git a/products/azure-network-watcher/azure-network-watcher.csv b/products/azure-network-watcher/azure-network-watcher.csv index 1b1e2bbb..7f7dc020 100644 --- a/products/azure-network-watcher/azure-network-watcher.csv +++ b/products/azure-network-watcher/azure-network-watcher.csv @@ -50,7 +50,7 @@ https://learn.microsoft.com/en-us/azure/network-watcher/traffic-analytics-policy https://learn.microsoft.com/en-us/azure/network-watcher/traffic-analytics-queries,Use queries in traffic analytics,Use Queries in Traffic Analytics - Azure Network Watcher,Analyze Traffic Analytics data with KQL queries,"Explore sample KQL queries for Azure Traffic Analytics to identify top talkers, analyze traffic flows, and monitor security scenarios effectively.","This article provides sample Kusto Query Language (KQL) queries to help you analyze traffic analytics data effectively. Traffic analytics processes virtual network (VNet) flow logs and network security group (NSG) flow logs to provide detailed insights into network traffic patterns, security events, and performance metrics. Use these queries to:",2025-08-20T22:15:00.000Z,how-to,integrations,0.75,True,"Provides sample KQL queries for Traffic Analytics; includes concrete query patterns, field names, and filters specific to this product’s data model, which are integration/coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/network-watcher/traffic-analytics-rule-impact-analyzer,Rules impact analyzer,Analyze Security Rules in Traffic Analytics (Preview) - Azure Network Watcher,,Use Rule Impact Analyzer to simulate and assess security admin rule effects in Azure Virtual Network Manager. Ensure compliance and prevent misconfigurations.,"In this article, you learn how to use the rule impact analyzer feature with network groups in Azure Virtual Network Manager. 
You can use the Azure portal to create a security admin configuration, add a security admin rule, and simulate the impact of your rule changes before deploying them. The rules impact analyzer enables you to preview the impact of security admin rules before applying them to your environment. This feature helps you validate rule behavior, identify potential conflicts, and en",2026-04-07T22:12:00.000Z,how-to,,0.2,False,"Page describes how to use Rule Impact Analyzer in Traffic Analytics via the Azure portal. It is a feature-usage guide without detailed limits, configuration tables, security role definitions, or troubleshooting mappings. No quantified decision criteria or expert-only reference data is evident from the summary.",unchanged https://learn.microsoft.com/en-us/azure/network-watcher/traffic-analytics-schema,Schema and data aggregation,Traffic Analytics Schema and Data Aggregation - Azure Network Watcher,Understand Traffic Analytics schema and aggregation,Learn about schema and data aggregation in Azure Network Watcher traffic analytics to analyze flow logs.,"Traffic analytics is a cloud-based solution that provides visibility into user and application activity in cloud networks. Traffic analytics analyzes Azure Network Watcher flow logs to provide insights into traffic flow in your Azure cloud. 
With traffic analytics, you can:",2025-08-20T08:00:00.000Z,concept-article,configuration,0.7,True,"Schema and data aggregation documentation; such pages typically define field names, types, and aggregation rules for Traffic Analytics data, which are detailed configuration/schema references.",unchanged -https://learn.microsoft.com/en-us/azure/network-watcher/traffic-analytics-sentinel,Integrate Microsoft Sentinel with traffic analytics,Integrate Microsoft Sentinel with Traffic Analytics,Integrate Azure Traffic Analytics with Microsoft Sentinel,Discover how you can integrate Azure Traffic Analytics with Microsoft Sentinel to detect network threats and anomalies.,"Traffic Analytics in Azure Network Watcher processes and aggregates virtual network flow logs to provide visibility into network flows, traffic patterns, and potential security risks. It enriches flow data with threat intelligence, geolocation attributes, and topology context to help identify anomalies and evaluate exposure across your environment. Traffic Analytics supports integration with Microsoft Sentinel. 
Microsoft Sentinel is a scalable, cloud-native Security Information and Event Managem",2026-03-17T06:14:00.000Z,how-to,integrations,0.68,True,"Integration guide between Traffic Analytics and Microsoft Sentinel likely includes product-specific configuration steps, parameter names, and settings unique to this integration (for example, how to connect flow logs, workspaces, and data connectors), which qualifies as expert integration knowledge beyond generic concepts.",unchanged +https://learn.microsoft.com/en-us/azure/network-watcher/traffic-analytics-sentinel,Integrate Microsoft Sentinel with traffic analytics,Integrate Microsoft Sentinel with Traffic Analytics,,Discover how you can integrate Azure Traffic Analytics with Microsoft Sentinel to detect network threats and anomalies.,"Traffic Analytics in Azure Network Watcher processes and aggregates virtual network flow logs to provide visibility into network flows, traffic patterns, and potential security risks. It enriches flow data with threat intelligence, geolocation attributes, and topology context to help identify anomalies and evaluate exposure across your environment. Traffic Analytics supports integration with Microsoft Sentinel. Microsoft Sentinel is a scalable, cloud-native Security Information and Event Managem",2026-04-24T11:19:00.000Z,how-to,,0.2,False,"Integration article between Traffic Analytics and Microsoft Sentinel is likely a conceptual/how-to overview without detailed configuration parameter tables, limits, or error-code-based troubleshooting. 
The summary emphasizes capabilities and benefits rather than product-specific settings, quotas, or decision matrices.",updated https://learn.microsoft.com/en-us/azure/network-watcher/traffic-analytics-usage-scenarios,Usage scenarios,Traffic Analytics Usage Scenarios - Azure Network Watcher,,Learn about Azure Network Watcher traffic analytics and the insights it can provide in different usage scenarios.,"In this article, you learn how to get insights about your traffic after configuring traffic analytics in different scenarios.",2026-02-25T08:00:00.000Z,concept-article,,0.2,False,"Usage scenarios article describing how Traffic Analytics can provide insights; scenario-focused but not clearly exposing configuration tables, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/network-watcher/traffic-analytics-zero-trust,Apply Zero Trust principles to segment Azure network,Apply Zero Trust Principles to Segment Azure Network through Traffic Analytics,Apply Zero Trust segmentation using Traffic Analytics,"Learn how to use Azure Traffic Analytics to apply Zero Trust principles, segment networks, and detect security risks in your Azure environment.","Zero Trust is a security strategy. It isn't a product or a service, but an approach in designing and implementing the following set of security principles. With Zero Trust, you move away from a trust-by-default perspective to a trust-by-exception one. An integrated capability to automatically manage those exceptions and alerts is important. You can more easily detect threats, respond to threats, and prevent or block undesired events across your organization. 
Azure’s cloud networking is designed ",2025-06-24T17:17:00.000Z,concept-article,security,0.6,True,"Uses Traffic Analytics to implement Zero Trust segmentation and detect risks; likely includes product-specific query patterns, configuration of analytics for security, and mappings to security controls.",unchanged https://learn.microsoft.com/en-us/azure/network-watcher/vm-network-troubleshooter,VM Network Troubleshooter,VM Network Troubleshooter (Preview) Overview - Azure Network Watcher,,Learn about VM network troubleshooter in Azure Network Watcher and how it can help you detect blocked ports.,"Important The VM network troubleshooter is currently in PREVIEW. diff --git a/products/azure-network-watcher/report.md b/products/azure-network-watcher/report.md index 3001e541..3e71ac7d 100644 --- a/products/azure-network-watcher/report.md +++ b/products/azure-network-watcher/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-12' +generated_at: '2026-04-26' category_descriptions: configuration: 'Configuring and governing Network Watcher logging: AMA for Connection Monitor, NSG/VNet flow logs setup, schemas, filtering, templates (Bicep/ARM), @@ -40,13 +40,13 @@ confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Networking - **Total Pages**: 64 - **Fetched**: 64 - **Fetch Failed**: 0 -- **Classified**: 29 -- **Unclassified**: 35 +- **Classified**: 28 +- **Unclassified**: 36 ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 0 -- **Unchanged**: 64 +- **Updated Pages**: 1 +- **Unchanged**: 63 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-network-watcher/azure-network-watcher.csv` @@ -56,14 +56,19 @@ confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Networking |------|-------|------------| | configuration | 12 | 18.8% | | decision-making | 3 | 4.7% | -| integrations | 4 | 6.2% | +| integrations | 3 | 4.7% | | limits-quotas | 1 | 1.6% | | security | 3 | 4.7% | | troubleshooting | 6 
| 9.4% | -| *(Unclassified)* | 35 | 54.7% | +| *(Unclassified)* | 36 | 56.2% | ## Changes +### Updated Pages + +- [Integrate Microsoft Sentinel with traffic analytics](https://learn.microsoft.com/en-us/azure/network-watcher/traffic-analytics-sentinel) + - Updated: 2026-03-17T06:14:00.000Z → 2026-04-24T11:19:00.000Z + ## Classified Pages | TOC Title | Type | Confidence | Reason | @@ -88,7 +93,6 @@ confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Networking | [Read flow logs](https://learn.microsoft.com/en-us/azure/network-watcher/flow-logs-read) | integrations | 0.70 | Article on using a PowerShell script to parse flow logs created hourly and updated every few minutes; includes concrete script code and log schema usage, which are product-specific integration/coding patterns. | | [Schema and data aggregation](https://learn.microsoft.com/en-us/azure/network-watcher/traffic-analytics-schema) | configuration | 0.70 | Schema and data aggregation documentation; such pages typically define field names, types, and aggregation rules for Traffic Analytics data, which are detailed configuration/schema references. | | [VNet flow logs managed identity](https://learn.microsoft.com/en-us/azure/network-watcher/vnet-flow-logs-managed-identity) | security | 0.70 | Describes using user-assigned managed identities for VNet flow logs to access storage; likely includes specific role assignments, scopes, and identity configuration steps, which are product-specific security details. | -| [Integrate Microsoft Sentinel with traffic analytics](https://learn.microsoft.com/en-us/azure/network-watcher/traffic-analytics-sentinel) | integrations | 0.68 | Integration guide between Traffic Analytics and Microsoft Sentinel likely includes product-specific configuration steps, parameter names, and settings unique to this integration (for example, how to connect flow logs, workspaces, and data connectors), which qualifies as expert integration knowledge beyond generic concepts. 
| | [VNet flow logs filtering](https://learn.microsoft.com/en-us/azure/network-watcher/vnet-flow-logs-filtering) | configuration | 0.68 | The article describes product-specific filtering options for Azure Network Watcher virtual network flow logs (for example, filtering by flow state, action, IP ranges, ports, protocols, intra- vs inter-VNet traffic). These are concrete configuration capabilities and parameters unique to this service, not just conceptual logging guidance, fitting the configuration sub-skill best. | | [Audit and deploy using Azure Policy](https://learn.microsoft.com/en-us/azure/network-watcher/nsg-flow-logs-policy-portal) | configuration | 0.65 | Shows how to use Azure Policy built-ins for NSG flow logs; likely lists policy definitions, parameters, and assignment options, which are concrete configuration details. | | [Audit and deploy using Azure Policy](https://learn.microsoft.com/en-us/azure/network-watcher/vnet-flow-logs-policy) | configuration | 0.65 | Shows how to use built-in Azure Policy definitions to audit and deploy VNet flow logs; such content typically lists specific policy names, parameters, and effects, which are configuration-level details. | @@ -129,6 +133,7 @@ confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Networking | [Update to latest version](https://learn.microsoft.com/en-us/azure/network-watcher/network-watcher-agent-update) | 0.30 | Update procedure for VM extension; summary lacks configuration parameters, limits, or troubleshooting mappings. | | [Overview](https://learn.microsoft.com/en-us/azure/network-watcher/nsg-flow-logs-overview) | 0.25 | Overview of NSG flow logs and retirement notice; summary does not indicate detailed configuration tables, limits, or troubleshooting mappings. 
| | [Overview](https://learn.microsoft.com/en-us/azure/network-watcher/traffic-analytics) | 0.25 | High-level overview of Traffic Analytics capabilities and benefits; summary does not indicate detailed configuration parameters, limits, or troubleshooting content. | +| [Integrate Microsoft Sentinel with traffic analytics](https://learn.microsoft.com/en-us/azure/network-watcher/traffic-analytics-sentinel) | 0.20 | Integration article between Traffic Analytics and Microsoft Sentinel is likely a conceptual/how-to overview without detailed configuration parameter tables, limits, or error-code-based troubleshooting. The summary emphasizes capabilities and benefits rather than product-specific settings, quotas, or decision matrices. | | [Monitor communication between VMs](https://learn.microsoft.com/en-us/azure/network-watcher/monitor-vm-communication) | 0.20 | Step-by-step tutorial using portal; no indication of detailed settings tables, limits, or troubleshooting codes. | | [Overview](https://learn.microsoft.com/en-us/azure/network-watcher/connection-troubleshoot-overview) | 0.20 | Page is an overview of the Network Watcher connection troubleshoot feature, describing what it does and the kinds of issues it can detect, but not providing specific error codes, diagnostic commands, configuration parameters, or numeric limits. It lacks the concrete symptom→cause→solution mappings or detailed settings required for troubleshooting or configuration classifications. | | [Overview](https://learn.microsoft.com/en-us/azure/network-watcher/vnet-flow-logs-overview) | 0.20 | Conceptual overview of virtual network flow logs and their uses; summary does not indicate detailed configuration parameters, limits, or troubleshooting mappings. 
| diff --git a/products/azure-networking/azure-networking.csv b/products/azure-networking/azure-networking.csv index 26ea0e9c..a032c638 100644 --- a/products/azure-networking/azure-networking.csv +++ b/products/azure-networking/azure-networking.csv @@ -1,35 +1,20 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type +https://learn.microsoft.com/en-us/azure/networking/architecture-guides,Architecture guides,Azure Networking architecture documentation,,Learn about the reference architecture documentation available for Azure networking services.,This article provides information about architecture guides that can help you explore the different networking services in Azure available to you for building your applications.,2026-04-22T19:06:00.000Z,concept-article,,0.05,False,"Navigation/overview page pointing to architecture guides; does not itself contain patterns, thresholds, or decision matrices.",new https://learn.microsoft.com/en-us/azure/networking/azure-for-network-engineers,Azure for network engineers,Azure for Network Engineers,,This page explains to traditional network engineers how networks work in Azure.,"As a conventional network engineer you have dealt with physical assets such as routers, switches, cables, firewalls to build infrastructure. At a logical layer you have configured virtual LAN (VLAN), Spanning Tree Protocol (STP), routing protocols (RIP, OSPF, BGP). You have managed your network using management tools and CLI. Networking in the cloud is different where network endpoints are logical and use of routing protocols is minimum. 
You will work with Azure Resource Manager API, Azure CLI, ",2021-08-16T17:04:00.000Z,article,,0.1,False,"Conceptual explanation of how networking in Azure differs from traditional networking; no concrete configs, limits, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/networking/azure-network-latency,Azure network latency,Azure network round-trip latency statistics,Use Azure region latency stats for architecture planning,Learn about round-trip latency statistics between Azure regions.,This article provides round-trip latency statistics between Azure regions to help you optimize your cloud architecture and deployment decisions. The data comes from continuous network monitoring across Azure's global infrastructure and represents real-world performance measurements. Use these statistics to:,2025-08-18T17:25:00.000Z,concept-article,decision-making,0.75,True,Provides concrete round-trip latency statistics between regions to guide deployment and architecture decisions; quantitative data not inferable from training alone.,unchanged -https://learn.microsoft.com/en-us/azure/networking/check-usage-against-limits,Check resource usage against Azure limits,Check Azure resource usage against limits,,Learn how to check your Azure resource usage against Azure subscription limits.,"In this article, you learn how to see the number of each network resource type that you've deployed in your subscription and what your subscription limits are. The ability to view resource usage against limits is helpful to track current usage, and plan for future use. 
You can use the Azure portal, PowerShell, or the Azure CLI to track usage.",2023-04-13T11:17:00.000Z,how-to,,0.4,False,"Explains how to view resource usage vs subscription limits using portal/CLI/PowerShell; the actual numeric limits are elsewhere, so this is procedural rather than expert limits content.",unchanged -https://learn.microsoft.com/en-us/azure/networking/connectivity-interoperability-control-plane,Control Plane Analysis,Interoperability in Azure - Control plane analysis,Analyze control plane routing interoperability in Azure,"This article provides the control plane analysis of the test setup you can use to analyze interoperability between ExpressRoute, a site-to-site VPN, and virtual network peering in Azure.",This article describes the control plane analysis of the test setup. You can also review the test setup configuration and the data plane analysis of the test setup. Control plane analysis essentially examines routes that are exchanged between networks within a topology. Control plane analysis can help you understand how different networks view the topology.,2023-06-29T11:24:00.000Z,concept-article,architecture-patterns,0.6,True,"Control plane analysis of routes exchanged between ExpressRoute, VPN, and VNet peering; provides product-specific routing behavior patterns and trade-offs.",unchanged -https://learn.microsoft.com/en-us/azure/networking/connectivity-interoperability-data-plane,Data Plane Analysis,Interoperability in Azure - Data plane analysis,Analyze data plane paths across Azure networks,"This article provides the data plane analysis of the test setup you can use to analyze interoperability between ExpressRoute, a site-to-site VPN, and virtual network peering in Azure.","This article describes the control plane analysis of the test setup. You can also review the test setup configuration and the data plane analysis of the test setup. 
Data plane analysis examines the path taken by packets that traverse from one local network (LAN or virtual network) to another within a topology. The data path between two local networks isn't necessarily symmetrical. Therefore, in this article, we analyze a forwarding path from a local network to another network that's separate fro",2023-06-29T11:24:00.000Z,concept-article,architecture-patterns,0.6,True,Data plane analysis of packet forwarding paths between LANs and VNets; details Azure-specific path asymmetry and interoperability behavior.,unchanged -https://learn.microsoft.com/en-us/azure/networking/connectivity-interoperability-preface,Preface and Test Setup,Interoperability in Azure - Test setup,,"This article describes a test setup you can use to analyze interoperability between ExpressRoute, a site-to-site VPN, and virtual network peering in Azure.",This article describes a test setup you can use to analyze how Azure networking services interoperate at the control plane level and data plane level. Let's look briefly at the Azure networking components: Azure ExpressRoute: Use private peering in Azure ExpressRoute to directly connect private IP spaces in your on-premises network to your Azure Virtual Network deployments. ExpressRoute can help you achieve higher bandwidth and a private connection. 
Many ExpressRoute eco partners offer ExpressRo,2023-06-29T11:24:00.000Z,how-to,,0.4,False,Describes a test setup for interoperability analysis; summary does not indicate detailed configuration tables or product-specific parameters.,unchanged -https://learn.microsoft.com/en-us/azure/networking/create-zero-trust-network-web-apps,Create a Zero Trust network for web applications,Deploy a Zero Trust Virtual Network for Web Applications,Deploy a Zero Trust virtual network for web apps,"Deploy a Zero Trust virtual network configuration for web applications in Azure using Azure Firewall, Azure Application Gateway, Web Application Firewall, and other virtual network services.",,2023-01-23T12:15:00.000Z,how-to,security,0.65,True,"Describes a concrete Zero Trust VNet configuration using Azure Firewall, Application Gateway, WAF, and related services; likely includes product-specific security configuration patterns.",unchanged https://learn.microsoft.com/en-us/azure/networking/cross-service-scenarios/design-secure-hub-spoke-network,Design a secure hub-spoke network,Design a secure hub-spoke network for regional web applications in Azure,Design a secure hub-spoke network for Azure web apps,"Learn how to build a secure-by-default network foundation for regional web applications using a minimal hub-spoke topology with Application Gateway, WAF, DDoS Protection, Bastion, NSGs, and virtual ne","When you host a web application in Azure, the network design you choose determines how much of your infrastructure is exposed to attack. Without an intentional design, teams commonly leave default configurations in place, expose management ports to the internet, or skip application-layer inspection—all of which increase the attack surface. 
This article explains a repeatable architecture pattern that uses a minimal hub-spoke topology to build a secure-by-default foundation for single-region web app",2026-03-24T22:22:00.000Z,concept-article,architecture-patterns,0.78,True,"Describes a repeatable, product-specific hub-spoke architecture pattern for regional web applications using Application Gateway, WAF, DDoS Protection, Bastion, NSGs, and VNet peering. Focuses on how to design a secure-by-default topology and when to use these components together, which is architecture guidance beyond generic concepts.",unchanged -https://learn.microsoft.com/en-us/azure/networking/fundamentals/architecture-guides,Architecture guides,Azure Networking architecture documentation,,Learn about the reference architecture documentation available for Azure networking services.,This article provides information about architecture guides that can help you explore the different networking services in Azure available to you for building your applications.,2023-06-13T08:00:00.000Z,concept-article,,0.2,False,Navigation/overview page listing networking architecture guides; does not itself contain detailed patterns or configs.,unchanged -https://learn.microsoft.com/en-us/azure/networking/fundamentals/lumenisity-patent-list,Lumenisity UoS Patents,Lumenisity University of Southampton Patents,,"List of Lumenisity UoS Patents as of April 19, 2023.","The following is a list of patents owned by Lumenisity UoS (University of Southampton) as of April 19, 2023. This list is subject to change without notice. 
Note Microsoft Hollow Core Fiber products and manufacturing processes may be protected by one or more claims of the following patents or published patent applications.",2024-11-21T23:02:00.000Z,concept-article,,0.5,False,List of patents related to Lumenisity Hollow Core Fiber; legal/patent information rather than actionable product configuration or limits.,unchanged -https://learn.microsoft.com/en-us/azure/networking/fundamentals/networking-overview,About Azure networking,Azure networking services overview,,"Learn about the various networking services in Azure, including networking foundation, load balancing and content delivery, hybrid connectivity, and network security services.",The networking services in Azure provide various networking capabilities that can be used together or separately. Select each of the following networking scenarios to learn more about them:,2025-06-26T08:00:00.000Z,overview,,0.1,False,"High-level overview of Azure networking services without detailed limits, configs, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/networking/fundamentals/resource-graph-samples,Azure Resource Graph queries,Azure Resource Graph sample queries for Azure networking,Run Azure Resource Graph queries for networking resources,Sample Azure Resource Graph queries for Azure networking showing use of resource types and tables to access Azure networking related resources and properties.,This page is a collection of Azure Resource Graph sample queries for Azure networking.,2024-06-12T05:46:00.000Z,sample,integrations,0.7,True,Provides concrete Resource Graph sample queries using specific networking resource types and table schemas; product-specific query patterns and field usage.,unchanged -https://learn.microsoft.com/en-us/azure/networking/load-balancer-content-delivery/,Load balancing and content delivery,Azure load balancing and content delivery,,"Discover how to secure and optimize your application with Azure load balancing and content 
delivery services. Learn how Azure Load Balancer, Application Gateway, and Front Door improve performance.",Overview,2025-06-26T15:48:00Z,hub-page,,0.1,False,"Overview of load balancing and content delivery services; likely marketing/introductory without detailed limits, configs, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/networking/microsoft-global-network,Microsoft global network,Microsoft global network - Azure,,"Learn how Microsoft builds and operates one of the largest backbone networks in the world, and why it's central to delivering a great cloud experience.","Microsoft owns and operates one of the largest backbone networks in the world. This global and sophisticated architecture, spanning more than 165,000 miles, connects our datacenters and customers. Every day, customers around the world connect and pass trillions of requests to Microsoft Azure, Bing, Dynamics 365, Microsoft 365, XBox, and many others. Regardless of type, customers expect instant reliability and responsiveness from our services. The Microsoft global network (WAN) is a central part of",2023-04-07T00:00:00.000Z,concept-article,,0.2,False,Describes Microsoft’s global backbone network at a high level; largely architectural/marketing overview without actionable configs or limits.,unchanged -https://learn.microsoft.com/en-us/azure/networking/network-monitoring-overview,Network monitoring overview,Network Monitoring in Azure Monitor logs,,"Overview of network monitoring solutions, including network performance monitor, to manage networks across cloud, on-premises, and hybrid environments.","Azure offers a host of solutions to monitor your networking assets. Azure has solutions and utilities to monitor network connectivity, the health of ExpressRoute circuits, and analyze network traffic in the cloud. Important As of July 1, 2021, you can no longer add new tests in an existing workspace or enable a new workspace in Network Performance Monitor (NPM). 
You're also no longer able to add new connection monitors in Connection Monitor (Classic). You can continue to use the tests and connec",2023-10-30T17:02:00.000Z,concept-article,,0.3,False,"Overview of network monitoring solutions and deprecation notice; summary does not show detailed configuration tables, error codes, or limits.",unchanged -https://learn.microsoft.com/en-us/azure/networking/networking-partners-msp,Providers,Networking Partners: Azure Networking,,Learn about Azure Networking Managed Service Provider Partner Program and find a list of partners that offer cloud and hybrid networking services.,"The Azure Networking Managed Service Provider (MSP) Partner Program enables network-services focused MSPs, Telcos, and Systems Integrators (SIs) to offer cloud and hybrid networking services centered around Azure's portfolio of networking products and services. Azure Networking MSPs are a specialized set of managed service providers that address the enterprise cloud networking needs and challenges across all aspects of cloud and hybrid networking. The managed network services include one or more o",2025-10-14T08:00:00.000Z,concept-article,,0.1,False,Program/partner overview for networking MSPs; marketing and partner listing without technical configuration or decision matrices.,unchanged -https://learn.microsoft.com/en-us/azure/networking/nva-accelerated-connections,NVA accelerated connections,Accelerated connections network performance optimization on NVAs and VMs,Optimize NVA and VM performance with Accelerated Connections,Learn how Accelerated Connections improves networking performance for NVAs and VMs.,"This article helps you understand the Accelerated Connections feature. When Accelerated Connections is enabled on the virtual network interface (vNIC) with Accelerated Networking, this feature significantly improves networking efficiency, resulting in enhanced overall performance. 
This high-performance feature offers industry leading performance in Connections Per Second (CPS) optimization, along with improvements to handling large amounts of simultaneous connections. The feature also improves the",2025-08-04T22:16:00.000Z,concept-article,best-practices,0.6,True,"Explains a product-specific performance feature with concrete behavior (CPS optimization, handling many connections) and likely includes configuration guidance unique to Azure networking.",unchanged -https://learn.microsoft.com/en-us/azure/networking/policy-reference,Azure Policy built-ins,Built-in policy definitions for Azure networking services,Use built-in Azure Policy definitions for networking,Lists Azure Policy built-in policy definitions for Azure networking services. These built-in policy definitions provide common approaches to managing your Azure resources.,"This page is an index of Azure Policy built-in policy
-definitions for Azure networking services. For additional Azure Policy built-ins for other services,
-see Azure Policy built-in definitions. The name of each built-in policy definition links to the policy definition in the Azure portal. Use
-the link in the Version column to view the source on the Azure Policy GitHub repo.",2024-02-06T18:05:00.000Z,reference,security,0.7,True,Index of Azure Policy built-in definitions for networking with specific policy names and links; product-specific security/compliance configuration artifacts.,unchanged -https://learn.microsoft.com/en-us/azure/networking/secure-application-delivery,Choose a secure application delivery service,Choose a secure application delivery service,Choose secure Azure application delivery options,Learn how you can use a decision tree to help choose a secure application delivery service.,"Choosing a topology for web application ingress has a few different options, so this decision tree helps identify the initial pattern to start with when considering a web application flow for your workload. 
The key consideration is whether you're using a globally distributed web-based pattern with Web Application Firewall (WAF). Patterns in this classification are better served at the Azure edge versus within your specific virtual network. Azure Front Door, for example, sits at the edge, suppor",2024-06-17T08:00:00.000Z,how-to,decision-making,0.7,True,"Decision tree for selecting between Azure Front Door, Application Gateway, and other ingress patterns with WAF and edge vs VNet considerations; product-specific selection guidance.",unchanged -https://learn.microsoft.com/en-us/azure/networking/secure-network-topology,Choose a secure network topology,Choose a secure network topology,Select a secure Azure network topology,Learn how you can use a decision tree to help choose the best topology to secure your network.,"A network topology defines the basic routing and traffic flow architecture for your workload. However, you must consider security with the network topology. To simplify the initial decision to formulate a direction, there are some simple paths that can be used to help define the secure topology. This includes whether the workload is a globally distributed workload or a single region-based workload. You also must consider plans to use third-party network virtual appliances (NVA’s) to handle both",2024-06-17T08:00:00.000Z,how-to,decision-making,0.7,True,Uses a decision tree to choose between secure topologies based on workload distribution and NVA usage; provides product-specific decision guidance beyond generic concepts.,unchanged +https://learn.microsoft.com/en-us/azure/networking/foundations/network-foundations-overview,Network foundations overview,Azure Network Foundation Services Overview,,"Learn how Azure Virtual Network, Private Link, and DNS work together to create secure, private cloud connectivity. Get started with Azure network foundation services today.","Azure network foundation services provide core connectivity for your resources in Azure. 
Network foundation services include Azure Virtual Network, Azure Private Link, and Azure DNS. Together, these core services build upon each other to provide the foundation for your Azure network. The following diagram is an example of how these services can be used together in a basic Azure network. This article provides a summary of each of these Azure foundational services, and illustrates how they work toget",2025-07-26T17:14:00.000Z,overview,,0.1,False,"Conceptual overview of Virtual Network, Private Link, and DNS; lacks concrete configuration tables, limits, or troubleshooting content.",new +https://learn.microsoft.com/en-us/azure/networking/hybrid-connectivity/hybrid-connectivity,Hybrid connectivity overview,What is hybrid connectivity?,,"Learn about hybrid connectivity in Azure, and the services that can help you connect and maintain resiliency with your Azure resources.","Hybrid connectivity is a critical component of a cloud architecture, which combines on-premises infrastructure, private cloud services, and public cloud services. Hybrid connectivity enables you to connect and maintain resiliency with your Azure resources. This article provides an overview of hybrid connectivity services in the context of Azure services - Azure VPN, Azure ExpressRoute, and Azure Virtual WAN. 
You'll learn about key concepts, benefits, and use cases for each service.",2025-06-26T15:48:00.000Z,overview,,0.1,False,"High-level overview of hybrid connectivity (VPN, ExpressRoute, Virtual WAN); no detailed limits, configs, or decision guidance with thresholds.",new +https://learn.microsoft.com/en-us/azure/networking/load-balancer-content-delivery/load-balancing-content-delivery-overview,Load balancing and content delivery overview,What is load balancing and content delivery?,,"Learn about load balancing and content delivery in Azure, and the services that can help you optimize the performance and reliability of your web applications.","Load balancing and content delivery are critical components in optimizing the performance and reliability of web applications. Load balancing ensures that incoming network traffic is distributed evenly across multiple servers or services, preventing any single server from becoming overwhelmed with requests. And content delivery optimizes the delivery of content to users by caching and distributing it across multiple locations, reducing latency and improving performance. Together, these two conce",2025-06-26T15:48:00.000Z,article,,0.1,False,"Explains what load balancing and content delivery are; no product-specific numeric limits, configs, or decision matrices.",new +https://learn.microsoft.com/en-us/azure/networking/lumenisity-patent-list,Lumenisity UoS Patents,Lumenisity University of Southampton Patents,,"List of Lumenisity UoS Patents as of April 19, 2023.","The following is a list of patents owned by Lumenisity UoS (University of Southampton) as of April 19, 2023. This list is subject to change without notice. 
Note Microsoft Hollow Core Fiber products and manufacturing processes may be protected by one or more claims of the following patents or published patent applications.",2026-04-22T19:06:00.000Z,concept-article,,0.2,False,"List of patents related to Lumenisity; legal/patent information rather than technical configuration, limits, or troubleshooting guidance.",new +https://learn.microsoft.com/en-us/azure/networking/microsoft-global-network,Overview,Microsoft global network - Azure,,"Learn how Microsoft builds and operates one of the largest backbone networks in the world, and why it's central to delivering a great cloud experience.","Microsoft owns and operates one of the largest backbone networks in the world. This global and sophisticated architecture, spanning more than 500,000 miles, connects our datacenters and customers. Every day, customers around the world connect and pass trillions of requests to Microsoft Azure, Bing, Dynamics 365, Microsoft 365, XBox, and many others. Regardless of type, customers expect instant reliability and responsiveness from our services. 
The Microsoft global network (WAN) is a central part of",2023-04-07T00:00:00.000Z,concept-article,,0.2,False,Describes Microsoft global network and backbone; largely architectural/marketing overview without product-specific configuration or limits.,new +https://learn.microsoft.com/en-us/azure/networking/monitoring-management/,Network monitoring and management overview,Azure network monitoring and management documentation,,Learn about the various Azure network monitoring and management services that you can use to monitor your network resources in Azure.,Concept,2025-09-23T22:10:00Z,hub-page,,0.05,False,Landing page for monitoring and management documentation; described as conceptual and not containing detailed technical data.,new +https://learn.microsoft.com/en-us/azure/networking/networking-overview,Azure Networking overview,Azure networking services overview,,"Learn about the various networking services in Azure, including networking foundation, load balancing and content delivery, hybrid connectivity, and network security services.",The networking services in Azure provide various networking capabilities that can be used together or separately. 
Select each of the following networking scenarios to learn more about them:",2026-04-22T19:06:00.000Z,overview,,0.1,False,"High-level overview of Azure networking services; no specific limits, configs, error codes, or decision matrices.",new +https://learn.microsoft.com/en-us/azure/networking/resource-graph-samples,Azure Resource Graph queries,Azure Resource Graph sample queries for Azure networking,Use Azure Resource Graph queries for networking resources,Sample Azure Resource Graph queries for Azure networking showing use of resource types and tables to access Azure networking related resources and properties.,This page is a collection of Azure Resource Graph sample queries for Azure networking.,2026-04-22T19:06:00.000Z,sample,integrations,0.7,True,Collection of concrete Resource Graph sample queries for Azure networking; contains product-specific query patterns and field usage that qualify as integration/coding patterns.,new https://learn.microsoft.com/en-us/azure/networking/security-controls-policy,Security controls by Azure Policy,Azure Policy Regulatory Compliance controls for Azure networking services,Apply Azure Policy compliance controls to networking,Lists Azure Policy Regulatory Compliance controls available for Azure networking services. These built-in policy definitions provide common approaches to managing the compliance of your Azure resource,"Regulatory Compliance in Azure Policy provides Microsoft created and managed initiative definitions, known as built-ins, for the compliance domains and security controls related to different compliance standards. This page lists the compliance domains and security controls for Azure networking services. You can assign the built-ins for a security control individually to help make your Azure resources compliant with the specific standard. 
The title of each built-in policy definition links to the policy defin",2024-02-07T05:35:00.000Z,sample,security,0.7,True,Lists specific Azure Policy regulatory compliance controls and built-in definitions for networking services; includes product-specific policy names and mappings.,unchanged -https://learn.microsoft.com/en-us/azure/networking/security/zero-trust-application-gateway-waf,Application Gateway WAF,Zero Trust recommendations for Azure Application Gateway WAF,Secure Application Gateway WAF with Zero Trust guidance,Review Zero Trust security recommendations for Azure Web Application Firewall on Application Gateway to help protect your web applications.,"Azure Web Application Firewall on Application Gateway protects web applications from common exploits and vulnerabilities. The following recommendations help you verify that WAF is properly configured and monitored. For a summary of all Azure network security Zero Trust recommendations, see Azure network security Zero Trust recommendations.",2026-03-21T06:11:00.000Z,best-practice,security,0.7,True,"Provides Zero Trust recommendations for Azure WAF on Application Gateway, focusing on correct configuration and monitoring of WAF features. This is product-specific security configuration guidance rather than generic WAF concepts.",unchanged -https://learn.microsoft.com/en-us/azure/networking/security/zero-trust-azure-firewall,Azure Firewall,Zero Trust recommendations for Azure Firewall,Harden Azure Firewall with Zero Trust settings,Review Zero Trust security recommendations for Azure Firewall to help enforce network security policies across your virtual networks.,"Azure Firewall provides centralized network security policy enforcement and logging across your virtual networks. The following recommendations help you verify that key protection features are active and properly configured. 
For a summary of all Azure network security Zero Trust recommendations, see Azure network security Zero Trust recommendations.",2026-03-21T06:11:00.000Z,best-practice,security,0.7,True,"Offers Azure Firewall–specific Zero Trust recommendations to ensure key protection features are active and correctly configured, which implies detailed role, rule, and logging configurations unique to this service.",unchanged -https://learn.microsoft.com/en-us/azure/networking/security/zero-trust-ddos-protection,Azure DDoS Protection,Zero Trust recommendations for Azure DDoS Protection,Implement Zero Trust for Azure DDoS Protection,Review Zero Trust security recommendations for Azure DDoS Protection to help secure your public-facing resources.,"Azure DDoS Protection safeguards your public-facing resources from distributed denial of service attacks. The following recommendations help you verify that DDoS protection is enabled and properly monitored across your environment. For a summary of all Azure network security Zero Trust recommendations, see Azure network security Zero Trust recommendations.",2026-03-21T06:11:00.000Z,best-practice,security,0.7,True,"Contains Azure DDoS Protection–specific Zero Trust recommendations on verifying that protection is enabled and monitored across environments, with concrete configuration and validation steps that go beyond conceptual security advice.",unchanged -https://learn.microsoft.com/en-us/azure/networking/security/zero-trust-front-door-waf,Azure Front Door WAF,Zero Trust recommendations for Azure Front Door WAF,Secure Front Door WAF at the edge with Zero Trust,Review Zero Trust security recommendations for Azure Web Application Firewall on Front Door to help protect your web applications at the network edge.,"Azure Web Application Firewall on Front Door protects web applications at the network edge from common exploits and vulnerabilities. The following recommendations help you verify that WAF is properly configured and monitored. 
For a summary of all Azure network security Zero Trust recommendations, see Azure network security Zero Trust recommendations.",2026-03-21T06:11:00.000Z,best-practice,security,0.7,True,"Gives Azure Front Door WAF–specific Zero Trust recommendations to ensure proper configuration and monitoring at the network edge, representing detailed, product-specific security guidance.",unchanged -https://learn.microsoft.com/en-us/azure/networking/security/zero-trust-network-security,Zero Trust network security recommendations,Azure network security Zero Trust recommendations,Apply Zero Trust recommendations to Azure network security,"Review Zero Trust security recommendations for Azure network security services including Azure DDoS Protection, Azure Firewall, and Azure Web Application Firewall.","The Zero Trust model assumes breach and verifies each request as though it originates from an uncontrolled network. Azure network security services play a critical role in enforcing Zero Trust principles by inspecting, filtering, and logging traffic across your cloud environment. The following recommendations help you assess and harden your Azure network security posture. Each recommendation links to a detailed guide describing the security check, its risk level, and remediation steps. Tip Some or",2026-03-21T06:11:00.000Z,best-practice,security,0.7,True,"Provides concrete Zero Trust recommendations tied to Azure DDoS Protection, Azure Firewall, and WAF, including specific checks, risk levels, and remediation steps. This is product-specific security configuration guidance rather than generic Zero Trust theory.",unchanged +https://learn.microsoft.com/en-us/azure/networking/security/network-security,Network security overview,What is Azure network security?,,Overview of Azure network security,"Network security is a critical aspect of cloud computing, as it protects the data and applications that run on the cloud from various threats and attacks. 
Azure provides a comprehensive set of network security solutions that enable you to design, deploy, and manage secure and resilient networks in the cloud. One of the guiding principles of Azure network security is the Zero Trust model, which assumes that no network or device is inherently secure or trustworthy. Instead, every request and connec",2025-06-26T15:48:00.000Z,overview,,0.1,False,"Overview of Azure network security and Zero Trust; does not list specific RBAC roles, settings, or configuration parameters.",new https://learn.microsoft.com/en-us/azure/networking/troubleshoot-failed-state,Troubleshoot Microsoft.Network Failed provisioning state,Troubleshoot Azure Microsoft.Network failed Provisioning State,Troubleshoot Microsoft.Network failed provisioning states,Learn about the meaning of various provisioning states and how to troubleshoot Azure Microsoft.Network failed Provisioning State.,"This article helps you understand the meaning of various provisioning states for Microsoft.Network resources. You can effectively troubleshoot situations when the state is Failed. If your Azure issue is not addressed in this article, visit the Azure forums on Microsoft Q & A and Stack Overflow. You can post your issue in these forums, or post to @AzureSupport on Twitter. You also can submit an Azure support request. 
To submit a support request, on the Azure support page, select Get support.",2023-03-22T00:00:00.000Z,how-to,troubleshooting,0.85,True,Explains meanings of specific provisioning states for Microsoft.Network resources and how to resolve failed states; symptom-to-cause-to-solution troubleshooting content.,unchanged -https://learn.microsoft.com/en-us/azure/networking/working-remotely-support,Support for working remotely,Enable remote work by using Azure networking services,Plan Azure networking for remote work scenarios,Learn how to use Azure networking services to enable remote work and how to mitigate traffic issues that result from an increased number of people who work remotely.,This article presents the different options available for organizations to establish remote access for their users. It also covers ways to supplement their existing solutions with extra capacity during periods of peak utilization. Network architects are faced with the following challenges: Address an increase in network utilization. Provide reliable and secure connectivity to more employees of their company and customers. Provide connectivity to remote locations across the globe. Not all network,2023-04-10T00:00:00.000Z,concept-article,decision-making,0.65,True,Compares different Azure networking options and capacity approaches for remote access and peak utilization; provides scenario-based selection guidance.,unchanged +https://learn.microsoft.com/en-us/azure/networking/working-remotely-support,Working remotely,Enable remote work by using Azure networking services,,Learn how to use Azure networking services to enable remote work and how to mitigate traffic issues that result from an increased number of people who work remotely.,This article presents the different options available for organizations to establish remote access for their users. It also covers ways to supplement their existing solutions with extra capacity during periods of peak utilization. 
Network architects are faced with the following challenges: Address an increase in network utilization. Provide reliable and secure connectivity to more employees of their company and customers. Provide connectivity to remote locations across the globe. Not all network,2023-04-10T00:00:00.000Z,concept-article,,0.3,False,"Describes options for enabling remote work and capacity considerations; likely scenario guidance but no explicit limits, configs, or error mappings indicated.",new diff --git a/products/azure-networking/report.md b/products/azure-networking/report.md index 11cce127..59fa687d 100644 --- a/products/azure-networking/report.md +++ b/products/azure-networking/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-05' +generated_at: '2026-04-26' category_descriptions: decision-making: 'Guidance on choosing Azure network architectures: using region latency data, selecting secure topologies and app delivery options, and planning @@ -7,28 +7,25 @@ category_descriptions: architecture-patterns: 'Routing and traffic flow design in Azure: analyzing control vs data plane paths, and building secure hub-spoke network architectures for web apps.' + integrations: Querying and analyzing Azure networking resources with Azure Resource + Graph, including sample queries, filters, and best practices for large-scale inventory + and compliance. security: 'Zero Trust security for Azure networking: policies, NSGs, Azure Firewall, DDoS, App Gateway/Front Door WAF hardening, and securing virtual networks for web apps.' - integrations: Using Azure Resource Graph to query, filter, and analyze Azure networking - resources at scale (e.g., VNets, NSGs, public IPs) for inventory, compliance, - and reporting. - best-practices: Guidance on boosting Azure NVA and VM network throughput/latency - using Accelerated Connections, including configuration, tuning, and performance - best practices. 
troubleshooting: Diagnosing and resolving Microsoft.Network resource provisioning failures in Azure, including common error patterns, causes, and step-by-step remediation guidance. skill_description: Expert knowledge for Azure Networking development including troubleshooting, - best practices, decision making, architecture & design patterns, security, and integrations - & coding patterns. Use when designing hub-spoke VNets, Azure Firewall/NSG rules, - WAF (App GW/Front Door), Accelerated Networking, or Resource Graph queries, and - other Azure Networking related development tasks. Not for Azure Virtual Network - (use azure-virtual-network), Azure Virtual Network Manager (use azure-virtual-network-manager), - Azure Virtual WAN (use azure-virtual-wan), Azure Network Watcher (use azure-network-watcher). -use_when: Use when designing hub-spoke VNets, Azure Firewall/NSG rules, WAF (App GW/Front - Door), Accelerated Networking, or Resource Graph queries, and other Azure Networking - related development tasks. + decision making, architecture & design patterns, security, and integrations & coding + patterns. Use when designing hub-spoke VNets, Azure Firewall/NSG rules, App Gateway/Front + Door WAF, or Resource Graph queries, and other Azure Networking related development + tasks. Not for Azure Virtual Network (use azure-virtual-network), Azure Virtual + Network Manager (use azure-virtual-network-manager), Azure Virtual WAN (use azure-virtual-wan), + Azure Network Watcher (use azure-network-watcher). +use_when: Use when designing hub-spoke VNets, Azure Firewall/NSG rules, App Gateway/Front + Door WAF, or Resource Graph queries, and other Azure Networking related development + tasks. confusable_not_for: Not for Azure Virtual Network (use azure-virtual-network), Azure Virtual Network Manager (use azure-virtual-network-manager), Azure Virtual WAN (use azure-virtual-wan), Azure Network Watcher (use azure-network-watcher). 
@@ -37,33 +34,70 @@ confusable_not_for: Not for Azure Virtual Network (use azure-virtual-network), A ## Summary -- **Total Pages**: 28 -- **Fetched**: 28 +- **Total Pages**: 16 +- **Fetched**: 16 - **Fetch Failed**: 0 -- **Classified**: 18 -- **Unclassified**: 10 +- **Classified**: 5 +- **Unclassified**: 11 ### Incremental Update -- **New Pages**: 0 +- **New Pages**: 11 - **Updated Pages**: 0 -- **Unchanged**: 28 -- **Deleted Pages**: 0 +- **Unchanged**: 5 +- **Deleted Pages**: 23 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-networking/azure-networking.csv` ## Classification Statistics | Type | Count | Percentage | |------|-------|------------| -| architecture-patterns | 3 | 10.7% | -| best-practices | 1 | 3.6% | -| decision-making | 4 | 14.3% | -| integrations | 1 | 3.6% | -| security | 8 | 28.6% | -| troubleshooting | 1 | 3.6% | -| *(Unclassified)* | 10 | 35.7% | +| architecture-patterns | 1 | 6.2% | +| decision-making | 1 | 6.2% | +| integrations | 1 | 6.2% | +| security | 1 | 6.2% | +| troubleshooting | 1 | 6.2% | +| *(Unclassified)* | 11 | 68.8% | ## Changes +### New Pages + +- [Azure Networking overview](https://learn.microsoft.com/en-us/azure/networking/networking-overview) +- [Network foundations overview](https://learn.microsoft.com/en-us/azure/networking/foundations/network-foundations-overview) +- [Load balancing and content delivery overview](https://learn.microsoft.com/en-us/azure/networking/load-balancer-content-delivery/load-balancing-content-delivery-overview) +- [Network security overview](https://learn.microsoft.com/en-us/azure/networking/security/network-security) +- [Hybrid connectivity overview](https://learn.microsoft.com/en-us/azure/networking/hybrid-connectivity/hybrid-connectivity) +- [Network monitoring and management overview](https://learn.microsoft.com/en-us/azure/networking/monitoring-management/) +- [Architecture guides](https://learn.microsoft.com/en-us/azure/networking/architecture-guides) +- [Working 
remotely](https://learn.microsoft.com/en-us/azure/networking/working-remotely-support) +- [Overview](https://learn.microsoft.com/en-us/azure/networking/microsoft-global-network) +- [Azure Resource Graph queries](https://learn.microsoft.com/en-us/azure/networking/resource-graph-samples) +- [Lumenisity UoS Patents](https://learn.microsoft.com/en-us/azure/networking/lumenisity-patent-list) + +### Deleted Pages + +- ~~Check resource usage against Azure limits~~ (https://learn.microsoft.com/en-us/azure/networking/check-usage-against-limits) +- ~~Control Plane Analysis~~ (https://learn.microsoft.com/en-us/azure/networking/connectivity-interoperability-control-plane) +- ~~Data Plane Analysis~~ (https://learn.microsoft.com/en-us/azure/networking/connectivity-interoperability-data-plane) +- ~~Preface and Test Setup~~ (https://learn.microsoft.com/en-us/azure/networking/connectivity-interoperability-preface) +- ~~Create a Zero Trust network for web applications~~ (https://learn.microsoft.com/en-us/azure/networking/create-zero-trust-network-web-apps) +- ~~Architecture guides~~ (https://learn.microsoft.com/en-us/azure/networking/fundamentals/architecture-guides) +- ~~Lumenisity UoS Patents~~ (https://learn.microsoft.com/en-us/azure/networking/fundamentals/lumenisity-patent-list) +- ~~About Azure networking~~ (https://learn.microsoft.com/en-us/azure/networking/fundamentals/networking-overview) +- ~~Azure Resource Graph queries~~ (https://learn.microsoft.com/en-us/azure/networking/fundamentals/resource-graph-samples) +- ~~Load balancing and content delivery~~ (https://learn.microsoft.com/en-us/azure/networking/load-balancer-content-delivery/) +- ~~Microsoft global network~~ (https://learn.microsoft.com/en-us/azure/networking/microsoft-global-network) +- ~~Network monitoring overview~~ (https://learn.microsoft.com/en-us/azure/networking/network-monitoring-overview) +- ~~Providers~~ (https://learn.microsoft.com/en-us/azure/networking/networking-partners-msp) +- ~~NVA accelerated 
connections~~ (https://learn.microsoft.com/en-us/azure/networking/nva-accelerated-connections) +- ~~Azure Policy built-ins~~ (https://learn.microsoft.com/en-us/azure/networking/policy-reference) +- ~~Choose a secure application delivery service~~ (https://learn.microsoft.com/en-us/azure/networking/secure-application-delivery) +- ~~Choose a secure network topology~~ (https://learn.microsoft.com/en-us/azure/networking/secure-network-topology) +- ~~Application Gateway WAF~~ (https://learn.microsoft.com/en-us/azure/networking/security/zero-trust-application-gateway-waf) +- ~~Azure Firewall~~ (https://learn.microsoft.com/en-us/azure/networking/security/zero-trust-azure-firewall) +- ~~Azure DDoS Protection~~ (https://learn.microsoft.com/en-us/azure/networking/security/zero-trust-ddos-protection) +- *...and 3 more* + ## Classified Pages | TOC Title | Type | Confidence | Reason | @@ -71,33 +105,21 @@ confusable_not_for: Not for Azure Virtual Network (use azure-virtual-network), A | [Troubleshoot Microsoft.Network Failed provisioning state](https://learn.microsoft.com/en-us/azure/networking/troubleshoot-failed-state) | troubleshooting | 0.85 | Explains meanings of specific provisioning states for Microsoft.Network resources and how to resolve failed states; symptom-to-cause-to-solution troubleshooting content. | | [Design a secure hub-spoke network](https://learn.microsoft.com/en-us/azure/networking/cross-service-scenarios/design-secure-hub-spoke-network) | architecture-patterns | 0.78 | Describes a repeatable, product-specific hub-spoke architecture pattern for regional web applications using Application Gateway, WAF, DDoS Protection, Bastion, NSGs, and VNet peering. Focuses on how to design a secure-by-default topology and when to use these components together, which is architecture guidance beyond generic concepts. 
| | [Azure network latency](https://learn.microsoft.com/en-us/azure/networking/azure-network-latency) | decision-making | 0.75 | Provides concrete round-trip latency statistics between regions to guide deployment and architecture decisions; quantitative data not inferable from training alone. | -| [Application Gateway WAF](https://learn.microsoft.com/en-us/azure/networking/security/zero-trust-application-gateway-waf) | security | 0.70 | Provides Zero Trust recommendations for Azure WAF on Application Gateway, focusing on correct configuration and monitoring of WAF features. This is product-specific security configuration guidance rather than generic WAF concepts. | -| [Azure DDoS Protection](https://learn.microsoft.com/en-us/azure/networking/security/zero-trust-ddos-protection) | security | 0.70 | Contains Azure DDoS Protection–specific Zero Trust recommendations on verifying that protection is enabled and monitored across environments, with concrete configuration and validation steps that go beyond conceptual security advice. | -| [Azure Firewall](https://learn.microsoft.com/en-us/azure/networking/security/zero-trust-azure-firewall) | security | 0.70 | Offers Azure Firewall–specific Zero Trust recommendations to ensure key protection features are active and correctly configured, which implies detailed role, rule, and logging configurations unique to this service. | -| [Azure Front Door WAF](https://learn.microsoft.com/en-us/azure/networking/security/zero-trust-front-door-waf) | security | 0.70 | Gives Azure Front Door WAF–specific Zero Trust recommendations to ensure proper configuration and monitoring at the network edge, representing detailed, product-specific security guidance. | -| [Azure Policy built-ins](https://learn.microsoft.com/en-us/azure/networking/policy-reference) | security | 0.70 | Index of Azure Policy built-in definitions for networking with specific policy names and links; product-specific security/compliance configuration artifacts. 
| -| [Azure Resource Graph queries](https://learn.microsoft.com/en-us/azure/networking/fundamentals/resource-graph-samples) | integrations | 0.70 | Provides concrete Resource Graph sample queries using specific networking resource types and table schemas; product-specific query patterns and field usage. | -| [Choose a secure application delivery service](https://learn.microsoft.com/en-us/azure/networking/secure-application-delivery) | decision-making | 0.70 | Decision tree for selecting between Azure Front Door, Application Gateway, and other ingress patterns with WAF and edge vs VNet considerations; product-specific selection guidance. | -| [Choose a secure network topology](https://learn.microsoft.com/en-us/azure/networking/secure-network-topology) | decision-making | 0.70 | Uses a decision tree to choose between secure topologies based on workload distribution and NVA usage; provides product-specific decision guidance beyond generic concepts. | +| [Azure Resource Graph queries](https://learn.microsoft.com/en-us/azure/networking/resource-graph-samples) | integrations | 0.70 | Collection of concrete Resource Graph sample queries for Azure networking; contains product-specific query patterns and field usage that qualify as integration/coding patterns. | | [Security controls by Azure Policy](https://learn.microsoft.com/en-us/azure/networking/security-controls-policy) | security | 0.70 | Lists specific Azure Policy regulatory compliance controls and built-in definitions for networking services; includes product-specific policy names and mappings. | -| [Zero Trust network security recommendations](https://learn.microsoft.com/en-us/azure/networking/security/zero-trust-network-security) | security | 0.70 | Provides concrete Zero Trust recommendations tied to Azure DDoS Protection, Azure Firewall, and WAF, including specific checks, risk levels, and remediation steps. This is product-specific security configuration guidance rather than generic Zero Trust theory. 
| -| [Create a Zero Trust network for web applications](https://learn.microsoft.com/en-us/azure/networking/create-zero-trust-network-web-apps) | security | 0.65 | Describes a concrete Zero Trust VNet configuration using Azure Firewall, Application Gateway, WAF, and related services; likely includes product-specific security configuration patterns. | -| [Support for working remotely](https://learn.microsoft.com/en-us/azure/networking/working-remotely-support) | decision-making | 0.65 | Compares different Azure networking options and capacity approaches for remote access and peak utilization; provides scenario-based selection guidance. | -| [Control Plane Analysis](https://learn.microsoft.com/en-us/azure/networking/connectivity-interoperability-control-plane) | architecture-patterns | 0.60 | Control plane analysis of routes exchanged between ExpressRoute, VPN, and VNet peering; provides product-specific routing behavior patterns and trade-offs. | -| [Data Plane Analysis](https://learn.microsoft.com/en-us/azure/networking/connectivity-interoperability-data-plane) | architecture-patterns | 0.60 | Data plane analysis of packet forwarding paths between LANs and VNets; details Azure-specific path asymmetry and interoperability behavior. | -| [NVA accelerated connections](https://learn.microsoft.com/en-us/azure/networking/nva-accelerated-connections) | best-practices | 0.60 | Explains a product-specific performance feature with concrete behavior (CPS optimization, handling many connections) and likely includes configuration guidance unique to Azure networking. | ## Unclassified Pages | TOC Title | Confidence | Reason | |-----------|------------|--------| -| [Lumenisity UoS Patents](https://learn.microsoft.com/en-us/azure/networking/fundamentals/lumenisity-patent-list) | 0.50 | List of patents related to Lumenisity Hollow Core Fiber; legal/patent information rather than actionable product configuration or limits. 
| -| [Check resource usage against Azure limits](https://learn.microsoft.com/en-us/azure/networking/check-usage-against-limits) | 0.40 | Explains how to view resource usage vs subscription limits using portal/CLI/PowerShell; the actual numeric limits are elsewhere, so this is procedural rather than expert limits content. | -| [Preface and Test Setup](https://learn.microsoft.com/en-us/azure/networking/connectivity-interoperability-preface) | 0.40 | Describes a test setup for interoperability analysis; summary does not indicate detailed configuration tables or product-specific parameters. | -| [Network monitoring overview](https://learn.microsoft.com/en-us/azure/networking/network-monitoring-overview) | 0.30 | Overview of network monitoring solutions and deprecation notice; summary does not show detailed configuration tables, error codes, or limits. | -| [Architecture guides](https://learn.microsoft.com/en-us/azure/networking/fundamentals/architecture-guides) | 0.20 | Navigation/overview page listing networking architecture guides; does not itself contain detailed patterns or configs. | -| [Microsoft global network](https://learn.microsoft.com/en-us/azure/networking/microsoft-global-network) | 0.20 | Describes Microsoft’s global backbone network at a high level; largely architectural/marketing overview without actionable configs or limits. | -| [About Azure networking](https://learn.microsoft.com/en-us/azure/networking/fundamentals/networking-overview) | 0.10 | High-level overview of Azure networking services without detailed limits, configs, or decision matrices. | +| [Working remotely](https://learn.microsoft.com/en-us/azure/networking/working-remotely-support) | 0.30 | Describes options for enabling remote work and capacity considerations; likely scenario guidance but no explicit limits, configs, or error mappings indicated. 
| +| [Lumenisity UoS Patents](https://learn.microsoft.com/en-us/azure/networking/lumenisity-patent-list) | 0.20 | List of patents related to Lumenisity; legal/patent information rather than technical configuration, limits, or troubleshooting guidance. | +| [Overview](https://learn.microsoft.com/en-us/azure/networking/microsoft-global-network) | 0.20 | Describes Microsoft global network and backbone; largely architectural/marketing overview without product-specific configuration or limits. | +| [Azure Networking overview](https://learn.microsoft.com/en-us/azure/networking/networking-overview) | 0.10 | High-level overview of Azure networking services; no specific limits, configs, error codes, or decision matrices. | | [Azure for network engineers](https://learn.microsoft.com/en-us/azure/networking/azure-for-network-engineers) | 0.10 | Conceptual explanation of how networking in Azure differs from traditional networking; no concrete configs, limits, or troubleshooting content. | -| [Load balancing and content delivery](https://learn.microsoft.com/en-us/azure/networking/load-balancer-content-delivery/) | 0.10 | Overview of load balancing and content delivery services; likely marketing/introductory without detailed limits, configs, or decision matrices. | -| [Providers](https://learn.microsoft.com/en-us/azure/networking/networking-partners-msp) | 0.10 | Program/partner overview for networking MSPs; marketing and partner listing without technical configuration or decision matrices. | +| [Hybrid connectivity overview](https://learn.microsoft.com/en-us/azure/networking/hybrid-connectivity/hybrid-connectivity) | 0.10 | High-level overview of hybrid connectivity (VPN, ExpressRoute, Virtual WAN); no detailed limits, configs, or decision guidance with thresholds. 
| +| [Load balancing and content delivery overview](https://learn.microsoft.com/en-us/azure/networking/load-balancer-content-delivery/load-balancing-content-delivery-overview) | 0.10 | Explains what load balancing and content delivery are; no product-specific numeric limits, configs, or decision matrices. | +| [Network foundations overview](https://learn.microsoft.com/en-us/azure/networking/foundations/network-foundations-overview) | 0.10 | Conceptual overview of Virtual Network, Private Link, and DNS; lacks concrete configuration tables, limits, or troubleshooting content. | +| [Network security overview](https://learn.microsoft.com/en-us/azure/networking/security/network-security) | 0.10 | Overview of Azure network security and Zero Trust; does not list specific RBAC roles, settings, or configuration parameters. | +| [Architecture guides](https://learn.microsoft.com/en-us/azure/networking/architecture-guides) | 0.05 | Navigation/overview page pointing to architecture guides; does not itself contain patterns, thresholds, or decision matrices. | +| [Network monitoring and management overview](https://learn.microsoft.com/en-us/azure/networking/monitoring-management/) | 0.05 | Landing page for monitoring and management documentation; described as conceptual and not containing detailed technical data. 
| diff --git a/products/azure-operator-nexus/azure-operator-nexus.csv b/products/azure-operator-nexus/azure-operator-nexus.csv index 56faf14d..f9d7d95c 100644 --- a/products/azure-operator-nexus/azure-operator-nexus.csv +++ b/products/azure-operator-nexus/azure-operator-nexus.csv @@ -1,9 +1,9 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type https://learn.microsoft.com/en-us/azure/operator-nexus/azure-operator-nexus-faq,FAQ,Azure Operator Nexus FAQ - Operator Nexus,,Answers to the most frequently asked questions about Azure Operator Nexus.,The following sections cover some of the frequently asked questions for Azure Operator Nexus:,2025-09-17T16:51:00.000Z,faq,,0.3,False,"FAQ content is typically high-level Q&A without detailed numeric limits, configuration tables, or structured troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/operator-nexus/cluster-jsonc-example,Cluster Template JSON Example,Azure Operator Nexus - Example of cluster.jsonc template file - Operator Nexus,Use cluster.jsonc template for Operator Nexus cluster configuration,Example of cluster.jsonc template file to use with ARM template in creating a cluster.,Note Access to this page requires authorization. You can try signing in or changing directories. Access to this page requires authorization. 
You can try changing directories.,2025-09-17T16:51:00.000Z,how-to,configuration,0.8,True,"Cluster template example (gated) will list concrete configuration parameters, types, and allowed values for cluster resources.",unchanged -https://learn.microsoft.com/en-us/azure/operator-nexus/cluster-parameters-jsonc-example,Cluster Parameters JSON Example,Azure Operator Nexus - Example of cluster.parameters.jsonc template file - Operator Nexus,Configure cluster.parameters.jsonc for multi-rack Operator Nexus clusters,Example of an eight rack Cluster parameter file to use with ARM template in creating a Cluster.,Note Access to this page requires authorization. You can try signing in or changing directories. Access to this page requires authorization. You can try changing directories.,2025-09-17T16:51:00.000Z,how-to,configuration,0.8,True,Parameter file example (gated) for eight-rack cluster exposes detailed configuration keys and constraints specific to Operator Nexus.,unchanged -https://learn.microsoft.com/en-us/azure/operator-nexus/clustermanager-jsonc-example,Cluster Manager Template JSON Example,Azure Operator Nexus - Example of clusterManager.jsonc template file - Operator Nexus,Use clusterManager.jsonc template settings for Operator Nexus,Example of clusterManager.jsonc template file to use with ARM template in creating a cluster manager.,Note Access to this page requires authorization. You can try signing in or changing directories. Access to this page requires authorization. 
You can try changing directories.,2025-09-17T16:51:00.000Z,how-to,configuration,0.8,True,"Template example page (gated) will contain concrete JSONC parameters, allowed values, and defaults for Cluster Manager configuration.",unchanged -https://learn.microsoft.com/en-us/azure/operator-nexus/clustermanager-parameters-jsonc-example,Cluster Manager Parameters JSON Example,Azure Operator Nexus - Example of clusterManager.parameters.jsonc template file - Operator Nexus,Configure clusterManager.parameters.jsonc for Operator Nexus,Example of clusterManager.parameters.jsonc template file to use with ARM template in creating a Cluster Manager.,Note Access to this page requires authorization. You can try signing in or changing directories. Access to this page requires authorization. You can try changing directories.,2025-09-17T16:51:00.000Z,how-to,configuration,0.8,True,"Parameter file example (gated) exposes specific configuration keys, value formats, and constraints for Cluster Manager deployments.",unchanged +https://learn.microsoft.com/en-us/azure/operator-nexus/cluster-jsonc-example,Cluster Template JSON Example,Azure Operator Nexus - Example of cluster.jsonc template file - Operator Nexus,Author cluster.jsonc templates for Nexus clusters,Example of cluster.jsonc template file to use with ARM template in creating a cluster.,Note Access to this page requires authorization. You can try signing in or changing directories. Access to this page requires authorization. 
You can try changing directories.,2026-04-22T22:07:00.000Z,how-to,configuration,0.8,True,"Example cluster.jsonc template will enumerate configuration sections, property names, and required values unique to Azure Operator Nexus cluster creation, fitting the configuration sub-skill.",updated +https://learn.microsoft.com/en-us/azure/operator-nexus/cluster-parameters-jsonc-example,Cluster Parameters JSON Example,Azure Operator Nexus - Example of cluster.parameters.jsonc template file - Operator Nexus,Define cluster.parameters.jsonc for multi-rack Nexus clusters,Example of an eight rack Cluster parameter file to use with ARM template in creating a Cluster.,Note Access to this page requires authorization. You can try signing in or changing directories. Access to this page requires authorization. You can try changing directories.,2026-04-22T22:07:00.000Z,how-to,configuration,0.8,True,"Eight-rack cluster parameter example will show detailed parameter names, structures, and allowed values for ARM-based cluster deployment, which is expert configuration knowledge.",updated +https://learn.microsoft.com/en-us/azure/operator-nexus/clustermanager-jsonc-example,Cluster Manager Template JSON Example,Azure Operator Nexus - Example of clusterManager.jsonc template file - Operator Nexus,Use clusterManager.jsonc template for Nexus deployment,Example of clusterManager.jsonc template file to use with ARM template in creating a cluster manager.,Note Access to this page requires authorization. You can try signing in or changing directories. Access to this page requires authorization. 
You can try changing directories.,2026-04-22T22:07:00.000Z,how-to,configuration,0.8,True,"Example template file for clusterManager.jsonc will contain concrete schema, parameter names, and configuration patterns specific to Azure Operator Nexus ARM deployments, which are product-specific configuration details.",updated +https://learn.microsoft.com/en-us/azure/operator-nexus/clustermanager-parameters-jsonc-example,Cluster Manager Parameters JSON Example,Azure Operator Nexus - Example of clusterManager.parameters.jsonc template file - Operator Nexus,Configure clusterManager.parameters.jsonc for Nexus clusters,Example of clusterManager.parameters.jsonc template file to use with ARM template in creating a Cluster Manager.,Note Access to this page requires authorization. You can try signing in or changing directories. Access to this page requires authorization. You can try changing directories.,2026-04-22T22:07:00.000Z,how-to,configuration,0.8,True,"Example parameters file defines specific configuration keys, expected value formats, and possibly defaults for creating a Cluster Manager via ARM, which matches product-specific configuration guidance.",updated https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-ab-staged-commit-configuration-update-commit-workflow,A / B staged configuration update - commit workflow in Azure Operator Nexus,A/B Staged Configuration Update: Commit Workflow in Azure Operator Nexus - Operator Nexus,Apply A/B staged configuration updates in Nexus Fabric,Learn about the A/B staged configuration update commit workflow in Azure Operator Nexus.,"The A/B staged configuration update introduces a safe rollout model for Network Fabric configuration. Operators stage configuration on one customer edge device (A or B), validate behavior, optionally cancel to revert, or complete rolling out to the remaining devices by using commit workflow semantics (lock-validate-commit cycle). The current scope targets customer edge-only staging. 
Broader policies might be added later.",2025-12-23T18:03:00.000Z,concept-article,configuration,0.75,True,Explains staged rollout model with commit workflow semantics (lock-validate-commit) and CE-only staging; product-specific configuration pattern.,unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-access-control-lists,Access Control Lists,Azure Operator Nexus Access Control Lists Overview - Operator Nexus,Define and apply access control lists in Nexus Network Fabric,Get an overview of access control lists for Azure Operator Nexus.,Access control lists (ACLs) are a set of rules that regulate inbound and outbound packet flow within a network. Azure Operator Nexus - Network Fabric offers an API-based mechanism to configure ACLs for network-to-network interconnects (NNIs) and layer 3 isolation domain (ISD) external networks. These APIs enable the specification of traffic classes and performance actions based on defined rules and actions within the ACLs. ACL rules define the data against which packet contents are compared for ,2025-12-23T18:03:00.000Z,concept-article,configuration,0.7,True,Describes API-based ACL configuration for NNIs and ISD external networks with rule/action semantics; product-specific configuration model.,unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-baseboard-management-controller-credential-rotation,Baseboard Management Controller Credential Rotation Overview,Azure Operator Nexus: Baseboard Management Controller Credential Rotation Overview - Operator Nexus,,An overview of how credential rotation occurs for Baseboard Management Controller Credential,"The Baseboard Management Controller (BMC) (iDRAC) has several credentials that are automatically rotated as part of the system per each machine. 
In order for this automated rotation to occur, each Bare Metal Machine (BMM) must be considered one of two potential states in the cluster before it rotates the credential.",2026-01-19T18:03:00.000Z,how-to,,0.35,False,"Describes BMC credential rotation states conceptually; no specific Key Vault, identity, or rotation interval parameters exposed in summary.",unchanged @@ -17,7 +17,7 @@ https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-cross-subscripti https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-disable-border-gateway-protocol-neighbors,Disable Border Gateway Protocol neighbors,Disable the Border Gateway Protocol neighbors - Operator Nexus,Disable BGP neighbors using Nexus read-write commands,Learn how to use read write commands in the Nexus Network Fabric to disable the Border Gateway Protocol.,This article provides examples demonstrating how a user can implement the read write (RW) commands to disable Border Gateway Protocol (BGP) neighbors.,2025-09-17T16:51:00.000Z,concept-article,configuration,0.8,True,Provides concrete examples of RW commands to disable BGP neighbors; includes product-specific command syntax and behavior.,unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-disable-internal-external-networks-enabled-layer-3-isolation-domain,Disable internal/external networks in an enabled layer 3 isolation domain,Disable Internal/External Networks in an Enabled Layer 3 Isolation Domain in Azure Operator Nexus - Operator Nexus,Disable networks in enabled Layer 3 isolation domains safely,Learn about disabling internal/external networks in an enabled layer 3 isolation domain in Azure Operator Nexus.,Disabling internal or external networks while an isolation domain (ISD) is enabled ensures controlled configuration changes without disrupting active traffic. 
This feature introduces administrative state updates for networks integrated with commit workflows for atomic operations.",2025-12-23T18:03:00.000Z,concept-article,configuration,0.7,True,Describes administrative state updates integrated with commit workflows for atomic operations; specific to Nexus isolation domain configuration.,unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-fabric-management-upgrade,Network Fabric management upgrade,Azure Operator Nexus Network Fabric management upgrade overview - Operator Nexus,,Get an overview of Network Fabric management upgrade for Azure Operator Nexus.,"Operator Nexus releases various functionality and bug fixes throughout the product lifecycle to update the Azure resources and on-premises extensions, critical in communications back to Azure. Note This article describes Fabric Management Bundle Upgrades, which are non-disruptive and automatically applied by Microsoft. Fabric Management Bundle Upgrades are separate from Fabric Device Runtime Upgrades, which update the software on network devices. For more information, see Related content.",2026-01-20T23:02:00.000Z,feature-availability,,0.35,False,Fabric management upgrade overview; lifecycle description without detailed expert-only parameters.,unchanged -https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-hardware-validation-overview,Hardware Validation,Azure Operator Nexus hardware validation overview - Operator Nexus,,Get an overview of hardware validation for Azure Operator Nexus.,"Hardware Validation (HWV) assesses the state and health of hardware components for a Bare Metal Machine (BMM) by executing test cases against the baseboard management controller (BMC). At this time, the Azure Operator Nexus platform is deployed on Dell servers. 
Dell servers use the integrated Dell remote access controller (iDRAC), which is the equivalent of a BMC.",2026-01-27T23:09:00.000Z,concept-article,,0.3,False,"Hardware validation overview; summary does not indicate detailed test matrices, parameters, or limits.",unchanged +https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-hardware-validation-overview,Hardware Validation,Azure Operator Nexus hardware validation overview - Operator Nexus,,Get an overview of hardware validation for Azure Operator Nexus.,"Hardware Validation (HWV) assesses the state and health of hardware components for a Bare Metal Machine (BMM) by executing test cases against the baseboard management controller (BMC). At this time, the Azure Operator Nexus platform is deployed on Dell servers. Dell servers use the integrated Dell remote access controller (iDRAC), which is the equivalent of a BMC.",2026-04-23T17:08:00.000Z,concept-article,,0.3,False,"High-level overview of hardware validation and BMC/iDRAC context without specific limits, configuration parameters, error codes, or decision matrices.",updated https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-internal-network-bgp-metrics,Azure Operator Nexus Network Fabric Internal Network BGP Metrics,Azure Operator Nexus Network Fabric internal network BGP metrics - Operator Nexus,,Overview of internal network BGP metrics.,"Border Gateway Protocol (BGP) Neighbor Monitoring is a critical aspect of network management, ensuring the stability and reliability of communication between routers within internal networks. 
This concept document aims to provide an overview of BGP Neighbor Monitoring, its significance, and the key metrics involved.",2025-09-17T16:51:00.000Z,concept-article,,0.5,False,Internal network BGP metrics overview; summary suggests conceptual explanation of metrics importance rather than detailed metric reference tables.,unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-isolation-domain,Isolation Domains overview,Azure Operator Nexus Isolation Domains - Operator Nexus,,Overview of Isolation Domains for Azure Operator Nexus.,An Isolation Domain resource enables the creation of layer-2 and layer-3 networks that your network functions can connect to. This enables inter-rack and intra-rack communication between the network functions. The Operator Nexus Network Fabric (NNF) Service enables three types of isolation domain: Layer-2 isolation domain- provides layer-2 networking capabilities within and across the racks for workloads running on servers. Workloads can take advantage of the isolated layer-2 network to establis,2025-09-17T16:51:00.000Z,concept-article,,0.45,False,Isolation domains overview; conceptual explanation of L2/L3 isolation types.,unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-network-fabric,Network Fabric overview,Azure Operator Nexus: Network Fabric - Operator Nexus,,Overview of Network Fabric resources for Azure Operator Nexus.,Azure Operator Nexus offers various capabilities to manage the lifecycle and configuration of the networking required to run the Operator's infrastructure and workloads. Operator Nexus enables you to: Key capabilities offered in Azure Operator Nexus Network Fabric: Bootstrapping and lifecycle management- Automated bootstrapping & provisioning of network fabric resources based on network function use-cases. 
It provides various controls to manage network devices in operator premises via Azure APIs,2025-09-17T16:51:00.000Z,concept-article,,0.35,False,"Network fabric overview; capabilities list but no specific config tables, limits, or troubleshooting mappings.",unchanged @@ -68,6 +68,7 @@ https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-storage,Storage https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-storage-kubernetes,Storage for Nexus Kubernetes,Azure Operator Nexus persistent storage for Kubernetes - Operator Nexus,Use Nexus Kubernetes persistent storage classes effectively,Get an overview of available storage classes for Kubernetes on Azure Operator Nexus.,Each Azure Operator Nexus provides two types of persistent storage to Nexus Kubernetes cluster tenant workloads: nexus-volume and nexus-shared. Operators select the type of storage they need by creating Persistent Volume Claims (PVCs) using the nexus-volume or nexus-shared storage class. All data stored in persistent volumes is stored on a storage appliance deployed on-premises as part of the Azure Operator Nexus instance. Azure Operator Nexus requires one storage appliance and supports up to two stora,2025-09-17T16:51:00.000Z,concept-article,limits-quotas,0.7,True,"Defines specific storage classes (nexus-volume, nexus-shared) and states Nexus requires one and supports up to two storage appliances; this is product-specific limit information.",unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-storage-multiple-appliances,Multiple storage appliances,Azure Operator Nexus multiple storage appliances - Operator Nexus,Plan storage capacity with multiple Nexus appliances,Learn about Azure Operator Nexus support for multiple storage appliances.,"The storage appliance in Azure Operator Nexus provides highly available, persistent storage to containerized and virtualized workloads. Azure Operator Nexus hardware is organized into compute racks and an aggregator rack. 
The aggregator rack contains space for two storage appliances. Azure Operator Nexus instances always require one storage appliance; the second storage appliance is optional. Customers can choose to deploy a second storage appliance when their workloads require more capacity tha",2025-09-17T16:51:00.000Z,concept-article,limits-quotas,0.65,True,Describes hardware organization and requirement of one storage appliance with an optional second for more capacity; likely includes concrete capacity/scale details and constraints specific to Operator Nexus.,unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-telco-input-template,Telco Input Template,Azure Operator Nexus: Telco Input Template - Operator Nexus,,Representing a Nexus instance in a Telco Input template.,This concept article describes how to represent a Nexus instance in a Telco Input template.,2025-09-17T16:51:00.000Z,concept-article,,0.3,False,Concept article about representing a Nexus instance in a template; summary suggests conceptual only.,unchanged +https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-terminal-server-as-resource,Terminal Server as an Azure Operator Nexus Resource,Terminal Server as an Azure Operator Nexus Resource - Operator Nexus,Model terminal servers as Azure Operator Nexus resources,"Learn how Terminal Servers (Bootstrap Devices) are modeled as ARM resources in Azure Operator Nexus for observability, automation, and lifecycle management.","Terminal Servers (Bootstrap Devices) are modeled as Azure Resource Manager (ARM) resources within the Azure Operator Nexus (AON) platform to enable observability, automation, and lifecycle management. Each Terminal Server is represented by two resource types: The NetworkBootstrapInterface is exposed as a child-resource of the NetworkBootstrapDevice. The physical interfaces on the terminal server such as Net1, Net2, and Net3 are each modeled as ARM resources. 
These resources are defined within the AON M",2026-04-23T17:08:00.000Z,concept-article,configuration,0.68,True,"Describes how terminal servers and their physical interfaces (Net1, Net2, Net3) are modeled as specific Azure Resource Manager resource types within Azure Operator Nexus, including parent/child resource relationships. This is product-specific configuration/representation detail that an LLM is unlikely to know from training and maps best to configuration rather than general concepts.",new https://learn.microsoft.com/en-us/azure/operator-nexus/how-to-credential-manager-key-vault,Credential Manager Key Vault,Set up customer provided Key Vault for Managed Credential rotation - Operator Nexus,Configure customer Key Vault for Operator Nexus credential rotation,Step by step guide on setting up a key vault for managing and rotating credentials used within Azure Operator Nexus Cluster resource.,"Note If no key vault is configured for the Cluster resource, credential rotation will fail. Azure Operator Nexus utilizes secrets and certificates to manage component security across the platform. The Operator Nexus platform handles the rotation of these secrets and certificates. By default, Operator Nexus stores the credentials in a managed Key Vault. To keep the rotated credentials in their own Key Vault, the user must configure their own Key Vault to receive rotated credentials. 
This configur",2025-09-17T16:51:00.000Z,how-to,security,0.8,True,"Step-by-step Key Vault setup for managed credential rotation; will include specific Key Vault access policies, identities, and configuration parameters unique to Operator Nexus.",unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/how-to-customize-kubernetes-cluster-dns,Customize cluster DNS,Customize DNS for an Azure Operator Nexus Kubernetes cluster - Operator Nexus,Customize CoreDNS and node-local-dns in Nexus,Learn how to customize DNS.,"Nexus Kubernetes clusters use a combination of CoreDNS and node-local-dns for cluster DNS management and resolution, with node-local-dns taking precedence for name resolution outside the cluster. Azure Operator Nexus is a managed service, so you can't modify the main configuration for CoreDNS or node-local-dns. Instead, you use a Kubernetes ConfigMap to override the default settings. To see the default CoreDNS and node-local-dns ConfigMaps, use the kubectl get configmaps --namespace=kube-system cored",2025-09-17T16:51:00.000Z,how-to,configuration,0.85,True,"Shows exact ConfigMap names, kubectl commands, and override patterns for DNS in Nexus clusters, including default vs override behavior.",unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/how-to-ip-prefixes,IP Prefixes,Azure Operator Nexus: How to create and manage IP prefixes - Operator Nexus,Create and manage IP prefixes and rules in Operator Nexus,"Learn to create, view, list, update, and delete IP prefixes and IP prefix rules.",This article explains the main management operations for IP prefixes and IP prefix rules in Azure Operator Nexus.,2025-09-17T16:51:00.000Z,how-to,configuration,0.7,True,Explains management operations for IP prefixes and rules; likely includes specific configuration fields and constraints unique to Operator Nexus.,unchanged @@ -92,6 +93,7 @@ https://learn.microsoft.com/en-us/azure/operator-nexus/howto-baremetal-nexusctl, run simple actions on BareMetal Machines 
(BMM) without using the Azure console or command-line interface (CLI). Caution Don't perform any action against control or management plane servers without first consulting with Microsoft support personnel, doing so could affect the integrity of the Operator Nexus Cluster. Important Multiple disruptive command requests against a Kubernetes Control Plane (",2025-09-17T16:51:00.000Z,how-to,troubleshooting,0.7,True,"Break-glass tool with specific bare metal commands, flags, and constraints unique to Operator Nexus; includes operational caveats and likely command syntax not generally known.",unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/howto-baremetal-run-data-extract,BareMetal Run-Data-Extract Execution,Troubleshoot Bare-Metal Machines with the run-data-extract Command - Operator Nexus,Use run-data-extract to troubleshoot Nexus bare-metal,Learn how to extract data from a bare-metal machine for troubleshooting and diagnostic purposes by using the run-data-extract command.,Azure Operator Nexus provides a prescribed set of data extract commands via az networkcloud baremetalmachine run-data-extract that help users investigate and resolve issues with on-premises bare-metal machines. 
Users can employ these commands to get diagnostic data from a bare-metal machine.,2026-04-02T17:05:00.000Z,how-to,troubleshooting,0.78,True,"Page documents the az networkcloud baremetalmachine run-data-extract command specifically for troubleshooting bare-metal machines, including product-specific diagnostic usage patterns and data collection details that are unique to Azure Operator Nexus and not general knowledge.",unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/howto-baremetal-run-read,BareMetal Run-Read Execution,Troubleshoot Bare-Metal Machines by Using the run-read Command - Operator Nexus,Troubleshoot bare metal machines with az baremetalmachine run-read-command,This article teaches you how to run diagnostics on a bare-metal machine by using the run-read command for Azure Operator Nexus.,You can investigate and resolve issues with an on-premises bare-metal machine by using the az networkcloud baremetalmachine run-read-command for Azure Operator Nexus. The run-read command supports a curated list of read-only commands that help you get information from a bare-metal machine.,2025-11-13T23:03:00.000Z,how-to,troubleshooting,0.8,True,Provides curated read-only diagnostic commands via run-read; symptom-oriented troubleshooting using Nexus-specific CLI.,unchanged
This capability allows customers to manage certificate lifecycle operations directly, while preserving existing operational safeguards.",2026-04-22T17:03:00.000Z,how-to,security,0.7,True,"Certificate rotation and resynchronization for Nexus Network Fabric is a security-focused lifecycle operation; the page likely includes specific API operations, parameters, and process steps unique to this product’s certificate management.",new https://learn.microsoft.com/en-us/azure/operator-nexus/howto-check-runtime-version,Check runtime version,Azure Operator Nexus: How to check runtime version for Azure Operator Nexus - Operator Nexus,Check runtime versions of Operator Nexus components,Learn to check the runtime version of the key components in Azure Operator Nexus.,This how-to guide explains the steps to determine the runtime version of the key components in Azure Operator Nexus.,2025-09-17T16:51:00.000Z,how-to,configuration,0.7,True,Provides specific commands and resource paths to retrieve runtime versions of key Nexus components.,unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/howto-cluster-managed-identity-user-provided-resources,Cluster Managed Identity and User Provided Resources,Azure Operator Nexus Cluster Support for managed identities and user provided resources - Operator Nexus,Use managed identities and user resources in Operator Nexus clusters,Azure Operator Nexus Cluster support for managed identities and user provided resources.,"To improve the security of the Operator Nexus platform, managed identities are now supported for Operator Nexus Clusters. Managed identities provide a secure way for applications to access other Azure resources and eliminate the need for users to manage credentials. Additionally, Operator Nexus now has a user provided resource model. In addition to improved security, this shift provides a consistent user experience across the platform. 
Managed identities are used with the following user resource",2026-02-27T23:05:00.000Z,how-to,security,0.75,True,"Describes managed identity usage and user-provided resource model; likely includes specific identity scopes, resource types, and security configuration patterns unique to Operator Nexus.",unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/howto-cluster-manager,Cluster Manager,Cluster Manager: How to manage the Cluster Manager in Operator Nexus - Operator Nexus,,"Learn to create, view, list, update, delete commands for Cluster Manager on Operator Nexus",The Cluster Manager is deployed in the operator's Azure subscription to manage the lifecycle of Operator Nexus Infrastructure Clusters.,2026-03-03T23:11:00.000Z,how-to,,0.3,False,"Page appears to be a how-to for creating, viewing, listing, updating, and deleting Cluster Manager resources in Operator Nexus. It is likely a procedural/CRUD guide without detailed configuration parameter tables, limits, error-code-based troubleshooting, or decision matrices. 
No clear evidence from the summary that it contains product-specific limits, quotas, security role definitions, or other expert-only configuration details.",unchanged @@ -129,8 +131,8 @@ https://learn.microsoft.com/en-us/azure/operator-nexus/howto-disable-internal-ex https://learn.microsoft.com/en-us/azure/operator-nexus/howto-enable-disable-vulnerability-scanning,Enable/Disable Vulnerability Scanning,Enable/Disable Vulnerability Scanning in Azure Operator Nexus - Operator Nexus,Enable or disable vulnerability scanning in Nexus,Get instructions on enabling/disabling the Vulnerability Scanning setting.,This guide provides you with instructions on how to enable or disable Vulnerability Scanning in an Azure Operator Nexus cluster.,2025-09-17T16:51:00.000Z,how-to,security,0.7,True,"Product-specific setting and procedure to toggle vulnerability scanning on Nexus clusters, including configuration scope.",unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/howto-enable-log-streaming,How to enable / disable BMP log streaming Azure Operator Nexus,Enable or Disable BMP Log Streaming for Azure Operator Nexus - Operator Nexus,Enable or disable BMP log streaming for Nexus fabric resources,Learn how to enable or disable BMP log streaming for various Azure Operator Nexus Network Fabric resources.,This article shows you how to enable or disable BGP Monitoring Protocol (BMP) log streaming for various Azure Operator Nexus Network Fabric resources.,2025-09-17T16:51:00.000Z,how-to,configuration,0.7,True,"How-to for toggling BMP log streaming on specific Nexus resources; likely includes resource types, flags, and configuration fields unique to this product.",unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/howto-enable-micro-bfd,How to enable-Micro-BFD on CE and PE devices,How to enable Micro-BFD on CE and PE devices - Operator Nexus,Enable Micro-BFD on CE and PE devices in Operator Nexus,Process of enabling Micro-BFD On CE and PE devices.,"Micro-BFD 
(Bidirectional Forwarding Detection) is a lightweight protocol designed to rapidly detect failures between adjacent network devices, such as routers or switches, with minimal overhead. This guide provides step-by-step instructions to enable Micro-BFD on Customer Edge (CE) and Provider Edge (PE) devices.",2025-09-17T16:51:00.000Z,how-to,configuration,0.7,True,Step-by-step enabling of Micro-BFD with Nexus-specific device roles and configuration commands/parameters.,unchanged -https://learn.microsoft.com/en-us/azure/operator-nexus/howto-enable-system-assigned-managed-identity-for-network-fabric-controller,How to enable System Assigned Managed Identity (SAMI) for the Network Fabric Controller,Enable a SAMI for the Network Fabric Controller in Azure Operator Nexus - Operator Nexus,Configure system-assigned managed identity for Nexus NFC,This article describes how to enable and validate a SAMI for the Network Fabric Controller in Azure Operator Nexus.,This article describes how to enable and validate a system-assigned managed identity (SAMI) for the Network Fabric Controller resource for both new resources and existing resources.,2026-04-17T17:09:00.000Z,how-to,security,0.78,True,"How-to for enabling and validating a system-assigned managed identity on the Network Fabric Controller. This is product-specific identity/security configuration (SAMI behavior for this resource type), not generic concepts, and includes concrete steps and constraints unique to Azure Operator Nexus.",updated -https://learn.microsoft.com/en-us/azure/operator-nexus/howto-enable-system-assigned-managed-identity-for-network-fabric-resource,How to enable System Assigned Managed Identity (SAMI) for the Network Fabric resource,Enable a SAMI for the Network Fabric Resource in Azure Operator Nexus - Operator Nexus,Enable SAMI and roles for Nexus Network Fabric,This article describes how to enable a SAMI for the Network Fabric resource in Azure Operator Nexus. 
You can enable a SAMI for new and existing resource paths.,"This article explains how to enable a system-assigned managed identity (SAMI) for the Network Fabric resource. You can enable a SAMI for both new and existing resource paths. The article also covers identity transition rules, lock and commit considerations, and role requirements. The Network Fabric resource supports the following identity modes: SystemAssigned, UserAssigned, and SystemAssigned,UserAssigned. None isn't supported. Be aware of the following constraints: After a SAMI is associated for tr",2026-04-17T17:09:00.000Z,how-to,security,0.82,True,"Describes enabling SAMI for Network Fabric resources, supported identity modes, constraints after association, identity transition rules, lock/commit behavior, and role requirements. These are detailed, product-specific identity and RBAC behaviors that go beyond generic managed identity knowledge.",updated +https://learn.microsoft.com/en-us/azure/operator-nexus/howto-enable-system-assigned-managed-identity-for-network-fabric-controller,How to enable System Assigned Managed Identity (SAMI) for the Network Fabric Controller,Enable a SAMI for the Network Fabric Controller in Azure Operator Nexus - Operator Nexus,Configure system-assigned managed identity for Nexus NFC,This article describes how to enable and validate a SAMI for the Network Fabric Controller in Azure Operator Nexus.,This article describes how to enable and validate a system-assigned managed identity (SAMI) for the Network Fabric Controller resource for both new resources and existing resources.,2026-04-17T17:09:00.000Z,how-to,security,0.78,True,"How-to for enabling and validating a system-assigned managed identity on the Network Fabric Controller. 
This is product-specific identity/security configuration (SAMI behavior for this resource type), not generic concepts, and includes concrete steps and constraints unique to Azure Operator Nexus.",unchanged +https://learn.microsoft.com/en-us/azure/operator-nexus/howto-enable-system-assigned-managed-identity-for-network-fabric-resource,How to enable System Assigned Managed Identity (SAMI) for the Network Fabric resource,Enable a SAMI for the Network Fabric Resource in Azure Operator Nexus - Operator Nexus,Enable SAMI and roles for Nexus Network Fabric,This article describes how to enable a SAMI for the Network Fabric resource in Azure Operator Nexus. You can enable a SAMI for new and existing resource paths.,"This article explains how to enable a system-assigned managed identity (SAMI) for the Network Fabric resource. You can enable a SAMI for both new and existing resource paths. The article also covers identity transition rules, lock and commit considerations, and role requirements. The Network Fabric resource supports the following identity modes: SystemAssigned, UserAssigned, and SystemAssigned,UserAssigned. None isn't supported. Be aware of the following constraints: After a SAMI is associated for tr",2026-04-17T17:09:00.000Z,how-to,security,0.82,True,"Describes enabling SAMI for Network Fabric resources, supported identity modes, constraints after association, identity transition rules, lock/commit behavior, and role requirements. 
These are detailed, product-specific identity and RBAC behaviors that go beyond generic managed identity knowledge.",unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/howto-gather-pvc-trace-id,Gather trace IDs for PersistentVolumeClaim failures,Azure Operator Nexus: Gather trace IDs for PersistentVolumeClaim failures - Operator Nexus,Gather trace IDs for Nexus PersistentVolumeClaim failures,Learn how to gather trace IDs for pods stuck in 'ContainerCreating',"There are rare cases where pods using PersistentVolumeClaims referencing the 'nexus-volume' or 'nexus-shared' storage class can enter a stuck state. Pods can get stuck in the ""ContainerCreating"" state due to failures creating a volume or in attaching a volume to a node. The 'nexus-volume' and 'nexus-shared' storage classes assign trace IDs to volume lifecycle operations to allow diagnosis of these issues. This article explains how to find trace IDs and include them in support requests. Connect t",2025-09-17T16:51:00.000Z,how-to,troubleshooting,0.8,True,Explains how to extract Nexus-specific trace IDs from nexus-volume and nexus-shared storage classes to diagnose stuck ContainerCreating pods.,unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/howto-gather-vm-console-data,Gather VM Console Data,Azure Operator Nexus: Gather VM Console Data - Operator Nexus,Collect diagnostic data for Nexus VM console issues,Learn how to gather VM Console Data.,The article provides generic guidance on how to collect data necessary for diagnosing VM console-related issues.,2025-09-17T16:51:00.000Z,how-to,troubleshooting,0.65,True,Provides specific data collection steps and artifacts required to troubleshoot VM console problems in Nexus.,unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/howto-install-cli-extensions,Install CLI Extension,Azure Operator Nexus: Install CLI extensions - Operator Nexus,Install Azure CLI extensions for Operator Nexus,Learn to install the needed Azure CLI 
extensions for Operator Nexus,"This how-to guide explains the steps for installing the required az CLI and extensions required to interact with Operator Nexus. Azure CLI is required to be installed before the extensions installations. Follow Install Azure CLI instructions to install it. Installations of the following CLI extensions are required: Note Any upgrade of the Azure CLI downloads the latest stable version of the installed extension. @@ -179,7 +181,7 @@ https://learn.microsoft.com/en-us/azure/operator-nexus/howto-update-expressroute https://learn.microsoft.com/en-us/azure/operator-nexus/howto-upgrade-nexus-fabric,Network Fabric Upgrades,Upgrade Network Fabric for Azure Operator Nexus - Operator Nexus,Pre-validate and upgrade Azure Operator Nexus fabric,"Learn how to upgrade Network Fabric for Azure Operator Nexus, and find out about required and recommended pre-validations.","In this article, you learn about both required and recommended pre-upgrade validations to carry out before you upgrade your Azure Operator Nexus Network Fabric runtime. If you don't perform the required pre-validation checks and meet the conditions, your upgrade fails. Performing the recommended checks can help you ensure the consistency of your release.",2026-04-09T17:04:00.000Z,how-to,deployment,0.68,True,"Upgrade procedure content for a specialized network fabric platform is typically highly product-specific, including required and recommended pre-upgrade validations, failure conditions, and ordered steps unique to Azure Operator Nexus. 
These details are unlikely to be fully captured in generic training data and map best to deployment, as they govern how and when a production fabric runtime can be upgraded successfully.",unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/howto-upgrade-nexus-fabric-template,Network Fabric Upgrades Template,Azure Operator Nexus: Network Fabric Runtime Upgrade Template - Operator Nexus,Use template to upgrade Azure Operator Nexus fabric,Learn how to upgrade Network Fabric for Azure Operator Nexus with a step-by-step parameterized template.,This how-to guide provides a step-by-step template for upgrading an Azure Operator Nexus Network Fabric instance. Use this template to manage a reproducible end-to-end upgrade through Azure APIs and standard operating procedures. Regularly update your system to maintain integrity and access the latest product improvements.,2026-03-27T22:03:00.000Z,how-to,deployment,0.7,True,"Provides a parameterized template and step-by-step process to perform a reproducible end-to-end upgrade of an Azure Operator Nexus Network Fabric instance via Azure APIs and standard operating procedures. This is concrete, product-specific upgrade/deployment automation guidance, which qualifies as expert deployment knowledge rather than a generic tutorial.",unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/howto-upgrade-os-of-terminal-server,How to upgrade os of terminal server,Azure Operator Nexus: How to upgrade the operating system of a Terminal Server - Operator Nexus,Upgrade Terminal Server OS in Azure Operator Nexus,Learn the process for upgrading the operating system of a Terminal Server,"This document provides a step-by-step guide to upgrade the operating system (OS) of a Terminal Server. 
The outlined procedure is manual and includes essential checks, a backup process, and actions for post-upgrade validation.",2026-04-08T22:04:00.000Z,how-to,deployment,0.6,True,"Step-by-step OS upgrade procedure with pre-checks, backup, and post-validation for a Nexus Terminal Server; this is a product-specific operational runbook relevant to deployment/upgrade workflows, containing expert procedural knowledge not derivable from generic OS upgrade concepts.",unchanged -https://learn.microsoft.com/en-us/azure/operator-nexus/howto-usage-guide-for-customer-managed-key-vault,How-to usage guide for Customer Managed Key Vault,Usage Guide for Customer-Managed Key Vault - Operator Nexus,Use customer-managed Key Vault with Operator Nexus,This article lists steps you need to take to enable a customer-managed key vault. The article includes examples to help you use a customer-managed key vault.,"Network Fabric uses secrets (passwords) to access the terminal server and network devices. These secrets are stored in Azure Key Vault in the Network Fabric Controller managed resource group. In some regions (currently West US 3 and Brazil South), key vaults don't have automatic cross‑region (paired‑region) replication. As a workaround to protect against a region outage, you can define a customer-managed key vault. After you configure this type of key vault, each time a password rotation occurs, you ",2026-04-17T17:09:00.000Z,how-to,security,0.8,True,"Covers how Network Fabric uses Key Vault, regional replication gaps (specific to West US 3 and Brazil South), and the workaround using a customer-managed Key Vault for password rotation and regional outage protection. 
This is concrete, product- and region-specific security/key management guidance.",updated +https://learn.microsoft.com/en-us/azure/operator-nexus/howto-usage-guide-for-customer-managed-key-vault,How-to usage guide for Customer Managed Key Vault,Usage Guide for Customer-Managed Key Vault - Operator Nexus,Use customer-managed Key Vault with Operator Nexus,This article lists steps you need to take to enable a customer-managed key vault. The article includes examples to help you use a customer-managed key vault.,"Network Fabric uses secrets (passwords) to access the terminal server and network devices. These secrets are stored in Azure Key Vault in the Network Fabric Controller managed resource group. In some regions (currently West US 3 and Brazil South), key vaults don't have automatic cross‑region (paired‑region) replication. As a workaround to protect against a region outage, you can define a customer-managed key vault. After you configure this type of key vault, each time a password rotation occurs, you ",2026-04-17T17:09:00.000Z,how-to,security,0.8,True,"Covers how Network Fabric uses Key Vault, regional replication gaps (specific to West US 3 and Brazil South), and the workaround using a customer-managed Key Vault for password rotation and regional outage protection. 
This is concrete, product- and region-specific security/key management guidance.",unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/howto-use-ab-staged-commit-configuration-update-commit-workflow,How to perform A / B staged configuration update - commit workflow in Azure Operator Nexus,How to perform A / B staged configuration update - commit workflow in Azure Operator Nexus - Operator Nexus,Perform A/B staged configuration updates with Nexus commit workflow,Learn about how to perform A / B staged commit configuration update commit workflow in Azure Operator Nexus,,2025-12-09T18:14:00.000Z,how-to,configuration,0.7,True,Describes A/B staged commit pattern for configuration updates; Nexus-specific workflow semantics and steps.,unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/howto-use-azure-policy,Use Azure Policy with Operator Nexus resources,Azure Operator Nexus: How to use Azure Policy with Operator Nexus Resources - Operator Nexus,Apply Azure Policy to secure Nexus resources,Learn how to assign Azure built-in policies or create custom policies to secure your Operator Nexus resources.,"In this article, you can learn how to use Azure Policy to secure and validate the compliance status of your Nexus resources.",2026-01-07T18:05:00.000Z,how-to,security,0.75,True,"Shows specific built-in policies, assignment scopes, and custom policy patterns tailored to Operator Nexus resources.",unchanged https://learn.microsoft.com/en-us/azure/operator-nexus/howto-use-break-glass-access,How to use-break-glass-access,How to use Method D v2.0 secure break-glass access - Operator Nexus,Use Method D v2.0 break-glass access for Nexus fabric devices,Process of using Method D v2.0 break glass access,"Break glass access using Method D v2.0 is a streamlined approach for administrators to grant secure, emergency access to critical network fabric devices. 
This guide walks you through setting up and using break glass access, including generating SSH keys, granting permissions, and accessing network fabric devices. Method D v2.0 also supports assigning Nexus Network Fabric (NNF) built-in roles to Entra Groups, streamlining the management of break glass access by applying group-based role assignmen",2025-09-17T16:51:00.000Z,how-to,security,0.8,True,"Walks through SSH key generation, permission grants, and role assignments (NNF built-in roles to Entra groups); detailed security configuration patterns.",unchanged diff --git a/products/azure-operator-nexus/report.md b/products/azure-operator-nexus/report.md index e20a4349..d395ddbb 100644 --- a/products/azure-operator-nexus/report.md +++ b/products/azure-operator-nexus/report.md @@ -1,12 +1,11 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: - configuration: 'Configuring and updating Nexus clusters and network fabric: JSON - templates, isolation domains, BGP/route policies, ACLs, QoS, maintenance, monitoring, - and Kubernetes/node settings.' - security: 'Securing Nexus: identity/RBAC, ACLs, SSH/serial access, break-glass, - key/secret rotation, managed identities, Defender/Policy, TAP/ExpressRoute auth, - and secure VM/cluster access.' + configuration: 'Configuring and operating Azure Operator Nexus: cluster templates/parameters, + Kubernetes/node settings, network fabric (BGP, VRFs, QoS, ACLs, route policies), + isolation domains, maintenance, and monitoring.' + security: 'Securing Nexus: identity/RBAC, ACLs, SSH and break-glass access, key/cert/secret + rotation, managed identities, Defender/Policy, and secure VM/cluster/network management.' troubleshooting: Diagnosing and fixing Nexus bare-metal, network, storage, and Kubernetes issues, including provisioning failures, connectivity, performance, pod/VM errors, and collecting diagnostic logs. 
@@ -30,14 +29,14 @@ category_descriptions: skill_description: Expert knowledge for Azure Operator Nexus development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. - Use when configuring Nexus clusters/fabric, BGP/ACL/QoS, NPB TAP rules, near-edge - storage, or Nexus AKS ETCD ops, and other Azure Operator Nexus related development + Use when managing Nexus clusters, fabric upgrades, NPB TAP rules, near-edge storage, + or bare-metal lifecycle ops, and other Azure Operator Nexus related development tasks. Not for Azure Network Function Manager (use azure-network-function-manager), Azure Networking (use azure-networking), Azure Virtual Network Manager (use azure-virtual-network-manager), Azure Virtual WAN (use azure-virtual-wan). -use_when: Use when configuring Nexus clusters/fabric, BGP/ACL/QoS, NPB TAP rules, - near-edge storage, or Nexus AKS ETCD ops, and other Azure Operator Nexus related - development tasks. +use_when: Use when managing Nexus clusters, fabric upgrades, NPB TAP rules, near-edge + storage, or bare-metal lifecycle ops, and other Azure Operator Nexus related development + tasks. confusable_not_for: Not for Azure Network Function Manager (use azure-network-function-manager), Azure Networking (use azure-networking), Azure Virtual Network Manager (use azure-virtual-network-manager), Azure Virtual WAN (use azure-virtual-wan). 
@@ -46,16 +45,16 @@ confusable_not_for: Not for Azure Network Function Manager (use azure-network-fu ## Summary -- **Total Pages**: 212 -- **Fetched**: 212 +- **Total Pages**: 214 +- **Fetched**: 214 - **Fetch Failed**: 0 -- **Classified**: 159 +- **Classified**: 161 - **Unclassified**: 53 ### Incremental Update -- **New Pages**: 0 -- **Updated Pages**: 3 -- **Unchanged**: 209 +- **New Pages**: 2 +- **Updated Pages**: 5 +- **Unchanged**: 207 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-operator-nexus/azure-operator-nexus.csv` @@ -65,25 +64,34 @@ confusable_not_for: Not for Azure Network Function Manager (use azure-network-fu |------|-------|------------| | architecture-patterns | 1 | 0.5% | | best-practices | 2 | 0.9% | -| configuration | 59 | 27.8% | +| configuration | 60 | 28.0% | | decision-making | 4 | 1.9% | | deployment | 6 | 2.8% | | integrations | 1 | 0.5% | | limits-quotas | 9 | 4.2% | -| security | 35 | 16.5% | -| troubleshooting | 42 | 19.8% | -| *(Unclassified)* | 53 | 25.0% | +| security | 36 | 16.8% | +| troubleshooting | 42 | 19.6% | +| *(Unclassified)* | 53 | 24.8% | ## Changes +### New Pages + +- [Certificate Rotation](https://learn.microsoft.com/en-us/azure/operator-nexus/howto-certificate-rotation) +- [Terminal Server as an Azure Operator Nexus Resource](https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-terminal-server-as-resource) + ### Updated Pages -- [How to enable System Assigned Managed Identity (SAMI) for the Network Fabric Controller](https://learn.microsoft.com/en-us/azure/operator-nexus/howto-enable-system-assigned-managed-identity-for-network-fabric-controller) - - Updated: 2026-04-02T22:02:00.000Z → 2026-04-17T17:09:00.000Z -- [How to enable System Assigned Managed Identity (SAMI) for the Network Fabric resource](https://learn.microsoft.com/en-us/azure/operator-nexus/howto-enable-system-assigned-managed-identity-for-network-fabric-resource) - - Updated: 
2026-04-02T22:02:00.000Z → 2026-04-17T17:09:00.000Z -- [How-to usage guide for Customer Managed Key Vault](https://learn.microsoft.com/en-us/azure/operator-nexus/howto-usage-guide-for-customer-managed-key-vault) - - Updated: 2026-04-02T17:05:00.000Z → 2026-04-17T17:09:00.000Z +- [Hardware Validation](https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-hardware-validation-overview) + - Updated: 2026-01-27T23:09:00.000Z → 2026-04-23T17:08:00.000Z +- [Cluster Manager Template JSON Example](https://learn.microsoft.com/en-us/azure/operator-nexus/clustermanager-jsonc-example) + - Updated: 2025-09-17T16:51:00.000Z → 2026-04-22T22:07:00.000Z +- [Cluster Manager Parameters JSON Example](https://learn.microsoft.com/en-us/azure/operator-nexus/clustermanager-parameters-jsonc-example) + - Updated: 2025-09-17T16:51:00.000Z → 2026-04-22T22:07:00.000Z +- [Cluster Template JSON Example](https://learn.microsoft.com/en-us/azure/operator-nexus/cluster-jsonc-example) + - Updated: 2025-09-17T16:51:00.000Z → 2026-04-22T22:07:00.000Z +- [Cluster Parameters JSON Example](https://learn.microsoft.com/en-us/azure/operator-nexus/cluster-parameters-jsonc-example) + - Updated: 2025-09-17T16:51:00.000Z → 2026-04-22T22:07:00.000Z ## Classified Pages @@ -128,10 +136,10 @@ confusable_not_for: Not for Azure Network Function Manager (use azure-network-fu | [How to set up break glass access](https://learn.microsoft.com/en-us/azure/operator-nexus/howto-set-up-break-glass-access) | security | 0.82 | Covers IAM policies, identity model changes from v1.5 to v2.0, and specific configuration for emergency access; includes product-specific roles and security flows. | | [BareMetal Run-Read Execution](https://learn.microsoft.com/en-us/azure/operator-nexus/howto-baremetal-run-read) | troubleshooting | 0.80 | Provides curated read-only diagnostic commands via run-read; symptom-oriented troubleshooting using Nexus-specific CLI. 
| | [Best Practices for Bare Metal Machine Operations](https://learn.microsoft.com/en-us/azure/operator-nexus/howto-bare-metal-best-practices) | best-practices | 0.80 | Explicit best-practices article with concrete prerequisites and pitfalls for BMM replace/reimage; product-specific operational guidance. | -| [Cluster Manager Parameters JSON Example](https://learn.microsoft.com/en-us/azure/operator-nexus/clustermanager-parameters-jsonc-example) | configuration | 0.80 | Parameter file example (gated) exposes specific configuration keys, value formats, and constraints for Cluster Manager deployments. | -| [Cluster Manager Template JSON Example](https://learn.microsoft.com/en-us/azure/operator-nexus/clustermanager-jsonc-example) | configuration | 0.80 | Template example page (gated) will contain concrete JSONC parameters, allowed values, and defaults for Cluster Manager configuration. | -| [Cluster Parameters JSON Example](https://learn.microsoft.com/en-us/azure/operator-nexus/cluster-parameters-jsonc-example) | configuration | 0.80 | Parameter file example (gated) for eight-rack cluster exposes detailed configuration keys and constraints specific to Operator Nexus. | -| [Cluster Template JSON Example](https://learn.microsoft.com/en-us/azure/operator-nexus/cluster-jsonc-example) | configuration | 0.80 | Cluster template example (gated) will list concrete configuration parameters, types, and allowed values for cluster resources. | +| [Cluster Manager Parameters JSON Example](https://learn.microsoft.com/en-us/azure/operator-nexus/clustermanager-parameters-jsonc-example) | configuration | 0.80 | Example parameters file defines specific configuration keys, expected value formats, and possibly defaults for creating a Cluster Manager via ARM, which matches product-specific configuration guidance. 
| +| [Cluster Manager Template JSON Example](https://learn.microsoft.com/en-us/azure/operator-nexus/clustermanager-jsonc-example) | configuration | 0.80 | Example template file for clusterManager.jsonc will contain concrete schema, parameter names, and configuration patterns specific to Azure Operator Nexus ARM deployments, which are product-specific configuration details. | +| [Cluster Parameters JSON Example](https://learn.microsoft.com/en-us/azure/operator-nexus/cluster-parameters-jsonc-example) | configuration | 0.80 | Eight-rack cluster parameter example will show detailed parameter names, structures, and allowed values for ARM-based cluster deployment, which is expert configuration knowledge. | +| [Cluster Template JSON Example](https://learn.microsoft.com/en-us/azure/operator-nexus/cluster-jsonc-example) | configuration | 0.80 | Example cluster.jsonc template will enumerate configuration sections, property names, and required values unique to Azure Operator Nexus cluster creation, fitting the configuration sub-skill. | | [Cluster metrics configuration management](https://learn.microsoft.com/en-us/azure/operator-nexus/howto-cluster-metrics-configuration-management) | configuration | 0.80 | Defines MetricsConfiguration resource, including fields and default behavior for standard vs optional metrics collection in Nexus clusters. | | [Commit Workflow v2](https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-commit-workflow-v2) | configuration | 0.80 | Details lock/preview/validate/commit semantics and improved error handling; product-specific configuration workflow with operational states. | | [Configure Network Access Control Lists for SSH Access on Management VPN](https://learn.microsoft.com/en-us/azure/operator-nexus/howto-configure-acls-for-ssh-management-on-access-vpn) | security | 0.80 | Details ACL resource creation, ingress/egress rules, and NNI payload references to control SSH access on management VPN. 
| @@ -181,6 +189,7 @@ confusable_not_for: Not for Azure Network Function Manager (use azure-network-fu | [Azure Operator Nexus Network Fabric Configuration Monitoring](https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-network-fabric-configuration-monitoring) | configuration | 0.70 | Describes configuration monitoring and reporting of differences across devices; product-specific monitoring behavior and possibly states/fields. | | [Bare Metal Machine Platform Commands](https://learn.microsoft.com/en-us/azure/operator-nexus/howto-baremetal-functions) | troubleshooting | 0.70 | Covers lifecycle management operations on Bare Metal Machines for recovery and maintenance, with disruptive vs non-disruptive actions. This is framed as troubleshooting/recovery guidance and likely includes specific commands and when to use them, mapping actions to failure/maintenance scenarios. | | [BareMetal Machine roles](https://learn.microsoft.com/en-us/azure/operator-nexus/reference-near-edge-baremetal-machine-roles) | configuration | 0.70 | A reference for 'machineRoles' on BareMetal Machines will define specific role names, allowed combinations, and their semantics, which are product-specific configuration parameters. | +| [Certificate Rotation](https://learn.microsoft.com/en-us/azure/operator-nexus/howto-certificate-rotation) | security | 0.70 | Certificate rotation and resynchronization for Nexus Network Fabric is a security-focused lifecycle operation; the page likely includes specific API operations, parameters, and process steps unique to this product’s certificate management. | | [Check runtime version](https://learn.microsoft.com/en-us/azure/operator-nexus/howto-check-runtime-version) | configuration | 0.70 | Provides specific commands and resource paths to retrieve runtime versions of key Nexus components. 
| | [Collect debug logs for support ticket](https://learn.microsoft.com/en-us/azure/operator-nexus/howto-kubernetes-cluster-log-collector-script) | troubleshooting | 0.70 | Describes a product-specific log collector script for Azure Operator Nexus Kubernetes nodes, including when and where to run it to support issue diagnosis. This is concrete troubleshooting guidance tied to a specific diagnostic tool rather than generic debugging advice. | | [Configure service load balancer](https://learn.microsoft.com/en-us/azure/operator-nexus/howto-kubernetes-service-load-balancer) | configuration | 0.70 | Provides Nexus-specific load balancer configuration fields, annotations, and examples that are not generic Kubernetes knowledge. | @@ -229,6 +238,7 @@ confusable_not_for: Not for Azure Network Function Manager (use azure-network-fu | [Network Fabric Upgrades](https://learn.microsoft.com/en-us/azure/operator-nexus/howto-upgrade-nexus-fabric) | deployment | 0.68 | Upgrade procedure content for a specialized network fabric platform is typically highly product-specific, including required and recommended pre-upgrade validations, failure conditions, and ordered steps unique to Azure Operator Nexus. These details are unlikely to be fully captured in generic training data and map best to deployment, as they govern how and when a production fabric runtime can be upgraded successfully. | | [Nexus Instance Deployment Template](https://learn.microsoft.com/en-us/azure/operator-nexus/howto-nexus-instance-deployment-template) | configuration | 0.68 | The page describes a parameterized deployment template for an Azure Operator Nexus instance. Such templates typically enumerate product-specific parameters (names, allowed values, defaults) required to deploy the service. This constitutes configuration knowledge (template parameters and their usage) that is specific to Azure Operator Nexus and not generally known from training. 
It is not focused on limits, troubleshooting, or architecture, but on how to configure and deploy an instance using a structured template. | | [Route Policy operations](https://learn.microsoft.com/en-us/azure/operator-nexus/reference-nexus-route-policy-operations) | configuration | 0.68 | Covers operational procedures to create, modify, and delete route policies, likely including specific API operations, parameters, and constraints unique to this product. | +| [Terminal Server as an Azure Operator Nexus Resource](https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-terminal-server-as-resource) | configuration | 0.68 | Describes how terminal servers and their physical interfaces (Net1, Net2, Net3) are modeled as specific Azure Resource Manager resource types within Azure Operator Nexus, including parent/child resource relationships. This is product-specific configuration/representation detail that an LLM is unlikely to know from training and maps best to configuration rather than general concepts. | | [Validate cables for Nexus Network Fabric](https://learn.microsoft.com/en-us/azure/operator-nexus/how-to-validate-cables) | troubleshooting | 0.68 | Uses diagnostic APIs to classify devices as compliant/noncompliant against BOM and SKUs; likely includes specific commands and result fields unique to Nexus, fitting troubleshooting (symptom → validation). | | [How to replace terminal server](https://learn.microsoft.com/en-us/azure/operator-nexus/howto-replace-terminal-server) | deployment | 0.66 | Covers RMA-like replacement workflow for terminal servers with specific cleanup, removal, and reconfiguration steps unique to Nexus fabric deployment. | | [Access and Identity](https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-security-access-identity) | security | 0.65 | Focuses on Azure RBAC scopes and Key Vault access for Operator Nexus; likely includes specific role names and scope usage patterns unique to this product. 
| @@ -294,7 +304,7 @@ confusable_not_for: Not for Azure Network Function Manager (use azure-network-fu | [Cluster Manager](https://learn.microsoft.com/en-us/azure/operator-nexus/howto-cluster-manager) | 0.30 | Page appears to be a how-to for creating, viewing, listing, updating, and deleting Cluster Manager resources in Operator Nexus. It is likely a procedural/CRUD guide without detailed configuration parameter tables, limits, error-code-based troubleshooting, or decision matrices. No clear evidence from the summary that it contains product-specific limits, quotas, security role definitions, or other expert-only configuration details. | | [Compute overview](https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-compute) | 0.30 | Compute overview; lacks specific configuration parameters, limits, or troubleshooting content. | | [FAQ](https://learn.microsoft.com/en-us/azure/operator-nexus/azure-operator-nexus-faq) | 0.30 | FAQ content is typically high-level Q&A without detailed numeric limits, configuration tables, or structured troubleshooting mappings. | -| [Hardware Validation](https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-hardware-validation-overview) | 0.30 | Hardware validation overview; summary does not indicate detailed test matrices, parameters, or limits. | +| [Hardware Validation](https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-hardware-validation-overview) | 0.30 | High-level overview of hardware validation and BMC/iDRAC context without specific limits, configuration parameters, error codes, or decision matrices. | | [List of metrics collected](https://learn.microsoft.com/en-us/azure/operator-nexus/list-of-metrics-collected) | 0.30 | A list of metrics names is largely descriptive/telemetry catalog; summary does not indicate configuration parameters, limits, or error mappings. Likely just metric names and dimensions without deep product-specific behavior or thresholds. 
| | [PKI implementation](https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-pki-implementation) | 0.30 | Describes PKI architecture conceptually; no specific certificate parameters, config tables, or role mappings. | | [Resource Types](https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-resource-types) | 0.30 | Conceptual overview of resource types; no detailed configuration tables or limits. | diff --git a/products/azure-oracle/azure-oracle.csv b/products/azure-oracle/azure-oracle.csv index ceae1b75..69ea03d0 100644 --- a/products/azure-oracle/azure-oracle.csv +++ b/products/azure-oracle/azure-oracle.csv @@ -1,19 +1,19 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type -https://learn.microsoft.com/en-us/azure/oracle/oracle-azure-overview,Overview,Oracle on Azure,,Oracle on Azure offerings,"Customers running Oracle workloads on-premises can accelerate their cloud adoption by migrating their workloads to Azure. You have a choice to either lift and shift your Oracle workloads to Azure virtual machines or to Oracle AI Database@Azure. Additionally, you can also migrate to our Azure Database services like Azure SQL, Azure Database for PostgreSQL based on the requirements of your organization.",2026-04-15T22:11:00.000Z,overview,,0.1,False,"High-level overview of Oracle options on Azure; no concrete limits, configs, error codes, or decision matrices.",updated -https://learn.microsoft.com/en-us/azure/oracle/oracle-db/database-overview,Overview,Overview - Oracle AI Database@Azure,,Overview - Oracle AI Database@Azure.,"Oracle AI Database@Azure is an Oracle AI Database service running on Oracle Cloud Infrastructure (OCI), colocated in Microsoft data centers. This ensures that the Oracle AI Database@Azure service has the fastest possible access to Azure resources and applications. 
Oracle AI Database@Azure allows you to subscribe to the Oracle AI Database Service inside your Azure environment. All infrastructure for your Oracle AI Database Service is located in Azure's physical data centers, giving your critical ",2026-04-15T22:11:00.000Z,overview,,0.1,False,"Service overview describing what Oracle AI Database@Azure is; no indication of numeric limits, configs, or decision criteria.",updated -https://learn.microsoft.com/en-us/azure/oracle/oracle-db/faq-oracle-database-azure,FAQs,FAQ for Oracle AI Database@Azure,,Learn answers to frequently asked questions about Oracle AI Database@Azure.,"In this article, get answers to frequently asked questions about Oracle AI Database@Azure.",2026-04-15T22:11:00.000Z,concept-article,,0.3,False,"FAQ format; summary does not indicate presence of specific error codes, limits tables, or configuration parameter references beyond general Q&A.",updated +https://learn.microsoft.com/en-us/azure/oracle/oracle-azure-overview,Overview,Oracle on Azure,,Oracle on Azure offerings,"Customers running Oracle workloads on-premises can accelerate their cloud adoption by migrating their workloads to Azure. You have a choice to either lift and shift your Oracle workloads to Azure virtual machines or to Oracle AI Database@Azure. Additionally, you can also migrate to our Azure Database services like Azure SQL, Azure Database for PostgreSQL based on the requirements of your organization.",2026-04-15T22:11:00.000Z,overview,,0.1,False,"High-level overview of Oracle options on Azure; no concrete limits, configs, error codes, or decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/oracle/oracle-db/database-overview,Overview,Overview - Oracle AI Database@Azure,,Overview - Oracle AI Database@Azure.,"Oracle AI Database@Azure is an Oracle AI Database service running on Oracle Cloud Infrastructure (OCI), colocated in Microsoft data centers. 
This ensures that the Oracle AI Database@Azure service has the fastest possible access to Azure resources and applications. Oracle AI Database@Azure allows you to subscribe to the Oracle AI Database Service inside your Azure environment. All infrastructure for your Oracle AI Database Service is located in Azure's physical data centers, giving your critical ",2026-04-15T22:11:00.000Z,overview,,0.1,False,"Service overview describing what Oracle AI Database@Azure is; no indication of numeric limits, configs, or decision criteria.",unchanged +https://learn.microsoft.com/en-us/azure/oracle/oracle-db/faq-oracle-database-azure,FAQs,FAQ for Oracle AI Database@Azure,,Learn answers to frequently asked questions about Oracle AI Database@Azure.,"In this article, get answers to frequently asked questions about Oracle AI Database@Azure.",2026-04-15T22:11:00.000Z,concept-article,,0.3,False,"FAQ format; summary does not indicate presence of specific error codes, limits tables, or configuration parameter references beyond general Q&A.",unchanged https://learn.microsoft.com/en-us/azure/oracle/oracle-db/manage-oracle-transparent-data-encryption-azure-key-vault,Manage Oracle TDE with Azure Key Vault,Integrate Oracle Exadata Database@Azure with Azure Key Vault,Configure Oracle TDE keys with Azure Key Vault,Follow a comprehensive step-by-step guide for integrating Oracle Exadata Database@Azure with Azure Key Vault.,"In this article, you learn how to store and manage Oracle Transparent Data Encryption (TDE) master encryption keys (MEKs) for Oracle Exadata Database@Azure. You can use all three tiers of the Azure Key Vault service: This integration enables Oracle Database@Azure customers to meet a wide spectrum of security, compliance, and key management needs. 
These needs range from software-based key storage to single-tenant, FIPS 140-3 Level 3 validated hardware security modules (HSMs).",2025-07-24T17:11:00.000Z,how-to,security,0.85,True,"Step-by-step integration of Oracle TDE with Azure Key Vault will include specific Key Vault tiers, access policies, and security configuration parameters unique to this integration.",unchanged -https://learn.microsoft.com/en-us/azure/oracle/oracle-db/onboard-oracle-database,Onboard Oracle Database@Azure,Onboard Oracle AI Database@Azure,,Learn about purchase and configuration steps to onboard Oracle AI Database@Azure.,"In this article, learn about purchase and configuration (onboarding) steps for Oracle AI Database@Azure. You complete most onboarding tasks only once, when you create your Oracle AI Database@Azure deployment. After you complete the onboarding tasks, you can begin provisioning and using Oracle AI Database resources in your Azure environment.",2026-04-15T22:11:00.000Z,concept-article,,0.4,False,"Onboarding article focused on purchase and initial setup; summary does not indicate detailed configuration tables, limits, or troubleshooting content.",updated -https://learn.microsoft.com/en-us/azure/oracle/oracle-db/oracle-database-get-started,Get started,Get started with Oracle AI Database@Azure,,Learn how to get started with using Oracle AI Database@Azure.,"In this article, learn how to purchase and start using Oracle AI Database@Azure. Oracle AI Database@Azure is an Oracle AI Database service that runs on Oracle Cloud Infrastructure (OCI) and is colocated in Azure datacenters at Microsoft. Colocation ensures that the Oracle AI Database@Azure service has the fastest possible access to Azure resources and applications. Oracle AI Database@Azure runs on infrastructure that's managed by the expert Cloud Infrastructure operations team at Oracle. 
The ope",2026-04-15T22:11:00.000Z,get-started,,0.3,False,"Getting started/purchase/onboarding overview; likely procedural steps but not detailed configuration tables, limits, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/oracle/oracle-db/onboard-oracle-database,Onboard Oracle Database@Azure,Onboard Oracle AI Database@Azure,,Learn about purchase and configuration steps to onboard Oracle AI Database@Azure.,"In this article, learn about purchase and configuration (onboarding) steps for Oracle AI Database@Azure. You complete most onboarding tasks only once, when you create your Oracle AI Database@Azure deployment. After you complete the onboarding tasks, you can begin provisioning and using Oracle AI Database resources in your Azure environment.",2026-04-15T22:11:00.000Z,concept-article,,0.4,False,"Onboarding article focused on purchase and initial setup; summary does not indicate detailed configuration tables, limits, or troubleshooting content.",unchanged +https://learn.microsoft.com/en-us/azure/oracle/oracle-db/oracle-database-get-started,Get started,Get started with Oracle AI Database@Azure,,Learn how to get started with using Oracle AI Database@Azure.,"In this article, learn how to purchase and start using Oracle AI Database@Azure. Oracle AI Database@Azure is an Oracle AI Database service that runs on Oracle Cloud Infrastructure (OCI) and is colocated in Azure datacenters at Microsoft. Colocation ensures that the Oracle AI Database@Azure service has the fastest possible access to Azure resources and applications. Oracle AI Database@Azure runs on infrastructure that's managed by the expert Cloud Infrastructure operations team at Oracle. 
The ope",2026-04-15T22:11:00.000Z,get-started,,0.3,False,"Getting started/purchase/onboarding overview; likely procedural steps but not detailed configuration tables, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/oracle/oracle-db/oracle-database-known-issues,Known issues,Known issues in Oracle Database@Azure,Resolve common Oracle Database@Azure known issues,Learn about known issues in Oracle Database@Azure.,Learn about known issues in Oracle Database@Azure and how to resolve them.,2024-10-28T22:05:00.000Z,troubleshooting,troubleshooting,0.8,True,"Known-issues article will map specific symptoms and issues to causes and resolutions, often with product-specific behaviors and workarounds.",unchanged https://learn.microsoft.com/en-us/azure/oracle/oracle-db/oracle-database-network-plan,Network planning,Network planning for Oracle AI Database@Azure,Plan and configure networking for Oracle AI Database@Azure,Learn about network planning for Oracle AI Database@Azure.,"In this article, learn about network topologies and constraints in Oracle AI Database@Azure. -After you purchase an offer through Azure Marketplace and provision the Oracle Exadata infrastructure, the next step is to create your virtual machine cluster to host your instance of Oracle Exadata Database@Azure. 
The Oracle AI Database clusters are connected to your Azure virtual network via a virtual network interface card (virtual NIC) from your delegated subnet (delegated toOracle.Database/networkAt",2026-04-15T22:11:00.000Z,concept-article,configuration,0.65,True,"Describes network topologies and constraints, including delegated subnet name (Oracle.Database/networkAt...), virtual NIC usage, and likely other product-specific network configuration parameters and constraints.",updated -https://learn.microsoft.com/en-us/azure/oracle/oracle-db/oracle-database-regions,Region availability,Region availability for Oracle AI Database@Azure,,Learn about region availability for Oracle AI Database@Azure.,Learn what Azure regions and corresponding Oracle Cloud Infrastructure (OCI) regions support Oracle AI Database@Azure in standard business regions across the globe. The list below mentions the Azure and corresponding OCI regions with the regional availability for Oracle AI Database@Azure:,2026-04-15T22:11:00.000Z,concept-article,,0.4,False,"Region availability list is essentially a location matrix, not a limits/quotas, configuration, or decision-making guide with thresholds.",updated -https://learn.microsoft.com/en-us/azure/oracle/oracle-db/oracle-database-support,Support,Support for Oracle AI Database@Azure,,Learn about support for Oracle AI Database@Azure.,"This article outlines the support scope, contact details, and procedures for both Oracle and Microsoft Azure, helping you quickly address any issues.",2026-04-15T22:11:00.000Z,overview,,0.2,False,"Support scope and procedures; no technical configuration parameters, limits, or troubleshooting mappings with error codes.",updated +After you purchase an offer through Azure Marketplace and provision the Oracle Exadata infrastructure, the next step is to create your virtual machine cluster to host your instance of Oracle Exadata Database@Azure. 
The Oracle AI Database clusters are connected to your Azure virtual network via a virtual network interface card (virtual NIC) from your delegated subnet (delegated to Oracle.Database/networkAt",2026-04-15T22:11:00.000Z,concept-article,configuration,0.65,True,"Describes network topologies and constraints, including delegated subnet name (Oracle.Database/networkAt...), virtual NIC usage, and likely other product-specific network configuration parameters and constraints.",unchanged
+https://learn.microsoft.com/en-us/azure/oracle/oracle-db/oracle-database-regions,Region availability,Region availability for Oracle AI Database@Azure,,Learn about region availability for Oracle AI Database@Azure.,Learn what Azure regions and corresponding Oracle Cloud Infrastructure (OCI) regions support Oracle AI Database@Azure in standard business regions across the globe. The list below mentions the Azure and corresponding OCI regions with the regional availability for Oracle AI Database@Azure:,2026-04-15T22:11:00.000Z,concept-article,,0.4,False,"Region availability list is essentially a location matrix, not a limits/quotas, configuration, or decision-making guide with thresholds.",unchanged
+https://learn.microsoft.com/en-us/azure/oracle/oracle-db/oracle-database-support,Support,Support for Oracle AI Database@Azure,,Learn about support for Oracle AI Database@Azure.,"This article outlines the support scope, contact details, and procedures for both Oracle and Microsoft Azure, helping you quickly address any issues.",2026-04-15T22:11:00.000Z,overview,,0.2,False,"Support scope and procedures; no technical configuration parameters, limits, or troubleshooting mappings with error codes.",unchanged
https://learn.microsoft.com/en-us/azure/oracle/oracle-db/oracle-exadata-database-dedicated-infrastructure-logs,Oracle Exadata Database on Dedicated Infrastructure Logs on Azure for Enhanced Observability,Oracle Exadata Database on Dedicated Infrastructure Logs on Azure for Enhanced 
Observability,Integrate Oracle Exadata logs with Azure Monitor and Sentinel,Learn how to integrate Oracle Exadata Database logs with Azure Monitor and Microsoft Sentinel.,"In this article, you learn how to integrate Oracle Exadata Database Service@ Azure logs with Azure Monitor and Microsoft Sentinel by sending them to a Log Analytics workspace. Azure Monitor Logs is a centralized software as a service (SaaS) platform for collecting, analyzing, and acting on telemetry data generated by Azure and non-Azure resources like Oracle Exadata Database Service@Azure. You can collect logs, manage log data and costs, and consume different types of data in oneLog -Analytics wo",2026-04-15T22:11:00.000Z,how-to,integrations,0.7,True,"Explains how to send Oracle Exadata Database Service@Azure logs to Log Analytics, Azure Monitor, and Sentinel; likely includes workspace configuration, data types, and product-specific logging parameters.",updated +Analytics wo",2026-04-15T22:11:00.000Z,how-to,integrations,0.7,True,"Explains how to send Oracle Exadata Database Service@Azure logs to Log Analytics, Azure Monitor, and Sentinel; likely includes workspace configuration, data types, and product-specific logging parameters.",unchanged diff --git a/products/azure-oracle/report.md b/products/azure-oracle/report.md index 6681a9bf..ececc7d8 100644 --- a/products/azure-oracle/report.md +++ b/products/azure-oracle/report.md @@ -38,8 +38,8 @@ confusable_not_for: Not for Azure SQL Database (use azure-sql-database), Azure S ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 9 -- **Unchanged**: 2 +- **Updated Pages**: 0 +- **Unchanged**: 11 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-oracle/azure-oracle.csv` @@ -55,27 +55,6 @@ confusable_not_for: Not for Azure SQL Database (use azure-sql-database), Azure S ## Changes -### Updated Pages - -- [Overview](https://learn.microsoft.com/en-us/azure/oracle/oracle-azure-overview) - - Updated: 
2023-12-13T12:18:00.000Z → 2026-04-15T22:11:00.000Z -- [Get started](https://learn.microsoft.com/en-us/azure/oracle/oracle-db/oracle-database-get-started) - - Updated: 2024-10-28T22:05:00.000Z → 2026-04-15T22:11:00.000Z -- [Overview](https://learn.microsoft.com/en-us/azure/oracle/oracle-db/database-overview) - - Updated: 2025-04-18T11:09:00.000Z → 2026-04-15T22:11:00.000Z -- [Region availability](https://learn.microsoft.com/en-us/azure/oracle/oracle-db/oracle-database-regions) - - Updated: 2026-03-30T22:11:00.000Z → 2026-04-15T22:11:00.000Z -- [Network planning](https://learn.microsoft.com/en-us/azure/oracle/oracle-db/oracle-database-network-plan) - - Updated: 2025-09-16T17:23:00.000Z → 2026-04-15T22:11:00.000Z -- [Support](https://learn.microsoft.com/en-us/azure/oracle/oracle-db/oracle-database-support) - - Updated: 2025-05-26T05:37:00.000Z → 2026-04-15T22:11:00.000Z -- [Oracle Exadata Database on Dedicated Infrastructure Logs on Azure for Enhanced Observability](https://learn.microsoft.com/en-us/azure/oracle/oracle-db/oracle-exadata-database-dedicated-infrastructure-logs) - - Updated: 2025-08-14T22:10:00.000Z → 2026-04-15T22:11:00.000Z -- [Onboard Oracle Database@Azure](https://learn.microsoft.com/en-us/azure/oracle/oracle-db/onboard-oracle-database) - - Updated: 2024-10-28T22:05:00.000Z → 2026-04-15T22:11:00.000Z -- [FAQs](https://learn.microsoft.com/en-us/azure/oracle/oracle-db/faq-oracle-database-azure) - - Updated: 2024-10-28T22:05:00.000Z → 2026-04-15T22:11:00.000Z - ## Classified Pages | TOC Title | Type | Confidence | Reason | diff --git a/products/azure-partner-solutions/azure-partner-solutions.csv b/products/azure-partner-solutions/azure-partner-solutions.csv index 7c5253e5..026c7319 100644 --- a/products/azure-partner-solutions/azure-partner-solutions.csv +++ b/products/azure-partner-solutions/azure-partner-solutions.csv @@ -87,7 +87,7 @@ https://learn.microsoft.com/en-us/azure/partner-solutions/neon/overview,What is 
https://learn.microsoft.com/en-us/azure/partner-solutions/neon/tools,Resources and developer tools,Neon Serverless Postgres Preview Developer Resources and Tools - Azure Native Integrations,,Learn about resources and developer tools available with Neon Serverless Postgres Preview.,"In this article, you learn about resources and tools from Neon that can help you use Neon Serverless Postgres Preview.",2026-02-11T08:00:00.000Z,overview,,0.35,False,Developer resources/tools article for Neon likely points to external tools; summary doesn’t show Azure-specific configuration or troubleshooting details.,unchanged https://learn.microsoft.com/en-us/azure/partner-solutions/new-relic/,New Relic,Azure Native New Relic Service documentation - Azure Native Integrations,,Azure Native New Relic Service offers a monitoring solution for your Azure workloads by using a streamlined workflow.,Azure Native New Relic Service provides a native integrated experience for monitoring where you can create and manage New Relic accounts.,2026-02-13T06:11:00Z,landing-page,,0.1,False,New Relic Azure Native landing page; appears to be high-level description and entry point.,unchanged https://learn.microsoft.com/en-us/azure/partner-solutions/new-relic/create,Get started,Quickstart: Get started with Azure Native New Relic Service - Azure Native Integrations,,Learn how to create an Azure Native New Relic Service in the Azure portal.,"In this quickstart, you create an instance of Azure Native New Relic Service.",2025-12-18T23:12:00.000Z,quickstart,,0.3,False,Quickstart for creating a New Relic resource in the portal; typically step-by-step UI instructions without detailed configuration matrices or expert-only parameters.,unchanged -https://learn.microsoft.com/en-us/azure/partner-solutions/new-relic/faq,FAQ,Azure Native New Relic Service frequently asked questions - Azure Native Integrations,,"Answers to common questions about using Azure Native New Relic Service including getting started, management, 
and billing.","Find answers to common questions about Azure Native New Relic Service, including setup, management, and billing.",2026-02-13T06:11:00Z,faq,,0.4,False,"FAQ about setup, management, and billing is likely conceptual and procedural; summary does not indicate specific error codes, config tables, or numeric limits.",unchanged +https://learn.microsoft.com/en-us/azure/partner-solutions/new-relic/faq,FAQ,Azure Native New Relic Service frequently asked questions - Azure Native Integrations,,"Answers to common questions about using Azure Native New Relic Service including getting started, management, and billing.","Find answers to common questions about Azure Native New Relic Service, including setup, management, and billing.",2026-04-19T17:12:00Z,faq,,0.0,False,"FAQ content about setup, management, and billing for Azure Native New Relic is likely high-level and explanatory without detailed limits, configuration tables, error-code mappings, or other product-specific expert data as defined by the sub-skill types.",updated https://learn.microsoft.com/en-us/azure/partner-solutions/new-relic/manage,Manage,Manage Azure Native New Relic Service - Azure Native Integrations,Manage configuration for Azure Native New Relic Service,Learn how to manage your Azure Native New Relic Service settings.,This article describes how to manage the settings for Azure Native New Relic Service.,2026-01-14T18:16:00.000Z,how-to,configuration,0.6,True,"Managing service settings usually involves product-specific configuration options (for example, plan settings, data collection toggles, integration flags) that go beyond generic portal usage and are unique to this integration.",unchanged https://learn.microsoft.com/en-us/azure/partner-solutions/new-relic/overview,What is Azure Native New Relic Service?,Azure Native New Relic Service overview - Azure Native Integrations,,Learn about using New Relic in Azure Marketplace.,"Use Azure Native Integrations to easily provision, manage, and tightly 
integrate software and services from software development companies on Azure. Microsoft and New Relic developed this service and manage it together. You can find New Relic in the Azure portal or get it on Azure Marketplace. New Relic is a full-stack observability platform that enables a single source of truth for application performance, infrastructure monitoring, log management, error tracking, and real-user monitoring. Combine",2025-12-10T23:18:00.000Z,overview,,0.1,False,"High-level overview of Azure Native New Relic Service; description and positioning without detailed configuration, limits, or decision matrices.",unchanged
https://learn.microsoft.com/en-us/azure/partner-solutions/new-relic/troubleshoot,Fix common errors,Troubleshoot Azure Native New Relic Service - Azure Native Integrations,Troubleshoot Azure Native New Relic Service issues,Learn about troubleshooting Azure Native New Relic Service.,This article describes how to fix common problems when you're working with Azure Native New Relic Service resources.,2026-02-12T08:00:00.000Z,conceptual,troubleshooting,0.85,True,"Explicit troubleshooting article; likely organized by specific errors or symptoms with causes and resolutions unique to Azure Native New Relic (for example, provisioning failures, linkage issues, or billing sync problems).",unchanged
diff --git a/products/azure-partner-solutions/report.md b/products/azure-partner-solutions/report.md
index 1826d44f..2cb3f9d3 100644
--- a/products/azure-partner-solutions/report.md
+++ b/products/azure-partner-solutions/report.md
@@ -1,5 +1,5 @@
---
-generated_at: '2026-04-05'
+generated_at: '2026-04-26'
category_descriptions:
integrations: Patterns and setup guides for connecting Azure services to external data platforms (Confluent Cloud, MongoDB Atlas, Neon Postgres) using Service Connector
@@ -45,8 +45,8 @@ confusable_not_for: Not for Azure Industry (use azure-industry), Azure Managed A
### Incremental Update

- **New Pages**: 0
-- **Updated Pages**: 0
- 
**Unchanged**: 112 +- **Updated Pages**: 1 +- **Unchanged**: 111 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-partner-solutions/azure-partner-solutions.csv` @@ -64,6 +64,11 @@ confusable_not_for: Not for Azure Industry (use azure-industry), Azure Managed A ## Changes +### Updated Pages + +- [FAQ](https://learn.microsoft.com/en-us/azure/partner-solutions/new-relic/faq) + - Updated: 2026-02-13T06:11:00Z → 2026-04-19T17:12:00Z + ## Classified Pages | TOC Title | Type | Confidence | Reason | @@ -113,7 +118,6 @@ confusable_not_for: Not for Azure Industry (use azure-industry), Azure Managed A | [Create a connector to Azure Cosmos DB](https://learn.microsoft.com/en-us/azure/partner-solutions/apache-kafka-confluent-cloud/add-cosmos-db-connector) | 0.40 | Tutorial for creating a Cosmos DB connector; similar to index 26, focused on walkthrough rather than config matrices. | | [Create a connector to Blob Storage](https://learn.microsoft.com/en-us/azure/partner-solutions/apache-kafka-confluent-cloud/add-confluent-connectors) | 0.40 | Tutorial for creating a Blob Storage connector; likely shows steps and some fields but not a full configuration reference with defaults/ranges. | | [FAQ](https://learn.microsoft.com/en-us/azure/partner-solutions/apache-kafka-confluent-cloud/faq) | 0.40 | FAQ for Confluent Cloud on Azure; while FAQs can contain expert details, the provided summary is generic and does not confirm presence of specific error codes, limits, or configuration tables required by any sub-skill type. | -| [FAQ](https://learn.microsoft.com/en-us/azure/partner-solutions/new-relic/faq) | 0.40 | FAQ about setup, management, and billing is likely conceptual and procedural; summary does not indicate specific error codes, config tables, or numeric limits. 
| | [FAQ](https://learn.microsoft.com/en-us/azure/partner-solutions/pure-storage/faq) | 0.40 | FAQ about resources and developer tools; likely conceptual and procedural answers, with no clear indication of error-code mappings, numeric limits, or config tables in the summary. | | [FAQ](https://learn.microsoft.com/en-us/azure/partner-solutions/qumulo/faq) | 0.40 | FAQ for Azure Native Qumulo; likely general Q&A about usage and concepts, with no explicit indication of error-code mappings or configuration tables in the summary. | | [Manage a resource](https://learn.microsoft.com/en-us/azure/partner-solutions/apache-kafka-confluent-cloud/manage) | 0.40 | Managing Confluent Cloud resource settings; summary does not indicate detailed parameter tables or limits. | @@ -185,3 +189,4 @@ confusable_not_for: Not for Azure Industry (use azure-industry), Azure Managed A | [What is Elastic?](https://learn.microsoft.com/en-us/azure/partner-solutions/elastic/overview) | 0.10 | Elastic integrations overview is conceptual/marketing; no evidence of detailed limits, configuration parameters, or troubleshooting mappings. | | [What is LambdaTest - HyperExecute?](https://learn.microsoft.com/en-us/azure/partner-solutions/lambda-test/overview) | 0.10 | LambdaTest - HyperExecute overview is marketing/feature description; no explicit expert-level configuration, limits, or troubleshooting content indicated. | | [What is NGINXaaS – An Azure Native ISV Service?](https://learn.microsoft.com/en-us/azure/partner-solutions/nginx/overview) | 0.10 | Overview of NGINXaaS integration; primarily descriptive and marketing-style capabilities summary without detailed configuration or limits. 
| +| [FAQ](https://learn.microsoft.com/en-us/azure/partner-solutions/new-relic/faq) | - | FAQ content about setup, management, and billing for Azure Native New Relic is likely high-level and explanatory without detailed limits, configuration tables, error-code mappings, or other product-specific expert data as defined by the sub-skill types. | diff --git a/products/azure-pipelines/azure-pipelines.csv b/products/azure-pipelines/azure-pipelines.csv index c18fea37..a6f2ed52 100644 --- a/products/azure-pipelines/azure-pipelines.csv +++ b/products/azure-pipelines/azure-pipelines.csv @@ -1,5 +1,5 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type -https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/agent-authentication-options?view=azure-devops,Self-hosted agent authentication options,Self-hosted agent authentication options - Azure Pipelines,Choose authentication options for self-hosted agents,Learn about authentication options for registering a self-hosted agent,Azure Pipelines provides a choice of several authentication options you can use when you are registering an agent. These methods of authentication are used only during agent registration. 
See Agents communication for details of how agents communicate after registration.,2025-10-27T22:02:00.000Z,concept-article,security,0.8,True,"Details supported auth mechanisms and their scopes/usage during agent registration, which are product-specific security configurations.",unchanged
+https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/agent-authentication-options?view=azure-devops,Self-hosted agent authentication options,Self-hosted agent authentication options - Azure Pipelines,Choose and configure auth methods for self-hosted agents,Learn about authentication options for registering a self-hosted agent,Azure Pipelines provides a choice of several authentication options you can use when you are registering an agent. These methods of authentication are used only during agent registration. See Agents communication for details of how agents communicate after registration.,2026-04-22T08:00:00.000Z,concept-article,security,0.72,True,"Describes concrete authentication options for registering Azure Pipelines self-hosted agents, including specific token types, scopes, and when each is appropriate. This is product-specific security configuration (auth choices and required permissions), not just conceptual guidance.",updated
https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/agents?view=azure-devops,About agents & agent pools,Azure Pipelines Agents - Azure Pipelines,Choose and configure Azure Pipelines agents,Learn how you can build code and deploy software by using agents in Azure Pipelines.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 You need at least one agent to build your code or deploy your software by using Azure Pipelines. As your code base and team grow, you need multiple agents. When your pipeline runs, the system begins one or more jobs. An agent is computing infrastructure with installed agent software that runs one job at a time. Azure Pipelines provides several different types of agents. 
You can run jobs directly on the host machine of the agen",2025-11-25T14:05:00.000Z,concept-article,configuration,0.7,True,"Explains different agent types, how jobs run, and how to use agents; includes product-specific agent configuration and behavior details.",unchanged
https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/certificate?view=azure-devops-server,Use a self-signed certificate,Run an Agent with a Self-Signed Certificate - Azure Pipelines,Run Azure Pipelines agent with self-signed certificate,Learn how to run the build and release an agent with a self-signed certificate for Azure Pipelines and Azure DevOps Server.,Azure DevOps Server | Azure DevOps Server 2022 This article explains how you can run a self-hosted agent with a self-signed certificate for Azure Pipelines and Azure DevOps Server.,2025-10-27T22:02:00.000Z,concept-article,security,0.7,True,"Explains certificate configuration for agents, including certificate stores/paths and TLS settings specific to Azure DevOps Server.",unchanged
https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/device-code-flow-agent-registration?view=azure-devops,Device code flow,Register an agent using device code flow - Azure Pipelines,Register Azure Pipelines agent using device code flow,Learn how to register a self-hosted agent using device code flow,You can register an agent using device code flow starting with agent version 3.227.1 by specifying AAD when prompted for the agent authentication type.,2023-10-18T15:06:00.000Z,concept-article,security,0.8,True,"Describes AAD device code flow option, including agent version requirement and auth type value 'AAD', which are product-specific auth settings.",unchanged
@@ -17,7 +17,7 @@ This allows your agent to connect to Azure Pipelines or TFS through the proxy. 
This in turn allows the agent to get sources and download artifacts. 
Finally, it passes the proxy details through to tasks which also need proxy settings in order to reach the web.",2025-10-27T22:02:00.000Z,concept-article,configuration,0.7,True,Proxy article usually includes specific environment variables and agent configuration flags for proxy settings unique to this product.,unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/scale-set-agents?view=azure-devops,Scale set agents,Azure Virtual Machine Scale Set agents - Azure Pipelines,Use VM scale set agents for Azure Pipelines,Use Azure Virtual Machine Scale Sets to create agents,Azure DevOps Services Tip Managed DevOps Pools is an evolution of Azure DevOps Virtual Machine Scale Sets agent pools. It simplifies custom pool creation even further by improving the scalability and reliability of custom pools. Managed DevOps Pools is a fully managed service where virtual machines or containers that power agents live in a Microsoft Azure subscription and not in your own Azure subscription. This process is similar to using Azure DevOps Virtual Machine Scale Sets agent pools. 
If ,2025-06-12T08:00:00.000Z,reference,deployment,0.7,True,"Covers using VM scale sets/Managed DevOps Pools with agents, typically including supported SKUs, scaling behavior, and pool constraints.",unchanged -https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/service-principal-agent-registration?view=azure-devops,Service Principal,Register an agent using a service principal - Azure Pipelines,Register Azure Pipelines agent with service principal,Learn how to register a self-hosted agent using a Service Principal,You can register an agent using a Service Principal starting with agent version 3.227.1 by specifying SP as the agent authentication option.,2024-08-21T13:58:00.000Z,concept-article,security,0.8,True,"Service principal registration includes specific parameters, required permissions, and minimum agent version (3.227.1), which are security config details.",unchanged +https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/service-principal-agent-registration?view=azure-devops,Service Principal,Register an agent using a service principal - Azure Pipelines,Register Azure Pipelines agents with a service principal,Learn how to register a self-hosted agent using a Service Principal,You can register an agent using a Service Principal starting with agent version 3.227.1 by specifying SP as the agent authentication option.,2026-04-22T21:02:00.000Z,concept-article,security,0.78,True,"Provides step-by-step, product-specific instructions for using a service principal to register a self-hosted agent, including required permissions, parameter names, and version-specific behavior (agent version 3.227.1 and SP option). 
This is detailed security/auth configuration.",updated https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/v3-agent?view=azure-devops,Agent version 3.x,Agent software version 3 - Azure Pipelines,Run and migrate Azure Pipelines agent v3,Learn how to run pipelines using the version 3 agent software.,"Important Agent software version 3 (using .NET 6) is unsupported for Azure DevOps Services, and the Azure Pipelines team recommends you upgrade to Agent software version 4 (using .NET 8). To update your self-hosted agents to version 4, see Upgrade to agent software version 4. If you're running your self-hosted agents on an operating system that isn't supported by .NET 8, you must update your machines to use a newer supported operating system supported by .NET 8. For more information, see Upgrade to a",2025-11-07T08:00:00.000Z,concept-article,deployment,0.7,True,"Covers support status, migration requirements, and OS/version constraints for v3 agents, which are version- and product-specific deployment details.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/v4-agent?view=azure-devops,Agent version 4.x,Agent software version 4 - Azure Pipelines,Upgrade and run Azure Pipelines agent v4,Learn how to run pipelines using the version 4 agent software.,The pipelines team is upgrading the agent software from version 3.x to version 4.x (using .NET 8).,2025-12-04T22:15:00.000Z,concept-article,deployment,0.7,True,"Version-specific agent guidance typically includes OS/CPU prerequisites, supported platforms, and upgrade constraints unique to v4 that aren't general knowledge.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/windows-agent?view=azure-devops,Self-hosted Windows agents,Deploy an Azure Pipelines agent on Windows - Azure Pipelines,Deploy Azure Pipelines agent on Windows,Learn how to use Windows agents to build and deploy your Windows and Azure code for Azure Pipelines,"Azure DevOps Services | Azure DevOps Server To 
build and deploy Windows, Azure, and other Visual Studio solutions you'll need at least one Windows agent. Windows agents can also build Java and Android apps. This article provides guidance for using the 4.x agent software with Azure DevOps Services and Azure DevOps Server. This article provides guidance for using the 3.x agent software with Azure DevOps Server 2022 and Azure DevOps Server 2020. For a list of Azure DevOps Server versions that support t",2025-12-04T22:15:00.000Z,concept-article,deployment,0.8,True,"Windows agent article typically documents required components, services, and supported Windows versions for running the agent.",unchanged @@ -81,7 +81,7 @@ https://learn.microsoft.com/en-us/azure/devops/pipelines/get-started/clone-impor https://learn.microsoft.com/en-us/azure/devops/pipelines/get-started/key-pipelines-concepts?view=azure-devops,Key concepts,Key Azure Pipelines concepts - Azure Pipelines,,Learn the key concepts of how Azure Pipelines works with your code and tools to automate build and deployment.,"Azure DevOps Services This article presents the key concepts and components that make up Azure Pipelines. Understanding the basic terms and parts of a pipeline can help you more effectively build, test, and deploy your code. 
Tip You can use AI to help with this task later in this article, or see Enable AI assistance with Azure DevOps MCP Server to get started.",2026-03-10T08:00:00.000Z,overview,,0.1,False,"Conceptual explanation of key Azure Pipelines concepts; describes terms and components but not detailed limits, configuration matrices, or error-resolution mappings.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/get-started/manage-pipelines-with-azure-cli?view=azure-devops,Manage pipelines with Azure CLI,Manage pipelines with the Azure DevOps CLI - Azure Pipelines,,"Learn about the Azure DevOps CLI extension and about the az pipelines list, show, run, and update commands for managing your pipelines.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This article describes how you can manage existing pipelines in your Azure DevOps project by using the following az pipelines commands: Note The Azure DevOps CLI extension is available only for Azure DevOps Services, and doesn't support any version of Azure DevOps Server.",2024-08-28T17:47:00.000Z,get-started,,0.35,False,"Shows CLI commands (list, show, run, update) but at a usage level; not a full parameter reference with defaults and constraints.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/get-started/pipelines-get-started?view=azure-devops,YAML vs Classic Pipelines,YAML vs Classic Pipelines - Azure Pipelines,,Learn the basics about Azure Pipelines and explore the different features available for both YAML and Classic pipelines.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Azure Pipelines enables developers to automate a wide variety of tasks, ranging from executing a batch file to setting up a complete continuous integration (CI) and continuous delivery (CD) solution for their applications. Azure Pipelines supports a wide range of languages, platforms, and tools, and offers two types of pipelines to choose from: YAML-based and Classic pipeline editors. 
Note If you are new to Azure Pipelines, it is",2025-03-25T14:59:00.000Z,overview,,0.25,False,"Comparison of YAML vs Classic pipelines is conceptual; no numeric thresholds, decision matrices, or tier-specific limits.",unchanged -https://learn.microsoft.com/en-us/azure/devops/pipelines/get-started/pipelines-sign-up?view=azure-devops,Sign up for Azure Pipelines,Sign up for Azure Pipelines - Azure Pipelines,,Walk through signing up for Azure Pipelines to begin managing CI/CD to deploy your code.,"Azure DevOps Services Sign up for an Azure DevOps organization and Azure Pipelines to begin managing CI/CD to deploy your code with high-performance pipelines. For more information about Azure Pipelines, see What is Azure Pipelines.",2026-04-14T01:03:00.000Z,how-to,,0.1,False,"Sign-up/get-started guide for Azure Pipelines without detailed limits, configuration matrices, or product-specific error/decision data; primarily onboarding/tutorial content rather than expert reference material.",updated +https://learn.microsoft.com/en-us/azure/devops/pipelines/get-started/pipelines-sign-up?view=azure-devops,Sign up for Azure Pipelines,Sign up for Azure Pipelines - Azure Pipelines,,Walk through signing up for Azure Pipelines to begin managing CI/CD to deploy your code.,"Azure DevOps Services Sign up for an Azure DevOps organization and Azure Pipelines to begin managing CI/CD to deploy your code with high-performance pipelines. For more information about Azure Pipelines, see What is Azure Pipelines.",2026-04-14T01:03:00.000Z,how-to,,0.1,False,"Sign-up/get-started guide for Azure Pipelines without detailed limits, configuration matrices, or product-specific error/decision data; primarily onboarding/tutorial content rather than expert reference material.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/get-started/what-is-azure-pipelines?view=azure-devops,What is Azure Pipelines?,What is Azure Pipelines? 
- Azure Pipelines,,"Learn how Azure Pipelines can use continuous integration, testing, and delivery to automatically build, test, and deploy your code.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Azure Pipelines is the part of Azure DevOps that combines continuous integration, continuous testing, and continuous delivery to automatically build, test, and deploy code projects to any destination. Azure Pipelines supports all major languages and project types, and can automate workflows in your chosen technologies and frameworks whether your app is on-premises or in the cloud.",2026-03-13T17:02:00.000Z,overview,,0.1,False,"High-level overview of Azure Pipelines (what it is and general capabilities) without specific limits, configuration tables, error codes, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/get-started/yaml-pipeline-editor?view=azure-devops,YAML pipeline editor,YAML pipeline editor guide - Azure Pipelines,,Learn how to author and edit pipelines with the YAML pipeline editor.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Azure Pipelines provides a YAML pipeline editor that you can use to author and edit your pipelines. The YAML editor is based on the Monaco Editor. The editor provides tools like Intellisense support and a task assistant to provide guidance while you edit a pipeline. 
This article shows you how to edit your pipelines using the YAML Pipeline editor, but you can also edit pipelines by modifying the azure-pipelines.yml file directly ",2025-03-31T20:43:00.000Z,reference,,0.3,False,"How-to guide for using the YAML editor; focuses on UI usage, not structured configuration or limits.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/integrations/configure-pipelines-work-tracking?view=azure-devops,Configure pipelines to support work tracking,Configure pipelines to support integration - Azure DevOps,,Learn how to configure pipelines to support integration with Azure Boards and work tracking,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 To support integration and traceability across Azure DevOps Services with pipelines, you can configure several options. You can report pipeline status, copy the syntax for status badges, and set up automatic linking of work items to builds and releases.",2026-01-07T23:38:00.000Z,how-to,,0.35,False,Covers integration with Azure Boards and work tracking conceptually; likely procedural without detailed config parameter tables.,unchanged @@ -89,11 +89,11 @@ https://learn.microsoft.com/en-us/azure/devops/pipelines/integrations/slack?view https://learn.microsoft.com/en-us/azure/devops/pipelines/library/?view=azure-devops,Library & shared resources,Asset library - Azure Pipelines,,Understand the Azure Pipelines asset library for secure files and variable groups.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 An Azure Pipelines library is a collection of assets for an Azure DevOps project. You can use library assets in multiple pipelines in a project. You can access the Library under Pipelines in the left menu of your Azure DevOps project. The library contains two types of assets, variable groups and secure files. 
Variable groups store values and secrets in groups that you can use across project pipelines. Secure files are a secure way to sto",2025-10-27T22:02:00.000Z,concept-article,,0.35,False,Overview of the asset library (variable groups and secure files); lacks detailed configuration matrices or limits beyond basic description.,unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/library/add-devops-entra-service-connection?view=azure-devops,Access Azure DevOps with Microsoft Entra workload identity,Access Azure DevOps with Microsoft Entra workload identity - Azure Pipelines,Secure Azure DevOps pipelines with Entra workload identities,Learn how to create an Azure DevOps service connection using Microsoft Entra federated credentials to enable PAT-free pipelines.,"Azure DevOps Services Note This feature is rolling out this week and next. If you don't see the feature yet on your Azure DevOps Services project, check back in a few days. An Azure DevOps service connection lets your pipelines authenticate to Azure DevOps without Personal Access Tokens (PATs) by using Microsoft Entra workload identities. Service principals and managed identities access Azure DevOps through workload identity federation, a zero-secret method that eliminates the need to manage and rotat",2026-04-07T21:03:00.000Z,how-to,security,0.7,True,"The page describes creating an Azure DevOps service connection using Microsoft Entra workload identity federation. This involves product-specific security configuration: service connection types, Entra workload identity/federated credential setup, and how service principals or managed identities are authorized to access Azure DevOps without PATs. 
These are concrete, product-specific auth and RBAC-style configurations that qualify as expert security knowledge.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/library/add-resource-protection?view=azure-devops,Add an admin role to a resource,Add an Administrator for a Protected Resource - Azure Pipelines,Assign administrators for protected pipeline resources,"Learn how to add administrators to service connections, secure files, agent pools, secret variables, and environments in Azure Pipelines.","To manage protected resources, Azure Pipelines requires a user to have the Administrator role or be a member of a group that has the Administrator role for the resource. This article shows you how to assign the Administrator role to users and groups for the following Azure Pipelines protected resources: For information about protecting repository resources, see Securely access repositories from pipelines and Protect a repository resource.",2025-10-02T22:03:00.000Z,how-to,security,0.85,True,"Describes the Administrator role for service connections, secure files, agent pools, etc., and how to assign it; this is concrete RBAC/permission configuration.",unchanged -https://learn.microsoft.com/en-us/azure/devops/pipelines/library/azure-resource-manager-alternate-approaches?view=azure-devops,Azure Resource Manager special cases,Azure Resource Manager service connection special cases - Azure Pipelines,Handle special ARM service connection authentication cases,Connect Azure Pipelines to Azure using an Azure Resource Manager service connection with either an agent-assigned managed identity or a publish profile.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 While the recommended option for Resource Manager service connections is to use workload identity federation with an app registration or managed identity, you might need to use an agent-assigned managed identity or a publish profile instead. 
In this article, you learn how to create a Resource Manager service connection that uses an app registration with a secret, one that connects to a self-hosted agent on an Azure virtual ma",2026-01-16T08:00:00.000Z,concept-article,security,0.8,True,Covers agent-assigned managed identity and publish profile scenarios with specific parameters and security implications.,unchanged -https://learn.microsoft.com/en-us/azure/devops/pipelines/library/connect-to-azure?view=azure-devops,Azure Resource Manager service connection,Use an Azure Resource Manager service connection - Azure Pipelines,Configure Azure Resource Manager service connections,Learn how to use an Azure Resource Manager service connection to connect Azure Pipelines to Azure services.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 An Azure Resource Manager service connection allows you to connect to Azure resources like Azure Key Vault from your pipeline. This connection lets you use a pipeline to deploy to Azure resources, such as an Azure App Service app, without needing to authenticate each time. You have multiple authentication options for connecting to Azure with an Azure Resource Manager service connection. 
We recommend using workload identity fed",2026-01-09T08:00:00.000Z,concept-article,security,0.9,True,"Details multiple auth options, scopes, and recommended workload identity federation, which are product-specific security configurations.",unchanged +https://learn.microsoft.com/en-us/azure/devops/pipelines/library/azure-resource-manager-alternate-approaches?view=azure-devops,Azure Resource Manager special cases,Azure Resource Manager service connection special cases - Azure Pipelines,Handle special-case Azure Resource Manager service connections,Connect Azure Pipelines to Azure using an Azure Resource Manager service connection with either an agent-assigned managed identity or a publish profile.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 While the recommended option for Resource Manager service connections is to use workload identity federation with an app registration or managed identity, you might need to use an agent-assigned managed identity or a publish profile instead. In this article, you learn how to create a Resource Manager service connection that uses an app registration with a secret, one that connects to a self-hosted agent on an Azure virtual ma",2026-04-22T21:02:00.000Z,concept-article,security,0.76,True,"Details specific configurations for ARM service connections using agent-assigned managed identity, publish profiles, and app registrations with secrets. 
Includes product-specific auth flows and permission requirements, fitting the security/identity configuration category.",updated +https://learn.microsoft.com/en-us/azure/devops/pipelines/library/connect-to-azure?view=azure-devops,Azure Resource Manager service connection,Use an Azure Resource Manager service connection - Azure Pipelines,Configure Azure Resource Manager service connections in pipelines,Learn how to use an Azure Resource Manager service connection to connect Azure Pipelines to Azure services.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 An Azure Resource Manager service connection allows you to connect to Azure resources like Azure Key Vault from your pipeline. This connection lets you use a pipeline to deploy to Azure resources, such as an Azure App Service app, without needing to authenticate each time. You have multiple authentication options for connecting to Azure with an Azure Resource Manager service connection. We recommend using workload identity fed",2026-04-22T08:00:00.000Z,concept-article,security,0.76,True,"Covers concrete authentication options (workload identity federation, managed identity, service principal, etc.) for Azure Resource Manager service connections, including how to set them up and required permissions. This is product-specific security and identity configuration rather than a generic overview.",updated https://learn.microsoft.com/en-us/azure/devops/pipelines/library/link-variable-groups-to-key-vaults?view=azure-devops,Link a variable group to Azure Key Vault,Link a variable group to secrets in Azure Key Vault - Azure Pipelines,Link Azure Pipelines variable groups to Key Vault,Learn how to link a variable group to secrets in Azure Key Vault.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 You can create a variable group that links to existing Azure key vaults and maps selected key vault secrets to the variable group. 
Only the secret names map to the variable group, not the secret values. When pipelines run, they link to the variable group to fetch the latest secret values from the vault at runtime. Any changes made to existing secrets in the key vault are automatically available to all the pipelines that use t",2025-09-10T22:34:00.000Z,tutorial,security,0.86,True,"Describes product-specific integration and security behavior between variable groups and Azure Key Vault, including how secrets are mapped and fetched at runtime.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/library/secure-files?view=azure-devops,Use secure files,Secure files for Azure Pipelines - Azure Pipelines,Manage secure files and access in Azure Pipelines,Understand how to add and use secure files in Azure Pipelines.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This article explains how to securely store and use sensitive files, such as certificates and keys, in Azure Pipelines with the secure files library. Secure files help protect sensitive data by encrypting files at rest on the server. Only pipelines that you explicitly authorize can access secure files, which helps ensure your credentials and other critical files remain safe. This article covers adding secure files, configurin",2026-03-14T01:04:00.000Z,how-to,security,0.74,True,"The secure files article covers how to store and authorize access to sensitive files in Azure Pipelines, including product-specific security behaviors such as encryption at rest, pipeline authorization, and likely permission settings or scopes for using secure files. 
These are concrete, product-specific security configuration patterns rather than generic security concepts.",unchanged -https://learn.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops,Manage service connections,Service connections - Azure Pipelines,Configure and manage Azure Pipelines service connections,Learn how to manage Azure Pipelines service connections and get a reference to service connection types.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This article covers service connections in Azure Pipelines. Service connections are authenticated connections between Azure Pipelines and external or remote services that you use to execute tasks in a job. For example, your pipelines might use the following categories of service connections: The first part of this article explains how to create, view, edit, and use service connections. The second part of the article provides ",2025-11-03T14:25:00.000Z,concept-article,security,0.85,True,"Service connections article includes connection types, authentication schemes, and permission scopes; these are product-specific security and integration configuration details.",unchanged +https://learn.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops,Manage service connections,Service connections - Azure Pipelines,,Learn how to manage Azure Pipelines service connections and get a reference to service connection types.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This article covers service connections in Azure Pipelines. Service connections are authenticated connections between Azure Pipelines and external or remote services that you use to execute tasks in a job. For example, your pipelines might use the following categories of service connections: The first part of this article explains how to create, view, edit, and use service connections. 
The second part of the article provides ",2026-04-22T08:00:00.000Z,concept-article,,0.3,False,"Primarily an overview and how-to for creating and managing service connections, plus a reference to connection types. From the summary, it doesn’t clearly expose detailed configuration tables, parameter defaults, or error-code-based troubleshooting; it looks more like conceptual and procedural guidance rather than expert-knowledge configuration or troubleshooting content.",updated https://learn.microsoft.com/en-us/azure/devops/pipelines/library/variable-groups?view=azure-devops,Manage variable groups,Manage variable groups - Azure Pipelines,Manage Azure Pipelines variable groups and access,Share common variables across pipelines using variable groups.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This article explains how to create and use variable groups in Azure Pipelines. Variable groups store values and secrets that you can pass into a YAML pipeline or make available across multiple pipelines in a project. Secret variables in variable groups are protected resources. You can add combinations of approvals, checks, and pipeline permissions to limit access to secret variables in a variable group. Access to nonsecret va
You're only limited by the number of agents that you have. Learn how to estimate how many parallel jobs you need and buy more parallel jobs for your organization. Note The free grant of parallel jobs for public projects and for certain private projects in new organizations is temporarily disabled. However, you ",2025-10-10T14:03:00.000Z,how-to,limits-quotas,0.85,True,"Parallel jobs/licensing article describes free grants, paid tiers, and job limits per organization with specific numbers and conditions, matching the limits-quotas criteria.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/migrate/from-jenkins?view=azure-devops,Migrate from Jenkins,Migrate from Jenkins to Azure Pipelines - Azure Pipelines,Migrate from Jenkins to Azure Pipelines,How to migrate from Jenkins to Azure Pipelines,"Azure DevOps Services Jenkins, an open-source automation server, is traditionally installed by enterprises in their own data @@ -159,13 +159,13 @@ https://learn.microsoft.com/en-us/azure/devops/pipelines/release/approvals/appro https://learn.microsoft.com/en-us/azure/devops/pipelines/release/approvals/gates?view=azure-devops,Deployment gates concepts,Deployment gates concepts - Azure Pipelines,,Understand deployment gates in Azure Pipelines,Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Deployment gates in Azure Pipelines are added to release pipelines to ensure that deployments meet specific criteria before proceeding. Gates are essential for ensuring that deployments are reliable and secure by enforcing rigorous checks leading to more stable and secure software releases. Gates are defined in the pre-deployment and post-deployment conditions of a release stage. 
They provide a mechanism to automatically coll,2025-05-20T21:23:00.000Z,concept-article,,0.45,False,"Conceptual description of deployment gates; summary doesn’t indicate detailed configuration tables, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/release/approvals/servicenow?view=azure-devops,ServiceNow,Integrate with ServiceNow change management - Azure Pipelines,Integrate ServiceNow change management with releases,Learn how to integrate ServiceNow change management by using gated releases with the Azure Pipelines ServiceNow extension.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 To improve collaboration between development and IT teams, Azure Pipelines supports integration with ServiceNow. Teams can reduce the risks associated with changes and follow service management methodologies such as Information Technology Infrastructure Library (ITIL) by including change management gates in release pipelines. In this tutorial, you learn how to:",2025-07-15T16:57:00.000Z,tutorial,integrations,0.7,True,Tutorial for using ServiceNow change management gates in release pipelines; involves Azure Pipelines extension configuration and ServiceNow-specific parameters.,unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/release/artifacts?view=azure-devops,Artifact sources,Artifact sources in Classic release pipelines - Azure Pipelines,Configure artifact sources in classic release pipelines,Learn about the list of artifact sources that can be used in a Classic release pipeline.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 With Classic release pipelines, you can deploy your artifacts from a wide range of sources. Using the graphical interface, you can set up your pipeline to integrate and consume artifacts from various services. 
Additionally, you can link multiple artifacts from different sources and designate one as the primary source based on your needs.",2025-10-27T22:02:00.000Z,concept-article,deployment,0.6,True,Lists and configures artifact source types and linking behavior specific to Azure Pipelines classic releases.,unchanged -https://learn.microsoft.com/en-us/azure/devops/pipelines/release/automate-service-connections?view=azure-devops,Use scripts to automate Azure Resource Manager service connections,Use scripts to automate Azure Resource Manager with workload identity service connections - Azure Pipelines,Automate ARM workload identity service connections with scripts,"Learn how to automate Azure Resource Manager service connections in Azure Pipelines with workload identity. Save time, ensure consistency, and reduce errors.","Azure DevOps Services Learn how to use scripts to create Azure Resource Manager service connections with workload identity in Azure Pipelines. Scripts ensure consistency, efficiency, and repeatability when setting up service connections, reducing the risk of human error. They save time, especially when creating multiple connections or deploying to different environments. These scripts can also be integrated into an automation process to scale and better manage large deployments. 
Using scripts as",2025-05-29T14:53:00.000Z,how-to,integrations,0.8,True,"Shows scripting patterns and parameters for creating ARM service connections, including API/CLI fields and constraints specific to this integration.",unchanged +https://learn.microsoft.com/en-us/azure/devops/pipelines/release/automate-service-connections?view=azure-devops,Use scripts to automate Azure Resource Manager service connections,Use scripts to automate Azure Resource Manager with workload identity service connections - Azure Pipelines,Automate ARM workload identity service connections with scripts,"Learn how to automate Azure Resource Manager service connections in Azure Pipelines with workload identity. Save time, ensure consistency, and reduce errors.","Azure DevOps Services Learn how to use scripts to create Azure Resource Manager service connections with workload identity in Azure Pipelines. Scripts ensure consistency, efficiency, and repeatability when setting up service connections, reducing the risk of human error. They save time, especially when creating multiple connections or deploying to different environments. These scripts can also be integrated into an automation process to scale and better manage large deployments. Using scripts as",2026-04-22T08:00:00.000Z,how-to,integrations,0.7,True,"Shows how to script creation of Azure Resource Manager service connections with workload identity, including concrete script parameters and configuration values. 
This is a code-focused integration pattern for connecting Azure DevOps pipelines to Azure with specific config details.",updated https://learn.microsoft.com/en-us/azure/devops/pipelines/release/azure-key-vault?view=azure-devops,Use secrets from Azure Key Vault,Use Azure Key Vault secrets in Azure Pipelines - Azure Pipelines,Use Azure Key Vault secrets in Azure Pipelines,"Learn how to create Azure Key vaults, store secrets, and use them in your Azure Pipelines.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Azure Key Vault is a cloud service that helps developers securely store and manage sensitive information such as API keys, credentials, and certificates. Azure Key Vault service supports two types of containers: vaults and managed HSM (Hardware Security Module) pools. Vaults can store both software and HSM-backed keys, secrets, and certificates, while managed HSM pools exclusively support HSM-backed keys. In this article, yo",2026-01-30T22:03:00.000Z,tutorial,security,0.85,True,"Covers creating Key Vaults, storing secrets, and wiring them into pipelines; includes service connection setup, secret mapping, and permission scopes, which are concrete security configuration patterns.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/release/azure-rm-endpoint?view=azure-devops,Troubleshoot Azure Resource Manager service connections,Troubleshoot Azure Resource Manager service connections - Azure Pipelines,Troubleshoot Azure Resource Manager service connections,How to troubleshoot Azure Resource Manager service connections in Azure Pipelines,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This article presents common troubleshooting scenarios to help you resolve issues you might encounter when creating an Azure Resource Manager (ARM) service connection. See Manage service connections to learn how to create, edit, and secure service connections. 
See Troubleshoot an Azure Resource Manager workload identity service connection to learn how to fix workload-identity related issues. This article uses the terms ""tenant"" a",2026-02-24T18:05:00.000Z,troubleshooting,troubleshooting,0.9,True,Provides symptom → cause → fix guidance with ARM-specific errors and diagnostic steps unique to Azure Pipelines service connections.,unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/release/caching?view=azure-devops,Reduce build time using caching,Pipeline caching - Azure Pipelines,Optimize Azure Pipelines performance with caching,Improve pipeline performance by caching files such as dependencies between runs.,"Azure DevOps Services Pipeline caching can help reduce build time by reusing downloaded dependencies from previous runs, avoiding the need to recreate or redownload the same files. This is particularly helpful in scenarios where the same dependencies are downloaded repeatedly at the start of each run. This is often a time consuming process involving hundreds or thousands of network calls. Caching is most effective when the time required to restore and save the cache is less than the time it take",2025-04-28T22:36:00.000Z,overview,best-practices,0.7,True,"Pipeline caching article typically includes concrete YAML patterns, cache key strategies, and product-specific gotchas for effective use.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/release/configure-app-secret?view=azure-devops,Manually configure an ARM service connection with a secret,Manually create an Azure Resource Manager service connection with a secret - Azure Pipelines,Create ARM service connection using client secret,"Learn how to manually set an Azure Resource Manager service connection with a secret in Azure Pipelines, one of the services in Azure DevOps.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Warning Using a secret that requires manual rotation and management is not recommended. 
Workload identity federation is the preferred credential type. If you don't need to use a secret because of organizational limitations, use workload identity federation with either an app registration or managed identity. This article guides you through manually creating an Azure Resource Manager (ARM) service connection for service prin",2025-11-10T18:14:00.000Z,concept-article,security,0.9,True,"Manual secret-based ARM connection setup includes app registration fields, scopes, and warnings about secret rotation—detailed security config.",unchanged -https://learn.microsoft.com/en-us/azure/devops/pipelines/release/configure-workload-identity?view=azure-devops,Manually configure workload identity service connections,Set a Resource Manager workload identity service connection - Azure Pipelines,Manually configure ARM workload identity connections,"Learn how to manually set an Azure Resource Manager workload identity service connection in Azure Pipelines, one of the services in Azure DevOps.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 When you troubleshoot an Azure Resource Manager workload identity service connection, you might need to manually configure the connection instead of using the automated tool that's available in Azure DevOps. Before you begin a manual configuration, try the automated approach. For authentication, you can use either a managed identity or an app registration. 
The managed identity option is useful if you don't have permissions to c",2026-01-12T08:00:00.000Z,concept-article,security,0.9,True,"Explains manual setup for workload identity with managed identity or app registration, including specific IDs, roles, and settings.",unchanged +https://learn.microsoft.com/en-us/azure/devops/pipelines/release/configure-workload-identity?view=azure-devops,Manually configure workload identity service connections,Set a Resource Manager workload identity service connection - Azure Pipelines,Manually configure workload identity ARM service connections,"Learn how to manually set an Azure Resource Manager workload identity service connection in Azure Pipelines, one of the services in Azure DevOps.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 When you troubleshoot an Azure Resource Manager workload identity service connection, you might need to manually configure the connection instead of using the automated tool that's available in Azure DevOps. Before you begin a manual configuration, try the automated approach. For authentication, you can use either a managed identity or an app registration. The managed identity option is useful if you don't have permissions to c",2026-04-22T08:00:00.000Z,concept-article,security,0.8,True,"Explains manual setup of Azure Resource Manager workload identity service connections, including choosing between managed identity and app registration, and configuring identity/permissions. 
This is detailed, product-specific security configuration content.",updated https://learn.microsoft.com/en-us/azure/devops/pipelines/release/create-classic-pipelines?view=azure-devops,Create Classic pipelines,Create Classic pipelines - Azure Pipelines,,Learn how to create Classic pipeline definitions in Azure Pipelines.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Azure Pipelines enables you to set up continuous integration and continuous delivery to automatically build, test, and deploy your applications to a variety of environments. With Classic pipelines, you configure your pipeline using a visual editor. You can set up tasks to build your project, package your binaries, run tests, and make build outputs available for deployment. This article walks you through creating a Classic pipeli",2026-01-29T22:17:00.000Z,tutorial,,0.3,False,Step-by-step Classic pipeline creation; mostly procedural without deep config matrices or limits.,unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/release/define-multistage-release-process?view=azure-devops,Create a multi-stage release,Create a multi-stage release pipeline - Azure Pipelines,Configure multi-stage classic release pipelines for ASP.NET Core,Learn how to create a multi-stage Classic release pipeline for your ASP.NET Core app using Azure Pipelines.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Azure Pipelines enables developers to deploy their applications across multiple environments using both YAML and Classic pipelines. This article walks you through creating a multi-stage Classic release pipeline to deploy your ASP.NET Core web app to multiple stages.
In this tutorial, you'll learn how to:",2024-12-11T22:08:00.000Z,tutorial,deployment,0.65,True,"Shows detailed configuration of multiple stages, tasks, and environment settings in classic release pipelines.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/release/deploy-multiple-branches?view=azure-devops,Deploy from multiple branches,Deploy multiple branches to different stages - Azure Pipelines,,Learn how to use Classic release pipelines to deploy from multiple branches to different stages.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Classic release pipelines provide a graphical way to set up continuous delivery for your application. You can configure a release to trigger automatically when a new artifact is available, and then use artifact filters to map specific branches to specific stages. This approach lets you deploy each branch to the stage you intend. This article walks you through how to configure a Classic release pipeline that deploys to differe",2026-03-27T21:05:00.000Z,tutorial,,0.3,False,"The page is a how-to/tutorial for configuring classic release pipelines to deploy different branches to different stages. It describes steps and UI actions but does not present product-specific limits, configuration parameter tables, error-code-based troubleshooting, or quantified decision criteria. The content is procedural rather than expert reference material, so it does not meet any sub-skill type’s detection criteria.",unchanged @@ -190,17 +190,17 @@ These components are often independently built. 
When an upstream component (a li https://learn.microsoft.com/en-us/azure/devops/pipelines/release/releases?view=azure-devops,Create a Classic release,Create Classic release pipelines - Azure Pipelines,Create classic release pipelines for multi-environment deployment,Learn how to create Classic release definitions in Azure Pipelines.,Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Azure Pipelines enables you to deploy applications efficiently and securely across multiple environments using Classic release pipelines. This guide walks you through the steps to create a Classic release definition in Azure Pipelines.,2025-11-14T22:08:00.000Z,tutorial,deployment,0.65,True,"How-to guide for defining classic release definitions with environment, artifact, and task configuration specific to Azure Pipelines.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/release/task-groups?view=azure-devops,Task groups (Classic),Task groups in Classic pipelines - Azure Pipelines,,"Understand, create, and manage task groups in Classic pipelines for Azure Pipelines.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 In Classic pipelines, a task group encapsulates a sequence of tasks that are already defined in a pipeline into a single reusable task. The new task group is automatically added to the task catalog, and can be added to pipelines in the project just like other tasks. Task groups are stored at the project level, and aren't accessible outside the project scope. 
Task groups are a way to standardize and centrally manage deployment",2024-08-14T22:07:00.000Z,how-to,,0.45,False,Explains task groups conceptually and how to create/manage them; likely more procedural than a deep configuration or limits reference.,unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/release/triggers?view=azure-devops,Release triggers (Classic),Classic release triggers - Azure Pipelines,Configure classic release triggers in Azure Pipelines,Learn the different types of release triggers and how to use them in your release pipelines.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Release triggers are an automation tool that can be used in your deployment workflow to initiate actions when specific conditions are met. Classic release pipelines support several types of triggers, which we'll cover in this article: Continuous deployment triggers Scheduled release triggers Pull request release triggers Stage triggers",2025-07-15T22:26:00.000Z,tutorial,configuration,0.65,True,"Describes multiple trigger types (CD, scheduled, PR, stage) and how to configure them in classic release pipelines—product-specific trigger configuration.",unchanged -https://learn.microsoft.com/en-us/azure/devops/pipelines/release/troubleshoot-workload-identity?view=azure-devops,Troubleshoot workload identity service connections,Troubleshoot workload identity service connections - Azure Pipelines,Troubleshoot ARM workload identity service connections,"Learn how to troubleshoot an Azure Resource Manager workload identity service connection in Azure Pipelines, one of the services in Azure DevOps.",Get help debugging common issues with workload identity service connections. 
You also learn how to manually create a service connection if you need to.,2025-11-10T14:03:00.000Z,troubleshooting-general,troubleshooting,0.9,True,Explicit troubleshooting article; typically maps specific error messages and causes to resolutions for workload identity connections.,unchanged +https://learn.microsoft.com/en-us/azure/devops/pipelines/release/troubleshoot-workload-identity?view=azure-devops,Troubleshoot workload identity service connections,Troubleshoot workload identity service connections - Azure Pipelines,Troubleshoot Azure workload identity service connections,"Learn how to troubleshoot an Azure Resource Manager workload identity service connection in Azure Pipelines, one of the services in Azure DevOps.",Get help debugging common issues with workload identity service connections. You also learn how to manually create a service connection if you need to.,2026-04-22T08:00:00.000Z,troubleshooting,troubleshooting,0.84,True,"Explicitly a troubleshooting guide for workload identity service connections, mapping common issues to causes and fixes, and likely including specific error messages and diagnostic steps. This matches the troubleshooting pattern of symptom → cause → solution.",updated https://learn.microsoft.com/en-us/azure/devops/pipelines/release/variables?view=azure-devops,Debug release issues,Use variables in Classic release pipelines - Azure Pipelines,Use variables in classic Azure release pipelines,Learn how to use the different types of variables in a Classic release pipeline.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Using variables in Classic release pipelines is a convenient way to exchange and transport data throughout your pipeline. Each variable is stored as a string, and its value can change between pipeline runs. Unlike Runtime parameters, which are only available at template parsing time, variables in Classic release pipelines are accessible throughout the entire deployment process. 
When you set up tasks to deploy your application ",2026-01-20T08:00:00.000Z,concept-article,configuration,0.8,True,"Explains different variable types and their behavior in classic release pipelines; typically includes variable scopes, precedence, and syntax, which are concrete configuration semantics.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/release/variables?view=azure-devops,Use classic release and artifacts variables,Use variables in Classic release pipelines - Azure Pipelines,,Learn how to use the different types of variables in a Classic release pipeline.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Using variables in Classic release pipelines is a convenient way to exchange and transport data throughout your pipeline. Each variable is stored as a string, and its value can change between pipeline runs. Unlike Runtime parameters, which are only available at template parsing time, variables in Classic release pipelines are accessible throughout the entire deployment process. When you set up tasks to deploy your application ",2026-01-20T08:00:00.000Z,concept-article,,0.45,False,Explains variables in Classic release pipelines at a conceptual/how-to level; no strong indication of detailed config tables or limits.,unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/reports/pipeline-widgets?view=azure-devops,Monitor your pipelines with dashboard widgets,Monitor your pipelines with dashboard widgets - Azure DevOps,Monitor pipelines with Azure DevOps dashboard widgets,Monitor your pipeline with widgets on your team dashboard in Azure DevOps.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 You can monitor your pipelines by adding widgets to team dashboards in Azure DevOps. Widgets give you visibility to the status of your build and release pipelines and monitor test results trends. 
For information about team dashboards, see About dashboards, charts, reports, and widgets.",2025-10-27T22:02:00.000Z,concept-article,configuration,0.6,True,"Explains which widgets exist for pipelines, what they show, and how to configure them on dashboards; these are concrete configuration options for monitoring.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/reports/pipelinereport?view=azure-devops,View pipeline reports,Pipeline reports - Azure Pipelines,Use Azure Pipelines analytics and reports,Get meaningful insights with pipeline reports in the pipeline,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Teams track their pipeline health and efficiency to ensure continuous delivery to their customers. You can gain visibility into your team's pipeline(s) using Pipeline analytics. The source of information for pipeline analytics is the set of runs for your pipeline. These analytics are accrued over a period of time, and form the basis of the rich insights offered. Pipelines reports show you metrics, trends, and can help you ide",2025-12-19T16:56:00.000Z,concept-article,configuration,0.6,True,"Pipeline reports article describes specific metrics, trends, and how they’re computed and surfaced in Azure Pipelines; these are product-specific analytics configuration/usage details.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/?view=azure-devops,Supported repositories,Build source repositories - Azure Pipelines,Select supported source repositories for Azure Pipelines,Build source repositories using Azure Pipelines,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Azure Pipelines and Azure DevOps Server integrate with a number of version control systems. When you use any of these version control systems, you can configure a pipeline to build, test, and deploy your application. YAML pipelines only work with certain version control systems.
The following table shows all the supported version control systems and the ones that support YAML pipelines.",2024-04-23T21:42:00.000Z,reference,deployment,0.65,True,"Includes a table of supported version control systems and which support YAML pipelines, effectively a platform support matrix relevant to deployment methods.",unchanged -https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/azure-repos-git?view=azure-devops,Azure Repos Git,Build Azure Repos Git repositories - Azure Pipelines,,Using an Azure Repos Git repository with Azure Pipelines,Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Azure Pipelines can automatically build and validate every pull request and commit to your Azure Repos Git repository.,2026-02-13T08:00:00.000Z,reference,,0.35,False,How to use Azure Repos Git with pipelines; integration steps but not a detailed parameter/limits reference.,unchanged +https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/azure-repos-git?view=azure-devops,Azure Repos Git,Build Azure Repos Git repositories - Azure Pipelines,,Using an Azure Repos Git repository with Azure Pipelines,Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Azure Pipelines can automatically build and validate every pull request and commit to your Azure Repos Git repository.,2026-04-22T08:00:00.000Z,reference,,0.3,False,"Appears to be a how-to/tutorial on using Azure Repos Git with Azure Pipelines. Likely focuses on basic configuration and workflow rather than detailed limits, configuration matrices, or product-specific error codes. 
No indication of numeric limits, RBAC role tables, or advanced configuration parameter tables.",updated https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/bitbucket?view=azure-devops,Bitbucket Cloud,Build Bitbucket Cloud repositories - Azure Pipelines,,Using a Bitbucket Cloud repository with Azure Pipelines,Azure DevOps Services Azure Pipelines can automatically build and validate every pull request and commit to your Bitbucket Cloud repository. This article describes how to configure the integration between Bitbucket Cloud and Azure Pipelines. Bitbucket and Azure Pipelines are two independent services that integrate well together. Your Bitbucket Cloud users do not automatically get access to Azure Pipelines. You must add them explicitly to Azure Pipelines.,2023-10-27T08:00:00.000Z,reference,,0.35,False,Integration guide for Bitbucket Cloud; mostly procedural without detailed parameter tables or numeric constraints.,unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/github-enterprise?view=azure-devops,GitHub Enterprise Server,Build code from GitHub Enterprise Server - Azure Pipelines,Plan Azure Pipelines deployment with GitHub Enterprise Server,Using a GitHub Enterprise Server with Azure Pipelines,"Azure DevOps Services You can integrate your on-premises GitHub Enterprise Server with Azure Pipelines. Your on-premises server may be exposed to the Internet or it may not be. 
If your GitHub Enterprise Server is reachable from the servers that run Azure Pipelines service, then: If your GitHub Enterprise Server is not reachable from the servers that run Azure Pipelines service, then: If your on-premises server is reachable from Microsoft-hosted agents, then you can use them to run your pipelines",2023-09-28T21:35:00.000Z,reference,deployment,0.7,True,Differentiates scenarios based on network reachability and agent type; effectively a deployment support matrix with product-specific constraints.,unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/github?view=azure-devops,GitHub,Build GitHub repositories - Azure Pipelines,,Using a GitHub repository with Azure Pipelines,"Azure DevOps Services Azure Pipelines can automatically build and validate every pull request and commit to your GitHub repository. This article describes how to configure the integration between GitHub and Azure Pipelines. If you're new to pipelines integration with GitHub, follow the steps in Create your first pipeline. Come back to this article to learn more about configuring and customizing the integration between GitHub and Azure Pipelines.",2024-12-11T08:00:00.000Z,reference,,0.35,False,Configuring GitHub integration is tutorial-style; lacks structured config tables or error-code troubleshooting.,unchanged -https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/multi-repo-checkout?view=azure-devops,Multiple repositories,Check out multiple repositories in your pipeline - Azure Pipelines,Configure multi-repository checkout in Azure Pipelines,Learn how to check out multiple repositories in your pipeline,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Pipelines often rely on multiple repositories that contain source code, tools, scripts, or other items you need to build your code. 
By using multiple checkout steps in your pipeline, you can fetch and check out other repositories in addition to the one you use to store your YAML pipeline.",2026-03-05T18:06:00.000Z,reference,configuration,0.65,True,"Describes how to use multiple checkout steps and additional repositories in YAML pipelines; this feature typically involves specific YAML keys, allowed values, and behavior details that constitute product-specific configuration knowledge beyond generic CI concepts.",unchanged +https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/multi-repo-checkout?view=azure-devops,Multiple repositories,Check out multiple repositories in your pipeline - Azure Pipelines,Configure multi-repo checkout in Azure Pipelines,Learn how to check out multiple repositories in your pipeline,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Pipelines often rely on multiple repositories that contain source code, tools, scripts, or other items you need to build your code. By using multiple checkout steps in your pipeline, you can fetch and check out other repositories in addition to the one you use to store your YAML pipeline.",2026-04-22T08:00:00.000Z,reference,configuration,0.7,True,"Multi-repo checkout in YAML pipelines typically involves product-specific configuration syntax and parameters (for example, multiple checkout steps, repository resource definitions, and settings like persistCredentials, clean, path). 
This is configuration-focused, with concrete parameter names and allowed values that are specific to Azure Pipelines and not just generic Git knowledge.",updated https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/on-premises-bitbucket?view=azure-devops,Bitbucket Server,Build code from on-premises Bitbucket server - Azure Pipelines,Choose agents and connectivity for on-premises Bitbucket builds,Using on-premises Bitbucket with Azure Pipelines,"Azure DevOps Services Note To integrate Bitbucket Cloud with Azure Pipelines, see Bitbucket Cloud. You can integrate your on-premises Bitbucket server or another Git server with Azure Pipelines. Your on-premises server might be exposed to the Internet or it might not be. If your on-premises server is reachable from the servers that run Azure Pipelines service, then: If your on-premises server isn't reachable from the servers that run Azure Pipelines service, then: Note YAML pipelines do not work ",2023-10-27T19:55:00.000Z,reference,deployment,0.7,True,"Differentiates scenarios based on server reachability and agent type, and notes YAML support limitations; this is a deployment constraints/support-matrix style page.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/pipeline-options-for-git?view=azure-devops,Pipeline options for Git repositories,Options for Git repositories - Azure Pipelines,Configure advanced Git repository options in Azure Pipelines,Options available when using a Git repository with Azure Pipelines,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 While editing a pipeline that uses a Git repo—in an Azure DevOps project, GitHub, GitHub Enterprise Server, Bitbucket Cloud, or another Git repo—you have the following options.
Note ClickAdvanced settingsin theGet Sourcestask to see some of the above options.",2025-03-25T14:59:00.000Z,reference,configuration,0.7,True,"Describes multiple options under Get Sources, including advanced settings; this page typically lists specific option names and behaviors, fitting configuration reference.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/subversion?view=azure-devops,Subversion,Build code from Subversion - Azure Pipelines,Integrate on-premises Subversion servers with Azure Pipelines,Using Subversion repo with Azure Pipelines,"Azure DevOps Services You can integrate your on-premises Subversion server with Azure Pipelines. The Subversion server must be accessible to Azure Pipelines. Note YAML pipelines do not work with Subversion repositories. If your server is reachable from the hosted agents, then you can use the hosted agents to run manual, scheduled, or CI builds. Otherwise, you must set up self-hosted agents that can access your on-premises server and fetch the code. To integrate with Subversion, create aSubversio",2023-01-26T20:38:00.000Z,reference,deployment,0.7,True,"Specifies that YAML pipelines do not work with Subversion and distinguishes hosted vs self-hosted agent scenarios, a deployment support/constraints matrix.",unchanged @@ -218,7 +218,7 @@ https://learn.microsoft.com/en-us/azure/devops/pipelines/security/overview?view= https://learn.microsoft.com/en-us/azure/devops/pipelines/security/project-security-script?view=azure-devops,Use a script to update project security settings,Use a script to update security settings - Azure Pipelines,Automate Azure Pipelines security with REST and PowerShell,Learn how to use PowerShell scripts to automate Azure DevOps pipeline security settings. 
Configure project-level settings with secure-by-default recommendations.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Use theAzure DevOps REST APIto automate updates to some Azure DevOps pipeline settings at the project level. Some settings aren't available through the REST API. For organization-level settings, you'll need to make configuration changes within the Azure DevOps UI.",2025-12-19T16:56:00.000Z,how-to,security,0.8,True,"Shows using Azure DevOps REST API and PowerShell to update pipeline security settings; contains specific setting names, REST endpoints, and parameters unique to Azure DevOps.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/security/resources?view=azure-devops,Pipeline resources,Pipeline resource security - Azure Pipelines,Configure pipeline resource security and approvals,"Learn about protected Azure Pipelines resources, and how to use permissions, checks, and approvals to help secure them in pipeline runs.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This article describes security features that help safeguardprotected resourcesin Azure Pipelines. Pipelines might need to access open or protected resources during runs. Artifacts, pipelines, test plans, and work items areopen resources. Pipelines can freely access these resources, and you can fully automate workflows by subscribing to resource trigger events. 
For more information about protecting open resources, seeProtect ",2025-10-27T22:02:00.000Z,concept-article,security,0.8,True,"Explains protected resources, permissions, checks, and approvals; contains Azure Pipelines–specific resource types and security configuration details.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/security/secrets?view=azure-devops,Protecting secrets,Secrets in pipelines - Azure Pipelines,Protect and manage secrets in Azure Pipelines,Learn about best practices for protecting secrets in Azure Pipelines.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This article provides best practices for protecting secrets in Azure Pipelines. A secret is anything that you want to tightly control access to, such as API keys, passwords, certificates, or cryptographic keys. Azure Pipelines doesn't generate secret values, but you might need to add secrets to pipelines to store sensitive data like API keys. This article is part of a series that helps you implement security measures for Azur",2025-10-30T14:05:00.000Z,best-practice,security,0.8,True,"Best practices for secrets in pipelines; Azure Pipelines has product-specific secret handling (secret variables, variable groups, Key Vault integration) with concrete recommendations and gotchas.",unchanged -https://learn.microsoft.com/en-us/azure/devops/pipelines/security/secure-access-to-repos?view=azure-devops,Secure access to Azure repositories from pipelines,Access repositories from pipelines - Azure Pipelines,Secure repository access from Azure Pipelines,Learn how to provide secure access to source code repositories from Azure Pipelines.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 To protect the code that runs their operations, organizations must carefully control access to their source code repositories. 
This article describes how Azure Pipelines build and release pipelines can securely access repositories to minimize the risk of unauthorized access. This article is part of a series that helps you implement security measures for Azure Pipelines. For more information, see Secure Azure Pipelines.",2025-10-30T14:05:00.000Z,how-to,security,0.75,True,"Describes how build and release pipelines securely access repositories; likely includes specific permission models, service connection types, and configuration options unique to Azure DevOps.",unchanged +https://learn.microsoft.com/en-us/azure/devops/pipelines/security/secure-access-to-repos?view=azure-devops,Secure access to Azure repositories from pipelines,Access repositories from pipelines - Azure Pipelines,Secure Azure Pipelines access to source repositories,Learn how to provide secure access to source code repositories from Azure Pipelines.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 To protect the code that runs their operations, organizations must carefully control access to their source code repositories. This article describes how Azure Pipelines build and release pipelines can securely access repositories to minimize the risk of unauthorized access. This article is part of a series that helps you implement security measures for Azure Pipelines. For more information, see Secure Azure Pipelines.",2026-04-22T08:00:00.000Z,how-to,security,0.72,True,"The article provides product-specific security guidance for Azure Pipelines accessing repos, including how to configure secure access between pipelines and repositories. 
It focuses on secure access patterns and configuration details unique to Azure DevOps (such as how pipelines authenticate to repos and how access is controlled), which qualifies as security expert knowledge rather than a generic overview.",updated https://learn.microsoft.com/en-us/azure/devops/pipelines/security/templates?view=azure-devops,Security through templates,Templates for security - Azure Pipelines,Use YAML templates to improve pipeline security,Learn about using template features to help improve pipeline security.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Azure Pipelines templates let you define reusable content, logic, and parameters in YAML pipelines. This article describes how templates can help enhance pipeline security by: This article is part of a series that helps you implement security measures for Azure Pipelines. For more information, see Secure Azure Pipelines.",2025-10-27T22:02:00.000Z,concept-article,security,0.7,True,Shows how to use Azure Pipelines templates for security; likely includes specific YAML patterns and parameters that enforce security controls.,unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/targets/azure-sqldb?view=azure-devops,Azure SQL database,Deploy to Azure SQL Database - Azure Pipelines,Deploy database changes to Azure SQL with Pipelines,Learn how to deploy an Azure SQL database with Azure Pipelines.,Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 You can automatically deploy your database updates to Azure SQL database after every successful build.,2025-05-07T19:14:00.000Z,how-to,deployment,0.7,True,"Task-focused deployment article with Azure SQL–specific pipeline configuration, task names, and parameters that go beyond generic CI/CD knowledge.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/targets/azure-stack?view=azure-devops,Azure Stack,Deploy to Azure Stack Hub App Service using Azure Pipelines - Azure Pipelines,Deploy apps to 
Azure Stack Hub App Service,Understand how to deploy to Azure Stack Hub App Service using Azure Pipelines,Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This article walks you through setting up a CI/CD pipeline for deploying an application to app services in an Azure Stack Hub instance using Azure Pipelines. In this article you can learn to create or validate:,2025-07-15T16:57:00.000Z,how-to,deployment,0.65,True,Describes CI/CD pipeline setup targeting Azure Stack Hub App Service with Stack-specific connection and deployment configuration.,unchanged @@ -306,6 +306,7 @@ https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/azure-w https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/azure-web-powershell-deployment-v1?view=azure-pipelines,AzureWebPowerShellDeployment@1,AzureWebPowerShellDeployment@1 - Azure App Service Classic (Deprecated) v1 task,Configure deprecated Azure Web PowerShell deployment task,Create or update Azure App Service using Azure PowerShell.,This task creates or updates Azure App Service using Azure PowerShell. This task is deprecated.,2026-04-02T08:00:00.000Z,reference,configuration,0.86,True,"AzureWebPowerShellDeployment@1 page lists task inputs (ConnectedServiceName, WebSiteName, Package, Slot, etc.) and options for creating/updating App Service via PowerShell, which are configuration details.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/bash-v3?view=azure-pipelines,Bash@3,Bash@3 - Bash v3 task,Configure Bash@3 task for cross-platform scripts,"Run a Bash script on macOS, Linux, or Windows.","Use this task to run a Bash script on macOS, Linux, or Windows. Note On a Windows host this runs bash from the WSL default distribution. WSL must be installed and the user that the agent runs as must have a distribution setup. WSL is installed on Microsoft-hosted Windows agent images. 
For more information, see Microsoft-hosted agents - Software.",2026-04-02T08:00:00.000Z,reference,configuration,0.8,True,"Task reference lists inputs like targetType, filePath, script, workingDirectory, and env, plus WSL behavior on Windows, which are specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/batch-script-v1?view=azure-pipelines,BatchScript@1,BatchScript@1 - Batch script v1 task,Configure BatchScript@1 task for Windows commands,Run a Windows command or batch script and optionally allow it to change the environment.,"Use this task to run a Windows .bat or .cmd script. Optionally, the .bat or .cmd script can permanently modify environment variables.",2026-04-02T08:00:00.000Z,reference,configuration,0.78,True,"Documents parameters for running .bat/.cmd scripts and environment modification options, which are task-specific configuration settings.",unchanged +https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/bicep-deploy-v0?view=azure-pipelines,BicepDeploy@0,BicepDeploy@0 - Bicep Deploy v0 task,Configure Azure Pipelines BicepDeploy@0 task inputs,Deploy and Manage Azure Resources using Bicep Files. (task version 0),"The Bicep Deploy task is used to deploy and manage Azure resources using Bicep files. Bicep is a domain-specific language (DSL) that uses declarative syntax to deploy Azure resources. This task supports both standard Azure deployments and Azure Deployment Stacks, providing a simplified and more maintainable way to manage your Azure infrastructure as code. The task handles: The task supports:",2026-04-21T12:34:00.000Z,reference,configuration,0.78,True,"Task reference pages list all task inputs, their names, types, allowed values, and defaults for the BicepDeploy@0 Azure Pipelines task. This is product-specific configuration detail (parameters like connectedServiceName, templateFile, deploymentMode, etc.) 
that functions as a configuration reference, not just a tutorial, and includes settings an LLM wouldn't reliably know from training.",new https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/cache-beta-v0?view=azure-pipelines,CacheBeta@0,CacheBeta@0 - Cache (Beta) v0 task,Configure CacheBeta@0 task (deprecated caching),Cache files between runs (task version 0).,"Improve build performance by using this task to cache files, like dependencies, between pipeline runs. This version of the task is deprecated; use Cache@2. Note There is a newer version of this task. Use Cache@2.",2026-04-02T08:00:00.000Z,reference,configuration,0.82,True,"Provides configuration parameters and semantics for the v0 cache task, which are expert configuration details even though deprecated.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/cache-beta-v1?view=azure-pipelines,CacheBeta@1,CacheBeta@1 - Cache (Beta) v1 task,Configure CacheBeta@1 task for pipeline caching,Cache files between runs (task version 1).,"Improve build performance by using this task to cache files, like dependencies, between pipeline runs. Note There is a newer version of this task. Use Cache@2.",2026-04-02T08:00:00.000Z,reference,configuration,0.82,True,"Documents early-version cache task inputs and behavior, including parameter names and usage, which are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/cache-v2?view=azure-pipelines,Cache@2,Cache@2 - Cache v2 task,Configure Cache@2 task to cache pipeline files,Cache files between runs.,"Improve build performance by using this task to cache files, such as dependencies, between pipeline runs. To add the task, search for Cache (cache files between runs) in Classic pipelines or the YAML editor. 
See Cache task: how it works and Reduce build time using caching for specific examples and more details.",2026-04-02T08:00:00.000Z,reference,configuration,0.88,True,"Cache task reference includes parameters like key, restoreKeys, path, and cacheHitVar with their behaviors and constraints, which are detailed configuration options.",unchanged @@ -539,9 +540,9 @@ the typical considerations for running UI tests.",2025-10-27T22:02:00.000Z,conce https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/anti-virus-exclusion?view=azure-devops,Antivirus exclusions,Antivirus scanning exclusions for Azure DevOps - Azure Pipelines,Configure antivirus exclusions for Azure DevOps servers and agents,This article describes the system processes that you should consider excluding from antivirus scanning on computers that are running Azure DevOps Server or self-hosted agents.,"This article provides information about the processes and folders that may have to be excluded from antivirus scanning on computers that are running Team Foundation Server (TFS), Azure DevOps Server or self-hosted agents of Azure DevOps Services. It also provides links to Microsoft Knowledge Base articles that discuss antivirus exclusions that may be defined on servers hosting deployments of Microsoft SQL Server and SharePoint Server that have been integrated with Azure DevOps Server. Original p",2026-03-06T23:36:00.000Z,article,security,0.86,True,"Lists specific Azure DevOps/TFS processes and folders that must be excluded from antivirus scanning, including exact paths and process names. 
These are product-specific security/AV configuration details that follow a symptom-prevention pattern and are unlikely to be known generically.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/review-logs?view=azure-devops,Review logs,Review logs to diagnose pipeline issues - Azure Pipelines,Review Azure Pipelines logs for diagnostics,Learn how to review pipeline diagnostic logs to troubleshoot,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Pipeline logs provide a powerful tool for determining the cause of pipeline failures, and verbose logs can be configured to provide more diagnostic information. A typical starting point is to review the logs in your completed build or release. You can view logs by navigating to the pipeline run summary and selecting the job and task. If a certain task is failing, check the logs for that task. Configure verbose logs to include m",2025-03-25T14:59:00.000Z,troubleshooting,troubleshooting,0.8,True,"Dedicated to reviewing diagnostic logs and configuring verbose logging; includes log locations, how to enable detailed logs, and how to interpret them, which is product-specific troubleshooting guidance.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshoot-azure-web-app-deploy?view=azure-devops,Troubleshoot deployment,Troubleshoot Azure Web App deployment tasks - Azure Pipelines,Troubleshoot Azure Web App deployment tasks in pipelines,"Resolve common errors with Azure Web App and App Service deployment tasks in Azure Pipelines, including ZIP deploy, network connectivity, SSL, Web Deploy, and package issues.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This article helps you resolve common errors that occur when you use the Azure Web App (AzureWebApp@1) or Azure App Service Deploy (AzureRmWebAppDeployment@4) tasks in Azure Pipelines. This is a troubleshooting article. 
To learn more about App Service deployments, see Deploy to Azure App Service by using Azure Pipelines, which covers AzureWebApp@1 and advanced scenarios with AzureRmWebAppDeployment@4. For container-based deployments,",2026-02-24T18:05:00.000Z,troubleshooting,troubleshooting,0.9,True,Explicit troubleshooting guide for AzureWebApp@1 and AzureRmWebAppDeployment@4 with task-specific errors and resolutions.,unchanged -https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshoot-start?view=azure-devops,Pipeline queues but never starts,Troubleshoot pipeline failure to start - Azure Pipelines,Fix Azure Pipelines jobs that never start,Learn how to troubleshoot pipeline starting issues in Azure Pipelines and Team Foundation Server.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 If your pipeline queues but never starts, check the following items. Note The following scenarios won't consume a parallel job: Learn more: How a parallel job is consumed by a pipeline, Add Pre-deployment approvals, Server jobs, Deployment groups",2025-02-27T21:07:00.000Z,troubleshooting,troubleshooting,0.85,True,"Covers queued-but-never-starting pipelines and notes scenarios that don't consume parallel jobs; maps these conditions to configuration fixes, matching troubleshooting criteria.",unchanged -https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshoot-triggers?view=azure-devops,Troubleshoot pipeline triggers,Troubleshoot pipeline triggers - Azure Pipelines,Troubleshoot Azure Pipelines trigger issues,Learn how to troubleshoot pipeline triggers in Azure Pipelines and Team Foundation Server.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 If a pipeline doesn't start at all, check the following common trigger related issues. Note An additional reason that runs may not start is that your organization goes dormant five minutes after the last user signs out of Azure DevOps. 
After that, each of your build pipelines will run one more time. For example, while your organization is dormant:",2025-03-25T14:59:00.000Z,troubleshooting,troubleshooting,0.85,True,"Focuses on pipelines not starting due to trigger problems; includes specific causes (branch filters, path filters, YAML settings, dormant org behavior) and remedies, fitting symptom→cause→solution troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshooting?view=azure-devops,Troubleshoot pipeline runs,Troubleshoot pipeline runs - Azure Pipelines,Troubleshoot Azure Pipelines run failures using logs,"Learn how to troubleshoot pipeline runs in Azure Pipelines using logs, error analysis tools, and common techniques to resolve issues.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 If your pipeline run fails to complete, use the diagnostic information and logs on the pipeline run summary page to troubleshoot the issue. This guide provides instructions for diagnosing pipeline failures using logs, error analysis tools, and common troubleshooting techniques. Learn how to identify root causes and implement solutions to keep your pipelines running smoothly.",2026-02-24T02:03:00.000Z,how-to,troubleshooting,0.85,True,"Dedicated troubleshooting guide for pipeline runs; such pages map symptoms to causes and solutions, reference specific log locations, and sometimes error patterns, matching the troubleshooting criteria.",unchanged +https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshoot-start?view=azure-devops,Pipeline queues but never starts,Troubleshoot pipeline failure to start - Azure Pipelines,Resolve Azure Pipelines queued-but-not-starting problems,Learn how to troubleshoot pipeline starting issues in Azure Pipelines.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 If your pipeline queues but never starts, check the following items. 
Note The following scenarios won't consume a parallel job: Learn more: How a parallel job is consumed by a pipeline, Add Pre-deployment approvals, Server jobs, Deployment groups",2026-04-22T21:02:00.000Z,troubleshooting,troubleshooting,0.86,True,"Targets the specific symptom of pipelines that queue but never start and notes scenarios that don't consume a parallel job. Azure DevOps parallel job consumption rules and related conditions are product-specific and non-trivial, fitting the troubleshooting pattern of mapping symptoms to causes and fixes.",updated +https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshoot-triggers?view=azure-devops,Troubleshoot pipeline triggers,Troubleshoot pipeline triggers - Azure Pipelines,Troubleshoot Azure Pipelines trigger and start issues,Learn how to troubleshoot pipeline triggers in Azure Pipelines.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 If a pipeline doesn't start at all, check the following common trigger related issues. Note An additional reason that runs may not start is that your organization goes dormant five minutes after the last user signs out of Azure DevOps. After that, each of your build pipelines will run one more time. For example, while your organization is dormant:",2026-04-22T21:02:00.000Z,troubleshooting,troubleshooting,0.9,True,"Focused on pipelines that don't start due to trigger problems; includes a product-specific dormant-organization behavior (5 minutes after last sign-out and one more run) and other trigger-related conditions. 
This is symptom → cause → resolution guidance unique to Azure DevOps, matching the troubleshooting category with concrete, non-obvious behavior.",updated +https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshooting?view=azure-devops,Troubleshoot pipeline runs,Troubleshoot pipeline runs - Azure Pipelines,Diagnose and fix Azure Pipelines run failures,"Learn how to troubleshoot pipeline runs in Azure Pipelines using logs, error analysis tools, and common techniques to resolve issues.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 If your pipeline run fails to complete, use the diagnostic information and logs on the pipeline run summary page to troubleshoot the issue. This guide provides instructions for diagnosing pipeline failures using logs, error analysis tools, and common troubleshooting techniques. Learn how to identify root causes and implement solutions to keep your pipelines running smoothly.",2026-04-20T08:00:00.000Z,troubleshooting,troubleshooting,0.86,True,"Dedicated troubleshooting guide for failed pipeline runs; Azure Pipelines has many product-specific error messages and diagnostic views. These pages typically map symptoms (failed runs, specific log patterns) to causes and resolutions using Azure DevOps–specific UI elements and logs, which qualifies as troubleshooting expert knowledge.",updated https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/?view=azure-pipelines,YAML schema,YAML schema reference,Reference all Azure Pipelines YAML schema options,Azure Pipelines YAML schema reference,"The YAML schema reference for Azure Pipelines is a detailed reference for YAML pipelines that lists all supported YAML syntax and their available options. To create a YAML pipeline, start with the pipeline definition. For more information about building YAML pipelines, see Customize your pipeline. The YAML schema reference does not cover tasks. 
For more information about tasks, see the Azure Pipelines tasks index.",2026-04-02T08:00:00.000Z,reference,configuration,0.86,True,"This is a detailed schema reference listing all supported Azure Pipelines YAML syntax and options. It enumerates product-specific keys, allowed values, structures, and defaults that are not generally known from training and function as configuration parameters for pipelines.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/boolean?view=azure-pipelines,boolean,boolean definition,Use boolean values in Azure Pipelines YAML,Represents a boolean value in a pipeline.,"Represents a boolean value in a pipeline. boolean string. Allowed values: true | y | yes | on | false | n | no | off. Azure Pipelines uses any of the previous string values to represent a boolean value in a pipeline. Note This definition is a supporting definition and is not intended for use directly in a pipeline. This article provides the YAML syntax for this supporting type, but does not show usage examples. For more information and examples for using this supporting type, see the following Def",2026-04-02T08:00:00.000Z,reference,configuration,0.9,True,"Defines all accepted string representations for booleans (true|y|yes|on|false|n|no|off), which is precise, product-specific configuration behavior.",unchanged https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/deploy-hook?view=azure-pipelines,deployHook,deployHook definition,Configure deployHook steps for deployments,Used to run steps that deploy your application.,"Used to run steps that deploy your application. Note This definition is a supporting definition and is not intended for use directly in a pipeline. This article provides the YAML syntax for this supporting type, but does not show usage examples. For more information and examples for using this supporting type, see the following Definitions that reference this definition articles. 
Definitions that reference this definition: jobs.deployment.strategy.runOnce, jobs.deployment.strategy.rolling, jobs.deplo",2026-04-02T08:00:00.000Z,reference,configuration,0.78,True,"deployHook schema lists fields and structure for deployment hooks, which are configuration parameters for deployment strategies.",unchanged diff --git a/products/azure-pipelines/report.md b/products/azure-pipelines/report.md index 95282d2b..051aede4 100644 --- a/products/azure-pipelines/report.md +++ b/products/azure-pipelines/report.md @@ -1,12 +1,12 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: - security: 'Securing Azure Pipelines: agent auth, service connections, secrets/Key - Vault, permissions and approvals, secure variables/files, repo access, signing, - and integrating security scans/policies.' - configuration: 'Configuring Azure Pipelines: agents, YAML/classic triggers, stages/jobs/steps, - variables, environments, artifacts, test/analytics, and detailed setup for built-in - tasks and deployment strategies.' + security: 'Securing Azure Pipelines end-to-end: auth for agents and service connections, + secrets/Key Vault, permissions/approvals, secure variables, and integrating code/security + scanning.' + configuration: 'Configuring Azure Pipelines YAML/classic pipelines: agents, triggers, + stages/jobs/steps, variables, environments, artifacts, test/analytics, and detailed + task input references.' deployment: 'Agent setup and deployment guides for Azure Pipelines: installing/hosting agents, configuring CI/CD to VMs, App Service, containers, Kubernetes, databases, and publishing/consuming artifacts.' @@ -14,8 +14,8 @@ category_descriptions: image deprecation, parallel jobs, agent pool concurrency, large package handling, and retention policy configuration.' 
integrations: Patterns and tasks for integrating Azure Pipelines with languages, - tools, secrets, Git/GitHub, ServiceNow, Slack, Key Vault, and external APIs, plus - caching and test automation. + tools, Git/GitHub, Key Vault, ServiceNow, Slack, REST/Functions, and managing + variables, caching, and test automation. architecture-patterns: 'Guidance on end-to-end CI/CD and DevOps architectures for Azure: baseline pipeline patterns, Web App deployment design, and IaaS/VM-focused DevTest and production pipelines.' @@ -23,7 +23,7 @@ category_descriptions: from Jenkins/Travis and from classic UI pipelines to YAML, with patterns, pitfalls, and safe migration steps. troubleshooting: 'Diagnosing and fixing Azure Pipelines issues: service connection/auth - problems, web app deploy failures, triggers, stuck jobs, and using logs to debug + problems, deployment failures, trigger/queue/start issues, and using logs to debug run failures.' best-practices: 'Best practices for faster, reliable pipelines: caching, cross-platform scripts, handling flaky tests, parallel test execution (incl. VSTest), Test Impact @@ -31,12 +31,12 @@ category_descriptions: skill_description: Expert knowledge for Azure Pipelines development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when - configuring YAML pipelines, self-hosted agents, service connections, Key Vault secrets, - or Web App/Kubernetes deploys, and other Azure Pipelines related development tasks. + authoring YAML pipelines, configuring agents/triggers, deploying to App Service/AKS/VMs, + or integrating Git/GitHub, and other Azure Pipelines related development tasks. Not for Azure DevOps (use azure-devops), Azure Boards (use azure-boards), Azure Repos (use azure-repos), Azure Test Plans (use azure-test-plans). 
-use_when: Use when configuring YAML pipelines, self-hosted agents, service connections, - Key Vault secrets, or Web App/Kubernetes deploys, and other Azure Pipelines related +use_when: Use when authoring YAML pipelines, configuring agents/triggers, deploying + to App Service/AKS/VMs, or integrating Git/GitHub, and other Azure Pipelines related development tasks. confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use azure-boards), Azure Repos (use azure-repos), Azure Test Plans (use azure-test-plans). @@ -45,16 +45,16 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a ## Summary -- **Total Pages**: 569 -- **Fetched**: 569 +- **Total Pages**: 570 +- **Fetched**: 570 - **Fetch Failed**: 0 - **Classified**: 513 -- **Unclassified**: 56 +- **Unclassified**: 57 ### Incremental Update -- **New Pages**: 0 -- **Updated Pages**: 1 -- **Unchanged**: 568 +- **New Pages**: 1 +- **Updated Pages**: 14 +- **Unchanged**: 555 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-pipelines/azure-pipelines.csv` @@ -64,21 +64,51 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a |------|-------|------------| | architecture-patterns | 4 | 0.7% | | best-practices | 7 | 1.2% | -| configuration | 323 | 56.8% | +| configuration | 324 | 56.8% | | decision-making | 3 | 0.5% | | deployment | 89 | 15.6% | | integrations | 26 | 4.6% | | limits-quotas | 5 | 0.9% | -| security | 49 | 8.6% | +| security | 48 | 8.4% | | troubleshooting | 7 | 1.2% | -| *(Unclassified)* | 56 | 9.8% | +| *(Unclassified)* | 57 | 10.0% | ## Changes +### New Pages + +- [BicepDeploy@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/bicep-deploy-v0?view=azure-pipelines) + ### Updated Pages -- [Sign up for Azure Pipelines](https://learn.microsoft.com/en-us/azure/devops/pipelines/get-started/pipelines-sign-up?view=azure-devops) - - Updated: 2025-03-13T21:37:00.000Z → 
2026-04-14T01:03:00.000Z +- [Azure Repos Git](https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/azure-repos-git?view=azure-devops) + - Updated: 2026-02-13T08:00:00.000Z → 2026-04-22T08:00:00.000Z +- [Multiple repositories](https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/multi-repo-checkout?view=azure-devops) + - Updated: 2026-03-05T18:06:00.000Z → 2026-04-22T08:00:00.000Z +- [Self-hosted agent authentication options](https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/agent-authentication-options?view=azure-devops) + - Updated: 2025-10-27T22:02:00.000Z → 2026-04-22T08:00:00.000Z +- [Service Principal](https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/service-principal-agent-registration?view=azure-devops) + - Updated: 2024-08-21T13:58:00.000Z → 2026-04-22T21:02:00.000Z +- [Azure Resource Manager service connection](https://learn.microsoft.com/en-us/azure/devops/pipelines/library/connect-to-azure?view=azure-devops) + - Updated: 2026-01-09T08:00:00.000Z → 2026-04-22T08:00:00.000Z +- [Azure Resource Manager special cases](https://learn.microsoft.com/en-us/azure/devops/pipelines/library/azure-resource-manager-alternate-approaches?view=azure-devops) + - Updated: 2026-01-16T08:00:00.000Z → 2026-04-22T21:02:00.000Z +- [Manually configure workload identity service connections](https://learn.microsoft.com/en-us/azure/devops/pipelines/release/configure-workload-identity?view=azure-devops) + - Updated: 2026-01-12T08:00:00.000Z → 2026-04-22T08:00:00.000Z +- [Troubleshoot workload identity service connections](https://learn.microsoft.com/en-us/azure/devops/pipelines/release/troubleshoot-workload-identity?view=azure-devops) + - Updated: 2025-11-10T14:03:00.000Z → 2026-04-22T08:00:00.000Z +- [Use scripts to automate Azure Resource Manager service connections](https://learn.microsoft.com/en-us/azure/devops/pipelines/release/automate-service-connections?view=azure-devops) + - Updated: 2025-05-29T14:53:00.000Z → 
2026-04-22T08:00:00.000Z
+- [Troubleshoot pipeline runs](https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshooting?view=azure-devops)
+  - Updated: 2026-02-24T02:03:00.000Z → 2026-04-20T08:00:00.000Z
+- [Troubleshoot pipeline triggers](https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshoot-triggers?view=azure-devops)
+  - Updated: 2025-03-25T14:59:00.000Z → 2026-04-22T21:02:00.000Z
+- [Pipeline queues but never starts](https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshoot-start?view=azure-devops)
+  - Updated: 2025-02-27T21:07:00.000Z → 2026-04-22T21:02:00.000Z
+- [Manage service connections](https://learn.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops)
+  - Updated: 2025-11-03T14:25:00.000Z → 2026-04-22T08:00:00.000Z
+- [Secure access to Azure repositories from pipelines](https://learn.microsoft.com/en-us/azure/devops/pipelines/security/secure-access-to-repos?view=azure-devops)
+  - Updated: 2025-10-30T14:05:00.000Z → 2026-04-22T08:00:00.000Z
 
 ## Classified Pages
 
@@ -87,7 +117,6 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a
 | [NuGetCommand@2](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/nuget-command-v2?view=azure-pipelines) | configuration | 0.92 | NuGetCommand@2 reference defines many task inputs (command, feeds, versioning, etc.) with exact names and defaults, which are expert configuration details for Azure DevOps. |
 | [Use predefined variables](https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops) | configuration | 0.92 | Comprehensive list of predefined variables with their names and behaviors; effectively a configuration surface reference unique to Azure Pipelines. |
 | [AndroidSigning@3](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/android-signing-v3?view=azure-pipelines) | configuration | 0.90 | Task reference defines all parameters (keystore, alias, alignment options) for signing and aligning APKs. |
-| [Azure Resource Manager service connection](https://learn.microsoft.com/en-us/azure/devops/pipelines/library/connect-to-azure?view=azure-devops) | security | 0.90 | Details multiple auth options, scopes, and recommended workload identity federation, which are product-specific security configurations. |
 | [AzureAppConfigurationExport@10](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/azure-app-configuration-export-v10?view=azure-pipelines) | configuration | 0.90 | AzureAppConfigurationExport@10 reference lists task parameters (azureSubscription, configStoreName, keyFilter, label, prefix, exportMethod, etc.) with specific semantics and defaults, fitting configuration patterns. |
 | [AzureAppConfigurationImport@10](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/azure-app-configuration-import-v10?view=azure-pipelines) | configuration | 0.90 | AzureAppConfigurationImport@10 documents inputs like configurationFile, format, separator, label, contentType, prefix, strict, etc., with allowed values and behavior, which are product-specific configuration options. |
 | [AzureAppServiceManage@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/azure-app-service-manage-v0?view=azure-pipelines) | configuration | 0.90 | AzureAppServiceManage@0 reference lists operations (Start, Stop, Restart, SlotSwap, SlotDelete, InstallExtensions, EnableContinuousMonitoring) and related parameters (ResourceGroupName, WebAppName, SourceSlot, etc.), which are configuration settings. |
@@ -109,7 +138,6 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a
 | [Kubernetes@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/kubernetes-v0?view=azure-pipelines) | configuration | 0.90 | Lists all configuration options for the v0 Kubernetes task, including service connection and command parameters, which are expert configuration details. |
 | [Kubernetes@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/kubernetes-v1?view=azure-pipelines) | configuration | 0.90 | Task reference with inputs for kubeconfig, connection types, namespace, command, arguments, and authentication, which are concrete configuration parameters. |
 | [Manually configure an ARM service connection with a secret](https://learn.microsoft.com/en-us/azure/devops/pipelines/release/configure-app-secret?view=azure-devops) | security | 0.90 | Manual secret-based ARM connection setup includes app registration fields, scopes, and warnings about secret rotation—detailed security config. |
-| [Manually configure workload identity service connections](https://learn.microsoft.com/en-us/azure/devops/pipelines/release/configure-workload-identity?view=azure-devops) | security | 0.90 | Explains manual setup for workload identity with managed identity or app registration, including specific IDs, roles, and settings. |
 | [Maven@2](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/maven-v2?view=azure-pipelines) | configuration | 0.90 | Documents parameters for Maven build/test/deploy in Azure Pipelines, which are detailed, product-specific configuration settings. |
 | [Maven@3](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/maven-v3?view=azure-pipelines) | configuration | 0.90 | Task reference listing all configuration options for Maven builds (POM file, goals, options, JDK), representing expert configuration knowledge. |
 | [Maven@4](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/maven-v4?view=azure-pipelines) | configuration | 0.90 | Provides detailed inputs for goals, options, JDK selection, test results, code coverage, and publishing, which are concrete configuration parameters unique to this task. |
@@ -121,7 +149,7 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a
 | [PublishPipelineArtifact@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/publish-pipeline-artifact-v1?view=azure-pipelines) | configuration | 0.90 | Defines inputs like targetPath, artifactName, and publishLocation with constraints and on-premises limitations, which are detailed configuration and deployment constraints. |
 | [Troubleshoot Azure Resource Manager service connections](https://learn.microsoft.com/en-us/azure/devops/pipelines/release/azure-rm-endpoint?view=azure-devops) | troubleshooting | 0.90 | Provides symptom → cause → fix guidance with ARM-specific errors and diagnostic steps unique to Azure Pipelines service connections. |
 | [Troubleshoot deployment](https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshoot-azure-web-app-deploy?view=azure-devops) | troubleshooting | 0.90 | Explicit troubleshooting guide for AzureWebApp@1 and AzureRmWebAppDeployment@4 with task-specific errors and resolutions. |
-| [Troubleshoot workload identity service connections](https://learn.microsoft.com/en-us/azure/devops/pipelines/release/troubleshoot-workload-identity?view=azure-devops) | troubleshooting | 0.90 | Explicit troubleshooting article; typically maps specific error messages and causes to resolutions for workload identity connections. |
+| [Troubleshoot pipeline triggers](https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshoot-triggers?view=azure-devops) | troubleshooting | 0.90 | Focused on pipelines that don't start due to trigger problems; includes a product-specific dormant-organization behavior (5 minutes after last sign-out and one more run) and other trigger-related conditions. This is symptom → cause → resolution guidance unique to Azure DevOps, matching the troubleshooting category with concrete, non-obvious behavior. |
 | [UseDotNet@2](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/use-dotnet-v2?view=azure-pipelines) | configuration | 0.90 | Task reference lists inputs like packageType, version, includePreviewVersions, performMultiLevelLookup, and proxy settings, with defaults and allowed values. |
 | [UseNode@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/use-node-v1?view=azure-pipelines) | configuration | 0.90 | Documents inputs for versionSpec, checkLatest, nodejsMirror, and proxy configuration; these are specific configuration parameters for Node.js setup. |
 | [UsePythonVersion@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/use-python-version-v0?view=azure-pipelines) | configuration | 0.90 | Task reference lists inputs for versionSpec, addToPath, architecture, and pre-release handling, which are detailed configuration options. |
@@ -224,6 +252,7 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a
 | [Notation@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/notation-v0?view=azure-pipelines) | configuration | 0.86 | Task reference pages list all inputs/outputs with exact parameter names, allowed values, and defaults for the Notation v0 task, which are product-specific configuration details. |
 | [NuGetAuthenticate@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/nuget-authenticate-v1?view=azure-pipelines) | configuration | 0.86 | Documents inputs like feeds, service connections, and tool selection with required versions, which are product-specific configuration parameters. |
 | [NuGetToolInstaller@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/nuget-tool-installer-v1?view=azure-pipelines) | configuration | 0.86 | Describes inputs for selecting and caching specific NuGet versions, including version spec formats and behavior, which are configuration details. |
+| [Pipeline queues but never starts](https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshoot-start?view=azure-devops) | troubleshooting | 0.86 | Targets the specific symptom of pipelines that queue but never start and notes scenarios that don't consume a parallel job. Azure DevOps parallel job consumption rules and related conditions are product-specific and non-trivial, fitting the troubleshooting pattern of mapping symptoms to causes and fixes. |
 | [PublishCodeCoverageResults@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/publish-code-coverage-results-v1?view=azure-pipelines) | configuration | 0.86 | Documents specific inputs and allowed values for publishing Cobertura or JaCoCo coverage, which are product-specific configuration parameters. |
 | [PublishCodeCoverageResults@2](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/publish-code-coverage-results-v2?view=azure-pipelines) | configuration | 0.86 | Task reference includes parameters for coverage tool type, summary file, report directory, and fail behavior, which are configuration details. |
 | [PublishTestResults@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/publish-test-results-v1?view=azure-pipelines) | configuration | 0.86 | Documents concrete inputs and allowed values for publishing test results, which are detailed configuration options. |
@@ -237,6 +266,7 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a
 | [SonarQubeAnalyze@8](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/sonar-qube-analyze-v8?view=azure-pipelines) | configuration | 0.86 | Lists SonarQube-specific task inputs (scannerMode, projectKey, extraProperties, etc.) and their allowed values, which are detailed configuration parameters. |
 | [SonarQubePrepare@8](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/sonar-qube-prepare-v8?view=azure-pipelines) | configuration | 0.86 | SonarQubePrepare@8 reference defines inputs for server connection, project key, scanner mode, and additional properties, which are product-specific configuration parameters. |
 | [SonarQubePublish@8](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/sonar-qube-publish-v8?view=azure-pipelines) | configuration | 0.86 | Task reference defines inputs for selecting the SonarQube service, polling behavior, and timeout, which are product-specific configuration parameters. |
+| [Troubleshoot pipeline runs](https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshooting?view=azure-devops) | troubleshooting | 0.86 | Dedicated troubleshooting guide for failed pipeline runs; Azure Pipelines has many product-specific error messages and diagnostic views. These pages typically map symptoms (failed runs, specific log patterns) to causes and resolutions using Azure DevOps–specific UI elements and logs, which qualifies as troubleshooting expert knowledge. |
 | [UniversalPackages@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/universal-packages-v0?view=azure-pipelines) | configuration | 0.86 | Lists inputs for feed, package name, version, download vs publish mode, and patterns; these are detailed configuration options for Universal Packages. |
 | [VSTest@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/vstest-v1?view=azure-pipelines) | configuration | 0.86 | Deprecated version; still provides detailed parameter reference for configuring test runs. |
 | [YAML schema](https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/?view=azure-pipelines) | configuration | 0.86 | This is a detailed schema reference listing all supported Azure Pipelines YAML syntax and options. It enumerates product-specific keys, allowed values, structures, and defaults that are not generally known from training and function as configuration parameters for pipelines. |
@@ -264,14 +294,10 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a
 | [KubeloginInstaller@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/kubelogin-installer-v0?view=azure-pipelines) | configuration | 0.85 | Task reference listing inputs for installing kubelogin and adding it to PATH, which are product-specific configuration settings. |
 | [Library permissions](https://learn.microsoft.com/en-us/azure/devops/pipelines/policies/permissions?view=azure-devops) | security | 0.85 | Duplicate of index 2; same security/permissions content with product-specific RBAC details. |
 | [Manage permissions](https://learn.microsoft.com/en-us/azure/devops/pipelines/policies/permissions?view=azure-devops) | security | 0.85 | Security article about permissions hierarchy, groups, and resource access; Azure DevOps security docs typically list specific roles, permission names, and scopes that are product-specific. |
-| [Manage service connections](https://learn.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops) | security | 0.85 | Service connections article includes connection types, authentication schemes, and permission scopes; these are product-specific security and integration configuration details. |
 | [Pipeline permissions](https://learn.microsoft.com/en-us/azure/devops/pipelines/policies/permissions?view=azure-devops) | security | 0.85 | Duplicate of index 2; same security/permissions content with product-specific RBAC details. |
-| [Pipeline queues but never starts](https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshoot-start?view=azure-devops) | troubleshooting | 0.85 | Covers queued-but-never-starting pipelines and notes scenarios that don't consume parallel jobs; maps these conditions to configuration fixes, matching troubleshooting criteria. |
 | [Release pipeline permissions](https://learn.microsoft.com/en-us/azure/devops/pipelines/policies/permissions?view=azure-devops) | security | 0.85 | Duplicate of index 2; same security/permissions content with product-specific RBAC details. |
 | [Service connection permissions](https://learn.microsoft.com/en-us/azure/devops/pipelines/policies/permissions?view=azure-devops) | security | 0.85 | Duplicate of index 2; same security/permissions content with product-specific RBAC details. |
 | [Task group permissions](https://learn.microsoft.com/en-us/azure/devops/pipelines/policies/permissions?view=azure-devops) | security | 0.85 | Duplicate of index 2; same security/permissions content with product-specific RBAC details. |
-| [Troubleshoot pipeline runs](https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshooting?view=azure-devops) | troubleshooting | 0.85 | Dedicated troubleshooting guide for pipeline runs; such pages map symptoms to causes and solutions, reference specific log locations, and sometimes error patterns, matching the troubleshooting criteria. |
-| [Troubleshoot pipeline triggers](https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshoot-triggers?view=azure-devops) | troubleshooting | 0.85 | Focuses on pipelines not starting due to trigger problems; includes specific causes (branch filters, path filters, YAML settings, dormant org behavior) and remedies, fitting symptom→cause→solution troubleshooting. |
 | [Use secrets from Azure Key Vault](https://learn.microsoft.com/en-us/azure/devops/pipelines/release/azure-key-vault?view=azure-devops) | security | 0.85 | Covers creating Key Vaults, storing secrets, and wiring them into pipelines; includes service connection setup, secret mapping, and permission scopes, which are concrete security configuration patterns. |
 | [XcodePackageiOS@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/xcode-package-ios-v0?view=azure-pipelines) | configuration | 0.85 | Even though deprecated, the task reference defines specific inputs and options for XcodePackageiOS@0, including parameter names and allowed values, which are detailed configuration settings. |
 | [AzureCLI@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/azure-cli-v0?view=azure-pipelines) | configuration | 0.84 | AzureCLI@0 reference similarly enumerates task inputs for running shell/batch scripts with Azure CLI, which are product-specific configuration details. |
@@ -303,6 +329,7 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a
 | [SonarQubePrepare@7](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/sonar-qube-prepare-v7?view=azure-pipelines) | configuration | 0.84 | Provides detailed task inputs and allowed values for preparing SonarQube analysis, which are configuration details. |
 | [SonarQubePublish@6](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/sonar-qube-publish-v6?view=azure-pipelines) | configuration | 0.84 | Provides task-specific inputs and options for the v6 publish task, which are configuration parameters. |
 | [SonarQubePublish@7](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/sonar-qube-publish-v7?view=azure-pipelines) | configuration | 0.84 | Lists concrete parameters and behavior for publishing Quality Gate results, which are configuration details. |
+| [Troubleshoot workload identity service connections](https://learn.microsoft.com/en-us/azure/devops/pipelines/release/troubleshoot-workload-identity?view=azure-devops) | troubleshooting | 0.84 | Explicitly a troubleshooting guide for workload identity service connections, mapping common issues to causes and fixes, and likely including specific error messages and diagnostic steps. This matches the troubleshooting pattern of symptom → cause → solution. |
 | [extends](https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/extends?view=azure-pipelines) | configuration | 0.84 | Describes the 'extends' YAML definition, including its specific keys and how to reference templates. These are product-specific configuration parameters and schema rules, not generic concepts. |
 | [jobs](https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/jobs?view=azure-pipelines) | configuration | 0.84 | Specifies the 'jobs' YAML definition, including structure and allowed properties. This is schema/configuration reference content unique to Azure Pipelines. |
 | [jobs.deployment](https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/jobs-deployment?view=azure-pipelines) | configuration | 0.84 | Documents the 'jobs.deployment' YAML construct and its specific fields for deployment jobs. These are concrete configuration options and schema details for Azure Pipelines. |
@@ -356,7 +383,6 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a
 | [resources.webhooks.webhook](https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/resources-webhooks-webhook?view=azure-pipelines) | configuration | 0.82 | resources.webhooks.webhook schema including connection and event settings is detailed configuration. |
 | [Add users to contribute to pipelines](https://learn.microsoft.com/en-us/azure/devops/pipelines/policies/set-permissions?view=azure-devops) | security | 0.80 | Covers permissions for build/release pipelines and resources; typically lists specific security roles, permission names, and scopes, which are product-specific security configuration details. |
 | [AdvancedSecurity-Publish@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/advanced-security-publish-v1?view=azure-pipelines) | security | 0.80 | AdvancedSecurity-Publish@1 task reference covers publishing security scan results with specific parameters and integration behavior. |
-| [Azure Resource Manager special cases](https://learn.microsoft.com/en-us/azure/devops/pipelines/library/azure-resource-manager-alternate-approaches?view=azure-devops) | security | 0.80 | Covers agent-assigned managed identity and publish profile scenarios with specific parameters and security implications. |
 | [AzurePolicyCheckGate@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/azure-policy-check-gate-v0?view=azure-pipelines) | security | 0.80 | Task reference includes parameters for connecting to Azure Policy, specifying scopes and behavior in pre/post-deployment gates, which are security/compliance configuration details. |
 | [AzureVmssDeployment@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/azure-vmss-deployment-v0?view=azure-pipelines) | deployment | 0.80 | Documents configuration inputs for the v0 VMSS deployment task, including image and scale set parameters, which are product-specific deployment configuration options. |
 | [Bash@3](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/bash-v3?view=azure-pipelines) | configuration | 0.80 | Task reference lists inputs like targetType, filePath, script, workingDirectory, and env, plus WSL behavior on Windows, which are specific configuration details. |
@@ -381,6 +407,7 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a
 | [Gradle@3](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/gradle-v3?view=azure-pipelines) | configuration | 0.80 | The v3 task page documents configuration parameters for running Gradle via wrapper in pipelines, which are specific build configuration settings. |
 | [Manage signing certificates](https://learn.microsoft.com/en-us/azure/devops/pipelines/apps/mobile/app-signing?view=azure-devops) | security | 0.80 | Describes how Azure Pipelines manages certificates and provisioning profiles for Android/iOS; includes secure file handling and signing configuration, which are product-specific security and configuration details. |
 | [Manage variables in variable groups (CLI)](https://learn.microsoft.com/en-us/azure/devops/pipelines/scripts/cli/pipeline-variable-group-secret-nonsecret-variables?view=azure-devops) | integrations | 0.80 | Provides Azure DevOps CLI commands and parameters for creating and managing secret/nonsecret variables in variable groups—an integration/API usage reference. |
+| [Manually configure workload identity service connections](https://learn.microsoft.com/en-us/azure/devops/pipelines/release/configure-workload-identity?view=azure-devops) | security | 0.80 | Explains manual setup of Azure Resource Manager workload identity service connections, including choosing between managed identity and app registration, and configuring identity/permissions. This is detailed, product-specific security configuration content. |
 | [NuGetPackager@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/nuget-packager-v0?view=azure-pipelines) | configuration | 0.80 | Lists task-specific inputs for packing NuGet packages, including parameter names and defaults, which are concrete configuration options. |
 | [NuGetPublisher@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/nuget-publisher-v0?view=azure-pipelines) | configuration | 0.80 | Documents parameters for publishing NuGet packages via the deprecated task, which are specific configuration settings. |
 | [PackerBuild@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/packer-build-v0?view=azure-pipelines) | deployment | 0.80 | Lists configuration parameters for the v0 Packer build task and notes lack of workflow identity federation support, which are specific deployment constraints and settings. |
@@ -395,9 +422,7 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a
 | [RunVisualStudioTestsusingTestAgent@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/run-visual-studio-testsusing-test-agent-v1?view=azure-pipelines) | configuration | 0.80 | Task reference for running tests via Test Agent; includes inputs for machines, test assemblies, test settings, and execution options. |
 | [Self-hosted Linux agents](https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/linux-agent?view=azure-devops) | deployment | 0.80 | Platform-specific agent deployment article typically includes required packages, commands, and version/OS support details unique to Linux agents. |
 | [Self-hosted Windows agents](https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/windows-agent?view=azure-devops) | deployment | 0.80 | Windows agent article typically documents required components, services, and supported Windows versions for running the agent. |
-| [Self-hosted agent authentication options](https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/agent-authentication-options?view=azure-devops) | security | 0.80 | Details supported auth mechanisms and their scopes/usage during agent registration, which are product-specific security configurations. |
 | [Self-hosted macOS agents](https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/osx-agent?view=azure-devops) | deployment | 0.80 | macOS agent deployment guidance usually has specific Xcode/tooling requirements and OS-version constraints not generally known. |
-| [Service Principal](https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/service-principal-agent-registration?view=azure-devops) | security | 0.80 | Service principal registration includes specific parameters, required permissions, and minimum agent version (3.227.1), which are security config details. |
 | [ServiceFabricPowerShell@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/service-fabric-powershell-v1?view=azure-pipelines) | configuration | 0.80 | Documents inputs for connecting to a Service Fabric cluster and running PowerShell scripts, including connection endpoint and security settings. |
 | [ServiceFabricUpdateAppVersions@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/service-fabric-update-app-versions-v1?view=azure-pipelines) | configuration | 0.80 | Task reference describes inputs controlling version suffixes, manifest paths, and update behavior for Service Fabric app packages. |
 | [ServiceFabricUpdateManifests@2](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/service-fabric-update-manifests-v2?view=azure-pipelines) | configuration | 0.80 | Documents parameters for updating Service Fabric application and service manifests (e.g., endpoints, ports, versions), which are configuration details. |
@@ -407,7 +432,6 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a
 | [SqlServerDacpacDeployment@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/sql-server-dacpac-deployment-v1?view=azure-pipelines) | deployment | 0.80 | Documents inputs for DACPAC-based SQL Server deployment (connection string, authentication, deployment options), which are deployment configuration details. |
 | [TwineAuthenticate@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/twine-authenticate-v1?view=azure-pipelines) | configuration | 0.80 | The TwineAuthenticate@1 task reference describes concrete task inputs and output variables (for example, PYPIRC_PATH, feed/endpoint selection, auth scopes) and how to pass them to twine via specific flags. These are detailed configuration parameters and patterns unique to Azure Pipelines, so it belongs to configuration. |
 | [Use a script to update project security settings](https://learn.microsoft.com/en-us/azure/devops/pipelines/security/project-security-script?view=azure-devops) | security | 0.80 | Shows using Azure DevOps REST API and PowerShell to update pipeline security settings; contains specific setting names, REST endpoints, and parameters unique to Azure DevOps. |
-| [Use scripts to automate Azure Resource Manager service connections](https://learn.microsoft.com/en-us/azure/devops/pipelines/release/automate-service-connections?view=azure-devops) | integrations | 0.80 | Shows scripting patterns and parameters for creating ARM service connections, including API/CLI fields and constraints specific to this integration. |
 | [onFailureHook](https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/on-failure-hook?view=azure-pipelines) | configuration | 0.80 | onFailureHook schema lists configuration fields for steps that run on failure, which is product-specific hook configuration. |
 | [onSuccessHook](https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/on-success-hook?view=azure-pipelines) | configuration | 0.80 | onSuccessHook schema defines configuration for steps that run on success, including structure and fields. |
 | [onSuccessOrFailureHook](https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/on-success-or-failure-hook?view=azure-pipelines) | configuration | 0.80 | onSuccessOrFailureHook schema documents configuration for steps that run regardless of outcome, with specific YAML fields. |
@@ -432,6 +456,7 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a
 | [AzureSpringCloud@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/azure-spring-cloud-v0?view=azure-pipelines) | deployment | 0.78 | Task reference for Azure Spring Apps deployment includes deployment-specific parameters and notes plan support (Basic/Standard/Enterprise), which are product-specific deployment configuration details. |
 | [AzureStaticWebApp@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/azure-static-web-app-v0?view=azure-pipelines) | deployment | 0.78 | The AzureStaticWebApp@0 task reference includes task-specific inputs, environment settings, and deployment behavior for Static Web Apps, which are detailed deployment configurations unique to this product. |
 | [BatchScript@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/batch-script-v1?view=azure-pipelines) | configuration | 0.78 | Documents parameters for running .bat/.cmd scripts and environment modification options, which are task-specific configuration settings. |
+| [BicepDeploy@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/bicep-deploy-v0?view=azure-pipelines) | configuration | 0.78 | Task reference pages list all task inputs, their names, types, allowed values, and defaults for the BicepDeploy@0 Azure Pipelines task. This is product-specific configuration detail (parameters like connectedServiceName, templateFile, deploymentMode, etc.) that functions as a configuration reference, not just a tutorial, and includes settings an LLM wouldn't reliably know from training. |
 | [CMake@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/cmake-v1?view=azure-pipelines) | configuration | 0.78 | Provides task inputs such as workingDirectory, cmakeArgs, configuration, and clean, which are specific configuration parameters for CMake builds in Azure Pipelines. |
 | [CargoAuthenticate@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/cargo-authenticate-v0?view=azure-pipelines) | configuration | 0.78 | The task reference page lists task-specific inputs and parameters (such as input names, types, and defaults) for CargoAuthenticate@0 in Azure Pipelines. These are product-specific configuration details that function as a parameter reference, which fits the configuration sub-skill. It goes beyond a simple tutorial by documenting the exact task schema that an LLM is unlikely to know from training. |
 | [CmdLine@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/cmd-line-v1?view=azure-pipelines) | configuration | 0.78 | Documents v1 command line task inputs (filename, arguments, modifyEnvironment), which are product-specific configuration details. |
@@ -451,6 +476,7 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a
 | [NuGet@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/nuget-v0?view=azure-pipelines) | configuration | 0.78 | Even though deprecated, the task reference still lists concrete parameters and behaviors for NuGet@0, which are specific configuration options. |
 | [PackerBuild@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/packer-build-v1?view=azure-pipelines) | deployment | 0.78 | Task reference for building images with Packer includes inputs like templatePath, connectedServiceName, and variables mapping, which are deployment/image-build configuration details. |
 | [PyPIPublisher@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/py-pi-publisher-v0?view=azure-pipelines) | configuration | 0.78 | Task reference pages for Azure Pipelines typically list all task inputs, their names, allowed values, defaults, and behaviors (for example: packageDirectory, packageName, versioning options, Twine arguments). These are product-specific configuration parameters not inferable from general knowledge, matching the configuration sub-skill definition. |
+| [Service Principal](https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/service-principal-agent-registration?view=azure-devops) | security | 0.78 | Provides step-by-step, product-specific instructions for using a service principal to register a self-hosted agent, including required permissions, parameter names, and version-specific behavior (agent version 3.227.1 and SP option). This is detailed security/auth configuration. |
 | [ServiceFabricComposeDeploy@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/service-fabric-compose-deploy-v0?view=azure-pipelines) | deployment | 0.78 | Task reference for deploying Docker Compose apps to Service Fabric; includes deployment-specific parameters like compose file, registry credentials, and cluster connection. |
 | [ServiceFabricDeploy@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/service-fabric-deploy-v1?view=azure-pipelines) | deployment | 0.78 | Service Fabric deployment task reference with inputs for connection endpoint, publish profile, upgrade mode, and health checks; these are deployment-specific configuration details. |
 | [Set variables in scripts](https://learn.microsoft.com/en-us/azure/devops/pipelines/process/set-variables-scripts?view=azure-devops) | integrations | 0.78 | Shows how to use task.setvariable logging command from Bash/PowerShell to interact with the pipeline runtime—product-specific integration and command patterns. |
@@ -467,13 +493,14 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a
 | [resources.repositories](https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/resources-repositories?view=azure-pipelines) | configuration | 0.78 | resources.repositories schema for external repositories is product-specific configuration. |
 | [resources.webhooks](https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/resources-webhooks?view=azure-pipelines) | configuration | 0.78 | resources.webhooks list schema is configuration reference for webhook integrations. |
 | [stages.stage](https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/stages-stage?view=azure-pipelines) | configuration | 0.78 | The YAML schema reference for stages includes product-specific configuration parameters (such as dependsOn, condition, displayName, variables, and other stage-level keys) with their allowed structures and behaviors. This is detailed schema/configuration knowledge unique to Azure DevOps Pipelines that isn't just conceptual and is unlikely to be fully known from training data. |
+| [Azure Resource Manager service connection](https://learn.microsoft.com/en-us/azure/devops/pipelines/library/connect-to-azure?view=azure-devops) | security | 0.76 | Covers concrete authentication options (workload identity federation, managed identity, service principal, etc.) for Azure Resource Manager service connections, including how to set them up and required permissions. This is product-specific security and identity configuration rather than a generic overview. |
+| [Azure Resource Manager special cases](https://learn.microsoft.com/en-us/azure/devops/pipelines/library/azure-resource-manager-alternate-approaches?view=azure-devops) | security | 0.76 | Details specific configurations for ARM service connections using agent-assigned managed identity, publish profiles, and app registrations with secrets. Includes product-specific auth flows and permission requirements, fitting the security/identity configuration category. |
 | [AzureTestPlan@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/azure-test-plan-v0?view=azure-pipelines) | configuration | 0.76 | Task reference for Azure Test Plans includes specific inputs (testSelector, testPlan, testSuite, searchFolder, etc.) and their valid values, which are configuration details. |
 | [ContainerStructureTest@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/container-structure-test-v0?view=azure-pipelines) | deployment | 0.76 | Task reference includes parameters for running container-structure-test (config file path, image name, test types), which are deployment/test configuration details for container images. |
 | [DecryptFile@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/decrypt-file-v1?view=azure-pipelines) | configuration | 0.76 | Task reference lists parameters such as secureFile, cipher, and passphrase variables for OpenSSL decryption, which are specific configuration options. |
 | [DeployVisualStudioTestAgent@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/deploy-visual-studio-test-agent-v1?view=azure-pipelines) | deployment | 0.76 | Similar to v2; documents parameters for deploying and configuring Test Agent across machines, which are deployment details. |
 | [DeployVisualStudioTestAgent@2](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/deploy-visual-studio-test-agent-v2?view=azure-pipelines) | deployment | 0.76 | Task reference for deploying Visual Studio Test Agent to machines; includes deployment-specific inputs like machine group, credentials, and installation options. |
 | [jobs.template](https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/jobs-template?view=azure-pipelines) | configuration | 0.76 | Schema reference for jobs.template including parameters and usage is detailed configuration for Azure Pipelines templates. 
| -| [Secure access to Azure repositories from pipelines](https://learn.microsoft.com/en-us/azure/devops/pipelines/security/secure-access-to-repos?view=azure-devops) | security | 0.75 | Describes how build and release pipelines securely access repositories; likely includes specific permission models, service connection types, and configuration options unique to Azure DevOps. | | [Slack](https://learn.microsoft.com/en-us/azure/devops/pipelines/integrations/slack?view=azure-devops) | integrations | 0.75 | Describes using the Azure Pipelines Slack app, managing subscriptions, and supported events; includes product-specific configuration options and constraints (e.g., service-only availability). | | [Variables and parameters](https://learn.microsoft.com/en-us/azure/devops/pipelines/security/inputs?view=azure-devops) | security | 0.75 | Focuses on safely accepting user input in Azure Pipelines; expected to include product-specific variable/parameter behaviors, constraints, and secure usage patterns. | | [ContainerBuild@0](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/container-build-v0?view=azure-pipelines) | deployment | 0.74 | Even though the summary is brief, the reference page lists task inputs (e.g., dockerFile, imageName, arguments, containerRegistry), which are deployment configuration details for container builds. | @@ -481,6 +508,8 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a | [Use secure files](https://learn.microsoft.com/en-us/azure/devops/pipelines/library/secure-files?view=azure-devops) | security | 0.74 | The secure files article covers how to store and authorize access to sensitive files in Azure Pipelines, including product-specific security behaviors such as encryption at rest, pipeline authorization, and likely permission settings or scopes for using secure files. These are concrete, product-specific security configuration patterns rather than generic security concepts. 
| | [Chef@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/chef-v1?view=azure-pipelines) | deployment | 0.72 | Documents inputs for editing Chef environment attributes from pipelines (server URL, credentials, environment, JSON attributes), which are deployment/configuration details specific to Chef integration. | | [ChefKnife@1](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/chef-knife-v1?view=azure-pipelines) | deployment | 0.72 | Task reference lists parameters for executing Knife scripts on a Chef workstation from pipelines, which are specific deployment/integration configuration options. | +| [Secure access to Azure repositories from pipelines](https://learn.microsoft.com/en-us/azure/devops/pipelines/security/secure-access-to-repos?view=azure-devops) | security | 0.72 | The article provides product-specific security guidance for Azure Pipelines accessing repos, including how to configure secure access between pipelines and repositories. It focuses on secure access patterns and configuration details unique to Azure DevOps (such as how pipelines authenticate to repos and how access is controlled), which qualifies as security expert knowledge rather than a generic overview. | +| [Self-hosted agent authentication options](https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/agent-authentication-options?view=azure-devops) | security | 0.72 | Describes concrete authentication options for registering Azure Pipelines self-hosted agents, including specific token types, scopes, and when each is appropriate. This is product-specific security configuration (auth choices and required permissions), not just conceptual guidance. | | [About agents & agent pools](https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/agents?view=azure-devops) | configuration | 0.70 | Explains different agent types, how jobs run, and how to use agents; includes product-specific agent configuration and behavior details. 
| | [Access Azure DevOps with Microsoft Entra workload identity](https://learn.microsoft.com/en-us/azure/devops/pipelines/library/add-devops-entra-service-connection?view=azure-devops) | security | 0.70 | The page describes creating an Azure DevOps service connection using Microsoft Entra workload identity federation. This involves product-specific security configuration: service connection types, Entra workload identity/federated credential setup, and how service principals or managed identities are authorized to access Azure DevOps without PATs. These are concrete, product-specific auth and RBAC-style configurations that qualify as expert security knowledge. | | [Access private key vaults from your pipeline](https://learn.microsoft.com/en-us/azure/devops/pipelines/release/key-vault-access?view=azure-devops) | integrations | 0.70 | Describes accessing private Key Vaults restricted to VNets from Azure Pipelines; likely includes product-specific configuration (service connections, firewall/VNet integration, identity) and parameter details. | @@ -522,6 +551,7 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a | [Manage variable groups](https://learn.microsoft.com/en-us/azure/devops/pipelines/library/variable-groups?view=azure-devops) | security | 0.70 | Covers variable groups as protected resources, including approvals, checks, and pipeline permissions—product-specific security and access control configuration. | | [Migrate Classic pipelines to YAML](https://learn.microsoft.com/en-us/azure/devops/pipelines/release/from-classic-pipelines?view=azure-devops) | decision-making | 0.70 | Migration guidance from Classic to YAML; includes which pipeline types can be exported, limitations (e.g., release pipelines), and likely decision points on how to restructure pipelines. 
| | [Migrate from Jenkins](https://learn.microsoft.com/en-us/azure/devops/pipelines/migrate/from-jenkins?view=azure-devops) | decision-making | 0.70 | Migration guide comparing Jenkins and Azure Pipelines; likely includes mapping of concepts, configuration differences, and recommendations for when/how to move workloads. | +| [Multiple repositories](https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/multi-repo-checkout?view=azure-devops) | configuration | 0.70 | Multi-repo checkout in YAML pipelines typically involves product-specific configuration syntax and parameters (for example, multiple checkout steps, repository resource definitions, and settings like persistCredentials, clean, path). This is configuration-focused, with concrete parameter names and allowed values that are specific to Azure Pipelines and not just generic Git knowledge. | | [Node.js runners](https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/nodejs-runners?view=azure-devops) | configuration | 0.70 | Describes which Node.js versions ship with the agent and timelines (e.g., Node 24 in 2026), which are product-specific runtime configuration details. | | [Pipeline completion triggers](https://learn.microsoft.com/en-us/azure/devops/pipelines/process/pipeline-triggers?view=azure-devops) | configuration | 0.70 | Details how to define pipeline triggers based on other pipelines’ completion, including YAML schema/parameters unique to Azure Pipelines. | | [Pipeline options for Git repositories](https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/pipeline-options-for-git?view=azure-devops) | configuration | 0.70 | Describes multiple options under Get Sources, including advanced settings; this page typically lists specific option names and behaviors, fitting configuration reference. 
| @@ -548,6 +578,7 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a | [Use a canary deployment strategy](https://learn.microsoft.com/en-us/azure/devops/pipelines/ecosystems/kubernetes/canary-demo?view=azure-devops) | deployment | 0.70 | Step-by-step canary deployment using Kubernetes manifest task and canary strategy with product-specific task configuration and workflow. | | [Use a self-signed certificate](https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/certificate?view=azure-devops-server) | security | 0.70 | Explains certificate configuration for agents, including certificate stores/paths and TLS settings specific to Azure DevOps Server. | | [Use deployment groups in Classic releases](https://learn.microsoft.com/en-us/azure/devops/pipelines/release/deployment-groups/?view=azure-devops) | deployment | 0.70 | Details deployment group concepts, configuration, and usage differences vs deployment jobs, including tags and environment mapping. | +| [Use scripts to automate Azure Resource Manager service connections](https://learn.microsoft.com/en-us/azure/devops/pipelines/release/automate-service-connections?view=azure-devops) | integrations | 0.70 | Shows how to script creation of Azure Resource Manager service connections with workload identity, including concrete script parameters and configuration values. This is a code-focused integration pattern for connecting Azure DevOps pipelines to Azure with specific config details. | | [Virtual machine resource](https://learn.microsoft.com/en-us/azure/devops/pipelines/process/environments-virtual-machines?view=azure-devops) | configuration | 0.70 | Describes adding VM resources via agents and using environment deployment history; includes product-specific steps and configuration for VM-based environments. 
| | [Approach to securing YAML pipelines](https://learn.microsoft.com/en-us/azure/devops/pipelines/security/approach?view=azure-devops) | security | 0.65 | Guidance on incrementally applying security recommendations to YAML pipelines; likely references specific Azure Pipelines security features and configurations, not just generic security concepts. | | [Artifact policy checks](https://learn.microsoft.com/en-us/azure/devops/pipelines/process/artifact-policy?view=azure-devops) | security | 0.65 | Describes artifact policy checks, supported artifact types, and configuration of custom policies that gate deployments to critical environments. | @@ -562,7 +593,6 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a | [Java apps](https://learn.microsoft.com/en-us/azure/devops/pipelines/ecosystems/java?view=azure-devops) | integrations | 0.65 | Covers setting up pipelines for Java with Maven/Gradle/Ant and deployment to Azure services; such ecosystem pages usually contain task input names, YAML examples, and product-specific build/deploy configuration details. | | [Linux virtual machines](https://learn.microsoft.com/en-us/azure/devops/pipelines/ecosystems/deploy-linux-vm?view=azure-devops) | deployment | 0.65 | Shows how to target multiple Linux VMs via environments with Azure Pipelines, including environment/resource usage and deployment job configuration specific to the product. | | [Migrate from Travis](https://learn.microsoft.com/en-us/azure/devops/pipelines/migrate/from-travis?view=azure-devops) | decision-making | 0.65 | Shows how to translate Travis configuration to Azure Pipelines; contains concrete mapping of Travis settings to Azure Pipelines YAML, which is product-specific guidance. 
| -| [Multiple repositories](https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/multi-repo-checkout?view=azure-devops) | configuration | 0.65 | Describes how to use multiple checkout steps and additional repositories in YAML pipelines; this feature typically involves specific YAML keys, allowed values, and behavior details that constitute product-specific configuration knowledge beyond generic CI concepts. | | [Publish Maven packages to internal and external feeds](https://learn.microsoft.com/en-us/azure/devops/pipelines/artifacts/publish-maven-artifacts?view=azure-devops) | deployment | 0.65 | Includes Maven task configuration, settings.xml handling, and feed/registry integration unique to Azure Pipelines. | | [Publish NuGet packages to NuGet.org (Classic/YAML)](https://learn.microsoft.com/en-us/azure/devops/pipelines/artifacts/publish-public-registry?view=azure-devops) | deployment | 0.65 | Covers NuGet.org-specific publishing configuration, including service connections, API keys, and task parameters. | | [Publish packages to internal and external feeds](https://learn.microsoft.com/en-us/azure/devops/pipelines/artifacts/nuget?view=azure-devops) | deployment | 0.65 | Includes NuGet task configuration, feed URLs, authentication, and YAML/classic examples specific to Azure Pipelines. | @@ -618,7 +648,6 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a | [Types of triggers](https://learn.microsoft.com/en-us/azure/devops/pipelines/build/triggers?view=azure-devops) | 0.40 | General triggers overview; likely lists trigger types and basic YAML snippets but not focused on limits, quotas, or detailed configuration matrices. | | [Use build tags](https://learn.microsoft.com/en-us/azure/devops/pipelines/build/build-tag?view=azure-devops) | 0.40 | Covers using build tags conceptually; likely lacks detailed config tables or numeric constraints. 
| | [Android](https://learn.microsoft.com/en-us/azure/devops/pipelines/ecosystems/android?view=azure-devops) | 0.35 | Android pipeline quickstart; focuses on basic YAML pipeline creation, not on detailed configuration tables or quotas. | -| [Azure Repos Git](https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/azure-repos-git?view=azure-devops) | 0.35 | How to use Azure Repos Git with pipelines; integration steps but not a detailed parameter/limits reference. | | [Bitbucket Cloud](https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/bitbucket?view=azure-devops) | 0.35 | Integration guide for Bitbucket Cloud; mostly procedural without detailed parameter tables or numeric constraints. | | [Configure pipelines to support work tracking](https://learn.microsoft.com/en-us/azure/devops/pipelines/integrations/configure-pipelines-work-tracking?view=azure-devops) | 0.35 | Covers integration with Azure Boards and work tracking conceptually; likely procedural without detailed config parameter tables. | | [GitHub](https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/github?view=azure-devops) | 0.35 | Configuring GitHub integration is tutorial-style; lacks structured config tables or error-code troubleshooting. | @@ -632,6 +661,7 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a | [Ruby](https://learn.microsoft.com/en-us/azure/devops/pipelines/ecosystems/ruby?view=azure-devops) | 0.35 | Ruby pipeline article appears to be a general how-to for building/testing Ruby apps, not a detailed config or limits reference. | | [Use approvals and gates](https://learn.microsoft.com/en-us/azure/devops/pipelines/release/deploy-using-approvals?view=azure-devops) | 0.35 | Tutorial-style content on using approvals and gates; likely step-by-step rather than a configuration or limits reference. 
| | [Artifacts in Azure Pipelines](https://learn.microsoft.com/en-us/azure/devops/pipelines/artifacts/artifacts-overview?view=azure-devops) | 0.30 | Overview of artifacts; mostly conceptual description of capabilities without deep configuration details. | +| [Azure Repos Git](https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/azure-repos-git?view=azure-devops) | 0.30 | Appears to be a how-to/tutorial on using Azure Repos Git with Azure Pipelines. Likely focuses on basic configuration and workflow rather than detailed limits, configuration matrices, or product-specific error codes. No indication of numeric limits, RBAC role tables, or advanced configuration parameter tables. | | [Build a pipeline with stages](https://learn.microsoft.com/en-us/azure/devops/pipelines/process/run-stages?view=azure-devops) | 0.30 | Explains multi-stage YAML pipelines conceptually with an example; no quantified thresholds, limits, or decision matrices. | | [Build multiple branches](https://learn.microsoft.com/en-us/azure/devops/pipelines/build/ci-build-git?view=azure-devops) | 0.30 | Tutorial-style CI trigger setup; summary doesn’t indicate numeric limits, config tables, or error-code mappings. | | [Create & manage agent pools](https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/pools-queues?view=azure-devops) | 0.30 | Organizational concept for agent pools; summary doesn’t indicate numeric limits, config tables, or error mappings. | @@ -639,6 +669,7 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use a | [Deploy a virtual machine scale set](https://learn.microsoft.com/en-us/azure/devops/pipelines/apps/cd/azure/deploy-virtual-scale-set-java?view=azure-devops) | 0.30 | Tutorial-style VM scale set deployment; likely step-by-step CLI and pipeline example without detailed config matrices or product-specific constraints. 
| | [Deploy from multiple branches](https://learn.microsoft.com/en-us/azure/devops/pipelines/release/deploy-multiple-branches?view=azure-devops) | 0.30 | The page is a how-to/tutorial for configuring classic release pipelines to deploy different branches to different stages. It describes steps and UI actions but does not present product-specific limits, configuration parameter tables, error-code-based troubleshooting, or quantified decision criteria. The content is procedural rather than expert reference material, so it does not meet any sub-skill type’s detection criteria. | | [File matching patterns](https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/file-matching-patterns?view=azure-devops) | 0.30 | Reference for file matching patterns is largely generic globbing behavior; not focused on Azure-specific limits, configs, or error codes beyond what an LLM likely already knows. | +| [Manage service connections](https://learn.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops) | 0.30 | Primarily an overview and how-to for creating and managing service connections, plus a reference to connection types. From the summary, it doesn’t clearly expose detailed configuration tables, parameter defaults, or error-code-based troubleshooting; it looks more like conceptual and procedural guidance rather than expert-knowledge configuration or troubleshooting content. | | [Node.js tutorial](https://learn.microsoft.com/en-us/azure/devops/pipelines/ecosystems/nodejs-tutorial?view=azure-devops) | 0.30 | Step-by-step tutorial to build and deploy a Node.js app; mainly workflow guidance without detailed config tables, limits, or product-specific error mappings. | | [Pipeline default branch](https://learn.microsoft.com/en-us/azure/devops/pipelines/process/pipeline-default-branch?view=azure-devops) | 0.30 | Explains how to view and edit default branch; procedural, without parameter tables or numeric constraints. 
| | [Process parameters (Classic)](https://learn.microsoft.com/en-us/azure/devops/pipelines/release/parameters?view=azure-devops) | 0.30 | The page describes classic process parameters conceptually and how they differ from variables, but the summary does not indicate detailed configuration tables, parameter ranges, or product-specific constraints. It appears more like usage guidance for UI-defined parameters than deep configuration or limits. | diff --git a/products/azure-planetary-computer-pro/azure-planetary-computer-pro.csv b/products/azure-planetary-computer-pro/azure-planetary-computer-pro.csv index 3c0f06cc..32afdbc2 100644 --- a/products/azure-planetary-computer-pro/azure-planetary-computer-pro.csv +++ b/products/azure-planetary-computer-pro/azure-planetary-computer-pro.csv @@ -12,6 +12,7 @@ https://learn.microsoft.com/en-us/azure/planetary-computer/collection-configurat https://learn.microsoft.com/en-us/azure/planetary-computer/configure-collection-web-interface,Configure collections for visualization,Configure collections for visualization in Microsoft Planetary Computer Pro,Configure collection visualization settings in Planetary Computer Pro portal,Learn how to configure collections in the Microsoft Planetary Computer Pro portal.,This quickstart explains how to configure a collection in Microsoft Planetary Computer Pro via the web interface.,2025-07-26T22:09:00.000Z,quickstart,configuration,0.65,True,Quickstart for configuring collections via web interface; likely includes specific configuration fields and allowed values for visualization.,unchanged https://learn.microsoft.com/en-us/azure/planetary-computer/configure-cross-tenant-application,Authoring and Configuring a Partner Application,Quickstart: Configure a cross-tenant application for Microsoft Planetary Computer Pro,Configure cross-tenant app access to Planetary Computer Pro,Learn how to configure a multitenant application to read and write data to customer Microsoft Planetary Computer Pro 
GeoCatalogs.,"In this quickstart, you create and configure a multitenant Azure application that can access customer Microsoft Planetary Computer Pro GeoCatalogs. As a geospatial data or service provider, this process enables your application to read or write data from or to your customers' GeoCatalogs.",2026-02-04T06:14:00.000Z,quickstart,security,0.7,True,"Quickstart for configuring a multitenant Azure application to read/write GeoCatalogs will necessarily include product-specific AAD app settings, permission scopes, and possibly role assignments unique to Planetary Computer Pro. This is concrete security/identity configuration rather than generic auth theory.",unchanged
https://learn.microsoft.com/en-us/azure/planetary-computer/configure-qgis,Connect to QGIS,Configure QGIS to access Microsoft Planetary Computer Pro,Configure QGIS to connect to Planetary Computer Pro STAC collections,Learn how to configure and authenticate QGIS to read STAC data from Microsoft Planetary Computer Pro.,"This guide explains how to configure the open-source QGIS desktop GIS software to access geospatial datasets from the Microsoft Planetary Computer Pro GeoCatalog using Microsoft Entra ID authentication. QGIS enables direct interaction with STAC collections in Microsoft Planetary Computer Pro. With QGIS, you can visualize, analyze, and style data on the fly, and integrate it with local layers from Azure. All GeoCatalog assets can be downloaded to your desktop.
Assets in Cloud Optimized GeoTIFF (COG",2026-02-12T08:00:00.000Z,how-to,integrations,0.75,True,Explains QGIS configuration and Entra ID auth to access GeoCatalog datasets; includes integration-specific settings and constraints.,unchanged
+https://learn.microsoft.com/en-us/azure/planetary-computer/create-api-proxy-geocatalog,Create an API Proxy for GeoCatalogs,Create an API proxy for a GeoCatalog using Azure API Management,Configure APIM proxy for Planetary Computer GeoCatalog,Learn how to use Azure API Management to create an API proxy for a Microsoft Planetary Computer Pro GeoCatalog to enable anonymous access and collection-level access control.,"This article guides you through setting up Azure API Management (APIM) as an API proxy in front of a Microsoft Planetary Computer Pro GeoCatalog. With this configuration you can: The following diagram illustrates the architecture before and after adding the APIM proxy: Before: Every caller authenticates directly to the GeoCatalog: After: APIM sits between callers and the GeoCatalog, handling authentication and access control:",2026-04-24T06:15:00.000Z,how-to,configuration,0.7,True,"The article describes a concrete architecture and configuration for placing Azure API Management in front of a Planetary Computer Pro GeoCatalog to enable anonymous and collection-level access control. This involves product-specific settings and policies in APIM (such as how authentication and access control are handled for GeoCatalog requests), which are configuration details not generally known from training data. 
It is not just a conceptual overview or generic tutorial, but a specific configuration pattern for this service combination.",new https://learn.microsoft.com/en-us/azure/planetary-computer/create-collection-web-interface,Create a STAC collection (Web Interface),Creating collections via web interface in Microsoft Planetary Computer Pro,,Learn how to create collections in the Microsoft Planetary Computer Pro web interface.,This quickstart explains how to create a collection in Microsoft Planetary Computer Pro through the web interface.,2025-12-01T23:10:00.000Z,quickstart,,0.25,False,Quickstart for creating collections via web UI; likely procedural without deep configuration reference.,unchanged https://learn.microsoft.com/en-us/azure/planetary-computer/create-connection-arc-gis-pro,Connect to ArcGIS Pro,Use ArcGIS Pro with Microsoft Planetary Computer Pro,Configure ArcGIS Pro to access Planetary Computer Pro GeoCatalogs,Learn how to configure and authenticate ArcGIS Pro so that it can read STAC Item data from Microsoft Planetary Computer Pro.,Learn how to configure ArcGIS Pro to access geospatial datasets from the Microsoft Planetary Computer Pro GeoCatalog by using OAuth 2.0-delegated authentication with Microsoft Entra ID. 
This process requires that you: Learn how to securely browse and access data hosted in Microsoft Planetary Computer Pro directly in ArcGIS Pro by using Microsoft Entra ID user impersonation.,2026-01-09T08:00:00.000Z,how-to,integrations,0.75,True,Details OAuth 2.0 delegated auth with Entra ID and ArcGIS Pro configuration to read STAC items—product-specific integration steps and parameters.,unchanged
https://learn.microsoft.com/en-us/azure/planetary-computer/create-stac-collection,Create a STAC collection (API),Create STAC Collections via API in Microsoft Planetary Computer Pro,Create STAC collections in Planetary Computer Pro using Python APIs,Learn how to add and use STAC collections in Microsoft Planetary Computer Pro GeoCatalog using Python.,This quickstart guides you to create a SpatioTemporal Asset Catalog (STAC) collection and add it to a Microsoft Planetary Computer Pro GeoCatalog using Python.,2025-05-19T15:23:00.000Z,quickstart,integrations,0.65,True,Python-based quickstart for adding STAC collections to GeoCatalog; likely includes API/SDK parameters and request schemas specific to the service.,unchanged
diff --git a/products/azure-planetary-computer-pro/report.md b/products/azure-planetary-computer-pro/report.md
index 9c05407f..2fcf367d 100644
--- a/products/azure-planetary-computer-pro/report.md
+++ b/products/azure-planetary-computer-pro/report.md
@@ -1,5 +1,5 @@
 ---
-generated_at: '2026-04-05'
+generated_at: '2026-04-26'
 category_descriptions:
   integrations: Patterns and APIs for creating/managing STAC collections/items, bulk
     ingesting data, generating SAS tokens, and integrating Planetary Computer Pro
@@ -10,9 +10,9 @@ category_descriptions:
   decision-making: Guidance on selecting how to access Planetary Computer Pro data,
     including connection options, integrations with tools/services, and choosing the
     best method for your workflow.
-  configuration: 'Configuring Planetary Computer Pro collections: ingestion sources,
-    mosaics, tiles, render/colormap settings, Explorer visualization, queryable filters,
-    and US Gov cloud endpoints.'
+  configuration: 'Configuring Planetary Computer Pro data access and visualization:
+    collections, tiles, mosaics, render/colormap settings, queryables, ingestion sources,
+    APIM proxy, and US Gov cloud endpoints.'
   troubleshooting: Diagnosing and resolving Planetary Computer Pro GeoCatalog ingestion
     failures, including error code meanings, common causes, and step-by-step remediation
     guidance.
@@ -21,13 +21,13 @@ category_descriptions:
   process and store data.
 skill_description: Expert knowledge for Microsoft Planetary Computer Pro development
   including troubleshooting, decision making, limits & quotas, security, configuration,
-  and integrations & coding patterns. Use when managing STAC collections, GeoCatalog
-  ingestion, SAS tokens, Explorer visualization, or QGIS/ArcGIS integration, and other
-  Microsoft Planetary Computer Pro related development tasks. Not for Azure Open Datasets
-  (use azure-open-datasets), Azure Maps (use azure-maps), Azure Data Explorer (use
-  azure-data-explorer), Azure Synapse Analytics (use azure-synapse-analytics).
-use_when: Use when managing STAC collections, GeoCatalog ingestion, SAS tokens, Explorer
-  visualization, or QGIS/ArcGIS integration, and other Microsoft Planetary Computer
+  and integrations & coding patterns. Use when managing STAC collections/items, GeoCatalog
+  ingestion, SAS tokens, tiles/mosaics, or QGIS/ArcGIS integration, and other Microsoft
+  Planetary Computer Pro related development tasks. Not for Azure Open Datasets (use
+  azure-open-datasets), Azure Maps (use azure-maps), Azure Data Explorer (use azure-data-explorer),
+  Azure Synapse Analytics (use azure-synapse-analytics).
+use_when: Use when managing STAC collections/items, GeoCatalog ingestion, SAS tokens,
+  tiles/mosaics, or QGIS/ArcGIS integration, and other Microsoft Planetary Computer
   Pro related development tasks.
 confusable_not_for: Not for Azure Open Datasets (use azure-open-datasets), Azure Maps
   (use azure-maps), Azure Data Explorer (use azure-data-explorer), Azure Synapse Analytics
@@ -37,14 +37,14 @@ confusable_not_for: Not for Azure Open Datasets (use azure-open-datasets), Azure
 
 ## Summary
 
-- **Total Pages**: 45
-- **Fetched**: 45
+- **Total Pages**: 46
+- **Fetched**: 46
 - **Fetch Failed**: 0
-- **Classified**: 32
+- **Classified**: 33
 - **Unclassified**: 13
 
 ### Incremental Update
 
-- **New Pages**: 0
+- **New Pages**: 1
 - **Updated Pages**: 0
 - **Unchanged**: 45
 - **Deleted Pages**: 0
@@ -54,16 +54,20 @@ confusable_not_for: Not for Azure Open Datasets (use azure-open-datasets), Azure
 
 | Type | Count | Percentage |
 |------|-------|------------|
-| configuration | 10 | 22.2% |
+| configuration | 11 | 23.9% |
 | decision-making | 1 | 2.2% |
-| integrations | 11 | 24.4% |
+| integrations | 11 | 23.9% |
 | limits-quotas | 1 | 2.2% |
-| security | 7 | 15.6% |
-| troubleshooting | 2 | 4.4% |
-| *(Unclassified)* | 13 | 28.9% |
+| security | 7 | 15.2% |
+| troubleshooting | 2 | 4.3% |
+| *(Unclassified)* | 13 | 28.3% |
 
 ## Changes
 
+### New Pages
+
+- [Create an API Proxy for GeoCatalogs](https://learn.microsoft.com/en-us/azure/planetary-computer/create-api-proxy-geocatalog)
+
 ## Classified Pages
 
 | TOC Title | Type | Confidence | Reason |
@@ -86,6 +90,7 @@ confusable_not_for: Not for Azure Open Datasets (use azure-open-datasets), Azure
 | [Build a web application](https://learn.microsoft.com/en-us/azure/planetary-computer/build-web-application) | integrations | 0.70 | Quickstart for a web app using Entra ID auth, STAC APIs, and map tiles; includes concrete API usage and auth parameters. |
 | [Bulk ingestion](https://learn.microsoft.com/en-us/azure/planetary-computer/bulk-ingestion-api) | integrations | 0.70 | Shows how to use the Bulk Ingestion API with configuration of ingestion sources and collections; product-specific API patterns and parameters. |
 | [Collection configuration overview](https://learn.microsoft.com/en-us/azure/planetary-computer/collection-configuration-concept) | configuration | 0.70 | Describes configuration options for collections (filters, zoom levels, searchable attributes, data types); product-specific configuration settings. |
+| [Create an API Proxy for GeoCatalogs](https://learn.microsoft.com/en-us/azure/planetary-computer/create-api-proxy-geocatalog) | configuration | 0.70 | The article describes a concrete architecture and configuration for placing Azure API Management in front of a Planetary Computer Pro GeoCatalog to enable anonymous and collection-level access control. This involves product-specific settings and policies in APIM (such as how authentication and access control are handled for GeoCatalog requests), which are configuration details not generally known from training data. It is not just a conceptual overview or generic tutorial, but a specific configuration pattern for this service combination. |
 | [Get a collection SAS token](https://learn.microsoft.com/en-us/azure/planetary-computer/get-collection-sas-token) | integrations | 0.70 | Covers retrieving collection-level SAS tokens for STAC collection assets in managed storage within a GeoCatalog. This involves specific storage/SAS parameters and token usage patterns that are product-specific integration details between Planetary Computer Pro and Azure Storage. |
 | [Ingestion sources](https://learn.microsoft.com/en-us/azure/planetary-computer/ingestion-source) | configuration | 0.70 | Explains ingestion source concept including location, URI structure, and authentication methods—product-specific configuration parameters for ingestion. |
 | [Partner Application Overview](https://learn.microsoft.com/en-us/azure/planetary-computer/working-with-partner-applications) | integrations | 0.70 | Describes cross-tenant application integration scenarios and patterns for partner apps to read/write GeoCatalog data—product-specific integration model. |
diff --git a/products/azure-policy/azure-policy.csv b/products/azure-policy/azure-policy.csv
index b5d97397..c20aa3f3 100644
--- a/products/azure-policy/azure-policy.csv
+++ b/products/azure-policy/azure-policy.csv
@@ -54,13 +54,13 @@ https://learn.microsoft.com/en-us/azure/governance/machine-configuration/overvie
 https://learn.microsoft.com/en-us/azure/governance/machine-configuration/overview/03-network-requirements,Network Requirements,Azure Machine Configuration network requirements - Azure Machine Configuration,Configure network and endpoints for Machine Configuration,"Configure network connectivity, endpoints, and private link settings for Azure Machine Configuration across Azure and hybrid environments.",,2025-11-10T23:18:00.000Z,how-to,configuration,0.75,True,"Network requirements page usually lists specific endpoints, ports, and Private Link settings; these are concrete configuration parameters.",unchanged
 https://learn.microsoft.com/en-us/azure/governance/machine-configuration/overview/04-operations-troubleshooting,Troubleshooting Machine Configuration,Troubleshooting Azure Machine Configuration - Azure Machine Configuration,Troubleshoot Azure Machine Configuration deployments,"Monitor, troubleshoot, and manage Azure Machine Configuration deployments including availability, data residency, and common issues.",,2025-11-10T23:18:00.000Z,how-to,troubleshooting,0.85,True,"Explicit troubleshooting article; likely includes availability behaviors, data residency nuances, and symptom-to-solution guidance.",unchanged
 https://learn.microsoft.com/en-us/azure/governance/machine-configuration/whats-new/agent/,Overview,Azure machine configuration agent release notes
 overview - Azure Machine Configuration,,"Overview of the guest configuration agent release notes, issues, and frequently asked questions.","The machine configuration agent receives improvements on an ongoing basis. To stay up to date with
-the most recent developments by platform, see the following articles: Each article provides you with information about: For information on release notes for the connected machine agent, seeWhat's new with the connected machine agent.",2026-04-15T06:10:00.000Z,release-notes,,0.2,False,"High-level overview page that just links to platform-specific release notes; no concrete limits, configuration parameters, error codes, or decision matrices are evident in the summary.",new
+the most recent developments by platform, see the following articles: Each article provides you with information about: For information on release notes for the connected machine agent, seeWhat's new with the connected machine agent.",2026-04-15T06:10:00.000Z,release-notes,,0.2,False,"High-level overview page that just links to platform-specific release notes; no concrete limits, configuration parameters, error codes, or decision matrices are evident in the summary.",unchanged
 https://learn.microsoft.com/en-us/azure/governance/machine-configuration/whats-new/agent/linux,Linux agent release notes,Azure machine configuration Linux agent release notes - Azure Machine Configuration,,"Details guest configuration agent for Linux release notes, issues, and frequently asked questions.","The machine configuration agent receives improvements on an ongoing basis. To stay up to date with the most recent developments, this article provides you with information about: For information on release notes for the connected machine agent, seeWhat's new with the connected machine agent. Note This article includes the release notes for the Linux extension for Azure machine configuration
-(Microsoft.GuestConfiguration.ConfigurationforLinux). To review the release notes for Windows (Microsoft.G",2026-04-15T06:10:00.000Z,release-notes,,0.3,False,"Linux agent release notes likely list changes and issues, but the summary does not expose concrete limits, configuration tables, error-code mappings, or other structured expert details matching the defined sub-skill categories.",new
+(Microsoft.GuestConfiguration.ConfigurationforLinux). To review the release notes for Windows (Microsoft.G",2026-04-24T06:15:00.000Z,release-notes,,0.3,False,"Linux agent release notes similarly focus on version history and issues, not on structured limits, configuration parameters, troubleshooting flows, or decision criteria. They do not match any of the defined expert-knowledge sub-skill patterns.",updated
 https://learn.microsoft.com/en-us/azure/governance/machine-configuration/whats-new/agent/windows,Windows agent release notes,Azure machine configuration Windows agent release notes - Azure Machine Configuration,,"Details guest configuration agent for Windows release notes, issues, and frequently asked questions.","The machine configuration agent receives improvements on an ongoing basis. To stay up to date with the most recent developments, this article provides you with information about: For information on release notes for the connected machine agent, seeWhat's new with the connected machine agent. Note This article includes the release notes for the Windows extension for Azure machine configuration
-(Microsoft.GuestConfiguration.ConfigurationforWindows). To review the release notes for Linux (Microsoft",2026-04-15T06:10:00.000Z,release-notes,,0.3,False,"Windows agent release notes likely contain version changes and issues, but the provided summary does not show specific configuration parameters, limits, error codes, or structured troubleshooting/decision content required by any sub-skill type.",new
+(Microsoft.GuestConfiguration.ConfigurationforWindows). To review the release notes for Linux (Microsoft",2026-04-24T06:15:00.000Z,release-notes,,0.3,False,"Windows agent release notes typically list version changes, fixes, and known issues but are not organized as limits, configuration references, troubleshooting guides, or decision matrices as defined. They lack structured numeric limits, config tables, or error-code-to-solution mappings required for the sub-skill types.",updated
 https://learn.microsoft.com/en-us/azure/governance/machine-configuration/whats-new/docs,What's new in docs,What's new in Azure machine configuration docs - Azure Machine Configuration,,Details about changes to the documentation for Azure machine configuration.,"This document describes changes to the documentation that may be of interest to users and readers. It doesn't include minor changes, like fixing typos or formatting. Instead, it lists new articles and retired articles and any major changes to the documentation. This document provides historical
@@ -89,7 +89,7 @@ https://learn.microsoft.com/en-us/azure/governance/policy/concepts/compliance-st
 https://learn.microsoft.com/en-us/azure/governance/policy/concepts/definition-structure-alias,Aliases,Details of the policy definition structure aliases - Azure Policy,,Describes how policy definition aliases are used to establish conventions for Azure resources in your organization.,"You use property aliases to access specific properties for a resource type. Aliases enable you to restrict what values or conditions are allowed for a property on a resource. Each alias maps to paths in different API versions for a given resource type. During policy evaluation, the policy engine gets the property path for that API version. The list of aliases is always growing.
To find which aliases Azure Policy supports, use one of the following methods: Azure Policy extension for Visual Studio",2025-03-04T08:00:00.000Z,concept-article,,0.35,False,"Describes alias conceptually and how to discover aliases using tools; the detailed alias list is external, so this page itself is more conceptual than reference.",unchanged https://learn.microsoft.com/en-us/azure/governance/policy/concepts/definition-structure-basics,Basics,Details of Azure Policy definition structure basics - Azure Policy,,Describes how Azure Policy definition basics are used to establish conventions for Azure resources in your organization.,"Azure Policy definitions describe resource complianceconditionsand the effect to take if a condition is met. A condition compares a resource propertyfieldor avalueto a required value. Resource property fields are accessed by usingaliases. When a resource property field is an array, a specialarray aliascan be used to select values from all array members and apply a condition to each one. Learn more aboutconditions. By using policy assignments, you can control costs and manage your resources. For ",2025-03-04T08:00:00.000Z,concept-article,,0.3,False,"Conceptual explanation of Azure Policy definition basics (conditions, fields, values, aliases) without detailed configuration tables, limits, or product-specific numeric thresholds.",unchanged https://learn.microsoft.com/en-us/azure/governance/policy/concepts/definition-structure-parameters,Parameters,Details of the policy definition structure parameters - Azure Policy,,Describes how policy definition parameters are used to establish conventions for Azure resources in your organization.,"Parameters help simplify your policy management by reducing the number of policy definitions. Think of parameters like the fields on a form:name,address,city,state. These parameters always stay the same but their values change based on the individual filling out the form. 
Parameters work the same way when building policies. By including parameters in a policy definition, you can reuse that policy for different scenarios by using different values.",2025-03-04T08:00:00.000Z,conceptual,,0.3,False,"Conceptual description of policy parameters and their purpose; lacks detailed parameter catalogs, default values, or numeric constraints that would constitute expert configuration knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/governance/policy/concepts/definition-structure-policy-rule,Policy rule,Details of the policy definition structure policy rules - Azure Policy,,Describes how policy definition policy rules are used to establish conventions for Azure resources in your organization.,"The policy rule consists ofifandthenblocks. In theifblock, you define one or more conditions that specify when the policy is enforced. You can apply logical operators to these conditions to precisely define the scenario for a policy. For complete details on each effect, order of evaluation, properties, and examples, seeAzure Policy definitions effect basics. In thethenblock, you define the effect that happens when theifconditions are fulfilled. For more information aboutpolicyRule, go to thepoli",2026-04-17T22:08:00.000Z,concept-article,,0.2,False,"Primarily explains Azure Policy rule structure (if/then, logical operators, effects). No specific limits, configuration tables, error codes, or product-specific numeric thresholds; it's a conceptual/structural reference rather than expert-only operational details.",updated +https://learn.microsoft.com/en-us/azure/governance/policy/concepts/definition-structure-policy-rule,Policy rule,Details of the policy definition structure policy rules - Azure Policy,,Describes how policy definition policy rules are used to establish conventions for Azure resources in your organization.,"The policy rule consists ofifandthenblocks. 
In theifblock, you define one or more conditions that specify when the policy is enforced. You can apply logical operators to these conditions to precisely define the scenario for a policy. For complete details on each effect, order of evaluation, properties, and examples, seeAzure Policy definitions effect basics. In thethenblock, you define the effect that happens when theifconditions are fulfilled. For more information aboutpolicyRule, go to thepoli",2026-04-17T22:08:00.000Z,concept-article,,0.2,False,"Primarily explains Azure Policy rule structure (if/then, logical operators, effects). No specific limits, configuration tables, error codes, or product-specific numeric thresholds; it's a conceptual/structural reference rather than expert-only operational details.",unchanged https://learn.microsoft.com/en-us/azure/governance/policy/concepts/effect-add-to-network-group,Add to network group,Azure Policy definitions addToNetworkGroup effect - Azure Policy,,Azure Policy definitions addToNetworkGroup effect determines how compliance is managed and reported.,"TheaddToNetworkGroupeffect is used in Azure Virtual Network Manager to define dynamic network group membership. This effect is specific toMicrosoft.Network.Datapolicy modedefinitions only. With network groups, your policy definition includes your conditional expression for matching virtual networks meeting your criteria, and specifies the destination network group where any matching resources are placed. TheaddToNetworkGroupeffect is used to place resources in the destination network group. 
To l",2025-03-04T08:00:00.000Z,conceptual,,0.4,False,"Conceptual explanation of addToNetworkGroup effect for Azure Policy; no numeric limits, config tables, or detailed error mappings.",unchanged https://learn.microsoft.com/en-us/azure/governance/policy/concepts/effect-append,Append,Azure Policy definitions append effect - Azure Policy,,Azure Policy definitions append effect determines how compliance is managed and reported.,"Theappendeffect is used to add more fields to the requested resource during creation or update. A common example is specifying allowed IPs for a storage resource. Important appendis intended for use with non-tag properties. Whileappendcan add tags to a resource during a create or update request, it's recommended to use themodifyeffect for tags instead.",2025-03-04T08:00:00.000Z,concept-article,,0.3,False,High-level description of append effect; lacks product-specific configuration tables or quantified guidance.,unchanged https://learn.microsoft.com/en-us/azure/governance/policy/concepts/effect-audit,Audit,Azure Policy definitions audit effect - Azure Policy,,Azure Policy definitions audit effect determines how compliance is managed and reported.,"Theauditeffect is used to create a warning event in the activity log when evaluating a non-compliant resource, but it doesn't stop the request.",2025-03-04T08:00:00.000Z,reference,,0.3,False,"Describes audit effect conceptually without detailed parameters, limits, or troubleshooting mappings.",unchanged @@ -147,14 +147,14 @@ https://learn.microsoft.com/en-us/azure/governance/policy/samples/australia-ism, definition maps tocompliance domainsandcontrolsin Australian Government ISM PROTECTED. For more information about this compliance standard, seeAustralian Government ISM PROTECTED. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theAustralian Government ISM PROTECTEDcontrols. 
Many of the controls are implemented with anAzure Policyinitiativ",2025-11-19T08:00:00.000Z,generated-reference,security,0.7,True,"Regulatory compliance mapping for Australian Government ISM PROTECTED; contains detailed mappings from specific controls to Azure Policy definitions, which is product-specific security/compliance configuration knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/governance/policy/samples/azure-security-benchmark,Microsoft cloud security benchmark,Regulatory Compliance details for Microsoft cloud security benchmark - Azure Policy,Apply Microsoft cloud security benchmark via Azure Policy,Details of the Microsoft cloud security benchmark Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative +https://learn.microsoft.com/en-us/azure/governance/policy/samples/azure-security-benchmark,Microsoft cloud security benchmark,Regulatory Compliance details for Microsoft cloud security benchmark - Azure Policy,Align with Microsoft cloud security benchmark via Policy,Details of the Microsoft cloud security benchmark Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps tocompliance domainsandcontrolsin Microsoft cloud security benchmark. For more information about this compliance standard, seeMicrosoft cloud security benchmark. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theMicrosoft cloud security benchmarkcontrols. 
Many of the controls -are implemented with anAzure Policyinitiative d",2026-02-24T08:00:00.000Z,generated-reference,security,0.8,True,Maps Microsoft cloud security benchmark controls to Azure Policy initiatives; central reference for Azure security baseline implementation.,unchanged -https://learn.microsoft.com/en-us/azure/governance/policy/samples/built-in-initiatives,Built-in initiatives,List of built-in policy initiatives - Azure Policy,,"List built-in policy initiatives for Azure Policy. Categories include Regulatory Compliance, Azure Machine Configuration, and more.","This page is an index of Azure Policy built-in initiative definitions. The name on each built-in links to the initiative definition source on theAzure Policy GitHub repo. The built-ins are grouped by thecategoryproperty inmetadata. To go to a specificcategory, useCtrl-Ffor your browser's search feature.",2025-11-18T08:00:00.000Z,generated-reference,,0.1,False,Index of built-in policy initiatives; serves as navigation without embedded expert configuration or decision guidance.,unchanged +are implemented with anAzure Policyinitiative d",2026-04-18T08:00:00.000Z,generated-reference,security,0.8,True,"Provides control-by-control mappings between Microsoft cloud security benchmark domains/controls and Azure Policy definitions. This is detailed, product-specific security/compliance configuration guidance.",updated +https://learn.microsoft.com/en-us/azure/governance/policy/samples/built-in-initiatives,Built-in initiatives,List of built-in policy initiatives - Azure Policy,Use Azure built-in policy initiatives for compliance,"List built-in policy initiatives for Azure Policy. Categories include Regulatory Compliance, Azure Machine Configuration, and more.","This page is an index of Azure Policy built-in initiative definitions. The name on each built-in links to the initiative definition source on theAzure Policy GitHub repo. The built-ins are grouped by thecategoryproperty inmetadata. 
To go to a specificcategory, useCtrl-Ffor your browser's search feature.",2026-04-19T11:06:00.000Z,generated-reference,security,0.7,True,"Provides an index of built-in Azure Policy initiative definitions with exact initiative names and categories, which are product-specific compliance/security configuration bundles not generally known from training.",updated https://learn.microsoft.com/en-us/azure/governance/policy/samples/built-in-packages,Built-in packages for guest configuration,List of built-in packages for guest configuration - Azure Policy,Use built-in guest configuration packages in Azure Policy,List of all built-in packages for guest configuration mapped to each policy definition and the PowerShell modules that are used by each package.,"This page is an index of Azure Policy built-in packages for the guest configuration feature.",2022-05-06T17:04:00.000Z,sample,configuration,0.7,True,Index of built-in guest configuration packages mapped to policy definitions and PowerShell modules. Contains concrete package names and mappings that are configuration reference data.,unchanged -https://learn.microsoft.com/en-us/azure/governance/policy/samples/built-in-policies,Built-in policies,List of built-in policy definitions - Azure Policy,,"List built-in policy definitions for Azure Policy. Categories include Tags, Regulatory Compliance, Key Vault, Kubernetes, Azure Machine Configuration, and more.","This page is an index of Azure Policy built-in policy definitions. The name of each built-in links to the policy definition in the Azure portal. Use the link in theSourcecolumn to view the source on theAzure Policy GitHub repo. The built-ins are grouped by thecategoryproperty inmetadata. 
To go to a specificcategory, useCtrl-Ffor your browser's search feature.",2026-01-26T08:00:00.000Z,generated-reference,,0.1,False,"Index of built-in policy definitions linking out to portal/GitHub; page itself is a catalog, not detailed configuration or troubleshooting content.",unchanged +https://learn.microsoft.com/en-us/azure/governance/policy/samples/built-in-policies,Built-in policies,List of built-in policy definitions - Azure Policy,Use Azure built-in policy definitions for governance,"List built-in policy definitions for Azure Policy. Categories include Tags, Regulatory Compliance, Key Vault, Kubernetes, Azure Machine Configuration, and more.","This page is an index of Azure Policy built-in policy definitions. The name of each built-in links to the policy definition in the Azure portal. Use the link in theSourcecolumn to view the source on theAzure Policy GitHub repo. The built-ins are grouped by thecategoryproperty inmetadata. To go to a specificcategory, useCtrl-Ffor your browser's search feature.",2026-04-18T08:00:00.000Z,generated-reference,security,0.7,True,"Lists concrete built-in Azure Policy definitions with exact names, categories, and links to source JSON. These are product-specific security/governance configuration artifacts (policy definitions) that an LLM won't reliably know from training and are used to configure access/compliance behavior.",updated https://learn.microsoft.com/en-us/azure/governance/policy/samples/canada-federal-pbmm,Canada Federal PBMM,Regulatory Compliance details for Canada Federal PBMM - Azure Policy,Use Azure Policy for Canada Federal PBMM compliance,Details of the Canada Federal PBMM Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps tocompliance domainsandcontrolsin Canada Federal PBMM. 
For more information about this compliance standard, seeCanada Federal PBMM. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theCanada Federal PBMMcontrols. Many of the controls @@ -188,20 +188,20 @@ definition maps tocompliance domainsandcontrolsin CMMC Level 3. For more information about this compliance standard, seeCMMC Level 3. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theCMMC Level 3controls. Many of the controls are implemented with anAzure Policyinitiative definition. To review the complete initiative definition, openPolic",2026-03-19T08:00:00.000Z,generated-reference,security,0.78,True,"Lists how CMMC Level 3 controls map to specific Azure Policy definitions within a regulatory compliance initiative. This is detailed, product-specific security/compliance configuration information, matching the security sub-skill.",unchanged -https://learn.microsoft.com/en-us/azure/governance/policy/samples/fedramp-high,FedRAMP High,Regulatory Compliance details for FedRAMP High - Azure Policy,Map Azure Policy to FedRAMP High requirements,Details of the FedRAMP High Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative +https://learn.microsoft.com/en-us/azure/governance/policy/samples/fedramp-high,FedRAMP High,Regulatory Compliance details for FedRAMP High - Azure Policy,Map FedRAMP High controls to Azure Policy,Details of the FedRAMP High Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps tocompliance domainsandcontrolsin FedRAMP High. 
For more information about this compliance standard, seeFedRAMP High. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theFedRAMP Highcontrols. Many of the controls are implemented with anAzure Policyinitiative definition. To review the complete -initiative definition, openPolic",2026-02-19T08:00:00.000Z,generated-reference,security,0.7,True,Shows how FedRAMP High controls are implemented via Azure Policy initiatives; detailed compliance configuration guidance.,unchanged -https://learn.microsoft.com/en-us/azure/governance/policy/samples/fedramp-moderate,FedRAMP Moderate,Regulatory Compliance details for FedRAMP Moderate - Azure Policy,Map Azure Policy to FedRAMP Moderate requirements,Details of the FedRAMP Moderate Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative +initiative definition, openPolic",2026-04-18T08:00:00.000Z,generated-reference,security,0.8,True,Contains detailed mappings from specific FedRAMP High controls to concrete Azure Policy definitions within a regulatory compliance initiative. This is product-specific compliance/security configuration knowledge (which policy implements which control) that is unlikely to be memorized by an LLM.,updated +https://learn.microsoft.com/en-us/azure/governance/policy/samples/fedramp-moderate,FedRAMP Moderate,Regulatory Compliance details for FedRAMP Moderate - Azure Policy,Map FedRAMP Moderate controls to Azure Policy,Details of the FedRAMP Moderate Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps tocompliance domainsandcontrolsin FedRAMP Moderate. 
For more information about this compliance standard, seeFedRAMP Moderate. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theFedRAMP Moderatecontrols. Many of the controls are implemented with anAzure Policyinitiative definition. To review the complete -initiative definitio",2026-02-19T08:00:00.000Z,generated-reference,security,0.7,True,Provides mappings from FedRAMP Moderate controls to Azure Policy definitions; product-specific security/compliance implementation.,unchanged -https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-azure-security-benchmark,Microsoft cloud security benchmark,Regulatory Compliance details for Microsoft cloud security benchmark (Azure Government) - Azure Policy,Map Microsoft cloud security benchmark to Azure Policy,Details of the Microsoft cloud security benchmark (Azure Government) Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessme,"The following article details how the Azure Policy Regulatory Compliance built-in initiative +initiative definitio",2026-04-18T08:00:00.000Z,generated-reference,security,0.8,True,Details how each FedRAMP Moderate control maps to one or more Azure Policy definitions in a built-in initiative. These mappings are specific security/compliance configuration relationships unique to Azure Policy.,updated +https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-azure-security-benchmark,Microsoft cloud security benchmark,Regulatory Compliance details for Microsoft cloud security benchmark (Azure Government) - Azure Policy,Use Azure Policy for Microsoft cloud security benchmark in Azure Government,Details of the Microsoft cloud security benchmark (Azure Government) Regulatory Compliance built-in initiative. 
Each control is mapped to one or more Azure Policy definitions that assist with assessme,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps tocompliance domainsandcontrolsin Microsoft cloud security benchmark (Azure Government). For more information about this compliance standard, seeMicrosoft cloud security benchmark. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theMicrosoft cloud security benchmarkcontrols. Many of the controls -are implemented with anAzure",2025-10-31T08:00:00.000Z,generated-reference,security,0.7,True,"Details how Microsoft cloud security benchmark controls map to Azure Policy definitions for Azure Government, including initiative-based implementations, which is product-specific security guidance.",unchanged +are implemented with anAzure",2026-04-18T08:00:00.000Z,generated-reference,security,0.7,True,"Details how Microsoft cloud security benchmark controls map to Azure Policy definitions/initiatives in Azure Government. The specific control-to-policy mappings and identifiers are expert, product-specific security/compliance configuration information.",updated https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-cis-azure-1-1-0,CIS Microsoft Azure Foundations Benchmark 1.1.0,Regulatory Compliance details for CIS Microsoft Azure Foundations Benchmark 1.1.0 (Azure Government) - Azure Policy,Map CIS Azure 1.1.0 (Gov) controls to Azure Policy,Details of the CIS Microsoft Azure Foundations Benchmark 1.1.0 (Azure Government) Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist ,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps tocompliance domainsandcontrolsin CIS Microsoft Azure Foundations Benchmark 1.1.0 (Azure Government). 
For more information about this compliance standard, seeCIS Microsoft Azure Foundations Benchmark 1.1.0. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theCIS Microsoft Azure Foundations Benchmark 1.1.0controls. Many of th",2026-03-19T08:00:00.000Z,generated-reference,security,0.78,True,Describes how CIS Microsoft Azure Foundations Benchmark 1.1.0 controls for Azure Government map to specific Azure Policy definitions. This is specialized security/compliance configuration knowledge for the Azure Government environment.,unchanged @@ -213,16 +213,16 @@ definition maps tocompliance domainsandcontrolsin CMMC Level 3 (Azure Government For more information about this compliance standard, seeCMMC Level 3. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theCMMC Level 3controls. Many of the controls are implemented with anAzure Policyinitiative definition. To review the complete initiative de",2026-03-19T08:00:00.000Z,generated-reference,security,0.78,True,"Page details how CMMC Level 3 controls map to Azure Policy definitions and built-in initiatives for Azure Government. The exact control-to-policy mappings and initiative composition are specific, non-obvious configuration knowledge for securing and assessing environments against CMMC, fitting the security sub-skill.",unchanged -https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-fedramp-high,FedRAMP High,Regulatory Compliance details for FedRAMP High (Azure Government) - Azure Policy,Align Azure Government with FedRAMP High via Policy,Details of the FedRAMP High (Azure Government) Regulatory Compliance built-in initiative. 
Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative +https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-fedramp-high,FedRAMP High,Regulatory Compliance details for FedRAMP High (Azure Government) - Azure Policy,Map FedRAMP High controls to Azure Policy in Azure Government,Details of the FedRAMP High (Azure Government) Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps tocompliance domainsandcontrolsin FedRAMP High (Azure Government). For more information about this compliance standard, seeFedRAMP High. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theFedRAMP Highcontrols. Many of the controls are implemented with anAzure Policyinitiative definition. To review the complete -initiative de",2025-10-20T08:00:00.000Z,generated-reference,security,0.7,True,"Details how FedRAMP High controls map to Azure Policy definitions/initiatives in Azure Government, giving concrete compliance-to-policy mappings unique to this product and environment.",unchanged -https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-fedramp-moderate,FedRAMP Moderate,Regulatory Compliance details for FedRAMP Moderate (Azure Government) - Azure Policy,Align Azure Government with FedRAMP Moderate via Policy,Details of the FedRAMP Moderate (Azure Government) Regulatory Compliance built-in initiative. 
Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative +initiative de",2026-04-18T08:00:00.000Z,generated-reference,security,0.72,True,"Page provides a detailed mapping of FedRAMP High controls to specific Azure Policy initiative and policy definition IDs/settings for Azure Government. These mappings and control-to-policy relationships are product- and standard-specific, not derivable from general training data, and are used for concrete compliance configuration, fitting the security category.",updated +https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-fedramp-moderate,FedRAMP Moderate,Regulatory Compliance details for FedRAMP Moderate (Azure Government) - Azure Policy,Map FedRAMP Moderate controls to Azure Policy in Azure Government,Details of the FedRAMP Moderate (Azure Government) Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps tocompliance domainsandcontrolsin FedRAMP Moderate (Azure Government). For more information about this compliance standard, seeFedRAMP Moderate. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theFedRAMP Moderatecontrols. Many of the controls are implemented with anAzure Policyinitiative definition. 
To review the complete -i",2025-10-20T08:00:00.000Z,generated-reference,security,0.7,True,"Similar to index 2 but for FedRAMP Moderate; includes specific mappings between FedRAMP Moderate controls and Azure Policy initiatives, which is expert, product-specific security content.",unchanged +i",2026-04-18T08:00:00.000Z,generated-reference,security,0.72,True,"Contains a control-by-control mapping for FedRAMP Moderate to concrete Azure Policy initiative and policy definitions in Azure Government. These mappings are expert, product-specific compliance configuration details, aligning with security-focused configuration for regulatory standards.",updated https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-irs-1075-sept2016,IRS 1075 September 2016,Regulatory Compliance details for IRS 1075 September 2016 (Azure Government) - Azure Policy,Implement IRS 1075 2016 controls with Azure Policy,Details of the IRS 1075 September 2016 (Azure Government) Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps tocompliance domainsandcontrolsin IRS 1075 September 2016 (Azure Government). For more information about this compliance standard, seeIRS 1075 September 2016. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theIRS 1075 September 2016controls. Many of the controls @@ -232,22 +232,22 @@ definition maps tocompliance domainsandcontrolsin ISO 27001:2013 (Azure Governme For more information about this compliance standard, seeISO 27001:2013. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theISO 27001:2013controls. Many of the controls are implemented with anAzure Policyinitiative definition. 
To review the complete initiat",2025-09-02T08:00:00.000Z,generated-reference,security,0.7,True,"Provides detailed mapping between ISO 27001:2013 controls and Azure Policy initiatives in Azure Government, which is specific security/compliance configuration knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-nist-sp-800-171-r2,NIST SP 800-171 R2,Regulatory Compliance details for NIST SP 800-171 R2 (Azure Government) - Azure Policy,Use Azure Policy for NIST SP 800-171 R2,Details of the NIST SP 800-171 R2 (Azure Government) Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative +https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-nist-sp-800-171-r2,NIST SP 800-171 R2,Regulatory Compliance details for NIST SP 800-171 R2 (Azure Government) - Azure Policy,Map NIST SP 800-171 R2 controls to Azure Policy in Azure Government,Details of the NIST SP 800-171 R2 (Azure Government) Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps tocompliance domainsandcontrolsin NIST SP 800-171 R2 (Azure Government). For more information about this compliance standard, seeNIST SP 800-171 R2. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theNIST SP 800-171 R2controls. Many of the controls -are implemented with anAzure Policyinitiative definition. 
To review the comp",2026-01-12T08:00:00.000Z,generated-reference,security,0.7,True,"Maps NIST SP 800-171 R2 controls to Azure Policy initiative definitions in Azure Government, which is specific security/compliance implementation guidance.",unchanged -https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-nist-sp-800-53-r4,NIST SP 800-53 Rev. 4,Regulatory Compliance details for NIST SP 800-53 Rev. 4 (Azure Government) - Azure Policy,Implement NIST SP 800-53 R4 with Azure Policy,Details of the NIST SP 800-53 Rev. 4 (Azure Government) Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative +are implemented with anAzure Policyinitiative definition. To review the comp",2026-04-18T08:00:00.000Z,generated-reference,security,0.7,True,"Contains a detailed mapping of NIST SP 800-171 R2 controls to Azure Policy initiatives and definitions for Azure Government. These mappings are concrete, product-specific security/compliance implementation details, not general conceptual content.",updated +https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-nist-sp-800-53-r4,NIST SP 800-53 Rev. 4,Regulatory Compliance details for NIST SP 800-53 Rev. 4 (Azure Government) - Azure Policy,Implement NIST SP 800-53 R4 with Azure Policy in Azure Government,Details of the NIST SP 800-53 Rev. 4 (Azure Government) Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps tocompliance domainsandcontrolsin NIST SP 800-53 Rev. 4 (Azure Government). For more information about this compliance standard, seeNIST SP 800-53 Rev. 4. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. 
The following mappings are to theNIST SP 800-53 Rev. 4controls. Many of the controls -are implemented with anAzure Policyinitiative definition. To review",2025-10-20T08:00:00.000Z,generated-reference,security,0.7,True,"Contains mappings from NIST SP 800-53 Rev. 4 controls to Azure Policy initiatives in Azure Government, a concrete, product-specific security/compliance configuration reference.",unchanged -https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-nist-sp-800-53-r5,NIST SP 800-53 Rev. 5,Regulatory Compliance details for NIST SP 800-53 Rev. 5 (Azure Government) - Azure Policy,Implement NIST SP 800-53 R5 with Azure Policy,Details of the NIST SP 800-53 Rev. 5 (Azure Government) Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative +are implemented with anAzure Policyinitiative definition. To review",2026-04-18T08:00:00.000Z,generated-reference,security,0.7,True,"Provides detailed mappings from NIST SP 800-53 Rev. 4 controls to Azure Policy initiatives and definitions. These mappings are concrete, product-specific compliance implementation details that an LLM would not reliably know without the page, fitting the security category.",updated +https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-nist-sp-800-53-r5,NIST SP 800-53 Rev. 5,Regulatory Compliance details for NIST SP 800-53 Rev. 5 (Azure Government) - Azure Policy,Implement NIST SP 800-53 R5 with Azure Policy in Azure Government,Details of the NIST SP 800-53 Rev. 5 (Azure Government) Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps tocompliance domainsandcontrolsin NIST SP 800-53 Rev. 5 (Azure Government). 
For more information about this compliance standard, seeNIST SP 800-53 Rev. 5. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theNIST SP 800-53 Rev. 5controls. Many of the controls -are implemented with anAzure Policyinitiative definition. To review",2025-10-20T08:00:00.000Z,generated-reference,security,0.7,True,Same pattern as index 7 but for Rev. 5; provides detailed control-to-policy mappings unique to Azure Government and this standard.,unchanged +are implemented with anAzure Policyinitiative definition. To review",2026-04-18T08:00:00.000Z,generated-reference,security,0.7,True,"Similar to the R4 page, this one maps NIST SP 800-53 Rev. 5 controls to specific Azure Policy initiatives/definitions. The exact mappings and identifiers are expert, product-specific security/compliance configuration data.",updated https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-soc-2,SOC 2 Type 2,Regulatory Compliance details for System and Organization Controls (SOC) 2 (Azure Government) - Azure Policy,Align SOC 2 controls with Azure Policy in Azure Government,Details of the System and Organization Controls (SOC) 2 (Azure Government) Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with as,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps tocompliance domainsandcontrolsin System and Organization Controls (SOC) 2 (Azure Government). For more information about this compliance standard, seeSystem and Organization Controls (SOC) 2. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theSystem and Organization Controls (SOC) 2controls. 
Many of the controls -are implem",2026-03-19T08:00:00.000Z,generated-reference,security,0.78,True,"Page describes the mapping of SOC 2 controls to Azure Policy regulatory compliance initiatives and individual policy definitions for Azure Government. These mappings and policy identifiers are detailed, product-specific security/compliance configuration knowledge, qualifying as expert security content.",unchanged +are implem",2026-04-18T08:00:00.000Z,generated-reference,security,0.7,True,"Provides mappings from SOC 2 controls to Azure Policy initiatives and definitions. The specific control mappings and policy identifiers are expert, product-specific security/compliance configuration information.",updated https://learn.microsoft.com/en-us/azure/governance/policy/samples/guest-configuration-baseline-cis-linux,CIS Security Benchmarks for Linux Workloads - Overview,Reference - Built-in CIS Security Benchmarks for Linux Workloads via Machine Configuration - Azure Policy,Apply CIS Linux security baselines via Machine Configuration,Reference - Built-in CIS Security Benchmarks for Linux Workloads via Machine Configuration,,2025-11-08T06:15:00.000Z,reference,security,0.7,True,"Reference for built-in CIS security benchmarks for Linux workloads via Machine Configuration, including detailed rules and configuration parameters, which are security baseline specifics.",unchanged https://learn.microsoft.com/en-us/azure/governance/policy/samples/guest-configuration-baseline-docker,Docker host security baseline,Reference - Azure Policy guest configuration baseline for Docker - Azure Policy,Apply Docker security baseline via guest configuration,Details of the Docker baseline on Azure implemented through Azure Policy guest configuration.,"Caution This article references CentOS, a Linux distribution that is End Of Life (EOL) status. Please consider your use and planning accordingly. For more information, see theCentOS End Of Life guidance. 
This article details the configuration settings for Docker hosts as applicable in the following implementations: For more information, seeUnderstand the guest configuration feature of Azure PolicyandOverview of the Azure Security Benchmark (V2).",2022-08-04T11:12:00.000Z,reference,security,0.7,True,"Details configuration settings for Docker hosts as part of Azure Security Benchmark via guest configuration, including concrete security settings and checks.",unchanged @@ -272,32 +272,32 @@ definition maps tocompliance domainsandcontrolsin ISO 27001:2013. For more information about this compliance standard, seeISO 27001:2013. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theISO 27001:2013controls. Many of the controls are implemented with anAzure Policyinitiative definition. To review the complete initiative definition, ope",2025-09-02T08:00:00.000Z,generated-reference,security,0.7,True,Control-to-policy mapping for ISO 27001:2013; Azure-specific security/compliance implementation details.,unchanged -https://learn.microsoft.com/en-us/azure/governance/policy/samples/mcfs-baseline-confidential,Microsoft Cloud for Sovereignty Confidential,Regulatory Compliance details for Microsoft Cloud for Sovereignty Baseline Confidential Policies - Azure Policy,Use Azure Policy for Sovereignty Baseline Confidential compliance,Details of the Microsoft Cloud for Sovereignty Baseline Confidential Policies Regulatory Compliance built-in initiative. 
Each control is mapped to one or more Azure Policy definitions that assist with,"The following article details how the Azure Policy Regulatory Compliance built-in initiative +https://learn.microsoft.com/en-us/azure/governance/policy/samples/mcfs-baseline-confidential,Microsoft Cloud for Sovereignty Confidential,Regulatory Compliance details for Microsoft Cloud for Sovereignty Baseline Confidential Policies - Azure Policy,Use Sovereignty Baseline Confidential policies for compliance,Details of the Microsoft Cloud for Sovereignty Baseline Confidential Policies Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps tocompliance domainsandcontrolsin Microsoft Cloud for Sovereignty Baseline Confidential Policies. -For more information about this compliance standard, seeMicrosoft Cloud for Sovereignty Baseline Confidential Policies. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theMicrosoft Cloud for Sovereignty Baseline Confidential Po",2025-08-05T20:17:00.000Z,generated-reference,security,0.7,True,"Provides mappings for Microsoft Cloud for Sovereignty Baseline Confidential controls to Azure Policy; niche, product-specific compliance configuration.",unchanged -https://learn.microsoft.com/en-us/azure/governance/policy/samples/mcfs-baseline-global,Microsoft Cloud for Sovereignty Global,Regulatory Compliance details for Microsoft Cloud for Sovereignty Baseline Global Policies - Azure Policy,Use Azure Policy for Sovereignty Baseline Global compliance,Details of the Microsoft Cloud for Sovereignty Baseline Global Policies Regulatory Compliance built-in initiative. 
Each control is mapped to one or more Azure Policy definitions that assist with asses,"The following article details how the Azure Policy Regulatory Compliance built-in initiative +For more information about this compliance standard, seeMicrosoft Cloud for Sovereignty Baseline Confidential Policies. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theMicrosoft Cloud for Sovereignty Baseline Confidential Po",2026-04-18T08:00:00.000Z,generated-reference,security,0.8,True,Describes how Microsoft Cloud for Sovereignty Baseline Confidential controls map to Azure Policy definitions in a regulatory initiative. These mappings are specific security/compliance configuration knowledge.,updated +https://learn.microsoft.com/en-us/azure/governance/policy/samples/mcfs-baseline-global,Microsoft Cloud for Sovereignty Global,Regulatory Compliance details for Microsoft Cloud for Sovereignty Baseline Global Policies - Azure Policy,Use Sovereignty Baseline Global policies for compliance,Details of the Microsoft Cloud for Sovereignty Baseline Global Policies Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with asses,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps tocompliance domainsandcontrolsin Microsoft Cloud for Sovereignty Baseline Global Policies. -For more information about this compliance standard, seeMicrosoft Cloud for Sovereignty Baseline Global Policies. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theMicrosoft Cloud for Sovereignty Baseline Global Policiescontrols. 
Ma",2025-08-05T20:17:00.000Z,generated-reference,security,0.7,True,Similar to index 23 but for Global Baseline; detailed mapping of controls to Azure Policy artifacts.,unchanged -https://learn.microsoft.com/en-us/azure/governance/policy/samples/nist-sp-800-171-r2,NIST SP 800-171 R2,Regulatory Compliance details for NIST SP 800-171 R2 - Azure Policy,Use Azure Policy to meet NIST SP 800-171 R2,Details of the NIST SP 800-171 R2 Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative +For more information about this compliance standard, seeMicrosoft Cloud for Sovereignty Baseline Global Policies. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theMicrosoft Cloud for Sovereignty Baseline Global Policiescontrols. Ma",2026-04-18T08:00:00.000Z,generated-reference,security,0.8,True,"Shows mappings from Microsoft Cloud for Sovereignty Baseline Global controls to Azure Policy definitions. This is detailed, product-specific security/compliance configuration information.",updated +https://learn.microsoft.com/en-us/azure/governance/policy/samples/nist-sp-800-171-r2,NIST SP 800-171 R2,Regulatory Compliance details for NIST SP 800-171 R2 - Azure Policy,Map NIST SP 800-171 R2 controls to Azure Policy,Details of the NIST SP 800-171 R2 Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps tocompliance domainsandcontrolsin NIST SP 800-171 R2. For more information about this compliance standard, seeNIST SP 800-171 R2. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theNIST SP 800-171 R2controls. 
Many of the controls are implemented with anAzure Policyinitiative definition. To review the complete -initiative def",2026-02-19T08:00:00.000Z,generated-reference,security,0.7,True,Maps NIST 800-171 R2 controls to Azure Policy initiatives; concrete Azure compliance implementation guidance.,unchanged -https://learn.microsoft.com/en-us/azure/governance/policy/samples/nist-sp-800-53-r4,NIST SP 800-53 Rev. 4,Regulatory Compliance details for NIST SP 800-53 Rev. 4 - Azure Policy,Implement NIST SP 800-53 Rev. 4 with Azure Policy,Details of the NIST SP 800-53 Rev. 4 Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative +initiative def",2026-04-18T08:00:00.000Z,generated-reference,security,0.8,True,Details how NIST SP 800-171 R2 controls map to Azure Policy definitions in a regulatory initiative. This mapping is specialized security/compliance configuration knowledge.,updated +https://learn.microsoft.com/en-us/azure/governance/policy/samples/nist-sp-800-53-r4,NIST SP 800-53 Rev. 4,Regulatory Compliance details for NIST SP 800-53 Rev. 4 - Azure Policy,Map NIST SP 800-53 Rev. 4 controls to Azure Policy,Details of the NIST SP 800-53 Rev. 4 Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps tocompliance domainsandcontrolsin NIST SP 800-53 Rev. 4. For more information about this compliance standard, seeNIST SP 800-53 Rev. 4. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theNIST SP 800-53 Rev. 4controls. Many of the controls are implemented with anAzure Policyinitiative definition. 
To review the complete -initi",2026-02-19T08:00:00.000Z,generated-reference,security,0.75,True,Control mappings for NIST 800-53 Rev. 4 to Azure Policy; expert-level security/compliance configuration.,unchanged -https://learn.microsoft.com/en-us/azure/governance/policy/samples/nist-sp-800-53-r5,NIST SP 800-53 Rev. 5,Regulatory Compliance details for NIST SP 800-53 Rev. 5 - Azure Policy,Implement NIST SP 800-53 Rev. 5 with Azure Policy,Details of the NIST SP 800-53 Rev. 5 Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative +initi",2026-04-18T08:00:00.000Z,generated-reference,security,0.8,True,"Contains a mapping table from NIST SP 800-53 Rev. 4 controls to specific Azure Policy definitions in a built-in initiative, which is specialized security/compliance configuration knowledge.",updated +https://learn.microsoft.com/en-us/azure/governance/policy/samples/nist-sp-800-53-r5,NIST SP 800-53 Rev. 5,Regulatory Compliance details for NIST SP 800-53 Rev. 5 - Azure Policy,Map NIST SP 800-53 Rev. 5 controls to Azure Policy,Details of the NIST SP 800-53 Rev. 5 Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps tocompliance domainsandcontrolsin NIST SP 800-53 Rev. 5. For more information about this compliance standard, seeNIST SP 800-53 Rev. 5. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theNIST SP 800-53 Rev. 5controls. Many of the controls are implemented with anAzure Policyinitiative definition. To review the complete -initi",2026-02-19T08:00:00.000Z,generated-reference,security,0.75,True,Similar to Rev. 
4 but updated controls; detailed Azure Policy mapping for NIST 800-53 Rev. 5.,unchanged -https://learn.microsoft.com/en-us/azure/governance/policy/samples/nl-bio-cloud-theme,NL BIO Cloud Theme,Regulatory Compliance details for NL BIO Cloud Theme - Azure Policy,Map Azure Policy to NL BIO Cloud Theme controls,Details of the NL BIO Cloud Theme Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative +initi",2026-04-18T08:00:00.000Z,generated-reference,security,0.8,True,"Provides detailed mappings between NIST SP 800-53 Rev. 5 controls and Azure Policy definitions. These are concrete, product-specific security/compliance configuration relationships.",updated +https://learn.microsoft.com/en-us/azure/governance/policy/samples/nl-bio-cloud-theme,NL BIO Cloud Theme,Regulatory Compliance details for NL BIO Cloud Theme - Azure Policy,Map NL BIO Cloud Theme controls to Azure Policy,Details of the NL BIO Cloud Theme Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps tocompliance domainsandcontrolsin NL BIO Cloud Theme. For more information about this compliance standard, seeNL BIO Cloud Theme. To understandOwnership, review thepolicy typeandShared responsibility in the cloud. The following mappings are to theNL BIO Cloud Themecontrols. Many of the controls are implemented with anAzure Policyinitiative definition. 
To review the complete -initiative def",2026-01-12T08:00:00.000Z,generated-reference,security,0.7,True,Provides mappings from NL BIO Cloud Theme controls to Azure Policy; specialized regional compliance configuration.,unchanged +initiative def",2026-04-18T08:00:00.000Z,generated-reference,security,0.8,True,Shows mappings from NL BIO Cloud Theme controls to specific Azure Policy definitions. These are product-specific security/compliance configuration mappings not generally known to an LLM.,updated https://learn.microsoft.com/en-us/azure/governance/policy/samples/pattern-count-operator,Count operator,Pattern: The count operator in a policy definition - Azure Policy,Count array members with Azure Policy count operator,This Azure Policy pattern provides an example of how to use the count operator in a policy definition.,Thecountoperator evaluates members of a [*] alias.,2025-08-05T20:17:00.000Z,sample,best-practices,0.6,True,"Provides concrete examples of the count operator over [*] aliases in Azure Policy, a product-specific rule pattern.",unchanged https://learn.microsoft.com/en-us/azure/governance/policy/samples/pattern-deploy-resources,Deploy resources,Pattern: Deploy resources with a policy definition - Azure Policy,Deploy resources using deployIfNotExists policies,This Azure Policy pattern provides an example of how to deploy resources with a deployIfNotExists policy definition.,"ThedeployIfNotExistseffect makes it possible to deploy anAzure Resource Manager template(ARM @@ -344,25 +344,25 @@ https://learn.microsoft.com/en-us/azure/governance/policy/samples/resource-graph for Azure Policy.",2025-08-05T20:17:00.000Z,sample,integrations,0.65,True,Collection of Resource Graph sample queries specifically targeting Azure Policy resource types and tables. 
Contains concrete query patterns and schema usage that are product-specific integration details.,unchanged https://learn.microsoft.com/en-us/azure/governance/policy/samples/resource-graph-samples-guest-configuration,Guest configuration resource graph queries,Azure Resource Graph sample queries for Azure Policy guest configuration - Azure Policy,Query guest configuration state via Resource Graph,Sample Azure Resource Graph queries for Azure Policy guest configuration showing use of resource types and tables to access related resources and properties.,"This page is a collection of Azure Resource Graph sample queries for Azure Policy guest configuration.",2025-02-27T05:33:00.000Z,sample,integrations,0.65,True,"Collection of Azure Resource Graph queries specifically for Azure Policy guest configuration resources and tables, providing concrete integration/query patterns.",unchanged -https://learn.microsoft.com/en-us/azure/governance/policy/samples/rmit-malaysia,RMIT Malaysia,Regulatory Compliance details for RMIT Malaysia - Azure Policy,Map Azure Policy to RMIT Malaysia compliance controls,Details of the RMIT Malaysia Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative +https://learn.microsoft.com/en-us/azure/governance/policy/samples/rmit-malaysia,RMIT Malaysia,Regulatory Compliance details for RMIT Malaysia - Azure Policy,Map RMIT Malaysia controls to Azure Policy,Details of the RMIT Malaysia Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps to compliance domains and controls in RMIT Malaysia. For more information about this compliance standard, see RMIT Malaysia. 
To understand Ownership, review the policy type and Shared responsibility in the cloud. The following mappings are to the RMIT Malaysia controls. Many of the controls are implemented with an Azure Policy initiative definition. To review the complete -initiative definition, open Po",2026-02-19T08:00:00.000Z,generated-reference,security,0.7,True,Provides RMIT Malaysia control mappings to Azure Policy initiatives; specialized security/compliance configuration.,unchanged +initiative definition, open Po",2026-04-18T08:00:00.000Z,generated-reference,security,0.8,True,"Provides control-to-policy mappings for the RMIT Malaysia standard using Azure Policy initiatives. This is detailed, product-specific security/compliance configuration information.",updated https://learn.microsoft.com/en-us/azure/governance/policy/samples/soc-2,SOC 2 Type 2,Regulatory Compliance details for System and Organization Controls (SOC) 2 - Azure Policy,Map SOC 2 controls to Azure Policy initiatives,Details of the System and Organization Controls (SOC) 2 Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps to compliance domains and controls in System and Organization Controls (SOC) 2. For more information about this compliance standard, see System and Organization Controls (SOC) 2. To understand Ownership, review the policy type and Shared responsibility in the cloud. The following mappings are to the System and Organization Controls (SOC) 2 controls. Many of the controls -are implemented with an Azure ",2026-03-19T08:00:00.000Z,generated-reference,security,0.78,True,"Shows mappings between SOC 2 controls and Azure Policy definitions/initiatives for regulatory compliance. 
These mappings and policy references are expert, product-specific security configuration details.",unchanged +are implemented with an Azure ",2026-04-18T08:00:00.000Z,generated-reference,security,0.8,True,Describes how SOC 2 controls map to Azure Policy definitions in a regulatory compliance initiative. These mappings are specific security/compliance configuration knowledge.,updated https://learn.microsoft.com/en-us/azure/governance/policy/samples/spain-ens,Spain ENS,Regulatory Compliance details for Spain ENS - Azure Policy,Use Azure Policy for Spain ENS regulatory compliance,Details of the Spain ENS Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps to compliance domains and controls in Spain ENS. For more information about this compliance standard, see Spain ENS. To understand Ownership, review the policy type and Shared responsibility in the cloud. The following mappings are to the Spain ENS controls. Many of the controls are implemented with an Azure Policy initiative definition. To review the complete initiative definition, open Policy in the A",2026-02-24T08:00:00.000Z,generated-reference,security,0.7,True,Maps Spain ENS controls to Azure Policy; Azure-specific implementation of national compliance standard.,unchanged -https://learn.microsoft.com/en-us/azure/governance/policy/samples/swift-csp-cscf-2021,SWIFT CSP-CSCF 2021,Regulatory Compliance details for SWIFT CSP-CSCF v2021 - Azure Policy,Map Azure Policy to SWIFT CSP-CSCF v2021 controls,Details of the SWIFT CSP-CSCF v2021 Regulatory Compliance built-in initiative. 
Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative +https://learn.microsoft.com/en-us/azure/governance/policy/samples/swift-csp-cscf-2021,SWIFT CSP-CSCF 2021,Regulatory Compliance details for SWIFT CSP-CSCF v2021 - Azure Policy,Map SWIFT CSP-CSCF 2021 controls to Azure Policy,Details of the SWIFT CSP-CSCF v2021 Regulatory Compliance built-in initiative. Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps to compliance domains and controls in SWIFT CSP-CSCF v2021. For more information about this compliance standard, see SWIFT CSP-CSCF v2021. To understand Ownership, review the policy type and Shared responsibility in the cloud. The following mappings are to the SWIFT CSP-CSCF v2021 controls. Many of the controls are implemented with an Azure Policy initiative definition. To review the complete -initiati",2026-01-12T08:00:00.000Z,generated-reference,security,0.75,True,Provides mappings from SWIFT CSP-CSCF 2021 controls to Azure Policy initiatives; highly specialized financial-sector security configuration.,unchanged +initiati",2026-04-18T08:00:00.000Z,generated-reference,security,0.8,True,"Contains mappings from SWIFT CSP-CSCF v2021 controls to Azure Policy definitions. This is specialized, product-specific security/compliance configuration guidance.",updated https://learn.microsoft.com/en-us/azure/governance/policy/samples/swift-csp-cscf-2022,SWIFT CSP-CSCF 2022,Regulatory Compliance details for SWIFT CSP-CSCF v2022 - Azure Policy,Map Azure Policy to SWIFT CSP-CSCF v2022 controls,Details of the SWIFT CSP-CSCF v2022 Regulatory Compliance built-in initiative. 
Each control is mapped to one or more Azure Policy definitions that assist with assessment.,"The following article details how the Azure Policy Regulatory Compliance built-in initiative definition maps to compliance domains and controls in SWIFT CSP-CSCF v2022. For more information about this compliance standard, see SWIFT CSP-CSCF v2022. To understand Ownership, review the policy type and Shared responsibility in the cloud. The following mappings are to the SWIFT CSP-CSCF v2022 controls. Many of the controls diff --git a/products/azure-policy/report.md b/products/azure-policy/report.md index 69c70085..caabb198 100644 --- a/products/azure-policy/report.md +++ b/products/azure-policy/report.md @@ -1,14 +1,14 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: configuration: Authoring, assigning, storing, and securing Machine Configuration (guest configuration) packages and policies, plus prerequisites, networking, remediation, and compliance result analysis. deployment: How to deploy and assign Machine Configuration packages via ARM/Bicep/Terraform/REST, publish packages to storage, and use safe deployment practices with Azure Policy. - security: Using Azure Policy and Machine Configuration for security baselines and - mapping/enforcing compliance with standards (CIS, NIST, ISO, PCI, FedRAMP, HIPAA, - SOC 2, regional regs). + security: 'Using Azure Policy and Machine Configuration for security/compliance: + baselines, guest config, package signing, exemptions, and mappings to CIS, NIST, + ISO, PCI, FedRAMP, HIPAA, CMMC, and regional standards.' best-practices: 'Designing effective Azure Policy definitions: effects, logical/value operators, arrays, tags, initiatives, parameters, and testing/behavior of Machine/Guest Configuration.' 
@@ -23,17 +23,16 @@ category_descriptions: policy analysis skill_description: Expert knowledge for Azure Policy development including troubleshooting, best practices, decision making, security, configuration, integrations & coding - patterns, and deployment. Use when authoring Machine Configuration packages, deploying - via ARM/Bicep/Terraform, enforcing security baselines, migrating from DSC, or querying - compliance with Resource Graph, and other Azure Policy related development tasks. - Not for Azure Blueprints (use azure-blueprints), Azure Role-based access control - (use azure-rbac), Azure Resource Manager (use azure-resource-manager), Azure Security - (use azure-security). -use_when: Use when authoring Machine Configuration packages, deploying via ARM/Bicep/Terraform, - enforcing security baselines, migrating from DSC, or querying compliance with Resource - Graph, and other Azure Policy related development tasks. -confusable_not_for: Not for Azure Blueprints (use azure-blueprints), Azure Role-based - access control (use azure-rbac), Azure Resource Manager (use azure-resource-manager), + patterns, and deployment. Use when authoring guest config packages, deploying via + ARM/Bicep/Terraform, mapping to CIS/NIST, or querying compliance with Resource Graph, + and other Azure Policy related development tasks. Not for Azure Blueprints (use + azure-blueprints), Azure Resource Manager (use azure-resource-manager), Azure Role-based + access control (use azure-rbac), Azure Security (use azure-security). +use_when: Use when authoring guest config packages, deploying via ARM/Bicep/Terraform, + mapping to CIS/NIST, or querying compliance with Resource Graph, and other Azure + Policy related development tasks. +confusable_not_for: Not for Azure Blueprints (use azure-blueprints), Azure Resource + Manager (use azure-resource-manager), Azure Role-based access control (use azure-rbac), Azure Security (use azure-security). 
--- # Azure Policy Crawl Report @@ -43,14 +42,14 @@ confusable_not_for: Not for Azure Blueprints (use azure-blueprints), Azure Role- - **Total Pages**: 158 - **Fetched**: 158 - **Fetch Failed**: 0 -- **Classified**: 96 -- **Unclassified**: 62 +- **Classified**: 98 +- **Unclassified**: 60 ### Incremental Update -- **New Pages**: 3 -- **Updated Pages**: 1 -- **Unchanged**: 154 -- **Deleted Pages**: 1 +- **New Pages**: 0 +- **Updated Pages**: 23 +- **Unchanged**: 135 +- **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-policy/azure-policy.csv` ## Classification Statistics @@ -62,26 +61,55 @@ confusable_not_for: Not for Azure Blueprints (use azure-blueprints), Azure Role- | decision-making | 3 | 1.9% | | deployment | 7 | 4.4% | | integrations | 2 | 1.3% | -| security | 58 | 36.7% | +| security | 60 | 38.0% | | troubleshooting | 3 | 1.9% | -| *(Unclassified)* | 62 | 39.2% | +| *(Unclassified)* | 60 | 38.0% | ## Changes -### New Pages +### Updated Pages -- [Overview](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/whats-new/agent/) - [Windows agent release notes](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/whats-new/agent/windows) + - Updated: 2026-04-15T06:10:00.000Z → 2026-04-24T06:15:00.000Z - [Linux agent release notes](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/whats-new/agent/linux) - -### Updated Pages - -- [Policy rule](https://learn.microsoft.com/en-us/azure/governance/policy/concepts/definition-structure-policy-rule) - - Updated: 2025-03-20T22:03:00.000Z → 2026-04-17T22:08:00.000Z - -### Deleted Pages - -- ~~What's new in the agent~~ (https://learn.microsoft.com/en-us/azure/governance/machine-configuration/whats-new/agent) + - Updated: 2026-04-15T06:10:00.000Z → 2026-04-24T06:15:00.000Z +- [FedRAMP High](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-fedramp-high) + - Updated: 2025-10-20T08:00:00.000Z → 
2026-04-18T08:00:00.000Z +- [FedRAMP Moderate](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-fedramp-moderate) + - Updated: 2025-10-20T08:00:00.000Z → 2026-04-18T08:00:00.000Z +- [Microsoft cloud security benchmark](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-azure-security-benchmark) + - Updated: 2025-10-31T08:00:00.000Z → 2026-04-18T08:00:00.000Z +- [NIST SP 800-53 Rev. 4](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-nist-sp-800-53-r4) + - Updated: 2025-10-20T08:00:00.000Z → 2026-04-18T08:00:00.000Z +- [NIST SP 800-53 Rev. 5](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-nist-sp-800-53-r5) + - Updated: 2025-10-20T08:00:00.000Z → 2026-04-18T08:00:00.000Z +- [NIST SP 800-171 R2](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-nist-sp-800-171-r2) + - Updated: 2026-01-12T08:00:00.000Z → 2026-04-18T08:00:00.000Z +- [SOC 2 Type 2](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-soc-2) + - Updated: 2026-03-19T08:00:00.000Z → 2026-04-18T08:00:00.000Z +- [Built-in policies](https://learn.microsoft.com/en-us/azure/governance/policy/samples/built-in-policies) + - Updated: 2026-01-26T08:00:00.000Z → 2026-04-18T08:00:00.000Z +- [Built-in initiatives](https://learn.microsoft.com/en-us/azure/governance/policy/samples/built-in-initiatives) + - Updated: 2025-11-18T08:00:00.000Z → 2026-04-19T11:06:00.000Z +- [FedRAMP High](https://learn.microsoft.com/en-us/azure/governance/policy/samples/fedramp-high) + - Updated: 2026-02-19T08:00:00.000Z → 2026-04-18T08:00:00.000Z +- [FedRAMP Moderate](https://learn.microsoft.com/en-us/azure/governance/policy/samples/fedramp-moderate) + - Updated: 2026-02-19T08:00:00.000Z → 2026-04-18T08:00:00.000Z +- [Microsoft cloud security benchmark](https://learn.microsoft.com/en-us/azure/governance/policy/samples/azure-security-benchmark) + - Updated: 2026-02-24T08:00:00.000Z → 2026-04-18T08:00:00.000Z +- [Microsoft 
Cloud for Sovereignty Confidential](https://learn.microsoft.com/en-us/azure/governance/policy/samples/mcfs-baseline-confidential) + - Updated: 2025-08-05T20:17:00.000Z → 2026-04-18T08:00:00.000Z +- [Microsoft Cloud for Sovereignty Global](https://learn.microsoft.com/en-us/azure/governance/policy/samples/mcfs-baseline-global) + - Updated: 2025-08-05T20:17:00.000Z → 2026-04-18T08:00:00.000Z +- [NIST SP 800-53 Rev. 4](https://learn.microsoft.com/en-us/azure/governance/policy/samples/nist-sp-800-53-r4) + - Updated: 2026-02-19T08:00:00.000Z → 2026-04-18T08:00:00.000Z +- [NIST SP 800-53 Rev. 5](https://learn.microsoft.com/en-us/azure/governance/policy/samples/nist-sp-800-53-r5) + - Updated: 2026-02-19T08:00:00.000Z → 2026-04-18T08:00:00.000Z +- [NIST SP 800-171 R2](https://learn.microsoft.com/en-us/azure/governance/policy/samples/nist-sp-800-171-r2) + - Updated: 2026-02-19T08:00:00.000Z → 2026-04-18T08:00:00.000Z +- [NL BIO Cloud Theme](https://learn.microsoft.com/en-us/azure/governance/policy/samples/nl-bio-cloud-theme) + - Updated: 2026-01-12T08:00:00.000Z → 2026-04-18T08:00:00.000Z +- *...and 3 more* ## Classified Pages @@ -94,8 +122,19 @@ confusable_not_for: Not for Azure Blueprints (use azure-blueprints), Azure Role- | [5. Access a custom package](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/how-to/develop-custom-package/5-access-package) | configuration | 0.80 | Explains using managed identity resource IDs or SAS tokens; specific access configuration patterns for this service. | | [Common issues](https://learn.microsoft.com/en-us/azure/governance/policy/troubleshoot/general) | troubleshooting | 0.80 | Explicit troubleshooting article describing various errors when creating definitions, using SDKs, or Kubernetes add-on, with suggested resolutions; matches troubleshooting criteria. 
| | [Create a custom policy definition](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/how-to/create-policy-definition) | configuration | 0.80 | Includes required extensions, initiative names, and version requirements; detailed policy definition configuration for this feature. | -| [Microsoft cloud security benchmark](https://learn.microsoft.com/en-us/azure/governance/policy/samples/azure-security-benchmark) | security | 0.80 | Maps Microsoft cloud security benchmark controls to Azure Policy initiatives; central reference for Azure security baseline implementation. | +| [FedRAMP High](https://learn.microsoft.com/en-us/azure/governance/policy/samples/fedramp-high) | security | 0.80 | Contains detailed mappings from specific FedRAMP High controls to concrete Azure Policy definitions within a regulatory compliance initiative. This is product-specific compliance/security configuration knowledge (which policy implements which control) that is unlikely to be memorized by an LLM. | +| [FedRAMP Moderate](https://learn.microsoft.com/en-us/azure/governance/policy/samples/fedramp-moderate) | security | 0.80 | Details how each FedRAMP Moderate control maps to one or more Azure Policy definitions in a built-in initiative. These mappings are specific security/compliance configuration relationships unique to Azure Policy. | +| [Microsoft Cloud for Sovereignty Confidential](https://learn.microsoft.com/en-us/azure/governance/policy/samples/mcfs-baseline-confidential) | security | 0.80 | Describes how Microsoft Cloud for Sovereignty Baseline Confidential controls map to Azure Policy definitions in a regulatory initiative. These mappings are specific security/compliance configuration knowledge. | +| [Microsoft Cloud for Sovereignty Global](https://learn.microsoft.com/en-us/azure/governance/policy/samples/mcfs-baseline-global) | security | 0.80 | Shows mappings from Microsoft Cloud for Sovereignty Baseline Global controls to Azure Policy definitions. 
This is detailed, product-specific security/compliance configuration information. | +| [Microsoft cloud security benchmark](https://learn.microsoft.com/en-us/azure/governance/policy/samples/azure-security-benchmark) | security | 0.80 | Provides control-by-control mappings between Microsoft cloud security benchmark domains/controls and Azure Policy definitions. This is detailed, product-specific security/compliance configuration guidance. | +| [NIST SP 800-171 R2](https://learn.microsoft.com/en-us/azure/governance/policy/samples/nist-sp-800-171-r2) | security | 0.80 | Details how NIST SP 800-171 R2 controls map to Azure Policy definitions in a regulatory initiative. This mapping is specialized security/compliance configuration knowledge. | +| [NIST SP 800-53 Rev. 4](https://learn.microsoft.com/en-us/azure/governance/policy/samples/nist-sp-800-53-r4) | security | 0.80 | Contains a mapping table from NIST SP 800-53 Rev. 4 controls to specific Azure Policy definitions in a built-in initiative, which is specialized security/compliance configuration knowledge. | +| [NIST SP 800-53 Rev. 5](https://learn.microsoft.com/en-us/azure/governance/policy/samples/nist-sp-800-53-r5) | security | 0.80 | Provides detailed mappings between NIST SP 800-53 Rev. 5 controls and Azure Policy definitions. These are concrete, product-specific security/compliance configuration relationships. | +| [NL BIO Cloud Theme](https://learn.microsoft.com/en-us/azure/governance/policy/samples/nl-bio-cloud-theme) | security | 0.80 | Shows mappings from NL BIO Cloud Theme controls to specific Azure Policy definitions. These are product-specific security/compliance configuration mappings not generally known to an LLM. 
| | [Overview](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/how-to/develop-custom-package/overview) | configuration | 0.80 | Covers authoring and validating custom packages, including GA limitations and usage constraints; product-specific package configuration details. | +| [RMIT Malaysia](https://learn.microsoft.com/en-us/azure/governance/policy/samples/rmit-malaysia) | security | 0.80 | Provides control-to-policy mappings for the RMIT Malaysia standard using Azure Policy initiatives. This is detailed, product-specific security/compliance configuration information. | +| [SOC 2 Type 2](https://learn.microsoft.com/en-us/azure/governance/policy/samples/soc-2) | security | 0.80 | Describes how SOC 2 controls map to Azure Policy definitions in a regulatory compliance initiative. These mappings are specific security/compliance configuration knowledge. | +| [SWIFT CSP-CSCF 2021](https://learn.microsoft.com/en-us/azure/governance/policy/samples/swift-csp-cscf-2021) | security | 0.80 | Contains mappings from SWIFT CSP-CSCF v2021 controls to Azure Policy definitions. This is specialized, product-specific security/compliance configuration guidance. | | [Understand the baseline settings parameter format](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/how-to/assign-security-baselines/understand-baseline-settings-parameter) | security | 0.80 | Explains baseline parameter format with JSON examples for CIS and Azure Security Baselines; product-specific security configuration schema. | | [Using Bicep](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/how-to/assign-configuration/bicep) | deployment | 0.80 | Provides Bicep examples with specific resource types and properties; product-specific deployment syntax and constraints. 
| | [Using Rest API](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/how-to/assign-configuration/rest-api) | deployment | 0.80 | Shows REST payloads with type names and parent references; detailed deployment API usage unique to this service. | @@ -108,8 +147,6 @@ confusable_not_for: Not for Azure Blueprints (use azure-blueprints), Azure Role- | [CIS Microsoft Azure Foundations Benchmark 2.0.0](https://learn.microsoft.com/en-us/azure/governance/policy/samples/cis-azure-2-0-0) | security | 0.78 | Provides explicit mappings from CIS Microsoft Azure Foundations Benchmark 2.0.0 controls to Azure Policy definitions and initiatives. These mappings are concrete security/compliance configuration details unique to Azure Policy. | | [CMMC Level 3](https://learn.microsoft.com/en-us/azure/governance/policy/samples/cmmc-l3) | security | 0.78 | Lists how CMMC Level 3 controls map to specific Azure Policy definitions within a regulatory compliance initiative. This is detailed, product-specific security/compliance configuration information, matching the security sub-skill. | | [CMMC Level 3](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-cmmc-l3) | security | 0.78 | Page details how CMMC Level 3 controls map to Azure Policy definitions and built-in initiatives for Azure Government. The exact control-to-policy mappings and initiative composition are specific, non-obvious configuration knowledge for securing and assessing environments against CMMC, fitting the security sub-skill. | -| [SOC 2 Type 2](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-soc-2) | security | 0.78 | Page describes the mapping of SOC 2 controls to Azure Policy regulatory compliance initiatives and individual policy definitions for Azure Government. These mappings and policy identifiers are detailed, product-specific security/compliance configuration knowledge, qualifying as expert security content. 
| -| [SOC 2 Type 2](https://learn.microsoft.com/en-us/azure/governance/policy/samples/soc-2) | security | 0.78 | Shows mappings between SOC 2 controls and Azure Policy definitions/initiatives for regulatory compliance. These mappings and policy references are expert, product-specific security configuration details. | | [Assignments](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/concepts/assignments) | configuration | 0.75 | Describes guest assignment resource model, including metadata and version constraints (for example minimum version 1.0.0); product-specific configuration schema. | | [CIS Security Benchmarks - AlmaLinux](https://learn.microsoft.com/en-us/azure/governance/policy/samples/cis-linux/alma-ado) | security | 0.75 | Provides supported CIS benchmarks, mismatched rules, and configurable parameters for AlmaLinux, which are detailed, product- and OS-specific security configuration references. | | [CIS Security Benchmarks - Debian Linux](https://learn.microsoft.com/en-us/azure/governance/policy/samples/cis-linux/debian-ado) | security | 0.75 | Similar to index 24 but for Debian; includes benchmark versions, rule mappings, and parameters that are detailed security configuration data. | @@ -118,21 +155,22 @@ confusable_not_for: Not for Azure Blueprints (use azure-blueprints), Azure Role- | [CIS Security Benchmarks - Rocky Linux](https://learn.microsoft.com/en-us/azure/governance/policy/samples/cis-linux/rocky-ado) | security | 0.75 | Provides Rocky Linux-specific CIS benchmark support details and parameters, which are concrete security configuration references. | | [CIS Security Benchmarks - SUSE Linux Enterprise](https://learn.microsoft.com/en-us/azure/governance/policy/samples/cis-linux/suse-ado) | security | 0.75 | SUSE-specific CIS benchmark reference with supported benchmarks and configurable parameters, representing detailed security baseline guidance. 
| | [CIS Security Benchmarks - Ubuntu Linux](https://learn.microsoft.com/en-us/azure/governance/policy/samples/cis-linux/ubuntu-ado) | security | 0.75 | Ubuntu-specific CIS benchmark details, including supported versions and parameters, which are expert security configuration data. | -| [NIST SP 800-53 Rev. 4](https://learn.microsoft.com/en-us/azure/governance/policy/samples/nist-sp-800-53-r4) | security | 0.75 | Control mappings for NIST 800-53 Rev. 4 to Azure Policy; expert-level security/compliance configuration. | -| [NIST SP 800-53 Rev. 5](https://learn.microsoft.com/en-us/azure/governance/policy/samples/nist-sp-800-53-r5) | security | 0.75 | Similar to Rev. 4 but updated controls; detailed Azure Policy mapping for NIST 800-53 Rev. 5. | | [Network Requirements](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/overview/03-network-requirements) | configuration | 0.75 | Network requirements page usually lists specific endpoints, ports, and Private Link settings; these are concrete configuration parameters. | | [PCI DSS 3.2.1](https://learn.microsoft.com/en-us/azure/governance/policy/samples/pci-dss-3-2-1) | security | 0.75 | Control-by-control mapping for PCI DSS 3.2.1 to Azure Policy definitions; detailed security/compliance configuration. | | [PCI DSS 4.0](https://learn.microsoft.com/en-us/azure/governance/policy/samples/pci-dss-4-0) | security | 0.75 | Similar to 3.2.1 but for PCI DSS v4.0; Azure-specific mapping of controls to policy initiatives. | -| [SWIFT CSP-CSCF 2021](https://learn.microsoft.com/en-us/azure/governance/policy/samples/swift-csp-cscf-2021) | security | 0.75 | Provides mappings from SWIFT CSP-CSCF 2021 controls to Azure Policy initiatives; highly specialized financial-sector security configuration. 
| [SWIFT CSP-CSCF 2022](https://learn.microsoft.com/en-us/azure/governance/policy/samples/swift-csp-cscf-2022) | security | 0.75 | Updated mapping for SWIFT CSP-CSCF 2022; detailed Azure Policy-based implementation of SWIFT security controls. |
| [Specify custom parameters for baseline policy](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/how-to/assign-security-baselines/specify-custom-parameters-for-baseline-policy) | security | 0.75 | Focuses on customizing security baseline parameters; likely includes specific parameter names and allowed values for security controls. |
| [Using Terraform](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/how-to/assign-configuration/terraform) | deployment | 0.75 | Terraform-based deployment of assignments; includes resource blocks and arguments specific to Machine Configuration. |
| [Windows security 2025 baseline](https://learn.microsoft.com/en-us/azure/governance/policy/samples/guest-configuration-baseline-windows-server-2025) | security | 0.75 | Similar to index 33 but specific to Windows Server 2025 with customizable baseline content; includes detailed configuration settings and values. |
| [Windows security baseline](https://learn.microsoft.com/en-us/azure/governance/policy/samples/guest-configuration-baseline-windows) | security | 0.75 | Details configuration settings for Windows Server 2012–2022 baselines, including rules and values, which are concrete security configuration details. |
+| [FedRAMP High](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-fedramp-high) | security | 0.72 | Page provides a detailed mapping of FedRAMP High controls to specific Azure Policy initiative and policy definition IDs/settings for Azure Government. These mappings and control-to-policy relationships are product- and standard-specific, not derivable from general training data, and are used for concrete compliance configuration, fitting the security category. |
+| [FedRAMP Moderate](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-fedramp-moderate) | security | 0.72 | Contains a control-by-control mapping for FedRAMP Moderate to concrete Azure Policy initiative and policy definitions in Azure Government. These mappings are expert, product-specific compliance configuration details, aligning with security-focused configuration for regulatory standards. |
| [3. Test a custom package](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/how-to/develop-custom-package/3-test-package) | best-practices | 0.70 | Describes testing tools and workflow for packages; includes product-specific testing patterns and likely edge-case guidance. |
| [4. Publish a custom package](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/how-to/develop-custom-package/4-publish-package) | deployment | 0.70 | Details publishing to Azure Blob and SAS usage; product-specific deployment location and access requirements for packages. |
| [Australian Government ISM PROTECTED](https://learn.microsoft.com/en-us/azure/governance/policy/samples/australia-ism) | security | 0.70 | Regulatory compliance mapping for Australian Government ISM PROTECTED; contains detailed mappings from specific controls to Azure Policy definitions, which is product-specific security/compliance configuration knowledge. |
+| [Built-in initiatives](https://learn.microsoft.com/en-us/azure/governance/policy/samples/built-in-initiatives) | security | 0.70 | Provides an index of built-in Azure Policy initiative definitions with exact initiative names and categories, which are product-specific compliance/security configuration bundles not generally known from training. |
| [Built-in packages for guest configuration](https://learn.microsoft.com/en-us/azure/governance/policy/samples/built-in-packages) | configuration | 0.70 | Index of built-in guest configuration packages mapped to policy definitions and PowerShell modules. Contains concrete package names and mappings that are configuration reference data. |
+| [Built-in policies](https://learn.microsoft.com/en-us/azure/governance/policy/samples/built-in-policies) | security | 0.70 | Lists concrete built-in Azure Policy definitions with exact names, categories, and links to source JSON. These are product-specific security/governance configuration artifacts (policy definitions) that an LLM won't reliably know from training and are used to configure access/compliance behavior. |
| [CIS Security Benchmarks for Linux Workloads - Overview](https://learn.microsoft.com/en-us/azure/governance/policy/samples/guest-configuration-baseline-cis-linux) | security | 0.70 | Reference for built-in CIS security benchmarks for Linux workloads via Machine Configuration, including detailed rules and configuration parameters, which are security baseline specifics. |
| [Canada Federal PBMM](https://learn.microsoft.com/en-us/azure/governance/policy/samples/canada-federal-pbmm) | security | 0.70 | Provides control-to-policy mappings for Canada Federal PBMM; this is concrete, product-specific security/compliance configuration guidance not derivable from generic knowledge. |
| [Deploy a baseline policy assignment](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/how-to/assign-security-baselines/deploy-a-baseline-policy-assignment) | security | 0.70 | Describes specific policy definitions for Windows and Linux baselines and how to assign them; security-focused configuration guidance. |
@@ -140,32 +178,24 @@ confusable_not_for: Not for Azure Blueprints (use azure-blueprints), Azure Role-
| [Discover and assign built-in policies](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/how-to/assign-built-in-policies) | configuration | 0.70 | How-to for discovering and assigning built-in policies; includes policy names and parameters, which are product-specific configuration details. |
| [Docker host security baseline](https://learn.microsoft.com/en-us/azure/governance/policy/samples/guest-configuration-baseline-docker) | security | 0.70 | Details configuration settings for Docker hosts as part of Azure Security Benchmark via guest configuration, including concrete security settings and checks. |
| [Exemption structure](https://learn.microsoft.com/en-us/azure/governance/policy/concepts/exemption-structure) | security | 0.70 | The page describes the precise JSON structure and fields of Azure Policy exemption definitions (such as specific properties, allowed values, and how they interact with initiatives/definitions and Resource Manager data-plane modes). These are product-specific configuration details for access/governance behavior, which align best with the security sub-skill. It goes beyond conceptual overview by specifying how to configure exemptions in practice. |
-| [FedRAMP High](https://learn.microsoft.com/en-us/azure/governance/policy/samples/fedramp-high) | security | 0.70 | Shows how FedRAMP High controls are implemented via Azure Policy initiatives; detailed compliance configuration guidance. |
-| [FedRAMP High](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-fedramp-high) | security | 0.70 | Details how FedRAMP High controls map to Azure Policy definitions/initiatives in Azure Government, giving concrete compliance-to-policy mappings unique to this product and environment. |
-| [FedRAMP Moderate](https://learn.microsoft.com/en-us/azure/governance/policy/samples/fedramp-moderate) | security | 0.70 | Provides mappings from FedRAMP Moderate controls to Azure Policy definitions; product-specific security/compliance implementation. |
-| [FedRAMP Moderate](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-fedramp-moderate) | security | 0.70 | Similar to index 2 but for FedRAMP Moderate; includes specific mappings between FedRAMP Moderate controls and Azure Policy initiatives, which is expert, product-specific security content. |
| [HIPAA HITRUST 9.2](https://learn.microsoft.com/en-us/azure/governance/policy/samples/hipaa-hitrust) | security | 0.70 | Details HIPAA HITRUST control mappings to Azure Policy initiatives; concrete Azure security/compliance configuration. |
| [IRS 1075 September 2016](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-irs-1075-sept2016) | security | 0.70 | Maps IRS 1075 September 2016 controls to Azure Policy definitions for Azure Government, providing concrete, standard-specific security configuration mappings. |
| [IRS 1075 September 2016](https://learn.microsoft.com/en-us/azure/governance/policy/samples/irs-1075-sept2016) | security | 0.70 | Maps IRS 1075 controls to Azure Policy definitions; specialized compliance configuration knowledge. |
| [ISO 27001:2013](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-iso-27001) | security | 0.70 | Provides detailed mapping between ISO 27001:2013 controls and Azure Policy initiatives in Azure Government, which is specific security/compliance configuration knowledge. |
| [ISO 27001:2013](https://learn.microsoft.com/en-us/azure/governance/policy/samples/iso-27001) | security | 0.70 | Control-to-policy mapping for ISO 27001:2013; Azure-specific security/compliance implementation details. |
| [Linux security baseline](https://learn.microsoft.com/en-us/azure/governance/policy/samples/guest-configuration-baseline-linux) | security | 0.70 | Describes specific configuration settings and remediation checks for Linux guests under Azure Security Benchmark, which are detailed security baseline configurations. |
-| [Microsoft Cloud for Sovereignty Confidential](https://learn.microsoft.com/en-us/azure/governance/policy/samples/mcfs-baseline-confidential) | security | 0.70 | Provides mappings for Microsoft Cloud for Sovereignty Baseline Confidential controls to Azure Policy; niche, product-specific compliance configuration. |
-| [Microsoft Cloud for Sovereignty Global](https://learn.microsoft.com/en-us/azure/governance/policy/samples/mcfs-baseline-global) | security | 0.70 | Similar to index 23 but for Global Baseline; detailed mapping of controls to Azure Policy artifacts. |
-| [Microsoft cloud security benchmark](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-azure-security-benchmark) | security | 0.70 | Details how Microsoft cloud security benchmark controls map to Azure Policy definitions for Azure Government, including initiative-based implementations, which is product-specific security guidance. |
+| [Microsoft cloud security benchmark](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-azure-security-benchmark) | security | 0.70 | Details how Microsoft cloud security benchmark controls map to Azure Policy definitions/initiatives in Azure Government. The specific control-to-policy mappings and identifiers are expert, product-specific security/compliance configuration information. |
| [Migrate from Automanage](https://learn.microsoft.com/en-us/azure/governance/policy/how-to/migrate-from-automanage-best-practices) | decision-making | 0.70 | Provides concrete migration planning from a retiring service with specific dates and service-impact details; this is product-specific decision and migration guidance between technologies. |
| [Migrating from Azure Automation DSC](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/whats-new/migrating-from-azure-automation) | decision-making | 0.70 | Migration planning guidance between DSC v2 and v3; contains process and technical guidance for choosing and executing migration paths. |
| [Migrating from Azure DSC Extension](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/whats-new/migrating-from-dsc-extension) | decision-making | 0.70 | Guidance on developing a migration strategy from DSC extension; focused on when and how to move to the new service. |
-| [NIST SP 800-171 R2](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-nist-sp-800-171-r2) | security | 0.70 | Maps NIST SP 800-171 R2 controls to Azure Policy initiative definitions in Azure Government, which is specific security/compliance implementation guidance. |
-| [NIST SP 800-171 R2](https://learn.microsoft.com/en-us/azure/governance/policy/samples/nist-sp-800-171-r2) | security | 0.70 | Maps NIST 800-171 R2 controls to Azure Policy initiatives; concrete Azure compliance implementation guidance. |
-| [NIST SP 800-53 Rev. 4](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-nist-sp-800-53-r4) | security | 0.70 | Contains mappings from NIST SP 800-53 Rev. 4 controls to Azure Policy initiatives in Azure Government, a concrete, product-specific security/compliance configuration reference. |
-| [NIST SP 800-53 Rev. 5](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-nist-sp-800-53-r5) | security | 0.70 | Same pattern as index 7 but for Rev. 5; provides detailed control-to-policy mappings unique to Azure Government and this standard. |
-| [NL BIO Cloud Theme](https://learn.microsoft.com/en-us/azure/governance/policy/samples/nl-bio-cloud-theme) | security | 0.70 | Provides mappings from NL BIO Cloud Theme controls to Azure Policy; specialized regional compliance configuration. |
+| [NIST SP 800-171 R2](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-nist-sp-800-171-r2) | security | 0.70 | Contains a detailed mapping of NIST SP 800-171 R2 controls to Azure Policy initiatives and definitions for Azure Government. These mappings are concrete, product-specific security/compliance implementation details, not general conceptual content. |
+| [NIST SP 800-53 Rev. 4](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-nist-sp-800-53-r4) | security | 0.70 | Provides detailed mappings from NIST SP 800-53 Rev. 4 controls to Azure Policy initiatives and definitions. These mappings are concrete, product-specific compliance implementation details that an LLM would not reliably know without the page, fitting the security category. |
+| [NIST SP 800-53 Rev. 5](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-nist-sp-800-53-r5) | security | 0.70 | Similar to the R4 page, this one maps NIST SP 800-53 Rev. 5 controls to specific Azure Policy initiatives/definitions. The exact mappings and identifiers are expert, product-specific security/compliance configuration data. |
| [Overview](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/how-to/assign-configuration/overview) | deployment | 0.70 | Covers deployment of configurations via templates and Azure Policy; product-specific deployment patterns across multiple machines. |
| [Prerequisites and Environment Setup](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/overview/02-setup-prerequisites) | configuration | 0.70 | Prerequisites page typically lists required extensions, identities, and versions; product-specific configuration requirements beyond generic knowledge. |
| [RBI ITF Banks v2016](https://learn.microsoft.com/en-us/azure/governance/policy/samples/rbi-itf-banks-2016) | security | 0.70 | Maps Reserve Bank of India IT Framework for Banks controls to Azure Policy; region-specific compliance implementation. |
| [RBI ITF NBFC v2017](https://learn.microsoft.com/en-us/azure/governance/policy/samples/rbi-itf-nbfc-2017) | security | 0.70 | Similar to 31 but for NBFC; detailed mapping of RBI controls to Azure Policy definitions. |
-| [RMIT Malaysia](https://learn.microsoft.com/en-us/azure/governance/policy/samples/rmit-malaysia) | security | 0.70 | Provides RMIT Malaysia control mappings to Azure Policy initiatives; specialized security/compliance configuration. |
+| [SOC 2 Type 2](https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-soc-2) | security | 0.70 | Provides mappings from SOC 2 controls to Azure Policy initiatives and definitions. The specific control mappings and policy identifiers are expert, product-specific security/compliance configuration information. |
| [Spain ENS](https://learn.microsoft.com/en-us/azure/governance/policy/samples/spain-ens) | security | 0.70 | Maps Spain ENS controls to Azure Policy; Azure-specific implementation of national compliance standard. |
| [UK OFFICIAL and UK NHS](https://learn.microsoft.com/en-us/azure/governance/policy/samples/ukofficial-uknhs) | security | 0.70 | Maps UK OFFICIAL and UK NHS controls to Azure Policy; region-specific security/compliance configuration. |
| [View compliance reporting](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/how-to/view-compliance) | configuration | 0.70 | Explains how compliance data appears across Policy, Guest Assignments, and ARG; product-specific reporting surfaces and query patterns. |
@@ -230,7 +260,7 @@ confusable_not_for: Not for Azure Blueprints (use azure-blueprints), Azure Role-
| [Create and manage Azure Policy](https://learn.microsoft.com/en-us/azure/governance/policy/tutorials/create-and-manage) | 0.30 | Tutorial on building policies; focuses on general usage patterns rather than detailed limits, configs, or troubleshooting. |
| [Deny](https://learn.microsoft.com/en-us/azure/governance/policy/concepts/effect-deny) | 0.30 | Overview of deny effect; no expert-only numeric constraints or configuration matrices. |
| [Disallow resource types](https://learn.microsoft.com/en-us/azure/governance/policy/tutorials/disallowed-resources) | 0.30 | Tutorial applying built-in 'Not allowed resource types' policy; focuses on how to assign and manage, not on limits, decision matrices, or detailed configuration options. |
-| [Linux agent release notes](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/whats-new/agent/linux) | 0.30 | Linux agent release notes likely list changes and issues, but the summary does not expose concrete limits, configuration tables, error-code mappings, or other structured expert details matching the defined sub-skill categories. |
+| [Linux agent release notes](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/whats-new/agent/linux) | 0.30 | Linux agent release notes similarly focus on version history and issues, not on structured limits, configuration parameters, troubleshooting flows, or decision criteria. They do not match any of the defined expert-knowledge sub-skill patterns. |
| [Manage tag governance](https://learn.microsoft.com/en-us/azure/governance/policy/tutorials/govern-tags) | 0.30 | Tutorial on using Azure Policy modify effect for tag governance; mainly procedural guidance without deep configuration matrices, limits, or error-code-based troubleshooting. |
| [Parameters](https://learn.microsoft.com/en-us/azure/governance/policy/concepts/definition-structure-parameters) | 0.30 | Conceptual description of policy parameters and their purpose; lacks detailed parameter catalogs, default values, or numeric constraints that would constitute expert configuration knowledge. |
| [Route policy state change events](https://learn.microsoft.com/en-us/azure/governance/policy/tutorials/route-state-change-events) | 0.30 | Tutorial wiring Azure Policy state change events to Event Grid via CLI; shows commands but not detailed parameter tables, limits, or diagnostic mappings. |
@@ -238,7 +268,7 @@ confusable_not_for: Not for Azure Blueprints (use azure-blueprints), Azure Role-
| [Security baselines overview](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/how-to/assign-security-baselines/overview-page) | 0.30 | High-level overview of security baselines; summary suggests conceptual description without detailed settings or parameters. |
| [System Policy](https://learn.microsoft.com/en-us/azure/governance/policy/concepts/systempolicy) | 0.30 | High-level guide to system policy capability; no detailed settings, limits, or troubleshooting mappings. |
| [Virtual machine recommended policies](https://learn.microsoft.com/en-us/azure/governance/policy/concepts/recommended-policies) | 0.30 | Describes recommended policies UI for VMs; no detailed technical configuration or numeric criteria in summary. |
-| [Windows agent release notes](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/whats-new/agent/windows) | 0.30 | Windows agent release notes likely contain version changes and issues, but the provided summary does not show specific configuration parameters, limits, error codes, or structured troubleshooting/decision content required by any sub-skill type. |
+| [Windows agent release notes](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/whats-new/agent/windows) | 0.30 | Windows agent release notes typically list version changes, fixes, and known issues but are not organized as limits, configuration references, troubleshooting guides, or decision matrices as defined. They lack structured numeric limits, config tables, or error-code-to-solution mappings required for the sub-skill types. |
| [Compliance states](https://learn.microsoft.com/en-us/azure/governance/policy/concepts/compliance-states) | 0.25 | Conceptual article on compliance states; summary shows no detailed configuration or numeric guidance. |
| [Overview](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/whats-new/agent/) | 0.20 | High-level overview page that just links to platform-specific release notes; no concrete limits, configuration parameters, error codes, or decision matrices are evident in the summary. |
| [Policy rule](https://learn.microsoft.com/en-us/azure/governance/policy/concepts/definition-structure-policy-rule) | 0.20 | Primarily explains Azure Policy rule structure (if/then, logical operators, effects). No specific limits, configuration tables, error codes, or product-specific numeric thresholds; it's a conceptual/structural reference rather than expert-only operational details. |
@@ -246,7 +276,5 @@ confusable_not_for: Not for Azure Blueprints (use azure-blueprints), Azure Role-
| [What is Azure Machine Configuration?](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/overview/01-overview-concepts) | 0.20 | Conceptual overview of Azure Machine Configuration; no detailed limits, configs, or error mappings. |
| [What is Azure Policy?](https://learn.microsoft.com/en-us/azure/governance/policy/overview) | 0.20 | High-level overview of Azure Policy; mostly conceptual service description without detailed limits or configs. |
| [Azure Policy glossary](https://learn.microsoft.com/en-us/azure/governance/policy/policy-glossary) | 0.10 | Glossary of terms; definitions but no configuration, limits, or troubleshooting mappings. |
-| [Built-in initiatives](https://learn.microsoft.com/en-us/azure/governance/policy/samples/built-in-initiatives) | 0.10 | Index of built-in policy initiatives; serves as navigation without embedded expert configuration or decision guidance. |
-| [Built-in policies](https://learn.microsoft.com/en-us/azure/governance/policy/samples/built-in-policies) | 0.10 | Index of built-in policy definitions linking out to portal/GitHub; page itself is a catalog, not detailed configuration or troubleshooting content. |
| [Index](https://learn.microsoft.com/en-us/azure/governance/policy/samples/) | 0.10 | Index/navigation page listing Azure Policy built-in definitions and initiatives; no substantive technical content itself. |
| [What's new in docs](https://learn.microsoft.com/en-us/azure/governance/machine-configuration/whats-new/docs) | 0.10 | Documentation change log; meta-information about docs, not product behavior or configuration. |
diff --git a/products/azure-private-link/azure-private-link.csv b/products/azure-private-link/azure-private-link.csv
index 5294ff54..b74359cf 100644
--- a/products/azure-private-link/azure-private-link.csv
+++ b/products/azure-private-link/azure-private-link.csv
@@ -26,11 +26,11 @@ https://learn.microsoft.com/en-us/azure/private-link/increase-private-endpoint-v
https://learn.microsoft.com/en-us/azure/private-link/manage-private-endpoint,Manage private endpoints,Manage Azure private endpoints - Azure Private Link,Configure and manage Azure Private Endpoint properties,Learn how to manage private endpoints in Azure.,Azure private endpoints have several options for managing their configuration and deployment. You can determineGroupIdandMemberNamevalues by querying the Azure Private Link resource. You need theGroupIdandMemberNamevalues to configure a static IP address for a private endpoint during creation. A private endpoint has two custom properties: static IP address and network interface name. These properties must be set when the private endpoint is created. With a service provider and consumer deploymen,2026-03-30T08:00:00.000Z,how-to,configuration,0.7,True,"Describes managing Azure private endpoints with product-specific properties such as GroupId, MemberName, static IP address, and network interface name, including how to obtain and set them. These are concrete configuration parameters unique to Azure Private Link.",unchanged
https://learn.microsoft.com/en-us/azure/private-link/monitor-private-link,Monitor Private Link,Monitor Azure Private Link,,"Learn how to monitor Azure Private Link using Azure Monitor, including data collection, analysis, and alerting.","Azure Monitor collects and aggregates metrics and logs from your system to monitor availability, performance, and resilience, and notify you of issues affecting your system. You can use the Azure portal, PowerShell, Azure CLI, REST API, or client libraries to set up and view monitoring data. Different metrics and logs are available for different resource types. This article describes the types of monitoring data you can collect for this service and ways to analyze that data.",2026-03-30T08:00:00.000Z,concept-article,,0.3,False,"Monitoring overview for Azure Private Link; describes available metrics/logs and tools but summary does not indicate specific numeric limits, configuration parameter tables, or error-code-based troubleshooting. Likely general guidance LLM already knows.",unchanged
https://learn.microsoft.com/en-us/azure/private-link/monitor-private-link-reference,Monitoring data reference,Monitoring data reference for Azure Private Link,Reference for Azure Private Link monitoring data,This article contains important reference material you need when you monitor Azure Private Link by using Azure Monitor.,This article contains all the monitoring reference information for this service. SeeMonitor Azure Private Linkfor details on the data you can collect for Private Link and how to use it.,2025-03-25T08:00:00.000Z,reference,configuration,0.7,True,"A monitoring data reference typically lists specific metric and log names, dimensions, and categories for Azure Private Link; these are detailed configuration/reference values not generally known from training.",unchanged
-https://learn.microsoft.com/en-us/azure/private-link/network-security-perimeter-concepts,What is a network security perimeter?,What is a network security perimeter? - Azure Private Link,,"Learn how Azure Network Security Perimeter secures PaaS resources with logical network boundaries. Control public access, prevent data exfiltration, and manage access rules for Storage, Azure AI Searc","Azure Network Security Perimeter creates logical network boundaries around your platform-as-a-service (PaaS) resources that are deployed outside your virtual networks. Network security perimeter helps you control public network access to resources like Azure Storage accounts and Azure Key Vault by establishing a secure perimeter. By default, network security perimeter restricts public access to PaaS resources within the boundary. You can grant exceptions through explicit access rules for inbound",2025-08-01T08:00:00.000Z,overview,,0.3,False,"Explains what Network Security Perimeter is and its benefits; no specific RBAC lists, config parameters, or numeric thresholds.",unchanged
+https://learn.microsoft.com/en-us/azure/private-link/network-security-perimeter-concepts,What is a network security perimeter?,What is a network security perimeter? - Azure Private Link,,"Learn how Azure Network Security Perimeter secures PaaS resources with logical network boundaries. Control public access, prevent data exfiltration, and manage access rules for Storage, Azure AI Searc","Azure Network Security Perimeter creates logical network boundaries around your platform-as-a-service (PaaS) resources that are deployed outside your virtual networks. Network security perimeter helps you control public network access to resources like Azure Storage accounts and Azure Key Vault by establishing a secure perimeter. By default, network security perimeter restricts public access to PaaS resources within the boundary. You can grant exceptions through explicit access rules for inbound",2026-04-20T11:11:00.000Z,overview,,0.2,False,"Conceptual overview of Azure Network Security Perimeter; no specific limits, configuration parameter tables, error codes, or decision matrices with quantified trade-offs are evident from the summary.",updated
https://learn.microsoft.com/en-us/azure/private-link/network-security-perimeter-diagnostic-logs,Diagnostic logs,Diagnostic logs for Network Security Perimeter - Azure Private Link,Enable and store Network Security Perimeter diagnostic logs,Learn the options for storing diagnostic logs for Network Security Perimeter and how to enable logging through the Azure portal.,"In this article, you learn about the diagnostic logs for Network Security Perimeter and how to enable logging. You learn access logs categories used. Then, you discover the options for storing diagnostic logs and how to enable logging through the Azure portal. Important Network security perimeter is now generally available in all Azure public cloud regions. For information on supported services, seeOnboarded private link resourcesfor supported PaaS services.""",2025-08-01T08:00:00.000Z,concept-article,configuration,0.65,True,"Describes diagnostic log categories for Network Security Perimeter and options for storing logs, with portal-based configuration steps; log category names and enablement options are product-specific configuration details.",unchanged
https://learn.microsoft.com/en-us/azure/private-link/network-security-perimeter-role-based-access-control-requirements,Role-based access control permissions,Azure role-based access control permissions required for Azure Network Security Perimeter usage - Azure Private Link,Configure RBAC permissions for Azure Network Security Perimeter operations,Learn about the Azure role-based access control permissions required to use Azure Network Security Perimeter.,"In this article, you learn about the Azure role-based access control (RBAC) permissions required to usenetwork security perimetercapabilities. You learn about the actions required for network security perimeter, profile, network security perimeter access rule, diagnostic settings, association, and appendix capabilities.",2025-08-01T08:00:00.000Z,concept-article,security,0.85,True,"Describes specific RBAC actions and permissions required for NSP profiles, rules, associations, and diagnostics, which is detailed security configuration guidance.",unchanged
https://learn.microsoft.com/en-us/azure/private-link/network-security-perimeter-transition,Transition to a network security perimeter,Transition to a Network Security Perimeter in Azure - Azure Private Link,Plan and transition Azure resources to Network Security Perimeter,"Learn how to transition to a network security perimeter in Azure, explore access modes, and secure your resources.","In this article, you learn about the different access modes and how to transition to anetwork security perimeterin Azure. Access modes control resource access and logging behavior, helping you secure your Azure resources. [!INCLUDE network-security-perimeter-preview-message]",2025-08-15T08:00:00.000Z,overview,decision-making,0.65,True,"Covers access modes and how to transition, implying scenario-based guidance and trade-offs for choosing modes and migration steps.",unchanged
-https://learn.microsoft.com/en-us/azure/private-link/private-endpoint-dns,Private DNS zone values,Azure Private Endpoint private DNS zone values,Configure private DNS zone records for Azure Private Endpoints,Learn about the private DNS zone values for Azure services that support private endpoints.,It's important to correctly configure your DNS settings to resolve the private endpoint IP address to the fully qualified domain name (FQDN) of the connection string. Existing Microsoft Azure services might already have a DNS configuration for a public endpoint. This configuration must be overridden to connect using your private endpoint. The network interface associated with the private endpoint contains the information to configure your DNS. The network interface information includes FQDN and ,2026-04-17T08:00:00.000Z,concept-article,configuration,0.86,True,"The page lists precise private DNS zone names and record patterns required for each Azure service that supports private endpoints (for example, specific FQDN suffixes and zone names like privatelink..windows.net). These are product-specific configuration values that an LLM is unlikely to know reliably from training, and they are organized as concrete DNS settings rather than conceptual guidance.",updated
+https://learn.microsoft.com/en-us/azure/private-link/private-endpoint-dns,Private DNS zone values,Azure Private Endpoint private DNS zone values,Configure private DNS zone records for Azure Private Endpoints,Learn about the private DNS zone values for Azure services that support private endpoints.,It's important to correctly configure your DNS settings to resolve the private endpoint IP address to the fully qualified domain name (FQDN) of the connection string. Existing Microsoft Azure services might already have a DNS configuration for a public endpoint. This configuration must be overridden to connect using your private endpoint. The network interface associated with the private endpoint contains the information to configure your DNS. The network interface information includes FQDN and ,2026-04-17T08:00:00.000Z,concept-article,configuration,0.86,True,"The page lists precise private DNS zone names and record patterns required for each Azure service that supports private endpoints (for example, specific FQDN suffixes and zone names like privatelink..windows.net). These are product-specific configuration values that an LLM is unlikely to know reliably from training, and they are organized as concrete DNS settings rather than conceptual guidance.",unchanged
https://learn.microsoft.com/en-us/azure/private-link/private-endpoint-dns-integration,Private endpoint DNS integration,Azure Private Endpoint DNS Integration Scenarios,Apply DNS integration best practices for Azure Private Endpoints,Learn how to configure Azure Private Endpoint DNS for secure and efficient private IP resolution. Discover key scenarios and best practices.,"Azure Private Endpoint DNS integration is essential for enabling secure, private connectivity to Azure services within your virtual network. This article describes common DNS configuration scenarios for Azure Private Endpoints, including options for virtual networks, peered networks, and on-premises environments. Use these scenarios and best practices to ensure reliable and secure name resolution for your applications and services. For private DNS zone settings for Azure services that support a ",2025-09-30T17:14:00.000Z,concept-article,best-practices,0.7,True,"Explicitly mentions scenarios and best practices for DNS integration across VNets and on-premises, which are product-specific configuration recommendations.",unchanged
https://learn.microsoft.com/en-us/azure/private-link/private-endpoint-export-dns,Export private endpoint DNS records,Export DNS records for a private endpoint - Azure portal - Azure Private Link,,"In this tutorial, learn how to export DNS records for a private endpoint in the Azure portal.","A private endpoint in Azure requires DNS records for name resolution of the endpoint. The DNS record resolves the private IP address of the endpoint for the configured resource. To export the DNS records of the endpoint, use the Private Link section in the Network foundation page in the portal.",2026-03-30T22:11:00.000Z,how-to,,0.3,False,Tutorial on exporting DNS records via portal; operational steps but no detailed configuration parameter tables or numeric constraints.,unchanged
https://learn.microsoft.com/en-us/azure/private-link/private-endpoint-overview,What is a private endpoint?,What is a private endpoint? - Azure Private Link,,"In this article, you learn how to use the Private Endpoint feature of Azure Private Link.","A private endpoint is a network interface that uses a private IP address from your virtual network. This network interface connects you privately and securely to a service that's powered by Azure Private Link. By enabling a private endpoint, you're bringing the service into your virtual network. The service could be an Azure service such as:",2026-03-30T08:00:00.000Z,concept-article,,0.1,False,"Conceptual overview of private endpoints; no detailed quotas, config parameter tables, or troubleshooting content.",unchanged
diff --git a/products/azure-private-link/report.md b/products/azure-private-link/report.md
index 6e1e2dcd..005d6579 100644
--- a/products/azure-private-link/report.md
+++ b/products/azure-private-link/report.md
@@ -1,5 +1,5 @@
---
-generated_at: '2026-04-19'
+generated_at: '2026-04-26'
category_descriptions:
  limits-quotas: Info on Private Link service availability per resource type and how to raise per‑VNet Private Endpoint limits using High Scale configuration
@@ -64,8 +64,8 @@ confusable_not_for: Not for Azure Virtual Network (use azure-virtual-network), A

### Updated Pages

-- [Private DNS zone values](https://learn.microsoft.com/en-us/azure/private-link/private-endpoint-dns)
-  - Updated: 2025-08-04T08:00:00.000Z → 2026-04-17T08:00:00.000Z
+- [What is a network security perimeter?](https://learn.microsoft.com/en-us/azure/private-link/network-security-perimeter-concepts)
+  - Updated: 2025-08-01T08:00:00.000Z → 2026-04-20T11:11:00.000Z

## Classified Pages

@@ -110,7 +110,6 @@ confusable_not_for: Not for Azure Virtual Network (use azure-virtual-network), A
| [Export private endpoint DNS records](https://learn.microsoft.com/en-us/azure/private-link/private-endpoint-export-dns) | 0.30 | Tutorial on exporting DNS records via portal; operational steps but no detailed configuration parameter tables or numeric constraints. |
| [FAQ](https://learn.microsoft.com/en-us/azure/private-link/private-link-faq) | 0.30 | FAQ page likely mixes conceptual Q&A; summary does not indicate specific numeric limits, config tables, or error-code-based troubleshooting. |
| [Monitor Private Link](https://learn.microsoft.com/en-us/azure/private-link/monitor-private-link) | 0.30 | Monitoring overview for Azure Private Link; describes available metrics/logs and tools but summary does not indicate specific numeric limits, configuration parameter tables, or error-code-based troubleshooting. Likely general guidance LLM already knows. |
-| [What is a network security perimeter?](https://learn.microsoft.com/en-us/azure/private-link/network-security-perimeter-concepts) | 0.30 | Explains what Network Security Perimeter is and its benefits; no specific RBAC lists, config parameters, or numeric thresholds. |
| [Connect to a SQL server - Azure CLI](https://learn.microsoft.com/en-us/azure/private-link/tutorial-private-endpoint-sql-cli) | 0.25 | Tutorial for connecting to Azure SQL via Private Endpoint using CLI; scenario walkthrough rather than reference limits, configuration matrices, or troubleshooting. |
| [Create a network security perimeter - Azure CLI](https://learn.microsoft.com/en-us/azure/private-link/create-network-security-perimeter-cli) | 0.25 | CLI quickstart for network security perimeter; tutorial-style content without expert-level configuration matrices or limits.
| | [Create a network security perimeter - Azure portal](https://learn.microsoft.com/en-us/azure/private-link/create-network-security-perimeter-portal) | 0.25 | Portal quickstart for creating a network security perimeter; procedural steps without detailed RBAC role lists, config parameter tables, or numeric limits. | @@ -120,6 +119,7 @@ confusable_not_for: Not for Azure Virtual Network (use azure-virtual-network), A | [Create a Private Link service - Azure portal](https://learn.microsoft.com/en-us/azure/private-link/create-private-link-service-portal) | 0.20 | Quickstart using Azure portal; step-by-step creation of a Private Link service without detailed configuration option tables or expert-only constraints. | | [Create a Private Link service - Bicep](https://learn.microsoft.com/en-us/azure/private-link/create-private-link-service-bicep) | 0.20 | Bicep quickstart; focuses on example deployment, not exhaustive configuration parameters, limits, or troubleshooting mappings. | | [Create a Private Link service - PowerShell](https://learn.microsoft.com/en-us/azure/private-link/create-private-link-service-powershell) | 0.20 | PowerShell quickstart for creating a Private Link service; primarily procedural tutorial, not a configuration reference or troubleshooting guide. | +| [What is a network security perimeter?](https://learn.microsoft.com/en-us/azure/private-link/network-security-perimeter-concepts) | 0.20 | Conceptual overview of Azure Network Security Perimeter; no specific limits, configuration parameter tables, error codes, or decision matrices with quantified trade-offs are evident from the summary. | | [Private Link service](https://learn.microsoft.com/en-us/azure/private-link/private-link-service-overview) | 0.10 | Service overview for Azure Private Link service; primarily conceptual without expert-level numeric limits or config matrices. 
| | [What is Azure Private Link?](https://learn.microsoft.com/en-us/azure/private-link/private-link-overview) | 0.10 | High-level overview of Azure Private Link features and architecture without detailed limits, configuration tables, or error mappings. | | [What is a private endpoint?](https://learn.microsoft.com/en-us/azure/private-link/private-endpoint-overview) | 0.10 | Conceptual overview of private endpoints; no detailed quotas, config parameter tables, or troubleshooting content. | diff --git a/products/azure-quantum/azure-quantum.csv b/products/azure-quantum/azure-quantum.csv index d0269fb6..31741c26 100644 --- a/products/azure-quantum/azure-quantum.csv +++ b/products/azure-quantum/azure-quantum.csv @@ -1,8 +1,8 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type -https://learn.microsoft.com/en-us/azure/quantum/azure-quantum-common-issues,Troubleshooting Azure Quantum,Troubleshoot Issues with Azure Quantum - Azure Quantum,Diagnose and fix common Azure Quantum issues,This article provides troubleshooting steps for issues that users might experience when they use the Azure Quantum service.,"When you work with the Azure Quantum service, you might experience connection or job-related issues. 
This article explains how to troubleshoot these issues.",2026-04-13T22:05:00.000Z,troubleshooting-known-issue,troubleshooting,0.8,True,"Explicitly a troubleshooting article for Azure Quantum; such pages typically map specific connection/job symptoms and error messages to causes and resolutions, which are product-specific diagnostic details.",updated -https://learn.microsoft.com/en-us/azure/quantum/azure-quantum-job-cost-billing,Billing and job costs in Azure Quantum,FAQ: Understanding Job Costs and Billing - Azure Quantum,,Learn about how to view job cost reports for running quantum programs in Azure Quantum and how to manage your invoices.,This article explains the guidelines to understand the cost of running quantum programs in Azure Quantum and how to manage your invoices.,2026-04-13T22:05:00.000Z,faq,,0.3,False,"Described as guidelines to understand costs and manage invoices; this is typically conceptual billing guidance without numeric service limits, configuration tables, or decision matrices with thresholds.",updated -https://learn.microsoft.com/en-us/azure/quantum/azure-quantum-quotas,Azure Quantum quotas,FAQ: Limits & Quotas - Azure Quantum,Review and manage Azure Quantum usage quotas,"This document answers common questions about Azure Quantum quotas, including how to review remaining quotas and how to apply to get more.","In this article, you find answers to questions about limits and quotas for usage of Azure Quantum providers.",2026-04-13T22:05:00.000Z,faq,limits-quotas,0.9,True,"A quotas FAQ for a specific service is very likely to list concrete numeric limits (for example, job counts, shot limits, provider-specific caps) and how to request increases, which are product-specific values not inferable from general training data.",updated -https://learn.microsoft.com/en-us/azure/quantum/backend-simulators,Backend simulators,Simulators from Quantum Providers - Azure Quantum,,"Learn how to run your Q# programs on the backend simulators available from quantum 
providers, such as IonQ, PASQAL, Quantinuum, and Rigetti.","This article describes the backend simulators available from quantum providers. These simulators are available to all Azure Quantum users, and are a great way to test your Q# programs before running them on a real quantum computer.",2025-10-15T23:23:00.000Z,concept-article,,0.2,False,Overview of backend simulators from providers; appears descriptive without detailed configuration tables or limits.,unchanged +https://learn.microsoft.com/en-us/azure/quantum/azure-quantum-common-issues,Troubleshooting Azure Quantum,Troubleshoot Issues with Azure Quantum - Azure Quantum,Diagnose and fix common Azure Quantum issues,This article provides troubleshooting steps for issues that users might experience when they use the Azure Quantum service.,"When you work with the Azure Quantum service, you might experience connection or job-related issues. This article explains how to troubleshoot these issues.",2026-04-13T22:05:00.000Z,troubleshooting-known-issue,troubleshooting,0.8,True,"Explicitly a troubleshooting article for Azure Quantum; such pages typically map specific connection/job symptoms and error messages to causes and resolutions, which are product-specific diagnostic details.",unchanged +https://learn.microsoft.com/en-us/azure/quantum/azure-quantum-job-cost-billing,Billing and job costs in Azure Quantum,FAQ: Understanding Job Costs and Billing - Azure Quantum,,Learn about how to view job cost reports for running quantum programs in Azure Quantum and how to manage your invoices.,This article explains the guidelines to understand the cost of running quantum programs in Azure Quantum and how to manage your invoices.,2026-04-13T22:05:00.000Z,faq,,0.3,False,"Described as guidelines to understand costs and manage invoices; this is typically conceptual billing guidance without numeric service limits, configuration tables, or decision matrices with thresholds.",unchanged 
+https://learn.microsoft.com/en-us/azure/quantum/azure-quantum-quotas,Azure Quantum quotas,FAQ: Limits & Quotas - Azure Quantum,Review and manage Azure Quantum usage quotas,"This document answers common questions about Azure Quantum quotas, including how to review remaining quotas and how to apply to get more.","In this article, you find answers to questions about limits and quotas for usage of Azure Quantum providers.",2026-04-13T22:05:00.000Z,faq,limits-quotas,0.9,True,"A quotas FAQ for a specific service is very likely to list concrete numeric limits (for example, job counts, shot limits, provider-specific caps) and how to request increases, which are product-specific values not inferable from general training data.",unchanged +https://learn.microsoft.com/en-us/azure/quantum/backend-simulators,Backend simulators,Simulators from Quantum Providers - Azure Quantum,Configure and use Azure Quantum backend simulators,"Learn how to run your Q# programs on the backend simulators available from quantum providers, such as IonQ, Pasqal, Quantinuum, and Rigetti.","This article describes the backend simulators available from quantum providers. These simulators are available to all Azure Quantum users, and are a great way to test your Q# programs before running them on a real quantum computer.",2026-04-23T15:34:00.000Z,concept-article,configuration,0.65,True,"A backend simulators page for specific quantum providers typically lists provider-specific simulator targets, configuration options, and possibly parameter names/values (such as target IDs, shot counts, or precision settings) that are unique to Azure Quantum and not broadly known. 
This is concrete, product-specific configuration knowledge rather than just conceptual overview.",updated https://learn.microsoft.com/en-us/azure/quantum/bulk-add-users-to-a-workspace,Add a group to your Azure Quantum workspace,Bulk Add Users to Azure Quantum Workspace - Azure Quantum,Bulk assign Azure Quantum workspace access via CSV,Learn how to bulk add users to your Azure Quantum workspace using a CSV file. This guide simplifies user management for large teams.,"Learn how to grant a group of users access to your Azure Quantum workspace. For example, you may need to grant your team members or students access to your workspace. We recommend using the instructions in this article if you need to grant access to more than 10 users. For a smaller number of users, see Share access to your Azure Quantum workspace. In this article you'll:",2026-03-13T18:58:00.000Z,how-to,security,0.7,True,"Page describes concrete, product-specific steps to grant access to many users at once for an Azure Quantum workspace, which is an IAM/security operation. It likely includes specific role names or access patterns (e.g., workspace roles or Azure RBAC assignments) and CSV schema details that are not generic knowledge. This fits the security sub-skill as it focuses on configuring access control for the service.",unchanged
The Microsoft Quantum Development Kit (QDK) creates interactive circuit diagrams for Q# and OpenQASM programs that show how classical control ",2026-04-01T16:52:00.000Z,overview,,0.1,False,"High-level overview of QDK circuit diagrams and interaction in VS Code; no numeric limits, configuration tables, error codes, or product-specific decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/quantum/concepts-circuits,Quantum circuits conventions,Quantum Circuit Diagram Conventions - Azure Quantum,,Learn how to read a quantum circuit diagram and how to represent quantum operations and measurements in a circuit diagram.,"Sometimes quantum algorithms are easier to understand in a circuit diagram than in the equivalent written matrix representation. This article explains how to read quantum circuit diagrams and their conventions. For more information, see How to visualize quantum circuits diagrams.",2025-01-16T08:00:00.000Z,concept-article,,0.2,False,"Explains how to read quantum circuit diagrams and conventions; no product-specific configuration parameters, limits, or decision matrices.",unchanged @@ -23,7 +23,7 @@ https://learn.microsoft.com/en-us/azure/quantum/concepts-vectors-and-matrices,Ve https://learn.microsoft.com/en-us/azure/quantum/contributing-overview,Contributing to the Microsoft Quantum Development Kit,Contributing to the Quantum Development Kit - Azure Quantum,,Learn how to contribute to the Quantum Development Kit (QDK) and the quantum development community.,"Thanks for being a part of the Microsoft Quantum community, we're excited for your contributions!",2026-01-29T17:24:00.000Z,concept-article,,0.05,False,"Community contribution overview for the Quantum Development Kit; process/participation guidance, not technical expert knowledge per defined categories.",unchanged https://learn.microsoft.com/en-us/azure/quantum/further-reading-qdk,Further reading,Quantum computing learning resources - Azure Quantum,,A reference list with deep coverage of 
quantum computing topics if you want to learn more about quantum computer programming.,This article compiles some of the most popular resources that you may find useful when learning quantum computing.,2025-02-21T20:54:00.000Z,concept-article,,0.05,False,"Curated list of external learning resources; navigation/reference content without product-specific limits, configuration, or troubleshooting details.",unchanged https://learn.microsoft.com/en-us/azure/quantum/get-started-azure-quantum,Explore with Copilot in Microsoft Quantum,Copilot in Microsoft Quantum - Azure Quantum,,Learn about the resources available on the Microsoft Quantum website.,"Get started with quantum computing, discover the latest quantum breakthroughs, and create and run quantum programs with the help of Copilot in Microsoft Quantum on the Microsoft Quantum website. The Microsoft Quantum website features: All you need to start exploring Microsoft Quantum is a Microsoft (MSA) email account. You can create an MSA for free at https://account.microsoft.com/.",2026-01-29T17:24:00.000Z,get-started,,0.2,False,Marketing-style getting started page for Copilot in Microsoft Quantum; primarily navigational and conceptual.,unchanged -https://learn.microsoft.com/en-us/azure/quantum/how-to-add-a-provider,Add or remove a provider,Add or remove a provider to an existing workspace - Azure Quantum,,This article explains how to add or remove a provider to an existing Azure Quantum workspace,"When you create an Azure Quantum workspace, you choose the providers and plans that are available in the workspace. However, you can add new providers or remove existing providers at any time. 
If you submit a quantum job to a provider that isn't in your workspace, then you receive an error message that prompts you to add that provider to your workspace.",2026-04-13T22:05:00.000Z,how-to,,0.3,False,"How-to for adding/removing providers in a workspace; likely step-by-step UI/API instructions without detailed configuration parameter tables, limits, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/quantum/how-to-add-a-provider,Add or remove a provider,Add or remove a provider to an existing workspace - Azure Quantum,,This article explains how to add or remove a provider to an existing Azure Quantum workspace,"When you create an Azure Quantum workspace, you choose the providers and plans that are available in the workspace. However, you can add new providers or remove existing providers at any time. If you submit a quantum job to a provider that isn't in your workspace, then you receive an error message that prompts you to add that provider to your workspace.",2026-04-13T22:05:00.000Z,how-to,,0.3,False,"How-to for adding/removing providers in a workspace; likely step-by-step UI/API instructions without detailed configuration parameter tables, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/quantum/how-to-connect-workspace,Connect to your Azure Quantum workspace,Connect to your Azure Quantum workspace - Azure Quantum,Connect to Azure Quantum workspace via qdk.azure,Learn how to connect to and access your Azure Quantum workspace using connection string and workspace parameters.,"If you have an Azure Quantum workspace, then you can connect to your workspace and submit your code with the qdk.azure Python module. 
The qdk.azure module provides a Workspace class that represents an Azure Quantum workspace.",2026-01-14T01:12:00.000Z,how-to,integrations,0.65,True,Describes using qdk.azure Workspace class and connection strings; likely includes workspace parameter names and connection configuration details specific to Azure Quantum.,unchanged https://learn.microsoft.com/en-us/azure/quantum/how-to-create-workspace,Create an Azure Quantum workspace,Create an Azure Quantum workspace - Azure Quantum,,Learn about the different subscription plans available in Azure and how to create an Azure Quantum workspace.,"Learn how to create an Azure Quantum workspace in the Azure portal. An Azure Quantum workspace resource, or workspace for short, is a collection of assets associated with running quantum applications. You need a workspace to submit quantum programs to quantum hardware. Tip You can also create an Azure Quantum workspace using the Azure command-line interface (CLI). For more information, see Manage quantum workspaces with the Azure CLI.",2025-10-17T17:03:00.000Z,how-to,,0.4,False,How-to create a workspace via portal; likely step-by-step UI instructions without detailed config tables or limits.,unchanged https://learn.microsoft.com/en-us/azure/quantum/how-to-manage-quantum-workspaces-using-bicep,Work with Azure Quantum using Bicep,Manage quantum workspaces using Bicep - Azure Quantum,Deploy Azure Quantum workspaces using Bicep templates,This guide shows you how to create and delete quantum workspaces using Bicep.,"In this guide, learn to use a Bicep template to create Azure Quantum workspaces and the required resource groups and storage accounts. After template deployment, you can start running your quantum applications in Azure Quantum. Treating your infrastructure as code enables you to track changes to your infrastructure requirements and makes your deployments more consistent and repeatable. Bicep uses a declarative syntax that you treat like application code. 
If you're familiar with the JSON syntax f",2026-03-19T18:58:00.000Z,how-to,deployment,0.65,True,"Bicep-based infrastructure-as-code guide that likely defines Azure Quantum workspace resources with specific properties and schema fields. This is expert deployment/configuration knowledge for automating workspace provisioning, beyond generic Bicep usage.",unchanged @@ -32,7 +32,7 @@ https://learn.microsoft.com/en-us/azure/quantum/how-to-set-resource-locks,Protec https://learn.microsoft.com/en-us/azure/quantum/how-to-share-access-quantum-workspace,Share access to your Azure Quantum workspace,Share access to your Azure Quantum workspace - Azure Quantum,Share Azure Quantum workspace using RBAC roles,Learn how to share access to your Azure Quantum workspace. This guide helps you grant access to team members or students efficiently.,"Learn how to share access to yourAzure Quantum workspace. For example, you may need to grant your team members or students access to your workspace. You can assign roles to users to control their access to your workspace. For example, aContributorrole can create, delete, or modify a workspace, whereas aQuantum Workspace Data Contributorrole has limited permissions and can primarily just submit and view jobs. 
We recommend using the instructions in this article if you need to grant access to 10 or",2025-01-07T23:18:00.000Z,how-to,security,0.85,True,"Explicitly discusses assigning roles like Contributor and Quantum Workspace Data Contributor with differing permissions, which are product-specific security details.",unchanged https://learn.microsoft.com/en-us/azure/quantum/how-to-submit-jobs,Submit jobs with Q# and VS Code,Submit Q# Programs with VS Code - Azure Quantum,Submit and run Q# programs on Azure Quantum from VS Code,"This document provides a basic guide to submit and run Azure Quantum using the Azure portal, Python, Jupyter Notebooks, or the Azure CLI.","Learn how to use Visual Studio Code (VS Code) to create and submit Q# programs to real quantum hardware. You can submit quantum computing jobs to Azure Quantum as a standalone Q# program, combine Q# with Python in a Q# project, and run a Jupyter Notebook.",2025-11-11T23:35:00.000Z,how-to,deployment,0.6,True,"Explains how to submit Q# jobs to real hardware via VS Code, Python, and CLI; contains product-specific job submission patterns and constraints relevant to deployment.",unchanged https://learn.microsoft.com/en-us/azure/quantum/how-to-submit-re-jobs,Different ways to run the resource estimator,Run the Microsoft Quantum resource estimator - Azure Quantum,Run Microsoft Quantum resource estimator locally and online,This document provides a basic guide to run resource estimates both locally and online using different SDKs and IDEs.,"In this article, you learn how to work with theMicrosoft Quantum resource estimator. The resource estimator helps you estimate the resources required to run a quantum program on a quantum computer. Use the resource estimator to estimate the number of qubits, the number of gates, and the depth of the circuit required to run a quantum program. The resource estimator is available in Visual Studio Code (VS Code) as part of the Microsoft Quantum Development Kit (QDK) extension. 
For more information, ",2026-02-27T16:51:00.000Z,how-to,configuration,0.65,True,"Explains how to work with the resource estimator across SDKs and IDEs; likely includes specific commands, options, and environment settings unique to this tool.",unchanged -https://learn.microsoft.com/en-us/azure/quantum/how-to-use-molecule-visualizer,How to use the molecule visualizer,How to use the molecule visualizer with QDK for chemistry - Azure Quantum,Install and use the QDK molecule visualizer in Jupyter,"This article describes how to install, open, and use the molecule visualizer with QDK for chemistry.",The Microsoft Quantum Development Kit (QDK) includes a molecule visualizer to use with QDK for chemistry (QDK/Chemistry). You can use the visualizer to interact with the 3D structure of your molecule and overlay the molecular orbitals (MOs) in a Jupyter notebook.,2026-01-25T16:59:00.000Z,how-to,configuration,0.6,True,"Describes how to install, open, and use the molecule visualizer; likely includes configuration steps and environment settings specific to this tool.",unchanged +https://learn.microsoft.com/en-us/azure/quantum/how-to-use-molecule-visualizer,How to use the molecule visualizer,How to use the molecule visualizer with QDK for chemistry - Azure Quantum,,"This article describes how to install, open, and use the molecule visualizer with QDK for chemistry.",The Microsoft Quantum Development Kit (QDK) includes a molecule visualizer to use with QDK for chemistry (QDK/Chemistry). You can use the visualizer to interact with the 3D structure of your molecule and overlay the molecular orbitals (MOs) in a Jupyter notebook.,2026-04-21T22:44:00.000Z,how-to,,0.2,False,"The molecule visualizer article is primarily a how-to for installing, opening, and using a visualization tool in Jupyter. 
It’s a step-by-step usage/tutorial style page and doesn’t clearly indicate detailed configuration tables, limits, or product-specific error codes or decision matrices, so it doesn’t meet the expert-knowledge criteria for any sub-skill type.",updated https://learn.microsoft.com/en-us/azure/quantum/how-to-use-neutral-atom-visualizer,How to use the neutral atom device visualizers,How to use the neutral atom device visualizer to analyze programs on neutral atom quantum computers - Azure Quantum,,"This article gives an overview of the neutral atom simulator tools in the QDK, which allow users to simulate and visualize how their quantum programs run on neutral atom quantum computers.","The Microsoft Quantum Development Kit (QDK) offers several quantum simulators, including three simulators and a visualizer for neutral atom quantum computers. The neutral atom device visualizer produces an interactive diagram where you can track how qubits move and get processed when your program runs on a basic neutral atom device. This article explains how to create and interact with neutral atom diagrams from the visualizer. For instructions on how to install the simulators and visualizer, se",2026-02-27T16:51:00.000Z,how-to,,0.45,False,Explains how to use a visualizer UI and create diagrams; summary suggests usage guidance rather than detailed configuration or limits.,unchanged https://learn.microsoft.com/en-us/azure/quantum/how-to-visualize-circuits,Visualize Q# circuit diagrams,Visualize Quantum Circuits with the QDK - Azure Quantum,,"Learn how to visually represent Q# and OpenQASM quantum algorithms with circuit diagrams in VS Code, Python, and Jupyter Notebook.","Quantum circuit diagrams are a visual representation of quantum algorithms. Circuit diagrams show the flow of qubits through a quantum program, including the gates and measurements that the program applies to the qubits. 
In this article, you learn how to create circuit diagrams for Q# and OpenQASM programs with the Microsoft Quantum Development Kit (QDK) using Visual Studio Code (VS Code) and Jupyter Notebook. For more information about quantum circuit diagrams, see Quantum circuit diagram conven",2026-04-01T16:52:00.000Z,how-to,,0.2,False,"Tutorial-style guidance on creating and viewing circuit diagrams in VS Code and Jupyter; does not indicate presence of limits, configuration parameter tables, or error-code-based troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/quantum/how-to-work-with-jobs,Work with jobs,Introduction to jobs - Azure Quantum,,"This document provides a guide to working with jobs in Azure Quantum, including properties, lifecycle, and monitoring.","When you run a quantum program in Azure Quantum, you create and run a job. The steps to create and run a job depend on
Be sure to verify the latest pricing information from the Azure Quantum workspace that you're using.",2025-12-23T18:14:00.000Z,concept-article,decision-making,0.7,True,Provider-defined pricing details and plan differences; likely includes tables and quantified cost criteria to choose between providers and plans.,unchanged https://learn.microsoft.com/en-us/azure/quantum/provider-global-availability,Azure Quantum provider global availability,Global availability of Azure Quantum providers - Azure Quantum,Check regional availability of Azure Quantum providers,Check out the list of countries and regions where each of the Azure Quantum providers are available.,"The availability of the quantum computing providers IonQ, Pasqal, Quantinuum, and Rigetti varies based on the country/region where your billing account is registered. Select the tab for each provider to view its availability by country/region. IonQ offers a Pay-As-You-Go plan through Azure Quantum. For more information about IonQ resources on Azure Quantum, see IonQ provider and targets. PASQAL offers a Pay-As-You-Go plan through Azure Quantum. For more information about PASQAL resources on Azure",2025-10-15T23:11:00.000Z,feature-availability,decision-making,0.7,True,Provides per-country/region availability tables for each provider; supports decisions on which provider can be used based on billing region.,unchanged https://learn.microsoft.com/en-us/azure/quantum/provider-ionq,IonQ provider and targets,IonQ quantum computing provider - Azure Quantum,Configure and use IonQ targets in Azure Quantum,This document provides the technical details of the IonQ quantum computing provider,"IonQ’s quantum computers perform calculations by manipulating the hyperfine energy states of Ytterbium ions with lasers. Atoms are nature's qubits — every qubit is identical within and between programs. 
Logical operations can also be performed on any arbitrary pair of qubits, enabling complex quantum programs unhindered by physical connectivity. Want to learn more? Read IonQ’s trapped ion quantum computer technology overview. The following targets are available from this provider: IonQ's targets ",2025-10-03T17:59:00.000Z,concept-article,configuration,0.7,True,"Technical details of IonQ provider and targets; likely includes target IDs, supported operations, and constraints specific to IonQ integration.",unchanged -https://learn.microsoft.com/en-us/azure/quantum/provider-pasqal,PASQAL provider and targets,PASQAL quantum computing provider - Azure Quantum,Configure PASQAL simulators and QPUs in Azure Quantum,This document provides the technical details of the simulators and QPU of the PASQAL quantum provider.,"PASQAL's quantum computers control neutral atoms with optical tweezers, using laser light to manipulate quantum registers with up to a hundred qubits. The following targets available from this provider:",2025-06-02T16:22:00.000Z,concept-article,configuration,0.7,True,"Technical details of PASQAL simulators and QPU; likely includes target names, capabilities, and constraints unique to PASQAL.",unchanged +https://learn.microsoft.com/en-us/azure/quantum/provider-pasqal,Pasqal provider and targets,Pasqal quantum computing provider - Azure Quantum,Pasqal targets specifications and capacity limits in Azure Quantum,This document provides the technical details of the simulators and QPU of the Pasqal quantum provider.,"Pasqal's quantum computers control neutral atoms with optical tweezers, using laser light to manipulate quantum registers with up to a hundred qubits. 
The following targets are available from this provider:",2026-04-23T15:34:00.000Z,concept-article,limits-quotas,0.7,True,"Described as providing technical details of Pasqal simulators and QPU; such provider-specific target pages typically include qubit counts, shot limits, and other numeric constraints that qualify as expert limits/quotas information not generally known from training.",new https://learn.microsoft.com/en-us/azure/quantum/provider-quantinuum,Quantinuum provider and targets,Quantinuum provider - Azure Quantum,Configure Quantinuum quantum targets in Azure Quantum,This document provides the technical details of the Quantinuum quantum provider,"Quantinuum provides access to trapped-ion systems with high-fidelity, fully connected qubits, and the ability to perform mid-circuit measurement.",2025-10-01T18:36:00.000Z,concept-article,configuration,0.7,True,"Technical details of Quantinuum provider; likely includes target identifiers, mid-circuit measurement capabilities, and constraints.",unchanged https://learn.microsoft.com/en-us/azure/quantum/provider-rigetti,Rigetti provider and targets,Rigetti provider - Azure Quantum,Rigetti provider targets and hardware limits in Azure Quantum,This document provides the technical details of the Rigetti provider,"Rigetti quantum processors are universal, gate-model machines based on tunable superconducting qubits. System features and device characteristics include enhanced readout capabilities, a speedup in quantum processing times, fast gate times for multiple entangling gate families, rapid sampling via active register reset, and parametric control. The Rigetti provider makes the following targets available: Note Rigetti simulators and hardware targets don't support Cirq programs. 
Rigetti's targets corr",2026-04-10T21:22:00.000Z,concept-article,limits-quotas,0.7,True,"Provider technical details pages for specific quantum hardware typically list device-specific characteristics such as qubit counts, connectivity, gate times, sampling rates, and other numeric constraints that function as practical limits/quotas for jobs and circuits. These are expert, provider-specific values not inferable from general training data.",unchanged https://learn.microsoft.com/en-us/azure/quantum/provider-support-ionq,IonQ support policy,Support policy for IonQ in Azure Quantum - Azure Quantum,Support and escalation policy for IonQ on Azure Quantum,This document provides details on the support policy for the IonQ provider in Azure Quantum,"This article describes the Microsoft support policy that applies when you use the IonQ provider in Azure Quantum. The article applies to any of the targets under this provider. If you're using the IonQ provider and hit any unexpected issues that you can't troubleshoot yourself, you can contact the Azure Support team for help by creating an Azure support case. However, there are some situations where the Azure Support team will need to redirect you to IonQ's support team. Or, you may recei",2025-09-18T08:00:00.000Z,troubleshooting-general,troubleshooting,0.6,True,Describes when Azure vs IonQ support handles issues; includes product-specific guidance on troubleshooting and escalation paths.,unchanged -https://learn.microsoft.com/en-us/azure/quantum/provider-support-pasqal,PASQAL support policy,Support Policy for PASQAL in Azure Quantum - Azure Quantum,Support policy for PASQAL on Azure Quantum,This document provides details on the support policy for the PASQAL quantum provider in Azure Quantum,This article describes the Microsoft support policy that applies when you use the PASQAL provider in Azure Quantum. 
The article applies to any of thetargetsoffered by PASQAL.,2025-03-11T18:26:00.000Z,troubleshooting-general,troubleshooting,0.6,True,Outlines support responsibilities and escalation for PASQAL targets; product-specific troubleshooting and support routing information.,unchanged +https://learn.microsoft.com/en-us/azure/quantum/provider-support-pasqal,Pasqal support policy,Support Policy for Pasqal in Azure Quantum - Azure Quantum,,This document provides details on the support policy for the Pasqal quantum provider in Azure Quantum,This article describes the Microsoft support policy that applies when you use the Pasqal provider in Azure Quantum. The article applies to any of the targets offered by Pasqal.,2026-04-23T15:34:00.000Z,troubleshooting-general,,0.3,False,"Support policy description is usually procedural/contractual (what Microsoft supports and when) rather than technical expert knowledge like limits, configuration parameters, or troubleshooting mappings.",new 
You may receive a quicker response by reaching out to Quantinuum directly",2025-02-14T08:00:00.000Z,troubleshooting-general,troubleshooting,0.6,True,Describes support boundaries and escalation for Quantinuum targets; specific to Azure Quantum–Quantinuum integration.,unchanged https://learn.microsoft.com/en-us/azure/quantum/provider-support-rigetti,Rigetti support policy,Support Policy for Rigetti in Azure Quantum - Azure Quantum,Support policy for Rigetti on Azure Quantum,This document provides details on the support policy for the Rigetti provider in Azure Quantum,This article describes the Microsoft support policy that applies when you use the Rigetti provider in Azure Quantum. The article applies to any of the targets offered by the Rigetti provider.,2024-09-16T08:00:00.000Z,troubleshooting-general,troubleshooting,0.6,True,Defines how issues with Rigetti targets are handled between Azure and Rigetti support; unique troubleshooting and escalation guidance.,unchanged -https://learn.microsoft.com/en-us/azure/quantum/qc-target-list,List of quantum computing providers,List of quantum computing providers on Azure Quantum - Azure Quantum,,This document provides a list of the available quantum computing providers on Azure Quantum.,"Azure Quantum offers various quantum solutions, such as different quantum hardware devices and quantum simulators that you can use to run your quantum computing programs. This article lists the providers that you can access with Azure Quantum, and provides a description of what each provider offers. Important Quantum hardware devices are still an emerging technology. These devices have some limitations and requirements for quantum programs that run on them. 
For more information, see thetarget pr",2025-10-15T23:11:00.000Z,concept-article,,0.4,False,"Lists providers and brief descriptions; primarily catalog/overview without detailed limits, configs, or troubleshooting.",unchanged +https://learn.microsoft.com/en-us/azure/quantum/qc-target-list,List of quantum computing providers,List of quantum computing providers on Azure Quantum - Azure Quantum,,This document provides a list of the available quantum computing providers on Azure Quantum.,"Azure Quantum offers various quantum solutions, such as different quantum hardware devices and quantum simulators that you can use to run your quantum computing programs. This article lists the providers that you can access with Azure Quantum, and provides a description of what each provider offers. Important Quantum hardware devices are still an emerging technology. These devices have some limitations and requirements for quantum programs that run on them. For more information, see the target pr",2026-04-23T15:34:00.000Z,concept-article,,0.2,False,"Primarily a catalog/list of available Azure Quantum providers and targets with descriptions; no detailed limits, configuration parameters, error codes, or decision matrices are indicated in the summary.",updated 
You can use the circuits that you build directly in your Q# programs as callable operations.",2026-03-12T14:55:00.000Z,how-to,,0.1,False,"Introduces the circuit editor feature and how to build/visualize circuits; appears to be a conceptual/usage overview without detailed configuration parameters, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/quantum/qdk-main-overview,Quantum Development Kit overview,Build Quantum Solutions with the Microsoft Quantum Development Kit - Azure Quantum,,"This document is the main landing page for the Microsoft Quantum Development Kit (QDK), which gives a high level overview of the QDK and links to documentation on all the features in the QDK.",The Microsoft Quantum Development Kit (QDK) is a free open-source software development kit designed specifically for quantum program development. The QDK includes an extension for Microsoft's Visual Studio Code (VS Code) and a set of Python libraries. Install the QDK to get started with state-of-the-art quantum development tools.,2026-01-30T19:51:00.000Z,overview,,0.2,False,Landing page for QDK; high-level overview and navigation without deep technical specifics.,unchanged -https://learn.microsoft.com/en-us/azure/quantum/qdk-openqasm-integration,Overview of OpenQASM in the QDK,How to run OpenQASM programs in with Microsoft Quantum Development Kit - Azure Quantum,Run OpenQASM programs with Azure Quantum QDK,This document explains how to run OpenQASM programs in the Microsoft Quantum Development Kit in VS Code and Jupyter Notebook,"The Microsoft Quantum Development Kit (QDK) provides different development environments for OpenQASM programs with Azure Quantum integration. In this article, you learn how to develop and run OpenQASM code in the following QDK environments: Install the latest version ofVS Code To work directly with OpenQASM files, install the latest version of theQDK extensionin VS Code. 
To work with OpenQASM in Python and Jupyter Notebook, install the latest version of thePython extensionand theJupyter extensio",2026-04-14T15:31:00.000Z,how-to,integrations,0.65,True,"The page describes product-specific integration of OpenQASM with the Microsoft Quantum Development Kit and Azure Quantum, including how to run OpenQASM code in VS Code and Jupyter environments using QDK extensions. This is concrete integration guidance rather than a conceptual overview, but it focuses on environment setup and usage rather than limits, security, or deployment.",updated +https://learn.microsoft.com/en-us/azure/quantum/qdk-openqasm-integration,Overview of OpenQASM in the QDK,How to run OpenQASM programs with the Microsoft Quantum Development Kit - Azure Quantum,Run OpenQASM programs with Azure Quantum QDK,This document explains how to run OpenQASM programs in the Microsoft Quantum Development Kit in VS Code and Jupyter Notebook,"The Microsoft Quantum Development Kit (QDK) provides different development environments for OpenQASM programs with Azure Quantum integration. In this article, you learn how to develop and run OpenQASM code in the following QDK environments: Install the latest version of VS Code To work directly with OpenQASM files, install the latest version of the QDK extension in VS Code. To work with OpenQASM in Python and Jupyter Notebook, install the latest version of the Python extension and the Jupyter extensio",2026-04-14T15:31:00.000Z,how-to,integrations,0.65,True,"The page describes the product-specific integration of OpenQASM with the Microsoft Quantum Development Kit and Azure Quantum, including how to run OpenQASM code in VS Code and Jupyter environments using QDK extensions. 
This is concrete integration guidance rather than a conceptual overview, but it focuses on environment setup and usage rather than limits, security, or deployment.",unchanged https://learn.microsoft.com/en-us/azure/quantum/qdk-qiskit-cirq-overview,Python quantum language support in the QDK,Use the Qiskit and Cirq Python libraries in the Microsoft Quantum Development Kit - Azure Quantum,Use Qiskit and Cirq with the Quantum Development Kit,"This document gives an overview of the Qiskit and Cirq Python libraries, and how the QDK supports these features.","The Microsoft Quantum Development Kit (QDK) supports development in several quantum programming frameworks, including Qiskit and Cirq. Qiskit and Cirq are Python libraries that offer tools to create and visualize quantum circuits and write quantum programs. The QDK supports some interoperability with Qiskit versions 1 and 2, and with Cirq. You can write your quantum programs in these frameworks and then run your programs on the local simulator or submit them as jobs to Azure Quantum with the QDK",2025-12-22T23:45:00.000Z,overview,integrations,0.7,True,"Covers interoperability details between QDK and Qiskit/Cirq, including supported versions and ways to run programs on simulators and Azure Quantum; these are product-specific integration patterns.",unchanged https://learn.microsoft.com/en-us/azure/quantum/qdk-simulator-noise-models,Build noise models for neutral atom simulations,How to build noise models for neutral atom device simulations - Azure Quantum,Configure noise models for neutral atom simulations in QDK,"This article describes the noise models that the neutral atoms simulators support, and how to include noise in neutral atom device simulations.","The neutral atom device simulators in the Microsoft Quantum Development Kit (QDK) can model noise that occurs when you run programs on a neutral atom quantum computer. 
This article explains the kinds of noise that the neutral atom simulators support and how to include a noise model in your simulation. For instructions on how to install and use the neutral atom device simulators, see How to install and use the neutral atom device simulators in the QDK.",2026-02-25T17:42:00.000Z,how-to,configuration,0.65,True,"Describes supported noise models and how to include them in simulations; likely lists model types and parameters, which are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/quantum/qdk-vscode-agent-setup,Visual Studio Code agent mode,How to set up and use agent mode in VS Code for the QDK - Azure Quantum,Use Copilot agent mode effectively with QDK in VS Code,This document describes how to set up and use GitHub Copilot's agent mode in VS Code to enhance the QDK user experience.,"Use agent mode in Visual Studio Code (VS Code), powered by GitHub Copilot, to enhance your builder experience with the Microsoft Quantum Development Kit (QDK) extension. Agent mode is an AI-assisted development experience that helps you write and debug code, and complete other development tasks in VS Code. 
Although you can use agent mode for the QDK without any setup, follow these tips to get the most out of agent mode in your QDK projects:",2025-06-16T16:59:00.000Z,how-to,best-practices,0.65,True,Provides tips to get the most out of agent mode for QDK projects; likely includes product-specific recommendations and patterns for using the AI assistant with QDK.,unchanged @@ -87,7 +87,7 @@ https://learn.microsoft.com/en-us/azure/quantum/qsharp-quickstart,Create your fi https://learn.microsoft.com/en-us/azure/quantum/qsharp-ways-to-work,Ways to run Q# programs,Development Options for Quantum Programming with Q# - Azure Quantum,Select development environments for Q# and the Quantum Development Kit,This article describes the environment options for developing quantum programs with Q# and the QDK.,"Azure Quantum offers different development options for writing and running quantum programs. Each environment uses the Microsoft Quantum Development Kit (QDK), a set of open-source tools that includes the Q# programming language. For more information, see Introduction to Q#. In this article, you learn the differences between each option and how to choose the right one for your needs.",2026-01-29T17:24:00.000Z,get-started,decision-making,0.65,True,Compares different environment options and helps choose the right one; environment selection guidance with criteria is decision-making content.,unchanged https://learn.microsoft.com/en-us/azure/quantum/quantum-computing-target-profiles,Types of target profiles,Target Profile Types - Azure Quantum,,This document provides an overview of target profile types available in Azure Quantum and their limitations.,"Quantum devices are still an emerging technology and unfortunately not all of them can run every Q# program. As such, you need to keep some restrictions in mind while you develop quantum programs. The target profile types define the capabilities of the quantum devices that you target with your Q# programs. 
The Microsoft Quantum Development Kit (QDK) has a set of different target profile types, which together support all the capabilities of the current quantum devices that are available in Azure ",2025-10-27T17:20:00.000Z,how-to,,0.5,False,Overview of target profile types and capabilities; likely conceptual constraints rather than numeric limits or config tables.,unchanged https://learn.microsoft.com/en-us/azure/quantum/quickstart-microsoft-cirq,Submit a circuit with Cirq,Submit Cirq quantum circuits to Azure Quantum - Azure Quantum,Submit Cirq circuits to Azure Quantum with QDK,Learn how to submit Cirq quantum circuits to the Azure Quantum service.,"Learn how to submit a Cirq quantum circuit using the qdk.azure.cirq Python submodule. You can submit Cirq circuits to Azure Quantum using the Microsoft Quantum Development Kit (QDK) and Jupyter Notebook in Visual Studio Code (VS Code) from your local machine. For more information, see Quantum circuits.",2026-01-29T17:24:00.000Z,how-to,integrations,0.7,True,Shows how to use the qdk.azure.cirq submodule to submit Cirq circuits; contains product-specific module names and usage patterns for integration.,unchanged -https://learn.microsoft.com/en-us/azure/quantum/quickstart-microsoft-provider-format,Submit a circuit in provider-specific format,Submit formatted quantum circuits - Azure Quantum,"Submit QIR, OpenQASM, and Pulser circuits to Azure Quantum","Learn how to submit specific formatted quantum circuits with QIR, OpenQASM, or Pulser SDK to the Azure Quantum service.","Learn how to use theqdk.azurePython module to submit circuits in specific formats to the Azure Quantum service. This article shows you how to submit circuits in the following formats: For more information, seeQuantum circuits.",2026-04-13T22:05:00.000Z,how-to,integrations,0.6,True,"The article explains how to use the qdk.azure Python module to submit quantum circuits in specific formats (QIR, OpenQASM, Pulser SDK) to the Azure Quantum service. 
This is a product-specific coding and integration pattern for circuit submission, not just a conceptual overview, and aligns best with integrations & coding patterns.",updated +https://learn.microsoft.com/en-us/azure/quantum/quickstart-microsoft-provider-format,Submit a circuit in provider-specific format,Submit formatted quantum circuits - Azure Quantum,"Submit QIR, OpenQASM, and Pulser circuits to Azure Quantum","Learn how to submit specific formatted quantum circuits with QIR, OpenQASM, or Pulser SDK to the Azure Quantum service.","Learn how to use the qdk.azure Python module to submit circuits in specific formats to the Azure Quantum service. This article shows you how to submit circuits in the following formats: For more information, see Quantum circuits.",2026-04-13T22:05:00.000Z,how-to,integrations,0.6,True,"The article explains how to use the qdk.azure Python module to submit quantum circuits in specific formats (QIR, OpenQASM, Pulser SDK) to the Azure Quantum service. This is a product-specific coding and integration pattern for circuit submission, not just a conceptual overview, and aligns best with integrations & coding patterns.",unchanged 
Note It's a best practice to run your quantum program on a simulator target before you submit ",2026-03-11T22:58:00.000Z,how-to,,0.2,False,"Quickstart/tutorial for submitting Qiskit programs via QDK; primarily step-by-step usage in VS Code with no indication of limits, quotas, configuration tables, error-code mappings, or other product-specific expert reference details.",unchanged https://learn.microsoft.com/en-us/azure/quantum/quickstart-microsoft-resources-estimator,Run your first resource estimate,Quickstart: Learn to use the Microsoft Quantum resource estimator - Azure Quantum,,Learn how to submit a Q# sample to the Microsoft Quantum resource estimator to estimate the resources of a Q# program.,"In this quickstart, you learn how to use the Microsoft Quantum resource estimator to estimate the resources of a Q# program.",2026-02-27T16:51:00.000Z,quickstart,,0.45,False,Quickstart for submitting a sample to the resource estimator; likely step-by-step tutorial without comprehensive configuration tables or limits.,unchanged https://learn.microsoft.com/en-us/azure/quantum/release-notes,Azure Quantum release notes,Release notes for the QDK and Azure Quantum - latest version - Azure Quantum,,Learn about the latest updates to the Microsoft Quantum Development Kit (QDK) and Azure Quantum.,"The release notes outline updates to the Microsoft Quantum Development Kit (QDK) and the Azure Quantum service. For 'getting started' instructions, see Set up Azure Quantum. 
For instructions on how to update your QDK to the latest version, see Update the Microsoft Quantum Development Kit to the latest version.",2026-01-29T17:24:00.000Z,release-notes,,0.3,False,"Release notes; while detailed, they are temporal change logs rather than stable expert configuration or limits content for skills.",unchanged diff --git a/products/azure-quantum/report.md b/products/azure-quantum/report.md index 44da695d..147ad5c7 100644 --- a/products/azure-quantum/report.md +++ b/products/azure-quantum/report.md @@ -1,11 +1,15 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: troubleshooting: 'Troubleshooting Azure Quantum provider issues: diagnosing job failures and understanding support/escalation policies and limits for IonQ, PASQAL, Quantinuum, and Rigetti.' - limits-quotas: Managing Azure Quantum quotas, job/session limits, timeouts, and - Rigetti-specific hardware constraints and target capabilities. + limits-quotas: Usage limits, quotas, session timeouts, and hardware capacity specs + for Azure Quantum targets (e.g., Pasqal, Rigetti), plus how to monitor and manage + these constraints. + configuration: 'Configuring Azure Quantum environments: workspaces, simulators, + hardware targets (IonQ/Quantinuum/neutral atoms), VS Code/QDK setup, and tuning/optimizing + resource estimator runs.' security: 'Managing secure access to Azure Quantum workspaces: RBAC and access control, bulk user assignment, ARM locks, managed identities, service principals, and secure handling of access keys.' @@ -14,9 +18,6 @@ category_descriptions: Adaptive RI deployment: Deploying Azure Quantum workspaces with Bicep and running/submitting Q# quantum programs from VS Code to Azure Quantum backends - configuration: Configuring Azure Quantum workspaces, QDK tools, simulators, noise - models, and hardware targets (IonQ, PASQAL, Quantinuum, Rigetti), plus tuning - and batching resource estimator runs. 
architecture-patterns: Guidance on designing hybrid quantum-classical workflows in Azure Quantum, including architecture options, orchestration patterns, and when to offload tasks to quantum hardware. @@ -29,13 +30,12 @@ category_descriptions: skill_description: Expert knowledge for Azure Quantum development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when - using Azure Quantum workspaces, QDK/Q#, QIR/OpenQASM circuits, IonQ/PASQAL/Quantinuum/Rigetti - targets, or hybrid jobs, and other Azure Quantum related development tasks. Not - for Azure HPC Cache (use azure-hpc-cache), Azure Batch (use azure-batch), Azure - Databricks (use azure-databricks), Azure Machine Learning (use azure-machine-learning). -use_when: Use when using Azure Quantum workspaces, QDK/Q#, QIR/OpenQASM circuits, - IonQ/PASQAL/Quantinuum/Rigetti targets, or hybrid jobs, and other Azure Quantum - related development tasks. + using QDK/VS Code, IonQ/PASQAL/Quantinuum/Rigetti targets, QIR/OpenQASM/Qiskit circuits, + or hybrid jobs, and other Azure Quantum related development tasks. Not for Azure + HPC Cache (use azure-hpc-cache), Azure Batch (use azure-batch), Azure Databricks + (use azure-databricks), Azure Machine Learning (use azure-machine-learning). +use_when: Use when using QDK/VS Code, IonQ/PASQAL/Quantinuum/Rigetti targets, QIR/OpenQASM/Qiskit + circuits, or hybrid jobs, and other Azure Quantum related development tasks. confusable_not_for: Not for Azure HPC Cache (use azure-hpc-cache), Azure Batch (use azure-batch), Azure Databricks (use azure-databricks), Azure Machine Learning (use azure-machine-learning). 
@@ -47,14 +47,14 @@ confusable_not_for: Not for Azure HPC Cache (use azure-hpc-cache), Azure Batch ( - **Total Pages**: 135 - **Fetched**: 135 - **Fetch Failed**: 0 -- **Classified**: 45 -- **Unclassified**: 90 +- **Classified**: 44 +- **Unclassified**: 91 ### Incremental Update -- **New Pages**: 0 -- **Updated Pages**: 6 -- **Unchanged**: 129 -- **Deleted Pages**: 1 +- **New Pages**: 2 +- **Updated Pages**: 3 +- **Unchanged**: 130 +- **Deleted Pages**: 2 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-quantum/azure-quantum.csv` ## Classification Statistics @@ -63,35 +63,35 @@ confusable_not_for: Not for Azure HPC Cache (use azure-hpc-cache), Azure Batch ( |------|-------|------------| | architecture-patterns | 1 | 0.7% | | best-practices | 3 | 2.2% | -| configuration | 13 | 9.6% | +| configuration | 12 | 8.9% | | decision-making | 5 | 3.7% | | deployment | 2 | 1.5% | | integrations | 6 | 4.4% | -| limits-quotas | 3 | 2.2% | +| limits-quotas | 4 | 3.0% | | security | 7 | 5.2% | -| troubleshooting | 5 | 3.7% | -| *(Unclassified)* | 90 | 66.7% | +| troubleshooting | 4 | 3.0% | +| *(Unclassified)* | 91 | 67.4% | ## Changes +### New Pages + +- [Pasqal provider and targets](https://learn.microsoft.com/en-us/azure/quantum/provider-pasqal) +- [Pasqal support policy](https://learn.microsoft.com/en-us/azure/quantum/provider-support-pasqal) + ### Updated Pages -- [Overview of OpenQASM in the QDK](https://learn.microsoft.com/en-us/azure/quantum/qdk-openqasm-integration) - - Updated: 2025-07-07T21:40:00.000Z → 2026-04-14T15:31:00.000Z -- [Submit a circuit in provider-specific format](https://learn.microsoft.com/en-us/azure/quantum/quickstart-microsoft-provider-format) - - Updated: 2026-01-14T01:12:00.000Z → 2026-04-13T22:05:00.000Z -- [Azure Quantum quotas](https://learn.microsoft.com/en-us/azure/quantum/azure-quantum-quotas) - - Updated: 2025-05-21T18:28:00.000Z → 2026-04-13T22:05:00.000Z -- [Billing and job costs in Azure 
Quantum](https://learn.microsoft.com/en-us/azure/quantum/azure-quantum-job-cost-billing) - - Updated: 2025-04-02T21:50:00.000Z → 2026-04-13T22:05:00.000Z -- [Add or remove a provider](https://learn.microsoft.com/en-us/azure/quantum/how-to-add-a-provider) - - Updated: 2025-09-30T15:51:00.000Z → 2026-04-13T22:05:00.000Z -- [Troubleshooting Azure Quantum](https://learn.microsoft.com/en-us/azure/quantum/azure-quantum-common-issues) - - Updated: 2025-10-17T17:03:00.000Z → 2026-04-13T22:05:00.000Z +- [List of quantum computing providers](https://learn.microsoft.com/en-us/azure/quantum/qc-target-list) + - Updated: 2025-10-15T23:11:00.000Z → 2026-04-23T15:34:00.000Z +- [Backend simulators](https://learn.microsoft.com/en-us/azure/quantum/backend-simulators) + - Updated: 2025-10-15T23:23:00.000Z → 2026-04-23T15:34:00.000Z +- [How to use the molecule visualizer](https://learn.microsoft.com/en-us/azure/quantum/how-to-use-molecule-visualizer) + - Updated: 2026-01-25T16:59:00.000Z → 2026-04-21T22:44:00.000Z ### Deleted Pages -- ~~Perform long-running experiments~~ (https://learn.microsoft.com/en-us/azure/quantum/how-to-long-running-experiments) +- ~~PASQAL provider and targets~~ (https://learn.microsoft.com/en-us/azure/quantum/provider-pasqal) +- ~~PASQAL support policy~~ (https://learn.microsoft.com/en-us/azure/quantum/provider-support-pasqal) ## Classified Pages @@ -110,13 +110,14 @@ confusable_not_for: Not for Azure HPC Cache (use azure-hpc-cache), Azure Batch ( | [Azure Quantum pricing](https://learn.microsoft.com/en-us/azure/quantum/pricing) | decision-making | 0.70 | Provider-defined pricing details and plan differences; likely includes tables and quantified cost criteria to choose between providers and plans. 
| | [Azure Quantum provider global availability](https://learn.microsoft.com/en-us/azure/quantum/provider-global-availability) | decision-making | 0.70 | Provides per-country/region availability tables for each provider; supports decisions on which provider can be used based on billing region. | | [IonQ provider and targets](https://learn.microsoft.com/en-us/azure/quantum/provider-ionq) | configuration | 0.70 | Technical details of IonQ provider and targets; likely includes target IDs, supported operations, and constraints specific to IonQ integration. | -| [PASQAL provider and targets](https://learn.microsoft.com/en-us/azure/quantum/provider-pasqal) | configuration | 0.70 | Technical details of PASQAL simulators and QPU; likely includes target names, capabilities, and constraints unique to PASQAL. | +| [Pasqal provider and targets](https://learn.microsoft.com/en-us/azure/quantum/provider-pasqal) | limits-quotas | 0.70 | Described as providing technical details of Pasqal simulators and QPU; such provider-specific target pages typically include qubit counts, shot limits, and other numeric constraints that qualify as expert limits/quotas information not generally known from training. | | [Protect Azure Quantum with resource locks](https://learn.microsoft.com/en-us/azure/quantum/how-to-set-resource-locks) | security | 0.70 | Shows how to apply ARM resource locks to workspaces and storage; product-specific security hardening guidance with concrete lock types and scenarios. | | [Python quantum language support in the QDK](https://learn.microsoft.com/en-us/azure/quantum/qdk-qiskit-cirq-overview) | integrations | 0.70 | Covers interoperability details between QDK and Qiskit/Cirq, including supported versions and ways to run programs on simulators and Azure Quantum; these are product-specific integration patterns. 
| | [Quantinuum provider and targets](https://learn.microsoft.com/en-us/azure/quantum/provider-quantinuum) | configuration | 0.70 | Technical details of Quantinuum provider; likely includes target identifiers, mid-circuit measurement capabilities, and constraints. | | [Rigetti provider and targets](https://learn.microsoft.com/en-us/azure/quantum/provider-rigetti) | limits-quotas | 0.70 | Provider technical details pages for specific quantum hardware typically list device-specific characteristics such as qubit counts, connectivity, gate times, sampling rates, and other numeric constraints that function as practical limits/quotas for jobs and circuits. These are expert, provider-specific values not inferable from general training data. | | [Submit a circuit with Cirq](https://learn.microsoft.com/en-us/azure/quantum/quickstart-microsoft-cirq) | integrations | 0.70 | Shows how to use the qdk.azure.cirq submodule to submit Cirq circuits; contains product-specific module names and usage patterns for integration. | | [Work with Azure Quantum using the Azure CLI](https://learn.microsoft.com/en-us/azure/quantum/how-to-manage-quantum-workspaces-with-the-azure-cli) | configuration | 0.70 | CLI-focused how-to that likely includes specific Azure Quantum workspace-related parameters, required resource types, and command options (for resource groups, storage accounts, and workspace creation/deletion). These are product-specific configuration details rather than generic concepts. | +| [Backend simulators](https://learn.microsoft.com/en-us/azure/quantum/backend-simulators) | configuration | 0.65 | A backend simulators page for specific quantum providers typically lists provider-specific simulator targets, configuration options, and possibly parameter names/values (such as target IDs, shot counts, or precision settings) that are unique to Azure Quantum and not broadly known. This is concrete, product-specific configuration knowledge rather than just conceptual overview. 
| | [Build noise models for neutral atom simulations](https://learn.microsoft.com/en-us/azure/quantum/qdk-simulator-noise-models) | configuration | 0.65 | Describes supported noise models and how to include them in simulations; likely lists model types and parameters, which are product-specific configuration details. | | [Compare multiple configurations](https://learn.microsoft.com/en-us/azure/quantum/resource-estimator-batching) | configuration | 0.65 | Describes running estimates for multiple configurations and comparing them; likely includes specific API options or parameters for batching. | | [Connect to your Azure Quantum workspace](https://learn.microsoft.com/en-us/azure/quantum/how-to-connect-workspace) | integrations | 0.65 | Describes using qdk.azure Workspace class and connection strings; likely includes workspace parameter names and connection configuration details specific to Azure Quantum. | @@ -132,12 +133,10 @@ confusable_not_for: Not for Azure HPC Cache (use azure-hpc-cache), Azure Batch ( | [Work with Azure Quantum using Bicep](https://learn.microsoft.com/en-us/azure/quantum/how-to-manage-quantum-workspaces-using-bicep) | deployment | 0.65 | Bicep-based infrastructure-as-code guide that likely defines Azure Quantum workspace resources with specific properties and schema fields. This is expert deployment/configuration knowledge for automating workspace provisioning, beyond generic Bicep usage. | | [Analyze cryptographic protocols](https://learn.microsoft.com/en-us/azure/quantum/resource-estimator-quantum-safe-planning) | decision-making | 0.60 | Guides using the estimator to analyze resources needed to break encryption; likely includes scenario-based guidance and criteria for assessing cryptographic strength and planning migration. 
| | [Debug and test your Q# code](https://learn.microsoft.com/en-us/azure/quantum/testing-debugging) | best-practices | 0.60 | Covers unit tests, assertions, and dump functions specific to Q# and QDK; these are product-specific debugging patterns and gotchas. | -| [How to use the molecule visualizer](https://learn.microsoft.com/en-us/azure/quantum/how-to-use-molecule-visualizer) | configuration | 0.60 | Describes how to install, open, and use the molecule visualizer; likely includes configuration steps and environment settings specific to this tool. | | [Install the neutral atom simulators](https://learn.microsoft.com/en-us/azure/quantum/install-qdk-neutral-atom-simulators) | configuration | 0.60 | Describes installing simulation Python modules and running simulations; likely includes package names, commands, and possibly configuration options specific to these simulators. | | [Introduction to hybrid QC](https://learn.microsoft.com/en-us/azure/quantum/hybrid-computing-overview) | architecture-patterns | 0.60 | Explains different hybrid implementation types and how to choose the best approach; this is explicit decision/architecture guidance specific to Azure Quantum hybrid models. | | [IonQ support policy](https://learn.microsoft.com/en-us/azure/quantum/provider-support-ionq) | troubleshooting | 0.60 | Describes when Azure vs IonQ support handles issues; includes product-specific guidance on troubleshooting and escalation paths. | | [Optimize large programs](https://learn.microsoft.com/en-us/azure/quantum/resource-estimator-handle-large-programs) | best-practices | 0.60 | Focuses on optimizing execution time for large programs; likely includes concrete recommendations and patterns specific to this tool (for example, how to structure code or configuration to reduce runtime). 
| -| [PASQAL support policy](https://learn.microsoft.com/en-us/azure/quantum/provider-support-pasqal) | troubleshooting | 0.60 | Outlines support responsibilities and escalation for PASQAL targets; product-specific troubleshooting and support routing information. | | [Quantinuum support policy](https://learn.microsoft.com/en-us/azure/quantum/provider-support-quantinuum) | troubleshooting | 0.60 | Describes support boundaries and escalation for Quantinuum targets; specific to Azure Quantum–Quantinuum integration. | | [Rigetti support policy](https://learn.microsoft.com/en-us/azure/quantum/provider-support-rigetti) | troubleshooting | 0.60 | Defines how issues with Rigetti targets are handled between Azure and Rigetti support; unique troubleshooting and escalation guidance. | | [Submit a circuit in provider-specific format](https://learn.microsoft.com/en-us/azure/quantum/quickstart-microsoft-provider-format) | integrations | 0.60 | The article explains how to use the qdk.azure Python module to submit quantum circuits in specific formats (QIR, OpenQASM, Pulser SDK) to the Azure Quantum service. This is a product-specific coding and integration pattern for circuit submission, not just a conceptual overview, and aligns best with integrations & coding patterns. | @@ -155,7 +154,6 @@ confusable_not_for: Not for Azure HPC Cache (use azure-hpc-cache), Azure Batch ( | [Run your first resource estimate](https://learn.microsoft.com/en-us/azure/quantum/quickstart-microsoft-resources-estimator) | 0.45 | Quickstart for submitting a sample to the resource estimator; likely step-by-step tutorial without comprehensive configuration tables or limits. | | [Create an Azure Quantum workspace](https://learn.microsoft.com/en-us/azure/quantum/how-to-create-workspace) | 0.40 | How-to create a workspace via portal; likely step-by-step UI instructions without detailed config tables or limits. 
| | [Create your first Q# program](https://learn.microsoft.com/en-us/azure/quantum/qsharp-quickstart) | 0.40 | Introductory quickstart to create a basic Q# program; primarily tutorial content without detailed configuration or limits. | -| [List of quantum computing providers](https://learn.microsoft.com/en-us/azure/quantum/qc-target-list) | 0.40 | Lists providers and brief descriptions; primarily catalog/overview without detailed limits, configs, or troubleshooting. | | [Neutral atom simulator overview](https://learn.microsoft.com/en-us/azure/quantum/overview-qdk-neutral-atom-simulator) | 0.40 | Overview of neutral atom simulators and when to use them; no clear indication of detailed parameters, limits, or error codes. | | [Sparse simulator](https://learn.microsoft.com/en-us/azure/quantum/sparse-simulator) | 0.40 | Describes the sparse simulator conceptually; summary does not indicate specific configuration parameters, limits, or error codes. | | [Work with jobs](https://learn.microsoft.com/en-us/azure/quantum/how-to-work-with-jobs) | 0.40 | Intro to jobs and lifecycle; summary suggests conceptual guidance without detailed limits, configs, or error mappings. | @@ -167,6 +165,7 @@ confusable_not_for: Not for Azure HPC Cache (use azure-hpc-cache), Azure Batch ( | [Conjugations](https://learn.microsoft.com/en-us/azure/quantum/user-guide/language/expressions/conjugations) | 0.30 | Conjugations and quantum memory patterns are explained conceptually; no concrete limits, configs, or decision matrices. | | [Double-factorized chemistry with the resource estimator](https://learn.microsoft.com/en-us/azure/quantum/tutorial-resource-estimator-chemistry) | 0.30 | Resource estimator tutorial for quantum chemistry; shows how to use a tool but does not enumerate service limits, configs, or decision matrices. 
| | [Namespaces](https://learn.microsoft.com/en-us/azure/quantum/user-guide/language/programstructure/namespaces) | 0.30 | Language reference for namespaces in Q#; mostly syntax and semantics, which are general language knowledge rather than product operational expertise. | +| [Pasqal support policy](https://learn.microsoft.com/en-us/azure/quantum/provider-support-pasqal) | 0.30 | Support policy description is usually procedural/contractual (what Microsoft supports and when) rather than technical expert knowledge like limits, configuration parameters, or troubleshooting mappings. | | [Quantum Intermediate Representation](https://learn.microsoft.com/en-us/azure/quantum/concepts-qir) | 0.30 | Describes Quantum Intermediate Representation (QIR) and its role; appears as a conceptual/standards overview without concrete config tables, limits, or decision matrices. | | [T gates & T factories](https://learn.microsoft.com/en-us/azure/quantum/concepts-tfactories) | 0.30 | Describes T gates and T factories and mentions the resource estimator, but summary suggests high-level explanation rather than concrete config tables, limits, or decision guidance. | | [Type declarations](https://learn.microsoft.com/en-us/azure/quantum/user-guide/language/programstructure/typedeclarations) | 0.30 | Language reference for struct type declarations in Q#; describes syntax and behavior, which is core language knowledge rather than operational product configuration. | @@ -186,7 +185,6 @@ confusable_not_for: Not for Azure HPC Cache (use azure-hpc-cache), Azure Batch ( | [Type inference](https://learn.microsoft.com/en-us/azure/quantum/user-guide/language/typesystem/typeinference) | 0.25 | Type inference algorithm description; conceptual language feature, no configuration matrices. 
| | [Type parameterizations](https://learn.microsoft.com/en-us/azure/quantum/user-guide/language/typesystem/typeparameterizations) | 0.25 | Type-parameterized operations/functions; generic type system feature, no product-specific limits or configs. | | [Type system in Q#](https://learn.microsoft.com/en-us/azure/quantum/user-guide/language/typesystem/) | 0.25 | Overview of Q# data types and type system; conceptual, no configuration tables or limits. | -| [Backend simulators](https://learn.microsoft.com/en-us/azure/quantum/backend-simulators) | 0.20 | Overview of backend simulators from providers; appears descriptive without detailed configuration tables or limits. | | [Binding scopes](https://learn.microsoft.com/en-us/azure/quantum/user-guide/language/statements/bindingscopes) | 0.20 | Describes variable scope rules in Q#; standard language semantics without product-specific operational details. | | [Bitwise expressions](https://learn.microsoft.com/en-us/azure/quantum/user-guide/language/expressions/bitwiseexpressions) | 0.20 | Bitwise operators and shifts; language syntax, no product-specific constraints. | | [Call expressions](https://learn.microsoft.com/en-us/azure/quantum/user-guide/language/expressions/callstatements) | 0.20 | Call expressions and rules for calling operations/functions; no product-specific operational details. | @@ -197,7 +195,9 @@ confusable_not_for: Not for Azure HPC Cache (use azure-hpc-cache), Azure Batch ( | [Expressions in Q#](https://learn.microsoft.com/en-us/azure/quantum/user-guide/language/expressions/) | 0.20 | General description of Q# expressions and operators; language reference, not product configuration or limits. | | [Grover's algorithm](https://learn.microsoft.com/en-us/azure/quantum/concepts-grovers) | 0.20 | Detailed theory of Grover’s algorithm; mathematical explanation rather than product-specific best practices, limits, or configuration details. 
| | [Grover's algorithm](https://learn.microsoft.com/en-us/azure/quantum/tutorial-qdk-grovers-search) | 0.20 | Grover's algorithm tutorial; algorithm implementation guide, not limits, configs, or troubleshooting reference. | +| [How to use the molecule visualizer](https://learn.microsoft.com/en-us/azure/quantum/how-to-use-molecule-visualizer) | 0.20 | The molecule visualizer article is primarily a how-to for installing, opening, and using a visualization tool in Jupyter. It’s a step-by-step usage/tutorial style page and doesn’t clearly indicate detailed configuration tables, limits, or product-specific error codes or decision matrices, so it doesn’t meet the expert-knowledge criteria for any sub-skill type. | | [Item access expressions](https://learn.microsoft.com/en-us/azure/quantum/user-guide/language/expressions/itemaccessexpressions) | 0.20 | Item access and slicing semantics; standard language reference. | +| [List of quantum computing providers](https://learn.microsoft.com/en-us/azure/quantum/qc-target-list) | 0.20 | Primarily a catalog/list of available Azure Quantum providers and targets with descriptions; no detailed limits, configuration parameters, error codes, or decision matrices are indicated in the summary. | | [Pauli measurements](https://learn.microsoft.com/en-us/azure/quantum/concepts-pauli-measurements) | 0.20 | Describes Pauli measurements conceptually and in Q# context, but appears as theory/usage overview without concrete config tables, limits, or error mappings. | | [Quantum Development Kit overview](https://learn.microsoft.com/en-us/azure/quantum/qdk-main-overview) | 0.20 | Landing page for QDK; high-level overview and navigation without deep technical specifics. | | [Quantum circuits conventions](https://learn.microsoft.com/en-us/azure/quantum/concepts-circuits) | 0.20 | Explains how to read quantum circuit diagrams and conventions; no product-specific configuration parameters, limits, or decision matrices. 
| diff --git a/products/azure-redhat-openshift/azure-redhat-openshift.csv b/products/azure-redhat-openshift/azure-redhat-openshift.csv index e216ec2f..063d4bd2 100644 --- a/products/azure-redhat-openshift/azure-redhat-openshift.csv +++ b/products/azure-redhat-openshift/azure-redhat-openshift.csv @@ -1,5 +1,5 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type -https://learn.microsoft.com/en-us/azure/openshift/azure-redhat-openshift-release-notes,What's new with Azure Red Hat OpenShift?,What's new with Azure Red Hat OpenShift? - Azure Red Hat OpenShift,,This article has release notes for Azure Red Hat OpenShift.,"Microsoft Azure Red Hat OpenShift receives improvements on an ongoing basis. To stay up to date with the most recent developments, this article provides you with information about the latest releases.",2026-02-25T08:00:00.000Z,whats-new,,0.2,False,"Release notes are mostly change logs; summary does not indicate detailed limits, configs, or troubleshooting mappings.",unchanged +https://learn.microsoft.com/en-us/azure/openshift/azure-redhat-openshift-release-notes,What's new with Azure Red Hat OpenShift?,What's new with Azure Red Hat OpenShift? - Azure Red Hat OpenShift,,This article has release notes for Azure Red Hat OpenShift.,"Microsoft Azure Red Hat OpenShift receives improvements on an ongoing basis. To stay up to date with the most recent developments, this article provides you with information about the latest releases.",2026-04-20T08:00:00.000Z,whats-new,,0.2,False,"Release notes summarize changes and new features but typically do not provide structured limits, configuration tables, troubleshooting mappings, or decision matrices as defined by the sub-skill types. 
They are more update/announcement oriented than reusable expert reference content.",updated https://learn.microsoft.com/en-us/azure/openshift/best-practices-openshift-virtualization,Best practices for virtual machine deployments,Best practices for virtual machine deployments on OpenShift Virtualization - Azure Red Hat OpenShift,Optimize VM deployments on OpenShift Virtualization in ARO,Learn about best practices for deploying virtual machines (VMs) using OpenShift Virtualization in Azure Red Hat OpenShift.,This document provides guidance for optimizing performance and cost efficiency when deploying virtual machines (VMs) using OpenShift Virtualization on Azure Red Hat OpenShift. This guidance also addresses any concerns around application performance and provides actionable steps for successful deployment.,2026-02-16T18:04:00.000Z,how-to,best-practices,0.85,True,Explicit best practices for performance and cost with VMs on OpenShift Virtualization; includes product-specific recommendations and tuning guidance.,unchanged https://learn.microsoft.com/en-us/azure/openshift/built-in-container-registry,Use the built-in container registry,Configure built-in container registry for Azure Red Hat OpenShift 4 - Azure Red Hat OpenShift,Configure built-in container registry on ARO 4,Configure built-in container registry for Azure Red Hat OpenShift 4,"Azure Red Hat OpenShift provides an integrated container image registry that adds the ability to automatically provision new image repositories on demand. This registry provides users with a built-in location for their application builds to push the resulting images. In this article, you'll configure the built-in container image registry for an Azure Red Hat OpenShift 4 cluster. 
You'll learn how to:",2025-08-07T22:06:00.000Z,concept-article,configuration,0.75,True,"Covers configuration of integrated image registry, including how repositories are provisioned and used; product-specific registry configuration.",unchanged https://learn.microsoft.com/en-us/azure/openshift/cluster-wide-proxy-configure,Configure cluster-wide proxy,Configure a cluster-wide proxy in an Azure Red Hat OpenShift cluster - Azure Red Hat OpenShift,Configure cluster-wide HTTP/HTTPS proxy in ARO,Discover how to configure a cluster-wide proxy in Azure Red Hat OpenShift.,"This article describes the process for enabling a cluster-wide proxy on an Azure Red Hat OpenShift cluster. This feature allows production environments to deny direct access to the internet and instead have an HTTP or HTTPS proxy available. This article details the specific configuration steps necessary for an Azure Red Hat OpenShift cluster. For more information about how the cluster-wide proxy feature works for the OpenShift Container Platform, see the Red Hat documentation. 
When configuring a ",2025-03-06T05:31:00.000Z,how-to,configuration,0.8,True,"Details proxy configuration for ARO clusters, including specific settings and behavior; product-specific network configuration.",unchanged @@ -62,6 +62,6 @@ https://learn.microsoft.com/en-us/azure/openshift/openshift-faq,Frequently asked https://learn.microsoft.com/en-us/azure/openshift/openshift-service-definitions,Azure Red Hat OpenShift service definition,Azure Red Hat OpenShift service definition - Azure Red Hat OpenShift,Understand Azure Red Hat OpenShift service definitions,Azure Red Hat OpenShift service definition,The following sections provide service definitions to help you manage your Azure Red Hat OpenShift account.,2025-11-25T08:00:00.000Z,concept-article,deployment,0.6,True,"Service definition docs typically include plan/tier behaviors, operational constraints, and deployment-related requirements that are product-specific and not generic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/openshift/quickstart-openshift-arm-bicep-template,Deploy an Azure Red Hat OpenShift cluster with an ARM template or Bicep,Quickstart: Deploy an Azure Red Hat OpenShift cluster with an ARM template or Bicep - Azure Red Hat OpenShift,Deploy ARO clusters using ARM or Bicep templates,"In this article, learn how to create an Azure Red Hat OpenShift cluster using an Azure Resource Manager template or a Bicep file.",This article describes how to use either Azure Resource Manager template (ARM template) or Bicep to create an Azure Red Hat OpenShift cluster. You can deploy the Azure Red Hat OpenShift cluster with either PowerShell or the Azure command-line interface (Azure CLI). An Azure Resource Manager template is a JavaScript Object Notation (JSON) file that defines the infrastructure and configuration for your project. The template uses declarative syntax. 
You describe your intended deployment without writi,2025-02-25T18:01:00.000Z,quickstart,deployment,0.65,True,"Template-based deployment article will include resource schema, parameters, and deployment-specific constraints unique to ARO.",unchanged https://learn.microsoft.com/en-us/azure/openshift/responsibility-matrix,Responsibility matrix,Azure Red Hat OpenShift Responsibility Assignment Matrix - Azure Red Hat OpenShift,Understand responsibility matrix for ARO operations,Learn about the ownership of responsibilities for the operation of an Azure Red Hat OpenShift cluster,"This document outlines the responsibilities of Microsoft, Red Hat, and customers for Azure Red Hat OpenShift clusters. For more information about Azure Red Hat OpenShift and its components, see the Azure Red Hat OpenShift Service Definition. While Microsoft and Red Hat manage the Azure Red Hat OpenShift service, the customer shares responsibility for the functionality of their cluster. While Azure Red Hat OpenShift clusters are hosted on Azure resources in customer Azure subscriptions, they are a",2025-02-25T18:01:00.000Z,concept-article,decision-making,0.65,True,Responsibility matrix defines who owns which operational tasks; this is decision guidance for roles and processes specific to the service.,unchanged -https://learn.microsoft.com/en-us/azure/openshift/support-lifecycle,Support lifecycle for Azure Red Hat OpenShift 4,Azure Red Hat OpenShift support lifecycle - Azure Red Hat OpenShift,Understand Azure Red Hat OpenShift support lifecycle and versions,Understand the support lifecycle and supported versions for Azure Red Hat OpenShift,"Red Hat releases minor versions of Red Hat OpenShift Container Platform (OCP) approximately every four months. These releases include new features and improvements. Patch releases are more frequent (typically weekly) and might include fixes for security vulnerabilities or bugs. Azure Red Hat OpenShift is built from specific releases of OCP. 
This article covers the versions of OCP that are supported for Azure Red Hat OpenShift and details about updates, deprecations, and the support policy.",2025-11-14T18:00:00.000Z,concept-article,limits-quotas,0.7,True,"Support lifecycle and supported OCP versions involve specific version numbers, timelines, and deprecation details that function as product-specific limits/constraints.",unchanged +https://learn.microsoft.com/en-us/azure/openshift/support-lifecycle,Support lifecycle for Azure Red Hat OpenShift 4,Azure Red Hat OpenShift support lifecycle - Azure Red Hat OpenShift,,Understand the support lifecycle and supported versions for Azure Red Hat OpenShift,"Red Hat releases minor versions of Red Hat OpenShift Container Platform (OCP) approximately every four months. These releases include new features and improvements. Patch releases are more frequent (typically weekly) and might include fixes for security vulnerabilities or bugs. Azure Red Hat OpenShift is built from specific releases of OCP. This article covers the versions of OCP that are supported for Azure Red Hat OpenShift and details about updates, deprecations, and the support policy.",2026-04-20T08:00:00.000Z,concept-article,,0.3,False,"Support lifecycle and version policy information is high-level policy/overview content. The summary does not indicate concrete numerical limits, configuration parameters, error codes, or decision matrices tied to technical choices. 
It describes release cadence and support policy conceptually, which an LLM is likely to know in general form, and does not fit any of the specified expert-knowledge sub-skill types.",updated https://learn.microsoft.com/en-us/azure/openshift/support-policies-v4,Support policies for Azure Red Hat OpenShift 4,Azure Red Hat OpenShift 4 cluster support policy - Azure Red Hat OpenShift,Follow Azure Red Hat OpenShift 4 support policies,Learn more about the support policy requirements for a Microsoft Azure Red Hat OpenShift 4 cluster.,"Certain configurations for Microsoft Azure Red Hat OpenShift 4 clusters can affect your cluster's supportability. Azure Red Hat OpenShift 4 allows cluster administrators to make changes to internal cluster components, but not all changes are supported. The support policy below shares which modifications violate the policy and void support from Microsoft and Red Hat. Note Features marked Technology Preview in OpenShift Container Platform aren't supported in Azure Red Hat OpenShift.",2026-03-17T17:14:00.000Z,concept-article,best-practices,0.7,True,"The page defines which specific configuration changes to internal Azure Red Hat OpenShift 4 components are supported vs unsupported, including explicit DO/DON'T guidance that directly affects cluster supportability. 
These are product-specific support and configuration rules that function as best-practice constraints and are not generic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/openshift/troubleshoot,Troubleshooting,Troubleshoot Azure Red Hat OpenShift - Azure Red Hat OpenShift,Troubleshoot common Azure Red Hat OpenShift cluster issues,Troubleshoot and resolve common issues with Azure Red Hat OpenShift,This article details some common issues encountered while creating or managing Microsoft Azure Red Hat OpenShift clusters.,2025-08-07T22:06:00.000Z,troubleshooting,troubleshooting,0.82,True,"Explicit troubleshooting article; typically includes ARO-specific error messages, causes, and resolutions that constitute expert diagnostic knowledge.",unchanged diff --git a/products/azure-redhat-openshift/report.md b/products/azure-redhat-openshift/report.md index b275bcbb..e79f9484 100644 --- a/products/azure-redhat-openshift/report.md +++ b/products/azure-redhat-openshift/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-05' +generated_at: '2026-04-26' category_descriptions: best-practices: Guidance on sizing and deploying ARO clusters and infra nodes, optimizing OpenShift Virtualization VMs, and understanding ARO 4 support limits and policies @@ -45,13 +45,13 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes - **Total Pages**: 66 - **Fetched**: 66 - **Fetch Failed**: 0 -- **Classified**: 53 -- **Unclassified**: 13 +- **Classified**: 52 +- **Unclassified**: 14 ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 0 -- **Unchanged**: 66 +- **Updated Pages**: 2 +- **Unchanged**: 64 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-redhat-openshift/azure-redhat-openshift.csv` @@ -64,13 +64,20 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes | decision-making | 1 | 1.5% | | deployment | 9 | 13.6% | | integrations | 5 | 7.6% | -| limits-quotas | 2 | 
3.0% | +| limits-quotas | 1 | 1.5% | | security | 13 | 19.7% | | troubleshooting | 4 | 6.1% | -| *(Unclassified)* | 13 | 19.7% | +| *(Unclassified)* | 14 | 21.2% | ## Changes +### Updated Pages + +- [Support lifecycle for Azure Red Hat OpenShift 4](https://learn.microsoft.com/en-us/azure/openshift/support-lifecycle) + - Updated: 2025-11-14T18:00:00.000Z → 2026-04-20T08:00:00.000Z +- [What's new with Azure Red Hat OpenShift?](https://learn.microsoft.com/en-us/azure/openshift/azure-redhat-openshift-release-notes) + - Updated: 2026-02-25T08:00:00.000Z → 2026-04-20T08:00:00.000Z + ## Classified Pages | TOC Title | Type | Confidence | Reason | @@ -110,7 +117,6 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes | [Migrate from OpenShift SDN to OVN-Kubernetes](https://learn.microsoft.com/en-us/azure/openshift/howto-sdn-to-ovn) | deployment | 0.70 | Describes migration path due to SDN deprecation, including version constraints and steps; deployment/migration decision and process. | | [Rotate service principal credentials](https://learn.microsoft.com/en-us/azure/openshift/howto-service-principal-credential-rotation) | security | 0.70 | Details rotation of Entra service principal credentials for ARO using Azure CLI; includes product-specific identity and permission handling steps. | | [Segregate worker nodes into subnets](https://learn.microsoft.com/en-us/azure/openshift/howto-segregate-machinesets) | configuration | 0.70 | Network-level configuration of worker machine sets into different private subnets with access control implications is product-specific configuration. | -| [Support lifecycle for Azure Red Hat OpenShift 4](https://learn.microsoft.com/en-us/azure/openshift/support-lifecycle) | limits-quotas | 0.70 | Support lifecycle and supported OCP versions involve specific version numbers, timelines, and deprecation details that function as product-specific limits/constraints. 
| | [Support policies for Azure Red Hat OpenShift 4](https://learn.microsoft.com/en-us/azure/openshift/support-policies-v4) | best-practices | 0.70 | The page defines which specific configuration changes to internal Azure Red Hat OpenShift 4 components are supported vs unsupported, including explicit DO/DON'T guidance that directly affects cluster supportability. These are product-specific support and configuration rules that function as best-practice constraints and are not generic knowledge. | | [Tag resources using Azure Policy](https://learn.microsoft.com/en-us/azure/openshift/howto-tag-resources) | configuration | 0.70 | Involves creating JSON policy definitions/assignments and remediation for ARO-managed resource groups, with specific parameters and behavior. | | [Use Admin Kubeconfig](https://learn.microsoft.com/en-us/azure/openshift/howto-kubeconfig) | troubleshooting | 0.70 | Explicitly for regaining access when console/ingress/auth components fail; maps specific failure scenarios to using Admin Kubeconfig. | @@ -140,9 +146,10 @@ confusable_not_for: Not for Azure Kubernetes Service (AKS) (use azure-kubernetes | [Create an Azure Red Hat OpenShift cluster](https://learn.microsoft.com/en-us/azure/openshift/create-cluster) | 0.30 | Quickstart for creating a cluster; summary does not show detailed config tables, limits, or security roles beyond generic how-to. | | [Deploy a JBoss EAP Java app](https://learn.microsoft.com/en-us/azure/openshift/howto-deploy-java-jboss-enterprise-application-platform-app) | 0.30 | Quickstart/tutorial for deploying JBoss EAP on Azure Red Hat OpenShift via the portal. It focuses on step-by-step setup using a Marketplace offer, not on limits, configuration matrices, error codes, or product-specific parameter tables. No clear expert-only limits, quotas, or specialized configuration references are indicated. 
| | [Set up OpenShift Virtualization](https://learn.microsoft.com/en-us/azure/openshift/howto-create-openshift-virtualization) | 0.30 | Appears to be a how-to/overview for using OpenShift Virtualization on Azure Red Hat OpenShift, focused on describing capabilities and basic usage. The summary does not indicate detailed configuration tables, limits, error codes, or product-specific decision matrices; it reads as conceptual and procedural rather than expert reference content. | +| [Support lifecycle for Azure Red Hat OpenShift 4](https://learn.microsoft.com/en-us/azure/openshift/support-lifecycle) | 0.30 | Support lifecycle and version policy information is high-level policy/overview content. The summary does not indicate concrete numerical limits, configuration parameters, error codes, or decision matrices tied to technical choices. It describes release cadence and support policy conceptually, which an LLM is likely to know in general form, and does not fit any of the specified expert-knowledge sub-skill types. | | [Upgrade a cluster with managed identities enabled](https://learn.microsoft.com/en-us/azure/openshift/howto-upgrade-aro-openshift-cluster) | 0.30 | This is an upgrade how-to for clusters with managed identities. It is likely a procedural tutorial (using web console or MUO) without configuration matrices, limits, or detailed diagnostic mappings. It describes lifecycle operations rather than expert-only configuration parameters or troubleshooting details. | | [Connect to an Azure Red Hat OpenShift cluster](https://learn.microsoft.com/en-us/azure/openshift/connect-cluster) | 0.20 | Basic connection instructions using kubeadmin; no indication of detailed configuration parameters or troubleshooting mappings. 
| | [Create cluster with managed identities](https://learn.microsoft.com/en-us/azure/openshift/howto-create-openshift-cluster) | 0.20 | Primarily a how-to deployment guide for creating an Azure Red Hat OpenShift cluster with managed identities using CLI/Portal/Bicep/ARM. It does not clearly indicate detailed configuration tables, limits, error-code-based troubleshooting, or product-specific decision matrices; it appears to be procedural tutorial content rather than expert reference material. | | [Delete an Azure Red Hat OpenShift cluster](https://learn.microsoft.com/en-us/azure/openshift/delete-cluster) | 0.20 | Quickstart for deleting a cluster; operational but not configuration/limits-focused and lacks expert-only details in summary. | -| [What's new with Azure Red Hat OpenShift?](https://learn.microsoft.com/en-us/azure/openshift/azure-redhat-openshift-release-notes) | 0.20 | Release notes are mostly change logs; summary does not indicate detailed limits, configs, or troubleshooting mappings. | +| [What's new with Azure Red Hat OpenShift?](https://learn.microsoft.com/en-us/azure/openshift/azure-redhat-openshift-release-notes) | 0.20 | Release notes summarize changes and new features but typically do not provide structured limits, configuration tables, troubleshooting mappings, or decision matrices as defined by the sub-skill types. They are more update/announcement oriented than reusable expert reference content. | | [About Azure Red Hat OpenShift](https://learn.microsoft.com/en-us/azure/openshift/intro-openshift) | 0.10 | High-level introduction and benefits overview without product-specific limits, configs, or detailed patterns. 
| diff --git a/products/azure-reliability/azure-reliability.csv b/products/azure-reliability/azure-reliability.csv index eba62c16..ca522e05 100644 --- a/products/azure-reliability/azure-reliability.csv +++ b/products/azure-reliability/azure-reliability.csv @@ -24,8 +24,8 @@ https://learn.microsoft.com/en-us/azure/reliability/reliability-aks,Azure Kubern https://learn.microsoft.com/en-us/azure/reliability/reliability-api-center,Azure API Center,Reliability in Azure API Center,Configure reliability for Azure API Center,Improve reliability in Azure API Center by using availability zones and zone redundancy. Read about disaster recovery and what to expect during an outage.,"This article describes reliability support in Azure API Center, including availability zones, zone redundancy, data residency, and what customers should expect during zone or regional outages. For a more detailed overview of reliability in Azure, see Azure reliability.",2026-01-22T18:34:00.000Z,reliability-article,best-practices,0.63,True,"Explains AZs, zone redundancy, data residency, and expected behavior during zone/region outages; concrete reliability expectations and configuration.",unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-api-management,Azure API Management,Reliability in Azure API Management,,"Learn about resiliency features in Azure API Management, including availability zones, multi-region deployments, transient fault handling, and service maintenance to achieve high availability and meet","Azure API Management is a fully managed service that helps organizations publish, secure, transform, maintain, and monitor APIs. As an Azure service, API Management provides a range of capabilities to support your reliability requirements. When you use Azure, reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery.
You're responsible for understanding how those capabilities work within all of the services you use, and selecting the capa",2026-01-22T18:34:00.000Z,reliability-article,,0.4,False,"Reliability in Azure API Management; appears conceptual (zones, multi-region, transient faults) without explicit numeric or config expert details in summary.",unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-api-management,Azure API Management,Reliability in Azure API Management,,"Learn about resiliency features in Azure API Management, including availability zones, multi-region deployments, transient fault handling, and service maintenance to achieve high availability and meet","Azure API Management is a fully managed service that helps organizations publish, secure, transform, maintain, and monitor APIs. As an Azure service, API Management provides a range of capabilities to support your reliability requirements. When you use Azure, reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery. You're responsible for understanding how those capabilities work within all of the services you use, and selecting the capa",2026-01-22T18:34:00.000Z,reliability-article,,0.4,False,"Reliability in Azure API Management; appears conceptual (zones, multi-region, transient faults) without explicit numeric or config expert details in summary.",unchanged -https://learn.microsoft.com/en-us/azure/reliability/reliability-app-configuration,Azure App Configuration,Reliability in Azure App Configuration,,"Learn how to make Azure App Configuration resilient to a variety of potential outages and problems, including transient faults, availability zone outages, region outages, and service maintenance, and ","Azure App Configuration centrally stores and manages application configuration settings and feature flags, which replaces configuration files embedded directly within applications.
With this approach, you can dynamically update configuration values, track version history, and maintain a record of configuration changes over time. The availability and reliability of App Configuration are important considerations because application behavior can directly depend on access to configuration data at run",2026-04-15T22:12:00.000Z,reliability-article,,0.2,False,"Reliability guidance for Azure App Configuration; summary indicates high-level resiliency concepts (transient faults, outages, backup/restore) but no explicit mention of numeric limits, configuration parameter tables, or detailed troubleshooting mappings.",updated -https://learn.microsoft.com/en-us/azure/reliability/reliability-app-configuration,Azure App Configuration,Reliability in Azure App Configuration,Build resilient configurations with Azure App Configuration,"Learn how to make Azure App Configuration resilient to a variety of potential outages and problems, including transient faults, availability zone outages, region outages, and service maintenance, and ","Azure App Configuration centrally stores and manages application configuration settings and feature flags, which replaces configuration files embedded directly within applications. With this approach, you can dynamically update configuration values, track version history, and maintain a record of configuration changes over time. The availability and reliability of App Configuration are important considerations because application behavior can directly depend on access to configuration data at run",2026-04-15T22:12:00.000Z,reliability-article,best-practices,0.75,True,"Reliability guidance for Azure App Configuration typically includes specific patterns for handling transient faults, region/zone outages, and backup/restore of configuration stores and feature flags.
These are actionable, product-specific recommendations (how to design access, caching, and failover), which matches best-practices.",updated +https://learn.microsoft.com/en-us/azure/reliability/reliability-app-configuration,Azure App Configuration,Reliability in Azure App Configuration,Build resilient configurations with Azure App Configuration,"Learn how to make Azure App Configuration resilient to a variety of potential outages and problems, including transient faults, availability zone outages, region outages, and service maintenance, and ","Azure App Configuration centrally stores and manages application configuration settings and feature flags, which replaces configuration files embedded directly within applications. With this approach, you can dynamically update configuration values, track version history, and maintain a record of configuration changes over time. The availability and reliability of App Configuration are important considerations because application behavior can directly depend on access to configuration data at run",2026-04-15T22:12:00.000Z,reliability-article,best-practices,0.75,True,"Reliability guidance for Azure App Configuration typically includes specific patterns for handling transient faults, region/zone outages, and backup/restore of configuration stores and feature flags.
These are actionable, product-specific recommendations (how to design access, caching, and failover), which matches best-practices.",unchanged +https://learn.microsoft.com/en-us/azure/reliability/reliability-app-configuration,Azure App Configuration,Reliability in Azure App Configuration,Build resilient configurations with Azure App Configuration,"Learn how to make Azure App Configuration resilient to a variety of potential outages and problems, including transient faults, availability zone outages, region outages, and service maintenance, and ","Azure App Configuration centrally stores and manages application configuration settings and feature flags, which replaces configuration files embedded directly within applications. With this approach, you can dynamically update configuration values, track version history, and maintain a record of configuration changes over time. The availability and reliability of App Configuration are important considerations because application behavior can directly depend on access to configuration data at run",2026-04-15T22:12:00.000Z,reliability-article,best-practices,0.75,True,"Reliability guidance for Azure App Configuration typically includes specific patterns for handling transient faults, region/zone outages, and backup/restore of configuration stores and feature flags. These are actionable, product-specific recommendations (how to design access, caching, and failover), which matches best-practices.",unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-app-gateway-containers,Azure Application Gateway for Containers,Reliability in Azure Application Gateway for Containers,,Learn how to improve reliability in Azure Application Gateway for Containers by using availability zones and zone redundancy for more resilient performance.,"This article describes reliability and availability zone support in Azure Application Gateway for Containers.
Learn how to configure zone redundancy to increase resilience and availability for container workloads that have highly available deployment configurations. For a more detailed overview of reliability in Azure, see Azure reliability.",2026-01-22T23:17:00.000Z,reliability-article,,0.3,False,"Reliability/zone redundancy overview for Application Gateway for Containers; summary suggests conceptual guidance without concrete limits, configs, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-app-service,Azure App Service,Reliability in Azure App Service,,"Learn how to make Azure App Service resilient to a variety of potential outages and problems, including transient faults, availability zone outages, region outages, and service maintenance.","Azure App Service is an HTTP-based service for hosting web applications, REST APIs, and mobile back ends. App Service integrates with Microsoft Azure to provide security, load balancing, autoscaling, and automated management for applications. As an Azure service, App Service provides a range of capabilities to support your reliability requirements. When you use Azure, reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery. You're respo",2026-01-22T23:17:00.000Z,reliability-article,,0.4,False,"Reliability in Azure App Service; high-level resiliency and maintenance guidance, not clearly exposing numeric limits or config matrices.",unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-app-service,Azure App Service,Reliability in Azure App Service,,"Learn how to make Azure App Service resilient to a variety of potential outages and problems, including transient faults, availability zone outages, region outages, and service maintenance.","Azure App Service is an HTTP-based service for hosting web applications, REST APIs, and mobile back ends.
App Service integrates with Microsoft Azure to provide security, load balancing, autoscaling, and automated management for applications. As an Azure service, App Service provides a range of capabilities to support your reliability requirements. When you use Azure, reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery. You're respo",2026-01-22T23:17:00.000Z,reliability-article,,0.4,False,"Reliability in Azure App Service; high-level resiliency and maintenance guidance, not clearly exposing numeric limits or config matrices.",unchanged @@ -33,13 +33,13 @@ https://learn.microsoft.com/en-us/azure/reliability/reliability-app-service-envi https://learn.microsoft.com/en-us/azure/reliability/reliability-application-gateway-v2,Azure Application Gateway,Reliability in Azure Application Gateway v2,,"Learn how to make Azure Application Gateway v2 resilient to transient faults, availability zone outages, and region outages through availability zones and multi-region deployment patterns.","Azure Application Gateway is a web traffic load balancer that you can use to manage traffic to your web applications. It provides advanced features like autoscaling, zone redundancy, static virtual IP (VIP) addresses, and web application firewall (WAF) integration to deliver highly available and secure application delivery services. When you use Azure, reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery.
You're responsible for under",2026-01-22T18:34:00.000Z,reliability-article,,0.4,False,Reliability in Application Gateway v2; summary is conceptual about zones and multi-region patterns without explicit numeric or config details.,unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-application-gateway-v2,Azure Application Gateway v2,Reliability in Azure Application Gateway v2,Architect highly available Azure Application Gateway v2,"Learn how to make Azure Application Gateway v2 resilient to transient faults, availability zone outages, and region outages through availability zones and multi-region deployment patterns.","Azure Application Gateway is a web traffic load balancer that you can use to manage traffic to your web applications. It provides advanced features like autoscaling, zone redundancy, static virtual IP (VIP) addresses, and web application firewall (WAF) integration to deliver highly available and secure application delivery services. When you use Azure, reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery. You're responsible for under",2026-01-22T18:34:00.000Z,reliability-article,best-practices,0.66,True,Explains how to use AZs and multi-region deployment patterns for Application Gateway v2 to handle transient faults and outages; product-specific reliability design.,unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-azure-storage-mover,Azure Storage Mover,Reliability in Azure Storage Mover,,Enhance reliability for Azure Storage Mover by using zone redundancy and availability zones. Learn about instance metadata protection and agent configuration.,"This article describes reliability support in Azure Storage Mover and covers both intra-regional resiliency with availability zones and cross-region disaster recovery and business continuity.
For a more detailed overview of reliability principles in Azure, see Azure reliability.",2026-01-22T18:34:00.000Z,reliability-article,,0.3,False,"Reliability in Azure Storage Mover; high-level resiliency and DR description, no clear evidence of numeric or config expert details.",unchanged -https://learn.microsoft.com/en-us/azure/reliability/reliability-backup,Azure Backup,Reliability in Azure Backup,Design resilient backup strategies with Azure Backup,"Learn how to make Azure Backup resilient to a variety of potential outages and problems, including transient faults, availability zone outages, and region outages.","Azure Backup is a built-in Azure service that securely protects cloud and on-premises workloads. Backup can scale its protection across multiple workloads and provides native integration with Azure workloads, including virtual machines (VMs), SAP HANA in Azure VMs, SQL in Azure VMs, Azure Files, Azure Blob Storage, Azure Data Lake Storage, Azure managed disks, Azure Elastic SAN volumes, and Azure Kubernetes Service (AKS). You don't need to manage automation or infrastructure, write scripts, or pr",2026-04-14T17:08:00.000Z,reliability-article,best-practices,0.7,True,"Reliability guidance for Azure Backup typically includes product-specific resiliency recommendations (for example, how to configure vaults, regions, and restore options for different outage types). This goes beyond conceptual reliability and provides concrete, Azure-Backup-specific patterns and DO/DON'T guidance, which fits best-practices.",updated -https://learn.microsoft.com/en-us/azure/reliability/reliability-backup,Azure Backup,Reliability in Azure Backup,Design resilient backup strategies with Azure Backup,"Learn how to make Azure Backup resilient to a variety of potential outages and problems, including transient faults, availability zone outages, and region outages.","Azure Backup is a built-in Azure service that securely protects cloud and on-premises workloads.
Backup can scale its protection across multiple workloads and provides native integration with Azure workloads, including virtual machines (VMs), SAP HANA in Azure VMs, SQL in Azure VMs, Azure Files, Azure Blob Storage, Azure Data Lake Storage, Azure managed disks, Azure Elastic SAN volumes, and Azure Kubernetes Service (AKS). You don't need to manage automation or infrastructure, write scripts, or pr",2026-04-14T17:08:00.000Z,reliability-article,best-practices,0.68,True,"The page focuses on making Azure Backup resilient to specific outage types (transient faults, zone and region outages) and provides product-specific reliability guidance and patterns. This is actionable, scenario-based advice tailored to Azure Backup rather than generic reliability theory, fitting best under best-practices. It does not primarily present numeric limits, decision matrices, or configuration parameter tables.",updated +https://learn.microsoft.com/en-us/azure/reliability/reliability-backup,Azure Backup,Reliability in Azure Backup,Design resilient backup strategies with Azure Backup,"Learn how to make Azure Backup resilient to a variety of potential outages and problems, including transient faults, availability zone outages, and region outages.","Azure Backup is a built-in Azure service that securely protects cloud and on-premises workloads. Backup can scale its protection across multiple workloads and provides native integration with Azure workloads, including virtual machines (VMs), SAP HANA in Azure VMs, SQL in Azure VMs, Azure Files, Azure Blob Storage, Azure Data Lake Storage, Azure managed disks, Azure Elastic SAN volumes, and Azure Kubernetes Service (AKS).
You don't need to manage automation or infrastructure, write scripts, or pr",2026-04-14T17:08:00.000Z,reliability-article,best-practices,0.68,True,"The page focuses on making Azure Backup resilient to specific outage types (transient faults, zone and region outages) and provides product-specific reliability guidance and patterns. This is actionable, scenario-based advice tailored to Azure Backup rather than generic reliability theory, fitting best under best-practices. It does not primarily present numeric limits, decision matrices, or configuration parameter tables.",unchanged +https://learn.microsoft.com/en-us/azure/reliability/reliability-backup,Azure Backup,Reliability in Azure Backup,Design resilient backup strategies with Azure Backup,"Learn how to make Azure Backup resilient to a variety of potential outages and problems, including transient faults, availability zone outages, and region outages.","Azure Backup is a built-in Azure service that securely protects cloud and on-premises workloads. Backup can scale its protection across multiple workloads and provides native integration with Azure workloads, including virtual machines (VMs), SAP HANA in Azure VMs, SQL in Azure VMs, Azure Files, Azure Blob Storage, Azure Data Lake Storage, Azure managed disks, Azure Elastic SAN volumes, and Azure Kubernetes Service (AKS). You don't need to manage automation or infrastructure, write scripts, or pr",2026-04-14T17:08:00.000Z,reliability-article,best-practices,0.68,True,"The page focuses on making Azure Backup resilient to specific outage types (transient faults, zone and region outages) and provides product-specific reliability guidance and patterns. This is actionable, scenario-based advice tailored to Azure Backup rather than generic reliability theory, fitting best under best-practices.
It does not primarily present numeric limits, decision matrices, or configuration parameter tables.",unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-bastion,Azure Bastion,Reliability in Azure Bastion,,"Learn about resiliency in Azure Bastion, including resilience to transient faults, availability zone failures, and region failures.","Azure Bastion is a fully managed platform as a service (PaaS) that you provision to provide high-security connections to virtual machines via a private IP address. It provides seamless RDP/SSH connectivity to your virtual machines directly over TLS from the Azure portal, or via the native SSH or RDP client that's already installed on your local computer. When you connect via Azure Bastion, your virtual machines don't need a public IP address, an agent, or special client software. When you use Azu",2026-01-22T23:17:00.000Z,reliability-article,,0.3,False,Reliability overview for Azure Bastion; likely high-level resiliency description without detailed configuration tables or numeric thresholds.,unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-batch,Azure Batch,Reliability in Azure Batch,,"Improve reliability in Azure Batch by using availability zones, zone redundancy, and disaster recovery strategies. Design more resilient batch processing workloads.","This article describes reliability support in Azure Batch. It covers how to improve intra-regional resiliency by using availability zones, batch pools, and compute nodes to minimize downtime and data loss.
It also links to information about cross-region recovery and business continuity.",2026-01-22T23:17:00.000Z,reliability-article,,0.4,False,"Reliability in Azure Batch; focuses on zones and DR conceptually, no clear numeric thresholds or config tables in summary.",unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-batch,Azure Batch,Reliability in Azure Batch,,"Improve reliability in Azure Batch by using availability zones, zone redundancy, and disaster recovery strategies. Design more resilient batch processing workloads.","This article describes reliability support in Azure Batch. It covers how to improve intra-regional resiliency by using availability zones, batch pools, and compute nodes to minimize downtime and data loss. It also links to information about cross-region recovery and business continuity.",2026-01-22T23:17:00.000Z,reliability-article,,0.4,False,"Reliability in Azure Batch; focuses on zones and DR conceptually, no clear numeric thresholds or config tables in summary.",unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-bot,Azure Bot Service,Reliability in Azure Bot Service,Plan reliability for Azure Bot Service,"Learn how Azure Bot Service provides reliability through availability zones, zone redundancy, and disaster recovery for regional bots with local data residency.","This article describes reliability support in Azure Bot Service. It covers both regional reliability with availability zones and cross-region resiliency with disaster recovery for bots with local data residency. For a more detailed overview of reliability in Azure, see Azure reliability. When you create an application (bot) in Bot Service, you can choose global or local data residency.
Local data residency ensures that your bot's personal data is preserved, stored, and processed within specific g",2026-01-22T23:17:00.000Z,reliability-article,best-practices,0.62,True,"Describes concrete patterns for regional vs cross-region reliability, availability zones, and disaster recovery for bots with local data residency; product-specific resiliency guidance.",unchanged -https://learn.microsoft.com/en-us/azure/reliability/reliability-chaos-studio,Azure Chaos Studio,Reliability in Azure Chaos Studio,,Learn how to improve reliability in Azure Chaos Studio by using availability zones and zone redundancy. Understand disaster recovery and zone outage experiences.,"This article describes reliability support in Azure Chaos Studio. It covers how to configure availability zones and what to expect during a zone-wide outage. For a more detailed overview of reliability in Azure, see Azure reliability.",2026-04-14T22:12:00.000Z,reliability-article,,0.2,False,"Reliability overview for Azure Chaos Studio; description suggests conceptual guidance on zones and outage behavior without clear indication of numeric limits, config tables, or error-code-based troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/reliability/reliability-chaos-studio,Azure Chaos Studio,Reliability in Azure Chaos Studio,,Learn how to improve reliability in Azure Chaos Studio by using availability zones and zone redundancy. Understand disaster recovery and zone outage experiences.,"This article describes reliability support in Azure Chaos Studio. It covers how to configure availability zones and what to expect during a zone-wide outage.
For a more detailed overview of reliability in Azure, see Azure reliability.",2026-04-14T22:12:00.000Z,reliability-article,,0.2,False,"Reliability overview for Azure Chaos Studio; description suggests conceptual guidance on zones and outage behavior without clear indication of numeric limits, config tables, or error-code-based troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-container-apps,Azure Container Apps,Reliability in Azure Container Apps,,"Learn how to make Azure Container Apps resilient to a variety of potential outages and problems, including transient faults, availability zone outages, region outages, and service maintenance.","Azure Container Apps is a fully managed, serverless container hosting service for deploying microservices and containerized applications. When you use Azure, reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery. You're responsible for understanding how those capabilities work within all of the services you use, and selecting the capabilities you need to meet your business objectives and uptime goals. This article describes how to mak",2026-01-22T18:34:00.000Z,reliability-article,,0.4,False,"Reliability in Azure Container Apps; shared-responsibility framing and resiliency overview, not detailed limits or configs.",unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-container-apps,Azure Container Apps,Reliability in Azure Container Apps,,"Learn how to make Azure Container Apps resilient to a variety of potential outages and problems, including transient faults, availability zone outages, region outages, and service maintenance.","Azure Container Apps is a fully managed, serverless container hosting service for deploying microservices and containerized applications. When you use Azure, reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery.
You're responsible for understanding how those capabilities work within all of the services you use, and selecting the capabilities you need to meet your business objectives and uptime goals. This article describes how to mak",2026-01-22T18:34:00.000Z,reliability-article,,0.4,False,"Reliability in Azure Container Apps; shared-responsibility framing and resiliency overview, not detailed limits or configs.",unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-container-instances,Azure Container Instances,Reliability in Azure Container Instances,,"Find out about resiliency in Azure Container Instances, including transient faults, availability zones, multi-region support, and backups.","This article describes reliability support in Azure Container Instances, which provides a straightforward way to run Linux or Windows containers in Azure, without the need to manage virtual machines (VMs) or adopt a more complex, higher-level service. When you use Azure, reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery.
You're responsible for understanding how those capabilities work within all of the services you use, and select",2026-01-22T18:34:00.000Z,reliability-article,,0.4,False,"Reliability in Azure Container Instances; summary is conceptual about zones, multi-region, and backups without explicit numeric or config expert details.",unchanged @@ -53,7 +53,7 @@ https://learn.microsoft.com/en-us/azure/reliability/reliability-database-mysql,A https://learn.microsoft.com/en-us/azure/reliability/reliability-database-mysql,Azure Database for MySQL,Reliability in Azure Database for MySQL,Design resilient Azure Database for MySQL deployments,"Learn how to make Azure Database for MySQL resilient to a variety of potential outages and problems, including transient faults, availability zone outages, region outages, and service maintenance, and","Azure Database for MySQL is a fully managed database service designed to give you granular control and flexibility over database management functions and configuration settings. The service provides high availability and disaster recovery capabilities based on your requirements. When you use Azure, reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery. You're responsible for understanding how those capabilities work within all of the ",2026-04-07T22:13:00.000Z,reliability-article,best-practices,0.68,True,"The page focuses on making Azure Database for MySQL resilient to specific outage types (transient faults, AZ/region outages, service maintenance) and backup/restore behavior.
It contains product-specific resiliency recommendations and patterns (for example, how to configure high availability, failover behavior, and backup strategies unique to this service), which qualify as concrete best practices rather than generic reliability concepts.",unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-database-postgresql,Azure Database for PostgreSQL,Reliability in Azure Database for PostgreSQL,Implement high availability for Azure Database for PostgreSQL,"Learn how to make Azure Database for PostgreSQL resilient to a variety of potential outages and problems, including transient faults, availability zone outages, region outages, and service maintenance","Azure Database for PostgreSQL is a fully managed database service designed to give you granular control and flexibility over database management functions and configuration settings. The service provides high availability and disaster recovery capabilities based on your requirements. When you use Azure, reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery. You're responsible for understanding how those capabilities work within all of",2026-03-24T11:03:00.000Z,reliability-article,best-practices,0.7,True,"Reliability content for Azure Database for PostgreSQL generally contains concrete, product-specific HA/DR recommendations (for example, how to configure replicas, backups, and failover behavior for different outage types).
This is actionable guidance tied to this service rather than generic theory, fitting best-practices rather than limits or configuration references.",unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-databricks,Azure Databricks,Reliability in Azure Databricks,Implement resilient architectures in Azure Databricks,"Learn about resiliency features in Azure Databricks, including transient fault handling and availability zone support.","Azure Databricksis a collaborative Apache Spark-based data and AI platform optimized for Microsoft Azure. It provides a unified environment for big data and AI workloads and combines the best of Databricks and Azure to simplify data engineering, data science, and machine learning. When you use Azure,reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery. You're responsible for understanding how those capabilities work within all of t",2026-01-22T23:17:00.000Z,reliability-article,best-practices,0.62,True,"Service-specific resiliency features and how to use them (transient fault handling, AZ support) in Databricks workloads; actionable reliability design guidance.",unchanged -https://learn.microsoft.com/en-us/azure/reliability/reliability-ddos-protection,Azure DDoS Protection,Reliability in Azure DDoS Protection,,"Learn how Azure DDoS Protection is resilient to a variety of potential outages and problems, including transient faults, availability zone outages, and region outages.","Azure DDoS Protectionis a foundational Azure networking capability that helps protect applications from distributed denial-of-service (DDoS) attacks. DDoS attacks attempt to overwhelm applications with traffic in order to deny service to legitimate users. Azure DDoS Protection helps safeguard applications by monitoring network traffic patterns and automatically mitigating abnormal traffic that could impact availability. When you use Azure,reliability is a shared responsibility. 
Microsoft provide",2026-04-06T22:09:00.000Z,reliability-article,,0.2,False,"Reliability overview for Azure DDoS Protection; appears conceptual/shared-responsibility guidance without specific limits, error codes, or configuration tables.",unchanged +https://learn.microsoft.com/en-us/azure/reliability/reliability-ddos-protection,Azure DDoS Protection,Reliability in Azure DDoS Protection,Harden application reliability with Azure DDoS Protection,"Learn how Azure DDoS Protection is resilient to a variety of potential outages and problems, including transient faults, availability zone outages, and region outages.","Azure DDoS Protectionis a foundational Azure networking capability that helps protect applications from distributed denial-of-service (DDoS) attacks. DDoS attacks attempt to overwhelm applications with traffic in order to deny service to legitimate users. Azure DDoS Protection helps safeguard applications by monitoring network traffic patterns and automatically mitigating abnormal traffic that could impact availability. When you use Azure,reliability is a shared responsibility. Microsoft provide",2026-04-07T08:00:00.000Z,reliability-article,best-practices,0.6,True,"The page focuses on how Azure DDoS Protection remains reliable during outages and how customers should use it as part of shared-responsibility reliability. Such content usually includes concrete, product-specific recommendations (for example, how to configure protection plans, design for zone/region failures, and avoid common pitfalls). 
This aligns with best-practices more than other categories; the summary does not indicate numeric limits or configuration tables.",updated https://learn.microsoft.com/en-us/azure/reliability/reliability-device-registry,Azure Device Registry,Reliability in Azure Device Registry,Ensure reliability for Azure Device Registry metadata,"Learn about resiliency in Azure Device Registry, including resilience to transient faults, availability zone failures, and region-wide failures.","Azure Device Registrystores information about assets and devices in the cloud. Device Registry projects assets as Azure resources in the cloud within a single registry. The single registry is a source of truth for device and asset metadata, and asset management capabilities. Device Registry can be used in conjunction withAzure IoT Operations. When you use Azure,reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery. You're responsibl",2026-01-22T23:17:00.000Z,reliability-article,best-practices,0.62,True,"Guidance on handling transient faults, AZ failures, and regional failures for Device Registry; service-specific reliability considerations.",unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-dns,Azure DNS,Reliability in Azure DNS,,Learn how to implement Azure DNS failover for reliable disaster recovery by using cross-region backup sites and automated DNS switching strategies.,This article contains detailed information oncross-region disaster recovery and business continuitysupport for Azure DNS.,2026-01-22T23:17:00.000Z,reliability-article,,0.4,False,Azure DNS failover and DR patterns; appears architectural but summary doesn’t indicate quantified thresholds or decision matrices.,unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-documentdb,Azure DocumentDB,Reliability in Azure DocumentDB,Design high availability for Azure DocumentDB,"Learn how to achieve high availability with Azure 
DocumentDB by using availability zones, replicas, and disaster recovery.","This article contains detailed information on regional resiliency withavailability zonesandcross-region disaster recovery and business continuityfor Azure DocumentDB. For an architectural overview of reliability in Azure, seeAzure reliability.",2026-01-22T23:17:00.000Z,reliability-article,best-practices,0.64,True,"Service-specific use of AZs, replicas, and DR/BCDR for DocumentDB; concrete reliability patterns beyond generic database concepts.",unchanged @@ -68,21 +68,22 @@ https://learn.microsoft.com/en-us/azure/reliability/reliability-functions,Azure https://learn.microsoft.com/en-us/azure/reliability/reliability-hdinsight,Azure HDInsight,Reliability in Azure HDInsight,,"Learn how to build resilient Hadoop and Spark clusters in Azure HDInsight by using availability zones, ARM templates, and disaster recovery strategies.","This article describes reliability support inAzure HDInsight, and coversavailability zonesandcross-region recovery and business continuity. For a more detailed overview of reliability in Azure, seeAzure reliability.",2026-01-22T23:17:00.000Z,reliability-article,,0.3,False,"Reliability in HDInsight with AZs and cross-region recovery; summary indicates high-level guidance rather than specific limits, configs, or error-code troubleshooting.",unchanged
https://learn.microsoft.com/en-us/azure/reliability/reliability-image-builder,Azure Virtual Machine Image Builder,Reliability in Azure Image Builder,Implement disaster recovery for Azure Image Builder,Learn how to ensure VM image template resilience in Azure VM Image Builder by using multi-region replication and Azure Resource Graph for recovery.,This article containscross-region disaster recovery and business continuity. Azure Image Builder (AIB) is a regional service with a cluster that serves single regions. The AIB regional setup keeps data and resources within the regional boundary. AIB as a service doesn't do fail over for cluster and SQL database in region down scenarios. Note Azure Image Builder doesn't supportavailability zones.,2026-01-22T23:17:00.000Z,reliability-article,best-practices,0.64,True,"Explains regional behavior, lack of AZ support, and how to use multi-region replication and Azure Resource Graph for recovery; nuanced, product-specific DR guidance.",unchanged -https://learn.microsoft.com/en-us/azure/reliability/reliability-iot-hub,Azure IoT Hub,Reliability in Azure IoT Hub,,"Learn about resiliency in Azure IoT Hub, including resilience to transient faults, availability zone failures, and region-wide failures. Understand backup options and SLA details.","Azure IoT Hubis a managed service hosted in the cloud that acts as a central message hub for communication between an IoT application and its attached devices. When you use Azure,reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery.
You're responsible for understanding how those capabilities work within all of the services you use, and selecting the capabilities you need to meet your business objectives and uptime goals. This artic",2026-04-16T22:08:00.000Z,reliability-article,,0.2,False,"Reliability overview for Azure IoT Hub; focuses on shared responsibility and resiliency concepts. The description mentions SLA details but not specific numeric SLA values, limits, or configuration/tier matrices that would qualify as expert knowledge under the defined categories.",updated +https://learn.microsoft.com/en-us/azure/reliability/reliability-iot-hub,Azure IoT Hub,Reliability in Azure IoT Hub,,"Learn about resiliency in Azure IoT Hub, including resilience to transient faults, availability zone failures, and region-wide failures. Understand backup options and SLA details.","Azure IoT Hubis a managed service hosted in the cloud that acts as a central message hub for communication between an IoT application and its attached devices. When you use Azure,reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery. You're responsible for understanding how those capabilities work within all of the services you use, and selecting the capabilities you need to meet your business objectives and uptime goals. This artic",2026-04-16T22:08:00.000Z,reliability-article,,0.2,False,"Reliability overview for Azure IoT Hub; focuses on shared responsibility and resiliency concepts. The description mentions SLA details but not specific numeric SLA values, limits, or configuration/tier matrices that would qualify as expert knowledge under the defined categories.",unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-key-vault,Azure Key Vault,Reliability in Azure Key Vault,,"Learn about resiliency in Azure Key Vault, including resilience to transient faults, availability zone failures, and region-wide failures. 
Understand backup and restore options, recovery features, and","Azure Key Vaultis a cloud service that provides a secure store for secrets, such as keys, passwords, certificates, and other sensitive information. Key Vault provides a range of built-in reliability features to help ensure that your secrets remain available. When you use Azure,reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery. You're responsible for understanding how those capabilities work within all of the services you use, an",2026-01-22T18:34:00.000Z,reliability-article,,0.4,False,"Reliability in Azure Key Vault; summary mentions backup/restore and SLA but not specific limits, configs, or error codes.",unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-load-balancer,Azure Load Balancer,Reliability in Azure Load Balancer,Design resilient architectures with Azure Load Balancer,"Learn how to make Azure Load Balancer resilient to a variety of potential outages and problems, including transient faults, availability zone outages, and region outages.","Azure Load Balancer is a layer-4 load-balancing service for Transmission Control Protocol (TCP) and User Datagram Protocol (UDP) traffic that distributes incoming requests among healthy instances of your services. Load Balancer provides high availability and ultra-low-latency network performance. When you use Azure,reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery. You're responsible for understanding how those capabilities work",2026-03-20T08:00:00.000Z,reliability-article,best-practices,0.65,True,"The page focuses on making Azure Load Balancer resilient to specific outage scenarios (transient faults, zone and region outages) and provides product-specific resiliency guidance and patterns. 
While not about numeric limits, it contains concrete, Azure-Load-Balancer-specific recommendations on how to architect for reliability, which fits best under best-practices.",unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-logic-apps,Azure Logic Apps,Reliability in Azure Logic Apps,Design resilient workflows with Azure Logic Apps,"Learn to make Azure Logic Apps resilient to potential outages and problems, including transient faults, availability zone outages, and region outages.","Azure Logic Appshelps you more easily integrate and orchestrate data between apps, cloud services, and on-premises systems by reducing how much code that you have to write.
When you use Azure,reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery. You're responsible for understanding how those capabilities work within all of the services you use, and selecting the capabilities you need to meet your business objectives and uptime goal",2026-01-22T18:34:00.000Z,reliability-article,best-practices,0.64,True,"Service-specific strategies for transient faults, AZ outages, and region outages in Logic Apps; concrete reliability practices.",unchanged -https://learn.microsoft.com/en-us/azure/reliability/reliability-managed-grafana,Azure Managed Grafana,Reliability in Azure Managed Grafana,Improve reliability of Azure Managed Grafana workspaces,"Learn how to make Azure Managed Grafana resilient to a variety of potential outages and problems, including transient faults, availability zone outages, region outages, and service maintenance, and le","Azure Managed Grafanaprovides hosted Grafana workspaces for building dashboards and visualizations. Microsoft manages all underlying infrastructure, including compute, networking, storage, and service updates. When you use Azure,reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery. You're responsible for understanding how those capabilities work within all of the services you use, and selecting the capabilities you need to meet you",2026-04-15T17:08:00.000Z,reliability-article,best-practices,0.7,True,"Reliability content for Azure Managed Grafana generally includes concrete guidance on handling zone/region outages, maintenance, and backup/restore for this specific managed service. 
These are product-specific DO/DON'T recommendations and patterns, so it fits best-practices rather than generic concepts.",updated +https://learn.microsoft.com/en-us/azure/reliability/reliability-managed-grafana,Azure Managed Grafana,Reliability in Azure Managed Grafana,Improve reliability of Azure Managed Grafana workspaces,"Learn how to make Azure Managed Grafana resilient to a variety of potential outages and problems, including transient faults, availability zone outages, region outages, and service maintenance, and le","Azure Managed Grafanaprovides hosted Grafana workspaces for building dashboards and visualizations. Microsoft manages all underlying infrastructure, including compute, networking, storage, and service updates. When you use Azure,reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery. You're responsible for understanding how those capabilities work within all of the services you use, and selecting the capabilities you need to meet you",2026-04-15T17:08:00.000Z,reliability-article,best-practices,0.7,True,"Reliability content for Azure Managed Grafana generally includes concrete guidance on handling zone/region outages, maintenance, and backup/restore for this specific managed service. These are product-specific DO/DON'T recommendations and patterns, so it fits best-practices rather than generic concepts.",unchanged +https://learn.microsoft.com/en-us/azure/reliability/reliability-managed-hsm,Azure Key Vault Managed HSM,Reliability in Azure Key Vault Managed HSM,Ensure resilient Azure Key Vault Managed HSM deployments,"Learn how to make Azure Key Vault Managed HSM resilient to a variety of potential outages and problems, including transient faults, hardware failures, region outages, and service maintenance. 
Understa","Azure Key Vault Managed HSMis a fully managed, highly available, single-tenant, standards-compliant cloud service that enables you to safeguard cryptographic keys for your cloud applications by using FIPS 140-3 Level 3 validated hardware security modules (HSMs). Managed HSM provides a range of built-in reliability features to help ensure that your keys remain available. When you use Azure,reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and",2026-04-22T11:03:00.000Z,reliability-article,best-practices,0.7,True,"The page covers reliability, backup/restore, recovery features, and SLA details for Managed HSM. These topics generally include concrete, product-specific guidance on how to configure HSM pools, backup/restore patterns, and failover behavior to meet SLAs and handle region outages. That is actionable, service-specific reliability guidance, fitting best-practices. The summary doesn’t clearly show numeric limits or detailed config tables, so other categories are less likely.",new https://learn.microsoft.com/en-us/azure/reliability/reliability-managed-redis,Azure Managed Redis,Reliability in Azure Managed Redis,Increase reliability of Azure Managed Redis caches,"Learn how to make Azure Managed Redis resilient to a variety of potential outages and problems, including transient faults, availability zone outages, region outages, and service maintenance, and lear","Azure Managed Redisis a fully managed Azure service based on Redis Enterprise. It provides high-performance, in-memory data storage for applications and is designed for enterprise workloads that require ultra-low latency, high throughput, and advanced data structures. When you use Azure,reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery. 
You're responsible for understanding how those capabilities work within all of the services y",2026-02-06T18:13:00.000Z,reliability-article,best-practices,0.66,True,"Guidance for handling transient faults, AZ and region outages, maintenance, and backup/restore in Managed Redis; product-specific resiliency practices.",unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-monitor-logs,Azure Monitor Logs,Reliability in Azure Monitor Logs,Implement resilient logging with Azure Monitor Logs,"Learn about reliability for Azure Monitor Logs (Log Analytics workspaces), including availability zones, workspace replication, multi-region strategies, and data export for continuity.","Azure Monitor Logsis a centralized software as a service (SaaS) platform for collecting, analyzing, and acting on system-generated data by Azure and non-Azure resources and applications. When you use Azure,reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery. You're responsible for understanding how those capabilities work within all of the services you use, and selecting the capabilities you need to meet your business objectives a",2026-04-09T22:14:00.000Z,reliability-article,best-practices,0.75,True,"Reliability guidance for Azure Monitor Logs (Log Analytics workspaces) typically covers how to configure workspace replication, multi-region strategies, and data export for continuity. 
These are concrete, product-specific recommendations for achieving resiliency, which fits the best-practices sub-skill type.",unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-nat-gateway,Azure NAT Gateway,Reliability in Azure NAT Gateway,,"Learn how to make Azure NAT Gateway resilient to a variety of potential outages and problems, including transient faults, availability zone outages, and region outages.","Azure NAT Gatewayis a fully managed Network Address Translation (NAT) service that provides outbound internet connectivity for resources connected to your private virtual network. The service provides both Source Network Address Translation (SNAT) for outbound connections and Destination Network Address Translation (DNAT) for response packets to outbound-originated connections only. Because Azure NAT Gateway handles traffic for critical virtual network resources, it's designed to provide high re",2026-01-29T18:18:00.000Z,reliability-article,,0.4,False,"Reliability in Azure NAT Gateway; summary is high-level and doesn’t indicate specific quotas, configs, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-netapp-files,Azure NetApp Files,Reliability in Azure NetApp Files,,"Learn about resiliency in Azure NetApp Files, including resilience to transient faults, availability zone failures, and region-wide failures. Understand backup options and SLA details.","Azure NetApp Filesis a native, enterprise-grade file storage solution that integrates seamlessly within Azure and enables file sharing across clients via Network File System (NFS) and Server Message Block (SMB) protocols. Azure NetApp Files is designed for high performance and provides scalable and secure file storage that's managed as a service. When you use Azure,reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery. 
You're respon",2026-01-22T18:34:00.000Z,reliability-article,,0.4,False,Reliability in Azure NetApp Files; mentions backup and SLA but summary doesn’t indicate concrete limits or config parameters.,unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-notification-hubs,Azure Notification Hubs,Reliability in Azure Notification Hubs,Improve reliability of Azure Notification Hubs,"Learn how to ensure reliability in Azure Notification Hubs by using zone redundancy, cross-region disaster recovery, notifications, and device registration backup.",This article describes reliability support in Azure Notification Hubs and covers both regional resiliency withavailability zonesanddisaster recovery and business continuity,2026-01-22T18:34:00.000Z,reliability-article,best-practices,0.64,True,"Describes AZ-based regional resiliency and DR/BCDR, including notification and device registration backup; product-specific reliability patterns.",unchanged -https://learn.microsoft.com/en-us/azure/reliability/reliability-private-link-service,Azure Private Link service,Reliability in Azure Private Link Service,Harden Azure Private Link Service for high reliability,"Learn how to make Azure Private Link service resilient to a variety of potential outages and problems, including transient faults, availability zone outages, and region outages.","Azure Private Link servicehelps you privately expose your own applications, such as applications that run on virtual machines (VMs), within an Azure virtual network. Private Link service helps other Azure customers or clients on your networks connect securely without public IP addresses, which ensures that traffic remains within the Azure network. When you use Azure,reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery. 
You're respo",2026-04-14T17:08:00.000Z,reliability-article,best-practices,0.65,True,"The page focuses on making Azure Private Link Service resilient to specific outage types. Such reliability docs usually contain concrete configuration and design recommendations unique to this service (for example, how to structure endpoints, failover, and redundancy), which aligns with product-specific best-practices.",updated +https://learn.microsoft.com/en-us/azure/reliability/reliability-private-link-service,Azure Private Link service,Reliability in Azure Private Link Service,Harden Azure Private Link Service for high reliability,"Learn how to make Azure Private Link service resilient to a variety of potential outages and problems, including transient faults, availability zone outages, and region outages.","Azure Private Link servicehelps you privately expose your own applications, such as applications that run on virtual machines (VMs), within an Azure virtual network. Private Link service helps other Azure customers or clients on your networks connect securely without public IP addresses, which ensures that traffic remains within the Azure network. When you use Azure,reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery. You're respo",2026-04-14T17:08:00.000Z,reliability-article,best-practices,0.65,True,"The page focuses on making Azure Private Link Service resilient to specific outage types. 
Such reliability docs usually contain concrete configuration and design recommendations unique to this service (for example, how to structure endpoints, failover, and redundancy), which aligns with product-specific best-practices.",unchanged https://learn.microsoft.com/en-us/azure/reliability/reliability-service-bus,Azure Service Bus,Reliability in Azure Service Bus,,"Learn about reliability in Azure Service Bus, including availability zones and multi-region deployments.","Azure Service Bus is a fully managed enterprise message broker service that provides reliable asynchronous messaging capabilities for decoupling applications and services. Service Bus supports queues for point-to-point communication and topics with subscriptions for publish-subscribe messaging patterns. The service provides built-in reliability features, including message durability, at-least-once delivery guarantees, and dead-letter queues to handle failed message processing. When you use Azure",2026-01-27T18:20:00.000Z,reliability-article,,0.3,False,"Reliability in Azure Service Bus; summary highlights built-in reliability features and shared responsibility at a conceptual level, without specific limits, configs, or troubleshooting details.",unchanged
-https://learn.microsoft.com/en-us/azure/reliability/reliability-site-recovery,Azure Site Recovery,Reliability in Azure Site Recovery,Design resilient disaster recovery with Azure Site Recovery,"Learn how to make Azure Site Recovery resilient to a variety of potential outages and problems, including transient faults, availability zone outages, and region outages.","Azure Site Recoveryis a managed replication and failover service for virtual machines, designed to keep workloads available during outages. It continuously replicates workloads from primary sites to secondary locations, ensuring minimal data loss and downtime. In the event of planned maintenance or unexpected disruptions, it orchestrates failover and failback processes. This service supports disaster recovery for on-premises environments and Azure VMs, helping organizations maintain business con",2026-03-03T08:00:00.000Z,reliability-article,best-practices,0.65,True,"Reliability guidance for Azure Site Recovery is typically organized as concrete recommendations (for example, how to structure replication, failover, and regional placement for resilience to zone/region outages). This is product-specific resiliency guidance rather than generic DR theory, so it fits best-practices.
The description suggests actionable patterns for handling transient faults and outages, not just conceptual overview.",unchanged +https://learn.microsoft.com/en-us/azure/reliability/reliability-site-recovery,Azure Site Recovery,Reliability in Azure Site Recovery,Design resilient disaster recovery with Azure Site Recovery,"Learn how to make Azure Site Recovery resilient to a variety of potential outages and problems, including transient faults, availability zone outages, and region outages.","Azure Site Recoveryis a managed replication and failover service for virtual machines (VMs) that keeps workloads available during outages. It continuously replicates workloads from primary sites to secondary locations and limits data loss and downtime. During planned maintenance or unexpected disruptions, it orchestrates failover and failback. This service supports disaster recovery (DR) for on-premises environments and Azure VMs, which helps organizations maintain business continuity. When you ",2026-04-24T17:20:00.000Z,reliability-article,best-practices,0.65,True,"Reliability guidance for Site Recovery typically includes product-specific resiliency recommendations (for example, how to configure replication, failover, and regional placement to handle zone/region outages and transient faults). This is actionable, service-specific reliability/DR guidance rather than generic concepts, fitting best-practices. 
No clear evidence of numeric limits/quotas, decision matrices, or configuration tables in the summary.",updated
https://learn.microsoft.com/en-us/azure/reliability/reliability-sql-database,Azure SQL Database,Reliability in Azure SQL Database,Implement resilient architectures in Azure SQL Database,"Find out about resiliency in Azure SQL Database, including resilience to transient faults, availability zone failures, region-wide failures, service maintenance, and information about backup and resto","Azure SQL Database is a fully managed platform as a service (PaaS) database engine that handles most of the database management functions such as upgrading, patching, backups, and monitoring without user involvement. When you use Azure, reliability is a shared responsibility. Microsoft provides a range of capabilities to support resiliency and recovery. You're responsible for understanding how those capabilities work within all of the services you use, and selecting the capabilities you need to me",2026-01-22T18:34:00.000Z,reliability-article,best-practices,0.69,True,"SQL Database–specific handling of transient faults, AZ failures, regional failures, maintenance, backup/restore, and SLAs; detailed reliability guidance.",unchanged
https://learn.microsoft.com/en-us/azure/reliability/reliability-sql-managed-instance,Azure SQL Managed Instance,Reliability in Azure SQL Managed Instance,,"Find out about resiliency in Azure SQL Managed Instance, including resilience to transient faults, availability zone failures, region-wide failures, service maintenance, and information about backup a","Azure SQL Managed Instance is a fully managed platform as a service (PaaS) database engine. It provides almost 100% feature compatibility with SQL Server. Azure SQL Managed Instance handles most database management functions such as upgrading, patching, backups, and monitoring without user involvement. 
It runs on the latest stable version of the SQL Server database engine and a patched operating system with built-in high availability. When you use Azure, reliability is a shared responsibility. Mic",2026-01-22T18:34:00.000Z,reliability-article,,0.3,False,"Reliability overview for Azure SQL Managed Instance; focuses on high availability, backups, and shared responsibility conceptually, not on numeric limits or configuration parameter tables.",unchanged
https://learn.microsoft.com/en-us/azure/reliability/reliability-sql-managed-instance,Azure SQL Managed Instance,Reliability in Azure SQL Managed Instance,,"Find out about resiliency in Azure SQL Managed Instance, including resilience to transient faults, availability zone failures, region-wide failures, service maintenance, and information about backup a","Azure SQL Managed Instance is a fully managed platform as a service (PaaS) database engine. It provides almost 100% feature compatibility with SQL Server. Azure SQL Managed Instance handles most database management functions such as upgrading, patching, backups, and monitoring without user involvement. It runs on the latest stable version of the SQL Server database engine and a patched operating system with built-in high availability. When you use Azure, reliability is a shared responsibility. 
Mic",2026-01-22T18:34:00.000Z,reliability-article,,0.3,False,"Reliability overview for Azure SQL Managed Instance; focuses on high availability, backups, and shared responsibility conceptually, not on numeric limits or configuration parameter tables.",unchanged diff --git a/products/azure-reliability/report.md b/products/azure-reliability/report.md index 36e3206e..9f464ea1 100644 --- a/products/azure-reliability/report.md +++ b/products/azure-reliability/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: architecture-patterns: Designing Azure apps for high availability using zones and multi-region patterns, including planning zone-resilient workloads, hardening @@ -11,39 +11,39 @@ category_descriptions: resilient Azure Functions architectures to design highly available, fault-tolerant Azure solutions. best-practices: Patterns and guidance to design, configure, and harden Azure services - (AKS, DBs, networking, messaging, backup, DR) for high availability, fault tolerance, - and disaster recovery. + (compute, data, networking, messaging) for high availability, failover, and disaster + recovery. limits-quotas: Details on Azure Queue Storage message size limits, including max message size, behavior when limits are exceeded, and best practices for handling large payloads. skill_description: Expert knowledge for Azure Reliability development including best practices, decision making, architecture & design patterns, limits & quotas, and - deployment. Use when designing zone-redundant AKS, MySQL Flexible Server, Azure - Functions, Queue Storage, or multi-region DR setups, and other Azure Reliability - related development tasks. Not for Azure Monitor (use azure-monitor), Azure Resiliency - (use azure-resiliency), Azure Service Health (use azure-service-health), Azure Sre - Agent (use azure-sre-agent). 
-use_when: Use when designing zone-redundant AKS, MySQL Flexible Server, Azure Functions, - Queue Storage, or multi-region DR setups, and other Azure Reliability related development - tasks. -confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Resiliency (use - azure-resiliency), Azure Service Health (use azure-service-health), Azure Sre Agent - (use azure-sre-agent). + deployment. Use when designing zone/multi-region apps, MySQL Flexible Server HA, + resilient Functions, DR failover, or Queue size limits, and other Azure Reliability + related development tasks. Not for Azure Resiliency (use azure-resiliency), Azure + Monitor (use azure-monitor), Azure Service Health (use azure-service-health), Azure + Site Recovery (use azure-site-recovery). +use_when: Use when designing zone/multi-region apps, MySQL Flexible Server HA, resilient + Functions, DR failover, or Queue size limits, and other Azure Reliability related + development tasks. +confusable_not_for: Not for Azure Resiliency (use azure-resiliency), Azure Monitor + (use azure-monitor), Azure Service Health (use azure-service-health), Azure Site + Recovery (use azure-site-recovery). 
--- # Azure Reliability Crawl Report ## Summary -- **Total Pages**: 103 -- **Fetched**: 103 +- **Total Pages**: 104 +- **Fetched**: 104 - **Fetch Failed**: 0 -- **Classified**: 41 -- **Unclassified**: 62 +- **Classified**: 44 +- **Unclassified**: 60 ### Incremental Update -- **New Pages**: 0 -- **Updated Pages**: 8 -- **Unchanged**: 95 +- **New Pages**: 1 +- **Updated Pages**: 2 +- **Unchanged**: 101 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-reliability/azure-reliability.csv` @@ -52,48 +52,42 @@ confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Resiliency | Type | Count | Percentage | |------|-------|------------| | architecture-patterns | 3 | 2.9% | -| best-practices | 35 | 34.0% | +| best-practices | 38 | 36.5% | | decision-making | 1 | 1.0% | | deployment | 1 | 1.0% | | limits-quotas | 1 | 1.0% | -| *(Unclassified)* | 62 | 60.2% | +| *(Unclassified)* | 60 | 57.7% | ## Changes +### New Pages + +- [Azure Key Vault Managed HSM](https://learn.microsoft.com/en-us/azure/reliability/reliability-managed-hsm) + ### Updated Pages -- [Azure Backup](https://learn.microsoft.com/en-us/azure/reliability/reliability-backup) - - Updated: 2026-02-23T23:12:00.000Z → 2026-04-14T17:08:00.000Z -- [Azure Backup](https://learn.microsoft.com/en-us/azure/reliability/reliability-backup) - - Updated: 2026-02-23T23:12:00.000Z → 2026-04-14T17:08:00.000Z -- [Azure Managed Grafana](https://learn.microsoft.com/en-us/azure/reliability/reliability-managed-grafana) - - Updated: 2026-03-02T18:12:00.000Z → 2026-04-15T17:08:00.000Z -- [Azure Private Link service](https://learn.microsoft.com/en-us/azure/reliability/reliability-private-link-service) - - Updated: 2026-02-20T12:01:00.000Z → 2026-04-14T17:08:00.000Z -- [Azure App Configuration](https://learn.microsoft.com/en-us/azure/reliability/reliability-app-configuration) - - Updated: 2026-02-24T23:11:00.000Z → 2026-04-15T22:12:00.000Z -- [Azure Chaos 
Studio](https://learn.microsoft.com/en-us/azure/reliability/reliability-chaos-studio) - - Updated: 2026-01-22T23:17:00.000Z → 2026-04-14T22:12:00.000Z -- [Azure App Configuration](https://learn.microsoft.com/en-us/azure/reliability/reliability-app-configuration) - - Updated: 2026-02-24T23:11:00.000Z → 2026-04-15T22:12:00.000Z -- [Azure IoT Hub](https://learn.microsoft.com/en-us/azure/reliability/reliability-iot-hub) - - Updated: 2026-01-22T18:34:00.000Z → 2026-04-16T22:08:00.000Z +- [Azure Site Recovery](https://learn.microsoft.com/en-us/azure/reliability/reliability-site-recovery) + - Updated: 2026-03-03T08:00:00.000Z → 2026-04-24T17:20:00.000Z +- [Azure DDoS Protection](https://learn.microsoft.com/en-us/azure/reliability/reliability-ddos-protection) + - Updated: 2026-04-06T22:09:00.000Z → 2026-04-07T08:00:00.000Z ## Classified Pages | TOC Title | Type | Confidence | Reason | |-----------|------|------------|--------| | [Azure App Configuration](https://learn.microsoft.com/en-us/azure/reliability/reliability-app-configuration) | best-practices | 0.75 | Reliability guidance for Azure App Configuration typically includes specific patterns for handling transient faults, region/zone outages, and backup/restore of configuration stores and feature flags. These are actionable, product-specific recommendations (how to design access, caching, and failover), which matches best-practices. | +| [Azure App Configuration](https://learn.microsoft.com/en-us/azure/reliability/reliability-app-configuration) | best-practices | 0.75 | Reliability guidance for Azure App Configuration typically includes specific patterns for handling transient faults, region/zone outages, and backup/restore of configuration stores and feature flags. These are actionable, product-specific recommendations (how to design access, caching, and failover), which matches best-practices. 
| | [Azure Monitor Logs](https://learn.microsoft.com/en-us/azure/reliability/reliability-monitor-logs) | best-practices | 0.75 | Reliability guidance for Azure Monitor Logs (Log Analytics workspaces) typically covers how to configure workspace replication, multi-region strategies, and data export for continuity. These are concrete, product-specific recommendations for achieving resiliency, which fits the best-practices sub-skill type. | -| [Azure Backup](https://learn.microsoft.com/en-us/azure/reliability/reliability-backup) | best-practices | 0.70 | Reliability guidance for Azure Backup typically includes product-specific resiliency recommendations (for example, how to configure vaults, regions, and restore options for different outage types). This goes beyond conceptual reliability and provides concrete, Azure-Backup-specific patterns and DO/DON'T guidance, which fits best-practices. | | [Azure Cosmos DB for NoSQL](https://learn.microsoft.com/en-us/azure/reliability/reliability-cosmos-db-nosql) | best-practices | 0.70 | Explains how to reach 99.999% uptime using AZs, multi-region writes, and automatic failover; highly product-specific reliability design guidance. | | [Azure Database for PostgreSQL](https://learn.microsoft.com/en-us/azure/reliability/reliability-database-postgresql) | best-practices | 0.70 | Reliability content for Azure Database for PostgreSQL generally contains concrete, product-specific HA/DR recommendations (for example, how to configure replicas, backups, and failover behavior for different outage types). This is actionable guidance tied to this service rather than generic theory, fitting best-practices rather than limits or configuration references. | +| [Azure Key Vault Managed HSM](https://learn.microsoft.com/en-us/azure/reliability/reliability-managed-hsm) | best-practices | 0.70 | The page covers reliability, backup/restore, recovery features, and SLA details for Managed HSM. 
These topics generally include concrete, product-specific guidance on how to configure HSM pools, backup/restore patterns, and failover behavior to meet SLAs and handle region outages. That is actionable, service-specific reliability guidance, fitting best-practices. The summary doesn’t clearly show numeric limits or detailed config tables, so other categories are less likely. | | [Azure Managed Grafana](https://learn.microsoft.com/en-us/azure/reliability/reliability-managed-grafana) | best-practices | 0.70 | Reliability content for Azure Managed Grafana generally includes concrete guidance on handling zone/region outages, maintenance, and backup/restore for this specific managed service. These are product-specific DO/DON'T recommendations and patterns, so it fits best-practices rather than generic concepts. | | [Azure Queue Storage](https://learn.microsoft.com/en-us/azure/reliability/reliability-storage-queue) | limits-quotas | 0.70 | Explicitly states a queue message can be up to 64 KB and queues can contain millions of messages up to the storage account capacity; these are concrete numeric limits. | | [Azure VMware Solution](https://learn.microsoft.com/en-us/azure/reliability/reliability-vmware-solution) | best-practices | 0.70 | Service-specific reliability guidance for Azure VMware Solution. These reliability articles typically include concrete deployment recommendations (for example, how to distribute clusters, use availability zones/regions, and handle transient faults) and product-specific gotchas. This fits best under best-practices because it focuses on how to configure and operate the service for resiliency rather than just conceptual reliability theory. 
| | [Services with availability zone support](https://learn.microsoft.com/en-us/azure/reliability/availability-zones-service-support) | deployment | 0.70 | This page lists which Azure services support availability zones and in what mode (zonal vs zone-redundant vs non-zonal), effectively acting as a support matrix for deployment options across services and regions. That matrix of which deployment options are available per service is product-specific expert knowledge and aligns best with the deployment category’s focus on platform/tier support matrices and constraints. | | [Azure SQL Database](https://learn.microsoft.com/en-us/azure/reliability/reliability-sql-database) | best-practices | 0.69 | SQL Database–specific handling of transient faults, AZ failures, regional failures, maintenance, backup/restore, and SLAs; detailed reliability guidance. | | [Azure Backup](https://learn.microsoft.com/en-us/azure/reliability/reliability-backup) | best-practices | 0.68 | The page focuses on making Azure Backup resilient to specific outage types (transient faults, zone and region outages) and provides product-specific reliability guidance and patterns. This is actionable, scenario-based advice tailored to Azure Backup rather than generic reliability theory, fitting best under best-practices. It does not primarily present numeric limits, decision matrices, or configuration parameter tables. | +| [Azure Backup](https://learn.microsoft.com/en-us/azure/reliability/reliability-backup) | best-practices | 0.68 | The page focuses on making Azure Backup resilient to specific outage types (transient faults, zone and region outages) and provides product-specific reliability guidance and patterns. This is actionable, scenario-based advice tailored to Azure Backup rather than generic reliability theory, fitting best under best-practices. It does not primarily present numeric limits, decision matrices, or configuration parameter tables. 
| | [Azure Database for MySQL](https://learn.microsoft.com/en-us/azure/reliability/reliability-database-mysql) | best-practices | 0.68 | The page focuses on making Azure Database for MySQL resilient to specific outage types (transient faults, AZ/region outages, service maintenance) and backup/restore behavior. It contains product-specific resiliency recommendations and patterns (for example, how to configure high availability, failover behavior, and backup strategies unique to this service), which qualify as concrete best practices rather than generic reliability concepts. | | [Azure Database for MySQL](https://learn.microsoft.com/en-us/azure/reliability/reliability-database-mysql) | best-practices | 0.68 | The page focuses on making Azure Database for MySQL resilient to specific outage types (transient faults, AZ/region outages, service maintenance) and backup/restore behavior. It contains product-specific resiliency recommendations and patterns (for example, how to configure high availability, failover behavior, and backup strategies unique to this service), which qualify as concrete best practices rather than generic reliability concepts. | | [Azure Kubernetes Service (AKS)](https://learn.microsoft.com/en-us/azure/reliability/reliability-aks) | best-practices | 0.68 | AKS-specific guidance for transient faults, AZs, multi-region support, backups, and maintenance; actionable reliability configuration and patterns. | @@ -109,7 +103,7 @@ confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Resiliency | [Azure Load Balancer](https://learn.microsoft.com/en-us/azure/reliability/reliability-load-balancer) | best-practices | 0.65 | The page focuses on making Azure Load Balancer resilient to specific outage scenarios (transient faults, zone and region outages) and provides product-specific resiliency guidance and patterns. 
While not about numeric limits, it contains concrete, Azure-Load-Balancer-specific recommendations on how to architect for reliability, which fits best under best-practices. | | [Azure Load Balancer](https://learn.microsoft.com/en-us/azure/reliability/reliability-load-balancer) | best-practices | 0.65 | The page focuses on making Azure Load Balancer resilient to specific outage scenarios (transient faults, zone and region outages) and provides product-specific resiliency guidance and patterns. While not about numeric limits, it contains concrete, Azure-Load-Balancer-specific recommendations on how to architect for reliability, which fits best under best-practices. | | [Azure Private Link service](https://learn.microsoft.com/en-us/azure/reliability/reliability-private-link-service) | best-practices | 0.65 | The page focuses on making Azure Private Link Service resilient to specific outage types. Such reliability docs usually contain concrete configuration and design recommendations unique to this service (for example, how to structure endpoints, failover, and redundancy), which aligns with product-specific best-practices. | -| [Azure Site Recovery](https://learn.microsoft.com/en-us/azure/reliability/reliability-site-recovery) | best-practices | 0.65 | Reliability guidance for Azure Site Recovery is typically organized as concrete recommendations (for example, how to structure replication, failover, and regional placement for resilience to zone/region outages). This is product-specific resiliency guidance rather than generic DR theory, so it fits best-practices. The description suggests actionable patterns for handling transient faults and outages, not just conceptual overview. 
| +| [Azure Site Recovery](https://learn.microsoft.com/en-us/azure/reliability/reliability-site-recovery) | best-practices | 0.65 | Reliability guidance for Site Recovery typically includes product-specific resiliency recommendations (for example, how to configure replication, failover, and regional placement to handle zone/region outages and transient faults). This is actionable, service-specific reliability/DR guidance rather than generic concepts, fitting best-practices. No clear evidence of numeric limits/quotas, decision matrices, or configuration tables in the summary. | | [Multi-region solutions in nonpaired regions](https://learn.microsoft.com/en-us/azure/reliability/regions-multi-region-nonpaired) | architecture-patterns | 0.65 | Lists Azure services and specific configurations for multi-region solutions when regions aren't paired; this is product-specific architecture guidance on when and how to use particular patterns for reliability, beyond generic concepts. | | [Nonregional Azure services](https://learn.microsoft.com/en-us/azure/reliability/regions-nonregional-services) | decision-making | 0.65 | Provides a list of nonregional services with classification as global or geographic; this is selection/decision data about service scope that’s specific and tabular. | | [Overview](https://learn.microsoft.com/en-us/azure/reliability/availability-zones-enable-zone-resiliency) | architecture-patterns | 0.65 | Provides concrete guidance on enabling zone resiliency, prioritizing workloads, and mapping services to patterns; this is design-pattern/decision guidance specific to Azure zones. 
| @@ -122,6 +116,7 @@ confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Resiliency | [Azure Bot Service](https://learn.microsoft.com/en-us/azure/reliability/reliability-bot) | best-practices | 0.62 | Describes concrete patterns for regional vs cross-region reliability, availability zones, and disaster recovery for bots with local data residency; product-specific resiliency guidance. | | [Azure Databricks](https://learn.microsoft.com/en-us/azure/reliability/reliability-databricks) | best-practices | 0.62 | Service-specific resiliency features and how to use them (transient fault handling, AZ support) in Databricks workloads; actionable reliability design guidance. | | [Azure Device Registry](https://learn.microsoft.com/en-us/azure/reliability/reliability-device-registry) | best-practices | 0.62 | Guidance on handling transient faults, AZ failures, and regional failures for Device Registry; service-specific reliability considerations. | +| [Azure DDoS Protection](https://learn.microsoft.com/en-us/azure/reliability/reliability-ddos-protection) | best-practices | 0.60 | The page focuses on how Azure DDoS Protection remains reliable during outages and how customers should use it as part of shared-responsibility reliability. Such content usually includes concrete, product-specific recommendations (for example, how to configure protection plans, design for zone/region failures, and avoid common pitfalls). This aligns with best-practices more than other categories; the summary does not indicate numeric limits or configuration tables. | | [Microsoft Fabric](https://learn.microsoft.com/en-us/azure/reliability/reliability-fabric) | best-practices | 0.60 | Explains how to use availability zones, cross-region replication, and DR planning in Fabric; concrete product-specific reliability patterns. 
| | [Single zone (zonal) resources](https://learn.microsoft.com/en-us/azure/reliability/availability-zones-zonal-resource-resiliency) | architecture-patterns | 0.60 | Explains when to use zonal resources and responsibilities for making them resilient; product-specific pattern guidance for zonal vs zone-resilient deployments. | @@ -176,9 +171,7 @@ confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Resiliency | [Azure Virtual Machines](https://learn.microsoft.com/en-us/azure/reliability/reliability-virtual-machines) | 0.30 | Reliability in Azure Virtual Machines; description emphasizes shared responsibility and general resiliency concepts, not detailed numeric limits or configuration options. | | [Azure Virtual Machines](https://learn.microsoft.com/en-us/azure/reliability/reliability-virtual-machines) | 0.30 | Reliability in Azure Virtual Machines; description emphasizes shared responsibility and general resiliency concepts, not detailed numeric limits or configuration options. | | [Azure Virtual Network](https://learn.microsoft.com/en-us/azure/reliability/reliability-virtual-network) | 0.30 | Reliability in Azure Virtual Network; appears to be general resiliency and SLA overview without product-specific numeric or config details. | -| [Azure App Configuration](https://learn.microsoft.com/en-us/azure/reliability/reliability-app-configuration) | 0.20 | Reliability guidance for Azure App Configuration; summary indicates high-level resiliency concepts (transient faults, outages, backup/restore) but no explicit mention of numeric limits, configuration parameter tables, or detailed troubleshooting mappings. | | [Azure Chaos Studio](https://learn.microsoft.com/en-us/azure/reliability/reliability-chaos-studio) | 0.20 | Reliability overview for Azure Chaos Studio; description suggests conceptual guidance on zones and outage behavior without clear indication of numeric limits, config tables, or error-code-based troubleshooting. 
| -| [Azure DDoS Protection](https://learn.microsoft.com/en-us/azure/reliability/reliability-ddos-protection) | 0.20 | Reliability overview for Azure DDoS Protection; appears conceptual/shared-responsibility guidance without specific limits, error codes, or configuration tables. | | [Azure Disk Storage](https://learn.microsoft.com/en-us/azure/reliability/reliability-storage-disk) | 0.20 | Reliability overview for Azure Disk Storage; focuses on redundancy concepts and shared responsibility, not concrete limits, configs, or troubleshooting mappings. | | [Azure IoT Hub](https://learn.microsoft.com/en-us/azure/reliability/reliability-iot-hub) | 0.20 | Reliability overview for Azure IoT Hub; focuses on shared responsibility and resiliency concepts. The description mentions SLA details but not specific numeric SLA values, limits, or configuration/tier matrices that would qualify as expert knowledge under the defined categories. | | [Azure service incident response](https://learn.microsoft.com/en-us/azure/reliability/incident-response) | 0.20 | Guidance on what to do during Azure service disruptions; focuses on process and support, not on technical limits, configs, or error-code troubleshooting. | diff --git a/products/azure-repos/azure-repos.csv b/products/azure-repos/azure-repos.csv index 6dbba893..8933d751 100644 --- a/products/azure-repos/azure-repos.csv +++ b/products/azure-repos/azure-repos.csv @@ -68,7 +68,7 @@ https://learn.microsoft.com/en-us/azure/devops/repos/git/history?view=azure-devo Each commit also contains a pointer to one or more previous commits. Commits can have multiple parents, creating a history that looks like a graph instead of a straight line. This difference in history is incredibly impor",2026-03-24T21:04:00.000Z,overview,,0.1,False,"Page is an overview of how Git history works conceptually (commits as a graph). 
It does not contain product-specific limits, configuration parameters, decision matrices, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/devops/repos/git/howto?view=azure-devops,Frequently asked questions,Git frequently asked questions - Azure Repos,Troubleshoot Git issues in Azure Repos,"Find answers to frequently asked questions about Git in Azure Repos, including branch management, commit practices, proxy and SSL configuration, and authentication troubleshooting.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Find answers to frequently asked questions about using Git with Azure Repos, including branch management, commit practices, configuration, and troubleshooting clone, push, proxy, SSL, and authentication issues.",2026-02-24T02:03:00Z,faq,troubleshooting,0.7,True,"FAQ explicitly covers troubleshooting clone, push, proxy, SSL, and authentication issues in Azure Repos Git, likely including specific error messages and resolutions that are product-specific.",unchanged +https://learn.microsoft.com/en-us/azure/devops/repos/git/howto?view=azure-devops,Frequently asked questions,Git frequently asked questions - Azure Repos,Troubleshoot Git issues in Azure Repos,"Find answers to frequently asked questions about Git in Azure Repos, including branch management, commit practices, proxy and SSL configuration, and authentication troubleshooting.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Find answers to frequently asked questions about using Git with Azure Repos, including branch management, commit practices, configuration, and troubleshooting clone, push, proxy, SSL, and authentication issues.",2026-04-22T21:02:00Z,faq,troubleshooting,0.72,True,"The FAQ explicitly covers troubleshooting clone, push, proxy, SSL, and authentication issues for Git in Azure Repos. 
Such pages typically list Azure DevOps–specific error messages, configuration nuances (for proxies/SSL/auth), and concrete resolution steps that go beyond generic Git knowledge. This aligns with the troubleshooting sub-skill (symptom → cause → solution with product-specific details), and is more specific than generic Git guidance.",updated https://learn.microsoft.com/en-us/azure/devops/repos/git/ignore-files?view=azure-devops,Ignore files,Ignore files in your Git repo - Azure Repos,Configure Git ignore rules for Azure Repos projects,"Learn how to exclude files from Git version control by using files, commands, and repo management.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Visual Studio 2019 | Visual Studio 2022 Not every file in your project needs tracking by Git. Examples of files that typically don't need tracking include temporary files from your development environment, test outputs, and logs. You can use several mechanisms to inform Git which files in your project shouldn't be tracked and to ensure that Git doesn't report changes to those files. For files that Git doesn't track, you can u",2025-02-18T21:53:00.000Z,how-to,best-practices,0.65,True,Provides concrete guidance on excluding files using .gitignore and related mechanisms; product-specific recommendations and patterns for managing tracked vs untracked files.,unchanged https://learn.microsoft.com/en-us/azure/devops/repos/git/import-from-tfvc?view=azure-devops,Import and migrate from TFVC,Import and migrate repositories from TFVC to Git - Azure Repos,Plan and execute TFVC to Git migration in Azure DevOps,Import and migrate your repositories from TFVC to Git repositories within the same account.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 You can migrate code from an existing Team Foundation Version Control (TFVC) repository to a new Git repository within the same organization. 
Migrating to Git is an involved process for large TFVC repositories and teams. Centralized version control systems, like TFVC, behave differently from Git in fundamental ways. The switch involves a lot more than learning new commands. It's a disruptive change that requires careful plann",2026-01-22T08:00:00.000Z,how-to,decision-making,0.7,True,"Covers migration from TFVC to Git with planning considerations and process guidance; informs strategic decision and migration approach, beyond generic Git knowledge.",unchanged
https://learn.microsoft.com/en-us/azure/devops/repos/git/import-git-repository?view=azure-devops,Import repo,Import a Git repository into a project - Azure Repos,Securely import external Git repositories into Azure Repos,"Import a Git repository from GitHub, GitLab, Bitbucket, or other locations into your Azure DevOps project using secure authentication methods including Microsoft Entra ID tokens.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Important Consider using the more secure Microsoft Entra tokens over higher-risk personal access tokens. For more information, see Reduce PAT usage.
@@ -130,12 +130,12 @@ https://learn.microsoft.com/en-us/azure/devops/repos/git/view-pull-requests?view=azure-devops,View and open pull requests,"View, filter, and open pull requests - Azure Repos",,"Learn about different ways to list, filter, and open Git pull requests in Azure Repos.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Visual Studio 2019 | Visual Studio 2022 You create pull requests (PRs) to review and merge code changes in a Git repository on Azure Repos. Team members and stakeholders can review changes and give feedback before merging the code into the target branch. Reviewers can also comment on changes and vote to approve or reject the code. 
Teams can require PRs for any changes on protected branches, and set branch policies to require certain",2025-10-27T22:02:00.000Z,how-to,,0.3,False,"How to view/filter/open PRs; operational UI guidance without deep config, limits, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/devops/repos/integrations/repos-slack?view=azure-devops,Azure Repos with Slack,Use Azure Repos with Slack - Azure Repos,Connect Azure Repos notifications to Slack channels,Monitor Azure Repos from Slack.,"Azure DevOps Services If you use Slack, you can use the Azure Repos app for Slack to easily monitor your Azure Repos repositories. Set up and manage subscriptions to receive notifications in your channel whenever code is pushed or checked in and whenever a pull request (PR) gets created, updated, or merged. This app supports both Git and Team Foundation Version Control (TFVC) events.",2025-10-24T08:00:00.000Z,how-to,integrations,0.65,True,"Describes using the Azure Repos app for Slack to subscribe to specific repository events (pushes, check-ins, PR lifecycle). This is a concrete product integration pattern between Azure Repos and Slack, beyond generic conceptual content.",unchanged https://learn.microsoft.com/en-us/azure/devops/repos/integrations/repos-teams?view=azure-devops,Azure Repos with Microsoft Teams,Use Azure Repos with Microsoft Teams - Azure Repos,Monitor Azure Repos activity in Microsoft Teams,Monitor Azure Repos from Microsoft Teams.,"Azure DevOps Services If you use Microsoft Teams and Azure Repos, you can use the Azure Repos app for Teams to monitor your repos. The app supports monitoring both Git and Team Foundation Version Control (TFVC) repos, but it doesn't support integration with GitHub repos. In this article, learn how to do the following tasks:",2025-10-27T22:02:00.000Z,how-to,integrations,0.65,True,"Covers the Azure Repos app for Microsoft Teams, including supported repo types and the lack of GitHub integration.
This is specific integration behavior and configuration between Azure Repos and Teams.",unchanged -https://learn.microsoft.com/en-us/azure/devops/repos/security/configure-github-advanced-security-features?view=azure-devops,Configure GitHub Advanced Security,Configure GitHub Advanced Security for Azure DevOps features - Azure Repos,Configure GitHub Advanced Security in Azure Repos,"Enable secret, repo, code, and dependency scanning with GitHub Advanced Security for Azure DevOps",GitHub Advanced Security for Azure DevOps adds GitHub Advanced Security's suite of security features to Azure Repos and includes the following features: You can bring the protection of Advanced Security to your enterprise with the flexibility to enable the right level of protection for your repositories. GitHub Advanced Security for Azure DevOps is available as GitHub Secret Protection for Azure DevOps and GitHub Code Security for Azure DevOps. Secret Protection includes the following features: ,2026-04-14T13:04:00.000Z,how-to,security,0.68,True,"Configuration-focused page for enabling GitHub Advanced Security features in Azure DevOps Repos, likely detailing specific security feature toggles, scope options, and product-specific settings (e.g., enabling secret, repo, code, and dependency scanning per project/repo). 
This is concrete security configuration rather than conceptual guidance.",updated
+https://learn.microsoft.com/en-us/azure/devops/repos/security/configure-github-advanced-security-features?view=azure-devops,Configure GitHub Advanced Security,Configure GitHub Advanced Security for Azure DevOps features - Azure Repos,Configure GitHub Advanced Security in Azure Repos,"Enable secret, repo, code, and dependency scanning with GitHub Advanced Security for Azure DevOps",GitHub Advanced Security for Azure DevOps adds GitHub Advanced Security's suite of security features to Azure Repos and includes the following features: You can bring the protection of Advanced Security to your enterprise with the flexibility to enable the right level of protection for your repositories. GitHub Advanced Security for Azure DevOps is available as GitHub Secret Protection for Azure DevOps and GitHub Code Security for Azure DevOps. Secret Protection includes the following features: ,2026-04-14T13:04:00.000Z,how-to,security,0.68,True,"Configuration-focused page for enabling GitHub Advanced Security features in Azure DevOps Repos, likely detailing specific security feature toggles, scope options, and product-specific settings (e.g., enabling secret, repo, code, and dependency scanning per project/repo). This is concrete security configuration rather than conceptual guidance.",unchanged https://learn.microsoft.com/en-us/azure/devops/repos/security/github-advanced-security-billing?view=azure-devops,Billing for GitHub Advanced Security,Billing for GitHub Advanced Security for Azure DevOps - Azure Repos,,Configure billing for GitHub Advanced Security for Azure DevOps,"To access results and use GitHub Advanced Security for Azure DevOps features, you need a license. Each active committer to at least one repository with Advanced Security enabled consumes one license. For more information on pricing, see GitHub Advanced Security pricing.
A committer is considered active if they're present in a push made within the last 90 days, regardless of when they originally committed. Advanced Security gets billed monthly and directly to the Azure subscription associated with y",2025-08-11T08:00:00.000Z,overview,,0.4,False,Primarily billing and licensing description without technical configuration parameters or limits beyond generic pricing/licensing concepts.,unchanged https://learn.microsoft.com/en-us/azure/devops/repos/security/github-advanced-security-code-scanning-queries?view=azure-devops,Using custom queries and query packs,Use custom queries with GitHub Advanced Security for Azure DevOps - Azure Repos,Configure custom CodeQL queries in Azure DevOps,Using custom queries with GitHub Advanced Security for Azure DevOps.,"By default, if you don't have a custom configuration file specified in your pipeline setup, CodeQL runs the security-extended query pack to analyze your code. You can utilize custom CodeQL queries to write your own queries to find specific vulnerabilities and errors. You also need to create a custom configuration file to modify CodeQL's default analysis. To find existing custom queries or to contribute your own custom query, see Contributing to CodeQL.",2025-02-21T04:40:00.000Z,how-to,configuration,0.65,True,"Describes product-specific behavior of CodeQL in Azure DevOps (default query packs, need for custom configuration files) and how to alter default analysis using custom queries.
This is concrete configuration guidance for GitHub Advanced Security for Azure DevOps rather than a conceptual overview.",unchanged https://learn.microsoft.com/en-us/azure/devops/repos/security/github-advanced-security-code-scanning-third-party?view=azure-devops,Integrate third-party code scanning tools,Integrate non-Microsoft scanning tools with GitHub Advanced Security for Azure DevOps - Azure Repos,Integrate third-party scanners via SARIF in Azure DevOps,Integrate non-Microsoft scanning tools with GitHub Advanced Security for Azure DevOps.,"GitHub Advanced Security for Azure DevOps creates code scanning alerts in a repository using information from Static Analysis Results Interchange Format (SARIF) files. The SARIF file properties are used to populate alert information, such as the alert title, location, and description text. You can generate SARIF files using many static analysis security testing tools, including CodeQL. The results must use SARIF version 2.1.0. For more information on SARIF, see SARIF tutorials.",2025-05-07T22:21:00.000Z,how-to,integrations,0.7,True,"Provides product-specific integration details: GitHub Advanced Security for Azure DevOps consumes SARIF 2.1.0 files from non-Microsoft tools, and uses specific SARIF properties to populate alerts. This is concrete integration behavior and format requirements unique to the product.",unchanged https://learn.microsoft.com/en-us/azure/devops/repos/security/github-advanced-security-code-scanning-troubleshoot?view=azure-devops,Troubleshoot code scanning,Troubleshoot code scanning for GitHub Advanced Security for Azure DevOps - Azure Repos,Troubleshoot CodeQL code scanning in Azure DevOps,Troubleshoot code scanning with GitHub Advanced Security for Azure DevOps.,"Generally, if you're encountering errors with CodeQL execution, the CodeQL CLI reports the status of each command it runs as an exit code. The exit code provides information for subsequent commands or for other tools that rely on the CodeQL CLI.
For more information on exit code details, see Exit codes.",2025-11-24T22:02:00.000Z,how-to,troubleshooting,0.7,True,"Explicitly a troubleshooting article for CodeQL execution within GitHub Advanced Security for Azure DevOps, referencing exit codes and their diagnostic meaning. Organized around errors and how to interpret/resolve them, which is product-specific troubleshooting knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/devops/repos/security/github-advanced-security-code-scanning?view=azure-devops,Configure code scanning,Set up code scanning for GitHub Advanced Security for Azure DevOps - Azure Repos,Set up GitHub Advanced Security code scanning in Azure DevOps,Set up code scanning with GitHub Advanced Security for Azure DevOps,"Code scanning in GitHub Advanced Security for Azure DevOps lets you analyze the code in an Azure DevOps repository to find security vulnerabilities and coding errors. You'll need either GitHub Advanced Security for Azure DevOps or, if you're using the standalone experience, GitHub Code Security for Azure DevOps enabled. Any problems identified by the analysis are raised as an alert. Code scanning uses CodeQL to identify vulnerabilities. CodeQL is the code analysis engine developed by GitHub to a",2026-04-14T13:04:00.000Z,how-to,security,0.64,True,"Describes how to set up CodeQL-based code scanning for GitHub Advanced Security in Azure DevOps, which typically includes product-specific security configuration steps, required permissions, and scan settings.
This is actionable security configuration rather than a high-level overview.",updated +https://learn.microsoft.com/en-us/azure/devops/repos/security/github-advanced-security-code-scanning?view=azure-devops,Configure code scanning,Set up code scanning for GitHub Advanced Security for Azure DevOps - Azure Repos,Set up GitHub Advanced Security code scanning in Azure DevOps,Set up code scanning with GitHub Advanced Security for Azure DevOps,"Code scanning in GitHub Advanced Security for Azure DevOps lets you analyze the code in an Azure DevOps repository to find security vulnerabilities and coding errors. You'll need either GitHub Advanced Security for Azure DevOps or, if you're using the standalone experience, GitHub Code Security for Azure DevOps enabled. Any problems identified by the analysis are raised as an alert. Code scanning uses CodeQL to identify vulnerabilities. CodeQL is the code analysis engine developed by GitHub to a",2026-04-14T13:04:00.000Z,how-to,security,0.64,True,"Describes how to set up CodeQL-based code scanning for GitHub Advanced Security in Azure DevOps, which typically includes product-specific security configuration steps, required permissions, and scan settings. This is actionable security configuration rather than a high-level overview.",unchanged https://learn.microsoft.com/en-us/azure/devops/repos/security/github-advanced-security-dependency-scanning-ecosystems?view=azure-devops,Supported ecosystems and build configurations,Supported package ecosystems for dependency scanning for GitHub Advanced Security for Azure DevOps - Azure Repos,Supported ecosystems for GitHub Advanced Security dependency scanning,Supported package ecosystems for dependency scanning for GitHub Advanced Security for Azure DevOps.,"Dependency scanning supports both direct and transitive dependencies for all supported package ecosystems. Dependency scanning is unable to detect vendored dependencies in your repository. 
Due to how detection is run for dependency scanning, ensure you have a package restore step in your build pipeline so that the correct package version is determined, otherwise results may be missing or incomplete.",2025-02-21T04:40:00.000Z,reference,limits-quotas,0.65,True,"Lists exactly which package ecosystems are supported and constraints like inability to detect vendored dependencies, which are capability limits.",unchanged https://learn.microsoft.com/en-us/azure/devops/repos/security/github-advanced-security-dependency-scanning-troubleshoot?view=azure-devops,Troubleshoot dependency scanning,Troubleshooting dependency scanning for GitHub Advanced Security for Azure DevOps - Azure Repos,Troubleshoot dependency scanning in GitHub Advanced Security,Troubleshooting dependency scanning for GitHub Advanced Security for Azure DevOps,Learn how to troubleshoot dependency scanning issues in GitHub Advanced Security for Azure DevOps.,2025-08-04T17:19:00.000Z,how-to,troubleshooting,0.8,True,"Symptom-to-solution guidance for dependency scanning issues, including product-specific causes and fixes.",unchanged https://learn.microsoft.com/en-us/azure/devops/repos/security/github-advanced-security-dependency-scanning?view=azure-devops,Configure dependency scanning,Set up dependency scanning for GitHub Advanced Security - Azure Repos,Configure dependency scanning for GitHub Advanced Security in Azure DevOps,Set up dependency scanning for GitHub Advanced Security for Azure DevOps,"Dependency scanning in GitHub Advanced Security for Azure DevOps detects the open source components used in your source code and detects if there are any associated vulnerabilities. Any found vulnerabilities from open source components get flagged as an alert. You'll need either GitHub Advanced Security for Azure DevOps or, if you're using the standalone experience, GitHub Code Security for Azure DevOps enabled. GitHub Advanced Security for Azure DevOps works with Azure Repos.
To use GitHub Advanc",2025-02-21T04:40:00.000Z,how-to,configuration,0.7,True,Product-specific setup instructions and required pipeline configuration for dependency scanning.,unchanged diff --git a/products/azure-repos/report.md b/products/azure-repos/report.md index 12706121..70e8c5ae 100644 --- a/products/azure-repos/report.md +++ b/products/azure-repos/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: security: 'Securing Azure Repos and TFVC: auth methods (Entra, PAT, SSH, GCM), repo/branch/PR permissions and policies, branch locking, secure imports, and GitHub Advanced @@ -49,8 +49,8 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Pipelines (us ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 2 -- **Unchanged**: 204 +- **Updated Pages**: 1 +- **Unchanged**: 205 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-repos/azure-repos.csv` @@ -72,10 +72,8 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Pipelines (us ### Updated Pages -- [Configure GitHub Advanced Security](https://learn.microsoft.com/en-us/azure/devops/repos/security/configure-github-advanced-security-features?view=azure-devops) - - Updated: 2026-03-25T21:04:00.000Z → 2026-04-14T13:04:00.000Z -- [Configure code scanning](https://learn.microsoft.com/en-us/azure/devops/repos/security/github-advanced-security-code-scanning?view=azure-devops) - - Updated: 2025-02-21T04:40:00.000Z → 2026-04-14T13:04:00.000Z +- [Frequently asked questions](https://learn.microsoft.com/en-us/azure/devops/repos/git/howto?view=azure-devops) + - Updated: 2026-02-24T02:03:00Z → 2026-04-22T21:02:00Z ## Classified Pages @@ -102,6 +100,7 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Pipelines (us | [Configure branch policy for an external service](https://learn.microsoft.com/en-us/azure/devops/repos/git/pr-status-policy?view=azure-devops) | 
configuration | 0.75 | Guides configuring branch policy to require status from external PR status servers; includes policy options and PR Status API usage. | | [Control access to TFVC](https://learn.microsoft.com/en-us/azure/devops/repos/tfvc/control-access-team-foundation-version-control?view=azure-devops) | security | 0.75 | Details TFVC permission model (Allow/Deny, inheritance, groups); includes product-specific permission semantics that qualify as security configuration. | | [Local and server workspaces](https://learn.microsoft.com/en-us/azure/devops/repos/tfvc/decide-between-using-local-server-workspace?view=azure-devops) | decision-making | 0.75 | Directly guides users in selecting local vs server workspace types, explaining trade-offs and scenarios, which is product-specific decision-making guidance. | +| [Frequently asked questions](https://learn.microsoft.com/en-us/azure/devops/repos/git/howto?view=azure-devops) | troubleshooting | 0.72 | The FAQ explicitly covers troubleshooting clone, push, proxy, SSL, and authentication issues for Git in Azure Repos. Such pages typically list Azure DevOps–specific error messages, configuration nuances (for proxies/SSL/auth), and concrete resolution steps that go beyond generic Git knowledge. This aligns with the troubleshooting sub-skill (symptom → cause → solution with product-specific details), and is more specific than generic Git guidance. | | [Add check-in policies](https://learn.microsoft.com/en-us/azure/devops/repos/tfvc/add-check-policies?view=azure-devops) | configuration | 0.70 | Describes how to add check-in policies and what they enforce (e.g., associating work items), which are TFVC-specific configuration options. 
| | [Compare Git and TFVC](https://learn.microsoft.com/en-us/azure/devops/repos/tfvc/comparison-git-tfvc?view=azure-devops) | decision-making | 0.70 | Explicitly positioned as guidance to choose version control type; likely includes comparison of Git vs TFVC with criteria and trade-offs specific to Azure Repos. | | [Configure](https://learn.microsoft.com/en-us/azure/devops/repos/tfvc/configure-command?view=azure-devops) | configuration | 0.70 | Admin command to view/change project source control settings; likely lists specific configuration options and allowed values. | @@ -116,7 +115,6 @@ confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Pipelines (us | [Extend pull request workflows with pull request status](https://learn.microsoft.com/en-us/azure/devops/repos/git/pull-request-status?view=azure-devops) | integrations | 0.70 | Describes PR workflow extensibility using status and policy; involves PR Status API and external integrations with specific parameters and behaviors. | | [Folder comparison filters](https://learn.microsoft.com/en-us/azure/devops/repos/tfvc/folder-comparison-filters?view=azure-devops) | configuration | 0.70 | Describes ordered list of comparison filters, default name filters, and use via UI and tf folderdiff; this is product-specific configuration behavior not generally known. | | [Folderdiff](https://learn.microsoft.com/en-us/azure/devops/repos/tfvc/folderdiff-command?view=azure-devops) | integrations | 0.70 | Command reference with product-specific syntax, options, and behaviors for tf folderdiff that an LLM won't reliably know from training. | -| [Frequently asked questions](https://learn.microsoft.com/en-us/azure/devops/repos/git/howto?view=azure-devops) | troubleshooting | 0.70 | FAQ explicitly covers troubleshooting clone, push, proxy, SSL, and authentication issues in Azure Repos Git, likely including specific error messages and resolutions that are product-specific. 
| | [Get](https://learn.microsoft.com/en-us/azure/devops/repos/tfvc/get-command?view=azure-devops) | integrations | 0.70 | Detailed CLI reference for tf get with parameters and behaviors specific to Azure DevOps TFVC. | | [Git commands](https://learn.microsoft.com/en-us/azure/devops/repos/git/command-prompt?view=azure-devops) | configuration | 0.70 | Command reference for common Git tasks in Visual Studio, including Azure DevOps-specific behaviors and options—effectively a configuration/command reference. | | [Git index.lock file](https://learn.microsoft.com/en-us/azure/devops/repos/git/git-index-lock?view=azure-devops) | troubleshooting | 0.70 | index.lock problems are a specific failure mode; article likely maps the lock-file symptom to causes and safe resolution steps in this environment. | diff --git a/products/azure-resource-manager/azure-resource-manager.csv b/products/azure-resource-manager/azure-resource-manager.csv index d8f509d8..de2b6b62 100644 --- a/products/azure-resource-manager/azure-resource-manager.csv +++ b/products/azure-resource-manager/azure-resource-manager.csv @@ -8,7 +8,7 @@ https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-confi https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-config-modules,Module settings,Module setting for Bicep config - Azure Resource Manager,Configure Bicep module aliases and credentials,Describes how to customize configuration values for modules in Bicep deployments.,"In a bicepconfig.json file, you can create aliases for module paths and configure profile and credential precedence for publishing and restoring modules.
This article describes the settings that are available for working with Bicep modules.",2025-12-22T08:00:00.000Z,article,configuration,0.8,True,"Details module-related settings (aliases, profile/credential precedence) in bicepconfig.json; product-specific configuration options.",unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-core-diagnostics,All codes,Bicep warnings and error codes - Azure Resource Manager,Interpret Bicep warnings and error diagnostics,Understand Bicep warnings and error codes.,"If you need more information about a particular diagnostic code, select the Feedback button in the upper-right corner of the page and specify the code. You can suppress Bicep diagnostic codes by using disable-next-line and disable-diagnostics. See Directives.",2026-01-16T08:00:00.000Z,article,troubleshooting,0.7,True,Central reference for Bicep diagnostic codes and suppression mechanisms; product-specific error handling guidance.,unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-extension,Extension (extension),Use extensions in Bicep - Azure Resource Manager,Use Bicep extensions to reach external resources,This article describes how to use Bicep extensions.,"Bicep was initially created to enhance the authoring experience compared to Azure Resource Manager JSON templates, simplifying the deployment and management of Azure resources. Bicep extensions build on this foundation, enabling Bicep files to reference resources beyond the scope of Azure Resource Manager. This article describes how to use Bicep extensions.
The syntax for importing Bicep extensions is: The syntax for importing Bicep extensions, which require configuration, is: For examples, seeBi",2025-12-22T08:00:00.000Z,article,integrations,0.7,True,"Explains syntax for importing Bicep extensions, including configuration-bearing imports; these are integration patterns with external systems beyond ARM.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-functions,All functions,Bicep functions overview - Azure Resource Manager,Use built-in functions in Azure Bicep templates,"Learn about the functions that can be used in a Bicep file to retrieve values, work with strings and numerics, and retrieve deployment information.","This article describes all of the functions that you can use in a Bicep file. To define custom functions, see User-defined functions. For a description of the sections in a Bicep file, see Bicep file structure and syntax. Most functions work the same when deployed to a resource group, subscription, management group, or tenant. A few functions can't be used in all scopes. They're noted in the lists below.",2026-04-17T08:00:00.000Z,reference,integrations,0.7,True,"The page enumerates all built-in Bicep/ARM template functions with product-specific syntax, parameter names, and behaviors that are not obvious from general knowledge.
It is effectively an API surface reference for Bicep functions (string, numeric, deployment, resource, etc.), which fits the integrations/coding-patterns category because it documents concrete function signatures and usage details rather than just concepts.",updated
+https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-functions,All functions,Bicep functions overview - Azure Resource Manager,Use built-in functions in Azure Bicep templates,"Learn about the functions that can be used in a Bicep file to retrieve values, work with strings and numerics, and retrieve deployment information.","This article describes all of the functions that you can use in a Bicep file. To define custom functions, see User-defined functions. For a description of the sections in a Bicep file, see Bicep file structure and syntax. Most functions work the same when deployed to a resource group, subscription, management group, or tenant. A few functions can't be used in all scopes. They're noted in the lists below.",2026-04-17T08:00:00.000Z,reference,integrations,0.7,True,"The page enumerates all built-in Bicep/ARM template functions with product-specific syntax, parameter names, and behaviors that are not obvious from general knowledge. It is effectively an API surface reference for Bicep functions (string, numeric, deployment, resource, etc.), which fits the integrations/coding-patterns category because it documents concrete function signatures and usage details rather than just concepts.",unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-functions-any,Any function,Bicep functions - any() - Azure Resource Manager,Use the any() function to relax Bicep typing,Describes the any() function that's available in Bicep to convert types.,"Bicep supports a function named any() that suppresses type check errors. Use the Bicep any() function to cast a value to a type that's compatible with any data type.
For example, use the any() function when a property requires a number but you need to provide a string, like '0.5'. This function doesn't exist in the Azure Resource Manager template runtime. The Bicep any() function only affects compile-time type checking. It doesn't convert values at runtime and isn't emitted into the JSON for an Azure Res",2025-10-22T17:11:00.000Z,article,configuration,0.8,True,"Describes the special any() function, its compile-time-only behavior, and type-checking implications; highly specific language configuration and edge-case behavior.",unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-functions-array,Array functions,Bicep functions - arrays - Azure Resource Manager,Use Bicep array functions in ARM templates,Describes the functions to use in a Bicep file for working with arrays.,This article describes the Bicep functions for working with arrays. The lambda functions for working with arrays can be found here.,2025-10-30T08:00:00.000Z,reference,integrations,0.7,True,Lists product-specific Bicep array functions with signatures and behaviors that are not generally known; effectively an API surface reference.,unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-functions-cidr,CIDR functions,Bicep functions - CIDR - Azure Resource Manager,Manipulate IP ranges with Bicep CIDR functions,Describes the functions to use in a Bicep file to manipulate IP addresses and create IP address ranges.,Classless Inter-Domain Routing (CIDR) is a method of allocating IP addresses and routing Internet Protocol (IP) packets.
This article describes the Bicep functions for working with CIDR.,2025-10-30T08:00:00.000Z,reference,integrations,0.7,True,Documents specific Bicep CIDR functions and parameters for IP range manipulation; detailed API-like reference beyond generic CIDR knowledge.,unchanged @@ -21,7 +21,7 @@ https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-funct https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-functions-numeric,Numeric functions,Bicep functions - numeric - Azure Resource Manager,Use numeric functions in Bicep templates,Describes the functions to use in a Bicep file to work with numbers.,This article describes the Bicep functions for working with integers. Some of the Azure Resource Manager JSON numeric functions are replaced with Bicep numeric operators.,2025-10-30T08:00:00.000Z,reference,integrations,0.65,True,"Lists Bicep numeric functions and their usage, including differences from ARM JSON functions; specific API reference.",unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-functions-object,Object functions,Bicep functions - objects - Azure Resource Manager,Manipulate objects using Bicep object functions,Describes the functions to use in a Bicep file for working with objects.,This article describes the Bicep functions for working with objects.,2025-10-30T08:00:00.000Z,reference,integrations,0.7,True,Provides concrete Bicep object functions and signatures for working with objects in templates.,unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-functions-parameters-file,Parameters file functions,Bicep functions for Bicep parameters files - Azure Resource Manager,Use functions in Bicep parameters files,Learn about the functions that can be used in Bicep parameters files.,This article describes the Bicep functions that can be used in Bicep parameters files (.bicepparam).,2025-09-17T22:11:00.000Z,reference,integrations,0.75,True,Documents which
functions are allowed in .bicepparam files and how they behave; specific configuration and API rules.,unchanged
-https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-functions-resource,Resource functions,Bicep functions - resources - Azure Resource Manager,Use Bicep resource functions to access ARM resources,Describes the functions to use in a Bicep file to retrieve values about resources.,"This article describes the Bicep functions for getting resource values. To get values from the current deployment, see Deployment value functions.",2026-04-17T08:00:00.000Z,reference,integrations,0.7,True,"The page documents product-specific Bicep resource functions (such as their exact signatures, parameters, and return behaviors) used to retrieve values about Azure resources. These are concrete, service-specific API/DSL details that go beyond generic knowledge, fitting the integrations & coding patterns category best.",updated
+https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-functions-resource,Resource functions,Bicep functions - resources - Azure Resource Manager,Use Bicep resource functions to access ARM resources,Describes the functions to use in a Bicep file to retrieve values about resources.,"This article describes the Bicep functions for getting resource values. To get values from the current deployment, see Deployment value functions.",2026-04-17T08:00:00.000Z,reference,integrations,0.7,True,"The page documents product-specific Bicep resource functions (such as their exact signatures, parameters, and return behaviors) used to retrieve values about Azure resources.
These are concrete, service-specific API/DSL details that go beyond generic knowledge, fitting the integrations & coding patterns category best.",unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-functions-scope,Scope functions,Bicep functions - scopes - Azure Resource Manager,Access deployment scopes using Bicep scope functions,Describes the functions to use in a Bicep file to retrieve values about deployment scopes.,This article describes the Bicep functions for getting scope values.,2025-10-30T08:00:00.000Z,reference,integrations,0.7,True,"Describes Bicep functions for scope handling (subscription, resource group, etc.), which are specific to ARM/Bicep.",unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-functions-string,String functions,Bicep functions - string - Azure Resource Manager,Manipulate text with Bicep string functions,Describes the functions to use in a Bicep file to work with strings.,This article describes the Bicep functions for working with strings.,2025-12-22T08:00:00.000Z,reference,integrations,0.7,True,Lists Bicep string functions and their exact behavior; language-specific API reference.,unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-import,Import (import),Imports in Bicep - Azure Resource Manager,Import shared functionality and namespaces in Bicep,This article describes how to import shared functionality and namespaces in Bicep.,This article describes the syntax you use to export and import shared functionality and namespaces. 
Using compile-time imports automatically enables language version 2.0 code generation.,2025-12-22T08:00:00.000Z,article,configuration,0.7,True,"Describes syntax and behavior of import/export for shared functionality and namespaces, including language version implications; product-specific configuration semantics.",unchanged @@ -196,7 +196,7 @@ https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/patterns-lo https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/patterns-name-generation,Name generation,Name generation pattern - Azure Resource Manager,Implement robust name generation patterns in Bicep,Describes the name generation pattern.,"Within your Bicep files, use string interpolation and Bicep functions to create resource names that are unique, deterministic, meaningful, and different for each environment that you deploy to.",2025-12-22T08:00:00.000Z,article,architecture-patterns,0.7,True,"Provides a concrete pattern for generating deterministic, environment-specific resource names using Bicep functions; product-specific design pattern.",unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/patterns-shared-variable-file,Shared variable file,Shared variable file pattern - Azure Resource Manager,Use shared variable file pattern in Bicep,Describes the shared variable file pattern.,"Reduce the repetition of shared values in your Bicep files. Instead, load those values from a shared JSON file within your Bicep file. 
When using arrays, concatenate the shared values with deployment-specific values in your Bicep code.",2025-12-22T08:00:00.000Z,article,architecture-patterns,0.7,True,Describes loading shared values from JSON and combining with deployment-specific values; a Bicep-specific configuration/architecture pattern.,unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/private-module-registry,Private module registry,Create a private container registry in Azure for Bicep modules - Azure Resource Manager,Set up a private Azure container registry for Bicep modules,Learn how to set up a private container registry in Azure for private Bicep modules.,"To share modules within your organization, you can create a private module registry. You can then publish modules to that registry and give read access to users who need to deploy the modules. After the modules are shared in the registries, you can reference them from your Bicep files. To use public modules, see Bicep Modules. To work with module registries, you must have Bicep CLI version 0.4.1008 or later. To use with Azure CLI, you must have version 2.31.0 or later. To use with Azure PowerShell, you",2026-02-25T08:00:00.000Z,how-to,deployment,0.65,True,Covers creating a private module registry with specific CLI/PowerShell version requirements and registry usage constraints; deployment-focused with product-specific requirements.,unchanged -https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/quickstart-create-bicep-use-visual-studio,Create Bicep files - Visual Studio,Create Bicep files - Visual Studio - Azure Resource Manager,,Use Visual Studio and the Bicep extension to create Bicep files to deploy Azure resources.,"This quickstart guides you through the steps to create a Bicep file with Visual Studio. You create a storage account and a virtual network. You also learn how the Bicep extension simplifies development by providing type safety, syntax validation, and autocompletion. 
A similar authoring experience is also supported in Visual Studio Code. See Quickstart: Create Bicep files with Visual Studio Code.",2025-12-22T08:00:00.000Z,quickstart,,0.3,False,"Quickstart for Visual Studio authoring; mostly procedural steps, not deep configuration reference or limits.",unchanged +https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/quickstart-create-bicep-use-visual-studio,Create Bicep files - Visual Studio,Create Bicep files - Visual Studio - Azure Resource Manager,,Use Visual Studio and the Bicep extension to create Bicep files to deploy Azure resources.,"This quickstart guides you through the steps to create a Bicep file with Visual Studio. You create a storage account and a virtual network. You also learn how the Bicep extension simplifies development by providing type safety, syntax validation, and autocompletion. A similar authoring experience is also supported in Visual Studio Code. See Quickstart: Create Bicep files with Visual Studio Code.",2026-04-20T17:21:00.000Z,quickstart,,0.2,False,"Quickstart tutorial focused on using Visual Studio and the Bicep extension to create and deploy basic resources. It does not present configuration parameter tables, limits/quotas, error-code-based troubleshooting, or product-specific decision matrices. The content is primarily step-by-step guidance rather than expert reference details.",updated https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/quickstart-create-bicep-use-visual-studio-code,Create Bicep files - VS Code,Quickstart: Create Bicep files with Visual Studio Code - Azure Resource Manager,,Learn how to use Visual Studio Code and the Bicep extension to create Bicep files and deploy Azure resources.,"This quickstart guides you through using Visual Studio Code to create a Bicep file. You create a storage account and a virtual network. You also learn how the Bicep extension provides type safety, syntax validation, and autocompletion to simplify development. 
The Bicep MCP (Model Context Protocol) server provides AI agents with tools to help generate high-quality Bicep code. See Quickstart: Create Bicep files with Visual Studio Code and Bicep MCP Server. Visual Studio supports a similar authoring experience",2026-01-30T08:00:00.000Z,quickstart,,0.3,False,Quickstart tutorial for using VS Code with Bicep; primarily step-by-step guidance without configuration matrices or limits.,unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/quickstart-create-bicep-use-visual-studio-code-model-context-protocol,Create Bicep files - Bicep MCP Server,Quickstart: Create Bicep files with Visual Studio Code and Bicep MCP server - Azure Resource Manager,,Learn how to use Visual Studio Code and the Bicep MCP server to create Bicep files and deploy Azure resources.,This quickstart shows you how to use Visual Studio Code and Bicep MCP server to create a Bicep file.,2026-02-09T18:26:00.000Z,quickstart,,0.3,False,Quickstart for VS Code and Bicep MCP server; summary suggests walkthrough rather than detailed configuration tables or limits.,unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/quickstart-create-deployment-stacks,Create deployment stacks,Create and deploy a deployment stack with Bicep - Azure Resource Manager,,Learn how to use Bicep to create and deploy a deployment stack in your Azure subscription.,This quickstart describes how to create a deployment stack.,2025-12-22T08:00:00.000Z,quickstart,,0.35,False,Quickstart for deployment stacks; appears example-driven without detailed limits or configuration matrices.,unchanged @@ -333,7 +333,7 @@ https://learn.microsoft.com/en-us/azure/azure-resource-manager/management/tag-resources-powershell,Azure PowerShell,"Tag resources, resource groups, and subscriptions with Azure PowerShell - Azure Resource Manager",Manage Azure resource tags with PowerShell,Shows how to use 
Azure PowerShell to apply tags to Azure resources.,"This article describes how to use Azure PowerShell to tag resources, resource groups, and subscriptions. For tag recommendations and limitations, see Use tags to organize your Azure resources and management hierarchy.",2025-09-15T08:00:00.000Z,article,integrations,0.65,True,"PowerShell tagging article includes cmdlet names, parameter sets, and usage patterns specific to Azure Resource Manager tagging, which are concrete integration details.",unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/management/tag-resources-python,Python,"Tag resources, resource groups, and subscriptions with Python - Azure Resource Manager",Tag Azure resources programmatically with Python SDK,Shows how to use Python to apply tags to Azure resources.,"This article describes how to use Python to tag resources, resource groups, and subscriptions. For tag recommendations and limitations, see Use tags to organize your Azure resources and management hierarchy.",2025-09-15T08:00:00.000Z,article,integrations,0.7,True,"Python tagging article uses Azure SDK for Python with specific client classes, methods, and parameter names for tags; these are product-specific integration patterns.",unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/management/tag-resources-templates,ARM templates,"Tag resources, resource groups, and subscriptions with ARM templates - Azure Resource Manager",Configure tags in ARM templates during deployment,Shows how to use ARM templates to apply tags to Azure resources.,"This article describes how to use Azure Resource Manager templates (ARM templates) to tag resources, resource groups, and subscriptions during deployment. For tag recommendations and limitations, see Use tags to organize your Azure resources and management hierarchy. 
Note The tags you apply through an ARM template or Bicep file overwrite any existing tags.",2025-09-15T08:00:00.000Z,article,configuration,0.7,True,"Shows how to define tags in ARM templates, including schema properties and behavior (template-applied tags overwrite existing tags); this is concrete configuration behavior unique to ARM.",unchanged -https://learn.microsoft.com/en-us/azure/azure-resource-manager/management/tag-support,Supported resource types,Tag support for resources - Azure Resource Manager,Determine Azure resource types that support tags,Shows which Azure resource types support tags. Provides details for all Azure services.,"This article describes whether a resource type supports tags. The column labeled Supports tags indicates whether the resource type has a property for the tag. The column labeled Tag in cost report indicates whether that resource type passes the tag to the cost report. You can view costs by tags in the Cost Management cost analysis and the Azure billing invoice and daily usage data. To ensure that all the usage/cost records are tagged irrespective of whether the resource supports or emits tags, use tag in",2026-04-03T08:00:00.000Z,article,decision-making,0.7,True,"The page contains a comprehensive, product-specific matrix listing each Azure resource type and whether it supports tags and passes tags to cost reports. This is detailed reference data not inferable from general training and is used to decide tagging strategies and which resources can be used for tag-based cost reporting, fitting the decision-making category best.",updated +https://learn.microsoft.com/en-us/azure/azure-resource-manager/management/tag-support,Supported resource types,Tag support for resources - Azure Resource Manager,Determine Azure resource types that support tags,Shows which Azure resource types support tags. Provides details for all Azure services.,"This article describes whether a resource type supports tags. 
The column labeled Supports tags indicates whether the resource type has a property for the tag. The column labeled Tag in cost report indicates whether that resource type passes the tag to the cost report. You can view costs by tags in the Cost Management cost analysis and the Azure billing invoice and daily usage data. To ensure that all the usage/cost records are tagged irrespective of whether the resource supports or emits tags, use tag in",2026-04-03T08:00:00.000Z,article,decision-making,0.7,True,"The page contains a comprehensive, product-specific matrix listing each Azure resource type and whether it supports tags and passes tags to cost reports. This is detailed reference data not inferable from general training and is used to decide tagging strategies and which resources can be used for tag-based cost reporting, fitting the decision-making category best.",unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/management/tls-support,TLS support,TLS version supported by Azure Resource Manager - Azure Resource Manager,Plan for TLS version support changes in Azure Resource Manager,Describes the deprecation of TLS versions prior to 1.2 in Azure Resource Manager,"Transport Layer Security (TLS) is a security protocol that establishes encryption channels over computer networks. TLS 1.2 is the current industry standard and is supported by Azure Resource Manager. For backwards compatibility, Azure Resource Manager also supports earlier versions, such as TLS 1.0 and 1.1, but that support is ending. 
To ensure that Azure is compliant with regulatory requirements, and provide improved security for our customers, Azure Resource Manager will stop supporting protoco",2025-09-15T08:00:00.000Z,article,security,0.7,True,"Details deprecation of TLS versions prior to 1.2 for ARM, including compliance and support behavior; product-specific security protocol configuration and requirements.",unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/add-template-to-azure-pipelines,ARM templates with pipelines,CI/CD with Azure Pipelines and templates - Azure Resource Manager,Configure Azure Pipelines CI/CD for ARM templates,"Describes how to configure continuous integration in Azure Pipelines by using Azure Resource Manager templates. It shows how to use a PowerShell script, or copy files to a staging location and deploy ","You can integrate Azure Resource Manager templates (ARM templates) with Azure Pipelines for continuous integration and continuous deployment (CI/CD). In this article, you learn two more advanced ways to deploy templates with Azure Pipelines.",2025-04-28T08:00:00.000Z,how-to,deployment,0.65,True,"Covers advanced deployment patterns for ARM templates with Azure Pipelines, including pipeline configuration and staging/deployment flows that are product-specific deployment knowledge.",unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/all-files-test-cases,All files test cases,All files test cases for Azure Resource Manager test toolkit - Azure Resource Manager,Use ARM test toolkit rules for all JSON files,Describes the tests that are run for all files by the Azure Resource Manager template test toolkit.,"This article describes the tests that are run with the template test toolkit for all JavaScript Object Notation (JSON) files. The examples include the test names and code samples that pass or fail the tests. 
For more information about how to run tests or how to run a specific test, see Test parameters.",2025-07-23T08:00:00.000Z,how-to,best-practices,0.8,True,"Lists generic JSON-file tests with pass/fail examples and test names; these are concrete, product-specific validation rules representing best practices.",unchanged @@ -398,7 +398,7 @@ is an older module only available for Windows PowerShell 5.1 that no longer rece The Az and AzureRM modules are not compatible when installed for the same versions of PowerShell. If you need both versions: A key benefit of Azure is consistency. Development investments for one location are reusable in another. An Azure Resource Manager template (ARM template) makes your deployments consistent ",2025-04-28T08:00:00.000Z,article,best-practices,0.65,True,Focuses on making templates consistent across different cloud environments and Azure Stack; contains product-specific recommendations and patterns for cross-cloud reuse.,unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/template-expressions,Write template expressions,Template syntax and expressions - Azure Resource Manager,Use ARM template expressions and syntax rules,Describes the declarative JSON syntax for Azure Resource Manager templates (ARM templates).,"The basic syntax of the Azure Resource Manager template (ARM template) is JavaScript Object Notation (JSON). However, you can use expressions to extend the JSON values available within the template. Expressions start and end with brackets: [ and ], respectively. The value of the expression is evaluated when the template is deployed. An expression can return a string, integer, boolean, array, or object. 
A template expression can't exceed 24,576 characters.",2026-01-09T23:14:00.000Z,article,limits-quotas,0.7,True,"Describes expression syntax and explicitly states a maximum template expression length of 24,576 characters, which is a specific limit/constraint.",unchanged -https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/template-functions,All functions,Template functions - Azure Resource Manager,Use built-in functions in ARM templates,"Describes the functions to use in an Azure Resource Manager template (ARM template) to retrieve values, work with strings and numerics, and retrieve deployment information.","This article describes all the functions you can use in an Azure Resource Manager template (ARM template). For information about using functions in your template, see template syntax. To create your own functions, see User-defined functions. Most functions work the same when deployed to a resource group, subscription, management group, or tenant. A few functions can't be used in all scopes. They're noted in the lists below. Tip We recommend Bicep because it offers the same capabilities as ARM templa",2026-04-17T08:00:00.000Z,reference,integrations,0.68,True,"Lists all ARM template functions with their exact names, signatures, and behavior, which are product-specific API details. This is expert integration knowledge about how to programmatically interact with Azure Resource Manager templates rather than a conceptual overview.",updated +https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/template-functions,All functions,Template functions - Azure Resource Manager,Use built-in functions in ARM templates,"Describes the functions to use in an Azure Resource Manager template (ARM template) to retrieve values, work with strings and numerics, and retrieve deployment information.","This article describes all the functions you can use in an Azure Resource Manager template (ARM template). 
For information about using functions in your template, see template syntax. To create your own functions, see User-defined functions. Most functions work the same when deployed to a resource group, subscription, management group, or tenant. A few functions can't be used in all scopes. They're noted in the lists below. Tip We recommend Bicep because it offers the same capabilities as ARM templa",2026-04-17T08:00:00.000Z,reference,integrations,0.68,True,"Lists all ARM template functions with their exact names, signatures, and behavior, which are product-specific API details. This is expert integration knowledge about how to programmatically interact with Azure Resource Manager templates rather than a conceptual overview.",unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/template-functions-array,Array functions,Template functions - arrays - Azure Resource Manager,Use array functions in ARM templates,Describes the functions to use in an Azure Resource Manager template (ARM template) for working with arrays.,"This article describes the template functions for working with arrays. To get an array of string values delimited by a value, see split. Tip Bicep is recommended since it offers the same capabilities as ARM templates, and the syntax is easier to use. 
To learn more, see array functions.",2025-08-01T08:00:00.000Z,reference,configuration,0.85,True,"Lists specific array function names, behaviors, and usage patterns in ARM templates, which are detailed product-specific function references.",unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/template-functions-cidr,CIDR functions,Template functions - CIDR - Azure Resource Manager,Use CIDR functions in ARM templates,Describes the functions to use in an Azure Resource Manager template (ARM template) to manipulate IP addresses and create IP address ranges.,"This article describes the functions for working with CIDR in your Azure Resource Manager template (ARM template). Tip We recommend Bicep because it offers the same capabilities as ARM templates and the syntax is easier to use. To learn more, see cidr functions.",2025-02-12T08:00:00.000Z,reference,configuration,0.85,True,"Documents the exact CIDR-related functions available in ARM templates and how they manipulate IP ranges, which is specific configuration syntax.",unchanged https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/template-functions-comparison,Comparison functions,Template functions - comparison - Azure Resource Manager,Use comparison functions in ARM templates,Describes the functions to use in an Azure Resource Manager template (ARM template) to compare values.,"Resource Manager provides several functions for making comparisons in your Azure Resource Manager template (ARM template): Tip Bicep is recommended since it offers the same capabilities as ARM templates, and the syntax is easier to use. 
To learn more, see the coalesce logical operator and comparison operators.",2025-08-01T08:00:00.000Z,reference,configuration,0.8,True,"Provides the exact comparison function set and semantics for ARM templates, including function names and behavior, which is product-specific.",unchanged diff --git a/products/azure-resource-manager/report.md b/products/azure-resource-manager/report.md index f38b259a..85a8cf04 100644 --- a/products/azure-resource-manager/report.md +++ b/products/azure-resource-manager/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: deployment: 'Deploying and moving Azure resources with ARM/Bicep: CI/CD pipelines, template specs, deployment scripts, deployment stacks, and cross-subscription/region @@ -53,8 +53,8 @@ confusable_not_for: Not for Azure Blueprints (use azure-blueprints), Azure Polic ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 4 -- **Unchanged**: 460 +- **Updated Pages**: 1 +- **Unchanged**: 463 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-resource-manager/azure-resource-manager.csv` @@ -77,14 +77,8 @@ confusable_not_for: Not for Azure Blueprints (use azure-blueprints), Azure Polic ### Updated Pages -- [Resource functions](https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-functions-resource) - - Updated: 2025-12-22T08:00:00.000Z → 2026-04-17T08:00:00.000Z -- [All functions](https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-functions) - - Updated: 2025-09-11T08:00:00.000Z → 2026-04-17T08:00:00.000Z -- [Supported resource types](https://learn.microsoft.com/en-us/azure/azure-resource-manager/management/tag-support) - - Updated: 2026-02-09T08:00:00.000Z → 2026-04-03T08:00:00.000Z -- [All functions](https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/template-functions) - - Updated: 2025-07-10T08:00:00.000Z → 2026-04-17T08:00:00.000Z +- [Create Bicep 
files - Visual Studio](https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/quickstart-create-bicep-use-visual-studio) + - Updated: 2025-12-22T08:00:00.000Z → 2026-04-20T17:21:00.000Z ## Classified Pages @@ -509,7 +503,6 @@ confusable_not_for: Not for Azure Blueprints (use azure-blueprints), Azure Polic | [Convert portal template gallery](https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/template-spec-convert) | 0.30 | Primarily a how-to conversion guide from portal gallery templates to template specs without detailed configuration tables, limits, or error mappings. | | [Create Bicep files - Bicep MCP Server](https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/quickstart-create-bicep-use-visual-studio-code-model-context-protocol) | 0.30 | Quickstart for VS Code and Bicep MCP server; summary suggests walkthrough rather than detailed configuration tables or limits. | | [Create Bicep files - VS Code](https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/quickstart-create-bicep-use-visual-studio-code) | 0.30 | Quickstart tutorial for using VS Code with Bicep; primarily step-by-step guidance without configuration matrices or limits. | -| [Create Bicep files - Visual Studio](https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/quickstart-create-bicep-use-visual-studio) | 0.30 | Quickstart for Visual Studio authoring; mostly procedural steps, not deep configuration reference or limits. | | [Create a custom resource provider](https://learn.microsoft.com/en-us/azure/azure-resource-manager/custom-providers/create-custom-provider) | 0.30 | Quickstart tutorial for creating a custom provider; primarily step-by-step guidance without configuration tables or error-code mappings. 
| | [Create a custom resource provider - Azure PowerShell](https://learn.microsoft.com/en-us/azure/azure-resource-manager/custom-providers/create-custom-provider-quickstart-powershell) | 0.30 | PowerShell quickstart for creating a custom provider; procedural content without detailed config matrices or product-specific troubleshooting. | | [Create a management group - .NET](https://learn.microsoft.com/en-us/azure/governance/management-groups/create-management-group-dotnet) | 0.30 | Quickstart for creating a management group with .NET; tutorial-style code, no SDK parameter tables or edge-case guidance. | @@ -549,6 +542,7 @@ confusable_not_for: Not for Azure Blueprints (use azure-blueprints), Azure Polic | [2 - Add resource](https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/template-tutorial-add-resource) | 0.20 | Tutorial for adding a storage account resource to a template; step-by-step example without broader configuration reference or limits. | | [Add service group members in Azure portal](https://learn.microsoft.com/en-us/azure/governance/service-groups/create-service-group-member-portal) | 0.20 | Quickstart for adding members via portal; focuses on basic usage and preview notice, not on quotas, configuration matrices, or error-resolution mappings. | | [Contribute to Bicep](https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/overview) | 0.20 | High-level overview of Bicep; conceptual description without detailed configs, limits, or troubleshooting. | +| [Create Bicep files - Visual Studio](https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/quickstart-create-bicep-use-visual-studio) | 0.20 | Quickstart tutorial focused on using Visual Studio and the Bicep extension to create and deploy basic resources. It does not present configuration parameter tables, limits/quotas, error-code-based troubleshooting, or product-specific decision matrices. 
The content is primarily step-by-step guidance rather than expert reference details. | | [Create JSON templates - VS Code](https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/quickstart-create-templates-use-visual-studio-code) | 0.20 | Quickstart/tutorial for creating ARM templates in VS Code; focuses on step-by-step usage, not detailed configuration tables, limits, or product-specific best practices. | | [Create JSON templates - portal](https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/quickstart-create-templates-use-the-portal) | 0.20 | Portal-based quickstart for creating and deploying a simple ARM template; no detailed limits, configuration matrices, or troubleshooting content. | | [Create a service group in Azure portal](https://learn.microsoft.com/en-us/azure/governance/service-groups/create-service-group-portal) | 0.20 | Quickstart for creating a service group via portal; primarily step-by-step UI instructions without detailed configuration tables, limits, or product-specific troubleshooting content. 
| diff --git a/products/azure-sap/azure-sap.csv b/products/azure-sap/azure-sap.csv index a8d535f5..4fd275c1 100644 --- a/products/azure-sap/azure-sap.csv +++ b/products/azure-sap/azure-sap.csv @@ -9,19 +9,19 @@ https://learn.microsoft.com/en-us/azure/sap/automation/bash/remove-controlplane, https://learn.microsoft.com/en-us/azure/sap/automation/bash/remover,Removing the SAP system using shell scripts,remover.sh,Tear down SAP systems using remover.sh,Remove a new SAP system using a shell script.,,2023-02-10T23:04:00.000Z,reference,deployment,0.65,True,"Script-based removal of SAP systems in Azure deployment automation typically includes specific resource dependencies and ordering, which are expert deployment teardown details.",unchanged https://learn.microsoft.com/en-us/azure/sap/automation/bash/set-secrets,Set the Keyvault secrets using shell scripts,set_secrets.sh,Set SAP deployment SPN secrets in Azure Key Vault,Sets the SPN Secrets in Azure Key vault using a shell script.,,2024-12-15T12:14:00.000Z,reference,security,0.8,True,"Setting SPN secrets in Key Vault for SAP automation involves product-specific security configuration (service principal usage, Key Vault secret naming, scopes) that qualifies as expert security knowledge.",unchanged https://learn.microsoft.com/en-us/azure/sap/automation/bash/update-sas-token,Update the SAP Library SAS token,update_sas_token.sh,Update SAP Library SAS token in Azure Key Vault,Updates the SAP Library SAS token in Azure Key Vault,,2023-02-10T23:04:00.000Z,reference,configuration,0.7,True,"Updating SAS tokens for the SAP Library involves specific Key Vault secret names and token handling patterns, which are configuration details unique to this solution.",unchanged -https://learn.microsoft.com/en-us/azure/sap/automation/bom-get-files,Get SAP media for BOM,Get SAP installation media for Bill of Materials,,Learn how to download SAP installation media for use with the Bill of Materials (BOM) in SAP Deployment Automation 
Framework.,"The SAP on Azure Deployment Automation Framework uses a Bill of Materials (BOM). To create your BOM, you have to locate and download relevant SAP installation media. Then, you need to upload these media files to your Azure storage account. Note This guide covers advanced deployment. For a basic explanation of how to deploy the automation framework, see the get started guide instead. This guide is for configurations that use either the SAP Application database or HANA databases.",2026-04-15T08:00:00.000Z,how-to,,0.5,False,"Guide to obtaining SAP installation media and uploading to storage; appears procedural. While specific file names or paths might exist, the summary doesn’t clearly indicate structured configuration parameters, limits, or error mappings required for expert classification.",updated +https://learn.microsoft.com/en-us/azure/sap/automation/bom-get-files,Get SAP media for BOM,Get SAP installation media for Bill of Materials,,Learn how to download SAP installation media for use with the Bill of Materials (BOM) in SAP Deployment Automation Framework.,"The SAP on Azure Deployment Automation Framework uses a Bill of Materials (BOM). To create your BOM, you have to locate and download relevant SAP installation media. Then, you need to upload these media files to your Azure storage account. Note This guide covers advanced deployment. For a basic explanation of how to deploy the automation framework, see the get started guide instead. This guide is for configurations that use either the SAP Application database or HANA databases.",2026-04-15T08:00:00.000Z,how-to,,0.5,False,"Guide to obtaining SAP installation media and uploading to storage; appears procedural. 
While specific file names or paths might exist, the summary doesn’t clearly indicate structured configuration parameters, limits, or error mappings required for expert classification.",unchanged https://learn.microsoft.com/en-us/azure/sap/automation/bom-prepare,Prepare BOM,Prepare Bill of Materials for automation,Prepare SAP Bill of Materials for Azure automation,Learn how to prepare a full SAP Bill of Materials (BOM) for use with the SAP on Azure Deployment Automation Framework.,"The SAP on Azure Deployment Automation Framework uses a Bill of Materials (BOM) to configure your SAP systems. A BOM defines the software components, installation media, and templates required for deployment. You can create a BOM by using a script or manually, and optionally include permalinks to SAP media. The automation framework's GitHub repository contains a set of sample BOMs that you can use to get started. You can also create BOMs for other SAP applications and databases. For more information ab",2026-04-02T18:15:00.000Z,how-to,configuration,0.7,True,"Explains how to structure a BOM for SAP on Azure Deployment Automation Framework, including components, media, templates, scripts, and permalinks. The BOM schema and required fields are product-specific configuration knowledge beyond generic SAP or Azure concepts.",unchanged -https://learn.microsoft.com/en-us/azure/sap/automation/bom-templates-db,Create automation template files,Generate SAP application installation templates,Generate SAP application installation templates for SDAF BOM,Learn how to generate SAP application installation templates for use with SAP Deployment Automation Framework.,"The SAP on Azure Deployment Automation Framework uses a Bill of Materials (BOM) to define the SAP Application. Before you can deploy a system using a custom BOM, you need to also create the templates for the ini-files used in the unattended SAP installation. This guide covers how to create the application templates for an SAP/S4 deployment. 
The process is the same for the other SAP applications.",2026-04-14T22:21:00.000Z,how-to,configuration,0.7,True,"Covers creating ini-file templates for unattended SAP installation tied to the BOM; such content typically defines template parameters, required fields, and formats, which are product-specific configuration details.",updated +https://learn.microsoft.com/en-us/azure/sap/automation/bom-templates-db,Create automation template files,Generate SAP application installation templates,Generate SAP application installation templates for SDAF BOM,Learn how to generate SAP application installation templates for use with SAP Deployment Automation Framework.,"The SAP on Azure Deployment Automation Framework uses a Bill of Materials (BOM) to define the SAP Application. Before you can deploy a system using a custom BOM, you need to also create the templates for the ini-files used in the unattended SAP installation. This guide covers how to create the application templates for an SAP/S4 deployment. The process is the same for the other SAP applications.",2026-04-14T22:21:00.000Z,how-to,configuration,0.7,True,"Covers creating ini-file templates for unattended SAP installation tied to the BOM; such content typically defines template parameters, required fields, and formats, which are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/sap/automation/configure-control-plane,Configure the control plane,Configure the control plane for SAP Deployment Automation Framework,Configure control plane parameters for SAP automation,Learn about the control plane configuration parameters for the deployer VM and SAP library in SAP Deployment Automation Framework.,"The control plane is the management infrastructure for SAP Deployment Automation Framework. It provides the execution environment and shared storage that the framework uses to deploy and maintain SAP workloads on Azure. 
Configuration of the control plane determines how the deployer virtual machine (VM) and the SAP library behave in your environment. This article describes the Terraform parameters you use to configure the two control plane components. The deployer and the SAP library, including ne",2026-04-08T22:12:00.000Z,concept-article,configuration,0.82,True,"Explicitly describes Terraform parameters for configuring the deployer VM and SAP library in the SAP Deployment Automation Framework control plane. This is a configuration reference with specific parameter names and behaviors, matching the configuration sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/sap/automation/configure-devops,Configure Azure DevOps,Configure Azure DevOps for SAP Deployment Automation,Configure Azure DevOps for SAP automation pipelines,"Configure Azure DevOps Services for SAP Deployment Automation Framework to set up projects, pipelines, service connections, and variable groups for SAP deployments.","This article shows how to configure Azure DevOps Services to run SAP Deployment Automation Framework pipelines. This configuration helps you standardize and repeat SAP infrastructure deployment, software acquisition, and configuration tasks across environments. You set up the Azure DevOps project assets, service connections, pipelines, permissions, and variable groups that the framework requires. After you complete these steps, you can run deployments and ongoing SAP environment operations from ",2026-03-19T22:26:00.000Z,how-to,deployment,0.7,True,"Details setting up projects, pipelines, service connections, permissions, and variable groups specifically for SAP Deployment Automation Framework. 
These are product-specific CI/CD deployment requirements and structures, not generic DevOps guidance.",unchanged -https://learn.microsoft.com/en-us/azure/sap/automation/configure-extra-disks,Configure custom disk sizing,Custom disk configuration reference for SAP Deployment Automation Framework,Customize disk configurations for SAP Deployment Automation Framework,Review default disk configurations and custom sizing options for SAP systems in SAP Deployment Automation Framework.,"By default,SAP Deployment Automation Frameworkdefines the disk configuration for SAP systems. As needed, you can change the default configuration by providing a custom disk configuration JSON file. Tip When possible, it's a best practice to increase the disk size instead of adding more disks.",2026-04-15T22:11:00.000Z,concept-article,configuration,0.75,True,"Reference for default disk configurations and custom sizing options; likely includes tables of disk types, sizes, and JSON schema fields for custom configuration, which are concrete configuration parameters and ranges.",updated -https://learn.microsoft.com/en-us/azure/sap/automation/configure-sap-parameters,Configure SAP parameters,SAP installation parameters for Ansible configuration,Configure SAP installation parameters for SDAF Ansible playbooks,Learn about the SAP installation parameters used by Ansible playbooks in SAP Deployment Automation Framework.,The Ansible playbooks use a combination of default parameters and parameters defined by the Terraform deployment for the SAP installation.,2026-04-15T08:00:00.000Z,concept-article,configuration,0.8,True,"Explicitly about SAP installation parameters used by Ansible playbooks; such pages typically list parameter names, defaults, and usage, which are product-specific configuration details not generally known to LLMs.",updated +https://learn.microsoft.com/en-us/azure/sap/automation/configure-extra-disks,Configure custom disk sizing,Custom disk configuration reference for SAP Deployment 
Automation Framework,Customize disk configurations for SAP Deployment Automation Framework,Review default disk configurations and custom sizing options for SAP systems in SAP Deployment Automation Framework.,"By default, SAP Deployment Automation Framework defines the disk configuration for SAP systems. As needed, you can change the default configuration by providing a custom disk configuration JSON file. Tip When possible, it's a best practice to increase the disk size instead of adding more disks.",2026-04-15T22:11:00.000Z,concept-article,configuration,0.75,True,"Reference for default disk configurations and custom sizing options; likely includes tables of disk types, sizes, and JSON schema fields for custom configuration, which are concrete configuration parameters and ranges.",unchanged +https://learn.microsoft.com/en-us/azure/sap/automation/configure-sap-parameters,Configure SAP parameters,SAP installation parameters for Ansible configuration,Configure SAP installation parameters for SDAF Ansible playbooks,Learn about the SAP installation parameters used by Ansible playbooks in SAP Deployment Automation Framework.,The Ansible playbooks use a combination of default parameters and parameters defined by the Terraform deployment for the SAP installation.,2026-04-15T08:00:00.000Z,concept-article,configuration,0.8,True,"Explicitly about SAP installation parameters used by Ansible playbooks; such pages typically list parameter names, defaults, and usage, which are product-specific configuration details not generally known to LLMs.",unchanged https://learn.microsoft.com/en-us/azure/sap/automation/configure-system,Configure the SAP system,Configure SAP system parameters for automation,Define SAP system tfvars parameters for automation,Define the SAP system properties for SAP Deployment Automation Framework by using a parameters file.,"Configuration for SAP Deployment Automation Framework happens through parameters files. 
You provide information about your SAP system infrastructure in a tfvars file, which the automation framework uses for deployment. You can find examples of the variable file in the samples repository. The automation supports creating resources (green-field deployment) or using existing resources (brown-field deployment):",2025-02-16T08:00:00.000Z,concept-article,configuration,0.75,True,Focuses on tfvars parameter files and supported fields for system configuration; clearly a configuration reference for the framework.,unchanged https://learn.microsoft.com/en-us/azure/sap/automation/configure-webapp,Configure Configuration Web App,Configure the control plane web application for SAP Deployment Automation Framework,Configure SDAF control plane web application,Configure a web app as part of the control plane to help create and deploy SAP workload zones and systems on Azure.,"As part of the SAP Deployment Automation Framework control plane, you can optionally create an interactive web application that assists you in creating the required configuration files. You can also deploy the SAP workload zones and systems by using Azure Pipelines. The web app provides a visual interface for generating Terraform configuration files and triggering deployments, so you don't need to work with the CLI or edit parameter files manually.",2026-04-10T22:10:00.000Z,how-to,configuration,0.7,True,"Describes configuring an optional web app in the SDAF control plane to generate Terraform configuration files and trigger deployments. 
Involves specific configuration of this web app component and its parameters, aligning with configuration rather than generic deployment steps.",unchanged https://learn.microsoft.com/en-us/azure/sap/automation/configure-workload-zone,Configure the workload zone,Workload zone configuration in the automation framework,Design SAP workload zones in the automation framework,Overview of the SAP workload zone configuration process within SAP Deployment Automation Framework.,"An SAP application typically has multiple development tiers. For example, you might have development, quality assurance, and production tiers. SAP Deployment Automation Framework calls these tiers workload zones. See the following diagram for an example of a workload zone with two SAP systems. The workload zone provides shared services to all of the SAP Systems in the workload zone. These shared services include: The workload zone is typically deployed in a spoke subscription and the deployment of al",2024-09-24T11:22:00.000Z,concept-article,architecture-patterns,0.6,True,"Describes workload zones, shared services, and hub-spoke placement; this is a framework-specific architecture pattern for SAP tiers.",unchanged https://learn.microsoft.com/en-us/azure/sap/automation/deploy-control-plane,Deploy the control plane,Deploy the control plane for SAP Deployment Automation Framework,Deploy control plane for SAP automation framework,"Learn how to deploy the control plane, including the deployer and SAP library, for SAP Deployment Automation Framework.","The control plane is the deployment infrastructure for SAP Deployment Automation Framework (SDAF). It provides the deployment agents, state storage, and credential management that the framework needs to provision and configure SAP environments on Azure. Without a control plane, you can't run the Terraform and Ansible workflows that deploy SAP workload zones and systems. Setting up the control plane is the first deployment step in the framework. 
In this article, you prepare deployment credentials, ",2026-04-02T22:11:00.000Z,how-to,deployment,0.78,True,"Describes deploying the control plane (deployer VM, SAP library, state storage, credentials) for SAP Deployment Automation Framework. Contains product-specific deployment steps and requirements for the control plane infrastructure that go beyond generic Terraform/Ansible usage.",unchanged https://learn.microsoft.com/en-us/azure/sap/automation/deploy-system,Deploy the SAP system,SAP system deployment for SAP Deployment Automation Framework,,"Learn about the SAP system deployment process in SAP Deployment Automation Framework, including database, application, central services, and web dispatcher tiers.","An SAP system deployment is a step in SAP Deployment Automation Framework that provisions the virtual machines (VMs), disks, and load balancers your SAP application needs. Instead of manually creating each resource, you define parameters and the framework deploys a correctly sized infrastructure. The SAP system deploys:",2026-04-10T22:10:00.000Z,concept-article,,0.4,False,"High-level description of SAP system deployment process in SDAF (VMs, disks, load balancers, tiers). The summary does not show detailed parameter tables, limits, or specific configuration values; it appears more like process/overview content.",unchanged -https://learn.microsoft.com/en-us/azure/sap/automation/deploy-workload-zone,Deploy the workload zone,Deploy SAP workload zones with the automation framework,,Learn how to deploy SAP workload zones using SAP Deployment Automation Framework on Azure.,"AnSAP applicationtypically has multiple development tiers. For example, you might have development, quality assurance, and production tiers.SAP Deployment Automation Frameworkcalls these tiersworkload zones. You can use workload zones in multiple Azure regions. Each workload zone then has its own instance of Azure Virtual Network. 
The following services are provided by the SAP workload zone: The workload zones are typically deployed in spokes in a hub-and-spoke architecture. They can be in their",2026-04-14T22:21:00.000Z,how-to,,0.45,False,"Describes deploying workload zones and their architecture (hub-and-spoke, VNets, regions); summary reads as conceptual/step-by-step deployment without clear indication of tier matrices, constraints, or config tables.",updated +https://learn.microsoft.com/en-us/azure/sap/automation/deploy-workload-zone,Deploy the workload zone,Deploy SAP workload zones with the automation framework,,Learn how to deploy SAP workload zones using SAP Deployment Automation Framework on Azure.,"An SAP application typically has multiple development tiers. For example, you might have development, quality assurance, and production tiers. SAP Deployment Automation Framework calls these tiers workload zones. You can use workload zones in multiple Azure regions. Each workload zone then has its own instance of Azure Virtual Network. The following services are provided by the SAP workload zone: The workload zones are typically deployed in spokes in a hub-and-spoke architecture. They can be in their",2026-04-14T22:21:00.000Z,how-to,,0.45,False,"Describes deploying workload zones and their architecture (hub-and-spoke, VNets, regions); summary reads as conceptual/step-by-step deployment without clear indication of tier matrices, constraints, or config tables.",unchanged https://learn.microsoft.com/en-us/azure/sap/automation/deployment-framework,Overview,About SAP Deployment Automation Framework,,Overview of the framework and tooling for SAP Deployment Automation Framework.,"SAP Deployment Automation Framework is an open-source orchestration tool that can deploy, install, and maintain SAP environments. You can deploy the systems on any of the SAP-supported operating system versions and into any Azure region. 
You can create infrastructure for SAP landscapes based on SAP High-Performance Analytic Appliance (HANA) and NetWeaver with AnyDB by using Terraform. The environments can be configured using Ansible. Terraform from Hashicorp is an open-source tool for provisioning a",2026-02-09T08:00:00.000Z,concept-article,,0.2,False,High-level overview of SAP Deployment Automation Framework; conceptual description of what it is and does.,unchanged https://learn.microsoft.com/en-us/azure/sap/automation/devops-tutorial,Deployment DevOps hands-on lab,Deploy SAP infrastructure by using SAP Deployment Automation Framework and Azure DevOps,Deploy SAP infrastructure with SDAF and Azure DevOps,Learn how to deploy SAP infrastructure by using SAP Deployment Automation Framework with Azure DevOps Services.,"SAP Deployment Automation Framework provides pipelines in Azure DevOps that automate the entire SAP deployment lifecycle, from control plane setup through SAP software installation. By using these pipelines, you can deploy and manage SAP environments consistently without running scripts manually. In this article, you:",2026-04-07T06:20:00.000Z,how-to,deployment,0.66,True,"Describes using SDAF pipelines in Azure DevOps to automate the SAP deployment lifecycle. While tutorial-like, it is specifically about product-integrated CI/CD pipelines and deployment automation for SAP on Azure, which fits the deployment sub-skill focused on production deployment methods.",unchanged https://learn.microsoft.com/en-us/azure/sap/automation/extensibility,Extensibility,Extend SAP Deployment Automation Framework,Extend SAP Deployment Automation Framework configuration,"Learn how to extend SAP Deployment Automation Framework by adding custom Ansible playbooks, repositories, packages, kernel parameters, and more.","SAP Deployment Automation Framework (SDAF) provides default configurations for deploying SAP environments on Azure. 
When your deployment requires custom OS settings, extra pipeline stages, or organization-specific Ansible playbooks, you can extend the framework to match your operational needs. In this article, you learn about the extensibility options available in SDAF, including custom Ansible playbooks, configuration-based extensions for repositories, packages, kernel parameters, and more. Comm",2026-04-06T22:10:00.000Z,how-to,configuration,0.7,True,"Covers extensibility options such as custom Ansible playbooks, repositories, packages, and kernel parameters. These are concrete, product-specific configuration mechanisms and patterns for SDAF, fitting the configuration category more than generic best practices.",unchanged @@ -29,11 +29,11 @@ https://learn.microsoft.com/en-us/azure/sap/automation/get-started,Get started w https://learn.microsoft.com/en-us/azure/sap/automation/integration-azure-monitor-sap,SAP Monitoring with Azure Monitor for SAP,Configure Azure Monitor for SAP with SAP Deployment Automation Framework,Integrate Azure Monitor for SAP with automation,Learn how to configure Azure Monitor for SAP with SAP Deployment Automation Framework to automate monitoring of your SAP landscape.,"Azure Monitor for SAP is an Azure-native monitoring product for SAP landscapes running on Azure. SAP Deployment Automation Framework is an open-source orchestration tool that deploys, installs, and maintains SAP environments on Azure. Setting up monitoring manually across the databases, operating systems, and clusters in an SAP landscape is complex and error-prone. In this article, you configure Azure Monitor for SAP as part of your SAP Deployment Automation Framework deployment so that the fram",2026-04-02T18:15:00.000Z,how-to,integrations,0.74,True,"Shows how to configure Azure Monitor for SAP as part of SAP Deployment Automation Framework. 
Likely includes specific configuration parameters, resource types, and integration settings unique to this product combination, fitting integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/sap/automation/naming,Naming conventions,Naming standards for the automation framework,,Explanation of naming conventions for SAP Deployment Automation Framework.,"SAP Deployment Automation Framework uses standard naming conventions. Consistent naming helps the automation framework run correctly with Terraform. Standard naming helps you deploy the automation framework smoothly. For example, consistent naming helps you to: Review the standard terms, area paths, and variable names before you begin your deployment. If necessary, you can also configure custom naming.",2023-12-12T18:08:00.000Z,reference,,0.45,False,Explains naming standards conceptually; likely conventions but not detailed config parameters or limits that materially affect behavior.,unchanged https://learn.microsoft.com/en-us/azure/sap/automation/naming-module,Using a custom naming convention,Configure custom naming for SAP Deployment Automation Framework,Customize Azure resource naming in SAP automation,Learn how to configure custom naming conventions for SAP Deployment Automation Framework on Azure.,"SAP Deployment Automation Framework uses a standard naming convention for the Azure resources it deploys. If the default names don't match your organization's naming standards, or if you need to avoid naming conflicts across Azure subscriptions, you can override them with your own names. This article shows you how to provide custom resource names by using a JSON override file or by modifying the Terraform naming module directly.",2026-04-01T17:25:00.000Z,how-to,configuration,0.78,True,"Explains overriding default naming via JSON override file or Terraform naming module. 
This implies specific parameter names/structure and module behavior unique to SAP Deployment Automation Framework, which are concrete configuration details rather than generic concepts.",unchanged -https://learn.microsoft.com/en-us/azure/sap/automation/new-vs-existing,Sample deployment configurations,Configure the automation framework for new and existing deployments,Choose new vs existing infrastructure for SAP automation,Learn how to configure SAP Deployment Automation Framework for both new and existing scenarios.,"You can useSAP Deployment Automation Frameworkin both new and existing deployment scenarios. In new deployment scenarios, the automation framework doesn't use existing Azure infrastructure. The deployment process creates the virtual networks, subnets, key vaults, and more. In existing deployment scenarios, the automation framework uses existing Azure infrastructure. For example, the deployment uses existing virtual networks.",2023-09-03T11:19:00.000Z,concept-article,decision-making,0.7,True,"Guides configuration for greenfield vs brownfield scenarios, helping decide when to let the framework create vs reuse infrastructure.",unchanged -https://learn.microsoft.com/en-us/azure/sap/automation/plan-deployment,Plan your automated deployment,Plan your SAP deployment with SAP Deployment Automation Framework,Plan SAP Deployment Automation Framework usage on Azure,"Prepare for using SAP Deployment Automation Framework. Steps include planning for credentials management, DevOps structure, and deployment scenarios.","SAP Deployment Automation Frameworkis an open-source orchestration tool that automates SAP deployments on Azure by using Terraform and Ansible. Before you deploy, you need to plan for subscriptions, credentials management, and virtual network design. This article describes the key planning decisions you need to make before you begin deploying. 
For generic SAP on Azure design considerations, seeIntroduction to an SAP adoption scenario.",2026-04-14T22:21:00.000Z,concept-article,decision-making,0.65,True,"Described as key planning decisions for subscriptions, credentials, and virtual network design specific to SAP Deployment Automation Framework; this is product-specific decision guidance about structure and deployment scenarios, fitting decision-making. Likely includes concrete recommendations for how to organize subscriptions and DevOps for this framework.",updated -https://learn.microsoft.com/en-us/azure/sap/automation/reference-bash,Bash scripts for automation framework,SAP Deployment Automation Framework Bash reference,Use Bash scripts to deploy SAP automation framework,Use shell scripts to deploy SAP Deployment Automation Framework components.,You can deploy allSAP Deployment Automation Frameworkcomponents by using shell scripts.,2023-09-19T16:49:00.000Z,article,deployment,0.65,True,Reference for shell scripts that deploy framework components; includes script names and usage patterns for deployment.,unchanged +https://learn.microsoft.com/en-us/azure/sap/automation/new-vs-existing,Sample deployment configurations,Configure the automation framework for new and existing deployments,,Learn how to configure SAP Deployment Automation Framework for both new and existing scenarios.,"This article shows you how to configure SAP Deployment Automation Framework for both new and existing deployment scenarios. In new deployment scenarios, the automation framework creates all Azure infrastructure, including virtual networks, subnets, key vaults, and more. 
In existing deployment scenarios, the framework uses your existing Azure infrastructure, such as existing virtual networks.",2026-04-22T17:34:00.000Z,how-to,,0.3,False,"Describes configuring SAP Deployment Automation Framework for new vs existing deployments; summary suggests scenario-based setup guidance without explicit limits, decision matrices, or detailed configuration reference tables.",updated +https://learn.microsoft.com/en-us/azure/sap/automation/plan-deployment,Plan your automated deployment,Plan your SAP deployment with SAP Deployment Automation Framework,Plan SAP Deployment Automation Framework usage on Azure,"Prepare for using SAP Deployment Automation Framework. Steps include planning for credentials management, DevOps structure, and deployment scenarios.","SAP Deployment Automation Framework is an open-source orchestration tool that automates SAP deployments on Azure by using Terraform and Ansible. Before you deploy, you need to plan for subscriptions, credentials management, and virtual network design. This article describes the key planning decisions you need to make before you begin deploying. For generic SAP on Azure design considerations, see Introduction to an SAP adoption scenario.",2026-04-14T22:21:00.000Z,concept-article,decision-making,0.65,True,"Described as key planning decisions for subscriptions, credentials, and virtual network design specific to SAP Deployment Automation Framework; this is product-specific decision guidance about structure and deployment scenarios, fitting decision-making. 
Likely includes concrete recommendations for how to organize subscriptions and DevOps for this framework.",unchanged +https://learn.microsoft.com/en-us/azure/sap/automation/reference-bash,Bash scripts for automation framework,SAP Deployment Automation Framework shell script reference,Reference shell scripts for SAP deployment automation,Use shell scripts to deploy SAP Deployment Automation Framework components.,You can deploy all SAP Deployment Automation Framework components by using shell scripts.,2026-04-21T22:10:00.000Z,concept-article,configuration,0.7,True,"A 'shell script reference' for SAP Deployment Automation Framework components is likely a parameter/command reference with specific script names, flags, and required/optional arguments—product-specific configuration details that qualify as expert knowledge under the configuration sub-skill.",updated https://learn.microsoft.com/en-us/azure/sap/automation/run-ansible,Use Ansible for system configuration,Run Ansible to configure the SAP system,Run SDAF Ansible playbooks to install SAP,Configure the environment and install SAP by using Ansible playbooks with SAP Deployment Automation Framework.,"SAP Deployment Automation Framework includes Ansible playbooks that configure operating systems and install SAP components on your Azure virtual machines (VMs). Running the playbooks automates what would otherwise be a lengthy, error-prone manual process. In this article, you run the playbooks for each phase of the SAP installation: operating system configuration, software download, database installation, Central Services, application servers, and Web Dispatcher. The playbooks are in the /sap-aut",2026-04-06T22:10:00.000Z,how-to,integrations,0.76,True,"Details running SDAF-provided Ansible playbooks for OS configuration and SAP installation, including directory paths (/sap-aut...) and phase-specific playbooks. 
This is a product-specific automation/integration pattern between SDAF, Ansible, and SAP, matching integrations & coding patterns.",unchanged -https://learn.microsoft.com/en-us/azure/sap/automation/software,Download and prepare software media,Download SAP software for the automation framework,Download SAP media using automation framework playbooks,Download the SAP software to your Azure environment by using Ansible playbooks to use SAP Deployment Automation Framework.,"You need a copy of the SAP software before you can use SAP Deployment Automation Framework. Prepare your Azure environment to store the SAP media in your storage account. Then, download the SAP software by using Ansible playbooks. For more information about the framework, seeSAP Deployment Automation Framework.",2026-04-01T22:41:00.000Z,how-to,integrations,0.66,True,"Uses Ansible playbooks to download SAP software into Azure storage for the automation framework. This likely includes specific playbook variables, storage configuration, and paths unique to SAP Deployment Automation Framework, which are integration/config details not generally known.",unchanged +https://learn.microsoft.com/en-us/azure/sap/automation/software,Download and prepare software media,Download SAP software for the automation framework,,Download the SAP software to your Azure environment by using Ansible playbooks to use SAP Deployment Automation Framework.,"You need a copy of the SAP software before you can use SAP Deployment Automation Framework. Prepare your Azure environment to store the SAP media in your storage account. Then, download the SAP software by using Ansible playbooks. 
For more information about the framework, see SAP Deployment Automation Framework.",2026-04-20T11:11:00.000Z,how-to,,0.3,False,"Focuses on downloading SAP software using Ansible playbooks; from the summary it looks like a procedural tutorial rather than a reference of parameters, limits, or troubleshooting mappings.",updated https://learn.microsoft.com/en-us/azure/sap/automation/supportability,Supported platforms and features,Supportability matrix for SAP Deployment Automation Framework,Check SAP Deployment Automation support matrix,"Learn about the supported operating systems, databases, storage types, and deployment topologies for SAP Deployment Automation Framework.","SAP Deployment Automation Framework is an open-source orchestration tool for deploying, installing, and maintaining SAP environments on Azure. Before you plan or modify a deployment, confirm that your framework supports the target operating systems, databases, storage options, and topologies. Control plane: The deployer virtual machine (VM) of the control plane must run on Linux
These configuration checks help identify potential issues that can affect system performance, reliability, and compliance.",2025-10-31T22:17:00.000Z,reference,best-practices,0.65,True,"Configuration checks for SAP on Azure are explicit validations against best practices (e.g., parameter values, sizing, HA settings), which are actionable, product-specific recommendations.",unchanged https://learn.microsoft.com/en-us/azure/sap/automation/testing-framework-high-availability,High availability testing,SAP Testing Automation Framework High Availability Testing,Run high availability tests with SAP Testing Automation Framework,Learn about High Availability testing capabilities in the SAP Testing Automation Framework,"High Availability (HA) is essential for maintaining business continuity in SAP landscapes. The SAP Testing Automation Framework provides a structured, automated approach to validating HA configuration and resilience for SAP HANA (scale-up) and SAP Central Services. It executes configuration validation checks and orchestrates controlled failure simulations to ensure that recovery and failover mechanisms comply with SAP on Azure best practices. 
The framework uses Ansible to coordinate test executi",2025-10-31T22:17:00.000Z,how-to,best-practices,0.6,True,"HA testing capabilities usually include concrete test scenarios, failover steps, and configuration expectations specific to SAP on Azure HA setups, which are product-specific best practices.",unchanged https://learn.microsoft.com/en-us/azure/sap/automation/testing-framework-supportability,Supported platforms and features,SAP Testing Automation Framework Supported Platforms and Features,Supported platforms and features for SAP Testing Automation Framework,"Learn about the supported platforms, operating systems, and features for the SAP Testing Automation Framework","This document outlines the supported platforms, operating systems, and features for the SAP Testing Automation Framework.",2025-10-31T22:17:00.000Z,conceptual,limits-quotas,0.65,True,"A 'supported platforms and features' page typically includes explicit OS versions, platform matrices, and feature support tables, which are expert, version-specific limits/compatibility details.",unchanged -https://learn.microsoft.com/en-us/azure/sap/automation/tools-configuration,Configure external tools,Configure external tools for SAP Deployment Automation Framework,,Learn how to configure Visual Studio Code to connect to the deployer virtual machine for SAP Deployment Automation Framework.,This article describes how to configure Visual Studio Code to connect to the deployer virtual machine (VM) for SAP Deployment Automation Framework.,2026-04-17T22:08:00.000Z,how-to,,0.4,False,"How-to for configuring VS Code to connect to a VM; generally generic tooling steps, unlikely to contain product-specific parameter tables or limits beyond standard SSH/VS Code usage.",updated -https://learn.microsoft.com/en-us/azure/sap/automation/troubleshooting,Troubleshoot the automation framework,Troubleshoot SAP Deployment Automation Framework,Troubleshoot SAP Deployment Automation Framework issues on Azure,"Learn how to 
troubleshoot common issues with SAP Deployment Automation Framework, including deployment, configuration, and software download problems.",SAP Deployment Automation Framework (SDAF) has many moving parts. This article helps you troubleshoot issues that you might encounter.,2026-04-15T22:11:00.000Z,troubleshooting-general,troubleshooting,0.8,True,"Explicit troubleshooting article for SDAF; such content typically maps deployment/config/software download issues to causes and resolutions, often with specific error messages or logs, matching the troubleshooting criteria.",updated
-https://learn.microsoft.com/en-us/azure/sap/automation/tutorial,Deployment hands-on lab,Deploy SAP Deployment Automation Framework,,"Learn how to deploy SAP Deployment Automation Framework for enterprise scale on Azure, including a control plane, workload zone, and SAP system infrastructure.","This article shows you how to perform deployments by using SAP Deployment Automation Framework. This example uses Azure Cloud Shell to deploy the control plane infrastructure. The deployer virtual machine (VM) creates the remaining infrastructure and SAP HANA configurations. There are three main steps of an SAP deployment on Azure with the automation framework: Prepare the region. You deploy components to support the SAP automation framework in a specified Azure region. In this step, you: Prepare",2026-04-13T08:00:00.000Z,how-to,,0.45,False,"Tutorial for deploying the framework; likely a procedural guide using Cloud Shell and a VM. Summary does not indicate detailed configuration references, limits, or troubleshooting mappings beyond generic deployment steps.",updated
-https://learn.microsoft.com/en-us/azure/sap/automation/upgrading,Upgrading the automation framework,Upgrade SAP Deployment Automation Framework,,Learn how to upgrade SAP Deployment Automation Framework.,"SAP Deployment Automation Framework is updated regularly. 
You can upgrade the pipeline definitions, the control plane, and the workload zone to get the latest features and fixes. This article describes how to upgrade each component.",2026-04-16T22:31:00.000Z,how-to,,0.45,False,"Upgrade guide for the framework; summary suggests procedural steps rather than detailed configuration matrices, limits, or troubleshooting mappings. Without evidence of specific version constraints or config tables, it doesn’t clearly meet expert-knowledge criteria.",updated +https://learn.microsoft.com/en-us/azure/sap/automation/tools-configuration,Configure external tools,Configure external tools for SAP Deployment Automation Framework,,Learn how to configure Visual Studio Code to connect to the deployer virtual machine for SAP Deployment Automation Framework.,This article describes how to configure Visual Studio Code to connect to the deployer virtual machine (VM) for SAP Deployment Automation Framework.,2026-04-17T22:08:00.000Z,how-to,,0.4,False,"How-to for configuring VS Code to connect to a VM; generally generic tooling steps, unlikely to contain product-specific parameter tables or limits beyond standard SSH/VS Code usage.",unchanged +https://learn.microsoft.com/en-us/azure/sap/automation/troubleshooting,Troubleshoot the automation framework,Troubleshoot SAP Deployment Automation Framework,Troubleshoot SAP Deployment Automation Framework issues on Azure,"Learn how to troubleshoot common issues with SAP Deployment Automation Framework, including deployment, configuration, and software download problems.",SAP Deployment Automation Framework (SDAF) has many moving parts. 
This article helps you troubleshoot issues that you might encounter.,2026-04-15T22:11:00.000Z,troubleshooting-general,troubleshooting,0.8,True,"Explicit troubleshooting article for SDAF; such content typically maps deployment/config/software download issues to causes and resolutions, often with specific error messages or logs, matching the troubleshooting criteria.",unchanged
+https://learn.microsoft.com/en-us/azure/sap/automation/tutorial,Deployment hands-on lab,Deploy SAP Deployment Automation Framework,,"Learn how to deploy SAP Deployment Automation Framework for enterprise scale on Azure, including a control plane, workload zone, and SAP system infrastructure.","This article shows you how to perform deployments by using SAP Deployment Automation Framework. This example uses Azure Cloud Shell to deploy the control plane infrastructure. The deployer virtual machine (VM) creates the remaining infrastructure and SAP HANA configurations. There are three main steps of an SAP deployment on Azure with the automation framework: Prepare the region. You deploy components to support the SAP automation framework in a specified Azure region. In this step, you: Prepare",2026-04-13T08:00:00.000Z,how-to,,0.45,False,"Tutorial for deploying the framework; likely a procedural guide using Cloud Shell and a VM. Summary does not indicate detailed configuration references, limits, or troubleshooting mappings beyond generic deployment steps.",unchanged
+https://learn.microsoft.com/en-us/azure/sap/automation/upgrading,Upgrading the automation framework,Upgrade SAP Deployment Automation Framework,,Learn how to upgrade SAP Deployment Automation Framework.,"SAP Deployment Automation Framework is updated regularly. You can upgrade the pipeline definitions, the control plane, and the workload zone to get the latest features and fixes. 
This article describes how to upgrade each component.",2026-04-16T22:31:00.000Z,how-to,,0.45,False,"Upgrade guide for the framework; summary suggests procedural steps rather than detailed configuration matrices, limits, or troubleshooting mappings. Without evidence of specific version constraints or config tables, it doesn’t clearly meet expert-knowledge criteria.",unchanged https://learn.microsoft.com/en-us/azure/sap/business-process-solutions/about-business-process-solutions,Overview,Introduction to Business Process Solutions (Public Preview),,Learn how to use Business Process Solutions to enable organizations in unifying business data across different systems and functional areas.,"Business Process Solutions accelerates enterprise data analytics and derisks AI adoption by providing prebuilt resources, including data models, transformations, and business templates. This article introduces the solution and outlines how it enables organizations to unify business data across various systems and functional areas. In AI driven enterprises, access to reliable business information is critical for success. Whether AI augmented or fully autonomous, agentic solutions require trusted ",2025-11-24T12:10:00.000Z,overview,,0.3,False,Introduction to Business Process Solutions; high-level description of purpose and capabilities without detailed configuration or limits.,unchanged https://learn.microsoft.com/en-us/azure/sap/business-process-solutions/attestation,Attestation Document,Attestation Document,,This article provides attestation document for Business Process Solutions.,This document provides details on how our Business Process Solutions workload complies with the requirements for publishing in the Microsoft Fabric Workload Hub. 
The information outlined here stays up-to-date and links to the Workload metadata manifest.,2026-04-08T11:11:00.000Z,overview,,0.2,False,"Attestation/compliance overview for a workload; description suggests high-level compliance and metadata information without specific limits, configuration parameters, error codes, or decision matrices that match any expert-knowledge sub-skill type.",unchanged https://learn.microsoft.com/en-us/azure/sap/business-process-solutions/business-templates,Business Templates,Templates for Business Process Solutions,Use Business Process Solutions templates for analytics and AI agents,Learn how Business Process Solutions templates support common business processes and how to utilize those resources to turn data models into practical insights.,"This article provides an overview of the suite of prebuilt Power BI reports and Copilot Studio Agents available in Business Process Solutions, highlighting their key features and capabilities. These templates are designed to support common business processes, offering immediate access to essential metrics and trends such as financial performance, sales effectiveness, and supplier evaluation. organizations can quickly use these resources to gain valuable insights, with the flexibility to refine a",2025-11-24T12:10:00.000Z,overview,decision-making,0.6,True,"Explains which templates (Power BI reports, Copilot agents) support which business processes and metrics, helping users choose templates for their scenarios.",unchanged @@ -65,31 +65,31 @@ https://learn.microsoft.com/en-us/azure/sap/business-process-solutions/run-extra https://learn.microsoft.com/en-us/azure/sap/business-process-solutions/troubleshooting,Troubleshooting common issues,Troubleshooting common issues,Troubleshoot common issues in SAP Business Process Solutions,Identify and track common Business Process Solutions issues and workarounds.,"Within the Business Process Solutions, we recognize that there are many moving parts. 
This article is intended to help you troubleshoot known issues that you can encounter. Use this article to quickly diagnose and remediate known issues. Each section summarizes symptoms, root cause, and the exact steps to fix and validate.",2026-02-23T12:43:00.000Z,troubleshooting,troubleshooting,0.9,True,"The article is explicitly organized around troubleshooting known issues, with sections for symptoms, root cause, and exact remediation steps. This matches the troubleshooting sub-skill definition (symptom → cause → solution mappings) and is product-specific expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/acss-backup-integration,Configure and monitor Backup for SAP system,Configure and monitor Azure Backup for SAP Virtual Instance (preview),,Learn how to configure and monitor Azure Backup status for your SAP system through the Virtual Instance for SAP solutions (VIS) resource in Azure Center for SAP solutions.,"Azure Center for SAP solutions lets you manage SAP workloads on Azure through a Virtual Instance for SAP solutions (VIS) resource. Protecting your SAP system requires configuring and monitoring backups for Central Services instances, application servers, database virtual machines, and HANA databases. This process can involve multiple separate steps. This article shows you how to configure Azure Backup and monitor backup status for your entire SAP system from the VIS resource in a single workflow.",2026-04-09T22:25:00.000Z,how-to,,0.3,False,"Primarily a workflow/tutorial for configuring Azure Backup via VIS for SAP systems. 
The summary does not indicate detailed parameter tables, limits, or product-specific error codes; it focuses on a single workflow rather than deep configuration or integration reference.",unchanged https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/compliance-bcdr-reliabilty,Reliability overview,Resiliency in Azure Center for SAP Solutions,Understand resiliency patterns in Azure Center for SAP solutions,Find out about reliability in Azure Center for SAP Solutions,"This article describes reliability support in Azure Center for SAP Solutions, and covers both regional resiliency with availability zones and cross-region resiliency with customer enabled disaster recovery. For a more detailed overview of reliability in Azure, see Azure reliability. Azure Center for SAP solutions is an end-to-end solution that enables you to create and run SAP systems as a unified workload on Azure and provides a more seamless foundation for innovation. You can take advantage of ",2023-05-15T23:10:00.000Z,overview,architecture-patterns,0.6,True,"Describes regional and cross-region resiliency approaches for ACSS with concrete guidance on when to use availability zones vs. customer-enabled DR, including trade-offs.",unchanged
-https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/compliance-cedr,Customer enabled disaster recovery,Configure customer-enabled disaster recovery in Azure Center for SAP solutions,Plan and configure DR for Azure Center for SAP solutions,Learn how to configure customer-enabled disaster recovery for Virtual Instance for SAP solutions resources in Azure Center for SAP solutions.,"Azure Center for SAP solutions is a zone-redundant service. The service might experience downtime because no paired region exists, and there's no Microsoft-initiated failover during a region outage. This article explains strategies to achieve cross-region resiliency for Virtual Instance for SAP solutions resources with customer-enabled disaster recovery (DR). 
Also, steps to follow when a region where your Virtual Instance for SAP solutions resource exists is down. You must configure disaster rec",2026-04-16T22:31:00.000Z,how-to,decision-making,0.65,True,"Describes customer-enabled disaster recovery strategies and steps when a region hosting a Virtual Instance for SAP solutions is down. Likely includes product-specific DR options, cross-region resiliency guidance, and scenario-based recommendations for when and how to configure DR, which aligns with decision-making criteria.",updated
-https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/deploy-s4hana,Deploy S/4HANA infrastructure,Deploy an S/4HANA infrastructure,Deploy S/4HANA infrastructure with Azure Center for SAP,"Learn how to deploy S/4HANA infrastructure with Azure Center for SAP solutions through the Azure portal with High Availability (HA), non-HA, and single-server configurations.",This article describes how to deploy S/4HANA infrastructure in Azure Center for SAP solutions. There are three deployment options:,2026-03-11T17:32:00.000Z,how-to,deployment,0.67,True,"Describes deploying S/4HANA infrastructure with multiple HA/non-HA/single-server options; such guidance typically includes SKU/VM choices, HA topology options, and deployment-time parameters and constraints specific to Azure Center for SAP solutions, which are deployment-focused expert details.",unchanged
+https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/compliance-cedr,Customer enabled disaster recovery,Configure customer-enabled disaster recovery in Azure Center for SAP solutions,Plan and configure DR for Azure Center for SAP solutions,Learn how to configure customer-enabled disaster recovery for Virtual Instance for SAP solutions resources in Azure Center for SAP solutions.,"Azure Center for SAP solutions is a zone-redundant service. The service might experience downtime because no paired region exists, and there's no Microsoft-initiated failover during a region outage. 
This article explains strategies to achieve cross-region resiliency for Virtual Instance for SAP solutions resources with customer-enabled disaster recovery (DR). Also, steps to follow when a region where your Virtual Instance for SAP solutions resource exists is down. You must configure disaster rec",2026-04-16T22:31:00.000Z,how-to,decision-making,0.65,True,"Describes customer-enabled disaster recovery strategies and steps when a region hosting a Virtual Instance for SAP solutions is down. Likely includes product-specific DR options, cross-region resiliency guidance, and scenario-based recommendations for when and how to configure DR, which aligns with decision-making criteria.",unchanged
+https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/deploy-s4hana,Deploy S/4HANA infrastructure,Deploy an S/4HANA infrastructure,Deploy S/4HANA infrastructure with Azure Center for SAP solutions,"Learn how to deploy S/4HANA infrastructure with Azure Center for SAP solutions through the Azure portal with High Availability (HA), non-HA, and single-server configurations.",This article describes how to deploy S/4HANA infrastructure in Azure Center for SAP solutions. There are three deployment options:,2026-04-22T08:00:00.000Z,how-to,deployment,0.7,True,"Describes deploying S/4HANA infrastructure with HA, non-HA, and single-server options. Likely includes deployment option matrices, required resource types, and constraints per configuration. 
This is production deployment guidance for a specific service, fitting deployment rather than generic how-to.",updated https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/faq,FAQ,Azure Center for SAP solutions FAQ,,Get answers to common questions about Azure Center for SAP solutions and Virtual Instance for SAP solutions (VIS) resources.,This article answers commonly asked questions about Azure Center for SAP Solutions (ACSS).,2026-03-16T17:13:00Z,faq,,0.2,False,"FAQ content about Azure Center for SAP solutions appears to be general Q&A and conceptual clarification rather than detailed limits, configuration tables, error-code-based troubleshooting, or decision matrices with quantified trade-offs. No strong evidence of product-specific numeric limits, configuration parameter tables, or structured troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/get-quality-checks-insights,Get quality checks and insights,View quality checks and insights for a Virtual Instance for SAP solutions,Use Quality Insights checks for SAP on Azure,Learn how to use quality checks and insights for a Virtual Instance for SAP solutions (VIS) resource in Azure Center for SAP solutions.,"Azure Center for SAP solutions includes a Quality Insights workbook that runs more than 100 quality checks on your Virtual Instance for SAP solutions (VIS) resources. These checks validate that your SAP system follows Azure and SAP best practices for reliability and performance. In this article, you use the Quality Insights workbook to review recommendations, virtual machine (VM) information, and configuration checks for your SAP system.",2026-03-19T22:26:00.000Z,how-to,best-practices,0.62,True,"Describes a Quality Insights workbook that runs 100+ checks and surfaces recommendations and configuration checks. 
While partially conceptual, it likely enumerates concrete checks, fields, and interpretations specific to Azure Center for SAP solutions, representing product-specific best-practice guidance not generally known.",unchanged https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/get-sap-installation-media,Get SAP installation media,Get SAP installation media,Prepare SAP installation media for Azure Center for SAP,"Learn how to download the necessary SAP media for installing the SAP software and upload it for use with Azure Center for SAP solutions. Note, this step is *optional*, media and BOM can be obtained wi","After you create the infrastructure for your new SAP system using Azure Center for SAP solutions, you need to install the SAP software on your SAP system. However, before you can do this installation, you need to get and upload the SAP installation media for use with Azure Center for SAP solutions. In this article, learn how to get the SAP software installation media using different methods and upload the SAP media to an Azure Storage account to prepare for installation.",2026-03-13T22:12:00.000Z,how-to,configuration,0.64,True,"Shows how to obtain SAP media and upload it to Azure Storage for use with Azure Center for SAP solutions, which will include storage account/container configuration, naming conventions, and possibly required folder/filename structures that are specific configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/install-software,Install SAP software,Install SAP software,Install SAP software on ACSS-managed systems,"Learn how to install SAP software on an SAP system that you created using Azure Center for SAP solutions. You can either install the SAP software with Azure Center for SAP solutions, or install the s","After you've created infrastructure for your new SAP system using Azure Center for SAP solutions, you need to install the SAP software. 
In this how-to guide, you'll learn two ways to install the SAP software for your system. Choose whichever method is appropriate for your use case. You can either:",2026-02-02T08:00:00.000Z,how-to,deployment,0.7,True,Explains two concrete installation modes (inside ACSS vs. external detection) with product-specific steps and parameters for VIS-based systems.,unchanged
-https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/manage-virtual-instance,Manage Virtual Instance for SAP solutions,Manage a Virtual Instance for SAP solutions,,Learn how to view and manage a Virtual Instance for SAP solutions (VIS) resource in Azure Center for SAP solutions by using the Azure portal.,"Azure Center for SAP solutions provides a centralized management experience for SAP systems running on Azure. A Virtual Instance for SAP solutions (VIS) is the resource that represents your SAP system within the service. Once an SAP system is set up on Azure, maintaining consistent oversight of the infrastructure is essential to manage performance and reliability. The VIS provides this central management point in the Azure portal. This article shows you how to monitor, configure, and manage a VIS",2026-03-18T22:20:00.000Z,how-to,,0.2,False,"Page appears to be a how-to for viewing and managing a Virtual Instance for SAP solutions in the Azure portal. Description suggests general management/monitoring steps rather than detailed configuration tables, limits, error-code mappings, or security role definitions. 
No clear indication of numeric limits, parameter tables, or troubleshooting matrices, so it likely does not contain the kind of expert-only reference data required for any sub-skill type.",unchanged
+https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/manage-virtual-instance,Manage Virtual Instance for SAP solutions,Manage a Virtual Instance for SAP solutions,,Learn how to view and manage a Virtual Instance for SAP solutions (VIS) resource in Azure Center for SAP solutions by using the Azure portal.,"Azure Center for SAP solutions provides a centralized management experience for SAP systems running on Azure. A Virtual Instance for SAP solutions (VIS) is the resource that represents your SAP system within the service. Once an SAP system is set up on Azure, maintaining consistent oversight of the infrastructure is essential to manage performance and reliability. The VIS provides this central management point in the Azure portal. This article shows you how to monitor, configure, and manage a VIS",2026-04-22T08:00:00.000Z,how-to,,0.3,False,"Appears to be a how-to/manage article for Virtual Instance for SAP solutions in Azure portal without clear indication of numeric limits, detailed configuration parameter tables, or troubleshooting error mappings. Likely procedural guidance rather than expert-only reference content.",updated https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/manage-with-azure-rbac,Manage resources with Azure RBAC,Azure RBAC for Azure Center for SAP solutions resources,Configure Azure RBAC for Azure Center for SAP solutions,"Learn how Azure role-based access control (Azure RBAC) manages access to SAP workloads in Azure Center for SAP solutions, including built-in roles and minimum permissions.",Azure role-based access control (RBAC) lets you separate duties within your team and grant only the permissions users need to deploy and manage SAP systems in Azure Center for SAP solutions. 
Users or user-assigned managed identities require specific roles or minimum permissions for each capability. This article lists the built-in roles and minimum permissions that users and user-assigned managed identities need for each Azure Center for SAP solutions capability.,2026-04-08T22:12:00.000Z,concept-article,security,0.86,True,"Lists specific Azure RBAC built-in roles and minimum permissions required for each Azure Center for SAP solutions capability. Contains product-specific role names, scopes, and permission mappings that qualify as expert security configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/monitor-portal,Monitor SAP system in Azure portal,Monitor SAP system from the Azure portal,Monitor SAP systems with Azure Center for SAP,"Learn how to monitor the health and status of your SAP system, along with important SAP metrics, using the Azure Center for SAP solutions within the Azure portal.","Azure Center for SAP solutions lets you deploy and manage SAP systems as a unified workload on Azure. When you run SAP workloads in Azure, you need visibility into system health and infrastructure performance to identify degradation and resolve issues before they affect your business processes. In this article, you check the health and status of your SAP system and its instances, analyze infrastructure metrics such as CPU and IOPS, and configure Azure Monitor for SAP solutions for deeper platform",2026-03-31T22:19:00.000Z,how-to,configuration,0.64,True,"Covers monitoring health, status, and metrics plus configuring Azure Monitor for SAP solutions. 
Likely includes specific metric names, configuration steps, and portal/agent settings unique to this integration, which are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/overview,Overview,Azure Center for SAP solutions,,Azure Center for SAP solutions is an Azure offering that makes SAP a top-level workload on Azure. You can use Azure Center for SAP solutions to deploy or manage SAP systems on Azure seamlessly.,"Azure Center for SAP solutions is an Azure offering that makes SAP a top-level workload on Azure. Azure Center for SAP solutions is an end-to-end solution that enables you to create and run SAP systems as a unified workload on Azure and provides a more seamless foundation for innovation. You can take advantage of the management capabilities for both new and existing Azure-based SAP systems. The guided deployment experience takes care of creating the necessary compute, storage and networking compo",2026-02-02T08:00:00.000Z,overview,,0.3,False,"High-level overview of Azure Center for SAP solutions without detailed configuration tables, limits, or decision matrices.",unchanged
-https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/prepare-network,Prepare network for deployment,Prepare a network for infrastructure deployment,Configure Azure networking and security for S/4HANA,Learn how to prepare a virtual network for an S/4HANA infrastructure deployment with Azure Center for SAP solutions.,"Azure Center for SAP solutions lets you deploy and manage SAP systems on Azure. When you deploy S/4HANA infrastructure through the service, you need a virtual network that provides outbound connectivity and allows communication between application and database subnets. Without a properly configured network, the infrastructure deployment and SAP software installation can fail. 
In this article, you create and configure a virtual network, set up connectivity and security rules, and allow list the en",2026-03-18T06:15:00.000Z,how-to,security,0.7,True,"The article focuses on preparing a virtual network for S/4HANA deployment, including connectivity and security rules and allow-listing endpoints. This implies product-specific network security configuration (NSG rules, outbound access, allowlisted FQDNs/IPs) that go beyond generic concepts and match the security sub-skill definition.",unchanged -https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quick-stop-start-sap-cli,Start and stop SAP system - Azure CLI,Start or Stop SAP Systems with the Azure CLI,Start and stop SAP systems via Azure CLI in Azure Center,Learn how to start or stop an SAP system in Azure Center for SAP solutions by using the Azure CLI.,"Use the Azure CLI to create and manage Azure resources from the command line or in scripts. In this guide, you learn to start and stop your SAP systems through the Virtual Instance for SAP solutions (VIS) resource in Azure Center for SAP solutions. You can start and stop SAP systems by using the Azure CLI or by using Azure PowerShell. This article shows you the steps to use the Azure CLI. Through the Azure CLI, you can start and stop:",2026-03-27T06:14:00.000Z,how-to,configuration,0.7,True,"Parallel to [7] but with Azure CLI. 
It will specify concrete CLI commands and parameters for controlling VIS-managed SAP systems, which is detailed, product-specific configuration knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quick-stop-start-sap-powershell,Start and stop SAP system - Azure PowerShell,Start or Stop SAP Systems with Azure PowerShell,Start and stop SAP systems via Azure PowerShell in Azure Center,Learn how to start or stop an SAP system in Azure Center for SAP solutions by using an Azure PowerShell module.,"Use the Az PowerShell module to create and manage Azure resources from the command line or in scripts. In this guide, you learn to start and stop your SAP systems through the Virtual Instance for SAP solutions (VIS) resource in Azure Center for SAP solutions. You can start and stop SAP systems by using the Azure CLI or by using Azure PowerShell. This article shows you the steps to use Azure PowerShell. Through the Azure PowerShell module, you can start and stop:",2026-03-27T06:14:00.000Z,how-to,configuration,0.7,True,"Describes using Az PowerShell to start/stop SAP systems through VIS. This implies specific cmdlets, resource identifiers, and parameter usage unique to Azure Center for SAP solutions, which is configuration-level expert knowledge rather than generic scripting guidance.",unchanged
+https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/prepare-network,Prepare network for deployment,Prepare a network for infrastructure deployment,Prepare Azure virtual network for S/4HANA deployment,Learn how to prepare a virtual network for an S/4HANA infrastructure deployment with Azure Center for SAP solutions.,"Azure Center for SAP solutions lets you deploy and manage SAP systems on Azure. When you deploy S/4HANA infrastructure through the service, you need a virtual network that provides outbound connectivity and allows communication between application and database subnets. 
Without a properly configured network, the infrastructure deployment and SAP software installation can fail. In this article, you create and configure a virtual network, set up connectivity and security rules, and allow list the en",2026-04-22T08:00:00.000Z,how-to,configuration,0.8,True,"Network preparation guide for S/4HANA via Azure Center for SAP solutions. Likely includes required subnets, address ranges, NSG rules, outbound connectivity requirements, and allowlisted endpoints. These are concrete configuration parameters and security rules specific to this service, matching the configuration sub-skill.",updated +https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quick-stop-start-sap-cli,Start and stop SAP system - Azure CLI,Start or stop SAP systems by using the Azure CLI,Control SAP systems via Azure CLI VIS commands,Learn how to start or stop an SAP system in Azure Center for SAP solutions by using the Azure CLI.,This article shows you how to start and stop SAP systems through the Virtual Instance for SAP solutions (VIS) resource in Azure Center for SAP solutions by using the Azure CLI. You can start and stop:,2026-04-22T06:17:00.000Z,how-to,integrations,0.65,True,"Similar to index 3 but with Azure CLI. Contains specific CLI commands, flags, and parameter names for interacting with VIS resources to start/stop SAP systems. These are product-specific integration patterns rather than generic CLI usage.",updated +https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quick-stop-start-sap-powershell,Start and stop SAP system - Azure PowerShell,Start or stop SAP systems by using Azure PowerShell,Control SAP systems via Azure PowerShell VIS commands,Learn how to start or stop an SAP system in Azure Center for SAP solutions by using Azure PowerShell.,This article shows you how to start and stop SAP systems through the Virtual Instance for SAP solutions (VIS) resource in Azure Center for SAP solutions by using Azure PowerShell. 
You can start and stop:,2026-04-22T06:17:00.000Z,how-to,integrations,0.65,True,"Shows how to start/stop SAP systems through the VIS resource using Azure PowerShell. Likely includes specific cmdlet names, parameters, and required values unique to Azure Center for SAP solutions integration, which fits integrations & coding patterns more than generic configuration.",updated https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-create-distributed-non-high-availability,Deploy S/4 HANA infrastructure - Azure PowerShell,Quickstart - Deploy distributed non-HA SAP system with PowerShell,,Learn how to create a distributed non-HA SAP system in Azure Center for SAP solutions by using the Azure PowerShell module.,"In this quickstart, you deploy infrastructure for an SAP system with a non-high-availability (non-HA) distributed architecture on Azure. You use the Azure PowerShell module to create a Virtual Instance for SAP solutions (VIS) resource through Azure Center for SAP solutions. After you deploy infrastructure and install SAP software, you can use visualization, management, and monitoring capabilities through the Azure portal. For example, you can:",2026-04-08T22:12:00.000Z,quickstart,,0.3,False,"Quickstart for deploying a non-HA SAP system via Azure PowerShell. 
Primarily a procedural tutorial without detailed configuration tables, limits, or product-specific troubleshooting/decision matrices beyond what an LLM would generally know.",unchanged -https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-create-high-availability-namecustom,Deploy S/4 HANA infrastructure - Azure CLI,Quickstart - Create a distributed highly available SAP system with Azure Center for SAP solutions with Azure CLI,Create distributed HA SAP system with ACSS using Azure CLI,Learn how to create a distributed highly available SAP system in Azure Center for SAP solutions through Azure CLI.,"TheAzure CLIis used to create and manage Azure resources from the command line or in scripts. Azure Center for SAP solutionsenables you to deploy and manage SAP systems on Azure. This article shows you how to use Azure CLI to deploy infrastructure for an SAP system with highly available (HA) Three-tier Distributed architecture. You also see how to customize resource names for the Azure infrastructure that gets deployed. Alternatively, you can deploy SAP systems with customized using theAzure Pow",2023-05-15T23:10:00.000Z,quickstart,deployment,0.7,True,"CLI-based quickstart with detailed commands and options to deploy HA three-tier SAP infrastructure and customize resource names, specific to ACSS.",unchanged -https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-install-distributed-non-high-availability,Install SAP software - Azure PowerShell,Install SAP software for a distributed non-HA system by using Azure PowerShell,,Learn how to install SAP software for a distributed non-high-availability SAP system in Azure Center for SAP solutions by using Azure PowerShell.,"Azure Center for SAP solutionsenables you to deploy and manage SAP systems on Azure. This quickstart shows you how to install SAP software for infrastructure deployed for an SAP system. 
In theprevious step, you created infrastructure for an SAP system with a non-highly available distributed architecture on Azure by using Azure Center for SAP solutions. After youdeploy infrastructureand install SAP software, you can manage and monitor the system through theVirtual Instance for SAP solutions (VIS)",2026-04-16T22:31:00.000Z,quickstart,,0.3,False,"Quickstart for installing SAP software on Azure using PowerShell; primarily a step-by-step deployment/tutorial flow without configuration tables, limits, error-code mappings, or product-specific decision matrices.",updated -https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-install-high-availability-namecustom-cli,Install SAP software - Azure CLI,Install SAP software for a distributed HA system with custom resource names by using Azure CLI,,Learn how to install SAP software for a distributed high-availability SAP system in Azure Center for SAP solutions with custom resource names.,"Azure Center for SAP solutionsenables you to deploy and manage SAP systems on Azure. This quickstart shows you how to install SAP software for infrastructure deployed for an SAP system. In theprevious step, you created infrastructure for an SAP system with a highly available distributed architecture. You used Azure Center for SAP solutions with Azure CLI and provided customized resource names for the deployed Azure resources. 
After you deploy infrastructure and install SAP software, you can mana",2026-04-16T22:31:00.000Z,quickstart,,0.3,False,"Quickstart for installing a distributed HA SAP system with custom resource names via Azure CLI; focuses on procedural steps rather than limits, configuration matrices, troubleshooting mappings, or quantified best practices.",updated +https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-create-high-availability-namecustom,Deploy S/4 HANA infrastructure - Azure CLI,Create infrastructure for a distributed HA SAP system with custom resource names by using Azure CLI,Create HA SAP infrastructure with custom Azure resource names,Learn how to create infrastructure for a distributed highly available (HA) SAP system with custom resource names in Azure Center for SAP solutions by using Azure CLI.,"In this quickstart, you use Azure CLI to deploy infrastructure for a distributed highly available (HA) SAP system with customized resource names in Azure Center for SAP solutions. Alternatively, you can use the Azure PowerShell module. After you deploy infrastructure and install SAP software, you can manage and monitor the system through the Virtual Instance for SAP solutions (VIS) resource. For example, you can:",2026-04-22T17:34:00.000Z,quickstart,configuration,0.7,True,"Quickstart using Azure CLI to deploy distributed HA SAP infrastructure with custom resource names. 
This typically includes specific CLI commands, parameter names, and required values/structures for Azure Center for SAP solutions and VIS resources, which are product-specific configuration details rather than generic deployment concepts.",updated +https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-install-distributed-non-high-availability,Install SAP software - Azure PowerShell,Install SAP software for a distributed non-HA system by using Azure PowerShell,,Learn how to install SAP software for a distributed non-high-availability SAP system in Azure Center for SAP solutions by using Azure PowerShell.,"Azure Center for SAP solutions enables you to deploy and manage SAP systems on Azure. This quickstart shows you how to install SAP software for infrastructure deployed for an SAP system. In the previous step, you created infrastructure for an SAP system with a non-highly available distributed architecture on Azure by using Azure Center for SAP solutions. After you deploy infrastructure and install SAP software, you can manage and monitor the system through the Virtual Instance for SAP solutions (VIS)",2026-04-16T22:31:00.000Z,quickstart,,0.3,False,"Quickstart for installing SAP software on Azure using PowerShell; primarily a step-by-step deployment/tutorial flow without configuration tables, limits, error-code mappings, or product-specific decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-install-high-availability-namecustom-cli,Install SAP software - Azure CLI,Install SAP software for a distributed HA system with custom resource names by using Azure CLI,,Learn how to install SAP software for a distributed high-availability SAP system in Azure Center for SAP solutions with custom resource names.,"Azure Center for SAP solutions enables you to deploy and manage SAP systems on Azure. This quickstart shows you how to install SAP software for infrastructure deployed for an SAP system. 
In the previous step, you created infrastructure for an SAP system with a highly available distributed architecture. You used Azure Center for SAP solutions with Azure CLI and provided customized resource names for the deployed Azure resources. After you deploy infrastructure and install SAP software, you can mana",2026-04-16T22:31:00.000Z,quickstart,,0.3,False,"Quickstart for installing a distributed HA SAP system with custom resource names via Azure CLI; focuses on procedural steps rather than limits, configuration matrices, troubleshooting mappings, or quantified best practices.",unchanged https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-register-system-cli,Register an existing SAP system - Azure CLI,Register an Existing SAP System with the Azure CLI,Register existing SAP systems in Azure Center using Azure CLI,Learn how to register an existing SAP system in Azure Center for SAP solutions through the Azure CLI.,"Use the Azure CLI to create and manage Azure resources from the command line or in scripts. With Azure Center for SAP solutions, you can deploy and manage SAP systems on Azure. This article shows you how to register an existing SAP system that runs on Azure with Azure Center for SAP solutions. We use the Azure CLI in this article. Alternatively, you can register systems by using Azure PowerShell or the Azure portal. After you register an SAP system, you can use its visualization, management, and ",2026-03-27T06:14:00.000Z,how-to,configuration,0.7,True,"Similar to [5] but via Azure CLI. 
It will contain specific CLI commands, flags, and parameter requirements for creating VIS resources and registering SAP systems, which is detailed configuration knowledge rather than generic CLI usage.",unchanged https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-register-system-powershell,Register an existing SAP system - Azure PowerShell,Register an Existing SAP System with Azure PowerShell,Register existing SAP systems in Azure Center using PowerShell,Learn how to register an existing SAP system in Azure Center for SAP solutions through an Azure PowerShell module.,"Use the Azure PowerShell module to register an existing SAP system with Azure Center for SAP solutions. After you register the system, Azure Center for SAP solutions creates a Virtual Instance for SAP solutions (VIS) resource that provides visualization, management, and monitoring capabilities through the Azure portal. You can also register systems by using the Azure CLI or the Azure portal.",2026-03-27T06:14:00.000Z,how-to,configuration,0.7,True,"Covers using an Azure PowerShell module to register an existing SAP system and create a VIS resource. This will include specific cmdlets, parameters, and required values unique to Azure Center for SAP solutions, fitting the configuration sub-skill (command/parameter-level configuration).",unchanged -https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/register-existing-system,Register existing SAP system,Register an existing SAP system,Register existing SAP systems in Azure Center via portal,"Learn how to register an existing SAP system in Azure Center for SAP solutions through the Azure portal. You can visualize, manage, and monitor your existing SAP system through Azure Center for SAP so","In this how-to guide, you learn how to register an existing SAP system withAzure Center for SAP solutions. 
After you register an SAP system with Azure Center for SAP solutions, you can use its visualization, management, and monitoring capabilities through the Azure portal. For example, you can: When you register a system with Azure Center for SAP solutions, the following resources are created in your Subscription: Note You can customize the names of theManaged resource groupand theStorage accoun",2026-03-11T22:19:00.000Z,how-to,configuration,0.7,True,"Portal-based registration process creates specific managed resource groups, storage accounts, and other resources; the article will detail required fields, resource naming, and configuration options unique to Azure Center for SAP solutions, which is expert configuration guidance.",unchanged -https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/soft-stop-sap-and-hana-database,Soft stop SAP instances and HANA database,Soft stop SAP instances and HANA database in Azure Center for SAP solutions,,Learn how to soft stop an SAP system and HANA database through the Virtual Instance for SAP solutions resource in Azure Center for SAP solutions.,"You can soft stop your SAP systems, individual instances, and HANA database through the Virtual Instance for SAP solutions (VIS) resource in Azure Center for SAP solutions. A soft stop drains existing user connections and batch processes before stopping the system. 
By usingAzure PowerShell,Azure CLI, andREST APIinterfaces, you can:",2026-04-16T17:19:00.000Z,how-to,,0.3,False,"Appears to be a how-to for soft stopping SAP and HANA via VIS using PowerShell/CLI/REST; summary does not indicate product-specific limits, config tables, error codes, or other expert-only details.",updated +https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/register-existing-system,Register existing SAP system,Register an existing SAP system,Register existing SAP systems in Azure Center for SAP solutions,"Learn how to register an existing SAP system in Azure Center for SAP solutions through the Azure portal. You can visualize, manage, and monitor your existing SAP system through Azure Center for SAP so","In this how-to guide, you learn how to register an existing SAP system with Azure Center for SAP solutions. After you register an SAP system with Azure Center for SAP solutions, you can use its visualization, management, and monitoring capabilities through the Azure portal. For example, you can: When you register a system with Azure Center for SAP solutions, the following resources are created in your Subscription: Note You can customize the names of the Managed resource group and the Storage accoun",2026-04-22T08:00:00.000Z,how-to,configuration,0.75,True,"How-to for registering an existing SAP system, including what Azure resources (managed resource group, storage account, VIS) are created and how they’re named/customized. 
This involves specific resource types, naming patterns, and configuration parameters unique to this product, aligning with configuration.",updated +https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/soft-stop-sap-and-hana-database,Soft stop SAP instances and HANA database,Soft stop SAP instances and HANA database in Azure Center for SAP solutions,,Learn how to soft stop an SAP system and HANA database through the Virtual Instance for SAP solutions resource in Azure Center for SAP solutions.,"You can soft stop your SAP systems, individual instances, and HANA database through the Virtual Instance for SAP solutions (VIS) resource in Azure Center for SAP solutions. A soft stop drains existing user connections and batch processes before stopping the system. By using Azure PowerShell, Azure CLI, and REST API interfaces, you can:",2026-04-16T17:19:00.000Z,how-to,,0.3,False,"Appears to be a how-to for soft stopping SAP and HANA via VIS using PowerShell/CLI/REST; summary does not indicate product-specific limits, config tables, error codes, or other expert-only details.",unchanged https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/start-stop-sap-systems,Start and stop SAP systems,"Start and stop SAP systems, instances, and HANA database",Control SAP system lifecycle via Azure VIS,"Learn how to start or stop an SAP system, specific instances, and HANA database through the Virtual Instance for SAP solutions resource in Azure Center for SAP solutions.","In this article, you learn to start and stop your SAP systems through the Virtual Instance for SAP solutions (VIS) resource in Azure Center for SAP solutions. Through the Azure portal, Azure PowerShell, Azure CLI, and REST API interfaces, you can start and stop:",2026-03-26T06:13:00.000Z,how-to,configuration,0.68,True,"Operational how-to for starting/stopping SAP systems, instances, and HANA DB via Azure Center for SAP solutions using portal/PowerShell/CLI/REST. 
Likely includes specific API operations, parameter names, and required settings unique to the VIS resource, which are product-specific configuration details beyond generic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/stop-start-sap-and-underlying-vm,Start and stop SAP systems and VMs,"Start and stop SAP systems, instances, HANA database, and underlying VMs",Control SAP and VM lifecycle via VIS REST APIs,"Learn how to start and stop SAP systems, individual SAP instances, and HANA databases along with their underlying virtual machines. Use the Virtual Instance for SAP solutions (VIS) resource in Azure C","Azure Center for SAP solutions lets you manage SAP systems on Azure as a unified workload. When your SAP systems aren't actively in use, for example, outside business hours or during maintenance windows, you can stop them and deallocate the underlying virtual machines (VMs) to reduce costs. In this article, you use REST API calls to start and stop SAP application tiers, individual SAP instances, and HANA databases along with their VMs. Important The ability to start and stop VMs of an SAP system ",2026-04-09T22:25:00.000Z,how-to,integrations,0.68,True,"Page describes using the Virtual Instance for SAP solutions (VIS) resource and REST API calls to start/stop SAP systems, instances, HANA databases, and underlying VMs. 
This involves product-specific API operations, parameters, and behavior unique to Azure Center for SAP solutions, which qualifies as integrations & coding patterns rather than generic how-to content.",unchanged https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/tutorial-create-high-availability-name-custom,Deploy infrastructure for an SAP system with customized resource names,Create infrastructure for a distributed highly available SAP system with customized resource names using Azure CLI,,Learn how to use Azure CLI to deploy infrastructure for a distributed highly available SAP system with customized resource names in Azure Center for SAP solutions.,"Azure Center for SAP solutions is an Azure service that deploys and manages SAP systems on Azure. When Azure Center for SAP solutions creates infrastructure, it assigns default names to Azure resources, such as virtual machines (VMs), network interfaces, and load balancers. If your organization requires specific naming conventions for governance or easier resource identification, you can customize these names during deployment. In this article, you use Azure CLI to deploy infrastructure for a dis",2026-04-06T22:10:00.000Z,how-to,,0.35,False,"Tutorial on deploying HA SAP infrastructure with custom resource names using Azure CLI. Focuses on naming and deployment steps, not on deep configuration options, limits, or specialized troubleshooting/decision guidance.",unchanged -https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/view-cost-analysis,View cost analysis for SAP system,View cost analysis for SAP systems in Azure Center for SAP solutions,,Learn how to view the post-deployment cost of running an SAP system through the Virtual Instance for SAP solutions resource in Azure Center for SAP solutions.,"You can view the running cost of your SAP systems through theVirtual Instance for SAP solutions (VIS)resource inAzure Center for SAP solutions. 
After you deploy or register an SAP system as a VIS resource, you canview the cost of running that SAP system on the VIS resource's page. This feature shows the post-deployment running costs in the context of your SAP system. When you have Azure resources of multiple SAP systems in a single resource group, you no longer need to analyze the cost for each ",2026-04-16T06:12:00.000Z,how-to,,0.3,False,"Cost analysis viewing guide; likely UI navigation and conceptual usage of VIS cost view without numeric limits, config matrices, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/view-cost-analysis,View cost analysis for SAP system,View cost analysis for SAP systems in Azure Center for SAP solutions,,Learn how to view the post-deployment cost of running an SAP system through the Virtual Instance for SAP solutions resource in Azure Center for SAP solutions.,"You can view the running cost of your SAP systems through the Virtual Instance for SAP solutions (VIS) resource in Azure Center for SAP solutions. After you deploy or register an SAP system as a VIS resource, you can view the cost of running that SAP system on the VIS resource's page. This feature shows the post-deployment running costs in the context of your SAP system. 
When you have Azure resources of multiple SAP systems in a single resource group, you no longer need to analyze the cost for each ",2026-04-16T06:12:00.000Z,how-to,,0.3,False,"Cost analysis viewing guide; likely UI navigation and conceptual usage of VIS cost view without numeric limits, config matrices, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/sap/monitor/about-azure-monitor-sap-solutions,Overview,What is Azure Monitor for SAP solutions?,,"Learn about how to monitor your SAP resources on Azure for availability, performance, and operation.","When you have critical SAP applications and business processes that rely on Azure resources, you might want to monitor those resources for availability, performance, and operation. Azure Monitor for SAP solutions is an Azure-native monitoring product for SAP landscapes that run on Azure. It uses specific parts of the Azure Monitor infrastructure. You can use Azure Monitor for SAP solutions with both SAP on Azure virtual machines (VMs) and SAP on Azure Large Instances.",2026-02-02T08:00:00.000Z,overview,,0.3,False,Introductory 'What is' article for Azure Monitor for SAP solutions; mainly conceptual overview of purpose and scope without detailed configuration or limits.,unchanged https://learn.microsoft.com/en-us/azure/sap/monitor/data-reference,Data reference for Azure Monitor for SAP solutions,Data reference for Azure Monitor for SAP solutions,Reference of logs and metrics for Azure Monitor for SAP solutions,Important reference material needed when you monitor SAP on Azure.,This article provides a reference of log data collected to analyze the performance and availability of Azure Monitor for SAP solutions. 
See Monitor SAP on Azure for details on collecting and analyzing monitoring data for SAP on Azure.,2024-08-21T08:00:00.000Z,reference,configuration,0.8,True,"Data reference pages list specific table names, field names, and schemas for logs and metrics, which are detailed configuration/integration references unique to this service.",unchanged https://learn.microsoft.com/en-us/azure/sap/monitor/enable-dedicated-hosting-plan,Enable Dedicated Hosting Plan in Azure Monitor for SAP solutions,Enable the dedicated hosting plan for Azure Monitor for SAP solutions,,Switch the Azure function hosting plan in Azure Monitor for SAP solutions (AMS) from Elastic Premium to the dedicated plan to optimize cost and improve scaling efficiency.,"Azure Monitor for SAP solutions (AMS) uses an Azure function to collect monitoring data from your SAP systems. By default, this function runs on the Elastic Premium hosting plan. Switch to the dedicated hosting plan to reduce costs and improve scaling efficiency for high-volume monitoring workloads. This article shows you how to switch the hosting plan from Elastic Premium to dedicated and check the change in the Azure portal.",2026-04-09T22:25:00.000Z,how-to,,0.3,False,"How-to guide for switching Azure Monitor for SAP solutions from Elastic Premium to a dedicated hosting plan. 
Summary suggests step-by-step portal actions without exposing detailed configuration parameter tables, limits, or product-specific deployment matrices beyond generic hosting-plan change instructions.",unchanged @@ -132,7 +132,7 @@ https://learn.microsoft.com/en-us/azure/sap/workloads/exchange-online-integratio https://learn.microsoft.com/en-us/azure/sap/workloads/expose-sap-odata-to-power-query,Enable SAP Principal Propagation for live OData feeds with Power Query,Enable SAP Principal Propagation for live OData feeds with Power Query,Configure SAP Principal Propagation for live OData with Power Query,Learn how to configure SAP Principal Propagation for live OData feeds with Power Query.,"SAP Principal Propagation is a mechanism that maps a user's Microsoft Entra identity to their SAP back-end user, so that each data request carries the correct SAP authorization. When you consume SAP data through Power Query in Microsoft Excel or Power BI, you typically want live, refreshable OData feeds rather than static data exports. SAP Principal Propagation ensures that those live feeds respect per-user SAP authorizations. This article walks you through configuring Azure API Management, SAP Gate",2026-03-20T06:13:00.000Z,how-to,integrations,0.8,True,"Walks through configuring Azure API Management, SAP Gateway, and Power Query for SAP Principal Propagation. 
This is a concrete integration pattern with multiple products, likely including specific configuration parameters, headers, and identity mapping details that are product-specific integration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/sap/workloads/expose-sap-process-orchestration-on-azure,Expose SAP Process Orchestration on Azure securely,Expose SAP legacy middleware securely with Azure PaaS,Design secure Azure PaaS exposure for SAP Process Orchestration,"Explore Azure PaaS options for securely exposing SAP Process Orchestration, including Application Gateway, Azure Firewall, and API Management.","Enabling internal systems and external partners to interact with SAP back ends is a common requirement. Existing SAP landscapes often rely on the legacy middleware SAP Process Orchestration (PO) or Process Integration (PI) for their integration and transformation needs. For simplicity, this article uses the term SAP Process Orchestration to refer to both offerings. This article describes configuration options on Azure, with emphasis on internet-facing implementations. Note SAP mentions SAP Integration ",2026-03-26T06:13:00.000Z,concept-article,architecture-patterns,0.65,True,"Explores concrete Azure PaaS options (Application Gateway, Azure Firewall, API Management) for securely exposing SAP PO/PI, with configuration options and internet-facing patterns. 
This is architecture- and pattern-focused, mapping specific Azure services and topologies to SAP middleware exposure scenarios, which is product- and scenario-specific guidance.",unchanged https://learn.microsoft.com/en-us/azure/sap/workloads/extract-sap-data,Extract SAP data to Microsoft Fabric,Extract SAP data to Microsoft Fabric,Choose methods to extract SAP data into Microsoft Fabric,Learn how to extract SAP data to Microsoft Fabric.,"In this article, you gain a comprehensive understanding of the different data sources and tools available for SAP data extraction, and how to select the most appropriate option based on your analytical goals. The content covers the structure and purpose of each data layer within SAP systems. It also highlights the integration capabilities towards Microsoft Fabric, and the considerations for reliability, performance, and business alignment. Microsoft Fabric is a fully integrated, SaaS-based data p",2025-07-31T17:19:00.000Z,how-to,decision-making,0.7,True,"Compares multiple SAP data sources and extraction tools with criteria like performance, reliability, and data layer characteristics to guide method selection.",unchanged -https://learn.microsoft.com/en-us/azure/sap/workloads/get-started,Overview,Get started with SAP on Azure VMs,,Learn about SAP solutions that run on virtual machines (VMs) in Microsoft Azure,"When you use Microsoft Azure, you can reliably run your mission-critical SAP workloads and scenarios on a scalable, compliant, and enterprise-proven platform. You get the scalability, flexibility, and cost savings of Azure. With the expanded partnership between Microsoft and SAP, you can run SAP applications across development and test and production scenarios in Azure and be fully supported. 
From SAP NetWeaver to SAP S/4HANA, SAP BI on Linux to Windows, and SAP HANA to SQL Server, Oracle, Db2, ",2026-04-16T08:00:00.000Z,article,,0.2,False,"High-level getting-started overview for running SAP on Azure VMs without specific limits, configuration tables, error codes, or decision matrices; primarily conceptual and marketing-style guidance rather than detailed expert parameters.",updated +https://learn.microsoft.com/en-us/azure/sap/workloads/get-started,Overview,Get started with SAP on Azure VMs,,Learn about SAP solutions that run on virtual machines (VMs) in Microsoft Azure,"When you use Microsoft Azure, you can reliably run your mission-critical SAP workloads and scenarios on a scalable, compliant, and enterprise-proven platform. You get the scalability, flexibility, and cost savings of Azure. With the expanded partnership between Microsoft and SAP, you can run SAP applications across development and test and production scenarios in Azure and be fully supported. From SAP NetWeaver to SAP S/4HANA, SAP BI on Linux to Windows, and SAP HANA to SQL Server, Oracle, Db2, ",2026-04-16T08:00:00.000Z,article,,0.2,False,"High-level getting-started overview for running SAP on Azure VMs without specific limits, configuration tables, error codes, or decision matrices; primarily conceptual and marketing-style guidance rather than detailed expert parameters.",unchanged https://learn.microsoft.com/en-us/azure/sap/workloads/hana-get-started,Install SAP HANA on Azure VMs,Install SAP HANA on Azure virtual machines,Prepare and deploy SAP HANA on Azure VMs,Learn how to prepare your environment for SAP HANA installation on Azure virtual machines.,"This document helps in pointing you to the right resources for deploying HANA on Azure virtual machines (VMs), including documents that you need to check before installing SAP HANA on Azure VMs. The aim is to ensure you're able to perform the right steps to achieve a supported configuration of SAP HANA on Azure. 
Note This guide describes deployments of SAP HANA into Azure VMs. For information on how to deploy SAP HANA on HANA Large Instances, see How to install and configure SAP HANA (Large Insta",2026-02-10T06:10:00.000Z,concept-article,deployment,0.6,True,"Provides a deployment-focused guide pointing to required steps and documents to reach a supported SAP HANA-on-Azure configuration, which is product-specific deployment knowledge.",unchanged
https://learn.microsoft.com/en-us/azure/sap/workloads/hana-tiering-guidance,SAP HANA data tiering and archiving guidance,Managing SAP HANA data footprint for balancing cost and performance,Design SAP HANA data tiering and archiving on Azure,Learn about HANA database archiving strategies to manage data footprint and reduce costs.,"Data archiving has always been a critical decision-making item and is heavily used by many companies to organize their legacy data to achieve cost benefits, balancing the need to comply with regulations and retain data for a certain period with the cost of storing the data. Customers planning to migrate to S/4HANA or a HANA-based solution or reduce existing data storage footprint can leverage the various data tiering options supported on Azure. This article describes options on Azure with emphasi",2026-02-05T18:11:00.000Z,concept-article,decision-making,0.7,True,"Describes data tiering options and strategies on Azure for HANA, balancing cost and performance; used for migration and storage decisions.",unchanged
https://learn.microsoft.com/en-us/azure/sap/workloads/hana-vm-operations,Configure SAP HANA on Azure VMs,SAP HANA infrastructure configurations and operations on Azure,Configure and operate SAP HANA infrastructure on Azure VMs,Operations guide for SAP HANA systems that are deployed on Azure virtual machines.,"This document provides guidance for configuring Azure infrastructure and operating SAP HANA systems that are deployed on Azure native virtual machines (VMs). 
The document also includes configuration information for SAP HANA scale-out for the M128s VM SKU. This document isn't intended to replace the standard SAP documentation, which includes the following content:",2024-09-16T08:00:00.000Z,article,configuration,0.7,True,"Gives detailed Azure infrastructure configuration and operational guidance for SAP HANA, including scale-out specifics for certain VM SKUs, which are configuration-level expert details.",unchanged
@@ -141,14 +141,14 @@ https://learn.microsoft.com/en-us/azure/sap/workloads/hana-vm-operations-storage
https://learn.microsoft.com/en-us/azure/sap/workloads/hana-vm-premium-ssd-v1,Premium storage for HANA,SAP HANA Azure virtual machine Premium SSD configurations,Configure Premium SSD storage for SAP HANA VMs,Storage recommendations for HANA using premium storage.,"This document is about HANA storage configurations for Azure premium storage, the first version of Premium SSD, which was introduced as low latency storage for database management systems (DBMS) and other applications that need low latency storage. For general considerations around stripe sizes when using Logical Volume Manager (LVM), HANA data volume partitioning or other considerations that are independent of the particular storage type, check these two documents: For documentatio",2025-11-01T05:04:00.000Z,article,configuration,0.8,True,HANA-specific storage layouts and parameters for Premium SSD v1; includes product-specific configuration guidance.,unchanged
https://learn.microsoft.com/en-us/azure/sap/workloads/hana-vm-premium-ssd-v2,Premium SSD v2 for HANA,SAP HANA Azure virtual machine Premium SSD v2 configurations,Configure Premium SSD v2 for SAP HANA workloads,Storage recommendations for HANA using Premium SSD v2.,"Premium SSD v2 simplifies the way you build storage architectures and lets you tailor and adapt the storage capabilities to your workload. 
Premium SSD v2 allows you to configure and pay for capacity, IOPS (I/O operations per second), and throughput independent of each other. For general considerations around stripe sizes when using LVM, HANA data volume partitioning or other considerations that are independent of the particular storage type, check these two documents: The suggestions for th",2025-11-04T08:00:00.000Z,article,configuration,0.8,True,"Details on tuning capacity, IOPS, and throughput for HANA using Premium SSD v2; includes recommended values and patterns.",unchanged https://learn.microsoft.com/en-us/azure/sap/workloads/hana-vm-ultra-disk,Ultra Disk for HANA,SAP HANA Azure virtual machine Ultra Disk configurations,Configure Azure Ultra Disk for SAP HANA VMs,Storage recommendations for SAP HANA using Ultra Disk.,"This document is about HANA storage configurations for Azure Ultra Disk storage as it was introduced as ultra low latency storage for DBMS and other applications that need ultra low latency storage. For general considerations around stripe sizes when using LVM, HANA data volume partitioning or other considerations that are independent of the particular storage type, check these two documents:",2026-02-26T12:27:00.000Z,article,configuration,0.8,True,HANA storage configuration guidance for Ultra Disk; product-specific layout and performance recommendations.,unchanged -https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel,Install HA SAP NetWeaver,Azure Virtual Machines HA for SAP NW on RHEL,Configure HA SAP NetWeaver on RHEL in Azure VMs,This article describes Azure Virtual Machines high availability for SAP NetWeaver on Red Hat Enterprise Linux (RHEL).,"This article describes how to deploy virtual machines (VMs), configure the VMs, install the cluster framework, and install a highly available SAP NetWeaver 7.50 system. In the example configurations and installation commands, ASCS instance number 00, ERS instance number 02, and SAP System ID NW1 are used. 
The names of the resources (for example, VMs and virtual networks) in the example assume that you used the ASCS/SCS template with Resource Prefix NW1 to create the resources.",2026-04-16T08:00:00.000Z,article,configuration,0.72,True,"Covers deployment and configuration of VMs, cluster framework, and SAP NetWeaver HA on RHEL. These guides usually specify concrete Pacemaker/Corosync resource settings, Azure-specific cluster parameters, SAP instance configuration commands, and OS tuning values that are unique to SAP on Azure, which constitutes expert configuration knowledge.",updated
+https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel,Install HA SAP NetWeaver,Azure Virtual Machines HA for SAP NW on RHEL,Configure HA SAP NetWeaver on RHEL in Azure VMs,This article describes Azure Virtual Machines high availability for SAP NetWeaver on Red Hat Enterprise Linux (RHEL).,"This article describes how to deploy virtual machines (VMs), configure the VMs, install the cluster framework, and install a highly available SAP NetWeaver 7.50 system. In the example configurations and installation commands, ASCS instance number 00, ERS instance number 02, and SAP System ID NW1 are used. The names of the resources (for example, VMs and virtual networks) in the example assume that you used the ASCS/SCS template with Resource Prefix NW1 to create the resources.",2026-04-16T08:00:00.000Z,article,configuration,0.72,True,"Covers deployment and configuration of VMs, cluster framework, and SAP NetWeaver HA on RHEL. 
These guides usually specify concrete Pacemaker/Corosync resource settings, Azure-specific cluster parameters, SAP instance configuration commands, and OS tuning values that are unique to SAP on Azure, which constitutes expert configuration knowledge.",unchanged
https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-glusterfs,Install GlusterFS on Azure VMs for SAP NetWeaver,GlusterFS on Azure VMs on RHEL for SAP NetWeaver,Configure GlusterFS on Azure VMs for SAP HA,Learn about deploying GlusterFS on Azure VMs on Red Hat Enterprise Linux for SAP NetWeaver.,"This article describes how to deploy the virtual machines (VMs), configure the VMs, and install a GlusterFS cluster. The GlusterFS cluster is used to store the shared data of a highly available SAP system. This guide describes how to set up GlusterFS that is used by two SAP systems, NW1 and NW2. The names of the resources (for example, VMs and virtual networks) in the example assume that you used the SAP file server template with resource prefix glust. As documented in Red Hat Gluster Storage Life Cyc",2026-03-02T23:28:00.000Z,how-to,configuration,0.72,True,"The article provides step-by-step, product-specific configuration details for deploying and configuring a GlusterFS cluster on Azure VMs running RHEL for SAP NetWeaver, including concrete VM, network, storage, and cluster settings. 
It goes beyond generic concepts and includes exact configuration commands and patterns unique to this SAP-on-Azure scenario, fitting best under configuration rather than generic deployment guidance.",unchanged
https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-ibm-db2-luw,RHEL - IBM Db2 HA,Set up IBM Db2 HADR on Azure VMs on RHEL,Set up IBM Db2 HADR on RHEL Azure VMs,Learn how to deploy and establish high availability of IBM Db2 LUW on Azure VMs with RHEL.,"IBM Db2 for Linux, UNIX, and Windows (LUW) in high availability and disaster recovery (HADR) configuration consists of one node that runs a primary database instance and at least one node that runs a secondary database instance. Changes to the primary database instance are replicated to a secondary database instance synchronously or asynchronously, depending on your configuration. Note This article includes references to terms that Microsoft no longer uses. The contents in this article are updated",2026-02-25T23:33:00.000Z,concept-article,configuration,0.75,True,"Explains how to deploy IBM Db2 LUW in HADR configuration on RHEL Azure VMs, including primary/secondary roles and replication modes, which are product-specific HA configuration details.",unchanged
-https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-multi-sid,Deploy multi-SID clusters with Pacemaker,Azure VMs high availability for SAP NW on RHEL multi-SID,Configure multi-SID HA SAP NetWeaver cluster on RHEL,Learn how to deploy SAP NetWeaver highly available systems in a two node cluster on Azure VMs with Red Hat Enterprise Linux for SAP applications.,"This article describes how to deploy multiple SAP NetWeaver highly available systems (multi-SID) in a two node cluster on Azure VMs with Red Hat Enterprise Linux for SAP applications. In the example configurations, three SAP NetWeaver 7.50 systems are deployed in a single, two node high availability cluster. 
The SAP system SIDs are: The article doesn't cover the database layer and the deployment of the SAP NFS shares. The examples in this article use the Azure NetApp Files volume sapMSID for the NF",2026-04-16T08:00:00.000Z,how-to,configuration,0.76,True,"Multi-SID HA configuration for several SAP systems in a two-node RHEL cluster with Azure NetApp Files. Multi-SID clustering requires detailed, product-specific parameters (resource naming schemes, constraints, ANF volume layout, SAP profile and instance settings per SID) that go beyond generic HA concepts, so this is expert configuration guidance.",updated
-https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-netapp-files,Install HA SAP NetWeaver with Azure NetApp Files,Azure Virtual Machines HA for SAP NW on RHEL with Azure NetApp Files,Configure HA SAP NetWeaver on RHEL with Azure NetApp Files,Establish high availability (HA) for SAP NetWeaver on Azure Virtual Machines Red Hat Enterprise Linux (RHEL) with Azure NetApp Files.,"This article describes how to deploy virtual machines (VMs), configure the VMs, install the cluster framework, and install a highly available SAP NetWeaver 7.50 system by using Azure NetApp Files. In the example configurations and installation commands, the ASCS instance is number 00, the ERS instance is number 01, the Primary Application instance (PAS) is 02, and the Application instance (AAS) is 03. The SAP System ID QAS is used. The database layer isn't covered in detail in this article.",2026-04-16T08:00:00.000Z,article,configuration,0.74,True,"Describes detailed steps to configure VMs, cluster framework, and SAP instances for HA using Azure NetApp Files. 
Such SAP HA workload docs normally contain specific cluster resource definitions, ANF volume and mount configuration, SAP instance numbering conventions, and OS-level settings that are product- and scenario-specific, making this expert configuration guidance rather than a generic tutorial.",updated
-https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-nfs-azure-files,Install HA SAP NetWeaver with NFS on Azure Files,Azure VMs high availability for SAP NW on RHEL with NFS on Azure Files,Configure HA SAP NetWeaver on RHEL with Azure Files NFS,Establish high availability for SAP NetWeaver on Azure Virtual Machines Red Hat Enterprise Linux (RHEL) with NFS on Azure Files.,"This article describes how to deploy and configure virtual machines (VMs), install the cluster framework, and install a high-availability (HA) SAP NetWeaver system by using NFS on Azure Files. The example configurations use VMs that run on Red Hat Enterprise Linux (RHEL).",2026-04-16T08:00:00.000Z,tutorial,configuration,0.74,True,"High-availability implementation guide for SAP NetWeaver on RHEL with NFS on Azure Files. 
These SAP-on-Azure HA guides typically include product- and scenario-specific cluster configuration parameters (Pacemaker/Corosync resources, fencing/STONITH settings, mount options, Azure Files/NFS export options, systemd and SAP profile parameters) and exact command lines unique to this workload, which qualify as expert configuration knowledge rather than generic concepts.",updated
+https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-multi-sid,Deploy multi-SID clusters with Pacemaker,Azure VMs high availability for SAP NW on RHEL multi-SID,Configure multi-SID HA SAP NetWeaver cluster on RHEL,Learn how to deploy SAP NetWeaver highly available systems in a two node cluster on Azure VMs with Red Hat Enterprise Linux for SAP applications.,"This article describes how to deploy multiple SAP NetWeaver highly available systems (multi-SID) in a two node cluster on Azure VMs with Red Hat Enterprise Linux for SAP applications. In the example configurations, three SAP NetWeaver 7.50 systems are deployed in a single, two node high availability cluster. The SAP system SIDs are: The article doesn't cover the database layer and the deployment of the SAP NFS shares. The examples in this article use the Azure NetApp Files volume sapMSID for the NF",2026-04-16T08:00:00.000Z,how-to,configuration,0.76,True,"Multi-SID HA configuration for several SAP systems in a two-node RHEL cluster with Azure NetApp Files. 
Multi-SID clustering requires detailed, product-specific parameters (resource naming schemes, constraints, ANF volume layout, SAP profile and instance settings per SID) that go beyond generic HA concepts, so this is expert configuration guidance.",unchanged
+https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-netapp-files,Install HA SAP NetWeaver with Azure NetApp Files,Azure Virtual Machines HA for SAP NW on RHEL with Azure NetApp Files,Configure HA SAP NetWeaver on RHEL with Azure NetApp Files,Establish high availability (HA) for SAP NetWeaver on Azure Virtual Machines Red Hat Enterprise Linux (RHEL) with Azure NetApp Files.,"This article describes how to deploy virtual machines (VMs), configure the VMs, install the cluster framework, and install a highly available SAP NetWeaver 7.50 system by using Azure NetApp Files. In the example configurations and installation commands, the ASCS instance is number 00, the ERS instance is number 01, the Primary Application instance (PAS) is 02, and the Application instance (AAS) is 03. The SAP System ID QAS is used. The database layer isn't covered in detail in this article.",2026-04-16T08:00:00.000Z,article,configuration,0.74,True,"Describes detailed steps to configure VMs, cluster framework, and SAP instances for HA using Azure NetApp Files. 
Such SAP HA workload docs normally contain specific cluster resource definitions, ANF volume and mount configuration, SAP instance numbering conventions, and OS-level settings that are product- and scenario-specific, making this expert configuration guidance rather than a generic tutorial.",unchanged
+https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-nfs-azure-files,Install HA SAP NetWeaver with NFS on Azure Files,Azure VMs high availability for SAP NW on RHEL with NFS on Azure Files,Configure HA SAP NetWeaver on RHEL with Azure Files NFS,Establish high availability for SAP NetWeaver on Azure Virtual Machines Red Hat Enterprise Linux (RHEL) with NFS on Azure Files.,"This article describes how to deploy and configure virtual machines (VMs), install the cluster framework, and install a high-availability (HA) SAP NetWeaver system by using NFS on Azure Files. The example configurations use VMs that run on Red Hat Enterprise Linux (RHEL).",2026-04-16T08:00:00.000Z,tutorial,configuration,0.74,True,"High-availability implementation guide for SAP NetWeaver on RHEL with NFS on Azure Files. These SAP-on-Azure HA guides typically include product- and scenario-specific cluster configuration parameters (Pacemaker/Corosync resources, fencing/STONITH settings, mount options, Azure Files/NFS export options, systemd and SAP profile parameters) and exact command lines unique to this workload, which qualify as expert configuration knowledge rather than generic concepts.",unchanged
https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-pacemaker,Set up Pacemaker cluster,Set up Pacemaker on RHEL in Azure,Configure Pacemaker clusters on RHEL for Azure SAP,Learn how to set up Pacemaker on Red Hat Enterprise Linux (RHEL) in Azure.,"This article describes how to configure a basic Pacemaker cluster on Red Hat Enterprise Linux (RHEL). 
The instructions cover RHEL 7, RHEL 8, and RHEL 9.",2026-03-03T08:00:00.000Z,article,configuration,0.78,True,"The article provides step-by-step, product-specific configuration details for setting up a Pacemaker cluster on RHEL in Azure, including exact package names, cluster properties, resource/constraint definitions, and OS/Azure-specific settings. These are concrete configuration parameters and commands unique to this scenario, not just conceptual guidance.",unchanged -https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-with-dialog-instance,Install PAS and AAS with SAP NetWeaver HA cluster - RHEL,Deploy SAP dialog instances with SAP ASCS/SCS high-availability VMs on RHEL,Configure SAP PAS and AAS on RHEL ASCS/SCS HA VMs,Learn how to deploy SAP PAS and AAS dialog instances on ASCS/SCS high-availability cluster VMs running Red Hat Enterprise Linux (RHEL) in Azure.,"This article describes how to install and configure Primary Application Server (PAS) and Additional Application Server (AAS) dialog instances on an existing high-availability cluster running on Red Hat Enterprise Linux (RHEL). The PAS and AAS instances run on the same VMs as Advanced Business Application Programming Central Services (ASCS) or SAP Central Services (SCS). By deploying PAS and AAS on the same VMs as SAP ASCS/SCS and Enqueue Replication Server (ERS), you minimize the number of virtu",2026-03-31T11:59:00.000Z,how-to,configuration,0.65,True,"Describes installing and configuring PAS and AAS dialog instances on existing RHEL HA cluster VMs that already host ASCS/SCS and ERS. 
This implies detailed SAP and OS-level configuration steps (instance profiles, ports, clustering specifics) that are product-specific configuration knowledge, not just conceptual guidance.",unchanged +https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-with-dialog-instance,Install PAS and AAS with SAP NetWeaver HA cluster - RHEL,Deploy SAP dialog instances with SAP ASCS/SCS high-availability VMs on RHEL,Configure SAP PAS/AAS on RHEL HA VMs in Azure,Learn how to deploy SAP PAS and AAS dialog instances on ASCS/SCS high-availability cluster VMs running Red Hat Enterprise Linux (RHEL) in Azure.,"This article describes how to install and configure Primary Application Server (PAS) and Additional Application Server (AAS) dialog instances on an existing high-availability cluster running on Red Hat Enterprise Linux (RHEL). The PAS and AAS instances run on the same virtual machines (VMs) as Advanced Business Application Programming SAP Central Services (ASCS) or SAP Central Services (SCS). By deploying PAS and AAS on the same VMs as SAP ASCS/SCS and Enqueue Replication Server (ERS), you minim",2026-04-22T08:00:00.000Z,how-to,configuration,0.7,True,"How-to for installing and configuring SAP PAS and AAS on existing ASCS/SCS HA cluster VMs on RHEL. Likely includes product-specific configuration steps, service names, cluster/resource settings, and SAP/Azure parameters that go beyond generic knowledge. 
It’s not just conceptual; it’s a detailed deployment/configuration pattern for a specific HA scenario.",updated https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-with-hana-ascs-ers-dialog-instance,Install SAP NetWeaver with HANA HA cluster - RHEL,Deploy SAP ASCS/SCS and SAP ERS with SAP HANA high-availability VMs on RHEL,Configure SAP HANA ASCS/ERS high availability on RHEL VMs,Learn how to configure SAP ASCS/SCS and SAP ERS with SAP HANA high-availability VMs on RHEL.,This article describes how to install and configure SAP HANA along with ABAP SAP Central Services (ASCS)/SAP Central Services (SCS) and Enqueue Replication Server (ERS) instances on the same high-availability cluster running on Red Hat Enterprise Linux (RHEL).,2026-03-02T23:28:00.000Z,how-to,configuration,0.78,True,"The article is a detailed implementation guide for deploying SAP ASCS/SCS and ERS with SAP HANA on high-availability RHEL clusters in Azure. It typically includes product- and OS-specific cluster parameters, Pacemaker/Corosync resource definitions, fencing/STONITH settings, SAP profile parameters, mount options, and other configuration values that are unique to this scenario and not generally known to an LLM from training. The focus is on how to configure the HA cluster and SAP instances rather than generic concepts, so it best fits the configuration sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-standard-load-balancer-outbound-connections,Outbound connectivity for SAP VMs,Outbound connectivity for SAP VMs,Choose outbound connectivity options for SAP VMs on Azure,Learn about public endpoint connectivity for SAP virtual machines,"Important For new virtual networks created after March 31, 2026, Azure defaults subnets to private, which disables default outbound access. Any VM that must reach public internet or public Microsoft endpoints now needs an explicit outbound method. 
Existing virtual networks aren't changed automatically. For more information, see the official announcement. This document outlines different options to configure explicit outbound internet connectivity to reach the internet or public endpoint for Azure Vir",2026-04-09T17:25:00.000Z,how-to,decision-making,0.68,True,"The page describes concrete options and guidance for configuring explicit outbound internet/public endpoint connectivity for SAP virtual machines on Azure, in the context of changes to default outbound access after March 31, 2026. This is product- and scenario-specific decision guidance (which outbound method to use for SAP workloads under new networking defaults), rather than generic networking concepts. While the summary doesn’t show numeric limits, it implies detailed option comparisons and scenario-based recommendations, fitting the decision-making sub-skill.",unchanged
https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-suse,Install HA SAP NetWeaver,Azure VMs high availability for SAP NetWeaver on SLES,Configure HA SAP NetWeaver on Azure VMs with SLES,High-availability guide for SAP NetWeaver on SUSE Linux Enterprise Server for SAP applications,"This article describes how to deploy the virtual machines, configure the virtual machines, install the cluster framework, and install a highly available SAP NetWeaver or SAP ABAP platform based system. In the example configurations, ASCS instance number 00, ERS instance number 02, and SAP System ID NW1 are used. For new implementations on SLES for SAP Applications 15, we recommend deploying high availability for SAP ASCS/ERS in simple mount configuration. The classic Pacemaker configuration, bas",2026-04-10T06:11:00.000Z,how-to,configuration,0.78,True,"The page provides a prescriptive HA setup for SAP NetWeaver/ABAP on Azure VMs with SUSE Linux Enterprise Server for SAP. 
It contains specific cluster framework configuration, recommended simple-mount vs classic Pacemaker setups, instance numbers, and SAP/Azure resource configuration details that are expert, product-specific knowledge rather than general guidance.",unchanged
@@ -178,7 +178,7 @@ https://learn.microsoft.com/en-us/azure/sap/workloads/sap-ascs-ha-multi-sid-wsfc
https://learn.microsoft.com/en-us/azure/sap/workloads/sap-ascs-ha-multi-sid-wsfc-file-share,Multi-SID with WSFC and file share,SAP ASCS/SCS instance multi-SID high availability with Windows Server Failover Clustering and file share on Azure,Configure SAP ASCS/SCS multi-SID HA with WSFC file shares,Multi-SID high availability for SAP ASCS/SCS instances with Windows Server Failover Clustering and file share on Azure,"Windows You can manage multiple virtual IP addresses by using an Azure internal load balancer. If you have an SAP deployment, you can use an internal load balancer to create a Windows cluster configuration for SAP Central Services (ASCS/SCS) instances. This article focuses on how to move from a single ASCS/SCS installation to an SAP multi-SID configuration by installing additional SAP ASCS/SCS clustered instances into an existing Windows Server Failover Clustering (WSFC) cluster with file share. 
W",2026-03-07T08:00:00.000Z,article,configuration,0.68,True,"Describes adding additional clustered ASCS/SCS instances to an existing WSFC cluster using file shares, which involves concrete cluster resource configuration, IP/load balancer settings, and SAP instance parameters that are product- and scenario-specific.",unchanged
https://learn.microsoft.com/en-us/azure/sap/workloads/sap-ascs-ha-multi-sid-wsfc-shared-disk,Multi-SID with WSFC and shared disk,SAP ASCS/SCS multi-SID high availability with Windows Server Failover Clustering and shared disk on Azure,Implement SAP ASCS/SCS multi-SID HA on WSFC with shared disk,Learn how to implement multi-SID high availability for an SAP ASCS/SCS instance with Windows Server Failover Clustering and shared disk on Azure.,"Windows If you have an SAP deployment, you must use an internal load balancer to create a Windows cluster configuration for Advanced Business Application Programming SAP Central Services (ASCS) or SAP Central Services (SCS) instances. This article focuses on how to move from a single ASCS/SCS installation to an SAP multiple security identifier (SID) configuration. This operation is performed by installing an extra SAP ASCS/SCS clustered instance into an existing Windows Server failover cluster w",2026-03-09T08:00:00.000Z,how-to,configuration,0.7,True,"Similar to the preceding file-share article, this focuses on moving from single to multi-SID ASCS/SCS in an existing WSFC cluster, which requires detailed, product-specific configuration of clustered instances, load balancer, and shared disk resources. 
That aligns best with configuration rather than generic tutorial or overview content.",unchanged
https://learn.microsoft.com/en-us/azure/sap/workloads/sap-azure-files-nfs-encryption-in-transit-guide,Azure Files NFS Encryption In Transit for SAP,Azure Files NFS Encryption in Transit for SAP on Azure Systems,Configure Azure Files NFS encryption in transit for SAP,Setup guide for Azure Files NFS Encryption In Transit for SAP on Azure Systems,"Azure Files NFS v4.1 volumes support encryption in transit via TLS, providing enterprise-grade security by encrypting all traffic between clients and servers, without compromising performance. With Azure Files NFS, you can encrypt your data end-to-end: at rest, in transit, and across the network. For more information, refer to the following document: Encryption in transit for NFS Azure file shares.",2026-02-23T08:00:00.000Z,tutorial,security,0.7,True,"Provides concrete steps and parameters to enable TLS-based encryption in transit for Azure Files NFS v4.1 volumes used by SAP, including storage and client configuration details.",unchanged
-https://learn.microsoft.com/en-us/azure/sap/workloads/sap-edge-integration-cell-with-azure,How to onboard SAP Edge Integration Cell with Azure,Onboarding SAP Edge Integration Cell with Azure,Onboard SAP Edge Integration Cell to Azure AKS/Arc,Learn about onboarding SAP Edge Integration Cell with Azure Kubernetes Service (AKS).,"SAP Edge Integration Cell is a hybrid integration runtime offered as part of SAP Integration Suite, which enables you to manage APIs and run integration scenarios within your private landscape. The hybrid deployment model of Edge Integration Cell enables you to: Using Azure Kubernetes Service (AKS), SAP Edge Integration Cell may natively run on Azure. Enriching AKS with Azure Arc extends the scenario to on-premises and other cloud providers. Govern from Azure but deploy anywhere. 
This article builds o",2025-08-01T11:10:00.000Z,how-to,deployment,0.7,True,"Gives product-specific steps and Kubernetes configuration for running SAP Edge Integration Cell on AKS with Azure Arc, including cluster and connectivity settings.",unchanged
+https://learn.microsoft.com/en-us/azure/sap/workloads/sap-edge-integration-cell-with-azure,How to onboard SAP Edge Integration Cell with Azure,Onboarding SAP Edge Integration Cell with Azure,,Learn about onboarding SAP Edge Integration Cell with Azure Kubernetes Service (AKS).,"SAP Edge Integration Cell is a hybrid integration runtime offered as part of SAP Integration Suite, which enables you to manage APIs and run integration scenarios within your private landscape. The hybrid deployment model of Edge Integration Cell enables you to: Using Azure Kubernetes Service (AKS), SAP Edge Integration Cell may natively run on Azure. Enriching AKS with Azure Arc extends the scenario to on-premises and other cloud providers. Govern from Azure but deploy anywhere. This article builds o",2026-04-22T22:14:00.000Z,how-to,,0.3,False,"Onboarding SAP Edge Integration Cell with AKS appears to be an overview/onboarding flow. Summary emphasizes hybrid model and high-level scenario (govern from Azure, deploy anywhere) rather than detailed config tables, limits, or error codes. Likely more tutorial/overview than deep expert configuration or troubleshooting.",updated
https://learn.microsoft.com/en-us/azure/sap/workloads/sap-hana-availability-across-regions,Availability scenarios in multiple Azure regions,SAP HANA availability across Azure regions,Design SAP HANA availability across Azure regions,An overview of availability considerations when running SAP HANA on Azure VMs in multiple Azure regions.,"This article describes scenarios related to SAP HANA availability across different Azure regions. 
Because of the distance between Azure regions, setting up SAP HANA availability in multiple Azure regions involves special considerations.",2026-02-26T23:12:00.000Z,concept-article,architecture-patterns,0.65,True,"Covers multi-region SAP HANA availability scenarios and special considerations due to inter-region distance, providing Azure- and SAP-specific architectural trade-offs.",unchanged
https://learn.microsoft.com/en-us/azure/sap/workloads/sap-hana-availability-one-region,Availability scenarios in one Azure region,SAP HANA availability within one Azure region,Choose SAP HANA availability options in one Azure region,An overview of running SAP HANA operations on Azure native VMs in a single Azure region.,"This article describes several availability scenarios for SAP HANA within one Azure region. Azure has many regions, spread throughout the world. For the list of Azure regions, see Azure regions. For deploying SAP HANA on VMs within one Azure region, Microsoft offers deployment of a single virtual machine (VM) with a HANA instance. For increased availability, you can deploy two VMs with two HANA instances using either a flexible scale set with FD=1, availability zones, or an availability set that uses HA",2026-02-26T08:00:00.000Z,concept-article,architecture-patterns,0.7,True,"Describes concrete availability scenarios (single VM, flexible scale set with FD=1, availability zones, availability sets) and when to use each within a single region, which is product- and platform-specific architecture guidance.",unchanged
This article provides guidance on how to achieve availability for SAP HANA instances that are hosted in Azure VMs. The article describes several scenarios that you can implement by using the Azure infrastructure to increase availability of SAP HANA in Azure.,2026-02-26T23:12:00.000Z,concept-article,architecture-patterns,0.7,True,Provides multiple availability scenarios and requirements for SAP HANA on Azure VMs; product-specific HA patterns and options.,unchanged diff --git a/products/azure-sap/report.md b/products/azure-sap/report.md index 5b4018d6..88559cd7 100644 --- a/products/azure-sap/report.md +++ b/products/azure-sap/report.md @@ -1,21 +1,21 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: - configuration: 'Configuring and automating SAP on Azure: Terraform/SDAF setup, Azure - Center registration and lifecycle, monitoring (Azure Monitor), HA/DR and storage - layouts, and OS/DB-specific HA cluster designs.' - deployment: 'Deploying and automating SAP on Azure: control plane/workload zones, - SDAF scripts/pipelines, ACSS-based S/4HANA/NetWeaver/BO/B1 installs, HA/DR, and - planning checklists.' + configuration: 'Configuring SAP on Azure: automation (Terraform/SDAF), networking, + storage, HA/DR clusters, monitoring (Azure Monitor/Center), and VM extensions + for HANA, NetWeaver, Db2, and LaMa.' + deployment: 'Deploying and tearing down SAP landscapes on Azure: automation scripts, + control planes, DevOps pipelines, infrastructure patterns, HA setups, and app-specific + deployments (S/4HANA, B1, BI, NetWeaver).' security: 'Security, identity, and access design for SAP on Azure: Key Vault secrets, RBAC, TLS, private endpoints, encrypted storage, secure DB providers, and Entra ID/RISE integration.' architecture-patterns: 'Architecting SAP on Azure: HA/DR patterns, DB choices (HANA, SQL, Oracle, Db2, ASE, MaxDB), RISE integration, zoning/latency, networking, security, and multi-region/VM designs.' 
- integrations: Patterns and automation for integrating SAP HANA/Azure LI with Azure - Monitor, ADF, RISE, Salesforce, Exchange, Universal Print, VIS APIs, and SAP Principal - Propagation. + integrations: Patterns and scripts to integrate SAP on Azure with monitoring, automation, + VIS APIs/CLI/PowerShell, data pipelines, email/printing, identity, and external + services like Salesforce and RISE decision-making: 'Guidance on planning SAP on Azure: choosing infra, VM/storage options, DR, data models and tiering, SAP version support, data extraction, and app/network architecture decisions.' @@ -31,18 +31,17 @@ category_descriptions: skill_description: Expert knowledge for SAP HANA on Azure Large Instances development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, - and deployment. Use when designing HANA Large Instances with SDAF/ACSS, Azure Monitor - for SAP, Key Vault/RBAC, HA/DR clusters, or RISE, and other SAP HANA on Azure Large + and deployment. Use when deploying SAP HANA on Azure Large Instances, S/4HANA, NetWeaver, + LaMa, Azure Monitor for SAP, or RISE integration, and other SAP HANA on Azure Large Instances related development tasks. Not for Azure Large Instances (use azure-large-instances), - Azure Virtual Machines (use azure-virtual-machines), Azure Baremetal Infrastructure - (use azure-baremetal-infrastructure), SQL Server on Azure Virtual Machines (use - azure-sql-virtual-machines). -use_when: Use when designing HANA Large Instances with SDAF/ACSS, Azure Monitor for - SAP, Key Vault/RBAC, HA/DR clusters, or RISE, and other SAP HANA on Azure Large + Azure Virtual Machines (use azure-virtual-machines), SQL Server on Azure Virtual + Machines (use azure-sql-virtual-machines), Azure Baremetal Infrastructure (use azure-baremetal-infrastructure). 
+use_when: Use when deploying SAP HANA on Azure Large Instances, S/4HANA, NetWeaver, + LaMa, Azure Monitor for SAP, or RISE integration, and other SAP HANA on Azure Large Instances related development tasks. confusable_not_for: Not for Azure Large Instances (use azure-large-instances), Azure - Virtual Machines (use azure-virtual-machines), Azure Baremetal Infrastructure (use - azure-baremetal-infrastructure), SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines). + Virtual Machines (use azure-virtual-machines), SQL Server on Azure Virtual Machines + (use azure-sql-virtual-machines), Azure Baremetal Infrastructure (use azure-baremetal-infrastructure). --- # SAP HANA on Azure Large Instances Crawl Report @@ -51,13 +50,13 @@ confusable_not_for: Not for Azure Large Instances (use azure-large-instances), A - **Total Pages**: 206 - **Fetched**: 206 - **Fetch Failed**: 0 -- **Classified**: 172 -- **Unclassified**: 34 +- **Classified**: 169 +- **Unclassified**: 37 ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 20 -- **Unchanged**: 186 +- **Updated Pages**: 12 +- **Unchanged**: 194 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-sap/azure-sap.csv` @@ -67,59 +66,43 @@ confusable_not_for: Not for Azure Large Instances (use azure-large-instances), A |------|-------|------------| | architecture-patterns | 24 | 11.7% | | best-practices | 8 | 3.9% | -| configuration | 70 | 34.0% | -| decision-making | 13 | 6.3% | -| deployment | 26 | 12.6% | -| integrations | 10 | 4.9% | +| configuration | 71 | 34.5% | +| decision-making | 12 | 5.8% | +| deployment | 23 | 11.2% | +| integrations | 11 | 5.3% | | limits-quotas | 1 | 0.5% | -| security | 12 | 5.8% | +| security | 11 | 5.3% | | troubleshooting | 8 | 3.9% | -| *(Unclassified)* | 34 | 16.5% | +| *(Unclassified)* | 37 | 18.0% | ## Changes ### Updated Pages -- [Overview](https://learn.microsoft.com/en-us/azure/sap/workloads/get-started) - - Updated: 
2026-03-03T08:00:00.000Z → 2026-04-16T08:00:00.000Z -- [Soft stop SAP instances and HANA database](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/soft-stop-sap-and-hana-database) - - Updated: 2023-11-20T18:01:00.000Z → 2026-04-16T17:19:00.000Z -- [View cost analysis for SAP system](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/view-cost-analysis) - - Updated: 2023-05-15T23:10:00.000Z → 2026-04-16T06:12:00.000Z -- [Plan your automated deployment](https://learn.microsoft.com/en-us/azure/sap/automation/plan-deployment) - - Updated: 2024-03-11T22:07:00.000Z → 2026-04-14T22:21:00.000Z -- [Configure SAP parameters](https://learn.microsoft.com/en-us/azure/sap/automation/configure-sap-parameters) - - Updated: 2023-08-29T21:48:00.000Z → 2026-04-15T08:00:00.000Z -- [Configure custom disk sizing](https://learn.microsoft.com/en-us/azure/sap/automation/configure-extra-disks) - - Updated: 2023-08-29T21:48:00.000Z → 2026-04-15T22:11:00.000Z -- [Configure external tools](https://learn.microsoft.com/en-us/azure/sap/automation/tools-configuration) - - Updated: 2023-09-03T11:19:00.000Z → 2026-04-17T22:08:00.000Z -- [Upgrading the automation framework](https://learn.microsoft.com/en-us/azure/sap/automation/upgrading) - - Updated: 2023-12-21T23:04:00.000Z → 2026-04-16T22:31:00.000Z -- [Troubleshoot the automation framework](https://learn.microsoft.com/en-us/azure/sap/automation/troubleshooting) - - Updated: 2024-01-08T00:27:00.000Z → 2026-04-15T22:11:00.000Z -- [Deploy the workload zone](https://learn.microsoft.com/en-us/azure/sap/automation/deploy-workload-zone) - - Updated: 2024-02-27T23:07:00.000Z → 2026-04-14T22:21:00.000Z -- [Deployment hands-on lab](https://learn.microsoft.com/en-us/azure/sap/automation/tutorial) - - Updated: 2024-03-12T22:06:00.000Z → 2026-04-13T08:00:00.000Z -- [Get SAP media for BOM](https://learn.microsoft.com/en-us/azure/sap/automation/bom-get-files) - - Updated: 2023-02-10T23:04:00.000Z → 2026-04-15T08:00:00.000Z -- 
[Create automation template files](https://learn.microsoft.com/en-us/azure/sap/automation/bom-templates-db) - - Updated: 2023-02-10T23:04:00.000Z → 2026-04-14T22:21:00.000Z -- [Install HA SAP NetWeaver with NFS on Azure Files](https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-nfs-azure-files) - - Updated: 2026-02-20T08:00:00.000Z → 2026-04-16T08:00:00.000Z -- [Install HA SAP NetWeaver with Azure NetApp Files](https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-netapp-files) - - Updated: 2026-02-20T08:00:00.000Z → 2026-04-16T08:00:00.000Z -- [Install HA SAP NetWeaver](https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel) - - Updated: 2026-04-10T06:11:00.000Z → 2026-04-16T08:00:00.000Z -- [Deploy multi-SID clusters with Pacemaker](https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-multi-sid) - - Updated: 2026-02-23T08:00:00.000Z → 2026-04-16T08:00:00.000Z -- [Install SAP software - Azure PowerShell](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-install-distributed-non-high-availability) - - Updated: 2023-09-07T11:19:00.000Z → 2026-04-16T22:31:00.000Z -- [Install SAP software - Azure CLI](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-install-high-availability-namecustom-cli) - - Updated: 2023-05-15T23:10:00.000Z → 2026-04-16T22:31:00.000Z -- [Customer enabled disaster recovery](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/compliance-cedr) - - Updated: 2023-05-15T23:10:00.000Z → 2026-04-16T22:31:00.000Z +- [Install PAS and AAS with SAP NetWeaver HA cluster - RHEL](https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-with-dialog-instance) + - Updated: 2026-03-31T11:59:00.000Z → 2026-04-22T08:00:00.000Z +- [How to onboard SAP Edge Integration Cell with 
Azure](https://learn.microsoft.com/en-us/azure/sap/workloads/sap-edge-integration-cell-with-azure) + - Updated: 2025-08-01T11:10:00.000Z → 2026-04-22T22:14:00.000Z +- [Deploy S/4 HANA infrastructure - Azure CLI](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-create-high-availability-namecustom) + - Updated: 2023-05-15T23:10:00.000Z → 2026-04-22T17:34:00.000Z +- [Start and stop SAP system - Azure PowerShell](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quick-stop-start-sap-powershell) + - Updated: 2026-03-27T06:14:00.000Z → 2026-04-22T06:17:00.000Z +- [Start and stop SAP system - Azure CLI](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quick-stop-start-sap-cli) + - Updated: 2026-03-27T06:14:00.000Z → 2026-04-22T06:17:00.000Z +- [Prepare network for deployment](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/prepare-network) + - Updated: 2026-03-18T06:15:00.000Z → 2026-04-22T08:00:00.000Z +- [Deploy S/4HANA infrastructure](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/deploy-s4hana) + - Updated: 2026-03-11T17:32:00.000Z → 2026-04-22T08:00:00.000Z +- [Register existing SAP system](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/register-existing-system) + - Updated: 2026-03-11T22:19:00.000Z → 2026-04-22T08:00:00.000Z +- [Manage Virtual Instance for SAP solutions](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/manage-virtual-instance) + - Updated: 2026-03-18T22:20:00.000Z → 2026-04-22T08:00:00.000Z +- [Sample deployment configurations](https://learn.microsoft.com/en-us/azure/sap/automation/new-vs-existing) + - Updated: 2023-09-03T11:19:00.000Z → 2026-04-22T17:34:00.000Z +- [Download and prepare software media](https://learn.microsoft.com/en-us/azure/sap/automation/software) + - Updated: 2026-04-01T22:41:00.000Z → 2026-04-20T11:11:00.000Z +- [Bash scripts for automation 
framework](https://learn.microsoft.com/en-us/azure/sap/automation/reference-bash) + - Updated: 2023-09-19T16:49:00.000Z → 2026-04-21T22:10:00.000Z ## Classified Pages @@ -145,6 +128,7 @@ confusable_not_for: Not for Azure Large Instances (use azure-large-instances), A | [Premium SSD v2 for HANA](https://learn.microsoft.com/en-us/azure/sap/workloads/hana-vm-premium-ssd-v2) | configuration | 0.80 | Details on tuning capacity, IOPS, and throughput for HANA using Premium SSD v2; includes recommended values and patterns. | | [Premium storage for HANA](https://learn.microsoft.com/en-us/azure/sap/workloads/hana-vm-premium-ssd-v1) | configuration | 0.80 | HANA-specific storage layouts and parameters for Premium SSD v1; includes product-specific configuration guidance. | | [Prepare Azure infrastructure for SAP HA with WSFC](https://learn.microsoft.com/en-us/azure/sap/workloads/sap-high-availability-infrastructure-wsfc-shared-disk) | configuration | 0.80 | Describes Azure infrastructure preparation steps and alternatives for cluster shared disks for SAP ASCS/SCS on WSFC, which are detailed, platform-specific configuration requirements. | +| [Prepare network for deployment](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/prepare-network) | configuration | 0.80 | Network preparation guide for S/4HANA via Azure Center for SAP solutions. Likely includes required subnets, address ranges, NSG rules, outbound connectivity requirements, and allowlisted endpoints. These are concrete configuration parameters and security rules specific to this service, matching the configuration sub-skill. | | [Scale-out with standby node on Azure NetApp Files](https://learn.microsoft.com/en-us/azure/sap/workloads/sap-hana-scale-out-standby-netapp-files-suse) | configuration | 0.80 | Describes a specific HA deployment pattern (scale-out with standby) on SLES using Azure NetApp Files, with example instance IDs and OS/HANA versions, which is expert configuration knowledge. 
| | [Scale-out with HSR and Pacemaker](https://learn.microsoft.com/en-us/azure/sap/workloads/sap-hana-high-availability-scale-out-hsr-rhel) | configuration | 0.80 | Describes a specific HA configuration pattern (HSR + Pacemaker + ANF/Azure Files NFS) with concrete instance IDs and architecture, which is highly product-specific configuration knowledge. | | [Scale-out with standby node with Azure NetApp Files](https://learn.microsoft.com/en-us/azure/sap/workloads/sap-hana-scale-out-standby-netapp-files-rhel) | configuration | 0.80 | Gives detailed HA deployment steps for HANA scale-out with standby on RHEL using Azure NetApp Files, including specific versions and instance IDs, which are expert configuration details. | @@ -172,6 +156,7 @@ confusable_not_for: Not for Azure Large Instances (use azure-large-instances), A | [Prerequisites Azure VM Extension for SAP](https://learn.microsoft.com/en-us/azure/sap/workloads/vm-extension-for-sap) | configuration | 0.75 | Covers deploying the Azure VM extension for SAP; includes extension names, parameters, and Azure-specific configuration details. | | [RHEL - IBM Db2 HA](https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-ibm-db2-luw) | configuration | 0.75 | Explains how to deploy IBM Db2 LUW in HADR configuration on RHEL Azure VMs, including primary/secondary roles and replication modes, which are product-specific HA configuration details. | | [Recommended setup for SAP Cloud Identity Services and SAP Business Technology Platform with Microsoft Entra ID](https://learn.microsoft.com/en-us/azure/sap/workloads/scenario-azure-first-sap-identity-integration) | security | 0.75 | Technical design guide with concrete Entra ID and SAP Cloud Identity Services configuration, including SSO setup, trust relationships, and scope/claim mappings. 
| +| [Register existing SAP system](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/register-existing-system) | configuration | 0.75 | How-to for registering an existing SAP system, including what Azure resources (managed resource group, storage account, VIS) are created and how they’re named/customized. This involves specific resource types, naming patterns, and configuration parameters unique to this product, aligning with configuration. | | [SLES - IBM Db2 HA](https://learn.microsoft.com/en-us/azure/sap/workloads/dbms-guide-ha-ibm) | configuration | 0.75 | Provides deployment and configuration guidance for Db2 LUW HADR on Azure VMs, including synchronous/asynchronous replication options, which are expert configuration patterns. | | [Scale-up with HSR and Pacemaker](https://learn.microsoft.com/en-us/azure/sap/workloads/sap-hana-high-availability) | configuration | 0.75 | Details HANA System Replication setup for HA on SLES in Azure, including node roles and replication modes, which are SAP-on-Azure configuration specifics. | | [Scale-up with HSR and Pacemaker](https://learn.microsoft.com/en-us/azure/sap/workloads/sap-hana-high-availability-rhel) | configuration | 0.75 | Details how to deploy and configure HANA System Replication for HA on Azure VMs with RHEL, including supported HA method and node roles, which are product-specific configuration patterns. | @@ -190,6 +175,7 @@ confusable_not_for: Not for Azure Large Instances (use azure-large-instances), A | [Azure Files NFS/SMB for SAP](https://learn.microsoft.com/en-us/azure/sap/workloads/planning-guide-storage-azure-files) | best-practices | 0.70 | The description mentions storage configuration suggestions and considerations for SAP workloads. Such guidance typically includes product-specific recommendations (share sizes, throughput/IOPS starting points, layout patterns) that act as concrete best practices rather than generic storage theory. 
| | [Azure VM restart for higher availability](https://learn.microsoft.com/en-us/azure/sap/workloads/sap-higher-availability-architecture-scenarios) | architecture-patterns | 0.70 | Explains pattern of relying on Azure VM restart instead of clustering; includes constraints and behavior specific to SAP workloads. | | [Azure services with SAP RISE](https://learn.microsoft.com/en-us/azure/sap/workloads/rise-integration-services) | integrations | 0.70 | Describes concrete integration scenarios (Data Factory, Synapse, ABAP SDK for Azure) with SAP RISE; includes runtime and interface requirements. | +| [Bash scripts for automation framework](https://learn.microsoft.com/en-us/azure/sap/automation/reference-bash) | configuration | 0.70 | A 'shell script reference' for SAP Deployment Automation Framework components is likely a parameter/command reference with specific script names, flags, and required/optional arguments—product-specific configuration details that qualify as expert knowledge under the configuration sub-skill. | | [Bootstrapping the Deployer using shell scripts](https://learn.microsoft.com/en-us/azure/sap/automation/bash/install-deployer) | deployment | 0.70 | Describes a control-plane bootstrap script for SAP deployment automation. Such pages usually include product-specific parameters, prerequisites, and environment details for deploying the deployer, which are expert deployment patterns. | | [Bootstrapping the Library using shell scripts](https://learn.microsoft.com/en-us/azure/sap/automation/bash/install-library) | deployment | 0.70 | Covers a script that bootstraps the SAP Library in the control plane. These docs generally include specific script flags, Azure resource layout, and sequencing for deployment, which are specialized deployment details. 
| | [Configure Azure DevOps](https://learn.microsoft.com/en-us/azure/sap/automation/configure-devops) | deployment | 0.70 | Details setting up projects, pipelines, service connections, permissions, and variable groups specifically for SAP Deployment Automation Framework. These are product-specific CI/CD deployment requirements and structures, not generic DevOps guidance. | @@ -199,7 +185,8 @@ confusable_not_for: Not for Azure Large Instances (use azure-large-instances), A | [Configure SAP HANA provider](https://learn.microsoft.com/en-us/azure/sap/monitor/provider-hana) | configuration | 0.70 | How-to guide for configuring the SAP HANA provider in Azure Monitor for SAP solutions via the portal; such provider configuration docs typically include product-specific settings (host, ports, authentication, SSL, collection options) and parameter details that qualify as expert configuration knowledge. | | [Connect to SAP LaMa](https://learn.microsoft.com/en-us/azure/sap/workloads/lama-installation) | configuration | 0.70 | Setup article for the SAP LaMa connector for Azure will necessarily include Azure- and LaMa-specific configuration steps (connector registration, parameters, authentication, VM configuration for adaptive systems). These are product-specific settings and sequences that go beyond generic knowledge, fitting the configuration sub-skill. | | [Create automation template files](https://learn.microsoft.com/en-us/azure/sap/automation/bom-templates-db) | configuration | 0.70 | Covers creating ini-file templates for unattended SAP installation tied to the BOM; such content typically defines template parameters, required fields, and formats, which are product-specific configuration details. 
| -| [Deploy S/4 HANA infrastructure - Azure CLI](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-create-high-availability-namecustom) | deployment | 0.70 | CLI-based quickstart with detailed commands and options to deploy HA three-tier SAP infrastructure and customize resource names, specific to ACSS. | +| [Deploy S/4 HANA infrastructure - Azure CLI](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-create-high-availability-namecustom) | configuration | 0.70 | Quickstart using Azure CLI to deploy distributed HA SAP infrastructure with custom resource names. This typically includes specific CLI commands, parameter names, and required values/structures for Azure Center for SAP solutions and VIS resources, which are product-specific configuration details rather than generic deployment concepts. | +| [Deploy S/4HANA infrastructure](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/deploy-s4hana) | deployment | 0.70 | Describes deploying S/4HANA infrastructure with HA, non-HA, and single-server options. Likely includes deployment option matrices, required resource types, and constraints per configuration. This is production deployment guidance for a specific service, fitting deployment rather than generic how-to. | | [Deploy multi-SID clusters with Pacemaker](https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-suse-multi-sid) | architecture-patterns | 0.70 | The page covers a multi-SID high-availability pattern for running multiple SAP systems on a two-node Pacemaker cluster on Azure VMs. It describes how to co-locate multiple SAP SIDs, optimize resource utilization, and maintain HA, which is a product- and scenario-specific architecture pattern rather than generic clustering theory. 
| | [Deploying the SAP system using shell scripts](https://learn.microsoft.com/en-us/azure/sap/automation/bash/installer) | deployment | 0.70 | Script reference pages for SAP Deployment Automation Framework typically document script arguments, required environment variables, and Azure resource constraints specific to this tooling, which are not generic knowledge. This is deployment-focused automation for creating SAP systems. | | [Deploying the control plane using shell scripts](https://learn.microsoft.com/en-us/azure/sap/automation/bash/deploy-controlplane) | deployment | 0.70 | Script-specific deployment of control plane (deployer, SAP library); clearly a deployment method unique to this framework. | @@ -212,9 +199,9 @@ confusable_not_for: Not for Azure Large Instances (use azure-large-instances), A | [Enable TLS in Azure Monitor for SAP solutions](https://learn.microsoft.com/en-us/azure/sap/monitor/enable-tls-azure-monitor-sap-solutions) | security | 0.70 | Focuses on TLS 1.2 secure communication, supported certificate types, and how encryption works between Azure Functions and SAP systems. This is product-specific security configuration and likely includes concrete certificate requirements and connection properties. | | [Extensibility](https://learn.microsoft.com/en-us/azure/sap/automation/extensibility) | configuration | 0.70 | Covers extensibility options such as custom Ansible playbooks, repositories, packages, and kernel parameters. These are concrete, product-specific configuration mechanisms and patterns for SDAF, fitting the configuration category more than generic best practices. | | [Extract SAP data to Microsoft Fabric](https://learn.microsoft.com/en-us/azure/sap/workloads/extract-sap-data) | decision-making | 0.70 | Compares multiple SAP data sources and extraction tools with criteria like performance, reliability, and data layer characteristics to guide method selection. 
| -| [How to onboard SAP Edge Integration Cell with Azure](https://learn.microsoft.com/en-us/azure/sap/workloads/sap-edge-integration-cell-with-azure) | deployment | 0.70 | Gives product-specific steps and Kubernetes configuration for running SAP Edge Integration Cell on AKS with Azure Arc, including cluster and connectivity settings. | | [Identity and security services with SAP RISE](https://learn.microsoft.com/en-us/azure/sap/workloads/rise-integration-security) | security | 0.70 | Described as detailing integration of Azure identity, security, and monitoring with SAP RISE. Such content usually includes specific RBAC roles, identity flows, and security configuration parameters for managed workloads, which are product- and scenario-specific and not generic security theory. | | [Install HA SAP NetWeaver with WSFC and SOFS file share](https://learn.microsoft.com/en-us/azure/sap/workloads/sap-high-availability-installation-wsfc-file-share) | deployment | 0.70 | Stepwise installation and configuration of SAP NetWeaver HA using WSFC and Scale-Out File Server, including SAP- and Azure-specific cluster resource settings. | +| [Install PAS and AAS with SAP NetWeaver HA cluster - RHEL](https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-with-dialog-instance) | configuration | 0.70 | How-to for installing and configuring SAP PAS and AAS on existing ASCS/SCS HA cluster VMs on RHEL. Likely includes product-specific configuration steps, service names, cluster/resource settings, and SAP/Azure parameters that go beyond generic knowledge. It’s not just conceptual; it’s a detailed deployment/configuration pattern for a specific HA scenario. | | [Install SAP software](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/install-software) | deployment | 0.70 | Explains two concrete installation modes (inside ACSS vs. external detection) with product-specific steps and parameters for VIS-based systems. 
| | [Monitor in Azure Data Factory](https://learn.microsoft.com/en-us/azure/sap/business-process-solutions/monitor-data-extraction) | troubleshooting | 0.70 | Page explicitly covers how the data extraction process works, how to monitor it, and how to troubleshoot issues. For this product area, such monitoring/troubleshooting docs typically include pipeline names, run status meanings, and error/diagnostic details specific to the prebuilt templates, which qualifies as product-specific symptom→diagnosis→solution guidance. | | [Monitor in Fabric](https://learn.microsoft.com/en-us/azure/sap/business-process-solutions/monitor-fabric-data-extraction-processing) | troubleshooting | 0.70 | Describes monitoring data extraction and processing pipelines and notebooks deployed per deployment type (SAP + ADF, Salesforce, SAP + ECC). Such monitoring docs for prebuilt templates usually include specific pipeline/notebook names, run states, and handling of failures, which are product-specific troubleshooting details rather than generic concepts. | @@ -222,20 +209,15 @@ confusable_not_for: Not for Azure Large Instances (use azure-large-instances), A | [Multi-SID with WSFC and shared disk](https://learn.microsoft.com/en-us/azure/sap/workloads/sap-ascs-ha-multi-sid-wsfc-shared-disk) | configuration | 0.70 | Similar to [0], this focuses on moving from single to multi-SID ASCS/SCS in an existing WSFC cluster, which requires detailed, product-specific configuration of clustered instances, load balancer, and shared disk resources. That aligns best with configuration rather than generic tutorial or overview content. | | [Network connectivity with SAP RISE](https://learn.microsoft.com/en-us/azure/sap/workloads/rise-integration-network) | architecture-patterns | 0.70 | Details connectivity patterns between SAP RISE-managed VNets and customer Azure; product-specific network architecture options and trade-offs. 
| | [Prepare BOM](https://learn.microsoft.com/en-us/azure/sap/automation/bom-prepare) | configuration | 0.70 | Explains how to structure a BOM for SAP on Azure Deployment Automation Framework, including components, media, templates, scripts, and permalinks. The BOM schema and required fields are product-specific configuration knowledge beyond generic SAP or Azure concepts. | -| [Prepare network for deployment](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/prepare-network) | security | 0.70 | The article focuses on preparing a virtual network for S/4HANA deployment, including connectivity and security rules and allow-listing endpoints. This implies product-specific network security configuration (NSG rules, outbound access, allowlisted FQDNs/IPs) that go beyond generic concepts and match the security sub-skill definition. | | [Register an existing SAP system - Azure CLI](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-register-system-cli) | configuration | 0.70 | Similar to [5] but via Azure CLI. It will contain specific CLI commands, flags, and parameter requirements for creating VIS resources and registering SAP systems, which is detailed configuration knowledge rather than generic CLI usage. | | [Register an existing SAP system - Azure PowerShell](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-register-system-powershell) | configuration | 0.70 | Covers using an Azure PowerShell module to register an existing SAP system and create a VIS resource. This will include specific cmdlets, parameters, and required values unique to Azure Center for SAP solutions, fitting the configuration sub-skill (command/parameter-level configuration). 
| -| [Register existing SAP system](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/register-existing-system) | configuration | 0.70 | Portal-based registration process creates specific managed resource groups, storage accounts, and other resources; the article will detail required fields, resource naming, and configuration options unique to Azure Center for SAP solutions, which is expert configuration guidance. | | [Run extraction and data processing](https://learn.microsoft.com/en-us/azure/sap/business-process-solutions/run-extraction-data-processing) | deployment | 0.70 | Describes which pipelines to execute for different source systems and connectors (ADF, Open Mirroring, Fabric pipelines), which is operational deployment/execution guidance specific to this solution. | | [SAP HANA data tiering and archiving guidance](https://learn.microsoft.com/en-us/azure/sap/workloads/hana-tiering-guidance) | decision-making | 0.70 | Describes data tiering options and strategies on Azure for HANA, balancing cost and performance; used for migration and storage decisions. | | [SAP ILM](https://learn.microsoft.com/en-us/azure/sap/workloads/sap-information-lifecycle-management) | configuration | 0.70 | An ILM + Blob integration guide typically includes product-specific configuration steps (storage account/container setup, authentication options, ILM Store parameters) and exact setting names/values unique to SAP on Azure. This goes beyond conceptual ILM overview and into concrete configuration for archive storage. | -| [Sample deployment configurations](https://learn.microsoft.com/en-us/azure/sap/automation/new-vs-existing) | decision-making | 0.70 | Guides configuration for greenfield vs brownfield scenarios, helping decide when to let the framework create vs reuse infrastructure. 
| | [Scale-out with HSR and Pacemaker](https://learn.microsoft.com/en-us/azure/sap/workloads/sap-hana-high-availability-scale-out-hsr-suse) | configuration | 0.70 | The page describes deploying SAP HANA scale-out with system replication and Pacemaker on SLES, including example instance IDs, system IDs, and NFS-backed shared file system configuration. This is detailed, product-specific configuration and clustering guidance rather than high-level concepts. | | [Secure Azure Infrastructure for SAP](https://learn.microsoft.com/en-us/azure/sap/workloads/sap-security-infrastructure) | security | 0.70 | Provides SAP-on-Azure–specific infrastructure security guidance tied to Zero Trust and Azure services, beyond generic security concepts. | | [Set up Pacemaker cluster](https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-suse-pacemaker) | configuration | 0.70 | A Pacemaker setup guide on SLES in Azure necessarily includes product-specific cluster configuration steps, resource definitions, and parameter values unique to Azure + SLES + Pacemaker. These are detailed configuration patterns rather than generic concepts. | | [Set up network for Azure Monitor for SAP solutions](https://learn.microsoft.com/en-us/azure/sap/monitor/set-up-network) | configuration | 0.70 | How-to guide for setting up an Azure virtual network specifically for Azure Monitor for SAP solutions is likely to include product-specific network configuration details (subnets, required service endpoints, private link/DNS settings, NSG rules, and possibly required ports). These are concrete, service-specific configuration parameters rather than generic networking concepts, fitting the configuration sub-skill. | -| [Start and stop SAP system - Azure CLI](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quick-stop-start-sap-cli) | configuration | 0.70 | Parallel to [7] but with Azure CLI. 
It will specify concrete CLI commands and parameters for controlling VIS-managed SAP systems, which is detailed, product-specific configuration knowledge. | -| [Start and stop SAP system - Azure PowerShell](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quick-stop-start-sap-powershell) | configuration | 0.70 | Describes using Az PowerShell to start/stop SAP systems through VIS. This implies specific cmdlets, resource identifiers, and parameter usage unique to Azure Center for SAP solutions, which is configuration-level expert knowledge rather than generic scripting guidance. | | [Supported scenarios](https://learn.microsoft.com/en-us/azure/sap/workloads/planning-supported-configurations) | decision-making | 0.70 | Describes supported/non-supported SAP VM scenarios and HA configurations; used to decide which architectures are allowed on Azure, with product-specific support constraints. | | [Update the SAP Library SAS token](https://learn.microsoft.com/en-us/azure/sap/automation/bash/update-sas-token) | configuration | 0.70 | Updating SAS tokens for the SAP Library involves specific Key Vault secret names and token handling patterns, which are configuration details unique to this solution. | | [What SAP workloads run on Azure?](https://learn.microsoft.com/en-us/azure/sap/workloads/certifications) | decision-making | 0.70 | Contains tables of Azure-supported SAP configurations and certifications, used to decide which VM/OS/DB combinations are supported; product-specific decision data not inferable from general training. | @@ -248,12 +230,9 @@ confusable_not_for: Not for Azure Large Instances (use azure-large-instances), A | [SAP IQ on Azure VMs](https://learn.microsoft.com/en-us/azure/sap/workloads/dbms-guide-sapiq) | architecture-patterns | 0.68 | The article covers concrete architecture, sizing, storage, and high-availability patterns for SAP BW near-line storage with SAP IQ on Azure. 
This is product- and workload-specific design guidance (how to separate hot/cold data, how to size and place components) that goes beyond generic concepts and would be used to decide how to architect the solution. | | [Start and stop SAP systems](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/start-stop-sap-systems) | configuration | 0.68 | Operational how-to for starting/stopping SAP systems, instances, and HANA DB via Azure Center for SAP solutions using portal/PowerShell/CLI/REST. Likely includes specific API operations, parameter names, and required settings unique to the VIS resource, which are product-specific configuration details beyond generic knowledge. | | [Start and stop SAP systems and VMs](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/stop-start-sap-and-underlying-vm) | integrations | 0.68 | Page describes using the Virtual Instance for SAP solutions (VIS) resource and REST API calls to start/stop SAP systems, instances, HANA databases, and underlying VMs. This involves product-specific API operations, parameters, and behavior unique to Azure Center for SAP solutions, which qualifies as integrations & coding patterns rather than generic how-to content. | -| [Deploy S/4HANA infrastructure](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/deploy-s4hana) | deployment | 0.67 | Describes deploying S/4HANA infrastructure with multiple HA/non-HA/single-server options; such guidance typically includes SKU/VM choices, HA topology options, and deployment-time parameters and constraints specific to Azure Center for SAP solutions, which are deployment-focused expert details. | | [Deployment DevOps hands-on lab](https://learn.microsoft.com/en-us/azure/sap/automation/devops-tutorial) | deployment | 0.66 | Describes using SDAF pipelines in Azure DevOps to automate the SAP deployment lifecycle. 
While tutorial-like, it is specifically about product-integrated CI/CD pipelines and deployment automation for SAP on Azure, which fits the deployment sub-skill focused on production deployment methods. | -| [Download and prepare software media](https://learn.microsoft.com/en-us/azure/sap/automation/software) | integrations | 0.66 | Uses Ansible playbooks to download SAP software into Azure storage for the automation framework. This likely includes specific playbook variables, storage configuration, and paths unique to SAP Deployment Automation Framework, which are integration/config details not generally known. | | [SAP front-end printing with Universal Print](https://learn.microsoft.com/en-us/azure/sap/workloads/universal-print-sap-frontend) | integrations | 0.66 | Explains how to use Universal Print from SAP front-end, which will require specific connector settings, printer mappings, SAP output device configuration, and possibly URL/tenant parameters unique to this integration, fitting the integrations & coding patterns category. | | [Availability scenarios in multiple Azure regions](https://learn.microsoft.com/en-us/azure/sap/workloads/sap-hana-availability-across-regions) | architecture-patterns | 0.65 | Covers multi-region SAP HANA availability scenarios and special considerations due to inter-region distance, providing Azure- and SAP-specific architectural trade-offs. | -| [Bash scripts for automation framework](https://learn.microsoft.com/en-us/azure/sap/automation/reference-bash) | deployment | 0.65 | Reference for shell scripts that deploy framework components; includes script names and usage patterns for deployment. | | [Checklist to deploy an SAP workload on Azure](https://learn.microsoft.com/en-us/azure/sap/workloads/deployment-checklist) | deployment | 0.65 | Project checklist for SAP IaaS deployments; contains concrete deployment requirements and sequencing specific to SAP on Azure. 
| | [Configuration checks](https://learn.microsoft.com/en-us/azure/sap/automation/testing-framework-configuration-checks) | best-practices | 0.65 | Configuration checks for SAP on Azure are explicit validations against best practices (e.g., parameter values, sizing, HA settings), which are actionable, product-specific recommendations. | | [Configure High-availability cluster (Pacemaker) provider](https://learn.microsoft.com/en-us/azure/sap/monitor/provider-ha-pacemaker-cluster) | configuration | 0.65 | Describes installing an HA agent on each Pacemaker cluster node and creating a high-availability provider in Azure Monitor for SAP solutions. This typically involves product-specific configuration steps, provider settings, and parameters unique to this integration, fitting the configuration sub-skill (service-specific config rather than generic tutorial). | @@ -268,10 +247,11 @@ confusable_not_for: Not for Azure Large Instances (use azure-large-instances), A | [Identity Management and Authentication for SAP](https://learn.microsoft.com/en-us/azure/sap/workloads/sap-security-identity) | security | 0.65 | Aggregates SAP-on-Azure identity, SSO, MFA, and secure access guidance; while link-heavy, it reflects product-specific identity patterns rather than generic security theory. | | [Install HA SAP NetWeaver with Azure NetApp Files (SMB)](https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-windows-netapp-files-smb) | deployment | 0.65 | Provides detailed deployment and configuration steps for SAP NetWeaver HA on Azure VMs using Azure NetApp Files SMB, including cluster framework and storage configuration specifics. | | [Install HA SAP NetWeaver with NFS simple mount](https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-suse-nfs-simple-mount) | architecture-patterns | 0.65 | The article describes a specific high-availability architecture for SAP NetWeaver on Azure VMs using SLES, simple mount, and Azure NFS services. 
This is a product-specific HA pattern (including which Azure NFS options to use and how to structure mounts) that goes beyond generic concepts and provides concrete architectural guidance for this scenario. | -| [Install PAS and AAS with SAP NetWeaver HA cluster - RHEL](https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-with-dialog-instance) | configuration | 0.65 | Describes installing and configuring PAS and AAS dialog instances on existing RHEL HA cluster VMs that already host ASCS/SCS and ERS. This implies detailed SAP and OS-level configuration steps (instance profiles, ports, clustering specifics) that are product-specific configuration knowledge, not just conceptual guidance. | | [Plan your automated deployment](https://learn.microsoft.com/en-us/azure/sap/automation/plan-deployment) | decision-making | 0.65 | Described as key planning decisions for subscriptions, credentials, and virtual network design specific to SAP Deployment Automation Framework; this is product-specific decision guidance about structure and deployment scenarios, fitting decision-making. Likely includes concrete recommendations for how to organize subscriptions and DevOps for this framework. | | [Removing the SAP system using shell scripts](https://learn.microsoft.com/en-us/azure/sap/automation/bash/remover) | deployment | 0.65 | Script-based removal of SAP systems in Azure deployment automation typically includes specific resource dependencies and ordering, which are expert deployment teardown details. | | [Removing the control plane using shell scripts](https://learn.microsoft.com/en-us/azure/sap/automation/bash/remove-controlplane) | deployment | 0.65 | Removal of the SAP control plane (Deployer, Library) via script is a deployment/teardown pattern with product-specific steps and constraints, not generic shell usage. 
| +| [Start and stop SAP system - Azure CLI](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quick-stop-start-sap-cli) | integrations | 0.65 | Similar to index 3 but with Azure CLI. Contains specific CLI commands, flags, and parameter names for interacting with VIS resources to start/stop SAP systems. These are product-specific integration patterns rather than generic CLI usage. | +| [Start and stop SAP system - Azure PowerShell](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quick-stop-start-sap-powershell) | integrations | 0.65 | Shows how to start/stop SAP systems through the VIS resource using Azure PowerShell. Likely includes specific cmdlet names, parameters, and required values unique to Azure Center for SAP solutions integration, which fits integrations & coding patterns more than generic configuration. | | [Supported SAP software for Azure deployments](https://learn.microsoft.com/en-us/azure/sap/workloads/supported-product-on-azure) | decision-making | 0.65 | Guides evaluation of which SAP products and OS/DBMS releases are supported on Azure; relies on specific compatibility/support matrices. | | [Supported platforms and features](https://learn.microsoft.com/en-us/azure/sap/automation/testing-framework-supportability) | limits-quotas | 0.65 | A 'supported platforms and features' page typically includes explicit OS versions, platform matrices, and feature support tables, which are expert, version-specific limits/compatibility details. | | [Windows DFS-N - flexible SAPMNT for SMB based file share](https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-windows-dfs) | best-practices | 0.65 | Describes a specific workaround for SAPMNT naming limitations when using Azure NetApp Files or Azure Files Premium SMB, including how to configure DFS-N namespaces for SAP. This is a product- and workload-specific pattern that functions as a best-practice/gotcha guide rather than generic DFS documentation. 
| @@ -314,15 +294,19 @@ confusable_not_for: Not for Azure Large Instances (use azure-large-instances), A | [Configure Azure Monitor for SAP solutions alerts in Azure portal](https://learn.microsoft.com/en-us/azure/sap/monitor/get-alerts-portal) | 0.30 | How-to for configuring alerts in Azure Monitor for SAP solutions via the portal. Likely step-by-step UI guidance without detailed configuration parameter tables, numeric limits, or product-specific error codes. Does not clearly match limits, configuration, troubleshooting, or other expert-knowledge categories as defined. | | [Configure and monitor Backup for SAP system](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/acss-backup-integration) | 0.30 | Primarily a workflow/tutorial for configuring Azure Backup via VIS for SAP systems. The summary does not indicate detailed parameter tables, limits, or product-specific error codes; it focuses on a single workflow rather than deep configuration or integration reference. | | [Deploy S/4 HANA infrastructure - Azure PowerShell](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-create-distributed-non-high-availability) | 0.30 | Quickstart for deploying a non-HA SAP system via Azure PowerShell. Primarily a procedural tutorial without detailed configuration tables, limits, or product-specific troubleshooting/decision matrices beyond what an LLM would generally know. | +| [Download and prepare software media](https://learn.microsoft.com/en-us/azure/sap/automation/software) | 0.30 | Focuses on downloading SAP software using Ansible playbooks; from the summary it looks like a procedural tutorial rather than a reference of parameters, limits, or troubleshooting mappings. | | [Enable Dedicated Hosting Plan in Azure Monitor for SAP solutions](https://learn.microsoft.com/en-us/azure/sap/monitor/enable-dedicated-hosting-plan) | 0.30 | How-to guide for switching Azure Monitor for SAP solutions from Elastic Premium to a dedicated hosting plan. 
Summary suggests step-by-step portal actions without exposing detailed configuration parameter tables, limits, or product-specific deployment matrices beyond generic hosting-plan change instructions. | | [Get started with Azure Monitor for SAP solutions - Azure portal](https://learn.microsoft.com/en-us/azure/sap/monitor/quickstart-portal) | 0.30 | Quickstart for deploying Azure Monitor for SAP solutions via the portal; likely a step-by-step tutorial without detailed limits, configuration matrices, or product-specific troubleshooting/error-code mappings. | +| [How to onboard SAP Edge Integration Cell with Azure](https://learn.microsoft.com/en-us/azure/sap/workloads/sap-edge-integration-cell-with-azure) | 0.30 | Onboarding SAP Edge Integration Cell with AKS appears to be an overview/onboarding flow. Summary emphasizes hybrid model and high-level scenario (govern from Azure, deploy anywhere) rather than detailed config tables, limits, or error codes. Likely more tutorial/overview than deep expert configuration or troubleshooting. | | [Install SAP software - Azure CLI](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-install-high-availability-namecustom-cli) | 0.30 | Quickstart for installing a distributed HA SAP system with custom resource names via Azure CLI; focuses on procedural steps rather than limits, configuration matrices, troubleshooting mappings, or quantified best practices. | | [Install SAP software - Azure PowerShell](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-install-distributed-non-high-availability) | 0.30 | Quickstart for installing SAP software on Azure using PowerShell; primarily a step-by-step deployment/tutorial flow without configuration tables, limits, error-code mappings, or product-specific decision matrices. 
| +| [Manage Virtual Instance for SAP solutions](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/manage-virtual-instance) | 0.30 | Appears to be a how-to/manage article for Virtual Instance for SAP solutions in Azure portal without clear indication of numeric limits, detailed configuration parameter tables, or troubleshooting error mappings. Likely procedural guidance rather than expert-only reference content. | | [Overview](https://learn.microsoft.com/en-us/azure/sap/automation/testing-framework) | 0.30 | High-level overview of the SAP Testing Automation Framework; primarily conceptual description of purpose and capabilities without detailed configuration tables, limits, or error mappings. | | [Overview](https://learn.microsoft.com/en-us/azure/sap/business-process-solutions/about-business-process-solutions) | 0.30 | Introduction to Business Process Solutions; high-level description of purpose and capabilities without detailed configuration or limits. | | [Overview](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/overview) | 0.30 | High-level overview of Azure Center for SAP solutions without detailed configuration tables, limits, or decision matrices. | | [Overview](https://learn.microsoft.com/en-us/azure/sap/monitor/about-azure-monitor-sap-solutions) | 0.30 | Introductory 'What is' article for Azure Monitor for SAP solutions; mainly conceptual overview of purpose and scope without detailed configuration or limits. | | [SAP and Microsoft integration scenarios](https://learn.microsoft.com/en-us/azure/sap/workloads/integration-get-started) | 0.30 | Overview of integration scenarios; no clear evidence of parameter tables, limits, or concrete integration configs. 
| +| [Sample deployment configurations](https://learn.microsoft.com/en-us/azure/sap/automation/new-vs-existing) | 0.30 | Describes configuring SAP Deployment Automation Framework for new vs existing deployments; summary suggests scenario-based setup guidance without explicit limits, decision matrices, or detailed configuration reference tables. | | [Soft stop SAP instances and HANA database](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/soft-stop-sap-and-hana-database) | 0.30 | Appears to be a how-to for soft stopping SAP and HANA via VIS using PowerShell/CLI/REST; summary does not indicate product-specific limits, config tables, error codes, or other expert-only details. | | [Testing framework architecture](https://learn.microsoft.com/en-us/azure/sap/automation/testing-framework-architecture) | 0.30 | Architecture overview of the testing framework; describes components and distributed architecture conceptually, not product-specific thresholds, decision matrices, or configuration parameters. | | [View cost analysis for SAP system](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/view-cost-analysis) | 0.30 | Cost analysis viewing guide; likely UI navigation and conceptual usage of VIS cost view without numeric limits, config matrices, or troubleshooting mappings. | @@ -330,7 +314,6 @@ confusable_not_for: Not for Azure Large Instances (use azure-large-instances), A | [Currency conversion](https://learn.microsoft.com/en-us/azure/sap/business-process-solutions/currency-conversion) | 0.20 | Appears to be conceptual guidance on how currency conversion works between SAP and Business Process Solutions; summary does not indicate numeric limits, config tables, error codes, or product-specific parameters. Likely a functional explanation rather than expert configuration or troubleshooting content. 
| | [FAQ](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/faq) | 0.20 | FAQ content about Azure Center for SAP solutions appears to be general Q&A and conceptual clarification rather than detailed limits, configuration tables, error-code-based troubleshooting, or decision matrices with quantified trade-offs. No strong evidence of product-specific numeric limits, configuration parameter tables, or structured troubleshooting mappings. | | [Get started with automation framework](https://learn.microsoft.com/en-us/azure/sap/automation/get-started) | 0.20 | Page is a getting-started/tutorial-style guide for SAP Deployment Automation Framework using sample parameter files. It does not focus on detailed configuration matrices, limits, quotas, or product-specific troubleshooting/error codes; instead it walks through deploying an example configuration. Lacks the structured expert-knowledge patterns required for any sub-skill type. | -| [Manage Virtual Instance for SAP solutions](https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/manage-virtual-instance) | 0.20 | Page appears to be a how-to for viewing and managing a Virtual Instance for SAP solutions in the Azure portal. Description suggests general management/monitoring steps rather than detailed configuration tables, limits, error-code mappings, or security role definitions. No clear indication of numeric limits, parameter tables, or troubleshooting matrices, so it likely does not contain the kind of expert-only reference data required for any sub-skill type. | | [Manage datasets](https://learn.microsoft.com/en-us/azure/sap/business-process-solutions/manage-datasets) | 0.20 | Article appears to be a procedural how-to for configuring datasets in Business Process Solutions without evidence of numeric limits, detailed configuration parameter tables, error-code-based troubleshooting, or other product-specific expert reference data. 
| [Overview](https://learn.microsoft.com/en-us/azure/sap/automation/deployment-framework) | 0.20 | High-level overview of SAP Deployment Automation Framework; conceptual description of what it is and does. | | [Overview](https://learn.microsoft.com/en-us/azure/sap/sap-on-azure-overview) | 0.20 | High-level overview of SAP on Azure offerings without detailed limits, configs, or decision matrices. diff --git a/products/azure-security/azure-security.csv b/products/azure-security/azure-security.csv index eb2323eb..406d9958 100644 --- a/products/azure-security/azure-security.csv +++ b/products/azure-security/azure-security.csv @@ -71,12 +71,12 @@ https://learn.microsoft.com/en-us/azure/security/fundamentals/code-integrity,Cod https://learn.microsoft.com/en-us/azure/security/fundamentals/customer-lockbox-alternative-email,Alternate email notifications,Customer Lockbox for Microsoft Azure alternate email feature,Configure alternate email notifications for Customer Lockbox,Customer Lockbox for Microsoft Azure alternate email feature,"Note To use this feature, your organization must have an Azure support plan with a minimal level of Developer. The alternate email notification feature enables customers to use alternate email IDs for getting Customer Lockbox notifications. This feature lets Customer Lockbox for Microsoft Azure customers receive notifications when their Azure account isn't email enabled or when a service principal is defined as the tenant admin or subscription owner. 
Important This feature only enables Customer Lockbox",2025-05-19T17:08:00.000Z,article,security,0.65,True,"Describes alternate email notification feature for Customer Lockbox, including constraints (support plan, account types); specific security-related configuration behavior.",unchanged https://learn.microsoft.com/en-us/azure/security/fundamentals/customer-lockbox-faq,FAQ,Customer Lockbox for Microsoft Azure frequently asked questions,Resolve common issues with Azure Customer Lockbox,Frequently asked questions about Customer Lockbox,This article answers frequently asked questions about Customer Lockbox for Microsoft Azure.,2026-01-22T06:12:00Z,overview,troubleshooting,0.6,True,"FAQ for Customer Lockbox likely includes specific behaviors, edge cases, and resolutions for common issues, mapping symptoms to explanations and fixes.",unchanged https://learn.microsoft.com/en-us/azure/security/fundamentals/customer-lockbox-overview,Overview,Customer Lockbox for Microsoft Azure,Control Microsoft engineer data access with Customer Lockbox,"Technical overview of Customer Lockbox for Microsoft Azure, which provides control over cloud provider access when Microsoft might need to access customer data.","Note To use this feature, your organization must have an Azure support plan with a minimal level of Developer. Most operations and support performed by Microsoft personnel and subprocessors do not require access to customer data. In those rare circumstances where such access is required, Customer Lockbox for Microsoft Azure provides an interface for customers to review and approve or reject customer data access requests. 
It is used in cases where a Microsoft engineer needs to access customer data, ",2025-04-16T08:00:00.000Z,article,security,0.7,True,"Technical overview of Customer Lockbox behavior and requirements (support plan level, access flow); product-specific security control details.",unchanged -https://learn.microsoft.com/en-us/azure/security/fundamentals/data-encryption-best-practices,Best practices,Data security and encryption best practices - Microsoft Azure,Implement Azure data security and encryption best practices,This article provides a set of best practices for data security and encryption using built in Azure capabilities.,"This article describes best practices for data security and encryption. The best practices are based on a consensus of opinion, and they work with current Azure platform capabilities and feature sets. Opinions and technologies change over time and this article is updated on a regular basis to reflect those changes. This article aligns with Microsoft'sZero Trustsecurity model, which treats data as one of the critical pillars requiring protection at all stages. For prescriptive security controls w",2026-04-02T08:00:00.000Z,article,best-practices,0.7,True,Prescriptive best-practices article aligned with Zero Trust and current Azure capabilities; likely includes concrete Azure-specific recommendations and patterns for configuring encryption and data protection beyond generic security advice.,unchanged +https://learn.microsoft.com/en-us/azure/security/fundamentals/data-encryption-best-practices,Best practices,Data security and encryption best practices - Microsoft Azure,Apply Azure data security and encryption best practices,This article provides a set of best practices for data security and encryption using built in Azure capabilities.,"This article describes best practices for data security and encryption. The best practices are based on a consensus of opinion, and they work with current Azure platform capabilities and feature sets. 
Opinions and technologies change over time and this article is updated on a regular basis to reflect those changes. This article aligns with Microsoft's Zero Trust security model, which treats data as one of the critical pillars requiring protection at all stages. For prescriptive security controls w",2026-04-21T22:10:00.000Z,article,best-practices,0.7,True,"Article is explicitly framed as prescriptive best practices for data security and encryption using Azure capabilities. It provides concrete, product-specific recommendations aligned to Azure’s Zero Trust model rather than just conceptual security theory, fitting the best-practices category.",updated https://learn.microsoft.com/en-us/azure/security/fundamentals/database-security-checklist,Security checklist,Azure database security checklist,Use Azure SQL database security checklist,Use the Azure database security checklist to ensure you address important cloud database security controls for Azure SQL Database and Azure SQL Managed Instance.,"To help improve security, Azure SQL Database and Azure SQL Managed Instance include built-in security controls that you can use to limit and control access, protect data, and monitor threats. Security controls include:",2025-11-07T18:10:00.000Z,article,best-practices,0.7,True,"Checklist of concrete security controls for Azure SQL Database and Managed Instance; product-specific, actionable items.",unchanged https://learn.microsoft.com/en-us/azure/security/fundamentals/double-encryption,Double encryption,Double Encryption in Microsoft Azure,,This article describes how Microsoft Azure provides double encryption for data at rest and data in transit.,Double encryption is where two or more independent layers of encryption are enabled to protect against compromises of any one layer of encryption. Using two layers of encryption mitigates threats that come with encrypting data. 
For example: Azure provides double encryption for data at rest and data in transit.,2026-04-02T22:11:00.000Z,article,,0.25,False,Explains the concept of double encryption and that Azure provides it for data at rest and in transit; appears conceptual without detailed configuration values or product-specific patterns.,unchanged -https://learn.microsoft.com/en-us/azure/security/fundamentals/encryption-atrest,Encryption at rest,Azure data encryption at rest - Azure Security,,"This article provides an overview of Azure data encryption at rest, the overall capabilities, and general considerations.","Microsoft Azure includes tools to safeguard data according to your company's security and compliance needs. This article focuses on: Encryption at rest is a common security requirement. In Azure, data is encrypted at rest by default by using platform-managed keys. This approach provides organizations with automatic encryption without the risk or cost of a custom key management solution. Organizations can rely on Azure to completely manage encryption at rest by using platform-managed keys, or the",2026-04-09T08:00:00.000Z,article,,0.2,False,"Described as an overview of Azure data encryption at rest and general considerations; no indication of specific RBAC roles, configuration parameters, limits, or error codes. Content is conceptual security guidance rather than product-specific configuration or troubleshooting details.",unchanged +https://learn.microsoft.com/en-us/azure/security/fundamentals/encryption-atrest,Encryption at rest,Azure data encryption at rest - Azure Security,,"This article provides an overview of Azure data encryption at rest, the overall capabilities, and general considerations.","Microsoft Azure includes tools to safeguard data according to your company's security and compliance needs. This article focuses on: Encryption at rest is a common security requirement. In Azure, data is encrypted at rest by default by using platform-managed keys. 
This approach provides organizations with automatic encryption without the risk or cost of a custom key management solution. Organizations can rely on Azure to completely manage encryption at rest by using platform-managed keys, or the",2026-04-20T08:00:00.000Z,article,,0.2,False,"High-level overview of Azure encryption at rest and key management options; focuses on concepts and general considerations without concrete configuration parameters, limits, or product-specific error/decision matrices.",updated https://learn.microsoft.com/en-us/azure/security/fundamentals/encryption-customer-managed-keys-support,Services supporting CMKs,Services that support customer managed keys (CMKs) in Azure Key Vault and Azure Managed HSM,,Services that support customer managed keys (CMKs) in Azure Key Vault and Azure Managed HSM,"Customer-managed keys (CMK) is a key management control model in which you own and manage the key encryption key (KEK) in your own Azure Key Vault or Azure Managed HSM instance. Azure services use your KEK to wrap and unwrap their data encryption keys through envelope encryption. For HSM-protected keys, use Azure Key Vault Premium tier or Azure Managed HSM. The following services support server-side encryption with customer managed keys. For implementation details, see the service-specific documenta",2026-04-02T08:00:00.000Z,article,,0.4,False,"Primarily a catalog of Azure services that support customer-managed keys; does not describe limits, configuration parameters, or decision matrices itself, instead linking to service-specific docs.",unchanged -https://learn.microsoft.com/en-us/azure/security/fundamentals/encryption-models,Data encryption models,Data encryption models in Microsoft Azure,,This article provides an overview of data encryption models In Microsoft Azure.,"To understand how Azure resource providers implement encryption at rest, you need to understand the different encryption models and their advantages and disadvantages. 
To ensure a common language and taxonomy, Azure resource providers share these definitions. Azure automatically encrypts data at rest by default by using platform-managed keys. You can optionally choose other key management approaches based on your security and compliance requirements. Three scenarios exist for server-side encrypt",2026-04-02T08:00:00.000Z,article,,0.25,False,"Describes different data encryption models and their pros/cons at a conceptual level; no indication of numeric thresholds, configuration tables, or detailed decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/security/fundamentals/encryption-models,Data encryption models,Data encryption models in Microsoft Azure,,This article provides an overview of data encryption models In Microsoft Azure.,"To understand how Azure resource providers implement encryption at rest, you need to understand the different encryption models and their advantages and disadvantages. To ensure a common language and taxonomy, Azure resource providers share these definitions. Azure automatically encrypts data at rest by default by using platform-managed keys. You can optionally choose other key management approaches based on your security and compliance requirements. Three scenarios exist for server-side encrypt",2026-04-20T08:00:00.000Z,article,,0.2,False,"Describes Azure data encryption models and their advantages/disadvantages at a conceptual level; does not expose detailed configuration parameters, limits, or decision matrices with quantified trade-offs.",updated https://learn.microsoft.com/en-us/azure/security/fundamentals/encryption-overview,Data security and encryption,Azure encryption overview,,"Learn about encryption options in Azure. See information for encryption at rest, encryption in flight, and key management with Azure Key Vault.","This article provides an overview of how encryption is used in Microsoft Azure. 
It covers the major areas of encryption, including encryption at rest, encryption in flight, and key management with Azure Key Vault.",2026-04-02T08:00:00.000Z,article,,0.15,False,"General overview of encryption at rest, in flight, and key management; lacks detailed configuration, limits, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/security/fundamentals/end-to-end,End-to-end security,End-to-end security in Azure,,"This article provides an overview of Azure security architecture organized by protection, detection, and response capabilities, with links to detailed domain-specific documentation.","Azure provides comprehensive security capabilities across all layers of your cloud deployments. Microsoft Azure delivers confidentiality, integrity, and availability of customer data while enabling transparent accountability. This article introduces Azure's security architecture organized by protection, detection, and response capabilities. For a comprehensive introduction to Azure security capabilities organized by functional area, seeIntroduction to Azure security. For detailed implementation ",2025-11-07T18:10:00.000Z,conceptual,,0.2,False,"End-to-end architecture overview organized by protection/detection/response; lacks concrete configuration parameters, limits, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/security/fundamentals/feature-availability,Feature availability for US Government clouds,Cloud feature availability for commercial and US Government customers,Check Azure vs Azure Government security feature availability,This article describes security feature availability in Azure and Azure Government clouds,"This article describes feature availability in the Microsoft Azure and Azure Government clouds. Features are listed asGA(Generally Available),Public Preview, orNot Available. 
The tables in this article are updated regularly to reflect the current state of feature availability.",2026-01-09T18:19:00.000Z,feature-availability,deployment,0.7,True,"Contains tables of which security features are GA, in preview, or unavailable across commercial and US Government clouds. This is concrete, SKU/cloud-specific capability information that affects deployment choices and is not inferable from general training data.",unchanged diff --git a/products/azure-security/report.md b/products/azure-security/report.md index 87fa75ad..154ca299 100644 --- a/products/azure-security/report.md +++ b/products/azure-security/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-12' +generated_at: '2026-04-26' category_descriptions: integrations: Guidance on generating signed SBOMs for container images, attaching them in CI/CD, and integrating software supply chain security into deployment @@ -13,9 +13,9 @@ category_descriptions: security: 'Securing Azure workloads: threat modeling mitigations, AKS image signing, crypto and data protection, ransomware defense, incident response, and Azure-specific security/operational best practices.' - best-practices: Security hardening checklists and patterns for Azure (IaaS/PaaS), - covering identity, network, data encryption, secrets, DNS, and app/database protection - best practices + best-practices: 'Security hardening checklists and patterns for Azure IaaS/PaaS: + identity, network, data encryption, secrets, SQL/Storage/Service Fabric, Marketplace + images, and subdomain takeover prevention.' troubleshooting: Diagnosing and resolving common Azure Customer Lockbox issues, including access request problems, approval/denial errors, and configuration or permission-related failures. @@ -24,16 +24,17 @@ category_descriptions: trade-offs. 
skill_description: Expert knowledge for Azure Security development including troubleshooting, best practices, decision making, security, configuration, integrations & coding - patterns, and deployment. Use when securing AKS workloads, SBOMs, Notation image - signing, Key Vault/HSM keys, or Customer Lockbox access, and other Azure Security + patterns, and deployment. Use when securing AKS workloads, signing container images/SBOMs, + configuring firewalls/antimalware, or choosing Key Vault/HSM, and other Azure Security related development tasks. Not for Azure Defender For Cloud (use azure-defender-for-cloud), - Azure Firewall (use azure-firewall), Azure DDos Protection (use azure-ddos-protection), - Azure Web Application Firewall (use azure-web-application-firewall). -use_when: Use when securing AKS workloads, SBOMs, Notation image signing, Key Vault/HSM - keys, or Customer Lockbox access, and other Azure Security related development tasks. + Azure Sentinel (use azure-sentinel), Azure Firewall (use azure-firewall), Azure + Web Application Firewall (use azure-web-application-firewall). +use_when: Use when securing AKS workloads, signing container images/SBOMs, configuring + firewalls/antimalware, or choosing Key Vault/HSM, and other Azure Security related + development tasks. confusable_not_for: Not for Azure Defender For Cloud (use azure-defender-for-cloud), - Azure Firewall (use azure-firewall), Azure DDos Protection (use azure-ddos-protection), - Azure Web Application Firewall (use azure-web-application-firewall). + Azure Sentinel (use azure-sentinel), Azure Firewall (use azure-firewall), Azure + Web Application Firewall (use azure-web-application-firewall). 
--- # Azure Security Crawl Report @@ -47,8 +48,8 @@ confusable_not_for: Not for Azure Defender For Cloud (use azure-defender-for-clo ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 0 -- **Unchanged**: 126 +- **Updated Pages**: 3 +- **Unchanged**: 123 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-security/azure-security.csv` @@ -67,6 +68,15 @@ confusable_not_for: Not for Azure Defender For Cloud (use azure-defender-for-clo ## Changes +### Updated Pages + +- [Encryption at rest](https://learn.microsoft.com/en-us/azure/security/fundamentals/encryption-atrest) + - Updated: 2026-04-09T08:00:00.000Z → 2026-04-20T08:00:00.000Z +- [Best practices](https://learn.microsoft.com/en-us/azure/security/fundamentals/data-encryption-best-practices) + - Updated: 2026-04-02T08:00:00.000Z → 2026-04-21T22:10:00.000Z +- [Data encryption models](https://learn.microsoft.com/en-us/azure/security/fundamentals/encryption-models) + - Updated: 2026-04-02T08:00:00.000Z → 2026-04-20T08:00:00.000Z + ## Classified Pages | TOC Title | Type | Confidence | Reason | @@ -84,7 +94,7 @@ confusable_not_for: Not for Azure Defender For Cloud (use azure-defender-for-clo | [Auditing and logging](https://learn.microsoft.com/en-us/azure/security/develop/threat-modeling-tool-auditing-and-logging) | security | 0.70 | Threat-model-specific mitigation guidance and code examples for auditing/logging are product- and tool-specific security patterns, beyond generic concepts. | | [Authorization](https://learn.microsoft.com/en-us/azure/security/develop/threat-modeling-tool-authorization) | security | 0.70 | Lists specific authorization-related threats and mitigation instructions tied to the Threat Modeling Tool threat library, which is specialized security guidance. 
| | [Azure Marketplace images](https://learn.microsoft.com/en-us/azure/security/fundamentals/azure-marketplace-images) | best-practices | 0.70 | Security configuration requirements for Marketplace images; includes concrete recommendations and likely specific settings that are product- and process-specific. | -| [Best practices](https://learn.microsoft.com/en-us/azure/security/fundamentals/data-encryption-best-practices) | best-practices | 0.70 | Prescriptive best-practices article aligned with Zero Trust and current Azure capabilities; likely includes concrete Azure-specific recommendations and patterns for configuring encryption and data protection beyond generic security advice. | +| [Best practices](https://learn.microsoft.com/en-us/azure/security/fundamentals/data-encryption-best-practices) | best-practices | 0.70 | Article is explicitly framed as prescriptive best practices for data security and encryption using Azure capabilities. It provides concrete, product-specific recommendations aligned to Azure’s Zero Trust model rather than just conceptual security theory, fitting the best-practices category. | | [Best practices](https://learn.microsoft.com/en-us/azure/security/fundamentals/identity-management-best-practices) | best-practices | 0.70 | Prescriptive identity and access control guidance for Entra ID; likely includes concrete recommendations (e.g., specific policies, MFA configurations, conditional access patterns) that are product-specific. | | [Best practices](https://learn.microsoft.com/en-us/azure/security/fundamentals/network-best-practices) | best-practices | 0.70 | Collection of Azure-specific network security recommendations (NSGs, firewalls, DDoS, segmentation) with concrete guidance tailored to Azure. 
| | [Best practices](https://learn.microsoft.com/en-us/azure/security/fundamentals/operational-best-practices) | best-practices | 0.70 | Operational best practices for protecting Azure data, apps, and assets; includes Azure-specific recommendations and controls aligned to Zero Trust. | @@ -183,7 +193,6 @@ confusable_not_for: Not for Azure Defender For Cloud (use azure-defender-for-clo | [Shared responsibility in the cloud](https://learn.microsoft.com/en-us/azure/security/fundamentals/shared-responsibility) | 0.30 | Explains shared responsibility model conceptually across SaaS/PaaS/IaaS; no product-specific settings, limits, or detailed role/permission mappings. | | [Trusted Hardware Identity Management](https://learn.microsoft.com/en-us/azure/security/fundamentals/trusted-hardware-identity-management) | 0.30 | Technical overview of Trusted Hardware Identity Management and its role in certificate cache management and TCB information; appears architectural/conceptual without concrete configuration parameters, limits, or troubleshooting mappings. | | [Virtual machine security overview](https://learn.microsoft.com/en-us/azure/security/fundamentals/virtual-machines-overview) | 0.30 | Overview of security features for Azure VMs; primarily descriptive without detailed settings, limits, or troubleshooting mappings. | -| [Data encryption models](https://learn.microsoft.com/en-us/azure/security/fundamentals/encryption-models) | 0.25 | Describes different data encryption models and their pros/cons at a conceptual level; no indication of numeric thresholds, configuration tables, or detailed decision matrices. | | [Double encryption](https://learn.microsoft.com/en-us/azure/security/fundamentals/double-encryption) | 0.25 | Explains the concept of double encryption and that Azure provides it for data at rest and in transit; appears conceptual without detailed configuration values or product-specific patterns. 
| | [Microsoft Threat Modeling tool](https://learn.microsoft.com/en-us/azure/security/develop/threat-modeling-tool) | 0.25 | Overview of the Threat Modeling Tool and process; primarily conceptual and marketing-style description without detailed configuration or limits. | | [Mitigations](https://learn.microsoft.com/en-us/azure/security/develop/threat-modeling-tool-mitigations) | 0.25 | Mitigations page appears to describe generic mitigation guidance for threats; summary does not indicate product-specific configuration parameters or error mappings. | @@ -191,8 +200,9 @@ confusable_not_for: Not for Azure Defender For Cloud (use azure-defender-for-clo | [Acquire Stage](https://learn.microsoft.com/en-us/azure/security/container-secure-supply-chain/articles/container-secure-supply-chain-implementation/acquire-overview) | 0.20 | Acquire stage overview describes goals and concepts; no specific RBAC roles, config parameters, or quantified checks are indicated. | | [Build Stage](https://learn.microsoft.com/en-us/azure/security/container-secure-supply-chain/articles/container-secure-supply-chain-implementation/build-overview) | 0.20 | Build stage overview is about background and goals; no product-specific configuration tables, limits, or error mappings are evident. | | [Catalog Stage](https://learn.microsoft.com/en-us/azure/security/container-secure-supply-chain/articles/container-secure-supply-chain-implementation/catalog-overview) | 0.20 | Catalog stage overview focuses on rationale and objectives; lacks detailed settings, limits, or concrete security configurations. | +| [Data encryption models](https://learn.microsoft.com/en-us/azure/security/fundamentals/encryption-models) | 0.20 | Describes Azure data encryption models and their advantages/disadvantages at a conceptual level; does not expose detailed configuration parameters, limits, or decision matrices with quantified trade-offs. 
| | [Deploy Stage](https://learn.microsoft.com/en-us/azure/security/container-secure-supply-chain/articles/container-secure-supply-chain-implementation/deploy-overview) | 0.20 | Deploy stage overview discusses validating metadata and attestations conceptually; does not show concrete policies, parameters, or decision matrices. | -| [Encryption at rest](https://learn.microsoft.com/en-us/azure/security/fundamentals/encryption-atrest) | 0.20 | Described as an overview of Azure data encryption at rest and general considerations; no indication of specific RBAC roles, configuration parameters, limits, or error codes. Content is conceptual security guidance rather than product-specific configuration or troubleshooting details. | +| [Encryption at rest](https://learn.microsoft.com/en-us/azure/security/fundamentals/encryption-atrest) | 0.20 | High-level overview of Azure encryption at rest and key management options; focuses on concepts and general considerations without concrete configuration parameters, limits, or product-specific error/decision matrices. | | [End-to-end security](https://learn.microsoft.com/en-us/azure/security/fundamentals/end-to-end) | 0.20 | End-to-end architecture overview organized by protection/detection/response; lacks concrete configuration parameters, limits, or decision matrices. | | [Introduction](https://learn.microsoft.com/en-us/azure/security/container-secure-supply-chain/articles/container-secure-supply-chain-implementation/containers-secure-supply-chain-overview) | 0.20 | High-level introduction to the Containers Secure Supply Chain framework; conceptual overview without concrete configuration values, limits, or error details. | | [Introduction to Azure security](https://learn.microsoft.com/en-us/azure/security/fundamentals/overview) | 0.20 | High-level overview of Azure security capabilities; primarily conceptual and marketing-style, without detailed configuration, limits, or error mappings. 
| diff --git a/products/azure-sentinel/azure-sentinel.csv b/products/azure-sentinel/azure-sentinel.csv index 4eb7fb33..ec6afd50 100644 --- a/products/azure-sentinel/azure-sentinel.csv +++ b/products/azure-sentinel/azure-sentinel.csv @@ -1,402 +1,402 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type -https://learn.microsoft.com/en-us/azure/sentinel/add-advanced-conditions-to-automation-rules,Add advanced conditions to automation rules,Add advanced conditions to Microsoft Sentinel automation rules,Add advanced OR condition groups to Sentinel automation rules,"This article explains how to add complex, advanced ""Or"" conditions to automation rules in Microsoft Sentinel, for more effective triage of incidents.","This article explains how to add advanced ""Or"" conditions to automation rules in Microsoft Sentinel, for more effective triage of incidents. Add ""Or"" conditions in the form of condition groups in the Conditions section of your automation rule. 
Condition groups can contain two levels of conditions: Simple: At least two conditions, each separated by an OR operator: Compound: More than two conditions, with at least two conditions on at least one side of an OR operator: You can see that this capability af",2025-04-24T22:03:00.000Z,how-to,configuration,0.8,True,"Explains advanced condition groups and OR logic in automation rules, including structure of simple vs compound conditions—detailed, product-specific configuration behavior.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/add-entity-to-threat-intelligence,Add entity to threat indicators,Add entities to threat intelligence - Microsoft Sentinel,,Learn how to add a malicious entity discovered in an incident investigation to your threat intelligence in Microsoft Sentinel.,"During an investigation, you examine entities and their context as an important part of understanding the scope and nature of an incident. When you discover an entity as a malicious domain name, URL, file, or IP address in the incident, it should be labeled and tracked as an indicator of compromise (IOC) in your threat intelligence. 
For example, you might discover an IP address that performs port scans across your network or functions as a command and control node by sending and/or receiving tra",2024-09-10T22:03:00.000Z,how-to,,0.4,False,"Describes adding entities discovered during investigations as IOCs; likely a procedural UI guide without deep config tables, limits, or error-code troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/ama-migrate,AMA migration for Microsoft Sentinel,Migrate to the Azure Monitor Agent (AMA) from the Log Analytics agent (MMA/OMS) for Microsoft Sentinel,Plan and execute migration from MMA to AMA for Sentinel,"Learn about migrating from the Log Analytics agent (MMA/OMS) to the Azure Monitor Agent (AMA), when working with Microsoft Sentinel.","This article describes the migration process to the Azure Monitor Agent (AMA) when you have an existing, legacy Log Analytics Agent (MMA/OMS), and are working with Microsoft Sentinel. The Log Analytics agent is retired as of 31 August, 2024. If you are using the Log Analytics agent in your Microsoft Sentinel deployment, we recommend that you migrate to the AMA.",2024-10-07T17:05:00.000Z,reference,decision-making,0.7,True,"Migration article between agents; usually includes decision guidance, compatibility notes, and stepwise migration considerations specific to Sentinel deployments.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/anomalies-reference,Anomalies reference,Anomalies detected by the Microsoft Sentinel machine learning engine,,Learn about the anomalies detected by Microsoft Sentinel's machine learning engines.,"Microsoft Sentinel detects anomalies by analyzing the behavior of users in an environment over a period of time and constructing a baseline of legitimate activity. Once the baseline is established, any activity outside the normal parameters is considered anomalous and therefore suspicious. Microsoft Sentinel uses two models to create baselines and detect anomalies. 
This article lists the anomalies that Microsoft Sentinel detects using various machine learning models. In the Anomalies table: Note T",2026-03-02T12:12:00.000Z,reference,,0.3,False,"The page lists types of anomalies detected by Microsoft Sentinel's ML engine and explains baseline/anomaly concepts, but there is no indication of numeric limits, configuration tables, error codes, or decision matrices. It appears to be descriptive/reference content about anomaly categories rather than detailed configuration, troubleshooting, or quantified guidance.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/api-dcr-reference,Sample API requests for creating Data Collection Rules (DCRs),Microsoft Sentinel API request examples for creating Data Collection Rules (DCRs),Create Sentinel Data Collection Rules via API examples,"See samples of API requests for creating Data Collection Rules and their associations, for use with the Azure Monitor Agent.",This article presents some examples of API requests and responses for creating Data Collection Rules (DCRs) and DCR Associations (DCRAs) for use with the Azure Monitor Agent (AMA).,2024-08-26T17:04:00.000Z,reference,integrations,0.8,True,Provides concrete API request/response samples for creating DCRs and DCRAs; includes specific payload structures and parameters for Sentinel/AMA integration.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/audit-sentinel-data,Audit Microsoft Sentinel with Azure Activity Logs,Audit Microsoft Sentinel queries and activities,Audit Microsoft Sentinel queries and user activities,This article describes how to audit queries and activities performed in Microsoft Sentinel.,"This article describes how you can view audit data for queries run and activities performed in your Microsoft Sentinel workspace, such as for internal and external compliance requirements in your Security Operations (SOC) workspace. 
Microsoft Sentinel provides access to: The AzureActivity table, which provides details about all actions taken in Microsoft Sentinel, such as editing alert rules. The AzureActivity table doesn't log specific query data. For more information, see Auditing with Azure Activi",2024-11-12T08:00:00.000Z,how-to,security,0.7,True,Explains how to use AzureActivity and other tables to audit Sentinel operations; includes specific table names and audit patterns for compliance.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/audit-table-reference,SentinelAudit table reference,Microsoft Sentinel audit tables reference,Use Microsoft Sentinel audit tables for monitoring,"Learn about the fields in the SentinelAudit tables, used for audit monitoring and analysis.","This article describes the fields in the SentinelAudit tables, which are used for auditing user activity in Microsoft Sentinel resources. With the Microsoft Sentinel audit feature, you can keep tabs on the actions taken in your SIEM and get information on any changes made to your environment and the users that made those changes. Learn how to query and use the audit table for deeper monitoring and visibility of actions in your environment. 
Microsoft Sentinel's audit feature currently covers only t",2025-08-25T17:10:00.000Z,reference,configuration,0.9,True,Reference for SentinelAudit tables with field definitions for auditing; table schema and field semantics are detailed configuration/reference information unique to Sentinel.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/audit-track-tasks,Audit and track changes to incident tasks,Audit and track changes to incident tasks in Microsoft Sentinel in the Azure portal,Audit and track Sentinel incident task changes,"This article explains how you, as a SOC manager, can audit the history of Microsoft Sentinel incident tasks, and track changes to them, in order to gauge your task assignments and their contribution t","Incident tasks ensure comprehensive and uniform treatment of incidents across all SOC personnel. Task lists are typically defined according to determinations made by senior analysts or SOC managers, and put into practice using automation rules or playbooks. Your analysts can see the list of tasks they need to perform for a particular incident on the incident details page, and mark them complete as they go. Analysts can also create their own tasks on the spot, manually, right from within the incid",2025-01-15T18:04:00.000Z,how-to,best-practices,0.6,True,"Describes how SOC managers audit and track task history to measure SOC efficiency, a Sentinel-specific operational practice rather than generic advice.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/automate-incident-handling-with-automation-rules,Automation rules,Automate threat response in Microsoft Sentinel with automation rules,Implement Sentinel automation rules for SOAR operations,"This article explains what Microsoft Sentinel automation rules are, and how to use them to implement your Security Orchestration, Automation and Response (SOAR) operations. 
Automation rules increase y","This article explains what Microsoft Sentinel automation rules are, and how to use them to implement your Security Orchestration, Automation and Response (SOAR) operations. Automation rules increase your SOC's effectiveness and save you time and resources. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Microsoft Sentinel in the Azure portal will be redirected to the ",2024-10-16T08:00:00.000Z,concept-article,best-practices,0.7,True,"Explains what automation rules are and how to use them to implement SOAR, providing concrete Sentinel-specific guidance on structuring automated responses.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/automation-rule-reference,Automation rules reference,Microsoft Sentinel automation rules reference,Configure Microsoft Sentinel automation rules and conditions,This article displays the supported properties and entities in Microsoft Sentinel automation rules.,"This article contains reference information about the configuration of automation rules and the supported conditions and properties. To learn more about automation rules, see Automate threat response in Microsoft Sentinel with automation rules. 
For instructions on creating, managing, and using automation rules, see Create and use Microsoft Sentinel automation rules to manage response.",2024-09-09T11:20:00.000Z,reference,configuration,0.8,True,"Reference for automation rule configuration, supported properties, and conditions; includes specific field names and allowed values, which are configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/automation/authenticate-playbooks-to-sentinel,Authenticate playbooks to Microsoft Sentinel,Authenticate playbooks to Microsoft Sentinel,Configure authentication for Microsoft Sentinel playbooks,Learn how to give your playbooks access to Microsoft Sentinel and authorization to take remedial actions.,"Microsoft Sentinel playbooks are based on workflows built in Azure Logic Apps, a cloud service that helps you schedule, automate, and orchestrate tasks and workflows across systems throughout the enterprise. Azure Logic Apps must connect separately and authenticate independently to each resource, of each type, that it interacts with, including to Microsoft Sentinel itself. Logic Apps uses specialized connectors for this purpose, with each resource type having its own connector. This article describ",2024-05-21T17:56:00.000Z,how-to,security,0.7,True,"Focuses on how playbooks authenticate to Sentinel via connectors; such articles usually enumerate connector types, permission scopes, and auth configuration steps, which are product-specific security settings.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/automation/automate-responses-with-playbooks,Overview,Automate Threat Response with Playbooks in Microsoft Sentinel,,Learn how to automate threat response in Microsoft Sentinel using playbooks to efficiently manage security alerts and incidents.,"Security operations centers (SOCs) face a constant stream of security alerts and incidents. Managing these efficiently is critical to keeping your organization’s security strong. 
Microsoft Sentinel playbooks are automated workflows that help you respond to threats quickly and consistently. This article shows how to use playbooks in Microsoft Sentinel to automate threat response, cut manual effort, and let your team focus on deeper investigations. Use Microsoft Sentinel playbooks to run preconfig",2025-05-30T11:13:00.000Z,conceptual,,0.4,False,"High-level guidance on using Sentinel playbooks to automate threat response; summary suggests conceptual/tutorial content without specific error codes, config tables, or numeric limits.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/automation/automation,Overview,Automation in Microsoft Sentinel,Design Sentinel SOAR with automation rules and playbooks,"Learn about Microsoft Sentinel security orchestration, automation, and response (SOAR) capabilities and components, including automation rules and playbooks.","Security information and event management (SIEM) and security operations center (SOC) teams are typically inundated with security alerts and incidents on a regular basis, at volumes so large that available personnel are overwhelmed. This results all too often in situations where many alerts are ignored and many incidents aren't investigated, leaving the organization vulnerable to attacks that go unnoticed. 
Microsoft Sentinel, in addition to being a SIEM system, is also a platform for security or",2025-07-16T22:08:00.000Z,conceptual,architecture-patterns,0.65,True,"Explains Sentinel’s SOAR components and how they work together to handle alert volume, providing product-specific orchestration patterns and when to use each component.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/automation/create-playbooks,Create and manage playbooks,Create and manage Microsoft Sentinel playbooks,,Learn how to create and manage Microsoft Sentinel playbooks to automate your incident response and remediate security threats.,"Playbooks are collections of procedures that can be run from Microsoft Sentinel in response to an entire incident, to an individual alert, or to a specific entity. A playbook can help automate and orchestrate your response and can be attached to an automation rule to run automatically when specific alerts are generated or when incidents are created or updated. Playbooks can also be run manually on-demand on specific incidents, alerts, or entities. This article describes how to create and manage ",2024-10-16T08:00:00.000Z,how-to,,0.4,False,"Create/manage playbooks article sounds procedural; summary does not indicate presence of specific config matrices, limits, or error-code-based troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/automation/create-tasks-playbook,Create and perform advanced incident tasks using playbooks,Create and perform incident tasks in Microsoft Sentinel using playbooks,,"This article explains how to use playbooks to create (and optionally perform) incident tasks, in order to manage complex analyst workflow processes in Microsoft Sentinel.","This article explains how to use playbooks to create, and optionally perform, incident tasks to manage complex analyst workflow processes in Microsoft Sentinel. 
Use the Add task action in a playbook, in the Microsoft Sentinel connector, to automatically add a task to the incident that triggered the playbook. Both Standard and Consumption workflows are supported. Tip Incident tasks can be created automatically not only by playbooks, but also by automation rules, and also manually, ad-hoc, from with",2024-05-21T17:56:00.000Z,how-to,,0.45,False,"Shows how to use playbooks to create incident tasks; appears to be workflow guidance without detailed configuration tables, limits, or error-code troubleshooting.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/automation/define-playbook-access-restrictions,Define an access restriction for Standard-plan playbooks,Define an access restriction policy for Standard-plan playbooks,Define access restriction policies for Sentinel Standard playbooks,"This article shows how to define an access restriction policy for Microsoft Sentinel Standard-plan playbooks, so that they can support private endpoints.","This article describes how to define an access restriction policy for Microsoft Sentinel Standard-plan playbooks, so that they can support private endpoints. Define an access restriction policy to ensure that only Microsoft Sentinel has access to the Standard logic app containing your playbook workflows. For more information, see: Important The new version of access restriction policies is currently in PREVIEW. 
See the Supplemental Terms of Use for Microsoft Azure Previews for additional legal terms ",2025-09-10T17:15:00.000Z,how-to,security,0.7,True,"Covers access restriction policy for Standard-plan playbooks and private endpoints; this is product-specific security configuration, likely including policy settings and scopes.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/automation/generate-playbook,Generate playbooks using AI,Generate playbooks using AI in Microsoft Sentinel,,Generate playbooks through natural language conversations directly in the Defender portal.,"The SOAR playbook generator creates Python-based automation workflows coauthored through a conversational experience with Cline, an AI coding agent. You describe automation logic in natural language, and the system generates validated, code-based playbooks with complete documentation and visual flow diagrams. This experience is powered by an embedded Visual Studio Code environment within the Defender portal, so you can author and refine playbooks without leaving the portal. Generated playbooks u",2026-02-23T23:22:00.000Z,how-to,,0.3,False,"Describes AI-based playbook generation experience; mostly feature overview and workflow, not detailed configuration or limits.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/automation/logic-apps-playbooks,Azure Logic Apps for Microsoft Sentinel playbooks,Azure Logic Apps for Microsoft Sentinel playbooks,,Learn about Azure Logic Apps concepts and how they work with Microsoft Sentinel playbooks.,"Microsoft Sentinel playbooks are based on workflows built in Azure Logic Apps, a cloud service that helps you schedule, automate, and orchestrate tasks and workflows across systems throughout the enterprise. Microsoft Sentinel playbooks can take advantage of all the power and capabilities of the built-in templates in Azure Logic Apps. Azure Logic Apps communicates with other systems and services using various types of connectors. 
Use the Microsoft Sentinel connector to create playbooks that interact",2024-08-16T22:04:00.000Z,concept-article,,0.5,False,Explains how Logic Apps underpins Sentinel playbooks; summary reads as conceptual integration overview without detailed parameter tables or constraints.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/automation/migrate-playbooks-to-automation-rules,Migrate alert playbooks to automation rules,Migrate your Microsoft Sentinel alert-trigger playbooks to automation rules,Decide and migrate Sentinel alert-trigger playbooks to automation rules,This article explains how (and why) to take your existing playbooks built on the alert trigger and migrate them from being invoked by analytics rules to being invoked by automation rules.,"We recommend that you take existing playbooks built on alert triggers and migrate them from being invoked by analytics rules to being invoked by automation rules. This article explains why we recommend this action, and how to migrate your playbooks. If you're migrating a playbook that's used by only one analytics rule, follow the instructions under Create an automation rule from an analytics rule. 
If you’re migrating a playbook that's used by multiple analytics rules, follow the instructions unde",2025-04-24T22:03:00.000Z,how-to,decision-making,0.65,True,"Explains why and how to migrate from alert-trigger to automation-rule invocation; this is migration/choice guidance between approaches, likely with scenario-based recommendations, fitting decision-making.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/automation/playbook-recommendations,Recommended and sample playbooks,"Recommended Microsoft Sentinel playbook use cases, templates, and examples",,"Learn about sample use cases for Microsoft Sentinel playbooks, as well as example playbooks and recommended playbook templates.","This article lists sample use cases for Microsoft Sentinel playbooks, as well as sample playbooks and recommended playbook templates.",2024-05-21T17:56:00.000Z,reference,,0.3,False,"Lists sample use cases and templates; likely catalog/overview of examples without deep config parameters, limits, or troubleshooting mappings.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/automation/playbook-triggers-actions,Supported triggers and actions in playbooks,Supported triggers and actions in Microsoft Sentinel playbooks,Use Sentinel Logic Apps triggers and actions in playbooks,Learn in greater depth how to give your playbooks access to the information in your Microsoft Sentinel alerts and incidents and use that information to take remedial actions.,"This article describes the triggers and actions supported by the Logic Apps Microsoft Sentinel connector. Use the listed triggers and actions in Microsoft Sentinel playbooks to interact with your Microsoft Sentinel data. Important Noted functionality is currently in PREVIEW. 
See the Supplemental Terms of Use for Microsoft Azure Previews for additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2025-08-26T11:22:00.000Z,concept-article,integrations,0.7,True,"Catalog of supported triggers/actions for the Sentinel Logic Apps connector is product-specific integration detail, typically listing operation names, parameters, and behaviors that LLMs won’t reliably know from training.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/automation/run-playbooks,Automate and run playbooks,Automate and run Microsoft Sentinel playbooks,,"Learn how to automate incident response with Microsoft Sentinel playbooks, or run playbooks manually to remediate immediate security threats.","Playbooks are collections of procedures that can be run from Microsoft Sentinel in response to an entire incident, to an individual alert, or to a specific entity. A playbook can help automate and orchestrate your response and can be set to run automatically when specific alerts are generated or when incidents are created or updated, by being attached to an automation rule. It can also be run manually on-demand on specific incidents, alerts, or entities. 
This article describes how to attach play",2024-11-19T18:02:00.000Z,how-to,,0.4,False,"Describes how to attach and run playbooks automatically or manually; appears to be operational how-to without detailed config matrices, limits, or troubleshooting mappings.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/automation/tutorial-respond-threats-playbook,Respond to threats using automation,Use a Microsoft Sentinel playbook to stop potentially compromised users,Automate Sentinel response to compromised users with playbooks,Learn how to use Microsoft Sentinel playbooks and automation rules to automate a sample incident response and remediate security threats.,"This article describes a sample scenario of how you can use a playbook and automation rule to automate incident response and remediate security threats. Automation rules help you triage incidents in Microsoft Sentinel, and are also used to run playbooks in response to incidents or alerts. For more information, see Automation in Microsoft Sentinel: Security orchestration, automation, and response (SOAR). The sample scenario described in this article shows how to use an automation rule and play",2025-04-24T22:03:00.000Z,concept-article,best-practices,0.7,True,"A concrete scenario tutorial showing how to combine automation rules and playbooks to remediate threats, including Sentinel-specific configuration and flow patterns.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/automation/use-playbook-templates,Customize playbooks from templates,Create and customize Microsoft Sentinel playbooks from templates,,"This article shows how to create playbooks from and work with playbook templates, to customize them to fit your needs.","A playbook template is a prebuilt, tested, and ready-to-use automation workflow for Microsoft Sentinel that can be customized to meet your needs. 
Templates can also serve as a reference for best practices when developing playbooks from scratch, or as inspiration for new automation scenarios. Playbook templates aren't active playbooks themselves, and you must create an editable copy for your needs. Many playbook templates are developed by the Microsoft Sentinel community, independent software ven",2025-05-29T05:35:00.000Z,how-to,,0.4,False,Shows how to create playbooks from templates; appears to be a how-to tutorial without detailed configuration tables or product-specific numeric guidance.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/aws-disruption,Enable attack disruption actions on AWS,Enable Attack Disruption Actions on AWS with Microsoft Sentinel,Enable automated attack disruption actions on AWS identities,Enable Attack Disruption Actions on AWS with Microsoft Sentinel,"This article describes how to configure your AWS environment so that Microsoft Sentinel can take automated actions on a user that assumes a SAML role, or on an AWS IAM account when an alert is triggered. Attack disruption uses high-confidence signals to contain compromised assets and limit the damage from attacks, including actions on identities in AWS.",2026-02-19T12:31:00.000Z,how-to,security,0.7,True,"Security-focused configuration of Sentinel to take automated actions on AWS IAM/SAML identities, with product-specific permissions and setup.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/aws-s3-troubleshoot,Troubleshoot AWS S3 connector issues,Troubleshoot AWS S3 connector issues - Microsoft Sentinel,Troubleshoot Microsoft Sentinel AWS S3 connector problems,Troubleshoot AWS S3 connector issues in Microsoft Sentinel.,"The Amazon Web Services (AWS) S3 connector allows you to ingest AWS service logs, collected in AWS S3 buckets, to Microsoft Sentinel. The types of logs we currently support are AWS CloudTrail, VPC Flow Logs, and AWS GuardDuty. 
This article describes how to quickly identify the cause of issues occurring with the AWS S3 connector so you can find the steps needed to resolve the issues. Learn how to connect Microsoft Sentinel to Amazon Web Services to ingest AWS service log data.",2025-01-15T18:04:00.000Z,troubleshooting,troubleshooting,0.85,True,Dedicated troubleshooting guide for the AWS S3 connector with Sentinel-specific diagnostics and resolution steps.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/azure-storage-blob-connector-troubleshoot,Troubleshoot Azure Storage Blob connector issues,Troubleshoot Azure Storage Blob connector issues - Microsoft Sentinel,Troubleshoot Microsoft Sentinel Azure Storage Blob connector,Troubleshoot Azure Storage Blob connector issues in Microsoft Sentinel.,The Azure Storage Blob connector simplifies the process of ingesting data from Azure Storage Blobs to Microsoft Sentinel. This article describes how to quickly identify the cause of issues occurring with the Azure Storage Blob connector so you can find the steps needed to resolve the issues. Learn how to connect Microsoft Sentinel to Azure Storage Blob to ingest data.,2026-03-11T11:05:00.000Z,troubleshooting,troubleshooting,0.82,True,"Troubleshooting guide for Azure Storage Blob connector issues, likely organized by symptoms and including specific error conditions and resolutions unique to this connector, which qualifies as product-specific troubleshooting knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/basic-logs-use-cases,Data lake use cases,When to use the Microsoft Sentinel data lake,Choose when to use Microsoft Sentinel data lake tier,"Learn what log sources might be appropriate for the Microsoft Sentinel data lake and what attributes to look for, to decide about other sources.","This article highlights log sources to consider configuring as data lake tier only when enabling a connector. 
Before choosing a tier for which to configure a given table, check which tier is most appropriate for your use case. For more information about data categories and data tiers, see Log retention plans in Microsoft Sentinel. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All custo",2026-02-25T23:33:00.000Z,concept-article,decision-making,0.7,True,"Focuses on deciding which log sources to place in the data lake tier; such guidance is scenario-based with criteria for tier selection, fitting decision-making.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/best-practices,SIEM best practices in Microsoft Sentinel,Best practices for Microsoft Sentinel,Apply operational best practices for Microsoft Sentinel,Learn about best practices to employ when managing your Log Analytics workspace for Microsoft Sentinel.,"Best practice guidance is provided throughout the technical documentation for Microsoft Sentinel. This article highlights some key guidance to use when deploying, managing, and using Microsoft Sentinel. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. 
All customers using Microsoft Sentinel in the Azure portal will be redirected to the Defender portal and will use Microsoft Sentinel in the",2025-09-30T12:49:00.000Z,conceptual,best-practices,0.7,True,Explicit best practices article for deploying and managing Sentinel; likely includes product-specific recommendations and gotchas beyond generic advice.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/best-practices-data,Best practices,Best practices for data collection in Microsoft Sentinel,Apply data collection best practices in Microsoft Sentinel,Learn about best practices to employ when connecting data sources to Microsoft Sentinel.,"This section reviews best practices for collecting data using Microsoft Sentinel data connectors. For more information, see Connect data sources, Microsoft Sentinel data connectors reference, and the Microsoft Sentinel solutions catalog.",2024-11-27T18:02:00.000Z,concept-article,best-practices,0.7,True,"Explicit best-practices article for Sentinel data collection; likely includes product-specific recommendations (which connectors, which tables, retention choices) and gotchas.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/billing,Plan costs,Plan costs and understand pricing and billing - Microsoft Sentinel,Plan and estimate Microsoft Sentinel pricing and billing,"Learn how to plan your Microsoft Sentinel costs, and understand pricing and billing using the pricing calculator and other methods.","To help estimate your Microsoft Sentinel expected costs, contact a Security sales specialist for more information on pricing or to request a quote. Costs for Microsoft Sentinel are only a portion of the monthly costs in your Azure bill. Although this article explains how to plan costs and understand the billing for Microsoft Sentinel, you're billed for all Azure services and resources your Azure subscription uses, including Partner services. 
This article is part of the Deployment guide for Microsof",2026-04-01T17:25:00.000Z,concept-article,decision-making,0.64,True,"The article is about planning costs and understanding pricing and billing. Microsoft cost-planning pages typically include tiered pricing details, data ingestion/retention cost structures, and guidance on estimating costs using calculators—information used to choose plans and manage spend, which fits decision-making with product-specific, quantitative trade-offs.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/billing-monitor-costs,Monitor costs,Manage and monitor costs for Microsoft Sentinel,Analyze and optimize Microsoft Sentinel cost and billing,Learn how to manage and monitor costs and billing for Microsoft Sentinel by using cost analysis in the Azure portal and other methods.,"After you start using Microsoft Sentinel resources, use built-in Cost Management features to confidently manage budgets, monitor costs and security performance. You can also review forecasted costs and identify spending trends to optimize. With the Sentinel data lake enabled, you can also view your usage directly in the Microsoft Defender portal. Microsoft Sentinel costs are only a portion of your monthly Azure bill. Although this article explains how to manage and monitor costs for Microsoft Se",2026-03-29T22:17:00.000Z,how-to,decision-making,0.64,True,"The page explains how to use Azure Cost Management and Defender portal usage views specifically for Sentinel, including how to interpret and act on Sentinel-related cost data. 
It provides product-specific guidance for monitoring and managing Sentinel costs, which supports cost-related decision making and is not generic billing information.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/billing-pre-purchase-plan,Optimize costs with pre-purchase plan,Optimize costs with a prepurchase plan - Microsoft Sentinel,Use Microsoft Sentinel prepurchase plans to save costs,Learn how to save costs and buy a Microsoft Sentinel prepurchase plan,"Save on your Microsoft Sentinel analytics tier costs when you buy a pre-purchase plan. Pre-purchase plans are commit units (CUs) bought at discounted tiers in your purchasing currency for a specific product. The more you buy, the greater the discount. Purchased CUs pay down qualifying costs in US dollars (USD). So, if Microsoft Sentinel generates a retail cost of $100, then 100 Microsoft Sentinel CUs (SCUs) are consumed. Your Microsoft Sentinel pre-purchase plan automatically uses your SCUs to p",2025-07-22T12:54:00.000Z,how-to,decision-making,0.75,True,"Describes Sentinel-specific commit units, discount tiers, and how CUs map to dollar costs; this is detailed, quantified cost-optimization and plan-selection guidance.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/billing-reduce-costs,Reduce costs,Reduce costs for Microsoft Sentinel,Reduce Microsoft Sentinel costs with product features,Learn how to reduce costs for Microsoft Sentinel by using different methods in the Azure portal.,"Costs for Microsoft Sentinel are only a portion of the monthly costs in your Azure bill. Although this article explains how to reduce costs for Microsoft Sentinel, you're billed for all Azure services and resources your Azure subscription uses, including Partner services. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. 
All customers using Microsoft Sentinel in the Azure portal will bere",2025-11-23T12:10:00.000Z,conceptual,decision-making,0.7,True,"Cost-reduction guidance for Sentinel usually contains product-specific recommendations (for example, which data types or features to adjust) and quantified impact on billing, which is concrete decision guidance rather than generic advice.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/bookmarks,Bookmarks,Hunt with bookmarks in Microsoft Sentinel,,This article describes how to use the Microsoft Sentinel hunting bookmarks to keep track of data.,"Hunting bookmarks in Microsoft Sentinel helps you preserve the queries and query results that you deem relevant. You can also record your contextual observations and reference your findings by adding notes and tags. Bookmarked data is visible to you and your teammates for easy collaboration. For more information, see Bookmarks. Note Bookmarks can only be created in the Azure portal. While you can't add bookmarks in the Microsoft Defender portal, you can see bookmarks that were already created. Im",2025-07-21T17:09:00.000Z,how-to,,0.45,False,"Explains bookmarks conceptually and how to use them; no detailed configuration parameters, limits, or error-resolution mappings are indicated.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/bring-your-own-ml,Bring your own machine learning,Bring your own ML into Microsoft Sentinel,Bring custom machine learning models into Sentinel,This article explains how to create and use your own machine learning algorithms for data analysis in Microsoft Sentinel.,"Note For information about feature availability in US Government clouds, see the Microsoft Sentinel tables in Cloud feature availability for US Government customers. Machine Learning (ML) is one of the major underpinnings of Microsoft Sentinel, and one of the main attributes that set it apart. 
Microsoft Sentinel offers ML in several experiences: built into the Fusion correlation engine and Jupyter notebooks, and the newly available Build-Your-Own ML (BYO ML) platform. ML detection models can adapt",2023-01-24T23:03:00.000Z,conceptual,architecture-patterns,0.65,True,"Describes the BYO ML platform and how to create/use ML algorithms for Sentinel data analysis, which is a product-specific architecture/pattern for ML-based detections.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/business-applications/deploy-power-platform-solution,Deploy for Power Platform and Microsoft Dynamics 365 Customer Engagement,Connect Microsoft Power Platform and Microsoft Dynamics 365 Customer Engagement to Microsoft Sentinel,Deploy Sentinel solution for Power Platform and CE,Learn how to deploy the Microsoft Sentinel solution for Business Applications with Microsoft Power Platform and Microsoft Dynamics 365 Customer Engagement to Microsoft Sentinel,"This article describes how to deploy the Microsoft Sentinel solution for Microsoft Business Apps to connect your Microsoft Power Platform and Microsoft Dynamics 365 Customer Engagement system to Microsoft Sentinel. 
The solution collects audit and activity logs to detect threats, suspicious activities, illegitimate activities, and more.",2025-06-18T11:21:00.000Z,how-to,deployment,0.66,True,Describes how to deploy the Business Apps solution for Power Platform and Dynamics 365 CE; includes product-specific deployment steps and connector setup.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/business-applications/power-platform-solution-security-content,Power Platform and Microsoft Dynamics 365 Customer Engagement security content reference,Security content reference for Microsoft Power Platform and Microsoft Dynamics 365 Customer Engagement,Security content reference for Power Platform and CE,Learn about the built-in security content provided by the Microsoft Sentinel solution for Power Platform.,"This article details the security content available for the Microsoft Sentinel solution for Power Platform. For more information about this solution, see Microsoft Sentinel solution for Microsoft Power Platform and Microsoft Dynamics 365 Customer Engagement overview.",2025-12-12T08:00:00.000Z,article,configuration,0.62,True,Lists built-in security content for the Power Platform solution; these rules and workbooks are specific artifacts not known generically.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/business-applications/solution-overview,Overview,Microsoft Sentinel Solution for MS Business Apps,,"Learn about the Microsoft Sentinel solution for MS Business Apps, including Microsoft Power Platform, Microsoft Dynamics 365 Customer Engagement, and Microsoft Dynamics 365 Finance and Operations.","The Microsoft Sentinel solution for Microsoft Business Apps helps you monitor and protect your Microsoft Power Platform, Microsoft Dynamics 365 Customer Engagement, and Microsoft Dynamics 365 Finance and Operations environments. 
It provides security insights and threat detection by collecting audit and activity logs to detect threats, suspicious activities, illegitimate activities, and more.",2025-04-22T11:11:00.000Z,overview,,0.35,False,"High-level overview of Sentinel solution for Microsoft Business Apps; summary does not indicate detailed configuration, limits, or troubleshooting content.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/business-continuity-disaster-recovery,Business continuity and disaster recovery,BCDR Recommendations for Working With Microsoft Sentinel,Design BCDR and resiliency architecture for Sentinel,"Learn about Business Continuity and Disaster Recovery (BCDR) in Microsoft Sentinel, including availability zones and cross-region disaster recovery strategies.","This article describes reliability support in Microsoft Sentinel and covers both regional resiliency with availability zones, and cross-region resiliency with business continuity and disaster recovery (BCDR). While this article is mainly directed at Microsoft Sentinel customers working in the Azure portal, this guidance also covers data currently managed by Azure services after onboarding to the Microsoft Defender portal. 
For more information, see Azure reliability.",2025-07-22T12:54:00.000Z,concept-article,architecture-patterns,0.6,True,BCDR recommendations for availability zones and cross-region DR are Sentinel-specific architecture patterns with concrete guidance on when to use which resiliency options.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/cef-name-mapping,CEF log field mapping,Common Event Format (CEF) key and CommonSecurityLog field mapping,Map CEF keys to Sentinel CommonSecurityLog fields,This article maps CEF keys to the corresponding field names in the CommonSecurityLog in Microsoft Sentinel.,"The following tables map Common Event Format (CEF) field names to the names they use in Microsoft Sentinel's CommonSecurityLog, and might be helpful when you're working with a CEF data source in Microsoft Sentinel. For more information, see Ingest syslog and CEF messages to Microsoft Sentinel with the Azure Monitor Agent.",2024-08-12T08:00:00.000Z,reference,configuration,0.86,True,Provides explicit mapping tables between CEF field names and CommonSecurityLog fields; these mappings are detailed configuration/integration reference.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/cef-syslog-ama-overview,Overview,Syslog and CEF AMA connectors - Microsoft Sentinel,Configure Syslog and CEF connectors via Azure Monitor Agent,Learn how Microsoft Sentinel collects Syslog and Common Event Format (CEF) messages with the Azure Monitor Agent.,"The Syslog via AMA and Common Event Format (CEF) via AMA data connectors for Microsoft Sentinel filter and ingest Syslog messages, including messages in Common Event Format (CEF), from Linux machines and from network and security devices and appliances. These connectors install the Azure Monitor Agent (AMA) on any Linux machine from which you want to collect Syslog and/or CEF messages. 
This machine could be the originator of the messages, or it could be a forwarder that collects messages from ot",2025-07-29T08:00:00.000Z,concept-article,configuration,0.7,True,"Describes Syslog/CEF via AMA connectors; typically includes facility/severity filters, port settings, and AMA configuration specific to Sentinel’s Syslog/CEF ingestion.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/cef-syslog-ama-troubleshooting,Troubleshoot CEF and Syslog via AMA,Troubleshoot CEF and Syslog via AMA connectors in Microsoft Sentinel,Troubleshoot Sentinel CEF and Syslog AMA ingestion issues,Learn how to troubleshoot issues with CEF and Syslog data collection using the Azure Monitor Agent (AMA) in Microsoft Sentinel.,"This article provides troubleshooting guidance for Common Event Format (CEF) and Syslog data collection using the Azure Monitor Agent (AMA) in Microsoft Sentinel. Use this guide to diagnose and resolve ingestion issues with your log forwarder machines. The commands and configurations should be run on the log forwarder machines where AMA and RSyslog/Syslog-ng are installed. Before you begin troubleshooting, familiarize yourself with the following articles:",2026-01-12T12:11:00.000Z,troubleshooting,troubleshooting,0.9,True,"Explicit troubleshooting guide for CEF/Syslog via AMA with product-specific commands, configuration checks, and symptom-to-solution mappings.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/ci-cd,Deploy content as code from your repository,Deploy custom content from your repository (Preview) - Microsoft Sentinel,Create repository connections to deploy Sentinel content,This article describes how to create connections with a GitHub or Azure DevOps repository where you can manage your custom content and deploy it to Microsoft Sentinel.,"When creating custom content, you can manage it from your own Microsoft Sentinel workspaces, or an external source control repository. 
This article describes how to create and manage connections between Microsoft Sentinel and GitHub or Azure DevOps repositories. Managing your content in an external repository allows you to make updates to that content outside of Microsoft Sentinel, and have it automatically deployed to your workspaces. For more information, see Update custom content with reposito",2025-07-09T11:11:00.000Z,how-to,deployment,0.7,True,"Explains how to connect GitHub/Azure DevOps to Sentinel for automatic content deployment, including configuration parameters and constraints unique to this integration.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/ci-cd-custom-content,Overview,Manage custom content with repository connections - Microsoft Sentinel,Use repositories and CI/CD for Microsoft Sentinel content,This article explains custom Microsoft Sentinel content like GitHub or Azure DevOps repositories that can utilize source control features.,"Microsoft Sentinel repositories let you deploy and manage custom Sentinel content from an external source control repository for continuous integration/continuous delivery (CI/CD). This automation removes the need for manual processes to update and deploy your custom content across workspaces. A subset of content as code is detections as code (DaC). Microsoft Sentinel Repositories implements DaC as well. For more information on Sentinel content, see About Microsoft Sentinel content and solutions.",2026-03-30T20:20:00.000Z,article,deployment,0.62,True,"The page explains managing Sentinel custom content via external GitHub/Azure DevOps repositories for CI/CD. 
Such articles usually include Sentinel-specific deployment behaviors and constraints for content-as-code and detections-as-code, which are product-specific deployment patterns rather than generic Git usage.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/ci-cd-custom-deploy,Customize repository deployments,Customize repository deployments - Microsoft Sentinel,Customize CI/CD repository deployments for Sentinel,This article describes how to customize repository deployments for the repositories feature in Microsoft Sentinel.,"There are two primary ways to customize the deployment of your repository content to Microsoft Sentinel workspaces. Each method uses different files and syntax, so consider these examples to get you started. Important The Microsoft Sentinel Repositories feature is currently in PREVIEW. See the Supplemental Terms of Use for Microsoft Azure Previews for more legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2025-01-14T23:03:00.000Z,how-to,deployment,0.7,True,"Covers customization of repository deployments using specific files and syntax for Sentinel Repositories, a product-specific deployment configuration pattern.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/cisco-ftd-firewall,Cisco firewalls,Collect data from Cisco firewall devices running ASA,Choose and configure Sentinel connectors for Cisco ASA/FTD,Use Microsoft Sentinel connectors to collect logs from Cisco firewall devices in Adaptive Security Appliance (ASA) and Common Event Format (CEF) formats.,"Microsoft Sentinel provides two connectors that collect logs from Cisco Secure Firewall devices, depending on whether the devices run the Firewall Threat Defense (FTD) or Adaptive Security Appliance (ASA) software. 
This article explains when to use each connector and provides links to installation instructions.",2025-11-20T12:10:00.000Z,conceptual,decision-making,0.65,True,"Explains when to use each of two Sentinel connectors for Cisco Secure Firewall (FTD vs ASA) and links to their configuration, providing product-specific selection guidance.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/collaborate-in-microsoft-teams,Collaborate in Microsoft Teams,Collaborate in Microsoft Teams with a Microsoft Sentinel incident team,Integrate Sentinel incidents with Microsoft Teams collaboration,Learn how to connect to Microsoft Teams from Microsoft Sentinel to collaborate with others on your team using Microsoft Sentinel data.,"Microsoft Sentinel in the Azure portal supports a direct integration with Microsoft Teams, enabling you to jump directly into teamwork on specific incidents. Important Integration with Microsoft Teams is currently in PREVIEW. See the Supplemental Terms of Use for Microsoft Azure Previews for additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2022-12-20T23:05:00.000Z,how-to,integrations,0.7,True,"Covers direct integration between Sentinel and Teams for incident collaboration, a concrete cross-service integration pattern unique to these products.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/compare-analytics-rules-custom-detections,Compare analytics rules and custom detections,Compare Microsoft Sentinel analytics rules and Microsoft Defender custom detections - Microsoft Security,Compare Sentinel analytics rules vs Defender custom detections,Compare the different features supported by Microsoft Sentinel analytics rules and Microsoft Defender custom detections.,"This article lists and compares the different features supported by Microsoft Sentinel analytics rules and Microsoft Defender custom detections. 
It also provides additional information, such as plans to support any analytics rules capabilities that aren't available in custom detections, if applicable. Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM and Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-tim",2025-10-27T11:18:00.000Z,conceptual,decision-making,0.85,True,"Explicit comparison article; likely includes feature comparison tables, capabilities, and guidance on when to use each, which is decision-making content.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/configure-connector-login-detection,Configure RDP login detection,Configure the Security Events connector for anomalous RDP login detection,Configure Security Events connector for anomalous RDP detection,Learn how to configure the Security Events or Windows Security Events connector for anomalous RDP login detection.,"Microsoft Sentinel can apply machine learning (ML) to Security events data to identify anomalous Remote Desktop Protocol (RDP) login activity. 
Scenarios include: Unusual IP - the IP address has rarely or never been observed in the last 30 days Unusual geo-location - the IP address, city, country/region, and ASN have rarely or never been observed in the last 30 days New user - a new user logs in from an IP address and geo-location, both or either of which were not expected to be seen based on data f",2024-10-07T17:05:00.000Z,how-to,configuration,0.75,True,Provides concrete configuration steps for the Security Events/Windows Security Events connector to enable ML-based anomalous RDP login detection in Sentinel.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/configure-content,Configure content,Configure Microsoft Sentinel content,,"In this step of your deployment, you configure the Microsoft Sentinel security content, like your data connectors, analytics rules, automation rules, and more.","In the previous deployment step, you enabled Microsoft Sentinel, health monitoring, and the required solutions. In this article, you learn how to configure the different types of Microsoft Sentinel security content, which allow you to detect, monitor, and respond to security threats across your systems. This article is part of the Deployment guide for Microsoft Sentinel.",2024-07-15T03:11:00.000Z,how-to,,0.4,False,"Configuring Sentinel content article is likely procedural (how to set up rules, connectors) without detailed parameter tables or numeric constraints in the summary.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/configure-data-connector,Connect data sources,Connect data sources to Microsoft Sentinel by using data connectors,,Learn how to connect data sources to Microsoft Sentinel using data connectors for improved threat detection.,"To connect data sources to Microsoft Sentinel, you need to install and configure data connectors. 
This article generally explains how to install data connectors available in the Microsoft Sentinel Content hub to ingest and analyze data for improved threat detection. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Microsoft Sentinel in the Azure portal will be redirected",2026-01-07T19:08:00.000Z,how-to,,0.45,False,General how-to for installing data connectors from Content hub; likely step-by-step UI instructions rather than detailed config parameter references.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/configure-data-retention-archive,Configure interactive and long-term data retention,Configure interactive and long-term data retention in Microsoft Sentinel,Configure interactive and long-term Sentinel data retention,"Towards the end of your deployment procedure, you set up data retention to suit your organization's needs.","In the previous deployment step, you enabled the User and Entity Behavior Analytics (UEBA) feature to streamline your analysis process. In this article, you learn how to set up interactive and long-term data retention, to make sure your organization retains the data that's important in the long term. 
This article is part of the Deployment guide for Microsoft Sentinel.",2024-07-30T11:22:00.000Z,how-to,configuration,0.7,True,"Data retention configuration article will specify retention settings, tiers, and allowed ranges for interactive vs archive data, including parameter names and values.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/configure-data-transformation,Configure ingestion-time transformation,Transform or customize data at ingestion time in Microsoft Sentinel (preview),Configure ingestion-time data transformation and custom log ingestion,Learn about how to configure Azure Monitor's ingestion-time data transformation for use with Microsoft Sentinel.,"This article describes how to configure ingestion-time data transformation and custom log ingestion for use in Microsoft Sentinel. Ingestion-time data transformation provides customers with more control over the ingested data. Supplementing the pre-configured, hardcoded workflows that create standardized tables, ingestion-time transformation adds the capability to filter and enrich the output tables, even before running any queries. Custom log ingestion uses the Custom Log API to normalize custom-",2023-01-03T18:04:00.000Z,how-to,configuration,0.8,True,"Explains Sentinel-specific configuration of ingestion-time transformations and Custom Log API usage, including DCR and transformation settings.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/configure-fusion-rules,Configure multistage attack (Fusion) rules,Configure multistage attack detection (Fusion) rules in Microsoft Sentinel,Configure Fusion multistage attack detection rules,Create and configure attack detection rules based on Fusion technology in Microsoft Sentinel.,"Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM and Microsoft Defender XDR. 
With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. Important The new version of the Fusion analytics rule is currently in PREVIEW. See the Supplemental Terms of Use for Micro",2025-10-29T11:15:00.000Z,conceptual,configuration,0.75,True,"Explicitly about creating and configuring Fusion rules; will include rule parameters, enable/disable options, and possibly data source requirements specific to Sentinel.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/connect-aws,AWS service logs,Connect Microsoft Sentinel to Amazon Web Services to ingest AWS service log data,Configure AWS service log connector for Microsoft Sentinel,"Use the AWS connector to delegate Microsoft Sentinel access to AWS resource logs, creating a trust relationship between Amazon Web Services and Microsoft Sentinel.","The Amazon Web Services (AWS) service log connector is available in two versions: the legacy connector for CloudTrail management and data logs, and the new version that can ingest logs from the following AWS services by pulling them from an S3 bucket (links are to AWS documentation): This tab explains how to configure the AWS S3 connector using one of two methods:",2025-06-18T11:21:00.000Z,how-to,configuration,0.7,True,"Connector configuration article with Sentinel-specific setup for AWS S3-based ingestion, including method selection and trust relationship details.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/connect-aws-configure-environment,Connect Microsoft Sentinel to AWS,Set up your Amazon Web Services (AWS) environment to collect AWS logs to Microsoft Sentinel,Prepare AWS environment to send logs to Sentinel,Set up your Amazon Web Services environment to send AWS logs to Microsoft Sentinel using one of the Microsoft Sentinel AWS connectors.,Amazon Web 
Services (AWS) connectors simplify the process of collecting logs from Amazon S3 (Simple Storage Service) and ingesting them into Microsoft Sentinel. The connectors provide tools to help you configure your AWS environment for Microsoft Sentinel log collection. This article outlines the AWS environment setup required to send logs to Microsoft Sentinel and links to step-by-step instructions for setting up your environment and collecting AWS logs using each supported connector.,2025-06-18T11:21:00.000Z,how-to,configuration,0.65,True,"Outlines AWS-side configuration required for Sentinel connectors, including environment setup steps that are specific to this integration.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/connect-aws-eks,AWS EKS logs,Connect Microsoft Sentinel to Amazon Web Services to ingest AWS EKS logs,Ingest AWS EKS audit logs into Microsoft Sentinel,"Use the Amazon Web Services (AWS) S3-based Elastic Kubernetes Service (EKS) connector to ingest AWS EKS audit logs, collected in AWS S3 buckets, to Microsoft Sentinel.","Use the Amazon Web Services (AWS) S3-based Elastic Kubernetes Service (EKS) connector to ingest AWS EKS audit logs, collected in AWS S3 buckets, to Microsoft Sentinel. AWS EKS audit logs are detailed records of API server requests, authentication decisions, and cluster activities within your Kubernetes clusters. These records contain information such as the time the request was received, the specifics of the request, the user making the request, and the action taken. This log analysis is essenti",2026-04-15T11:11:00.000Z,how-to,integrations,0.74,True,"Connector documentation for Sentinel typically includes product-specific integration details such as required AWS/Sentinel configuration parameters, connector settings, and log source specifics. 
These are concrete integration patterns and settings unique to this product, beyond generic 'how to connect' guidance.",new
-https://learn.microsoft.com/en-us/azure/sentinel/connect-aws-s3-waf,AWS S3 WAF logs,Connect Microsoft Sentinel to Amazon Web Services to ingest AWS WAF logs,Configure AWS WAF S3 connector to ingest logs to Sentinel,"Use the Amazon Web Services (AWS) S3-based Web Application Firewall (WAF) connector to ingest AWS WAF logs, collected in AWS S3 buckets, to Microsoft Sentinel.","Use the Amazon Web Services (AWS) S3-based Web Application Firewall (WAF) connector to ingest AWS WAF logs, collected in AWS S3 buckets, to Microsoft Sentinel. AWS WAF logs are detailed records of the web traffic analyzed by the AWS WAF against web access control lists (ACLs). These records contain information such as the time AWS WAF received the request, the specifics of the request, and the action taken by the rule that the request matched. These logs and this analysis are essential for maint",2025-06-18T11:21:00.000Z,how-to,configuration,0.7,True,"Connector-specific configuration for ingesting AWS WAF logs from S3 into Sentinel, including required settings unique to this integration.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-active-directory,Microsoft Entra,Send Microsoft Entra ID data to Microsoft Sentinel,Configure Microsoft Entra ID connector to send logs to Sentinel,"Learn how to collect data from Microsoft Entra ID, and stream Microsoft Entra sign-in, audit, and provisioning logs into Microsoft Sentinel.","Microsoft Entra ID logs provide comprehensive information about users, applications, and networks accessing your Microsoft Entra tenant. 
This article explains the types of logs you can collect using the Microsoft Entra ID data connector, how to enable the connector to send data to Microsoft Sentinel, and how to find your data in Microsoft Sentinel.",2025-07-09T08:00:00.000Z,how-to,configuration,0.75,True,"Connector article detailing how to enable and route Entra ID sign-in, audit, and provisioning logs into Sentinel with product-specific steps.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-functions-template,Azure Functions API connection,Use Azure Functions to connect Microsoft Sentinel to your data source,Build Azure Functions-based connectors to ingest data into Sentinel,Learn how to configure data connectors that use Azure Functions to get data from data sources into Microsoft Sentinel.,"You can use Azure Functions, in conjunction with various coding languages such as PowerShell or Python, to create a serverless connector to the REST API endpoints of your compatible data sources. Azure Function Apps then allow you to connect Microsoft Sentinel to your data source's REST API to pull in logs. This article describes how to configure Microsoft Sentinel for using Azure Function Apps. 
You may also need to configure your source system, and you can find vendor- and product-specific informa",2023-06-05T08:00:00.000Z,how-to,integrations,0.75,True,"Shows how to use Azure Functions (PowerShell/Python) to call REST APIs and send logs to Sentinel; likely includes function app settings, bindings, and Sentinel-specific ingestion parameters.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-stack,Azure Stack VMs,Onboard your Azure Stack Hub virtual machines to Microsoft Sentinel,Onboard Azure Stack Hub VMs to Microsoft Sentinel,"This article shows you how to provision the Azure Monitor, Update, and Configuration Management virtual machine extension on Azure Stack Hub virtual machines and start monitoring them with Microsoft S","With Microsoft Sentinel, you can monitor your VMs running on Azure and Azure Stack Hub in one place. To onboard your Azure Stack machines to Microsoft Sentinel, you first need to add the virtual machine extension to your existing Azure Stack Hub virtual machines. After you connect Azure Stack Hub machines, choose from a gallery of dashboards that surface insights based on your data. These dashboards can be easily customized to your needs.",2022-12-20T23:05:00.000Z,conceptual,deployment,0.7,True,"Shows how to deploy Azure Monitor and related VM extensions on Azure Stack Hub VMs to start monitoring with Sentinel, a product-specific onboarding/deployment pattern.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-virtual-desktop,Azure Virtual Desktop,Connect Azure Virtual Desktop to Microsoft Sentinel,Connect Azure Virtual Desktop telemetry to Microsoft Sentinel,Learn to connect your Azure Virtual Desktop data to Microsoft Sentinel.,"This article describes how you can monitor your Azure Virtual Desktop environments using Microsoft Sentinel. 
For example, monitoring your Azure Virtual Desktop environments can enable you to provide more remote work using virtualized desktops, while maintaining your organization's security posture.",2024-10-07T17:05:00.000Z,how-to,configuration,0.65,True,"Describes Sentinel-specific connector configuration for Azure Virtual Desktop monitoring, beyond generic logging concepts.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-windows-microsoft-services,Connect Microsoft Sentinel to Microsoft connectors,"Connect Microsoft Sentinel to Azure, Windows, and Microsoft services",Configure Sentinel connections to Azure and Microsoft services,Learn how to connect Microsoft Sentinel to Azure and Microsoft 365 cloud services and to Windows Server event logs.,"Microsoft Sentinel uses the Azure foundation to provide built-in, service-to-service support for data ingestion from many Azure and Microsoft 365 services, Amazon Web Services, and various Windows Server services. There are a few different methods through which these connections are made. 
Note For information about feature availability in US Government clouds, see the Microsoft Sentinel tables in Cloud feature availability for US Government customers.",2023-10-12T11:16:00.000Z,overview,configuration,0.6,True,Describes specific methods for connecting to Azure/M365/Windows logs; likely includes configuration options and prerequisites for each connection method.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/connect-cef-syslog-ama,CEF and Syslog via AMA,Ingest syslog and CEF messages to Microsoft Sentinel - AMA,Configure AMA-based syslog and CEF ingestion to Sentinel,"Ingest syslog messages from Linux machines and from network and security devices and appliances to Microsoft Sentinel, using data connectors based on the Azure Monitor Agent (AMA).","This article shows you how to use the Syslog via AMA and Common Event Format (CEF) via AMA connectors to filter and ingest syslog and CEF messages from Linux machines, network devices, and security appliances. To learn more about these data connectors, see Syslog and Common Event Format (CEF) via AMA connectors for Microsoft Sentinel. Note Container Insights supports automatic collection of syslog events from Linux nodes in your AKS clusters. 
Learn more in Syslog collection with Container Insights.",2026-01-12T12:11:00.000Z,how-to,configuration,0.7,True,"How-to article for configuring Syslog via AMA and CEF via AMA connectors, including product-specific connector settings and filtering behavior that go beyond generic knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/connect-custom-logs-ama,Collect logs from text files via AMA,Collect logs from text files with the Azure Monitor Agent and ingest to Microsoft Sentinel - AMA,Configure Custom Logs via AMA to ingest text-file logs,"Collect text file-based logs from network or security applications installed on Windows- or Linux-based machines, using the Custom Logs via AMA data connector based on the Azure Monitor Agent (AMA).",This article describes how to use the Custom Logs via AMA connector to quickly filter and ingest logs in text-file format from network or security applications installed on Windows or Linux machines. Many applications log data to text files instead of standard logging services like Windows Event log or Syslog. You can use the Azure Monitor Agent (AMA) to collect data in text files of nonstandard formats from both Windows and Linux computers. The AMA can also effect transformations on the data at t,2024-08-15T22:07:00.000Z,how-to,configuration,0.75,True,"Describes how to configure the Custom Logs via AMA connector, including Sentinel- and AMA-specific configuration steps for nonstandard text log formats.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/connect-data-sources,Overview,Microsoft Sentinel data connectors,,"Learn about supported data connectors, like Microsoft Defender XDR (formerly Microsoft 365 Defender), Microsoft 365 and Office 365, Microsoft Entra ID, ATP, and Defender for Cloud Apps to Microsoft Se","After you onboard Microsoft Sentinel into your workspace, use data connectors to start ingesting your data into Microsoft Sentinel. 
Microsoft Sentinel comes with many out-of-the-box connectors for Microsoft services, which integrate in real time. For example, the Microsoft Defender XDR connector is a service-to-service connector that integrates data from Office 365, Microsoft Entra ID, Microsoft Defender for Identity, and Microsoft Defender for Cloud Apps. Built-in connectors enable connection t",2026-03-05T23:11:00.000Z,concept-article,,0.2,False,"This is a high-level description of Microsoft Sentinel data connectors and supported sources. The summary indicates an overview of connectors and integration examples, but not specific quotas, configuration parameter tables, or troubleshooting mappings, so it lacks the required expert-knowledge characteristics.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/connect-defender-for-cloud,Microsoft Defender for Cloud,Ingest Microsoft Defender for Cloud subscription-based alerts to Microsoft Sentinel,Connect Microsoft Defender for Cloud alerts to Sentinel,Learn how to connect security alerts from Microsoft Defender for Cloud and stream them into Microsoft Sentinel.,"Microsoft Defender for Cloud's integrated cloud workload protections allow you to detect and quickly respond to threats across hybrid and multicloud workloads. The Microsoft Defender for Cloud connector allows you to ingest security alerts from Defender for Cloud into Microsoft Sentinel, so you can view, analyze, and respond to Defender alerts, and the incidents they generate, in a broader organizational threat context. Microsoft Defender for Cloud Defender plans are enabled per subscription. 
While M",2024-11-27T18:02:00.000Z,how-to,configuration,0.7,True,"Explains how to configure the Defender for Cloud connector to stream subscription-based alerts into Sentinel, including plan and subscription-specific setup.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/connect-dns-ama,DNS via AMA,Stream and filter Windows DNS logs with the AMA connector,Configure AMA connector for Windows DNS log streaming,Ingest and filter data from your Windows DNS server logs with this data connector. Query this data to protect your DNS servers from threats and attacks.,"This article describes how to use the Azure Monitor Agent (AMA) connector to stream and filter events from your Windows Domain Name System (DNS) server logs. You can then deeply analyze your data to protect your DNS servers from threats and attacks. The AMA and its DNS extension are installed on your Windows Server to upload data from your DNS analytical logs to your Microsoft Sentinel workspace. DNS is a widely used protocol, which maps between host names and computer readable IP addresses. Bec",2025-03-25T08:00:00.000Z,how-to,configuration,0.7,True,Product-specific configuration of the AMA DNS extension and connector to stream and filter Windows DNS logs into Sentinel.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/connect-google-cloud-platform,Google Cloud Platform connectors,Ingest Google Cloud Platform log data into Microsoft Sentinel,Configure GCP Pub/Sub connectors to ingest logs into Sentinel,This article describes how to ingest service log data from the Google Cloud Platform (GCP) into Microsoft Sentinel.,"Organizations are increasingly moving to multicloud architectures, whether by design or due to ongoing requirements. A growing number of these organizations use applications and store data on multiple public clouds, including the Google Cloud Platform (GCP). 
This article describes how to ingest GCP data into Microsoft Sentinel to get full security coverage and analyze and detect attacks in your multicloud environment. With the GCP Pub/Sub connectors, based on our Codeless Connector Framework (CCF),",2026-02-27T23:17:00.000Z,how-to,configuration,0.7,True,"Describes how to set up GCP Pub/Sub-based connectors using Sentinel’s Codeless Connector Framework, including integration-specific configuration.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/connect-logstash-data-connection-rules,Logstash plugin with Data Collection Rules,Use Logstash to stream logs with pipeline transformations via DCR-based API,Use Logstash with DCR-based API to stream logs to Sentinel,"Use Logstash to forward logs from external data sources into custom and standard tables in Microsoft Sentinel, and to configure the output with DCRs.","Important Data ingestion using the Logstash output plugin with Data Collection Rules (DCRs) is currently in public preview. This feature is provided without a service level agreement. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Microsoft Sentinel's new Logstash output plugin supports pipeline transformations and advanced configuration via Data Collection Rules (DCRs). 
The plugin forwards any type of logs from external data sources into custom or standard tabl",2024-10-10T17:12:00.000Z,how-to,integrations,0.8,True,"Describes Logstash output plugin configuration with Data Collection Rules, including Sentinel-specific pipeline and DCR parameters.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/connect-mdti-data-connector,Enable MDTI data connector,Enable the data connector for Microsoft's threat intelligence - Microsoft Defender Threat Intelligence,Enable Defender Threat Intelligence data connector in Sentinel,Learn how to ingest Microsoft's threat intelligence into your Microsoft Sentinel workspace to generate high-fidelity alerts and incidents.,"Bring public, open-source and high-fidelity indicators of compromise (IOCs) generated by Microsoft Defender Threat Intelligence into your Microsoft Sentinel workspace with the Defender Threat Intelligence data connectors. With a simple one-click setup, use the threat intelligence from the standard and premium Defender Threat Intelligence data connectors to monitor, alert, and hunt. After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only 
This article explains how to configure the Microsoft Defender XDR connector for Microsoft Sentinel in the Azure portal. Note The Defender XDR connector is automatically enabled when you onboard Microsoft Sentinel to the Defender portal. The manual configuration steps described in this article are n",2025-12-24T08:00:00.000Z,how-to,configuration,0.75,True,"Details configuration of the Defender XDR connector, including incident synchronization and portal-specific settings unique to this integration.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/connect-microsoft-purview,Microsoft Purview Information Protection data,Stream data from Microsoft Purview Information Protection to Microsoft Sentinel,Stream Microsoft Purview Information Protection data to Sentinel,Stream data from Microsoft Purview Information Protection (formerly Microsoft Information Protection) to Microsoft Sentinel so you can analyze and report on data from the Microsoft Purview labeling cl,"This article describes how to stream data from Microsoft Purview Information Protection (formerly Microsoft Information Protection or MIP) to Microsoft Sentinel. You can use the data ingested from the Microsoft Purview labeling clients and scanners to track, analyze, report on the data, and use it for compliance purposes. Important The Microsoft Purview Information Protection connector is currently in PREVIEW. 
The Azure Preview Supplemental Terms include additional legal terms that apply to Azure ",2025-01-15T18:04:00.000Z,how-to,configuration,0.7,True,"Connector configuration article for streaming Purview Information Protection labeling and scanner data into Sentinel, including preview-specific setup.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/connect-services-api-based,Connect via API-based connectors,Connect Microsoft Sentinel to other Microsoft services with an API-based data connector,Configure API-based data connectors for Microsoft Sentinel,Learn how to connect Microsoft Sentinel to Microsoft services with API-based connections.,"This article describes how to make API-based connections to Microsoft Sentinel. Microsoft Sentinel uses the Azure foundation to provide built-in, service-to-service support for data ingestion from many Azure and Microsoft 365 services, Amazon Web Services, and various Windows Server services. There are a few different methods through which these connections are made. This article presents information that is common to the group of API-based data connectors. Note For information about feature ava",2025-01-21T23:02:00.000Z,how-to,configuration,0.7,True,"Common guidance for API-based connectors; typically includes endpoint URLs, auth settings, and connector parameters unique to Sentinel’s API-based ingestion.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/connect-services-diagnostic-setting-based,Connect via diagnostic settings-based connectors,Connect Microsoft Sentinel to other Microsoft services by using diagnostic settings-based connections,Configure diagnostic settings-based connectors for Sentinel,Learn how to connect Microsoft Sentinel to Microsoft services with diagnostic settings-based connections.,"This article describes how to connect to Microsoft Sentinel by using diagnostic settings connections. 
Microsoft Sentinel uses the Azure foundation to provide built-in, service-to-service support for data ingestion from many Azure and Microsoft 365 services, Amazon Web Services, and various Windows Server services. There are a few different methods through which these connections are made. This article presents information that is common to the group of data connectors that use diagnostic setting",2024-11-04T12:13:00.000Z,how-to,configuration,0.7,True,"Covers diagnostic settings-based connections; likely documents required diagnostic categories, destinations, and configuration fields specific to Sentinel connectors.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/connect-services-windows-based,Connect via Windows agent-based connectors,Connect Microsoft Sentinel to other Microsoft services with a Windows agent-based data connector,Configure Windows agent-based data connectors for Sentinel,Learn how to connect Microsoft Sentinel to Microsoft services with Windows agent-based connections.,"This article describes how to connect Microsoft Sentinel to other Microsoft services with Windows agent-based connections. Microsoft Sentinel uses the Azure Monitor Agent to provide built-in, service-to-service support for data ingestion from many Azure and Microsoft 365 services, Amazon Web Services, and various Windows Server services. The Azure Monitor Agent uses Data collection rules (DCRs) to define the data to collect from each agent. 
Data collection rules offer you two distinct advantages: Manage ",2024-10-07T17:05:00.000Z,how-to,configuration,0.7,True,Describes using Azure Monitor Agent and DCRs for Windows-based connectors; includes product-specific configuration of rules and agent settings.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/connect-threat-intelligence-taxii,Connect to STIX/TAXII feeds,Connect to STIX/TAXII threat intelligence feeds - Microsoft Sentinel,Connect STIX/TAXII threat intel feeds to Sentinel,Learn how to connect Microsoft Sentinel to industry-standard threat intelligence feeds to import threat indicators.,"The STIX data format and the TAXII protocol are the most widely adopted industry standards for transmitting threat intelligence. Microsoft Sentinel supports integration with threat intelligence platforms using these standards, and provides built-in connectors for importing and exporting threat intelligence. Use the Threat Intelligence – TAXII data connector to import threat indicators from TAXII 2.0 or 2.1 servers into your Sentinel workspace. To share threat intelligence externally, configure the ",2025-10-20T22:18:00.000Z,how-to,integrations,0.78,True,Describes using the TAXII data connector to import/export threat indicators; includes protocol versions and connector-specific configuration.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/connect-threat-intelligence-tip,Connect threat intelligence platforms,Connect your threat intelligence platform - Microsoft Sentinel,Connect threat intelligence platform to Sentinel (legacy connector),Learn how to connect your threat intelligence platform (TIP) or custom feed to Microsoft Sentinel and send threat indicators.,"Note This data connector will be deprecated and will stop collecting data in June 2026. We recommend transitioning to the new Threat Intelligence Upload Indicators API data connector as soon as possible to ensure uninterrupted data collection. 
-For more information, see Connect your threat intelligence platform to Microsoft Sentinel with the upload API. Many organizations use threat intelligence platform (TIP) solutions to aggregate threat indicator feeds from various sources. From the aggregated f",2025-07-01T11:22:00.000Z,how-to,integrations,0.76,True,Connector article for TIP/custom feeds; includes configuration details and deprecation guidance for a specific Sentinel data connector.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/connect-threat-intelligence-upload-api,Connect threat intelligence with upload API,Connect your TIP with the upload API (Preview) - Microsoft Sentinel,Connect TIP to Sentinel using Threat Intel upload API,Learn how to connect your threat intelligence platform (TIP) or custom feed using the upload API to Microsoft Sentinel.,"Many organizations use threat intelligence platform (TIP) solutions to aggregate threat intelligence feeds from various sources. From the aggregated feed, the data is curated to apply to security solutions such as network devices, EDR/XDR solutions, or security information and event management (SIEM) solutions such as Microsoft Sentinel. The industry standard for describing cyberthreat information is called ""Structured Threat Information Expression"" or STIX. 
By using the upload API which suppor",2025-09-10T17:15:00.000Z,how-to,integrations,0.8,True,Describes using the upload API with STIX 2.1 to send indicators; includes API parameters and integration patterns unique to Sentinel.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/create-analytics-rule-from-template,Create a scheduled rule from a template,Create scheduled analytics rules from templates in Microsoft Sentinel,Create scheduled analytics rules from Sentinel templates,This article explains how to view and create scheduled analytics rules from templates in Microsoft Sentinel.,"By far the most common type of analytics rule, Scheduled rules are based on Kusto queries that are configured to run at regular intervals and examine raw data from a defined ""lookback"" period. These queries can perform complex statistical operations on their target data, revealing baselines and outliers in groups of events. If the number of results captured by the query passes the threshold configured in the rule, the rule produces an alert. Microsoft makes a vast array of analytics rule templates ava",2025-09-16T17:23:00.000Z,how-to,configuration,0.7,True,Focuses on creating rules from templates; typically includes template parameters and rule configuration fields specific to Sentinel scheduled rules.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/create-analytics-rules,Create a scheduled rule from scratch,Create scheduled analytics rules in Microsoft Sentinel,Create custom scheduled analytics rules in Sentinel,This article explains how to view and create scheduled analytics rules in Microsoft Sentinel.,"You’ve set up connectors and other means of collecting activity data across your digital estate. Now you need to dig through all that data to detect patterns of activity and discover activities that don’t fit those patterns and that could represent a security threat. 
Microsoft Sentinel and its many solutions provided in the Content hub offer templates for the most commonly used types of analytics rules, and you’re strongly encouraged to make use of those templates, customizing them to fit your speci",2025-12-29T08:00:00.000Z,how-to,configuration,0.7,True,Covers viewing and creating scheduled rules; likely details rule configuration options and query-related parameters unique to Sentinel.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/create-codeless-connector,Creating codeless data connectors (CCF),Create a codeless connector for Microsoft Sentinel,Create codeless connectors for Microsoft Sentinel with CCF,Learn how to create a codeless connector in Microsoft Sentinel using the Codeless Connector Framework (CCF).,"The Codeless Connector Framework (CCF) provides partners, advanced users, and developers the ability to create custom connectors for ingesting data to Microsoft Sentinel. Connectors created using the CCF are fully SaaS, with no requirements for service installations. They also include health monitoring and full support from Microsoft Sentinel. Use the following steps to create your CCF connector and connect your data source to Microsoft Sentinel. This article will show you how to complete each step",2024-09-26T08:00:00.000Z,how-to,integrations,0.8,True,"Shows how to define a CCF connector, including schema, configuration, and health monitoring; contains product-specific connector configuration parameters and patterns.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/create-custom-connector,Create a custom connector,Resources for creating Microsoft Sentinel custom connectors,,"Learn about available resources for creating custom connectors for Microsoft Sentinel. 
Methods include the Log Analytics API, Logstash, Logic Apps, PowerShell, and Azure Functions.","Microsoft Sentinel provides a wide range of out-of-the-box connectors for Azure services and external solutions, and also supports ingesting data from some sources without a dedicated connector. If you're unable to connect your data source to Microsoft Sentinel using any of the existing solutions available, consider creating your own data source connector. For a full list of supported connectors, see the Find your Microsoft Sentinel data connector article.",2024-11-06T18:02:00.000Z,concept-article,,0.3,False,"High-level resource index for creating custom connectors; detailed parameters and patterns are in the linked method-specific docs, not here.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/create-custom-connector-builder-agent,Create custom connectors using AI agent in Microsoft Sentinel,Get started with custom connectors using AI agent in Microsoft Sentinel,,Custom Data connectors using AI agent in Microsoft Sentinel Visual Studio Code extension,"The Microsoft Sentinel connector builder agent builds data connectors in minutes using the AI‑assisted workflow in GitHub Copilot with the Microsoft Sentinel extension for Visual Studio Code (VS Code). This low‑code experience guides developers and Independent Software Vendors (ISVs) end‑to‑end by autonomously generating schemas, deployment assets, connector UI, secure secret handling, and polling logic. Built‑in validation surfaces any polling issues early, so you can validate event logs befor",2026-03-30T20:20:00.000Z,feature-availability,,0.3,False,"Summary indicates a conceptual/getting-started overview of the Sentinel connector builder agent and workflow. 
It does not clearly reference concrete configuration parameter tables, limits, error codes, or other product-specific expert details as defined in the sub-skill types.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/create-incident-manually,Create incidents manually,Create your own incidents manually in Microsoft Sentinel in the Azure portal,,Manually create incidents in Microsoft Sentinel based on data or information received by the SOC through alternate means or channels.,"Important Manual incident creation, using the portal or Logic Apps, is currently in PREVIEW. See the Supplemental Terms of Use for Microsoft Azure Previews for additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability. Manual incident creation is generally available using the API. After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender por",2025-04-24T22:03:00.000Z,how-to,,0.5,False,Manual incident creation description is largely about capability and preview status; summary doesn’t expose detailed parameters or numeric constraints.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/create-incidents-from-alerts,Create incidents from Microsoft Security alerts,Create incidents from alerts in Microsoft Sentinel,Configure incident creation from alerts in Sentinel,Learn how to create incidents from alerts in Microsoft Sentinel.,"Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM and Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. 
Alerts triggered in Microsoft security solutions that are connected to Microsoft Sentinel, such as Microsoft Defender for",2025-10-29T11:15:00.000Z,how-to,configuration,0.7,True,"Describes how alerts become incidents; typically includes incident rule configuration, grouping options, and mappings unique to Sentinel.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/create-manage-use-automation-rules,Create automation rules,Create and use Microsoft Sentinel automation rules to manage response,Configure Sentinel automation rules for incident response,"This article explains how to create and use automation rules in Microsoft Sentinel to manage and handle incidents, in order to maximize your SOC's efficiency and effectiveness in response to security ","This article explains how to create and use automation rules in Microsoft Sentinel to manage and orchestrate threat response, in order to maximize your SOC's efficiency and effectiveness. In this article you'll learn how to define the triggers and conditions that determine when your automation rule runs, the various actions that you can have the rule perform, and the remaining features and functionalities. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure
With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. Microsoft Sentinel’s near-real-time analytics rules provide up-to-the-minute threat detection out-of-the-box. This type of ",2025-10-29T11:15:00.000Z,how-to,configuration,0.75,True,How-to for viewing and creating NRT rules; includes rule parameters and constraints specific to Sentinel’s NRT engine.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/create-push-codeless-connector,Creating push codeless data connectors (CCF),Microsoft Sentinel CCF push connectors (preview) - Getting started guide,Build push-based codeless connectors for Microsoft Sentinel,Learn how to create and deploy push-based codeless connectors for Microsoft Sentinel that send data in real time.,"This guide helps you understand, build, and deploy push-based codeless connectors for Microsoft Sentinel using the Codeless Connector Framework (CCF) Push (preview).",2026-01-29T12:09:00.000Z,how-to,integrations,0.8,True,"Push CCF connectors require specific endpoint, auth, and payload configuration; this guide will include Sentinel-specific parameters and integration patterns.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/create-tasks-automation-rule,Create incident tasks using automation rules,Create incident tasks in Microsoft Sentinel using automation rules,Create Sentinel incident task lists via automation rules,"This article explains how to use automation rules to create lists of incident tasks, in order to standardize analyst workflow processes in Microsoft Sentinel.","This article explains how to use automation rules to create lists of incident tasks, in order to standardize analyst workflow processes in Microsoft Sentinel. 
Incident tasks can be created automatically not only by automation rules, but also by playbooks, and also manually, ad-hoc, from within an incident.",2025-04-24T22:03:00.000Z,how-to,configuration,0.7,True,"Shows how to configure automation rules to create incident tasks, including how tasks are generated by rules/playbooks—Sentinel-specific configuration of workflow automation.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/customer-managed-keys,Set up customer-managed keys,Set up customer-managed keys in Microsoft Sentinel,Set up customer-managed keys for Microsoft Sentinel encryption,Learn how to set up customer-managed key (CMK) in Microsoft Sentinel.,This article provides background information and steps to configure a customer-managed key (CMK) for Microsoft Sentinel. All the data stored in Microsoft Sentinel is already encrypted by Microsoft in all relevant storage resources. CMK provides an extra layer of protection with an encryption key created and owned by you and stored in your Azure Key Vault.,2025-07-29T17:38:00.000Z,how-to,security,0.8,True,Provides Sentinel-specific CMK configuration steps with Key Vault integration and scope; these are concrete security configuration details.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/customize-alert-details,Customize alert details,Customize alert details in Microsoft Sentinel,"Customize Sentinel alert names, severity, and tactics","Customize how alerts are named and described, along with their severity and assigned tactics, based on the alerts' content.","This article explains how to override the default properties of alerts with content from the underlying query results. In the process of creating a scheduled analytics rule, as the first step you define a name and description for the rule, and you assign it a severity and MITRE ATT&CK tactics. 
All alerts generated by a given rule - and all incidents created as a result - will inherit the name, description, severity, and tactics defined in the rule, without regard to the particular content of a sp",2025-04-24T22:03:00.000Z,how-to,configuration,0.8,True,Describes overriding default alert properties based on query results; requires specific configuration fields and expressions unique to Sentinel.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/customize-entity-activities,Create custom entity activities,Customize activities on Microsoft Sentinel entity timelines,Customize activities on Sentinel entity timelines,Add customized activities to those Microsoft Sentinel tracks and displays on the timeline of entity pages,Important,2025-04-24T22:03:00.000Z,how-to,configuration,0.65,True,"Adding customized activities implies configuring which events appear on timelines, with Sentinel-specific activity types and mapping options.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/data-connection-rules-reference-azure-storage,Azure Storage Blob data connector reference,Azure Storage Blob data connector reference for the Codeless Connector Framework - Microsoft Sentinel,Configure CCF JSON for Azure Storage Blob connector,This article provides reference JSON fields and properties for creating the Azure Storage Blob data connector type and its data connection rules as part of the Codeless Connector Framework.,"To create an Azure Storage Blob data connector with the Codeless Connector Framework (CCF), use this reference in addition to the Microsoft Sentinel REST API for Data Connectors article. Each dataConnector represents a specific connection of a Microsoft Sentinel data connector. One data connector might have multiple connections, which fetch data from different endpoints. The JSON configuration built using this reference document is used to complete the deployment template for the CCF data connector. 
F",2026-03-11T11:05:00.000Z,reference,configuration,0.78,True,"Reference article with JSON fields and properties for the Codeless Connector Framework data connector and its data connection rules. This is product-specific configuration detail (schema, property names, structure) that an LLM would not reliably know from training.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/data-connection-rules-reference-gcp,GCP data connectors API reference,GCP data connector reference for the Codeless Connector Framework - Microsoft Sentinel,Configure GCP data connectors with Sentinel CCF,This article provides reference JSON fields and properties for creating the GCP data connector type and its data connection rules as part of the Codeless Connector Framework.,"To create a Google Cloud Platform (GCP) data connector with the Codeless Connector Framework (CCF), use this reference as a supplement to the Microsoft Sentinel REST API for Data Connectors docs. Each dataConnector represents a specific connection of a Microsoft Sentinel data connector. One data connector might have multiple connections, which fetch data from different endpoints. 
The JSON configuration built using this reference document is used to complete the deployment template for the CCF data con",2024-10-01T11:13:00.000Z,reference,integrations,0.9,True,Reference for JSON fields and properties for GCP connector type and its data connection rules; product-specific integration configuration details.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/data-connector-connection-rules-reference,RestApiPoller data connectors API reference,RestApiPoller data connector reference for the Codeless Connector Framework - Microsoft Sentinel,Configure RestApiPoller connector JSON for Sentinel CCF,This article provides reference JSON fields and properties to create the RestApiPoller data connector type and its data connection rules for the Codeless Connector Framework.,"You can create a RestApiPoller data connector with the Codeless Connector Framework (CCF) by using this article as a supplement to the Microsoft Sentinel REST API for data connectors docs. Each data connector represents a specific connection of a Microsoft Sentinel data connector. One data connector might have multiple connections, which fetch data from different endpoints. You can complete the deployment template for the CCF data connector by using the JSON configuration that you build with this arti",2026-03-03T18:22:00.000Z,reference,configuration,0.78,True,"The page is a reference for RestApiPoller data connector JSON fields and properties in the Codeless Connector Framework. It describes specific configuration parameters, their names, structure, and allowed usage for building data connection rules. 
This is product-specific configuration detail that an LLM is unlikely to know from training, and it aligns with the configuration sub-skill definition (parameter references and configuration structure), not limits, troubleshooting, or general concepts.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/data-connector-ui-definitions-reference,Data connector definitions API reference,Data connector definitions reference for the Codeless Connector Framework - Microsoft Sentinel,Define connector UIConfig JSON for Sentinel CCF,This article provides a supplemental reference for creating the connectorUIConfig JSON section for the Data Connector Definitions API as part of the Codeless Connector Framework.,"To create a data connector with the Codeless Connector Framework (CCF), use this document as a supplement to the Microsoft Sentinel REST API for Data Connector Definitions reference docs. Specifically, this reference document expands on the following section: For more information, see Create a codeless connector.",2025-10-30T22:11:00.000Z,reference,integrations,0.9,True,Reference for connectorUIConfig JSON fields for the Codeless Connector Framework; includes parameter names and structures specific to Sentinel’s Data Connector Definitions API.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/data-connectors-reference,Find data connector,Find your Microsoft Sentinel data connector,,Learn about specific configuration steps for Microsoft Sentinel data connectors.,"This article lists all supported, out-of-the-box data connectors and links to each connector's deployment steps. Important Data connectors are available as part of the following offerings: Solutions: Many data connectors are deployed as part of a Microsoft Sentinel solution together with related content like analytics rules, workbooks, and playbooks. For more information, see the Microsoft Sentinel solutions catalog. 
Community connectors: More data connectors are provided by the Microsoft Sentinel co",2026-02-18T10:58:00.000Z,reference,,0.2,False,"Page is a catalog/list of Microsoft Sentinel data connectors with links out to individual connector deployment/configuration pages. It does not itself contain specific limits, configuration parameter tables, error codes, or decision matrices; it primarily serves as navigation to other documentation.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/data-connectors-reference,Windows security events,Find your Microsoft Sentinel data connector,,Learn about specific configuration steps for Microsoft Sentinel data connectors.,"This article lists all supported, out-of-the-box data connectors and links to each connector's deployment steps. Important Data connectors are available as part of the following offerings: Solutions: Many data connectors are deployed as part of a Microsoft Sentinel solution together with related content like analytics rules, workbooks, and playbooks. For more information, see the Microsoft Sentinel solutions catalog. 
Community connectors: More data connectors are provided by the Microsoft Sentinel co",2026-02-18T10:58:00.000Z,reference,,0.2,False,"Same page as index 0: a reference list of Sentinel data connectors and links, without embedded expert-level configuration details, limits, troubleshooting mappings, or decision criteria.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/data-source-schema-reference,Data source schema reference,Microsoft Sentinel data source schema reference,Reference Sentinel-supported data source schemas,"This article lists Azure and third-party data source schemas supported by Microsoft Sentinel, with links to their reference documentation.","This article lists supported Azure and third-party data source schemas, with links to their reference documentation.",2023-10-12T11:16:00.000Z,reference,configuration,0.7,True,"Lists supported data source schemas with links to schema references; while high-level, it is a schema/catalog reference specific to Sentinel integrations.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/data-transformation,Ingestion-time data transformation,Custom data ingestion and transformation in Microsoft Sentinel,,Learn about how Azure Monitor's custom log ingestion and data transformation features can help you get any data into Microsoft Sentinel and shape it the way you want.,"Azure Monitor Logs serves as the data platform for Microsoft Sentinel. All logs ingested into Microsoft Sentinel are stored in a Log Analytics workspace, and log queries written in Kusto Query Language (KQL) are used to detect threats and monitor your network activity. Log Analytics gives you a high level of control over the data that gets ingested to your workspace with custom data ingestion and data collection rules (DCRs). 
DCRs allow you to both collect and manipulate your data before it's stored in",2026-04-01T08:46:00.000Z,conceptual,,0.3,False,"From the summary, the page is a conceptual overview of custom data ingestion and transformation using Azure Monitor Logs and data collection rules for Microsoft Sentinel. It does not clearly indicate the presence of specific numeric limits, configuration parameter tables, error-code-based troubleshooting, or decision matrices. Without evidence of detailed settings, quotas, or product-specific diagnostic mappings, it does not meet the expert-knowledge criteria for any sub-skill type.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/data-type-cloud-support,Support for data types in different clouds,Support for Microsoft Sentinel connector data types in different clouds,Assess Sentinel connector data type support by cloud,This article describes the types of clouds that affect data streaming from the different connectors that Microsoft Sentinel supports.,"Microsoft Sentinel data connectors use data stored in various cloud environments, like the Microsoft 365 Commercial cloud or the Government Community Cloud (GCC). This article describes the types of clouds that affect the supported data types for the different connectors that Microsoft Sentinel supports. 
Specifically, support varies for different Microsoft Defender XDR connector data types in different GCC environments.",2024-06-13T11:23:00.000Z,conceptual,decision-making,0.65,True,"Describes which connector data types are supported in which cloud environments (commercial vs GCC variants), effectively a capability matrix used for deployment and connector selection decisions.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/asset-data-tables,Asset data tables in Microsoft Sentinel data lake,Asset data tables in Microsoft Sentinel data lake - Microsoft Security,Use asset data tables in Microsoft Sentinel data lake,Asset data tables in security data lake,The following table mappings are available in the Microsoft Sentinel data lake for asset data.,2025-09-30T12:49:00.000Z,reference,configuration,0.78,True,A table-mapping reference for asset data in the Sentinel data lake; such schema/table mappings are configuration-level reference details specific to the product.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/auditing-lake-activities,Audit Microsoft Sentinel data lake and graph in Microsoft Purview portal,Audit log for Microsoft Sentinel data lake and graph in Microsoft Purview portal,Use audit log for Sentinel data lake and graph activities,Learn how to use the audit log to search for Microsoft Sentinel data lake activities to help with investigation.,"The audit log helps you investigate specific activities across Microsoft services. Microsoft Sentinel data lake and graph activities are audited and can be searched in the audit log. The audit log provides a record of activities that are performed by users and administrators in Microsoft Sentinel data lake and graph, such as: Auditing is automatically turned on for Microsoft Sentinel data lake and graph. 
Features that are audited are logged in the audit log automatically.",2025-09-30T12:49:00.000Z,how-to,security,0.7,True,Describes how Sentinel data lake and graph activities appear in the audit log and how to search them; this is product-specific auditing configuration/usage.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/create-custom-graphs,Create custom graphs,Get started with custom graphs in Microsoft Sentinel (preview),Build and manage custom security graphs with Sentinel,"Learn how to create and manage custom graphs in Microsoft Sentinel to model attack patterns, investigate threats, and run advanced graph algorithms.","Custom graphs in Microsoft Sentinel enable security researchers and analysts to create tailored graph representations of their security data. By building custom graphs, you can model specific attack patterns, investigate threats, and run advanced graph algorithms to uncover hidden relationships within your digital environment. This guide walks you through the steps to create and manage custom graphs by using Jupyter notebooks in the Microsoft Sentinel Visual Studio Code extension. This article f",2026-03-30T20:20:00.000Z,how-to,integrations,0.6,True,"Step-by-step guide using Jupyter notebooks and Sentinel VS Code extension to create graphs; likely includes API/SDK usage, class/method names, and parameter patterns specific to Sentinel graph integration with Spark/Notebooks.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/create-graphs-with-ai,Create custom graph using AI,AI-assisted custom graph authoring in Microsoft Sentinel (preview) - Microsoft Security,,"Use AI assistance in Visual Studio Code to create, modify, and query custom security graphs using Jupyter notebooks and GitHub Copilot.","Use GitHub Copilot in Visual Studio Code with Microsoft Sentinel to create, modify, and query custom security graphs using Jupyter notebooks. 
Describe what you want to build in natural language, review the generated notebook, and refine it as needed. Use Copilot for various graph authoring tasks, including:",2026-03-30T20:20:00.000Z,how-to,,0.3,False,"Describes using GitHub Copilot and notebooks for graph authoring; appears as a how-to/tutorial without product-specific limits, configs, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/custom-graphs-overview,Custom graphs overview,Custom graphs in Microsoft Sentinel-Overview (preview),,An overview of custom graphs in Microsoft Sentinel,"Custom graphs let you build tailored security graphs tuned to your unique security scenarios using data from Sentinel data lake as well as non-Microsoft sources. With custom graph, powered by Fabric, you can build, query, and visualize connected data, uncover hidden patterns and attack paths, and help surface risks that are hard to detect when data is analyzed in isolation. These graphs provide the knowledge context that enables AI-powered agent experiences to work more effectively, speeding inv",2026-03-30T20:20:00.000Z,how-to,,0.25,False,"Labeled as an overview of custom graphs; description emphasizes capabilities and scenarios, not detailed configuration parameters, limits, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/data-federation-overview,Overview,Data federation overview in Microsoft Sentinel data lake - Microsoft Security,,"Learn how data federation in Microsoft Sentinel data lake enables seamless querying of external data sources including Azure Databricks, ADLS Gen 2, and Microsoft Fabric.","Data federation in Microsoft Sentinel enables seamless querying of multiple external data sources from within the Microsoft Sentinel data lake environment. 
By federating data sources such as Azure Databricks, Azure Data Lake Storage (ADLS) Gen 2, and Microsoft Fabric, organizations can enhance their security analytics and operational insights without moving or duplicating data.",2026-03-30T20:20:00.000Z,concept-article,,0.2,False,"High-level overview of data federation; description suggests conceptual explanation of capabilities without numeric limits, config parameter tables, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/data-federation-setup,Set up federated tables,Set up federated data connectors in Microsoft Sentinel data lake - Microsoft Security,Configure federated data connectors in Sentinel data lake,"Learn how to configure federated data connectors for Azure Databricks, ADLS Gen 2, and Microsoft Fabric in Microsoft Sentinel data lake.","This article explains how to configure federated data connectors to enable querying of external data sources from the Microsoft Sentinel data lake. You can federate with Azure Databricks, Azure Data Lake Storage (ADLS) Gen 2, and Microsoft Fabric.",2026-03-30T20:20:00.000Z,how-to,configuration,0.65,True,"Setup article for federated connectors (Databricks, ADLS Gen2, Fabric) is likely to include connector-specific configuration parameters, connection settings, and possibly tables of required values unique to Sentinel data lake.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/enable-data-connectors,Asset data in the Sentinel data lake,Asset data in Microsoft Sentinel data lake - Microsoft Security,,Asset data in security data lake,"Asset data in cybersecurity refers to an organization’s physical and digital entities such as computers, identities, software, cloud services, and networks. It shows what exists so you know what must be protected. 
Microsoft Sentinel’s data lake adds powerful value by storing this asset data in a scalable, cost-efficient way that supports long-term retention, advanced analytics, and AI-driven threat detection. With unified visibility across systems and flexible data management, Sentinel lake help",2025-11-04T23:10:00.000Z,conceptual,,0.3,False,Conceptual explanation of asset data and benefits; appears more like an overview than a configuration or troubleshooting guide.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/gql-reference-for-sentinel-custom-graph,GQL reference for Sentinel custom graph,Graph Query Language (GQL) reference for Microsoft Sentinel graph (Preview),Use GQL syntax to query Sentinel custom graphs,"Learn the fundamental concepts, functions, and operators of Graph Query Language (GQL) for querying graph data in Microsoft Sentinel graph.","Applies to: Microsoft Sentinel Graph Note GQL support is in preview. Features and syntax can change based on feedback and ongoing development. This reference covers the fundamental concepts, functions, and operators of Graph Query Language (GQL). Graph Query Language (GQL) is built on mathematical graph theory concepts that provide a solid foundation for querying graph data. 
Understanding these fundamentals helps you write more effective queries and better understand how GQL processes your data.",2026-03-30T20:20:00.000Z,reference,integrations,0.65,True,"A language reference for Sentinel’s GQL with product-specific operators, functions, and syntax details that go beyond generic graph query knowledge; fits integrations & coding patterns as it defines query API surface and parameters.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/graph-rest-api,Graph REST API,Graph REST APIs for custom graphs (preview) - Microsoft Security,Call Sentinel custom graph REST APIs programmatically,Learn how to use the Graph REST APIs to list and query custom graphs in the Microsoft Sentinel data lake.,"The Graph REST APIs let you list and query custom graphs in your Microsoft Sentinel data lake. Use these APIs to programmatically interact with your custom graphs from any HTTP client, automation pipeline, or custom application. For more information on creating custom graphs, see Create custom graphs in the security data lake.",2026-04-01T08:00:00.000Z,reference,integrations,0.8,True,"REST API article for listing/querying custom graphs will include endpoint paths, parameters, request/response schemas, and constraints unique to Sentinel; matches integrations & coding patterns.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/graph-visualization,Graph visualization,Visualize custom graphs in Microsoft Sentinel graph (preview),,"Learn how to use Microsoft Sentinel graph to query, visualize, and interact with custom security graphs to gain new security insights.","The graphs experience in the Microsoft Defender portal enables you to perform interactive graph-based investigations on your custom graphs, such as using a graph built for phishing analysis to help you quickly evaluate the impact of a recent incident, profile the attacker, and trace its paths across Microsoft telemetry and third-party data.
This experience allows you to run graph queries to visualize the insights that matter most to your organization and supports ad hoc traversal of the graph so",2026-03-30T20:20:00.000Z,how-to,,0.45,False,Graph visualization article appears focused on interactive investigation workflows and running graph queries; likely more tutorial/usage than configuration tables or product-specific limits.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/identity-attack-graph,Identity attack graph,Identity attack graph in Microsoft Sentinel - Microsoft Security,,"Learn how the identity attack graph in Microsoft Sentinel models identities, permissions, and Azure resources to surface lateral movement paths and privilege escalation risks.","The identity attack graph in Microsoft Sentinel visualizes how identities connect to Azure resources through permissions and group memberships. Security analysts can use the graph to identify lateral movement paths, which are the potential routes an attacker could take to move from one identity or resource to another by exploiting existing permissions, group memberships, or trust relationships, often to escalate privileges or reach sensitive assets. The predefined identity attack graph represent",2026-04-12T11:12:00.000Z,overview,,0.3,False,"The description and summary indicate a conceptual explanation of the identity attack graph and how it visualizes identities, permissions, and lateral movement paths. 
There’s no clear indication of configuration parameters, limits, error codes, or decision matrices; it reads as a feature overview rather than detailed expert guidance.",new -https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-jobs,Aggregate insights from raw data into an Auxiliary table,Create jobs in the Microsoft Sentinel data lake - Microsoft Security,Configure and schedule KQL jobs in Sentinel data lake,Use the Defender portal's Data lake exploration KQL queries to create and schedule jobs to promote data to the analytics tier.,"KQL jobs are one-time or scheduled KQL queries on data in the Microsoft Sentinel data lake and federated tables. Use jobs for investigative and analytical scenarios, such as: KQL jobs are especially effective when queries use joins or unions across different datasets. Use jobs to promote data from the data lake tier to the analytics tier. Once in the analytics tier, use the advanced hunting KQL editor to query the data. Promoting data to the analytics tier has the following benefits: Note Storag",2026-04-01T08:46:00.000Z,how-to,configuration,0.68,True,"KQL jobs in the Sentinel data lake are a product-specific feature; this page describes how to configure one-time and scheduled jobs, including job parameters and behaviors for promoting data from the data lake tier to the analytics tier. These configuration details (job types, scheduling options, promotion behavior) are not generic KQL knowledge and qualify as expert, product-specific configuration guidance.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-jobs,Create KQL jobs,Create jobs in the Microsoft Sentinel data lake - Microsoft Security,Configure and schedule KQL jobs in Sentinel data lake,Use the Defender portal's Data lake exploration KQL queries to create and schedule jobs to promote data to the analytics tier.,"KQL jobs are one-time or scheduled KQL queries on data in the Microsoft Sentinel data lake and federated tables. 
Use jobs for investigative and analytical scenarios, such as: KQL jobs are especially effective when queries use joins or unions across different datasets. Use jobs to promote data from the data lake tier to the analytics tier. Once in the analytics tier, use the advanced hunting KQL editor to query the data. Promoting data to the analytics tier has the following benefits: Note Storag",2026-04-01T08:46:00.000Z,how-to,configuration,0.75,True,"Focuses on creating and scheduling KQL jobs; such content typically includes job configuration options (schedule, scope, output targets) that are specific settings for this product.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-jobs-summary-rules-search-jobs,"Compare KQL jobs, summary rules, and search jobs","KQL jobs, summary rules, and search jobs - Microsoft Security","Choose between KQL jobs, summary rules, and search jobs","A comparison of KQL jobs, summary rules, and search jobs in Microsoft Sentinel to choose the best tool for querying and analyzing security data.","This article compares KQL jobs, summary rules, and search jobs in Microsoft Sentinel. These features let you query and analyze data in Microsoft Sentinel, and each serves different purposes and use cases. Note KQL jobs require onboarding to the Microsoft Sentinel data lake. For more information, see Onboard to the Microsoft Sentinel data lake. KQL jobs: Run one-time or scheduled asynchronous queries on data stored in the Microsoft Sentinel data lake.
KQL jobs are best for incident investigations ",2026-03-29T08:00:00.000Z,how-to,decision-making,0.8,True,"Explicit comparison of three Sentinel features with guidance on when to use each; likely includes scenario-based recommendations and possibly capability/behavior differences, fitting decision-making.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-manage-jobs,Manage KQL jobs,Manage KQL jobs - Microsoft Security,Manage Microsoft Sentinel data lake KQL jobs,Managing KQL jobs in the Defender portal for Microsoft Sentinel data lake,"A KQL job is a one-time or scheduled task that runs a KQL (Kusto Query Language) query against the data in the data lake tier to promote the results to the analytics tier. Jobs can be created in the KQL queries editor, or the Jobs page under Microsoft Sentinel > Data lake exploration in the Microsoft Defender portal. For more information, see KQL jobs. The Jobs management page provides the following functions:",2025-09-30T12:49:00.000Z,concept-article,configuration,0.7,True,"Jobs management page functions (enable, disable, edit, delete) typically include specific job state fields, filters, and options unique to Sentinel data lake job management.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-overview,Overview,KQL and the Microsoft Sentinel data lake - Microsoft Security,,Exploring and interacting with the Microsoft Sentinel data lake using KQL,"With Microsoft Sentinel data lake, you can store and analyze high-volume, low-fidelity logs like firewall or DNS data, asset inventories, and historical records for up to 12 years. Because storage and compute are decoupled, you can query the same copy of data using multiple tools, without moving or duplicating it.
You can explore data in the data lake using Kusto Query Language (KQL) and Jupyter Notebooks, to support a wide range of scenarios, from threat hunting and investigations to enrichment",2026-04-01T08:46:00.000Z,concept-article,,0.4,False,"High-level overview of using KQL with Sentinel data lake; mostly conceptual and scenario-focused without detailed configuration tables, limits, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-queries,Run KQL queries,Run KQL queries against the Microsoft Sentinel data lake - Microsoft Security,Run and manage KQL queries in Sentinel data lake,"Use the Defender portal's Data lake exploration KQL queries to query and interact with the Microsoft Sentinel data lake. Create, edit, and run KQL queries to explore your data lake resources","Data lake exploration in the Microsoft Defender portal provides a unified interface to analyze your data lake. It lets you run KQL (Kusto Query Language) queries, create jobs, and manage them. The KQL queries page under Data lake exploration lets you edit and run KQL queries on data lake resources and federated tables. Create jobs to promote data from the data lake to the analytics tier, or create aggregate tables in the data lake tier. Run jobs on demand or schedule them. The Jobs page lets you manag",2026-03-26T08:00:00.000Z,how-to,configuration,0.65,True,"Describes the KQL queries page, creating jobs, and managing them; likely includes UI- or API-level options and parameters for jobs and queries, which are product-specific configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-queries-api,KQL using the API,Run KQL queries on Microsoft Sentinel data lake using APIs - Microsoft Security,Run Sentinel data lake KQL queries via REST APIs,"Learn how to run KQL queries against the Microsoft Sentinel data lake programmatically using REST APIs.
Enable automation, intelligent agents, and scalable analytics.","Microsoft Sentinel data lake supports running Kusto Query Language (KQL) queries programmatically by using REST APIs. This enables security teams and automation systems to retrieve analytical results without using the Azure portal or interactive query editors. -This article explains when to use the API, required permissions, and how to submit a basic query request.",2026-03-26T17:13:00.000Z,how-to,integrations,0.85,True,"Explains programmatic KQL execution with REST APIs, including required permissions, request formats, and parameters; matches integrations & coding patterns with product-specific API details.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-sample-queries,Sample KQL queries,Sample KQL queries for Microsoft Sentinel data lake - Microsoft Security,,Use KQL queries to explore and analyze data in the Microsoft Sentinel data lake.,This article provides sample KQL queries that you can use interactively or in KQL jobs to investigate security incidents and monitor for suspicious activity in the Microsoft Sentinel data lake.,2025-12-10T12:16:00.000Z,how-to,,0.5,False,"Sample KQL queries are useful but are generic query examples rather than product-specific configuration, limits, or troubleshooting patterns; they don’t fit the defined sub-skill types.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-troubleshoot,Troubleshoot KQL for the lake,Troubleshoot KQL queries for the data lake - Microsoft Security,Troubleshoot KQL queries and jobs in Sentinel data lake,Troubleshoot KQL queries for the Microsoft Sentinel data lake.,"Use the following checklist to resolve common issues when working with KQL (Kusto Query Language) queries and jobs in Microsoft Sentinel data lake. Check for prerequisites before running queries or jobs. For more information, see Roles and permissions for the Microsoft Sentinel data lake.
Ensure that you selected the correct workspaces before executing KQL queries or jobs. Confirm that all referenced tables and workspaces exist and are accessible. Use only supported KQL operators and commands to ",2025-09-30T12:49:00.000Z,how-to,troubleshooting,0.85,True,"Explicit troubleshooting checklist for KQL queries and jobs; likely includes specific error conditions, unsupported operators, and role/permission issues mapped to resolutions.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/notebook-examples,Notebook examples for data lake exploration,Notebook examples for querying the Microsoft Sentinel data lake - Microsoft Security,Notebook code examples for querying Sentinel data lake,"This article provides sample code snippets for querying the Microsoft Sentinel data lake using Jupyter notebooks, demonstrating how to access and analyze security data.","This article presents some sample code snippets that demonstrate how to interact with Microsoft Sentinel lake data using Jupyter notebooks to analyze security data in the Microsoft Sentinel data lake. These examples illustrate how to access and analyze data from various tables, such as Microsoft Entra ID sign-in logs, group information, and device network events. The code snippets are designed to run in Jupyter notebooks within Visual Studio Code using the Microsoft Sentinel extension. 
To run th",2025-08-27T08:00:00.000Z,how-to,integrations,0.7,True,Provides concrete code snippets using the Sentinel VS Code extension and Spark sessions; these are product-specific integration patterns for accessing particular tables and schemas.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/notebook-jobs,Create and manage notebook jobs,Create and manage Jupyter notebook jobs - Microsoft Security,Create and schedule Sentinel Spark notebook jobs,This article describes how to explore and interact with lake data using Spark notebooks in Visual Studio Code.,"You can create scheduled jobs to run at specific times or intervals using the Microsoft Sentinel extension for Visual Studio Code. Jobs allow you to automate data processing tasks to summarize, transform, or analyze data in the Microsoft Sentinel data lake and federated tables. Jobs are also used to process data and write results to custom tables in the lake tier or analytics tier.",2026-03-26T08:00:00.000Z,how-to,configuration,0.7,True,"Describes creating scheduled notebook jobs via the Sentinel VS Code extension; likely details job configuration parameters (schedule, target tables, output locations) that are product-specific settings.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/notebooks,Run notebooks,Running notebooks on the Microsoft Sentinel data lake - Microsoft Security,Use Jupyter notebooks with Sentinel data lake in VS Code,This article describes how to explore and interact with data lake data using Jupyter notebooks in Visual Studio Code.,"Jupyter notebooks provide an interactive environment for exploring, analyzing, and visualizing data in the Microsoft Sentinel data lake and federated tables. With notebooks, you can write and execute code, document your workflow, and view results—all in one place. This makes it easy to perform data exploration, build advanced analytics solutions, and share insights with others. 
By leveraging Python and Apache Spark within Visual Studio Code, notebooks help you transform raw security data into ac",2026-04-01T08:46:00.000Z,how-to,integrations,0.65,True,"Covers running notebooks with Python/Spark against Sentinel data lake; likely includes connection configuration, libraries, and code patterns unique to this integration.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/notebooks-overview,Overview,Exploring and interacting with lake data using Jupyter Notebooks - Microsoft Security,,This article gives an overview of Jupyter notebooks in Visual Studio Code for the Microsoft Sentinel data lake.,"Jupyter notebooks are an integral part of the Microsoft Sentinel data lake ecosystem, offering powerful tools for data analysis and visualization. The notebooks are provided by the Microsoft Sentinel Visual Studio Code extension that allows you to interact with the data lake using Python for Spark (PySpark). Notebooks enable you to perform complex data transformations, run machine learning models, and create visualizations directly within the notebook environment. The Microsoft Sentinel Visual S",2025-07-22T12:54:00.000Z,overview,,0.35,False,Overview of Jupyter notebooks with Sentinel data lake; high-level explanation of capabilities rather than detailed configuration or troubleshooting.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-graph-overview,Microsoft Sentinel graph overview,What is Microsoft Sentinel graph? 
- Microsoft Security,,"Learn how Microsoft Sentinel graph enables multi-modal security analytics through graph-based representation of security data, providing deep insights into digital environments and attack paths.","Microsoft Sentinel graph is a unified graph analytics capability within Microsoft Sentinel that powers graph-based experiences across security, compliance, identity, and the Microsoft Security ecosystem - empowering security teams to model, analyze, and visualize complex relationships across their digital estate. Unlike traditional tabular data approaches, Sentinel graph enables defenders and AI agents to reason over interconnected assets, identities, activities, and threat intelligence, unlocki",2026-04-01T08:46:00.000Z,overview,,0.2,False,"Described as an overview of Sentinel graph and its benefits; likely conceptual architecture explanation without concrete thresholds, configs, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-graph-provider-reference,Microsoft Sentinel graph provider reference,Microsoft Sentinel graph provider reference,Use Sentinel graph provider API in Spark notebooks,Reference documentation for the Microsoft Sentinel Graph Builder API for building and querying security graphs.,"The sentinel_graph class provides a way to interact with the Microsoft Sentinel graph, allowing you to define your graph schema, transform data from the Microsoft Sentinel data lake into nodes and edges, publish a graph, query graph, and run advanced graph algorithms. 
This class is designed to work with the Spark sessions in Jupyter notebooks running on Microsoft Sentinel spark compute.",2026-03-30T20:20:00.000Z,reference,integrations,0.8,True,"Reference for sentinel_graph class and Graph Builder API; by definition includes method signatures, parameter names, and usage patterns unique to Sentinel graph integration with Spark sessions.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-connectors,Set up connectors for the Microsoft Sentinel data lake,Set up connectors for the Microsoft Sentinel data lake - Microsoft Security,Configure connectors and retention for Sentinel data lake tiers,Setting up and configuring connectors for Microsoft Sentinel data lake.,"The Microsoft Sentinel data lake mirrors data from Microsoft Sentinel workspaces. When you onboard to Microsoft Sentinel data lake, your existing Microsoft Sentinel data connectors are configured to send data to both the analytics tier - your Microsoft Sentinel workspaces, and mirror the data to the data lake tier for longer term storage. After onboarding, configure your connectors to retain data in each tier according to your requirements.
This article explains how to set up connectors for the ",2026-02-08T12:21:00.000Z,conceptual,configuration,0.7,True,Explains setting up connectors and configuring retention per tier; likely includes connector settings and retention configuration options specific to Sentinel data lake.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-onboard-defender,Onboard to data lake and graph from Defender portal,Onboarding to Microsoft Sentinel data lake from the Defender portal - Microsoft Security,Onboard Sentinel data lake from Defender portal,This article describes how to onboard to the Microsoft Sentinel data lake for customers who are currently using Microsoft Defender.,"Onboarding your tenant to the Microsoft Sentinel data lake occurs once and starts from the Microsoft Defender portal. The onboarding process creates a new Microsoft Sentinel data lake for your tenant in the subscription specified during the onboarding process. Graph enablement is included as part of onboarding. If you had onboarded to the data lake during public preview, you're automatically upgraded to the generally available data lake and graph. Note You'll always have one data lake that you c",2025-11-13T08:00:00.000Z,how-to,configuration,0.65,True,"Describes onboarding flow from Defender portal with subscription selection and data lake creation specifics, including constraints like single data lake per tenant.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-onboarding,Microsoft Sentinel platform deployment overview,Onboarding to Microsoft Sentinel data lake and graph - Microsoft Security,Onboard tenants to Microsoft Sentinel data lake and graph,This article describes how to onboard to the Microsoft Sentinel data lake and graph,"The Microsoft Sentinel data lake is a tenant-wide repository for collecting, storing, and managing large volumes of security-related data from various sources.
It enables comprehensive, unified analysis and visibility across your security landscape. Microsoft Sentinel graph is a unified graph capability within the Microsoft Sentinel platform powering graph-based experiences across security, compliance, identity, and the entire ecosystem. These solutions use advanced analytics, machine learning, graphs, ",2025-11-13T08:00:00.000Z,how-to,configuration,0.65,True,Onboarding to Sentinel data lake/graph includes tenant-wide configuration steps and likely specific settings and constraints unique to this platform capability.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-overview,Microsoft Sentinel data lake overview,Microsoft Sentinel data lake overview - Microsoft Security,,"An overview of Microsoft Sentinel data lake, a cloud-native platform that extends Microsoft Sentinel with highly scalable, cost-effective long-term storage, advanced analytics, and AI-driven security ","Microsoft Sentinel data lake is a purpose-built, cloud-native security data lake that transforms how organizations manage and analyze security data. Designed as a true data lake, it ingests, stores, and analyzes large volumes of diverse security data at scale. By centralizing security data into a single, open-format, extensible platform, it provides deep visibility, long-term retention, and advanced analytics.
The data lake lets you bring all your security data into Microsoft Sentinel cost-effec",2025-09-30T12:49:00.000Z,conceptual,,0.4,False,Overview of Sentinel data lake; conceptual description of capabilities and benefits without detailed config tables or numeric thresholds in the summary.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-service-limits,Microsoft Sentinel data lake service limits,Microsoft Sentinel data lake service limits - Microsoft Security,Service limits and quotas for Microsoft Sentinel data lake,Service limits for the Microsoft Sentinel data lake service.,The following service parameters and limits apply to the Microsoft Sentinel data lake service.,2026-02-26T06:24:00.000Z,concept-article,limits-quotas,0.95,True,"Explicitly a service limits page; will contain numeric limits, sizes, and timeouts that are expert-only details.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-agent-creation-tool,Agent creation,Agent creation tool collection in Microsoft Sentinel MCP server - Microsoft Security,Leverage Sentinel MCP agent creation tool collection,Learn about the different tools available in the Agent creation collection in Microsoft Sentinel,The agent creation tool collection in the Microsoft Sentinel Model Context Protocol (MCP) server lets you create effective Microsoft Security Copilot agents.,2026-01-30T08:00:00.000Z,how-to,integrations,0.65,True,"Covers a specific MCP tool collection for creating Security Copilot agents; likely documents tool names, inputs, and configuration patterns unique to this product, fitting integrations & coding patterns.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-billing,"Billing, limits, and availability","Microsoft Sentinel MCP server pricing, limits, and availability - Microsoft Security","Sentinel MCP server pricing, limits, and availability","Learn about the pricing, limits, and availability of using the 
different MCP collection of tools in Microsoft Sentinel","Important Some information relates to a prerelease product that may be substantially modified before it's released. Microsoft makes no warranties, expressed or implied, with respect to the information provided here. This article provides information on pricing, limits, and availability when setting up and using Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools.",2026-04-02T08:00:00.000Z,concept-article,limits-quotas,0.9,True,"Explicitly about pricing, limits, and availability; such pages typically contain numeric quotas, per-tool usage caps, and possibly region availability tables that qualify as limits & quotas.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-chatgpt-claude-connector,Use MCP connector in ChatGPT or Claude,Use the Microsoft Sentinel MCP connector in ChatGPT or Claude - Microsoft Security,Enable and use Microsoft Sentinel MCP connector with ChatGPT or Claude,Learn how to turn on and use a custom Microsoft Sentinel's Model Context Protocol (MCP) connector in ChatGPT or Claude,"Important This information relates to a prerelease product that may be substantially modified before it's released. Microsoft makes no warranties, expressed or implied, with respect to the information provided here. This article shows you how to enable and use a custom Microsoft Sentinel Model Context Protocol (MCP) connector in ChatGPT by OpenAI or Claude by Anthropic. By using this approach, Security Operations Center (SOC) analysts can run security tasks by using Microsoft Sentinel MCP.",2026-04-06T08:00:00.000Z,how-to,integrations,0.7,True,"The article describes a product-specific integration pattern between Microsoft Sentinel and MCP-based clients (ChatGPT/Claude), including how to enable and use a custom Sentinel MCP connector. 
This is specialized integration knowledge that involves concrete configuration steps and parameters unique to this connector, which an LLM is unlikely to know from training. It is not just a conceptual overview but a how-to for a specific integration, fitting the integrations sub-skill.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-create-custom-tool,Create your own custom tool,Create and use custom Microsoft Sentinel MCP tools - Microsoft Security,Create custom Sentinel MCP tools from KQL queries,Learn how to set up and use custom Microsoft Sentinel Model Context Protocol (MCP) tools using saved KQL queries in advanced hunting,"Important This information relates to a prerelease product that may be substantially modified before it's released. Microsoft makes no warranties, expressed or implied, with respect to the information provided here. Security agents built with Microsoft Sentinel's collection of Model Context Protocol (MCP) tools can effectively reason over data in Microsoft Sentinel. You can create custom Sentinel MCP tools to have granular control over the data accessible to your security agents and create deter",2025-11-24T08:00:00.000Z,get-started,integrations,0.7,True,"Shows how to build custom MCP tools from saved KQL queries; likely includes tool schema, parameter definitions, and configuration fields unique to Sentinel MCP, which are expert integration details.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-data-exploration-tool,Data exploration,Data exploration tool collection in Microsoft Sentinel MCP server - Microsoft Security,Configure data exploration tools in Sentinel MCP server,Learn about the different tools available in the Data exploration collection in Microsoft Sentinel,"Important Some information relates to a prerelease product that may be substantially modified before it's released. 
Microsoft makes no warranties, expressed or implied, with respect to the information provided here. The data exploration tool collection in the Microsoft Sentinel Model Context Protocol (MCP) server lets you search for relevant tables and retrieve data from Microsoft Sentinel's data lake by using natural language.",2026-04-14T08:00:00.000Z,how-to,configuration,0.64,True,"A page describing a specific 'data exploration tool collection' in an MCP server is likely to enumerate tool names, parameters, and how to invoke them to search tables and retrieve data. Those are product-specific configuration/integration details for the Sentinel MCP tools, not just conceptual overview, and best match the configuration sub-skill among the given options.",updated
-https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-get-started,Overview,Get started with Microsoft Sentinel MCP server - Microsoft Security,Configure and use Microsoft Sentinel MCP server tools,Learn how to set up and use Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools to enable natural language queries and AI-powered security investigations,This article shows you how to set up and use Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools to enable natural language queries against your security data. Sentinel's support for MCP enables security teams to bring AI into their security operations by allowing AI models to access security data in a standard way. Sentinel's collection of security tools works with multiple clients and automation platforms. You can use these tools to search for relevant tables and retri,2026-04-17T11:12:00.000Z,get-started,configuration,0.68,True,"A 'get started' page for an MCP server typically includes concrete setup steps, configuration parameters, and tool usage details specific to Microsoft Sentinel’s MCP implementation (for example, server endpoints, tool names, and configuration fields). 
These are product-specific configuration details that go beyond generic LLM knowledge, but the summary doesn’t indicate limits, troubleshooting, or decision matrices.",updated
-https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-logic-apps,Build logic apps with Microsoft Sentinel MCP,Build Azure Logic Apps with Microsoft Sentinel MCP tools - Microsoft Security,Integrate Sentinel MCP tools into Azure Logic Apps,Learn how to set up an Azure Logic App using Microsoft Sentinel's collection of Model Context Protocol (MCP) tools,"Important This information relates to a prerelease product that may be substantially modified before it's released. Microsoft makes no warranties, expressed or implied, with respect to the information provided here. You can access the value of Microsoft Sentinel's collection of Model Context Protocol (MCP) tools in Azure Logic Apps, starting with the entity analyzer tool. Security analysts and automation engineers often spend significant time creating complex Security Orchestration, Automation, an",2026-02-16T08:00:00.000Z,how-to,integrations,0.7,True,"Describes using MCP tools (like entity analyzer) inside Logic Apps; such articles typically list connector actions, parameters, and constraints specific to this integration.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-overview,Microsoft Sentinel MCP server overview,What is Microsoft Sentinel’s support for MCP? - Microsoft Security,,Learn how Model Context Protocol (MCP),"Microsoft Sentinel, our security platform, introduces support for Model Context Protocol (MCP). This support includes multiple scenario-focused collections of security tools through a unified server interface. With this support, you can interactively query security data in natural language and build effective security agents that can perform complex automation. 
Our collection of security tools helps security teams bring AI into their daily security operations to assist with common tasks like dat",2025-12-01T12:18:00.000Z,overview,,0.45,False,"Overview of Sentinel’s MCP support and tool collections; high-level description of capabilities, not detailed configuration or limits.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-responsible-ai-faq,Responsible AI FAQs,Responsible AI FAQs for Microsoft Sentinel MCP server - Microsoft Security,,Learn about how Microsoft Responsible AI Standard guides the development and use of Microsoft Sentinel's collection of Model Context Protocol (MCP) tools,"At Microsoft, we recognize the importance of regulatory compliance as a cornerstone of trust and reliability in AI technologies. We're committed to creating responsible AI by design. Our goal is to develop and deploy AI that has a beneficial impact on and earns trust from society. A core set of principles guides our work: fairness, reliability and safety, privacy and security, inclusiveness, transparency, and accountability. Microsoft Sentinel MCP server is being developed in accordance with our ",2025-12-01T12:18:00.000Z,concept-article,,0.3,False,"Responsible AI FAQ is conceptual/compliance-focused; unlikely to contain concrete RBAC roles, config parameters, or other product-specific security settings.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-tools-overview,Overview,What is Microsoft Sentinel MCP server's tool collection? - Microsoft Security,,Learn about the different MCP collection of tools in Microsoft Sentinel,"Microsoft Sentinel’s Model Context Protocol (MCP) Server collections are logical groupings of related security-focused MCP tools that you can use in any compatible client to: Our collections are scenario-focused and have security-optimized descriptions that help AI models pick the right tools and deliver those outcomes. 
For example, you can use the following sample prompts to get the appropriate tool:",2025-12-01T12:18:00.000Z,article,,0.3,False,"Overview of MCP tool collections and example prompts; conceptual and scenario-focused, not configuration, limits, or troubleshooting content.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-triage-tool,Triage,Triage tool collection in Microsoft Sentinel MCP server - Microsoft Security,Use Sentinel MCP triage tools for incident hunting,Learn about the different tools available in the triage collection,"Important This information relates to a prerelease product that may be substantially modified before it's released. Microsoft makes no warranties, expressed or implied, with respect to the information provided here. The triage collection in the Microsoft Sentinel Model Context Protocol (MCP) server integrates your AI models with APIs that support incident triage and hunting. This integration lets you prioritize incidents quickly and hunt over your own data easily, reducing mean time to resolutio",2025-12-01T12:18:00.000Z,how-to,integrations,0.65,True,"Describes MCP triage collection integrating AI models with Sentinel APIs; such content typically includes tool operations and parameters specific to Sentinel triage workflows, matching integrations.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-azure-ai-foundry,Use Sentinel MCP tools in Microsoft Foundry,Use a Microsoft Sentinel MCP tool in Microsoft Foundry - Microsoft Security,Use Sentinel MCP tools with Microsoft Foundry AI agents,Learn how to use Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools or your own custom tool in Microsoft Foundry,"Important This information relates to a prerelease product that may be substantially modified before it's released. Microsoft makes no warranties, expressed or implied, with respect to the information provided here. 
This article shows you how to add Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools or your own custom tools to your AI agents in Microsoft Foundry. For information about how to get started with MCP tools, see the following articles:",2026-01-30T08:00:00.000Z,how-to,configuration,0.65,True,"Explains how to add Sentinel MCP tools or custom tools into Microsoft Foundry; likely includes specific configuration fields and values for tool registration, which is configuration detail.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-copilot-studio,Use Sentinel MCP tools in Copilot Studio,Use a Microsoft Sentinel MCP tool in Microsoft Copilot Studio - Microsoft Security,Configure Sentinel MCP tools in Microsoft Copilot Studio,Learn how to add Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools or your own custom tool in Microsoft Copilot Studio,"Important This information relates to a prerelease product that may be substantially modified before it's released. Microsoft makes no warranties, expressed or implied, with respect to the information provided here. This article shows you how to add Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools or your own custom tools to your AI agents in Microsoft Copilot Studio. 
For information about how to get started with MCP tools, see the following articles: Tip For the best ",2025-11-18T18:43:00.000Z,how-to,configuration,0.65,True,Shows how to add Sentinel MCP tools or custom tools into Copilot Studio; prerelease but still focused on concrete configuration steps and parameters.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-security-copilot,Use Sentinel MCP tools in Security Copilot,Use a Microsoft Sentinel MCP tool in Microsoft Security Copilot - Microsoft Security,Add Sentinel MCP tools to Microsoft Security Copilot,Learn how to add Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools or your own custom tool in Microsoft Security Copilot,"This article shows you how to add Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools or your own custom tools to your AI agents in Microsoft Security Copilot. For information about how to get started with MCP tools, see the following articles:",2025-11-18T18:43:00.000Z,how-to,configuration,0.65,True,"Describes how to add Sentinel MCP tools or custom tools into Security Copilot; likely includes configuration fields and values for registering tools, fitting configuration/integration patterns.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-visual-studio-code,Use Sentinel MCP tools in Visual Studio Code,Use a Microsoft Sentinel MCP tool in Visual Studio Code - Microsoft Security,,Learn how to use Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools or your own custom tool in Visual Studio Code,"This article shows you how to add Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools or your own custom tools to your AI agents in Visual Studio Code. 
For information about how to get started with MCP tools, see the following articles:",2025-11-18T18:43:00.000Z,how-to,,0.4,False,"How-to/tutorial style for using MCP tools in VS Code; likely step-by-step enablement without detailed config tables, limits, or product-specific troubleshooting matrices.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-provider-class-reference,Microsoft Sentinel provider class reference,Microsoft Sentinel data lake Microsoft Sentinel Provider class reference,Use SentinelProvider class to access Sentinel data lake,"Reference documentation for the Microsoft Sentinel Provider class, which allows you to connect to the Microsoft Sentinel data lake and perform various operations.","The MicrosoftSentinelProvider class provides a way to interact with the Microsoft Sentinel data lake, allowing you to perform operations such as listing databases, reading tables, and saving data. This class is designed to work with the Spark sessions in Jupyter notebooks and provides methods to access and manipulate data stored in the Microsoft Sentinel data lake. This class is part of the sentinel.datalake module and provides methods to interact with the data lake. To use this class, import it and",2026-02-08T08:00:00.000Z,reference,integrations,0.9,True,Class reference for MicrosoftSentinelProvider with methods and parameters; clearly an SDK integration reference with product-specific API surface and constraints.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/datalake/troubleshoot-sentinel-mcp,Troubleshooting,Best practices and troubleshooting for Microsoft Sentinel MCP tool collection - Microsoft Security,Best practices and troubleshooting for Sentinel MCP tools,Learn about the best practices for using Microsoft Sentinel's collection of MCP tools and how to troubleshoot them,This article outlines best practices for using Microsoft Sentinel's collection of Model Context Protocol (MCP) tools. 
It also provides steps you can take to troubleshoot common issues you might experience while using them.,2025-12-01T12:18:00.000Z,how-to,troubleshooting,0.8,True,"Explicitly combines best practices and troubleshooting; likely includes concrete do/don’t guidance and symptom→cause→solution steps for MCP tools, including product-specific issues.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/using-data-federation,Using federated tables,Use federated data sources in Microsoft Sentinel - Microsoft Security,,"Learn how to view, query, and work with federated data sources in Microsoft Sentinel data lake using the portal, KQL queries, and Jupyter notebooks.","After setting up federated data connectors, you can access your federated tables through multiple interfaces in Microsoft Sentinel. Federated tables are used in the same way as other data lake tables. This article explains how to view federated tables, query them using KQL (Kusto Query Language), and work with them in Jupyter notebooks.",2026-03-30T20:20:00.000Z,how-to,,0.4,False,"Focuses on how to view and query federated tables using portal, KQL, and notebooks; sounds like usage/tutorial content rather than detailed config, limits, or troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/datalake/workbooks-for-data-lake,Workbooks for Microsoft Sentinel data lake,Workbooks for Microsoft Sentinel Data Lake - Microsoft Security,Build Sentinel workbooks using data lake as source,Learn how to create and use Microsoft Sentinel workbooks with data from the Microsoft Sentinel data lake to visualize and monitor security data.,"Running Microsoft Sentinel workbooks on top of Microsoft Sentinel data lake data allows SOC teams to visualize and monitor security data directly from the lake using KQL (Kusto Query Language), without duplicating or transforming data. 
By selecting Sentinel data lake as the data source in a workbook, analysts can run the same analytical queries used for investigations and hunting. They can render them as interactive charts and tables for operational monitoring and reporting. Using Sentinel data ",2026-03-30T20:20:00.000Z,how-to,configuration,0.6,True,Describes selecting Sentinel data lake as a workbook data source and running queries; likely includes workbook data source configuration and query binding details specific to Sentinel.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/delete-incident,Delete incidents,Delete incidents in Microsoft Sentinel in the Azure portal,,"Delete incidents in Microsoft Sentinel from the Azure portal, through the API, or using a Logic App.","Important Incident deletion using the portal is currently in PREVIEW. See the Supplemental Terms of Use for Microsoft Azure Previews for additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability. Incident deletion is generally available through the API. 
The ability to create incidents from scratch in Microsoft Sentinel in the Azure portal opens the possibility that you'll create an incident that you later decide you should",2025-03-04T18:02:00.000Z,how-to,,0.5,False,Describes incident deletion via portal/API/Logic Apps; appears procedural without detailed configuration matrices or limits.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/deploy-overview,Deployment planning guide,Deployment guide for Microsoft Sentinel,,"Learn about the steps to deploy Microsoft Sentinel including the phases to plan and prepare, deploy, and fine tune.","This article introduces the activities that help you plan, deploy, and fine tune your Microsoft Sentinel deployment.",2025-07-22T12:54:00.000Z,concept-article,,0.3,False,"Deployment guide overview describing phases; lacks specific deployment matrices, constraints, or config parameter tables.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/deploy-side-by-side,Deploy side-by-side,Deploying Microsoft Sentinel side-by-side to an existing SIEM.,Plan side-by-side deployment with existing SIEM,Learn how to deploy Microsoft Sentinel side-by-side to an existing SIEM.,"Your security operations center (SOC) team uses centralized security information and event management (SIEM) and security orchestration, automation, and response (SOAR) solutions to protect your increasingly decentralized digital estate. 
This article describes the approach and methods to consider when deploying Microsoft Sentinel in a side-by-side configuration together with your existing SIEM.",2024-07-25T05:34:00.000Z,concept-article,decision-making,0.65,True,"Side-by-side deployment guidance compares approaches and methods for coexisting with an existing SIEM, providing scenario-based recommendations and trade-offs.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/detection-tuning,Get fine-tuning recommendations,Get fine-tuning recommendations for your analytics rules in Microsoft Sentinel,Apply fine-tuning recommendations to Sentinel rules,"Learn how to fine-tune your threat detection rules in Microsoft Sentinel, using automatically generated recommendations, to reduce false positives while maintaining threat detection coverage.","Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM and Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. Important Detection tuning is currently in PREVIEW. 
See the Supplemental Terms of Use for Microsoft Azure Previews for addit",2025-10-29T11:15:00.000Z,how-to,best-practices,0.7,True,Focuses on tuning rules to reduce false positives while maintaining coverage; contains Sentinel-specific tuning recommendations and possibly quantified impacts.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/dns-ama-fields,DNS over AMA reference,Microsoft Sentinel DNS over AMA connector reference - available fields and normalization schema,Configure DNS over AMA connector fields and schema in Sentinel,"This article lists available fields for filtering DNS data using the Windows DNS Events via AMA connector, and the normalization schema for Windows DNS server fields.","Microsoft Sentinel allows you to stream and filter events from your Windows Domain Name System (DNS) server logs to the ASimDnsActivityLog normalized schema table. This article describes the fields used for filtering the data, and the normalization schema for the Windows DNS server fields. The Azure Monitor Agent (AMA) and its DNS extension are installed on your Windows Server to upload data from your DNS analytical logs to your Microsoft Sentinel workspace. You stream and filter the data using th",2022-12-27T18:04:00.000Z,reference,configuration,0.9,True,"Lists available fields for filtering DNS data and details the normalization schema for Windows DNS server logs; includes field names and usage, which are configuration reference details.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/domain-based-essential-solutions,ASIM-based domain solutions,ASIM-based domain solutions - Essentials for Microsoft Sentinel,Use ASIM-based essential domain solutions in Sentinel,"Learn about the Microsoft essential solutions for Microsoft Sentinel that span across different ASIM schemas like networks, DNS, and web sessions.",Microsoft essential solutions are domain solutions published by Microsoft for Microsoft Sentinel. 
These solutions have out-of-the-box content that can operate across multiple products for specific categories like networking. Some of these essential solutions use the normalization technique Advanced Security Information Model (ASIM) to normalize the data at query time or ingestion time. Important Microsoft essential solutions and the Network Session Essentials solution are currently in PREVIEW. T,2024-03-01T08:00:00.000Z,conceptual,best-practices,0.6,True,"Describes Microsoft essential solutions using ASIM schemas with product-specific normalization and content usage guidance, representing Sentinel-specific best practices.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/dynamics-365/deploy-dynamics-365-finance-operations-solution,Deploy for Dynamics 365 Finance and Operations,Connect Microsoft Dynamics 365 Finance and Operations to Microsoft Sentinel,Deploy Sentinel solution for Dynamics 365 Finance and Operations,Learn how to deploy the Microsoft Sentinel solution for Business Applications with Microsoft Dynamics 365 Finance and Operations.,"This article describes how to deploy the Dynamics 365 Finance and Operations content within the Microsoft Sentinel solution for Microsoft Business Applications. 
The solution monitors and protects your Dynamics 365 Finance and Operations system: It collects audits and activity logs from the Dynamics 365 Finance and Operations environment, and detects threats, suspicious activities, illegitimate activities, and more. Read more about the solution.",2025-04-22T11:11:00.000Z,how-to,deployment,0.66,True,Deployment article for Dynamics 365 F&O content within the Business Apps solution; includes product-specific deployment and connector configuration.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/dynamics-365/dynamics-365-finance-operations-security-content,Dynamics 365 Finance and Operations security content reference,Security content reference for Dynamics 365 Finance and Operations,Security content reference for Dynamics 365 F&O,Learn about the built-in security content provided by the Microsoft Sentinel solution for Dynamics 365 Finance and Operations.,This article details the security content available for the Microsoft Sentinel solution for Dynamics 365 Finance and Operations. Learn more about the solution.,2025-04-22T11:11:00.000Z,reference,configuration,0.62,True,Details built-in security content for Dynamics 365 F&O; these analytics rules and workbooks are specific to this solution.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/enable-entity-behavior-analytics,Enable User and Entity Behavior Analytics (UEBA),Enable entity behavior analytics to detect advanced threats,Enable and configure Sentinel UEBA data sources,"Enable User and Entity Behavior Analytics in Microsoft Sentinel, and configure data sources","User and Entity Behavior Analytics (UEBA) in Microsoft Sentinel analyzes logs and alerts from connected data sources to build baseline behavioral profiles of your organization's entities—such as users, hosts, IP addresses, and applications. Using machine learning, UEBA identifies anomalous activity that may indicate a compromised asset. 
You can enable User and Entity Behavior Analytics in two ways, both with the same result: This article explains how to enable UEBA and configure data sources fro",2026-02-26T23:12:00.000Z,how-to,configuration,0.6,True,"UEBA enablement article will include specific configuration options for data sources and feature toggles unique to Sentinel UEBA, representing product-specific configuration knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/enable-monitoring,Enable auditing and health monitoring,Turn on auditing and health monitoring in Microsoft Sentinel,Enable Sentinel auditing and health monitoring and query logs,Monitor supported data connectors by using the SentinelHealth data table.,"Monitor the health and audit the integrity of supported Microsoft Sentinel resources by turning on the auditing and health monitoring feature in Microsoft Sentinel's Settings page. Get insights on health drifts, such as the latest failure events or changes from success to failure states, and on unauthorized actions, and use this information to create notifications and other automated actions. To get health data from the SentinelHealth data table, or to get auditing information from the SentinelAuditd",2025-08-24T08:00:00.000Z,how-to,configuration,0.7,True,Covers turning on the feature and using SentinelHealth and SentinelAudit tables; includes concrete configuration steps and table names.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/enable-sentinel-features-content,Enable Microsoft Sentinel and initial features and content,Enable Microsoft Sentinel SIEM and initial features and content,,"As the first step of your deployment, you enable Microsoft Sentinel, and then enable the health and audit feature, solutions, and content.","To begin your deployment, you need to enable Microsoft Sentinel SIEM and set up key features and content. 
In this article, you learn how to enable Microsoft Sentinel, enable the health and audit feature, and enable the solutions and content you've identified according to your organization's needs. This article is part of the Deployment guide for Microsoft Sentinel.",2025-09-30T12:49:00.000Z,how-to,,0.35,False,"Enabling Sentinel and initial features is a step-by-step deployment action article; summary does not indicate matrices, limits, or detailed config tables beyond basic onboarding.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/enable-storage-network-security,Enable network security for Azure Storage Blob data connector,Enable Network Security for Azure Storage blob connectors,Enable network security for Sentinel Azure Storage connector,Learn how to enable network security for Azure Storage connector resources. Follow step-by-step instructions to secure your storage accounts with Network Security Perimeters.,"This article provides step-by-step instructions on how to enable network security on the storage resources integrated with your Azure Storage connector. Azure network security perimeter (NSP) is an Azure-native feature that creates a logical isolation boundary for your PaaS resources. By associating resources like storage accounts or databases with an NSP, you can centrally manage network access using a simplified rule set. For more information, see Network security perimeter concepts.",2026-03-11T11:05:00.000Z,how-to,security,0.7,True,"Step-by-step instructions to secure storage accounts with Network Security Perimeters for the Azure Storage connector. 
Likely includes specific Azure security configuration steps, resource associations, and rule settings that are product-specific security guidance.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/enroll-simplified-pricing-tier,Switch to simplified pricing tiers,Enroll in a simplified pricing tier for Microsoft Sentinel,Enroll Sentinel workspaces in simplified pricing tiers,"Learn how to enroll in simplified billing, the impact of the switch to commitment pricing tiers, and frequently asked questions about enrollment.","For many Microsoft Sentinel workspaces created before July 2023, there's a separate pricing tier for Azure Monitor Log Analytics in addition to the classic pricing tier for Microsoft Sentinel. To combine the data ingestion costs for Log Analytics and the data analysis costs of Microsoft Sentinel, enroll your workspace in a simplified pricing tier.",2025-07-24T11:08:00.000Z,how-to,decision-making,0.7,True,"Explains how to switch to simplified pricing tiers and the impact; such pages typically include tier names, eligibility rules, and billing behavior details that drive SKU/tier selection decisions.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/entities,Overview,Entities in Microsoft Sentinel,,"Entities are classifications or labels for data elements in your Microsoft Sentinel alerts. Microsoft Sentinel uses entities to recognize data elements as a particular entity type, correlate data acro","When alerts are sent to or generated by Microsoft Sentinel, they contain data elements that Sentinel can recognize and classify into categories as entities. 
When Microsoft Sentinel understands what kind of entity a particular data element represents, it knows the right questions to ask about it, and it can then compare insights about that item across the full range of data sources, and easily track it and refer to it throughout the entire Sentinel experience - analytics, investigation, remediatio",2025-09-16T17:23:00.000Z,conceptual,,0.5,False,Conceptual explanation of entities and how Sentinel uses them; summary doesn’t indicate detailed config tables or numeric constraints.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/entities-reference,Entities reference,Microsoft Sentinel entity types reference,Use Sentinel entity types and identifiers correctly,"This article displays the Microsoft Sentinel entity types and their identifiers, and lists strong and weak identifiers for each.","This document contains two sets of information regarding entities and entity types in Microsoft Sentinel in the Azure portal and Microsoft Sentinel in the Defender portal. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Microsoft Sentinel in the Azure portal will be redirected to the Defender portal and will use Microsoft Sentinel in the Defender portal only. If you're",2025-03-24T08:00:00.000Z,reference,configuration,0.7,True,Reference for Sentinel entity types with strong/weak identifiers is product-specific schema/configuration knowledge (field names and identifier semantics) that an LLM is unlikely to know in detail. 
Fits configuration because it defines how entities are structured and identified.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/entity-behaviors-layer,Aggregate behavioral insights from raw logs,Translate raw security logs to behavioral insights using UEBA behaviors in Microsoft Sentinel,,"The Microsoft Sentinel UEBA behaviors layer translates security telemetry into normalized behavioral patterns for investigation, hunting, and detection engineering.","The User and Entity Behavior Analytics (UEBA) behavior layer in Microsoft Sentinel aggregates and summarizes high-volume raw logs into clear, plain-language patterns of security actions, explaining “who did what to whom” in a structured way. Unlike alerts or anomalies, behaviors don’t necessarily indicate risk - they create an abstraction layer that optimizes your data for investigations, hunting, and detection by enhancing: This abstraction layer enables faster threat detection, investigation, ",2026-01-21T12:16:00.000Z,how-to,,0.5,False,Explains the UEBA behaviors abstraction layer conceptually; likely describes behavior types but not as configuration parameters or limits tables.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/entity-behaviors-layer-rai-faqs,Responsible AI FAQ for UEBA behaviors layer,Responsible AI FAQ for the Microsoft Sentinel UEBA behaviors layer,,"This FAQ provides information about the AI technology used in the Microsoft Sentinel UEBA behaviors layer, along with key considerations and details about how AI is used, how it was tested and evaluat",These frequently asked questions (FAQ) describe the AI impact of the UEBA behaviors layer feature in Microsoft Sentinel.,2026-01-12T12:11:00.000Z,contributor-guide,,0.3,False,"Responsible AI FAQ about UEBA behaviors layer focuses on AI usage, evaluation, and limitations conceptually. 
It does not emphasize concrete configuration parameters, limits, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/entity-pages,Entity pages,Entity pages in Microsoft Sentinel,,"Entity pages display information about entities surfaced in your alerts, or that you otherwise come across in your incident investigations. Among this information is the timeline of alerts involving t","When you come across a user account, a hostname, an IP address, or an Azure resource in an incident investigation, you may decide you want to know more about it. For example, you might want to know its activity history, whether it's appeared in other alerts or incidents, whether it's done anything unexpected or out of character, and so on. In short, you want information that can help you determine what sort of threat these entities represent and guide your investigation accordingly. This article",2025-11-18T18:43:00.000Z,conceptual,,0.45,False,"Describes entity pages and what they show; primarily UI and investigation experience, not configuration or limits.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/extend-sentinel-across-workspaces-tenants,Extend across multiple workspaces,Extend Microsoft Sentinel across workspaces and tenants,Query and manage Sentinel data across workspaces and tenants,How to use Microsoft Sentinel to query and analyze data across workspaces and tenants.,"When you onboard Microsoft Sentinel, your first step is to select your Log Analytics workspace. While you can get the full benefit of the Microsoft Sentinel experience with a single workspace, in some cases, you might want to extend your workspace to query and analyze your data across workspaces and tenants. For more information, see Design a Log Analytics workspace architecture and Prepare for multiple workspaces and tenants in Microsoft Sentinel. 
If you onboard Microsoft Sentinel to the Microsoft",2025-06-10T08:00:00.000Z,concept-article,architecture-patterns,0.7,True,"Explains how to extend Sentinel across workspaces/tenants; this is a product-specific cross-workspace/tenant architecture pattern, not generic theory.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/false-positives,Handle false positives,Handle false positives in Microsoft Sentinel,Reduce false positives in Microsoft Sentinel analytics,Learn how to resolve false positives in Microsoft Sentinel by creating automation rules or modifying analytics rules to specify exceptions.,"Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM and Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. Microsoft Sentinel analytics rules notify you when something suspicious occurs in your network. No analytics rule is perfe",2025-10-29T11:15:00.000Z,how-to,best-practices,0.6,True,"Focuses on concrete Sentinel-specific ways to handle false positives (automation rules, modifying analytics rules, specifying exceptions). These are product-specific operational patterns rather than generic theory.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/feature-availability,Feature support in different clouds,Microsoft Sentinel feature support for Azure commercial/other clouds,Check Microsoft Sentinel feature availability by Azure cloud,This article describes feature availability in Microsoft Sentinel across different Azure environments.,"This article describes the features available in Microsoft Sentinel across different Azure environments. Features are listed as GA (generally available), public preview, or shown as not available. 
Note These lists and tables do not include feature or bundle availability in the Azure Government Secret or Azure Government Top Secret clouds. -For more information about specific availability for air-gapped clouds, please contact your account team. Important All Microsoft Sentinel features will be off",2025-08-21T17:11:00.000Z,feature-availability,decision-making,0.7,True,"Feature availability across Azure environments is presented as GA/preview/not-available tables, guiding which features can be used where and influencing environment selection and rollout decisions.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/forward-syslog-monitor-agent,Tutorial - Forward syslog data to workspace,Tutorial: Forward Syslog data to Microsoft Sentinel and Azure Monitor by using Azure Monitor Agent,,"In this tutorial, you learn how to monitor Linux-based devices by forwarding Syslog data to a Log Analytics workspace.","In this tutorial, you configure a Linux virtual machine (VM) to forward Syslog data to your workspace by using Azure Monitor Agent. These steps allow you to collect and monitor data from Linux-based devices where you can't install an agent like a firewall network device. Note Container Insights now supports the automatic collection of Syslog events from Linux nodes in your AKS clusters. To learn more, see Syslog collection with Container Insights. 
Configure your Linux-based device to send data to",2025-05-22T11:12:00.000Z,tutorial,,0.45,False,"Tutorial for forwarding Syslog via AMA; mostly procedural steps, not focused on configuration parameter matrices, limits, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/fusion,Overview,Advanced multistage attack detection in Microsoft Sentinel,,Use Fusion technology in Microsoft Sentinel to reduce alert fatigue and create actionable incidents that are based on advanced multistage attack detection.,"Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM and Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. Microsoft Sentinel uses Fusion, a correlation engine based on scalable machine learning algorithms, to automatically dete",2025-10-29T11:15:00.000Z,concept-article,,0.45,False,Explains Fusion-based multistage attack detection conceptually; summary doesn’t indicate detailed config tables or numeric thresholds.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/fusion-scenario-reference,Multistage attack detection scenarios,Scenarios detected by the Microsoft Sentinel Fusion engine,,"Learn about the scenarios detected by Fusion, listed here grouped by threat classification.","This document lists the types of scenario-based multistage attacks, grouped by threat classification, that Microsoft Sentinel detects using the Fusion correlation engine. Since Fusion correlates multiple signals from various products to detect advanced multistage attacks, successful Fusion detections are presented as Fusion incidents on the Microsoft Sentinel Incidents page and not as alerts, and are stored in the Incidents table in Logs and not in the SecurityAlerts table. 
In order to enable these Fusion-po",2023-10-12T11:16:00.000Z,reference,,0.4,False,"Catalog of Fusion-detected scenarios grouped by threat classification. Useful, but not focused on configuration options, limits, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/geographical-availability-data-residency,Geographical availability and data residency,Geographical availability and data residency in Microsoft Sentinel,Plan Sentinel deployment for geography and data residency,"In this article, you learn about geographical availability and data residency in Microsoft Sentinel.","After your data is collected, stored, and processed, compliance can become an important design requirement, with a significant impact on your Microsoft Sentinel architecture. Having the ability to validate and prove who has access to what data under all conditions is a critical data sovereignty requirement in many countries and regions, and assessing risks and getting insights in Microsoft Sentinel workflows is a priority for many customers. This article can help you meet compliance requirements",2026-01-14T12:11:00.000Z,concept-article,decision-making,0.6,True,"Geographical availability and data residency article provides environment-specific availability and compliance-driven deployment choices, supporting architecture and region selection decisions.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/geolocation-data-api,Enrich entities with geolocation data with REST-API,Enrich entities with geolocation data in Microsoft Sentinel using REST API,Enrich Sentinel entities with geolocation REST API,This article describes how you can enrich entities in Microsoft Sentinel with geolocation data via REST API.,"This article shows you how to enrich entities in Microsoft Sentinel with geolocation data using the REST API. Important This feature is currently in PREVIEW. 
The Azure Preview Supplemental Terms include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2023-01-09T23:08:00.000Z,reference,integrations,0.75,True,"Explains enriching entities with geolocation data via REST API, which implies specific API paths, parameters, and payload formats. This is product-specific integration knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/get-visibility,View collected data on the Overview dashboard,View aggregated data from the Overview,,Learn how to quickly view and monitor what's happening across your environment by using Microsoft Sentinel.,"After connecting your data sources to Microsoft Sentinel, use the Overview page to view, monitor, and analyze activities across your environment. This article describes the widgets and graphs available on Microsoft Sentinel's Overview dashboard. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Microsoft Sentinel in the Azure portal will be redirected to the Defender portal",2025-04-24T22:03:00.000Z,how-to,,0.2,False,"Overview dashboard description; mostly UI and conceptual monitoring, no indication of numeric limits, config tables, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/health-audit,Overview,Auditing and health monitoring in Microsoft Sentinel,Configure auditing and health monitoring in Microsoft Sentinel,"Learn about the Microsoft Sentinel health and audit feature, which monitors service health drifts and user actions.","Microsoft Sentinel is a critical service for advancing and protecting the security of your organization’s technological and information assets, so you want to be sure that it's always running smoothly and free of interference. 
You want to verify that the service's many moving parts are always functioning as intended, and it isn't being manipulated by unauthorized actions, whether by internal users or otherwise. You might also like to configure notifications of health drifts or unauthorized actio",2025-08-24T17:09:00.000Z,concept-article,configuration,0.65,True,"Describes Sentinel health/audit feature, likely including specific tables, settings, and configuration options for monitoring service health and user actions.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/health-table-reference,SentinelHealth table reference,Microsoft Sentinel health tables reference,Query and interpret Microsoft Sentinel health tables,"Learn about the fields in the SentinelHealth tables, used for health monitoring and analysis.","This article describes the fields in the SentinelHealth table used for monitoring the health of Microsoft Sentinel resources. With the Microsoft Sentinel health monitoring feature, you can keep tabs on the proper functioning of your SIEM and get information on any health drifts in your environment. 
Learn how to query and use the health table for deeper monitoring and visibility of actions in your environment: Microsoft Sentinel's health monitoring feature covers different kinds of resources (see th",2025-08-20T08:00:00.000Z,reference,configuration,0.9,True,Reference for SentinelHealth tables with field-level details used for health monitoring; schema/field references are product-specific configuration data not derivable from general training.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/hunting,Overview,Hunting capabilities in Microsoft Sentinel,,Use Microsoft Sentinel's built-in hunting queries to guide you into asking the right questions to find issues in your data.,"As security analysts and investigators, you want to be proactive about looking for security threats, but your various systems and security appliances generate mountains of data that can be difficult to parse and filter into meaningful events. Microsoft Sentinel has powerful hunting search and query tools to hunt for security threats across your organization's data sources. To help security analysts look proactively for new anomalies that aren't detected by your security apps or even by your sche",2026-03-17T22:33:00.000Z,conceptual,,0.2,False,"Page describes Microsoft Sentinel hunting capabilities and built-in hunting queries at a conceptual/feature level. 
From the summary, it does not indicate specific error codes, configuration parameter tables, limits, or detailed product-specific settings; it focuses on how hunting helps analysts proactively find threats, which is general feature/usage guidance rather than expert-knowledge reference content.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/hunting-with-rest-api,Manage hunting queries with REST-API,Manage hunting queries in Microsoft Sentinel using REST API,Manage Microsoft Sentinel hunting queries via REST API,This article describes how Microsoft Sentinel hunting features enable you to take advantage of Log Analytics’ REST API to manage hunting queries.,"Microsoft Sentinel, being built in part on Azure Monitor Log Analytics, lets you use Log Analytics’ REST API to manage hunting queries. This document shows you how to create and manage hunting queries using the REST API. Queries created in this way are displayed in the Microsoft Sentinel UI. -For more information on the saved searches API, see the definitive REST API reference.",2026-03-17T22:33:00.000Z,reference,integrations,0.7,True,"The page describes using the Log Analytics saved searches REST API specifically for Microsoft Sentinel hunting queries. This implies product-specific API endpoints, request/response schemas, and parameter usage that go beyond generic REST knowledge, fitting the integrations & coding patterns category best.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/hunts,Conduct end-to-end hunts,Conduct end-to-end threat hunting with Hunts - Microsoft Sentinel,,Learn how to use hunts for conducting end-to-end proactive threat hunting. Seek out undetected threats based on hypothesis or start broadly and refine your searches with this hunting experience.,"Proactive threat hunting is a process where security analysts seek out undetected threats and malicious behaviors. 
By creating a hypothesis, searching through data, and validating that hypothesis, they determine what to act on. Actions can include creating new detections, new threat intelligence, or spinning up a new incident. Use the end-to-end hunting experience within Microsoft Sentinel to: Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and w",2025-07-01T11:22:00.000Z,how-to,,0.4,False,"End-to-end hunting experience description is largely workflow/UX oriented; no specific configs, limits, or error mappings are evident from the summary.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/hunts-custom-queries,Create custom query,Create custom hunting queries in Microsoft Sentinel - Microsoft Sentinel,Author custom hunting KQL queries in Sentinel,Learn how to create a custom query to hunt for threats.,"Hunt for security threats across your organization's data sources with custom hunting queries. Microsoft Sentinel provides built-in hunting queries to help you find issues in the data you have on your network. But you can create your own custom queries. For more information about hunting queries, see Threat hunting in Microsoft Sentinel.",2025-05-05T22:16:00.000Z,how-to,integrations,0.6,True,"Covers creating custom hunting queries against Sentinel data sources. While KQL is generic, the patterns and usage in Sentinel hunting are product-specific integration/query patterns.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/identify-threats-with-entity-behavior-analytics,UEBA overview,Advanced threat detection with User and Entity Behavior Analytics (UEBA) in Microsoft Sentinel,,"Create behavioral baselines for entities (users, hostnames, IP addresses) and use them to detect anomalous behavior and identify zero-day advanced persistent threats (APT).","Detecting anomalous behavior within an organization is often complex and time-consuming. 
Microsoft Sentinel's User and Entity Behavior Analytics (UEBA) simplifies this challenge by continuously learning from your data to surface meaningful anomalies that help analysts detect and investigate potential threats more effectively. This article explains what Microsoft Sentinel User and Entity Behavior Analytics (UEBA) is, how it works, how to onboard to it, and how to use UEBA to detect and investigat",2026-02-26T23:12:00.000Z,article,,0.45,False,"UEBA overview and onboarding; mostly conceptual explanation of how UEBA works and how to enable it, without strong indication of detailed config tables or limits.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/import-export-analytics-rules,Export and import analytics rules,Import and export Microsoft Sentinel analytics rules,Import and export Sentinel analytics rules via ARM,Export and import analytics rules to and from ARM templates to aid deployment,"Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM and Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. Important Exporting and importing rules is in PREVIEW. 
See the Supplemental Terms of Use for Microsoft Azure Previews for ad",2025-10-29T11:15:00.000Z,how-to,deployment,0.7,True,Covers exporting/importing rules as ARM templates; this is deployment-focused with product-specific constraints and template schema details.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/import-export-automation-rules,Export and import automation rules,Import and export Microsoft Sentinel automation rules,Manage Sentinel automation rules as code with ARM templates,Export and import automation rules to and from ARM templates to aid deployment,"Manage your Microsoft Sentinel automation rules as code! You can now export your automation rules to Azure Resource Manager (ARM) template files, and import rules from these files, as part of your program to manage and control your Microsoft Sentinel deployments as code. The export action creates a JSON file in your browser's downloads location, that you can then rename, move, and otherwise handle like any other file. 
The exported JSON file is workspace-independent, so it can be imported to othe",2025-09-16T17:23:00.000Z,how-to,deployment,0.75,True,"Describes exporting/importing automation rules as ARM templates, including workspace-independent JSON files - this is a deployment-as-code pattern specific to Sentinel automation rules.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/incident-investigation,Overview,Microsoft Sentinel incident investigation in the Azure portal,,This article describes Microsoft Sentinel's incident investigation and case management capabilities and features in the Azure portal.,"Microsoft Sentinel gives you a complete, full-featured case management platform for investigating and managing security incidents. Incidents are Microsoft Sentinel’s name for files that contain a complete and constantly updated chronology of a security threat, whether it’s individual pieces of evidence (alerts), suspects and parties of interest (entities), insights collected and curated by security experts and AI/machine learning models, or comments and logs of all the actions taken in the course ",2025-01-15T12:12:00.000Z,concept-article,,0.45,False,Incident investigation and case management overview; likely UX and workflow focused without detailed configs or numeric thresholds in the summary.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/incident-navigate-triage,Triage and manage your incidents,Basic incident tasks for Microsoft Sentinel incidents in the Azure portal,,This article describes how to navigate and triage incidents in Microsoft Sentinel in the Azure portal.,This article describes how to navigate and run basic triage on your incidents in the Azure portal.,2025-01-15T12:12:00.000Z,how-to,,0.45,False,Describes basic navigation and triage steps; appears procedural without deep product-specific configuration or numeric guidance.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/incident-tasks,Overview,Use tasks to manage incidents in 
Microsoft Sentinel in the Azure portal,Standardize Sentinel incident handling with tasks,"This article describes incident tasks and how to work with them to ensure all required steps are taken in triaging, investigating, and responding to incidents in Microsoft Sentinel.","One of the most important factors in running your security operations (SecOps) effectively and efficiently is the standardization of processes. SecOps analysts are expected to perform a list of steps, or tasks, in the process of triaging, investigating, or remediating an incident. Standardizing and formalizing the list of tasks can help keep your SOC running smoothly, ensuring the same requirements apply to all analysts. This way, regardless of who is on-shift, an incident will always get the sam",2024-02-06T12:14:00.000Z,conceptual,best-practices,0.6,True,"Focuses on using incident tasks to standardize triage/investigation workflows, providing concrete product-specific guidance for SOC process implementation.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/indicators-bulk-file-import,Add threat intelligence in bulk by file,Add threat intelligence in bulk by file - Microsoft Sentinel,Bulk import threat indicators from files into Sentinel,Learn how to add threat intelligence in bulk from flat files like .csv or .json into Microsoft Sentinel.,"This article demonstrates how to add indicators from a CSV or STIX objects from a JSON file into Microsoft Sentinel threat intelligence. Because threat intelligence sharing still happens across emails and other informal channels during an ongoing investigation, the ability to import that information quickly into Microsoft Sentinel is important to relay emerging threats to your team. 
These identified threats are then available to power other analytics, such as producing security alerts, incidents",2025-10-23T08:00:00.000Z,how-to,configuration,0.7,True,"How-to for importing indicators from CSV/JSON; likely includes file schema, required columns/fields, and Sentinel-specific parameters for bulk import, which are product-specific configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/ingest-defender-for-cloud-incidents,Integrate Microsoft Defender for Cloud,Ingest Microsoft Defender for Cloud incidents with Microsoft Defender XDR integration,Ingest Defender for Cloud incidents via Defender XDR,Learn how using Microsoft Defender for Cloud's integration with Microsoft Defender XDR lets you ingest Microsoft Defender for Cloud incidents through Microsoft Defender XDR. This lets you add Defender,"Microsoft Defender for Cloud is now integrated with Microsoft Defender XDR, formerly known as Microsoft 365 Defender. This integration allows Defender XDR to collect alerts from Defender for Cloud and create Defender XDR incidents from them. Thanks to this integration, Microsoft Sentinel customers who enable Defender XDR incident integration can now ingest and synchronize Defender for Cloud incidents through Microsoft Defender XDR. To support this integration, you must set up one of the following M",2024-04-16T08:00:00.000Z,conceptual,integrations,0.7,True,Describes how Defender for Cloud incidents flow through Defender XDR into Sentinel; includes specific integration requirements and configuration options.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/ingestion-delay,Handle ingestion delay in analytics rules,Handle ingestion delay in Microsoft Sentinel,Handle data ingestion delay in Sentinel rules,Handle ingestion delay in Microsoft Sentinel scheduled analytics rules.,"Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM and Microsoft Defender XDR. 
With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. While Microsoft Sentinel can ingest data from various sources, ingestion time for each data source may differ in different",2025-10-29T11:15:00.000Z,how-to,best-practices,0.7,True,"Discusses handling ingestion delay for scheduled rules; likely includes concrete recommendations (for lookback windows, scheduling) and product-specific gotchas.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/investigate-incidents,Investigate incidents in depth,Investigate Microsoft Sentinel incidents in depth in the Azure portal,,"This article takes you through all the panels and options available on the incident details page in the Azure portal, helping you navigate and investigate your incidents more quickly, effectively, and","Microsoft Sentinel incidents are files that contain an aggregation of all the relevant evidence for specific investigations. Each incident is created (or added to) based on pieces of evidence (alerts) that were either generated by analytics rules or imported from third-party security products that produce their own alerts. Incidents inherit the entities contained in the alerts, as well as the alerts' properties, such as severity, status, and MITRE ATT&CK tactics and techniques. 
Microsoft Sentinel ",2025-01-15T12:12:00.000Z,how-to,,0.45,False,"Explains incident details page and investigation panels; mostly UI/flow description, not configuration tables or limits.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/investigate-large-datasets,Overview,Start an investigation by searching large datasets - Microsoft Sentinel,Investigate Sentinel incidents using large dataset search,Learn about search jobs and restoring data from long-term retention in Microsoft Sentinel.,"One of the primary activities of a security team is to search logs for specific events. For example, you might search logs for the activities of a specific user within a given time-frame. In Microsoft Sentinel, you can search across long time periods in extremely large datasets by using a search job. While you can run a search job on any type of log, search jobs are ideally suited to search logs in a long-term retention (formerly known as archive) state. If you need to do a full investigation o",2025-09-21T22:10:00.000Z,conceptual,architecture-patterns,0.6,True,"Explains when and how to use search jobs vs other methods for long-term retention data, a Sentinel-specific investigation pattern tied to data states.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/investigate-with-ueba,UEBA investigation examples,Investigate incidents with UEBA data,,Learn how to use UEBA data while investigating to gain greater context to potentially malicious activity occurring in your organization.,"This article describes common methods and sample procedures for using user entity behavior analytics (UEBA) in your regular investigation workflows. Important Noted features in this article are currently in PREVIEW. 
See the Supplemental Terms of Use for Microsoft Azure Previews for additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2025-11-18T18:43:00.000Z,how-to,,0.45,False,"Describes investigation workflows using UEBA data; scenario-focused guidance rather than configuration, limits, or troubleshooting reference.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/log-plans,Plan data retention,Log retention tiers in Microsoft Sentinel,Select Microsoft Sentinel log retention tiers and limits,Learn about the different log retention plans that are available in Microsoft Sentinel and how they're meant to be used to ensure maximum coverage at minimum expenditure.,"For Microsoft Sentinel workspaces connected to Defender, tiering and retention management must be done from the new table management experience in the Defender portal. For unattached Microsoft Sentinel workspaces, continue to use the experiences described below to manage data in your workspaces. There are two competing aspects of log collection and retention that are critical to a successful threat detection program. On the one hand, you want to maximize the number of log sources that you collec",2025-09-30T12:49:00.000Z,conceptual,limits-quotas,0.7,True,"Log retention tiers article typically includes concrete retention durations, tier behaviors, and possibly GB/day or cost-related thresholds, which are numeric limits and tier-specific constraints.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/manage-analytics-rule-templates,Manage template versions for analytics rules,Manage template versions for your scheduled analytics rules in Microsoft Sentinel,Manage Sentinel analytics rule template versions,"Learn how to manage the relationship between your scheduled analytics rule templates and the rules created from those templates. 
Merge updates to the templates into your rules, and revert changes in y","Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM and Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. Important This feature is in PREVIEW. See the Supplemental Terms of Use for Microsoft Azure Previews for additional legal te",2024-07-15T03:11:00.000Z,how-to,configuration,0.7,True,"Explains managing relationships between templates and rules, merging updates, and reverting; involves Sentinel-specific configuration flows and options.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/manage-data-overview,Data management overview,Manage data tiers and retention in Microsoft Sentinel,Choose data tiers and retention for Microsoft Sentinel,Manage log data in Microsoft Sentinel and with Microsoft Defender XDR services in the Microsoft Defender portal to optimize security operations and cost efficiency.,Data you collect into Microsoft Sentinel (SIEM) and Microsoft Defender XDR is stored in tables. The Microsoft Defender portal lets you manage the retention period and the store costs associated with your data. You can manage retention and costs when you: This article explains how to manage table retention and tier options in the Microsoft Defender portal to optimize security operations and reduce costs in Microsoft Sentinel and Microsoft Defender XDR.,2026-03-12T08:00:00.000Z,conceptual,decision-making,0.7,True,"The article focuses on managing table retention and tier options to optimize cost and security operations. 
This implies guidance on selecting retention periods and data tiers, likely including comparison of options and trade-offs between cost and capability, which fits decision-making around tier/retention selection.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/manage-platform-solutions,Managing Platform Solutions,Manage Microsoft Sentinel platform solutions,Configure and manage installed Microsoft Sentinel platform solutions,"Learn how to configure, update, and uninstall components installed by a Microsoft Sentinel platform solution.","After you install a Microsoft Sentinel platform solution, you manage its components in different places. This article explains how to configure, update, and uninstall the main types of components.",2025-09-30T12:49:00.000Z,how-to,configuration,0.65,True,"Explains how to configure, update, and uninstall components installed by a platform solution; includes product-specific management locations and settings.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/manage-soc-with-incident-metrics,Manage your SOC with incident metrics,Manage your SOC better with incident metrics in Microsoft Sentinel,Use Sentinel incident metrics to manage SOC performance,Use information from the Microsoft Sentinel incident metrics screen and workbook to help you manage your Security Operations Center (SOC).,"Note For information about feature availability in US Government clouds, see the Microsoft Sentinel tables inCloud feature availability for US Government customers. As a Security Operations Center (SOC) manager, you need to have overall efficiency metrics and measures at your fingertips to gauge the performance of your team. You'll want to see incident operations over time by many different criteria, like severity, MITRE tactics, mean time to triage, mean time to resolve, and more. 
Microsoft Sen",2023-05-10T22:07:00.000Z,how-to,best-practices,0.6,True,"Uses Sentinel’s incident metrics and workbook with specific measures (MTTR, severity, MITRE tactics) to manage SOC; this is product-specific operational guidance.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/manage-table-tiers-retention,"Manage tables, tiers, and retention",Configure table settings in Microsoft Sentinel,Configure table retention and tier settings for Sentinel,Configure Microsoft Sentinel and Defender XDR table settings in Microsoft Defender Portal to optimize security operations and cost efficiency.,"The Microsoft Defender portal provides a centralized experience for configuring table-level data retention and tier settings across Microsoft Sentinel and Microsoft Defender XDR. You can view and manage retention settings, switch between analytics and data lake tiers, and optimize data storage based on operational and cost requirements. This article explains how to configure retention and tier settings for tables in Microsoft Sentinel and Defender XDR in the Microsoft Defender portal. For more i",2025-11-19T14:04:00.000Z,conceptual,configuration,0.75,True,"Step-by-step configuration of table-level retention and tier (analytics vs data lake) in Defender portal, including Sentinel-specific options and behaviors.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/map-data-fields-to-entities,Map data fields to entities,Map data fields to Microsoft Sentinel entities,Map analytics rule fields to Sentinel entities,"Map data fields in tables to Microsoft Sentinel entities in analytics rules, for better incident information",Entity mapping is an integral part of the configuration ofscheduled analytics rules. It enriches the rules' output (alerts and incidents) with essential information that serves as the building blocks of any investigative processes and remedial actions that follow. The procedure detailed below is part of the analytics rule creation wizard. 
It's treated here independently to address the scenario of adding or changing entity mappings in an existing analytics rule. Important,2025-09-16T17:23:00.000Z,how-to,configuration,0.85,True,"Entity mapping is a Sentinel-specific configuration; page will list entity types, field mapping rules, and constraints, which are detailed config options.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/microsoft-365-defender-cloud-support,Microsoft Defender XDR connector data type support,Support for Microsoft Defender XDR connector data types in Microsoft Sentinel for different clouds (GCC environments),Check Sentinel Defender XDR data support by cloud,"This article describes support for different Microsoft Defender XDR connector data types in Microsoft Sentinel across different clouds, including Commercial, GCC, GCC-High, and DoD.","The type of cloud your environment uses affects Microsoft Sentinel's ability to ingest and display data from these connectors, like logs, alerts, device events, and more. This article describes support for different Microsoft Defender XDR connector data types in Microsoft Sentinel across different clouds, including Commercial, GCC, GCC-High, and DoD. Read more aboutdata type support for different clouds in Microsoft Sentinel.",2023-12-07T23:03:00.000Z,reference,deployment,0.65,True,"Describes which Microsoft Defender XDR connector data types are supported in Microsoft Sentinel across Commercial, GCC, GCC-High, and DoD clouds. 
This is environment- and SKU-specific capability/support matrix information that an LLM is unlikely to know from training and is used for planning deployments across sovereign clouds.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/microsoft-365-defender-sentinel-integration,Integrate Microsoft Defender XDR,Microsoft Defender XDR integration with Microsoft Sentinel,Integrate Microsoft Defender XDR with Microsoft Sentinel,Learn how using Microsoft Defender XDR together with Microsoft Sentinel lets you use Microsoft Sentinel as your universal incidents queue.,"This article describes how Microsoft Defender XDR services integrate with Microsoft Sentinel, whether in the Microsoft Defender portal or in the Azure portal. If you first onboarded to Microsoft Sentinel after July 1, 2025 with permissions of a subscriptionOwneror aUser access administrator, your workspace isautomatically onboarded to the Defender portal. In such cases, youuse Microsoft Sentinel in the Defender portal only, where your data can integrate directly with Defender XDR service data fo",2025-10-27T11:18:00.000Z,conceptual,integrations,0.7,True,"Explains integration behavior between Defender XDR and Sentinel, including onboarding conditions and data flow; includes product-specific integration patterns and constraints.",unchanged +https://learn.microsoft.com/en-us/azure/sentinel/add-advanced-conditions-to-automation-rules,Add advanced conditions to automation rules,Add advanced conditions to Microsoft Sentinel automation rules,Configure advanced OR conditions in Sentinel automation rules,"This article explains how to add complex, advanced ""Or"" conditions to automation rules in Microsoft Sentinel, for more effective triage of incidents.","This article explains how to add advanced ""Or"" conditions to automation rules in Microsoft Sentinel, for more effective triage of incidents. Add ""Or"" conditions in the form ofcondition groupsin the Conditions section of your automation rule. 
Condition groups can contain two levels of conditions: Simple: At least two conditions, each separated by an OR operator: Compound: More than two conditions, with at least two conditions on at least one side of an OR operator: You can see that this capability af
For example, you might discover an IP address that performs port scans across your network or functions as a command and control node by sending and/or receiving tra",2026-04-22T17:56:00.000Z,how-to,,0.3,False,Describes adding entities discovered during investigations as indicators; appears to be workflow/tutorial style without product-specific configuration tables or limits.,updated +https://learn.microsoft.com/en-us/azure/sentinel/ama-migrate,AMA migration for Microsoft Sentinel,Migrate to the Azure Monitor Agent (AMA) from the Log Analytics agent (MMA/OMS) for Microsoft Sentinel,Plan and execute Sentinel migration from MMA to AMA,"Learn about migrating from the Log Analytics agent (MMA/OMS) to the Azure Monitor Agent (AMA), when working with Microsoft Sentinel.","This article describes the migration process to the Azure Monitor Agent (AMA) when you have an existing, legacyLog Analytics Agent (MMA/OMS), and are working with Microsoft Sentinel. The Log Analytics agent isretired as of 31 August, 2024. If you are using the Log Analytics agent in your Microsoft Sentinel deployment, we recommend that you migrate to the AMA.",2026-04-22T17:56:00.000Z,reference,decision-making,0.65,True,"Migration article between agents; such content typically includes decision guidance on when/how to migrate, mapping of old to new capabilities, and deployment considerations—expert decision-making guidance.",updated +https://learn.microsoft.com/en-us/azure/sentinel/anomalies-reference,Anomalies reference,Anomalies detected by the Microsoft Sentinel machine learning engine,Reference anomalies detected by Sentinel ML engine,Learn about the anomalies detected by Microsoft Sentinel's machine learning engines.,"Microsoft Sentinel detects anomalies by analyzing the behavior of users in an environment over a period of time and constructing a baseline of legitimate activity. 
Once the baseline is established, any activity outside the normal parameters is considered anomalous and therefore suspicious. Microsoft Sentinel uses two models to create baselines and detect anomalies. This article lists the anomalies that Microsoft Sentinel detects using various machine learning models. In the Anomalies table: Note T
Microsoft Sentinel provides access to: The AzureActivity table, which provides details about all actions taken in Microsoft Sentinel, such as editing alert rules. The AzureActivity table doesn't log specific query data. For more information, see Auditing with Azure Activi
Task lists are typically defined according to determinations made by senior analysts or SOC managers, and put into practice using automation rules or playbooks. Your analysts can see the list of tasks they need to perform for a particular incident on the incident details page, and mark them complete as they go. Analysts can also create their own tasks on the spot, manually, right from within the incid",2026-04-22T17:56:00.000Z,how-to,,0.35,False,Describes auditing and tracking task changes conceptually; summary does not show specific configuration parameters or numeric thresholds.,updated +https://learn.microsoft.com/en-us/azure/sentinel/automate-incident-handling-with-automation-rules,Automation rules,Automate threat response in Microsoft Sentinel with automation rules,,"This article explains what Microsoft Sentinel automation rules are, and how to use them to implement your Security Orchestration, Automation and Response (SOAR) operations. Automation rules increase y","This article explains what Microsoft Sentinel automation rules are, and how to use them to implement your Security Orchestration, Automation and Response (SOAR) operations. Automation rules increase your SOC's effectiveness and save you time and resources. Important AfterMarch 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. 
All customers using Microsoft Sentinel in the Azure portal will be redirected to the 
Logic Apps uses specialized connectors for this purpose, with each resource type having its own connector. This article describ
This results all too often in situations where many alerts are ignored and many incidents aren't investigated, leaving the organization vulnerable to attacks that go unnoticed. Microsoft Sentinel, in addition to being a SIEM system, is also a platform for security or",2026-04-22T17:56:00.000Z,concept-article,,0.3,False,High-level overview of automation/SOAR capabilities; summary does not show specific configuration values or limits.,updated +https://learn.microsoft.com/en-us/azure/sentinel/automation/create-playbooks,Create and manage playbooks,Create and manage Microsoft Sentinel playbooks,,Learn how to create and manage Microsoft Sentinel playbooks to automate your incident response and remediate security threats.,"Playbooks are collections of procedures that can be run from Microsoft Sentinel in response to an entire incident, to an individual alert, or to a specific entity. A playbook can help automate and orchestrate your response and can be attached to an automation rule to run automatically when specific alerts are generated or when incidents are created or updated. Playbooks can also be run manually on-demand on specific incidents, alerts, or entities. This article describes how to create and manage ",2026-04-22T17:56:00.000Z,how-to,,0.35,False,"Create/manage playbooks article is primarily step-by-step usage; no indication of detailed configuration tables, limits, or error mappings.",updated +https://learn.microsoft.com/en-us/azure/sentinel/automation/create-tasks-playbook,Create and perform advanced incident tasks using playbooks,Create and perform incident tasks in Microsoft Sentinel using playbooks,,"This article explains how to use playbooks to create (and optionally perform) incident tasks, in order to manage complex analyst workflow processes in Microsoft Sentinel.","This article explains how to use playbooks to create, and optionally perform, incident tasks to manage complex analyst workflow processes in Microsoft Sentinel. 
Use the Add task action in a playbook, in the Microsoft Sentinel connector, to automatically add a task to the incident that triggered the playbook. Both Standard and Consumption workflows are supported. Tip Incident tasks can be created automatically not only by playbooks, but also by automation rules, and also manually, ad-hoc, from with
See the Supplemental Terms of Use for Microsoft Azure Previews for additional legal terms 
Azure Logic Apps communicates with other systems and services using various types of connectors. Use the Microsoft Sentinel connector to create playbooks that interact
If you’re migrating a playbook that's used by multiple analytics rules, follow the instructions unde",2026-04-22T17:56:00.000Z,how-to,decision-making,0.7,True,"Explains when and why to migrate from analytics-rule triggers to automation rules, with scenario-based guidance; this is product-specific migration/choice guidance that helps decide between approaches.",updated +https://learn.microsoft.com/en-us/azure/sentinel/automation/playbook-recommendations,Recommended and sample playbooks,"Recommended Microsoft Sentinel playbook use cases, templates, and examples",Apply recommended Microsoft Sentinel playbook templates,"Learn about sample use cases for Microsoft Sentinel playbooks, as well as example playbooks and recommended playbook templates.","This article lists sample use cases for Microsoft Sentinel playbooks, as well as sample playbooks and recommended playbook templates.",2026-04-22T17:56:00.000Z,reference,best-practices,0.65,True,Lists concrete playbook use cases and recommended templates specific to Sentinel; these are product-specific DO/USE patterns that function as best-practice guidance beyond generic automation concepts.,updated +https://learn.microsoft.com/en-us/azure/sentinel/automation/playbook-triggers-actions,Supported triggers and actions in playbooks,Supported triggers and actions in Microsoft Sentinel playbooks,Use Sentinel playbook triggers and actions via Logic Apps,Learn in greater depth how to give your playbooks access to the information in your Microsoft Sentinel alerts and incidents and use that information to take remedial actions.,"This article describes the triggers and actions supported by theLogic Apps Microsoft Sentinel connector. Use the listed triggers and actions in Microsoft Sentinel playbooks to interact with your Microsoft Sentinel data. Important Noted functionality is currently inPREVIEW. 
See the Supplemental Terms of Use for Microsoft Azure Previews for additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-04-22T17:56:00.000Z,concept-article,integrations,0.8,True,Catalog of supported triggers/actions for the Sentinel Logic Apps connector is an integration-focused reference with connector-specific operations and parameters that qualify as expert integration knowledge.,updated
Automation rules help you triage incidents in Microsoft Sentinel, and are also used to run playbooks in response to incidents or alerts. For more information, see Automation in Microsoft Sentinel: Security orchestration, automation, and response (SOAR). The sample scenario described in this article describes how to use an automation rule and play
Many playbook templates are developed by the Microsoft Sentinel community, independent software ven",2026-04-22T17:56:00.000Z,how-to,,0.4,False,"Focuses on creating playbooks from templates; likely a procedural tutorial without detailed config matrices, limits, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/sentinel/aws-disruption,Enable attack disruption actions on AWS,Enable Attack Disruption Actions on AWS with Microsoft Sentinel,Configure AWS attack disruption actions from Microsoft Sentinel,Enable Attack Disruption Actions on AWS with Microsoft Sentinel,"This article describes how to configure your AWS environment so that Microsoft Sentinel can take automated actions on a user that assumes a SAML role, or on an AWS IAM account when an alert is triggered. Attack disruption uses high-confidence signals to contain compromised assets and limit the damage from attacks, including actions on identities in AWS.",2026-04-22T17:56:00.000Z,how-to,security,0.8,True,"Describes configuring automated actions on AWS identities; likely includes IAM roles, permissions, and security settings—product-specific security configuration.",updated +https://learn.microsoft.com/en-us/azure/sentinel/aws-s3-troubleshoot,Troubleshoot AWS S3 connector issues,Troubleshoot AWS S3 connector issues - Microsoft Sentinel,Troubleshoot AWS S3 log ingestion connector issues in Sentinel,Troubleshoot AWS S3 connector issues in Microsoft Sentinel.,"The Amazon Web Services (AWS) S3 connector allows you to ingest AWS service logs, collected in AWS S3 buckets, to Microsoft Sentinel. The types of logs we currently support are AWS CloudTrail, VPC Flow Logs, and AWS GuardDuty. This article describes how to quickly identify the cause of issues occurring with the AWS S3 connector so you can find the steps needed to resolve the issues. 
Learn how to connect Microsoft Sentinel to Amazon Web Services to ingest AWS service log data.",2026-04-22T17:56:00.000Z,troubleshooting,troubleshooting,0.85,True,"Explicit troubleshooting for AWS S3 connector; will map symptoms to causes and fixes, possibly with error messages and diagnostic steps—expert troubleshooting content.",updated +https://learn.microsoft.com/en-us/azure/sentinel/azure-storage-blob-connector-troubleshoot,Troubleshoot Azure Storage Blob connector issues,Troubleshoot Azure Storage Blob connector issues - Microsoft Sentinel,Troubleshoot Microsoft Sentinel Azure Storage Blob connector issues,Troubleshoot Azure Storage Blob connector issues in Microsoft Sentinel.,The Azure Storage Blob connector simplifies the process of ingesting data from Azure Storage Blobs to Microsoft Sentinel. This article describes how to quickly identify the cause of issues occurring with the Azure Storage Blob connector so you can find the steps needed to resolve the issues. Learn how to connect Microsoft Sentinel to Azure Storage Blob to ingest data.,2026-04-22T17:56:00.000Z,troubleshooting,troubleshooting,0.86,True,Troubleshooting guide mapping connector symptoms to causes and resolutions; likely includes specific error messages and diagnostic steps.,updated +https://learn.microsoft.com/en-us/azure/sentinel/basic-logs-use-cases,Data lake use cases,When to use the Microsoft Sentinel data lake,Decide when to use the Sentinel data lake tier,"Learn what log sources might be appropriate for the Microsoft Sentinel data lake and what attributes to look for, to decide about other sources.","This article highlights log sources to consider configuring as data lake tier only when enabling a connector. Before choosing a tier for which to configure a given table, check which tier is most appropriate for your use case. For more information about data categories and data tiers, see Log retention plans in Microsoft Sentinel.
Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All custo",2026-04-22T17:56:00.000Z,concept-article,decision-making,0.75,True,Guides which log sources to place in data lake vs analytics tier based on attributes and use cases; product-specific tier selection criteria.,updated +https://learn.microsoft.com/en-us/azure/sentinel/best-practices,SIEM best practices in Microsoft Sentinel,Best practices for Microsoft Sentinel,Apply best practices for managing Sentinel workspaces,Learn about best practices to employ when managing your Log Analytics workspace for Microsoft Sentinel.,"Best practice guidance is provided throughout the technical documentation for Microsoft Sentinel. This article highlights some key guidance to use when deploying, managing, and using Microsoft Sentinel. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Microsoft Sentinel in the Azure portal will be redirected to the Defender portal and will use Microsoft Sentinel in the",2026-04-22T17:56:00.000Z,best-practice,best-practices,0.8,True,Explicit best practices article for deploying and managing Sentinel; contains product-specific DOs/DON’Ts and configuration recommendations.,updated +https://learn.microsoft.com/en-us/azure/sentinel/best-practices-data,Best practices,Best practices for data collection in Microsoft Sentinel,Apply data collection best practices in Sentinel,Learn about best practices to employ when connecting data sources to Microsoft Sentinel.,"This section reviews best practices for collecting data using Microsoft Sentinel data connectors.
For more information, see Connect data sources, Microsoft Sentinel data connectors reference, and the Microsoft Sentinel solutions catalog.",2026-04-22T17:56:00.000Z,concept-article,best-practices,0.7,True,Best practices article specific to Sentinel data connectors; likely includes concrete recommendations and product-specific gotchas for data collection.,updated +https://learn.microsoft.com/en-us/azure/sentinel/billing,Plan costs,Plan costs and understand pricing and billing - Microsoft Sentinel,"Plan Microsoft Sentinel pricing, billing, and cost control","Learn how to plan your Microsoft Sentinel costs, and understand pricing and billing using the pricing calculator and other methods.","To estimate your expected Microsoft Sentinel costs, use the Microsoft Sentinel cost estimator tool, and work directly with a Security sales specialist for a custom quote or additional guidance. Costs for Microsoft Sentinel are only a portion of the monthly costs in your Azure bill. Although this article explains how to plan costs and understand the billing for Microsoft Sentinel, you're billed for all Azure services and resources your Azure subscription uses, including Partner services. This article",2026-04-22T17:56:00.000Z,concept-article,decision-making,0.75,True,Cost planning and billing guidance for Sentinel with estimator usage and likely SKU/tier cost comparisons; supports decision-making on pricing models.,updated +https://learn.microsoft.com/en-us/azure/sentinel/billing-monitor-costs,Monitor costs,Manage and monitor costs for Microsoft Sentinel,Monitor and manage Microsoft Sentinel costs,Learn how to manage and monitor costs and billing for Microsoft Sentinel by using cost analysis in the Azure portal and other methods.,"After you start using Microsoft Sentinel resources, use built-in Cost Management features to confidently manage budgets, monitor costs and security performance. You can also review forecasted costs and identify spending trends to optimize.
With the Sentinel data lake enabled, you can also view your usage directly in the Microsoft Defender portal. Microsoft Sentinel costs are only a portion of your monthly Azure bill. Although this article explains how to manage and monitor costs for Microsoft Se",2026-04-22T17:56:00.000Z,how-to,decision-making,0.6,True,Explains how to interpret and act on Sentinel cost data using Cost Management; includes product-specific cost components and usage views for decision-making.,updated +https://learn.microsoft.com/en-us/azure/sentinel/billing-pre-purchase-plan,Optimize costs with pre-purchase plan,Optimize costs with a prepurchase plan - Microsoft Sentinel,Use Sentinel prepurchase plans to optimize analytics costs,Learn how to save costs and buy a Microsoft Sentinel prepurchase plan,"Save on your Microsoft Sentinel analytics tier costs when you buy a pre-purchase plan. Pre-purchase plans are commit units (CUs) bought at discounted tiers in your purchasing currency for a specific product. The more you buy, the greater the discount. Purchased CUs pay down qualifying costs in US dollars (USD). So, if Microsoft Sentinel generates a retail cost of $100, then 100 Microsoft Sentinel CUs (SCUs) are consumed. Your Microsoft Sentinel pre-purchase plan automatically uses your SCUs to p",2026-04-22T17:56:00.000Z,how-to,decision-making,0.7,True,"Explains commit units, discounts, and how SCUs are applied; includes quantified trade-offs and purchasing guidance.",updated +https://learn.microsoft.com/en-us/azure/sentinel/billing-reduce-costs,Reduce costs,Reduce costs for Microsoft Sentinel,Reduce and optimize Microsoft Sentinel costs,Learn how to reduce costs for Microsoft Sentinel by using different methods in the Azure portal.,"Costs for Microsoft Sentinel are only a portion of the monthly costs in your Azure bill. 
Although this article explains how to reduce costs for Microsoft Sentinel, you're billed for all Azure services and resources your Azure subscription uses, including Partner services. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Microsoft Sentinel in the Azure portal will bere",2026-04-22T17:56:00.000Z,conceptual,best-practices,0.7,True,"Provides concrete Sentinel-specific cost reduction methods (tier choices, transformations, rule tuning); actionable product-specific guidance.",updated +https://learn.microsoft.com/en-us/azure/sentinel/bookmarks,Bookmarks,Hunt with bookmarks in Microsoft Sentinel,,This article describes how to use the Microsoft Sentinel hunting bookmarks to keep track of data.,"Hunting bookmarks in Microsoft Sentinel help you preserve the queries and query results that you deem relevant. You can also record your contextual observations and reference your findings by adding notes and tags. Bookmarked data is visible to you and your teammates for easy collaboration. For more information, see Bookmarks. Note Bookmarks can only be created in the Azure portal. While you can't add bookmarks in the Microsoft Defender portal, you can see bookmarks that were already created. Im",2026-04-22T17:56:00.000Z,how-to,,0.3,False,"Describes bookmark usage conceptually; no configuration tables, limits, or troubleshooting content.",updated +https://learn.microsoft.com/en-us/azure/sentinel/bring-your-own-ml,Bring your own machine learning,Bring your own ML into Microsoft Sentinel,,This article explains how to create and use your own machine learning algorithms for data analysis in Microsoft Sentinel.,"Note For information about feature availability in US Government clouds, see the Microsoft Sentinel tables in Cloud feature availability for US Government customers.
Machine Learning (ML) is one of the major underpinnings of Microsoft Sentinel, and one of the main attributes that set it apart. Microsoft Sentinel offers ML in several experiences: built-in to the Fusion correlation engine and Jupyter notebooks, and the newly available Build-Your-Own ML (BYO ML) platform. ML detection models can adapt",2026-04-22T17:56:00.000Z,concept-article,,0.35,False,"Explains bring-your-own-ML conceptually; summary does not expose concrete config parameters, limits, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/sentinel/business-applications/deploy-power-platform-solution,Deploy for Power Platform and Microsoft Dynamics 365 Customer Engagement,Connect Microsoft Power Platform and Microsoft Dynamics 365 Customer Engagement to Microsoft Sentinel,Connect Power Platform and Dynamics CE to Sentinel,Learn how to deploy the Microsoft Sentinel solution for Business Applications with Microsoft Power Platform and Microsoft Dynamics 365 Customer Engagement to Microsoft Sentinel,"This article describes how to deploy the Microsoft Sentinel solution for Microsoft Business Apps to connect your Microsoft Power Platform and Microsoft Dynamics 365 Customer Engagement system to Microsoft Sentinel.
The solution collects audit and activity logs to detect threats, suspicious activities, illegitimate activities, and more.",2026-04-22T17:56:00.000Z,how-to,deployment,0.65,True,Describes how to deploy the Business Apps solution for Power Platform and Dynamics CE; includes product-specific deployment and connector configuration steps.,updated +https://learn.microsoft.com/en-us/azure/sentinel/business-applications/power-platform-solution-security-content,Power Platform and Microsoft Dynamics 365 Customer Engagement security content reference,Security content reference for Microsoft Power Platform and Microsoft Dynamics 365 Customer Engagement,Security content reference for Power Platform and CE,Learn about the built-in security content provided by the Microsoft Sentinel solution for Power Platform.,"This article details the security content available for the Microsoft Sentinel solution for Power Platform. For more information about this solution, see Microsoft Sentinel solution for Microsoft Power Platform and Microsoft Dynamics 365 Customer Engagement overview.",2026-04-22T17:56:00.000Z,article,security,0.65,True,"Details built-in security content for the Power Platform/Dynamics CE solution, which is product-specific detection and workbook information.",updated +https://learn.microsoft.com/en-us/azure/sentinel/business-applications/solution-overview,Overview,Microsoft Sentinel Solution for MS Business Apps,,"Learn about the Microsoft Sentinel solution for MS Business Apps, including Microsoft Power Platform, Microsoft Dynamics 365 Customer Engagement, and Microsoft Dynamics 365 Finance and Operations.","The Microsoft Sentinel solution for Microsoft Business Apps helps you monitor and protect your Microsoft Power Platform, Microsoft Dynamics 365 Customer Engagement, and Microsoft Dynamics 365 Finance and Operations environments.
It provides security insights and threat detection by collecting audit and activity logs to detect threats, suspicious activities, illegitimate activities, and more.",2026-04-22T17:56:00.000Z,overview,,0.3,False,"Solution overview for Business Apps is high-level description of capabilities; no indication of detailed limits, configuration tables, or error codes.",updated +https://learn.microsoft.com/en-us/azure/sentinel/business-continuity-disaster-recovery,Business continuity and disaster recovery,BCDR Recommendations for Working With Microsoft Sentinel,Design Sentinel for high availability and disaster recovery,"Learn about Business Continuity and Disaster Recovery (BCDR) in Microsoft Sentinel, including availability zones and cross-region disaster recovery strategies.","This article describes reliability support in Microsoft Sentinel and covers both regional resiliency with availability zones, and cross-region resiliency with business continuity and disaster recovery (BCDR). While this article is mainly directed at Microsoft Sentinel customers working in the Azure portal, this guidance also covers data currently managed by Azure services after onboarding to the Microsoft Defender portal. For more information, see Azure reliability.",2026-04-22T17:56:00.000Z,concept-article,architecture-patterns,0.65,True,BCDR and cross-region/zone resiliency guidance for Sentinel; product-specific reliability architecture patterns and recommendations.,updated +https://learn.microsoft.com/en-us/azure/sentinel/cef-name-mapping,CEF log field mapping,Common Event Format (CEF) key and CommonSecurityLog field mapping,Map CEF keys to Sentinel CommonSecurityLog fields,This article maps CEF keys to the corresponding field names in the CommonSecurityLog in Microsoft Sentinel.,"The following tables map Common Event Format (CEF) field names to the names they use in Microsoft Sentinel's CommonSecurityLog, and might be helpful when you're working with a CEF data source in Microsoft Sentinel.
For more information, see Ingest syslog and CEF messages to Microsoft Sentinel with the Azure Monitor Agent.",2026-04-22T17:56:00.000Z,reference,configuration,0.88,True,"Contains detailed mapping tables between CEF keys and Sentinel fields, which are precise integration/configuration references.",updated +https://learn.microsoft.com/en-us/azure/sentinel/cef-syslog-ama-overview,Overview,Syslog and CEF AMA connectors - Microsoft Sentinel,Configure Syslog and CEF AMA connectors for Sentinel,Learn how Microsoft Sentinel collects Syslog and Common Event Format (CEF) messages with the Azure Monitor Agent.,"The Syslog via AMA and Common Event Format (CEF) via AMA data connectors for Microsoft Sentinel filter and ingest Syslog messages, including messages in Common Event Format (CEF), from Linux machines and from network and security devices and appliances. These connectors install the Azure Monitor Agent (AMA) on any Linux machine from which you want to collect Syslog and/or CEF messages. This machine could be the originator of the messages, or it could be a forwarder that collects messages from ot",2026-04-22T17:56:00.000Z,concept-article,configuration,0.7,True,"Overview of Syslog/CEF via AMA connectors; these docs typically include AMA extension names, ports, facility/severity filters—product-specific configuration details.",updated +https://learn.microsoft.com/en-us/azure/sentinel/cef-syslog-ama-troubleshooting,Troubleshoot CEF and Syslog via AMA,Troubleshoot CEF and Syslog via AMA connectors in Microsoft Sentinel,Troubleshoot Syslog and CEF AMA connectors in Sentinel,Learn how to troubleshoot issues with CEF and Syslog data collection using the Azure Monitor Agent (AMA) in Microsoft Sentinel.,"This article provides troubleshooting guidance for Common Event Format (CEF) and Syslog data collection using the Azure Monitor Agent (AMA) in Microsoft Sentinel. Use this guide to diagnose and resolve ingestion issues with your log forwarder machines.
The commands and configurations should be run on the log forwarder machines where AMA and RSyslog/Syslog-ng are installed. Before you begin troubleshooting, familiarize yourself with the following articles:",2026-04-22T17:56:00.000Z,troubleshooting,troubleshooting,0.9,True,"Explicit troubleshooting guide; will map ingestion issues to causes and fixes, include diagnostic commands and log locations—classic symptom→cause→solution expert content.",updated +https://learn.microsoft.com/en-us/azure/sentinel/ci-cd,Deploy content as code from your repository,Deploy custom content from your repository (Preview) - Microsoft Sentinel,Create repository connections to deploy Sentinel custom content,This article describes how to create connections with a GitHub or Azure DevOps repository where you can manage your custom content and deploy it to Microsoft Sentinel.,"When creating custom content, you can manage it from your own Microsoft Sentinel workspaces, or an external source control repository. This article describes how to create and manage connections between Microsoft Sentinel and GitHub or Azure DevOps repositories. Managing your content in an external repository allows you to make updates to that content outside of Microsoft Sentinel, and have it automatically deployed to your workspaces. 
For more information, see Update custom content with reposito",2026-04-22T17:56:00.000Z,how-to,deployment,0.75,True,Describes creating and managing connections between Sentinel and external repos; product-specific deployment automation configuration.,updated +https://learn.microsoft.com/en-us/azure/sentinel/ci-cd-custom-content,Overview,Manage custom content with repository connections - Microsoft Sentinel,Manage Sentinel custom content with CI/CD repositories,This article explains custom Microsoft Sentinel content like GitHub or Azure DevOps repositories that can utilize source control features.,"Microsoft Sentinel repositories let you deploy and manage custom Sentinel content from an external source control repository for continuous integration/continuous delivery (CI/CD). This automation removes the need for manual processes to update and deploy your custom content across workspaces. A subset of content as code is detections as code (DaC). Microsoft Sentinel Repositories implements DaC as well. For more information on Sentinel content, see About Microsoft Sentinel content and solutions.",2026-04-22T17:56:00.000Z,article,deployment,0.75,True,Explains using GitHub/Azure DevOps repositories for Sentinel content; includes product-specific CI/CD integration patterns and constraints.,updated +https://learn.microsoft.com/en-us/azure/sentinel/ci-cd-custom-deploy,Customize repository deployments,Customize repository deployments - Microsoft Sentinel,Customize CI/CD repository deployments for Sentinel content,This article describes how to customize repository deployments for the repositories feature in Microsoft Sentinel.,"There are two primary ways to customize the deployment of your repository content to Microsoft Sentinel workspaces. Each method uses different files and syntax, so consider these examples to get you started. Important The Microsoft Sentinel Repositories feature is currently in PREVIEW.
See the Supplemental Terms of Use for Microsoft Azure Previews for more legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-04-22T17:56:00.000Z,how-to,deployment,0.7,True,Shows ways to customize repository deployments using specific files/syntax; Sentinel-specific deployment behavior and configuration.,updated +https://learn.microsoft.com/en-us/azure/sentinel/cisco-ftd-firewall,Cisco firewalls,Collect data from Cisco firewall devices running ASA,Choose and configure Cisco firewall connectors for Sentinel,Use Microsoft Sentinel connectors to collect logs from Cisco firewall devices in Adaptive Security Appliance (ASA) and Common Event Format (CEF) formats.,"Microsoft Sentinel provides two connectors that collect logs from Cisco Secure Firewall devices, depending on whether the devices run the Firewall Threat Defense (FTD) or Adaptive Security Appliance (ASA) software. This article explains when to use each connector and provides links to installation instructions.",2026-04-22T17:56:00.000Z,how-to,decision-making,0.7,True,Explains when to use each Cisco connector (FTD vs ASA) and links to setup; this is product-specific connector selection guidance—decision-making expert knowledge.,updated +https://learn.microsoft.com/en-us/azure/sentinel/collaborate-in-microsoft-teams,Collaborate in Microsoft Teams,Collaborate in Microsoft Teams with a Microsoft Sentinel incident team,Integrate Microsoft Sentinel incidents with Microsoft Teams,Learn how to connect to Microsoft Teams from Microsoft Sentinel to collaborate with others on your team using Microsoft Sentinel data.,"Microsoft Sentinel in the Azure portal supports a direct integration with Microsoft Teams, enabling you to jump directly into teamwork on specific incidents. Important Integration with Microsoft Teams is currently in PREVIEW.
See the Supplemental Terms of Use for Microsoft Azure Previews for additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-04-22T17:56:00.000Z,how-to,integrations,0.62,True,"Covers a direct, product-specific integration between Sentinel and Teams; such pages typically include connection settings and behaviors unique to this integration, beyond generic Teams usage.",updated +https://learn.microsoft.com/en-us/azure/sentinel/compare-analytics-rules-custom-detections,Compare analytics rules and custom detections,Compare Microsoft Sentinel analytics rules and Microsoft Defender custom detections - Microsoft Security,Compare Sentinel analytics rules vs Defender custom detections,Compare the different features supported by Microsoft Sentinel analytics rules and Microsoft Defender custom detections.,"This article lists and compares the different features supported by Microsoft Sentinel analytics rules and Microsoft Defender custom detections. It also provides additional information, such as plans to support any analytics rules capabilities that aren't available in custom detections, if applicable. Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM and Microsoft Defender XDR.
With custom detections, you can reduce ingestion costs, get unlimited real-tim",2026-04-22T17:56:00.000Z,product-comparison,decision-making,0.75,True,"Explicit comparison article; likely includes feature comparison tables, supported capabilities, and guidance on when to use each, which are decision-making details with quantified trade-offs.",updated +https://learn.microsoft.com/en-us/azure/sentinel/configure-connector-login-detection,Configure RDP login detection,Configure the Security Events connector for anomalous RDP login detection,Configure Security Events connector for anomalous RDP detection in Sentinel,Learn how to configure the Security Events or Windows Security Events connector for anomalous RDP login detection.,"Microsoft Sentinel can apply machine learning (ML) to Security events data to identify anomalous Remote Desktop Protocol (RDP) login activity. Scenarios include: Unusual IP - the IP address has rarely or never been observed in the last 30 days. Unusual geo-location - the IP address, city, country/region, and ASN have rarely or never been observed in the last 30 days. New user - a new user logs in from an IP address and geo-location, both or either of which were not expected to be seen based on data f
In this article, you learn how to configure the different types of Microsoft Sentinel security content, which allow you to detect, monitor, and respond to security threats across your systems. This article is part of the Deployment guide for Microsoft Sentinel.",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,"Configures data connectors, analytics rules, automation rules, etc.; likely includes specific setting names and options unique to Sentinel content.",updated +https://learn.microsoft.com/en-us/azure/sentinel/configure-data-connector,Connect data sources,Connect data sources to Microsoft Sentinel by using data connectors,Install and configure Microsoft Sentinel data connectors,Learn how to connect data sources to Microsoft Sentinel using data connectors for improved threat detection.,"To connect data sources to Microsoft Sentinel, you need to install and configure data connectors. This article generally explains how to install data connectors available in the Microsoft Sentinel Content hub to ingest and analyze data for improved threat detection. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal.
All customers using Microsoft Sentinel in the Azure portal will be redirected",2026-04-22T17:56:00.000Z,how-to,configuration,0.65,True,Explains how to install and configure connectors from Content hub; likely includes connector-specific settings and configuration parameters.,updated +https://learn.microsoft.com/en-us/azure/sentinel/configure-data-retention-archive,Configure interactive and long-term data retention,Configure interactive and long-term data retention in Microsoft Sentinel,Configure interactive and long-term Sentinel data retention,"Towards the end of your deployment procedure, you set up data retention to suit your organization's needs.","In the previous deployment step, you enabled the User and Entity Behavior Analytics (UEBA) feature to streamline your analysis process. In this article, you learn how to set up interactive and long-term data retention, to make sure your organization retains the data that's important in the long term. This article is part of the Deployment guide for Microsoft Sentinel.",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,Covers setting up interactive vs long-term retention; likely includes specific retention settings and options for Sentinel workspaces.,updated +https://learn.microsoft.com/en-us/azure/sentinel/configure-data-transformation,Configure ingestion-time transformation,Transform or customize data at ingestion time in Microsoft Sentinel (preview),Configure ingestion-time data transformation and custom log ingestion in Sentinel,Learn about how to configure Azure Monitor's ingestion-time data transformation for use with Microsoft Sentinel.,"This article describes how to configure ingestion-time data transformation and custom log ingestion for use in Microsoft Sentinel. Ingestion-time data transformation provides customers with more control over the ingested data.
Supplementing the pre-configured, hardcoded workflows that create standardized tables, ingestion-time transformation adds the capability to filter and enrich the output tables, even before running any queries. Custom log ingestion uses the Custom Log API to normalize custom-",2026-04-22T17:56:00.000Z,how-to,configuration,0.8,True,"Covers configuring ingestion-time transformations and custom log ingestion; includes transformation rules, DCR fields, and API usage—detailed configuration expert knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/configure-fusion-rules,Configure multistage attack (Fusion) rules,Configure multistage attack detection (Fusion) rules in Microsoft Sentinel,Configure Fusion multistage attack rules in Sentinel,Create and configure attack detection rules based on Fusion technology in Microsoft Sentinel.,"Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM and Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. Important The new version of the Fusion analytics rule is currently in PREVIEW.
See the Supplemental Terms of Use for Micro",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,"Configuration article for Fusion rules; likely lists rule parameters, enable/disable options, and scope settings that are specific to Sentinel’s Fusion engine.",updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-aws,AWS service logs,Connect Microsoft Sentinel to Amazon Web Services to ingest AWS service log data,Configure AWS service log connector to ingest data into Sentinel,"Use the AWS connector to delegate Microsoft Sentinel access to AWS resource logs, creating a trust relationship between Amazon Web Services and Microsoft Sentinel.","The Amazon Web Services (AWS) service log connector is available in two versions: the legacy connector for CloudTrail management and data logs, and the new version that can ingest logs from the following AWS services by pulling them from an S3 bucket (links are to AWS documentation): This tab explains how to configure the AWS S3 connector using one of two methods:",2026-04-22T17:56:00.000Z,how-to,configuration,0.75,True,"Connector configuration for AWS logs; includes trust relationship, role ARNs, S3 paths, and connector settings—expert configuration knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-aws-configure-environment,Connect Microsoft Sentinel to AWS,Set up your Amazon Web Services (AWS) environment to collect AWS logs to Microsoft Sentinel,Prepare AWS environment to send logs to Microsoft Sentinel,Set up your Amazon Web Services environment to send AWS logs to Microsoft Sentinel using one of the Microsoft Sentinel AWS connectors.,Amazon Web Services (AWS) connectors simplify the process of collecting logs from Amazon S3 (Simple Storage Service) and ingesting them into Microsoft Sentinel. The connectors provide tools to help you configure your AWS environment for Microsoft Sentinel log collection.
This article outlines the AWS environment setup required to send logs to Microsoft Sentinel and links to step-by-step instructions for setting up your environment and collecting AWS logs using each supported connector.,2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,"AWS environment setup for connectors; typically includes IAM roles, policies, S3 bucket settings—detailed cross-cloud configuration parameters.",updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-aws-eks,AWS EKS logs,Connect Microsoft Sentinel to Amazon Web Services to ingest AWS EKS logs,Configure AWS EKS S3 connector to ingest audit logs into Sentinel,"Use the Amazon Web Services (AWS) S3-based Elastic Kubernetes Service (EKS) connector to ingest AWS EKS audit logs, collected in AWS S3 buckets, to Microsoft Sentinel.","Use the Amazon Web Services (AWS) S3-based Elastic Kubernetes Service (EKS) connector to ingest AWS EKS audit logs, collected in AWS S3 buckets, to Microsoft Sentinel. AWS EKS audit logs are detailed records of API server requests, authentication decisions, and cluster activities within your Kubernetes clusters. These records contain information such as the time the request was received, the specifics of the request, the user making the request, and the action taken. 
This log analysis is essenti",2026-04-22T17:56:00.000Z,how-to,configuration,0.75,True,"EKS audit log connector; includes S3 bucket, permissions, and Sentinel connector settings—product-specific configuration details.",updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-aws-s3-waf,AWS S3 WAF logs,Connect Microsoft Sentinel to Amazon Web Services to ingest AWS WAF logs,Configure AWS WAF S3 connector to stream logs to Sentinel,"Use the Amazon Web Services (AWS) S3-based Web Application Firewall (WAF) connector to ingest AWS WAF logs, collected in AWS S3 buckets, to Microsoft Sentinel.","Use the Amazon Web Services (AWS) S3-based Web Application Firewall (WAF) connector to ingest AWS WAF logs, collected in AWS S3 buckets, to Microsoft Sentinel. AWS WAF logs are detailed records of the web traffic analyzed by the AWS WAF against web access control lists (ACLs). These records contain information such as the time AWS WAF received the request, the specifics of the request, and the action taken by the rule that the request matched. These logs and this analysis are essential for maint",2026-04-22T17:56:00.000Z,how-to,configuration,0.75,True,"AWS WAF S3-based connector; will specify bucket configuration, log format, and Sentinel connector parameters—detailed configuration content.",updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-active-directory,Microsoft Entra,Send Microsoft Entra ID data to Microsoft Sentinel,Configure Microsoft Entra ID connector to send logs to Sentinel,"Learn how to collect data from Microsoft Entra ID, and stream Microsoft Entra sign-in, audit, and provisioning logs into Microsoft Sentinel.","Microsoft Entra ID logs provide comprehensive information about users, applications, and networks accessing your Microsoft Entra tenant. 
This article explains the types of logs you can collect using the Microsoft Entra ID data connector, how to enable the connector to send data to Microsoft Sentinel, and how to find your data in Microsoft Sentinel.",2026-04-22T17:56:00.000Z,how-to,configuration,0.75,True,"Connector for Entra ID logs; includes portal settings, log categories, and connector parameters—product-specific configuration details.",updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-functions-template,Azure Functions API connection,Use Azure Functions to connect Microsoft Sentinel to your data source,Integrate external data sources via Azure Functions with Sentinel,Learn how to configure data connectors that use Azure Functions to get data from data sources into Microsoft Sentinel.,"You can use Azure Functions, in conjunction with various coding languages such as PowerShell or Python, to create a serverless connector to the REST API endpoints of your compatible data sources. Azure Function Apps then allow you to connect Microsoft Sentinel to your data source's REST API to pull in logs. This article describes how to configure Microsoft Sentinel for using Azure Function Apps. 
You may also need to configure your source system, and you can find vendor- and product-specific informa",2026-04-22T17:56:00.000Z,how-to,integrations,0.7,True,"Shows how to build Azure Functions-based connectors; typically includes function app settings, bindings, and Sentinel-specific API parameters—code-focused integration patterns.",updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-stack,Azure Stack VMs,Onboard your Azure Stack Hub virtual machines to Microsoft Sentinel,Onboard Azure Stack Hub VMs to Microsoft Sentinel monitoring,"This article shows you how to provision the Azure Monitor, Update, and Configuration Management virtual machine extension on Azure Stack Hub virtual machines and start monitoring them with Microsoft S","With Microsoft Sentinel, you can monitor your VMs running on Azure and Azure Stack Hub in one place. To on-board your Azure Stack machines to Microsoft Sentinel, you first need to add the virtual machine extension to your existing Azure Stack Hub virtual machines. After you connect Azure Stack Hub machines, choose from a gallery of dashboards that surface insights based on your data. These dashboards can be easily customized to your needs.",2026-04-22T17:56:00.000Z,how-to,deployment,0.7,True,Shows how to provision VM extensions and start monitoring; involves deployment of agents/extensions on Azure Stack with Sentinel-specific requirements—deployment expert knowledge.,updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-virtual-desktop,Azure Virtual Desktop,Connect Azure Virtual Desktop to Microsoft Sentinel,Connect Azure Virtual Desktop diagnostics and logs to Sentinel,Learn to connect your Azure Virtual Desktop data to Microsoft Sentinel.,"This article describes how you can monitor your Azure Virtual Desktop environments using Microsoft Sentinel. 
For example, monitoring your Azure Virtual Desktop environments can enable you to provide more remote work using virtualized desktops, while maintaining your organization's security posture.",2026-04-22T17:56:00.000Z,how-to,configuration,0.65,True,Monitoring AVD with Sentinel requires configuring specific diagnostics/log sources and connectors—product-specific configuration details.,updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-windows-microsoft-services,Connect Microsoft Sentinel to Microsoft connectors,"Connect Microsoft Sentinel to Azure, Windows, and Microsoft services",Configure Sentinel connectors for Azure and Microsoft services,Learn how to connect Microsoft Sentinel to Azure and Microsoft 365 cloud services and to Windows Server event logs.,"Microsoft Sentinel uses the Azure foundation to provide built-in, service-to-service support for data ingestion from many Azure and Microsoft 365 services, Amazon Web Services, and various Windows Server services. There are a few different methods through which these connections are made. 
Note For information about feature availability in US Government clouds, see the Microsoft Sentinel tables in Cloud feature availability for US Government customers.",2026-04-22T17:56:00.000Z,overview,configuration,0.7,True,"Describes how to connect Sentinel to Azure/M365/Windows; such connector docs contain specific configuration steps, parameters, and sometimes per-service settings, which are configuration expert knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-cef-syslog-ama,CEF and Syslog via AMA,Ingest syslog and CEF messages to Microsoft Sentinel - AMA,Ingest Syslog and CEF data to Sentinel using AMA,"Ingest syslog messages from Linux machines and from network and security devices and appliances to Microsoft Sentinel, using data connectors based on the Azure Monitor Agent (AMA).","This article shows you how to use the Syslog via AMA and Common Event Format (CEF) via AMA connectors to filter and ingest syslog and CEF messages from Linux machines, network devices, and security appliances. To learn more about these data connectors, see Syslog and Common Event Format (CEF) via AMA connectors for Microsoft Sentinel. Note Container Insights supports automatic collection of syslog events from Linux nodes in your AKS clusters. 
Learn more in Syslog collection with Container Insights.",2026-04-22T17:56:00.000Z,how-to,configuration,0.75,True,"Step-by-step ingestion guide; will include AMA install commands, DCR definitions, and connector settings, which are detailed configuration parameters.",updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-custom-logs-ama,Collect logs from text files via AMA,Collect logs from text files with the Azure Monitor Agent and ingest to Microsoft Sentinel - AMA,Configure Custom Logs via AMA to ingest text files into Sentinel,"Collect text file-based logs from network or security applications installed on Windows- or Linux-based machines, using the Custom Logs via AMA data connector based on the Azure Monitor Agent (AMA).",This article describes how to use the Custom Logs via AMA connector to quickly filter and ingest logs in text-file format from network or security applications installed on Windows or Linux machines. Many applications log data to text files instead of standard logging services like Windows Event log or Syslog. You can use the Azure Monitor Agent (AMA) to collect data in text files of nonstandard formats from both Windows and Linux computers. The AMA can also effect transformations on the data at t,2026-04-22T17:56:00.000Z,how-to,configuration,0.75,True,"Describes using Custom Logs via AMA; such docs include DCR schema, file path patterns, and transformation settings—detailed configuration parameters.",updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-data-sources,Overview,Microsoft Sentinel data connectors,,"Learn about supported data connectors, like Microsoft Defender XDR (formerly Microsoft 365 Defender), Microsoft 365 and Office 365, Microsoft Entra ID, ATP, and Defender for Cloud Apps to Microsoft Se","After you onboard Microsoft Sentinel into your workspace, use data connectors to start ingesting your data into Microsoft Sentinel. 
Microsoft Sentinel comes with many out of the box connectors for Microsoft services, which integrate in real time. For example, the Microsoft Defender XDR connector is a service-to-service connector that integrates data from Office 365, Microsoft Entra ID, Microsoft Defender for Identity, and Microsoft Defender for Cloud Apps. Built-in connectors enable connection t",2026-04-22T17:56:00.000Z,concept-article,,0.4,False,Overview of data connectors; mostly descriptive without detailed configuration tables or limits.,updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-defender-for-cloud,Microsoft Defender for Cloud,Ingest Microsoft Defender for Cloud subscription-based alerts to Microsoft Sentinel,Configure Defender for Cloud alerts ingestion into Sentinel,Learn how to connect security alerts from Microsoft Defender for Cloud and stream them into Microsoft Sentinel.,"Microsoft Defender for Cloud's integrated cloud workload protections allow you to detect and quickly respond to threats across hybrid and multicloud workloads. The Microsoft Defender for Cloud connector allows you to ingest security alerts from Defender for Cloud into Microsoft Sentinel, so you can view, analyze, and respond to Defender alerts, and the incidents they generate, in a broader organizational threat context. Microsoft Defender for Cloud Defender plans are enabled per subscription. While M",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,Connector for Defender for Cloud; includes subscription-level settings and connector parameters—detailed configuration expert knowledge.,updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-dns-ama,DNS via AMA,Stream and filter Windows DNS logs with the AMA connector,Stream and filter Windows DNS logs to Sentinel with AMA,Ingest and filter data from your Windows DNS server logs with this data connector. 
Query this data to protect your DNS servers from threats and attacks.,"This article describes how to use the Azure Monitor Agent (AMA) connector to stream and filter events from your Windows Domain Name System (DNS) server logs. You can then deeply analyze your data to protect your DNS servers from threats and attacks. The AMA and its DNS extension are installed on your Windows Server to upload data from your DNS analytical logs to your Microsoft Sentinel workspace. DNS is a widely used protocol, which maps between host names and computer readable IP addresses. Bec",2026-04-22T17:56:00.000Z,how-to,configuration,0.75,True,"Connector for DNS logs; will specify AMA extension, event channels, filters, and DCR settings—product-specific configuration details.",updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-google-cloud-platform,Google Cloud Platform connectors,Ingest Google Cloud Platform log data into Microsoft Sentinel,Configure GCP Pub/Sub connectors to ingest logs into Sentinel,This article describes how to ingest service log data from the Google Cloud Platform (GCP) into Microsoft Sentinel.,"Organizations are increasingly moving to multicloud architectures, whether by design or due to ongoing requirements. A growing number of these organizations use applications and store data on multiple public clouds, including the Google Cloud Platform (GCP). This article describes how to ingest GCP data into Microsoft Sentinel to get full security coverage and analyze and detect attacks in your multicloud environment. 
With the GCP Pub/Sub connectors, based on our Codeless Connector Framework (CCF),",2026-04-22T17:56:00.000Z,how-to,configuration,0.75,True,"GCP ingestion article; will include Pub/Sub topics, subscriptions, service accounts, and Sentinel connector settings—detailed cross-cloud configuration.",updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-logstash-data-connection-rules,Logstash plugin with Data Collection Rules,Use Logstash to stream logs with pipeline transformations via DCR-based API,Integrate Logstash with Sentinel using DCR-based output plugin,"Use Logstash to forward logs from external data sources into custom and standard tables in Microsoft Sentinel, and to configure the output with DCRs.","Important Data ingestion using the Logstash output plugin with Data Collection Rules (DCRs) is currently in public preview. This feature is provided without a service level agreement. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Microsoft Sentinel's new Logstash output plugin supports pipeline transformations and advanced configuration via Data Collection Rules (DCRs). 
The plugin forwards any type of logs from external data sources into custom or standard tabl",2026-04-22T17:56:00.000Z,how-to,integrations,0.8,True,"Logstash integration with DCR-based API; includes plugin parameters, pipeline config, and DCR schema—code-focused integration and configuration expert knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-mdti-data-connector,Enable MDTI data connector,Enable the data connector for Microsoft's threat intelligence - Microsoft Defender Threat Intelligence,Enable Defender Threat Intelligence data connector,Learn how to ingest Microsoft's threat intelligence into your Microsoft Sentinel workspace to generate high-fidelity alerts and incidents.,"Bring public, open-source and high-fidelity indicators of compromise (IOCs) generated by Microsoft Defender Threat Intelligence into your Microsoft Sentinel workspace with the Defender Threat Intelligence data connectors. With a simple one-click setup, use the threat intelligence from the standard and premium Defender Threat Intelligence data connectors to monitor, alert, and hunt. 
After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only ",2026-04-22T17:56:00.000Z,how-to,integrations,0.8,True,Connector enablement article for Defender Threat Intelligence; will include connector configuration parameters and behavior unique to this integration.,updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-microsoft-365-defender,Microsoft Defender XDR,Stream data from Microsoft Defender XDR to Microsoft Sentinel in the Azure portal,Configure Microsoft Defender XDR connector to stream incidents to Sentinel,"Learn how to ingest incidents, alerts, and raw event data from Microsoft Defender XDR into Microsoft Sentinel in the Azure portal.","The Defender XDR connector allows you to stream all Microsoft Defender XDR incidents, alerts, and advanced hunting events into Microsoft Sentinel and keeps incidents synchronized between both portals. This article explains how to configure the Microsoft Defender XDR connector for Microsoft Sentinel in the Azure portal. Note The Defender XDR connector is automatically enabled when you onboard Microsoft Sentinel to the Defender portal. 
The manual configuration steps described in this article are n",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,"Explains configuring Defender XDR connector; includes portal settings, scopes, and sync behavior—product-specific configuration details.",updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-microsoft-purview,Microsoft Purview Information Protection data,Stream data from Microsoft Purview Information Protection to Microsoft Sentinel,Configure Microsoft Purview Information Protection connector for Sentinel,Stream data from Microsoft Purview Information Protection (formerly Microsoft Information Protection) to Microsoft Sentinel so you can analyze and report on data from the Microsoft Purview labeling cl,"This article describes how to stream data from Microsoft Purview Information Protection (formerly Microsoft Information Protection or MIP) to Microsoft Sentinel. You can use the data ingested from the Microsoft Purview labeling clients and scanners to track, analyze, report on the data, and use it for compliance purposes. Important The Microsoft Purview Information Protection connector is currently in PREVIEW. The Azure Preview Supplemental Terms include additional legal terms that apply to Azure ",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,Streaming Purview Information Protection data requires specific connector settings and scopes—detailed configuration expert knowledge.,updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-services-api-based,Connect via API-based connectors,Connect Microsoft Sentinel to other Microsoft services with an API-based data connector,Configure API-based Microsoft service connectors for Sentinel,Learn how to connect Microsoft Sentinel to Microsoft services with API-based connections.,"This article describes how to make API-based connections to Microsoft Sentinel. 
Microsoft Sentinel uses the Azure foundation to provide built-in, service-to-service support for data ingestion from many Azure and Microsoft 365 services, Amazon Web Services, and various Windows Server services. There are a few different methods through which these connections are made. This article presents information that is common to the group of API-based data connectors. Note For information about feature ava",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,"API-based connector article will include endpoint URLs, auth scopes, and connector parameters unique to Sentinel integrations, fitting configuration/integration expert knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-services-diagnostic-setting-based,Connect via diagnostic settings-based connectors,Connect Microsoft Sentinel to other Microsoft services by using diagnostic settings-based connections,Configure diagnostic settings-based connectors for Sentinel,Learn how to connect Microsoft Sentinel to Microsoft services with diagnostic settings-based connections.,"This article describes how to connect to Microsoft Sentinel by using diagnostic settings connections. Microsoft Sentinel uses the Azure foundation to provide built-in, service-to-service support for data ingestion from many Azure and Microsoft 365 services, Amazon Web Services, and various Windows Server services. There are a few different methods through which these connections are made. 
This article presents information that is common to the group of data connectors that use diagnostic setting",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,"Diagnostic settings-based connections require specific Azure diagnostic categories, destinations, and settings; these are product-specific configuration parameters.",updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-services-windows-based,Connect via Windows agent-based connectors,Connect Microsoft Sentinel to other Microsoft services with a Windows agent-based data connector,Configure Windows agent-based data connectors for Sentinel,Learn how to connect Microsoft Sentinel to Microsoft services with Windows agent-based connections.,"This article describes how to connect Microsoft Sentinel to other Microsoft services with Windows agent-based connections. Microsoft Sentinel uses the Azure Monitor Agent to provide built-in, service-to-service support for data ingestion from many Azure and Microsoft 365 services, Amazon Web Services, and various Windows Server services. The Azure Monitor Agent uses Data collection rules (DCRs) to define the data to collect from each agent. Data collection rules offer you two distinct advantages: Manage ",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,"Uses AMA and DCRs; such docs list DCR fields, data sources, and agent settings, which are detailed configuration knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-threat-intelligence-taxii,Connect to STIX/TAXII feeds,Connect to STIX/TAXII threat intelligence feeds - Microsoft Sentinel,Connect TAXII threat intel feeds to Sentinel,Learn how to connect Microsoft Sentinel to industry-standard threat intelligence feeds to import threat indicators.,"The STIX data format and the TAXII protocol are the most widely adopted industry standards for transmitting threat intelligence. 
Microsoft Sentinel supports integration with threat intelligence platforms using the standards, and provides built-in connectors for importing and exporting threat intelligence. Use the Threat Intelligence – TAXII data connector to import threat indicators from TAXII 2.0 or 2.1 servers into your Sentinel workspace. To share threat intelligence externally, configure the ",2026-04-22T17:56:00.000Z,how-to,integrations,0.7,True,"Connector article for STIX/TAXII feeds typically documents connector-specific configuration fields (TAXII server URL formats, collection IDs, authentication parameters, polling intervals) and Sentinel-side connector settings that are product-specific integration details, not just conceptual info.",updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-threat-intelligence-tip,Connect threat intelligence platforms,Connect your threat intelligence platform - Microsoft Sentinel,Connect threat intelligence platform to Sentinel (legacy),Learn how to connect your threat intelligence platform (TIP) or custom feed to Microsoft Sentinel and send threat indicators.,"Note This data connector will be deprecated and will stop collecting data in June 2026. We recommend transitioning to the new Threat Intelligence Upload Indicators API data connector as soon as possible to ensure uninterrupted data collection. +For more information, see Connect your threat intelligence platform to Microsoft Sentinel with the upload API. Many organizations use threat intelligence platform (TIP) solutions to aggregate threat indicator feeds from various sources. 
From the aggregated f",2026-04-22T17:56:00.000Z,how-to,integrations,0.8,True,"Connector article for TIP integration, including deprecation details and configuration steps; product-specific integration configuration.",updated +https://learn.microsoft.com/en-us/azure/sentinel/connect-threat-intelligence-upload-api,Connect threat intelligence with upload API,Connect your TIP with the upload API (Preview) - Microsoft Sentinel,Connect TIP to Sentinel using Upload API,Learn how to connect your threat intelligence platform (TIP) or custom feed using the upload API to Microsoft Sentinel.,"Many organizations use threat intelligence platform (TIP) solutions to aggregate threat intelligence feeds from various sources. From the aggregated feed, the data is curated to apply to security solutions such as network devices, EDR/XDR solutions, or security information and event management (SIEM) solutions such as Microsoft Sentinel. The industry standard for describing cyberthreat information is called ""Structured Threat Information Expression"" or STIX. By using the upload API which suppor",2026-04-22T17:56:00.000Z,how-to,integrations,0.85,True,Explains how to use the Upload API with STIX/TAXII to send indicators; includes API parameters and constraints specific to Sentinel’s TIP integration.,updated +https://learn.microsoft.com/en-us/azure/sentinel/create-analytics-rule-from-template,Create a scheduled rule from a template,Create scheduled analytics rules from templates in Microsoft Sentinel,,This article explains how to view and create scheduled analytics rules from templates in Microsoft Sentinel.,"By far the most common type of analytics rule, Scheduled rules are based on Kusto queries that are configured to run at regular intervals and examine raw data from a defined ""lookback"" period. These queries can perform complex statistical operations on their target data, revealing baselines and outliers in groups of events. 
If the number of results captured by the query passes the threshold configured in the rule, the rule produces an alert. Microsoft makes a vast array of analytics rule templates ava",2026-04-22T17:56:00.000Z,how-to,,0.4,False,Focuses on creating rules from templates; usually more of a how-to/template usage guide than a full configuration reference with parameter tables or numeric ranges.,updated +https://learn.microsoft.com/en-us/azure/sentinel/create-analytics-rules,Create a scheduled rule from scratch,Create scheduled analytics rules in Microsoft Sentinel,Create and configure scheduled Sentinel rules,This article explains how to view and create scheduled analytics rules in Microsoft Sentinel.,"You’ve set up connectors and other means of collecting activity data across your digital estate. Now you need to dig through all that data to detect patterns of activity and discover activities that don’t fit those patterns and that could represent a security threat. Microsoft Sentinel and its many solutions provided in the Content hub offer templates for the most commonly used types of analytics rules, and you’re strongly encouraged to make use of those templates, customizing them to fit your speci",2026-04-22T17:56:00.000Z,how-to,configuration,0.65,True,"Covers creating scheduled analytics rules beyond templates; such articles typically enumerate rule settings (query schedule, lookback, thresholds, suppression) with specific options and constraints, which are configuration details.",updated +https://learn.microsoft.com/en-us/azure/sentinel/create-codeless-connector,Creating codeless data connectors (CCF),Create a codeless connector for Microsoft Sentinel,Create codeless data connectors using Sentinel CCF,Learn how to create a codeless connector in Microsoft Sentinel using the Codeless Connector Framework (CCF).,"The Codeless Connector Framework (CCF) provides partners, advanced users, and developers the ability to create custom connectors for ingesting data to Microsoft 
Sentinel. Connectors created using the CCF are fully SaaS, with no requirements for service installations. They also include health monitoring and full support from Microsoft Sentinel. Use the following steps to create your CCF connector and connect your data source to Microsoft Sentinel. This article will show you how to complete each step",2026-04-22T17:56:00.000Z,how-to,integrations,0.75,True,"Codeless Connector Framework article for ingesting data; typically includes connector schema, parameter names, and configuration details unique to Sentinel integrations.",updated +https://learn.microsoft.com/en-us/azure/sentinel/create-custom-connector,Create a custom connector,Resources for creating Microsoft Sentinel custom connectors,,"Learn about available resources for creating custom connectors for Microsoft Sentinel. Methods include the Log Analytics API, Logstash, Logic Apps, PowerShell, and Azure Functions.","Microsoft Sentinel provides a wide range of out-of-the-box connectors for Azure services and external solutions, and also supports ingesting data from some sources without a dedicated connector. If you're unable to connect your data source to Microsoft Sentinel using any of the existing solutions available, consider creating your own data source connector. 
For a full list of supported connectors, see the Find your Microsoft Sentinel data connector page.",2026-04-22T17:56:00.000Z,concept-article,,0.3,False,"High-level resource overview for creating custom connectors; likely links out to detailed docs but itself is conceptual without concrete configs, limits, or error mappings.",updated +https://learn.microsoft.com/en-us/azure/sentinel/create-custom-connector-builder-agent,Create custom connectors using AI agent in Microsoft Sentinel,Get started with custom connectors using AI agent in Microsoft Sentinel,Build custom Sentinel connectors with AI agent in VS Code,Custom Data connectors using AI agent in Microsoft Sentinel Visual Studio Code extension,"The Microsoft Sentinel connector builder agent builds data connectors in minutes using the AI‑assisted workflow in GitHub Copilot using the Microsoft Sentinel extension for Visual Studio Code (VS Code). This low‑code experience guides developers and Independent Software Vendors (ISVs) end‑to‑end by autonomously generating schemas, deployment assets, connector UI, secure secret handling, and polling logic. Built‑in validation surfaces any polling issues early, so you can validate event logs befor",2026-04-22T17:56:00.000Z,feature-availability,integrations,0.7,True,"Describes an AI-assisted connector builder with deployment assets, polling logic, and secure secret handling; likely includes product-specific connector configuration parameters and patterns for integrating data sources.",updated +https://learn.microsoft.com/en-us/azure/sentinel/create-incident-manually,Create incidents manually,Create your own incidents manually in Microsoft Sentinel in the Azure portal,,Manually create incidents in Microsoft Sentinel based on data or information received by the SOC through alternate means or channels.,"Important Manual incident creation, using the portal or Logic Apps, is currently in PREVIEW. 
See the Supplemental Terms of Use for Microsoft Azure Previews for additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability. Manual incident creation is generally available using the API. After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender por",2026-04-22T17:56:00.000Z,how-to,,0.35,False,Manual incident creation description; preview/legal notes but no explicit expert-only configuration matrices or limits in the summary.,updated +https://learn.microsoft.com/en-us/azure/sentinel/create-incidents-from-alerts,Create incidents from Microsoft Security alerts,Create incidents from alerts in Microsoft Sentinel,Configure incident creation from alerts in Sentinel,Learn how to create incidents from alerts in Microsoft Sentinel.,"Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM and Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. 
Alerts triggered in Microsoft security solutions that are connected to Microsoft Sentinel, such as Microsoft Defender for",2026-04-22T17:56:00.000Z,how-to,configuration,0.65,True,Describes how alerts are grouped into incidents and how to configure that behavior; typically includes rule settings and mapping options unique to Sentinel incident creation.,updated +https://learn.microsoft.com/en-us/azure/sentinel/create-manage-use-automation-rules,Create automation rules,Create and use Microsoft Sentinel automation rules to manage response,Configure Microsoft Sentinel automation rules for response,"This article explains how to create and use automation rules in Microsoft Sentinel to manage and handle incidents, in order to maximize your SOC's efficiency and effectiveness in response to security ","This article explains how to create and use automation rules in Microsoft Sentinel to manage and orchestrate threat response, in order to maximize your SOC's efficiency and effectiveness. In this article you'll learn how to define the triggers and conditions that determine when your automation rule runs, the various actions that you can have the rule perform, and the remaining features and functionalities. 
Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,"Explains defining triggers, conditions, and actions for automation rules, which implies detailed product-specific configuration options and fields unique to Sentinel automation.",updated +https://learn.microsoft.com/en-us/azure/sentinel/create-nrt-rules,Create NRT analytics rules,Work with near-real-time (NRT) detection analytics rules in Microsoft Sentinel,Create and manage NRT detection rules in Sentinel,This article explains how to view and create near-real-time (NRT) detection analytics rules in Microsoft Sentinel.,"Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. Microsoft Sentinel’s near-real-time analytics rules provide up-to-the-minute threat detection out-of-the-box. 
This type of ",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,"Covers viewing and creating NRT rules; such content typically lists rule parameters, supported options, and constraints unique to NRT analytics.",updated +https://learn.microsoft.com/en-us/azure/sentinel/create-push-codeless-connector,Creating push codeless data connectors (CCF),Microsoft Sentinel CCF push connectors (preview) - Getting started guide,Configure push-based codeless connectors for Sentinel,Learn how to create and deploy push-based codeless connectors for Microsoft Sentinel that send data in real time.,"This guide helps you understand, build, and deploy push-based codeless connectors for Microsoft Sentinel using the Codeless Connector Framework (CCF) Push (preview).",2026-04-22T17:56:00.000Z,how-to,integrations,0.75,True,"Push-based CCF connectors with real-time data; likely documents endpoint formats, auth parameters, and connector-specific configuration fields.",updated +https://learn.microsoft.com/en-us/azure/sentinel/create-tasks-automation-rule,Create incident tasks using automation rules,Create incident tasks in Microsoft Sentinel using automation rules,Configure automation rules to create Sentinel incident tasks,"This article explains how to use automation rules to create lists of incident tasks, in order to standardize analyst workflow processes in Microsoft Sentinel.","This article explains how to use automation rules to create lists of incident tasks, in order to standardize analyst workflow processes in Microsoft Sentinel. 
Incident tasks can be created automatically not only by automation rules, but also by playbooks, and also manually, ad-hoc, from within an incident.",2026-04-22T17:56:00.000Z,how-to,configuration,0.65,True,"Focuses on using automation rules to create standardized incident task lists, involving specific rule configuration options tied to task creation.",updated +https://learn.microsoft.com/en-us/azure/sentinel/customer-managed-keys,Set up customer-managed keys,Set up customer-managed keys in Microsoft Sentinel,Set up customer-managed keys for Sentinel encryption,Learn how to set up customer-managed key (CMK) in Microsoft Sentinel.,This article provides background information and steps to configure a customer-managed key (CMK) for Microsoft Sentinel. All the data stored in Microsoft Sentinel is already encrypted by Microsoft in all relevant storage resources. CMK provides an extra layer of protection with an encryption key created and owned by you and stored in your Azure Key Vault.,2026-04-22T17:56:00.000Z,how-to,security,0.8,True,Provides steps and parameters for configuring CMK with Azure Key Vault for Sentinel; product-specific security configuration.,updated +https://learn.microsoft.com/en-us/azure/sentinel/customize-alert-details,Customize alert details,Customize alert details in Microsoft Sentinel,Customize Sentinel alert naming and severity,"Customize how alerts are named and described, along with their severity and assigned tactics, based on the alerts' content.","This article explains how to override the default properties of alerts with content from the underlying query results. In the process of creating a scheduled analytics rule, as the first step you define a name and description for the rule, and you assign it a severity and MITRE ATT&CK tactics. 
All alerts generated by a given rule - and all incidents created as a result - will inherit the name, description, severity, and tactics defined in the rule, without regard to the particular content of a sp",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,Describes overriding default alert properties based on query results; involves specific rule configuration fields and mapping expressions that are product-specific.,updated +https://learn.microsoft.com/en-us/azure/sentinel/customize-entity-activities,Create custom entity activities,Customize activities on Microsoft Sentinel entity timelines,Customize activities on Sentinel entity timelines,Add customized activities to those Microsoft Sentinel tracks and displays on the timeline of entity pages,Important,2026-04-22T17:56:00.000Z,how-to,configuration,0.6,True,Adding customized activities implies specifying activity definitions and mappings; these are product-specific configuration options for entity timelines.,updated +https://learn.microsoft.com/en-us/azure/sentinel/data-connection-rules-reference-azure-storage,Azure Storage Blob data connector reference,Azure Storage Blob data connector reference for the Codeless Connector Framework - Microsoft Sentinel,Configure Azure Storage Blob connector JSON in CCF,This article provides reference JSON fields and properties for creating the Azure Storage Blob data connector type and its data connection rules as part of the Codeless Connector Framework.,"To create an Azure Storage Blob data connector with the Codeless Connector Framework (CCF), use this reference in addition to the Microsoft Sentinel REST API for Data Connectors article. Each dataConnector represents a specific connection of a Microsoft Sentinel data connector. One data connector might have multiple connections, which fetch data from different endpoints. The JSON configuration built using this reference document is used to complete the deployment template for the CCF data connector. 
F",2026-04-22T17:56:00.000Z,reference,configuration,0.9,True,"Defines JSON fields and properties for Azure Storage Blob connectors, which are precise configuration references.",updated +https://learn.microsoft.com/en-us/azure/sentinel/data-connection-rules-reference-gcp,GCP data connectors API reference,GCP data connector reference for the Codeless Connector Framework - Microsoft Sentinel,Configure GCP data connector JSON for Sentinel CCF,This article provides reference JSON fields and properties for creating the GCP data connector type and its data connection rules as part of the Codeless Connector Framework.,"To create a Google Cloud Platform (GCP) data connector with the Codeless Connector Framework (CCF), use this reference as a supplement to the Microsoft Sentinel REST API for Data Connectors docs. Each dataConnector represents a specific connection of a Microsoft Sentinel data connector. One data connector might have multiple connections, which fetch data from different endpoints. The JSON configuration built using this reference document is used to complete the deployment template for the CCF data con",2026-04-22T17:56:00.000Z,reference,configuration,0.9,True,"Reference for GCP connector JSON schema and connection rules, with detailed parameter definitions.",updated +https://learn.microsoft.com/en-us/azure/sentinel/data-connector-connection-rules-reference,RestApiPoller data connectors API reference,RestApiPoller data connector reference for the Codeless Connector Framework - Microsoft Sentinel,Configure RestApiPoller data connector JSON rules,This article provides reference JSON fields and properties to create the RestApiPoller data connector type and its data connection rules for the Codeless Connector Framework.,"You can create a RestApiPoller data connector with the Codeless Connector Framework (CCF) by using this article as a supplement to the Microsoft Sentinel REST API for data connectors docs. 
Each data connector represents a specific connection of a Microsoft Sentinel data connector. One data connector might have multiple connections, which fetch data from different endpoints. You can complete the deployment template for the CCF data connector by using the JSON configuration that you build with this arti",2026-04-22T17:56:00.000Z,reference,configuration,0.9,True,"Provides reference JSON fields and properties for RestApiPoller connectors, including parameter names and constraints.",updated +https://learn.microsoft.com/en-us/azure/sentinel/data-connector-ui-definitions-reference,Data connector definitions API reference,Data connector definitions reference for the Codeless Connector Framework - Microsoft Sentinel,Define Codeless Connector Framework UIConfig JSON,This article provides a supplemental reference for creating the connectorUIConfig JSON section for the Data Connector Definitions API as part of the Codeless Connector Framework.,"To create a data connector with the Codeless Connector Framework (CCF), use this document as a supplement to the Microsoft Sentinel REST API for Data Connector Definitions reference docs. Specifically, this reference document expands on the following section: For more information, see Create a codeless connector.",2026-04-22T17:56:00.000Z,reference,configuration,0.9,True,"Reference for connectorUIConfig JSON with specific fields and allowed values, a direct configuration schema for connector UIs.",updated +https://learn.microsoft.com/en-us/azure/sentinel/data-connectors-reference,Find data connector,Find your Microsoft Sentinel data connector,,Learn about specific configuration steps for Microsoft Sentinel data connectors.,"This article lists all supported, out-of-the-box data connectors and links to each connector's deployment steps. 
Important Data connectors are available as part of the following offerings: Solutions: Many data connectors are deployed as part of a Microsoft Sentinel solution together with related content like analytics rules, workbooks, and playbooks. For more information, see the Microsoft Sentinel solutions catalog. Community connectors: More data connectors are provided by the Microsoft Sentinel co",2026-04-22T17:56:00.000Z,reference,,0.3,False,"Reference index listing connectors and links; navigation-style content without embedded limits, config tables, or troubleshooting details.",updated +https://learn.microsoft.com/en-us/azure/sentinel/data-connectors-reference,Windows security events,Find your Microsoft Sentinel data connector,,Learn about specific configuration steps for Microsoft Sentinel data connectors.,"This article lists all supported, out-of-the-box data connectors and links to each connector's deployment steps. Important Data connectors are available as part of the following offerings: Solutions: Many data connectors are deployed as part of a Microsoft Sentinel solution together with related content like analytics rules, workbooks, and playbooks. For more information, see the Microsoft Sentinel solutions catalog. 
Community connectors: More data connectors are provided by the Microsoft Sentinel co",2026-04-22T17:56:00.000Z,reference,,0.3,False,"Duplicate of index 6; acts as a catalog of connectors with links, not containing the detailed configuration or troubleshooting itself.",updated +https://learn.microsoft.com/en-us/azure/sentinel/data-source-schema-reference,Data source schema reference,Microsoft Sentinel data source schema reference,,"This article lists Azure and third-party data source schemas supported by Microsoft Sentinel, with links to their reference documentation.","This article lists supported Azure and third-party data source schemas, with links to their reference documentation.",2026-04-22T17:56:00.000Z,reference,,0.35,False,"High-level index listing data source schemas with links; the expert details live in the linked pages, not here.",updated +https://learn.microsoft.com/en-us/azure/sentinel/data-transformation,Ingestion-time data transformation,Custom data ingestion and transformation in Microsoft Sentinel,Configure custom data ingestion and transformation in Sentinel,Learn about how Azure Monitor's custom log ingestion and data transformation features can help you get any data into Microsoft Sentinel and shape it the way you want.,"Azure Monitor Logs serves as the data platform for Microsoft Sentinel. All logs ingested into Microsoft Sentinel are stored in a Log Analytics workspace, and log queries written in Kusto Query Language (KQL) are used to detect threats and monitor your network activity. Log Analytics gives you a high level of control over the data that gets ingested to your workspace with custom data ingestion and data collection rules (DCRs). 
DCRs allow you to both collect and manipulate your data before it's stored in",2026-04-22T17:56:00.000Z,conceptual,configuration,0.7,True,"Covers custom ingestion and DCR-based transformations; these docs usually list DCR schema fields, transformation rules, and config options that are product-specific configuration knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/data-type-cloud-support,Support for data types in different clouds,Support for Microsoft Sentinel connector data types in different clouds,Check Sentinel connector data type support by cloud,This article describes the types of clouds that affect data streaming from the different connectors that Microsoft Sentinel supports.,"Microsoft Sentinel data connectors use data stored in various cloud environments, like the Microsoft 365 Commercial cloud or the Government Community Cloud (GCC). This article describes the types of clouds that affect the supported data types for the different connectors that Microsoft Sentinel supports. 
Specifically, support varies for different Microsoft Defender XDR connector data types in different GCC environments.",2026-04-22T17:56:00.000Z,concept-article,decision-making,0.7,True,Describes which connector data types are supported in which cloud environments (commercial vs GCC variants); this is product-specific capability matrix for planning.,updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/asset-data-tables,Asset data tables in Microsoft Sentinel data lake,Asset data tables in Microsoft Sentinel data lake - Microsoft Security,Reference asset data table mappings in Sentinel data lake,Asset data tables in security data lake,The following table mappings are available in the Microsoft Sentinel data lake for asset data.,2026-04-22T17:56:00.000Z,reference,configuration,0.8,True,"Provides table mappings for asset data; likely includes column names, types, and semantics that are product-specific configuration/schema details.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/auditing-lake-activities,Audit Microsoft Sentinel data lake and graph in Microsoft Purview portal,Audit log for Microsoft Sentinel data lake and graph in Microsoft Purview portal,Use audit logs for Sentinel data lake and graph activities,Learn how to use the audit log to search for Microsoft Sentinel data lake activities to help with investigation.,"The audit log helps you investigate specific activities across Microsoft services. Microsoft Sentinel data lake and graph activities are audited and can be searched in the audit log. The audit log provides a record of activities that are performed by users and administrators in Microsoft Sentinel data lake and graph, such as: Auditing is automatically turned on for Microsoft Sentinel data lake and graph. 
Features that are audited are logged in the audit log automatically.",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,Explains how Sentinel data lake/graph activities appear in the audit log and how to search them; product-specific audited operation details.,updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/create-custom-graphs,Create custom graphs,Get started with custom graphs in Microsoft Sentinel (preview),,"Learn how to create and manage custom graphs in Microsoft Sentinel to model attack patterns, investigate threats, and run advanced graph algorithms.","Custom graphs in Microsoft Sentinel enable security researchers and analysts to create tailored graph representations of their security data. By building custom graphs, you can model specific attack patterns, investigate threats, and run advanced graph algorithms to uncover hidden relationships within your digital environment. This guide walks you through the steps to create and manage custom graphs by using Jupyter notebooks in the Microsoft Sentinel Visual Studio Code extension. This article f",2026-04-22T17:56:00.000Z,how-to,,0.4,False,"How-to/tutorial for creating custom graphs in notebooks; likely step-based without detailed config tables, limits, or error mappings.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/create-graphs-with-ai,Create custom graph using AI,AI-assisted custom graph authoring in Microsoft Sentinel (preview) - Microsoft Security,,"Use AI assistance in Visual Studio Code to create, modify, and query custom security graphs using Jupyter notebooks and GitHub Copilot.","Use GitHub Copilot in Visual Studio Code with Microsoft Sentinel to create, modify, and query custom security graphs using Jupyter notebooks. Describe what you want to build in natural language, review the generated notebook, and refine it as needed. 
Use Copilot for various graph authoring tasks, including:",2026-04-22T17:56:00.000Z,how-to,,0.3,False,AI-assisted authoring workflow using Copilot; likely conceptual and tutorial-like without deep product-specific configs or limits.,updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/custom-graphs-overview,Custom graphs overview,Custom graphs in Microsoft Sentinel-Overview (preview),,An overview of custom graphs in Microsoft Sentinel,"Custom graphs let you build tailored security graphs tuned to your unique security scenarios using data from Sentinel data lake as well as non-Microsoft sources. With custom graph, powered by Fabric, you can build, query, and visualize connected data, uncover hidden patterns and attack paths, and help surface risks that are hard to detect when data is analyzed in isolation. These graphs provide the knowledge context that enables AI-powered agent experiences to work more effectively, speeding inv",2026-04-22T17:56:00.000Z,how-to,,0.4,False,"Overview of custom graphs; high-level description of capabilities and scenarios without clear evidence of detailed configuration parameters, limits, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/data-federation-overview,Overview,Data federation overview in Microsoft Sentinel data lake - Microsoft Security,,"Learn how data federation in Microsoft Sentinel data lake enables seamless querying of external data sources including Azure Databricks, ADLS Gen 2, and Microsoft Fabric.","Data federation in Microsoft Sentinel enables seamless querying of multiple external data sources from within the Microsoft Sentinel data lake environment. 
By federating data sources such as Azure Databricks, Azure Data Lake Storage (ADLS) Gen 2, and Microsoft Fabric, organizations can enhance their security analytics and operational insights without moving or duplicating data.",2026-04-22T17:56:00.000Z,concept-article,,0.35,False,"Conceptual overview of data federation; describes what it is and benefits, but no clear indication of detailed configuration parameters or limits.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/data-federation-setup,Set up federated tables,Set up federated data connectors in Microsoft Sentinel data lake - Microsoft Security,Configure federated data connectors in Sentinel data lake,"Learn how to configure federated data connectors for Azure Databricks, ADLS Gen 2, and Microsoft Fabric in Microsoft Sentinel data lake.","This article explains how to configure federated data connectors to enable querying of external data sources from the Microsoft Sentinel data lake. You can federate with Azure Databricks, Azure Data Lake Storage (ADLS) Gen 2, and Microsoft Fabric.",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,"Explains how to configure federated connectors for Databricks, ADLS Gen2, and Fabric; likely includes connector settings and parameters, which are product-specific configuration details.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/enable-data-connectors,Asset data in the Sentinel data lake,Asset data in Microsoft Sentinel data lake - Microsoft Security,,Asset data in security data lake,"Asset data in cybersecurity refers to an organization’s physical and digital entities such as computers, identities, software, cloud services, and networks. It shows what exists so you know what must be protected. Microsoft Sentinel’s data lake adds powerful value by storing this asset data in a scalable, cost-efficient way that supports long-term retention, advanced analytics, and AI-driven threat detection. 
With unified visibility across systems and flexible data management, Sentinel lake help",2026-04-22T17:56:00.000Z,concept-article,,0.2,False,"Conceptual description of asset data value in Sentinel data lake; no concrete configuration parameters, limits, or troubleshooting content indicated.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/gql-reference-for-sentinel-custom-graph,GQL reference for Sentinel custom graph,Graph Query Language (GQL) reference for Microsoft Sentinel graph (Preview),Query Sentinel graphs with GQL syntax reference,"Learn the fundamental concepts, functions, and operators of Graph Query Language (GQL) for querying graph data in Microsoft Sentinel graph.","Applies to: Microsoft Sentinel Graph Note GQL support is in preview. Features and syntax can change based on feedback and ongoing development. This reference covers the fundamental concepts, functions, and operators of Graph Query Language (GQL). Graph Query Language (GQL) is built on mathematical graph theory concepts that provide a solid foundation for querying graph data. Understanding these fundamentals helps you write more effective queries and better understand how GQL processes your data.",2026-04-22T17:56:00.000Z,reference,integrations,0.7,True,"Language reference for GQL with operators and functions specific to Microsoft Sentinel graph; contains API-like syntax and semantics, fitting integrations & coding patterns.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/graph-rest-api,Graph REST API,Graph REST APIs for custom graphs (preview) - Microsoft Security,Call Sentinel custom graph REST APIs programmatically,Learn how to use the Graph REST APIs to list and query custom graphs in the Microsoft Sentinel data lake.,"The Graph REST APIs let you list and query custom graphs in your Microsoft Sentinel data lake. Use these APIs to programmatically interact with your custom graphs from any HTTP client, automation pipeline, or custom application. 
For more information on creating custom graphs, see Create custom graphs in the security data lake.",2026-04-22T17:56:00.000Z,reference,integrations,0.8,True,"REST API usage for listing and querying custom graphs; likely includes endpoints, parameters, and request/response schemas unique to this product.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/graph-visualization,Graph visualization,Visualize custom graphs in Microsoft Sentinel graph (preview),,"Learn how to use Microsoft Sentinel graph to query, visualize, and interact with custom security graphs to gain new security insights.","The graphs experience in the Microsoft Defender portal enables you to perform interactive graph-based investigations on your custom graphs, such as using a graph built for phishing analysis to help you quickly evaluate the impact of a recent incident, profile the attacker, and trace its paths across Microsoft telemetry and third-party data. This experience allows you to run graph queries to visualize the insights that matter most to your organization and supports ad hoc traversal of the graph so
Security analysts can use the graph to identify lateral movement paths, which are the potential routes an attacker could take to move from one identity or resource to another by exploiting existing permissions, group memberships, or trust relationships, often to escalate privileges or reach sensitive assets. The predefined identity attack graph represent",2026-04-22T17:56:00.000Z,overview,best-practices,0.65,True,"Provides product-specific guidance on interpreting identity attack graphs, lateral movement paths, and privilege escalation risks; these are Sentinel-specific investigative best practices.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-jobs,Aggregate insights from raw data into an Auxiliary table,Create jobs in the Microsoft Sentinel data lake - Microsoft Security,Create and schedule KQL jobs in Sentinel data lake,Use the Defender portal's Data lake exploration KQL queries to create and schedule jobs to promote data to the analytics tier.,"KQL jobs are one-time or scheduled KQL queries on data in the Microsoft Sentinel data lake and federated tables. Use jobs for investigative and analytical scenarios, such as: KQL jobs are especially effective when queries use joins or unions across different datasets. Use jobs to promote data from the data lake tier to the analytics tier. Once in the analytics tier, use the advanced hunting KQL editor to query the data. 
Promoting data to the analytics tier has the following benefits: Note Storag",2026-04-22T17:56:00.000Z,how-to,configuration,0.65,True,Covers creating/scheduling KQL jobs and promoting data between tiers; involves job configuration parameters and tier behaviors unique to Sentinel data lake.,updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-jobs,Create KQL jobs,Create jobs in the Microsoft Sentinel data lake - Microsoft Security,Use KQL jobs to promote Sentinel data,Use the Defender portal's Data lake exploration KQL queries to create and schedule jobs to promote data to the analytics tier.,"KQL jobs are one-time or scheduled KQL queries on data in the Microsoft Sentinel data lake and federated tables. Use jobs for investigative and analytical scenarios, such as: KQL jobs are especially effective when queries use joins or unions across different datasets. Use jobs to promote data from the data lake tier to the analytics tier. Once in the analytics tier, use the advanced hunting KQL editor to query the data. Promoting data to the analytics tier has the following benefits: Note Storag",2026-04-22T17:56:00.000Z,how-to,decision-making,0.62,True,"Explains when to use KQL jobs, benefits, and scenarios vs direct queries; likely includes guidance on when to promote data to analytics tier with some quantified trade-offs.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-jobs-summary-rules-search-jobs,"Compare KQL jobs, summary rules, and search jobs","KQL jobs, summary rules, and search jobs - Microsoft Security","Choose between KQL jobs, summary rules, and search jobs","A comparison of KQL jobs, summary rules, and search jobs in Microsoft Sentinel to choose the best tool for querying and analyzing security data.","This article compares KQL jobs, summary rules, and search jobs in Microsoft Sentinel. These features let you query and analyze data in Microsoft Sentinel, and each serves different purposes and use cases. 
Note KQL jobs require onboarding to the Microsoft Sentinel data lake. For more information, see Onboard to the Microsoft Sentinel data lake. KQL jobs: Run one-time or scheduled asynchronous queries on data stored in the Microsoft Sentinel data lake. KQL jobs are best for incident investigations ",2026-04-22T17:56:00.000Z,how-to,decision-making,0.85,True,Explicit comparison of three features with use cases; likely includes decision criteria and possibly thresholds for selecting the right mechanism.,updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-manage-jobs,Manage KQL jobs,Manage KQL jobs - Microsoft Security,Manage Sentinel data lake KQL jobs in portal,Managing KQL jobs in the Defender portal for Microsoft Sentinel data lake,"A KQL job is a one-time or scheduled task that runs a KQL (Kusto Query Language) query against the data in the data lake tier to promote the results to the analytics tier. Jobs can be created in the KQL queries editor, or the Jobs page under Microsoft Sentinel > Data lake exploration in the Microsoft Defender portal. For more information, see KQL jobs. The Jobs management page provides the following functions:",2026-04-22T17:56:00.000Z,concept-article,configuration,0.7,True,"Jobs management page functions; expected to list job properties, statuses, and configuration options for managing scheduled and one-time jobs.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-overview,Overview,KQL and the Microsoft Sentinel data lake - Microsoft Security,,Exploring and interacting with the Microsoft Sentinel data lake using KQL,"With Microsoft Sentinel data lake, you can store and analyze high-volume, low-fidelity logs like firewall or DNS data, asset inventories, and historical records for up to 12 years. Because storage and compute are decoupled, you can query the same copy of data using multiple tools, without moving or duplicating it. 
You can explore data in the data lake using Kusto Query Language (KQL) and Jupyter Notebooks, to support a wide range of scenarios, from threat hunting and investigations to enrichment",2026-04-22T17:56:00.000Z,concept-article,,0.45,False,"KQL overview for the data lake; conceptual and scenario-focused, not a detailed config, limits, or troubleshooting reference.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-queries,Run KQL queries,Run KQL queries against the Microsoft Sentinel data lake - Microsoft Security,Run and manage KQL queries in Sentinel data lake,"Use the Defender portal's Data lake exploration KQL queries to query and interact with the Microsoft Sentinel data lake. Create, edit, and run KQL queries to explore your data lake resources","Data lake exploration in the Microsoft Defender portal provides a unified interface to analyze your data lake. It lets you run KQL (Kusto Query Language) queries, create jobs, and manage them. The KQL queries page under Data lake exploration lets you edit and run KQL queries on data lake resources and federated tables. Create jobs to promote data from the data lake to the analytics tier, or create aggregate tables in the data lake tier. Run jobs on demand or schedule them. The Jobs page lets you manag",2026-04-22T17:56:00.000Z,how-to,configuration,0.65,True,"Describes KQL queries page, job creation, and management; likely includes specific options, parameters, and scheduling settings for jobs.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-queries-api,KQL using the API,Run KQL queries on Microsoft Sentinel data lake using APIs - Microsoft Security,Call Sentinel data lake KQL APIs via REST,"Learn how to run KQL queries against the Microsoft Sentinel data lake programmatically using REST APIs. Enable automation, intelligent agents, and scalable analytics.","Microsoft Sentinel data lake supports running Kusto Query Language (KQL) queries programmatically by using REST APIs. 
This enables security teams and automation systems to retrieve analytical results without using the Azure portal or interactive query editors. +This article explains when to use the API, required permissions, and how to submit a basic query request.",2026-04-22T17:56:00.000Z,how-to,integrations,0.78,True,"Programmatic KQL queries via REST; expected to document endpoints, request bodies, permissions, and parameters unique to this API.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-sample-queries,Sample KQL queries,Sample KQL queries for Microsoft Sentinel data lake - Microsoft Security,,Use KQL queries to explore and analyze data in the Microsoft Sentinel data lake.,This article provides sample KQL queries that you can use interactively or in KQL jobs to investigate security incidents and monitor for suspicious activity in the Microsoft Sentinel data lake.,2026-04-22T17:56:00.000Z,how-to,,0.55,False,"Sample KQL queries; while useful, they are generic query examples rather than configuration, limits, or troubleshooting knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-troubleshoot,Troubleshoot KQL for the lake,Troubleshoot KQL queries for the data lake - Microsoft Security,Troubleshoot KQL queries and jobs in Sentinel data lake,Troubleshoot KQL queries for the Microsoft Sentinel data lake.,"Use the following checklist to resolve common issues when working with KQL (Kusto Query Language) queries and jobs in Microsoft Sentinel data lake. Check for prerequisites before running queries or jobs. For more information, see Roles and permissions for the Microsoft Sentinel data lake. Ensure that you selected the correct workspaces before executing KQL queries or jobs. Confirm that all referenced tables and workspaces exist and are accessible. 
Use only supported KQL operators and commands to ",2026-04-22T17:56:00.000Z,how-to,troubleshooting,0.82,True,"Checklist for resolving KQL query and job issues; includes specific causes (permissions, workspaces, operators) and corrective actions.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/notebook-examples,Notebook examples for data lake exploration,Notebook examples for querying the Microsoft Sentinel data lake - Microsoft Security,Sample notebook code for querying Sentinel data lake,"This article provides sample code snippets for querying the Microsoft Sentinel data lake using Jupyter notebooks, demonstrating how to access and analyze security data.","This article presents some sample code snippets that demonstrate how to interact with Microsoft Sentinel lake data using Jupyter notebooks to analyze security data in the Microsoft Sentinel data lake. These examples illustrate how to access and analyze data from various tables, such as Microsoft Entra ID sign-in logs, group information, and device network events. The code snippets are designed to run in Jupyter notebooks within Visual Studio Code using the Microsoft Sentinel extension. To run th",2026-04-22T17:56:00.000Z,how-to,integrations,0.7,True,Provides concrete PySpark/Python code snippets using Sentinel extension APIs and table schemas; product-specific coding patterns for data access.,updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/notebook-jobs,Create and manage notebook jobs,Create and manage Jupyter notebook jobs - Microsoft Security,Schedule and manage Sentinel notebook jobs in VS Code,This article describes how to explore and interact with lake data using Spark notebooks in Visual Studio Code.,"You can create scheduled jobs to run at specific times or intervals using the Microsoft Sentinel extension for Visual Studio Code. 
Jobs allow you to automate data processing tasks to summarize, transform, or analyze data in the Microsoft Sentinel data lake and federated tables. Jobs are also used to process data and write results to custom tables in the lake tier or analytics tier.",2026-04-22T17:56:00.000Z,how-to,configuration,0.65,True,"Describes creating scheduled Spark notebook jobs; likely includes job configuration options, schedules, and output table settings.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/notebooks,Run notebooks,Running notebooks on the Microsoft Sentinel data lake - Microsoft Security,,This article describes how to explore and interact with data lake data using Jupyter notebooks in Visual Studio Code.,"Jupyter notebooks provide an interactive environment for exploring, analyzing, and visualizing data in the Microsoft Sentinel data lake and federated tables. With notebooks, you can write and execute code, document your workflow, and view results—all in one place. This makes it easy to perform data exploration, build advanced analytics solutions, and share insights with others. By leveraging Python and Apache Spark within Visual Studio Code, notebooks help you transform raw security data into ac",2026-04-22T17:56:00.000Z,how-to,,0.45,False,How-to article for running notebooks; likely step-based without deep configuration matrices or limits.,updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/notebooks-overview,Overview,Exploring and interacting with lake data using Jupyter Notebooks - Microsoft Security,,This article gives an overview of Jupyter notebooks in Visual Studio Code for the Microsoft Sentinel data lake.,"Jupyter notebooks are an integral part of the Microsoft Sentinel data lake ecosystem, offering powerful tools for data analysis and visualization. The notebooks are provided by the Microsoft Sentinel Visual Studio Code extension that allows you to interact with the data lake using Python for Spark (PySpark). 
Notebooks enable you to perform complex data transformations, run machine learning models, and create visualizations directly within the notebook environment. The Microsoft Sentinel Visual S",2026-04-22T17:56:00.000Z,overview,,0.45,False,Overview of notebooks and capabilities; mostly conceptual without detailed parameter tables or limits.,updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-graph-overview,Microsoft Sentinel graph overview,What is Microsoft Sentinel graph? - Microsoft Security,,"Learn how Microsoft Sentinel graph enables multi-modal security analytics through graph-based representation of security data, providing deep insights into digital environments and attack paths.","Microsoft Sentinel graph is a unified graph analytics capability within Microsoft Sentinel that powers graph-based experiences across security, compliance, identity, and the Microsoft Security ecosystem - empowering security teams to model, analyze, and visualize complex relationships across their digital estate. 
Unlike traditional tabular data approaches, Sentinel graph enables defenders and AI agents to reason over interconnected assets, identities, activities, and threat intelligence, unlocki",2026-04-22T17:56:00.000Z,overview,,0.3,False,"Overview of Sentinel graph capabilities; conceptual explanation of graph analytics without detailed config, limits, or troubleshooting content.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-graph-provider-reference,Microsoft Sentinel graph provider reference,Microsoft Sentinel graph provider reference,Use sentinel_graph API to build security graphs,Reference documentation for the Microsoft Sentinel Graph Builder API for building and querying security graphs.,"The sentinel_graph class provides a way to interact with the Microsoft Sentinel graph, allowing you to define your graph schema, transform data from the Microsoft Sentinel data lake into nodes and edges, publish a graph, query graph, and run advanced graph algorithms. This class is designed to work with the Spark sessions in Jupyter notebooks running on Microsoft Sentinel spark compute.",2026-04-22T17:56:00.000Z,reference,integrations,0.78,True,"Reference for the sentinel_graph class with product-specific methods and parameters for schema definition, transforms, publishing, and querying; fits integrations & coding patterns.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-connectors,Set up connectors for the Microsoft Sentinel data lake,Set up connectors for the Microsoft Sentinel data lake - Microsoft Security,,Set up and configuring connectors for Microsoft Sentinel data lake.,"The Microsoft Sentinel data lake mirrors data from Microsoft Sentinel workspaces. When you onboard to Microsoft Sentinel data lake, your existing Microsoft Sentinel data connectors are configured to send data to both the analytics tier - your Microsoft Sentinel workspaces, and mirror the data to the data lake tier for longer term storage. 
After onboarding, configure your connectors to retain data in each tier according to your requirements. This article explains how to set up connectors for the ",2026-04-22T17:56:00.000Z,how-to,,0.4,False,"Connector setup/retention guidance but summary shows no numeric limits, config tables, or product-specific error codes; likely a procedural how-to without expert-only details.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-onboard-defender,Onboard to data lake and graph from Defender portal,Onboarding to Microsoft Sentinel data lake from the Defender portal - Microsoft Security,Onboard Sentinel data lake from Microsoft Defender portal,This article describes how to onboard to the Microsoft Sentinel data lake for customers who are currently using Microsoft Defender.,"Onboarding your tenant to the Microsoft Sentinel data lake occurs once and starts from the Microsoft Defender portal. The onboarding process creates a new Microsoft Sentinel data lake for your tenant in the subscription specified during the onboarding process. Graph enablement is included as part of onboarding. If you had onboarded to the data lake during public preview, you're automatically upgraded to the generally available data lake and graph. 
Note You'll always have one data lake that you c",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,Describes onboarding flow from Defender portal with subscription selection and tenant-level constraints; product-specific configuration process.,updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-onboarding,Microsoft Sentinel platform deployment overview,Onboarding to Microsoft Sentinel data lake and graph - Microsoft Security,Onboard to Microsoft Sentinel data lake and graph,This article describes how to onboard to the Microsoft Sentinel data lake and graph,"The Microsoft Sentinel data lake is a tenant-wide repository for collecting, storing, and managing large volumes of security-related data from various sources. It enables comprehensive, unified analysis and visibility across your security landscape. Microsoft Sentinel graph is a unified graph capability within the Microsoft Sentinel platform powering graph-based experiences across security, compliance, identity, and the entire ecosystem. These solutions use advanced analytics, machine learning, graphs, ",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,Onboarding to Sentinel data lake/graph includes tenant-wide configuration steps and parameters unique to this platform capability.,updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-overview,Microsoft Sentinel data lake overview,Microsoft Sentinel data lake overview - Microsoft Security,,"An overview of Microsoft Sentinel data lake, a cloud-native platform that extends Microsoft Sentinel with highly scalable, cost-effective long-term storage, advanced analytics, and AI-driven security ","Microsoft Sentinel data lake is a purpose-built, cloud-native security data lake that transforms how organizations manage and analyze security data. Designed as a true data lake, it ingests, stores, and analyzes large volumes of diverse security data at scale. 
By centralizing security data into a single, open-format, extensible platform, it provides deep visibility, long-term retention, and advanced analytics. The data lake lets you bring all your security data into Microsoft Sentinel cost-effec",2026-04-22T17:56:00.000Z,concept-article,,0.3,False,"Overview of Sentinel data lake; conceptual description of capabilities without evidence of detailed limits, config tables, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-service-limits,Microsoft Sentinel data lake service limits,Microsoft Sentinel data lake service limits - Microsoft Security,Service limits and quotas for Sentinel data lake,Service limits for the Microsoft Sentinel data lake service.,The following service parameters and limits apply to the Microsoft Sentinel data lake service.,2026-04-22T17:56:00.000Z,concept-article,limits-quotas,0.95,True,"Explicit service limits article; will list numerical limits, quotas, and constraints for the data lake service.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-agent-creation-tool,Agent creation,Agent creation tool collection in Microsoft Sentinel MCP server - Microsoft Security,Use Sentinel MCP agent creation tool collection,Learn about the different tools available in the Agent creation collection in Microsoft Sentinel,The agent creation tool collection in the Microsoft Sentinel Model Context Protocol (MCP) server lets you create effective Microsoft Security Copilot agents.,2026-04-22T17:56:00.000Z,how-to,integrations,0.63,True,Tool collection reference for creating Security Copilot agents; expected to list tool APIs and parameters specific to agent creation.,updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-billing,"Billing, limits, and availability","Microsoft Sentinel MCP server pricing, limits, and availability - Microsoft Security","Sentinel MCP server pricing, limits, and 
availability","Learn about the pricing, limits, and availability of using the different MCP collection of tools in Microsoft Sentinel","Important Some information relates to a prerelease product that may be substantially modified before it's released. Microsoft makes no warranties, expressed or implied, with respect to the information provided here. This article provides information on pricing, limits, and availability when setting up and using Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools.",2026-04-22T17:56:00.000Z,concept-article,limits-quotas,0.88,True,"Pricing and limits article; expected to list concrete numerical limits, quotas, and availability constraints for MCP tool usage.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-chatgpt-claude-connector,Use MCP connector in ChatGPT or Claude,Use the Microsoft Sentinel MCP connector in ChatGPT or Claude - Microsoft Security,Enable Sentinel MCP connector in ChatGPT or Claude,Learn how to turn on and use a custom Microsoft Sentinel's Model Context Protocol (MCP) connector in ChatGPT or Claude,"Important This information relates to a prerelease product that may be substantially modified before it's released. Microsoft makes no warranties, expressed or implied, with respect to the information provided here. This article shows you how to enable and use a custom Microsoft Sentinel Model Context Protocol (MCP) connector in ChatGPT by OpenAI or Claude by Anthropic. 
By using this approach, Security Operations Center (SOC) analysts can run security tasks by using Microsoft Sentinel MCP.",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,"Connector setup article; likely includes endpoint URLs, auth settings, and configuration parameters for ChatGPT/Claude integration.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-create-custom-tool,Create your own custom tool,Create and use custom Microsoft Sentinel MCP tools - Microsoft Security,Create and configure custom Sentinel MCP tools,Learn how to set up and use custom Microsoft Sentinel Model Context Protocol (MCP) tools using saved KQL queries in advanced hunting,"Important This information relates to a prerelease product that may be substantially modified before it's released. Microsoft makes no warranties, expressed or implied, with respect to the information provided here. Security agents built with Microsoft Sentinel's collection of Model Context Protocol (MCP) tools can effectively reason over data in Microsoft Sentinel. You can create custom Sentinel MCP tools to have granular control over the data accessible to your security agents and create deter",2026-04-22T17:56:00.000Z,get-started,configuration,0.72,True,"How-to for defining custom tools from saved KQL queries; expected to include tool schema, parameter definitions, and configuration constraints.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-data-exploration-tool,Data exploration,Data exploration tool collection in Microsoft Sentinel MCP server - Microsoft Security,Use Sentinel MCP data exploration tools for queries,Learn about the different tools available in the Data exploration collection in Microsoft Sentinel,"Important Some information relates to a prerelease product that may be substantially modified before it's released. Microsoft makes no warranties, expressed or implied, with respect to the information provided here. 
The data exploration tool collection in the Microsoft Sentinel Model Context Protocol (MCP) server lets you search for relevant tables and retrieve data from Microsoft Sentinel's data lake by using natural language.",2026-04-22T17:56:00.000Z,how-to,integrations,0.65,True,"Describes specific tools in the data exploration collection; likely includes tool names, parameters, and behaviors for querying tables and retrieving data.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-get-started,Overview,Get started with Microsoft Sentinel MCP server - Microsoft Security,Configure Microsoft Sentinel MCP server tools,Learn how to set up and use Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools to enable natural language queries and AI-powered security investigations,This article shows you how to set up and use Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools to enable natural language queries against your security data. Sentinel's support for MCP enables security teams to bring AI into their security operations by allowing AI models to access security data in a standard way. Sentinel's collection of security tools works with multiple clients and automation platforms. 
You can use these tools to search for relevant tables and retri,2026-04-22T17:56:00.000Z,get-started,configuration,0.7,True,"Get-started setup article for MCP server likely includes concrete configuration parameters, connection settings, and environment details specific to Sentinel MCP.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-logic-apps,Build logic apps with Microsoft Sentinel MCP,Build Azure Logic Apps with Microsoft Sentinel MCP tools - Microsoft Security,Integrate Sentinel MCP tools with Azure Logic Apps,Learn how to set up an Azure Logic App using Microsoft Sentinel's collection of Model Context Protocol (MCP) tools,"Important This information relates to a prerelease product that may be substantially modified before it's released. Microsoft makes no warranties, expressed or implied, with respect to the information provided here. You can access the value of Microsoft Sentinel's collection of Model Context Protocol (MCP) tools in Azure Logic Apps, starting with the entity analyzer tool. Security analysts and automation engineers often spend significant time creating complex Security Orchestration, Automation, an",2026-04-22T17:56:00.000Z,how-to,integrations,0.7,True,"Shows how to call MCP tools from Logic Apps; likely includes connector actions, parameters, and workflow configuration specific to this integration.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-overview,Microsoft Sentinel MCP server overview,What is Microsoft Sentinel’s support for MCP? - Microsoft Security,,Learn how Model Context Protocol (MCP),"Microsoft Sentinel, our security platform, introduces support for Model Context Protocol (MCP). This support includes multiple scenario-focused collections of security tools through a unified server interface. With this support, you can interactively query security data in natural language and build effective security agents that can perform complex automation. 
Our collection of security tools helps security teams bring AI into their daily security operations to assist with common tasks like dat",2026-04-22T17:56:00.000Z,overview,,0.35,False,"High-level overview of MCP support and scenarios; not focused on limits, configs, or troubleshooting details.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-responsible-ai-faq,Responsible AI FAQs,Responsible AI FAQs for Microsoft Sentinel MCP server - Microsoft Security,,Learn about how Microsoft Responsible AI Standard guides the development and use of Microsoft Sentinel's collection of Model Context Protocol (MCP) tools,"At Microsoft, we recognize the importance of regulatory compliance as a cornerstone of trust and reliability in AI technologies. We're committed to creating responsible AI by design. Our goal is to develop and deploy AI that has a beneficial impact on and earns trust from society. A core set of principles guides our work: fairness, reliability and safety, privacy and security, inclusiveness, transparency, and accountability. Microsoft Sentinel MCP server is being developed in accordance with our ",2026-04-22T17:56:00.000Z,concept-article,,0.3,False,Responsible AI FAQ and principles; mostly policy and conceptual guidance without product-specific configuration or limits.,updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-tools-overview,Overview,What is Microsoft Sentinel MCP server's tool collection? - Microsoft Security,,Learn about the different MCP collection of tools in Microsoft Sentinel,"Microsoft Sentinel’s Model Context Protocol (MCP) Server collections are logical groupings of related security-focused MCP tools that you can use in any compatible client to: Our collections are scenario-focused and have security-optimized descriptions that help AI models pick the right tools and deliver those outcomes. 
For example, you can use the following sample prompts to get the appropriate tool:",2026-04-22T17:56:00.000Z,article,,0.4,False,Overview of MCP tool collections and example prompts; more conceptual and scenario-focused than configuration- or limits-focused.,updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-triage-tool,Triage,Triage tool collection in Microsoft Sentinel MCP server - Microsoft Security,Use Sentinel MCP triage tools for incidents,Learn about the different tools available in the triage collection,"Important This information relates to a prerelease product that may be substantially modified before it's released. Microsoft makes no warranties, expressed or implied, with respect to the information provided here. The triage collection in the Microsoft Sentinel Model Context Protocol (MCP) server integrates your AI models with APIs that support incident triage and hunting. This integration lets you prioritize incidents quickly and hunt over your own data easily, reducing mean time to resolutio",2026-04-22T17:56:00.000Z,how-to,integrations,0.63,True,Triage tool collection integrates AI models with incident triage and hunting APIs; likely documents tool operations and parameters.,updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-azure-ai-foundry,Use Sentinel MCP tools in Microsoft Foundry,Use a Microsoft Sentinel MCP tool in Microsoft Foundry - Microsoft Security,Configure Sentinel MCP tools in Azure AI Foundry,Learn how to use Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools or your own custom tool in Microsoft Foundry,"Important This information relates to a prerelease product that may be substantially modified before it's released. Microsoft makes no warranties, expressed or implied, with respect to the information provided here. 
This article shows you how to add Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools or your own custom tools to your AI agents in Microsoft Foundry. For information about how to get started with MCP tools, see the following articles:",2026-04-22T17:56:00.000Z,how-to,configuration,0.68,True,Shows how to add Sentinel MCP tools or custom tools in Foundry; likely includes product-specific configuration fields and values.,updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-copilot-studio,Use Sentinel MCP tools in Copilot Studio,Use a Microsoft Sentinel MCP tool in Microsoft Copilot Studio - Microsoft Security,Configure Sentinel MCP tools in Copilot Studio,Learn how to add Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools or your own custom tool in Microsoft Copilot Studio,"Important This information relates to a prerelease product that may be substantially modified before it's released. Microsoft makes no warranties, expressed or implied, with respect to the information provided here. This article shows you how to add Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools or your own custom tools to your AI agents in Microsoft Copilot Studio. 
For information about how to get started with MCP tools, see the following articles: Tip For the best ",2026-04-22T17:56:00.000Z,how-to,configuration,0.68,True,Covers wiring Sentinel MCP tools into Copilot Studio; expected to list concrete configuration options and parameters for tool integration.,updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-security-copilot,Use Sentinel MCP tools in Security Copilot,Use a Microsoft Sentinel MCP tool in Microsoft Security Copilot - Microsoft Security,Add Sentinel MCP tools to Security Copilot,Learn how to add Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools or your own custom tool in Microsoft Security Copilot,"This article shows you how to add Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools or your own custom tools to your AI agents in Microsoft Security Copilot. For information about how to get started with MCP tools, see the following articles:",2026-04-22T17:56:00.000Z,how-to,configuration,0.68,True,Describes adding Sentinel MCP tools or custom tools into Security Copilot; likely includes specific configuration fields and values for tool registration.,updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-visual-studio-code,Use Sentinel MCP tools in Visual Studio Code,Use a Microsoft Sentinel MCP tool in Visual Studio Code - Microsoft Security,Use Sentinel MCP tools in Visual Studio Code,Learn how to use Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools or your own custom tool in Visual Studio Code,"This article shows you how to add Microsoft Sentinel's Model Context Protocol (MCP) collection of security tools or your own custom tools to your AI agents in Visual Studio Code. 
For information about how to get started with MCP tools, see the following articles:",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,"Explains adding MCP tools to AI agents in VS Code; expected to contain specific settings, identifiers, and configuration steps unique to this integration.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-provider-class-reference,Microsoft Sentinel provider class reference,Microsoft Sentinel data lake Microsoft Sentinel Provider class reference,Use MicrosoftSentinelProvider class with Spark notebooks,"Reference documentation for the Microsoft Sentinel Provider class, which allows you to connect to the Microsoft Sentinel data lake and perform various operations.","The MicrosoftSentinelProvider class provides a way to interact with the Microsoft Sentinel data lake, allowing you to perform operations such as listing databases, reading tables, and saving data. This class is designed to work with the Spark sessions in Jupyter notebooks and provides methods to access and manipulate data stored in the Microsoft Sentinel data lake. This class is part of the sentinel.datalake module and provides methods to interact with the data lake. To use this class, import it and",2026-04-22T17:56:00.000Z,reference,integrations,0.8,True,"Class reference with methods to list databases, read tables, and save data; includes method signatures and parameters unique to this provider.",updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/troubleshoot-sentinel-mcp,Troubleshooting,Best practices and troubleshooting for Microsoft Sentinel MCP tool collection - Microsoft Security,Best practices and troubleshooting for Sentinel MCP tools,Learn about the best practices for using Microsoft Sentinel's collection of MCP tools and how to troubleshoot them,This article outlines best practices for using Microsoft Sentinel's collection of Model Context Protocol (MCP) tools. 
It also provides steps you can take to troubleshoot common issues you might experience while using them.,2026-04-22T17:56:00.000Z,how-to,troubleshooting,0.8,True,Explicitly combines best practices and troubleshooting; likely includes symptom→cause→solution mappings and product-specific gotchas for MCP tools.,updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/using-data-federation,Using federated tables,Use federated data sources in Microsoft Sentinel - Microsoft Security,,"Learn how to view, query, and work with federated data sources in Microsoft Sentinel data lake using the portal, KQL queries, and Jupyter notebooks.","After setting up federated data connectors, you can access your federated tables through multiple interfaces in Microsoft Sentinel. Federated tables are used in the same way as other data lake tables. This article explains how to view federated tables, query them using KQL (Kusto Query Language), and work with them in Jupyter notebooks.",2026-04-22T17:56:00.000Z,how-to,,0.45,False,Shows how to view and query federated tables using KQL and notebooks; primarily usage examples rather than configuration matrices or troubleshooting.,updated +https://learn.microsoft.com/en-us/azure/sentinel/datalake/workbooks-for-data-lake,Workbooks for Microsoft Sentinel data lake,Workbooks for Microsoft Sentinel Data Lake - Microsoft Security,Configure Sentinel workbooks for data lake queries,Learn how to create and use Microsoft Sentinel workbooks with data from the Microsoft Sentinel data lake to visualize and monitor security data.,"Running Microsoft Sentinel workbooks on top of Microsoft Sentinel data lake data allows SOC teams to visualize and monitor security data directly from the lake using KQL (Kusto Query Language), without duplicating or transforming data. By selecting Sentinel data lake as the data source in a workbook, analysts can run the same analytical queries used for investigations and hunting. 
They can render them as interactive charts and tables for operational monitoring and reporting. Using Sentinel data ",2026-04-22T17:56:00.000Z,how-to,configuration,0.65,True,Describes selecting Sentinel data lake as a workbook data source and running queries; likely includes specific configuration steps and options for workbook data sources.,updated +https://learn.microsoft.com/en-us/azure/sentinel/delete-incident,Delete incidents,Delete incidents in Microsoft Sentinel in the Azure portal,,"Delete incidents in Microsoft Sentinel from the Azure portal, through the API, or using a Logic App.","Important Incident deletion using the portal is currently in PREVIEW. See the Supplemental Terms of Use for Microsoft Azure Previews for additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability. Incident deletion is generally available through the API. The ability to create incidents from scratch in Microsoft Sentinel in the Azure portal opens the possibility that you'll create an incident that you later decide you should",2026-04-22T17:56:00.000Z,how-to,,0.35,False,"Describes incident deletion options; summary does not show product-specific limits, error codes, or config tables.",updated +https://learn.microsoft.com/en-us/azure/sentinel/deploy-overview,Deployment planning guide,Deployment guide for Microsoft Sentinel,Plan and execute a Microsoft Sentinel deployment,"Learn about the steps to deploy Microsoft Sentinel including the phases to plan and prepare, deploy, and fine tune.","This article introduces the activities that help you plan, deploy, and fine tune your Microsoft Sentinel deployment.",2026-04-22T17:56:00.000Z,concept-article,deployment,0.65,True,Deployment guide overview for Sentinel including phases and planning; likely includes Sentinel-specific deployment requirements and sequencing beyond generic deployment steps.,updated 
+https://learn.microsoft.com/en-us/azure/sentinel/deploy-side-by-side,Deploy side-by-side,Deploying Microsoft Sentinel side-by-side to an existing SIEM.,Plan side-by-side deployment of Sentinel with existing SIEM,Learn how to deploy Microsoft Sentinel side-by-side to an existing SIEM.,"Your security operations center (SOC) team uses centralized security information and event management (SIEM) and security orchestration, automation, and response (SOAR) solutions to protect your increasingly decentralized digital estate. This article describes the approach and methods to consider when deploying Microsoft Sentinel in a side-by-side configuration together with your existing SIEM.",2026-04-22T17:56:00.000Z,concept-article,decision-making,0.7,True,Describes approaches and methods for running Sentinel alongside an existing SIEM; focused on migration/architecture decisions and trade-offs.,updated +https://learn.microsoft.com/en-us/azure/sentinel/detection-tuning,Get fine-tuning recommendations,Get fine-tuning recommendations for your analytics rules in Microsoft Sentinel,Tune Sentinel analytics rules to reduce noise,"Learn how to fine-tune your threat detection rules in Microsoft Sentinel, using automatically generated recommendations, to reduce false positives while maintaining threat detection coverage.","Important Custom detectionsis now the best way to create new rules across Microsoft Sentinel SIEM Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, readthis blog. Important Detection tuning is currently inPREVIEW. 
See theSupplemental Terms of Use for Microsoft Azure Previewsfor addit",2026-04-22T17:56:00.000Z,how-to,best-practices,0.6,True,Provides fine-tuning recommendations to reduce false positives while maintaining coverage; likely includes concrete rule configuration adjustments and patterns specific to Sentinel.,updated +https://learn.microsoft.com/en-us/azure/sentinel/dns-ama-fields,DNS over AMA reference,Microsoft Sentinel DNS over AMA connector reference - available fields and normalization schema,Configure DNS AMA connector fields and normalization schema,"This article lists available fields for filtering DNS data using the Windows DNS Events via AMA connector, and the normalization schema for Windows DNS server fields.","Microsoft Sentinel allows you to stream and filter events from your Windows Domain Name System (DNS) server logs to theASimDnsActivityLognormalized schema table. This article describes the fields used for filtering the data, and the normalization schema for the Windows DNS server fields. The Azure Monitor Agent (AMA) and its DNS extension are installed on your Windows Server to upload data from your DNS analytical logs to your Microsoft Sentinel workspace. You stream and filter the data using th",2026-04-22T17:56:00.000Z,reference,configuration,0.86,True,"Lists available filter fields and the normalized schema for DNS logs, including field names and semantics, which are configuration details.",updated +https://learn.microsoft.com/en-us/azure/sentinel/domain-based-essential-solutions,ASIM-based domain solutions,ASIM-based domain solutions - Essentials for Microsoft Sentinel,Use ASIM-based essential domain solutions in Sentinel,"Learn about the Microsoft essential solutions for Microsoft Sentinel that span across different ASIM schemas like networks, DNS, and web sessions.",Microsoft essential solutions are domain solutions published by Microsoft for Microsoft Sentinel. 
These solutions have out-of-the-box content that can operate across multiple products for specific categories like networking. Some of these essential solutions use the normalization technique Advanced Security Information Model (ASIM) to normalize the data at query time or ingestion time. Important Microsoft essential solutions and the Network Session Essentials solution are currently in PREVIEW. T,2026-04-22T17:56:00.000Z,concept-article,configuration,0.6,True,Describes Microsoft essential solutions using ASIM schemas; includes product-specific solution behavior and configuration context.,updated +https://learn.microsoft.com/en-us/azure/sentinel/dynamics-365/deploy-dynamics-365-finance-operations-solution,Deploy for Dynamics 365 Finance and Operations,Connect Microsoft Dynamics 365 Finance and Operations to Microsoft Sentinel,Connect Dynamics 365 Finance and Operations to Sentinel,Learn how to deploy the Microsoft Sentinel solution for Business Applications with Microsoft Dynamics 365 Finance and Operations.,"This article describes how to deploy the Dynamics 365 Finance and Operations content within the Microsoft Sentinel solution for Microsoft Business Applications. 
The solution monitors and protects your Dynamics 365 Finance and Operations system: It collects audits and activity logs from the Dynamics 365 Finance and Operations environment, and detects threats, suspicious activities, illegitimate activities, and more.Read more about the solution.",2026-04-22T17:56:00.000Z,how-to,deployment,0.65,True,Deployment article for Dynamics 365 F&O content within Sentinel; includes specific steps and requirements for this integration.,updated +https://learn.microsoft.com/en-us/azure/sentinel/dynamics-365/dynamics-365-finance-operations-security-content,Dynamics 365 Finance and Operations security content reference,Security content reference for Dynamics 365 Finance and Operations,Security content reference for Dynamics 365 F&O,Learn about the built-in security content provided by the Microsoft Sentinel solution for Dynamics 365 Finance and Operations.,This article details the security content available for the Microsoft Sentinel solution for Dynamics 365 Finance and Operations. Learn more about the solution.,2026-04-22T17:56:00.000Z,reference,security,0.65,True,Reference for built-in security content for Dynamics 365 F&O solution; product-specific analytics and workbooks.,updated +https://learn.microsoft.com/en-us/azure/sentinel/enable-entity-behavior-analytics,Enable User and Entity Behavior Analytics (UEBA),Enable entity behavior analytics to detect advanced threats,Enable and configure Sentinel UEBA data sources,"Enable User and Entity Behavior Analytics in Microsoft Sentinel, and configure data sources","User and Entity Behavior Analytics (UEBA) in Microsoft Sentinel analyzes logs and alerts from connected data sources to build baseline behavioral profiles of your organization's entities—such as users, hosts, IP addresses, and applications. Using machine learning, UEBA identifies anomalous activity that may indicate a compromised asset. 
You can enable User and Entity Behavior Analytics in two ways, both with the same result: This article explains how to enable UEBA and configure data sources fro",2026-04-22T17:56:00.000Z,how-to,configuration,0.75,True,Explains how to enable UEBA and configure data sources; includes product-specific feature toggles and data requirements.,updated +https://learn.microsoft.com/en-us/azure/sentinel/enable-monitoring,Enable auditing and health monitoring,Turn on auditing and health monitoring in Microsoft Sentinel,Enable Sentinel auditing and health monitoring and query logs,Monitor supported data connectors by using the SentinelHealth data table.,"Monitor the health and audit the integrity of supported Microsoft Sentinel resources by turning on the auditing and health monitoring feature in Microsoft Sentinel'sSettingspage. Get insights on health drifts, such as the latest failure events or changes from success to failure states, and on unauthorized actions, and use this information to create notifications and other automated actions. To get health data from theSentinelHealthdata table, or to get auditing information from theSentinelAuditd",2026-04-22T17:56:00.000Z,how-to,configuration,0.75,True,Covers turning on the feature and querying SentinelHealth and SentinelAudit tables; includes table names and query patterns unique to Sentinel.,updated +https://learn.microsoft.com/en-us/azure/sentinel/enable-sentinel-features-content,Enable Microsoft Sentinel and initial features and content,Enable Microsoft Sentinel SIEM and initial features and content,Enable Microsoft Sentinel SIEM and core features,"As the first step of your deployment, you enable Microsoft Sentinel, and then enable the health and audit feature, solutions, and content.","To begin your deployment, you need to enable Microsoft Sentinel SIEM and set up key features and content. 
In this article, you learn how to enable Microsoft Sentinel, enable the health and audit feature, and enable the solutions and content you've identified according to your organization's needs. This article is part of theDeployment guide for Microsoft Sentinel.",2026-04-22T17:56:00.000Z,how-to,deployment,0.7,True,"Covers enabling Sentinel, health/audit, and solutions; includes product-specific deployment steps and feature enablement order.",updated +https://learn.microsoft.com/en-us/azure/sentinel/enable-storage-network-security,Enable network security for Azure Storage Blob data connector,Enable Network Security for Azure Storage blob connectors,Enable network security perimeters for Sentinel storage connectors,Learn how to enable network security for Azure Storage connector resources. Follow step-by-step instructions to secure your storage accounts with Network Security Perimeters.,"This article provides step-by-step instructions on how to enable network security on the storage resources integrated with your Azure Storage connector. Azure network security perimeter (NSP) is an Azure-native feature that creates a logical isolation boundary for your PaaS resources. By associating resources like storage accounts or databases with an NSP, you can centrally manage network access using a simplified rule set. 
For more information, seeNetwork security perimeter concepts.",2026-04-22T17:56:00.000Z,how-to,security,0.78,True,"Step-by-step instructions for configuring Azure NSP on storage accounts used by connectors, including product-specific security settings.",updated +https://learn.microsoft.com/en-us/azure/sentinel/enroll-simplified-pricing-tier,Switch to simplified pricing tiers,Enroll in a simplified pricing tier for Microsoft Sentinel,Enroll Sentinel workspaces in simplified pricing tiers,"Learn how to enroll in simplified billing, the impact of the switch to commitment pricing tiers, and frequently asked questions about enrollment.","For many Microsoft Sentinel workspaces created before July 2023, there's a separate pricing tier for Azure Monitor Log Analytics in addition to the classic pricing tier for Microsoft Sentinel. To combine the data ingestion costs for Log Analytics and the data analysis costs of Microsoft Sentinel, enroll your workspace in a simplified pricing tier.",2026-04-22T17:56:00.000Z,how-to,decision-making,0.7,True,Covers switching to simplified pricing and its impact; includes tier behavior and migration considerations to support billing decisions.,updated +https://learn.microsoft.com/en-us/azure/sentinel/entities,Overview,Entities in Microsoft Sentinel,Understand and configure entities in Sentinel,"Entities are classifications or labels for data elements in your Microsoft Sentinel alerts. Microsoft Sentinel uses entities to recognize data elements as a particular entity type, correlate data acro","When alerts are sent to or generated by Microsoft Sentinel, they contain data elements that Sentinel can recognize and classify into categories asentities. 
When Microsoft Sentinel understands what kind of entity a particular data element represents, it knows the right questions to ask about it, and it can then compare insights about that item across the full range of data sources, and easily track it and refer to it throughout the entire Sentinel experience - analytics, investigation, remediatio",2026-04-22T17:56:00.000Z,concept-article,configuration,0.6,True,Defines entity types and how Sentinel classifies data elements; typically includes lists of entity schemas and mappings that are product-specific configuration details.,updated +https://learn.microsoft.com/en-us/azure/sentinel/entities-reference,Entities reference,Microsoft Sentinel entity types reference,Reference Sentinel entity types and identifiers,"This article displays the Microsoft Sentinel entity types and their identifiers, and lists strong and weak identifiers for each.","This document contains two sets of information regarding entities and entity types in Microsoft Sentinel in the Azure portal andMicrosoft Sentinel in the Defender portal. Important AfterMarch 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Microsoft Sentinel in the Azure portal will beredirected to the Defender portal and will use Microsoft Sentinel in the Defender portal only. 
If you're",2026-04-22T17:56:00.000Z,reference,configuration,0.8,True,"Defines entity types with strong/weak identifiers, which are detailed schema/configuration elements used across Sentinel content.",updated +https://learn.microsoft.com/en-us/azure/sentinel/entity-behaviors-layer,Aggregate behavioral insights from raw logs,Translate raw security logs to behavioral insights using UEBA behaviors in Microsoft Sentinel,,"The Microsoft Sentinel UEBA behaviors layer translates security telemetry into normalized behavioral patterns for investigation, hunting, and detection engineering.","The User and Entity Behavior Analytics (UEBA) behavior layer in Microsoft Sentinel aggregates and summarizes high-volume raw logs into clear, plain-language patterns of security actions, explaining “who did what to whom” in a structured way. Unlike alerts or anomalies, behaviors don’t necessarily indicate risk - they create an abstraction layer that optimizes your data for investigations, hunting, and detection by enhancing: This abstraction layer enables faster threat detection, investigation, ",2026-04-22T17:56:00.000Z,how-to,,0.4,False,"Explains the UEBA behaviors abstraction layer conceptually; summary doesn’t indicate specific configuration parameters, limits, or error mappings.",updated +https://learn.microsoft.com/en-us/azure/sentinel/entity-behaviors-layer-rai-faqs,Responsible AI FAQ for UEBA behaviors layer,Responsible AI FAQ for the Microsoft Sentinel UEBA behaviors layer,,"This FAQ provides information about the AI technology used in the Microsoft Sentinel UEBA behaviors layer, along with key considerations and details about how AI is used, how it was tested and evaluat",These frequently asked questions (FAQ) describe the AI impact of theUEBA behaviors layerfeature in Microsoft Sentinel.,2026-04-22T17:56:00.000Z,contributor-guide,,0.45,False,"Responsible AI FAQ about UEBA behaviors; mostly conceptual and policy/impact discussion without concrete configuration, limits, or 
troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/sentinel/entity-pages,Entity pages,Entity pages in Microsoft Sentinel,,"Entity pages display information about entities surfaced in your alerts, or that you otherwise come across in your incident investigations. Among this information is the timeline of alerts involving t","When you come across a user account, a hostname, an IP address, or an Azure resource in an incident investigation, you may decide you want to know more about it. For example, you might want to know its activity history, whether it's appeared in other alerts or incidents, whether it's done anything unexpected or out of character, and so on. In short, you want information that can help you determine what sort of threat these entities represent and guide your investigation accordingly. This article",2026-04-22T17:56:00.000Z,concept-article,,0.4,False,Entity pages article focuses on how to view information and timelines; more of a UX/investigation guide than a configuration or troubleshooting reference.,updated +https://learn.microsoft.com/en-us/azure/sentinel/extend-sentinel-across-workspaces-tenants,Extend across multiple workspaces,Extend Microsoft Sentinel across workspaces and tenants,Extend Sentinel analytics across workspaces and tenants,How to use Microsoft Sentinel to query and analyze data across workspaces and tenants.,"When you onboard Microsoft Sentinel, your first step is to select your Log Analytics workspace. While you can get the full benefit of the Microsoft Sentinel experience with a single workspace, in some cases, you might want to extend your workspace to query and analyze your data across workspaces and tenants. For more information, seeDesign a Log Analytics workspace architectureandPrepare for multiple workspaces and tenants in Microsoft Sentinel. 
If you onboard Microsoft Sentinel to the Microsoft",2026-04-22T17:56:00.000Z,concept-article,architecture-patterns,0.7,True,Explains patterns for querying/analyzing across workspaces/tenants; product-specific cross-workspace architecture and trade-offs.,updated +https://learn.microsoft.com/en-us/azure/sentinel/false-positives,Handle false positives,Handle false positives in Microsoft Sentinel,,Learn how to resolve false positives in Microsoft Sentinel by creating automation rules or modifying analytics rules to specify exceptions.,"Important Custom detectionsis now the best way to create new rules across Microsoft Sentinel SIEM Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, readthis blog. Microsoft Sentinel analytics rulesnotify you when something suspicious occurs in your network. No analytics rule is perfe",2026-04-22T17:56:00.000Z,how-to,,0.45,False,"Describes handling false positives conceptually and via automation/rule changes; summary does not show concrete product-specific configs, codes, or limits.",updated +https://learn.microsoft.com/en-us/azure/sentinel/feature-availability,Feature support in different clouds,Microsoft Sentinel feature support for Azure commercial/other clouds,Review Microsoft Sentinel feature availability by Azure cloud,This article describes feature availability in Microsoft Sentinel across different Azure environments.,"This article describes the features available in Microsoft Sentinel across different Azure environments. Features are listed as GA (generally available), public preview, or shown as not available. Note These lists and tables do not include feature or bundle availability in the Azure Government Secret or Azure Government Top Secret clouds. 
+For more information about specific availability for air-gapped clouds, please contact your account team. Important All Microsoft Sentinel features will be off",2026-04-22T17:56:00.000Z,feature-availability,decision-making,0.8,True,Feature availability across Azure environments with GA/preview/unsupported states; effectively a decision matrix for what you can use where.,updated +https://learn.microsoft.com/en-us/azure/sentinel/forward-syslog-monitor-agent,Tutorial - Forward syslog data to workspace,Tutorial: Forward Syslog data to Microsoft Sentinel and Azure Monitor by using Azure Monitor Agent,,"In this tutorial, you learn how to monitor Linux-based devices by forwarding Syslog data to a Log Analytics workspace.","In this tutorial, you configure a Linux virtual machine (VM) to forward Syslog data to your workspace by using Azure Monitor Agent. These steps allow you to collect and monitor data from Linux-based devices where you can't install an agent like a firewall network device. Note Container Insights now supports the automatic collection of Syslog events from Linux nodes in your AKS clusters. To learn more, seeSyslog collection with Container Insights. Configure your Linux-based device to send data to",2026-04-22T17:56:00.000Z,tutorial,,0.45,False,Step-by-step tutorial for forwarding Syslog via Azure Monitor Agent; typical how-to without detailed limits or config matrices.,updated +https://learn.microsoft.com/en-us/azure/sentinel/fusion,Overview,Advanced multistage attack detection in Microsoft Sentinel,,Use Fusion technology in Microsoft Sentinel to reduce alert fatigue and create actionable incidents that are based on advanced multistage attack detection.,"Important Custom detectionsis now the best way to create new rules across Microsoft Sentinel SIEM Microsoft Defender XDR. 
With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, readthis blog. Microsoft Sentinel uses Fusion, a correlation engine based on scalable machine learning algorithms, to automatically dete",2026-04-22T17:56:00.000Z,concept-article,,0.35,False,"Fusion overview for multistage attack detection; mainly conceptual description of how Fusion works and its benefits, not detailed config or numeric limits.",updated +https://learn.microsoft.com/en-us/azure/sentinel/fusion-scenario-reference,Multistage attack detection scenarios,Scenarios detected by the Microsoft Sentinel Fusion engine,Reference Fusion-detected multistage attack scenarios,"Learn about the scenarios detected by Fusion, listed here grouped by threat classification.","This document lists the types of scenario-based multistage attacks, grouped by threat classification, that Microsoft Sentinel detects using the Fusion correlation engine. SinceFusioncorrelates multiple signals from various products to detect advanced multistage attacks, successful Fusion detections are presented asFusion incidentson the Microsoft SentinelIncidentspage and not asalerts, and are stored in theIncidentstable inLogsand not in theSecurityAlertstable. 
In order to enable these Fusion-po",2026-04-22T17:56:00.000Z,reference,configuration,0.7,True,"Catalog of specific Fusion detection scenarios grouped by threat classification, describing concrete product detection capabilities.",updated +https://learn.microsoft.com/en-us/azure/sentinel/geographical-availability-data-residency,Geographical availability and data residency,Geographical availability and data residency in Microsoft Sentinel,Plan Sentinel deployment for data residency compliance,"In this article, you learn about geographical availability and data residency in Microsoft Sentinel.","After your data is collected, stored, and processed, compliance can become an important design requirement, with a significant impact on your Microsoft Sentinel architecture. Having the ability to validate and prove who has access to what data under all conditions is a critical data sovereignty requirement in many countries and regions, and assessing risks and getting insights in Microsoft Sentinel workflows is a priority for many customers. This article can help you meet compliance requirements",2026-04-22T17:56:00.000Z,concept-article,decision-making,0.65,True,Geographical availability and data residency affect where and how to deploy Sentinel; this is decision guidance tied to compliance and region support.,updated +https://learn.microsoft.com/en-us/azure/sentinel/geolocation-data-api,Enrich entities with geolocation data with REST-API,Enrich entities with geolocation data in Microsoft Sentinel using REST API,Enrich Sentinel entities with geolocation via REST API,This article describes how you can enrich entities in Microsoft Sentinel with geolocation data via REST API.,"This article shows you how to enrich entities in Microsoft Sentinel with geolocation data using the REST API. Important This feature is currently in PREVIEW. 
TheAzure Preview Supplemental Termsinclude additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-04-22T17:56:00.000Z,reference,integrations,0.7,True,Describes how to call a Sentinel REST API for geolocation enrichment with specific request structures and parameters.,updated +https://learn.microsoft.com/en-us/azure/sentinel/get-visibility,View collected data on the Overview dashboard,View aggregated data from the Overview,,Learn how to quickly view and monitor what's happening across your environment by using Microsoft Sentinel.,"After connecting your data sources to Microsoft Sentinel, use theOverviewpage to view, monitor, and analyze activities across your environment. This article describes the widgets and graphs available on Microsoft Sentinel'sOverviewdashboard. Important AfterMarch 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Microsoft Sentinel in the Azure portal will beredirected to the Defender portal",2026-04-22T17:56:00.000Z,how-to,,0.2,False,"Overview dashboard description; primarily UI widgets and graphs, not configuration parameters, limits, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/sentinel/health-audit,Overview,Auditing and health monitoring in Microsoft Sentinel,Understand Sentinel auditing and health monitoring capabilities,"Learn about the Microsoft Sentinel health and audit feature, which monitors service health drifts and user actions.","Microsoft Sentinel is a critical service for advancing and protecting the security of your organization’s technological and information assets, so you want to be sure that it's always running smoothly and free of interference. 
You want to verify that the service's many moving parts are always functioning as intended, and it isn't being manipulated by unauthorized actions, whether by internal users or otherwise. You might also like to configure notifications of health drifts or unauthorized actio",2026-04-22T17:56:00.000Z,concept-article,configuration,0.65,True,Explains Sentinel health/audit feature and monitored actions; includes product-specific tables and signals used for monitoring.,updated +https://learn.microsoft.com/en-us/azure/sentinel/health-table-reference,SentinelHealth table reference,Microsoft Sentinel health tables reference,Use SentinelHealth table fields for SIEM monitoring,"Learn about the fields in the SentinelHealth tables, used for health monitoring and analysis.","This article describes the fields in theSentinelHealthtable used for monitoring the health of Microsoft Sentinel resources. With the Microsoft Sentinelhealth monitoring feature, you can keep tabs on the proper functioning of your SIEM and get information on any health drifts in your environment. Learn how to query and use the health table for deeper monitoring and visibility of actions in your environment: Microsoft Sentinel's health monitoring feature covers different kinds of resources (see th",2026-04-22T17:56:00.000Z,reference,configuration,0.85,True,Health table reference with field descriptions; schema and field semantics are detailed configuration knowledge unique to Sentinel.,updated +https://learn.microsoft.com/en-us/azure/sentinel/hunting,Overview,Hunting capabilities in Microsoft Sentinel,,Use Microsoft Sentinel's built-in hunting queries to guide you into asking the right questions to find issues in your data.,"As security analysts and investigators, you want to be proactive about looking for security threats, but your various systems and security appliances generate mountains of data that can be difficult to parse and filter into meaningful events. 
Microsoft Sentinel has powerful hunting search and query tools to hunt for security threats across your organization's data sources. To help security analysts look proactively for new anomalies that aren't detected by your security apps or even by your sche",2026-04-22T17:56:00.000Z,how-to,,0.2,False,"Conceptual overview of hunting capabilities and built-in queries; no specific configs, limits, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/sentinel/hunting-with-rest-api,Manage hunting queries with REST-API,Manage hunting queries in Microsoft Sentinel using REST API,Manage Sentinel hunting queries via Log Analytics REST API,This article describes how Microsoft Sentinel hunting features enable you to take advantage Log Analytics’ REST API to manage hunting queries.,"Microsoft Sentinel, being built in part on Azure Monitor Log Analytics, lets you use Log Analytics’ REST API to manage hunting queries. This document shows you how to create and manage hunting queries using the REST API. Queries created in this way are displayed in the Microsoft Sentinel UI. +For more information on thesaved searches API, see the definitive REST API reference.",2026-04-22T17:56:00.000Z,reference,integrations,0.7,True,"Shows concrete REST API usage patterns and parameters for managing hunting queries, a product-specific integration pattern.",updated +https://learn.microsoft.com/en-us/azure/sentinel/hunts,Conduct end-to-end hunts,Conduct end-to-end threat hunting with Hunts - Microsoft Sentinel,,Learn how to use hunts for conducting end-to-end proactive threat hunting. Seek out undetected threats based on hypothesis or start broadly and refine your searches with this hunting experience.,"Proactive threat hunting is a process where security analysts seek out undetected threats and malicious behaviors. By creating a hypothesis, searching through data, and validating that hypothesis, they determine what to act on. 
Actions can include creating new detections, new threat intelligence, or spinning up a new incident. Use the end to end hunting experience within Microsoft Sentinel to: Important AfterMarch 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and w",2026-04-22T17:56:00.000Z,how-to,,0.25,False,"Describes end-to-end hunting experience conceptually; no detailed configuration tables, limits, or error mappings evident.",updated +https://learn.microsoft.com/en-us/azure/sentinel/hunts-custom-queries,Create custom query,Create custom hunting queries in Microsoft Sentinel - Microsoft Sentinel,,Learn how to create a custom query to hunt for threats.,"Hunt for security threats across your organization's data sources with custom hunting queries. Microsoft Sentinel provides built-in hunting queries to help you find issues in the data you have on your network. But you can create your own custom queries. For more information about hunting queries, seeThreat hunting in Microsoft Sentinel.",2026-04-22T17:56:00.000Z,how-to,,0.25,False,Explains creating custom hunting queries at a high level; summary lacks product-specific parameters or numeric constraints.,updated +https://learn.microsoft.com/en-us/azure/sentinel/identify-threats-with-entity-behavior-analytics,UEBA overview,Advanced threat detection with User and Entity Behavior Analytics (UEBA) in Microsoft Sentinel,,"Create behavioral baselines for entities (users, hostnames, IP addresses) and use them to detect anomalous behavior and identify zero-day advanced persistent threats (APT).","Detecting anomalous behavior within an organization is often complex and time-consuming. Microsoft Sentinel's User and Entity Behavior Analytics (UEBA) simplifies this challenge by continuously learning from your data to surface meaningful anomalies that help analysts detect and investigate potential threats more effectively. 
This article explains what Microsoft Sentinel User and Entity Behavior Analytics (UEBA) is, how it works, how to onboard to it, and how to use UEBA to detect and investigat",2026-04-22T17:56:00.000Z,concept-article,,0.35,False,"UEBA overview and onboarding article; describes what UEBA is and how to use it conceptually, without clear indication of detailed configuration tables or numeric thresholds.",updated +https://learn.microsoft.com/en-us/azure/sentinel/import-export-analytics-rules,Export and import analytics rules,Import and export Microsoft Sentinel analytics rules,Import and export Sentinel analytics rules via ARM,Export and import analytics rules to and from ARM templates to aid deployment,"Important Custom detectionsis now the best way to create new rules across Microsoft Sentinel SIEM Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, readthis blog. Important Exporting and importing rules is inPREVIEW. See theSupplemental Terms of Use for Microsoft Azure Previewsfor ad",2026-04-22T17:56:00.000Z,how-to,deployment,0.65,True,"Covers exporting/importing rules as ARM templates; this is about deployment of rules across environments and usually includes template schema, required properties, and constraints—deployment-focused expert details.",updated +https://learn.microsoft.com/en-us/azure/sentinel/import-export-automation-rules,Export and import automation rules,Import and export Microsoft Sentinel automation rules,Export and import Sentinel automation rules as ARM templates,Export and import automation rules to and from ARM templates to aid deployment,"Manage your Microsoft Sentinel automation rules as code! 
You can now export your automation rules to Azure Resource Manager (ARM) template files, and import rules from these files, as part of your program to manage and control your Microsoft Sentinel deployments as code. The export action creates a JSON file in your browser's downloads location, that you can then rename, move, and otherwise handle like any other file. The exported JSON file is workspace-independent, so it can be imported to othe",2026-04-22T17:56:00.000Z,how-to,deployment,0.62,True,"Covers managing automation rules as code via ARM templates, including export/import behavior and workspace-independence, which are product-specific deployment-as-code details.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/incident-investigation,Overview,Microsoft Sentinel incident investigation in the Azure portal,,This article describes Microsoft Sentinel's incident investigation and case management capabilities and features in the Azure portal.,"Microsoft Sentinel gives you a complete, full-featured case management platform for investigating and managing security incidents. Incidents are Microsoft Sentinel’s name for files that contain a complete and constantly updated chronology of a security threat, whether it’s individual pieces of evidence (alerts), suspects and parties of interest (entities), insights collected and curated by security experts and AI/machine learning models, or comments and logs of all the actions taken in the course ",2026-04-22T17:56:00.000Z,concept-article,,0.3,False,"Describes incident investigation features conceptually; summary lacks specific configs, limits, or error/decision tables.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/incident-navigate-triage,Triage and manage your incidents,Basic incident tasks for Microsoft Sentinel incidents in the Azure portal,,This article describes how to navigate and triage incidents in Microsoft Sentinel in the Azure portal.,This article describes how to navigate and run basic triage on 
your incidents in the Azure portal.",2026-04-22T17:56:00.000Z,how-to,,0.25,False,Basic navigation/triage article; summary does not indicate detailed expert-only settings or limits.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/incident-tasks,Overview,Use tasks to manage incidents in Microsoft Sentinel in the Azure portal,,"This article describes incident tasks and how to work with them to ensure all required steps are taken in triaging, investigating, and responding to incidents in Microsoft Sentinel.","One of the most important factors in running your security operations (SecOps) effectively and efficiently is the standardization of processes. SecOps analysts are expected to perform a list of steps, or tasks, in the process of triaging, investigating, or remediating an incident. Standardizing and formalizing the list of tasks can help keep your SOC running smoothly, ensuring the same requirements apply to all analysts. This way, regardless of who is on-shift, an incident will always get the sam",2026-04-22T17:56:00.000Z,how-to,,0.35,False,Conceptual description of incident tasks and process standardization; summary lacks specific configuration or numeric guidance.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/indicators-bulk-file-import,Add threat intelligence in bulk by file,Add threat intelligence in bulk by file - Microsoft Sentinel,Bulk import threat intel files into Sentinel,Learn how to add threat intelligence in bulk from flat files like .csv or .json into Microsoft Sentinel.,"This article demonstrates how to add indicators from a CSV or STIX objects from a JSON file into Microsoft Sentinel threat intelligence. Because threat intelligence sharing still happens across emails and other informal channels during an ongoing investigation, the ability to import that information quickly into Microsoft Sentinel is important to relay emerging threats to your team. 
These identified threats are then available to power other analytics, such as producing security alerts, incidents",2026-04-22T17:56:00.000Z,how-to,integrations,0.65,True,"Bulk import of indicators from CSV/STIX JSON usually includes required column/field names, formats, and constraints for Sentinel’s threat intelligence schema, which are product-specific integration details beyond generic knowledge.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/ingest-defender-for-cloud-incidents,Integrate Microsoft Defender for Cloud,Ingest Microsoft Defender for Cloud incidents with Microsoft Defender XDR integration,Ingest Defender for Cloud incidents via Defender XDR,Learn how using Microsoft Defender for Cloud's integration with Microsoft Defender XDR lets you ingest Microsoft Defender for Cloud incidents through Microsoft Defender XDR. This lets you add Defender,"Microsoft Defender for Cloud is now integrated with Microsoft Defender XDR, formerly known as Microsoft 365 Defender. This integration allows Defender XDR to collect alerts from Defender for Cloud and create Defender XDR incidents from them. Thanks to this integration, Microsoft Sentinel customers who enable Defender XDR incident integration can now ingest and synchronize Defender for Cloud incidents through Microsoft Defender XDR. 
To support this integration, you must set up one of the following M",2026-04-22T17:56:00.000Z,how-to,integrations,0.7,True,Describes specific integration paths and configuration to ingest Defender for Cloud incidents through Defender XDR into Sentinel; product-specific integration pattern.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/ingestion-delay,Handle ingestion delay in analytics rules,Handle ingestion delay in Microsoft Sentinel,Handle data ingestion delay in Sentinel rules,Handle ingestion delay in Microsoft Sentinel scheduled analytics rules.,"Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. While Microsoft Sentinel can ingest data from various sources, ingestion time for each data source may differ in different",2026-04-22T17:56:00.000Z,how-to,best-practices,0.65,True,"Discusses handling ingestion delay for scheduled rules; such content typically gives concrete recommendations (for example, adjusting lookback windows, scheduling offsets) that are product-specific best practices.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/investigate-incidents,Investigate incidents in depth,Investigate Microsoft Sentinel incidents in depth in the Azure portal,,"This article takes you through all the panels and options available on the incident details page in the Azure portal, helping you navigate and investigate your incidents more quickly, effectively, and","Microsoft Sentinel incidents are files that contain an aggregation of all the relevant evidence for specific investigations. 
Each incident is created (or added to) based on pieces of evidence (alerts) that were either generated by analytics rules or imported from third-party security products that produce their own alerts. Incidents inherit the entities contained in the alerts, as well as the alerts' properties, such as severity, status, and MITRE ATT&CK tactics and techniques. Microsoft Sentinel ",2026-04-22T17:56:00.000Z,how-to,,0.3,False,Deep-dive UI walkthrough for incident details; still primarily conceptual/operational without explicit expert-only configs in the summary.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/investigate-large-datasets,Overview,Start an investigation by searching large datasets - Microsoft Sentinel,,Learn about search jobs and restoring data from long-term retention in Microsoft Sentinel.,"One of the primary activities of a security team is to search logs for specific events. For example, you might search logs for the activities of a specific user within a given time-frame. In Microsoft Sentinel, you can search across long time periods in extremely large datasets by using a search job. While you can run a search job on any type of log, search jobs are ideally suited to search logs in a long-term retention (formerly known as archive) state. If you need to do a full investigation o",2026-04-22T17:56:00.000Z,how-to,,0.45,False,Explains search jobs and long-term retention conceptually; summary does not show specific numeric thresholds beyond generic 'large datasets'.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/investigate-with-ueba,UEBA investigation examples,Investigate incidents with UEBA data,,Learn how to use UEBA data while investigating to gain greater context to potentially malicious activity occurring in your organization.,"This article describes common methods and sample procedures for using user entity behavior analytics (UEBA) in your regular investigation workflows. 
Important Noted features in this article are currently in PREVIEW. See the Supplemental Terms of Use for Microsoft Azure Previews for additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-04-22T17:56:00.000Z,how-to,,0.35,False,Describes investigation workflows using UEBA data; appears to be procedural guidance rather than configuration reference or troubleshooting catalog.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/log-plans,Plan data retention,Log retention tiers in Microsoft Sentinel,Select Microsoft Sentinel log retention tiers and limits,Learn about the different log retention plans that are available in Microsoft Sentinel and how they're meant to be used to ensure maximum coverage at minimum expenditure.,"For Microsoft Sentinel workspaces connected to Defender, tiering and retention management must be done from the new table management experience in the Defender portal. For unattached Microsoft Sentinel workspaces, continue to use the experiences described below to manage data in your workspaces. There are two competing aspects of log collection and retention that are critical to a successful threat detection program. On the one hand, you want to maximize the number of log sources that you collec",2026-04-22T17:56:00.000Z,concept-article,limits-quotas,0.7,True,"Describes different log retention plans and how to use them; such pages typically include concrete retention durations, tier behaviors, and cost/coverage trade-offs.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/manage-analytics-rule-templates,Manage template versions for analytics rules,Manage template versions for your scheduled analytics rules in Microsoft Sentinel,Manage Sentinel analytics rule template versions,"Learn how to manage the relationship between your scheduled analytics rule templates and the rules created from those templates. 
Merge updates to the templates into your rules, and revert changes in y","Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. Important This feature is in PREVIEW. See the Supplemental Terms of Use for Microsoft Azure Previews for additional legal te",2026-04-22T17:56:00.000Z,how-to,configuration,0.6,True,"Explains managing relationships between templates and rules, merging updates, and reverting changes; involves product-specific configuration flows and options for template-version handling.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/manage-data-overview,Data management overview,Manage data tiers and retention in Microsoft Sentinel,Plan Sentinel data tiers and retention for cost and operations,Manage log data in Microsoft Sentinel and with Microsoft Defender XDR services in the Microsoft Defender portal to optimize security operations and cost efficiency.,Data you collect into Microsoft Sentinel (SIEM) and Microsoft Defender XDR is stored in tables. The Microsoft Defender portal lets you manage the retention period and the storage costs associated with your data. 
You can manage retention and costs when you: This article explains how to manage table retention and tier options in the Microsoft Defender portal to optimize security operations and reduce costs in Microsoft Sentinel and Microsoft Defender XDR.,2026-04-22T17:56:00.000Z,how-to,decision-making,0.7,True,Guides how to choose retention periods and tiers to optimize cost and operations; likely includes tier capabilities and trade-offs for decision-making.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/manage-platform-solutions,Managing Platform Solutions,Manage Microsoft Sentinel platform solutions,Configure and manage installed Sentinel platform solutions,"Learn how to configure, update, and uninstall components installed by a Microsoft Sentinel platform solution.","After you install a Microsoft Sentinel platform solution, you manage its components in different places. This article explains how to configure, update, and uninstall the main types of components.",2026-04-22T17:56:00.000Z,how-to,configuration,0.65,True,"Managing components after installation implies detailed configuration steps and settings for different component types, which are product-specific.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/manage-soc-with-incident-metrics,Manage your SOC with incident metrics,Manage your SOC better with incident metrics in Microsoft Sentinel,Use Sentinel incident metrics to manage SOC performance,Use information from the Microsoft Sentinel incident metrics screen and workbook to help you manage your Security Operations Center (SOC).,"Note For information about feature availability in US Government clouds, see the Microsoft Sentinel tables in Cloud feature availability for US Government customers. As a Security Operations Center (SOC) manager, you need to have overall efficiency metrics and measures at your fingertips to gauge the performance of your team. 
You'll want to see incident operations over time by many different criteria, like severity, MITRE tactics, mean time to triage, mean time to resolve, and more. Microsoft Sen",2026-04-22T17:56:00.000Z,how-to,configuration,0.6,True,"Describes incident metrics screen/workbook and how to use specific metrics (MTTR, severity, MITRE tactics) within Sentinel; product-specific configuration/usage.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/manage-table-tiers-retention,"Manage tables, tiers, and retention",Configure table settings in Microsoft Sentinel,Configure Sentinel table retention and tier settings,Configure Microsoft Sentinel and Defender XDR table settings in Microsoft Defender Portal to optimize security operations and cost efficiency.,"The Microsoft Defender portal provides a centralized experience for configuring table-level data retention and tier settings across Microsoft Sentinel and Microsoft Defender XDR. You can view and manage retention settings, switch between analytics and data lake tiers, and optimize data storage based on operational and cost requirements. This article explains how to configure retention and tier settings for tables in Microsoft Sentinel and Defender XDR in the Microsoft Defender portal. For more i",2026-04-22T17:56:00.000Z,how-to,configuration,0.75,True,"Explains table-level retention and tier configuration in Defender portal; involves specific setting names, allowed values, and behaviors.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/map-data-fields-to-entities,Map data fields to entities,Map data fields to Microsoft Sentinel entities,Map data fields to Sentinel entities,"Map data fields in tables to Microsoft Sentinel entities in analytics rules, for better incident information",Entity mapping is an integral part of the configuration of scheduled analytics rules. 
It enriches the rules' output (alerts and incidents) with essential information that serves as the building blocks of any investigative processes and remedial actions that follow. The procedure detailed below is part of the analytics rule creation wizard. It's treated here independently to address the scenario of adding or changing entity mappings in an existing analytics rule. Important,2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,Entity mapping article describes specific entity types and field mappings used in analytics rules; these are product-specific configuration options and schema details.,updated +https://learn.microsoft.com/en-us/azure/sentinel/microsoft-365-defender-cloud-support,Microsoft Defender XDR connector data type support,Support for Microsoft Defender XDR connector data types in Microsoft Sentinel for different clouds (GCC environments),Assess Defender XDR connector data type support across clouds,"This article describes support for different Microsoft Defender XDR connector data types in Microsoft Sentinel across different clouds, including Commercial, GCC, GCC-High, and DoD.","The type of cloud your environment uses affects Microsoft Sentinel's ability to ingest and display data from these connectors, like logs, alerts, device events, and more. This article describes support for different Microsoft Defender XDR connector data types in Microsoft Sentinel across different clouds, including Commercial, GCC, GCC-High, and DoD. 
Read more about data type support for different clouds in Microsoft Sentinel.",2026-04-22T17:56:00.000Z,reference,decision-making,0.7,True,"Describes which data types are supported in Commercial vs GCC/GCC-High/DoD, guiding deployment decisions based on environment capabilities.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/microsoft-365-defender-sentinel-integration,Integrate Microsoft Defender XDR,Microsoft Defender XDR integration with Microsoft Sentinel,Integrate Microsoft Defender XDR with Sentinel incidents,Learn how using Microsoft Defender XDR together with Microsoft Sentinel lets you use Microsoft Sentinel as your universal incidents queue.,"This article describes how Microsoft Defender XDR services integrate with Microsoft Sentinel, whether in the Microsoft Defender portal or in the Azure portal. If you first onboarded to Microsoft Sentinel after July 1, 2025 with permissions of a subscription Owner or a User access administrator, your workspace is automatically onboarded to the Defender portal. 
In such cases, you use Microsoft Sentinel in the Defender portal only, where your data can integrate directly with Defender XDR service data fo",2026-04-22T17:56:00.000Z,concept-article,integrations,0.7,True,"Explains integration behavior between Defender XDR and Sentinel, including onboarding conditions and incident queue behavior; this is product-specific integration logic.",updated https://learn.microsoft.com/en-us/azure/sentinel/microsoft-purview-record-types-activities,Microsoft Purview Information Protection reference,Microsoft Purview Information Protection connector reference - audit log record types and activities support in Microsoft Sentinel,Use Purview Information Protection connector record types in Sentinel,This article lists supported audit log record types and activities when using the Microsoft Purview Information Protection connector with Microsoft Sentinel.,"This article lists supported audit log record types and activities when using the Microsoft Purview Information Protection connector with Microsoft Sentinel. When you use theMicrosoft Purview Information Protection connector, you stream audit logs into theMicrosoftPurviewInformationProtectionstandardized table. 
Data is -gathered through theOffice Management API, which uses a structured schema.",2023-01-09T12:26:00.000Z,reference,configuration,0.8,True,Lists supported audit log record types and activities for the Purview Information Protection connector and its standardized table; this is detailed configuration/schema support information.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/microsoft-sentinel-defender-portal,Microsoft Sentinel SIEM experience in Defender portal,Microsoft Sentinel in the Microsoft Defender portal,Use Microsoft Sentinel within the Defender portal,Learn about the Microsoft Sentinel experience when you onboard Microsoft Sentinel to the Microsoft Defender portal.,"Microsoft Defender provides a unified cybersecurity solution that integrates endpoint protection, cloud security, identity protection, email security, threat intelligence, exposure management, and SIEM into a centralized platform powered by a modern data lake. It uses AI-driven defense to help organizations anticipate and stop attacks, ensuring efficient and effective security operations. Microsoft Sentinel is generally available in the Microsoft Defender portal, either withMicrosoft Defender XD",2025-09-30T12:49:00.000Z,conceptual,decision-making,0.6,True,"Explains Sentinel experience in Defender portal, including which capabilities are available in which context and how to choose/transition, aiding portal and integration decisions.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/migration,Plan your migration,Plan your migration to Microsoft Sentinel,Plan and phase a migration to Microsoft Sentinel,"Discover the reasons for migrating from a legacy SIEM, and learn how to plan out the different phases of your migration.","Security operations center (SOC) teams use centralized security information and event management (SIEM) and security orchestration, automation, and response (SOAR) solutions to protect their increasingly decentralized digital estate. 
While legacy SIEMs can maintain good coverage of on-premises assets, on-premises architectures may have insufficient coverage for cloud assets, such as in Azure, Microsoft 365, AWS, or Google Cloud Platform (GCP). In contrast, Microsoft Sentinel can ingest data from",2026-04-15T08:00:00.000Z,how-to,decision-making,0.7,True,"Migration planning content for moving from legacy SIEMs to Microsoft Sentinel typically includes phase breakdowns, option comparisons, and guidance on when to choose particular migration approaches. This is decision-focused, helping SOC teams choose migration strategies and phases rather than just describing Sentinel conceptually.",updated -https://learn.microsoft.com/en-us/azure/sentinel/migration-arcsight-automation,Migrate SOAR automation,Migrate ArcSight SOAR automation to Microsoft Sentinel,Migrate ArcSight SOAR automation to Sentinel rules and playbooks,"Learn how to identify SOAR use cases, and how to migrate your ArcSight SOAR automation to Microsoft Sentinel.","Microsoft Sentinel provides Security Orchestration, Automation, and Response (SOAR) capabilities withautomation rulesandplaybooks. Automation rules automate incident handling and response, and playbooks run predetermined sequences of actions to response and remediate threats. This article discusses how to identify SOAR use cases, and how to migrate your ArcSight SOAR automation to Microsoft Sentinel. 
Automation rules simplify complex workflows for your incident orchestration processes, and allow",2022-05-31T22:09:00.000Z,how-to,decision-making,0.6,True,Guides how to identify SOAR use cases and migrate ArcSight automation to Sentinel automation rules/playbooks; provides migration choices and mapping guidance.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/migration-arcsight-detection-rules,Migrate detection rules,Migrate ArcSight detection rules to Microsoft Sentinel,Map and migrate ArcSight detection rules to Sentinel,"Identify, compare, and migrate your ArcSight detection rules to Microsoft Sentinel built-in rules.","This article describes how to identify, compare, and migrate your ArcSight detection rules to Microsoft Sentinel analytics rules.",2022-06-01T17:07:00.000Z,how-to,decision-making,0.6,True,"Focuses on identifying, comparing, and migrating ArcSight rules to Sentinel built-ins; this is migration decision guidance between rule types and mappings.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/migration-arcsight-historical-data,Export historical data,Microsoft Sentinel migration: Export ArcSight data to target platform,Export ArcSight historical data for Sentinel migration,Learn how to export your historical data from ArcSight.,"This article describes how to export your historical data from ArcSight. After you complete the steps in this article, you canselect a target platformto host the exported data, and thenselect an ingestion toolto migrate the data. You can export data from ArcSight in several ways. Your selection of an export method depends on the data volumes and the deployed ArcSight environment. You can export the logs to a local folder on the ArcSight server or to another server accessible by ArcSight. 
To expo",2022-06-08T11:06:00.000Z,how-to,decision-making,0.65,True,"Describes multiple export methods depending on data volumes and environment; helps choose export approach, which is migration-related decision-making with scenario-based criteria.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/migration-convert-dashboards,Convert dashboards to workbooks,Convert dashboards to Azure Workbooks in Microsoft Sentinel,,"Learn how to review, planning, and migrate your current dashboards to Azure Workbooks.","Convert dashboards from your existing security information and event management (SIEM) solution to an Azure workbook for Microsoft Sentinel. Azure Workbooks provide versatility to create custom dashboards for Microsoft Sentinel. This article describes how to review, plan, and convert your current dashboards to Azure Workbooks.",2024-06-14T06:14:00.000Z,how-to,,0.5,False,"Covers reviewing, planning, and converting dashboards to Azure Workbooks; appears to be migration how-to without explicit decision matrices or numeric thresholds.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/migration-export-ingest,Ingest data,Microsoft Sentinel migration: Ingest data into target platform,,Learn how to ingest historical data into your selected target platform.,"In previous articles, youselected a target platformfor your historical data. You also selecteda tool to transfer your dataand stored the historical data in a staging location. You can now start to ingest the data into the target platform. 
This article describes how to ingest your historical data into your selected target platform.",2025-04-01T11:23:00.000Z,how-to,,0.5,False,Describes how to ingest historical data into the chosen platform; likely procedural steps rather than comparison matrices or numeric limits.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/migration-ingestion-target-platform,Select target platform,Microsoft Sentinel migration: Select a target Azure platform to host exported data,Choose an Azure target platform for Sentinel historical data,Select a target Azure platform to host the exported historical data,"One of the important decisions you make during your migration process is where to store your historical data. To make this decision, you need to understand and be able to compare the various target platforms. This article compares target platforms in terms of performance, cost, usability and management overhead. Note The considerations in this table only apply to historical log migration, and don't apply in other scenarios, such as long-term retention.",2023-03-07T18:41:00.000Z,how-to,decision-making,0.8,True,"Explicitly compares target platforms on performance, cost, usability, and management in a table; classic decision matrix for platform selection with quantified trade-offs.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/migration-ingestion-tool,Select data ingestion tool,Microsoft Sentinel migration: Select a data ingestion tool,Select a data ingestion tool for Sentinel historical logs,Select a tool to transfer your historical data to the selected target platform.,"After youselect a target platformfor your historical data, the next step is to select a tool to transfer your data. This article describes a set of different tools used to transfer your historical data to the selected target platform. 
This table lists the tools available for each target platform, and general tools to help you with the ingestion process.",2023-01-04T12:20:00.000Z,how-to,decision-making,0.75,True,"Provides a table of tools per target platform and general tools; helps choose ingestion tooling based on platform and scenario, which is decision-making guidance.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/migration-qradar-automation,Migrate SOAR automation,Migrate IBM Security QRadar SOAR automation to Microsoft Sentinel,Migrate QRadar SOAR automation to Sentinel automation rules,"Learn how to identify SOAR use cases, and how to migrate your QRadar SOAR automation to Microsoft Sentinel.","Microsoft Sentinel provides Security Orchestration, Automation, and Response (SOAR) capabilities withautomation rulesandplaybooks. Automation rules automate incident handling and response, and playbooks run predetermined sequences of actions to response and remediate threats. This article discusses how to identify SOAR use cases, and how to migrate your IBM Security QRadar SOAR automation to Microsoft Sentinel. 
Automation rules simplify complex workflows for your incident orchestration processes",2022-05-31T22:09:00.000Z,how-to,decision-making,0.6,True,Discusses identifying SOAR use cases and migrating QRadar automation to Sentinel; provides migration strategy and mapping guidance.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/migration-qradar-detection-rules,Migrate detection rules,Migrate QRadar detection rules to Microsoft Sentinel,Migrate QRadar detection rules to Microsoft Sentinel,"Identify, compare, and migrate your QRadar detection rules to Microsoft Sentinel built-in rules.","This article describes how to identify, compare, and migrate your QRadar detection rules to Microsoft Sentinel built-in rules.",2025-07-27T11:14:00.000Z,how-to,decision-making,0.6,True,"Guides identification, comparison, and migration of QRadar rules to Sentinel built-ins; supports migration decisions and mappings.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/migration-qradar-historical-data,Export historical data,Microsoft Sentinel migration: Export QRadar data to target platform,Export QRadar historical data for Sentinel migration,Learn how to export your historical data from QRadar.,"This article describes how to export your historical data from QRadar. After you complete the steps in this article, you canselect a target platformto host the exported data, and thenselect an ingestion toolto migrate the data. To export your QRadar data, you use the QRadar REST API to run Ariel Query Language (AQL) queries on data stored in an Ariel database. 
Because the export process is resource intensive, we recommend that you use small time ranges in your queries, and only migrate the data ",2022-06-09T11:13:00.000Z,how-to,decision-making,0.7,True,"Explains exporting QRadar data via REST API and recommends small time ranges; includes performance considerations and guidance on what data to migrate, which is migration decision-making.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/migration-security-operations-center-processes,Update SOC processes,Microsoft Sentinel migration: Update SOC and analyst processes,Update SOC and analyst processes for Microsoft Sentinel,Learn how to update your SOC and analyst processes as part of your migration to Microsoft Sentinel.,"A security operations center (SOC) is a centralized function within an organization that integrates people, processes, and technology. A SOC implements the organization's overall cybersecurity framework. The SOC collaborates the organizational efforts to monitor, alert, prevent, detect, analyze, and respond to cybersecurity incidents. SOC teams, led by a SOC manager, may include incident responders, SOC analysts at levels 1, 2, and 3, threat hunters, and incident response managers. SOC teams use",2022-09-20T22:05:00.000Z,how-to,best-practices,0.65,True,"Focuses on updating SOC processes as part of Sentinel migration; likely includes Sentinel-specific operational recommendations and workflows, which are best practices beyond generic SOC theory.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/migration-splunk-automation,Migrate SOAR automation,Migrate Splunk SOAR automation to Microsoft Sentinel,Migrate Splunk SOAR automation to Sentinel automation rules,"Learn how to identify SOAR use cases, and how to migrate your Splunk SOAR automation to Microsoft Sentinel.","Microsoft Sentinel provides Security Orchestration, Automation, and Response (SOAR) capabilities with automation rules and playbooks. 
Automation rules facilitate simple incident handling and response, while playbooks run more complex sequences of actions to respond and remediate threats. This article discusses how to identify SOAR use cases, and how to migrate your Splunk SOAR automation to Microsoft Sentinel automation rules and playbooks. For more information about the differences between auto",2024-10-01T17:10:00.000Z,how-to,decision-making,0.6,True,"Explains how to identify SOAR use cases and migrate Splunk automation to Sentinel; provides mapping and approach choices, which is decision-making content.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/migration-splunk-detection-rules,Migrate detection rules,Migrate Splunk detection rules to Microsoft Sentinel - Microsoft Sentinel,Migrate Splunk detection rules to Microsoft Sentinel analytics,"Learn how to identify, compare, and migrate your Splunk detection rules to Microsoft Sentinel built-in rules.","Splunk detection rules are security information and event management (SIEM) components that compare to analytics rules in Microsoft Sentinel. This article describes the concepts to identify, compare, and migrate them to Microsoft Sentinel. The best way is to start with the SIEM migration experience, which identifies out-of-the-box (OOTB) analytics rules to automatically translate to. 
If you want to migrate your Splunk Observability deployment, learn more about how to migrate from Splunk to Azure M",2024-10-01T17:10:00.000Z,conceptual,decision-making,0.6,True,"Covers identifying, comparing, and migrating Splunk rules to Sentinel; includes guidance on when to use the SIEM migration experience, fitting migration decision-making.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/migration-splunk-historical-data,Export historical data,Export Splunk data to target platform - Microsoft Sentinel,Export Splunk historical data for Sentinel migration,Learn how to export your historical data from Splunk for a Microsoft Sentinel migration of security monitoring use cases.,"This article describes how to export your historical data from Splunk. After you complete the steps in this article, you can select a target platform to host the exported data, and then select an ingestion tool to migrate the data. You can export data from Splunk in several ways. Your selection of an export method depends on the data volumes involved and your level of interactivity. For example, exporting a single, on-demand search via Splunk Web might be appropriate for a low-volume export. Alterna",2024-03-11T08:00:00.000Z,how-to,decision-making,0.65,True,Describes several export methods with guidance based on data volume and interactivity; this is migration decision-making about which export path to choose.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/migration-track,Track migration with a workbook,Track your Microsoft Sentinel migration with a workbook,,"Learn how to track your migration with a workbook, how to customize and manage the workbook, and how to use the workbook tabs for useful Microsoft Sentinel actions.","As your organization's security operations center (SOC) handles growing amounts of data, it's essential to plan and monitor your deployment status. 
While you can track your migration process using generic tools such as Microsoft Project, Microsoft Excel, Microsoft Teams, or Azure DevOps, these tools aren’t specific to security information and event management (SIEM) migration tracking. To help you track, we provide a dedicated workbook in Microsoft Sentinel named Microsoft Sentinel Deployment ",2024-06-14T08:00:00.000Z,how-to,,0.5,False,"Workbook for tracking migration status; mostly operational tracking and UI usage, not deep config, limits, or troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/mitre-coverage,MITRE ATT&CK coverage,View MITRE coverage for your organization from Microsoft Sentinel,,"Learn how to view coverage indicator in Microsoft Sentinel for MITRE tactics that are currently covered, and available to configure, for your organization.","MITRE ATT&CK is a publicly accessible knowledge base of tactics and techniques that are commonly used by attackers, and is created and maintained based on real-world observations. Many organizations use the MITRE ATT&CK knowledge base to develop specific threat models and methodologies that are used to verify security status in their environments. 
Microsoft Sentinel analyzes ingested data, not only to detect threats and help you investigate, but also to visualize the nature and coverage of your ",2025-06-16T08:00:00.000Z,how-to,,0.45,False,"Describes viewing MITRE coverage; primarily visualization/overview of coverage indicators, not detailed config parameters or numeric thresholds for decisions.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/monitor-analytics-rule-integrity,Audit and monitor the health of analytics rules,Monitor the health and audit the integrity of your Microsoft Sentinel analytics rules,Monitor health and integrity of Microsoft Sentinel analytics rules,Use the SentinelHealth data table to keep track of your analytics rules' execution and performance.,"To ensure comprehensive, uninterrupted, and tampering-free threat detection in your Microsoft Sentinel service, keep track of your analytics rules' health and integrity. Keep them functioning optimally by monitoring their execution insights, by querying the health and audit logs, and by using manual rerun to test and optimize your rules. Set up notifications of health and audit events for relevant stakeholders, who can then take action. 
For example, define and send email or Microsoft Teams message",2025-10-30T17:12:00.000Z,conceptual,best-practices,0.7,True,Uses SentinelHealth and audit logs plus manual rerun to ensure tamper-free detection; these are concrete Sentinel-specific monitoring and governance practices.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/monitor-automation-health,Monitor automation rules and playbooks health,Monitor the health of your Microsoft Sentinel automation rules and playbooks,Monitor Sentinel automation rules and playbook health,Use the SentinelHealth and AzureDiagnostics data tables to keep track of your automation rules' and playbooks' execution and performance.,"To ensure proper functioning and performance of your security orchestration, automation, and response operations in your Microsoft Sentinel service, keep track of the health of your automation rules and playbooks by monitoring their execution logs. Set up notifications of health events for relevant stakeholders, who can then take action. For example, define and send email or Microsoft Teams messages, create new tickets in your ticketing system, and so on. This article describes how to use Micros",2025-08-20T08:00:00.000Z,how-to,configuration,0.7,True,"Uses SentinelHealth and AzureDiagnostics tables to monitor automation; includes concrete table names, fields, and monitoring patterns unique to Sentinel.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/monitor-data-connector-health,Monitor data connector health,Monitor the health of your Microsoft Sentinel data connectors,Monitor Microsoft Sentinel data connector health and ingestion,Use the SentinelHealth data table and the Health Monitoring workbook to keep track of your data connectors' connectivity and performance.,"To ensure complete and uninterrupted data ingestion in your Microsoft Sentinel service, keep track of your data connectors' health, connectivity, and performance. 
The following features allow you to perform this monitoring from within Microsoft Sentinel: Data collection health monitoring workbook: This workbook provides additional monitors, detects anomalies, and gives insight regarding the workspace’s data ingestion status. You can use the workbook’s logic to monitor the general health of the i",2025-08-24T17:09:00.000Z,how-to,configuration,0.7,True,"Uses SentinelHealth table and a specific workbook to track connector status; includes product-specific table names, workbook usage, and possibly KQL queries.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/monitor-key-vault-honeytokens,Deploy and monitor decoy honeytokens,Deploy and monitor Azure Key Vault honeytokens with Microsoft Sentinel,,"Plant Azure Key Vault honeytoken keys and secrets, and monitor them with Microsoft Sentinel.","Important The Microsoft Sentinel Deception (Honey Tokens) solution is offered in a community supported model by the Microsoft SIEM & XDR Community. Any support required can be raised as an issue on GitHub where the Microsoft Sentinel community can assist. For solution documentation, review the Honeytokens solution GitHub page.",2023-02-21T18:15:00.000Z,how-to,,0.4,False,"High-level pointer to a community-supported solution and GitHub docs; the expert details live in GitHub, not in this page.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/monitor-optimize-analytics-rule-execution,Monitor and optimize execution of analytics rules,Monitor and optimize the execution of your Microsoft Sentinel scheduled analytics rules,Monitor and optimize Sentinel scheduled analytics rule execution,"Use Microsoft Sentinel's execution management tools, rule insights and manual rerun, to test and manage your scheduled analytics rules' execution.","To ensure that Microsoft Sentinel's threat detection provides complete coverage in your environment, take advantage of its execution management tools. 
These tools consist of insights on your scheduled analytics rules' execution, based on Microsoft Sentinel's health and audit data, and a facility to manually rerun previous executions of rules on specific time windows, for testing and/or troubleshooting purposes. Important Microsoft Sentinel's analytics rule insights and manual rerun are currently in",2023-07-05T11:15:00.000Z,conceptual,best-practices,0.7,True,Describes execution insights and manual rerun for scheduled rules; provides Sentinel-specific tooling and patterns for testing and troubleshooting rule execution.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/monitor-sap-system-health,Monitor SAP system health and role,Monitor the health of the connection between Microsoft Sentinel and your SAP system,Monitor SAP–Sentinel connection health and alerts,Use the SAP connector page and a dedicated alert rule template to keep track of your SAP systems' connectivity and performance.,"After you deploy the SAP solution, you want to ensure proper functioning and performance of your SAP systems, and keep track of your system health, connectivity, and performance. This article describes how you can check the connectivity health manually on the data connector page and use a dedicated alert rule template to monitor the health of your SAP systems. Important Monitoring the health of your SAP systems and the agentless data connector for SAP are both currently in PREVIEW. 
The Azure Previ",2025-09-30T08:00:00.000Z,how-to,configuration,0.66,True,Describes using connector page and alert rule template to monitor SAP connectivity; includes product-specific rule configuration and monitoring patterns.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/monitor-your-data,View customized views with workbooks,Visualize your data using workbooks in Microsoft Sentinel,,Learn how to visualize your data using workbooks in Microsoft Sentinel.,"After you connect your data sources to Microsoft Sentinel, visualize and monitor the data using workbooks in Microsoft Sentinel. Microsoft Sentinel workbooks are based on Azure Monitor workbooks, and add tables and charts with analytics for your logs and queries to the tools already available in Azure. Microsoft Sentinel allows you to create custom workbooks across your data or use existing workbook templates available with packaged solutions or as standalone content from the content hub. Each w",2026-03-24T02:22:00.000Z,how-to,,0.2,False,"Page is a how-to/tutorial on visualizing data with Microsoft Sentinel workbooks. It describes using templates and creating custom workbooks but does not expose product-specific limits, configuration parameter tables, error-code-based troubleshooting, or quantified decision criteria. Content is primarily conceptual and procedural, not expert reference material.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/move-to-defender,Transition to the Defender portal,Transition Your Microsoft Sentinel Environment to the Defender Portal,,Move Microsoft Sentinel operations from the Azure portal to the Microsoft Defender portal.,"Microsoft Sentinel is available in the Microsoft Defender portal with Microsoft Defender XDR or on its own. It delivers a unified experience across SIEM and XDR for faster, more accurate threat detection and response, simpler workflows, and better operational efficiency. 
This article explains how to transition your Microsoft Sentinel experience from the Azure portal to the Defender portal. If you use Microsoft Sentinel in the Azure portal, transition to Microsoft Defender for unified security oper",2026-03-17T08:00:00.000Z,how-to,,0.3,False,"Transition guide between Sentinel in Azure portal and Defender portal; likely procedural/UX steps without limits, configs tables, or error-code-based troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/mssp-protect-intellectual-property,Manage your intellectual property in Microsoft Sentinel,Protecting managed security service provider (MSSPs) intellectual property in Microsoft Sentinel,Protect MSSP intellectual property in Microsoft Sentinel,Learn about how managed security service providers (MSSPs) can protect the intellectual property they've created in Microsoft Sentinel.,"This article describes the methods that managed security service providers (MSSPs) can use to protect intellectual property they've developed in Microsoft Sentinel, such as Microsoft Sentinel analytics rules, hunting queries, playbooks, and workbooks. The method you choose depends on how each of your customers buys Azure; whether you act as a Cloud Solutions Provider (CSP), or the customer has an Enterprise Agreement (EA)/Pay-as-you-go (PAYG) account. 
The following sections describe each of these m",2023-01-09T23:08:00.000Z,conceptual,best-practices,0.65,True,"Covers concrete methods for packaging and isolating analytics rules, queries, and playbooks per customer purchasing model; these are Sentinel-specific operational best practices.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/multiple-tenants-service-providers,Manage multiple tenants (MSSP),Manage multiple tenants in Microsoft Sentinel as a Managed Security Service Provider,Configure multi-tenant management for Microsoft Sentinel MSSPs,How to onboard and manage multiple tenants in Microsoft Sentinel as a Managed Security Service Provider (MSSP) using Azure Lighthouse.,"If you're a managed security service provider (MSSP) and you're using Azure Lighthouse to offer security operations center (SOC) services to your customers, you can manage your customers' Microsoft Sentinel resources directly from your own Azure tenant, without having to connect to the customer's tenant. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Microsoft Sentine",2026-03-18T22:20:00.000Z,how-to,configuration,0.68,True,"Page describes how MSSPs use Azure Lighthouse to onboard and manage multiple customer tenants' Sentinel resources from a provider tenant. This involves product-specific configuration patterns (cross-tenant management model, required setup in provider vs. customer tenants, Sentinel-specific management flows) that go beyond generic concepts. 
While not focused on limits or error codes, it contains concrete, Sentinel- and Lighthouse-specific configuration guidance for multi-tenant SOC operations.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/multiple-workspace-view,Work with incidents in multiple workspaces,Work with Microsoft Sentinel incidents in many workspaces at once,Work with Sentinel incidents across multiple workspaces,How to view incidents in multiple workspaces concurrently in Microsoft Sentinel.,"To take full advantage of Microsoft Sentinel’s capabilities, Microsoft recommends using a single-workspace environment. However, there are some use cases that require having several workspaces, in some cases – for example, that of a Managed Security Service Provider (MSSP) and its customers – across multiple tenants. Multiple workspace view lets you see and work with security incidents across several workspaces at the same time, even across tenants, allowing you to maintain full visibility and contr",2024-10-18T05:37:00.000Z,conceptual,architecture-patterns,0.65,True,Describes multi-workspace incident view and when to use it; this is a Sentinel-specific pattern for multi-workspace incident management.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/near-real-time-rules,Overview,Quick threat detection with near-real-time (NRT) analytics rules in Microsoft Sentinel,Configure near-real-time analytics rules in Sentinel,This article explains how the new near-real-time (NRT) analytics rules can help you detect threats quickly in Microsoft Sentinel.,"When you're faced with security threats, time and speed are of the essence. You need to be aware of threats as they materialize so you can analyze and respond quickly to contain them. Microsoft Sentinel's near-real-time (NRT) analytics rules offer you faster threat detection—closer to that of an on-premises SIEM—and the ability to shorten response times in specific scenarios. 
Microsoft Sentinel’s near-real-time analytics rules provide up-to-the-minute threat detection out-of-the-box. This type of ",2024-05-28T08:00:00.000Z,conceptual,configuration,0.75,True,"NRT rules have specific configuration behaviors (frequency, latency, constraints) that are Sentinel-specific; page explains how these rules work and are configured.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization,ASIM overview,Normalization and the Advanced Security Information Model (ASIM),,This article explains how Microsoft Sentinel normalizes data from many different sources using the Advanced Security Information Model (ASIM),"Microsoft Sentinel ingests data from many sources. Working with various data types and tables together requires you to understand each of them, and write and use unique sets of data for analytics rules, workbooks, and hunting queries for each type or schema. Sometimes, you'll need separate rules, workbooks, and queries, even when data types share common elements, such as firewall devices. Correlating between different types of data during an investigation and hunting can also be challenging. The
Refer to the table below to find the relevant parser for each schema.,2026-03-06T17:59:00.000Z,concept-article,,0.3,False,"Explains using ASIM parsers conceptually and references a table mapping schemas to parsers, but the summary does not show detailed configuration parameters, error codes, or best-practice gotchas; it appears to be usage guidance rather than deep expert configuration or limits.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-about-schemas,ASIM schemas,Advanced Security Information Model (ASIM) schemas,,"This article explains Advanced Security Information Model (ASIM) schemas, and how they help. ASIM normalizes data from many different sources to a uniform presentation.","An Advanced Security Information Model (ASIM) schema is a set of fields that represent an activity or entity. Using the fields from a normalized schema in a query ensures that the query works with every normalized source. To understand how schemas fit within the ASIM architecture, refer to the ASIM architecture diagram.",2026-03-07T06:11:00.000Z,article,,0.2,False,"Describes what ASIM schemas are and their role in normalization, but the summary indicates a conceptual explanation without concrete configuration parameters, limits, or product-specific error/decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-about-workspace-parsers,Workspace deployed parsers,Advanced Security Information Model (ASIM) workspace deployed parsers - Microsoft Sentinel,Manage workspace-deployed ASIM parsers in Sentinel,This article explains how to manage and use workspace deployed Advanced Security Information Model (ASIM) parsers,Workspace deployed Advanced Security Information Model (ASIM) parsers are used to support developing and modifying ASIM parsers.,2026-01-05T23:10:00.000Z,concept-article,configuration,0.6,True,"Describes management and usage of workspace-deployed ASIM parsers, including Sentinel-specific parser deployment and 
configuration behavior.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-common-fields,ASIM common fields,The Advanced Security Information Model (ASIM) common schema fields reference,Apply ASIM common schema fields in Sentinel,This article describes the Advanced Security Information Model (ASIM) common schema fields,"Some fields are common to all ASIM schemas. Each schema might add guidelines for using some of the common fields in the context of the specific schema. For example, permitted values for the EventType field might vary per schema, as might the value of the EventSchemaVersion field.",2025-12-13T23:21:00.000Z,reference,configuration,0.82,True,Defines common ASIM schema fields and permitted values per schema; these field names and allowed values are product-specific configuration/schema reference.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-content,ASIM content,Advanced Security Information Model (ASIM) security content,,This article outlines the Microsoft Sentinel security content that uses the Advanced Security Information Model (ASIM).,"Normalized security content in Microsoft Sentinel includes analytics rules, hunting queries, and workbooks that work with unifying normalization parsers. You can find normalized, built-in content in Microsoft Sentinel galleries and solutions, create your own normalized content, or modify existing content to use normalized data. This article lists built-in Microsoft Sentinel content that has been configured to support the Advanced Security Information Model (ASIM). 
While links to the Microsoft Sen",2025-06-26T11:11:00.000Z,reference,,0.4,False,"Catalog-style listing of ASIM-based security content; mainly links and descriptions of content types without deep configuration, limits, or troubleshooting details.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-develop-parsers,Develop ASIM parsers,Develop Microsoft Sentinel Advanced Security Information Model (ASIM) parsers,Develop and deploy custom ASIM parsers for Sentinel,"This article explains how to develop, test, and deploy Microsoft Sentinel Advanced Security Information Model (ASIM) parsers.","Advanced Security Information Model (ASIM) users use unifying parsers instead of table names in their queries, to view data in a normalized format and to include all data relevant to the schema in the query. Unifying parsers, in turn, use source-specific parsers to handle the specific details of each source. Microsoft Sentinel provides built-in, source-specific parsers for many data sources. 
You may want to modify, or develop, these source-specific parsers in the following situations: When your devic",2023-03-22T00:00:00.000Z,how-to,configuration,0.7,True,"Detailed guidance on developing, testing, and deploying ASIM parsers, including Sentinel-specific parser structure and deployment practices.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-entity-application,ASIM application entity,The Advanced Security Information Model (ASIM) Application Entity reference - Microsoft Sentinel,Implement ASIM Application Entity schema in Sentinel,This article displays the Microsoft Sentinel Application Entity schema.,,2026-01-05T18:11:00.000Z,reference,configuration,0.8,True,Displays the Application Entity schema; field definitions and structure are configuration/schema reference details.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-entity-device,ASIM device entity,The Advanced Security Information Model (ASIM) Device Entity reference,Implement ASIM Device Entity schema in Sentinel,This article displays the Microsoft Sentinel Device Entity schema.,"Devices, or hosts, are the common terms used for the systems that take part in the event. The Dvc prefix is used to designate the primary device on which the event occurs. Some events, such as network sessions, have source and destination devices, designated by the prefixes Src and Dst. 
In such a case, the Dvc prefix is used for the device reporting the event, which might be the source, destination, or a monitoring device.",2025-11-20T06:12:00.000Z,reference,configuration,0.88,True,"Defines Device Entity schema and prefixes (Dvc, Src, Dst) for events; field-level schema details are product-specific configuration reference.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-entity-user,ASIM user entity,The Advanced Security Information Model (ASIM) User Entity reference,Implement ASIM User Entity schema in Sentinel,This article displays the Microsoft Sentinel User Entity schema.,"Users are central to activities reported by events. The user entity fields listed in this section are used to describe the users involved in the action. When used in an event, prefixes are used to designate the role of a user entity in the activity. The prefixes Src and Dst are used to designate the user role in network related events, in which a source system and a destination system communicate. 
The prefixes 'Actor' and 'Target' are used for system oriented events such as process events.",2025-11-20T06:12:00.000Z,reference,configuration,0.88,True,"Provides the User Entity schema with field names, roles, and prefixes (Src, Dst, Actor, Target); this is detailed schema configuration unique to ASIM.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-functions,ASIM helper functions,Advanced Security Information Model (ASIM) helper functions,Use ASIM helper functions for normalized data in Sentinel,This article outlines the Microsoft Sentinel Advanced Security Information Model (ASIM) helper functions.,Advanced Security Information Model (ASIM) helper functions extend the KQL language providing functionality that helps interact with normalized data and in writing parsers.,2026-01-21T06:14:00.000Z,reference,integrations,0.7,True,Describes ASIM helper functions that extend KQL with product-specific parameters and usage patterns for interacting with normalized data; this is an integration/coding pattern reference.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-ingest-time,Ingest-time normalization,Ingest time normalization,,This article explains how Microsoft Sentinel normalizes data at ingest,,2025-06-18T17:06:00.000Z,concept-article,,0.4,False,High-level explanation of ingest-time normalization; summary suggests conceptual description rather than detailed configuration parameters.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-known-issues,ASIM known issues,Advanced Security Information Model (ASIM) known issues,Review ASIM known issues and limitations in Sentinel,This article outlines the Microsoft Sentinel Advanced Security Information Model (ASIM) known issues.,The following are the Advanced Security Information Model (ASIM) known issues and limitations:,2026-01-07T19:08:00.000Z,reference,limits-quotas,0.7,True,Documents specific known issues and limitations of ASIM; such constraints 
and edge cases are expert knowledge about product behavior and limits.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-manage-parsers,Manage ASIM parsers,Manage Advanced Security Information Model (ASIM) parsers,Manage and customize ASIM parsers in Microsoft Sentinel,"This article explains how to manage Advanced Security Information Model (ASIM) parsers, add a custom parser, and replace a built-in parser.","Advanced Security Information Model (ASIM) users use unifying parsers instead of table names in their queries, to view data in a normalized format and get all the data relevant to the schema in a single query. Each unifying parser uses multiple source-specific parsers that handle each source's specific details. To understand how parsers fit within the ASIM architecture, refer to the ASIM architecture diagram. You may need to manage the source-specific parsers used by each unifying parser to: Add a ",2026-01-05T23:10:00.000Z,how-to,configuration,0.7,True,"Provides concrete steps to add custom parsers and replace built-in ones, including parser naming and management patterns unique to Sentinel’s ASIM.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-modify-content,Modify content to use ASIM,Modify content to use the Microsoft Sentinel Advanced Security Information Model (ASIM),Convert Sentinel content to use ASIM normalized data,This article explains how to convert Microsoft Sentinel content to use the Advanced Security Information Model (ASIM).,"Normalized security content in Microsoft Sentinel includes analytics rules, hunting queries, and workbooks that work with unifying normalization parsers. You can find normalized, out-of-the-box content in Microsoft Sentinel galleries and solutions, create your own normalized content, or modify existing, custom content to use normalized data. 
This article explains how to convert existing Microsoft Sentinel analytics rules to use normalized data with the Advanced Security Information Model (ASIM). To",2026-01-01T12:10:00.000Z,conceptual,configuration,0.7,True,"Explains how to modify analytics rules and other content to use ASIM parsers and normalized schemas, with concrete Sentinel-specific patterns.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-parsers-list,ASIM parsers,List of Microsoft Sentinel Advanced Security Information Model (ASIM) parsers,,This article lists Advanced Security Information Model (ASIM) parsers.,"This document provides a list of Advanced Security Information Model (ASIM) parsers. For an overview of ASIM parsers refer to the parsers overview. To understand how parsers fit within the ASIM architecture, refer to the ASIM architecture diagram. Parsers that don't have a value under Uses pack parameter don't have the AdditionalFields column populated.",2026-03-12T05:52:00.000Z,reference,,0.1,False,"Page is primarily a catalog/list of ASIM parsers without configuration parameters, limits, error codes, or decision matrices. It does not map to the defined sub-skill types and lacks the kind of expert numeric/configuration detail required.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-parsers-overview,ASIM parsers,Microsoft Sentinel Advanced Security Information Model (ASIM) parsers overview,,This article provides an overview of Advanced Security Information Model (ASIM) parsers and a link to more detailed ASIM parsers documents.,"In Microsoft Sentinel, parsing and normalizing happen at query time. Parsers are built as KQL user-defined functions that transform data in existing tables, such as CommonSecurityLog, custom logs tables, or Syslog, into the normalized schema. 
Users use Advanced Security Information Model (ASIM) parsers instead of table names in their queries to view data in a normalized format, and to include all data relevant to the schema in their query. To understand how parsers fit within the ASIM architecture, refe",2026-01-05T23:10:00.000Z,conceptual,,0.4,False,"Overview of ASIM parsers; primarily conceptual and linking to detailed parser docs, without concrete configuration or troubleshooting content.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-alert,ASIM alert event schema,The Advanced Security Information Model (ASIM) Alert Events normalization schema reference,Use ASIM Alert Events normalization schema in Sentinel,This article displays the Microsoft Sentinel Alert Events normalization schema.,"The Microsoft Sentinel Alert Schema is designed to normalize security-related alerts from various products into a standardized format within Microsoft Advanced Security Information Model (ASIM). This schema focuses exclusively on security events, ensuring consistent and efficient analysis across different data sources. The Alert Schema represents various types of security alerts, such as threats, suspicious activities, user behavior anomalies and compliance violations. These alerts are reported ",2026-01-21T06:14:00.000Z,reference,configuration,0.9,True,Reference for the Alert Events normalization schema with fields and structure for security alerts; detailed schema is expert configuration knowledge.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-asset,ASIM asset entity schema,The Advanced Security Information Model (ASIM) Asset Entity normalization schema reference,,This article displays the Microsoft Sentinel Asset Entity normalization schema.,"The Microsoft Sentinel Asset Entity Schema is designed to normalize assets from various products into a standardized format within Microsoft Advanced Security Information Model (ASIM). 
This schema focuses exclusively on assets in non-Microsoft data sources, ensuring consistent and efficient analysis. An asset is any data resource that an organization stores, processes, or manages, such as a file, or site. Each asset carries security-relevant metadata including ownership, permissions, sensitivity",2026-03-06T17:59:00.000Z,reference,,0.2,False,"Asset Entity normalization schema reference is largely a schema/field definition. While detailed, it is not a configuration, limits, troubleshooting, or decision-making guide and does not fit any specified sub-skill category.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-audit,ASIM audit event schema,The Advanced Security Information Model (ASIM) Audit Events normalization schema reference,Use ASIM Audit Events normalization schema in Sentinel,This article displays the Microsoft Sentinel Audit Events normalization schema.,"The Microsoft Sentinel Audit events normalization schema represents events associated with the audit trail of information systems. The audit trail logs system configuration activities and policy changes. Such changes are often performed by system administrators, but can also be performed by users when configuring the settings of their own applications. Every system logs audit events alongside its core activity logs. 
For example, a firewall will log events about the network sessions it processes,",2025-12-04T23:11:00.000Z,reference,configuration,0.9,True,Defines the Audit Events normalization schema; field-level schema details are product-specific configuration.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-authentication,ASIM authentication schema,The Advanced Security Information Model (ASIM) Authentication normalization schema reference,Use ASIM Authentication normalization schema in Sentinel,This article describes the Microsoft Sentinel Authentication normalization schema.,"The Microsoft Sentinel Authentication schema is used to describe events related to user authentication, sign-in, and sign-out. Authentication events are sent by many reporting devices, usually as part of the event stream alongside other events. For example, Windows sends several authentication events alongside other OS activity events. Authentication events include both events from systems that focus on authentication such as VPN gateways or domain controllers, and direct authentication to an en",2023-03-26T00:00:00.000Z,reference,configuration,0.9,True,Describes the Authentication schema for sign-in/sign-out events; schema fields and semantics are configuration reference.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-dhcp,ASIM DHCP schema,The Advanced Security Information Model (ASIM) DHCP normalization schema reference,Use ASIM DHCP normalization schema in Sentinel,This article describes the Microsoft Sentinel DHCP normalization schema.,"The DHCP information model is used to describe events reported by a DHCP server, and is used by Microsoft Sentinel to enable source-agnostic analytics. 
For more information, see Normalization and the Advanced Security Information Model (ASIM).",2026-01-21T06:14:00.000Z,reference,configuration,0.9,True,DHCP normalization schema reference; field definitions and usage are configuration-level expert details.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-dns,ASIM DNS schema,The Advanced Security Information Model (ASIM) DNS normalization schema reference,Use ASIM DNS normalization schema in Sentinel,This article describes the Microsoft Sentinel DNS normalization schema.,"The DNS information model is used to describe events reported by a DNS server or a DNS security system, and is used by Microsoft Sentinel to enable source-agnostic analytics. For more information, see Normalization and the Advanced Security Information Model (ASIM).",2026-01-21T06:14:00.000Z,reference,configuration,0.9,True,DNS normalization schema reference with specific fields for DNS events; detailed schema is expert configuration knowledge.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-file-event,ASIM file event schema,The Advanced Security Information Model (ASIM) File Event normalization schema reference,Use ASIM File Event normalization schema in Sentinel,This article describes the Microsoft Sentinel File Event normalization schema.,"The File Event normalization schema is used to describe file activity such as creating, modifying, or deleting files or documents. Such events are reported by operating systems, file storage systems such as Azure Files, and document management systems such as Microsoft SharePoint. 
For more information about normalization in Microsoft Sentinel, see Normalization and the Advanced Security Information Model (ASIM).",2026-01-21T06:14:00.000Z,reference,configuration,0.9,True,File Event normalization schema for file activity; schema fields and constraints are product-specific configuration.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-network,ASIM network session schema,The Advanced Security Information Model (ASIM) Network Session normalization schema reference,Use ASIM Network Session normalization schema in Sentinel,This article displays the Microsoft Sentinel Network Session normalization schema.,"The Microsoft Sentinel Network Session normalization schema represents an IP network activity, such as network connections and network sessions. Such events are reported, for example, by operating systems, routers, firewalls, and intrusion prevention systems. The network normalization schema can represent any type of an IP network session but is designed to provide support for common source types, such as Netflow, firewalls, and intrusion prevention systems. For more information about normalizat",2026-01-05T18:11:00.000Z,reference,configuration,0.9,True,Network Session normalization schema reference; detailed field mappings for IP network activity are configuration/schema knowledge.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-process-event,ASIM process event schema,The Advanced Security Information Model (ASIM) Process Event normalization schema reference,Use ASIM Process Event normalization schema in Sentinel,This article describes the Microsoft Sentinel Process Event normalization schema.,"The Process Event normalization schema is used to describe the operating system activity of running and terminating a process. Such events are reported by operating systems and security systems, such as EDR (End Point Detection and Response) systems. 
A process, as defined by OSSEM, is a containment and management object that represents a running instance of a program. While processes themselves do not run, they do manage threads that run and execute code. For more information about normalization",2025-12-04T23:11:00.000Z,reference,configuration,0.9,True,Process Event normalization schema; field-level schema details are expert configuration information.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-registry-event,ASIM registry event schema,The Advanced Security Information Model (ASIM) Registry Event normalization schema reference,Use ASIM Registry Event normalization schema in Sentinel,This article describes the Microsoft Sentinel Registry Event normalization schema.,"The Registry Event schema is used to describe the Windows activity of creating, modifying, or deleting Windows Registry entities. Registry events are specific to Windows systems, but are reported by different systems that monitor Windows, such as EDR (End Point Detection and Response) systems, Sysmon, or Windows itself. For more information about normalization in Microsoft Sentinel, see Normalization and the Advanced Security Information Model (ASIM).",2026-01-21T06:14:00.000Z,reference,configuration,0.9,True,Registry Event normalization schema for Windows registry activity; schema fields and usage are configuration reference.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-user-management,ASIM user management schema,Microsoft Sentinel user management normalization schema reference,Use Sentinel user management normalization schema,This article describes the Microsoft Sentinel user management normalization schema.,"The Microsoft Sentinel user management normalization schema is used to describe user management activities, such as creating a user or a group, changing user attributes, or adding a user to a group. 
Such events are reported, for example, by operating systems, directory services, identity management systems, and any other system reporting on its local user management activity. For more information about normalization in Microsoft Sentinel, see Normalization and the Advanced Security Information Mod",2026-01-21T06:14:00.000Z,reference,configuration,0.9,True,User management normalization schema for activities like creating users/groups; detailed schema is product-specific configuration.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-v1,Legacy network normalization schema,Microsoft Sentinel network normalization schema (Legacy version - Public preview),Use legacy Sentinel network normalization schema v0.1,This article displays the Microsoft Sentinel data normalization schema.,"The network normalization schema is used to describe reported network events, and is used by Microsoft Sentinel to enable unifying analytics. For more information, see Normalization and the Advanced Security Information Model (ASIM). Important This article relates to version 0.1 of the network normalization schema, which was released as a preview before ASIM was available. Version 0.2.x of the network normalization schema aligns with ASIM and provides other enhancements. For more information, seeDi",2023-01-30T12:21:00.000Z,reference,configuration,0.86,True,Legacy network normalization schema reference (version 0.1) with specific fields and versioning details; schema and version behavior are configuration-level expert knowledge.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-web,ASIM web session schema,The Advanced Security Information Model (ASIM) Web Session normalization schema reference,Use ASIM Web Session normalization schema in Sentinel,This article displays the Microsoft Sentinel Web Session normalization schema.,"The Web Session normalization schema is used to describe an IP network activity. 
For example, IP network activities are reported by web servers, web proxies, and web security gateways. For more information about normalization in Microsoft Sentinel, see Normalization and the Advanced Security Information Model (ASIM).",2025-12-13T23:21:00.000Z,reference,configuration,0.9,True,Web Session normalization schema for IP web activity; field definitions and structure are configuration/schema reference.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/notebook-get-started,Get started with notebooks and MSTICPy,Get started with Jupyter notebooks and MSTICPy in Microsoft Sentinel,Configure Sentinel notebooks and MSTICPy basics,Walk through the Getting Started Guide For Microsoft Sentinel ML Notebooks to learn the basics of Microsoft Sentinel notebooks with MSTICPy and queries.,"This article describes how to run the Getting Started Guide For Microsoft Sentinel ML Notebooks notebook, which sets up basic configurations for running Jupyter notebooks in Microsoft Sentinel and provides examples for running simple queries. The Getting Started Guide for Microsoft Sentinel ML Notebooks notebook uses MSTICPy, a powerful Python library designed to enhance security investigations and threat hunting within Microsoft Sentinel notebooks. It provides built-in tools for data enrichment, vis",2025-07-01T11:22:00.000Z,how-to,configuration,0.7,True,"Walks through running the Getting Started notebook, including basic configurations for Sentinel notebooks and MSTICPy usage—product-specific configuration and code patterns.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/notebooks,Overview,Jupyter notebooks with Microsoft Sentinel hunting capabilities,Use Jupyter notebooks for Sentinel threat hunting,Learn about Jupyter notebooks in Microsoft Sentinel for security hunting.,"Jupyter notebooks combine full programmability with a huge collection of libraries for machine learning, visualization, and data analysis. 
These attributes make Jupyter a compelling tool for security investigation and hunting. The foundation of Microsoft Sentinel is the data store; it combines high-performance querying and a dynamic schema, and scales to massive data volumes. The Azure portal and all Microsoft Sentinel tools use a common API to access this data store. The same API is also available f",2025-04-24T22:03:00.000Z,conceptual,architecture-patterns,0.6,True,"Explains how Sentinel’s data store and API integrate with Jupyter notebooks for hunting, a product-specific investigation pattern that goes beyond generic notebook usage.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/notebooks-hunt,Launch Jupyter notebook,Hunt for security threats with Jupyter notebooks - Microsoft Sentinel,Run Sentinel hunting notebooks in Azure ML workspaces,Launch and run notebooks with the Microsoft Sentinel hunting capabilities.,"As part of your security investigations and hunting, launch and run Jupyter notebooks to programmatically analyze your data. In this article, you create an Azure Machine Learning workspace, launch a notebook from Microsoft Sentinel to your Azure Machine Learning workspace, and run code in the notebook. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. 
All customers using Microsoft Sentinel",2025-04-24T22:03:00.000Z,how-to,deployment,0.6,True,"Covers creating an Azure ML workspace, launching notebooks from Sentinel into that workspace, and running code—this is a concrete deployment/integration pattern for notebooks in production-like environments.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/notebooks-msticpy-advanced,Configure advanced MSTICPy settings,Advanced configurations for Jupyter notebooks and MSTICPy in Microsoft Sentinel,Apply advanced MSTICPy and notebook settings in Sentinel,Learn about advanced configurations available for Jupyter notebooks with MSTICPy when working in Microsoft Sentinel.,"This article describes advanced configurations for working with Jupyter notebooks and MSTICPy in Microsoft Sentinel. For more information, see Use Jupyter notebooks to hunt for security threats and Get started with Jupyter notebooks and MSTICPy in Microsoft Sentinel.",2025-03-30T23:13:00.000Z,how-to,configuration,0.7,True,"Explicitly about advanced configurations for Jupyter notebooks and MSTICPy in Sentinel, implying detailed parameters and options unique to this integration.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/offboard,Remove Microsoft Sentinel from your workspaces,Remove Microsoft Sentinel from your workspace,Remove Microsoft Sentinel from a Log Analytics workspace,Learn how to delete your Microsoft Sentinel instance to discontinue use of Microsoft Sentinel and the associated costs.,"If you no longer want to use Microsoft Sentinel, this article explains how to remove it from your Log Analytics workspace. 
If you instead want to offboard Microsoft Sentinel from the Defender portal, see Offboard Microsoft Sentinel.",2025-07-16T08:00:00.000Z,how-to,configuration,0.6,True,"Provides concrete steps and implications for offboarding Sentinel from a workspace, which are product-specific configuration operations.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/offboard-implications,Overview,Implications - remove Microsoft Sentinel from workspace,Understand removal impact of Microsoft Sentinel workspaces,Learn about the impact of removing a Microsoft Sentinel instance from a Log Analytics workspace in the Azure or Defender portal.,"If you decide that you no longer want to use your Microsoft Sentinel instance associated with a Log Analytics workspace, remove Microsoft Sentinel from the workspace. But before you do, consider the implications described in this article. It can take up to 48 hours for Microsoft Sentinel to be removed from the Log Analytics workspace. Data connector configuration and Microsoft Sentinel tables are deleted. Other resources and data are retained for a limited time. 
Your subscription continues to be",2025-02-06T23:03:00.000Z,concept-article,limits-quotas,0.7,True,"Describes concrete timing and retention implications when removing Sentinel from a workspace (for example, removal can take up to 48 hours and some resources are retained for a limited time), which are product-specific limits/behaviors not inferable from general knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/ops-guide,SIEM operations guide,Operational guide - Microsoft Sentinel,Apply operational recommendations for Microsoft Sentinel SOCs,Learn about the operational recommendations to help security operations teams to plan and run security activities.,"This article lists the operational activities that we recommend security operations (SOC) teams and security administrators plan for and run as part of their regular security activities with Microsoft Sentinel. For more information about managing your security operations, see Security operations overview.",2025-02-05T18:02:00.000Z,reference,best-practices,0.6,True,"An operational guide listing recommended SOC activities for Sentinel; likely contains product-specific operational DOs/DON’Ts and schedules, which are best-practice guidance beyond generic security ops theory.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/overview,Microsoft Sentinel SIEM overview,What is Microsoft Sentinel SIEM?,,"Learn about Microsoft Sentinel, a scalable, cloud-native SIEM and SOAR that uses AI, analytics, and automation for threat detection, investigation, and response.","Microsoft Sentinel is a cloud-native SIEM solution that delivers scalable, cost-efficient security across multicloud and multiplatform environments. It combines AI, automation, and threat intelligence to support threat detection, investigation, response, and proactive hunting. Microsoft Sentinel SIEM empowers analysts to anticipate and stop attacks across clouds and platforms, faster and with greater precision. 
This article highlights the key capabilities in Microsoft Sentinel. Microsoft Sentine",2026-01-28T08:00:00.000Z,overview,,0.25,False,Another high-level overview of Sentinel SIEM capabilities; marketing/feature description without detailed expert-only configuration or limits.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/package-platform-solution,Packaging and publishing a Sentinel platform solution,Package and publish a Microsoft Sentinel platform solution,Package and publish Microsoft Sentinel platform solutions,Learn how to package the components of a Microsoft Sentinel platform solution and publish the package in the Microsoft Security Store.,A Microsoft Sentinel platform solution is a deployable package for the Microsoft Sentinel data lake. It includes code and configuration that help you analyze and respond to security data. This article shows how to package and publish your completed platform solution in the Microsoft Security Store.,2025-09-30T12:49:00.000Z,how-to,deployment,0.7,True,Shows how to bundle data lake components and configuration into a deployable platform solution and publish it to the Security Store; this is Sentinel-specific deployment packaging.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/partner-integrations,Partner integrations best practices,Microsoft Sentinel components and patterns,Design Microsoft Sentinel solution components and patterns,This article describes best practices for creating your own integrations with Microsoft Sentinel.,"This article discusses the different components of a Microsoft Sentinel solution and how they can work together to address important customer scenarios. The Sentinel platform includes a data lake, graph, Jupyter notebook jobs, a Model Context Protocol (MCP) server, and data from more than 300 Sentinel connectors to help customers centralize and analyze their security data in a cost-efficient way. 
These capabilities plus Microsoft Security Copilot enable customers and partners to create impactful",2025-09-30T17:14:00.000Z,conceptual,architecture-patterns,0.7,True,"Describes how Sentinel data lake, graph, notebooks, MCP server, and connectors work together; this is a product-specific architectural pattern guide for solutions.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/powerbi,Create a Power BI report,Create a Power BI report from Microsoft Sentinel data,Build Power BI reports from Sentinel log data,Learn how to create a Power BI report using an exported query from Microsoft Sentinel. Share your report with others in the Power BI service and a Teams channel.,"Power BI is a reporting and analytics platform that turns data into coherent, immersive, interactive visualizations. Power BI lets you easily connect to data sources, visualize and discover relationships, and share insights with whoever you want. You can base Power BI reports on data from Microsoft Sentinel and share those reports with people who don't have access to Microsoft Sentinel. For example, you might want to share information about failed sign-in attempts with app owners, without grantin",2025-09-16T17:23:00.000Z,how-to,integrations,0.6,True,"Describes exporting queries from Sentinel to Power BI; such pages typically include connection strings, query export formats, and dataset configuration specific to Sentinel-Power BI integration.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/prepare-multiple-workspaces,Prepare for multiple workspaces,Prepare for multiple workspaces and tenants in Microsoft Sentinel,,"To prepare for your deployment, learn how Microsoft Sentinel can extend across multiple workspaces and tenants.","Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. 
All customers using Microsoft Sentinel in the Azure portal will be redirected to the Defender portal and will use Microsoft Sentinel in the Defender portal only. If you're still using Microsoft Sentinel in the Azure portal, we recommend that you start planning your transition to the Defender portal to ensure a smooth transition and take full ",2025-07-16T11:12:00.000Z,conceptual,,0.45,False,Multiple workspace preparation article appears to be conceptual architecture planning without numeric thresholds or detailed decision matrices.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/prerequisites,Prerequisites,Prerequisites for deploying Microsoft Sentinel,,Learn about prerequisites to deploy Microsoft Sentinel.,"Before deploying Microsoft Sentinel, make sure that your Azure tenant meets the requirements listed in this article. This article is part of the Deployment guide for Microsoft Sentinel.",2025-05-30T11:13:00.000Z,conceptual,,0.4,False,"Prerequisites list is likely conceptual (tenant requirements, permissions) without detailed numeric limits or config tables in the summary.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/prioritize-data-connectors,Prioritize data connectors,Prioritize data connectors for Microsoft Sentinel,,Learn how to plan and prioritize which data sources to use for your Microsoft Sentinel deployment.,"In this article, you learn how to plan and prioritize which data sources to use for your Microsoft Sentinel deployment. 
This article is part of the Deployment guide for Microsoft Sentinel.",2023-10-11T15:53:00.000Z,concept-article,,0.3,False,Prioritizing data connectors is planning guidance; summary does not indicate specific numeric criteria or configuration tables.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/publish-sentinel-solutions,Publish solutions,Publish security information and event management (SIEM) solutions to Microsoft Sentinel,Publish Microsoft Sentinel SIEM solutions to marketplace,This article guides you through the process of publishing solutions to Microsoft Sentinel.,"Microsoft’s commercial marketplace is an online marketplace for applications and services that lets businesses of all sizes offer solutions to customers around the world. As an independent software vendor (ISV) member of the Partner Program, you can create, publish, and manage your Microsoft Sentinel SIEM solutions in Partner Center. Your solutions are listed together with other Microsoft solutions, connecting you to businesses, organizations, and government agencies around the world. Microsoft S",2024-10-14T17:05:00.000Z,how-to,deployment,0.7,True,"Details publishing flow, offer types, and requirements for Sentinel solutions in Partner Center and content hub; this is product-specific deployment to marketplace.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/purview-solution,Integrate Microsoft Purview,Integrate Microsoft Sentinel and Microsoft Purview,Integrate Microsoft Purview solution with Microsoft Sentinel,"This article describes how to use the Microsoft Sentinel data connector and solution for Microsoft Purview to enable data sensitivity insights, create rules to monitor when classifications have been d","Microsoft Purview provides organizations with visibility into where sensitive information is stored, helping prioritize at-risk data for protection. 
Integrate Microsoft Purview with Microsoft Sentinel to help narrow down the high volume of incidents and threats surfaced in Microsoft Sentinel, and understand the most critical areas to start. Start by ingesting your Microsoft Purview logs into Microsoft Sentinel through a data connector. Then use a Microsoft Sentinel workbook to view data such as a",2024-11-27T18:02:00.000Z,how-to,configuration,0.65,True,"Describes how to enable the Purview data connector and solution in Sentinel to surface data sensitivity insights, with Sentinel-specific configuration steps.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/quickstart-onboard,Onboard to Microsoft Sentinel,Onboard to Microsoft Sentinel,,"In this quickstart, you enable Microsoft Sentinel, and set up data connectors to monitor and protect your environment.","In this quickstart, you'll enable Microsoft Sentinel and install a solution from the content hub. Then, you'll set up a data connector to start ingesting data into Microsoft Sentinel. Microsoft Sentinel comes with many data connectors for Microsoft products such as the Microsoft Defender XDR service-to-service connector. You can also enable built-in connectors for non-Microsoft products such as Syslog or Common Event Format (CEF). For this quickstart, you'll use the Azure Activity data connector",2025-09-17T11:11:00.000Z,how-to,,0.2,False,"Quickstart onboarding tutorial; focuses on basic enablement and a sample data connector, not deep configuration or expert-only details.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/relate-alerts-to-incidents,Relate alerts to incidents,Relate alerts to incidents in Microsoft Sentinel in the Azure portal,,This article shows you how to relate alerts to your incidents in Microsoft Sentinel in the Azure portal.,"This article shows you how to relate alerts to your incidents in Microsoft Sentinel. 
This feature allows you to manually or automatically add alerts to, or remove them from, existing incidents in the Azure portal as part of your investigation processes, refining the incident scope as the investigation unfolds. Important Incident expansion is currently in PREVIEW. The Azure Preview Supplemental Terms include additional legal terms that apply to Azure features that are in beta, preview, or otherwise",2025-05-20T22:03:00.000Z,how-to,,0.45,False,Shows how to relate alerts to incidents; summary suggests feature usage rather than deep configuration or troubleshooting content.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/resource-context-rbac,Manage workspace access with resource-context RBAC,Manage access to Microsoft Sentinel data by resource,Configure resource-context RBAC for Microsoft Sentinel data access,"This article explains how you can manage access to Microsoft Sentinel data by the resources a user can access. Managing access by resource enables you to provide access to specific data only, without the ","Access to a workspace is managed by using Azure RBAC. Typically, users who have access to a Log Analytics workspace enabled for Microsoft Sentinel also have access to all the workspace data, including security content. Administrators can use Azure roles to configure access to specific features in Microsoft Sentinel, depending on the access requirements in their team. 
However, you may have some users who need to access only specific data in your workspace, but shouldn't have access to the entire Mi",2024-08-28T17:05:00.000Z,conceptual,security,0.8,True,Explains managing access by resource with Azure roles; such content typically lists specific RBAC roles and scope behaviors unique to Sentinel data access.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/respond-threats-during-investigation,Remediate threats while investigating,Respond to threat actors while investigating or threat hunting in Microsoft Sentinel in the Azure portal,Trigger Sentinel playbooks from entities during hunts,"This article shows you how to take response actions against threat actors on the spot, during the course of an incident investigation or threat hunt, without pivoting or context switching out of the i","This article shows you how to take response actions against threat actors on the spot, during the course of an incident investigation or threat hunt, without pivoting or context switching out of the investigation or hunt. You accomplish this using playbooks based on the new entity trigger. The entity trigger currently supports the following entity types: Important The entity trigger is currently in PREVIEW. See the Supplemental Terms of Use for Microsoft Azure Previews for additional legal terms that",2024-05-21T17:56:00.000Z,how-to,integrations,0.65,True,Shows how to use playbooks with the entity trigger and lists supported entity types—this is a Sentinel-specific integration pattern between incidents/hunts and automation.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/restore,Restore historical data,Restore archived logs from search - Microsoft Sentinel,Restore archived Sentinel logs for high-performance queries,Learn how to restore archived logs from search job results.,"Restore data from an archived log to use in high performing queries and analytics. 
Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Microsoft Sentinel in the Azure portal will be redirected to the Defender portal and will use Microsoft Sentinel in the Defender portal only. If you're still using Microsoft Sentinel in the Azure portal, we recommend that you start plannin",2025-09-21T22:10:00.000Z,how-to,configuration,0.6,True,"Describes restoring archived logs from search job results, which involves Sentinel-specific configuration/operations around data states and query performance.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/roles,Plan roles and permissions,Roles and permissions in the Microsoft Sentinel platform,Configure Microsoft Sentinel roles and permissions,"Learn how Microsoft Sentinel assigns permissions to users using both Azure and Microsoft Entra ID role-based access control, and identify the allowed actions for each role.","This article explains how Microsoft Sentinel assigns permissions to user roles for both Microsoft Sentinel SIEM and Microsoft Sentinel data lake, identifying the allowed actions for each role. Microsoft Sentinel uses Azure role-based access control (Azure RBAC) to provide built-in and custom roles for Microsoft Sentinel SIEM, and Microsoft Entra ID role-based access control (Microsoft Entra ID RBAC) to provide built-in and custom roles for Microsoft Sentinel data lake. You can assign roles to users,",2026-01-07T08:00:00.000Z,concept-article,security,0.86,True,"The page details how Microsoft Sentinel uses Azure RBAC and Microsoft Entra ID RBAC, and enumerates allowed actions for each Sentinel-specific role. 
This includes product-specific role names and their permission scopes, which qualify as expert security configuration knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sample-workspace-designs,Review sample workspace designs,Sample Microsoft Sentinel workspace designs,,"Learn from samples of Microsoft Sentinel architecture designs with multiple tenants, clouds or regions.","This article describes suggested Log Analytics workspace designs for organizations with the following sample requirements: For more information, see Design a Log Analytics workspace architecture. This article is part of the Deployment guide for Microsoft Sentinel.",2024-08-28T17:05:00.000Z,conceptual,,0.45,False,"Sample workspace designs are architecture guidance but summary suggests conceptual patterns, not quantified thresholds or decision matrices.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/collect-sap-hana-audit-logs,Collect SAP HANA audit logs,Collect SAP HANA audit logs in Microsoft Sentinel,Configure SAP HANA audit log collection in Sentinel,This article explains how to collect audit logs from your SAP HANA database.,"This article explains how to collect audit logs from your SAP HANA database in customer managed environments. Content in this article is intended for your security, infrastructure, and SAP BASIS teams. Important Microsoft Sentinel SAP HANA support is currently in PREVIEW. The Azure Preview Supplemental Terms include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability. 
Important For environments managed by SAP such as SA",2026-02-09T18:26:00.000Z,how-to,configuration,0.7,True,Explains how to collect SAP HANA audit logs; likely includes HANA-specific configuration parameters and Sentinel connector settings that are product-specific.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/cross-workspace,Integrate SAP across multiple workspaces,Integrate SAP across multiple workspaces,Design multi-workspace architecture for Sentinel SAP,Learn how to work with the Microsoft Sentinel solution for SAP applications in multiple workspaces for different deployment scenarios.,"When you set up your Log Analytics workspace enabled for Microsoft Sentinel, you have multiple architecture options and factors to consider. Taking into account geography, regulation, access control, and other factors, you might choose to have multiple workspaces in your organization. When working with SAP, your SAP and SOC teams might need to work in separate workspaces to maintain security boundaries. You might not want the SAP team to have visibility into all other security logs across your org",2024-11-07T18:01:00.000Z,conceptual,architecture-patterns,0.7,True,"Discusses multi-workspace options and factors (geography, regulation, access control) for SAP with Sentinel; this is architecture guidance specific to this solution.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-command-line,Deploy the agent from the command line,Connect your SAP system by deploying your data connector agent container from the command line,Deploy SAP connector container via command line,This article describes how to connect your SAP system to Microsoft Sentinel by deploying the container that hosts the SAP data connector agent using the command line.,"This article provides command line options for deploying an SAP data connector agent. 
For typical deployments we recommend that you use the portal instead of the command line, as data connector agents installed via the command line can be managed only via the command line. However, if you're using a configuration file to store your credentials instead of Azure Key Vault, or if you're an advanced user who wants to deploy the data connector manually, such as in a Kubernetes cluster, use the procedur",2025-09-30T08:00:00.000Z,how-to,deployment,0.78,True,Provides command-line deployment options for the SAP connector container; will list product-specific commands and parameters beyond generic deployment knowledge.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-data-connector-agent-container,Connect your SAP system,Connect your SAP system to Microsoft Sentinel,Deploy SAP data connector container to Sentinel,This article describes how to connect your SAP system to Microsoft Sentinel by deploying the container that hosts the SAP data connector agent.,"For the Microsoft Sentinel solution for SAP applications to operate correctly, you must first get your SAP data into Microsoft Sentinel. Do this by either deploying the Microsoft Sentinel SAP data connector agent, or by connecting the Microsoft Sentinel agentless data connector for SAP. Select the option at the top of the page that matches your environment. This article describes the third step in deploying one of the Microsoft Sentinel solutions for SAP applications. 
Important The data connecto",2025-09-30T08:00:00.000Z,how-to,deployment,0.7,True,Describes how to deploy the SAP data connector container to connect SAP to Sentinel; this typically includes product-specific deployment parameters and constraints not known generically.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-sap-btp-solution,Deploy SAP BTP,Deploy Microsoft Sentinel solution for SAP BTP,Deploy Sentinel solution for SAP BTP systems,Learn how to deploy the Microsoft Sentinel solution for SAP Business Technology Platform (BTP) system.,"This article describes how to deploy the Microsoft Sentinel solution for SAP Business Technology Platform (BTP) system. The Microsoft Sentinel solution for SAP BTP monitors and protects your SAP BTP system. It collects audit logs and activity logs from the BTP infrastructure and BTP-based apps, and then detects threats, suspicious activities, illegitimate activities, and more. Read more about the solution. Important An architectural shift in the data connector v3.0.11 to cater for delayed SAP BTP",2026-01-30T18:17:00.000Z,how-to,deployment,0.7,True,Deployment article for SAP BTP solution; includes product-specific deployment steps and notes about connector version behavior.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-sap-security-content,Install the solution for SAP applications,Install a Microsoft Sentinel solution for SAP applications,Install Microsoft Sentinel solution for SAP applications,Learn how to install a Microsoft Sentinel solution for SAP applications from the content hub to your Log Analytics workspace enabled for Microsoft Sentinel.,"The Microsoft Sentinel solutions for SAP applications include an SAP data connector, which collects logs from your SAP systems and sends them to your Microsoft Sentinel workspace, and out-of-the-box security content, which helps you gain insight into your organization's SAP environment and detect and respond to security threats. 
Installing your solution is a required step before you can configure your data connector. Important The data connector agent for SAP is being deprecated and will be perman",2025-10-27T22:13:00.000Z,how-to,deployment,0.65,True,"Describes installing SAP solution from content hub, including SAP data connector and content deployment steps and constraints, which are product-specific deployment details.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/deployment-overview,Deployment overview,Deploy the Microsoft Sentinel solution for SAP applications,,Get an introduction to the process of deploying the Microsoft Sentinel solution for SAP applications.,"Use the Microsoft Sentinel solution for SAP applications to monitor your SAP systems with Microsoft Sentinel, detecting sophisticated threats throughout the business logic and application layers of your SAP applications. This article introduces you to the Microsoft Sentinel solution for SAP applications deployment.",2025-09-30T08:00:00.000Z,conceptual,,0.35,False,Deployment overview for SAP solution is introductory; detailed prerequisites and configuration are in separate articles.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/deployment-solution-configuration,Enable SAP detections and threat protection,Enable SAP detections and threat protection with Microsoft Sentinel,Configure Sentinel SAP detections and threat protection,This article shows you how to configure initial security content for the Microsoft Sentinel solution for SAP applications in order to start enabling SAP detections and threat protection.,"While deploying a Microsoft Sentinel data collector and solution for SAP provides you with the ability to monitor SAP systems for suspicious activities and identify threats, extra configuration steps are required to ensure the solution is optimized for your SAP deployment. 
This article provides best practices for getting started with the security content delivered with the Microsoft Sentinel solution for SAP applications, and is the last step in deploying the SAP integration. Important The data ",2025-10-27T22:13:00.000Z,how-to,best-practices,0.7,True,"Provides best practices for configuring security content (rules, detections) for SAP in Sentinel; includes product-specific recommendations and gotchas.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/preparing-sap,Prepare your SAP environment,Configure your SAP system for the Microsoft Sentinel solution - Microsoft Sentinel,Prepare SAP systems for Sentinel SAP connector,Learn about extra preparations required in your SAP system to install the SAP data connector agent and connect Microsoft Sentinel to your SAP system.,"This article describes how to prepare your SAP environment for connecting to the SAP data connector. Preparation differs, depending on whether you're using the containerized data connector agent. Select the option at the top of the page that matches your environment. This article is part of the second step in deploying the Microsoft Sentinel solution for SAP applications. Important The data connector agent for SAP is being deprecated and will be permanently disabled by September 14th 2026. 
We recom",2025-10-27T22:13:00.000Z,how-to,configuration,0.78,True,"Preparation steps for SAP to work with the Sentinel SAP connector will include SAP-specific configuration objects, parameter names, and required settings that are unique to this integration, beyond generic concepts.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/prerequisites-for-deploying-sap-continuous-threat-monitoring,Deployment prerequisites,Prerequisites for deploying Microsoft Sentinel solution for SAP applications,Review prerequisites for Sentinel SAP solution deployment,This article lists the prerequisites required for deployment of the Microsoft Sentinel solution for SAP applications.,"This article lists the prerequisites required for deployment of the Microsoft Sentinel solution for SAP applications, which differ depending on whether you're deploying a data connector agent or using the agentless data connector with the SAP Cloud Connector. Select the option at the top of this page that matches your deployment. Reviewing and ensuring that you have or understand all the prerequisites is the first step in deploying the Microsoft Sentinel solution for SAP applications. Select a c",2025-09-30T08:00:00.000Z,reference,configuration,0.7,True,"Lists detailed prerequisites for agent vs agentless SAP connectors, including environment, connector, and possibly permission requirements specific to this integration.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-kickstart,Data connector agent kickstart script reference,Kickstart deployment script reference for the Microsoft Sentinel for SAP applications data connector agent,Kickstart script parameters for SAP connector deployment,Description of command line options available with kickstart deployment script,"This article provides a reference of the configurable parameters available in the kickstart script used to deploy the Microsoft Sentinel for SAP applications data connector agent. 
For more information, see Connect your SAP system to Microsoft Sentinel. Content in this article is intended for your SAP BASIS teams.",2024-10-28T11:13:00.000Z,reference,configuration,0.9,True,"Reference for kickstart deployment script parameters; will include parameter names, allowed values, and defaults, which are configuration details.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-systemconfig,Data connector agent systemconfig.ini file reference (legacy),Microsoft Sentinel solution for SAP applications data connector agent systemconfig.ini file reference,Legacy systemconfig.ini settings for Sentinel SAP agent,Learn about the settings available in Microsoft Sentinel for SAP applications data connector agent systemconfig.ini file.,"The systemconfig.ini file is the legacy file used to configure the behavior of the Microsoft Sentinel for SAP applications data connector agent in versions earlier than June 22, 2023. This article describes the options available in each section of the configuration file. Content in this article is intended for your SAP BASIS teams. This article is not relevant if you've used the recommended deployment procedure from the portal. 
If you've installed a newer version of the agent from the command line, us",2024-12-16T18:03:00.000Z,reference,configuration,0.95,True,Legacy configuration file reference; details specific INI keys and values used by older versions of the SAP connector agent.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-systemconfig-json,Data connector agent systemconfig.json file reference,Microsoft Sentinel solution for SAP applications data connector agent systemconfig.json file reference,systemconfig.json settings for Sentinel SAP agent,Learn about the settings available in Microsoft Sentinel for SAP applications data connector agent systemconfig.json file.,"The systemconfig.json file is used to configure the behavior of the Microsoft Sentinel for SAP applications data connector agent when deployed from the command line. This article describes the options available in each section of the configuration file. Content in this article is intended for your SAP BASIS teams, and is only relevant when your data connector agent is deployed from the command line. We recommend deploying your data connector agent from the portal instead. Important Microsoft Sentinel s",2024-12-16T18:03:00.000Z,reference,configuration,0.95,True,"Explicit configuration file reference; lists settings, sections, and allowed values for systemconfig.json, which is core configuration knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-update,Data connector agent update script reference,Microsoft Sentinel solution for SAP applications data connector agent update file reference,Update script parameters for Sentinel SAP connector,Description of command line options available with update deployment script,"The Microsoft Sentinel SAP data connector agent container uses an update script to simplify the update process. This article describes the configurable parameters available in the update script. 
For more information, see Update the Microsoft Sentinel for SAP applications data connector agent. Content in this article is intended for your SAP BASIS teams.",2024-12-16T18:03:00.000Z,reference,configuration,0.9,True,Reference for update script options; contains specific parameter names and behaviors for updating the SAP connector agent.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/required-abap-authorizations,Required ABAP permissions,Required ABAP authorizations for the Microsoft Sentinel solution for SAP applications,ABAP roles and authorizations for Sentinel SAP logs,Understand the ABAP authorizations required if you want to manually define roles based on the SAP logs you want to ingest to Microsoft Sentinel and the activities you want to run.,This article lists the ABAP authorizations required to ensure that the SAP user account used by Microsoft Sentinel's SAP data connector can correctly retrieve logs from the SAP systems. The required authorizations are listed here by their purpose. You only need the authorizations that are listed for the kinds of logs you want to bring into Microsoft Sentinel.,2025-06-23T11:11:00.000Z,how-to,security,0.9,True,Lists required ABAP authorizations/roles for the SAP user account; these are specific permission objects and scopes unique to this integration.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-agent-migrate,Migrate agent to agentless connector,Containerized Agent to Agentless Connector migration guide,Migrate Sentinel SAP container agent to agentless connector,Learn how to migrate from the containerized SAP agent to the agentless data connector for Microsoft Sentinel Solution for SAP applications.,This article outlines the steps required to migrate from the containerized SAP agent to the agentless data connector for Microsoft Sentinel Solution for SAP applications. Important The data connector agent for SAP is being deprecated and will be permanently disabled by September 14th 2026. 
We recommend that you migrate to the agentless data connector. Learn more about the agentless approach from our blog post.,2026-03-11T11:05:00.000Z,article,deployment,0.7,True,"Migration guide contains product-specific, time-bound deprecation details and concrete steps for moving from a containerized agent to an agentless data connector in Microsoft Sentinel for SAP, which are deployment/operational patterns unique to this solution and not generally known from training.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-audit-controls-workbook,Monitor SAP audit controls,Check for SAP security controls with Microsoft Sentinel,Use SAP Security Audit Controls workbook in Sentinel,"Learn about the SAP - Security Audit Controls workbook, used to monitor and track security control framework compliance across your SAP systems.","This article describes how you can use the SAP - Security Audit Controls workbook to monitor and track security control framework compliance across your SAP systems, including the following functionality: For example: Content in this article is intended for your security team.",2024-10-28T11:13:00.000Z,concept-article,configuration,0.64,True,Describes a specific workbook for SAP security control compliance; includes Sentinel query/visualization details tied to SAP data.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-audit-log-workbook,Monitor SAP audit logs,Microsoft Sentinel solution for SAP applications - SAP - Security Audit log and Initial Access workbook,Use SAP Security Audit log workbook in Sentinel,"Learn about the SAP - Security Audit log and Initial Access workbook, used to monitor and track data across your SAP systems.","This article describes the SAP - Security Audit log and Initial Access workbook, used for monitoring and tracking user audit activity across your SAP systems. 
Use the workbook to get a bird's eye view of user audit activity, better secure your SAP systems, and gain quick visibility into suspicious actions. Drill down into suspicious events as needed. Use the workbook either for ongoing monitoring of your SAP systems, or to review the systems following a security incident or other suspicious activi",2024-10-28T11:13:00.000Z,reference,configuration,0.64,True,"Workbook article will include specific queries, fields, and visualizations tied to SAP audit logs in Sentinel, which are product-specific usage patterns.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-btp-security-content,Security content reference,Microsoft Sentinel Solution for SAP BTP - security content reference,Security content reference for Sentinel SAP BTP solution,Learn about the built-in security content provided by the Microsoft Sentinel Solution for SAP BTP.,"This article details the security content available for the Microsoft Sentinel Solution for SAP BTP. Available security content currently includes a built-in workbook and analytics rules. You can also add SAP-related watchlists to use in your search, detection rules, threat hunting, and response playbooks. Learn more about the solution.",2026-01-06T18:11:00.000Z,reference,configuration,0.62,True,"Details built-in workbook and analytics rules for SAP BTP; these are concrete, product-specific security content artifacts.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-btp-solution-overview,Overview,Microsoft Sentinel Solution for SAP BTP overview,,This article introduces the Microsoft Sentinel Solution for SAP BTP.,"SAP BTP is a cloud-based solution that provides a wide range of tools and services for developers to build, run, and manage applications. One of the key features of SAP BTP is its low-code development capabilities. 
Low-code development allows developers to create applications quickly and efficiently by using visual drag-and-drop interfaces and prebuilt components, rather than writing code from scratch. The Microsoft Sentinel solution for SAP BTP monitors and protects your SAP Business Technology",2026-01-06T18:11:00.000Z,concept-article,,0.4,False,"Overview of Sentinel solution for SAP BTP; summary suggests high-level introduction without detailed configuration tables, limits, or error mappings.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-deploy-troubleshoot,Troubleshoot SAP data connector,Troubleshoot the Microsoft Sentinel solution for SAP applications data connector agent,Troubleshoot Sentinel SAP data connector agent,Learn how to troubleshoot specific issues that might occur in your Microsoft Sentinel solution for SAP applications data connector agent deployment.,"This article includes troubleshooting steps to help you ensure accurate and timely data ingestion and monitoring for your SAP environment with Microsoft Sentinel. When working with the agentless data connector, most troubleshooting is done directly in the SAP Integration Suite, where the message log displays errors indicating the nature of the issue encountered. Important The data connector agent for SAP is being deprecated and will be permanently disabled by September 14th 2026. 
We recommend that ",2025-09-30T08:00:00.000Z,troubleshooting,troubleshooting,0.86,True,"Explicit troubleshooting article for the SAP data connector agent; will contain specific error messages, causes, and resolutions unique to this product.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-logserv-overview,SAP LogServ,SAP LogServ integration with Microsoft Sentinel Solution for SAP overview,,"This article introduces the Microsoft Sentinel Solution for SAP integration with SAP LogServ, an SAP-provided add-on that extends monitoring beyond the SAP application layer to infrastructure, databas","The Microsoft Sentinel Solution for SAP applications provides powerful application-layer monitoring for SAP systems, tracking user activity, business transactions, and critical events. However, in SAP RISE/ECS environments, infrastructure and operating system logs are owned and managed by SAP, and aren't accessible through the standard SAP application connector. SAP LogServ bridges that gap. It's an SAP Enterprise Cloud Services (ECS) service that centralizes logs from all systems, applications, a",2026-04-13T17:16:00.000Z,concept-article,,0.2,False,"The page is an overview of SAP LogServ integration with Microsoft Sentinel for SAP. From the summary it appears to be conceptual/introductory, describing what LogServ is and why it's needed for SAP RISE/ECS environments, without exposing concrete configuration parameters, limits, error codes, or detailed troubleshooting/decision matrices. 
It reads as a high-level integration overview rather than a detailed configuration, best-practices, or troubleshooting guide.",new
-https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-solution-deploy-alternate,Deploy the agent with expert options,Deploy the Microsoft Sentinel for SAP data connector agent container using expert configuration options,Expert deployment options for Sentinel SAP connector,"Learn how to deploy the Microsoft Sentinel for SAP data connector in environments using expert configuration options, such as an on-premises machine and custom, manual configurations.","This article provides procedures for deploying and configuring the Microsoft Sentinel for SAP data connector agent container with expert, custom, or manual configuration options. For typical deployments we recommend that you use the portal instead. Content in this article is intended for your SAP BASIS teams. For more information, see Deploy an SAP data connector agent from the command line. Note This article is relevant only for the data connector agent, and isn't relevant for the SAP agentless data ",2025-09-30T08:00:00.000Z,how-to,deployment,0.78,True,"Describes expert/custom deployment of the SAP connector container, including on-prem and manual configurations; contains detailed deployment patterns and constraints.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-solution-function-reference,SAP solution function reference,Microsoft Sentinel solution for SAP applications - function reference,Function reference for Sentinel SAP solution workspace,Learn about the functions available from the Microsoft Sentinel solution for SAP applications.,This article describes a selection of functions that are available in your workspace after you install a Microsoft Sentinel solution for SAP applications. Discover more functions by browsing in Microsoft Sentinel and loading the function code. 
Find functions as follows: Content in this article is intended for your security teams.",2025-09-30T08:00:00.000Z,reference,configuration,0.7,True,"Describes specific KQL functions added by the SAP solution; these function names and behaviors are expert, product-specific knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-solution-log-reference,SAP solution log and table reference,Log and table reference for the Microsoft Sentinel solution for SAP applications,Log and table schema reference for Sentinel SAP solution,"Learn about the SAP logs, tables, and functions available from the Microsoft Sentinel solution for SAP applications.","This article describes the logs and tables available as part of the Microsoft Sentinel solution for SAP applications and its data connector. Some logs, noted in this article, aren't sent to Microsoft Sentinel by default, but you can manually add them as needed. For more information, see Define the SAP logs that are sent to Microsoft Sentinel. Content in this article is intended for your SAP BASIS teams. Important Noted features are currently in PREVIEW. The Azure Preview Supplemental Terms include addi",2025-09-30T08:00:00.000Z,reference,configuration,0.8,True,"Lists SAP-related logs and tables available via the connector, including which are enabled by default; this is detailed schema/configuration knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-solution-security-content,SAP security content reference,Microsoft Sentinel solution for SAP applications - security content reference,Reference for Sentinel SAP security content and rules,Learn about the built-in security content provided by the Microsoft Sentinel solution for SAP applications.,"This article details the security content available for the Microsoft Sentinel solutions for SAP. Important Noted elements described in this article are in Preview. 
The Azure Preview Supplemental Terms include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability. Available security content includes built-in workbooks and analytics rules. You can also add SAP-related watchlists to use in your search, detection rules, th",2026-01-06T18:11:00.000Z,reference,configuration,0.62,True,"Security content reference lists built-in workbooks, analytics rules, and watchlists for SAP; these are concrete, product-specific artifacts.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-suspicious-configuration-security-parameters,Monitored SAP security parameters,SAP security parameters monitored by the Microsoft Sentinel solution for SAP to detect suspicious configuration changes,SAP security parameters monitored by Sentinel analytics,Learn about the security parameters in the SAP system that the Microsoft Sentinel solution for SAP applications monitors for suspicious configuration changes.,"This article lists the static security parameters in the SAP system that the Microsoft Sentinel solution for SAP applications monitors as part of the SAP - (Preview) Sensitive Static Parameter has Changed analytics rule. The Microsoft Sentinel solution for SAP applications provides updates for this content according to SAP best practice changes. 
Add parameters to watch for by changing values according to your organization's needs, and turn off specific parameters in the SAPSystemParameters watchlist",2024-10-28T11:13:00.000Z,reference,security,0.88,True,Lists specific SAP static security parameters monitored by a named analytics rule and how to adjust watchlist values; highly product-specific security configuration.,unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/solution-overview,Overview,Microsoft Sentinel solution for SAP applications overview,,This article provides an overview of the Microsoft Sentinel solution for SAP applications and available support.,"SAP systems pose a unique security challenge, as they handle sensitive information, are a prime target for attackers, and traditionally provide little visibility for security operations teams. An SAP system breach could result in stolen files, exposed data, or a disrupted supply chain. Once an attacker is in the system, there are few controls to detect exfiltration or other bad acts. SAP activity needs to be correlated with other data across the organization for effective threat detection. To he",2026-04-01T17:25:00.000Z,overview,,0.2,False,"Described as an overview of the Microsoft Sentinel solution for SAP applications and available support. This is primarily conceptual/marketing-style overview of challenges and solution scope, without clear indication of detailed configuration parameters, limits, or decision matrices.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/sap/solution-partner-overview,Partner solutions,Microsoft Sentinel solutions for SAP - Partner Add-ons,,"Discover partners specializing in Microsoft Sentinel for SAP integration solutions, consulting, and managed services.","Microsoft Sentinel provides a flexible platform that enables SAP and Microsoft partners to deliver integrated security solutions through the Microsoft Sentinel Content Hub. 
Add-ons enable further correlational capabilities for the Microsoft Sentinel Solution for SAP applications. SAP signals are correlated with signals from other Microsoft and third-party solutions, enabling comprehensive threat detection and response across the entire IT landscape using the Microsoft unified Security Operations",2025-11-18T18:43:00.000Z,conceptual,,0.3,False,Partner add-ons overview; primarily marketing/partner discovery content without technical configuration or troubleshooting details.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sap/stop-collection,Stop SAP data collection,Stop SAP data collection - Microsoft Sentinel,Stop SAP log collection and disable Sentinel connector,Learn about how to stop Microsoft Sentinel from collecting data from your SAP applications.,"There might be instances where you need to halt the data collection from your SAP applications by the Microsoft Sentinel data connector agent, whether for maintenance, troubleshooting, or other administrative reasons. This article provides step-by-step instructions on how to stop the ingestion of SAP logs into Microsoft Sentinel and disable the data connector agent. 
If you're using the agentless data connector, remove the data connector and solution from Microsoft Sentinel, and then clean up any",2025-09-30T08:00:00.000Z,how-to,configuration,0.66,True,Provides specific steps to stop ingestion and disable the SAP connector; includes product-specific configuration actions.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sap/update-sap-data-connector,Update the data connector agent,Update the Microsoft Sentinel for SAP applications data connector agent,Update Sentinel SAP data connector agent safely,This article shows you how to update an already existing SAP data connector to its latest version.,"This article shows you how to update an already existing Microsoft Sentinel for SAP data connector to its latest version so that you can use the latest features and improvements. During the data connector agent update process, there might be a brief downtime of approximately 10 seconds. To ensure data integrity, a database entry stores the timestamp of the last fetched log. After the update is complete, the data fetching process resumes from the last log fetched, preventing duplicates and ensuri",2025-09-30T08:00:00.000Z,how-to,deployment,0.68,True,"Covers updating the SAP data connector agent, including downtime behavior and log resumption; these are product-specific deployment/upgrade behaviors.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/scheduled-rules-overview,Overview,Scheduled analytics rules in Microsoft Sentinel,Configure scheduled analytics rules in Sentinel,Understand how scheduled analytics rules work in Microsoft Sentinel. Learn about all the configuration options for this rule type.,"By far the most common type of analytics rule,Scheduledrules are based onKusto queriesthat are configured to run at regular intervals and examine raw data from a defined ""lookback"" period. 
Queries can perform complex statistical operations on their target data, revealing baselines and outliers in groups of events. If the number of results captured by the query passes the threshold configured in the rule, the rule produces an alert. This article helps you understand how scheduled analytics rules ",2024-10-16T08:00:00.000Z,overview,configuration,0.8,True,"Describes all configuration options for scheduled rules; such pages contain rule parameters (lookback period, frequency, thresholds, suppression, etc.) with allowed values and defaults.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/scoping,Configure Microsoft Sentinel scoping (row-level RBAC),Configure Microsoft Sentinel scoping (row-level RBAC),Configure row-level RBAC scoping in Microsoft Sentinel,Create and apply Microsoft Sentinel scopes to control access to data at the row level.,"Microsoft Sentinel scoping provides row-level role-based access control (RBAC), enabling granular, row-level access without requiring workspace separation. This capability allows multiple teams to operate securely within a shared Microsoft Sentinel environment while using consistent and reusable scope definitions across tables and experiences. Scoping is configured in the Microsoft Defender portal.",2026-03-30T20:20:00.000Z,how-to,security,0.7,True,"Page is specifically about configuring Microsoft Sentinel scoping (row-level RBAC). This implies product-specific security configuration, likely including scope definitions, role mappings, and permission behaviors unique to Sentinel. 
That aligns with the security sub-skill type, which covers RBAC role names, permission scopes, and security settings.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/search-jobs,Search large datasets,Search for specific events across large datasets in Microsoft Sentinel,Run Sentinel search jobs for large datasets and archives,Learn how to use search jobs to search large datasets.,"Use a search job to retrieve data stored inlong-term retention, or to scan through large volumes of data, if the log query time-out of 10 minutes isn't sufficient. A search job scans through up to a year of data in a table for specific events. The search job sends its results to a new Analytics table in the same workspace as the source data. This article explains how to run a search job in Microsoft Sentinel and how to work with the search job results. Search jobs across certain data sets might ",2025-10-22T11:10:00.000Z,how-to,limits-quotas,0.8,True,Explicitly mentions using search jobs when the log query timeout of 10 minutes isn't sufficient and scanning up to a year of data—these are concrete numeric limits and behavior details.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/security-alert-schema,Security alert schema reference,Microsoft Sentinel security alert schema reference,Use Microsoft Sentinel security alert schema,This article displays the schema of security alerts in Microsoft Sentinel.,"Microsoft Sentinelanalytics rulescreate incidents as the result ofsecurity alerts. Security alerts can come from different sources, and accordingly use different kinds of analytics rules to create incidents: Scheduledanalytics rules generate alerts as the result of their regular queries of data in logs ingested from external sources, and those same rules create incidents from those alerts. (For the purposes of this document, ""scheduled"" rule alerts includeNRT rule alerts.) 
Microsoft Securityanal",2022-12-20T23:05:00.000Z,reference,configuration,0.88,True,"Schema reference for security alerts created by analytics rules; includes field definitions and mappings, which are configuration-level expert knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/security-alert-schema-differences,Standalone vs XDR alert schema reference,Microsoft Sentinel alert schema differences between standalone and XDR connectors,Choose between Sentinel standalone and XDR alert connectors,"Learn how alert schema, field mappings, and ingestion behavior differ between standalone connectors and the XDR connector in Microsoft Sentinel.","This article explains the differences between alerts ingested through standalone connectors and alerts ingested through the Extended Detection and Response (XDR) connector in Microsoft Sentinel. Standalone connectors ingest alerts directly from the original security products, whereas the XDR connector ingests alerts through the Microsoft Defender XDR pipeline. This includes connectors such as Microsoft Defender for Office 365, Microsoft Defender for Endpoint, Microsoft Defender for Identity, Inf",2026-02-04T18:14:00.000Z,reference,decision-making,0.7,True,"Explains differences in alert schema, field mappings, and ingestion behavior between standalone and XDR connectors, guiding connector selection and migration decisions.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sentinel-analytic-rules-creation,Creating analytics rules,Create Analytics Rules for Microsoft Sentinel Solutions,Create analytics rules for Microsoft Sentinel solutions,This article guides you through the process of creating and publishing analytics rules to Microsoft Sentinel solutions.,"Microsoft Sentinel analytics rules are sets of criteria. They define how data is monitored, what's detected, and what actions are taken when specific conditions are met. 
These rules help identify suspicious behavior, anomalies, and potential security threats by analyzing logs and signals from various data sources. Microsoft Sentinel analytics rules are powerful tools for enhancing an organization's security posture because they proactively detect and respond to potential threats. By following a ",2025-02-18T18:02:00.000Z,conceptual,integrations,0.7,True,"Covers how to define and package analytics rules for solutions; includes Sentinel-specific rule schema, parameters, and publishing details.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sentinel-content-centralize,OOTB content centralization changes,Out-of-the-box (OOTB) content centralization changes - Microsoft Sentinel,,This article describes upcoming centralization changes for out-of-the-box content in Microsoft Sentinel.,"The Microsoft Sentinel content hub enables discovery and on-demand installation of out-of-the-box (OOTB) content and solutions in a single step. Previously, some of this OOTB content existed only in various gallery sections of Microsoft Sentinel. Now,allof the following gallery content templates are available in the content hub as standalone items or as part of packaged solutions:",2023-06-27T21:46:00.000Z,conceptual,,0.3,False,"Describes centralization changes and where OOTB content appears; primarily informational/announcement style without detailed configuration parameters, limits, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sentinel-hunting-rules-creation,Creating hunting queries,Create Hunting Queries for Microsoft Sentinel Solutions,Create hunting queries for Microsoft Sentinel solutions,This article guides you through the process of creating and publishing hunting queries to Microsoft Sentinel solutions.,"Hunting queries are at the heart of the threat-hunting process in Microsoft Sentinel. 
Security analysts use advanced, customizable queries in Kusto Query Language (KQL) to sift through large volumes of data. These queries allow analysts to identify potential security threats, investigate suspicious activities, and gain visibility into their network and endpoints. Analysts use Microsoft Sentinel to proactively search for threats that bypass existing security defenses. This proactive approach help",2025-02-18T18:02:00.000Z,conceptual,integrations,0.7,True,Guides creating and publishing hunting queries with KQL for solutions; includes Sentinel-specific packaging and metadata requirements.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sentinel-integration-guide,Overview,Guide to build and publish Microsoft Sentinel solutions,Build and publish Microsoft Sentinel SIEM solutions,This article walks you through the entire lifecycle of how to build and publish solutions to Microsoft Sentinel.,"Microsoft Sentinel SIEM and platform includes a range of capabilities that partners can use to create impactful solutions they can publish through the Microsoft Security Store or the Sentinel SIEM Content Hub. By building on top of Sentinel, partners can enable new scenarios that use a wide breadth of security data, processing capabilities, and AI experiences, without needing new pipelines, processing capabilities or storage infrastructure. 
For example, you can create a connector to bring new da",2025-09-30T12:49:00.000Z,conceptual,integrations,0.7,True,"Lifecycle guide for building connectors, content, and integrations; likely includes solution manifest structure, required components, and Sentinel-specific integration patterns.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sentinel-overview,What is Microsoft Sentinel?,What is Microsoft Sentinel?,,"Learn about Microsoft Sentinel, an AI-first, cloud-native security information and event management (SIEM) and security platform that consolidates and analyzes security data at scale, empowers securit","Microsoft Sentinel is a cloud-native Security Information and Event Management (SIEM) and unified security platform for agentic defense. To meet the demands of today’s complex threats, Microsoft Sentinel has evolved from a traditional SIEM to a SIEM and platform - extending beyond static, rule-based controls and post-breach response to provide an AI-ready, data-first foundation that transforms telemetry into a security graph, standardizes access for agents, and coordinates autonomous actions, wh",2025-09-30T12:49:00.000Z,overview,,0.2,False,"High-level product overview of Microsoft Sentinel as a SIEM/platform; no concrete limits, configs, roles, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sentinel-playbook-creation,Creating playbooks,Create Playbooks for Microsoft Sentinel Solutions,Create and publish playbooks for Microsoft Sentinel solutions,This article guides you through the process of creating and publishing playbooks for Microsoft Sentinel solutions.,"Playbooks in Microsoft Sentinel are sets of procedures that can respond to incidents, alerts, or specific entities. They help automate responses and can be set to run automatically when certain alerts or incidents occur. Playbooks can also be run manually. 
This article uses example scenarios to walk you through the process of creating and publishing playbooks for Microsoft Sentinel solutions.",2025-02-10T23:04:00.000Z,conceptual,integrations,0.7,True,"Shows how Sentinel playbooks (Logic Apps) are wired to incidents/alerts and packaged in solutions; includes Sentinel-specific triggers, connections, and metadata.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sentinel-security-copilot,Overview,Security Copilot with Microsoft Sentinel,,"Learn about Microsoft Sentinel capabilities in Security Copilot. Understand the best prompts to use and how to get timely, accurate results for natural language to KQL.","Microsoft Security Copilot is a platform that helps you defend your organization at machine speed and scale. Microsoft Sentinel's vast security data provides an excellent source for Copilot to help analyze incidents and generate hunting queries. Together with other Security Copilot sources you enable, your Microsoft Sentinel incidents and data provide wider visibility into threats and their context for your organization.",2025-05-26T11:23:00.000Z,concept-article,,0.5,False,"High-level description of Security Copilot with Sentinel; summary doesn’t show specific prompts, parameters, or configuration tables.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sentinel-security-copilot-incident-summary,Summarize incidents in Azure portal,Summarize Microsoft Sentinel incidents with Security Copilot,,Learn about Microsoft Sentinel's incident summarization capabilities in Security Copilot.,"Microsoft Sentinel applies the capabilities ofSecurity Copilotin the Azure portal to create enriched summaries of incidents, providing a comprehensive overview of security incidents by consolidating information from multiple alerts. This feature enhances incident response efficiency by offering a clear summary that helps your security operations teams quickly understand the scope and impact of an incident. 
It provides a structured overview, including timelines, assets involved, and indicators of",2025-04-27T23:16:00.000Z,conceptual,,0.5,False,Describes incident summarization capability conceptually; no detailed configuration or numeric thresholds are evident from the summary.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sentinel-service-limits,Microsoft Sentinel SIEM service limits,Microsoft Sentinel service limits,Review Microsoft Sentinel service limits and quotas,"This article provides a list of service limits for Microsoft Sentinel, divided into the different service areas.","This article lists the most common service limits you might encounter as you use Microsoft Sentinel. For other limits that might impact services or features you use, like Azure Monitor, seeAzure subscription and service limits, quotas, and constraints.",2025-06-29T11:13:00.000Z,reference,limits-quotas,0.9,True,"Explicitly a service limits article; these pages list concrete numeric limits and quotas per area, which are expert knowledge not reliably known from training.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sentinel-soar-content,SOAR content catalog,Microsoft Sentinel SOAR content catalog,,"This article displays and details the content provided by Microsoft Sentinel for security orchestration, automation, and response (SOAR), including playbooks and Logic Apps connectors.","Microsoft Sentinel provides a wide variety of playbooks and connectors for security orchestration, automation, and response (SOAR), so that you can readily integrate Microsoft Sentinel with any product or service in your environment. 
The integrations listed below may include some or all of the following components: You can find SOAR integrations and their components in the following places: Tip",2023-01-25T23:11:00.000Z,reference,,0.45,False,SOAR content catalog listing playbooks and connectors; primarily a catalog/overview without deep configuration parameters or troubleshooting mappings.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solution,Monitor Zero Trust,Monitor Zero Trust (TIC 3.0) security architectures with Microsoft Sentinel,Monitor Zero Trust TIC 3.0 with Sentinel solution,"Install and learn how to use the Microsoft Sentinel Zero Trust (TIC3.0) solution for an automated visualization of Zero Trust principles, cross-walked to the Trusted Internet Connections framework.","Zero Trustis a security strategy for designing and implementing the following sets of security principles: This article describes how to use the Microsoft SentinelZero Trust (TIC 3.0)solution, which helps governance and compliance teams monitor and respond to Zero Trust requirements according to theTRUSTED INTERNET CONNECTIONS (TIC) 3.0initiative. Microsoft Sentinel solutionsare sets of bundled content, pre-configured for a specific set of data. 
TheZero Trust (TIC 3.0)solution includes a workboo",2024-05-21T08:00:00.000Z,how-to,best-practices,0.6,True,"Shows how to use the Zero Trust (TIC 3.0) solution, including specific workbooks, analytics, and mappings to TIC controls, which are concrete product-specific usage patterns.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solution-deprecation,Manage solution deprecation lifecycle,Managing end-to-end lifecycle of deprecated solutions in Microsoft Sentinel,Manage lifecycle of deprecated Sentinel solutions,This article walks you through the process of identifying deprecated solutions in Microsoft Sentinel and managing the lifecycle of these solutions.,The document explains how to manage the lifecycle of out-of-the-box solutions in Microsoft Sentinel that the solution author no longer supports. This document explains how to identify the solutions that are marked for deprecation and what actions to take on those solutions.,2025-01-02T12:13:00.000Z,concept-article,best-practices,0.6,True,"Lifecycle management of deprecated solutions includes specific recommended actions and patterns for handling unsupported content, which are Sentinel-specific operational best practices.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solution-quality-guidance,Sentinel solution quality guidelines,Sentinel solution quality guidelines,Apply quality guidelines to Microsoft Sentinel solutions,This article guides you through the process of publishing high quality solutions for Microsoft Sentinel.,This document describes the quality guidelines and recommendations for Microsoft Sentinel solutions.,2025-11-18T18:43:00.000Z,conceptual,best-practices,0.75,True,"Provides concrete recommendations and requirements for solution quality (content, performance, structure) that are specific to Sentinel solutions.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions,Overview,Microsoft Sentinel Content and Solutions 
Overview,,"Discover Microsoft Sentinel content and solutions, including data connectors and analysis tools, to enhance your security operations. Learn more today.","Microsoft Sentinel content includes Security Information and Event Management (SIEM) solution components that help you ingest data, monitor, alert, and respond to security threats. This article explains the types of content and solutions in Microsoft Sentinel and how they help your security operations. Important AfterMarch 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Microsoft Sentine",2025-06-30T17:11:00.000Z,conceptual,,0.3,False,"Content and solutions overview is conceptual, describing types of content and solutions without deep configuration or numeric constraints.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-catalog,All Microsoft Sentinel SIEM solutions,Microsoft Sentinel content hub catalog,Select Sentinel content hub solutions by domain,Learn about domain specific solutions available in the content hub for Microsoft Sentinel and where to find the full list of solutions.,"Solutions in Microsoft Sentinel provide a consolidated way to acquire Microsoft Sentinel content, like data connectors, workbooks, analytics, and automation, in your workspace with a single deployment step. This article helps you find the full list of the solutions available in Microsoft Sentinel. This article also lists the domain-specific out-of-the-box (built-in) and on-demand solutions available for you to deploy in your workspace. 
When you deploy a solution, the security content included wi",2025-07-27T11:14:00.000Z,reference,decision-making,0.65,True,Catalog article lists domain-specific solutions and helps choose which to deploy; effectively a decision aid for solution selection based on use case and domain.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-delete,Delete out-of-the-box content,Delete installed Microsoft Sentinel out-of-the-box content and solutions,,Remove solutions and content you deployed in Microsoft Sentinel.,"If you installed a Microsoft Sentinel out-of-the-box solution, you can remove content items from the solution or delete the installed solution. If you later need to restore deleted content items, selectReinstallon the solution. Similarly, you can restore the solution by reinstalling the solution. Important AfterMarch 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Microsoft Sentinel in t",2025-07-27T11:14:00.000Z,how-to,,0.35,False,Deleting installed content/solutions is procedural management; summary does not indicate detailed configuration tables or constraints.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-deploy,Deploy out-of-the-box content,Discover and deploy Microsoft Sentinel out-of-the-box content from Content hub,Discover and deploy Sentinel content hub solutions,"Learn how to find and deploy Sentinel packaged solutions containing data connectors, analytics rules, hunting queries, workbooks, and other content.","The Microsoft Sentinel Content hub is your centralized location to discover and manage out-of-the-box (built-in) content. There you find packaged solutions for end-to-end products by domain or industry. You have access to the vast number of standalone contributions hosted in our GitHub repository and feature blades. 
Discover solutions and standalone content with a consistent set of filtering capabilities based on status, content type, support, provider, and category. Install content in your work",2025-07-27T11:14:00.000Z,how-to,deployment,0.6,True,Describes how to deploy packaged solutions from Content hub; likely includes which content types can be deployed where and constraints around deployment to workspaces.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-post-publish-tracking,Sentinel SIEM solution lifecycle post publish,Status of Microsoft Sentinel solution after publishing in the Microsoft Partner center,Track Microsoft Sentinel solution status after publishing,This article walks you through the details of tracking solutions post publish in Microsoft Partner center.,"This document explains what happens once your offer is successfully published. Within partner center, your solution would be referred to as an offer (the terms solution and offer are used interchangeably in the context of this document). Once you publish the offer, the offer goes through a series of validation checks before it becomes live in Azure Marketplace and in the Sentinel content hub. 
If you didn't create an offer yet, follow the steps in thePublish solutions to Microsoft Sentinelarticle",2025-09-18T08:00:00.000Z,conceptual,deployment,0.65,True,Explains validation stages and status states for Sentinel offers in Partner Center and content hub; these are specific post-deployment behaviors.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sentinel-summary-rules-creation,Creating summary rules,Create Summary Rules for Microsoft Sentinel Solutions,Create summary rules and tables for Sentinel solutions,This article guides you through the process of creating and publishing summary rules to Microsoft Sentinel solutions.,"Summary rules in Microsoft Sentinel are scheduled queries that aggregate and transform high-volume data into summarized results stored in a custom log table. In essence, a summary rule runs a user-defined KQL (Kusto Query Language) query at a regular interval (for example, every hour or once a day) across a large set of logs, and saves the aggregated output (such as counts, statistics, or filtered records) into a new or existing Log Analytics (LA) table. This mechanism provides concise, precompi",2025-07-08T17:24:00.000Z,how-to,integrations,0.7,True,"Describes scheduled KQL queries that write to custom tables; includes Sentinel-specific rule configuration, table naming, and solution packaging.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sentinel-tables-connectors-reference,Sentinel tables and connectors,Microsoft Sentinel tables and associated connectors,Map Sentinel tables to their data connectors,"This article lists the tables ingested into Microsoft Sentinel via data connectors, and the connectors that ingest them.","The following table lists the tables ingested into Microsoft Sentinel via data connectors, and the connectors that ingest them. 
Select the table name or the connector name for more information.",2026-02-02T18:12:00.000Z,reference,configuration,0.78,True,Provides a mapping between Sentinel tables and the connectors that ingest them; this is product-specific configuration/integration reference.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/sentinel-workbook-creation,Creating workbooks,Create Workbooks for Microsoft Sentinel Solutions,Create and publish workbooks for Microsoft Sentinel solutions,This article guides you through the process of creating and publishing workbooks for Microsoft Sentinel solutions.,"Workbooks are an integral feature of Microsoft Sentinel, a cloud-native security information and event management (SIEM) solution. Workbooks provide users with interactive, customizable dashboards that aggregate and visualize data from various sources. These dashboards enable organizations to gain deeper insights into their security posture and streamline their efforts in threat detection and response. By integrating data from various sources and facilitating collaboration among security teams, ",2025-02-10T23:04:00.000Z,conceptual,integrations,0.7,True,"Workbook creation for Sentinel solutions involves specific JSON templates, parameters, and packaging rules that are product-specific.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/setup-azure-storage-connector,Azure Storage blob connectors,Set up the Azure Storage connector to stream logs to Microsoft Sentinel,Configure Azure Storage Blob connector for Sentinel,Learn how to set up the Azure Storage Blob connector to ingest logs from Azure Storage into Microsoft Sentinel using the Codeless Connector Framework.,The Azure Storage Blob connector simplifies collecting logs from Azure Storage. It lets ISVs and users build scalable connectors on top of Azure Storage integrations through the fully managed Codeless Connector Framework (CCF). 
This article summarizes the connector resources and provides steps to create and validate your first Azure Storage connector.,2026-03-11T11:05:00.000Z,how-to,integrations,0.7,True,"The page describes setting up the Azure Storage Blob connector using the Codeless Connector Framework to ingest logs into Sentinel. Such connector setup articles typically include connector-specific configuration parameters, required settings, and validation steps unique to this integration, aligning with the integrations & coding patterns category.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/siem-migration,Use the SIEM migration experience,Use the SIEM migration experience - Microsoft Sentinel,Use Sentinel SIEM migration experience for rule mapping,Migrate security monitoring use cases from other Security Information and Event Management (SIEM) systems to Microsoft Sentinel.,"The SIEM migration tool analyzes Splunk and QRadar detections, including custom detections, and recommends best‑fit Microsoft Sentinel detections rules. It also provides recommendations for data connectors, both Microsoft and third-party connectors available in Content Hub to enable the recommended detections. Customers can track the migration by assigning the right status to each recommendation card. Note The old migration tool is deprecated. This article describes the current SIEM migration ex",2026-01-30T18:17:00.000Z,how-to,decision-making,0.65,True,"Describes a tool that analyzes Splunk/QRadar detections and recommends Sentinel rules and connectors; provides concrete migration recommendations and status tracking, which is decision-making support.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/skill-up-resources,Microsoft Sentinel skill-up training,Microsoft Sentinel skill-up training,,"This article walks you through a level 400 training to help you skill up on Microsoft Sentinel. 
The training comprises 21 modules that present relevant product documentation, blog posts, and other res","This article walks you through a level 400 training to help you skill up on Microsoft Sentinel. The training comprises 21 self-paced modules that present relevant product documentation, blog posts, and other resources. The modules listed here are split into five parts following the life cycle of a Security Operation Center (SOC): Part 1: Overview Part 2: Architecting and deploying Part 3: Creating content Part 4: Operating Part 5: Advanced",2024-05-16T08:00:00.000Z,tutorial,,0.1,False,"Training/learning path aggregator that links to other resources. No direct expert technical details like limits, configuration tables, or API references.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/soc-ml-anomalies,Overview,Use customizable anomalies to detect threats in Microsoft Sentinel,Use customizable anomaly detection in Sentinel,This article explains how to use the new customizable anomaly detection capabilities in Microsoft Sentinel.,"Important Custom detectionsis now the best way to create new rules across Microsoft Sentinel SIEM Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. 
For more information, read this blog.",2025-10-29T11:15:00.000Z,conceptual,configuration,0.65,True,"Describes new anomaly detection capabilities; likely includes rule parameters, tuning options, and configuration fields unique to Sentinel anomalies.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/soc-optimization/soc-optimization-access,Optimize your security operations,Optimize security operations,Apply SOC optimization recommendations in Microsoft Sentinel,Use Microsoft Sentinel SOC optimization recommendations to optimize your security operations center (SOC) team activities.,"Security operations center (SOC) teams look for ways to improve processes and outcomes and ensure you have the data needed to address risks without extra ingestion costs. SOC teams want to make sure that you have all the necessary data to act against risks, without paying for more data than needed. At the same time, SOC teams must also adjust security controls as threats and business priorities change, doing so quickly and efficiently to maximize your return on investment. SOC optimizations are ac",2025-09-10T17:15:00.000Z,how-to,decision-making,0.65,True,"Provides Sentinel SOC optimization recommendations to balance data coverage and ingestion cost, guiding operational decisions rather than just configuration.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/soc-optimization/soc-optimization-api,Use SOC optimizations programmatically,Use SOC optimizations programmatically,Call Microsoft Sentinel SOC optimization recommendations API,Learn how to use Microsoft Sentinel SOC optimization recommendations programmatically.,"Use the Microsoft Sentinel recommendations API to programmatically interact with SOC optimization recommendations, helping you to close coverage gaps against specific threats and tighten ingestion rates. 
You can get details about all current recommendations across your workspaces or a specific SOC optimization recommendation, or you can reevaluate a recommendation if you've made changes in your environment. For example, use the recommendations API to: For customers or MSSPs managing multiple environ",2025-02-26T12:24:00.000Z,concept-article,integrations,0.7,True,"API-focused article for SOC optimization recommendations; likely includes endpoint shapes, parameters, and request/response details specific to Sentinel's recommendations API, which are product-specific integration details.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/soc-optimization/soc-optimization-reference,SOC optimization reference,SOC optimization reference,,Learn about the Microsoft Sentinel SOC optimization recommendations available to help you optimize your security operations.,"Use SOC optimization recommendations to help you close coverage gaps against specific threats and tighten your ingestion rates against data that doesn't provide security value. SOC optimizations help you optimize your Microsoft Sentinel workspace, without having your SOC teams spend time on manual analysis and research. Microsoft Sentinel SOC optimizations include the following types of recommendations: Data value recommendations suggest ways to improve your data use, such as a better data plan f",2025-05-07T17:04:00.000Z,reference,,0.2,False,"Describes types of SOC optimization recommendations conceptually; no indication of numeric limits, config tables, or API/SDK parameter details.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/solution-setup-essentials,Microsoft Sentinel solution setup essentials,Microsoft Sentinel solution setup essentials,Prepare prerequisites for Microsoft Sentinel SIEM solutions,Learn the prerequisites for creating Microsoft Sentinel SIEM and platform solutions.,Microsoft Sentinel solutions help you package and share custom security content. 
Use this page to learn about the two solution types and review the setup steps before you build or publish.,2025-09-30T12:49:00.000Z,conceptual,configuration,0.65,True,"Covers required setup and configuration for solution types; likely lists specific resource types, permissions, and configuration steps unique to Sentinel solutions.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/stix-objects-api,Threat intelligence upload API reference,Import threat intelligence with the upload API - Microsoft Sentinel,Import threat intelligence using Sentinel STIX upload API,This article is a reference for the upload API with example requests and responses.,"Import threat intelligence to use in Microsoft Sentinel with the upload API. Whether you're using a threat intelligence platform or a custom application, use this document as a supplemental reference to the instructions in Connect your TIP with the upload API. Installing the data connector isn't required to connect to the API. The threat intelligence you can import includes indicators of compromise and other STIX domain objects. Important This API is currently in PREVIEW. The Azure Preview Supplem",2025-03-02T12:20:00.000Z,reference,integrations,0.85,True,"Explicitly a reference for the STIX objects upload API with example requests/responses. Contains endpoint URLs, parameter names, and constraints, which are core integration details.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/summary-rules,Aggregate data with summary rules,Aggregate Microsoft Sentinel data with summary rules,Configure and use summary rules to aggregate Sentinel data,Learn how to aggregate large sets of Microsoft Sentinel data across log tiers with summary rules.,"Use summary rules in Microsoft Sentinel to aggregate large sets of data in the background for a smoother security operations experience across all log tiers. 
Summary data is precompiled in custom log tables and provides fast query performance, including queries run on data derived from low-cost log tiers. Summary rules can help optimize your data for: Microsoft Sentinel stores summary rule results in custom tables with the Analytics data plan. For more information on data plans and storage costs, seeL",2025-07-01T08:00:00.000Z,how-to,configuration,0.7,True,"Describes how to set up summary rules, including how they write to custom tables and interact with log tiers—Sentinel-specific configuration behavior.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/surface-custom-details-in-alerts,Surface custom details in alerts,Surface custom details in Microsoft Sentinel alerts,Surface custom event details in Sentinel alerts,"Extract and surface custom event details in alerts in Microsoft Sentinel analytics rules, for better and more complete incident information","Scheduled query analytics rules analyze events from data sources connected to Microsoft Sentinel, and produce alerts when the contents of these events are significant from a security perspective. These alerts are further analyzed, grouped, and filtered by Microsoft Sentinel's various engines and distilled into incidents that warrant a SOC analyst's attention. However, when the analyst views the incident, only the properties of the component alerts themselves are immediately visible. 
Getting to the actu",2025-09-16T17:23:00.000Z,how-to,configuration,0.8,True,Explains extracting and surfacing custom details in alerts; involves rule configuration fields and mappings that are product-specific.,unchanged
+gathered through the Office Management API, which uses a structured schema.",2026-04-22T17:56:00.000Z,reference,configuration,0.7,True,"Lists supported audit log record types and activities for the connector, which is detailed, product-specific capability/configuration information.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/microsoft-sentinel-defender-portal,Microsoft Sentinel SIEM experience in Defender portal,Microsoft Sentinel in the Microsoft Defender portal,Use Microsoft Sentinel within the Defender portal,Learn about the Microsoft Sentinel experience when you onboard Microsoft Sentinel to the Microsoft Defender portal.,"Microsoft Defender provides a unified cybersecurity solution that integrates endpoint protection, cloud security, identity protection, email security, threat intelligence, exposure management, and SIEM into a centralized platform powered by a modern data lake. It uses AI-driven defense to help organizations anticipate and stop attacks, ensuring efficient and effective security operations. 
Microsoft Sentinel is generally available in the Microsoft Defender portal, either with Microsoft Defender XD",2026-04-22T17:56:00.000Z,overview,configuration,0.65,True,"Explains Sentinel experience in Defender portal; likely includes portal-specific enablement, navigation, and configuration nuances unique to this integration.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/migration,Plan your migration,Plan your migration to Microsoft Sentinel,Plan migration from legacy SIEMs to Microsoft Sentinel,"Discover the reasons for migrating from a legacy SIEM, and learn how to plan out the different phases of your migration.","Security operations center (SOC) teams use centralized security information and event management (SIEM) and security orchestration, automation, and response (SOAR) solutions to protect their increasingly decentralized digital estate. While legacy SIEMs can maintain good coverage of on-premises assets, on-premises architectures may have insufficient coverage for cloud assets, such as in Azure, Microsoft 365, AWS, or Google Cloud Platform (GCP). In contrast, Microsoft Sentinel can ingest data from",2026-04-22T17:56:00.000Z,how-to,decision-making,0.75,True,"Covers reasons, phases, and planning for migrating from legacy SIEM to Sentinel; includes scenario-based migration guidance and trade-offs, fitting decision-making/migration criteria.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/migration-arcsight-automation,Migrate SOAR automation,Migrate ArcSight SOAR automation to Microsoft Sentinel,Migrate ArcSight SOAR automation to Sentinel playbooks,"Learn how to identify SOAR use cases, and how to migrate your ArcSight SOAR automation to Microsoft Sentinel.","Microsoft Sentinel provides Security Orchestration, Automation, and Response (SOAR) capabilities with automation rules and playbooks. Automation rules automate incident handling and response, and playbooks run predetermined sequences of actions to respond to and remediate threats. 
This article discusses how to identify SOAR use cases, and how to migrate your ArcSight SOAR automation to Microsoft Sentinel. Automation rules simplify complex workflows for your incident orchestration processes, and allow",2026-04-22T17:56:00.000Z,how-to,decision-making,0.7,True,Explains how to identify SOAR use cases and migrate ArcSight automation to Sentinel automation rules/playbooks; product-specific migration and approach-selection guidance.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/migration-arcsight-detection-rules,Migrate detection rules,Migrate ArcSight detection rules to Microsoft Sentinel,Map and migrate ArcSight detection rules to Sentinel,"Identify, compare, and migrate your ArcSight detection rules to Microsoft Sentinel built-in rules.","This article describes how to identify, compare, and migrate your ArcSight detection rules to Microsoft Sentinel analytics rules.",2026-04-22T17:56:00.000Z,how-to,decision-making,0.7,True,"Provides guidance on identifying, comparing, and mapping ArcSight rules to Sentinel analytics rules; this is migration and selection guidance between rule sets.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/migration-arcsight-historical-data,Export historical data,Microsoft Sentinel migration: Export ArcSight data to target platform,Export ArcSight historical data for Sentinel migration,Learn how to export your historical data from ArcSight.,"This article describes how to export your historical data from ArcSight. After you complete the steps in this article, you can select a target platform to host the exported data, and then select an ingestion tool to migrate the data. You can export data from ArcSight in several ways. Your selection of an export method depends on the data volumes and the deployed ArcSight environment. You can export the logs to a local folder on the ArcSight server or to another server accessible by ArcSight. 
To expo",2026-04-22T17:56:00.000Z,how-to,decision-making,0.7,True,"Details export methods based on data volume and environment; includes guidance on choosing export approaches, which is migration decision-making specific to ArcSight-to-Azure scenarios.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/migration-convert-dashboards,Convert dashboards to workbooks,Convert dashboards to Azure Workbooks in Microsoft Sentinel,Convert legacy SIEM dashboards to Sentinel workbooks,"Learn how to review, plan, and migrate your current dashboards to Azure Workbooks.","Convert dashboards from your existing security information and event management (SIEM) solution to an Azure workbook for Microsoft Sentinel. Azure Workbooks provide versatility to create custom dashboards for Microsoft Sentinel. This article describes how to review, plan, and convert your current dashboards to Azure Workbooks.",2026-04-22T17:56:00.000Z,how-to,decision-making,0.65,True,"Covers reviewing, planning, and converting dashboards to Azure Workbooks; includes migration planning and mapping decisions between dashboard models.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/migration-export-ingest,Ingest data,Microsoft Sentinel migration: Ingest data into target platform,,Learn how to ingest historical data into your selected target platform.,"In previous articles, you selected a target platform for your historical data. You also selected a tool to transfer your data and stored the historical data in a staging location. You can now start to ingest the data into the target platform. 
This article describes how to ingest your historical data into your selected target platform.",2026-04-22T17:56:00.000Z,how-to,,0.5,False,Describes how to ingest data into the chosen platform; likely procedural without explicit comparison matrices or limits; more of a step-by-step ingestion guide.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/migration-ingestion-target-platform,Select target platform,Microsoft Sentinel migration: Select a target Azure platform to host exported data,Choose Azure target platform for Sentinel historical data,Select a target Azure platform to host the exported historical data,"One of the important decisions you make during your migration process is where to store your historical data. To make this decision, you need to understand and be able to compare the various target platforms. This article compares target platforms in terms of performance, cost, usability, and management overhead. Note The considerations in this table only apply to historical log migration, and don't apply in other scenarios, such as long-term retention.",2026-04-22T17:56:00.000Z,how-to,decision-making,0.85,True,"Explicitly compares Azure target platforms for historical data in terms of performance, cost, usability, and management; a clear decision matrix for platform selection.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/migration-ingestion-tool,Select data ingestion tool,Microsoft Sentinel migration: Select a data ingestion tool,Select data ingestion tools for Sentinel migration,Select a tool to transfer your historical data to the selected target platform.,"After you select a target platform for your historical data, the next step is to select a tool to transfer your data. This article describes a set of different tools used to transfer your historical data to the selected target platform. 
This table lists the tools available for each target platform, and general tools to help you with the ingestion process.",2026-04-22T17:56:00.000Z,how-to,decision-making,0.8,True,"Provides a table of tools per target platform and general tools, helping choose ingestion tooling; product-specific migration and tool-selection guidance.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/migration-qradar-automation,Migrate SOAR automation,Migrate IBM Security QRadar SOAR automation to Microsoft Sentinel,Migrate QRadar SOAR automation to Sentinel,"Learn how to identify SOAR use cases, and how to migrate your QRadar SOAR automation to Microsoft Sentinel.","Microsoft Sentinel provides Security Orchestration, Automation, and Response (SOAR) capabilities with automation rules and playbooks. Automation rules automate incident handling and response, and playbooks run predetermined sequences of actions to respond to and remediate threats. This article discusses how to identify SOAR use cases, and how to migrate your IBM Security QRadar SOAR automation to Microsoft Sentinel. 
Automation rules simplify complex workflows for your incident orchestration processes",2026-04-22T17:56:00.000Z,how-to,decision-making,0.7,True,Guides identification of SOAR use cases and migration of QRadar automation to Sentinel automation rules/playbooks; focused on migration and approach selection.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/migration-qradar-detection-rules,Migrate detection rules,Migrate QRadar detection rules to Microsoft Sentinel,Migrate QRadar detection rules to Microsoft Sentinel,"Identify, compare, and migrate your QRadar detection rules to Microsoft Sentinel built-in rules.","This article describes how to identify, compare, and migrate your QRadar detection rules to Microsoft Sentinel built-in rules.",2026-04-22T17:56:00.000Z,how-to,decision-making,0.7,True,"Covers identifying, comparing, and migrating QRadar rules to Sentinel built-in rules; product-specific migration and mapping guidance.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/migration-qradar-historical-data,Export historical data,Microsoft Sentinel migration: Export QRadar data to target platform,Export QRadar historical data for Sentinel migration,Learn how to export your historical data from QRadar.,"This article describes how to export your historical data from QRadar. After you complete the steps in this article, you can select a target platform to host the exported data, and then select an ingestion tool to migrate the data. To export your QRadar data, you use the QRadar REST API to run Ariel Query Language (AQL) queries on data stored in an Ariel database. 
Because the export process is resource intensive, we recommend that you use small time ranges in your queries, and only migrate the data ",2026-04-22T17:56:00.000Z,how-to,decision-making,0.7,True,Describes exporting QRadar data via REST API and AQL with guidance on query ranges and data selection; product-specific migration/export decision guidance.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/migration-security-operations-center-processes,Update SOC processes,Microsoft Sentinel migration: Update SOC and analyst processes,Update SOC and analyst processes for Sentinel migration,Learn how to update your SOC and analyst processes as part of your migration to Microsoft Sentinel.,"A security operations center (SOC) is a centralized function within an organization that integrates people, processes, and technology. A SOC implements the organization's overall cybersecurity framework. The SOC coordinates the organizational efforts to monitor, alert, prevent, detect, analyze, and respond to cybersecurity incidents. SOC teams, led by a SOC manager, may include incident responders, SOC analysts at levels 1, 2, and 3, threat hunters, and incident response managers. SOC teams use",2026-04-22T17:56:00.000Z,how-to,decision-making,0.65,True,Guides how to adapt SOC roles and processes when moving to Sentinel; product-specific operational process changes and migration considerations.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/migration-splunk-automation,Migrate SOAR automation,Migrate Splunk SOAR automation to Microsoft Sentinel,Migrate Splunk SOAR automation to Sentinel automation,"Learn how to identify SOAR use cases, and how to migrate your Splunk SOAR automation to Microsoft Sentinel.","Microsoft Sentinel provides Security Orchestration, Automation, and Response (SOAR) capabilities with automation rules and playbooks. 
Automation rules facilitate simple incident handling and response, while playbooks run more complex sequences of actions to respond to and remediate threats. This article discusses how to identify SOAR use cases, and how to migrate your Splunk SOAR automation to Microsoft Sentinel automation rules and playbooks. For more information about the differences between auto",2026-04-22T17:56:00.000Z,how-to,decision-making,0.7,True,Describes how to identify SOAR use cases and migrate Splunk automation to Sentinel automation rules/playbooks; focused on migration choices and mapping between systems.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/migration-splunk-detection-rules,Migrate detection rules,Migrate Splunk detection rules to Microsoft Sentinel - Microsoft Sentinel,Migrate Splunk detection rules to Microsoft Sentinel,"Learn how to identify, compare, and migrate your Splunk detection rules to Microsoft Sentinel built-in rules.","Splunk detection rules are security information and event management (SIEM) components that compare to analytics rules in Microsoft Sentinel. This article describes the concepts to identify, compare, and migrate them to Microsoft Sentinel. The best way is to start with the SIEM migration experience, which identifies out-of-the-box (OOTB) analytics rules to automatically translate to. 
If you want to migrate your Splunk Observability deployment, learn more about how to migrate from Splunk to Azure M",2026-04-22T17:56:00.000Z,how-to,decision-making,0.7,True,Guides identification and mapping of Splunk detection rules to Sentinel analytics rules and use of the SIEM migration experience; product-specific migration and rule-selection guidance.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/migration-splunk-historical-data,Export historical data,Export Splunk data to target platform - Microsoft Sentinel,Export Splunk historical data for Sentinel migration,Learn how to export your historical data from Splunk for a Microsoft Sentinel migration of security monitoring use cases.,"This article describes how to export your historical data from Splunk. After you complete the steps in this article, you can select a target platform to host the exported data, and then select an ingestion tool to migrate the data. You can export data from Splunk in several ways. Your selection of an export method depends on the data volumes involved and your level of interactivity. For example, exporting a single, on-demand search via Splunk Web might be appropriate for a low-volume export. Alterna",2026-04-22T17:56:00.000Z,how-to,decision-making,0.7,True,"Explains multiple export methods based on data volume and interactivity; provides concrete guidance on choosing export approaches, which is migration decision-making.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/migration-track,Track migration with a workbook,Track your Microsoft Sentinel migration with a workbook,,"Learn how to track your migration with a workbook, how to customize and manage the workbook, and how to use the workbook tabs for useful Microsoft Sentinel actions.","As your organization's security operations center (SOC) handles growing amounts of data, it's essential to plan and monitor your deployment status. 
While you can track your migration process using generic tools such as Microsoft Project, Microsoft Excel, Microsoft Teams, or Azure DevOps, these tools aren’t specific to security information and event management (SIEM) migration tracking. To help you track, we provide a dedicated workbook in Microsoft Sentinel named Microsoft Sentinel Deployment ",2026-04-22T17:56:00.000Z,how-to,,0.45,False,"Workbook-based tracking of migration status; mostly procedural and dashboard usage, without clear config matrices, limits, or troubleshooting mappings.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/mitre-coverage,MITRE ATT&CK coverage,View MITRE coverage for your organization from Microsoft Sentinel,,"Learn how to view the coverage indicator in Microsoft Sentinel for MITRE tactics that are currently covered, and available to configure, for your organization.","MITRE ATT&CK is a publicly accessible knowledge base of tactics and techniques that are commonly used by attackers, and is created and maintained based on real-world observations. Many organizations use the MITRE ATT&CK knowledge base to develop specific threat models and methodologies that are used to verify security status in their environments. 
Microsoft Sentinel analyzes ingested data, not only to detect threats and help you investigate, but also to visualize the nature and coverage of your ",2026-04-22T17:56:00.000Z,how-to,,0.35,False,"MITRE coverage visualization article; mainly explains how to view coverage indicators, not detailed configuration parameters, numeric thresholds, or troubleshooting mappings.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/monitor-analytics-rule-integrity,Audit and monitor the health of analytics rules,Monitor the health and audit the integrity of your Microsoft Sentinel analytics rules,Monitor health and integrity of Sentinel analytics rules,Use the SentinelHealth data table to keep track of your analytics rules' execution and performance.,"To ensure comprehensive, uninterrupted, and tampering-free threat detection in your Microsoft Sentinel service, keep track of your analytics rules' health and integrity. Keep them functioning optimally by monitoring their execution insights, by querying the health and audit logs, and by using manual rerun to test and optimize your rules. Set up notifications of health and audit events for relevant stakeholders, who can then take action. 
For example, define and send email or Microsoft Teams message",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,Explains using SentinelHealth and audit logs plus manual rerun for analytics rules; product-specific monitoring configuration and queries.,updated +https://learn.microsoft.com/en-us/azure/sentinel/monitor-automation-health,Monitor automation rules and playbooks health,Monitor the health of your Microsoft Sentinel automation rules and playbooks,Monitor Sentinel automation rules and playbook health,Use the SentinelHealth and AzureDiagnostics data tables to keep track of your automation rules' and playbooks' execution and performance.,"To ensure proper functioning and performance of your security orchestration, automation, and response operations in your Microsoft Sentinel service, keep track of the health of your automation rules and playbooks by monitoring their execution logs. Set up notifications of health events for relevant stakeholders, who can then take action. For example, define and send email or Microsoft Teams messages, create new tickets in your ticketing system, and so on. This article describes how to use Micros",2026-04-22T17:56:00.000Z,how-to,configuration,0.75,True,Uses SentinelHealth and AzureDiagnostics tables to track automation; includes product-specific table names and query patterns.,updated +https://learn.microsoft.com/en-us/azure/sentinel/monitor-data-connector-health,Monitor data connector health,Monitor the health of your Microsoft Sentinel data connectors,Monitor Sentinel data connector health with SentinelHealth,Use the SentinelHealth data table and the Health Monitoring workbook to keep track of your data connectors' connectivity and performance.,"To ensure complete and uninterrupted data ingestion in your Microsoft Sentinel service, keep track of your data connectors' health, connectivity, and performance. 
The following features allow you to perform this monitoring from within Microsoft Sentinel: Data collection health monitoring workbook: This workbook provides additional monitors, detects anomalies, and gives insight regarding the workspace’s data ingestion status. You can use the workbook’s logic to monitor the general health of the i",2026-04-22T17:56:00.000Z,how-to,configuration,0.75,True,Describes using SentinelHealth table and Health Monitoring workbook; includes specific workbook logic and table usage unique to Sentinel connectors.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/monitor-key-vault-honeytokens,Deploy and monitor decoy honeytokens,Deploy and monitor Azure Key Vault honeytokens with Microsoft Sentinel,,"Plant Azure Key Vault honeytoken keys and secrets, and monitor them with Microsoft Sentinel.","Important The Microsoft Sentinel Deception (Honey Tokens) solution is offered in a community-supported model by the Microsoft SIEM & XDR Community. Any support required can be raised as an issue on GitHub where the Microsoft Sentinel community can assist. For solution documentation, review the Honeytokens solution GitHub page.",2026-04-22T17:56:00.000Z,how-to,,0.4,False,High-level description of Key Vault honeytokens and community support; detailed implementation is offloaded to GitHub.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/monitor-optimize-analytics-rule-execution,Monitor and optimize execution of analytics rules,Monitor and optimize the execution of your Microsoft Sentinel scheduled analytics rules,Monitor and rerun Sentinel scheduled analytics rules,"Use Microsoft Sentinel's execution management tools, rule insights and manual rerun, to test and manage your scheduled analytics rules' execution.","To ensure that Microsoft Sentinel's threat detection provides complete coverage in your environment, take advantage of its execution management tools. 
These tools consist of insights on your scheduled analytics rules' execution, based on Microsoft Sentinel's health and audit data, and a facility to manually rerun previous executions of rules on specific time windows, for testing and/or troubleshooting purposes. Important Microsoft Sentinel's analytics rule insights and manual rerun are currently in",2026-04-22T17:56:00.000Z,feature-guide,configuration,0.7,True,Describes rule execution insights and manual rerun; includes Sentinel-specific execution management features and their configuration.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/monitor-sap-system-health,Monitor SAP system health and role,Monitor the health of the connection between Microsoft Sentinel and your SAP system,Monitor health of Sentinel–SAP connectivity,Use the SAP connector page and a dedicated alert rule template to keep track of your SAP systems' connectivity and performance.,"After you deploy the SAP solution, you want to ensure proper functioning and performance of your SAP systems, and keep track of your system health, connectivity, and performance. This article describes how you can check the connectivity health manually on the data connector page and use a dedicated alert rule template to monitor the health of your SAP systems. Important Monitoring the health of your SAP systems and the agentless data connector for SAP are both currently in PREVIEW. The Azure Previ",2026-04-22T17:56:00.000Z,how-to,configuration,0.65,True,Explains how to use connector page and alert rule template to monitor health; includes product-specific configuration of monitoring rules.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/monitor-your-data,View customized views with workbooks,Visualize your data using workbooks in Microsoft Sentinel,,Learn how to visualize your data using workbooks in Microsoft Sentinel.,"After you connect your data sources to Microsoft Sentinel, visualize and monitor the data using workbooks in Microsoft Sentinel. 
Microsoft Sentinel workbooks are based on Azure Monitor workbooks, and add tables and charts with analytics for your logs and queries to the tools already available in Azure. Microsoft Sentinel allows you to create custom workbooks across your data or use existing workbook templates available with packaged solutions or as standalone content from the content hub. Each w",2026-04-22T17:56:00.000Z,how-to,,0.25,False,"Workbook visualization article; likely shows how to create and use workbooks, but summary doesn’t indicate detailed config tables or numeric constraints.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/move-to-defender,Transition to the Defender portal,Transition Your Microsoft Sentinel Environment to the Defender Portal,Transition Sentinel operations from Azure to Defender portal,Move Microsoft Sentinel operations from the Azure portal to the Microsoft Defender portal.,"Microsoft Sentinel is available in the Microsoft Defender portal with Microsoft Defender XDR or on its own. It delivers a unified experience across SIEM and XDR for faster, more accurate threat detection and response, simpler workflows, and better operational efficiency. This article explains how to transition your Microsoft Sentinel experience from the Azure portal to the Defender portal. 
If you use Microsoft Sentinel in the Azure portal, transition to Microsoft Defender for unified security oper",2026-04-22T17:56:00.000Z,how-to,decision-making,0.65,True,"Guides when and how to move from Azure portal to Defender portal, with implications for workflows and operations; this is migration/choice guidance specific to Sentinel environments.",updated +https://learn.microsoft.com/en-us/azure/sentinel/mssp-protect-intellectual-property,Manage your intellectual property in Microsoft Sentinel,Protecting managed security service provider (MSSPs) intellectual property in Microsoft Sentinel,Protect MSSP intellectual property in Sentinel deployments,Learn about how managed security service providers (MSSPs) can protect the intellectual property they've created in Microsoft Sentinel.,"This article describes the methods that managed security service providers (MSSPs) can use to protect intellectual property they've developed in Microsoft Sentinel, such as Microsoft Sentinel analytics rules, hunting queries, playbooks, and workbooks. The method you choose depends on how each of your customers buys Azure; whether you act as a Cloud Solutions Provider (CSP), or the customer has an Enterprise Agreement (EA)/Pay-as-you-go (PAYG) account. 
The following sections describe each of these m",2026-04-22T17:56:00.000Z,how-to,decision-making,0.6,True,"Describes methods to protect analytics rules, playbooks, etc., depending on CSP vs EA/PAYG; product- and channel-specific decision guidance.",updated +https://learn.microsoft.com/en-us/azure/sentinel/multiple-tenants-service-providers,Manage multiple tenants (MSSP),Manage multiple tenants in Microsoft Sentinel as a Managed Security Service Provider,Onboard and manage multiple Sentinel tenants as an MSSP,How to onboard and manage multiple tenants in Microsoft Sentinel as a Managed Security Service Provider (MSSP) using Azure Lighthouse.,"If you're a managed security service provider (MSSP) and you're using Azure Lighthouse to offer security operations center (SOC) services to your customers, you can manage your customers' Microsoft Sentinel resources directly from your own Azure tenant, without having to connect to the customer's tenant. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Microsoft Sentine",2026-04-22T17:56:00.000Z,how-to,architecture-patterns,0.7,True,Describes MSSP multi-tenant management using Azure Lighthouse; includes Sentinel-specific operational patterns and architecture choices.,updated +https://learn.microsoft.com/en-us/azure/sentinel/multiple-workspace-view,Work with incidents in multiple workspaces,Work with Microsoft Sentinel incidents in many workspaces at once,View and manage Sentinel incidents across multiple workspaces,How to view incidents in multiple workspaces concurrently in Microsoft Sentinel.,"To take full advantage of Microsoft Sentinel’s capabilities, Microsoft recommends using a single-workspace environment. 
However, there are some use cases that require having several workspaces, in some cases – for example, that of a Managed Security Service Provider (MSSP) and its customers – across multiple tenants. Multiple workspace view lets you see and work with security incidents across several workspaces at the same time, even across tenants, allowing you to maintain full visibility and contr",2026-04-22T17:56:00.000Z,how-to,configuration,0.65,True,Explains configuring and using multiple workspace incident view; product-specific UI/feature configuration and behavior.,updated +https://learn.microsoft.com/en-us/azure/sentinel/near-real-time-rules,Overview,Quick threat detection with near-real-time (NRT) analytics rules in Microsoft Sentinel,Configure near-real-time analytics rules in Sentinel,This article explains how the new near-real-time (NRT) analytics rules can help you detect threats quickly in Microsoft Sentinel.,"When you're faced with security threats, time and speed are of the essence. You need to be aware of threats as they materialize so you can analyze and respond quickly to contain them. Microsoft Sentinel's near-real-time (NRT) analytics rules offer you faster threat detection—closer to that of an on-premises SIEM—and the ability to shorten response times in specific scenarios. Microsoft Sentinel’s near-real-time analytics rules provide up-to-the-minute threat detection out-of-the-box. 
This type of ",2026-04-22T17:56:00.000Z,concept-article,configuration,0.65,True,"NRT rules have specific configuration characteristics (frequency, supported operators, constraints) that differ from scheduled rules; article likely documents these product-specific settings.",updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization,ASIM overview,Normalization and the Advanced Security Information Model (ASIM),,This article explains how Microsoft Sentinel normalizes data from many different sources using the Advanced Security Information Model (ASIM),"Microsoft Sentinel ingests data from many sources. Working with various data types and tables together requires you to understand each of them, and write and use unique sets of data for analytics rules, workbooks, and hunting queries for each type or schema. Sometimes, you'll need separate rules, workbooks, and queries, even when data types share common elements, such as firewall devices. Correlating between different types of data during an investigation and hunting can also be challenging. The",2026-04-22T17:56:00.000Z,concept-article,,0.2,False,"Conceptual explanation of normalization and ASIM; no specific configs, limits, or troubleshooting content indicated.",updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-about-parsers,Use ASIM,Use Advanced Security Information Model (ASIM) parsers,Use ASIM KQL parsers in Sentinel queries,This article explains how to use Kusto Query Language (KQL) functions as query-time parsers to implement the Advanced Security Information Model (ASIM),Use Advanced Security Information Model (ASIM) parsers instead of table names in your Microsoft Sentinel queries to view data in a normalized format and to include all data relevant to the schema in your query. 
Refer to the table below to find the relevant parser for each schema.,2026-04-22T17:56:00.000Z,concept-article,integrations,0.65,True,"Explains how to use specific KQL user-defined functions as parsers, with a table mapping schemas to parser function names—this is product-specific integration/query pattern detail.",updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-about-schemas,ASIM schemas,Advanced Security Information Model (ASIM) schemas,,"This article explains Advanced Security Information Model (ASIM) schemas, and how they help. ASIM normalizes data from many different sources to a uniform presentation.","An Advanced Security Information Model (ASIM) schema is a set of fields that represent an activity or entity. Using the fields from a normalized schema in a query ensures that the query works with every normalized source. To understand how schemas fit within the ASIM architecture, refer to the ASIM architecture diagram.",2026-04-22T17:56:00.000Z,article,,0.3,False,Describes ASIM schemas conceptually; likely field lists but summary suggests architecture/overview rather than config tables or numeric thresholds.,updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-about-workspace-parsers,Workspace deployed parsers,Advanced Security Information Model (ASIM) workspace deployed parsers - Microsoft Sentinel,Manage workspace-deployed ASIM parsers in Sentinel,This article explains how to manage and use workspace deployed Advanced Security Information Model (ASIM) parsers,Workspace deployed Advanced Security Information Model (ASIM) parsers are used to support developing and modifying ASIM parsers.,2026-04-22T17:56:00.000Z,concept-article,configuration,0.65,True,"Covers managing workspace-deployed parsers; likely includes specific parser resource names, deployment options, and configuration behaviors unique to Sentinel.",updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-common-fields,ASIM common fields,The 
Advanced Security Information Model (ASIM) common schema fields reference,Reference common ASIM schema fields in Sentinel,This article describes the Advanced Security Information Model (ASIM) common schema fields,"Some fields are common to all ASIM schemas. Each schema might add guidelines for using some of the common fields in the context of the specific schema. For example, permitted values for the EventType field might vary per schema, as might the value of the EventSchemaVersion field.",2026-04-22T17:56:00.000Z,reference,configuration,0.8,True,Describes common schema fields and permitted values per schema; detailed field-level configuration for ASIM.,updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-content,ASIM content,Advanced Security Information Model (ASIM) security content,Use ASIM-based normalized security content in Sentinel,This article outlines the Microsoft Sentinel security content that uses the Advanced Security Information Model (ASIM).,"Normalized security content in Microsoft Sentinel includes analytics rules, hunting queries, and workbooks that work with unifying normalization parsers. You can find normalized, built-in content in Microsoft Sentinel galleries and solutions, create your own normalized content, or modify existing content to use normalized data. This article lists built-in Microsoft Sentinel content that has been configured to support the Advanced Security Information Model (ASIM). 
While links to the Microsoft Sen",2026-04-22T17:56:00.000Z,reference,configuration,0.7,True,"Lists built-in content configured for ASIM; includes mappings between content and normalization schemas, which is product-specific configuration knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-develop-parsers,Develop ASIM parsers,Develop Microsoft Sentinel Advanced Security Information Model (ASIM) parsers,Develop and deploy custom ASIM parsers,"This article explains how to develop, test, and deploy Microsoft Sentinel Advanced Security Information Model (ASIM) parsers.","Advanced Security Information Model (ASIM) users use unifying parsers instead of table names in their queries, to view data in a normalized format and to include all data relevant to the schema in the query. Unifying parsers, in turn, use source-specific parsers to handle the specific details of each source. Microsoft Sentinel provides built-in, source-specific parsers for many data sources. You may want to modify, or develop, these source-specific parsers in the following situations: When your devic",2026-04-22T17:56:00.000Z,how-to,integrations,0.7,True,"Focuses on developing, testing, and deploying KQL-based parsers; likely includes function signatures, parameters, and deployment patterns unique to Sentinel/ASIM.",updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-entity-application,ASIM application entity,The Advanced Security Information Model (ASIM) Application Entity reference - Microsoft Sentinel,Implement ASIM Application Entity schema in Sentinel,This article displays the Microsoft Sentinel Application Entity schema.,,2026-04-22T17:56:00.000Z,reference,configuration,0.75,True,Application Entity schema reference; contains product-specific field definitions and usage patterns.,updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-entity-device,ASIM device entity,The Advanced Security Information Model (ASIM) Device Entity 
reference,Implement ASIM Device Entity schema in Microsoft Sentinel,This article displays the Microsoft Sentinel Device Entity schema.,"Devices, or hosts, are the common terms used for the systems that take part in the event. The Dvc prefix is used to designate the primary device on which the event occurs. Some events, such as network sessions, have source and destination devices, designated by the prefixes Src and Dst. In such a case, the Dvc prefix is used for the device reporting the event, which might be the source, destination, or a monitoring device.",2026-04-22T17:56:00.000Z,reference,configuration,0.8,True,Device Entity schema reference with Dvc/Src/Dst semantics; detailed field-level configuration for normalized device data.,updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-entity-user,ASIM user entity,The Advanced Security Information Model (ASIM) User Entity reference,Implement ASIM User Entity schema in Microsoft Sentinel,This article displays the Microsoft Sentinel User Entity schema.,"Users are central to activities reported by events. The user entity fields listed in this section are used to describe the users involved in the action. When used in an event, prefixes are used to designate the role of a user entity in the activity. The prefixes Src and Dst are used to designate the user role in network related events, in which a source system and a destination system communicate. 
The prefixes 'Actor' and 'Target' are used for system-oriented events such as process events.",2026-04-22T17:56:00.000Z,reference,configuration,0.8,True,"User Entity schema reference; documents entity fields and role-specific prefixes, which are detailed schema configuration.",updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-functions,ASIM helper functions,Advanced Security Information Model (ASIM) helper functions,Use ASIM helper functions for normalized Sentinel data,This article outlines the Microsoft Sentinel Advanced Security Information Model (ASIM) helper functions.,Advanced Security Information Model (ASIM) helper functions extend the KQL language providing functionality that helps interact with normalized data and in writing parsers.,2026-04-22T17:56:00.000Z,reference,integrations,0.7,True,Documents helper KQL functions for interacting with normalized data; includes function signatures and parameters unique to ASIM/Sentinel.,updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-ingest-time,Ingest-time normalization,Ingest time normalization,,This article explains how Microsoft Sentinel normalizes data at ingest,,2026-04-22T17:56:00.000Z,concept-article,,0.2,False,"Ingest-time normalization article is likely conceptual/architectural; no evidence of numeric limits, config tables, or troubleshooting patterns in the description.",updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-known-issues,ASIM known issues,Advanced Security Information Model (ASIM) known issues,Review ASIM normalization known issues and limitations,This article outlines the Microsoft Sentinel Advanced Security Information Model (ASIM) known issues.,The following are the Advanced Security Information Model (ASIM) known issues and limitations:,2026-04-22T17:56:00.000Z,reference,limits-quotas,0.7,True,Explicitly lists known issues and limitations; likely includes concrete constraints and unsupported scenarios for ASIM in 
Sentinel.,updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-manage-parsers,Manage ASIM parsers,Manage Advanced Security Information Model (ASIM) parsers,Manage and customize ASIM parsers in Sentinel,"This article explains how to manage Advanced Security Information Model (ASIM) parsers, add a custom parser, and replace a built-in parser.","Advanced Security Information Model (ASIM) users use unifying parsers instead of table names in their queries, to view data in a normalized format and get all the data relevant to the schema in a single query. Each unifying parser uses multiple source-specific parsers that handle each source's specific details. To understand how parsers fit within the ASIM architecture, refer to the ASIM architecture diagram. You may need to manage the source-specific parsers used by each unifying parser to: Add a ",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,"Describes adding custom parsers and replacing built-in ones; involves product-specific parser names, relationships (unifying vs source-specific), and configuration steps.",updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-modify-content,Modify content to use ASIM,Modify content to use the Microsoft Sentinel Advanced Security Information Model (ASIM),Convert Sentinel content to use ASIM normalization,This article explains how to convert Microsoft Sentinel content to use the Advanced Security Information Model (ASIM).,"Normalized security content in Microsoft Sentinel includes analytics rules, hunting queries, and workbooks that work with unifying normalization parsers. You can find normalized, out-of-the-box content in Microsoft Sentinel galleries and solutions, create your own normalized content, or modify existing, custom content to use normalized data. This article explains how to convert existing Microsoft Sentinel analytics rules to use normalized data with the Advanced Security Information Model (ASIM). 
To",2026-04-22T17:56:00.000Z,how-to,best-practices,0.65,True,Explains how to modify analytics rules and queries to use normalized data; contains product-specific guidance and patterns for migrating existing content.,updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-parsers-list,ASIM parsers,List of Microsoft Sentinel Advanced Security Information Model (ASIM) parsers,Reference list of ASIM parsers for Microsoft Sentinel,This article lists Advanced Security Information Model (ASIM) parsers.,"This document provides a list of Advanced Security Information Model (ASIM) parsers. For an overview of ASIM parsers refer to the parsers overview. To understand how parsers fit within the ASIM architecture, refer to the ASIM architecture diagram. Parsers that don't have a value under Uses pack parameter don't have the AdditionalFields column populated.",2026-04-22T17:56:00.000Z,reference,configuration,0.75,True,Provides a list of ASIM parsers and related parameters; this is a catalog of product-specific parser configurations.,updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-parsers-overview,ASIM parsers,Microsoft Sentinel Advanced Security Information Model (ASIM) parsers overview,,This article provides an overview of Advanced Security Information Model (ASIM) parsers and a link to more detailed ASIM parsers documents.,"In Microsoft Sentinel, parsing and normalizing happen at query time. Parsers are built as KQL user-defined functions that transform data in existing tables, such as CommonSecurityLog, custom logs tables, or Syslog, into the normalized schema. Users use Advanced Security Information Model (ASIM) parsers instead of table names in their queries to view data in a normalized format, and to include all data relevant to the schema in your query. 
To understand how parsers fit within the ASIM architecture, refe",2026-04-22T17:56:00.000Z,concept-article,,0.3,False,"Overview of ASIM parsers; summary indicates conceptual description and links to detailed docs, not specific configuration parameters or error mappings.",updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-alert,ASIM alert event schema,The Advanced Security Information Model (ASIM) Alert Events normalization schema reference,Use ASIM Alert Events normalization schema in Sentinel,This article displays the Microsoft Sentinel Alert Events normalization schema.,"The Microsoft Sentinel Alert Schema is designed to normalize security-related alerts from various products into a standardized format within Microsoft Advanced Security Information Model (ASIM). This schema focuses exclusively on security events, ensuring consistent and efficient analysis across different data sources. The Alert Schema represents various types of security alerts, such as threats, suspicious activities, user behavior anomalies and compliance violations. These alerts are reported ",2026-04-22T17:56:00.000Z,reference,configuration,0.8,True,"Alert Events schema reference; defines fields and structures for normalized alerts, which is detailed configuration knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-asset,ASIM asset entity schema,The Advanced Security Information Model (ASIM) Asset Entity normalization schema reference,Implement ASIM Asset Entity normalization schema in Sentinel,This article displays the Microsoft Sentinel Asset Entity normalization schema.,"The Microsoft Sentinel Asset Entity Schema is designed to normalize assets from various products into a standardized format within Microsoft Advanced Security Information Model (ASIM). This schema focuses exclusively on assets in non-Microsoft data sources, ensuring consistent and efficient analysis. 
An asset is any data resource that an organization stores, processes, or manages, such as a file or site. Each asset carries security-relevant metadata including ownership, permissions, sensitivity",2026-04-22T17:56:00.000Z,reference,configuration,0.8,True,"Schema reference for Asset Entity; includes field names, types, and usage rules that are detailed configuration knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-audit,ASIM audit event schema,The Advanced Security Information Model (ASIM) Audit Events normalization schema reference,Use ASIM Audit Events normalization schema in Sentinel,This article displays the Microsoft Sentinel Audit Events normalization schema.,"The Microsoft Sentinel Audit events normalization schema represents events associated with the audit trail of information systems. The audit trail logs system configuration activities and policy changes. Such changes are often performed by system administrators, but can also be performed by users when configuring the settings of their own applications. Every system logs audit events alongside its core activity logs. For example, a Firewall will log events about the network sessions it processes,",2026-04-22T17:56:00.000Z,reference,configuration,0.8,True,Audit Events schema reference; includes field definitions and semantics for normalized audit logs.,updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-authentication,ASIM authentication schema,The Advanced Security Information Model (ASIM) Authentication normalization schema reference,Use ASIM Authentication normalization schema in Sentinel,This article describes the Microsoft Sentinel Authentication normalization schema.,"The Microsoft Sentinel Authentication schema is used to describe events related to user authentication, sign-in, and sign-out. Authentication events are sent by many reporting devices, usually as part of the event stream alongside other events. 
For example, Windows sends several authentication events alongside other OS activity events. Authentication events include both events from systems that focus on authentication such as VPN gateways or domain controllers, and direct authentication to an en",2026-04-22T17:56:00.000Z,reference,configuration,0.8,True,Authentication schema reference; documents fields and event types for normalized authentication events.,updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-dhcp,ASIM DHCP schema,The Advanced Security Information Model (ASIM) DHCP normalization schema reference,Use ASIM DHCP normalization schema in Microsoft Sentinel,This article describes the Microsoft Sentinel DHCP normalization schema.,"The DHCP information model is used to describe events reported by a DHCP server, and is used by Microsoft Sentinel to enable source-agnostic analytics. For more information, see Normalization and the Advanced Security Information Model (ASIM).",2026-04-22T17:56:00.000Z,reference,configuration,0.8,True,DHCP information model schema; detailed field-level configuration for normalized DHCP events.,updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-dns,ASIM DNS schema,The Advanced Security Information Model (ASIM) DNS normalization schema reference,Use ASIM DNS normalization schema in Microsoft Sentinel,This article describes the Microsoft Sentinel DNS normalization schema.,"The DNS information model is used to describe events reported by a DNS server or a DNS security system, and is used by Microsoft Sentinel to enable source-agnostic analytics. 
For more information, see Normalization and the Advanced Security Information Model (ASIM).",2026-04-22T17:56:00.000Z,reference,configuration,0.8,True,"DNS information model schema; defines normalized DNS event fields and usage, which is expert configuration knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-file-event,ASIM file event schema,The Advanced Security Information Model (ASIM) File Event normalization schema reference,Use ASIM File Event normalization schema in Sentinel,This article describes the Microsoft Sentinel File Event normalization schema.,"The File Event normalization schema is used to describe file activity such as creating, modifying, or deleting files or documents. Such events are reported by operating systems, file storage systems such as Azure Files, and document management systems such as Microsoft SharePoint. For more information about normalization in Microsoft Sentinel, see Normalization and the Advanced Security Information Model (ASIM).",2026-04-22T17:56:00.000Z,reference,configuration,0.8,True,File Event schema reference; detailed configuration of normalized file activity fields and semantics.,updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-network,ASIM network session schema,The Advanced Security Information Model (ASIM) Network Session normalization schema reference,Use ASIM Network Session normalization schema in Sentinel,This article displays the Microsoft Sentinel Network Session normalization schema.,"The Microsoft Sentinel Network Session normalization schema represents an IP network activity, such as network connections and network sessions. Such events are reported, for example, by operating systems, routers, firewalls, and intrusion prevention systems. The network normalization schema can represent any type of an IP network session but is designed to provide support for common source types, such as Netflow, firewalls, and intrusion prevention systems. 
For more information about normalizat",2026-04-22T17:56:00.000Z,reference,configuration,0.8,True,"Network Session schema reference; defines normalized network activity fields and structures, which are product-specific configuration details.",updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-process-event,ASIM process event schema,The Advanced Security Information Model (ASIM) Process Event normalization schema reference,Use ASIM Process Event schema in Microsoft Sentinel,This article describes the Microsoft Sentinel Process Event normalization schema.,"The Process Event normalization schema is used to describe the operating system activity of running and terminating a process. Such events are reported by operating systems and security systems, such as EDR (End Point Detection and Response) systems. A process, as defined by OSSEM, is a containment and management object that represents a running instance of a program. While processes themselves do not run, they do manage threads that run and execute code. For more information about normalization",2026-04-22T17:56:00.000Z,reference,configuration,0.86,True,"Schema reference with detailed, product-specific field names, types, and usage for normalized process events, which are configuration-level details not inferable from general training.",updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-registry-event,ASIM registry event schema,The Advanced Security Information Model (ASIM) Registry Event normalization schema reference,Use ASIM Registry Event schema in Microsoft Sentinel,This article describes the Microsoft Sentinel Registry Event normalization schema.,"The Registry Event schema is used to describe the Windows activity of creating, modifying, or deleting Windows Registry entities. 
Registry events are specific to Windows systems, but are reported by different systems that monitor Windows, such as EDR (End Point Detection and Response) systems, Sysmon, or Windows itself. For more information about normalization in Microsoft Sentinel, see Normalization and the Advanced Security Information Model (ASIM).",2026-04-22T17:56:00.000Z,reference,configuration,0.86,True,"Normalization schema reference for Windows registry events with concrete field definitions and mappings, representing detailed configuration knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-user-management,ASIM user management schema,Microsoft Sentinel user management normalization schema reference,Apply user management normalization schema in Sentinel,This article describes the Microsoft Sentinel user management normalization schema.,"The Microsoft Sentinel user management normalization schema is used to describe user management activities, such as creating a user or a group, changing a user attribute, or adding a user to a group. Such events are reported, for example, by operating systems, directory services, identity management systems, and any other system reporting on its local user management activity. 
For more information about normalization in Microsoft Sentinel, see Normalization and the Advanced Security Information Mod",2026-04-22T17:56:00.000Z,reference,configuration,0.86,True,"Defines the Sentinel user management normalization schema with specific fields and structure, which is product-specific configuration reference.",updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-v1,Legacy network normalization schema,Microsoft Sentinel network normalization schema (Legacy version - Public preview),Use legacy Sentinel network normalization schema v0.1,This article displays the Microsoft Sentinel data normalization schema.,"The network normalization schema is used to describe reported network events, and is used by Microsoft Sentinel to enable unifying analytics. For more information, see Normalization and the Advanced Security Information Model (ASIM). Important This article relates to version 0.1 of the network normalization schema, which was released as a preview before ASIM was available. Version 0.2.x of the network normalization schema aligns with ASIM and provides other enhancements. For more information, see Di",2026-04-22T17:56:00.000Z,reference,configuration,0.8,True,"Legacy network normalization schema reference with explicit field definitions and version-specific structure, which is expert configuration knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-web,ASIM web session schema,The Advanced Security Information Model (ASIM) Web Session normalization schema reference,Implement ASIM Web Session normalization schema,This article displays the Microsoft Sentinel Web Session normalization schema.,"The Web Session normalization schema is used to describe an IP network activity. For example, IP network activities are reported by web servers, web proxies, and web security gateways. 
For more information about normalization in Microsoft Sentinel, see Normalization and the Advanced Security Information Model (ASIM).",2026-04-22T17:56:00.000Z,reference,configuration,0.86,True,"Web Session normalization schema reference with concrete field layout for IP network activity, a detailed configuration artifact.",updated +https://learn.microsoft.com/en-us/azure/sentinel/notebook-get-started,Get started with notebooks and MSTICPy,Get started with Jupyter notebooks and MSTICPy in Microsoft Sentinel,,Walk through the Getting Started Guide For Microsoft Sentinel ML Notebooks to learn the basics of Microsoft Sentinel notebooks with MSTICPy and queries.,"This article describes how to run the Getting Started Guide For Microsoft Sentinel ML Notebooks notebook, which sets up basic configurations for running Jupyter notebooks in Microsoft Sentinel and provides examples for running simple queries. The Getting Started Guide for Microsoft Sentinel ML Notebooks notebook uses MSTICPy, a powerful Python library designed to enhance security investigations and threat hunting within Microsoft Sentinel notebooks. It provides built-in tools for data enrichment, vis",2026-04-22T17:56:00.000Z,how-to,,0.35,False,Getting-started walkthrough for notebooks and MSTICPy; likely step-by-step tutorial without configuration matrices or numeric constraints in the summary.,updated +https://learn.microsoft.com/en-us/azure/sentinel/notebooks,Overview,Jupyter notebooks with Microsoft Sentinel hunting capabilities,,Learn about Jupyter notebooks in Microsoft Sentinel for security hunting.,"Jupyter notebooks combine full programmability with a huge collection of libraries for machine learning, visualization, and data analysis. These attributes make Jupyter a compelling tool for security investigation and hunting. The foundation of Microsoft Sentinel is the data store; it combines high-performance querying, dynamic schema, and scales to massive data volumes. 
The Azure portal and all Microsoft Sentinel tools use a common API to access this data store. The same API is also available f",2026-04-22T17:56:00.000Z,concept-article,,0.25,False,Overview of Jupyter notebooks with Sentinel; summary does not show specific Sentinel-only configs or limits.,updated +https://learn.microsoft.com/en-us/azure/sentinel/notebooks-hunt,Launch Jupyter notebook,Hunt for security threats with Jupyter notebooks - Microsoft Sentinel,,Launch and run notebooks with the Microsoft Sentinel hunting capabilities.,"As part of your security investigations and hunting, launch and run Jupyter notebooks to programmatically analyze your data. In this article, you create an Azure Machine Learning workspace, launch a notebook from Microsoft Sentinel to your Azure Machine Learning workspace, and run code in the notebook. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Microsoft Sentinel",2026-04-22T17:56:00.000Z,how-to,,0.3,False,Tutorial-style article on launching notebooks; no explicit expert-only configuration tables or limits in the summary.,updated +https://learn.microsoft.com/en-us/azure/sentinel/notebooks-msticpy-advanced,Configure advanced MSTICPy settings,Advanced configurations for Jupyter notebooks and MSTICPy in Microsoft Sentinel,Configure advanced MSTICPy and notebook settings for Sentinel,Learn about advanced configurations available for Jupyter notebooks with MSTICPy when working in Microsoft Sentinel.,"This article describes advanced configurations for working with Jupyter notebooks and MSTICPy in Microsoft Sentinel. 
For more information, see Use Jupyter notebooks to hunt for security threats and Get started with Jupyter notebooks and MSTICPy in Microsoft Sentinel.",2026-04-22T17:56:00.000Z,how-to,configuration,0.65,True,"Described as covering advanced configurations for Jupyter notebooks and MSTICPy in Sentinel, which typically includes product-specific config parameters and options beyond generic notebook usage.",updated +https://learn.microsoft.com/en-us/azure/sentinel/offboard,Remove Microsoft Sentinel from your workspaces,Remove Microsoft Sentinel from your workspace,,Learn how to delete your Microsoft Sentinel instance to discontinue use of Microsoft Sentinel and the associated costs.,"If you no longer want to use Microsoft Sentinel, this article explains how to remove it from your Log Analytics workspace. If you instead want to offboard Microsoft Sentinel from the Defender portal, see Offboard Microsoft Sentinel.",2026-04-22T17:56:00.000Z,how-to,,0.3,False,"Removal/offboarding how-to; likely step-based UI instructions without detailed limits, configs, or error mappings.",updated +https://learn.microsoft.com/en-us/azure/sentinel/offboard-implications,Overview,Implications - remove Microsoft Sentinel from workspace,Understand impacts and timing of removing Sentinel workspaces,Learn about the impact of removing a Microsoft Sentinel instance from a Log Analytics workspace in the Azure or Defender portal.,"If you decide that you no longer want to use your Microsoft Sentinel instance associated with a Log Analytics workspace, remove Microsoft Sentinel from the workspace. But before you do, consider the implications described in this article. It can take up to 48 hours for Microsoft Sentinel to be removed from the Log Analytics workspace. Data connector configuration and Microsoft Sentinel tables are deleted. Other resources and data are retained for a limited time. 
Your subscription continues to be",2026-04-22T17:56:00.000Z,concept-article,limits-quotas,0.7,True,"States specific timing (up to 48 hours) and retention behavior for resources and data after removal, which are concrete service limits/behaviors.",updated +https://learn.microsoft.com/en-us/azure/sentinel/ops-guide,SIEM operations guide,Operational guide - Microsoft Sentinel,Follow operational recommendations for Microsoft Sentinel SOCs,Learn about the operational recommendations to help security operations teams to plan and run security activities.,"This article lists the operational activities that we recommend security operations (SOC) teams and security administrators plan for and run as part of their regular security activities with Microsoft Sentinel. For more information about managing your security operations, see Security operations overview.",2026-04-22T17:56:00.000Z,reference,best-practices,0.7,True,An operational guide listing recommended recurring SOC activities in Sentinel; this is product-specific operational best-practice guidance beyond generic SOC theory.,updated +https://learn.microsoft.com/en-us/azure/sentinel/overview,Microsoft Sentinel SIEM overview,What is Microsoft Sentinel SIEM?,,"Learn about Microsoft Sentinel, a scalable, cloud-native SIEM and SOAR that uses AI, analytics, and automation for threat detection, investigation, and response.","Microsoft Sentinel is a cloud-native SIEM solution that delivers scalable, cost-efficient security across multicloud and multiplatform environments. It combines AI, automation, and threat intelligence to support threat detection, investigation, response, and proactive hunting. Microsoft Sentinel SIEM empowers analysts to anticipate and stop attacks across clouds and platforms, faster and with greater precision. This article highlights the key capabilities in Microsoft Sentinel. 
Microsoft Sentine",2026-04-22T17:56:00.000Z,overview,,0.3,False,"General SIEM overview of Sentinel capabilities; primarily conceptual without detailed configs, limits, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/sentinel/package-platform-solution,Packaging and publishing a Sentinel platform solution,Package and publish a Microsoft Sentinel platform solution,Package and deploy Microsoft Sentinel platform solutions,Learn how to package the components of a Microsoft Sentinel platform solution and publish the package in the Microsoft Security Store.,A Microsoft Sentinel platform solution is a deployable package for the Microsoft Sentinel data lake. It includes code and configuration that help you analyze and respond to security data. This article shows how to package and publish your completed platform solution in the Microsoft Security Store.,2026-04-22T17:56:00.000Z,how-to,deployment,0.65,True,"Packaging and publishing platform solutions to the Security Store; likely includes product-specific deployment artifacts, constraints, and supported components for this solution type.",updated +https://learn.microsoft.com/en-us/azure/sentinel/partner-integrations,Partner integrations best practices,Microsoft Sentinel components and patterns,Design integration patterns for Microsoft Sentinel solutions,This article describes best practices for creating your own integrations with Microsoft Sentinel.,"This article discusses the different components of a Microsoft Sentinel solution and how they can work together to address important customer scenarios. The Sentinel platform includes a data lake, graph, Jupyter notebook jobs, a Model Context Protocol (MCP) server, and data from more than 300 Sentinel connectors to help customers centralize and analyze their security data in a cost-efficient way. 
These capabilities plus Microsoft Security Copilot enable customers and partners to create impactful",2026-04-22T17:56:00.000Z,concept-article,architecture-patterns,0.65,True,"Described as components and patterns plus best practices for integrations; likely contains Sentinel-specific architectural patterns and guidance on how components work together, which is product-specific design knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/powerbi,Create a Power BI report,Create a Power BI report from Microsoft Sentinel data,Build Power BI reports from Sentinel data,Learn how to create a Power BI report using an exported query from Microsoft Sentinel. Share your report with others in the Power BI service and a Teams channel.,"Power BI is a reporting and analytics platform that turns data into coherent, immersive, interactive visualizations. Power BI lets you easily connect to data sources, visualize and discover relationships, and share insights with whoever you want. You can base Power BI reports on data from Microsoft Sentinel and share those reports with people who don't have access to Microsoft Sentinel. 
For example, you might want to share information about failed sign-in attempts with app owners, without grantin",2026-04-22T17:56:00.000Z,how-to,integrations,0.65,True,"Power BI integration from exported Sentinel queries typically includes connection details, dataset configuration, and query export specifics that are product-specific integration patterns.",updated +https://learn.microsoft.com/en-us/azure/sentinel/prepare-multiple-workspaces,Prepare for multiple workspaces,Prepare for multiple workspaces and tenants in Microsoft Sentinel,Design multi-workspace and multi-tenant Sentinel layouts,"To prepare for your deployment, learn how Microsoft Sentinel can extend across multiple workspaces and tenants.","Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Microsoft Sentinel in the Azure portal will be redirected to the Defender portal and will use Microsoft Sentinel in the Defender portal only. If you're still using Microsoft Sentinel in the Azure portal, we recommend that you start planning your transition to the Defender portal to ensure a smooth transition and take full ",2026-04-22T17:56:00.000Z,concept-article,architecture-patterns,0.65,True,Guidance on extending Sentinel across multiple workspaces/tenants is architecture-specific and unique to the product’s cross-tenant model.,updated +https://learn.microsoft.com/en-us/azure/sentinel/prerequisites,Prerequisites,Prerequisites for deploying Microsoft Sentinel,Verify prerequisites for Microsoft Sentinel deployment,Learn about prerequisites to deploy Microsoft Sentinel.,"Before deploying Microsoft Sentinel, make sure that your Azure tenant meets the requirements listed in this article. 
This article is part of the Deployment guide for Microsoft Sentinel.",2026-04-22T17:56:00.000Z,conceptual,deployment,0.7,True,"Prerequisites for deploying Sentinel usually include tenant-level requirements, supported regions, and service dependencies that are product-specific deployment constraints.",updated +https://learn.microsoft.com/en-us/azure/sentinel/prioritize-data-connectors,Prioritize data connectors,Prioritize data connectors for Microsoft Sentinel,Prioritize Microsoft Sentinel data connectors by value,Learn how to plan and prioritize which data sources to use for your Microsoft Sentinel deployment.,"In this article, you learn how to plan and prioritize which data sources to use for your Microsoft Sentinel deployment. This article is part of the Deployment guide for Microsoft Sentinel.",2026-04-22T17:56:00.000Z,concept-article,decision-making,0.7,True,Helps plan and prioritize which data sources to onboard; this is Sentinel-specific decision guidance on connector selection and coverage vs. cost trade-offs.,updated +https://learn.microsoft.com/en-us/azure/sentinel/publish-sentinel-solutions,Publish solutions,Publish security information and event management (SIEM) solutions to Microsoft Sentinel,,This article guides you through the process of publishing solutions to Microsoft Sentinel.,"Microsoft’s commercial marketplace is an online marketplace for applications and services that lets businesses of all sizes offer solutions to customers around the world. As an independent software vendor (ISV) member of the Partner Program, you can create, publish, and manage your Microsoft Sentinel SIEM solutions in Partner Center. Your solutions are listed together with other Microsoft solutions, connecting you to businesses, organizations, and government agencies around the world. 
Microsoft S",2026-04-22T17:56:00.000Z,how-to,,0.4,False,"Publishing solutions to marketplace; mainly process and portal steps, not technical configuration matrices or limits.",updated +https://learn.microsoft.com/en-us/azure/sentinel/purview-solution,Integrate Microsoft Purview,Integrate Microsoft Sentinel and Microsoft Purview,Integrate Microsoft Purview insights and logs with Sentinel,"This article describes how to use the Microsoft Sentinel data connector and solution for Microsoft Purview to enable data sensitivity insights, create rules to monitor when classifications have been d","Microsoft Purview provides organizations with visibility into where sensitive information is stored, helping prioritize at-risk data for protection. Integrate Microsoft Purview with Microsoft Sentinel to help narrow down the high volume of incidents and threats surfaced in Microsoft Sentinel, and understand the most critical areas to start. Start by ingesting your Microsoft Purview logs into Microsoft Sentinel through a data connector. Then use a Microsoft Sentinel workbook to view data such as a",2026-04-22T17:56:00.000Z,how-to,integrations,0.7,True,Describes using a data connector and solution for Purview; includes connector configuration and workbook usage—product-specific integration pattern.,updated +https://learn.microsoft.com/en-us/azure/sentinel/quickstart-onboard,Onboard to Microsoft Sentinel,Onboard to Microsoft Sentinel,,"In this quickstart, you enable Microsoft Sentinel, and set up data connectors to monitor and protect your environment.","In this quickstart, you'll enable Microsoft Sentinel and install a solution from the content hub. Then, you'll set up a data connector to start ingesting data into Microsoft Sentinel. Microsoft Sentinel comes with many data connectors for Microsoft products such as the Microsoft Defender XDR service-to-service connector. You can also enable built-in connectors for non-Microsoft products such as Syslog or Common Event Format (CEF). 
For this quickstart, you'll use the Azure Activity data connector",2026-04-22T17:56:00.000Z,how-to,,0.4,False,Quickstart onboarding tutorial; mostly step-by-step UI actions without deep configuration matrices or limits.,updated +https://learn.microsoft.com/en-us/azure/sentinel/relate-alerts-to-incidents,Relate alerts to incidents,Relate alerts to incidents in Microsoft Sentinel in the Azure portal,,This article shows you how to relate alerts to your incidents in Microsoft Sentinel in the Azure portal.,"This article shows you how to relate alerts to your incidents in Microsoft Sentinel. This feature allows you to manually or automatically add alerts to, or remove them from, existing incidents in the Azure portal as part of your investigation processes, refining the incident scope as the investigation unfolds. Important Incident expansion is currently in PREVIEW. The Azure Preview Supplemental Terms include additional legal terms that apply to Azure features that are in beta, preview, or otherwise",2026-04-22T17:56:00.000Z,how-to,,0.35,False,"Explains relating alerts to incidents; preview note but no numeric limits, config tables, or troubleshooting mappings in the summary.",updated +https://learn.microsoft.com/en-us/azure/sentinel/resource-context-rbac,Manage workspace access with resource-context RBAC,Manage access to Microsoft Sentinel data by resource,Configure resource-context RBAC for Sentinel data access,"This article explains how you can manage access to Microsoft Sentinel data by the resources a user can access. Managing access by resource enables you to provide access to specific data only, without the ","Access to a workspace is managed by using Azure RBAC. Typically, users who have access to a Log Analytics workspace enabled for Microsoft Sentinel also have access to all the workspace data, including security content. 
Administrators can use Azure roles to configure access to specific features in Microsoft Sentinel, depending on the access requirements in their team. However, you may have some users who need to access only specific data in your workspace, but shouldn't have access to the entire Mi",2026-04-22T17:56:00.000Z,how-to,security,0.8,True,Explains managing access by resource using Azure roles; includes specific RBAC roles and scope patterns unique to Sentinel resource-context RBAC.,updated +https://learn.microsoft.com/en-us/azure/sentinel/respond-threats-during-investigation,Remediate threats while investigating,Respond to threat actors while investigating or threat hunting in Microsoft Sentinel in the Azure portal,,"This article shows you how to take response actions against threat actors on the spot, during the course of an incident investigation or threat hunt, without pivoting or context switching out of the i","This article shows you how to take response actions against threat actors on the spot, during the course of an incident investigation or threat hunt, without pivoting or context switching out of the investigation or hunt. You accomplish this using playbooks based on the new entity trigger. The entity trigger currently supports the following entity types: Important The entity trigger is currently in PREVIEW. See the Supplemental Terms of Use for Microsoft Azure Previews for additional legal terms that",2026-04-22T17:56:00.000Z,how-to,,0.4,False,Explains entity-trigger-based playbooks conceptually; preview note but no detailed parameter tables or numeric constraints in the summary.,updated +https://learn.microsoft.com/en-us/azure/sentinel/restore,Restore historical data,Restore archived logs from search - Microsoft Sentinel,,Learn how to restore archived logs from search job results.,"Restore data from an archived log to use in high performing queries and analytics. 
Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Microsoft Sentinel in the Azure portal will be redirected to the Defender portal and will use Microsoft Sentinel in the Defender portal only. If you're still using Microsoft Sentinel in the Azure portal, we recommend that you start plannin",2026-04-22T17:56:00.000Z,how-to,,0.4,False,"Describes restoring archived logs; summary lacks explicit numeric limits, configuration parameter tables, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/sentinel/roles,Plan roles and permissions,Roles and permissions in the Microsoft Sentinel platform,Assign Microsoft Sentinel RBAC roles and permissions,"Learn how Microsoft Sentinel assigns permissions to users using both Azure and Microsoft Entra ID role-based access control, and identify the allowed actions for each role.","This article explains how Microsoft Sentinel assigns permissions to user roles for both Microsoft Sentinel SIEM and Microsoft Sentinel data lake, identifying the allowed actions for each role. Microsoft Sentinel uses Azure role-based access control (Azure RBAC) to provide built-in and custom roles for Microsoft Sentinel SIEM, and Microsoft Entra ID role-based access control (Microsoft Entra ID RBAC) to provide built-in and custom roles for Microsoft Sentinel data lake. 
You can assign roles to users,",2026-04-22T17:56:00.000Z,concept-article,security,0.9,True,"Explicitly about roles and permissions; will list Sentinel-specific RBAC roles and allowed actions, which are product-specific security configuration details.",updated +https://learn.microsoft.com/en-us/azure/sentinel/sample-workspace-designs,Review sample workspace designs,Sample Microsoft Sentinel workspace designs,Choose Microsoft Sentinel workspace architecture patterns,"Learn from samples of Microsoft Sentinel architecture designs with multiple tenants, clouds or regions.","This article describes suggested Log Analytics workspace designs for organizations with the following sample requirements: For more information, see Design a Log Analytics workspace architecture. This article is part of the Deployment guide for Microsoft Sentinel.",2026-04-22T17:56:00.000Z,best-practice,architecture-patterns,0.7,True,Sample workspace designs for multi-tenant/multi-region Sentinel are product-specific architecture patterns and trade-offs for workspace topology.,updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/collect-sap-hana-audit-logs,Collect SAP HANA audit logs,Collect SAP HANA audit logs in Microsoft Sentinel,Configure SAP HANA audit log collection in Sentinel,This article explains how to collect audit logs from your SAP HANA database.,"This article explains how to collect audit logs from your SAP HANA database in customer managed environments. Content in this article is intended for your security, infrastructure, and SAP BASIS teams. Important Microsoft Sentinel SAP HANA support is currently in PREVIEW. The Azure Preview Supplemental Terms include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability. 
Important For environments managed by SAP such as SA",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,Explains how to collect SAP HANA audit logs; likely includes HANA-specific settings and Sentinel connector configuration parameters that are product-specific.,updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/cross-workspace,Integrate SAP across multiple workspaces,Integrate SAP across multiple workspaces,Architect multi-workspace Sentinel deployments for SAP,Learn how to work with the Microsoft Sentinel solution for SAP applications in multiple workspaces for different deployment scenarios.,"When you set up your Log Analytics workspace enabled for Microsoft Sentinel, you have multiple architecture options and factors to consider. Taking into account geography, regulation, access control, and other factors, you might choose to have multiple workspaces in your organization. When working with SAP, your SAP and SOC teams might need to work in separate workspaces to maintain security boundaries. You might not want the SAP team to have visibility into all other security logs across your org",2026-04-22T17:56:00.000Z,concept-article,architecture-patterns,0.7,True,Discusses multiple architecture options and factors for SAP across workspaces; this is architecture/partitioning guidance specific to Sentinel SAP scenarios.,updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-command-line,Deploy the agent from the command line,Connect your SAP system by deploying your data connector agent container from the command line,Deploy SAP data connector agent via CLI,This article describes how to connect your SAP system to Microsoft Sentinel by deploying the container that hosts the SAP data connector agent using the command line.,"This article provides command line options for deploying an SAP data connector agent. 
For typical deployments we recommend that you use the portal instead of the command line, as data connector agents installed via the command line can be managed only via the command line. However, if you're using a configuration file to store your credentials instead of Azure Key Vault, or if you're an advanced user who wants to deploy the data connector manually, such as in a Kubernetes cluster, use the procedur",2026-04-22T17:56:00.000Z,how-to,deployment,0.8,True,"Command-line deployment article will include concrete commands and parameters specific to Sentinel’s SAP connector, which are product-specific deployment details.",updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-data-connector-agent-container,Connect your SAP system,Connect your SAP system to Microsoft Sentinel,Deploy containerized SAP data connector to Sentinel,This article describes how to connect your SAP system to Microsoft Sentinel by deploying the container that hosts the SAP data connector agent.,"For the Microsoft Sentinel solution for SAP applications to operate correctly, you must first get your SAP data into Microsoft Sentinel. Do this by either deploying the Microsoft Sentinel SAP data connector agent, or by connecting the Microsoft Sentinel agentless data connector for SAP. Select the option at the top of the page that matches your environment. This article describes the third step in deploying one of the Microsoft Sentinel solutions for SAP applications. 
Important The data connecto",2026-04-22T17:56:00.000Z,how-to,deployment,0.7,True,Describes how to deploy the SAP data connector agent container; such deployment docs typically include Sentinel/SAP-specific deployment parameters and constraints beyond generic container deployment.,updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-sap-btp-solution,Deploy SAP BTP,Deploy Microsoft Sentinel solution for SAP BTP,Deploy Sentinel solution for SAP BTP,Learn how to deploy the Microsoft Sentinel solution for SAP Business Technology Platform (BTP) system.,"This article describes how to deploy the Microsoft Sentinel solution for SAP Business Technology Platform (BTP) system. The Microsoft Sentinel solution for SAP BTP monitors and protects your SAP BTP system. It collects audit logs and activity logs from the BTP infrastructure and BTP-based apps, and then detects threats, suspicious activities, illegitimate activities, and more. Read more about the solution. Important An architectural shift in the data connector v3.0.11 to cater for delayed SAP BTP",2026-04-22T17:56:00.000Z,how-to,deployment,0.7,True,"Deployment article for SAP BTP solution, including mention of connector version behavior; contains product-specific deployment steps and constraints.",updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-sap-security-content,Install the solution for SAP applications,Install a Microsoft Sentinel solution for SAP applications,Install Microsoft Sentinel solution for SAP applications,Learn how to install a Microsoft Sentinel solution for SAP applications from the content hub to your Log Analytics workspace enabled for Microsoft Sentinel.,"The Microsoft Sentinel solutions for SAP applications include an SAP data connector, which collects logs from your SAP systems and sends them to your Microsoft Sentinel workspace, and out-of-the-box security content, which helps you gain insight into your organization's SAP environment and detect and respond to 
security threats. Installing your solution is a required step before you can configure your data connector. Important The data connector agent for SAP is being deprecated and will be perman",2026-04-22T17:56:00.000Z,how-to,deployment,0.7,True,Covers installing SAP solution from content hub and notes connector deprecation; product-specific deployment and lifecycle details.,updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/deployment-overview,Deployment overview,Deploy the Microsoft Sentinel solution for SAP applications,Understand deployment process for Sentinel SAP solution,Get an introduction to the process of deploying the Microsoft Sentinel solution for SAP applications.,"Use the Microsoft Sentinel solution for SAP applications to monitor your SAP systems with Microsoft Sentinel, detecting sophisticated threats throughout the business logic and application layers of your SAP applications. This article introduces you to the Microsoft Sentinel solution for SAP applications deployment.",2026-04-22T17:56:00.000Z,overview,deployment,0.65,True,Introduces deployment of Sentinel solution for SAP; includes product/integration-specific deployment steps and constraints.,updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/deployment-solution-configuration,Enable SAP detections and threat protection,Enable SAP detections and threat protection with Microsoft Sentinel,Configure Sentinel SAP detections and threat protection,This article shows you how to configure initial security content for the Microsoft Sentinel solution for SAP applications in order to start enabling SAP detections and threat protection.,"While deploying a Microsoft Sentinel data collector and solution for SAP provides you with the ability to monitor SAP systems for suspicious activities and identify threats, extra configuration steps are required to ensure the solution is optimized for your SAP deployment. 
This article provides best practices for getting started with the security content delivered with the Microsoft Sentinel solution for SAP applications, and is the last step in deploying the SAP integration. Important The data ",2026-04-22T17:56:00.000Z,how-to,best-practices,0.7,True,Described as best practices for configuring initial security content; likely includes concrete recommendations and product-specific gotchas for SAP detections.,updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/preparing-sap,Prepare your SAP environment,Configure your SAP system for the Microsoft Sentinel solution - Microsoft Sentinel,Prepare SAP systems for Sentinel SAP connector,Learn about extra preparations required in your SAP system to install the SAP data connector agent and connect Microsoft Sentinel to your SAP system.,"This article describes how to prepare your SAP environment for connecting to the SAP data connector. Preparation differs, depending on whether you're using the containerized data connector agent. Select the option at the top of the page that matches your environment. This article is part of the second step in deploying the Microsoft Sentinel solution for SAP applications. Important The data connector agent for SAP is being deprecated and will be permanently disabled by September 14th 2026. 
We recom",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,"Preparation steps for SAP to work with the Sentinel SAP connector will include SAP-side configuration objects, parameters, and roles specific to this integration, which are product-specific configuration details.",updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/prerequisites-for-deploying-sap-continuous-threat-monitoring,Deployment prerequisites,Prerequisites for deploying Microsoft Sentinel solution for SAP applications,Verify prerequisites for deploying Sentinel SAP monitoring,This article lists the prerequisites required for deployment of the Microsoft Sentinel solution for SAP applications.,"This article lists the prerequisites required for deployment of the Microsoft Sentinel solution for SAP applications, which differ depending on whether you're deploying a data connector agent or using the agentless data connector with the SAP Cloud Connector. Select the option at the top of this page that matches your deployment. Reviewing and ensuring that you have or understand all the prerequisites is the first step in deploying the Microsoft Sentinel solution for SAP applications. Select a c",2026-04-22T17:56:00.000Z,reference,deployment,0.7,True,"Lists prerequisites for SAP connector/agent vs agentless; includes environment, connector, and platform requirements unique to this deployment.",updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-kickstart,Data connector agent kickstart script reference,Kickstart deployment script reference for the Microsoft Sentinel for SAP applications data connector agent,Kickstart script parameters for SAP connector deployment,Description of command line options available with kickstart deployment script,"This article provides a reference of the configurable parameters available in the kickstart script used to deploy the Microsoft Sentinel for SAP applications data connector agent. 
For more information, see Connect your SAP system to Microsoft Sentinel. Content in this article is intended for your SAP BASIS teams.",2026-04-22T17:56:00.000Z,reference,configuration,0.9,True,Reference for kickstart deployment script parameters is a configuration reference with specific parameter names and behaviors.,updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-systemconfig,Data connector agent systemconfig.ini file reference (legacy),Microsoft Sentinel solution for SAP applications data connector agent systemconfig.ini file reference,Legacy systemconfig.ini reference for Sentinel SAP agent,Learn about the settings available in Microsoft Sentinel for SAP applications data connector agent systemconfig.ini file.,"The systemconfig.ini file is the legacy file used to configure the behavior of the Microsoft Sentinel for SAP applications data connector agent in versions earlier than June 22, 2023. This article describes the options available in each section of the configuration file. Content in this article is intended for your SAP BASIS teams. This article is not relevant if you've used the recommended deployment procedure from the portal. 
If you've installed a newer version of the agent from the command line, us",2026-04-22T17:56:00.000Z,reference,configuration,0.95,True,Legacy configuration file reference for systemconfig.ini with sections and options is clearly configuration-focused expert knowledge.,updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-systemconfig-json,Data connector agent systemconfig.json file reference,Microsoft Sentinel solution for SAP applications data connector agent systemconfig.json file reference,systemconfig.json reference for Sentinel SAP agent,Learn about the settings available in Microsoft Sentinel for SAP applications data connector agent systemconfig.json file.,"Thesystemconfig.jsonfile is used to configure the behavior of the Microsoft Sentinel for SAP applications data connector agent whendeployed from the command line. This article describes the options available in each section of the configuration file. Content in this article is intended for yourSAP BASISteams, and is only relevant when your data connector agent is deployed from the command line. We recommenddeploying your data connector agent from the portalinstead. Important Microsoft Sentinel s",2026-04-22T17:56:00.000Z,reference,configuration,0.95,True,"Explicit configuration file reference listing settings and their meanings for systemconfig.json, matching configuration criteria.",updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-update,Data connector agent update script reference,Microsoft Sentinel solution for SAP applications data connector agent update file reference,Update script parameter reference for SAP connector,Description of command line options available with update deployment script,"The Microsoft Sentinel SAP data connector agent container users anupdate scriptto simplify the update process. This article describes the configurable parameters available in the update script. 
For more information, see Update the Microsoft Sentinel for SAP applications data connector agent. Content in this article is intended for your SAP BASIS teams.",2026-04-22T17:56:00.000Z,reference,configuration,0.9,True,Describes configurable parameters of the update script; this is a configuration reference with product-specific options.,updated
We recommend that you migrate to the agentless data connector. Learn more about the agentless approach from our blog post.",2026-04-22T17:56:00.000Z,article,decision-making,0.65,True,Migration guide between containerized and agentless SAP connectors will include product-specific migration steps and considerations that help decide and execute the transition.,updated
Use the workbook either for ongoing monitoring of your SAP systems, or to review the systems following a security incident or other suspicious activi",2026-04-22T17:56:00.000Z,reference,best-practices,0.6,True,"Workbook usage article will include specific queries/visuals and how to interpret them for SAP security monitoring, which is product-specific operational guidance.",updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-btp-security-content,Security content reference,Microsoft Sentinel Solution for SAP BTP - security content reference,Security content reference for Sentinel SAP BTP,Learn about the built-in security content provided by the Microsoft Sentinel Solution for SAP BTP.,"This article details the security content available for the Microsoft Sentinel Solution for SAP BTP. Available security content currently includes a built-in workbook and analytics rules. You can also add SAP-relatedwatchliststo use in your search, detection rules, threat hunting, and response playbooks. Learn more about the solution.",2026-04-22T17:56:00.000Z,reference,security,0.6,True,"Details built-in security content (workbook, analytics rules) for SAP BTP; this is product-specific security detection content.",updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-btp-solution-overview,Overview,Microsoft Sentinel Solution for SAP BTP overview,,This article introduces the Microsoft Sentinel Solution for SAP BTP.,"SAP BTP is a cloud-based solution that provides a wide range of tools and services for developers to build, run, and manage applications. One of the key features of SAP BTP is its low-code development capabilities. Low-code development allows developers to create applications quickly and efficiently by using visual drag-and-drop interfaces and prebuilt components, rather than writing code from scratch. 
The Microsoft Sentinel solution for SAP BTP monitors and protects your SAP Business Technology",2026-04-22T17:56:00.000Z,concept-article,,0.3,False,Overview of Sentinel solution for SAP BTP; summary suggests high-level description without detailed configuration tables or error mappings.,updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-deploy-troubleshoot,Troubleshoot SAP data connector,Troubleshoot the Microsoft Sentinel solution for SAP applications data connector agent,Troubleshoot Sentinel SAP data connector agent,Learn how to troubleshoot specific issues that might occur in your Microsoft Sentinel solution for SAP applications data connector agent deployment.,"This article includes troubleshooting steps to help you ensure accurate and timely data ingestion and monitoring for your SAP environment with Microsoft Sentinel. When working with the agentless data connector, most troubleshooting is done directly in the SAP Integration Suite, where the message log displays errors indicating the nature of the issue encountered. Important The data connector agent for SAP is beingdeprecatedand will be permanently disabled bySeptember 14th 2026. 
We recommend that ",2026-04-22T17:56:00.000Z,troubleshooting,troubleshooting,0.9,True,"Explicit troubleshooting article for the SAP data connector agent; will contain specific error messages, causes, and resolutions unique to this connector.",updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-logserv-overview,SAP LogServ,SAP LogServ integration with Microsoft Sentinel Solution for SAP overview,Integrate SAP LogServ with Sentinel SAP solution,"This article introduces the Microsoft Sentinel Solution for SAP integration with SAP LogServ, an SAP-provided add-on that extends monitoring beyond the SAP application layer to infrastructure, databas","TheMicrosoft Sentinel Solution for SAP applicationsprovides powerful application-layer monitoring for SAP systems, tracking user activity, business transactions, and critical events. However, in SAP RISE/ECS environments, infrastructure and operating system logs are owned and managed by SAP, and aren't accessible through the standard SAP application connector. SAP LogServ bridges that gap. 
It's an SAP Enterprise Cloud Services (ECS) service that centralizes logs from all systems, applications, a",2026-04-22T17:56:00.000Z,concept-article,architecture-patterns,0.6,True,Overview of integration with SAP LogServ to extend monitoring to infra/OS logs; likely includes architecture patterns for RISE/ECS environments.,updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-solution-deploy-alternate,Deploy the agent with expert options,Deploy the Microsoft Sentinel for SAP data connector agent container using expert configuration options,Expert deployment options for Sentinel SAP connector,"Learn how to deploy the Microsoft Sentinel for SAP data connector environments using expert configuration options, such as and on-premises machine and custom, manual configurations.","This article provides procedures for deploying and configuring the Microsoft Sentinel for SAP data connector agent container with expert, custom, or manual configuration options. For typical deployments we recommend that you use theportalinstead. Content in this article is intended for yourSAP BASISteams. For more information, seeDeploy an SAP data connector agent from the command line. 
Note This article is relevant only for the data connector agent, and isn't relevant for the SAP agentless data ",2026-04-22T17:56:00.000Z,how-to,deployment,0.8,True,"Covers expert/custom/manual deployment of the SAP connector container, likely including advanced configuration and environment-specific deployment constraints.",updated
Important Noted features are currently in PREVIEW. The Azure Preview Supplemental Terms include addi",2026-04-22T17:56:00.000Z,reference,integrations,0.7,True,"Reference for logs and tables exposed by the SAP connector, including which are default vs optional, is detailed schema/integration knowledge.",updated
You can also add SAP-related watchlists to use in your search, detection rules, th",2026-04-22T17:56:00.000Z,reference,security,0.65,True,"Reference for built-in security content (rules, workbooks, watchlists) is security-focused and product-specific, detailing what detections exist and how they behave.",updated
An SAP system breach could result in stolen files, exposed data, or a disrupted supply chain. Once an attacker is in the system, there are few controls to detect exfiltration or other bad acts. SAP activity needs to be correlated with other data across the organization for effective threat detection. To he",2026-04-22T17:56:00.000Z,overview,,0.4,False,Overview of Sentinel solution for SAP applications; primarily conceptual and support overview without detailed configs or limits.,updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/solution-partner-overview,Partner solutions,Microsoft Sentinel solutions for SAP - Partner Add-ons,,"Discover partners specializing in Microsoft Sentinel for SAP integration solutions, consulting, and managed services.","Microsoft Sentinel provides a flexible platform that enables SAP and Microsoft partners to deliver integrated security solutions through the Microsoft Sentinel Content Hub. Add-ons enable further correlational capabilities for the Microsoft Sentinel Solution for SAP applications. SAP signals are correlated with signals from other Microsoft and third-party solutions, enabling comprehensive threat detection and response across the entire IT landscape using the Microsoft unified Security Operations",2026-04-22T17:56:00.000Z,partner-tools,,0.2,False,Partner add-ons overview is primarily ecosystem/marketing content without detailed technical configuration or troubleshooting specifics.,updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/stop-collection,Stop SAP data collection,Stop SAP data collection - Microsoft Sentinel,Stop SAP data collection in Microsoft Sentinel,Learn about how to stop Microsoft Sentinel from collecting data from your SAP applications.,"There might be instances where you need to halt the data collection from your SAP applications by the Microsoft Sentinel data connector agent, whether for maintenance, troubleshooting, or other administrative reasons. 
This article provides step-by-step instructions on how to stop the ingestion of SAP logs into Microsoft Sentinel and disable the data connector agent. If you're using the agentless data connector, remove the data connector and solution from Microsoft Sentinel, and then clean up any",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,Provides specific steps to disable the SAP data connector and stop ingestion; these are product-specific configuration/operational steps.,updated +https://learn.microsoft.com/en-us/azure/sentinel/sap/update-sap-data-connector,Update the data connector agent,Update the Microsoft Sentinel for SAP applications data connector agent,Update Sentinel SAP data connector agent safely,This article shows you how to update an already existing SAP data connector to its latest version.,"This article shows you how to update an already existing Microsoft Sentinel for SAP data connector to its latest version so that you can use the latest features and improvements. During the data connector agent update process, there might be a brief downtime of approximately 10 seconds. To ensure data integrity, a database entry stores the timestamp of the last fetched log. After the update is complete, the data fetching process resumes from the last log fetched, preventing duplicates and ensuri",2026-04-22T17:56:00.000Z,how-to,deployment,0.7,True,"Update procedure for the SAP data connector agent, including mention of ~10-second downtime and resume behavior, is deployment-specific operational knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/scheduled-rules-overview,Overview,Scheduled analytics rules in Microsoft Sentinel,Configure scheduled analytics rules in Sentinel,Understand how scheduled analytics rules work in Microsoft Sentinel. 
Learn about all the configuration options for this rule type.,"By far the most common type of analytics rule, Scheduled rules are based on Kusto queries that are configured to run at regular intervals and examine raw data from a defined ""lookback"" period. Queries can perform complex statistical operations on their target data, revealing baselines and outliers in groups of events. If the number of results captured by the query passes the threshold configured in the rule, the rule produces an alert. This article helps you understand how scheduled analytics rules ",2026-04-22T17:56:00.000Z,overview,configuration,0.7,True,"Article is explicitly about all configuration options for scheduled rules; these pages typically include rule parameters (lookback period, frequency, thresholds, suppression settings) and allowed ranges, which are product-specific configuration details.",updated
Scoping is configured in the Microsoft Defender portal.",2026-04-22T17:56:00.000Z,how-to,security,0.85,True,"Row-level RBAC configuration; likely documents specific scope definitions, role interactions, and permission behaviors unique to Sentinel.",updated +https://learn.microsoft.com/en-us/azure/sentinel/search-jobs,Search large datasets,Search for specific events across large datasets in Microsoft Sentinel,Run Sentinel search jobs with query timeout limits,Learn how to use search jobs to search large datasets.,"Use a search job to retrieve data stored inlong-term retention, or to scan through large volumes of data, if the log query time-out of 10 minutes isn't sufficient. A search job scans through up to a year of data in a table for specific events. The search job sends its results to a new Analytics table in the same workspace as the source data. This article explains how to run a search job in Microsoft Sentinel and how to work with the search job results. Search jobs across certain data sets might ",2026-04-22T17:56:00.000Z,how-to,limits-quotas,0.8,True,"Explicitly states that search jobs are used when the log query timeout of 10 minutes isn't sufficient and that they scan up to a year of data, which are concrete product-specific limits.",updated +https://learn.microsoft.com/en-us/azure/sentinel/security-alert-schema,Security alert schema reference,Microsoft Sentinel security alert schema reference,Use Microsoft Sentinel security alert schema fields,This article displays the schema of security alerts in Microsoft Sentinel.,"Microsoft Sentinelanalytics rulescreate incidents as the result ofsecurity alerts. Security alerts can come from different sources, and accordingly use different kinds of analytics rules to create incidents: Scheduledanalytics rules generate alerts as the result of their regular queries of data in logs ingested from external sources, and those same rules create incidents from those alerts. 
(For the purposes of this document, ""scheduled"" rule alerts include NRT rule alerts.) Microsoft Security anal",2026-04-22T17:56:00.000Z,reference,configuration,0.84,True,"Schema reference for security alerts with detailed field definitions and mappings, which are product-specific configuration details.",updated
These rules help identify suspicious behavior, anomalies, and potential security threats by analyzing logs and signals from various data sources. Microsoft Sentinel analytics rules are powerful tools for enhancing an organization's security posture because they proactively detect and respond to potential threats. By following a ",2026-04-22T17:56:00.000Z,how-to,,0.45,False,"Covers creating analytics rules for solutions; mostly rule authoring workflow and concepts, not low-level configuration tables or limits.",updated +https://learn.microsoft.com/en-us/azure/sentinel/sentinel-content-centralize,OOTB content centralization changes,Out-of-the-box (OOTB) content centralization changes - Microsoft Sentinel,,This article describes upcoming centralization changes for out-of-the-box content in Microsoft Sentinel.,"The Microsoft Sentinel content hub enables discovery and on-demand installation of out-of-the-box (OOTB) content and solutions in a single step. Previously, some of this OOTB content existed only in various gallery sections of Microsoft Sentinel. Now,allof the following gallery content templates are available in the content hub as standalone items or as part of packaged solutions:",2026-04-22T17:56:00.000Z,whats-new,,0.35,False,"Describes centralization changes for out-of-the-box content; mostly informational about where content lives, not detailed configs or limits.",updated +https://learn.microsoft.com/en-us/azure/sentinel/sentinel-hunting-rules-creation,Creating hunting queries,Create Hunting Queries for Microsoft Sentinel Solutions,,This article guides you through the process of creating and publishing hunting queries to Microsoft Sentinel solutions.,"Hunting queries are at the heart of the threat-hunting process in Microsoft Sentinel. Security analysts use advanced, customizable queries in Kusto Query Language (KQL) to sift through large volumes of data. 
These queries allow analysts to identify potential security threats, investigate suspicious activities, and gain visibility into their network and endpoints. Analysts use Microsoft Sentinel to proactively search for threats that bypass existing security defenses. This proactive approach help",2026-04-22T17:56:00.000Z,how-to,,0.45,False,Hunting query creation guidance; primarily KQL and conceptual hunting patterns rather than product-specific configuration or error mappings.,updated +https://learn.microsoft.com/en-us/azure/sentinel/sentinel-integration-guide,Overview,Guide to build and publish Microsoft Sentinel solutions,,This article walks you through the entire lifecycle of how to build and publish solutions to Microsoft Sentinel.,"Microsoft Sentinel SIEM and platform includes a range of capabilities that partners can use to create impactful solutions they can publish through the Microsoft Security Store or the Sentinel SIEM Content Hub. By building on top of Sentinel, partners can enable new scenarios that use a wide breadth of security data, processing capabilities, and AI experiences, without needing new pipelines, processing capabilities or storage infrastructure. For example, you can create a connector to bring new da",2026-04-22T17:56:00.000Z,how-to,,0.3,False,"Lifecycle/overview for building and publishing solutions; primarily conceptual and process guidance, not detailed configs or limits.",updated +https://learn.microsoft.com/en-us/azure/sentinel/sentinel-overview,What is Microsoft Sentinel?,What is Microsoft Sentinel?,,"Learn about Microsoft Sentinel, an AI-first, cloud-native security information and event management (SIEM) and security platform that consolidates and analyzes security data at scale, empowers securit","Microsoft Sentinel is a cloud-native Security Information and Event Management (SIEM) and unified security platform for agentic defense. 
To meet the demands of today’s complex threats, Microsoft Sentinel has evolved from a traditional SIEM to a SIEM and platform - extending beyond static, rule-based controls and post-breach response to provide an AI-ready, data-first foundation that transforms telemetry into a security graph, standardizes access for agents, and coordinates autonomous actions, wh",2026-04-22T17:56:00.000Z,overview,,0.2,False,"High-level product overview of Microsoft Sentinel as a SIEM/platform; no detailed limits, configs, roles, or troubleshooting content.",updated +https://learn.microsoft.com/en-us/azure/sentinel/sentinel-playbook-creation,Creating playbooks,Create Playbooks for Microsoft Sentinel Solutions,,This article guides you through the process of creating and publishing playbooks for Microsoft Sentinel solutions.,"Playbooks in Microsoft Sentinel are sets of procedures that can respond to incidents, alerts, or specific entities. They help automate responses and can be set to run automatically when certain alerts or incidents occur. Playbooks can also be run manually. This article uses example scenarios to walk you through the process of creating and publishing playbooks for Microsoft Sentinel solutions.",2026-04-22T17:56:00.000Z,how-to,,0.45,False,Playbook creation with example scenarios; likely focuses on Logic Apps workflows conceptually rather than Sentinel-specific config tables or limits.,updated +https://learn.microsoft.com/en-us/azure/sentinel/sentinel-security-copilot,Overview,Security Copilot with Microsoft Sentinel,,"Learn about Microsoft Sentinel capabilities in Security Copilot. Understand the best prompts to use and how to get timely, accurate results for natural language to KQL.","Microsoft Security Copilot is a platform that helps you defend your organization at machine speed and scale. Microsoft Sentinel's vast security data provides an excellent source for Copilot to help analyze incidents and generate hunting queries. 
Together with other Security Copilot sources you enable, your Microsoft Sentinel incidents and data provide wider visibility into threats and their context for your organization.",2026-04-22T17:56:00.000Z,concept-article,,0.3,False,"Overview of Security Copilot with Sentinel; summary does not expose concrete configuration parameters, limits, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/sentinel/sentinel-security-copilot-incident-summary,Summarize incidents in Azure portal,Summarize Microsoft Sentinel incidents with Security Copilot,,Learn about Microsoft Sentinel's incident summarization capabilities in Security Copilot.,"Microsoft Sentinel applies the capabilities ofSecurity Copilotin the Azure portal to create enriched summaries of incidents, providing a comprehensive overview of security incidents by consolidating information from multiple alerts. This feature enhances incident response efficiency by offering a clear summary that helps your security operations teams quickly understand the scope and impact of an incident. It provides a structured overview, including timelines, assets involved, and indicators of",2026-04-22T17:56:00.000Z,how-to,,0.35,False,"Describes incident summarization capability conceptually; no explicit configuration tables, limits, or error mappings in the summary.",updated +https://learn.microsoft.com/en-us/azure/sentinel/sentinel-service-limits,Microsoft Sentinel SIEM service limits,Microsoft Sentinel service limits,Reference Microsoft Sentinel service limits and quotas,"This article provides a list of service limits for Microsoft Sentinel, divided into the different service areas.","This article lists the most common service limits you might encounter as you use Microsoft Sentinel. 
For other limits that might impact services or features you use, like Azure Monitor, see Azure subscription and service limits, quotas, and constraints.",2026-04-22T17:56:00.000Z,reference,limits-quotas,0.95,True,"Explicitly a service-limits article; will list concrete numeric limits and quotas for Sentinel features, which are expert numeric constraints not reliably known from training.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/sentinel-soar-content,SOAR content catalog,Microsoft Sentinel SOAR content catalog,,"This article displays and details the content provided by Microsoft Sentinel for security orchestration, automation, and response (SOAR), including playbooks and Logic Apps connectors.","Microsoft Sentinel provides a wide variety of playbooks and connectors for security orchestration, automation, and response (SOAR), so that you can readily integrate Microsoft Sentinel with any product or service in your environment. The integrations listed below may include some or all of the following components: You can find SOAR integrations and their components in the following places: Tip",2026-04-22T17:56:00.000Z,reference,,0.4,False,Catalog/overview of SOAR playbooks and connectors; primarily navigation and listing without deep configuration tables or limits.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solution,Monitor Zero Trust,Monitor Zero Trust (TIC 3.0) security architectures with Microsoft Sentinel,Deploy and use Sentinel Zero Trust (TIC 3.0) solution,"Install and learn how to use the Microsoft Sentinel Zero Trust (TIC 3.0) solution for an automated visualization of Zero Trust principles, cross-walked to the Trusted Internet Connections framework.","Zero Trust is a security strategy for designing and implementing the following sets of security principles: This article describes how to use the Microsoft Sentinel Zero Trust (TIC 3.0) solution, which helps governance and compliance teams monitor and respond to Zero Trust 
requirements according to the TRUSTED INTERNET CONNECTIONS (TIC) 3.0 initiative. Microsoft Sentinel solutions are sets of bundled content, pre-configured for a specific set of data. The Zero Trust (TIC 3.0) solution includes a workboo",2026-04-22T17:56:00.000Z,how-to,configuration,0.65,True,Covers installing and using a specific Sentinel solution with preconfigured content; includes product-specific configuration and usage patterns.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solution-deprecation,Manage solution deprecation lifecycle,Managing end-to-end lifecycle of deprecated solutions in Microsoft Sentinel,Manage deprecated Microsoft Sentinel solutions lifecycle,This article walks you through the process of identifying deprecated solutions in Microsoft Sentinel and managing the lifecycle of these solutions.,The document explains how to manage the lifecycle of out-of-the-box solutions in Microsoft Sentinel that the solution author no longer supports. This document explains how to identify the solutions that are marked for deprecation and what actions to take on those solutions.,2026-04-22T17:56:00.000Z,concept-article,configuration,0.6,True,Explains identifying deprecated solutions and actions to take; Sentinel-specific content lifecycle management guidance.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solution-quality-guidance,Sentinel solution quality guidelines,Sentinel solution quality guidelines,,This article guides you through the process of publishing high quality solutions for Microsoft Sentinel.,This document describes the quality guidelines and recommendations for Microsoft Sentinel solutions.,2026-04-22T17:56:00.000Z,best-practice,,0.4,False,Quality guidelines and recommendations are likely high-level publishing standards without concrete product-specific configs or quantified impacts.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions,Overview,Microsoft Sentinel Content and Solutions 
Overview,,"Discover Microsoft Sentinel content and solutions, including data connectors and analysis tools, to enhance your security operations. Learn more today.","Microsoft Sentinel content includes Security Information and Event Management (SIEM) solution components that help you ingest data, monitor, alert, and respond to security threats. This article explains the types of content and solutions in Microsoft Sentinel and how they help your security operations. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Microsoft Sentine",2026-04-22T17:56:00.000Z,overview,,0.4,False,Content and solutions overview; mostly conceptual description of solution types without deep config tables or decision matrices.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-catalog,All Microsoft Sentinel SIEM solutions,Microsoft Sentinel content hub catalog,Select Microsoft Sentinel solutions from content hub catalog,Learn about domain specific solutions available in the content hub for Microsoft Sentinel and where to find the full list of solutions.,"Solutions in Microsoft Sentinel provide a consolidated way to acquire Microsoft Sentinel content, like data connectors, workbooks, analytics, and automation, in your workspace with a single deployment step. This article helps you find the full list of the solutions available in Microsoft Sentinel. This article also lists the domain-specific out-of-the-box (built-in) and on-demand solutions available for you to deploy in your workspace. 
When you deploy a solution, the security content included wi",2026-04-22T17:56:00.000Z,reference,decision-making,0.65,True,Catalog and domain-specific solution list helps decide which solutions to deploy for particular scenarios; supports solution selection decisions.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-delete,Delete out-of-the-box content,Delete installed Microsoft Sentinel out-of-the-box content and solutions,Remove and restore Microsoft Sentinel solutions and content,Remove solutions and content you deployed in Microsoft Sentinel.,"If you installed a Microsoft Sentinel out-of-the-box solution, you can remove content items from the solution or delete the installed solution. If you later need to restore deleted content items, select Reinstall on the solution. Similarly, you can restore the solution by reinstalling the solution. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Microsoft Sentinel in t",2026-04-22T17:56:00.000Z,how-to,configuration,0.65,True,Details how to delete and reinstall solutions/content; product-specific lifecycle operations and configuration steps.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-deploy,Deploy out-of-the-box content,Discover and deploy Microsoft Sentinel out-of-the-box content from Content hub,Discover and deploy Sentinel solutions from Content hub,"Learn how to find and deploy Sentinel packaged solutions containing data connectors, analytics rules, hunting queries, workbooks, and other content.","The Microsoft Sentinel Content hub is your centralized location to discover and manage out-of-the-box (built-in) content. There you find packaged solutions for end-to-end products by domain or industry. You have access to the vast number of standalone contributions hosted in our GitHub repository and feature blades. 
Discover solutions and standalone content with a consistent set of filtering capabilities based on status, content type, support, provider, and category. Install content in your work",2026-04-22T17:56:00.000Z,how-to,deployment,0.65,True,Describes how to find and deploy packaged solutions; includes Sentinel-specific deployment behavior for content bundles.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-post-publish-tracking,Sentinel SIEM solution lifecycle post publish,Status of Microsoft Sentinel solution after publishing in the Microsoft Partner center,,This article walks you through the details of tracking solutions post publish in Microsoft Partner center.,"This document explains what happens once your offer is successfully published. Within partner center, your solution would be referred to as an offer (the terms solution and offer are used interchangeably in the context of this document). Once you publish the offer, the offer goes through a series of validation checks before it becomes live in Azure Marketplace and in the Sentinel content hub. If you didn't create an offer yet, follow the steps in the Publish solutions to Microsoft Sentinel article",2026-04-22T17:56:00.000Z,concept-article,,0.35,False,"Post-publish tracking in Partner Center; focuses on offer status flow, not deep technical configuration or troubleshooting mappings.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/sentinel-summary-rules-creation,Creating summary rules,Create Summary Rules for Microsoft Sentinel Solutions,,This article guides you through the process of creating and publishing summary rules to Microsoft Sentinel solutions.,"Summary rules in Microsoft Sentinel are scheduled queries that aggregate and transform high-volume data into summarized results stored in a custom log table. 
In essence, a summary rule runs a user-defined KQL (Kusto Query Language) query at a regular interval (for example, every hour or once a day) across a large set of logs, and saves the aggregated output (such as counts, statistics, or filtered records) into a new or existing Log Analytics (LA) table. This mechanism provides concise, precompi",2026-04-22T17:56:00.000Z,how-to,,0.45,False,"Summary rules explanation and scheduling examples; likely conceptual with simple examples, not detailed limits or config matrices.",updated +https://learn.microsoft.com/en-us/azure/sentinel/sentinel-tables-connectors-reference,Sentinel tables and connectors,Microsoft Sentinel tables and associated connectors,,"This article lists the tables ingested into Microsoft Sentinel via data connectors, and the connectors that ingest them.","The following table lists the tables ingested into Microsoft Sentinel via data connectors, and the connectors that ingest them. Select the table name or the connector name for more information.",2026-04-22T17:56:00.000Z,reference,,0.45,False,Lookup table mapping connectors to tables; useful but mostly catalog/navigation without deep configuration or limits.,updated +https://learn.microsoft.com/en-us/azure/sentinel/sentinel-workbook-creation,Creating workbooks,Create Workbooks for Microsoft Sentinel Solutions,,This article guides you through the process of creating and publishing workbooks for Microsoft Sentinel solutions.,"Workbooks are an integral feature of Microsoft Sentinel, a cloud-native security information and event management (SIEM) solution. Workbooks provide users with interactive, customizable dashboards that aggregate and visualize data from various sources. These dashboards enable organizations to gain deeper insights into their security posture and streamline their efforts in threat detection and response. 
By integrating data from various sources and facilitating collaboration among security teams, ",2026-04-22T17:56:00.000Z,how-to,,0.4,False,"Workbook creation and publishing; mostly UI and visualization concepts, not deep configuration references or quotas.",updated +https://learn.microsoft.com/en-us/azure/sentinel/setup-azure-storage-connector,Azure Storage blob connectors,Set up the Azure Storage connector to stream logs to Microsoft Sentinel,Set up Azure Storage Blob connector for Sentinel logs,Learn how to set up the Azure Storage Blob connector to ingest logs from Azure Storage into Microsoft Sentinel using the Codeless Connector Framework.,The Azure Storage Blob connector simplifies collecting logs from Azure Storage. It lets ISVs and users build scalable connectors on top of Azure Storage integrations through the fully managed Codeless Connector Framework (CCF). This article summarizes the connector resources and provides steps to create and validate your first Azure Storage connector.,2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,"Connector setup article; usually contains storage account settings, container paths, and Sentinel connector parameters, which are configuration expert knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/siem-migration,Use the SIEM migration experience,Use the SIEM migration experience - Microsoft Sentinel,Use Sentinel SIEM migration tool for detection mapping,Migrate security monitoring use cases from other Security Information and Event Management (SIEM) systems to Microsoft Sentinel.,"The SIEM migration tool analyzes Splunk and QRadar detections, including custom detections, and recommends best‑fit Microsoft Sentinel detections rules. It also provides recommendations for data connectors, both Microsoft and third-party connectors available in Content Hub to enable the recommended detections. Customers can track the migration by assigning the right status to each recommendation card. 
Note The old migration tool is deprecated. This article describes the current SIEM migration ex",2026-04-22T17:56:00.000Z,how-to,decision-making,0.7,True,"Describes a migration tool that analyzes Splunk/QRadar detections and recommends Sentinel rules and connectors; provides concrete guidance on mapping and choosing equivalents, which is migration decision support.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/skill-up-resources,Microsoft Sentinel skill-up training,Microsoft Sentinel skill-up training,,"This article walks you through a level 400 training to help you skill up on Microsoft Sentinel. The training comprises 21 modules that present relevant product documentation, blog posts, and other res","This article walks you through a level 400 training to help you skill up on Microsoft Sentinel. The training comprises 21 self-paced modules that present relevant product documentation, blog posts, and other resources. The modules listed here are split into five parts following the life cycle of a Security Operation Center (SOC): Part 1: Overview Part 2: Architecting and deploying Part 3: Creating content Part 4: Operating Part 5: Advanced",2026-04-22T17:56:00.000Z,tutorial,,0.3,False,Training roadmap aggregating links to other resources; meta-navigation content rather than detailed technical reference.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/soc-ml-anomalies,Overview,Use customizable anomalies to detect threats in Microsoft Sentinel,,This article explains how to use the new customizable anomaly detection capabilities in Microsoft Sentinel.,"Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM and Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. 
For more information, read this blog.",2026-04-22T17:56:00.000Z,feature-guide,,0.35,False,"High-level explanation of customizable anomalies; summary doesn’t indicate detailed parameter tables, numeric thresholds, or troubleshooting mappings.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/soc-optimization/soc-optimization-access,Optimize your security operations,Optimize security operations,Apply Sentinel SOC optimization recommendations,Use Microsoft Sentinel SOC optimization recommendations to optimize your security operations center (SOC) team activities.,"Security operations center (SOC) teams look for ways to improve processes and outcomes and ensure you have the data needed to address risks without extra ingestion costs. SOC teams want to make sure that you have all the necessary data to act against risks, without paying for more data than needed. At the same time, SOC teams must also adjust security controls as threats and business priorities change, doing so quickly and efficiently to maximize your return on investment. SOC optimizations are ac",2026-04-22T17:56:00.000Z,how-to,best-practices,0.6,True,SOC optimization recommendations are product-specific guidance on tuning data and detections; likely includes concrete Sentinel-centric actions and gotchas.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/soc-optimization/soc-optimization-api,Use SOC optimizations programmatically,Use SOC optimizations programmatically,Use Sentinel SOC optimization recommendations API programmatically,Learn how to use Microsoft Sentinel SOC optimization recommendations programmatically.,"Use the Microsoft Sentinel recommendations API to programmatically interact with SOC optimization recommendations, helping you to close coverage gaps against specific threats and tighten ingestion rates. 
You can get details about all current recommendations across your workspaces or a specific SOC optimization recommendation, or you can reevaluate a recommendation if you've made changes in your environment. For example, use the recommendations API to: For customers or MSSPs managing multiple environ",2026-04-22T17:56:00.000Z,concept-article,integrations,0.7,True,"Describes the recommendations API usage; likely includes endpoint paths, parameters, and request/response patterns unique to Sentinel.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/soc-optimization/soc-optimization-reference,SOC optimization reference,SOC optimization reference,Reference Sentinel SOC optimization recommendation types,Learn about the Microsoft Sentinel SOC optimization recommendations available to help you optimize your security operations.,"Use SOC optimization recommendations to help you close coverage gaps against specific threats and tighten your ingestion rates against data that doesn't provide security value. SOC optimizations help you optimize your Microsoft Sentinel workspace, without having your SOC teams spend time on manual analysis and research. Microsoft Sentinel SOC optimizations include the following types of recommendations: Data value recommendations suggest ways to improve your data use, such as a better data plan f",2026-04-22T17:56:00.000Z,reference,decision-making,0.65,True,"Catalog of recommendation types (data value, coverage, etc.) with guidance on when to apply them; supports decision-making on tuning Sentinel environments.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/solution-setup-essentials,Microsoft Sentinel solution setup essentials,Microsoft Sentinel solution setup essentials,,Learn the prerequisites for creating Microsoft Sentinel SIEM and platform solutions.,Microsoft Sentinel solutions help you package and share custom security content. 
Use this page to learn about the two solution types and review the setup steps before you build or publish.",2026-04-22T17:56:00.000Z,how-to,,0.4,False,Prerequisites and setup steps for solutions; likely procedural and conceptual rather than detailed configuration references or limits.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/stix-objects-api,Threat intelligence upload API reference,Import threat intelligence with the upload API - Microsoft Sentinel,Import threat intelligence STIX objects via Sentinel upload API,This article is a reference for the upload API with example requests and responses.,"Import threat intelligence to use in Microsoft Sentinel with the upload API. Whether you're using a threat intelligence platform or a custom application, use this document as a supplemental reference to the instructions in Connect your TIP with the upload API. Installing the data connector isn't required to connect to the API. The threat intelligence you can import includes indicators of compromise and other STIX domain objects. Important This API is currently in PREVIEW. The Azure Preview Supplem",2026-04-22T17:56:00.000Z,reference,integrations,0.86,True,"API reference with example requests/responses for uploading STIX objects, including parameter names and constraints unique to this API.",updated
+https://learn.microsoft.com/en-us/azure/sentinel/summary-rules,Aggregate data with summary rules,Aggregate Microsoft Sentinel data with summary rules,Configure and use Sentinel summary rules for aggregation,Learn how to aggregate large sets of Microsoft Sentinel data across log tiers with summary rules.,"Use summary rules in Microsoft Sentinel to aggregate large sets of data in the background for a smoother security operations experience across all log tiers. Summary data is precompiled in custom log tables and provides fast query performance, including queries run on data derived from low-cost log tiers. 
Summary rules can help optimize your data for: Microsoft Sentinel stores summary rule results in custom tables with the Analytics data plan. For more information on data plans and storage costs, seeL",2026-04-22T17:56:00.000Z,how-to,configuration,0.6,True,Describes configuring summary rules and custom tables for aggregation across log tiers; likely includes rule settings and table behaviors specific to Sentinel.,updated
+https://learn.microsoft.com/en-us/azure/sentinel/surface-custom-details-in-alerts,Surface custom details in alerts,Surface custom details in Microsoft Sentinel alerts,Surface custom event details in Sentinel alerts,"Extract and surface custom event details in alerts in Microsoft Sentinel analytics rules, for better and more complete incident information","Scheduled query analytics rules analyze events from data sources connected to Microsoft Sentinel, and produce alerts when the contents of these events are significant from a security perspective. These alerts are further analyzed, grouped, and filtered by Microsoft Sentinel's various engines and distilled into incidents that warrant a SOC analyst's attention. However, when the analyst views the incident, only the properties of the component alerts themselves are immediately visible. Getting to the actu",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,"Explains extracting and surfacing custom details in alerts; typically involves specifying field names, Kusto projections, and rule configuration parameters unique to Sentinel alert enrichment.",updated https://learn.microsoft.com/en-us/azure/sentinel/threat-analytics-sentinel,Track and respond to threats with threat analytics,Threat analytics for Microsoft Sentinel users (preview),,Learn about threat analytics and how Microsoft Sentinel users can access it in the Microsoft Defender portal.,"Important Threat analytics access for customers who use only Microsoft Sentinel is currently in preview. 
-This information relates to a prerelease product that may be substantially modified before it's released. Microsoft makes no warranties, expressed or implied, with respect to the information provided here. Threat analytics is an in-product threat intelligence solution from expert Microsoft security researchers. It helps security teams stay efficient while facing emerging threats, such as: For ",2025-11-18T18:43:00.000Z,concept-article,,0.4,False,"High-level description of threat analytics access; summary doesn’t show concrete configuration, limits, or troubleshooting mappings.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/threat-detection,Overview,Threat detection in Microsoft Sentinel,,"Understand how threat detection works in Microsoft Sentinel. Learn about different types of analytics rules and templates, and the generation of alerts and incidents.","Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM and Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. 
After setting up Microsoft Sentinel to collect data from all over your organization, you need to constantly dig through al",2025-10-29T11:15:00.000Z,conceptual,,0.3,False,"Threat detection overview and rule types; appears conceptual with no strong indication of numeric thresholds, config tables, or error mappings.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/threat-intelligence-integration,Threat intelligence integrations,Threat intelligence integration in Microsoft Sentinel,Configure threat intelligence integrations in Sentinel,Learn about the different ways threat intelligence feeds are integrated with and used by Microsoft Sentinel.,"Microsoft Sentinel gives you a few ways to use threat intelligence feeds to enhance your security analysts' ability to detect and prioritize known threats: Tip If you have multiple workspaces in the same tenant, such as for Managed Security Service Providers (MSSPs), it might be more cost effective to connect threat indicators only to the centralized workspace. When you have the same set of threat indicators imported into each separate workspace, you can run cross-workspace queries to aggregate thr",2024-09-03T22:09:00.000Z,concept-article,configuration,0.64,True,"Describes specific ways to integrate threat intelligence feeds, including workspace strategies and connector usage; contains product-specific configuration guidance.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/top-workbooks,Top Microsoft Sentinel workbooks,Commonly used Microsoft Sentinel workbooks,,"Learn about the most commonly used workbooks to use popular, out-of-the-box Microsoft Sentinel resources.","This article lists the most commonly used Microsoft Sentinel workbooks. Install the solution or standalone item that contains the workbook from the Content hub in Microsoft Sentinel. Get the workbook from the Content hub by selecting Manage on the solution or standalone item. 
Or, in Microsoft Sentinel under Threat Management, go to Workbooks and search for the workbook you want to use. For more information, see Visualize and monitor your data. We recommend you deploy any workbooks associated with the data",2024-06-14T22:17:00.000Z,reference,,0.2,False,"Primarily a catalog/list of commonly used workbooks and how to access them. No detailed configuration parameters, limits, or troubleshooting content.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/transformation-filter-split,Data transformation using filter and split,Transform data using filter and split in Microsoft Sentinel,Configure filter and split transformations in Microsoft Sentinel,"Learn how to use filter and split data transformations to streamline ingestion, reduce costs, and route data between Analytics and Data lake tiers in Microsoft Sentinel.","As security data volumes continue to grow, organizations face the challenge of balancing cost-effective retention of telemetry used for AI, compliance, and investigations while ensuring that only necessary data is retained in high-performance storage tiers. Use filter and split data transformations in Microsoft Sentinel to address this challenge by modifying data at ingestion time to optimize your data retention strategy. This article describes how to configure filter and split data transformati",2026-03-30T20:20:00.000Z,how-to,configuration,0.7,True,"The article describes how to configure Sentinel’s filter and split data transformations at ingestion time to route data between Analytics and Data lake tiers. 
It contains product-specific configuration steps and options for these transformations that go beyond generic ingestion concepts, making it expert configuration knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/troubleshoot-analytics-rules,Troubleshoot analytics rules,Troubleshooting analytics rules in Microsoft Sentinel,Troubleshoot Sentinel analytics rules and AUTO DISABLED,"Learn how to deal with certain known issues that can affect analytics rules, and understand the meaning of AUTO DISABLED.","Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM and Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. This article explains how to deal with certain issues that may arise with execution of scheduled analytics rules in Microso",2025-10-29T11:15:00.000Z,conceptual,troubleshooting,0.85,True,"Explicit troubleshooting article; will map symptoms (like AUTO DISABLED) to causes and resolutions, including Sentinel-specific error states and diagnostics.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/troubleshoot-sentinel-solutions,Troubleshoot solutions in Microsoft Sentinel,Troubleshoot solutions in Microsoft Sentinel,Troubleshoot Microsoft Sentinel solution ingestion issues,"Troubleshoot Microsoft Sentinel data ingestion, analytics, packaging, and agent integration issues, and prepare information for Support.","Use this guide to diagnose issues, validate configurations, and understand support responsibilities for your Microsoft Sentinel solution.",2025-09-30T12:49:00.000Z,troubleshooting,troubleshooting,0.86,True,"A dedicated troubleshooting guide for Sentinel solutions with product-specific diagnosis and validation steps; likely includes concrete symptoms, causes, and 
resolutions for data ingestion, analytics, packaging, and agent integration.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/tutorial-enrich-ip-information,Automatically enrich incident information,Tutorial - Automatically check and record IP address reputation in incident in Microsoft Sentinel,Enrich Sentinel incidents with IP reputation automation,"In this tutorial, learn how to use Microsoft Sentinel automation rules and playbooks to automatically check IP addresses in your incidents against a threat intelligence source and record each result i","One quick and easy way to assess the severity of an incident is to see if any IP addresses in it are known to be sources of malicious activity. Having a way to do this automatically can save you a lot of time and effort. In this tutorial, you'll learn how to use Microsoft Sentinel automation rules and playbooks to automatically check IP addresses in your incidents against a threat intelligence source and record each result in its relevant incident. When you complete this tutorial, you'll be able",2025-04-24T22:03:00.000Z,tutorial,integrations,0.7,True,Shows how to integrate Sentinel automation rules/playbooks with a threat intelligence source to check IP reputation and write back to incidents—detailed cross-service integration pattern.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/tutorial-extract-incident-entities,Extract incident entities with non-native actions,Extract incident entities with non-native actions,Extract non-native Sentinel entities using playbook actions,"In this tutorial, you extract entity types with action types that aren't native to Microsoft Sentinel, and save these actions in a playbook to use for SOC automation.","Entity mapping enriches alerts and incidents with information essential for any investigative processes and remedial actions that follow. 
Microsoft Sentinel playbooks include these native actions to extract entity info: In addition to these actions, analytic rule entity mapping contains entity types that aren't native actions, like malware, process, registry key, mailbox, and more. In this tutorial, you learn how to work with non-native actions using different built-in actions to extract the rel",2025-10-08T22:11:00.000Z,tutorial,integrations,0.7,True,"Covers working with non-native entity types (malware, process, registry key, mailbox, etc.) via built-in actions in playbooks, a Sentinel-specific automation/integration pattern.",unchanged
-https://learn.microsoft.com/en-us/azure/sentinel/tutorial-log4j-detection,Tutorial - Detect threats using analytics rules,Tutorial - Detect threats by using analytics rules in Microsoft Sentinel,,"In this tutorial, learn how to use analytics rules in Microsoft Sentinel to detect exploits of the Apache Log4j vulnerability in any of your susceptible systems. Take advantage of the alert enrichment","Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM and Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. 
As a Security Information and Event Management (SIEM) service, Microsoft Sentinel is responsible for detecting security t",2024-12-31T18:03:00.000Z,tutorial,,0.4,False,Tutorial for a specific scenario (Log4j detection); mostly step-by-step usage of analytics rules rather than reference-style expert configuration or limits.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/ueba-reference,UEBA data sources and table schemas,Microsoft Sentinel User and Entity Behavior Analytics (UEBA) reference,Reference for Sentinel UEBA entity enrichments,This article displays the entity enrichments generated by Microsoft Sentinel's entity behavior analytics.,"This article lists the input data sources for the User and Entity Behavior Analytics service in Microsoft Sentinel. It also describes the enrichments that UEBA adds to entities, providing needed context to alerts and incidents. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Microsoft Sentinel in the Azure portal will be redirected to the Defender portal and will use M",2025-09-08T17:16:00.000Z,reference,configuration,0.7,True,"Lists input data sources and enrichments UEBA adds; this is a reference of product-specific fields and enrichment schema, useful as configuration/field mapping knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/understand-threat-intelligence,Overview,Threat intelligence - Microsoft Sentinel,,"Understand threat intelligence and how it integrates with features in Microsoft Sentinel to analyze data, detect threats, and enrich alerts.","Microsoft Sentinel is a cloud-native security information and event management (SIEM) solution that can ingest, curate, and manage threat intelligence from numerous sources. 
Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Microsoft Sentinel in the Azure portal will be redirected to the Defender portal and will use Microsoft Sentinel in the Defender portal only. If you",2025-08-18T08:00:00.000Z,concept-article,,0.4,False,Conceptual explanation of threat intelligence in Sentinel; summary suggests high-level understanding rather than detailed configuration or error mappings.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/unified-connector,Unified connectors overview,Unified Connectors Platform for Microsoft Sentinel,,"Learn about the Unified Connectors Platform that simplifies connector management across Microsoft security products including Microsoft Sentinel, Defender for Cloud, and Defender for Identity.",The Unified connectors platform enables you to connect once to an external product that provides value in multiple Microsoft security products. This platform simplifies the connector management experience across Microsoft security products. Unified connectors provide the following benefits:,2025-09-01T11:11:00.000Z,concept-article,,0.4,False,Overview of Unified Connectors Platform and its benefits; largely conceptual/marketing without detailed configuration or limits.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/unified-connector-cef-device,Device configuration for CEF via AMA,CEF via AMA connector - Configure appliances and devices,,Learn how to configure specific devices that use the Common Event Format (CEF) via AMA data connector for Microsoft Sentinel.,"Log collection from many security appliances and devices is supported by the Common Event Format (CEF) via AMA data connector in Microsoft Sentinel. This article lists provider-supplied installation instructions for specific security appliances and devices that use this data connector. 
Contact the provider for updates, more information, or where information is unavailable for your security appliance or device. To ingest data to your Log Analytics workspace for Microsoft Sentinel, complete the step",2024-06-28T05:41:00.000Z,reference,,0.3,False,"Primarily a list of provider-supplied installation links for various devices; the expert details live in external vendor docs, not in this page.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/unified-connector-custom-device,Custom logs - configure security device,Custom logs via AMA connector - Configure data ingestion to Microsoft Sentinel from specific applications,,"Learn how to configure data ingestion into Microsoft Sentinel from specific or custom applications that produce logs as text files, using the Custom Logs via AMA data connector or manual configuration","Microsoft Sentinel's Custom Logs via AMA data connector supports the collection of logs from text files from several different network and security applications and devices. This article supplies the configuration information, unique to each specific security application, that you need to supply when configuring this data connector. This information is provided by the application providers. 
Contact the provider for updates, for more information, or when information is unavailable for your security",2024-08-25T11:27:00.000Z,reference,,0.35,False,Acts as a pointer to provider-specific configuration information for various applications; the detailed parameters are in external docs.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/unified-connector-integration,Integrate with unified connectors,Connect to Microsoft Sentinel using a unified connector,Configure unified connectors to integrate with Microsoft Sentinel,"Learn how to connect to the Unified Connectors Platform that simplifies connector management across Microsoft security products including Microsoft Sentinel, Defender for Cloud, and Defender for Ident","Use unified connectors to simplify connection management across Microsoft security products. If you already have an existing connector to a product, consider replacing it with a unified connector for centralized management.",2025-09-01T11:11:00.000Z,how-to,configuration,0.65,True,How-to for connecting via unified connectors; likely includes connector configuration fields and options specific to the Unified Connectors Platform and Sentinel.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/unified-connector-syslog-device,Syslog - configure security device,Syslog via AMA connector - configure appliances and devices,,Learn how to configure specific appliances and devices that use the Syslog via AMA data connector for Microsoft Sentinel.,"The Syslog via AMA data connector in Microsoft Sentinel collects logs from many security appliances and devices. This article lists provider-supplied installation instructions for specific security appliances and devices that use this data connector. Contact the provider for updates, more information, or where information is unavailable for your security appliance or device. 
To forward data to your Log Analytics workspace for Microsoft Sentinel, complete the steps in Ingest syslog and CEF messages ",2025-06-25T11:11:00.000Z,reference,,0.3,False,"Similar to index 1, this is a catalog of external instructions for devices; it doesn’t itself contain detailed Sentinel- or AMA-specific configuration parameters.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/upload-indicators-api,Legacy upload indicator API reference,Reference the legacy upload indicators API - Microsoft Sentinel,Use legacy Sentinel upload indicators API,This article is a reference for the legacy upload indicators API with an example request and response.,"The Microsoft Sentinel upload indicators API allowed threat intelligence platforms or custom applications to import indicators of compromise in the STIX format into a Microsoft Sentinel workspace. This document serves as a reference to the legacy API. Important This API is in PREVIEW but no longer recommended. Use the new STIX objects API in preview to upload threat intelligence. For more information, see STIX objects API. -The Azure Preview Supplemental Terms include additional legal terms that app",2025-01-20T12:13:00.000Z,reference,integrations,0.85,True,"Reference for the legacy upload indicators API with example request/response. Contains product-specific REST API details and parameters, fitting the integrations category.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/use-matching-analytics-to-detect-threats,Use matching analytics to detect threats,Use matching analytics to detect threats - Microsoft Sentinel,,This article explains how to detect threats with Microsoft-generated threat intelligence in Microsoft Sentinel.,"Take advantage of threat intelligence produced by Microsoft to generate high-fidelity alerts and incidents with the Microsoft Defender Threat Intelligence Analytics rule. 
This built-in rule in Microsoft Sentinel matches indicators with Common Event Format (CEF) logs, Windows DNS events with domain and IPv4 threat indicators, syslog data, and more.",2025-02-21T23:03:00.000Z,how-to,,0.4,False,"Covers using Microsoft-generated threat intelligence via a built-in rule; likely high-level usage of a predefined rule, not detailed config parameters or limits.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/use-multiple-workspaces,Set up multiple workspaces,Set up multiple workspaces and tenants in Microsoft Sentinel,,"If you've defined that your environment needs multiple workspaces, you now set up your multiple workspace architecture in Microsoft Sentinel.","When you planned your deployment, you determined whether a multiple workspace architecture is relevant for your environment. If your environment requires multiple workspaces, you can now set them up as part of your deployment. For more information, see Prepare for multiple workspaces and tenants in Microsoft Sentinel. In this article, you learn how to set up Microsoft Sentinel to extend across multiple workspaces and tenants. 
This article is part of the Deployment guide for Microsoft Sentinel.",2025-07-16T11:12:00.000Z,how-to,,0.45,False,Setting up multiple workspaces and tenants is deployment/architecture execution; summary suggests procedural steps rather than quantified thresholds or decision matrices.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/use-threat-indicators-in-analytics-rules,Use threat indicators in analytics rules,Use threat indicators in analytics rules - Microsoft Sentinel,,This article explains how to generate alerts and incidents with threat intelligence indicators in Microsoft Sentinel.,Power your analytics rules with your threat indicators to automatically generate alerts based on the threat intelligence that you integrated.,2024-09-03T22:09:00.000Z,how-to,,0.4,False,Explains using threat indicators in analytics rules; appears to be conceptual/usage guidance rather than detailed configuration matrices or limits.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/watchlist-schemas,Watchlist template schemas,Schemas for Microsoft Sentinel watchlist templates,Apply built-in Sentinel watchlist template schemas,Learn about the schemas used in each built-in watchlist template in Microsoft Sentinel.,"This article details the schemas used in each built-in watchlist template provided by Microsoft Sentinel. For more information, see Create watchlists in Microsoft Sentinel. The Microsoft Sentinel watchlist templates are currently in PREVIEW. The Azure Preview Supplemental Terms include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2024-12-04T05:32:00.000Z,reference,configuration,0.8,True,"Details schemas for built-in watchlist templates, which are product-specific configuration structures (column names, types, required fields). 
This is expert configuration knowledge not generally known from training.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/watchlists,Overview,Use Watchlists to Correlate and Enrich Event Data in Microsoft Sentinel,Use Sentinel watchlists to enrich and correlate events,"Learn how to use watchlists in Microsoft Sentinel to efficiently correlate and enrich event data, reduce alert fatigue, and respond to threats. Discover best practices and get started today.","Watchlists in Microsoft Sentinel help security analysts efficiently correlate and enrich event data. They give you a flexible way to manage reference data, like lists of high-value assets or terminated employees. Integrate watchlists into your detection rules, threat hunting, and response workflows to reduce alert fatigue and respond to threats faster. This article explains how to use watchlists in Microsoft Sentinel, outlines key scenarios and limitations, and gives guidance on creating and que",2025-05-30T11:13:00.000Z,concept-article,best-practices,0.8,True,Article explicitly promises best practices and key scenarios; includes Sentinel-specific guidance on using watchlists effectively and known limitations.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/watchlists-create,Create watchlists,Create new watchlists - Microsoft Sentinel,Create Sentinel watchlists and manage file size limits,"Create watchlist in Microsoft Sentinel for allowlists or blocklists, to enrich event data, and help investigate threats.","Watchlists in Microsoft Sentinel help you correlate data from a data source you provide with the events in your Microsoft Sentinel environment. For example, you might create a watchlist with a list of high value assets, terminated employees, or service accounts in your environment. You can create a watchlist by using any of the following methods: You can currently upload local files up to 3.8 MB in size. A file that's over 3.8 MB and up to 500 MB is considered a large watchlist. 
To upload a larg",2025-12-12T12:11:00.000Z,how-to,limits-quotas,0.85,True,Explicitly states concrete size limits for uploads (3.8 MB vs up to 500 MB for large watchlists); this is precise quota information unique to the service.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/watchlists-manage,Manage watchlists,Edit watchlists - Microsoft Sentinel,Maintain and edit Microsoft Sentinel watchlists safely,Learn how to edit and add more items to Microsoft Sentinel watchlists to keep them up-to-date.,"We recommend you edit an existing watchlist instead of deleting and recreating a watchlist. Log analytics has a five-minute SLA for data ingestion. If you delete and recreate a watchlist, you might see both the deleted and recreated entries in Log Analytics during this five-minute window. If you see these duplicate entries in Log Analytics for a longer period of time, submit a support ticket. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and wi",2025-09-10T17:15:00.000Z,how-to,best-practices,0.7,True,"Includes a concrete product-specific recommendation tied to ingestion behavior (edit instead of delete/recreate due to a 5-minute Log Analytics ingestion SLA and potential duplicate entries), which is a Sentinel-specific operational gotcha.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/watchlists-queries,Build queries or rules,Build queries or rules with watchlists - Microsoft Sentinel,Use Sentinel watchlists in KQL queries and rules,Use watchlists in KQL search queries or detection rules with built-in functions for Microsoft Sentinel.,"Correlate your watchlist data against any Microsoft Sentinel data with Kusto tabular operators such as join and lookup. When you create a watchlist, you define the SearchKey. The search key is the name of a column in your watchlist that you expect to use as a join with other data or as a frequent object of searches. 
For optimal query performance, use SearchKey as the key for joins in your queries. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and wil",2025-09-10T17:15:00.000Z,how-to,integrations,0.65,True,"Describes product-specific KQL usage with watchlists, including using the SearchKey column for optimal joins and built-in operators like join and lookup. These are concrete, Sentinel-specific query patterns that go beyond generic KQL knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/whats-new,What's new,What's new in Microsoft Sentinel,,Learn about the latest new features and announcements in Microsoft Sentinel from the past few months.,"This article lists recent features added for Microsoft Sentinel, and new features in related services that provide an enhanced user experience in Microsoft Sentinel. For new features related to unified security operations in the Defender portal, see the What's new for unified security operations? The listed features were released in the last six months. For information about earlier features delivered, see our Tech Community blogs. 
Note For information about feature availability in US Government c",2026-04-10T08:00:00.000Z,concept-article,,0.1,False,"Release notes and feature announcements for Microsoft Sentinel without detailed limits, configuration tables, error codes, or decision matrices; primarily high-level 'what's new' content rather than deep, product-specific expert guidance.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/windows-security-event-id-reference,Windows security event sets,Windows security event sets that can be sent to Microsoft Sentinel,Select Windows security event sets for Microsoft Sentinel,Learn about the pre-built sets of Windows security events that you can collect and stream from your Windows systems to your Microsoft Sentinel workspace.,"When ingesting security events from Windows devices using the Windows Security Events data connector (including the legacy version), you can choose which events to collect from among the following sets: All events - Collects the full, unfiltered set of events from the Windows Security event log and the AppLocker event log channels. The Security log (Windows Logs > Security in Event Viewer) records auditing events such as logons, privilege use, and policy changes. The AppLocker logs (Application and 
This is product-specific configuration knowledge for the Windows Security Events data connector in Microsoft Sentinel, not generic concepts.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/work-with-anomaly-rules,Work with out-of-the-box anomaly rules,Work with anomaly detection analytics rules in Microsoft Sentinel,Create and tune anomaly analytics rules in Sentinel,"This article explains how to view, create, manage, assess, and fine-tune anomaly detection analytics rules in Microsoft Sentinel.","Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. Microsoft Sentinel’s customizable anomalies feature provides built-in anomaly templates for immediate value out-of-the-box. T",2026-02-26T23:12:00.000Z,how-to,configuration,0.75,True,"Covers creating, managing, and fine-tuning anomaly rules; involves product-specific configuration options and tuning parameters.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/work-with-stix-objects-indicators,Work with STIX objects and indicators,Work with STIX objects and indicators to enhance threat intelligence and threat hunting in Microsoft Sentinel (Preview) - Microsoft Sentinel,Query STIX indicator and object tables in Sentinel,This article provides examples of how to incorporate STIX objects into queries to enhance threat hunting.,"On April 3, 2025, we publicly previewed two new tables to support STIX (Structured Threat Information eXpression) indicator and object schemas: ThreatIntelIndicators and ThreatIntelObjects. This article provides examples of how to incorporate STIX objects into queries to enhance threat hunting, and how to migrate to the new threat indicator schema. 
For more information about threat intelligence in Microsoft Sentinel, see Threat intelligence in Microsoft Sentinel. Important Microsoft Sentinel will in",2025-08-07T08:00:00.000Z,how-to,integrations,0.65,True,Provides concrete examples of incorporating STIX objects into Kusto queries and migrating to new STIX-based schemas; this is a product-specific integration pattern with schema/field details.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/work-with-tasks,Use tasks to handle incident workflow,Work with incident tasks in Microsoft Sentinel in the Azure portal,Use Sentinel incident tasks in analyst workflows,This article explains how SOC analysts can use incident tasks to manage their incident-handling workflow processes in Microsoft Sentinel.,"This article explains how SOC analysts can use incident tasks to manage their incident-handling workflow processes in Microsoft Sentinel in the Azure portal. Incident tasks are typically created automatically by either automation rules or playbooks set up by senior analysts or SOC managers, but lower-tier analysts can create their own tasks on the spot, manually, right from within the incident. You can see the list of tasks you need to perform for a particular incident on the incident details pag",2024-02-06T12:14:00.000Z,how-to,best-practices,0.6,True,"Explains how analysts should use incident tasks created by automation rules/playbooks or manually, giving concrete operational patterns specific to Sentinel.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/work-with-threat-indicators,Work with threat intelligence,Work with threat intelligence - Microsoft Sentinel,,"This article explains how to view, create, manage, and visualize threat intelligence in Microsoft Sentinel.","Accelerate threat detection and remediation with streamlined creation and management of threat intelligence. 
This article demonstrates how to make the most of threat intelligence integration in the management interface, whether you're accessing it from Microsoft Sentinel in the Defender portal or the Azure portal. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Micro",2025-10-20T22:18:00.000Z,how-to,,0.3,False,"General guidance on viewing, creating, and managing threat intelligence; summary suggests UI workflow and conceptual usage, not detailed limits, configs, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/sentinel/workspace-manager,Workspace manager in the Azure portal,Manage multiple Microsoft Sentinel workspaces with workspace manager,Use workspace manager to operate multiple Sentinel workspaces,Learn how to centrally manage multiple Microsoft Sentinel workspaces within one or more Azure tenants with workspace manager. This article takes you through provisioning and usage of Workspace Manager,"Learn how to centrally manage multiple Microsoft Sentinel workspaces within one or more Azure tenants with workspace manager. This article takes you through provisioning and usage of workspace manager. Whether you're a global enterprise or a Managed Security Services Provider (MSSP), workspace manager helps you operate at scale efficiently. Here are the active content types supported with workspace manager: Important Support for workspace manager is currently in PREVIEW. 
The Azure Preview Supplem",2024-10-17T08:00:00.000Z,how-to,architecture-patterns,0.65,True,Describes central management of multiple workspaces and supported content types; this is a Sentinel-specific operational architecture pattern for large or MSSP environments.,unchanged -https://learn.microsoft.com/en-us/azure/sentinel/workspaces-defender-portal,Workspaces in the Defender portal,Multiple workspaces - Microsoft Sentinel in Defender portal,Configure multiple Microsoft Sentinel workspaces in Defender portal,Learn about the support of multiple workspaces for Microsoft Sentinel in the Defender portal including primary and secondary workspaces.,"The Defender portal allows you to connect to one primary workspace and multiple secondary workspaces for Microsoft Sentinel. In the context of this article, a workspace is a Log Analytics workspace with Microsoft Sentinel enabled. This article primarily applies to the scenario where you onboard Microsoft Sentinel to the Defender portal together with Microsoft Defender XDR for unified security operations. If you plan to use Microsoft Sentinel in the Defender portal without Microsoft Defender XDR, ",2026-03-17T08:00:00.000Z,concept-article,configuration,0.66,True,"The article covers how to connect a primary and multiple secondary Sentinel workspaces in the Defender portal, including the specific behavior and constraints when onboarding Sentinel with Defender XDR. These workspace-relationship and portal-configuration details are product-specific expert configuration knowledge.",unchanged +This information relates to a prerelease product that may be substantially modified before it's released. Microsoft makes no warranties, expressed or implied, with respect to the information provided here. Threat analytics is an in-product threat intelligence solution from expert Microsoft security researchers. 
It helps security teams stay efficient while facing emerging threats, such as: For ",2026-04-22T17:56:00.000Z,concept-article,,0.3,False,"Overview of threat analytics access; no detailed configuration, limits, or troubleshooting mappings in the summary.",updated +https://learn.microsoft.com/en-us/azure/sentinel/threat-detection,Overview,Threat detection in Microsoft Sentinel,,"Understand how threat detection works in Microsoft Sentinel. Learn about different types of analytics rules and templates, and the generation of alerts and incidents.","Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. After setting up Microsoft Sentinel to collect data from all over your organization, you need to constantly dig through al",2026-04-22T17:56:00.000Z,concept-article,,0.3,False,Threat detection overview describing rule types and alert/incident generation; summary suggests conceptual explanation rather than detailed config or numeric thresholds.,updated +https://learn.microsoft.com/en-us/azure/sentinel/threat-intelligence-integration,Threat intelligence integrations,Threat intelligence integration in Microsoft Sentinel,Integrate threat intelligence feeds with Sentinel,Learn about the different ways threat intelligence feeds are integrated with and used by Microsoft Sentinel.,"Microsoft Sentinel gives you a few ways to use threat intelligence feeds to enhance your security analysts' ability to detect and prioritize known threats: Tip If you have multiple workspaces in the same tenant, such as for Managed Security Service Providers (MSSPs), it might be more cost effective to connect threat indicators only to the centralized workspace. 
When you have the same set of threat indicators imported into each separate workspace, you can run cross-workspace queries to aggregate thr",2026-04-22T17:56:00.000Z,concept-article,integrations,0.6,True,Describes ways threat intelligence feeds are integrated and used; likely includes connector types and workspace strategies that are product-specific integration guidance.,updated +https://learn.microsoft.com/en-us/azure/sentinel/top-workbooks,Top Microsoft Sentinel workbooks,Commonly used Microsoft Sentinel workbooks,,"Learn about the most commonly used workbooks to use popular, out-of-the-box Microsoft Sentinel resources.","This article lists the most commonly used Microsoft Sentinel workbooks. Install the solution or standalone item that contains the workbook from the Content hub in Microsoft Sentinel. Get the workbook from the Content hub by selecting Manage on the solution or standalone item. Or, in Microsoft Sentinel under Threat Management, go to Workbooks and search for the workbook you want to use. For more information, see Visualize and monitor your data. 
We recommend you deploy any workbooks associated with the data",2026-04-22T17:56:00.000Z,reference,,0.4,False,List of commonly used workbooks and where to get them; primarily catalog/usage guidance without deep configuration or troubleshooting content.,updated +https://learn.microsoft.com/en-us/azure/sentinel/transformation-filter-split,Data transformation using filter and split,Transform data using filter and split in Microsoft Sentinel,Configure filter and split transformations for Sentinel ingestion,"Learn how to use filter and split data transformations to streamline ingestion, reduce costs, and route data between Analytics and Data lake tiers in Microsoft Sentinel.","As security data volumes continue to grow, organizations face the challenge of balancing cost-effective retention of telemetry used for AI, compliance, and investigations while ensuring that only necessary data is retained in high-performance storage tiers. Use filter and split data transformations in Microsoft Sentinel to address this challenge by modifying data at ingestion time to optimize your data retention strategy. This article describes how to configure filter and split data transformati",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,Describes configuring filter/split transformations at ingestion to route data between tiers; includes product-specific transformation options and parameters.,updated +https://learn.microsoft.com/en-us/azure/sentinel/troubleshoot-analytics-rules,Troubleshoot analytics rules,Troubleshooting analytics rules in Microsoft Sentinel,Troubleshoot Sentinel analytics rule issues,"Learn how to deal with certain known issues that can affect analytics rules, and understand the meaning of AUTO DISABLED.","Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM Microsoft Defender XDR. 
With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. This article explains how to deal with certain issues that may arise with execution of scheduled analytics rules in Microso",2026-04-22T17:56:00.000Z,troubleshooting-general,troubleshooting,0.8,True,"Explicit troubleshooting article; deals with known issues, explains AUTO DISABLED state, and likely maps symptoms to causes and resolutions specific to Sentinel analytics rules.",updated +https://learn.microsoft.com/en-us/azure/sentinel/troubleshoot-sentinel-solutions,Troubleshoot solutions in Microsoft Sentinel,Troubleshoot solutions in Microsoft Sentinel,Troubleshoot Microsoft Sentinel solution ingestion and analytics,"Troubleshoot Microsoft Sentinel data ingestion, analytics, packaging, and agent integration issues, and prepare information for Support.","Use this guide to diagnose issues, validate configurations, and understand support responsibilities for your Microsoft Sentinel solution.",2026-04-22T17:56:00.000Z,troubleshooting,troubleshooting,0.85,True,"Explicit troubleshooting guide for data ingestion, analytics, packaging, and agent integration; likely includes error messages, causes, and resolution steps unique to Sentinel solutions.",updated +https://learn.microsoft.com/en-us/azure/sentinel/tutorial-enrich-ip-information,Automatically enrich incident information,Tutorial - Automatically check and record IP address reputation in incident in Microsoft Sentinel,,"In this tutorial, learn how to use Microsoft Sentinel automation rules and playbooks to automatically check IP addresses in your incidents against a threat intelligence source and record each result i","One quick and easy way to assess the severity of an incident is to see if any IP addresses in it are known to be sources of malicious activity. 
Having a way to do this automatically can save you a lot of time and effort. In this tutorial, you'll learn how to use Microsoft Sentinel automation rules and playbooks to automatically check IP addresses in your incidents against a threat intelligence source and record each result in its relevant incident. When you complete this tutorial, you'll be able",2026-04-22T17:56:00.000Z,tutorial,,0.45,False,Tutorial for checking IP reputation via automation; summary suggests a worked example rather than a catalog of product-specific configuration options.,updated +https://learn.microsoft.com/en-us/azure/sentinel/tutorial-extract-incident-entities,Extract incident entities with non-native actions,Extract incident entities with non-native actions,,"In this tutorial, you extract entity types with action types that aren't native to Microsoft Sentinel, and save these actions in a playbook to use for SOC automation.","Entity mapping enriches alerts and incidents with information essential for any investigative processes and remedial actions that follow. Microsoft Sentinel playbooks include these native actions to extract entity info: In addition to these actions, analytic rule entity mapping contains entity types that aren't native actions, like malware, process, registry key, mailbox, and more. In this tutorial, you learn how to work with non-native actions using different built-in actions to extract the rel",2026-04-22T17:56:00.000Z,tutorial,,0.45,False,Tutorial on extracting non-native entities; focuses on example playbook actions rather than general configuration matrices or limits.,updated +https://learn.microsoft.com/en-us/azure/sentinel/tutorial-log4j-detection,Tutorial - Detect threats using analytics rules,Tutorial - Detect threats by using analytics rules in Microsoft Sentinel,,"In this tutorial, learn how to use analytics rules in Microsoft Sentinel to detect exploits of the Apache Log4j vulnerability in any of your susceptible systems. 
Take advantage of the alert enrichment","Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM and Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. As a Security Information and Event Management (SIEM) service, Microsoft Sentinel is responsible for detecting security t",2026-04-22T17:56:00.000Z,tutorial,,0.35,False,Scenario-based tutorial for Log4j detection; primarily step-by-step usage of analytics rules rather than a reusable configuration reference or troubleshooting catalog.,updated +https://learn.microsoft.com/en-us/azure/sentinel/ueba-reference,UEBA data sources and table schemas,Microsoft Sentinel User and Entity Behavior Analytics (UEBA) reference,Reference for Sentinel UEBA entity enrichments,This article displays the entity enrichments generated by Microsoft Sentinel's entity behavior analytics.,"This article lists the input data sources for the User and Entity Behavior Analytics service in Microsoft Sentinel. It also describes the enrichments that UEBA adds to entities, providing needed context to alerts and incidents. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal.
All customers using Microsoft Sentinel in the Azure portal will be redirected to the Defender portal and will use M",2026-04-22T17:56:00.000Z,reference,configuration,0.6,True,"Lists input data sources and describes enrichments UEBA adds to entities; this is a reference of product-specific fields and enrichment schema, which is configuration/schema knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/understand-threat-intelligence,Overview,Threat intelligence - Microsoft Sentinel,,"Understand threat intelligence and how it integrates with features in Microsoft Sentinel to analyze data, detect threats, and enrich alerts.","Microsoft Sentinel is a cloud-native security information and event management (SIEM) solution that can ingest, curate, and manage threat intelligence from numerous sources. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Microsoft Sentinel in the Azure portal will be redirected to the Defender portal and will use Microsoft Sentinel in the Defender portal only. If you",2026-04-22T17:56:00.000Z,concept-article,,0.3,False,Conceptual article about threat intelligence in Sentinel; primarily high-level explanation without detailed configuration parameters or error mappings.,updated +https://learn.microsoft.com/en-us/azure/sentinel/unified-connector,Unified connectors overview,Unified Connectors Platform for Microsoft Sentinel,,"Learn about the Unified Connectors Platform that simplifies connector management across Microsoft security products including Microsoft Sentinel, Defender for Cloud, and Defender for Identity.",The Unified connectors platform enables you to connect once to an external product that provides value in multiple Microsoft security products. This platform simplifies the connector management experience across Microsoft security products.
Unified connectors provide the following benefits:,2026-04-22T17:56:00.000Z,concept-article,,0.3,False,"Overview of Unified Connectors Platform benefits; summary does not indicate detailed config tables, limits, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/sentinel/unified-connector-cef-device,Device configuration for CEF via AMA,CEF via AMA connector - Configure appliances and devices,Configure specific devices for CEF via AMA Sentinel connector,Learn how to configure specific devices that use the Common Event Format (CEF) via AMA data connector for Microsoft Sentinel.,"Log collection from many security appliances and devices is supported by the Common Event Format (CEF) via AMA data connector in Microsoft Sentinel. This article lists provider-supplied installation instructions for specific security appliances and devices that use this data connector. Contact the provider for updates, more information, or where information is unavailable for your security appliance or device.
To ingest data to your Log Analytics workspace for Microsoft Sentinel, complete the step",2026-04-22T17:56:00.000Z,reference,configuration,0.65,True,"Device-specific configuration instructions (ports, formats, forwarding settings) are highly product- and vendor-specific configuration expert knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/unified-connector-custom-device,Custom logs - configure security device,Custom logs via AMA connector - Configure data ingestion to Microsoft Sentinel from specific applications,Configure Sentinel custom log ingestion for specific applications,"Learn how to configure data ingestion into Microsoft Sentinel from specific or custom applications that produce logs as text files, using the Custom Logs via AMA data connector or manual configuration","Microsoft Sentinel's Custom Logs via AMA data connector supports the collection of logs from text files from several different network and security applications and devices. This article supplies the configuration information, unique to each specific security application, that you need to supply when configuring this data connector. This information is provided by the application providers.
Contact the provider for updates, for more information, or when information is unavailable for your security",2026-04-22T17:56:00.000Z,reference,configuration,0.7,True,"Per-application configuration details (paths, formats, fields) for Custom Logs via AMA are highly specific configuration expert knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/unified-connector-integration,Integrate with unified connectors,Connect to Microsoft Sentinel using a unified connector,Configure unified connectors to integrate Microsoft Sentinel,"Learn how to connect to the Unified Connectors Platform that simplifies connector management across Microsoft security products including Microsoft Sentinel, Defender for Cloud, and Defender for Ident","Use unified connectors to simplify connection management across Microsoft security products. If you already have an existing connector to a product, consider replacing it with a unified connector for centralized management.",2026-04-22T17:56:00.000Z,how-to,configuration,0.65,True,"How-to for connecting via unified connector; such pages typically include connector-specific parameters, scopes, and settings unique to Sentinel’s unified platform, which qualify as configuration expert knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/unified-connector-syslog-device,Syslog - configure security device,Syslog via AMA connector - configure appliances and devices,Configure appliances and devices for Syslog via AMA,Learn how to configure specific appliances and devices that use the Syslog via AMA data connector for Microsoft Sentinel.,"The Syslog via AMA data connector in Microsoft Sentinel collects logs from many security appliances and devices. This article lists provider-supplied installation instructions for specific security appliances and devices that use this data connector. Contact the provider for updates, more information, or where information is unavailable for your security appliance or device.
To forward data to your Log Analytics workspace for Microsoft Sentinel, complete the steps in Ingest syslog and CEF messages ",2026-04-22T17:56:00.000Z,reference,configuration,0.65,True,Lists provider-specific configuration for Syslog forwarding to Sentinel; includes device settings and log formats that are expert configuration details.,updated +https://learn.microsoft.com/en-us/azure/sentinel/upload-indicators-api,Legacy upload indicator API reference,Reference the legacy upload indicators API - Microsoft Sentinel,Use legacy Sentinel upload indicators API for STIX IOCs,This article is a reference for the legacy upload indicators API with an example request and response.,"The Microsoft Sentinel upload indicators API allowed threat intelligence platforms or custom applications to import indicators of compromise in the STIX format into a Microsoft Sentinel workspace. This document serves as a reference to the legacy API. Important This API is in PREVIEW but no longer recommended. Use the new STIX objects API in preview to upload threat intelligence. For more information, see STIX objects API. +The Azure Preview Supplemental Terms include additional legal terms that app",2026-04-22T17:56:00.000Z,reference,integrations,0.82,True,"Legacy API reference with concrete request/response schema for uploading indicators, which is detailed integration knowledge.",updated +https://learn.microsoft.com/en-us/azure/sentinel/use-matching-analytics-to-detect-threats,Use matching analytics to detect threats,Use matching analytics to detect threats - Microsoft Sentinel,,This article explains how to detect threats with Microsoft-generated threat intelligence in Microsoft Sentinel.,"Take advantage of threat intelligence produced by Microsoft to generate high-fidelity alerts and incidents with the Microsoft Defender Threat Intelligence Analytics rule.
This built-in rule in Microsoft Sentinel matches indicators with Common Event Format (CEF) logs, Windows DNS events with domain and IPv4 threat indicators, syslog data, and more.",2026-04-22T17:56:00.000Z,how-to,,0.35,False,"Describes using Microsoft-generated threat intelligence via a built-in analytics rule; summary indicates conceptual usage, not detailed error codes, limits, or config tables.",updated +https://learn.microsoft.com/en-us/azure/sentinel/use-multiple-workspaces,Set up multiple workspaces,Set up multiple workspaces and tenants in Microsoft Sentinel,Deploy Microsoft Sentinel across multiple workspaces and tenants,"If you've defined that your environment needs multiple workspaces, you now set up your multiple workspace architecture in Microsoft Sentinel.","When you planned your deployment, you determined whether a multiple workspace architecture is relevant for your environment. If your environment requires multiple workspaces, you can now set them up as part of your deployment. For more information, see Prepare for multiple workspaces and tenants in Microsoft Sentinel. In this article, you learn how to set up Microsoft Sentinel to extend across multiple workspaces and tenants.
This article is part of the Deployment guide for Microsoft Sentinel.",2026-04-22T17:56:00.000Z,how-to,deployment,0.7,True,Implementation step for multi-workspace/multi-tenant architecture; contains deployment-specific instructions and constraints for this topology.,updated +https://learn.microsoft.com/en-us/azure/sentinel/use-threat-indicators-in-analytics-rules,Use threat indicators in analytics rules,Use threat indicators in analytics rules - Microsoft Sentinel,,This article explains how to generate alerts and incidents with threat intelligence indicators in Microsoft Sentinel.,Power your analytics rules with your threat indicators to automatically generate alerts based on the threat intelligence that you integrated.,2026-04-22T17:56:00.000Z,how-to,,0.35,False,"Explains using threat indicators in analytics rules; likely rule-creation workflow and concepts, not detailed configuration matrices or numeric thresholds.",updated +https://learn.microsoft.com/en-us/azure/sentinel/watchlist-schemas,Watchlist template schemas,Schemas for Microsoft Sentinel watchlist templates,Use schemas for built-in Sentinel watchlist templates,Learn about the schemas used in each built-in watchlist template in Microsoft Sentinel.,"This article details the schemas used in each built-in watchlist template provided by Microsoft Sentinel. For more information, see Create watchlists in Microsoft Sentinel. The Microsoft Sentinel watchlist templates are currently in PREVIEW.
The Azure Preview Supplemental Terms include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.",2026-04-22T17:56:00.000Z,reference,configuration,0.82,True,"Details the schemas for each watchlist template, including column names and types, which are configuration references.",updated +https://learn.microsoft.com/en-us/azure/sentinel/watchlists,Overview,Use Watchlists to Correlate and Enrich Event Data in Microsoft Sentinel,Use Sentinel watchlists to enrich and correlate data,"Learn how to use watchlists in Microsoft Sentinel to efficiently correlate and enrich event data, reduce alert fatigue, and respond to threats. Discover best practices and get started today.","Watchlists in Microsoft Sentinel help security analysts efficiently correlate and enrich event data. They give you a flexible way to manage reference data, like lists of high-value assets or terminated employees. Integrate watchlists into your detection rules, threat hunting, and response workflows to reduce alert fatigue and respond to threats faster.
This article explains how to use watchlists in Microsoft Sentinel, outlines key scenarios and limitations, and gives guidance on creating and que",2026-04-22T17:56:00.000Z,concept-article,best-practices,0.7,True,"Article explicitly mentions best practices and key scenarios; watchlist usage guidance typically includes concrete patterns (how to structure lists, query them, and integrate into rules) that are product-specific DO/DON’T recommendations.",updated +https://learn.microsoft.com/en-us/azure/sentinel/watchlists-create,Create watchlists,Create new watchlists - Microsoft Sentinel,Create Microsoft Sentinel watchlists with size limits,"Create a watchlist in Microsoft Sentinel for allowlists or blocklists, to enrich event data, and help investigate threats.","Watchlists in Microsoft Sentinel help you correlate data from a data source you provide with the events in your Microsoft Sentinel environment. For example, you might create a watchlist with a list of high value assets, terminated employees, or service accounts in your environment. You can create a watchlist by using any of the following methods: You can currently upload local files up to 3.8 MB in size. A file that's over 3.8 MB and up to 500 MB is considered a large watchlist. To upload a larg",2026-04-22T17:56:00.000Z,how-to,limits-quotas,0.78,True,"Includes concrete file size limits for watchlist uploads (3.8 MB threshold and up to 500 MB for large watchlists), which are product-specific numerical constraints.",updated +https://learn.microsoft.com/en-us/azure/sentinel/watchlists-manage,Manage watchlists,Edit watchlists - Microsoft Sentinel,Maintain and edit Microsoft Sentinel watchlists safely,Learn how to edit and add more items to Microsoft Sentinel watchlists to keep them up-to-date.,"We recommend you edit an existing watchlist instead of deleting and recreating a watchlist. Log Analytics has a five-minute SLA for data ingestion.
If you delete and recreate a watchlist, you might see both the deleted and recreated entries in Log Analytics during this five-minute window. If you see these duplicate entries in Log Analytics for a longer period of time, submit a support ticket. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and wi",2026-04-22T17:56:00.000Z,how-to,best-practices,0.7,True,"Gives a concrete product-specific recommendation to edit instead of delete/recreate due to a 5-minute Log Analytics ingestion SLA and describes duplicate-entry behavior, which is a Sentinel-specific gotcha.",updated +https://learn.microsoft.com/en-us/azure/sentinel/watchlists-queries,Build queries or rules,Build queries or rules with watchlists - Microsoft Sentinel,,Use watchlists in KQL search queries or detection rules with built-in functions for Microsoft Sentinel.,"Correlate your watchlist data against any Microsoft Sentinel data with Kusto tabular operators such as join and lookup. When you create a watchlist, you define the SearchKey. The search key is the name of a column in your watchlist that you expect to use as a join with other data or as a frequent object of searches. For optimal query performance, use SearchKey as the key for joins in your queries. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and wil",2026-04-22T17:56:00.000Z,how-to,,0.3,False,"Focuses on using watchlists in KQL queries; no numeric limits, config tables, or product-specific error/decision matrices.",updated +https://learn.microsoft.com/en-us/azure/sentinel/whats-new,What's new,What's new in Microsoft Sentinel,,Learn about the latest new features and announcements in Microsoft Sentinel from the past few months.,"This article lists recent features added for Microsoft Sentinel, and new features in related services that provide an enhanced user experience in Microsoft Sentinel.
For new features related to unified security operations in the Defender portal, see the What's new for unified security operations? The listed features were released in the last six months. For information about earlier features delivered, see our Tech Community blogs. Note For information about feature availability in US Government c",2026-04-22T17:56:00.000Z,concept-article,,0.3,False,"What's new/feature announcements; typically changelog-style without deep limits, configs, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/sentinel/windows-security-event-id-reference,Windows security event sets,Windows security event sets that can be sent to Microsoft Sentinel,Select Windows security event sets for Sentinel ingestion,Learn about the pre-built sets of Windows security events that you can collect and stream from your Windows systems to your Microsoft Sentinel workspace.,"When ingesting security events from Windows devices using the Windows Security Events data connector (including the legacy version), you can choose which events to collect from among the following sets: All events - Collects the full, unfiltered set of events from the Windows Security event log and the AppLocker event log channels. The Security log (Windows Logs > Security in Event Viewer) records auditing events such as logons, privilege use, and policy changes.
The AppLocker logs (Application and ",2026-04-22T17:56:00.000Z,reference,configuration,0.7,True,Describes predefined event sets and which event IDs they include; this is product-specific configuration of what data is collected.,updated +https://learn.microsoft.com/en-us/azure/sentinel/work-with-anomaly-rules,Work with out-of-the-box anomaly rules,Work with anomaly detection analytics rules in Microsoft Sentinel,Configure anomaly detection analytics rules in Sentinel,"This article explains how to view, create, manage, assess, and fine-tune anomaly detection analytics rules in Microsoft Sentinel.","Important Custom detections is now the best way to create new rules across Microsoft Sentinel SIEM and Microsoft Defender XDR. With custom detections, you can reduce ingestion costs, get unlimited real-time detections, and benefit from seamless integration with Defender XDR data, functions, and remediation actions with automatic entity mapping. For more information, read this blog. Microsoft Sentinel’s customizable anomalies feature provides built-in anomaly templates for immediate value out-of-the-box. T",2026-04-22T17:56:00.000Z,how-to,configuration,0.65,True,"Article on creating, managing, and tuning anomaly rules; these typically expose anomaly-specific parameters (sensitivity, baselining windows, thresholds) that are product-specific configuration options.",updated +https://learn.microsoft.com/en-us/azure/sentinel/work-with-stix-objects-indicators,Work with STIX objects and indicators,Work with STIX objects and indicators to enhance threat intelligence and threat hunting in Microsoft Sentinel (Preview) - Microsoft Sentinel,Query STIX objects and indicators in Sentinel,This article provides examples of how to incorporate STIX objects into queries to enhance threat hunting.,"On April 3, 2025, we publicly previewed two new tables to support STIX (Structured Threat Information eXpression) indicator and object schemas: ThreatIntelIndicators and ThreatIntelObjects.
This article provides examples of how to incorporate STIX objects into queries to enhance threat hunting, and how to migrate to the new threat indicator schema. For more information about threat intelligence in Microsoft Sentinel, see Threat intelligence in Microsoft Sentinel. Important Microsoft Sentinel will in",2026-04-22T17:56:00.000Z,how-to,integrations,0.7,True,Provides concrete Kusto query examples against ThreatIntelIndicators and ThreatIntelObjects tables and guidance on migrating to the new schema—these are product-specific query/integration patterns and schema details.,updated +https://learn.microsoft.com/en-us/azure/sentinel/work-with-tasks,Use tasks to handle incident workflow,Work with incident tasks in Microsoft Sentinel in the Azure portal,,This article explains how SOC analysts can use incident tasks to manage their incident-handling workflow processes in Microsoft Sentinel.,"This article explains how SOC analysts can use incident tasks to manage their incident-handling workflow processes in Microsoft Sentinel in the Azure portal. Incident tasks are typically created automatically by either automation rules or playbooks set up by senior analysts or SOC managers, but lower-tier analysts can create their own tasks on the spot, manually, right from within the incident. You can see the list of tasks you need to perform for a particular incident on the incident details pag",2026-04-22T17:56:00.000Z,how-to,,0.35,False,Explains how analysts use tasks; operational guidance but no explicit expert-only configuration tables or limits in the summary.,updated +https://learn.microsoft.com/en-us/azure/sentinel/work-with-threat-indicators,Work with threat intelligence,Work with threat intelligence - Microsoft Sentinel,,"This article explains how to view, create, manage, and visualize threat intelligence in Microsoft Sentinel.","Accelerate threat detection and remediation with streamlined creation and management of threat intelligence.
This article demonstrates how to make the most of threat intelligence integration in the management interface, whether you're accessing it from Microsoft Sentinel in the Defender portal or the Azure portal. Important After March 31, 2027, Microsoft Sentinel will no longer be supported in the Azure portal and will be available only in the Microsoft Defender portal. All customers using Micro",2026-04-22T17:56:00.000Z,how-to,,0.3,False,"High-level guidance on viewing, creating, and managing threat intelligence in the UI; summary suggests conceptual/UX description without detailed config tables or numeric limits.",updated +https://learn.microsoft.com/en-us/azure/sentinel/workspace-manager,Workspace manager in the Azure portal,Manage multiple Microsoft Sentinel workspaces with workspace manager,Configure and use Sentinel workspace manager for multi-workspace operations,Learn how to centrally manage multiple Microsoft Sentinel workspaces within one or more Azure tenants with workspace manager. This article takes you through provisioning and usage of Workspace Manager,"Learn how to centrally manage multiple Microsoft Sentinel workspaces within one or more Azure tenants with workspace manager. This article takes you through provisioning and usage of workspace manager. Whether you're a global enterprise or a Managed Security Services Provider (MSSP), workspace manager helps you operate at scale efficiently. Here are the active content types supported with workspace manager: Important Support for workspace manager is currently in PREVIEW.
The Azure Preview Supplem",2026-04-22T17:56:00.000Z,how-to,configuration,0.7,True,Covers provisioning and using workspace manager with supported content types; includes Sentinel-specific management configuration details.,updated +https://learn.microsoft.com/en-us/azure/sentinel/workspaces-defender-portal,Workspaces in the Defender portal,Multiple workspaces - Microsoft Sentinel in Defender portal,Design multi-workspace Sentinel deployments in Defender portal,Learn about the support of multiple workspaces for Microsoft Sentinel in the Defender portal including primary and secondary workspaces.,"The Defender portal allows you to connect to one primary workspace and multiple secondary workspaces for Microsoft Sentinel. In the context of this article, a workspace is a Log Analytics workspace with Microsoft Sentinel enabled. This article primarily applies to the scenario where you onboard Microsoft Sentinel to the Defender portal together with Microsoft Defender XDR for unified security operations. If you plan to use Microsoft Sentinel in the Defender portal without Microsoft Defender XDR, ",2026-04-22T17:56:00.000Z,concept-article,architecture-patterns,0.65,True,"Describes primary/secondary workspace patterns and when to use them, especially with Defender XDR; product-specific architecture guidance.",updated diff --git a/products/azure-sentinel/report.md b/products/azure-sentinel/report.md index 399df7e9..5bfe1ecf 100644 --- a/products/azure-sentinel/report.md +++ b/products/azure-sentinel/report.md @@ -1,46 +1,46 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: - configuration: Configuring Microsoft Sentinel data connectors, retention, analytics/automation - rules, ASIM schemas, UEBA, SAP/Power Platform content, data lake, MCP tools, and - health/auditing settings.
- decision-making: Planning Sentinel deployments, pricing, and data tiers; choosing - connectors and content; and migrating detections, automation, and historical data - from MMA, ArcSight, QRadar, and Splunk. - integrations: APIs, connectors, and code patterns for ingesting data, threat intel, - and incidents into Sentinel, querying the data lake/graphs, and integrating with - playbooks, MCP, Teams, Power BI, and Logic Apps - security: 'Security configuration for Microsoft Sentinel: RBAC and roles, row-level/resource-context - access, playbook auth/restrictions, encryption keys, audit logs, SAP roles/params, - and network/attack protections.' - best-practices: 'Best practices for SOC operations in Microsoft Sentinel: rule tuning, - automation/playbooks, incident tasks/metrics, watchlists, data collection, solution - lifecycle, and monitoring/health.' - architecture-patterns: 'Architecting Sentinel deployments: multi-workspace/tenant - patterns, MSSP setups, SOAR automation, BCDR/resiliency, cross-workspace data/incident - ops, SAP, ML models, and Jupyter-based hunting.' - troubleshooting: Diagnosing and fixing Microsoft Sentinel ingestion, connector, - KQL/data lake, analytics rule (auto-disable), MCP tools, and SAP/AWS/Blob/CEF/Syslog - integration issues. - deployment: Deploying and managing Microsoft Sentinel solutions and content (CI/CD, - ARM, content hub, marketplace) and specialized connectors/agents for SAP, Power - Platform, Dynamics, Azure Stack Hub, and hunting notebooks. - limits-quotas: Limits, quotas, pricing, and retention tiers for Sentinel data, search - jobs, watchlists, MCP servers, ASIM, and workspace removal impacts + configuration: Configuring Microsoft Sentinel data ingestion, connectors, schemas, + analytics/automation rules, UEBA/Fusion, data lake, MCP tools, and health/auditing + to tailor and operate the SIEM platform. 
+ decision-making: Planning and decision guides for Sentinel deployment, pricing and + data tiers, connector and analytics choices, and migrating from legacy SIEMs (Splunk/QRadar/ArcSight) + and MMA/SAP to Sentinel. + integrations: Programmatic and low-code ways to integrate data, threat intel, incidents, + graphs, and analytics with Microsoft Sentinel using APIs, Logic Apps, DCRs, ASIM, + notebooks, and external tools. + security: 'Securing Microsoft Sentinel: RBAC and row-level access, playbook auth + and restrictions, encryption keys, network perimeters, and SAP/Power Platform/Dynamics + security content and parameters' + best-practices: Best practices for Sentinel workspace/data design, cost tuning, + rule noise reduction, SOC operations, identity/SAP threat detection, ASIM use, + and safe watchlist management. + troubleshooting: Diagnosing and fixing Microsoft Sentinel data ingestion, connector, + KQL, analytics rule, and MCP/SAP solution issues across AWS S3, Azure Storage, + Syslog/CEF, and the data lake. + deployment: Deploying and managing Microsoft Sentinel at scale, including prerequisites, + multi-tenant/workspace setup, CI/CD content deployment, Content hub solutions, + and SAP/Power Platform/Dynamics connectors. + architecture-patterns: 'Designing Microsoft Sentinel architectures: workspace/tenant + layouts, HA/DR, MSSP multi-tenant models, SAP-specific patterns, and cross-workspace/tenant + integration strategies.' + limits-quotas: Limits, quotas, pricing, and retention tiers for Sentinel data, workspaces, + watchlists, search jobs, and ASIM normalization, plus impacts of workspace removal. skill_description: Expert knowledge for Azure Sentinel development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. 
Use when - configuring Sentinel data connectors, analytics rules, SOAR playbooks, UEBA/ASIM - content, or multi-workspace setups, and other Azure Sentinel related development - tasks. Not for Azure Defender For Cloud (use azure-defender-for-cloud), Azure Defender - For Iot (use azure-defender-for-iot), Azure Monitor (use azure-monitor), Azure Security - (use azure-security). -use_when: Use when configuring Sentinel data connectors, analytics rules, SOAR playbooks, - UEBA/ASIM content, or multi-workspace setups, and other Azure Sentinel related development + configuring Sentinel connectors, analytics rules, UEBA/Fusion, ASIM/DCR pipelines, + or Logic Apps playbooks, and other Azure Sentinel related development tasks. Not + for Azure Defender For Cloud (use azure-defender-for-cloud), Azure Monitor (use + azure-monitor), Azure Security (use azure-security), Azure Network Watcher (use + azure-network-watcher). +use_when: Use when configuring Sentinel connectors, analytics rules, UEBA/Fusion, + ASIM/DCR pipelines, or Logic Apps playbooks, and other Azure Sentinel related development tasks. confusable_not_for: Not for Azure Defender For Cloud (use azure-defender-for-cloud), - Azure Defender For Iot (use azure-defender-for-iot), Azure Monitor (use azure-monitor), - Azure Security (use azure-security). + Azure Monitor (use azure-monitor), Azure Security (use azure-security), Azure Network + Watcher (use azure-network-watcher). 
---

# Azure Sentinel Crawl Report
@@ -49,13 +49,13 @@ confusable_not_for: Not for Azure Defender For Cloud (use azure-defender-for-clo
- **Total Pages**: 394
- **Fetched**: 394
- **Fetch Failed**: 0
-- **Classified**: 289
-- **Unclassified**: 105
+- **Classified**: 283
+- **Unclassified**: 111

### Incremental Update

-- **New Pages**: 3
-- **Updated Pages**: 3
-- **Unchanged**: 388
+- **New Pages**: 0
+- **Updated Pages**: 394
+- **Unchanged**: 0
- **Deleted Pages**: 0
- **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-sentinel/azure-sentinel.csv`
@@ -63,434 +63,463 @@ confusable_not_for: Not for Azure Defender For Cloud (use azure-defender-for-clo
| Type | Count | Percentage |
|------|-------|------------|
-| architecture-patterns | 10 | 2.5% |
-| best-practices | 23 | 5.8% |
-| configuration | 125 | 31.7% |
-| decision-making | 33 | 8.4% |
-| deployment | 21 | 5.3% |
-| integrations | 49 | 12.4% |
+| architecture-patterns | 9 | 2.3% |
+| best-practices | 15 | 3.8% |
+| configuration | 130 | 33.0% |
+| decision-making | 38 | 9.6% |
+| deployment | 22 | 5.6% |
+| integrations | 39 | 9.9% |
| limits-quotas | 8 | 2.0% |
-| security | 12 | 3.0% |
+| security | 14 | 3.6% |
| troubleshooting | 8 | 2.0% |
-| *(Unclassified)* | 105 | 26.6% |
+| *(Unclassified)* | 111 | 28.2% |

## Changes

-### New Pages
-
-- [SAP LogServ](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-logserv-overview)
-- [Identity attack graph](https://learn.microsoft.com/en-us/azure/sentinel/datalake/identity-attack-graph)
-- [AWS EKS logs](https://learn.microsoft.com/en-us/azure/sentinel/connect-aws-eks)
-
### Updated Pages

-- [Plan your migration](https://learn.microsoft.com/en-us/azure/sentinel/migration)
-  - Updated: 2024-09-23T08:00:00.000Z → 2026-04-15T08:00:00.000Z
-- [Overview](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-get-started)
-  - Updated: 2026-01-30T08:00:00.000Z → 2026-04-17T11:12:00.000Z
-- [Data exploration](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-data-exploration-tool)
-  - Updated: 2026-04-03T06:12:00.000Z → 2026-04-14T08:00:00.000Z
+- [What is Microsoft Sentinel?](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-overview)
+  - Updated: 2025-09-30T12:49:00.000Z → 2026-04-22T17:56:00.000Z
+- [What's new](https://learn.microsoft.com/en-us/azure/sentinel/whats-new)
+  - Updated: 2026-04-10T08:00:00.000Z → 2026-04-22T17:56:00.000Z
+- [Deployment planning guide](https://learn.microsoft.com/en-us/azure/sentinel/deploy-overview)
+  - Updated: 2025-07-22T12:54:00.000Z → 2026-04-22T17:56:00.000Z
+- [Prerequisites](https://learn.microsoft.com/en-us/azure/sentinel/prerequisites)
+  - Updated: 2025-05-30T11:13:00.000Z → 2026-04-22T17:56:00.000Z
+- [Review sample workspace designs](https://learn.microsoft.com/en-us/azure/sentinel/sample-workspace-designs)
+  - Updated: 2024-08-28T17:05:00.000Z → 2026-04-22T17:56:00.000Z
+- [Prepare for multiple workspaces](https://learn.microsoft.com/en-us/azure/sentinel/prepare-multiple-workspaces)
+  - Updated: 2025-07-16T11:12:00.000Z → 2026-04-22T17:56:00.000Z
+- [Prioritize data connectors](https://learn.microsoft.com/en-us/azure/sentinel/prioritize-data-connectors)
+  - Updated: 2023-10-11T15:53:00.000Z → 2026-04-22T17:56:00.000Z
+- [Plan roles and permissions](https://learn.microsoft.com/en-us/azure/sentinel/roles)
+  - Updated: 2026-01-07T08:00:00.000Z → 2026-04-22T17:56:00.000Z
+- [Plan data retention](https://learn.microsoft.com/en-us/azure/sentinel/log-plans)
+  - Updated: 2025-09-30T12:49:00.000Z → 2026-04-22T17:56:00.000Z
+- [Plan costs](https://learn.microsoft.com/en-us/azure/sentinel/billing)
+  - Updated: 2026-04-01T17:25:00.000Z → 2026-04-22T17:56:00.000Z
+- [Geographical availability and data residency](https://learn.microsoft.com/en-us/azure/sentinel/geographical-availability-data-residency)
+  - Updated: 2026-01-14T12:11:00.000Z → 2026-04-22T17:56:00.000Z
+- [Support for data types in different clouds](https://learn.microsoft.com/en-us/azure/sentinel/data-type-cloud-support)
+  - Updated: 2024-06-13T11:23:00.000Z → 2026-04-22T17:56:00.000Z
+- [Feature support in different clouds](https://learn.microsoft.com/en-us/azure/sentinel/feature-availability)
+  - Updated: 2025-08-21T17:11:00.000Z → 2026-04-22T17:56:00.000Z
+- [Business continuity and disaster recovery](https://learn.microsoft.com/en-us/azure/sentinel/business-continuity-disaster-recovery)
+  - Updated: 2025-07-22T12:54:00.000Z → 2026-04-22T17:56:00.000Z
+- [Enable Microsoft Sentinel and initial features and content](https://learn.microsoft.com/en-us/azure/sentinel/enable-sentinel-features-content)
+  - Updated: 2025-09-30T12:49:00.000Z → 2026-04-22T17:56:00.000Z
+- [Onboard to Microsoft Sentinel](https://learn.microsoft.com/en-us/azure/sentinel/quickstart-onboard)
+  - Updated: 2025-09-17T11:11:00.000Z → 2026-04-22T17:56:00.000Z
+- [Configure content](https://learn.microsoft.com/en-us/azure/sentinel/configure-content)
+  - Updated: 2024-07-15T03:11:00.000Z → 2026-04-22T17:56:00.000Z
+- [Set up multiple workspaces](https://learn.microsoft.com/en-us/azure/sentinel/use-multiple-workspaces)
+  - Updated: 2025-07-16T11:12:00.000Z → 2026-04-22T17:56:00.000Z
+- [Enable User and Entity Behavior Analytics (UEBA)](https://learn.microsoft.com/en-us/azure/sentinel/enable-entity-behavior-analytics)
+  - Updated: 2026-02-26T23:12:00.000Z → 2026-04-22T17:56:00.000Z
+- [Configure interactive and long-term data retention](https://learn.microsoft.com/en-us/azure/sentinel/configure-data-retention-archive)
+  - Updated: 2024-07-30T11:22:00.000Z → 2026-04-22T17:56:00.000Z
+- *...and 374 more*

## Classified Pages

| TOC Title | Type | Confidence | Reason |
|-----------|------|------------|--------|
-| [Data connector agent systemconfig.ini file reference (legacy)](https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-systemconfig) | configuration | 0.95 | Legacy configuration file reference; details specific INI keys and values used by older versions of the SAP connector agent. |
-| [Data connector agent systemconfig.json file reference](https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-systemconfig-json) | configuration | 0.95 | Explicit configuration file reference; lists settings, sections, and allowed values for systemconfig.json, which is core configuration knowledge. |
-| [Microsoft Sentinel data lake service limits](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-service-limits) | limits-quotas | 0.95 | Explicitly a service limits page; will contain numeric limits, sizes, and timeouts that are expert-only details. |
-| [ASIM DHCP schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-dhcp) | configuration | 0.90 | DHCP normalization schema reference; field definitions and usage are configuration-level expert details. |
-| [ASIM DNS schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-dns) | configuration | 0.90 | DNS normalization schema reference with specific fields for DNS events; detailed schema is expert configuration knowledge. |
-| [ASIM alert event schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-alert) | configuration | 0.90 | Reference for the Alert Events normalization schema with fields and structure for security alerts; detailed schema is expert configuration knowledge. |
-| [ASIM audit event schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-audit) | configuration | 0.90 | Defines the Audit Events normalization schema; field-level schema details are product-specific configuration. |
-| [ASIM authentication schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-authentication) | configuration | 0.90 | Describes the Authentication schema for sign-in/sign-out events; schema fields and semantics are configuration reference. |
-| [ASIM file event schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-file-event) | configuration | 0.90 | File Event normalization schema for file activity; schema fields and constraints are product-specific configuration. |
-| [ASIM network session schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-network) | configuration | 0.90 | Network Session normalization schema reference; detailed field mappings for IP network activity are configuration/schema knowledge. |
-| [ASIM process event schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-process-event) | configuration | 0.90 | Process Event normalization schema; field-level schema details are expert configuration information. |
-| [ASIM registry event schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-registry-event) | configuration | 0.90 | Registry Event normalization schema for Windows registry activity; schema fields and usage are configuration reference. |
-| [ASIM user management schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-user-management) | configuration | 0.90 | User management normalization schema for activities like creating users/groups; detailed schema is product-specific configuration. |
-| [ASIM web session schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-web) | configuration | 0.90 | Web Session normalization schema for IP web activity; field definitions and structure are configuration/schema reference. |
-| [Billing, limits, and availability](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-billing) | limits-quotas | 0.90 | Explicitly about pricing, limits, and availability; such pages typically contain numeric quotas, per-tool usage caps, and possibly region availability tables that qualify as limits & quotas. |
-| [DNS over AMA reference](https://learn.microsoft.com/en-us/azure/sentinel/dns-ama-fields) | configuration | 0.90 | Lists available fields for filtering DNS data and details the normalization schema for Windows DNS server logs; includes field names and usage, which are configuration reference details. |
-| [Data connector agent kickstart script reference](https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-kickstart) | configuration | 0.90 | Reference for kickstart deployment script parameters; will include parameter names, allowed values, and defaults, which are configuration details. |
-| [Data connector agent update script reference](https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-update) | configuration | 0.90 | Reference for update script options; contains specific parameter names and behaviors for updating the SAP connector agent. |
-| [Data connector definitions API reference](https://learn.microsoft.com/en-us/azure/sentinel/data-connector-ui-definitions-reference) | integrations | 0.90 | Reference for connectorUIConfig JSON fields for the Codeless Connector Framework; includes parameter names and structures specific to Sentinel’s Data Connector Definitions API. |
-| [GCP data connectors API reference](https://learn.microsoft.com/en-us/azure/sentinel/data-connection-rules-reference-gcp) | integrations | 0.90 | Reference for JSON fields and properties for GCP connector type and its data connection rules; product-specific integration configuration details. |
-| [Microsoft Sentinel SIEM service limits](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-service-limits) | limits-quotas | 0.90 | Explicitly a service limits article; these pages list concrete numeric limits and quotas per area, which are expert knowledge not reliably known from training. |
-| [Microsoft Sentinel provider class reference](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-provider-class-reference) | integrations | 0.90 | Class reference for MicrosoftSentinelProvider with methods and parameters; clearly an SDK integration reference with product-specific API surface and constraints. |
-| [Required ABAP permissions](https://learn.microsoft.com/en-us/azure/sentinel/sap/required-abap-authorizations) | security | 0.90 | Lists required ABAP authorizations/roles for the SAP user account; these are specific permission objects and scopes unique to this integration. |
-| [SentinelAudit table reference](https://learn.microsoft.com/en-us/azure/sentinel/audit-table-reference) | configuration | 0.90 | Reference for SentinelAudit tables with field definitions for auditing; table schema and field semantics are detailed configuration/reference information unique to Sentinel. |
-| [SentinelHealth table reference](https://learn.microsoft.com/en-us/azure/sentinel/health-table-reference) | configuration | 0.90 | Reference for SentinelHealth tables with field-level details used for health monitoring; schema/field references are product-specific configuration data not derivable from general training. |
-| [Troubleshoot CEF and Syslog via AMA](https://learn.microsoft.com/en-us/azure/sentinel/cef-syslog-ama-troubleshooting) | troubleshooting | 0.90 | Explicit troubleshooting guide for CEF/Syslog via AMA with product-specific commands, configuration checks, and symptom-to-solution mappings. |
-| [ASIM device entity](https://learn.microsoft.com/en-us/azure/sentinel/normalization-entity-device) | configuration | 0.88 | Defines Device Entity schema and prefixes (Dvc, Src, Dst) for events; field-level schema details are product-specific configuration reference. |
-| [ASIM user entity](https://learn.microsoft.com/en-us/azure/sentinel/normalization-entity-user) | configuration | 0.88 | Provides the User Entity schema with field names, roles, and prefixes (Src, Dst, Actor, Target); this is detailed schema configuration unique to ASIM. |
-| [Monitored SAP security parameters](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-suspicious-configuration-security-parameters) | security | 0.88 | Lists specific SAP static security parameters monitored by a named analytics rule and how to adjust watchlist values; highly product-specific security configuration. |
-| [Security alert schema reference](https://learn.microsoft.com/en-us/azure/sentinel/security-alert-schema) | configuration | 0.88 | Schema reference for security alerts created by analytics rules; includes field definitions and mappings, which are configuration-level expert knowledge. |
-| [CEF log field mapping](https://learn.microsoft.com/en-us/azure/sentinel/cef-name-mapping) | configuration | 0.86 | Provides explicit mapping tables between CEF field names and CommonSecurityLog fields; these mappings are detailed configuration/integration reference. |
-| [Legacy network normalization schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-v1) | configuration | 0.86 | Legacy network normalization schema reference (version 0.1) with specific fields and versioning details; schema and version behavior are configuration-level expert knowledge. |
-| [Plan roles and permissions](https://learn.microsoft.com/en-us/azure/sentinel/roles) | security | 0.86 | The page details how Microsoft Sentinel uses Azure RBAC and Microsoft Entra ID RBAC, and enumerates allowed actions for each Sentinel-specific role. This includes product-specific role names and their permission scopes, which qualify as expert security configuration knowledge. |
-| [Troubleshoot SAP data connector](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-deploy-troubleshoot) | troubleshooting | 0.86 | Explicit troubleshooting article for the SAP data connector agent; will contain specific error messages, causes, and resolutions unique to this product. |
-| [Troubleshoot solutions in Microsoft Sentinel](https://learn.microsoft.com/en-us/azure/sentinel/troubleshoot-sentinel-solutions) | troubleshooting | 0.86 | A dedicated troubleshooting guide for Sentinel solutions with product-specific diagnosis and validation steps; likely includes concrete symptoms, causes, and resolutions for data ingestion, analytics, packaging, and agent integration. |
-| [Compare analytics rules and custom detections](https://learn.microsoft.com/en-us/azure/sentinel/compare-analytics-rules-custom-detections) | decision-making | 0.85 | Explicit comparison article; likely includes feature comparison tables, capabilities, and guidance on when to use each, which is decision-making content. |
-| [Create watchlists](https://learn.microsoft.com/en-us/azure/sentinel/watchlists-create) | limits-quotas | 0.85 | Explicitly states concrete size limits for uploads (3.8 MB vs up to 500 MB for large watchlists); this is precise quota information unique to the service. |
-| [KQL using the API](https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-queries-api) | integrations | 0.85 | Explains programmatic KQL execution with REST APIs, including required permissions, request formats, and parameters; matches integrations & coding patterns with product-specific API details. |
-| [Legacy upload indicator API reference](https://learn.microsoft.com/en-us/azure/sentinel/upload-indicators-api) | integrations | 0.85 | Reference for the legacy upload indicators API with example request/response. Contains product-specific REST API details and parameters, fitting the integrations category. |
-| [Map data fields to entities](https://learn.microsoft.com/en-us/azure/sentinel/map-data-fields-to-entities) | configuration | 0.85 | Entity mapping is a Sentinel-specific configuration; page will list entity types, field mapping rules, and constraints, which are detailed config options. |
-| [Threat intelligence upload API reference](https://learn.microsoft.com/en-us/azure/sentinel/stix-objects-api) | integrations | 0.85 | Explicitly a reference for the STIX objects upload API with example requests/responses. Contains endpoint URLs, parameter names, and constraints, which are core integration details. |
-| [Troubleshoot AWS S3 connector issues](https://learn.microsoft.com/en-us/azure/sentinel/aws-s3-troubleshoot) | troubleshooting | 0.85 | Dedicated troubleshooting guide for the AWS S3 connector with Sentinel-specific diagnostics and resolution steps. |
-| [Troubleshoot KQL for the lake](https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-troubleshoot) | troubleshooting | 0.85 | Explicit troubleshooting checklist for KQL queries and jobs; likely includes specific error conditions, unsupported operators, and role/permission issues mapped to resolutions. |
-| [Troubleshoot analytics rules](https://learn.microsoft.com/en-us/azure/sentinel/troubleshoot-analytics-rules) | troubleshooting | 0.85 | Explicit troubleshooting article; will map symptoms (like AUTO DISABLED) to causes and resolutions, including Sentinel-specific error states and diagnostics. |
-| [ASIM common fields](https://learn.microsoft.com/en-us/azure/sentinel/normalization-common-fields) | configuration | 0.82 | Defines common ASIM schema fields and permitted values per schema; these field names and allowed values are product-specific configuration/schema reference. |
-| [Troubleshoot Azure Storage Blob connector issues](https://learn.microsoft.com/en-us/azure/sentinel/azure-storage-blob-connector-troubleshoot) | troubleshooting | 0.82 | Troubleshooting guide for Azure Storage Blob connector issues, likely organized by symptoms and including specific error conditions and resolutions unique to this connector, which qualifies as product-specific troubleshooting knowledge. |
-| [ASIM application entity](https://learn.microsoft.com/en-us/azure/sentinel/normalization-entity-application) | configuration | 0.80 | Displays the Application Entity schema; field definitions and structure are configuration/schema reference details. |
-| [Add advanced conditions to automation rules](https://learn.microsoft.com/en-us/azure/sentinel/add-advanced-conditions-to-automation-rules) | configuration | 0.80 | Explains advanced condition groups and OR logic in automation rules, including structure of simple vs compound conditions—detailed, product-specific configuration behavior. |
-| [Automation rules reference](https://learn.microsoft.com/en-us/azure/sentinel/automation-rule-reference) | configuration | 0.80 | Reference for automation rule configuration, supported properties, and conditions; includes specific field names and allowed values, which are configuration details. |
-| [Compare KQL jobs, summary rules, and search jobs](https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-jobs-summary-rules-search-jobs) | decision-making | 0.80 | Explicit comparison of three Sentinel features with guidance on when to use each; likely includes scenario-based recommendations and possibly capability/behavior differences, fitting decision-making. |
-| [Configure ingestion-time transformation](https://learn.microsoft.com/en-us/azure/sentinel/configure-data-transformation) | configuration | 0.80 | Explains Sentinel-specific configuration of ingestion-time transformations and Custom Log API usage, including DCR and transformation settings. |
-| [Connect threat intelligence with upload API](https://learn.microsoft.com/en-us/azure/sentinel/connect-threat-intelligence-upload-api) | integrations | 0.80 | Describes using the upload API with STIX 2.1 to send indicators; includes API parameters and integration patterns unique to Sentinel. |
-| [Creating codeless data connectors (CCF)](https://learn.microsoft.com/en-us/azure/sentinel/create-codeless-connector) | integrations | 0.80 | Shows how to define a CCF connector, including schema, configuration, and health monitoring; contains product-specific connector configuration parameters and patterns. |
-| [Creating push codeless data connectors (CCF)](https://learn.microsoft.com/en-us/azure/sentinel/create-push-codeless-connector) | integrations | 0.80 | Push CCF connectors require specific endpoint, auth, and payload configuration; this guide will include Sentinel-specific parameters and integration patterns. |
-| [Customize alert details](https://learn.microsoft.com/en-us/azure/sentinel/customize-alert-details) | configuration | 0.80 | Describes overriding default alert properties based on query results; requires specific configuration fields and expressions unique to Sentinel. |
-| [Graph REST API](https://learn.microsoft.com/en-us/azure/sentinel/datalake/graph-rest-api) | integrations | 0.80 | REST API article for listing/querying custom graphs will include endpoint paths, parameters, request/response schemas, and constraints unique to Sentinel; matches integrations & coding patterns. |
-| [Logstash plugin with Data Collection Rules](https://learn.microsoft.com/en-us/azure/sentinel/connect-logstash-data-connection-rules) | integrations | 0.80 | Describes Logstash output plugin configuration with Data Collection Rules, including Sentinel-specific pipeline and DCR parameters. |
-| [Manage workspace access with resource-context RBAC](https://learn.microsoft.com/en-us/azure/sentinel/resource-context-rbac) | security | 0.80 | Explains managing access by resource with Azure roles; such content typically lists specific RBAC roles and scope behaviors unique to Sentinel data access. |
-| [Microsoft Purview Information Protection reference](https://learn.microsoft.com/en-us/azure/sentinel/microsoft-purview-record-types-activities) | configuration | 0.80 | Lists supported audit log record types and activities for the Purview Information Protection connector and its standardized table; this is detailed configuration/schema support information. |
-| [Microsoft Sentinel graph provider reference](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-graph-provider-reference) | integrations | 0.80 | Reference for sentinel_graph class and Graph Builder API; by definition includes method signatures, parameter names, and usage patterns unique to Sentinel graph integration with Spark sessions. |
-| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/scheduled-rules-overview) | configuration | 0.80 | Describes all configuration options for scheduled rules; such pages contain rule parameters (lookback period, frequency, thresholds, suppression, etc.) with allowed values and defaults. |
-| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/watchlists) | best-practices | 0.80 | Article explicitly promises best practices and key scenarios; includes Sentinel-specific guidance on using watchlists effectively and known limitations. |
-| [SAP solution log and table reference](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-solution-log-reference) | configuration | 0.80 | Lists SAP-related logs and tables available via the connector, including which are enabled by default; this is detailed schema/configuration knowledge. |
-| [Sample API requests for creating Data Collection Rules (DCRs)](https://learn.microsoft.com/en-us/azure/sentinel/api-dcr-reference) | integrations | 0.80 | Provides concrete API request/response samples for creating DCRs and DCRAs; includes specific payload structures and parameters for Sentinel/AMA integration. |
-| [Search large datasets](https://learn.microsoft.com/en-us/azure/sentinel/search-jobs) | limits-quotas | 0.80 | Explicitly mentions using search jobs when the log query timeout of 10 minutes isn't sufficient and scanning up to a year of data—these are concrete numeric limits and behavior details. |
-| [Select target platform](https://learn.microsoft.com/en-us/azure/sentinel/migration-ingestion-target-platform) | decision-making | 0.80 | Explicitly compares target platforms on performance, cost, usability, and management in a table; classic decision matrix for platform selection with quantified trade-offs. |
-| [Set up customer-managed keys](https://learn.microsoft.com/en-us/azure/sentinel/customer-managed-keys) | security | 0.80 | Provides Sentinel-specific CMK configuration steps with Key Vault integration and scope; these are concrete security configuration details. |
-| [Surface custom details in alerts](https://learn.microsoft.com/en-us/azure/sentinel/surface-custom-details-in-alerts) | configuration | 0.80 | Explains extracting and surfacing custom details in alerts; involves rule configuration fields and mappings that are product-specific. |
-| [Troubleshooting](https://learn.microsoft.com/en-us/azure/sentinel/datalake/troubleshoot-sentinel-mcp) | troubleshooting | 0.80 | Explicitly combines best practices and troubleshooting; likely includes concrete do/don’t guidance and symptom→cause→solution steps for MCP tools, including product-specific issues. |
-| [Watchlist template schemas](https://learn.microsoft.com/en-us/azure/sentinel/watchlist-schemas) | configuration | 0.80 | Details schemas for built-in watchlist templates, which are product-specific configuration structures (column names, types, required fields). This is expert configuration knowledge not generally known from training. |
-| [Asset data tables in Microsoft Sentinel data lake](https://learn.microsoft.com/en-us/azure/sentinel/datalake/asset-data-tables) | configuration | 0.78 | A table-mapping reference for asset data in the Sentinel data lake; such schema/table mappings are configuration-level reference details specific to the product. |
-| [Azure Storage Blob data connector reference](https://learn.microsoft.com/en-us/azure/sentinel/data-connection-rules-reference-azure-storage) | configuration | 0.78 | Reference article with JSON fields and properties for the Codeless Connector Framework data connector and its data connection rules. This is product-specific configuration detail (schema, property names, structure) that an LLM would not reliably know from training. |
-| [Connect to STIX/TAXII feeds](https://learn.microsoft.com/en-us/azure/sentinel/connect-threat-intelligence-taxii) | integrations | 0.78 | Describes using the TAXII data connector to import/export threat indicators; includes protocol versions and connector-specific configuration. |
-| [Deploy the agent from the command line](https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-command-line) | deployment | 0.78 | Provides command-line deployment options for the SAP connector container; will list product-specific commands and parameters beyond generic deployment knowledge. |
-| [Deploy the agent with expert options](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-solution-deploy-alternate) | deployment | 0.78 | Describes expert/custom deployment of the SAP connector container, including on-prem and manual configurations; contains detailed deployment patterns and constraints. |
-| [Prepare your SAP environment](https://learn.microsoft.com/en-us/azure/sentinel/sap/preparing-sap) | configuration | 0.78 | Preparation steps for SAP to work with the Sentinel SAP connector will include SAP-specific configuration objects, parameter names, and required settings that are unique to this integration, beyond generic concepts. |
-| [RestApiPoller data connectors API reference](https://learn.microsoft.com/en-us/azure/sentinel/data-connector-connection-rules-reference) | configuration | 0.78 | The page is a reference for RestApiPoller data connector JSON fields and properties in the Codeless Connector Framework. It describes specific configuration parameters, their names, structure, and allowed usage for building data connection rules. This is product-specific configuration detail that an LLM is unlikely to know from training, and it aligns with the configuration sub-skill definition (parameter references and configuration structure), not limits, troubleshooting, or general concepts. |
-| [Sentinel tables and connectors](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-tables-connectors-reference) | configuration | 0.78 | Provides a mapping between Sentinel tables and the connectors that ingest them; this is product-specific configuration/integration reference. |
-| [Connect threat intelligence platforms](https://learn.microsoft.com/en-us/azure/sentinel/connect-threat-intelligence-tip) | integrations | 0.76 | Connector article for TIP/custom feeds; includes configuration details and deprecation guidance for a specific Sentinel data connector. |
-| [Enable MDTI data connector](https://learn.microsoft.com/en-us/azure/sentinel/connect-mdti-data-connector) | integrations | 0.76 | Connector article for MDTI; includes connector parameters, setup steps, and behavior specific to this integration. |
-| [Azure Functions API connection](https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-functions-template) | integrations | 0.75 | Shows how to use Azure Functions (PowerShell/Python) to call REST APIs and send logs to Sentinel; likely includes function app settings, bindings, and Sentinel-specific ingestion parameters. |
-| [Collect logs from text files via AMA](https://learn.microsoft.com/en-us/azure/sentinel/connect-custom-logs-ama) | configuration | 0.75 | Describes how to configure the Custom Logs via AMA connector, including Sentinel- and AMA-specific configuration steps for nonstandard text log formats. |
-| [Configure RDP login detection](https://learn.microsoft.com/en-us/azure/sentinel/configure-connector-login-detection) | configuration | 0.75 | Provides concrete configuration steps for the Security Events/Windows Security Events connector to enable ML-based anomalous RDP login detection in Sentinel. |
-| [Configure multistage attack (Fusion) rules](https://learn.microsoft.com/en-us/azure/sentinel/configure-fusion-rules) | configuration | 0.75 | Explicitly about creating and configuring Fusion rules; will include rule parameters, enable/disable options, and possibly data source requirements specific to Sentinel. |
-| [Create KQL jobs](https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-jobs) | configuration | 0.75 | Focuses on creating and scheduling KQL jobs; such content typically includes job configuration options (schedule, scope, output targets) that are specific settings for this product. |
-| [Create NRT analytics rules](https://learn.microsoft.com/en-us/azure/sentinel/create-nrt-rules) | configuration | 0.75 | How-to for viewing and creating NRT rules; includes rule parameters and constraints specific to Sentinel’s NRT engine. |
-| [Create automation rules](https://learn.microsoft.com/en-us/azure/sentinel/create-manage-use-automation-rules) | configuration | 0.75 | Details defining triggers, conditions, and actions for automation rules—this is a configuration-focused article with Sentinel-specific parameters and behaviors. |
-| [Enrich entities with geolocation data with REST-API](https://learn.microsoft.com/en-us/azure/sentinel/geolocation-data-api) | integrations | 0.75 | Explains enriching entities with geolocation data via REST API, which implies specific API paths, parameters, and payload formats. This is product-specific integration knowledge. |
-| [Export and import automation rules](https://learn.microsoft.com/en-us/azure/sentinel/import-export-automation-rules) | deployment | 0.75 | Describes exporting/importing automation rules as ARM templates, including workspace-independent JSON files—this is a deployment-as-code pattern specific to Sentinel automation rules. |
-| [Manage tables, tiers, and retention](https://learn.microsoft.com/en-us/azure/sentinel/manage-table-tiers-retention) | configuration | 0.75 | Step-by-step configuration of table-level retention and tier (analytics vs data lake) in Defender portal, including Sentinel-specific options and behaviors. |
-| [Microsoft Defender XDR](https://learn.microsoft.com/en-us/azure/sentinel/connect-microsoft-365-defender) | configuration | 0.75 | Details configuration of the Defender XDR connector, including incident synchronization and portal-specific settings unique to this integration. |
-| [Microsoft Entra](https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-active-directory) | configuration | 0.75 | Connector article detailing how to enable and route Entra ID sign-in, audit, and provisioning logs into Sentinel with product-specific steps. |
-| [Optimize costs with pre-purchase plan](https://learn.microsoft.com/en-us/azure/sentinel/billing-pre-purchase-plan) | decision-making | 0.75 | Describes Sentinel-specific commit units, discount tiers, and how CUs map to dollar costs; this is detailed, quantified cost-optimization and plan-selection guidance. |
-| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/near-real-time-rules) | configuration | 0.75 | NRT rules have specific configuration behaviors (frequency, latency, constraints) that are Sentinel-specific; page explains how these rules work and are configured. |
-| [Select data ingestion tool](https://learn.microsoft.com/en-us/azure/sentinel/migration-ingestion-tool) | decision-making | 0.75 | Provides a table of tools per target platform and general tools; helps choose ingestion tooling based on platform and scenario, which is decision-making guidance. |
-| [Sentinel solution quality guidelines](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solution-quality-guidance) | best-practices | 0.75 | Provides concrete recommendations and requirements for solution quality (content, performance, structure) that are specific to Sentinel solutions. |
-| [Work with out-of-the-box anomaly rules](https://learn.microsoft.com/en-us/azure/sentinel/work-with-anomaly-rules) | configuration | 0.75 | Covers creating, managing, and fine-tuning anomaly rules; involves product-specific configuration options and tuning parameters. |
-| [AWS EKS logs](https://learn.microsoft.com/en-us/azure/sentinel/connect-aws-eks) | integrations | 0.74 | Connector documentation for Sentinel typically includes product-specific integration details such as required AWS/Sentinel configuration parameters, connector settings, and log source specifics. These are concrete integration patterns and settings unique to this product, beyond generic 'how to connect' guidance. |
-| [AMA migration for Microsoft Sentinel](https://learn.microsoft.com/en-us/azure/sentinel/ama-migrate) | decision-making | 0.70 | Migration article between agents; usually includes decision guidance, compatibility notes, and stepwise migration considerations specific to Sentinel deployments. |
-| [ASIM helper functions](https://learn.microsoft.com/en-us/azure/sentinel/normalization-functions) | integrations | 0.70 | Describes ASIM helper functions that extend KQL with product-specific parameters and usage patterns for interacting with normalized data; this is an integration/coding pattern reference. |
-| [ASIM known issues](https://learn.microsoft.com/en-us/azure/sentinel/normalization-known-issues) | limits-quotas | 0.70 | Documents specific known issues and limitations of ASIM; such constraints and edge cases are expert knowledge about product behavior and limits. |
-| [AWS S3 WAF logs](https://learn.microsoft.com/en-us/azure/sentinel/connect-aws-s3-waf) | configuration | 0.70 | Connector-specific configuration for ingesting AWS WAF logs from S3 into Sentinel, including required settings unique to this integration. |
-| [AWS service logs](https://learn.microsoft.com/en-us/azure/sentinel/connect-aws) | configuration | 0.70 | Connector configuration article with Sentinel-specific setup for AWS S3-based ingestion, including method selection and trust relationship details.
| -| [Add threat intelligence in bulk by file](https://learn.microsoft.com/en-us/azure/sentinel/indicators-bulk-file-import) | configuration | 0.70 | How-to for importing indicators from CSV/JSON; likely includes file schema, required columns/fields, and Sentinel-specific parameters for bulk import, which are product-specific configuration details. | -| [Aggregate data with summary rules](https://learn.microsoft.com/en-us/azure/sentinel/summary-rules) | configuration | 0.70 | Describes how to set up summary rules, including how they write to custom tables and interact with log tiers—Sentinel-specific configuration behavior. | -| [Audit Microsoft Sentinel data lake and graph in Microsoft Purview portal](https://learn.microsoft.com/en-us/azure/sentinel/datalake/auditing-lake-activities) | security | 0.70 | Describes how Sentinel data lake and graph activities appear in the audit log and how to search them; this is product-specific auditing configuration/usage. | -| [Audit Microsoft Sentinel with Azure Activity Logs](https://learn.microsoft.com/en-us/azure/sentinel/audit-sentinel-data) | security | 0.70 | Explains how to use AzureActivity and other tables to audit Sentinel operations; includes specific table names and audit patterns for compliance. | -| [Audit and monitor the health of analytics rules](https://learn.microsoft.com/en-us/azure/sentinel/monitor-analytics-rule-integrity) | best-practices | 0.70 | Uses SentinelHealth and audit logs plus manual rerun to ensure tamper-free detection; these are concrete Sentinel-specific monitoring and governance practices. | -| [Authenticate playbooks to Microsoft Sentinel](https://learn.microsoft.com/en-us/azure/sentinel/automation/authenticate-playbooks-to-sentinel) | security | 0.70 | Focuses on how playbooks authenticate to Sentinel via connectors; such articles usually enumerate connector types, permission scopes, and auth configuration steps, which are product-specific security settings. 
| -| [Automatically enrich incident information](https://learn.microsoft.com/en-us/azure/sentinel/tutorial-enrich-ip-information) | integrations | 0.70 | Shows how to integrate Sentinel automation rules/playbooks with a threat intelligence source to check IP reputation and write back to incidents—detailed cross-service integration pattern. | -| [Automation rules](https://learn.microsoft.com/en-us/azure/sentinel/automate-incident-handling-with-automation-rules) | best-practices | 0.70 | Explains what automation rules are and how to use them to implement SOAR, providing concrete Sentinel-specific guidance on structuring automated responses. | -| [Azure Stack VMs](https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-stack) | deployment | 0.70 | Shows how to deploy Azure Monitor and related VM extensions on Azure Stack Hub VMs to start monitoring with Sentinel, a product-specific onboarding/deployment pattern. | -| [Azure Storage blob connectors](https://learn.microsoft.com/en-us/azure/sentinel/setup-azure-storage-connector) | integrations | 0.70 | The page describes setting up the Azure Storage Blob connector using the Codeless Connector Framework to ingest logs into Sentinel. Such connector setup articles typically include connector-specific configuration parameters, required settings, and validation steps unique to this integration, aligning with the integrations & coding patterns category. | -| [Best practices](https://learn.microsoft.com/en-us/azure/sentinel/best-practices-data) | best-practices | 0.70 | Explicit best-practices article for Sentinel data collection; likely includes product-specific recommendations (which connectors, which tables, retention choices) and gotchas. 
| -| [Build logic apps with Microsoft Sentinel MCP](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-logic-apps) | integrations | 0.70 | Describes using MCP tools (like entity analyzer) inside Logic Apps; such articles typically list connector actions, parameters, and constraints specific to this integration. | -| [CEF and Syslog via AMA](https://learn.microsoft.com/en-us/azure/sentinel/connect-cef-syslog-ama) | configuration | 0.70 | How-to article for configuring Syslog via AMA and CEF via AMA connectors, including product-specific connector settings and filtering behavior that go beyond generic knowledge. | -| [Collaborate in Microsoft Teams](https://learn.microsoft.com/en-us/azure/sentinel/collaborate-in-microsoft-teams) | integrations | 0.70 | Covers direct integration between Sentinel and Teams for incident collaboration, a concrete cross-service integration pattern unique to these products. | -| [Collect SAP HANA audit logs](https://learn.microsoft.com/en-us/azure/sentinel/sap/collect-sap-hana-audit-logs) | configuration | 0.70 | Explains how to collect SAP HANA audit logs; likely includes HANA-specific configuration parameters and Sentinel connector settings that are product-specific. | -| [Configure Microsoft Sentinel scoping (row-level RBAC)](https://learn.microsoft.com/en-us/azure/sentinel/scoping) | security | 0.70 | Page is specifically about configuring Microsoft Sentinel scoping (row-level RBAC). This implies product-specific security configuration, likely including scope definitions, role mappings, and permission behaviors unique to Sentinel. That aligns with the security sub-skill type, which covers RBAC role names, permission scopes, and security settings. 
| -| [Configure advanced MSTICPy settings](https://learn.microsoft.com/en-us/azure/sentinel/notebooks-msticpy-advanced) | configuration | 0.70 | Explicitly about advanced configurations for Jupyter notebooks and MSTICPy in Sentinel, implying detailed parameters and options unique to this integration. | -| [Configure interactive and long-term data retention](https://learn.microsoft.com/en-us/azure/sentinel/configure-data-retention-archive) | configuration | 0.70 | Data retention configuration article will specify retention settings, tiers, and allowed ranges for interactive vs archive data, including parameter names and values. | -| [Connect via API-based connectors](https://learn.microsoft.com/en-us/azure/sentinel/connect-services-api-based) | configuration | 0.70 | Common guidance for API-based connectors; typically includes endpoint URLs, auth settings, and connector parameters unique to Sentinel’s API-based ingestion. | -| [Connect via Windows agent-based connectors](https://learn.microsoft.com/en-us/azure/sentinel/connect-services-windows-based) | configuration | 0.70 | Describes using Azure Monitor Agent and DCRs for Windows-based connectors; includes product-specific configuration of rules and agent settings. | -| [Connect via diagnostic settings-based connectors](https://learn.microsoft.com/en-us/azure/sentinel/connect-services-diagnostic-setting-based) | configuration | 0.70 | Covers diagnostic settings-based connections; likely documents required diagnostic categories, destinations, and configuration fields specific to Sentinel connectors. | -| [Connect your SAP system](https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-data-connector-agent-container) | deployment | 0.70 | Describes how to deploy the SAP data connector container to connect SAP to Sentinel; this typically includes product-specific deployment parameters and constraints not known generically. 
| -| [Create a scheduled rule from a template](https://learn.microsoft.com/en-us/azure/sentinel/create-analytics-rule-from-template) | configuration | 0.70 | Focuses on creating rules from templates; typically includes template parameters and rule configuration fields specific to Sentinel scheduled rules. | -| [Create a scheduled rule from scratch](https://learn.microsoft.com/en-us/azure/sentinel/create-analytics-rules) | configuration | 0.70 | Covers viewing and creating scheduled rules; likely details rule configuration options and query-related parameters unique to Sentinel. | -| [Create and manage notebook jobs](https://learn.microsoft.com/en-us/azure/sentinel/datalake/notebook-jobs) | configuration | 0.70 | Describes creating scheduled notebook jobs via the Sentinel VS Code extension; likely details job configuration parameters (schedule, target tables, output locations) that are product-specific settings. | -| [Create incident tasks using automation rules](https://learn.microsoft.com/en-us/azure/sentinel/create-tasks-automation-rule) | configuration | 0.70 | Shows how to configure automation rules to create incident tasks, including how tasks are generated by rules/playbooks—Sentinel-specific configuration of workflow automation. | -| [Create incidents from Microsoft Security alerts](https://learn.microsoft.com/en-us/azure/sentinel/create-incidents-from-alerts) | configuration | 0.70 | Describes how alerts become incidents; typically includes incident rule configuration, grouping options, and mappings unique to Sentinel. | -| [Create your own custom tool](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-create-custom-tool) | integrations | 0.70 | Shows how to build custom MCP tools from saved KQL queries; likely includes tool schema, parameter definitions, and configuration fields unique to Sentinel MCP, which are expert integration details. 
| -| [Creating analytics rules](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-analytic-rules-creation) | integrations | 0.70 | Covers how to define and package analytics rules for solutions; includes Sentinel-specific rule schema, parameters, and publishing details. | -| [Creating hunting queries](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-hunting-rules-creation) | integrations | 0.70 | Guides creating and publishing hunting queries with KQL for solutions; includes Sentinel-specific packaging and metadata requirements. | -| [Creating playbooks](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-playbook-creation) | integrations | 0.70 | Shows how Sentinel playbooks (Logic Apps) are wired to incidents/alerts and packaged in solutions; includes Sentinel-specific triggers, connections, and metadata. | -| [Creating summary rules](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-summary-rules-creation) | integrations | 0.70 | Describes scheduled KQL queries that write to custom tables; includes Sentinel-specific rule configuration, table naming, and solution packaging. | -| [Creating workbooks](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-workbook-creation) | integrations | 0.70 | Workbook creation for Sentinel solutions involves specific JSON templates, parameters, and packaging rules that are product-specific. | -| [Customize repository deployments](https://learn.microsoft.com/en-us/azure/sentinel/ci-cd-custom-deploy) | deployment | 0.70 | Covers customization of repository deployments using specific files and syntax for Sentinel Repositories, a product-specific deployment configuration pattern. | -| [DNS via AMA](https://learn.microsoft.com/en-us/azure/sentinel/connect-dns-ama) | configuration | 0.70 | Product-specific configuration of the AMA DNS extension and connector to stream and filter Windows DNS logs into Sentinel. 
| -| [Data lake use cases](https://learn.microsoft.com/en-us/azure/sentinel/basic-logs-use-cases) | decision-making | 0.70 | Focuses on deciding which log sources to place in the data lake tier; such guidance is scenario-based with criteria for tier selection, fitting decision-making. | -| [Data management overview](https://learn.microsoft.com/en-us/azure/sentinel/manage-data-overview) | decision-making | 0.70 | The article focuses on managing table retention and tier options to optimize cost and security operations. This implies guidance on selecting retention periods and data tiers, likely including comparison of options and trade-offs between cost and capability, which fits decision-making around tier/retention selection. | -| [Data source schema reference](https://learn.microsoft.com/en-us/azure/sentinel/data-source-schema-reference) | configuration | 0.70 | Lists supported data source schemas with links to schema references; while high-level, it is a schema/catalog reference specific to Sentinel integrations. | -| [Data transformation using filter and split](https://learn.microsoft.com/en-us/azure/sentinel/transformation-filter-split) | configuration | 0.70 | The article describes how to configure Sentinel’s filter and split data transformations at ingestion time to route data between Analytics and Data lake tiers. It contains product-specific configuration steps and options for these transformations that go beyond generic ingestion concepts, making it expert configuration knowledge. | -| [Define an access restriction for Standard-plan playbooks](https://learn.microsoft.com/en-us/azure/sentinel/automation/define-playbook-access-restrictions) | security | 0.70 | Covers access restriction policy for Standard-plan playbooks and private endpoints; this is product-specific security configuration, likely including policy settings and scopes. 
| -| [Deploy SAP BTP](https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-sap-btp-solution) | deployment | 0.70 | Deployment article for SAP BTP solution; includes product-specific deployment steps and notes about connector version behavior. | -| [Deploy content as code from your repository](https://learn.microsoft.com/en-us/azure/sentinel/ci-cd) | deployment | 0.70 | Explains how to connect GitHub/Azure DevOps to Sentinel for automatic content deployment, including configuration parameters and constraints unique to this integration. | -| [Deployment prerequisites](https://learn.microsoft.com/en-us/azure/sentinel/sap/prerequisites-for-deploying-sap-continuous-threat-monitoring) | configuration | 0.70 | Lists detailed prerequisites for agent vs agentless SAP connectors, including environment, connector, and possibly permission requirements specific to this integration. | -| [Develop ASIM parsers](https://learn.microsoft.com/en-us/azure/sentinel/normalization-develop-parsers) | configuration | 0.70 | Detailed guidance on developing, testing, and deploying ASIM parsers, including Sentinel-specific parser structure and deployment practices. | -| [Enable SAP detections and threat protection](https://learn.microsoft.com/en-us/azure/sentinel/sap/deployment-solution-configuration) | best-practices | 0.70 | Provides best practices for configuring security content (rules, detections) for SAP in Sentinel; includes product-specific recommendations and gotchas. | -| [Enable attack disruption actions on AWS](https://learn.microsoft.com/en-us/azure/sentinel/aws-disruption) | security | 0.70 | Security-focused configuration of Sentinel to take automated actions on AWS IAM/SAML identities, with product-specific permissions and setup. 
| -| [Enable auditing and health monitoring](https://learn.microsoft.com/en-us/azure/sentinel/enable-monitoring) | configuration | 0.70 | Covers turning on the feature and using SentinelHealth and SentinelAudit tables; includes concrete configuration steps and table names. | -| [Enable network security for Azure Storage Blob data connector](https://learn.microsoft.com/en-us/azure/sentinel/enable-storage-network-security) | security | 0.70 | Step-by-step instructions to secure storage accounts with Network Security Perimeters for the Azure Storage connector. Likely includes specific Azure security configuration steps, resource associations, and rule settings that are product-specific security guidance. | -| [Entities reference](https://learn.microsoft.com/en-us/azure/sentinel/entities-reference) | configuration | 0.70 | Reference for Sentinel entity types with strong/weak identifiers is product-specific schema/configuration knowledge (field names and identifier semantics) that an LLM is unlikely to know in detail. Fits configuration because it defines how entities are structured and identified. | -| [Export and import analytics rules](https://learn.microsoft.com/en-us/azure/sentinel/import-export-analytics-rules) | deployment | 0.70 | Covers exporting/importing rules as ARM templates; this is deployment-focused with product-specific constraints and template schema details. | -| [Export historical data](https://learn.microsoft.com/en-us/azure/sentinel/migration-qradar-historical-data) | decision-making | 0.70 | Explains exporting QRadar data via REST API and recommends small time ranges; includes performance considerations and guidance on what data to migrate, which is migration decision-making. 
| -| [Extend across multiple workspaces](https://learn.microsoft.com/en-us/azure/sentinel/extend-sentinel-across-workspaces-tenants) | architecture-patterns | 0.70 | Explains how to extend Sentinel across workspaces/tenants; this is a product-specific cross-workspace/tenant architecture pattern, not generic theory. | -| [Extract incident entities with non-native actions](https://learn.microsoft.com/en-us/azure/sentinel/tutorial-extract-incident-entities) | integrations | 0.70 | Covers working with non-native entity types (malware, process, registry key, mailbox, etc.) via built-in actions in playbooks, a Sentinel-specific automation/integration pattern. | -| [Feature support in different clouds](https://learn.microsoft.com/en-us/azure/sentinel/feature-availability) | decision-making | 0.70 | Feature availability across Azure environments is presented as GA/preview/not-available tables, guiding which features can be used where and influencing environment selection and rollout decisions. | -| [Get fine-tuning recommendations](https://learn.microsoft.com/en-us/azure/sentinel/detection-tuning) | best-practices | 0.70 | Focuses on tuning rules to reduce false positives while maintaining coverage; contains Sentinel-specific tuning recommendations and possibly quantified impacts. | -| [Get started with notebooks and MSTICPy](https://learn.microsoft.com/en-us/azure/sentinel/notebook-get-started) | configuration | 0.70 | Walks through running the Getting Started notebook, including basic configurations for Sentinel notebooks and MSTICPy usage—product-specific configuration and code patterns. | -| [Google Cloud Platform connectors](https://learn.microsoft.com/en-us/azure/sentinel/connect-google-cloud-platform) | configuration | 0.70 | Describes how to set up GCP Pub/Sub-based connectors using Sentinel’s Codeless Connector Framework, including integration-specific configuration. 
| -| [Handle ingestion delay in analytics rules](https://learn.microsoft.com/en-us/azure/sentinel/ingestion-delay) | best-practices | 0.70 | Discusses handling ingestion delay for scheduled rules; likely includes concrete recommendations (for lookback windows, scheduling) and product-specific gotchas. | -| [Integrate Microsoft Defender XDR](https://learn.microsoft.com/en-us/azure/sentinel/microsoft-365-defender-sentinel-integration) | integrations | 0.70 | Explains integration behavior between Defender XDR and Sentinel, including onboarding conditions and data flow; includes product-specific integration patterns and constraints. | -| [Integrate Microsoft Defender for Cloud](https://learn.microsoft.com/en-us/azure/sentinel/ingest-defender-for-cloud-incidents) | integrations | 0.70 | Describes how Defender for Cloud incidents flow through Defender XDR into Sentinel; includes specific integration requirements and configuration options. | -| [Integrate SAP across multiple workspaces](https://learn.microsoft.com/en-us/azure/sentinel/sap/cross-workspace) | architecture-patterns | 0.70 | Discusses multi-workspace options and factors (geography, regulation, access control) for SAP with Sentinel; this is architecture guidance specific to this solution. | -| [Manage ASIM parsers](https://learn.microsoft.com/en-us/azure/sentinel/normalization-manage-parsers) | configuration | 0.70 | Provides concrete steps to add custom parsers and replace built-in ones, including parser naming and management patterns unique to Sentinel’s ASIM. | -| [Manage KQL jobs](https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-manage-jobs) | configuration | 0.70 | Jobs management page functions (enable, disable, edit, delete) typically include specific job state fields, filters, and options unique to Sentinel data lake job management. 
| -| [Manage hunting queries with REST-API](https://learn.microsoft.com/en-us/azure/sentinel/hunting-with-rest-api) | integrations | 0.70 | The page describes using the Log Analytics saved searches REST API specifically for Microsoft Sentinel hunting queries. This implies product-specific API endpoints, request/response schemas, and parameter usage that go beyond generic REST knowledge, fitting the integrations & coding patterns category best. | -| [Manage template versions for analytics rules](https://learn.microsoft.com/en-us/azure/sentinel/manage-analytics-rule-templates) | configuration | 0.70 | Explains managing relationships between templates and rules, merging updates, and reverting; involves Sentinel-specific configuration flows and options. | -| [Manage watchlists](https://learn.microsoft.com/en-us/azure/sentinel/watchlists-manage) | best-practices | 0.70 | Includes a concrete product-specific recommendation tied to ingestion behavior (edit instead of delete/recreate due to a 5-minute Log Analytics ingestion SLA and potential duplicate entries), which is a Sentinel-specific operational gotcha. | -| [Microsoft Defender for Cloud](https://learn.microsoft.com/en-us/azure/sentinel/connect-defender-for-cloud) | configuration | 0.70 | Explains how to configure the Defender for Cloud connector to stream subscription-based alerts into Sentinel, including plan and subscription-specific setup. | -| [Microsoft Purview Information Protection data](https://learn.microsoft.com/en-us/azure/sentinel/connect-microsoft-purview) | configuration | 0.70 | Connector configuration article for streaming Purview Information Protection labeling and scanner data into Sentinel, including preview-specific setup. 
| -| [Migrate agent to agentless connector](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-agent-migrate) | deployment | 0.70 | Migration guide contains product-specific, time-bound deprecation details and concrete steps for moving from a containerized agent to an agentless data connector in Microsoft Sentinel for SAP, which are deployment/operational patterns unique to this solution and not generally known from training. | -| [Modify content to use ASIM](https://learn.microsoft.com/en-us/azure/sentinel/normalization-modify-content) | configuration | 0.70 | Explains how to modify analytics rules and other content to use ASIM parsers and normalized schemas, with concrete Sentinel-specific patterns. | -| [Monitor and optimize execution of analytics rules](https://learn.microsoft.com/en-us/azure/sentinel/monitor-optimize-analytics-rule-execution) | best-practices | 0.70 | Describes execution insights and manual rerun for scheduled rules; provides Sentinel-specific tooling and patterns for testing and troubleshooting rule execution. | -| [Monitor automation rules and playbooks health](https://learn.microsoft.com/en-us/azure/sentinel/monitor-automation-health) | configuration | 0.70 | Uses SentinelHealth and AzureDiagnostics tables to monitor automation; includes concrete table names, fields, and monitoring patterns unique to Sentinel. | -| [Monitor data connector health](https://learn.microsoft.com/en-us/azure/sentinel/monitor-data-connector-health) | configuration | 0.70 | Uses SentinelHealth table and a specific workbook to track connector status; includes product-specific table names, workbook usage, and possibly KQL queries. | -| [Notebook examples for data lake exploration](https://learn.microsoft.com/en-us/azure/sentinel/datalake/notebook-examples) | integrations | 0.70 | Provides concrete code snippets using the Sentinel VS Code extension and Spark sessions; these are product-specific integration patterns for accessing particular tables and schemas. 
| -| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/cef-syslog-ama-overview) | configuration | 0.70 | Describes Syslog/CEF via AMA connectors; typically includes facility/severity filters, port settings, and AMA configuration specific to Sentinel’s Syslog/CEF ingestion. | -| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/offboard-implications) | limits-quotas | 0.70 | Describes concrete timing and retention implications when removing Sentinel from a workspace (for example, removal can take up to 48 hours and some resources are retained for a limited time), which are product-specific limits/behaviors not inferable from general knowledge. | -| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-integration-guide) | integrations | 0.70 | Lifecycle guide for building connectors, content, and integrations; likely includes solution manifest structure, required components, and Sentinel-specific integration patterns. | -| [Packaging and publishing a Sentinel platform solution](https://learn.microsoft.com/en-us/azure/sentinel/package-platform-solution) | deployment | 0.70 | Shows how to bundle data lake components and configuration into a deployable platform solution and publish it to the Security Store; this is Sentinel-specific deployment packaging. | -| [Partner integrations best practices](https://learn.microsoft.com/en-us/azure/sentinel/partner-integrations) | architecture-patterns | 0.70 | Describes how Sentinel data lake, graph, notebooks, MCP server, and connectors work together; this is a product-specific architectural pattern guide for solutions. | -| [Plan data retention](https://learn.microsoft.com/en-us/azure/sentinel/log-plans) | limits-quotas | 0.70 | Log retention tiers article typically includes concrete retention durations, tier behaviors, and possibly GB/day or cost-related thresholds, which are numeric limits and tier-specific constraints. 
|
-| [Plan your migration](https://learn.microsoft.com/en-us/azure/sentinel/migration) | decision-making | 0.70 | Migration planning content for moving from legacy SIEMs to Microsoft Sentinel typically includes phase breakdowns, option comparisons, and guidance on when to choose particular migration approaches. This is decision-focused, helping SOC teams choose migration strategies and phases rather than just describing Sentinel conceptually. |
-| [Publish solutions](https://learn.microsoft.com/en-us/azure/sentinel/publish-sentinel-solutions) | deployment | 0.70 | Details publishing flow, offer types, and requirements for Sentinel solutions in Partner Center and content hub; this is product-specific deployment to marketplace. |
-| [Reduce costs](https://learn.microsoft.com/en-us/azure/sentinel/billing-reduce-costs) | decision-making | 0.70 | Cost-reduction guidance for Sentinel usually contains product-specific recommendations (for example, which data types or features to adjust) and quantified impact on billing, which is concrete decision guidance rather than generic advice. |
-| [Respond to threats using automation](https://learn.microsoft.com/en-us/azure/sentinel/automation/tutorial-respond-threats-playbook) | best-practices | 0.70 | A concrete scenario tutorial showing how to combine automation rules and playbooks to remediate threats, including Sentinel-specific configuration and flow patterns. |
-| [SAP solution function reference](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-solution-function-reference) | configuration | 0.70 | Describes specific KQL functions added by the SAP solution; these function names and behaviors are expert, product-specific knowledge. |
-| [SIEM best practices in Microsoft Sentinel](https://learn.microsoft.com/en-us/azure/sentinel/best-practices) | best-practices | 0.70 | Explicit best practices article for deploying and managing Sentinel; likely includes product-specific recommendations and gotchas beyond generic advice. |
-| [Set up connectors for the Microsoft Sentinel data lake](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-connectors) | configuration | 0.70 | Explains setting up connectors and configuring retention per tier; likely includes connector settings and retention configuration options specific to Sentinel data lake. |
-| [Standalone vs XDR alert schema reference](https://learn.microsoft.com/en-us/azure/sentinel/security-alert-schema-differences) | decision-making | 0.70 | Explains differences in alert schema, field mappings, and ingestion behavior between standalone and XDR connectors, guiding connector selection and migration decisions. |
-| [Supported triggers and actions in playbooks](https://learn.microsoft.com/en-us/azure/sentinel/automation/playbook-triggers-actions) | integrations | 0.70 | Catalog of supported triggers/actions for the Sentinel Logic Apps connector is product-specific integration detail, typically listing operation names, parameters, and behaviors that LLMs won’t reliably know from training. |
-| [Switch to simplified pricing tiers](https://learn.microsoft.com/en-us/azure/sentinel/enroll-simplified-pricing-tier) | decision-making | 0.70 | Explains how to switch to simplified pricing tiers and the impact; such pages typically include tier names, eligibility rules, and billing behavior details that drive SKU/tier selection decisions. |
-| [UEBA data sources and table schemas](https://learn.microsoft.com/en-us/azure/sentinel/ueba-reference) | configuration | 0.70 | Lists input data sources and enrichments UEBA adds; this is a reference of product-specific fields and enrichment schema, useful as configuration/field mapping knowledge. |
-| [Use MCP connector in ChatGPT or Claude](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-chatgpt-claude-connector) | integrations | 0.70 | The article describes a product-specific integration pattern between Microsoft Sentinel and MCP-based clients (ChatGPT/Claude), including how to enable and use a custom Sentinel MCP connector. This is specialized integration knowledge that involves concrete configuration steps and parameters unique to this connector, which an LLM is unlikely to know from training. It is not just a conceptual overview but a how-to for a specific integration, fitting the integrations sub-skill. |
-| [Use SOC optimizations programmatically](https://learn.microsoft.com/en-us/azure/sentinel/soc-optimization/soc-optimization-api) | integrations | 0.70 | API-focused article for SOC optimization recommendations; likely includes endpoint shapes, parameters, and request/response details specific to Sentinel's recommendations API, which are product-specific integration details. |
-| [Windows security event sets](https://learn.microsoft.com/en-us/azure/sentinel/windows-security-event-id-reference) | configuration | 0.70 | Page defines specific, named event collection sets (for example, 'All events') and exactly which Windows Security/AppLocker event IDs and channels each set includes. This is product-specific configuration knowledge for the Windows Security Events data connector in Microsoft Sentinel, not generic concepts. |
-| [Aggregate insights from raw data into an Auxiliary table](https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-jobs) | configuration | 0.68 | KQL jobs in the Sentinel data lake are a product-specific feature; this page describes how to configure one-time and scheduled jobs, including job parameters and behaviors for promoting data from the data lake tier to the analytics tier. These configuration details (job types, scheduling options, promotion behavior) are not generic KQL knowledge and qualify as expert, product-specific configuration guidance. |
-| [Manage multiple tenants (MSSP)](https://learn.microsoft.com/en-us/azure/sentinel/multiple-tenants-service-providers) | configuration | 0.68 | Page describes how MSSPs use Azure Lighthouse to onboard and manage multiple customer tenants' Sentinel resources from a provider tenant. This involves product-specific configuration patterns (cross-tenant management model, required setup in provider vs. customer tenants, Sentinel-specific management flows) that go beyond generic concepts. While not focused on limits or error codes, it contains concrete, Sentinel- and Lighthouse-specific configuration guidance for multi-tenant SOC operations. |
-| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-get-started) | configuration | 0.68 | A 'get started' page for an MCP server typically includes concrete setup steps, configuration parameters, and tool usage details specific to Microsoft Sentinel’s MCP implementation (for example, server endpoints, tool names, and configuration fields). These are product-specific configuration details that go beyond generic LLM knowledge, but the summary doesn’t indicate limits, troubleshooting, or decision matrices. |
-| [Update the data connector agent](https://learn.microsoft.com/en-us/azure/sentinel/sap/update-sap-data-connector) | deployment | 0.68 | Covers updating the SAP data connector agent, including downtime behavior and log resumption; these are product-specific deployment/upgrade behaviors. |
-| [Deploy for Dynamics 365 Finance and Operations](https://learn.microsoft.com/en-us/azure/sentinel/dynamics-365/deploy-dynamics-365-finance-operations-solution) | deployment | 0.66 | Deployment article for Dynamics 365 F&O content within the Business Apps solution; includes product-specific deployment and connector configuration. |
-| [Deploy for Power Platform and Microsoft Dynamics 365 Customer Engagement](https://learn.microsoft.com/en-us/azure/sentinel/business-applications/deploy-power-platform-solution) | deployment | 0.66 | Describes how to deploy the Business Apps solution for Power Platform and Dynamics 365 CE; includes product-specific deployment steps and connector setup. |
-| [Monitor SAP system health and role](https://learn.microsoft.com/en-us/azure/sentinel/monitor-sap-system-health) | configuration | 0.66 | Describes using connector page and alert rule template to monitor SAP connectivity; includes product-specific rule configuration and monitoring patterns. |
-| [Stop SAP data collection](https://learn.microsoft.com/en-us/azure/sentinel/sap/stop-collection) | configuration | 0.66 | Provides specific steps to stop ingestion and disable the SAP connector; includes product-specific configuration actions. |
-| [Workspaces in the Defender portal](https://learn.microsoft.com/en-us/azure/sentinel/workspaces-defender-portal) | configuration | 0.66 | The article covers how to connect a primary and multiple secondary Sentinel workspaces in the Defender portal, including the specific behavior and constraints when onboarding Sentinel with Defender XDR. These workspace-relationship and portal-configuration details are product-specific expert configuration knowledge. |
-| [Agent creation](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-agent-creation-tool) | integrations | 0.65 | Covers a specific MCP tool collection for creating Security Copilot agents; likely documents tool names, inputs, and configuration patterns unique to this product, fitting integrations & coding patterns. |
-| [All Microsoft Sentinel SIEM solutions](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-catalog) | decision-making | 0.65 | Catalog article lists domain-specific solutions and helps choose which to deploy; effectively a decision aid for solution selection based on use case and domain. |
-| [Azure Virtual Desktop](https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-virtual-desktop) | configuration | 0.65 | Describes Sentinel-specific connector configuration for Azure Virtual Desktop monitoring, beyond generic logging concepts. |
-| [Bring your own machine learning](https://learn.microsoft.com/en-us/azure/sentinel/bring-your-own-ml) | architecture-patterns | 0.65 | Describes the BYO ML platform and how to create/use ML algorithms for Sentinel data analysis, which is a product-specific architecture/pattern for ML-based detections. |
-| [Build queries or rules](https://learn.microsoft.com/en-us/azure/sentinel/watchlists-queries) | integrations | 0.65 | Describes product-specific KQL usage with watchlists, including using the SearchKey column for optimal joins and built-in operators like join and lookup. These are concrete, Sentinel-specific query patterns that go beyond generic KQL knowledge. |
-| [Cisco firewalls](https://learn.microsoft.com/en-us/azure/sentinel/cisco-ftd-firewall) | decision-making | 0.65 | Explains when to use each of two Sentinel connectors for Cisco Secure Firewall (FTD vs ASA) and links to their configuration, providing product-specific selection guidance. |
-| [Connect Microsoft Sentinel to AWS](https://learn.microsoft.com/en-us/azure/sentinel/connect-aws-configure-environment) | configuration | 0.65 | Outlines AWS-side configuration required for Sentinel connectors, including environment setup steps that are specific to this integration. |
-| [Create custom entity activities](https://learn.microsoft.com/en-us/azure/sentinel/customize-entity-activities) | configuration | 0.65 | Adding customized activities implies configuring which events appear on timelines, with Sentinel-specific activity types and mapping options. |
-| [Deploy side-by-side](https://learn.microsoft.com/en-us/azure/sentinel/deploy-side-by-side) | decision-making | 0.65 | Side-by-side deployment guidance compares approaches and methods for coexisting with an existing SIEM, providing scenario-based recommendations and trade-offs. |
-| [Export historical data](https://learn.microsoft.com/en-us/azure/sentinel/migration-arcsight-historical-data) | decision-making | 0.65 | Describes multiple export methods depending on data volumes and environment; helps choose export approach, which is migration-related decision-making with scenario-based criteria. |
-| [Export historical data](https://learn.microsoft.com/en-us/azure/sentinel/migration-splunk-historical-data) | decision-making | 0.65 | Describes several export methods with guidance based on data volume and interactivity; this is migration decision-making about which export path to choose. |
-| [GQL reference for Sentinel custom graph](https://learn.microsoft.com/en-us/azure/sentinel/datalake/gql-reference-for-sentinel-custom-graph) | integrations | 0.65 | A language reference for Sentinel’s GQL with product-specific operators, functions, and syntax details that go beyond generic graph query knowledge; fits integrations & coding patterns as it defines query API surface and parameters. |
-| [Install the solution for SAP applications](https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-sap-security-content) | deployment | 0.65 | Describes installing SAP solution from content hub, including SAP data connector and content deployment steps and constraints, which are product-specific deployment details. |
-| [Integrate Microsoft Purview](https://learn.microsoft.com/en-us/azure/sentinel/purview-solution) | configuration | 0.65 | Describes how to enable the Purview data connector and solution in Sentinel to surface data sensitivity insights, with Sentinel-specific configuration steps. |
-| [Integrate with unified connectors](https://learn.microsoft.com/en-us/azure/sentinel/unified-connector-integration) | configuration | 0.65 | How-to for connecting via unified connectors; likely includes connector configuration fields and options specific to the Unified Connectors Platform and Sentinel. |
-| [Manage your intellectual property in Microsoft Sentinel](https://learn.microsoft.com/en-us/azure/sentinel/mssp-protect-intellectual-property) | best-practices | 0.65 | Covers concrete methods for packaging and isolating analytics rules, queries, and playbooks per customer purchasing model; these are Sentinel-specific operational best practices. |
-| [Managing Platform Solutions](https://learn.microsoft.com/en-us/azure/sentinel/manage-platform-solutions) | configuration | 0.65 | Explains how to configure, update, and uninstall components installed by a platform solution; includes product-specific management locations and settings. |
-| [Microsoft Defender XDR connector data type support](https://learn.microsoft.com/en-us/azure/sentinel/microsoft-365-defender-cloud-support) | deployment | 0.65 | Describes which Microsoft Defender XDR connector data types are supported in Microsoft Sentinel across Commercial, GCC, GCC-High, and DoD clouds. This is environment- and SKU-specific capability/support matrix information that an LLM is unlikely to know from training and is used for planning deployments across sovereign clouds. |
-| [Microsoft Sentinel platform deployment overview](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-onboarding) | configuration | 0.65 | Onboarding to Sentinel data lake/graph includes tenant-wide configuration steps and likely specific settings and constraints unique to this platform capability. |
-| [Microsoft Sentinel solution setup essentials](https://learn.microsoft.com/en-us/azure/sentinel/solution-setup-essentials) | configuration | 0.65 | Covers required setup and configuration for solution types; likely lists specific resource types, permissions, and configuration steps unique to Sentinel solutions. |
-| [Migrate alert playbooks to automation rules](https://learn.microsoft.com/en-us/azure/sentinel/automation/migrate-playbooks-to-automation-rules) | decision-making | 0.65 | Explains why and how to migrate from alert-trigger to automation-rule invocation; this is migration/choice guidance between approaches, likely with scenario-based recommendations, fitting decision-making. |
-| [Onboard to data lake and graph from Defender portal](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-onboard-defender) | configuration | 0.65 | Describes onboarding flow from Defender portal with subscription selection and data lake creation specifics, including constraints like single data lake per tenant. |
-| [Optimize your security operations](https://learn.microsoft.com/en-us/azure/sentinel/soc-optimization/soc-optimization-access) | decision-making | 0.65 | Provides Sentinel SOC optimization recommendations to balance data coverage and ingestion cost, guiding operational decisions rather than just configuration. |
-| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/automation/automation) | architecture-patterns | 0.65 | Explains Sentinel’s SOAR components and how they work together to handle alert volume, providing product-specific orchestration patterns and when to use each component. |
-| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/health-audit) | configuration | 0.65 | Describes Sentinel health/audit feature, likely including specific tables, settings, and configuration options for monitoring service health and user actions. |
-| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/soc-ml-anomalies) | configuration | 0.65 | Describes new anomaly detection capabilities; likely includes rule parameters, tuning options, and configuration fields unique to Sentinel anomalies. |
-| [Remediate threats while investigating](https://learn.microsoft.com/en-us/azure/sentinel/respond-threats-during-investigation) | integrations | 0.65 | Shows how to use playbooks with the entity trigger and lists supported entity types—this is a Sentinel-specific integration pattern between incidents/hunts and automation. |
-| [Run KQL queries](https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-queries) | configuration | 0.65 | Describes the KQL queries page, creating jobs, and managing them; likely includes UI- or API-level options and parameters for jobs and queries, which are product-specific configuration details. |
-| [Run notebooks](https://learn.microsoft.com/en-us/azure/sentinel/datalake/notebooks) | integrations | 0.65 | Covers running notebooks with Python/Spark against Sentinel data lake; likely includes connection configuration, libraries, and code patterns unique to this integration. |
-| [Sentinel SIEM solution lifecycle post publish](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-post-publish-tracking) | deployment | 0.65 | Explains validation stages and status states for Sentinel offers in Partner Center and content hub; these are specific post-deployment behaviors. |
-| [Set up federated tables](https://learn.microsoft.com/en-us/azure/sentinel/datalake/data-federation-setup) | configuration | 0.65 | Setup article for federated connectors (Databricks, ADLS Gen2, Fabric) is likely to include connector-specific configuration parameters, connection settings, and possibly tables of required values unique to Sentinel data lake. |
-| [Support for data types in different clouds](https://learn.microsoft.com/en-us/azure/sentinel/data-type-cloud-support) | decision-making | 0.65 | Describes which connector data types are supported in which cloud environments (commercial vs GCC variants), effectively a capability matrix used for deployment and connector selection decisions. |
-| [Triage](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-triage-tool) | integrations | 0.65 | Describes MCP triage collection integrating AI models with Sentinel APIs; such content typically includes tool operations and parameters specific to Sentinel triage workflows, matching integrations. |
-| [Update SOC processes](https://learn.microsoft.com/en-us/azure/sentinel/migration-security-operations-center-processes) | best-practices | 0.65 | Focuses on updating SOC processes as part of Sentinel migration; likely includes Sentinel-specific operational recommendations and workflows, which are best practices beyond generic SOC theory. |
-| [Use Sentinel MCP tools in Copilot Studio](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-copilot-studio) | configuration | 0.65 | Shows how to add Sentinel MCP tools or custom tools into Copilot Studio; prerelease but still focused on concrete configuration steps and parameters. |
-| [Use Sentinel MCP tools in Microsoft Foundry](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-azure-ai-foundry) | configuration | 0.65 | Explains how to add Sentinel MCP tools or custom tools into Microsoft Foundry; likely includes specific configuration fields and values for tool registration, which is configuration detail. |
-| [Use Sentinel MCP tools in Security Copilot](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-security-copilot) | configuration | 0.65 | Describes how to add Sentinel MCP tools or custom tools into Security Copilot; likely includes configuration fields and values for registering tools, fitting configuration/integration patterns. |
-| [Use the SIEM migration experience](https://learn.microsoft.com/en-us/azure/sentinel/siem-migration) | decision-making | 0.65 | Describes a tool that analyzes Splunk/QRadar detections and recommends Sentinel rules and connectors; provides concrete migration recommendations and status tracking, which is decision-making support. |
-| [Work with STIX objects and indicators](https://learn.microsoft.com/en-us/azure/sentinel/work-with-stix-objects-indicators) | integrations | 0.65 | Provides concrete examples of incorporating STIX objects into Kusto queries and migrating to new STIX-based schemas; this is a product-specific integration pattern with schema/field details. |
-| [Work with incidents in multiple workspaces](https://learn.microsoft.com/en-us/azure/sentinel/multiple-workspace-view) | architecture-patterns | 0.65 | Describes multi-workspace incident view and when to use it; this is a Sentinel-specific pattern for multi-workspace incident management. |
-| [Workspace manager in the Azure portal](https://learn.microsoft.com/en-us/azure/sentinel/workspace-manager) | architecture-patterns | 0.65 | Describes central management of multiple workspaces and supported content types; this is a Sentinel-specific operational architecture pattern for large or MSSP environments. |
-| [Data exploration](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-data-exploration-tool) | configuration | 0.64 | A page describing a specific 'data exploration tool collection' in an MCP server is likely to enumerate tool names, parameters, and how to invoke them to search tables and retrieve data. Those are product-specific configuration/integration details for the Sentinel MCP tools, not just conceptual overview, and best match the configuration sub-skill among the given options. |
-| [Monitor SAP audit controls](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-audit-controls-workbook) | configuration | 0.64 | Describes a specific workbook for SAP security control compliance; includes Sentinel query/visualization details tied to SAP data. |
-| [Monitor SAP audit logs](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-audit-log-workbook) | configuration | 0.64 | Workbook article will include specific queries, fields, and visualizations tied to SAP audit logs in Sentinel, which are product-specific usage patterns. |
-| [Monitor costs](https://learn.microsoft.com/en-us/azure/sentinel/billing-monitor-costs) | decision-making | 0.64 | The page explains how to use Azure Cost Management and Defender portal usage views specifically for Sentinel, including how to interpret and act on Sentinel-related cost data. It provides product-specific guidance for monitoring and managing Sentinel costs, which supports cost-related decision making and is not generic billing information. |
-| [Plan costs](https://learn.microsoft.com/en-us/azure/sentinel/billing) | decision-making | 0.64 | The article is about planning costs and understanding pricing and billing. Microsoft cost-planning pages typically include tiered pricing details, data ingestion/retention cost structures, and guidance on estimating costs using calculators—information used to choose plans and manage spend, which fits decision-making with product-specific, quantitative trade-offs. |
-| [Threat intelligence integrations](https://learn.microsoft.com/en-us/azure/sentinel/threat-intelligence-integration) | configuration | 0.64 | Describes specific ways to integrate threat intelligence feeds, including workspace strategies and connector usage; contains product-specific configuration guidance. |
-| [Dynamics 365 Finance and Operations security content reference](https://learn.microsoft.com/en-us/azure/sentinel/dynamics-365/dynamics-365-finance-operations-security-content) | configuration | 0.62 | Details built-in security content for Dynamics 365 F&O; these analytics rules and workbooks are specific to this solution. |
-| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/ci-cd-custom-content) | deployment | 0.62 | The page explains managing Sentinel custom content via external GitHub/Azure DevOps repositories for CI/CD. Such articles usually include Sentinel-specific deployment behaviors and constraints for content-as-code and detections-as-code, which are product-specific deployment patterns rather than generic Git usage. |
-| [Power Platform and Microsoft Dynamics 365 Customer Engagement security content reference](https://learn.microsoft.com/en-us/azure/sentinel/business-applications/power-platform-solution-security-content) | configuration | 0.62 | Lists built-in security content for the Power Platform solution; these rules and workbooks are specific artifacts not known generically. |
-| [SAP security content reference](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-solution-security-content) | configuration | 0.62 | Security content reference lists built-in workbooks, analytics rules, and watchlists for SAP; these are concrete, product-specific artifacts. |
-| [Security content reference](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-btp-security-content) | configuration | 0.62 | Details built-in workbook and analytics rules for SAP BTP; these are concrete, product-specific security content artifacts. |
-| [ASIM-based domain solutions](https://learn.microsoft.com/en-us/azure/sentinel/domain-based-essential-solutions) | best-practices | 0.60 | Describes Microsoft essential solutions using ASIM schemas with product-specific normalization and content usage guidance, representing Sentinel-specific best practices. |
-| [Audit and track changes to incident tasks](https://learn.microsoft.com/en-us/azure/sentinel/audit-track-tasks) | best-practices | 0.60 | Describes how SOC managers audit and track task history to measure SOC efficiency, a Sentinel-specific operational practice rather than generic advice. |
-| [Business continuity and disaster recovery](https://learn.microsoft.com/en-us/azure/sentinel/business-continuity-disaster-recovery) | architecture-patterns | 0.60 | BCDR recommendations for availability zones and cross-region DR are Sentinel-specific architecture patterns with concrete guidance on when to use which resiliency options. |
-| [Connect Microsoft Sentinel to Microsoft connectors](https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-windows-microsoft-services) | configuration | 0.60 | Describes specific methods for connecting to Azure/M365/Windows logs; likely includes configuration options and prerequisites for each connection method. |
-| [Create a Power BI report](https://learn.microsoft.com/en-us/azure/sentinel/powerbi) | integrations | 0.60 | Describes exporting queries from Sentinel to Power BI; such pages typically include connection strings, query export formats, and dataset configuration specific to Sentinel-Power BI integration. |
-| [Create custom graphs](https://learn.microsoft.com/en-us/azure/sentinel/datalake/create-custom-graphs) | integrations | 0.60 | Step-by-step guide using Jupyter notebooks and Sentinel VS Code extension to create graphs; likely includes API/SDK usage, class/method names, and parameter patterns specific to Sentinel graph integration with Spark/Notebooks. |
-| [Create custom query](https://learn.microsoft.com/en-us/azure/sentinel/hunts-custom-queries) | integrations | 0.60 | Covers creating custom hunting queries against Sentinel data sources. While KQL is generic, the patterns and usage in Sentinel hunting are product-specific integration/query patterns. |
-| [Deploy out-of-the-box content](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-deploy) | deployment | 0.60 | Describes how to deploy packaged solutions from Content hub; likely includes which content types can be deployed where and constraints around deployment to workspaces. |
-| [Enable User and Entity Behavior Analytics (UEBA)](https://learn.microsoft.com/en-us/azure/sentinel/enable-entity-behavior-analytics) | configuration | 0.60 | UEBA enablement article will include specific configuration options for data sources and feature toggles unique to Sentinel UEBA, representing product-specific configuration knowledge. |
-| [Geographical availability and data residency](https://learn.microsoft.com/en-us/azure/sentinel/geographical-availability-data-residency) | decision-making | 0.60 | Geographical availability and data residency article provides environment-specific availability and compliance-driven deployment choices, supporting architecture and region selection decisions. |
-| [Handle false positives](https://learn.microsoft.com/en-us/azure/sentinel/false-positives) | best-practices | 0.60 | Focuses on concrete Sentinel-specific ways to handle false positives (automation rules, modifying analytics rules, specifying exceptions). These are product-specific operational patterns rather than generic theory. |
-| [Launch Jupyter notebook](https://learn.microsoft.com/en-us/azure/sentinel/notebooks-hunt) | deployment | 0.60 | Covers creating an Azure ML workspace, launching notebooks from Sentinel into that workspace, and running code—this is a concrete deployment/integration pattern for notebooks in production-like environments. |
-| [Manage solution deprecation lifecycle](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solution-deprecation) | best-practices | 0.60 | Lifecycle management of deprecated solutions includes specific recommended actions and patterns for handling unsupported content, which are Sentinel-specific operational best practices. |
-| [Manage your SOC with incident metrics](https://learn.microsoft.com/en-us/azure/sentinel/manage-soc-with-incident-metrics) | best-practices | 0.60 | Uses Sentinel’s incident metrics and workbook with specific measures (MTTR, severity, MITRE tactics) to manage SOC; this is product-specific operational guidance. |
-| [Microsoft Sentinel SIEM experience in Defender portal](https://learn.microsoft.com/en-us/azure/sentinel/microsoft-sentinel-defender-portal) | decision-making | 0.60 | Explains Sentinel experience in Defender portal, including which capabilities are available in which context and how to choose/transition, aiding portal and integration decisions. |
-| [Migrate SOAR automation](https://learn.microsoft.com/en-us/azure/sentinel/migration-arcsight-automation) | decision-making | 0.60 | Guides how to identify SOAR use cases and migrate ArcSight automation to Sentinel automation rules/playbooks; provides migration choices and mapping guidance. |
-| [Migrate SOAR automation](https://learn.microsoft.com/en-us/azure/sentinel/migration-qradar-automation) | decision-making | 0.60 | Discusses identifying SOAR use cases and migrating QRadar automation to Sentinel; provides migration strategy and mapping guidance. |
-| [Migrate SOAR automation](https://learn.microsoft.com/en-us/azure/sentinel/migration-splunk-automation) | decision-making | 0.60 | Explains how to identify SOAR use cases and migrate Splunk automation to Sentinel; provides mapping and approach choices, which is decision-making content. |
-| [Migrate detection rules](https://learn.microsoft.com/en-us/azure/sentinel/migration-arcsight-detection-rules) | decision-making | 0.60 | Focuses on identifying, comparing, and migrating ArcSight rules to Sentinel built-ins; this is migration decision guidance between rule types and mappings. |
-| [Migrate detection rules](https://learn.microsoft.com/en-us/azure/sentinel/migration-qradar-detection-rules) | decision-making | 0.60 | Guides identification, comparison, and migration of QRadar rules to Sentinel built-ins; supports migration decisions and mappings. |
-| [Migrate detection rules](https://learn.microsoft.com/en-us/azure/sentinel/migration-splunk-detection-rules) | decision-making | 0.60 | Covers identifying, comparing, and migrating Splunk rules to Sentinel; includes guidance on when to use the SIEM migration experience, fitting migration decision-making. |
-| [Monitor Zero Trust](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solution) | best-practices | 0.60 | Shows how to use the Zero Trust (TIC 3.0) solution, including specific workbooks, analytics, and mappings to TIC controls, which are concrete product-specific usage patterns. |
-| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/incident-tasks) | best-practices | 0.60 | Focuses on using incident tasks to standardize triage/investigation workflows, providing concrete product-specific guidance for SOC process implementation. |
-| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/investigate-large-datasets) | architecture-patterns | 0.60 | Explains when and how to use search jobs vs other methods for long-term retention data, a Sentinel-specific investigation pattern tied to data states. |
-| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/notebooks) | architecture-patterns | 0.60 | Explains how Sentinel’s data store and API integrate with Jupyter notebooks for hunting, a product-specific investigation pattern that goes beyond generic notebook usage. |
-| [Remove Microsoft Sentinel from your workspaces](https://learn.microsoft.com/en-us/azure/sentinel/offboard) | configuration | 0.60 | Provides concrete steps and implications for offboarding Sentinel from a workspace, which are product-specific configuration operations. |
-| [Restore historical data](https://learn.microsoft.com/en-us/azure/sentinel/restore) | configuration | 0.60 | Describes restoring archived logs from search job results, which involves Sentinel-specific configuration/operations around data states and query performance. |
-| [SIEM operations guide](https://learn.microsoft.com/en-us/azure/sentinel/ops-guide) | best-practices | 0.60 | An operational guide listing recommended SOC activities for Sentinel; likely contains product-specific operational DOs/DON’Ts and schedules, which are best-practice guidance beyond generic security ops theory. |
-| [Use tasks to handle incident workflow](https://learn.microsoft.com/en-us/azure/sentinel/work-with-tasks) | best-practices | 0.60 | Explains how analysts should use incident tasks created by automation rules/playbooks or manually, giving concrete operational patterns specific to Sentinel. |
-| [Workbooks for Microsoft Sentinel data lake](https://learn.microsoft.com/en-us/azure/sentinel/datalake/workbooks-for-data-lake) | configuration | 0.60 | Describes selecting Sentinel data lake as a workbook data source and running queries; likely includes workbook data source configuration and query binding details specific to Sentinel. |
-| [Workspace deployed parsers](https://learn.microsoft.com/en-us/azure/sentinel/normalization-about-workspace-parsers) | configuration | 0.60 | Describes management and usage of workspace-deployed ASIM parsers, including Sentinel-specific parser deployment and configuration behavior. |
+| [Data connector agent systemconfig.ini file reference (legacy)](https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-systemconfig) | configuration | 0.95 | Legacy configuration file reference for systemconfig.ini with sections and options is clearly configuration-focused expert knowledge. |
+| [Data connector agent systemconfig.json file reference](https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-systemconfig-json) | configuration | 0.95 | Explicit configuration file reference listing settings and their meanings for systemconfig.json, matching configuration criteria. |
+| [Microsoft Sentinel SIEM service limits](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-service-limits) | limits-quotas | 0.95 | Explicitly a service-limits article; will list concrete numeric limits and quotas for Sentinel features, which are expert numeric constraints not reliably known from training. |
+| [Microsoft Sentinel data lake service limits](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-service-limits) | limits-quotas | 0.95 | Explicit service limits article; will list numerical limits, quotas, and constraints for the data lake service. |
+| [Azure Storage Blob data connector reference](https://learn.microsoft.com/en-us/azure/sentinel/data-connection-rules-reference-azure-storage) | configuration | 0.90 | Defines JSON fields and properties for Azure Storage Blob connectors, which are precise configuration references. |
+| [Data connector agent kickstart script reference](https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-kickstart) | configuration | 0.90 | Reference for kickstart deployment script parameters is a configuration reference with specific parameter names and behaviors. |
+| [Data connector agent update script reference](https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-update) | configuration | 0.90 | Describes configurable parameters of the update script; this is a configuration reference with product-specific options. |
+| [Data connector definitions API reference](https://learn.microsoft.com/en-us/azure/sentinel/data-connector-ui-definitions-reference) | configuration | 0.90 | Reference for connectorUIConfig JSON with specific fields and allowed values, a direct configuration schema for connector UIs. |
+| [GCP data connectors API reference](https://learn.microsoft.com/en-us/azure/sentinel/data-connection-rules-reference-gcp) | configuration | 0.90 | Reference for GCP connector JSON schema and connection rules, with detailed parameter definitions. |
+| [Monitored SAP security parameters](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-suspicious-configuration-security-parameters) | security | 0.90 | Lists specific SAP static security parameters monitored by a particular analytics rule and how to adjust watchlists; clearly security configuration knowledge. |
+| [Plan roles and permissions](https://learn.microsoft.com/en-us/azure/sentinel/roles) | security | 0.90 | Explicitly about roles and permissions; will list Sentinel-specific RBAC roles and allowed actions, which are product-specific security configuration details. |
+| [Required ABAP permissions](https://learn.microsoft.com/en-us/azure/sentinel/sap/required-abap-authorizations) | security | 0.90 | Lists required ABAP authorizations/roles by purpose; these are specific permission objects and scopes, matching the security category. |
+| [RestApiPoller data connectors API reference](https://learn.microsoft.com/en-us/azure/sentinel/data-connector-connection-rules-reference) | configuration | 0.90 | Provides reference JSON fields and properties for RestApiPoller connectors, including parameter names and constraints. |
+| [Troubleshoot CEF and Syslog via AMA](https://learn.microsoft.com/en-us/azure/sentinel/cef-syslog-ama-troubleshooting) | troubleshooting | 0.90 | Explicit troubleshooting guide; will map ingestion issues to causes and fixes, include diagnostic commands and log locations—classic symptom→cause→solution expert content. |
+| [Troubleshoot SAP data connector](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-deploy-troubleshoot) | troubleshooting | 0.90 | Explicit troubleshooting article for the SAP data connector agent; will contain specific error messages, causes, and resolutions unique to this connector. |
+| [Billing, limits, and availability](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-billing) | limits-quotas | 0.88 | Pricing and limits article; expected to list concrete numerical limits, quotas, and availability constraints for MCP tool usage. |
+| [CEF log field mapping](https://learn.microsoft.com/en-us/azure/sentinel/cef-name-mapping) | configuration | 0.88 | Contains detailed mapping tables between CEF keys and Sentinel fields, which are precise integration/configuration references.
| +| [ASIM process event schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-process-event) | configuration | 0.86 | Schema reference with detailed, product-specific field names, types, and usage for normalized process events, which are configuration-level details not inferable from general training. | +| [ASIM registry event schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-registry-event) | configuration | 0.86 | Normalization schema reference for Windows registry events with concrete field definitions and mappings, representing detailed configuration knowledge. | +| [ASIM user management schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-user-management) | configuration | 0.86 | Defines the Sentinel user management normalization schema with specific fields and structure, which is product-specific configuration reference. | +| [ASIM web session schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-web) | configuration | 0.86 | Web Session normalization schema reference with concrete field layout for IP network activity, a detailed configuration artifact. | +| [DNS over AMA reference](https://learn.microsoft.com/en-us/azure/sentinel/dns-ama-fields) | configuration | 0.86 | Lists available filter fields and the normalized schema for DNS logs, including field names and semantics, which are configuration details. | +| [Threat intelligence upload API reference](https://learn.microsoft.com/en-us/azure/sentinel/stix-objects-api) | integrations | 0.86 | API reference with example requests/responses for uploading STIX objects, including parameter names and constraints unique to this API. 
| +| [Troubleshoot Azure Storage Blob connector issues](https://learn.microsoft.com/en-us/azure/sentinel/azure-storage-blob-connector-troubleshoot) | troubleshooting | 0.86 | Troubleshooting guide mapping connector symptoms to causes and resolutions; likely includes specific error messages and diagnostic steps. | +| [Compare KQL jobs, summary rules, and search jobs](https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-jobs-summary-rules-search-jobs) | decision-making | 0.85 | Explicit comparison of three features with use cases; likely includes decision criteria and possibly thresholds for selecting the right mechanism. | +| [Configure Microsoft Sentinel scoping (row-level RBAC)](https://learn.microsoft.com/en-us/azure/sentinel/scoping) | security | 0.85 | Row-level RBAC configuration; likely documents specific scope definitions, role interactions, and permission behaviors unique to Sentinel. | +| [Connect threat intelligence with upload API](https://learn.microsoft.com/en-us/azure/sentinel/connect-threat-intelligence-upload-api) | integrations | 0.85 | Explains how to use the Upload API with STIX/TAXII to send indicators; includes API parameters and constraints specific to Sentinel’s TIP integration. | +| [Select target platform](https://learn.microsoft.com/en-us/azure/sentinel/migration-ingestion-target-platform) | decision-making | 0.85 | Explicitly compares Azure target platforms for historical data in terms of performance, cost, usability, and management; a clear decision matrix for platform selection. | +| [SentinelAudit table reference](https://learn.microsoft.com/en-us/azure/sentinel/audit-table-reference) | configuration | 0.85 | Audit table reference with field definitions; provides detailed schema and usage guidance specific to Sentinel audit logging. 
| +| [SentinelHealth table reference](https://learn.microsoft.com/en-us/azure/sentinel/health-table-reference) | configuration | 0.85 | Health table reference with field descriptions; schema and field semantics are detailed configuration knowledge unique to Sentinel. | +| [Troubleshoot AWS S3 connector issues](https://learn.microsoft.com/en-us/azure/sentinel/aws-s3-troubleshoot) | troubleshooting | 0.85 | Explicit troubleshooting for AWS S3 connector; will map symptoms to causes and fixes, possibly with error messages and diagnostic steps—expert troubleshooting content. | +| [Troubleshoot solutions in Microsoft Sentinel](https://learn.microsoft.com/en-us/azure/sentinel/troubleshoot-sentinel-solutions) | troubleshooting | 0.85 | Explicit troubleshooting guide for data ingestion, analytics, packaging, and agent integration; likely includes error messages, causes, and resolution steps unique to Sentinel solutions. | +| [Security alert schema reference](https://learn.microsoft.com/en-us/azure/sentinel/security-alert-schema) | configuration | 0.84 | Schema reference for security alerts with detailed field definitions and mappings, which are product-specific configuration details. | +| [Legacy upload indicator API reference](https://learn.microsoft.com/en-us/azure/sentinel/upload-indicators-api) | integrations | 0.82 | Legacy API reference with concrete request/response schema for uploading indicators, which is detailed integration knowledge. | +| [Troubleshoot KQL for the lake](https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-troubleshoot) | troubleshooting | 0.82 | Checklist for resolving KQL query and job issues; includes specific causes (permissions, workspaces, operators) and corrective actions. | +| [Watchlist template schemas](https://learn.microsoft.com/en-us/azure/sentinel/watchlist-schemas) | configuration | 0.82 | Details the schemas for each watchlist template, including column names and types, which are configuration references. 
| +| [ASIM DHCP schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-dhcp) | configuration | 0.80 | DHCP information model schema; detailed field-level configuration for normalized DHCP events. | +| [ASIM DNS schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-dns) | configuration | 0.80 | DNS information model schema; defines normalized DNS event fields and usage, which is expert configuration knowledge. | +| [ASIM alert event schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-alert) | configuration | 0.80 | Alert Events schema reference; defines fields and structures for normalized alerts, which is detailed configuration knowledge. | +| [ASIM asset entity schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-asset) | configuration | 0.80 | Schema reference for Asset Entity; includes field names, types, and usage rules that are detailed configuration knowledge. | +| [ASIM audit event schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-audit) | configuration | 0.80 | Audit Events schema reference; includes field definitions and semantics for normalized audit logs. | +| [ASIM authentication schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-authentication) | configuration | 0.80 | Authentication schema reference; documents fields and event types for normalized authentication events. | +| [ASIM common fields](https://learn.microsoft.com/en-us/azure/sentinel/normalization-common-fields) | configuration | 0.80 | Describes common schema fields and permitted values per schema; detailed field-level configuration for ASIM. | +| [ASIM device entity](https://learn.microsoft.com/en-us/azure/sentinel/normalization-entity-device) | configuration | 0.80 | Device Entity schema reference with Dvc/Src/Dst semantics; detailed field-level configuration for normalized device data. 
| +| [ASIM file event schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-file-event) | configuration | 0.80 | File Event schema reference; detailed configuration of normalized file activity fields and semantics. | +| [ASIM network session schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-network) | configuration | 0.80 | Network Session schema reference; defines normalized network activity fields and structures, which are product-specific configuration details. | +| [ASIM user entity](https://learn.microsoft.com/en-us/azure/sentinel/normalization-entity-user) | configuration | 0.80 | User Entity schema reference; documents entity fields and role-specific prefixes, which are detailed schema configuration. | +| [Asset data tables in Microsoft Sentinel data lake](https://learn.microsoft.com/en-us/azure/sentinel/datalake/asset-data-tables) | configuration | 0.80 | Provides table mappings for asset data; likely includes column names, types, and semantics that are product-specific configuration/schema details. | +| [Authenticate playbooks to Microsoft Sentinel](https://learn.microsoft.com/en-us/azure/sentinel/automation/authenticate-playbooks-to-sentinel) | security | 0.80 | Describes how playbooks authenticate to Sentinel using specialized connectors; likely includes connector permissions, identities, and RBAC-related configuration specific to Sentinel, which is product-specific security configuration. | +| [Configure RDP login detection](https://learn.microsoft.com/en-us/azure/sentinel/configure-connector-login-detection) | configuration | 0.80 | Describes configuring connector for ML-based RDP anomaly detection; includes specific connector settings, required event IDs, and data requirements—product-specific configuration details. 
| +| [Configure ingestion-time transformation](https://learn.microsoft.com/en-us/azure/sentinel/configure-data-transformation) | configuration | 0.80 | Covers configuring ingestion-time transformations and custom log ingestion; includes transformation rules, DCR fields, and API usage—detailed configuration expert knowledge. | +| [Connect threat intelligence platforms](https://learn.microsoft.com/en-us/azure/sentinel/connect-threat-intelligence-tip) | integrations | 0.80 | Connector article for TIP integration, including deprecation details and configuration steps; product-specific integration configuration. | +| [Deploy the agent from the command line](https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-command-line) | deployment | 0.80 | Command-line deployment article will include concrete commands and parameters specific to Sentinel’s SAP connector, which are product-specific deployment details. | +| [Deploy the agent with expert options](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-solution-deploy-alternate) | deployment | 0.80 | Covers expert/custom/manual deployment of the SAP connector container, likely including advanced configuration and environment-specific deployment constraints. | +| [Enable MDTI data connector](https://learn.microsoft.com/en-us/azure/sentinel/connect-mdti-data-connector) | integrations | 0.80 | Connector enablement article for Defender Threat Intelligence; will include connector configuration parameters and behavior unique to this integration. | +| [Enable attack disruption actions on AWS](https://learn.microsoft.com/en-us/azure/sentinel/aws-disruption) | security | 0.80 | Describes configuring automated actions on AWS identities; likely includes IAM roles, permissions, and security settings—product-specific security configuration. 
| +| [Entities reference](https://learn.microsoft.com/en-us/azure/sentinel/entities-reference) | configuration | 0.80 | Defines entity types with strong/weak identifiers, which are detailed schema/configuration elements used across Sentinel content. | +| [Feature support in different clouds](https://learn.microsoft.com/en-us/azure/sentinel/feature-availability) | decision-making | 0.80 | Feature availability across Azure environments with GA/preview/unsupported states; effectively a decision matrix for what you can use where. | +| [Graph REST API](https://learn.microsoft.com/en-us/azure/sentinel/datalake/graph-rest-api) | integrations | 0.80 | REST API usage for listing and querying custom graphs; likely includes endpoints, parameters, and request/response schemas unique to this product. | +| [Legacy network normalization schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-v1) | configuration | 0.80 | Legacy network normalization schema reference with explicit field definitions and version-specific structure, which is expert configuration knowledge. | +| [Logstash plugin with Data Collection Rules](https://learn.microsoft.com/en-us/azure/sentinel/connect-logstash-data-connection-rules) | integrations | 0.80 | Logstash integration with DCR-based API; includes plugin parameters, pipeline config, and DCR schema—code-focused integration and configuration expert knowledge. | +| [Manage workspace access with resource-context RBAC](https://learn.microsoft.com/en-us/azure/sentinel/resource-context-rbac) | security | 0.80 | Explains managing access by resource using Azure roles; includes specific RBAC roles and scope patterns unique to Sentinel resource-context RBAC. 
| +| [Microsoft Sentinel provider class reference](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-provider-class-reference) | integrations | 0.80 | Class reference with methods to list databases, read tables, and save data; includes method signatures and parameters unique to this provider. | +| [SIEM best practices in Microsoft Sentinel](https://learn.microsoft.com/en-us/azure/sentinel/best-practices) | best-practices | 0.80 | Explicit best practices article for deploying and managing Sentinel; contains product-specific DOs/DON’Ts and configuration recommendations. | +| [Search large datasets](https://learn.microsoft.com/en-us/azure/sentinel/search-jobs) | limits-quotas | 0.80 | Explicitly states that search jobs are used when the log query timeout of 10 minutes isn't sufficient and that they scan up to a year of data, which are concrete product-specific limits. | +| [Select data ingestion tool](https://learn.microsoft.com/en-us/azure/sentinel/migration-ingestion-tool) | decision-making | 0.80 | Provides a table of tools per target platform and general tools, helping choose ingestion tooling; product-specific migration and tool-selection guidance. | +| [Set up customer-managed keys](https://learn.microsoft.com/en-us/azure/sentinel/customer-managed-keys) | security | 0.80 | Provides steps and parameters for configuring CMK with Azure Key Vault for Sentinel; product-specific security configuration. | +| [Supported triggers and actions in playbooks](https://learn.microsoft.com/en-us/azure/sentinel/automation/playbook-triggers-actions) | integrations | 0.80 | Catalog of supported triggers/actions for the Sentinel Logic Apps connector is an integration-focused reference with connector-specific operations and parameters that qualify as expert integration knowledge. 
| +| [Troubleshoot analytics rules](https://learn.microsoft.com/en-us/azure/sentinel/troubleshoot-analytics-rules) | troubleshooting | 0.80 | Explicit troubleshooting article; deals with known issues, explains AUTO DISABLED state, and likely maps symptoms to causes and resolutions specific to Sentinel analytics rules. | +| [Troubleshooting](https://learn.microsoft.com/en-us/azure/sentinel/datalake/troubleshoot-sentinel-mcp) | troubleshooting | 0.80 | Explicitly combines best practices and troubleshooting; likely includes symptom→cause→solution mappings and product-specific gotchas for MCP tools. | +| [Automation rules reference](https://learn.microsoft.com/en-us/azure/sentinel/automation-rule-reference) | configuration | 0.78 | Reference for supported properties, conditions, and entities in automation rules, including specific field names and allowed values. | +| [Create watchlists](https://learn.microsoft.com/en-us/azure/sentinel/watchlists-create) | limits-quotas | 0.78 | Includes concrete file size limits for watchlist uploads (3.8 MB threshold and up to 500 MB for large watchlists), which are product-specific numerical constraints. | +| [Enable network security for Azure Storage Blob data connector](https://learn.microsoft.com/en-us/azure/sentinel/enable-storage-network-security) | security | 0.78 | Step-by-step instructions for configuring Azure NSP on storage accounts used by connectors, including product-specific security settings. | +| [KQL using the API](https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-queries-api) | integrations | 0.78 | Programmatic KQL queries via REST; expected to document endpoints, request bodies, permissions, and parameters unique to this API. 
| +| [Microsoft Sentinel graph provider reference](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-graph-provider-reference) | integrations | 0.78 | Reference for the sentinel_graph class with product-specific methods and parameters for schema definition, transforms, publishing, and querying; fits integrations & coding patterns. | +| [ASIM application entity](https://learn.microsoft.com/en-us/azure/sentinel/normalization-entity-application) | configuration | 0.75 | Application Entity schema reference; contains product-specific field definitions and usage patterns. | +| [ASIM parsers](https://learn.microsoft.com/en-us/azure/sentinel/normalization-parsers-list) | configuration | 0.75 | Provides a list of ASIM parsers and related parameters; this is a catalog of product-specific parser configurations. | +| [AWS EKS logs](https://learn.microsoft.com/en-us/azure/sentinel/connect-aws-eks) | configuration | 0.75 | EKS audit log connector; includes S3 bucket, permissions, and Sentinel connector settings—product-specific configuration details. | +| [AWS S3 WAF logs](https://learn.microsoft.com/en-us/azure/sentinel/connect-aws-s3-waf) | configuration | 0.75 | AWS WAF S3-based connector; will specify bucket configuration, log format, and Sentinel connector parameters—detailed configuration content. | +| [AWS service logs](https://learn.microsoft.com/en-us/azure/sentinel/connect-aws) | configuration | 0.75 | Connector configuration for AWS logs; includes trust relationship, role ARNs, S3 paths, and connector settings—expert configuration knowledge. | +| [CEF and Syslog via AMA](https://learn.microsoft.com/en-us/azure/sentinel/connect-cef-syslog-ama) | configuration | 0.75 | Step-by-step ingestion guide; will include AMA install commands, DCR definitions, and connector settings, which are detailed configuration parameters. 
| +| [Collect logs from text files via AMA](https://learn.microsoft.com/en-us/azure/sentinel/connect-custom-logs-ama) | configuration | 0.75 | Describes using Custom Logs via AMA; such docs include DCR schema, file path patterns, and transformation settings—detailed configuration parameters. | +| [Compare analytics rules and custom detections](https://learn.microsoft.com/en-us/azure/sentinel/compare-analytics-rules-custom-detections) | decision-making | 0.75 | Explicit comparison article; likely includes feature comparison tables, supported capabilities, and guidance on when to use each, which are decision-making details with quantified trade-offs. | +| [Creating codeless data connectors (CCF)](https://learn.microsoft.com/en-us/azure/sentinel/create-codeless-connector) | integrations | 0.75 | Codeless Connector Framework article for ingesting data; typically includes connector schema, parameter names, and configuration details unique to Sentinel integrations. | +| [Creating push codeless data connectors (CCF)](https://learn.microsoft.com/en-us/azure/sentinel/create-push-codeless-connector) | integrations | 0.75 | Push-based CCF connectors with real-time data; likely documents endpoint formats, auth parameters, and connector-specific configuration fields. | +| [DNS via AMA](https://learn.microsoft.com/en-us/azure/sentinel/connect-dns-ama) | configuration | 0.75 | Connector for DNS logs; will specify AMA extension, event channels, filters, and DCR settings—product-specific configuration details. | +| [Data lake use cases](https://learn.microsoft.com/en-us/azure/sentinel/basic-logs-use-cases) | decision-making | 0.75 | Guides which log sources to place in data lake vs analytics tier based on attributes and use cases; product-specific tier selection criteria. 
| +| [Deploy content as code from your repository](https://learn.microsoft.com/en-us/azure/sentinel/ci-cd) | deployment | 0.75 | Describes creating and managing connections between Sentinel and external repos; product-specific deployment automation configuration. | +| [Enable User and Entity Behavior Analytics (UEBA)](https://learn.microsoft.com/en-us/azure/sentinel/enable-entity-behavior-analytics) | configuration | 0.75 | Explains how to enable UEBA and configure data sources; includes product-specific feature toggles and data requirements. | +| [Enable auditing and health monitoring](https://learn.microsoft.com/en-us/azure/sentinel/enable-monitoring) | configuration | 0.75 | Covers turning on the feature and querying SentinelHealth and SentinelAudit tables; includes table names and query patterns unique to Sentinel. | +| [Google Cloud Platform connectors](https://learn.microsoft.com/en-us/azure/sentinel/connect-google-cloud-platform) | configuration | 0.75 | GCP ingestion article; will include Pub/Sub topics, subscriptions, service accounts, and Sentinel connector settings—detailed cross-cloud configuration. | +| [Manage tables, tiers, and retention](https://learn.microsoft.com/en-us/azure/sentinel/manage-table-tiers-retention) | configuration | 0.75 | Explains table-level retention and tier configuration in Defender portal; involves specific setting names, allowed values, and behaviors. | +| [Microsoft Entra](https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-active-directory) | configuration | 0.75 | Connector for Entra ID logs; includes portal settings, log categories, and connector parameters—product-specific configuration details. | +| [Monitor automation rules and playbooks health](https://learn.microsoft.com/en-us/azure/sentinel/monitor-automation-health) | configuration | 0.75 | Uses SentinelHealth and AzureDiagnostics tables to track automation; includes product-specific table names and query patterns. 
| +| [Monitor data connector health](https://learn.microsoft.com/en-us/azure/sentinel/monitor-data-connector-health) | configuration | 0.75 | Describes using SentinelHealth table and Health Monitoring workbook; includes specific workbook logic and table usage unique to Sentinel connectors. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/ci-cd-custom-content) | deployment | 0.75 | Explains using GitHub/Azure DevOps repositories for Sentinel content; includes product-specific CI/CD integration patterns and constraints. | +| [Plan costs](https://learn.microsoft.com/en-us/azure/sentinel/billing) | decision-making | 0.75 | Cost planning and billing guidance for Sentinel with estimator usage and likely SKU/tier cost comparisons; supports decision-making on pricing models. | +| [Plan your migration](https://learn.microsoft.com/en-us/azure/sentinel/migration) | decision-making | 0.75 | Covers reasons, phases, and planning for migrating from legacy SIEM to Sentinel; includes scenario-based migration guidance and trade-offs, fitting decision-making/migration criteria. | +| [Sample API requests for creating Data Collection Rules (DCRs)](https://learn.microsoft.com/en-us/azure/sentinel/api-dcr-reference) | integrations | 0.74 | Contains concrete REST request/response examples for DCR and DCRA creation, including parameter names and structures specific to Sentinel/AMA. | +| [Create your own custom tool](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-create-custom-tool) | configuration | 0.72 | How-to for defining custom tools from saved KQL queries; expected to include tool schema, parameter definitions, and configuration constraints. | +| [ASIM content](https://learn.microsoft.com/en-us/azure/sentinel/normalization-content) | configuration | 0.70 | Lists built-in content configured for ASIM; includes mappings between content and normalization schemas, which is product-specific configuration knowledge. 
| +| [ASIM helper functions](https://learn.microsoft.com/en-us/azure/sentinel/normalization-functions) | integrations | 0.70 | Documents helper KQL functions for interacting with normalized data; includes function signatures and parameters unique to ASIM/Sentinel. | +| [ASIM known issues](https://learn.microsoft.com/en-us/azure/sentinel/normalization-known-issues) | limits-quotas | 0.70 | Explicitly lists known issues and limitations; likely includes concrete constraints and unsupported scenarios for ASIM in Sentinel. | +| [Add advanced conditions to automation rules](https://learn.microsoft.com/en-us/azure/sentinel/add-advanced-conditions-to-automation-rules) | configuration | 0.70 | Describes adding complex condition groups and OR logic in automation rules, which are detailed, product-specific configuration patterns not generally known. | +| [Anomalies reference](https://learn.microsoft.com/en-us/azure/sentinel/anomalies-reference) | configuration | 0.70 | Lists specific anomaly types and how they are represented, which is detailed product behavior and schema-level knowledge. | +| [Audit Microsoft Sentinel data lake and graph in Microsoft Purview portal](https://learn.microsoft.com/en-us/azure/sentinel/datalake/auditing-lake-activities) | configuration | 0.70 | Explains how Sentinel data lake/graph activities appear in the audit log and how to search them; product-specific audited operation details. | +| [Audit Microsoft Sentinel with Azure Activity Logs](https://learn.microsoft.com/en-us/azure/sentinel/audit-sentinel-data) | configuration | 0.70 | Describes accessing AzureActivity and other audit data for Sentinel; includes table names and query usage specific to Sentinel auditing. 
|
+| [Audit and monitor the health of analytics rules](https://learn.microsoft.com/en-us/azure/sentinel/monitor-analytics-rule-integrity) | configuration | 0.70 | Explains using SentinelHealth and audit logs plus manual rerun for analytics rules; product-specific monitoring configuration and queries. |
+| [Azure Functions API connection](https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-functions-template) | integrations | 0.70 | Shows how to build Azure Functions-based connectors; typically includes function app settings, bindings, and Sentinel-specific API parameters—code-focused integration patterns. |
+| [Azure Stack VMs](https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-stack) | deployment | 0.70 | Shows how to provision VM extensions and start monitoring; involves deployment of agents/extensions on Azure Stack with Sentinel-specific requirements—deployment expert knowledge. |
+| [Azure Storage blob connectors](https://learn.microsoft.com/en-us/azure/sentinel/setup-azure-storage-connector) | configuration | 0.70 | Connector setup article; usually contains storage account settings, container paths, and Sentinel connector parameters, which are configuration expert knowledge. |
+| [Best practices](https://learn.microsoft.com/en-us/azure/sentinel/best-practices-data) | best-practices | 0.70 | Best practices article specific to Sentinel data connectors; likely includes concrete recommendations and product-specific gotchas for data collection. |
+| [Build logic apps with Microsoft Sentinel MCP](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-logic-apps) | integrations | 0.70 | Shows how to call MCP tools from Logic Apps; likely includes connector actions, parameters, and workflow configuration specific to this integration. |
+| [Cisco firewalls](https://learn.microsoft.com/en-us/azure/sentinel/cisco-ftd-firewall) | decision-making | 0.70 | Explains when to use each Cisco connector (FTD vs ASA) and links to setup; this is product-specific connector selection guidance—decision-making expert knowledge. |
+| [Collect SAP HANA audit logs](https://learn.microsoft.com/en-us/azure/sentinel/sap/collect-sap-hana-audit-logs) | configuration | 0.70 | Explains how to collect SAP HANA audit logs; likely includes HANA-specific settings and Sentinel connector configuration parameters that are product-specific. |
+| [Configure content](https://learn.microsoft.com/en-us/azure/sentinel/configure-content) | configuration | 0.70 | Configures data connectors, analytics rules, automation rules, etc.; likely includes specific setting names and options unique to Sentinel content. |
+| [Configure interactive and long-term data retention](https://learn.microsoft.com/en-us/azure/sentinel/configure-data-retention-archive) | configuration | 0.70 | Covers setting up interactive vs long-term retention; likely includes specific retention settings and options for Sentinel workspaces. |
+| [Configure multistage attack (Fusion) rules](https://learn.microsoft.com/en-us/azure/sentinel/configure-fusion-rules) | configuration | 0.70 | Configuration article for Fusion rules; likely lists rule parameters, enable/disable options, and scope settings that are specific to Sentinel’s Fusion engine. |
+| [Connect Microsoft Sentinel to AWS](https://learn.microsoft.com/en-us/azure/sentinel/connect-aws-configure-environment) | configuration | 0.70 | AWS environment setup for connectors; typically includes IAM roles, policies, S3 bucket settings—detailed cross-cloud configuration parameters. |
+| [Connect Microsoft Sentinel to Microsoft connectors](https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-windows-microsoft-services) | configuration | 0.70 | Describes how to connect Sentinel to Azure/M365/Windows; such connector docs contain specific configuration steps, parameters, and sometimes per-service settings, which are configuration expert knowledge. |
+| [Connect to STIX/TAXII feeds](https://learn.microsoft.com/en-us/azure/sentinel/connect-threat-intelligence-taxii) | integrations | 0.70 | Connector article for STIX/TAXII feeds typically documents connector-specific configuration fields (TAXII server URL formats, collection IDs, authentication parameters, polling intervals) and Sentinel-side connector settings that are product-specific integration details, not just conceptual info. |
+| [Connect via API-based connectors](https://learn.microsoft.com/en-us/azure/sentinel/connect-services-api-based) | configuration | 0.70 | API-based connector article will include endpoint URLs, auth scopes, and connector parameters unique to Sentinel integrations, fitting configuration/integration expert knowledge. |
+| [Connect via Windows agent-based connectors](https://learn.microsoft.com/en-us/azure/sentinel/connect-services-windows-based) | configuration | 0.70 | Uses AMA and DCRs; such docs list DCR fields, data sources, and agent settings, which are detailed configuration knowledge. |
+| [Connect via diagnostic settings-based connectors](https://learn.microsoft.com/en-us/azure/sentinel/connect-services-diagnostic-setting-based) | configuration | 0.70 | Diagnostic settings-based connections require specific Azure diagnostic categories, destinations, and settings; these are product-specific configuration parameters. |
+| [Connect your SAP system](https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-data-connector-agent-container) | deployment | 0.70 | Describes how to deploy the SAP data connector agent container; such deployment docs typically include Sentinel/SAP-specific deployment parameters and constraints beyond generic container deployment. |
+| [Create NRT analytics rules](https://learn.microsoft.com/en-us/azure/sentinel/create-nrt-rules) | configuration | 0.70 | Covers viewing and creating NRT rules; such content typically lists rule parameters, supported options, and constraints unique to NRT analytics. |
+| [Create automation rules](https://learn.microsoft.com/en-us/azure/sentinel/create-manage-use-automation-rules) | configuration | 0.70 | Explains defining triggers, conditions, and actions for automation rules, which implies detailed product-specific configuration options and fields unique to Sentinel automation. |
+| [Create custom connectors using AI agent in Microsoft Sentinel](https://learn.microsoft.com/en-us/azure/sentinel/create-custom-connector-builder-agent) | integrations | 0.70 | Describes an AI-assisted connector builder with deployment assets, polling logic, and secure secret handling; likely includes product-specific connector configuration parameters and patterns for integrating data sources. |
+| [Custom logs - configure security device](https://learn.microsoft.com/en-us/azure/sentinel/unified-connector-custom-device) | configuration | 0.70 | Per-application configuration details (paths, formats, fields) for Custom Logs via AMA are highly specific configuration expert knowledge. |
+| [Customize alert details](https://learn.microsoft.com/en-us/azure/sentinel/customize-alert-details) | configuration | 0.70 | Describes overriding default alert properties based on query results; involves specific rule configuration fields and mapping expressions that are product-specific. |
+| [Customize repository deployments](https://learn.microsoft.com/en-us/azure/sentinel/ci-cd-custom-deploy) | deployment | 0.70 | Shows ways to customize repository deployments using specific files/syntax; Sentinel-specific deployment behavior and configuration. |
+| [Data management overview](https://learn.microsoft.com/en-us/azure/sentinel/manage-data-overview) | decision-making | 0.70 | Guides how to choose retention periods and tiers to optimize cost and operations; likely includes tier capabilities and trade-offs for decision-making. |
+| [Data transformation using filter and split](https://learn.microsoft.com/en-us/azure/sentinel/transformation-filter-split) | configuration | 0.70 | Describes configuring filter/split transformations at ingestion to route data between tiers; includes product-specific transformation options and parameters. |
+| [Define an access restriction for Standard-plan playbooks](https://learn.microsoft.com/en-us/azure/sentinel/automation/define-playbook-access-restrictions) | security | 0.70 | Covers access restriction policy for Standard-plan playbooks and private endpoints; likely includes specific policy settings and scopes, which are product-specific security configuration details. |
+| [Deploy SAP BTP](https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-sap-btp-solution) | deployment | 0.70 | Deployment article for SAP BTP solution, including mention of connector version behavior; contains product-specific deployment steps and constraints. |
+| [Deploy side-by-side](https://learn.microsoft.com/en-us/azure/sentinel/deploy-side-by-side) | decision-making | 0.70 | Describes approaches and methods for running Sentinel alongside an existing SIEM; focused on migration/architecture decisions and trade-offs. |
+| [Deployment prerequisites](https://learn.microsoft.com/en-us/azure/sentinel/sap/prerequisites-for-deploying-sap-continuous-threat-monitoring) | deployment | 0.70 | Lists prerequisites for SAP connector/agent vs agentless; includes environment, connector, and platform requirements unique to this deployment. |
+| [Develop ASIM parsers](https://learn.microsoft.com/en-us/azure/sentinel/normalization-develop-parsers) | integrations | 0.70 | Focuses on developing, testing, and deploying KQL-based parsers; likely includes function signatures, parameters, and deployment patterns unique to Sentinel/ASIM. |
+| [Enable Microsoft Sentinel and initial features and content](https://learn.microsoft.com/en-us/azure/sentinel/enable-sentinel-features-content) | deployment | 0.70 | Covers enabling Sentinel, health/audit, and solutions; includes product-specific deployment steps and feature enablement order. |
+| [Enable SAP detections and threat protection](https://learn.microsoft.com/en-us/azure/sentinel/sap/deployment-solution-configuration) | best-practices | 0.70 | Described as best practices for configuring initial security content; likely includes concrete recommendations and product-specific gotchas for SAP detections. |
+| [Enrich entities with geolocation data with REST-API](https://learn.microsoft.com/en-us/azure/sentinel/geolocation-data-api) | integrations | 0.70 | Describes how to call a Sentinel REST API for geolocation enrichment with specific request structures and parameters. |
+| [Export historical data](https://learn.microsoft.com/en-us/azure/sentinel/migration-arcsight-historical-data) | decision-making | 0.70 | Details export methods based on data volume and environment; includes guidance on choosing export approaches, which is migration decision-making specific to ArcSight-to-Azure scenarios. |
+| [Export historical data](https://learn.microsoft.com/en-us/azure/sentinel/migration-qradar-historical-data) | decision-making | 0.70 | Describes exporting QRadar data via REST API and AQL with guidance on query ranges and data selection; product-specific migration/export decision guidance. |
+| [Export historical data](https://learn.microsoft.com/en-us/azure/sentinel/migration-splunk-historical-data) | decision-making | 0.70 | Explains multiple export methods based on data volume and interactivity; provides concrete guidance on choosing export approaches, which is migration decision-making. |
+| [Extend across multiple workspaces](https://learn.microsoft.com/en-us/azure/sentinel/extend-sentinel-across-workspaces-tenants) | architecture-patterns | 0.70 | Explains patterns for querying/analyzing across workspaces/tenants; product-specific cross-workspace architecture and trade-offs. |
+| [GQL reference for Sentinel custom graph](https://learn.microsoft.com/en-us/azure/sentinel/datalake/gql-reference-for-sentinel-custom-graph) | integrations | 0.70 | Language reference for GQL with operators and functions specific to Microsoft Sentinel graph; contains API-like syntax and semantics, fitting integrations & coding patterns. |
+| [Ingestion-time data transformation](https://learn.microsoft.com/en-us/azure/sentinel/data-transformation) | configuration | 0.70 | Covers custom ingestion and DCR-based transformations; these docs usually list DCR schema fields, transformation rules, and config options that are product-specific configuration knowledge. |
+| [Install the solution for SAP applications](https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-sap-security-content) | deployment | 0.70 | Covers installing SAP solution from content hub and notes connector deprecation; product-specific deployment and lifecycle details. |
+| [Integrate Microsoft Defender XDR](https://learn.microsoft.com/en-us/azure/sentinel/microsoft-365-defender-sentinel-integration) | integrations | 0.70 | Explains integration behavior between Defender XDR and Sentinel, including onboarding conditions and incident queue behavior; this is product-specific integration logic. |
+| [Integrate Microsoft Defender for Cloud](https://learn.microsoft.com/en-us/azure/sentinel/ingest-defender-for-cloud-incidents) | integrations | 0.70 | Describes specific integration paths and configuration to ingest Defender for Cloud incidents through Defender XDR into Sentinel; product-specific integration pattern. |
+| [Integrate Microsoft Purview](https://learn.microsoft.com/en-us/azure/sentinel/purview-solution) | integrations | 0.70 | Describes using a data connector and solution for Purview; includes connector configuration and workbook usage—product-specific integration pattern. |
+| [Integrate SAP across multiple workspaces](https://learn.microsoft.com/en-us/azure/sentinel/sap/cross-workspace) | architecture-patterns | 0.70 | Discusses multiple architecture options and factors for SAP across workspaces; this is architecture/partitioning guidance specific to Sentinel SAP scenarios. |
+| [Manage ASIM parsers](https://learn.microsoft.com/en-us/azure/sentinel/normalization-manage-parsers) | configuration | 0.70 | Describes adding custom parsers and replacing built-in ones; involves product-specific parser names, relationships (unifying vs source-specific), and configuration steps. |
+| [Manage KQL jobs](https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-manage-jobs) | configuration | 0.70 | Covers jobs management page functions; expected to list job properties, statuses, and configuration options for managing scheduled and one-time jobs. |
+| [Manage hunting queries with REST-API](https://learn.microsoft.com/en-us/azure/sentinel/hunting-with-rest-api) | integrations | 0.70 | Shows concrete REST API usage patterns and parameters for managing hunting queries, a product-specific integration pattern. |
+| [Manage multiple tenants (MSSP)](https://learn.microsoft.com/en-us/azure/sentinel/multiple-tenants-service-providers) | architecture-patterns | 0.70 | Describes MSSP multi-tenant management using Azure Lighthouse; includes Sentinel-specific operational patterns and architecture choices. |
+| [Manage watchlists](https://learn.microsoft.com/en-us/azure/sentinel/watchlists-manage) | best-practices | 0.70 | Gives a concrete product-specific recommendation to edit instead of delete/recreate due to a 5-minute Log Analytics ingestion SLA and describes duplicate-entry behavior, which is a Sentinel-specific gotcha. |
+| [Map data fields to entities](https://learn.microsoft.com/en-us/azure/sentinel/map-data-fields-to-entities) | configuration | 0.70 | Entity mapping article describes specific entity types and field mappings used in analytics rules; these are product-specific configuration options and schema details. |
+| [Microsoft Defender XDR](https://learn.microsoft.com/en-us/azure/sentinel/connect-microsoft-365-defender) | configuration | 0.70 | Explains configuring Defender XDR connector; includes portal settings, scopes, and sync behavior—product-specific configuration details. |
+| [Microsoft Defender XDR connector data type support](https://learn.microsoft.com/en-us/azure/sentinel/microsoft-365-defender-cloud-support) | decision-making | 0.70 | Describes which data types are supported in Commercial vs GCC/GCC-High/DoD, guiding deployment decisions based on environment capabilities. |
+| [Microsoft Defender for Cloud](https://learn.microsoft.com/en-us/azure/sentinel/connect-defender-for-cloud) | configuration | 0.70 | Connector for Defender for Cloud; includes subscription-level settings and connector parameters—detailed configuration expert knowledge. |
+| [Microsoft Purview Information Protection data](https://learn.microsoft.com/en-us/azure/sentinel/connect-microsoft-purview) | configuration | 0.70 | Streaming Purview Information Protection data requires specific connector settings and scopes—detailed configuration expert knowledge. |
+| [Microsoft Purview Information Protection reference](https://learn.microsoft.com/en-us/azure/sentinel/microsoft-purview-record-types-activities) | configuration | 0.70 | Lists supported audit log record types and activities for the connector, which is detailed, product-specific capability/configuration information. |
+| [Microsoft Sentinel platform deployment overview](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-onboarding) | configuration | 0.70 | Onboarding to Sentinel data lake/graph includes tenant-wide configuration steps and parameters unique to this platform capability. |
+| [Migrate SOAR automation](https://learn.microsoft.com/en-us/azure/sentinel/migration-arcsight-automation) | decision-making | 0.70 | Explains how to identify SOAR use cases and migrate ArcSight automation to Sentinel automation rules/playbooks; product-specific migration and approach-selection guidance. |
+| [Migrate SOAR automation](https://learn.microsoft.com/en-us/azure/sentinel/migration-qradar-automation) | decision-making | 0.70 | Guides identification of SOAR use cases and migration of QRadar automation to Sentinel automation rules/playbooks; focused on migration and approach selection. |
+| [Migrate SOAR automation](https://learn.microsoft.com/en-us/azure/sentinel/migration-splunk-automation) | decision-making | 0.70 | Describes how to identify SOAR use cases and migrate Splunk automation to Sentinel automation rules/playbooks; focused on migration choices and mapping between systems. |
+| [Migrate alert playbooks to automation rules](https://learn.microsoft.com/en-us/azure/sentinel/automation/migrate-playbooks-to-automation-rules) | decision-making | 0.70 | Explains when and why to migrate from analytics-rule triggers to automation rules, with scenario-based guidance; this is product-specific migration/choice guidance that helps decide between approaches. |
+| [Migrate detection rules](https://learn.microsoft.com/en-us/azure/sentinel/migration-arcsight-detection-rules) | decision-making | 0.70 | Provides guidance on identifying, comparing, and mapping ArcSight rules to Sentinel analytics rules; this is migration and selection guidance between rule sets. |
+| [Migrate detection rules](https://learn.microsoft.com/en-us/azure/sentinel/migration-qradar-detection-rules) | decision-making | 0.70 | Covers identifying, comparing, and migrating QRadar rules to Sentinel built-in rules; product-specific migration and mapping guidance. |
+| [Migrate detection rules](https://learn.microsoft.com/en-us/azure/sentinel/migration-splunk-detection-rules) | decision-making | 0.70 | Guides identification and mapping of Splunk detection rules to Sentinel analytics rules and use of the SIEM migration experience; product-specific migration and rule-selection guidance. |
+| [Monitor and optimize execution of analytics rules](https://learn.microsoft.com/en-us/azure/sentinel/monitor-optimize-analytics-rule-execution) | configuration | 0.70 | Describes rule execution insights and manual rerun; includes Sentinel-specific execution management features and their configuration. |
+| [Multistage attack detection scenarios](https://learn.microsoft.com/en-us/azure/sentinel/fusion-scenario-reference) | configuration | 0.70 | Catalog of specific Fusion detection scenarios grouped by threat classification, describing concrete product detection capabilities. |
+| [Notebook examples for data lake exploration](https://learn.microsoft.com/en-us/azure/sentinel/datalake/notebook-examples) | integrations | 0.70 | Provides concrete PySpark/Python code snippets using Sentinel extension APIs and table schemas; product-specific coding patterns for data access. |
+| [Onboard to data lake and graph from Defender portal](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-onboard-defender) | configuration | 0.70 | Describes onboarding flow from Defender portal with subscription selection and tenant-level constraints; product-specific configuration process. |
+| [Optimize costs with pre-purchase plan](https://learn.microsoft.com/en-us/azure/sentinel/billing-pre-purchase-plan) | decision-making | 0.70 | Explains commit units, discounts, and how SCUs are applied; includes quantified trade-offs and purchasing guidance. |
+| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/cef-syslog-ama-overview) | configuration | 0.70 | Overview of Syslog/CEF via AMA connectors; these docs typically include AMA extension names, ports, facility/severity filters—product-specific configuration details. |
+| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-get-started) | configuration | 0.70 | Get-started setup article for MCP server likely includes concrete configuration parameters, connection settings, and environment details specific to Sentinel MCP. |
+| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/offboard-implications) | limits-quotas | 0.70 | States specific timing (up to 48 hours) and retention behavior for resources and data after removal, which are concrete service limits/behaviors. |
+| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/scheduled-rules-overview) | configuration | 0.70 | Article is explicitly about all configuration options for scheduled rules; these pages typically include rule parameters (lookback period, frequency, thresholds, suppression settings) and allowed ranges, which are product-specific configuration details. |
+| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/watchlists) | best-practices | 0.70 | Article explicitly mentions best practices and key scenarios; watchlist usage guidance typically includes concrete patterns (how to structure lists, query them, and integrate into rules) that are product-specific DO/DON’T recommendations. |
+| [Plan data retention](https://learn.microsoft.com/en-us/azure/sentinel/log-plans) | limits-quotas | 0.70 | Describes different log retention plans and how to use them; such pages typically include concrete retention durations, tier behaviors, and cost/coverage trade-offs. |
+| [Prepare your SAP environment](https://learn.microsoft.com/en-us/azure/sentinel/sap/preparing-sap) | configuration | 0.70 | Preparation steps for SAP to work with the Sentinel SAP connector will include SAP-side configuration objects, parameters, and roles specific to this integration, which are product-specific configuration details. |
+| [Prerequisites](https://learn.microsoft.com/en-us/azure/sentinel/prerequisites) | deployment | 0.70 | Prerequisites for deploying Sentinel usually include tenant-level requirements, supported regions, and service dependencies that are product-specific deployment constraints. |
+| [Prioritize data connectors](https://learn.microsoft.com/en-us/azure/sentinel/prioritize-data-connectors) | decision-making | 0.70 | Helps plan and prioritize which data sources to onboard; this is Sentinel-specific decision guidance on connector selection and coverage vs. cost trade-offs. |
+| [Reduce costs](https://learn.microsoft.com/en-us/azure/sentinel/billing-reduce-costs) | best-practices | 0.70 | Provides concrete Sentinel-specific cost reduction methods (tier choices, transformations, rule tuning); actionable product-specific guidance. |
+| [Review sample workspace designs](https://learn.microsoft.com/en-us/azure/sentinel/sample-workspace-designs) | architecture-patterns | 0.70 | Sample workspace designs for multi-tenant/multi-region Sentinel are product-specific architecture patterns and trade-offs for workspace topology. |
+| [SAP solution function reference](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-solution-function-reference) | integrations | 0.70 | Describes Kusto functions made available by the SAP solution; these are integration/query primitives with specific parameters and usage patterns. |
+| [SAP solution log and table reference](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-solution-log-reference) | integrations | 0.70 | Reference for logs and tables exposed by the SAP connector, including which are default vs optional, is detailed schema/integration knowledge. |
+| [SIEM operations guide](https://learn.microsoft.com/en-us/azure/sentinel/ops-guide) | best-practices | 0.70 | An operational guide listing recommended recurring SOC activities in Sentinel; this is product-specific operational best-practice guidance beyond generic SOC theory. |
+| [Set up federated tables](https://learn.microsoft.com/en-us/azure/sentinel/datalake/data-federation-setup) | configuration | 0.70 | Explains how to configure federated connectors for Databricks, ADLS Gen2, and Fabric; likely includes connector settings and parameters, which are product-specific configuration details. |
+| [Set up multiple workspaces](https://learn.microsoft.com/en-us/azure/sentinel/use-multiple-workspaces) | deployment | 0.70 | Implementation step for multi-workspace/multi-tenant architecture; contains deployment-specific instructions and constraints for this topology. |
+| [Standalone vs XDR alert schema reference](https://learn.microsoft.com/en-us/azure/sentinel/security-alert-schema-differences) | configuration | 0.70 | Explains concrete field mapping and ingestion behavior differences between connector types, including schema-level configuration details. |
+| [Stop SAP data collection](https://learn.microsoft.com/en-us/azure/sentinel/sap/stop-collection) | configuration | 0.70 | Provides specific steps to disable the SAP data connector and stop ingestion; these are product-specific configuration/operational steps. |
+| [Support for data types in different clouds](https://learn.microsoft.com/en-us/azure/sentinel/data-type-cloud-support) | decision-making | 0.70 | Describes which connector data types are supported in which cloud environments (commercial vs GCC variants); this is a product-specific capability matrix for planning. |
+| [Surface custom details in alerts](https://learn.microsoft.com/en-us/azure/sentinel/surface-custom-details-in-alerts) | configuration | 0.70 | Explains extracting and surfacing custom details in alerts; typically involves specifying field names, Kusto projections, and rule configuration parameters unique to Sentinel alert enrichment. |
+| [Switch to simplified pricing tiers](https://learn.microsoft.com/en-us/azure/sentinel/enroll-simplified-pricing-tier) | decision-making | 0.70 | Covers switching to simplified pricing and its impact; includes tier behavior and migration considerations to support billing decisions. |
+| [Update the data connector agent](https://learn.microsoft.com/en-us/azure/sentinel/sap/update-sap-data-connector) | deployment | 0.70 | Update procedure for the SAP data connector agent, including mention of ~10-second downtime and resume behavior, is deployment-specific operational knowledge. |
+| [Use MCP connector in ChatGPT or Claude](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-chatgpt-claude-connector) | configuration | 0.70 | Connector setup article; likely includes endpoint URLs, auth settings, and configuration parameters for ChatGPT/Claude integration. |
+| [Use SOC optimizations programmatically](https://learn.microsoft.com/en-us/azure/sentinel/soc-optimization/soc-optimization-api) | integrations | 0.70 | Describes the recommendations API usage; likely includes endpoint paths, parameters, and request/response patterns unique to Sentinel. |
+| [Use Sentinel MCP tools in Visual Studio Code](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-visual-studio-code) | configuration | 0.70 | Explains adding MCP tools to AI agents in VS Code; expected to contain specific settings, identifiers, and configuration steps unique to this integration. |
+| [Use the SIEM migration experience](https://learn.microsoft.com/en-us/azure/sentinel/siem-migration) | decision-making | 0.70 | Describes a migration tool that analyzes Splunk/QRadar detections and recommends Sentinel rules and connectors; provides concrete guidance on mapping and choosing equivalents, which is migration decision support. |
+| [Windows security event sets](https://learn.microsoft.com/en-us/azure/sentinel/windows-security-event-id-reference) | configuration | 0.70 | Describes predefined event sets and which event IDs they include; this is product-specific configuration of what data is collected. |
+| [Work with STIX objects and indicators](https://learn.microsoft.com/en-us/azure/sentinel/work-with-stix-objects-indicators) | integrations | 0.70 | Provides concrete Kusto query examples against ThreatIntelIndicators and ThreatIntelObjects tables and guidance on migrating to the new schema—these are product-specific query/integration patterns and schema details. |
+| [Workspace manager in the Azure portal](https://learn.microsoft.com/en-us/azure/sentinel/workspace-manager) | configuration | 0.70 | Covers provisioning and using workspace manager with supported content types; includes Sentinel-specific management configuration details. |
+| [Use Sentinel MCP tools in Copilot Studio](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-copilot-studio) | configuration | 0.68 | Covers wiring Sentinel MCP tools into Copilot Studio; expected to list concrete configuration options and parameters for tool integration. |
+| [Use Sentinel MCP tools in Microsoft Foundry](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-azure-ai-foundry) | configuration | 0.68 | Shows how to add Sentinel MCP tools or custom tools in Foundry; likely includes product-specific configuration fields and values. |
+| [Use Sentinel MCP tools in Security Copilot](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-security-copilot) | configuration | 0.68 | Describes adding Sentinel MCP tools or custom tools into Security Copilot; likely includes specific configuration fields and values for tool registration. |
+| [AMA migration for Microsoft Sentinel](https://learn.microsoft.com/en-us/azure/sentinel/ama-migrate) | decision-making | 0.65 | Migration article between agents; such content typically includes decision guidance on when/how to migrate, mapping of old to new capabilities, and deployment considerations—expert decision-making guidance. |
+| [Add threat intelligence in bulk by file](https://learn.microsoft.com/en-us/azure/sentinel/indicators-bulk-file-import) | integrations | 0.65 | Bulk import of indicators from CSV/STIX JSON usually includes required column/field names, formats, and constraints for Sentinel’s threat intelligence schema, which are product-specific integration details beyond generic knowledge. |
+| [Aggregate insights from raw data into an Auxiliary table](https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-jobs) | configuration | 0.65 | Covers creating/scheduling KQL jobs and promoting data between tiers; involves job configuration parameters and tier behaviors unique to Sentinel data lake. |
+| [All Microsoft Sentinel SIEM solutions](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-catalog) | decision-making | 0.65 | Catalog and domain-specific solution list helps decide which solutions to deploy for particular scenarios; supports solution selection decisions. |
+| [Azure Virtual Desktop](https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-virtual-desktop) | configuration | 0.65 | Monitoring AVD with Sentinel requires configuring specific diagnostics/log sources and connectors—product-specific configuration details. |
+| [Business continuity and disaster recovery](https://learn.microsoft.com/en-us/azure/sentinel/business-continuity-disaster-recovery) | architecture-patterns | 0.65 | BCDR and cross-region/zone resiliency guidance for Sentinel; product-specific reliability architecture patterns and recommendations. |
+| [Configure advanced MSTICPy settings](https://learn.microsoft.com/en-us/azure/sentinel/notebooks-msticpy-advanced) | configuration | 0.65 | Described as covering advanced configurations for Jupyter notebooks and MSTICPy in Sentinel, which typically includes product-specific config parameters and options beyond generic notebook usage. |
+| [Connect data sources](https://learn.microsoft.com/en-us/azure/sentinel/configure-data-connector) | configuration | 0.65 | Explains how to install and configure connectors from Content hub; likely includes connector-specific settings and configuration parameters. |
+| [Convert dashboards to workbooks](https://learn.microsoft.com/en-us/azure/sentinel/migration-convert-dashboards) | decision-making | 0.65 | Covers reviewing, planning, and converting dashboards to Azure Workbooks; includes migration planning and mapping decisions between dashboard models. |
+| [Create a Power BI report](https://learn.microsoft.com/en-us/azure/sentinel/powerbi) | integrations | 0.65 | Power BI integration from exported Sentinel queries typically includes connection details, dataset configuration, and query export specifics that are product-specific integration patterns. |
+| [Create a scheduled rule from scratch](https://learn.microsoft.com/en-us/azure/sentinel/create-analytics-rules) | configuration | 0.65 | Covers creating scheduled analytics rules beyond templates; such articles typically enumerate rule settings (query schedule, lookback, thresholds, suppression) with specific options and constraints, which are configuration details. |
+| [Create and manage notebook jobs](https://learn.microsoft.com/en-us/azure/sentinel/datalake/notebook-jobs) | configuration | 0.65 | Describes creating scheduled Spark notebook jobs; likely includes job configuration options, schedules, and output table settings. |
+| [Create incident tasks using automation rules](https://learn.microsoft.com/en-us/azure/sentinel/create-tasks-automation-rule) | configuration | 0.65 | Focuses on using automation rules to create standardized incident task lists, involving specific rule configuration options tied to task creation. |
+| [Create incidents from Microsoft Security alerts](https://learn.microsoft.com/en-us/azure/sentinel/create-incidents-from-alerts) | configuration | 0.65 | Describes how alerts are grouped into incidents and how to configure that behavior; typically includes rule settings and mapping options unique to Sentinel incident creation. |
+| [Data exploration](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-data-exploration-tool) | integrations | 0.65 | Describes specific tools in the data exploration collection; likely includes tool names, parameters, and behaviors for querying tables and retrieving data. |
+| [Delete out-of-the-box content](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-delete) | configuration | 0.65 | Details how to delete and reinstall solutions/content; product-specific lifecycle operations and configuration steps. |
+| [Deploy for Dynamics 365 Finance and Operations](https://learn.microsoft.com/en-us/azure/sentinel/dynamics-365/deploy-dynamics-365-finance-operations-solution) | deployment | 0.65 | Deployment article for Dynamics 365 F&O content within Sentinel; includes specific steps and requirements for this integration. |
+| [Deploy for Power Platform and Microsoft Dynamics 365 Customer Engagement](https://learn.microsoft.com/en-us/azure/sentinel/business-applications/deploy-power-platform-solution) | deployment | 0.65 | Describes how to deploy the Business Apps solution for Power Platform and Dynamics CE; includes product-specific deployment and connector configuration steps. |
+| [Deploy out-of-the-box content](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-deploy) | deployment | 0.65 | Describes how to find and deploy packaged solutions; includes Sentinel-specific deployment behavior for content bundles. |
+| [Deployment overview](https://learn.microsoft.com/en-us/azure/sentinel/sap/deployment-overview) | deployment | 0.65 | Introduces deployment of the Sentinel solution for SAP; includes product/integration-specific deployment steps and constraints. |
+| [Deployment planning guide](https://learn.microsoft.com/en-us/azure/sentinel/deploy-overview) | deployment | 0.65 | Deployment guide overview for Sentinel including phases and planning; likely includes Sentinel-specific deployment requirements and sequencing beyond generic deployment steps. |
+| [Device configuration for CEF via AMA](https://learn.microsoft.com/en-us/azure/sentinel/unified-connector-cef-device) | configuration | 0.65 | Device-specific configuration instructions (ports, formats, forwarding settings) are highly product- and vendor-specific configuration expert knowledge. |
+| [Dynamics 365 Finance and Operations security content reference](https://learn.microsoft.com/en-us/azure/sentinel/dynamics-365/dynamics-365-finance-operations-security-content) | security | 0.65 | Reference for built-in security content for the Dynamics 365 F&O solution; product-specific analytics and workbooks. |
+| [Export and import analytics rules](https://learn.microsoft.com/en-us/azure/sentinel/import-export-analytics-rules) | deployment | 0.65 | Covers exporting/importing rules as ARM templates; this is about deployment of rules across environments and usually includes template schema, required properties, and constraints—deployment-focused expert details. |
+| [Geographical availability and data residency](https://learn.microsoft.com/en-us/azure/sentinel/geographical-availability-data-residency) | decision-making | 0.65 | Geographical availability and data residency affect where and how to deploy Sentinel; this is decision guidance tied to compliance and region support. |
+| [Handle ingestion delay in analytics rules](https://learn.microsoft.com/en-us/azure/sentinel/ingestion-delay) | best-practices | 0.65 | Discusses handling ingestion delay for scheduled rules; such content typically gives concrete recommendations (for example, adjusting lookback windows, scheduling offsets) that are product-specific best practices. |
+| [Identity attack graph](https://learn.microsoft.com/en-us/azure/sentinel/datalake/identity-attack-graph) | best-practices | 0.65 | Provides product-specific guidance on interpreting identity attack graphs, lateral movement paths, and privilege escalation risks; these are Sentinel-specific investigative best practices. |
+| [Integrate with unified connectors](https://learn.microsoft.com/en-us/azure/sentinel/unified-connector-integration) | configuration | 0.65 | How-to for connecting via unified connector; such pages typically include connector-specific parameters, scopes, and settings unique to Sentinel’s unified platform, which qualify as configuration expert knowledge. |
+| [Managing Platform Solutions](https://learn.microsoft.com/en-us/azure/sentinel/manage-platform-solutions) | configuration | 0.65 | Managing components after installation implies detailed configuration steps and settings for different component types, which are product-specific. |
+| [Microsoft Sentinel SIEM experience in Defender portal](https://learn.microsoft.com/en-us/azure/sentinel/microsoft-sentinel-defender-portal) | configuration | 0.65 | Explains the Sentinel experience in the Defender portal; likely includes portal-specific enablement, navigation, and configuration nuances unique to this integration. |
+| [Migrate agent to agentless connector](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-agent-migrate) | decision-making | 0.65 | Migration guide between containerized and agentless SAP connectors will include product-specific migration steps and considerations that help decide and execute the transition.
| +| [Modify content to use ASIM](https://learn.microsoft.com/en-us/azure/sentinel/normalization-modify-content) | best-practices | 0.65 | Explains how to modify analytics rules and queries to use normalized data; contains product-specific guidance and patterns for migrating existing content. | +| [Monitor SAP system health and role](https://learn.microsoft.com/en-us/azure/sentinel/monitor-sap-system-health) | configuration | 0.65 | Explains how to use connector page and alert rule template to monitor health; includes product-specific configuration of monitoring rules. | +| [Monitor Zero Trust](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solution) | configuration | 0.65 | Covers installing and using a specific Sentinel solution with preconfigured content; includes product-specific configuration and usage patterns. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/health-audit) | configuration | 0.65 | Explains Sentinel health/audit feature and monitored actions; includes product-specific tables and signals used for monitoring. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/near-real-time-rules) | configuration | 0.65 | NRT rules have specific configuration characteristics (frequency, supported operators, constraints) that differ from scheduled rules; article likely documents these product-specific settings. | +| [Packaging and publishing a Sentinel platform solution](https://learn.microsoft.com/en-us/azure/sentinel/package-platform-solution) | deployment | 0.65 | Packaging and publishing platform solutions to the Security Store; likely includes product-specific deployment artifacts, constraints, and supported components for this solution type. 
| +| [Partner integrations best practices](https://learn.microsoft.com/en-us/azure/sentinel/partner-integrations) | architecture-patterns | 0.65 | Described as components and patterns plus best practices for integrations; likely contains Sentinel-specific architectural patterns and guidance on how components work together, which is product-specific design knowledge. | +| [Power Platform and Microsoft Dynamics 365 Customer Engagement security content reference](https://learn.microsoft.com/en-us/azure/sentinel/business-applications/power-platform-solution-security-content) | security | 0.65 | Details built-in security content for the Power Platform/Dynamics CE solution, which is product-specific detection and workbook information. | +| [Prepare for multiple workspaces](https://learn.microsoft.com/en-us/azure/sentinel/prepare-multiple-workspaces) | architecture-patterns | 0.65 | Guidance on extending Sentinel across multiple workspaces/tenants is architecture-specific and unique to the product’s cross-tenant model. | +| [Recommended and sample playbooks](https://learn.microsoft.com/en-us/azure/sentinel/automation/playbook-recommendations) | best-practices | 0.65 | Lists concrete playbook use cases and recommended templates specific to Sentinel; these are product-specific DO/USE patterns that function as best-practice guidance beyond generic automation concepts. | +| [Run KQL queries](https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-queries) | configuration | 0.65 | Describes KQL queries page, job creation, and management; likely includes specific options, parameters, and scheduling settings for jobs. | +| [SAP security content reference](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-solution-security-content) | security | 0.65 | Reference for built-in security content (rules, workbooks, watchlists) is security-focused and product-specific, detailing what detections exist and how they behave. 
| +| [SOC optimization reference](https://learn.microsoft.com/en-us/azure/sentinel/soc-optimization/soc-optimization-reference) | decision-making | 0.65 | Catalog of recommendation types (data value, coverage, etc.) with guidance on when to apply them; supports decision-making on tuning Sentinel environments. | +| [Syslog - configure security device](https://learn.microsoft.com/en-us/azure/sentinel/unified-connector-syslog-device) | configuration | 0.65 | Lists provider-specific configuration for Syslog forwarding to Sentinel; includes device settings and log formats that are expert configuration details. | +| [Transition to the Defender portal](https://learn.microsoft.com/en-us/azure/sentinel/move-to-defender) | decision-making | 0.65 | Guides when and how to move from Azure portal to Defender portal, with implications for workflows and operations; this is migration/choice guidance specific to Sentinel environments. | +| [Update SOC processes](https://learn.microsoft.com/en-us/azure/sentinel/migration-security-operations-center-processes) | decision-making | 0.65 | Guides how to adapt SOC roles and processes when moving to Sentinel; product-specific operational process changes and migration considerations. | +| [Use ASIM](https://learn.microsoft.com/en-us/azure/sentinel/normalization-about-parsers) | integrations | 0.65 | Explains how to use specific KQL user-defined functions as parsers, with a table mapping schemas to parser function names—this is product-specific integration/query pattern detail. | +| [Work with incidents in multiple workspaces](https://learn.microsoft.com/en-us/azure/sentinel/multiple-workspace-view) | configuration | 0.65 | Explains configuring and using multiple workspace incident view; product-specific UI/feature configuration and behavior. 
| +| [Work with out-of-the-box anomaly rules](https://learn.microsoft.com/en-us/azure/sentinel/work-with-anomaly-rules) | configuration | 0.65 | Article on creating, managing, and tuning anomaly rules; these typically expose anomaly-specific parameters (sensitivity, baselining windows, thresholds) that are product-specific configuration options. | +| [Workbooks for Microsoft Sentinel data lake](https://learn.microsoft.com/en-us/azure/sentinel/datalake/workbooks-for-data-lake) | configuration | 0.65 | Describes selecting Sentinel data lake as a workbook data source and running queries; likely includes specific configuration steps and options for workbook data sources. | +| [Workspace deployed parsers](https://learn.microsoft.com/en-us/azure/sentinel/normalization-about-workspace-parsers) | configuration | 0.65 | Covers managing workspace-deployed parsers; likely includes specific parser resource names, deployment options, and configuration behaviors unique to Sentinel. | +| [Workspaces in the Defender portal](https://learn.microsoft.com/en-us/azure/sentinel/workspaces-defender-portal) | architecture-patterns | 0.65 | Describes primary/secondary workspace patterns and when to use them, especially with Defender XDR; product-specific architecture guidance. | +| [Agent creation](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-agent-creation-tool) | integrations | 0.63 | Tool collection reference for creating Security Copilot agents; expected to list tool APIs and parameters specific to agent creation. | +| [Triage](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-triage-tool) | integrations | 0.63 | Triage tool collection integrates AI models with incident triage and hunting APIs; likely documents tool operations and parameters. 
| +| [Collaborate in Microsoft Teams](https://learn.microsoft.com/en-us/azure/sentinel/collaborate-in-microsoft-teams) | integrations | 0.62 | Covers a direct, product-specific integration between Sentinel and Teams; such pages typically include connection settings and behaviors unique to this integration, beyond generic Teams usage. | +| [Create KQL jobs](https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-jobs) | decision-making | 0.62 | Explains when to use KQL jobs, benefits, and scenarios vs direct queries; likely includes guidance on when to promote data to analytics tier with some quantified trade-offs. | +| [Export and import automation rules](https://learn.microsoft.com/en-us/azure/sentinel/import-export-automation-rules) | deployment | 0.62 | Covers managing automation rules as code via ARM templates, including export/import behavior and workspace-independence, which are product-specific deployment-as-code details. | +| [ASIM-based domain solutions](https://learn.microsoft.com/en-us/azure/sentinel/domain-based-essential-solutions) | configuration | 0.60 | Describes Microsoft essential solutions using ASIM schemas; includes product-specific solution behavior and configuration context. | +| [Aggregate data with summary rules](https://learn.microsoft.com/en-us/azure/sentinel/summary-rules) | configuration | 0.60 | Describes configuring summary rules and custom tables for aggregation across log tiers; likely includes rule settings and table behaviors specific to Sentinel. | +| [Create custom entity activities](https://learn.microsoft.com/en-us/azure/sentinel/customize-entity-activities) | configuration | 0.60 | Adding customized activities implies specifying activity definitions and mappings; these are product-specific configuration options for entity timelines. 
| +| [Get fine-tuning recommendations](https://learn.microsoft.com/en-us/azure/sentinel/detection-tuning) | best-practices | 0.60 | Provides fine-tuning recommendations to reduce false positives while maintaining coverage; likely includes concrete rule configuration adjustments and patterns specific to Sentinel. | +| [Manage solution deprecation lifecycle](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solution-deprecation) | configuration | 0.60 | Explains identifying deprecated solutions and actions to take; Sentinel-specific content lifecycle management guidance. | +| [Manage template versions for analytics rules](https://learn.microsoft.com/en-us/azure/sentinel/manage-analytics-rule-templates) | configuration | 0.60 | Explains managing relationships between templates and rules, merging updates, and reverting changes; involves product-specific configuration flows and options for template-version handling. | +| [Manage your SOC with incident metrics](https://learn.microsoft.com/en-us/azure/sentinel/manage-soc-with-incident-metrics) | configuration | 0.60 | Describes incident metrics screen/workbook and how to use specific metrics (MTTR, severity, MITRE tactics) within Sentinel; product-specific configuration/usage. | +| [Manage your intellectual property in Microsoft Sentinel](https://learn.microsoft.com/en-us/azure/sentinel/mssp-protect-intellectual-property) | decision-making | 0.60 | Describes methods to protect analytics rules, playbooks, etc., depending on CSP vs EA/PAYG; product- and channel-specific decision guidance. | +| [Monitor SAP audit controls](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-audit-controls-workbook) | best-practices | 0.60 | Describes using the Security Audit Controls workbook to track compliance; likely includes concrete mappings and usage patterns specific to this workbook. 
| +| [Monitor SAP audit logs](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-audit-log-workbook) | best-practices | 0.60 | Workbook usage article will include specific queries/visuals and how to interpret them for SAP security monitoring, which is product-specific operational guidance. | +| [Monitor costs](https://learn.microsoft.com/en-us/azure/sentinel/billing-monitor-costs) | decision-making | 0.60 | Explains how to interpret and act on Sentinel cost data using Cost Management; includes product-specific cost components and usage views for decision-making. | +| [Optimize your security operations](https://learn.microsoft.com/en-us/azure/sentinel/soc-optimization/soc-optimization-access) | best-practices | 0.60 | SOC optimization recommendations are product-specific guidance on tuning data and detections; likely includes concrete Sentinel-centric actions and gotchas. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/entities) | configuration | 0.60 | Defines entity types and how Sentinel classifies data elements; typically includes lists of entity schemas and mappings that are product-specific configuration details. | +| [SAP LogServ](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-logserv-overview) | architecture-patterns | 0.60 | Overview of integration with SAP LogServ to extend monitoring to infra/OS logs; likely includes architecture patterns for RISE/ECS environments. | +| [Security content reference](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-btp-security-content) | security | 0.60 | Details built-in security content (workbook, analytics rules) for SAP BTP; this is product-specific security detection content. 
| +| [Threat intelligence integrations](https://learn.microsoft.com/en-us/azure/sentinel/threat-intelligence-integration) | integrations | 0.60 | Describes ways threat intelligence feeds are integrated and used; likely includes connector types and workspace strategies that are product-specific integration guidance. | +| [UEBA data sources and table schemas](https://learn.microsoft.com/en-us/azure/sentinel/ueba-reference) | configuration | 0.60 | Lists input data sources and describes enrichments UEBA adds to entities; this is a reference of product-specific fields and enrichment schema, which is configuration/schema knowledge. | ## Unclassified Pages | TOC Title | Confidence | Reason | |-----------|------------|--------| -| [Aggregate behavioral insights from raw logs](https://learn.microsoft.com/en-us/azure/sentinel/entity-behaviors-layer) | 0.50 | Explains the UEBA behaviors abstraction layer conceptually; likely describes behavior types but not as configuration parameters or limits tables. | -| [Azure Logic Apps for Microsoft Sentinel playbooks](https://learn.microsoft.com/en-us/azure/sentinel/automation/logic-apps-playbooks) | 0.50 | Explains how Logic Apps underpins Sentinel playbooks; summary reads as conceptual integration overview without detailed parameter tables or constraints. | -| [Convert dashboards to workbooks](https://learn.microsoft.com/en-us/azure/sentinel/migration-convert-dashboards) | 0.50 | Covers reviewing, planning, and converting dashboards to Azure Workbooks; appears to be migration how-to without explicit decision matrices or numeric thresholds. | -| [Create incidents manually](https://learn.microsoft.com/en-us/azure/sentinel/create-incident-manually) | 0.50 | Manual incident creation description is largely about capability and preview status; summary doesn’t expose detailed parameters or numeric constraints. 
| -| [Delete incidents](https://learn.microsoft.com/en-us/azure/sentinel/delete-incident) | 0.50 | Describes incident deletion via portal/API/Logic Apps; appears procedural without detailed configuration matrices or limits. | -| [Ingest data](https://learn.microsoft.com/en-us/azure/sentinel/migration-export-ingest) | 0.50 | Describes how to ingest historical data into the chosen platform; likely procedural steps rather than comparison matrices or numeric limits. | -| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/entities) | 0.50 | Conceptual explanation of entities and how Sentinel uses them; summary doesn’t indicate detailed config tables or numeric constraints. | -| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-security-copilot) | 0.50 | High-level description of Security Copilot with Sentinel; summary doesn’t show specific prompts, parameters, or configuration tables. | -| [Sample KQL queries](https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-sample-queries) | 0.50 | Sample KQL queries are useful but are generic query examples rather than product-specific configuration, limits, or troubleshooting patterns; they don’t fit the defined sub-skill types. | -| [Summarize incidents in Azure portal](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-security-copilot-incident-summary) | 0.50 | Describes incident summarization capability conceptually; no detailed configuration or numeric thresholds are evident from the summary. | -| [Track migration with a workbook](https://learn.microsoft.com/en-us/azure/sentinel/migration-track) | 0.50 | Workbook for tracking migration status; mostly operational tracking and UI usage, not deep config, limits, or troubleshooting. | -| [Bookmarks](https://learn.microsoft.com/en-us/azure/sentinel/bookmarks) | 0.45 | Explains bookmarks conceptually and how to use them; no detailed configuration parameters, limits, or error-resolution mappings are indicated. 
| -| [Connect data sources](https://learn.microsoft.com/en-us/azure/sentinel/configure-data-connector) | 0.45 | General how-to for installing data connectors from Content hub; likely step-by-step UI instructions rather than detailed config parameter references. | -| [Create and perform advanced incident tasks using playbooks](https://learn.microsoft.com/en-us/azure/sentinel/automation/create-tasks-playbook) | 0.45 | Shows how to use playbooks to create incident tasks; appears to be workflow guidance without detailed configuration tables, limits, or error-code troubleshooting. | -| [Entity pages](https://learn.microsoft.com/en-us/azure/sentinel/entity-pages) | 0.45 | Describes entity pages and what they show; primarily UI and investigation experience, not configuration or limits. | -| [Graph visualization](https://learn.microsoft.com/en-us/azure/sentinel/datalake/graph-visualization) | 0.45 | Graph visualization article appears focused on interactive investigation workflows and running graph queries; likely more tutorial/usage than configuration tables or product-specific limits. | -| [Investigate incidents in depth](https://learn.microsoft.com/en-us/azure/sentinel/investigate-incidents) | 0.45 | Explains incident details page and investigation panels; mostly UI/flow description, not configuration tables or limits. | -| [MITRE ATT&CK coverage](https://learn.microsoft.com/en-us/azure/sentinel/mitre-coverage) | 0.45 | Describes viewing MITRE coverage; primarily visualization/overview of coverage indicators, not detailed config parameters or numeric thresholds for decisions. | -| [Microsoft Sentinel MCP server overview](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-overview) | 0.45 | Overview of Sentinel’s MCP support and tool collections; high-level description of capabilities, not detailed configuration or limits. 
| -| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/fusion) | 0.45 | Explains Fusion-based multistage attack detection conceptually; summary doesn’t indicate detailed config tables or numeric thresholds. | -| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/incident-investigation) | 0.45 | Incident investigation and case management overview; likely UX and workflow focused without detailed configs or numeric thresholds in the summary. | -| [Prepare for multiple workspaces](https://learn.microsoft.com/en-us/azure/sentinel/prepare-multiple-workspaces) | 0.45 | Multiple workspace preparation article appears to be conceptual architecture planning without numeric thresholds or detailed decision matrices. | -| [Relate alerts to incidents](https://learn.microsoft.com/en-us/azure/sentinel/relate-alerts-to-incidents) | 0.45 | Shows how to relate alerts to incidents; summary suggests feature usage rather than deep configuration or troubleshooting content. | -| [Review sample workspace designs](https://learn.microsoft.com/en-us/azure/sentinel/sample-workspace-designs) | 0.45 | Sample workspace designs are architecture guidance but summary suggests conceptual patterns, not quantified thresholds or decision matrices. | -| [SOAR content catalog](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-soar-content) | 0.45 | SOAR content catalog listing playbooks and connectors; primarily a catalog/overview without deep configuration parameters or troubleshooting mappings. | -| [Set up multiple workspaces](https://learn.microsoft.com/en-us/azure/sentinel/use-multiple-workspaces) | 0.45 | Setting up multiple workspaces and tenants is deployment/architecture execution; summary suggests procedural steps rather than quantified thresholds or decision matrices. 
| -| [Triage and manage your incidents](https://learn.microsoft.com/en-us/azure/sentinel/incident-navigate-triage) | 0.45 | Describes basic navigation and triage steps; appears procedural without deep product-specific configuration or numeric guidance. | -| [Tutorial - Forward syslog data to workspace](https://learn.microsoft.com/en-us/azure/sentinel/forward-syslog-monitor-agent) | 0.45 | Tutorial for forwarding Syslog via AMA; mostly procedural steps, not focused on configuration parameter matrices, limits, or troubleshooting mappings. | -| [UEBA investigation examples](https://learn.microsoft.com/en-us/azure/sentinel/investigate-with-ueba) | 0.45 | Describes investigation workflows using UEBA data; scenario-focused guidance rather than configuration, limits, or troubleshooting reference. | -| [UEBA overview](https://learn.microsoft.com/en-us/azure/sentinel/identify-threats-with-entity-behavior-analytics) | 0.45 | UEBA overview and onboarding; mostly conceptual explanation of how UEBA works and how to enable it, without strong indication of detailed config tables or limits. | -| [ASIM content](https://learn.microsoft.com/en-us/azure/sentinel/normalization-content) | 0.40 | Catalog-style listing of ASIM-based security content; mainly links and descriptions of content types without deep configuration, limits, or troubleshooting details. | -| [ASIM overview](https://learn.microsoft.com/en-us/azure/sentinel/normalization) | 0.40 | Conceptual explanation of normalization and ASIM; does not expose detailed configuration tables, limits, or troubleshooting mappings. | -| [ASIM parsers](https://learn.microsoft.com/en-us/azure/sentinel/normalization-parsers-overview) | 0.40 | Overview of ASIM parsers; primarily conceptual and linking to detailed parser docs, without concrete configuration or troubleshooting content. 
| -| [Add entity to threat indicators](https://learn.microsoft.com/en-us/azure/sentinel/add-entity-to-threat-intelligence) | 0.40 | Describes adding entities discovered during investigations as IOCs; likely a procedural UI guide without deep config tables, limits, or error-code troubleshooting. | -| [Automate and run playbooks](https://learn.microsoft.com/en-us/azure/sentinel/automation/run-playbooks) | 0.40 | Describes how to attach and run playbooks automatically or manually; appears to be operational how-to without detailed config matrices, limits, or troubleshooting mappings. | -| [Conduct end-to-end hunts](https://learn.microsoft.com/en-us/azure/sentinel/hunts) | 0.40 | End-to-end hunting experience description is largely workflow/UX oriented; no specific configs, limits, or error mappings are evident from the summary. | -| [Configure content](https://learn.microsoft.com/en-us/azure/sentinel/configure-content) | 0.40 | Configuring Sentinel content article is likely procedural (how to set up rules, connectors) without detailed parameter tables or numeric constraints in the summary. | -| [Create and manage playbooks](https://learn.microsoft.com/en-us/azure/sentinel/automation/create-playbooks) | 0.40 | Create/manage playbooks article sounds procedural; summary does not indicate presence of specific config matrices, limits, or error-code-based troubleshooting. | -| [Customize playbooks from templates](https://learn.microsoft.com/en-us/azure/sentinel/automation/use-playbook-templates) | 0.40 | Shows how to create playbooks from templates; appears to be a how-to tutorial without detailed configuration tables or product-specific numeric guidance. | -| [Deploy and monitor decoy honeytokens](https://learn.microsoft.com/en-us/azure/sentinel/monitor-key-vault-honeytokens) | 0.40 | High-level pointer to a community-supported solution and GitHub docs; the expert details live in GitHub, not in this page. 
| -| [Ingest-time normalization](https://learn.microsoft.com/en-us/azure/sentinel/normalization-ingest-time) | 0.40 | High-level explanation of ingest-time normalization; summary suggests conceptual description rather than detailed configuration parameters. | -| [Microsoft Sentinel data lake overview](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-overview) | 0.40 | Overview of Sentinel data lake; conceptual description of capabilities and benefits without detailed config tables or numeric thresholds in the summary. | -| [Multistage attack detection scenarios](https://learn.microsoft.com/en-us/azure/sentinel/fusion-scenario-reference) | 0.40 | Catalog of Fusion-detected scenarios grouped by threat classification. Useful, but not focused on configuration options, limits, or troubleshooting mappings. | -| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/automation/automate-responses-with-playbooks) | 0.40 | High-level guidance on using Sentinel playbooks to automate threat response; summary suggests conceptual/tutorial content without specific error codes, config tables, or numeric limits. | -| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-overview) | 0.40 | High-level overview of using KQL with Sentinel data lake; mostly conceptual and scenario-focused without detailed configuration tables, limits, or error mappings. | -| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-btp-solution-overview) | 0.40 | Overview of Sentinel solution for SAP BTP; summary suggests high-level introduction without detailed configuration tables, limits, or error mappings. | -| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/understand-threat-intelligence) | 0.40 | Conceptual explanation of threat intelligence in Sentinel; summary suggests high-level understanding rather than detailed configuration or error mappings. 
| -| [Prerequisites](https://learn.microsoft.com/en-us/azure/sentinel/prerequisites) | 0.40 | Prerequisites list is likely conceptual (tenant requirements, permissions) without detailed numeric limits or config tables in the summary. | -| [Track and respond to threats with threat analytics](https://learn.microsoft.com/en-us/azure/sentinel/threat-analytics-sentinel) | 0.40 | High-level description of threat analytics access; summary doesn’t show concrete configuration, limits, or troubleshooting mappings. | -| [Tutorial - Detect threats using analytics rules](https://learn.microsoft.com/en-us/azure/sentinel/tutorial-log4j-detection) | 0.40 | Tutorial for a specific scenario (Log4j detection); mostly step-by-step usage of analytics rules rather than reference-style expert configuration or limits. | -| [Unified connectors overview](https://learn.microsoft.com/en-us/azure/sentinel/unified-connector) | 0.40 | Overview of Unified Connectors Platform and its benefits; largely conceptual/marketing without detailed configuration or limits. | -| [Use Sentinel MCP tools in Visual Studio Code](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-visual-studio-code) | 0.40 | How-to/tutorial style for using MCP tools in VS Code; likely step-by-step enablement without detailed config tables, limits, or product-specific troubleshooting matrices. | -| [Use matching analytics to detect threats](https://learn.microsoft.com/en-us/azure/sentinel/use-matching-analytics-to-detect-threats) | 0.40 | Covers using Microsoft-generated threat intelligence via a built-in rule; likely high-level usage of a predefined rule, not detailed config parameters or limits. | -| [Use threat indicators in analytics rules](https://learn.microsoft.com/en-us/azure/sentinel/use-threat-indicators-in-analytics-rules) | 0.40 | Explains using threat indicators in analytics rules; appears to be conceptual/usage guidance rather than detailed configuration matrices or limits. 
| -| [Using federated tables](https://learn.microsoft.com/en-us/azure/sentinel/datalake/using-data-federation) | 0.40 | Focuses on how to view and query federated tables using portal, KQL, and notebooks; sounds like usage/tutorial content rather than detailed config, limits, or troubleshooting. | -| [Custom logs - configure security device](https://learn.microsoft.com/en-us/azure/sentinel/unified-connector-custom-device) | 0.35 | Acts as a pointer to provider-specific configuration information for various applications; the detailed parameters are in external docs. | -| [Delete out-of-the-box content](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-delete) | 0.35 | Deleting installed content/solutions is procedural management; summary does not indicate detailed configuration tables or constraints. | -| [Deployment overview](https://learn.microsoft.com/en-us/azure/sentinel/sap/deployment-overview) | 0.35 | Deployment overview for SAP solution is introductory; detailed prerequisites and configuration are in separate articles. | -| [Enable Microsoft Sentinel and initial features and content](https://learn.microsoft.com/en-us/azure/sentinel/enable-sentinel-features-content) | 0.35 | Enabling Sentinel and initial features is a step-by-step deployment action article; summary does not indicate matrices, limits, or detailed config tables beyond basic onboarding. | -| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/business-applications/solution-overview) | 0.35 | High-level overview of Sentinel solution for Microsoft Business Apps; summary does not indicate detailed configuration, limits, or troubleshooting content. | -| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/datalake/notebooks-overview) | 0.35 | Overview of Jupyter notebooks with Sentinel data lake; high-level explanation of capabilities rather than detailed configuration or troubleshooting. 
| -| [Anomalies reference](https://learn.microsoft.com/en-us/azure/sentinel/anomalies-reference) | 0.30 | The page lists types of anomalies detected by Microsoft Sentinel's ML engine and explains baseline/anomaly concepts, but there is no indication of numeric limits, configuration tables, error codes, or decision matrices. It appears to be descriptive/reference content about anomaly categories rather than detailed configuration, troubleshooting, or quantified guidance. | -| [Asset data in the Sentinel data lake](https://learn.microsoft.com/en-us/azure/sentinel/datalake/enable-data-connectors) | 0.30 | Conceptual explanation of asset data and benefits; appears more like an overview than a configuration or troubleshooting guide. | -| [Create a custom connector](https://learn.microsoft.com/en-us/azure/sentinel/create-custom-connector) | 0.30 | High-level resource index for creating custom connectors; detailed parameters and patterns are in the linked method-specific docs, not here. | -| [Create custom connectors using AI agent in Microsoft Sentinel](https://learn.microsoft.com/en-us/azure/sentinel/create-custom-connector-builder-agent) | 0.30 | Summary indicates a conceptual/getting-started overview of the Sentinel connector builder agent and workflow. It does not clearly reference concrete configuration parameter tables, limits, error codes, or other product-specific expert details as defined in the sub-skill types. | -| [Create custom graph using AI](https://learn.microsoft.com/en-us/azure/sentinel/datalake/create-graphs-with-ai) | 0.30 | Describes using GitHub Copilot and notebooks for graph authoring; appears as a how-to/tutorial without product-specific limits, configs, or error mappings. | -| [Deployment planning guide](https://learn.microsoft.com/en-us/azure/sentinel/deploy-overview) | 0.30 | Deployment guide overview describing phases; lacks specific deployment matrices, constraints, or config parameter tables. 
| -| [Device configuration for CEF via AMA](https://learn.microsoft.com/en-us/azure/sentinel/unified-connector-cef-device) | 0.30 | Primarily a list of provider-supplied installation links for various devices; the expert details live in external vendor docs, not in this page. | -| [Generate playbooks using AI](https://learn.microsoft.com/en-us/azure/sentinel/automation/generate-playbook) | 0.30 | Describes AI-based playbook generation experience; mostly feature overview and workflow, not detailed configuration or limits. | -| [Identity attack graph](https://learn.microsoft.com/en-us/azure/sentinel/datalake/identity-attack-graph) | 0.30 | The description and summary indicate a conceptual explanation of the identity attack graph and how it visualizes identities, permissions, and lateral movement paths. There’s no clear indication of configuration parameters, limits, error codes, or decision matrices; it reads as a feature overview rather than detailed expert guidance. | -| [Ingestion-time data transformation](https://learn.microsoft.com/en-us/azure/sentinel/data-transformation) | 0.30 | From the summary, the page is a conceptual overview of custom data ingestion and transformation using Azure Monitor Logs and data collection rules for Microsoft Sentinel. It does not clearly indicate the presence of specific numeric limits, configuration parameter tables, error-code-based troubleshooting, or decision matrices. Without evidence of detailed settings, quotas, or product-specific diagnostic mappings, it does not meet the expert-knowledge criteria for any sub-skill type. | -| [OOTB content centralization changes](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-content-centralize) | 0.30 | Describes centralization changes and where OOTB content appears; primarily informational/announcement style without detailed configuration parameters, limits, or troubleshooting mappings. 
| -| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-tools-overview) | 0.30 | Overview of MCP tool collections and example prompts; conceptual and scenario-focused, not configuration, limits, or troubleshooting content. | -| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions) | 0.30 | Content and solutions overview is conceptual, describing types of content and solutions without deep configuration or numeric constraints. | -| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/threat-detection) | 0.30 | Threat detection overview and rule types; appears conceptual with no strong indication of numeric thresholds, config tables, or error mappings. | -| [Partner solutions](https://learn.microsoft.com/en-us/azure/sentinel/sap/solution-partner-overview) | 0.30 | Partner add-ons overview; primarily marketing/partner discovery content without technical configuration or troubleshooting details. | -| [Prioritize data connectors](https://learn.microsoft.com/en-us/azure/sentinel/prioritize-data-connectors) | 0.30 | Prioritizing data connectors is planning guidance; summary does not indicate specific numeric criteria or configuration tables. | -| [Recommended and sample playbooks](https://learn.microsoft.com/en-us/azure/sentinel/automation/playbook-recommendations) | 0.30 | Lists sample use cases and templates; likely catalog/overview of examples without deep config parameters, limits, or troubleshooting mappings. | -| [Responsible AI FAQ for UEBA behaviors layer](https://learn.microsoft.com/en-us/azure/sentinel/entity-behaviors-layer-rai-faqs) | 0.30 | Responsible AI FAQ about UEBA behaviors layer focuses on AI usage, evaluation, and limitations conceptually. It does not emphasize concrete configuration parameters, limits, or troubleshooting mappings. 
| -| [Responsible AI FAQs](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-responsible-ai-faq) | 0.30 | Responsible AI FAQ is conceptual/compliance-focused; unlikely to contain concrete RBAC roles, config parameters, or other product-specific security settings. | -| [Syslog - configure security device](https://learn.microsoft.com/en-us/azure/sentinel/unified-connector-syslog-device) | 0.30 | Similar to index 1, this is a catalog of external instructions for devices; it doesn’t itself contain detailed Sentinel- or AMA-specific configuration parameters. | -| [Transition to the Defender portal](https://learn.microsoft.com/en-us/azure/sentinel/move-to-defender) | 0.30 | Transition guide between Sentinel in Azure portal and Defender portal; likely procedural/UX steps without limits, configs tables, or error-code-based troubleshooting. | -| [Use ASIM](https://learn.microsoft.com/en-us/azure/sentinel/normalization-about-parsers) | 0.30 | Explains using ASIM parsers conceptually and references a table mapping schemas to parsers, but the summary does not show detailed configuration parameters, error codes, or best-practice gotchas; it appears to be usage guidance rather than deep expert configuration or limits. | -| [Work with threat intelligence](https://learn.microsoft.com/en-us/azure/sentinel/work-with-threat-indicators) | 0.30 | General guidance on viewing, creating, and managing threat intelligence; summary suggests UI workflow and conceptual usage, not detailed limits, configs, or error mappings. | -| [Custom graphs overview](https://learn.microsoft.com/en-us/azure/sentinel/datalake/custom-graphs-overview) | 0.25 | Labeled as an overview of custom graphs; description emphasizes capabilities and scenarios, not detailed configuration parameters, limits, or decision matrices. 
| -| [Microsoft Sentinel SIEM overview](https://learn.microsoft.com/en-us/azure/sentinel/overview) | 0.25 | Another high-level overview of Sentinel SIEM capabilities; marketing/feature description without detailed expert-only configuration or limits. | -| [ASIM asset entity schema](https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-asset) | 0.20 | Asset Entity normalization schema reference is largely a schema/field definition. While detailed, it is not a configuration, limits, troubleshooting, or decision-making guide and does not fit any specified sub-skill category. | -| [ASIM schemas](https://learn.microsoft.com/en-us/azure/sentinel/normalization-about-schemas) | 0.20 | Describes what ASIM schemas are and their role in normalization, but the summary indicates a conceptual explanation without concrete configuration parameters, limits, or product-specific error/decision matrices. | -| [Find data connector](https://learn.microsoft.com/en-us/azure/sentinel/data-connectors-reference) | 0.20 | Page is a catalog/list of Microsoft Sentinel data connectors with links out to individual connector deployment/configuration pages. It does not itself contain specific limits, configuration parameter tables, error codes, or decision matrices; it primarily serves as navigation to other documentation. | -| [Microsoft Sentinel graph overview](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-graph-overview) | 0.20 | Described as an overview of Sentinel graph and its benefits; likely conceptual architecture explanation without concrete thresholds, configs, or error mappings. | -| [Onboard to Microsoft Sentinel](https://learn.microsoft.com/en-us/azure/sentinel/quickstart-onboard) | 0.20 | Quickstart onboarding tutorial; focuses on basic enablement and a sample data connector, not deep configuration or expert-only details. 
| -| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/connect-data-sources) | 0.20 | This is a high-level description of Microsoft Sentinel data connectors and supported sources. The summary indicates an overview of connectors and integration examples, but not specific quotas, configuration parameter tables, or troubleshooting mappings, so it lacks the required expert-knowledge characteristics. | -| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/datalake/data-federation-overview) | 0.20 | High-level overview of data federation; description suggests conceptual explanation of capabilities without numeric limits, config parameter tables, or decision matrices. | -| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/hunting) | 0.20 | Page describes Microsoft Sentinel hunting capabilities and built-in hunting queries at a conceptual/feature level. From the summary, it does not indicate specific error codes, configuration parameter tables, limits, or detailed product-specific settings; it focuses on how hunting helps analysts proactively find threats, which is general feature/usage guidance rather than expert-knowledge reference content. | -| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/sap/solution-overview) | 0.20 | Described as an overview of the Microsoft Sentinel solution for SAP applications and available support. This is primarily conceptual/marketing-style overview of challenges and solution scope, without clear indication of detailed configuration parameters, limits, or decision matrices. | -| [SAP LogServ](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-logserv-overview) | 0.20 | The page is an overview of SAP LogServ integration with Microsoft Sentinel for SAP. 
From the summary it appears to be conceptual/introductory, describing what LogServ is and why it's needed for SAP RISE/ECS environments, without exposing concrete configuration parameters, limits, error codes, or detailed troubleshooting/decision matrices. It reads as a high-level integration overview rather than a detailed configuration, best-practices, or troubleshooting guide. | -| [SOC optimization reference](https://learn.microsoft.com/en-us/azure/sentinel/soc-optimization/soc-optimization-reference) | 0.20 | Describes types of SOC optimization recommendations conceptually; no indication of numeric limits, config tables, or API/SDK parameter details. | -| [Top Microsoft Sentinel workbooks](https://learn.microsoft.com/en-us/azure/sentinel/top-workbooks) | 0.20 | Primarily a catalog/list of commonly used workbooks and how to access them. No detailed configuration parameters, limits, or troubleshooting content. | -| [View collected data on the Overview dashboard](https://learn.microsoft.com/en-us/azure/sentinel/get-visibility) | 0.20 | Overview dashboard description; mostly UI and conceptual monitoring, no indication of numeric limits, config tables, or troubleshooting mappings. | -| [View customized views with workbooks](https://learn.microsoft.com/en-us/azure/sentinel/monitor-your-data) | 0.20 | Page is a how-to/tutorial on visualizing data with Microsoft Sentinel workbooks. It describes using templates and creating custom workbooks but does not expose product-specific limits, configuration parameter tables, error-code-based troubleshooting, or quantified decision criteria. Content is primarily conceptual and procedural, not expert reference material. | -| [What is Microsoft Sentinel?](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-overview) | 0.20 | High-level product overview of Microsoft Sentinel as a SIEM/platform; no concrete limits, configs, roles, or error mappings. 
| -| [Windows security events](https://learn.microsoft.com/en-us/azure/sentinel/data-connectors-reference) | 0.20 | Same page as index 0: a reference list of Sentinel data connectors and links, without embedded expert-level configuration details, limits, troubleshooting mappings, or decision criteria. | -| [ASIM parsers](https://learn.microsoft.com/en-us/azure/sentinel/normalization-parsers-list) | 0.10 | Page is primarily a catalog/list of ASIM parsers without configuration parameters, limits, error codes, or decision matrices. It does not map to the defined sub-skill types and lacks the kind of expert numeric/configuration detail required. | -| [Microsoft Sentinel skill-up training](https://learn.microsoft.com/en-us/azure/sentinel/skill-up-resources) | 0.10 | Training/learning path aggregator that links to other resources. No direct expert technical details like limits, configuration tables, or API references. | -| [What's new](https://learn.microsoft.com/en-us/azure/sentinel/whats-new) | 0.10 | Release notes and feature announcements for Microsoft Sentinel without detailed limits, configuration tables, error codes, or decision matrices; primarily high-level 'what's new' content rather than deep, product-specific expert guidance. | +| [Sample KQL queries](https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-sample-queries) | 0.55 | Sample KQL queries; while useful, they are generic query examples rather than configuration, limits, or troubleshooting knowledge. | +| [Ingest data](https://learn.microsoft.com/en-us/azure/sentinel/migration-export-ingest) | 0.50 | Describes how to ingest data into the chosen platform; likely procedural without explicit comparison matrices or limits; more of a step-by-step ingestion guide. 
| +| [Automatically enrich incident information](https://learn.microsoft.com/en-us/azure/sentinel/tutorial-enrich-ip-information) | 0.45 | Tutorial for checking IP reputation via automation; summary suggests a worked example rather than a catalog of product-specific configuration options. | +| [Automation rules](https://learn.microsoft.com/en-us/azure/sentinel/automate-incident-handling-with-automation-rules) | 0.45 | Conceptual explanation of automation rules and their role; summary does not show specific parameter tables or numeric thresholds. | +| [Create and perform advanced incident tasks using playbooks](https://learn.microsoft.com/en-us/azure/sentinel/automation/create-tasks-playbook) | 0.45 | Describes using playbooks to create incident tasks; appears as feature usage guidance without detailed config tables, limits, or error-code-based troubleshooting. | +| [Creating analytics rules](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-analytic-rules-creation) | 0.45 | Covers creating analytics rules for solutions; mostly rule authoring workflow and concepts, not low-level configuration tables or limits. | +| [Creating hunting queries](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-hunting-rules-creation) | 0.45 | Hunting query creation guidance; primarily KQL and conceptual hunting patterns rather than product-specific configuration or error mappings. | +| [Creating playbooks](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-playbook-creation) | 0.45 | Playbook creation with example scenarios; likely focuses on Logic Apps workflows conceptually rather than Sentinel-specific config tables or limits. | +| [Creating summary rules](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-summary-rules-creation) | 0.45 | Summary rules explanation and scheduling examples; likely conceptual with simple examples, not detailed limits or config matrices. 
| +| [Extract incident entities with non-native actions](https://learn.microsoft.com/en-us/azure/sentinel/tutorial-extract-incident-entities) | 0.45 | Tutorial on extracting non-native entities; focuses on example playbook actions rather than general configuration matrices or limits. | +| [Graph visualization](https://learn.microsoft.com/en-us/azure/sentinel/datalake/graph-visualization) | 0.45 | Describes how to visualize custom graphs and run graph queries; likely focused on usage and investigation workflows, not on configuration tables or numeric limits. | +| [Handle false positives](https://learn.microsoft.com/en-us/azure/sentinel/false-positives) | 0.45 | Describes handling false positives conceptually and via automation/rule changes; summary does not show concrete product-specific configs, codes, or limits. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-overview) | 0.45 | KQL overview for the data lake; conceptual and scenario-focused, not a detailed config, limits, or troubleshooting reference. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/datalake/notebooks-overview) | 0.45 | Overview of notebooks and capabilities; mostly conceptual without detailed parameter tables or limits. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/investigate-large-datasets) | 0.45 | Explains search jobs and long-term retention conceptually; summary does not show specific numeric thresholds beyond generic 'large datasets'. | +| [Respond to threats using automation](https://learn.microsoft.com/en-us/azure/sentinel/automation/tutorial-respond-threats-playbook) | 0.45 | Scenario-based tutorial for a playbook; likely step-by-step but summary does not indicate reusable configuration tables or numeric constraints. 
| +| [Responsible AI FAQ for UEBA behaviors layer](https://learn.microsoft.com/en-us/azure/sentinel/entity-behaviors-layer-rai-faqs) | 0.45 | Responsible AI FAQ about UEBA behaviors; mostly conceptual and policy/impact discussion without concrete configuration, limits, or troubleshooting. | +| [Run notebooks](https://learn.microsoft.com/en-us/azure/sentinel/datalake/notebooks) | 0.45 | How-to article for running notebooks; likely step-based without deep configuration matrices or limits. | +| [Sentinel tables and connectors](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-tables-connectors-reference) | 0.45 | Lookup table mapping connectors to tables; useful but mostly catalog/navigation without deep configuration or limits. | +| [Track migration with a workbook](https://learn.microsoft.com/en-us/azure/sentinel/migration-track) | 0.45 | Workbook-based tracking of migration status; mostly procedural and dashboard usage, without clear config matrices, limits, or troubleshooting mappings. | +| [Tutorial - Forward syslog data to workspace](https://learn.microsoft.com/en-us/azure/sentinel/forward-syslog-monitor-agent) | 0.45 | Step-by-step tutorial for forwarding Syslog via Azure Monitor Agent; typical how-to without detailed limits or config matrices. | +| [Using federated tables](https://learn.microsoft.com/en-us/azure/sentinel/datalake/using-data-federation) | 0.45 | Shows how to view and query federated tables using KQL and notebooks; primarily usage examples rather than configuration matrices or troubleshooting. | +| [Aggregate behavioral insights from raw logs](https://learn.microsoft.com/en-us/azure/sentinel/entity-behaviors-layer) | 0.40 | Explains the UEBA behaviors abstraction layer conceptually; summary doesn’t indicate specific configuration parameters, limits, or error mappings. 
| +| [Azure Logic Apps for Microsoft Sentinel playbooks](https://learn.microsoft.com/en-us/azure/sentinel/automation/logic-apps-playbooks) | 0.40 | Explains how Logic Apps underpins playbooks; likely conceptual integration overview without detailed parameter tables or error mappings. | +| [Create a scheduled rule from a template](https://learn.microsoft.com/en-us/azure/sentinel/create-analytics-rule-from-template) | 0.40 | Focuses on creating rules from templates; usually more of a how-to/template usage guide than a full configuration reference with parameter tables or numeric ranges. | +| [Create custom graphs](https://learn.microsoft.com/en-us/azure/sentinel/datalake/create-custom-graphs) | 0.40 | How-to/tutorial for creating custom graphs in notebooks; likely step-based without detailed config tables, limits, or error mappings. | +| [Creating workbooks](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-workbook-creation) | 0.40 | Workbook creation and publishing; mostly UI and visualization concepts, not deep configuration references or quotas. | +| [Custom graphs overview](https://learn.microsoft.com/en-us/azure/sentinel/datalake/custom-graphs-overview) | 0.40 | Overview of custom graphs; high-level description of capabilities and scenarios without clear evidence of detailed configuration parameters, limits, or troubleshooting mappings. | +| [Customize playbooks from templates](https://learn.microsoft.com/en-us/azure/sentinel/automation/use-playbook-templates) | 0.40 | Focuses on creating playbooks from templates; likely a procedural tutorial without detailed config matrices, limits, or troubleshooting mappings. | +| [Deploy and monitor decoy honeytokens](https://learn.microsoft.com/en-us/azure/sentinel/monitor-key-vault-honeytokens) | 0.40 | High-level description of Key Vault honeytokens and community support; detailed implementation is offloaded to GitHub. 
| +| [Entity pages](https://learn.microsoft.com/en-us/azure/sentinel/entity-pages) | 0.40 | Entity pages article focuses on how to view information and timelines; more of a UX/investigation guide than a configuration or troubleshooting reference. | +| [Microsoft Sentinel solution setup essentials](https://learn.microsoft.com/en-us/azure/sentinel/solution-setup-essentials) | 0.40 | Prerequisites and setup steps for solutions; likely procedural and conceptual rather than detailed configuration references or limits. | +| [Onboard to Microsoft Sentinel](https://learn.microsoft.com/en-us/azure/sentinel/quickstart-onboard) | 0.40 | Quickstart onboarding tutorial; mostly step-by-step UI actions without deep configuration matrices or limits. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/connect-data-sources) | 0.40 | Overview of data connectors; mostly descriptive without detailed configuration tables or limits. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-tools-overview) | 0.40 | Overview of MCP tool collections and example prompts; more conceptual and scenario-focused than configuration- or limits-focused. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/sap/solution-overview) | 0.40 | Overview of Sentinel solution for SAP applications; primarily conceptual and support overview without detailed configs or limits. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions) | 0.40 | Content and solutions overview; mostly conceptual description of solution types without deep config tables or decision matrices. | +| [Publish solutions](https://learn.microsoft.com/en-us/azure/sentinel/publish-sentinel-solutions) | 0.40 | Publishing solutions to marketplace; mainly process and portal steps, not technical configuration matrices or limits. 
| +| [Remediate threats while investigating](https://learn.microsoft.com/en-us/azure/sentinel/respond-threats-during-investigation) | 0.40 | Explains entity-trigger-based playbooks conceptually; preview note but no detailed parameter tables or numeric constraints in the summary. | +| [Restore historical data](https://learn.microsoft.com/en-us/azure/sentinel/restore) | 0.40 | Describes restoring archived logs; summary lacks explicit numeric limits, configuration parameter tables, or troubleshooting mappings. | +| [SOAR content catalog](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-soar-content) | 0.40 | Catalog/overview of SOAR playbooks and connectors; primarily navigation and listing without deep configuration tables or limits. | +| [Sentinel solution quality guidelines](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solution-quality-guidance) | 0.40 | Quality guidelines and recommendations are likely high-level publishing standards without concrete product-specific configs or quantified impacts. | +| [Set up connectors for the Microsoft Sentinel data lake](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-connectors) | 0.40 | Connector setup/retention guidance but summary shows no numeric limits, config tables, or product-specific error codes; likely a procedural how-to without expert-only details. | +| [Top Microsoft Sentinel workbooks](https://learn.microsoft.com/en-us/azure/sentinel/top-workbooks) | 0.40 | List of commonly used workbooks and where to get them; primarily catalog/usage guidance without deep configuration or troubleshooting content. | +| [Audit and track changes to incident tasks](https://learn.microsoft.com/en-us/azure/sentinel/audit-track-tasks) | 0.35 | Describes auditing and tracking task changes conceptually; summary does not show specific configuration parameters or numeric thresholds. 
| +| [Automate and run playbooks](https://learn.microsoft.com/en-us/azure/sentinel/automation/run-playbooks) | 0.35 | How-to on attaching and running playbooks; procedural automation usage without clear evidence of detailed config matrices, limits, or troubleshooting. | +| [Bring your own machine learning](https://learn.microsoft.com/en-us/azure/sentinel/bring-your-own-ml) | 0.35 | Explains bring-your-own-ML conceptually; summary does not expose concrete config parameters, limits, or decision matrices. | +| [Create and manage playbooks](https://learn.microsoft.com/en-us/azure/sentinel/automation/create-playbooks) | 0.35 | Create/manage playbooks article is primarily step-by-step usage; no indication of detailed configuration tables, limits, or error mappings. | +| [Create incidents manually](https://learn.microsoft.com/en-us/azure/sentinel/create-incident-manually) | 0.35 | Manual incident creation description; preview/legal notes but no explicit expert-only configuration matrices or limits in the summary. | +| [Data source schema reference](https://learn.microsoft.com/en-us/azure/sentinel/data-source-schema-reference) | 0.35 | High-level index listing data source schemas with links; the expert details live in the linked pages, not here. | +| [Delete incidents](https://learn.microsoft.com/en-us/azure/sentinel/delete-incident) | 0.35 | Describes incident deletion options; summary does not show product-specific limits, error codes, or config tables. | +| [Get started with notebooks and MSTICPy](https://learn.microsoft.com/en-us/azure/sentinel/notebook-get-started) | 0.35 | Getting-started walkthrough for notebooks and MSTICPy; likely step-by-step tutorial without configuration matrices or numeric constraints in the summary. 
| +| [MITRE ATT&CK coverage](https://learn.microsoft.com/en-us/azure/sentinel/mitre-coverage) | 0.35 | MITRE coverage visualization article; mainly explains how to view coverage indicators, not detailed configuration parameters, numeric thresholds, or troubleshooting mappings. | +| [Microsoft Sentinel MCP server overview](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-overview) | 0.35 | High-level overview of MCP support and scenarios; not focused on limits, configs, or troubleshooting details. | +| [OOTB content centralization changes](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-content-centralize) | 0.35 | Describes centralization changes for out-of-the-box content; mostly informational about where content lives, not detailed configs or limits. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/datalake/data-federation-overview) | 0.35 | Conceptual overview of data federation; describes what it is and benefits, but no clear indication of detailed configuration parameters or limits. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/fusion) | 0.35 | Fusion overview for multistage attack detection; mainly conceptual description of how Fusion works and its benefits, not detailed config or numeric limits. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/incident-tasks) | 0.35 | Conceptual description of incident tasks and process standardization; summary lacks specific configuration or numeric guidance. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/soc-ml-anomalies) | 0.35 | High-level explanation of customizable anomalies; summary doesn’t indicate detailed parameter tables, numeric thresholds, or troubleshooting mappings. 
| +| [Relate alerts to incidents](https://learn.microsoft.com/en-us/azure/sentinel/relate-alerts-to-incidents) | 0.35 | Explains relating alerts to incidents; preview note but no numeric limits, config tables, or troubleshooting mappings in the summary. | +| [Sentinel SIEM solution lifecycle post publish](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-post-publish-tracking) | 0.35 | Post-publish tracking in Partner Center; focuses on offer status flow, not deep technical configuration or troubleshooting mappings. | +| [Summarize incidents in Azure portal](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-security-copilot-incident-summary) | 0.35 | Describes incident summarization capability conceptually; no explicit configuration tables, limits, or error mappings in the summary. | +| [Tutorial - Detect threats using analytics rules](https://learn.microsoft.com/en-us/azure/sentinel/tutorial-log4j-detection) | 0.35 | Scenario-based tutorial for Log4j detection; primarily step-by-step usage of analytics rules rather than a reusable configuration reference or troubleshooting catalog. | +| [UEBA investigation examples](https://learn.microsoft.com/en-us/azure/sentinel/investigate-with-ueba) | 0.35 | Describes investigation workflows using UEBA data; appears to be procedural guidance rather than configuration reference or troubleshooting catalog. | +| [UEBA overview](https://learn.microsoft.com/en-us/azure/sentinel/identify-threats-with-entity-behavior-analytics) | 0.35 | UEBA overview and onboarding article; describes what UEBA is and how to use it conceptually, without clear indication of detailed configuration tables or numeric thresholds. 
| +| [Use matching analytics to detect threats](https://learn.microsoft.com/en-us/azure/sentinel/use-matching-analytics-to-detect-threats) | 0.35 | Describes using Microsoft-generated threat intelligence via a built-in analytics rule; summary indicates conceptual usage, not detailed error codes, limits, or config tables. | +| [Use tasks to handle incident workflow](https://learn.microsoft.com/en-us/azure/sentinel/work-with-tasks) | 0.35 | Explains how analysts use tasks; operational guidance but no explicit expert-only configuration tables or limits in the summary. | +| [Use threat indicators in analytics rules](https://learn.microsoft.com/en-us/azure/sentinel/use-threat-indicators-in-analytics-rules) | 0.35 | Explains using threat indicators in analytics rules; likely rule-creation workflow and concepts, not detailed configuration matrices or numeric thresholds. | +| [ASIM parsers](https://learn.microsoft.com/en-us/azure/sentinel/normalization-parsers-overview) | 0.30 | Overview of ASIM parsers; summary indicates conceptual description and links to detailed docs, not specific configuration parameters or error mappings. | +| [ASIM schemas](https://learn.microsoft.com/en-us/azure/sentinel/normalization-about-schemas) | 0.30 | Describes ASIM schemas conceptually; likely field lists but summary suggests architecture/overview rather than config tables or numeric thresholds. | +| [Add entity to threat indicators](https://learn.microsoft.com/en-us/azure/sentinel/add-entity-to-threat-intelligence) | 0.30 | Describes adding entities discovered during investigations as indicators; appears to be workflow/tutorial style without product-specific configuration tables or limits. | +| [Bookmarks](https://learn.microsoft.com/en-us/azure/sentinel/bookmarks) | 0.30 | Describes bookmark usage conceptually; no configuration tables, limits, or troubleshooting content. 
| +| [Build queries or rules](https://learn.microsoft.com/en-us/azure/sentinel/watchlists-queries) | 0.30 | Focuses on using watchlists in KQL queries; no numeric limits, config tables, or product-specific error/decision matrices. | +| [Create a custom connector](https://learn.microsoft.com/en-us/azure/sentinel/create-custom-connector) | 0.30 | High-level resource overview for creating custom connectors; likely links out to detailed docs but itself is conceptual without concrete configs, limits, or error mappings. | +| [Create custom graph using AI](https://learn.microsoft.com/en-us/azure/sentinel/datalake/create-graphs-with-ai) | 0.30 | AI-assisted authoring workflow using Copilot; likely conceptual and tutorial-like without deep product-specific configs or limits. | +| [Find data connector](https://learn.microsoft.com/en-us/azure/sentinel/data-connectors-reference) | 0.30 | Reference index listing connectors and links; navigation-style content without embedded limits, config tables, or troubleshooting details. | +| [Generate playbooks using AI](https://learn.microsoft.com/en-us/azure/sentinel/automation/generate-playbook) | 0.30 | Describes AI-based playbook generation experience; appears as feature overview/how-to without deep config, limits, or troubleshooting content. | +| [Investigate incidents in depth](https://learn.microsoft.com/en-us/azure/sentinel/investigate-incidents) | 0.30 | Deep-dive UI walkthrough for incident details; still primarily conceptual/operational without explicit expert-only configs in the summary. | +| [Launch Jupyter notebook](https://learn.microsoft.com/en-us/azure/sentinel/notebooks-hunt) | 0.30 | Tutorial-style article on launching notebooks; no explicit expert-only configuration tables or limits in the summary. 
| 
+| [Microsoft Sentinel SIEM overview](https://learn.microsoft.com/en-us/azure/sentinel/overview) | 0.30 | General SIEM overview of Sentinel capabilities; primarily conceptual without detailed configs, limits, or decision matrices. |
+| [Microsoft Sentinel data lake overview](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-overview) | 0.30 | Overview of Sentinel data lake; conceptual description of capabilities without evidence of detailed limits, config tables, or troubleshooting mappings. |
+| [Microsoft Sentinel graph overview](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-graph-overview) | 0.30 | Overview of Sentinel graph capabilities; conceptual explanation of graph analytics without detailed config, limits, or troubleshooting content. |
+| [Microsoft Sentinel skill-up training](https://learn.microsoft.com/en-us/azure/sentinel/skill-up-resources) | 0.30 | Training roadmap aggregating links to other resources; meta-navigation content rather than detailed technical reference. |
+| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/automation/automate-responses-with-playbooks) | 0.30 | High-level guidance on automating threat response with playbooks; appears conceptual/how-to without detailed error codes, config tables, or limits. |
+| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/automation/automation) | 0.30 | High-level overview of automation/SOAR capabilities; summary does not show specific configuration values or limits. |
+| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/business-applications/solution-overview) | 0.30 | Solution overview for Business Apps is a high-level description of capabilities; no indication of detailed limits, configuration tables, or error codes. 
| +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/incident-investigation) | 0.30 | Describes incident investigation features conceptually; summary lacks specific configs, limits, or error/decision tables. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-btp-solution-overview) | 0.30 | Overview of Sentinel solution for SAP BTP; summary suggests high-level description without detailed configuration tables or error mappings. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-integration-guide) | 0.30 | Lifecycle/overview for building and publishing solutions; primarily conceptual and process guidance, not detailed configs or limits. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-security-copilot) | 0.30 | Overview of Security Copilot with Sentinel; summary does not expose concrete configuration parameters, limits, or troubleshooting mappings. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/threat-detection) | 0.30 | Threat detection overview describing rule types and alert/incident generation; summary suggests conceptual explanation rather than detailed config or numeric thresholds. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/understand-threat-intelligence) | 0.30 | Conceptual article about threat intelligence in Sentinel; primarily high-level explanation without detailed configuration parameters or error mappings. | +| [Remove Microsoft Sentinel from your workspaces](https://learn.microsoft.com/en-us/azure/sentinel/offboard) | 0.30 | Removal/offboarding how-to; likely step-based UI instructions without detailed limits, configs, or error mappings. | +| [Responsible AI FAQs](https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-responsible-ai-faq) | 0.30 | Responsible AI FAQ and principles; mostly policy and conceptual guidance without product-specific configuration or limits. 
| +| [Track and respond to threats with threat analytics](https://learn.microsoft.com/en-us/azure/sentinel/threat-analytics-sentinel) | 0.30 | Overview of threat analytics access; no detailed configuration, limits, or troubleshooting mappings in the summary. | +| [Unified connectors overview](https://learn.microsoft.com/en-us/azure/sentinel/unified-connector) | 0.30 | Overview of Unified Connectors Platform benefits; summary does not indicate detailed config tables, limits, or troubleshooting mappings. | +| [What's new](https://learn.microsoft.com/en-us/azure/sentinel/whats-new) | 0.30 | What's new/feature announcements; typically changelog-style without deep limits, configs, or decision matrices. | +| [Windows security events](https://learn.microsoft.com/en-us/azure/sentinel/data-connectors-reference) | 0.30 | Duplicate of index 6; acts as a catalog of connectors with links, not containing the detailed configuration or troubleshooting itself. | +| [Work with threat intelligence](https://learn.microsoft.com/en-us/azure/sentinel/work-with-threat-indicators) | 0.30 | High-level guidance on viewing, creating, and managing threat intelligence in the UI; summary suggests conceptual/UX description without detailed config tables or numeric limits. | +| [Conduct end-to-end hunts](https://learn.microsoft.com/en-us/azure/sentinel/hunts) | 0.25 | Describes end-to-end hunting experience conceptually; no detailed configuration tables, limits, or error mappings evident. | +| [Create custom query](https://learn.microsoft.com/en-us/azure/sentinel/hunts-custom-queries) | 0.25 | Explains creating custom hunting queries at a high level; summary lacks product-specific parameters or numeric constraints. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/notebooks) | 0.25 | Overview of Jupyter notebooks with Sentinel; summary does not show specific Sentinel-only configs or limits. 
| +| [Triage and manage your incidents](https://learn.microsoft.com/en-us/azure/sentinel/incident-navigate-triage) | 0.25 | Basic navigation/triage article; summary does not indicate detailed expert-only settings or limits. | +| [View customized views with workbooks](https://learn.microsoft.com/en-us/azure/sentinel/monitor-your-data) | 0.25 | Workbook visualization article; likely shows how to create and use workbooks, but summary doesn’t indicate detailed config tables or numeric constraints. | +| [ASIM overview](https://learn.microsoft.com/en-us/azure/sentinel/normalization) | 0.20 | Conceptual explanation of normalization and ASIM; no specific configs, limits, or troubleshooting content indicated. | +| [Asset data in the Sentinel data lake](https://learn.microsoft.com/en-us/azure/sentinel/datalake/enable-data-connectors) | 0.20 | Conceptual description of asset data value in Sentinel data lake; no concrete configuration parameters, limits, or troubleshooting content indicated. | +| [Ingest-time normalization](https://learn.microsoft.com/en-us/azure/sentinel/normalization-ingest-time) | 0.20 | Ingest-time normalization article is likely conceptual/architectural; no evidence of numeric limits, config tables, or troubleshooting patterns in the description. | +| [Overview](https://learn.microsoft.com/en-us/azure/sentinel/hunting) | 0.20 | Conceptual overview of hunting capabilities and built-in queries; no specific configs, limits, or troubleshooting mappings. | +| [Partner solutions](https://learn.microsoft.com/en-us/azure/sentinel/sap/solution-partner-overview) | 0.20 | Partner add-ons overview is primarily ecosystem/marketing content without detailed technical configuration or troubleshooting specifics. | +| [View collected data on the Overview dashboard](https://learn.microsoft.com/en-us/azure/sentinel/get-visibility) | 0.20 | Overview dashboard description; primarily UI widgets and graphs, not configuration parameters, limits, or troubleshooting mappings. 
| +| [What is Microsoft Sentinel?](https://learn.microsoft.com/en-us/azure/sentinel/sentinel-overview) | 0.20 | High-level product overview of Microsoft Sentinel as a SIEM/platform; no detailed limits, configs, roles, or troubleshooting content. | diff --git a/products/azure-service-bus/azure-service-bus.csv b/products/azure-service-bus/azure-service-bus.csv index 8d127255..f1caca09 100644 --- a/products/azure-service-bus/azure-service-bus.csv +++ b/products/azure-service-bus/azure-service-bus.csv @@ -1,6 +1,6 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type https://learn.microsoft.com/en-us/azure/service-bus-messaging/advanced-features-overview,Overview of advanced features,Advanced Features in Azure Service Bus Messaging - Azure Service Bus,,"This article provides a high-level overview of advanced features in Azure Service Bus such as sessions, scheduled delivery, autodelete on idle, etc.",Service Bus includes advanced features that help you solve more complex messaging problems. This article describes several of these features.,2026-02-11T06:12:00.000Z,concept-article,,0.45,False,"High-level overview of advanced features (sessions, scheduled delivery, etc.); does not appear to include detailed limits or config matrices.",unchanged -https://learn.microsoft.com/en-us/azure/service-bus-messaging/authenticate-application,Authenticate from an application,Authenticate an application to access Azure Service Bus entities - Azure Service Bus,Authenticate applications to Azure Service Bus with Entra ID,"This article provides information about authenticating an application with Microsoft Entra ID to access Azure Service Bus entities (queues, topics, etc.)","Azure Service Bus supports using Microsoft Entra ID to authorize requests to Service Bus entities (queues, topics, subscriptions, or filters). 
With Microsoft Entra ID, you can use Azure role-based access control (Azure RBAC) to grant permissions to a security principal, which can be a user, group, application service principal, or amanaged identity for Azure resources. A key advantage of using Microsoft Entra ID with Azure Service Bus is that you don't need to store your credentials in the code ",2025-04-29T08:00:00.000Z,conceptual,security,0.85,True,"Covers Entra ID auth and Azure RBAC roles for Service Bus entities, including specific role names and permission scopes.",unchanged
+https://learn.microsoft.com/en-us/azure/service-bus-messaging/authenticate-application,Authenticate from an application,Authenticate an Application to Access Azure Service Bus Entities - Azure Service Bus,Authenticate applications to Azure Service Bus with Entra ID,This article provides information about authenticating an application with Microsoft Entra ID to access Azure Service Bus entities like queues and topics.,"Azure Service Bus supports using Microsoft Entra ID to authorize requests to Service Bus entities (queues, topics, subscriptions, or filters). With Microsoft Entra ID, you can use Azure role-based access control (Azure RBAC) to grant permissions to a security principal. The security principal can be a user, group, application service principal, or a managed identity for Azure resources. 
A key advantage of using Microsoft Entra ID with Azure Service Bus is that you don't need to store your credentia",2026-04-23T06:20:00.000Z,concept-article,security,0.8,True,"This article focuses on Entra ID-based auth and Azure RBAC for Service Bus, which typically includes specific role names, scopes, and configuration parameters for applications and managed identities, constituting detailed security configuration guidance.",updated
https://learn.microsoft.com/en-us/azure/service-bus-messaging/automate-update-messaging-units,Automatically update messaging units,Azure Service Bus - Automatically update messaging units - Azure Service Bus,,This article shows you how you can automatically update messaging units of a Service Bus namespace.,"Autoscale ensures you have the right amount of resources running to handle the load on your application. It adds resources to handle increases in load and also saves money by removing resources that are idle. To learn more about the Autoscale feature of Azure Monitor, see Overview of autoscale in Microsoft Azure. Service Bus Premium Messaging provides resource isolation at the CPU and memory level so that each customer workload runs in isolation. This resource container is called a messaging unit.",2026-04-08T06:10:00.000Z,how-to,,0.2,False,"From the summary, the article conceptually describes autoscale and messaging units for Service Bus Premium but does not clearly indicate the presence of specific numeric limits, configuration parameter tables, or detailed autoscale thresholds. 
Without evidence of exact values, settings tables, or error mappings, it does not meet the expert-knowledge criteria for any sub-skill type.",unchanged https://learn.microsoft.com/en-us/azure/service-bus-messaging/batch-delete,Delete messages in Service Bus,Delete messages from Azure Service Bus - Azure Service Bus,Programmatically delete Service Bus messages in batches,This article explains how to delete messages in Azure Service Bus programmatically.,"Azure Service Bus is a fully managed enterprise integration message broker that enables you to send and receive messages between decoupled applications and services. However, sometimes you might want to delete messages from a queue or subscription without processing them, for example, if they're expired, corrupted, or irrelevant. This article shows you how to delete messages in batches in Azure Service Bus.",2025-11-05T18:14:00.000Z,article,integrations,0.65,True,"Shows how to delete messages via code, likely including SDK methods, parameters, and constraints specific to Service Bus batch deletion.",unchanged https://learn.microsoft.com/en-us/azure/service-bus-messaging/build-message-driven-apps-nservicebus,Build message-driven business applications with NServiceBus,Build message-driven applications with NServiceBus and Azure Service Bus - Azure Service Bus,Build message-driven systems on Service Bus with NServiceBus,Learn how to solve complex problems with distributed systems on Azure Service Bus using the NServiceBus framework.,"NServiceBus is a commercial messaging framework provided by Particular Software. It's built on top of Azure Service Bus and helps developers focus on business logic by abstracting infrastructure concerns. In this guide, we'll build a solution that exchanges messages between two services. We'll also show how to automatically retry failing messages and review options for hosting these services in Azure. 
Note The code for this tutorial is available on the Particular Software Docs web site.",2024-06-21T00:28:00.000Z,how-to,architecture-patterns,0.6,True,"Shows patterns for distributed systems using NServiceBus on Service Bus, including retries and hosting options; contains product-specific architectural guidance and trade-offs.",unchanged
@@ -12,9 +12,9 @@ https://learn.microsoft.com/en-us/azure/service-bus-messaging/disable-local-auth
https://learn.microsoft.com/en-us/azure/service-bus-messaging/duplicate-detection,Duplicate message detection,Azure Service Bus duplicate message detection - Azure Service Bus,Configure and use Azure Service Bus duplicate detection,This article explains how you can detect duplicates in Azure Service Bus messages. The duplicate message can be ignored and dropped.,"If an application fails due to a fatal error immediately after sending a message, and the restarted application instance erroneously believes that the prior message delivery didn't occur, a subsequent send causes the same message to appear in the system twice. It's also possible for an error at the client or network level to occur a moment earlier, and for a sent message to be committed into the queue, with the acknowledgment not successfully returned to the client. 
This scenario leaves the clie",2025-05-28T08:00:00.000Z,article,best-practices,0.7,True,Describes duplicate detection behavior and how to use it to handle failure scenarios; includes product-specific semantics and edge cases.,unchanged
https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-auto-forward,Enable auto forwarding for a queue or subscription,Enable auto forwarding for Azure Service Bus queues and subscriptions - Azure Service Bus,Configure auto-forwarding for Service Bus queues and subscriptions,"This article explains how to enable auto forwarding for queues and subscriptions by using Azure portal, PowerShell, CLI, and programming languages (C#, Java, Python, and JavaScript)","The Service Bus auto forwarding feature enables you to chain a queue or subscription to another queue or topic that is part of the same namespace. When auto forwarding is enabled, Service Bus automatically removes messages that are placed in the first queue or subscription (source) and puts them in the second queue or topic (destination). It's still possible to send a message to the destination entity directly. For more information, see Chaining Service Bus entities with auto forwarding. 
This art",2023-10-16T22:16:00.000Z,how-to,configuration,0.75,True,"Details the auto-forwarding property names and how to set them via portal, CLI, PowerShell, and SDKs, which are specific configuration parameters.",unchanged
https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-dead-letter,Enable dead lettering for a queue or subscription,Enable dead lettering for Azure Service Bus queues and subscriptions - Azure Service Bus,Enable dead-lettering for Service Bus queues and subscriptions,"This article explains how to enable dead lettering for queues and subscriptions by using Azure portal, PowerShell, CLI, and programming languages (C#, Java, Python, and JavaScript)","Azure Service Bus queues and subscriptions for topics provide a secondary subqueue, called a dead-letter queue (DLQ). The dead-letter queue doesn't need to be explicitly created and can't be deleted or managed independent of the main entity. The purpose of the dead-letter queue is to hold messages that can't be delivered to any receiver, or messages that couldn't be processed. For more information, see Overview of Service Bus dead-letter queues. This article shows you different ways to enable dea",2026-02-24T23:11:00.000Z,how-to,configuration,0.75,True,"Shows how to configure DLQ behavior with specific flags and settings across management surfaces, which are product-specific configuration details.",unchanged
-https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-duplicate-detection,Enable duplicate detection for a queue or topic,Enable duplicate message detection - Azure Service Bus,Configure duplicate detection for Service Bus entities,"This article explains how to enable duplicate message detection using Azure portal, PowerShell, CLI, and programming languages (C#, Java, Python, and JavaScript)","When you enable duplicate detection for a queue or topic, Azure Service Bus keeps a history of all messages sent to the queue or topic for a configure amount of time. 
During that interval, your queue or topic won't store any duplicate messages. Enabling this property guarantees exactly once delivery over a user-defined span of time. For more information, SeeDuplicate detection. This article shows you different ways to enable duplicate message detection for a Service Bus queue or a topic. Note",2023-10-16T22:16:00.000Z,how-to,configuration,0.75,True,"Covers enabling duplicate detection with specific configuration properties (time window, flags) across tools and SDKs, which are concrete settings.",unchanged
+https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-duplicate-detection,Enable duplicate detection for a queue or topic,Enable duplicate message detection - Azure Service Bus,Configure duplicate message detection in Azure Service Bus,"This article explains how to enable duplicate message detection using Azure portal, PowerShell, CLI, and programming languages (C#, Java, Python, and JavaScript)","When you enable duplicate detection for a queue or topic, Azure Service Bus keeps a history of all messages sent to the queue or topic for a configured amount of time. During that interval, your queue or topic won't store any duplicate messages. Enabling this property guarantees exactly once delivery over a user-defined span of time. For more information, see Duplicate detection. This article shows you different ways to enable duplicate message detection for a Service Bus queue or a topic. Note",2026-04-22T17:34:00.000Z,how-to,configuration,0.7,True,"The article provides concrete, product-specific configuration steps and code for enabling duplicate detection on Service Bus queues/topics via portal, PowerShell, CLI, and SDKs. 
It references specific entity properties and how to set them, which are configuration details rather than generic concepts.",updated
https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-message-sessions,Enable sessions for a queue or subscription,Enable Azure Service Bus message sessions - Azure Service Bus,Enable and configure Service Bus message sessions,"This article explains how to enable message sessions using Azure portal, PowerShell, CLI, and programming languages (C#, Java, Python, and JavaScript)","Azure Service Bus sessions enable joint and ordered handling of unbounded sequences of related messages. Sessions can be used in first in, first out (FIFO) and request-response patterns. For more information, see Message sessions. This article shows you different ways to enable sessions for a Service Bus queue or subscription. Important",2023-10-16T22:16:00.000Z,how-to,configuration,0.75,True,"Shows exact settings and API calls in portal, CLI, PowerShell, and SDKs to enable sessions, which are product-specific configuration parameters.",unchanged
-https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-partitions-basic-standard,Enable partitions (basic / standard),Enable partitioning in Azure Service Bus basic or standard - Azure Service Bus,Enable partitioning in Basic and Standard Service Bus,"This article explains how to enable partitioning in Azure Service Bus queues and topics by using Azure portal, PowerShell, CLI, and programming languages (C#, Java, Python, and JavaScript)","Service Bus partitions enable queues and topics, or messaging entities, to be partitioned across multiple message brokers. Partitioning means that the overall throughput of a partitioned entity is no longer limited by the performance of a single message broker. In addition, a temporary outage of a message broker, for example during an upgrade, doesn't render a partitioned queue or topic unavailable. 
Partitioned queues and topics can contain all advanced Service Bus features, such as support for ",2023-10-16T22:16:00.000Z,how-to,configuration,0.7,True,"Provides concrete steps and flags to enable partitioning on queues/topics in specific tiers using portal, CLI, PowerShell, and SDKs.",unchanged +https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-partitions-basic-standard,Enable partitions (basic / standard),Enable partitioning in Azure Service Bus basic or standard - Azure Service Bus,Enable partitioning for Azure Service Bus entities,"This article explains how to enable partitioning in Azure Service Bus queues and topics by using Azure portal, PowerShell, CLI, and programming languages (C#, Java, Python, and JavaScript)","Service Bus partitions enable queues and topics, or messaging entities, to be partitioned across multiple message brokers. Partitioning means that the overall throughput of a partitioned entity is no longer limited by the performance of a single message broker. In addition, a temporary outage of a message broker, for example during an upgrade, doesn't render a partitioned queue or topic unavailable. Partitioned queues and topics can contain all advanced Service Bus features, such as support for ",2026-04-22T17:34:00.000Z,how-to,configuration,0.7,True,"The page describes how to turn on partitioning for queues and topics in Basic/Standard tiers using portal, PowerShell, CLI, and SDKs, including the specific entity property to configure. 
These are product-specific configuration instructions, not just conceptual guidance.",updated
https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-partitions-premium,Enable partitions for queues or topics in premium tier,Enable partitioning in Azure Service Bus Premium namespaces - Azure Service Bus,,"This article explains how to enable partitioning in Azure Service Bus Premium namespaces by using Azure portal, PowerShell, CLI, and programming languages (C#, Java, Python, and JavaScript)","Service Bus partitions enable queues and topics, or messaging entities, to be partitioned across multiple message brokers. Partitioning means that the overall throughput of a partitioned entity is no longer limited by the performance of a single message broker. Partitioned queues and topics can contain all advanced Service Bus features, such as support for transactions and sessions. For more information, see Partitioned queues and topics. This article shows you different ways to enable partitioni",2026-03-11T05:11:00.000Z,how-to,,0.3,False,"How-to article for enabling partitioning via portal/CLI/SDKs; summary does not indicate specific limits, quotas, or detailed configuration tables with defaults/ranges. Appears to be procedural/tutorial content rather than expert reference data.",unchanged
https://learn.microsoft.com/en-us/azure/service-bus-messaging/entity-suspend,Suspend and reactivate messaging entities,Azure Service Bus - suspend messaging entities - Azure Service Bus,Suspend and reactivate Azure Service Bus entities,"This article explains how to temporarily suspend and reactivate Azure Service Bus message entities (queues, topics, and subscriptions).","Queues, topics, and subscriptions can be temporarily suspended. Suspension puts the entity into a disabled state in which all messages are maintained in storage. However, messages can't be removed or added, and the respective protocol operations yield errors. 
You may want to suspend an entity for urgent administrative reasons. For example, a faulty receiver takes messages off the queue, fails processing, and yet incorrectly completes the messages and removes them. In this case, you may want to d",2023-03-10T00:00:00.000Z,article,configuration,0.7,True,"Explains how to disable/enable queues, topics, and subscriptions with specific state flags and management operations unique to Service Bus.",unchanged
https://learn.microsoft.com/en-us/azure/service-bus-messaging/explorer,Use Service Bus Explorer,Service Bus Explorer Guide for Data Operations - Azure Service Bus,Use Service Bus Explorer in Azure portal for data operations,This article provides information on how to use the portal-based Azure Service Bus Explorer to access Azure Service Bus data.,Azure Service Bus enables sender and receiver client applications to decouple their business logic by using familiar point-to-point (Queue) and publish-subscribe (Topic-Subscription) semantics. Note This article highlights the functionality of the Azure Service Bus Explorer that's part of the Azure portal. The community-owned open source Service Bus Explorer is a standalone application and is different from this one. You can run two kinds of operations on an Azure Service Bus namespace. 
Important,2026-02-06T06:10:00.000Z,how-to,configuration,0.7,True,"Details portal-based Explorer operations, including specific capabilities and constraints for sending, receiving, and peeking messages.",unchanged @@ -32,13 +32,13 @@ https://learn.microsoft.com/en-us/azure/service-bus-messaging/migrate-jms-active https://learn.microsoft.com/en-us/azure/service-bus-messaging/monitor-service-bus,Monitor Azure Service Bus,Monitor Azure Service Bus - Azure Service Bus,Configure monitoring for Azure Service Bus with Azure Monitor,"Start here to learn how to monitor Azure Service Bus by using Azure Monitor metrics, logs, and tools.","This article describes: Note If you're already familiar with this service and/or Azure Monitor and just want to know how to analyze monitoring data, see theAnalyzesection near the end of this article. When you have critical applications and business processes that rely on Azure resources, you need to monitor and get alerts for your system. The Azure Monitor service collects and aggregates metrics and logs from every component of your system. Azure Monitor provides you with a view of availability",2024-07-23T22:09:00.000Z,conceptual,configuration,0.7,True,Explains how to wire Service Bus to Azure Monitor with specific metric/log categories and configuration steps beyond generic monitoring concepts.,unchanged https://learn.microsoft.com/en-us/azure/service-bus-messaging/monitor-service-bus-reference,Monitoring data reference,Monitoring data reference for Azure Service Bus - Azure Service Bus,Reference for Azure Service Bus monitoring metrics and logs,This article contains important reference material you need when you monitor Azure Service Bus by using Azure Monitor.,This article contains all the monitoring reference information for this service. 
See Monitor Azure Service Bus for details on the data you can collect for Service Bus and how to use it.,2025-08-12T08:00:00.000Z,reference,configuration,0.8,True,"Monitoring data reference with detailed metric names, dimensions, and log schema specific to Service Bus, which are configuration/usage details.",unchanged
https://learn.microsoft.com/en-us/azure/service-bus-messaging/move-across-regions,Move across regions,Move a Service Bus namespace to another region - Azure Service Bus,Move an Azure Service Bus namespace across regions,This article shows you how to move an Azure Service Bus namespace from the current region to another region.,"There are various scenarios in which you'd want to move your existing Service Bus namespace from one region to another. For example, you may want to create a namespace with the same configuration for testing. You may also want to create a secondary namespace in another region as part of disaster recovery planning. Here are the high-level steps:",2021-06-09T11:04:00.000Z,how-to,deployment,0.65,True,"Describes concrete steps and constraints for cross-region namespace moves, including required tooling and sequencing unique to Service Bus deployment.",unchanged
-https://learn.microsoft.com/en-us/azure/service-bus-messaging/network-security,Network security,Network security for Azure Service Bus - Azure Service Bus,Configure network security for Azure Service Bus,"This article describes network security features such as service tags, IP firewall rules, service endpoints, and private endpoints.",This article describes how to use the following security features with Azure Service Bus:,2026-01-26T23:23:00.000Z,conceptual,security,0.8,True,"Describes use of service tags, IP firewall rules, service endpoints, and private endpoints with Service Bus, including product-specific settings.",unchanged
-https://learn.microsoft.com/en-us/azure/service-bus-messaging/network-security-perimeter,Network security perimeter,Network Security Perimeter - 
Azure Service Bus,Configure network security perimeter for Azure Service Bus,Learn how to associate an Azure Service Bus namespace with a network security perimeter,"Azure Service Bus supports integration with network security perimeter. Network security perimeter safeguards network traffic between Azure Service Bus and other Platform as a Service (PaaS) offerings like Azure Key Vault. By confining communication solely to Azure resources within its boundaries, it blocks unauthorized attempts to access resources beyond its secure perimeter. With a network security perimeter: Integrating Service Bus within this framework enhances messaging capabilities while ens",2026-04-10T22:10:00.000Z,feature-guide,security,0.68,True,"Page is about associating an Azure Service Bus namespace with a network security perimeter, which is a product-specific security configuration. Such pages typically include precise configuration steps, required resource types, scope definitions, and possibly specific role/permission requirements for setting up the perimeter, which qualify as expert security knowledge beyond generic concepts.",unchanged +https://learn.microsoft.com/en-us/azure/service-bus-messaging/network-security,Network security,Network Security for Azure Service Bus - Azure Service Bus,Configure network security for Azure Service Bus namespaces,"This article describes network security features such as service tags, IP firewall rules, service endpoints, and private endpoints.",This article describes how to use security features with Azure Service Bus.,2026-04-23T06:20:00.000Z,concept-article,security,0.7,True,"Network security docs for Service Bus usually detail IP firewall rule formats, service tags, and configuration of private endpoints/service endpoints; these are concrete, product-specific security settings and parameters rather than generic concepts.",updated +https://learn.microsoft.com/en-us/azure/service-bus-messaging/network-security-perimeter,Network security 
perimeter,Network Security Perimeter - Azure Service Bus,Associate Azure Service Bus with a network security perimeter,Learn how to associate an Azure Service Bus namespace with a network security perimeter.,"Azure Service Bus supports integration with a network security perimeter. A network security perimeter helps safeguard network traffic between Azure Service Bus and other platform as a service (PaaS) offerings like Azure Key Vault. By confining communication solely to Azure resources within its boundaries, a network security perimeter blocks unauthorized attempts to access other resources. With a network security perimeter: Integrating Service Bus within this framework enhances messaging capabilit",2026-04-23T06:20:00.000Z,feature-guide,security,0.68,True,"Network security perimeter integration typically includes specific configuration steps, required resource associations, and possibly policy/endpoint settings unique to Service Bus and Azure NSP, which are product-specific security configuration details.",updated https://learn.microsoft.com/en-us/azure/service-bus-messaging/overview-emulator,Overview of Service Bus emulator,Azure Service Bus Emulator Overview and Key Features - Azure Service Bus,,"This article describes benefits, features, limitations, and other overview information for the Azure Service Bus emulator.","The Azure Service Bus emulator offers a local development experience for the Service Bus service.
You can use the emulator to develop and test code against the service in isolation, free from cloud interference.",2026-02-06T06:10:00.000Z,article,,0.4,False,"Overview of Service Bus emulator benefits, features, and limitations; mostly conceptual without detailed config tables or limits.",unchanged -https://learn.microsoft.com/en-us/azure/service-bus-messaging/policy-reference,Azure Policy built-ins,Built-in policy definitions for Azure Service Bus Messaging - Azure Service Bus,Apply Azure Policy definitions to Service Bus,Lists Azure Policy built-in policy definitions for Azure Service Bus Messaging. These built-in policy definitions provide common approaches to managing your Azure resources.,"This page is an index of Azure Policy built-in policy +https://learn.microsoft.com/en-us/azure/service-bus-messaging/policy-reference,Azure Policy built-ins,Built-in policy definitions for Azure Service Bus Messaging - Azure Service Bus,Use built-in Azure Policy definitions for Service Bus,Lists Azure Policy built-in policy definitions for Azure Service Bus Messaging. These built-in policy definitions provide common approaches to managing your Azure resources.,"This page is an index of Azure Policy built-in policy definitions for Azure Service Bus Messaging. For additional Azure Policy built-ins for other services, see Azure Policy built-in definitions. The name of each built-in policy definition links to the policy definition in the Azure portal. Use -the link in the Version column to view the source on the Azure Policy GitHub repo.",2024-02-06T08:00:00.000Z,reference,security,0.7,True,"Lists concrete built-in Azure Policy definitions specific to Azure Service Bus, including exact policy names, effects, and scopes.
These are product-specific security/governance configurations (policy definitions and enforcement patterns) that an LLM is unlikely to know exhaustively from training, fitting the security sub-skill focused on RBAC/policy and configuration-level controls.",unchanged +the link in the Version column to view the source on the Azure Policy GitHub repo.",2025-08-06T05:10:00.000Z,reference,security,0.7,True,"Lists concrete built-in Azure Policy definitions specific to Azure Service Bus Messaging, including exact policy names and enforcement scopes, which are product-specific security/governance configurations not derivable from general training data.",updated https://learn.microsoft.com/en-us/azure/service-bus-messaging/prepare-for-planned-maintenance,Prepare for planned maintenance,Azure Service Bus - guidance on maintenance - Azure Service Bus,Prepare Service Bus namespaces for planned maintenance,This article helps you with preparing for planned maintenance events on your namespace in Azure Service Bus.,This article describes how you can prepare for planned maintenance events on your namespace in Azure Service Bus.,2026-02-02T06:11:00.000Z,how-to,best-practices,0.65,True,Provides concrete operational guidance and patterns (such as failover or draining) specific to Service Bus maintenance events.,unchanged https://learn.microsoft.com/en-us/azure/service-bus-messaging/private-link-service,Allow access via private endpoints,Integrate Azure Service Bus with Azure Private Link Service - Azure Service Bus,Integrate Service Bus with Azure Private Link,Learn how to integrate Azure Service Bus with Azure Private Link Service,"Azure Private Link Service enables you to access Azure services (for example, Azure Service Bus, Azure Storage, and Azure Cosmos DB) and Azure hosted customer/partner services over a private endpoint in your virtual network. A private endpoint is a network interface that connects you privately and securely to a service powered by Azure Private Link.
The private endpoint uses a private IP address from your virtual network, effectively bringing the service into your virtual network. All traffic to t",2024-12-19T08:00:00.000Z,how-to,security,0.8,True,"Provides concrete steps and settings for creating private endpoints and configuring Private Link for Service Bus, which are security/network configurations.",unchanged https://learn.microsoft.com/en-us/azure/service-bus-messaging/security-controls-policy,Security controls by Azure Policy,Azure Policy Regulatory Compliance controls for Azure Service Bus Messaging - Azure Service Bus,Apply regulatory compliance policies to Service Bus,Lists Azure Policy Regulatory Compliance controls available for Azure Service Bus Messaging. These built-in policy definitions provide common approaches to managing the compliance of your Azure resour,"Regulatory Compliance in Azure Policy provides Microsoft created and managed initiative definitions, known as built-ins, for the compliance domains and security controls related to different compliance standards. This @@ -51,7 +51,7 @@ https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-amqp-p https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-amqp-request-response,AMQP request-response-based operations,AMQP 1.0 request/response operations in Azure Service Bus - Azure Service Bus,Use AMQP request/response operations in Service Bus,This article defines the list of AMQP request/response-based operations in Microsoft Azure Service Bus.,"This article defines the list of Microsoft Azure Service Bus request/response-based operations. This information is based on the AMQP Management Version 1.0 working draft.
For a detailed wire-level AMQP 1.0 protocol guide, which explains how Service Bus implements and builds on the OASIS AMQP technical specification, see the AMQP 1.0 in Azure Service Bus and Event Hubs protocol guide.",2024-03-04T08:00:00.000Z,article,configuration,0.75,True,"Defines the list of AMQP management operations supported by Service Bus, including operation names and parameters—effectively a configuration/operations reference.",unchanged https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-amqp-troubleshoot,AMQP errors,Troubleshoot AMQP errors in Azure Service Bus - Azure Service Bus,Troubleshoot AMQP errors in Azure Service Bus,"Provides a list of AMQP errors you may receive when using Azure Service Bus, and the causes of those errors.","This article provides some of the errors you receive when using AMQP with Azure Service Bus. They're all standard behaviors of the service. You can avoid them by making send/receive calls on the connection/link, which automatically recreates the connection/link.",2024-05-06T08:00:00.000Z,article,troubleshooting,0.9,True,"Explicit troubleshooting guide listing specific AMQP error codes/messages, causes, and how to resolve them by recreating connections/links.",unchanged https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-async-messaging,Asynchronous messaging and high availability,Service Bus asynchronous messaging - Azure Service Bus,,"Learn how Azure Service Bus supports asynchronism via a store and forward mechanism with queues, topics, and subscriptions.","Asynchronous messaging can be implemented in a variety of different ways. With queues, topics, and subscriptions, Azure Service Bus supports asynchronism via a store and forward mechanism. In normal (synchronous) operation, you send messages to queues and topics, and receive messages from queues and subscriptions. Applications you write depend on these entities always being available.
When the entity health changes, due to a variety of circumstances, you need a way to provide a reduced capabilit",2026-02-02T06:11:00.000Z,article,,0.3,False,High-level explanation of asynchronous messaging and reduced capability modes; appears conceptual without detailed config or limits.,unchanged -https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-authentication-and-authorization,Authentication and authorization,Azure Service Bus authentication and authorization - Azure Service Bus,Configure authentication and authorization for Service Bus,"Learn how to securely authenticate and authorize access to Azure Service Bus, including best practices for managing access keys and using Microsoft Entra ID.",There are two ways to authenticate and authorize access to Azure Service Bus resources: This article gives you details on using these two types of security mechanisms.,2025-03-21T08:00:00.000Z,article,security,0.8,True,"Explains concrete mechanisms (SAS vs Entra ID), likely including RBAC roles, scopes, and key management practices specific to Service Bus.",unchanged +https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-authentication-and-authorization,Authentication and authorization,Azure Service Bus Authentication and Authorization - Azure Service Bus,Configure authentication and authorization for Azure Service Bus,"Learn how to securely authenticate and authorize access to Azure Service Bus, including best practices for managing access keys and using Microsoft Entra ID.",There are two ways to authenticate and authorize access to Azure Service Bus resources: This article gives you details on using these two types of security mechanisms.,2026-04-23T06:20:00.000Z,concept-article,security,0.74,True,"Authentication/authorization article for Service Bus typically includes concrete security details such as specific Azure RBAC role names, permission scopes, and configuration steps for keys and Entra ID; these are product-specific 
security settings that qualify as expert knowledge.",updated https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-auto-forwarding,Chain entities with auto-forwarding,Autoforwarding Azure Service Bus messaging entities - Azure Service Bus,Configure and use Service Bus autoforwarding,This article describes how to chain an Azure Service Bus queue or subscription to another queue or topic.,"The Service Bus autoforwarding feature enables you to chain a queue or subscription to another queue or topic that is part of the same namespace. When autoforwarding is enabled, Service Bus automatically removes messages that are placed in the first queue or subscription (source) and puts them in the second queue or topic (destination). It's still possible to send a message to the destination entity directly. Note The basic tier of Service Bus doesn't support the autoforwarding feature. For differ",2024-11-12T08:00:00.000Z,concept-article,decision-making,0.65,True,"Describes autoforwarding behavior between entities, including tier support (not in Basic) and chaining patterns, which informs design decisions about when and how to use autoforwarding in Service Bus.",unchanged https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-azure-and-service-bus-queues-compared-contrasted,Compare Azure Queues and Service Bus queues,Compare Azure Storage queues and Service Bus queues - Azure Service Bus,Decide between Azure Storage queues and Service Bus queues,"Compare Azure Storage queues and Service Bus queues to determine the best fit for your application. Learn about their features, use cases, and differences.","This article analyzes the differences and similarities between the two types of queues offered by Microsoft Azure: Storage queues and Service Bus queues.
By using this information, you can make a more informed decision about which solution best meets your needs.",2026-02-05T08:00:00.000Z,article,decision-making,0.75,True,Explicit comparison of two queue technologies to guide selection; likely includes feature and behavior differences relevant to design decisions.,unchanged https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-dead-letter-queues,Dead-letter queues,Service Bus dead-letter queues - Azure Service Bus,Use Azure Service Bus dead-letter queues for message handling,"Describes dead-letter queues in Azure Service Bus. Service Bus queues and topic subscriptions provide a secondary subqueue, called a dead-letter queue.","Azure Service Bus queues and topic subscriptions provide a secondary subqueue, called a dead-letter queue (DLQ). The dead-letter queue doesn't need to be explicitly created and can't be deleted or managed independently of the main entity. This article describes dead-letter queues in Service Bus. Much of the discussion is illustrated by the Dead-Letter queues sample on GitHub.",2025-05-15T08:00:00.000Z,concept-article,best-practices,0.7,True,Explains DLQ behavior and how messages are moved/handled; product-specific semantics and patterns for error handling.,unchanged @@ -76,7 +76,7 @@ https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-java-h https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-java-how-to-use-queues,Java,Quickstart: Get Started with Azure Service Bus Queues (Java) - Azure Service Bus,,This tutorial shows you how to send messages to and receive messages from Azure Service Bus queues using the Java programming language.,"This quickstart provides step-by-step instructions for a simple scenario of sending messages to a Service Bus queue and receiving them. You create a Java app to send messages to and receive messages from an Azure Service Bus queue.
You can find prebuilt Java samples for Azure Service Bus in the Azure SDK for Java repository on GitHub. Tip If you're working with Azure Service Bus resources in a Spring application, we recommend that you consider Spring Cloud Azure. Spring Cloud Azure is an open-sour",2025-06-12T08:00:00.000Z,quickstart,,0.35,False,"Java queue quickstart; step-by-step sample, no extensive configuration tables or product-specific edge cases.",unchanged https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-java-how-to-use-topics-subscriptions,Java,Get started with Azure Service Bus topics (Java) - Azure Service Bus,,This tutorial shows you how to send messages to Azure Service Bus topics and receive messages from topics' subscriptions using the Java programming language.,"In this quickstart, you write Java code using the azure-messaging-servicebus package to send messages to an Azure Service Bus topic and then receive messages from subscriptions to that topic. Note This quick start provides step-by-step instructions for a simple scenario of sending a batch of messages to a Service Bus topic and receiving those messages from a subscription of the topic. You can find pre-built Java samples for Azure Service Bus in the Azure SDK for Java repository on GitHub.
Tip If ",2025-03-17T11:10:00.000Z,quickstart,,0.35,False,"Java topics/subscriptions quickstart; focuses on simple scenario, not on advanced configuration or limits.",unchanged https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-manage-with-ps,Use Azure PowerShell to provision entities,Use PowerShell to manage Azure Service Bus resources - Azure Service Bus,Manage Service Bus resources with Azure PowerShell,"This article explains how to use the Azure PowerShell module to create and manage Service Bus entities (namespaces, queues, topics, subscriptions).","Microsoft Azure PowerShell is a scripting environment that you can use to control and automate the deployment and management of Azure services. This article describes how to use the Service Bus Resource Manager PowerShell module to provision and manage Service Bus entities (namespaces, queues, topics, and subscriptions) using a local Azure PowerShell console or script. You can also manage Service Bus entities using Azure Resource Manager templates. For more information, see the article Create Servi",2022-10-12T14:45:00.000Z,article,configuration,0.7,True,"Contains concrete cmdlets, parameters, and patterns for creating/managing namespaces, queues, topics, and subscriptions via PowerShell.",unchanged -https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-managed-service-identity,Authenticate with managed identities for Azure resources,Use Managed Identities with Azure Service Bus - Azure Service Bus,Authenticate to Azure Service Bus with managed identities,"Learn how to authenticate and access Azure Service Bus queues, topics, and subscriptions using managed identities for Azure resources.","Managed identities for Azure resources provide Azure services with an automatically managed identity in Microsoft Entra ID. You can use this identity to authenticate to Azure Service Bus without storing credentials in your code.
This article walks you through enabling a managed identity, assigning the appropriate Service Bus role, and connecting to Service Bus from your application code. Note If you're not familiar with managed identities, see Managed identities for Azure resources.",2026-03-18T06:15:00.000Z,how-to,security,0.78,True,"The article provides product-specific security configuration details: how to enable managed identities for Azure resources, which Azure Service Bus RBAC roles to assign, and how to configure authentication in code using those identities. It includes concrete role names and identity usage patterns that are specific to Azure Service Bus and managed identities, which qualifies as expert security configuration knowledge rather than a generic conceptual overview.",unchanged +https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-managed-service-identity,Authenticate with managed identities for Azure resources,Use Managed Identities with Azure Service Bus - Azure Service Bus,Authenticate to Azure Service Bus using managed identities,"Learn how to authenticate and access Azure Service Bus queues, topics, and subscriptions by using managed identities for Azure resources.","Managed identities for Azure resources provide Azure services with an automatically managed identity in Microsoft Entra ID. You can use this identity to authenticate to Azure Service Bus without storing credentials in your code. This article walks you through enabling a managed identity, assigning the appropriate Service Bus role, and connecting to Service Bus from your application code.
If you're not familiar with managed identities, see Managed identities for Azure resources.",2026-04-23T06:20:00.000Z,how-to,security,0.78,True,"Managed identity usage with Service Bus usually specifies exact Azure RBAC roles, scope assignments, and connection configuration patterns for queues/topics; these are concrete, product-specific security and identity configuration details.",updated https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-management-libraries,Service Bus management libraries,Programmatically manage Azure Service Bus namespaces and entities - Azure Service Bus,Programmatically manage Service Bus namespaces and entities,This article explains how to dynamically or programmatically create Service Bus namespaces and entities.,Azure Service Bus provides libraries to help dynamically create Service Bus namespaces and entities. It enables complex deployments and messaging scenarios and makes it possible to programmatically determine what entities to create.,2025-12-04T23:11:00.000Z,article,configuration,0.65,True,"Describes specific management libraries, classes, and methods for dynamic creation of Service Bus resources, which are product-specific APIs.",unchanged https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-messages-payloads,"Messages, payloads, and serialization","Azure Service Bus Messages, Payloads, and Serialization - Azure Service Bus",Handle messages and serialization in Azure Service Bus,"Learn how Azure Service Bus handles messages, payloads, and serialization. Explore message properties, routing, and best practices for efficient communication.","Azure Service Bus handles messages. Messages carry a payload and metadata. The metadata is in the form of key-value pairs. It describes the payload and gives handling instructions to Service Bus and applications.
Occasionally, that metadata alone is sufficient to carry the information that the sender wants to communicate to receivers, so the payload stays empty. The object model of the official Service Bus clients for .NET and Java reflects the abstract Service Bus message structure. This struct",2026-02-11T06:12:00.000Z,concept-article,best-practices,0.75,True,"Explains message structure, properties, routing, and serialization with Service Bus-specific best practices and edge cases.",unchanged https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-messaging-exceptions,Service Bus exceptions (deprecated),Azure Service Bus - messaging exceptions (deprecated) - Azure Service Bus,Handle deprecated Azure Service Bus messaging exceptions,This article provides a list of Azure Service Bus messaging exceptions from the deprecated packages and suggested actions to take when the exception occurs.,"This article lists the .NET exceptions generated by .NET Framework APIs. On 30 September 2026, we'll retire the Azure Service Bus SDK libraries WindowsAzure.ServiceBus, Microsoft.Azure.ServiceBus, and com.microsoft.azure.servicebus, which don't conform to Azure SDK guidelines. We'll also end support of the SBMP protocol, so you'll no longer be able to use this protocol after 30 September 2026.
Migrate to the latest Azure SDK libraries, which offer critical security updates and improved capabilit",2025-04-29T08:00:00.000Z,article,troubleshooting,0.9,True,"Similar to index 34 but for deprecated SDKs; lists exception types and suggested actions, which are product-specific troubleshooting mappings.",unchanged @@ -110,7 +110,7 @@ https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-resour https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-resource-manager-namespace-topic-with-rule,"Create a namespace with topic, subscription, and rule",Create Service Bus topic subscription and rule using Azure template - Azure Service Bus,"Deploy Service Bus topic, subscription, and rule via ARM","Create a Service Bus namespace with topic, subscription, and rule using Azure Resource Manager template","This article shows how to use an Azure Resource Manager template that creates a Service Bus namespace with a topic, subscription, and rule (filter). The article explains how to specify which resources are deployed and how to define parameters that are specified when the deployment is executed. You can use this template for your own deployments, or customize it to meet your requirements. For more information about creating templates, see Authoring Azure Resource Manager templates.
For more informat",2023-03-10T00:00:00.000Z,article,deployment,0.7,True,"Provides a full ARM template with topic, subscription, and rule resources and parameters, which are deployment-specific configurations.",unchanged https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-resource-manager-overview,Use ARM templates,Create Azure Service Bus resources using templates - Azure Service Bus,Deploy Service Bus resources using ARM templates,Use Azure Resource Manager templates to automate the creation of Service Bus resources,"This article describes how to create and deploy Service Bus resources using Azure Resource Manager templates, PowerShell, and the Service Bus resource provider. Azure Resource Manager templates help you define the resources to deploy for a solution, and to specify parameters and variables that enable you to input values for different environments. The template is written in JSON and consists of expressions that you can use to construct values for your deployment. For detailed information about w",2023-03-10T00:00:00.000Z,article,deployment,0.7,True,"Gives template structure, resource types, and parameterization details for deploying Service Bus via ARM, which are deployment-specific configurations.",unchanged https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-samples,Service Bus samples,Azure Service Bus Code Samples and Examples - Azure Service Bus,,"Explore Azure Service Bus messaging samples to learn key features and implementation for .NET, Java, Python, JavaScript, and more. Start building today!","The Service Bus messaging samples demonstrate key features in Service Bus messaging. Currently, you can find the samples in the following places. On 30 September 2026, we'll retire the Azure Service Bus SDK libraries WindowsAzure.ServiceBus, Microsoft.Azure.ServiceBus, and com.microsoft.azure.servicebus, which don't conform to Azure SDK guidelines.
We'll also end support of the SBMP protocol, so you'll no longer be able to use this protocol after 30 September 2026. Migrate to the latest Azure SDK",2026-02-10T08:00:00.000Z,article,,0.4,False,"Sample index and SDK retirement notice; mostly navigation and lifecycle info, not detailed config, limits, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-sas,Authentication with Shared Access Signatures,Azure Service Bus access control with Shared Access Signatures - Azure Service Bus,Secure Service Bus with Shared Access Signatures,"Overview of Service Bus access control using Shared Access Signatures, details about SAS authorization with Azure Service Bus.","This article discusses Shared Access Signatures (SAS), how they work, and how to use them in a platform-agnostic way with Azure Service Bus. SAS guards access to Service Bus based on authorization rules that are configured either on a namespace, or a messaging entity (queue, or topic). An authorization rule has a name, is associated with specific rights, and carries a pair of cryptographic keys. You use the rule's name and key via the Service Bus SDK or in your own code to generate a SAS token. A",2024-12-11T08:00:00.000Z,concept-article,security,0.8,True,"Details SAS rule structure, rights, and token usage specific to Service Bus, including key-based authorization patterns.",unchanged +https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-sas,Authentication with Shared Access Signatures,Azure Service Bus Access Control with Shared Access Signatures - Azure Service Bus,Use shared access signatures with Azure Service Bus,Learn about the use of shared access signatures for Azure Service Bus authorization.,"This article discusses shared access signatures (SASs), how they work, and how to use them in a platform-agnostic way with Azure Service Bus.
A SAS guards access to Service Bus based on authorization rules that are configured on a namespace or in a messaging entity (queue or topic). An authorization rule has a name, is associated with specific rights, and carries a pair of cryptographic keys. You use the rule's name and keys via the Service Bus SDK or in your own code to generate a SAS token. A ",2026-04-23T06:20:00.000Z,concept-article,security,0.78,True,"SAS documentation for Service Bus generally contains token format, required fields, allowed permissions, and configuration details for authorization rules and keys; these are product-specific security configuration parameters and patterns beyond generic SAS concepts.",updated https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-service-endpoints,Allow access from specific virtual networks,Configure network service endpoints - Azure Service Bus,Configure Service Bus virtual network service endpoints,This article provides information on how to add a Microsoft.ServiceBus service endpoint to a virtual network.,"The integration of Service Bus with Virtual Network service endpoints enables secure access to messaging capabilities from workloads like virtual machines that are bound to virtual networks, with the network traffic path being secured on both ends. Once configured to be bound to at least one virtual network subnet service endpoint, the respective Service Bus namespace will no longer accept traffic from anywhere but the authorized virtual networks and, optionally, specific internet IP addresses.
Fr",2024-07-31T08:00:00.000Z,how-to,security,0.8,True,"Explains how to bind namespaces to VNet subnets with service endpoints, including specific configuration properties and access behavior.",unchanged https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-throttling,Throttling,Azure Service Bus throttling - Azure Service Bus,Understand throttling limits in Azure Service Bus,"This article describes throttling in Azure Service Bus standard and premium tiers, and answers some frequently asked questions.","Cloud native solutions give a notion of unlimited resources that can scale with your workload. While this notion is more true in the cloud than it is with on-premises systems, there are still limitations that exist in the cloud. These limitations might cause throttling of client application requests in both standard and premium tiers as discussed in this article.",2024-03-19T08:00:00.000Z,concept-article,limits-quotas,0.8,True,"Throttling documentation typically includes concrete throughput limits, request caps, and tier-specific behaviors that are numeric and service-specific.",unchanged https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-to-event-grid-integration-concept,Azure Service Bus and Azure Event Grid integration,Azure Service Bus to Event Grid integration overview - Azure Service Bus,Integrate Azure Service Bus with Event Grid,This article provides a description of how Azure Service Bus messaging integrates with Azure Event Grid.,"Service Bus can emit events to Event Grid when there are messages in a queue or a subscription when no receivers are present. You can create Event Grid subscriptions to your Service Bus namespaces, listen to these events, and then react to the events by starting a receiver. With this feature, you can use Service Bus in reactive programming models.
The key scenario of this feature is that Service Bus queues or subscriptions with a low volume of messages do not need to have a receiver that polls f",2022-12-09T00:00:00.000Z,conceptual,integrations,0.7,True,"Explains Service Bus events emitted to Event Grid and how to subscribe/react, including event types and integration behavior specific to these services.",unchanged diff --git a/products/azure-service-bus/report.md b/products/azure-service-bus/report.md index 855233f0..055ccd53 100644 --- a/products/azure-service-bus/report.md +++ b/products/azure-service-bus/report.md @@ -1,9 +1,9 @@ --- -generated_at: '2026-04-12' +generated_at: '2026-04-26' category_descriptions: - security: 'Securing Service Bus: identity-based auth, SAS, keys and encryption, - TLS, network isolation (VNet, Private Link, firewall), Azure Policy, and compliance - configuration.' + security: 'Securing Service Bus: Entra ID/managed identity auth, SAS and auth rules, + CMK encryption, TLS settings, network isolation (VNet, Private Link, firewalls), + and Azure Policy/compliance.' integrations: Patterns and code for integrating Service Bus with JMS, AMQP, RabbitMQ, Event Grid/Logic Apps/Functions, subscription filters, and batch message operations/migration scenarios @@ -13,9 +13,9 @@ category_descriptions: decision-making: Guidance on choosing Service Bus vs other messaging services/tiers, configuring autoforwarding, geo-disaster recovery/replication, and migrating from Standard to Premium. - configuration: Configuring Service Bus entities, filters, sessions, partitioning, - monitoring, and management via portal, PowerShell, ARM, and local emulator, plus - message browsing, counts, and replication. + configuration: 'Configuring and managing Service Bus entities: forwarding, dead-lettering, + sessions, partitioning, monitoring/metrics, SQL filters/actions, PowerShell/ARM + management, and local emulation.' 
best-practices: 'Guidance on reliable Service Bus messaging: ordering, sessions, TTL/expiration, duplicate detection, dead-lettering, locks/settlement, serialization, and performance tuning (prefetch, throughput).' @@ -31,16 +31,15 @@ category_descriptions: skill_description: Expert knowledge for Azure Service Bus development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when - using queues/topics, sessions, filters, geo-replication/DR, or Premium scaling and - throughput limits, and other Azure Service Bus related development tasks. Not for - Azure Event Hubs (use azure-event-hubs), Azure Event Grid (use azure-event-grid), - Azure Relay (use azure-relay), Azure Queue Storage (use azure-queue-storage). -use_when: Use when using queues/topics, sessions, filters, geo-replication/DR, or - Premium scaling and throughput limits, and other Azure Service Bus related development - tasks. -confusable_not_for: Not for Azure Event Hubs (use azure-event-hubs), Azure Event Grid - (use azure-event-grid), Azure Relay (use azure-relay), Azure Queue Storage (use - azure-queue-storage). + using queues/topics, sessions, filters, dead-lettering, geo-recovery, or Premium + scaling, and other Azure Service Bus related development tasks. Not for Azure Event + Hubs (use azure-event-hubs), Azure Relay (use azure-relay), Azure Queue Storage + (use azure-queue-storage), Azure Web PubSub (use azure-web-pubsub). +use_when: Use when using queues/topics, sessions, filters, dead-lettering, geo-recovery, + or Premium scaling, and other Azure Service Bus related development tasks. +confusable_not_for: Not for Azure Event Hubs (use azure-event-hubs), Azure Relay (use + azure-relay), Azure Queue Storage (use azure-queue-storage), Azure Web PubSub (use + azure-web-pubsub). 
--- # Azure Service Bus Crawl Report @@ -54,8 +53,8 @@ confusable_not_for: Not for Azure Event Hubs (use azure-event-hubs), Azure Event ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 0 -- **Unchanged**: 123 +- **Updated Pages**: 9 +- **Unchanged**: 114 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-service-bus/azure-service-bus.csv` @@ -76,6 +75,27 @@ confusable_not_for: Not for Azure Event Hubs (use azure-event-hubs), Azure Event ## Changes +### Updated Pages + +- [Azure Policy built-ins](https://learn.microsoft.com/en-us/azure/service-bus-messaging/policy-reference) + - Updated: 2024-02-06T08:00:00.000Z → 2025-08-06T05:10:00.000Z +- [Authentication and authorization](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-authentication-and-authorization) + - Updated: 2025-03-21T08:00:00.000Z → 2026-04-23T06:20:00.000Z +- [Authentication with Shared Access Signatures](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-sas) + - Updated: 2024-12-11T08:00:00.000Z → 2026-04-23T06:20:00.000Z +- [Authenticate with managed identities for Azure resources](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-managed-service-identity) + - Updated: 2026-03-18T06:15:00.000Z → 2026-04-23T06:20:00.000Z +- [Authenticate from an application](https://learn.microsoft.com/en-us/azure/service-bus-messaging/authenticate-application) + - Updated: 2025-04-29T08:00:00.000Z → 2026-04-23T06:20:00.000Z +- [Network security](https://learn.microsoft.com/en-us/azure/service-bus-messaging/network-security) + - Updated: 2026-01-26T23:23:00.000Z → 2026-04-23T06:20:00.000Z +- [Network security perimeter](https://learn.microsoft.com/en-us/azure/service-bus-messaging/network-security-perimeter) + - Updated: 2026-04-10T22:10:00.000Z → 2026-04-23T06:20:00.000Z +- [Enable duplicate detection for a queue or 
topic](https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-duplicate-detection) + - Updated: 2023-10-16T22:16:00.000Z → 2026-04-22T17:34:00.000Z +- [Enable partitions (basic / standard)](https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-partitions-basic-standard) + - Updated: 2023-10-16T22:16:00.000Z → 2026-04-22T17:34:00.000Z + ## Classified Pages | TOC Title | Type | Confidence | Reason | @@ -84,7 +104,6 @@ confusable_not_for: Not for Azure Event Hubs (use azure-event-hubs), Azure Event | [AMQP errors](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-amqp-troubleshoot) | troubleshooting | 0.90 | Explicit troubleshooting guide listing specific AMQP error codes/messages, causes, and how to resolve them by recreating connections/links. | | [Service Bus exceptions](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-messaging-exceptions-latest) | troubleshooting | 0.90 | Provides a catalog of Service Bus .NET client exceptions with meanings and recommended handling, including transient vs non-transient guidance. | | [Service Bus exceptions (deprecated)](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-messaging-exceptions) | troubleshooting | 0.90 | Similar to index 34 but for deprecated SDKs; lists exception types and suggested actions, which are product-specific troubleshooting mappings. | -| [Authenticate from an application](https://learn.microsoft.com/en-us/azure/service-bus-messaging/authenticate-application) | security | 0.85 | Covers Entra ID auth and Azure RBAC roles for Service Bus entities, including specific role names and permission scopes. 
| | [Enforce minimum required TLS version](https://learn.microsoft.com/en-us/azure/service-bus-messaging/transport-layer-security-enforce-minimum-version) | security | 0.85 | Explains how to configure minimum TLS version, including supported versions and namespace-level settings—product-specific security configuration. | | [Optimize performance](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-performance-improvements) | best-practices | 0.85 | Explicitly a best-practices article with concrete recommendations for send/receive patterns, batching, connection management, and configuration choices specific to Service Bus performance. | | [Resource Manager exceptions](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-resource-manager-exceptions) | troubleshooting | 0.85 | Lists specific ARM exception codes/messages for Service Bus and maps them to causes and suggested actions, matching troubleshooting criteria. | @@ -92,19 +111,18 @@ confusable_not_for: Not for Azure Event Hubs (use azure-event-hubs), Azure Event | [Allow access from specific IP addresses](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-ip-filtering) | security | 0.80 | Contains specific firewall rule properties, allowed value formats (IPv4, CIDR), and behavior when rules are applied, which are security settings. | | [Allow access from specific virtual networks](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-service-endpoints) | security | 0.80 | Explains how to bind namespaces to VNet subnets with service endpoints, including specific configuration properties and access behavior. | | [Allow access via private endpoints](https://learn.microsoft.com/en-us/azure/service-bus-messaging/private-link-service) | security | 0.80 | Provides concrete steps and settings for creating private endpoints and configuring Private Link for Service Bus, which are security/network configurations. 
| -| [Authentication and authorization](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-authentication-and-authorization) | security | 0.80 | Explains concrete mechanisms (SAS vs Entra ID), likely including RBAC roles, scopes, and key management practices specific to Service Bus. | -| [Authentication with Shared Access Signatures](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-sas) | security | 0.80 | Details SAS rule structure, rights, and token usage specific to Service Bus, including key-based authorization patterns. | +| [Authenticate from an application](https://learn.microsoft.com/en-us/azure/service-bus-messaging/authenticate-application) | security | 0.80 | This article focuses on Entra ID-based auth and Azure RBAC for Service Bus, which typically includes specific role names, scopes, and configuration parameters for applications and managed identities, constituting detailed security configuration guidance. | | [Compare messaging services](https://learn.microsoft.com/en-us/azure/service-bus-messaging/compare-messaging-services) | decision-making | 0.80 | Explicitly compares Azure messaging services and recommends which to use for scenarios; likely includes comparison tables and decision criteria. | | [Encrypt data using customer-managed keys](https://learn.microsoft.com/en-us/azure/service-bus-messaging/configure-customer-managed-key) | security | 0.80 | Describes how to set up CMK/BYOK with Key Vault, including key identifiers and namespace settings specific to Service Bus encryption at rest. | | [Integrate with RabbitMQ](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-integrate-with-rabbitmq) | integrations | 0.80 | Step-by-step integration guide with concrete connection parameters, routing patterns, and configuration details for bridging RabbitMQ to Service Bus. 
| | [Migrate from Standard to Premium namespaces](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-migrate-standard-premium) | decision-making | 0.80 | Describes migration steps and trade-offs between Standard and Premium tiers, including throughput, latency, and feature differences—tier selection and migration guidance. | | [Migrate to Passwordless Connections](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-migrate-azure-credentials) | security | 0.80 | Provides concrete steps and configuration details for replacing connection strings with Entra ID and RBAC, including roles and auth patterns specific to Service Bus. | | [Monitoring data reference](https://learn.microsoft.com/en-us/azure/service-bus-messaging/monitor-service-bus-reference) | configuration | 0.80 | Monitoring data reference with detailed metric names, dimensions, and log schema specific to Service Bus, which are configuration/usage details. | -| [Network security](https://learn.microsoft.com/en-us/azure/service-bus-messaging/network-security) | security | 0.80 | Describes use of service tags, IP firewall rules, service endpoints, and private endpoints with Service Bus, including product-specific settings. | | [Subscription Rule SQL action syntax](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-messaging-sql-rule-action) | configuration | 0.80 | Reference for SQL action expressions with exact syntax and capabilities, which are product-specific configuration rules. | | [Subscription Rule SQL filter syntax](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-messaging-sql-filter) | configuration | 0.80 | Provides full grammar and allowed expressions for SQL filters, including property names and operators specific to Service Bus. 
| | [Throttling](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-throttling) | limits-quotas | 0.80 | Throttling documentation typically includes concrete throughput limits, request caps, and tier-specific behaviors that are numeric and service-specific. | -| [Authenticate with managed identities for Azure resources](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-managed-service-identity) | security | 0.78 | The article provides product-specific security configuration details: how to enable managed identities for Azure resources, which Azure Service Bus RBAC roles to assign, and how to configure authentication in code using those identities. It includes concrete role names and identity usage patterns that are specific to Azure Service Bus and managed identities, which qualifies as expert security configuration knowledge rather than a generic conceptual overview. | +| [Authenticate with managed identities for Azure resources](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-managed-service-identity) | security | 0.78 | Managed identity usage with Service Bus usually specifies exact Azure RBAC roles, scope assignments, and connection configuration patterns for queues/topics; these are concrete, product-specific security and identity configuration details. | +| [Authentication with Shared Access Signatures](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-sas) | security | 0.78 | SAS documentation for Service Bus generally contains token format, required fields, allowed permissions, and configuration details for authorization rules and keys; these are product-specific security configuration parameters and patterns beyond generic SAS concepts. 
| | [AMQP request-response-based operations](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-amqp-request-response) | configuration | 0.75 | Defines the list of AMQP management operations supported by Service Bus, including operation names and parameters—effectively a configuration/operations reference. | | [Compare Azure Queues and Service Bus queues](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-azure-and-service-bus-queues-compared-contrasted) | decision-making | 0.75 | Explicit comparison of two queue technologies to guide selection; likely includes feature and behavior differences relevant to design decisions. | | [Confidential computing](https://learn.microsoft.com/en-us/azure/service-bus-messaging/confidential-computing) | security | 0.75 | Explains how to enable confidential computing on Premium namespaces and its security implications—service-specific security configuration. | @@ -112,19 +130,19 @@ confusable_not_for: Not for Azure Event Hubs (use azure-event-hubs), Azure Event | [Create an authorization rule for namespace and queue](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-resource-manager-namespace-auth-rule) | security | 0.75 | Shows how to define authorization rules in templates, including rights and scopes, which are security configuration parameters. | | [Enable auto forwarding for a queue or subscription](https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-auto-forward) | configuration | 0.75 | Details the auto-forwarding property names and how to set them via portal, CLI, PowerShell, and SDKs, which are specific configuration parameters. | | [Enable dead lettering for a queue or subscription](https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-dead-letter) | configuration | 0.75 | Shows how to configure DLQ behavior with specific flags and settings across management surfaces, which are product-specific configuration details. 
| -| [Enable duplicate detection for a queue or topic](https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-duplicate-detection) | configuration | 0.75 | Covers enabling duplicate detection with specific configuration properties (time window, flags) across tools and SDKs, which are concrete settings. | | [Enable sessions for a queue or subscription](https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-message-sessions) | configuration | 0.75 | Shows exact settings and API calls in portal, CLI, PowerShell, and SDKs to enable sessions, which are product-specific configuration parameters. | | [Java Message Service (JMS) and AMQP](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-java-how-to-use-jms-api-amqp) | integrations | 0.75 | Explains how to use JMS 1.1 over AMQP with Service Bus Standard, including configuration, connection strings, and feature limitations unique to this scenario. | | [Message replication task patterns](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-federation-patterns) | architecture-patterns | 0.75 | Provides detailed guidance for specific replication task patterns, which are product-specific architecture patterns with concrete implementation advice. | | [Messages, payloads, and serialization](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-messages-payloads) | best-practices | 0.75 | Explains message structure, properties, routing, and serialization with Service Bus-specific best practices and edge cases. | | [Partitioned queues and topics](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-partitioning) | architecture-patterns | 0.75 | Explains partitioning behavior, throughput and availability implications, and when to use partitioned entities—Service Bus–specific architecture guidance. 
| | [Test locally with Service Bus emulator](https://learn.microsoft.com/en-us/azure/service-bus-messaging/test-locally-with-service-bus-emulator) | configuration | 0.75 | Describes emulator setup via Docker and scripts, including image names, ports, and environment variables—product-specific configuration details. | +| [Authentication and authorization](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-authentication-and-authorization) | security | 0.74 | Authentication/authorization article for Service Bus typically includes concrete security details such as specific Azure RBAC role names, permission scopes, and configuration steps for keys and Entra ID; these are product-specific security settings that qualify as expert knowledge. | | [.NET](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-amqp-dotnet) | integrations | 0.70 | Describes how to configure the WindowsAzure.ServiceBus library for AMQP, including connection settings and constraints specific to this legacy integration. | | [Audit minimum required TLS version](https://learn.microsoft.com/en-us/azure/service-bus-messaging/transport-layer-security-audit-minimum-version) | security | 0.70 | Includes Azure Policy definitions and parameters targeting Service Bus TLS settings, which are security compliance configurations. | | [Azure Functions](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-to-event-grid-integration-function) | integrations | 0.70 | Tutorial for wiring Service Bus events through Event Grid into Functions/Logic Apps; includes binding configuration and event subscription parameters unique to this integration. 
| | [Azure Logic Apps](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-to-event-grid-integration-example) | integrations | 0.70 | Describes handling Service Bus events via Event Grid and Logic Apps; likely includes event schema, trigger configuration, and connector parameters specific to this integration. | | [Azure Monitor - Service Bus insights](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-insights) | configuration | 0.70 | Describes Service Bus insights experience with specific metrics, dimensions, and configuration options unique to this integration. | -| [Azure Policy built-ins](https://learn.microsoft.com/en-us/azure/service-bus-messaging/policy-reference) | security | 0.70 | Lists concrete built-in Azure Policy definitions specific to Azure Service Bus, including exact policy names, effects, and scopes. These are product-specific security/governance configurations (policy definitions and enforcement patterns) that an LLM is unlikely to know exhaustively from training, fitting the security sub-skill focused on RBAC/policy and configuration-level controls. | +| [Azure Policy built-ins](https://learn.microsoft.com/en-us/azure/service-bus-messaging/policy-reference) | security | 0.70 | Lists concrete built-in Azure Policy definitions specific to Azure Service Bus Messaging, including exact policy names and enforcement scopes, which are product-specific security/governance configurations not derivable from general training data. | | [Azure Service Bus and Azure Event Grid integration](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-to-event-grid-integration-concept) | integrations | 0.70 | Explains Service Bus events emitted to Event Grid and how to subscribe/react, including event types and integration behavior specific to these services. 
| | [Configured replication tasks](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-federation-configuration) | configuration | 0.70 | Focuses on configuration-only replication tasks using pre-built helpers; likely includes configuration parameters and settings specific to this scenario. | | [Create a namespace](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-resource-manager-namespace) | deployment | 0.70 | Includes a concrete ARM template and parameter definitions for namespace creation, which are deployment configuration details. | @@ -132,7 +150,8 @@ confusable_not_for: Not for Azure Event Hubs (use azure-event-hubs), Azure Event | [Dead-letter queues](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-dead-letter-queues) | best-practices | 0.70 | Explains DLQ behavior and how messages are moved/handled; product-specific semantics and patterns for error handling. | | [Disable local or SAS authentication](https://learn.microsoft.com/en-us/azure/service-bus-messaging/disable-local-authentication) | security | 0.70 | Page is focused on configuring authentication for Azure Service Bus by disabling local/SAS key auth in favor of Microsoft Entra ID. This is product-specific security configuration, likely including exact setting names, portal/ARM options, and required permissions. It does not fit limits, deployment, or generic best practices as closely as security. | | [Duplicate message detection](https://learn.microsoft.com/en-us/azure/service-bus-messaging/duplicate-detection) | best-practices | 0.70 | Describes duplicate detection behavior and how to use it to handle failure scenarios; includes product-specific semantics and edge cases. 
| -| [Enable partitions (basic / standard)](https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-partitions-basic-standard) | configuration | 0.70 | Provides concrete steps and flags to enable partitioning on queues/topics in specific tiers using portal, CLI, PowerShell, and SDKs. | +| [Enable duplicate detection for a queue or topic](https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-duplicate-detection) | configuration | 0.70 | The article provides concrete, product-specific configuration steps and code for enabling duplicate detection on Service Bus queues/topics via portal, PowerShell, CLI, and SDKs. It references specific entity properties and how to set them, which are configuration details rather than generic concepts. | +| [Enable partitions (basic / standard)](https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-partitions-basic-standard) | configuration | 0.70 | The page describes how to turn on partitioning for queues and topics in Basic/Standard tiers using portal, PowerShell, CLI, and SDKs, including the specific entity property to configure. These are product-specific configuration instructions, not just conceptual guidance. | | [Geo-Disaster Recovery (metadata only)](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-geo-dr) | decision-making | 0.70 | Provides product-specific guidance on configuring Geo-DR, including Premium-only availability and namespace pairing behavior, which drives DR design decisions. | | [Geo-Replication](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-geo-replication) | decision-making | 0.70 | Covers how Geo-Replication works across regions and when to use it to protect against outages, including metadata/data replication behavior that informs architecture choices. 
| | [Get message counters](https://learn.microsoft.com/en-us/azure/service-bus-messaging/message-counters) | configuration | 0.70 | Provides exact API/ARM property names and patterns to query message counters, including performance considerations specific to Service Bus. | @@ -148,6 +167,7 @@ confusable_not_for: Not for Azure Event Hubs (use azure-event-hubs), Azure Event | [Message transfers, locks, and settlement](https://learn.microsoft.com/en-us/azure/service-bus-messaging/message-transfers-locks-settlement) | best-practices | 0.70 | Describes send/receive semantics, locks, and settlement operations; includes product-specific behavior and guidance for reliable processing. | | [Migrate from ActiveMQ to Azure Service Bus](https://learn.microsoft.com/en-us/azure/service-bus-messaging/migrate-jms-activemq-to-servicebus) | integrations | 0.70 | Migration guide with product-specific JMS/AMQP configuration patterns and code-level changes unique to Azure Service Bus versus ActiveMQ/Amazon MQ. | | [Monitor Azure Service Bus](https://learn.microsoft.com/en-us/azure/service-bus-messaging/monitor-service-bus) | configuration | 0.70 | Explains how to wire Service Bus to Azure Monitor with specific metric/log categories and configuration steps beyond generic monitoring concepts. | +| [Network security](https://learn.microsoft.com/en-us/azure/service-bus-messaging/network-security) | security | 0.70 | Network security docs for Service Bus usually detail IP firewall rule formats, service tags, and configuration of private endpoints/service endpoints; these are concrete, product-specific security settings and parameters rather than generic concepts. 
| | [Prefetch messages](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-prefetch) | best-practices | 0.70 | Explains how prefetch count works with concrete behavioral details (buffering semantics, turning off prefetch while buffer drains) and product-specific guidance on configuring prefetch, which are implementation nuances beyond generic messaging knowledge. | | [Premium messaging](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-premium-messaging) | decision-making | 0.70 | Compares standard and premium tiers with technical differences and recommendations for production; supports tier selection decisions. | | [Security controls by Azure Policy](https://learn.microsoft.com/en-us/azure/service-bus-messaging/security-controls-policy) | security | 0.70 | Lists Azure Policy built-ins and compliance controls for Service Bus, which are security/compliance configurations specific to the service. | @@ -157,7 +177,7 @@ confusable_not_for: Not for Azure Event Hubs (use azure-event-hubs), Azure Event | [Use Azure PowerShell to provision entities](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-manage-with-ps) | configuration | 0.70 | Contains concrete cmdlets, parameters, and patterns for creating/managing namespaces, queues, topics, and subscriptions via PowerShell. | | [Use Service Bus Explorer](https://learn.microsoft.com/en-us/azure/service-bus-messaging/explorer) | configuration | 0.70 | Details portal-based Explorer operations, including specific capabilities and constraints for sending, receiving, and peeking messages. | | [Use Service Bus with Java Message Service (JMS) 2.0](https://learn.microsoft.com/en-us/azure/service-bus-messaging/how-to-use-java-message-service-20) | integrations | 0.70 | How-to for using JMS 2.0 over AMQP with Azure Service Bus; likely includes product-specific API usage, configuration parameters, and integration patterns beyond generic JMS knowledge. 
| -| [Network security perimeter](https://learn.microsoft.com/en-us/azure/service-bus-messaging/network-security-perimeter) | security | 0.68 | Page is about associating an Azure Service Bus namespace with a network security perimeter, which is a product-specific security configuration. Such pages typically include precise configuration steps, required resource types, scope definitions, and possibly specific role/permission requirements for setting up the perimeter, which qualify as expert security knowledge beyond generic concepts. | +| [Network security perimeter](https://learn.microsoft.com/en-us/azure/service-bus-messaging/network-security-perimeter) | security | 0.68 | Network security perimeter integration typically includes specific configuration steps, required resource associations, and possibly policy/endpoint settings unique to Service Bus and Azure NSP, which are product-specific security configuration details. | | [ARM template](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-resource-manager-namespace-queue) | deployment | 0.65 | Provides an ARM template defining Service Bus resources and parameters; contains concrete deployment schema details beyond generic how-to. | | [ARM template](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-resource-manager-namespace-topic) | deployment | 0.65 | ARM template-based deployment of namespace, topic, and subscription; includes resource schema and parameterization useful for deployment automation. | | [Bicep](https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-resource-manager-namespace-queue-bicep) | deployment | 0.65 | Shows a reusable Bicep template with parameters and resource definitions; while a quickstart, it exposes concrete deployment resource schema and parameterization useful for production deployments. 
| diff --git a/products/azure-service-connector/azure-service-connector.csv b/products/azure-service-connector/azure-service-connector.csv index 25202947..26529fff 100644 --- a/products/azure-service-connector/azure-service-connector.csv +++ b/products/azure-service-connector/azure-service-connector.csv @@ -29,13 +29,13 @@ https://learn.microsoft.com/en-us/azure/service-connector/how-to-integrate-servi https://learn.microsoft.com/en-us/azure/service-connector/how-to-integrate-signalr,Azure SignalR Service,Integrate Azure SignalR Service with Service Connector,Integrate Azure SignalR Service using Service Connector,Integrate Azure SignalR Service into your application with Service Connector. Learn about authentication types and client types of Azure SignalR Service.,"This article shows supported clients, authentication methods, and sample code you can use to connect Azure SignalR Service to other Azure services using Service Connector. The article also shows the default environment variables and Spring Boot configurations you need to create the service connections.",2026-04-10T17:11:00.000Z,how-to,integrations,0.78,True,"The article provides specific integration details for Azure SignalR Service via Service Connector, including supported client types, authentication options, default environment variables, and Spring Boot configuration keys. These product-specific parameters and code patterns qualify as expert integration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/service-connector/how-to-integrate-sql-database,Azure SQL Database,Integrate Azure SQL Database with Service Connector,Connect Azure SQL Database via Service Connector,Learn how to integrate Azure SQL Database into your application with Service Connector by using the supported authentication methods and clients.,"In this article, we cover the supported authentication methods and clients that you can use to connect your apps to Azure SQL Database using Service Connector. 
For each supported method, we provide sample code and describe the default environment variable names, values, and configuration obtained when creating a service connection.",2026-02-18T06:15:00.000Z,how-to,integrations,0.88,True,"SQL Database integration article with supported auth methods, client usage, and default environment variable names/values and configuration from Service Connector.",unchanged https://learn.microsoft.com/en-us/azure/service-connector/how-to-integrate-storage-blob,Azure Blob Storage,Integrate Azure Blob Storage with Service Connector,Integrate Azure Blob Storage with Service Connector,Learn how to integrate Azure Blob Storage into your application with Service Connector by using the supported authentication methods and clients.,"In this article, we cover the supported authentication methods and clients that you can use to connect your apps to Azure Blob Storage using Service Connector. For each supported method, we provide sample code and describe the default environment variable names, values, and configuration obtained when creating a service connection.",2026-02-03T08:00:00.000Z,how-to,integrations,0.9,True,"Details supported auth methods, clients, sample code, and default environment variable names/values for Blob Storage; clearly an integration pattern reference.",unchanged -https://learn.microsoft.com/en-us/azure/service-connector/how-to-integrate-storage-file,Azure File,Integrate Azure Files with Service Connector,Connect Azure Files to compute with Service Connector,Integrate Azure Files into your application by using Service Connector.,"This article shows supported clients, authentication methods, and sample code you can use to connect Azure Files to other Azure services using Service Connector. 
This article also shows the default environment variables you need to create the service connections.",2026-04-13T11:11:00.000Z,how-to,integrations,0.78,True,"The page documents product-specific integration details: supported clients, authentication methods, and default environment variables used by Service Connector when wiring Azure Files to compute services. These environment variable names/values and connection patterns are configuration- and SDK-specific details that go beyond generic knowledge, fitting the integrations category.",updated -https://learn.microsoft.com/en-us/azure/service-connector/how-to-integrate-storage-queue,Azure Queue Storage,Integrate Azure Queue Storage using Service Connector,Integrate Azure Queue Storage via Service Connector,Learn how to connect Queue Storage to supported Azure compute services by using Service Connector.,"This article shows supported clients, authentication methods, and sample code you can use to connect Azure Queue Storage to other Azure services using Service Connector. The article also shows the default environment variables and Spring Boot configurations you need to create the service connections.",2026-04-13T17:16:00.000Z,how-to,integrations,0.82,True,"The article includes concrete integration patterns between Azure Queue Storage and compute services using Service Connector, including default environment variables and Spring Boot configuration properties. 
These are product-specific configuration parameters and code patterns that qualify as expert integration knowledge rather than generic tutorial content.",updated +https://learn.microsoft.com/en-us/azure/service-connector/how-to-integrate-storage-file,Azure File,Integrate Azure Files with Service Connector,Connect Azure Files to compute with Service Connector,Integrate Azure Files into your application by using Service Connector.,"This article shows supported clients, authentication methods, and sample code you can use to connect Azure Files to other Azure services using Service Connector. This article also shows the default environment variables you need to create the service connections.",2026-04-13T11:11:00.000Z,how-to,integrations,0.78,True,"The page documents product-specific integration details: supported clients, authentication methods, and default environment variables used by Service Connector when wiring Azure Files to compute services. These environment variable names/values and connection patterns are configuration- and SDK-specific details that go beyond generic knowledge, fitting the integrations category.",unchanged +https://learn.microsoft.com/en-us/azure/service-connector/how-to-integrate-storage-queue,Azure Queue Storage,Integrate Azure Queue Storage using Service Connector,Integrate Azure Queue Storage via Service Connector,Learn how to connect Queue Storage to supported Azure compute services by using Service Connector.,"This article shows supported clients, authentication methods, and sample code you can use to connect Azure Queue Storage to other Azure services using Service Connector. 
The article also shows the default environment variables and Spring Boot configurations you need to create the service connections.",2026-04-13T17:16:00.000Z,how-to,integrations,0.82,True,"The article includes concrete integration patterns between Azure Queue Storage and compute services using Service Connector, including default environment variables and Spring Boot configuration properties. These are product-specific configuration parameters and code patterns that qualify as expert integration knowledge rather than generic tutorial content.",unchanged https://learn.microsoft.com/en-us/azure/service-connector/how-to-integrate-storage-table,Azure Table Storage,Integrate Azure Table Storage with Service Connector,Connect Azure Table Storage via Service Connector,Use these code samples to integrate Azure Table Storage into your application with Service Connector.,This page shows supported authentication methods and clients. It provides sample code you can use to connect compute services to Azure Table Storage using Service Connector. You might be able to connect to Azure Table Storage in other programming languages without using Service Connector. This page also shows default environment variable names and values you get when you create the service connection.,2025-08-29T17:12:00.000Z,how-to,integrations,0.86,True,Provides Table Storage-specific integration details including supported auth/clients and default env var names/values from Service Connector.,unchanged -https://learn.microsoft.com/en-us/azure/service-connector/how-to-integrate-web-pubsub,Azure Web PubSub,Integrate Azure Web PubSub using Service Connector,Connect Azure Web PubSub using Service Connector,Learn how to connect Web PubSub to supported Azure compute services by using Service Connector.,"This article shows supported clients, authentication methods, and sample code you can use to connect Azure Web PubSub to other Azure services using Service Connector. 
The article also shows the default environment variables you need to create the service connections.",2026-04-13T17:16:00.000Z,how-to,integrations,0.78,True,"The page provides specific integration details for Azure Web PubSub with compute services through Service Connector, including supported clients, auth methods, and default environment variables. These concrete configuration and connection details are product-specific integration knowledge, aligning with the integrations sub-skill.",updated -https://learn.microsoft.com/en-us/azure/service-connector/how-to-manage-authentication,Manage authentication,Manage authentication in Service Connector,Configure authentication options for Azure Service Connector,Learn how to select and manage authentication parameters in Service Connector.,"In this guide, learn about the different authentication options available in Service Connector, and how to customize environment variables.",2026-04-13T22:10:00.000Z,how-to,security,0.7,True,"Focused on selecting and managing authentication parameters and environment variables for Service Connector. Likely includes specific auth modes, parameter names, and configuration details that are product-specific security knowledge.",updated +https://learn.microsoft.com/en-us/azure/service-connector/how-to-integrate-web-pubsub,Azure Web PubSub,Integrate Azure Web PubSub using Service Connector,Connect Azure Web PubSub using Service Connector,Learn how to connect Web PubSub to supported Azure compute services by using Service Connector.,"This article shows supported clients, authentication methods, and sample code you can use to connect Azure Web PubSub to other Azure services using Service Connector. 
The article also shows the default environment variables you need to create the service connections.",2026-04-13T17:16:00.000Z,how-to,integrations,0.78,True,"The page provides specific integration details for Azure Web PubSub with compute services through Service Connector, including supported clients, auth methods, and default environment variables. These concrete configuration and connection details are product-specific integration knowledge, aligning with the integrations sub-skill.",unchanged +https://learn.microsoft.com/en-us/azure/service-connector/how-to-manage-authentication,Manage authentication,Manage authentication in Service Connector,Configure authentication options for Azure Service Connector,Learn how to select and manage authentication parameters in Service Connector.,"In this guide, learn about the different authentication options available in Service Connector, and how to customize environment variables.",2026-04-13T22:10:00.000Z,how-to,security,0.7,True,"Focused on selecting and managing authentication parameters and environment variables for Service Connector. Likely includes specific auth modes, parameter names, and configuration details that are product-specific security knowledge.",unchanged https://learn.microsoft.com/en-us/azure/service-connector/how-to-provide-correct-parameters,Provide correct parameters,Provide correct parameters to Service Connector,Provide correct CLI parameters to Service Connector,Learn how to pass correct parameters to Service Connector to generate service connections between your Cloud resources.,"If you're using a CLI tool to manage connections, it's crucial to understand how to pass correct parameters to Service Connector. 
In this guide, you gain insights into the fundamental properties and their proper value formats.",2025-05-12T08:00:00.000Z,how-to,configuration,0.75,True,"Focuses on fundamental properties and proper value formats when passing parameters via CLI; this implies specific parameter names, formats, and constraints, which are configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/service-connector/how-to-troubleshoot-front-end-error,Troubleshoot,Service Connector troubleshooting,Diagnose and fix Azure Service Connector errors,See error messages and suggested actions for troubleshooting issues with Service Connector.,"This article lists Service Connector error messages you might encounter in the Azure portal or Azure CLI, and suggests actions you can take to troubleshoot them.",2026-04-13T17:16:00.000Z,troubleshooting,troubleshooting,0.95,True,"Explicitly described as listing Service Connector error messages with suggested actions. Contains concrete error messages and mappings from symptom to cause and resolution, which is product-specific troubleshooting knowledge.",updated +https://learn.microsoft.com/en-us/azure/service-connector/how-to-troubleshoot-front-end-error,Troubleshoot,Service Connector troubleshooting,Diagnose and fix Azure Service Connector errors,See error messages and suggested actions for troubleshooting issues with Service Connector.,"This article lists Service Connector error messages you might encounter in the Azure portal or Azure CLI, and suggests actions you can take to troubleshoot them.",2026-04-13T17:16:00.000Z,troubleshooting,troubleshooting,0.95,True,"Explicitly described as listing Service Connector error messages with suggested actions. 
Contains concrete error messages and mappings from symptom to cause and resolution, which is product-specific troubleshooting knowledge.",unchanged https://learn.microsoft.com/en-us/azure/service-connector/how-to-use-service-connector-in-aks,Use Service Connector in AKS,How to use Service Connector in Azure Kubernetes Service (AKS),,"Learn how to use Service Connector to connect AKS to other Azure services. Learn about Service Connector operations, resource management, and troubleshooting.",Azure Kubernetes Service (AKS) is one of the compute services supported by Service Connector. This article covers:,2025-03-03T12:12:00.000Z,how-to,,0.55,False,"Covers how to use Service Connector in AKS, including operations, resource management, and troubleshooting; summary is broad and does not clearly indicate structured error-code mappings or configuration parameter tables.",unchanged https://learn.microsoft.com/en-us/azure/service-connector/how-to-use-service-connector-in-function,Use Service Connector in Azure Functions,How Service Connector Helps Azure Functions Connect to Services,,"Learn about the relationship between Service Connector and Azure Functions bindings, and how to connect functions to other Azure services.",Azure Functions is one of the compute services supported by Service Connector. We recommend using bindings to connect Azure Functions with other services. You can also use client SDKs. 
This article aims to help you understand:,2025-09-18T08:00:00.000Z,concept-article,,0.5,False,"Explains relationship between Service Connector and Functions bindings; more conceptual guidance on when to use bindings vs SDKs, without clear indication of detailed configuration tables or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/service-connector/howto-mongodb-atlas-service-connection,Web app to MongoDB Atlas,Connect apps to MongoDB Atlas,,Learn how to connect apps to your MongoDB Atlas service using Service Connector in Azure.,"In this guide, you learn how to connect your app to a database within a MongoDB Atlas Cluster resource using Service Connector. Service Connector is an Azure service designed to simplify the process of connecting Azure resources together. Service Connector manages your connection's network and authentication settings to simplify the operation. This guide shows step by step instructions to connect an app deployed to Azure App Service to a MongoDB Atlas resource. You can apply a similar method to ",2025-08-01T05:11:00.000Z,how-to,,0.35,False,How-to guide for MongoDB Atlas connection; likely procedural with some parameters but summary does not indicate full configuration tables or SDK parameter catalogs.,unchanged @@ -48,7 +48,7 @@ https://learn.microsoft.com/en-us/azure/service-connector/quickstart-portal-func https://learn.microsoft.com/en-us/azure/service-connector/quickstart-portal-spring-cloud-connection,Azure Spring Apps,Quickstart: Connect Azure Spring Apps to databases and services with Service Connector,,"Learn how to connect Azure Spring Apps to databases, storage accounts, and other Azure services using Service Connector. Step-by-step guide for Azure portal and Azure CLI.","Get started with Service Connector to connect your Azure Spring Apps to databases, storage accounts, and other Azure services. 
Service Connector simplifies authentication and configuration, enabling you to connect to resources using managed identities or other authentication methods. This article provides step-by-step instructions for both the Azure portal and Azure CLI. Choose your preferred method using the tabs above. Note The Basic, Standard, and Enterprise plans entered a retirement period on Marc",2025-08-19T17:10:00.000Z,quickstart,,0.3,False,Quickstart for Azure Spring Apps; largely a getting-started walkthrough without detailed expert configuration references.,unchanged https://learn.microsoft.com/en-us/azure/service-connector/tutorial-connect-web-app-app-configuration,ASP.NET core app to App Configuration,Tutorial: Connect a Web App to Azure App Configuration with Service Connector,,Learn how you can connect an ASP.NET Core application hosted in Azure Web Apps to App Configuration using Service Connector.,"Learn how to connect an ASP.NET Core app running on Azure App Service, to Azure App Configuration, using one of the following methods: In this tutorial, use the Azure CLI to complete the following tasks:",2025-07-24T17:11:00.000Z,tutorial,,0.3,False,Tutorial for connecting Web App to App Configuration; appears as a scenario walkthrough rather than a configuration reference or troubleshooting guide.,unchanged https://learn.microsoft.com/en-us/azure/service-connector/tutorial-csharp-webapp-storage-cli,ASP.NET core app to Blob Storage,Deploy a Webapp Connected to Azure Blob Storage,,This tutorial guides you through creating and deploying a web application that connects to Azure Blob Storage using Service Connector.,"In this tutorial, you learn how to access Azure Blob Storage for a web app (not a signed-in user) running on Azure App Service by using managed identities. 
In this tutorial, you use the Azure CLI to complete the following tasks:",2026-01-12T23:36:00.000Z,tutorial,,0.3,False,Scenario tutorial for Blob Storage access; uses CLI and managed identity but not focused on exhaustive configuration or error code mappings.,unchanged -https://learn.microsoft.com/en-us/azure/service-connector/tutorial-django-webapp-postgres-cli,Python app to PostgreSQL,Tutorial: Use Service Connector to connect a Django web app to Postgres,,Deploy a Python Django web app to Azure App Service and connect it to an Azure PostgreSQL database by using Service Connector.,"In this tutorial, you learn how to deploy a data-driven Python Django web app to Azure App Service and use Service Connector to connect it to other Azure services. The sample web app stores restaurant and review information in an Azure Database for PostgreSQL database and stores photos in an Azure Storage container. You use Azure CLI to complete the following tasks: Note This tutorial is similar to the App ServiceDeploy a Python Django web app with PostgreSQL in Azuretutorial, but uses a system-",2026-04-17T11:12:00.000Z,how-to,,0.3,False,"Primarily a step-by-step tutorial for deploying a Django app and wiring Service Connector; does not emphasize configuration tables, limits, or product-specific error mappings beyond generic tutorial usage.",updated +https://learn.microsoft.com/en-us/azure/service-connector/tutorial-django-webapp-postgres-cli,Python app to PostgreSQL,Tutorial: Use Service Connector to connect a Django web app to Postgres,,Deploy a Python Django web app to Azure App Service and connect it to an Azure PostgreSQL database by using Service Connector.,"In this tutorial, you learn how to deploy a data-driven Python Django web app to Azure App Service and use Service Connector to connect it to other Azure services. The sample web app stores restaurant and review information in an Azure Database for PostgreSQL database and stores photos in an Azure Storage container. 
You use Azure CLI to complete the following tasks: Note This tutorial is similar to the App Service Deploy a Python Django web app with PostgreSQL in Azure tutorial, but uses a system-",2026-04-17T11:12:00.000Z,how-to,,0.3,False,"Primarily a step-by-step tutorial for deploying a Django app and wiring Service Connector; does not emphasize configuration tables, limits, or product-specific error mappings beyond generic tutorial usage.",unchanged https://learn.microsoft.com/en-us/azure/service-connector/tutorial-java-jboss-connect-managed-identity-mysql-database,Java JBoss EAP to MySQL,Access data with managed identity in Java JBoss EAP,,"Secure Azure Database for MySQL connectivity with managed identity from a sample Java JBoss EAP app, and apply it to other Azure services.","Azure App Service provides a highly scalable, self-patching web hosting service in Azure. It also provides a managed identity for your app, which is a turn-key solution for securing access to Azure Database for MySQL and other Azure services. Managed identities in App Service make your app more secure by eliminating secrets from your app, such as credentials in the environment variables. In this tutorial, you learn how to: If you don't have an Azure account, create a free account before you begin.",2024-12-18T08:00:00.000Z,tutorial,,0.3,False,"Tutorial for Java JBoss with managed identity; primarily step-by-step scenario, not a configuration or troubleshooting reference.",unchanged https://learn.microsoft.com/en-us/azure/service-connector/tutorial-java-spring-confluent-kafka,Spring Boot app to Kafka on Confluent Cloud,Tutorial: Deploy a Spring Boot app connected to Apache Kafka on Confluent Cloud with Service Connector in Azure Spring Apps,,Create a Spring Boot app connected to Apache Kafka on Confluent Cloud with Service Connector in Azure Spring Apps.,"Learn how to access Apache Kafka on Confluent Cloud for a Spring Boot application running on Azure Spring Apps. 
In this tutorial, you complete the following tasks: Note The Basic, Standard, and Enterprise plans entered a retirement period on March 17, 2025. For more information, see the Azure Spring Apps retirement announcement. Warning Microsoft recommends that you use the most secure authentication flow available. The authentication flow described in this procedure requires a very high degree of tr",2025-03-12T17:04:00.000Z,tutorial,,0.35,False,Spring Boot + Confluent Kafka tutorial; includes warnings about auth but is primarily a scenario deployment guide.,unchanged https://learn.microsoft.com/en-us/azure/service-connector/tutorial-java-spring-mysql,Spring app to MySQL,Tutorial: Deploy an application to Azure Spring Apps and connect it to Azure Database for MySQL Flexible Server using Service Connector,,Create a Spring Boot application connected to Azure Database for MySQL Flexible Server with Service Connector.,"In this tutorial, you'll complete the following tasks using the Azure portal or the Azure CLI. Both methods are explained in the following procedures. Note The Basic, Standard, and Enterprise plans entered a retirement period on March 17, 2025. For more information, see the Azure Spring Apps retirement announcement. Warning Microsoft recommends that you use the most secure authentication flow available. 
The authentication flow described in this procedure requires a very high degree of trust in the ap",2025-10-08T22:11:00.000Z,tutorial,,0.35,False,"Spring Boot + MySQL Flexible Server tutorial; focused on stepwise deployment and connection, not on broad configuration or troubleshooting patterns.",unchanged diff --git a/products/azure-service-connector/report.md b/products/azure-service-connector/report.md index ea124f5b..4d3b6580 100644 --- a/products/azure-service-connector/report.md +++ b/products/azure-service-connector/report.md @@ -41,8 +41,8 @@ confusable_not_for: Not for Azure App Service (use azure-app-service), Azure Fun ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 6 -- **Unchanged**: 57 +- **Updated Pages**: 0 +- **Unchanged**: 63 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-service-connector/azure-service-connector.csv` @@ -60,21 +60,6 @@ confusable_not_for: Not for Azure App Service (use azure-app-service), Azure Fun ## Changes -### Updated Pages - -- [Azure File](https://learn.microsoft.com/en-us/azure/service-connector/how-to-integrate-storage-file) - - Updated: 2024-09-11T11:20:00.000Z → 2026-04-13T11:11:00.000Z -- [Azure Queue Storage](https://learn.microsoft.com/en-us/azure/service-connector/how-to-integrate-storage-queue) - - Updated: 2024-09-25T11:13:00.000Z → 2026-04-13T17:16:00.000Z -- [Azure Web PubSub](https://learn.microsoft.com/en-us/azure/service-connector/how-to-integrate-web-pubsub) - - Updated: 2024-09-11T11:20:00.000Z → 2026-04-13T17:16:00.000Z -- [Python app to PostgreSQL](https://learn.microsoft.com/en-us/azure/service-connector/tutorial-django-webapp-postgres-cli) - - Updated: 2024-09-23T11:17:00.000Z → 2026-04-17T11:12:00.000Z -- [Manage authentication](https://learn.microsoft.com/en-us/azure/service-connector/how-to-manage-authentication) - - Updated: 2025-05-12T08:00:00.000Z → 2026-04-13T22:10:00.000Z -- 
[Troubleshoot](https://learn.microsoft.com/en-us/azure/service-connector/how-to-troubleshoot-front-end-error) - - Updated: 2023-10-19T08:00:00.000Z → 2026-04-13T17:16:00.000Z - ## Classified Pages | TOC Title | Type | Confidence | Reason | diff --git a/products/azure-service-health/azure-service-health.csv b/products/azure-service-health/azure-service-health.csv index c7319a07..ffd6f250 100644 --- a/products/azure-service-health/azure-service-health.csv +++ b/products/azure-service-health/azure-service-health.csv @@ -29,7 +29,7 @@ https://learn.microsoft.com/en-us/azure/service-health/resource-health-overview, https://learn.microsoft.com/en-us/azure/service-health/resource-health-vm-annotation,Resource Health status for Virtual Machines,Resource Health virtual machine Health Annotations - Azure Service Health,Interpret and troubleshoot VM Resource Health annotations,"Messages, meanings, and troubleshooting for virtual machines resource health statuses.",Virtual Machine (VM) health annotations let you know when something is happening that could affect your VM’s availability. The annotations include details that help explain exactly what the impact is and why it’s happening. 
See Resource types and health checks,2026-03-19T17:29:00.000Z,reference,troubleshooting,0.8,True,"Described as messages, meanings, and troubleshooting for VM health statuses; implies mappings from annotations/symptoms to causes and resolutions, which is product-specific troubleshooting knowledge.",unchanged https://learn.microsoft.com/en-us/azure/service-health/security-advisories-add-subscription,Configure subscriptions for Security advisories,Configure subscriptions for Security advisories - Azure Service Health,Configure subscription access for Azure Security advisories,This article describes how to set up and define access to Security advisories through the Azure portal.,"Security advisory impacted resources are considered sensitive when they include details that identify affected subscriptions, resources, or configurations. This information is sensitive because it reveals a customer’s security posture or enables targeted exploitation. For these reasons, such details must be shared only with individuals who hold authorized roles. Access must also align with the elevated access requirements defined for Azure Security Advisories. 
To acce",2026-01-16T23:13:00.000Z,how-to,security,0.85,True,"Details how to set up and define access to Security advisories, including sensitivity of impacted resources and elevated access requirements; this is concrete RBAC/security configuration.",unchanged https://learn.microsoft.com/en-us/azure/service-health/security-advisories-elevated-access,Security advisories overview,Security advisories overview - Azure Service Health,Access and view Azure Service Health security advisories,This article describes the Security advisories pane and that users are required to obtain elevated access roles in order to view Security advisory details.,"The Security advisories pane in Azure Service Health is a specialized dashboard designed to notify you about urgent security-related events that might affect your subscriptions. The Security advisories pane is used to communicate critical security events such as: These advisories are distinct from general health or service issues, because they often involve sensitive information and require elevated access roles to view all the details. Each advisory typically includes four key sections: Select ",2026-03-27T08:00:00.000Z,concept-article,security,0.75,True,"Describes Security advisories pane and explicitly that elevated access roles are required; contains product-specific RBAC/elevated access requirements, fitting security.",unchanged -https://learn.microsoft.com/en-us/azure/service-health/service-health-advisories,Health advisories overview,Service Health advisories - Azure Service Health,,This article describes how to view and use the Health advisories pane in Azure Service Health,"The Health advisories pane in Azure Service Health is a vital tool that helps you proactively manage your environment by highlighting non-incident issues that might need attention. 
This article explains its purpose, and outlines the types of information found on this pane.",2026-03-26T08:00:00.000Z,how-to,,0.2,False,"Health advisories pane overview; high-level description without product-specific configs, limits, or troubleshooting details.",unchanged +https://learn.microsoft.com/en-us/azure/service-health/service-health-advisories,Health advisories overview,Service Health advisories - Azure Service Health,,This article describes how to view and use the Health advisories pane in Azure Service Health,"The Health advisories pane in Azure Service Health is a vital tool that helps you proactively manage your environment by highlighting non-incident issues that might need attention. This article explains its purpose, and outlines the types of information found on this pane.",2026-04-23T22:16:00.000Z,how-to,,0.1,False,"Conceptual explanation of the Health advisories pane in Azure Service Health; describes its role and types of information but does not include numeric limits, configuration tables, troubleshooting mappings, or concrete decision criteria.",updated https://learn.microsoft.com/en-us/azure/service-health/service-health-ai-summary-timeline,AI-generated summary and timelines,Service Health Alerts AI-generated summary (Preview) - Azure Service Health,,Service health AI-generated summary and timeline overview,"When an Azure service outage happens, it can be tough to quickly find the information you actually need. You need to know which services or regions are affected and when the issue began. You also need to know what actions to take, and how the mitigation effort is progressing. That information is often scattered across multiple technical update pages, which can make the whole situation even more frustrating. 
Because of this large amount of information, it can be harder to understand how serious t",2026-03-11T08:00:00.000Z,concept-article,,0.2,False,"Overview of AI-generated summaries and timelines for service health alerts; no indication of limits, configuration parameters, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/service-health/service-health-alert-deploy-policy,Deploy Service Health alert rules at scale using Azure Policy,Deploy Service Health alert rules at scale using Azure Policy - Azure Service Health,,This article details a process by which users can deploy Service Health alerts across subscriptions via Azure policy.,This article explains how to deploy Service Health alerts across subscriptions using Azure Policy.,2026-04-10T22:08:00.000Z,how-to,,0.3,False,"Described as a process article for deploying Service Health alerts via Azure Policy. Likely a how-to/tutorial without detailed configuration parameter tables, limits, or decision matrices; no clear evidence of expert-only configuration or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/service-health/service-health-alert-overview,Service Health alerts overview,Service Health Alerts overview - Azure Service Health,,Service health alerts notify of the health of your Azure services or regions.,"The Health Alerts panel in Azure Service Health helps you view and manage alerts about service problems, planned maintenance, health advisories, and security advisories. These alerts notify you about events that could affect your resources. It shows only the alerts that you create based on the criteria you configured, such as subscription, services, regions, and event types. 
@@ -40,11 +40,11 @@ https://learn.microsoft.com/en-us/azure/service-health/service-health-alert-webh https://learn.microsoft.com/en-us/azure/service-health/service-health-alert-webhook-servicenow,Send alerts with ServiceNow,Send Azure service health alerts with ServiceNow - Azure Service Health,Send Azure Service Health alerts to ServiceNow,Get personalized notifications about service health events to your ServiceNow instance.,"This article shows you how to integrate Azure service health alerts with ServiceNow using a webhook. After setting up webhook integration with your ServiceNow instance, you get alerts through your existing notification infrastructure when Azure service issues affect you. Every time an Azure Service Health alert happens, it calls a webhook through the ServiceNow Scripted REST API.",2026-03-17T08:00:00.000Z,how-to,integrations,0.8,True,"Product-specific integration guide using ServiceNow Scripted REST API; likely includes webhook URL patterns, payload structure, and configuration fields unique to ServiceNow integration.",unchanged https://learn.microsoft.com/en-us/azure/service-health/service-health-event-properties,Service Health notifications data properties,Azure Service Health notifications data overview - Azure Service Health,,An overview of Service Health notifications data properties,"Azure Service Health notifications include different data properties depending on theevent type(such as Service Issue, Planned Maintenance, Security Advisory, or Health Advisory) and itsincident type(the specific scenario within that event). There are two ways to check the metadata of Service Health: Each type of notification serves a specific purpose and comes with its own metadata. 
This metadata helps you understand what's happening to your resources and what level of attention the issue requi",2026-03-26T22:20:00.000Z,concept-article,,0.3,False,"Describes metadata properties of notifications at a high level; no concrete config parameters, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/service-health/service-health-event-tags,Service Health event tags,Service Health event tags - Azure Service Health,,Learn how to understand and use the event tags in Azure Service Health,"This article explains the differences between Event level, Event subtypes, and Event tags in Azure Service Health. Event tags are metadata labels that help users understand the nature of service health communications. Event subtypes tags give you more information about what aspect of that issue is involved. Event level tags categorize the severity of these events into informational, warning, and critical. Event tags help users understand the type of action required, while event levels indicate the ",2026-03-03T08:00:00.000Z,overview,,0.2,False,"Describes event tags, levels, and subtypes conceptually; no specific configuration values, limits, or decision matrices with thresholds.",unchanged -https://learn.microsoft.com/en-us/azure/service-health/service-health-faq,Service Health FAQ,Azure Service Health Frequently asked Questions - Azure Service Health,,Overview of Azure Resource Health,Learn the answers to common questions about Azure Service Health.,2026-03-05T23:15:00Z,faq,,0.2,False,"FAQ page appears to be a conceptual/overview-style explanation of Azure Service Health and Resource Health.
It is unlikely to contain detailed limits, configuration tables, error-code mappings, or other product-specific expert data as defined by the sub-skill types; instead it answers common high-level questions.",unchanged +https://learn.microsoft.com/en-us/azure/service-health/service-health-faq,Service Health FAQ,Azure Service Health Frequently asked Questions - Azure Service Health,,Overview of Azure Resource Health,Learn the answers to common questions about Azure Service Health.,2026-04-24T22:11:00Z,faq,,0.2,False,"FAQ-style overview about Azure Service Health and Resource Health; primarily conceptual Q&A without detailed limits, configuration tables, error-code mappings, or other product-specific expert data.",updated https://learn.microsoft.com/en-us/azure/service-health/service-health-notification-transitions,Service Health data transitions,Azure service health notification data transitions - Azure Service Health,Understand lifecycle and retention of Service Health notifications,Service health notification data moving from active to resolved explained.,"Service Health notifications in Azure transition from any Active pane to the History pane when the status of the event changes from Active to Resolved. This move ensures that only ongoing issues remain visible in the Active view, while completed events are archived for reference.
In this article, you learn about the lifecycle of Service Health notifications, including the reasons for transitions between the panes, ways to view past records, and how long they're retained.",2026-02-25T23:09:00.000Z,concept-article,limits-quotas,0.65,True,Explains how long notifications are retained and lifecycle transitions; likely includes specific retention durations (time-based limits) which are numeric constraints not generally known.,unchanged https://learn.microsoft.com/en-us/azure/service-health/service-health-notifications-properties,Service Health notifications overview,Azure Service Health notifications overview - Azure Service Health,,Service Health notifications allow you to view Service Health messages published by Microsoft Azure.,"Azure Service Health notifications are system-generated alerts that inform you about Azure service problems or events that affect your resources. The subscription's Azure Activity Log records these notifications as part of logging many events in Azure. The Azure portal then displays them under Azure Service Health. When Azure needs to communicate something about service health, such as an outage, upcoming maintenance, or account-specific alert, it creates a Service Health event in your Activity Log",2026-03-26T06:03:00.000Z,concept-article,,0.2,False,"Conceptual overview of Service Health notifications; no detailed configuration tables, limits, or error codes.",unchanged https://learn.microsoft.com/en-us/azure/service-health/service-health-planned-maintenance,Planned maintenance overview,Planned maintenance overview - Azure Service Health,,Overview of the features and information found on the Planned maintenance pane.,"The Planned Maintenance pane in Azure Service Health is a dedicated section in the Azure portal that keeps you informed about upcoming maintenance activities.
It highlights events that can affect your Azure resources, helping you prepare in advance. Here's a breakdown of its purpose and the information it provides: This pane is designed to provide you with advance notice of scheduled maintenance events that can affect your services. -The information enables you to: Unlike unplanned outages, planne",2026-03-27T08:00:00.000Z,concept-article,,0.2,False,"Planned maintenance pane overview; no specific configuration parameters, limits, or decision matrices.",unchanged +The information enables you to: Unlike unplanned outages, planne",2026-04-20T08:00:00.000Z,concept-article,,0.1,False,"High-level overview of the Planned maintenance pane in Azure Service Health; description focuses on purpose and what information is shown, without specific limits, configuration parameters, error codes, or decision matrices.",updated https://learn.microsoft.com/en-us/azure/service-health/service-health-portal-update,Azure Service Health Portal,Azure Service Health Portal - Azure Service Health,,The Azure Service Health portal experience lets users engage with service events and manage actions to maintain the business continuity of affected applications.,"The Service Health portal is part of the Service Health service. The portal provides you with a customizable dashboard that tracks the health of your Azure services in the regions where you use them. By using the Azure Service Health portal, you can engage with service events and manage actions and alerts to maintain the business continuity of affected applications.
In this dashboard, you can track active events like ongoing service issues, upcoming planned maintenance, relevant health advisories, ",2026-03-25T08:00:00.000Z,overview,,0.2,False,"Portal overview of Service Health dashboard; no specific limits, configs, roles, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/service-health/service-issues-blade,Service issues overview,View Service issues - Azure Service Health,,This article describes how to view and use the Service issues pane,"The Service issues pane in Azure Service Health offers a detailed, real-time view of active Azure service problems. It highlights issues that could be affecting your resources. You can see which resources are impacted and review key details such as severity, status, scope, and timestamps. The information on this pane helps you stay informed and able to take action quickly if needed. This article provides a detailed explanation of the purpose of this panel and the information it provides. On the ",2026-03-26T08:00:00.000Z,overview,,0.3,False,Explains how to view the Service issues pane; appears to be UI/experience description without detailed configs or error mappings.,unchanged https://learn.microsoft.com/en-us/azure/service-health/stay-informed-security,Security notifications overview,Security notifications overview - Azure Service Health,,This article shows where you can receive Azure security notifications and three steps you can follow to ensure security alerts reach the right people in your organization.,"Security advisories and Security issues are two types of notifications that Azure provides to help you stay informed about security-related matters. Security advisories address broad threats across the environment, while Security issues focus on particular assets needing attention. By staying informed about both types of notifications, you have better protection over your Azure environment.
This article explains how you receive Azure security notifications, and the three steps you can follow to ens",2026-03-04T08:00:00.000Z,concept-article,,0.3,False,Security notifications overview and high-level steps to route alerts; likely mentions roles generically but not detailed RBAC scopes or security config parameters.,unchanged diff --git a/products/azure-service-health/report.md b/products/azure-service-health/report.md index 8020f790..9590288e 100644 --- a/products/azure-service-health/report.md +++ b/products/azure-service-health/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-12' +generated_at: '2026-04-26' category_descriptions: integrations: Using APIs, Resource Graph, webhooks, and connectors (OpsGenie, PagerDuty, ServiceNow) to query, route, and integrate Azure Service Health and Security advisories @@ -38,8 +38,8 @@ confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Reliability ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 0 -- **Unchanged**: 47 +- **Updated Pages**: 3 +- **Unchanged**: 44 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-service-health/azure-service-health.csv` @@ -56,6 +56,15 @@ confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Reliability ## Changes +### Updated Pages + +- [Service Health FAQ](https://learn.microsoft.com/en-us/azure/service-health/service-health-faq) + - Updated: 2026-03-05T23:15:00Z → 2026-04-24T22:11:00Z +- [Planned maintenance overview](https://learn.microsoft.com/en-us/azure/service-health/service-health-planned-maintenance) + - Updated: 2026-03-27T08:00:00.000Z → 2026-04-20T08:00:00.000Z +- [Health advisories overview](https://learn.microsoft.com/en-us/azure/service-health/service-health-advisories) + - Updated: 2026-03-26T08:00:00.000Z → 2026-04-23T22:16:00.000Z + ## Classified Pages | TOC Title | Type | Confidence | Reason | @@ -98,17 +107,17 @@ confusable_not_for: Not for Azure Monitor (use 
azure-monitor), Azure Reliability | [Azure Service Health Portal](https://learn.microsoft.com/en-us/azure/service-health/service-health-portal-update) | 0.20 | Portal overview of Service Health dashboard; no specific limits, configs, roles, or error mappings. | | [Billing updates overview](https://learn.microsoft.com/en-us/azure/service-health/billing-elevated-access) | 0.20 | Appears to be an overview of how to view and use billing communications in the Azure portal. No indication of numeric limits, configuration parameter tables, error-code-based troubleshooting, or other product-specific expert details. | | [Filter Service Health notifications by event level](https://learn.microsoft.com/en-us/azure/service-health/metadata-filter) | 0.20 | Explains using Event level metadata to filter/prioritize notifications; appears conceptual/UX guidance without detailed config tables or numeric thresholds. | -| [Health advisories overview](https://learn.microsoft.com/en-us/azure/service-health/service-health-advisories) | 0.20 | Health advisories pane overview; high-level description without product-specific configs, limits, or troubleshooting details. | | [Impacted Resources from Azure retirements](https://learn.microsoft.com/en-us/azure/service-health/impacted-resources-retirements) | 0.20 | Explains impacted resources for retirements and notes subset coverage; no specific limits, configuration parameters, or troubleshooting content. | | [Impacted Resources from Azure security advisories](https://learn.microsoft.com/en-us/azure/service-health/impacted-resources-security) | 0.20 | Describes feature for viewing impacted resources from security advisories and phased rollout; no detailed security configuration, limits, or troubleshooting mappings. 
| | [Impacted Resources from Service issues](https://learn.microsoft.com/en-us/azure/service-health/impacted-resources-outage) | 0.20 | Explains where to see impacted resources during service issues; appears procedural/UX-focused without numeric limits, config parameters, or error codes. | | [Impacted Resources from planned maintenance events](https://learn.microsoft.com/en-us/azure/service-health/impacted-resources-planned-maintenance) | 0.20 | Describes where to view impacted resources for planned maintenance; appears to be UI/experience explanation without expert-only numeric or config details. | -| [Planned maintenance overview](https://learn.microsoft.com/en-us/azure/service-health/service-health-planned-maintenance) | 0.20 | Planned maintenance pane overview; no specific configuration parameters, limits, or decision matrices. | | [Resource Health FAQ](https://learn.microsoft.com/en-us/azure/service-health/resource-health-faq) | 0.20 | FAQ about Azure Resource Health is likely conceptual and explanatory. The description does not indicate specific limits, error codes, configuration tables, or other detailed expert-only data. | -| [Service Health FAQ](https://learn.microsoft.com/en-us/azure/service-health/service-health-faq) | 0.20 | FAQ page appears to be a conceptual/overview-style explanation of Azure Service Health and Resource Health. It is unlikely to contain detailed limits, configuration tables, error-code mappings, or other product-specific expert data as defined by the sub-skill types; instead it answers common high-level questions. | +| [Service Health FAQ](https://learn.microsoft.com/en-us/azure/service-health/service-health-faq) | 0.20 | FAQ-style overview about Azure Service Health and Resource Health; primarily conceptual Q&A without detailed limits, configuration tables, error-code mappings, or other product-specific expert data. 
| | [Service Health event tags](https://learn.microsoft.com/en-us/azure/service-health/service-health-event-tags) | 0.20 | Describes event tags, levels, and subtypes conceptually; no specific configuration values, limits, or decision matrices with thresholds. | | [Service Health notifications overview](https://learn.microsoft.com/en-us/azure/service-health/service-health-notifications-properties) | 0.20 | Conceptual overview of Service Health notifications; no detailed configuration tables, limits, or error codes. | | [Azure Service Health overview](https://learn.microsoft.com/en-us/azure/service-health/overview) | 0.10 | High-level overview of Azure Service Health; no detailed limits, configs, roles, or error mappings. | | [Azure Status page overview](https://learn.microsoft.com/en-us/azure/service-health/azure-status-overview) | 0.10 | High-level overview of Azure Status page; no numeric limits, configuration tables, error codes, or product-specific settings. | +| [Health advisories overview](https://learn.microsoft.com/en-us/azure/service-health/service-health-advisories) | 0.10 | Conceptual explanation of the Health advisories pane in Azure Service Health; describes its role and types of information but does not include numeric limits, configuration tables, troubleshooting mappings, or concrete decision criteria. | +| [Planned maintenance overview](https://learn.microsoft.com/en-us/azure/service-health/service-health-planned-maintenance) | 0.10 | High-level overview of the Planned maintenance pane in Azure Service Health; description focuses on purpose and what information is shown, without specific limits, configuration parameters, error codes, or decision matrices. | | [Resource Health overview](https://learn.microsoft.com/en-us/azure/service-health/resource-health-overview) | 0.10 | Conceptual overview of Resource Health; no specific error codes, configs, or limits. 
| | [What's new](https://learn.microsoft.com/en-us/azure/service-health/whats-new) | 0.10 | What's New page with high-level feature announcements; no detailed limits, configs, error codes, or decision matrices. | diff --git a/products/azure-site-recovery/azure-site-recovery.csv b/products/azure-site-recovery/azure-site-recovery.csv index 200d768d..24c4fae2 100644 --- a/products/azure-site-recovery/azure-site-recovery.csv +++ b/products/azure-site-recovery/azure-site-recovery.csv @@ -18,7 +18,7 @@ https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-enable-glob https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-enable-replication-added-disk,For an added disk,Enable replication for an added Azure VM disk in Azure Site Recovery - Azure Site Recovery,Enable replication for newly added Azure VM data disks,This article describes how to enable replication for a disk added to an Azure VM that's enabled for disaster recovery with Azure Site Recovery,"This article describes how to enable replication for data disks that are added to an Azure VM that's already enabled for disaster recovery to another Azure region, using Azure Site Recovery. Enabling replication for a disk you add to a VM is supported for Azure VMs with managed disks.
When you add a new disk to an Azure VM that's replicating to another Azure region, the following occurs:",2023-01-31T23:04:00.000Z,how-to,configuration,0.75,True,Explains behavior when adding disks to already-protected VMs and how to enable replication; includes Site Recovery-specific state transitions and options.,unchanged https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-exclude-disks,Exclude disks,Exclude Azure VM disks from replication with Azure Site Recovery and Azure PowerShell - Azure Site Recovery,Exclude Azure VM disks from Site Recovery using PowerShell,Learn how to exclude disks of Azure virtual machines during Azure Site Recovery by using Azure PowerShell.,"This article describes how to exclude disks when you replicate Azure VMs. You might exclude disks to optimize the consumed replication bandwidth or the target-side resources that those disks use. Currently, this capability is available only through Azure PowerShell. Note We recommend that you use the Azure Az PowerShell module to interact with Azure. To get started, see Install Azure PowerShell.
To learn how to migrate to the Az PowerShell module, see Migrate Azure PowerShell from AzureRM to Az.",2023-05-11T11:16:00.000Z,how-to,configuration,0.8,True,Uses Az PowerShell with specific cmdlets/parameters to exclude disks; SDK/API-level configuration details unique to Site Recovery.,unchanged https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-how-to-enable-policy,Using Azure policy,Enable Azure Site Recovery for your VMs by using Azure Policy - Azure Site Recovery,Enable Site Recovery protection using Azure Policy assignments,Learn how to enable policy support to help protect your VMs by using Azure Site Recovery.,This article describes how to set up Azure Site Recovery for your resources by using Azure Policy. Azure Policy helps enforce certain business rules on your Azure resources and assess compliance of those resources.,2026-02-12T08:00:00.000Z,how-to,configuration,0.7,True,Shows how to configure Azure Policy definitions/assignments to auto-enable Site Recovery; involves specific policy parameters and effects.,unchanged -https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-how-to-enable-replication,On Azure VMs,Configure replication for Azure VMs in Azure Site Recovery - Azure Site Recovery,,"Learn how to configure replication to another region for Azure VMs, using Site Recovery.","This article describes how to enable replication of Azure VMs, from one Azure region to another.",2026-04-14T17:11:00.000Z,how-to,,0.2,False,"Step-by-step tutorial for enabling Azure Site Recovery replication for VMs. It focuses on portal workflow and basic settings, without detailed configuration parameter tables, limits/quotas, error-code-based troubleshooting, or decision matrices.
The content is procedural rather than expert reference material.",updated +https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-how-to-enable-replication,On Azure VMs,Configure replication for Azure VMs in Azure Site Recovery - Azure Site Recovery,,"Learn how to configure replication to another region for Azure VMs, using Site Recovery.","This article describes how to enable replication of Azure VMs, from one Azure region to another.",2026-04-14T17:11:00.000Z,how-to,,0.2,False,"Step-by-step tutorial for enabling Azure Site Recovery replication for VMs. It focuses on portal workflow and basic settings, without detailed configuration parameter tables, limits/quotas, error-code-based troubleshooting, or decision matrices. The content is procedural rather than expert reference material.",unchanged https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-how-to-enable-replication-ade-vms,On encrypted VMs,Enable replication for encrypted Azure VMs in Azure Site Recovery - Azure Site Recovery,Configure Site Recovery for ADE-encrypted Azure virtual machines,This article describes how to configure replication for Azure Disk Encryption-enabled VMs from one Azure region to another by using Site Recovery.,"This article describes how to replicate Azure VMs with Azure Disk Encryption (ADE) enabled, from one Azure region to another. Note Site Recovery currently supports ADE, with and without Microsoft Entra ID for VMs running Windows operating systems. For Linux operating systems, we only support ADE without Microsoft Entra ID. Moreover, for machines running ADE 1.1 (without Microsoft Entra ID), the VMs must be using managed disks. VMs with unmanaged disks aren't supported. 
If you switch from ADE 0.1",2025-10-31T08:00:00.000Z,how-to,configuration,0.8,True,"Details support matrix (Windows vs Linux, ADE versions, managed vs unmanaged disks) and required settings; product-specific encryption configuration and constraints.",unchanged https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-how-to-enable-replication-cmk-disks,On CMK enabled disks,Enable replication of encrypted Azure VMs in Azure Site Recovery - Azure Site Recovery,Enable Site Recovery for VMs using CMK-encrypted managed disks,This article describes how to configure replication for VMs with customer-managed key (CMK) enabled disks from one Azure region to another by using Site Recovery.,"This article describes how to replicate Azure VMs with Customer-Managed Keys (CMK) enabled managed disks, from one Azure region to another.",2025-10-31T08:00:00.000Z,how-to,configuration,0.8,True,"Product-specific guidance for CMK-enabled disks replication; includes required Key Vault, disk, and Site Recovery settings.",unchanged https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-how-to-enable-replication-private-endpoints,Enable replication with private endpoints,Enable replication for private endpoints in Azure Site Recovery - Azure Site Recovery,Configure private endpoint-based replication for Site Recovery,This article describes how to configure replication for VMs with private endpoints from one Azure region to another by using Site Recovery.,Azure Site Recovery allows you to use Azure Private Link private endpoints for replicating your machines from inside an isolated virtual network. Private endpoint access to a recovery vault is supported in all Azure Commercial & Government regions.
This article provides instructions for you to perform the following steps: Following is a reference architecture on how the replication workflow changes with private endpoints.,2025-05-11T08:00:00.000Z,how-to,configuration,0.8,True,"Covers use of Private Link endpoints for vault access, including reference architecture and required settings; product-specific networking configuration.",unchanged @@ -31,7 +31,7 @@ https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-powershell, https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-protection-errors,Protection errors,Troubleshoot Azure VM replication in Azure Site Recovery - protection errors - Azure Site Recovery,Resolve protection errors in Azure-to-Azure VM replication,Troubleshoot protection errors when replicating Azure virtual machines for disaster recovery.,"This article describes how to troubleshoot common errors in Azure Site Recovery during replication and recovery of Azure virtual machines (VM) from one region to another. For more information about supported configurations, see the support matrix for replicating Azure VMs.",2025-12-09T08:00:00.000Z,troubleshooting,troubleshooting,0.9,True,"Explicit troubleshooting guide for protection errors; will contain specific error codes/messages and their resolutions, which is classic expert troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-quickstart,Set up disaster recovery on an Azure VM,Set up Azure VM disaster recovery to a secondary region with Azure Site Recovery - Azure Site Recovery,,"Quickly set up disaster recovery to another Azure region for an Azure VM, using the Azure Site Recovery service.","The Azure Site Recovery service contributes to your business continuity and disaster recovery (BCDR) strategy by keeping your business applications online during planned and unplanned outages.
Site Recovery manages and orchestrates disaster recovery of on-premises machines and Azure virtual machines (VM), including replication, failover, and recovery. Azure Site Recovery has an option of High Churn, enabling you to configure disaster recovery for Azure VMs having data churn up to 100 MB/s. This hel",2025-09-09T08:00:00.000Z,quickstart,,0.2,False,Quickstart for enabling DR to another region; primarily step-by-step tutorial without detailed configuration tables or limits.,unchanged https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-replicate-after-migration,Set up disaster recovery for Azure VMs after migration to Azure,Set up disaster recovery after migration to Azure with Azure Site Recovery - Azure Site Recovery,Prepare migrated Azure VMs for cross-region disaster recovery,This article describes how to prepare machines to set up disaster recovery between Azure regions after migration to Azure using Azure Site Recovery.,"Follow this article if you've migrated on-premises machines to Azure VMs using the Site Recovery service, and you now want to get the VMs set up for disaster recovery to a secondary Azure region.
The article describes how to ensure that the Azure VM agent is installed on migrated VMs, and how to remove the Site Recovery Mobility service that's no longer needed after migration.",2025-12-08T08:00:00.000Z,how-to,configuration,0.7,True,Describes removing Mobility service and ensuring Azure VM agent for post-migration DR; product-specific configuration and cleanup steps.,unchanged -https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-support-matrix,Azure to Azure,Support Matrix for Azure VM Disaster Recovery with Azure Site Recovery - Azure Site Recovery,Check Azure Site Recovery VM support matrix,Summarizes support for Azure VMs disaster recovery to a secondary region with Azure Site Recovery.,This article summarizes support and prerequisites for disaster recovery (DR) of Azure virtual machines (VMs) from one Azure region to another by using Azure Site Recovery.,2026-04-14T08:00:00.000Z,concept-article,limits-quotas,0.78,True,"Support matrices for Azure Site Recovery typically list precise supported/unsupported configurations, OS versions, disk types, and sometimes size or count constraints per scenario and region.
These are product-specific, versioned details that an LLM won't reliably know from training and function as de facto limits/constraints for what is supported.",updated +https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-support-matrix,Azure to Azure,Support Matrix for Azure VM Disaster Recovery with Azure Site Recovery - Azure Site Recovery,Check Azure Site Recovery VM support matrix,Summarizes support for Azure VMs disaster recovery to a secondary region with Azure Site Recovery.,This article summarizes support and prerequisites for disaster recovery (DR) of Azure virtual machines (VMs) from one Azure region to another by using Azure Site Recovery.,2026-04-14T08:00:00.000Z,concept-article,limits-quotas,0.78,True,"Support matrices for Azure Site Recovery typically list precise supported/unsupported configurations, OS versions, disk types, and sometimes size or count constraints per scenario and region. These are product-specific, versioned details that an LLM won't reliably know from training and function as de facto limits/constraints for what is supported.",unchanged https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-troubleshoot-errors,Other errors,Troubleshoot Azure VM replication in Azure Site Recovery - other issues - Azure Site Recovery,Troubleshoot other Azure Site Recovery replication issues,Troubleshoot errors when replicating Azure virtual machines for disaster recovery.,"This article describes how to troubleshoot common errors in Azure Site Recovery during replication and recovery of Azure virtual machines (VM) from one region to another.
For more information about supported configurations, see the support matrix for replicating Azure VMs.",2025-12-09T08:00:00.000Z,troubleshooting,troubleshooting,0.85,True,Covers additional replication errors; likely lists specific error IDs/messages and stepwise resolutions.,unchanged https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-troubleshoot-network-connectivity,Azure VM connectivity issues,Troubleshoot connectivity for Azure to Azure disaster recovery with Azure Site Recovery - Azure Site Recovery,Diagnose network connectivity issues for Azure-to-Azure Site Recovery,Troubleshoot connectivity issues in Azure VM disaster recovery,"This article describes the common issues related to network connectivity when you replicate and recover Azure virtual machines (VM) from one region to another region. For more information about networking requirements, see the connectivity requirements for replicating Azure VMs. For Site Recovery replication to work, outbound connectivity to specific URLs or IP ranges is required from the VM. If your VM is behind a firewall or uses network security group (NSG) rules to control outbound connectivi",2025-12-09T08:00:00.000Z,how-to,troubleshooting,0.9,True,"Focuses on connectivity issues with explicit requirements for URLs/IP ranges and NSG/firewall rules; maps symptoms to causes and fixes, which is expert troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-troubleshoot-replication,Azure VM disaster recovery replication errors,Troubleshoot replication of Azure VMs with Azure Site Recovery - Azure Site Recovery,Troubleshoot common Azure VM replication problems in Site Recovery,Troubleshoot replication in Azure VM disaster recovery with Azure Site Recovery.,"This article describes common problems in Azure Site Recovery when you're replicating and recovering Azure virtual machines (VM) from one region to another region.
It also explains how to troubleshoot the common problems. For more information about supported configurations, see the support matrix for replicating Azure VMs. Azure Site Recovery consistently replicates data from the source region to the disaster recovery region. It also creates a crash-consistent recovery point every 5 minutes. If S",2025-12-09T08:00:00.000Z,troubleshooting,troubleshooting,0.9,True,"Explicit troubleshooting article for replication; includes specific behaviors, error conditions, and resolutions unique to Site Recovery.",unchanged @@ -63,16 +63,16 @@ https://learn.microsoft.com/en-us/azure/site-recovery/disaster-recovery-for-edge https://learn.microsoft.com/en-us/azure/site-recovery/encryption-feature-deprecation,Deprecation of Site Recovery data encryption,Deprecation of Azure Site Recovery data encryption feature - Azure Site Recovery,Remediate deprecated Site Recovery data encryption feature,Get details about the Azure Site Recovery data encryption feature.,This article describes the deprecation details and the remediation action that you need to take if you're using the Azure Site Recovery data encryption feature while configuring disaster recovery of Hyper-V virtual machines (VMs) to Azure.,2026-02-11T12:11:00.000Z,how-to,security,0.7,True,"Deprecation article will specify which encryption options/parameters are affected and required remediation steps, including security-related configuration details.",unchanged https://learn.microsoft.com/en-us/azure/site-recovery/exclude-disks-replication,Exclude disks from replication,Exclude disks from replication with Azure Site Recovery - Azure Site Recovery,,How to exclude disks from replication to Azure with Azure Site Recovery.,"This article describes how to exclude disks from replication during disaster recovery from on-premises to Azure by using Azure Site Recovery.
You might exclude disks from replication for a number of reasons: Important The classic experience to protect VMware machines by using ASR was retired on March 30, 2026. Learn more. Switch to the modernized experience to avoid service interruption.",2026-04-06T08:00:00.000Z,how-to,,0.3,False,"How-to article on excluding disks from replication; likely step-by-step UI/portal guidance without configuration parameter tables, limits, or product-specific error mappings in the summary.",unchanged https://learn.microsoft.com/en-us/azure/site-recovery/failover-failback-overview-modernized,Failover and failback - Modernized,About failover and failback in Azure Site Recovery - Modernized - Azure Site Recovery,,Learn about failover and failback in Azure Site Recovery - Modernized.,"This article provides an overview of failover and failback during disaster recovery of on-premises machines to Azure by using Azure Site Recovery - Modernized. For information about failover and failback in Azure Site Recovery Classic releases, see this article.",2026-02-11T08:00:00.000Z,overview,,0.4,False,Overview of failover/failback concepts in modernized experience; likely conceptual without detailed numeric thresholds or config tables.,unchanged -https://learn.microsoft.com/en-us/azure/site-recovery/feature-updates-whats-new,Feature releases,New feature updates in Azure Site Recovery - Azure Site Recovery,,Provides a summary of new feature updates in the Azure Site Recovery service.,"The Azure Site Recovery service is updated and improved on an ongoing basis. To help you stay up-to-date, this article provides you with information about the latest feature releases. This page is updated regularly. 
You can follow and subscribe to Site Recovery update notifications in the Azure updates channel.",2026-04-14T08:00:00.000Z,concept-article,,0.2,False,"Release notes/what's-new summary without clear indication of detailed limits, configuration tables, error mappings, or decision matrices; primarily update/marketing-style content rather than deep expert guidance.",updated +https://learn.microsoft.com/en-us/azure/site-recovery/feature-updates-whats-new,Feature releases,New feature updates in Azure Site Recovery - Azure Site Recovery,,Provides a summary of new feature updates in the Azure Site Recovery service.,"The Azure Site Recovery service is updated and improved on an ongoing basis. To help you stay up-to-date, this article provides you with information about the latest feature releases. This page is updated regularly. You can follow and subscribe to Site Recovery update notifications in the Azure updates channel.",2026-04-14T08:00:00.000Z,concept-article,,0.2,False,"Release notes/what's-new summary without clear indication of detailed limits, configuration tables, error mappings, or decision matrices; primarily update/marketing-style content rather than deep expert guidance.",unchanged https://learn.microsoft.com/en-us/azure/site-recovery/file-server-disaster-recovery,File Server,Protect a file server by using Azure Site Recovery - Azure Site Recovery,Protect on-premises file servers with Site Recovery,This article describes how to protect a file server by using Azure Site Recovery,"Azure Site Recovery helps you keep your business apps running during planned and unplanned outages. It supports your business continuity and disaster recovery (BCDR) strategy. Site Recovery manages and orchestrates disaster recovery for on-premises machines and Azure virtual machines (VMs). Disaster recovery includes replication, failover, and recovery of various workloads. 
This article describes how to protect a file server by using Site Recovery and makes other recommendations to suit various e",2026-02-12T08:00:00.000Z,how-to,architecture-patterns,0.6,True,File server DR article includes recommendations for different enterprise scenarios and patterns for protecting shares and data volumes.,unchanged https://learn.microsoft.com/en-us/azure/site-recovery/how-to-enable-replication-proximity-placement-groups,For proximity placement groups,Replicate Azure virtual machines running in a proximity placement group - Azure Site Recovery,,Learn how to replicate Azure virtual machines running in proximity placement groups by using Azure Site Recovery.,"Caution This article references CentOS, a Linux distribution that is End Of Life (EOL) status. Please consider your use and plan accordingly. For more information, see the CentOS End Of Life guidance. This article describes how to replicate, fail over, and fail back Azure virtual machines (VMs) running in a proximity placement group to a secondary region. Proximity placement groups are a logical grouping capability in Azure Virtual Machines. You can use them to decrease the inter-VM network latenc",2026-04-06T08:00:00.000Z,how-to,,0.3,False,"Describes how to replicate VMs in proximity placement groups; appears to be procedural guidance without explicit limits, configuration option tables, or troubleshooting content in the summary.",unchanged https://learn.microsoft.com/en-us/azure/site-recovery/how-to-migrate-run-as-accounts-managed-identity,Migrate from a Run As account to Managed Identities,Migrate from a Run As account to a managed identity - Azure Site Recovery,Migrate Site Recovery automation from Run As accounts to managed identities,This article describes how to migrate from a Run As account to a managed identity in Azure Site Recovery.,"Important This article shows you how to migrate your runbooks to use Managed Identities for Azure Site Recovery. 
Azure Automation Accounts are used by Azure Site Recovery customers to auto-update the agents of their protected virtual machines. Site Recovery creates Azure Automation Run As Accounts when you enable replication via the IaaS VM Blade and Recovery Services Vault. On Azure, managed identities eliminate the need for developers to manage credentials by providing an identity for the Az",2026-02-13T12:10:00.000Z,how-to,security,0.75,True,Shows how to reconfigure Automation runbooks and identities; includes specific identity/auth configuration steps for Site Recovery agent updates.,unchanged https://learn.microsoft.com/en-us/azure/site-recovery/how-to-move-from-classic-to-modernized-vmware-disaster-recovery,Move from classic to modernized VMware disaster recovery,How to move from classic to modernized VMware disaster recovery? - Azure Site Recovery,Migrate VMware disaster recovery from classic to modernized,This article describes how to move from classic to modernized VMware disaster recovery.,"This article explains how to move or migrate your VMware or physical machine replications from classic to modernized protection architecture. By using this migration capability, you can transfer your replicated items from a configuration server to an Azure Site Recovery replication appliance. A smart replication mechanism guides this migration. It ensures that the complete initial replication isn't performed again for noncritical replicated items, and only the differential data is transferred. 
Note",2026-02-13T12:10:00.000Z,how-to,deployment,0.6,True,Step-by-step migration from classic to modernized DR is a deployment/upgrade path with product-specific requirements and constraints.,unchanged -https://learn.microsoft.com/en-us/azure/site-recovery/hybrid-how-to-enable-replication-private-endpoints,Enable on-premises replication with private endpoints,Enable replication for on-premises machines with private endpoints - Azure Site Recovery,Configure Site Recovery replication with private endpoints,This article describes how to configure replication for on-premises machines by using private endpoints in Site Recovery.,"Azure Site Recovery allows you to use Azure Private Link private endpoints to replicate
your on-premises machines to a virtual network in Azure. Private endpoint access to a recovery vault is supported in all Azure Commercial & Government regions. Note Automatic upgrades are not supported for Private Endpoints. Learn more. The following diagram shows the replication workflow for hybrid disaster
-recovery with private endpoints. You can't create private endpoints in your on-premises network. 
To use p",2025-05-11T08:00:00.000Z,how-to,configuration,0.8,True,"Private endpoint setup requires specific subnet, DNS, and vault endpoint configuration parameters and constraints; includes product-specific settings and limitations (e.g., no automatic upgrades).",unchanged -https://learn.microsoft.com/en-us/azure/site-recovery/hydration-process,Configure on-premise disks for Azure through Hydration,Configure on-premise disks for Azure through Hydration - Azure Site Recovery,,Learn how to prepare for Configuration changes via hydration process in Azure Site Recovery.,"You have to make some changes to the VMs' configuration before the failover to ensure that the migrated VMs function properly on Azure. Azure Site Recovery handles these configuration changes via the hydration process. The hydration process is only performed for the versions of operating systems supported by Azure Site Recovery. Before you fail over, you may need to perform the required changes manually for other operating system versions that aren't listed above. If the VM is migrated without the r",2026-04-15T08:00:00.000Z,concept-article,,0.2,False,"The hydration process article appears to be a procedural/configuration-change overview for preparing VMs before failover, without clear indication of parameter tables, numeric limits, or detailed configuration matrices. It reads more like a conceptual/how-to explanation than a source of specific expert-only limits, quotas, or configuration reference.",updated +https://learn.microsoft.com/en-us/azure/site-recovery/hybrid-how-to-enable-replication-private-endpoints,Enable on-premises replication with private endpoints,Enable replication for on-premises machines with private endpoints - Azure Site Recovery,Configure Azure Site Recovery replication with private endpoints,This article describes how to configure replication for on-premises machines by using private endpoints in Site Recovery.,"Azure Site Recovery allows you to use Azure Private Link private endpoints to replicate your on-premises machines to a virtual network in Azure. Private endpoint access to a recovery vault is supported in all Azure Commercial & Government regions. Note Automatic upgrades are not supported for Private Endpoints. Learn more. The following diagram shows the replication workflow for hybrid disaster
-recovery with private endpoints. You can't create private endpoints in your on-premises network. To use p",2026-04-21T17:17:00.000Z,how-to,configuration,0.68,True,"The article gives product-specific configuration steps and constraints for enabling replication via Azure Private Link, including which networks/endpoints can be created, supported regions, and specific behavioral caveats (for example, automatic upgrades not supported with private endpoints). 
These are concrete, service-specific configuration details rather than generic concepts.",updated +https://learn.microsoft.com/en-us/azure/site-recovery/hydration-process,Configure on-premise disks for Azure through Hydration,Configure on-premise disks for Azure through Hydration - Azure Site Recovery,,Learn how to prepare for Configuration changes via hydration process in Azure Site Recovery.,"You have to make some changes to the VMs' configuration before the failover to ensure that the migrated VMs function properly on Azure. Azure Site Recovery handles these configuration changes via the hydration process. The hydration process is only performed for the versions of operating systems supported by Azure Site Recovery. Before you fail over, you may need to perform the required changes manually for other operating system versions that aren't listed above. If the VM is migrated without the r",2026-04-15T08:00:00.000Z,concept-article,,0.2,False,"The hydration process article appears to be a procedural/configuration-change overview for preparing VMs before failover, without clear indication of parameter tables, numeric limits, or detailed configuration matrices. It reads more like a conceptual/how-to explanation than a source of specific expert-only limits, quotas, or configuration reference.",unchanged https://learn.microsoft.com/en-us/azure/site-recovery/hyper-v-azure-architecture,Hyper-V to Azure architecture,Hyper-V disaster recovery architecture in Azure Site Recovery - Azure Site Recovery,,This article provides an overview of components and architecture used when deploying disaster recovery for on-premises Hyper-V VMs (without VMM) to Azure with the Azure Site Recovery service.,"This article describes the architecture and processes used when you replicate, fail over, and recover Hyper-V virtual machines (VMs) between on-premises Hyper-V hosts and Azure, using the Azure Site Recovery service. 
Hyper-V hosts can optionally be managed in System Center Virtual Machine Manager (VMM) private clouds.",2026-02-27T08:00:00.000Z,concept-article,,0.4,False,"Architecture overview of Hyper-V DR with Site Recovery; primarily conceptual description of components and flows, without decision matrices, numeric thresholds, or detailed configuration/limit tables.",unchanged https://learn.microsoft.com/en-us/azure/site-recovery/hyper-v-azure-common-questions,Hyper-V to Azure disaster recovery,Common questions for Hyper-V disaster recovery with Azure Site Recovery - Azure Site Recovery,Resolve common Hyper-V to Azure Site Recovery issues,This article summarizes common questions about setting up disaster recovery for on-premises Hyper-V VMs to Azure using the Azure Site Recovery service.,This article provides answers to common questions we see when replicating on-premises Hyper-V VMs to Azure.,2026-02-27T08:00:00.000Z,overview,troubleshooting,0.7,True,"FAQ for Hyper-V DR commonly includes specific error messages, behavioral edge cases, and product-specific clarifications (for example, about replication behavior, failover scenarios, and limitations). 
These map symptoms and questions to concrete product-specific answers, fitting troubleshooting expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/site-recovery/hyper-v-azure-failback,Fail back from Azure to Hyper-V,Fail back Hyper-V VMs from Azure with Azure Site Recovery - Azure Site Recovery,,How to fail back Hyper-V VMs to an on-premises site from Azure with Azure Site Recovery.,"This article describes how to fail back Azure VMs that were created after failover of Hyper-V VMs from an on-premises site to Azure, with Azure Site Recovery.",2025-08-29T08:00:00.000Z,tutorial,,0.4,False,"Failback procedure for Hyper-V is operational; likely lacks detailed limits, config matrices, or error-code mappings.",unchanged @@ -101,7 +101,7 @@ https://learn.microsoft.com/en-us/azure/site-recovery/monitoring-common-question https://learn.microsoft.com/en-us/azure/site-recovery/monitoring-high-churn,Monitor churn patterns on virtual machines,Monitoring churn patterns on virtual machines - Azure Site Recovery,Analyze and mitigate high churn patterns on Site Recovery VMs,Learn how to monitor churn patterns on Virtual Machines protected using Azure Site Recovery,"This article provides an overview of various tools that can be used to monitor churn patterns on a virtual machine. 
By using proper tools, it is easy to find out exactly which application is causing high churn and then further actions for that application can be taken.",2023-01-31T23:04:00.000Z,how-to,best-practices,0.6,True,Focuses on tools and methods to identify high churn and act on offending applications; this is product-specific operational guidance and likely includes concrete recommendations and gotchas.,unchanged https://learn.microsoft.com/en-us/azure/site-recovery/move-azure-vms-avset-azone,Move Azure VMs to Availability Zones,Move virtual machines to an Azure region with availability zones using Azure Site Recovery - Azure Site Recovery,,Learn how to move virtual machines to an availability zone in a different region with Site Recovery,"This article describes how to move Azure virtual machines to an availability zone in a different region. If you want to move to a different zone in the same region, review this article. Availability Zones in Azure help protect your applications and data from datacenter failures. Each Availability Zone is made up of one or more datacenters equipped with independent power, cooling, and networking. To ensure resiliency, there’s a minimum of three separate zones in all enabled regions. 
The physical s",2026-02-12T08:00:00.000Z,tutorial,,0.4,False,"Primarily a how-to migration guide for moving VMs to zones; likely step-based without detailed limits, config tables, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/site-recovery/move-from-classic-to-modernized-vmware-disaster-recovery,Classic to modernized VMware disaster recovery,Move from classic to modernized VMware disaster recovery - Azure Site Recovery,,"Learn about the architecture, necessary infrastructure, and FAQs about moving your VMware or Physical machine replications from classic to modernized protection architecture.","This article provides information about the architecture, necessary infrastructure, and FAQs about moving your VMware or Physical machine replications from classic to modernized protection architecture. With this capability to migrate, you can successfully transfer your replicated items from a configuration server to an Azure Site Recovery replication appliance. This migration is guided by a smart replication mechanism, which ensures that complete initial replication isn't performed again for noncri",2026-04-06T08:00:00.000Z,concept-article,,0.2,False,"Described as architecture, infrastructure, and FAQs for moving from classic to modernized VMware disaster recovery. 
This is primarily conceptual/architectural migration guidance without clear evidence of numeric limits, detailed config tables, or error-code-based troubleshooting in the provided summary.",unchanged -https://learn.microsoft.com/en-us/azure/site-recovery/physical-azure-disaster-recovery,Walkthrough-Set up disaster recovery,Set up disaster recovery of physical on-premises servers with Azure Site Recovery - Azure Site Recovery,,"Learn how to set up disaster recovery to Azure for on-premises Windows and Linux servers, with the Azure Site Recovery service.","The Azure Site Recovery service contributes to your disaster recovery strategy by managing and orchestrating replication, failover, and failback of on-premises machines, and Azure virtual machines (VMs). This tutorial shows how to set up disaster recovery of on-premises physical Windows and Linux servers to Azure. In this tutorial, you learn how to:",2026-04-15T08:00:00.000Z,how-to,,0.2,False,"Tutorial-style walkthrough for setting up disaster recovery of physical servers to Azure Site Recovery; based on the summary, it focuses on step-by-step setup rather than detailed limits, configuration parameter tables, error codes, or decision matrices. Lacks clear evidence of product-specific numeric limits, RBAC role lists, or troubleshooting mappings required for expert-knowledge classification.",updated +https://learn.microsoft.com/en-us/azure/site-recovery/physical-azure-disaster-recovery,Walkthrough-Set up disaster recovery,Set up disaster recovery of physical on-premises servers with Azure Site Recovery - Azure Site Recovery,,"Learn how to set up disaster recovery to Azure for on-premises Windows and Linux servers, with the Azure Site Recovery service.","The Azure Site Recovery service contributes to your disaster recovery strategy by managing and orchestrating replication, failover, and failback of on-premises machines, and Azure virtual machines (VMs). 
This tutorial shows how to set up disaster recovery of on-premises physical Windows and Linux servers to Azure. In this tutorial, you learn how to:",2026-04-15T08:00:00.000Z,how-to,,0.2,False,"Tutorial-style walkthrough for setting up disaster recovery of physical servers to Azure Site Recovery; based on the summary, it focuses on step-by-step setup rather than detailed limits, configuration parameter tables, error codes, or decision matrices. Lacks clear evidence of product-specific numeric limits, RBAC role lists, or troubleshooting mappings required for expert-knowledge classification.",unchanged https://learn.microsoft.com/en-us/azure/site-recovery/physical-azure-set-up-source,Set up the source environment,Set up the configuration server for disaster recovery of physical servers to Azure using Azure Site Recovery - Azure Site Recovery,,This article describes how to set up the on-premises configuration server for disaster recovery of on-premises physical servers to Azure.,This article describes how to set up your on-premises environment to start replicating physical servers running Windows or Linux into Azure.,2025-12-08T08:00:00.000Z,how-to,,0.4,False,Setup article for configuration server for physical servers; likely step-based instructions without structured configuration reference.,unchanged https://learn.microsoft.com/en-us/azure/site-recovery/physical-manage-configuration-server,Manage the configuration server,Manage the configuration server for physical servers in Azure Site Recovery - Azure Site Recovery,Manage configuration server for physical server DR to Azure,This article describes how to manage the Azure Site Recovery configuration server for physical server disaster recovery to Azure.,"You set up an on-premises configuration server when you use the Azure Site Recovery service for disaster recovery of physical servers to Azure. The configuration server coordinates communications between on-premises machines and Azure, and manages data replication. 
This article summarizes common tasks for managing the configuration server after it's been deployed. Note We recommend that you use the Azure Az PowerShell module to interact with Azure. To get started, see Install Azure PowerShell. To l",2026-03-02T12:12:00.000Z,tutorial,configuration,0.7,True,"Article summarizes common tasks for managing the Azure Site Recovery configuration server. Such management docs typically include product-specific settings, service roles, and possibly PowerShell parameters unique to this component, fitting configuration-focused expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/site-recovery/physical-server-azure-architecture-modernized,Physical server to Azure architecture - Modernized,Physical server to Azure disaster recovery architecture – Modernized - Azure Site Recovery,,This article provides an overview of components and architecture used when setting up disaster recovery of on-premises Windows and Linux servers to Azure with Azure Site Recovery - Modernized,"This article describes the modernized architecture and processes used when you replicate, failover, and recover physical Windows and Linux servers between an on-premises site and Azure, using the Azure Site Recovery service. For information about configuration server requirements in Classic releases, see Physical server to Azure disaster recovery architecture. Note Ensure you create a new Recovery Services vault for setting up the ASR replication appliance. 
Don't use an existing vault.",2023-12-14T08:00:00.000Z,concept-article,,0.45,False,"Architecture overview for physical server DR; conceptual description of components and flows, not detailed expert configuration or limits.",unchanged @@ -117,7 +117,7 @@ https://learn.microsoft.com/en-us/azure/site-recovery/quickstart-create-vault-te https://learn.microsoft.com/en-us/azure/site-recovery/quickstart-enable-replication,Setup disaster recovery on an on-premises VMware VM,Enable replication for VMware VM disaster recovery to Azure with Azure Site Recovery - Azure Site Recovery,,Quickly enable replication for on-premises VMware VMs with Azure Site Recovery - Modernized.,"This quickstart describes how to enable replication for on-premises VMware VMs, for disaster recovery to Azure using the Modernized VMware/Physical machine protection experience using Azure Site Recovery.",2023-10-05T14:51:00.000Z,quickstart,,0.2,False,Quickstart for enabling VMware replication; focused on basic setup steps rather than exhaustive configuration or troubleshooting content.,unchanged https://learn.microsoft.com/en-us/azure/site-recovery/recovery-plan-overview,About recovery plans,About recovery plans in Azure Site Recovery - Azure Site Recovery,,Learn about recovery plans in Azure Site Recovery.,"This article provides an overview of recovery plans in Azure Site Recovery. A recovery plan gathers machines into recovery groups for the purpose of failover. A recovery plan helps you to define a systematic recovery process, by creating small independent units that you can fail over. A unit typically represents an app in your environment. 
Use recovery plans to:",2025-01-22T08:00:00.000Z,overview,,0.5,False,"Overview of recovery plans; mostly conceptual description of grouping and sequencing, not detailed numeric limits or config parameter tables.",unchanged https://learn.microsoft.com/en-us/azure/site-recovery/region-move-cross-geos,Move Azure VMs between Government & Public regions,Move Azure virtual machines between government and public regions with Azure Site Recovery - Azure Site Recovery,,Use Azure Site Recovery to move Azure virtual machines between Azure Government and public regions.,"You might want to move your IaaS virtual machines between Azure Government and public regions to increase availability of your existing virtual machines, improve manageability, or for governance reasons, as detailed here. In addition to using the Azure Site Recovery service to manage and orchestrate disaster recovery of on-premises machines and Azure virtual machines for the purposes of business continuity and disaster recovery (BCDR), you can also use Site Recovery to manage move Azure virtual mac",2026-02-13T12:10:00.000Z,tutorial,,0.4,False,"Region move scenario description; likely procedural without detailed limits, configuration matrices, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/site-recovery/replication-appliance-support-matrix,Support requirements for Azure Site Recovery replication appliance,Support Requirements for Azure Site Recovery Replication Appliance - Azure Site Recovery,Check replication appliance requirements for VMware DR,This article describes support and requirements when you deploy the replication appliance for VMware disaster recovery to Azure with Azure Site Recovery with modernized architecture.,"This article describes support and requirements when you deploy the replication appliance for VMware disaster recovery to Azure with Azure Site Recovery with modernized architecture. 
Note The information in this article applies to Azure Site Recovery with modernized architecture. For information about configuration server requirements in classic releases, see Deprecation of classic experience to protect VMware and physical machines using Site Recovery. Create a new and exclusive Recovery Services",2026-03-02T12:12:00.000Z,faq,limits-quotas,0.8,True,"Describes support and requirements for the replication appliance, which typically includes specific OS versions, resource minimums, and configuration constraints. These are concrete support/limit details that qualify as expert knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/site-recovery/replication-appliance-support-matrix,Support requirements for Azure Site Recovery replication appliance,Support Requirements for Azure Site Recovery Replication Appliance - Azure Site Recovery,Check support matrix for Site Recovery replication appliance,This article describes support and requirements when you deploy the replication appliance for VMware disaster recovery to Azure with Azure Site Recovery with modernized architecture.,"This article describes support and requirements when you deploy the replication appliance for VMware disaster recovery to Azure with Azure Site Recovery with modernized architecture. Note The information in this article applies to Azure Site Recovery with modernized architecture. For information about configuration server requirements in classic releases, see Deprecation of classic experience to protect VMware and physical machines using Site Recovery. Create a new and exclusive Recovery Services",2026-04-21T17:17:00.000Z,faq,limits-quotas,0.78,True,"Support requirement/matrix pages for Azure Site Recovery replication appliance typically list exact supported versions, capacities, and constraints (for example, supported VMware/OS versions, maximum protected machines per appliance, resource requirements). 
These are product-specific numeric and matrix-style limits that qualify as expert knowledge and align best with the limits-quotas category.",updated https://learn.microsoft.com/en-us/azure/site-recovery/report-site-recovery,Configure Site Recovery reports,Configure Azure Site Recovery reports - Azure Site Recovery,Set up Azure Site Recovery reporting with Monitor logs and workbooks,This article describes how to configure reports for Azure Site Recovery.,"Azure Site Recovery provides a reporting solution for Backup and Disaster Recovery admins to gain insights on long-term data. This solution includes: Like Azure Backup, Azure Site Recovery offers a reporting solution that uses Azure Monitor logs and Azure workbooks. These resources help you gain insights on your estate that are protected with Site Recovery. This article shows how to set up and view Azure Site Recovery reports.",2026-02-13T08:00:00.000Z,concept-article,configuration,0.65,True,"Describes configuring reports using Monitor logs and workbooks; likely includes specific configuration steps, resource types, and workbook parameters unique to Site Recovery reporting.",unchanged https://learn.microsoft.com/en-us/azure/site-recovery/service-updates-how-to,Manage Site Recovery updates,Updates and component upgrades in Azure Site Recovery - Azure Site Recovery,,"Provides an overview of Azure Site Recovery service updates, MARS agent and component upgrades.","This article provides an overview of Azure Site Recovery updates, and describes how to upgrade Site Recovery components. Site Recovery publishes service updates on a regular basis. Updates include new features, support improvements, component updates, and bug fixes. 
In order to take advantage of the latest features and fixes, we recommend running the latest versions of Site Recovery components.",2025-11-05T08:00:00.000Z,overview,,0.2,False,"Overview of update process and component upgrades; likely procedural without detailed config tables, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/site-recovery/shared-disk-support-matrix,Azure to Azure shared disks,Support matrix for shared disks in Azure VM disaster recovery - Azure Site Recovery,Understand shared disk support for Site Recovery,This article summarizes the scenarios that shared disk in Azure Site Recovery supports for each workload type.,This article summarizes the scenarios that shared disk in Azure Site Recovery supports for each workload type.,2025-12-08T08:00:00.000Z,article,limits-quotas,0.85,True,"Shared disk support matrix will specify which workloads and scenarios are supported or limited, often in table form with precise constraints.",unchanged @@ -176,7 +176,7 @@ https://learn.microsoft.com/en-us/azure/site-recovery/vmware-azure-exclude-disk, https://learn.microsoft.com/en-us/azure/site-recovery/vmware-azure-failback,Fail back from Azure to on-premises,Fail back VMware VMs/physical servers from Azure with Azure Site Recovery - Azure Site Recovery,,"Learn how to fail back to the on-premises site after failover to Azure, during disaster recovery of VMware VMs and physical servers to Azure.","This article describes how to fail back Azure VMs to an on-premises site, following failover of on-premises VMs to Azure with Azure Site Recovery. 
After failback to on-premises, you enable replication so that the on-premises VMs start replicating to Azure.",2023-12-04T08:00:00.000Z,how-to,,0.3,False,Failback procedure description; likely a workflow tutorial rather than expert configuration or troubleshooting reference.,unchanged https://learn.microsoft.com/en-us/azure/site-recovery/vmware-azure-install-linux-master-target,Set up a Linux master target server for failback,Install a master target server for Linux VM failback with Azure Site Recovery - Azure Site Recovery,,Learn how to set up a Linux master target server for failback to an on-premises site during disaster recovery of VMware VMs to Azure using Azure Site Recovery.,"After you fail over your virtual machines to Azure, you can fail back the virtual machines to the on-premises site. To fail back, you need to reprotect the virtual machine from Azure to the on-premises site. For this process, you need an on-premises master target server to receive the traffic. If your protected virtual machine is a Windows virtual machine, you need a Windows master target. For a Linux virtual machine, you need a Linux master target. 
Read the following steps to learn how to creat",2026-02-13T08:00:00.000Z,how-to,,0.4,False,How-to guide for installing Linux master target; likely procedural without deep config parameter references or error-code mappings.,unchanged https://learn.microsoft.com/en-us/azure/site-recovery/vmware-azure-install-mobility-service,Prepare for push installation,Prepare source machines to install the Mobility Service through push installation for disaster recovery of VMware VMs and physical servers to Azure - Azure Site Recovery,Prepare source machines for Mobility Service push install,Learn how to prepare your server to install Mobility agent through push installation for disaster recovery of VMware VMs and physical servers to Azure using the Azure Site Recovery service.,"Caution This article references CentOS, a Linux distribution that is end of life (EOL). Consider your use and plan accordingly. For more information, see theCentOS End Of Life guidance. When you set up disaster recovery for VMware VMs and physical servers by usingAzure Site Recovery, install theSite Recovery Mobility serviceon each on-premises VMware VM and physical server. 
The Mobility service captures data writes on the machine and forwards them to the Site Recovery process server.",2026-02-13T12:10:00.000Z,how-to,configuration,0.7,True,"Describes preparing servers for push installation, likely including required ports, services, permissions, and OS-specific settings—product-specific configuration parameters beyond generic knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/site-recovery/vmware-azure-manage-configuration-server,Manage the configuration server for VMware,Manage the Configuration Server for Disaster Recovery with Azure Site Recovery - Azure Site Recovery,Manage Azure Site Recovery VMware configuration server,Learn about the common tasks to manage an on-premises configuration server for disaster recovery of VMware VMs and physical servers to Azure with Azure Site Recovery.,"You set up an on-premises configuration server when you useAzure Site Recoveryfor disaster recovery of VMware virtual machines (VMs) and physical servers to Azure. The configuration server coordinates communications between on-premises VMware and Azure and manages data replication. This article summarizes common tasks for managing the configuration server after it deploys. Note We recommend that you use the Azure Az PowerShell module to interact with Azure. To get started, seeInstall Azure Power",2026-04-15T08:00:00.000Z,how-to,configuration,0.68,True,"The article covers concrete, product-specific management tasks for the Azure Site Recovery configuration server (for VMware/physical to Azure DR). 
These tasks typically include specific service settings, agent/adapter configuration details, and operational parameters unique to this product, which qualify as expert configuration knowledge beyond generic concepts, even though the summary does not expose all parameter names or tables.",updated +https://learn.microsoft.com/en-us/azure/site-recovery/vmware-azure-manage-configuration-server,Manage the configuration server for VMware,Manage the Configuration Server for Disaster Recovery with Azure Site Recovery - Azure Site Recovery,Manage Azure Site Recovery VMware configuration server,Learn about the common tasks to manage an on-premises configuration server for disaster recovery of VMware VMs and physical servers to Azure with Azure Site Recovery.,"You set up an on-premises configuration server when you useAzure Site Recoveryfor disaster recovery of VMware virtual machines (VMs) and physical servers to Azure. The configuration server coordinates communications between on-premises VMware and Azure and manages data replication. This article summarizes common tasks for managing the configuration server after it deploys. Note We recommend that you use the Azure Az PowerShell module to interact with Azure. To get started, seeInstall Azure Power",2026-04-15T08:00:00.000Z,how-to,configuration,0.68,True,"The article covers concrete, product-specific management tasks for the Azure Site Recovery configuration server (for VMware/physical to Azure DR). 
These tasks typically include specific service settings, agent/adapter configuration details, and operational parameters unique to this product, which qualify as expert configuration knowledge beyond generic concepts, even though the summary does not expose all parameter names or tables.",unchanged https://learn.microsoft.com/en-us/azure/site-recovery/vmware-azure-manage-process-server,Manage process servers,Manage a process server for VMware VMs/physical server disaster recovery in Azure Site Recovery - Azure Site Recovery,Manage Site Recovery process server for VMware/physical,This article describes manage a process server for disaster recovery of VMware VMs/physical servers using Azure Site Recovery.,"This article describes common tasks for managing the Site Recovery process server. The process server is used to receive, optimize, and send replication data to Azure. It also performs a push installation of the Mobility service on VMware VMs and physical servers you want to replicate, and performs automatic discovery of on-premises machines. For replicating on-premises VMware VMs or physical servers to Azure, the process server is installed by default on the configuration server machine. 
Learn ",2022-11-03T11:27:00.000Z,how-to,configuration,0.7,True,"Process server management usually involves port requirements, cache settings, scaling parameters, and service configuration unique to Site Recovery.",unchanged https://learn.microsoft.com/en-us/azure/site-recovery/vmware-azure-manage-vcenter,Manage vCenter servers,Manage VMware vCenter servers in Azure Site Recovery - Azure Site Recovery,,This article describes how to add and manage VMware vCenter for disaster recovery of VMware VMs to Azure with Azure Site Recovery.,This article summarizes management actions on a VMware vCenter Server inAzure Site Recovery.,2022-11-03T11:27:00.000Z,how-to,,0.3,False,Summarizes management actions on vCenter; description suggests high-level management tasks rather than detailed config or troubleshooting content.,unchanged https://learn.microsoft.com/en-us/azure/site-recovery/vmware-azure-mobility-install-configuration-mgr,Automate Mobility service deployment,Automate Mobility Service for disaster recovery of installation in Azure Site Recovery - Azure Site Recovery,Automate Mobility Service installation and updates,How to automatically install the Mobility Service for VMware /physical server disaster recovery with Azure Site Recovery.,"Caution This article references CentOS, a Linux distribution that is End Of Life (EOL) status. Please consider your use and plan accordingly. For more information, see theCentOS End Of Life guidance. This article describes how to automate installation and updates for the Mobility Service agent inAzure Site Recovery. 
When you deploy Site Recovery for disaster recovery of on-premises VMware VMs and physical servers to Azure, you install the Mobility Service agent on each machine you want to replic",2023-03-03T12:19:00.000Z,how-to,configuration,0.7,True,"Covers automated install/update of Mobility Service agent, likely with specific command-line options, configuration parameters, and scheduling details unique to Site Recovery.",unchanged diff --git a/products/azure-site-recovery/report.md b/products/azure-site-recovery/report.md index d45f7eac..43afceda 100644 --- a/products/azure-site-recovery/report.md +++ b/products/azure-site-recovery/report.md @@ -1,18 +1,18 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: decision-making: 'Planning and sizing Azure Site Recovery: choosing tools vs Azure Migrate, VMware/Hyper-V DR capacity and cost estimation, managed disk pricing, failover/failback options, and classic-to-modern migration.' - configuration: 'Configuring Azure Site Recovery for VMs and clusters (Azure, VMware, - Hyper‑V, physical): replication setup, networking, encryption, appliances, policies, - monitoring, failover and failback settings.' + configuration: 'Configuring Azure Site Recovery for VMs/VMware/Hyper-V/physical: + setup, networking, encryption, policies, appliances, monitoring, and managing + replication/failover/failback settings.' troubleshooting: Diagnosing and fixing Azure Site Recovery replication, failover, agent/extension, network, and appliance issues for Azure VMs, Hyper-V, VMware, and physical servers. - limits-quotas: 'Site Recovery scale, capacity, and support limits: VM/Hyper‑V/VMware - matrices, high‑churn limits, shared disks, Mobility service usage, planner limits, - and safe use with Azure Backup.' 
+  limits-quotas: Support limits, capacity planning, and compatibility matrices for
+    Azure Site Recovery (VMware/Hyper-V/Azure VMs), including churn limits, shared
+    disks, Mobility service usage, and safe use with Backup
   integrations: 'Scripts and templates for automating ASR: PowerShell for Hyper‑V/shared disks, ExpressRoute/Traffic Manager integration, and Bicep/ARM/Terraform to deploy Recovery Services vaults.'
@@ -30,14 +30,14 @@ category_descriptions:
 skill_description: Expert knowledge for Azure Site Recovery development including
   troubleshooting, best practices, decision making, architecture & design patterns,
   limits & quotas, security, configuration, integrations & coding patterns, and deployment.
-  Use when planning ASR for VMware/Hyper‑V, automating vaults via ARM/Bicep/Terraform,
-  or designing DR for SQL/SAP/AD workloads, and other Azure Site Recovery related
-  development tasks. Not for Azure Backup (use azure-backup), Azure Migrate (use azure-migrate),
+  Use when planning ASR for VMware/Hyper‑V, configuring replication/failover, automating
+  via ARM/Bicep, or securing appliances, and other Azure Site Recovery related development
+  tasks. Not for Azure Backup (use azure-backup), Azure Migrate (use azure-migrate),
   Azure Virtual Machines (use azure-virtual-machines), Azure Virtual Machine Scale
   Sets (use azure-vm-scalesets).
-use_when: Use when planning ASR for VMware/Hyper‑V, automating vaults via ARM/Bicep/Terraform,
-  or designing DR for SQL/SAP/AD workloads, and other Azure Site Recovery related
-  development tasks.
+use_when: Use when planning ASR for VMware/Hyper‑V, configuring replication/failover,
+  automating via ARM/Bicep, or securing appliances, and other Azure Site Recovery
+  related development tasks.
 confusable_not_for: Not for Azure Backup (use azure-backup), Azure Migrate (use azure-migrate),
   Azure Virtual Machines (use azure-virtual-machines), Azure Virtual Machine Scale
   Sets (use azure-vm-scalesets). 
@@ -54,8 +54,8 @@ confusable_not_for: Not for Azure Backup (use azure-backup), Azure Migrate (use ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 6 -- **Unchanged**: 196 +- **Updated Pages**: 2 +- **Unchanged**: 200 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-site-recovery/azure-site-recovery.csv` @@ -78,18 +78,10 @@ confusable_not_for: Not for Azure Backup (use azure-backup), Azure Migrate (use ### Updated Pages -- [On Azure VMs](https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-how-to-enable-replication) - - Updated: 2025-10-31T08:00:00.000Z → 2026-04-14T17:11:00.000Z -- [Manage the configuration server for VMware](https://learn.microsoft.com/en-us/azure/site-recovery/vmware-azure-manage-configuration-server) - - Updated: 2026-01-23T23:11:00.000Z → 2026-04-15T08:00:00.000Z -- [Feature releases](https://learn.microsoft.com/en-us/azure/site-recovery/feature-updates-whats-new) - - Updated: 2025-06-13T08:00:00.000Z → 2026-04-14T08:00:00.000Z -- [Azure to Azure](https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-support-matrix) - - Updated: 2026-02-27T08:00:00.000Z → 2026-04-14T08:00:00.000Z -- [Configure on-premise disks for Azure through Hydration](https://learn.microsoft.com/en-us/azure/site-recovery/hydration-process) - - Updated: 2026-02-27T08:00:00.000Z → 2026-04-15T08:00:00.000Z -- [Walkthrough-Set up disaster recovery](https://learn.microsoft.com/en-us/azure/site-recovery/physical-azure-disaster-recovery) - - Updated: 2025-10-31T08:00:00.000Z → 2026-04-15T08:00:00.000Z +- [Support requirements for Azure Site Recovery replication appliance](https://learn.microsoft.com/en-us/azure/site-recovery/replication-appliance-support-matrix) + - Updated: 2026-03-02T12:12:00.000Z → 2026-04-21T17:17:00.000Z +- [Enable on-premises replication with private endpoints](https://learn.microsoft.com/en-us/azure/site-recovery/hybrid-how-to-enable-replication-private-endpoints) + - 
Updated: 2025-05-11T08:00:00.000Z → 2026-04-21T17:17:00.000Z ## Classified Pages @@ -122,15 +114,14 @@ confusable_not_for: Not for Azure Backup (use azure-backup), Azure Migrate (use | [Analyze the cost estimation report](https://learn.microsoft.com/en-us/azure/site-recovery/site-recovery-vmware-deployment-planner-cost-estimation) | decision-making | 0.80 | Provides detailed DR cost estimation per VM and summary graphs; used for cost-based decision-making with quantified trade-offs. | | [Configure Mobility Service Proxy Settings](https://learn.microsoft.com/en-us/azure/site-recovery/configure-mobility-service-proxy-settings) | configuration | 0.80 | Proxy configuration for Mobility Service is highly product-specific, with concrete parameters and steps for networking configuration. | | [Configure replication settings](https://learn.microsoft.com/en-us/azure/site-recovery/vmware-azure-set-up-replication) | configuration | 0.80 | Describes policy parameters (RPO, retention, frequency) for VMware replication; product-specific configuration options and ranges. | -| [Enable on-premises replication with private endpoints](https://learn.microsoft.com/en-us/azure/site-recovery/hybrid-how-to-enable-replication-private-endpoints) | configuration | 0.80 | Private endpoint setup requires specific subnet, DNS, and vault endpoint configuration parameters and constraints; includes product-specific settings and limitations (e.g., no automatic upgrades). | | [Enable replication with private endpoints](https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-how-to-enable-replication-private-endpoints) | configuration | 0.80 | Covers use of Private Link endpoints for vault access, including reference architecture and required settings; product-specific networking configuration. 
| | [Exclude disks](https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-exclude-disks) | configuration | 0.80 | Uses Az PowerShell with specific cmdlets/parameters to exclude disks; SDK/API-level configuration details unique to Site Recovery. | | [Manage network interfaces for on-premises to Azure replication](https://learn.microsoft.com/en-us/azure/site-recovery/site-recovery-manage-network-interfaces-on-premises-to-azure) | configuration | 0.80 | Discusses NIC limits per VM size and primary/secondary NIC behavior in Site Recovery scenarios; includes specific configuration constraints and mappings. | | [On CMK enabled disks](https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-how-to-enable-replication-cmk-disks) | configuration | 0.80 | Product-specific guidance for CMK-enabled disks replication; includes required Key Vault, disk, and Site Recovery settings. | | [On encrypted VMs](https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-how-to-enable-replication-ade-vms) | configuration | 0.80 | Details support matrix (Windows vs Linux, ADE versions, managed vs unmanaged disks) and required settings; product-specific encryption configuration and constraints. | -| [Support requirements for Azure Site Recovery replication appliance](https://learn.microsoft.com/en-us/azure/site-recovery/replication-appliance-support-matrix) | limits-quotas | 0.80 | Describes support and requirements for the replication appliance, which typically includes specific OS versions, resource minimums, and configuration constraints. These are concrete support/limit details that qualify as expert knowledge. | | [Using Site Recovery with Azure Backup](https://learn.microsoft.com/en-us/azure/site-recovery/site-recovery-backup-interoperability) | limits-quotas | 0.80 | Interoperability matrix between Site Recovery and Backup includes specific supported/unsupported combinations and constraints (e.g., scenarios where MARS agents can’t co-exist). 
| | [Azure to Azure](https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-support-matrix) | limits-quotas | 0.78 | Support matrices for Azure Site Recovery typically list precise supported/unsupported configurations, OS versions, disk types, and sometimes size or count constraints per scenario and region. These are product-specific, versioned details that an LLM won't reliably know from training and function as de facto limits/constraints for what is supported. | +| [Support requirements for Azure Site Recovery replication appliance](https://learn.microsoft.com/en-us/azure/site-recovery/replication-appliance-support-matrix) | limits-quotas | 0.78 | Support requirement/matrix pages for Azure Site Recovery replication appliance typically list exact supported versions, capacities, and constraints (for example, supported VMware/OS versions, maximum protected machines per appliance, resource requirements). These are product-specific numeric and matrix-style limits that qualify as expert knowledge and align best with the limits-quotas category. | | [Enable replication for VMware VMs](https://learn.microsoft.com/en-us/azure/site-recovery/vmware-azure-enable-replication) | configuration | 0.75 | Describes enabling VMware VM replication; includes Site Recovery-specific configuration parameters and prerequisites. | | [Enable replication for a physical server - Modernized](https://learn.microsoft.com/en-us/azure/site-recovery/physical-server-enable-replication) | configuration | 0.75 | How-to for enabling physical server replication; includes agent/appliance configuration and supported settings. | | [For an added disk](https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-enable-replication-added-disk) | configuration | 0.75 | Explains behavior when adding disks to already-protected VMs and how to enable replication; includes Site Recovery-specific state transitions and options. 
| @@ -178,6 +169,7 @@ confusable_not_for: Not for Azure Backup (use azure-backup), Azure Migrate (use | [Upgrade the Mobility agent (Modernized)](https://learn.microsoft.com/en-us/azure/site-recovery/upgrade-mobility-service-modernized) | configuration | 0.70 | Details automatic vs manual upgrade procedures for modernized components, likely including versioning, prerequisites, and specific settings unique to this product. | | [Using Azure policy](https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-how-to-enable-policy) | configuration | 0.70 | Shows how to configure Azure Policy definitions/assignments to auto-enable Site Recovery; involves specific policy parameters and effects. | | [Using public IP addresses with Site Recovery](https://learn.microsoft.com/en-us/azure/site-recovery/concepts-public-ip-address-with-site-recovery) | configuration | 0.70 | Explains how to set up public IPs with Site Recovery and Traffic Manager; includes product-specific configuration steps and behaviors. | +| [Enable on-premises replication with private endpoints](https://learn.microsoft.com/en-us/azure/site-recovery/hybrid-how-to-enable-replication-private-endpoints) | configuration | 0.68 | The article gives product-specific configuration steps and constraints for enabling replication via Azure Private Link, including which networks/endpoints can be created, supported regions, and specific behavioral caveats (for example, automatic upgrades not supported with private endpoints). These are concrete, service-specific configuration details rather than generic concepts. | | [Manage the configuration server for VMware](https://learn.microsoft.com/en-us/azure/site-recovery/vmware-azure-manage-configuration-server) | configuration | 0.68 | The article covers concrete, product-specific management tasks for the Azure Site Recovery configuration server (for VMware/physical to Azure DR). 
These tasks typically include specific service settings, agent/adapter configuration details, and operational parameters unique to this product, which qualify as expert configuration knowledge beyond generic concepts, even though the summary does not expose all parameter names or tables. | | [About Azure Site Recovery deployment planner](https://learn.microsoft.com/en-us/azure/site-recovery/deployment-planner-cost-estimation) | decision-making | 0.65 | Deployment planner guidance generally includes quantified sizing, bandwidth, and cost estimation logic to decide feasibility and configuration for DR deployments. | | [About Mobility service for VMware VMs and physical servers](https://learn.microsoft.com/en-us/azure/site-recovery/vmware-physical-mobility-service-overview) | limits-quotas | 0.65 | Includes a concrete quantified resource-impact detail (Mobility service uses approximately 6%-10% of resources), which is a specific numeric constraint relevant for capacity planning and not generally known; other content is overview, but this numeric range qualifies as an expert limit. | diff --git a/products/azure-speech/azure-speech.csv b/products/azure-speech/azure-speech.csv index 202bc98d..3b07873f 100644 --- a/products/azure-speech/azure-speech.csv +++ b/products/azure-speech/azure-speech.csv @@ -47,7 +47,7 @@ https://learn.microsoft.com/en-us/azure/ai-services/speech-service/how-to-custom https://learn.microsoft.com/en-us/azure/ai-services/speech-service/how-to-custom-speech-inspect-data,Part 4: Test recognition quality,Test recognition quality of a custom speech model - Speech service - Foundry Tools,Evaluate and compare custom speech model accuracy,Custom speech lets you qualitatively inspect the recognition quality of a model. You can play back uploaded audio and determine if the provided recognition result is correct.,"You can inspect the recognition quality of a custom speech model. 
You can play back uploaded audio and determine if the provided recognition result is correct. After a test is successfully created, you can see how a model transcribed the audio dataset, or compare results from two models side by side. Side-by-side model testing is useful to validate which speech recognition model is best for an application. For an objective measure of accuracy, which requires transcription datasets input, seeTest",2025-12-29T08:00:00.000Z,how-to,decision-making,0.65,True,Guides how to inspect recognition quality and compare models side by side to choose the best one; product-specific evaluation workflow aiding decisions.,unchanged https://learn.microsoft.com/en-us/azure/ai-services/speech-service/how-to-custom-speech-model-and-endpoint-lifecycle,Custom speech model lifecycle,Model lifecycle of custom speech - Speech service - Foundry Tools,Manage custom speech model and endpoint lifecycle,Custom speech provides base models for training and lets you create custom models from your data. This article describes the timelines for models and for endpoints that use these models.,"You can use a custom speech model for some time after you deploy it to your custom endpoint. But when new base models are made available, the older models are expired. You must periodically recreate and train your custom model from the latest base model to take advantage of the improved accuracy and quality. 
Here are some key terms related to the model lifecycle: Note Endpoints used byF0Speech resources are deleted after seven days.",2025-12-29T08:00:00.000Z,how-to,limits-quotas,0.75,True,"Explicitly describes model and endpoint timelines and includes a concrete limit: F0 endpoints deleted after seven days, which is a product-specific lifecycle limit.",unchanged https://learn.microsoft.com/en-us/azure/ai-services/speech-service/how-to-custom-speech-test-and-train,Training and testing datasets,Training and testing datasets - Speech service - Foundry Tools,,"Learn about types of training and testing data for a custom speech project, along with how to use and manage that data.","In a custom speech project, you can upload datasets for training, qualitative inspection, and quantitative measurement. This article covers the types of training and testing data that you can use for custom speech. Text and audio that you use to test and train a custom model should include samples from a diverse set of speakers and scenarios that you want your model to recognize. Consider these factors when you're gathering data for custom model testing and training:",2025-12-19T08:00:00.000Z,how-to,,0.3,False,"Describes dataset types and considerations at a conceptual level; no concrete limits, configs, or product-specific edge cases indicated.",unchanged -https://learn.microsoft.com/en-us/azure/ai-services/speech-service/how-to-custom-speech-train-model,Part 3: Train a model,Train a custom speech model - Speech service - Foundry Tools,,Learn how to train custom speech models. Training a speech to text model can improve recognition accuracy for the Microsoft base model or a custom model.,"In this article, you learn how to train a custom model to improve recognition accuracy from the Microsoft base model. The speech recognition accuracy and quality of a custom speech model remains consistent, even when a new base model is released. Note You pay for custom speech model usage andendpoint hosting. 
You'll also be charged for custom speech model training if the base model was created on October 1, 2023 and later. You are not charged for training if the base model was created prior to O",2026-04-17T22:08:00.000Z,how-to,,0.2,False,"From the summary, the page is primarily a how-to guide on training custom speech models and mentions billing behavior conceptually without listing concrete limits, configuration tables, error codes, or other product-specific expert details. It does not clearly match any sub-skill criteria such as limits/quotas, configuration parameters, troubleshooting codes, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/ai-services/speech-service/how-to-custom-speech-train-model,Part 3: Train a model,Train a custom speech model - Speech service - Foundry Tools,,Learn how to train custom speech models. Training a speech to text model can improve recognition accuracy for the Microsoft base model or a custom model.,"In this article, you learn how to train a custom model to improve recognition accuracy from the Microsoft base model. The speech recognition accuracy and quality of a custom speech model remains consistent, even when a new base model is released. Note You pay for custom speech model usage andendpoint hosting. You'll also be charged for custom speech model training if the base model was created on October 1, 2023 and later. You are not charged for training if the base model was created prior to O",2026-04-17T22:08:00.000Z,how-to,,0.2,False,"From the summary, the page is primarily a how-to guide on training custom speech models and mentions billing behavior conceptually without listing concrete limits, configuration tables, error codes, or other product-specific expert details. 
It does not clearly match any sub-skill criteria such as limits/quotas, configuration parameters, troubleshooting codes, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/ai-services/speech-service/how-to-custom-speech-upload-data,Part 2: Upload training and testing datasets,Upload training and testing datasets for custom speech - Speech service - Foundry Tools,Prepare and upload datasets for custom speech training,Learn about how to upload data to test or train a custom speech model.,"You need audio or text data for testing the accuracy of speech recognition or training your custom models. For information about the data types supported for testing or training your model, seeTraining and testing datasets.",2025-12-29T08:00:00.000Z,how-to,configuration,0.7,True,Covers supported data types and how to upload them for training/testing; product-specific dataset requirements and handling.,unchanged https://learn.microsoft.com/en-us/azure/ai-services/speech-service/how-to-custom-voice-training-data,Prepare training data,Professional voice fine-tuning data - Speech service - Foundry Tools,Prepare training data for professional custom voice,Learn about the data types that you can use for professional voice fine-tuning.,"When you're ready to create a custom voice for your application, the first step is to gather audio recordings and associated scripts to start professional voice fine-tuning. ""Custom voice"" is an umbrella term that includes both professional voice fine-tuning and personal voice. The Speech service uses this data for professional voice fine-tuning, creating a unique voice tuned to match the voice in the recordings. 
After you fine-tune a professional voice, you can start synthesizing speech in your",2026-02-28T06:12:00.000Z,how-to,best-practices,0.8,True,Covers data types and requirements for professional voice fine-tuning; likely includes concrete recommendations and constraints on recordings and scripts specific to this service.,unchanged https://learn.microsoft.com/en-us/azure/ai-services/speech-service/how-to-get-speech-session-id,Get Speech to text Session ID,How to get speech to text session ID and transcription ID - Foundry Tools,Retrieve Speech to text session and transcription IDs for support,Learn how to get speech to text session ID and transcription ID,"If you usespeech to textand need to open a support case, you're often asked to provide aSession IDorTranscription IDof the problematic transcriptions to debug the issue. This article explains how to get these IDs. Note",2025-12-29T23:03:00.000Z,how-to,troubleshooting,0.7,True,Support-focused guide on obtaining IDs needed for debugging; includes product-specific steps and identifiers.,unchanged @@ -68,7 +68,7 @@ https://learn.microsoft.com/en-us/azure/ai-services/speech-service/how-to-voice- https://learn.microsoft.com/en-us/azure/ai-services/speech-service/how-to-voice-live-auto-truncation,Handle voice interruptions in chat history (preview),Handle voice interruptions in chat history - Foundry Tools,Handle user interruptions and chat truncation in Voice Live,Learn how to automatically truncate conversation text results to match audio when users interrupt Voice Live agent responses during playback.,"Note This feature is currently in public preview. This preview is provided without a service-level agreement, and is not recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. A common scenario with voice agents is when users interrupt the audio playback of agent responses. 
The Voice Live service generates and returns audio responses to the client, but u",2026-02-25T12:07:00.000Z,how-to,best-practices,0.7,True,Describes how to automatically truncate conversation text to match interrupted audio; product-specific behavior and handling pattern for this edge case.,unchanged https://learn.microsoft.com/en-us/azure/ai-services/speech-service/how-to-voice-live-function-calling,Function calling in Voice Live,Function Calling with Voice Live API - Foundry Tools,,Learn how to do function calling scenarios in a voice session using the Voice Live API.,The Voice Live API supports function calling in voice conversations. This allows you to create voice assistants that can call external functions to get real-time information and include that information in their spoken responses.,2026-04-06T22:09:00.000Z,quickstart,,0.3,False,"Explains that Voice Live API supports function calling in voice conversations; appears to be a scenario/feature overview without detailed configuration tables, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/ai-services/speech-service/how-to-voice-live-interim-response,Improve tool calling and latency wait times (preview),Improve tool calling and latency wait times with interim responses - Foundry Tools,Use interim responses in Voice Live to reduce latency gaps,"Learn how to use interim responses in Voice Live to bridge wait times during tool calls or high-latency agent responses, keeping conversations natural.","Note This feature is currently in public preview. This preview is provided without a service-level agreement, and is not recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. When a voice agent calls external tools or takes time to generate a response, users experience silence. 
Interim responses bridge these wait times with short spoken messages—keeping",2026-03-08T12:04:00.000Z,how-to,best-practices,0.7,True,"Focuses on improving tool-calling and latency wait times with interim responses. This is a product-specific behavior pattern for Voice Live, likely including concrete guidance on when and how to send interim responses, configuration options, and edge cases around tool calls—actionable, product-specific best practices rather than generic theory.",unchanged -https://learn.microsoft.com/en-us/azure/ai-services/speech-service/how-to-voice-live-mcp-server,Add an MCP server (preview),Add an MCP server to Voice Live - Foundry Tools,Connect MCP servers to Azure Voice Live sessions,Learn how to connect remote MCP servers to a Voice Live session for real-time tool calling with the VoiceLive SDK.,"Note This feature is currently in public preview. This preview is provided without a service-level agreement, and is not recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews.",2026-04-14T06:04:00.000Z,how-to,integrations,0.68,True,"How-to page for wiring remote MCP servers into Voice Live via the VoiceLive SDK. This is a new, niche integration pattern with product-specific SDK usage and configuration details that are unlikely to be fully captured in generic LLM training data. It focuses on connecting an external MCP server as a real-time tool, which fits the integrations sub-skill better than generic tutorials.",new +https://learn.microsoft.com/en-us/azure/ai-services/speech-service/how-to-voice-live-mcp-server,Add an MCP server (preview),Add an MCP server to Voice Live - Foundry Tools,Connect MCP servers to Azure Voice Live sessions,Learn how to connect remote MCP servers to a Voice Live session for real-time tool calling with the VoiceLive SDK.,"Note This feature is currently in public preview. 
This preview is provided without a service-level agreement, and is not recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews.",2026-04-14T06:04:00.000Z,how-to,integrations,0.68,True,"How-to page for wiring remote MCP servers into Voice Live via the VoiceLive SDK. This is a new, niche integration pattern with product-specific SDK usage and configuration details that are unlikely to be fully captured in generic LLM training data. It focuses on connecting an external MCP server as a real-time tool, which fits the integrations sub-skill better than generic tutorials.",unchanged https://learn.microsoft.com/en-us/azure/ai-services/speech-service/how-to-voice-live-proactive-messages,How to add proactive messages (preview),Add proactive messages to a Voice Live real-time voice agent - Foundry Tools,Add proactive greetings to Voice Live agents,Learn how to make your Voice Live agent speak first by triggering proactive greetings using pregenerated assistant messages or LLM-generated messages.,"Note This feature is currently in public preview. This preview is provided without a service-level agreement, and is not recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Proactive engagement allows your Voice Live agent to speak first, before the user interacts with the system. This can make agents feel more natural, more helpful, and more responsiv",2026-03-23T06:03:00.000Z,how-to,integrations,0.7,True,"Implementing proactive messages in Voice Live requires using specific API fields or message types (for example, pregenerated assistant messages vs LLM-generated messages, trigger parameters). 
These are concrete Voice Live API integration patterns and parameters, so it fits integrations.",unchanged https://learn.microsoft.com/en-us/azure/ai-services/speech-service/improve-accuracy-phrase-list,Improve recognition with phrase list,Improve recognition accuracy with phrase list - Foundry Tools,Improve speech recognition with phrase lists,Phrase lists can be used to customize speech recognition results based on context.,"A phrase list is a list of words or phrases provided ahead of time to help improve their recognition. Adding a phrase to a phrase list increases its importance, thus making it more likely to be recognized. You can add a phrase list in real-time transcription and fast transcription. Examples of phrases include: Phrase lists are simple and lightweight: For supported phrase list locales, see Language and voice support for the Speech service. You can use phrase lists with the Speech Studio, Speech SDK, o",2025-11-05T18:22:00.000Z,how-to,best-practices,0.7,True,Describes how phrase lists affect recognition and where they can be used; this is a product-specific tuning mechanism and usage guidance.,unchanged diff --git a/products/azure-speech/report.md b/products/azure-speech/report.md index 9b479f69..f9a3190e 100644 --- a/products/azure-speech/report.md +++ b/products/azure-speech/report.md @@ -54,9 +54,9 @@ confusable_not_for: Not for Azure Communication Services (use azure-communicatio - **Unclassified**: 75 ### Incremental Update -- **New Pages**: 1 -- **Updated Pages**: 1 -- **Unchanged**: 182 +- **New Pages**: 0 +- **Updated Pages**: 0 +- **Unchanged**: 184 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-speech/azure-speech.csv` @@ -77,15 +77,6 @@ confusable_not_for: Not for Azure Communication Services (use azure-communicatio ## Changes -### New Pages - -- [Add an MCP server (preview)](https://learn.microsoft.com/en-us/azure/ai-services/speech-service/how-to-voice-live-mcp-server) - -### Updated Pages - 
-- [Part 3: Train a model](https://learn.microsoft.com/en-us/azure/ai-services/speech-service/how-to-custom-speech-train-model) - - Updated: 2025-12-19T08:00:00.000Z → 2026-04-17T22:08:00.000Z - ## Classified Pages | TOC Title | Type | Confidence | Reason | diff --git a/products/azure-sql-database/azure-sql-database.csv b/products/azure-sql-database/azure-sql-database.csv index 648fcd10..a2e130ca 100644 --- a/products/azure-sql-database/azure-sql-database.csv +++ b/products/azure-sql-database/azure-sql-database.csv @@ -40,7 +40,7 @@ https://learn.microsoft.com/en-us/azure/azure-sql/database/auditing-best-practic https://learn.microsoft.com/en-us/azure/azure-sql/database/auditing-manage-using-api?view=azuresql,Manage Auditing using APIs,Manage Auditing Using APIs - Azure SQL Database & Azure Synapse Analytics,Manage Azure SQL auditing using APIs and automation,Use Azure SQL Database auditing to track database events into an audit log.,Applies to: Azure SQL Database, Azure Synapse Analytics This article provides an overview of the different APIs used for managing Auditing for Azure SQL Database and Azure Synapse Analytics.,2025-06-10T08:00:00.000Z,concept-article,integrations,0.7,True,"Covers specific APIs for configuring auditing, including parameter names and usage patterns unique to Azure SQL and Synapse.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/auditing-managed-identity?view=azuresql,Auditing managed identity,Auditing Using Managed Identity - Azure SQL Database & Azure Synapse Analytics,Configure managed identity for Azure SQL auditing to storage,How to use managed identity with storage accounts for auditing,"Applies to: Azure SQL Database, Azure Synapse Analytics Auditing for Azure SQL Database can be configured to use a Storage account with two authentication methods: Managed Identity can be a system-assigned managed identity (SMI) or user-assigned managed identity (UMI). 
To configure writing audit logs to a storage account, go to the Azure portal, and select your logical server resource for Azure SQL Database. Select Storage in the Auditing menu. Select the Azure storage account where logs will be saved. By ",2025-06-26T17:43:00.000Z,how-to,security,0.8,True,"Details use of system- and user-assigned managed identities for auditing to storage accounts, including specific configuration steps and options.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/auditing-microsoft-support-operations?view=azuresql,Auditing Microsoft support operations,Auditing Microsoft Support Operations - Azure SQL Database & Azure Synapse Analytics,Audit Microsoft support operations on Azure SQL servers,How to use Auditing to audit Microsoft support operations.,"Applies to: Azure SQL Database, Azure Synapse Analytics Auditing of Microsoft support operations for your logical server in Azure SQL Database allows you to audit Microsoft support engineers' operations when they need to access your server during a support request. The use of this capability, along with your auditing, enables more transparency into your workforce and allows for anomaly detection, trend visualization, and data loss prevention. 
Auditing of Microsoft support operations includes the foll",2025-11-24T23:36:00.000Z,concept-article,security,0.75,True,"Explains how to enable and use auditing specifically for Microsoft support engineer operations, including what actions are logged; product-specific security/auditing configuration.",unchanged -https://learn.microsoft.com/en-us/azure/azure-sql/database/auditing-overview?view=azuresql,Auditing overview,Auditing - Azure SQL Database and Azure Synapse Analytics,,"SQL Auditing for Azure SQL Database and Azure Synapse Analytics tracks database events and writes them to an audit log in your Azure storage account, Log Analytics workspace, or Event Hubs.","Applies to: Azure SQL Database, Azure Synapse Analytics Auditing for Azure SQL Database and Azure Synapse Analytics tracks database events and writes them to an audit log in your Azure storage account, Log Analytics workspace, or Event Hubs. Auditing also: Helps you maintain regulatory compliance, understand database activity, and gain insight into discrepancies and anomalies that could indicate business concerns or suspected security violations. Enables and facilitates adherence to compliance standard",2026-04-15T22:37:00.000Z,concept-article,,0.2,False,"The auditing overview describes what auditing is, its benefits, and where logs can be written (storage account, Log Analytics, Event Hubs), but from the summary it appears to be a conceptual/feature overview without specific RBAC roles, configuration parameter tables, or detailed error/limit information. 
It does not meet the thresholds for security, configuration, or other expert-knowledge sub-skills.",updated +https://learn.microsoft.com/en-us/azure/azure-sql/database/auditing-overview?view=azuresql,Auditing overview,Auditing - Azure SQL Database and Azure Synapse Analytics,,"SQL Auditing for Azure SQL Database and Azure Synapse Analytics tracks database events and writes them to an audit log in your Azure storage account, Log Analytics workspace, or Event Hubs.","Applies to: Azure SQL Database, Azure Synapse Analytics Auditing for Azure SQL Database and Azure Synapse Analytics tracks database events and writes them to an audit log in your Azure storage account, Log Analytics workspace, or Event Hubs. Auditing also: Helps you maintain regulatory compliance, understand database activity, and gain insight into discrepancies and anomalies that could indicate business concerns or suspected security violations. Enables and facilitates adherence to compliance standard",2026-04-15T22:37:00.000Z,concept-article,,0.2,False,"The auditing overview describes what auditing is, its benefits, and where logs can be written (storage account, Log Analytics, Event Hubs), but from the summary it appears to be a conceptual/feature overview without specific RBAC roles, configuration parameter tables, or detailed error/limit information. 
It does not meet the thresholds for security, configuration, or other expert-knowledge sub-skills.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/auditing-server-level-database-level?view=azuresql,Auditing policy at the server and database level,Auditing Policy at the Server and Database Level - Azure SQL Database & Azure Synapse Analytics,Understand server vs database-level auditing policies,This article explains the differences in Auditing policies of Azure SQL Database and Azure Synapse Analytics at the server and database level.,Applies to: Azure SQL Database, Azure Synapse Analytics This article highlights Auditing policies for Azure SQL Database and Azure Synapse Analytics at the server level and the database level.,2025-06-11T22:31:00.000Z,concept-article,security,0.7,True,Clarifies differences and interactions between server-level and database-level auditing policies in Azure SQL and Synapse; product-specific security behavior.,unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/auditing-setup?view=azuresql,Auditing setup,Set up Auditing - Azure SQL Database & Azure Synapse Analytics,Configure auditing for Azure SQL and Synapse,"This article provides an overview of how to set up Auditing and store those audits to an Azure storage account, Log Analytics workspace, or Event Hubs destination.","Applies to: Azure SQL Database, Azure Synapse Analytics In this article, we go over setting up Auditing for your logical server or database in Azure SQL Database and Azure Synapse Analytics.",2025-08-19T22:37:00.000Z,how-to,security,0.75,True,How-to article for setting up auditing at server or database level and choosing destinations; includes product-specific security configuration steps.,unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-aad-configure?view=azuresql,Microsoft Entra authentication,Configure Microsoft Entra Authentication - Azure SQL Database & SQL Managed Instance & Azure Synapse 
Analytics,Configure Microsoft Entra authentication for Azure SQL,"Learn how to connect to Azure SQL Database, Azure SQL Managed Instance, and Azure Synapse Analytics using Microsoft Entra authentication.","Applies to: Azure SQL Database, Azure SQL Managed Instance, Azure Synapse Analytics This article shows you how to use Microsoft Entra ID for authentication with Azure SQL Database, Azure SQL Managed Instance, and Azure Synapse Analytics. Note Microsoft Entra ID was previously known as Azure Active Directory (Azure AD). Alternatively, you can also configure Microsoft Entra authentication for SQL Server on Azure Virtual Machines.",2026-03-03T08:00:00.000Z,how-to,security,0.78,True,"Configuration-focused article for Entra (Azure AD) authentication to Azure SQL Database, Managed Instance, and Synapse. It typically includes specific steps, T-SQL commands, and Entra/SQL role mappings and settings (e.g., contained database users, server principals, specific permission scopes), which are product-specific security configuration details rather than generic concepts.",unchanged @@ -48,7 +48,7 @@ https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-aad-di https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-aad-directory-readers-role?view=azuresql,Directory Readers role,Directory Readers Role in Microsoft Entra ID for Azure SQL - Azure SQL Database & Azure SQL Managed Instance,Understand Directory Readers role requirements for Azure SQL,Learn about the directory reader's role in Microsoft Entra for Azure SQL.,"Applies to: Azure SQL Database, Azure SQL Managed Instance, Azure Synapse Analytics Microsoft Entra ID (formerly Azure Active Directory) has introduced using groups to manage role assignments. This allows for Microsoft Entra roles to be assigned to groups. Note With Microsoft Graph support for Azure SQL, the Directory Readers role can be replaced with using lower level permissions. 
For more information, see Managed identities in Microsoft Entra for Azure SQL. When enabling a managed identity for Azure SQL ",2025-07-18T17:35:00.000Z,concept-article,security,0.8,True,Explains Directory Readers role and alternative lower-level permissions for managed identities; contains specific role/permission guidance unique to Azure SQL integration.,unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-aad-guest-users?view=azuresql,Microsoft Entra guest users and set as a Microsoft Entra admin,Create Microsoft Entra Guest Users - Azure SQL Database & Azure SQL Managed Instance,,"How to create Microsoft Entra guest users and set them as Microsoft Entra admin in Azure SQL Database, Azure SQL Managed Instance, and Azure Synapse Analytics","Applies to: Azure SQL Database, Azure SQL Managed Instance, SQL Server on Azure VM Note This feature applies to SQL Server 2022 and later on Azure Virtual Machines and Arc enabled SQL Server. Guest users with Microsoft Entra B2B collaboration are users that have accounts in an external Microsoft Entra organization or an external identity provider (for example, Outlook, Windows Live Mail, or Gmail), which isn't managed within your Microsoft Entra tenant. 
Guest user accounts are created when those indiv",2026-02-25T18:37:00.000Z,how-to,,0.0,False,Parse error: Expecting value: line 12 column 13 (char 351),unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-aad-overview?view=azuresql,Microsoft Entra authentication overview,Microsoft Entra Authentication - Azure SQL Database & Azure SQL Managed Instance & Azure Synapse Analytics,Use Microsoft Entra authentication with Azure SQL,"Learn about how to use Microsoft Entra ID for authentication with Azure SQL Database, Azure SQL Managed Instance, and Synapse SQL in Azure Synapse Analytics","Applies to: Azure SQL Database, Azure SQL Managed Instance, Azure Synapse Analytics This article provides an in-depth overview of using Microsoft Entra authentication with Azure SQL Database, Azure SQL Managed Instance, SQL Server on Azure VMs, Synapse SQL in Azure Synapse Analytics, and SQL Server for Windows and Linux. If you want to configure Microsoft Entra authentication, review: Note Microsoft Entra ID was previously known as Azure Active Directory (Azure AD).",2025-09-11T08:00:00.000Z,concept-article,security,0.75,True,In-depth overview of Entra authentication for Azure SQL and related services; includes product-specific authentication flows and configuration considerations.,unchanged -https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-aad-service-principal-tutorial?view=azuresql,Create users using service principals,Create Microsoft Entra Users Using Service Principals - Azure SQL Database,Configure Entra service principals for Azure SQL access,This tutorial walks you through creating Microsoft Entra users with a Microsoft Entra application (service principal) in Azure SQL Database.,"Applies to: Azure SQL Database This article explains how to configure a service principal so it can create Microsoft Entra users in Azure SQL Database. 
This capability lets you programmatically manage access to Azure SQL resources for users and applications in your Microsoft Entra tenant. Note Microsoft Entra ID was previously known as Azure Active Directory (Azure AD). For more information on Microsoft Entra authentication for Azure SQL, see Use Microsoft Entra authentication. In this tutorial, yo",2026-04-14T17:42:00.000Z,tutorial,security,0.7,True,"Tutorial on configuring Microsoft Entra service principals to create users in Azure SQL Database. Contains product-specific security configuration steps, including required permissions/roles and T-SQL/portal settings for authentication, which qualify as security-focused expert knowledge.",updated +https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-aad-service-principal-tutorial?view=azuresql,Create users using service principals,Create Microsoft Entra Users Using Service Principals - Azure SQL Database,Configure Entra service principals for Azure SQL access,This tutorial walks you through creating Microsoft Entra users with a Microsoft Entra application (service principal) in Azure SQL Database.,"Applies to: Azure SQL Database This article explains how to configure a service principal so it can create Microsoft Entra users in Azure SQL Database. This capability lets you programmatically manage access to Azure SQL resources for users and applications in your Microsoft Entra tenant. Note Microsoft Entra ID was previously known as Azure Active Directory (Azure AD). For more information on Microsoft Entra authentication for Azure SQL, see Use Microsoft Entra authentication. In this tutorial, yo",2026-04-14T17:42:00.000Z,tutorial,security,0.7,True,"Tutorial on configuring Microsoft Entra service principals to create users in Azure SQL Database. 
Contains product-specific security configuration steps, including required permissions/roles and T-SQL/portal settings for authentication, which qualify as security-focused expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-aad-service-principal?view=azuresql,Service principals (Applications),Microsoft Entra Service Principals with Azure SQL - Azure SQL Database & Azure SQL Managed Instance,Use Entra service principals with Azure SQL,Use Microsoft Entra service principals and managed identities in Azure SQL Database and Azure SQL Managed Instance,Applies to: Azure SQL Database, Azure SQL Managed Instance Azure SQL resources support programmatic access for applications using service principals and managed identities in Microsoft Entra ID (formerly Azure Active Directory).,2026-03-03T08:00:00.000Z,how-to,security,0.8,True,"Covers using Microsoft Entra service principals and managed identities for programmatic access to Azure SQL. This usually includes exact role names, permission scopes, and T-SQL/portal configuration steps for granting access to service principals and managed identities, which are product-specific security/identity configuration details.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-azure-ad-landing?view=azuresql,Microsoft Entra authentication documentation,Microsoft Entra authentication with SQL documentation - Azure SQL,,"Find documentation about Microsoft Entra authentication for SQL Server 2022, Azure SQL Database, and Azure SQL Managed Instance.","Find documentation about Microsoft Entra authentication for SQL Server 2022, Azure SQL Database, and Azure SQL Managed Instance.",2025-06-11T22:31:00Z,landing-page,,0.1,False,Landing page for Microsoft Entra authentication docs; navigation without detailed settings.,unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-azure-ad-logins-tutorial?view=azuresql,Microsoft Entra server 
logins,Create and Utilize Microsoft Entra Server Logins - Azure SQL Database & Azure SQL Managed Instance & Azure Synapse Analytics,,This article guides you through creating and utilizing Microsoft Entra logins in the virtual master database of Azure SQL,"Applies to: Azure SQL Database, Azure SQL Managed Instance, Azure Synapse Analytics (dedicated SQL pools only) This article guides you through creating and utilizing logins backed by Microsoft Entra ID (formerly Azure Active Directory) within the virtual master database of Azure SQL. In this tutorial, you learn how to: Note Microsoft Entra server principals (logins) are currently in public preview for Azure SQL Database. Azure SQL Managed Instance and SQL Server 2022 and later can already utilize Microso",2025-09-18T17:34:00.000Z,tutorial,,0.0,False,Parse error: Expecting value: line 12 column 13 (char 351),unchanged @@ -128,7 +128,7 @@ https://learn.microsoft.com/en-us/azure/azure-sql/database/disaster-recovery-str https://learn.microsoft.com/en-us/azure/azure-sql/database/dns-alias-overview?view=azuresql,DNS aliases,DNS alias - Azure SQL Database & Azure Synapse Analytics,Configure and use DNS aliases for Azure SQL servers,"Your applications can connect to an alias for the name of the server for Azure SQL Database. Meanwhile, you can change the SQL Database the alias points to anytime, to facilitate testing and so on.",Applies to: Azure SQL Database, Azure Synapse Analytics Azure SQL Database has a Domain Name System (DNS) server. PowerShell and REST APIs accept calls to create and manage DNS aliases for your logical SQL server name. A DNS alias can be used in place of the server name. Client programs can use the alias in their connection strings. The DNS alias provides a translation layer that can redirect your client programs to different servers. 
This layer spares you the difficulties of having to find and edit all ,2025-01-24T08:00:00.000Z,concept-article,configuration,0.7,True,Explains how to create and manage DNS aliases via PowerShell/REST and how they affect connection strings; product-specific configuration feature.,unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/dns-alias-powershell-create?view=azuresql,DNS alias PowerShell,DNS Alias (PowerShell & Azure CLI) - Azure SQL Database & Azure Synapse Analytics,Manage Azure SQL DNS aliases with PowerShell and CLI,"PowerShell and Azure CLI cmdlets enable you to redirect new client connections to a different SQL server in Azure, without having to touch any client configuration.",Applies to: Azure SQL Database, Azure Synapse Analytics This article provides Azure PowerShell Az module or Azure CLI scripts to demonstrate how you can manage a DNS alias for the Azure SQL logical server hosting your Azure SQL Database.,2025-01-21T23:37:00.000Z,how-to,configuration,0.7,True,"Shows specific PowerShell and CLI commands and parameters for DNS alias management on Azure SQL logical servers, which are product-specific configuration patterns.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/doc-changes-updates-release-notes-whats-new-archive?view=azuresql,Azure SQL Database,What's New? (Archive) - Azure SQL Database,,Learn about the features and documentation improvements for Azure SQL Database made in previous years. (Archive),"Applies to: Azure SQL Database This article summarizes older documentation changes associated with new features and improvements in the releases of Azure SQL Database. To learn more about Azure SQL Database, see What is Azure SQL Database? 
Return to What's new in Azure SQL Database?",2026-03-03T18:40:00.000Z,whats-new,,0.1,False,"Archive of 'what's new' release-note links and documentation changes; no detailed limits, configs, troubleshooting mappings, or other structured expert data indicated.",unchanged -https://learn.microsoft.com/en-us/azure/azure-sql/database/doc-changes-updates-release-notes-whats-new?view=azuresql,What's new?,What's new? - Azure SQL Database,,Learn about the new features and documentation improvements for Azure SQL Database.,"Applies to: Azure SQL Database This article summarizes the documentation changes associated with new features and improvements in the recent releases of Azure SQL Database. For more information about Azure SQL Database, see What is Azure SQL Database? Tip Deploy Azure SQL Database for free, for the life of your Azure subscription. This free offer provides up to ten free General Purpose databases, each with 100,000 vCore seconds of compute, every month. For more announcements, discussion, and commun",2026-04-13T22:36:00.000Z,whats-new,,0.2,False,"Release notes and 'what's new' summary; primarily announcements and high-level feature changes, not detailed limits, configuration matrices, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/azure-sql/database/doc-changes-updates-release-notes-whats-new?view=azuresql,What's new?,What's new? - Azure SQL Database,,Learn about the new features and documentation improvements for Azure SQL Database.,"Applies to: Azure SQL Database This article summarizes the documentation changes associated with new features and improvements in the recent releases of Azure SQL Database. For more information about Azure SQL Database, see What is Azure SQL Database? Tip Deploy Azure SQL Database for free, for the life of your Azure subscription. This free offer provides up to ten free General Purpose databases, each with 100,000 vCore seconds of compute, every month. 
For more announcements, discussion, and commun",2026-04-13T22:36:00.000Z,whats-new,,0.2,False,"Release notes and 'what's new' summary; primarily announcements and high-level feature changes, not detailed limits, configuration matrices, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/dtu-benchmark?view=azuresql,DTU benchmark,DTU Benchmark - Azure SQL Database,Understand DTU benchmark characteristics for Azure SQL Database,Learn about the benchmark for the DTU-based purchasing model for Azure SQL Database.,"Applies to: Azure SQL Database A database transaction unit (DTU) is a unit of measure representing a blended measure of CPU, memory, reads, and writes. Physical characteristics (CPU, memory, IO) associated with each DTU measure are calibrated using a benchmark that simulates real-world database workload. This article summarizes the DTU benchmark and shares information about the schema, transaction types used, workload mix, users and pacing, scaling rules, and metrics associated with the benchmark",2025-06-13T08:00:00.000Z,concept-article,configuration,0.65,True,"Details schema, workload mix, scaling rules, and metrics used in the DTU benchmark; these include product-specific metric names and behaviors that go beyond generic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/dynamic-data-masking-configure-portal?view=azuresql,Configure dynamic data masking,Dynamic Data Masking - Azure SQL Database,Set up dynamic data masking in Azure SQL portal,How to get started with Azure SQL Database dynamic data masking in the Azure portal.,"Applies to: Azure SQL Database This article shows you how to implement dynamic data masking with the Azure portal. You can also implement dynamic data masking using Azure SQL Database PowerShell cmdlets or the REST API. Note This feature cannot be set using the Azure portal for Azure SQL Managed Instance (use PowerShell or REST API). 
For more information, see Dynamic Data Masking.",2025-12-31T23:38:00.000Z,how-to,security,0.75,True,Dynamic Data Masking configuration involves specific masking rules and options unique to Azure SQL; security-focused configuration.,unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/dynamic-data-masking-overview?view=azuresql,Dynamic data masking,Dynamic Data Masking - Azure SQL Database & Azure SQL Managed Instance & Azure Synapse Analytics,Configure dynamic data masking in Azure SQL,"Dynamic data masking (DDM) limits sensitive data exposure by masking it to nonprivileged users for Azure SQL Database, Azure SQL Managed Instance, and Azure Synapse Analytics.","Applies to: Azure SQL Database, Azure SQL Managed Instance, Azure Synapse Analytics (dedicated SQL pools only), SQL database in Fabric Azure SQL Database, SQL database in Microsoft Fabric, Azure SQL Managed Instance, and Azure Synapse Analytics support dynamic data masking (DDM). Dynamic data masking limits sensitive data exposure by masking it to nonprivileged users. Dynamic data masking helps prevent unauthorized access to sensitive data by enabling customers to designate how much of the sensitive da",2025-11-25T18:34:00.000Z,concept-article,security,0.7,True,Dynamic Data Masking feature for Azure SQL and Synapse; product-specific security configuration for masking sensitive data.,unchanged @@ -292,7 +292,7 @@ https://learn.microsoft.com/en-us/azure/azure-sql/database/security-best-practic https://learn.microsoft.com/en-us/azure/azure-sql/database/security-controls-policy?view=azuresql,Security controls by Azure Policy,Azure Policy Regulatory Compliance Controls - Azure SQL Database & SQL Managed Instance,Use Azure Policy regulatory controls for Azure SQL compliance,Lists Azure Policy Regulatory Compliance controls available for Azure SQL Database and SQL Managed Instance. 
These built-in policy definitions provide common approaches to managing the compliance of y,"Applies to: Azure SQL Database, Azure SQL Managed Instance. Regulatory Compliance in Azure Policy provides Microsoft-created and -managed initiative definitions, known as built-ins, for the compliance domains and security controls related to different compliance standards. This page lists the compliance domains and security controls for Azure SQL Database and SQL Managed Instance. You can assign the built-ins for a security control individually to help make your Azure resources compliant with the specific standa",2025-06-13T08:00:00.000Z,sample,security,0.8,True,Lists specific Azure Policy built-in definitions and compliance controls for Azure SQL; product-specific security/compliance configuration knowledge.,unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/security-overview?view=azuresql,Overview,Security Overview - Azure SQL Database & Azure SQL Managed Instance & Azure Synapse Analytics,,"Learn about security in Azure SQL Database and Azure SQL Managed Instance and Azure Synapse Analytics, including how it differs from SQL Server.","Applies to: Azure SQL Database, Azure SQL Managed Instance, Azure Synapse Analytics. This article outlines the basics of securing the data tier of an application that uses Azure SQL Database, Azure SQL Managed Instance, and Azure Synapse Analytics.
The security strategy described in this article follows the layered defense-in-depth approach as shown in the following diagram, and moves from the outside in:",2026-01-23T08:00:00.000Z,concept-article,,0.3,False,Security overview and defense-in-depth concept; lacks specific RBAC roles or configuration parameters in the summary.,unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/security-server-roles?view=azuresql,Server roles,Server Roles - Azure SQL Database,Use fixed server roles on Azure SQL logical servers,This article provides an overview of server roles for the logical server of Azure SQL Database.,Applies to: Azure SQL Database. This article describes fixed server-level roles in Azure SQL Database. Note The fixed server-level roles in this article are in public preview for Azure SQL Database. These server-level roles are also part of the release for SQL Server 2022.,2025-06-30T08:00:00.000Z,concept-article,security,0.7,True,"Describes Azure SQL logical server fixed server-level roles (names and semantics) which are specific RBAC-like permissions for this product, qualifying as security configuration knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-sql/database/serverless-tier-overview?view=azuresql,Serverless,Serverless compute tier - Azure SQL Database,Choose and configure Azure SQL serverless compute tier,This article describes the new serverless compute tier and compares it with the existing provisioned compute tier for Azure SQL Database.,Applies to:Azure SQL Database Serverless is acompute tierfor single databases in Azure SQL Database that automatically scales compute based on workload demand and bills for the amount of compute used per second. The serverless compute tier also automatically pauses databases during inactive periods when only storage is billed and automatically resumes databases when activity returns.
The serverless compute tier is available in theGeneral Purposeservice tier and theHyperscaleservice tier. Current,2026-04-14T08:00:00.000Z,concept-article,decision-making,0.74,True,"The page compares serverless and provisioned compute tiers for Azure SQL Database with concrete, product-specific details such as when to use each tier, billing behavior (per-second compute, paused state billing), and availability by service tier (General Purpose, Hyperscale). This is explicit decision guidance between options rather than just a conceptual overview, matching the decision-making sub-skill.",updated
+https://learn.microsoft.com/en-us/azure/azure-sql/database/serverless-tier-overview?view=azuresql,Serverless,Serverless compute tier - Azure SQL Database,Choose and configure Azure SQL serverless compute tier,This article describes the new serverless compute tier and compares it with the existing provisioned compute tier for Azure SQL Database.,Applies to: Azure SQL Database. Serverless is a compute tier for single databases in Azure SQL Database that automatically scales compute based on workload demand and bills for the amount of compute used per second. The serverless compute tier also automatically pauses databases during inactive periods when only storage is billed and automatically resumes databases when activity returns. The serverless compute tier is available in the General Purpose service tier and the Hyperscale service tier. Current,2026-04-14T08:00:00.000Z,concept-article,decision-making,0.74,True,"The page compares serverless and provisioned compute tiers for Azure SQL Database with concrete, product-specific details such as when to use each tier, billing behavior (per-second compute, paused state billing), and availability by service tier (General Purpose, Hyperscale).
This is explicit decision guidance between options rather than just a conceptual overview, matching the decision-making sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/service-tier-hyperscale-frequently-asked-questions-faq?view=azuresql,FAQ,Azure SQL Database Hyperscale FAQ - Azure SQL,Operational limits and behaviors for Azure SQL Hyperscale,Answers to common questions customers ask about a database in SQL Database in the Hyperscale service tier - commonly called a Hyperscale database.,"Applies to: Azure SQL Database. This article provides answers to frequently asked questions for customers considering a database in the Azure SQL Database Hyperscale service tier, referred to as just Hyperscale in the remainder of this FAQ. This article describes the scenarios that Hyperscale supports and the features that are compatible with Hyperscale.",2026-03-18T11:48:00Z,faq,limits-quotas,0.7,True,"FAQ for Hyperscale typically includes concrete numeric constraints (e.g., max database size, number of replicas, backup/restore behaviors, scaling characteristics) and other operational specifics that go beyond conceptual description, fitting limits-quotas best among the available categories.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/service-tier-hyperscale-replicas?view=azuresql,Replicas,Hyperscale secondary replicas - Azure SQL Database,Choose and use Hyperscale secondary replica types,This article describes the different types of secondary replicas available in the Hyperscale service tier.,"Applies to: Azure SQL Database. As described in Distributed functions architecture, Azure SQL Database Hyperscale has two different types of compute nodes, also referred to as replicas: Secondary replicas are always read-only, and can be of three different types: Each type has a different architecture, feature set, purpose, and cost. Based on the features you need, you can use just one or even all of the three together.
You can use secondary replicas with either serverless or provisioned compute tier",2025-10-17T08:00:00.000Z,overview,decision-making,0.7,True,"Explains three types of Hyperscale secondary replicas, their architectures, feature sets, and purposes, guiding selection based on scenario and cost.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/service-tier-hyperscale?view=azuresql,Overview,What is the Hyperscale service tier? - Azure SQL Database,,This article describes the Hyperscale service tier in the vCore-based purchasing model in Azure SQL Database and explains how it's different from the General Purpose and Business Critical service tier,Applies to: Azure SQL Database. Azure SQL Database is based on SQL Server Database Engine architecture that is adjusted for the cloud environment to ensure high availability even in cases of infrastructure failures. There are three service tier choices in the vCore purchasing model for Azure SQL Database: The Hyperscale service tier is suitable for all workload types. Its cloud-native architecture provides independently scalable compute and storage to support the widest variety of traditional and mo,2026-03-09T08:00:00.000Z,concept-article,,0.2,False,"Primarily a conceptual/marketing-style overview of the Hyperscale service tier and how it differs from other tiers.
While it may mention capabilities and some numbers, it is not organized as limits/quotas, configuration, or decision matrices with explicit thresholds.",unchanged @@ -329,8 +329,8 @@ https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encr https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-database-level-overview?view=azuresql,Database level CMK,Transparent data encryption (TDE) with database level customer-managed keys - Azure SQL Database,Use database-level TDE with customer-managed keys in Azure SQL,Overview of customer managed keys (CMK) support for transparent data encryption (TDE) with Azure Key Vault for Azure SQL Database at a database level granularity.,"Applies to: Azure SQL Database. This article describes transparent data encryption (TDE) with customer-managed keys at the database level for Azure SQL Database. Note Database Level TDE CMK is available for Azure SQL Database (all SQL Database editions). It is not available for Azure SQL Managed Instance, SQL Server on-premises, Azure VMs, and Azure Synapse Analytics (dedicated SQL pools (formerly SQL DW)).
Note Microsoft Entra ID was previously known as Azure Active Directory (Azure AD).",2025-07-02T17:39:00.000Z,concept-article,security,0.7,True,"Product-specific security configuration for TDE with CMK at database level, including supported platforms and scope; this is concrete Azure SQL security behavior not covered by generic TDE knowledge.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-identity?view=azuresql,Managed identities with CMK,Customer-managed keys with transparent data encryption using user-assigned managed identity - Azure SQL Database & Azure SQL Managed Instance,Configure TDE with user-assigned managed identity in Azure SQL,Bring Your Own Key (BYOK) support for transparent data encryption (TDE) using user-assigned managed identity (UMI),"Applies to: Azure SQL Database, Azure SQL Managed Instance. Microsoft Entra ID, formerly Azure Active Directory, provides an automatically managed identity to authenticate to any Azure service that supports Microsoft Entra authentication, such as Azure Key Vault, without exposing credentials in the code. For more information, see Managed identity types in Azure. Managed Identities can be of two types: For more information, see Managed identities in Microsoft Entra ID for Azure SQL.
For TDE with customer-m",2025-07-02T17:39:00.000Z,concept-article,security,0.8,True,Explains using user-assigned managed identities with customer-managed TDE; product-specific identity and key management configuration.,unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-key-rotation?view=azuresql,Rotate TDE BYOK keys,Rotate TDE protector (PowerShell & the Azure CLI) - Azure SQL Database & Azure SQL Managed Instance & Azure Synapse Analytics,Rotate Azure SQL TDE protector using PowerShell and CLI,"Learn how to rotate the Transparent data encryption (TDE) protector for a server in Azure used by Azure SQL Database, Azure SQL Managed Instance, and Azure Synapse Analytics using PowerShell and the A","Applies to: Azure SQL Database, Azure SQL Managed Instance, Azure Synapse Analytics (dedicated SQL pools only). This article describes key rotation for a server using a TDE protector from Azure Key Vault. Rotating the logical TDE protector for a server means to switch to a new asymmetric key that protects the databases on the server. Key rotation is an online operation and should only take a few seconds to complete, because this only decrypts and re-encrypts the database's data encryption key, not the e
Includes concrete security operations and commands for changing the protector key, which are product-specific security configuration patterns rather than generic key-rotation theory.",unchanged -https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-overview?view=azuresql,Bring Your Own Key (BYOK),Customer-managed transparent data encryption (TDE) - Azure SQL Database & Azure SQL Managed Instance & Azure Synapse Analytics,Set up customer-managed TDE with Azure Key Vault,"Bring Your Own Key (BYOK) support for transparent data encryption (TDE) with Azure Key Vault for SQL Database and Azure Synapse Analytics. TDE with BYOK overview, benefits, how it works, consideration","Applies to:Azure SQL DatabaseAzure SQL Managed InstanceAzure Synapse Analytics (dedicated SQL pools only) Transparent data encryption (TDE)in Azure SQL with customer-managed key (CMK) enables Bring Your Own Key (BYOK) scenario for data protection at rest, and allows organizations to implement separation of duties in the management of keys and data. 
With customer-managed TDE, the customer is responsible for and in a full control of a key lifecycle management (key creation, upload, rotation, delet",2026-03-05T08:00:00.000Z,concept-article,security,0.85,True,"Details Bring Your Own Key (BYOK) for TDE using Azure Key Vault, including key lifecycle responsibilities and separation of duties; this is product-specific encryption and key management configuration, fitting the security category.",unchanged -https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-remove-tde-protector?view=azuresql,Remove TDE protector,Remove TDE protector (PowerShell & Azure CLI) - Azure SQL Database & Azure Synapse Analytics,Respond to compromised Azure SQL TDE protector key,Learn how to respond to a potentially compromised TDE protector for Azure SQL Database or Azure Synapse Analytics using TDE with Bring Your Own Key (BYOK) support.,"Applies to:Azure SQL DatabaseAzure SQL Managed InstanceAzure Synapse Analytics (dedicated SQL pools only) This article describes how to respond to a potentially compromised TDE protect for Azure SQL Database or Azure Synapse Analytics that is using TDE with customer-managed keys in Azure Key Vault - Bring Your Own Key (BYOK) support. To learn more about BYOK support for TDE, see theoverview page. Caution The procedures outlined in this article should only be done in extreme cases or in test envi",2026-03-05T08:00:00.000Z,how-to,security,0.74,True,"Guides how to handle a potentially compromised TDE protector when using BYOK, including specific steps and commands to remove/replace the protector and related Key Vault actions. 
This is product-specific security incident response/configuration, not generic troubleshooting or conceptual content.",unchanged
+https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-overview?view=azuresql,Bring Your Own Key (BYOK),Customer-managed transparent data encryption (TDE) - Azure SQL Database & Azure SQL Managed Instance & Azure Synapse Analytics,Configure customer-managed TDE with Azure Key Vault,"Bring Your Own Key (BYOK) support for transparent data encryption (TDE) with Azure Key Vault for SQL Database and Azure Synapse Analytics. TDE with BYOK overview, benefits, how it works, consideration","Applies to: Azure SQL Database, Azure SQL Managed Instance, Azure Synapse Analytics (dedicated SQL pools only). Transparent data encryption (TDE) in Azure SQL with customer-managed key (CMK) enables the Bring Your Own Key (BYOK) scenario for data protection at rest, and allows organizations to implement separation of duties in the management of keys and data. With customer-managed TDE, the customer is responsible for, and in full control of, key lifecycle management (key creation, upload, rotation, delet",2026-04-22T08:00:00.000Z,concept-article,security,0.7,True,"The BYOK TDE page goes beyond a conceptual encryption overview and includes product-specific security configuration details such as use of customer-managed keys in Azure Key Vault, separation-of-duties implications, and concrete recommendations/considerations for Azure SQL Database, Managed Instance, and Synapse dedicated SQL pools.
While primarily an overview, it contains configuration-oriented security guidance specific to these services rather than generic crypto concepts.",updated
+https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-remove-tde-protector?view=azuresql,Remove TDE protector,Remove TDE protector (PowerShell & Azure CLI) - Azure SQL Database & Azure Synapse Analytics,Rotate and remove BYOK TDE protector for Azure SQL,Learn how to respond to a potentially compromised TDE protector for Azure SQL Database or Azure Synapse Analytics using TDE with Bring Your Own Key (BYOK) support.,"Applies to: Azure SQL Database, Azure SQL Managed Instance, Azure Synapse Analytics (dedicated SQL pools only). This article describes how to respond to a potentially compromised TDE protector for Azure SQL Database or Azure Synapse Analytics that is using TDE with customer-managed keys in Azure Key Vault - Bring Your Own Key (BYOK) support. To learn more about BYOK support for TDE, see the overview page. Caution The procedures outlined in this article should only be done in extreme cases or in test envi",2026-04-23T17:37:00.000Z,how-to,security,0.78,True,"The article gives step-by-step, product-specific security guidance for responding to a compromised TDE protector using customer-managed keys in Azure Key Vault, including exact PowerShell and Azure CLI commands, required parameters, and sequencing needed to safely remove/rotate the protector for Azure SQL Database, Managed Instance, and Synapse.
This is concrete security configuration and incident-response procedure rather than a conceptual overview.",updated https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-tde-overview?view=azuresql,Overview,Transparent Data Encryption - Azure SQL Database & Azure SQL Managed Instance & Azure Synapse Analytics,Configure transparent data encryption for Azure SQL databases,"An overview of transparent data encryption for Azure SQL Database, Azure SQL Managed Instance, and Azure Synapse Analytics. The document covers its benefits and the options for configuration, which in","Applies to: Azure SQL Database, Azure SQL Managed Instance, Azure Synapse Analytics. Transparent data encryption (TDE) helps protect Azure SQL Database, Azure SQL Managed Instance, and Azure Synapse Analytics against the threat of malicious offline activity by encrypting data at rest. It performs real-time encryption and decryption of the database, associated backups, and transaction log files at rest without requiring changes to the application.
By default, TDE is enabled for all newly deployed Azure ",2025-11-25T18:34:00.000Z,concept-article,security,0.8,True,TDE overview including default behavior and configuration options (service-managed vs BYOK); product-specific encryption settings.,unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/troubleshoot-common-connectivity-issues?view=azuresql,Common connection issues,Working with Transient Errors - Azure SQL Database,Troubleshoot transient connectivity errors for Azure SQL,"Learn how to troubleshoot, diagnose, and prevent a SQL connection error or transient error when connecting to Azure SQL Database, SQL database in Fabric, Azure SQL Managed Instance, and Azure Synapse ","Applies to: Azure SQL Database, Azure SQL Managed Instance, Azure Synapse Analytics, SQL database in Fabric. This article describes how to prevent, troubleshoot, diagnose, and mitigate connection errors and transient errors that your client application encounters when it interacts with Azure SQL Database, SQL database in Microsoft Fabric, Azure SQL Managed Instance, and Azure Synapse Analytics. Learn how to configure retry logic, build the connection string, and adjust other connection settings.",2026-03-24T08:00:00.000Z,troubleshooting,troubleshooting,0.9,True,"Describes specific connection/transient error scenarios, connection string settings, and retry logic for Azure SQL Database, Managed Instance, Synapse, and Fabric.
Organized around diagnosing and mitigating errors with product-specific guidance, fitting troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/troubleshoot-common-errors-issues?view=azuresql,Connectivity errors,Troubleshoot Common Connection Issues - Azure SQL & SQL database in Fabric,Troubleshoot common connectivity issues for Azure SQL and Fabric SQL,"Provides steps to troubleshoot connection issues and resolve other connectivity issues in Azure SQL Database, Fabric SQL database, or Azure SQL Managed Instance.","Applies to:Azure SQL DatabaseAzure SQL Managed InstanceSQL database in Fabric You receive error messages when the connection to Azure SQL Database, SQL database in Microsoft Fabric, or Azure SQL Managed Instance fails. As always, apply best practices and design guidelines during theapplication designprocess. Note You can useAzure SQL Connectivity Checkerto detect and fix a wide variety of connectivity errors.",2025-10-28T22:34:00.000Z,troubleshooting,troubleshooting,0.85,True,"Organized around connection failure symptoms and error messages with steps to diagnose and resolve them, including product-specific tools like Azure SQL Connectivity Checker.",unchanged diff --git a/products/azure-sql-database/report.md b/products/azure-sql-database/report.md index c4bb70c1..949a5b98 100644 --- a/products/azure-sql-database/report.md +++ b/products/azure-sql-database/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: decision-making: 'Guidance for choosing Azure SQL deployment, pricing, and HA/DR options: vCore vs DTU, Hyperscale, serverless, reservations, licensing, migration @@ -16,9 +16,9 @@ category_descriptions: limits-quotas: Limits, quotas, and resource caps for Azure SQL (free tiers, DTU/vCore, elastic pools, Hyperscale), plus how to request quota increases and watcher/operational constraints. 
- security: 'Securing Azure SQL: authentication (Entra, MFA, managed identity), network/firewall, - encryption (TDE, Always Encrypted), auditing/Defender, data masking/classification, - and compliance best practices.' + security: 'Configuring Azure SQL security: authentication (Entra, MI, MFA), network/firewall, + auditing, encryption (TDE, Always Encrypted), threat protection, compliance, and + best‑practice hardening.' architecture-patterns: 'Architectural patterns for Azure SQL: geo-replication, HA/DR, backups, connectivity, sharding/elastic scale-out, multi-tenant SaaS models, and rolling upgrade/failover designs.' @@ -31,17 +31,17 @@ category_descriptions: skill_description: Expert knowledge for Azure SQL Database development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when - choosing vCore/DTU/Hyperscale/serverless, configuring geo‑replication, elastic pools, - Data Sync, or HA/DR, and other Azure SQL Database related development tasks. Not - for Azure SQL Managed Instance (use azure-sql-managed-instance), SQL Server on Azure - Virtual Machines (use azure-sql-virtual-machines), Azure Cosmos DB (use azure-cosmos-db), - Azure Database for PostgreSQL (use azure-database-postgresql). -use_when: Use when choosing vCore/DTU/Hyperscale/serverless, configuring geo‑replication, - elastic pools, Data Sync, or HA/DR, and other Azure SQL Database related development - tasks. -confusable_not_for: Not for Azure SQL Managed Instance (use azure-sql-managed-instance), - SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines), Azure Cosmos - DB (use azure-cosmos-db), Azure Database for PostgreSQL (use azure-database-postgresql). + choosing vCore/DTU/Hyperscale, configuring geo‑replication, elastic pools, Data + Sync, or security (TDE/Always Encrypted), and other Azure SQL Database related development + tasks. 
Not for Azure Database for MariaDB (use azure-database-mariadb), Azure Database + for MySQL (use azure-database-mysql), Azure Database for PostgreSQL (use azure-database-postgresql), + Azure SQL Managed Instance (use azure-sql-managed-instance). +use_when: Use when choosing vCore/DTU/Hyperscale, configuring geo‑replication, elastic + pools, Data Sync, or security (TDE/Always Encrypted), and other Azure SQL Database + related development tasks. +confusable_not_for: Not for Azure Database for MariaDB (use azure-database-mariadb), + Azure Database for MySQL (use azure-database-mysql), Azure Database for PostgreSQL + (use azure-database-postgresql), Azure SQL Managed Instance (use azure-sql-managed-instance). --- # Azure SQL Database Crawl Report @@ -55,8 +55,8 @@ confusable_not_for: Not for Azure SQL Managed Instance (use azure-sql-managed-in ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 4 -- **Unchanged**: 358 +- **Updated Pages**: 2 +- **Unchanged**: 360 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-sql-database/azure-sql-database.csv` @@ -79,14 +79,10 @@ confusable_not_for: Not for Azure SQL Managed Instance (use azure-sql-managed-in ### Updated Pages -- [What's new?](https://learn.microsoft.com/en-us/azure/azure-sql/database/doc-changes-updates-release-notes-whats-new?view=azuresql) - - Updated: 2026-03-26T22:35:00.000Z → 2026-04-13T22:36:00.000Z -- [Create users using service principals](https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-aad-service-principal-tutorial?view=azuresql) - - Updated: 2026-03-18T08:00:00.000Z → 2026-04-14T17:42:00.000Z -- [Serverless](https://learn.microsoft.com/en-us/azure/azure-sql/database/serverless-tier-overview?view=azuresql) - - Updated: 2026-02-12T08:00:00.000Z → 2026-04-14T08:00:00.000Z -- [Auditing overview](https://learn.microsoft.com/en-us/azure/azure-sql/database/auditing-overview?view=azuresql) - - Updated: 2026-04-03T17:39:00.000Z → 
2026-04-15T22:37:00.000Z +- [Bring Your Own Key (BYOK)](https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-overview?view=azuresql) + - Updated: 2026-03-05T08:00:00.000Z → 2026-04-22T08:00:00.000Z +- [Remove TDE protector](https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-remove-tde-protector?view=azuresql) + - Updated: 2026-03-05T08:00:00.000Z → 2026-04-23T17:37:00.000Z ## Classified Pages @@ -103,7 +99,6 @@ confusable_not_for: Not for Azure SQL Managed Instance (use azure-sql-managed-in | [Transaction log errors in Azure SQL Managed Instance](https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/troubleshoot-transaction-log-errors-issues?view=azuresql-mi) | troubleshooting | 0.90 | Covers specific transaction log full errors (9002, 40552) on Azure SQL Managed Instance with tailored diagnosis and remediation steps. This is product-specific error handling content, matching the troubleshooting category. | | [Troubleshoot geo-replication redo lag](https://learn.microsoft.com/en-us/azure/azure-sql/database/troubleshoot-geo-replication-redo?view=azuresql) | troubleshooting | 0.86 | The page focuses on understanding and troubleshooting geo-replication and redo lag, which implies product-specific metrics, behaviors, and diagnostic steps for Azure SQL Database. It likely includes concrete symptom → cause → resolution guidance (for example, interpreting redo queue/lag, tuning, and mitigation steps) that go beyond generic replication concepts and constitute expert troubleshooting knowledge. | | [Auditing best practices](https://learn.microsoft.com/en-us/azure/azure-sql/database/auditing-best-practices?view=azuresql) | best-practices | 0.85 | Provides concrete recommendations for configuring auditing in production, including what to audit and how, tailored to Azure SQL and Synapse. 
| -| [Bring Your Own Key (BYOK)](https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-overview?view=azuresql) | security | 0.85 | Details Bring Your Own Key (BYOK) for TDE using Azure Key Vault, including key lifecycle responsibilities and separation of duties; this is product-specific encryption and key management configuration, fitting the security category. | | [Configure Azure Attestation](https://learn.microsoft.com/en-us/azure/azure-sql/database/always-encrypted-enclaves-configure-attestation?view=azuresql) | security | 0.85 | Details how to create and configure an attestation provider, recommended attestation policy, and attestation URL for Intel SGX enclaves; highly product-specific security configuration. | | [Connectivity errors](https://learn.microsoft.com/en-us/azure/azure-sql/database/troubleshoot-common-errors-issues?view=azuresql) | troubleshooting | 0.85 | Organized around connection failure symptoms and error messages with steps to diagnose and resolve them, including product-specific tools like Azure SQL Connectivity Checker. | | [Create server with Microsoft Entra-only authentication enabled](https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-azure-ad-only-authentication-create-server?view=azuresql) | security | 0.85 | How-to guide for provisioning Azure SQL logical servers/managed instances with Microsoft Entra-only authentication. It necessarily includes specific configuration flags/parameters in ARM/CLI/Portal, and describes how SQL authentication is disabled and Entra principals are enforced. This is product-specific identity and access configuration, fitting the security sub-skill. 
| @@ -155,6 +150,7 @@ confusable_not_for: Not for Azure SQL Managed Instance (use azure-sql-managed-in | [Known issues with Azure SQL Managed Instance](https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/doc-changes-updates-known-issues?view=azuresql) | troubleshooting | 0.78 | A 'known issues' page for a specific service typically enumerates concrete symptoms, affected features, and product-specific workarounds or resolutions. These are current, version-specific behaviors that LLMs are unlikely to know from training and map directly to troubleshooting knowledge (symptom → workaround/resolution), rather than generic concepts. | | [Managed identity](https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-azure-ad-user-assigned-managed-identity?view=azuresql) | security | 0.78 | Describes system-assigned and user-assigned managed identities for Azure SQL, including how they are created/assigned and used with Entra authentication. Such pages typically contain specific configuration steps, identity types, and required permissions/roles, which are product-specific security and identity configuration details. | | [Microsoft Entra authentication](https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-aad-configure?view=azuresql) | security | 0.78 | Configuration-focused article for Entra (Azure AD) authentication to Azure SQL Database, Managed Instance, and Synapse. It typically includes specific steps, T-SQL commands, and Entra/SQL role mappings and settings (e.g., contained database users, server principals, specific permission scopes), which are product-specific security configuration details rather than generic concepts. 
| +| [Remove TDE protector](https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-remove-tde-protector?view=azuresql) | security | 0.78 | The article gives step-by-step, product-specific security guidance for responding to a compromised TDE protector using customer-managed keys in Azure Key Vault, including exact PowerShell and Azure CLI commands, required parameters, and sequencing needed to safely remove/rotate the protector for Azure SQL Database, Managed Instance, and Synapse. This is concrete security configuration and incident-response procedure rather than a conceptual overview. | | [Identity and key management with database level CMK](https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-database-level-basic-actions?view=azuresql) | security | 0.76 | How-to for creating, updating, and using database-level CMKs for TDE with user-assigned managed identity and cross-tenant Key Vault access. Contains detailed, product-specific security configuration patterns (identity setup, key usage at database scope) rather than generic encryption concepts. | | [Auditing Microsoft support operations](https://learn.microsoft.com/en-us/azure/azure-sql/database/auditing-microsoft-support-operations?view=azuresql) | security | 0.75 | Explains how to enable and use auditing specifically for Microsoft support engineer operations, including what actions are logged; product-specific security/auditing configuration. | | [Auditing setup](https://learn.microsoft.com/en-us/azure/azure-sql/database/auditing-setup?view=azuresql) | security | 0.75 | How-to article for setting up auditing at server or database level and choosing destinations; includes product-specific security configuration steps. 
| @@ -175,7 +171,6 @@ confusable_not_for: Not for Azure SQL Managed Instance (use azure-sql-managed-in | [SSL root certificate expiring](https://learn.microsoft.com/en-us/azure/azure-sql/updates/ssl-root-certificate-expiring?view=azuresql) | security | 0.75 | Details upcoming certificate authority changes and steps to update client trust stores and connection security, which are product-specific security configuration actions. | | [SaaS design patterns](https://learn.microsoft.com/en-us/azure/azure-sql/database/saas-tenancy-app-design-patterns?view=azuresql) | architecture-patterns | 0.75 | Describes tenancy models and their impact on design and management; product-specific architecture patterns for multitenant SaaS on Azure SQL. | | [With user-assigned managed identity](https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-azure-ad-user-assigned-managed-identity-create-server?view=azuresql) | security | 0.75 | Covers managed identity configuration for server identity, including specific Azure role assignments and identity properties, which are product-specific security settings. | -| [Remove TDE protector](https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-remove-tde-protector?view=azuresql) | security | 0.74 | Guides how to handle a potentially compromised TDE protector when using BYOK, including specific steps and commands to remove/replace the protector and related Key Vault actions. This is product-specific security incident response/configuration, not generic troubleshooting or conceptual content. 
| | [Serverless](https://learn.microsoft.com/en-us/azure/azure-sql/database/serverless-tier-overview?view=azuresql) | decision-making | 0.74 | The page compares serverless and provisioned compute tiers for Azure SQL Database with concrete, product-specific details such as when to use each tier, billing behavior (per-second compute, paused state billing), and availability by service tier (General Purpose, Hyperscale). This is explicit decision guidance between options rather than just a conceptual overview, matching the decision-making sub-skill. | | [Best practices for Data Sync](https://learn.microsoft.com/en-us/azure/azure-sql/database/sql-data-sync-best-practices?view=azuresql) | best-practices | 0.72 | The page is explicitly a best-practices guide for Azure SQL Data Sync, containing product-specific recommendations on how to configure and run sync (e.g., guidance on hub/member design, sync group configuration, scheduling, and operational practices). These are concrete DO/DON'T style recommendations tied to this specific feature rather than generic database advice, so it fits the best-practices sub-skill. The summary indicates it is not just conceptual but focused on configuration and operation guidance. | | [ARM template](https://learn.microsoft.com/en-us/azure/azure-sql/database/single-database-create-arm-template-quickstart?view=azuresql) | deployment | 0.70 | ARM template quickstart includes schema, resource types, and parameter names specific to Azure SQL deployment, which are product-specific deployment details. | @@ -194,6 +189,7 @@ confusable_not_for: Not for Azure SQL Managed Instance (use azure-sql-managed-in | [Azure SQL Managed Instance](https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/region-availability?view=azuresql) | deployment | 0.70 | Centralized, product-specific matrix of which Azure SQL Managed Instance features are available in which Azure regions. This is expert, region-specific deployment capability data. 
Best aligned with deployment since it constrains where features/tiers can be deployed. | | [Azure SQL decision tree](https://learn.microsoft.com/en-us/azure/azure-sql/azure-sql-decision-tree?view=azuresql) | decision-making | 0.70 | Describes a product-specific decision tree in the Azure portal for selecting between Azure SQL Database, Managed Instance, and SQL Server on VM; this is concrete decision guidance unique to the product, even if much of the logic is embedded in the portal tool. | | [Backup immutability for LTR backups](https://learn.microsoft.com/en-us/azure/azure-sql/database/backup-immutability?view=azuresql) | security | 0.70 | Describes WORM immutability semantics for Azure SQL LTR backups and how they protect against deletion/modification, which is product-specific security/compliance configuration. | +| [Bring Your Own Key (BYOK)](https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-overview?view=azuresql) | security | 0.70 | The BYOK TDE page goes beyond conceptual encryption overview and includes product-specific security configuration details such as use of customer-managed keys in Azure Key Vault, separation-of-duties implications, and concrete recommendations/considerations for Azure SQL Database, Managed Instance, and Synapse dedicated SQL pools. While primarily an overview, it contains configuration-oriented security guidance specific to these services rather than generic crypto concepts. | | [Configure Hyperscale named replicas](https://learn.microsoft.com/en-us/azure/azure-sql/database/hyperscale-named-replica-configure?view=azuresql) | configuration | 0.70 | How-to article with product-specific T-SQL/portal configuration for Hyperscale named replicas (permissions, replica properties, and management operations). Contains concrete configuration patterns unique to Hyperscale named replicas rather than just conceptual description. 
| | [Configure and fail over a pooled database](https://learn.microsoft.com/en-us/azure/azure-sql/database/scripts/setup-geodr-and-failover-elastic-pool-powershell?view=azuresql) | integrations | 0.70 | Shows exact Az PowerShell cmdlets and parameter values to configure active geo-replication for pooled databases, a concrete integration/config pattern unique to Azure SQL. | | [Configure and fail over a single database](https://learn.microsoft.com/en-us/azure/azure-sql/database/scripts/setup-geodr-and-failover-database-powershell?view=azuresql) | integrations | 0.70 | Contains specific Az PowerShell commands and parameters to set up and fail over active geo-replication for Azure SQL Database, which are product-specific integration patterns. | diff --git a/products/azure-sql-managed-instance/azure-sql-managed-instance.csv b/products/azure-sql-managed-instance/azure-sql-managed-instance.csv index 45e0962d..94d7d65d 100644 --- a/products/azure-sql-managed-instance/azure-sql-managed-instance.csv +++ b/products/azure-sql-managed-instance/azure-sql-managed-instance.csv @@ -85,7 +85,7 @@ https://learn.microsoft.com/en-us/azure/azure-sql/database/threat-detection-over https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-configure?view=azuresql,Configure TDE with BYOK,Enable SQL TDE with Azure Key Vault - Azure SQL Database & SQL Managed Instance & Azure Synapse Analytics,Enable Azure SQL TDE with Key Vault BYOK,Learn how to configure an Azure SQL Database and Azure Synapse Analytics to start using Transparent Data Encryption (TDE) for encryption-at-rest using PowerShell or Azure CLI.,"Applies to:Azure SQL DatabaseAzure SQL Managed InstanceAzure Synapse Analytics This article walks through how to use a key from Azure Key Vault for transparent data encryption (TDE) on Azure SQL Database or Azure Synapse Analytics. 
To learn more about the TDE with Azure Key Vault integration - Bring Your Own Key (BYOK) Support, visitTDE with customer-managed keys in Azure Key Vault. If you are looking for Azure portal instructions on how to enable TDE with a customer-managed key from Azure Key V",2026-03-18T11:48:00.000Z,how-to,security,0.8,True,"Article details configuring Transparent Data Encryption using customer-managed keys in Azure Key Vault for Azure SQL and Synapse. It includes specific PowerShell/CLI commands, key and vault configuration parameters, and TDE settings unique to this integration. These are concrete, product-specific security and encryption configuration steps, matching the security sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-cross-tenant?view=azuresql,Cross-tenant CMK,Cross-Tenant Customer-Managed Keys with Transparent Data Encryption - Azure SQL Database & Azure Synapse Analytics,Configure cross-tenant customer-managed keys for Azure SQL TDE,Overview of cross-tenant customer-managed keys (CMK) support using transparent data encryption (TDE),Applies to:Azure SQL Database Azure SQL Database offers support for cross-tenant customer-managed keys (CMK) withtransparent data encryption (TDE). Cross-tenant CMK expands on theBring Your Own Key (BYOK)scenario for utilizing TDE without the need to have thelogical server in Azurein the same Microsoft Entra tenant as the Azure Key Vault or Azure Managed HSM that stores the customer-managed key used to protect the server. 
You can configure TDE with CMK for Azure SQL Database for keys stored in k,2025-07-02T17:39:00.000Z,concept-article,security,0.8,True,Cross-tenant CMK with TDE is a specialized security scenario; article covers tenant/key vault relationships and configuration steps.,unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-identity?view=azuresql,Managed identities with CMK,Customer-managed keys with transparent data encryption using user-assigned managed identity - Azure SQL Database & Azure SQL Managed Instance,Use user-assigned managed identities for TDE customer-managed keys,Bring Your Own Key (BYOK) support for transparent data encryption (TDE) using user-assigned managed identity (UMI),"Applies to:Azure SQL DatabaseAzure SQL Managed Instance Microsoft Entra ID,formerly Azure Active Directory, provides an automatically managed identity to authenticate to any Azure service that supports Microsoft Entra authentication, such asAzure Key Vault, without exposing credentials in the code. For more information, seeManaged identity typesin Azure. Managed Identities can be of two types: For more information, seeManaged identities in Microsoft Entra ID for Azure SQL. ForTDE with customer-m",2025-07-02T17:39:00.000Z,concept-article,security,0.8,True,Details using user-assigned managed identities with TDE CMK; includes identity types and security configuration specific to Azure SQL.,unchanged -https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-overview?view=azuresql,Bring Your Own Key (BYOK),Customer-managed transparent data encryption (TDE) - Azure SQL Database & Azure SQL Managed Instance & Azure Synapse Analytics,Set up customer-managed TDE with Azure Key Vault,"Bring Your Own Key (BYOK) support for transparent data encryption (TDE) with Azure Key Vault for SQL Database and Azure Synapse Analytics. 
TDE with BYOK overview, benefits, how it works, consideration","Applies to:Azure SQL DatabaseAzure SQL Managed InstanceAzure Synapse Analytics (dedicated SQL pools only) Transparent data encryption (TDE)in Azure SQL with customer-managed key (CMK) enables Bring Your Own Key (BYOK) scenario for data protection at rest, and allows organizations to implement separation of duties in the management of keys and data. With customer-managed TDE, the customer is responsible for and in a full control of a key lifecycle management (key creation, upload, rotation, delet",2026-03-05T08:00:00.000Z,concept-article,security,0.85,True,"Details Bring Your Own Key (BYOK) for TDE using Azure Key Vault, including key lifecycle responsibilities and separation of duties; this is product-specific encryption and key management configuration, fitting the security category.",unchanged +https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-overview?view=azuresql,Bring Your Own Key (BYOK),Customer-managed transparent data encryption (TDE) - Azure SQL Database & Azure SQL Managed Instance & Azure Synapse Analytics,Configure customer-managed TDE with Azure Key Vault,"Bring Your Own Key (BYOK) support for transparent data encryption (TDE) with Azure Key Vault for SQL Database and Azure Synapse Analytics. TDE with BYOK overview, benefits, how it works, consideration","Applies to:Azure SQL DatabaseAzure SQL Managed InstanceAzure Synapse Analytics (dedicated SQL pools only) Transparent data encryption (TDE)in Azure SQL with customer-managed key (CMK) enables Bring Your Own Key (BYOK) scenario for data protection at rest, and allows organizations to implement separation of duties in the management of keys and data. 
With customer-managed TDE, the customer is responsible for and in a full control of a key lifecycle management (key creation, upload, rotation, delet",2026-04-22T08:00:00.000Z,concept-article,security,0.7,True,"The BYOK TDE page goes beyond conceptual encryption overview and includes product-specific security configuration details such as use of customer-managed keys in Azure Key Vault, separation-of-duties implications, and concrete recommendations/considerations for Azure SQL Database, Managed Instance, and Synapse dedicated SQL pools. While primarily an overview, it contains configuration-oriented security guidance specific to these services rather than generic crypto concepts.",updated https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-tde-overview?view=azuresql,Overview,Transparent Data Encryption - Azure SQL Database & Azure SQL Managed Instance & Azure Synapse Analytics,Enable and manage transparent data encryption in Azure SQL,"An overview of transparent data encryption for Azure SQL Database, Azure SQL Managed Instance, and Azure Synapse Analytics. The document covers its benefits and the options for configuration, which in","Applies to:Azure SQL DatabaseAzure SQL Managed InstanceAzure Synapse Analytics Transparent data encryption (TDE)helps protect Azure SQL Database, Azure SQL Managed Instance, and Azure Synapse Analytics against the threat of malicious offline activity by encrypting data at rest. It performs real-time encryption and decryption of the database, associated backups, and transaction log files at rest without requiring changes to the application. 
By default, TDE is enabled for all newly deployed Azure ",2025-11-25T18:34:00.000Z,concept-article,security,0.8,True,TDE overview for Azure SQL with configuration options (service-managed vs BYOK); product-specific encryption behavior and defaults.,unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/troubleshoot-common-connectivity-issues?view=azuresql,Common connection issues,Working with Transient Errors - Azure SQL Database,Troubleshoot transient connectivity errors for Azure SQL,"Learn how to troubleshoot, diagnose, and prevent a SQL connection error or transient error when connecting to Azure SQL Database, SQL database in Fabric, Azure SQL Managed Instance, and Azure Synapse ","Applies to:Azure SQL DatabaseAzure SQL Managed InstanceAzure Synapse AnalyticsSQL database in Fabric This article describes how to prevent, troubleshoot, diagnose, and mitigate connection errors and transient errors that your client application encounters when it interacts with Azure SQL Database, SQL database in Microsoft Fabric, Azure SQL Managed Instance, and Azure Synapse Analytics. Learn how to configure retry logic, build the connection string, and adjust other connection settings.",2026-03-24T08:00:00.000Z,troubleshooting,troubleshooting,0.9,True,"Describes specific connection/transient error scenarios, connection string settings, and retry logic for Azure SQL Database, Managed Instance, Synapse, and Fabric. 
Organized around diagnosing and mitigating errors with product-specific guidance, fitting troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/database/troubleshoot-common-errors-issues?view=azuresql,Connectivity errors,Troubleshoot Common Connection Issues - Azure SQL & SQL database in Fabric,Troubleshoot common connection issues for Azure SQL,"Provides steps to troubleshoot connection issues and resolve other connectivity issues in Azure SQL Database, Fabric SQL database, or Azure SQL Managed Instance.","Applies to:Azure SQL DatabaseAzure SQL Managed InstanceSQL database in Fabric You receive error messages when the connection to Azure SQL Database, SQL database in Microsoft Fabric, or Azure SQL Managed Instance fails. As always, apply best practices and design guidelines during theapplication designprocess. Note You can useAzure SQL Connectivity Checkerto detect and fix a wide variety of connectivity errors.",2025-10-28T22:34:00.000Z,troubleshooting,troubleshooting,0.85,True,"Explicitly about connection errors; includes specific error messages, causes, and steps to resolve, which is classic symptom→cause→solution troubleshooting content.",unchanged @@ -105,8 +105,8 @@ https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/advance-notif https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/alerts-create?view=azuresql,Create alerts on SQL MI,Setup Alerts and Notifications in Azure Portal - Azure SQL Managed Instance,Set up alerts and notifications for Azure SQL Managed Instance,"Use the Azure portal to create Azure SQL Managed Instance alerts, which can trigger notifications or automation when the conditions you specify are met.",Applies to:Azure SQL Managed Instance This article shows you how to set up alerts for databases in Azure SQL Managed Instance using the Azure portal. 
This article also provides best practices for setting alert rules.,2025-08-28T22:34:00.000Z,how-to,best-practices,0.7,True,Shows how to configure alerts plus provides best practices for alert rules specific to Managed Instance metrics and behaviors.,unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/api-references-create-manage-instance?view=azuresql,API reference,Management API reference for Azure SQL Managed Instance - Azure SQL Managed Instance,API options to create and configure SQL Managed Instance,Learn about creating and configuring managed instances of Azure SQL Managed Instance.,"Applies to:Azure SQL Managed Instance You can create and configure managed instances of Azure SQL Managed Instance using the Azure portal, PowerShell, Azure CLI, REST API, and Transact-SQL. In this article, you can find an overview of the functions and the API that you can use to create and configure managed instances.",2023-06-28T22:15:00.000Z,reference,configuration,0.7,True,"Summarizes PowerShell, CLI, REST, and T-SQL APIs for instance creation and configuration; includes product-specific API operations and parameters.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/arm-templates-content-guide?view=azuresql,Azure Resource Manager,Azure Resource Manager templates - Azure SQL Managed Instance,Deploy SQL Managed Instance with ARM templates,Use Azure Resource Manager templates to create and configure Azure SQL Managed Instance.,Applies to:Azure SQL Managed Instance Azure Resource Manager templates enable you to define your infrastructure as code and deploy your solutions to the Azure cloud for Azure SQL Managed Instance.,2024-04-30T22:39:00.000Z,concept-article,configuration,0.65,True,"ARM template content guide will reference specific resource types, properties, and allowed values for Managed Instance deployments, which are configuration details.",unchanged 
-https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/auditing-configure?view=azuresql,How to configure SQL Server Audit in Azure SQL Managed Instance,Configure Auditing - Azure SQL Managed Instance,T-SQL configuration of auditing for Azure SQL Managed Instance,Learn how to get started with Azure SQL Managed Instance auditing using Transact-SQL (T-SQL).,Applies to:Azure SQL Managed Instance This article teaches you to configureauditing with SQL Server Audit in Azure SQL Managed Instance. Auditing tracks database events and writes them to an audit log in your Azure storage account.,2026-04-15T22:37:00.000Z,how-to,configuration,0.8,True,"This article explicitly teaches how to configure auditing using T-SQL, which implies concrete configuration statements, parameter names, and allowed values for SQL Server Audit on Managed Instance. That aligns with the configuration sub-skill (detailed settings and parameters), and is more about config than general security concepts.",new -https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/auditing?view=azuresql,SQL Server Audit in Azure SQL Managed Instance,SQL Server Audit - Azure SQL Managed Instance,Configure SQL Server Audit on Azure SQL Managed Instance,Learn about SQL Server Audit in Azure SQL Managed Instance.,"Applies to:Azure SQL Managed Instance You can configureSQL Server Auditin Azure SQL Managed Instance. To get started configuring SQL Server Audit in Azure SQL Managed Instance, seeGet started with Azure SQL Managed Instance auditing.",2026-04-15T22:37:00.000Z,concept-article,security,0.7,True,"Auditing is a security/IAM feature; this page is about SQL Server Audit on Managed Instance. The full article (beyond the brief summary) is expected to contain product-specific audit configuration details (audit action groups, storage targets, and security-related settings) that go beyond conceptual overview, fitting the security sub-skill. 
It is not about limits, deployment, or generic concepts.",new +https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/auditing-configure?view=azuresql,How to configure SQL Server Audit in Azure SQL Managed Instance,Configure Auditing - Azure SQL Managed Instance,T-SQL configuration of auditing for Azure SQL Managed Instance,Learn how to get started with Azure SQL Managed Instance auditing using Transact-SQL (T-SQL).,Applies to:Azure SQL Managed Instance This article teaches you to configureauditing with SQL Server Audit in Azure SQL Managed Instance. Auditing tracks database events and writes them to an audit log in your Azure storage account.,2026-04-15T22:37:00.000Z,how-to,configuration,0.8,True,"This article explicitly teaches how to configure auditing using T-SQL, which implies concrete configuration statements, parameter names, and allowed values for SQL Server Audit on Managed Instance. That aligns with the configuration sub-skill (detailed settings and parameters), and is more about config than general security concepts.",unchanged +https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/auditing?view=azuresql,SQL Server Audit in Azure SQL Managed Instance,SQL Server Audit - Azure SQL Managed Instance,Configure SQL Server Audit on Azure SQL Managed Instance,Learn about SQL Server Audit in Azure SQL Managed Instance.,"Applies to:Azure SQL Managed Instance You can configureSQL Server Auditin Azure SQL Managed Instance. To get started configuring SQL Server Audit in Azure SQL Managed Instance, seeGet started with Azure SQL Managed Instance auditing.",2026-04-15T22:37:00.000Z,concept-article,security,0.7,True,"Auditing is a security/IAM feature; this page is about SQL Server Audit on Managed Instance. 
The full article (beyond the brief summary) is expected to contain product-specific audit configuration details (audit action groups, storage targets, and security-related settings) that go beyond conceptual overview, fitting the security sub-skill. It is not about limits, deployment, or generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/authentication-azure-ad-user-assigned-managed-identity-create-managed-instance?view=azuresql,With user-assigned managed identity,Create an Azure SQL Managed Instance Using a User-Assigned Managed Identity - Azure SQL Managed Instance,Create Azure SQL Managed Instance with user-assigned identity,This article guides you through creating an Azure SQL Managed Instance using a user-assigned managed identity.,"Applies to:Azure SQL Managed Instance This how-to guide outlines the steps to create anAzure SQL Managed Instancewith auser-assigned managed identityfrom Microsoft Entra ID (formerly Azure Active Directory). For more information on the benefits of using a user-assigned managed identity for the server identity in Azure SQL Database, seeManaged identities in Microsoft Entra for Azure SQL. Note Microsoft Entra IDwas previously known as Azure Active Directory (Azure AD).",2026-03-18T11:48:00.000Z,how-to,security,0.7,True,"Security-focused how-to that configures a user-assigned managed identity from Microsoft Entra ID for the managed instance. 
Involves product-specific identity configuration steps and parameters, which qualify as security expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/automated-backups-change-settings?view=azuresql,Change backup settings,Change automated backup settings - Azure SQL Managed Instance,Change automated backup retention and redundancy for Azure SQL Managed Instance,"Change point-in-time restore and backup redundancy options for automatic backups in Azure SQL Managed Instance by using the Azure portal, the Azure CLI, Azure PowerShell, and the REST API.","Applies to:Azure SQL Managed Instance This article provides examples to modifyautomated backupsettings for Azure SQL Managed Instance, such as the short-term retention policy and the backup storage redundancy option that's used for backups. For Azure SQL Database, seeChange automated backup settings for Azure SQL Database. Note This article provides steps about how to delete personal data from the device or service and can be used to support your obligations under the GDPR. For general informati",2026-02-23T08:00:00.000Z,how-to,configuration,0.8,True,"Shows how to modify short-term retention and backup storage redundancy via portal/CLI/PowerShell/REST, including specific configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/automated-backups-overview?view=azuresql,Automated backups,"Automatic, Geo-Redundant Backups - Azure SQL Managed Instance",Understand automatic and geo-redundant backups in SQL MI,Learn how Azure SQL Managed Instance automatically backs up all databases and provides point-in-time restore capability.,"Applies to:Azure SQL Managed Instance This article describes the automated backup feature for Azure SQL Managed Instance. To change backup settings, seeChange automated backup settings for Azure SQL Managed Instance. 
To restore a backup, seeRestore a database from a backup in Azure SQL Managed Instance.",2026-01-15T08:00:00.000Z,concept-article,configuration,0.65,True,"Describes automated backup behavior and settings; likely includes retention periods, backup frequency, and geo-redundancy options specific to Managed Instance backup configuration.",unchanged @@ -128,7 +128,7 @@ https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/disaster-reco https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/distributed-transaction-coordinator-dtc?view=azuresql,Distributed transactions with DTC,Distributed Transaction Coordinator (DTC) - Azure SQL Managed Instance,Use Distributed Transaction Coordinator with SQL Managed Instance,Learn how to use Distributed Transaction Coordinator (DTC) for Azure SQL Managed Instance to run distributed transactions in a mixed environment.,"Applies to:Azure SQL Managed Instance This article provides an overview of Distributed Transaction Coordinator (DTC) for Azure SQL Managed Instance. You can use DTC to run distributed transactions in mixed environments. 
These include across SQL managed instances, SQL Server instances, other relational database management systems (RDBMS), custom applications, and other transaction participants that are hosted in any environment that can establish network connectivity to Azure.",2025-09-11T22:34:00.000Z,how-to,integrations,0.7,True,"Explains how to run distributed transactions across Managed Instance, SQL Server, and other participants; includes product-specific DTC integration patterns and constraints.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/doc-changes-updates-known-issues?view=azuresql,Known issues with Azure SQL Managed Instance,Known Issues - Azure SQL Managed Instance,Resolve known issues in Azure SQL Managed Instance,"Learn about the currently known issues with Azure SQL Managed Instance, and their possible workarounds or resolutions.","Applies to:Azure SQL Managed Instance This article lists the currently known issues withAzure SQL Managed Instance, and their resolution date or possible workaround. To learn more about Azure SQL Managed Instance, seeWhat is Azure SQL Managed Instance?, andWhat's new in Azure SQL Managed Instance? Note Microsoft Entra IDwas previously known as Azure Active Directory (Azure AD).",2026-03-10T22:39:00.000Z,troubleshooting-known-issue,troubleshooting,0.78,True,"A 'known issues' page for a specific service typically enumerates concrete symptoms, affected features, and product-specific workarounds or resolutions. These are current, version-specific behaviors that LLMs are unlikely to know from training and map directly to troubleshooting knowledge (symptom → workaround/resolution), rather than generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/doc-changes-updates-release-notes-whats-new-archive?view=azuresql,Azure SQL Managed Instance,What's new? 
(Archive) - Azure SQL Managed Instance,,Learn about the features and documentation improvements for Azure SQL Managed Instance made in previous years. (Archive),"Applies to:Azure SQL Managed Instance This article summarizes older documentation changes associated with new features and improvements in the releases ofAzure SQL Managed Instance. To learn more about Azure SQL Managed Instance, see theoverview. Return toWhat's new in Azure SQL Managed Instance?",2026-03-03T18:40:00.000Z,whats-new,,0.1,False,"Archive page summarizing older feature and documentation updates; functions as navigation/history, not detailed technical guidance with specific parameters or limits.",unchanged -https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/doc-changes-updates-release-notes-whats-new?view=azuresql,What's new?,What's new? - Azure SQL Managed Instance,,Learn about the new features and documentation improvements for Azure SQL Managed Instance.,"Applies to:Azure SQL Managed Instance This article summarizes the documentation changes associated with new features and improvements in the recent releases ofAzure SQL Managed Instance. To learn more about Azure SQL Managed Instance, seeWhat is Azure SQL Managed Instance? Note Microsoft Entra IDwas previously known as Azure Active Directory (Azure AD).",2026-04-13T22:36:00.000Z,whats-new,,0.2,False,"Release notes and 'what's new' summary; primarily high-level feature and documentation change log without detailed limits, configuration tables, error codes, or decision matrices that meet the expert-knowledge criteria.",updated +https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/doc-changes-updates-release-notes-whats-new?view=azuresql,What's new?,What's new? 
- Azure SQL Managed Instance,,Learn about the new features and documentation improvements for Azure SQL Managed Instance.,"Applies to:Azure SQL Managed Instance This article summarizes the documentation changes associated with new features and improvements in the recent releases ofAzure SQL Managed Instance. To learn more about Azure SQL Managed Instance, seeWhat is Azure SQL Managed Instance? Note Microsoft Entra IDwas previously known as Azure Active Directory (Azure AD).",2026-04-13T22:36:00.000Z,whats-new,,0.2,False,"Release notes and 'what's new' summary; primarily high-level feature and documentation change log without detailed limits, configuration tables, error codes, or decision matrices that meet the expert-knowledge criteria.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/failover-group-configure-sql-mi?view=azuresql,Create failover group,Configure a failover group - Azure SQL Managed Instance,Configure failover groups for Azure SQL Managed Instance,"Learn how to configure a failover group for Azure SQL Managed Instance by using the Azure portal, and Azure PowerShell.","Applies to:Azure SQL Managed Instance This article teaches you how to configure afailover groupforAzure SQL Managed Instanceby using the Azure portal and Azure PowerShell. 
For an end-to-end PowerShell script to create both instances within a failover group, review Add instance to a failover group with PowerShell.",2025-04-27T08:00:00.000Z,how-to,configuration,0.75,True,"How-to for configuring failover groups via portal/PowerShell will include specific setting names, parameters, and allowed values for failover group configuration, which are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/failover-group-sql-mi?view=azuresql,Failover groups,Failover Groups Overview & Best Practices - Azure SQL Managed Instance,Use failover groups for geo-replication in SQL Managed Instance,Failover groups let you manage geo-replication and coordinated failover of all user databases on Azure SQL Managed Instance.,"Applies to: Azure SQL Managed Instance This article provides an overview of the failover group feature with best practices and recommendations to use with Azure SQL Managed Instance. The failover groups feature allows you to manage the replication and failover of all user databases in a SQL managed instance to another Azure region.
To get started, review Configure a failover group for Azure SQL Managed Instance.",2025-09-11T22:34:00.000Z,concept-article,best-practices,0.78,True,Overview plus best practices and recommendations for failover groups; contains product-specific guidance on configuration and usage patterns for geo-replication and failover.,unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/failover-group-standby-replica-how-to-configure?view=azuresql,Configure standby replica,Configure License-Free Standby Replica - Azure SQL Managed Instance,Configure license-free standby replica for SQL Managed Instance,Learn how to save on licensing costs by using a standby Azure SQL Managed Instance replica.,"Applies to: Azure SQL Managed Instance This article describes how you can save on licensing costs by designating your secondary SQL managed instance for standby when using Azure SQL Managed Instance. Note The Failover benefit is only applicable when you configure a secondary instance as standby within a failover group.
For hybrid environments between SQL Server and SQL Managed Instance, use the Hybrid failover benefit instead.",2025-08-28T22:34:00.000Z,how-to,decision-making,0.65,True,"Describing when and how to designate a secondary instance as standby to use the failover benefit involves SKU/licensing trade-offs and scenario-based recommendations, fitting decision-making with product-specific rules.",unchanged @@ -151,7 +151,7 @@ https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/instance-stop https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/instance-zone-redundancy-configure?view=azuresql,Configure zone redundancy,Configure Zone Redundancy - Azure SQL Managed Instance,,"Configure zone redundancy for your Azure SQL Managed Instance by using the Azure portal, PowerShell, Azure CLI, and REST API.","Applies to: Azure SQL Managed Instance This article teaches you how to configure zone redundancy for Azure SQL Managed Instance by using the Azure portal, PowerShell, Azure CLI, and REST API. By using a zone-redundant configuration, you can make your Business Critical or General Purpose instances highly available and resilient to a larger set of failures, including catastrophic datacenter outages, without any changes to the application logic. You can convert any existing Business Critical or Genera",2026-03-18T11:48:00.000Z,how-to,,0.3,False,"Primarily a how-to guide for enabling zone redundancy via portal/PowerShell/CLI/REST. Based on the summary, it does not clearly expose detailed configuration parameter tables, limits, or product-specific constraints; it focuses on step-by-step configuration, which is excluded by the rules.
Without clear evidence of parameter tables or limits, it does not meet the expert-knowledge criteria.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/job-automation-managed-instance?view=azuresql,Job automation with SQL Agent,Job Automation with SQL Agent Jobs - Azure SQL Managed Instance,Automate jobs with SQL Server Agent on SQL Managed Instance,Automation options to run Transact-SQL (T-SQL) scripts in Azure SQL Managed Instance,"Applies to: Azure SQL Managed Instance Using SQL Server Agent in Azure SQL Managed Instance, you can create and schedule jobs that could be periodically executed against one or many databases. These SQL Agent jobs run Transact-SQL (T-SQL) queries and perform maintenance tasks. This article covers the use of SQL Agent for SQL Managed Instance. Note SQL Agent isn't available in Azure SQL Database or Azure Synapse Analytics. Instead, we recommend Job automation with Elastic Jobs.",2025-09-11T22:34:00.000Z,concept-article,integrations,0.7,True,"Describes using SQL Agent jobs on Managed Instance; includes product-specific behavior, limitations, and T-SQL job patterns distinct from other Azure SQL offerings.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/log-replay-service-compare-mi-link?view=azuresql,Compare with MI link,Compare LRS with Managed Instance Link - Azure SQL Managed Instance,Choose between Log Replay Service and Managed Instance link,Compare log replay service (LRS) with the Managed Instance link when migrating to Azure SQL Managed Instance.,Applies to: Azure SQL Managed Instance This article compares the Log Replay Service (LRS) with the Managed Instance link when migrating to Azure SQL Managed Instance.,2026-01-26T08:00:00.000Z,product-comparison,decision-making,0.75,True,"Explicit comparison article; likely includes criteria, scenarios, and trade-offs for selecting LRS vs Managed Instance link for migration.",unchanged
-https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/log-replay-service-migrate?view=azuresql,Database using Log Replay Service,Migrate SQL Server Databases by Using Log Replay Service - Azure SQL Managed Instance,Migrate to Azure SQL MI using Log Replay Service,"Migrate SQL Server databases to Azure SQL Managed Instance by using Log Replay Service (LRS). Step-by-step guide with prerequisites, best practices, and troubleshooting tips. Start your migration toda","Applies to: Azure SQL Managed Instance This article explains how to migrate SQL Server databases to Azure SQL Managed Instance using Log Replay Service (LRS). LRS is a free cloud service for Azure SQL Managed Instance migrations, built on SQL Server log-shipping technology. Learn the complete process from prerequisites to cutover, including best practices for a successful database migration. Note It's now possible to migrate your SQL Server instance enabled by Azure Arc to Azure SQL Managed Instan",2026-04-16T08:00:00.000Z,how-to,best-practices,0.68,True,"Step-by-step migration guide for Log Replay Service that typically includes product-specific prerequisites, sequencing, and best-practice recommendations (for example, backup/log chain handling, cutover patterns, and configuration details) that go beyond generic migration theory.",updated +https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/log-replay-service-migrate?view=azuresql,Database using Log Replay Service,Migrate SQL Server Databases by Using Log Replay Service - Azure SQL Managed Instance,Migrate to Azure SQL MI using Log Replay Service,"Migrate SQL Server databases to Azure SQL Managed Instance by using Log Replay Service (LRS). Step-by-step guide with prerequisites, best practices, and troubleshooting tips. Start your migration toda","Applies to: Azure SQL Managed Instance This article explains how to migrate SQL Server databases to Azure SQL Managed Instance using Log Replay Service (LRS).
LRS is a free cloud service for Azure SQL Managed Instance migrations, built on SQL Server log-shipping technology. Learn the complete process from prerequisites to cutover, including best practices for a successful database migration. Note It's now possible to migrate your SQL Server instance enabled by Azure Arc to Azure SQL Managed Instan",2026-04-16T08:00:00.000Z,how-to,best-practices,0.68,True,"Step-by-step migration guide for Log Replay Service that typically includes product-specific prerequisites, sequencing, and best-practice recommendations (for example, backup/log chain handling, cutover patterns, and configuration details) that go beyond generic migration theory.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/log-replay-service-overview?view=azuresql,Overview,Overview of Log Replay Service - Azure SQL Managed Instance,,Learn about Log Replay Service with Azure SQL Managed Instance.,"Applies to: Azure SQL Managed Instance This article provides an overview of Log Replay Service (LRS), which you can use to migrate databases from SQL Server to Azure SQL Managed Instance. LRS is a free cloud service available for Azure SQL Managed Instance and is based on SQL Server log-shipping technology. Note You can now migrate your SQL Server instance enabled by Azure Arc to Azure SQL Managed Instance directly through the Azure portal.
For more information, see Migrate to Azure SQL Managed In",2026-01-26T08:00:00.000Z,concept-article,,0.4,False,"Overview of Log Replay Service; focuses on what it is and scenarios, not detailed configuration or limits in the summary.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/long-term-backup-retention-configure?view=azuresql,Long-term backup retention,Long-Term Backup Retention - Azure SQL Managed Instance,Configure long-term backup retention for SQL Managed Instance,"Learn how to store and restore automated backups on separate Azure Blob storage containers for Azure SQL Managed Instance by using the Azure portal, Azure CLI, and PowerShell.","Applies to: Azure SQL Managed Instance This article shows you how to configure a long-term backup retention (LTR) policy for Azure SQL Managed Instance by using the Azure portal, PowerShell, and the Azure CLI, as well as how to view and restore backups from Azure storage. An LTR policy allows you to automatically retain database backups in separate Azure Blob storage containers for up to 10 years.
You can then recover a database using these backups.",2025-09-15T08:00:00.000Z,how-to,configuration,0.7,True,"LTR article describes retention policy settings, allowed ranges, and storage configuration for backups, which are product-specific configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/machine-learning-services-differences?view=azuresql,Key differences,Key Differences for Machine Learning Services - Azure SQL Managed Instance,Compare ML Services in SQL Managed Instance vs SQL Server,This article describes key differences between Machine Learning Services in Azure SQL Managed Instance and SQL Server Machine Learning Services.,"This article describes the few, key differences in functionality between Machine Learning Services in Azure SQL Managed Instance and SQL Server Machine Learning Services.",2026-01-27T18:35:00.000Z,concept-article,decision-making,0.7,True,Describes key functional differences between Managed Instance and SQL Server ML Services; supports technology selection and migration decisions with product-specific capability comparisons.,unchanged @@ -159,16 +159,16 @@ https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/machine-learn https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/maintenance-window-configure?view=azuresql,Configure maintenance window,Configure Maintenance Window - Azure SQL Managed Instance,Configure maintenance windows for Azure SQL Managed Instance,Learn how to set the time when planned maintenance should be performed when you use Azure SQL Managed Instance.,"Applies to: Azure SQL Managed Instance You can configure the maintenance window for an Azure SQL Managed Instance during resource creation, or anytime after a resource is created. For availability, see Maintenance window feature availability in Azure SQL Managed Instance.
Important Configuring the maintenance window is a long running asynchronous operation, similar to changing the service tier of the Azure SQL resource. The resource is available during the operation, except a short reconfiguration t",2025-08-26T08:00:00.000Z,how-to,configuration,0.82,True,"How-to configuration article for maintenance windows; includes specific settings, operation characteristics (long-running async, brief reconfiguration), and availability behavior.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/maintenance-window-faq?view=azuresql,Maintenance window FAQ,Maintenance window FAQ - Azure SQL Managed Instance,Maintenance window FAQ for Azure SQL Managed Instance,FAQ on how the Azure SQL Managed Instance maintenance window can be configured.,"This article answers frequently asked questions about the maintenance window for Azure SQL Managed Instance. For a maintenance window FAQ for Azure SQL Database, see Maintenance window FAQ.",2025-10-24T17:40:00Z,faq,configuration,0.65,True,"FAQ likely contains detailed, product-specific answers about configuration behavior, edge cases, and constraints for maintenance windows.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/maintenance-window?view=azuresql,Maintenance window,Maintenance window - Azure SQL Managed Instance,Understand maintenance window behavior in SQL Managed Instance,Understand how the Azure SQL Managed Instance maintenance window can be configured.,Applies to: Azure SQL Managed Instance The maintenance window feature allows you to configure a maintenance schedule for Azure SQL Managed Instance resources making impactful maintenance events predictable and less disruptive for your workload. Note The maintenance window feature only protects from planned impact from upgrades or scheduled maintenance.
It does not protect from all failover causes; exceptions that might cause short connection interruptions outside of a maintenance window include hardw,2025-06-25T08:00:00.000Z,conceptual,configuration,0.7,True,"Explains how maintenance windows work, what events are covered, and exceptions; includes product-specific behavior and constraints for maintenance scheduling.",unchanged -https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/managed-instance-link-best-practices?view=azuresql,Best practices,Managed Instance link best practices - Azure SQL Managed Instance,Apply best practices for Managed Instance link replication,Learn about best practices when using the link feature for Azure SQL Managed Instance.,Applies to: Azure SQL Managed Instance This article outlines best practices for using the Managed Instance link to replicate data between Azure SQL Managed Instance and your SQL Server instances hosted anywhere. The link provides near real-time data replication between the linked replicas.,2026-04-14T17:42:00.000Z,how-to,best-practices,0.8,True,"Explicitly labeled as best practices for Managed Instance link; such pages typically contain product-specific DOs/DON’Ts, configuration guidance, and edge-case handling for near real-time replication that an LLM wouldn’t reliably infer from training alone.",updated +https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/managed-instance-link-best-practices?view=azuresql,Best practices,Managed Instance link best practices - Azure SQL Managed Instance,Apply best practices for Managed Instance link replication,Learn about best practices when using the link feature for Azure SQL Managed Instance.,Applies to: Azure SQL Managed Instance This article outlines best practices for using the Managed Instance link to replicate data between Azure SQL Managed Instance and your SQL Server instances hosted anywhere.
The link provides near real-time data replication between the linked replicas.,2026-04-14T17:42:00.000Z,how-to,best-practices,0.8,True,"Explicitly labeled as best practices for Managed Instance link; such pages typically contain product-specific DOs/DON’Ts, configuration guidance, and edge-case handling for near real-time replication that an LLM wouldn’t reliably infer from training alone.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/managed-instance-link-configure-how-to-scripts?view=azuresql,Configure link with scripts,Configure link with scripts - Azure SQL Managed Instance,Configure Managed Instance link using T-SQL and Azure scripts,Learn how to configure a link between SQL Server and Azure SQL Managed Instance with Transact-SQL (T-SQL) and Azure PowerShell or Azure CLI scripts.,"Applies to: Azure SQL Managed Instance This article teaches you how to configure a link between SQL Server and Azure SQL Managed Instance with Transact-SQL and PowerShell or Azure CLI scripts. With the link, databases from your initial primary are replicated to your secondary replica in near-real time. After the link is created, you can then fail over to your secondary replica for the purpose of migration, or disaster recovery. Note",2025-09-30T08:00:00.000Z,how-to,integrations,0.7,True,"Provides T-SQL, PowerShell, and CLI scripts with specific parameters to configure the link integration, which are product-specific API/config details.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/managed-instance-link-configure-how-to-ssms?view=azuresql,Configure link with SSMS,Configure link with SSMS - Azure SQL Managed Instance,,Learn how to configure a link between SQL Server and Azure SQL Managed Instance with SQL Server Management Studio (SSMS).,"Applies to: Azure SQL Managed Instance Learn how to configure a link between SQL Server and Azure SQL Managed Instance by using SQL Server Management Studio (SSMS).
The link replicates databases from your initial primary to your secondary replica in near-real time. After you create the link, you can fail over to your secondary replica for migration or disaster recovery. Note",2026-04-01T17:39:00.000Z,how-to,,0.3,False,"Primarily a how-to tutorial for configuring a link between SQL Server and Azure SQL Managed Instance using SSMS. It focuses on procedural steps rather than detailed configuration parameter tables, limits, security roles, or troubleshooting mappings, so it does not meet the expert-knowledge criteria for any sub-skill type.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/managed-instance-link-disaster-recovery?view=azuresql,Disaster recovery with the link,Disaster recovery with Managed Instance link - Azure SQL Managed Instance,Configure disaster recovery to Azure SQL Managed Instance using Managed Instance link,Learn how to use the Managed Instance link to recover your SQL Server data to Azure SQL Managed Instance in the event of a disaster.,"Applies to: Azure SQL Managed Instance This article teaches you to configure a hybrid disaster recovery solution between SQL Server hosted anywhere and Azure SQL Managed Instance by using the Managed Instance link, and how to save on licensing costs by activating the Hybrid failover benefit on a license-free DR replica.",2025-03-06T08:00:00.000Z,how-to,deployment,0.7,True,"Describes hybrid DR configuration and licensing (Hybrid failover benefit) for Managed Instance link, which is product-specific DR deployment guidance.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/managed-instance-link-failover-how-to?view=azuresql,Fail over a link,Fail over link - Azure SQL Managed Instance,Fail over databases using Managed Instance link between SQL Server and Azure SQL Managed Instance,Learn how to fail over a link between SQL Server and Azure SQL Managed Instance with SQL Server Management Studio (SSMS) and PowerShell.,Applies
to: Azure SQL Managed Instance This article teaches you how to fail over a database linked between SQL Server and Azure SQL Managed Instance by using SQL Server Management Studio (SSMS) or PowerShell for the purpose of disaster recovery or migration.,2024-12-06T08:00:00.000Z,how-to,integrations,0.65,True,"Describes how to perform failover of linked databases via SSMS/PowerShell, including product-specific commands and behaviors for the link integration.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/managed-instance-link-feature-overview?view=azuresql,Managed Instance link,Managed Instance link overview - Azure SQL Managed Instance,,"This article describes the Managed Instance link, which you can use to replicate data continuously between SQL Server and Azure SQL Managed Instance for scenarios such as migrating to the cloud with m","Applies to: Azure SQL Managed Instance This article provides an overview of the Managed Instance link, which enables near real-time data replication between SQL Server and Azure SQL Managed Instance. The link provides hybrid flexibility and database mobility as it unlocks several scenarios, such as scaling read-only workloads, offloading analytics and reporting to Azure, and migrating to Azure. And, with SQL Server 2022, the link enables online disaster recovery with fail back to SQL Server, as we",2026-03-06T08:00:00.000Z,concept-article,,0.2,False,"Page is an overview of Managed Instance link capabilities and scenarios (migration, DR, analytics). The summary indicates conceptual description of what the feature does, without specific limits, configuration parameter tables, error codes, or decision matrices.
It does not match any expert-knowledge sub-skill criteria.",unchanged -https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/managed-instance-link-migrate?view=azuresql,Migrate with the link,Migrate with the Link - Azure SQL Managed Instance,,Learn how to use the Managed Instance link to migrate your SQL Server data to Azure SQL Managed Instance.,"Applies to: Azure SQL Managed Instance This article teaches you to migrate your SQL Server database to Azure SQL Managed Instance by using the Managed Instance link. For a detailed migration guide, review Migrate to Azure SQL Managed Instance. To compare migration tools, review Compare LRS with Managed Instance link. Note You can now migrate your SQL Server instance enabled by Azure Arc to Azure SQL Managed Instance directly through the Azure portal. For more information, see Migrate to Azure SQL Man",2026-04-14T17:42:00.000Z,how-to,,0.3,False,"Described as a how-to for using Managed Instance link to migrate; summary does not indicate detailed configuration tables, error codes, or quantified best practices—likely a procedural tutorial rather than expert-knowledge reference content.",updated +https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/managed-instance-link-migrate?view=azuresql,Migrate with the link,Migrate with the Link - Azure SQL Managed Instance,,Learn how to use the Managed Instance link to migrate your SQL Server data to Azure SQL Managed Instance.,"Applies to: Azure SQL Managed Instance This article teaches you to migrate your SQL Server database to Azure SQL Managed Instance by using the Managed Instance link. For a detailed migration guide, review Migrate to Azure SQL Managed Instance. To compare migration tools, review Compare LRS with Managed Instance link. Note You can now migrate your SQL Server instance enabled by Azure Arc to Azure SQL Managed Instance directly through the Azure portal.
For more information, see Migrate to Azure SQL Man",2026-04-14T17:42:00.000Z,how-to,,0.3,False,"Described as a how-to for using Managed Instance link to migrate; summary does not indicate detailed configuration tables, error codes, or quantified best practices—likely a procedural tutorial rather than expert-knowledge reference content.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/managed-instance-link-preparation-wsfc?view=azuresql,Prepare SQL Server 2016 environment,Prepare environment for Managed Instance link with WSFC - Azure SQL Managed Instance,Prepare WSFC environment for Managed Instance link with Azure SQL Managed Instance,Learn how to prepare your environment with WSFC for using a Managed Instance link to replicate and fail over your database to SQL Managed Instance.,"Applies to: Azure SQL Managed Instance This article teaches you how to enable the Always On availability group feature with Windows Server Failover Cluster (WSFC) on your SQL Server 2016 as an extra step to prepare your environment for the Managed Instance link. The steps described in this article are only mandatory for SQL Server 2016, as this version of SQL Server can't enable availability groups without Windows Server Failover Cluster present on the host Windows OS machine.
The minimum requireme",2024-10-09T22:32:00.000Z,how-to,configuration,0.7,True,"Details enabling Always On AG with WSFC on SQL Server 2016 as part of link preparation, including minimum requirements and configuration steps.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/managed-instance-link-preparation?view=azuresql,Prepare environment for link,Prepare Environment for Link - Azure SQL Managed Instance,Prepare environment for Managed Instance link replication,Learn how to prepare your environment to create a link between SQL Server and Azure SQL Managed Instance.,"Applies to: Azure SQL Managed Instance This article teaches you how to prepare your environment for a Managed Instance link so that you can replicate between SQL Server installed to Windows or Linux and Azure SQL Managed Instance. Note You can automate preparing your environment for the Managed Instance link by using a downloadable script. For more information, see the Automating link setup blog.",2026-03-10T22:39:00.000Z,how-to,configuration,0.65,True,"The page describes preparing an environment to create a Managed Instance link between SQL Server and Azure SQL Managed Instance. Such preparation docs usually enumerate specific configuration steps, required settings, ports, permissions, and possibly scripts unique to this feature. That constitutes product-specific configuration knowledge beyond generic concepts, fitting the configuration sub-skill.",unchanged
You can check the state of the link with Transact-SQL (T-SQL), Azure PowerShell or the Azure CLI. If you encounter issues, you can use the error codes to troubleshoot the problem. Many issues with creating the link can be resolved bychecking the networkbetween the two instances, and validating theenvironment has been properly preparedf",2026-04-14T17:42:00.000Z,how-to,troubleshooting,0.9,True,"Article focuses on monitoring and troubleshooting the link, including checking state via T-SQL/PowerShell/CLI and using specific error codes to diagnose issues—matching the symptom → cause → solution and error-code-based troubleshooting criteria.",updated +https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/managed-instance-link-troubleshoot-how-to?view=azuresql,Troubleshoot,Troubleshoot issues with the link - Azure SQL Managed Instance,Troubleshoot Azure SQL Managed Instance link issues,Learn how to troubleshoot common issues with a link between SQL Server and Azure SQL Managed Instance.,"Applies to:Azure SQL Managed Instance This article teaches you how to monitor and troubleshoot issues with alinkbetween SQL Server and Azure SQL Managed Instance. You can check the state of the link with Transact-SQL (T-SQL), Azure PowerShell or the Azure CLI. If you encounter issues, you can use the error codes to troubleshoot the problem. 
Many issues with creating the link can be resolved by checking the network between the two instances, and validating the environment has been properly preparedf",2026-04-14T17:42:00.000Z,how-to,troubleshooting,0.9,True,"Article focuses on monitoring and troubleshooting the link, including checking state via T-SQL/PowerShell/CLI and using specific error codes to diagnose issues—matching the symptom → cause → solution and error-code-based troubleshooting criteria.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/management-operations-cancel?view=azuresql,Cancel operations,Cancel management operations - Azure SQL Managed Instance,Cancel Azure SQL Managed Instance management operations,Learn how to cancel Azure SQL Managed Instance management operations.,"Applies to: Azure SQL Managed Instance Azure SQL Managed Instance provides the ability to cancel some management operations, such as when you deploy a new managed instance or update instance properties.",2024-02-27T18:32:00.000Z,how-to,deployment,0.68,True,"Explains which management operations can be canceled and how; includes product-specific constraints and behavior during cancellation, relevant to deployment/change workflows.",unchanged
For an overview of the underlying processes related to management operations, such as seeding, and failover, see Management operations overview.",2025-06-19T05:30:00.000Z,overview,deployment,0.7,True,Details steps and durations of create/update/delete operations; includes product-specific timing characteristics important for deployment and change planning.,unchanged https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/management-operations-monitor?view=azuresql,Monitor operations,Monitor management operations - Azure SQL Managed Instance,Monitor Azure SQL Managed Instance management operations,Learn about different ways for monitoring of Azure SQL Managed Instance management operations.,"Applies to: Azure SQL Managed Instance Azure SQL Managed Instance provides monitoring of management operations that you use to deploy new managed instances, update instance properties, or delete instances when no longer needed.",2025-06-19T05:30:00.000Z,how-to,configuration,0.65,True,Describes specific monitoring mechanisms and possibly metrics/events for management operations; these are product-specific configuration/monitoring details.,unchanged diff --git a/products/azure-sql-managed-instance/report.md b/products/azure-sql-managed-instance/report.md index 6246291a..d642ec3c 100644 --- a/products/azure-sql-managed-instance/report.md +++ b/products/azure-sql-managed-instance/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: decision-making: Guidance for choosing Azure SQL Managed Instance vs other Azure SQL options, tiers, pools, networking, HA/DR options, ML differences, and planning
- security: 'Configuring Azure SQL Managed Instance security: Entra auth and logins, - Windows/Kerberos, managed identities, TDE and Key Vault, TLS, auditing, threat - protection, networking, and policy-based controls.' + security: 'Configuring auth, encryption, and network protection for SQL Managed + Instance: Entra/Windows logins, TDE & Key Vault, TLS, auditing, threat protection, + Private Link, and security best practices.' integrations: Connecting apps and tools to SQL Managed Instance (.NET, Java, Python, etc.), automation, data import, DTC, XEvents, MI Link, backups, tracing, and Spark integration. @@ -31,17 +31,17 @@ category_descriptions: skill_description: Expert knowledge for Azure SQL Managed Instance development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. - Use when using MI link, HA/DR and geo-replication, Entra/Kerberos auth, Key Vault - TDE, or XEvents/Intelligent Insights, and other Azure SQL Managed Instance related + Use when choosing MI vs Azure SQL, configuring networking/HA, tuning performance, + securing access, or integrating MI Link, and other Azure SQL Managed Instance related development tasks. Not for Azure SQL Database (use azure-sql-database), SQL Server - on Azure Virtual Machines (use azure-sql-virtual-machines), Azure Database for MariaDB - (use azure-database-mariadb), Azure Database for MySQL (use azure-database-mysql). -use_when: Use when using MI link, HA/DR and geo-replication, Entra/Kerberos auth, - Key Vault TDE, or XEvents/Intelligent Insights, and other Azure SQL Managed Instance - related development tasks. + on Azure Virtual Machines (use azure-sql-virtual-machines), Azure Database for MySQL + (use azure-database-mysql), Azure Database for PostgreSQL (use azure-database-postgresql). 
+use_when: Use when choosing MI vs Azure SQL, configuring networking/HA, tuning performance, + securing access, or integrating MI Link, and other Azure SQL Managed Instance related + development tasks. confusable_not_for: Not for Azure SQL Database (use azure-sql-database), SQL Server - on Azure Virtual Machines (use azure-sql-virtual-machines), Azure Database for MariaDB - (use azure-database-mariadb), Azure Database for MySQL (use azure-database-mysql). + on Azure Virtual Machines (use azure-sql-virtual-machines), Azure Database for MySQL + (use azure-database-mysql), Azure Database for PostgreSQL (use azure-database-postgresql). --- # Azure SQL Managed Instance Crawl Report @@ -54,10 +54,10 @@ confusable_not_for: Not for Azure SQL Database (use azure-sql-database), SQL Ser - **Unclassified**: 55 ### Incremental Update -- **New Pages**: 2 -- **Updated Pages**: 5 -- **Unchanged**: 234 -- **Deleted Pages**: 1 +- **New Pages**: 0 +- **Updated Pages**: 1 +- **Unchanged**: 240 +- **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-sql-managed-instance/azure-sql-managed-instance.csv` ## Classification Statistics @@ -77,27 +77,10 @@ confusable_not_for: Not for Azure SQL Database (use azure-sql-database), SQL Ser ## Changes -### New Pages - -- [SQL Server Audit in Azure SQL Managed Instance](https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/auditing?view=azuresql) -- [How to configure SQL Server Audit in Azure SQL Managed Instance](https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/auditing-configure?view=azuresql) - ### Updated Pages -- [What's new?](https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/doc-changes-updates-release-notes-whats-new?view=azuresql) - - Updated: 2026-03-31T22:34:00.000Z → 2026-04-13T22:36:00.000Z -- [Database using Log Replay Service](https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/log-replay-service-migrate?view=azuresql) - - Updated: 
2026-03-10T22:39:00.000Z → 2026-04-16T08:00:00.000Z -- [Migrate with the link](https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/managed-instance-link-migrate?view=azuresql) - - Updated: 2026-02-19T08:00:00.000Z → 2026-04-14T17:42:00.000Z -- [Best practices](https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/managed-instance-link-best-practices?view=azuresql) - - Updated: 2026-01-29T18:37:00.000Z → 2026-04-14T17:42:00.000Z -- [Troubleshoot](https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/managed-instance-link-troubleshoot-how-to?view=azuresql) - - Updated: 2026-03-10T22:39:00.000Z → 2026-04-14T17:42:00.000Z - -### Deleted Pages - -- ~~Auditing~~ (https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/auditing-configure?view=azuresql) +- [Bring Your Own Key (BYOK)](https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-overview?view=azuresql) + - Updated: 2026-03-05T08:00:00.000Z → 2026-04-22T08:00:00.000Z ## Classified Pages @@ -111,7 +94,6 @@ confusable_not_for: Not for Azure SQL Database (use azure-sql-database), SQL Ser | [Troubleshoot](https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/winauth-azuread-troubleshoot?view=azuresql) | troubleshooting | 0.90 | Explicit troubleshooting article; expected to map specific Kerberos/Entra errors and symptoms to causes and resolutions for Windows Authentication on Managed Instance. | | [Set up the incoming trust-based flow](https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/winauth-azuread-setup-incoming-trust-based-flow?view=azuresql) | security | 0.86 | The page describes how to configure Windows Authentication for Azure SQL Managed Instance using Microsoft Entra ID with an incoming trust-based flow. 
It includes product-specific identity and Kerberos configuration steps (service accounts, Trusted Domain Object, key rotation, and removal), which are detailed security and authentication settings unique to this product, fitting the security sub-skill. | | [Troubleshoot geo-replication redo lag](https://learn.microsoft.com/en-us/azure/azure-sql/database/troubleshoot-geo-replication-redo?view=azuresql) | troubleshooting | 0.86 | The page focuses on understanding and troubleshooting geo-replication and redo lag, which implies product-specific metrics, behaviors, and diagnostic steps for Azure SQL Database. It likely includes concrete symptom → cause → resolution guidance (for example, interpreting redo queue/lag, tuning, and mitigation steps) that go beyond generic replication concepts and constitute expert troubleshooting knowledge. | -| [Bring Your Own Key (BYOK)](https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-overview?view=azuresql) | security | 0.85 | Details Bring Your Own Key (BYOK) for TDE using Azure Key Vault, including key lifecycle responsibilities and separation of duties; this is product-specific encryption and key management configuration, fitting the security category. | | [Connectivity errors](https://learn.microsoft.com/en-us/azure/azure-sql/database/troubleshoot-common-errors-issues?view=azuresql) | troubleshooting | 0.85 | Explicitly about connection errors; includes specific error messages, causes, and steps to resolve, which is classic symptom→cause→solution troubleshooting content. | | [Create server with Microsoft Entra-only authentication enabled](https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-azure-ad-only-authentication-create-server?view=azuresql) | security | 0.85 | How-to guide for provisioning Azure SQL logical servers/managed instances with Microsoft Entra-only authentication. 
It necessarily includes specific configuration flags/parameters in ARM/CLI/Portal, and describes how SQL authentication is disabled and Entra principals are enforced. This is product-specific identity and access configuration, fitting the security sub-skill. | | [Log diagnostic telemetry](https://learn.microsoft.com/en-us/azure/azure-sql/database/metrics-diagnostic-telemetry-logging-streaming-export-configure?view=azuresql) | configuration | 0.85 | Details telemetry types, destinations, and configuration via portal/CLI/ARM; includes specific metric/log names and settings, which are configuration parameters. | @@ -197,6 +179,7 @@ confusable_not_for: Not for Azure SQL Database (use azure-sql-database), SQL Ser | [Azure SQL decision tree](https://learn.microsoft.com/en-us/azure/azure-sql/azure-sql-decision-tree?view=azuresql) | decision-making | 0.70 | Describes a product-specific decision tree in the Azure portal for selecting between Azure SQL Database, Managed Instance, and SQL Server on VM; this is concrete decision guidance unique to the product, even if much of the logic is embedded in the portal tool. | | [Backup transparency](https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/backup-transparency?view=azuresql) | configuration | 0.70 | Explains how to access and interpret backup history; includes product-specific DMVs or views and their fields, which are concrete configuration/inspection details. | | [Best practices](https://learn.microsoft.com/en-us/azure/azure-sql/database/security-best-practice?view=azuresql) | security | 0.70 | Playbook of security requirements and best practices for Azure SQL; likely includes concrete feature usage (TDE, auditing, firewall, roles) and product-specific recommendations. 
| +| [Bring Your Own Key (BYOK)](https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-overview?view=azuresql) | security | 0.70 | The BYOK TDE page goes beyond conceptual encryption overview and includes product-specific security configuration details such as use of customer-managed keys in Azure Key Vault, separation-of-duties implications, and concrete recommendations/considerations for Azure SQL Database, Managed Instance, and Synapse dedicated SQL pools. While primarily an overview, it contains configuration-oriented security guidance specific to these services rather than generic crypto concepts. | | [Change update policy](https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/update-policy?view=azuresql) | configuration | 0.70 | Describes a product-specific instance setting (update policy) that controls access to SQL engine features, including concrete allowed values (policies tied to SQL Server 2022, 2025, or latest). This is configuration-focused (instance-level setting with specific options) rather than conceptual, and the exact policy names/behaviors are expert, product-specific knowledge. | | [Configure connection types](https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/connection-types-overview?view=azuresql) | configuration | 0.70 | Explains different VNet-local connection types and how to configure them, which are specific connectivity configuration options for this service. | | [Configure link with scripts](https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/managed-instance-link-configure-how-to-scripts?view=azuresql) | integrations | 0.70 | Provides T-SQL, PowerShell, and CLI scripts with specific parameters to configure the link integration, which are product-specific API/config details. 
| diff --git a/products/azure-sql-virtual-machines/azure-sql-virtual-machines.csv b/products/azure-sql-virtual-machines/azure-sql-virtual-machines.csv index 66f964d8..1dd58c34 100644 --- a/products/azure-sql-virtual-machines/azure-sql-virtual-machines.csv +++ b/products/azure-sql-virtual-machines/azure-sql-virtual-machines.csv @@ -82,7 +82,7 @@ https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/creat https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/create-sql-vm-resource-manager-template?view=azuresql,ARM template,Create SQL Server VM using an ARM template - SQL Server on Azure VMs,,Learn how to create a SQL Server on Azure Virtual Machine (VM) by using an Azure Resource Manager template (ARM template).,"Use this Azure Resource Manager template (ARM template) to deploy a SQL Server on Azure Virtual Machine (VM). An ARM template is a JavaScript Object Notation (JSON) file that defines the infrastructure and configuration for your project. The template uses declarative syntax. In declarative syntax, you describe your intended deployment without writing the sequence of programming commands to create the deployment. If your environment meets the prerequisites and you're familiar with using ARM templat",2026-01-21T18:36:00.000Z,quickstart,,0.35,False,ARM template quickstart; demonstrates a deployment but not a full configuration or limits reference.,unchanged https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/dedicated-host?view=azuresql,Dedicated host,Run SQL Server VM on an Azure Dedicated Host - SQL Server on Azure VMs,Configure SQL Server on Azure Dedicated Host,Learn how to run a SQL Server VM on an Azure Dedicated Host.,Applies to:SQL Server on Azure VM This article details the specifics of using a SQL Server virtual machine (VM) with Azure Dedicated Host.
Additional information about Azure Dedicated Host can be found in the blog post Introducing Azure Dedicated Host.,2023-07-10T08:00:00.000Z,concept-article,configuration,0.65,True,"Details specifics of running SQL VMs on Dedicated Host, including host-level configuration and constraints.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/doc-changes-updates-release-notes-whats-new-archive?view=azuresql,SQL Server on Azure VMs,What's new? (Archive) - SQL Server on Azure VMs,,Learn about the features and documentation improvements for SQL Server on Azure VMs made in previous years. (Archive),"Applies to:SQL Server on Azure VM This article summarizes older documentation changes associated with new features and improvements in the recent releases of SQL Server on Azure VMs. To learn more about SQL Server on Azure VMs, see the overview. Return to What's new in SQL Server on Azure VMs?",2026-03-02T08:00:00.000Z,whats-new,,0.1,False,"Archive of 'what's new' documentation changes for SQL Server on Azure VMs; primarily historical and navigational, not focused on limits, configuration tables, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/doc-changes-updates-release-notes-whats-new?view=azuresql,What's new?,What's new? - SQL Server on Azure VMs,,Learn about the new features for and improvements to SQL Server on Azure Virtual Machines.,"Applies to:SQL Server on Azure VM When you deploy an Azure virtual machine (VM) with SQL Server installed on it, either manually, or through a built-in image, you can use Azure features to improve your experience. This article summarizes the documentation changes associated with new features and improvements in the recent releases of SQL Server on Azure Virtual Machines (VMs). To learn more about SQL Server on Azure VMs, see the overview.
For updates made in previous years, see the What's new archi",2026-04-17T17:37:00.000Z,whats-new,,0.2,False,"Release notes and documentation change log for SQL Server on Azure VMs; summary text indicates high-level 'what's new' without exposing specific limits, configs, error codes, or decision matrices. Functions primarily as navigation/overview of changes rather than detailed technical guidance.",updated +https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/doc-changes-updates-release-notes-whats-new?view=azuresql,What's new?,What's new? - SQL Server on Azure VMs,,Learn about the new features for and improvements to SQL Server on Azure Virtual Machines.,"Applies to:SQL Server on Azure VM When you deploy an Azure virtual machine (VM) with SQL Server installed on it, either manually, or through a built-in image, you can use Azure features to improve your experience. This article summarizes the documentation changes associated with new features and improvements in the recent releases of SQL Server on Azure Virtual Machines (VMs). To learn more about SQL Server on Azure VMs, see the overview. For updates made in previous years, see the What's new archi",2026-04-17T17:37:00.000Z,whats-new,,0.2,False,"Release notes and documentation change log for SQL Server on Azure VMs; summary text indicates high-level 'what's new' without exposing specific limits, configs, error codes, or decision matrices.
Functions primarily as navigation/overview of changes rather than detailed technical guidance.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/failover-cluster-instance-azure-elastic-san-manually-configure?view=azuresql,Azure Elastic SAN,Create an FCI with Azure Elastic SAN - SQL Server on Azure VMs,Manually configure SQL FCI with Azure Elastic SAN,Use Azure Elastic SAN to create a failover cluster instance (FCI) with SQL Server on Azure Virtual Machines.,"Applies to:SQL Server on Azure VM This article explains how to create a failover cluster instance (FCI) by using an Azure Elastic SAN volume with SQL Server on Azure Virtual Machines (VMs). To learn more, see an overview of FCI with SQL Server on Azure VMs and cluster best practices.",2025-02-03T08:00:00.000Z,how-to,configuration,0.7,True,"Step-by-step FCI setup with Azure Elastic SAN volumes will include product-specific configuration values (disk layout, cluster settings, volume options) that are not generic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/failover-cluster-instance-azure-shared-disks-manually-configure?view=azuresql,Azure shared disks,Create an FCI with Azure Shared Disks - SQL Server on Azure VMs,Manually configure SQL FCI with Azure shared disks,Use Azure shared disks to create a failover cluster instance (FCI) with SQL Server on Azure Virtual Machines.,"Applies to:SQL Server on Azure VM This article explains how to create a failover cluster instance (FCI) by using Azure shared disks with SQL Server on Azure Virtual Machines (VMs). To learn more, see an overview of FCI with SQL Server on Azure VMs and cluster best practices. Note You can now lift and shift your failover cluster instance solution to SQL Server on Azure VMs by using Azure Migrate.
To learn more, see Migrate failover cluster instance.",2026-04-01T17:39:00.000Z,how-to,configuration,0.7,True,"Step-by-step manual configuration of a SQL Server failover cluster instance on Azure VMs with Azure Shared Disks will include product-specific settings (cluster, storage, networking, SQL configuration) and exact parameter names/values unique to this scenario, which an LLM is unlikely to know from training.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/failover-cluster-instance-distributed-network-name-dnn-configure?view=azuresql,Distributed network name (DNN),Configure DNN for Failover Cluster Instance - SQL Server on Azure VMs,Configure distributed network name for SQL FCI on Azure,Learn how to configure a distributed network name (DNN) to route traffic to your SQL Server on Azure VM failover cluster instance (FCI).,"Applies to:SQL Server on Azure VM On Azure Virtual Machines (VMs), the distributed network name (DNN) routes traffic to the appropriate clustered resource. It provides an easier way to connect to the SQL Server failover cluster instance (FCI) than the virtual network name (VNN), without the need for an Azure Load Balancer.
This article teaches you to configure a DNN resource to route traffic to your failover cluster instance with SQL Server on Azure VMs for high availability and disaster recover",2025-09-26T22:35:00.000Z,how-to,configuration,0.78,True,DNN setup for FCI requires specific cluster resource names and Azure networking parameters that are product-specific configuration details.,unchanged @@ -103,10 +103,10 @@ https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/manag https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/move-sql-vm-different-region?view=azuresql,VM to a new region,Move a virtual machine to another region (Azure Site Recovery) - SQL Server on Azure VMs,Migrate SQL Server VMs to another Azure region,Learn how you can migrate your SQL Server virtual machine from one region to another within Azure.,Applies to:SQL Server on Azure VM This article teaches you how to use Azure Site Recovery to migrate your SQL Server virtual machine (VM) from one region to another within Azure. Moving a SQL Server VM to a different region requires doing the following:,2023-03-30T00:00:00.000Z,how-to,deployment,0.6,True,"Uses Azure Site Recovery for region migration; typically includes product-specific replication settings, supported scenarios, and constraints relevant to deployment/migration.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/performance-guidelines-best-practices-checklist?view=azuresql,Quick checklist,Checklist: Best Practices and Guidelines - SQL Server on Azure VMs,Apply performance best-practice checklist for SQL Server on Azure VMs,Provides a quick checklist to review your best practices and guidelines to optimize the performance of your SQL Server on Azure Virtual Machines (VM).,"Applies to:SQL Server on Azure VM This article provides checklists as part of a series of best practices and guidelines to optimize the performance of your SQL Server on Azure Virtual Machines (VMs). 
Use this guide to improve your VM configuration, storage setup, security posture, and troubleshoot common performance problems. The checklists in this article provide a brief overview of the more comprehensive details found in the following articles of this series: Note You can now view individual S",2026-03-31T08:00:00.000Z,best-practice,best-practices,0.7,True,"Checklist of product-specific performance recommendations for SQL Server on Azure VMs, referencing concrete configuration and tuning actions; part of a best-practices series rather than generic guidance.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/performance-guidelines-best-practices-collect-baseline?view=azuresql,Collect baseline,Collect Baseline: Performance Best Practices & Guidelines - SQL Server on Azure VMs,Collect performance baselines for SQL Server on Azure VMs,Provides steps to collect a performance baseline as guidelines to optimize the performance of your SQL Server on Azure Virtual Machine (VM).,"Applies to:SQL Server on Azure VM This article provides information to collect a performance baseline as a series of best practices and guidelines to optimize performance for your SQL Server on Azure Virtual Machines (VMs). Typically, there's a trade-off between optimizing for costs and optimizing for performance. This performance best practices series focuses on getting the best performance for SQL Server on Azure Virtual Machines.
If your workload is less demanding, you might not require every r",2026-04-03T22:38:00.000Z,best-practice,best-practices,0.75,True,"Step-by-step, product-specific guidance on collecting performance baselines for SQL Server on Azure VMs; actionable best-practice procedures rather than conceptual performance theory.",unchanged -https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/performance-guidelines-best-practices-storage?view=azuresql,Storage,Storage: Performance Best Practices & Guidelines - SQL Server on Azure VMs,Optimize Azure VM storage for SQL Server performance,Provides storage best practices and guidelines to optimize the performance of your SQL Server on Azure Virtual Machines (VM).,"Applies to:SQL Server on Azure VM This article provides storage best practices and guidelines to optimize performance for your SQL Server on Azure Virtual Machines (VMs). Learn how to select the right disk types, configure storage pools, and implement caching strategies to maximize your database performance. There's typically a trade-off between optimizing for costs and optimizing for performance. 
This performance best practices series focuses on getting the best performance for SQL Server on Azur",2026-04-01T17:39:00.000Z,best-practice,best-practices,0.85,True,"Storage-focused performance best practices for SQL Server on Azure VMs, including product-specific recommendations on disk types, storage pools, and caching strategies that go beyond generic advice.",unchanged +https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/performance-guidelines-best-practices-storage?view=azuresql,Storage,Storage: Performance Best Practices & Guidelines - SQL Server on Azure VMs,Optimize Azure VM storage for SQL Server performance,Provides storage best practices and guidelines to optimize the performance of your SQL Server on Azure Virtual Machines (VM).,"Applies to:SQL Server on Azure VM This article provides storage best practices and guidelines to optimize performance for your SQL Server on Azure Virtual Machines (VMs). Learn how to select the right disk types, configure storage pools, and implement caching strategies to maximize your database performance. There's typically a trade-off between optimizing for costs and optimizing for performance. This performance best practices series focuses on getting the best performance for SQL Server on Azur",2026-04-23T22:39:00.000Z,best-practice,best-practices,0.78,True,"The page provides product-specific performance best practices for SQL Server on Azure VMs, including concrete recommendations on disk type selection, storage pool configuration, and caching strategies tailored to Azure storage.
These are actionable DO/DON'T guidelines unique to this environment rather than generic database advice, fitting the best-practices category.",updated https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/performance-guidelines-best-practices-vm-size?view=azuresql,VM size,VM size: Performance best practices & guidelines - SQL Server on Azure VMs,Select and tune VM sizes for SQL Server on Azure VMs,Provides VM size guidelines and best practices to optimize the performance of your SQL Server on Azure Virtual Machine (VM).,"Applies to:SQL Server on Azure VM This article provides VM size guidance and best practices to optimize performance for your SQL Server on Azure Virtual Machines (VMs). There's typically a trade-off between optimizing for costs and optimizing for performance. This performance best practices series focuses on getting the best performance for SQL Server on Azure Virtual Machines. If your workload is less demanding, you might not need every recommended optimization. Consider your performance needs, c",2026-02-12T23:35:00.000Z,best-practice,best-practices,0.8,True,Provides VM size guidance and performance best practices specific to SQL Server workloads on Azure VMs.,unchanged https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/pricing-guidance?view=azuresql,Pricing,Price Guidance & Managing Costs - SQL Server on Azure VMs,Choose cost‑effective pricing for SQL Server on Azure VMs,Provides best practices for choosing the right SQL Server virtual machine pricing model.,"Applies to:SQL Server on Azure VM This article provides pricing guidance for SQL Server on Azure Virtual Machines. There are several options that affect cost, and it's important to pick the right image that balances costs with business requirements. Tip If you only need to find out a cost estimate for a specific combination of SQL Server edition and virtual machine (VM) size, see the pricing page for Windows or Linux.
Select your platform and SQL Server edition from the OS/Software list. -Or use the pri",2026-04-15T17:35:00.000Z,best-practice,decision-making,0.7,True,"Pricing guidance article focused on selecting the right SQL Server VM pricing model. Such pages typically include comparison of licensing options (PAYG vs Azure Hybrid Benefit), VM sizes, and edition choices with concrete cost/feature trade-offs and scenario-based recommendations, which fits the decision-making category.",updated +Or use the pri",2026-04-15T17:35:00.000Z,best-practice,decision-making,0.7,True,"Pricing guidance article focused on selecting the right SQL Server VM pricing model. Such pages typically include comparison of licensing options (PAYG vs Azure Hybrid Benefit), VM sizes, and edition choices with concrete cost/feature trade-offs and scenario-based recommendations, which fits the decision-making category.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/security-considerations-best-practices?view=azuresql,Security,Security: Best Practices - SQL Server on Azure VMs,Harden SQL Server on Azure VMs with security best practices,This article provides general guidance for securing SQL Server running in an Azure virtual machine.,"Applies to:SQL Server on Azure VM This article includes overall security guidelines that help establish secure access to SQL Server instances in an Azure virtual machine (VM). Azure complies with several industry regulations and standards that can enable you to build a compliant solution with SQL Server running in a virtual machine. For information about regulatory compliance with Azure, see Azure Trust Center.
First review the security best practices for SQL Server and Azure VMs, and then review th",2025-12-05T08:00:00.000Z,best-practice,security,0.75,True,"Security-focused guidance for SQL VMs, likely including Azure-specific network, identity, and SQL configuration details.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/servicing-updates-guidelines?view=azuresql,Updating SQL Server,Updating SQL Server - SQL Server on Azure VMs,,"Learn the supported methods to keep SQL Server up to date for SQL Server on Azure VMs, including Azure Update Manager, Automated Patching, custom images, winget for client tools, and post-deployment a",Applies to:SQL Server on Azure VM This article provides an overview of the different methods you can use to keep SQL Server up to date for SQL Server on Azure virtual machines (VMs).,2026-04-03T22:38:00.000Z,best-practice,,0.4,False,"Overview of supported update methods (Azure Update Manager, automated patching, etc.) for SQL Server on Azure VMs; primarily conceptual without detailed configuration matrices, limits, or error-resolution mappings.",unchanged https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/sql-agent-extension-automatic-registration-all-vms?view=azuresql,Automatic registration,Automatic Registration with SQL IaaS Agent Extension - SQL Server on Azure VMs,Enable automatic SQL IaaS Agent extension registration for all SQL VMs,Learn how to enable the automatic registration feature to automatically register all past and future SQL Server VMs with the SQL IaaS Agent extension using the Azure portal.,"Applies to:SQL Server on Azure VM By default, Azure VMs with SQL Server 2016 or later are automatically registered with the SQL IaaS Agent extension when detected by the CEIP service. You can enable the automatic registration feature for your subscription to easily and automatically register any SQL Server VMs not picked up by the CEIP service, such as older versions of SQL Server.
This article teaches you to enable the automatic registration feature. Alternatively, you can register a single VM, or r",2025-09-26T22:35:00.000Z,how-to,configuration,0.75,True,"Covers enabling automatic registration at subscription level, including extension settings and behavior.",unchanged diff --git a/products/azure-sql-virtual-machines/report.md b/products/azure-sql-virtual-machines/report.md index 23a19ce9..b6278331 100644 --- a/products/azure-sql-virtual-machines/report.md +++ b/products/azure-sql-virtual-machines/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: decision-making: Guidance for choosing Azure SQL vs SQL VMs, comparing pricing/licensing, selecting HADR options, and planning migrations (Db2, Oracle, SQL 2014) and regional
+ Azure Virtual Machines (use azure-virtual-machines), Azure Data Science Virtual + Machines (use azure-data-science-vm). use_when: Use when choosing Azure SQL vs SQL VMs, configuring Always On/FCI, tuning storage/VM sizing, or securing with Key Vault/Entra, and other SQL Server on Azure Virtual Machines related development tasks. confusable_not_for: Not for Azure SQL Database (use azure-sql-database), Azure SQL Managed Instance (use azure-sql-managed-instance), Azure Virtual Machines (use azure-virtual-machines), - Azure Database Migration service (use azure-database-migration). + Azure Data Science Virtual Machines (use azure-data-science-vm). --- # SQL Server on Azure Virtual Machines Crawl Report @@ -55,8 +55,8 @@ confusable_not_for: Not for Azure SQL Database (use azure-sql-database), Azure S ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 2 -- **Unchanged**: 126 +- **Updated Pages**: 1 +- **Unchanged**: 127 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-sql-virtual-machines/azure-sql-virtual-machines.csv` @@ -79,10 +79,8 @@ confusable_not_for: Not for Azure SQL Database (use azure-sql-database), Azure S ### Updated Pages -- [What's new?](https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/doc-changes-updates-release-notes-whats-new?view=azuresql) - - Updated: 2026-04-01T08:00:00.000Z → 2026-04-17T17:37:00.000Z -- [Pricing](https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/pricing-guidance?view=azuresql) - - Updated: 2025-11-18T08:00:00.000Z → 2026-04-15T17:35:00.000Z +- [Storage](https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/performance-guidelines-best-practices-storage?view=azuresql) + - Updated: 2026-04-01T17:39:00.000Z → 2026-04-23T22:39:00.000Z ## Classified Pages @@ -94,7 +92,6 @@ confusable_not_for: Not for Azure SQL Database (use azure-sql-database), Azure S | [Transaction log errors in Azure SQL Managed 
Instance](https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/troubleshoot-transaction-log-errors-issues?view=azuresql-mi) | troubleshooting | 0.86 | The page targets Azure SQL Managed Instance and addresses specific transaction log full errors (9002, 40552), explaining their causes and how to resolve them in this particular platform. It follows a symptom → cause → solution structure with platform-specific guidance, which fits the troubleshooting sub-skill and contains expert operational knowledge. | | [Connectivity errors](https://learn.microsoft.com/en-us/azure/azure-sql/database/troubleshoot-common-errors-issues?view=azuresql) | troubleshooting | 0.85 | Explicitly about connection errors; includes error codes/messages and stepwise diagnosis and resolution for Azure SQL and Fabric SQL. | | [Known issues and troubleshooting](https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/sql-agent-extension-troubleshoot-known-issues?view=azuresql) | troubleshooting | 0.85 | Explicitly a troubleshooting article with known issues and resolutions for the SQL IaaS Agent extension. | -| [Storage](https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/performance-guidelines-best-practices-storage?view=azuresql) | best-practices | 0.85 | Storage-focused performance best practices for SQL Server on Azure VMs, including product-specific recommendations on disk types, storage pools, and caching strategies that go beyond generic advice. | | [Capacity errors during deployment](https://learn.microsoft.com/en-us/azure/azure-sql/capacity-errors-troubleshoot?view=azuresql) | troubleshooting | 0.80 | Guides diagnosis and resolution of capacity errors with specific error messages and recommended actions for SQL Database and Managed Instance. 
| | [Common connection issues](https://learn.microsoft.com/en-us/azure/azure-sql/database/troubleshoot-common-connectivity-issues?view=azuresql) | troubleshooting | 0.80 | The article explicitly focuses on preventing, diagnosing, and mitigating connection and transient errors, and will include specific error patterns, connection string options, retry logic settings, and symptom→cause→solution guidance unique to Azure SQL, which fits the troubleshooting category. | | [EKM with AKV using managed identities](https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/managed-identity-extensible-key-management?view=azuresql) | security | 0.80 | Combines managed identities, TDE EKM, and Key Vault; involves specific security configuration parameters and role assignments unique to this integration. | @@ -106,6 +103,7 @@ confusable_not_for: Not for Azure SQL Database (use azure-sql-database), Azure S | [With the Azure portal](https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/manage-sql-vm-portal?view=azuresql) | configuration | 0.80 | Explains SQL VM management settings exposed via the SQL virtual machines resource, including SQL-specific configuration options. | | [Distributed network name (DNN)](https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/failover-cluster-instance-distributed-network-name-dnn-configure?view=azuresql) | configuration | 0.78 | DNN setup for FCI requires specific cluster resource names and Azure networking parameters that are product-specific configuration details. | | [SSL root certificate expiring](https://learn.microsoft.com/en-us/azure/azure-sql/updates/ssl-root-certificate-expiring?view=azuresql) | security | 0.78 | Details certificate authority changes and required client configuration updates to maintain secure connections, which are product-specific security settings. 
| +| [Storage](https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/performance-guidelines-best-practices-storage?view=azuresql) | best-practices | 0.78 | The page provides product-specific performance best practices for SQL Server on Azure VMs, including concrete recommendations on disk type selection, storage pool configuration, and caching strategies tailored to Azure storage. These are actionable DO/DON'T guidelines unique to this environment rather than generic database advice, fitting the best-practices category. | | [Virtual network name (VNN)](https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/failover-cluster-instance-vnn-azure-load-balancer-configure?view=azuresql) | configuration | 0.78 | Load balancer configuration for FCI VNN typically includes specific port numbers, probe settings, and Azure resource parameters unique to this scenario. | | [Automatic registration](https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/sql-agent-extension-automatic-registration-all-vms?view=azuresql) | configuration | 0.75 | Covers enabling automatic registration at subscription level, including extension settings and behavior. | | [Bulk register multiple VMs](https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/sql-agent-extension-manually-register-vms-bulk?view=azuresql) | configuration | 0.75 | Uses a specific PowerShell cmdlet and describes bulk registration behavior and constraints for SQL VMs. 
| diff --git a/products/azure-sre-agent/azure-sre-agent.csv b/products/azure-sre-agent/azure-sre-agent.csv index 8c3808b0..77093dd5 100644 --- a/products/azure-sre-agent/azure-sre-agent.csv +++ b/products/azure-sre-agent/azure-sre-agent.csv @@ -9,31 +9,31 @@ https://learn.microsoft.com/en-us/azure/sre-agent/agent-reasoning,Agent reasonin https://learn.microsoft.com/en-us/azure/sre-agent/anthropic-sub-processor,Anthropic subprocessor,Anthropic as a subprocessor in Azure SRE Agent,,"Learn how Anthropic operates as a Microsoft subprocessor in Azure SRE Agent, including model selection, and data residency controls.","Azure SRE Agent supports multiple AI model providers for investigations, incident response, and operational automation. Anthropic is one of the available providers and operates as a Microsoft subprocessor. Anthropic operates under Microsoft's oversight with contractual safeguards and technical and organizational measures in place. The Microsoft Product Terms and Microsoft Data Protection Addendum (DPA) apply when you use Anthropic models through Azure SRE Agent. For more information about subprocess",2026-04-07T22:12:00.000Z,concept-article,,0.3,False,"Describes Anthropic as a subprocessor, contractual terms, and data residency controls at a high level; primarily legal/compliance overview without concrete security configuration parameters or numeric thresholds.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/audit-agent-actions,Audit agent actions,Audit Agent Actions in Azure SRE Agent,Query Azure SRE Agent telemetry and actions with KQL,"Query your agent's actions, tool calls, and incident outcomes using Application Insights telemetry and KQL.","Your agent logs every action, including tool calls, model invocations, incident handling, and approval decisions, to your Application Insights resource. To see exactly what your agent did, when, and why, query the customEvents table by using Kusto Query Language (KQL).
Access logs directly from Monitor > Logs in the agent portal.",2026-03-27T15:55:00.000Z,reference,troubleshooting,0.6,True,"Describes querying customEvents in Application Insights with KQL to see agent actions; likely includes specific event names, properties, and query patterns unique to SRE Agent, which are troubleshooting/diagnostic details.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/automate-incidents,Automate incident response,Tutorial: Automate Incident Response in Azure SRE Agent,,"Connect Azure Monitor, create response plans, and let your agent investigate and resolve incidents autonomously from detection to fix.","Estimated time: 10 minutes Connect your incident platform and let your agent handle alerts automatically. The system handles alerts from detection to diagnosis to fix, all without you typing a single message.",2026-03-27T15:55:00.000Z,tutorial,,0.3,False,"Tutorial for automating incident response; likely procedural steps, not a reference of specific parameters, limits, or error codes.",unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/automate-workflows,Automate workflows,Automate workflows in Azure SRE Agent,,"Schedule recurring health checks, connect notification tools, and build automated workflows with connectors and subagents.","Estimated time: 10 minutes Your team probably has recurring tasks, such as checking service health every morning, reviewing overnight alerts, verifying certificate expirations, or posting weekly capacity reports.
In this tutorial, you connect your tools, build a subagent, and schedule a task that runs automatically.",2026-03-27T15:55:00.000Z,tutorial,,0.3,False,Workflow automation tutorial; describes connecting tools and scheduling tasks without clear evidence of detailed configuration tables.,unchanged +https://learn.microsoft.com/en-us/azure/sre-agent/automate-workflows,Automate workflows,Step 5: Automate workflows in Azure SRE Agent,,"Schedule recurring health checks, connect notification tools, and build automated workflows with connectors and subagents.","Your team probably has recurring tasks like checking service health every morning, reviewing overnight alerts, verifying certificate expirations, or posting weekly capacity reports. Connect your tools, build a workflow, and let the agent run it on a schedule.",2026-04-24T18:40:00.000Z,tutorial,,0.3,False,"Covers automating workflows and scheduling tasks conceptually; summary does not indicate detailed configuration parameter tables, limits, or error-code-based troubleshooting.",updated https://learn.microsoft.com/en-us/azure/sre-agent/azure-devops-connector,Set up Azure DevOps connector,Tutorial: Set Up an Azure DevOps Connector in Azure SRE Agent,,"Connect your agent to Azure DevOps for repository access, work item management, and wiki documentation using OAuth or PAT authentication.","In this tutorial, you connect your agent to Azure DevOps so it can access repositories, wikis, and documentation during investigations. Choose OAuth for automatic token management or PAT for service account scenarios. When you finish this tutorial, your agent has authenticated access to an Azure DevOps organization and can read repositories, create work items, and correlate code changes with incidents. 
Estimated time: 5 minutes In this tutorial, you: Note Your agent references your Azure DevOps ",2026-03-27T15:55:00.000Z,tutorial,,0.3,False,"Tutorial-style connector setup; description suggests step-by-step OAuth/PAT connection but no indication of detailed config parameter tables, limits, or error-code-based troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/azure-devops-wiki-knowledge,Azure DevOps Wiki knowledge,Azure DevOps Wiki Knowledge in Azure SRE Agent,Connect Azure DevOps wikis as knowledge sources for SRE Agent,"Connect your Azure DevOps wikis as knowledge sources so your agent references runbooks, procedures, and documentation during investigations.","Connect your Azure DevOps wikis so your agent references your team's runbooks and procedures during investigations. Wiki content is indexed and searchable, and your agent finds the right page automatically. The connector supports both managed identity and personal access token (PAT) authentication.",2026-03-10T04:29:00.000Z,conceptual,integrations,0.7,True,Explains connecting Azure DevOps wikis with support for managed identity and PAT; this is a concrete integration with authentication configuration details.,unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/azure-monitor-alerts,Azure Monitor alerts,Azure Monitor Alerts,,"Connect Azure Monitor to your agent with zero credentials so alerts are detected, acknowledged, and investigated automatically.","Connect Azure Monitor through the agent's managed identity—no API keys or credentials needed. The scanner detects new alerts every minute, acknowledges them, and creates investigation threads automatically. Tip",2026-04-07T22:12:00.000Z,how-to,,0.4,False,"Explains connecting Azure Monitor via managed identity and that a scanner detects alerts every minute. 
While it mentions a one-minute detection interval, the page is primarily an integration/behavior overview without broader limits tables or configuration matrices, so it does not meet the stronger expert-knowledge criteria.",unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/chat-from-your-tools,Chat from your tools,Chat from Your Tools in Azure SRE Agent,Use Azure SRE Agent from Microsoft Teams,Interact with your agent directly in Microsoft Teams without leaving your current workflow.,"Chat with your agent directly in Microsoft Teams. Send direct messages for private investigations, or @mention your agent in channels so your whole team sees the analysis. Every channel connects to the same agent with the same reasoning, memory, and tool access.",2026-03-10T04:29:00.000Z,conceptual,integrations,0.6,True,Describes chatting with the agent directly in Teams; likely includes product-specific configuration or usage patterns for this integration.,unchanged +https://learn.microsoft.com/en-us/azure/sre-agent/azure-monitor-alerts,Azure Monitor alerts,Azure Monitor Alerts in Azure SRE Agent,Integrate Azure Monitor alerts with Azure SRE Agent,"Learn how Azure SRE Agent connects to Azure Monitor to detect, acknowledge, and investigate alerts automatically with zero credentials.","When an Azure Monitor alert fires, you open the Azure portal to read the alert details. Then you switch to Log Analytics to check for errors. Then Application Insights for exceptions. Then deployment history to see if someone pushed a change. Each tool switch costs time and breaks your focus. When the same alert rule fires repeatedly during an ongoing issue, such as a CPU threshold breached every five minutes, each firing demands separate attention. By morning, you handled the same underlying pr",2026-04-25T06:18:00.000Z,concept-article,integrations,0.7,True,Explains how SRE Agent connects to Azure Monitor alerts with zero credentials and automates investigation across Azure tools.
This is a product-specific integration pattern with unique behavior around alert handling.,updated https://learn.microsoft.com/en-us/azure/sre-agent/code-interpreter,Code interpreter,Run code with code interpreter in Azure SRE Agent,Use SRE Agent code interpreter for Python and shell,"Learn how to use code interpreter to execute Python code, run shell commands, and generate reports in an isolated sandbox environment.","The SRE Agent code interpreter enables you to execute Python code and shell commands in a secure, isolated sandbox environment. Use Code Interpreter to analyze data, create visualizations, generate PDF reports, and automate file operations without leaving your SRE Agent conversation. In this article, you learn how to:",2026-01-27T18:10:00.000Z,how-to,configuration,0.65,True,"Describes executing Python and shell commands in a sandbox; likely includes product-specific capabilities, constraints, and parameters for the interpreter environment.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/complete-setup,Complete setup,Complete setup for Azure SRE Agent,,Finish connecting data sources you skipped during onboarding. See what's missing and configure each source from the setup page.,"If you skip any data sources during onboarding, you can add them later. 
The agent shows you what's missing and provides a direct path to configure each source.",2026-03-27T15:55:00.000Z,how-to,,0.2,False,Describes completing setup and connecting data sources; appears to be workflow guidance rather than detailed configuration reference.,unchanged https://learn.microsoft.com/en-us/azure/sre-agent/connect-ado-repo-managed-identity,Connect ADO repo with managed identity,Tutorial: Connect an ADO Repository with Managed Identity in Azure SRE Agent,Configure managed identity access to ADO repos in SRE Agent,Connect an Azure DevOps repository to your agent using managed identity authentication.,Connect an Azure DevOps repository to your agent using managed identity so you don't need to create or rotate PATs. Your agent uses its own Azure identity to access ADO repos for code-aware investigations. Time: ~10 minutes (including ADO admin setup),2026-04-09T06:11:00.000Z,tutorial,security,0.65,True,"Connects Azure DevOps repositories using managed identity; likely includes specific identity/permission configuration steps (service principal or managed identity setup, scopes, and ADO permissions) that are product-specific security configuration details.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/connect-devops-wiki,Connect Azure DevOps Wiki,Tutorial: Connect an Azure DevOps Wiki to Azure SRE Agent,Connect Azure DevOps wiki as SRE Agent knowledge source,Connect an Azure DevOps wiki to your Azure SRE Agent so the agent can reference team runbooks and procedures during investigations.,"In this tutorial, you connect an Azure DevOps wiki as a knowledge source for your Azure SRE Agent. After you complete these steps, your agent can search your team's wiki for runbooks and procedures when it answers questions during investigations. 
In this tutorial, you learn how to: Estimated time: 10 minutes",2026-03-10T04:29:00.000Z,tutorial,integrations,0.75,True,"Connecting a DevOps wiki as a knowledge source involves specific connector settings, permissions, and indexing behavior unique to this integration.",unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/connect-knowledge,Connect knowledge,Connect knowledge in Azure SRE Agent,,"Give your agent access to runbooks, documentation, source code, and web resources to improve incident investigation accuracy.","Your agent comes with built-in Azure observability, but every team has unique context: runbooks, architecture docs, internal wikis, and code repositories. By using the knowledge base, you can manage all these knowledge sources in one place so your agent can reference them during investigations. Tip Key takeaways",2026-04-07T17:12:00.000Z,conceptual,,0.3,False,"Conceptual guidance on connecting knowledge sources (runbooks, docs, code, web). Summary does not indicate detailed configuration options, limits, or security roles; likely an overview of the knowledge base feature.",unchanged +https://learn.microsoft.com/en-us/azure/sre-agent/connect-knowledge,Connect knowledge,Connect knowledge in Azure SRE Agent,Connect runbooks and knowledge sources to SRE Agent,"Give your agent access to runbooks, documentation, source code, and web resources to improve incident investigation accuracy.",Tip,2026-04-25T06:18:00.000Z,conceptual,integrations,0.6,True,"Page is about connecting runbooks, docs, source code, and web resources as knowledge inputs. 
This is a product-specific integration pattern for external knowledge sources, likely with configuration options and constraints.",updated https://learn.microsoft.com/en-us/azure/sre-agent/connect-pagerduty,Connect to PagerDuty,Tutorial: Connect to PagerDuty in Azure SRE Agent,Configure PagerDuty integration for Azure SRE Agent,Integrate your agent with PagerDuty to diagnose and respond to incidents automatically.,"In this tutorial, you connect your PagerDuty account to Azure SRE Agent so the agent can receive, investigate, and respond to incidents automatically.",2026-03-10T04:29:00.000Z,tutorial,integrations,0.7,True,"Product-specific integration tutorial for connecting PagerDuty so the agent can receive and respond to incidents. Likely includes connector settings, authentication details, and required configuration parameters unique to this integration.",unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/connect-servicenow,Connect to ServiceNow,Tutorial: Connect to ServiceNow in Azure SRE Agent,Connect Azure SRE Agent to ServiceNow for incident management,Configure ServiceNow as your incident platform using basic authentication or OAuth 2.0 for automated incident management.,"Connect your ServiceNow instance to receive and manage incidents automatically. Choose between Basic Authentication (username/password) or OAuth 2.0 (token-based, recommended for production). Time: 15 minutes",2026-04-03T21:52:00.000Z,tutorial,integrations,0.7,True,"ServiceNow connection tutorial with basic auth and OAuth 2.0 will include endpoint URLs, credential fields, and configuration parameters specific to this integration.",unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/connectors,Connectors,Connectors in Azure SRE Agent,,"Extend your agent's capabilities to external data sources, collaboration tools, and custom APIs using connectors.","Your agent comes with built-in access to Azure services.
It can query Azure Monitor, Application Insights, Log Analytics, and Azure Resource Graph out of the box. Connectors extend that reach to external systems: your Kusto clusters, source code repositories, collaboration tools, and custom APIs. Note Connectors vs. incident platforms Connectors give your agent access to data and actions, such as querying logs, sending notifications, and reading code. Incident platforms are a separate concept: they",2026-04-07T17:12:00.000Z,concept-article,,0.3,False,"Describes connectors conceptually (extending to external data sources, collaboration tools, APIs). Summary suggests high-level explanation rather than detailed connector configuration tables or SDK parameters.",unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/create-and-set-up,Create and set up,Create and set up Azure SRE Agent,,"Deploy an Azure SRE Agent using the onboarding wizard, connect your GitHub repository, and grant Azure resource access.","Deploy an Azure SRE Agent, connect your code repository, and add Azure resource access.",2026-04-07T17:12:00.000Z,tutorial,,0.3,False,"Onboarding and setup wizard for Azure SRE Agent; likely a step-by-step deployment/connection guide without detailed config matrices, limits, or product-specific parameter tables.",unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/create-http-trigger,Create an HTTP trigger,Tutorial: Create an HTTP trigger in Azure SRE Agent,Create and use HTTP triggers for Azure SRE Agent,Set up an HTTP trigger in Azure SRE Agent that runs a compliance check when called from a CI/CD pipeline or any HTTP client.,"In this tutorial, you create an HTTP trigger that runs a compliance check on a container app. You test it from the portal and the command line, and then integrate it into a CI/CD pipeline.
Estimated time: 10 minutes In this tutorial, you:",2026-04-02T18:15:00.000Z,tutorial,integrations,0.6,True,"HTTP trigger tutorial likely documents request schema, headers, and payload parameters for invoking the agent from CI/CD or HTTP clients, which are integration details.",unchanged +https://learn.microsoft.com/en-us/azure/sre-agent/connect-servicenow,Connect to ServiceNow,Tutorial: Connect to ServiceNow in Azure SRE Agent,,Configure ServiceNow as your incident platform using basic authentication or OAuth 2.0 for automated incident management.,"In this tutorial, you connect your ServiceNow instance to Azure SRE Agent so the agent can receive, investigate, and update incidents automatically. You can choose between OAuth 2.0 (recommended for production) or basic authentication. Tip What you accomplish in this tutorial: Estimated time: 15 minutes",2026-04-21T22:10:00.000Z,tutorial,,0.4,False,Tutorial to connect ServiceNow using OAuth 2.0 or basic auth; appears to be a how-to guide without explicit expert-level configuration matrices or error-code-based troubleshooting.,updated +https://learn.microsoft.com/en-us/azure/sre-agent/connectors,Connectors,Connectors in Azure SRE Agent,Use connectors to extend Azure SRE Agent integrations,"Extend your agent's capabilities to external data sources, collaboration tools, and custom APIs using connectors.","Your agent comes with built-in access to Azure services. It can query Azure Monitor, Application Insights, Log Analytics, and Azure Resource Graph. Connectors extend that reach to external systems: your Kusto clusters, source code repositories, collaboration tools, and custom APIs. Note Connectors vs. 
incident platforms: Connectors give your agent access to data and actions - querying logs, sending notifications, reading code. Incident platforms are a separate concept: they control where alerts come",2026-04-25T06:18:00.000Z,concept-article,integrations,0.65,True,"Describes product-specific connectors to Kusto, repos, collaboration tools, and custom APIs. While summary is high-level, this page is about concrete integration mechanisms unique to Azure SRE Agent, likely including connector configuration details and parameters.",updated +https://learn.microsoft.com/en-us/azure/sre-agent/create-and-set-up,Create and set up,Create and Set Up Azure SRE Agent,,"Deploy Azure SRE Agent by using the onboarding wizard, connect your GitHub repository, and grant Azure resource access.","Azure SRE Agent is an AI solution that helps site reliability engineers (SREs) manage Azure cloud resources. This article shows you how to set up Azure SRE Agent and create an agent. In this tutorial, you learn how to:",2026-04-22T17:34:00.000Z,tutorial,,0.3,False,"Tutorial-style onboarding and setup for Azure SRE Agent; description suggests step-by-step wizard usage without exposing detailed configuration tables, limits, or product-specific troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/sre-agent/create-http-trigger,Create an HTTP trigger,Tutorial: Create an HTTP Trigger in Azure SRE Agent,,Set up an HTTP trigger in Azure SRE Agent that runs a compliance check when called from a CI/CD pipeline or any HTTP client.,"In this tutorial, you create an HTTP trigger that runs a compliance check on a container app. You test it from the portal and the command line, and then integrate it into a continuous integration and continuous delivery (CI/CD) pipeline.
Estimated time: 10 minutes In this tutorial, you:",2026-04-22T17:34:00.000Z,tutorial,,0.4,False,"Tutorial for creating an HTTP trigger and integrating it into CI/CD; summary suggests step-by-step usage, but no clear indication of parameter tables, limits, or error mappings that would qualify as expert knowledge.",updated https://learn.microsoft.com/en-us/azure/sre-agent/create-kusto-tool,Create a Kusto tool,Tutorial: Create a Kusto Tool in Azure SRE Agent,,Build a reusable Kusto query tool for your Azure SRE Agent using the portal UI to run deterministic KQL queries.,"In this tutorial, you create a parameterized Kusto tool that runs exact KQL queries with deterministic, repeatable results. When users ask questions like ""show me errors from the last seven days,"" the agent substitutes the parameter and runs your exact query against your Azure Data Explorer database. In this tutorial, you learn how to: Estimated time: 15 minutes",2026-03-27T15:55:00.000Z,tutorial,,0.3,False,"Kusto tool creation tutorial focused on portal UI steps and example queries; no evidence of detailed configuration option tables, limits, or error-code-based troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/create-manage-hooks-ui,Create and manage hooks in the portal,Tutorial: Create and manage hooks in Azure SRE Agent,Create and manage governance hooks in Azure SRE Agent,Create governance hooks in the Azure SRE Agent portal to validate agent responses and audit tool usage with Stop and PostToolUse hooks.,"In this tutorial, you create two governance hooks by using the portal: a Stop hook that enforces data formatting (markdown tables with bold headers) and a PostToolUse hook that blocks dangerous shell commands. You configure hooks at both the agent level (applies to all threads and subagents) and the subagent level (applies to one specific subagent).
Estimated time: 10 minutes In this tutorial, you learn how to:",2026-03-27T15:55:00.000Z,tutorial,configuration,0.65,True,"Portal-based hook configuration (Stop, PostToolUse) with examples like blocking dangerous shell commands implies specific hook types and settings, which are configuration details.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/create-python-tool,Create a Python tool,Tutorial: Create a Python Tool in Azure SRE Agent,Build and deploy a Python SLA calculator tool in SRE Agent,Build an SLA calculator tool for your Azure SRE Agent using AI-generated Python code.,"In this tutorial, you build a working Python tool that calculates SLA compliance for your Azure SRE Agent. You describe the tool's purpose in plain English, let AI generate the code, test the result, and deploy the tool for your agent to use. In this tutorial, you learn how to: Estimated time: 10 minutes",2026-03-27T15:55:00.000Z,tutorial,integrations,0.6,True,"Python tool creation tutorial will show tool definition, parameters, and deployment specifics unique to SRE Agent’s Python tool system, which is an integration/coding pattern.",unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/create-scheduled-task,Create scheduled tasks,Tutorial: Create and Edit Scheduled Tasks in Azure SRE Agent,,Set up a recurring automated task and learn how to modify it.,"In this tutorial, you set up your agent to run tasks automatically on a schedule (from daily health checks to weekly reports) then modify them as your needs change. 
Estimated time: 5 minutes In this tutorial, you:",2026-04-07T17:12:00.000Z,tutorial,,0.3,False,"Tutorial for creating and editing scheduled tasks; likely step-by-step usage instructions without configuration parameter matrices, quotas, or troubleshooting mappings.",unchanged +https://learn.microsoft.com/en-us/azure/sre-agent/create-scheduled-task,Create scheduled tasks,Tutorial: Create and edit scheduled tasks in Azure SRE Agent,,Set up a recurring automated task and learn how to modify it.,"Scheduled tasks let your agent run automated checks on a recurring basis without manual intervention. You define what the agent should do, set a frequency, and the agent executes the task on schedule. Each run produces a conversation thread with full details on what the agent found and any actions it took. In this tutorial, you create a scheduled task, verify it runs successfully, and then edit the task to adjust its configuration.",2026-04-24T18:40:00.000Z,tutorial,,0.3,False,"Tutorial for creating and editing scheduled tasks; summary focuses on basic setup and editing, not on detailed configuration options, limits, or troubleshooting content.",updated https://learn.microsoft.com/en-us/azure/sre-agent/create-skill,Create a skill,Tutorial: Create a skill in Azure SRE Agent,Create custom skills with tools and files in Azure SRE Agent,"Build a custom skill with instructions, tools, and supporting files that your agent uses automatically when relevant.","In this tutorial, you create a custom skill that adds domain knowledge and task playbooks to your agent. Skills are modular capabilities your agent loads automatically when relevant, such as troubleshooting a specific service or running a diagnostic procedure. In this tutorial, you learn how to: Estimated time: 10 minutes Tip Skills and knowledge documents work together. A skill teaches your agent how to do something (procedures, playbooks, step-by-step instructions).
A knowledge document teaches your",2026-03-27T15:55:00.000Z,tutorial,configuration,0.6,True,"Skill creation involves specifying instructions, tools, and supporting files; page likely includes schema/fields for skill configuration unique to SRE Agent.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/create-subagent,Create a subagent,Tutorial: Create a subagent in Azure SRE Agent,Configure specialized subagents in Azure SRE Agent,"Learn how to create a specialized subagent with custom instructions, tools, skills, and hooks in the Azure SRE Agent subagent builder.","In this tutorial, you create a specialized subagent in the subagent builder with its own instructions, tools, and skills. Subagents handle focused tasks like health reporting, alert triage, or notification delivery. For more information about how subagents work, see Subagents. Estimated time: 5 minutes In this tutorial, you learn how to:",2026-03-27T15:55:00.000Z,tutorial,configuration,0.6,True,"Subagent builder involves defining instructions, tools, skills, and hooks; page likely documents configuration fields and options for subagents.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/cross-account-ado-oauth,Cross-account ADO access,Cross-Account Azure DevOps Access in Azure SRE Agent,Configure cross-tenant Azure DevOps access for SRE Agent,Learn how to connect to Azure DevOps organizations across tenants in Azure SRE Agent.,"Connect to Azure DevOps organizations in different tenants by signing in with another account. No service principals, PATs, or admin approval needed.
Tip",2026-04-03T21:52:00.000Z,concept-article,integrations,0.65,True,Cross-account ADO access across tenants; likely includes specific steps and parameters for authentication and tenant selection unique to this product.,unchanged https://learn.microsoft.com/en-us/azure/sre-agent/cross-account-azdo-oauth-authorization,Cross-account ADO access,Tutorial: Set Up Cross-Account Azure DevOps Access in Azure SRE Agent,,Connect your agent to Azure DevOps repositories in a different tenant using cross-account OAuth.,"Connect your agent to an Azure DevOps organization in a different Microsoft Entra ID tenant. Your agent gains access to repositories, wikis, and code in the cross-tenant organization by using your Microsoft identity through a browser sign-in popup. Time: ~5 minutes",2026-04-03T21:52:00.000Z,tutorial,,0.3,False,"Cross-tenant Azure DevOps access tutorial; description suggests sign-in flow guidance rather than detailed security role mappings, limits, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/data-privacy,Data privacy and residency,Data residency and privacy in Azure SRE Agent,,Learn how Azure SRE Agent handles your data.,"This article explains how the SRE Agent handles your data, including where it stores your data, how it processes your data, and the privacy measures it uses to protect your information.",2026-04-06T17:23:00.000Z,tutorial,,0.3,False,"Data residency and privacy overview; typically describes where and how data is stored/processed and general privacy measures, but not concrete RBAC roles, config parameters, or numeric limits.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/deep-investigation,Deep investigation,Use deep investigation in Azure SRE Agent,Decide when to use deep investigation in Azure SRE Agent,Use a hypothesis-driven approach to explore multiple potential root causes before acting on mitigation steps.,"Deep investigation gives you greater transparency and accuracy when 
diagnosing complex problems in the SRE Agent. Unlike standard queries that provide quick insights, deep investigation uses a hypothesis-driven approach so you can explore multiple potential root causes before you decide on mitigation steps. Use deep investigation when: For simple queries, standard investigation is often all you need. However, when you encounter cases where you suspect you need a structured, multi-path analysis t",2026-03-27T15:55:00.000Z,tutorial,decision-making,0.6,True,Explicitly contrasts deep vs standard investigation and lists scenarios for use; provides decision guidance on when to choose each investigation mode.,unchanged +https://learn.microsoft.com/en-us/azure/sre-agent/default-agent-override-info,Default agent override info,Default Agent Override Info in Azure SRE Agent,Configure default and custom agent tool overrides,Learn how tool and skill settings apply to the default agent and how custom agents inherit or override them in Azure SRE Agent.,"When you toggle tools or skills on the settings pages under Capabilities, it wasn't clear which agent your changes affected. The descriptions said what tools and skills do, but not where those settings apply. If you had custom agents with their own tool overrides, changes on the settings page had no effect on those agents. There was no indication of this limitation at the point where you made changes. This limitation led to confusion: ""I disabled this tool, but my custom agent is still using it."" T",2026-04-25T06:18:00.000Z,concept-article,configuration,0.7,True,"Discusses how tool and skill settings apply to the default agent versus custom agents and how overrides work. 
This is nuanced, product-specific configuration behavior that clarifies which settings affect which agents.",new https://learn.microsoft.com/en-us/azure/sre-agent/diagnose-azure-observability,Diagnose with Azure observability,Diagnose with Azure Observability in Azure SRE Agent,,"Learn how your agent queries Application Insights, Log Analytics, Azure Monitor metrics, Activity Logs, Resource Graph, and resource-specific diagnostics automatically without connectors.","Your agent queries Application Insights, Log Analytics, Azure Monitor metrics, Resource Graph, Activity Logs, and resource-specific diagnostics in a single investigation. No connectors are required because everything works through managed identity and Azure RBAC. Your agent decides which sources to query based on the symptom, correlates evidence across them, and explains the findings. Tip Key benefits of Azure observability diagnostics:",2026-04-01T08:00:00.000Z,concept-article,,0.3,False,"Describes how the agent automatically queries Azure observability services via managed identity and RBAC. 
Appears to be behavior/benefits rather than specific configuration parameters, error codes, or limits.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/diagnose-observability,Diagnose with external observability,Diagnose with External Observability in Azure SRE Agent,Integrate external observability tools with Azure SRE Agent,"Query Azure Monitor and external observability tools like Dynatrace, Datadog, and Splunk in a single investigation through MCP.",Tip,2026-03-27T15:55:00.000Z,feature-guide,integrations,0.6,True,"Covers querying Azure Monitor plus external tools (Dynatrace, Datadog, Splunk) via MCP; likely includes connector parameters and integration patterns.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/docsguide,Learn via Chat,Learn via Chat in Azure SRE Agent,,"Ask your agent about SRE Agent features, setup, and configuration to get accurate, cited answers from official documentation.","Your agent has a built-in documentation capability that answers questions about SRE Agent features, setup, and configuration directly in the conversation. The answers come from live official documentation, so they stay current as features evolve. 
Tip Quick summary",2026-03-27T15:55:00.000Z,conceptual,,0.2,False,"DocsGuide feature overview; no indication of detailed configuration parameters, limits, or troubleshooting mappings.",unchanged @@ -46,18 +46,18 @@ https://learn.microsoft.com/en-us/azure/sre-agent/file-attachments-tutorial,Shar https://learn.microsoft.com/en-us/azure/sre-agent/first-investigation,Run your first investigation,Run your first investigation with Azure SRE Agent,,"Ask your agent to investigate a live issue and watch it diagnose root causes using your code, Azure resources, and knowledge files.","Estimated time: 5 minutes Ask the agent to investigate an issue and watch it diagnose the root cause using your code, Azure resources, and the knowledge files it built during onboarding.",2026-03-27T15:55:00.000Z,tutorial,,0.2,False,"Quickstart-style tutorial for first investigation; no evidence of expert-only limits, configs, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/github-connector,GitHub connector,GitHub connector in Azure SRE Agent,Integrate GitHub with Azure SRE Agent via OAuth or PAT,"Connect GitHub repositories for source code analysis, issue management, and workflow automation with OAuth or PAT authentication.","Connect your GitHub repositories so your agent can read source code, search for errors, create issues, trigger workflows, and correlate deployments with incidents. Tip Quick overview",2026-03-27T15:55:00.000Z,conceptual,integrations,0.65,True,"Connector article for GitHub; likely includes auth modes (OAuth, PAT), scopes, and configuration parameters unique to this integration.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/global-tools-page,Tools and skills,Tools and Skills in Azure SRE Agent,,Learn how to manage tools and skills at the space level in Azure SRE Agent.,"See every tool and skill your agent has, including built-in, custom, and MCP tools plus system and custom skills, organized by category. 
Toggle capabilities on or off at the space level, and changes apply across all agents instantly. Agents created before March 10, 2026 require workspace tools to be enabled. For older agents, enable EnableWorkspaceTools in Capabilities > Experimental Settings. Tip",2026-04-07T06:20:00.000Z,concept-article,,0.2,False,"High-level description of viewing and toggling tools/skills at space level; no indication of numeric limits, config parameter tables, error codes, or other expert-only details.",unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/http-triggers,HTTP triggers,HTTP triggers in Azure SRE Agent,,"Learn how HTTP triggers in Azure SRE Agent let you invoke agent actions from CI/CD pipelines, alerting tools, and any HTTP client.","HTTP triggers in Azure SRE Agent are webhook endpoints that external systems use to invoke your agent on demand. When a CI/CD pipeline fails, an alerting tool detects an anomaly, or any HTTP client sends a POST request, the agent receives the event context and starts working immediately.",2026-04-02T18:15:00.000Z,concept-article,,0.2,False,"Describes HTTP triggers conceptually as webhooks; summary does not suggest detailed configuration parameters, limits, or troubleshooting content.",unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/incident-platforms,Incident platforms,Incident Platforms in Azure SRE Agent,,"Connect an incident platform to your agent so it can receive alerts, investigate issues, and take action automatically.","An incident platform is the system that tells your agent when something goes wrong. When you connect your incident platform, the agent receives alerts, investigates problems, and takes action automatically, without waiting for someone to start a chat. Without an incident platform, your agent is reactive: users ask questions and the agent investigates on demand. When you connect an incident platform, your agent becomes proactive. 
It picks up incidents the moment they fire and starts working immed",2026-03-27T15:55:00.000Z,concept-article,,0.4,False,Explains what an incident platform is and how it changes agent behavior; appears conceptual without detailed configuration or limits.,unchanged +https://learn.microsoft.com/en-us/azure/sre-agent/http-triggers,HTTP triggers,HTTP Triggers in Azure SRE Agent,,"Learn how HTTP triggers in Azure SRE Agent let you invoke agent actions from CI/CD pipelines, alerting tools, and any HTTP client.","HTTP triggers in Azure SRE Agent are webhook endpoints that external systems use to invoke your agent on demand. When a continuous integration and continuous delivery (CI/CD) pipeline fails, an alerting tool detects an anomaly, or any HTTP client sends a POST request, the agent receives the event context and starts working immediately.",2026-04-22T17:34:00.000Z,concept-article,,0.3,False,"Explains what HTTP triggers are and how they’re used; summary does not show specific config tables, limits, or error mappings that would qualify as expert knowledge.",updated +https://learn.microsoft.com/en-us/azure/sre-agent/incident-platforms,Incident platforms,Incident Platforms in Azure SRE Agent,Connect incident platforms to Azure SRE Agent,"Connect an incident platform to your agent so it can receive alerts, investigate issues, and take action automatically.","An incident platform is the system that tells your agent when something goes wrong. By connecting your incident platform, your agent can receive alerts, investigate issues, and take action automatically, without waiting for someone to start a chat. Without an incident platform, your agent is reactive: users ask questions and it investigates on demand. 
With one connected, your agent becomes proactive: it picks up incidents the moment they fire and starts working immediately.",2026-04-25T06:18:00.000Z,concept-article,integrations,0.65,True,Explains how to connect incident platforms so the agent can receive and act on alerts automatically. This is a product-specific integration pattern that likely includes configuration details for supported incident systems.,updated https://learn.microsoft.com/en-us/azure/sre-agent/incident-response,Incident response,Automate Incident Response in Azure SRE Agent,,"Learn how your agent monitors, investigates, and resolves incidents automatically which learns from every fix to improve over time.","Your agent monitors, investigates, and resolves incidents while you sleep. It learns from every fix to get smarter over time. Stop switching context at 3 AM. [!VIDEO /Automated_Incident_Response.mp4] Tip",2026-03-27T15:55:00.000Z,concept-article,,0.3,False,Marketing-style description of automated incident response with video; no clear indication of detailed technical configuration or limits.,unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/incident-response-plans,Incident response plans,Incident Response Plans in Azure SRE Agent,,"Learn how to route incidents to specialized custom agents with the right tools, investigation depth, and autonomy level automatically.","Response plans automatically route incidents to specialized custom agents with the right tools, investigation depth, and autonomy level. The right custom agent handles each incident type without human intervention. 
[!VIDEO /Automated_Incident_Response.mp4] Tip",2026-03-27T15:55:00.000Z,concept-article,,0.4,False,Conceptual explanation of response plans routing incidents; summary doesn’t show detailed configuration tables or numeric thresholds.,unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/install-plugin-from-marketplace,Install a Marketplace plugin,Tutorial: Install a Marketplace Plugin in Azure SRE Agent,Install and configure marketplace plugins for Azure SRE Agent,"Learn how to browse, install, and configure plugins from the marketplace in Azure SRE Agent to extend your agent with community-built skills.","The Plugin Marketplace lets you discover and install community-built skills and MCP integrations from curated GitHub-hosted catalogs. In this tutorial, you add a marketplace source, browse available plugins, import skills, and optionally configure MCP connectors—all from the portal. Estimated time: 5 minutes In this tutorial, you learn how to:",2026-03-27T15:55:00.000Z,tutorial,integrations,0.6,True,Marketplace plugin installation and configuration likely includes plugin manifest fields and connector settings specific to SRE Agent.,unchanged +https://learn.microsoft.com/en-us/azure/sre-agent/incident-response-plans,Incident response plans,Incident Response Plans in Azure SRE Agent,Configure incident response plans for Azure SRE Agent,"Learn how to route incidents to specialized custom agents with the right tools, investigation depth, and autonomy level automatically.",Tip,2026-04-25T06:18:00.000Z,concept-article,configuration,0.6,True,"Describes routing incidents to specialized custom agents with specific tools, investigation depth, and autonomy. 
This is product-specific configuration of incident routing and agent behavior, beyond generic concepts.",updated +https://learn.microsoft.com/en-us/azure/sre-agent/install-plugin-from-marketplace,Install a Marketplace plugin,Tutorial: Install a Marketplace Plugin in Azure SRE Agent,,"Learn how to browse, install, and configure plugins from the marketplace in Azure SRE Agent to extend your agent with community-built skills.","The Plugin Marketplace lets you discover and install community-built skills and MCP integrations from curated GitHub-hosted catalogs. In this tutorial, you add a marketplace source, browse available plugins, import skills, and optionally configure MCP connectors—all from the portal. Estimated time: 5 minutes In this tutorial, you learn how to:",2026-04-21T22:10:00.000Z,tutorial,,0.3,False,"Tutorial for installing marketplace plugins; summary indicates a basic workflow (add source, browse, import, configure) without detailed configuration matrices or product-specific troubleshooting.",updated https://learn.microsoft.com/en-us/azure/sre-agent/kusto-cluster-grouping,Azure Data Explorer connector,Azure Data Explorer connector in Azure SRE Agent,,"Connect your agent to Azure Data Explorer (Kusto) clusters to query logs and telemetry, with support for multiple clusters, per-cluster health checks, and identity-based grouping.",Tip,2026-04-07T22:12:00.000Z,feature-guide,,0.2,False,"Describes connecting to Azure Data Explorer clusters and grouping; summary suggests conceptual connector behavior without specific configuration tables, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/kusto-connector,Set up Kusto connector,Tutorial: Connect to Azure Data Explorer (ADX) in Azure SRE Agent,,Connect your SRE agent to Azure Data Explorer (Kusto) clusters with per-cluster connectivity testing before saving.,"In this tutorial, you connect your SRE agent to an Azure Data Explorer (Kusto) cluster. 
The connector wizard tests connectivity per-cluster before saving, so you can verify access before committing the configuration. Estimated time: 15 minutes In this tutorial, you learn how to:",2026-04-07T22:12:00.000Z,tutorial,,0.2,False,"Tutorial-style connector setup for Azure Data Explorer in SRE Agent; summary indicates step-by-step UI usage without exposing detailed configuration parameter tables, limits, or product-specific error mappings.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/kusto-tools,Kusto tools,Kusto Tools in Azure SRE Agent,Define Kusto tools to run deterministic KQL in SRE Agent,"Create deterministic query tools that run exact KQL queries against Azure Data Explorer, turning tribal knowledge into reusable agent capabilities.","Kusto tools help you turn your best KQL queries into reusable, parameterized tools. Your agent runs the exact query you write with no interpretation or variation. Your team's expertise becomes a shared capability. Tip",2026-03-27T15:55:00.000Z,feature-guide,integrations,0.7,True,"Kusto tools are deterministic query tools with parameterization; page likely documents tool schema, parameter names, and constraints specific to Azure Data Explorer integration, matching integrations.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/manage-global-tools,Manage global tools,Tutorial: Manage Global Tools in Azure SRE Agent,,"Browse, toggle, and manage tools and skills at the space level in Azure SRE Agent.","Learn how to browse, toggle, and manage tools at the space level by using the Tools page. Important Agents created before March 10, 2026, require workspace tools to be enabled. For older agents, enable EnableWorkspaceTools in Capabilities > Experimental Settings. 
Time: 5-10 minutes",2026-04-07T06:20:00.000Z,tutorial,,0.2,False,"Tutorial on browsing and toggling tools at space level; appears to be basic feature usage with one date-based note, but no detailed configuration matrices, limits, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/manage-permissions,Manage permissions,Manage Permissions for Azure SRE Agent,Manage Azure SRE Agent permissions and resource access,"Add or remove managed resource groups, change permission levels, and grant subscription-level access for your Azure SRE Agent.","After you create your agent, you can control which Azure resources it can access and what actions it can take. Manage access by adding resource groups, setting permission levels, and optionally granting subscription-wide access. For a full overview of how permissions work, seeAgent permissions.",2026-03-27T15:55:00.000Z,how-to,security,0.7,True,"Managing permissions and access levels to resource groups and subscriptions is security/IAM; page likely lists specific permission levels, scopes, and possibly RBAC roles.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/mcp-connector,Set up MCP connector,Tutorial: Set Up the MCP Connector in Azure SRE Agent,,"Connect your SRE agent to external tools using the Model Context Protocol (MCP), then select tools for your agent to use directly in chat.","In this tutorial, you connect your SRE agent to an external tool server by using the Model Context Protocol (MCP), select tools for your agent, and test them in chat. This tutorial uses the GitHub MCP server as an example—the same steps work for Datadog, Splunk, New Relic, and any partner connector. 
Estimated time: 10 minutes In this tutorial, you learn how to:",2026-03-27T15:55:00.000Z,tutorial,,0.3,False,"MCP connector tutorial using GitHub MCP as example; appears to be a how-to guide rather than a reference of configuration parameters, limits, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/mcp-connectors,MCP connectors,MCP connectors and tools in Azure SRE Agent,Use MCP connectors and tools with Azure SRE Agent,"Extend your agent to any external system including observability platforms, source code, ticketing systems, and custom APIs which uses the Model Context Protocol.",Tip,2026-03-27T15:55:00.000Z,conceptual,integrations,0.65,True,"MCP connectors integrate external systems via Model Context Protocol; page likely lists connector configuration fields and tool definitions unique to SRE Agent, fitting integrations & coding patterns.",unchanged +https://learn.microsoft.com/en-us/azure/sre-agent/mcp-connectors,MCP connectors,MCP connectors and tools in Azure SRE Agent,,"Extend your agent to any external system including observability platforms, source code, ticketing systems, and custom APIs which uses the Model Context Protocol.",Tip,2026-04-25T06:18:00.000Z,conceptual,,0.2,False,Overview of MCP connectors and tools; summary suggests marketing/feature description rather than detailed integration parameters or troubleshooting guidance.,updated https://learn.microsoft.com/en-us/azure/sre-agent/memory,Memory & knowledge,Memory and Knowledge in Azure SRE Agent,,Learn how your agent remembers past incident resolutions and references your documentation to improve over time.,Your agent becomes more effective over time by remembering what worked in past incidents and referencing your documentation.,2026-03-27T15:55:00.000Z,concept-article,,0.3,False,Describes memory and knowledge conceptually; no evidence of specific configuration parameters or numeric thresholds.,unchanged 
https://learn.microsoft.com/en-us/azure/sre-agent/monitor-agent-usage,Monitor agent usage,Monitor Agent Usage in Azure SRE Agent,Monitor Azure SRE Agent usage and Azure AI Unit limits,"Track Azure AI Unit consumption, manage allocation limits, and generate session insights to review your agent's performance.","Track Azure AI Unit (AAU) consumption, manage allocation limits, and review session insights which are all available from a single view. Monitor what your agent uses, what's left, and where the consumption goes. Tip",2026-03-10T04:29:00.000Z,conceptual,limits-quotas,0.7,True,"Tracks Azure AI Unit consumption and allocation limits; likely includes specific numeric limits and quota-related configuration, which are expert knowledge about usage constraints.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/network-requirements,Network requirements,Network Requirements for Azure SRE Agent,Configure network and firewall requirements for SRE Agent,"Review firewall allow list domains, authentication requirements, and network configuration for Azure SRE Agent connectivity.",This article describes the network and firewall configuration required for Azure SRE Agent connectivity.,2026-03-10T04:29:00.000Z,reference,configuration,0.85,True,"Describes firewall allow-list domains, auth requirements, and network configuration. 
This typically includes specific hostnames, ports, and protocol requirements that are product-specific configuration details.",unchanged @@ -66,36 +66,36 @@ https://learn.microsoft.com/en-us/azure/sre-agent/overview,What is SRE Agent?,Ov https://learn.microsoft.com/en-us/azure/sre-agent/pagerduty-incidents,PagerDuty incident indexing,PagerDuty Incident Indexing in Azure SRE Agent,Integrate PagerDuty incidents with Azure SRE Agent,"Learn how your agent picks up PagerDuty incidents, investigates using connected data sources, and tracks resolution metrics automatically.","Your agent picks up PagerDuty incidents within minutes of firing, investigates using all connected data sources, and tracks resolution metrics over time. Tip",2026-03-27T15:55:00.000Z,conceptual,integrations,0.65,True,"PagerDuty incident indexing behavior; likely includes connector configuration, event handling details, and possibly polling/latency specifics unique to this integration.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/permissions,Agent permissions,Agent Permissions in Azure SRE Agent,Configure Azure SRE Agent permissions and RBAC access,"Learn how your agent accesses Azure resources, including permission levels, RBAC roles, and on-behalf-of flow.",Every agent has a user-assigned managed identity (UAMI) that is automatically created alongside it. Your agent uses this UAMI to authenticate and interact with your Azure resources. 
It acts on your behalf without you needing to manage secrets or credentials.,2026-03-27T15:55:00.000Z,concept-article,security,0.75,True,"Explains how the agent’s user-assigned managed identity accesses Azure resources, including RBAC roles and permission levels; this is product-specific security configuration.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/plugin-marketplace,Plugin marketplace,Plugin Marketplace in Azure SRE Agent,,"Browse, discover, and install community-built skills and MCP integrations from curated plugin marketplaces in Azure SRE Agent.","The Plugin Marketplace lets you browse, install, and update agent skills from curated GitHub-hosted catalogs. Instead of manually copying skill files and configuring connectors, you can discover plugins with search and filtering, preview their content, and import them with a single select. Tip Key highlights",2026-03-27T15:55:00.000Z,conceptual,,0.3,False,Plugin Marketplace description focuses on browsing and installing plugins; summary does not indicate detailed configuration parameter tables or limits.,unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/pricing-billing,Pricing and billing,Pricing and billing for Azure SRE Agent,Understand Azure SRE Agent pricing and cost drivers,"Learn how Azure SRE Agent billing works with Azure Agent Units, including always-on flow, active flow rates, and cost optimization tips.",Learn how Azure SRE Agent billing works and what to expect on your Azure bill. 
Tip,2026-04-06T17:23:00.000Z,concept-article,decision-making,0.6,True,"Pricing and billing article for Azure SRE Agent with Azure Agent Units, always-on vs active flow rates, and cost optimization tips; likely includes rate details and guidance to choose usage patterns for cost, fitting decision-making around cost/performance trade-offs.",unchanged +https://learn.microsoft.com/en-us/azure/sre-agent/pricing-billing,Pricing and billing,Pricing and Billing for Azure SRE Agent,Understand Azure SRE Agent pricing and cost drivers,"Learn how Azure SRE Agent billing works with Azure Agent Units, including always-on flow, active flow rates, and cost optimization tips.","Learn how Azure SRE Agent billing works and what to expect on your Azure bill. Two billing components are always-on flow (fixed) and active flow (variable, token-based). Active flow measures the large language model (LLM) tokens that your agent consumes. Each token type is metered at a fixed Azure Agent Unit (AAU) rate based on your agent's configured model. You can monitor consumption in the portal at Settings > Agent consumption.",2026-04-22T17:34:00.000Z,concept-article,decision-making,0.65,True,"Explains billing model with always-on vs active flow, token-based metering, and cost optimization tips to help choose and manage usage; this is product-specific decision guidance about cost and consumption rather than a generic overview.",updated
Describe what you need in plain English, paste existing scripts, or wrap HTTP endpoints, then test and deploy without a restart. Tip",2026-03-27T15:55:00.000Z,how-to,integrations,0.65,True,"Python tools are a product-specific integration mechanism to reach internal APIs and systems. Page likely includes tool definition schema, parameters, and execution behavior unique to SRE Agent, fitting integrations & coding patterns.",unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/response-plan,Set up an incident trigger,Tutorial: Create an Incident Response Plan for Azure SRE Agent,,"Create a response plan from the Agent Canvas that routes specific incidents to a custom agent, and use the enable or disable toggle to control when it's active.","In this tutorial, you create a response plan that filters incidents by severity and service, routes matching incidents to a specific custom agent for automated investigation, and learn how to use the enable or disable toggle. Estimated time: 5-10 minutes In this tutorial, you:",2026-03-27T15:55:00.000Z,tutorial,,0.25,False,Tutorial for creating a response plan; summary suggests a guided workflow rather than detailed decision matrices or configuration tables.,unchanged +https://learn.microsoft.com/en-us/azure/sre-agent/response-plan,Set up an incident trigger,Tutorial: Create an incident response plan in Azure SRE Agent,,Create a response plan that routes incidents to a custom agent and use the toggle to control when it's active.,"Incident response plans let you automatically route incoming incidents to the right custom agent based on filter criteria like severity, service, and incident type. Instead of manually triaging each alert, you define the conditions once and your agent handles matching incidents as they arrive. 
In this tutorial, you create a response plan from the Agent Canvas, preview matching incidents, and use the enable/disable toggle to control when the plan is active.",2026-04-24T18:40:00.000Z,tutorial,,0.3,False,"Tutorial for creating an incident response plan; summary emphasizes what the feature does and basic usage, not detailed decision matrices, limits, or configuration references.",updated https://learn.microsoft.com/en-us/azure/sre-agent/root-cause-analysis,Root cause analysis,Root Cause Analysis in Azure SRE Agent,,"Learn how your agent reasons like an expert SRE by forming hypotheses, testing them with evidence, and explaining its conclusions.",Tip,2026-03-27T15:55:00.000Z,concept-article,,0.3,False,Explains root cause analysis reasoning; appears conceptual without concrete configuration parameters or numeric thresholds.,unchanged https://learn.microsoft.com/en-us/azure/sre-agent/run-modes,Run modes,Run Modes in Azure SRE Agent,,Learn how run modes control whether your agent asks for approval before taking actions or acts autonomously.,"Run modes control whether your agent asks for approval before taking actions or acts on its own. The key distinction is between Azure infrastructure operations (which have system-enforced approval) and other actions. Note Run modes control the approval workflow, deciding whether the agent should ask before acting. Permissions control resource access, determining whether the agent can reach a resource. 
The agent needs both conditions satisfied to act.",2026-03-27T15:55:00.000Z,concept-article,,0.4,False,"Describes run modes conceptually (approval vs autonomous); no indication of numeric thresholds, config tables, or security role lists.",unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/scheduled-tasks,Scheduled tasks,Schedule tasks with Azure SRE Agent,,"Learn how to use scheduled tasks in SRE Agent to automate monitoring, enforce security, and validate recovery.","Scheduled tasks in Azure SRE Agent automate workflows such as monitoring, maintenance, and security checks on a schedule you define. You can create these tasks manually, request them during a chat with the agent, or allow the agent to generate them autonomously as part ofincident response. Scheduled tasks help you move from reacting to problems to being proactive with tasks that run consistently and without manual effort. The following scenarios show some common use cases for scheduled tasks: No",2026-03-27T15:55:00.000Z,overview,,0.2,False,"Appears to be a conceptual/how-to overview of scheduled tasks with scenarios; no indication of numeric limits, config tables, or error-code-based troubleshooting.",unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/send-notifications,Send notifications,Send Notifications in Azure SRE Agent,,"Send contextual notifications to Microsoft Teams, Outlook, or MCP-enabled tools with investigation summaries, root cause analysis, and recommended actions.","Azure SRE Agent sends contextual notifications to Microsoft Teams, Outlook, or any MCP-enabled tool. Instead of forwarding raw alerts, the agent investigates first and then delivers summaries that include root cause analysis, impact assessment, and recommended actions so your team can act immediately. Tip Your team gets an investigation summary, not raw alerts. The agent automatically includes context so recipients can act immediately. 
This functionality works with Outlook and Teams built-in, pl",2026-03-27T15:55:00.000Z,how-to,,0.2,False,"High-level description of notifications to Teams/Outlook; no evidence of detailed configuration options, quotas, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/servicenow-incidents,ServiceNow incident indexing,ServiceNow incident indexing in Azure SRE Agent,Connect ServiceNow as an incident platform for SRE Agent,"Learn how Azure SRE Agent indexes ServiceNow incidents with real-time scanning, connectivity validation, and automated investigation.","Connect ServiceNow as your incident platform so your agent automatically indexes, investigates, and responds to ServiceNow incidents. Real connectivity validation during setup confirms your credentials work before indexing begins. Tip",2026-03-27T15:55:00.000Z,conceptual,integrations,0.7,True,ServiceNow incident indexing with real-time scanning and connectivity validation; implies specific connector settings and validation behavior.,unchanged +https://learn.microsoft.com/en-us/azure/sre-agent/scheduled-tasks,Scheduled tasks,Schedule tasks with Azure SRE Agent,,"Learn how to use scheduled tasks in SRE Agent to automate monitoring, enforce security, and validate recovery.",Tip Scheduled tasks help you by:,2026-04-24T18:40:00.000Z,overview,,0.2,False,"High-level description of scheduled tasks; summary suggests conceptual benefits, not detailed configuration parameters, limits, or troubleshooting content.",updated +https://learn.microsoft.com/en-us/azure/sre-agent/send-notifications,Send notifications,Send Notifications in Azure SRE Agent,,"Send contextual notifications to Microsoft Teams, Outlook, or MCP-enabled tools with investigation summaries, root cause analysis, and recommended actions.","Azure SRE Agent sends contextual notifications to Microsoft Teams, Outlook, or any MCP-enabled tool. 
Instead of forwarding raw alerts, the agent investigates first and then delivers summaries that include root cause analysis, impact assessment, and recommended actions so your team can act immediately. Tip Your team gets an investigation summary, not raw alerts. The agent automatically includes context so recipients can act immediately. This functionality works with Outlook and Teams built-in, pl",2026-04-21T22:10:00.000Z,how-to,,0.2,False,"Describes notification behavior and supported channels (Teams, Outlook, MCP tools) at a conceptual level; no evidence of detailed configuration, quotas, or troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/sre-agent/servicenow-incidents,ServiceNow incident indexing,ServiceNow incident indexing in Azure SRE Agent,Integrate ServiceNow incidents with Azure SRE Agent,"Learn how Azure SRE Agent indexes ServiceNow incidents with real-time scanning, connectivity validation, and automated investigation.","Connect ServiceNow as your incident platform so your agent automatically indexes, investigates, and responds to ServiceNow incidents. Real connectivity validation during setup confirms your credentials work before indexing begins. Tip",2026-04-21T22:10:00.000Z,conceptual,integrations,0.75,True,"Covers ServiceNow as an incident platform, including real-time scanning, connectivity validation, and indexing behavior. This is a concrete integration with product-specific connectivity validation and indexing semantics.",updated https://learn.microsoft.com/en-us/azure/sre-agent/set-up-pagerduty-indexing,Set up PagerDuty indexing,Set up PagerDuty incident indexing in Azure SRE Agent,Connect PagerDuty incidents to Azure SRE Agent,"Connect PagerDuty to Azure SRE Agent so your agent automatically picks up, investigates, and resolves PagerDuty incidents.",Connect your PagerDuty account so your agent automatically picks up PagerDuty incidents and investigates them. 
Time: ~5 minutes,2026-04-03T21:52:00.000Z,tutorial,integrations,0.65,True,"PagerDuty incident indexing requires product-specific connector configuration (API keys, endpoints, filters); this is an integration pattern.",unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/set-up-teams-connector,Set up Teams connector,Set up the Teams connector in Azure SRE Agent,Configure Microsoft Teams connector for Azure SRE Agent,"Connect your Azure SRE Agent to a Microsoft Teams channel so it can post updates, reply to conversation threads, and read channel messages.","Estimated time: 5 minutes Connect your agent to a Microsoft Teams channel so it can post updates, reply to threads, and read messages. After you complete this tutorial, your agent can send contextual notifications to your team without leaving the SRE Agent portal. Tip To learn how your agent uses Teams and Outlook for contextual notifications, seeSend notifications.",2026-03-27T15:55:00.000Z,tutorial,integrations,0.6,True,"Teams connector setup is an integration; page likely includes connector configuration fields (channel IDs, permissions, scopes) specific to this product.",unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/setup-servicenow-indexing,Set up ServiceNow indexing,Set up ServiceNow incident indexing in Azure SRE Agent,Configure ServiceNow incident indexing for Azure SRE Agent,"Connect ServiceNow to Azure SRE Agent so your agent automatically indexes, investigates, and responds to ServiceNow incidents.","In this tutorial, you connect your ServiceNow instance to Azure SRE Agent so the agent automatically indexes and investigates ServiceNow incidents. You can use basic authentication or OAuth 2.0. This setup takes about 10 minutes. 
Tip To learn how ServiceNow incident indexing works, seeServiceNow incident indexing.",2026-03-27T15:55:00.000Z,tutorial,integrations,0.65,True,ServiceNow indexing with basic auth or OAuth 2.0 implies detailed connector settings and parameters unique to this integration.,unchanged +https://learn.microsoft.com/en-us/azure/sre-agent/set-up-teams-connector,Set up Teams connector,Set Up the Teams Connector in Azure SRE Agent,,"Connect your Azure SRE Agent to a Microsoft Teams channel so it can post updates, reply to conversation threads, and read channel messages.","Estimated time: 5 minutes Connect your agent to a Microsoft Teams channel so it can post updates, reply to threads, and read messages. After you complete this tutorial, your agent can send contextual notifications to your team without leaving the SRE Agent portal. Tip To learn how your agent uses Teams and Outlook for contextual notifications, see Send notifications.",2026-04-21T22:10:00.000Z,tutorial,,0.4,False,"Tutorial to connect SRE Agent to a Teams channel; from the summary it looks like a step-by-step setup guide without explicit config parameter tables or quotas, so it doesn’t clearly meet any expert-knowledge category.",updated +https://learn.microsoft.com/en-us/azure/sre-agent/setup-servicenow-indexing,Set up ServiceNow indexing,Set up ServiceNow incident indexing in Azure SRE Agent,,"Connect ServiceNow to Azure SRE Agent so your agent automatically indexes, investigates, and responds to ServiceNow incidents.","In this tutorial, you connect your ServiceNow instance to Azure SRE Agent so the agent automatically indexes and investigates ServiceNow incidents. You can use basic authentication or OAuth 2.0. This setup takes about 10 minutes. 
Tip To learn how ServiceNow incident indexing works, see ServiceNow incident indexing.",2026-04-21T22:10:00.000Z,tutorial,,0.4,False,"Tutorial for setting up ServiceNow incident indexing; summary mentions auth options and duration but not specific configuration parameter tables, limits, or troubleshooting mappings.",updated https://learn.microsoft.com/en-us/azure/sre-agent/skills,Skills,Skills in Azure SRE Agent,,"Learn how skills extend your agent with procedural guidance and execution capabilities like Azure CLI, Kusto queries, and Python scripts.","Skills extend your agent with procedures and execution capabilities. You can add a troubleshooting guide, attach tools like Azure CLI, Kusto queries, Python scripts, or MCP connectors, and your agent loads them when relevant to the user's question. The agent doesn't need an explicit /skill command. You can also upload knowledge documents such as runbooks, architecture guides, and reference material to build a knowledge base that your agent searches automatically. For more information, see Memory an",2026-03-27T15:55:00.000Z,feature-guide,,0.4,False,"Explains skills and how they extend the agent; likely conceptual with examples, not a detailed config or integration reference.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/starter-prompts,Starter prompts,Starter Prompts for Azure SRE Agent,,"Explore example prompts for interacting with Azure SRE Agent across App Service, Container Apps, AKS, and API Management.",Use these prompts to explore your agent's capabilities. 
Copy any prompt directly into your agent's chat interface.,2026-03-27T15:55:00.000Z,reference,,0.1,False,"Starter prompts are usage examples; they don’t represent configuration, limits, or troubleshooting knowledge and are not critical expert reference data.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/sub-agents,Subagents,Custom agents in Azure SRE Agent,,Learn how custom agents provide specialized domain expertise that you invoke on demand through the /agent command.,"Custom agents are specialist agents that you invoke on demand. Type /agent in chat, select your specialist, and ask your question. This process gives you access to a database expert for SQL problems and a security auditor for threat investigation. Unlike skills (which are always available), custom agents require explicit invocation. This requirement scopes their expertise to specific tasks.",2026-03-27T15:55:00.000Z,concept-article,,0.3,False,Conceptual explanation of custom agents and /agent command; no indication of expert-only configuration or limits.,unchanged
This limitation created three problems: Tip",2026-04-25T06:18:00.000Z,concept-article,security,0.7,True,"Focuses on which subscriptions appear in the picker based on specific Azure RBAC roles (Owner, User Access Administrator, Reader). This is product-specific security/permission behavior that affects how access is granted and is unlikely to be known generically.",new https://learn.microsoft.com/en-us/azure/sre-agent/supported-regions,Supported regions,Supported Regions for Azure SRE Agent,Select supported Azure regions for SRE Agent,Find Azure regions where Azure SRE Agent is available and learn how to check region availability for your subscription.,Azure SRE Agent is available in select Azure regions. This reference lists the currently supported regions and provides guidance on choosing a region for your agent.,2026-03-27T15:55:00.000Z,reference,decision-making,0.65,True,"Region support is time- and product-specific expert knowledge. Page likely lists exact supported regions and provides guidance on choosing a region, which is concrete decision-making content not inferable from general training.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/team-onboard,Team onboarding,Team onboarding for Azure SRE Agent,,"Teach your agent about your team, application architecture, and operational procedures to build persistent memory for investigations.","Teach the agent about your team—who's on call, what services you own, how you handle incidents. At the same time, the agent learns from the context you connected during setup (code repositories, Azure resources, logs) to build a complete picture of your environment. 
The agent references this persistent memory in every future conversation.",2026-03-27T15:55:00.000Z,tutorial,,0.2,False,Conceptual guidance on teaching the agent about team and architecture; no indication of specific configuration parameters or limits.,unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/teams-bot,Set up Teams bot,Tutorial: Set Up a Teams Bot for Azure SRE Agent,Deploy Azure SRE Agent as a Microsoft Teams bot,Deploy your Azure SRE Agent as a Microsoft Teams bot for interactive chat through direct messages or channel mentions.,"In this tutorial, you deploy your Azure SRE Agent as a Microsoft Teams bot so your team can chat with it directly in DMs or by @-mentioning it in channels. Tip What you accomplish in this tutorial:",2026-03-10T04:29:00.000Z,tutorial,deployment,0.7,True,"Setting up a Teams bot for the agent is a deployment scenario with product-specific registration steps, channel configuration, and possibly constraints on how the bot is hosted and accessed.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/test-tool-playground,Test a tool in the playground,Tutorial: Test a Tool in the Playground in Azure SRE Agent,,Debug and verify your tools in the Azure SRE Agent playground before deploying them to production.,"In this tutorial, you use the test playground in the Azure SRE Agent portal to test and debug your tools before deploying them. The playground lets you execute tools in isolation by using custom parameters and review results immediately. 
In this tutorial, you learn how to: Estimated time: 5 minutes,2026-03-27T15:55:00.000Z,tutorial,,0.3,False,Tool testing playground tutorial focuses on usage; unlikely to contain detailed configuration matrices or limits beyond generic testing steps.,unchanged https://learn.microsoft.com/en-us/azure/sre-agent/threads,Threads,Threads in Azure SRE Agent,,"Learn how threads organize conversations with your agent, including thread sources, filtering, search, chat commands, and keyboard shortcuts.","A thread is a conversation with your agent. Each time you start a new investigation or ask a question, you create a thread that maintains context throughout the conversation.",2026-03-27T15:55:00.000Z,concept-article,,0.2,False,Defines threads and conversation organization; conceptual usage guidance without detailed technical parameters.,unchanged https://learn.microsoft.com/en-us/azure/sre-agent/tools,Tools,Tools in Azure SRE Agent,,"Learn how your agent's tools are organized, including built-in Azure tools, MCP extensibility, code execution, knowledge, and communication.","Tools are the atomic capabilities your agent uses to take action. They enable querying logs, running commands, executing code, searching documents, and sending notifications. Your agent automatically selects the right tools based on the task. By combining tools with skills and custom agents, you can create powerful automation. Skills attach tools to procedural instructions. 
Custom agents get dedicated tool sets for their domain.",2026-03-27T15:55:00.000Z,reference,,0.4,False,Overview of tools and capabilities; summary does not show specific parameter tables or SDK references required for integrations/configuration classification.,unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/track-incident-value,Track incident value,Track Incident Value in Azure SRE Agent,,"Measure your agent's impact with interactive analytics — drill into incidents from any chart, filter by response plan, and track quality with star ratings.",Tip,2026-04-07T17:12:00.000Z,concept-article,,0.1,False,"Mentions tracking incident value with charts and ratings; appears to be an analytics/overview feature without detailed configuration, limits, or error-resolution guidance.",unchanged +https://learn.microsoft.com/en-us/azure/sre-agent/track-incident-value,Track incident value,Track incident value in Azure SRE Agent,,"Measure your agent's impact with interactive analytics. Drill into incidents from any chart, filter by response plan, and track quality with star ratings.","You deployed an AI agent to handle incidents. Leadership wants to know: Is it actually reducing toil? Which incidents is it resolving on its own? Are we getting ROI from this investment? Today, answering those questions means manually querying telemetry, cross-referencing incident tickets, and guessing which response plans are effective. There's no single view that shows what your agent did, how well each response plan performed, or whether mitigation rates are improving over time. 
Without this d",2026-04-24T18:40:00.000Z,conceptual,,0.3,False,"Focuses on tracking incident value and analytics; summary indicates conceptual benefits and use cases, not specific configuration options or limits.",updated https://learn.microsoft.com/en-us/azure/sre-agent/troubleshoot-azure-app-service,Troubleshoot App Service,Tutorial: Troubleshoot an App Using Azure SRE Agent and Azure App Service,,Learn how to use Azure SRE Agent and Azure App Service to identify and fix app problems through AI-assisted troubleshooting.,"Site reliability engineering (SRE) focuses on creating reliable, scalable systems through automation and proactive management. Azure SRE Agent brings these principles to your cloud environment by providing AI-powered monitoring, troubleshooting, and remediation capabilities. SRE Agent automates routine operational tasks and provides reasoned insights to help you maintain application reliability while reducing manual intervention. SRE Agent is available as a chatbot, so you can ask questions and ",2025-11-18T17:01:00.000Z,tutorial,,0.3,False,Tutorial on troubleshooting an app with SRE Agent and App Service; appears procedural without detailed error-code mappings or configuration tables.,unchanged https://learn.microsoft.com/en-us/azure/sre-agent/troubleshoot-azure-container-apps,Troubleshoot Container Apps,Tutorial: Troubleshoot an App Using Azure SRE Agent and Azure Container Apps,,Deploy an automated agent to help monitor and resolve problems by using Azure SRE Agent and Azure Container Apps.,"Azure SRE Agent helps you manage and monitor Azure resources by using AI-enabled capabilities. Agents guide you in solving problems and building resilient, self-healing systems. In this tutorial, you: Important This tutorial features an AI-enabled service powered by a language model. The procedural steps reflect how the model is expected to respond. However, the responses that you encounter in your agent might differ from the examples in this tutorial. 
Use the example prompts to help achieve you",2025-11-18T17:01:00.000Z,tutorial,,0.3,False,"Tutorial for Azure Container Apps; mostly step-by-step usage of the agent, not deep product-specific limits or configs.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/tutorial-agent-hooks,Configure agent hooks,Tutorial: Configure Agent Hooks (API) in Azure SRE Agent,Configure agent hooks via REST API in Azure SRE Agent,"Add Stop and PostToolUse hooks to your agent using the REST API v2 to validate responses, audit tool usage, and enforce policies.","Tip Prefer the portal UI? You can now create and manage hooks directly in the portal without using the REST API. The portal provides a visual form and code editor. No curl commands are required. In this tutorial, you create a custom agent with a Stop hook that forces the agent to add a completion marker to every response. You configure the hook through the REST API, then test it in the portal's playground. Estimated time: 15 minutes Note Agent-level vs. custom-agent-level hooks: This tutorial create",2026-03-27T15:55:00.000Z,tutorial,configuration,0.7,True,"REST API v2 tutorial for Stop and PostToolUse hooks will include JSON schema, field names, and allowed values for hook configuration, fitting configuration.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/tutorial-deep-investigation,Run a deep investigation,Tutorial: Run a Deep Investigation in Azure SRE Agent,,"Learn how to use deep investigation for structured, hypothesis-driven analysis from chat and from incident response plans.","Deep investigation gives your agent a structured methodology for complex problems. The agent forms multiple hypotheses and validates each one with evidence. In this tutorial, you trigger a deep investigation from chat and explore the results. 
In this tutorial, you learn how to:",2026-03-10T04:29:00.000Z,tutorial,,0.2,False,"Tutorial on using deep investigation from chat and response plans; appears to be step-by-step usage guidance without detailed config tables, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/tutorial-incident-response,Set up incident response,Step 4: Set Up Incident Response in Azure SRE Agent,,Connect your incident platform and create response plans so your agent automatically investigates incoming incidents.,"Estimated time: 10 minutes Connect your incident platform and create a response plan. When incidents arrive, your agent automatically investigates and generates detailed execution plans.",2026-03-10T04:29:00.000Z,tutorial,,0.2,False,Step in a getting-started flow to connect an incident platform and create response plans; appears to be basic setup instructions rather than deep configuration or troubleshooting content.,unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/tutorial-upload-knowledge-document,Upload knowledge documents,Tutorial: Upload Knowledge Documents to Azure SRE Agent,,Upload knowledge documents to your Azure SRE Agent's Knowledge settings through conversation and the portal UI so the agent can reference them in future investigations.,"In this tutorial, you upload knowledge documents to your Azure SRE Agent's Knowledge settings by using two methods: asking the agent to create a runbook from an investigation and uploading a file through the portal UI. Your agent can capture knowledge discovered during investigations and store it for future use, so it automatically builds institutional knowledge. For more information, seeUpload knowledge documents. 
In this tutorial, you learn how to: Estimated time: 15 minutes",2026-03-27T15:55:00.000Z,tutorial,,0.2,False,"Tutorial on uploading knowledge documents; summary indicates workflow guidance, not detailed configuration, limits, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/upload-knowledge-document,Upload knowledge documents,Upload Knowledge Documents to Azure SRE Agent,,"Create and upload runbooks, troubleshooting guides, and documentation to Knowledge settings during conversations to capture institutional knowledge automatically.","Your agent captures institutional knowledge during conversations by uploading runbooks, troubleshooting guides, and documentation directly to the knowledge base. When your agent discovers a fix or resolves an incident, it generates structured documents and indexes them for semantic search, so every future investigation benefits from past resolutions. Tip Your agent creates and uploads runbooks during conversations without manual file management. Attach 31 file types in chat, including.kql,.bicep",2026-03-27T15:55:00.000Z,how-to,,0.35,False,Uploading knowledge documents is described conceptually; summary mentions file types but not detailed parameter tables or limits/quotas.,unchanged +https://learn.microsoft.com/en-us/azure/sre-agent/tutorial-upload-knowledge-document,Upload knowledge documents,Tutorial: Upload Knowledge Documents to Azure SRE Agent,,Upload knowledge documents to your Azure SRE Agent's Knowledge settings through conversation and the portal UI so the agent can reference them in future investigations.,"In this tutorial, you upload knowledge documents to your Azure SRE Agent so it can reference them during future investigations. You learn how to save investigation results directly into Knowledge settings through chat and how to upload existing files through the portal UI. 
In this tutorial, you learn how to: Estimated time: 10 minutes",2026-04-25T06:18:00.000Z,tutorial,,0.2,False,"Tutorial-style guidance on uploading knowledge documents via chat and portal; no configuration tables, limits, error codes, or product-specific parameters beyond basic UI steps.",updated +https://learn.microsoft.com/en-us/azure/sre-agent/upload-knowledge-document,Upload knowledge documents,Upload Knowledge Documents to Azure SRE Agent,,"Create and upload runbooks, troubleshooting guides, and documentation to Knowledge settings during conversations to capture institutional knowledge automatically.",Tip,2026-04-25T06:18:00.000Z,how-to,,0.1,False,"Very short, appears to be a conceptual description of uploading knowledge documents; no sign of parameter tables, limits, or error codes.",updated https://learn.microsoft.com/en-us/azure/sre-agent/use-code-interpreter,Use code interpreter,Tutorial: Use Code Interpreter in Azure SRE Agent,Enable and use Code Interpreter in Azure SRE Agent,"Enable Code Interpreter for your Azure SRE Agent and use it to analyze data, generate charts, and create reports from chat.","In this tutorial, you enable Code Interpreter for your Azure SRE Agent and use it to analyze Azure data, generate visualizations, and create downloadable files directly from chat prompts. Note To learn more about Code Interpreter and the problems it solves, see Code Interpreter. 
In this tutorial, you learn how to: Estimated time: 10 minutes",2026-03-10T04:29:00.000Z,tutorial,configuration,0.65,True,"Enabling Code Interpreter for the agent likely involves specific configuration switches, allowed operations, and possibly limits for file handling and analysis, which are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/sre-agent/use-docsguide,Use DocsGuide,Tutorial: Use DocsGuide in Azure SRE Agent,,Learn how to ask your Azure SRE Agent questions about its features and get accurate answers from official documentation.,"In this tutorial, use DocsGuide (the built-in documentation capability) to ask your agent questions about Azure SRE Agent features, configuration, and concepts. DocsGuide answers from live official documentation, so responses stay current as features evolve. Estimated time: 5 minutes In this tutorial, you:",2026-03-27T15:55:00.000Z,tutorial,,0.25,False,DocsGuide usage tutorial is about asking questions; unlikely to contain detailed configuration parameters or limits beyond basic usage.,unchanged https://learn.microsoft.com/en-us/azure/sre-agent/user-roles,User roles and permissions,User Roles and Permissions in Azure SRE Agent,Configure Azure SRE Agent roles and permissions,"Learn how to control who can view, interact with, and administer your agent by using Azure RBAC roles and layered access control.","Your agent can investigate issues, take actions on production infrastructure, and access sensitive data across your environment. Access control determines who can request actions, who can approve them, and who can modify the agent's configuration.",2026-04-07T06:20:00.000Z,concept-article,security,0.7,True,"Focuses on controlling who can view, interact with, and administer the agent using Azure RBAC and layered access control. 
This implies specific role names/permissions and access scopes unique to the product, which fits the security sub-skill.",unchanged -https://learn.microsoft.com/en-us/azure/sre-agent/workflow-automation,Workflow automation,Workflow Automation in Azure SRE Agent,,"Automate operational workflows by connecting triggers, subagents, and tools to run without manual intervention.",Tip,2026-03-27T15:55:00.000Z,concept-article,,0.4,False,"High-level workflow automation description; no explicit indication of detailed configuration parameters, limits, or decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/sre-agent/workflow-automation,Workflow automation,Workflow automation in Azure SRE Agent,,"Automate operational workflows by connecting triggers, subagents, and tools to run without manual intervention.","Operational workflows span multiple tools and require someone to remember what comes next. You check status in one system, make a decision, execute in another, and notify your team in a third. Each handoff adds latency and risk. Tip Workflow automation helps you by:",2026-04-24T18:40:00.000Z,conceptual,,0.2,False,"Appears to be a conceptual/feature overview of workflow automation in SRE Agent; no indication of numeric limits, config tables, error codes, or product-specific decision matrices.",updated https://learn.microsoft.com/en-us/azure/sre-agent/workspace-tools,Deep context,Deep context in Azure SRE Agent,,"Learn how deep context gives your Azure SRE Agent accumulated understanding of your code, infrastructure, and operational history.","Deep Context is the agent's accumulated understanding of your environment—your code, your infrastructure, your team's procedures, and what happened in past investigations. Unlike a generic AI assistant that starts from zero every time, your agent builds a growing picture of how your systems work. Tip Workspace tools (file operations, terminal commands, Python execution) require enablement. 
Contact your agent administrator or enable via the Experimental Settings page in the portal. Deep Context isn",2026-04-07T06:20:00.000Z,conceptual,,0.2,False,"Explains the concept of Deep Context and mentions workspace tools enablement, but appears to be conceptual behavior/benefits rather than concrete configuration parameters, limits, or troubleshooting details.",unchanged diff --git a/products/azure-sre-agent/report.md b/products/azure-sre-agent/report.md index d078d98e..e6da10c6 100644 --- a/products/azure-sre-agent/report.md +++ b/products/azure-sre-agent/report.md @@ -1,34 +1,30 @@ --- -generated_at: '2026-04-12' +generated_at: '2026-04-26' category_descriptions: - integrations: Integrating Azure SRE Agent with DevOps, GitHub, Teams, ServiceNow, - PagerDuty, Kusto, observability tools, and custom HTTP/Python/MCP tools for automation - and incident workflows - configuration: 'Configuring SRE Agent behavior: hooks and governance, custom tools/skills, - specialized subagents, network/firewall settings, and enabling/using the Python/shell - Code Interpreter via UI or REST API' - security: Managing SRE Agent identities, authentication methods, and configuring - RBAC/role permissions (including managed identity access to Azure DevOps repos) - for secure resource access. + integrations: Integrating SRE Agent with DevOps, GitHub, alerts, incidents, observability + tools, and custom Python/Kusto tools, including cross-tenant and PagerDuty/ServiceNow + connections. + configuration: 'Configuring SRE Agent behavior: hooks and governance, code interpreter + (Python/shell), custom tools/skills, subagents, tool overrides, incident plans, + and network/firewall settings.' + security: 'Identities, auth, and RBAC for SRE Agent: managed identities, ADO repo + access, roles/permissions, and subscription/resource visibility configuration.' 
troubleshooting: Diagnosing Azure SRE Agent deployment/operation issues and using KQL to query its telemetry, logs, and actions for troubleshooting. decision-making: Guidance on when to trigger deep investigations, how SRE Agent - pricing works and what drives cost, and which Azure regions you can or should - deploy the agent in + pricing works and what drives cost, and which Azure regions you can deploy SRE + Agent in. limits-quotas: Tracking Azure SRE Agent usage, AI Unit consumption, limits, and understanding how pricing, billing units, and quotas affect your workloads. - deployment: How to deploy and configure the Azure SRE Agent as a Microsoft Teams - bot, including setup steps, required permissions, and integration details. skill_description: Expert knowledge for Azure Sre Agent development including troubleshooting, - decision making, limits & quotas, security, configuration, integrations & coding - patterns, and deployment. Use when wiring SRE Agent to DevOps/Teams/ServiceNow, - configuring tools and RBAC, querying KQL logs, or deploying as a Teams bot, and - other Azure Sre Agent related development tasks. Not for Azure Monitor (use azure-monitor), - Azure Reliability (use azure-reliability), Azure Resiliency (use azure-resiliency), - Azure Service Health (use azure-service-health). -use_when: Use when wiring SRE Agent to DevOps/Teams/ServiceNow, configuring tools - and RBAC, querying KQL logs, or deploying as a Teams bot, and other Azure Sre Agent - related development tasks. + decision making, limits & quotas, security, configuration, and integrations & coding + patterns. Use when wiring SRE Agent to DevOps/GitHub, PagerDuty/ServiceNow, configuring + hooks/tools, RBAC, or KQL-based diagnostics, and other Azure Sre Agent related development + tasks. Not for Azure Monitor (use azure-monitor), Azure Reliability (use azure-reliability), + Azure Resiliency (use azure-resiliency), Azure Service Health (use azure-service-health). 
+use_when: Use when wiring SRE Agent to DevOps/GitHub, PagerDuty/ServiceNow, configuring + hooks/tools, RBAC, or KQL-based diagnostics, and other Azure SRE Agent related development + tasks. confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Reliability (use azure-reliability), Azure Resiliency (use azure-resiliency), Azure Service Health (use azure-service-health). @@ -40,31 +36,84 @@ confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Reliability - **Total Pages**: 100 - **Fetched**: 100 - **Fetch Failed**: 0 -- **Classified**: 40 -- **Unclassified**: 60 +- **Classified**: 39 +- **Unclassified**: 61 ### Incremental Update -- **New Pages**: 0 -- **Updated Pages**: 0 -- **Unchanged**: 100 -- **Deleted Pages**: 0 +- **New Pages**: 2 +- **Updated Pages**: 24 +- **Unchanged**: 74 +- **Deleted Pages**: 2 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-sre-agent/azure-sre-agent.csv` ## Classification Statistics | Type | Count | Percentage | |------|-------|------------| -| configuration | 8 | 8.0% | +| configuration | 10 | 10.0% | | decision-making | 3 | 3.0% | -| deployment | 1 | 1.0% | -| integrations | 20 | 20.0% | +| integrations | 17 | 17.0% | | limits-quotas | 1 | 1.0% | -| security | 5 | 5.0% | +| security | 6 | 6.0% | | troubleshooting | 2 | 2.0% | -| *(Unclassified)* | 60 | 60.0% | +| *(Unclassified)* | 61 | 61.0% | ## Changes +### New Pages + +- [Subscription permission visibility](https://learn.microsoft.com/en-us/azure/sre-agent/subscription-permission-visibility) +- [Default agent override info](https://learn.microsoft.com/en-us/azure/sre-agent/default-agent-override-info) + +### Updated Pages + +- [Upload knowledge documents](https://learn.microsoft.com/en-us/azure/sre-agent/tutorial-upload-knowledge-document) + - Updated: 2026-03-27T15:55:00.000Z → 2026-04-25T06:18:00.000Z +- [Pricing and billing](https://learn.microsoft.com/en-us/azure/sre-agent/pricing-billing) + - Updated: 
2026-04-06T17:23:00.000Z → 2026-04-22T17:34:00.000Z +- [Workflow automation](https://learn.microsoft.com/en-us/azure/sre-agent/workflow-automation) + - Updated: 2026-03-27T15:55:00.000Z → 2026-04-24T18:40:00.000Z +- [Scheduled tasks](https://learn.microsoft.com/en-us/azure/sre-agent/scheduled-tasks) + - Updated: 2026-03-27T15:55:00.000Z → 2026-04-24T18:40:00.000Z +- [HTTP triggers](https://learn.microsoft.com/en-us/azure/sre-agent/http-triggers) + - Updated: 2026-04-02T18:15:00.000Z → 2026-04-22T17:34:00.000Z +- [Send notifications](https://learn.microsoft.com/en-us/azure/sre-agent/send-notifications) + - Updated: 2026-03-27T15:55:00.000Z → 2026-04-21T22:10:00.000Z +- [Upload knowledge documents](https://learn.microsoft.com/en-us/azure/sre-agent/upload-knowledge-document) + - Updated: 2026-03-27T15:55:00.000Z → 2026-04-25T06:18:00.000Z +- [MCP connectors](https://learn.microsoft.com/en-us/azure/sre-agent/mcp-connectors) + - Updated: 2026-03-27T15:55:00.000Z → 2026-04-25T06:18:00.000Z +- [Track incident value](https://learn.microsoft.com/en-us/azure/sre-agent/track-incident-value) + - Updated: 2026-04-07T17:12:00.000Z → 2026-04-24T18:40:00.000Z +- [Set up Teams connector](https://learn.microsoft.com/en-us/azure/sre-agent/set-up-teams-connector) + - Updated: 2026-03-27T15:55:00.000Z → 2026-04-21T22:10:00.000Z +- [Set up an incident trigger](https://learn.microsoft.com/en-us/azure/sre-agent/response-plan) + - Updated: 2026-03-27T15:55:00.000Z → 2026-04-24T18:40:00.000Z +- [Set up ServiceNow indexing](https://learn.microsoft.com/en-us/azure/sre-agent/setup-servicenow-indexing) + - Updated: 2026-03-27T15:55:00.000Z → 2026-04-21T22:10:00.000Z +- [Connect to ServiceNow](https://learn.microsoft.com/en-us/azure/sre-agent/connect-servicenow) + - Updated: 2026-04-03T21:52:00.000Z → 2026-04-21T22:10:00.000Z +- [Create scheduled tasks](https://learn.microsoft.com/en-us/azure/sre-agent/create-scheduled-task) + - Updated: 2026-04-07T17:12:00.000Z → 2026-04-24T18:40:00.000Z +- 
[Create an HTTP trigger](https://learn.microsoft.com/en-us/azure/sre-agent/create-http-trigger) + - Updated: 2026-04-02T18:15:00.000Z → 2026-04-22T17:34:00.000Z +- [Install a Marketplace plugin](https://learn.microsoft.com/en-us/azure/sre-agent/install-plugin-from-marketplace) + - Updated: 2026-03-27T15:55:00.000Z → 2026-04-21T22:10:00.000Z +- [Create and set up](https://learn.microsoft.com/en-us/azure/sre-agent/create-and-set-up) + - Updated: 2026-04-07T17:12:00.000Z → 2026-04-22T17:34:00.000Z +- [Automate workflows](https://learn.microsoft.com/en-us/azure/sre-agent/automate-workflows) + - Updated: 2026-03-27T15:55:00.000Z → 2026-04-24T18:40:00.000Z +- [Connectors](https://learn.microsoft.com/en-us/azure/sre-agent/connectors) + - Updated: 2026-04-07T17:12:00.000Z → 2026-04-25T06:18:00.000Z +- [Incident platforms](https://learn.microsoft.com/en-us/azure/sre-agent/incident-platforms) + - Updated: 2026-03-27T15:55:00.000Z → 2026-04-25T06:18:00.000Z +- *...and 4 more* + +### Deleted Pages + +- ~~Chat from your tools~~ (https://learn.microsoft.com/en-us/azure/sre-agent/chat-from-your-tools) +- ~~Set up Teams bot~~ (https://learn.microsoft.com/en-us/azure/sre-agent/teams-bot) + ## Classified Pages | TOC Title | Type | Confidence | Reason | @@ -73,104 +122,104 @@ confusable_not_for: Not for Azure Monitor (use azure-monitor), Azure Reliability | [Troubleshooting](https://learn.microsoft.com/en-us/azure/sre-agent/faq-troubleshooting) | troubleshooting | 0.80 | Explicitly an operations troubleshooting FAQ; likely maps common symptoms (permissions, regional issues, deployment failures) to causes and resolutions specific to SRE Agent. | | [Agent permissions](https://learn.microsoft.com/en-us/azure/sre-agent/permissions) | security | 0.75 | Explains how the agent’s user-assigned managed identity accesses Azure resources, including RBAC roles and permission levels; this is product-specific security configuration. 
| | [Connect Azure DevOps Wiki](https://learn.microsoft.com/en-us/azure/sre-agent/connect-devops-wiki) | integrations | 0.75 | Connecting a DevOps wiki as a knowledge source involves specific connector settings, permissions, and indexing behavior unique to this integration. | +| [ServiceNow incident indexing](https://learn.microsoft.com/en-us/azure/sre-agent/servicenow-incidents) | integrations | 0.75 | Covers ServiceNow as an incident platform, including real-time scanning, connectivity validation, and indexing behavior. This is a concrete integration with product-specific connectivity validation and indexing semantics. | | [Agent identity](https://learn.microsoft.com/en-us/azure/sre-agent/agent-identity) | security | 0.70 | Details what identity resources are created, why two identities exist, and how connectors use them; this is specific managed identity and auth behavior. | | [Azure DevOps Wiki knowledge](https://learn.microsoft.com/en-us/azure/sre-agent/azure-devops-wiki-knowledge) | integrations | 0.70 | Explains connecting Azure DevOps wikis with support for managed identity and PAT; this is a concrete integration with authentication configuration details. | +| [Azure Monitor alerts](https://learn.microsoft.com/en-us/azure/sre-agent/azure-monitor-alerts) | integrations | 0.70 | Explains how SRE Agent connects to Azure Monitor alerts with zero credentials and automates investigation across Azure tools. This is a product-specific integration pattern with unique behavior around alert handling. | | [Configure agent hooks](https://learn.microsoft.com/en-us/azure/sre-agent/tutorial-agent-hooks) | configuration | 0.70 | REST API v2 tutorial for Stop and PostToolUse hooks will include JSON schema, field names, and allowed values for hook configuration, fitting configuration. 
| | [Connect to PagerDuty](https://learn.microsoft.com/en-us/azure/sre-agent/connect-pagerduty) | integrations | 0.70 | Product-specific integration tutorial for connecting PagerDuty so the agent can receive and respond to incidents. Likely includes connector settings, authentication details, and required configuration parameters unique to this integration. | -| [Connect to ServiceNow](https://learn.microsoft.com/en-us/azure/sre-agent/connect-servicenow) | integrations | 0.70 | ServiceNow connection tutorial with basic auth and OAuth 2.0 will include endpoint URLs, credential fields, and configuration parameters specific to this integration. | +| [Default agent override info](https://learn.microsoft.com/en-us/azure/sre-agent/default-agent-override-info) | configuration | 0.70 | Discusses how tool and skill settings apply to the default agent versus custom agents and how overrides work. This is nuanced, product-specific configuration behavior that clarifies which settings affect which agents. | | [Kusto tools](https://learn.microsoft.com/en-us/azure/sre-agent/kusto-tools) | integrations | 0.70 | Kusto tools are deterministic query tools with parameterization; page likely documents tool schema, parameter names, and constraints specific to Azure Data Explorer integration, matching integrations. | | [Manage permissions](https://learn.microsoft.com/en-us/azure/sre-agent/manage-permissions) | security | 0.70 | Managing permissions and access levels to resource groups and subscriptions is security/IAM; page likely lists specific permission levels, scopes, and possibly RBAC roles. | | [Monitor agent usage](https://learn.microsoft.com/en-us/azure/sre-agent/monitor-agent-usage) | limits-quotas | 0.70 | Tracks Azure AI Unit consumption and allocation limits; likely includes specific numeric limits and quota-related configuration, which are expert knowledge about usage constraints. 
| -| [ServiceNow incident indexing](https://learn.microsoft.com/en-us/azure/sre-agent/servicenow-incidents) | integrations | 0.70 | ServiceNow incident indexing with real-time scanning and connectivity validation; implies specific connector settings and validation behavior. | -| [Set up Teams bot](https://learn.microsoft.com/en-us/azure/sre-agent/teams-bot) | deployment | 0.70 | Setting up a Teams bot for the agent is a deployment scenario with product-specific registration steps, channel configuration, and possibly constraints on how the bot is hosted and accessed. | +| [Subscription permission visibility](https://learn.microsoft.com/en-us/azure/sre-agent/subscription-permission-visibility) | security | 0.70 | Focuses on which subscriptions appear in the picker based on specific Azure RBAC roles (Owner, User Access Administrator, Reader). This is product-specific security/permission behavior that affects how access is granted and is unlikely to be known generically. | | [User roles and permissions](https://learn.microsoft.com/en-us/azure/sre-agent/user-roles) | security | 0.70 | Focuses on controlling who can view, interact with, and administer the agent using Azure RBAC and layered access control. This implies specific role names/permissions and access scopes unique to the product, which fits the security sub-skill. | | [Azure DevOps connector](https://learn.microsoft.com/en-us/azure/sre-agent/ado-connector) | integrations | 0.65 | Azure DevOps connector with OAuth/managed identity; likely documents specific configuration parameters and auth settings for this integration. | | [Code interpreter](https://learn.microsoft.com/en-us/azure/sre-agent/code-interpreter) | configuration | 0.65 | Describes executing Python and shell commands in a sandbox; likely includes product-specific capabilities, constraints, and parameters for the interpreter environment. 
| | [Connect ADO repo with managed identity](https://learn.microsoft.com/en-us/azure/sre-agent/connect-ado-repo-managed-identity) | security | 0.65 | Connects Azure DevOps repositories using managed identity; likely includes specific identity/permission configuration steps (service principal or managed identity setup, scopes, and ADO permissions) that are product-specific security configuration details. | +| [Connectors](https://learn.microsoft.com/en-us/azure/sre-agent/connectors) | integrations | 0.65 | Describes product-specific connectors to Kusto, repos, collaboration tools, and custom APIs. While summary is high-level, this page is about concrete integration mechanisms unique to Azure SRE Agent, likely including connector configuration details and parameters. | | [Create and manage hooks in the portal](https://learn.microsoft.com/en-us/azure/sre-agent/create-manage-hooks-ui) | configuration | 0.65 | Portal-based hook configuration (Stop, PostToolUse) with examples like blocking dangerous shell commands implies specific hook types and settings, which are configuration details. | | [Cross-account ADO access](https://learn.microsoft.com/en-us/azure/sre-agent/cross-account-ado-oauth) | integrations | 0.65 | Cross-account ADO access across tenants; likely includes specific steps and parameters for authentication and tenant selection unique to this product. | | [GitHub connector](https://learn.microsoft.com/en-us/azure/sre-agent/github-connector) | integrations | 0.65 | Connector article for GitHub; likely includes auth modes (OAuth, PAT), scopes, and configuration parameters unique to this integration. | -| [MCP connectors](https://learn.microsoft.com/en-us/azure/sre-agent/mcp-connectors) | integrations | 0.65 | MCP connectors integrate external systems via Model Context Protocol; page likely lists connector configuration fields and tool definitions unique to SRE Agent, fitting integrations & coding patterns. 
| +| [Incident platforms](https://learn.microsoft.com/en-us/azure/sre-agent/incident-platforms) | integrations | 0.65 | Explains how to connect incident platforms so the agent can receive and act on alerts automatically. This is a product-specific integration pattern that likely includes configuration details for supported incident systems. | | [PagerDuty incident indexing](https://learn.microsoft.com/en-us/azure/sre-agent/pagerduty-incidents) | integrations | 0.65 | PagerDuty incident indexing behavior; likely includes connector configuration, event handling details, and possibly polling/latency specifics unique to this integration. | +| [Pricing and billing](https://learn.microsoft.com/en-us/azure/sre-agent/pricing-billing) | decision-making | 0.65 | Explains billing model with always-on vs active flow, token-based metering, and cost optimization tips to help choose and manage usage; this is product-specific decision guidance about cost and consumption rather than a generic overview. | | [Python tools](https://learn.microsoft.com/en-us/azure/sre-agent/python-code-execution) | integrations | 0.65 | Python tools are a product-specific integration mechanism to reach internal APIs and systems. Page likely includes tool definition schema, parameters, and execution behavior unique to SRE Agent, fitting integrations & coding patterns. | | [Set up PagerDuty indexing](https://learn.microsoft.com/en-us/azure/sre-agent/set-up-pagerduty-indexing) | integrations | 0.65 | PagerDuty incident indexing requires product-specific connector configuration (API keys, endpoints, filters); this is an integration pattern. | -| [Set up ServiceNow indexing](https://learn.microsoft.com/en-us/azure/sre-agent/setup-servicenow-indexing) | integrations | 0.65 | ServiceNow indexing with basic auth or OAuth 2.0 implies detailed connector settings and parameters unique to this integration. 
| | [Supported regions](https://learn.microsoft.com/en-us/azure/sre-agent/supported-regions) | decision-making | 0.65 | Region support is time- and product-specific expert knowledge. Page likely lists exact supported regions and provides guidance on choosing a region, which is concrete decision-making content not inferable from general training. | | [Use code interpreter](https://learn.microsoft.com/en-us/azure/sre-agent/use-code-interpreter) | configuration | 0.65 | Enabling Code Interpreter for the agent likely involves specific configuration switches, allowed operations, and possibly limits for file handling and analysis, which are product-specific configuration details. | | [Agent hooks](https://learn.microsoft.com/en-us/azure/sre-agent/agent-hooks) | configuration | 0.60 | Hooks are custom checkpoints with specific types (for example Stop, PostToolUse) and likely have structured configuration fields and allowed values, which are product-specific configuration details. | | [Audit agent actions](https://learn.microsoft.com/en-us/azure/sre-agent/audit-agent-actions) | troubleshooting | 0.60 | Describes querying customEvents in Application Insights with KQL to see agent actions; likely includes specific event names, properties, and query patterns unique to SRE Agent, which are troubleshooting/diagnostic details. | -| [Chat from your tools](https://learn.microsoft.com/en-us/azure/sre-agent/chat-from-your-tools) | integrations | 0.60 | Describes chatting with the agent directly in Teams; likely includes product-specific configuration or usage patterns for this integration. | +| [Connect knowledge](https://learn.microsoft.com/en-us/azure/sre-agent/connect-knowledge) | integrations | 0.60 | Page is about connecting runbooks, docs, source code, and web resources as knowledge inputs. This is a product-specific integration pattern for external knowledge sources, likely with configuration options and constraints. 
| | [Create a Python tool](https://learn.microsoft.com/en-us/azure/sre-agent/create-python-tool) | integrations | 0.60 | Python tool creation tutorial will show tool definition, parameters, and deployment specifics unique to SRE Agent’s Python tool system, which is an integration/coding pattern. | | [Create a skill](https://learn.microsoft.com/en-us/azure/sre-agent/create-skill) | configuration | 0.60 | Skill creation involves specifying instructions, tools, and supporting files; page likely includes schema/fields for skill configuration unique to SRE Agent. | | [Create a subagent](https://learn.microsoft.com/en-us/azure/sre-agent/create-subagent) | configuration | 0.60 | Subagent builder involves defining instructions, tools, skills, and hooks; page likely documents configuration fields and options for subagents. | -| [Create an HTTP trigger](https://learn.microsoft.com/en-us/azure/sre-agent/create-http-trigger) | integrations | 0.60 | HTTP trigger tutorial likely documents request schema, headers, and payload parameters for invoking the agent from CI/CD or HTTP clients, which are integration details. | | [Deep investigation](https://learn.microsoft.com/en-us/azure/sre-agent/deep-investigation) | decision-making | 0.60 | Explicitly contrasts deep vs standard investigation and lists scenarios for use; provides decision guidance on when to choose each investigation mode. | | [Diagnose with external observability](https://learn.microsoft.com/en-us/azure/sre-agent/diagnose-observability) | integrations | 0.60 | Covers querying Azure Monitor plus external tools (Dynatrace, Datadog, Splunk) via MCP; likely includes connector parameters and integration patterns. | -| [Install a Marketplace plugin](https://learn.microsoft.com/en-us/azure/sre-agent/install-plugin-from-marketplace) | integrations | 0.60 | Marketplace plugin installation and configuration likely includes plugin manifest fields and connector settings specific to SRE Agent. 
| -| [Pricing and billing](https://learn.microsoft.com/en-us/azure/sre-agent/pricing-billing) | decision-making | 0.60 | Pricing and billing article for Azure SRE Agent with Azure Agent Units, always-on vs active flow rates, and cost optimization tips; likely includes rate details and guidance to choose usage patterns for cost, fitting decision-making around cost/performance trade-offs. | -| [Set up Teams connector](https://learn.microsoft.com/en-us/azure/sre-agent/set-up-teams-connector) | integrations | 0.60 | Teams connector setup is an integration; page likely includes connector configuration fields (channel IDs, permissions, scopes) specific to this product. | +| [Incident response plans](https://learn.microsoft.com/en-us/azure/sre-agent/incident-response-plans) | configuration | 0.60 | Describes routing incidents to specialized custom agents with specific tools, investigation depth, and autonomy. This is product-specific configuration of incident routing and agent behavior, beyond generic concepts. | ## Unclassified Pages | TOC Title | Confidence | Reason | |-----------|------------|--------| | [Add web page knowledge](https://learn.microsoft.com/en-us/azure/sre-agent/add-web-page-knowledge) | 0.40 | How to add web pages as knowledge sources; likely a simple how-to without detailed parameter tables or constraints. | -| [Azure Monitor alerts](https://learn.microsoft.com/en-us/azure/sre-agent/azure-monitor-alerts) | 0.40 | Explains connecting Azure Monitor via managed identity and that a scanner detects alerts every minute. While it mentions a one-minute detection interval, the page is primarily an integration/behavior overview without broader limits tables or configuration matrices, so it does not meet the stronger expert-knowledge criteria. 
| +| [Connect to ServiceNow](https://learn.microsoft.com/en-us/azure/sre-agent/connect-servicenow) | 0.40 | Tutorial to connect ServiceNow using OAuth 2.0 or basic auth; appears to be a how-to guide without explicit expert-level configuration matrices or error-code-based troubleshooting. | +| [Create an HTTP trigger](https://learn.microsoft.com/en-us/azure/sre-agent/create-http-trigger) | 0.40 | Tutorial for creating an HTTP trigger and integrating it into CI/CD; summary suggests step-by-step usage, but no clear indication of parameter tables, limits, or error mappings that would qualify as expert knowledge. | | [Execute mitigations](https://learn.microsoft.com/en-us/azure/sre-agent/execute-mitigations) | 0.40 | Describes mitigation capabilities and autonomy level conceptually; summary doesn’t show specific configuration values or limits. | -| [Incident platforms](https://learn.microsoft.com/en-us/azure/sre-agent/incident-platforms) | 0.40 | Explains what an incident platform is and how it changes agent behavior; appears conceptual without detailed configuration or limits. | -| [Incident response plans](https://learn.microsoft.com/en-us/azure/sre-agent/incident-response-plans) | 0.40 | Conceptual explanation of response plans routing incidents; summary doesn’t show detailed configuration tables or numeric thresholds. | | [Run modes](https://learn.microsoft.com/en-us/azure/sre-agent/run-modes) | 0.40 | Describes run modes conceptually (approval vs autonomous); no indication of numeric thresholds, config tables, or security role lists. | +| [Set up ServiceNow indexing](https://learn.microsoft.com/en-us/azure/sre-agent/setup-servicenow-indexing) | 0.40 | Tutorial for setting up ServiceNow incident indexing; summary mentions auth options and duration but not specific configuration parameter tables, limits, or troubleshooting mappings. 
| +| [Set up Teams connector](https://learn.microsoft.com/en-us/azure/sre-agent/set-up-teams-connector) | 0.40 | Tutorial to connect SRE Agent to a Teams channel; from the summary it looks like a step-by-step setup guide without explicit config parameter tables or quotas, so it doesn’t clearly meet any expert-knowledge category. | | [Skills](https://learn.microsoft.com/en-us/azure/sre-agent/skills) | 0.40 | Explains skills and how they extend the agent; likely conceptual with examples, not a detailed config or integration reference. | | [Tools](https://learn.microsoft.com/en-us/azure/sre-agent/tools) | 0.40 | Overview of tools and capabilities; summary does not show specific parameter tables or SDK references required for integrations/configuration classification. | -| [Workflow automation](https://learn.microsoft.com/en-us/azure/sre-agent/workflow-automation) | 0.40 | High-level workflow automation description; no explicit indication of detailed configuration parameters, limits, or decision matrices. | -| [Upload knowledge documents](https://learn.microsoft.com/en-us/azure/sre-agent/upload-knowledge-document) | 0.35 | Uploading knowledge documents is described conceptually; summary mentions file types but not detailed parameter tables or limits/quotas. | | [Agent reasoning](https://learn.microsoft.com/en-us/azure/sre-agent/agent-reasoning) | 0.30 | Explains reasoning behavior and action classification; appears conceptual without concrete config values or error mappings. | | [Anthropic subprocessor](https://learn.microsoft.com/en-us/azure/sre-agent/anthropic-sub-processor) | 0.30 | Describes Anthropic as a subprocessor, contractual terms, and data residency controls at a high level; primarily legal/compliance overview without concrete security configuration parameters or numeric thresholds. 
| | [Automate incident response](https://learn.microsoft.com/en-us/azure/sre-agent/automate-incidents) | 0.30 | Tutorial for automating incident response; likely procedural steps, not a reference of specific parameters, limits, or error codes. | -| [Automate workflows](https://learn.microsoft.com/en-us/azure/sre-agent/automate-workflows) | 0.30 | Workflow automation tutorial; describes connecting tools and scheduling tasks without clear evidence of detailed configuration tables. | -| [Connect knowledge](https://learn.microsoft.com/en-us/azure/sre-agent/connect-knowledge) | 0.30 | Conceptual guidance on connecting knowledge sources (runbooks, docs, code, web). Summary does not indicate detailed configuration options, limits, or security roles; likely an overview of the knowledge base feature. | -| [Connectors](https://learn.microsoft.com/en-us/azure/sre-agent/connectors) | 0.30 | Describes connectors conceptually (extending to external data sources, collaboration tools, APIs). Summary suggests high-level explanation rather than detailed connector configuration tables or SDK parameters. | +| [Automate workflows](https://learn.microsoft.com/en-us/azure/sre-agent/automate-workflows) | 0.30 | Covers automating workflows and scheduling tasks conceptually; summary does not indicate detailed configuration parameter tables, limits, or error-code-based troubleshooting. | | [Create a Kusto tool](https://learn.microsoft.com/en-us/azure/sre-agent/create-kusto-tool) | 0.30 | Kusto tool creation tutorial focused on portal UI steps and example queries; no evidence of detailed configuration option tables, limits, or error-code-based troubleshooting. | -| [Create and set up](https://learn.microsoft.com/en-us/azure/sre-agent/create-and-set-up) | 0.30 | Onboarding and setup wizard for Azure SRE Agent; likely a step-by-step deployment/connection guide without detailed config matrices, limits, or product-specific parameter tables. 
| -| [Create scheduled tasks](https://learn.microsoft.com/en-us/azure/sre-agent/create-scheduled-task) | 0.30 | Tutorial for creating and editing scheduled tasks; likely step-by-step usage instructions without configuration parameter matrices, quotas, or troubleshooting mappings. | +| [Create and set up](https://learn.microsoft.com/en-us/azure/sre-agent/create-and-set-up) | 0.30 | Tutorial-style onboarding and setup for Azure SRE Agent; description suggests step-by-step wizard usage without exposing detailed configuration tables, limits, or product-specific troubleshooting mappings. | +| [Create scheduled tasks](https://learn.microsoft.com/en-us/azure/sre-agent/create-scheduled-task) | 0.30 | Tutorial for creating and editing scheduled tasks; summary focuses on basic setup and editing, not on detailed configuration options, limits, or troubleshooting content. | | [Cross-account ADO access](https://learn.microsoft.com/en-us/azure/sre-agent/cross-account-azdo-oauth-authorization) | 0.30 | Cross-tenant Azure DevOps access tutorial; description suggests sign-in flow guidance rather than detailed security role mappings, limits, or troubleshooting content. | | [Data privacy and residency](https://learn.microsoft.com/en-us/azure/sre-agent/data-privacy) | 0.30 | Data residency and privacy overview; typically describes where and how data is stored/processed and general privacy measures, but not concrete RBAC roles, config parameters, or numeric limits. | | [Diagnose with Azure observability](https://learn.microsoft.com/en-us/azure/sre-agent/diagnose-azure-observability) | 0.30 | Describes how the agent automatically queries Azure observability services via managed identity and RBAC. Appears to be behavior/benefits rather than specific configuration parameters, error codes, or limits. 
| | [File attachments](https://learn.microsoft.com/en-us/azure/sre-agent/file-attachments) | 0.30 | Explains file attachments and automatic analysis; no evidence of specific limits, config options, or error codes. | +| [HTTP triggers](https://learn.microsoft.com/en-us/azure/sre-agent/http-triggers) | 0.30 | Explains what HTTP triggers are and how they’re used; summary does not show specific config tables, limits, or error mappings that would qualify as expert knowledge. | | [Incident response](https://learn.microsoft.com/en-us/azure/sre-agent/incident-response) | 0.30 | Marketing-style description of automated incident response with video; no clear indication of detailed technical configuration or limits. | +| [Install a Marketplace plugin](https://learn.microsoft.com/en-us/azure/sre-agent/install-plugin-from-marketplace) | 0.30 | Tutorial for installing marketplace plugins; summary indicates a basic workflow (add source, browse, import, configure) without detailed configuration matrices or product-specific troubleshooting. | | [Memory & knowledge](https://learn.microsoft.com/en-us/azure/sre-agent/memory) | 0.30 | Describes memory and knowledge conceptually; no evidence of specific configuration parameters or numeric thresholds. | | [Plugin marketplace](https://learn.microsoft.com/en-us/azure/sre-agent/plugin-marketplace) | 0.30 | Plugin Marketplace description focuses on browsing and installing plugins; summary does not indicate detailed configuration parameter tables or limits. | | [Root cause analysis](https://learn.microsoft.com/en-us/azure/sre-agent/root-cause-analysis) | 0.30 | Explains root cause analysis reasoning; appears conceptual without concrete configuration parameters or numeric thresholds. 
| | [Set up Azure DevOps connector](https://learn.microsoft.com/en-us/azure/sre-agent/azure-devops-connector) | 0.30 | Tutorial-style connector setup; description suggests step-by-step OAuth/PAT connection but no indication of detailed config parameter tables, limits, or error-code-based troubleshooting. | | [Set up MCP connector](https://learn.microsoft.com/en-us/azure/sre-agent/mcp-connector) | 0.30 | MCP connector tutorial using GitHub MCP as example; appears to be a how-to guide rather than a reference of configuration parameters, limits, or troubleshooting mappings. | +| [Set up an incident trigger](https://learn.microsoft.com/en-us/azure/sre-agent/response-plan) | 0.30 | Tutorial for creating an incident response plan; summary emphasizes what the feature does and basic usage, not detailed decision matrices, limits, or configuration references. | | [Subagents](https://learn.microsoft.com/en-us/azure/sre-agent/sub-agents) | 0.30 | Conceptual explanation of custom agents and /agent command; no indication of expert-only configuration or limits. | | [Test a tool in the playground](https://learn.microsoft.com/en-us/azure/sre-agent/test-tool-playground) | 0.30 | Tool testing playground tutorial focuses on usage; unlikely to contain detailed configuration matrices or limits beyond generic testing steps. | +| [Track incident value](https://learn.microsoft.com/en-us/azure/sre-agent/track-incident-value) | 0.30 | Focuses on tracking incident value and analytics; summary indicates conceptual benefits and use cases, not specific configuration options or limits. | | [Troubleshoot App Service](https://learn.microsoft.com/en-us/azure/sre-agent/troubleshoot-azure-app-service) | 0.30 | Tutorial on troubleshooting an app with SRE Agent and App Service; appears procedural without detailed error-code mappings or configuration tables. 
| | [Troubleshoot Container Apps](https://learn.microsoft.com/en-us/azure/sre-agent/troubleshoot-azure-container-apps) | 0.30 | Tutorial for Azure Container Apps; mostly step-by-step usage of the agent, not deep product-specific limits or configs. | | [General](https://learn.microsoft.com/en-us/azure/sre-agent/faq) | 0.25 | General FAQ about service overview, pricing, and availability; likely high-level and marketing/overview oriented without detailed technical configuration or limits. | -| [Set up an incident trigger](https://learn.microsoft.com/en-us/azure/sre-agent/response-plan) | 0.25 | Tutorial for creating a response plan; summary suggests a guided workflow rather than detailed decision matrices or configuration tables. | | [Use DocsGuide](https://learn.microsoft.com/en-us/azure/sre-agent/use-docsguide) | 0.25 | DocsGuide usage tutorial is about asking questions; unlikely to contain detailed configuration parameters or limits beyond basic usage. | | [Add a web page knowledge source](https://learn.microsoft.com/en-us/azure/sre-agent/add-web-page) | 0.20 | Simple tutorial to add a web page as a knowledge source; appears to be basic UI steps without configuration tables or advanced patterns. | | [Azure Data Explorer connector](https://learn.microsoft.com/en-us/azure/sre-agent/kusto-cluster-grouping) | 0.20 | Describes connecting to Azure Data Explorer clusters and grouping; summary suggests conceptual connector behavior without specific configuration tables, limits, or error mappings. | | [Complete setup](https://learn.microsoft.com/en-us/azure/sre-agent/complete-setup) | 0.20 | Describes completing setup and connecting data sources; appears to be workflow guidance rather than detailed configuration reference. 
| | [Deep context](https://learn.microsoft.com/en-us/azure/sre-agent/workspace-tools) | 0.20 | Explains the concept of Deep Context and mentions workspace tools enablement, but appears to be conceptual behavior/benefits rather than concrete configuration parameters, limits, or troubleshooting details. | -| [HTTP triggers](https://learn.microsoft.com/en-us/azure/sre-agent/http-triggers) | 0.20 | Describes HTTP triggers conceptually as webhooks; summary does not suggest detailed configuration parameters, limits, or troubleshooting content. | | [Learn via Chat](https://learn.microsoft.com/en-us/azure/sre-agent/docsguide) | 0.20 | DocsGuide feature overview; no indication of detailed configuration parameters, limits, or troubleshooting mappings. | +| [MCP connectors](https://learn.microsoft.com/en-us/azure/sre-agent/mcp-connectors) | 0.20 | Overview of MCP connectors and tools; summary suggests marketing/feature description rather than detailed integration parameters or troubleshooting guidance. | | [Manage global tools](https://learn.microsoft.com/en-us/azure/sre-agent/manage-global-tools) | 0.20 | Tutorial on browsing and toggling tools at space level; appears to be basic feature usage with one date-based note, but no detailed configuration matrices, limits, or troubleshooting content. | | [Run a deep investigation](https://learn.microsoft.com/en-us/azure/sre-agent/tutorial-deep-investigation) | 0.20 | Tutorial on using deep investigation from chat and response plans; appears to be step-by-step usage guidance without detailed config tables, limits, or error mappings. | | [Run your first investigation](https://learn.microsoft.com/en-us/azure/sre-agent/first-investigation) | 0.20 | Quickstart-style tutorial for first investigation; no evidence of expert-only limits, configs, or troubleshooting mappings. 
| -| [Scheduled tasks](https://learn.microsoft.com/en-us/azure/sre-agent/scheduled-tasks) | 0.20 | Appears to be a conceptual/how-to overview of scheduled tasks with scenarios; no indication of numeric limits, config tables, or error-code-based troubleshooting. | -| [Send notifications](https://learn.microsoft.com/en-us/azure/sre-agent/send-notifications) | 0.20 | High-level description of notifications to Teams/Outlook; no evidence of detailed configuration options, quotas, or error mappings. | +| [Scheduled tasks](https://learn.microsoft.com/en-us/azure/sre-agent/scheduled-tasks) | 0.20 | High-level description of scheduled tasks; summary suggests conceptual benefits, not detailed configuration parameters, limits, or troubleshooting content. | +| [Send notifications](https://learn.microsoft.com/en-us/azure/sre-agent/send-notifications) | 0.20 | Describes notification behavior and supported channels (Teams, Outlook, MCP tools) at a conceptual level; no evidence of detailed configuration, quotas, or troubleshooting. | | [Set up Kusto connector](https://learn.microsoft.com/en-us/azure/sre-agent/kusto-connector) | 0.20 | Tutorial-style connector setup for Azure Data Explorer in SRE Agent; summary indicates step-by-step UI usage without exposing detailed configuration parameter tables, limits, or product-specific error mappings. | | [Set up Outlook connector](https://learn.microsoft.com/en-us/azure/sre-agent/outlook-connector) | 0.20 | Basic Outlook connector setup tutorial; summary indicates high-level steps without product-specific configuration tables, limits, or security role details. | | [Set up incident response](https://learn.microsoft.com/en-us/azure/sre-agent/tutorial-incident-response) | 0.20 | Step in a getting-started flow to connect an incident platform and create response plans; appears to be basic setup instructions rather than deep configuration or troubleshooting content. 
| | [Team onboarding](https://learn.microsoft.com/en-us/azure/sre-agent/team-onboard) | 0.20 | Conceptual guidance on teaching the agent about team and architecture; no indication of specific configuration parameters or limits. | | [Threads](https://learn.microsoft.com/en-us/azure/sre-agent/threads) | 0.20 | Defines threads and conversation organization; conceptual usage guidance without detailed technical parameters. | | [Tools and skills](https://learn.microsoft.com/en-us/azure/sre-agent/global-tools-page) | 0.20 | High-level description of viewing and toggling tools/skills at space level; no indication of numeric limits, config parameter tables, error codes, or other expert-only details. | -| [Upload knowledge documents](https://learn.microsoft.com/en-us/azure/sre-agent/tutorial-upload-knowledge-document) | 0.20 | Tutorial on uploading knowledge documents; summary indicates workflow guidance, not detailed configuration, limits, or decision matrices. | +| [Upload knowledge documents](https://learn.microsoft.com/en-us/azure/sre-agent/tutorial-upload-knowledge-document) | 0.20 | Tutorial-style guidance on uploading knowledge documents via chat and portal; no configuration tables, limits, error codes, or product-specific parameters beyond basic UI steps. | | [What is SRE Agent?](https://learn.microsoft.com/en-us/azure/sre-agent/overview) | 0.20 | High-level product overview and value proposition without concrete limits, configs, or error details. | +| [Workflow automation](https://learn.microsoft.com/en-us/azure/sre-agent/workflow-automation) | 0.20 | Appears to be a conceptual/feature overview of workflow automation in SRE Agent; no indication of numeric limits, config tables, error codes, or product-specific decision matrices. | | [Share files and screenshots](https://learn.microsoft.com/en-us/azure/sre-agent/file-attachments-tutorial) | 0.15 | Tutorial on sharing files and screenshots; appears to be step-by-step usage without detailed configuration or limits. 
| | [Agent playground](https://learn.microsoft.com/en-us/azure/sre-agent/agent-playground) | 0.10 | Agent Playground page sounds like a feature overview (split-screen editor, AI scoring) without product-specific config tables or limits. | | [Favorites and Mine filter](https://learn.microsoft.com/en-us/azure/sre-agent/favorites-mine-filter) | 0.10 | Covers using favorites and a filter in the UI; purely organizational UX feature with no product-specific limits, configs, or troubleshooting content. | | [Starter prompts](https://learn.microsoft.com/en-us/azure/sre-agent/starter-prompts) | 0.10 | Starter prompts are usage examples; they don’t represent configuration, limits, or troubleshooting knowledge and are not critical expert reference data. | -| [Track incident value](https://learn.microsoft.com/en-us/azure/sre-agent/track-incident-value) | 0.10 | Mentions tracking incident value with charts and ratings; appears to be an analytics/overview feature without detailed configuration, limits, or error-resolution guidance. | +| [Upload knowledge documents](https://learn.microsoft.com/en-us/azure/sre-agent/upload-knowledge-document) | 0.10 | Very short, appears to be a conceptual description of uploading knowledge documents; no sign of parameter tables, limits, or error codes. 
| diff --git a/products/azure-synapse-analytics/azure-synapse-analytics.csv b/products/azure-synapse-analytics/azure-synapse-analytics.csv index 44ab43c4..ed38e508 100644 --- a/products/azure-synapse-analytics/azure-synapse-analytics.csv +++ b/products/azure-synapse-analytics/azure-synapse-analytics.csv @@ -226,6 +226,7 @@ https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/connect-monitor- https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/data-sources/apache-spark-cdm-connector,Azure Spark CDM Connector,Spark Common Data Model connector for Azure Synapse Analytics - Azure Synapse Analytics,Use Spark CDM connector to read/write Common Data Model,Learn how to use the Spark CDM connector in Azure Synapse Analytics to read and write Common Data Model entities in a Common Data Model folder on Azure Data Lake Storage.,"The Spark Common Data Model connector (Spark CDM connector) is a format reader/writer in Azure Synapse Analytics. It enables a Spark program to read and write Common Data Model entities in a Common Data Model folder via Spark DataFrames. For information on defining Common Data Model documents by using Common Data Model 1.2, see this article about what Common Data Model is and how to use it.",2023-03-06T18:05:00.000Z,concept-article,integrations,0.7,True,"Connector-specific article; expected to list schema handling, folder structure requirements, and connector options for CDM in Synapse.",unchanged https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/data-sources/apache-spark-kusto-connector,Azure Data Explorer (Kusto),Azure Data Explorer (Kusto) - Azure Synapse Analytics,Use Kusto connector with Synapse serverless Spark,This article provides information on how to use the connector for moving data between Azure Data Explorer (Kusto) and serverless Apache Spark pools.,"The Azure Data Explorer (Kusto) connector for Apache Spark is designed to efficiently transfer data between Kusto clusters and Spark. 
This connector is available in Python, Java, and .NET.",2023-10-12T11:16:00.000Z,conceptual,integrations,0.75,True,"Describes Azure Data Explorer connector; likely includes connection properties, authentication parameters, and data movement options unique to this connector.",unchanged https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/data-sources/apache-spark-sql-connector,Azure SQL and SQL Server,Spark connector for SQL databases - Azure Synapse Analytics,Use Synapse Spark connector for SQL databases,Learn how to use Spark Connector to connect to Azure SQL databases from Synapse Spark Runtime,"Important This feature is in preview. The Spark connector for SQL databases is a high-performance library that lets you read from and write to SQL Server, Azure SQL databases, and Fabric SQL databases. The connector offers the following capabilities: The connector is preinstalled in the Synapse Spark 3.5 runtime, so you don't need to install it separately.",2026-02-04T06:14:00.000Z,how-to,integrations,0.75,True,"Connector for SQL Server/Azure SQL/Fabric SQL; expected to document driver options, partitioning, and performance-related parameters unique to this library.",unchanged +https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/how-to-use-certificate-with-service-principal-for-log-storage,Collect Apache Spark Application Logs and Metrics to Azure Storage Account Using Certificate-Based Service Principal Authentication,Collect Apache Spark Application Logs and Metrics to Azure Storage account Using Certificate-Based Service Principal Authentication - Azure Synapse Analytics,Configure Synapse Spark logs to Azure Storage with cert auth,"Learn to set up Azure services, particularly focusing on integrating Azure Synapse with Azure Storage account and Key Vault.","The Apache Spark diagnostic emitter extension is a library that allows Spark applications to send logs, event logs, and metrics to destinations like Azure Storage account, Azure Log 
Analytics, and Azure Storage. In this tutorial, you learn how to create required Azure resources and configure a Spark application with a certificate and service principal to emit logs, event logs, and metrics to Azure Storage account using the Apache Spark diagnostic emitter extension.",2026-04-21T11:11:00.000Z,tutorial,integrations,0.7,True,"Page describes product-specific integration of Synapse Spark with Azure Storage and Key Vault using the Apache Spark diagnostic emitter extension and certificate-based service principal authentication, including concrete configuration steps and parameters unique to this integration scenario.",new https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/how-to-use-certificate-with-service-principalp-emit-log-event-hubs,Collect Apache Spark Application Logs and Metrics to Azure Event Hubs Using Certificate-Based Service Principal Authentication,Collect Apache Spark Application Logs and Metrics to Azure Event Hubs Using Certificate-Based Service Principal Authentication - Azure Synapse Analytics,Secure Synapse Spark log emission with certificate-based service principal,"Learn to set up Azure services, particularly focusing on integrating Azure Synapse with Azure Event Hubs and Key Vault.","The Apache Spark diagnostic emitter extension is a library that allows Spark applications to send logs, event logs, and metrics to destinations like Azure Event Hubs, Azure Log Analytics, and Azure Storage. 
In this tutorial, you learn how to create required Azure resources and configure a Spark application with a certificate and service principal to emit logs, event logs, and metrics to Azure Event Hubs using the Apache Spark diagnostic emitter extension.",2025-04-16T11:18:00.000Z,tutorial,security,0.8,True,"Focuses on certificate-based service principal auth for Event Hubs and Key Vault; expected to list specific Azure AD app settings, certificate handling, and RBAC roles.",unchanged https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/intellij-tool-synapse,Connect to Spark with IntelliJ plug-in,Tutorial - Azure Toolkit for IntelliJ (Spark application) - Azure Synapse Analytics,Develop and submit Spark apps from IntelliJ to Synapse,"Tutorial - Use the Azure Toolkit for IntelliJ to develop Spark applications, which are written in Scala, and submit them to a serverless Apache Spark pool.","This tutorial shows you how to use the Azure Toolkit for IntelliJ plug-in to develop Apache Spark applications, which are written in Scala, and then submit them to a serverless Apache Spark pool directly from the IntelliJ integrated development environment (IDE). You can use the plug-in in a few ways: In this tutorial, you learn how to:",2023-04-11T00:00:00.000Z,tutorial,integrations,0.65,True,Tutorial for Azure Toolkit for IntelliJ integration; includes plugin-specific configuration and submission patterns to Synapse Spark pools.,unchanged https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/low-shuffle-merge-for-apache-spark,Low Shuffle Merge on Delta Lake,Low Shuffle Merge optimization on Delta tables - Azure Synapse Analytics,Improve Delta MERGE performance with Low Shuffle Merge,Low Shuffle Merge optimization on Delta tables for Apache Spark,"Delta Lake MERGE command allows users to update a delta table with advanced conditions. It can update data from a source table, view or DataFrame into a target table by using the MERGE command. 
However, the current algorithm isn't fully optimized for handling unmodified rows. With Low Shuffle Merge optimization, unmodified rows are excluded from an expensive shuffling operation that is needed for updating matched rows.",2025-06-02T22:13:00.000Z,reference,best-practices,0.7,True,Explains Low Shuffle Merge optimization for Delta tables; contains product-specific behavior and guidance on when and how to use this optimization.,unchanged diff --git a/products/azure-synapse-analytics/report.md b/products/azure-synapse-analytics/report.md index f688411c..d8b3a44e 100644 --- a/products/azure-synapse-analytics/report.md +++ b/products/azure-synapse-analytics/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-12' +generated_at: '2026-04-26' category_descriptions: deployment: 'Deploying and managing Synapse workspaces and dedicated SQL pools: ARM/Bicep templates, CI/CD, source control, restore points, maintenance windows, @@ -22,22 +22,22 @@ category_descriptions: troubleshooting: 'Diagnosing and fixing Synapse issues: workspace/tenant moves, Spark jobs and libraries, SQL pools and queries, Synapse Link, Studio/storage connectivity, and common errors/timeouts.' - integrations: Patterns and code to integrate Synapse (Spark, serverless, dedicated - SQL) with ADLS, Cosmos DB, Azure SQL, AML, monitoring (Log Analytics, Prometheus), - and external tools via connectors, REST, and T-SQL. + integrations: Integrating Synapse with Spark, SQL pools, ML, Cosmos DB, ADLS, Kusto, + Stream Analytics, and tools (Fivetran, Striim), plus logging/monitoring, packaging, + and data movement patterns. limits-quotas: 'Synapse SQL pool limits: maintenance windows, memory/concurrency by performance level, capacity caps, temp table behavior, serverless Delta Lake v1 querying, and Synapse Link feature limits/issues.' 
skill_description: Expert knowledge for Azure Synapse Analytics development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. - Use when designing Synapse workspaces, Spark pools, dedicated/serverless SQL, Delta - Lake, or Synapse Link workloads, and other Azure Synapse Analytics related development + Use when working with Synapse workspaces, Spark pools, dedicated/serverless SQL + pools, Synapse Link, or Delta Lake, and other Azure Synapse Analytics related development tasks. Not for Azure Data Factory (use azure-data-factory), Azure Data Explorer (use azure-data-explorer), Azure Databricks (use azure-databricks), Azure HDInsight (use azure-hdinsight). -use_when: Use when designing Synapse workspaces, Spark pools, dedicated/serverless - SQL, Delta Lake, or Synapse Link workloads, and other Azure Synapse Analytics related +use_when: Use when working with Synapse workspaces, Spark pools, dedicated/serverless + SQL pools, Synapse Link, or Delta Lake, and other Azure Synapse Analytics related development tasks. 
confusable_not_for: Not for Azure Data Factory (use azure-data-factory), Azure Data Explorer (use azure-data-explorer), Azure Databricks (use azure-databricks), Azure @@ -47,14 +47,14 @@ confusable_not_for: Not for Azure Data Factory (use azure-data-factory), Azure D ## Summary -- **Total Pages**: 446 -- **Fetched**: 446 +- **Total Pages**: 447 +- **Fetched**: 447 - **Fetch Failed**: 0 -- **Classified**: 260 +- **Classified**: 261 - **Unclassified**: 186 ### Incremental Update -- **New Pages**: 0 +- **New Pages**: 1 - **Updated Pages**: 0 - **Unchanged**: 446 - **Deleted Pages**: 0 @@ -69,14 +69,18 @@ confusable_not_for: Not for Azure Data Factory (use azure-data-factory), Azure D | configuration | 45 | 10.1% | | decision-making | 16 | 3.6% | | deployment | 11 | 2.5% | -| integrations | 33 | 7.4% | +| integrations | 34 | 7.6% | | limits-quotas | 6 | 1.3% | | security | 55 | 12.3% | | troubleshooting | 22 | 4.9% | -| *(Unclassified)* | 186 | 41.7% | +| *(Unclassified)* | 186 | 41.6% | ## Changes +### New Pages + +- [Collect Apache Spark Application Logs and Metrics to Azure Storage Account Using Certificate-Based Service Principal Authentication](https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/how-to-use-certificate-with-service-principal-for-log-storage) + ## Classified Pages | TOC Title | Type | Confidence | Reason | @@ -199,6 +203,7 @@ confusable_not_for: Not for Azure Data Factory (use azure-data-factory), Azure D | [Browse an ADLS Gen2 folder with ACLs](https://learn.microsoft.com/en-us/azure/synapse-analytics/how-to-access-container-with-access-control-lists) | security | 0.70 | Focuses on using ACLs instead of account-level roles; likely includes specific RBAC roles, ACL entries, and security configuration details. 
| | [CETAS](https://learn.microsoft.com/en-us/azure/synapse-analytics/sql/develop-tables-cetas) | integrations | 0.70 | CETAS syntax and options for Synapse SQL are product-specific; includes detailed T-SQL patterns and storage-related options that go beyond generic SQL knowledge. | | [COPY statement examples](https://learn.microsoft.com/en-us/azure/synapse-analytics/sql-data-warehouse/quickstart-bulk-load-copy-tsql-examples) | security | 0.70 | Focuses on authentication mechanisms for COPY with concrete examples; likely includes specific auth options, parameters, and secure configuration patterns unique to Synapse COPY. | +| [Collect Apache Spark Application Logs and Metrics to Azure Storage Account Using Certificate-Based Service Principal Authentication](https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/how-to-use-certificate-with-service-principal-for-log-storage) | integrations | 0.70 | Page describes product-specific integration of Synapse Spark with Azure Storage and Key Vault using the Apache Spark diagnostic emitter extension and certificate-based service principal authentication, including concrete configuration steps and parameters unique to this integration scenario. | | [Collect Apache Spark applications logs and metrics with Azure Event Hubs](https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/azure-synapse-diagnostic-emitters-azure-eventhub) | integrations | 0.70 | Similar to Storage article but for Event Hubs; likely includes connection string formats, Event Hubs-specific emitter settings, and usage patterns. | | [Collect Apache Spark applications logs and metrics with Azure Storage account](https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/azure-synapse-diagnostic-emitters-azure-storage) | integrations | 0.70 | Uses diagnostic emitter extension; expected to document library config keys, destinations, and constraints for sending logs/metrics to Storage. 
| [Collect Apache Spark applications metrics using APIs](https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/connect-monitor-azure-synapse-spark-application-level-metrics) | integrations | 0.70 | Tutorial for integrating Synapse Spark with on-prem Prometheus; likely includes connector configuration parameters, endpoints, and product-specific integration details. | diff --git a/products/azure-test-plans/azure-test-plans.csv b/products/azure-test-plans/azure-test-plans.csv index 1d63e5e0..23497d30 100644 --- a/products/azure-test-plans/azure-test-plans.csv +++ b/products/azure-test-plans/azure-test-plans.csv @@ -1,22 +1,22 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type https://learn.microsoft.com/en-us/azure/devops/test/actual-result?view=azure-devops,Actual Result,Record Actual Results for Manual Test Runs - Azure Test Plans,,"Record actual results for manual test runs in Azure Test Plans. Discover how to configure, capture, and review execution outcomes for each test step.","Azure DevOps Services Important This capability is in preview. Functionality might change or be discontinued without notice. Preview capabilities have no Service Level Agreement (SLA) and limited support. -If this capability isn't yet available in your organization, wait a few days as it rolls out gradually. Use the Actual Result field in Azure Test Plans to record the execution outcome for each test step during manual test runs in the web runner. You enable the Actual Result field at the test plan ",2026-04-14T01:03:00.000Z,how-to,,0.3,False,"Describes preview feature to record actual results; configuration is at test plan level but no clear numeric limits, config tables, or error codes in summary.",new +If this capability isn't yet available in your organization, wait a few days as it rolls out gradually. 
Use the Actual Result field in Azure Test Plans to record the execution outcome for each test step during manual test runs in the web runner. You enable the Actual Result field at the test plan ",2026-04-14T01:03:00.000Z,how-to,,0.3,False,"Describes preview feature to record actual results; configuration is at test plan level but no clear numeric limits, config tables, or error codes in summary.",unchanged https://learn.microsoft.com/en-us/azure/devops/test/add-to-bugs-exploratory-testing?view=azure-devops,Add to existing bugs,Add findings to existing bugs - Azure Test Plans,,Manual and exploratory testing - add findings to existing bugs when using the Test & Feedback extension,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 To help avoid duplication, the Test & Feedback extension automatically searches for and displays existing bugs, based on the keywords in the title as you file a new bug. You can choose to continue creating a new bug or add your findings to an existing bug.",2026-02-27T22:02:00.000Z,how-to,,0.3,False,"Describes how the extension searches for existing bugs and how to add findings; no error codes, numeric limits, or config parameter tables.",unchanged -https://learn.microsoft.com/en-us/azure/devops/test/associate-automated-test-with-test-case?view=azure-devops,Associate automated tests with test cases,Associate automated tests with test cases - Azure Test Plans,,"Learn how to associate automated tests with test cases in Azure Test Plans to enable traceability, run tests from pipelines, and track requirement quality.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Associate automated tests with test cases to enable traceability between your test code and requirements. 
When you link an automated test method to a test case work item, you can:",2026-04-17T21:04:00.000Z,how-to,,0.25,False,"How-to associate automated tests with test cases for traceability; likely code/linking steps but summary shows no specific parameter tables, limits, or error codes.",updated -https://learn.microsoft.com/en-us/azure/devops/test/bulk-import-export-test-cases?view=azure-devops,Import and export test cases,Bulk import or export test cases - Azure Test Plans,,"Bulk import and export test cases in Azure Test Plans using CSV or Excel files. Create, update, and manage test cases efficiently with smart field mapping.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Import and export test cases in bulk by using CSV or Microsoft Excel (XLSX) files. You can create new test cases, update existing test cases by ID, or download test cases for external editing. Azure DevOps Services includes an enhanced import wizard with auto-mapping, reusable mapping templates, and multi-sheet XLSX support. Import and export test cases in bulk by using CSV or Microsoft Excel (XLSX) files. You can create new ",2026-04-08T21:05:00.000Z,how-to,,0.25,False,Bulk import/export via CSV/Excel with an import wizard; summary does not indicate detailed parameter tables or limits beyond generic capability.,new +https://learn.microsoft.com/en-us/azure/devops/test/associate-automated-test-with-test-case?view=azure-devops,Associate automated tests with test cases,Associate automated tests with test cases - Azure Test Plans,,"Learn how to associate automated tests with test cases in Azure Test Plans to enable traceability, run tests from pipelines, and track requirement quality.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Associate automated tests with test cases to enable traceability between your test code and requirements. 
When you link an automated test method to a test case work item, you can:",2026-04-17T21:04:00.000Z,how-to,,0.25,False,"How-to associate automated tests with test cases for traceability; likely code/linking steps but summary shows no specific parameter tables, limits, or error codes.",unchanged +https://learn.microsoft.com/en-us/azure/devops/test/bulk-import-export-test-cases?view=azure-devops,Import and export test cases,Bulk import or export test cases - Azure Test Plans,,"Bulk import and export test cases in Azure Test Plans using CSV or Excel files. Create, update, and manage test cases efficiently with smart field mapping.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Import and export test cases in bulk by using CSV or Microsoft Excel (XLSX) files. You can create new test cases, update existing test cases by ID, or download test cases for external editing. Azure DevOps Services includes an enhanced import wizard with auto-mapping, reusable mapping templates, and multi-sheet XLSX support. Import and export test cases in bulk by using CSV or Microsoft Excel (XLSX) files. You can create new ",2026-04-23T08:00:00.000Z,how-to,,0.2,False,"Primarily a how-to/tutorial for bulk importing and exporting test cases via CSV/Excel. The summary does not indicate presence of numeric limits, configuration parameter tables, error-code-based troubleshooting, or other product-specific expert details beyond general workflow steps.",updated https://learn.microsoft.com/en-us/azure/devops/test/collect-diagnostic-data?view=azure-devops,Collect diagnostic data,Collect diagnostic data - Azure Test Plans,,Learn about collecting diagnostic data while testing web and desktop apps with Azure Test Plans in manual and exploratory testing.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Collect diagnostic data while you test your apps. This data is included in the bugs you file during the test. 
You can collect diagnostic data from web apps and from desktop apps, and view it in Azure Test Plans.",2025-10-27T22:02:00.000Z,how-to,,0.25,False,"Describes collecting diagnostic data during tests; summary does not show specific log locations, error codes, or config parameters.",unchanged https://learn.microsoft.com/en-us/azure/devops/test/connected-mode-exploratory-testing?view=azure-devops,Test in Connected mode,Exploratory testing in connected mode - Azure Test Plans,,Manual and exploratory testing - exploratory testing by using the Test & Feedback extension in Connected mode,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 To use the Test & Feedback extension in Connected mode, connect to an Azure DevOps project, which automatically configures the extension based on your access level.",2025-10-01T18:04:00.000Z,quickstart,,0.3,False,"Connected mode usage of extension; summary suggests basic connection steps, not detailed config matrices or limits.",unchanged -https://learn.microsoft.com/en-us/azure/devops/test/copy-clone-test-items?view=azure-devops,Copy or clone test items,"Copy or clone test plans, test suites, test cases, and more - Azure Test Plans",,"Learn how to copy or clone test plans, test suites, and test cases in Azure Test Plans to streamline your testing process and ensure consistency across iterations.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Several tools support copy, clone, or import operations for test items such as test plans, test suites, and test cases. Test cases describe the steps to take to run a test and validate a feature implementation or bug fix. Test suites group test cases, and optionally other test suites, into a particular order. Test plans define a collection of test suites to run for a particular iteration or release. 
Each test case is designed",2026-03-18T21:04:00.000Z,how-to,,0.2,False,"Describes copy/clone/import operations for test artifacts; general feature usage without numeric limits, config matrices, or error mappings.",new +https://learn.microsoft.com/en-us/azure/devops/test/copy-clone-test-items?view=azure-devops,Copy or clone test items,"Copy or clone test plans, test suites, test cases, and more - Azure Test Plans",,"Learn how to copy or clone test plans, test suites, and test cases in Azure Test Plans to streamline your testing process and ensure consistency across iterations.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Several tools support copy, clone, or import operations for test items such as test plans, test suites, and test cases. Test cases describe the steps to take to run a test and validate a feature implementation or bug fix. Test suites group test cases, and optionally other test suites, into a particular order. Test plans define a collection of test suites to run for a particular iteration or release. Each test case is designed",2026-03-18T21:04:00.000Z,how-to,,0.2,False,"Describes copy/clone/import operations for test artifacts; general feature usage without numeric limits, config matrices, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/devops/test/create-a-test-plan?view=azure-devops,Create & manage test plans,Create and manage test plans - Azure Test Plans,,"Learn how to create, rename, and delete test plans in Azure Test Plans.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Create test plans and test suites to track manual testing for sprints or milestones. By using this approach, you can see when the testing for a specific sprint or milestone is complete. 
-For more information about manual testing, seeWhat is Azure Test Plans?",2026-04-17T21:04:00.000Z,how-to,,0.2,False,"How-to guide for creating/managing test plans; step-by-step UI usage without expert-only limits, configs, or decision matrices.",new -https://learn.microsoft.com/en-us/azure/devops/test/create-test-cases?view=azure-devops,Create & manage test cases,Create and manage manual test cases - Azure Test Plans,,"Create and manage manual test cases in Azure Test Plans to validate deliverables, assign testers, and organize your testing process. Start improving quality now.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Create manual test cases to validate that each deliverable meets user requirements. Test cases define the individual steps testers perform, and can includeshared stepsandparameters for data-driven testing. Organize test cases intest plans and test suites, and then assign testers to run them. For key concepts, seeTest objects and terms. Note Test iterations are for data-driven scenarios, not workflow-driven ones. If two test s",2026-04-17T21:04:00.000Z,how-to,,0.2,False,"Covers creating manual test cases and organizing them; lacks specific quotas, config parameters, or troubleshooting content.",new -https://learn.microsoft.com/en-us/azure/devops/test/create-test-suites?view=azure-devops,Create & manage test suites,Create and manage test suites - Azure Test Plans,,"Learn how to create and manage static, requirement-based, and query-based test suites in Azure Test Plans.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Test suites organize test cases within atest plan. Use test suites to group related tests for a sprint, feature, or milestone. 
Azure Test Plans supports three types of test suites: For key concepts, seeTest objects and terms.",2026-04-17T21:04:00.000Z,how-to,,0.2,False,"Explains creating and managing test suites; procedural instructions but no numeric limits, configuration tables, or error diagnostics.",new
+For more information about manual testing, see What is Azure Test Plans?",2026-04-17T21:04:00.000Z,how-to,,0.2,False,"How-to guide for creating/managing test plans; step-by-step UI usage without expert-only limits, configs, or decision matrices.",unchanged
+https://learn.microsoft.com/en-us/azure/devops/test/create-test-cases?view=azure-devops,Create & manage test cases,Create and manage manual test cases - Azure Test Plans,,"Create and manage manual test cases in Azure Test Plans to validate deliverables, assign testers, and organize your testing process. Start improving quality now.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Create manual test cases to validate that each deliverable meets user requirements. Test cases define the individual steps testers perform, and can include shared steps and parameters for data-driven testing. Organize test cases in test plans and test suites, and then assign testers to run them. For key concepts, see Test objects and terms. Note Test iterations are for data-driven scenarios, not workflow-driven ones. If two test s",2026-04-17T21:04:00.000Z,how-to,,0.2,False,"Covers creating manual test cases and organizing them; lacks specific quotas, config parameters, or troubleshooting content.",unchanged
+https://learn.microsoft.com/en-us/azure/devops/test/create-test-suites?view=azure-devops,Create & manage test suites,Create and manage test suites - Azure Test Plans,,"Learn how to create and manage static, requirement-based, and query-based test suites in Azure Test Plans.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Test suites organize test cases within a test plan. 
Use test suites to group related tests for a sprint, feature, or milestone. Azure Test Plans supports three types of test suites: For key concepts, see Test objects and terms.",2026-04-17T21:04:00.000Z,how-to,,0.2,False,"Explains creating and managing test suites; procedural instructions but no numeric limits, configuration tables, or error diagnostics.",unchanged
https://learn.microsoft.com/en-us/azure/devops/test/custom-fields?view=azure-devops,Add custom data fields,Custom fields for test runs and test results - Azure Test Plans,Use custom fields for Azure DevOps test runs,Store custom data against the test runs and test results.,"Azure DevOps Services Using the custom fields allows storing the custom data against the test run and/or test result. There can be up to 100 custom fields defined for a single Azure DevOps project.
-Project administrator canmanage (add/delete) the set of the custom fields.",2025-09-04T16:00:00.000Z,how-to,limits-quotas,0.7,True,Explicitly states a numeric limit: up to 100 custom fields per Azure DevOps project; this is a concrete quota not generally known from training.,new
+Project administrator can manage (add/delete) the set of the custom fields.",2025-09-04T16:00:00.000Z,how-to,limits-quotas,0.7,True,Explicitly states a numeric limit: up to 100 custom fields per Azure DevOps project; this is a concrete quota not generally known from training.,unchanged
https://learn.microsoft.com/en-us/azure/devops/test/explore-workitems-exploratory-testing?view=azure-devops,Explore work items,Explore work items when exploratory testing - Azure Test Plans,,Test tools - Manual and exploratory testing - explore work items from the board or by using the Microsoft Test & Feedback extension,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Use the Test & Feedback extension to explore existing work items and associate them with a new or ongoing exploratory session. 
Once a work item is associated with a session, all new bugs, tasks, and test cases created in that session are automatically linked to the work item. This extension enables end-to-end traceability and simplifies the tracking and management of issues. Explore the following items:",2025-10-27T22:02:00.000Z,how-to,,0.3,False,Exploring work items and linking them; workflow guidance without numeric limits or configuration parameter tables.,unchanged
https://learn.microsoft.com/en-us/azure/devops/test/how-long-to-keep-test-results?view=azure-devops,Set test retention policies,Set test retention policies - Azure Test Plans,,Manage how long Azure DevOps keeps your manual test results by clearing test results that you don't need anymore or when you delete your builds.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Running tests, especially automated ones, generates lots of data. To keep your test system responsive and performing well,
@@ -28,13 +28,13 @@ https://learn.microsoft.com/en-us/azure/devops/test/insights-exploratory-testing
at team or individual level, and for a specific period.",2025-10-27T22:02:00.000Z,how-to,,0.2,False,"Describes viewing exploratory testing sessions and insights. Likely a feature walkthrough without specific quotas, configuration parameters, or error-code-based troubleshooting.",unchanged
https://learn.microsoft.com/en-us/azure/devops/test/manage-test-failure-type?view=azure-devops,Manage test failure type,Manage test failure type - Azure Test Plans,,"Learn about managing failure type in Azure Test Plans. You can add, remove or edit the default test failure types.","Azure DevOps Services Azure Test Plans provides the functionality to customize the failure types of any test case beyond the default values. A failure type is an artifact that helps to mark test case failures into defined categories such as regression issue or known issue. 
While this categorization of failure types is helpful, users might want to add their own custom failure type beyond the default values, which creates a more customized experience for specific user needs in the Azure DevOps project.",2026-02-27T22:02:00.000Z,overview,,0.35,False,"Customization of failure types is described conceptually; summary does not show numeric limits, config tables, or security/role specifics.",unchanged https://learn.microsoft.com/en-us/azure/devops/test/manual-test-permissions?view=azure-devops,Default permissions (Security),"Permissions, licensing, and access for manual testing - Azure Test Plans",Configure permissions and access for Azure manual testing,Default permissions and access levels in Azure DevOps for manual and exploratory testing.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 This article covers the access levels, licensing, and permissions required for manual and exploratory testing in Azure Test Plans.",2026-04-03T21:03:00.000Z,reference,security,0.7,True,"Covers Azure DevOps access levels, licensing, and permissions for manual and exploratory testing. Likely includes specific permission names, role mappings, and access requirements, which are product-specific security/authorization details.",unchanged -https://learn.microsoft.com/en-us/azure/devops/test/navigate-test-plans?view=azure-devops,Tour Test Plans,Navigate Test Plans in Azure DevOps - Azure Test Plans,,"Test Plans navigation: Learn how to find, organize, and manage test plans, suites, and cases in Azure DevOps. 
Start optimizing your test workflow today.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 In this article, you learn how to navigate theTest Planspage, including how to find and favorite test plans, manage test suites, define and organize test cases, execute tests, and track results with charts.",2026-04-07T01:04:00.000Z,overview,,0.2,False,"Navigation and usage overview of Test Plans UI; no limits, configs, error codes, or product-specific numeric thresholds.",new
+https://learn.microsoft.com/en-us/azure/devops/test/navigate-test-plans?view=azure-devops,Tour Test Plans,Navigate Test Plans in Azure DevOps - Azure Test Plans,,"Test Plans navigation: Learn how to find, organize, and manage test plans, suites, and cases in Azure DevOps. Start optimizing your test workflow today.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 In this article, you learn how to navigate the Test Plans page, including how to find and favorite test plans, manage test suites, define and organize test cases, execute tests, and track results with charts.",2026-04-07T01:04:00.000Z,overview,,0.2,False,"Navigation and usage overview of Test Plans UI; no limits, configs, error codes, or product-specific numeric thresholds.",unchanged
https://learn.microsoft.com/en-us/azure/devops/test/overview?view=azure-devops,What is Azure Test Plans?,"What is Azure Test Plans? Manual, exploratory, and automated test tools. - Azure Test Plans",,Learn about the test tools and capabilities that Azure Test Plans provides to drive quality and collaboration throughout the development process.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Azure Test Plans offers powerful tools for driving quality and collaboration throughout the development process. This browser-based test management solution supports planned manual testing, user acceptance testing, exploratory testing, and stakeholder feedback. 
Note This article applies to Azure DevOps Services and Azure DevOps Server 2020 and later versions. Most of the information is valid for earlier on-premises versions, ",2025-07-17T19:00:00.000Z,overview,,0.2,False,"High-level overview of Azure Test Plans capabilities; no detailed limits, configs, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/devops/test/perform-exploratory-tests?view=azure-devops,Install the Test & Feedback extension,Install The Test And Feedback Extension - Azure Test Plans,,"Install and set up the Test & Feedback browser extension to run exploratory tests on web apps, capture screenshots and recordings, and submit feedback to Azure DevOps.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 The Test & Feedback extension helps teams run exploratory tests and collect feedback. Team members use it to submit bugs, capture screenshots and recordings, and contribute to product quality.",2026-02-27T22:02:00.000Z,how-to,,0.3,False,Installation and basic use of the Test & Feedback extension; appears as a feature/tutorial page without detailed configuration matrices or limits.,unchanged https://learn.microsoft.com/en-us/azure/devops/test/progress-report?view=azure-devops,Progress report,"Progress report, test plans - Azure Test Plans",,Learn how to use the Test Plans Progress report,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 To track the progress of more than one test plan or test suite, use the Progress Report. 
The Progress Report helps you track the status of passed, failed, or blocked tests, estimations, rate of execution, progress, and more for your team.",2025-10-27T22:02:00.000Z,how-to,,0.25,False,"Progress report usage; focuses on interpreting charts and status, not on configuration or limits.",unchanged
-https://learn.microsoft.com/en-us/azure/devops/test/reference-qa?view=azure-devops,Manual testing FAQs,Test plans frequently asked questions - Azure Test Plans,,"Get answers to common questions about test plans, test suites, test cases, permissions, automated testing, test configurations, the Test & Feedback extension, and test data retention in Azure Test Pla","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Get answers to common questions about creating and managing test plans, test cases, test suites, permissions and access levels, running manual and automated tests, test configurations, tracking charts, test data retention, and the Test & Feedback extension in Azure Test Plans. 
For step-by-step guidance, see the following articles:",2026-04-08T21:05:00Z,FAQ,,0.4,False,"FAQ across many topics; summary does not expose specific limits, error codes, or configuration matrices, so cannot reliably classify as expert knowledge from the given text.",unchanged
+https://learn.microsoft.com/en-us/azure/devops/test/reference-qa?view=azure-devops,Manual testing FAQs,Test plans frequently asked questions - Azure Test Plans,Understand Azure Test Plans limits and retention,"Get answers to common questions about test plans, test suites, test cases, permissions, automated testing, test configurations, the Test & Feedback extension, and test data retention in Azure Test Pla","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Get answers to common questions about creating and managing test plans, test cases, test suites, permissions and access levels, running manual and automated tests, test configurations, tracking charts, test data retention, and the Test & Feedback extension in Azure Test Plans. For step-by-step guidance, see the following articles:",2026-04-22T21:02:00Z,faq,limits-quotas,0.7,True,"FAQ pages for Azure Test Plans typically include concrete test data retention periods and possibly other numeric constraints (for example, how long test results are kept, limits on configurations or suites). These are product-specific limits and quotas that an LLM is unlikely to know precisely from training, matching the limits-quotas category.",updated
https://learn.microsoft.com/en-us/azure/devops/test/repeat-test-with-different-data?view=azure-devops,Repeat a test with different data,Repeat a test with different data - Azure Test Plans,,Learn about manual and exploratory testing. Repeat a test with different data in Azure Test Plans.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Add parameters to your manual test to repeat the test with different test data. 
For example, you can test adding different quantities to a shopping cart from quantities of 1, 5, 10, or 200. Insert parameters within your test steps for a manual test case. Then, provide a table of parameter values. You can add shared parameters to test cases or convert parameters you recently inserted into shared parameters. Shared steps and shar",2026-02-27T22:02:00.000Z,how-to,,0.25,False,"Shows how to parameterize manual tests and reuse data; procedural but lacks numeric limits, config tables, or troubleshooting mappings.",unchanged -https://learn.microsoft.com/en-us/azure/devops/test/request-stakeholder-feedback?view=azure-devops,Request & provide feedback,Request and provide stakeholder feedback - Azure Test Plans,,"Learn how to request, provide, and track stakeholder feedback in Azure DevOps using the Test & Feedback browser extension.",Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Use the Test & Feedback extension in Azure DevOps to collect stakeholder feedback. The available workflow depends on your platform: Important Select the version of this article that corresponds to your platform and version. The version selector is above the table of contents.Look up your Azure DevOps platform and version.,2026-04-03T21:03:00.000Z,how-to,,0.2,False,"Workflow for requesting and providing stakeholder feedback; platform/version selection but no detailed limits, configs, or troubleshooting mappings.",new +https://learn.microsoft.com/en-us/azure/devops/test/request-stakeholder-feedback?view=azure-devops,Request & provide feedback,Request and provide stakeholder feedback - Azure Test Plans,,"Learn how to request, provide, and track stakeholder feedback in Azure DevOps using the Test & Feedback browser extension.",Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Use the Test & Feedback extension in Azure DevOps to collect stakeholder feedback. 
The available workflow depends on your platform: Important Select the version of this article that corresponds to your platform and version. The version selector is above the table of contents. Look up your Azure DevOps platform and version.,2026-04-03T21:03:00.000Z,how-to,,0.2,False,"Workflow for requesting and providing stakeholder feedback; platform/version selection but no detailed limits, configs, or troubleshooting mappings.",unchanged
https://learn.microsoft.com/en-us/azure/devops/test/run-automated-tests-from-test-hub?view=azure-devops,Run automated tests from test plans,Run automated tests from test plans - Azure Test Plans,,Run automated tests on-demand from test plans in Azure Test Plans with a build or release pipeline.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Run automated tests on demand directly from Azure Test Plans without setting up scheduled builds or releases. Select specific tests to rerun after infrastructure fixes or new builds, and give testers a simple way to trigger automated tests without pipeline expertise.",2026-03-18T21:04:00.000Z,how-to,,0.25,False,"Explains how to run automated tests from test plans; focused on triggering tests via UI/pipelines, not on quotas, configuration matrices, or error diagnostics.",unchanged
https://learn.microsoft.com/en-us/azure/devops/test/run-manual-tests?view=azure-devops,Run manual tests,Run Manual Tests with Azure Test Plans - Azure Test Plans,,"Run manual tests in Azure Test Plans to validate web and desktop apps, capture diagnostics, and manage bugs. Start testing your software today.","Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Use Microsoft Test Runner to run manual tests and record results for each test step. You can test both web and desktop applications, run all active tests in a suite or select specific test cases, and run tests against a specific build. 
During a test run, you can capture screenshots, record actions, and create or update bugs directly from Test Runner with test steps, screenshots, and comments automatically included.",2026-04-07T01:04:00.000Z,how-to,,0.3,False,"Describes running manual tests and capturing diagnostics; no numeric limits, config parameter tables, or error-code-based troubleshooting.",unchanged
https://learn.microsoft.com/en-us/azure/devops/test/share-steps-between-test-cases?view=azure-devops,Share steps between test cases,Share steps between test cases - Azure Test Plans,,Learn how to share steps between test cases when you want to test web applications in Azure Test.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Many manual tests require performing an identical sequence of steps and test data. For example, logging in to a web app or saving form data are common steps performed in several test sequences. With the use of Shared Steps and Shared Parameters work items, you can minimize the creation of test steps and data that you need to enter and manage. Shared Steps define a sequence of steps that can be referenced by many different test case",2026-02-27T22:02:00.000Z,how-to,,0.3,False,"How-to for using shared steps/parameters in test cases; procedural guidance but no config tables, limits, or error-resolution content.",unchanged
@@ -49,7 +49,7 @@ stages that have those different configurations. Use your test plans to decide what to test
on which configurations. 
You have to make sure that when you run your tests that you have set up your stages fo",2026-03-03T18:05:00.000Z,how-to,,0.25,False,"Explains testing across configurations conceptually; no specific thresholds, configuration parameter tables, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/devops/test/test-objects-overview?view=azure-devops,Key concepts,Test Objects and Terms Overview in Azure Test Plans - Azure Test Plans,,"Azure Test Plans test objects: Discover essential terms, work item types, and controls to streamline your manual and automated testing workflows.",Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Read this article to gain an understanding of the objects and terms used in manual and exploratory testing.,2026-04-07T01:04:00.000Z,overview,,0.1,False,"Conceptual overview of test objects and terms; no concrete limits, configs, or troubleshooting mappings.",new +https://learn.microsoft.com/en-us/azure/devops/test/test-objects-overview?view=azure-devops,Key concepts,Test Objects and Terms Overview in Azure Test Plans - Azure Test Plans,,"Azure Test Plans test objects: Discover essential terms, work item types, and controls to streamline your manual and automated testing workflows.",Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Read this article to gain an understanding of the objects and terms used in manual and exploratory testing.,2026-04-07T01:04:00.000Z,overview,,0.1,False,"Conceptual overview of test objects and terms; no concrete limits, configs, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/devops/test/test-runs?view=azure-devops,Manage test runs,Manage test runs in Azure DevOps Test Plans - Azure Test Plans,,"Learn how to create, view, and manage test runs using the Test Run Hub in Azure DevOps Test Plans.","Azure DevOps Services Use the Test Run Hub to track test execution, analyze results, and maintain quality across development cycles. 
Note The Test Run Hub is rolling out to General Availability (GA). The GA experience might not be available in your organization yet. Tip You can use AI to help with this task later in this article, or see Enable AI assistance with Azure DevOps MCP Server to get started. A test run captures the execution of one or more test cases – recording outcome, duration, and envir",2026-03-18T21:04:00.000Z,how-to,,0.25,False,"Covers using the Test Run Hub to track execution and results; appears to be feature usage guidance without specific limits, config tables, or error-code-based troubleshooting.",unchanged
https://learn.microsoft.com/en-us/azure/devops/test/track-test-status?view=azure-devops,Track test status,View progress report - Azure Test Plans,,Learn how to view the status of your planned testing using an out-of-the-box Progress Report and lightweight charts.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 View the status of planned tests or monitor testing progress by defining test case or test result charts. For more information about test planning, see Create test plans and test suites. For information about test result terms, see Test objects and terms. To track the progress of more than one test plan or test suite, open the Progress Report.",2025-10-27T22:02:00.000Z,how-to,,0.25,False,Viewing test status and charts; reporting how-to without numeric system limits or config matrices.,unchanged
https://learn.microsoft.com/en-us/azure/devops/test/user-acceptance-testing?view=azure-devops,Perform user acceptance testing,Assign tests for user acceptance testing - Azure Test Plans,,Create and run user acceptance tests in Azure Test Plans. 
Test to verify that each of the deliverables meets your users' needs.,"Azure DevOps Services | Azure DevOps Server | Azure DevOps Server 2022 Today's faster development pace requires tools that let test teams more easily verify value based diff --git a/products/azure-test-plans/report.md b/products/azure-test-plans/report.md index 95326f8a..685cf3a0 100644 --- a/products/azure-test-plans/report.md +++ b/products/azure-test-plans/report.md @@ -1,24 +1,25 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: - limits-quotas: Configuring and using custom fields in Azure DevOps test runs, including - how to define, manage, and apply them for better test reporting and tracking. + limits-quotas: Test run customization with custom fields, plus limits, quotas, and + data retention rules for Azure Test Plans (e.g., max test cases, suites, runs, + and history duration). security: Managing permissions, access levels, and security roles for users and groups in Azure Test Plans manual testing features. integrations: 'Using tcm.exe CLI to manage Azure Test Plans: create and run test suites, import/export test cases, manage test configurations, and automate test management tasks' skill_description: Expert knowledge for Azure Test Plans development including limits - & quotas, security, and integrations & coding patterns. Use when configuring custom - test run fields, managing test access, or automating suites via tcm.exe and test - configs, and other Azure Test Plans related development tasks. Not for Azure DevOps - (use azure-devops), Azure Boards (use azure-boards), Azure Pipelines (use azure-pipelines), + & quotas, security, and integrations & coding patterns. Use when defining test run + fields, handling test case/suite limits, setting test access, or automating via + tcm.exe, and other Azure Test Plans related development tasks. 
Not for Azure DevOps
+ (use azure-devops), Azure Pipelines (use azure-pipelines), Azure Boards (use azure-boards), Azure App Testing (use azure-app-testing).
-use_when: Use when configuring custom test run fields, managing test access, or automating
- suites via tcm.exe and test configs, and other Azure Test Plans related development
+use_when: Use when defining test run fields, handling test case/suite limits, setting
+ test access, or automating via tcm.exe, and other Azure Test Plans related development
tasks.
-confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Boards (use azure-boards),
- Azure Pipelines (use azure-pipelines), Azure App Testing (use azure-app-testing).
+confusable_not_for: Not for Azure DevOps (use azure-devops), Azure Pipelines (use
+ azure-pipelines), Azure Boards (use azure-boards), Azure App Testing (use azure-app-testing).
---

# Azure Test Plans Crawl Report

@@ -27,14 +28,14 @@ confusable_not_for: Not for Azure DevOps (use a
- **Total Pages**: 33
- **Fetched**: 33
- **Fetch Failed**: 0
-- **Classified**: 3
-- **Unclassified**: 30
+- **Classified**: 4
+- **Unclassified**: 29

### Incremental Update
-- **New Pages**: 10
-- **Updated Pages**: 1
-- **Unchanged**: 22
-- **Deleted Pages**: 8
+- **New Pages**: 0
+- **Updated Pages**: 2
+- **Unchanged**: 31
+- **Deleted Pages**: 0
- **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-test-plans/azure-test-plans.csv`

## Classification Statistics
@@ -42,40 +43,18 @@ confusable_not_for: Not for Azure DevOps (use a

| Type | Count | Percentage |
|------|-------|------------|
| integrations | 1 | 3.0% |
-| limits-quotas | 1 | 3.0% |
+| limits-quotas | 2 | 6.1% |
| security | 1 | 3.0% |
-| *(Unclassified)* | 30 | 90.9% |
+| *(Unclassified)* | 29 | 87.9% |

## Changes

-### New Pages
-
-- [Tour Test Plans](https://learn.microsoft.com/en-us/azure/devops/test/navigate-test-plans?view=azure-devops)
-- [Key concepts](https://learn.microsoft.com/en-us/azure/devops/test/test-objects-overview?view=azure-devops)
-- [Create & manage test plans](https://learn.microsoft.com/en-us/azure/devops/test/create-a-test-plan?view=azure-devops)
-- [Create & manage test suites](https://learn.microsoft.com/en-us/azure/devops/test/create-test-suites?view=azure-devops)
-- [Create & manage test cases](https://learn.microsoft.com/en-us/azure/devops/test/create-test-cases?view=azure-devops)
-- [Copy or clone test items](https://learn.microsoft.com/en-us/azure/devops/test/copy-clone-test-items?view=azure-devops)
-- [Import and export test cases](https://learn.microsoft.com/en-us/azure/devops/test/bulk-import-export-test-cases?view=azure-devops)
-- [Add custom data fields](https://learn.microsoft.com/en-us/azure/devops/test/custom-fields?view=azure-devops)
-- [Actual Result](https://learn.microsoft.com/en-us/azure/devops/test/actual-result?view=azure-devops)
-- [Request & provide feedback](https://learn.microsoft.com/en-us/azure/devops/test/request-stakeholder-feedback?view=azure-devops)
-
### Updated Pages

-- [Associate automated tests with test cases](https://learn.microsoft.com/en-us/azure/devops/test/associate-automated-test-with-test-case?view=azure-devops)
- - Updated: 2026-02-27T22:02:00.000Z → 2026-04-17T21:04:00.000Z
-
-### Deleted Pages
-
-- ~~Bulk import and export test cases (CSV/XLSX)~~ (https://learn.microsoft.com/en-us/azure/devops/test/bulk-import-export-test-cases?view=azure-devops)
-- ~~Copy/clone test plans, suites, cases~~ (https://learn.microsoft.com/en-us/azure/devops/test/copy-clone-test-items?view=azure-devops)
-- ~~Create test plans and test suites~~ (https://learn.microsoft.com/en-us/azure/devops/test/create-a-test-plan?view=azure-devops)
-- ~~Create test cases~~ (https://learn.microsoft.com/en-us/azure/devops/test/create-test-cases?view=azure-devops)
-- ~~Store custom data in test plan or test result~~ (https://learn.microsoft.com/en-us/azure/devops/test/custom-fields?view=azure-devops)
-- ~~Navigate Test Plans~~ (https://learn.microsoft.com/en-us/azure/devops/test/navigate-test-plans?view=azure-devops)
-- ~~Request and provide stakeholder feedback~~ (https://learn.microsoft.com/en-us/azure/devops/test/request-stakeholder-feedback?view=azure-devops)
-- ~~Test objects and terms~~ (https://learn.microsoft.com/en-us/azure/devops/test/test-objects-overview?view=azure-devops)
+- [Import and export test cases](https://learn.microsoft.com/en-us/azure/devops/test/bulk-import-export-test-cases?view=azure-devops)
+ - Updated: 2026-04-08T21:05:00.000Z → 2026-04-23T08:00:00.000Z
+- [Manual testing FAQs](https://learn.microsoft.com/en-us/azure/devops/test/reference-qa?view=azure-devops)
+ - Updated: 2026-04-08T21:05:00Z → 2026-04-22T21:02:00Z

## Classified Pages
@@ -83,13 +62,13 @@ confusable_not_for: Not for Azure DevOps (use a
|-----------|------|------------|--------|
| [Add custom data fields](https://learn.microsoft.com/en-us/azure/devops/test/custom-fields?view=azure-devops) | limits-quotas | 0.70 | Explicitly states a numeric limit: up to 100 custom fields per Azure DevOps project; this is a concrete quota not generally known from training. |
| [Default permissions (Security)](https://learn.microsoft.com/en-us/azure/devops/test/manual-test-permissions?view=azure-devops) | security | 0.70 | Covers Azure DevOps access levels, licensing, and permissions for manual and exploratory testing. Likely includes specific permission names, role mappings, and access requirements, which are product-specific security/authorization details. |
+| [Manual testing FAQs](https://learn.microsoft.com/en-us/azure/devops/test/reference-qa?view=azure-devops) | limits-quotas | 0.70 | FAQ pages for Azure Test Plans typically include concrete test data retention periods and possibly other numeric constraints (for example, how long test results are kept, limits on configurations or suites). These are product-specific limits and quotas that an LLM is unlikely to know precisely from training, matching the limits-quotas category. |
| [Test case management commands](https://learn.microsoft.com/en-us/azure/devops/test/test-case-managment-reference?view=azure-devops) | integrations | 0.70 | Reference for tcm.exe command-line tool, which is a product-specific integration/automation interface. Such pages typically list commands, arguments, and options (API-like parameters) unique to Azure Test Plans, matching the integrations & coding patterns criteria. |

## Unclassified Pages

| TOC Title | Confidence | Reason |
|-----------|------------|--------|
-| [Manual testing FAQs](https://learn.microsoft.com/en-us/azure/devops/test/reference-qa?view=azure-devops) | 0.40 | FAQ across many topics; summary does not expose specific limits, error codes, or configuration matrices, so cannot reliably classify as expert knowledge from the given text. |
| [Set test retention policies](https://learn.microsoft.com/en-us/azure/devops/test/how-long-to-keep-test-results?view=azure-devops) | 0.40 | Discusses having a policy for test result retention conceptually; summary does not show concrete retention limits, config parameter tables, or tier-specific values. Appears as general guidance rather than detailed limits or configuration reference. |
| [Manage test failure type](https://learn.microsoft.com/en-us/azure/devops/test/manage-test-failure-type?view=azure-devops) | 0.35 | Customization of failure types is described conceptually; summary does not show numeric limits, config tables, or security/role specifics. |
| [Actual Result](https://learn.microsoft.com/en-us/azure/devops/test/actual-result?view=azure-devops) | 0.30 | Describes preview feature to record actual results; configuration is at test plan level but no clear numeric limits, config tables, or error codes in summary. |
@@ -102,7 +81,6 @@ confusable_not_for: Not for Azure DevOps (use a
| [Test in Standalone mode](https://learn.microsoft.com/en-us/azure/devops/test/standalone-mode-exploratory-testing?view=azure-devops) | 0.30 | Standalone mode description; mostly explains mode behavior and requirements, not detailed configuration or quotas. |
| [Associate automated tests with test cases](https://learn.microsoft.com/en-us/azure/devops/test/associate-automated-test-with-test-case?view=azure-devops) | 0.25 | How-to associate automated tests with test cases for traceability; likely code/linking steps but summary shows no specific parameter tables, limits, or error codes. |
| [Collect diagnostic data](https://learn.microsoft.com/en-us/azure/devops/test/collect-diagnostic-data?view=azure-devops) | 0.25 | Describes collecting diagnostic data during tests; summary does not show specific log locations, error codes, or config parameters. |
-| [Import and export test cases](https://learn.microsoft.com/en-us/azure/devops/test/bulk-import-export-test-cases?view=azure-devops) | 0.25 | Bulk import/export via CSV/Excel with an import wizard; summary does not indicate detailed parameter tables or limits beyond generic capability. |
| [Manage test runs](https://learn.microsoft.com/en-us/azure/devops/test/test-runs?view=azure-devops) | 0.25 | Covers using the Test Run Hub to track execution and results; appears to be feature usage guidance without specific limits, config tables, or error-code-based troubleshooting. |
| [Perform user acceptance testing](https://learn.microsoft.com/en-us/azure/devops/test/user-acceptance-testing?view=azure-devops) | 0.25 | User acceptance testing workflow; mostly process guidance without numeric thresholds or config tables. |
| [Progress report](https://learn.microsoft.com/en-us/azure/devops/test/progress-report?view=azure-devops) | 0.25 | Progress report usage; focuses on interpreting charts and status, not on configuration or limits. |
@@ -115,6 +93,7 @@ confusable_not_for: Not for Azure DevOps (use a
| [Create & manage test plans](https://learn.microsoft.com/en-us/azure/devops/test/create-a-test-plan?view=azure-devops) | 0.20 | How-to guide for creating/managing test plans; step-by-step UI usage without expert-only limits, configs, or decision matrices. |
| [Create & manage test suites](https://learn.microsoft.com/en-us/azure/devops/test/create-test-suites?view=azure-devops) | 0.20 | Explains creating and managing test suites; procedural instructions but no numeric limits, configuration tables, or error diagnostics. |
| [Get insights from sessions](https://learn.microsoft.com/en-us/azure/devops/test/insights-exploratory-testing?view=azure-devops) | 0.20 | Describes viewing exploratory testing sessions and insights. Likely a feature walkthrough without specific quotas, configuration parameters, or error-code-based troubleshooting. |
+| [Import and export test cases](https://learn.microsoft.com/en-us/azure/devops/test/bulk-import-export-test-cases?view=azure-devops) | 0.20 | Primarily a how-to/tutorial for bulk importing and exporting test cases via CSV/Excel. The summary does not indicate presence of numeric limits, configuration parameter tables, error-code-based troubleshooting, or other product-specific expert details beyond general workflow steps. |
| [Request & provide feedback](https://learn.microsoft.com/en-us/azure/devops/test/request-stakeholder-feedback?view=azure-devops) | 0.20 | Workflow for requesting and providing stakeholder feedback; platform/version selection but no detailed limits, configs, or troubleshooting mappings. |
| [Tour Test Plans](https://learn.microsoft.com/en-us/azure/devops/test/navigate-test-plans?view=azure-devops) | 0.20 | Navigation and usage overview of Test Plans UI; no limits, configs, error codes, or product-specific numeric thresholds. |
| [What is Azure Test Plans?](https://learn.microsoft.com/en-us/azure/devops/test/overview?view=azure-devops) | 0.20 | High-level overview of Azure Test Plans capabilities; no detailed limits, configs, or error mappings. |

diff --git a/products/azure-virtual-desktop/azure-virtual-desktop.csv b/products/azure-virtual-desktop/azure-virtual-desktop.csv
index e19c8a39..f907cd83 100644
--- a/products/azure-virtual-desktop/azure-virtual-desktop.csv
+++ b/products/azure-virtual-desktop/azure-virtual-desktop.csv
@@ -21,7 +21,7 @@ https://learn.microsoft.com/en-us/azure/virtual-desktop/azure-ad-joined-session-
https://learn.microsoft.com/en-us/azure/virtual-desktop/azure-advisor-recommendations,Resolve Azure Advisor recommendations,Azure Advisor Azure Virtual Desktop Walkthrough - Azure - Azure Virtual Desktop,Resolve Azure Advisor recommendations for AVD,How to resolve Azure Advisor recommendations for Azure Virtual Desktop.,This article describes how you can resolve recommendations that appear in Azure Advisor for Azure Virtual Desktop.,2025-06-20T03:01:00.000Z,conceptual,best-practices,0.65,True,"Walkthrough of concrete Azure Advisor recommendations and how to remediate them for AVD, which are product-specific DO/DO-NOT style guidance with actionable steps.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-desktop/azure-extended-zones,Azure Extended Zones,Azure Extended Zones for Azure Virtual Desktop - Azure Virtual Desktop,Use Azure
Extended Zones with Azure Virtual Desktop,Learn about using Azure Virtual Desktop on Azure Extended Zones.,"Azure Extended Zones are small-footprint extensions of Azure placed in metros, industry centers, or a specific jurisdiction to serve low latency and/or data residency workloads. Azure Extended Zones are supported for Azure Virtual Desktop and can run latency-sensitive and throughput-intensive applications close to end users and within approved data residency boundaries. Azure Extended Zones are part of the Microsoft global network that provides secure, reliable, high-bandwidth connectivity between",2025-06-20T03:01:00.000Z,conceptual,decision-making,0.65,True,"Explains using Extended Zones for latency-sensitive and data-residency workloads, helping choose deployment locations based on technical criteria.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-desktop/azure-local-overview,Azure Local,Azure Virtual Desktop on Azure Local - Azure Virtual Desktop,Plan Azure Virtual Desktop on Azure Local,"Learn about using Azure Virtual Desktop on Azure Local, enabling you to deploy session hosts where you need them.","Important Azure Virtual Desktop on Azure Local for Azure Government is currently in preview with HCI version 23H2 (and newer). Portal provisioning isn't available. See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability. Using Azure Virtual Desktop on Azure Local, you can deploy session hosts for Azure Virtual Desktop where you need them. If you already have an exi",2025-06-20T03:01:00.000Z,conceptual,decision-making,0.65,True,"Covers when and how to use Azure Local for session hosts, including preview constraints and suitability for specific scenarios, aligning with decision-making guidance.",unchanged
-https://learn.microsoft.com/en-us/azure/virtual-desktop/azurecommunicationips,Network connectivity for 169.254.169.254 and 168.63.129.16,Azure fabric communication IPs - Azure Virtual Desktop,Configure Azure fabric communication IPs for AVD and Windows 365,Understanding how to handle the Azure communication IPs in AVD and Windows 365,"Azure Virtual Desktop (AVD) session hosts and Windows 365 Cloud PCs require connectivity to two Azure platform IP addresses. These addresses provide access to core infrastructure services that support provisioning, health monitoring, identity, and platform communication. Traffic to these addresses operates at the Azure platform fabric level. It behaves differently from traffic to FQDN-based endpoints and must be handled accordingly. This article explains what these IP addresses are, why they're ",2026-04-13T15:06:00.000Z,conceptual,configuration,0.8,True,"The page describes two specific Azure platform IP addresses required for Azure Virtual Desktop and Windows 365 Cloud PCs, and how traffic to them must be handled. These are concrete, product-specific network configuration values (IP addresses and handling rules) that qualify as expert configuration knowledge rather than general concepts.",new
+https://learn.microsoft.com/en-us/azure/virtual-desktop/azurecommunicationips,Network connectivity for 169.254.169.254 and 168.63.129.16,Azure fabric communication IPs - Azure Virtual Desktop,Configure Azure fabric communication IPs for AVD and Windows 365,Understanding how to handle the Azure communication IPs in AVD and Windows 365,"Azure Virtual Desktop (AVD) session hosts and Windows 365 Cloud PCs require connectivity to two Azure platform IP addresses.
These addresses provide access to core infrastructure services that support provisioning, health monitoring, identity, and platform communication. Traffic to these addresses operates at the Azure platform fabric level. It behaves differently from traffic to FQDN-based endpoints and must be handled accordingly. This article explains what these IP addresses are, why they're ",2026-04-13T15:06:00.000Z,conceptual,configuration,0.8,True,"The page describes two specific Azure platform IP addresses required for Azure Virtual Desktop and Windows 365 Cloud PCs, and how traffic to them must be handled. These are concrete, product-specific network configuration values (IP addresses and handling rules) that qualify as expert configuration knowledge rather than general concepts.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-desktop/check-access-validate-required-fqdn-endpoint,Check access to required FQDNs and endpoints,Check access to required FQDNs and endpoints for Azure Virtual Desktop - Azure Virtual Desktop,Validate Azure Virtual Desktop FQDN and endpoint connectivity,The Azure Virtual Desktop Agent URL Tool enables you to check that your session host virtual machines can access the required FQDNs and endpoints to ensure Azure Virtual Desktop works as intended.,"In order to deploy Azure Virtual Desktop, you must allow specific FQDNs and endpoints. You can find the list of FQDNs and endpoints in Required FQDNs and endpoints. Available as part of the Azure Virtual Desktop Agent (RDAgent) on each session host, the Azure Virtual Desktop Agent URL Tool enables you to quickly and easily validate whether your session hosts can access each FQDN and endpoint.
The tool lists any required FQDNs and endpoints it can't access so you can unblock them and retest, if need",2025-06-20T03:01:00.000Z,how-to,troubleshooting,0.75,True,"Uses the Agent URL Tool to test access to required endpoints and identify which are blocked, mapping symptom (connectivity issues) to diagnosis and fix.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-desktop/cli-powershell,Use Azure CLI and Azure PowerShell,Use Azure CLI and Azure PowerShell with Azure Virtual Desktop - Azure Virtual Desktop,Use Azure CLI and PowerShell modules to manage Azure Virtual Desktop,Learn about Azure CLI and Azure PowerShell with Azure Virtual Desktop and some useful example commands you can run.,"There's an Azure CLI extension and an Azure PowerShell module for Azure Virtual Desktop that you can use to create, update, delete, and interact with Azure Virtual Desktop service objects as alternatives to using the Azure portal. They're part of Azure CLI and Azure PowerShell, which cover a wide range of Azure services. This article explains how you can use the Azure CLI extension and an Azure PowerShell module, and provides some useful example commands.",2025-06-20T03:01:00.000Z,how-to,integrations,0.7,True,Provides example CLI and PowerShell commands for AVD; includes module/command names and parameters that are product-specific integration patterns.,unchanged
https://learn.microsoft.com/en-us/azure/virtual-desktop/clipboard-transfer-direction-data-types,Clipboard transfer direction and data types,Configure the clipboard transfer direction in Azure Virtual Desktop - Azure Virtual Desktop,Control clipboard direction and data types in AVD,"Learn how to configure the clipboard transfer direction and data types that can be copied in Azure Virtual Desktop from session host to client, or client to session host.","Clipboard redirection in Azure Virtual Desktop allows users to copy and paste content, such as text, images, and files between the user's device and the remote session in either direction. You might want to limit the direction of the clipboard for users, to help prevent data exfiltration or malicious files being copied to a session host. You can configure whether users can use the clipboard from session host to client, or client to session host, and the types of data that can be copied, from the",2025-06-20T03:01:00.000Z,how-to,security,0.7,True,Focuses on restricting clipboard direction and data types to prevent data exfiltration.
This is a security-focused configuration with product-specific options for allowed data types and directions.,unchanged
@@ -29,7 +29,7 @@ https://learn.microsoft.com/en-us/azure/virtual-desktop/configure-adfs-sso,Confi
https://learn.microsoft.com/en-us/azure/virtual-desktop/configure-automatic-updates,Deploy updates with Configuration Manager,Update session hosts using Microsoft Configuration Manager to automatically deploy software updates to Azure Virtual Desktop session hosts - Azure - Azure Virtual Desktop,Configure Configuration Manager updates for AVD multi-session hosts,How to configure Microsoft Configuration Manager to deploy software updates to Windows 10 Enterprise multi-session on Azure Virtual Desktop.,"Azure Virtual Desktop session hosts running Windows 10 Enterprise multi-session and Windows 11 Enterprise multi-session can be grouped together in Microsoft Configuration Manager to automatically apply updates. A collection is created based on a query which you can then use as the target collection for a servicing plan. You can update Windows 10 Enterprise multi-session and Windows 11 Enterprise multi-session with the corresponding Windows client updates. For example, you can update Windows 10 E",2025-06-20T03:01:00.000Z,how-to,configuration,0.7,True,Describes creating collections and servicing plans for multi-session Windows; involves specific SCCM query/config parameters unique to this scenario.,unchanged
https://learn.microsoft.com/en-us/azure/virtual-desktop/configure-host-pool-load-balancing,Configure load balancing method,Configure host pool load balancing in Azure Virtual Desktop - Azure Virtual Desktop,Configure Azure Virtual Desktop host pool load balancing,How to configure the load balancing method for pooled host pools in Azure Virtual Desktop.,"Azure Virtual Desktop supports two load balancing algorithms for pooled host pools. Each algorithm determines which session host is used when a user starts a remote session. Load balancing doesn't apply to personal host pools because users always have a 1:1 mapping to a session host within the host pool. The following load balancing algorithms are available for pooled host pools: Breadth-first, which aims to evenly distribute new user sessions across the session hosts in a host pool. You don't h",2025-06-20T03:01:00.000Z,how-to,configuration,0.7,True,"How-to configuration article for load balancing methods; likely includes specific setting names/values for breadth-first vs depth-first, which are product-specific configuration parameters.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-desktop/configure-host-pool-personal-desktop-assignment-type,Configure personal desktop assignment,Configure personal desktop assignment in Azure Virtual Desktop - Azure - Azure Virtual Desktop,,"How to configure the assignment type of a personal host pool, unassign or reassign desktops, assign multiple desktops to a user, or set a friendly name for a desktop in Azure Virtual Desktop.","A personal host pool is a type of host pool that has personal desktops. Personal desktops have one-to-one mapping, which means a single user can only be assigned to a single personal desktop. Every time the user signs in, their user session is directed to their assigned personal desktop session host. Personal desktops are ideal for users with resource-intensive workloads because the user experience and session performance improves if there's only one session on the session host.
Another benefit ",2025-06-20T03:01:00.000Z,how-to,,0.4,False,"Primarily conceptual and procedural assignment guidance for personal desktops; summary doesn’t indicate detailed config tables, numeric thresholds, or product-specific edge cases.",unchanged
-https://learn.microsoft.com/en-us/azure/virtual-desktop/configure-managed-identity,Configure a managed identity,Configure managed identity in Azure Virtual Desktop - Azure Virtual Desktop,Configure managed identities for Azure Virtual Desktop host pools,How to configure a managed identity for host pools in Azure Virtual Desktop.,"Important Managed identity support for Azure Virtual Desktop host pools is currently in PREVIEW. See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability. Azure Virtual Desktop supports assigning permissions to Managed identities for Azure resources for features that need to perform Azure Resource Manager (ARM) operations on virtual machines, key vault, and virtual ne",2025-08-14T20:36:00.000Z,how-to,security,0.8,True,Managed identity setup for host pools requires specific role assignments and ARM permissions; this is product-specific security configuration with concrete RBAC details.,unchanged
+https://learn.microsoft.com/en-us/azure/virtual-desktop/configure-managed-identity,Configure a managed identity,Configure managed identity in Azure Virtual Desktop - Azure Virtual Desktop,Configure managed identities for Azure Virtual Desktop host pools,How to configure a managed identity for host pools in Azure Virtual Desktop.,"Important Managed identity support for Azure Virtual Desktop host pools is currently in PREVIEW. See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability. Azure Virtual Desktop supports assigning permissions to Managed identities for Azure resources for features that need to perform Azure Resource Manager (ARM) operations on virtual machines, key vault, and virtual ne",2026-04-22T20:42:00.000Z,how-to,security,0.74,True,"The page describes how to configure managed identities specifically for Azure Virtual Desktop host pools, including assigning permissions for ARM operations on VMs, Key Vault, and networking resources. This involves product-specific security configuration steps and role/permission assignments tied to Azure RBAC and managed identities, which qualify as expert security configuration knowledge rather than generic concepts.",updated
https://learn.microsoft.com/en-us/azure/virtual-desktop/configure-rdp-shortpath,Configure RDP Shortpath,Configure RDP Shortpath for Azure Virtual Desktop - Azure Virtual Desktop,Configure RDP Shortpath transport for Azure Virtual Desktop,"Learn how to configure RDP Shortpath for Azure Virtual Desktop, which establishes a UDP-based transport for a remote session.",Important Users can connect to a remote session from Azure Virtual Desktop using the Remote Desktop Protocol (RDP) with a UDP or TCP-based transport. RDP Shortpath establishes a UDP-based transport between a local device Windows App or the Remote Desktop app on supported platforms and session host. UDP-based transport offers better connection reliability and more consistent latency.
TCP-based reverse connect transport provides the best compatibility with various networking configurations and has,2025-12-17T18:43:00.000Z,how-to,configuration,0.8,True,"How-to configuration article for enabling Shortpath with specific settings and network parameters, matching configuration criteria.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-desktop/configure-session-lock-behavior,Configure session lock behavior,Configure the session lock behavior for Azure Virtual Desktop - Azure Virtual Desktop,Configure session lock behavior in Azure Virtual Desktop,Learn how to configure session lock behavior for Azure Virtual Desktop.,"You can choose whether the session is disconnected or the remote lock screen is shown when a remote session is locked, either by the user or by policy. When the session lock behavior is set to disconnect, a dialog is shown to let users know they were disconnected. Users can choose the Reconnect option from the dialog when they're ready to connect again. When used with single sign-on using Microsoft Entra ID, disconnecting the session provides the following benefits: A consistent sign-in experience",2025-06-20T03:01:00.000Z,how-to,configuration,0.7,True,"Describes specific setting options (disconnect vs remote lock screen) and their effects, which are product-specific configuration behaviors.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-desktop/configure-single-sign-on,Configure single sign-on using Microsoft Entra ID,Configure single sign-on for Azure Virtual Desktop using Microsoft Entra ID - Azure Virtual Desktop,Configure Entra ID single sign-on for Azure Virtual Desktop,Learn how to configure single sign-on for an Azure Virtual Desktop environment using Microsoft Entra ID.,"Single sign-on (SSO) for Azure Virtual Desktop using Microsoft Entra ID provides a seamless sign-in experience for users connecting to session hosts.
When you enable single sign-on, users authenticate to Windows using a Microsoft Entra ID token. This token enables the use of passwordless authentication and third-party identity providers that federate with Microsoft Entra ID when connecting to a session host, making the sign-in experience seamless. Single sign-on using Microsoft Entra ID also pro",2025-08-29T08:00:00.000Z,how-to,security,0.8,True,"Describes enabling SSO with Entra ID tokens, including configuration steps and parameters specific to this product’s SSO behavior.",unchanged
@@ -86,7 +86,7 @@ https://learn.microsoft.com/en-us/azure/virtual-desktop/purview-forensic-evidenc
https://learn.microsoft.com/en-us/azure/virtual-desktop/quickstart,Deploy a sample Windows 11 desktop,Quickstart: deploy a sample Azure Virtual Desktop environment - Azure Virtual Desktop,,Quickly and easily deploy a sample Azure Virtual Desktop environment from the Azure portal using quickstart.,"You can quickly deploy a sample Azure Virtual Desktop environment with quickstart in the Azure portal. Quickstart enables you to easily evaluate a Windows 11 Enterprise multi-session remotely and become familiar with the service before deploying it in production. When you use quickstart, it deploys a small environment consisting of minimal resources and configuration. A user then signs into Windows App and connects to a full virtual desktop session. Deployment takes approximately 20 minutes to com",2025-06-20T03:01:00.000Z,quickstart,,0.35,False,"Quickstart deployment guide for a sample environment; step-by-step evaluation setup, not production constraints or matrices.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-desktop/rbac,Available RBAC roles for Azure Virtual Desktop,Built-in Azure RBAC roles Azure Virtual Desktop - Azure Virtual Desktop,Use built-in Azure RBAC roles for Azure Virtual Desktop,An overview of built-in Azure RBAC roles for Azure Virtual Desktop available.,"Azure Virtual Desktop uses Azure role-based access control (RBAC) to control access to resources. There are many built-in roles for use with Azure Virtual Desktop that are a collection of permissions. You assign roles to users and admins and these roles give permission to carry out certain tasks. To learn more about Azure RBAC, see What is Azure RBAC. The standard built-in roles for Azure are Owner, Contributor, and Reader. However, Azure Virtual Desktop has more roles that let you separate managem",2025-06-20T03:01:00.000Z,conceptual,security,0.85,True,"Lists Azure Virtual Desktop–specific RBAC roles and their permissions, which are product-specific security configuration details.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-desktop/rdp-bandwidth,Bandwidth considerations,Remote Desktop Protocol bandwidth requirements Azure Virtual Desktop - Azure,Understand RDP bandwidth needs for Azure Virtual Desktop,Understanding RDP bandwidth requirements for Azure Virtual Desktop.,"Remote Desktop Protocol (RDP) is a sophisticated technology that uses various techniques to perfect the server's remote graphics' delivery to the client device. Depending on the use case, availability of computing resources, and network bandwidth, RDP dynamically adjusts various parameters to deliver the best user experience. RDP multiplexes multiple Dynamic Virtual Channels (DVCs) into a single data channel sent over different network transports.
There are separate DVCs for remote graphics, inp",2025-06-20T03:01:00.000Z,conceptual,limits-quotas,0.7,True,"Bandwidth requirement article typically includes concrete kbps/Mbps figures for scenarios and possibly thresholds, which are numeric limits unknown to generic models.",unchanged
-https://learn.microsoft.com/en-us/azure/virtual-desktop/rdp-multipath,RDP Multipath,Use RDP Multipath to improve Azure Virtual Desktop connections - Azure Virtual Desktop,Use RDP Multipath to improve Azure Virtual Desktop reliability,Learn how RDP Multipath enhances remote connections to an Azure Virtual Desktop session by intelligently managing multiple network paths.,Important RDP Multipath is now GA and fully deployed! All eligible connections benefit from enhanced reliability and performance. Remote Desktop Protocol (RDP) Multipath improves session stability by continuously monitoring multiple network paths and dynamically selecting the most reliable one. This intelligent switching mechanism helps reduce the likelihood of disconnections and contributes to a smoother and more consistent user experience. It offers several key benefits: Seamless integration:,2025-11-10T21:32:00.000Z,how-to,best-practices,0.65,True,"Describes when and how RDP Multipath enhances connections and which connections are eligible, providing product-specific usage recommendations.",unchanged
+https://learn.microsoft.com/en-us/azure/virtual-desktop/rdp-multipath,RDP Multipath,Use RDP Multipath to improve Azure Virtual Desktop connections - Azure Virtual Desktop,,Learn how RDP Multipath enhances remote connections to an Azure Virtual Desktop session by intelligently managing multiple network paths.,Remote Desktop Protocol (RDP) Multipath improves session stability by continuously monitoring multiple network paths and dynamically selecting the most reliable one. This intelligent switching mechanism helps reduce the likelihood of disconnections and contributes to a smoother and more consistent user experience across different network conditions. It offers several key benefits: Seamless integration: No configuration changes are needed beyond ensuring your environment supports RDP Shortpath. In,2026-04-21T03:37:00.000Z,how-to,,0.2,False,"Page appears to describe what RDP Multipath is and its benefits for Azure Virtual Desktop connections, without clear evidence of numeric limits, configuration parameter tables, error-code-based troubleshooting, or decision matrices. Based on the summary, it is primarily conceptual/behavioral guidance rather than detailed expert configuration, limits, or troubleshooting content.",updated
https://learn.microsoft.com/en-us/azure/virtual-desktop/rdp-properties,Supported RDP properties,Supported RDP properties - Azure Virtual Desktop,Reference for supported RDP properties in AVD,"Learn about the supported RDP properties you can set to customize the behavior of a remote session, such as for device redirection, display settings, session behavior, and more.","Tip This article is shared for services and products that use the Remote Desktop Protocol (RDP) to provide remote access to Windows desktops and apps. The Remote Desktop Protocol (RDP) has a number of properties you can set to customize the behavior of a remote session, such as for device redirection, display settings, session behavior, and more.
The following sections contain each RDP property available and lists its syntax, description, supported values, the default value, and connections to w",2025-06-20T03:01:00.000Z,reference,configuration,0.95,True,"This is a parameter reference listing each RDP property with syntax, supported values, and defaults—exactly the kind of configuration table with defaults and allowed ranges that LLMs lack.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-desktop/rdp-quality-of-service-qos,Implement Quality of Service for RDP Shortpath,Implement Quality of Service (QoS) for Azure Virtual Desktop - Azure,Configure QoS policies for Azure Virtual Desktop RDP,How to set up QoS for Azure Virtual Desktop.,"RDP Shortpath for managed networks provides a direct UDP-based transport between Remote Desktop Client and Session host. RDP Shortpath for managed networks enables configuration of Quality of Service (QoS) policies for the RDP data. QoS in Azure Virtual Desktop allows real-time RDP traffic that's sensitive to network delays to ""cut in line"" in front of traffic that's less sensitive.
Example of such less sensitive traffic would be downloading a new app, where an extra second to download isn't a ",2025-06-20T03:01:00.000Z,conceptual,configuration,0.78,True,"QoS setup for RDP Shortpath typically includes specific DSCP values, port ranges, and Group Policy settings unique to Azure Virtual Desktop RDP traffic, which are product-specific configuration parameters not generally known from training.",unchanged
@@ -110,10 +110,10 @@ https://learn.microsoft.com/en-us/azure/virtual-desktop/remote-desktop-client/co
https://learn.microsoft.com/en-us/azure/virtual-desktop/remote-desktop-client/install-windows-client-per-user,Install the Remote Desktop client for Windows on a per-user basis,Install the Remote Desktop client for Windows on a per-user basis with Intune or Configuration Manager - Azure - Azure Virtual Desktop,Deploy Windows Remote Desktop client per-user via Intune or Configuration Manager,How to install the Azure Virtual Desktop client on a per-user basis with Intune or Configuration Manager.,"You can install the Remote Desktop client for Windows (MSRDC - MSI) on either a per-system or per-user basis. Installing it on a per-system basis installs the client on the machines for all users by default, and administrators control updates. Per-user installation installs the application to a subfolder within the local AppData folder of each user's profile, enabling users to install updates without needing administrative rights.
When you install the client using msiexec.exe, per-system is the d",2025-06-20T03:01:00.000Z,how-to,deployment,0.7,True,Covers per-user vs per-system MSI installation with msiexec options; includes specific deployment parameters and behavior.,unchanged
https://learn.microsoft.com/en-us/azure/virtual-desktop/remoteapp-enhancements,RemoteApp enhancements (preview),RemoteApp enhancements (preview) - Azure Virtual Desktop,Enable preview RemoteApp windowing enhancements in AVD,Learn how to enable enhanced windowing and integration behaviors for RemoteApps (preview).,"Upcoming enhancements to Azure Virtual Desktop RemoteApps and Windows 365 Cloud Apps include improved support for Windows Snap and full-screen mode, better multi-monitor handling, and refined visuals like borders, shadows, and theme integration. This article shows you how to enable these upcoming enhancements for testing during public preview. During public preview, these enhancements need to be manually enabled. Once these enhancements become generally available (GA), they'll be enabled by defa",2025-12-02T23:32:00.000Z,how-to,configuration,0.66,True,"Shows how to manually enable preview features, which typically involves specific flags, policy settings, or client configuration parameters unique to this feature.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-desktop/remotefx-graphics-performance-counters,Diagnosing graphics performance issues,Diagnose graphics performance issues Remote Desktop - Azure - Azure Virtual Desktop,Diagnose graphics performance issues with RemoteFX counters,This article describes how to use RemoteFX graphics counters in remote desktop protocol sessions to diagnose performance issues with graphics in Azure Virtual Desktop.,"To diagnose experience quality issues with your remote sessions, counters have been provided under the RemoteFX Graphics section of Performance Monitor.
This article helps you pinpoint and fix graphics-related performance bottlenecks during Remote Desktop Protocol (RDP) sessions using these counters.",2025-06-20T03:01:00.000Z,troubleshooting,troubleshooting,0.85,True,Explains how to use specific RemoteFX graphics performance counters in PerfMon to diagnose graphics bottlenecks—product-specific counters and interpretation steps.,unchanged -https://learn.microsoft.com/en-us/azure/virtual-desktop/required-fqdn-endpoint,Required FQDNs and endpoints,Required FQDNs and endpoints for Azure Virtual Desktop - Azure Virtual Desktop,Allow required FQDNs and endpoints for Azure Virtual Desktop,"A list of FQDNs and endpoints you must allow, ensuring your Azure Virtual Desktop deployment works as intended.","In order to deploy Azure Virtual Desktop and for your users to connect, you must allow specific FQDNs and endpoints. Users also need to be able to connect to certain FQDNs and endpoints to access their Azure Virtual Desktop resources. This article lists the required FQDNs and endpoints you need to allow for your session hosts and users. These FQDNs and endpoints could be blocked if you're using a firewall, such asAzure Firewall, or proxy service. For guidance on using a proxy service with Azure ",2026-04-13T15:06:00.000Z,conceptual,configuration,0.86,True,"The article is a product-specific allowlist of required FQDNs and endpoints for Azure Virtual Desktop session hosts and users. It enumerates concrete hostnames/URLs that must be permitted through firewalls or proxies, which are configuration parameters unique to this service and not inferable from general knowledge. 
This fits the configuration category as it defines exact network settings needed for correct operation.",updated
+https://learn.microsoft.com/en-us/azure/virtual-desktop/required-fqdn-endpoint,Required FQDNs and endpoints,Required FQDNs and endpoints for Azure Virtual Desktop - Azure Virtual Desktop,Allow required FQDNs and endpoints for Azure Virtual Desktop,"A list of FQDNs and endpoints you must allow, ensuring your Azure Virtual Desktop deployment works as intended.","In order to deploy Azure Virtual Desktop and for your users to connect, you must allow specific FQDNs and endpoints. Users also need to be able to connect to certain FQDNs and endpoints to access their Azure Virtual Desktop resources. This article lists the required FQDNs and endpoints you need to allow for your session hosts and users. These FQDNs and endpoints could be blocked if you're using a firewall, such as Azure Firewall, or proxy service. For guidance on using a proxy service with Azure ",2026-04-13T15:06:00.000Z,conceptual,configuration,0.86,True,"The article is a product-specific allowlist of required FQDNs and endpoints for Azure Virtual Desktop session hosts and users. It enumerates concrete hostnames/URLs that must be permitted through firewalls or proxies, which are configuration parameters unique to this service and not inferable from general knowledge. This fits the configuration category as it defines exact network settings needed for correct operation.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-desktop/scaling-automation-logic-apps,How the scaling tool works,Scale session hosts using Azure Automation and Azure Logic Apps for Azure Virtual Desktop - Azure - Azure Virtual Desktop,Use Automation and Logic Apps to scale AVD session hosts,Learn about scaling Azure Virtual Desktop session hosts with Azure Automation and Azure Logic Apps.,"You can reduce your total Azure Virtual Desktop deployment cost by scaling your virtual machines (VMs).
This means shutting down and deallocating session host VMs during off-peak usage hours, then turning them back on and reallocating them during peak hours. In this article, you'll learn about the scaling tool built with the Azure Automation account and Azure Logic Apps that automatically scales session host VMs in your Azure Virtual Desktop environment. To learn how to use the scaling tool, see",2025-06-20T03:01:00.000Z,how-to,architecture-patterns,0.6,True,Describes a specific scaling tool architecture using Automation and Logic Apps; this is a product-specific pattern for cost-optimized scaling.,unchanged https://learn.microsoft.com/en-us/azure/virtual-desktop/scheduled-agent-updates,Schedule Agent updates,Azure Virtual Desktop Scheduled Agent Updates - Azure Virtual Desktop,Configure scheduled agent updates for Azure Virtual Desktop,How to use the Scheduled Agent Updates feature to choose a date and time to update your Azure Virtual Desktop agent components.,"The Scheduled Agent Updates feature lets you create up to two maintenance windows for the Azure Virtual Desktop agent, side-by-side stack, and Geneva Monitoring agent to get updated so that updates don't happen during peak business hours. To monitor agent updates, you can use Log Analytics to see when agent component updates are available and when updates are unsuccessful. 
This article describes how the Scheduled Agent Updates feature works and how to set it up.",2025-06-20T03:01:00.000Z,how-to,configuration,0.7,True,Explains Scheduled Agent Updates feature and how to set maintenance windows; likely includes specific configuration options and parameters for update timing.,unchanged -https://learn.microsoft.com/en-us/azure/virtual-desktop/screen-capture-protection,Screen capture protection,Screen capture protection in Azure Virtual Desktop - Azure Virtual Desktop,Configure screen capture protection for Azure Virtual Desktop,Learn how to enable screen capture protection in Azure Virtual Desktop and Windows 365 to help prevent sensitive information from being captured on client devices.,"Screen capture protection, alongsidewatermarking, helps prevent sensitive data from being captured on client devices using specific operating system (OS) features and APIs. When you enable screen capture protection, remote content is automatically blocked in screenshots and screen sharing. This feature can be applied to Azure Virtual Desktop virtual machines and Windows 365 Cloud PCs. When Screen Capture Protection (SCP) is enabled to block screen capture on the client, users can continue to sha",2026-04-17T04:06:00.000Z,how-to,security,0.74,True,"The page describes how to enable and configure Screen Capture Protection for Azure Virtual Desktop and Windows 365, including product-specific security behavior (what is blocked/allowed, client/OS requirements, and interaction with watermarking). 
This is concrete, product-specific security configuration guidance rather than a conceptual overview.",updated
+https://learn.microsoft.com/en-us/azure/virtual-desktop/screen-capture-protection,Screen capture protection,Screen capture protection in Azure Virtual Desktop - Azure Virtual Desktop,Configure screen capture protection for Azure Virtual Desktop,Learn how to enable screen capture protection in Azure Virtual Desktop and Windows 365 to help prevent sensitive information from being captured on client devices.,"Screen capture protection, alongside watermarking, helps prevent sensitive data from being captured on client devices using specific operating system (OS) features and APIs. When you enable screen capture protection, remote content is automatically blocked in screenshots and screen sharing. This feature can be applied to Azure Virtual Desktop virtual machines and Windows 365 Cloud PCs. When Screen Capture Protection (SCP) is enabled to block screen capture on the client, users can continue to sha",2026-04-17T04:06:00.000Z,how-to,security,0.74,True,"The page describes how to enable and configure Screen Capture Protection for Azure Virtual Desktop and Windows 365, including product-specific security behavior (what is blocked/allowed, client/OS requirements, and interaction with watermarking). This is concrete, product-specific security configuration guidance rather than a conceptual overview.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-desktop/security-recommendations,Security recommendations,Security recommendations for Azure Virtual Desktop - Azure Virtual Desktop,Apply security recommendations to Azure Virtual Desktop,Learn about recommendations for helping keep your Azure Virtual Desktop environment secure.,"Azure Virtual Desktop is a managed virtual desktop service that includes many security capabilities for keeping your organization safe.
The architecture of Azure Virtual Desktop comprises many components that make up the service connecting users to their desktops and apps. Azure Virtual Desktop has many built-in advanced security features, such as Reverse Connect where no inbound network ports are required to be open, which reduces the risk involved with having remote desktops accessible from an",2025-06-20T03:01:00.000Z,conceptual,security,0.7,True,Security recommendations for this specific service likely include concrete settings and patterns (for example Reverse Connect usage) beyond generic security advice.,unchanged https://learn.microsoft.com/en-us/azure/virtual-desktop/service-architecture-resilience,Service architecture and resilience,Azure Virtual Desktop service architecture and resilience - Azure Virtual Desktop,,Learn about the service architecture of Azure Virtual Desktop and how it has been designed to be resilient.,"Azure Virtual Desktop is designed to provide a resilient, reliable, and secure service for organizations and users. The architecture of Azure Virtual Desktop comprises many components that make up the service connecting users to their desktops and apps. Most components are Microsoft-managed, but some are customer-managed or partner-managed. Microsoft provides the virtual desktop infrastructure (VDI) components for core functionality as a service. 
These components include: In addition, Azure Virt",2025-06-20T03:01:00.000Z,conceptual,,0.3,False,"Architecture and resilience overview appears conceptual; no indication of quantified thresholds, decision matrices, or config tables.",unchanged https://learn.microsoft.com/en-us/azure/virtual-desktop/service-principal-assign-roles,Assign RBAC roles to the service principals,Assign Azure RBAC roles or Microsoft Entra roles to the Azure Virtual Desktop service principals - Azure Virtual Desktop,Assign RBAC and Entra roles to AVD service principals,"Learn how to assign Azure RBAC roles or Microsoft Entra roles to the Azure Virtual Desktop service principals by using the Azure portal, Azure CLI, or Azure PowerShell.","Several Azure Virtual Desktop features require you to assign Azure role-based access control (Azure RBAC) roles or Microsoft Entra roles to a service principal. Features that you need to assign a role to a service principal include: Tip You can find which role or roles you need to assign to which service principal in the article for each feature. 
For a list of all the available Azure RBAC roles created specifically for Azure Virtual Desktop, seeBuilt-in Azure RBAC roles for Azure Virtual Desktop",2025-08-14T20:36:00.000Z,how-to,security,0.8,True,"How-to for assigning specific Azure RBAC and Entra roles to service principals, including role names and scopes, is security configuration guidance.",unchanged diff --git a/products/azure-virtual-desktop/report.md b/products/azure-virtual-desktop/report.md index 39c32321..255abc78 100644 --- a/products/azure-virtual-desktop/report.md +++ b/products/azure-virtual-desktop/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: deployment: 'Guides for deploying and migrating AVD: adding session hosts, moving from classic to current AVD, changing regions, using regional host pools, and @@ -21,26 +21,26 @@ category_descriptions: decision-making: 'Planning and cost/licensing decisions for AVD: deployment models, autoscale, host pool and tool choices, storage/FSLogix, data locations, ESU, Local/Extended Zones, and Insights cost estimation' - security: 'Securing Azure Virtual Desktop: SSO (Entra ID/AD FS), MFA/Conditional - Access, RBAC/roles, managed identities, external identities, session protections - (watermarking, screen capture, clipboard), and auditing.' + security: 'Securing AVD access and sessions: SSO (Entra ID/AD FS), MFA/Conditional + Access, RBAC/delegated admin, managed identities, external identities, clipboard/screen/watermark + controls, WebAuthn, Kerberos, and Purview.' limits-quotas: Guidance on RDP bandwidth requirements and optimizing Microsoft Teams (audio/video, collaboration features) performance and configuration in Azure Virtual Desktop. skill_description: Expert knowledge for Azure Virtual Desktop development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. 
- Use when working with FSLogix profiles, MSIX App Attach, autoscale/Start VM on Connect, - Teams optimization, or AVD SSO, and other Azure Virtual Desktop related development + Use when working with AVD host pools, FSLogix profiles, autoscale/Start VM on Connect, + MSIX App Attach, or Teams offload, and other Azure Virtual Desktop related development tasks. Not for Azure Virtual Machines (use azure-virtual-machines), Azure Dev Box - (use azure-dev-box), Azure VMware Solution (use azure-vmware-solution), Azure Data - Science Virtual Machines (use azure-data-science-vm). -use_when: Use when working with FSLogix profiles, MSIX App Attach, autoscale/Start - VM on Connect, Teams optimization, or AVD SSO, and other Azure Virtual Desktop related - development tasks. + (use azure-dev-box), Azure Virtual Machine Scale Sets (use azure-vm-scalesets), + SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines). +use_when: Use when working with AVD host pools, FSLogix profiles, autoscale/Start + VM on Connect, MSIX App Attach, or Teams offload, and other Azure Virtual Desktop + related development tasks. confusable_not_for: Not for Azure Virtual Machines (use azure-virtual-machines), Azure - Dev Box (use azure-dev-box), Azure VMware Solution (use azure-vmware-solution), - Azure Data Science Virtual Machines (use azure-data-science-vm). + Dev Box (use azure-dev-box), Azure Virtual Machine Scale Sets (use azure-vm-scalesets), + SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines). 
--- # Azure Virtual Desktop Crawl Report @@ -49,13 +49,13 @@ confusable_not_for: Not for Azure Virtual Machines (use azure-virtual-machines), - **Total Pages**: 150 - **Fetched**: 150 - **Fetch Failed**: 0 -- **Classified**: 121 -- **Unclassified**: 29 +- **Classified**: 120 +- **Unclassified**: 30 ### Incremental Update -- **New Pages**: 1 +- **New Pages**: 0 - **Updated Pages**: 2 -- **Unchanged**: 147 +- **Unchanged**: 148 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-virtual-desktop/azure-virtual-desktop.csv` @@ -64,7 +64,7 @@ confusable_not_for: Not for Azure Virtual Machines (use azure-virtual-machines), | Type | Count | Percentage | |------|-------|------------| | architecture-patterns | 5 | 3.3% | -| best-practices | 8 | 5.3% | +| best-practices | 7 | 4.7% | | configuration | 57 | 38.0% | | decision-making | 12 | 8.0% | | deployment | 6 | 4.0% | @@ -72,20 +72,16 @@ confusable_not_for: Not for Azure Virtual Machines (use azure-virtual-machines), | limits-quotas | 2 | 1.3% | | security | 15 | 10.0% | | troubleshooting | 12 | 8.0% | -| *(Unclassified)* | 29 | 19.3% | +| *(Unclassified)* | 30 | 20.0% | ## Changes -### New Pages - -- [Network connectivity for 169.254.169.254 and 168.63.129.16](https://learn.microsoft.com/en-us/azure/virtual-desktop/azurecommunicationips) - ### Updated Pages -- [Required FQDNs and endpoints](https://learn.microsoft.com/en-us/azure/virtual-desktop/required-fqdn-endpoint) - - Updated: 2026-03-11T08:00:00.000Z → 2026-04-13T15:06:00.000Z -- [Screen capture protection](https://learn.microsoft.com/en-us/azure/virtual-desktop/screen-capture-protection) - - Updated: 2026-02-05T04:10:00.000Z → 2026-04-17T04:06:00.000Z +- [RDP Multipath](https://learn.microsoft.com/en-us/azure/virtual-desktop/rdp-multipath) + - Updated: 2025-11-10T21:32:00.000Z → 2026-04-21T03:37:00.000Z +- [Configure a managed identity](https://learn.microsoft.com/en-us/azure/virtual-desktop/configure-managed-identity) + - 
Updated: 2025-08-14T20:36:00.000Z → 2026-04-22T20:42:00.000Z ## Classified Pages @@ -104,7 +100,6 @@ confusable_not_for: Not for Azure Virtual Machines (use azure-virtual-machines), | [Collect and query user connection quality data](https://learn.microsoft.com/en-us/azure/virtual-desktop/connection-quality-monitoring) | troubleshooting | 0.80 | Shows how to set up and query a specific connection quality data table in Log Analytics, including table names and KQL examples for diagnosing issues—product-specific troubleshooting patterns. | | [Configure RDP Shortpath](https://learn.microsoft.com/en-us/azure/virtual-desktop/configure-rdp-shortpath) | configuration | 0.80 | How-to configuration article for enabling Shortpath with specific settings and network parameters, matching configuration criteria. | | [Configure Start VM on Connect](https://learn.microsoft.com/en-us/azure/virtual-desktop/start-virtual-machine-connect) | configuration | 0.80 | Explains enabling/disabling Start VM on Connect with behavior differences for pooled vs personal pools; involves specific host pool settings and constraints. | -| [Configure a managed identity](https://learn.microsoft.com/en-us/azure/virtual-desktop/configure-managed-identity) | security | 0.80 | Managed identity setup for host pools requires specific role assignments and ARM permissions; this is product-specific security configuration with concrete RBAC details. | | [Configure single sign-on using AD FS](https://learn.microsoft.com/en-us/azure/virtual-desktop/configure-adfs-sso) | security | 0.80 | Product-specific SSO configuration using AD FS, with concrete settings and integration steps, fits security configuration. | | [Configure single sign-on using Microsoft Entra ID](https://learn.microsoft.com/en-us/azure/virtual-desktop/configure-single-sign-on) | security | 0.80 | Describes enabling SSO with Entra ID tokens, including configuration steps and parameters specific to this product’s SSO behavior. 
| | [Enforce Microsoft Entra multifactor authentication](https://learn.microsoft.com/en-us/azure/virtual-desktop/set-up-mfa) | security | 0.80 | Shows how to enforce MFA via Conditional Access, including policy configuration specific to Azure Virtual Desktop sign-ins. | @@ -133,6 +128,7 @@ confusable_not_for: Not for Azure Virtual Machines (use azure-virtual-machines), | [Increase chroma value to 4:4:4](https://learn.microsoft.com/en-us/azure/virtual-desktop/graphics-chroma-value-increase-4-4-4) | configuration | 0.75 | Article explicitly shows how to set chroma value via Intune or Group Policy, implying specific policy names, registry keys, and allowed values (4:2:0 vs 4:4:4). | | [Onboard to Purview forensic evidence](https://learn.microsoft.com/en-us/azure/virtual-desktop/purview-forensic-evidence) | security | 0.75 | Integration with Purview Insider Risk forensic evidence; involves specific policies, triggers, and onboarding steps that are security-focused and product-specific. | | [Scale with a runbook and logic app](https://learn.microsoft.com/en-us/azure/virtual-desktop/set-up-scaling-script) | configuration | 0.75 | How-to for configuring the scaling script; likely includes runbook parameters, schedules, and Logic Apps settings that are product-specific. | +| [Configure a managed identity](https://learn.microsoft.com/en-us/azure/virtual-desktop/configure-managed-identity) | security | 0.74 | The page describes how to configure managed identities specifically for Azure Virtual Desktop host pools, including assigning permissions for ARM operations on VMs, Key Vault, and networking resources. This involves product-specific security configuration steps and role/permission assignments tied to Azure RBAC and managed identities, which qualify as expert security configuration knowledge rather than generic concepts. 
| | [Screen capture protection](https://learn.microsoft.com/en-us/azure/virtual-desktop/screen-capture-protection) | security | 0.74 | The page describes how to enable and configure Screen Capture Protection for Azure Virtual Desktop and Windows 365, including product-specific security behavior (what is blocked/allowed, client/OS requirements, and interaction with watermarking). This is concrete, product-specific security configuration guidance rather than a conceptual overview. | | [Add and manage App Attach applications](https://learn.microsoft.com/en-us/azure/virtual-desktop/app-attach-setup) | configuration | 0.72 | Management of App Attach apps uses AVD-specific portal and PowerShell parameters, including configuration of package sources, registration options, and assignment settings. | | [Launch OneDrive with RemoteApp](https://learn.microsoft.com/en-us/azure/virtual-desktop/onedrive-remoteapp) | configuration | 0.72 | Describes a workaround for the standard OneDrive autostart setting not working with RemoteApp, implying specific script/configuration steps unique to AVD. | @@ -198,7 +194,6 @@ confusable_not_for: Not for Azure Virtual Machines (use azure-virtual-machines), | [Licensing](https://learn.microsoft.com/en-us/azure/virtual-desktop/licensing) | decision-making | 0.65 | Details licensing differences for internal vs external use and per-user access pricing, guiding choice of licensing models. | | [Monitor Autoscale operations](https://learn.microsoft.com/en-us/azure/virtual-desktop/autoscale-monitor-operations-insights) | troubleshooting | 0.65 | Focuses on monitoring autoscale operations and identifying issues; uses Insights data and likely provides diagnostic patterns specific to autoscale. 
| | [Preferred application group type overview](https://learn.microsoft.com/en-us/azure/virtual-desktop/preferred-application-group-type) | configuration | 0.65 | Describes behavior of preferred application group type and likely includes specific property names and allowed values that control desktop vs RemoteApp access. | -| [RDP Multipath](https://learn.microsoft.com/en-us/azure/virtual-desktop/rdp-multipath) | best-practices | 0.65 | Describes when and how RDP Multipath enhances connections and which connections are eligible, providing product-specific usage recommendations. | | [Redirect video playback and calls](https://learn.microsoft.com/en-us/azure/virtual-desktop/multimedia-redirection-video-playback-calls) | configuration | 0.65 | Covers multimedia redirection for video and calls over RDP for Azure Virtual Desktop/Windows 365/Dev Box; such articles typically include product-specific settings, client/server configuration parameters, and enablement steps that qualify as configuration expert knowledge. | | [Resolve Azure Advisor recommendations](https://learn.microsoft.com/en-us/azure/virtual-desktop/azure-advisor-recommendations) | best-practices | 0.65 | Walkthrough of concrete Azure Advisor recommendations and how to remediate them for AVD, which are product-specific DO/DO-NOT style guidance with actionable steps. | | [Use features of the Remote Desktop client](https://learn.microsoft.com/en-us/azure/virtual-desktop/remote-desktop-client/client-features-windows-msrdc) | configuration | 0.65 | Explains client features and likely their configuration options; product-specific client behavior and settings when used with AVD. | @@ -240,6 +235,7 @@ confusable_not_for: Not for Azure Virtual Machines (use azure-virtual-machines), | [Autoscale glossary](https://learn.microsoft.com/en-us/azure/virtual-desktop/autoscale-glossary) | 0.20 | Glossary of terms; definitions are conceptual and not configuration, limits, or troubleshooting content. 
| | [Azure Virtual Desktop](https://learn.microsoft.com/en-us/azure/virtual-desktop/whats-new) | 0.20 | A 'what's new' changelog-style page listing recent Azure Virtual Desktop updates; primarily release notes and marketing/overview of new features without structured limits, configuration tables, decision matrices, or troubleshooting mappings. | | [Private Link with Azure Virtual Desktop](https://learn.microsoft.com/en-us/azure/virtual-desktop/private-link-overview) | 0.20 | High-level overview of using Private Link with Azure Virtual Desktop; summary indicates conceptual benefits without detailed configuration tables or limits. | +| [RDP Multipath](https://learn.microsoft.com/en-us/azure/virtual-desktop/rdp-multipath) | 0.20 | Page appears to describe what RDP Multipath is and its benefits for Azure Virtual Desktop connections, without clear evidence of numeric limits, configuration parameter tables, error-code-based troubleshooting, or decision matrices. Based on the summary, it is primarily conceptual/behavioral guidance rather than detailed expert configuration, limits, or troubleshooting content. | | [SxS Network Stack](https://learn.microsoft.com/en-us/azure/virtual-desktop/whats-new-sxs) | 0.20 | Describes new features and updates for the Azure Virtual Desktop SxS Network Stack; functions as release notes rather than detailed configuration, limits, or troubleshooting guidance with structured expert data. | | [Thin client partners](https://learn.microsoft.com/en-us/azure/virtual-desktop/thin-clients) | 0.20 | Primarily a link-out/navigation page to partner thin client vendors and a generic note about using a browser; no detailed configuration tables or product-specific parameters. | | [Glossary](https://learn.microsoft.com/en-us/azure/virtual-desktop/insights-glossary) | 0.10 | A glossary of terms and concepts; definitions are conceptual and not configuration, limits, or troubleshooting content. 
| diff --git a/products/azure-virtual-machines/azure-virtual-machines.csv b/products/azure-virtual-machines/azure-virtual-machines.csv index e9533d67..68823a71 100644 --- a/products/azure-virtual-machines/azure-virtual-machines.csv +++ b/products/azure-virtual-machines/azure-virtual-machines.csv @@ -51,11 +51,11 @@ https://learn.microsoft.com/en-us/azure/virtual-machines/backup-and-disaster-rec https://learn.microsoft.com/en-us/azure/virtual-machines/backup-recovery,Overview,Overview backup options for VMs - Azure Virtual Machines,,Overview backup options for Azure virtual machines.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets You can protect your data by taking backups at regular intervals. There are several backup options available for virtual machines (VMs), depending on your use-case.",2024-08-22T17:37:00.000Z,overview,,0.3,False,"Overview of backup options; no indication of numeric limits, decision matrices, or detailed configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/boot-diagnostics,Boot diagnostics,Azure boot diagnostics - Azure Virtual Machines,,Overview of Azure boot diagnostics and managed boot diagnostics,Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets Boot diagnostics is a debugging feature for Azure virtual machines (VM) that allows diagnosis of VM boot failures. 
Boot diagnostics enables a user to observe the state of their VM as it is booting up by collecting serial log information and screenshots.,2025-12-01T23:05:00.000Z,how-to,,0.3,False,"High-level overview of boot diagnostics; summary doesn’t show config parameter tables, limits, or troubleshooting mappings.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-machines/boot-integrity-monitoring-overview,Boot integrity monitoring,Boot integrity monitoring overview - Azure Virtual Machines,Configure boot integrity monitoring and guest attestation for Trusted Launch VMs,Learn how to use the Guest Attestation extension to secure boot your virtual machine and how to handle traffic blocking.,"To help Azure Trusted Launch better prevent malicious rootkit attacks on virtual machines (VMs), guest attestation through an Azure Attestation endpoint is used to monitor the boot sequence integrity. This attestation is critical to provide the validity of a platform's states. Your Trusted Launch VM needs Secure Boot and virtual Trusted Platform Module (vTPM) to be enabled so that the attestation extensions can be installed. Microsoft Defender for Cloud offers reports based on Guest Attestation ve",2024-08-22T17:37:00.000Z,concept-article,security,0.7,True,"Details use of Guest Attestation extension, Azure Attestation endpoint, and requirements like Secure Boot and vTPM; includes product-specific security configuration and monitoring behavior.",unchanged
-https://learn.microsoft.com/en-us/azure/virtual-machines/capacity-reservation-associate-virtual-machine-scale-set,Associate a scale set - Uniform,Associate a virtual machine scale set with Uniform Orchestration to a capacity reservation group.
- Azure Virtual Machines,,Learn how to associate a new or existing virtual machine scale with Uniform Orchestration set to a capacity reservation group.,"Applies to:✔️ Uniform scale set Azure Virtual Machine Scale Sets has two modes: To learn more about these modes, seeVirtual Machine Scale Sets orchestration modes. This content applies to the Uniform Orchestration mode. For Flexible Orchestration mode, seeAssociate a virtual machine scale set with Flexible Orchestration to a capacity reservation group.",2025-11-24T23:03:00.000Z,how-to,,0.35,False,How-to associate a Uniform orchestration scale set with a capacity reservation group; similar procedural focus as 27.,unchanged +https://learn.microsoft.com/en-us/azure/virtual-machines/capacity-reservation-associate-virtual-machine-scale-set,Associate a scale set - Uniform,Associate a virtual machine scale set with Uniform Orchestration to a capacity reservation group. - Azure Virtual Machines,,Learn how to associate a new or existing virtual machine scale with Uniform Orchestration set to a capacity reservation group.,"Applies to:✔️ Uniform scale set Azure Virtual Machine Scale Sets has two modes: To learn more about these modes, seeVirtual Machine Scale Sets orchestration modes. This content applies to the Uniform Orchestration mode. For Flexible Orchestration mode, seeAssociate a virtual machine scale set with Flexible Orchestration to a capacity reservation group.",2026-04-20T22:05:00.000Z,how-to,,0.2,False,"Covers associating a Uniform orchestration VM scale set to a capacity reservation group. 
The summary indicates a scenario-specific how-to, not expert-level limits, configuration matrices, or error-code-based troubleshooting content.",updated https://learn.microsoft.com/en-us/azure/virtual-machines/capacity-reservation-associate-virtual-machine-scale-set-flex,Associate a scale set - Flexible,Associate a virtual machine scale set with Flexible Orchestration to a capacity reservation group - Azure Virtual Machines,,Learn how to associate a new virtual machine scale set with Flexible Orchestration mode to a capacity reservation group.,"Applies to:✔️ Flexible scale sets Azure Virtual Machine Scale Sets has two modes: To learn more about these modes, seeOrchestration modes for Virtual Machine Scale Sets. This content applies to the Flexible Orchestration mode. For more information about Uniform Orchestration mode, seeAssociate a virtual machine scale set with Uniform Orchestration to a capacity reservation group.",2024-09-09T22:10:00.000Z,how-to,,0.35,False,How-to associate a Flexible orchestration scale set with a capacity reservation group; procedural content rather than configuration reference.,unchanged -https://learn.microsoft.com/en-us/azure/virtual-machines/capacity-reservation-associate-vm,Associate a VM,Associate a virtual machine to a capacity reservation group - Azure Virtual Machines,,Learn how to associate a new or existing virtual machine to a capacity reservation group.,"Applies to:✔️ Windows Virtual Machines ✔️ Linux Virtual Machines You can use capacity reservation groups with new or existing virtual machines (VMs). 
To learn more about capacity reservations, see theCapacity reservation overview.",2025-11-24T23:03:00.000Z,how-to,,0.35,False,How-to associate a VM to a capacity reservation group; likely a short procedural article without extensive configuration or limits.,unchanged +https://learn.microsoft.com/en-us/azure/virtual-machines/capacity-reservation-associate-vm,Associate a VM,Associate a virtual machine to a capacity reservation group - Azure Virtual Machines,,Learn how to associate a new or existing virtual machine to a capacity reservation group.,"Applies to:✔️ Windows Virtual Machines ✔️ Linux Virtual Machines You can use capacity reservation groups with new or existing virtual machines (VMs). To learn more about capacity reservations, see theCapacity reservation overview.",2026-04-20T22:05:00.000Z,how-to,,0.2,False,"Described as a guide to associate a VM to a capacity reservation group. This is primarily procedural content without indication of numeric limits, detailed configuration parameter tables, or troubleshooting mappings.",updated https://learn.microsoft.com/en-us/azure/virtual-machines/capacity-reservation-create,Create a capacity reservation,Create a capacity reservation in Azure - Azure Virtual Machines,,Learn how to reserve compute capacity in an Azure region or an availability zone by creating a capacity reservation.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Uniform scale set ✔️ Flexible scale sets Capacity reservation is always created as part of a capacity reservation group. The first step is to create a group if a suitable one doesn't exist already, and then create reservations. After reservations are successfully created, they're immediately available for use with virtual machines (VMs). The capacity is reserved for your use as long as the reservation isn't deleted. 
A well-formed request for a capacity r",2024-09-09T22:10:00.000Z,how-to,,0.4,False,How-to article for creating capacity reservations; likely step-by-step portal/API instructions rather than a comprehensive configuration reference with ranges and defaults.,unchanged -https://learn.microsoft.com/en-us/azure/virtual-machines/capacity-reservation-group-share,Share a capacity reservation group,Share a Capacity Reservation Group in Azure - Azure Virtual Machines,,Learn how to share a Capacity Reservation Group.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Uniform scale set ✔️ Flexible scale sets Important This feature is currently inPreview. See thePreview Terms of Usefor legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability. On-demand Capacity Reservation Group (CRG) can be shared with other subscriptions. Using this option can make it easier to manage some common configuration needs: Reuse of capacity reserved for disaster recovery.Re",2025-11-25T06:02:00.000Z,how-to,,0.45,False,"Describes sharing Capacity Reservation Groups across subscriptions; summary suggests conceptual explanation and basic steps, not detailed RBAC role lists or configuration matrices.",unchanged +https://learn.microsoft.com/en-us/azure/virtual-machines/capacity-reservation-group-share,Share a capacity reservation group,Share a Capacity Reservation Group in Azure - Azure Virtual Machines,,Learn how to share a Capacity Reservation Group.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Uniform scale set ✔️ Flexible scale sets Important This feature is currently inPreview. See thePreview Terms of Usefor legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability. On-demand Capacity Reservation Group (CRG) can be shared with other subscriptions. 
Using this option can make it easier to manage some common configuration needs: Reuse of capacity reserved for disaster recovery.Re",2026-04-20T22:05:00.000Z,how-to,,0.2,False,"Appears to be a how-to/tutorial for sharing a Capacity Reservation Group across subscriptions. The summary suggests step-by-step usage, not detailed limits, configuration matrices, or error-code-based troubleshooting. Likely no dense expert-only configuration tables or numeric constraints.",updated https://learn.microsoft.com/en-us/azure/virtual-machines/capacity-reservation-modify,Modify a capacity reservation,Modify a capacity reservation in Azure - Azure Virtual Machines,,Learn how to modify a capacity reservation.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Uniform scale set ✔️ Flexible scale sets After you create a capacity reservation group and a capacity reservation, you might want to modify your reservations. This article explains how to do the following actions by using the API, the Azure portal, and PowerShell.",2024-09-09T22:10:00.000Z,how-to,,0.4,False,How-to for modifying capacity reservations via API/portal/PowerShell; procedural guidance rather than detailed configuration tables.,unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/capacity-reservation-overallocate,Overallocating capacity reservation,Overallocate capacity reservation in Azure - Azure Virtual Machines,,Learn how overallocation works when it comes to capacity reservation.,Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Uniform scale set ✔️ Flexible scale sets Azure permits the association of extra virtual machines (VMs) above the number of capacity reservations. These VMs are available to allow for burst and other scale-out scenarios without the limits of reserved capacity. The only difference is that the count of VMs beyond the quantity reserved doesn't receive the capacity availability service-level agreement (SLA) benefit. 
As long as Azure has available capacity tha,2024-09-09T22:10:00.000Z,how-to,,0.55,False,Explains overallocation behavior for capacity reservations; summary mentions conceptual behavior and SLA differences but not specific numeric thresholds or configuration parameters.,unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/capacity-reservation-overview,Overview,On-demand capacity reservation in Azure - Azure Virtual Machines,Use on-demand capacity reservations for Azure VMs,Learn how to reserve compute capacity in an Azure region or an availability zone with capacity reservation.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Uniform scale set ✔️ Flexible scale sets On-demand capacity reservation enables you to reserve compute capacity in an Azure region or an availability zone for any duration of time. Unlikereserved instances, you don't have to sign up for a one-year or three-year term commitment. You can create and delete reservations at any time and have full control over how you want to manage your reservations. After you create the capacity reservation, you can use the ",2026-02-17T08:00:00.000Z,overview,decision-making,0.65,True,"Explains when and how to use on-demand capacity reservations versus reserved instances, including product-specific behavior (region/zone scope, duration flexibility, creation/deletion semantics). 
This is specialized decision guidance for capacity planning and reservation strategy, not just conceptual overview, and does not center on numeric limits or configuration tables.",unchanged @@ -91,15 +91,15 @@ https://learn.microsoft.com/en-us/azure/virtual-machines/create-restore-points,C https://learn.microsoft.com/en-us/azure/virtual-machines/custom-data,Custom data,Custom data and Azure virtual machines - Azure Virtual Machines,Use custom data and cloud-init with Azure VMs,This article gives details on using custom data and cloud-init on Azure virtual machines.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets You might need to inject a script or other metadata into a Microsoft Azure virtual machine (VM) at provisioning time. In other clouds, this concept is often calleduser data. Microsoft Azure has a similar feature calledcustom data. Custom data is made available to the VM during first startup or setup, which is calledprovisioning. Provisioning is the process where VM creation parameters (for example, host name, username, password, certi",2024-09-09T17:02:00.000Z,how-to,configuration,0.8,True,"Details Azure’s custom data feature, provisioning behavior, and cloud-init usage; includes product-specific parameters and behaviors.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/custom-domain,Use a custom domain name,Create and use a custom domain - Azure Virtual Machines,,Connect a custom domain to a virtual machine in Azure.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets In Azure there are multiple ways to connect a custom domain to your VM or resource. 
For any resource with a public IP (Virtual Machine, Load Balancer, Application Gateway) the most straight-forward way is to create an A record set in your corresponding domain registrar.",2024-08-22T17:37:00.000Z,how-to,,0.35,False,"Explains connecting custom domains via A records; generic DNS practice, not Azure-specific limits/config tables.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/dedicated-host-compute-optimized-skus,Compute Optimized SKUs,Compute Optimized Azure Dedicated Host SKUs - Azure Virtual Machines,Compute optimized Dedicated Host SKU capacities and packing,Specifications for VM packing of Compute Optimized ADH SKUs.,"Azure Dedicated Host SKUs are the combination of a VM family and a certain hardware specification. You can only deploy VMs of the VM series that the Dedicated Host SKU specifies. For example, on the Dsv3-Type3, you can only provisionDsv3-seriesVMs. This document goes through the hardware specifications and VM packings for all compute optimized Dedicated Host SKUs.",2024-08-22T17:37:00.000Z,concept-article,limits-quotas,0.9,True,Similar to index 2 but for compute-optimized SKUs; contains detailed hardware specs and VM packing tables with exact numeric limits per SKU.,unchanged -https://learn.microsoft.com/en-us/azure/virtual-machines/dedicated-host-general-purpose-skus,General Purpose SKUs,General Purpose Azure Dedicated Host SKUs - Azure Virtual Machines,General purpose Dedicated Host SKU capacities and packing,Specifications for VM packing of General Purpose ADH SKUs.,"Azure Dedicated Host SKUs are the combination of a VM family and a certain hardware specification. You can only deploy VMs of the VM series that the Dedicated Host SKU specifies. For example, on the Dsv3-Type3, you can only provisionDsv3-seriesVMs. 
This document goes through the hardware specifications and VM packings for all general purpose Dedicated Host SKUs.",2024-08-22T17:37:00.000Z,concept-article,limits-quotas,0.9,True,"Described as hardware specifications and VM packing for each general purpose Dedicated Host SKU. These pages typically contain tables with exact vCPU/RAM counts and how many specific VM sizes fit per host, which are product-specific numeric limits.",unchanged +https://learn.microsoft.com/en-us/azure/virtual-machines/dedicated-host-general-purpose-skus,General Purpose SKUs,General Purpose Azure Dedicated Host SKUs - Azure Virtual Machines,General purpose Azure Dedicated Host SKU specifications,Specifications for VM packing of General Purpose ADH SKUs.,"Azure Dedicated Host SKUs are the combination of a VM family and a certain hardware specification. You can only deploy VMs of the VM series that the Dedicated Host SKU specifies. For example, on the Dsv3-Type3, you can only provisionDsv3-seriesVMs. This document goes through the hardware specifications and VM packings for all general purpose Dedicated Host SKUs.",2026-04-23T22:04:00.000Z,concept-article,limits-quotas,0.86,True,"The page describes hardware specifications and VM packing details for each general purpose Dedicated Host SKU. This typically includes exact core counts, memory, maximum number/size of VMs per host, and SKU-specific constraints, which are numeric limits and quotas that an LLM wouldn't reliably know from training.",updated https://learn.microsoft.com/en-us/azure/virtual-machines/dedicated-host-gpu-optimized-skus,GPU Optimized SKUs,GPU Optimized Azure Dedicated Host SKUs - Azure Virtual Machines,GPU optimized Dedicated Host SKU capacities and packing,Specifications for VM packing of GPU optimized ADH SKUs.,"Azure Dedicated Host SKUs are the combination of a VM family and a certain hardware specification. You can only deploy VMs of the VM series that the Dedicated Host SKU specifies. 
For example, on the Dsv3-Type3, you can only provisionDsv3-seriesVMs. This document goes through the hardware specifications and VM packings for all GPU optimized Dedicated Host SKUs.",2024-08-22T17:37:00.000Z,concept-article,limits-quotas,0.9,True,"Covers hardware specifications and VM packing for GPU-optimized Dedicated Host SKUs, with numeric limits on vCPUs, RAM, and VM counts per host.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/dedicated-host-memory-optimized-skus,Memory Optimized SKUs,Memory Optimized Azure Dedicated Host SKUs - Azure Virtual Machines,Memory optimized Dedicated Host SKU capacities and packing,Specifications for VM packing of Memory Optimized ADH SKUs.,"Azure Dedicated Host SKUs are the combination of a VM family and a certain hardware specification. You can only deploy VMs of the VM series that the Dedicated Host SKU specifies. For example, on the Dsv3-Type3, you can only provisionDsv3-seriesVMs. This document goes through the hardware specifications and VM packings for all memory optimized Dedicated Host SKUs. Note The host reserves a portion of the ""Available vCPUs"" and ""Available RAM"" listed in the tables below for its own operations. To de",2024-08-22T17:37:00.000Z,concept-article,limits-quotas,0.9,True,"Provides hardware specifications and VM packing tables for memory-optimized Dedicated Host SKUs, including available vCPUs and RAM with host-reserved portions, which are precise numeric limits.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/dedicated-host-storage-optimized-skus,Storage Optimized SKUs,Storage Optimized Azure Dedicated Host SKUs - Azure Virtual Machines,Storage optimized Dedicated Host SKU capacities and packing,Specifications for VM packing of Storage Optimized ADH SKUs.,"Azure Dedicated Host SKUs are the combination of a VM family and a certain hardware specification. You can only deploy VMs of the VM series that the Dedicated Host SKU specifies. 
For example, on the Dsv3-Type3, you can only provisionDsv3-seriesVMs. This document goes through the hardware specifications and VM packings for all storage optimized Dedicated Host SKUs.",2024-08-22T17:37:00.000Z,concept-article,limits-quotas,0.9,True,Lists hardware specs and VM packing for storage-optimized Dedicated Host SKUs; these are exact numeric capacity and packing limits per SKU.,unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/dedicated-hosts,Overview,Overview of Azure Dedicated Hosts for virtual machines - Azure Virtual Machines,,Learn more about how Azure Dedicated Hosts can be used for deploying virtual machines.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Uniform scale sets Azure Dedicated Host is a service that provides physical servers able to host one or more virtual machines assigned to one Azure subscription. Dedicated hosts are the same physical servers used in our data centers, provided instead as a directly accessible hardware resource. You can provision dedicated hosts within a region, availability zone, and fault domain. 
You can then place VMs directly into your provisioned hosts in whatever con",2024-08-22T17:37:00.000Z,concept-article,,0.4,False,"High-level overview of Azure Dedicated Hosts; summary suggests conceptual description of what the service is and where it can be used, without detailed limits, configuration tables, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/dedicated-hosts-how-to,How-to,Deploy Azure dedicated hosts - Azure Virtual Machines,,Deploy VMs and scale sets to dedicated hosts.,Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Uniform scale sets This article guides you through how to create an Azurededicated hostto host your virtual machines (VMs) and scale set instances.,2024-08-22T17:37:00.000Z,how-to,,0.4,False,How-to deployment guide for dedicated hosts; likely step-by-step portal/CLI instructions without detailed configuration parameter tables or product-specific limits.,unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/delete,Delete a VM and its resources,Delete a VM and attached resources - Azure Virtual Machines,,Learn how to delete a VM and the resources attached to the VM.,"Depending on how you delete a VM, it may only delete the VM resource, not the networking and disk resources. You can change the default settings for what other resources are deleted when you delete a VM.",2026-02-06T08:00:00.000Z,how-to,,0.2,False,"How-to page for deleting a VM and its attached resources; focuses on behavior and UI options, not on numeric limits, configuration parameter tables, error-code-based troubleshooting, or decision matrices. 
Does not meet any sub-skill expert-knowledge criteria.",unchanged -https://learn.microsoft.com/en-us/azure/virtual-machines/deprecated-images,Deprecated images FAQ,Deprecated Azure Marketplace images - Azure Virtual Machines,Understand deprecation rules for Azure VM images,Learn how Marketplace image deprecation can affect your deployment.,This article answers common questions about what happens when Azure Marketplace images are deprecated.,2026-03-26T22:04:00Z,faq,limits-quotas,0.7,True,"The FAQ-style page contains product-specific lifecycle rules and timing details for deprecated Azure Marketplace images (for example, how long images remain deployable after deprecation and what operations are blocked), which are effectively service limits/constraints that are not obvious from general knowledge. These are concrete behavioral limits tied to the platform rather than conceptual guidance.",unchanged -https://learn.microsoft.com/en-us/azure/virtual-machines/disk-bursting,Disk bursting models,Managed disk bursting - Azure Virtual Machines,Use Azure VM managed disk bursting performance limits,Learn about disk bursting for Azure disks and Azure virtual machines.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets Azure offers the ability to boost disk storage IOPS and MB/s performance, this is referred to as bursting for both virtual machines (VM) and disks. You can effectively use VM and disk bursting to achieve better bursting performance on both your VMs and disk. Bursting for Azure VMs and disk resources aren't dependent on each other. You don't need a burst-capable VM for an attached burst-capable disk to burst. Simi",2026-04-15T22:04:00.000Z,concept-article,limits-quotas,0.78,True,"Disk bursting behavior for Azure managed disks and VMs is defined by specific IOPS and MB/s thresholds, burst durations, and eligibility conditions that are product- and SKU-specific. 
This is expert knowledge not reliably known from training and fits the limits-quotas category because it describes concrete performance limits and how bursting is constrained per disk/VM type.",updated +https://learn.microsoft.com/en-us/azure/virtual-machines/deprecated-images,Deprecated images FAQ,Deprecated Azure Marketplace images - Azure Virtual Machines,Understand impact of deprecated Azure Marketplace images,Learn how Marketplace image deprecation can affect your deployment.,This article answers common questions about what happens when Azure Marketplace images are deprecated. Important Image deprecation doesn't affect all images the same way. The impact depends on whether the image requires a Marketplacepurchase plan. Purchase plan informationrefers to Marketplace billing terms that you must accept for certain third-party images. Only images that require a purchase plan are subject to enforcement checks after deprecation. To check whether your image has a purchase p,2026-04-23T06:09:00Z,faq,decision-making,0.7,True,"Provides product-specific behavioral rules about how deprecated Marketplace images affect deployments, especially those with purchase plans, and clarifies enforcement behavior and scenarios. This is specialized decision guidance for choosing and managing images rather than generic concepts, but it doesn't focus on limits, configuration tables, or error-code troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/virtual-machines/disk-bursting,Disk bursting models,Managed disk bursting - Azure Virtual Machines,Use Azure VM managed disk bursting performance limits,Learn about disk bursting for Azure disks and Azure virtual machines.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets Azure offers the ability to boost disk storage IOPS and MB/s performance, this is referred to as bursting for both virtual machines (VM) and disks. 
You can effectively use VM and disk bursting to achieve better bursting performance on both your VMs and disk. Bursting for Azure VMs and disk resources aren't dependent on each other. You don't need a burst-capable VM for an attached burst-capable disk to burst. Simi",2026-04-15T22:04:00.000Z,concept-article,limits-quotas,0.78,True,"Disk bursting behavior for Azure managed disks and VMs is defined by specific IOPS and MB/s thresholds, burst durations, and eligibility conditions that are product- and SKU-specific. This is expert knowledge not reliably known from training and fits the limits-quotas category because it describes concrete performance limits and how bursting is constrained per disk/VM type.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/disk-encryption,Server-side encryption overview,Server-side encryption of Azure managed disks - Azure Virtual Machines,Configure server-side encryption for Azure managed disks,"Azure Storage protects your data by encrypting it at rest before persisting it to Storage clusters. You can use customer-managed keys to manage encryption with your own keys, or you can rely on Micros","Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets Most Azure managed disks are encrypted with Azure Storage encryption, which uses server-side encryption (SSE) to protect your data and to help you meet your organizational security and compliance commitments. Azure Storage encryption automatically encrypts your data stored on Azure managed disks (OS and data disks) at rest by default when persisting it to the cloud. Disks with encryption at host enabled, however,",2026-04-02T08:00:00.000Z,concept-article,security,0.7,True,"Page is focused on disk encryption behavior and options (Microsoft-managed vs customer-managed keys, encryption at host, SSE specifics). 
It likely includes product-specific security configuration details such as which disk types/VM types support which encryption modes, required settings, and key management behaviors. This is security-focused expert knowledge rather than generic concepts, but it does not primarily describe limits/quotas, troubleshooting, or architecture patterns.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/disk-encryption-migrate,Migrate from Azure Disk Encryption to encryption at host,Migrate from Azure Disk Encryption to encryption at host - Azure Virtual Machines,Plan migration from Azure Disk Encryption to encryption at host,Learn how to migrate your virtual machines from Azure Disk Encryption (ADE) to encryption at host,"Important Azure Disk Encryption is scheduled for retirement onSeptember 15, 2028. Until that date, you can continue to use Azure Disk Encryption without disruption. On September 15, 2028, ADE-enabled workloads will continue to run, but encrypted disks will fail to unlock after VM reboots, resulting in service disruption. Useencryption at hostfor new VMs. All ADE-enabled VMs (including backups) must migrate to encryption at host before the retirement date to avoid service disruption. SeeMigrate f",2025-11-17T18:07:00.000Z,how-to,decision-making,0.75,True,Retirement timeline plus guidance to migrate from ADE to encryption at host; includes migration considerations and recommendations for new vs existing workloads.,unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/disk-encryption-overview,Disk encryption overview,Overview of managed disk encryption options - Azure Virtual Machines,Understand encryption options for Azure managed disks,Overview of managed disk encryption options,"There are several types of encryption available for your managed disks, including Azure Disk Encryption (ADE), Server-Side Encryption (SSE), and encryption at host. 
Azure Disk Storage Server-Side Encryption(also referred to as encryption-at-rest or Azure Storage encryption) is always enabled and automatically encrypts data stored on Azure Managed Disks (OS and data disks) when persisting on the Storage Clusters. When configured with a Disk Encryption Set (DES), it supports customer-managed keys ",2025-09-24T05:04:00.000Z,concept-article,security,0.7,True,"Details ADE, SSE, encryption at host, and Disk Encryption Sets; includes product-specific security/encryption configuration options and behaviors.",unchanged @@ -113,7 +113,7 @@ https://learn.microsoft.com/en-us/azure/virtual-machines/disks-deploy-zrs,Deploy https://learn.microsoft.com/en-us/azure/virtual-machines/disks-enable-bursting,Enable on-demand bursting,Enable on-demand disk bursting - Azure Virtual Machines,Enable on-demand bursting for Premium SSD disks,Enable on-demand disk bursting on your managed disk.,"Premium solid-state drives (SSD) have two available bursting models; credit-based bursting and on-demand bursting. This article covers how to switch to on-demand bursting. Disks that use the on-demand model can burst beyond their original provisioned targets. On-demand bursting occurs as often as needed by the workload, up to the maximum burst target. On-demand bursting incurs additional charges. For details on disk bursting, seeManaged disk bursting. 
For the max burst targets on each supported ",2025-01-14T08:00:00.000Z,how-to,configuration,0.75,True,How-to for switching bursting models with specific parameters and supported disk SKUs; configuration of a product-specific performance feature.,unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/disks-enable-customer-managed-keys-portal,Portal,Azure portal - Enable customer-managed keys with SSE - managed disks - Azure Virtual Machines,Enable customer-managed keys for disks in Azure portal,Enable customer-managed keys on your managed disks through the Azure portal.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Azure Disk Storage allows you to manage your own keys when using server-side encryption (SSE) for managed disks, if you choose. For conceptual information on SSE with customer managed keys, and other managed disk encryption types, see theCustomer-managed keyssection of our disk encryption article:Customer-managed keys",2024-08-23T16:58:00.000Z,how-to,security,0.8,True,Portal-based configuration of customer-managed keys for SSE; includes specific key vault and disk encryption set settings and scopes.,unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/disks-enable-double-encryption-at-rest-portal,Enable double encryption at rest,Enable double encryption at rest for managed disks - Azure Virtual Machines,Configure double encryption at rest for managed disks,"Enable double encryption at rest for your managed disk data using the Azure portal, Azure PowerShell module, or Azure CLI.","Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Azure Disk Storage supports double encryption at rest for managed disks. 
For conceptual information on double encryption at rest, and other managed disk encryption types, see theDouble encryption at restsection of our disk encryption article.",2024-09-27T17:00:00.000Z,how-to,security,0.75,True,How-to for enabling double encryption at rest; includes disk and key configuration specific to Azure Disk Storage security.,unchanged -https://learn.microsoft.com/en-us/azure/virtual-machines/disks-enable-host-based-encryption-portal,Portal,Enable end-to-end encryption using encryption at host - Azure portal - managed disks - Azure Virtual Machines,Configure encryption at host for Azure managed disks,Use encryption at host to enable end-to-end encryption on your Azure managed disks - Azure portal.,"Applies to:✔️ Linux VMs ✔️ Windows VMs When you enable encryption at host, data stored on the VM host is encrypted at rest and flows encrypted to the Storage service. For conceptual information on encryption at host, and other managed disk encryption types, see:Encryption at host - End-to-end encryption for your VM data. Temporary disks and ephemeral OS disks are encrypted at rest with platform-managed keys when you enable end-to-end encryption. 
The OS and data disk caches are encrypted at rest ",2026-04-02T17:04:00.000Z,how-to,security,0.7,True,"How-to page for enabling encryption at host via Azure portal; likely includes specific Azure settings, option names, and product-specific security configuration steps beyond generic concepts.",updated +https://learn.microsoft.com/en-us/azure/virtual-machines/disks-enable-host-based-encryption-portal,Portal,Enable end-to-end encryption using encryption at host - Azure portal - managed disks - Azure Virtual Machines,Configure encryption at host for Azure managed disks,Use encryption at host to enable end-to-end encryption on your Azure managed disks - Azure portal.,"Applies to:✔️ Linux VMs ✔️ Windows VMs When you enable encryption at host, data stored on the VM host is encrypted at rest and flows encrypted to the Storage service. For conceptual information on encryption at host, and other managed disk encryption types, see:Encryption at host - End-to-end encryption for your VM data. Temporary disks and ephemeral OS disks are encrypted at rest with platform-managed keys when you enable end-to-end encryption. 
The OS and data disk caches are encrypted at rest ",2026-04-02T17:04:00.000Z,how-to,security,0.7,True,"How-to page for enabling encryption at host via Azure portal; likely includes specific Azure settings, option names, and product-specific security configuration steps beyond generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/disks-enable-performance,Increase performance of Premium SSD and Standard SSD/HDDs,Increase performance of Premium SSDs and Standard SSD/HDDs - Azure Virtual Machines,Enable performance plus for Azure SSD and HDD disks,Increase the performance of Azure Premium SSDs and Standard SSD/HDDs using performance plus.,"The Input/Output Operations Per Second (IOPS) and throughput limits for Azure Premium solid-state drives (SSD), Standard SSDs, and Standard hard disk drives (HDD) that are 513 GiB and larger can be increased by enabling performance plus. Enabling performance plus improves the experience for workloads that require high IOPS and throughput, such as database and transactional workloads. There's no extra charge for enabling performance plus on a disk. Once enabled, the IOPS and throughput limits for",2025-04-15T08:00:00.000Z,how-to,limits-quotas,0.85,True,"Describes IOPS/throughput limits before and after enabling performance plus, with specific size thresholds (≥513 GiB) and new limits—clear numeric limits/quotas.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/disks-enable-private-links-for-import-export-portal,Configure private links for disks - Portal,Azure portal - Restrict import/export access to managed disks - Azure Virtual Machines,Enable Private Link for managed disk import/export in portal,Enable Private Link for your managed disks with Azure portal. 
This allows you to securely export and import disks within your virtual network.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets In this article, you create a disk access resource and use private endpoints to restrict the export and import of managed disks over a private link from clients on your Azure virtual network. This configuration ensures that import/export operations on disks with this configuration occurs within your Azure virtual network. Following the steps in this article only affects the import and export of your disks, it doesn't",2025-11-20T06:02:00.000Z,how-to,security,0.78,True,Portal-based configuration of disk access resources and private endpoints to restrict import/export to a virtual network is a concrete security configuration.,unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/disks-enable-ultra-ssd,Deploy an Ultra Disk,Ultra Disks for VMs - Azure Managed Disks - Azure Virtual Machines,Configure and deploy Azure Ultra Disks for VMs,Learn about Ultra Disks for Azure VMs,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets This article explains how to deploy and use an Ultra Disk, for conceptual information about Ultra Disks, refer to What disk types are available in Azure? Azure Ultra Disks offer high throughput, high IOPS, and consistent low latency disk storage for Azure IaaS virtual machines (VMs). This new offering provides top of the line performance at the same availability levels as our existing disks offerings. 
One major be",2025-07-02T17:03:00.000Z,how-to,configuration,0.7,True,"How-to for deploying Ultra Disks; full article includes disk performance configuration parameters (IOPS, throughput) and constraints unique to Ultra Disks.",unchanged @@ -223,7 +223,7 @@ https://learn.microsoft.com/en-us/azure/virtual-machines/extensions/vmaccess-win https://learn.microsoft.com/en-us/azure/virtual-machines/extensions/vmsnapshot-linux,Linux,VM Snapshot Linux extension for Azure Backup - Azure Virtual Machines,Configure VM Snapshot Linux extension for Azure Backup,Take application consistent backup of the virtual machine from Azure Backup using VM snapshot Linux extension.,Azure Backup provides support for backing up workloads from on-premises to cloud and backing up cloud resources to Recovery Services vault. Azure Backup uses VM snapshot extension to take an application consistent backup of the Azure virtual machine without the need to shutdown the VM. VM Snapshot Linux extension is published and supported by Microsoft as part of Azure Backup service. Azure Backup will install the extension as part of first scheduled backup triggered post enabling backup. This d,2026-02-27T06:03:00.000Z,concept-article,configuration,0.7,True,"Backup snapshot extension docs usually include extension name, publisher, version, and configuration parameters required for application-consistent backups.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/extensions/vmsnapshot-windows,Windows,VM Snapshot Windows extension for Azure Backup - Azure Virtual Machines,Configure VM Snapshot Windows extension for Azure Backup,Take application consistent backup of the virtual machine from Azure Backup using VM snapshot extension,Azure Backup provides support for backing up workloads from on-premises to cloud and backing up cloud resources to Recovery Services vault. 
Azure Backup uses VM snapshot extension to take an application consistent backup of the Azure virtual machine without the need to shutdown the VM. VM Snapshot extension is published and supported by Microsoft as part of Azure Backup service. Azure Backup will install the extension as part of first scheduled backup triggered post enabling backup. This documen,2026-02-27T06:03:00.000Z,concept-article,configuration,0.7,True,"Similar to Linux snapshot extension, contains extension-specific configuration details and parameters for Windows VMs.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/faq-for-disks,Disks FAQs,Frequently asked questions about disks - Azure Virtual Machines,Azure managed disk and Premium SSD limits FAQ,Frequently asked questions about Azure IaaS Linux VM disks and premium disks (managed and unmanaged),This article answers some frequently asked questions about Azure managed disks and Azure Premium SSDs.,2026-04-02T17:04:00Z,faq,limits-quotas,0.7,True,"An FAQ for Azure managed disks and Premium SSDs typically includes concrete, product-specific details such as maximum disk sizes, IOPS and throughput caps per disk and per VM, snapshot and disk count limits, and other numeric constraints. These are exact limits and quotas that change over time and are not reliably known from training data, matching the limits-quotas category.",unchanged -https://learn.microsoft.com/en-us/azure/virtual-machines/field-programmable-gate-arrays-attestation,FPGA Attestation Service,Azure FPGA Attestation Service - Azure Virtual Machines,,Attestation service for the NP-series VMs.,Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets The FPGA Attestation service performs a series of validations on a design checkpoint file (called a “netlist”) generated by the Xilinx toolset and produces a file that contains the validated image (called a “bitstream”) that can be loaded onto the Xilinx U250 FPGA card in an NP series VM. 
Note,2026-04-14T17:03:00.000Z,concept-article,,0.3,False,"Describes the FPGA attestation process conceptually; no clear evidence of numeric limits, configuration tables, or error-code-based troubleshooting from the summary.",updated +https://learn.microsoft.com/en-us/azure/virtual-machines/field-programmable-gate-arrays-attestation,FPGA Attestation Service,Azure FPGA Attestation Service - Azure Virtual Machines,,Attestation service for the NP-series VMs.,Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets The FPGA Attestation service performs a series of validations on a design checkpoint file (called a “netlist”) generated by the Xilinx toolset and produces a file that contains the validated image (called a “bitstream”) that can be loaded onto the Xilinx U250 FPGA card in an NP series VM. Note,2026-04-14T17:03:00.000Z,concept-article,,0.3,False,"Describes the FPGA attestation process conceptually; no clear evidence of numeric limits, configuration tables, or error-code-based troubleshooting from the summary.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/flash-azure-monitor,Use Azure Monitor,Project Flash - Use Azure Monitor to monitor Azure Virtual Machine availability - Azure Virtual Machines,Monitor Azure VM availability with Project Flash via Azure Monitor metric,This article covers important concepts for monitoring Azure virtual machine availability using the Azure Monitor VM availability metric.,"Azure Monitor is one solution offered by Flash. Flash is the internal name for a project dedicated to building a robust, reliable, and rapid mechanism for customers to monitor virtual machine (VM) health. This article covers the use of the Azure Monitor VM availability metric to monitor Azure Virtual Machine availability. For a general overview of Flash solutions, see the Flash overview. 
For documentation specific to the other solutions offered by Flash, choose from the following articles:",2024-10-18T21:59:00.000Z,concept-article,integrations,0.7,True,"Uses a specific VM availability metric in Azure Monitor; includes metric name, dimensions, and usage patterns.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/flash-azure-resource-graph,Use Azure Resource Graph,Project Flash - Use Azure Resource Graph to monitor Azure Virtual Machine availability - Azure Virtual Machines,Monitor Azure VM availability with Project Flash via Azure Resource Graph,This article covers important concepts for monitoring Azure virtual machine availability using Azure Resource Graph.,"Azure Resource Graph is one solution offered by Flash. Flash is the internal name for a project dedicated to building a robust, reliable, and rapid mechanism for customers to monitor virtual machine (VM) health. This article covers the use of Azure Resource Graph to monitor Azure Virtual Machine availability. For a general overview of Flash solutions, see the Flash overview. For documentation specific to the other solutions offered by Flash, choose from the following articles:",2024-08-22T17:37:00.000Z,concept-article,integrations,0.7,True,Project Flash-specific use of ARG for VM availability; likely includes KQL patterns and resource schemas unique to this solution.,unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/flash-azure-resource-health,Use Azure Resource Health,Project Flash - Use Azure Resource Health to monitor Azure Virtual Machine availability - Azure Virtual Machines,Monitor Azure VM availability with Project Flash via Resource Health,This article covers important concepts for monitoring Azure virtual machine availability using Azure Resource Health.,"Azure Resource Health is one solution offered by Flash. 
Flash is the internal name for a project dedicated to building a robust, reliable, and rapid mechanism for customers to monitor virtual machine (VM) health. This article covers the use of Azure Resource Health to monitor Azure Virtual Machine availability. For a general overview of Flash solutions, see the Flash overview. For documentation specific to the other solutions offered by Flash, choose from the following articles:",2024-08-22T17:37:00.000Z,concept-article,integrations,0.7,True,Describes how Resource Health is used for VM availability in Flash; includes status codes and event semantics.,unchanged @@ -233,7 +233,7 @@ https://learn.microsoft.com/en-us/azure/virtual-machines/generalize,Generalizing https://learn.microsoft.com/en-us/azure/virtual-machines/generation-2,Generation 2 VMs,Azure support for Generation 2 VMs - Azure Virtual Machines,Decide between Azure Generation 1 and 2 VMs,Overview of Azure support for Generation 2 VMs,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets Support for Generation 2 virtual machines (VMs) is now available on Azure. You can also upgrade existing Generation 1 virtual machines to Generation 2 with Trusted launch. Review the considerations on this page and the upgrade guidance before choosing or upgrading a generation. Generation 2 VMs support key features that aren't supported in Generation 1 VMs. 
These features include increased memory, Intel Software Gu",2025-12-05T18:04:00.000Z,how-to,decision-making,0.65,True,"Outlines feature differences and considerations for choosing or upgrading to Gen2, including support for specific capabilities, guiding generation selection decisions.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/hb-hc-known-issues,Known issues with HB-series and N-series VMs,Troubleshooting known issues with HPC and GPU VMs - Azure Virtual Machines - Azure Virtual Machines,Troubleshoot common issues on Azure HPC and GPU VMs,Learn about troubleshooting known issues with HPC and GPU virtual machine (VM) sizes in Azure.,Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets This article attempts to list recent common issues and their solutions when using the HB-series and N-series HPC and GPU VMs.,2025-09-22T17:03:00.000Z,troubleshooting,troubleshooting,0.86,True,Explicitly a known-issues troubleshooting article; such pages map specific symptoms and error conditions on HB/N-series to causes and resolutions.,unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/hbv2-performance,Performance,HBv2-series VM size performance - Azure Virtual Machines,HBv2 VM performance expectations and tuning guidance,Learn about performance testing results for HBv2-series VM sizes in Azure.,Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets Performance testing was conducted across multiple HBv2-series size VMs. 
The following table summarizes key findings from these tests.,2026-02-11T06:03:00.000Z,concept-article,best-practices,0.76,True,"Performance results with microbenchmarks and likely tuning recommendations specific to HBv2, including quantified performance expectations that guide how to configure workloads.",unchanged -https://learn.microsoft.com/en-us/azure/virtual-machines/hbv2-series-overview,Overview,HBv2-series virtual machine (VM) overview - Azure Virtual Machines - Azure Virtual Machines,,Learn about the HBv2-series VM size in Azure.,"Important HBv2-series VMs are scheduled for retirement on May 31, 2027.This applies to all HBv2 sizes: Standard_HB120rs_v2, Standard_HB120-96rs_v2, Standard_HB120-64rs_v2, Standard_HB120-32rs_v2, and Standard_HB120-16rs_v2. After this date, HBv2 VMs will be set to a deallocated state, stop working, stop incurring billing charges, and lose SLA and support. Plan your migration to current-generation HPC alternatives before the retirement date. Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scal",2026-04-14T17:03:00.000Z,concept-article,,0.4,False,"Appears to be an overview and retirement notice for HBv2-series VMs; while it mentions a retirement date, it does not clearly expose detailed numeric limits, configuration parameters, or decision matrices from the summary.",updated +https://learn.microsoft.com/en-us/azure/virtual-machines/hbv2-series-overview,Overview,HBv2-series virtual machine (VM) overview - Azure Virtual Machines - Azure Virtual Machines,,Learn about the HBv2-series VM size in Azure.,"Important HBv2-series VMs are scheduled for retirement on May 31, 2027.This applies to all HBv2 sizes: Standard_HB120rs_v2, Standard_HB120-96rs_v2, Standard_HB120-64rs_v2, Standard_HB120-32rs_v2, and Standard_HB120-16rs_v2. After this date, HBv2 VMs will be set to a deallocated state, stop working, stop incurring billing charges, and lose SLA and support. 
Plan your migration to current-generation HPC alternatives before the retirement date. Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scal",2026-04-14T17:03:00.000Z,concept-article,,0.4,False,"Appears to be an overview and retirement notice for HBv2-series VMs; while it mentions a retirement date, it does not clearly expose detailed numeric limits, configuration parameters, or decision matrices from the summary.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/hbv3-performance,Performance,HBv3-series VM sizes performance and scalability - Azure Virtual Machines,HBv3 VM performance and scalability guidance,Learn about performance and scalability of HBv3-series VM sizes in Azure.,Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets Performance expectations using common HPC microbenchmarks are as follows:,2026-02-11T06:03:00.000Z,concept-article,best-practices,0.76,True,"Summarizes performance expectations from microbenchmarks, giving quantified guidance on how HBv3 scales, which informs best practices for workload sizing and layout.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/hbv3-series-overview,Overview,"HBv3-series virtual machine (VM) overview, architecture, topology - Azure Virtual Machines - Azure Virtual Machines",HBv3 VM architecture and NUMA-aware placement,Learn about the HBv3-series VM size in Azure.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets An HBv3-series server features 2 * 64-core EPYC 7V73X CPUs for a total of 128 physical ""Zen3"" cores with AMD 3D V-Cache. Simultaneous Multithreading (SMT) is disabled on HBv3. These 128 cores are divided into 16 sections (8 per socket), each section containing 8 processor cores with uniform access to a 96 MB L3 cache. 
Azure HBv3 servers also run the following AMD BIOS settings: As a result, the server boots with 4 ",2026-02-11T06:03:00.000Z,concept-article,architecture-patterns,0.78,True,"Details HBv3 CPU topology, cache layout, and BIOS settings with guidance on how this affects process placement and performance, a product-specific architecture pattern.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/hbv4-performance,Performance,HBv4-series VM sizes performance and scalability - Azure Virtual Machines,HBv4 VM performance and scalability expectations,Learn about performance and scalability of HBv4-series VM sizes in Azure.,Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets Performance expectations using common HPC microbenchmarks are as follows:,2026-02-10T08:00:00.000Z,concept-article,best-practices,0.76,True,"Provides quantified performance expectations from microbenchmarks, guiding how to best use HBv4 for HPC workloads.",unchanged @@ -311,7 +311,7 @@ https://learn.microsoft.com/en-us/azure/virtual-machines/linux/disks-format-moun https://learn.microsoft.com/en-us/azure/virtual-machines/linux/disks-format-mount-temp-disks-linux,Temporary disks,Format and mount temporary disks on Azure Linux VMs - Azure Virtual Machines,,Learn to format and mount temporary disks (also known as resource disks) on Azure Linux VMs with both SCSI and NVMe interfaces,"Applies to:✔️ Linux VMs ✔️ Flexible scale sets This article covers how to format and mount temporary disks (also known as resource disks) on Azure Linux virtual machines (VMs). Depending on your VM series, temporary disks use either SCSI or NVMe interfaces. Temporary disks aren't managed disks, and aren't persistent. Store important data on managed disks instead of local temporary disks. 
Temporary disks are generally meant to store items like page files, swap files, or SQL Server tempdb files.",2025-09-18T22:04:00.000Z,how-to,,0.4,False,"Tutorial on formatting/mounting temporary disks; no detailed limits, config matrices, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/linux/disks-upload-vhd-to-managed-disk-cli,Upload a vhd to a disk - CLI,Upload a VHD to Azure or copy a disk across regions - Azure CLI - Azure Virtual Machines,Upload or copy VHDs to managed disks with Azure CLI,"Learn how to upload a VHD to an Azure Managed Disk and copy a managed disk across regions, using the Azure CLI, via direct upload.","Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets This article explains how to either upload a VHD from your local machine to an Azure Managed Disk or copy a managed disk to another region, using AzCopy. This process, direct upload, enables you to upload a VHD up to 32 TiB in size directly into a managed disk. Currently, direct upload is supported for Ultra Disks, Premium SSD v2, Premium SSD, Standard SSD, and Standard HDD. If you're providing a backup solution for IaaS VMs in Azure,",2026-02-19T23:04:00.000Z,how-to,limits-quotas,0.82,True,"Defines direct upload limit of 32 TiB and enumerates supported disk types (Ultra, Premium v2, etc.), which are SKU-specific capability limits.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/linux/download-vhd,Linux,Download a Linux VHD from Azure - Azure Virtual Machines,Download Linux VHDs from Azure using CLI and portal,Download a Linux VHD using the Azure CLI and the Azure portal.,"Applies to:✔️ Linux VMs ✔️ Flexible scale sets This article describes how to download a Linux virtual hard disk (VHD) file from Azure. To download a VHD, the disk can't be attached to a running VM, which means the VM experiences downtime. 
Some configurations can safely avoid downtime by snapshotting the disk and downloading the VHD from the snapshot. If you're using Microsoft Entra ID to control resource access, you can use it to restrict uploading of Azure Managed Disks. For more information, seeSe",2026-02-19T23:04:00.000Z,how-to,integrations,0.7,True,Describes specific CLI/portal steps and mentions Entra ID-based restrictions; includes product-specific commands and security integration behavior.,unchanged -https://learn.microsoft.com/en-us/azure/virtual-machines/linux/endorsed-distros,Endorsed distributions,Linux distributions endorsed on Azure - Azure Virtual Machines,Choose endorsed Linux distributions and image sources,"Learn about Linux on Azure-endorsed distributions, including information about Ubuntu, Rocky, Alma, Oracle, Flatcar, Debian, Red Hat, and SUSE.","Applies to:✔️ Linux VMs ✔️ Flexible scale sets ✔️ Uniform scale sets In this article, we'll cover the following: There are several different sources of Linux Virtual Machine (VM) images for Azure. Each source provides a different expectation for quality, utility, and support. 
This document summarizes each source (marketplace images, platform images, custom images, and community gallery images) and gives you more details about platform images, which are images provided in partnership between Micr",2025-02-21T23:01:00.000Z,concept-article,decision-making,0.7,True,"Summarizes image sources and endorsed distros, with implications for SLA and support; helps decide which distro/source to use.",unchanged +https://learn.microsoft.com/en-us/azure/virtual-machines/linux/endorsed-distros,Endorsed distributions,Linux distributions endorsed on Azure - Azure Virtual Machines,,"Learn about Linux on Azure-endorsed distributions, including information about Ubuntu, Rocky, Alma, Oracle, Flatcar, Debian, Red Hat, and SUSE.","Applies to:✔️ Linux VMs ✔️ Flexible scale sets ✔️ Uniform scale sets In this article, we'll cover the following: There are several different sources of Linux Virtual Machine (VM) images for Azure. Each source provides a different expectation for quality, utility, and support. This document summarizes each source (marketplace images, platform images, custom images, and community gallery images) and gives you more details about platform images, which are images provided in partnership between Micr",2026-04-20T17:03:00.000Z,concept-article,,0.2,False,"Primarily a conceptual/overview page describing Linux image sources and endorsed distributions on Azure. 
It summarizes image types and partnerships but does not present concrete limits, configuration parameters, error codes, or detailed decision matrices with quantified trade-offs.",updated https://learn.microsoft.com/en-us/azure/virtual-machines/linux/expand-disks,Linux,Expand Virtual Hard Disks on a Linux VM - Azure Virtual Machines,Expand Linux VM OS and data disk sizes in Azure,Learn how to expand virtual hard disks on a Linux VM with the Azure CLI.,"Applies to:✔️ Linux VMs ✔️ Flexible scale sets This article covers expanding operating system (OS) disks and data disks for a Linux virtual machine (VM). You canadd data disksto provide more storage space, and you can also expand an existing data disk. The default virtual hard disk size for the OS is typically 30 GB on a Linux VM in Azure. This article covers expanding either OS disks or data disks. An OS disk has a maximum capacity of 4,095 GiB. However, many operating systems are partitioned w",2025-06-19T05:05:00.000Z,how-to,limits-quotas,0.7,True,"Includes specific default OS disk size (30 GB) and maximum OS disk capacity (4,095 GiB), which are concrete product limits not generally known; rest is procedural.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/linux/faq,Linux,Frequently asked questions for Linux VMs in Azure - Azure Virtual Machines,Operational limits and behaviors for Azure Linux VMs,Provides answers to some of the common questions about Linux virtual machines created with the Resource Manager model.,"This article addresses some common questions about Linux virtual machines created in Azure using the Resource Manager deployment model. 
For the Windows version of this topic, see Frequently asked question about Windows Virtual Machines",2026-04-02T17:04:00Z,faq,limits-quotas,0.7,True,"Linux VM FAQ pages typically include concrete, product-specific details such as maximum data disk counts, NIC limits, supported OS versions, password/username constraints, and other numeric or tightly scoped behavioral limits that are not purely conceptual. These align with limits-quotas more than generic FAQ content.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/linux/find-unattached-disks,CLI,Azure CLI - Find and delete unattached managed and unmanaged disks - Azure Virtual Machines,,How to find and delete unattached Azure managed and unmanaged (VHDs/page blobs) disks by using Azure CLI.,"Applies to:✔️ Linux VMs ✔️ Flexible scale sets When you delete a virtual machine (VM) in Azure, by default, any disks that are attached to the VM aren't deleted. This feature helps to prevent data loss due to the unintentional deletion of VMs. After a VM is deleted, you will continue to pay for unattached disks. This article shows you how to find and delete any unattached disks and reduce unnecessary costs. Note Use the az disk show command to get the LastOwnershipUpdateTime for any disk. This prope",2026-02-18T23:03:00.000Z,how-to,,0.45,False,"Shows how to find/delete unattached disks with az CLI; some mention of LastOwnershipUpdateTime but mainly scripting steps, not structured config/limits/troubleshooting.",unchanged @@ -414,8 +414,8 @@ https://learn.microsoft.com/en-us/azure/virtual-machines/migration/dedicated-hos The main differences between the retiring Dedicated Host SKUs and the newly recommended Dedicated Host SKUs are: Review the FAQs before you get started on migration. 
The next section will go over which Dedicated Host SKUs to migrate to help aid in migration planning and execution.",2025-03-17T22:00:00.000Z,how-to,decision-making,0.75,True,Compares retiring and recommended Dedicated Host SKUs and guides migration planning; decision-making around host SKU selection.,unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/migration/migrate-from-elastic-compute-cloud-architecture,Migrate Amazon EC2 instances to Azure,Migrate from Amazon Elastic Compute Cloud (Amazon EC2) to Azure Virtual Machines - Azure Virtual Machines,Plan migration from AWS EC2 to Azure Virtual Machines,"Learn to migrate from Amazon Elastic Compute Cloud (Amazon EC2) to Azure Virtual Machines with step-by-step guidance, feature mapping, and validation strategies.","If you use Amazon Elastic Compute Cloud (Amazon EC2) and plan to migrate your workload to Azure, this guide can help you understand the migration process, feature mappings, and best practices. It's for Amazon Web Services (AWS) professionals who are familiar with Amazon EC2 and plan to move workloads to Azure Virtual Machines. The guide highlights key similarities and differences between the platforms and outlines important architectural considerations. 
It also provides best practices for perfor",2026-02-25T23:16:00.000Z,how-to,decision-making,0.8,True,"Includes feature mapping, architectural considerations, and best practices; helps choose equivalent services and patterns with trade-offs.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/migration/migration-managed-image-to-compute-gallery,Migrate Managed image to Compute gallery,Migrate Managed image to Compute gallery - Azure Virtual Machines,Migrate legacy managed images to Azure Compute Gallery,Learn how to legacy Managed image to image version in Azure compute gallery.,"Applies to:✔️ Linux Virtual Machine ✔️ Windows Virtual Machine ✔️ Virtual Machine Flex Scale Sets Managed images is legacy method to generalize and capture Virtual Machine image. For the most current technology, customers are encouraged to use Azure compute gallery. All new features, like Arm64, Trusted launch, and Confidential Virtual Machine are only supported through Azure compute gallery. If you have an existing managed image, you can use it as a source and create an Azure compute gallery imag",2025-08-24T08:00:00.000Z,how-to,decision-making,0.7,True,"Migration-focused article comparing legacy managed images vs Compute Gallery, with guidance on when and how to move; includes decision and migration considerations.",unchanged
It also surfaces critical information for HPC HBv2-series VMs undergoing retirement; specialized workload validation is recommended when migrating HPC workloads. This guide covers: By migrating to",2026-04-14T17:03:00.000Z,concept-article,decision-making,0.68,True,"A migration guide for retired VM size series typically includes concrete mappings from old SKUs to recommended new SKUs, with cost/performance trade-offs and scenario-based guidance. This is expert, product-specific decision guidance on which VM series to move to, fitting the decision-making category.",updated -https://learn.microsoft.com/en-us/azure/virtual-machines/migration/sizes/n-series-migration,GPU compute migration guide,Migration Guide for GPU Compute Workloads in Azure - Azure Virtual Machines,Migrate NC and ND GPU compute workloads to newer sizes,"NC, ND, NCv2-series migration guide.","As more powerful GPUs become available in the marketplace and in Microsoft Azure datacenters, we recommend re-assessing the performance of your workloads and considering migrating to newer GPUs. For the same reason, as well as to maintain a high-quality and reliable service offering, Azure periodically retires the hardware that powers older VM sizes. 
The first group of GPU products to be retired in Azure are the original NC, NC v2 and ND-series VMs, powered by NVIDIA Tesla K80, P100, and P40 dat",2025-03-17T22:00:00.000Z,concept-article,decision-making,0.82,True,"Migration guide for NC/NCv2/ND with concrete recommendations on newer GPU SKUs and rationale (performance, retirement), directly supporting technology selection decisions.",unchanged +https://learn.microsoft.com/en-us/azure/virtual-machines/migration/sizes/d-ds-dv2-dsv2-ls-series-migration-guide,General-purpose sizes,Retired VM Sizes Migration Guide - Azure Virtual Machines,Choose replacement sizes for retired Azure VM series,Migration guide for retired VM size series,"This migration guide is designed for users of Azure virtual machines (VMs) scheduled for retirement. This guide helps you transition to the latest VM series, helping you minimize disruptions while optimizing cost and performance. The guide covers General Purpose, Storage Optimized, and other VM series. It also surfaces critical information for HPC HBv2-series VMs undergoing retirement; specialized workload validation is recommended when migrating HPC workloads. This guide covers: By migrating to",2026-04-21T06:04:00.000Z,concept-article,decision-making,0.74,True,"A migration guide for retired VM size series typically includes SKU-by-SKU mapping, comparison tables, and guidance on which new VM series to select based on workload, cost, and performance. 
This is product-specific decision guidance with concrete recommendations for different scenarios (general purpose, storage optimized, HPC HBv2), fitting the decision-making category.",updated +https://learn.microsoft.com/en-us/azure/virtual-machines/migration/sizes/n-series-migration,GPU compute migration guide,Migration Guide for GPU Compute Workloads in Azure - Azure Virtual Machines,Plan migration for Azure N-series GPU workloads,"NC, ND, NCv2, NP-series migration guide.","As more powerful GPUs become available in the marketplace and in Microsoft Azure datacenters, we recommend re-assessing the performance of your workloads and considering migrating to newer GPUs. For the same reason, as well as to maintain a high-quality and reliable service offering, Azure periodically retires the hardware that powers older VM sizes. The first group of GPU products to be retired in Azure are the original NC, NC v2 and ND-series VMs, powered by NVIDIA Tesla K80, P100, and P40 dat",2026-04-21T06:04:00.000Z,concept-article,decision-making,0.7,True,"The migration guide discusses retiring NC/ND/NP-series SKUs and recommends newer GPU SKUs, including product-specific guidance on when and how to move workloads. It contains Azure-specific migration and selection recommendations between GPU VM families, which are decision criteria and trade-offs unique to this product, fitting decision-making.",updated https://learn.microsoft.com/en-us/azure/virtual-machines/migration/sizes/nv-series-migration-guide,NV series migration guide,NV series migration guide - Azure Virtual Machines,Migrate workloads from legacy NV GPU VMs,NV series migration guide,"Important The NV-series is retired as of September 6th, 2023, and is no longer available. Please refer to theNV-series retirement pagefor more information. As more powerful GPU VM sizes become available in Azure datacenters, assess your workloads and migrate virtual machines (VMs) in the NV and NV_Promo series. 
These legacy VMs can be migrated into new VM series, such as NVsv3 and NVasv4, for better performance with reduced cost. The NVsv3 VM series is powered by Nvidia M60 GPUs. The NVasv4 seri",2025-07-11T05:03:00.000Z,concept-article,decision-making,0.82,True,"Explicit migration guide comparing legacy NV/NV_Promo to NVsv3 and NVasv4 with performance and cost considerations, providing concrete recommendations for which series to choose.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-machines/migration/sizes/nv-series-migration-guide,NV-series migration guide,NV series migration guide - Azure Virtual Machines,Migrate workloads from retired NV-series GPU VMs,NV series migration guide,"Important The NV-series is retired as of September 6th, 2023, and is no longer available. Please refer to the NV-series retirement page for more information. As more powerful GPU VM sizes become available in Azure datacenters, assess your workloads and migrate virtual machines (VMs) in the NV and NV_Promo series. These legacy VMs can be migrated into new VM series, such as NVsv3 and NVasv4, for better performance with reduced cost. The NVsv3 VM series is powered by Nvidia M60 GPUs. 
The NVasv4 seri",2025-07-11T05:03:00.000Z,concept-article,decision-making,0.8,True,Explains migration from NV/NV_Promo to newer GPU series with performance and cost implications; SKU selection guidance.,unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/mitigate-se,Mitigating speculative execution,Guidance for mitigating silicon based micro-architectural and speculative execution side-channel vulnerabilities - Azure Virtual Machines,Mitigate speculative execution side-channel vulnerabilities on Azure VMs,Learn more about Guidance for mitigating silicon based micro-architectural and speculative execution side-channel vulnerabilities in Azure.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets This article provides guidance for a new class of silicon based micro-architectural and speculative execution side-channel vulnerabilities that affect many modern processors and operating systems. This includes Intel, AMD, and ARM. Specific details for these silicon-based vulnerabilities can be found in the following security advisories and CVEs: The disclosure of these CPU vulnerabilities has resulted in questio",2024-08-22T17:37:00.000Z,concept-article,security,0.65,True,"Provides Azure-specific guidance for mitigating CPU side-channel vulnerabilities, including configuration and patching recommendations for VM OS and hypervisor; clearly security-focused and product-specific.",unchanged @@ -445,7 +445,7 @@ include software licensing co",2024-08-22T17:37:00.000Z,concept-article,,0.5,Fal https://learn.microsoft.com/en-us/azure/virtual-machines/prepay-reserved-vm-instances,Prepay for VMs,Prepay for Azure virtual machines to save money - Azure Virtual Machines,Choose and purchase Azure Reserved VM Instances,Learn how to buy Azure Reserved Virtual Machine Instances to save on your compute costs.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets When you commit to an Azure reserved VM instance you 
can save money. The reservation discount is applied automatically to the number of running virtual machines that match the reservation scope and attributes. You don't need to assign a reservation to a virtual machine to get the discounts. A reserved instance purchase covers only the compute part of your VM usage. For Windows VMs, the usage meter is split into t",2026-04-10T06:03:00.000Z,concept-article,decision-making,0.7,True,"Page is focused on how to buy and use Azure Reserved VM Instances to save on compute costs, including product-specific purchasing and discount application behavior that affects cost decisions between pay-as-you-go and reservations. This is concrete, SKU-specific decision guidance rather than generic concepts, but it does not primarily list numeric limits/quotas or configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/quotas,vCPU quotas,vCPU quotas - Azure Virtual Machines,Understand and manage Azure VM vCPU quotas by region,Check your vCPU quotas for Azure virtual-machines.,"Applies to:✔️ Linux VMs ✔️ Flexible scale sets ✔️ Uniform scale sets The vCPU quotas for virtual machines and scale sets are arranged in two tiers for each subscription, in each region. The first tier is the Total Regional vCPUs, and the second tier is the various VM size family cores such as the D-series vCPUs. Anytime a new VM is deployed the vCPUs for the VM must not exceed the vCPU quota for the VM size family or the total regional vCPU quota. If you exceed either of those quotas, the VM dep",2026-04-10T06:03:00.000Z,how-to,limits-quotas,0.92,True,"This page is specifically about vCPU quotas for Azure Virtual Machines and scale sets. It describes how quotas are structured (total regional vCPUs vs. VM size family vCPUs) and how deployments are constrained by these quotas. 
Such quota documentation typically includes concrete numeric limits, regional/tier distinctions, and quota behavior that are not inferable from general knowledge, fitting the limits-quotas category.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-machines/regions,Azure Regions,Azure regions - Azure Virtual Machines,,Learn about the regions for running virtual machines in Azure.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets It is important to understand how and where your virtual machines (VMs) operate in Azure, along with your options to maximize performance, availability, and redundancy. This article provides you with an overview of the availability and redundancy features of Azure.",2024-08-22T17:37:00.000Z,concept-article,,0.2,False,"Overview of Azure regions and availability; largely conceptual without numeric limits, config tables, or decision matrices.",unchanged
-https://learn.microsoft.com/en-us/azure/virtual-machines/reserved-vm-instance-size-flexibility,VM instance size flexibility,Virtual machine size flexibility - Azure Reserved VM Instances - Azure Virtual Machines,Choose instance size flexibility for Azure Reserved VMs,Learn what size series a reservation discount applies to when you buy a reserved VM instance.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets When you buy a Reserved VM Instance, you can choose to optimize for instance size flexibility or capacity priority. For more information about setting or changing the optimize setting for reserved VM instances, see Change the optimize setting for reserved VM instances. With a reserved virtual machine instance that's optimized for instance size flexibility, the reservation you buy can apply to the virtual machines ",2026-04-13T17:04:00.000Z,concept-article,decision-making,0.7,True,"The page explains how Reserved VM Instance size flexibility works across VM size series and how the reservation discount is applied, providing product-specific decision guidance on when to optimize for instance size flexibility versus capacity priority. This is concrete, SKU-specific behavior that affects cost and capacity planning and is not just conceptual marketing content.",updated
+https://learn.microsoft.com/en-us/azure/virtual-machines/reserved-vm-instance-size-flexibility,VM instance size flexibility,Virtual machine size flexibility - Azure Reserved VM Instances - Azure Virtual Machines,Choose instance size flexibility for Azure Reserved VMs,Learn what size series a reservation discount applies to when you buy a reserved VM instance.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets When you buy a Reserved VM Instance, you can choose to optimize for instance size flexibility or capacity priority. For more information about setting or changing the optimize setting for reserved VM instances, see Change the optimize setting for reserved VM instances. With a reserved virtual machine instance that's optimized for instance size flexibility, the reservation you buy can apply to the virtual machines ",2026-04-13T17:04:00.000Z,concept-article,decision-making,0.7,True,"The page explains how Reserved VM Instance size flexibility works across VM size series and how the reservation discount is applied, providing product-specific decision guidance on when to optimize for instance size flexibility versus capacity priority. 
This is concrete, SKU-specific behavior that affects cost and capacity planning and is not just conceptual marketing content.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-machines/resource-graph-availability,Availability with Resource Graph,Virtual Machine (VM) availability information in Azure Resource Graph - Azure Virtual Machines,Query Azure VM availability data using Azure Resource Graph,"Query your Azure resources at scale with complex filtering, grouping, and sorting with Azure Resource Graph (ARG).","Azure Resource Graph is an Azure service that allows you to use the same Kusto Query Language (KQL) query language used in log queries to query your Azure resources at scale with complex filtering, grouping, and sorting by resource properties. You can use VM health annotations to Azure Resource Graph (ARG) for detailed failure attribution and downtime analysis including: To get started with Resource Graph, open Resource Graph Explorer in the Azure portal. Select the Table tab and have a look at the micr",2024-08-22T17:37:00.000Z,concept-article,integrations,0.7,True,"Describes VM health annotations in ARG, including table names and fields; product-specific KQL integration surface.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-machines/resource-graph-samples,Azure Resource Graph queries,Azure Resource Graph sample queries for Azure Virtual Machines - Azure Virtual Machines,Query Azure VM resources with Azure Resource Graph,Sample Azure Resource Graph queries for Azure Virtual Machines showing use of resource types and tables to access Azure Virtual Machines related resources and properties.,This page is a collection of Azure Resource Graph sample queries for Azure Virtual Machines.,2024-08-22T17:37:00.000Z,sample,integrations,0.65,True,"Collection of Resource Graph sample queries for VM-related resources; contains product-specific query patterns, table names, and property usage that act as concrete integration patterns with Azure Resource 
Graph.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-machines/restore-point-troubleshooting,Troubleshooting,Troubleshoot restore point failures - Azure Virtual Machines,Troubleshoot Azure VM restore point failures,"Symptoms, causes, and resolutions of restore point failures related to agent, extension, and disks.","This article provides troubleshooting steps that can help you resolve restore point errors related to communication with the VM agent and extension. If your Azure issue is not addressed in this article, visit the Azure forums on Microsoft Q & A and Stack Overflow. You can post your issue in these forums, or post to @AzureSupport on Twitter. You also can submit an Azure support request. To submit a support request, on the Azure support page, select Get support.",2024-08-22T17:37:00.000Z,troubleshooting,troubleshooting,0.9,True,"Organized around restore point errors related to agent, extensions, and disks; maps symptoms to causes and resolutions with product-specific details.",unchanged
@@ -499,8 +499,8 @@
https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/compute-optimized
https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/compute-optimized/fxmdsv2-series,FXmdsv2 series,FXmdsv2 size series - Azure Virtual Machines,Reference specs for Azure FXmdsv2 memory-intensive VM sizes,Information on and specifications of the FXmdsv2-series sizes,"The FXmdsv2-series Virtual Machine (VM) runs on the 5th Generation Intel® Xeon® Platinum 8573C (Emerald Rapids) processor in a hyper threaded configuration. FXv2-series VMs feature an all-core-turbo frequency up to 4.0 GHz. These virtual machines offer up to 96 vCPU, up to 1832 GiB of RAM and up to 6x880 GiB fast local SSD storage.  
The FXmdsv2-series benefits workloads that are ideal for memory-intensive enterprise applications requiring a high memory to CPU ratio, high performance, high CPU clo",2026-03-10T22:04:00.000Z,concept-article,limits-quotas,0.86,True,"Includes per-SKU numeric specs including up to 6x880 GiB local SSD and I/O caps, representing concrete limits.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/compute-optimized/fxmsv2-series,FXmsv2 series,FXmsv2 size series - Azure Virtual Machines,Reference specs for Azure FXmsv2 memory-intensive VM sizes,Information on and specifications of the FXmsv2-series sizes,"The FXmsv2-series Virtual Machine (VM) runs on the 5th Generation Intel® Xeon® Platinum 8573C (Emerald Rapids) processor in a hyper threaded configuration. FXv2-series VMs feature an all-core-turbo frequency up to 4.0 GHz. These virtual machines offer up to 96 vCPU and up to 1832 GiB of RAM.  The FXmsv2-series benefits workloads that are ideal for memory-intensive enterprise applications requiring a high memory to CPU ratio, high performance, high CPU clock speed, higher remote storage IOPs and t",2026-03-10T22:04:00.000Z,concept-article,limits-quotas,0.86,True,"FXmsv2 series page lists up to 96 vCPU and 1832 GiB RAM plus disk/network limits per size, which are numeric capacity constraints.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/fpga-accelerated/nm-ads-ma35d-series,NM family,NMads MA35d - Azure Virtual Machines,NMads MA35d video transcoding VM specs and limits,Overview of NMads MA35d virtual machine,"The NMads MA35D-Series virtual machines are Azure's first SKU to offer specialized hardware (Xilinx MA35D ""Supernova"") accelerated VM optimized for batch and real-time video transcoding workloads. The VM series are powered by 4th generation AMD EPYC™ Genoa processors. 
It offers 1 ASIC video processing unit (VPU) with 8GB of memory in addition to 16 vCPUs, 32GB of RAM, 76GB of temporary storage, and 4Gbps of network bandwidth. Compared with existing general-purpose CPU or GPU based solutions, the ",2026-03-10T22:04:00.000Z,concept-article,limits-quotas,0.8,True,"Gives concrete numeric resources (vCPUs, RAM, temporary storage, network bandwidth, VPU count and memory) that define the capacity limits of this VM series.",unchanged
-https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/fpga-accelerated/np-family,Overview,NP family VM size series - Azure Virtual Machines,Check NP family VM sizes and specs,List of sizes in the NP family.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets Note Azure NP-series VMs are scheduled for retirement on May 31, 2027. For more information, see NP-series virtual machine migration guidance The 'NP' subfamily of VM size series are one of Azure's storage-optimized VM instances. They're designed for workloads that require high disk throughput and I/O, such as databases, big data applications, and data warehousing. High disk throughput and large local disk storage",2026-04-14T17:03:00.000Z,concept-article,limits-quotas,0.8,True,"NP family size pages list per-size vCPU, memory, disk, and throughput specifications in tabular form, which are concrete numeric limits and capacities that qualify as limits-quotas expert knowledge.",updated
-https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/fpga-accelerated/np-series,NP series,NP size series - Azure Virtual Machines,Review NP-series VM size specifications,Information on and specifications of the NP-series sizes,"The NP-series virtual machines are powered by Xilinx U250 FPGAs for accelerating workloads including machine learning inference, video transcoding, and database search & analytics. NP-series VMs are also powered by Intel Xeon 8171M (Skylake) CPUs with all core turbo clock speed of 3.2 GHz. 
Note Azure NP-series virtual machines are scheduled for retirement on May 31, 2027. For more information, see NP-series virtual machine migration guidance.",2026-04-13T08:00:00.000Z,concept-article,limits-quotas,0.8,True,"NP-series size documentation provides detailed per-SKU hardware specs (CPU model, core counts, memory, FPGA details, storage and network characteristics) as numeric limits, fitting the limits-quotas category.",updated
+https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/fpga-accelerated/np-family,Overview,NP family VM size series - Azure Virtual Machines,Reference NP-series FPGA VM sizes and limits,List of sizes in the NP family.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets Important NP-series VM sizes (Standard_NP10s, Standard_NP20s, Standard_NP40s) are scheduled for retirement on May 31, 2027. After this date, remaining NP-series VMs are deallocated, stop working, stop incurring charges, and no longer have SLA or support. Managed disk data is preserved. Purchases of 1-year and 3-year Azure Reserved VM Instances for NP-series ended on April 2, 2026. Existing reservations are honored ",2026-04-21T06:04:00.000Z,concept-article,limits-quotas,0.86,True,"NP family size pages enumerate each NP SKU with numeric specifications (vCPUs, RAM, FPGA count/type, disk and NIC limits) and include concrete retirement dates and reservation cutoffs. These are precise platform limits and lifecycle dates that qualify as expert numerical constraints, matching limits-quotas.",updated
+https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/fpga-accelerated/np-series,NP series,NP size series - Azure Virtual Machines,Review NP-series FPGA VM specifications and retirement,Information on and specifications of the NP-series sizes,"The NP-series virtual machines are powered by Xilinx U250 FPGAs for accelerating workloads including machine learning inference, video transcoding, and database search & analytics. 
NP-series VMs are also powered by Intel Xeon 8171M (Skylake) CPUs with all core turbo clock speed of 3.2 GHz. Important Azure NP-series virtual machines (Standard_NP10s, Standard_NP20s, Standard_NP40s) are scheduled for retirement on May 31, 2027. After this date, remaining NP-series VMs are deallocated, stop working, ",2026-04-21T06:04:00.000Z,concept-article,limits-quotas,0.86,True,"The NP-series specification page provides detailed per-size hardware specs (CPU model and clock, FPGA model, memory, disk/NIC limits) and explicit retirement timelines for specific SKUs. These are exact, product-specific numeric limits and lifecycle constraints, which align with the limits-quotas category.",updated
https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/general-purpose/a-family,Overview,A family VM size series - Azure Virtual Machines,,List of size series in the A family.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets The 'A' family of VM size series are one of Azure's general purpose VM instances. They're designed for entry-level workloads, such as development and test environments, small to medium databases, and low-traffic web servers.",2025-07-03T05:03:00.000Z,concept-article,,0.3,False,"Family-level description of A-series; summary suggests conceptual positioning, not detailed per-size specs or limits.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/general-purpose/av2-series,Av2-series,Av2 size series - Azure Virtual Machines,Use Azure Av2 VM sizes and specifications,Information on and specifications of the Av2-series sizes,"The Av2-series VMs can be deployed on a variety of hardware types and processors. 
Av2-series run on the Intel® Xeon® Platinum 8573C (Emerald Rapids), Intel® Xeon® Platinum 8370C (Ice Lake), Intel® Xeon® Platinum 8272CL (Cascade Lake), Intel® Xeon® 8171M 2.1 GHz (Skylake), Intel® Xeon® E5-2673 v4 2.3 GHz (Broadwell), or the Intel® Xeon® E5-2673 v3 2.4 GHz (Haswell) processors. Av2-series VMs have CPU performance and memory configurations best suited for entry level workloads like development and ",2026-03-11T22:06:00.000Z,concept-article,limits-quotas,0.86,True,"Per-series VM size page typically includes tables of exact vCPU, RAM, disk, and throughput values for each Av2 size, which are numeric capacity limits unique to this product.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/general-purpose/b-family,Overview,B family VM size series - Azure Virtual Machines,,List of size series in the B family.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets The 'B' family of VM size series are one of Azure's general purpose VM instances. While traditional Azure virtual machines provide fixed CPU performance, B-series virtual machines are the only VM type that use credits for CPU performance provisioning. 
B-series VMs utilize a CPU credit model to track how much CPU is consumed - the virtual machine accumulates CPU credits when a workload is operating below the base ",2025-11-11T06:03:00.000Z,concept-article,,0.4,False,"B-family landing page is likely descriptive of credit model and workloads, with detailed numeric specs deferred to sub-series pages.",unchanged @@ -585,7 +585,7 @@ https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/n https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/ndv2-series,NDv2 series,NDv2 size series - Azure Virtual Machines,Reference NDv2 GPU VM size specifications,Information on and specifications of the NDv2-series sizes,"The NDv2-series virtual machine is a new addition to the GPU family designed for the needs of the most demanding GPU-accelerated AI, machine learning, simulation, and HPC workloads. NDv2 is powered by 8 NVIDIA Tesla V100 NVLINK-connected GPUs, each with 32 GB of GPU memory. Each NDv2 VM also has 40 non-HyperThreaded Intel Xeon Platinum 8168 (Skylake) cores and 672 GiB of system memory. NDv2 instances provide excellent performance for HPC and AI workloads utilizing CUDA GPU-optimized computation ",2026-04-02T17:04:00.000Z,concept-article,limits-quotas,0.88,True,"Lists 8 V100 GPUs with 32 GB each, 40 cores, 672 GiB RAM, plus per-size disk and bandwidth caps. These are precise numeric constraints for NDv2 SKUs, fitting limits-quotas.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/ng-family,Overview,NG family VM size series - Azure Virtual Machines,NG GPU VM family sizes and specs,List of sizes in the NG family.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets The 'NG' family of VM size series are one of Azure's GPU-optimized VM instances, specifically designed for cloud gaming and remote desktop applications. 
They harness powerful AMD Radeon™ PRO GPUs to deliver high-quality, interactive gaming experiences in the cloud, optimized for rendering complex graphics and streaming high-definition video. This ensures gamers enjoy a seamless, responsive gaming environment acce",2024-08-22T17:37:00.000Z,concept-article,limits-quotas,0.7,True,"Family list page for NG sizes typically includes a table of sizes with exact vCPU, RAM, GPU, and storage values, which are numeric resource limits.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/ngadsv620-series,NGads V620 series,NGads_V620 size series - Azure Virtual Machines,NGads V620 GPU VM size specs and limits,Information on and specifications of the NGads_V620-series sizes,"The NGads V620 series are GPU-enabled virtual machines with CPU, memory resources and storage resources balanced to generate and stream high quality graphics for a high performance, interactive gaming experience hosted in Azure. They're powered by AMD Radeon(tm) PRO V620 GPU and AMD EPYC 7763 (Milan) CPUs. The AMD Radeon PRO V620 GPUs have a maximum frame buffer of 32 GB, which can be divided up to four ways through hardware partitioning. The AMD EPYC CPUs have a base clock speed of 2.45 GHz and",2026-03-10T22:04:00.000Z,concept-article,limits-quotas,0.86,True,"Provides exact numeric values for GPU frame buffer, CPU clock speeds, vCPU, RAM, storage, and network, which are SKU-specific limits.",unchanged -https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/nv-family,Overview,NV family VM size series - Azure Virtual Machines,NV GPU VM family sizes and capabilities,List of sizes in the NV family.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets The 'NV' family of VM size series are one of Azure's GPU-accelerated VM instances, specifically designed for graphics-intensive applications such as graphics rendering, simulation, and virtual desktops. 
Equipped with NVIDIA or AMD GPUs, NV-series VMs provide a robust platform for rendering and processing graphics-heavy tasks, making them ideal for organizations that require virtual workstations with powerful grap",2024-11-07T18:01:00.000Z,concept-article,limits-quotas,0.7,True,"Family overview that normally includes a size table with exact vCPU, RAM, GPU, and storage values, which are numeric capacity limits.",unchanged
+https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/nv-family,Overview,NV family VM size series - Azure Virtual Machines,Reference NV-series GPU VM sizes and specs,List of sizes in the NV family.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets The 'NV' family of VM size series are one of Azure's GPU-accelerated VM instances, specifically designed for graphics-intensive applications such as graphics rendering, simulation, and virtual desktops. Equipped with NVIDIA or AMD GPUs, NV-series VMs provide a robust platform for rendering and processing graphics-heavy tasks, making them ideal for organizations that require virtual workstations with powerful grap",2026-04-21T06:04:00.000Z,concept-article,limits-quotas,0.86,True,"NV family size pages list per-SKU expert details such as vCPU count, RAM, GPU model/count, GPU memory, max data disks, NICs, and other numeric constraints in tabular form. These are concrete capacity limits for each VM size that change over time and are not reliably known from training data, fitting the limits-quotas category.",updated
https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/nv-series,NV series,NV size series - Azure Virtual Machines,Plan for Azure NV-series GPU VM retirement and migration,Information on and specifications of the NV-series sizes,"Important The NV-series is retired as of September 6th, 2023, and is no longer available. Please refer to the NV-series retirement page for more information. 
Important NV and NV_Promo series Azure virtual machines (VMs) will be retired on September 6, 2023. For more information, see the NV and NV_Promo retirement information. For how to migrate your workloads to other VM sizes, see the NV and NV_Promo series migration guide. This retirement announcement doesn't apply to NVv3 and NVv4 series VMs. The",2026-04-02T17:04:00.000Z,concept-article,decision-making,0.7,True,"Contains concrete retirement dates and links to a migration guide for moving workloads to other VM sizes. Retirement timelines and migration recommendations are product-specific decision inputs (when to move, what to move to), fitting decision-making with expert knowledge about deprecation.",unchanged
https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/nvadsa10v5-series,NVadsA10_v5 series,NVadsA10_v5 size series - Azure Virtual Machines,NVads A10 v5 GPU VM specs and limits,Information on and specifications of the NVadsA10_v5-series sizes,"The NVadsA10v5-series virtual machines are powered by NVIDIA A10 GPUs and AMD EPYC 74F3V (Milan) CPUs with a base frequency of 3.2 GHz, all-cores peak frequency of 4.0 GHz. With NVadsA10v5-series Azure is introducing virtual machines with partial NVIDIA GPUs. Pick the right sized virtual machine for GPU accelerated graphics applications and virtual desktops starting at 1/6th of a GPU with 4-GiB frame buffer to a full A10 GPU with 24-GiB frame buffer. 
Each virtual machine instance in NVadsA10v5-se",2026-03-10T22:04:00.000Z,concept-article,limits-quotas,0.86,True,"Includes detailed numeric specs such as 1/6 GPU increments, frame buffer sizes, CPU frequencies, and per-size resource tables that define SKU limits.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/nvadsv710-v5-series,Nvads V710 v5-series,NVads V710 v5 size series - Azure Virtual Machines,Reference NVads V710 v5 Azure GPU VM size limits,Information on and specifications of the NVads-V710 v5-series sizes,The NVads V710 v5-series virtual machines are powered by AMD Radeon™ Pro V710 GPUs and AMD EPYC™ 9V64 F (Genoa) CPUs with a base frequency of 3.95 GHz and all-cores peak frequency of 4.3 GHz. VMs take advantage of the AMD Simultaneous Multithreading technology to assign dedicated vCPU threads to each VM. Both Windows and Linux VMs are supported. The series provides five options ranging from 1/6 of a GPU with 4-GiB frame buffer to a full V710 GPU with 24-GiB frame buffer. No other GPU licensing i,2026-03-19T22:07:00.000Z,concept-article,limits-quotas,0.84,True,"Describes a VM series with explicit numeric specs (GPU fractions, frame buffer sizes, CPU frequencies, number of options). Full page will include a size table with exact vCPU, RAM, storage, and GPU allocations per size, which are SKU-specific capacity limits matching limits-quotas.",unchanged @@ -594,13 +594,13 @@ https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/n https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/nvv4-retirement,NVv4 series retirement,NVv4 series retirement - Azure Virtual Machines,Handle NVv4 GPU VM retirement and migration,This article contains the details of the NVv4-series retirement.,"Note There is a known resize operation error that occurs when migrating from the NVv4-series to the NVads_V710_v5-series. Microsoft is working on a fix that will be implemented in April 2026. 
In the meantime, we suggest that you follow this workaround. Note 1-year and 3-year RI purchases for the NVv4-series ended November 1, 2025. On September 30, 2026, Microsoft Azure will retire the Standard_NV4as_v4, Standard_NV4ahs_v4, Standard_NV8as_v4, Standard_NV8ahs_v4, Standard_NV16as_v4, Standard_NV16ah",2025-09-22T22:03:00.000Z,how-to,decision-making,0.78,True,"Retirement article with specific retirement date, affected SKUs, RI purchase cutoff, and a known resize error plus workaround; informs concrete migration and timing decisions.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/nvv4-series,NVv4 series,NVv4 size series - Azure Virtual Machines,NVv4 GPU VM size specifications and limits,Information on and specifications of the NVv4-series sizes,"Note The NVv4-series will be retired on September 30, 2026. Refer to the NVv4-series retirement page for more information and migration recommendations. The NVv4-series virtual machines are powered by AMD Radeon Instinct MI25 GPUs and AMD EPYC 7V12 (Rome) CPUs with a base frequency of 2.45 GHz, all-cores peak frequency of 3.1 GHz and single-core peak frequency of 3.3 GHz. With NVv4-series Azure is introducing virtual machines with partial GPUs. 
Pick the right sized virtual machine for GPU accelerated gr",2026-03-10T22:04:00.000Z,concept-article,limits-quotas,0.84,True,"Describes partial GPU options with specific fractions and frame buffer sizes plus CPU frequencies and resource quantities, all of which are numeric limits.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/high-performance-compute/hb-family,Overview,HB sub-family VM size series - Azure Virtual Machines,HB HPC VM sub-family sizes and specs,Overview of the 'HB' sub-family of virtual machine sizes,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets The 'HB' sub-family of VM size series are one of Azure's high-performance computing (HPC) optimized H-family VM instances. They're designed for compute-intensive workloads, such as computational fluid dynamics, finite element analysis, and large-scale scientific simulations. HB-series VMs, equipped with high-performance AMD EPYC processors and fast memory, deliver exceptional CPU and memory bandwidth. They're ide",2025-11-25T12:03:00.000Z,concept-article,limits-quotas,0.7,True,"HB sub-family overview that typically includes or links to tables of sizes with exact vCPU, RAM, and bandwidth values, which are numeric capacity limits.",unchanged -https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/high-performance-compute/hbv2-series,HBv2 series,HBv2 size series - Azure Virtual Machines,Use HBv2-series VM size specifications,Information on and specifications of the HBv2-series sizes,"Important HBv2-series VMs are scheduled for retirement on May 31, 2027.This applies to all HBv2 sizes: Standard_HB120rs_v2, Standard_HB120-96rs_v2, Standard_HB120-64rs_v2, Standard_HB120-32rs_v2, and Standard_HB120-16rs_v2. After this date, HBv2 VMs will be set to a deallocated state, stop working, stop incurring billing charges, and lose SLA and support. Plan your migration to current-generation HPC alternatives before the retirement date. 
For migration guidance, see theMigration guidancesectio",2026-04-14T17:03:00.000Z,concept-article,limits-quotas,0.8,True,"HBv2 size series pages typically include per-size vCPU, memory, storage, and network bandwidth tables, which are precise numeric capacity limits that match the limits-quotas expert knowledge criteria.",updated +https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/high-performance-compute/hbv2-series,HBv2 series,HBv2 size series - Azure Virtual Machines,Use HBv2-series VM size specifications,Information on and specifications of the HBv2-series sizes,"Important HBv2-series VMs are scheduled for retirement on May 31, 2027.This applies to all HBv2 sizes: Standard_HB120rs_v2, Standard_HB120-96rs_v2, Standard_HB120-64rs_v2, Standard_HB120-32rs_v2, and Standard_HB120-16rs_v2. After this date, HBv2 VMs will be set to a deallocated state, stop working, stop incurring billing charges, and lose SLA and support. Plan your migration to current-generation HPC alternatives before the retirement date. For migration guidance, see theMigration guidancesectio",2026-04-14T17:03:00.000Z,concept-article,limits-quotas,0.8,True,"HBv2 size series pages typically include per-size vCPU, memory, storage, and network bandwidth tables, which are precise numeric capacity limits that match the limits-quotas expert knowledge criteria.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/high-performance-compute/hbv3-series,HBv3 series,HBv3 size series - Azure Virtual Machines,HBv3 HPC VM size specifications and limits,Information on and specifications of the HBv3-series sizes,"HBv3-series Virtual Machines (VMs) are designed for very demanding computing tasks. They help with things like studying how liquids move, analyzing structures, predicting weather, and processing earthquake data. They also support oil reservoir modeling and testing computer chip designs. 
HBv3 VMs feature up to 120 AMD EPYC™ 7V73X (Milan-X) CPU cores, 448 GB of RAM, and no simultaneous multithreading. HBv3-series VMs also provide 350 GB/sec of memory bandwidth (amplified up to 630 GB/s), up to 96 ",2026-03-10T22:04:00.000Z,concept-article,limits-quotas,0.86,True,"Lists precise values for CPU cores, RAM, memory bandwidth, FLOPS, and network, which are SKU-specific capacity limits.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/high-performance-compute/hbv4-series,HBv4 series,HBv4 size series - Azure Virtual Machines,HBv4 HPC VM size specifications and limits,Information on and specifications of the HBv4-series sizes,"HBv4-series VMs are optimized for many high-performance computing (HPC) workloads. These include: HBv4 VMs feature up to 176 AMD EPYC™ 9V33X (""Genoa-X"") CPU cores with AMD's 3D V-Cache, clock frequencies up to 3.7 GHz, and no simultaneous multithreading. HBv4-series VMs also provide 768 GB of RAM, 2304 MB L3 cache. Each VM has 2304 MB of L3 cache that delivers up to 5.7 TB/s of bandwidth. This boosts DRAM bandwidth of 780 GB/s, giving an average of 1.2 TB/s of effective memory speed for many wor",2026-03-10T22:04:00.000Z,concept-article,limits-quotas,0.86,True,"Provides detailed numeric limits such as core counts, RAM, L3 cache size and bandwidth, and memory bandwidth, all specific to this VM series.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/high-performance-compute/hbv5-series,HBv5 series,HBv5 size series - Azure Virtual Machines,Reference HBv5 Azure HPC VM size specifications,Information on and specifications of the HBv5-series sizes,"HBv5-series VMs are optimized for the most memory bandwidth-intensive HPC applications, including: HBv5 VMs feature 6.7 TB/s of memory bandwidth across 432 GB of high-bandwidth memory (HBM) and up to 368 4th Generation AMD EPYC™ processor cores with 4 GHz boost frequencies, 3.5 GHz base frequencies, and no simultaneous multithreading. 
Each HBv5-series VM also includes 14.3 TiB of local NVMe SSD storage with up to 50 GB/s (reads) and 30 GB/s (writes) of block device performance. All HBv5-series V",2026-04-02T17:04:00.000Z,concept-article,limits-quotas,0.86,True,"HBv5-series description includes precise numeric specs (TB/s memory bandwidth, GB of HBM, core counts, clock speeds, NVMe capacity and throughput). The full page will have a size table with exact per-SKU limits, fitting the limits-quotas category.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/high-performance-compute/hc-family,Overview,HC sub-family VM size series - Azure Virtual Machines,HC HPC VM sub-family sizes and specs,Overview of the 'HC' sub-family of virtual machine sizes,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets The 'HC' family of VM size series are one of Azure's high-performance computing (HPC) optimized VM instances. These VMs are designed for compute-intensive workloads that need substantial CPU power. Examples include genomic sequencing, engineering simulations, and financial modeling. HC-series VMs use high-performance Intel Xeon Scalable processors and fast memory. HC-series VMs deliver exceptional computational p",2025-11-25T12:03:00.000Z,concept-article,limits-quotas,0.7,True,"HC sub-family overview that typically includes or links to size tables with exact vCPU, RAM, and bandwidth values, which are numeric limits.",unchanged -https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/high-performance-compute/hc-series,HC series,HC size series - Azure Virtual Machines,Reference HC-series Azure VM size specifications,Information on and specifications of the HC-series sizes,"Important Azure Virtual Machines HC-series sizes (Standard_HC44rs, Standard_HC44-16rs, Standard_HC44-32rs) will be retired on May 31, 2027. After this date, any remaining HC-series VMs will be deallocated, stop running, and will no longer incur charges. 
HC-series will no longer have SLA or support following retirement. Sales of 1-year and 3-year Reserved Instances for HC-series ended on April 2, 2026. For more information, see Migrate your HC-series virtual machines by May 31, 2027. HC-series Vir",2026-04-17T22:03:00.000Z,concept-article,limits-quotas,0.78,True,"HC-series size pages typically list per-size hardware specs (vCPU count, RAM GB, temp disk size, max data disks, NIC limits, bandwidth caps). These are exact numeric constraints unique to this VM family and not derivable from general knowledge, fitting the limits-quotas category.",updated +https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/high-performance-compute/hc-series,HC series,HC size series - Azure Virtual Machines,Reference HC-series Azure VM size specifications,Information on and specifications of the HC-series sizes,"Important Azure Virtual Machines HC-series sizes (Standard_HC44rs, Standard_HC44-16rs, Standard_HC44-32rs) will be retired on May 31, 2027. After this date, any remaining HC-series VMs will be deallocated, stop running, and will no longer incur charges. HC-series will no longer have SLA or support following retirement. Sales of 1-year and 3-year Reserved Instances for HC-series ended on April 2, 2026. For more information, see Migrate your HC-series virtual machines by May 31, 2027. HC-series Vir",2026-04-17T22:03:00.000Z,concept-article,limits-quotas,0.78,True,"HC-series size pages typically list per-size hardware specs (vCPU count, RAM GB, temp disk size, max data disks, NIC limits, bandwidth caps). 
These are exact numeric constraints unique to this VM family and not derivable from general knowledge, fitting the limits-quotas category.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/high-performance-compute/hx-family,Overview,HX sub-family VM size series - Azure Virtual Machines,,Overview of the 'HX' sub-family of virtual machine sizes,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets The 'HX' family of VM size series are one of Azure's high-memory, high-performance computing (HPC) optimized VM instances. They're built for tasks that need a lot of memory and strong CPU power. Examples include in-memory databases, big data analysis, and advanced scientific simulations. HX-series VMs, equipped with expansive memory and powerful CPUs, deliver the resources to efficiently handle large datasets an",2025-11-25T12:03:00.000Z,concept-article,,0.3,False,"High-level overview of HX family; summary suggests conceptual positioning and examples, not detailed numeric specs or configuration tables.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/high-performance-compute/hx-series,HX series,HX size series - Azure Virtual Machines,Reference HX-series VM size specifications in Azure,Information on and specifications of the HX-series sizes,"HX-series VMs are optimized for workloads that require significant memory capacity with twice the memory capacity of HBv4. For example, workloads such as silicon design can use HX-series VMs to enable EDA customers targeting the most advanced manufacturing processes to run their most memory-intensive workloads. HX VMs feature up to 176 AMD EPYC™ 9V33X (""Genoa-X"") CPU cores with AMD's 3D V-Cache, clock frequencies up to 3.7 GHz, and no simultaneous multithreading. 
HX-series VMs also provide 1408 ",2026-03-10T22:04:00.000Z,concept-article,limits-quotas,0.86,True,"Contains detailed numeric specs per VM size (core counts, memory, cache, storage bandwidth) that define hard capacity boundaries for each HX-series SKU; this is expert, SKU-specific data.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/memory-optimized/dndsv6-series,Dndsv6 series,Dndsv6 size series - Azure Virtual Machines,Reference specs for Azure Dndsv6 VM sizes,Information on and specifications of the Dndsv6-series sizes,"Dndsv6-series virtual machines run on the 5th Generation Intel® Xeon® Platinum 8573C (Emerald Rapids) processor reaching an all-core turbo clock speed of 3.0 GHz. These virtual machines offer up to 128 vCPU and 512 GiB of RAM. Dndsv6-series virtual machines, an extension of our standard Ddsv6-series, provide better networking performance for most general-purpose workloads. It also provides increased scalability, upgraded CPU, elevated memory bandwidth, and faster remote storage access compared 
This document covers: By migrating to newer VM series, you gain access to improved price-performance ratios, broader regional availability, and the latest hardware capabilities.",2025-06-30T08:00:00.000Z,how-to,decision-making,0.78,True,"Retirement migration guide with concrete retirement date and recommended target series, helping users choose replacement SKUs and plan migration.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/dedicated-host-retirement,Dedicated Host SKU Retirement,Azure Dedicated Host SKU Retirement - Azure Virtual Machines,,Azure Dedicated Host SKU Retirement landing page,"We continue to modernize and optimize Azure Dedicated Host by using the latest innovations in processor and datacenter technologies. Azure Dedicated Host is a combination of a virtual machine (VM) series and a specific Intel or AMD-based physical server. As we innovate and work with our technology partners, we also need to plan how we retire aging technology.",2025-03-17T22:00:00.000Z,concept-article,,0.5,False,Retirement landing page; summary indicates high-level explanation of modernization and retirement without detailed migration matrices or numeric thresholds.,unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/hbv2-series-retirement,HBv2 series,"Migrate your HBv2-series virtual machines by May 31, 2027 - Azure Virtual Machines",Migrate Azure HBv2-series VMs before retirement deadline,"Learn about the retirement of HBv2-series virtual machines on May 31, 2027, how it affects you, and how to migrate to recommended alternative Azure VM sizes.","Note 1-year and 3-year purchases for the HBv2-series end April 2, 2026. Microsoft Azure has announced the retirement of its HBv2-series virtual machines effective May 31, 2027. Introduced in 2020, the HBv2-series is equipped with 120 AMD EPYC 7V12 processor cores, 4 GB of RAM per CPU core, up to 350 GB/s of memory bandwidth, and up to 4 teraFLOPS of FP64 compute. 
Since launching this series, Microsoft introduced HBv5, HBv4, HX-series, and HBv3 series virtual machines. These newer offerings pack the l",2026-03-18T17:33:00.000Z,concept-article,decision-making,0.78,True,"Includes precise retirement and reservation cutoff dates and recommends specific successor VM series (HBv5/HBv4/HX/HBv3) with hardware characteristics, supporting concrete migration and replacement-SKU decisions rather than just conceptual info.",unchanged -https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/hc-series-retirement,HC series,"Migrate your HC-series virtual machines by May 31, 2027 - Azure Virtual Machines",Plan migration from HC-series to newer Azure HPC VMs,"Learn about the retirement of HC-series virtual machines on May 31, 2027, how it affects you, and how to migrate to recommended alternative Azure VM sizes.","Note 1-year and 3-year purchases for the HC-series end April 2, 2026. Microsoft Azure has announced the retirement of its HC-series virtual machines effective May 31, 2027. Introduced in 2019, HC-series VMs are equipped with Intel Xeon Platinum 8168 processors, 8 GB of RAM per CPU core, and a 100 Gb/sec Mellanox EDR InfiniBand. Since launching this series, Microsoft introduced HBv5, HBv4, HX, and HBv3 series virtual machines. These newer offerings pack the latest technological improvements in co",2026-04-17T22:03:00.000Z,concept-article,decision-making,0.7,True,"Retirement guidance for HC-series VMs includes concrete retirement dates, purchase cut-off dates, and recommended alternative VM series (HBv5, HBv4, HX, HBv3) with migration guidance. 
This is product- and time-specific decision content to choose replacement SKUs and plan migration, matching decision-making.",updated +https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/hc-series-retirement,HC series,"Migrate your HC-series virtual machines by May 31, 2027 - Azure Virtual Machines",Plan migration from HC-series to newer Azure HPC VMs,"Learn about the retirement of HC-series virtual machines on May 31, 2027, how it affects you, and how to migrate to recommended alternative Azure VM sizes.","Note 1-year and 3-year purchases for the HC-series end April 2, 2026. Microsoft Azure has announced the retirement of its HC-series virtual machines effective May 31, 2027. Introduced in 2019, HC-series VMs are equipped with Intel Xeon Platinum 8168 processors, 8 GB of RAM per CPU core, and a 100 Gb/sec Mellanox EDR InfiniBand. Since launching this series, Microsoft introduced HBv5, HBv4, HX, and HBv3 series virtual machines. These newer offerings pack the latest technological improvements in co",2026-04-17T22:03:00.000Z,concept-article,decision-making,0.7,True,"Retirement guidance for HC-series VMs includes concrete retirement dates, purchase cut-off dates, and recommended alternative VM series (HBv5, HBv4, HX, HBv3) with migration guidance. This is product- and time-specific decision content to choose replacement SKUs and plan migration, matching decision-making.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/msv2-mdsv2-retirement,Msv2 and Mdsv2 migration guide,Msv2 and Mdsv2 Isolated Sizes Retirement - Azure Virtual Machines,Plan migration for Msv2 and Mdsv2 isolated VM retirement,Migration guide for sizes,"On March 31, 2027, Azure will retire the Msv2 and Mdsv2-series Medium Memory virtual machines (VM) listed: Msv2 Medium Memory Diskless, Mdsv2 Medium Memory with Disk. From now to March 31, 2027, you can continue to use the VMs listed without disruption. 
On March 31, 2027, the remaining VMs with these specific sizes on your subscription will be set to a deallocated state. These VMs are stopped and removed from the host. These VMs won't be billed in the deallocated state. To avoid service disruption",2025-06-06T05:07:00.000Z,concept-article,decision-making,0.7,True,"Retirement guide with specific retirement date (March 31, 2027), deallocation behavior, and recommended target sizes; supports migration and SKU selection decisions with concrete timelines.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/nd-series-retirement,ND series retirement,ND-series retirement - Azure Virtual Machines,Plan migration for ND-series GPU VM retirement,"ND-series retirement by September 6, 2023","Based on feedback we’ve received from customers we’re happy to announce that we're extending the retirement date by one year to 6 September 2023, for the Azure ND-Series virtual machine to give you more time to plan your migration. As we continue to bring modern and optimized virtual machine instances to Azure leveraging the latest innovations in datacenter technologies, we thoughtfully plan how we retire aging hardware. With this in mind, we're retiring our ND GPU VM sizes, powered by NVIDIA ",2025-03-17T22:00:00.000Z,concept-article,decision-making,0.7,True,Retirement notice with a specific retirement date (6 September 2023) and guidance to move to newer GPU SKUs; supports concrete migration planning and SKU selection.,unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/np-series-retirement,NP series,"Migrate your NP-series virtual machines by May 31, 2027 - Azure Virtual Machines",Plan migration from Azure NP-series VMs before retirement,"Learn about the retirement of NP-series virtual machines on May 31, 2027, how it affects you, and how to migrate to recommended alternative Azure VM sizes.","Note 1-year and 3-year purchases for the NP-series end April 2, 2026. 
Microsoft Azure has announced the retirement of its NP-series virtual machines. The NP-series, introduced for FPGA-accelerated workloads, is powered by Intel Xeon 8171M (Skylake) CPUs and AMD Xilinx Alveo U250 FPGAs. These VMs are designed for accelerating workloads including machine learning inference, video transcoding, and database search & analytics.",2026-03-18T17:33:00.000Z,concept-article,decision-making,0.78,True,"Retirement notice with specific retirement and purchase end dates plus recommended alternative VM sizes provides time-bound, SKU-specific migration guidance that helps choose replacement sizes; this is concrete decision-making content rather than generic overview.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/nv-series-retirement,NV series retirement,NV series retirement - Azure Virtual Machines,Understand Azure NV series retirement timeline,"NV series retirement starting September 6, 2023","We’re happy to announce that we're extending the retirement date by one year to September 6, 2023, for the Azure NV-Series and NV_Promo Series virtual machine to give you more time to plan your migration. We continue to bring modern and optimized virtual machine (VM) instances to Azure by using the latest innovations in datacenter technologies. As we innovate, we also thoughtfully plan how we retire aging hardware. 
With this context in mind, we're retiring our NV-series Azure VM sizes on Septem",2025-03-17T22:00:00.000Z,concept-article,decision-making,0.7,True,"Contains specific retirement dates and scope for NV and NV_Promo series, which are time-based constraints driving migration and sizing decisions.",unchanged -https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/retired-sizes-list,Retired sizes,Retired Azure VM size series - Azure Virtual Machines,Review retired Azure VM series and replacements,A list containing all retired and soon to be retired VM size series and their replacement series.,This article provides a list of all sizes that are retired or have been announced for retirement. For sizes that require it, there are migration guides to help move to replacement sizes. Warning Series with Retirement Status listed as Retired are no longer available and can't be provisioned. Virtual Machines (VMs) announced for retirement will have restrictions when deploying through new subscriptions. We recommend using the latest generation VMs to ensure best price-performance and to benefit from ext,2025-12-11T23:04:00.000Z,concept-article,decision-making,0.8,True,"Lists all retired/retiring series with replacement recommendations and retirement status, directly supporting SKU selection and migration decisions.",unchanged +https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/retired-sizes-list,Retired sizes,Retired Azure VM size series - Azure Virtual Machines,Plan migrations from retired Azure VM size series,A list containing all retired and soon to be retired VM size series and their replacement series.,This article provides a list of all sizes that are retired or have been announced for retirement. For sizes that require it, there are migration guides to help move to replacement sizes. Warning Series with Retirement Status listed as Retired are no longer available and can't be provisioned. 
Virtual Machines (VMs) announced for retirement will have restrictions when deploying through new subscriptions. We recommend using the latest generation VMs to ensure best price-performance and to benefit from ext",2026-04-21T06:04:00.000Z,concept-article,decision-making,0.7,True,"Page lists specific VM size series that are retired or scheduled for retirement and maps them to replacement series, providing concrete migration guidance and choices between SKUs. This is product- and time-specific information that an LLM won't reliably know from training and directly supports decision-making about which VM sizes to use going forward.",updated https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/retirement-overview,Overview,Previous-gen and retired VM sizes - Azure Virtual Machines,,Overview of the retirement process for virtual machine size series and information on previous-gen sizes.,"Azure virtual machine sizes are the amount of resources allocated to a virtual machine in the cloud. These resources are a portion of the physical server’s hardware capabilities. A size series is a collection of all sizes that are available within a single physical server’s hardware. As a size series' physical hardware ages and newer components are released, Microsoft stops deploying more of previously established series' hardware. Once users migrate off of said hardware or the hardware becomes su",2025-03-17T22:00:00.000Z,overview,,0.35,False,"Retirement overview is largely conceptual process description; detailed per-size data is in linked pages, not this overview.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/storage-optimized/l-family,Overview,L family VM size series - Azure Virtual Machines,Reference specs for L family storage-optimized VMs,List of sizes in the L family.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets The 'L' family of VM size series are one of Azure's storage-optimized VM instances. 
They're designed for workloads that require high disk throughput and I/O, such as databases, big data applications, and data warehousing. Equipped with high disk throughput and large local disk storage capacities, L-series VMs support applications and services that benefit from low latency and high sequential read and write speeds",2024-08-22T17:37:00.000Z,concept-article,limits-quotas,0.8,True,"L family page typically includes a table of all L-series sizes with exact vCPU, RAM, local disk sizes, and throughput caps, which are numeric capacity limits per SKU.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/storage-optimized/laosv4-series,Laosv4 series,Laosv4 size series - Azure Virtual Machines,Reference Laosv4 storage-optimized VM specs,Information on and specifications of the Laosv4-series sizes,"The Laosv4-series of Azure Virtual Machines (VMs) features high-throughput, low latency, directly mapped local NVMe storage. These VMs utilize AMD's fourth Generation EPYC™ 9004 processors that can achieve a boosted maximum frequency of 3.7GHz. The Laosv4-series VMs are available in sizes from 2 to 32 vCPUs, with 8 GiB of memory allocated per vCPU and 720GB of local NVMe temp disk capacity allocated per vCPU, with up to 23TB (12x1.92TB) of local temp disk capacity available on the L32aos_v4 size",2026-03-10T22:04:00.000Z,concept-article,limits-quotas,0.86,True,"Specifies vCPU range, GiB per vCPU, and NVMe disk capacity per vCPU (720 GB) with max totals, which are numeric capacity constraints.",unchanged @@ -694,7 +694,7 @@ https://learn.microsoft.com/en-us/azure/virtual-machines/ssh-keys-portal,Create https://learn.microsoft.com/en-us/azure/virtual-machines/states-billing,States and billing,States and billing status - Azure Virtual Machines,Understand VM states and Azure billing behavior,Learn about the provisioning and power states that a virtual machine can enter. 
Provisioning and power states affect billing.,Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets Azure Virtual Machines (VM) instances go through different states. There are provisioning and power states. This article describes these states and highlights when customers are billed for instance usage.,2024-08-22T17:37:00.000Z,concept-article,limits-quotas,0.66,True,"Details which provisioning/power states incur charges and which do not—effectively numeric billing rules tied to specific states, which are product-specific limits/behaviors.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/troubleshoot-maintenance-configurations,Troubleshoot,Troubleshoot problems with Maintenance Configurations - Azure Virtual Machines,Troubleshoot Azure VM Maintenance Configuration deployment and patching issues,This article provides details on known and fixed issues and how to troubleshoot problems with Maintenance Configurations.,"This article outlines common problems and errors that might arise during the deployment or use of Maintenance Configurations for scheduled patching on virtual machines (VMs), along with strategies to address them. A maintenance configuration doesn't install a scheduled patch on the VMs and gives a ShutdownOrUnresponsive error. In a static scope, it's essential to avoid relying on outdated VM configurations. 
Therefore, it's imperative to ensure that the VM is up and running while the patch is being",2025-04-02T17:03:00.000Z,concept-article,troubleshooting,0.85,True,Contains specific error (ShutdownOrUnresponsive) and symptom→cause→solution guidance for Maintenance Configurations.,unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/troubleshooting-shared-images,Troubleshoot shared images,Troubleshoot problems with shared images in Azure - Azure Virtual Machines,Troubleshoot Azure Compute Gallery shared image issues,Learn how to troubleshoot problems with shared images in Azure Compute Galleries.,"Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets If you have problems performing any operations on Azure Compute Gallery (formerly known as Shared Image Gallery) resources, like galleries, image definitions, and image versions, run the failing command again in debug mode. You activate debug mode by passing the --debug switch with the Azure CLI and the -Debug switch with PowerShell. After you've located the error, follow this article to troubleshoot it.",2024-08-22T17:37:00.000Z,troubleshooting,troubleshooting,0.9,True,"Explicit troubleshooting article; will map specific errors and CLI/PowerShell debug outputs to causes and resolutions, matching troubleshooting criteria.",unchanged -https://learn.microsoft.com/en-us/azure/virtual-machines/trusted-launch,Overview,Trusted Launch for Azure VMs - Azure Virtual Machines,Configure Trusted Launch security for Azure VMs,Learn about Trusted Launch for Azure virtual machines.,Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets Azure offers Trusted Launch as a seamless way to improve the security of Generation 2 virtual machines (VM). Trusted Launch protects against advanced and persistent attack techniques. Trusted Launch is composed of several coordinated infrastructure technologies that can be enabled independently. 
Each technology provides another layer of defense against sophisticated threats. Trusted Launch is supported for both x64,2026-04-17T22:03:00.000Z,concept-article,security,0.7,True,"Trusted Launch documentation for Azure VMs typically includes product-specific security configuration details such as required VM generation, supported OS/VM sizes, specific security features (vTPM, secure boot, measured boot), and how to enable/disable them. These are concrete, service-specific security settings rather than generic security concepts, fitting the security sub-skill. It is not primarily about limits, deployment, or troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/virtual-machines/trusted-launch,Overview,Trusted Launch for Azure VMs - Azure Virtual Machines,Configure Trusted Launch security for Azure VMs,Learn about Trusted Launch for Azure virtual machines.,Applies to:✔️ Linux VMs ✔️ Windows VMs ✔️ Flexible scale sets ✔️ Uniform scale sets Azure offers Trusted Launch as a seamless way to improve the security of Generation 2 virtual machines (VM). Trusted Launch protects against advanced and persistent attack techniques. Trusted Launch is composed of several coordinated infrastructure technologies that can be enabled independently. Each technology provides another layer of defense against sophisticated threats. Trusted Launch is supported for both x64,2026-04-17T22:03:00.000Z,concept-article,security,0.7,True,"Trusted Launch documentation for Azure VMs typically includes product-specific security configuration details such as required VM generation, supported OS/VM sizes, specific security features (vTPM, secure boot, measured boot), and how to enable/disable them. These are concrete, service-specific security settings rather than generic security concepts, fitting the security sub-skill. 
It is not primarily about limits, deployment, or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/trusted-launch-existing-vm,Generation 2 VM,Enable Trusted launch on existing Gen2 VMs - Azure Virtual Machines,Enable Trusted Launch on existing Gen2 Azure VMs,Learn how to enable Trusted launch on existing Azure Gen2 virtual machines (VMs).,"Applies to:✔️ Linux VM ✔️ Windows VM Azure Virtual Machines supports enabling Azure Trusted launch on existing Azure Generation 2 virtual machines (VM) by upgrading to the Trusted launch security type. Trusted launch is a way to enable foundational compute security on Azure Generation 2 VMs and protects against advanced and persistent attack techniques like boot kits and rootkits. It does so by combining infrastructure technologies like Secure Boot, virtual Trusted Platform Module (vTPM), and boot inte",2026-02-10T23:03:00.000Z,how-to,security,0.65,True,"Guides upgrading existing Gen2 VMs to Trusted Launch, including enabling Secure Boot and vTPM; product-specific security configuration steps.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/trusted-launch-existing-vm-gen-1,Generation 1 VM,Upgrade Gen1 VMs to Trusted launch - Azure Virtual Machines,Upgrade Azure Gen1 VMs to Trusted launch Gen2,Learn how to upgrade existing Azure Gen1 virtual machines (VMs) to Trusted launch.,"Applies to:✔️ Linux VM ✔️ Windows VM ✔️ Generation 1 VM Azure Virtual Machines supports upgrading Generation 1 virtual machines (VM) to Generation 2 by upgrading to the Trusted launch security type. Trusted launch is a way to enable foundational compute security on Azure Generation 2 VMs and protects against advanced and persistent attack techniques like boot kits and rootkits. 
It does so by combining infrastructure technologies like Secure Boot, virtual Trusted Platform Module (vTPM), and boot integri",2026-03-09T08:00:00.000Z,how-to,configuration,0.68,True,"How-to guide for upgrading Gen1 VMs to Trusted launch Gen2, including product-specific steps and required settings (security type, Secure Boot, vTPM) that represent concrete configuration knowledge beyond generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machines/trusted-launch-existing-vmss,Enable Trusted launch on existing Virtual Machine Scale Sets,Enable Trusted launch on existing Uniform scale set - Azure Virtual Machines,Enable Trusted Launch on existing Azure VM scale sets,Enable Trusted launch on existing Uniform scale set,"Applies to:✔️ Uniform scale set ✔️ Flex scale set ❌ Service fabric Azure Virtual Machine Scale Sets supports enabling Trusted launch on existing Uniform Scale sets virtual machine (VM) by upgrading to the Trusted launch security type. Trusted launch enables foundational compute security on Azure Generation 2 virtual machines & scale sets and protects them against advanced and persistent attack techniques like boot kits and rootkits. 
It does so by combining infrastructure technologies like Secure Boot, vTPM",2025-06-18T08:00:00.000Z,how-to,security,0.65,True,"Explains enabling Trusted Launch on Uniform and Flex scale sets, with specific security-type settings; product-specific security configuration.",unchanged diff --git a/products/azure-virtual-machines/report.md b/products/azure-virtual-machines/report.md index 8668a5b7..b09d61ab 100644 --- a/products/azure-virtual-machines/report.md +++ b/products/azure-virtual-machines/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: architecture-patterns: 'Design patterns for VM-based architectures: multi-region and fleet strategies, NUMA/topology tuning for HPC SKUs, low-latency placement, @@ -10,9 +10,9 @@ category_descriptions: security: 'Securing Azure VMs and disks: encryption (ADE, CMK, double/host), Key Vault, TLS, Trusted Launch, OS Guard, MSP/metadata hardening, RBAC/Policy, and secure image/gallery sharing.' - limits-quotas: VM size specs, disk and storage performance limits, quotas, lifecycle/deprecation - rules, and operational behaviors to plan capacity, performance, and compliance - for Azure VMs. + limits-quotas: VM size specs, disk/storage performance limits, GPU/HPC capacities, + lifecycle/retirement, and regional vCPU/host quotas for planning, scaling, and + sizing Azure VMs. troubleshooting: 'Diagnosing and fixing Azure VM issues: hibernation, disk encryption, extensions, NSG blocking, Spot/scale set errors, Image Builder, kernel/packages, Trusted Launch, and gallery images.' @@ -22,26 +22,28 @@ category_descriptions: integrations: 'Managing and automating Azure VMs with CLI/PowerShell/REST: backups, snapshots, disk encryption, maintenance/availability monitoring, Key Vault, networking, and Oracle DB integration.' 
- decision-making: Guidance for choosing VM/disk types, costs, licensing, DNS/images, - and backup/DR, plus detailed migration and retirement paths for VM series, disks, - OSes, GPU/HPC, Oracle, and WebLogic workloads + decision-making: Guidance for choosing VM/disk types, costs, reservations, and backup/DR, + plus migration and retirement planning (VM sizes, GPU/HPC, OS images, Oracle/WebLogic, + AWS/other platforms). best-practices: Performance, scaling, HA, and cost-optimization best practices for Azure VMs, including HPC/InfiniBand tuning, disks/snapshots, OS-specific tweaks, and Image Builder/boot-time optimization. skill_description: Expert knowledge for Azure Virtual Machines development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. - Use when choosing VM/managed disk SKUs, configuring scale sets, Trusted Launch, - ADE/CMK encryption, or HPC/GPU workloads, and other Azure Virtual Machines related - development tasks. Not for Azure Data Science Virtual Machines (use azure-data-science-vm), - Azure Virtual Machine Scale Sets (use azure-vm-scalesets), SQL Server on Azure Virtual - Machines (use azure-sql-virtual-machines), Azure Cloud Services (use azure-cloud-services). -use_when: Use when choosing VM/managed disk SKUs, configuring scale sets, Trusted - Launch, ADE/CMK encryption, or HPC/GPU workloads, and other Azure Virtual Machines - related development tasks. + Use when choosing VM/SSD/GPU SKUs, using scale sets, Trusted Launch, ADE/CMK, or + automating via CLI/ARM/Bicep, and other Azure Virtual Machines related development + tasks. Not for Azure Data Science Virtual Machines (use azure-data-science-vm), + SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines), Azure Virtual + Machine Scale Sets (use azure-vm-scalesets), Azure Kubernetes Service (AKS) (use + azure-kubernetes-service). 
+use_when: Use when choosing VM/SSD/GPU SKUs, using scale sets, Trusted Launch, ADE/CMK, + or automating via CLI/ARM/Bicep, and other Azure Virtual Machines related development + tasks. confusable_not_for: Not for Azure Data Science Virtual Machines (use azure-data-science-vm), - Azure Virtual Machine Scale Sets (use azure-vm-scalesets), SQL Server on Azure Virtual - Machines (use azure-sql-virtual-machines), Azure Cloud Services (use azure-cloud-services). + SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines), Azure Virtual + Machine Scale Sets (use azure-vm-scalesets), Azure Kubernetes Service (AKS) (use + azure-kubernetes-service). --- # Azure Virtual Machines Crawl Report @@ -50,8 +52,8 @@ confusable_not_for: Not for Azure Data Science Virtual Machines (use azure-data- - **Total Pages**: 804 - **Fetched**: 804 - **Fetch Failed**: 0 -- **Classified**: 590 -- **Unclassified**: 214 +- **Classified**: 589 +- **Unclassified**: 215 ### Incremental Update - **New Pages**: 0 @@ -70,39 +72,39 @@ confusable_not_for: Not for Azure Data Science Virtual Machines (use azure-data- | decision-making | 64 | 8.0% | | deployment | 20 | 2.5% | | integrations | 45 | 5.6% | -| limits-quotas | 201 | 25.0% | +| limits-quotas | 200 | 24.9% | | security | 75 | 9.3% | | troubleshooting | 21 | 2.6% | -| *(Unclassified)* | 214 | 26.6% | +| *(Unclassified)* | 215 | 26.7% | ## Changes ### Updated Pages +- [Overview](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/nv-family) + - Updated: 2024-11-07T18:01:00.000Z → 2026-04-21T06:04:00.000Z +- [GPU compute migration guide](https://learn.microsoft.com/en-us/azure/virtual-machines/migration/sizes/n-series-migration) + - Updated: 2025-03-17T22:00:00.000Z → 2026-04-21T06:04:00.000Z - [Overview](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/fpga-accelerated/np-family) - - Updated: 2024-08-22T17:37:00.000Z → 2026-04-14T17:03:00.000Z + - Updated: 2026-04-14T17:03:00.000Z → 
2026-04-21T06:04:00.000Z - [NP series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/fpga-accelerated/np-series) - - Updated: 2026-04-02T17:04:00.000Z → 2026-04-13T08:00:00.000Z -- [FPGA Attestation Service](https://learn.microsoft.com/en-us/azure/virtual-machines/field-programmable-gate-arrays-attestation) - - Updated: 2024-11-07T08:00:00.000Z → 2026-04-14T17:03:00.000Z -- [Overview](https://learn.microsoft.com/en-us/azure/virtual-machines/hbv2-series-overview) - - Updated: 2026-02-10T08:00:00.000Z → 2026-04-14T17:03:00.000Z -- [HBv2 series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/high-performance-compute/hbv2-series) - - Updated: 2026-03-10T22:04:00.000Z → 2026-04-14T17:03:00.000Z -- [HC series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/high-performance-compute/hc-series) - - Updated: 2026-04-02T17:04:00.000Z → 2026-04-17T22:03:00.000Z -- [HC series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/hc-series-retirement) - - Updated: 2026-03-18T17:33:00.000Z → 2026-04-17T22:03:00.000Z -- [VM instance size flexibility](https://learn.microsoft.com/en-us/azure/virtual-machines/reserved-vm-instance-size-flexibility) - - Updated: 2026-03-18T06:03:00.000Z → 2026-04-13T17:04:00.000Z -- [Portal](https://learn.microsoft.com/en-us/azure/virtual-machines/disks-enable-host-based-encryption-portal) - - Updated: 2024-10-22T08:00:00.000Z → 2026-04-02T17:04:00.000Z -- [Disk bursting models](https://learn.microsoft.com/en-us/azure/virtual-machines/disk-bursting) - - Updated: 2024-08-23T16:58:00.000Z → 2026-04-15T22:04:00.000Z -- [Overview](https://learn.microsoft.com/en-us/azure/virtual-machines/trusted-launch) - - Updated: 2025-05-14T17:01:00.000Z → 2026-04-17T22:03:00.000Z + - Updated: 2026-04-13T08:00:00.000Z → 2026-04-21T06:04:00.000Z +- [Retired sizes](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/retired-sizes-list) + - Updated: 2025-12-11T23:04:00.000Z → 
2026-04-21T06:04:00.000Z +- [Deprecated images FAQ](https://learn.microsoft.com/en-us/azure/virtual-machines/deprecated-images) + - Updated: 2026-03-26T22:04:00Z → 2026-04-23T06:09:00Z +- [Endorsed distributions](https://learn.microsoft.com/en-us/azure/virtual-machines/linux/endorsed-distros) + - Updated: 2025-02-21T23:01:00.000Z → 2026-04-20T17:03:00.000Z +- [General Purpose SKUs](https://learn.microsoft.com/en-us/azure/virtual-machines/dedicated-host-general-purpose-skus) + - Updated: 2024-08-22T17:37:00.000Z → 2026-04-23T22:04:00.000Z +- [Share a capacity reservation group](https://learn.microsoft.com/en-us/azure/virtual-machines/capacity-reservation-group-share) + - Updated: 2025-11-25T06:02:00.000Z → 2026-04-20T22:05:00.000Z +- [Associate a VM](https://learn.microsoft.com/en-us/azure/virtual-machines/capacity-reservation-associate-vm) + - Updated: 2025-11-24T23:03:00.000Z → 2026-04-20T22:05:00.000Z +- [Associate a scale set - Uniform](https://learn.microsoft.com/en-us/azure/virtual-machines/capacity-reservation-associate-virtual-machine-scale-set) + - Updated: 2025-11-24T23:03:00.000Z → 2026-04-20T22:05:00.000Z - [General-purpose sizes](https://learn.microsoft.com/en-us/azure/virtual-machines/migration/sizes/d-ds-dv2-dsv2-ls-series-migration-guide) - - Updated: 2026-03-11T17:08:00.000Z → 2026-04-14T17:03:00.000Z + - Updated: 2026-04-14T17:03:00.000Z → 2026-04-21T06:04:00.000Z ## Classified Pages @@ -119,7 +121,6 @@ confusable_not_for: Not for Azure Data Science Virtual Machines (use azure-data- | [Ebsv6 series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/memory-optimized/ebsv6-series) | limits-quotas | 0.90 | Explicitly states up to 800,000 IOPS, 14,000 MBps, and max vCPU/RAM; this is a clear numeric limits/quotas page for storage throughput and capacity per VM size. 
| | [FAQ on ephemeral OS disks](https://learn.microsoft.com/en-us/azure/virtual-machines/ephemeral-os-disks-faq) | limits-quotas | 0.90 | FAQ explicitly states maximum size (2 TiB) and non-resizability of ephemeral OS disks, which are precise product limits and behaviors. | | [GPU Optimized SKUs](https://learn.microsoft.com/en-us/azure/virtual-machines/dedicated-host-gpu-optimized-skus) | limits-quotas | 0.90 | Covers hardware specifications and VM packing for GPU-optimized Dedicated Host SKUs, with numeric limits on vCPUs, RAM, and VM counts per host. | -| [General Purpose SKUs](https://learn.microsoft.com/en-us/azure/virtual-machines/dedicated-host-general-purpose-skus) | limits-quotas | 0.90 | Described as hardware specifications and VM packing for each general purpose Dedicated Host SKU. These pages typically contain tables with exact vCPU/RAM counts and how many specific VM sizes fit per host, which are product-specific numeric limits. | | [General troubleshooting steps](https://learn.microsoft.com/en-us/azure/virtual-machines/extensions/troubleshoot) | troubleshooting | 0.90 | Explicit troubleshooting guide; these pages list specific extension error codes/messages, log locations, and resolution steps unique to Azure VM extensions. | | [Key vault for Azure Disk Encryption](https://learn.microsoft.com/en-us/azure/virtual-machines/windows/disk-encryption-key-vault) | security | 0.90 | Explicitly notes required Key Vault access policy option for ADE and other key vault settings; product-specific security configuration details. | | [Lasv3 series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/storage-optimized/lasv3-series) | limits-quotas | 0.90 | Lists exact per-size values (vCPUs, 8 GiB RAM per vCPU, 1.92 TB NVMe per 8 vCPUs, max IOPS/throughput, NICs, disks). These are concrete numeric constraints for this VM family. 
| @@ -191,6 +192,7 @@ confusable_not_for: Not for Azure Data Science Virtual Machines (use azure-data- | [FXmsv2 series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/compute-optimized/fxmsv2-series) | limits-quotas | 0.86 | FXmsv2 series page lists up to 96 vCPU and 1832 GiB RAM plus disk/network limits per size, which are numeric capacity constraints. | | [Fasv7 series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/compute-optimized/fasv7-series) | limits-quotas | 0.86 | Fasv7 series documentation describes exact per-size capacities (full cores, GiB RAM, disk and I/O caps). These numeric constraints are expert knowledge about platform limits. | | [Fsv2 series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/compute-optimized/fsv2-series) | limits-quotas | 0.86 | Fsv2 size page includes detailed numeric specs and maximums (vCPUs, memory, temp disk, max data disks, IOPS, bandwidth) per VM size, which are concrete service limits. | +| [General Purpose SKUs](https://learn.microsoft.com/en-us/azure/virtual-machines/dedicated-host-general-purpose-skus) | limits-quotas | 0.86 | The page describes hardware specifications and VM packing details for each general purpose Dedicated Host SKU. This typically includes exact core counts, memory, maximum number/size of VMs per host, and SKU-specific constraints, which are numeric limits and quotas that an LLM wouldn't reliably know from training. | | [HBv3 series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/high-performance-compute/hbv3-series) | limits-quotas | 0.86 | Lists precise values for CPU cores, RAM, memory bandwidth, FLOPS, and network, which are SKU-specific capacity limits. 
| | [HBv4 series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/high-performance-compute/hbv4-series) | limits-quotas | 0.86 | Provides detailed numeric limits such as core counts, RAM, L3 cache size and bandwidth, and memory bandwidth, all specific to this VM series. | | [HBv5 series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/high-performance-compute/hbv5-series) | limits-quotas | 0.86 | HBv5-series description includes precise numeric specs (TB/s memory bandwidth, GB of HBM, core counts, clock speeds, NVMe capacity and throughput). The full page will have a size table with exact per-SKU limits, fitting the limits-quotas category. | @@ -210,7 +212,10 @@ confusable_not_for: Not for Azure Data Science Virtual Machines (use azure-data- | [ND-H200-v5 series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/nd-h200-v5-series) | limits-quotas | 0.86 | VM size spec pages list exact, product-specific numeric details (vCPU counts, RAM, GPU memory, local/remote storage, max NICs, throughput, etc.) that function as hard capacity limits for that SKU. These numeric constraints are not reliably known from training and match the limits-quotas pattern of precise values per size. | | [ND-MI300X-v5 series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/ndmi300xv5-series) | limits-quotas | 0.86 | Describes a specific VM series with detailed hardware and capacity numbers (GPU count, cores, memory, bandwidth, likely full size table). These are SKU-specific numeric constraints that act as limits and are not generally known, fitting limits-quotas. | | [NGads V620 series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/ngadsv620-series) | limits-quotas | 0.86 | Provides exact numeric values for GPU frame buffer, CPU clock speeds, vCPU, RAM, storage, and network, which are SKU-specific limits. 
| +| [NP series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/fpga-accelerated/np-series) | limits-quotas | 0.86 | The NP-series specification page provides detailed per-size hardware specs (CPU model and clock, FPGA model, memory, disk/NIC limits) and explicit retirement timelines for specific SKUs. These are exact, product-specific numeric limits and lifecycle constraints, which align with the limits-quotas category. | | [NVadsA10_v5 series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/nvadsa10v5-series) | limits-quotas | 0.86 | Includes detailed numeric specs such as 1/6 GPU increments, frame buffer sizes, CPU frequencies, and per-size resource tables that define SKU limits. | +| [Overview](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/fpga-accelerated/np-family) | limits-quotas | 0.86 | NP family size pages enumerate each NP SKU with numeric specifications (vCPUs, RAM, FPGA count/type, disk and NIC limits) and include concrete retirement dates and reservation cutoffs. These are precise platform limits and lifecycle dates that qualify as expert numerical constraints, matching limits-quotas. | +| [Overview](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/nv-family) | limits-quotas | 0.86 | NV family size pages list per-SKU expert details such as vCPU count, RAM, GPU model/count, GPU memory, max data disks, NICs, and other numeric constraints in tabular form. These are concrete capacity limits for each VM size that change over time and are not reliably known from training data, fitting the limits-quotas category. | | [Restrict uploading and downloading of your disks](https://learn.microsoft.com/en-us/azure/virtual-machines/disks-secure-upload-download) | security | 0.86 | Describes using Microsoft Entra ID and RBAC roles to gate disk upload/download operations, including required permissions and policy scope—product-specific security configuration. 
| | [Support matrix](https://learn.microsoft.com/en-us/azure/virtual-machines/concepts-restore-points) | limits-quotas | 0.86 | A 'support matrix and limitations' page for a specific Azure feature almost always includes exact, product-specific constraints (supported/unsupported scenarios, regional/OS/VM-type matrices, and numerical limits). These are expert details not reliably known from training and fit the limits-quotas category best. | | [Use templates](https://learn.microsoft.com/en-us/azure/virtual-machines/extensions/dsc-template) | configuration | 0.86 | Explicitly about the Resource Manager template definition for the DSC extension, including property names, types, and allowed values. | @@ -271,7 +276,6 @@ confusable_not_for: Not for Azure Data Science Virtual Machines (use azure-data- | [Dv2 and DSv2 series (11-15)](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/memory-optimized/dv2-dsv2-series-memory) | limits-quotas | 0.82 | This memory-optimized Dv2/Dsv2 page lists exact memory, CPU, and storage characteristics per VM size, defining hard capacity limits for these SKUs; this is detailed numeric, product-specific data. | | [Eadsv5 series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/memory-optimized/eadsv5-series) | limits-quotas | 0.82 | Eadsv5 series with specific local storage improvements and disk types; full page includes numeric spec tables that define capacity limits per VM size. | | [Easv5 series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/memory-optimized/easv5-series) | limits-quotas | 0.82 | Easv5 series with defined processor types and memory focus; series pages list numeric vCPU, RAM, and disk limits per SKU. 
| -| [GPU compute migration guide](https://learn.microsoft.com/en-us/azure/virtual-machines/migration/sizes/n-series-migration) | decision-making | 0.82 | Migration guide for NC/NCv2/ND with concrete recommendations on newer GPU SKUs and rationale (performance, retirement), directly supporting technology selection decisions. | | [Linux](https://learn.microsoft.com/en-us/azure/virtual-machines/extensions/extensions-rmpolicy-howto-cli) | security | 0.82 | Shows concrete Azure Policy definitions and JSON conditions targeting specific extension types, which are product-specific security configuration patterns. | | [NV series migration guide](https://learn.microsoft.com/en-us/azure/virtual-machines/migration/sizes/nv-series-migration-guide) | decision-making | 0.82 | Explicit migration guide comparing legacy NV/NV_Promo to NVsv3 and NVasv4 with performance and cost considerations, providing concrete recommendations for which series to choose. | | [Upload a vhd to a disk - CLI](https://learn.microsoft.com/en-us/azure/virtual-machines/linux/disks-upload-vhd-to-managed-disk-cli) | limits-quotas | 0.82 | Defines direct upload limit of 32 TiB and enumerates supported disk types (Ultra, Premium v2, etc.), which are SKU-specific capability limits. | @@ -321,14 +325,12 @@ confusable_not_for: Not for Azure Data Science Virtual Machines (use azure-data- | [Monitoring data reference](https://learn.microsoft.com/en-us/azure/virtual-machines/monitor-vm-reference) | configuration | 0.80 | Reference article listing specific metrics, log categories, and their meanings; detailed product-specific monitoring schema. | | [ND series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/nd-series) | limits-quotas | 0.80 | Even though retired, the page documents exact historical limits (P40 GPUs, CPU model, memory, per-size capacities) in tables. 
| | [NM family](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/fpga-accelerated/nm-ads-ma35d-series) | limits-quotas | 0.80 | Gives concrete numeric resources (vCPUs, RAM, temporary storage, network bandwidth, VPU count and memory) that define the capacity limits of this VM series. | -| [NP series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/fpga-accelerated/np-series) | limits-quotas | 0.80 | NP-series size documentation provides detailed per-SKU hardware specs (CPU model, core counts, memory, FPGA details, storage and network characteristics) as numeric limits, fitting the limits-quotas category. | | [NV-series migration guide](https://learn.microsoft.com/en-us/azure/virtual-machines/migration/sizes/nv-series-migration-guide) | decision-making | 0.80 | Explains migration from NV/NV_Promo to newer GPU series with performance and cost implications; SKU selection guidance. | | [Networking options](https://learn.microsoft.com/en-us/azure/virtual-machines/linux/image-builder-networking) | configuration | 0.80 | The page explains how AIB deploys ACI and build VMs into specific subnets within a VNet and that they must be on separate subnets, implying concrete network configuration requirements (subnet associations, staging resource group behavior). These are product-specific networking configuration details rather than generic networking concepts, aligning with the configuration sub-skill. | | [Oracle Database on an Azure Linux VM using Azure backup](https://learn.microsoft.com/en-us/azure/virtual-machines/workloads/oracle/oracle-database-backup-azure-backup) | configuration | 0.80 | Describes how to configure Azure Backup for Oracle VMs, including Recovery Services vault settings and application-consistent backup options, which are product-specific configuration details. 
| | [Oracle Database on an Azure Linux VM using Azure files](https://learn.microsoft.com/en-us/azure/virtual-machines/workloads/oracle/oracle-database-backup-azure-storage) | configuration | 0.80 | Includes detailed steps and parameters for mounting Azure Files via SMB and configuring RMAN backup destinations, which are concrete Azure–Oracle integration settings. | | [Overview](https://learn.microsoft.com/en-us/azure/virtual-machines/linux/disk-encryption-overview-aad) | security | 0.80 | Prereqs for ADE with Entra app include specific app registrations, permissions, and key vault access policies—product-specific security configuration details. | | [Overview](https://learn.microsoft.com/en-us/azure/virtual-machines/metadata-security-protocol/overview) | security | 0.80 | Security-focused feature for IMDS and WireServer with specific behavior and constraints; product-specific security configuration concepts. | -| [Overview](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/fpga-accelerated/np-family) | limits-quotas | 0.80 | NP family size pages list per-size vCPU, memory, disk, and throughput specifications in tabular form, which are concrete numeric limits and capacities that qualify as limits-quotas expert knowledge. | | [Overview](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/nc-family) | limits-quotas | 0.80 | NC family page typically includes a matrix of NC sizes with exact GPU counts, vCPUs, RAM, and bandwidth caps, which are numeric capacity limits per SKU. | | [Overview](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/nd-family) | limits-quotas | 0.80 | ND family page typically includes a table of ND sizes with exact GPU counts, vCPUs, RAM, and bandwidth caps, which are numeric capacity limits per SKU. 
| | [Overview](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/storage-optimized/l-family) | limits-quotas | 0.80 | L family page typically includes a table of all L-series sizes with exact vCPU, RAM, local disk sizes, and throughput caps, which are numeric capacity limits per SKU. | @@ -340,7 +342,6 @@ confusable_not_for: Not for Azure Data Science Virtual Machines (use azure-data- | [RBAC](https://learn.microsoft.com/en-us/azure/virtual-machines/share-gallery) | security | 0.80 | Focuses on sharing via Azure RBAC; will list specific built-in role names and scope behaviors, which are product-specific security configuration details. | | [Red Hat](https://learn.microsoft.com/en-us/azure/virtual-machines/linux/redhat-create-upload-vhd) | configuration | 0.80 | RHEL-specific preparation across multiple versions and hypervisors; includes detailed OS and Azure configuration steps and possibly constraints. | | [Reserve Disk Storage](https://learn.microsoft.com/en-us/azure/virtual-machines/disks-reserved-capacity) | decision-making | 0.80 | Guides when and how to use disk reservations, with cost and scope considerations; decision-focused content for cost optimization with specific criteria. | -| [Retired sizes](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/retired-sizes-list) | decision-making | 0.80 | Lists all retired/retiring series with replacement recommendations and retirement status, directly supporting SKU selection and migration decisions. | | [SUSE](https://learn.microsoft.com/en-us/azure/virtual-machines/linux/suse-create-upload-vhd) | configuration | 0.80 | SUSE-specific instructions for building and uploading custom VHDs, including configuration for automation and Azure compatibility. | | [Scheduled events](https://learn.microsoft.com/en-us/azure/virtual-machines/windows/scheduled-events) | integrations | 0.80 | Same Scheduled Events feature but Windows-focused; includes endpoint usage and event schema details. 
| | [Soft delete a gallery](https://learn.microsoft.com/en-us/azure/virtual-machines/soft-delete-gallery) | limits-quotas | 0.80 | Explicitly states a 7-day retention period and permanent deletion behavior; this is a concrete time-based limit that qualifies as limits-quotas. | @@ -431,6 +432,7 @@ confusable_not_for: Not for Azure Data Science Virtual Machines (use azure-data- | [Av1-series retirement](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/av1-series-retirement) | decision-making | 0.74 | Retirement notice with a specific date and recommendation to move to Av2; includes concrete behavioral changes (deallocation, billing) guiding migration decisions. | | [Enable FIPS 140-3 on Linux VM Agent](https://learn.microsoft.com/en-us/azure/virtual-machines/extensions/agent-linux-fips) | security | 0.74 | The article describes how to opt in to FIPS 140-3 support for Azure Linux VM extensions and the guest agent, including product-specific security behavior and configuration steps tied to FIPS-compliant encryption algorithms. This is security-focused configuration guidance that is specific to Azure VM extensions and the Linux guest agent, and not generic FIPS theory or marketing. | | [Ev3 and Esv3 series](https://learn.microsoft.com/en-us/azure/virtual-machines/ev3-esv3-series) | limits-quotas | 0.74 | Ev3/Esv3 spec page includes numeric per-size resource tables that act as limits/quotas for these VM series. | +| [General-purpose sizes](https://learn.microsoft.com/en-us/azure/virtual-machines/migration/sizes/d-ds-dv2-dsv2-ls-series-migration-guide) | decision-making | 0.74 | A migration guide for retired VM size series typically includes SKU-by-SKU mapping, comparison tables, and guidance on which new VM series to select based on workload, cost, and performance. This is product-specific decision guidance with concrete recommendations for different scenarios (general purpose, storage optimized, HPC HBv2), fitting the decision-making category. 
| | [Handle credentials](https://learn.microsoft.com/en-us/azure/virtual-machines/extensions/dsc-credentials) | security | 0.74 | Covers secure credential handling in DSC extension, typically including specific configuration fields (protectedSettings, certificate thumbprints) and security patterns. | | [Chef](https://learn.microsoft.com/en-us/azure/virtual-machines/extensions/chef) | configuration | 0.72 | Chef extension docs typically list extension settings (server URL, validation key, runlist) and deployment parameters unique to this extension. | | [Configure and optimize VMs](https://learn.microsoft.com/en-us/azure/virtual-machines/configure) | best-practices | 0.72 | Provides tuning and optimization guidance for specific VM series, including product-specific recommendations and gotchas for achieving HPC performance. | @@ -473,7 +475,7 @@ confusable_not_for: Not for Azure Data Science Virtual Machines (use azure-data- | [Dedicated Host SKU Migration](https://learn.microsoft.com/en-us/azure/virtual-machines/migration/dedicated-host-migration-guide) | decision-making | 0.70 | Migration guide for retiring Dedicated Host SKUs; likely includes tables mapping old SKUs to recommended new SKUs and guidance for planning and executing migration, which is product-specific decision-making content. | | [Deploy a Virtual Machine or Virtual Machine Scale Sets With MSP](https://learn.microsoft.com/en-us/azure/virtual-machines/metadata-security-protocol/greenfield) | security | 0.70 | How-to configuration for enabling Metadata Security Protocol and GuestProxyAgent during VM/VMSS creation, including OS-specific behavior; product-specific security configuration details. | | [Deploy an Ultra Disk](https://learn.microsoft.com/en-us/azure/virtual-machines/disks-enable-ultra-ssd) | configuration | 0.70 | How-to for deploying Ultra Disks; full article includes disk performance configuration parameters (IOPS, throughput) and constraints unique to Ultra Disks. 
| -| [Deprecated images FAQ](https://learn.microsoft.com/en-us/azure/virtual-machines/deprecated-images) | limits-quotas | 0.70 | The FAQ-style page contains product-specific lifecycle rules and timing details for deprecated Azure Marketplace images (for example, how long images remain deployable after deprecation and what operations are blocked), which are effectively service limits/constraints that are not obvious from general knowledge. These are concrete behavioral limits tied to the platform rather than conceptual guidance. | +| [Deprecated images FAQ](https://learn.microsoft.com/en-us/azure/virtual-machines/deprecated-images) | decision-making | 0.70 | Provides product-specific behavioral rules about how deprecated Marketplace images affect deployments, especially those with purchase plans, and clarifies enforcement behavior and scenarios. This is specialized decision guidance for choosing and managing images rather than generic concepts, but it doesn't focus on limits, configuration tables, or error-code troubleshooting. | | [Direct share](https://learn.microsoft.com/en-us/azure/virtual-machines/share-gallery-direct) | security | 0.70 | Covers direct sharing with subscriptions/tenants; relies on RBAC and access configuration specifics, fitting security configuration patterns. | | [Disable MSP](https://learn.microsoft.com/en-us/azure/virtual-machines/metadata-security-protocol/other-examples/disable) | security | 0.70 | Explains how to turn off MSP using REST API and portal; concrete security configuration operations specific to this feature. 
| | [Disk encryption FAQ](https://learn.microsoft.com/en-us/azure/virtual-machines/linux/disk-encryption-faq) | security | 0.70 | FAQ pages for Azure Disk Encryption typically include product-specific security details such as required Key Vault configurations, supported encryption settings, OS/version-specific behaviors, and concrete answers to edge cases (for example, how ADE interacts with certain VM sizes, key rotation, or passphrase handling). These are security-configuration details unique to Azure Disk Encryption rather than generic concepts, so they qualify as expert knowledge under the security sub-skill. | @@ -488,10 +490,10 @@ confusable_not_for: Not for Azure Data Science Virtual Machines (use azure-data- | [Enable InfiniBand](https://learn.microsoft.com/en-us/azure/virtual-machines/extensions/enable-infiniband) | configuration | 0.70 | How-to for enabling InfiniBand on RDMA-capable HB/N-series VMs; typically includes specific extension names, settings, and required configuration parameters unique to these VMs. | | [Enable MSP on Existing Virtual Machine or Virtual Machine Scale Sets](https://learn.microsoft.com/en-us/azure/virtual-machines/metadata-security-protocol/brownfield) | security | 0.70 | Details concrete methods (portal, ARM template, REST API) to enable MSP on existing resources; product-specific security configuration steps. | | [Enable shared disks](https://learn.microsoft.com/en-us/azure/virtual-machines/disks-shared-enable) | configuration | 0.70 | Procedural article for enabling shared disks; includes specific disk settings and constraints required to attach a disk to multiple VMs. | -| [Endorsed distributions](https://learn.microsoft.com/en-us/azure/virtual-machines/linux/endorsed-distros) | decision-making | 0.70 | Summarizes image sources and endorsed distros, with implications for SLA and support; helps decide which distro/source to use. 
| | [Export extensions](https://learn.microsoft.com/en-us/azure/virtual-machines/extensions/export-templates) | deployment | 0.70 | Details how export handles VM extensions, including constraints around protected settings and compatibility, which are deployment-specific behaviors. | | [FAQs](https://learn.microsoft.com/en-us/azure/virtual-machines/workloads/oracle/oracle-azure-vms-faq) | limits-quotas | 0.70 | FAQ includes concrete recommendations for VM types, HA/DR behaviors, and likely numeric or tightly-scoped constraints specific to Oracle on Azure VMs. | | [Flatcar Container Linux](https://learn.microsoft.com/en-us/azure/virtual-machines/linux/flatcar-create-upload-vhd) | configuration | 0.70 | Describes obtaining and preparing Flatcar images, including supported channels and any Azure-specific configuration steps. | +| [GPU compute migration guide](https://learn.microsoft.com/en-us/azure/virtual-machines/migration/sizes/n-series-migration) | decision-making | 0.70 | The migration guide discusses retiring NC/ND/NP-series SKUs and recommends newer GPU SKUs, including product-specific guidance on when and how to move workloads. It contains Azure-specific migration and selection recommendations between GPU VM families, which are decision criteria and trade-offs unique to this product, fitting decision-making. | | [Get usage metrics with REST](https://learn.microsoft.com/en-us/azure/virtual-machines/linux/metrics-vm-usage-rest) | integrations | 0.70 | Shows concrete REST calls and parameters for Azure Monitor metrics on VMs; product-specific API usage and parameterization. | | [Govern and enforce VM Applications using Azure Policy](https://learn.microsoft.com/en-us/azure/virtual-machines/vm-applications-inject-with-policy) | security | 0.70 | Uses Azure Policy with VM Applications, which typically involves specific policy definitions, effect types, and resource property paths; these are product-specific security/governance configurations. 
| | [HC series](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/hc-series-retirement) | decision-making | 0.70 | Retirement guidance for HC-series VMs includes concrete retirement dates, purchase cut-off dates, and recommended alternative VM series (HBv5, HBv4, HX, HBv3) with migration guidance. This is product- and time-specific decision content to choose replacement SKUs and plan migration, matching decision-making. | @@ -536,7 +538,6 @@ confusable_not_for: Not for Azure Data Science Virtual Machines (use azure-data- | [Overview](https://learn.microsoft.com/en-us/azure/virtual-machines/linux/disk-encryption-overview) | security | 0.70 | Step-by-step instructions for enabling ADE on Linux VMs; includes key vault, extension, and policy configuration details. | | [Overview](https://learn.microsoft.com/en-us/azure/virtual-machines/maintenance-configurations) | configuration | 0.70 | Explains Maintenance Configurations feature, scopes, and behavior for sensitive workloads; includes Azure-specific configuration concepts and constraints. | | [Overview](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/ng-family) | limits-quotas | 0.70 | Family list page for NG sizes typically includes a table of sizes with exact vCPU, RAM, GPU, and storage values, which are numeric resource limits. | -| [Overview](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/nv-family) | limits-quotas | 0.70 | Family overview that normally includes a size table with exact vCPU, RAM, GPU, and storage values, which are numeric capacity limits. | | [Overview](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/high-performance-compute/hb-family) | limits-quotas | 0.70 | HB sub-family overview that typically includes or links to tables of sizes with exact vCPU, RAM, and bandwidth values, which are numeric capacity limits. 
| | [Overview](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/high-performance-compute/hc-family) | limits-quotas | 0.70 | HC sub-family overview that typically includes or links to size tables with exact vCPU, RAM, and bandwidth values, which are numeric limits. | | [Overview](https://learn.microsoft.com/en-us/azure/virtual-machines/spot-vms) | decision-making | 0.70 | Spot VM docs typically include eviction policies, pricing/discount behavior, and concrete guidance on when to choose Spot vs regular VMs for different workloads. This is product-specific decision guidance with trade-offs (cost vs reliability) rather than just a conceptual overview. | @@ -560,6 +561,7 @@ confusable_not_for: Not for Azure Data Science Virtual Machines (use azure-data- | [RHEL in-place upgrades](https://learn.microsoft.com/en-us/azure/virtual-machines/workloads/redhat/redhat-in-place-upgrade) | deployment | 0.70 | Details Azure-specific implications of in-place OS upgrades (e.g., control-plane/data-plane disconnection, impact on features like auto guest patching), which are product-specific deployment constraints. | | [Red Hat Update Infrastructure in Azure](https://learn.microsoft.com/en-us/azure/virtual-machines/workloads/redhat/redhat-rhui) | configuration | 0.70 | Explains how Azure RHUI works for PAYG RHEL images and how to obtain updates (e.g., using yum), which is Azure-specific configuration and behavior. | | [Reset Latched Key](https://learn.microsoft.com/en-us/azure/virtual-machines/metadata-security-protocol/other-examples/key-reset) | troubleshooting | 0.70 | Describes failure modes when latched keys are lost or mismatched and how to reset them; symptom → cause → resolution guidance specific to MSP. 
| +| [Retired sizes](https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/retired-sizes-list) | decision-making | 0.70 | Page lists specific VM size series that are retired or scheduled for retirement and maps them to replacement series, providing concrete migration guidance and choices between SKUs. This is product- and time-specific information that an LLM won't reliably know from training and directly supports decision-making about which VM sizes to use going forward. | | [SCSI to NVMe for Linux VMs](https://learn.microsoft.com/en-us/azure/virtual-machines/nvme-linux) | configuration | 0.70 | Describes concrete steps and parameters to convert disk controllers on Azure Linux VMs, including device naming and configuration specifics. | | [Salt Minion](https://learn.microsoft.com/en-us/azure/virtual-machines/extensions/salt-minion) | configuration | 0.70 | Salt Minion extension requires specific configuration parameters (master, minion ID, etc.) that are documented and product-specific. | | [Security controls by Azure Policy](https://learn.microsoft.com/en-us/azure/virtual-machines/security-controls-policy-image-builder) | security | 0.70 | Lists specific Azure Policy built-in definitions and compliance controls for Image Builder; these are product-specific security/compliance configurations. | @@ -598,7 +600,6 @@ confusable_not_for: Not for Azure Data Science Virtual Machines (use azure-data- | [Create a disk from a VHD - PowerShell](https://learn.microsoft.com/en-us/azure/virtual-machines/scripts/virtual-machines-powershell-sample-create-managed-disk-from-vhd) | integrations | 0.68 | Includes warnings about creating multiple disks quickly and describes snapshot-based creation behavior—product-specific integration and subtle behavior details. 
| | [Deploy a Premium SSD v2](https://learn.microsoft.com/en-us/azure/virtual-machines/disks-deploy-premium-v2) | limits-quotas | 0.68 | Deployment doc for Premium SSD v2 disks typically includes concrete, product-specific performance and configuration numbers (IOPS, throughput, latency characteristics, regional availability constraints, sector size defaults like 4K, and possibly min/max size or performance settings). These are exact values and constraints that qualify as limits/quotas-style expert knowledge rather than just conceptual guidance. | | [Deploy disks with ARM template](https://learn.microsoft.com/en-us/azure/virtual-machines/using-managed-disks-template-deployments) | integrations | 0.68 | The article provides product-specific ARM template patterns and schema details for defining managed disks versus deprecated unmanaged disks in Azure Virtual Machines and scale sets. It focuses on how to express disk resources and properties in templates, which is a concrete integration/coding pattern with Azure Resource Manager rather than a conceptual overview. These template structures and required fields are expert, product-specific knowledge that an LLM is unlikely to infer without documentation. | -| [General-purpose sizes](https://learn.microsoft.com/en-us/azure/virtual-machines/migration/sizes/d-ds-dv2-dsv2-ls-series-migration-guide) | decision-making | 0.68 | A migration guide for retired VM size series typically includes concrete mappings from old SKUs to recommended new SKUs, with cost/performance trade-offs and scenario-based guidance. This is expert, product-specific decision guidance on which VM series to move to, fitting the decision-making category. 
| | [Generation 1 VM](https://learn.microsoft.com/en-us/azure/virtual-machines/trusted-launch-existing-vm-gen-1) | configuration | 0.68 | How-to guide for upgrading Gen1 VMs to Trusted launch Gen2, including product-specific steps and required settings (security type, Secure Boot, vTPM) that represent concrete configuration knowledge beyond generic concepts. | | [Issues with Python 3-enabled Linux systems](https://learn.microsoft.com/en-us/azure/virtual-machines/extensions/issues-using-vm-extensions-python-3) | best-practices | 0.68 | Discusses product-specific issues with extensions requiring Python 2 vs 3 and recommended patterns, which are concrete gotchas and best practices. | | [Migrate AWS and on-premises disk to Azure](https://learn.microsoft.com/en-us/azure/virtual-machines/windows/on-prem-to-azure) | deployment | 0.68 | Details how to bring VHDs from other platforms into Azure managed disks, including generalized vs specialized handling—cross-platform migration/deployment guidance. | @@ -733,7 +734,6 @@ confusable_not_for: Not for Azure Data Science Virtual Machines (use azure-data- | [PowerShell](https://learn.microsoft.com/en-us/azure/virtual-machines/windows/create-powershell-availability-zone) | 0.45 | PowerShell tutorial for creating a zoned VM; mostly basic commands rather than expert-only configuration or troubleshooting content. | | [PowerShell](https://learn.microsoft.com/en-us/azure/virtual-machines/windows/find-unattached-disks) | 0.45 | PowerShell version of unattached disk cleanup; procedural with Get-AzureDisk usage, but no detailed limits or error-code mappings. | | [PowerShell](https://learn.microsoft.com/en-us/azure/virtual-machines/windows/image-builder-vnet) | 0.45 | Windows variant of using Image Builder with an existing VNet; similar procedural content without structured expert references. 
| -| [Share a capacity reservation group](https://learn.microsoft.com/en-us/azure/virtual-machines/capacity-reservation-group-share) | 0.45 | Describes sharing Capacity Reservation Groups across subscriptions; summary suggests conceptual explanation and basic steps, not detailed RBAC role lists or configuration matrices. | | [Specialized](https://learn.microsoft.com/en-us/azure/virtual-machines/vm-specialized-image-version) | 0.45 | Similar to index 12 but for specialized images; mostly step-by-step creation instructions, not expert reference data. | | [Ubuntu](https://learn.microsoft.com/en-us/azure/virtual-machines/workloads/canonical/ubuntu-pro-in-place-upgrade) | 0.45 | In-place upgrade to Ubuntu Pro; likely includes steps and some constraints, but from the summary we cannot confirm detailed limits, config tables, or security roles. | | [Use internal DNS](https://learn.microsoft.com/en-us/azure/virtual-machines/linux/static-dns-name-resolution-for-linux-on-azure) | 0.45 | Shows how to set static internal DNS names using vNICs and CLI; procedural configuration without broad decision matrices or numeric limits. | @@ -796,9 +796,7 @@ confusable_not_for: Not for Azure Data Science Virtual Machines (use azure-data- | [2 - Add an Azure Linux with OS Guard node pool to your existing cluster](https://learn.microsoft.com/en-us/azure/azure-linux/tutorial-azure-linux-os-guard-add-node-pool) | 0.35 | Tutorial for adding OS Guard node pool; procedural, not a configuration or troubleshooting reference. | | [ARM template](https://learn.microsoft.com/en-us/azure/virtual-machines/windows/ps-template) | 0.35 | Template-based Windows VM creation tutorial; focuses on using a sample ARM template and PowerShell, not on enumerating configuration options or limits. | | [Assign public DNS name](https://learn.microsoft.com/en-us/azure/virtual-machines/create-fqdn) | 0.35 | Portal steps to create FQDN for a VM; simple configuration, no detailed parameter tables or limits. 
| -| [Associate a VM](https://learn.microsoft.com/en-us/azure/virtual-machines/capacity-reservation-associate-vm) | 0.35 | How-to associate a VM to a capacity reservation group; likely a short procedural article without extensive configuration or limits. | | [Associate a scale set - Flexible](https://learn.microsoft.com/en-us/azure/virtual-machines/capacity-reservation-associate-virtual-machine-scale-set-flex) | 0.35 | How-to associate a Flexible orchestration scale set with a capacity reservation group; procedural content rather than configuration reference. | -| [Associate a scale set - Uniform](https://learn.microsoft.com/en-us/azure/virtual-machines/capacity-reservation-associate-virtual-machine-scale-set) | 0.35 | How-to associate a Uniform orchestration scale set with a capacity reservation group; similar procedural focus as 27. | | [Elasticsearch](https://learn.microsoft.com/en-us/azure/virtual-machines/linux/tutorial-elasticsearch) | 0.35 | Tutorial for deploying Elastic Stack on a dev VM; step-by-step install, not a configuration reference or best-practices guide with quantified impact. | | [HPC for HB-series and N-series VMs](https://learn.microsoft.com/en-us/azure/virtual-machines/overview-hb-hc) | 0.35 | Appears to be a conceptual/marketing-style overview of HB/N-series capabilities and workloads, not detailed configuration, limits, or troubleshooting content. | | [How to set up HPC/AI VMs](https://learn.microsoft.com/en-us/azure/virtual-machines/set-up-hpc-vms) | 0.35 | How-to for setting up HPC/AI VMs with GPUs via portal; basic creation steps, no detailed configuration parameter tables or quotas. | @@ -869,6 +867,8 @@ confusable_not_for: Not for Azure Data Science Virtual Machines (use azure-data- | [ARM template](https://learn.microsoft.com/en-us/azure/virtual-machines/linux/quick-create-template) | 0.20 | ARM template quickstart for Ubuntu VM; example deployment only, no detailed parameter tables or limits. 
| | [ARM template](https://learn.microsoft.com/en-us/azure/virtual-machines/windows/quick-create-template) | 0.20 | ARM template quickstart for Windows VM; example template only, no expert-level configuration catalog or limits. | | [About the Azure Linux Container Host for AKS](https://learn.microsoft.com/en-us/azure/azure-linux/intro-azure-linux) | 0.20 | Introductory overview of Azure Linux Container Host; conceptual and marketing-style content. | +| [Associate a VM](https://learn.microsoft.com/en-us/azure/virtual-machines/capacity-reservation-associate-vm) | 0.20 | Described as a guide to associate a VM to a capacity reservation group. This is primarily procedural content without indication of numeric limits, detailed configuration parameter tables, or troubleshooting mappings. | +| [Associate a scale set - Uniform](https://learn.microsoft.com/en-us/azure/virtual-machines/capacity-reservation-associate-virtual-machine-scale-set) | 0.20 | Covers associating a Uniform orchestration VM scale set to a capacity reservation group. The summary indicates a scenario-specific how-to, not expert-level limits, configuration matrices, or error-code-based troubleshooting content. | | [Azure Regions](https://learn.microsoft.com/en-us/azure/virtual-machines/regions) | 0.20 | Overview of Azure regions and availability; largely conceptual without numeric limits, config tables, or decision matrices. | | [Bicep](https://learn.microsoft.com/en-us/azure/virtual-machines/linux/quick-create-bicep) | 0.20 | Bicep quickstart for Ubuntu VM; focuses on example template, not exhaustive configuration options or expert constraints. | | [Bicep](https://learn.microsoft.com/en-us/azure/virtual-machines/windows/quick-create-bicep) | 0.20 | Bicep quickstart for Windows VM; focuses on example deployment, not detailed configuration options or quotas. 
| @@ -880,6 +880,7 @@ confusable_not_for: Not for Azure Data Science Virtual Machines (use azure-data- | [Create using Azure CLI](https://learn.microsoft.com/en-us/azure/azure-compute-fleet/quickstart-create-azure-cli) | 0.20 | CLI quickstart for creating a Compute Fleet; procedural, without detailed config matrices or limits. | | [Create with ARM template](https://learn.microsoft.com/en-us/azure/azure-compute-fleet/quickstart-create-rest-api) | 0.20 | ARM template quickstart; focuses on basic deployment flow, not on exhaustive configuration options or limits. | | [Delete a VM and its resources](https://learn.microsoft.com/en-us/azure/virtual-machines/delete) | 0.20 | How-to page for deleting a VM and its attached resources; focuses on behavior and UI options, not on numeric limits, configuration parameter tables, error-code-based troubleshooting, or decision matrices. Does not meet any sub-skill expert-knowledge criteria. | +| [Endorsed distributions](https://learn.microsoft.com/en-us/azure/virtual-machines/linux/endorsed-distros) | 0.20 | Primarily a conceptual/overview page describing Linux image sources and endorsed distributions on Azure. It summarizes image types and partnerships but does not present concrete limits, configuration parameters, error codes, or detailed decision matrices with quantified trade-offs. | | [Get started with WebLogic Server on Azure VMs](https://learn.microsoft.com/en-us/azure/virtual-machines/workloads/oracle/weblogic-server-azure-virtual-machine) | 0.20 | Quickstart for deploying WebLogic Server on Azure VMs is primarily a step-by-step tutorial using a marketplace offer. It focuses on basic deployment flow rather than enumerating configuration matrices, limits, or product-specific best-practice patterns with quantified impact. 
| | [Isolated image builds](https://learn.microsoft.com/en-us/azure/virtual-machines/security-isolated-image-builds-image-builder) | 0.20 | The summary describes Isolated Image Builds conceptually (moving customization/validation to dedicated ACI resources for isolation) but does not indicate specific configuration parameters, limits, error codes, or decision matrices. It reads as a feature/behavior overview rather than detailed expert configuration or troubleshooting guidance. | | [Migrate to managed disks](https://learn.microsoft.com/en-us/azure/virtual-machines/windows/migrate-to-managed-disks) | 0.20 | Primarily a migration/how-to article for moving from unmanaged to managed disks. From the summary, it does not clearly expose configuration tables, limits, quotas, or detailed error-code-based troubleshooting. Without evidence of specific numeric limits, config parameter tables, or decision matrices, it does not meet the expert-knowledge criteria for any sub-skill type. | @@ -901,6 +902,7 @@ confusable_not_for: Not for Azure Data Science Virtual Machines (use azure-data- | [Red Hat](https://learn.microsoft.com/en-us/azure/virtual-machines/workloads/redhat/overview) | 0.20 | Overview of Red Hat workloads on Azure; high-level product listing without detailed configuration, limits, or decision matrices. | | [Regional to zonal move guide](https://learn.microsoft.com/en-us/azure/virtual-machines/move-virtual-machines-regional-zonal-portal) | 0.20 | Primarily a how-to migration guide for moving VMs from regional to zonal deployments via portal/PowerShell/CLI. Does not focus on limits, configuration matrices, error-code troubleshooting, or other expert-knowledge patterns defined in the sub-skill types. | | [Set up Azure HPC or AI VMs](https://learn.microsoft.com/en-us/azure/virtual-machines/set-up-hpc-vms) | 0.20 | How-to create a basic VM via portal; likely step-by-step tutorial without detailed config matrices or numeric limits. 
| +| [Share a capacity reservation group](https://learn.microsoft.com/en-us/azure/virtual-machines/capacity-reservation-group-share) | 0.20 | Appears to be a how-to/tutorial for sharing a Capacity Reservation Group across subscriptions. The summary suggests step-by-step usage, not detailed limits, configuration matrices, or error-code-based troubleshooting. Likely no dense expert-only configuration tables or numeric constraints. | | [Support and help](https://learn.microsoft.com/en-us/azure/azure-linux/support-help) | 0.20 | Support/help options page; mostly meta-information about getting assistance, not technical configuration or troubleshooting content. | | [Support and troubleshooting](https://learn.microsoft.com/en-us/azure/virtual-machines/vm-support-help) | 0.20 | Support/help options page is primarily navigational and process-oriented, without technical limits, configuration parameters, or error-code troubleshooting. | | [Terraform VM](https://learn.microsoft.com/en-us/azure/virtual-machines/linux/quick-create-terraform) | 0.20 | Terraform quickstart for Linux VM; shows basic resource definitions but not a comprehensive configuration reference or expert-only patterns. 
| diff --git a/products/azure-virtual-network/azure-virtual-network.csv b/products/azure-virtual-network/azure-virtual-network.csv index 43690780..860e4a3a 100644 --- a/products/azure-virtual-network/azure-virtual-network.csv +++ b/products/azure-virtual-network/azure-virtual-network.csv @@ -15,7 +15,7 @@ https://learn.microsoft.com/en-us/azure/virtual-network/create-peering-different https://learn.microsoft.com/en-us/azure/virtual-network/create-virtual-machine-accelerated-networking,Create a VM with Accelerated Networking,Create an Azure Virtual Machine with Accelerated Networking,,"Use Azure portal, Azure CLI, or PowerShell to create Linux or Windows virtual machines with Accelerated Networking enabled for improved network performance.",This article describes how to create a Linux or Windows virtual machine (VM) with Accelerated Networking (AccelNet) enabled by using the Azure CLI command-line interface.,2026-02-20T23:21:00.000Z,how-to,,0.4,False,Shows how to create VMs with Accelerated Networking via portal/CLI/PowerShell; largely procedural without config parameter reference tables or limits.,unchanged https://learn.microsoft.com/en-us/azure/virtual-network/deploy-container-networking,Deploy container networking,Deploy Azure virtual network container networking,,Learn how to deploy the Azure Virtual Network container network interface (CNI) plug-in for Kubernetes clusters.,"The Azure Virtual Network container network interface (CNI) plug-in installs in an Azure virtual machine and brings virtual network capabilities to Kubernetes Pods and Docker containers. To learn more about the plug-in, see Enable containers to use Azure Virtual Network capabilities. 
Additionally, the plug-in can be used with the Azure Kubernetes Service (AKS) by choosing the Advanced Networking option, which automatically places AKS containers in a virtual network.",2023-03-26T00:00:00.000Z,how-to,,0.35,False,Deployment guide for Azure CNI in Kubernetes; largely tutorial-style without comprehensive configuration reference tables or troubleshooting mappings.,unchanged https://learn.microsoft.com/en-us/azure/virtual-network/deploy-container-networking-docker-linux,Deploy container networking for a stand-alone host,Deploy container networking for a stand-alone Linux Docker host - Azure Virtual Network,,Learn how to deploy the Azure CNI plug-in to enable container virtual network connectivity for a standalone Linux Docker host.,"The Azure CNI plugin enables per container/pod networking for stand-alone docker hosts and Kubernetes clusters. In this article, you learn how to install and configure the CNI plugin for a standalone Linux Docker host.",2026-02-24T06:10:00.000Z,how-to,,0.4,False,Shows how to deploy Azure CNI on a standalone Docker host; focused on step-by-step setup rather than full configuration matrices or limits.,unchanged -https://learn.microsoft.com/en-us/azure/virtual-network/how-to-configure-subnet-peering,Configure subnet peering,Configure subnet peering - Azure Virtual Network,,Learn how to configure subnet peering for an Azure virtual network.,"Subnet peering connects two virtual networks by linking specific subnets instead of entire virtual network address spaces. This approach gives you granular control over which subnets participate in the peering relationship between local and remote virtual networks. Subnet peering adds flexibility to virtual network peering. You can choose specific subnets to peer across virtual networks. You specify or enter the list of subnets across the virtual networks that you want to peer. 
In contrast, regu",2025-12-15T23:10:00.000Z,how-to,,0.35,False,"How-to for configuring subnet peering; focuses on steps and basic behavior, not deep configuration references or decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/virtual-network/how-to-configure-subnet-peering,Configure subnet peering,Configure subnet peering - Azure Virtual Network,,Learn how to configure subnet peering for an Azure virtual network.,"Subnet peering connects two virtual networks by linking specific subnets instead of entire virtual network address spaces. This approach gives you granular control over which subnets participate in the peering relationship between local and remote virtual networks. Subnet peering adds flexibility to virtual network peering. You can choose specific subnets to peer across virtual networks. You specify or enter the list of subnets across the virtual networks that you want to peer. In contrast, regu",2026-04-22T17:34:00.000Z,how-to,,0.2,False,"Page is a how-to guide for configuring subnet peering in Azure Virtual Network. It appears to be procedural (UI/CLI steps) without tables of configuration parameters, numeric limits, error-code-based troubleshooting, or decision matrices. It does not match the specific expert-knowledge patterns for limits, configuration, troubleshooting, or decision-making as defined.",updated https://learn.microsoft.com/en-us/azure/virtual-network/how-to-create-encryption,Create a virtual network with encryption,Create a virtual network with encryption - Azure Virtual Network,Create and configure an encrypted Azure virtual network,Learn how to create an encrypted virtual network. A virtual network lets Azure resources communicate with each other and the internet.,"Azure Virtual Network encryption is a feature of Azure Virtual Network. With Virtual Network encryption, you can seamlessly encrypt and decrypt internal network traffic over the wire, with minimal effect to performance and scale. 
Virtual Network encryption protects data that traverses your virtual network from virtual machine to virtual machine.",2024-09-25T22:03:00.000Z,how-to,configuration,0.65,True,How-to for creating an encrypted VNet; likely includes specific settings and parameters required to enable VNet encryption.,unchanged https://learn.microsoft.com/en-us/azure/virtual-network/how-to-create-virtual-network-routing-appliance,Create a routing appliance,Create a Routing Appliance - Azure Virtual Network,Register and create Azure Virtual Network routing appliances,"This guide covers registration, configuration, and troubleshooting for the preview of Azure Virtual Network routing appliances.","This article explains how to register your subscription for the preview of Azure Virtual Network routing appliances and how to create a routing appliance in the Azure portal. Use the preview for testing, evaluation, and feedback. It doesn't support production workloads. Important Azure Virtual Network routing appliances are currently in preview. For legal terms that apply to Azure features that are in beta, in preview, or otherwise not yet released into general availability, see the Supplemental ",2026-03-02T23:28:00.000Z,how-to,troubleshooting,0.65,True,"How-to article for preview registration and creation that explicitly includes a troubleshooting section. 
Preview registration and setup issues typically involve specific error messages and resolution steps that are product- and preview-specific, which qualify as expert troubleshooting knowledge beyond generic debugging guidance.",unchanged https://learn.microsoft.com/en-us/azure/virtual-network/how-to-dhcp-azure,Deploy a DHCP server on an Azure VM,Deploy a DHCP server in Azure on a virtual machine - Azure Virtual Network,Deploy a DHCP server VM for on-premises clients,Learn about how to deploy a Dynamic Host Configuration Protocol (DHCP) server in Azure on a virtual machine as a target for an on-premises DHCP relay agent.,"Learn how to deploy a highly available DHCP server in Azure on a virtual machine. This server is used as a target for an on-premises DHCP relay agent to provide dynamic IP address allocation to on-premises clients. The DHCP relay agent forwards unicast DHCP requests from on-premises clients to the DHCP servers running in Azure. Direct broadcast packets from clients to a DHCP server don't work in an Azure Virtual Network by design. 
Note The on-premises client to DHCP Server (source port UDP/68, d",2026-02-24T06:10:00.000Z,how-to,configuration,0.7,True,"Describes Azure-specific DHCP relay behavior (no broadcasts) and detailed port/protocol requirements plus configuration steps for a highly available DHCP server in Azure, including product-specific networking constraints.",unchanged diff --git a/products/azure-virtual-network/report.md b/products/azure-virtual-network/report.md index 5a2a5757..426acd91 100644 --- a/products/azure-virtual-network/report.md +++ b/products/azure-virtual-network/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-12' +generated_at: '2026-04-26' category_descriptions: best-practices: 'Network performance and connectivity guidance: VNet design, NSGs, service endpoints, outbound access, MTU/TCP tuning, and tools to test throughput @@ -47,8 +47,8 @@ confusable_not_for: Not for Azure Networking (use azure-networking), Azure Virtu ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 0 -- **Unchanged**: 129 +- **Updated Pages**: 1 +- **Unchanged**: 128 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-virtual-network/azure-virtual-network.csv` @@ -67,6 +67,11 @@ confusable_not_for: Not for Azure Networking (use azure-networking), Azure Virtu ## Changes +### Updated Pages + +- [Configure subnet peering](https://learn.microsoft.com/en-us/azure/virtual-network/how-to-configure-subnet-peering) + - Updated: 2025-12-15T23:10:00.000Z → 2026-04-22T17:34:00.000Z + ## Classified Pages | TOC Title | Type | Confidence | Reason | @@ -159,7 +164,6 @@ confusable_not_for: Not for Azure Networking (use azure-networking), Azure Virtu | [Private IPs](https://learn.microsoft.com/en-us/azure/virtual-network/ip-services/private-ip-addresses) | 0.36 | Overview of private IP addresses and scenarios; conceptual networking behavior without explicit limits, settings tables, or troubleshooting content. 
| | [Add or remove a subnet delegation](https://learn.microsoft.com/en-us/azure/virtual-network/manage-subnet-delegation) | 0.35 | How-to for adding/removing subnet delegation; procedural with minimal product-specific configuration tables or decision criteria. | | [Add or remove network interfaces](https://learn.microsoft.com/en-us/azure/virtual-network/virtual-network-network-interface-vm) | 0.35 | Explains adding/removing NICs on VMs; operational steps, not a configuration reference or troubleshooting guide with error codes. | -| [Configure subnet peering](https://learn.microsoft.com/en-us/azure/virtual-network/how-to-configure-subnet-peering) | 0.35 | How-to for configuring subnet peering; focuses on steps and basic behavior, not deep configuration references or decision matrices. | | [Custom IP address prefix (BYOIP)](https://learn.microsoft.com/en-us/azure/virtual-network/ip-services/custom-ip-address-prefix) | 0.35 | Conceptual overview of custom IP address prefix (BYOIP); describes what it is and how it behaves, but summary shows no detailed limits or configuration tables. | | [Deploy container networking](https://learn.microsoft.com/en-us/azure/virtual-network/deploy-container-networking) | 0.35 | Deployment guide for Azure CNI in Kubernetes; largely tutorial-style without comprehensive configuration reference tables or troubleshooting mappings. | | [Different subscriptions and tenants](https://learn.microsoft.com/en-us/azure/virtual-network/create-peering-different-subscriptions) | 0.35 | Tutorial for peering VNets across subscriptions/tenants; procedural steps, no detailed limits, config matrices, or troubleshooting mappings. 
| @@ -197,6 +201,7 @@ confusable_not_for: Not for Azure Networking (use azure-networking), Azure Virtu | [Configure internet routing preference for a virtual machine](https://learn.microsoft.com/en-us/azure/virtual-network/ip-services/configure-routing-preference-virtual-machine) | 0.28 | Tutorial on configuring routing preference for a VM; procedural guidance without detailed numeric thresholds, limits, or troubleshooting mappings. | | [Configure mixed Routing Preference for a VM](https://learn.microsoft.com/en-us/azure/virtual-network/ip-services/routing-preference-mixed-network-adapter-portal) | 0.28 | Tutorial for configuring both routing preferences using two NICs; scenario walkthrough rather than reference-style expert content. | | [Manage a virtual network](https://learn.microsoft.com/en-us/azure/virtual-network/manage-virtual-network) | 0.25 | Basic management article for creating/changing/deleting VNets; generic portal/CLI operations without expert-level configuration or troubleshooting content. | +| [Configure subnet peering](https://learn.microsoft.com/en-us/azure/virtual-network/how-to-configure-subnet-peering) | 0.20 | Page is a how-to guide for configuring subnet peering in Azure Virtual Network. It appears to be procedural (UI/CLI steps) without tables of configuration parameters, numeric limits, error-code-based troubleshooting, or decision matrices. It does not match the specific expert-knowledge patterns for limits, configuration, troubleshooting, or decision-making as defined. | | [Create public IP address - Terraform](https://learn.microsoft.com/en-us/azure/virtual-network/ip-services/create-public-ip-terraform) | 0.20 | Quickstart Terraform tutorial for creating a public IP; primarily step-by-step resource creation with no detailed limits, quotas, configuration matrices, or product-specific best practices beyond generic Terraform usage. 
| | [Create public IP prefix - Terraform](https://learn.microsoft.com/en-us/azure/virtual-network/ip-services/create-public-ip-prefix-terraform) | 0.20 | Quickstart Terraform tutorial for creating a public IP prefix; focuses on basic creation/change/delete workflow without detailed configuration parameter tables, limits, or troubleshooting content. | | [Overview](https://learn.microsoft.com/en-us/azure/virtual-network/accelerated-networking-mana-overview) | 0.20 | Overview of Microsoft Azure Network Adapter (MANA) and its benefits; description suggests conceptual/performance overview without explicit limits, configuration tables, error codes, or decision matrices that meet the expert-knowledge criteria. | diff --git a/products/azure-vm-scalesets/azure-vm-scalesets.csv b/products/azure-vm-scalesets/azure-vm-scalesets.csv index ac2b6b5a..a61d78f8 100644 --- a/products/azure-vm-scalesets/azure-vm-scalesets.csv +++ b/products/azure-vm-scalesets/azure-vm-scalesets.csv @@ -57,7 +57,7 @@ https://learn.microsoft.com/en-us/azure/virtual-machine-scale-sets/use-spot,Azur https://learn.microsoft.com/en-us/azure/virtual-machine-scale-sets/virtual-machine-scale-sets-attach-detach-vm,Attach VMs to a scale set,Attach or detach a virtual machine to or from a Virtual Machine Scale Set - Azure Virtual Machine Scale Sets,,How to attach or detach a virtual machine to or from a Virtual Machine Scale Set,,2025-04-23T08:00:00.000Z,overview,,0.3,False,"Primarily a how-to attach/detach VM tutorial; unlikely to contain detailed config tables, limits, or error-code-based troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machine-scale-sets/virtual-machine-scale-sets-attached-disks,Use data disks with scale sets,Azure Virtual Machine Scale Sets Attached Data Disks - Azure Virtual Machine Scale Sets,Configure attached data disks for VM scale sets,Learn how to use attached data disks with Virtual Machine Scale Sets through outlines of specific use cases.,"To expand your 
available storage, Azure Virtual Machine Scale Sets support VM instances with attached data disks. You can attach data disks when the scale set is created, or to an existing scale set.",2024-08-22T17:37:00.000Z,how-to,configuration,0.7,True,"Attached data disks for VMSS typically include product-specific configuration details (disk types, limits per VM size, ARM template properties, and CLI/PowerShell parameters). These are concrete configuration options and patterns unique to VM scale sets rather than just conceptual storage info.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machine-scale-sets/virtual-machine-scale-sets-automatic-instance-repairs,Automatic instance repairs,Automatic instance repairs with Azure Virtual Machine Scale Sets - Azure Virtual Machine Scale Sets,Configure automatic instance repairs for VM scale sets,Learn how to configure automatic repairs policy for VM instances in a scale set,"Enabling automatic instance repairs for Azure Virtual Machine Scale Sets helps achieve high availability for applications by maintaining a set of healthy instances. 
If an unhealthy instance is found by Application Health extension or Load balancer health probes, automatic instance repairs will attempt to recover the instance by triggering repair actions such as deleting the unhealthy instance and creating a new one to replace it, reimaging the unhealthy instance, or restarting the unhealthy instanc",2025-04-01T08:00:00.000Z,concept-article,configuration,0.75,True,"Details automatic repairs policy, including repair actions (delete, reimage, restart) and how health probes/extensions drive behavior—product-specific configuration and behavior.",unchanged -https://learn.microsoft.com/en-us/azure/virtual-machine-scale-sets/virtual-machine-scale-sets-automatic-upgrade,Overview,Automatic OS image upgrades with Azure Virtual Machine Scale Sets - Azure Virtual Machine Scale Sets,Configure automatic OS image upgrades for VM scale sets,Learn how to automatically upgrade the OS image on virtual machines in a scale set,"Note Many of the steps listed in this document apply to Virtual Machine Scale Sets using Uniform Orchestration mode. We recommend using Flexible Orchestration for new workloads. For more information, see Orchestration modes for Virtual Machine Scale Sets in Azure. Enabling automatic OS image upgrades on your scale set helps ease update management by safely and automatically upgrading the OS disk for all instances in the scale set. Automatic OS upgrade has the following characteristics: Note Befor",2025-01-17T22:59:00.000Z,concept-article,configuration,0.7,True,"Automatic OS upgrade behavior for VMSS includes specific settings (properties in the model, modes, health checks, sequencing) and constraints unique to VMSS. 
These are detailed configuration options and operational behaviors.",unchanged +https://learn.microsoft.com/en-us/azure/virtual-machine-scale-sets/virtual-machine-scale-sets-automatic-upgrade,Overview,Automatic OS image upgrades with Azure Virtual Machine Scale Sets - Azure Virtual Machine Scale Sets,Configure automatic OS image upgrades for VM scale sets,Learn how to automatically upgrade the OS image on virtual machines in a scale set,"Enabling automatic OS image upgrades on your scale set helps ease update management by safely and automatically upgrading the OS disk for all instances in the scale set. Automatic OS upgrade has the following characteristics: Note Before enabling automatic OS image upgrades, check the requirements section of this documentation.",2026-04-21T22:04:00.000Z,concept-article,configuration,0.7,True,"The page describes product-specific settings and requirements for enabling automatic OS image upgrades on Azure Virtual Machine Scale Sets, including configuration options and behavior details that go beyond generic knowledge. It focuses on how to configure and use this feature rather than just conceptual overview, fitting the configuration sub-skill best.",updated https://learn.microsoft.com/en-us/azure/virtual-machine-scale-sets/virtual-machine-scale-sets-autoscale-overview,Overview,Overview of autoscale with Azure Virtual Machine Scale Sets - Azure Virtual Machine Scale Sets,,Learn about the different ways that you can automatically scale an Azure Virtual Machine Scale Set based on performance or on a fixed schedule,An Azure Virtual Machine Scale Set can increase or decrease the number of virtual machines that run your application. Instance count can be updated in several ways: This automated and elastic behavior reduces the management overhead to monitor and optimize the performance of your application. This article provides an overview of which performance metrics are available and what actions autoscale can perform. 
Note Use of autoscaling requires that the scale set is defined with a virtual machine sca,2025-04-01T08:00:00.000Z,overview,,0.35,False,Autoscale overview describes available metrics and actions conceptually; summary does not indicate detailed numeric limits or configuration tables.,unchanged https://learn.microsoft.com/en-us/azure/virtual-machine-scale-sets/virtual-machine-scale-sets-autoscale-portal,Configure autoscale,Autoscale Virtual Machine Scale Sets in the Azure portal - Azure Virtual Machine Scale Sets,Create autoscale rules for VM Scale Sets in Azure portal,How to create autoscale rules for Virtual Machine Scale Sets in the Azure portal,"When you create a scale set, you define the number of VM instances that you wish to run. As your application demand changes, you can automatically increase or decrease the number of VM instances. The ability to autoscale lets you keep up with customer demand or respond to application performance changes throughout the lifecycle of your app. This article shows you how to create autoscale rules in the Azure portal that monitor the performance of the VM instances in your scale set. These autoscale ",2025-04-01T08:00:00.000Z,how-to,configuration,0.6,True,"Portal article for autoscale rules will include specific rule parameters, metric names, and threshold values, which are concrete configuration details.",unchanged https://learn.microsoft.com/en-us/azure/virtual-machine-scale-sets/virtual-machine-scale-sets-change-upgrade-policy,Changing the upgrade policy mode,Change the upgrade policy mode on Virtual Machine Scale Sets - Azure Virtual Machine Scale Sets,Change upgrade policy mode for existing VM scale sets,Learn how to change the upgrade policy mode on Virtual Machine Scale Sets,"The upgrade policy mode for a Virtual Machine Scale Set can be changed at any point in time. 
Depending on your scenario, you may want to use a particular upgrade policy mode when setting up and developing your workload, and once you're ready to move to production, change it to another upgrade policy mode. Select the Virtual Machine Scale Set you want to change the upgrade policy mode for. In the menu under Settings, select Upgrade Policy and from the drop-down menu, select the upgrade policy mode yo",2024-11-19T13:27:00.000Z,how-to,configuration,0.65,True,Explains how to switch upgrade modes post-deployment and implications; includes UI/API configuration steps specific to this product.,unchanged diff --git a/products/azure-vm-scalesets/report.md b/products/azure-vm-scalesets/report.md index 4a48067a..324b6889 100644 --- a/products/azure-vm-scalesets/report.md +++ b/products/azure-vm-scalesets/report.md @@ -1,9 +1,9 @@ --- -generated_at: '2026-04-05' +generated_at: '2026-04-26' category_descriptions: - configuration: 'Configuring VM Scale Sets: scaling rules, upgrades, networking, - disks, images, health/repair, standby pools, instance mix, protection, and automation - via CLI, PowerShell, templates, and portal' + configuration: 'Configuring VM Scale Sets: scaling rules, upgrades, health/repairs, + disks, networking, instance mix, standby pools, protection, and automation via + portal, CLI, PowerShell, and ARM templates.' architecture-patterns: 'Designing resilient VM scale sets: zones, fault domains, zone balancing modes, proximity placement groups, and standby pools to optimize availability, latency, and scale-out behavior.' @@ -28,17 +28,17 @@ category_descriptions: skill_description: Expert knowledge for Azure Virtual Machine Scale Sets development including troubleshooting, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. 
- Use when configuring VMSS autoscale, upgrade modes, zones/PPGs, Spot/standby pools, - or disk encryption with Key Vault, and other Azure Virtual Machine Scale Sets related + Use when configuring autoscale rules, upgrade modes, Spot/standby pools, disk encryption, + or ARM/CLI/PowerShell deployments, and other Azure Virtual Machine Scale Sets related development tasks. Not for Azure Virtual Machines (use azure-virtual-machines), - Azure Kubernetes Service (AKS) (use azure-kubernetes-service), Azure App Service - (use azure-app-service), Azure Service Fabric (use azure-service-fabric). -use_when: Use when configuring VMSS autoscale, upgrade modes, zones/PPGs, Spot/standby - pools, or disk encryption with Key Vault, and other Azure Virtual Machine Scale - Sets related development tasks. + Azure Kubernetes Service (AKS) (use azure-kubernetes-service), Azure Service Fabric + (use azure-service-fabric), Azure App Service (use azure-app-service). +use_when: Use when configuring autoscale rules, upgrade modes, Spot/standby pools, + disk encryption, or ARM/CLI/PowerShell deployments, and other Azure Virtual Machine + Scale Sets related development tasks. confusable_not_for: Not for Azure Virtual Machines (use azure-virtual-machines), Azure - Kubernetes Service (AKS) (use azure-kubernetes-service), Azure App Service (use - azure-app-service), Azure Service Fabric (use azure-service-fabric). + Kubernetes Service (AKS) (use azure-kubernetes-service), Azure Service Fabric (use + azure-service-fabric), Azure App Service (use azure-app-service). 
--- # Azure Virtual Machine Scale Sets Crawl Report @@ -52,8 +52,8 @@ confusable_not_for: Not for Azure Virtual Machines (use azure-virtual-machines), ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 0 -- **Unchanged**: 93 +- **Updated Pages**: 1 +- **Unchanged**: 92 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-vm-scalesets/azure-vm-scalesets.csv` @@ -73,6 +73,11 @@ confusable_not_for: Not for Azure Virtual Machines (use azure-virtual-machines), ## Changes +### Updated Pages + +- [Overview](https://learn.microsoft.com/en-us/azure/virtual-machine-scale-sets/virtual-machine-scale-sets-automatic-upgrade) + - Updated: 2025-01-17T22:59:00.000Z → 2026-04-21T22:04:00.000Z + ## Classified Pages | TOC Title | Type | Confidence | Reason | @@ -107,7 +112,7 @@ confusable_not_for: Not for Azure Virtual Machines (use azure-virtual-machines), | [Manage fault domains](https://learn.microsoft.com/en-us/azure/virtual-machine-scale-sets/virtual-machine-scale-sets-manage-fault-domains) | architecture-patterns | 0.70 | Guides choosing number of fault domains based on orchestration mode and region/zone; product-specific resiliency pattern and configuration trade-offs. | | [Modify a scale set](https://learn.microsoft.com/en-us/azure/virtual-machine-scale-sets/virtual-machine-scale-sets-upgrade-scale-set) | configuration | 0.70 | Details how to update scale set configuration via REST/PowerShell/CLI, including which properties can be changed and how they propagate—specific configuration surface. | | [Monitor automatic repairs service state](https://learn.microsoft.com/en-us/azure/virtual-machine-scale-sets/alert-rules-automatic-repairs-service-state) | configuration | 0.70 | Shows how to configure alert rules on Automatic Repairs ServiceState, including metric/log selection and conditions—product-specific monitoring configuration. 
| -| [Overview](https://learn.microsoft.com/en-us/azure/virtual-machine-scale-sets/virtual-machine-scale-sets-automatic-upgrade) | configuration | 0.70 | Automatic OS upgrade behavior for VMSS includes specific settings (properties in the model, modes, health checks, sequencing) and constraints unique to VMSS. These are detailed configuration options and operational behaviors. | +| [Overview](https://learn.microsoft.com/en-us/azure/virtual-machine-scale-sets/virtual-machine-scale-sets-automatic-upgrade) | configuration | 0.70 | The page describes product-specific settings and requirements for enabling automatic OS image upgrades on Azure Virtual Machine Scale Sets, including configuration options and behavior details that go beyond generic knowledge. It focuses on how to configure and use this feature rather than just conceptual overview, fitting the configuration sub-skill best. | | [Overview](https://learn.microsoft.com/en-us/azure/virtual-machine-scale-sets/virtual-machine-scale-sets-upgrade-policy) | decision-making | 0.70 | Compares automatic, manual, and rolling upgrade modes with impact on uptime; provides scenario-based guidance on which mode to choose, a product-specific decision. | | [Proximity placement groups](https://learn.microsoft.com/en-us/azure/virtual-machine-scale-sets/proximity-placement-groups) | architecture-patterns | 0.70 | Explains when and how to use proximity placement groups to minimize latency, a product-specific placement pattern with design trade-offs. | | [Reimage a virtual machine](https://learn.microsoft.com/en-us/azure/virtual-machine-scale-sets/virtual-machine-scale-sets-reimage-virtual-machine) | configuration | 0.70 | Explains reimage behavior (OS disk replacement, what changes require reimage) and how to invoke it per instance—product-specific operational configuration. 
| diff --git a/products/azure-vmware-solution/azure-vmware-solution.csv b/products/azure-vmware-solution/azure-vmware-solution.csv index 944a87e8..a0246e84 100644 --- a/products/azure-vmware-solution/azure-vmware-solution.csv +++ b/products/azure-vmware-solution/azure-vmware-solution.csv @@ -14,7 +14,7 @@ https://learn.microsoft.com/en-us/azure/azure-vmware/attach-azure-netapp-files-t https://learn.microsoft.com/en-us/azure/azure-vmware/azure-security-integration,Integrate Microsoft Defender for Cloud,Integrate Microsoft Defender for Cloud with Azure VMware Solution - Azure VMware Solution,Integrate Microsoft Defender for Cloud with Azure VMware Solution,Learn how to protect your Azure VMware Solution VMs with Azure's native security tools from the workload protection dashboard.,"Microsoft Defender for Cloud provides advanced threat protection across your Azure VMware Solution and on-premises virtual machines (VMs). It assesses the vulnerability of Azure VMware Solution VMs and raises alerts as needed. These security alerts can be forwarded to Azure Monitor for resolution. You can define security policies in Microsoft Defender for Cloud. For more information, seeWorking with security policies. Microsoft Defender for Cloud offers many features, including: The diagram show",2026-03-12T08:00:00.000Z,how-to,security,0.75,True,"Covers product-specific security integration between Defender for Cloud and AVS, including how alerts flow, how policies are applied, and how to protect AVS VMs. 
Such content typically includes concrete configuration steps and security settings unique to this integration.",unchanged https://learn.microsoft.com/en-us/azure/azure-vmware/azure-vmware-solution-citrix,Deploy Citrix on Azure VMware Solution,Deploy Citrix on Azure VMware Solution - Azure VMware Solution,Deploy Citrix Virtual Apps and Desktops on Azure VMware Solution,Learn how to deploy VMware Citrix on Azure VMware Solution.,Citrix Virtual Apps and Desktop service supports Azure VMware Solution. Azure VMware Solution provides cloud infrastructure containing vSphere clusters created by Azure infrastructure. You can apply the Citrix Virtual Apps and Desktop Service to use Azure VMware Solution for provisioning yourVirtual Delivery Agent (VDA)workload in the same way you'd use vSphere in on-premises environments. Learn more about Citrix virtual apps and desktops Deployment guide Solution brief FAQ (review Q&As) Q. Can ,2026-03-09T22:17:00.000Z,how-to,deployment,0.62,True,"Focuses on deploying Citrix Virtual Apps and Desktops service on Azure VMware Solution. Citrix-on-AVS is a specific deployment scenario with product-specific requirements and constraints (e.g., supported models, topology, and AVS usage patterns) that go beyond generic deployment knowledge, fitting the deployment sub-skill for a particular platform integration.",unchanged https://learn.microsoft.com/en-us/azure/azure-vmware/azure-vmware-solution-horizon,Deploy Horizon on Azure VMware Solution,Deploy Horizon on Azure VMware Solution - Azure VMware Solution,Deploy VMware Horizon virtual desktops on AVS,Learn how to deploy VMware Horizon on Azure VMware Solution.,"Note This document focuses on the VMware Horizon product, formerly known as Horizon 7. Horizon is a different solution than Horizon Cloud on Azure, although there are some shared components. 
Key advantages of the Azure VMware Solution include both a more straightforward sizing method and the integration of Software-Defined Data Center (SDDC) private cloud management into the Azure portal. VMware Horizon®, a virtual desktop and applications platform, runs in the data center and provides simple an",2025-01-09T08:00:00.000Z,how-to,architecture-patterns,0.65,True,Horizon on AVS deployment with sizing and SDDC integration guidance; likely includes AVS-specific architectural recommendations and trade-offs.,unchanged -https://learn.microsoft.com/en-us/azure/azure-vmware/azure-vmware-solution-known-issues,Known issues,Azure VMware Solution known issues - Azure VMware Solution,Resolve Azure VMware Solution known issues and workarounds,This article provides details about the known issues of Azure VMware Solution.,"This article describes the currently known issues with Azure VMware Solution. Refer to the table to find details about resolution dates or possible workarounds. For more information about the different feature enhancements and bug fixes in Azure VMware Solution, see What's New. In this article, you learned about the current known issues with the Azure VMware Solution. For more information, see About Azure VMware Solution.",2026-04-13T22:10:00.000Z,reference,troubleshooting,0.7,True,"The page is a product-specific known issues list that maps concrete symptoms to resolutions or workarounds and sometimes includes timing for fixes. 
This is effectively troubleshooting guidance (symptom → cause → workaround) that an LLM wouldn't reliably know from training, so it qualifies as expert knowledge under the troubleshooting sub-skill.",updated +https://learn.microsoft.com/en-us/azure/azure-vmware/azure-vmware-solution-known-issues,Known issues,Azure VMware Solution known issues - Azure VMware Solution,Resolve Azure VMware Solution known issues and workarounds,This article provides details about the known issues of Azure VMware Solution.,"This article describes the currently known issues with Azure VMware Solution. Refer to the table to find details about resolution dates or possible workarounds. For more information about the different feature enhancements and bug fixes in Azure VMware Solution, see What's New. In this article, you learned about the current known issues with the Azure VMware Solution. For more information, see About Azure VMware Solution.",2026-04-13T22:10:00.000Z,reference,troubleshooting,0.7,True,"The page is a product-specific known issues list that maps concrete symptoms to resolutions or workarounds and sometimes includes timing for fixes. This is effectively troubleshooting guidance (symptom → cause → workaround) that an LLM wouldn't reliably know from training, so it qualifies as expert knowledge under the troubleshooting sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/azure-vmware/azure-vmware-solution-nsx-scale-and-performance-recommendations-for-vmware-hcx,NSX Scale and Performance Recommendations for VMware HCX,NSX Scale and Performance Recommendations - Azure VMware Solution,Optimize NSX performance for HCX migrations in Azure VMware Solution,Learn about the default NSX Topology in Azure VMware Solution and recommended practices to mitigate performance issues around HCX migration use cases.,"In this article, learn about the default NSX topology in Azure VMware Solution and NSX data path performance characteristics. 
Learn how to identify NSX data path resource limits. Learn about recommended configurations to help mitigate resource limits and optimize overall data path performance for HCX migrations.",2026-03-30T17:14:00.000Z,how-to,best-practices,0.75,True,"Focuses on NSX topology, data path performance characteristics, identifying resource limits, and recommended configurations to mitigate performance issues for HCX migrations. This is product-specific, actionable guidance with configuration recommendations and edge-case considerations, fitting best-practices.",unchanged https://learn.microsoft.com/en-us/azure/azure-vmware/azure-vmware-solution-platform-updates,What's new,What's new in Azure VMware Solution - Azure VMware Solution,,Learn about the platform updates to Azure VMware Solution.,"Microsoft regularly applies important updates to the Azure VMware Solution for new features and software lifecycle management. You should receive a notification through Azure Service Health that includes the timeline of the maintenance. For more information, see Host maintenance and lifecycle management.",2026-03-30T08:00:00.000Z,reference,,0.2,False,"A 'what's new' / platform updates page; typically lists feature changes and lifecycle notes but not detailed limits, configs, error codes, or decision matrices as defined. No clear evidence of structured expert knowledge per the categories.",unchanged https://learn.microsoft.com/en-us/azure/azure-vmware/azure-vmware-solution-private-cloud-maintenance,Azure VMware Solution private cloud maintenance,Private Cloud Maintenance - Azure VMware Solution,Follow AVS private cloud maintenance and remediation procedures,"Ensuring seamless, reliable maintenance of Azure VMware Solution private cloud","Azure VMware Solution undertakes periodic maintenance of the private cloud. This maintenance includes security patches, minor and major updates to VMware software stack. 
This page describes the host monitoring, remediation, and mandatory steps that keep the private cloud ready for maintenance.",2026-01-19T18:12:00.000Z,concept-article,best-practices,0.6,True,"Describes host monitoring, remediation, and mandatory steps for maintenance; likely includes AVS-specific operational practices and edge cases.",unchanged @@ -22,7 +22,7 @@ https://learn.microsoft.com/en-us/azure/azure-vmware/backup-azure-netapp-files-d https://learn.microsoft.com/en-us/azure/azure-vmware/backup-azure-vmware-solution-virtual-machines,Backup private cloud VMs with Backup Server,Back up Azure VMware Solution VMs with Azure Backup Server - Azure VMware Solution,Back up AVS VMware virtual machines with Azure Backup Server,Configure your Azure VMware Solution environment to back up virtual machines by using Azure Backup Server.,"This article shows you how to back up VMware virtual machines (VMs) running on Azure VMware Solution with Azure Backup Server. First, thoroughly go through Set up Microsoft Azure Backup Server for Azure VMware Solution. Then, walk through all of the necessary procedures to:",2026-01-30T08:00:00.000Z,how-to,configuration,0.7,True,Stepwise configuration of backup jobs for AVS VMs using Azure Backup Server; includes product-specific options and constraints.,unchanged https://learn.microsoft.com/en-us/azure/azure-vmware/bitnami-appliances-deployment,Bitnami appliance deployment,Deploy Bitnami virtual appliances - Azure VMware Solution,,Learn about the virtual appliances packaged by Bitnami to deploy in your Azure VMware Solution private cloud.,"Bitnami by VMware provides a rich catalog of turnkey virtual appliances. You can deploy any vSphere compatible appliance by Bitnami available in the VMware Marketplace, including many of the most common open-source software projects. 
In this article, learn how to install and configure the following virtual appliances packaged by Bitnami on your Azure VMware Solution private cloud: LAMP, Jenkins, PostgreSQL, NGINX, RabbitMQ",2026-03-04T08:00:00.000Z,how-to,,0.25,False,"Explains how to deploy Bitnami virtual appliances (LAMP, Jenkins, PostgreSQL, NGINX, RabbitMQ) on Azure VMware Solution. The summary suggests a guided installation/configuration tutorial, but not detailed configuration parameter tables, limits, or troubleshooting mappings; it reads as deployment guidance rather than deep, product-specific expert reference.",unchanged https://learn.microsoft.com/en-us/azure/azure-vmware/configure-alerts-for-azure-vmware-solution,Configure alerts and work with metrics,Configure alerts and work with metrics in Azure VMware Solution - Azure VMware Solution,Configure alerts and metrics monitoring for Azure VMware Solution,Learn how to use alerts to receive notifications. Also learn how to work with metrics to gain deeper insights into your Azure VMware Solution private cloud.,"In this article, learn how to configure Azure Action Groups in Microsoft Azure Alerts to receive notifications of triggered events that you define. Also learn about using Azure Monitor Metrics to gain deeper insights into your Azure VMware Solution private cloud. Note For incidents that affect the availability of an Azure VMware Solution host and its corresponding restoration, the following actions occur. 
The incidents are sent automatically to the Account Administrator, Service Administrator (Classic",2026-02-04T08:00:00.000Z,how-to,configuration,0.75,True,"Explains configuring Azure Alerts, Action Groups, and Metrics for AVS, including incident behavior and AVS-specific monitoring settings.",unchanged -https://learn.microsoft.com/en-us/azure/azure-vmware/configure-azure-elastic-san,Configure Azure Elastic SAN,Use Azure VMware Solution with Azure Elastic SAN - Azure VMware Solution,Use Azure Elastic SAN as AVS iSCSI datastores,Learn how to use Elastic SAN with Azure VMware Solution.,"This article explains how to use Azure Elastic SAN as backing storage for Azure VMware Solution.Azure VMware Solutionsupports attaching iSCSI datastores as a persistent storage option. You can create Virtual Machine File System (VMFS) datastores with Azure Elastic SAN volumes and attach them to clusters of your choice. By using VMFS datastores backed by Azure Elastic SAN, you can expand your storage instead of scaling the clusters. Azure Elastic storage area network (SAN) addresses the problem o",2025-10-29T05:11:00.000Z,how-to,configuration,0.75,True,"Details creating VMFS datastores on Elastic SAN volumes and attaching to AVS clusters, with product-specific configuration steps and parameters.",unchanged +https://learn.microsoft.com/en-us/azure/azure-vmware/configure-azure-elastic-san,Configure Azure Elastic SAN,Use Azure VMware Solution with Azure Elastic SAN - Azure VMware Solution,Configure Azure Elastic SAN as AVS iSCSI datastore,Learn how to use Elastic SAN with Azure VMware Solution.,"This article explains how to use Azure Elastic SAN as backing storage for Azure VMware Solution.Azure VMware Solutionsupports attaching iSCSI datastores as a persistent storage option. You can create Virtual Machine File System (VMFS) datastores with Azure Elastic SAN volumes and attach them to clusters of your choice. 
By using VMFS datastores backed by Azure Elastic SAN, you can expand your storage instead of scaling the clusters. Azure Elastic storage area network (SAN) addresses the problem o",2026-04-23T22:13:00.000Z,how-to,integrations,0.68,True,"How-to article for using Azure Elastic SAN as backing storage for Azure VMware Solution via iSCSI VMFS datastores. Likely includes product-specific integration steps, required settings, and configuration parameters unique to AVS + Elastic SAN rather than just conceptual guidance.",updated https://learn.microsoft.com/en-us/azure/azure-vmware/configure-azure-monitor-for-resource-health-for-azure-vmware-solution,Configure Azure Monitor for Resource Health alerts,Create an Azure Monitor resource health alert rule for Azure VMware Solution - Azure VMware Solution,Create Azure Monitor resource health alerts for AVS,Learn about configuring Azure Monitor alerts for Resource Health for your Azure VMware Solution private cloud.,"This article shows you how to create or edit a resource health alert rule in Azure Monitor for your Azure VMware Solution private cloud. To learn more about alerts, see the alerts overview. Alerts triggered by these alert rules contain a payload that uses the common alert schema. 
Such pages typically include alert rule parameters, scopes, conditions, and schema details that are product-specific configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/azure-vmware/configure-azure-native-pure-storage-cloud,Configure Azure Native Pure Storage Cloud,Azure Native Pure Storage Cloud - Azure VMware Solution,Use Azure Native Pure Storage Cloud with AVS,Learn how to use Azure Native Pure Storage Cloud with Azure VMware Solution.,,2025-07-01T22:25:00.000Z,how-to,configuration,0.6,True,Describes how to integrate and configure Azure Native Pure Storage Cloud as storage for AVS workloads.,unchanged https://learn.microsoft.com/en-us/azure/azure-vmware/configure-azure-vmware-solution-metrics,Configure Newest Metrics for Azure VMware Solution,Configure the latest iteration of Azure VMware Solution private cloud performance and health metrics - Azure VMware Solution,Configure Azure VMware Solution performance and health metrics,Learn how to take advantage of the latest improvements made to Azure VMware Solution private cloud performance and health metrics,"In this article, you learn how to take advantage of the latest iteration of metrics available to monitor your Azure VMware Solution private cloud. Azure VMware Solution provides users with an array of high-level health and performance metrics to provide visibility into the health and performance of an Azure VMware Solution private cloud. In 2024, a set of identical metrics labeled as ""new"" were introduced. The ""new"" metrics, which include a series of enhancements to the stability, reliability, a",2026-02-19T23:10:00.000Z,how-to,configuration,0.65,True,"How-to configuration article for the 'latest iteration' of AVS metrics. 
Likely includes specific metric names, dimensions, and configuration steps unique to AVS monitoring, which are product-specific details beyond generic monitoring knowledge.",unchanged @@ -59,10 +59,10 @@ https://learn.microsoft.com/en-us/azure/azure-vmware/deploy-vmware-cloud-directo https://learn.microsoft.com/en-us/azure/azure-vmware/deploy-vsan-stretched-clusters,Deploy vSAN stretched clusters,Deploy vSAN stretched clusters - Azure VMware Solution,,Learn how to deploy vSAN stretched clusters.,"In this article, learn how to implement a vSAN stretched cluster for an Azure VMware Solution private cloud.",2026-03-05T08:00:00.000Z,how-to,,0.3,False,"Summary indicates a procedural deployment guide for vSAN stretched clusters without exposing specific limits, configuration parameter tables, or decision matrices. Likely step-by-step instructions rather than reusable expert configuration or limits data.",unchanged https://learn.microsoft.com/en-us/azure/azure-vmware/deploy-zerto-disaster-recovery,Deploy Zerto disaster recovery,Deploy Zerto disaster recovery on Azure VMware Solution - Azure VMware Solution,,Learn how to implement Zerto disaster recovery for on-premises VMware or Azure VMware Solution virtual machines.,"Zerto is a disaster recovery solution designed to minimize downtime of VMs should a disaster occur. Zerto's platform is built on the foundation of Continuous Data Protection (CDP) that enables minimal or close to no data loss. The platform provides the level of protection wanted for many business-critical and mission-critical enterprise applications. Zerto also automates and orchestrates failover and failback to ensure minimal downtime in a disaster. Overall, Zerto simplifies management through ",2026-03-05T08:00:00.000Z,how-to,,0.3,False,"Appears to be a how-to/tutorial for deploying Zerto disaster recovery on Azure VMware Solution, focused on general deployment and product description (CDP, minimal data loss). 
The summary does not indicate detailed error codes, configuration tables, limits, or decision matrices; likely procedural guidance rather than expert-knowledge configuration or troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/azure-vmware/disable-internet-access,Disable internet access or enable a default route,Set a default internet route or turn off internet access - Azure VMware Solution,,Learn how to set a default internet route or turn off internet access in your Azure VMware Solution private cloud.,"In this article, learn how to set a default internet route or turn off internet access in your Azure VMware Solution private cloud. You have multiple options to set up a default internet access route. You can use a virtual WAN hub or a network virtual appliance (NVA) in a virtual network, or you can use a default route from an on-premises environment. If you don't set a default route, your Azure VMware Solution private cloud has no internet access. With a default route set, you can achieve the f",2026-03-30T17:14:00.000Z,how-to,,0.4,False,"Explains options to set a default internet route or disable internet access. From the summary it appears to be a how-to/tutorial style article without explicit configuration parameter tables, limits, or security role definitions as required by the categories.",unchanged -https://learn.microsoft.com/en-us/azure/azure-vmware/disaster-recovery-using-vmware-site-recovery-manager,Deploy VMware SRM for disaster recovery,Deploy disaster recovery with VMware Site Recovery Manager - Azure VMware Solution,Deploy VMware Site Recovery Manager for AVS disaster recovery,Deploy disaster recovery with VMware Site Recovery Manager (SRM) in your Azure VMware Solution private cloud.,"This article explains how to implement disaster recovery for Azure VMware Solution-based VMs. The solution uses VMware SRM and replication servers are deployed at both the protected and the recovery sites. 
VMware Live Site Recovery is a disaster recovery solution designed to minimize downtime of the virtual machines in an Azure VMware Solution environment if there was a disaster. VMware SRM automates and orchestrates failover and failback, ensuring minimal downtime in a disaster. Also, built-in ",2025-06-18T22:03:00.000Z,how-to,configuration,0.7,True,Explains configuring SRM and replication servers at protected and recovery AVS sites; includes product-specific DR settings and workflows.,unchanged +https://learn.microsoft.com/en-us/azure/azure-vmware/disaster-recovery-using-vmware-site-recovery-manager,Deploy VMware SRM for disaster recovery,Deploy disaster recovery with VMware Site Recovery Manager - Azure VMware Solution,Deploy VMware Site Recovery Manager for AVS DR,Deploy disaster recovery with VMware Site Recovery Manager (SRM) in your Azure VMware Solution private cloud.,"This article explains how to implement disaster recovery for Azure VMware Solution-based VMs. The solution uses VMware SRM and replication servers are deployed at both the protected and the recovery sites. VMware Live Site Recovery is a disaster recovery solution designed to minimize downtime of the virtual machines in an Azure VMware Solution environment if there was a disaster. VMware SRM automates and orchestrates failover and failback, ensuring minimal downtime in a disaster. Also, built-in ",2026-04-21T22:10:00.000Z,how-to,integrations,0.7,True,Step-by-step implementation of disaster recovery for AVS VMs using VMware SRM and replication servers at protected and recovery sites. 
This is a product-specific integration pattern (AVS + VMware SRM) that likely includes concrete configuration parameters and SRM/AVS settings rather than just conceptual DR theory.,updated https://learn.microsoft.com/en-us/azure/azure-vmware/ecosystem-app-monitoring-solutions,Application performance monitoring solutions for Azure VMware Solution,Application performance monitoring and troubleshooting solutions for Azure VMware Solution - Azure VMware Solution,,Learn about leading application monitoring and troubleshooting solutions for your Azure VMware Solution private cloud.,A key objective of Azure VMware Solution is to maintain the performance and security of applications and services across VMware on Azure and on-premises. Getting there requires visibility into complex infrastructures and quickly pinpointing the root cause of service disruptions across the hybrid cloud.,2025-12-15T18:25:00.000Z,how-to,,0.3,False,"Described as an overview of application performance monitoring and troubleshooting solutions; likely catalog/marketing-style guidance without concrete error codes, configuration tables, or numeric thresholds.",unchanged https://learn.microsoft.com/en-us/azure/azure-vmware/ecosystem-back-up-vms,Backup solutions for VMs,Backup solutions for Azure VMware Solution virtual machines - Azure VMware Solution,Select backup solutions for Azure VMware Solution VMs,Learn about leading backup and restore solutions for your Azure VMware Solution virtual machines.,"A key principle of Azure VMware Solution is to enable you to continue to use your investments and your favorite VMware solutions running on Azure. Independent software vendor (ISV) technology support, validated with Azure VMware Solution, is an important part of this strategy. Our backup partners have industry-leading backup and restore solutions in VMware-based environments. Customers widely adopted these solutions for their on-premises deployments. 
These partners extended their solutions to Az",2026-02-19T23:10:00.000Z,how-to,decision-making,0.6,True,Compares multiple ISV backup options validated for AVS; provides solution selection guidance beyond generic backup concepts.,unchanged -https://learn.microsoft.com/en-us/azure/azure-vmware/ecosystem-disaster-recovery-vms,Disaster recovery solutions for VMs,Disaster recovery solutions for Azure VMware Solution virtual machines - Azure VMware Solution,Resolve disaster recovery issues in Azure VMware,Learn about leading disaster recovery solutions for your Azure VMware Solution private cloud.,"One of the most important aspects of any Azure VMware Solution deployment is disaster recovery. You can create disaster recovery plans between different Azure VMware Solution regions or between Azure and an on-premises vSphere environment. Azure VMware Solution makes use of dynamic ""run command"" modules that require both Microsoft and Partner input to run effectively. In this article, you discover critical known issues related to vSphere upgrades, Azure VMware Solution, Microsoft security enhanc",2026-03-18T17:38:00.000Z,how-to,troubleshooting,0.7,True,"Explicitly mentions 'critical known issues' related to vSphere upgrades, Azure VMware Solution, and Microsoft security enhancements. Such content typically maps specific issues/symptoms to causes and mitigations for DR scenarios, which is product-specific troubleshooting knowledge not captured by generic training.",unchanged +https://learn.microsoft.com/en-us/azure/azure-vmware/ecosystem-disaster-recovery-vms,Disaster recovery solutions for VMs,Disaster recovery solutions for Azure VMware Solution virtual machines - Azure VMware Solution,Resolve disaster recovery issues for AVS virtual machines,Learn about leading disaster recovery solutions for your Azure VMware Solution private cloud.,"One of the most important aspects of any Azure VMware Solution deployment is disaster recovery. 
You can create disaster recovery plans between different Azure VMware Solution regions or between Azure and an on-premises vSphere environment. Azure VMware Solution makes use of dynamic ""run command"" modules that require both Microsoft and Partner input to run effectively. In this article, you discover critical known issues related to vSphere upgrades, Azure VMware Solution, Microsoft security enhanc",2026-04-21T22:10:00.000Z,how-to,troubleshooting,0.63,True,"Article focuses on disaster recovery solutions and explicitly mentions 'critical known issues' related to vSphere upgrades, AVS, and security enhancements. This suggests symptom/issue descriptions and product-specific guidance to address them, fitting troubleshooting more than generic DR design.",updated https://learn.microsoft.com/en-us/azure/azure-vmware/ecosystem-external-storage-solutions,External storage solutions,External storage solutions overview - Azure VMware Solution,Choose external storage options for Azure VMware,Learn about external storage solutions for Azure VMware Solution private cloud.,"Azure VMware Solution is a Hyperconverged Infrastructure (HCI) service that offers VMware vSAN as the primary storage option. VMware vSAN provides high-performance NVMe storage for virtual machines. However, it has a limited capacity. The exact amount of storage available in vSAN depends on several factors, including the number and type of nodes in the cluster type; the RAID and FTT (Failures To Tolerate) settings; and the deduplication and compression performance of a particular workload. Some ",2026-03-11T08:00:00.000Z,how-to,decision-making,0.65,True,"Describes concrete constraints of built-in vSAN capacity and when to consider external storage solutions for Azure VMware Solution private clouds. 
While the summary is partial, this type of article typically includes product-specific guidance on when to use each external storage option based on capacity and workload characteristics, which is decision-making guidance beyond generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/azure-vmware/ecosystem-migration-vms,Migration solutions for VMs,Migration solutions for Azure VMware Solution virtual machines - Azure VMware Solution,Evaluate migration solutions for AVS virtual machines,Learn about leading migration solutions for your Azure VMware Solution virtual machines.,"One of the most common use cases for using Azure VMware Solution is data center evacuation. It allows you to continue to maximize your VMware investments, because Azure VMware Solution is always up to date. Additionally, you can enhance your workloads with the full range of native Azure services. An initial key step in this process is the migration of your legacy VMware-based environment onto Azure VMware Solution. Our migration partners have industry-leading migration solutions in VMware-base",2023-12-20T23:04:00.000Z,reference,decision-making,0.6,True,Compares partner migration solutions and use cases for AVS; helps choose appropriate tooling for data center evacuation scenarios.,unchanged https://learn.microsoft.com/en-us/azure/azure-vmware/ecosystem-os-vms,Operating system support for VMs,Operating system support for Azure VMware Solution virtual machines - Azure VMware Solution,,Learn about operating system support for your Azure VMware Solution virtual machines.,"Azure VMware Solution supports a wide range of operating systems to be used in the guest virtual machines. 
Being based on VMware vSphere, all operating systems currently supported by vSphere can be used by any Azure VMware Solution customer for their workloads.",2025-07-21T22:12:00.000Z,how-to,,0.2,False,High-level OS support statement; no detailed support matrix or numeric constraints in the summary.,unchanged @@ -75,7 +75,7 @@ https://learn.microsoft.com/en-us/azure/azure-vmware/enable-vmware-cds-with-azur https://learn.microsoft.com/en-us/azure/azure-vmware/enable-vmware-vcd-with-azure,Enable VMware Cloud Director with Azure VMware Solution,Enable VMware Cloud Director with Azure VMware Solution - Azure VMware Solution,Install and configure VMware Cloud Director on AVS,This article explains how to use Azure VMware Solution to enable enterprise and hosters to use Azure VMware Solution for private clouds underlying resources for virtual datacenters.,"This article explains how to install VMware Cloud Director to enable Enterprise and hosters to use Azure VMware Solution private cloud underlying resources for virtual datacenters. VMware Cloud Director is a cloud services platform that delivers secure, isolated, and elastic virtual data center compute, network, storage, and security in a self-service model. VMware Cloud Director obtains its resources from an underlying virtual infrastructure. 
Note VMware Cloud Director on Azure VMware Solution i",2025-05-27T17:04:00.000Z,how-to,configuration,0.65,True,Explains how to install and wire VMware Cloud Director to AVS resources; likely includes product-specific configuration parameters and settings.,unchanged https://learn.microsoft.com/en-us/azure/azure-vmware/enable-vmware-vcd-with-azure-network,VMware Cloud Director on Azure VMware Solution network scenarios,VMware Cloud Director with Azure VMware Solution Networking - Azure VMware Solution,Design networking for VMware Cloud Director tenants on AVS,This article explains how to enable networking for VMware Cloud Director tenants on Azure VMware Solution,"VMware Cloud Director on Azure VMware Solution offers a robust platform for managing multitenancy, enabling organizations to create secure, isolated virtual data centers. This article provides various network connectivity scenarios for VMware Cloud Director tenants, including connecting to the internet and accessing Azure services. 
By leveraging the flexibility of VMware Cloud Director and Azure VMware Solution, tenants can achieve seamless integration with external networks and Azure resources,",2025-05-27T17:04:00.000Z,how-to,architecture-patterns,0.65,True,"Provides multiple AVS networking scenarios for VCD tenants, describing when to use each connectivity pattern to internet and Azure services.",unchanged https://learn.microsoft.com/en-us/azure/azure-vmware/extended-security-updates-windows-sql-server,ESUs for SQL Server and Windows Server in Azure VMware Solution VMs,Extended Security Updates for SQL Server and Windows Server - Azure VMware Solution,Configure Extended Security Updates in Azure VMware Solution,Learn how to configure VMs that run older versions of SQL Server and Windows Server for free Extended Security Updates (ESUs) in Azure VMware Solution.,"This article describes how to enable Extended Security Updates (ESUs) and continue to run software that reached its end-of-support lifecycle in Azure VMware Solution. ESUs allow older versions of software to run in a supported manner by continuing to receive security updates and critical patches. In Azure, which includes Azure VMware Solution, ESUs are free of charge for extra years after their end of support. For more information on timelines, see Extended Security Updates for SQL Server and Win",2026-03-11T08:00:00.000Z,how-to,security,0.7,True,"Describes concrete, product-specific steps and configuration details to enable ESUs for Windows/SQL Server on AVS, including how ESUs are applied and scoped in this environment. 
This is operational security configuration rather than a conceptual overview.",unchanged -https://learn.microsoft.com/en-us/azure/azure-vmware/faq,FAQ,Microsoft Azure VMware Solution FAQ - Azure VMware Solution,,Get answers to common questions about Azure VMware Solution.,This article answers commonly asked questions about Azure VMware Solution.,2026-02-24T06:10:00Z,faq,,0.2,False,"FAQ page with general Q&A; summary does not indicate detailed limits, configuration tables, or error-code-based troubleshooting. Likely high-level clarifications rather than deep expert knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/azure-vmware/faq,FAQ,Microsoft Azure VMware Solution FAQ - Azure VMware Solution,,Get answers to common questions about Azure VMware Solution.,This article answers commonly asked questions about Azure VMware Solution.,2026-04-23T17:12:00Z,faq,,0.0,False,"FAQ page is primarily conceptual and Q&A oriented; without evidence of detailed numeric limits, configuration tables, or error-code-based troubleshooting, it does not clearly match any expert-knowledge sub-skill type.",updated https://learn.microsoft.com/en-us/azure/azure-vmware/fix-deployment-failures,Open a support request for deployment failures,Support for Azure VMware Solution deployment or provisioning failure - Azure VMware Solution,Troubleshoot Azure VMware Solution deployment failures,Get information from your Azure VMware Solution private cloud to file a service request for an Azure VMware Solution deployment or provisioning failure.,"This article shows you how to open a support request and provide key information for an Azure VMware Solution deployment or provisioning failure. When you have a failure on your private cloud, you need to open a support request in the Azure portal. 
To open a support request, first get some key information in the Azure portal:",2026-03-27T08:00:00.000Z,how-to,troubleshooting,0.7,True,"Focused on handling deployment/provisioning failures and opening support requests; likely includes specific failure information to collect, possibly error identifiers and required diagnostic details, fitting a symptom-to-resolution troubleshooting pattern.",unchanged https://learn.microsoft.com/en-us/azure/azure-vmware/install-cloud-backup-virtual-machines,Install Cloud Backup for Virtual Machines,Install Cloud Backup for Virtual Machines - Azure VMware Solution,Install Cloud Backup plug-in for Azure VMware VMs,Cloud Backup for Virtual Machines is a plug-in installed in the Azure VMware Solution and enables you to back up and restore Azure NetApp Files datastores and virtual machines.,Cloud Backup for Virtual Machines is a plug-in installed in the Azure VMware Solution. That plug-in enables you to back up and restore Azure NetApp Files datastores and virtual machines (VMs) residing in a NetApp datastore. Cloud Backup for Virtual Machines features:,2026-03-12T08:00:00.000Z,how-to,integrations,0.7,True,"Covers installing a specific Cloud Backup for Virtual Machines plug-in within Azure VMware Solution to back up Azure NetApp Files datastores and VMs. 
This is a product-specific integration pattern between AVS and NetApp, likely including configuration steps and parameters unique to this integration, which qualifies as integrations-focused expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/azure-vmware/install-vmware-hcx,4 - Install and activate VMware HCX in Azure VMware Solution,Install VMware HCX in Azure VMware Solution - Azure VMware Solution,,Install VMware HCX in your Azure VMware Solution private cloud.,"VMware HCX is an application mobility platform designed for simplifying application migration, rebalancing workloads, and optimizing disaster recovery across data centers and clouds. VMware HCX has two component services: HCX Cloud Manager and HCX Connector. These components work together for VMware HCX operations. This article shows you how to install and activate the VMware HCX Cloud Manager and VMware HCX Connector components. HCX Cloud Manager is typically deployed as the destination (cloud side",2025-12-08T23:11:00.000Z,how-to,,0.4,False,HCX configuration tutorial; summary lists tasks but not detailed configuration matrices or limits.,unchanged
- troubleshooting: 'Diagnosing and fixing Azure VMware Solution issues: deployment - failures, known platform errors, and disaster recovery problems (replication, - failover, and recovery).' + troubleshooting: 'Diagnosing and fixing AVS issues: known errors and workarounds, + deployment failures, and disaster recovery problems for Azure VMware Solution + VMs.' best-practices: Best practices for AVS private cloud maintenance, security hardening, and tuning NSX scale/performance specifically for HCX migration scenarios. - integrations: 'Networking, migration, and integration patterns for AVS: VPN/ExpressRoute, - HCX migrations, Traffic Manager, monitoring/logging, backup, and using services - like NetApp Files with AVS VMs' + integrations: 'Patterns for connecting AVS to Azure services: storage (Elastic SAN, + ANF), networking/VPN, HCX migration/DR, backups, monitoring/logging, and automation + via HCX Run Commands.' limits-quotas: Host, cluster, capacity, and quota limits for AVS private clouds, plus vSAN ESA support, Gen2 routing limits, and required network ports and planning steps. skill_description: Expert knowledge for Azure VMware Solution development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. - Use when designing AVS networking/storage, HCX migrations, Citrix/Horizon VDI, JetStream - DR, or NetApp Files integrations, and other Azure VMware Solution related development + Use when configuring AVS networking/vSAN, HCX migration/DR, Citrix/Horizon desktops, + Defender for Cloud, or Cloud Director, and other Azure VMware Solution related development tasks. Not for Azure Virtual Machines (use azure-virtual-machines), Azure Virtual Machine Scale Sets (use azure-vm-scalesets), Azure Stack Edge (use azure-stack-edge), - Azure Baremetal Infrastructure (use azure-baremetal-infrastructure). 
-use_when: Use when designing AVS networking/storage, HCX migrations, Citrix/Horizon - VDI, JetStream DR, or NetApp Files integrations, and other Azure VMware Solution + Azure Migrate (use azure-migrate). +use_when: Use when configuring AVS networking/vSAN, HCX migration/DR, Citrix/Horizon + desktops, Defender for Cloud, or Cloud Director, and other Azure VMware Solution related development tasks. confusable_not_for: Not for Azure Virtual Machines (use azure-virtual-machines), Azure Virtual Machine Scale Sets (use azure-vm-scalesets), Azure Stack Edge (use azure-stack-edge), - Azure Baremetal Infrastructure (use azure-baremetal-infrastructure). + Azure Migrate (use azure-migrate). --- # Azure VMware Solution Crawl Report @@ -54,8 +54,8 @@ confusable_not_for: Not for Azure Virtual Machines (use azure-virtual-machines), ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 1 -- **Unchanged**: 134 +- **Updated Pages**: 4 +- **Unchanged**: 131 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-vmware-solution/azure-vmware-solution.csv` @@ -65,10 +65,10 @@ confusable_not_for: Not for Azure Virtual Machines (use azure-virtual-machines), |------|-------|------------| | architecture-patterns | 7 | 5.2% | | best-practices | 4 | 3.0% | -| configuration | 35 | 25.9% | +| configuration | 33 | 24.4% | | decision-making | 10 | 7.4% | | deployment | 3 | 2.2% | -| integrations | 8 | 5.9% | +| integrations | 10 | 7.4% | | limits-quotas | 8 | 5.9% | | security | 11 | 8.1% | | troubleshooting | 3 | 2.2% | @@ -78,8 +78,14 @@ confusable_not_for: Not for Azure Virtual Machines (use azure-virtual-machines), ### Updated Pages -- [Known issues](https://learn.microsoft.com/en-us/azure/azure-vmware/azure-vmware-solution-known-issues) - - Updated: 2026-04-02T08:00:00.000Z → 2026-04-13T22:10:00.000Z +- [FAQ](https://learn.microsoft.com/en-us/azure/azure-vmware/faq) + - Updated: 2026-02-24T06:10:00Z → 2026-04-23T17:12:00Z +- [Configure Azure 
Elastic SAN](https://learn.microsoft.com/en-us/azure/azure-vmware/configure-azure-elastic-san) + - Updated: 2025-10-29T05:11:00.000Z → 2026-04-23T22:13:00.000Z +- [Disaster recovery solutions for VMs](https://learn.microsoft.com/en-us/azure/azure-vmware/ecosystem-disaster-recovery-vms) + - Updated: 2026-03-18T17:38:00.000Z → 2026-04-21T22:10:00.000Z +- [Deploy VMware SRM for disaster recovery](https://learn.microsoft.com/en-us/azure/azure-vmware/disaster-recovery-using-vmware-site-recovery-manager) + - Updated: 2025-06-18T22:03:00.000Z → 2026-04-21T22:10:00.000Z ## Classified Pages @@ -93,7 +99,6 @@ confusable_not_for: Not for Azure Virtual Machines (use azure-virtual-machines), | [Run Command](https://learn.microsoft.com/en-us/azure/azure-vmware/using-run-command) | configuration | 0.80 | Lists supported run-command operations and their behaviors for the cloudadmin role; includes specific cmdlets and capabilities unique to AVS. | | [Set an external identity source for NSX](https://learn.microsoft.com/en-us/azure/azure-vmware/configure-external-identity-source-nsx-t) | security | 0.80 | Explains how to configure VMware NSX in Azure VMware Solution to use external LDAP/Active Directory for authentication, including mapping accounts to NSX roles. This is product-specific identity and RBAC configuration, matching the security sub-skill type. | | [Attach Azure NetApp Files datastores to Azure VMware Solution hosts](https://learn.microsoft.com/en-us/azure/azure-vmware/attach-azure-netapp-files-to-azure-vmware-solution-hosts) | configuration | 0.75 | Explains creating and attaching ANF-based NFS datastores to AVS; likely includes configuration values and constraints specific to this integration. 
| -| [Configure Azure Elastic SAN](https://learn.microsoft.com/en-us/azure/azure-vmware/configure-azure-elastic-san) | configuration | 0.75 | Details creating VMFS datastores on Elastic SAN volumes and attaching to AVS clusters, with product-specific configuration steps and parameters. | | [Configure alerts and work with metrics](https://learn.microsoft.com/en-us/azure/azure-vmware/configure-alerts-for-azure-vmware-solution) | configuration | 0.75 | Explains configuring Azure Alerts, Action Groups, and Metrics for AVS, including incident behavior and AVS-specific monitoring settings. | | [Configure vSAN ESA](https://learn.microsoft.com/en-us/azure/azure-vmware/configure-vsan-esa) | limits-quotas | 0.75 | Includes a table of AVS host types that support vSAN ESA and per-cluster configurations, which are concrete numeric limits and defaults. | | [Enable first-party application service principal](https://learn.microsoft.com/en-us/azure/azure-vmware/native-first-party-principle-security) | security | 0.75 | Gives a specific application ID and name for the AVS service principal and instructions to re-enable it, which is product-specific security configuration. | @@ -121,8 +126,7 @@ confusable_not_for: Not for Azure Virtual Machines (use azure-virtual-machines), | [Configure vSAN](https://learn.microsoft.com/en-us/azure/azure-vmware/configure-vsan) | configuration | 0.70 | Describes VMware vSAN capabilities and default per-cluster configurations within Azure VMware Solution, and how to configure them. The mention of default configurations per cluster and use of Run Commands implies specific vSAN settings and AVS-specific defaults, which are configuration details not generally known from training. 
| | [Create a placement policy](https://learn.microsoft.com/en-us/azure/azure-vmware/create-placement-policy) | configuration | 0.70 | Placement policies are AVS-specific configuration replacing DRS rules; article likely lists policy types, parameters, and constraints unique to AVS. | | [Deploy VMware Cloud Director Availability in Azure VMware Solution](https://learn.microsoft.com/en-us/azure/azure-vmware/deploy-vmware-cloud-director-availability-in-azure-vmware-solution) | configuration | 0.70 | Covers installation and configuration of VCD Availability for AVS DR; likely includes product-specific settings and integration steps. | -| [Deploy VMware SRM for disaster recovery](https://learn.microsoft.com/en-us/azure/azure-vmware/disaster-recovery-using-vmware-site-recovery-manager) | configuration | 0.70 | Explains configuring SRM and replication servers at protected and recovery AVS sites; includes product-specific DR settings and workflows. | -| [Disaster recovery solutions for VMs](https://learn.microsoft.com/en-us/azure/azure-vmware/ecosystem-disaster-recovery-vms) | troubleshooting | 0.70 | Explicitly mentions 'critical known issues' related to vSphere upgrades, Azure VMware Solution, and Microsoft security enhancements. Such content typically maps specific issues/symptoms to causes and mitigations for DR scenarios, which is product-specific troubleshooting knowledge not captured by generic training. | +| [Deploy VMware SRM for disaster recovery](https://learn.microsoft.com/en-us/azure/azure-vmware/disaster-recovery-using-vmware-site-recovery-manager) | integrations | 0.70 | Step-by-step implementation of disaster recovery for AVS VMs using VMware SRM and replication servers at protected and recovery sites. This is a product-specific integration pattern (AVS + VMware SRM) that likely includes concrete configuration parameters and SRM/AVS settings rather than just conceptual DR theory. 
| | [ESUs for SQL Server and Windows Server in Azure VMware Solution VMs](https://learn.microsoft.com/en-us/azure/azure-vmware/extended-security-updates-windows-sql-server) | security | 0.70 | Describes concrete, product-specific steps and configuration details to enable ESUs for Windows/SQL Server on AVS, including how ESUs are applied and scoped in this environment. This is operational security configuration rather than a conceptual overview. | | [Enable Managed SNAT for Azure VMware Solution workloads](https://learn.microsoft.com/en-us/azure/azure-vmware/enable-managed-snat-for-workloads) | configuration | 0.70 | Describes product-specific networking behavior and constraints (Managed SNAT not working with a default Azure route, ICMP disabled by design). These are concrete, service-specific configuration and behavior details that aren't generic knowledge, but it doesn't focus on numeric limits or decision matrices. | | [Enable VMware HCX access over the internet](https://learn.microsoft.com/en-us/azure/azure-vmware/enable-hcx-access-over-internet) | integrations | 0.70 | Shows how to expose HCX via public IP, pair sites, and create service mesh; includes AVS-specific networking and HCX configuration parameters. | @@ -144,6 +148,7 @@ confusable_not_for: Not for Azure Virtual Machines (use azure-virtual-machines), | [Set an external identity source for vCenter Server](https://learn.microsoft.com/en-us/azure/azure-vmware/configure-identity-source-vcenter) | security | 0.70 | Describes product-specific identity and role configuration for Azure VMware Solution vCenter, including the special CloudAdmin account and role behavior that differs from other VMware and on-premises deployments, which is security-focused expert knowledge about RBAC and identity integration. 
| | [VMware Cloud Foundations (VCF) license portability on Azure VMware Solution](https://learn.microsoft.com/en-us/azure/azure-vmware/vmware-cloud-foundations-license-portability) | decision-making | 0.70 | Explains Broadcom VCF licensing changes, dates, and how to bring your own VCF subscription; contains time-bound, product-specific decision guidance. | | [VMware HCX Mobility Optimized Networking (MON) guidance](https://learn.microsoft.com/en-us/azure/azure-vmware/vmware-hcx-mon-guidance) | best-practices | 0.70 | Provides recommended configurations to mitigate NSX data path constraints and improve HCX migration performance—product-specific best practices. | +| [Configure Azure Elastic SAN](https://learn.microsoft.com/en-us/azure/azure-vmware/configure-azure-elastic-san) | integrations | 0.68 | How-to article for using Azure Elastic SAN as backing storage for Azure VMware Solution via iSCSI VMFS datastores. Likely includes product-specific integration steps, required settings, and configuration parameters unique to AVS + Elastic SAN rather than just conceptual guidance. | | [Enable guest management and install extensions on Arc-enabled VMs](https://learn.microsoft.com/en-us/azure/azure-vmware/arc-enable-guest-management) | configuration | 0.68 | How-to article for enabling guest management and installing extensions on Arc-enabled VMware VMs. This typically includes product-specific steps, required settings, and parameter values (for Arc agent/extension enablement) that go beyond generic knowledge, fitting the configuration sub-skill. | | [Use VMware HCX Run Commands](https://learn.microsoft.com/en-us/azure/azure-vmware/use-hcx-run-commands) | integrations | 0.68 | Describes a collection of VMware HCX-specific PowerShell cmdlets exposed as Azure VMware Solution Run Commands, including how to invoke privileged operations. 
This is product- and integration-specific command usage that an LLM is unlikely to know in detail from training, fitting the integrations & coding patterns category better than generic how-to content. | | [Deploy disaster recovery using JetStream DR software](https://learn.microsoft.com/en-us/azure/azure-vmware/deploy-disaster-recovery-using-jetstream) | deployment | 0.66 | Covers implementing JetStream DR across protected and recovery sites using Azure VMware Solution, including how instances are deployed and how CDP via VAIO is used. This is a product-specific DR deployment pattern for AVS and on-premises VMware, which qualifies as expert deployment knowledge rather than generic DR theory. | @@ -161,6 +166,7 @@ confusable_not_for: Not for Azure Virtual Machines (use azure-virtual-machines), | [Save costs with a reserved instance](https://learn.microsoft.com/en-us/azure/azure-vmware/reserved-instance) | decision-making | 0.65 | Covers how reserved instances apply specifically to Azure VMware Solution hosts, including what portions of usage are covered (compute and software licensing) and how discounts are applied. This is product- and SKU-specific cost/selection guidance that informs when and how to choose reservations, fitting decision-making. Not just a conceptual overview of reservations. | | [Security recommendations](https://learn.microsoft.com/en-us/azure/azure-vmware/security-recommendations) | best-practices | 0.65 | Explicitly a security recommendations article; likely includes AVS-specific DOs/DON’Ts and configuration guidance beyond generic security advice. | | [VMware Cloud Director on Azure VMware Solution network scenarios](https://learn.microsoft.com/en-us/azure/azure-vmware/enable-vmware-vcd-with-azure-network) | architecture-patterns | 0.65 | Provides multiple AVS networking scenarios for VCD tenants, describing when to use each connectivity pattern to internet and Azure services. 
| +| [Disaster recovery solutions for VMs](https://learn.microsoft.com/en-us/azure/azure-vmware/ecosystem-disaster-recovery-vms) | troubleshooting | 0.63 | Article focuses on disaster recovery solutions and explicitly mentions 'critical known issues' related to vSphere upgrades, AVS, and security enhancements. This suggests symptom/issue descriptions and product-specific guidance to address them, fitting troubleshooting more than generic DR design. | | [Deploy Citrix on Azure VMware Solution](https://learn.microsoft.com/en-us/azure/azure-vmware/azure-vmware-solution-citrix) | deployment | 0.62 | Focuses on deploying Citrix Virtual Apps and Desktops service on Azure VMware Solution. Citrix-on-AVS is a specific deployment scenario with product-specific requirements and constraints (e.g., supported models, topology, and AVS usage patterns) that go beyond generic deployment knowledge, fitting the deployment sub-skill for a particular platform integration. | | [Enable VMware Cloud Director Service with Azure VMware Solution](https://learn.microsoft.com/en-us/azure/azure-vmware/enable-vmware-cds-with-azure) | deployment | 0.62 | Describes enabling VMware Cloud Director service on top of Azure VMware Solution private clouds to provision and manage virtual datacenters. This is a product-specific enablement/deployment scenario that likely includes required configurations and constraints for wiring the services together, best fitting deployment. | | [API Management](https://learn.microsoft.com/en-us/azure/azure-vmware/architecture-api-management) | architecture-patterns | 0.60 | Provides recommendations for integrating AVS into hub-and-spoke architectures in hybrid scenarios, which is a product-specific architecture pattern. 
| @@ -218,10 +224,10 @@ confusable_not_for: Not for Azure Virtual Machines (use azure-virtual-machines), | [Bitnami appliance deployment](https://learn.microsoft.com/en-us/azure/azure-vmware/bitnami-appliances-deployment) | 0.25 | Explains how to deploy Bitnami virtual appliances (LAMP, Jenkins, PostgreSQL, NGINX, RabbitMQ) on Azure VMware Solution. The summary suggests a guided installation/configuration tutorial, but not detailed configuration parameter tables, limits, or troubleshooting mappings; it reads as deployment guidance rather than deep, product-specific expert reference. | | [9 - Delete a private cloud](https://learn.microsoft.com/en-us/azure/azure-vmware/tutorial-delete-private-cloud) | 0.20 | Tutorial-style delete operation for an Azure VMware Solution private cloud; no detailed limits, configuration parameter tables, error-code-based troubleshooting, or other product-specific expert data beyond generic deletion behavior. | | [Deploy VMs from the content library](https://learn.microsoft.com/en-us/azure/azure-vmware/deploy-vm-content-library) | 0.20 | Primarily a procedural tutorial on creating a vSphere content library and deploying a VM from an ISO. It does not emphasize configuration parameter tables, limits, or product-specific edge cases; it’s mostly step-by-step UI usage that an LLM can approximate without needing this exact document. | -| [FAQ](https://learn.microsoft.com/en-us/azure/azure-vmware/faq) | 0.20 | FAQ page with general Q&A; summary does not indicate detailed limits, configuration tables, or error-code-based troubleshooting. Likely high-level clarifications rather than deep expert knowledge. 
| | [Introduction to Gen 2](https://learn.microsoft.com/en-us/azure/azure-vmware/native-introduction) | 0.20 | Introductory/architecture overview of Azure VMware Solution Gen 2 private clouds; summary does not show concrete limits tables, configuration parameters, or other detailed expert data beyond generic statements about networking and performance. | | [Operating system support for VMs](https://learn.microsoft.com/en-us/azure/azure-vmware/ecosystem-os-vms) | 0.20 | High-level OS support statement; no detailed support matrix or numeric constraints in the summary. | | [Overview](https://learn.microsoft.com/en-us/azure/azure-vmware/resource-health-for-azure-vmware-solution-overview) | 0.20 | High-level overview of Azure Resource Health for AVS; summary indicates conceptual description of what Resource Health does, without concrete error codes, configuration parameters, or limits. | | [Security solutions for Azure VMware Solution](https://learn.microsoft.com/en-us/azure/azure-vmware/ecosystem-security-solutions) | 0.20 | High-level description of security partner solutions and ecosystem; appears marketing/overview oriented without specific RBAC roles, configuration parameters, or detailed integration settings. Lacks the concrete, product-specific security configuration details required for the security or other expert categories. | | [What's new](https://learn.microsoft.com/en-us/azure/azure-vmware/azure-vmware-solution-platform-updates) | 0.20 | A 'what's new' / platform updates page; typically lists feature changes and lifecycle notes but not detailed limits, configs, error codes, or decision matrices as defined. No clear evidence of structured expert knowledge per the categories. 
| | [8 - Scale a private cloud](https://learn.microsoft.com/en-us/azure/azure-vmware/tutorial-scale-private-cloud) | - | Not in LLM response | +| [FAQ](https://learn.microsoft.com/en-us/azure/azure-vmware/faq) | - | FAQ page is primarily conceptual and Q&A oriented; without evidence of detailed numeric limits, configuration tables, or error-code-based troubleshooting, it does not clearly match any expert-knowledge sub-skill type. | diff --git a/products/azure-vpn-gateway/azure-vpn-gateway.csv b/products/azure-vpn-gateway/azure-vpn-gateway.csv index 8f263431..83462a2c 100644 --- a/products/azure-vpn-gateway/azure-vpn-gateway.csv +++ b/products/azure-vpn-gateway/azure-vpn-gateway.csv @@ -7,7 +7,7 @@ https://learn.microsoft.com/en-us/azure/vpn-gateway/add-remove-site-to-site-conn https://learn.microsoft.com/en-us/azure/vpn-gateway/azure-vpn-client-optional-configurations,Azure VPN Client - optional settings,Configure Azure VPN Client optional settings - Azure VPN Gateway,Configure optional Azure VPN Client settings for P2S,"Learn how to configure optional configuration settings for the Azure VPN Client. Settings include DNS suffixes, custom DNS servers, custom routes, and VPN client forced tunneling.","This article helps you configure optional settings for the Azure VPN Client for VPN Gateway point-to-site (P2S) connections. You can configure DNS suffixes, custom DNS servers, custom routes, and VPN client-side forced tunneling. Note The Azure VPN Client is only supported for OpenVPN® protocol connections.",2025-07-03T05:10:00.000Z,how-to,configuration,0.74,True,"The article describes concrete, product-specific configuration options for the Azure VPN Client (DNS suffixes, custom DNS servers, custom routes, forced tunneling) tied to Azure VPN Gateway P2S connections. It focuses on how to set particular client-side settings and their effects, which are implementation details not generally known from training. 
This aligns best with the configuration sub-skill, as it is about specific client configuration behavior rather than generic VPN concepts.",unchanged https://learn.microsoft.com/en-us/azure/vpn-gateway/azure-vpn-client-prerequisites-check,Run Prerequisites Test,Azure VPN Client prerequisites check - Azure VPN Gateway,Run Azure VPN Client prerequisites check and fix issues,Learn how to run the Azure VPN Client prerequisites check to identify missing prerequisites and mitigate them.,"If you're using the Azure VPN Client for Windows to connect to your point-to-site (P2S) VPN, you can run a prerequisites check to identify missing prerequisites and mitigate them. The Run Prerequisites Test feature checks the state of Windows services, background permissions for the client, local setting permissions, internet access, and user device time sync status. You can use this feature to do the following: The Run Prerequisites Test feature is available in the Azure VPN Client for Windows, ver",2025-03-14T08:00:00.000Z,how-to,troubleshooting,0.7,True,"Describes the built-in prerequisites test, specific Windows services, permissions, and conditions it checks, and how to remediate failures—symptom-to-fix guidance.",unchanged https://learn.microsoft.com/en-us/azure/vpn-gateway/azure-vpn-client-versions,Azure VPN Client versions,Azure VPN Client versions - Azure VPN Gateway,,This article shows the Azure VPN Client versions.,"This article helps you view each of the versions of the Azure VPN Client. As new client versions become available, they're added to this article. To view the version number of an installed Azure VPN Client, launch the client and select Help.
For the list of Azure VPN Client instructions, including how to download the Azure VPN Client, see the table in VPN Client configuration requirements.",2026-04-01T06:12:00.000Z,concept-article,,0.2,False,"Version history list for the Azure VPN Client; likely just version numbers and dates without limits, configuration matrices, error codes, or decision criteria. Does not match any expert-knowledge sub-skill type defined.",unchanged -https://learn.microsoft.com/en-us/azure/vpn-gateway/basic-public-ip-migrate-about,About Basic SKU public IP address migration,About migrating a Basic SKU public IP address to Standard SKU - Azure VPN Gateway,Plan migration from Basic to Standard public IP for VPN Gateway,This article explains the migration process from a Basic SKU public IP address to a Standard SKU public IP address for VPN Gateway deployments that are currently using a Basic SKU public IP address. T,"This article explains the migration process from a Basic SKU public IP address to a Standard SKU public IP address for VPN Gateway deployments. There are separate migration timelines, depending on the VPN Gateway SKU that your gateway is currently configured to use. Important For anticipated migration timelines, see the VPN Gateway - What's new article.",2026-03-06T23:15:00.000Z,concept-article,deployment,0.65,True,"Covers migration process and timelines for moving VPN Gateway deployments from Basic to Standard public IP SKUs.
This is a product-specific migration/deployment scenario with schedule and process considerations, beyond generic concepts.",unchanged +https://learn.microsoft.com/en-us/azure/vpn-gateway/basic-public-ip-migrate-about,About Basic SKU public IP address migration,About migrating a Basic SKU public IP address to Standard SKU - Azure VPN Gateway,Decide and plan VPN Gateway public IP SKU migration,This article explains the migration process from a Basic SKU public IP address to a Standard SKU public IP address for VPN Gateway deployments that are currently using a Basic SKU public IP address. T,"This article explains the migration process from a Basic SKU public IP address to a Standard SKU public IP address for VPN Gateway deployments. There are separate migration timelines, depending on the VPN Gateway SKU that your gateway is currently configured to use. Important For anticipated migration timelines, see the VPN Gateway - What's new article.",2026-04-24T18:40:00.000Z,concept-article,decision-making,0.68,True,"The page is focused on migrating from Basic to Standard public IP SKUs for Azure VPN Gateway, which is a product-specific migration/upgrade decision. It discusses different migration timelines and behavior depending on the current VPN Gateway SKU, which fits decision-making guidance around upgrade paths and SKU choices rather than generic how-to.
While the summary doesn’t show numeric limits, it clearly provides migration considerations and SKU-dependent guidance that an LLM is unlikely to infer from general training.",updated https://learn.microsoft.com/en-us/azure/vpn-gateway/basic-public-ip-migrate-howto,Migrate a Basic SKU public IP address to Standard,How to migrate a Basic SKU public IP address to a Standard SKU for VPN Gateway - Azure VPN Gateway,Execute Basic-to-Standard public IP migration for VPN Gateway,Learn how to migrate from a Basic SKU public IP address to a Standard SKU public IP address for VPN Gateway deployment.,"This article helps you migrate a Basic SKU public IP address to a Standard SKU for VPN Gateway deployments that use gateway SKUs VpnGw 1-5. For more information about Basic SKU public IP address migration, see About migrating a Basic SKU public IP address to Standard SKU for VPN Gateway. Important During the public IP address SKU migration process, your Basic SKU public IP address resource is migrated to a Standard SKU public IP address resource. The IP address assigned to your gateway doesn't ch",2026-03-02T23:28:00.000Z,how-to,deployment,0.7,True,Step-by-step migration procedure for changing a VPN Gateway’s public IP SKU while preserving the IP address. This is a concrete deployment/migration pattern specific to Azure VPN Gateway and public IP SKUs.,unchanged https://learn.microsoft.com/en-us/azure/vpn-gateway/basic-sku-public-ip-remove,Remove the Basic SKU public IP reference - Basic SKU VPN gateways,Remove the Basic SKU public IP Reference from a Basic SKU VPN gateway - Azure portal,,Learn how to remove the Basic SKU public IP address resource reference from an existing Basic SKU VPN gateway using the Azure portal.,"Basic SKU public IP addresses are being retired in Azure. The steps in this article help you remove the Basic SKU public IP address resource reference from an existing Basic SKU VPN gateway.
This change doesn't update the gateway's public IP address and doesn't interrupt connectivity. These steps apply only to Basic SKU VPN gateways that have a Basic SKU public IP address resource reference. If your gateway doesn't have a Basic SKU public IP address reference, see About migrating a Basic SKU publ",2026-03-24T22:22:00.000Z,how-to,,0.3,False,"Focused on a migration/cleanup task (removing Basic SKU public IP reference) with portal steps; summary does not suggest detailed limits, configuration parameter tables, or structured troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/vpn-gateway/bgp-how-to-cli,Azure CLI,Configure BGP for VPN Gateway: CLI - Azure VPN Gateway,Configure BGP for VPN Gateway using Azure CLI,Learn how to configure BGP for VPN gateways using CLI.,"This article helps you enable BGP on cross-premises site-to-site (S2S) VPN connections and VNet-to-VNet connections using Azure CLI. You can also create this configuration using the Azure portal or PowerShell steps. BGP is the standard routing protocol commonly used in the Internet to exchange routing and reachability information between two or more networks.
BGP enables the Azure VPN gateways and your on-premises VPN devices, called BGP peers or neighbors, to exchange ""routes"" that will inform both",2023-03-08T00:00:00.000Z,how-to,configuration,0.65,True,CLI-based BGP configuration uses specific commands and parameter names/constraints that are product-specific.,unchanged @@ -122,5 +122,5 @@ https://learn.microsoft.com/en-us/azure/vpn-gateway/vpn-gateway-verify-connectio https://learn.microsoft.com/en-us/azure/vpn-gateway/vpn-gateway-vnet-vnet-rm-ps,Azure PowerShell,Connect a VNet to another VNet using a VPN Gateway VNet-to-VNet connection: PowerShell - Azure VPN Gateway,Connect VNets with VNet-to-VNet using PowerShell,Learn how to connect virtual networks together using a VNet-to-VNet connection and PowerShell.,"This article helps you connect virtual networks by using the VNet-to-VNet connection type. The virtual networks can be in the same or different regions, and from the same or different subscriptions. When you connect virtual networks from different subscriptions, the subscriptions don't need to be associated with the same tenant. If you already have VNets that you want to connect and they're in the same subscription, you might want to use the Azure portal steps instead because the process is less c",2025-03-26T08:00:00.000Z,how-to,configuration,0.6,True,"PowerShell article exposes concrete cmdlets and parameter names/values for VNet-to-VNet connections, which are product-specific.",unchanged https://learn.microsoft.com/en-us/azure/vpn-gateway/vpn-gateway-vpn-faq,VPN Gateway FAQ,Azure VPN Gateway FAQ,Azure VPN Gateway FAQ with limits and behaviors,Get answers to frequently asked questions about VPN Gateway connections and configuration settings.,"This article answers frequently asked questions about Azure VPN Gateway cross-premises connections, hybrid configuration connections, and virtual network (VNet) gateways.
It contains comprehensive information about point-to-site (P2S), site-to-site (S2S), and VNet-to-VNet configuration settings, including the Internet Protocol Security (IPsec) and Internet Key Exchange (IKE) protocols.",2025-09-09T08:00:00.000Z,concept-article,limits-quotas,0.7,True,"FAQ for VPN Gateway typically includes precise connection limits, throughput caps, supported combinations, and protocol specifics that are numeric and SKU-dependent.",unchanged https://learn.microsoft.com/en-us/azure/vpn-gateway/vpn-profile-intune,Intune - Deploy VPN client profile,Create an Intune profile for Azure VPN clients - Azure VPN Gateway,Deploy Azure VPN client profiles using Intune,Learn how to create an Intune custom profile to deploy Azure VPN client profiles.,You can deploy profiles for Azure VPN clients (Windows 10 or later) by using Microsoft Intune. This article helps you create an Intune profile using custom settings. Note,2025-03-31T08:00:00.000Z,how-to,deployment,0.65,True,Shows Intune custom profile schema/OMA-URI or JSON for VPN profiles—product-specific deployment configuration for managed rollout.,unchanged -https://learn.microsoft.com/en-us/azure/vpn-gateway/whats-new,What's new?,What's new in Azure VPN Gateway?,,"Learn what's new with Azure VPN Gateway such as the latest release notes, known issues, bug fixes, deprecated functionality, and upcoming changes.",Azure VPN Gateway is updated regularly. Stay up to date with the latest announcements. 
This article provides you with information about: You can also find the latest VPN Gateway updates and subscribe to the RSS feed here.,2026-04-14T08:00:00.000Z,concept-article,,0.0,False,"Release notes and 'what's new' content typically list changes, fixes, and announcements but not structured limits, configuration matrices, decision criteria, or troubleshooting mappings as defined by the sub-skill types.",updated +https://learn.microsoft.com/en-us/azure/vpn-gateway/whats-new,What's new?,What's new in Azure VPN Gateway?,,"Learn what's new with Azure VPN Gateway such as the latest release notes, known issues, bug fixes, deprecated functionality, and upcoming changes.",Azure VPN Gateway is updated regularly. Stay up to date with the latest announcements. This article provides you with information about: You can also find the latest VPN Gateway updates and subscribe to the RSS feed here.,2026-04-14T08:00:00.000Z,concept-article,,0.0,False,"Release notes and 'what's new' content typically list changes, fixes, and announcements but not structured limits, configuration matrices, decision criteria, or troubleshooting mappings as defined by the sub-skill types.",unchanged https://learn.microsoft.com/en-us/azure/vpn-gateway/work-remotely-support,Leveraging Azure VPN connections,Remote work and point-to-site VPN gateways - Azure VPN Gateway,Plan remote work using P2S VPN Gateways,Learn how you can use VPN Gateway point-to-site connections in order to work remotely.,This article describes the options that are available to organizations to set up remote access for their users or to supplement their existing solutions with additional capacity. The Azure VPN Gateway point-to-site VPN solution is cloud-based and can be provisioned quickly to cater for the increased demand of users to work from home.
It can scale up easily and turned off just as easily and quickly when the increased capacity isn't needed anymore.,2025-03-31T08:00:00.000Z,concept-article,decision-making,0.6,True,Discusses options and capacity considerations for remote access; helps choose when and how to use P2S versus other solutions.,unchanged diff --git a/products/azure-vpn-gateway/report.md b/products/azure-vpn-gateway/report.md index f6a97e04..5de4af84 100644 --- a/products/azure-vpn-gateway/report.md +++ b/products/azure-vpn-gateway/report.md @@ -1,12 +1,12 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: architecture-patterns: Design patterns and guidance for choosing VPN Gateway topologies, configuring active-active gateways, and building highly available, resilient site-to-site connectivity. - decision-making: Guidance on choosing VPN Gateway SKUs, understanding SKU mappings, - and planning/migrating VPN setups (P2S SSTP→IKEv2/OpenVPN, Classic→ARM, and remote - work P2S strategies). + decision-making: Guidance on choosing VPN Gateway SKUs, planning SKU/IP migrations, + mapping old to new SKUs, and migrating/architecting P2S and Classic-to-ARM VPN + gateways for remote access. security: 'Securing Azure VPN Gateway: IPsec/IKE policies, forced tunneling, cert/RADIUS auth, Entra ID & MFA for P2S, client config (Win/macOS/Linux), access control, roles, and crypto best practices.' @@ -30,17 +30,15 @@ category_descriptions: skill_description: Expert knowledge for Azure VPN Gateway development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when - configuring S2S/P2S tunnels, IPsec/IKE/BGP, Entra/RADIUS auth, active-active gateways, - or S2S over ExpressRoute, and other Azure VPN Gateway related development tasks. 
- Not for Azure ExpressRoute (use azure-expressroute), Azure Virtual WAN (use azure-virtual-wan), - Azure Virtual Network (use azure-virtual-network), Azure Virtual Network Manager - (use azure-virtual-network-manager). -use_when: Use when configuring S2S/P2S tunnels, IPsec/IKE/BGP, Entra/RADIUS auth, - active-active gateways, or S2S over ExpressRoute, and other Azure VPN Gateway related - development tasks. -confusable_not_for: Not for Azure ExpressRoute (use azure-expressroute), Azure Virtual - WAN (use azure-virtual-wan), Azure Virtual Network (use azure-virtual-network), - Azure Virtual Network Manager (use azure-virtual-network-manager). + configuring S2S/P2S tunnels, BGP routing, IPsec/IKE policies, Entra/RADIUS auth, + or cross-cloud VPNs, and other Azure VPN Gateway related development tasks. Not + for Azure Virtual Network (use azure-virtual-network), Azure Virtual WAN (use azure-virtual-wan), + Azure ExpressRoute (use azure-expressroute), Azure Application Gateway (use azure-application-gateway). +use_when: Use when configuring S2S/P2S tunnels, BGP routing, IPsec/IKE policies, Entra/RADIUS + auth, or cross-cloud VPNs, and other Azure VPN Gateway related development tasks. +confusable_not_for: Not for Azure Virtual Network (use azure-virtual-network), Azure + Virtual WAN (use azure-virtual-wan), Azure ExpressRoute (use azure-expressroute), + Azure Application Gateway (use azure-application-gateway). 
--- # Azure VPN Gateway Crawl Report @@ -66,8 +64,8 @@ confusable_not_for: Not for Azure ExpressRoute (use azure-expressroute), Azure V | architecture-patterns | 3 | 2.4% | | best-practices | 1 | 0.8% | | configuration | 53 | 43.1% | -| decision-making | 5 | 4.1% | -| deployment | 11 | 8.9% | +| decision-making | 6 | 4.9% | +| deployment | 10 | 8.1% | | integrations | 4 | 3.3% | | limits-quotas | 2 | 1.6% | | security | 19 | 15.4% | @@ -78,8 +76,8 @@ confusable_not_for: Not for Azure ExpressRoute (use azure-expressroute), Azure V ### Updated Pages -- [What's new?](https://learn.microsoft.com/en-us/azure/vpn-gateway/whats-new) - - Updated: 2026-03-24T02:22:00.000Z → 2026-04-14T08:00:00.000Z +- [About Basic SKU public IP address migration](https://learn.microsoft.com/en-us/azure/vpn-gateway/basic-public-ip-migrate-about) + - Updated: 2026-03-06T23:15:00.000Z → 2026-04-24T18:40:00.000Z ## Classified Pages @@ -158,9 +156,9 @@ confusable_not_for: Not for Azure ExpressRoute (use azure-expressroute), Azure V | [User tunnel](https://learn.microsoft.com/en-us/azure/vpn-gateway/vpn-gateway-howto-always-on-user-tunnel) | configuration | 0.70 | User tunnel configuration requires concrete VPN profile parameters and Azure-side settings that are product-specific. | | [VPN Gateway FAQ](https://learn.microsoft.com/en-us/azure/vpn-gateway/vpn-gateway-vpn-faq) | limits-quotas | 0.70 | FAQ for VPN Gateway typically includes precise connection limits, throughput caps, supported combinations, and protocol specifics that are numeric and SKU-dependent. | | [VPN over private peering](https://learn.microsoft.com/en-us/azure/vpn-gateway/site-to-site-vpn-private-peering) | integrations | 0.70 | Integration pattern combining VPN Gateway with ExpressRoute, including configuration constraints and supported scenarios. 
| +| [About Basic SKU public IP address migration](https://learn.microsoft.com/en-us/azure/vpn-gateway/basic-public-ip-migrate-about) | decision-making | 0.68 | The page is focused on migrating from Basic to Standard public IP SKUs for Azure VPN Gateway, which is a product-specific migration/upgrade decision. It discusses different migration timelines and behavior depending on the current VPN Gateway SKU, which fits decision-making guidance around upgrade paths and SKU choices rather than generic how-to. While the summary doesn’t show numeric limits, it clearly provides migration considerations and SKU-dependent guidance that an LLM is unlikely to infer from general training. | | [About gateway SKU consolidation & migration](https://learn.microsoft.com/en-us/azure/vpn-gateway/gateway-sku-consolidation) | decision-making | 0.68 | The page describes concrete migration mappings from deprecated VPN Gateway SKUs to new SKUs that support availability zones, along with implications on redundancy, availability, and cost. This is product-specific guidance to decide which new SKU to use when an existing SKU is consolidated, fitting decision-making. It goes beyond generic concepts by detailing SKU-level changes and how they affect customers’ choices. | | [Install VPN client certificates](https://learn.microsoft.com/en-us/azure/vpn-gateway/point-to-site-how-to-vpn-client-install-azure-cert) | configuration | 0.68 | Details OS-specific certificate import steps and required certificate stores/locations for Azure P2S client authentication. | -| [About Basic SKU public IP address migration](https://learn.microsoft.com/en-us/azure/vpn-gateway/basic-public-ip-migrate-about) | deployment | 0.65 | Covers migration process and timelines for moving VPN Gateway deployments from Basic to Standard public IP SKUs. This is a product-specific migration/deployment scenario with schedule and process considerations, beyond generic concepts. 
| | [About active-active mode gateways](https://learn.microsoft.com/en-us/azure/vpn-gateway/about-active-active-gateways) | architecture-patterns | 0.65 | Explains when and how to use active-active mode, including design benefits and trade-offs specific to VPN Gateway. | | [Add or remove a site-to-site connection](https://learn.microsoft.com/en-us/azure/vpn-gateway/add-remove-site-to-site-connections) | configuration | 0.65 | Describes how to manage multiple S2S connections, including limitations and prerequisites specific to VPN Gateway. | | [Azure CLI](https://learn.microsoft.com/en-us/azure/vpn-gateway/bgp-how-to-cli) | configuration | 0.65 | CLI-based BGP configuration uses specific commands and parameter names/constraints that are product-specific. | diff --git a/products/azure-web-application-firewall/azure-web-application-firewall.csv b/products/azure-web-application-firewall/azure-web-application-firewall.csv index b7a49a08..3f416b04 100644 --- a/products/azure-web-application-firewall/azure-web-application-firewall.csv +++ b/products/azure-web-application-firewall/azure-web-application-firewall.csv @@ -77,7 +77,7 @@ https://learn.microsoft.com/en-us/azure/web-application-firewall/secure-web-appl https://learn.microsoft.com/en-us/azure/web-application-firewall/shared/application-ddos-protection,Application DDoS protection,Application DDoS protection - Azure Web Application Firewall,Design application DDoS protection with Azure WAF and Front Door,This article explains how you can use Azure Web Application Firewall with Azure Front Door or Azure Application Gateway to protect your web applications against application layer DDoS attacks.,Azure WAF has several defense mechanisms that can help to prevent distributed denial of service (DDoS) attacks. The DDoS attacks can target at both network layer (L3/L4) or application layer (L7). Azure DDoS protects customer against large network layer volumetric attacks. 
Azure WAF operating at layer 7 protects web applications against L7 DDoS attacks such as HTTP Floods. These defenses can prevent attackers from reaching your application and affect your application's availability and performan,2025-03-31T08:00:00.000Z,concept-article,architecture-patterns,0.6,True,Explains how to use WAF with Front Door or Application Gateway for L7 DDoS; likely includes pattern-level guidance on when to use which component for DDoS mitigation.,unchanged https://learn.microsoft.com/en-us/azure/web-application-firewall/shared/manage-policies,Configure policies using Firewall Manager,Use Azure Firewall Manager to manage Web Application Firewall policies,Manage WAF policies centrally with Azure Firewall Manager,Learn about managing Azure Web Application Firewall policies using Azure Firewall Manager,"Azure Firewall Manager is a platform to manage and protect your network security resources at scale. You can associate your WAF policies to an Application Gateway or Azure Front Door within Azure Firewall Manager, all in a single place.",2022-07-22T17:04:00.000Z,concept-article,configuration,0.6,True,Explains associating WAF policies to Application Gateway or Front Door via Firewall Manager; involves product-specific management and configuration flows.,unchanged https://learn.microsoft.com/en-us/azure/web-application-firewall/shared/waf-azure-policy,Use Azure Policy,Azure Web Application Firewall and Azure Policy,Enforce WAF governance using Azure Policy,Azure Web Application Firewall (WAF) combined with Azure Policy can help enforce organizational standards and assess compliance at-scale for WAF resources,"Azure Web Application Firewall (WAF) combined with Azure Policy can help enforce organizational standards and assess compliance at-scale for WAF resources. Azure Policy is a governance tool that provides an aggregated view to evaluate the overall state of the environment, with the ability to drill down to the per-resource, per-policy granularity. 
Azure Policy also helps to bring your resources to compliance through bulk remediation for existing resources and automatic remediation for new resourc",2023-06-09T11:20:00.000Z,concept-article,security,0.65,True,Describes using Azure Policy definitions and assignments to enforce WAF configurations; includes security/governance settings and compliance behaviors specific to WAF resources.,unchanged -https://learn.microsoft.com/en-us/azure/web-application-firewall/support-help,Support and troubleshooting,Support and Troubleshooting - Azure Web Application Firewall,,How to obtain help and support for questions or problems when you create solutions using Azure Web Application Firewall (WAF).,Here are suggestions for where you can get help when developing your Azure Web Application Firewall (WAF) solutions.,2026-04-15T06:10:00.000Z,troubleshooting,,0.1,False,"Page is about where to get help and support for Azure WAF, not technical troubleshooting content. It does not list specific error codes, diagnostic steps, configuration parameters, limits, or other product-specific expert details.",new +https://learn.microsoft.com/en-us/azure/web-application-firewall/support-help,Support and troubleshooting,Support and Troubleshooting - Azure Web Application Firewall,,How to obtain help and support for questions or problems when you create solutions using Azure Web Application Firewall (WAF).,Here are suggestions for where you can get help when developing your Azure Web Application Firewall (WAF) solutions.,2026-04-15T06:10:00.000Z,troubleshooting,,0.1,False,"Page is about where to get help and support for Azure WAF, not technical troubleshooting content. 
It does not list specific error codes, diagnostic steps, configuration parameters, limits, or other product-specific expert details.",unchanged https://learn.microsoft.com/en-us/azure/web-application-firewall/waf-copilot,Microsoft Security Copilot,Azure Web Application Firewall integration in Microsoft Security Copilot,Investigate Azure WAF events with Security Copilot,Learn about using Microsoft Security Copilot to investigate traffic flagged by Azure Web Application Firewall.,"Microsoft Security Copilot is a cloud-based AI platform that provides natural language copilot experience. It can help support security professionals in different scenarios, like incident response, threat hunting, and intelligence gathering. For more information, seeWhat is Microsoft Security Copilot? Azure Web Application Firewall (WAF) integration in Microsoft Security Copilot enables deep investigation of Azure WAF events. It can help you investigate WAF logs triggered by Azure WAF in a matte",2025-06-09T22:05:00.000Z,concept-article,integrations,0.65,True,"Describes integration between WAF and Security Copilot, including how WAF logs are surfaced and queried; product-specific integration behavior.",unchanged https://learn.microsoft.com/en-us/azure/web-application-firewall/waf-javascript-challenge,JavaScript challenge,Web Application Firewall JavaScript Challenge,Use JavaScript challenge for bot mitigation in WAF,Learn about Azure Web Application Firewall JavaScript challenge on Azure Front Door and Azure Application Gateway.,Azure Web Application Firewall (WAF) on Azure Application Gateway offers a JavaScript (JS) challenge feature as one of the mitigation options for advanced bot protection. Azure Web Application Firewall (WAF) on Azure Front Door offers a JavaScript (JS) challenge feature as one of the mitigation options for advanced bot protection. It's available on the premium version as an action in the custom rule set and the Bot Manager 1.x ruleset. 
The JavaScript challenge is an invisible web challenge that ,2025-11-18T18:43:00.000Z,concept-article,configuration,0.7,True,"Explains JS challenge feature, availability by platform/tier, and how it is configured as an action; includes product-specific settings and constraints.",unchanged https://learn.microsoft.com/en-us/azure/web-application-firewall/waf-new-threat-detection,Detect new threats using Microsoft Sentinel,Detect new threats using Microsoft Sentinel with Azure Web Application Firewall,Detect new web threats using WAF and Sentinel,This article shows you how to use Microsoft Sentinel with Azure Web Application Firewall (WAF) to detect new threats to your network.,"Web applications face frequent malicious attacks that exploit well-known vulnerabilities, such as Code Injection and Path Traversal Attacks. These attacks are hard to prevent in the application code, as they require constant maintenance, patching, and monitoring at multiple levels of the application architecture. A Web Application Firewall (WAF) solution can provide faster and centralized security by patching a known vulnerability for all web applications, rather than securing each one individua",2024-01-23T18:04:00.000Z,how-to,integrations,0.7,True,"Focuses on using Sentinel analytics with WAF to detect threats; includes specific queries, rules, or integration patterns unique to this combo.",unchanged diff --git a/products/azure-web-application-firewall/report.md b/products/azure-web-application-firewall/report.md index ea80e7dd..7490028c 100644 --- a/products/azure-web-application-firewall/report.md +++ b/products/azure-web-application-firewall/report.md @@ -52,10 +52,10 @@ confusable_not_for: Not for Azure Application Gateway (use azure-application-gat - **Unclassified**: 12 ### Incremental Update -- **New Pages**: 1 +- **New Pages**: 0 - **Updated Pages**: 0 -- **Unchanged**: 79 -- **Deleted Pages**: 1 +- **Unchanged**: 80 +- **Deleted Pages**: 0 - **Compared With**: 
`/home/vsts/work/1/s/Agent-Skills/products/azure-web-application-firewall/azure-web-application-firewall.csv` ## Classification Statistics @@ -75,14 +75,6 @@ confusable_not_for: Not for Azure Application Gateway (use azure-application-gat ## Changes -### New Pages - -- [Support and troubleshooting](https://learn.microsoft.com/en-us/azure/web-application-firewall/support-help) - -### Deleted Pages - -- ~~Troubleshoot~~ (https://learn.microsoft.com/en-us/azure/web-application-firewall/ag/web-application-firewall-troubleshoot) - ## Classified Pages | TOC Title | Type | Confidence | Reason | diff --git a/products/azure-well-architected/azure-well-architected.csv b/products/azure-well-architected/azure-well-architected.csv index 06f35021..1fdcc22d 100644 --- a/products/azure-well-architected/azure-well-architected.csv +++ b/products/azure-well-architected/azure-well-architected.csv @@ -14,9 +14,9 @@ https://learn.microsoft.com/en-us/azure/well-architected/ai/personas,Workload pe https://learn.microsoft.com/en-us/azure/well-architected/ai/responsible-ai,Responsible AI,Responsible AI in Azure Workloads - Microsoft Azure Well-Architected Framework,Apply responsible AI practices in Azure workloads,Learn how to develop and implement policies that support responsible AI and handle user data appropriately in workload operations on Azure.,"The goal of responsible AI in workload design is to help ensure that the use of AI algorithms isfair,transparent, andinclusive. Microsoft Azure Well-Architected Framework security principles are interrelated and focus onconfidentialityandintegrity. Security measures must be in place to maintain user privacy, protect data, and safeguard the integrity of the design. The design shouldn't be misused for unintended purposes. 
In AI workloads, models often use opaque logic to make decisions.Users shoul",2025-07-14T17:05:00.000Z,concept-article,workload-patterns,0.6,True,"Gives applied guidance on fairness, transparency, privacy, and misuse prevention in AI workload design and operations, tied to WAF security principles, beyond generic ethics discussion.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/ai/test,Test and evaluate,Test and Evaluate AI Workloads on Azure - Microsoft Azure Well-Architected Framework,Test and evaluate Azure AI models and systems,Learn about AI workload testing operations and metrics that help you maintain the quality of your workload on Azure.,"This article focuses on two distinct aspects: evaluating the models and testing the entire system. Evaluation and testing are often used interchangeably, but they should be considered separate processes that use distinct datasets. Evaluationis an iterative activity that you do during the development phase. It focuses on experimentation to find the best model with the right level of tuning. Then, evaluate the model based on various metrics. Testingincludes verifying the entire system when a chang",2025-09-05T22:04:00.000Z,concept-article,workload-patterns,0.65,True,"Provides concrete guidance on separating model evaluation from system testing, metrics, and dataset usage for AI workloads, which is domain-specific testing practice.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/ai/training-data-design,Training data design,Design Training Data for AI Workloads on Azure - Microsoft Azure Well-Architected Framework,Design training data pipelines for Azure AI workloads,"Learn about considerations for designing training data for discriminative AI workloads. 
Get guidance for data collection, processing, storing, and testing.","When you design data for AI functionality in applications, consider both non-functional requirements, such as operability, cost, and security, and functional requirements that are related to data ingestion, preparation, and validation. Data design and application design can't be decoupled. Application design requires that you understand use cases, query patterns, and freshness requirements. To address business requirements that drive the need for using AI, the application might need output from ",2025-10-30T22:04:00.000Z,concept-article,workload-patterns,0.7,True,"Provides detailed, domain-specific guidance for AI workloads (training data collection, processing, storage, validation) mapped to WAF-style nonfunctional and functional requirements. This is workload-specific (AI) architectural guidance rather than generic data design.",unchanged -https://learn.microsoft.com/en-us/azure/well-architected/architect-role/architecture-decision-record,Maintain an architecture decision record,Maintain an architecture decision record (ADR) - Microsoft Azure Well-Architected Framework,,"Learn about the benefits of creating an architecture decision record in the design process to document decisions, justifications, and implications.","An architecture decision record (ADR) is one of the most important deliverables of a solution architect. Your architecture is the accumulation of its decisions, so the ADR is effectively a record of how and why the system came to be its current shape. The ADR documents all key decisions, including alternatives that you ruled out, forarchitecturally significant requirements. Only include choices that affect the system's structure, key quality attributes, or are difficult to reverse. 
Each entry ca",2026-04-13T17:09:00.000Z,concept-article,,0.2,False,"Guidance on using Architecture Decision Records is process- and role-focused, not WAF-pillar-specific; it lacks checklist IDs, pillar-specific principles, service configuration, or workload-domain application.",updated +https://learn.microsoft.com/en-us/azure/well-architected/architect-role/architecture-decision-record,Maintain an architecture decision record,Maintain an architecture decision record (ADR) - Microsoft Azure Well-Architected Framework,,"Learn about the benefits of creating an architecture decision record in the design process to document decisions, justifications, and implications.","An architecture decision record (ADR) is one of the most important deliverables of a solution architect. Your architecture is the accumulation of its decisions, so the ADR is effectively a record of how and why the system came to be its current shape. The ADR documents all key decisions, including alternatives that you ruled out, for architecturally significant requirements. Only include choices that affect the system's structure, key quality attributes, or are difficult to reverse. Each entry ca",2026-04-13T17:09:00.000Z,concept-article,,0.2,False,"Guidance on using Architecture Decision Records is process- and role-focused, not WAF-pillar-specific; it lacks checklist IDs, pillar-specific principles, service configuration, or workload-domain application.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/architect-role/architecture-design-specification,Build architecture design specification,Develop an architecture design specification - Microsoft Azure Well-Architected Framework,,Learn about the benefits of creating an architecture design specification for a workload.
The specification describes design choices through words and diagrams.,"A workload architecture design specification is a detailed specification that describes design choices supported by architecture diagrams and justifications. These decisions must address both functional and nonfunctional choices, as well as support routine, ad hoc, and emergency operations. All of this, however, must be rooted in clear business needs. If you haven't yet established a well-defined set of business goals agreed upon with stakeholders, we recommend starting with the guide on Align tec",2025-12-09T23:07:00.000Z,concept-article,,0.4,False,Describes what an architecture design specification is and why to create one; process/role guidance rather than WAF pillar-specific expert content.,unchanged -https://learn.microsoft.com/en-us/azure/well-architected/architect-role/collaboration,Collaborate with implementors,Architect Collaboration With Workload Teams - Microsoft Azure Well-Architected Framework,,"Learn how architects collaborate with workload teams. Discover continuous tasks, POC usage, and technical debt strategies to improve delivery.","Delivering architecture specifications isn't a one-time task. Architects should engage with the workload team throughout implementation.
This article outlines practical steps for architects during implementation, including continuous collaboration tasks, proof-of-concept (POC) usage, and technical debt management to help you succeed.",2026-04-15T17:03:00.000Z,concept-article,,0.2,False,"Article describes collaboration practices between architects and workload teams; it does not provide WAF pillar checklists, recommendations tied to checklist IDs, service-specific guidance, or trade-off analysis.",updated +https://learn.microsoft.com/en-us/azure/well-architected/architect-role/collaboration,Collaborate with implementors,Architect Collaboration With Workload Teams - Microsoft Azure Well-Architected Framework,,"Learn how architects collaborate with workload teams. Discover continuous tasks, POC usage, and technical debt strategies to improve delivery.","Delivering architecture specifications isn't a one-time task. Architects should engage with the workload team throughout implementation. This article outlines practical steps for architects during implementation, including continuous collaboration tasks, proof-of-concept (POC) usage, and technical debt management to help you succeed.",2026-04-15T17:03:00.000Z,concept-article,,0.2,False,"Article describes collaboration practices between architects and workload teams; it does not provide WAF pillar checklists, recommendations tied to checklist IDs, service-specific guidance, or trade-off analysis.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/architect-role/design-business-requirements,Align with business requirements,Align technical strategy with business requirements - Microsoft Azure Well-Architected Framework,,"Learn how to gather the right information, ask the right questions, and build shared understanding so you can design architectures that are both technically sound and aligned with real business motiva","As a cloud architect, your first task is to create clarity. 
Before you can make meaningful architectural decisions, you need to understand what the system must achieve, who it serves, and which constraints you must operate within. You need a clear view of expected outcomes and the boundaries set by the business. Every decision is shaped by real pressures: budgets, delivery timelines, compliance rules, performance expectations, and service-level commitments. These aren't optional considerations. ",2025-12-09T23:07:00.000Z,concept-article,,0.4,False,"Guidance on aligning technical strategy with business requirements; conceptual architect-role content, not WAF pillar principles, checklists, or recommendations.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/architect-role/design-diagrams,Create design diagrams,Create architecture design diagrams - Microsoft Azure Well-Architected Framework,,Learn about diagramming practices and types of architecture diagrams that you can create to communicate effectively.,"Architects often communicate through diagrams. Well‑designed visuals are powerful tools that help implementers, security reviewers, and business stakeholders converge on a shared mental model, expose risks earlier, and reduce rework. To communicate with intention, an architect must select and often layer diagram types that match the message, audience, and lifecycle stage. Ultimately, the choice of architecture diagram depends on what you're trying to convey and your audience's questions. 
Archite",2025-08-14T17:03:00.000Z,concept-article,,0.3,False,"General diagramming practices and communication guidance; not tied to WAF pillars, checklists, or recommendations.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/architect-role/fundamentals,Architect role,Solution Architect's Responsibilities and Guiding Principles - Microsoft Azure Well-Architected Framework,,Learn guiding principles that Well-Architected architects should follow to be effective in their function.,"Acloud solution architectis responsible for guiding the component and topology design of workloads, ensuring they meet both initial requirements and long-term business goals. This role covers the full lifecycle of a workload, adapting the architecture as functionality evolves or organizational needs change. As an architect, your role is to gather input from stakeholders, understand the business context, and shape a design that balances technical, operational, and business considerations. Take ad",2025-12-09T23:07:00.000Z,concept-article,,0.4,False,Role and responsibility overview for solution architects; guidance is generic and not WAF-pillar-specific or checklist/recommendation-based.,unchanged @@ -59,7 +59,7 @@ https://learn.microsoft.com/en-us/azure/well-architected/cost-optimization/optim https://learn.microsoft.com/en-us/azure/well-architected/cost-optimization/optimize-component-costs,CO:07 Component costs,Architecture strategies for optimizing component costs. - Microsoft Azure Well-Architected Framework,Optimize individual component costs in Azure workloads,Learn how to optimize component costs by eliminating sources of wasted spending from an existing workload.,"Applies to this Azure Well-Architected Framework Cost Optimization checklist recommendation: This guide describes the recommendations for optimizing workload component costs. 
Optimizing component costs refers to the process of evaluating and improving the cost-efficiency of individual elements within a workload. It emphasizes the continuous review and potential removal or improvement of outdated, unnecessary, or rarely used components, such as application features, platform features, and resourc",2023-11-15T08:00:00.000Z,concept-article,recommendations,0.9,True,Maps to a Cost Optimization checklist recommendation and provides detailed steps for evaluating and improving cost-efficiency of workload components. Clear implementation guidance.,unchanged https://learn.microsoft.com/en-us/azure/well-architected/cost-optimization/optimize-data-costs,CO:10 Data costs,Architecture strategies for optimizing data costs - Microsoft Azure Well-Architected Framework,Optimize Azure data storage and management costs,Learn how to optimize data costs for a workload by aligning data spending to data priority.,"Applies to this Azure Well-Architected Framework Cost Optimization checklist recommendation: This guide describes the recommendations for optimizing data costs for a workload. Optimizing data costs involves minimizing the expenses related to the storage and management of data according to its significance and access frequency. Appropriate data management can significantly reduce overhead costs and align spending with data utility. Neglecting to optimize data costs can lead to inflated expenses, ",2023-11-15T08:00:00.000Z,concept-article,recommendations,0.9,True,Directly supports a Cost Optimization checklist recommendation and provides concrete practices for aligning data spend with data priority and access patterns. 
Implementation-level guidance.,unchanged https://learn.microsoft.com/en-us/azure/well-architected/cost-optimization/optimize-environment-costs,CO:08 Environment costs,Architecture strategies for optimizing environment costs - Microsoft Azure Well-Architected Framework,Optimize non-production and production environment costs,Learn how to optimize environment costs by tailoring each environment to its specific purpose and optimize it for cost effectiveness.,"Applies to this Azure Well-Architected Framework Cost Optimization checklist recommendation: This guide describes the recommendations for cost optimizing workload environments. Each environment should be tailored for its specific purpose and optimized for cost effectiveness. It's important to make strategic tradeoffs and allocate resources where they matter the most, without compromising on critical components. By treating environments differently and optimizing them accordingly, you can achieve",2023-11-15T08:00:00.000Z,concept-article,recommendations,0.9,True,"Supports a Cost Optimization checklist recommendation and gives prescriptive guidance on tailoring each environment for cost effectiveness. Implementation-focused, not just conceptual.",unchanged -https://learn.microsoft.com/en-us/azure/well-architected/cost-optimization/optimize-flow-costs,CO:09 Flow costs,Architecture strategies for optimizing flow costs - Microsoft Azure Well-Architected Framework,Implement cost optimization for workload data flows,Learn about Azure Well-Architected Framework recommendations for strategically optimizing the cost of each of your workload's flows.,"Applies to this Azure Well-Architected Framework Cost Optimization checklist recommendation: This guide describes the recommendations for optimizing the cost of each of the flows in your workload. Cost-optimizing the flows in a workload involves strategically allocating and managing resources to minimize expenses while maintaining performance. 
This optimization is crucial because it ensures efficient utilization of invested resources, reduces unnecessary expenditures, and improves the overall re",2026-04-15T17:03:00.000Z,concept-article,recommendations,0.78,True,"The page is explicitly tied to a Cost Optimization checklist recommendation and provides detailed guidance on how to optimize the cost of each workload flow. This aligns with the 'recommendations' definition: it is the how-to behind a checklist item, with implementation-focused, prescriptive steps rather than just high-level principles.",updated +https://learn.microsoft.com/en-us/azure/well-architected/cost-optimization/optimize-flow-costs,CO:09 Flow costs,Architecture strategies for optimizing flow costs - Microsoft Azure Well-Architected Framework,Implement cost optimization for workload data flows,Learn about Azure Well-Architected Framework recommendations for strategically optimizing the cost of each of your workload's flows.,"Applies to this Azure Well-Architected Framework Cost Optimization checklist recommendation: This guide describes the recommendations for optimizing the cost of each of the flows in your workload. Cost-optimizing the flows in a workload involves strategically allocating and managing resources to minimize expenses while maintaining performance. This optimization is crucial because it ensures efficient utilization of invested resources, reduces unnecessary expenditures, and improves the overall re",2026-04-15T17:03:00.000Z,concept-article,recommendations,0.78,True,"The page is explicitly tied to a Cost Optimization checklist recommendation and provides detailed guidance on how to optimize the cost of each workload flow. 
This aligns with the 'recommendations' definition: it is the how-to behind a checklist item, with implementation-focused, prescriptive steps rather than just high-level principles.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/cost-optimization/optimize-personnel-time,CO:13 Personnel time,Architecture strategies for optimizing personnel time - Microsoft Azure Well-Architected Framework,Optimize personnel time for Azure workload operations,Learn about Azure Well-Architected Framework recommendations for maximizing the productivity and efficiency of employees.,"Applies to this Azure Well-Architected Framework Cost Optimization checklist recommendation: This guide describes the recommendations for optimizing personnel time. This optimization is a strategic process of maximizing the productivity and efficiency of employees that design, implement, and operate the workload during their working hours. It involves aligning their skills, strengths, and tasks in a manner that ensures that every hour they spend at work is used most effectively. The goal is to e",2023-11-15T08:00:00.000Z,concept-article,recommendations,0.88,True,"Tied to a Cost Optimization checklist recommendation and describes how to align skills, tasks, and time to maximize productivity. Prescriptive, step-oriented guidance.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/cost-optimization/optimize-scaling-costs,CO:12 Scaling costs,Architecture strategies for optimizing scaling costs - Microsoft Azure Well-Architected Framework,Optimize scaling strategies to reduce Azure costs,Learn how to optimize scaling costs using specific strategies.,"Applies to this Azure Well-Architected Framework Cost Optimization checklist recommendation: This guide provides recommendations for optimizing scaling costs. Cost optimizing scaling is the process of removing inefficiencies in workload scaling. 
The goal is to reduce scaling costs while still meeting all nonfunctional requirements. Spending less to get the same result. Optimizing scaling allows you to avoid unnecessary expenses, overprovisioning, and waste. It also helps prevent unexpected spike",2023-11-15T08:00:00.000Z,concept-article,recommendations,0.9,True,Supports a Cost Optimization checklist recommendation and provides concrete strategies for removing scaling inefficiencies while meeting NFRs. Implementation-focused guidance.,unchanged https://learn.microsoft.com/en-us/azure/well-architected/cost-optimization/principles,Design principles,Cost Optimization design principles - Microsoft Azure Well-Architected Framework,Apply Cost Optimization design principles in Azure,Learn about Cost Optimization design principles that can help you achieve business objectives and justify costs.,"Architecture design is always driven by business goals and must factor in return on investment (ROI) and financial constraints. Typical questions to consider include: A cost-optimized workload isn't necessarily a low-cost workload. There are significant tradeoffs. Tactical approaches are reactive and can reduce costs only in the short term. To achieve long-term financial responsibility, you need to create a strategy with prioritization, continuous monitoring, and repeatable processes that focuses o",2025-05-29T17:00:00.000Z,concept-article,design-principles,0.9,True,"Defines named cost-optimization principles, explains why they matter (ROI, constraints, tactical vs strategic), and ties them to the Cost pillar. 
This matches WAF-specific design principles with rationale.",unchanged @@ -99,7 +99,7 @@ https://learn.microsoft.com/en-us/azure/well-architected/mission-critical/missio https://learn.microsoft.com/en-us/azure/well-architected/mission-critical/mission-critical-security,Security,Security considerations for mission-critical workloads on Azure - Microsoft Azure Well-Architected Framework,Apply security design to mission-critical Azure workloads,This section provides detailed design considerations and recommendations for the security critical design area.,"Security is one of the foundational design principles and also a key design area that must be treated as a first-class concern within the mission-critical architectural process. Given that the primary focus of a mission-critical design is to maximize reliability so that the application remains performant and available, the security considerations and recommendations applied within this design area will focus on mitigating threats with the capacity to impact availability and hinder overall reliab",2023-03-15T08:00:00.000Z,concept-article,workload-patterns,0.78,True,"Mission-critical security article with detailed, availability-focused security considerations and concrete recommendations for this workload type; applies WAF principles to a specific domain (mission-critical) rather than generic security.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/operational-excellence/,Quick links,Operational excellence quick links - Microsoft Azure Well-Architected Framework,,Learn about resources that can help you apply operational excellence guidance to your architectures.,Apply operational excellence guidance to your workload to ensure workload quality through standardized processes and team cohesion.,2026-03-31T22:07:00Z,landing-page,,0.2,False,"Quick links/overview page for the Operational Excellence pillar; primarily navigation and high-level guidance without checklist IDs, detailed implementation steps, 
or pillar-specific principles with rationale.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/operational-excellence/checklist,Checklist,Design review checklist for Operational Excellence - Microsoft Azure Well-Architected Framework,Use operational excellence design review checklist,"Learn how to incorporate an operational excellence approach in your workload for repeatable, reliable, and safe deployments of infrastructure and code.","This checklist presents a set of recommendations to help you build a culture of operational excellence. Start with a DevOps approach to integrate specializations from multiple disciplines. This approach creates a rigorous design and development practice. This approach leads to repeatable, reliable, and safe deployments of infrastructure and code. Prioritize human intervention in areas that benefit from it, and incorporate automation in other areas. Observability serves operational excellence by ",2026-03-31T22:07:00.000Z,concept-article,checklists,0.9,True,"Operational Excellence design review checklist with concise, actionable items organized under this pillar and tied to recommendations. These are WAF-specific checklist items (likely with IDs) used as quick-reference for architecture reviews.",unchanged -https://learn.microsoft.com/en-us/azure/well-architected/operational-excellence/design-patterns,Design patterns,Architecture design patterns that support operational excellence - Microsoft Azure Well-Architected Framework,,Learn about industry patterns that support operational excellence and can help you address common challenges in cloud workloads.,"When you design workload architectures, you should use industry patterns that address common challenges. Patterns help you make intentional tradeoffs and optimize for desired outcomes. They also help you mitigate risks that can impact reliability, security, performance, and cost. 
Because operations span all those areas, unmanaged risks eventually surface as operational toil or incidents. These patterns are proven in real-world cloud environments, scale with modern operating models, and are inher",2026-04-16T17:04:00.000Z,concept-article,,0.2,False,"The page describes general architecture design patterns that support operational excellence and mentions tradeoffs conceptually, but it is pattern/overview content rather than WAF checklist-linked implementation guidance, pillar-specific principles with IDs, or service/workload-specific configurations. It lacks the structured IDs, detailed checklist mapping, or cross-pillar tradeoff analysis required for the defined sub-skill types.",updated +https://learn.microsoft.com/en-us/azure/well-architected/operational-excellence/design-patterns,Design patterns,Architecture design patterns that support operational excellence - Microsoft Azure Well-Architected Framework,,Learn about industry patterns that support operational excellence and can help you address common challenges in cloud workloads.,"When you design workload architectures, you should use industry patterns that address common challenges. Patterns help you make intentional tradeoffs and optimize for desired outcomes. They also help you mitigate risks that can impact reliability, security, performance, and cost. Because operations span all those areas, unmanaged risks eventually surface as operational toil or incidents. These patterns are proven in real-world cloud environments, scale with modern operating models, and are inher",2026-04-16T17:04:00.000Z,concept-article,,0.2,False,"The page describes general architecture design patterns that support operational excellence and mentions tradeoffs conceptually, but it is pattern/overview content rather than WAF checklist-linked implementation guidance, pillar-specific principles with IDs, or service/workload-specific configurations. 
It lacks the structured IDs, detailed checklist mapping, or cross-pillar tradeoff analysis required for the defined sub-skill types.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/operational-excellence/devops-culture,OE:01 DevOps culture,Architecture strategies for fostering DevOps culture - Microsoft Azure Well-Architected Framework,Foster a DevOps culture for Azure workload teams,Learn how to embrace DevOps culture principles and practices in your workload team to strengthen the team.,"Applies to this Azure Well-Architected Framework Operational Excellence checklist recommendation: Operating a workload with a DevOps mindset requires more than tools and processes. There are two core components. The first is culture: shared ownership, accountability, continuous learning, and a focus on quality. The second is execution: teams must be able to run their workloads day to day, respond to incidents and changes, and collaborate with other teams while meeting organizational requirements",2026-02-11T19:15:00.000Z,concept-article,recommendations,0.9,True,Explicitly tied to an Operational Excellence checklist recommendation and provides concrete steps for implementing DevOps culture and execution practices. Detailed how-to guidance.,unchanged https://learn.microsoft.com/en-us/azure/well-architected/operational-excellence/enable-automation,OE:10 Automation design,Architecture strategies for enabling and implementing automation in a workload - Microsoft Azure Well-Architected Framework,Design and implement automation in Azure workloads,"Learn how to design your workload to enable automation that eliminates repetitive, error-prone tasks. 
Automation simplifies maintenance tasks, allowing you to update, patch, and upgrade your systems m","Applies to this Azure Well-Architected Framework Operational Excellence checklist recommendation: Design your workload with automation in mind to ensure routine tasks like provisioning, scaling, deployments, and maintenance are performed quickly, reliably, and consistently. Automating repetitive, error-prone tasks enables teams to work more efficiently while reducing human error. Actively look for automation opportunities to reduce management overhead and improve reliability. Evaluate each oppor",2026-03-31T22:07:00.000Z,concept-article,recommendations,0.9,True,"Described as applying to an Operational Excellence checklist recommendation and provides detailed strategies for enabling and implementing automation (provisioning, scaling, deployments, maintenance), which are implementation-focused recommendations tied to checklist items.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/operational-excellence/formalize-development-practices,OE:03 Software development practices,Architecture strategies for formalizing development practices - Microsoft Azure Well-Architected Framework,Formalize software development practices for Azure workloads,Learn how to establish standards for managing your workload team's software development practices.,"Applies to this Azure Well-Architected Framework Operational Excellence checklist recommendation: Software development is more than just producing code. Developers need to clearly understand what to build and why, while product owners and managers maintain visibility into what work is being done and how it is progressing. By establishing consistent practices, teams can deliver with quality, surface risks early and manage expectations, and track progress. 
This guide provides recommendations on h",2026-02-11T19:15:00.000Z,concept-article,recommendations,0.9,True,"Tied to an Operational Excellence checklist recommendation and provides detailed guidance on standardizing development workflows, visibility, and risk management. Concrete how-to content.",unchanged @@ -114,13 +114,13 @@ https://learn.microsoft.com/en-us/azure/well-architected/operational-excellence/ https://learn.microsoft.com/en-us/azure/well-architected/operational-excellence/tools-processes,OE:04 Tools and processes,Architecture strategies for standardizing tools and processes - Microsoft Azure Well-Architected Framework,Standardize development tools and processes for Azure teams,Learn how to optimize development practices by standardizing tools and processes. Define consistent practices to optimize efficiency and quality of work.,Applies to this Azure Well-Architected Framework Operational Excellence checklist recommendation: This guide describes the recommendations for defining standards for software development tools and processes. Defining consistent practices leads to an efficient workload team and high-quality work. High-performing teams use industry-proven tools and processes to minimize wasted effort and potential code errors. 
The first step of optimizing development practices is standardizing tools and processes.,2026-02-11T19:15:00.000Z,concept-article,recommendations,0.88,True,Supports an Operational Excellence checklist recommendation and describes specific practices for choosing and standardizing tools and processes to improve efficiency and quality.,unchanged https://learn.microsoft.com/en-us/azure/well-architected/operational-excellence/tradeoffs,Tradeoffs,Operational Excellence tradeoffs - Microsoft Azure Well-Architected Framework,Analyze Operational Excellence tradeoffs across WAF pillars,Learn about tradeoffs that you might encounter when you design workload architectures and operations for operational excellence.,"Operational Excellence provides workload quality through the implementation of clear team standards, understood responsibility and accountability, attention to customer outcomes, and team cohesion. The implementation of these goals is rooted in DevOps, which recommends minimizing process variance, reducing human error, and ultimately increasing the return of value for the workload. That value isn't just measured against the functional requirements served by the components of the workload. It's a",2024-10-10T08:00:00.000Z,concept-article,tradeoffs,0.88,True,"Discusses how implementing DevOps and operational standards affects value, reliability, cost, and other concerns. 
Cross-pillar trade-off analysis for the Operational Excellence pillar.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/operational-excellence/workload-supply-chain,OE:06 Supply chain for workload development,Architecture strategies for designing a workload development supply chain - Microsoft Azure Well-Architected Framework,Design CI/CD-based workload development supply chains,"Learn how to design a workload development supply chain based on CI/CD pipelines that ensure a predictable, efficient workload lifecycle.","Applies to this Azure Well-Architected Framework Operational Excellence checklist recommendation: To provide a predictable, standardized way to maintain your workload, design your workload development supply chain around continuous integration and continuous delivery (CI/CD). Maintain a single, standardized supply chain and implement it with automated CI/CD pipelines; you can use multiple pipelines as long as they all adhere to the same supply chain. Use a standardized supply chain to protect yo",2026-02-11T19:15:00.000Z,concept-article,recommendations,0.9,True,Supports an Operational Excellence checklist recommendation and gives prescriptive guidance on designing standardized CI/CD pipelines and supply chains. 
Concrete how-to implementation.,unchanged -https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/,Quick links,Performance efficiency quick links - Microsoft Azure Well-Architected Framework,,Learn about resources that you can use to apply performance efficiency guidance to your architecture.,Apply performance efficiency guidance to your architecture to efficiently meet workload demands.,2025-08-01T17:04:00Z,landing-page,,0.1,False,"A quick-links/navigation page for performance efficiency resources, not detailed guidance or checklists.",unchanged +https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/,Quick links,Performance efficiency quick links - Microsoft Azure Well-Architected Framework,,Learn about resources that you can use to apply performance efficiency guidance to your architecture.,Apply performance efficiency guidance to your architecture to efficiently meet workload demands.,2026-04-23T22:02:00Z,landing-page,,0.2,False,"Quick links/landing page for Performance Efficiency resources; no evidence of detailed principles, checklist IDs, or implementation guidance on the page itself.",updated https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/capacity-planning,PE:02 Capacity planning,Architecture strategies for capacity planning - Microsoft Azure Well-Architected Framework,Implement capacity planning for Azure performance efficiency,Learn about Well-Architected Framework recommendations for implementing capacity planning in your workloads.,"Applies to this Azure Well-Architected Framework Performance Efficiency checklist recommendation: This guide describes the recommendations for capacity planning. Capacity planning refers to the process of determining the resources required to meet workload performance targets. It involves estimating the amount of computing resources such as CPU, memory, storage, and network bandwidth needed to support the workload's performance requirements. 
Capacity planning helps avoid underprovisioning and en",2025-08-06T22:05:00.000Z,concept-article,recommendations,0.9,True,"Mapped to a Performance Efficiency checklist recommendation and gives concrete implementation guidance for capacity planning (CPU, memory, storage, bandwidth) to meet performance targets, fitting recommendations.",unchanged
-https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/checklist,Checklist,Design review checklist for Performance Efficiency - Microsoft Azure Well-Architected Framework,Use the Performance Efficiency design review checklist,Use this checklist to assess and verify the completeness of your design for performance efficiency.,"This checklist presents a set of recommendations for you to scale your system so it can grow and meet your workload usage demand. The goal of performance is to maintain the efficiency of every interaction with a healthy system as demand increases. When you design and implement for performance, focus on the efficiency and effectiveness of cost, complexity, supporting new requirements, technical debt, reporting, and toil. For every system, there's a limit to how much you can scale it without redesigni",2023-11-15T08:00:00.000Z,concept-article,checklists,0.9,True,"A design review checklist for Performance Efficiency. 
In the full article, items are organized by pillar/category with WAF-style IDs and concise actions, matching the checklists definition.",unchanged
-https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/collect-performance-data,PE:04 Metrics and logs,Architecture strategies for collecting performance data - Microsoft Azure Well-Architected Framework,Collect and use performance data for Azure workloads,Learn recommendations for collecting performance data (metrics and logs) to help you assess the performance of a workload.,"Applies to this Azure Well-Architected Framework Performance Efficiency checklist recommendation: Collecting performance data is the process of gathering metrics and logs that provide information about the performance of a workload. This data includes numerical values, which are known as metrics. Metrics describe the state of the system at a particular point in time. It also includes logs that contain different types of data organized into records. By collecting performance data, you can monitor ",2023-11-15T08:00:00.000Z,concept-article,recommendations,0.9,True,"Explicitly tied to a Performance Efficiency checklist recommendation and provides detailed guidance on collecting metrics and logs to assess performance, matching the recommendations definition.",unchanged
+https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/checklist,Checklist,Design review checklist for Performance Efficiency - Microsoft Azure Well-Architected Framework,Use Performance Efficiency design review checklist,Use this checklist to assess and verify the completeness of your design for performance efficiency.,"This checklist presents a set of recommendations for you to scale your system so it can grow and meet your workload usage demand. The goal of performance is to maintain the efficiency of every interaction with a healthy system as demand increases. 
When you design and implement for performance, focus on the efficiency and effectiveness of cost, complexity, supporting new requirements, technical debt, reporting, and toil. For every system, there's a limit to how much you can scale it without redesigni",2026-04-23T22:02:00.000Z,concept-article,checklists,0.9,True,"Explicitly described as a checklist to assess performance efficiency design. These WAF checklist pages contain numbered, pillar-specific, actionable items (PE:## style) that link to deeper recommendations, which qualifies as expert, structured review content.",updated
Workload performance often degra",2023-11-15T08:00:00.000Z,concept-article,recommendations,0.88,True,"Applies to a Performance Efficiency checklist recommendation and describes ongoing monitoring, analysis, and improvement steps, which are detailed implementation recommendations.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/design-patterns,Design patterns,Architecture design patterns that support performance efficiency - Microsoft Azure Well-Architected Framework,,Learn about industry patterns that support performance efficiency and can help you address common challenges in cloud workloads.,"When you design workload architectures, you should use industry patterns that address common challenges. Patterns can help you make intentional tradeoffs within workloads and optimize for your desired outcome. They can also help mitigate risks that originate from specific problems, which can impact reliability, security, cost, and operations. If not mitigated, risks will eventually lead to performance inefficiencies. These patterns are backed by real-world experience, are designed for cloud scal",2024-10-10T08:00:00.000Z,concept-article,,0.4,False,"Architecture design patterns for performance efficiency are largely conceptual and pattern-based; they are not tied to WAF checklist IDs or pillar-scoped recommendations, and thus don't fit the defined sub-skill types.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/maturity-model,Maturity model,Performance Efficiency Maturity Model - Microsoft Azure Well-Architected Framework,Assess Performance Efficiency maturity for Azure workloads,Understand the maturity model levels of the performance efficiency pillar.,"Performance Efficiency is about maintaining user experience even when there's an increase in load by managing capacity. 
The strategy includes scaling resources, identifying and optimizing potential bottlenecks, and optimizing for peak performance. This maturity model guides you through a strategic journey of performance optimization by scaling resources, identifying and optimizing potential bottlenecks, and optimizing for peak performance. You'll start by selecting the right components and estab",2025-07-14T21:04:00.000Z,concept-article,assessments,0.7,True,"A maturity model for the Performance Efficiency pillar that defines levels and progression. While not a question list, it functions as an assessment framework with maturity indicators and interpretation guidance, closest to the assessments category.",unchanged +https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/monitoring,PE:04 Performance monitoring,Architecture strategies for monitoring workload performance - Microsoft Azure Well-Architected Framework,Implement monitoring strategies for workload performance,Learn recommendations for collecting performance data (metrics and logs) to help you assess the performance of a workload.,"Applies to this Azure Well-Architected Framework Performance Efficiency checklist recommendation: Without performance data, underlying issues and optimization opportunities go unnoticed, leading to degraded user experience. This article describes design strategies for implementing multi-layer performance measurement that captures latency, throughput, and resource behavior to establish baselines and identify performance degradation across the workload. The key strategies in this article build on ",2026-04-23T22:02:00.000Z,concept-article,recommendations,0.85,True,The article is tied to a specific Performance Efficiency checklist recommendation and provides design strategies and implementation guidance for multi-layer performance measurement. 
This matches the recommendations type: detailed how-to guidance mapped to a checklist item.,new https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/optimize-code-infrastructure,PE:07 Code and infrastructure,Architecture strategies for optimizing code and infrastructure - Microsoft Azure Well-Architected Framework,Optimize application code and infrastructure performance on Azure,Learn how to optimize your code and infrastructure by using components only for their core purpose and only when necessary.,"Applies to this Azure Well-Architected Framework Performance Efficiency checklist recommendation: This guide describes the recommendations for optimizing code and infrastructure performance. To optimize your code and infrastructure, you should use your components only for their core purpose and only when necessary. When you overuse code and infrastructure, it creates unnecessary resource consumption, bottlenecks, and slow responses. To compensate for those inefficiencies, you must add more resou",2023-11-15T08:00:00.000Z,concept-article,recommendations,0.9,True,"Explicitly a Performance Efficiency checklist-backed guide with detailed recommendations for optimizing code and infrastructure usage, which is implementation-focused guidance.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/optimize-data-performance,PE:08 Data performance,Architecture strategies for optimizing data performance - Microsoft Azure Well-Architected Framework,Optimize data performance in Azure workloads,"Learn how to optimize data access, retrieval, storage, and processing operations to enhance the overall performance of your workload.","Applies to this Azure Well-Architected Framework Performance Efficiency checklist recommendation: This guide describes the recommendations for optimizing data performance. Optimizing data performance is about refining the efficiency with which the workload processes and stores data. 
Every workload operation, transaction, or computation typically relies on the quick and accurate retrieval, processing, and storage of data. When data performance is optimized, the workload runs smoothly. Compromised",2023-11-15T08:00:00.000Z,concept-article,recommendations,0.9,True,"Tied to a Performance Efficiency checklist recommendation and provides detailed guidance on optimizing data access, storage, and processing, matching recommendations.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/optimize-operational-tasks,PE:10 Operational tasks,Architecture strategies for optimizing operational tasks - Microsoft Azure Well-Architected Framework,Optimize operational tasks to protect workload performance,Learn about Well-Architected Framework recommendations for optimizing operational tasks.,Applies to this Azure Well-Architected Framework Performance Efficiency checklist recommendation: This guide describes the recommendations for optimizing operational tasks. Optimizing operational tasks is the process of minimizing the effects of tasks that you perform as part of routine workload operations. Operational activities use the same compute resources as the workload itself. Failure to consider the effects of operational tasks can cause the workload to miss its performance targets. 
It c,2023-11-15T08:00:00.000Z,concept-article,recommendations,0.88,True,"Mapped to a Performance Efficiency checklist recommendation and provides detailed guidance on minimizing the performance impact of operational tasks, which is implementation-focused.",unchanged @@ -133,15 +133,15 @@ https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/ https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/select-services,PE:03 Selecting services,Architecture strategies for selecting the right services - Microsoft Azure Well-Architected Framework,Select Azure services to meet workload performance needs,Learn how to select the appropriate services that best meet the requirements and demands of your workload.,"Applies to this Azure Well-Architected Framework Performance Efficiency checklist recommendation: This guide describes the recommendations for selecting appropriate services for your workload. The following recommendations help you choose services that best meet the requirements and demands of your workload. When you use services that are designed to handle your workload's requirements, you can ensure that your workload meets your performance targets. 
If you choose inappropriate services for you",2023-11-15T08:00:00.000Z,concept-article,recommendations,0.88,True,"Described as recommendations for selecting appropriate services for a workload, tied to a Performance Efficiency checklist recommendation, with prescriptive selection guidance rather than high-level principles.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/tradeoffs,Tradeoffs,Performance Efficiency tradeoffs - Microsoft Azure Well-Architected Framework,Evaluate Performance Efficiency tradeoffs across WAF pillars,Learn about tradeoffs that you might encounter when you design workload architectures and operations for performance efficiency.,"A workload that meets its performance targets without overprovisioning is efficient. The goal of performance efficiency is to have just enough supply to handle demand at all times. Key strategies for performance efficiency include proper use of code optimizations, design patterns, capacity planning, and scaling. Clear performance targets and testing underpin this pillar. During the process of negotiating a workload's performance targets and designing a workload for performance efficiency, it's i",2024-10-10T08:00:00.000Z,concept-article,tradeoffs,0.9,True,"Explicitly a tradeoffs page discussing how performance efficiency decisions affect other pillars (reliability, cost, etc.) and how to balance them, matching the tradeoffs category.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/pillars,Overview,Microsoft Azure Well-Architected Framework - Microsoft Azure Well-Architected Framework,,"Learn about the five pillars of the Azure Well-Architected Framework and how they can produce a high- quality, stable, and efficient cloud architecture.",The Azure Well-Architected Framework pillars drive architectural excellence at the fundamental level of a workload. 
Use this matrix to familiarize yourself with the key concepts: Important The reference architectures available in the Azure Architecture Center are designed with the design principles in mind. The architecture articles describe a prescriptive path for applying the design principles and provide a holistic view on the Ten design principles for Azure applications.,2025-01-22T08:00:00.000Z,concept-article,,0.5,False,"Overview of the five pillars and links to other content; high-level matrix and references, but not detailed pillar principles or checklists itself.",unchanged -https://learn.microsoft.com/en-us/azure/well-architected/reliability/,Quick links,Reliability quick links - Microsoft Azure Well-Architected Framework,,Learn about resources that can help you apply reliability guidance to your architecture to make your workload resilient to malfunction.,Apply reliability guidance to your architecture to make your workload resilient to malfunction and to ensure that it returns to a fully functioning state after a failure occurs.,2025-09-09T17:02:00Z,landing-page,,0.2,False,"Reliability quick links page is navigational, pointing to other reliability resources; no substantive principles, checklists, or recommendations on-page.",unchanged -https://learn.microsoft.com/en-us/azure/well-architected/reliability/checklist,Checklist,Design review checklist for Reliability - Microsoft Azure Well-Architected Framework,Use reliability design review checklist for Azure,Use this checklist for Reliability to identify the best infrastructure and application design for your workload.,"This checklist presents a set of recommendations for you to use to evaluate the reliability, resiliency, and failure recovery strategies in your architecture design. To ensure reliability, identify the best infrastructure and application design for your workload. Make these decisions based on your business requirements that are mapped to availability and recoverability target metrics. 
To implement a reliable design, thoroughly consider decision points in your design and be aware of how those dec",2025-11-19T23:05:00.000Z,concept-article,checklists,0.9,True,Reliability checklist page presents actionable checklist items with WAF-specific IDs (RE:##) organized for architecture review and linking to detailed recommendations.,unchanged +https://learn.microsoft.com/en-us/azure/well-architected/reliability/,Quick links,Reliability quick links - Microsoft Azure Well-Architected Framework,,Learn about resources that can help you apply reliability guidance to your architecture to make your workload resilient to malfunction.,Apply reliability guidance to your architecture to make your workload resilient to malfunction and to ensure that it returns to a fully functioning state after a failure occurs.,2026-04-23T22:02:00Z,landing-page,,0.2,False,"Acts as a quick-links/landing page for reliability content, not a detailed principles, checklist, or implementation page. Lacks numbered checklist items, checklist IDs, or in-depth WAF-specific implementation guidance.",updated +https://learn.microsoft.com/en-us/azure/well-architected/reliability/checklist,Checklist,Design review checklist for Reliability - Microsoft Azure Well-Architected Framework,Use reliability design checklist for Azure workloads,Use this checklist for Reliability to identify the best infrastructure and application design for your workload.,"This checklist presents a set of recommendations for you to use to evaluate the reliability, resiliency, and failure recovery strategies in your architecture design. To ensure reliability, identify the best infrastructure and application design for your workload. Make these decisions based on your business requirements that are mapped to availability and recoverability target metrics. 
To implement a reliable design, thoroughly consider decision points in your design and be aware of how those dec",2026-04-23T22:02:00.000Z,concept-article,checklists,0.95,True,"Explicitly described as a design review checklist for the Reliability pillar. Contains a set of actionable checklist recommendations to evaluate reliability, resiliency, and recovery strategies. This matches the WAF checklist pattern (pillar-specific, actionable items) even though IDs like RE:01 are not visible in the summary.",updated https://learn.microsoft.com/en-us/azure/well-architected/reliability/design-patterns,Design patterns,Architecture design patterns that support reliability - Microsoft Azure Well-Architected Framework,,Learn about industry patterns that support reliability and can help you address common challenges in cloud workloads.,"When you design workload architectures, you should use industry patterns that address common challenges. Patterns can help you make intentional tradeoffs within workloads and optimize for your desired outcome. They can also help mitigate risks that originate from specific problems, which can impact security, performance, cost, and operations. If not mitigated, those risks will eventually cause reliability issues. 
These patterns are backed by real-world experience, are designed for cloud scale an",2024-10-10T08:00:00.000Z,concept-article,,0.5,False,Design patterns that support reliability are general architectural patterns; summary suggests cross-pillar impact but not clearly WAF-specific implementation or checklist-linked guidance.,unchanged https://learn.microsoft.com/en-us/azure/well-architected/reliability/disaster-recovery,RE:09 Disaster recovery,Architecture strategies for disaster recovery - Microsoft Azure Well-Architected Framework,Plan disaster recovery strategies for Azure workloads,Learn how to design a disaster recovery strategy for a workload.,"Applies to this Azure Well-Architected Framework Reliability checklist recommendation: Disasters are significant incidents that require careful planning and proactive preparation by workloads and operations teams. This article outlines the key strategies for disaster recovery, emphasizing resilient workload design, data integrity, and clearly defined objectives for failover and recovery. It focuses on the purpose and guiding principles of DR rather than step-by-step procedures. 
For detailed impl",2025-11-19T23:05:00.000Z,concept-article,recommendations,0.9,True,"Explicitly tied to a Reliability checklist recommendation and outlines DR strategy principles, objectives, and design considerations—implementation-level guidance even if not step-by-step.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/reliability/failure-mode-analysis,RE:03 Failure mode analysis,Architecture strategies for failure mode analysis - Microsoft Azure Well-Architected Framework,Perform failure mode analysis for Azure reliability,"Learn how to identify potential points of failure within your workload and the associated flows, and plan mitigation actions accordingly.","Applies to this Azure Well-Architected Framework Reliability checklist recommendation: This guide describes the best practices for performing failure mode analysis (FMA) for your workload. FMA is the practice of identifying potential points of failure within your workload and the associated flows and planning mitigation actions accordingly. 
At each step of the flow, you identify the blast radius of multiple failure types, which helps you design a new workload or refactor an existing workload to ",2026-01-21T23:02:00.000Z,concept-article,recommendations,0.95,True,"Applies to a Reliability checklist recommendation and provides detailed FMA process steps, mitigation planning, and design implications—implementation-focused guidance.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/reliability/identify-flows,RE:02 Critical flows,Architecture strategies for identifying and rating flows - Microsoft Azure Well-Architected Framework,Identify and prioritize workload flows for reliability,Learn how to create a catalog of user and system flows for your workload to better understand the basis for your design decisions as they relate to reliability.,Applies to this Azure Well-Architected Framework Reliability checklist recommendation: This guide describes the recommendations for identifying and prioritizing workload flows. Identifying and prioritizing workload flows involves mapping user flows and system flows to determine their criticality to the organization. This practice ensures you identify and prioritize the most critical workload functionality to reduce the risk of damaging failures. 
Failure to identify and prioritize workload flows ,2024-01-25T22:58:00.000Z,concept-article,recommendations,0.95,True,"Tied to a specific Reliability checklist recommendation and gives concrete steps for cataloging and rating flows, mapping to criticality, and using that for design decisions.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/reliability/maturity-model,Maturity model,Reliability Maturity Model - Microsoft Azure Well-Architected Framework,Assess reliability maturity for Azure workloads,Understand the maturity model levels of the reliability pillar.,The reliability journey is a step-by-step process where each stage builds on the previous one to ensure systems stay available and meet user expectations. This maturity model is intended to help you assess your current state and offer a structured path for improvement. The foundation begins by bootstrapping basic reliability capabilities offered by Azure by using built-in Azure reliability features like zone redundancy for immediate improvements without extensive optimization overhead. Counterin,2025-07-14T21:04:00.000Z,concept-article,assessments,0.7,True,"Reliability maturity model defines levels and a progressive path to improve reliability, used to assess current state and interpret improvement steps—fits assessment questions/maturity guidance.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/reliability/metrics,RE:04 Target metrics,Architecture strategies for defining reliability targets - Microsoft Azure Well-Architected Framework,Define availability and recovery targets for reliability,"Learn how to define reliability targets for your critical workloads, and discover key design strategies that prioritize operations.",Applies to this Azure Well-Architected Framework Reliability checklist recommendation: This guide describes the recommendations for defining availability and recovery target metrics for critical workloads and flows. 
You should derive reliability targets from workshop exercises with business stakeholders. Then refine those targets by monitoring and testing your workloads. Set realistic expectations with your internal stakeholders about workload reliability. Then they can use contractual agreement,2024-08-20T16:59:00.000Z,concept-article,recommendations,0.95,True,"Directly linked to a Reliability checklist recommendation and describes how to derive, refine, and use reliability targets with stakeholders and monitoring—specific implementation guidance.",unchanged -https://learn.microsoft.com/en-us/azure/well-architected/reliability/monitoring-alerting-strategy,RE:10 Monitoring and alerting,Architecture strategies for designing a reliable monitoring and alerting strategy - Microsoft Azure Well-Architected Framework,Design reliable monitoring and alerting for workloads,Learn how to design a reliable monitoring and alerting strategy to ensure that your workload operates reliably and operations teams are aware of changes.,"Applies to this Azure Well-Architected Framework Reliability checklist recommendation: This guide describes the recommendations for designing a reliable monitoring and alerting strategy. Implement this strategy to keep your operations teams informed of your environment's health status and ensure that you meet the established reliability targets for your workload. 
Definitions Before you create a monitoring and alerting strategy, perform the following tasks for your workload as part of your reliab",2025-09-25T22:06:00.000Z,concept-article,recommendations,0.95,True,Applies to a Reliability checklist recommendation and provides detailed steps and considerations for monitoring and alerting strategies aligned with reliability targets.,unchanged +https://learn.microsoft.com/en-us/azure/well-architected/reliability/monitoring,RE:10 Monitoring,Architecture strategies for monitoring workload reliability - Microsoft Azure Well-Architected Framework,Implement monitoring strategies for workload reliability,Learn how to design a reliable monitoring and alerting strategy to ensure that your workload operates reliably and operations teams are aware of changes.,"Applies to this Azure Well-Architected Framework Reliability checklist recommendation: Reliability monitoring is the practice of measuring how well a system meets its business requirements over time, with respect to resiliency and recoverability. A well-architected monitoring system provides real-time view and trends of system behavior by establishing visibility across platform, infrastructure, and workload layers. By correlating these signals across components and over time, monitoring enables",2026-04-23T22:02:00.000Z,concept-article,recommendations,0.9,True,"The page is framed as architecture strategies for monitoring workload reliability and explicitly states it applies to a specific Reliability checklist recommendation. 
It provides detailed implementation guidance for that checklist item (how to design monitoring and alerting), which aligns with the recommendations sub-skill type.",new https://learn.microsoft.com/en-us/azure/well-architected/reliability/principles,Design principles,Reliability design principles - Microsoft Azure Well-Architected Framework,Apply reliability design principles to Azure workloads,Understand the design principles of the reliability pillar.,"Outages and malfunctions are serious concerns for all workloads. A reliable workload must survive those events and continue to consistently provide its intended functionality. It must be resilient so that it can detect and withstand faults while continuing to operate. It must be recoverable so that, if a disruption exceeds resiliency measures, the workload can be restored within agreed recovery targets. It must also be available so that users can access the workload during the promised time period at t",2025-09-30T22:08:00.000Z,concept-article,design-principles,0.9,True,"Reliability pillar page explicitly defines reliability-specific principles (resilient, recoverable, available), explains why they matter, and ties them to high-level recommendations for reliability.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/reliability/redundancy,RE:05 Redundancy,Architecture Strategies for Designing for Redundancy - Microsoft Azure Well-Architected Framework,Design redundancy strategies to meet reliability targets,Learn how to design for redundancy in your workload to meet your reliability targets and to keep problems contained to a single resource.,"Applies to this Azure Well-Architected Framework Reliability checklist recommendation: This guide describes the recommendations for adding redundancy throughout critical flows at different workload layers, which optimizes resiliency. 
Meet the requirements of your defined reliability targets by applying the proper levels of redundancy to your compute, data, networking, and other infrastructure tiers. Apply this redundancy to give your workload a strong, reliable foundation to build on. When you b",2025-09-09T17:02:00.000Z,concept-article,recommendations,0.95,True,"Applies to a Reliability checklist recommendation and details how to add redundancy across compute, data, networking, and other tiers to meet defined reliability targets.",unchanged https://learn.microsoft.com/en-us/azure/well-architected/reliability/scaling,RE:06 Scaling,Architecture strategies for designing a reliable scaling strategy - Microsoft Azure Well-Architected Framework,Design reliable scaling strategies for Azure workloads,"Learn about recommendations for designing a reliable scaling strategy, including Azure facilitation and tradeoff considerations.","Applies to this Azure Well-Architected Framework Reliability checklist recommendation: This guide describes the recommendations for designing a reliable scaling strategy. Definitions Note Scaling operations can be static (regularly scheduled daily scaling to accommodate normal load patterns), automatic (an automated process in response to conditions being met), or manual (an operator performs a one-time scaling operation in reaction to an unanticipated load change). 
Both vertical and horizontal ",2025-05-29T22:02:00.000Z,concept-article,recommendations,0.95,True,"Explicitly tied to a Reliability checklist recommendation and provides concrete guidance on static/automatic/manual scaling, vertical vs horizontal scaling, and tradeoffs.",unchanged diff --git a/products/azure-well-architected/report.md b/products/azure-well-architected/report.md index 7bdda241..0dbd706d 100644 --- a/products/azure-well-architected/report.md +++ b/products/azure-well-architected/report.md @@ -1,5 +1,5 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: workload-patterns: Designing, operating, and optimizing specialized Azure workloads (AI, HPC, mission‑critical, SaaS, SAP, sustainability) using Well‑Architected @@ -13,23 +13,23 @@ category_descriptions: service-guides: End-to-end design, security, networking, operations, monitoring, and optimization guidance for specific Azure services, aligned to Well-Architected Framework best practices. - recommendations: Guidance and best practices to design, operate, and improve Azure - workloads across cost, operations, performance, reliability, security, and sustainability - (FinOps, DevOps, DR, CI/CD, IAM, monitoring). + recommendations: Guidance on cost optimization, reliability, performance, security, + operations, DevOps, and sustainability best practices for architecting and running + Azure workloads. checklists: Checklists to review Azure workloads for cost, operations, performance, - reliability, and security, with concrete questions and best practices for each - architecture pillar. + reliability, and security best practices, with concrete questions and criteria + for each architecture. 
tradeoffs: Guidance on balancing cost, reliability, performance, security, and operations in Azure designs, including region/AZ choices and cross-pillar tradeoff analysis for architecture decisions skill_description: Expert guidance for designing, assessing, and optimizing Azure workloads using Azure Well Architected. Covers design review checklists, recommendations, design principles, tradeoffs, service guides, workload patterns, and assessment - questions. Use when designing AI, SAP, SaaS, HPC, or mission‑critical workloads, - or optimizing cost, security, reliability, and ops, and other Azure Well Architected + questions. Use when designing AI, SAP, SaaS, HPC, or mission‑critical workloads + with WAF assessments, guides, checklists, and tradeoffs, and other Azure Well Architected related development tasks. -use_when: Use when designing AI, SAP, SaaS, HPC, or mission‑critical workloads, or - optimizing cost, security, reliability, and ops, and other Azure Well Architected +use_when: Use when designing AI, SAP, SaaS, HPC, or mission‑critical workloads with + WAF assessments, guides, checklists, and tradeoffs, and other Azure Well Architected related development tasks. 
--- # Azure Well Architected Crawl Report @@ -43,10 +43,10 @@ use_when: Use when designing AI, SAP, SaaS, HPC, or mission‑critical workloads - **Unclassified**: 30 ### Incremental Update -- **New Pages**: 0 +- **New Pages**: 2 - **Updated Pages**: 4 -- **Unchanged**: 234 -- **Deleted Pages**: 0 +- **Unchanged**: 232 +- **Deleted Pages**: 2 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/azure-well-architected/azure-well-architected.csv` ## Classification Statistics @@ -64,21 +64,32 @@ use_when: Use when designing AI, SAP, SaaS, HPC, or mission‑critical workloads ## Changes +### New Pages + +- [RE:10 Monitoring](https://learn.microsoft.com/en-us/azure/well-architected/reliability/monitoring) +- [PE:04 Performance monitoring](https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/monitoring) + ### Updated Pages -- [CO:09 Flow costs](https://learn.microsoft.com/en-us/azure/well-architected/cost-optimization/optimize-flow-costs) - - Updated: 2023-12-20T18:03:00.000Z → 2026-04-15T17:03:00.000Z -- [Design patterns](https://learn.microsoft.com/en-us/azure/well-architected/operational-excellence/design-patterns) - - Updated: 2025-11-08T18:02:00.000Z → 2026-04-16T17:04:00.000Z -- [Maintain an architecture decision record](https://learn.microsoft.com/en-us/azure/well-architected/architect-role/architecture-decision-record) - - Updated: 2024-10-10T22:00:00.000Z → 2026-04-13T17:09:00.000Z -- [Collaborate with implementors](https://learn.microsoft.com/en-us/azure/well-architected/architect-role/collaboration) - - Updated: 2025-11-24T23:02:00.000Z → 2026-04-15T17:03:00.000Z +- [Quick links](https://learn.microsoft.com/en-us/azure/well-architected/reliability/) + - Updated: 2025-09-09T17:02:00Z → 2026-04-23T22:02:00Z +- [Checklist](https://learn.microsoft.com/en-us/azure/well-architected/reliability/checklist) + - Updated: 2025-11-19T23:05:00.000Z → 2026-04-23T22:02:00.000Z +- [Quick 
links](https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/) + - Updated: 2025-08-01T17:04:00Z → 2026-04-23T22:02:00Z +- [Checklist](https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/checklist) + - Updated: 2023-11-15T08:00:00.000Z → 2026-04-23T22:02:00.000Z + +### Deleted Pages + +- ~~PE:04 Metrics and logs~~ (https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/collect-performance-data) +- ~~RE:10 Monitoring and alerting~~ (https://learn.microsoft.com/en-us/azure/well-architected/reliability/monitoring-alerting-strategy) ## Classified Pages | TOC Title | Type | Confidence | Reason | |-----------|------|------------|--------| +| [Checklist](https://learn.microsoft.com/en-us/azure/well-architected/reliability/checklist) | checklists | 0.95 | Explicitly described as a design review checklist for the Reliability pillar. Contains a set of actionable checklist recommendations to evaluate reliability, resiliency, and recovery strategies. This matches the WAF checklist pattern (pillar-specific, actionable items) even though IDs like RE:01 are not visible in the summary. | | [RE:01 Simplicity and efficiency](https://learn.microsoft.com/en-us/azure/well-architected/reliability/simplify) | recommendations | 0.95 | Explicitly states it applies to a Reliability checklist recommendation and provides detailed implementation guidance on minimizing complexity and using platform services—classic checklist-backed recommendation. | | [RE:02 Critical flows](https://learn.microsoft.com/en-us/azure/well-architected/reliability/identify-flows) | recommendations | 0.95 | Tied to a specific Reliability checklist recommendation and gives concrete steps for cataloging and rating flows, mapping to criticality, and using that for design decisions. 
| | [RE:03 Failure mode analysis](https://learn.microsoft.com/en-us/azure/well-architected/reliability/failure-mode-analysis) | recommendations | 0.95 | Applies to a Reliability checklist recommendation and provides detailed FMA process steps, mitigation planning, and design implications—implementation-focused guidance. | @@ -87,7 +98,6 @@ use_when: Use when designing AI, SAP, SaaS, HPC, or mission‑critical workloads | [RE:06 Scaling](https://learn.microsoft.com/en-us/azure/well-architected/reliability/scaling) | recommendations | 0.95 | Explicitly tied to a Reliability checklist recommendation and provides concrete guidance on static/automatic/manual scaling, vertical vs horizontal scaling, and tradeoffs. | | [RE:07 Self-preservation](https://learn.microsoft.com/en-us/azure/well-architected/reliability/self-preservation) | recommendations | 0.95 | Linked to a Reliability checklist recommendation and gives detailed implementation strategies for self-preservation and self-healing capabilities to improve reliability. | | [RE:08 Testing](https://learn.microsoft.com/en-us/azure/well-architected/reliability/testing-strategy) | recommendations | 0.95 | Applies to a Reliability checklist recommendation and offers specific guidance on reliability testing, including chaos engineering and fault injection, to validate resiliency. | -| [RE:10 Monitoring and alerting](https://learn.microsoft.com/en-us/azure/well-architected/reliability/monitoring-alerting-strategy) | recommendations | 0.95 | Applies to a Reliability checklist recommendation and provides detailed steps and considerations for monitoring and alerting strategies aligned with reliability targets. | | [SE:01 Security baseline](https://learn.microsoft.com/en-us/azure/well-architected/security/establish-baseline) | recommendations | 0.95 | Explicitly applies to a Security checklist recommendation and provides detailed guidance on defining, publishing, and enforcing security baselines across the organization. 
| | [SE:03 Data classification](https://learn.microsoft.com/en-us/azure/well-architected/security/data-classification) | recommendations | 0.95 | Linked to a Security checklist recommendation and describes how to categorize data by sensitivity/compliance and apply appropriate protections—specific implementation guidance. | | [SE:04 Segmentation](https://learn.microsoft.com/en-us/azure/well-architected/security/segmentation) | recommendations | 0.95 | Applies to a Security checklist recommendation and gives detailed guidance on defining segments, perimeters, and isolation boundaries as part of a unified segmentation strategy. | @@ -132,8 +142,7 @@ use_when: Use when designing AI, SAP, SaaS, HPC, or mission‑critical workloads | [CO:12 Scaling costs](https://learn.microsoft.com/en-us/azure/well-architected/cost-optimization/optimize-scaling-costs) | recommendations | 0.90 | Supports a Cost Optimization checklist recommendation and provides concrete strategies for removing scaling inefficiencies while meeting NFRs. Implementation-focused guidance. | | [Checklist](https://learn.microsoft.com/en-us/azure/well-architected/cost-optimization/checklist) | checklists | 0.90 | A Cost Optimization design review checklist with concise, actionable items organized for architecture review and linked to recommendations. Matches the checklist pattern with WAF-specific IDs in full content. | | [Checklist](https://learn.microsoft.com/en-us/azure/well-architected/operational-excellence/checklist) | checklists | 0.90 | Operational Excellence design review checklist with concise, actionable items organized under this pillar and tied to recommendations. These are WAF-specific checklist items (likely with IDs) used as quick-reference for architecture reviews. | -| [Checklist](https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/checklist) | checklists | 0.90 | A design review checklist for Performance Efficiency. 
In the full article, items are organized by pillar/category with WAF-style IDs and concise actions, matching the checklists definition. | -| [Checklist](https://learn.microsoft.com/en-us/azure/well-architected/reliability/checklist) | checklists | 0.90 | Reliability checklist page presents actionable checklist items with WAF-specific IDs (RE:##) organized for architecture review and linking to detailed recommendations. | +| [Checklist](https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/checklist) | checklists | 0.90 | Explicitly described as a checklist to assess performance efficiency design. These WAF checklist pages contain numbered, pillar-specific, actionable items (PE:## style) that link to deeper recommendations, which qualifies as expert, structured review content. | | [Checklist](https://learn.microsoft.com/en-us/azure/well-architected/security/checklist) | checklists | 0.90 | Security design review checklist page contains a structured set of actionable checklist items aligned to the Security pillar and the Zero Trust model, typically using WAF checklist IDs (SE:##). This is expert, WAF-specific checklist content rather than general guidance. | | [Design principles](https://learn.microsoft.com/en-us/azure/well-architected/cost-optimization/principles) | design-principles | 0.90 | Defines named cost-optimization principles, explains why they matter (ROI, constraints, tactical vs strategic), and ties them to the Cost pillar. This matches WAF-specific design principles with rationale. | | [Design principles](https://learn.microsoft.com/en-us/azure/well-architected/operational-excellence/principles) | design-principles | 0.90 | Defines pillar-specific principles (DevOps, standardized workflows, observability, release management), explains why they matter, and ties them to operational outcomes. Matches design-principles criteria. 
| @@ -151,12 +160,12 @@ use_when: Use when designing AI, SAP, SaaS, HPC, or mission‑critical workloads | [OE:10 Automation design](https://learn.microsoft.com/en-us/azure/well-architected/operational-excellence/enable-automation) | recommendations | 0.90 | Described as applying to an Operational Excellence checklist recommendation and provides detailed strategies for enabling and implementing automation (provisioning, scaling, deployments, maintenance), which are implementation-focused recommendations tied to checklist items. | | [PE:01 Performance targets](https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/performance-targets) | recommendations | 0.90 | Explicitly applies to a Performance Efficiency checklist recommendation and provides detailed guidance on identifying objectives, metrics, and targets and using them for continuous improvement, matching recommendations. | | [PE:02 Capacity planning](https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/capacity-planning) | recommendations | 0.90 | Mapped to a Performance Efficiency checklist recommendation and gives concrete implementation guidance for capacity planning (CPU, memory, storage, bandwidth) to meet performance targets, fitting recommendations. | -| [PE:04 Metrics and logs](https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/collect-performance-data) | recommendations | 0.90 | Explicitly tied to a Performance Efficiency checklist recommendation and provides detailed guidance on collecting metrics and logs to assess performance, matching the recommendations definition. | | [PE:05 Scaling and partitioning](https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/scale-partition) | recommendations | 0.90 | Applies to a Performance Efficiency checklist recommendation and gives concrete best practices for scaling and partitioning workloads, which is detailed implementation guidance. 
| | [PE:06 Performance testing](https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/performance-test) | recommendations | 0.90 | Mapped to a Performance Efficiency checklist recommendation and includes how-to guidance on tools, environments, and testing steps, clearly fitting the recommendations category. | | [PE:07 Code and infrastructure](https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/optimize-code-infrastructure) | recommendations | 0.90 | Explicitly a Performance Efficiency checklist-backed guide with detailed recommendations for optimizing code and infrastructure usage, which is implementation-focused guidance. | | [PE:08 Data performance](https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/optimize-data-performance) | recommendations | 0.90 | Tied to a Performance Efficiency checklist recommendation and provides detailed guidance on optimizing data access, storage, and processing, matching recommendations. | | [RE:09 Disaster recovery](https://learn.microsoft.com/en-us/azure/well-architected/reliability/disaster-recovery) | recommendations | 0.90 | Explicitly tied to a Reliability checklist recommendation and outlines DR strategy principles, objectives, and design considerations—implementation-level guidance even if not step-by-step. | +| [RE:10 Monitoring](https://learn.microsoft.com/en-us/azure/well-architected/reliability/monitoring) | recommendations | 0.90 | The page is framed as architecture strategies for monitoring workload reliability and explicitly states it applies to a specific Reliability checklist recommendation. It provides detailed implementation guidance for that checklist item (how to design monitoring and alerting), which aligns with the recommendations sub-skill type. 
| | [SE:02 Secure development lifecycle](https://learn.microsoft.com/en-us/azure/well-architected/security/secure-development-lifecycle) | recommendations | 0.90 | Page explicitly states it applies to a specific Security checklist recommendation and provides detailed SDLC security practices and implementation guidance. It is the how-to behind a checklist item, matching the recommendations category. | | [SE:09 Application secrets](https://learn.microsoft.com/en-us/azure/well-architected/security/application-secrets) | recommendations | 0.90 | Maps to a Security checklist recommendation and gives detailed implementation guidance for creating, storing, and distributing secrets (keys, tokens, credentials). This is specific how-to guidance behind checklist items. | | [Tradeoffs](https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/tradeoffs) | tradeoffs | 0.90 | Explicitly a tradeoffs page discussing how performance efficiency decisions affect other pillars (reliability, cost, etc.) and how to balance them, matching the tradeoffs category. | @@ -188,6 +197,7 @@ use_when: Use when designing AI, SAP, SaaS, HPC, or mission‑critical workloads | [SE:12 Incident response](https://learn.microsoft.com/en-us/azure/well-architected/security/incident-response) | recommendations | 0.86 | Supports a Security checklist recommendation and provides detailed steps for incident identification, management, and mitigation. This is implementation guidance behind checklist items. | | [Design principles](https://learn.microsoft.com/en-us/azure/well-architected/ai/design-principles) | design-principles | 0.85 | Defines AI-specific design principles, explains why they matter, and ties them to WAF pillars and trade-offs. 
| | [Design principles](https://learn.microsoft.com/en-us/azure/well-architected/mission-critical/mission-critical-design-principles) | design-principles | 0.85 | Defines named mission-critical design principles, explains their rationale and trade-offs, and positions them as a compass across design areas, matching the design-principles definition. | +| [PE:04 Performance monitoring](https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/monitoring) | recommendations | 0.85 | The article is tied to a specific Performance Efficiency checklist recommendation and provides design strategies and implementation guidance for multi-layer performance measurement. This matches the recommendations type: detailed how-to guidance mapped to a checklist item. | | [Tradeoffs](https://learn.microsoft.com/en-us/azure/well-architected/reliability/tradeoffs) | tradeoffs | 0.85 | Analyzes how reliability objectives and resiliency targets interact with other pillars (cost, performance, operations), providing guidance on balancing competing concerns. | | [Tradeoffs](https://learn.microsoft.com/en-us/azure/well-architected/security/tradeoffs) | tradeoffs | 0.85 | Discusses how security controls and design principles interact with other pillars and design decisions, providing guidance on balancing security with other concerns. | | [WAF assessments and recommendations](https://learn.microsoft.com/en-us/azure/well-architected/design-guides/implementing-recommendations) | assessments | 0.85 | Describes a self-assessment with ~60 questions based on WAF pillar recommendations and explains how to use it with Azure Advisor. This matches the assessment pattern: evaluative questions organized by pillars and guidance on interpreting/using results, rather than prescriptive checklist items or implementation details. 
| @@ -316,9 +326,9 @@ use_when: Use when designing AI, SAP, SaaS, HPC, or mission‑critical workloads | [Overview](https://learn.microsoft.com/en-us/azure/well-architected/workloads) | 0.20 | Conceptual overview of what a workload is in the WAF context; not a checklist, recommendations, tradeoffs, service guide, or workload-pattern with domain-specific design. | | [Quick links](https://learn.microsoft.com/en-us/azure/well-architected/cost-optimization/) | 0.20 | Quick-links/navigation page for Cost Optimization resources; no detailed implementation guidance, principles list, or checklist IDs. | | [Quick links](https://learn.microsoft.com/en-us/azure/well-architected/operational-excellence/) | 0.20 | Quick links/overview page for the Operational Excellence pillar; primarily navigation and high-level guidance without checklist IDs, detailed implementation steps, or pillar-specific principles with rationale. | -| [Quick links](https://learn.microsoft.com/en-us/azure/well-architected/reliability/) | 0.20 | Reliability quick links page is navigational, pointing to other reliability resources; no substantive principles, checklists, or recommendations on-page. | +| [Quick links](https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/) | 0.20 | Quick links/landing page for Performance Efficiency resources; no evidence of detailed principles, checklist IDs, or implementation guidance on the page itself. | +| [Quick links](https://learn.microsoft.com/en-us/azure/well-architected/reliability/) | 0.20 | Acts as a quick-links/landing page for reliability content, not a detailed principles, checklist, or implementation page. Lacks numbered checklist items, checklist IDs, or in-depth WAF-specific implementation guidance. 
| | [Quick links](https://learn.microsoft.com/en-us/azure/well-architected/service-guides/) | 0.20 | This is a navigation/get-started page for service guidance mapped to the Well-Architected Framework, not the detailed service-specific guidance itself. It does not contain implementation details, checklists, or pillar-specific expert content. | | [What is the Well-Architected Framework](https://learn.microsoft.com/en-us/azure/well-architected/what-is-well-architected-framework) | 0.20 | High-level overview of the Azure Well-Architected Framework and its pillars; conceptual and introductory without pillar-specific named principles, checklist IDs, or detailed implementation guidance. | -| [Quick links](https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/) | 0.10 | A quick-links/navigation page for performance efficiency resources, not detailed guidance or checklists. | | [Quick links](https://learn.microsoft.com/en-us/azure/well-architected/security/) | 0.10 | Security quick links page is primarily navigation to other resources; it does not itself present checklists, recommendations, or detailed security design principles. | | [What's new](https://learn.microsoft.com/en-us/azure/well-architected/whats-new) | 0.10 | What's New page is a change log and navigation aid pointing to other content; it doesn't itself contain detailed principles, checklists, recommendations, tradeoffs, service guidance, workload patterns, or assessment questions. 
| diff --git a/products/caf-generate-summary.md b/products/caf-generate-summary.md index 7838aeaf..1c06e586 100644 --- a/products/caf-generate-summary.md +++ b/products/caf-generate-summary.md @@ -1,7 +1,7 @@ # Generation Summary -**Generated**: 2026-04-19 02:02:25 -**Total Duration**: 0m 24s +**Generated**: 2026-04-26 02:02:46 +**Total Duration**: 0m 19s ## Product Crawl Summary diff --git a/products/generate-summary.md b/products/generate-summary.md index 94198719..13b21351 100644 --- a/products/generate-summary.md +++ b/products/generate-summary.md @@ -1,7 +1,7 @@ # Generation Summary -**Generated**: 2026-04-19 02:03:28 -**Total Duration**: 40m 4s +**Generated**: 2026-04-26 02:03:45 +**Total Duration**: 38m 15s ## Product Crawl Summary @@ -9,224 +9,224 @@ Quick overview for reviewers. See individual product reports for details. | # | Product | Pages | Classified | New | Updated | Deleted | Status | |---|---------|-------|------------|-----|---------|---------|--------| -| 1 | Microsoft Foundry Tools | 51 | 31 | 0 | 6 | 0 | OK | +| 1 | Microsoft Foundry Tools | 51 | 30 | 0 | 5 | 0 | OK | | 2 | Azure AI Vision | 45 | 20 | 0 | 0 | 0 | OK | -| 3 | Azure Kubernetes Service Edge Essentials | 332 | 264 | 0 | 2 | 0 | OK | +| 3 | Azure Kubernetes Service Edge Essentials | 332 | 265 | 1 | 4 | 1 | OK | | 4 | Azure Artifact Signing | 14 | 9 | 0 | 0 | 0 | OK | -| 5 | Azure Advisor | 32 | 25 | 0 | 2 | 0 | OK | +| 5 | Azure Advisor | 32 | 25 | 0 | 0 | 0 | OK | | 6 | Microsoft Foundry Local | 20 | 13 | 0 | 0 | 0 | OK | | 7 | Azure Analysis Services | 1 | 1 | 0 | 0 | 0 | OK | | 8 | Azure AI Anomaly Detector | 24 | 7 | 0 | 0 | 0 | OK | -| 9 | Azure API Management | 275 | 225 | 0 | 7 | 0 | OK | +| 9 | Azure API Management | 275 | 225 | 1 | 6 | 1 | OK | | 10 | Azure App Configuration | 161 | 132 | 0 | 0 | 0 | OK | -| 11 | Azure App Service | 248 | 158 | 0 | 14 | 0 | OK | +| 11 | Azure App Service | 248 | 159 | 0 | 23 | 0 | OK | | 12 | Azure App Testing | 72 | 57 | 0 | 0 | 0 | OK | 
-| 13 | Azure Application Gateway | 174 | 125 | 0 | 0 | 0 | OK | -| 14 | Azure Arc | 423 | 271 | 8 | 14 | 2 | OK | -| 15 | Azure Artifacts | 73 | 60 | 0 | 0 | 0 | OK | +| 13 | Azure Application Gateway | 175 | 127 | 1 | 8 | 0 | OK | +| 14 | Azure Arc | 423 | 271 | 0 | 2 | 0 | OK | +| 15 | Azure Artifacts | 73 | 60 | 0 | 1 | 0 | OK | | 16 | Azure Attestation | 33 | 22 | 0 | 0 | 0 | OK | -| 17 | Azure Automation | 115 | 91 | 0 | 6 | 0 | OK | -| 18 | Azure Backup | 393 | 230 | 1 | 4 | 0 | OK | -| 19 | Azure Bastion | 40 | 24 | 0 | 0 | 1 | OK | -| 20 | Azure Batch | 113 | 82 | 0 | 1 | 1 | OK | -| 21 | Azure Blob Storage | 408 | 335 | 1 | 4 | 0 | OK | +| 17 | Azure Automation | 115 | 91 | 0 | 30 | 0 | OK | +| 18 | Azure Backup | 393 | 229 | 0 | 15 | 0 | OK | +| 19 | Azure Bastion | 40 | 24 | 0 | 0 | 0 | OK | +| 20 | Azure Batch | 113 | 82 | 0 | 3 | 0 | OK | +| 21 | Azure Blob Storage | 409 | 336 | 1 | 2 | 0 | OK | | 22 | Azure Blueprints | 40 | 34 | 0 | 0 | 0 | OK | -| 23 | Azure Boards | 126 | 34 | 0 | 1 | 0 | OK | +| 23 | Azure Boards | 126 | 34 | 0 | 7 | 0 | OK | | 24 | Azure AI Bot Service | 190 | 152 | 0 | 0 | 0 | OK | | 25 | Azure Cache for Redis | 63 | 57 | 0 | 0 | 0 | OK | -| 26 | Azure Cloud Services | 45 | 31 | 0 | 2 | 0 | OK | +| 26 | Azure Cloud Services | 45 | 31 | 0 | 0 | 0 | OK | | 27 | Azure Cloud Shell | 20 | 5 | 0 | 0 | 0 | OK | -| 28 | Azure AI Search | 299 | 226 | 0 | 14 | 0 | OK | +| 28 | Azure AI Search | 300 | 227 | 1 | 5 | 0 | OK | | 29 | Azure Communication Services | 517 | 415 | 0 | 0 | 0 | OK | -| 30 | Azure Container Apps | 209 | 131 | 1 | 18 | 1 | OK | +| 30 | Azure Container Apps | 211 | 134 | 2 | 2 | 0 | OK | | 31 | Azure Container Instances | 83 | 54 | 0 | 0 | 0 | OK | -| 32 | Azure Container Registry | 122 | 88 | 0 | 1 | 0 | OK | +| 32 | Azure Container Registry | 123 | 89 | 1 | 0 | 0 | OK | | 33 | Azure Container Storage | 33 | 13 | 0 | 0 | 0 | OK | -| 34 | Azure Cosmos DB | 987 | 772 | 4 | 8 | 0 | OK | -| 35 | Azure Cost Management | 
267 | 184 | 1 | 1 | 0 | OK | +| 34 | Azure Cosmos DB | 988 | 771 | 1 | 8 | 0 | OK | +| 35 | Azure Cost Management | 267 | 181 | 0 | 13 | 0 | OK | | 36 | Azure AI Custom Vision | 23 | 17 | 0 | 0 | 0 | OK | | 37 | Azure CycleCloud | 114 | 87 | 0 | 0 | 0 | OK | | 38 | Azure Data Box | 79 | 49 | 0 | 0 | 0 | OK | -| 39 | Azure Data Explorer | 197 | 106 | 0 | 7 | 0 | OK | -| 40 | Azure Data Factory | 505 | 425 | 0 | 0 | 0 | OK | +| 39 | Azure Data Explorer | 197 | 106 | 0 | 1 | 0 | OK | +| 40 | Azure Data Factory | 505 | 425 | 0 | 2 | 0 | OK | | 41 | Azure Data Science Virtual Machines | 25 | 18 | 0 | 0 | 0 | OK | | 42 | Azure Data Share | 25 | 16 | 0 | 0 | 0 | OK | | 43 | Azure Database for MariaDB | 0 | 0 | 0 | 0 | 0 | OK | | 44 | Azure Database Migration service | 31 | 15 | 0 | 0 | 0 | OK | -| 45 | Azure Database for MySQL | 179 | 115 | 0 | 1 | 0 | OK | -| 46 | Azure Database for PostgreSQL | 322 | 216 | 5 | 5 | 0 | OK | -| 47 | Azure Databricks | 4808 | 3218 | 459 | 513 | 41 | OK | +| 45 | Azure Database for MySQL | 179 | 115 | 0 | 2 | 0 | OK | +| 46 | Azure Database for PostgreSQL | 324 | 215 | 2 | 5 | 0 | OK | +| 47 | Azure Databricks | 4858 | 3254 | 54 | 397 | 4 | OK | | 48 | Azure DDos Protection | 33 | 25 | 0 | 0 | 0 | OK | | 49 | Azure Dedicated HSM | 16 | 10 | 0 | 0 | 0 | OK | -| 50 | Azure DevOps | 955 | 259 | 5 | 9 | 0 | OK | -| 51 | Azure Pipelines | 569 | 513 | 0 | 1 | 0 | OK | +| 50 | Azure DevOps | 953 | 256 | 1 | 13 | 3 | OK | +| 51 | Azure Pipelines | 570 | 513 | 1 | 14 | 0 | OK | | 52 | Azure DevTest Labs | 98 | 68 | 0 | 0 | 0 | OK | | 53 | Azure Digital Twins | 66 | 55 | 0 | 0 | 0 | OK | | 54 | Azure DNS | 75 | 29 | 0 | 0 | 0 | OK | -| 55 | Azure Elastic SAN | 22 | 21 | 0 | 0 | 0 | OK | +| 55 | Azure Elastic SAN | 23 | 22 | 1 | 2 | 0 | OK | | 56 | Azure Event Grid | 262 | 180 | 0 | 0 | 0 | OK | | 57 | Azure Event Hubs | 111 | 72 | 0 | 0 | 0 | OK | -| 58 | Azure ExpressRoute | 95 | 70 | 0 | 0 | 0 | OK | -| 59 | Azure Files | 126 | 107 | 0 | 1 | 0 | OK 
| -| 60 | Azure Firewall | 85 | 60 | 1 | 1 | 1 | OK | +| 58 | Azure ExpressRoute | 95 | 70 | 0 | 1 | 0 | OK | +| 59 | Azure Files | 126 | 106 | 0 | 16 | 0 | OK | +| 60 | Azure Firewall | 85 | 60 | 0 | 1 | 0 | OK | | 61 | Azure Firewall Manager | 27 | 11 | 0 | 0 | 0 | OK | -| 62 | Azure Front Door | 101 | 77 | 0 | 1 | 0 | OK | -| 63 | Azure Functions | 284 | 232 | 1 | 14 | 0 | OK | +| 62 | Azure Front Door | 102 | 77 | 1 | 0 | 0 | OK | +| 63 | Azure Functions | 284 | 232 | 0 | 3 | 0 | OK | | 64 | Azure HDInsight | 425 | 334 | 0 | 0 | 0 | OK | -| 65 | Azure Health Data Services | 202 | 151 | 0 | 5 | 0 | OK | -| 66 | Azure HPC Cache | 34 | 32 | 0 | 0 | 0 | OK | +| 65 | Azure Health Data Services | 202 | 151 | 0 | 1 | 0 | OK | +| 66 | Azure HPC Cache | 0 | 0 | 0 | 0 | 0 | OK | | 67 | Azure AI Immersive Reader | 18 | 15 | 0 | 0 | 0 | OK | | 68 | Azure Information Protection | 21 | 16 | 0 | 0 | 0 | OK | -| 69 | Azure IoT | 9 | 3 | 1 | 1 | 0 | OK | +| 69 | Azure IoT | 11 | 5 | 2 | 0 | 0 | OK | | 70 | Azure IoT Central | 89 | 57 | 0 | 0 | 0 | OK | -| 71 | Azure IoT Edge | 98 | 65 | 0 | 9 | 0 | OK | -| 72 | Azure IoT Hub | 204 | 136 | 12 | 4 | 3 | OK | -| 73 | Azure IoT Operations | 122 | 84 | 1 | 6 | 0 | OK | -| 74 | Azure Key Vault | 162 | 89 | 0 | 2 | 0 | OK | -| 75 | Azure Kubernetes Service (AKS) | 588 | 443 | 3 | 20 | 0 | OK | +| 71 | Azure IoT Edge | 98 | 65 | 0 | 1 | 0 | OK | +| 72 | Azure IoT Hub | 204 | 137 | 0 | 2 | 0 | OK | +| 73 | Azure IoT Operations | 122 | 83 | 0 | 3 | 0 | OK | +| 74 | Azure Key Vault | 162 | 89 | 0 | 3 | 0 | OK | +| 75 | Azure Kubernetes Service (AKS) | 593 | 444 | 55 | 12 | 50 | OK | | 76 | Azure Lab Services | 104 | 79 | 0 | 0 | 0 | OK | | 77 | Azure Lighthouse | 28 | 21 | 0 | 0 | 0 | OK | -| 78 | Azure Load Balancer | 87 | 41 | 1 | 3 | 8 | OK | -| 79 | Azure Local | 321 | 216 | 1 | 4 | 0 | OK | -| 80 | Azure Logic Apps | 229 | 187 | 0 | 1 | 0 | OK | -| 81 | Azure Machine Learning | 624 | 482 | 0 | 5 | 0 | OK | +| 78 | Azure Load Balancer 
| 87 | 41 | 0 | 0 | 0 | OK | +| 79 | Azure Local | 350 | 257 | 350 | 0 | 321 | OK | +| 80 | Azure Logic Apps | 229 | 187 | 0 | 0 | 0 | OK | +| 81 | Azure Machine Learning | 627 | 483 | 3 | 24 | 0 | OK | | 82 | Azure Managed Applications | 62 | 53 | 0 | 0 | 0 | OK | -| 83 | Azure Managed Grafana | 43 | 34 | 0 | 1 | 0 | OK | -| 84 | Azure Managed Lustre | 30 | 28 | 0 | 0 | 0 | OK | -| 85 | Azure Managed Redis | 68 | 56 | 2 | 2 | 0 | OK | +| 83 | Azure Managed Grafana | 44 | 35 | 1 | 3 | 0 | OK | +| 84 | Azure Managed Lustre | 30 | 28 | 0 | 1 | 0 | OK | +| 85 | Azure Managed Redis | 68 | 56 | 0 | 0 | 0 | OK | | 86 | Azure Maps | 147 | 115 | 0 | 0 | 0 | OK | | 87 | Azure AI Metrics Advisor | 19 | 6 | 0 | 0 | 0 | OK | -| 88 | Azure Migrate | 206 | 115 | 0 | 6 | 0 | OK | -| 89 | Azure Monitor | 2371 | 1839 | 0 | 12 | 0 | OK | -| 90 | Azure NAT Gateway | 24 | 19 | 0 | 4 | 0 | OK | -| 91 | Azure NetApp Files | 227 | 153 | 0 | 6 | 0 | OK | -| 92 | Azure Network Watcher | 64 | 29 | 0 | 0 | 0 | OK | +| 88 | Azure Migrate | 206 | 115 | 0 | 2 | 0 | OK | +| 89 | Azure Monitor | 2373 | 1842 | 8 | 11 | 6 | OK | +| 90 | Azure NAT Gateway | 24 | 19 | 0 | 1 | 0 | OK | +| 91 | Azure NetApp Files | 227 | 153 | 0 | 4 | 0 | OK | +| 92 | Azure Network Watcher | 64 | 28 | 0 | 1 | 0 | OK | | 93 | Azure Notification Hubs | 67 | 53 | 0 | 0 | 0 | OK | | 94 | Azure Open Datasets | 44 | 1 | 0 | 0 | 0 | OK | | 95 | Azure AI Personalizer | 35 | 9 | 0 | 0 | 0 | OK | -| 96 | Azure Policy | 158 | 96 | 3 | 1 | 1 | OK | +| 96 | Azure Policy | 158 | 98 | 0 | 23 | 0 | OK | | 97 | Azure Portal | 30 | 11 | 0 | 0 | 0 | OK | | 98 | Azure Private Link | 48 | 18 | 0 | 1 | 0 | OK | -| 99 | Azure Quantum | 135 | 45 | 0 | 6 | 1 | OK | +| 99 | Azure Quantum | 135 | 44 | 2 | 3 | 2 | OK | | 100 | Azure Queue Storage | 24 | 23 | 0 | 0 | 0 | OK | | 101 | Azure Role-based access control | 104 | 96 | 0 | 0 | 0 | OK | -| 102 | Azure Red Hat OpenShift | 66 | 53 | 0 | 0 | 0 | OK | +| 102 | Azure Red Hat OpenShift | 66 | 52 
| 0 | 2 | 0 | OK | | 103 | Azure Relay | 27 | 11 | 0 | 0 | 0 | OK | -| 104 | Azure Repos | 206 | 132 | 0 | 2 | 0 | OK | +| 104 | Azure Repos | 206 | 132 | 0 | 1 | 0 | OK | | 105 | Azure Resource Graph | 33 | 22 | 0 | 0 | 0 | OK | -| 106 | Azure Resource Manager | 464 | 357 | 0 | 4 | 0 | OK | -| 107 | SAP HANA on Azure Large Instances | 206 | 172 | 0 | 20 | 0 | OK | -| 108 | Azure Service Bus | 123 | 89 | 0 | 0 | 0 | OK | +| 106 | Azure Resource Manager | 464 | 357 | 0 | 1 | 0 | OK | +| 107 | SAP HANA on Azure Large Instances | 206 | 169 | 0 | 12 | 0 | OK | +| 108 | Azure Service Bus | 123 | 89 | 0 | 9 | 0 | OK | | 109 | Azure Service Fabric | 404 | 330 | 0 | 0 | 0 | OK | -| 110 | Azure Service Health | 47 | 19 | 0 | 0 | 0 | OK | +| 110 | Azure Service Health | 47 | 19 | 0 | 3 | 0 | OK | | 111 | Azure SignalR Service | 73 | 60 | 0 | 0 | 0 | OK | -| 112 | Azure Site Recovery | 202 | 133 | 0 | 6 | 0 | OK | +| 112 | Azure Site Recovery | 202 | 133 | 0 | 2 | 0 | OK | | 113 | Azure US Government | 40 | 31 | 0 | 0 | 0 | OK | -| 114 | Azure AI Speech | 184 | 109 | 1 | 1 | 0 | OK | +| 114 | Azure AI Speech | 184 | 109 | 0 | 0 | 0 | OK | | 115 | Azure Spring Apps | 174 | 141 | 0 | 0 | 0 | OK | -| 116 | Azure SQL Database | 362 | 239 | 0 | 4 | 0 | OK | -| 117 | Azure SQL Managed Instance | 241 | 186 | 2 | 5 | 1 | OK | -| 118 | SQL Server on Azure Virtual Machines | 128 | 98 | 0 | 2 | 0 | OK | +| 116 | Azure SQL Database | 362 | 239 | 0 | 2 | 0 | OK | +| 117 | Azure SQL Managed Instance | 241 | 186 | 0 | 1 | 0 | OK | +| 118 | SQL Server on Azure Virtual Machines | 128 | 98 | 0 | 1 | 0 | OK | | 119 | Azure Stack Edge | 237 | 144 | 0 | 0 | 0 | OK | | 120 | Azure Stream Analytics | 167 | 113 | 0 | 0 | 0 | OK | -| 121 | Azure Synapse Analytics | 446 | 260 | 0 | 0 | 0 | OK | +| 121 | Azure Synapse Analytics | 447 | 261 | 1 | 0 | 0 | OK | | 122 | Azure Table Storage | 15 | 13 | 0 | 0 | 0 | OK | -| 123 | Azure Test Plans | 33 | 3 | 10 | 1 | 8 | OK | +| 123 | Azure Test Plans | 33 | 4 
| 0 | 2 | 0 | OK | | 124 | Azure Traffic Manager | 44 | 28 | 0 | 0 | 0 | OK | | 125 | Azure Translator | 125 | 75 | 0 | 0 | 0 | OK | | 126 | Azure AI Video Indexer | 79 | 56 | 0 | 0 | 0 | OK | -| 127 | Azure Virtual Desktop | 150 | 121 | 1 | 2 | 0 | OK | -| 128 | Azure Virtual Machines | 804 | 590 | 0 | 12 | 0 | OK | -| 129 | Azure Virtual Network | 129 | 55 | 0 | 0 | 0 | OK | +| 127 | Azure Virtual Desktop | 150 | 120 | 0 | 2 | 0 | OK | +| 128 | Azure Virtual Machines | 804 | 589 | 0 | 12 | 0 | OK | +| 129 | Azure Virtual Network | 129 | 55 | 0 | 1 | 0 | OK | | 130 | Azure Virtual WAN | 130 | 91 | 0 | 0 | 0 | OK | -| 131 | Azure Virtual Machine Scale Sets | 93 | 83 | 0 | 0 | 0 | OK | -| 132 | Azure VMware Solution | 135 | 89 | 0 | 1 | 0 | OK | +| 131 | Azure Virtual Machine Scale Sets | 93 | 83 | 0 | 1 | 0 | OK | +| 132 | Azure VMware Solution | 135 | 89 | 0 | 4 | 0 | OK | | 133 | Azure VPN Gateway | 123 | 102 | 0 | 1 | 0 | OK | -| 134 | Azure Web Application Firewall | 80 | 68 | 1 | 0 | 1 | OK | +| 134 | Azure Web Application Firewall | 80 | 68 | 0 | 0 | 0 | OK | | 135 | Azure Web PubSub | 112 | 81 | 0 | 0 | 0 | OK | -| 136 | Chaos Studio | 51 | 31 | 0 | 0 | 0 | OK | +| 136 | Chaos Studio | 51 | 31 | 0 | 1 | 0 | OK | | 137 | Azure AI Content Safety | 34 | 15 | 0 | 0 | 0 | OK | | 138 | Azure Data Manager for Agriculture | 26 | 18 | 0 | 0 | 0 | OK | | 139 | Azure AI Document Intelligence | 77 | 36 | 0 | 0 | 0 | OK | -| 140 | Azure AI Language | 197 | 104 | 0 | 0 | 0 | OK | -| 141 | Microsoft Foundry | 275 | 202 | 5 | 31 | 6 | OK | -| 142 | Microsoft Planetary Computer Pro | 45 | 32 | 0 | 0 | 0 | OK | -| 143 | Playwright Workspaces | 22 | 17 | 0 | 0 | 0 | OK | +| 140 | Azure AI Language | 199 | 107 | 6 | 7 | 4 | OK | +| 141 | Microsoft Foundry | 287 | 205 | 19 | 32 | 7 | OK | +| 142 | Microsoft Planetary Computer Pro | 46 | 33 | 1 | 0 | 0 | OK | +| 143 | Playwright Workspaces | 0 | 0 | 0 | 0 | 0 | OK | | 144 | Azure Route Server | 21 | 13 | 0 | 0 | 0 | OK | | 145 | 
Azure Static Web Apps | 79 | 57 | 0 | 0 | 0 | OK | | 146 | Azure Update Manager | 84 | 55 | 0 | 0 | 0 | OK | | 147 | Azure Virtual Network Manager | 52 | 22 | 0 | 0 | 0 | OK | | 148 | Azure Active Directory B2C | 289 | 257 | 0 | 0 | 0 | OK | -| 149 | Azure Api Center | 36 | 19 | 1 | 0 | 0 | OK | +| 149 | Azure Api Center | 36 | 19 | 0 | 0 | 0 | OK | | 150 | Azure Fluid Relay | 25 | 16 | 0 | 0 | 0 | OK | -| 151 | Azure Impact Reporting | 15 | 10 | 1 | 1 | 0 | OK | +| 151 | Azure Impact Reporting | 15 | 10 | 0 | 0 | 0 | OK | | 152 | Azure Large Instances | 10 | 3 | 0 | 0 | 0 | OK | -| 153 | Azure Baremetal Infrastructure | 8 | 2 | 0 | 1 | 0 | OK | +| 153 | Azure Baremetal Infrastructure | 8 | 2 | 0 | 0 | 0 | OK | | 154 | Azure Business Process Tracking | 6 | 1 | 0 | 0 | 0 | OK | | 155 | Azure Carbon Optimization | 11 | 4 | 0 | 0 | 0 | OK | | 156 | Azure Cloud Hsm | 20 | 13 | 0 | 0 | 0 | OK | | 157 | Azure Confidential Computing | 70 | 51 | 0 | 0 | 0 | OK | | 158 | Azure Confidential Ledger | 30 | 19 | 0 | 0 | 0 | OK | -| 159 | Azure Copilot | 39 | 19 | 0 | 1 | 0 | OK | +| 159 | Azure Copilot | 39 | 19 | 0 | 0 | 0 | OK | | 160 | Azure Data Api Builder | 125 | 85 | 0 | 0 | 0 | OK | -| 161 | Azure Defender For Cloud | 473 | 254 | 5 | 12 | 5 | OK | -| 162 | Azure Defender For Iot | 180 | 116 | 0 | 0 | 0 | OK | -| 163 | Azure Deployment Environments | 32 | 14 | 0 | 0 | 0 | OK | +| 161 | Azure Defender For Cloud | 473 | 255 | 0 | 27 | 0 | OK | +| 162 | Azure Defender For Iot | 180 | 116 | 0 | 1 | 0 | OK | +| 163 | Azure Deployment Environments | 32 | 15 | 0 | 4 | 0 | OK | | 164 | Azure Dev Box | 67 | 47 | 0 | 0 | 0 | OK | | 165 | Azure Education Hub | 11 | 3 | 0 | 0 | 0 | OK | -| 166 | Azure Energy Data Services | 52 | 34 | 0 | 1 | 0 | OK | -| 167 | Azure Extended Zones | 18 | 6 | 1 | 0 | 0 | OK | -| 168 | Azure External Attack Surface Management | 22 | 12 | 0 | 0 | 0 | OK | -| 169 | Azure Firmware Analysis | 16 | 13 | 1 | 1 | 0 | OK | -| 170 | Microsoft Foundry Classic | 
373 | 289 | 0 | 22 | 1 | OK | +| 166 | Azure Energy Data Services | 52 | 34 | 0 | 0 | 0 | OK | +| 167 | Azure Extended Zones | 18 | 6 | 0 | 0 | 0 | OK | +| 168 | Azure External Attack Surface Management | 22 | 0 | 0 | 22 | 0 | OK | +| 169 | Azure Firmware Analysis | 16 | 13 | 0 | 0 | 0 | OK | +| 170 | Microsoft Foundry Classic | 376 | 296 | 8 | 24 | 5 | OK | | 171 | Azure Health Bot | 78 | 46 | 0 | 0 | 0 | OK | | 172 | Azure Import Export | 14 | 6 | 0 | 0 | 0 | OK | | 173 | Azure Industry | 1 | 0 | 0 | 0 | 0 | OK | | 174 | Azure Integration Environments | 4 | 0 | 0 | 0 | 0 | OK | | 175 | Azure Internet Peering | 23 | 1 | 0 | 0 | 0 | OK | -| 176 | Azure Microsoft Discovery | 0 | 0 | 0 | 0 | 0 | OK | +| 176 | Azure Microsoft Discovery | 66 | 46 | 0 | 0 | 0 | OK | | 177 | Azure Network Function Manager | 8 | 2 | 0 | 0 | 0 | OK | -| 178 | Azure Networking | 28 | 18 | 0 | 0 | 0 | OK | +| 178 | Azure Networking | 16 | 5 | 11 | 0 | 23 | OK | | 179 | Azure Operator Insights | 0 | 0 | 0 | 0 | 0 | OK | -| 180 | Azure Operator Nexus | 212 | 159 | 0 | 3 | 0 | OK | +| 180 | Azure Operator Nexus | 214 | 161 | 2 | 5 | 0 | OK | | 181 | Azure Operator Service Manager | 47 | 24 | 0 | 0 | 0 | OK | -| 182 | Azure Oracle | 11 | 4 | 0 | 9 | 0 | OK | +| 182 | Azure Oracle | 11 | 4 | 0 | 0 | 0 | OK | | 183 | Azure Osconfig | 31 | 15 | 0 | 0 | 0 | OK | -| 184 | Azure Partner Solutions | 112 | 36 | 0 | 0 | 0 | OK | +| 184 | Azure Partner Solutions | 112 | 36 | 0 | 1 | 0 | OK | | 185 | Azure Payment Hsm | 28 | 18 | 0 | 0 | 0 | OK | | 186 | Azure Peering Service | 10 | 1 | 0 | 0 | 0 | OK | | 187 | Azure Quotas | 15 | 1 | 0 | 0 | 0 | OK | -| 188 | Azure Reliability | 103 | 41 | 0 | 8 | 0 | OK | +| 188 | Azure Reliability | 104 | 44 | 1 | 2 | 0 | OK | | 189 | Azure Resiliency | 20 | 7 | 0 | 0 | 0 | OK | | 190 | Azure Scheduler | 0 | 0 | 0 | 0 | 0 | OK | -| 191 | Azure Security | 126 | 53 | 0 | 0 | 0 | OK | -| 192 | Azure Sentinel | 394 | 289 | 3 | 3 | 0 | OK | -| 193 | Azure Service Connector | 
63 | 35 | 0 | 6 | 0 | OK | -| 194 | Azure Sre Agent | 100 | 40 | 0 | 0 | 0 | OK | +| 191 | Azure Security | 126 | 53 | 0 | 3 | 0 | OK | +| 192 | Azure Sentinel | 394 | 283 | 0 | 394 | 0 | OK | +| 193 | Azure Service Connector | 63 | 35 | 0 | 0 | 0 | OK | +| 194 | Azure Sre Agent | 100 | 39 | 2 | 24 | 2 | OK | | 195 | Azure Virtual Enclaves | 0 | 0 | 0 | 0 | 0 | OK | ### Totals - **Products Processed**: 195 success, 0 failed -- **Total Pages**: 32753 -- **Total Classified**: 22538 -- **Total New Pages**: 543 -- **Total Updated Pages**: 917 -- **Total Deleted Pages**: 82 +- **Total Pages**: 32875 +- **Total Classified**: 22605 +- **Total New Pages**: 541 +- **Total Updated Pages**: 1291 +- **Total Deleted Pages**: 429 ### Classification by Type (All Products) | Type | Count | |------|-------| -| architecture-patterns | 565 | -| best-practices | 1278 | -| configuration | 7194 | -| decision-making | 1281 | -| deployment | 1228 | -| integrations | 5331 | -| limits-quotas | 1063 | -| security | 3078 | -| troubleshooting | 1520 | +| architecture-patterns | 557 | +| best-practices | 1266 | +| configuration | 7171 | +| decision-making | 1305 | +| deployment | 1240 | +| integrations | 5387 | +| limits-quotas | 1067 | +| security | 3097 | +| troubleshooting | 1515 | --- @@ -389,7 +389,7 @@ Quick overview for reviewers. See individual product reports for details. 
| 140 | active | azure-language-service | Azure AI Language | ...oft.com/en-us/azure/ai-services/language-service/toc.json | 0 | | 141 | virtual | microsoft-foundry | Microsoft Foundry | - | 1 | | 142 | active | azure-planetary-computer-pro | Microsoft Planetary Computer Pro | ...arn.microsoft.com/en-us/azure/planetary-computer/toc.json | 0 | -| 143 | active | azure-playwright-workspaces | Playwright Workspaces | ...arn.microsoft.com/en-us/azure/playwright-testing/toc.json | 0 | +| 143 | virtual | azure-playwright-workspaces | Playwright Workspaces | - | 0 | | 144 | active | azure-route-server | Azure Route Server | ...s://learn.microsoft.com/en-us/azure/route-server/toc.json | 0 | | 145 | active | azure-static-web-apps | Azure Static Web Apps | .../learn.microsoft.com/en-us/azure/static-web-apps/toc.json | 0 | | 146 | active | azure-update-manager | Azure Update Manager | https://learn.microsoft.com/en-us/azure/automanage/toc.json | 2 | @@ -422,7 +422,7 @@ Quick overview for reviewers. See individual product reports for details. 
| 173 | active | azure-industry | Azure Industry | ...y/training-services/microsoft-community-training/toc.json | 0 | | 174 | active | azure-integration-environments | Azure Integration Environments | ...crosoft.com/en-us/azure/integration-environments/toc.json | 0 | | 175 | active | azure-internet-peering | Azure Internet Peering | ...learn.microsoft.com/en-us/azure/internet-peering/toc.json | 0 | -| 176 | virtual | azure-microsoft-discovery | Azure Microsoft Discovery | - | 0 | +| 176 | active | azure-microsoft-discovery | Azure Microsoft Discovery | ...rn.microsoft.com/en-us/azure/microsoft-discovery/toc.json | 0 | | 177 | active | azure-network-function-manager | Azure Network Function Manager | ...crosoft.com/en-us/azure/network-function-manager/toc.json | 0 | | 178 | active | azure-networking | Azure Networking | https://learn.microsoft.com/en-us/azure/networking/toc.json | 1 | | 179 | active | azure-operator-insights | Azure Operator Insights | ...earn.microsoft.com/en-us/azure/operator-insights/toc.json | 0 | diff --git a/products/microsoft-foundry-classic/microsoft-foundry-classic.csv b/products/microsoft-foundry-classic/microsoft-foundry-classic.csv index a4dd7ff5..399a2db1 100644 --- a/products/microsoft-foundry-classic/microsoft-foundry-classic.csv +++ b/products/microsoft-foundry-classic/microsoft-foundry-classic.csv @@ -1,10 +1,10 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type https://learn.microsoft.com/en-us/azure/foundry-classic/agents/concepts/capability-hosts,Capability hosts,Capability hosts for Foundry Agent Service (classic) - Microsoft Foundry (classic),Configure and troubleshoot capability hosts in Foundry,"Learn how capability hosts route agent data to Microsoft-managed or your own Azure resources, and how to configure and troubleshoot them. 
(classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Updating capability hosts is not supported. To modify a capability host, you must delete the existing one and recreate it with the new configuration. Capability hosts are sub-resources that you configure at both the Microsoft Foundry account and Foundry project scopes. They tell Foundry Agent Service where to store and process agent data, including:",2026-03-06T23:10:00.000Z,concept-article,configuration,0.8,True,"Explains capability host resources, routing of data, and configuration; likely includes specific setting names, scopes, and troubleshooting mappings unique to this service.",unchanged
-https://learn.microsoft.com/en-us/azure/foundry-classic/agents/concepts/model-region-support,Model and region support,Azure OpenAI models and regions for Foundry Agent Service (classic) - Microsoft Foundry (classic) portal,Select Azure OpenAI models and regions for Foundry Agents,"Find supported Azure OpenAI models and regions for Microsoft Foundry Agent Service. Compare gpt-5, gpt-4o, and gpt-4 availability across global and regional deployments. (classic)","Note This document refers to the Microsoft Foundry (classic) portal. Agents (classic) are now deprecated and will be retired on March 31, 2027. Use the new agents in the generally availableMicrosoft Foundry Agents Service. Follow themigration guideto update your workloads. Azure OpenAI models power agents in Foundry Agent Service. To use these models, you need aMicrosoft Foundry projectwith access to Agent Service. Use the tabs to find a supported model, deployment type, and region combination. ",2026-04-16T17:16:00.000Z,concept-article,decision-making,0.65,True,"Described as a matrix of supported Azure OpenAI models, deployment types, and regions for Foundry Agent Service.
This is expert knowledge used to decide which model/region combinations are available and appropriate, fitting decision-making (service/option selection with concrete availability data).",updated
+https://learn.microsoft.com/en-us/azure/foundry-classic/agents/concepts/model-region-support,Model and region support,Azure OpenAI models and regions for Foundry Agent Service (classic) - Microsoft Foundry (classic) portal,Select Azure OpenAI models and regions for Foundry Agents,"Find supported Azure OpenAI models and regions for Microsoft Foundry Agent Service. Compare gpt-5, gpt-4o, and gpt-4 availability across global and regional deployments. (classic)","Note This document refers to the Microsoft Foundry (classic) portal. Agents (classic) are now deprecated and will be retired on March 31, 2027. Use the new agents in the generally availableMicrosoft Foundry Agents Service. Follow themigration guideto update your workloads. Azure OpenAI models power agents in Foundry Agent Service. To use these models, you need aMicrosoft Foundry projectwith access to Agent Service. Use the tabs to find a supported model, deployment type, and region combination. ",2026-04-16T17:16:00.000Z,concept-article,decision-making,0.65,True,"Described as a matrix of supported Azure OpenAI models, deployment types, and regions for Foundry Agent Service. This is expert knowledge used to decide which model/region combinations are available and appropriate, fitting decision-making (service/option selection with concrete availability data).",unchanged
https://learn.microsoft.com/en-us/azure/foundry-classic/agents/concepts/standard-agent-setup,Standard agent setup,Set up standard agent resources for Foundry Agent Service (classic) - Microsoft Foundry (classic),Configure standard agent setup with customer-managed resources,Learn how to provision and configure customer-managed Azure resources for Foundry Agent Service standard agent setup with enterprise-grade data isolation.
(classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Standard agent setup uses customer-managed, single-tenant Azure resources to store agent state and keep all agent data under your control. Use standard setup when you need full data sovereignty, compliance with enterprise security policies, or project-level isolation. In this setup: Tip For a simpler setup that uses Microsoft-managed resources, seeEnvironment setupand choose the basic agent setup opti",2026-02-27T23:08:00.000Z,how-to,configuration,0.78,True,"Describes provisioning and configuring customer-managed Azure resources for standard setup; likely includes specific resource types, settings, and configuration parameters.",unchanged
https://learn.microsoft.com/en-us/azure/foundry-classic/agents/concepts/threads-runs-messages,"Threads, runs, and messages","Threads, Runs, and Messages in the Foundry Agent Service (classic) - Microsoft Foundry (classic)",,Learn about the components used in the Foundry Agent Service. (classic),"Note This document refers to the Microsoft Foundry (classic) portal. Agents (classic) are now deprecated and will be retired on March 31, 2027. Use the new agents in the generally availableMicrosoft Foundry Agents Service. Follow themigration guideto update your workloads. Foundry Agent Service supports persistent threads, runs, and messages.
These components are essential for managing conversation states and interactions with users.",2026-03-06T23:10:00.000Z,concept-article,,0.4,False,"Conceptual explanation of threads, runs, and messages; summary doesn’t indicate numeric limits, config tables, or troubleshooting mappings.",unchanged
-https://learn.microsoft.com/en-us/azure/foundry-classic/agents/environment-setup,Environment setup,Set up your environment for Foundry Agent Service (classic) - Microsoft Foundry (classic) portal,Configure environment and infrastructure for Foundry Agent Service,Use this guide to set up your agent environment (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Some links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. In this article, you deploy the infrastructure needed to create agents with Foundry Agent Service. After completing this setup, you can create and configure agents using either the SDK of your choice or the Foundry portal. Creating your first ",2026-03-26T06:04:00.000Z,how-to,configuration,0.7,True,"Environment setup for Foundry Agent Service is likely to include specific resource types, configuration parameters, and required settings unique to this service (for example, particular Azure resources, identities, or environment variables). That is product-specific configuration knowledge beyond generic tutorials, so it fits the configuration sub-skill.",unchanged
-https://learn.microsoft.com/en-us/azure/foundry-classic/agents/faq,FAQ,Foundry Agent Service frequently asked questions - Microsoft Foundry,,"Get answers to common questions about Foundry Agent Service, including setup options, data handling, pricing, and networking.","Find answers to common questions about Foundry Agent Service.
If you can't find answers to your questions in this article and you still need help, seeFoundry Tools support and help options. Foundry Agent Service is part of Foundry Tools. For getting started, see:",2026-04-14T22:13:00Z,faq,,0.1,False,"Agent Service FAQ description mentions setup, data handling, pricing, and networking but with no evidence of specific limits, config parameters, or error codes in the summary. Likely general conceptual/FAQ content.",updated
+https://learn.microsoft.com/en-us/azure/foundry-classic/agents/environment-setup,Environment setup,Set up your environment for Foundry Agent Service (classic) - Microsoft Foundry (classic) portal,Configure environment and infrastructure for Foundry Agent Service,Use this guide to set up your agent environment (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. In this article, you deploy the infrastructure needed to create agents with Foundry Agent Service. After completing this setup, you can create and configure agents using either the SDK of your choice or the Foundry portal. Creating your first agent",2026-04-14T22:13:00.000Z,how-to,configuration,0.7,True,"An environment setup guide for a specific service usually includes concrete configuration parameters (resource types/SKUs to deploy, required settings, environment variables, and service-specific prerequisites).
These are detailed, product-specific configuration steps that qualify as expert knowledge and align best with the configuration sub-skill.",updated
+https://learn.microsoft.com/en-us/azure/foundry-classic/agents/faq,FAQ,Foundry Agent Service frequently asked questions - Microsoft Foundry,,"Get answers to common questions about Foundry Agent Service, including setup options, data handling, pricing, and networking.","Find answers to common questions about Foundry Agent Service. If you can't find answers to your questions in this article and you still need help, seeFoundry Tools support and help options. Foundry Agent Service is part of Foundry Tools. For getting started, see:",2026-04-14T22:13:00Z,faq,,0.1,False,"Agent Service FAQ description mentions setup, data handling, pricing, and networking but with no evidence of specific limits, config parameters, or error codes in the summary. Likely general conceptual/FAQ content.",unchanged
https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/connected-agents,Connected agents,How to use connected agents (classic) - Microsoft Foundry (classic),Design multi-agent systems with Foundry connected agents,Learn how to create multi-agentic systems using connected agents in the Foundry Agent Service. (classic),"Note This document refers to the Microsoft Foundry (classic) portal. Agents (classic) are now deprecated and will be retired on March 31, 2027. Use the new agents in the generally availableMicrosoft Foundry Agents Service. Follow themigration guideto update your workloads. Note This tool is only available in2025-05-15-previewAPI. We highly recommend you to migrate to use the2025-11-15-previewAPI versionworkflowsfor multi-agent orchestration.
Connected agents in Foundry Agent Service let you brea",2026-02-27T23:08:00.000Z,how-to,architecture-patterns,0.65,True,Describes how to break workflows into specialized agents and orchestrate them; likely includes product-specific patterns and guidance on when to use connected agents.,unchanged
https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/metrics,Service monitoring,Monitor Foundry Agent Service with Azure Monitor (classic) - Microsoft Foundry (classic),Monitor Foundry Agent Service with Azure Monitor and KQL,"Learn how to use Azure Monitor to view, analyze, and alert on platform metrics for Foundry Agent Service, including Log Analytics export and KQL queries. (classic)","Note This document refers to the Microsoft Foundry (classic) portal. Agents (classic) are now deprecated and will be retired on March 31, 2027. Use the new agents in the generally availableMicrosoft Foundry Agents Service. Follow themigration guideto update your workloads. This article describes: Note If you're already familiar with this service and/or Azure Monitor and just want to know how to analyze monitoring data, see theAnalyzesection near the end of this article. When you have critical ap",2026-02-27T23:08:00.000Z,how-to,configuration,0.75,True,"Describes platform metrics, Log Analytics export, and KQL queries; likely includes metric names, dimensions, and example queries unique to this service.",unchanged
https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/tools-classic/azure-ai-search,Overview,How to use an existing AI Search index with the Azure AI Search tool (classic) - Microsoft Foundry (classic),Connect Foundry agents to existing Azure AI Search indexes,Learn how to use Agents Azure AI Search tool. (classic),"Note This document refers to the Microsoft Foundry (classic) agents. 🔍View the new Azure AI Search tool documentation.
@@ -59,7 +59,7 @@ https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/tools-clas
Agents (classic) are now deprecated and will be retired on March 31, 2027. Use the new agents in the generally availableMicrosoft Foundry Agents Service. Follow themigration guideto update your workloads. Note This article describes the Microsoft SharePoint tool for Foundry Agent Service. For information on using and deploying SharePoint sites, see theSharePoint documentation. Use th",2026-03-06T23:10:00.000Z,how-to,integrations,0.8,True,Example-focused article; expected to show concrete configuration and code patterns for the SharePoint tool that are product-specific.,unchanged
https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/triggers,Trigger an agent using Logic Apps,Trigger a Microsoft Foundry Agent Using Logic Apps (classic) - Microsoft Foundry (classic),Trigger Foundry agents from Logic Apps events,Use this article to learn how to trigger an AI agent when an event occurs. (classic),"Note This document refers to the Microsoft Foundry (classic) portal. Agents (classic) are now deprecated and will be retired on March 31, 2027. Use the new agents in the generally availableMicrosoft Foundry Agents Service. Follow themigration guideto update your workloads. To streamline your workflow, you might want to automatically invoke your AI agent when an event occurs. The event might be receiving a new email.
Or it might be getting a new customer ticket so that your AI agent can immediate",2026-02-27T23:08:00.000Z,how-to,integrations,0.7,True,"Shows how to invoke agents on events like emails or tickets; likely includes Logic Apps trigger configuration, connector parameters, and payload schemas specific to this integration.",unchanged
https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/use-your-own-resources,Use your own resources,Use your own resources in the Foundry Agent Service (classic) - Microsoft Foundry (classic) portal,Configure Foundry Agent Service to use your own Azure resources,Learn how to use resources that you already have with the Foundry Agent Service. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Some links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. By default, Foundry Agent Service manages storage for files, conversations, and vector stores. If your organization requires full data ownership, customer-managed keys (CMK), or network isolation, you can connect your own Azure resources inste",2026-03-26T06:04:00.000Z,how-to,configuration,0.65,True,"Describes connecting your own Azure resources (storage, CMK, network isolation) instead of managed storage. This typically involves specific resource types, configuration parameters, and wiring patterns unique to Foundry Agent Service, fitting configuration-focused expert knowledge.",unchanged
-https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/virtual-networks,Use virtual networks,Set up private networking for Foundry Agent Service (classic) - Microsoft Foundry (classic),Set up private networking and VNet integration for Foundry,"Set up private networking for Foundry Agent Service using Bicep or Terraform. Deploy a virtual network with private endpoints, DNS zones, and deny-by-default network rules.
(classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Foundry Agent Service offers aStandard Setup with private networkingenvironment, where you bring your own (BYO) private virtual network. This setup creates an isolated network environment that enables secure access to data while maintaining full control over your network infrastructure. By default, the Standard Setup with private networking ensures: If you don't have an existing virtual network, the S",2026-02-27T23:08:00.000Z,how-to,security,0.8,True,"Describes private networking setup with Bicep/Terraform, private endpoints, DNS zones, and deny-by-default rules; includes product-specific network and security configuration details.",unchanged
+https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/virtual-networks,Use virtual networks,Set up private networking for Foundry Agent Service (classic) - Microsoft Foundry (classic) portal,Configure private networking for Foundry Agent Service,"Set up private networking for Foundry Agent Service using Bicep or Terraform. Deploy a virtual network with private endpoints, DNS zones, and deny-by-default network rules. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Foundry Agent Service offers aStandard Setup with private networkingenvironment. This setup creates an isolated network environment that enables secure access to data while maintaining full control over your network infrastructure. By default, the ",2026-04-14T22:13:00.000Z,how-to,configuration,0.7,True,"The page describes a 'Standard Setup with private networking' for Foundry Agent Service using Bicep/Terraform, including specific Azure networking components (virtual network, private endpoints, DNS zones, deny-by-default rules).
This is product-specific configuration guidance with concrete resource types and settings, rather than a generic networking overview, so it fits the configuration sub-skill.",updated
https://learn.microsoft.com/en-us/azure/foundry-classic/agents/overview,What is Foundry Agent Service?,What is Foundry Agent Service? (classic) - Microsoft Foundry (classic),,"Learn what Foundry Agent Service is, how it works, and how to build production-ready agents in Microsoft Foundry. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Most businesses don't want just chatbots. They want automation that's faster and has fewer errors. That desire might mean summarizing documents, processing invoices, managing support tickets, or publishing blog posts. In all cases, the goal is the same: freeing people and resources to focus on higher-value work by offloading repetitive and predictable tasks. Large language models (LLMs) introduce a ne",2026-02-27T23:08:00.000Z,overview,,0.3,False,"Conceptual overview of Foundry Agent Service; primarily explains what it is and why, not detailed configs or limits.",unchanged
https://learn.microsoft.com/en-us/azure/foundry-classic/agents/quickstart,Quickstart,Quickstart - Create a new Foundry Agent Service project (classic) - Microsoft Foundry (classic),,Use this guide to start using Foundry Agent Service. (classic),"Note This document refers to the Microsoft Foundry (classic) portal. Agents (classic) are now deprecated and will be retired on March 31, 2027. Use the new agents in the generally availableMicrosoft Foundry Agents Service. Follow themigration guideto update your workloads. Note This quickstart is for the previous version of agents. See thequickstart for Microsoft Foundryto use the new version of the API.
Foundry Agent Service allows you to create AI agents tailored to your needs through custom i",2026-02-27T23:08:00.000Z,quickstart,,0.35,False,Quickstart for creating a project; usually step-by-step with minimal deep configuration or limits content.,unchanged
https://learn.microsoft.com/en-us/azure/foundry-classic/agents/quotas-limits,Quotas and limits,Quotas and limits for Foundry Agent Service (classic) - Microsoft Foundry (classic),Review quotas and limits for Foundry Agent Service,"Learn the quotas and limits for Foundry Agent Service, including file sizes, vector stores, threads, messages, tools, and how to handle limit errors. (classic)","Note This document refers to the Microsoft Foundry (classic) portal. Agents (classic) are now deprecated and will be retired on March 31, 2027. Use the new agents in the generally availableMicrosoft Foundry Agents Service. Follow themigration guideto update your workloads. This article describes the quotas and limits for Foundry Agent Service. Understanding these limits helps you design agents that scale reliably and avoid runtime errors in production.",2026-02-27T23:08:00.000Z,concept-article,limits-quotas,0.9,True,"Explicitly describes quotas and limits (file sizes, vector stores, threads, messages, tools) and handling limit errors; this is numeric limit data central to the page.",unchanged
@@ -69,7 +69,7 @@ https://learn.microsoft.com/en-us/azure/foundry-classic/ai-services/content-safe
https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/ai-foundry-consolidated-view,Consolidated view,Consolidated view for Foundry Tools in the Azure portal (classic) - Microsoft Foundry (classic),,"Discover how the Foundry consolidated view in the Azure portal simplifies AI workload management with cost, usage, and quota insights in one place. (classic)","Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal.
The consolidated view for Foundry Tools in the Azure portal shows key insights about your AI workloads, including costs, usage, and quota utilization for resources you use with Microsoft Foundry. Instead of switching between tools, you see in one place how your AI resources perform, what they cost, and whether you're nearing usage limits.",2026-02-27T23:08:00.000Z,concept-article,,0.35,False,Consolidated view article is primarily an overview of a monitoring UI for costs/usage/quotas; unlikely to contain detailed numeric limits or configuration parameter tables.,unchanged
https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/ai-red-teaming-agent,AI red teaming agent overview,AI Red Teaming Agent (classic) - Microsoft Foundry (classic),,This article provides conceptual overview of the AI Red Teaming Agent. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. The AI Red Teaming Agent is a powerful tool desig",2026-02-27T23:08:00.000Z,concept-article,,0.3,False,"Described as a conceptual overview of the AI Red Teaming Agent; unlikely to contain detailed config tables, limits, or error mappings.",unchanged
https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/ai-resources,Hubs overview,Hubs and hub-based project overview (classic) - Microsoft Foundry (classic) portal,,This article introduces concepts about Microsoft Foundry hubs for your Microsoft Foundry projects. (classic),Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal.
Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Important This article provides legacy support for hub-based projects. It will not work forFoundry projects. SeeHow do I know which type of project I have? SDK compatibility note: Code examples require a specific ,2026-03-03T06:03:00.000Z,concept-article,,0.1,False,"The article is an overview of hubs and hub-based projects in Foundry (classic), describing concepts and compatibility notes. The summary indicates conceptual explanation and legacy support context, without evidence of numeric limits, configuration parameter tables, error codes, or decision matrices. It reads as a conceptual overview rather than expert configuration, troubleshooting, or limits content.",unchanged
-https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/architecture,Service architecture,Microsoft Foundry architecture (classic) - Microsoft Foundry (classic) portal,,Learn about the architecture of Microsoft Foundry. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Microsoft Foundry organizes AI workloads through a layered architecture: a top-level Foundry resource for governance, projects for development isolation, and connected Azure services for storage, search, and secrets management. This article provide",2026-04-16T22:08:00.000Z,concept-article,,0.1,False,"Described as an architecture overview of Microsoft Foundry’s layered model (resource, projects, connected services).
This is conceptual service architecture, not a product-specific decision matrix with thresholds or quantified trade-offs, and it doesn’t indicate limits, configs, or troubleshooting details.",updated
+https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/architecture,Service architecture,Microsoft Foundry architecture (classic) - Microsoft Foundry (classic) portal,,Learn about the architecture of Microsoft Foundry. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Microsoft Foundry organizes AI workloads through a layered architecture: a top-level Foundry resource for governance, projects for development isolation, and connected Azure services for storage, search, and secrets management. This article provide",2026-04-16T22:08:00.000Z,concept-article,,0.1,False,"Described as an architecture overview of Microsoft Foundry’s layered model (resource, projects, connected services). This is conceptual service architecture, not a product-specific decision matrix with thresholds or quantified trade-offs, and it doesn’t indicate limits, configs, or troubleshooting details.",unchanged
https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/ask-ai,Ask AI,Ask AI for help (classic) - Microsoft Foundry (classic) portal,,"Learn how to ask AI for help, getting your questions answered and tasks supported. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal You can ask AI to assist you in Microsoft Foundry. To start using AI to ask questions, select the AI icon in the upper right of theMicrosoft Foundryportal. A chat window opens where you can type your questions and receive answers in real time. Important Items marked (preview) in this article are currently in public preview.
This preview is provided without a service-level agreement, and we don't recom",2026-04-06T08:00:00.000Z,concept-article,,0.2,False,"Usage description of Ask AI chat in the Foundry portal; no limits, configuration tables, error codes, or product-specific numeric details.",unchanged
https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/authentication-authorization-foundry,Authentication and authorization,Authentication and authorization in Microsoft Foundry (classic) - Microsoft Foundry (classic),Configure authentication and authorization in Microsoft Foundry,"Learn how to authenticate and authorize access in Microsoft Foundry using Microsoft Entra ID and API keys. Explore RBAC, identity types, and best practices. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Authentication and authorization in Microsoft Foundry control how principals prove identity and gain permission to perform operations. Foundry divides operations into control plane (resource management) and data plane (runtime usage), each with its own authentication and role-based access control (RBAC) surface. Foundry supports two authentication methods: Microsoft Entra ID and API keys.
Microsoft En",2026-02-27T23:08:00.000Z,concept-article,security,0.85,True,"Article covers Entra ID vs API keys, control vs data plane auth, and RBAC surfaces; it will list specific roles, scopes, and auth flows unique to Foundry, which are security configuration details.",unchanged
https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/built-in-evaluators,Built-in evaluators reference,Built-in Evaluators Reference (classic) - Microsoft Foundry (classic),Configure and use built-in Foundry evaluators,Comprehensive reference for all built-in evaluators in Microsoft Foundry (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. Microsoft Foundry provides a comprehensive set of",2026-02-27T23:08:00.000Z,article,configuration,0.7,True,"Reference for all built-in evaluators; these references usually list evaluator names, parameters, and allowed values, which are product-specific configuration details beyond generic LLM knowledge.",unchanged
@@ -84,26 +84,26 @@ https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/evaluation-eval
https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/evaluation-evaluators/rag-evaluators,Retrieval Augmented Generation (RAG) evaluators,Retrieval-Augmented Generation (RAG) Evaluators for Generative AI (classic) - Microsoft Foundry (classic),Configure RAG evaluators for relevance and groundedness,"Learn about Retrieval-Augmented Generation evaluators for assessing relevance, groundedness, and response completeness in generative AI systems.
(classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note The Microsoft Foundry SDK for evaluation and Foundry portal are in public preview, but the APIs are generally available for model and dataset evaluation (agent evaluation remains in public preview). The Azure AI Evaluation SDK and evaluators marked (preview) in this article are currently in public preview everywhere. A Retrieval-Augmented Generation (RAG) system tries to generate the most relevan",2026-02-27T23:08:00.000Z,reference,configuration,0.7,True,"RAG evaluator reference is product-specific; typically includes evaluator names, inputs, and configuration options for relevance/groundedness/completeness scoring.",unchanged
https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/evaluation-evaluators/risk-safety-evaluators,Risk and safety evaluators,Risk and Safety Evaluators for Generative AI (classic) - Microsoft Foundry (classic),Configure risk and safety evaluators in Foundry,"Learn about risk and safety evaluators for generative AI, including tools for assessing content safety, jailbreak vulnerabilities, and code security risks. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note The Microsoft Foundry SDK for evaluation and Foundry portal are in public preview, but the APIs are generally available for model and dataset evaluation (agent evaluation remains in public preview). The Azure AI Evaluation SDK and evaluators marked (preview) in this article are currently in public preview everywhere.
Risk and safety evaluators draw on insights gained from our previous large langu",2026-02-27T23:08:00.000Z,reference,configuration,0.7,True,"Risk and safety evaluators are specific to this platform; page likely documents evaluator types, parameters, and thresholds that constitute expert configuration knowledge.",unchanged
https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/evaluation-evaluators/textual-similarity-evaluators,Textual similarity evaluators,Textual Similarity Evaluators for Generative AI (classic) - Microsoft Foundry (classic),Use textual similarity evaluators in Foundry,"Learn about textual similarity evaluators for generative AI, including semantic similarity, F1 score, BLEU, GLEU, ROUGE, and METEOR metrics. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note The Microsoft Foundry SDK for evaluation and Foundry portal are in public preview, but the APIs are generally available for model and dataset evaluation (agent evaluation remains in public preview). The Azure AI Evaluation SDK and evaluators marked (preview) in this article are currently in public preview everywhere. It's important to compare how closely the textual response generated by your AI ",2026-02-27T23:08:00.000Z,reference,configuration,0.7,True,"Covers concrete evaluators (semantic similarity, F1, BLEU, ROUGE, etc.) as implemented in Foundry; likely includes metric-specific parameters and usage details unique to this SDK.",unchanged
-https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/evaluation-regions-limits-virtual-network,"Rate limits, region and virtual network support","Rate limits, region support, and enterprise features for evaluation (classic) - Microsoft Foundry (classic) portal",Evaluate Foundry classic with regional limits and VNet,"Learn about region availability, rate limits, virtual network support, and using your own storage account for evaluation in Microsoft Foundry.
(classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal This article provides an overview of which regions support AI-assisted evaluators, the rate limits that apply to evaluation runs, how to configure virtual network support for network isolation, and using your own storage account to run evaluations.",2026-03-20T22:11:00.000Z,how-to,limits-quotas,0.78,True,"The page explicitly focuses on rate limits for evaluation runs, region availability, and enterprise features like virtual network support and storage account usage. These topics typically include concrete numeric rate limits, region support matrices, and feature-specific constraints that are not generally known from training data, matching the limits-quotas category.",unchanged +https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/evaluation-regions-limits-virtual-network,"Rate limits, region and virtual network support","Rate limits, region support, and enterprise features for evaluation (classic) - Microsoft Foundry (classic) portal",Rate limits and regional support for Foundry evaluations,"Learn about region availability, rate limits, virtual network support, and using your own storage account for evaluation in Microsoft Foundry. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal This article provides an overview of which regions support AI-assisted evaluators, the rate limits that apply to evaluation runs, how to configure virtual network support for network isolation, and using your own storage account to run evaluations.",2026-04-14T22:13:00.000Z,how-to,limits-quotas,0.86,True,"The page explicitly focuses on rate limits for evaluation runs, region availability, and enterprise features like virtual network support and storage usage. Documentation of rate limits for a specific Azure service is expert knowledge that changes over time and isn't reliably known from training data. 
While it also mentions VNet and storage configuration, the primary unique, time-sensitive content is the service-specific rate limits and regional constraints, which best fit the limits-quotas sub-skill.",updated https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/fine-tuning-overview,Fine-tuning in Microsoft Foundry,Fine-tune models with Microsoft Foundry (classic) - Microsoft Foundry (classic),Decide when and how to fine-tune models in Foundry,This article explains what fine-tuning is and under what circumstances you should consider doing it. (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Fine-tuning customizes a pretrained AI model with additional training on a specific task or dataset to improve performance, add new skills, or enhance accuracy. The result is a new, optimized GenAI model based on the provided examples. This article walks you through key concepts and decisions to make before you fine-tune, including the type of fine-tuning that's righ",2026-02-27T23:08:00.000Z,concept-article,decision-making,0.65,True,"Explains when to fine-tune and which type to choose; likely includes scenario-based guidance and trade-offs for different fine-tuning approaches, which is decision-making content.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/foundry-models-overview,Explore Foundry Models,Microsoft Foundry Models overview (classic) - Microsoft Foundry (classic),,This article introduces Microsoft Foundry Models and the model catalog in Microsoft Foundry portal. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Microsoft Foundry Models is your one-stop destination for discovering, evaluating, and deploying powerful AI models—whether you're building a custom copilot, an agent, enhancing an existing application, or exploring new AI capabilities. 
With Foundry Models, you can: Whether you're a developer, data scientist, or enterprise architect, Foundry Models gives you the flexibility and control to build AI sol",2026-03-05T06:04:00.000Z,concept-article,,0.25,False,"Models overview and catalog description; marketing/conceptual without detailed limits, configs, or decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/foundry-models-overview,Explore Foundry Models,Microsoft Foundry Models overview (classic) - Microsoft Foundry (classic) portal,,This article introduces Microsoft Foundry Models and the model catalog in Microsoft Foundry portal. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Microsoft Foundry Models is your one-stop destination for discovering, evaluating, and deploying powerful AI models—whether you're building a custom copilot, an agent, enhancing an existing application, or exploring new AI capabilities. With Foundry Models, you can: Whether you're a developer, data scientist, or enterprise architect, Foundry Models gives you the flexibility and control to build AI sol",2026-04-20T08:00:00.000Z,concept-article,,0.1,False,"Described as an introduction to Foundry Models and the model catalog; this is a conceptual/marketing-style overview without indication of concrete limits, configs, or decision matrices.",updated https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/hub-encryption-keys-portal,Customer-managed keys,Customer-managed keys for hub projects (classic) - Microsoft Foundry (classic),Configure customer-managed keys for Foundry hub projects,Use customer-managed keys (CMK) with hub-based projects in Microsoft Foundry. (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important This article provides legacy support for hub-based projects. It will not work forFoundry projects. 
SeeHow do I know which type of project I have? SDK compatibility note: Code examples require a specific Microsoft Foundry SDK version. If you encounter compatibility issues, considermigrating from a hub-based to a Foundry project. Tip An alternate Foundry proj",2026-02-27T23:08:00.000Z,concept-article,security,0.78,True,"CMK setup for hub projects will include Azure Key Vault integration details, key URI formats, required permissions, and Foundry-specific encryption behaviors that are product-specific security configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/hub-rbac-foundry,Role-based access control for Microsoft Foundry,Role-based access control for Microsoft Foundry (Hubs and Projects) (classic) - Microsoft Foundry (classic),Configure RBAC for Microsoft Foundry hubs and projects,This article introduces role-based access control in Microsoft Foundry portal (hub-focused version). (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important This article provides legacy support for hub-based projects. It will not work forFoundry projects. SeeHow do I know which type of project I have? SDK compatibility note: Code examples require a specific Microsoft Foundry SDK version. If you encounter compatibility issues, considermigrating from a hub-based to a Foundry project. Tip An alternate Foundry proj",2026-02-27T23:08:00.000Z,concept-article,security,0.8,True,"RBAC article will list specific roles, permissions, and scope definitions for Foundry hubs and projects, which are product-specific security details.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/manage-costs,Manage costs,Plan and Manage Costs (classic) - Microsoft Foundry (classic) portal,,"Manage Microsoft Foundry costs by estimating expenses, monitoring usage, and setting up alerts for spending anomalies with Microsoft Cost Management. 
(classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal This article shows you how to estimate expenses before deployment, track spending in real time, and set up alerts to avoid budget surprises.",2026-04-08T08:00:00.000Z,how-to,,0.3,False,"Cost planning/management overview; summary suggests conceptual guidance on estimating expenses and setting alerts without concrete tier matrices, numeric thresholds, or detailed decision tables.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/management-center,Management center,Management center overview (classic) - Microsoft Foundry (classic),,The management center in Microsoft Foundry portal provides a centralized hub for governance and management activities. (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. The management center is part of the Microsoft Foundry portal that streamlines governance and management activities. From the management center, you can manage: Foundry hubs and hub-based projects Azure AI Foundry projects Quotas for models and virtual machines (VMs) Note VMs and VM quotas are only available for hub-based projects. User management and role assignment",2026-03-03T06:03:00.000Z,concept-article,,0.4,False,"Management center overview describes governance surfaces at a high level; summary suggests navigation and conceptual description rather than detailed quotas, roles, or configuration tables.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/model-benchmarks,Model leaderboards,Model benchmarks and leaderboards in Microsoft Foundry (classic) - Microsoft Foundry (classic),Use Foundry model benchmarks and leaderboards,"Compare AI models using quality, safety, cost, and performance benchmarks on the model leaderboards (preview) in Microsoft Foundry portal. 
(classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. Model leaderboards (preview) in Microsoft Foundry",2026-02-27T23:08:00.000Z,concept-article,decision-making,0.75,True,"Provides benchmark metrics across quality, safety, cost, and performance to compare models, enabling quantified model selection decisions.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/model-catalog-content-safety,Guardrails & controls for serverless API,Guardrails & controls for Models Sold Directly by Azure (classic) - Microsoft Foundry (classic),Configure guardrails for Azure-sold Foundry models,"Learn about content safety for models deployed using serverless API deployments, using Microsoft Foundry. (classic)","Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. 
In this articl",2026-03-02T18:11:00.000Z,concept-article,security,0.7,True,"Content safety and guardrails for models sold directly by Azure; likely includes specific safety settings, policy controls, and configuration parameters unique to Foundry serverless deployments.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/model-lifecycle-retirement,Model lifecycle and retirement,Model deprecation and retirement for Foundry Models (classic) - Microsoft Foundry (classic) portal,Plan for Foundry model deprecation and retirement,"Learn about model lifecycle stages, deprecation timelines, notifications, and migration steps for Microsoft Foundry Models. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Some links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Microsoft Foundry Models are continually refreshed with newer and more capable models. As part of this process, model providers might deprecate and retire their older models, and you might need to update your applications to use a newer model.",2026-03-26T06:04:00.000Z,concept-article,decision-making,0.7,True,"Lifecycle/deprecation guidance usually includes timelines, migration steps, and criteria for when to move to newer models—service-specific upgrade and migration decision guidance.",unchanged +https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/model-retirement-managed-compute,Managed compute models lifecycle,Model deprecation and retirement for managed compute models (classic) - Microsoft Foundry (classic) portal,Handle deprecation and retirement of managed compute models,"Learn about model lifecycle stages, deprecation timelines, replacements for models deployed via managed compute. (classic)","Applies only to:Foundry (classic) portal. 
This article isn't available for the new Foundry portal.Learn more about the new portal. Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Microsoft Foundry continuously refreshes its model catalog with newer, more capable models. As part of this process, model providers might deprecate and retire their older models, and you might need to update your",2026-04-24T17:22:00.000Z,concept-article,decision-making,0.7,True,"The article focuses on lifecycle stages, deprecation timelines, and replacements for managed compute deployments. This is expert, product-specific guidance that informs when to migrate, what to migrate to, and how to plan around provider-driven changes, aligning with decision-making (migration considerations and upgrade paths).",new https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/models-inference-examples,Inference examples for serverless deployments,Serverless API inference examples for Foundry Models (classic) - Microsoft Foundry (classic),Serverless API inference examples for Foundry Models,Inference examples for Foundry Models that support deployment to serverless APIs in Microsoft Foundry. (classic),Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. The Foundry model catalog offers a large selection of Microsoft Foundry Models from a wide range of providers. You have various options for deploying models from the model catalog. This article lists inference examples for serverless API deployments. Important Models that are in preview are marked aspreviewon their model cards in the model catalog. 
To perform inferen,2026-02-27T23:08:00.000Z,concept-article,integrations,0.7,True,"Provides concrete inference examples for serverless API deployments; expected to show request schemas, headers, and parameters specific to Foundry Models serverless APIs.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/observability,Observability overview,Observability in Generative AI (classic) - Microsoft Foundry (classic),,"Learn how Microsoft Foundry enables safe, high-quality generative AI through systematic evaluation and observability tools. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. The AI application lifecycle requires robust eval",2026-02-27T23:08:00.000Z,concept-article,,0.3,False,"Conceptual overview of observability and evaluation; description suggests lifecycle concepts rather than concrete error codes, limits, or config tables.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/planning,Plan rollout,Microsoft Foundry Rollout Across My Organization (classic) - Microsoft Foundry (classic) portal,Plan Microsoft Foundry classic rollout at scale,"Learn how to plan the rollout of Microsoft Foundry across your organization, including environment setup, data isolation, and governance. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal A structured rollout plan helps you avoid security gaps, cost overruns, and access sprawl when adopting Microsoft Foundry at scale. 
This guide outlines key decisions for rolling out Foundry, including environment setup, data isolation, integration with other Azure services, capacity management, and monitoring. Use this guide as a starting point and adapt it to your needs. For implementation details, s",2026-04-16T16:35:00.000Z,concept-article,decision-making,0.65,True,"The planning guide outlines concrete rollout decisions for environment setup, data isolation, governance, capacity management, and monitoring when adopting Foundry across an organization. While the summary is high level, this type of rollout-planning content typically includes product-specific decision criteria and trade-offs (for example, how to segment environments, isolate data, and integrate with other Azure services) that go beyond generic concepts and help choose between options, fitting the decision-making category.",updated -https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/prompt-flow,Prompt flow overview,Prompt flow in Microsoft Foundry portal (classic) - Microsoft Foundry (classic),,This article introduces prompt flow in Microsoft Foundry portal. (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. 
Prompt flow is",2026-02-27T23:08:00.000Z,concept-article,,0.1,False,"Conceptual introduction to prompt flow; no indication of numeric limits, configuration tables, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/rbac-foundry,Role-based access control,Role-based access control for Microsoft Foundry (classic) - Microsoft Foundry (classic) portal,Assign and manage RBAC roles in Microsoft Foundry,This article introduces role-based access control in Microsoft Foundry portal. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Tip This article is for a Foundry project type. An alternate RBAC article for hub-based projects is available:Role-based access control for Microsoft Foundry (Hubs and Projects). In this article, you learn about role-based access control (RBAC) in your Microsoft Foundry resource and how to assign roles that control access to resources. Tip RBAC roles apply when you authenticate using Microsoft Entra I",2026-04-13T08:00:00.000Z,concept-article,security,0.82,True,"An RBAC article for a specific Azure service typically lists concrete role names, scopes, and what each role can do, plus how they apply with Microsoft Entra ID. Those are product-specific security/authorization details that qualify as expert knowledge under the security category.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/planning,Plan rollout,Microsoft Foundry Rollout Across My Organization (classic) - Microsoft Foundry (classic) portal,Plan Microsoft Foundry classic rollout at scale,"Learn how to plan the rollout of Microsoft Foundry across your organization, including environment setup, data isolation, and governance. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal A structured rollout plan helps you avoid security gaps, cost overruns, and access sprawl when adopting Microsoft Foundry at scale. 
This guide outlines key decisions for rolling out Foundry, including environment setup, data isolation, integration with other Azure services, capacity management, and monitoring. Use this guide as a starting point and adapt it to your needs. For implementation details, s",2026-04-16T16:35:00.000Z,concept-article,decision-making,0.65,True,"The planning guide outlines concrete rollout decisions for environment setup, data isolation, governance, capacity management, and monitoring when adopting Foundry across an organization. While the summary is high level, this type of rollout-planning content typically includes product-specific decision criteria and trade-offs (for example, how to segment environments, isolate data, and integrate with other Azure services) that go beyond generic concepts and help choose between options, fitting the decision-making category.",unchanged +https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/prompt-flow,Prompt flow overview,Prompt flow in Microsoft Foundry portal (classic) - Microsoft Foundry (classic) portal,,This article introduces prompt flow in Microsoft Foundry portal. (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. 
Your existing flows will continue to",2026-04-20T22:11:00.000Z,concept-article,,0.2,False,"Conceptual introduction to Prompt Flow in Foundry classic; summary suggests overview and retirement notice without detailed limits, configs, or troubleshooting matrices.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/rbac-foundry,Role-based access control,Role-based access control for Microsoft Foundry (classic) - Microsoft Foundry (classic) portal,Assign and manage RBAC roles in Microsoft Foundry,This article introduces role-based access control in Microsoft Foundry portal. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Tip This article is for a Foundry project type. An alternate RBAC article for hub-based projects is available:Role-based access control for Microsoft Foundry (Hubs and Projects). In this article, you learn about role-based access control (RBAC) in your Microsoft Foundry resource and how to assign roles that control access to resources. Tip RBAC roles apply when you authenticate using Microsoft Entra I",2026-04-13T08:00:00.000Z,concept-article,security,0.82,True,"An RBAC article for a specific Azure service typically lists concrete role names, scopes, and what each role can do, plus how they apply with Microsoft Entra ID. Those are product-specific security/authorization details that qualify as expert knowledge under the security category.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/resource-types,Choose a resource type,Choose an Azure resource type for Foundry (classic) - Microsoft Foundry (classic),Choose the right Azure resource type for Foundry,Learn about the supported Azure resource types in Microsoft Foundry portal. (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. 
Microsoft Foundry portal and SDK clients support multiple Azure resource types, each designed for different development and operational needs. This article helps you choose the right resource type for your AI development scenario.",2026-02-27T23:08:00.000Z,concept-article,decision-making,0.7,True,Article explicitly helps choose between supported resource types; likely includes comparison tables and scenario-based recommendations for different development/operational needs.,unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/retrieval-augmented-generation,Retrieval Augmented Generation (RAG) overview,Retrieval augmented generation (RAG) and indexes in Microsoft Foundry (classic) - Microsoft Foundry (classic),,Learn how retrieval augmented generation (RAG) uses indexes and grounding data to improve response accuracy in generative AI apps. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Retrieval augmented generation (RAG) is a pattern that combines search with large language models (LLMs) so responses are grounded in your data. This article explains how RAG works in Microsoft Foundry, what role indexes play, and how agentic retrieval changes classic RAG patterns. LLMs are trained on public data available at training time. 
If you need answers based on your private data, or on frequen",2026-02-27T23:08:00.000Z,concept-article,,0.3,False,"Conceptual explanation of RAG and indexes; summary suggests high-level pattern description without concrete configuration tables, limits, or product-specific parameters.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/safety-evaluations-transparency-note,Transparency Note for safety evaluations,Microsoft Foundry risk and safety evaluations (preview) Transparency Note (classic) - Microsoft Foundry (classic) portal,,"Microsoft Foundry safety evaluations intended purpose, capabilities, limitations and how to achieve the best performance. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews.",2026-04-10T22:08:00.000Z,concept-article,,0.2,False,"The transparency note describes purpose, capabilities, limitations, and usage guidance for safety evaluations at a conceptual level. From the available summary, it doesn't indicate specific numeric limits, configuration parameters, error codes, or decision matrices with quantified trade-offs. 
It appears to be high-level risk/safety documentation rather than detailed configuration, limits, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/vulnerability-management,Vulnerability management,Vulnerability management (classic) - Microsoft Foundry (classic),Understand vulnerability management for Microsoft Foundry images,"Learn how Microsoft Foundry manages vulnerabilities in images that the service provides, and how you can get the latest security updates for the components that you manage. (classic)","Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important This article provides legacy support for hub-based projects. It will not work forFoundry projects. SeeHow do I know which type of project I have? Vulnerability management is the process of detecting, assessing, mitigating, and reporting security vulnerabilities in an organization's systems and software. You and Microsoft share this responsibility. This arti",2026-02-27T23:08:00.000Z,concept-article,security,0.6,True,Describes how Foundry manages vulnerabilities in service-provided images and how customers should update components; likely includes product-specific patching and update guidance.,unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/faq,Foundry (classic) FAQ,Microsoft Foundry (classic) frequently asked questions - Microsoft Foundry,Resolve common issues in Microsoft Foundry (classic),Get answers to the most popular questions about Microsoft Foundry (classic).,"FAQ forMicrosoft Foundry. If you can't find answers to your questions in this document, and still need help check theFoundry Tools support options guide. 
Azure OpenAI is part of Foundry Tools.",2026-04-14T22:13:00Z,faq,troubleshooting,0.65,True,"As an FAQ for a specific Azure service, this page is likely to include concrete Q&A about common problems, including specific error messages or behaviors and their resolutions, which fits the troubleshooting pattern more than generic overview content.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/faq,Foundry (classic) FAQ,Microsoft Foundry (classic) frequently asked questions - Microsoft Foundry,Resolve common issues in Microsoft Foundry (classic),Get answers to the most popular questions about Microsoft Foundry (classic).,"FAQ forMicrosoft Foundry. If you can't find answers to your questions in this document, and still need help check theFoundry Tools support options guide. Azure OpenAI is part of Foundry Tools.",2026-04-14T22:13:00Z,faq,troubleshooting,0.65,True,"As an FAQ for a specific Azure service, this page is likely to include concrete Q&A about common problems, including specific error messages or behaviors and their resolutions, which fits the troubleshooting pattern more than generic overview content.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/concepts/content-filter,Content filtering for Foundry Models,Content filtering for Microsoft Foundry Models (classic) - Microsoft Foundry (classic),,"Learn how Microsoft Foundry Models filter harmful content in prompts and completions, including configuration and API scenarios. (classic)",Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Microsoft Foundryincludes a content filtering system that works alongside core models and image generation models and is powered byAzure AI Content Safety. This system runs both the prompt and completion through an ensemble of classification models designed to detect and prevent the output of harmful content. 
The content filtering system detects and takes action on s,2026-02-27T23:08:00.000Z,concept-article,,0.4,False,"Explains how Foundry Models filter harmful content conceptually. Summary does not show concrete configuration parameter tables, numeric thresholds, or detailed API parameter references beyond general behavior.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/concepts/default-safety-policies,Default safety policies for Foundry Models,Default Guardrails & controls policies for Microsoft Foundry Models (classic) - Microsoft Foundry (classic) portal,Use default safety policies for Foundry Models,Learn how Microsoft Foundry Models applies default safety policies and content filtering to help ensure responsible AI usage. (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Note Some links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Microsoft Foundry Models applies default safety to all models, excluding audio models such as Whisper in Azure OpenAI in Foundry Models. These configurations provide you with a responsible experience by defau",2026-03-19T22:16:00.000Z,concept-article,security,0.7,True,"Explains default guardrails, safety policies, and content filtering applied to Foundry Models. These are product-specific responsible AI controls and default configurations, which fall under security (safety/compliance configuration) rather than generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/concepts/deployment-types,Deployment types,Understanding deployment types in Microsoft Foundry Models (classic) - Microsoft Foundry (classic),Choose Microsoft Foundry deployment types and residency,"Compare Microsoft Foundry deployment types including Global Standard, Provisioned, DataZone, and Batch. 
Learn about data residency, pricing, and when to use each type. (classic)","Currently viewing: Foundry (classic) portal version-Switch to version for the new Foundry portal When you deploy a model in Microsoft Foundry, you choose a deployment type that determines: The service offers two main categories: standard (pay-per-token) and provisioned (reserved capacity). Within each category, you can choose global, data zone, or regional processing based on your compliance requirements. Important Data residency for all deployment types: Data stored at rest remains in the designated",2026-02-27T23:08:00.000Z,concept-article,decision-making,0.85,True,"Details Global Standard, Provisioned, DataZone, Batch with data residency and pricing implications, guiding concrete deployment-type selection.",unchanged @@ -111,12 +111,12 @@ https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/concepts/ https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/concepts/model-versions,Model versions,Model versioning in Microsoft Foundry Models (classic) - Microsoft Foundry (classic) portal,Plan and manage model versioning in Foundry Models,"Learn how model versions work in Microsoft Foundry Models (classic). Understand update policies, deployment options, and how to manage version upgrades effectively.","Currently viewing: Foundry (classic) portal version-Switch to version for the new Foundry portal Note Some links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Microsoft Foundry Models regularly release new model versions that incorporate the latest features and improvements from model providers. This article explains how model versioning works, what update policies are available for your deployments",2026-03-26T06:04:00.000Z,concept-article,decision-making,0.7,True,"Covers how model versions work, update policies, deployment options, and managing version upgrades.
This is guidance for choosing update policies and upgrade paths (for example, auto-upgrade vs pinned versions) and likely includes scenario-based recommendations, making it decision-making rather than just conceptual.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/concepts/model-versions,Model versions,Model versioning in Microsoft Foundry Models (classic) - Microsoft Foundry (classic) portal,Plan and manage model versioning in Foundry Models,"Learn how model versions work in Microsoft Foundry Models (classic). Understand update policies, deployment options, and how to manage version upgrades effectively.","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Some links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Microsoft Foundry Models regularly release new model versions that incorporate the latest features and improvements from model providers. This article explains how model versioning works, what update policies are available for your deployments",2026-03-26T06:04:00.000Z,concept-article,decision-making,0.7,True,"Covers how model versions work, update policies, deployment options, and managing version upgrades. This is guidance for choosing update policies and upgrade paths (for example, auto-upgrade vs pinned versions) and likely includes scenario-based recommendations, making it decision-making rather than just conceptual.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/concepts/models-from-partners,Configure Models from Partners and Community,Foundry Models from partners and community (classic) - Microsoft Foundry (classic) portal,,"Learn about Microsoft Foundry Models from partners and community, including their capabilities, supported input and output types, and language support for AI applications. 
(classic)","Currently viewing: Foundry (classic) portal version-Switch to version for the new Foundry portal Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Microsoft Foundry Models in the model catalog comprise two main categories, namely Foundry Models sold directly by Azure and Foundry Models from partners and community. -This article lists a selection of Foundry Models from partners and community, alon",2026-04-18T06:07:00.000Z,partner-tools,,0.3,False,"Duplicate of index 2: partner and community models listing with capabilities; no indication of detailed limits, configuration parameters, or troubleshooting mappings.",updated +This article lists a selection of Foundry Models from partners and community, alon",2026-04-18T06:07:00.000Z,partner-tools,,0.3,False,"Duplicate of index 2: partner and community models listing with capabilities; no indication of detailed limits, configuration parameters, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/concepts/models-from-partners,Foundry Models from partners and community,Foundry Models from partners and community (classic) - Microsoft Foundry (classic) portal,,"Learn about Microsoft Foundry Models from partners and community, including their capabilities, supported input and output types, and language support for AI applications. (classic)","Currently viewing: Foundry (classic) portal version-Switch to version for the new Foundry portal Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Microsoft Foundry Models in the model catalog comprise two main categories, namely Foundry Models sold directly by Azure and Foundry Models from partners and community.
-This article lists a selection of Foundry Models from partners and community, alon",2026-04-18T06:07:00.000Z,partner-tools,,0.3,False,"Catalog-style listing of partner and community models with capabilities and language support; summary shows no quotas, config tables, or error/diagnostic content.",updated +This article lists a selection of Foundry Models from partners and community, alon",2026-04-18T06:07:00.000Z,partner-tools,,0.3,False,"Catalog-style listing of partner and community models with capabilities and language support; summary shows no quotas, config tables, or error/diagnostic content.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/concepts/models-sold-directly-by-azure,Foundry Models sold directly by Azure,Foundry Models sold directly by Azure (classic) - Microsoft Foundry (classic) portal,,"Learn about Microsoft Foundry Models sold directly by Azure, their capabilities, deployment types, and regional availability for AI applications. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Microsoft Foundry Models in the model catalog comprise two main categories, namelyFoundry Models sold directly by AzureandFoundry Models from partners and community. 
-This article lists a selection of Foundry Models sold directly by Azure, along wit",2026-04-17T22:08:00.000Z,product-comparison,,0.3,False,"Appears to be a catalog-style listing of models sold directly by Azure with capabilities and availability; summary does not indicate numeric limits, configuration parameters, or troubleshooting details.",updated -https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/faq,Foundry Models FAQ,Microsoft Foundry Models frequently asked questions - Microsoft Foundry,,Get answers to the most popular questions about Foundry Models,"If you can't find answers to your questions in this document, and still need help check theFoundry Tools support options guide.",2026-04-14T22:13:00Z,faq,,0.1,False,"FAQ summary is high level; no indication of numeric limits, config tables, error codes, or other detailed product-specific guidance. Likely general Q&A and support pointers.",updated +This article lists a selection of Foundry Models sold directly by Azure, along wit",2026-04-17T22:08:00.000Z,product-comparison,,0.3,False,"Appears to be a catalog-style listing of models sold directly by Azure with capabilities and availability; summary does not indicate numeric limits, configuration parameters, or troubleshooting details.",unchanged +https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/faq,Foundry Models FAQ,Microsoft Foundry Models frequently asked questions - Microsoft Foundry,,Get answers to the most popular questions about Foundry Models,"If you can't find answers to your questions in this document, and still need help check theFoundry Tools support options guide.",2026-04-14T22:13:00Z,faq,,0.1,False,"FAQ summary is high level; no indication of numeric limits, config tables, error codes, or other detailed product-specific guidance. 
Likely general Q&A and support pointers.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/configure-content-filters,Configure content filtering,How to configure content filters for models in Microsoft Foundry (classic) - Microsoft Foundry (classic),Configure Foundry model content filters and gated changes,"Learn to use and configure the content filters that come with Microsoft Foundry, including getting approval for gated modifications. (classic)","Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Important Azure AI Inference beta SDK is deprecated and will be retired on May 30, 2026. Switch to the generally available OpenAI/v1 API with a stable OpenAI SDK. Follow the migration guide to switch to OpenAI/v1, using the SDK for your preferred programming language. The content filtering system integrated into Microsoft Foundry runs alongside Foundry Models. It uses an",2026-02-27T23:08:00.000Z,how-to,configuration,0.7,True,"How-to for configuring content filters, including approval for gated modifications. Likely includes specific filter setting names, allowed values, and configuration flows unique to Foundry, which are configuration details rather than generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/configure-deployment-policies,Custom policies for model deployment,Control model deployment with custom policies (classic) - Microsoft Foundry (classic),Create custom Azure Policies to restrict Foundry model deployments,Learn how to use custom Azure Policies to control Microsoft Foundry and Azure OpenAI in Foundry Models deployment with Microsoft Foundry. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal.
When you deploy models in Microsoft Foundry or Azure OpenAI, you might need Azure Policy to control whichdeployment typesare available to users or which specific models they can deploy. This article shows you how to create a custom Azure Policy definition that denies non-approved model deployments. Tip The steps in this article apply to both a Foundry project and hub",2026-02-27T23:08:00.000Z,how-to,security,0.7,True,"Shows how to author custom Azure Policy definitions that deny non-approved model deployments and deployment types for Foundry and Azure OpenAI. Involves product-specific policy JSON structure and fields, which are concrete security configuration details.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/configure-entra-id,Configure key-less authentication,Configure keyless authentication with Microsoft Entra ID (classic) - Microsoft Foundry (classic),Configure keyless Entra ID authentication for Foundry models,"Learn how to configure keyless authentication with Microsoft Entra ID for Microsoft Foundry Models. Eliminate the need for API keys, enhance security with RBAC, and simplify compliance. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal This article explains how to configure keyless authentication with Microsoft Entra ID for Microsoft Foundry Models. 
Keyless authentication enhances security by eliminating the need for API keys, simplifies the user experience with role-based access control (RBAC), and reduces operational complexity while providing robust compliance support.",2026-02-27T23:08:00.000Z,how-to,security,0.85,True,"Keyless auth setup includes app registration, scopes, audience/resource IDs, and RBAC role requirements for Foundry Models, which are detailed security configuration patterns.",unchanged @@ -131,9 +131,9 @@ https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/qu https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/use-chat-multi-modal,Work with multimodal models,How to use image and audio in chat completions with Microsoft Foundry Models (classic) - Microsoft Foundry (classic),Process images and audio in Foundry chat completions,Learn how to process audio and images with chat completions models with Microsoft Foundry Models (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important Azure AI Inference beta SDK is deprecated and will be retired on May 30, 2026. Switch to the generally availableOpenAI/v1 APIwith a stable OpenAI SDK. Follow themigration guideto switch to OpenAI/v1, using the SDK for your preferred programming language. Important Items marked (preview) in this article are currently in public preview. 
This preview is provid",2026-02-27T23:08:00.000Z,how-to,integrations,0.7,True,"Details how to send image/audio inputs to chat models via Foundry, including request formats and parameters.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/use-chat-reasoning,Work with reasoning models,How to use reasoning models with Microsoft Foundry Models (classic) - Microsoft Foundry (classic),Use reasoning models with Foundry Models service,Learn how to use reasoning capabilities from models with Microsoft Foundry Models (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important Azure AI Inference beta SDK is deprecated and will be retired on May 30, 2026. Switch to the generally availableOpenAI/v1 APIwith a stable OpenAI SDK. Follow themigration guideto switch to OpenAI/v1, using the SDK for your preferred programming language. Important Items marked (preview) in this article are currently in public preview. This preview is provid",2026-02-27T23:08:00.000Z,how-to,integrations,0.7,True,"Shows how to invoke reasoning models via Foundry APIs/SDKs, including configuration of reasoning-specific parameters.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/use-embeddings,Work with text embeddings,How to generate embeddings with Microsoft Foundry Models service (classic) - Microsoft Foundry (classic),Generate text embeddings with Foundry Models,Learn how to generate embeddings with Microsoft Foundry Models (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important Azure AI Inference beta SDK is deprecated and will be retired on May 30, 2026. Switch to the generally availableOpenAI/v1 APIwith a stable OpenAI SDK. Follow themigration guideto switch to OpenAI/v1, using the SDK for your preferred programming language. 
Important Items marked (preview) in this article are currently in public preview. This preview is provid",2026-02-27T23:08:00.000Z,how-to,integrations,0.7,True,"Shows how to call embeddings endpoints for Foundry models, including API parameters and usage patterns.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/use-foundry-models-claude,Claude in Foundry Models,Deploy and use Claude models in Microsoft Foundry (classic) - Microsoft Foundry (classic) portal,,"Deploy Anthropic's Claude models in Microsoft Foundry to integrate advanced conversational AI into your apps. Learn how to use Claude Opus, Sonnet, and Haiku models. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Anthropic's Claude models bring advanced conversational AI capabilities to Microsoft Foundry, enabling you to build intelligent applications with state-of-the-art language understanding and generation. Claude models excel at complex reasoning, code",2026-04-16T16:35:00.000Z,how-to,,0.4,False,"Deploy/use Claude models article appears to be a usage tutorial; summary does not show configuration parameter tables, quotas, or error-code-based troubleshooting specific to the product.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/use-foundry-models-claude,Claude in Foundry Models,Deploy and use Claude models in Microsoft Foundry (classic) - Microsoft Foundry (classic) portal,,"Deploy Anthropic's Claude models in Microsoft Foundry to integrate advanced conversational AI into your apps. Learn how to use Claude Opus, Sonnet, and Haiku models. 
(classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Anthropic's Claude models bring advanced conversational AI capabilities to Microsoft Foundry, enabling you to build intelligent applications with state-of-the-art language understanding and generation. Claude models excel at complex reasoning, code",2026-04-16T16:35:00.000Z,how-to,,0.4,False,"Deploy/use Claude models article appears to be a usage tutorial; summary does not show configuration parameter tables, quotas, or error-code-based troubleshooting specific to the product.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/use-foundry-models-flux,FLUX models in Foundry Models,Deploy and use FLUX models in Microsoft Foundry (Classic) - Microsoft Foundry (classic) portal,Deploy and use FLUX image models in Foundry,Deploy Black Forest Labs FLUX image generation models in Microsoft Foundry to generate and edit high-quality images from text prompts and reference images. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Some links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Black Forest Labs (BFL) FLUX models bring state-of-the-art image generation to Microsoft Foundry, enabling you to generate and edit high-quality images from text prompts and reference images. 
In this article, you learn how to: FLUX models are ",2026-04-03T06:06:00.000Z,how-to,integrations,0.7,True,"Covers deploying and using FLUX image models; such pages typically include model-specific parameters, request formats, and constraints for this integration.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/use-foundry-models-mai,Microsoft AI (MAI) in Foundry Models,Deploy and use MAI models in Microsoft Foundry (classic) - Microsoft Foundry (classic) portal,,"Deploy MAI-Image-2 and MAI-Image-2e, text-to-image generation models in Microsoft Foundry, to generate high-quality images from natural language prompts. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. MAI-Image-2 and MAI-Image-2e are text-to-image generation models that create high-quality, visually rich images from natural language prompts. In this article, you learn how to:",2026-04-14T12:58:00.000Z,how-to,,0.4,False,"Deploy/use MAI image models article looks like a how-to guide; summary mentions learning how to use the models but not detailed limits, configuration matrices, or troubleshooting content.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/use-foundry-models-mai,Microsoft AI (MAI) in Foundry Models,Deploy and use MAI models in Microsoft Foundry (classic) - Microsoft Foundry (classic) portal,,"Deploy MAI-Image-2 and MAI-Image-2e, text-to-image generation models in Microsoft Foundry, to generate high-quality images from natural language prompts. 
(classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. MAI-Image-2 and MAI-Image-2e are text-to-image generation models that create high-quality, visually rich images from natural language prompts. In this article, you learn how to:",2026-04-14T12:58:00.000Z,how-to,,0.4,False,"Deploy/use MAI image models article looks like a how-to guide; summary mentions learning how to use the models but not detailed limits, configuration matrices, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/use-image-embeddings,Work with image embeddings,How to generate image embeddings with Microsoft Foundry Models service (classic) - Microsoft Foundry (classic),Generate image embeddings with Foundry Models,Learn how to generate image embeddings with Microsoft Foundry Models (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. 
This article e",2026-02-27T23:08:00.000Z,how-to,integrations,0.7,True,"Explains how to generate image embeddings via Foundry APIs, including model support and request configuration.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/use-structured-outputs,Use structured outputs,How to use structured outputs for chat models (classic) - Microsoft Foundry (classic),Use structured outputs with Foundry chat models,Learn how to use structure outputs with chat completions with Microsoft Foundry Models (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important Azure AI Inference beta SDK is deprecated and will be retired on May 30, 2026. Switch to the generally availableOpenAI/v1 APIwith a stable OpenAI SDK. Follow themigration guideto switch to OpenAI/v1, using the SDK for your preferred programming language. Important Items marked (preview) in this article are currently in public preview. This preview is provid",2026-02-27T23:08:00.000Z,how-to,integrations,0.7,True,"Explains structured output configuration for chat models, including schema/parameter usage specific to Foundry implementations.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/quotas-limits,Foundry Models quotas and limits,Microsoft Foundry Models quotas and limits (classic) - Microsoft Foundry (classic) portal,Reference quotas and limits for Foundry Models,"Learn about quotas, rate limits, and best practices for Foundry Models, including per-model token and request limits, client timeouts, and how to request increases. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Some links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. 
This article provides a quick reference and detailed description of the quotas and limits forFoundry Models sold directly by Azure. For quotas and limits specific to the Azure OpenAI in Foundry Models, seeQuotas and limits in Azure OpenAI.",2026-03-26T06:04:00.000Z,concept-article,limits-quotas,0.95,True,"Explicitly described as a quotas and limits reference, including per-model token/request limits and client timeouts—service-specific numeric constraints that match the limits-quotas criteria.",unchanged @@ -163,7 +163,7 @@ https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/create-hub-terraf https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/create-manage-compute,Create and manage compute,Create and manage compute instances (classic) - Microsoft Foundry (classic) portal,Configure and manage Foundry compute instances,"Learn how to create and manage compute instances in Foundry portal to use prompt flow, create indexes, and access Visual Studio Code. (classic)","Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Note Some links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. In this article, you learn how to create and manage a compute instance in Foundry portal. A compute instance is required to use prompt flow, create indexes, or access Visual Studio Code in Foundry portal for ",2026-03-02T18:11:00.000Z,how-to,configuration,0.68,True,"A 'how to create and manage compute instances' article for a specific portal typically documents instance-size options, region constraints, and configuration parameters (for prompt flow, index creation, VS Code access). 
These are product-specific settings and options that go beyond generic knowledge, fitting the configuration sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/create-manage-compute-session,Create and manage compute session,Create and manage prompt flow compute sessions (classic) - Microsoft Foundry (classic),Configure and manage prompt flow compute sessions,"In this article, learn how to create and manage compute sessions to run prompt flows in Microsoft Foundry portal. (classic)","Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. You need a com",2026-02-27T23:08:00.000Z,how-to,configuration,0.7,True,"How-to for creating/managing compute sessions is likely to include specific settings (SKU choices, timeouts, auto-shutdown, region constraints) and configuration parameters unique to prompt flow compute.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/create-projects,Create a project,Create a project (classic) - Microsoft Foundry (classic) portal,,This article describes how to create a Microsoft Foundry project so you can work with generative AI in the cloud. (classic),"Use this article to create a Foundry project and confirm that your environment is ready before you start building agents, evaluations, and files. Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal This article describes how to create a Foundry project inMicrosoft Foundry. 
Projects let you organize your work—such as agents, evaluations, and files—as you build stateful apps and explore new ideas. AFoundry projectis managed under a Microsoft Foundry reso",2026-04-08T08:00:00.000Z,how-to,,0.2,False,"Task-focused how-to for creating a Foundry project; no indication of numeric limits, configuration parameter tables, error-code mappings, or other product-specific expert details beyond standard portal usage.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/create-resource-template,Create using Bicep template,Quickstart: Create a Foundry resource using Bicep (classic) - Microsoft Foundry (classic) portal,,Learn how to use a Bicep file (template) to create a Microsoft Foundry resource in your Azure subscription. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal In this quickstart, you deploy aMicrosoft Foundryresource and project by using aMicrosoft Biceptemplate. Bicep helps you create related resources in one coordinated deployment and reuse the same configuration across environments. If you already configured a Foundry resource in the Azure portal, you canexport that configuration as a Bicep fileinstead of authoring a template from scratch. Tip For produc",2026-04-17T17:16:00.000Z,quickstart,,0.2,False,"This is a quickstart showing how to deploy a Foundry resource and project using a Bicep template. 
From the summary it appears to be a step-by-step tutorial without configuration parameter tables, limits, or product-specific constraints; it focuses on basic deployment mechanics that an LLM can already generalize, so it does not meet the expert-knowledge criteria for any sub-skill type.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/create-resource-template,Create using Bicep template,Quickstart: Create a Foundry resource using Bicep (classic) - Microsoft Foundry (classic) portal,,Learn how to use a Bicep file (template) to create a Microsoft Foundry resource in your Azure subscription. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal In this quickstart, you deploy aMicrosoft Foundryresource and project by using aMicrosoft Biceptemplate. Bicep helps you create related resources in one coordinated deployment and reuse the same configuration across environments. If you already configured a Foundry resource in the Azure portal, you canexport that configuration as a Bicep fileinstead of authoring a template from scratch. Tip For produc",2026-04-17T17:16:00.000Z,quickstart,,0.2,False,"This is a quickstart showing how to deploy a Foundry resource and project using a Bicep template. From the summary it appears to be a step-by-step tutorial without configuration parameter tables, limits, or product-specific constraints; it focuses on basic deployment mechanics that an LLM can already generalize, so it does not meet the expert-knowledge criteria for any sub-skill type.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/create-resource-terraform,Create using Terraform template,Use Terraform to create Microsoft Foundry (classic) - Microsoft Foundry (classic) portal,Provision Microsoft Foundry (classic) with Terraform,"In this article, you create a Microsoft Foundry resource, a Microsoft Foundry project, using Terraform infrastructure as code templates. 
(classic)","Currently viewing: Foundry (classic) portal version-Switch to version for the new Foundry portal Use Terraform to automate the creation of Microsoft Foundry resources, projects, deployments, and connections. You can use either the Terraform AzAPI Provider or AzureRM Provider to manage Foundry resources. The AzAPI provider lets you access all Foundry control plane configurations including preview features. The AzureRM variant is limited to core management capabilities. The following table shows which ac",2026-04-08T22:11:00.000Z,how-to,configuration,0.68,True,"The page describes using Terraform (AzAPI and AzureRM providers) to create Foundry resources, projects, deployments, and connections. Terraform-based resource creation for a specific Azure service typically includes resource type names, required/optional arguments, and provider-specific configuration blocks that are product-specific and not generally known to an LLM. This aligns best with the configuration sub-skill, as it focuses on concrete IaC configuration rather than generic deployment or conceptual guidance.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/create-secure-ai-hub,Create a hub in the Azure portal,Create a secure hub (classic) - Microsoft Foundry (classic),Create a secure Microsoft Foundry hub with managed VNet,Create a Microsoft Foundry hub inside a managed virtual network. The managed virtual network secures access to managed resources such as computes. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Important This article provides legacy support for hub-based projects. It will not work for Foundry projects. See How do I know which type of project I have? You can secure your Microsoft Foundry hub, projects, and managed resources by using a managed virtual network.
By using a managed virtual network, you can only allow inbound access through a private endpoint for you",2026-03-02T18:11:00.000Z,how-to,security,0.8,True,"Focuses on managed virtual network, private endpoints, and access restrictions; contains product-specific network security configuration details.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/custom-policy-definition,Create custom policy definitions,Create a custom Azure Policy for Foundry (classic) - Microsoft Foundry (classic),Create custom Azure Policy definitions for Foundry resources,"Learn how to use custom Azure policies to enable self-service resource management in your organization, while applying guardrails and constraints on allowed configurations to meet security and complia","Currently viewing: Foundry (classic) portal version - Switch to version for the new Foundry portal Learn how to use custom Azure policies to enable teams to self-manage Microsoft Foundry resources. Apply guardrails and constraints on allowed configurations so you can provide flexibility while meeting security and compliance requirements. By using custom policies, you can:",2026-03-13T22:11:00.000Z,how-to,security,0.7,True,"Custom policy article will show Foundry-specific resource types, fields, and policy rule examples to enforce security/compliance, which are detailed governance/security configurations.",unchanged
This article walks through practical scenarios, from basic prompt agents to tool-enabled workflows.",2026-03-06T23:10:00.000Z,how-to,integrations,0.7,True,"Describes using langchain-azure-ai package; likely includes SDK configuration, client options, and integration patterns specific to Foundry.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/llama-index,Develop with Llama-Index,Develop application with LlamaIndex and Microsoft Foundry (classic) - Microsoft Foundry (classic),Integrate LlamaIndex with Microsoft Foundry models,This article explains how to use LlamaIndex with models deployed in Microsoft Foundry portal to build advanced intelligent applications. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. In this article, you learn how to use LlamaIndex with models deployed from the model catalog in Microsoft Foundry. You can use models deployed to Microsoft Foundry with LlamaIndex in two ways: Using the model's provider-specific API: Some models, like OpenAI, Cohere, or Mistral, offer their own set of APIs and extensions for LlamaIndex. Those extensions might include spec",2026-02-27T23:08:00.000Z,how-to,integrations,0.8,True,"Explains using LlamaIndex with Foundry models via provider-specific APIs and Azure AI Model Inference, including configuration details.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/run-ai-red-teaming-cloud,Run red teaming scans in the cloud,Run AI Red Teaming Agent in the cloud (Microsoft Foundry SDK) (classic) - Microsoft Foundry (classic),Run AI Red Teaming Agent in the cloud with Foundry SDK,This article provides instructions on how to use the AI Red Teaming Agent to run an automated scan in the cloud of a Generative AI application with the Microsoft Foundry SDK.
(classic),"Currently viewing: Foundry (classic) portal version - Switch to version for the new Foundry portal Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Though the AI Red Teaming Agent (preview) can be ",2026-02-27T23:08:00.000Z,how-to,integrations,0.75,True,"Cloud-based scans via Foundry SDK; includes product-specific SDK configuration, parameters, and possibly resource requirements.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/run-scans-ai-red-teaming-agent,Run red teaming scans locally,Run AI Red Teaming Agent Locally (Azure AI Evaluation SDK) (classic) - Microsoft Foundry (classic) portal,Run AI Red Teaming Agent locally with Azure AI SDK,Learn how to use the AI Red Teaming Agent to run a local automated scan of a Generative AI application with the Azure AI Evaluation SDK. (classic),"Currently viewing: Foundry (classic) portal version - Switch to version for the new Foundry portal Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews.
The AI Red Teaming Agent (preview) is a powerful ",2026-03-21T06:03:00.000Z,how-to,integrations,0.68,True,"How-to page for using the AI Red Teaming Agent via the Azure AI Evaluation SDK; likely includes product-specific SDK usage, configuration parameters, and code patterns for running local scans against generative AI apps, which qualifies as integration-focused expert knowledge beyond generic concepts.",updated -https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/sdk-overview,Microsoft Foundry SDKs,Get started with Microsoft Foundry SDKs and Endpoints (classic) - Microsoft Foundry (classic),Choose Microsoft Foundry SDKs and endpoints,This article provides an overview of the Microsoft Foundry SDKs and endpoints and how to get started using them. (classic),"Currently viewing: Foundry (classic) portal version - Switch to version for the new Foundry portal A Foundry resource provides unified access to models, agents, and tools. This article explains which SDK and endpoint to use for your scenario. Choose your SDK: Note Resource types: A Foundry resource provides all endpoints previously listed. An Azure OpenAI resource provides only the /openai/v1 endpoint. Authentication: Samples here use Microsoft Entra ID (DefaultAzureCredential). API keys work on /openai",2026-02-27T23:08:00.000Z,how-to,decision-making,0.65,True,"Explains which SDK and endpoint to use for specific scenarios and resource types, providing concrete selection guidance beyond generic concepts.",unchanged
(classic),"Currently viewing: Foundry (classic) portal version - Switch to version for the new Foundry portal Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. The AI Red Teaming Agent (preview) is a powerful ",2026-03-21T06:03:00.000Z,how-to,integrations,0.68,True,"How-to page for using the AI Red Teaming Agent via the Azure AI Evaluation SDK; likely includes product-specific SDK usage, configuration parameters, and code patterns for running local scans against generative AI apps, which qualifies as integration-focused expert knowledge beyond generic concepts.",unchanged +https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/sdk-overview,Microsoft Foundry SDKs,Get started with Microsoft Foundry SDKs and Endpoints (classic) - Microsoft Foundry (classic) portal,Select and use Microsoft Foundry SDKs and endpoints,This article provides an overview of the Microsoft Foundry SDKs and endpoints and how to get started using them. (classic),"Currently viewing: Foundry (classic) portal version - Switch to version for the new Foundry portal A Foundry resource provides unified access to models, agents, and tools. This article explains which SDK and endpoint to use for your scenario. Choose your SDK: Note Resource types: A Foundry resource provides all endpoints previously listed. An Azure OpenAI resource provides only the /openai/v1 endpoint. Authentication: Samples here use Microsoft Entra ID (DefaultAzureCredential).
API keys work on /openai",2026-04-14T22:13:00.000Z,how-to,integrations,0.68,True,"The page goes beyond a conceptual overview and explains which specific SDKs and endpoints to use for different scenarios, including product-specific endpoint paths (/openai/v1 vs others) and authentication patterns (DefaultAzureCredential vs API keys). These are concrete integration details and patterns unique to Microsoft Foundry, fitting the integrations category.",updated https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/semantic-kernel,Develop with Semantic Kernel,Develop Applications with Semantic Kernel and Microsoft Foundry (classic) - Microsoft Foundry (classic),Use Semantic Kernel with Foundry model catalog,Learn how to develop applications with Semantic Kernel and Microsoft Foundry with models deployed from the Foundry model catalog. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. In this article, you learn how to use Semantic Kernel with models deployed from the Foundry model catalog in Microsoft Foundry portal. Important Azure AI Inference beta SDK is deprecated and will be retired on May 30, 2026. Switch to the generally available OpenAI/v1 API with a stable OpenAI SDK.
Follow the migration guide to switch to OpenAI/v1, using the SDK for your pre",2026-03-02T18:11:00.000Z,how-to,integrations,0.8,True,"Describes how to configure Semantic Kernel to call Foundry models, including SDK migration notes and product-specific integration settings.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/simulator-interaction-data,Generate synthetic and simulated data for evaluation,Generate Synthetic and Simulated Data for Evaluation (classic) - Microsoft Foundry (classic),Generate synthetic and simulated data for Foundry evaluation,This article provides instructions on how to generate synthetic data to run simulations to evaluate the performance and safety of your generative AI application. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Note The Azure",2026-02-27T23:08:00.000Z,how-to,configuration,0.65,True,"How-to for generating synthetic data for evaluation; likely includes SDK calls, parameters, and dataset schema requirements that are specific to Foundry’s evaluation system.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/trace-agents-sdk,Trace and observe agents,Trace and Observe AI Agents in Microsoft Foundry (classic) - Microsoft Foundry (classic),Trace and observe Foundry agents with OpenTelemetry,"Trace and Observe AI Agents in Microsoft Foundry using OpenTelemetry. Learn to see execution traces, debug performance, and monitor AI agent behavior step-by-step. (classic)","Applies only to: Foundry (classic) portal.
This article isn't available for the new Foundry portal. Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. In this articl",2026-03-18T08:00:00.000Z,how-to,configuration,0.75,True,"Explains how to emit and view traces for agents; likely includes exporter configuration, attribute names, and trace structure specific to Foundry.",unchanged @@ -206,15 +206,17 @@ https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/fine-tune-managed https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/fine-tune-serverless,Fine-tune models deployed via serverless API,Deploy Fine-Tuned Models with Serverless API in Microsoft Foundry (classic) - Microsoft Foundry (classic),Deploy fine-tuned models via serverless API in Foundry,"Deploy fine-tuned models using serverless API in Microsoft Foundry. Learn how to fine-tune, train, and deploy custom large language models with cost-effective serverless options. (classic)","Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews.
Learn how to d",2026-02-27T23:08:00.000Z,how-to,deployment,0.7,True,"Serverless deployment of fine-tuned models; likely includes deployment options, constraints, and possibly tier-specific behavior for serverless hosting.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-bulk-test-evaluation,Submit batch run and evaluate a flow,Submit batch run and evaluate a flow (classic) - Microsoft Foundry (classic),,Learn how to submit batch run and use built-in evaluation methods in prompt flow to evaluate how well your flow performs with a large dataset with Microsoft Foundry. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. To evaluate ho",2026-02-27T23:08:00.000Z,how-to,,0.4,False,Batch run and evaluation how-to; likely procedural without explicit config tables or numeric thresholds in the summary.,unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-deploy,Deploy a flow for real-time inference,Deploy a Flow as a Managed Online Endpoint for Real-Time Inference (classic) - Microsoft Foundry (classic),Deploy prompt flows as managed online endpoints,Learn how to deploy a flow as a managed online endpoint for real-time inference with Microsoft Foundry. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads.
Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. After you buil",2026-02-27T23:08:00.000Z,how-to,deployment,0.7,True,"Deployment article for managed online endpoints typically includes endpoint configuration options, supported SKUs, and deployment constraints specific to Foundry.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-develop,Create a flow,How to build with prompt flow (classic) - Microsoft Foundry (classic),,This article provides instructions on how to build with prompt flow. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Prompt flow is",2026-02-27T23:08:00.000Z,how-to,,0.3,False,"General how-to for building with prompt flow; summary does not indicate detailed configuration tables, limits, or troubleshooting content.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-develop-evaluation,Develop an evaluation flow in Prompt flow,Develop an evaluation flow (classic) - Microsoft Foundry (classic),,"Learn how to customize or create your own evaluation flow tailored to your tasks and objectives, and then use in a batch run as an evaluation method in prompt flow with Microsoft Foundry. (classic)","Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview.
This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Evaluation flo",2026-02-27T23:08:00.000Z,how-to,,0.4,False,Developing an evaluation flow is mostly design and workflow; summary does not show concrete configuration matrices or limits.,unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-process-image,Process images in a flow,Process images in prompt flow (classic) - Microsoft Foundry (classic),,Learn how to use images in prompt flow. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Multimodal Lar",2026-02-27T23:08:00.000Z,how-to,,0.3,False,Processing images in prompt flow is a feature how-to; summary does not indicate detailed configuration parameters or numeric constraints.,unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-tune-prompts-using-variants,Tune prompts using variants,Tune prompts using variants (classic) - Microsoft Foundry (classic),,Learn how to tune prompts using variants in Prompt flow with Microsoft Foundry. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview.
This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. In this articl",2026-02-27T23:08:00.000Z,how-to,,0.3,False,"Tuning prompts using variants appears to be workflow guidance; summary does not suggest product-specific limits, config tables, or error mappings.",unchanged +https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-develop,Create a flow,How to build with prompt flow (classic) - Microsoft Foundry (classic) portal,Build Prompt Flow workflows in Foundry classic,This article provides instructions on how to build with prompt flow. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode.
Your existing flows will continue to",2026-04-20T22:11:00.000Z,how-to,integrations,0.65,True,"A 'how to build' article for Prompt Flow is likely to include concrete tool/node parameters, flow configuration fields, and product-specific coding patterns for orchestrating LLM calls, which qualify as integration patterns rather than just a tutorial.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-develop-evaluation,Develop an evaluation flow in Prompt flow,Develop an evaluation flow (classic) - Microsoft Foundry (classic) portal,Develop custom evaluation flows in Prompt Flow classic,"Learn how to customize or create your own evaluation flow tailored to your tasks and objectives, and then use it in a batch run as an evaluation method in prompt flow with Microsoft Foundry. (classic)","Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to",2026-04-20T22:11:00.000Z,how-to,best-practices,0.6,True,"Creating evaluation flows tailored to tasks implies concrete guidance on structuring flows, using specific evaluation tools, and wiring metrics in Prompt Flow, which are product-specific DO/DON'T patterns rather than generic theory.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-process-image,Process images in a flow,Process images in prompt flow (classic) - Microsoft Foundry (classic) portal,Process images with Prompt Flow in Foundry classic,Learn how to use images in prompt flow. (classic),"Applies only to: Foundry (classic) portal.
This article isn't available for the new Foundry portal. Learn more about the new portal. Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to",2026-04-20T22:11:00.000Z,how-to,integrations,0.65,True,"Image processing in Prompt Flow will require specifying tool parameters (e.g., vision model selection, input formats, fields) that are product-specific integration details for Azure OpenAI vision models inside Prompt Flow.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-tune-prompts-using-variants,Tune prompts using variants,Tune prompts using variants (classic) - Microsoft Foundry (classic) portal,,Learn how to tune prompts using variants in Prompt flow with Microsoft Foundry. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode.
Your existing flows will continue to",2026-04-20T22:11:00.000Z,how-to,,0.4,False,"Tuning prompts with variants is mostly methodology; summary does not indicate product-specific configuration tables, limits, or error mappings beyond generic prompt engineering concepts.",updated https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/healthcare-ai/deploy-cxrreportgen,CXRReportGen model for text generation,Deploy CXRReportGen Healthcare AI Model in Foundry (classic) - Microsoft Foundry (classic),Deploy and use CXRReportGen healthcare AI model,"Learn how to deploy and use the CXRReportGen healthcare AI model with Microsoft Foundry to generate grounded findings from chest X-ray studies. Follow step-by-step instructions to deploy, configure, a","Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Important The healthcare AI models are intended for research and model development exploration. The models are not designed or intended to be deployed in clinical settings as-is nor for use in the diagnosis or treatment of any health or medical condition, and the individual models’ performances for such purposes have not been established. You bear sole responsibility",2026-02-27T23:08:00.000Z,how-to,integrations,0.65,True,"Covers deployment, configuration, and invocation of a specific model endpoint; expected to include concrete REST/SDK parameters and payload formats unique to this model.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/healthcare-ai/deploy-medimageinsight,MedImageInsight model for embedding,Deploy MedImageInsight for Medical Image Embeddings (classic) - Microsoft Foundry (classic),Deploy and invoke MedImageInsight embeddings model,"Deploy MedImageInsight to generate medical image embeddings for X-Ray, CT, MRI, and more. Get step-by-step deployment guidance and API examples. (classic)","Applies only to: Foundry (classic) portal.
This article isn't available for the new Foundry portal. Learn more about the new portal. Important The healthcare AI models are intended for research and model development exploration. The models are not designed or intended to be deployed in clinical settings as-is nor for use in the diagnosis or treatment of any health or medical condition, and the individual models’ performances for such purposes have not been established. You bear sole responsibility",2026-02-27T23:08:00.000Z,how-to,integrations,0.65,True,"Step-by-step deployment plus API examples for a specific healthcare model; likely includes endpoint paths, request/response schemas, and model-specific parameters that qualify as product-specific integration patterns.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/healthcare-ai/deploy-medimageparse,MedImageParse models for prompted segmentation,MedImageParse: Medical Image Segmentation Models (classic) - Microsoft Foundry (classic),Use MedImageParse medical image segmentation models,Learn how to use MedImageParse and MedImageParse 3D healthcare AI models for medical image segmentation with Microsoft Foundry. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Important The healthcare AI models are intended for research and model development exploration. The models are not designed or intended to be deployed in clinical settings as-is nor for use in the diagnosis or treatment of any health or medical condition, and the individual models’ performances for such purposes have not been established.
You bear sole responsibility",2026-02-27T23:08:00.000Z,how-to,integrations,0.65,True,"Model-specific how-to for MedImageParse and MedImageParse 3D; likely documents input formats, parameters, and API usage patterns that are not generic LLM knowledge.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/healthcare-ai/healthcare-ai-models,Foundational AI models for healthcare,Healthcare AI foundation models (classic) - Microsoft Foundry (classic),,"Explore healthcare AI foundation models in Microsoft Foundry for medical imaging, genomics, and clinical data analysis. Deploy multimodal AI models to build healthcare solutions. (classic)","Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Important The healthcare AI models are intended for research and model development exploration. The models are not designed or intended to be deployed in clinical settings as-is nor for use in the diagnosis or treatment of any health or medical condition, and the individual models’ performances for such purposes have not been established. You bear sole responsibility",2026-02-27T23:08:00.000Z,concept-article,,0.3,False,"Appears to be a conceptual overview of healthcare AI foundation models and usage caveats, not detailed configuration, limits, or troubleshooting content.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/high-availability-resiliency,High availability and resiliency,High availability and resiliency for Microsoft Foundry projects and Agent Services (classic) - Microsoft Foundry (classic),Design high availability and resiliency for Foundry projects,Learn how to plan for high availability and resiliency for Microsoft Foundry projects and Agent Service. (classic),"Currently viewing: Foundry (classic) portal version - Switch to version for the new Foundry portal Important Items marked (preview) in this article are currently in public preview.
This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Plan ahead to maintain business continuity and pr",2026-02-27T23:08:00.000Z,how-to,architecture-patterns,0.7,True,"HA/resiliency article will include region/deployment patterns, recommended topologies, and possibly RPO/RTO guidance specific to Foundry projects and Agent Service, which are architecture patterns.",unchanged +https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/high-availability-resiliency,High availability & disaster recovery,High availability and resiliency for Microsoft Foundry projects and Agent Services (classic) - Microsoft Foundry (classic) portal,Design high availability for Foundry Agent Service,Learn how to plan for high availability and resiliency for Microsoft Foundry projects and Agent Service. (classic),"Currently viewing: Foundry (classic) portal version - Switch to version for the new Foundry portal Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Plan ahead to maintain business continuity and pr",2026-04-21T06:08:00.000Z,how-to,best-practices,0.68,True,"High availability and resiliency guidance for a specific service is typically concrete and product-specific (for example, recommended regional deployment patterns, failover configurations, and service behaviors under failure).
This goes beyond generic HA concepts and is likely to include Foundry-specific recommendations and edge cases, fitting best under best-practices.",new +https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/how-to-deploy-migrated-agent-framework-workflow,Deploy and cut over,Deploy and operate your migrated Agent Framework workflow for Foundry (classic) - Microsoft Foundry,Deploy migrated Agent Framework workflows to Azure,"Set up tracing, deploy your Microsoft Agent Framework workflow to Azure Container Apps, add a CI/CD quality gate, and cut over production traffic from Prompt Flow.","Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action: Migrate your Prompt Flow workloads to Microsoft Agent Framework before April 20, 2027. This article covers the operations and cutover steps of the Prompt Flow to Microsoft Agent Framework migration: setting up tracing, deploying to Azure Contai",2026-04-20T22:11:00.000Z,how-to,deployment,0.75,True,"Focuses on deploying Agent Framework workflows to Azure Container Apps, setting up tracing, CI/CD quality gates, and cutover steps; these are product-specific deployment and operations patterns.",new +https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/how-to-migrate-prompt-flow-to-agent-framework,Rebuild and validate,"Audit, rebuild, and validate your Prompt Flow workflow in Microsoft Agent Framework for Foundry (classic) - Microsoft Foundry",Rebuild Prompt Flow workflows with Agent Framework,"Export your Prompt Flow, re-implement it with Microsoft Agent Framework WorkflowBuilder and Executor classes, and validate output parity using the Azure AI Evaluation SDK.","Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027.
On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. Recommended action: Migrate your Prompt Flow workloads to Microsoft Agent Framework before April 20, 2027. This article walks you through the first three phases of the Prompt Flow to Microsoft Agent Framework migration for Microsoft Foundry users: auditing your ex",2026-04-20T22:11:00.000Z,how-to,integrations,0.7,True,"Covers exporting flows and re-implementing them using WorkflowBuilder and Executor classes plus Azure AI Evaluation SDK; this is detailed, product-specific coding and integration guidance.",new https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/hub-configure-private-link,How to configure a private link,How to configure a private link for a Microsoft Foundry hub (classic) - Microsoft Foundry (classic),Configure Private Link for Microsoft Foundry hubs,Learn how to configure a private link for Microsoft Foundry hubs. A private link is used to secure communication with the Microsoft Foundry hub. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Important This article provides legacy support for hub-based projects. It will not work for Foundry projects. See How do I know which type of project I have? SDK compatibility note: Code examples require a specific Microsoft Foundry SDK version. If you encounter compatibility issues, consider migrating from a hub-based to a Foundry project. 
Tip An alternate Foundry proj",2026-02-27T23:08:00.000Z,how-to,security,0.7,True,"Private Link configuration articles typically include specific resource types, endpoint names, DNS zone patterns, and required settings that are product-specific security/network configuration details.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/hub-connections-add,Create and manage connections,Create and manage connections (Hubs) (classic) - Microsoft Foundry (classic),Create and manage hub-scoped connections in Foundry,Learn how to use connections in Microsoft Foundry hubs. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Tip For more information, see Add a new connection to your project (Foundry projects). By using connections in Microsoft Foundry hubs, you can securely integrate external resources and services, such as Foundry Tools and other Azure data services. This article covers hub-scoped connection tasks.",2026-02-27T23:08:00.000Z,how-to,integrations,0.7,True,"Connection management for hubs will include connection types, required parameters, and portal/API fields for integrating external services, which are concrete integration patterns specific to Foundry hubs.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/hub-create-projects,Create a hub project,Create a hub project (classic) - Microsoft Foundry (classic),,Learn how to create a hub-based project in Microsoft Foundry. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Important This article provides legacy support for hub-based projects. It will not work for Foundry projects. See How do I know which type of project I have? SDK compatibility note: Code examples require a specific Microsoft Foundry SDK version. 
If you encounter compatibility issues, consider migrating from a hub-based to a Foundry project. Tip An alternate Foundry proj",2026-02-27T23:08:00.000Z,how-to,,0.4,False,"Creating a hub project is likely a step-by-step portal workflow without detailed configuration tables, limits, or product-specific troubleshooting; mostly procedural getting-started content.",unchanged @@ -229,17 +231,18 @@ these policies to control what models your developers can deploy in the Foundry https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/model-inference-to-openai-migration,Migrate from Azure AI Inference SDK to OpenAI SDK,How to migrate from Azure AI Inference SDK to OpenAI SDK (classic) - Microsoft Foundry (classic) portal,Migrate Azure AI Inference SDK apps to OpenAI SDK,Learn how to migrate to OpenAI SDK from Azure AI Inference SDK for enhanced compatibility and unified APIs when working with Microsoft Foundry Models. (classic),"Currently viewing: Foundry (classic) portal version - Switch to version for the new Foundry portal Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. This article provides guidance on migrating your applications from the Azure AI Inference SDK to the OpenAI SDK. 
The OpenAI SDK offers broader compatibility, access to the latest OpenAI features, and simplified code with unified patterns across Azu",2026-03-26T06:04:00.000Z,how-to,integrations,0.65,True,Migration guide between two specific SDKs likely includes side-by-side API/parameter mappings and product-specific code patterns that qualify as integration-focused expert knowledge.,unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/monitor-applications,Continuously monitor your applications,Monitor your Generative AI Applications (classic) - Microsoft Foundry (classic),Configure continuous monitoring for Foundry AI applications,This article provides instructions on how to continuously monitor Generative AI Applications. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Monitoring you",2026-02-27T23:08:00.000Z,how-to,configuration,0.65,True,"Monitoring how-to; typically includes configuration of metrics, logging, and evaluator settings specific to Foundry’s monitoring features.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/monitor-quality-safety,Monitor prompt flow deployments,Monitor Quality and Token Usage of Deployed Prompt Flow Applications (Preview) (classic) - Microsoft Foundry (classic),Monitor quality and token usage for prompt flow apps,Learn how to monitor quality and token usage of deployed prompt flow applications with Microsoft Foundry. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. 
Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. Monitoring app",2026-02-27T23:08:00.000Z,how-to,configuration,0.65,True,"Monitoring article is expected to describe specific metrics, dashboards, and configuration options for tracking token usage and quality in Foundry.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/azure-open-ai-gpt-4v-tool,Azure OpenAI GPT-4 Turbo with Vision tool,Azure OpenAI GPT-4 Turbo with Vision tool in Microsoft Foundry portal (classic) - Microsoft Foundry (classic),Use GPT-4 Turbo with Vision tool in prompt flows,This article introduces you to the Azure OpenAI GPT-4 Turbo with Vision tool for flows in Microsoft Foundry portal. (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. 
By using the p",2026-03-11T22:16:00.000Z,concept-article,configuration,0.75,True,"Tool article for GPT-4 Turbo with Vision will document parameters like image input formats, size constraints, and model/deployment configuration unique to this integration.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/content-safety-tool,Content Safety tool,Content Safety tool for flows in Microsoft Foundry portal (classic) - Microsoft Foundry (classic),Configure the Content Safety tool in prompt flows,This article introduces you to the Content Safety tool for flows in Microsoft Foundry portal. (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. The prompt flo",2026-03-12T17:22:00.000Z,concept-article,security,0.7,True,"Content Safety tool article is expected to document safety categories, threshold parameters, and configuration options specific to Azure AI Content Safety.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/embedding-tool,Embedding tool,Embedding tool for flows in Microsoft Foundry portal (classic) - Microsoft Foundry (classic),Configure the Embedding tool in Foundry prompt flows,This article introduces you to the Embedding tool for flows in Microsoft Foundry portal. (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview. 
This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. The prompt flo",2026-03-12T17:22:00.000Z,concept-article,configuration,0.7,True,"Embedding tool documentation will include parameters like embedding model, vector dimensions, and deployment configuration unique to this product.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/index-lookup-tool,Index Lookup tool,Index Lookup tool for flows in Microsoft Foundry portal (classic) - Microsoft Foundry (classic),Configure the Index Lookup tool for RAG flows,This article introduces you to the Index Lookup tool for flows in Microsoft Foundry portal. (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. The prompt flo",2026-03-11T22:16:00.000Z,concept-article,configuration,0.75,True,"Index Lookup tool article likely includes configuration fields (index resource, search parameters, topK, filters) and allowed values specific to Foundry.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/llm-tool,LLM tool,LLM tool for flows in Microsoft Foundry portal (classic) - Microsoft Foundry (classic),Configure the LLM tool in Foundry prompt flows,This article introduces you to the large language model (LLM) tool for flows in Microsoft Foundry portal. 
(classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. To use large l",2026-03-11T22:16:00.000Z,article,configuration,0.7,True,"Tool-specific article for the LLM tool will typically document parameters (model name, temperature, max tokens, deployment names) and allowed values unique to this product.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/prompt-flow-tools-overview,Prompt flow tools overview,Overview of prompt flow tools in Microsoft Foundry portal (classic) - Microsoft Foundry (classic),,Learn about prompt flow tools that are available in Microsoft Foundry portal. (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. 
The following ",2026-03-11T22:16:00.000Z,article,,0.2,False,Overview of tools is descriptive; overviews typically list capabilities but not detailed configuration parameters or limits.,unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/prompt-tool,Prompt tool,Prompt tool for flows in Microsoft Foundry portal (classic) - Microsoft Foundry (classic),Configure the Prompt tool in Foundry prompt flows,This article introduces you to the Prompt tool for flows in Microsoft Foundry portal. (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. The prompt flo",2026-03-12T17:22:00.000Z,concept-article,configuration,0.7,True,"Prompt tool article is expected to list tool settings, input/output bindings, and parameter options specific to Foundry prompt flow.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/python-tool,Python tool,Python tool for flows in Microsoft Foundry portal (classic) - Microsoft Foundry (classic),Configure the Python tool in Foundry prompt flows,This article introduces you to the Python tool for flows in Microsoft Foundry portal. (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. 
For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. The prompt flo",2026-02-27T23:08:00.000Z,concept-article,configuration,0.7,True,"Python tool documentation usually includes configuration fields (environment, inputs/outputs, package requirements) and constraints specific to this environment.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/rerank-tool,Rerank tool,Rerank tool for flows in Microsoft Foundry portal (classic) - Microsoft Foundry (classic),Configure the Rerank tool to improve search results,This article introduces you to the Rerank tool for flows in Microsoft Foundry portal. (classic),Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. The prompt flow Rerank tool improves the search quality of relevant documents given a query for retrieval-augment generation (RAG) in prompt flow. This tool works best with theIndex Look up toolas a ranker after the initial retrieval. Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agr,2026-03-12T17:22:00.000Z,concept-article,configuration,0.7,True,"Rerank tool documentation will describe tool parameters (model, topN, scoring options) and constraints specific to Foundry prompt flow.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/serp-api-tool,Serp API tool,Serp API tool for flows in Microsoft Foundry portal (classic) - Microsoft Foundry (classic),Use the Serp API tool within Foundry prompt flows,This article introduces you to the Serp API tool for flows in Microsoft Foundry portal. (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important Items marked (preview) in this article are currently in public preview. 
This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, seeSupplemental Terms of Use for Microsoft Azure Previews. The prompt flo",2026-03-12T17:22:00.000Z,concept-article,integrations,0.75,True,"Serp API tool integrates an external search API; article will include API key configuration, endpoint/parameter mapping, and constraints specific to this integration.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-troubleshoot,Troubleshoot prompt flow,Troubleshoot Guidance for Prompt Flow (classic) - Microsoft Foundry (classic),Troubleshoot compute and usage issues in prompt flow,This article addresses frequently asked questions about prompt flow usage. Learn how to deal with compute session related issues. (classic),Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. This article addresses frequently asked questions about prompt flow usage.,2026-03-11T22:16:00.000Z,concept-article,troubleshooting,0.8,True,"Explicit troubleshooting guidance for prompt flow; likely includes common errors, causes, and resolutions for compute session and usage problems.",unchanged +https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-migration-overview,Migration overview,Migrate from Prompt Flow to Microsoft Agent Framework in Foundry (classic) - Microsoft Foundry,Plan migration from Prompt Flow to Agent Framework,Learn how to migrate your Prompt Flow workflows to Microsoft Agent Framework using FoundryChatClient and Foundry project endpoints.,"Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to operate until that date. 
Recommended action: Migrate your Prompt Flow workloads to Microsoft Agent Framework before April 20, 2027. Prompt Flow is a development tool that streamlines the entire development cycle of AI applications powered by large language models (LLMs). Prompt Flow provi",2026-04-20T22:11:00.000Z,overview,decision-making,0.65,True,"Migration overview will compare Prompt Flow and Agent Framework, outline when and how to migrate, and provide scenario-based guidance and constraints, which supports technology selection and migration decisions.",new +https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/azure-open-ai-gpt-4v-tool,Azure OpenAI GPT-4 Turbo with Vision tool,Azure OpenAI GPT-4 Turbo with Vision tool in Microsoft Foundry portal (classic) - Microsoft Foundry (classic) portal,Configure GPT-4 Turbo with Vision tool in Prompt Flow,This article introduces you to the Azure OpenAI GPT-4 Turbo with Vision tool for flows in Microsoft Foundry portal. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. 
Your existing flows will continue to",2026-04-20T22:11:00.000Z,concept-article,integrations,0.7,True,"This tool article will describe how to call Azure OpenAI GPT-4 Turbo with Vision from Prompt Flow, including parameters, deployment references, and image input handling, which are product-specific integration details.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/content-safety-tool,Content Safety tool,Content Safety tool for flows in Microsoft Foundry portal (classic) - Microsoft Foundry (classic) portal,Apply Content Safety tool in Prompt Flow classic,This article introduces you to the Content Safety tool for flows in Microsoft Foundry portal. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to",2026-04-20T22:11:00.000Z,concept-article,security,0.6,True,"Content Safety tool documentation typically includes category flags, severity thresholds, and configuration parameters for safety evaluation, which are product-specific security/safety settings rather than generic advice.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/embedding-tool,Embedding tool,Embedding tool for flows in Microsoft Foundry portal (classic) - Microsoft Foundry (classic) portal,Use the Embedding tool in Prompt Flow classic,This article introduces you to the Embedding tool for flows in Microsoft Foundry portal. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. 
Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to",2026-04-20T22:11:00.000Z,concept-article,integrations,0.7,True,"Embedding tool article will specify model/deployment parameters, vector output handling, and integration with downstream tools, which are concrete SDK/tool configuration patterns.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/index-lookup-tool,Index Lookup tool,Index Lookup tool for flows in Microsoft Foundry portal (classic) - Microsoft Foundry (classic) portal,Use the Index Lookup tool in Prompt Flow classic,This article introduces you to the Index Lookup tool for flows in Microsoft Foundry portal. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. 
Your existing flows will continue to",2026-04-20T22:11:00.000Z,concept-article,integrations,0.7,True,"Index Lookup tool documentation will cover configuration fields (index name, search parameters, connection settings) and how it integrates with Azure AI Search or similar, which are concrete integration patterns.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/llm-tool,LLM tool,LLM tool for flows in Microsoft Foundry portal (classic) - Microsoft Foundry (classic) portal,Use the LLM tool in Prompt Flow classic,This article introduces you to the large language model (LLM) tool for flows in Microsoft Foundry portal. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to",2026-04-20T22:11:00.000Z,article,integrations,0.7,True,"An article introducing the LLM tool for flows will typically document tool parameters (model name, temperature, max tokens, deployment references) and how they map to Azure OpenAI, which are product-specific integration settings.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/prompt-flow-tools-overview,Prompt flow tools overview,Overview of prompt flow tools in Microsoft Foundry portal (classic) - Microsoft Foundry (classic) portal,,Learn about prompt flow tools that are available in Microsoft Foundry portal. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. 
Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to",2026-04-20T22:11:00.000Z,article,,0.3,False,"An overview of Prompt Flow tools is likely descriptive and navigational, listing tools without deep configuration tables or error mappings.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/prompt-tool,Prompt tool,Prompt tool for flows in Microsoft Foundry portal (classic) - Microsoft Foundry (classic) portal,Configure the Prompt tool in Prompt Flow classic,This article introduces you to the Prompt tool for flows in Microsoft Foundry portal. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. 
Your existing flows will continue to",2026-04-20T22:11:00.000Z,concept-article,integrations,0.65,True,"The Prompt tool article is expected to define fields, variable bindings, and configuration options unique to Prompt Flow nodes, which are concrete integration/config patterns rather than generic concepts.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/python-tool,Python tool,Python tool for flows in Microsoft Foundry portal (classic) - Microsoft Foundry (classic) portal,Use the Python tool in Prompt Flow classic,This article introduces you to the Python tool for flows in Microsoft Foundry portal. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to",2026-04-20T22:11:00.000Z,concept-article,integrations,0.7,True,"Python tool documentation usually includes function signatures, input/output schema, environment constraints, and parameter names specific to Prompt Flow, which are integration and coding patterns.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/rerank-tool,Rerank tool,Rerank tool for flows in Microsoft Foundry portal (classic) - Microsoft Foundry (classic) portal,Configure the Rerank tool for RAG in Prompt Flow,This article introduces you to the Rerank tool for flows in Microsoft Foundry portal. (classic),Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. 
Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. The prompt flow Rerank tool improves the search quality of relevant documents given a query for retrieval-augmented generation (RAG) in prompt flow. This tool works best with the Index Lookup tool as a ranker after t,2026-04-20T22:11:00.000Z,concept-article,integrations,0.7,True,"Rerank tool article describes how to wire it after Index Lookup, including parameters like topN, scoring options, and model selection; these are specific integration settings for Prompt Flow RAG pipelines.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/serp-api-tool,Serp API tool,Serp API tool for flows in Microsoft Foundry portal (classic) - Microsoft Foundry (classic) portal,Integrate Serp API tool with Prompt Flow classic,This article introduces you to the Serp API tool for flows in Microsoft Foundry portal. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. 
Your existing flows will continue to",2026-04-20T22:11:00.000Z,concept-article,integrations,0.65,True,"Serp API tool documentation will include API key configuration, query parameters, and response handling fields specific to Prompt Flow, which are integration details not covered by generic knowledge.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-troubleshoot,Troubleshoot prompt flow,Troubleshoot Guidance for Prompt Flow (classic) - Microsoft Foundry (classic) portal,Troubleshoot Prompt Flow compute and usage issues,This article addresses frequently asked questions about prompt flow usage. Learn how to deal with compute session related issues. (classic),"Applies only to: Foundry (classic) portal. This article isn't available for the new Foundry portal. Learn more about the new portal. Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Warning Prompt Flow feature development ended on April 20, 2026. The feature will be fully retired on April 20, 2027. On the retirement date, Prompt Flow enters read-only mode. Your existing flows will continue to",2026-04-20T22:11:00.000Z,concept-article,troubleshooting,0.8,True,"Explicitly a troubleshooting guide; likely organized around common Prompt Flow errors (compute session failures, timeouts) with causes and resolutions unique to Foundry Prompt Flow.",updated https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/quota,Manage quotas for Foundry resources,Manage and increase quotas for resources (classic) - Microsoft Foundry (classic),Manage and increase model deployment quotas in Foundry,"Learn how to view, manage, and request increases for model deployment quotas in Microsoft Foundry, including token-per-minute and provisioned throughput allocations. 
(classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Tip An alternate hub-focused quota article is available: Manage and increase quotas for hub resources. Quota provides the flexibility to actively manage the allocation of rate limits across the deployments within your subscription. Azure assigns quota per subscription, per region, and per model in units of tokens per minute (TPM). Different deployment types, such as Standard and Provisioned, have diffe",2026-02-27T23:08:00.000Z,how-to,limits-quotas,0.9,True,"Quota article explicitly covers TPM units, per-subscription/region/model allocations, and differences between Standard and Provisioned deployments; it will contain numeric limits and quota behaviors.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/secure-data-playground,Secure playground chat,Securely use playground chat (classic) - Microsoft Foundry (classic),Secure configuration for Foundry playground chat on your data,Learn how to securely use the Microsoft Foundry portal playground chat on your own data. (classic),Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important This article provides legacy support for hub-based projects. It will not work for Foundry projects. See How do I know which type of project I have? Use this article to learn how to securely use Microsoft Foundry's playground chat on your data. The following sections provide our recommended configuration to protect your data and resources by using Microsoft Ent,2026-02-27T23:08:00.000Z,how-to,security,0.7,True,"Provides recommended configuration to protect data and resources when using playground chat, likely including specific Entra ID, permissions, and network/security settings. 
This is product-specific security guidance, not just conceptual RAI content.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/set-up-key-vault-connection,Create a Key Vault,Set up an Azure Key Vault Connection (classic) - Microsoft Foundry (classic),Set up Azure Key Vault connection for Foundry,Learn how to securely connect your Azure Key Vault to Foundry. Follow step-by-step instructions to manage secrets and ensure seamless integration. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal If you don't set up a Key Vault connection, Microsoft Foundry stores connection details in a Microsoft-managed Key Vault outside your subscription. To manage your own secrets, connect your Azure Key Vault to Foundry. Before you begin, review the limitations for Key Vault connections. Azure Key Vault is a cloud service for securely storing and accessing secrets. A secret is anything that you want to tigh",2026-02-27T23:08:00.000Z,how-to,security,0.8,True,"Key Vault connection setup includes specific permissions, identity configuration, vault URI formats, and Foundry-specific behavior when using customer-managed secrets, which are concrete security configuration details.",unchanged @@ -268,29 +271,29 @@ https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/customiz https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/default-safety-policies,Default safety policies,Default Guardrail policies for Azure OpenAI (classic) - Microsoft Foundry (classic) portal,Understand default Guardrail safety policies in Azure OpenAI,Learn about the default Guardrail policies that Azure OpenAI uses to flag content and ensure responsible use of the service. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Azure OpenAI in Foundry Models includes default safety policies applied to all models (excluding Azure OpenAI Whisper). 
These configurations provide you with a responsible experience by default, including content filtering models, blocklists, prompt transformation, content credentials, and other features. Default safety aims to mitigate risks in different categories such as hate and fairness, sexual, vi",2026-03-19T22:16:00.000Z,concept-article,security,0.78,True,"Describes default Guardrail/safety policies applied to Azure OpenAI in Foundry classic, including content filtering models, blocklists, prompt transformation, and content credentials. These are product-specific security/safety configurations and policy behaviors that an LLM would not fully know from training, fitting the security category.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/fine-tuning-considerations,Fine-tuning,Microsoft Foundry fine-tuning considerations (classic) - Microsoft Foundry (classic),,Learn more about what you should take into consideration before fine-tuning with Microsoft Foundry. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Fine-tuning is the process of taking a pretrained language model and adapting it to perform a specific task or improve its performance on a particular dataset. This involves training the model on a smaller, task-specific dataset while adjusting the model's weights slightly. Fine-tuning leverages the knowledge the model acquired during its initial training on a large, diverse dataset, allowing it to sp",2026-02-27T23:08:00.000Z,concept-article,,0.3,False,"Fine-tuning considerations are likely conceptual guidance (data quality, overfitting, etc.) 
without product-specific numeric thresholds or configuration tables.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/gpt-4-v-prompt-engineering,Image prompt engineering techniques,Image prompt engineering techniques (classic) - Microsoft Foundry (classic),,Learn how to better engineer image prompts for vision-enabled chat models. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal To unlock the full potential of vision-enabled chat models, it's essential to tailor the prompts to your specific needs. Here are some guidelines to enhance the accuracy and efficiency of your prompts. Note These prompt engineering techniques apply to vision-enabled models including GPT-4 Turbo with Vision, GPT-4o, and GPT-4o-mini. To deploy a vision-enabled model, see Deploy models.",2026-02-27T23:08:00.000Z,concept-article,,0.4,False,Image prompt engineering techniques are general guidance for crafting prompts; no indication of product-specific configuration tables or numeric thresholds.,unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/gpt-with-vision,Vision-enabled models,Vision-enabled chat model concepts (classic) - Microsoft Foundry (classic) portal,,"Learn concepts for using images in your AI model chats, with GPT-4 Turbo with Vision and other models. (classic)",Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Vision-enabled chat models are large multimodal models (LMM) developed by OpenAI that analyze images and provide textual responses to questions about them. 
They incorporate both natural language processing and vis,2026-04-14T22:13:00.000Z,concept-article,,0.1,False,"Described as a concepts article explaining vision-enabled chat models and what they are. The summary indicates high-level conceptual information about multimodal models, not specific limits, configuration parameters, error codes, or decision matrices. It does not meet any of the expert-knowledge criteria.",updated -https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/legacy-models,Legacy models,Retired Azure OpenAI models in Microsoft Foundry (classic) - Microsoft Foundry (classic),Review retired Azure OpenAI model availability,Learn about retired Azure OpenAI models in Microsoft Foundry. (classic),Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Azure OpenAI offers a variety of models for different use cases. The following models are no longer available for deployment.,2026-03-13T22:11:00.000Z,concept-article,limits-quotas,0.65,True,"Retired models list is effectively a product-specific availability constraint; page likely enumerates specific model names and retirement dates, which are concrete limits on what can be deployed.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/model-retirements,Model retirements,Azure OpenAI in Microsoft Foundry Model Retirements (classic) - Microsoft Foundry (classic) portal,Azure OpenAI model availability and retirement schedule,Learn about model deprecations and retirements in Azure OpenAI. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Some links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Azure OpenAI models are continually refreshed with newer and more capable models. As part of this process, we deprecate and retire older models. 
This article provides information about models that are currently available, deprecated, and retir",2026-03-26T06:04:00.000Z,concept-article,limits-quotas,0.8,True,"A model retirements page typically lists specific models, their deprecation/retirement dates, and sometimes replacement guidance. These are time-bound, model-specific constraints (what is available until when) that function as limits/quotas-like data and are not inferable from generic training.",unchanged +https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/gpt-with-vision,Vision-enabled models,Vision-enabled chat model concepts (classic) - Microsoft Foundry (classic) portal,,"Learn concepts for using images in your AI model chats, with GPT-4 Turbo with Vision and other models. (classic)",Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Vision-enabled chat models are large multimodal models (LMM) developed by OpenAI that analyze images and provide textual responses to questions about them. They incorporate both natural language processing and vis,2026-04-14T22:13:00.000Z,concept-article,,0.1,False,"Described as a concepts article explaining vision-enabled chat models and what they are. The summary indicates high-level conceptual information about multimodal models, not specific limits, configuration parameters, error codes, or decision matrices. 
It does not meet any of the expert-knowledge criteria.",unchanged +https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/model-retirement-schedule,Foundry Models retirement schedule,Model retirement schedule (classic) - Microsoft Foundry (classic) portal,Use Foundry model retirement schedule for migrations,Retirement dates and replacement models for Microsoft Foundry Models.,"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. This article lists the retirement schedule for Foundry Models — their current lifecycle stage, retirement date, and suggested replacement. Use it to plan migrations before a model is deprecated or retired. For details on what each lifecycle stage m",2026-04-24T17:22:00.000Z,concept-article,decision-making,0.78,True,"A retirement schedule with current lifecycle stage, retirement dates, and suggested replacements is expert knowledge. It directly supports decision-making by providing concrete dates and replacement options to plan migrations, matching the decision-making criteria (comparison of options with timelines and recommendations).",new +https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/model-retirements,Foundry Models lifecycle and support policy,Foundry Models lifecycle and support policy (classic) - Microsoft Foundry (classic) portal,Plan around Foundry model lifecycle and support,"Learn about the Microsoft Foundry Models lifecycle and support policy, including model deprecations and retirements. Explore how these changes affect your deployments. 
(classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Microsoft Foundry Models move through a predictable lifecycle—from preview to general availability (GA) to eventual retirement—giving you time to evaluate replacements and migrate workloads. This article explains each lifecycle stage, the overlap c",2026-04-24T17:22:00.000Z,concept-article,decision-making,0.65,True,"Lifecycle and support policy content typically includes concrete timelines, overlap periods, and migration guidance between model versions. This helps users decide when and how to migrate workloads and select replacement models, which aligns with decision-making and contains product-specific policy details not inferable from general knowledge.",new https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/model-router,Overview,Model router for Microsoft Foundry concepts (classic) - Microsoft Foundry (classic) portal,,Learn about the model router feature in Azure OpenAI in Microsoft Foundry Models. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Model router is a trained language model that intelligently routes your prompts in real time to the most suitable large language model (LLM). You deploy model router like any other Foundry model. Thus, it delivers high performance while saving on costs, reducing latencies, and increasing responsiveness, while maintaining comparable quality, all packaged as a single model deployment. 
Note You do not ne",2026-03-24T22:15:00.000Z,concept-article,,0.3,False,"Concepts page describing what the model router is and its benefits; summary does not indicate detailed limits, configs, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/priority-processing,Priority processing,Enable priority processing for Microsoft Foundry Models (classic) - Microsoft Foundry (classic) portal,Enable and evaluate Foundry priority processing tiers,Learn how to enable priority processing for Microsoft Foundry models to achieve low latency and high availability for time-sensitive workloads. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Priority processing provides low-latency performance with the flexibility of pay-as-you-go. In this article, you enable priority processing on a model deployment, verify which service tier processed your requests, and monitor associated costs.",2026-03-24T17:13:00.000Z,how-to,decision-making,0.7,True,"Priority processing for model deployments (low latency, high availability, pay-as-you-go) typically involves choosing service tiers, understanding cost/performance trade-offs, and verifying which tier processed requests. This is tier/plan selection with quantified trade-offs and monitoring of associated costs, fitting decision-making.",unchanged +https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/priority-processing,Priority processing,Enable priority processing for Microsoft Foundry Models (classic) - Microsoft Foundry (classic) portal,Choose and enable priority processing for Foundry models,Learn how to enable priority processing for Microsoft Foundry models to achieve low latency and high availability for time-sensitive workloads. 
(classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Priority processing provides low-latency performance with the flexibility of pay-as-you-go. In this article, you enable priority processing on a model deployment, verify which service tier processed your requests, and monitor associated costs.",2026-04-14T22:13:00.000Z,how-to,decision-making,0.68,True,"The page is specific to Microsoft Foundry (classic) priority processing, describing how and when to enable a priority tier on a model deployment, how to verify which tier processed requests, and how to monitor associated costs. This is product- and tier-specific guidance that helps decide when to use priority processing versus standard, with cost/performance trade-offs, which fits the decision-making category better than generic configuration or limits. The content goes beyond conceptual marketing and includes concrete, service-specific operational guidance that an LLM is unlikely to know from training.",updated https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/prompt-engineering,Prompt engineering techniques,Prompt engineering techniques - Microsoft Foundry (classic),,Learn how to use prompt engineering to optimize your work with Azure OpenAI. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal These techniques aren't recommended for reasoning models like gpt-5 and o-series models. Prompt construction can be difficult. In practice, the prompt acts to help the model complete the desired task, but it's more of an art than a science, often requiring experience and intuition to craft a successful prompt. The goal of this article is to help get you started with this learning process. 
This article",2026-02-27T23:08:00.000Z,concept-article,,0.4,False,Prompt engineering techniques article is largely general best practices for prompts; not focused on Azure-specific configuration values or limits.,unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/prompt-transformation,Prompt transformations,Azure OpenAI prompt transformation concepts (classic) - Microsoft Foundry (classic) portal,Understand Azure OpenAI prompt transformation for image safety,"Learn about the prompt transformation feature in Azure OpenAI image generation models, how it works, and why it's necessary. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Some links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Important The DALL-E image generation modeldall-e-3was retired on March 4, 2026, and is no longer available for new deployments. Existing deployments are non-functional. Usegpt-image-1orgpt-image-1.5for image generation instead. See theimage g",2026-03-19T22:16:00.000Z,concept-article,security,0.65,True,"Explains prompt transformation for Azure OpenAI image models, including how and why prompts are transformed for safety/compliance. This is a product-specific safety mechanism tied to content moderation and responsible use, which aligns best with security-focused configuration/behavior rather than generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/provisioned-migration,Azure OpenAI PTU updates,Azure OpenAI Provisioned 2024 Updates (classic) - Microsoft Foundry (classic),Evaluate 2024 Azure OpenAI provisioned throughput updates,"Learn about the improvements to Provisioned Throughput including model-independent quota, hourly/reservation payment models, and global and data zone deployments. (classic)",Applies only to:Foundry (classic) portal. 
This article isn't available for the new Foundry portal.Learn more about the new portal. Microsoft launched improvements to its Provisioned Throughput offering that address customer feedback on usability and operational agility that open new payment options and deployment scenarios. This article is intended for existing users of the provisioned throughput offering. New customers should refer to the Azure OpenAI provisioned onboarding guide.,2026-02-27T23:08:00.000Z,how-to,decision-making,0.7,True,"Update article compares old vs new provisioned models (model-independent quota, payment options, deployment scenarios) and guides existing users on migration/choice, which is decision-making content.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/provisioned-throughput,What is the Provisioned Throughput offering (PTU)?,What Is Provisioned Throughput for Foundry Models? (classic) - Microsoft Foundry (classic),Understand provisioned throughput for Foundry models,Learn how provisioned throughput enables efficient deployment of Azure OpenAI and Foundry Models with stable latency and allocated capacity. Get started today. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Tip For more information on recent changes to the provisioned throughput offering, see the update article. The Microsoft Foundry provisioned throughput offering is a model deployment type that allows you to specify the amount of throughput you require in a model deployment. Foundry then allocates the necessary model processing capacity and ensures it's ready for you. 
Use the provisioned throughput you ",2026-03-02T18:11:00.000Z,concept-article,decision-making,0.7,True,"Provisioned throughput concept article for Foundry models will include PTU sizing, capacity allocation behavior, and when to choose provisioned vs standard, which are product-specific decision criteria.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/red-teaming,Red teaming large language models (LLMs),Planning red teaming for large language models (LLMs) and their applications (classic) - Microsoft Foundry (classic),,Learn about how red teaming and adversarial testing are an essential practice in the responsible development of systems and features using large language models (LLMs) (classic),Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal This guide offers some potential strategies for planning how to set up and manage red teaming for responsible AI (RAI) risks throughout the large language model (LLM) product life cycle.,2026-02-27T23:08:00.000Z,concept-article,,0.25,False,"Red teaming planning guide is process-oriented and conceptual, not product-specific configuration, limits, or error-code troubleshooting.",unchanged +https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/retired-models,Retired Foundry Models,Retired Microsoft Foundry Models (classic) - Microsoft Foundry (classic) portal,Identify retired Foundry models and alternatives,Retired Foundry Models: Find out which Microsoft Foundry Models are retired and explore alternatives (classic),Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Microsoft Foundry offers a variety of models for different use cases. The following models are retired and no longer available for use or for new deployments.,2026-04-24T17:22:00.000Z,concept-article,decision-making,0.62,True,"A list of retired models and their alternatives is specific, time-sensitive product data. 
It guides users in choosing replacement models and planning migrations away from unavailable ones, which fits decision-making around model selection and upgrade paths.",new https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/safety-system-message-templates,Safety system message templates,Safety system message templates (classic) - Microsoft Foundry (classic),Apply safety system message templates in Azure OpenAI,Use these safety system message templates as a starting point to reduce harmful and ungrounded outputs in your Azure OpenAI apps. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal This article contains recommended safety system messages for your generative AI systems to help reduce the propensity of harm in various concern areas. Before you begin evaluating and integrating your safety system messages, visit theSafety system message conceptual guideto get started. Note Using a safety system message is one of many techniques you can use to mitigate risks in AI systems. It’s diffe",2026-02-27T23:08:00.000Z,how-to,best-practices,0.7,True,"Contains recommended safety system message templates for various concern areas; these are concrete, reusable patterns specific to Azure OpenAI safety configuration.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/system-message,Safety system messages,Safety system messages (classic) - Microsoft Foundry (classic),Author safety-focused system messages in Foundry,"Learn how safety system messages (system prompts) guide Azure OpenAI model behavior, improve quality, and reduce risks in Microsoft Foundry. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Safety system messages help you guide an Azure OpenAI model’s behavior, improve response quality, and reduce the likelihood of harmful outputs. They work best as one layer in a broader safety strategy. 
Note This article uses ""system message"" interchangeably with ""metaprompt"" and ""system prompt."" Here, we use ""system message"" to align with common terminology. This article also uses ""component"" to mean ",2026-02-27T23:08:00.000Z,concept-article,best-practices,0.6,True,"Safety system messages article provides concrete templates and guidance to reduce harmful outputs, representing product-specific best practices for safety prompting.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/understand-embeddings,Embeddings,Azure OpenAI in Microsoft Foundry Models embeddings (classic) - Microsoft Foundry (classic),,Learn more about how the Azure OpenAI embeddings API uses cosine similarity for document search and to measure similarity between texts. (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. An embedding is a special format of data representation that machine learning models and algorithms can easily use. The embedding is an information dense representation of the semantic meaning of a piece of text. Each embedding is a vector of floating-point numbers, such that the distance between two embeddings in the vector space is correlated with semantic similari",2026-02-27T23:08:00.000Z,tutorial,,0.25,False,Conceptual explanation of embeddings and cosine similarity; generic ML knowledge without Azure-specific configuration or limits.,unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/use-your-data,Text data,Using your data with Azure OpenAI in Microsoft Foundry Models (classic) - Microsoft Foundry (classic),Plan using your data with Azure OpenAI deployments,Use this article to learn about using your data for better text generation in Azure OpenAI. (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. 
Important Azure OpenAI On Your Data is deprecated and approaching retirement. Microsoft has stopped onboarding new models to Azure OpenAI On Your Data. This feature only supports the following models: Once the GPT‑4.1 models retire, all Azure OpenAI On Your Data API endpoints and supported data source connectors stop functioning. We recommend that you migrate Azure O",2026-02-27T23:08:00.000Z,concept-article,decision-making,0.6,True,"Covers using your data with Azure OpenAI and deprecation/migration guidance; likely includes which models and connectors are supported and recommendations for migration, aiding deployment decisions.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/video-generation,Video generation (preview),Sora video generation overview (preview) (classic) - Microsoft Foundry (classic),,"Learn about Sora, an AI model for generating realistic and imaginative video scenes from text instructions, including safety, limitations, and supported features. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Sora is an AI model from OpenAI that creates realistic and imaginative video scenes from text instructions and/or input images or video. The model can generate a wide range of video content, including realistic scenes, animations, and special effects. It supports several video resolutions and durations.",2026-02-27T23:08:00.000Z,concept-article,,0.3,False,"Conceptual overview of Sora capabilities, safety, and limitations; no clear indication of parameter tables, limits, or configuration matrices.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/encrypt-data-at-rest,Encryption of data at rest,Azure OpenAI in Microsoft Foundry Models encryption of data at rest (classic) - Microsoft Foundry (classic),,Learn how Azure OpenAI encrypts your data when it's persisted to the cloud. (classic),"Applies only to:Foundry (classic) portal. 
This article isn't available for the new Foundry portal.Learn more about the new portal. Azure OpenAI automatically encrypts your data when it's persisted to the cloud. The encryption protects your data and helps you meet your organizational security and compliance commitments. This article covers how Azure OpenAI handles encryption of data at rest, specifically training data and fine-tuned models. For information on how data provided by you to the servi",2026-02-27T23:08:00.000Z,how-to,,0.45,False,Explains how data is encrypted at rest; likely high-level service behavior without concrete configuration parameters or limits.,unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/openai/faq,Azure OpenAI FAQ,Azure OpenAI frequently asked questions,Azure OpenAI FAQ with usage limits and policies,Get answers to the most popular questions about Azure OpenAI,,2026-04-14T22:13:00Z,faq,limits-quotas,0.7,True,"FAQ includes Azure OpenAI–specific numeric limits (for example, rate limits, token limits, request size constraints) and policy details that are deployment- and SKU-specific and not inferable from generic OpenAI knowledge.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/openai/faq,Azure OpenAI FAQ,Azure OpenAI frequently asked questions,Azure OpenAI FAQ with usage limits and policies,Get answers to the most popular questions about Azure OpenAI,,2026-04-14T22:13:00Z,faq,limits-quotas,0.7,True,"FAQ includes Azure OpenAI–specific numeric limits (for example, rate limits, token limits, request size constraints) and policy details that are deployment- and SKU-specific and not inferable from generic OpenAI knowledge.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/assistant,Getting started with Assistants,How to create Assistants with Azure OpenAI in Microsoft Foundry Models (classic) - Microsoft Foundry (classic),Create Azure OpenAI Assistants with tools in Foundry,Learn how to create helpful AI Assistants with 
tools like Code Interpreter. (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Note The Assistants API is deprecated and will be retired on August 26, 2026. Use the generally available Microsoft Foundry Agents service. Follow the migration guide to update your workloads.Learn more. Azure OpenAI Assistants (Preview) allows you to create AI assistants tailored to your needs through custom instructions and augmented by advanced tools like code interp",2026-02-27T23:08:00.000Z,how-to,integrations,0.6,True,"How-to for creating Assistants with tools like Code Interpreter; likely includes API parameters, request schemas, and configuration options specific to the Assistants feature.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/assistant-functions,Assistants function calling,How to use Azure OpenAI Assistants function calling (classic) - Microsoft Foundry (classic),Implement Assistants function calling in Azure OpenAI,Learn how to use Assistants function calling (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Note The Assistants API is deprecated and will be retired on August 26, 2026. Use the generally available Microsoft Foundry Agents service. Follow the migration guide to update your workloads.Learn more. 
The Assistants API supports function calling, which allows you to describe the structure of functions to an Assistant and then return the functions that need to be call",2026-02-27T23:08:00.000Z,how-to,integrations,0.65,True,"Function calling how-to for Assistants includes function schema definitions, parameter formats, and tool-calling behavior specific to this API.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/assistants-logic-apps,Function calling with Azure Logic Apps,Run Automated Workflows from Assistants - Microsoft Foundry (classic) portal,Trigger Azure Logic Apps from Foundry Assistants,"Run workflows in Azure Logic Apps from assistants in Microsoft Foundry Agent Service (classic). Integrate and automate business tasks by using 1,400+ connectors for services and systems without custom","Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Note Some links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Note The Assistants API is deprecated and will be retired on August 26, 2026. Use the generally availableMicrosoft Foundry Agents service. Follow themigration guideto update your workloads.Learn more. To exte",2026-03-20T22:11:00.000Z,how-to,integrations,0.72,True,"How-to for running Logic Apps workflows from Assistants in Foundry Agent Service classic. 
This is a concrete integration pattern between Assistants and Logic Apps, likely including connector configuration, parameters, and usage patterns that are product-specific, fitting integrations.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/batch,Getting started with batch,How to use global batch processing with Azure OpenAI in Microsoft Foundry Models (classic) - Microsoft Foundry (classic),Use Azure OpenAI global batch processing,Learn how to use global batch with Azure OpenAI (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal The Azure OpenAI Batch API is designed to handle large-scale and high-volume processing tasks efficiently. Process asynchronous groups of requests with separate quota, with 24-hour target turnaround, at50% less cost than global standard. With batch processing, rather than send one request at a time you send a large number of requests in a single file. Global batch requests have a separate enqueued tok",2026-02-27T23:08:00.000Z,how-to,limits-quotas,0.8,True,"Describes batch API with separate quota, 24-hour target turnaround, and 50% cost reduction; includes numeric behaviors and quota separation specific to batch processing.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/batch-blob-storage,Batch with Azure Blob Storage,How to configure Azure Blob Storage with Azure OpenAI Batch (classic) - Microsoft Foundry (classic),Configure Azure Blob Storage for OpenAI Batch I/O,Learn how to configure Azure Blob Storage with Azure OpenAI Batch (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Azure OpenAI now supports usingAzure Blob Storagefor Azure OpenAI Batch input and output files. 
By using your own storage, you aren't subject to the batch restrictions on the number of files.",2026-03-13T22:11:00.000Z,how-to,configuration,0.7,True,"Explains configuring Blob Storage for batch input/output; likely includes container names, SAS/identity configuration, and parameter settings unique to this integration.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/business-continuity-disaster-recovery,Business continuity & disaster recovery (BCDR),Business Continuity and Disaster Recovery (BCDR) with Azure OpenAI in Microsoft Foundry Models (classic) - Microsoft Foundry (classic),Design BCDR architecture for Azure OpenAI workloads,Considerations for implementing Business Continuity and Disaster Recovery (BCDR) with Azure OpenAI (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Azure OpenAI is available in multiple regions. When you create an Azure OpenAI resource, you specify a region. From then on, your resource and all its operations stay associated with that Azure server region. It's rare, but not impossible, to encounter a network issue that hits an entire region. 
If your service needs to always be available, then you should design it ",2026-02-27T23:08:00.000Z,how-to,architecture-patterns,0.62,True,"BCDR guidance for a specific service often includes region selection patterns, failover strategies, and possibly thresholds for multi-region design; this is architecture-specific decision guidance.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/chatgpt,Chat completions API,Work with chat completion models (classic) - Microsoft Foundry (classic),Work with Azure OpenAI chat completion models,Learn about the options for how to use models with the chat completions API (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Chat models are language models that are optimized for conversational interfaces. The models behave differently than the older completion API models. Previous models were text-in and text-out, which means they accepted a prompt string and returned a completion to append to the prompt. However, the latest models are conversation-in and message-out. The models expect input formatted in a specific chat-l",2026-02-27T23:08:00.000Z,how-to,integrations,0.75,True,"Details chat completions API usage, message formats, and options; includes parameters and patterns specific to Azure OpenAI chat models.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/code-interpreter,Code Interpreter,How to use Azure OpenAI Assistants Code Interpreter (classic) - Microsoft Foundry (classic) portal,Run code with Azure OpenAI Assistants Code Interpreter,Learn how to use Assistants Code Interpreter (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. 
Note The Assistants API is deprecated and will be retired on August 26, 2026. Use the generally availableMicrosoft Foundry Agents service. Follow themigration guideto update your workloads.Learn more. Code Interpr",2026-04-10T08:00:00.000Z,how-to,integrations,0.66,True,"A 'how to use Code Interpreter' article for Azure OpenAI Assistants will typically include product-specific API parameters (for enabling tools, file handling, execution settings), request payload structures, and possibly constraints around execution. These are concrete integration details and coding patterns specific to this service, fitting the integrations sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/codex,Codex,Codex with Azure OpenAI in Microsoft Foundry Models (classic) - Microsoft Foundry (classic) portal,Use Codex CLI and VS Code with Azure OpenAI,Learn how to use the Codex CLI and the Codex extension for Visual Studio Code with Azure OpenAI in Microsoft Foundry Models. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Some links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. OpenAI’sCodex CLIis the same coding agent that powers ChatGPT’s Codex. You can run this coding agent entirely on Azure infrastructure, while keeping your data inside your compliance boundary with the added advantages of enterprise-grade securi",2026-03-26T06:04:00.000Z,how-to,integrations,0.7,True,"Describes using the Codex CLI and VS Code extension with Azure OpenAI in Foundry Models. 
This implies CLI flags, configuration fields, environment variables, and extension settings specific to Azure OpenAI integration, which are product-specific integration patterns.",unchanged @@ -320,16 +323,16 @@ https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/latency,Pe https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/managed-identity,Managed identity,How to configure Azure OpenAI in Microsoft Foundry Models with Microsoft Entra ID authentication (classic) - Microsoft Foundry (classic),Configure Microsoft Entra ID auth for Azure OpenAI,Provides guidance on how to set managed identity with Microsoft Entra ID (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. More complex security scenarios require Azure role-based access control (Azure RBAC). This document covers how to authenticate to your Azure OpenAI resource using Microsoft Entra ID. In the following sections, you'll use the Azure CLI to sign in, and obtain a bearer token to call the OpenAI resource. If you get stuck, links are provided in each section with all avail",2026-03-13T22:11:00.000Z,how-to,security,0.8,True,"Covers authentication with Entra ID and Azure RBAC; such docs list role names, scopes, and CLI commands specific to this service.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/migration,Migrate to OpenAI Python v1.x,How to migrate to OpenAI Python v1.x (classic) - Microsoft Foundry (classic),Migrate Azure OpenAI apps to OpenAI Python v1.x,Learn about migrating to the latest release of the OpenAI Python library with Azure OpenAI. (classic),Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important For the latest API examples refer to theAPI lifecycle article. OpenAI released a new version of theOpenAI Python API library. 
This guide is supplemental toOpenAI's migration guideand will help bring you up to speed on the changes specific to Azure OpenAI. Note This guidance is no longer recommended. To take advantage of the latest v1 API refer to thePython ,2026-02-27T23:08:00.000Z,how-to,integrations,0.7,True,"Migration guide between library versions with Azure-specific changes; these typically include concrete parameter names, code patterns, and configuration differences unique to Azure OpenAI usage.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/migration-javascript,Migrate to OpenAI JavaScript v4.x,How to migrate to OpenAI JavaScript v4.x (classic) - Microsoft Foundry (classic),Migrate Azure OpenAI apps to OpenAI JavaScript v4,Learn about migrating to the latest release of the OpenAI JavaScript library with Azure OpenAI. (classic),Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important For the latest API examples refer to theAPI lifecycle article.,2026-02-27T23:08:00.000Z,how-to,integrations,0.65,True,Migration guidance for JS client with Azure-specific adjustments; likely includes concrete SDK usage patterns and parameter changes that are product-specific.,unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/model-router,Get started with model router,How to use model router for Microsoft Foundry (classic) - Microsoft Foundry (classic) portal,,Learn how to use the model router in Azure OpenAI to select the best model for your task. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Model router for Microsoft Foundry is a deployable AI chat model that selects the best large language model (LLM) to respond to a prompt in real time. It uses different preexisting models to deliver high performance and save on compute costs, all in one model deployment. 
To learn more about how model router works, its advantages, and limitations, see theModel router concepts guide. Use model router th",2026-04-01T17:18:00.000Z,how-to,,0.4,False,"“How to use model router” sounds procedural; the summary references advantages and limitations but not specific numeric limits, configuration tables, or error-code-based troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/model-router,Get started with model router,How to use model router for Microsoft Foundry (classic) - Microsoft Foundry (classic) portal,,Learn how to use the model router in Azure OpenAI to select the best model for your task. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Model router for Microsoft Foundry is a deployable AI chat model that selects the best large language model (LLM) to respond to a prompt in real time. It uses different preexisting models to deliver high performance and save on compute costs, all in one model deployment. To learn more about how model router works, its advantages, and limitations, see theModel router concepts guide. Use model router th",2026-04-01T17:18:00.000Z,how-to,,0.4,False,"“How to use model router” sounds procedural; the summary references advantages and limitations but not specific numeric limits, configuration tables, or error-code-based troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/monitor-openai,Monitor Azure OpenAI,Monitor Azure OpenAI in Microsoft Foundry Models (classic) - Microsoft Foundry (classic),Configure monitoring and logging for Azure OpenAI,Start here to learn how to use Azure Monitor tools like Log Analytics to capture and analyze metrics and data logs for your Azure OpenAI. (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. 
This article describes: Note If you're already familiar with this service and/or Azure Monitor and just want to know how to analyze monitoring data, see theAnalyzesection near the end of this article. When you have critical applications and business processes that rely on Azure resources, you need to monitor and get alerts for your system. The Azure Monitor service c",2026-02-27T23:08:00.000Z,concept-article,configuration,0.65,True,"Monitoring article for a specific service usually lists metric names, log categories, and diagnostic settings unique to Azure OpenAI and Azure Monitor integration.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/network,Virtual networks with private endpoints,Securing Azure OpenAI inside a virtual network with private endpoints (classic) - Microsoft Foundry (classic),Secure Azure OpenAI with virtual networks and private endpoints,How to secure your Azure OpenAI resource inside a virtual network with private endpoints (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. In this article, learn how to create and connect to a secure Azure OpenAI resource. The steps in this article use an Azure Virtual Network to create a security boundary for your Azure OpenAI resource. After completing this article, you'll have the following architecture:",2026-02-27T23:08:00.000Z,how-to,security,0.78,True,"How-to for VNet and private endpoints; such docs include specific network configuration steps, resource types, and settings unique to securing Azure OpenAI.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/network-security-perimeter,Network security perimeter (preview),Add an Azure OpenAI network security perimeter (classic) - Microsoft Foundry (classic),Add Azure OpenAI to a network security perimeter,Use this article to learn about adding Azure OpenAI to your network security perimeter. 
(classic),Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important,2026-02-27T23:08:00.000Z,how-to,security,0.7,True,"Network security perimeter configuration is security-focused and product-specific, typically including resource types, policy settings, and scopes.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/on-your-data-best-practices,Troubleshooting and best practices,Best practices for using Azure OpenAI On Your Data (classic) - Microsoft Foundry (classic),Apply best practices for Azure OpenAI On Your Data,"Learn about the best practices for using Azure OpenAI On Your Data, along with how to fix common problems. (classic)","Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Important Azure OpenAI On Your Data is deprecated and approaching retirement. Microsoft has stopped onboarding new models to Azure OpenAI On Your Data. This feature only supports the following models: Once the GPT‑4.1 models retire, all Azure OpenAI On Your Data API endpoints and supported data source connectors stop functioning. We recommend that you migrate Azure O",2026-02-27T23:08:00.000Z,best-practice,best-practices,0.78,True,"Described as best practices plus fixes for common problems; On Your Data has product-specific behaviors and edge cases that are unlikely to be fully known from training, and the article focuses on concrete guidance for that feature.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/on-your-data-configuration,Network and access configuration,Network and access configuration for Azure OpenAI On Your Data (classic) - Microsoft Foundry (classic),Configure network and access for On Your Data,Use this article to learn about configuring Azure OpenAI when using your data for text generation. (classic),"Applies only to:Foundry (classic) portal. 
This article isn't available for the new Foundry portal.Learn more about the new portal. Important Azure OpenAI On Your Data is deprecated and approaching retirement. Microsoft has stopped onboarding new models to Azure OpenAI On Your Data. This feature only supports the following models: Once the GPT‑4.1 models retire, all Azure OpenAI On Your Data API endpoints and supported data source connectors stop functioning. We recommend that you migrate Azure O",2026-02-27T23:08:00.000Z,how-to,configuration,0.75,True,"Explicitly about network and access configuration for Azure OpenAI On Your Data; likely includes VNet, firewall, identity, and connector configuration parameters and allowed values.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/predicted-outputs,Predicted outputs,How to use predicted outputs with Azure OpenAI in Microsoft Foundry Models (classic) - Microsoft Foundry (classic) portal,Optimize Azure OpenAI predicted outputs for latency,Learn how to improve your model response latency with predicted outputs (classic),Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Some links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Predicted outputs can improve model response latency for chat completions calls where minimal changes are needed to a larger body of text. If you're asking the model to provide a response where a large portion of the expected response is alrea,2026-03-26T06:04:00.000Z,how-to,best-practices,0.68,True,"How-to page for predicted outputs in Azure OpenAI within Foundry classic; likely includes concrete, product-specific guidance such as when to use predicted outputs, how to structure prompts, and configuration/usage patterns that are unique to this feature and not broadly known. 
This is actionable optimization guidance rather than just conceptual description, fitting best-practices.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/prompt-caching,Prompt caching,Prompt caching with Azure OpenAI in Microsoft Foundry Models (classic) - Microsoft Foundry (classic) portal,Configure and use prompt caching in Azure OpenAI Foundry,Learn how to use prompt caching with Azure OpenAI (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Prompt caching allows you to reduce overall request latency and cost for longer prompts that have identical content at the beginning of the prompt.""Prompt""in this context is referring to the input you send to the model as part of your chat completi",2026-04-14T22:13:00.000Z,how-to,configuration,0.7,True,"A how-to page for prompt caching is likely to include product-specific configuration parameters (for enabling caching, cache keys, TTLs, cache behavior flags) and possibly tables of options or defaults that are unique to Azure OpenAI in Foundry. This goes beyond generic prompting concepts and into concrete settings and usage patterns, which qualifies as expert configuration knowledge.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/prompt-caching,Prompt caching,Prompt caching with Azure OpenAI in Microsoft Foundry Models (classic) - Microsoft Foundry (classic) portal,Configure and use prompt caching in Azure OpenAI Foundry,Learn how to use prompt caching with Azure OpenAI (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. 
Prompt caching allows you to reduce overall request latency and cost for longer prompts that have identical content at the beginning of the prompt.""Prompt""in this context is referring to the input you send to the model as part of your chat completi",2026-04-14T22:13:00.000Z,how-to,configuration,0.7,True,"A how-to page for prompt caching is likely to include product-specific configuration parameters (for enabling caching, cache keys, TTLs, cache behavior flags) and possibly tables of options or defaults that are unique to Azure OpenAI in Foundry. This goes beyond generic prompting concepts and into concrete settings and usage patterns, which qualifies as expert configuration knowledge.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/provisioned-get-started,Get started with Provisioned Deployments,Get started with provisioned deployments in Microsoft Foundry (classic) - Microsoft Foundry (classic),Create and tune provisioned deployments in Foundry,"Learn how to create and configure provisioned throughput deployments, verify quota, handle high utilization, and run benchmarks in Microsoft Foundry. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal The following guide walks you through key steps in creating a provisioned deployment with your Microsoft Foundry resource. 
For more details on the concepts discussed here, see:",2026-02-27T23:08:00.000Z,how-to,configuration,0.75,True,"Getting started with provisioned deployments includes quota verification, utilization handling, and benchmark setup with specific configuration options and thresholds unique to Foundry provisioned deployments.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/provisioned-throughput-onboarding,Understanding and calculating PTU costs,Provisioned throughput unit (PTU) costs and billing (classic) - Microsoft Foundry (classic) portal,Plan PTU costs and capacity for Microsoft Foundry,"Learn about provisioned throughput unit (PTU) costs, hourly billing, Azure reservations, and capacity planning in Microsoft Foundry. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Use this article to learn about costs associated with provisioned throughput units (PTUs). For an overview of the provisioned throughput offering, seeWhat is provisioned throughput?. When you're ready to sign up for the provisioned throughput offering, see thegetting started guide. Note In function calling and agent use cases, token usage can be variable. You should understand your expected Tokens Per",2026-04-10T08:00:00.000Z,concept-article,decision-making,0.78,True,"A PTU costs/billing and capacity planning article for a specific Azure service almost certainly includes concrete pricing-related numbers, billing granularity (for example, hourly increments), and guidance on how many PTUs to choose based on expected tokens per minute or similar quantitative thresholds. 
That is expert, product-specific decision guidance with quantified trade-offs for capacity and cost, fitting the decision-making category.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/provisioned-throughput-onboarding,Understanding and calculating PTU costs,Provisioned throughput unit (PTU) costs and billing (classic) - Microsoft Foundry (classic) portal,Plan PTU costs and capacity for Microsoft Foundry,"Learn about provisioned throughput unit (PTU) costs, hourly billing, Azure reservations, and capacity planning in Microsoft Foundry. (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Use this article to learn about costs associated with provisioned throughput units (PTUs). For an overview of the provisioned throughput offering, seeWhat is provisioned throughput?. When you're ready to sign up for the provisioned throughput offering, see thegetting started guide. Note In function calling and agent use cases, token usage can be variable. You should understand your expected Tokens Per",2026-04-10T08:00:00.000Z,concept-article,decision-making,0.78,True,"A PTU costs/billing and capacity planning article for a specific Azure service almost certainly includes concrete pricing-related numbers, billing granularity (for example, hourly increments), and guidance on how many PTUs to choose based on expected tokens per minute or similar quantitative thresholds. That is expert, product-specific decision guidance with quantified trade-offs for capacity and cost, fitting the decision-making category.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/quota,Manage quota,Manage Azure OpenAI in Microsoft Foundry Models quota (classic) - Microsoft Foundry (classic) portal,Manage Azure OpenAI quota in Foundry classic,Learn how to use Azure OpenAI to control your deployments rate limits. (classic),Applies only to:Foundry (classic) portal. 
This article isn't available for the new Foundry portal.Learn more about the new portal. Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Quota provides the flexibility to actively manage the allocation of rate limits across the deployments within your subscription. This article walks through the process of managing your Azure OpenAI quota.,2026-04-08T08:00:00.000Z,how-to,limits-quotas,0.78,True,"Quota management for Azure OpenAI deployments is inherently about rate limits and allocation. This page specifically describes how to control deployment rate limits and quota allocation per subscription in the Foundry (classic) portal, which typically includes concrete per-deployment/per-subscription rate limit values and allocation behaviors that aren't generally known from training.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/realtime-audio,Realtime API for speech and audio,Use the GPT Realtime API for speech and audio with Azure OpenAI (classic) - Microsoft Foundry (classic),Use GPT Realtime API for low-latency audio,Learn how to use the GPT Realtime API for speech and audio with Azure OpenAI. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Azure OpenAI GPT Realtime API for speech and audio is part of the GPT-4o model family that supports low-latency, ""speech in, speech out"" conversational interactions. The GPT Realtime API is designed to handle real-time, low-latency conversational interactions. 
It's a great fit for use cases involving live interactions between a user and a model, such as customer support agents, voice assistants, and r",2026-02-27T23:08:00.000Z,how-to,integrations,0.8,True,"Realtime API how-to for speech/audio; expected to document session configuration, event names, and WebSocket/WebRTC parameters unique to this API.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/realtime-audio-preview-api-migration-guide,Realtime API migration from Preview to GA,Migration from Preview to GA version of Realtime API (classic) - Microsoft Foundry (classic),Migrate from preview to GA Realtime API protocol,Step-by-step guide for migrating from Preview (Beta) to Generally Available version of OpenAI GPT Realtime API protocol. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal The Azure OpenAI GPT Realtime API is now Generally Available (GA). This migration guide helps you update existing applications to use the GA protocol. The GA version introduces changes to SDK usage, endpoint URLs, event names, and session configuration. What's changing: SDK packages, endpoint URL format, event names, and session configuration structure. 
What's not changing: Core functionality, audio f",2026-02-27T23:08:00.000Z,how-to,decision-making,0.75,True,"Migration guide with concrete changes to SDKs, endpoint URLs, event names, and session configuration; helps decide and execute upgrade paths between API versions.",unchanged @@ -345,7 +348,7 @@ https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/role-based https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/spillover-traffic-management,Provisioned spillover,Manage traffic with spillover for provisioned deployments (classic) - Microsoft Foundry (classic),Configure spillover traffic management for provisioned deployments,Learn how to manage traffic bursts in Azure OpenAI provisioned deployments using the spillover feature. Optimize performance and reduce disruptions effectively. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal This article describes how to manage traffic with spillover for provisioned deployments in Azure OpenAI. Spillover manages traffic fluctuations by routing overage traffic to a corresponding standard deployment. This optional capability can be set for all requests on a deployment or managed on a per-request basis, helping you reduce disruptions during traffic bursts.",2026-03-04T08:00:00.000Z,how-to,configuration,0.75,True,"Spillover feature configuration includes deployment-level and per-request settings, flags, and routing behavior between provisioned and standard deployments, which are detailed configuration patterns.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/stored-completions,Stored completions,How to use Azure OpenAI in Microsoft Foundry Models stored completions & distillation (classic) - Microsoft Foundry (classic) portal,Use stored completions and distillation in Azure OpenAI,Learn how to use stored completions & distillation with Azure OpenAI (classic),Applies only to:Foundry (classic) portal. 
This article isn't available for the new Foundry portal.Learn more about the new portal. Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Stored completions allow you to capture the conversation history from chat completions sessions to use as datasets forevaluationsandfine-tuning.,2026-03-13T22:11:00.000Z,how-to,integrations,0.64,True,"Stored completions and distillation are Azure OpenAI–specific capabilities. A dedicated 'how to' article will typically document API/SDK parameters, dataset formats, and workflow-specific options for capturing and reusing completions, which are detailed integration patterns not generally known from training data.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/structured-outputs,Structured outputs,How to use structured outputs with Azure OpenAI in Microsoft Foundry Models (classic) - Microsoft Foundry (classic) portal,Define and call structured outputs with Azure OpenAI,Learn how to improve your model responses with structured outputs (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Structured outputs make a model follow aJSON Schemadefinition that you provide as part of your inference API call. This is in contrast to the olderJSON modefeature, which guaranteed valid JSON would be generated, but was unable to ensure strict adh",2026-03-26T06:04:00.000Z,how-to,integrations,0.68,True,"A 'how to' for structured outputs with Azure OpenAI in Foundry is likely to include product-specific request/response schemas, parameter names, and configuration patterns (for example, how to pass JSON Schema, flags to enable structured outputs, and model-specific behavior). 
These are concrete integration/configuration details unique to this product rather than generic LLM knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/switching-endpoints,OpenAI vs Azure OpenAI (Python),How to switch between OpenAI and Azure OpenAI endpoints with Python - Azure OpenAI Service,Switch Python code between OpenAI and Azure OpenAI,Learn about the changes you need to make to your code to swap back and forth between OpenAI and Azure OpenAI endpoints.,"While OpenAI and Azure OpenAI rely on acommon Python client library, there are small changes you need to make to your code in order to swap back and forth between endpoints. This article walks you through the common changes and differences you'll experience when working across OpenAI and Azure OpenAI.",2026-04-14T22:13:00.000Z,how-to,integrations,0.7,True,"Describes concrete, product-specific code and configuration differences (endpoints, parameters) needed to swap between OpenAI and Azure OpenAI using the shared Python client. These are integration details and patterns unique to these services, beyond generic SDK usage.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/switching-endpoints,OpenAI vs Azure OpenAI (Python),How to switch between OpenAI and Azure OpenAI endpoints with Python - Azure OpenAI Service,Switch Python code between OpenAI and Azure OpenAI,Learn about the changes you need to make to your code to swap back and forth between OpenAI and Azure OpenAI endpoints.,"While OpenAI and Azure OpenAI rely on acommon Python client library, there are small changes you need to make to your code in order to swap back and forth between endpoints. 
This article walks you through the common changes and differences you'll experience when working across OpenAI and Azure OpenAI.",2026-04-14T22:13:00.000Z,how-to,integrations,0.7,True,"Describes concrete, product-specific code and configuration differences (endpoints, parameters) needed to swap between OpenAI and Azure OpenAI using the shared Python client. These are integration details and patterns unique to these services, beyond generic SDK usage.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/use-blocklists,Use blocklists,How to use block lists in Microsoft Foundry models (classic) - Microsoft Foundry (classic) portal,Configure custom block lists for Azure OpenAI content filtering,Learn how to use block lists with Azure OpenAI (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Theconfigurable content filtersavailable in Azure OpenAI are sufficient for most content moderation needs. However, you might need to filter terms specific to your use case—such as competitor names, internal project names, or domain-specific sensitive terms. For this, you can create custom block lists that automatically filter content containing your specified terms. In this article, you learn how to:",2026-03-21T06:03:00.000Z,how-to,security,0.76,True,"Describes how to create and use custom block lists on top of configurable content filters to filter specific terms (competitors, internal project names, etc.). This is product-specific safety/security configuration for content moderation, including how block lists interact with filters, fitting the security category.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/use-web-app,Deploy and use web apps,Use the Azure OpenAI web app (classic) - Microsoft Foundry (classic),,Use this article to learn about using the available web app to chat with Azure OpenAI models. (classic),"Applies only to:Foundry (classic) portal. 
This article isn't available for the new Foundry portal.Learn more about the new portal. Note The web app and itssource codeare provided ""as is"" and as a sample only. Customers are responsible for all customization and implementation of their web apps. See the support section for the web app onGitHubfor more information. Along withMicrosoft Foundry portal, APIs, and SDKs, you can use the customizable standalone web app to interact with Azure OpenAI model",2026-02-27T23:08:00.000Z,how-to,,0.4,False,Web app usage article is a sample UI walkthrough; summary suggests step-by-step usage rather than enumerating reusable configuration matrices or product-specific limits.,unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/web-search,Web search,Web search with the Responses API (classic) - Microsoft Foundry (classic) portal,Configure web_search tool with Azure OpenAI Responses API,Learn how to use Web search with the Responses API (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Web search enables models to retrieve and ground responses with real-time information from the public web before generating output. When enabled, the model can return up-to-date answers with inline citations. Web search is available via theweb_searchtool in theResponses API. Note web_searchis now the recommended tool for Web search in the Azure OpenAI Responses API. @@ -388,9 +391,9 @@ https://learn.microsoft.com/en-us/azure/foundry-classic/openai/whisper-quickstar https://learn.microsoft.com/en-us/azure/foundry-classic/quickstarts/get-started-code,Quickstart,Microsoft Foundry Quickstart (classic) - Microsoft Foundry (classic),,Get started with Microsoft Foundry SDK building AI applications. 
(classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal In this quickstart, you useMicrosoft Foundryto: The Microsoft Foundry SDK is available in multiple languages, including Python, Java, TypeScript, and C#. This quickstart provides instructions for each of these languages. Tip The rest of this article shows how to create and use aFoundry project. SeeQuickstart: Get started with Microsoft Foundry (Hub projects)if you want to use a hub-based project inste",2026-02-27T23:08:00.000Z,quickstart,,0.3,False,"Quickstart tutorial for SDK usage; no detailed configuration tables, limits, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/quickstarts/get-started-playground,Use the chat playground,Get Answers in Chat Playground - Microsoft Foundry Quickstart (classic) - Microsoft Foundry (classic) portal,,"Get answers using the chat playground in Microsoft Foundry portal. Learn how to deploy models, ask questions, and get AI responses quickly with this step-by-step tutorial. (classic)","Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Note Some links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Learn how to use the chat playground inMicrosoft Foundryto explore AI model capabilities interactively. 
This quickstart focuses on the web-based UI experience; to build applications programmatically, seeBuild",2026-03-03T06:03:00.000Z,quickstart,,0.25,False,"A quickstart for using the chat playground UI is primarily a step-by-step tutorial and conceptual exploration of model capabilities, without clear indication of limits, configuration tables, or product-specific diagnostic or integration details.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/quickstarts/hub-get-started-code,Quickstart: Get started with Microsoft Foundry using a hub,Quickstart: Get started with Microsoft Foundry (Hub projects) (classic) - Microsoft Foundry (classic),Use Foundry SDK to deploy a model and chat app,"Set up SDK, deploy a model, and build a simple chat app for hub-based projects. (classic)","Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. Tip An alternate Foundry project quickstart is available:Quickstart: Get started with Microsoft Foundry (Foundry projects). This quickstart sets up your local environment for hub-based projects, deploys a model, and builds a simple traced/evaluable chat script.",2026-03-11T22:16:00.000Z,how-to,integrations,0.65,True,"Quickstart for SDK, deployment, and chat app will include concrete API endpoints, SDK methods, and parameter usage specific to Foundry hub projects, which are integration/coding patterns.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/reference/foundry-known-issues,Foundry known issues,Known issues - Microsoft Foundry (classic),Resolve known issues and workarounds in Microsoft Foundry,"Find known issues, workarounds, and solutions for Microsoft Foundry, including Speech, Translator, and portal services. Review before submitting a support request. (classic)",Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal This article lists known issues and workarounds for Foundry. 
Review these issues before you submit a support request.,2026-02-27T23:08:00.000Z,troubleshooting-known-issue,troubleshooting,0.7,True,"Known issues page typically lists specific symptoms and product behaviors with associated workarounds/solutions, which are symptom→cause→solution mappings unique to the service.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/reference/region-support,Feature availability,Feature availability across cloud regions (classic) - Microsoft Foundry (classic) portal,Check Microsoft Foundry feature availability by region,This article lists Microsoft Foundry feature availability across cloud regions. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Microsoft Foundrybrings together various Azure AI capabilities that were previously only available as standalone Azure services. While Microsoft strives to make all features available in all regions where Foundry is supported at the same time, feature availability might vary by region. In this article, you learn what Foundry features are available across cloud regions.",2026-04-13T17:17:00.000Z,concept-article,deployment,0.7,True,"A region-support matrix for Foundry features is expert, product-specific knowledge that affects where and how you can deploy or use capabilities; this aligns best with deployment, as it constrains which deployment options and features are available in each cloud region.",updated -https://learn.microsoft.com/en-us/azure/foundry-classic/responsible-ai/claude-models/data-privacy,"Data, privacy, and security for Claude models in Microsoft Foundry","Data, privacy, and security for use of Anthropic Claude models in Microsoft Foundry (preview) (classic) - Microsoft Foundry (classic) portal",,"This document details issues for data, privacy, and security for Anthropic Claude models in Microsoft Foundry in Public Preview. 
(classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Some links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. This article describes how the data that you provide is processed, used, and stored when you use Anthropic Claude models from the Microsoft Marketplace in Microsoft Foundry. Claude in Foundry processes data differently from models sold directl",2026-03-26T06:04:00.000Z,how-to,,0.3,False,"Focuses on how data is processed, used, and stored for Anthropic Claude models in Foundry. Likely high-level privacy and data-handling description without concrete RBAC roles, config parameters, or limits; more policy/overview than actionable configuration, so no sub-skill assigned.",unchanged +https://learn.microsoft.com/en-us/azure/foundry-classic/reference/foundry-known-issues,Foundry known issues,Known issues - Microsoft Foundry (classic) - Microsoft Foundry (classic) portal,Resolve known issues in Microsoft Foundry classic portal,"Find known issues, workarounds, and solutions for Microsoft Foundry, including Speech, Translator, and portal services. Review before submitting a support request. (classic)",Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal This article lists known issues and workarounds for Foundry. Review these issues before you submit a support request.,2026-04-14T22:13:00.000Z,troubleshooting-known-issue,troubleshooting,0.78,True,"The page is a 'known issues and workarounds' reference for Microsoft Foundry (classic), which typically lists specific symptoms and their workarounds/solutions. 
This aligns with troubleshooting content (symptom → cause → solution) and represents product-specific expert knowledge not inferable from general training data.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/reference/region-support,Feature availability,Feature availability across cloud regions (classic) - Microsoft Foundry (classic) portal,Check Microsoft Foundry feature availability by region,This article lists Microsoft Foundry feature availability across cloud regions. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Microsoft Foundrybrings together various Azure AI capabilities that were previously only available as standalone Azure services. While Microsoft strives to make all features available in all regions where Foundry is supported at the same time, feature availability might vary by region. In this article, you learn what Foundry features are available across cloud regions.",2026-04-13T17:17:00.000Z,concept-article,deployment,0.7,True,"A region-support matrix for Foundry features is expert, product-specific knowledge that affects where and how you can deploy or use capabilities; this aligns best with deployment, as it constrains which deployment options and features are available in each cloud region.",unchanged +https://learn.microsoft.com/en-us/azure/foundry-classic/responsible-ai/claude-models/data-privacy,"Data, privacy, and security for Claude models in Microsoft Foundry","Data, privacy, and security for use of Anthropic Claude models in Microsoft Foundry (preview) (classic) - Microsoft Foundry (classic) portal",Understand data handling and security for Claude in Microsoft Foundry,"This document details issues for data, privacy, and security for Anthropic Claude models in Microsoft Foundry in Public Preview. 
(classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. This article describes how the data that you provide is processed, used, and stored when you use Anthropic Claude models from the Microsoft Marketplace in Microsoft Foundry. Claude in Foundry processes data differently from models sold directly by ",2026-04-20T17:16:00.000Z,how-to,security,0.7,True,"The page is a product-specific data, privacy, and security description for Anthropic Claude models in Microsoft Foundry, detailing how user data is processed, stored, and isolated compared to using Claude directly. This is expert, service-specific security/privacy behavior that isn't derivable from general training data. It focuses on data processing flows and security guarantees rather than generic concepts, so it best fits the security sub-skill.",updated https://learn.microsoft.com/en-us/azure/foundry-classic/responsible-ai/openai/customer-copyright-commitment,Customer Copyright Commitment,Customer Copyright Commitment Required Mitigations (classic) - Microsoft Foundry (classic) portal,Apply copyright commitment mitigations for Azure OpenAI in Foundry,Customer Copyright Commitment Required Mitigations for Azure OpenAI in Foundry Models (classic),Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Some links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Important Non-English translations are provided for convenience only. Please consult theEN-USversion of this documentfor the definitive version. 
Note The requirements described below apply only to customers using Azure OpenAI in Microsoft Foun,2026-03-26T06:04:00.000Z,concept-article,security,0.7,True,"Customer Copyright Commitment mitigation requirements are service-specific compliance and usage rules that define what configurations and processes are required to use Azure OpenAI in Foundry. These are not generic concepts and directly affect how the service must be set up and governed, fitting best under security/compliance guidance.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/responsible-ai/openai/data-privacy,"Data, privacy, and security","Data, privacy, and security for Azure Direct Models in Microsoft Foundry (classic) - Microsoft Foundry (classic)",Understand data handling and privacy for Azure Direct Models,"This document details issues for data, privacy, and security for Azure Direct Models (classic)","Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Important Non-English translations are provided for convenience only. Please consult theEN-USversion of this documentfor the definitive version. This article provides details regarding how data provided by you to Azure Direct Models in Microsoft Foundry are processed, used, and stored. 
Azure Direct Model means an AI model designated and deployed as an “Azure Direct Model” in Foundry, and includes Azur",2026-02-27T23:08:00.000Z,concept-article,security,0.6,True,"Details how data is processed, used, and stored for Azure Direct Models; includes product-specific data retention and security behavior not generally known.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/responsible-ai/openai/limited-access,Limited access,Limited access to Azure OpenAI in Microsoft Foundry Models (classic) - Microsoft Foundry (classic) portal,Understand limited access policy for Azure OpenAI in Foundry,This document details the limited access policy for Azure OpenAI (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal Note Some links in this article might open content in the new Microsoft Foundry documentation instead of the Foundry (classic) documentation you're viewing now. Important Non-English translations are provided for convenience only. Please consult theEN-USversion of this documentfor the definitive version. As part of Microsoft's commitment to responsible AI, we have designed and operate Azure Direct Mod",2026-03-26T06:04:00.000Z,concept-article,security,0.7,True,"Limited access and responsible AI policy pages for Azure OpenAI typically include product-specific eligibility rules, approval requirements, and usage constraints that are not generic knowledge. These are effectively security/compliance access controls (who can use what, under which conditions), which map best to the security sub-skill. 
Content is not just conceptual marketing; it governs concrete access behavior for this specific service.",unchanged @@ -402,4 +405,4 @@ https://learn.microsoft.com/en-us/azure/foundry-classic/tutorials/copilot-sdk-cr https://learn.microsoft.com/en-us/azure/foundry-classic/tutorials/copilot-sdk-evaluate,Part 3: Evaluate the chat app,Part 3: Evaluate a chat app with the Azure AI SDK (classic) - Microsoft Foundry (classic),Evaluate and improve Foundry-based chat apps with SDK,Evaluate and deploy a custom chat app with the prompt flow SDK. This tutorial is part 3 of a 3-part tutorial series. (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. In this tutorial, you evaluate the chat app you built inPart 2 of the tutorial series. You assess your app's quality across multiple metrics and then iterate on improvements. In this part, you: This tutorial builds onPart 2: Build a custom chat app with the Microsoft Foundry SDK.",2026-02-27T23:08:00.000Z,tutorial,best-practices,0.6,True,"Focuses on evaluation metrics and iteration; likely includes product-specific evaluation flows, configuration, and recommended patterns for assessing quality.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/tutorials/deploy-chat-web-app,Deploy an enterprise chat web app,Tutorial: Deploy an enterprise chat web app in the Microsoft Foundry portal playground (classic) - Microsoft Foundry (classic),,Use this article to deploy an enterprise chat web app in the Microsoft Foundry portal playground. (classic),"Applies only to:Foundry (classic) portal. This article isn't available for the new Foundry portal.Learn more about the new portal. In this article, you deploy an enterprise chat web app that uses your data with a large language model in Microsoft Foundry portal. Your data source grounds the model with specific data. 
Grounding means the model uses your data to understand the context of your question. You don't change the deployed model itself. Your data stays separate and secure in your original ",2026-02-27T23:08:00.000Z,tutorial,,0.4,False,"Tutorial-style deployment walkthrough for a chat web app; likely step-by-step UI and basic config without detailed parameter tables, limits, or product-specific edge cases.",unchanged https://learn.microsoft.com/en-us/azure/foundry-classic/tutorials/screen-reader,Use Microsoft Foundry with a screen reader,Use a screen reader with Microsoft Foundry (classic) - Microsoft Foundry (classic) portal,,This quickstart guides you in how to get oriented and navigate Microsoft Foundry with a screen reader. (classic),"Currently viewing:Foundry (classic) portal version-Switch to version for the new Foundry portal This article is for people who use screen readers such asMicrosoft's Narrator, JAWS, NVDA, or Apple's VoiceOver. In this article, you learn the basic structure of Microsoft Foundry and how to navigate efficiently.",2026-03-24T22:15:00.000Z,how-to,,0.1,False,"Accessibility/navigation guidance for using a screen reader with the Foundry portal; it describes UI structure and how to navigate, not product-specific limits, configuration parameters, APIs, or troubleshooting details. This is not the kind of expert technical knowledge targeted by the sub-skill types.",unchanged -https://learn.microsoft.com/en-us/azure/foundry-classic/what-is-foundry,What is Microsoft Foundry (classic) portal?,What is Microsoft Foundry (classic) portal - Microsoft Foundry (classic) portal,,"Microsoft Foundry is a trusted platform that empowers developers to drive innovation and shape the future with AI in a safe, secure, and responsible way. (classic)","This article covers theFoundry (classic)portal and APIs. 
For the current platform experience, seeWhat is Microsoft Foundry?.",2026-04-13T08:00:00.000Z,overview,,0.2,False,"High-level overview of Microsoft Foundry (classic) with conceptual description of the portal and APIs; no detailed limits, configuration tables, error codes, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/foundry-classic/what-is-foundry,What is Microsoft Foundry (classic) portal?,What is Microsoft Foundry (classic) portal - Microsoft Foundry (classic) portal,,"Microsoft Foundry is a trusted platform that empowers developers to drive innovation and shape the future with AI in a safe, secure, and responsible way. (classic)","This article covers theFoundry (classic)portal and APIs. For the current platform experience, seeWhat is Microsoft Foundry?.",2026-04-13T08:00:00.000Z,overview,,0.2,False,"High-level overview of Microsoft Foundry (classic) with conceptual description of the portal and APIs; no detailed limits, configuration tables, error codes, or decision matrices.",unchanged diff --git a/products/microsoft-foundry-classic/report.md b/products/microsoft-foundry-classic/report.md index e0ed3083..6a34996d 100644 --- a/products/microsoft-foundry-classic/report.md +++ b/products/microsoft-foundry-classic/report.md @@ -1,44 +1,44 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: configuration: Configuring, monitoring, and evaluating Foundry classic agents, models, - tools, and infrastructure, including BYO resources, RAG, safety, tracing, and - Azure OpenAI monitoring. - decision-making: Guides for choosing models, regions, deployments, costs, PTU, streaming, - SDKs, and upgrade/migration paths to design, scale, and optimize Foundry and Azure - OpenAI solutions. + networking, storage, and compute, including Azure OpenAI, RAG, safety, tracing, + and continuous evaluation setups. 
+ decision-making: Guidance on choosing models, regions, deployment types, costs, + PTU, lifecycle/migration, and when to fine-tune or upgrade Azure OpenAI/Foundry + for your scenario architecture-patterns: 'Architectural guidance for Foundry: multi-agent design, service/resource topology, high availability, and full BCDR (backup, recovery, DR, and resiliency) for Agent Service and Azure OpenAI workloads' - integrations: 'Patterns and code to integrate Foundry and Azure OpenAI with tools, - data, and frameworks: search/RAG, files, SharePoint, Bing, MCP/OpenAPI, Functions/Logic - Apps, SDKs, fine-tuning, and realtime audio.' - security: 'Security, privacy, and compliance for Foundry and Azure OpenAI: auth/RBAC, - networking (VNet/Private Link), keys and Key Vault, guardrails/content filters, - Azure Policy, and data handling.' - limits-quotas: Quotas, rate limits, and capacity for Foundry Agents/Models and Azure - OpenAI (deployments, regions, VNets, batch, dynamic quota), plus model availability/retirement - and how to request increases. - deployment: Deploying Foundry hubs, projects, and models (managed/serverless), setting - up agents and prompt flows, and automating deployments/evaluations via CLI, Bicep, - Terraform, VS Code, and CI/CD. - troubleshooting: 'Diagnosing and fixing Foundry issues: compute/usage in prompt - flow, deployments/monitoring, private endpoints, Azure OpenAI fine-tuning, risks - & safety alerts, and known bugs/workarounds.' - best-practices: Best practices for prompts, safety, fine-tuning (incl. vision), - latency/cost optimization, DeepSeek-R1 use, On Your Data, and evaluating/improving - Foundry/Azure OpenAI chat apps + integrations: Patterns and code for integrating Foundry and Azure OpenAI with tools, + data sources, and frameworks (RAG, search, files, web, functions, MCP, Prompt + Flow, LangChain, evaluations, and realtime audio). 
+ security: 'Security, privacy, and compliance for Foundry: auth/RBAC, keys and Key + Vault, network isolation/Private Link, Azure Policy guardrails, content safety/filters, + and data handling for models and tools.' + limits-quotas: Quotas, rate limits, and regional support for Foundry Agents, Models, + Azure OpenAI (incl. batch, dynamic quota, fine-tuning), plus how to view, manage, + and request quota increases + deployment: Deploying Foundry hubs, models, agents, and prompt flows using CLI, + Bicep, Terraform, VS Code, and CI/CD, including serverless/managed options and + regional feature availability. + troubleshooting: 'Diagnosing and fixing Foundry classic issues: compute/Prompt Flow + limits, deployments/monitoring, private endpoints, Azure OpenAI fine-tuning, risks + & safety alerts, and known portal problems.' + best-practices: 'Best practices for Foundry/Azure OpenAI: system and safety prompts, + DeepSeek-R1, eval flows, fine-tuning (incl. vision), latency/perf, On Your Data, + HA design, and chat app evaluation.' skill_description: Expert knowledge for Microsoft Foundry Classic (aka Azure AI Foundry classic) development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations - & coding patterns, and deployment. Use when building Foundry agents, prompt flows, - RAG/search, Azure OpenAI deployments, or multi-agent architectures, and other Microsoft - Foundry Classic related development tasks. Not for Microsoft Foundry (use microsoft-foundry), - Microsoft Foundry Local (use microsoft-foundry-local), Microsoft Foundry Tools (use - microsoft-foundry-tools). -use_when: Use when building Foundry agents, prompt flows, RAG/search, Azure OpenAI - deployments, or multi-agent architectures, and other Microsoft Foundry Classic related - development tasks. + & coding patterns, and deployment. 
Use when building Foundry classic agents with + Azure OpenAI, RAG, Prompt Flow, MCP tools, or realtime audio workloads, and other + Microsoft Foundry Classic related development tasks. Not for Microsoft Foundry (use + microsoft-foundry), Microsoft Foundry Local (use microsoft-foundry-local), Microsoft + Foundry Tools (use microsoft-foundry-tools). +use_when: Use when building Foundry classic agents with Azure OpenAI, RAG, Prompt + Flow, MCP tools, or realtime audio workloads, and other Microsoft Foundry Classic + related development tasks. confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft Foundry Local (use microsoft-foundry-local), Microsoft Foundry Tools (use microsoft-foundry-tools). --- @@ -46,83 +46,98 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft ## Summary -- **Total Pages**: 373 -- **Fetched**: 373 +- **Total Pages**: 376 +- **Fetched**: 376 - **Fetch Failed**: 0 -- **Classified**: 289 -- **Unclassified**: 84 +- **Classified**: 296 +- **Unclassified**: 80 ### Incremental Update -- **New Pages**: 0 -- **Updated Pages**: 22 -- **Unchanged**: 351 -- **Deleted Pages**: 1 +- **New Pages**: 8 +- **Updated Pages**: 24 +- **Unchanged**: 344 +- **Deleted Pages**: 5 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/microsoft-foundry-classic/microsoft-foundry-classic.csv` ## Classification Statistics | Type | Count | Percentage | |------|-------|------------| -| architecture-patterns | 6 | 1.6% | -| best-practices | 10 | 2.7% | -| configuration | 54 | 14.5% | -| decision-making | 23 | 6.2% | -| deployment | 15 | 4.0% | -| integrations | 117 | 31.4% | -| limits-quotas | 13 | 3.5% | -| security | 44 | 11.8% | +| architecture-patterns | 4 | 1.1% | +| best-practices | 12 | 3.2% | +| configuration | 48 | 12.8% | +| decision-making | 26 | 6.9% | +| deployment | 16 | 4.3% | +| integrations | 128 | 34.0% | +| limits-quotas | 11 | 2.9% | +| security | 44 | 11.7% | | troubleshooting | 7 | 1.9% | 
-| *(Unclassified)* | 84 | 22.5% | +| *(Unclassified)* | 80 | 21.3% | ## Changes +### New Pages + +- [Foundry Models lifecycle and support policy](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/model-retirements) +- [Foundry Models retirement schedule](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/model-retirement-schedule) +- [Retired Foundry Models](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/retired-models) +- [Managed compute models lifecycle](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/model-retirement-managed-compute) +- [High availability & disaster recovery](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/high-availability-resiliency) +- [Migration overview](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-migration-overview) +- [Rebuild and validate](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/how-to-migrate-prompt-flow-to-agent-framework) +- [Deploy and cut over](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/how-to-deploy-migrated-agent-framework-workflow) + ### Updated Pages -- [Prompt caching](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/prompt-caching) +- [Microsoft Foundry SDKs](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/sdk-overview) + - Updated: 2026-02-27T23:08:00.000Z → 2026-04-14T22:13:00.000Z +- [Explore Foundry Models](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/foundry-models-overview) + - Updated: 2026-03-05T06:04:00.000Z → 2026-04-20T08:00:00.000Z +- [Environment setup](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/environment-setup) - Updated: 2026-03-26T06:04:00.000Z → 2026-04-14T22:13:00.000Z -- [Vision-enabled models](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/gpt-with-vision) +- [Prompt flow 
overview](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/prompt-flow) + - Updated: 2026-02-27T23:08:00.000Z → 2026-04-20T22:11:00.000Z +- [Create a flow](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-develop) + - Updated: 2026-02-27T23:08:00.000Z → 2026-04-20T22:11:00.000Z +- [Tune prompts using variants](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-tune-prompts-using-variants) + - Updated: 2026-02-27T23:08:00.000Z → 2026-04-20T22:11:00.000Z +- [Process images in a flow](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-process-image) + - Updated: 2026-02-27T23:08:00.000Z → 2026-04-20T22:11:00.000Z +- [Develop an evaluation flow in Prompt flow](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-develop-evaluation) + - Updated: 2026-02-27T23:08:00.000Z → 2026-04-20T22:11:00.000Z +- [Prompt flow tools overview](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/prompt-flow-tools-overview) + - Updated: 2026-03-11T22:16:00.000Z → 2026-04-20T22:11:00.000Z +- [LLM tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/llm-tool) + - Updated: 2026-03-11T22:16:00.000Z → 2026-04-20T22:11:00.000Z +- [Prompt tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/prompt-tool) + - Updated: 2026-03-12T17:22:00.000Z → 2026-04-20T22:11:00.000Z +- [Python tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/python-tool) + - Updated: 2026-02-27T23:08:00.000Z → 2026-04-20T22:11:00.000Z +- [Azure OpenAI GPT-4 Turbo with Vision tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/azure-open-ai-gpt-4v-tool) + - Updated: 2026-03-11T22:16:00.000Z → 2026-04-20T22:11:00.000Z +- [Index Lookup tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/index-lookup-tool) + - Updated: 2026-03-11T22:16:00.000Z → 
2026-04-20T22:11:00.000Z +- [Rerank tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/rerank-tool) + - Updated: 2026-03-12T17:22:00.000Z → 2026-04-20T22:11:00.000Z +- [Content Safety tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/content-safety-tool) + - Updated: 2026-03-12T17:22:00.000Z → 2026-04-20T22:11:00.000Z +- [Embedding tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/embedding-tool) + - Updated: 2026-03-12T17:22:00.000Z → 2026-04-20T22:11:00.000Z +- [Serp API tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/serp-api-tool) + - Updated: 2026-03-12T17:22:00.000Z → 2026-04-20T22:11:00.000Z +- [Troubleshoot prompt flow](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-troubleshoot) + - Updated: 2026-03-11T22:16:00.000Z → 2026-04-20T22:11:00.000Z +- [Use virtual networks](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/virtual-networks) - Updated: 2026-02-27T23:08:00.000Z → 2026-04-14T22:13:00.000Z -- [OpenAI vs Azure OpenAI (Python)](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/switching-endpoints) - - Updated: 2026-03-13T22:11:00.000Z → 2026-04-14T22:13:00.000Z -- [Foundry Models FAQ](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/faq) - - Updated: 2026-02-27T23:08:00Z → 2026-04-14T22:13:00Z -- [FAQ](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/faq) - - Updated: 2026-02-27T23:08:00Z → 2026-04-14T22:13:00Z -- [Model and region support](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/concepts/model-region-support) - - Updated: 2026-03-18T06:03:00.000Z → 2026-04-16T17:16:00.000Z -- [Azure OpenAI FAQ](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/faq) - - Updated: 2026-03-24T22:15:00Z → 2026-04-14T22:13:00Z -- [What is Microsoft Foundry (classic) 
portal?](https://learn.microsoft.com/en-us/azure/foundry-classic/what-is-foundry) - - Updated: 2026-03-13T22:11:00.000Z → 2026-04-13T08:00:00.000Z -- [Foundry Models sold directly by Azure](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/concepts/models-sold-directly-by-azure) - - Updated: 2026-03-13T22:11:00.000Z → 2026-04-17T22:08:00.000Z -- [Foundry Models from partners and community](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/concepts/models-from-partners) - - Updated: 2026-04-07T18:37:00.000Z → 2026-04-18T06:07:00.000Z -- [Get started with model router](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/model-router) - - Updated: 2026-03-24T22:15:00.000Z → 2026-04-01T17:18:00.000Z -- [Configure Models from Partners and Community](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/concepts/models-from-partners) - - Updated: 2026-04-07T18:37:00.000Z → 2026-04-18T06:07:00.000Z -- [Claude in Foundry Models](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/use-foundry-models-claude) - - Updated: 2026-04-07T18:37:00.000Z → 2026-04-16T16:35:00.000Z -- [Microsoft AI (MAI) in Foundry Models](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/use-foundry-models-mai) - - Updated: 2026-04-02T17:11:00.000Z → 2026-04-14T12:58:00.000Z -- [Run red teaming scans locally](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/run-scans-ai-red-teaming-agent) - - Updated: 2026-02-27T23:08:00.000Z → 2026-03-21T06:03:00.000Z -- [Plan rollout](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/planning) - - Updated: 2026-02-27T23:08:00.000Z → 2026-04-16T16:35:00.000Z -- [Create using Bicep template](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/create-resource-template) - - Updated: 2026-03-21T06:03:00.000Z → 2026-04-17T17:16:00.000Z -- [Understanding and calculating PTU 
costs](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/provisioned-throughput-onboarding) - - Updated: 2026-04-09T08:00:00.000Z → 2026-04-10T08:00:00.000Z -- [Service architecture](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/architecture) - - Updated: 2026-02-27T23:08:00.000Z → 2026-04-16T22:08:00.000Z -- [Role-based access control](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/rbac-foundry) - - Updated: 2026-02-27T23:08:00.000Z → 2026-04-13T08:00:00.000Z -- *...and 2 more* +- *...and 4 more* ### Deleted Pages -- ~~REST API~~ (https://learn.microsoft.com/en-us/azure/foundry-classic/reference/foundry-project) +- ~~Model lifecycle and retirement~~ (https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/model-lifecycle-retirement) +- ~~High availability and resiliency~~ (https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/high-availability-resiliency) +- ~~Legacy models~~ (https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/legacy-models) +- ~~Model retirements~~ (https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/model-retirements) +- ~~Business continuity & disaster recovery (BCDR)~~ (https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/business-continuity-disaster-recovery) ## Classified Pages @@ -141,6 +156,7 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [Data source - Elasticsearch (preview)](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/references/elasticsearch) | integrations | 0.86 | Elasticsearch integration reference documents concrete JSON schemas, parameter names, and connector configuration for Azure OpenAI, which are detailed integration patterns beyond generic knowledge. 
| | [Data source - Mongo DB (preview)](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/references/mongo-db) | integrations | 0.86 | MongoDB Atlas integration reference provides detailed REST/Python API shapes and connector configuration options unique to Azure OpenAI On Your Data, fitting the integrations sub-skill. | | [Data source - Pinecone (preview)](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/references/pinecone) | integrations | 0.86 | Pinecone integration reference includes specific connector parameters, request formats, and configuration fields for Azure OpenAI, representing expert integration knowledge. | +| [Rate limits, region and virtual network support](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/evaluation-regions-limits-virtual-network) | limits-quotas | 0.86 | The page explicitly covers rate limits for evaluation runs, region availability, and enterprise features such as virtual network support and storage usage. Service-specific rate limits are time-sensitive expert knowledge that isn't reliably known from training data; although the page also mentions VNet and storage configuration, its primary unique content is the rate limits and regional constraints, which best fit the limits-quotas sub-skill. | | [2025-04-01-preview - Authoring](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/authoring-reference-preview) | integrations | 0.85 | Authoring preview API reference with request/response structure and authorization. Contains concrete API fields and endpoints for authoring scenarios. | | [2025-04-01-preview - Inference](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/reference-preview) | integrations | 0.85 | Preview REST API reference with detailed endpoints and parameters. This is core integration knowledge. 
| | [Assistants](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/reference-preview) | integrations | 0.85 | Duplicate preview REST API reference entry; same integration-focused expert content. | @@ -186,7 +202,6 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [Managed identity](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/managed-identity) | security | 0.80 | Covers authentication with Entra ID and Azure RBAC; such docs list role names, scopes, and CLI commands specific to this service. | | [Managed virtual network](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/managed-virtual-network) | security | 0.80 | Managed VNet article includes outbound isolation options, private endpoint configuration, and supported patterns for Foundry, which are detailed security/network configuration settings. | | [Model region availability for serverless API](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/deploy-models-serverless-availability) | deployment | 0.80 | Region availability matrix for each model supporting serverless APIs is highly product-specific deployment knowledge (which model is deployable where). | -| [Model retirements](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/model-retirements) | limits-quotas | 0.80 | A model retirements page typically lists specific models, their deprecation/retirement dates, and sometimes replacement guidance. These are time-bound, model-specific constraints (what is available until when) that function as limits/quotas-like data and are not inferable from generic training. | | [Overview](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/tools-classic/model-context-protocol) | integrations | 0.80 | Explains how to attach Model Context Protocol servers; likely includes endpoint configuration, auth settings, and tool registration parameters unique to MCP in Foundry. 
| | [Overview](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/tools-classic/sharepoint) | integrations | 0.80 | Describes a SharePoint grounding tool; likely includes configuration (site URLs, permissions, sync options) and constraints unique to this integration. | | [Realtime API for speech and audio](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/realtime-audio) | integrations | 0.80 | Realtime API how-to for speech/audio; expected to document session configuration, event names, and WebSocket/WebRTC parameters unique to this API. | @@ -195,16 +210,16 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [Responses API](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/responses) | integrations | 0.80 | How-to for Responses API including creating, retrieving, deleting responses, streaming, and tools; contains request/response schemas and parameters unique to this API. | | [Role-based access control for Microsoft Foundry](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/hub-rbac-foundry) | security | 0.80 | RBAC article will list specific roles, permissions, and scope definitions for Foundry hubs and projects, which are product-specific security details. | | [Troubleshoot deployments and monitoring](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/troubleshoot-deploy-and-monitor) | troubleshooting | 0.80 | Explicit troubleshooting article for deployments and monitors; likely maps deployment/monitoring errors to causes and resolutions. | -| [Troubleshoot prompt flow](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-troubleshoot) | troubleshooting | 0.80 | Explicit troubleshooting guidance for prompt flow; likely includes common errors, causes, and resolutions for compute session and usage problems. 
| +| [Troubleshoot prompt flow](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-troubleshoot) | troubleshooting | 0.80 | Explicitly a troubleshooting guide; likely organized around common Prompt Flow errors (compute session failures, timeouts) with causes and resolutions unique to Foundry Prompt Flow. | | [Troubleshooting guidance](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/fine-tuning-troubleshoot) | troubleshooting | 0.80 | Dedicated troubleshooting guide for fine-tuning; such pages typically map specific error codes/messages and failure modes to causes and resolutions. | | [Use customer-managed encryption keys](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/encryption-keys-portal) | security | 0.80 | CMK article describes Key Vault integration, key types, rotation behavior, and which Foundry data is encrypted, which are detailed product-specific security configurations. | -| [Use virtual networks](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/virtual-networks) | security | 0.80 | Describes private networking setup with Bicep/Terraform, private endpoints, DNS zones, and deny-by-default rules; includes product-specific network and security configuration details. | | [Azure OpenAI monitoring data reference](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/monitor-openai-reference) | configuration | 0.78 | Monitoring data reference typically lists metric names, dimensions, log categories, and schema fields used with Azure Monitor, which are product-specific configuration parameters and data contracts. | | [Configure private link](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/configure-private-link) | security | 0.78 | Private Link setup for projects will specify endpoint types, subresource names, DNS configuration, and required network settings, which are product-specific security/network configuration details. 
| | [Customer-managed keys](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/hub-encryption-keys-portal) | security | 0.78 | CMK setup for hub projects will include Azure Key Vault integration details, key URI formats, required permissions, and Foundry-specific encryption behaviors that are product-specific security configuration knowledge. | | [Default safety policies](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/default-safety-policies) | security | 0.78 | Describes default Guardrail/safety policies applied to Azure OpenAI in Foundry classic, including content filtering models, blocklists, prompt transformation, and content credentials. These are product-specific security/safety configurations and policy behaviors that an LLM would not fully know from training, fitting the security category. | +| [Foundry Models retirement schedule](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/model-retirement-schedule) | decision-making | 0.78 | A retirement schedule listing each model's current lifecycle stage, retirement date, and suggested replacement is expert knowledge. It directly supports decision-making by providing concrete dates and replacement options for planning migrations, matching the decision-making criteria (comparing options with timelines and recommendations). | +| [Foundry known issues](https://learn.microsoft.com/en-us/azure/foundry-classic/reference/foundry-known-issues) | troubleshooting | 0.78 | The page is a 'known issues and workarounds' reference for Microsoft Foundry (classic), which typically lists specific symptoms alongside their workarounds or solutions. That symptom → cause → solution structure aligns with troubleshooting content and represents product-specific expert knowledge not inferable from general training data. 
| | [Manage quota](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/quota) | limits-quotas | 0.78 | Quota management for Azure OpenAI deployments is inherently about rate limits and allocation. This page specifically describes how to control deployment rate limits and quota allocation per subscription in the Foundry (classic) portal, which typically includes concrete per-deployment/per-subscription rate limit values and allocation behaviors that aren't generally known from training. | -| [Rate limits, region and virtual network support](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/evaluation-regions-limits-virtual-network) | limits-quotas | 0.78 | The page explicitly focuses on rate limits for evaluation runs, region availability, and enterprise features like virtual network support and storage account usage. These topics typically include concrete numeric rate limits, region support matrices, and feature-specific constraints that are not generally known from training data, matching the limits-quotas category. | | [Role-based access control (Azure RBAC)](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/role-based-access-control) | security | 0.78 | RBAC article will list specific built-in roles, permissions, and scopes for Azure OpenAI, which are product-specific security details. | | [Standard agent setup](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/concepts/standard-agent-setup) | configuration | 0.78 | Describes provisioning and configuring customer-managed Azure resources for standard setup; likely includes specific resource types, settings, and configuration parameters. 
| | [Troubleshooting and best practices](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/on-your-data-best-practices) | best-practices | 0.78 | Described as best practices plus fixes for common problems; On Your Data has product-specific behaviors and edge cases that are unlikely to be fully known from training, and the article focuses on concrete guidance for that feature. | @@ -212,17 +227,16 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [Virtual networks with private endpoints](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/network) | security | 0.78 | How-to for VNet and private endpoints; such docs include specific network configuration steps, resource types, and settings unique to securing Azure OpenAI. | | [Use blocklists](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/use-blocklists) | security | 0.76 | Describes how to create and use custom block lists on top of configurable content filters to filter specific terms (competitors, internal project names, etc.). This is product-specific safety/security configuration for content moderation, including how block lists interact with filters, fitting the security category. | | [API lifecycle](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/api-version-lifecycle) | integrations | 0.75 | Details authentication, endpoint formats, and request/response structures for the v1 API, including cross-provider model calls; these are concrete API integration details. | -| [Azure OpenAI GPT-4 Turbo with Vision tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/azure-open-ai-gpt-4v-tool) | configuration | 0.75 | Tool article for GPT-4 Turbo with Vision will document parameters like image input formats, size constraints, and model/deployment configuration unique to this integration. 
| | [Azure OpenAI evaluators](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/evaluation-evaluators/azure-openai-graders) | configuration | 0.75 | Reference for label grading, string checking, similarity, and custom grading; likely includes grader names, parameters, and constraints unique to Foundry’s implementation. | | [Chat completions API](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/chatgpt) | integrations | 0.75 | Details chat completions API usage, message formats, and options; includes parameters and patterns specific to Azure OpenAI chat models. | | [Computer Use](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/computer-use) | integrations | 0.75 | Describes using a specialized model to interact with UIs; likely includes tool configuration, event schemas, and parameters unique to Computer Use. | +| [Deploy and cut over](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/how-to-deploy-migrated-agent-framework-workflow) | deployment | 0.75 | Focuses on deploying Agent Framework workflows to Azure Container Apps, setting up tracing, CI/CD quality gates, and cutover steps; these are product-specific deployment and operations patterns. | | [Deploy models via serverless API](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/deploy-models-serverless) | deployment | 0.75 | How-to for serverless API deployments; such docs typically include supported deployment types, constraints, and configuration parameters specific to Foundry serverless APIs. | | [Fine-tune Azure OpenAI models](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/fine-tuning) | integrations | 0.75 | How-to for fine-tuning via Python and REST; includes API endpoints, request schemas, and parameters (e.g., LoRA settings) that are product-specific integration details. 
| | [Function calling](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/tools-classic/function-calling) | integrations | 0.75 | Explains how agents emit function call messages and how to define/handle functions; likely includes schema details and handler patterns unique to this product. | | [Get started with Provisioned Deployments](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/provisioned-get-started) | configuration | 0.75 | Getting started with provisioned deployments includes quota verification, utilization handling, and benchmark setup with specific configuration options and thresholds unique to Foundry provisioned deployments. | | [How to use Grounding with Bing Custom Search](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/tools-classic/bing-custom-search-samples) | integrations | 0.75 | Provides step-by-step and code samples; likely documents request formats, parameters, and options unique to the Custom Bing Search tool. | | [How to use Grounding with Bing Search](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/tools-classic/bing-code-samples) | integrations | 0.75 | Code-sample article for a specific grounding tool; expected to show SDK calls, parameter names, and configuration details not covered by generic knowledge. | -| [Index Lookup tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/index-lookup-tool) | configuration | 0.75 | Index Lookup tool article likely includes configuration fields (index resource, search parameters, topK, filters) and allowed values specific to Foundry. | | [Model leaderboards](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/model-benchmarks) | decision-making | 0.75 | Provides benchmark metrics across quality, safety, cost, and performance to compare models, enabling quantified model selection decisions. 
| | [Network and access configuration](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/on-your-data-configuration) | configuration | 0.75 | Explicitly about network and access configuration for Azure OpenAI On Your Data; likely includes VNet, firewall, identity, and connector configuration parameters and allowed values. | | [Network security perimeter](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/add-foundry-to-network-security-perimeter) | security | 0.75 | Article provides Foundry-specific guidance on associating resources with NSPs, including resource types, supported operations, and any special considerations, which are security configuration details. | @@ -234,7 +248,6 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [Realtime API migration from Preview to GA](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/realtime-audio-preview-api-migration-guide) | decision-making | 0.75 | Migration guide with concrete changes to SDKs, endpoint URLs, event names, and session configuration; helps decide and execute upgrade paths between API versions. | | [Realtime API via WebRTC](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/realtime-audio-webrtc) | integrations | 0.75 | How-to for using GPT Realtime API via WebRTC for speech/audio. This typically includes signaling URLs, WebRTC configuration parameters (ICE servers, codecs, stream settings), and API-specific options for the Realtime endpoint—concrete integration and coding patterns unique to this service. | | [Run red teaming scans in the cloud](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/run-ai-red-teaming-cloud) | integrations | 0.75 | Cloud-based scans via Foundry SDK; includes product-specific SDK configuration, parameters, and possibly resource requirements. 
| -| [Serp API tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/serp-api-tool) | integrations | 0.75 | Serp API tool integrates an external search API; article will include API key configuration, endpoint/parameter mapping, and constraints specific to this integration. | | [Service monitoring](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/metrics) | configuration | 0.75 | Describes platform metrics, Log Analytics export, and KQL queries; likely includes metric names, dimensions, and example queries unique to this service. | | [Trace and observe agents](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/trace-agents-sdk) | configuration | 0.75 | Explains how to emit and view traces for agents; likely includes exporter configuration, attribute names, and trace structure specific to Foundry. | | [Trace your application](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/trace-application) | integrations | 0.75 | Shows how to integrate OpenAI SDK with OpenTelemetry and Foundry tracing; includes concrete SDK and telemetry configuration parameters. | @@ -246,6 +259,7 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [Audio generation](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/audio-completions-quickstart) | integrations | 0.70 | Quickstart for audio-enabled models with a modalities table; expected to include model names, supported modalities, and API parameters specific to audio generation. | | [Audit and manage hubs](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/azure-policy) | security | 0.70 | Explains how to use Azure Policy with Foundry hubs/projects, including relevant policy definitions, scopes, and evaluation behavior, which are product-specific governance/security configurations. 
| | [Azure OpenAI FAQ](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/faq) | limits-quotas | 0.70 | FAQ includes Azure OpenAI–specific numeric limits (for example, rate limits, token limits, request size constraints) and policy details that are deployment- and SKU-specific and not inferable from generic OpenAI knowledge. | +| [Azure OpenAI GPT-4 Turbo with Vision tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/azure-open-ai-gpt-4v-tool) | integrations | 0.70 | This tool article will describe how to call Azure OpenAI GPT-4 Turbo with Vision from Prompt Flow, including parameters, deployment references, and image input handling, which are product-specific integration details. | | [Azure OpenAI PTU updates](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/provisioned-migration) | decision-making | 0.70 | Update article compares old vs new provisioned models (model-independent quota, payment options, deployment scenarios) and guides existing users on migration/choice, which is decision-making content. | | [Batch with Azure Blob Storage](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/batch-blob-storage) | configuration | 0.70 | Explains configuring Blob Storage for batch input/output; likely includes container names, SAS/identity configuration, and parameter settings unique to this integration. | | [Build and consume vector indexes (portal)](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/index-add) | configuration | 0.70 | How-to for building and consuming vector indexes; likely includes index configuration options, parameters, and constraints specific to Foundry’s indexing system. 
| @@ -258,7 +272,6 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [Configure content filtering](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/configure-content-filters) | configuration | 0.70 | How-to for configuring content filters, including approval for gated modifications. Likely includes specific filter setting names, allowed values, and configuration flows unique to Foundry, which are configuration details rather than generic concepts. | | [Connect your AI hub to the Microsoft Foundry resource](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/configure-project-connection) | configuration | 0.70 | How-to for configuring a connection to Microsoft Foundry Models in an AI project. This typically includes product-specific connection parameters (endpoints, keys, project/hub identifiers, region or environment options) and required settings unique to Foundry (classic), which are configuration details beyond generic LLM knowledge. | | [Consume models from a different project or hub](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/deploy-models-serverless-connect) | configuration | 0.70 | Explains how to consume serverless APIs from a different project or hub, which requires specifying project/hub identifiers, connection settings, and possibly auth/endpoint configuration unique to Foundry (classic). These are product-specific configuration patterns rather than generic concepts. | -| [Content Safety tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/content-safety-tool) | security | 0.70 | Content Safety tool article is expected to document safety categories, threshold parameters, and configuration options specific to Azure AI Content Safety. 
| | [Create a connection](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/connections-add) | integrations | 0.70 | Adding connections to a project generally involves specifying connection types, authentication parameters, and endpoint/secret configuration unique to Foundry. The article likely includes product-specific connection settings and possibly parameter names, which aligns with integrations & coding patterns. | | [Create a hub using Terraform](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/create-hub-terraform) | deployment | 0.70 | Using Terraform to create a Foundry hub, project, AI services resource, and other dependencies involves product-specific resource types, arguments, and required combinations. This is specialized deployment-as-code knowledge for Foundry, fitting the deployment sub-skill. | | [Create and manage compute session](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/create-manage-compute-session) | configuration | 0.70 | How-to for creating/managing compute sessions is likely to include specific settings (SKU choices, timeouts, auto-shutdown, region constraints) and configuration parameters unique to prompt flow compute. | @@ -269,6 +282,7 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [Custom policies for model deployment](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/configure-deployment-policies) | security | 0.70 | Shows how to author custom Azure Policy definitions that deny non-approved model deployments and deployment types for Foundry and Azure OpenAI. Involves product-specific policy JSON structure and fields, which are concrete security configuration details. 
| | [Customer Copyright Commitment](https://learn.microsoft.com/en-us/azure/foundry-classic/responsible-ai/openai/customer-copyright-commitment) | security | 0.70 | Customer Copyright Commitment mitigation requirements are service-specific compliance and usage rules that define what configurations and processes are required to use Azure OpenAI in Foundry. These are not generic concepts and directly affect how the service must be set up and governed, fitting best under security/compliance guidance. | | [Data monitoring reference](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/reference/monitor-service) | configuration | 0.70 | Monitoring data reference typically lists metric names, dimensions, log categories, and schemas. These are product-specific configuration/telemetry fields needed to integrate monitoring, not generic concepts. | +| [Data, privacy, and security for Claude models in Microsoft Foundry](https://learn.microsoft.com/en-us/azure/foundry-classic/responsible-ai/claude-models/data-privacy) | security | 0.70 | The page is a product-specific data, privacy, and security description for Anthropic Claude models in Microsoft Foundry, detailing how user data is processed, stored, and isolated compared to using Claude directly. This is expert, service-specific security/privacy behavior that isn't derivable from general training data. It focuses on data processing flows and security guarantees rather than generic concepts, so it best fits the security sub-skill. | | [Data, privacy, and security for Model Catalog](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/concept-data-privacy) | security | 0.70 | Data, privacy, and security behavior for a specific platform is product-specific expert knowledge; such pages typically detail how data is processed/stored and may reference concrete security configurations or scopes. 
| | [Deep research](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/deep-research) | integrations | 0.70 | Covers using the o3-deep-research model with the Responses API, likely including request schema, tool configuration, parameters controlling depth, sources, and citation behavior. These are concrete API usage and parameter patterns specific to this model and API. | | [Default safety policies for Foundry Models](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/concepts/default-safety-policies) | security | 0.70 | Explains default guardrails, safety policies, and content filtering applied to Foundry Models. These are product-specific responsible AI controls and default configurations, which fall under security (safety/compliance configuration) rather than generic concepts. | @@ -279,10 +293,10 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [Deployment options in Microsoft Foundry portal](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/deployments-overview) | deployment | 0.70 | Compares deployment options (standard, serverless API, managed compute) with when to use each, a product-specific deployment decision guide. | | [Develop with agents and LangGraph](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/langchain-agents) | integrations | 0.70 | Describes using langchain-azure-ai package; likely includes SDK configuration, client options, and integration patterns specific to Foundry. | | [Document embedding in prompts](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/content-filter-document-embedding) | integrations | 0.70 | Provides recommended prompt formatting methods (JSON escaping, separating system/user/assistant segments) to improve Guardrails performance. These are concrete, product-specific integration patterns for Azure OpenAI in Foundry. 
| -| [Embedding tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/embedding-tool) | configuration | 0.70 | Embedding tool documentation will include parameters like embedding model, vector dimensions, and deployment configuration unique to this product. | +| [Embedding tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/embedding-tool) | integrations | 0.70 | Embedding tool article will specify model/deployment parameters, vector output handling, and integration with downstream tools, which are concrete SDK/tool configuration patterns. | | [Enable tracing and collect feedback for a flow deployment](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/trace-production-sdk) | configuration | 0.70 | Tracing/feedback article will likely document SDK parameters, environment variables, and configuration settings for telemetry unique to this product. | | [Endpoints for Foundry Models](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/concepts/endpoints) | configuration | 0.70 | Explains endpoint organization and usage, including inference endpoint details and keyless authentication—product-specific endpoint configuration. | -| [Environment setup](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/environment-setup) | configuration | 0.70 | Environment setup for Foundry Agent Service is likely to include specific resource types, configuration parameters, and required settings unique to this service (for example, particular Azure resources, identities, or environment variables). That is product-specific configuration knowledge beyond generic tutorials, so it fits the configuration sub-skill. 
| +| [Environment setup](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/environment-setup) | configuration | 0.70 | An environment setup guide for a specific service usually includes concrete configuration parameters (resource types/SKUs to deploy, required settings, environment variables, and service-specific prerequisites). These are detailed, product-specific configuration steps that qualify as expert knowledge and align best with the configuration sub-skill. | | [Evaluate your AI agents](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/agent-evaluate-sdk) | integrations | 0.70 | Agent evaluation via SDK; includes specific API calls, evaluator configuration, and parameters that are product-specific integration patterns. | | [Evaluation in Azure DevOps](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/evaluation-azure-devops) | deployment | 0.70 | Azure DevOps integration for evaluations; likely includes task YAML, parameters, and constraints specific to this evaluation action. | | [Evaluation in GitHub Actions](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/evaluation-github-action) | deployment | 0.70 | GitHub Action for offline evaluation; such pages typically document action inputs, environment variables, and constraints, which are product-specific deployment/CI details. | @@ -291,11 +305,9 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [File search](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/file-search) | integrations | 0.70 | File Search how-to describes how to attach external documents, including file limits, indexing behavior, and configuration parameters unique to this feature. 
| | [Fine-tune models deployed via serverless API](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/fine-tune-serverless) | deployment | 0.70 | Serverless deployment of fine-tuned models; likely includes deployment options, constraints, and possibly tier-specific behavior for serverless hosting. | | [For Speech/Language](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/bring-your-own-azure-storage-speech-language-services) | configuration | 0.70 | Speech/Language BYOS configuration at resource creation time will include specific parameters, allowed values, and constraints unique to these capabilities, fitting configuration expert knowledge. | -| [Foundry known issues](https://learn.microsoft.com/en-us/azure/foundry-classic/reference/foundry-known-issues) | troubleshooting | 0.70 | Known issues page typically lists specific symptoms and product behaviors with associated workarounds/solutions, which are symptom→cause→solution mappings unique to the service. | | [General purpose evaluators](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/evaluation-evaluators/general-purpose-evaluators) | configuration | 0.70 | Describes specific general-purpose evaluators (coherence, fluency, QA composite) with likely parameter names and usage patterns; this is product-specific evaluator configuration rather than generic theory. | | [Guardrails & controls for serverless API](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/model-catalog-content-safety) | security | 0.70 | Content safety and guardrails for models sold directly by Azure; likely includes specific safety settings, policy controls, and configuration parameters unique to Foundry serverless deployments. 
| | [Hide preview features with Azure tags](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/disable-preview-features) | configuration | 0.70 | Describes specific tag names/values, scopes (subscription/resource group/resource), and behavior across classic/new portals, which are concrete configuration parameters. | -| [High availability and resiliency](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/high-availability-resiliency) | architecture-patterns | 0.70 | HA/resiliency article will include region/deployment patterns, recommended topologies, and possibly RPO/RTO guidance specific to Foundry projects and Agent Service, which are architecture patterns. | | [How to configure a private link](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/hub-configure-private-link) | security | 0.70 | Private Link configuration articles typically include specific resource types, endpoint names, DNS zone patterns, and required settings that are product-specific security/network configuration details. | | [How to upload files](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/tools-classic/file-search-upload-files) | integrations | 0.70 | Step-by-step and code samples for uploading files; likely documents API/SDK parameters, size constraints, and content-type handling specific to this service. | | [How to use Azure AI Search](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/tools-classic/azure-ai-search-samples) | integrations | 0.70 | Explains using an existing search index with the Azure AI Search tool; includes concrete integration steps and parameters for this product combination. 
| @@ -303,16 +315,17 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [How to use Deep Research](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/tools-classic/deep-research-samples) | integrations | 0.70 | The article is explicitly about using the Deep Research tool with the Azure AI Projects SDK, including code examples and setup instructions. This suggests concrete API usage and configuration details unique to this deprecated tool, fitting the integrations & coding patterns category. | | [How to use browser automation](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/tools-classic/browser-automation-samples) | integrations | 0.70 | Step-by-step and code samples for using the Browser Automation tool; includes tool parameters, usage patterns, and constraints specific to this integration. | | [Image generation](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/dall-e) | integrations | 0.70 | Explains how to call image generation and editing via image generation API and responses API, including model-specific options and parameters, which are integration details. | +| [Index Lookup tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/index-lookup-tool) | integrations | 0.70 | Index Lookup tool documentation will cover configuration fields (index name, search parameters, connection settings) and how it integrates with Azure AI Search or similar, which are concrete integration patterns. | | [Inference examples for serverless deployments](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/models-inference-examples) | integrations | 0.70 | Provides concrete inference examples for serverless API deployments; expected to show request schemas, headers, and parameters specific to Foundry Models serverless APIs. 
| | [JSON mode](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/json-mode) | integrations | 0.70 | Describes how to enable JSON mode on Azure OpenAI chat completions, including request parameters and constraints for JSON output, which are specific API integration details. | -| [LLM tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/llm-tool) | configuration | 0.70 | Tool-specific article for the LLM tool will typically document parameters (model name, temperature, max tokens, deployment names) and allowed values unique to this product. | +| [LLM tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/llm-tool) | integrations | 0.70 | An article introducing the LLM tool for flows will typically document tool parameters (model name, temperature, max tokens, deployment references) and how they map to Azure OpenAI, which are product-specific integration settings. | | [Limited access](https://learn.microsoft.com/en-us/azure/foundry-classic/responsible-ai/openai/limited-access) | security | 0.70 | Limited access and responsible AI policy pages for Azure OpenAI typically include product-specific eligibility rules, approval requirements, and usage constraints that are not generic knowledge. These are effectively security/compliance access controls (who can use what, under which conditions), which map best to the security sub-skill. Content is not just conceptual marketing; it governs concrete access behavior for this specific service. | +| [Managed compute models lifecycle](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/model-retirement-managed-compute) | decision-making | 0.70 | The article focuses on lifecycle stages, deprecation timelines, and replacements for managed compute deployments. 
This is expert, product-specific guidance that informs when to migrate, what to migrate to, and how to plan around provider-driven changes, aligning with decision-making (migration considerations and upgrade paths). | | [Managed network for hubs](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/configure-managed-network) | security | 0.70 | How-to for configuring a managed network for hub-based projects, including specific networking settings and Azure resources. These are concrete security/network configuration steps beyond generic concepts. | | [Microsoft Fabric (preview)](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/tools-classic/fabric) | integrations | 0.70 | Focuses on integrating Foundry agents with the Microsoft Fabric data agent for analytics. This integration is product-specific and likely includes concrete configuration and API usage patterns, which are not generic concepts and thus count as expert integration knowledge. | | [Migrate from hub-based to Foundry project](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/migrate-project) | decision-making | 0.70 | Migration guide will include concrete mapping between old and new project types, supported/unsupported features, and stepwise migration decisions, which are product-specific decision and migration details. | | [Migrate to Azure OpenAI .NET v2.0](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/dotnet-migration) | integrations | 0.70 | Covers migration between specific Azure OpenAI .NET SDK versions; such docs contain detailed API/parameter changes and code patterns unique to this product. | | [Migrate to OpenAI Python v1.x](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/migration) | integrations | 0.70 | Migration guide between library versions with Azure-specific changes; these typically include concrete parameter names, code patterns, and configuration differences unique to Azure OpenAI usage. 
| -| [Model lifecycle and retirement](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/model-lifecycle-retirement) | decision-making | 0.70 | Lifecycle/deprecation guidance usually includes timelines, migration steps, and criteria for when to move to newer models—service-specific upgrade and migration decision guidance. | | [Model versions](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/concepts/model-versions) | decision-making | 0.70 | Covers how model versions work, update policies, deployment options, and managing version upgrades. This is guidance for choosing update policies and upgrade paths (for example, auto-upgrade vs pinned versions) and likely includes scenario-based recommendations, making it decision-making rather than just conceptual. | | [Monitor model deployments](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/monitor-models) | configuration | 0.70 | Explains using Azure Monitor and Log Analytics for Foundry Models; includes specific metrics, log categories, and configuration steps that are product-specific. |
| | [Overview](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/bring-your-own-azure-storage-foundry) | configuration | 0.70 | BYOS article will include storage account types, required configuration parameters, identity/connection settings, and supported scenarios for agents/evaluations/datasets, which are detailed configuration options. | | [Performance & latency](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/latency) | best-practices | 0.70 | Performance/latency optimization for a specific service typically includes concrete recommendations, configuration values, and throughput behaviors unique to Azure OpenAI. | -| [Priority processing](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/priority-processing) | decision-making | 0.70 | Priority processing for model deployments (low latency, high availability, pay-as-you-go) typically involves choosing service tiers, understanding cost/performance trade-offs, and verifying which tier processed requests. This is tier/plan selection with quantified trade-offs and monitoring of associated costs, fitting decision-making. | | [Prompt caching](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/prompt-caching) | configuration | 0.70 | A how-to page for prompt caching is likely to include product-specific configuration parameters (for enabling caching, cache keys, TTLs, cache behavior flags) and possibly tables of options or defaults that are unique to Azure OpenAI in Foundry. This goes beyond generic prompting concepts and into concrete settings and usage patterns, which qualifies as expert configuration knowledge. | -| [Prompt tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/prompt-tool) | configuration | 0.70 | Prompt tool article is expected to list tool settings, input/output bindings, and parameter options specific to Foundry prompt flow. 
| -| [Python tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/python-tool) | configuration | 0.70 | Python tool documentation usually includes configuration fields (environment, inputs/outputs, package requirements) and constraints specific to this environment. | +| [Python tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/python-tool) | integrations | 0.70 | Python tool documentation usually includes function signatures, input/output schema, environment constraints, and parameter names specific to Prompt Flow, which are integration and coding patterns. | | [Reasoning models](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/reasoning) | decision-making | 0.70 | Explains when to use GPT-5 series, o3-mini, o1, and o1-mini for reasoning tasks; likely includes scenario-based guidance and trade-offs between models. | +| [Rebuild and validate](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/how-to-migrate-prompt-flow-to-agent-framework) | integrations | 0.70 | Covers exporting flows and re-implementing them using WorkflowBuilder and Executor classes plus Azure AI Evaluation SDK; this is detailed, product-specific coding and integration guidance. | | [Recover from a platform outage](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/agent-service-platform-disaster-recovery) | architecture-patterns | 0.70 | Describes warm standby, failover, and failback procedures for regional/platform outages. These are concrete DR architecture patterns and recovery strategies specific to Foundry Agent Service, including when to use each approach, which aligns with architecture-patterns. 
| | [Recover from resource and data loss](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/agent-service-operator-disaster-recovery) | architecture-patterns | 0.70 | Resource/data loss recovery article maps specific incident types (deletions, corruption) to recovery procedures across Cosmos DB, AI Search, Storage, etc., which are product-specific DR patterns. | | [Reinforcement fine-tuning](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/reinforcement-fine-tuning) | limits-quotas | 0.70 | Explicitly notes that RFT jobs automatically pause once they hit a $5 limit; this is a numeric quota/limit that qualifies as expert limits-quotas knowledge, alongside RFT-specific configuration. | -| [Rerank tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/rerank-tool) | configuration | 0.70 | Rerank tool documentation will describe tool parameters (model, topN, scoring options) and constraints specific to Foundry prompt flow. | +| [Rerank tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/rerank-tool) | integrations | 0.70 | Rerank tool article describes how to wire it after Index Lookup, including parameters like topN, scoring options, and model selection; these are specific integration settings for Prompt Flow RAG pipelines. | | [Responses API with Foundry Models](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/generate-responses) | integrations | 0.70 | Explains how to use the Responses API with specific models, likely including request parameters and usage patterns unique to Foundry. | | [Retrieval Augmented Generation (RAG) evaluators](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/evaluation-evaluators/rag-evaluators) | configuration | 0.70 | RAG evaluator reference is product-specific; typically includes evaluator names, inputs, and configuration options for relevance/groundedness/completeness scoring. 
| | [Risk and safety evaluators](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/evaluation-evaluators/risk-safety-evaluators) | configuration | 0.70 | Risk and safety evaluators are specific to this platform; page likely documents evaluator types, parameters, and thresholds that constitute expert configuration knowledge. | @@ -348,6 +360,7 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [Upgrade from Azure OpenAI Service](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/upgrade-azure-openai) | decision-making | 0.70 | Explains upgrade path, preserved endpoints, and capability differences; helps choose and plan migration between resource types with concrete implications for existing workloads. | | [Use MCP tools with the Microsoft Foundry Agent Service in VS Code](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/vs-code-agents-mcp) | integrations | 0.70 | Shows how to add and test MCP tools via the VS Code extension; likely includes configuration fields and parameters unique to this integration path. | | [Use structured outputs](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/use-structured-outputs) | integrations | 0.70 | Explains structured output configuration for chat models, including schema/parameter usage specific to Foundry implementations. | +| [Use virtual networks](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/virtual-networks) | configuration | 0.70 | The page describes a 'Standard Setup with private networking' for Foundry Agent Service using Bicep/Terraform, including specific Azure networking components (virtual network, private endpoints, DNS zones, deny-by-default rules). This is product-specific configuration guidance with concrete resource types and settings, rather than a generic networking overview, so it fits the configuration sub-skill. 
| | [Weights & Biases integration (preview)](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/weights-and-biases-integration) | integrations | 0.70 | Describes a concrete integration between Azure OpenAI fine-tuning and W&B, likely including API keys, environment variables, and run configuration parameters specific to this integration. | | [What is the Provisioned Throughput offering (PTU)?](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/provisioned-throughput) | decision-making | 0.70 | Provisioned throughput concept article for Foundry models will include PTU sizing, capacity allocation behavior, and when to choose provisioned vs standard, which are product-specific decision criteria. | | [Work with image embeddings](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/use-image-embeddings) | integrations | 0.70 | Explains how to generate image embeddings via Foundry APIs, including model support and request configuration. | @@ -357,7 +370,10 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [Create and manage compute](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/create-manage-compute) | configuration | 0.68 | A 'how to create and manage compute instances' article for a specific portal typically documents instance-size options, region constraints, and configuration parameters (for prompt flow, index creation, VS Code access). These are product-specific settings and options that go beyond generic knowledge, fitting the configuration sub-skill. | | [Create using Terraform template](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/create-resource-terraform) | configuration | 0.68 | The page describes using Terraform (AzAPI and AzureRM providers) to create Foundry resources, projects, deployments, and connections. 
Terraform-based resource creation for a specific Azure service typically includes resource type names, required/optional arguments, and provider-specific configuration blocks that are product-specific and not generally known to an LLM. This aligns best with the configuration sub-skill, as it focuses on concrete IaC configuration rather than generic deployment or conceptual guidance. | | [Fine-tuning GPT-4o mini](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/tutorials/fine-tune) | integrations | 0.68 | Fine-tuning tutorial for a specific model and date-stamped version; includes API parameters, job configuration, and constraints unique to this product and model. | +| [High availability & disaster recovery](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/high-availability-resiliency) | best-practices | 0.68 | High availability and resiliency guidance for a specific service is typically concrete and product-specific (for example, recommended regional deployment patterns, failover configurations, and service behaviors under failure). This goes beyond generic HA concepts and is likely to include Foundry-specific recommendations and edge cases, fitting best under best-practices. | +| [Microsoft Foundry SDKs](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/sdk-overview) | integrations | 0.68 | The page goes beyond a conceptual overview and explains which specific SDKs and endpoints to use for different scenarios, including product-specific endpoint paths (/openai/v1 vs others) and authentication patterns (DefaultAzureCredential vs API keys). These are concrete integration details and patterns unique to Microsoft Foundry, fitting the integrations category. 
| | [Predicted outputs](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/predicted-outputs) | best-practices | 0.68 | How-to page for predicted outputs in Azure OpenAI within Foundry classic; likely includes concrete, product-specific guidance such as when to use predicted outputs, how to structure prompts, and configuration/usage patterns that are unique to this feature and not broadly known. This is actionable optimization guidance rather than just conceptual description, fitting best-practices. | +| [Priority processing](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/priority-processing) | decision-making | 0.68 | The page is specific to Microsoft Foundry (classic) priority processing, describing how and when to enable a priority tier on a model deployment, how to verify which tier processed requests, and how to monitor associated costs. This is product- and tier-specific guidance that helps decide when to use priority processing versus standard, with cost/performance trade-offs, which fits the decision-making category better than generic configuration or limits. The content goes beyond conceptual marketing and includes concrete, service-specific operational guidance that an LLM is unlikely to know from training. | | [Run red teaming scans locally](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/run-scans-ai-red-teaming-agent) | integrations | 0.68 | How-to page for using the AI Red Teaming Agent via the Azure AI Evaluation SDK; likely includes product-specific SDK usage, configuration parameters, and code patterns for running local scans against generative AI apps, which qualifies as integration-focused expert knowledge beyond generic concepts. 
| | [Safety evaluation](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/fine-tuning-safety-evaluation) | best-practices | 0.68 | Fine-tuning safety evaluation guidance for Microsoft Foundry is product- and implementation-specific, including concrete recommendations and constraints around how safety checks are applied to fine-tuned models. This is operational, product-specific RAI guidance that an LLM is unlikely to know from pretraining and fits best under best-practices. | | [Structured outputs](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/structured-outputs) | integrations | 0.68 | A 'how to' for structured outputs with Azure OpenAI in Foundry is likely to include product-specific request/response schemas, parameter names, and configuration patterns (for example, how to pass JSON Schema, flags to enable structured outputs, and model-specific behavior). These are concrete integration/configuration details unique to this product rather than generic LLM knowledge. | @@ -370,18 +386,19 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [Connected agents](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/connected-agents) | architecture-patterns | 0.65 | Describes how to break workflows into specialized agents and orchestrate them; likely includes product-specific patterns and guidance on when to use connected agents. | | [Continuously evaluate your AI agents](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/continuous-evaluation-agents) | configuration | 0.65 | Continuous evaluation setup; likely documents schedules, evaluator configurations, and portal/SDK settings unique to this feature. 
| | [Continuously monitor your applications](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/monitor-applications) | configuration | 0.65 | Monitoring how-to; typically includes configuration of metrics, logging, and evaluator settings specific to Foundry’s monitoring features. | +| [Create a flow](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-develop) | integrations | 0.65 | A 'how to build' article for Prompt Flow is likely to include concrete tool/node parameters, flow configuration fields, and product-specific coding patterns for orchestrating LLM calls, which are integration patterns rather than just tutorial content. | | [Disaster recovery for agent services](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/agent-service-disaster-recovery) | architecture-patterns | 0.65 | DR planning content for a specific service usually details what can/can’t be recovered, readiness checklists, and recovery paths. For Agent Service, this is a product-specific resiliency pattern with clear guidance on when and how to use certain DR approaches, fitting architecture & design patterns for this service. | | [Fine-tuning in Microsoft Foundry](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/fine-tuning-overview) | decision-making | 0.65 | Explains when to fine-tune and which type to choose; likely includes scenario-based guidance and trade-offs for different fine-tuning approaches, which is decision-making content. | | [Foundry (classic) FAQ](https://learn.microsoft.com/en-us/azure/foundry-classic/faq) | troubleshooting | 0.65 | As an FAQ for a specific Azure service, this page is likely to include concrete Q&A about common problems, including specific error messages or behaviors and their resolutions, which fits the troubleshooting pattern more than generic overview content. 
| +| [Foundry Models lifecycle and support policy](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/model-retirements) | decision-making | 0.65 | Lifecycle and support policy content typically includes concrete timelines, overlap periods, and migration guidance between model versions. This helps users decide when and how to migrate workloads and select replacement models, which aligns with decision-making and contains product-specific policy details not inferable from general knowledge. | | [Generate synthetic and simulated data for evaluation](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/simulator-interaction-data) | configuration | 0.65 | How-to for generating synthetic data for evaluation; likely includes SDK calls, parameters, and dataset schema requirements that are specific to Foundry’s evaluation system. | | [Get started with DeepSeek-R1 reasoning model](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/tutorials/get-started-deepseek-r1) | best-practices | 0.65 | Tutorial for a specific reasoning model; likely includes model deployment options, parameter settings, and parsing patterns unique to DeepSeek-R1 in Foundry. | | [Groundedness detection](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/content-filter-groundedness) | integrations | 0.65 | Describes groundedness detection requirements such as document embedding and prompt formatting for RAG scenarios. Involves product-specific API usage patterns and parameters for integrating Content Safety with LLM responses. | -| [Legacy models](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/legacy-models) | limits-quotas | 0.65 | Retired models list is effectively a product-specific availability constraint; page likely enumerates specific model names and retirement dates, which are concrete limits on what can be deployed. 
| | [MedImageInsight model for embedding](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/healthcare-ai/deploy-medimageinsight) | integrations | 0.65 | Step-by-step deployment plus API examples for a specific healthcare model; likely includes endpoint paths, request/response schemas, and model-specific parameters that qualify as product-specific integration patterns. | | [MedImageParse models for prompted segmentation](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/healthcare-ai/deploy-medimageparse) | integrations | 0.65 | Model-specific how-to for MedImageParse and MedImageParse 3D; likely documents input formats, parameters, and API usage patterns that are not generic LLM knowledge. | -| [Microsoft Foundry SDKs](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/sdk-overview) | decision-making | 0.65 | Explains which SDK and endpoint to use for specific scenarios and resource types, providing concrete selection guidance beyond generic concepts. | | [Migrate from Azure AI Inference SDK to OpenAI SDK](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/model-inference-to-openai-migration) | integrations | 0.65 | Migration guide between two specific SDKs likely includes side-by-side API/parameter mappings and product-specific code patterns that qualify as integration-focused expert knowledge. | | [Migrate to OpenAI JavaScript v4.x](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/migration-javascript) | integrations | 0.65 | Migration guidance for JS client with Azure-specific adjustments; likely includes concrete SDK usage patterns and parameter changes that are product-specific. 
| +| [Migration overview](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-migration-overview) | decision-making | 0.65 | Migration overview will compare Prompt Flow and Agent Framework, outline when and how to migrate, and provide scenario-based guidance and constraints, which supports technology selection and migration decisions. | | [Model and region support](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/concepts/model-region-support) | decision-making | 0.65 | Described as a matrix of supported Azure OpenAI models, deployment types, and regions for Foundry Agent Service. This is expert knowledge used to decide which model/region combinations are available and appropriate, fitting decision-making (service/option selection with concrete availability data). | | [Monitor Azure OpenAI](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/monitor-openai) | configuration | 0.65 | Monitoring article for a specific service usually lists metric names, log categories, and diagnostic settings unique to Azure OpenAI and Azure Monitor integration. | | [Monitor prompt flow deployments](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/monitor-quality-safety) | configuration | 0.65 | Monitoring article is expected to describe specific metrics, dashboards, and configuration options for tracking token usage and quality in Foundry. | @@ -389,12 +406,15 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [Plan and manage costs for hubs](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/costs-plan-manage) | decision-making | 0.65 | Cost planning article uses Azure pricing calculator and cost analysis; likely includes guidance on which resources drive cost and how to structure deployments for cost control. 
| | [Plan rollout](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/planning) | decision-making | 0.65 | The planning guide outlines concrete rollout decisions for environment setup, data isolation, governance, capacity management, and monitoring when adopting Foundry across an organization. While the summary is high level, rollout-planning content of this type typically includes product-specific decision criteria and trade-offs (for example, how to segment environments, isolate data, and integrate with other Azure services). These go beyond generic concepts and help choose between options, fitting the decision-making category. | | [Preference fine-tuning](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/fine-tuning-direct-preference-optimization) | configuration | 0.65 | Direct preference optimization (DPO) as implemented in Foundry; likely includes dataset format, parameter settings, and training configuration specific to this service. | +| [Process images in a flow](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-process-image) | integrations | 0.65 | Image processing in Prompt Flow will require specifying tool parameters (e.g., vision model selection, input formats, fields) that are product-specific integration details for Azure OpenAI vision models inside Prompt Flow. | | [Prompt shields](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/content-filter-prompt-shields) | configuration | 0.65 | Explains Prompt Shields, how to enable them in content filter configuration, and describes returned annotations (detected/filtered flags). These are product-specific configuration and response fields, fitting configuration/integrations more than generic concepts. 
| +| [Prompt tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/prompt-tool) | integrations | 0.65 | The Prompt tool article is expected to define fields, variable bindings, and configuration options unique to Prompt Flow nodes, which are concrete integration/config patterns rather than generic concepts. | | [Prompt transformations](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/prompt-transformation) | security | 0.65 | Explains prompt transformation for Azure OpenAI image models, including how and why prompts are transformed for safety/compliance. This is a product-specific safety mechanism tied to content moderation and responsible use, which aligns best with security-focused configuration/behavior rather than generic concepts. | | [Protected material detection](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/content-filter-protected-material) | integrations | 0.65 | Explains how the protected material detection filter scans LLM outputs for copyrighted text/code. Likely includes filter configuration options and behavior specific to Foundry’s Guardrails, which are integration/configuration details. | | [Quickstart: Get started with Microsoft Foundry using a hub](https://learn.microsoft.com/en-us/azure/foundry-classic/quickstarts/hub-get-started-code) | integrations | 0.65 | Quickstart for SDK, deployment, and chat app will include concrete API endpoints, SDK methods, and parameter usage specific to Foundry hub projects, which are integration/coding patterns. | | [Reproducible output](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/reproducible-output) | configuration | 0.65 | Preview feature for determinism likely documents specific configuration parameters (e.g., seed or related flags) and their allowed values, which are product-specific settings. 
| | [Risks & Safety Monitoring](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/risks-safety-monitor) | troubleshooting | 0.65 | Risks & Safety monitoring dashboard article likely includes guidance on interpreting metrics, diagnosing filter behavior, and adjusting configuration based on observed issues, fitting a troubleshooting pattern. | +| [Serp API tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/serp-api-tool) | integrations | 0.65 | Serp API tool documentation will include API key configuration, query parameters, and response handling fields specific to Prompt Flow, which are integration details not covered by generic knowledge. | | [Use your own resources](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/use-your-own-resources) | configuration | 0.65 | Describes connecting your own Azure resources (storage, CMK, network isolation) instead of managed storage. This typically involves specific resource types, configuration parameters, and wiring patterns unique to Foundry Agent Service, fitting configuration-focused expert knowledge. | | [Vision fine-tuning](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/fine-tuning-vision) | best-practices | 0.65 | The page covers dataset requirements, image formats, and best practices for vision fine-tuning of GPT-4o and GPT-4.1 in Azure/OpenAI via Foundry classic. Such guidance typically includes product-specific constraints (supported formats, size/ratio limits, labeling structure) and concrete recommendations unique to this service’s fine-tuning pipeline, which fits the best-practices category and constitutes expert knowledge beyond generic LLM training. 
| | [Vision-enabled chats](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/gpt-with-vision) | integrations | 0.65 | How-to for vision-enabled chat models typically documents request formats, image input parameters, and model-specific options for Azure OpenAI, which are concrete integration details not captured by generic LLM training. | @@ -402,12 +422,14 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [Create a hub using the Azure Machine Learning SDK and CLI](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/create-hub-project-sdk) | configuration | 0.64 | The article uses Azure ML SDK and CLI extensions to create Foundry hubs and related resources, which typically involves specific parameter names, CLI flags, and configuration objects unique to this integration. That aligns best with configuration (detailed command/parameter usage) rather than generic how-to content. | | [Overview](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/tools-classic/browser-automation) | security | 0.64 | Browser Automation has significant security implications; the article likely includes concrete security configuration, restrictions, and safe-use patterns specific to this tool. | | [Stored completions](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/stored-completions) | integrations | 0.64 | Stored completions and distillation are Azure OpenAI–specific capabilities. A dedicated 'how to' article will typically document API/SDK parameters, dataset formats, and workflow-specific options for capturing and reusing completions, which are detailed integration patterns not generally known from training data. 
| -| [Business continuity & disaster recovery (BCDR)](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/business-continuity-disaster-recovery) | architecture-patterns | 0.62 | BCDR guidance for a specific service often includes region selection patterns, failover strategies, and possibly thresholds for multi-region design; this is architecture-specific decision guidance. | | [Create a hub from template](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/create-azure-ai-hub-template) | deployment | 0.62 | This page describes a specific Bicep template that creates a Foundry hub and its required dependent resources. It encodes product-specific deployment structure and required resources/relationships for hub-based projects, which is specialized deployment knowledge beyond generic Bicep usage. | +| [Retired Foundry Models](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/retired-models) | decision-making | 0.62 | A list of retired models and their alternatives is specific, time-sensitive product data. It guides users in choosing replacement models and planning migrations away from unavailable ones, which fits decision-making around model selection and upgrade paths. | +| [Content Safety tool](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/content-safety-tool) | security | 0.60 | Content Safety tool documentation typically includes category flags, severity thresholds, and configuration parameters for safety evaluation, which are product-specific security/safety settings rather than generic advice. | | [Content credentials](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/content-credentials) | security | 0.60 | Describes how Content Credentials are applied to AI-generated images and the underlying open standard; likely includes product-specific behavior and requirements for verification, which are security/compliance related. 
| | [Content filter annotations](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/content-filter-annotations) | integrations | 0.60 | Details annotation fields and how they map to content filtering results, including severity levels and optional models. These are specific response schema details for the product’s APIs. | | [Content streaming](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/content-streaming) | decision-making | 0.60 | Explains default vs asynchronous filtering modes, their impact on latency and performance, and when to use each. This is product-specific trade-off guidance for selecting streaming options, fitting decision-making. | | [Data, privacy, and security](https://learn.microsoft.com/en-us/azure/foundry-classic/responsible-ai/openai/data-privacy) | security | 0.60 | Details how data is processed, used, and stored for Azure Direct Models; includes product-specific data retention and security behavior not generally known. | +| [Develop an evaluation flow in Prompt flow](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-develop-evaluation) | best-practices | 0.60 | Creating evaluation flows tailored to tasks implies concrete guidance on structuring flows, using specific evaluation tools, and wiring metrics in Prompt Flow, which constitutes product-specific do/don't guidance rather than generic theory. | | [Evaluation](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/evaluations) | configuration | 0.60 | Evaluation how-to for Foundry Models likely includes specific configuration options, evaluation job parameters, and possibly metrics configuration unique to this service. 
| | [Getting started with Assistants](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/assistant) | integrations | 0.60 | How-to for creating Assistants with tools like Code Interpreter; likely includes API parameters, request schemas, and configuration options specific to the Assistants feature. | | [Part 2: Build with data retrieval](https://learn.microsoft.com/en-us/azure/foundry-classic/tutorials/copilot-sdk-build-rag) | integrations | 0.60 | Tutorial for building a RAG app; likely includes concrete SDK usage patterns, client configuration, and request options specific to the Foundry SDK. | @@ -432,7 +454,6 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [Create a hub project](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/hub-create-projects) | 0.40 | Creating a hub project is likely a step-by-step portal workflow without detailed configuration tables, limits, or product-specific troubleshooting; mostly procedural getting-started content. | | [Deploy an enterprise chat web app](https://learn.microsoft.com/en-us/azure/foundry-classic/tutorials/deploy-chat-web-app) | 0.40 | Tutorial-style deployment walkthrough for a chat web app; likely step-by-step UI and basic config without detailed parameter tables, limits, or product-specific edge cases. | | [Deploy and use web apps](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/use-web-app) | 0.40 | Web app usage article is a sample UI walkthrough; summary suggests step-by-step usage rather than enumerating reusable configuration matrices or product-specific limits. | -| [Develop an evaluation flow in Prompt flow](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-develop-evaluation) | 0.40 | Developing an evaluation flow is mostly design and workflow; summary does not show concrete configuration matrices or limits. 
| | [Get started with model router](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/model-router) | 0.40 | “How to use model router” sounds procedural; the summary references advantages and limitations but not specific numeric limits, configuration tables, or error-code-based troubleshooting. | | [Getting started with embeddings](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/tutorials/embeddings) | 0.40 | Tutorial-style walkthrough for document search; likely focuses on step-by-step usage rather than enumerating reusable configuration matrices or parameter tables. | | [High availability and disaster recovery for hubs](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/hub-disaster-recovery) | 0.40 | High availability and disaster recovery planning article is likely conceptual guidance and patterns without clear evidence of numeric thresholds, decision matrices, or product-specific configuration tables in the summary. | @@ -445,6 +466,7 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [Submit batch run and evaluate a flow](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-bulk-test-evaluation) | 0.40 | Batch run and evaluation how-to; likely procedural without explicit config tables or numeric thresholds in the summary. | | [Supported programming languages and SDKs](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/supported-languages) | 0.40 | Likely a simple list of supported programming languages; useful but not a configuration, limit, or troubleshooting guide. | | [Threads, runs, and messages](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/concepts/threads-runs-messages) | 0.40 | Conceptual explanation of threads, runs, and messages; summary doesn’t indicate numeric limits, config tables, or troubleshooting mappings. 
| +| [Tune prompts using variants](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-tune-prompts-using-variants) | 0.40 | Tuning prompts with variants is mostly methodology; summary does not indicate product-specific configuration tables, limits, or error mappings beyond generic prompt engineering concepts. | | [Upgrade from GitHub Models to Microsoft Foundry Models](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/quickstart-github-models) | 0.40 | Migration-style quickstart from GitHub Models to Foundry Models; summary suggests tutorial/flow guidance, not detailed configuration tables, limits, or decision matrices. | | [View evaluation results in the portal](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/evaluate-results) | 0.40 | Focuses on viewing and interpreting evaluation results in the portal; likely UI-driven without deep configuration or numeric limits. | | [What's new](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/whats-new) | 0.40 | Release notes and high-level feature updates; useful but not a structured limits, configuration, or troubleshooting reference. | @@ -460,8 +482,6 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [Assistants](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/assistants) | 0.30 | Concepts article for Assistants API; primarily conceptual description of assistants and tools, not focused on detailed configuration tables or error mappings. | | [Azure AI Content Safety in Foundry portal](https://learn.microsoft.com/en-us/azure/foundry-classic/ai-services/content-safety-overview) | 0.30 | Content Safety overview describes what the service does and mentions a try-it experience, but the summary does not indicate detailed configuration tables, limits, or error mappings. 
| | [Configure Models from Partners and Community](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/concepts/models-from-partners) | 0.30 | Duplicate of index 2: partner and community models listing with capabilities; no indication of detailed limits, configuration parameters, or troubleshooting mappings. | -| [Create a flow](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-develop) | 0.30 | General how-to for building with prompt flow; summary does not indicate detailed configuration tables, limits, or troubleshooting content. | -| [Data, privacy, and security for Claude models in Microsoft Foundry](https://learn.microsoft.com/en-us/azure/foundry-classic/responsible-ai/claude-models/data-privacy) | 0.30 | Focuses on how data is processed, used, and stored for Anthropic Claude models in Foundry. Likely high-level privacy and data-handling description without concrete RBAC roles, config parameters, or limits; more policy/overview than actionable configuration, so no sub-skill assigned. | | [Deploy models to managed compute pay-as-you-go](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/deploy-models-managed-pay-go) | 0.30 | Appears to be a how-to deployment guide for Foundry models with pay-as-you-go billing. From the summary, it focuses on describing model categories and deployment to managed compute, but there's no indication of detailed configuration tables, limits, error codes, or other product-specific expert parameters. Likely a procedural tutorial rather than a reference of expert knowledge as defined. | | [Embeddings basics](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/embeddings) | 0.30 | Embeddings how-to summary appears largely conceptual (what embeddings are, general usage). Without clear indication of product-specific configuration tables, limits, or unique patterns, it is likely generic knowledge an LLM already has. 
| | [Fine-tune models deployed via managed compute](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/fine-tune-managed-compute) | 0.30 | The description and summary indicate a step-by-step how-to for deploying fine-tuned models with managed compute. It reads as a procedural tutorial rather than a reference of limits, configuration matrices, or detailed deployment constraints. Without clear evidence of parameter tables, quotas, or SKU-specific constraints, it does not meet the expert-knowledge criteria for any sub-skill type. | @@ -473,17 +493,15 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [Observability overview](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/observability) | 0.30 | Conceptual overview of observability and evaluation; description suggests lifecycle concepts rather than concrete error codes, limits, or config tables. | | [Overview](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/model-router) | 0.30 | Concepts page describing what the model router is and its benefits; summary does not indicate detailed limits, configs, or error mappings. | | [Overview](https://learn.microsoft.com/en-us/azure/foundry-classic/responsible-ai/openai/overview) | 0.30 | Responsible AI overview; primarily conceptual practices and challenges without concrete configuration or limits indicated in the summary. | -| [Process images in a flow](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-process-image) | 0.30 | Processing images in prompt flow is a feature how-to; summary does not indicate detailed configuration parameters or numeric constraints. | +| [Prompt flow tools overview](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/prompt-flow-tools-overview) | 0.30 | An overview of Prompt Flow tools is likely descriptive and navigational, listing tools without deep configuration tables or error mappings. 
| | [Quickstart](https://learn.microsoft.com/en-us/azure/foundry-classic/quickstarts/get-started-code) | 0.30 | Quickstart tutorial for SDK usage; no detailed configuration tables, limits, or troubleshooting content. | | [Retrieval Augmented Generation (RAG) overview](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/retrieval-augmented-generation) | 0.30 | Conceptual explanation of RAG and indexes; summary suggests high-level pattern description without concrete configuration tables, limits, or product-specific parameters. | | [Transparency note](https://learn.microsoft.com/en-us/azure/foundry-classic/responsible-ai/openai/transparency-note) | 0.30 | Transparency note is typically high-level usage and risk information; summary doesn’t indicate specific config parameters, limits, or error mappings. | | [Trustworthy AI overview](https://learn.microsoft.com/en-us/azure/foundry-classic/responsible-use-of-ai-overview) | 0.30 | Responsible AI overview is primarily conceptual and navigational, summarizing resources and standards without specific configuration parameters, limits, or error mappings. | -| [Tune prompts using variants](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-tune-prompts-using-variants) | 0.30 | Tuning prompts using variants appears to be workflow guidance; summary does not suggest product-specific limits, config tables, or error mappings. | | [Video generation (preview)](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/video-generation) | 0.30 | Conceptual overview of Sora capabilities, safety, and limitations; no clear indication of parameter tables, limits, or configuration matrices. | | [What is Foundry Agent Service?](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/overview) | 0.30 | Conceptual overview of Foundry Agent Service; primarily explains what it is and why, not detailed configs or limits. 
|
 | [Customizing Large Language Models (LLMs)](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/customizing-llms) | 0.25 | Getting-started concepts for customizing LLMs (prompting, RAG, fine-tuning); largely general techniques rather than product-specific configuration or limits. |
 | [Embeddings](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/understand-embeddings) | 0.25 | Conceptual explanation of embeddings and cosine similarity; generic ML knowledge without Azure-specific configuration or limits. |
-| [Explore Foundry Models](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/foundry-models-overview) | 0.25 | Models overview and catalog description; marketing/conceptual without detailed limits, configs, or decision matrices. |
 | [Red teaming large language models (LLMs)](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/red-teaming) | 0.25 | Red teaming planning guide is process-oriented and conceptual, not product-specific configuration, limits, or error-code troubleshooting. |
 | [Use the chat playground](https://learn.microsoft.com/en-us/azure/foundry-classic/quickstarts/get-started-playground) | 0.25 | A quickstart for using the chat playground UI is primarily a step-by-step tutorial and conceptual exploration of model capabilities, without clear indication of limits, configuration tables, or product-specific diagnostic or integration details. |
 | [Ask AI](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/ask-ai) | 0.20 | Usage description of Ask AI chat in the Foundry portal; no limits, configuration tables, error codes, or product-specific numeric details. |
@@ -491,17 +509,17 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft
 | [Create using Bicep template](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/create-resource-template) | 0.20 | This is a quickstart showing how to deploy a Foundry resource and project using a Bicep template. From the summary it appears to be a step-by-step tutorial without configuration parameter tables, limits, or product-specific constraints; it focuses on basic deployment mechanics that an LLM can already generalize, so it does not meet the expert-knowledge criteria for any sub-skill type. |
 | [Fine-tuning cost management](https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/fine-tuning-cost-management) | 0.20 | Fine-tuning cost article uses example numbers only and explicitly defers to the pricing page; no authoritative limits, quotas, or product-specific configuration values are defined. |
 | [Part 1: Set up project and install SDK](https://learn.microsoft.com/en-us/azure/foundry-classic/tutorials/copilot-sdk-create-resources) | 0.20 | Part of a step-by-step tutorial to create resources for a RAG app; likely focuses on guided creation steps rather than detailed configuration matrices, limits, or troubleshooting mappings. |
-| [Prompt flow tools overview](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/prompt-flow-tools-overview) | 0.20 | Overview of tools is descriptive; overviews typically list capabilities but not detailed configuration parameters or limits. |
+| [Prompt flow overview](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/prompt-flow) | 0.20 | Conceptual introduction to Prompt Flow in Foundry classic; summary suggests overview and retirement notice without detailed limits, configs, or troubleshooting matrices. |
 | [Set up your development environment](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/install-cli-sdk) | 0.20 | Environment setup/install guide for runtimes and CLIs; no indication of detailed configuration tables, limits, or product-specific parameters beyond generic installation steps. |
 | [Status dashboard](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-status-dashboard-documentation) | 0.20 | A status dashboard overview for monitoring service health and incidents is typically conceptual/UX-focused (how to read the dashboard) and does not usually include detailed limits, configuration parameters, or troubleshooting mappings with specific error codes. |
 | [Transparency Note for safety evaluations](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/safety-evaluations-transparency-note) | 0.20 | The transparency note describes purpose, capabilities, limitations, and usage guidance for safety evaluations at a conceptual level. From the available summary, it doesn't indicate specific numeric limits, configuration parameters, error codes, or decision matrices with quantified trade-offs. It appears to be high-level risk/safety documentation rather than detailed configuration, limits, or troubleshooting content. |
 | [What is Microsoft Foundry (classic) portal?](https://learn.microsoft.com/en-us/azure/foundry-classic/what-is-foundry) | 0.20 | High-level overview of Microsoft Foundry (classic) with conceptual description of the portal and APIs; no detailed limits, configuration tables, error codes, or decision matrices. |
 | [What's new](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/whats-new) | 0.20 | What's new/change log; mostly release notes and marketing-style updates, not structured expert knowledge per the categories. |
 | [What's new](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/whats-new-model-router) | 0.20 | “What’s new” changelog-style page; typically release notes and feature announcements rather than structured limits, configs, or troubleshooting mappings. |
+| [Explore Foundry Models](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/foundry-models-overview) | 0.10 | Described as an introduction to Foundry Models and the model catalog; this is a conceptual/marketing-style overview without indication of concrete limits, configs, or decision matrices. |
 | [FAQ](https://learn.microsoft.com/en-us/azure/foundry-classic/agents/faq) | 0.10 | Agent Service FAQ description mentions setup, data handling, pricing, and networking but with no evidence of specific limits, config parameters, or error codes in the summary. Likely general conceptual/FAQ content. |
 | [Foundry Models FAQ](https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/faq) | 0.10 | FAQ summary is high level; no indication of numeric limits, config tables, error codes, or other detailed product-specific guidance. Likely general Q&A and support pointers. |
 | [Hubs overview](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/ai-resources) | 0.10 | The article is an overview of hubs and hub-based projects in Foundry (classic), describing concepts and compatibility notes. The summary indicates conceptual explanation and legacy support context, without evidence of numeric limits, configuration parameter tables, error codes, or decision matrices. It reads as a conceptual overview rather than expert configuration, troubleshooting, or limits content. |
-| [Prompt flow overview](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/prompt-flow) | 0.10 | Conceptual introduction to prompt flow; no indication of numeric limits, configuration tables, or error mappings. |
 | [Service architecture](https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/architecture) | 0.10 | Described as an architecture overview of Microsoft Foundry’s layered model (resource, projects, connected services). This is conceptual service architecture, not a product-specific decision matrix with thresholds or quantified trade-offs, and it doesn’t indicate limits, configs, or troubleshooting details. |
 | [Start with an AI template](https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/ai-template-get-started) | 0.10 | Describes how to find and deploy AI solution templates; appears to be a usage/overview/tutorial flow without specific limits, configuration tables, or error-code-based troubleshooting. |
 | [Use Microsoft Foundry with a screen reader](https://learn.microsoft.com/en-us/azure/foundry-classic/tutorials/screen-reader) | 0.10 | Accessibility/navigation guidance for using a screen reader with the Foundry portal; it describes UI structure and how to navigate, not product-specific limits, configuration parameters, APIs, or troubleshooting details. This is not the kind of expert technical knowledge targeted by the sub-skill types. |
diff --git a/products/microsoft-foundry-tools/microsoft-foundry-tools.csv b/products/microsoft-foundry-tools/microsoft-foundry-tools.csv
index a97e65a4..937b7add 100644
--- a/products/microsoft-foundry-tools/microsoft-foundry-tools.csv
+++ b/products/microsoft-foundry-tools/microsoft-foundry-tools.csv
@@ -17,35 +17,35 @@ https://learn.microsoft.com/en-us/azure/ai-services/content-moderator/text-moder
https://learn.microsoft.com/en-us/azure/ai-services/content-moderator/video-moderation-api,Check video content,Analyze video content for objectionable material in C# - Content Moderator - Azure AI services,Moderate video content using Content Moderator .NET SDK,How to analyze video content for various objectionable material using the Content Moderator SDK for .NET,"This article provides information and code samples to help you get started using the Content Moderator SDK for .NET to scan video content for adult or racy content. If you don't have an Azure subscription, create a free account before you begin.",2025-06-12T08:00:00.000Z,how-to,integrations,0.7,True,"Provides code and parameter usage for video moderation APIs, which are product-specific integration patterns.",unchanged
https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/audio/overview,Audio Overview,Azure Content Understanding in Foundry Tools audio overview - Foundry Tools,,Learn about Azure Content Understanding in Foundry Tools audio solutions,"Audio analyzers enable transcription and diarization for conversational audio and extract structured fields such as summaries, sentiment, and key topics. Customize an audio analyzer template for your needs in the Foundry portal to start generating results.
Here are common scenarios for conversational audio data processing:",2026-01-31T06:05:00.000Z,overview,,0.4,False,"Audio overview; describes scenarios and high-level capabilities, not specific configuration parameters or quotas.",unchanged
https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/choosing-right-ai-tool,Choose the right tool for document processing,Choose the right Azure AI tool for document processing - Foundry Tools,Choose Azure AI tool for document processing,"Learn about Azure Content Understanding in Foundry Tools, Azure Document Intelligence in Foundry Tools and Azure large language model (LLM) solutions, processes, workflows, use-cases, and field extrac","As organizations increasingly use Generative AI to manage documents and unstructured data, it's essential to select the right tool for building robust, secure, and scalable document processing workflows. This is a comparative overview of the leading Azure AI solutions for intelligent document processing (IDP) to help you evaluate and choose the most effective approach for your business requirements. This article compares the following options:",2026-03-23T08:00:00.000Z,overview,decision-making,0.7,True,"Comparative guidance between Content Understanding, Document Intelligence, and LLM solutions for intelligent document processing; described as a comparative overview to help evaluate and choose the most effective approach, which fits decision-making criteria even though specific numbers aren’t visible in the summary.",unchanged
-https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/analyzer-reference,What are analyzers?,Azure Content Understanding in Foundry Tools - What is an analyzer? Configuration and reference - Foundry Tools,Configure Content Understanding analyzers and parameters,"Learn about Azure Content Understanding in Foundry Tools analyzers, how to configure them, and the parameters you can set when creating custom analyzers.","An analyzer in Azure Content Understanding in Foundry Tools is a configurable processing unit that defines how your content is analyzed and what information is extracted. An analyzer defines: Analyzers are the core building blocks of Content Understanding. They combine content extraction, AI-powered analysis, and structured data output into a single, reusable configuration. You can use prebuilt analyzers for common scenarios or create custom analyzers tailored to your specific needs. Content Under",2026-02-01T12:03:00.000Z,overview,configuration,0.85,True,"Analyzer reference explicitly covers configuration and parameters; likely includes setting names, allowed values, and behavior, which are core configuration knowledge.",unchanged
+https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/analyzer-reference,What are analyzers?,Azure Content Understanding in Foundry Tools - What is an analyzer? Configuration and reference - Foundry Tools,Configure and reference Content Understanding analyzers,"Learn about Azure Content Understanding in Foundry Tools analyzers, how to configure them, and the parameters you can set when creating custom analyzers.","An analyzer in Azure Content Understanding in Foundry Tools is a configurable processing unit that defines how your content is analyzed and what information is extracted. An analyzer defines: Analyzers are the core building blocks of Content Understanding. They combine content extraction, AI-powered analysis, and structured data output into a single, reusable configuration. You can use prebuilt analyzers for common scenarios or create custom analyzers tailored to your specific needs. Content Under",2026-04-21T11:03:00.000Z,overview,configuration,0.8,True,"Analyzer reference and configuration article; likely includes parameter names, allowed values, and configuration details for custom analyzers, which are product-specific configuration knowledge.",updated
https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/best-practices,Best practices,Best practices for using Content Understanding - Foundry Tools,Apply best practices for Content Understanding accuracy,"Learn how to best use Azure Content Understanding in Foundry Tools for document, image, video, and audio file content and field extractions.","Azure Content Understanding in Foundry Tools uses generative AI to process documents, images, videos, and audio, transforming them into structured output formats. This article provides best practices to maximize accuracy and efficiency.",2026-01-29T08:00:00.000Z,overview,best-practices,0.8,True,"Explicit best practices article; expected to include product-specific recommendations, configuration tips, and edge cases for maximizing accuracy and efficiency.",unchanged
https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/classifier,What are classifiers?,Azure Content Understanding in Foundry Tools Classifier Overview - Foundry Tools,,Learn about Azure Content Understanding in Foundry Tools classifier solutions.,Content Understanding lets you implement classification and splitting as part of the analyzer operation request. You can perform content classification and content extraction as part of a single API call. The global concept of analyzer now includes the concept of contentCategories and enableSegment to classify and split the input data you process within your application. This analyzer feature can perform classification of an input file as a whole.
It can also identify multiple documents or multiple in,2026-03-23T08:00:00.000Z,overview,,0.3,False,"Classifier overview describing concepts like analyzer, contentCategories, and enableSegment; appears conceptual without detailed configuration tables, numeric thresholds, or error mappings.",unchanged
-https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/models-deployments,Foundry model deployments,Model deployment options for Content Understanding analyzers - Foundry Tools,Map Content Understanding analyzers to model deployments,"Learn how Content Understanding maps analyzer models to Foundry deployments, and how defaults and request-level overrides interact.","Azure Content Understanding in Foundry Tools uses your Foundry model deployments for all operations that require a generative AI model. This approach helps you maximize provisioned capacity and consolidate capacity into fewer deployments, if needed. You can also choose the model that best fits your scenario for price and latency. You're billed for all tokens (input and output) processed by the connected deployment, and Content Understanding only bills you for Content Understanding-specific meter",2026-03-17T06:03:00.000Z,concept-article,architecture-patterns,0.65,True,"Explains how analyzer models map to Foundry deployments, defaults, and overrides; this is a product-specific design/architecture pattern for capacity and model selection.",unchanged
+https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/models-deployments,Foundry model deployments,Model deployment options for Content Understanding analyzers - Foundry Tools,Choose model deployment options for analyzers,"Learn how Content Understanding maps analyzer models to Foundry deployments, and how defaults and request-level overrides interact.","Azure Content Understanding in Foundry Tools uses your Foundry model deployments for all operations that require a generative AI model. This approach helps you maximize provisioned capacity and consolidate capacity into fewer deployments, if needed. You can also choose the model that best fits your scenario for price and latency. You're billed for all tokens (input and output) processed by the connected deployment, and Content Understanding only bills you for Content Understanding-specific meter",2026-04-21T11:03:00.000Z,concept-article,decision-making,0.65,True,"Describes how Content Understanding maps analyzers to Foundry deployments, including defaults and overrides to balance price, latency, and capacity; this is product-specific guidance for selecting deployment options and trade-offs.",updated
https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/prebuilt-analyzers,Prebuilt analyzers,Azure Content Understanding in Foundry Tools prebuilt analyzers - Foundry Tools,Use and customize Content Understanding prebuilt analyzers,"Learn about prebuilt analyzers, base analyzers, RAG analyzers, vertical analyzers, and how to use and customize them in Azure Content Understanding in Foundry Tools.","Azure Content Understanding prebuilt analyzers provide a set of domain-specific extraction capabilities that go beyond predefined schemas. They're powered by knowledge bases of real-world document examples. They understand how information is structured and used, adapting to the nuances of each content type. Prebuilt analyzers are ready-to-use tools that streamline common content processing tasks.
You can use them for content ingestion in search and retrieval-augmented generation (RAG) workflows.",2026-01-29T08:00:00.000Z,overview,configuration,0.7,True,"Describes prebuilt analyzers and how to customize them; likely includes analyzer names, options, and configuration fields specific to this service.",unchanged
https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/secure-communications,Security features,Secure your Content Understanding analyzers and data - Foundry Tools,Secure Content Understanding with keys and identities,Configure security features such as customer-managed keys and managed identities to secure your data and applications.,Content Understanding is part of Microsoft Foundry and provides the same security and privacy features available to all Foundry services. These capabilities help protect your data and ensure compliance with Microsoft security standards.,2026-02-25T02:49:00.000Z,concept-article,security,0.7,True,Describes configuring customer-managed keys and managed identities; these involve specific security configuration parameters and scopes unique to the service.,unchanged
-https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/standard-pro-modes,Modes: standard and pro (Preview),Azure Content Understanding in Foundry Tools standard and pro modes (preview) - Foundry Tools,,Learn about Azure Content Understanding in Foundry Tools standard and pro modes.,"Note Content Understanding APIs versions 2024-12-01-preview and 2025-05-01-preview are currently in public preview. These previews are provided without a service-level agreement and are not recommended for production workloads. For more information, see Supplemental Terms of Use for Microsoft Azure Previews and the Microsoft Products and Services Data Protection Addendum (""DPA""). Azure Content Understanding in Foundry Tools is a generative AI service designed to derive structured insights from multimoda",2026-04-13T17:17:00.000Z,overview,,0.3,False,"Described as a conceptual explanation of standard vs pro modes; without evidence of numeric thresholds, decision matrices, or config tables, it appears to be an overview rather than expert-knowledge guidance for any specific sub-skill type.",updated
+https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/standard-pro-modes,Modes: standard and pro (Preview),Azure Content Understanding in Foundry Tools standard and pro modes (preview) - Foundry Tools,,Learn about Azure Content Understanding in Foundry Tools standard and pro modes.,"Note Content Understanding APIs versions 2024-12-01-preview and 2025-05-01-preview are currently in public preview. These previews are provided without a service-level agreement and are not recommended for production workloads. For more information, see Supplemental Terms of Use for Microsoft Azure Previews and the Microsoft Products and Services Data Protection Addendum (""DPA""). Azure Content Understanding in Foundry Tools is a generative AI service designed to derive structured insights from multimoda",2026-04-13T17:17:00.000Z,overview,,0.3,False,"Described as a conceptual explanation of standard vs pro modes; without evidence of numeric thresholds, decision matrices, or config tables, it appears to be an overview rather than expert-knowledge guidance for any specific sub-skill type.",unchanged
https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/document/analyzer-improvement,Analyzer Improvement,"Document analysis with confidence, grounding, and labeled samples - Foundry Tools",Improve document extraction with confidence and grounding,Learn about Azure Content Understanding in Foundry Tools features that can improve extraction quality and performance.,"Processing unstructured documents (such as contracts and statements of work) or structured documents (such as invoices and insurance forms) is critical for intelligent document processing (IDP) workflows and retrieval-augmented generation (RAG) scenarios. Extracting data reliably at scale requires more than text extraction. For high-quality automation, you often need to know what was extracted, where it came from, whether it matches your intent, and how reliable the extraction is.
Most enterpris",2026-01-31T06:05:00.000Z,overview,best-practices,0.7,True,"Focuses on improving extraction quality and performance using specific features (confidence, grounding, labeled samples); likely includes concrete usage patterns and recommendations.",unchanged
https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/document/elements,Elements,Document Analysis: Extract Structured Content with Azure Content Understanding in Foundry Tools - Foundry Tools,Configure document layout analysis with Content Understanding,Learn about Azure Content Understanding in Foundry Tools document layout analysis and data extraction capabilities.,,2026-01-29T08:00:00.000Z,overview,configuration,0.6,True,Document layout analysis and extraction capabilities typically involve specifying schemas and options; likely includes configuration fields for document analyzers.,unchanged
https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/document/markdown,Markdown,Document analysis: Markdown representation - Foundry Tools,Consume Content Understanding document Markdown output,Description of the supported Markdown elements returned as part of the Content Understanding Document response and how to use the response in your applications.,"Azure Content Understanding in Foundry Tools converts unstructured documents into GitHub Flavored Markdown, while maintaining content and layout for accurate downstream use. This article describes how each content and layout element is represented in Markdown.",2026-01-29T08:00:00.000Z,concept-article,integrations,0.75,True,Describes exact Markdown elements returned and how they map to document structure; this is output schema/integration detail for downstream applications.,unchanged
https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/document/overview,Overview,Azure Content Understanding in Foundry Tools Document Overview - Foundry Tools,,Learn about Azure Content Understanding in Foundry Tools document solutions.,"Content Understanding offers sophisticated document analysis capabilities. Organizations can use these capabilities to convert unstructured content into actionable and organized data. Content Understanding can use customizable analyzers to expertly extract essential information, fields, and relationships from a diverse range of documents and forms.",2026-03-27T06:03:00.000Z,overview,,0.2,False,"Document overview describing capabilities and use cases for document analysis; no indication of limits, configuration parameters, or decision matrices.",unchanged
https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/face/overview,What is face detection? (preview),Azure Content Understanding in Foundry Tools face overview (preview) - Foundry Tools,Configure face detection and recognition in Content Understanding,Learn about Azure Content Understanding in Foundry Tools face solutions.,"Important Azure Content Understanding in Foundry Tools is available in preview. Preview releases provide early access to features that are in active development. Features, approaches, and processes can change or have limited capabilities before general availability (GA). For more information, see Supplemental Terms of Use for Microsoft Azure Previews.
Azure Content Understanding provides a cloud-based solution for face detection, enrollment, and recognition, enabling secure and intelligent applic",2026-01-31T06:05:00.000Z,overview,configuration,0.6,True,Face overview for detection/enrollment/recognition typically includes specific configuration options and parameters for face-related operations.,unchanged
https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/faq,FAQ,Azure Content Understanding in Foundry Tools FAQ,,Get answers to frequently asked questions about the Document Intelligence service.,Find answers to commonly asked questions about Azure Content Understanding,2026-04-01T06:05:00Z,faq,,0.3,False,"FAQ page likely mixes general Q&A; summary does not indicate structured troubleshooting (error codes, diagnostic steps) or other expert-knowledge patterns required by the defined sub-skills.",unchanged
-https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/foundry-vs-content-understanding-studio,Foundry and Content Understanding Studio comparison,Feature comparison for Content Understanding in Foundry and Content Understanding Studio - Foundry Tools,Choose between Foundry and Content Understanding Studio features,Learn about the available options in Foundry and Content Understanding Studio.,"Note Content Understanding APIs versions 2024-12-01-preview and 2025-05-01-preview are currently in public preview. These previews are provided without a service-level agreement and are not recommended for production workloads. For more information, see Supplemental Terms of Use for Microsoft Azure Previews and the Microsoft Products and Services Data Protection Addendum (""DPA"").",2026-04-13T17:17:00.000Z,overview,decision-making,0.7,True,"A feature comparison page between Foundry and Content Understanding Studio will contain a comparison table of capabilities by option, guiding users on which environment to choose for different needs, which fits the decision-making category.",updated
+https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/foundry-vs-content-understanding-studio,Foundry and Content Understanding Studio comparison,Feature comparison for Content Understanding in Foundry and Content Understanding Studio - Foundry Tools,Choose between Foundry and Content Understanding Studio features,Learn about the available options in Foundry and Content Understanding Studio.,"Note Content Understanding APIs versions 2024-12-01-preview and 2025-05-01-preview are currently in public preview. These previews are provided without a service-level agreement and are not recommended for production workloads. For more information, see Supplemental Terms of Use for Microsoft Azure Previews and the Microsoft Products and Services Data Protection Addendum (""DPA"").",2026-04-13T17:17:00.000Z,overview,decision-making,0.7,True,"A feature comparison page between Foundry and Content Understanding Studio will contain a comparison table of capabilities by option, guiding users on which environment to choose for different needs, which fits the decision-making category.",unchanged
https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/glossary,Glossary,Content Understanding Glossary - Foundry Tools,,A quick reference for Content Understanding terms and definitions.,Note Access to this page requires authorization. You can try signing in or changing directories. Access to this page requires authorization. You can try changing directories.,2026-01-31T06:05:00.000Z,glossary,,0.0,False,"Glossary of terms; definitions are conceptual, not configuration, limits, or troubleshooting patterns.",unchanged
-https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/how-to/classification-content-understanding-studio,Classifier tutorial - Split and route,Classify and route your data using Content Understanding Studio - Foundry Tools,Configure classification and routing in Content Understanding Studio,Learn about how to classify and route your data using Content Understanding Studio,"Content Understanding Studio enables you to create custom classification workflows that route your data to the right analyzer. With routing, you can send multiple data streams through the same pipeline and ensure your data is routed to the best analyzer.",2026-01-31T06:05:00.000Z,how-to,configuration,0.65,True,Describes creating classification workflows and routing data to analyzers; likely includes configuration fields and options for routing logic.,unchanged
+https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/how-to/classification-content-understanding-studio,Classifier tutorial - Split and route,Classify and route your data using Content Understanding - Foundry Tools,,Learn how to create classification workflows to categorize and route your data using Content Understanding Studio or the REST API.,"Content Understanding enables you to create custom classification workflows that categorize your content and route it to the right analyzer. With routing, you can send multiple data streams through the same pipeline and ensure your data is processed by the best analyzer for its type.
This guide walks you through two steps:",2026-04-21T11:03:00.000Z,how-to,,0.2,False,"How-to guide for creating classification workflows and routing in Content Understanding Studio; appears to be procedural/tutorial content without detailed limits, configuration parameter tables, error-code troubleshooting, or decision matrices. No clear evidence of product-specific expert details per the defined categories.",updated
https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/how-to/content-understanding-foundry-classic,Create CU Task in Foundry (classic) (Preview),Create Content Understanding Standard and Pro tasks in the Microsoft Foundry (classic) portal - Foundry Tools,Configure Standard and Pro tasks in Foundry classic,Utilize the Foundry (classic) portal to create Content Understanding custom tasks,"Suppose you have files of different types—such as documents, images, audio, or video—and you want to automatically extract key information from them. With Content Understanding, you can create a task to organize your data processing, define a field schema that specifies the information to extract or generate, and then build an analyzer. The analyzer becomes an API endpoint that you can integrate into your applications or workflows. This guide shows you how to use Content Understanding Standard a",2026-01-29T08:00:00.000Z,how-to,configuration,0.65,True,How-to for creating Standard and Pro tasks likely details task configuration options and fields specific to Content Understanding in Foundry classic.,unchanged
https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/how-to/copy-analyzers,Copy and back up analyzers,Copy custom analyzers - Foundry Tools,,Copy custom analyzers within a resource and across Azure resources.,"Every Content Understanding resource provides access to all prebuilt analyzers by default. For a complete list, see prebuilt analyzers. Custom analyzers are analyzers you define to process specific content where you can define the content type, schema, and any other processing logic. For more information on defining a custom analyzer, see defining a custom analyzer. The copy operation on analyzers supports a few different scenarios: Important The copy operation for copying across resources support",2026-04-02T22:14:00.000Z,how-to,,0.2,False,"Describes copying custom analyzers conceptually and supported scenarios; summary does not indicate detailed configuration parameters, limits, or error codes. Appears to be procedural guidance without deep expert-only details.",unchanged
https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/how-to/create-multi-service-resource,Create a Microsoft Foundry resource,Create a Microsoft Foundry resource - Foundry Tools,,Create and manage a Microsoft Foundry resource for Content Understanding operations,"To use Content Understanding, you need a Microsoft Foundry resource. Important The steps below explain how to create a resource for use with the REST API. To use Content Understanding in the Content Understanding Studio, see the Content Understanding Studio quickstart.",2026-03-31T08:00:00.000Z,how-to,,0.25,False,"Resource creation how-to for Microsoft Foundry; summary suggests step-by-step portal/API setup without detailed configuration parameter tables, limits, or security role mappings.",unchanged
https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/how-to/customize-analyzer-content-understanding-studio,Create a custom analyzer with Content Understanding Studio,Create and improve your custom analyzer in Content Understanding Studio - Foundry Tools,Build and refine custom analyzers in Content Understanding Studio,Create custom analyzers and apply in context learning to improve them using Content Understanding Studio,Content Understanding Studio lets you build content analyzers that extract content and fields tailored to your needs. Follow these steps to create a custom analyzer in Content Understanding Studio.,2026-02-01T12:03:00.000Z,how-to,configuration,0.7,True,"How-to for creating and improving custom analyzers; involves setting schemas, fields, and options, which are concrete configuration steps.",unchanged
-https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/how-to/migration-preview-to-ga,Migration from CU Preview to GA,Migrate from Azure Content Understanding in Foundry Tools Preview to GA - Foundry Tools,Migrate Azure Content Understanding preview APIs to GA,"Migrate from Azure Content Understanding in Foundry Tools Preview to GA, including API changes and best practices.","Note Content Understanding APIs versions 2024-12-01-preview and 2025-05-01-preview are currently in public preview. These previews are provided without a service-level agreement and are not recommended for production workloads. For more information, see Supplemental Terms of Use for Microsoft Azure Previews and the Microsoft Products and Services Data Protection Addendum (""DPA"").
The Azure Content Understanding API has reached general availability (GA). It introduces several new capabilities and updates",2026-04-13T17:17:00.000Z,how-to,best-practices,0.68,True,"Migration guides typically include product-specific API changes, parameter mappings, and concrete do/don't guidance unique to this service (for example, renamed operations, changed request/response schemas, and required configuration updates). The description explicitly mentions 'API changes and best practices', which aligns with product-specific best-practices rather than generic concepts.",updated +https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/how-to/migration-preview-to-ga,Migration from CU Preview to GA,Migrate from Azure Content Understanding in Foundry Tools Preview to GA - Foundry Tools,Migrate Azure Content Understanding preview APIs to GA,"Migrate from Azure Content Understanding in Foundry Tools Preview to GA, including API changes and best practices.","Note Content Understanding APIs versions 2024-12-01-preview and 2025-05-01-preview are currently in public preview. These previews are provided without a service-level agreement and are not recommended for production workloads. For more information, see Supplemental Terms of Use for Microsoft Azure Previews and the Microsoft Products and Services Data Protection Addendum (""DPA""). The Azure Content Understanding API has reached general availability (GA). It introduces several new capabilities and updates",2026-04-13T17:17:00.000Z,how-to,best-practices,0.68,True,"Migration guides typically include product-specific API changes, parameter mappings, and concrete do/don't guidance unique to this service (for example, renamed operations, changed request/response schemas, and required configuration updates).
The description explicitly mentions 'API changes and best practices', which aligns with product-specific best-practices rather than generic concepts.",unchanged https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/image/overview,Image,Azure Content Understanding in Foundry Tools image overview - Foundry Tools,,Learn how to use Azure Content Understanding in Foundry Tools image solutions,"Azure Content Understanding standardizes the extraction of data from images, making it easier to analyze large volumes of unstructured image data. Standardized extraction speeds up time-to-value and simplifies integration into downstream analytical workflows. With the Content Understanding APIs, you can define schemas to specify the fields, descriptions, and output types for extraction. The service then analyzes your images and provides structured data that can be applied in various use cases, s",2026-03-31T08:00:00.000Z,how-to,,0.1,False,"High-level overview of Azure Content Understanding image capabilities; no numeric limits, configuration tables, error codes, or product-specific decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/language-region-support,Language and region support,Content Understanding region and language support - Foundry Tools,,Azure Content Understanding in Foundry Tools region and language support,Azure Content Understanding in Foundry Tools provides multilingual support in multiple geographic regions to enable users to communicate with Content Understanding applications in natural ways and empower global outreach. 
The following sections describe the available regions and supported languages/locales.,2026-04-13T17:17:00.000Z,reference,,0.2,False,"Region and language support is typically a capability/coverage listing, not a configuration, quota, or decision matrix; it lacks the kinds of numeric limits, config parameters, or troubleshooting mappings required by any sub-skill type.",updated -https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/overview,What is Azure Content Understanding in Foundry Tools?,What is Azure Content Understanding in Foundry Tools? - Foundry Tools,,"Learn about Azure Content Understanding in Foundry Tools solutions, processes, workflows, use-cases, and field extractions.","Note Content Understanding is now a generally available (GA) service with the release of the 2025-11-01 API version. For details, see What's New. Azure Content Understanding in Foundry Tools is a Foundry Tool that's available as part of the Microsoft Foundry Resource in the Azure portal. It uses generative AI to process and ingest many types of content, including documents, images, videos, and audio, into a user-defined output format. Content Understanding offers a streamlined process to reason over la",2026-03-23T08:00:00.000Z,overview,,0.2,False,"High-level overview of Azure Content Understanding capabilities and workflows; no detailed limits, configuration tables, error codes, or product-specific numeric thresholds.",unchanged +https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/language-region-support,Language and region support,Content Understanding region and language support - Foundry Tools,,Azure Content Understanding in Foundry Tools region and language support,Azure Content Understanding in Foundry Tools provides multilingual support in multiple geographic regions to enable users to communicate with Content Understanding applications in natural ways and empower global outreach.
The following sections describe the available regions and supported languages/locales.,2026-04-13T17:17:00.000Z,reference,,0.2,False,"Region and language support is typically a capability/coverage listing, not a configuration, quota, or decision matrix; it lacks the kinds of numeric limits, config parameters, or troubleshooting mappings required by any sub-skill type.",unchanged +https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/overview,What is Azure Content Understanding in Foundry Tools?,What is Azure Content Understanding in Foundry Tools? - Foundry Tools,,"Learn about Azure Content Understanding in Foundry Tools solutions, processes, workflows, use-cases, and field extractions.","Note Content Understanding is now a generally available (GA) service with the release of the 2025-11-01 API version. For details, see What's New. Azure Content Understanding in Foundry Tools is a Foundry Tool that's available as part of the Microsoft Foundry Resource in the Azure portal. It uses generative AI to process and ingest many types of content, including documents, images, videos, and audio, into a user-defined output format. Content Understanding offers a streamlined process to reason over la",2026-04-21T11:03:00.000Z,overview,,0.1,False,"High-level overview of Azure Content Understanding capabilities, workflows, and use cases without concrete limits, configuration tables, or product-specific error/decision matrices.",updated
Learn what you're charged for and how to estimate costs for your workload. For specific pricing rates, see Azure Content Understanding Pricing.",2026-01-29T08:00:00.000Z,concept-article,decision-making,0.7,True,Pricing explainer with cost breakdowns and examples; supports cost-based decision making and capacity planning for workloads.,unchanged https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/quickstart/content-understanding-studio,Content Understanding Studio Quickstart,Quickstart Try out Content Understanding Studio or Foundry portal - Foundry Tools,,"Try out the new features of the Content Understanding Studio, or access the prebuilt analyzers through the Foundry portal.","Content Understanding Studio helps you try prebuilt analyzers, build and test custom analyzers, and improve analyzer performance over time. This quickstart walks you through the basic steps to get started.",2026-02-12T23:10:00.000Z,quickstart,,0.4,False,"Quickstart for trying Studio/portal; primarily step-by-step UI usage, not detailed configuration matrices or limits.",unchanged https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/quickstart/use-rest-api,Try Content Understanding REST API and SDKs,Quickstart: Azure Content Understanding in Foundry Tools - Foundry Tools,Call Content Understanding REST API for multimodal data,"Learn how to use Content Understanding APIs and SDKs to extract structured data from documents, images, audio, and video.","This quickstart shows you how to use the Content Understanding REST API to get structured data from multimodal content in document, image, audio, and video files.",2026-03-13T22:11:00.000Z,quickstart,integrations,0.7,True,"REST API quickstart with concrete request examples and parameters for documents, images, audio, and video; these are product-specific integration patterns.",unchanged -https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/service-limits,Service quotas and limits,Service
quotas and limits - Content Understanding - Foundry Tools,Review Azure Content Understanding service quotas and limits,"Quick reference, detailed description, and best practices for working within Azure Content Understanding in Foundry Tools service Quotas and Limits","Note Content Understanding APIs versions 2024-12-01-preview and 2025-05-01-preview are currently in public preview. These previews are provided without a service-level agreement and are not recommended for production workloads. For more information, see Supplemental Terms of Use for Microsoft Azure Previews and the Microsoft Products and Services Data Protection Addendum (""DPA""). This article lists the quotas and limits for the Azure Content Understanding in Foundry Tools service.",2026-04-13T17:17:00.000Z,limits-and-quotas,limits-quotas,0.95,True,"The page is explicitly a quotas and limits reference for Azure Content Understanding in Foundry Tools and will list concrete numeric limits (for requests, payload sizes, rate limits, etc.), which are product- and version-specific values not reliably known from training.",updated -https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/tutorial/build-person-directory,Build a face-data person directory (preview),Build a person directory with Azure Content Understanding in Foundry Tools Face APIs (preview) - Foundry Tools,,Learn how to build a person directory with Content Understanding Face APIs.,"Note Content Understanding APIs versions 2024-12-01-preview and 2025-05-01-preview are currently in public preview. These previews are provided without a service-level agreement and are not recommended for production workloads. For more information, see Supplemental Terms of Use for Microsoft Azure Previews and the Microsoft Products and Services Data Protection Addendum (""DPA""). A person directory provides a structured approach to storing face data for recognition tasks.
It allows you to add individual",2026-04-13T17:17:00.000Z,tutorial,,0.2,False,"This is a scenario tutorial on building a person directory with Face APIs. Based on the summary, it focuses on how to use the service in a guided example, not on configuration tables, limits, error-code troubleshooting, or product-specific best-practice/decision matrices. It’s primarily instructional/tutorial content rather than expert reference knowledge as defined.",updated +https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/service-limits,Service quotas and limits,Service quotas and limits - Content Understanding - Foundry Tools,Review Content Understanding service quotas and limits,"Quick reference, detailed description, and best practices for working within Azure Content Understanding in Foundry Tools service Quotas and Limits","Note Content Understanding APIs versions 2024-12-01-preview and 2025-05-01-preview are currently in public preview. These previews are provided without a service-level agreement and are not recommended for production workloads. For more information, see Supplemental Terms of Use for Microsoft Azure Previews and the Microsoft Products and Services Data Protection Addendum (""DPA"").
This article lists the quotas and limits for the Azure Content Understanding in Foundry Tools service.",2026-04-21T11:03:00.000Z,limits-and-quotas,limits-quotas,0.95,True,Explicitly described as listing quotas and limits for Azure Content Understanding; such pages typically contain numeric per-API and per-resource limits that are not inferable from general training data.,updated +https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/tutorial/build-person-directory,Build a face-data person directory (preview),Build a person directory with Azure Content Understanding in Foundry Tools Face APIs (preview) - Foundry Tools,,Learn how to build a person directory with Content Understanding Face APIs.,"Note Content Understanding APIs versions 2024-12-01-preview and 2025-05-01-preview are currently in public preview. These previews are provided without a service-level agreement and are not recommended for production workloads. For more information, see Supplemental Terms of Use for Microsoft Azure Previews and the Microsoft Products and Services Data Protection Addendum (""DPA""). A person directory provides a structured approach to storing face data for recognition tasks. It allows you to add individual",2026-04-13T17:17:00.000Z,tutorial,,0.2,False,"This is a scenario tutorial on building a person directory with Face APIs. Based on the summary, it focuses on how to use the service in a guided example, not on configuration tables, limits, error-code troubleshooting, or product-specific best-practice/decision matrices.
It’s primarily instructional/tutorial content rather than expert reference knowledge as defined.",unchanged https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/tutorial/build-rag-solution,Build a retrieval-augmented generation solution,Build a retrieval-augmented generation solution with Azure Content Understanding in Foundry Tools - Foundry Tools,,Learn to build a retrieval-augmented generation solution with Content Understanding,"This tutorial explains how to create a retrieval-augmented generation (RAG) solution using Azure Content Understanding in Foundry Tools. It covers the key steps to build a strong RAG system, offers tips to improve relevance and accuracy, and shows how to connect with other Azure services. By the end, you can use Content Understanding to handle multimodal data, improve retrieval, and help AI models provide accurate and meaningful responses.",2026-03-31T08:00:00.000Z,tutorial,,0.3,False,"RAG tutorial likely focuses on workflow and conceptual steps to build a solution; summary does not indicate specific limits, configuration tables, or product-specific error/decision matrices required for expert classification.",unchanged https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/tutorial/create-custom-analyzer,Create a custom analyzer,Create a custom analyzer with Azure Content Understanding in Foundry Tools - Foundry Tools,Create custom Content Understanding analyzers via REST API,Learn to create a custom analyzer with Azure Content Understanding in Foundry Tools,"Content Understanding analyzers define how to process and extract insights from your content. They ensure uniform processing and output structure across all your content, so you get reliable and predictable results. For common use cases, you can use the prebuilt analyzers. This guide shows how you can customize these analyzers to better fit your needs.
This guide shows you how to use the Content Understanding REST API to create a custom analyzer that extracts structured data from your content.",2026-03-30T22:09:00.000Z,overview,integrations,0.65,True,"Tutorial for creating a custom analyzer using the Content Understanding REST API. Likely includes request/response schemas, parameter names, and product-specific API usage patterns, which qualify as integration and coding patterns beyond generic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/tutorial/robotic-process-automation,Build a robotic process automation solution,Build a robotic process automation (RPA) solution with Azure Content Understanding in Foundry Tools - Foundry Tools,,Learn to build a robotic process automation solution with Content Understanding,"Robotic process automation (RPA) enables organizations to automate repetitive tasks by orchestrating workflows across systems. When combined with Azure Content Understanding, RPA can handle complex content ingestion scenarios across documents, images, audio, and video. Many RPA solutions aim for straight-through processing (STP): automate end-to-end workflows with minimal human intervention.
Confidence scores and grounding information help support STP by improving decision quality and auditabili",2026-01-31T06:05:00.000Z,tutorial,,0.2,False,"RPA scenario tutorial focused on workflow orchestration and concepts like STP and confidence scores; no evidence of numeric limits, configuration matrices, or detailed error/diagnostic content.",unchanged diff --git a/products/microsoft-foundry-tools/report.md b/products/microsoft-foundry-tools/report.md index 2fa9d006..991b73ed 100644 --- a/products/microsoft-foundry-tools/report.md +++ b/products/microsoft-foundry-tools/report.md @@ -1,36 +1,33 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: integrations: 'Using Content Moderator and Content Understanding via REST/.NET: text/image/video moderation, term lists, multimodal analysis, and consuming Markdown/structured outputs' - limits-quotas: Quotas, limits, and language support for Azure Content Moderator - and Content Understanding, including image/list caps, API usage constraints, and - .NET sample considerations. - decision-making: Guides for choosing the right Azure AI/Foundry tool for document - processing and estimating Content Understanding costs and pricing plans. - configuration: 'Configuring Foundry environments and resources: credentials, subdomains, - ARM provisioning, logging, and detailed setup for Content Understanding analyzers, - layouts, images, faces, and routing.' + limits-quotas: 'Quotas and limits for Content Moderator and Content Understanding: + image/list caps, usage constraints, and supported languages, plus how to stay + within these limits in .NET samples.' + decision-making: Guides for choosing document processing tools, deployment options, + Foundry vs Content Understanding Studio, and estimating/pricing Content Understanding + workloads. 
+ configuration: Configuring Content Understanding analyzers and layout/face analysis, + customizing prebuilt and custom analyzers, and setting up Standard/Pro task behavior + in Foundry classic best-practices: Guidance on improving Content Understanding accuracy, grounding and confidence in document extraction, and migrating from preview to GA Content Understanding APIs. - architecture-patterns: Designing and configuring how Content Understanding analyzers - are mapped to specific model deployments, including routing strategies and deployment - architecture patterns. security: 'Securing Foundry: auth methods, Entra-only access, keys/Key Vault, CMK encryption, DLP, VNet rules, API key rotation, Azure Policy and regulatory compliance configuration' skill_description: Expert knowledge for Microsoft Foundry Tools (aka Azure AI services, Azure Cognitive Services) development including best practices, decision making, - architecture & design patterns, limits & quotas, security, configuration, and integrations - & coding patterns. Use when using Content Moderator, Content Understanding analyzers, - document extraction, routing, or secure Foundry setup, and other Microsoft Foundry - Tools related development tasks. Not for Microsoft Foundry (use microsoft-foundry), - Microsoft Foundry Classic (use microsoft-foundry-classic), Microsoft Foundry Local - (use microsoft-foundry-local). -use_when: Use when using Content Moderator, Content Understanding analyzers, document - extraction, routing, or secure Foundry setup, and other Microsoft Foundry Tools + limits & quotas, security, configuration, and integrations & coding patterns. Use + when using Content Moderator, Content Understanding analyzers, Foundry Standard/Pro + tasks, or securing Foundry access, and other Microsoft Foundry Tools related development + tasks. 
Not for Microsoft Foundry (use microsoft-foundry), Microsoft Foundry Classic + (use microsoft-foundry-classic), Microsoft Foundry Local (use microsoft-foundry-local). +use_when: Use when using Content Moderator, Content Understanding analyzers, Foundry + Standard/Pro tasks, or securing Foundry access, and other Microsoft Foundry Tools related development tasks. confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft Foundry Classic (use microsoft-foundry-classic), Microsoft Foundry Local (use microsoft-foundry-local). @@ -42,13 +39,13 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft - **Total Pages**: 51 - **Fetched**: 51 - **Fetch Failed**: 0 -- **Classified**: 31 -- **Unclassified**: 20 +- **Classified**: 30 +- **Unclassified**: 21 ### Incremental Update - **New Pages**: 0 -- **Updated Pages**: 6 -- **Unchanged**: 45 +- **Updated Pages**: 5 +- **Unchanged**: 46 - **Deleted Pages**: 0 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/microsoft-foundry-tools/microsoft-foundry-tools.csv` @@ -56,42 +53,39 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | Type | Count | Percentage | |------|-------|------------| -| architecture-patterns | 1 | 2.0% | | best-practices | 3 | 5.9% | -| configuration | 7 | 13.7% | -| decision-making | 3 | 5.9% | +| configuration | 6 | 11.8% | +| decision-making | 4 | 7.8% | | integrations | 12 | 23.5% | | limits-quotas | 4 | 7.8% | | security | 1 | 2.0% | -| *(Unclassified)* | 20 | 39.2% | +| *(Unclassified)* | 21 | 41.2% | ## Changes ### Updated Pages -- [Migration from CU Preview to GA](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/how-to/migration-preview-to-ga) - - Updated: 2026-02-11T06:03:00.000Z → 2026-04-13T17:17:00.000Z -- [Build a face-data person directory (preview)](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/tutorial/build-person-directory) - - Updated: 
2026-01-29T08:00:00.000Z → 2026-04-13T17:17:00.000Z +- [Classifier tutorial - Split and route](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/how-to/classification-content-understanding-studio) + - Updated: 2026-01-31T06:05:00.000Z → 2026-04-21T11:03:00.000Z +- [What is Azure Content Understanding in Foundry Tools?](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/overview) + - Updated: 2026-03-23T08:00:00.000Z → 2026-04-21T11:03:00.000Z - [Service quotas and limits](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/service-limits) - - Updated: 2026-03-23T08:00:00.000Z → 2026-04-13T17:17:00.000Z -- [Language and region support](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/language-region-support) - - Updated: 2026-03-30T22:09:00.000Z → 2026-04-13T17:17:00.000Z -- [Foundry and Content Understanding Studio comparison](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/foundry-vs-content-understanding-studio) - - Updated: 2026-03-06T23:10:00.000Z → 2026-04-13T17:17:00.000Z -- [Modes: standard and pro (Preview)](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/standard-pro-modes) - - Updated: 2026-01-31T06:05:00.000Z → 2026-04-13T17:17:00.000Z + - Updated: 2026-04-13T17:17:00.000Z → 2026-04-21T11:03:00.000Z +- [What are analyzers?](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/analyzer-reference) + - Updated: 2026-02-01T12:03:00.000Z → 2026-04-21T11:03:00.000Z +- [Foundry model deployments](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/models-deployments) + - Updated: 2026-03-17T06:03:00.000Z → 2026-04-21T11:03:00.000Z ## Classified Pages | TOC Title | Type | Confidence | Reason | |-----------|------|------------|--------| -| [Service quotas and 
limits](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/service-limits) | limits-quotas | 0.95 | The page is explicitly a quotas and limits reference for Azure Content Understanding in Foundry Tools and will list concrete numeric limits (for requests, payload sizes, rate limits, etc.), which are product- and version-specific values not reliably known from training. | +| [Service quotas and limits](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/service-limits) | limits-quotas | 0.95 | Explicitly described as listing quotas and limits for Azure Content Understanding; such pages typically contain numeric per-API and per-resource limits that are not inferable from general training data. | | [.NET SDK samples](https://learn.microsoft.com/en-us/azure/ai-services/content-moderator/samples-dotnet) | limits-quotas | 0.85 | Explicitly states maximum of 5 image lists and 5 term lists with up to 10,000 items each—clear numeric quotas. | | [Check images against custom lists](https://learn.microsoft.com/en-us/azure/ai-services/content-moderator/image-lists-quickstart-dotnet) | limits-quotas | 0.85 | States maximum of 5 image lists with up to 10,000 images each—explicit numeric quotas for a specific feature. | -| [What are analyzers?](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/analyzer-reference) | configuration | 0.85 | Analyzer reference explicitly covers configuration and parameters; likely includes setting names, allowed values, and behavior, which are core configuration knowledge. | | [Best practices](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/best-practices) | best-practices | 0.80 | Explicit best practices article; expected to include product-specific recommendations, configuration tips, and edge cases for maximizing accuracy and efficiency. 
| | [Content Moderator REST API](https://learn.microsoft.com/en-us/azure/ai-services/content-moderator/api-reference) | integrations | 0.80 | API reference pages enumerate operations, parameters, and constraints specific to the product, which are concrete integration details not inferable from general knowledge. | +| [What are analyzers?](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/analyzer-reference) | configuration | 0.80 | Analyzer reference and configuration article; likely includes parameter names, allowed values, and configuration details for custom analyzers, which are product-specific configuration knowledge. | | [Elements](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/video/elements) | integrations | 0.75 | Details the contents object with kind: "audioVisual" and capabilities per input type; this is a concrete schema and integration pattern for audio/video analysis. | | [Markdown](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/document/markdown) | integrations | 0.75 | Describes exact Markdown elements returned and how they map to document structure; this is output schema/integration detail for downstream applications. | | [Markdown](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/video/markdown) | integrations | 0.75 | Documents how each audiovisual element is represented in Markdown; provides precise output schema needed for integrating with downstream systems. | @@ -107,10 +101,9 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [Try Content Understanding REST API and SDKs](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/quickstart/use-rest-api) | integrations | 0.70 | REST API quickstart with concrete request examples and parameters for documents, images, audio, and video; these are product-specific integration patterns. 
| | [Migration from CU Preview to GA](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/how-to/migration-preview-to-ga) | best-practices | 0.68 | Migration guides typically include product-specific API changes, parameter mappings, and concrete do/don't guidance unique to this service (for example, renamed operations, changed request/response schemas, and required configuration updates). The description explicitly mentions 'API changes and best practices', which aligns with product-specific best-practices rather than generic concepts. | | [Check text against custom terms](https://learn.microsoft.com/en-us/azure/ai-services/content-moderator/term-lists-quickstart-dotnet) | integrations | 0.65 | Quickstart shows concrete C# SDK usage for custom term lists, including specific API operations and parameters unique to Content Moderator, which are product-specific integration details beyond generic SDK knowledge. | -| [Classifier tutorial - Split and route](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/how-to/classification-content-understanding-studio) | configuration | 0.65 | Describes creating classification workflows and routing data to analyzers; likely includes configuration fields and options for routing logic. | | [Create CU Task in Foundry (classic) (Preview)](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/how-to/content-understanding-foundry-classic) | configuration | 0.65 | How-to for creating Standard and Pro tasks likely details task configuration options and fields specific to Content Understanding in Foundry classic. | | [Create a custom analyzer](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/tutorial/create-custom-analyzer) | integrations | 0.65 | Tutorial for creating a custom analyzer using the Content Understanding REST API. 
Likely includes request/response schemas, parameter names, and product-specific API usage patterns, which qualify as integration and coding patterns beyond generic knowledge. | -| [Foundry model deployments](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/models-deployments) | architecture-patterns | 0.65 | Explains how analyzer models map to Foundry deployments, defaults, and overrides; this is a product-specific design/architecture pattern for capacity and model selection. | +| [Foundry model deployments](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/models-deployments) | decision-making | 0.65 | Describes how Content Understanding maps analyzers to Foundry deployments, including defaults and overrides to balance price, latency, and capacity; this is product-specific guidance for selecting deployment options and trade-offs. | | [Using the client library or REST API](https://learn.microsoft.com/en-us/azure/ai-services/content-moderator/client-libraries) | integrations | 0.65 | Client library quickstart typically shows SDK methods, parameters, and configuration patterns specific to Content Moderator. | | [Elements](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/document/elements) | configuration | 0.60 | Document layout analysis and extraction capabilities typically involve specifying schemas and options; likely includes configuration fields for document analyzers. | | [Image moderation](https://learn.microsoft.com/en-us/azure/ai-services/content-moderator/image-moderation-api) | integrations | 0.60 | Image Moderation API article typically includes request/response schemas and parameter names specific to this service, which are product-specific integration details. 
| @@ -132,13 +125,14 @@ confusable_not_for: Not for Microsoft Foundry (use microsoft-foundry), Microsoft | [Create a Microsoft Foundry resource](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/how-to/create-multi-service-resource) | 0.25 | Resource creation how-to for Microsoft Foundry; summary suggests step-by-step portal/API setup without detailed configuration parameter tables, limits, or security role mappings. | | [Build a face-data person directory (preview)](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/tutorial/build-person-directory) | 0.20 | This is a scenario tutorial on building a person directory with Face APIs. Based on the summary, it focuses on how to use the service in a guided example, not on configuration tables, limits, error-code troubleshooting, or product-specific best-practice/decision matrices. It’s primarily instructional/tutorial content rather than expert reference knowledge as defined. | | [Build a robotic process automation solution](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/tutorial/robotic-process-automation) | 0.20 | RPA scenario tutorial focused on workflow orchestration and concepts like STP and confidence scores; no evidence of numeric limits, configuration matrices, or detailed error/diagnostic content. | +| [Classifier tutorial - Split and route](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/how-to/classification-content-understanding-studio) | 0.20 | How-to guide for creating classification workflows and routing in Content Understanding Studio; appears to be procedural/tutorial content without detailed limits, configuration parameter tables, error-code troubleshooting, or decision matrices. No clear evidence of product-specific expert details per the defined categories. 
| | [Copy and back up analyzers](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/how-to/copy-analyzers) | 0.20 | Describes copying custom analyzers conceptually and supported scenarios; summary does not indicate detailed configuration parameters, limits, or error codes. Appears to be procedural guidance without deep expert-only details. | | [Export or delete account data](https://learn.microsoft.com/en-us/azure/ai-services/content-moderator/export-delete-data) | 0.20 | Data export/delete overview; likely procedural and policy-focused without detailed limits, config tables, or error mappings. | | [Language and region support](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/language-region-support) | 0.20 | Region and language support is typically a capability/coverage listing, not a configuration, quota, or decision matrix; it lacks the kinds of numeric limits, config parameters, or troubleshooting mappings required by any sub-skill type. | | [Overview](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/document/overview) | 0.20 | Document overview describing capabilities and use cases for document analysis; no indication of limits, configuration parameters, or decision matrices. | | [Use customer-managed keys](https://learn.microsoft.com/en-us/azure/ai-services/content-moderator/encrypt-data-at-rest) | 0.20 | High-level statement that data is encrypted at rest; no indication of specific configuration parameters, roles, or algorithms. | | [Video Overview](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/video/overview) | 0.20 | Video overview describing what the pre-built video analyzer does and typical use cases; appears as conceptual capability description without expert-level numeric limits, configuration tables, or troubleshooting content. 
| -| [What is Azure Content Understanding in Foundry Tools?](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/overview) | 0.20 | High-level overview of Azure Content Understanding capabilities and workflows; no detailed limits, configuration tables, error codes, or product-specific numeric thresholds. | | [What is Content Moderator?](https://learn.microsoft.com/en-us/azure/ai-services/content-moderator/overview) | 0.20 | Service overview and deprecation notice for Content Moderator; conceptual description without detailed technical parameters in the summary. | | [Image](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/image/overview) | 0.10 | High-level overview of Azure Content Understanding image capabilities; no numeric limits, configuration tables, error codes, or product-specific decision matrices. | +| [What is Azure Content Understanding in Foundry Tools?](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/overview) | 0.10 | High-level overview of Azure Content Understanding capabilities, workflows, and use cases without concrete limits, configuration tables, or product-specific error/decision matrices. | | [Glossary](https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/glossary) | - | Glossary of terms; definitions are conceptual, not configuration, limits, or troubleshooting patterns. 
| diff --git a/products/microsoft-foundry/microsoft-foundry.csv b/products/microsoft-foundry/microsoft-foundry.csv index 46679a7a..01a573c3 100644 --- a/products/microsoft-foundry/microsoft-foundry.csv +++ b/products/microsoft-foundry/microsoft-foundry.csv @@ -1,34 +1,39 @@ url,toc_title,title,display_title,description,summary,updated_at,ms_topic,sub_skill_type,confidence,has_expert_knowledge,classification_reason,change_type -https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/agent-identity,Agent identity,Agent identity concepts in Microsoft Foundry - Microsoft Foundry,Configure and govern agent identities in Microsoft Foundry,"Learn how agent identities and agent identity blueprints work in Microsoft Foundry, including RBAC, authentication for tools, and governance.","An agent identity is a specialized identity type in Microsoft Entra ID that's designed specifically for AI agents. It provides a standardized framework for governing, authenticating, and authorizing AI agents across Microsoft services. This framework enables agents to securely access resources, interact with users, and communicate with other systems. Microsoft Foundry automatically provisions and manages agent identities throughout the agent lifecycle. This integration simplifies permission manageme",2026-04-13T08:00:00.000Z,concept-article,security,0.65,True,"Describes agent identities as a specialized Microsoft Entra ID identity type, including RBAC, authentication for tools, and governance patterns specific to Foundry; this is product-specific security/identity configuration guidance even though the summary is conceptual.",updated +https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/agent-catalog,Agent manifests,Agent manifests for Foundry Agent Service - Microsoft Foundry,,"Explore pre-built agent manifests for the Foundry Agent Service.
Browse manifests across industries with tools, prompt patterns, and one-click deployment.","Note Agent Manifests are for educational and experimentation purposes. Resulting agents are not production ready. Review all provided resources and carefully test agent behavior in the context of your use case. Agents you create may be subject to legal and regulatory requirements, may require licenses, or may not be suitable for all industries, scenarios, or use cases. By using any template, you are acknowledging that resulting agents and other output are solely your responsibility, and that you",2026-04-21T22:14:00.000Z,overview,,0.2,False,"Conceptual description and disclaimers about agent manifests; lacks product-specific parameters, limits, or troubleshooting details.",new +https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/agent-identity,Agent identity,Agent identity concepts in Microsoft Foundry - Microsoft Foundry,Configure and govern agent identities in Microsoft Foundry,"Learn how agent identities and agent identity blueprints work in Microsoft Foundry, including RBAC, authentication for tools, and governance.","An agent identity is a specialized identity type in Microsoft Entra ID that's designed specifically for AI agents. It provides a standardized framework for governing, authenticating, and authorizing AI agents across Microsoft services. This framework enables agents to securely access resources, interact with users, and communicate with other systems. Microsoft Foundry automatically provisions and manages agent identities throughout the agent lifecycle.
This integration simplifies permission manageme",2026-04-13T08:00:00.000Z,concept-article,security,0.65,True,"Describes agent identities as a specialized Microsoft Entra ID identity type, including RBAC, authentication for tools, and governance patterns specific to Foundry; this is product-specific security/identity configuration guidance even though the summary is conceptual.",unchanged https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/agent-to-agent-authentication,Agent2Agent (A2A) authentication (preview),Agent2Agent (A2A) authentication - Microsoft Foundry,Configure authentication methods for Agent2Agent tools,Learn about ways of adding authentication to the Agent2Agent tool in the Foundry Agent Service.,The Agent2Agent (A2A) protocol enables your agents to invoke other agents. Most A2A endpoints require authentication to access the endpoint and its underlying service. Configuring authentication ensures that only authorized users can invoke your A2A tools in Foundry Agent Service. This article explains the authentication methods available for A2A connections and helps you choose the right approach for your scenario.,2026-02-27T23:08:00.000Z,how-to,security,0.75,True,Explains authentication methods for A2A connections and how to choose between them; security-focused with product-specific auth patterns.,unchanged https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/capability-hosts,Capability hosts,Capability hosts for Foundry Agent Service - Microsoft Foundry,Configure capability hosts for Foundry Agent Service,"Learn how capability hosts route agent data to Microsoft-managed or your own Azure resources, and how to configure and troubleshoot them.","Note Updating capability hosts is not supported. To modify a capability host, you must delete the existing one and recreate it with the new configuration. Capability hosts are sub-resources that you configure at both the Microsoft Foundry account and Foundry project scopes. 
They tell Foundry Agent Service where to store and process agent data, including:",2026-02-27T23:08:00.000Z,concept-article,configuration,0.7,True,Explains capability hosts as sub-resources at account and project scopes and how to configure them; includes product-specific configuration behavior (must recreate to update).,unchanged -https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/development-lifecycle,Agent development lifecycle,Agent development lifecycle - Microsoft Foundry,,"Learn the agent development lifecycle in Microsoft Foundry, from creating and versioning to tracing, evaluation, publishing, and monitoring.","The agent development lifecycle in Microsoft Foundry spans from initial creation through production monitoring. Following this lifecycle helps you build reliable agents, catch issues early, and ship with confidence. Use the Foundry portal or code to build, customize, and test your agent's behavior. Then iterate with tracing, evaluation, and monitoring to improve quality and reliability. When you're ready, publish your agent as an agent application to share it and integrate it into your apps. 
Thi",2026-02-27T23:08:00.000Z,concept-article,,0.3,False,"Describes the agent development lifecycle conceptually; no indication of numeric thresholds, config tables, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/foundry-iq-faq,FAQ,Foundry IQ FAQ - Microsoft Foundry,,"Frequently asked questions about Foundry IQ, knowledge bases, knowledge sources, and agentic retrieval.",Find answers to common questions about Foundry IQ.,2026-04-14T22:13:00Z,faq,,0.2,False,"FAQ about Foundry IQ and knowledge bases; description suggests conceptual Q&A without specific limits, configuration tables, error-code mappings, or other product-specific expert details.",updated -https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/hosted-agents,What is a hosted agent?,Hosted agents in Foundry Agent Service (preview) - Microsoft Foundry,,"Deploy and manage containerized agents on Foundry Agent Service (preview) with managed hosting, scaling, and observability.","When you build agentic applications by using open-source frameworks, you typically manage containerization, web server setup, security integration, memory persistence, infrastructure scaling, data transmission, instrumentation, and version rollbacks. These tasks become even more challenging in heterogeneous cloud environments. Important If you use Foundry Agent Service to host agents that interact with third-party models, servers, or agents, you do so at your own risk.
We recommend reviewing all",2026-03-05T08:00:00.000Z,concept-article,,0.35,False,"Conceptual description of hosted agents and managed hosting; summary doesn’t show detailed config parameters, limits, or error codes.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/development-lifecycle,Agent development lifecycle,Agent development lifecycle - Microsoft Foundry,,"Learn the agent development lifecycle in Microsoft Foundry, from creating and versioning to tracing, evaluation, publishing, and monitoring.","The agent development lifecycle in Microsoft Foundry spans from initial creation through production monitoring. Following this lifecycle helps you build reliable agents, catch issues early, and ship with confidence. Use the Foundry portal or code to build, customize, and test your agent's behavior. Then iterate with tracing, evaluation, and monitoring to improve quality and reliability. When you're ready, publish your agent as an agent application to share it and integrate it into your apps. 
Thi",2026-04-23T22:15:00.000Z,concept-article,,0.2,False,"Describes the agent development lifecycle conceptually (creation, tracing, evaluation, publishing); no concrete config values, limits, or decision matrices indicated.",updated +https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/foundry-iq-faq,FAQ,Foundry IQ FAQ - Microsoft Foundry,,"Frequently asked questions about Foundry IQ, knowledge bases, knowledge sources, and agentic retrieval.",Find answers to common questions about Foundry IQ.,2026-04-14T22:13:00Z,faq,,0.2,False,"FAQ about Foundry IQ and knowledge bases; description suggests conceptual Q&A without specific limits, configuration tables, error-code mappings, or other product-specific expert details.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/hosted-agent-permissions,Hosted agent permissions reference,Hosted agent permissions reference - Microsoft Foundry,Configure hosted agent permissions and RBAC in Foundry,"Reference for permissions required to create, deploy, and interact with hosted agents in Microsoft Foundry.","When working with hosted agents in Microsoft Foundry, it's important to understand the various permissions involved. There are several classes of permissions involved in hosted agent development, spanning the Azure Resource Manager control plane and the Foundry data plane: This article is a companion to Role-based access control for Microsoft Foundry, which introduces role-based access control (RBAC) concepts and the built-in roles available in Microsoft Foundry.
You should familiarize yourself wi",2026-04-22T17:23:00.000Z,reference,security,0.85,True,"A permissions reference for hosted agents will list specific RBAC roles, scopes, and required permissions across control and data planes, which is detailed, product-specific security configuration.",new +https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/hosted-agents,What is a hosted agent?,Hosted agents in Foundry Agent Service (preview) - Microsoft Foundry,,"Deploy and manage containerized agents on Foundry Agent Service (preview) with managed hosting, scaling, and observability.","When you build agentic applications by using open-source frameworks, you typically manage many cross-cutting concerns: containerization, web server setup, security, memory persistence, scaling, instrumentation, and version rollbacks. These tasks become even more challenging in heterogeneous cloud environments. Hosted agents in Foundry Agent Service solve these challenges for Microsoft Foundry users. By using this managed platform, you can deploy and operate AI agents securely and at scale. You c",2026-04-22T17:23:00.000Z,concept-article,,0.3,False,"Conceptual explanation of hosted agents and platform benefits; summary does not show specific configuration parameters, limits, or troubleshooting mappings.",updated https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/limits-quotas-regions,"Quotas, limits, and region support",Quotas and limits for Microsoft Foundry Agent Service - Microsoft Foundry,"Quotas, limits, and regions for Foundry Agent Service","Review default limits for Foundry Agent Service, including file sizes, vector stores, messages, tools, error handling, supported regions, and compatible models.","Foundry Agent Service enforces quotas and limits on agent artifacts, file uploads, messages, and tool registrations. Understanding these limits helps you design applications that scale without hitting service boundaries. 
This article lists default limits, supported regions, compatible models, and guidance for handling limit errors. Note Foundry Agent Service is generally available (GA). Some sub-features, such as hosted agents, are in public preview and might have different constraints.",2026-04-03T08:00:00.000Z,concept-article,limits-quotas,0.95,True,"Explicitly a limits article listing default limits for artifacts, file sizes, vector stores, messages, tools, and regions. This will contain numeric limits, size constraints, and possibly tier-specific tables that LLMs won’t know from training.",unchanged https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/runtime-components,Agent runtime components,"Build with agents, conversations, and responses in Foundry Agent Service - Microsoft Foundry","Use agents, conversations, and responses in Foundry","Learn how to create agents, manage conversations, and generate responses in Microsoft Foundry Agent Service with code examples in Python, C#, JavaScript, Java, and REST API.","Microsoft Foundry Agent Service uses three core runtime components—agents, conversations, and responses—to power stateful, multi-turn interactions. An agent defines what model, instructions, and tools to use. A conversation persists history across turns. A response is the output the agent produces when it processes input. This article walks through each component and shows how to use them together in code.
You'll learn how to create an agent, start a conversation, generate responses (with or witho",2026-04-10T08:00:00.000Z,concept-article,integrations,0.6,True,"Explains how to wire together agents, conversations, and responses with concrete SDK/REST usage; likely includes product-specific API parameters and patterns for stateful interactions, fitting integrations & coding patterns.",unchanged https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/standard-agent-setup,Standard agent setup,Set up standard agent resources for Foundry Agent Service - Microsoft Foundry,Design standard agent setup with isolated resources,Learn how to provision and configure customer-managed Azure resources for Foundry Agent Service standard agent setup with enterprise-grade data isolation.,"Standard agent setup uses customer-managed, single-tenant Azure resources to store agent state and keep all agent data under your control. Use standard setup when you need full data sovereignty, compliance with enterprise security policies, or project-level isolation. In this setup: Tip For a simpler setup that uses Microsoft-managed resources, see Environment setup and choose the basic agent setup option.",2026-02-27T23:08:00.000Z,how-to,architecture-patterns,0.64,True,"Explains when to use standard agent setup vs basic, focusing on data sovereignty, isolation, and compliance. This is a product-specific architecture pattern with clear scenario-based guidance.",unchanged
This article helps you configure tools effectively, control when the model calls them, and keep your data secure. Tip In your agent instructions, describe what each tool is for and when to use it. For example: When you need information from my indexed documents, use File Search. When you need to call an API, use the OpenAPI tool.",2026-03-13T22:11:00.000Z,concept-article,best-practices,0.85,True,"Explicitly a best-practices article for configuring tools, tool_choice, secure usage, and troubleshooting tool-calling; contains product-specific DOs/DON’Ts and likely concrete config patterns.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/tool-catalog,Tools overview,Agent tools overview for Microsoft Foundry Agent Service - Microsoft Foundry,,"Explore the tools available for agents in Foundry Agent Service, including built-in tools, web search, custom options, and the Foundry tool catalog. Get started today.","Tools extend what your agents can do in Microsoft Foundry Agent Service. An agent on its own can generate text, but tools let it take action — searching the web, running code, querying your data, or calling your own APIs. This article explains what tools are, the types of tools available, how to use a tool in an agent, and how to manage authentication. It also introduces the Foundry tool catalog where you discover and configure tools. 
To use tools, you need access to a Foundry project and permis",2026-04-09T08:00:00.000Z,concept-article,,0.4,False,"Conceptual overview of tools and the tool catalog; summary does not indicate detailed parameter tables, limits, or error mappings—primarily explains what tools are and how they extend agents.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/tool-best-practice,Best practices,Tool best practices for Microsoft Foundry Agent Service - Microsoft Foundry,Apply tool usage best practices in Foundry agents,"Learn tool best practices for Foundry Agent Service: configure tool_choice, secure tool usage, and troubleshoot tool-calling issues.","When you build agents in Microsoft Foundry Agent Service, tools extend what your agent can do—retrieving information, calling APIs, and connecting to external services. This article helps you configure tools effectively, control when the model calls them, and keep your data secure. Tip In your agent instructions, describe what each tool is for and when to use it. For example: When you need information from my indexed documents, use File Search. When you need to call an API, use the OpenAPI tool.",2026-03-13T22:11:00.000Z,concept-article,best-practices,0.8,True,"Explicitly labeled best practices for configuring tools, controlling tool_choice, securing tool usage, and troubleshooting tool-calling; contains product-specific DOs/DON’Ts and likely concrete configuration examples.",new +https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/tool-catalog,Tools overview,Agent tools overview for Microsoft Foundry Agent Service - Microsoft Foundry,,"Explore the tools available for agents in Foundry Agent Service, including built-in tools, web search, custom options, and the Foundry tool catalog. Get started today.","Tools extend what your agents can do in Microsoft Foundry Agent Service. 
An agent on its own can generate text, but tools let it take action - searching the web, running code, querying your data, or calling your own APIs. This article explains what tools are, the types of tools available, how to use a tool in an agent, and how to manage authentication. It also introduces the Foundry tool catalog where you discover and configure tools. To use tools, you need access to a Foundry project and permis",2026-04-23T08:00:00.000Z,concept-article,,0.3,False,"Overview of tools and tool catalog; summary is conceptual and does not indicate detailed parameter tables, limits, or troubleshooting content.",updated https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/vector-stores,Vector stores for file search,Vector stores for file search in Microsoft Foundry Agent Service - Microsoft Foundry,Use vector stores and file search limits in agents,"Learn how vector stores enable file search for agents, including ingestion (chunking and embeddings), readiness, limits, and expiration policies.","Vector store objects give the file search tool the ability to search your files. When you add a file to a vector store, the service parses, chunks, embeds, and indexes it so the tool can run both keyword and semantic search. Vector stores can be attached to both agents and conversations. Currently, you can attach at most one vector store to an agent and at most one vector store to a conversation. For a conceptual overview of conversations, see Agent runtime components. In the current agents develop",2026-02-27T23:08:00.000Z,concept-article,limits-quotas,0.8,True,"Summary explicitly mentions readiness, limits, and expiration policies; such pages typically list numeric limits (e.g., max files, sizes, retention), matching limits-quotas.",unchanged https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/what-is-foundry-iq,What is Foundry IQ?
- Microsoft Foundry,,"Learn about Foundry IQ, a managed knowledge layer that turns enterprise data into reusable, permission-aware knowledge bases for AI agents.","Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Agents need context from scattered enterprise content to accurately answer questions. With Foundry IQ, you can create a configurable, multi-sourc",2026-02-27T23:08:00.000Z,concept-article,,0.35,False,"Conceptual overview of Foundry IQ as a managed knowledge layer; summary doesn’t indicate detailed limits, config tables, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/what-is-memory,What is memory?,What is Memory? - Microsoft Foundry,,"Learn what memory is in Microsoft Foundry Agent Service (preview), how it works, and how to use long-term memories safely.","Important Memory (preview) in Foundry Agent Service and the Memory Store API (preview) are licensed to you as part of your Azure subscription and are subject to terms applicable to ""Previews"" in the Microsoft Product Terms and the Microsoft Products and Services Data Protection Addendum, as well as the Microsoft Generative AI Services Previews terms in the Supplemental Terms of Use for Microsoft Azure Previews. Memory in Microsoft Foundry Agent Service is a managed, long-term memory solution.
It ena",2026-04-06T08:00:00.000Z,concept-article,,0.3,False,"Conceptual explanation of what memory is in Foundry Agent Service and how it works at a high level; no detailed limits, configuration tables, error codes, or product-specific numeric thresholds.",unchanged https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/workflow,Create a workflow,Build a workflow in Microsoft Foundry - Microsoft Foundry,,"Build workflows in Microsoft Foundry to orchestrate AI agents with visual logic, branching, and Power Fx formulas. Create intelligent automation without writing code.","Workflows are UI-based tools in Microsoft Foundry. Use them to create declarative, predefined sequences of actions that orchestrate agents and business logic in a visual builder. Workflows enable you to build intelligent automation systems that seamlessly blend AI agents with business processes in a visual manner. Traditional single-agent systems are limited in their ability to handle complex, multifaceted tasks. By orchestrating multiple agents, each with specialized skills or roles, you can cr",2026-02-27T23:08:00.000Z,how-to,,0.35,False,Conceptual explanation of workflows and orchestration; summary doesn’t indicate detailed configuration tables or limits.,unchanged -https://learn.microsoft.com/en-us/azure/foundry/agents/environment-setup,Set up your agent resources,Set up your environment for Foundry Agent Service - Microsoft Foundry,Set up infrastructure for Foundry Agent Service,Use this guide to set up your agent environment,"In this article, you deploy the infrastructure needed to create agents with Foundry Agent Service. After completing this setup, you can create and configure agents using either the SDK of your choice or the Foundry portal. 
Creating your first agent is a two-step process:",2026-02-27T23:08:00.000Z,how-to,deployment,0.66,True,"Environment setup for agents includes provisioning specific Azure resources and configurations required before agent deployment, which are product-specific deployment requirements.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/agents/faq,FAQ,Foundry Agent Service frequently asked questions - Microsoft Foundry,,"Get answers to common questions about Foundry Agent Service, including setup options, data handling, pricing, and networking.","Find answers to common questions about Foundry Agent Service. If you can't find answers to your questions in this article and you still need help, see Foundry Tools support and help options. Foundry Agent Service is part of Foundry Tools. For getting started, see:",2026-04-14T22:13:00Z,faq,,0.3,False,"General FAQ about Foundry Agent Service (setup, data handling, pricing, networking). Summary indicates high-level answers rather than detailed limits, configuration tables, or troubleshooting mappings.",updated -https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/agent-365,Agent 365,Publish a Microsoft Foundry agent to Agent 365 - Microsoft Foundry,Deploy Foundry agents as digital workers in Agent 365,"Publish a Microsoft Foundry hosted agent to Agent 365 by using the FoundryA365 sample, approve it, and optionally connect it to Microsoft Teams.","Use this article to publish a Microsoft Foundry hosted agent as a digital worker in Microsoft Agent 365 (A365). The sample uses the Azure Developer CLI to create the required Azure resources and publish an agent application.
It also guides you through admin approval, configuration, and creating instances of your digital worker.",2026-02-27T23:08:00.000Z,how-to,deployment,0.65,True,Uses a specific sample and Azure Developer CLI to create resources and publish an agent; contains product-specific deployment patterns and constraints.,unchanged -https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/ai-gateway,Use an AI Gateway,Connect an AI gateway to Foundry Agent Service (preview) - Microsoft Foundry,Integrate enterprise AI gateways with Foundry Agent Service,Connect and use models hosted behind enterprise AI gateways like Azure API Management with Foundry Agent Service.,"Foundry Agent Service allows you to connect and use models hosted behind your enterprise AI gateways such as Azure API Management or other non-Azure hosted AI model gateways. This capability allows you to maintain control over your model endpoints while using Foundry agent capabilities. Important This feature is currently in preview. Preview features aren't meant for production use. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. This capability enables organization",2026-04-09T08:00:00.000Z,how-to,integrations,0.7,True,"A page on connecting AI gateways (Azure API Management and non-Azure gateways) to Foundry Agent Service will necessarily include endpoint configuration details, headers, authentication settings, and possibly parameter names or payload formats specific to this integration. Those are product-specific integration patterns that qualify as expert knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/deploy-hosted-agent,Deploy a hosted agent,Deploy a hosted agent - Microsoft Foundry,Deploy custom hosted agents to Foundry Agent Service,Deploy your containerized agent code to Foundry Agent Service using the Azure Developer CLI or Python SDK.,"This article shows you how to deploy a containerized agent to Foundry Agent Service.
Use hosted agents when you need to run custom agent code built with frameworks like LangGraph, Microsoft Agent Framework, or your own implementation.",2026-03-04T08:00:00.000Z,how-to,deployment,0.7,True,"How-to for deploying containerized agents via Azure Developer CLI or Python SDK; likely includes product-specific deployment configuration and constraints, fitting deployment.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/agents/environment-setup,Set up your agent resources,Set up your environment for Foundry Agent Service - Microsoft Foundry,Configure environment infrastructure for Foundry Agent Service,Use this guide to set up your agent environment,"In this article, you deploy the infrastructure needed to create agents with Foundry Agent Service. After completing this setup, you can create and configure agents using either the SDK of your choice or the Foundry portal. Creating your first agent is a two-step process: Note For hosted agents, additional permissions and RBAC configurations are required. See Hosted agent permissions reference for detailed requirements.",2026-04-14T22:13:00.000Z,how-to,configuration,0.65,True,"Environment setup for Foundry Agent Service typically includes product-specific infrastructure and configuration requirements (resource types, settings, possibly environment variables and permissions). The summary references hosted agents needing additional permissions and RBAC configurations, indicating concrete configuration details rather than just conceptual guidance.",updated +https://learn.microsoft.com/en-us/azure/foundry/agents/faq,FAQ,Foundry Agent Service frequently asked questions - Microsoft Foundry,,"Get answers to common questions about Foundry Agent Service, including setup options, data handling, pricing, and networking.","Find answers to common questions about Foundry Agent Service. If you can't find answers to your questions in this article and you still need help, see Foundry Tools support and help options. 
Foundry Agent Service is part of Foundry Tools. For getting started, see:",2026-04-14T22:13:00Z,faq,,0.3,False,"General FAQ about Foundry Agent Service (setup, data handling, pricing, networking). Summary indicates high-level answers rather than detailed limits, configuration tables, or troubleshooting mappings.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/agent-365,Agent 365,Microsoft Agent 365 and Microsoft Foundry - Microsoft Foundry,Govern and secure Foundry agents with Microsoft Agent 365,"Learn how Microsoft Agent 365 provides governance, security, and lifecycle management for AI agents built in Microsoft Foundry.","This article describes how Foundry agents can be managed and governed in Microsoft Agent 365. Additionally, it describes how AI teammates can be created, approved, and hired in Agent 365.",2026-04-22T17:23:00.000Z,concept-article,security,0.65,True,"Describes governance, security, and lifecycle management; likely includes specific roles/permissions and approval flows unique to Agent 365, which fits product-specific security configuration.",updated +https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/agent-applications,Agent applications,Agent applications in Microsoft Foundry - Microsoft Foundry,Publish Foundry agents as agent applications,"Learn about agent applications in Microsoft Foundry, configure authentication and permissions, and use a stable endpoint to invoke your agent.","Note This article describes the legacy publishing experience. For the new agent publishing model, see Migrate from agent applications to the new agent endpoint and publishing experience. Publishing promotes an agent from a development asset inside your Foundry project into a managed Azure resource that external consumers can call through a stable endpoint. 
Think of it as the step that moves your agent from ""works in my project"" to ""ready for others to use."" This article shows you how to publish a",2026-04-23T22:15:00.000Z,how-to,configuration,0.7,True,Describes publishing agents as Azure resources with stable endpoints and configuring authentication and permissions; likely includes endpoint settings and auth configuration parameters.,new +https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/ai-gateway,Bring your own models with AI gateways,Bring Your Own Model to Foundry Agent Service - Microsoft Foundry,Integrate BYO model gateways with Foundry Agent Service,Connect and bring your own models hosted behind enterprise AI gateways like Azure API Management with Foundry Agent Service.,"Foundry Agent Service allows you to connect and use models hosted behind your AI gateways such as Azure API Management or other non-Azure managed AI model gateways. This capability, called bring your own model, allows you to maintain control over your model endpoints while using Foundry agent capabilities. Important For purposes of this documentation, BYOM models refers to third-party models that you bring to Foundry and does not include Azure Direct Models. Foundry Agent Service supports the ability ",2026-04-23T08:00:00.000Z,how-to,integrations,0.7,True,"Covers connecting models behind enterprise AI gateways like Azure API Management and other non-Azure gateways. 
This implies product-specific integration patterns, endpoint configuration, and possibly parameter details for using external model endpoints with Foundry agents.",new +https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/configure-agent,Configure an agent,Configure and share your Microsoft Foundry agent - Microsoft Foundry,Configure and share Microsoft Foundry agent endpoints,"Learn how to configure your agent's stable endpoint, select the active version, and share your agent with consumers in Microsoft Foundry.","Every agent in Microsoft Foundry has a stable endpoint from the moment it's created. When end users interact with your agent through Microsoft 365 Copilot, Teams, your existing application, or other surfaces, they interact with the agent's stable endpoint. Before you share your agent, verify these settings: This article shows you how to select the active version, enable protocols, set authorization schemes, and add an agent card. After you configure the endpoint, you can: Note If you're migratin",2026-04-22T17:23:00.000Z,how-to,configuration,0.75,True,"Shows how to select active versions, enable protocols, set authorization schemes, and configure stable endpoints; these are concrete configuration options and settings.",new +https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/deploy-hosted-agent,Deploy a hosted agent,Deploy a hosted agent - Microsoft Foundry,Configure and deploy hosted agents via SDK or REST,Deploy your containerized agent code to Foundry Agent Service using the Python SDK or REST API.,"This article shows you how to deploy a containerized agent to Foundry Agent Service using the Python SDK or REST API. Use these approaches when you want to manage agent deployments directly from your own applications or services. If you're deploying for the first time or want the fastest path, use the Quickstart: Create and deploy a hosted agent instead. 
The quickstart uses the Azure Developer CLI (azd) or VS Code extension, which handle building, pushing, versioning, and RBAC configuration automatic",2026-04-22T17:23:00.000Z,how-to,configuration,0.65,True,"How-to article for deploying hosted agents using Python SDK and REST; likely includes request/response schemas, parameter names, and required fields that are product-specific configuration details.",updated https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/foundry-iq-connect,Connect knowledge base to agents,Connect Agents to Foundry IQ Knowledge Bases - Microsoft Foundry,Connect Foundry agents to Foundry IQ knowledge bases,Connect Foundry Agent Service to a Foundry IQ knowledge base (Azure AI Search) for grounded retrieval and citation-backed responses.,"Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. In this article, you learn how to connect a knowledge base in Foundry IQ to an agent in Foundry Agent Service. The connection uses the Model Conte",2026-04-02T08:00:00.000Z,how-to,integrations,0.7,True,"How-to for wiring Foundry Agent Service to Foundry IQ (Azure AI Search) for grounded retrieval. 
This typically includes product-specific configuration fields (knowledge base IDs, connection parameters, model context settings) and options unique to this integration, which are not generic LLM knowledge.",unchanged https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/manage-grounding-with-bing,Manage Grounding with Bing,Manage Grounding With Bing Access - Microsoft Foundry,Manage and disable Grounding with Bing in Foundry,Learn how to manage Grounding with Bing in Microsoft Foundry and Azure.,"Grounding with Bing enables agents to retrieve and incorporate real-time public web data into model-generated responses. It supports summarization, question answering, conversational assistance, and other scenarios by using Grounding with Bing Search or Grounding with Bing Custom Search to fill knowledge gaps. Grounding is available across features in Foundry Agent Service and Azure AI Search. You might need to disable access to these features to meet compliance, privacy, or data governance requ",2026-02-27T23:08:00.000Z,overview,configuration,0.68,True,"Covers enabling/disabling Grounding with Bing Search and Custom Search across Foundry Agent Service and Azure AI Search. Likely includes feature flags, configuration switches, and policy or RBAC settings specific to controlling this capability.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/manage-hosted-agent,Manage hosted agent lifecycle,Manage hosted agent lifecycle - Microsoft Foundry,Manage lifecycle of hosted agent deployments,"Start, stop, update, and delete hosted agent deployments using the Azure CLI or Python SDK.","This article shows you how to manage hosted agent deployments in Foundry Agent Service. 
After you deploy a hosted agent, you can start, stop, update, and delete it as your needs change.",2026-03-04T08:00:00.000Z,how-to,deployment,0.65,True,"Covers starting, stopping, updating, and deleting hosted agent deployments; operational deployment lifecycle guidance specific to the service.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/manage-hosted-agent,Manage hosted agent lifecycle,Manage hosted agents - Microsoft Foundry,Manage hosted agents and traffic routing in Foundry,"View, monitor, and manage hosted agents in Foundry Agent Service by using the REST API, Python SDK, or Azure Developer CLI.","This article shows you how to manage hosted agents in Foundry Agent Service. After you deploy a hosted agent, you can view its status, create new versions, configure traffic routing, monitor logs, and delete agents when they're no longer needed. The platform manages the container lifecycle automatically. Compute is provisioned when a request arrives and deprovisioned after the idle timeout (15 minutes). There are no manual start or stop operations.",2026-04-22T17:23:00.000Z,how-to,limits-quotas,0.7,True,Management article explicitly mentions an idle timeout of 15 minutes and automatic deprovisioning; this is a concrete service limit/timeout value that qualifies as expert knowledge.,updated +https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/manage-hosted-sessions,Manage hosted agent sessions,Manage hosted agent sessions - Microsoft Foundry,Create and manage hosted agent sessions via APIs,"Create, invoke, and manage sessions for hosted agents in Foundry Agent Service by using the REST API, Python SDK, or Azure Developer CLI.","This article shows you how to create and manage sessions for hosted agents in Foundry Agent Service. 
Sessions provide isolated sandbox compute for each request, so you can run multiple conversations or tasks concurrently without shared state.",2026-04-22T17:23:00.000Z,how-to,configuration,0.65,True,"Covers creating, invoking, and managing sessions using REST, SDK, and CLI; likely documents session-related parameters, IDs, and options that are product-specific configuration details.",new https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/mcp-authentication,MCP authentication,Set Up MCP Server Authentication - Microsoft Foundry,Configure authentication for MCP servers in Foundry,"Learn how to set up authentication for Model Context Protocol (MCP) servers used by agents in Microsoft Foundry Agent Service. Configure key-based, Entra, or OAuth auth.","Most Model Context Protocol (MCP) servers require authentication to access the server and its underlying service. Proper authentication ensures your agents can securely connect to MCP servers, invoke their tools, and access protected resources while maintaining appropriate access controls. 
In this article, you: Note If you don't already have an account with the MCP server publisher, create one through the publisher's website.",2026-04-09T08:00:00.000Z,how-to,security,0.8,True,"Covers setting up key-based, Entra, or OAuth authentication for MCP servers used by agents; this is detailed, product-specific security and auth configuration guidance.",unchanged https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/memory-usage,Create and use memory,Create and Use Memory - Microsoft Foundry,Configure and manage memory in Foundry agents,Learn how to create and manage memory in Foundry Agent Service to enable AI agents to retain context across sessions and personalize user interactions.,"Important Memory (preview) in Foundry Agent Service and the Memory Store API (preview) are licensed to you as part of your Azure subscription and are subject to terms applicable to ""Previews"" in the Microsoft Product Terms and the Microsoft Products and Services Data Protection Addendum, as well as the Microsoft Generative AI Services Previews terms in the Supplemental Terms of Use for Microsoft Azure Previews. Memory in Foundry Agent Service is a managed, long-term memory solution. 
It enables agent",2026-04-10T08:00:00.000Z,how-to,configuration,0.7,True,"How-to guide for creating and using memory in Foundry Agent Service likely includes concrete API parameters, configuration options, and usage patterns specific to this product (for example, memory store identifiers, scopes, or flags), which qualify as product-specific configuration details beyond generic LLM knowledge.",unchanged https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/migrate,Migrate agents,Migrate to the new Foundry Agent Service - Microsoft Foundry,Migrate from Assistants API to Foundry Agent Service,"Learn how to migrate from the Assistants API and classic agents to the new Foundry Agent Service, including threads to conversations, runs to responses, and updated SDK patterns.","Tip A migration tool is available to help automate migration from the Assistants API to Agents. Foundry Agent Service provides an upgraded developer experience for building intelligent agents that are easy to build, version, operate, and observe. 
The new agents API introduces a modernized SDK, new enterprise-grade capabilities, and preserves the identity, governance, and observability features you rely on today.",2026-04-10T08:00:00.000Z,how-to,decision-making,0.7,True,"Covers migration from Assistants API/classic agents to the new Agent Service, including mapping of threads to conversations, runs to responses, and updated SDK patterns—this is product-specific migration and upgrade guidance used for decision-making.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/migrate-agent-applications,Migrate from Agent Applications to the new agent model,Migrate from Agent Applications to the new Microsoft Foundry agent model - Microsoft Foundry,Decide and migrate to the new Foundry agent model,Migrate from the legacy Agent Application publishing model to the new agent object model in Microsoft Foundry.,"This guide explains how the agent publishing experience changed in Microsoft Foundry. It compares the legacy model (which created a separate Agent Application resource) with the new agent object model, and walks you through migrating your existing agents and published applications.",2026-04-22T17:23:00.000Z,how-to,decision-making,0.7,True,"Explicit migration guide comparing legacy Agent Application model vs new agent object model; likely includes comparison tables and concrete guidance on when/how to move, which is decision-focused.",new +https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/migrate-hosted-agent-preview,Migrate hosted agents to the refreshed public preview,Migrate hosted agents to the refreshed public preview - Microsoft Foundry,Migrate hosted agents to refreshed Foundry preview,"Migrate your hosted agents from the initial public preview to the refreshed public preview, including API, SDK, CLI, protocol library, and identity model changes.","This article walks you through migrating hosted agents from the initial public preview to the refreshed public preview of 
Foundry Agent Service. The refreshed preview introduces a new hosting backend, protocol libraries, identity model, and management APIs. Important The initial public preview hosting backend is being retired. You must redeploy your agents using the new model described in this article. Existing agent deployments on the old backend won't be migrated automatically and will be supp",2026-04-22T17:23:00.000Z,how-to,decision-making,0.7,True,"Migration guide between preview backends; typically includes API/SDK changes, mapping of old to new models, and explicit guidance on when/how to move, which is decision and migration-focused expert knowledge.",new https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/private-tool-catalog,Create a private tool catalog (preview),Create a private tool catalog in Foundry Agent Service - Microsoft Foundry,Configure a private tool catalog with Azure API Center,Create a private tool catalog in Foundry Agent Service using Azure API Center. Let developers discover and configure organization-scoped MCP server tools.,"Note This feature is currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Create a private tool catalog so developers in your organization can discover, configure, and use MCP server tools through Foundry Tools. 
A private tool catalog uses Azure API Ce",2026-02-27T23:08:00.000Z,how-to,configuration,0.7,True,How-to for creating a private tool catalog using Azure API Center; likely includes specific configuration parameters and settings for catalog registration and scope.,unchanged -https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/publish-agent,Publish and share agents,Publish agents in Microsoft Foundry - Microsoft Foundry,Publish Foundry agents as managed Azure resources,"Learn how to publish agents in Microsoft Foundry, configure authentication and permissions, and use a stable endpoint to invoke your agent.","Publishing promotes an agent from a development asset inside your Foundry project into a managed Azure resource that external consumers can call through a stable endpoint. Think of it as the step that moves your agent from ""works in my project"" to ""ready for others to use."" This article shows you how to publish an agent, configure its authentication and permissions, and update your Agent Application as you roll out new agent versions. After publishing, see the following articles to use your Agen",2026-03-03T08:00:00.000Z,how-to,deployment,0.7,True,Covers promoting agents to managed resources with stable endpoints and configuring authentication/permissions; contains product-specific deployment and auth configuration details.,unchanged -https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/publish-copilot,Publish to Microsoft 365 Copilot and Teams,Publish agents to Microsoft 365 Copilot and Microsoft Teams - Microsoft Foundry,Publish Foundry agents to Microsoft 365 Copilot and Teams,Publish a Microsoft Foundry agent to Microsoft 365 Copilot and Microsoft Teams by creating an agent application and packaging it for distribution.,"Publishing creates an agent application with a stable endpoint. Once created, the agent application can then be published to Microsoft 365 Copilot and Teams or invoked using the Responses API protocol. 
Use this article to publish an agent to Microsoft 365 Copilot and Teams so people can use it in those interfaces.",2026-03-10T17:09:00.000Z,how-to,deployment,0.7,True,Describes packaging and publishing agents as applications for M365 Copilot and Teams; likely includes platform-specific deployment requirements and configuration steps.,unchanged -https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/publish-responses,Invoke your Agent Application using the Responses API protocol,Chat with your Agent Application using the Responses API protocol - Microsoft Foundry,Invoke Foundry Agent Applications via Responses API protocol,Chat with an existing Agent Application using the Responses API protocol,"After publishing, you can invoke your agent application using the Responses API protocol or the Activity Protocol. The Activity Protocol is used when your agent is published to Microsoft 365 and Teams. This article focuses on how you invoke your Agent Application using the Responses API protocol.",2026-03-06T23:10:00.000Z,how-to,integrations,0.6,True,How-to for using the Responses API protocol; likely includes protocol-specific request/response schema and parameters unique to Foundry.,unchanged +https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/publish-copilot,Publish to Microsoft 365 Copilot and Teams,Publish agents to Microsoft 365 Copilot and Microsoft Teams - Microsoft Foundry,,Publish a Microsoft Foundry agent to Microsoft 365 Copilot and Microsoft Teams from the Foundry portal.,"After you build and test an agent, the next step is often sharing it with others in the surfaces where they already work. Publishing a Foundry agent to Microsoft 365 Copilot and Teams lets you and others interact with and discover your agent through the Microsoft 365 Copilot and Teams UI. 
What gets published is the agent's stable endpoint, so end users always interact with a consistent agent entity while you seamlessly roll out new agent versions that receive traffic through the endpoint. You pu",2026-04-22T17:23:00.000Z,how-to,,0.3,False,"Publishing to Microsoft 365 Copilot and Teams from the portal is likely a procedural tutorial without detailed limits, config matrices, or product-specific error mappings.",updated https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/structured-inputs,Structured inputs,Customize Agent Behavior at Runtime with Structured Inputs - Microsoft Foundry,Configure structured inputs for Foundry agents,"Learn how to customize agent behavior at runtime using structured inputs. Define placeholders with handlebar templates, dynamically configure agent instructions and tools, and pass values at runtime t","You can customize agent instructions at runtime using structured inputs. Structured inputs are placeholders defined in the agent using handlebar template syntax ({{variableName}}). At runtime, you provide actual values to dynamically customize agent instructions, tool resource configurations, and response parameters—without creating separate agent versions for each configuration. In this article, you learn how to:",2026-03-31T08:00:00.000Z,how-to,configuration,0.7,True,"Details structured input placeholders, handlebar syntax, and how to pass values at runtime to configure instructions and tools; this is product-specific configuration behavior.",unchanged https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/agent-to-agent,Agent2Agent (A2A) (preview),Connect to an A2A agent endpoint from Foundry Agent Service - Microsoft Foundry,Connect Foundry agents to external A2A endpoints,"Connect your Foundry agent to a remote Agent2Agent (A2A) endpoint. 
Configure connections, authentication, and use SDK integration to call external A2A agents.","Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Note For information on optimizing tool usage, see best practices. You can extend the capabilities of your Microsoft Foundry agent by connecting t",2026-04-03T22:09:00.000Z,how-to,integrations,0.75,True,Describes configuring connections and authentication to remote Agent2Agent endpoints and using SDK integration; product-specific integration details and patterns.,unchanged https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/ai-search,Azure AI Search,Connect an Azure AI Search index to Foundry agents - Microsoft Foundry,Connect Azure AI Search indexes to agents,"Connect Azure AI Search indexes to Foundry agents for grounding responses with citations. Includes Python, C#, TypeScript, and REST samples.","Tip For a managed knowledge base experience, see Foundry IQ. For tool optimization, see best practices. Ground your Foundry agent's responses in your proprietary content by connecting it to an Azure AI Search index. The Azure AI Search tool retrieves indexed documents and generates answers with inline citations, enabling accurate, source-backed responses. 
Important If you want to use a private virtual network with the Azure AI Search tool, make sure you use Microsoft Entra project managed identity t",2026-03-30T08:00:00.000Z,how-to,integrations,0.8,True,"How-to for wiring Azure AI Search as a tool with Foundry agents, including configuration of indexes and tool parameters; clearly an integration pattern between services.",unchanged @@ -44,24 +49,28 @@ https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/file-search, https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/function-calling,Function calling,Use function calling with Microsoft Foundry agents - Microsoft Foundry,Use function calling tools with Foundry agents,"Use function calling to extend Microsoft Foundry agents with custom functions. Define tools with Python, C#, TypeScript, or REST and return outputs to the agent.","Microsoft Foundry agents support function calling, which lets you extend agents with custom capabilities. Define a function with its name, parameters, and description, and the agent can request your app to call it. Your app executes the function and returns the output. The agent then uses the result to continue the conversation with accurate, real-time data from your systems. Important Runs expire 10 minutes after creation. Submit your tool outputs before they expire. You can run agents with fun",2026-04-02T06:08:00.000Z,how-to,integrations,0.8,True,"Defines how to declare tools/functions with names, parameters, and descriptions and handle tool outputs; includes a specific 10-minute run expiry limit (expert detail) and SDK/REST parameters, fitting integrations with some limits-quotas aspect.",unchanged https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/governance,Tool governance,Govern MCP Tools by Using an AI Gateway - Microsoft Foundry,Govern MCP tools with an AI gateway in Foundry,"Learn how to govern MCP tools by using an AI gateway in Microsoft Foundry. 
Apply rate limits, IP filters, and routing policies by using Azure API Management.","Control how your agents access external tools by routing Model Context Protocol (MCP) traffic through an AI gateway in Microsoft Foundry. An AI gateway provides a single, governed entry point where you can enforce authentication, rate limits, IP restrictions, and audit logging without modifying your MCP servers or agent code. This feature is in preview. Only new MCP tools created in the Foundry portal that don't use managed OAuth are routed through an AI gateway.",2026-02-27T23:08:00.000Z,how-to,security,0.7,True,"Describes enforcing authentication, rate limits, IP restrictions, and routing via Azure API Management; contains security and governance configuration patterns specific to MCP tools.",unchanged https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/image-generation,Image generation (preview),Use the image generation tool (preview) in Foundry Agent Service - Microsoft Foundry,Use image generation tool in Foundry Agent Service,"Generate images from text prompts with the image generation tool in Microsoft Foundry Agent Service. Configure agents, deploy models, and save output.",Important The image generation tool in Microsoft Foundry Agent Service generates images from text prompts in conversations and multistep workflows. Use it to create AI-generated visuals and return base64-encoded output that you can save to a file.,2026-04-02T06:08:00.000Z,how-to,integrations,0.65,True,"Describes a specific tool within Foundry Agent Service, including how to configure agents, deploy models, and handle base64-encoded outputs. 
Likely includes tool-specific parameters and configuration patterns beyond generic image generation concepts.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/model-context-protocol,Connect to MCP server,Connect to MCP Server Endpoints for agents - Microsoft Foundry,Connect Foundry agents to MCP servers,Connect your Foundry agents to Model Context Protocol (MCP) servers using the MCP tool. Extend capabilities with external tools and data.,"Connect your Foundry agents toModel Context Protocol (MCP)servers by using the MCP tool. This extends agent capabilities with external tools and data sources. By connecting to remote MCP server endpoints, your agents can access tools hosted by developers and organizations that MCP-compatible clients like Foundry Agent Service can use. MCP is an open standard that defines how applications provide tools and contextual data to large language models (LLMs). It enables consistent, scalable integratio",2026-04-06T17:06:00.000Z,how-to,integrations,0.8,True,"Focuses on connecting agents to MCP server endpoints using the MCP tool; this is a product-specific integration pattern with external MCP-compatible tools and data sources, likely including endpoint and tool configuration details.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/model-context-protocol,Connect to MCP server,Connect to MCP Server Endpoints for agents - Microsoft Foundry,Integrate Foundry agents with MCP server endpoints,Connect your Foundry agents to Model Context Protocol (MCP) servers using the MCP tool. Extend capabilities with external tools and data.,"Connect your Foundry agents to Model Context Protocol (MCP) servers by using the MCP tool. This connection extends agent capabilities with external tools and data sources. By connecting to remote MCP server endpoints, your agents can access tools hosted by developers and organizations that MCP-compatible clients like Foundry Agent Service can use. 
MCP is an open standard that defines how applications provide tools and contextual data to large language models (LLMs). It enables consistent, scalable",2026-04-23T22:15:00.000Z,how-to,integrations,0.8,True,"How-to for connecting agents to MCP servers using the MCP tool; expected to document MCP-specific configuration fields, endpoints, and parameters, which are product-specific integration details.",updated https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/openapi,OpenAPI tool,Connect OpenAPI tools to Microsoft Foundry agents - Microsoft Foundry,Connect OpenAPI tools to Microsoft Foundry agents,"Connect OpenAPI 3.0 and 3.1 tools to Microsoft Foundry agents using API key, managed identity, or anonymous authentication to integrate external APIs.","Connect your Microsoft Foundry agents to external APIs using OpenAPI 3.0 and 3.1 specifications. Agents that connect to OpenAPI tools can call external services, retrieve real-time data, and extend their capabilities beyond built-in functions. OpenAPI specifications define a standard way to describe HTTP APIs so you can integrate existing services with your agents. Microsoft Foundry supports three authentication methods: anonymous, API key, and managed identity. For help choosing an authentication m",2026-03-30T08:00:00.000Z,how-to,integrations,0.85,True,"Explicitly about integrating OpenAPI 3.0/3.1 tools with Foundry agents using specific auth methods (API key, managed identity, anonymous). This will include configuration parameters, auth settings, and tool wiring details unique to this product.",unchanged https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/sharepoint,SharePoint (preview),Use SharePoint content with agent API - Microsoft Foundry,Integrate Foundry agents with SharePoint content,"Learn how to ground Microsoft Foundry agents with SharePoint content using the agent API. 
Connect to SharePoint sites or folders, use identity passthrough, and keep enterprise access controls intact.","Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Note Use the SharePoint tool (preview) for SharePoint grounding in Microsoft Foundry Agent Service by retrieving content from a SharePoint site o",2026-04-06T17:06:00.000Z,how-to,integrations,0.8,True,"Describes using the SharePoint tool with the Foundry agent API, including identity passthrough and grounding. This integration is product-specific and likely documents concrete parameters (site/folder identifiers, auth settings, tool configuration) and constraints for the SharePoint tool, fitting the integrations & coding patterns category.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/skills,Skills (preview),Use skills with Microsoft Foundry agents (preview) - Microsoft Foundry,Define and manage SKILL.md skills for Foundry agents,"Manage skills in Microsoft Foundry using the Skills REST API. Author SKILL.md files, store them centrally, and use them in hosted agents.","Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. As agents grow beyond simple prototypes, teams accumulate behavioral guidelines that need to be consistent across every conversation. 
A support a",2026-04-23T08:00:00.000Z,how-to,configuration,0.7,True,How-to for using Skills REST API and SKILL.md implies concrete schema/fields and configuration details for skills that are product-specific and not general LLM knowledge.,new +https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/toolbox,Toolbox (preview),Curate intent-based toolbox in Foundry (preview) - Microsoft Foundry,Configure intent-based toolboxes for hosted agents,"Use toolbox in Microsoft Foundry to add MCP servers, web search, Azure AI Search, file search, code interpreter tool and more to hosted agents through a single managed endpoint.","Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. A single agent can depend on multiple tools - APIs, Model Context Protocol (MCP) servers, connectors, and flows - each with its own authenticatio",2026-04-23T08:00:00.000Z,how-to,configuration,0.7,True,"Describes adding MCP servers, web search, Azure AI Search, file search, and more via a single managed endpoint; likely includes toolbox configuration parameters and options unique to Foundry.",new https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/web-overview,Web search overview,Overview of web grounding capabilities in Foundry - Microsoft Foundry,Choose web grounding tools for Foundry agents,"Learn how to choose the right web grounding tool for your Microsoft Foundry agents. Compare Web Search, Grounding with Bing Search, and Bing Custom Search.","Web grounding tools in Microsoft Foundry Agent Service connect your agents to real-time public web data, overcoming the knowledge cutoff that limits large language models. 
For example, you can ask questions such as ""what is the top AI news today"" and receive current, cited answers.",2026-04-08T22:11:00.000Z,concept-article,decision-making,0.75,True,"Explicitly about choosing between Web Search, Grounding with Bing Search, and Bing Custom Search; likely includes comparison criteria and scenario-based recommendations, which is product-specific decision guidance.",unchanged https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/web-search,Web search tool,Use web search tool in Foundry Agent Service - Microsoft Foundry,Configure and use web search tool in Foundry,Use the web search tool in Foundry Agent Service to retrieve real-time information and ground AI responses. Includes code examples.,"The web search tool in Foundry Agent Service enables models to retrieve and ground responses with real-time information from the public web before generating output. When enabled, the model can return up-to-date answers with inline citations, helping you build agents that provide current, factual information to users. Important The following table shows SDK and setup support.",2026-04-07T08:00:00.000Z,how-to,configuration,0.7,True,"Describes enabling and using the web search tool, and mentions a support table for SDK and setup; this implies concrete configuration options and constraints specific to Foundry’s web search integration.",unchanged https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/use-your-own-resources,Use your own Azure resources,Use your own resources in the Foundry Agent Service - Microsoft Foundry,Configure Foundry Agent Service to use existing Azure resources,Learn how to use resources that you already have with the Foundry Agent Service.,"By default, Foundry Agent Service manages storage for files, conversations, and vector stores. If your organization requires full data ownership, customer-managed keys (CMK), or network isolation, you can connect your own Azure resources instead. 
This article shows you how to configure the deployment templates to use existing Azure OpenAI, Azure Storage, Azure Cosmos DB, and Azure AI Search resources with Agent Service.",2026-02-27T23:08:00.000Z,how-to,configuration,0.78,True,"How-to article for wiring Foundry Agent Service to existing Azure OpenAI, Storage, Cosmos DB, and AI Search. Likely includes resource- and template-specific parameters (Bicep/Terraform or ARM fields, property names, and required settings) that are product-specific configuration details rather than generic concepts.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/virtual-networks,Virtual networks,Set up private networking for Foundry Agent Service - Microsoft Foundry,Set up private networking for Foundry Agent Service,"Set up private networking for Foundry Agent Service using Bicep or Terraform. Deploy a virtual network with private endpoints, DNS zones, and deny-by-default network rules.","Foundry Agent Service offers a Standard Setup with private networking environment, where you bring your own (BYO) private virtual network. This setup creates an isolated network environment that enables secure access to data while maintaining full control over your network infrastructure. By default, the Standard Setup with private networking ensures: If you don't have an existing virtual network, the Standard Setup with private networking template simplifies deployment by automatically provisioni",2026-03-13T17:15:00.000Z,how-to,security,0.76,True,"Describes private networking setup with BYO virtual network, private endpoints, DNS zones, and deny-by-default rules. 
This typically includes specific network configuration parameters, resource types, and required settings unique to Foundry Agent Service, which are security/networking configuration details.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/virtual-networks,Virtual networks,Set up private networking for Foundry Agent Service - Microsoft Foundry,Configure private networking for Foundry Agent Service,"Set up private networking for Foundry Agent Service using Bicep or Terraform. Deploy a virtual network with private endpoints, DNS zones, and deny-by-default network rules.","Foundry Agent Service offers a Standard Setup with private networking environment. This setup creates an isolated network environment that enables secure access to data while maintaining full control over your network infrastructure. By default, the Standard Setup with private networking ensures: If you don't have an existing virtual network, the Standard Setup with private networking flow can provision the necessary network infrastructure for you.",2026-04-14T22:13:00.000Z,how-to,security,0.7,True,"Describes setting up private networking using Bicep/Terraform, deploying VNets, private endpoints, DNS zones, and deny-by-default rules. This is product-specific network security configuration and likely includes concrete resource names, settings, and patterns for secure access.",updated https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/vs-code-agents-workflow-low-code,Use declarative agent workflows in the Visual Studio Code extension,Add declarative agent workflows in Visual Studio Code - Microsoft Foundry,Configure declarative agent workflows in VS Code,Add and test declarative agent workflows in Foundry Agent Service by using the Microsoft Foundry for Visual Studio Code extension. 
Convert YAML workflows to Agent Framework code.,"Declarative agent workflows define predefined sequences of actions for your agents using configurations rather than explicit programming logic. In this article, you add Foundry Agent workflows to an agent and test them by using the Microsoft Foundry for Visual Studio Code (VS Code) extension. After you build an agent in Foundry Agent Service in the portal, you can add workflows to orchestrate multiple agents into predefined action sequences for complex automation scenarios. Important Items marked (pr",2026-02-27T23:08:00.000Z,how-to,configuration,0.65,True,"Describes YAML-based declarative workflows for Foundry Agent Service; likely includes schema fields, allowed values, and configuration options.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/vs-code-agents-workflow-pro-code,Use hosted agent workflows in the Visual Studio Code extension,Create hosted agent workflows in Visual Studio Code - Microsoft Foundry,Create and deploy hosted Foundry agent workflows in VS Code,"Create, test, and deploy hosted agent workflows in Foundry Agent Service by using the Microsoft Foundry for Visual Studio Code extension.","Create, test, and deploy hosted Foundry Agent workflows by using the Microsoft Foundry for Visual Studio Code extension. Hosted workflows let multiple agents collaborate in sequence, each with its own model, tools, and instructions. Before you start, build an agent in Foundry Agent Service by using the extension. You can then add hosted workflows to that agent. 
This article covers creating a workflow project, running it locally, visualizing the execution, and deploying it to your Foundry workspace.",2026-04-16T22:08:00.000Z,how-to,deployment,0.7,True,"Covers creating, running, and deploying hosted workflows via the VS Code extension; likely includes product-specific deployment steps, configuration options, and constraints for Foundry Agent Service.",updated -https://learn.microsoft.com/en-us/azure/foundry/agents/overview,What is the Foundry Agent service,What is Microsoft Foundry Agent Service? - Microsoft Foundry,,"Learn about Microsoft Foundry Agent Service capabilities, agent types, tools, and runtime features for building AI agents.","Foundry Agent Service is a fully managed platform for building, deploying, and scaling AI agents. Use any framework and many models from the Foundry model catalog. Create no-code prompt agents in the Foundry portal, or use the available SDKs and REST API to deploy them and code-based hosted agents built with Agent Framework, LangGraph, or your own code. Agent Service handles hosting, scaling, identity, observability, and enterprise security so you can focus on your agent logic.",2026-04-16T16:35:00.000Z,overview,,0.1,False,"Conceptual overview of Foundry Agent Service capabilities and agent types; no specific quotas, config matrices, or error-resolution content.",updated +https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/vs-code-agents-workflow-pro-code,Use hosted agent workflows in the Visual Studio Code extension,Create hosted agent workflows in Visual Studio Code - Microsoft Foundry,Create and deploy hosted Foundry agent workflows in VS Code,"Create, test, and deploy hosted agent workflows in Foundry Agent Service by using the Microsoft Foundry for Visual Studio Code extension.","Create, test, and deploy hosted Foundry Agent workflows by using the Microsoft Foundry for Visual Studio Code extension. 
Hosted workflows let multiple agents collaborate in sequence, each with its own model, tools, and instructions. Before you start, build an agent in Foundry Agent Service by using the extension. You can then add hosted workflows to that agent. This article covers creating a workflow project, running it locally, visualizing the execution, and deploying it to your Foundry workspace.",2026-04-16T22:08:00.000Z,how-to,deployment,0.7,True,"Covers creating, running, and deploying hosted workflows via the VS Code extension; likely includes product-specific deployment steps, configuration options, and constraints for Foundry Agent Service.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/agents/overview,What is the Foundry Agent service,What is Microsoft Foundry Agent Service? - Microsoft Foundry,,"Learn about Microsoft Foundry Agent Service capabilities, agent types, tools, and runtime features for building AI agents.","Foundry Agent Service is a fully managed platform for building, deploying, and scaling AI agents. Use any framework and many models from the Foundry model catalog. Create no-code prompt agents in the Foundry portal, or use the available SDKs and REST API to deploy them and code-based hosted agents built with Agent Framework, LangGraph, or your own code. Agent Service handles hosting, scaling, identity, observability, and enterprise security so you can focus on your agent logic.",2026-04-22T17:23:00.000Z,overview,,0.2,False,"High-level overview of Foundry Agent Service capabilities and agent types; no concrete limits, configs, error codes, or decision matrices.",updated https://learn.microsoft.com/en-us/azure/foundry/agents/quickstarts/prompt-agent,Quickstart - Create a prompt agent,Quickstart: Create a prompt agent - Microsoft Foundry,,Create a prompt agent in Foundry Agent Service using the Microsoft Foundry SDK.,"In this quickstart, you create a prompt agent in Foundry Agent Service and have a conversation with it. 
A prompt agent is a declaratively defined agent that combines model configuration, instructions, tools, and natural language prompts to drive behavior. If you don't have an Azure subscription, create a free account.",2026-03-30T08:00:00.000Z,quickstart,,0.35,False,Quickstart for creating a prompt agent; likely shows basic SDK usage and portal steps without detailed configuration catalogs or limits.,unchanged -https://learn.microsoft.com/en-us/azure/foundry/agents/quickstarts/quickstart-hosted-agent,Quickstart - Deploy a hosted agent,Quickstart: Deploy your first hosted agent - Microsoft Foundry,,Learn how to deploy a containerized AI agent to Foundry Agent Service using the Azure Developer CLI or Microsoft Foundry for VS Code.,"In this quickstart, you deploy a containerized AI agent with Foundry tools to Foundry Agent Service. The sample agent uses web search and optionally MCP tools to answer questions. By the end, you have a running hosted agent that you can interact with through the Foundry playground. Choose your preferred deployment method to get started. In this quickstart, you:",2026-04-02T22:14:00.000Z,quickstart,,0.35,False,"Quickstart for deploying a hosted agent using CLI/VS Code; focuses on basic deployment flow, not on tier matrices, constraints, or advanced deployment patterns.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/concepts/ai-red-teaming-agent,AI red teaming agent overview,AI Red Teaming Agent - Microsoft Foundry,,This article provides a conceptual overview of the AI Red Teaming Agent.,"Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. 
The AI Red Teaming Agent is a powerful tool designed to help organizations proactively find safety risks associated with generative AI systems du",2026-02-27T23:08:00.000Z,concept-article,,0.2,False,Explicitly described as a conceptual overview of AI Red Teaming Agent; overviews are out of scope.,unchanged -https://learn.microsoft.com/en-us/azure/foundry/concepts/architecture,Service architecture,Microsoft Foundry architecture - Microsoft Foundry,,Learn about the architecture of Microsoft Foundry.,"Microsoft Foundry organizes AI workloads through a layered architecture: a top-level Foundry resource for governance, projects for development isolation, and connected Azure services for storage, search, and secrets management. This article provides IT operations and security teams with details on the Foundry resource and underlying Azure service architecture, its components, and its relation with other Azure resource types. Use this information to guide how to customize your Foundry deployment to",2026-04-16T22:08:00.000Z,concept-article,,0.2,False,"Architecture overview of Foundry resource and related Azure services; described as conceptual guidance for IT and security teams without clear indication of quantified thresholds, decision matrices, or detailed configuration parameters.",updated +https://learn.microsoft.com/en-us/azure/foundry/agents/quickstarts/quickstart-agent-catalog,Quickstart - Create an agent from a manifest,Quickstart: Create an agent from a manifest - Microsoft Foundry,,"Create and test an AI agent from the Foundry agent manifests. Pick a pre-built manifest, configure a model, and chat with your agent in the playground.","In this quickstart, you create an agent from a pre-built manifest and test it in the playground. Agent manifests are ready-to-use configurations that combine tested prompts, tool configurations, and interaction patterns so you can skip writing instructions from scratch. 
In this quickstart, you:",2026-04-21T22:14:00.000Z,quickstart,,0.3,False,Quickstart workflow for creating an agent from a manifest; primarily step-by-step tutorial without detailed configuration tables or limits.,new +https://learn.microsoft.com/en-us/azure/foundry/agents/quickstarts/quickstart-hosted-agent,Quickstart - Deploy a hosted agent,Quickstart: Deploy your first hosted agent - Microsoft Foundry,Deploy a containerized hosted agent to Foundry,Learn how to deploy a containerized AI agent to Foundry Agent Service using the Azure Developer CLI or Microsoft Foundry Toolkit extension for VS Code.,"In this quickstart, you deploy a containerized AI agent with Foundry tools to Foundry Agent Service. The sample agent uses web search and optionally Model Context Protocol (MCP) tools to answer questions. By the end, you have a running hosted agent that you can interact with through the Foundry playground. Choose your preferred deployment method to get started. In this quickstart, you:",2026-04-23T17:31:00.000Z,quickstart,deployment,0.6,True,"Quickstart for deploying containerized agents using azd or VS Code; likely includes Foundry-specific deployment commands, required resources, and constraints unique to the service.",updated +https://learn.microsoft.com/en-us/azure/foundry/concepts/ai-red-teaming-agent,AI red teaming agent overview,AI Red Teaming Agent - Microsoft Foundry,,This article provides a conceptual overview of the AI Red Teaming Agent.,"The AI Red Teaming Agent is a powerful tool designed to help organizations proactively find safety risks associated with generative AI systems during design and development of generative AI models and applications. Traditional red teaming involves exploiting the cyber kill chain and describes the process by which a system is tested for security vulnerabilities. 
However, with the rise of generative AI, the term AI red teaming has been coined to describe probing for novel risks (both content and s",2026-04-22T17:23:00.000Z,concept-article,,0.2,False,"Conceptual overview of AI Red Teaming Agent; focuses on what it is and why, not on specific configuration parameters, limits, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/foundry/concepts/architecture,Service architecture,Microsoft Foundry architecture - Microsoft Foundry,,Learn about the architecture of Microsoft Foundry.,"Microsoft Foundry organizes AI workloads through a layered architecture: a top-level Foundry resource for governance, projects for development isolation, and connected Azure services for storage, search, and secrets management. This article provides IT operations and security teams with details on the Foundry resource and underlying Azure service architecture, its components, and its relation with other Azure resource types. Use this information to guide how to customize your Foundry deployment to",2026-04-16T22:08:00.000Z,concept-article,,0.2,False,"Architecture overview of Foundry resource and related Azure services; described as conceptual guidance for IT and security teams without clear indication of quantified thresholds, decision matrices, or detailed configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/foundry/concepts/ask-ai,Ask AI,Ask AI for help - Microsoft Foundry,,"Learn how to ask AI for help, getting your questions answered and tasks supported.","You can ask AI to assist you in Foundry. To start using AI to ask questions or complete tasks, select its icon located in the top right bar of the Microsoft Foundry portal. A chat window opens where you can type your questions and receive answers in real time. You can also ask the agent to run tasks for you. Important Items marked (preview) in this article are currently in public preview. 
This preview is provided without a service-level agreement, and we don't recommend it for production workloads",2026-04-06T08:00:00.000Z,concept-article,,0.1,False,"High-level guidance on using Ask AI in the Foundry portal; no detailed configuration parameters, limits, or product-specific error/decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/foundry/concepts/authentication-authorization-foundry,Authentication and authorization,Authentication and authorization in Microsoft Foundry - Microsoft Foundry,Configure authentication and authorization for Microsoft Foundry,"Learn how to authenticate and authorize access in Microsoft Foundry using Microsoft Entra ID and API keys. Explore RBAC, identity types, and best practices.","Authentication and authorization in Microsoft Foundry control how principals prove identity and gain permission to perform operations. Foundry divides operations into control plane (resource management) and data plane (runtime usage), each with its own authentication and role-based access control (RBAC) surface. Foundry supports two authentication methods: Microsoft Entra ID and API keys. Microsoft Entra ID enables conditional access, managed identities, and granular RBAC. API keys remain availa",2026-02-27T23:08:00.000Z,concept-article,security,0.82,True,"Details Entra ID vs API key auth, control-plane vs data-plane RBAC, and identity types. Likely lists specific RBAC roles, scopes, and permission surfaces unique to Foundry, which are product-specific security configuration details.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/concepts/built-in-evaluators,Built-in evaluators reference,Built-in Evaluators Reference - Microsoft Foundry,Reference for all Foundry built-in evaluators,Comprehensive reference for all built-in evaluators in Microsoft Foundry,"Important Items marked (preview) in this article are currently in public preview. 
This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Microsoft Foundry provides a comprehensive set of built-in evaluators to assess the quality, safety, and reliability of AI responses throughout t",2026-03-07T06:07:00.000Z,reference,configuration,0.7,True,"Comprehensive reference will list evaluator names, input/output schemas, parameters, and allowed values, which are detailed configuration references for evaluation features.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/concepts/built-in-evaluators,Built-in evaluators reference,Built-in Evaluators Reference - Microsoft Foundry,,Comprehensive reference for all built-in evaluators in Microsoft Foundry,"Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. 
Microsoft Foundry includes built-in evaluators to assess the quality, safety, and reliability of AI responses throughout the development lifecycl",2026-04-03T08:00:00.000Z,reference,,0.4,False,"Reference for built-in evaluators but summary doesn’t indicate detailed config tables, limits, or error mappings; likely describes what evaluators do rather than product-specific parameters.",updated https://learn.microsoft.com/en-us/azure/foundry/concepts/concept-playgrounds,Explore the model playgrounds,Microsoft Foundry Playgrounds - Microsoft Foundry,,"Learn how to use Microsoft Foundry playgrounds for rapid prototyping, experimentation, and validation with AI models before production deployment.","Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Microsoft Foundry playgrounds provide an on-demand, instant chat environment for rapid prototyping, API exploration, and technical validation. Us",2026-02-27T23:08:00.000Z,concept-article,,0.35,False,Conceptual overview of Foundry playgrounds for prototyping; summary doesn’t show detailed configuration parameters or numeric constraints.,unchanged +https://learn.microsoft.com/en-us/azure/foundry/concepts/deployments-overview,Deployment overview,Deployment overview for Microsoft Foundry Models - Microsoft Foundry,,"Learn about deployment options for Microsoft Foundry Models, including standard deployments in Foundry resources and managed compute for partner and community models.","Microsoft Foundry Models is the hub for discovering and deploying a wide range of AI models for generative AI applications. To make a model available for inference requests, you deploy it. 
Foundry offers two deployment options depending on the model type and your infrastructure needs.",2026-04-23T22:15:00.000Z,concept-article,,0.3,False,"Deployment overview describes available deployment options and concepts. The summary doesn’t indicate detailed deployment matrices, constraints by tier, or timing requirements; it’s likely high-level guidance.",new https://learn.microsoft.com/en-us/azure/foundry/concepts/encryption-keys-portal,Configure customer-managed keys,Customer-Managed Keys (CMKs) for Microsoft Foundry - Microsoft Foundry,Configure customer-managed keys for Microsoft Foundry encryption,Learn how to use CMKs for enhanced encryption and data security in Microsoft Foundry. Configure Azure Key Vault integration and meet compliance requirements.,"Customer-managed key (CMK) encryption in Microsoft Foundry gives you control over encryption of your data. Use CMKs to add an extra protection layer and help meet compliance requirements with Azure Key Vault integration. Microsoft Foundry provides robust encryption capabilities, including the ability to use CMKs stored in Key Vault to help secure your sensitive data. CMK encryption applies to data at rest stored in the Foundry resource's associated storage accounts, including project artifacts, up",2026-03-06T23:10:00.000Z,how-to,security,0.8,True,"Describes CMK encryption, Key Vault integration, and which data stores are covered. Typically includes key vault configuration, key identifiers, and scope of encryption specific to Foundry, which are product-specific security settings.",unchanged https://learn.microsoft.com/en-us/azure/foundry/concepts/evaluation-evaluators/agent-evaluators,Agent evaluators,Agent Evaluators for Generative AI - Microsoft Foundry,Evaluate Azure AI agents with task-specific metrics,"Learn how to evaluate Azure AI agents using intent resolution, tool call accuracy, and task adherence evaluators.","Important Items marked (preview) in this article are currently in public preview. 
This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. AI agents are powerful productivity assistants that can create workflows for business needs. However, observability can be a challenge due to the",2026-04-04T06:03:00.000Z,reference,best-practices,0.65,True,"Agent evaluators (intent resolution, tool call accuracy, task adherence) are product-specific evaluation constructs with concrete, prescriptive guidance on how to apply them to Azure AI agents. This is actionable, service-unique best-practices style content rather than generic LLM theory.",unchanged https://learn.microsoft.com/en-us/azure/foundry/concepts/evaluation-evaluators/azure-openai-graders,Azure OpenAI evaluators,Azure OpenAI Graders for generative AI - Microsoft Foundry,,"Learn about Azure OpenAI Graders for evaluating AI model outputs, including label grading, string checking, text similarity, and custom grading.",Azure OpenAI graders are evaluation tools in the Microsoft Foundry SDK that assess the performance of AI models and their outputs using either LLM-based scoring or deterministic comparison. These graders include: You can run graders locally or remotely. Each grader assesses specific aspects of AI models and their outputs.,2026-04-08T06:04:00.000Z,reference,,0.3,False,"Overview of Azure OpenAI graders and what they assess (label grading, string checking, similarity, custom grading). 
No detailed config tables, limits, or troubleshooting mappings; primarily conceptual feature description.",unchanged @@ -70,21 +79,20 @@ https://learn.microsoft.com/en-us/azure/foundry/concepts/evaluation-evaluators/g https://learn.microsoft.com/en-us/azure/foundry/concepts/evaluation-evaluators/rag-evaluators,Retrieval Augmented Generation (RAG) evaluators,Retrieval-Augmented Generation (RAG) Evaluators for Generative AI - Microsoft Foundry,,"Learn about Retrieval-Augmented Generation evaluators for assessing relevance, groundedness, and response completeness in generative AI systems.",A Retrieval-Augmented Generation (RAG) system tries to generate the most relevant answer consistent with grounding documents in response to a user's query. A user's query triggers a search retrieval in the corpus of grounding documents to provide grounding context for the AI model to generate a response. Think about groundedness and response completeness as:,2026-04-04T06:03:00.000Z,reference,,0.35,False,"Conceptual explanation of RAG evaluators (relevance, groundedness, completeness); summary does not show numeric thresholds, config tables, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/foundry/concepts/evaluation-evaluators/risk-safety-evaluators,Risk and safety evaluators,Risk and Safety Evaluators for Generative AI - Microsoft Foundry,,"Learn about risk and safety evaluators for generative AI, including tools for assessing content safety, jailbreak vulnerabilities, and code security risks.","Risk and safety evaluators draw on insights gained from our previous large language model (LLM) projects such as GitHub Copilot and Bing. This approach ensures a comprehensive approach to evaluating generated responses for risk and safety severity scores. These evaluators are generated through the Microsoft Foundry Evaluation service, which employs a set of language models. 
Each model assesses specific risks that could be present in the response from your AI system. Specific risks include sexual",2026-04-02T08:00:00.000Z,reference,,0.2,False,"High-level description of risk and safety evaluators and types of risks assessed. Lacks specific configuration parameters, role names, error codes, or decision matrices with thresholds.",unchanged https://learn.microsoft.com/en-us/azure/foundry/concepts/evaluation-evaluators/textual-similarity-evaluators,Textual similarity evaluators,Textual Similarity Evaluators for Generative AI - Microsoft Foundry,,"Learn about textual similarity evaluators for generative AI, including semantic similarity, F1 score, BLEU, GLEU, ROUGE, and METEOR metrics.","It's important to compare how closely the textual response generated by your AI system matches the response you would expect. The expected response is called the ground truth. Use an LLM-judge metric like Similarity with a focus on the semantic similarity between the generated response and the ground truth. Or, use metrics from the field of natural language processing (NLP), including F1 score, BLEU, GLEU, ROUGE, and METEOR with a focus on the overlaps of tokens or n-grams between the two.",2026-04-08T06:04:00.000Z,reference,,0.2,False,"Describes evaluation metrics (semantic similarity, F1, BLEU, ROUGE, METEOR) conceptually without product-specific limits, configs, or error mappings. 
No concrete configuration tables, thresholds, or service-specific parameters.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/concepts/evaluation-regions-limits-virtual-network,"Rate limits, region support, and enterprise features","Rate limits, region support, and enterprise features for evaluation - Microsoft Foundry",Evaluation rate limits and region support in Foundry,"Learn about region availability, rate limits, virtual network support, and using your own storage account for evaluation in Microsoft Foundry.","This article provides an overview of which regions support AI-assisted evaluators, the rate limits that apply to evaluation runs, how to configure virtual network support for network isolation, and using your own storage account to run evaluations.",2026-03-20T22:11:00.000Z,how-to,limits-quotas,0.8,True,"Explicitly about region availability and rate limits for evaluation; such pages typically list numeric TPS/TPM limits and region tables, matching limits-quotas criteria.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/concepts/foundry-models-overview,Foundry Models overview,Microsoft Foundry Models overview - Microsoft Foundry,,"Discover and deploy AI models with Microsoft Foundry Models. Browse 1,900+ models from OpenAI, Meta, and more to build scalable AI solutions. Explore now.","Microsoft Foundry Models is your one-stop destination for discovering, evaluating, and deploying powerful AI models—whether you're building a custom copilot, an agent, enhancing an existing application, or exploring new AI capabilities. With Foundry Models, you can: Whether you're a developer, data scientist, or enterprise architect, Foundry Models gives you the flexibility and control to build AI solutions that scale—securely, responsibly, and fast. 
Foundry offers a comprehensive catalog of AI ",2026-03-05T06:04:00.000Z,concept-article,,0.3,False,"High-level overview of Foundry Models catalog and capabilities; summary doesn’t indicate numeric limits, config tables, or decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/concepts/evaluation-regions-limits-virtual-network,"Rate limits, region support, and enterprise features","Rate limits, region support, and enterprise features for evaluation - Microsoft Foundry",Check Foundry evaluation regions and rate limits,"Learn about region availability, rate limits, virtual network support, and using your own storage account for evaluation in Microsoft Foundry.","This article provides an overview of which regions support AI-assisted evaluators, the rate limits that apply to evaluation runs, how to configure virtual network support for network isolation, and using your own storage account to run evaluations.",2026-04-20T11:04:00.000Z,how-to,limits-quotas,0.8,True,Explicitly about region availability and rate limits for evaluations; likely includes concrete per-region limits and possibly tables of quotas and constraints that are product-specific.,updated +https://learn.microsoft.com/en-us/azure/foundry/concepts/foundry-models-overview,Foundry Models overview,Microsoft Foundry Models overview - Microsoft Foundry,,"Discover and deploy AI models with Microsoft Foundry Models. Browse 1,900+ models from OpenAI, Meta, and more to build scalable AI solutions. Explore now.","Microsoft Foundry Models is your one-stop destination for discovering, evaluating, and deploying powerful AI models—whether you're building a custom copilot, an agent, enhancing an existing application, or exploring new AI capabilities. With Foundry Models, you can: Whether you're a developer, data scientist, or enterprise architect, Foundry Models gives you the flexibility and control to build AI solutions that scale—securely, responsibly, and fast. 
Foundry offers a comprehensive catalog of AI ",2026-04-20T08:00:00.000Z,concept-article,,0.2,False,"Models overview is a high-level discovery/marketing-style page describing what Foundry Models is; unlikely to contain detailed limits, config tables, or decision matrices.",updated https://learn.microsoft.com/en-us/azure/foundry/concepts/general-availability,General availability overview,New Microsoft Foundry portal general availability overview - Microsoft Foundry,,"Learn what general availability means for Microsoft Foundry, including GA scope, supported scenarios, feature readiness, and migration guidance.","The new Microsoft Foundry portal is now generally available (GA). This milestone marks a shift from pilot-focused usage to secure, reliable, enterprise-ready production usage for core scenarios. Foundry is designed for teams that need to build, deploy, and operate AI systems at scale, with governance, security, and operational controls integrated throughout the lifecycle. Foundry unifies the end-to-end lifecycle across Discover, Build, and Operate so teams can move faster without trading off reliabi",2026-03-13T22:11:00.000Z,concept-article,,0.3,False,"GA overview is mostly release scope, supported scenarios, and migration messaging, not detailed technical configuration or limits.",unchanged https://learn.microsoft.com/en-us/azure/foundry/concepts/manage-costs,Plan and manage costs,Plan and Manage Costs - Microsoft Foundry,,"Manage Microsoft Foundry costs by estimating expenses, monitoring usage, and setting up alerts for spending anomalies with Microsoft Cost Management.","This article shows you how to estimate expenses before deployment, track spending in real time, and set up alerts to avoid budget surprises.",2026-04-06T08:00:00.000Z,how-to,,0.2,False,"The description indicates a generic cost management article: estimating expenses, tracking spending, and setting alerts using Microsoft Cost Management. 
It sounds like conceptual and procedural guidance without product-specific numeric limits, thresholds, or configuration tables. No clear evidence of expert-only details such as specific quotas, SKUs, or unique configuration parameters.",unchanged https://learn.microsoft.com/en-us/azure/foundry/concepts/model-benchmarks,Model leaderboards,Model benchmarks and leaderboards in Microsoft Foundry - Microsoft Foundry,Use Foundry model benchmarks and leaderboards for selection,"Compare AI models using quality, safety, cost, and performance benchmarks on the model leaderboards (preview) in Microsoft Foundry portal.","Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Model leaderboards (preview) in Foundry portal help you compare models in the Foundry model catalog using industry-standard model benchmarks. To ",2026-03-19T08:00:00.000Z,concept-article,decision-making,0.8,True,"Model leaderboards provide benchmark metrics (quality, safety, cost, performance) to compare models. This is explicitly for choosing between models using quantified trade-offs and comparison tables, fitting decision-making.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/concepts/model-lifecycle-retirement,Model lifecycle and retirement for Foundry Models,Model deprecation and retirement for Foundry Models - Microsoft Foundry,Plan for Foundry model deprecation and retirement,"Learn about model lifecycle stages, deprecation timelines, notifications, and migration steps for Microsoft Foundry Models.","Microsoft Foundry Models are continually refreshed with newer and more capable models. 
As part of this process, model providers might deprecate and retire their older models, and you might need to update your applications to use a newer model. This document communicates information about the model lifecycle and deprecation timelines and explains how you're informed of model lifecycle stages. This article covers general deprecation and retirement information for Foundry Models. For details specif",2026-03-20T06:04:00.000Z,concept-article,decision-making,0.65,True,"Describes lifecycle stages, deprecation timelines, notifications, and migration steps. This guides decisions about when and how to migrate between models and manage upgrades, fitting decision-making around lifecycle management.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/concepts/observability,Observability overview,Observability in Generative AI - Microsoft Foundry,,"Learn how Microsoft Foundry enables safe, high-quality generative AI through systematic evaluation and observability tools.","The AI application lifecycle requires robust evaluation frameworks to ensure AI systems deliver accurate, relevant, and reliable outputs. Without rigorous assessment, AI systems risk generating responses that are inaccurate, inconsistent, poorly grounded, or potentially harmful. 
Observability enables teams to measure and improve both the quality and safety of AI outputs throughout the development lifecycle—from model selection through production monitoring.",2026-03-28T06:04:00.000Z,concept-article,,0.3,False,"Conceptual observability overview for generative AI; summary does not indicate concrete config parameters, limits, or error mappings.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/concepts/planning,Plan rollout,Microsoft Foundry Rollout Across My Organization - Microsoft Foundry,Plan Microsoft Foundry rollout and environment strategy,"Learn how to plan the rollout of Microsoft Foundry across your organization, including environment setup, data isolation, and governance.","A structured rollout plan helps you avoid security gaps, cost overruns, and access sprawl when adopting Microsoft Foundry at scale. This guide outlines key decisions for rolling out Foundry, including environment setup, data isolation, integration with other Azure services, capacity management, and monitoring. Use this guide as a starting point and adapt it to your needs. For implementation details, see the linked articles.",2026-04-16T16:35:00.000Z,concept-article,decision-making,0.65,True,"The page provides product-specific rollout planning guidance for Microsoft Foundry, including key decisions around environment setup, data isolation, integration with other Azure services, capacity management, and monitoring. 
This is decision-making content that helps choose approaches and configurations for organizational adoption, beyond generic conceptual guidance.",updated -https://learn.microsoft.com/en-us/azure/foundry/concepts/rbac-foundry,Role-based access control,Role-based access control for Microsoft Foundry - Microsoft Foundry,Configure RBAC roles and scopes for Microsoft Foundry,This article introduces role-based access control in Microsoft Foundry portal.,"In this article, you learn core role-based access control (RBAC) concepts for Microsoft Foundry, including scopes, built-in roles, and common enterprise assignment patterns. Tip RBAC roles apply when you authenticate using Microsoft Entra ID. If you use key-based authentication instead, the key grants full access without role restrictions. Microsoft recommends using Entra ID authentication for improved security and granular access control. For more information about authentication and authorizat",2026-04-13T08:00:00.000Z,concept-article,security,0.7,True,"RBAC article for Foundry is likely to list specific built-in role names, scopes, and assignment patterns unique to the product, which matches the security sub-skill criteria for expert knowledge.",updated +https://learn.microsoft.com/en-us/azure/foundry/concepts/observability,Observability overview,Observability in Generative AI - Microsoft Foundry,,"Learn how Microsoft Foundry enables safe, high-quality generative AI through systematic evaluation and observability tools.","The AI application lifecycle requires robust evaluation frameworks to ensure AI systems deliver accurate, relevant, and reliable outputs. Without rigorous assessment, AI systems risk generating responses that are inaccurate, inconsistent, poorly grounded, or potentially harmful. 
Observability enables teams to measure and improve both the quality and safety of AI outputs throughout the development lifecycle—from model selection through production monitoring.",2026-04-03T08:00:00.000Z,concept-article,,0.2,False,"Conceptual overview of observability and evaluation in generative AI; no concrete limits, configs, or product-specific parameters.",updated +https://learn.microsoft.com/en-us/azure/foundry/concepts/planning,Plan rollout,Microsoft Foundry Rollout Across My Organization - Microsoft Foundry,Plan Microsoft Foundry rollout and environment strategy,"Learn how to plan the rollout of Microsoft Foundry across your organization, including environment setup, data isolation, and governance.","A structured rollout plan helps you avoid security gaps, cost overruns, and access sprawl when adopting Microsoft Foundry at scale. This guide outlines key decisions for rolling out Foundry, including environment setup, data isolation, integration with other Azure services, capacity management, and monitoring. Use this guide as a starting point and adapt it to your needs. For implementation details, see the linked articles.",2026-04-16T16:35:00.000Z,concept-article,decision-making,0.65,True,"The page provides product-specific rollout planning guidance for Microsoft Foundry, including key decisions around environment setup, data isolation, integration with other Azure services, capacity management, and monitoring. 
This is decision-making content that helps choose approaches and configurations for organizational adoption, beyond generic conceptual guidance.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/concepts/rbac-foundry,Role-based access control,Role-based access control for Microsoft Foundry - Microsoft Foundry,Configure RBAC roles and scopes for Microsoft Foundry,This article introduces role-based access control in Microsoft Foundry portal.,"In this article, you learn core role-based access control (RBAC) concepts for Microsoft Foundry, including scopes, built-in roles, and common enterprise assignment patterns. Tip RBAC roles apply when you authenticate using Microsoft Entra ID. If you use key-based authentication instead, the key grants full access without role restrictions. Microsoft recommends using Entra ID authentication for improved security and granular access control. For more information about authentication and authorizat",2026-04-13T08:00:00.000Z,concept-article,security,0.7,True,"RBAC article for Foundry is likely to list specific built-in role names, scopes, and assignment patterns unique to the product, which matches the security sub-skill criteria for expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/foundry/concepts/retrieval-augmented-generation,Retrieval Augmented Generation (RAG) overview,Retrieval augmented generation (RAG) and indexes in Microsoft Foundry - Microsoft Foundry,Apply RAG and indexing patterns in Foundry,Learn how retrieval augmented generation (RAG) uses indexes and grounding data to improve response accuracy in generative AI apps.,"Retrieval augmented generation (RAG) is a pattern that combines search with large language models (LLMs) so responses are grounded in your data. This article explains how RAG works in Microsoft Foundry, what role indexes play, and how agentic retrieval changes classic RAG patterns. LLMs are trained on public data available at training time. 
If you need answers based on your private data, or on frequently changing information, RAG helps you:",2026-02-27T23:08:00.000Z,concept-article,architecture-patterns,0.65,True,"Explains how RAG works specifically in Foundry, including agentic retrieval vs classic RAG; likely includes product-specific architectural patterns and when to use them.",unchanged https://learn.microsoft.com/en-us/azure/foundry/concepts/safety-evaluations-transparency-note,Transparency Note for safety evaluations,Microsoft Foundry risk and safety evaluations Transparency Note - Microsoft Foundry,,"Microsoft Foundry safety evaluations intended purpose, capabilities, limitations and how to achieve the best performance.",,2026-02-27T23:08:00.000Z,concept-article,,0.3,False,"Transparency note on risk and safety evaluations; typically conceptual (purpose, capabilities, limitations) without detailed configuration or numeric thresholds.",unchanged https://learn.microsoft.com/en-us/azure/foundry/configuration/enable-ai-api-management-gateway-portal,Configure an AI gateway,Configure AI Gateway in your Foundry resources - Microsoft Foundry,Configure AI Gateway token controls in Foundry,Enable AI Gateway with Azure API Management to apply tokens-per-minute limits and token quotas to model deployments in Microsoft Foundry.,"This article shows you how to enable AI Gateway for a Microsoft Foundry resource using the Foundry portal. AI Gateway uses Azure API Management behind the scenes to provide token limits, quotas, and governance for model deployments.",2026-04-10T22:08:00.000Z,how-to,configuration,0.7,True,"The article describes how to enable AI Gateway using Azure API Management for Foundry resources, including configuring tokens-per-minute limits and token quotas on model deployments. This involves product-specific configuration steps and settings (how to turn on the gateway and apply limits) rather than just conceptual discussion. 
While it references limits/quotas, it is primarily about how to configure the gateway and its parameters, not a catalog of numeric limits, so it best fits the configuration sub-skill.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/control-plane/govern-agent-infrastructure-entra-admin,Govern agent infrastructure as a Microsoft Entra administrator,Govern agent infrastructure as a Microsoft Entra administrator - Microsoft Foundry,Administer Foundry agent infrastructure with Entra roles,"Learn how to elevate access, assign the right roles, and take infrastructure-level actions on Foundry agents as a Microsoft Entra administrator.","As a Microsoft Entra administrator, you might need to take action on Microsoft Foundry agents running in your tenant. Before you do, it's important to understand that the actions available to you in Foundry are infrastructure actions, not just runtime governance. When you stop or delete an agent, you're operating on Azure resources that might serve multiple tenants or teams. This article helps you get the access you need, understand how admin center actions map to Azure resource operations, and m",2026-04-17T22:08:00.000Z,how-to,security,0.7,True,"Page is about elevating access, assigning roles, and mapping admin center actions to Azure resource operations. This implies specific RBAC roles/permissions and infrastructure-level security actions unique to Foundry agents, which fits the security sub-skill definition.",new -https://learn.microsoft.com/en-us/azure/foundry/control-plane/how-to-enforce-limits-models,Enforce token limits,Enforce Token Limits for Models - Microsoft Foundry,Configure token rate limits and quotas for Foundry models,"Use Foundry Control Plane integration with an AI gateway to apply limits for model inference, including token limits.",Microsoft Foundry Control Plane enforces tokens-per-minute (TPM) rate limits and total token quotas for model deployments at the project scope. 
This enforcement prevents runaway token consumption and aligns usage with organizational guardrails. Foundry Control Plane integrates with AI gateways to provide advanced policy enforcement for models. This article explains how to configure token rate limiting and token quotas.,2026-04-13T22:11:00.000Z,how-to,limits-quotas,0.85,True,"Article is explicitly about enforcing tokens-per-minute rate limits and total token quotas for model deployments. This strongly suggests specific numeric TPM and quota settings and configuration details, matching the limits-quotas sub-skill criteria.",updated -https://learn.microsoft.com/en-us/azure/foundry/control-plane/how-to-manage-agents,Manage agents at scale,Manage agents at scale in Microsoft Foundry Control Plane - Microsoft Foundry,,"Learn how to view your agent inventory, monitor agent health, and perform lifecycle operations by using Microsoft Foundry Control Plane.","Microsoft Foundry Control Plane provides centralized management and observability for agents that run across supported platforms and infrastructures. With Foundry Control Plane, you can manage agents that are distributed across multiple projects within a subscription. This article explains how to view your agent inventory, monitor agent health, and perform lifecycle operations by using the Foundry portal.",2026-04-13T08:00:00.000Z,how-to,,0.3,False,"Summary indicates a how-to for viewing inventory and lifecycle operations in the portal, but doesn't clearly reference configuration tables, limits, or error-code-based troubleshooting. 
Likely a procedural management guide rather than detailed expert reference content.",updated +https://learn.microsoft.com/en-us/azure/foundry/control-plane/govern-agent-infrastructure-entra-admin,Govern agent infrastructure as a Microsoft Entra administrator,Govern agent infrastructure as a Microsoft Entra administrator - Microsoft Foundry,Administer Foundry agent infrastructure with Entra roles,"Learn how to elevate access, assign the right roles, and take infrastructure-level actions on Foundry agents as a Microsoft Entra administrator.","As a Microsoft Entra administrator, you might need to take action on Microsoft Foundry agents running in your tenant. Before you do, it's important to understand that the actions available to you in Foundry are infrastructure actions, not just runtime governance. When you stop or delete an agent, you're operating on Azure resources that might serve multiple tenants or teams. This article helps you get the access you need, understand how admin center actions map to Azure resource operations, and m",2026-04-17T22:08:00.000Z,how-to,security,0.7,True,"Page is about elevating access, assigning roles, and mapping admin center actions to Azure resource operations. This implies specific RBAC roles/permissions and infrastructure-level security actions unique to Foundry agents, which fits the security sub-skill definition.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/control-plane/how-to-enforce-limits-models,Enforce token limits,Enforce Token Limits for Models - Microsoft Foundry,Configure token rate limits and quotas for Foundry models,"Use Foundry Control Plane integration with an AI gateway to apply limits for model inference, including token limits.",Microsoft Foundry Control Plane enforces tokens-per-minute (TPM) rate limits and total token quotas for model deployments at the project scope. This enforcement prevents runaway token consumption and aligns usage with organizational guardrails. 
Foundry Control Plane integrates with AI gateways to provide advanced policy enforcement for models. This article explains how to configure token rate limiting and token quotas.,2026-04-13T22:11:00.000Z,how-to,limits-quotas,0.85,True,"Article is explicitly about enforcing tokens-per-minute rate limits and total token quotas for model deployments. This strongly suggests specific numeric TPM and quota settings and configuration details, matching the limits-quotas sub-skill criteria.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/control-plane/how-to-manage-agents,Manage agents at scale,Manage agents at scale in Microsoft Foundry Control Plane - Microsoft Foundry,,"Learn how to view your agent inventory, monitor agent health, and perform lifecycle operations by using Microsoft Foundry Control Plane.","Microsoft Foundry Control Plane provides centralized management and observability for agents that run across supported platforms and infrastructures. With Foundry Control Plane, you can manage agents that are distributed across multiple projects within a subscription. This article explains how to view your agent inventory, monitor agent health, and perform lifecycle operations by using the Foundry portal.",2026-04-13T08:00:00.000Z,how-to,,0.3,False,"Summary indicates a how-to for viewing inventory and lifecycle operations in the portal, but doesn't clearly reference configuration tables, limits, or error-code-based troubleshooting. 
Likely a procedural management guide rather than detailed expert reference content.",unchanged https://learn.microsoft.com/en-us/azure/foundry/control-plane/how-to-manage-compliance-security,Manage compliance and security,Manage Compliance and Security in Microsoft Foundry - Microsoft Foundry,Manage Foundry compliance and security integrations,"Discover how to manage compliance and secure your Microsoft Foundry assets by using guardrail policies, Microsoft Defender for Cloud, and Microsoft Purview DSPM.","Learn how Microsoft Foundry Control Plane helps you manage compliance, enforce guardrail controls, and integrate security tooling such as Microsoft Defender for Cloud across subscriptions. Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental",2026-03-03T23:21:00.000Z,how-to,security,0.8,True,"Describes using guardrail policies, Defender for Cloud, and Purview DSPM with Foundry; will include specific security settings, integration steps, and scopes, which are product-specific security configurations.",unchanged https://learn.microsoft.com/en-us/azure/foundry/control-plane/how-to-optimize-cost-performance,Optimize cost and performance,Optimize model cost and performance - Microsoft Foundry,Optimize Foundry model cost and performance,"Use Ask AI in Microsoft Foundry to detect cost spikes, switch to cost-efficient models, evaluate quality, and track performance improvements.","When your model or agent costs start increasing, use Ask AI (preview) to quickly diagnose issues, take action, and verify improvements. Ask AI is a built-in chat assistant that you can access from the toolbar in the Microsoft Foundry portal. For more information about Ask AI capabilities and limitations, see Ask AI for help (preview). 
In this article, you identify cost spikes, switch to a cost-efficient model, and validate performance improvements by using the Foundry portal. Note When you ask Ask ",2026-03-25T17:14:00.000Z,how-to,decision-making,0.65,True,"Cost/performance optimization using Ask AI; likely includes concrete guidance on when to switch models, how to interpret cost spikes, and scenario-based recommendations, fitting decision-making around model/tier selection.",unchanged https://learn.microsoft.com/en-us/azure/foundry/control-plane/monitoring-across-fleet,Monitor fleet health and performance,Monitor AI agent fleet health and performance - Microsoft Foundry,,"Track agent health, compliance, performance trends, and cost efficiency across your AI fleet by using Microsoft Foundry Control Plane monitoring.","As your organization scales from isolated copilots to autonomous multi-agent fleets, maintaining visibility and control becomes critical. Microsoft Foundry Control Plane provides a unified command center where you can monitor all agents, models, and tools across your enterprise from build to production. Fleet monitoring serves multiple roles: This article shows you how to use Foundry Control Plane capabilities to track agent health, performance, compliance, and cost efficiency at scale. By using",2026-02-27T23:08:00.000Z,how-to,,0.4,False,"Monitoring overview/how-to; summary suggests portal usage patterns but not specific metrics tables, thresholds, or product-unique diagnostic commands.",unchanged @@ -100,11 +108,12 @@ https://learn.microsoft.com/en-us/azure/foundry/foundry-models/concepts/model-ve https://learn.microsoft.com/en-us/azure/foundry/foundry-models/concepts/model-versions-gov,Model versions (government),Model versioning in Microsoft Foundry Models in Azure Government - Microsoft Foundry,Decide on model versioning and upgrade policies in Azure Government,"Learn how model versions work in Microsoft Foundry Models in Azure Government. 
Understand update policies, deployment options, and how to manage version upgrades effectively.","Microsoft Foundry Models regularly release new model versions that incorporate the latest features and improvements from model providers. This article explains how model versioning works, what update policies are available for your deployments, and how Azure OpenAI and partner model versions are managed. After reading this article, you'll know which upgrade policy to choose when you deploy a model, how Azure manages version upgrades automatically, and how partner model versioning differs from Az",2026-04-08T22:11:00.000Z,concept-article,decision-making,0.7,True,"Explains model versioning behavior, update policies, and how Azure vs partner model versions are managed. This guides users in choosing upgrade policies and deployment strategies with concrete versioning behaviors and options, aligning with decision-making for version/upgrade choices.",unchanged https://learn.microsoft.com/en-us/azure/foundry/foundry-models/concepts/models-from-partners,Foundry Models sold from partners and community,Foundry Models from partners and community - Microsoft Foundry,Capabilities and availability of partner Foundry models,"Learn about Microsoft Foundry Models from partners and community, including their capabilities, supported input and output types, Azure Marketplace requirements, and troubleshooting.","Microsoft Foundry Models in the model catalog comprise two main categories, namelyFoundry Models sold directly by AzureandFoundry Models from partners and community. This article lists a selection of Foundry Models from partners and community, along with their capabilities, deployment types, and regions of availability, excluding deprecated and retired models. -Most Foundry Model providers are trusted third-party organizations, partners, research labs, and community contributors. 
Important Models",2026-04-18T06:07:00.000Z,partner-tools,limits-quotas,0.7,True,"Catalog of partner/community models with capabilities, supported input/output types, deployment types, and regions; these per-model details act as specific constraints and capabilities (effectively limits) that are product- and SKU-specific expert knowledge.",updated +Most Foundry Model providers are trusted third-party organizations, partners, research labs, and community contributors. Important Models",2026-04-18T06:07:00.000Z,partner-tools,limits-quotas,0.7,True,"Catalog of partner/community models with capabilities, supported input/output types, deployment types, and regions; these per-model details act as specific constraints and capabilities (effectively limits) that are product- and SKU-specific expert knowledge.",unchanged https://learn.microsoft.com/en-us/azure/foundry/foundry-models/concepts/models-sold-directly-by-azure,Foundry Models sold directly by Azure,Foundry Models sold directly by Azure - Microsoft Foundry,Model catalog details for Azure-sold Foundry models,"Learn about Microsoft Foundry Models sold directly by Azure, their capabilities, deployment types, and regional availability for AI applications.","Microsoft Foundry Models in the model catalog comprise two main categories, namelyFoundry Models sold directly by AzureandFoundry Models from partners and community. -This article lists a selection of Foundry Models sold directly by Azure, along with their capabilities,deployment types, and regions of availability, excluding deprecated and retired models. Foundry Models sold directly by Azure are also referred to asDirect from Azure ModelsorAzure Direct Models. 
Models sold directly by Azure inclu",2026-04-17T08:00:00.000Z,product-comparison,limits-quotas,0.7,True,"Lists specific models with deployment types and regional availability; such catalog pages typically include per-model capabilities and constraints (for example, supported modalities, context sizes, or deployment types) that function as concrete limits/quotas and are not generally known from training.",updated +This article lists a selection of Foundry Models sold directly by Azure, along with their capabilities, deployment types, and regions of availability, excluding deprecated and retired models. Foundry Models sold directly by Azure are also referred to as Direct from Azure Models or Azure Direct Models. Models sold directly by Azure inclu",2026-04-17T08:00:00.000Z,product-comparison,limits-quotas,0.7,True,"Lists specific models with deployment types and regional availability; such catalog pages typically include per-model capabilities and constraints (for example, supported modalities, context sizes, or deployment types) that function as concrete limits/quotas and are not generally known from training.",unchanged https://learn.microsoft.com/en-us/azure/foundry/foundry-models/concepts/models-sold-directly-by-azure-gov,Foundry Models sold directly by Azure (government),Foundry Models sold directly by Azure in Azure Government - Microsoft Foundry,Select Foundry models sold by Azure in Government,"Learn about Microsoft Foundry Models sold directly by Azure, their capabilities, deployment types, and regional availability for AI applications.","This article lists a selection of Microsoft Foundry Models sold directly by Azure in Azure Government along with their capabilities and deployment types, and regions of availability. Models sold directly by Azure include all Azure OpenAI models offered in Azure Government. These models are billed through your Azure subscription, covered by Azure service-level agreements, and supported by Microsoft. 
To learn more about attributes of all Foundry Models sold directly by Azure across all clouds, seeM",2026-04-08T22:11:00.000Z,product-comparison,decision-making,0.65,True,"Lists Foundry models sold directly by Azure in Azure Government with capabilities, deployment types, and regions. This is used to choose which model and deployment to use in a given region, providing concrete comparison attributes (capabilities, availability) that support technology selection decisions.",unchanged https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/configure-claude-code,Claude Code,Configure Claude Code for Microsoft Foundry - Microsoft Foundry,Configure Claude Code CLI and VS Code with Foundry,"Set up Claude Code CLI and VS Code extension to use Claude models in Microsoft Foundry with enterprise security, authentication, and CI/CD integration.","Anthropic's Claude Code is an agentic coding tool that reads your codebase, edits files, runs commands, and integrates with your development tools. It's available as a CLI tool and VS Code extension. When you configure Claude Code with Microsoft Foundry, you run the coding agent on Azure infrastructure while keeping your data inside your compliance boundary. This configuration provides enterprise-grade security, private networking, role-based access control, and cost management. 
In this article, y",2026-02-27T23:08:00.000Z,how-to,integrations,0.8,True,"Explains how to point Claude Code tools at Foundry; includes auth configuration, endpoints, and CI/CD integration parameters—product-specific integration patterns.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/configure-claude-desktop,Claude Desktop,Configure Claude Desktop for Microsoft Foundry - Microsoft Foundry,Configure Claude Desktop to use Microsoft Foundry,Configure Claude Desktop to use Microsoft Foundry as its inference provider for enterprise deployments.,"Anthropic's Claude Desktop runs the Claude Cowork and Claude Code clients. When you configure Claude Desktop to use Microsoft Foundry as the inference provider, all Claude model requests route through your own Foundry resource. Billing stays on your Azure account, conversations remain on user devices, and you can deploy and manage the app through your existing enterprise Mobile Device Management (MDM) tools such as Microsoft Intune, Group Policy, or Jamf. This article shows you how to set up Cla",2026-04-23T06:10:00.000Z,how-to,configuration,0.74,True,"A setup guide for configuring Claude Desktop to use Foundry as an inference provider for enterprise deployments will typically include concrete configuration fields (endpoints, keys, environment variables, or MDM policy settings) and possibly example config snippets. These are product-specific configuration parameters rather than generic guidance, matching the configuration sub-skill.",new https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/configure-entra-id,Configure keyless authentication,Configure keyless authentication with Microsoft Entra ID - Microsoft Foundry,Configure keyless Entra ID authentication for Foundry Models,"Learn how to configure keyless authentication with Microsoft Entra ID for Microsoft Foundry Models. 
Eliminate the need for API keys, enhance security with RBAC, and simplify compliance.","This article explains how to configure keyless authentication with Microsoft Entra ID for Microsoft Foundry Models. Keyless authentication enhances security by eliminating the need for API keys, simplifies the user experience with role-based access control (RBAC), and reduces operational complexity while providing robust compliance support.",2026-02-27T23:08:00.000Z,how-to,security,0.82,True,"How-to for configuring keyless auth with Entra ID for Foundry Models. Likely includes app registration settings, audience/resource IDs, scopes, and RBAC role assignments specific to this product, which are concrete security configuration details.",unchanged https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/create-model-deployments,Deploy Foundry Models using code,Deploy models using Azure CLI and Bicep - Microsoft Foundry,Deploy Foundry Models using Azure CLI and Bicep templates,Learn how to add and configure Microsoft Foundry Models in your Foundry resource for use in inference applications using Azure CLI and Bicep templates.,"Important Azure AI Inference beta SDK is deprecated and will be retired on May 30, 2026. Switch to the generally available OpenAI/v1 API with a stable OpenAI SDK. Follow the migration guide to switch to OpenAI/v1, using the SDK for your preferred programming language. In this article, you learn how to add a new model deployment to a Foundry Models endpoint. 
The deployment is available for inference in your Foundry resource when you specify the deployment name in your requests.",2026-02-27T23:08:00.000Z,how-to,deployment,0.75,True,"Shows CLI commands and Bicep schema for deployments; includes parameter names, allowed values, and resource types—product-specific deployment configuration patterns.",unchanged https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/deploy-foundry-models,Deploy Foundry Models in the portal,Deploy Microsoft Foundry Models in the Foundry portal - Microsoft Foundry,Deploy Foundry Models via Foundry portal for inference,Learn how to deploy Microsoft Foundry Models in the Foundry portal for AI inference applications and integration into your projects.,"In this article, you learn how to use the Foundry portal to deploy a Foundry Model in a Foundry resource for inference. Foundry Models include models such as Azure OpenAI models, Meta Llama models, and more. After you deploy a Foundry Model, you can interact with it in the Foundry Playground and use it from code. This article uses a Foundry Model from partners and community Llama-3.2-90B-Vision-Instruct for illustration. 
Models from partners and community require that you subscribe to Azure Market",2026-02-27T23:08:00.000Z,how-to,deployment,0.65,True,"Portal-based deployment how-to; likely includes deployment-type choices, region/model constraints, and possibly SKU-specific availability—deployment-specific expert details.",unchanged @@ -112,9 +121,9 @@ https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/generate-r https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/model-choice-guide,GPT-5 vs GPT-4.1,GPT-5 vs GPT-4.1 - choosing the right model for your use case - Microsoft Foundry,Decide between GPT-5 and GPT-4.1 for your use case,"Compare GPT-5 and GPT-4.1 to choose the best Azure OpenAI model for your use case, covering reasoning depth, latency, cost, and ideal scenarios for each","GPT-5 is the first model from OpenAI that introduces four adjustable levels of thinking, controlling the amount of time and tokens the model uses when responding to a prompt. When selecting which model to use, or whether to use a reasoning model at all, it is important to consider your application’s priorities. Scenarios like researching and producing a report involve the collection, processing, and generation of large amounts of data. Customers in these scenarios are typically willing to wait m",2026-02-27T23:08:00.000Z,product-comparison,decision-making,0.9,True,"Direct comparison of GPT-5 vs GPT-4.1 with reasoning depth, latency, cost, and scenario guidance; clearly supports model selection with quantified trade-offs.",unchanged https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/monitor-models,Monitor model deployments,Monitor Model Deployments in Microsoft Foundry Models - Microsoft Foundry,Configure Azure Monitor for Foundry model deployments,Learn how to use Azure Monitor tools like Log Analytics to capture and analyze metrics and data logs for Foundry Models.,"Important Items marked (preview) in this article are currently in public preview. 
This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. This article explains how to use Azure Monitor metrics and logs to track availability, performance, and usage for model deployments in Foundry Mo",2026-02-27T23:08:00.000Z,how-to,configuration,0.6,True,"Explains how to wire Foundry Models into Azure Monitor and Log Analytics; likely includes specific metric names, log categories, and configuration settings.",unchanged https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/quickstart-github-models,Upgrade GitHub Models to Foundry Models,Upgrade from GitHub Models to Microsoft Foundry Models - Microsoft Foundry,Upgrade workloads from GitHub Models to Foundry Models,Learn how to upgrade from GitHub Models to Microsoft Foundry Models for production-ready AI applications with enhanced features.,"In this article, you learn to develop a generative AI application by starting from GitHub Models and then upgrade your experience by deploying a Foundry Tools resource with Microsoft Foundry Models. GitHub Models are useful when you want to find and experiment with AI models for free as you develop a generative AI application. 
When you're ready to bring your application to production, upgrade your experience by deploying a Foundry Tools resource in an Azure subscription and start using Foundry Mo",2026-02-27T23:08:00.000Z,how-to,decision-making,0.7,True,Guides when and how to move from GitHub Models to Foundry for production; likely includes feature/capability comparisons and migration steps—service selection and migration decision guidance.,unchanged -https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/use-foundry-models-claude,Claude models,Deploy and use Claude models in Microsoft Foundry - Microsoft Foundry,,"Deploy Anthropic's Claude models in Microsoft Foundry to integrate advanced conversational AI into your apps. Learn how to use Claude Opus, Sonnet, and Haiku models.","Anthropic's Claude models bring advanced conversational AI capabilities to Microsoft Foundry, enabling you to build intelligent applications with state-of-the-art language understanding and generation. Claude models excel at complex reasoning, code generation, and multimodal tasks including image analysis. In this article, you learn how to: Claude models in Foundry include: 1 Claude Mythos Preview is only available as a gated research preview. Access to the model is granted solely at Anthropic's di",2026-04-16T16:35:00.000Z,how-to,,0.3,False,"Describes how to deploy and use Claude models and lists available variants, but the summary does not show specific limits, configuration parameter tables, or error mappings. 
Likely a usage tutorial, not expert-level configuration or decision content.",new -https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/use-foundry-models-flux,FLUX models,Deploy and use FLUX models in Microsoft Foundry - Microsoft Foundry,,Deploy Black Forest Labs FLUX image generation models in Microsoft Foundry to generate and edit high-quality images from text prompts and reference images.,"Black Forest Labs (BFL) FLUX models bring state-of-the-art image generation to Microsoft Foundry, enabling you to generate and edit high-quality images from text prompts and reference images. In this article, you learn how to: FLUX models are optimized for photorealism, prompt fidelity, and compositional control, making them well suited for creative, e-commerce, media, and design-centric applications. They support a range of capabilities including text-to-image generation, multi-reference image ",2026-03-24T06:06:00.000Z,how-to,,0.3,False,"Covers deploying and using FLUX image models with capabilities overview. The description suggests a feature and usage guide, not detailed limits, configuration matrices, or troubleshooting content required for expert classification.",new -https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/use-foundry-models-mai,Microsoft AI (MAI) models,Deploy and use MAI models in Microsoft Foundry - Microsoft Foundry,,"Deploy MAI-Image-2 and MAI-Image-2e, text-to-image generation models in Microsoft Foundry, to generate high-quality images from natural language prompts.","MAI-Image-2 and MAI-Image-2e are text-to-image generation models that create high-quality, visually rich images from natural language prompts. In this article, you learn how to:",2026-04-14T12:58:00.000Z,how-to,,0.3,False,"Appears to be a how-to deployment/usage guide for MAI image models. 
The summary does not indicate detailed configuration tables, limits, or troubleshooting mappings; likely a standard tutorial rather than expert reference content.",new +https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/use-foundry-models-claude,Claude models,Deploy and use Claude models in Microsoft Foundry - Microsoft Foundry,,"Deploy Anthropic's Claude models in Microsoft Foundry to integrate advanced conversational AI into your apps. Learn how to use Claude Opus, Sonnet, and Haiku models.","Anthropic's Claude models bring advanced conversational AI capabilities to Microsoft Foundry, enabling you to build intelligent applications with state-of-the-art language understanding and generation. Claude models excel at complex reasoning, code generation, and multimodal tasks including image analysis. In this article, you learn how to: Claude models in Foundry include: 1 Claude Mythos Preview is only available as a gated research preview. Access to the model is granted solely at Anthropic's di",2026-04-16T16:35:00.000Z,how-to,,0.3,False,"Describes how to deploy and use Claude models and lists available variants, but the summary does not show specific limits, configuration parameter tables, or error mappings. Likely a usage tutorial, not expert-level configuration or decision content.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/use-foundry-models-flux,FLUX models,Deploy and use FLUX models in Microsoft Foundry - Microsoft Foundry,,Deploy Black Forest Labs FLUX image generation models in Microsoft Foundry to generate and edit high-quality images from text prompts and reference images.,"Black Forest Labs (BFL) FLUX models bring state-of-the-art image generation to Microsoft Foundry, enabling you to generate and edit high-quality images from text prompts and reference images. 
In this article, you learn how to: FLUX models are optimized for photorealism, prompt fidelity, and compositional control, making them well suited for creative, e-commerce, media, and design-centric applications. They support a range of capabilities including text-to-image generation, multi-reference image ",2026-03-24T06:06:00.000Z,how-to,,0.3,False,"Covers deploying and using FLUX image models with capabilities overview. The description suggests a feature and usage guide, not detailed limits, configuration matrices, or troubleshooting content required for expert classification.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/use-foundry-models-mai,Microsoft AI (MAI) models,Deploy and use MAI models in Microsoft Foundry - Microsoft Foundry,,"Deploy MAI-Image-2 and MAI-Image-2e, text-to-image generation models in Microsoft Foundry, to generate high-quality images from natural language prompts.","MAI-Image-2 and MAI-Image-2e are text-to-image generation models that create high-quality, visually rich images from natural language prompts. In this article, you learn how to:",2026-04-14T12:58:00.000Z,how-to,,0.3,False,"Appears to be a how-to deployment/usage guide for MAI image models. The summary does not indicate detailed configuration tables, limits, or troubleshooting mappings; likely a standard tutorial rather than expert reference content.",unchanged https://learn.microsoft.com/en-us/azure/foundry/foundry-models/quotas-limits,Foundry Model quotas and limits,Microsoft Foundry Models quotas and limits - Microsoft Foundry,Reference quotas and limits for Foundry Models,"Learn about quotas, rate limits, and best practices for Foundry Models, including per-model token and request limits, client timeouts, and how to request increases.","This article provides a quick reference and detailed description of the quotas and limits for Foundry Models sold directly by Azure. 
For quotas and limits specific to the Azure OpenAI in Foundry Models, see Quotas and limits in Azure OpenAI.",2026-02-27T23:08:00.000Z,concept-article,limits-quotas,0.95,True,"Explicit quotas/limits article; almost certainly contains tables of per-model token/request limits, rate limits, and timeouts with exact numeric values and units.",unchanged https://learn.microsoft.com/en-us/azure/foundry/foundry-models/tutorials/get-started-deepseek-r1,Get started with DeepSeek-R1 reasoning model,Tutorial: Get started with DeepSeek-R1 in Foundry Models - Microsoft Foundry,Deploy and call DeepSeek-R1 in Foundry Models,"Learn how to deploy and use DeepSeek-R1 reasoning model in Microsoft Foundry Models. Get step-by-step guidance, code examples, and best practices for AI reasoning.","In this tutorial, you learn how to deploy and use a DeepSeek reasoning model in Microsoft Foundry. This tutorial uses DeepSeek-R1 for illustration. However, the content also applies to the newer DeepSeek-R1-0528 reasoning model. What you accomplish: In this tutorial, you deploy the DeepSeek-R1 reasoning model, send inference requests programmatically using code, and parse the reasoning output to understand how the model arrives at its answers. The steps you perform in this tutorial are:",2026-02-27T23:08:00.000Z,tutorial,integrations,0.7,True,"Tutorial includes concrete deployment parameters and code samples for calling DeepSeek-R1 via Foundry endpoints, which are product-specific integration patterns.",unchanged https://learn.microsoft.com/en-us/azure/foundry/foundry-models/whats-new-model-router,What's new in Model Router,What's new in model router in Microsoft Foundry Models? 
- Microsoft Foundry,,Learn about the latest news and features updates for Azure model router.,"This article provides a summary of the latest releases and major documentation updates for Azure model router, including new supported models, routing features, and deployment options.",2026-03-24T22:15:00.000Z,whats-new,,0.1,False,"What's new / release notes style page; typically lists features and dates, not detailed limits, configs, or troubleshooting mappings.",unchanged @@ -131,14 +140,14 @@ https://learn.microsoft.com/en-us/azure/foundry/how-to/agent-service-platform-di https://learn.microsoft.com/en-us/azure/foundry/how-to/benchmark-model-in-catalog,Compare models,Compare models using the model leaderboard - Microsoft Foundry,Use Foundry model leaderboard to compare and choose models,"Compare model benchmarks across quality, safety, cost, and throughput using the model leaderboard and side-by-side comparison features in Foundry portal.","Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. 
This article shows you how to streamline model selection in the Foundry model catalog by using the model leaderboards (preview) and side-by-side ",2026-02-27T23:08:00.000Z,how-to,decision-making,0.75,True,"How-to for using leaderboards and side-by-side comparison; includes concrete comparison criteria (quality, safety, cost, throughput) for decision-making.",unchanged https://learn.microsoft.com/en-us/azure/foundry/how-to/bring-your-own-azure-storage-foundry,Connect to your own storage in Foundry,Connect to your own storage - Microsoft Foundry,Configure bring-your-own storage for Microsoft Foundry,"Learn how to bring your own storage to Microsoft Foundry for agents, evaluations, datasets, and other capabilities.","Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Microsoft Foundry brings Agents, Azure OpenAI, Speech, and Language services together under one unified resource type. Bring-your-own-storage (BY",2026-02-27T23:08:00.000Z,how-to,configuration,0.78,True,"Describes BYOS for agents, evaluations, datasets, etc. 
Likely includes storage account types, required container names, identity/permissions settings, and specific binding or configuration properties that are product-specific configuration details.",unchanged https://learn.microsoft.com/en-us/azure/foundry/how-to/bring-your-own-azure-storage-speech-language-services,Connect to your own storage for Speech/Language,Connect your own storage to Speech/Language (Preview) - Microsoft Foundry,Bind customer-managed storage to Foundry Speech and Language,Configure customer-managed storage for Speech and Language capabilities in a Microsoft Foundry resource at creation time.,"Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Configure bring-your-own-storage (BYOS) for Speech and Language capabilities in a Foundry resource by setting the userOwnedStorage binding at creat",2026-02-27T23:08:00.000Z,how-to,configuration,0.76,True,Explains configuring BYOS for Speech and Language via the userOwnedStorage binding at creation time. The presence of a specific binding name and creation-time configuration implies detailed parameter-level configuration knowledge.,unchanged -https://learn.microsoft.com/en-us/azure/foundry/how-to/configure-private-link,Configure private link,How to configure network isolation for Microsoft Foundry - Microsoft Foundry,Configure Private Link network isolation for Microsoft Foundry,Learn how to configure a network isolation end-to-end for Microsoft Foundry. A private link is used to secure communication with the Microsoft Foundry.,Use a private endpoint to secure communication. 
This article describes how to establish a private connection to your Foundry account and projects using a private endpoint.,2026-04-10T22:08:00.000Z,how-to,security,0.72,True,"End-to-end network isolation with Private Link for a specific service typically includes resource names, required subresources, DNS zone patterns, and possibly role/permission requirements. These are concrete, product-specific security/network configuration details that fit the security sub-skill.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/how-to/configure-private-link,Configure private link,How to configure network isolation for Microsoft Foundry - Microsoft Foundry,Configure private link network isolation for Foundry,Learn how to configure a network isolation end-to-end for Microsoft Foundry. A private link is used to secure communication with the Microsoft Foundry.,Use a private endpoint to secure communication. This article describes how to establish a private connection to your Foundry account and projects using a private endpoint.,2026-04-22T06:08:00.000Z,how-to,security,0.7,True,Explains end-to-end network isolation for Foundry using private endpoints. This is a security-focused configuration topic with product-specific steps and settings for securing communication to Foundry accounts and projects.,updated https://learn.microsoft.com/en-us/azure/foundry/how-to/connections-add,Create a connection,Add a new connection to your project - Microsoft Foundry,Configure and manage Microsoft Foundry connections,Learn how to add a new connection to your Foundry project.,"Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. 
In this article, you learn how to add a new connection in Microsoft Foundry portal. Connections are a way to authenticate and consume both Microso",2026-03-25T22:12:00.000Z,how-to,security,0.65,True,"Connections are explicitly about authenticating to Microsoft and third-party services. Such pages typically enumerate connection types, required scopes, identity options, and possibly RBAC or permission requirements. That constitutes product-specific security/auth configuration details beyond generic concepts, fitting the security sub-skill.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/how-to/create-projects,Create and manage projects,Create a project - Microsoft Foundry,,This article describes how to create a Microsoft Foundry project so you can work with generative AI in the cloud.,"Use this article to create a Foundry project and confirm that your environment is ready before you start building agents, evaluations, and files. This article describes how to create a Foundry project in Microsoft Foundry. Projects let you organize your work—such as agents, evaluations, and files—as you build stateful apps and explore new ideas. If your organization requires customized Azure configurations like alternative names, security controls, or cost tags, you might need to use the Azure por",2026-04-08T08:00:00.000Z,how-to,,0.2,False,"The page is about creating a Foundry project via the portal and confirming the environment. 
From the summary it appears to be a procedural UI walkthrough/overview without detailed configuration tables, limits, or product-specific parameters; it doesn’t match any expert-knowledge sub-skill criteria.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/how-to/create-resource-template,Create resources using Bicep template,Quickstart: Deploy a Foundry resource by using Bicep - Microsoft Foundry,,Learn how to use a Bicep file (template) to create a Microsoft Foundry resource in your Azure subscription.,"In this quickstart, you deploy a Microsoft Foundry resource and project by using a Microsoft Bicep template. Bicep helps you create related resources in one coordinated deployment and reuse the same configuration across environments. If you already configured a Foundry resource in the Azure portal, you can export that configuration as a Bicep file instead of authoring a template from scratch. Tip For production-ready Bicep templates that cover common Foundry deployment scenarios, see the infrastructure",2026-04-15T08:00:00.000Z,quickstart,,0.3,False,"Quickstart showing how to deploy a Foundry resource with a Bicep template; appears to be a step-by-step tutorial without detailed configuration parameter tables, limits, or product-specific deployment constraints.",updated +https://learn.microsoft.com/en-us/azure/foundry/how-to/create-projects,Create and manage projects,Create a project - Microsoft Foundry,,This article describes how to create a Microsoft Foundry project so you can work with generative AI in the cloud.,"Use this article to create a Foundry project and confirm that your environment is ready before you start building agents, evaluations, and files. This article describes how to create a Foundry project in Microsoft Foundry. Projects let you organize your work—such as agents, evaluations, and files—as you build stateful apps and explore new ideas. 
If your organization requires customized Azure configurations like alternative names, security controls, or cost tags, you might need to use the Azure por",2026-04-20T22:11:00.000Z,how-to,,0.3,False,"Primarily a getting-started/how-to create a Foundry project and verify environment readiness. No indication of detailed limits, configuration parameter tables, security roles, or decision matrices; more of a procedural setup guide.",updated +https://learn.microsoft.com/en-us/azure/foundry/how-to/create-resource-template,Create resources using Bicep template,Quickstart: Deploy a Foundry resource by using Bicep - Microsoft Foundry,,Learn how to use a Bicep file (template) to create a Microsoft Foundry resource in your Azure subscription.,"In this quickstart, you deploy a Microsoft Foundry resource and project by using a Microsoft Bicep template. Bicep helps you create related resources in one coordinated deployment and reuse the same configuration across environments. If you already configured a Foundry resource in the Azure portal, you can export that configuration as a Bicep file instead of authoring a template from scratch. Tip For production-ready Bicep templates that cover common Foundry deployment scenarios, see the infrastructure",2026-04-15T08:00:00.000Z,quickstart,,0.3,False,"Quickstart showing how to deploy a Foundry resource with a Bicep template; appears to be a step-by-step tutorial without detailed configuration parameter tables, limits, or product-specific deployment constraints.",unchanged https://learn.microsoft.com/en-us/azure/foundry/how-to/create-resource-terraform,Manage resources using Terraform,Use Terraform to create Microsoft Foundry - Microsoft Foundry,Provision Microsoft Foundry resources using Terraform,"In this article, you create a Microsoft Foundry resource, a Microsoft Foundry project, using Terraform infrastructure as code templates.","Use Terraform to automate the creation of Microsoft Foundry resources, projects, deployments, and connections. 
If you already configured a Foundry resource in the Azure portal, you can export that configuration as Terraform code instead of authoring a configuration from scratch. You can use either the Terraform AzAPI Provider or AzureRM Provider to manage Foundry resources. The AzAPI provider lets you access all Foundry control plane configurations including preview features. The AzureRM variant is limi",2026-04-08T08:00:00.000Z,how-to,configuration,0.68,True,"Terraform-based resource creation articles typically include provider-specific resource types, property names, and configuration blocks (often with required/optional fields and example values) that are unique to the service and not reliably known from pretraining. This goes beyond a simple tutorial and represents concrete configuration knowledge for Foundry via AzAPI/AzureRM.",unchanged https://learn.microsoft.com/en-us/azure/foundry/how-to/custom-policy-definition,Create custom policy definitions,Create a custom Azure Policy for Foundry - Microsoft Foundry,Create custom Azure Policies to govern Microsoft Foundry,"Learn how to use custom Azure policies to enable self-service resource management in your organization, while applying guardrails and constraints on allowed configurations to meet security and complia","Learn how to use custom Azure policies to enable teams to self-manage Microsoft Foundry resources. Apply guardrails and constraints on allowed configurations so you can provide flexibility while meeting security and compliance requirements. By using custom policies, you can:",2026-02-27T23:08:00.000Z,how-to,security,0.7,True,"Explains crafting custom policies to constrain Foundry configurations. 
Likely includes policy rule examples, resource type names, and configuration fields unique to Foundry, which are security/compliance configuration details.",unchanged https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/ai-template-get-started,Start with an AI template,How to get started with an AI template - Microsoft Foundry,,"Find, explore, and deploy AI solution templates from the Foundry portal to accelerate your development.","In this article, you find, explore, and deploy AI solution templates from the Foundry portal. AI solution templates are prebuilt, task-specific templates that include customizable code samples, preintegrated Azure services, and GitHub-hosted quick-start guides. Use templates to skip boilerplate setup and focus on building solutions for use cases like voice agents, release management, and data unification. Important Starter templates, manifests, code samples, and other resources made available by",2026-04-08T08:00:00.000Z,how-to,,0.2,False,"Describes how to find and deploy AI solution templates and their benefits. This is primarily workflow/marketing-style guidance without specific configuration parameters, limits, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/cloud-evaluation,Run evaluations from the SDK,Cloud Evaluation with the Microsoft Foundry SDK - Microsoft Foundry,Run cloud evaluations with Microsoft Foundry SDK,Run scalable evaluations for generative AI applications using the Microsoft Foundry SDK. Learn how to integrate evaluations into your development pipeline.,"Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews.
In this article, you learn how to run evaluations in the cloud (preview) for predeployment testing on a test dataset. Use cloud evaluations for m",2026-04-17T08:00:00.000Z,how-to,integrations,0.7,True,"How-to article for running evaluations in the cloud using the Foundry SDK is likely to include SDK-specific parameters, configuration values, and workflow details for cloud execution that go beyond generic LLM knowledge, fitting integrations & coding patterns.",updated +https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/cloud-evaluation,Run evaluations from the SDK,Cloud Evaluation with the Microsoft Foundry SDK - Microsoft Foundry,,Run scalable evaluations for generative AI applications using the Microsoft Foundry SDK. Learn how to integrate evaluations into your development pipeline.,"Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. In this article, you learn how to run evaluations in the cloud (preview) for predeployment testing on a test dataset. Use cloud evaluations for m",2026-04-23T22:15:00.000Z,how-to,,0.4,False,"How-to guide for running cloud evaluations; summary suggests workflow/tutorial content, not detailed limits, config matrices, or error-code troubleshooting.",updated https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/get-started-projects-vs-code,Work in VS Code,Work with the Microsoft Foundry for Visual Studio Code extension - Microsoft Foundry,,"Create projects, deploy models from the model catalog, and interact with model playgrounds using the Microsoft Foundry for Visual Studio Code extension.","In this article, learn how to install and use the Microsoft Foundry for Visual Studio Code extension.
Create projects, deploy models from the Foundry model catalog, and interact with model playgrounds from within VS Code.",2026-03-12T08:00:00.000Z,how-to,,0.45,False,VS Code extension usage for Foundry; likely step-by-step UI tutorial rather than detailed configuration or limits.,unchanged https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/install-cli-sdk,Set up your developer environment,Prepare your development environment - Microsoft Foundry,,"Set up your development environment with language runtimes, Azure CLI, and tools for Microsoft Foundry development.","Set up your development environment to use the Microsoft Foundry SDK. You also need Azure CLI for authentication so that your code can access your user credentials. In this article, you install language runtimes, Azure CLI, Azure Developer CLI, the Foundry VS Code extension, and Git. Important This article covers general prerequisites only, such as language runtimes, global tools, and VS Code and extension setup. It doesn't cover scenario-specific steps like SDK installation or authentication. When ",2026-04-08T08:00:00.000Z,how-to,,0.3,False,"Covers general prerequisites (language runtimes, Azure CLI, VS Code, Git) for Foundry development. Appears as a setup tutorial without detailed configuration option tables, limits, or troubleshooting content.",unchanged https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/langchain,Get started with LangChain and LangGraph in Foundry,Get started with LangChain and LangGraph with Foundry - Microsoft Foundry,Integrate LangChain and LangGraph with Foundry,Learn how to use langchain-azure-ai as an entry point for LangChain and LangGraph apps with Microsoft Foundry capabilities.,"Use the langchain-azure-ai package as the entry point for building LangChain and
The tracer emits spans for agent execution, model calls, tool execution, and retrieval operations. You can use it for apps that run fully local, hybrid flows that call Foundry Agent Service, or multi-agent LangGraph ",2026-03-06T23:10:00.000Z,how-to,integrations,0.8,True,"Documents AzureAIOpenTelemetryTracer, its configuration, and how to emit traces; includes specific parameter names and telemetry settings.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/run-ai-red-teaming-cloud,Run red teaming scans in the cloud,Run AI Red Teaming Agent in the cloud (Microsoft Foundry SDK) - Microsoft Foundry,Run AI Red Teaming Agent scans in cloud,This article provides instructions on how to use the AI Red Teaming Agent to run an automated scan in the cloud of a Generative AI application with the Microsoft Foundry SDK.,"Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews.
Though the AI Red Teaming Agent can be run locally during prototyping and development to help identify safety risks, running them in the cloud allo",2026-04-13T22:11:00.000Z,how-to,integrations,0.75,True,"Cloud red-teaming with the Foundry SDK will require product-specific SDK usage, configuration parameters, and possibly environment settings for cloud execution, which qualify as integration & coding patterns beyond generic advice.",updated -https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/run-scans-ai-red-teaming-agent,Run red teaming scans locally,Run AI Red Teaming Agent Locally (Azure AI Evaluation SDK) - Microsoft Foundry,Run AI Red Teaming Agent scans locally,Learn how to use the AI Red Teaming Agent to run a local automated scan of a Generative AI application with the Azure AI Evaluation SDK.,"Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. The AI Red Teaming Agent (preview) is a powerful tool designed to help organizations proactively find safety risks associated with generative AI ",2026-04-10T08:00:00.000Z,how-to,integrations,0.75,True,"Local red-teaming via Azure AI Evaluation SDK implies concrete SDK calls, parameters, and configuration patterns unique to this product, matching integrations & coding patterns criteria.",updated -https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/sdk-overview,Microsoft Foundry SDKs,Get started with Microsoft Foundry SDKs and Endpoints - Microsoft Foundry,,This article provides an overview of the Microsoft Foundry SDKs and endpoints and how to get started using them.,"A Foundry resource provides unified access to models, agents, and tools.
This article explains which SDK and endpoint to use for your scenario. Choose your SDK: Note Resource types: A Foundry resource provides all endpoints previously listed. An Azure OpenAI resource provides only the /openai/v1 endpoint. Authentication: Samples here use Microsoft Entra ID (DefaultAzureCredential). API keys work on /openai/v1. Pass the key as api_key instead of a token provider.",2026-04-10T08:00:00.000Z,how-to,,0.4,False,"Overview of Foundry SDKs and endpoints, resource types, and basic authentication options. While it mentions endpoint paths and auth methods, it lacks detailed configuration tables, limits, or decision criteria that would qualify as expert knowledge under the defined categories.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/run-ai-red-teaming-cloud,Run red teaming scans in the cloud,Run AI Red Teaming Agent in the cloud (Microsoft Foundry SDK) - Microsoft Foundry,,This article provides instructions on how to use the AI Red Teaming Agent to run an automated scan in the cloud of a Generative AI application with the Microsoft Foundry SDK.,"Though the AI Red Teaming Agent can be run locally during prototyping and development to help identify safety risks, running them in the cloud allows for the following scenarios:",2026-04-22T17:23:00.000Z,how-to,,0.4,False,"How-to for running AI Red Teaming Agent in the cloud; summary suggests procedural SDK usage, not detailed config tables, limits, or error-code-based troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/run-scans-ai-red-teaming-agent,Run red teaming scans locally,Run AI Red Teaming Agent Locally (Azure AI Evaluation SDK) - Microsoft Foundry,Run AI Red Teaming Agent scans locally,Learn how to use the AI Red Teaming Agent to run a local automated scan of a Generative AI application with the Azure AI Evaluation SDK.,"Important Items marked (preview) in this article are currently in public preview.
This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. The AI Red Teaming Agent (preview) is a powerful tool designed to help organizations proactively find safety risks associated with generative AI ",2026-04-10T08:00:00.000Z,how-to,integrations,0.75,True,"Local red-teaming via Azure AI Evaluation SDK implies concrete SDK calls, parameters, and configuration patterns unique to this product, matching integrations & coding patterns criteria.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/sdk-overview,Microsoft Foundry SDKs,Get started with Microsoft Foundry SDKs and Endpoints - Microsoft Foundry,Choose Microsoft Foundry SDKs and endpoints,This article provides an overview of the Microsoft Foundry SDKs and endpoints and how to get started using them.,"A Foundry resource provides unified access to models, agents, and tools. This article explains which SDK and endpoint to use for your scenario. Choose your SDK: Note Resource types: A Foundry resource provides all endpoints previously listed. An Azure OpenAI resource provides only the /openai/v1 endpoint. Authentication: Samples here use Microsoft Entra ID (DefaultAzureCredential). API keys work on /openai/v1. Pass the key as api_key instead of a token provider.",2026-04-14T22:13:00.000Z,how-to,decision-making,0.68,True,"The page provides product-specific guidance on when to use different Foundry vs Azure OpenAI endpoints and SDKs, including which resource types expose which endpoints and which auth methods work where.
This is concrete selection guidance between options (Foundry resource vs Azure OpenAI resource, /openai/v1 vs other endpoints, Entra ID vs API key) rather than a generic overview, fitting the decision-making category.",updated https://learn.microsoft.com/en-us/azure/foundry/how-to/disable-preview-features,Disable preview features,Disable preview features in Microsoft Foundry - Microsoft Foundry,Disable preview features in Microsoft Foundry using tags and RBAC,Learn how to disable preview features in Microsoft Foundry by using Azure tags to hide portal surfaces or custom RBAC roles to block specific operations.,"Restrict preview features in Microsoft Foundry to keep production environments focused on generally available capabilities. This article covers two approaches: Use tags for portal-level suppression, and use custom RBAC roles when you need to block specific operations or permissions.",2026-03-13T22:11:00.000Z,how-to,security,0.8,True,Explains using Azure tags to hide portal surfaces and custom RBAC roles to block operations. This implies specific tag keys/values and role definitions/permissions that are concrete security and governance configurations.,unchanged -https://learn.microsoft.com/en-us/azure/foundry/how-to/evaluate-generative-ai-app,Run evaluations from the portal,Run evaluations from the Microsoft Foundry portal - Microsoft Foundry,,Evaluate your generative AI models and agents by using Microsoft Foundry.,"Evaluate the performance and safety of your generative AI models and agents by running them against a test dataset. During an evaluation, the model or agent is tested with the dataset and its performance is measured using built-in and custom evaluators. 
Use the Foundry portal to run evaluations, view results, and analyze metrics.",2026-04-16T22:08:00.000Z,how-to,,0.3,False,"Portal-based evaluation article is primarily about using the UI to run and view evaluations and metrics; summary does not indicate detailed configuration tables, limits, or error mappings.",updated -https://learn.microsoft.com/en-us/azure/foundry/how-to/evaluate-results,View evaluation results in the portal,See Evaluation Results in Microsoft Foundry portal - Microsoft Foundry,,"See and analyze AI model evaluation results in Microsoft Foundry portal. Learn to view performance metrics, compare results, and interpret evaluation data for model optimization.","In this article, you learn to:",2026-03-19T06:06:00.000Z,how-to,,0.3,False,"Covers viewing and interpreting evaluation results; no evidence of limits, config parameters, or troubleshooting codes.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/how-to/evaluation-azure-devops,Evaluation in Azure DevOps,How to run an evaluation in Azure DevOps - Microsoft Foundry,Run Foundry evaluations in Azure DevOps pipelines,"How to run evaluation in Azure DevOps, which enables offline evaluation of AI models within your CI/CD pipelines in Azure DevOps.","Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. This Azure DevOps extension enables offline evaluation of Microsoft Foundry Agents within your CI/CD pipelines.
It streamlines the offline evaluation",2026-04-14T22:13:00.000Z,how-to,deployment,0.8,True,"Azure DevOps extension for offline evaluation will include task parameters, configuration options, and pipeline integration details specific to Foundry, representing deployment-focused CI/CD configuration knowledge.",updated -https://learn.microsoft.com/en-us/azure/foundry/how-to/evaluation-github-action,Evaluation in GitHub Actions,How to run an evaluation in GitHub Action - Microsoft Foundry,Run Foundry evaluations in GitHub Actions CI,"How to run evaluation in GitHub Action to streamline the evaluation process, allowing you to assess model performance and make informed decisions before deploying to production.","Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. This GitHub Action enables offline evaluation of Microsoft Foundry Agents within your CI/CD pipelines. It's designed to streamline the offline evalua",2026-04-14T22:13:00.000Z,how-to,deployment,0.8,True,"Describes a specific GitHub Action for Foundry agents, likely including action inputs, configuration fields, and CI/CD usage constraints, which are product-specific deployment patterns for integrating evaluations into pipelines.",updated -https://learn.microsoft.com/en-us/azure/foundry/how-to/fireworks/enable-fireworks-models,Use Fireworks models,Fireworks models on Microsoft Foundry (preview) - Microsoft Foundry,Enable and configure Fireworks models in Foundry,"Learn how to enable, deploy, and use Fireworks models in Microsoft Foundry, including catalog models, custom model import (BYOM), data privacy, and frequently asked questions.","Important Items marked (preview) in this article are currently in public preview.
This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Through integration with Fireworks AI, Microsoft Foundry customers can: All of these capabilities are available directly within your Foundry proje",2026-04-13T17:17:00.000Z,how-to,configuration,0.7,True,"Integration article for Fireworks models is expected to detail how to enable, deploy, and use these models, including configuration options, model catalog settings, and BYOM parameters specific to this integration, fitting configuration/integration knowledge; configuration is primary due to enable/usage focus.",updated +https://learn.microsoft.com/en-us/azure/foundry/how-to/evaluate-generative-ai-app,Run evaluations from the portal,Run evaluations from the Microsoft Foundry portal - Microsoft Foundry,,Evaluate your generative AI models and agents by using Microsoft Foundry.,"Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Evaluate the performance and safety of your generative AI models and agents by running them against a test dataset. During an evaluation, the mod",2026-04-22T22:15:00.000Z,how-to,,0.4,False,"Portal-based evaluation how-to; appears to be step-by-step usage rather than configuration tables, limits, or decision matrices.",updated +https://learn.microsoft.com/en-us/azure/foundry/how-to/evaluate-results,View evaluation results in the portal,See Evaluation Results in Microsoft Foundry portal - Microsoft Foundry,,"See and analyze AI model evaluation results in Microsoft Foundry portal.
Learn to view performance metrics, compare results, and interpret evaluation data for model optimization.","In this article, you learn to:",2026-04-22T17:23:00.000Z,how-to,,0.3,False,Describes viewing and interpreting evaluation results; likely UI walkthrough and conceptual metric interpretation without product-specific limits or configs.,updated +https://learn.microsoft.com/en-us/azure/foundry/how-to/evaluation-azure-devops,Evaluation in Azure DevOps,How to run an evaluation in Azure DevOps - Microsoft Foundry,Run Foundry evaluations in Azure DevOps pipelines,"How to run evaluation in Azure DevOps, which enables offline evaluation of AI models within your CI/CD pipelines in Azure DevOps.","Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. This Azure DevOps extension enables offline evaluation of Microsoft Foundry Agents within your CI/CD pipelines. It streamlines the offline evaluation",2026-04-14T22:13:00.000Z,how-to,deployment,0.8,True,"Azure DevOps extension for offline evaluation will include task parameters, configuration options, and pipeline integration details specific to Foundry, representing deployment-focused CI/CD configuration knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/how-to/evaluation-github-action,Evaluation in GitHub Actions,How to run an evaluation in GitHub Action - Microsoft Foundry,Run Foundry evaluations in GitHub Actions CI,"How to run evaluation in GitHub Action to streamline the evaluation process, allowing you to assess model performance and make informed decisions before deploying to production.","Important Items marked (preview) in this article are currently in public preview.
This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. This GitHub Action enables offline evaluation of Microsoft Foundry Agents within your CI/CD pipelines. It's designed to streamline the offline evalua",2026-04-14T22:13:00.000Z,how-to,deployment,0.8,True,"Describes a specific GitHub Action for Foundry agents, likely including action inputs, configuration fields, and CI/CD usage constraints, which are product-specific deployment patterns for integrating evaluations into pipelines.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/how-to/fireworks/enable-fireworks-models,Use Fireworks models,Fireworks models on Microsoft Foundry (preview) - Microsoft Foundry,Enable and configure Fireworks models in Foundry,"Learn how to enable, deploy, and use Fireworks models in Microsoft Foundry, including catalog models, custom model import (BYOM), data privacy, and frequently asked questions.","Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews.
Through integration with Fireworks AI, Microsoft Foundry customers can: All of these capabilities are available directly within your Foundry proje",2026-04-13T17:17:00.000Z,how-to,configuration,0.7,True,"Integration article for Fireworks models is expected to detail how to enable, deploy, and use these models, including configuration options, model catalog settings, and BYOM parameters specific to this integration, fitting configuration/integration knowledge; configuration is primary due to enable/usage focus.",unchanged https://learn.microsoft.com/en-us/azure/foundry/how-to/fireworks/import-custom-models,Import custom models with Fireworks,Import custom models into Microsoft Foundry with Fireworks (preview) - Microsoft Foundry,Import and deploy custom Fireworks models in Foundry,"Learn how to import, register, and deploy your own custom model weights in Microsoft Foundry using the Fireworks inference runtime.","Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Import and deploy your own model weights on Foundry using the Fireworks inference runtime. In this article, you learn how to import, register, an",2026-03-20T08:00:00.000Z,how-to,deployment,0.75,True,"Describes how to import, register, and deploy custom model weights using the Fireworks inference runtime in Foundry.
This is a product-specific deployment workflow for custom models, including registration and runtime constraints.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/how-to/high-availability-resiliency,High availability and resiliency,High availability and resiliency for Microsoft Foundry projects and Agent Services - Microsoft Foundry,Plan high availability and resiliency for Foundry projects and agents,Learn how to plan for high availability and resiliency for Microsoft Foundry projects and Agent Service.,"Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Plan ahead to maintain business continuity and prepare for disaster recovery with Microsoft Foundry. Microsoft strives to ensure that Azure servic",2026-02-27T23:08:00.000Z,how-to,architecture-patterns,0.64,True,"Guidance for HA and resiliency planning for Foundry projects and Agent Service. Likely includes Foundry-specific patterns, regional deployment recommendations, and possibly RPO/RTO targets or thresholds that inform architectural decisions.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/how-to/high-availability-resiliency,High availability and resiliency,High availability and resiliency for Microsoft Foundry projects and Agent Services - Microsoft Foundry,Design high availability and resiliency for Foundry,Learn how to plan for high availability and resiliency for Microsoft Foundry projects and Agent Service.,"Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities.
For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Plan ahead to maintain business continuity and prepare for disaster recovery with Microsoft Foundry. Microsoft strives to ensure that Azure servic",2026-04-21T06:08:00.000Z,how-to,architecture-patterns,0.6,True,"Focuses on planning high availability and disaster recovery for Foundry projects and Agent Service. While not explicitly listing limits, it likely includes product-specific resiliency patterns, region/zone usage, and guidance on when to use particular configurations for business continuity.",updated https://learn.microsoft.com/en-us/azure/foundry/how-to/integrate-with-other-apps,Integrate with your applications,Integrate Microsoft Foundry with your applications - Microsoft Foundry,Choose integration patterns for Microsoft Foundry APIs,"Learn how to choose a Microsoft Foundry integration pattern, retrieve endpoints, and send your first REST API request with authentication.","Microsoft Foundry is a developer platform that lets you embed AI models, agents, evaluation tools, and Responsible AI capabilities into your workflows and applications. You can build complete solutions in Foundry or selectively integrate its features into your custom apps and Microsoft or partner solutions. In this article, you learn how to choose an integration pattern and send your first REST API request to a Foundry endpoint.",2026-04-08T08:00:00.000Z,how-to,decision-making,0.6,True,"Article explicitly helps choose an integration pattern and explains how to integrate Foundry into applications.
Likely includes pattern selection guidance and criteria for when to use each integration approach, which fits decision-making for integration strategy.",unchanged https://learn.microsoft.com/en-us/azure/foundry/how-to/managed-virtual-network,Managed virtual network,Configure managed virtual network for Microsoft Foundry projects (preview) - Microsoft Foundry,Set up managed virtual networks for Microsoft Foundry projects,Secure your Microsoft Foundry projects with managed virtual networks. Learn to enable outbound isolation and private endpoints for enhanced data protection.,"Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. This article explains how to set up a managed virtual network for your Foundry resource. Managed virtual network streamlines and automates networ",2026-02-27T23:08:00.000Z,how-to,security,0.78,True,"Explains enabling managed virtual networks, outbound isolation, and private endpoints. This usually involves specific configuration flags, subnet requirements, and endpoint settings unique to Foundry, which are security/network configuration details.",unchanged https://learn.microsoft.com/en-us/azure/foundry/how-to/model-deployment-policy,Built-in policy for model deployment,Built-in policy for model deployment - Microsoft Foundry,Use built-in Azure Policy definitions for Foundry model deployment,Manage AI model deployment in Microsoft Foundry portal with built-in Azure Policy definitions. Learn how to govern and manage model deployments effectively.,"Azure Policy provides built-in policy definitions that help you govern the deployment of AI models in Microsoft Foundry portal.
You can use @@ -190,21 +199,21 @@ https://learn.microsoft.com/en-us/azure/foundry/how-to/quota,Manage quotas for F https://learn.microsoft.com/en-us/azure/foundry/how-to/set-up-key-vault-connection,Store secrets in your Azure Key Vault,Set up an Azure Key Vault Connection - Microsoft Foundry,Set up an Azure Key Vault connection for Microsoft Foundry,Learn how to securely connect your Azure Key Vault to Foundry. Follow step-by-step instructions to manage secrets and ensure seamless integration.,"If you don't set up a Key Vault connection, Microsoft Foundry stores connection details in a Microsoft-managed Key Vault outside your subscription. To manage your own secrets, connect your Azure Key Vault to Foundry. Before you begin, review the limitations for Key Vault connections. Azure Key Vault is a cloud service for securely storing and accessing secrets. A secret is anything that you want to tightly control access to, such as API keys, passwords, certificates, or cryptographic keys. For mor",2026-02-27T23:08:00.000Z,how-to,integrations,0.78,True,"How-to for connecting a customer Key Vault to Foundry, including limitations. Likely lists connection configuration parameters, required permissions, and secret handling behavior specific to this integration.",unchanged
You keep your existing Azure OpenAI API endpoint, state of work, and security configurations, and don't need to create a new Foundry resource.",2026-02-27T23:08:00.000Z,how-to,decision-making,0.76,True,"Explains upgrade path, what carries over (endpoints, state, security), and trade-offs between resource types, guiding users through a migration decision and process.",unchanged https://learn.microsoft.com/en-us/azure/foundry/mcp/available-tools,Available tools and sample prompts,Explore available tools and example prompts for Foundry MCP Server (preview) - Microsoft Foundry,Use Foundry MCP Server tools and prompts,"Reference guide for all Foundry MCP Server tools, including tool descriptions, key inputs, permissions, and example prompts for each tool.","Foundry MCP Server exposes 38 tools across 10 categories that let you manage agents, datasets, evaluations, model deployments, and more — all through conversational prompts instead of API calls. Use this reference to explore each tool and try the example prompts in your own project. Tip Before using these tools, complete the Foundry MCP Server setup. Note This feature is currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for producti",2026-04-02T08:00:00.000Z,reference,integrations,0.85,True,"Reference for 38 MCP tools with descriptions, key inputs, permissions, and example prompts. 
This is detailed tool/parameter-level integration knowledge unique to Foundry MCP Server.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/mcp/build-your-own-mcp-server,Build your own MCP server,Build and register a Model Context Protocol (MCP) server - Microsoft Foundry,Build and register custom MCP servers for Foundry agents,"Learn how to build a custom MCP server using Azure Functions, register it in your organizational tool catalog, and connect it to Foundry Agent Service.","The Model Context Protocol (MCP) provides a standard interface for AI agents to interact with APIs and external services. When you need to integrate private or internal enterprise systems that don't have existing MCP server implementations, you can build your own custom server. This article shows you how to create a remote MCP server using Azure Functions, register it in a private organizational tool catalog using Azure API Center, and connect it to Foundry Agent Service. This approach enables you",2026-04-16T16:35:00.000Z,how-to,integrations,0.6,True,"Covers building a custom MCP server with Azure Functions, registering it in Azure API Center, and connecting it to Foundry Agent Service; this is a product-specific integration pattern tying together multiple services with concrete configuration steps.",updated +https://learn.microsoft.com/en-us/azure/foundry/mcp/build-your-own-mcp-server,Build your own MCP server,Build and register a Model Context Protocol (MCP) server - Microsoft Foundry,Build and register custom MCP servers for Foundry agents,"Learn how to build a custom MCP server using Azure Functions, register it in your organizational tool catalog, and connect it to Foundry Agent Service.","The Model Context Protocol (MCP) provides a standard interface for AI agents to interact with APIs and external services. When you need to integrate private or internal enterprise systems that don't have existing MCP server implementations, you can build your own custom server. 
This article shows you how to create a remote MCP server using Azure Functions, register it in a private organizational tool catalog using Azure API Center, and connect it to Foundry Agent Service. This approach enables you",2026-04-16T16:35:00.000Z,how-to,integrations,0.6,True,"Covers building a custom MCP server with Azure Functions, registering it in Azure API Center, and connecting it to Foundry Agent Service; this is a product-specific integration pattern tying together multiple services with concrete configuration steps.",unchanged https://learn.microsoft.com/en-us/azure/foundry/mcp/get-started,Get started with Foundry MCP Server,Get started using Foundry MCP Server with Visual Studio Code - Microsoft Foundry,Connect VS Code to Foundry MCP Server,"Connect to Foundry MCP Server from Visual Studio Code, authenticate with Entra ID, and run your first prompts against Foundry services.","Foundry MCP Server (preview) is a cloud-hosted implementation of the Model Context Protocol (MCP). It exposes curated tools that let your agents perform read and write operations against Foundry services without calling backend APIs directly. You don't need to deploy infrastructure — the server provides a secure, scalable endpoint with built-in authentication through Microsoft Entra ID. Use an MCP-compliant client such as Visual Studio Code to connect to the public endpoint, authenticate with En",2026-03-20T17:16:00.000Z,get-started,configuration,0.7,True,"Explains how to connect to the cloud-hosted Foundry MCP Server from VS Code, authenticate with Entra ID, and run prompts. 
This involves specific endpoint, authentication, and client configuration details that are product-specific.",unchanged https://learn.microsoft.com/en-us/azure/foundry/mcp/security-best-practices,Best practices and security guidance,Explore Foundry MCP Server best practices and security guidance - Microsoft Foundry,Secure Foundry MCP Server with RBAC and policies,"Security guidance for Foundry MCP Server, including identity, RBAC, Conditional Access policies, network isolation, and data residency.","Foundry MCP Server (preview) tools automate read and write operations across Foundry resources, including deployments, datasets, evaluations, monitoring, and analytics. This guidance helps you verify intent, reduce risk, and apply security and governance practices before you run MCP tools. In this article, you learn about: Note This feature is currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain featu",2026-04-02T17:11:00.000Z,concept-article,security,0.85,True,"Provides security guidance for Foundry MCP Server including identity, RBAC, Conditional Access, network isolation, and data residency. This is product-specific security configuration and governance guidance with concrete patterns.",unchanged https://learn.microsoft.com/en-us/azure/foundry/observability/concepts/trace-agent-concept,Trace agent overview,Agent tracing in Microsoft Foundry (preview) - Microsoft Foundry,,"Learn how agent tracing in Microsoft Foundry captures inputs, outputs, and tool usage with OpenTelemetry. Debug agent runs, identify latency issues, and improve reliability.","Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. 
For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Note Tracing is generally available for prompt agents only. Workflow, hosted, and custom agents are in preview. Microsoft Foundry provides an obs",2026-03-28T06:04:00.000Z,concept-article,,0.4,False,"Conceptual article on agent tracing and observability; summary focuses on what tracing is and why, not on concrete configuration parameters, limits, or error codes.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/cluster-analysis,Analyze evaluation results,Evaluation Cluster Analysis - Microsoft Foundry,,Learn how to run and interact with an evaluation cluster analysis.,"Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. After you run one or more evaluation runs, you can generate an evaluation cluster analysis to understand your evaluation results. This analysis p",2026-04-14T22:13:00.000Z,how-to,,0.4,False,"Cluster analysis article appears focused on interpreting evaluation results and analysis views; summary does not suggest concrete configuration parameters, limits, or troubleshooting mappings.",updated -https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/evaluate-agent,Evaluate agent quality,Evaluate your AI agents - Microsoft Foundry,Evaluate Foundry agents with built-in quality and safety tests,"Learn how to evaluate AI agents using built-in evaluators for quality, safety, and agent-specific behaviors.","Evaluation is essential for ensuring your agent meets quality and safety standards before deployment. 
By running evaluations during development, you establish a baseline for your agent's performance and can set acceptance thresholds, such as an 85% task adherence passing rate, before releasing it to users. In this article, you learn how to run an agent-targeted evaluation against a Foundry agent using built-in evaluators for quality, safety, and agent behavior. Specifically, you: Tip For general-p",2026-03-12T22:17:00.000Z,how-to,best-practices,0.6,True,How-to for running evaluations with built-in evaluators and setting acceptance thresholds; contains product-specific evaluator types and usage patterns.,unchanged +https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/cluster-analysis,Analyze evaluation results,Evaluation Cluster Analysis - Microsoft Foundry,,Learn how to run and interact with an evaluation cluster analysis.,"Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. After you run one or more evaluation runs, you can generate an evaluation cluster analysis to understand your evaluation results. 
This analysis p",2026-04-14T22:13:00.000Z,how-to,,0.4,False,"Cluster analysis article appears focused on interpreting evaluation results and analysis views; summary does not suggest concrete configuration parameters, limits, or troubleshooting mappings.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/evaluate-agent,Evaluate agent quality,Evaluate your AI agents - Microsoft Foundry,Evaluate Foundry agents with built-in quality and safety tests,"Learn how to evaluate AI agents using built-in evaluators for quality, safety, and agent-specific behaviors.","Evaluation is essential for ensuring your agent meets quality and safety standards before deployment. By running evaluations during development, you establish a baseline for your agent's performance and can set acceptance thresholds, such as an 85% task adherence passing rate, before releasing it to users. In this article, you learn how to run an agent-targeted evaluation against a Foundry agent or hosted agent using built-in evaluators for quality, safety, and agent behavior. Specifically, you: Tip",2026-04-22T17:23:00.000Z,how-to,best-practices,0.6,True,"Covers how to run evaluations, set acceptance thresholds (example 85% task adherence), and likely includes concrete evaluator configurations and thresholds, which are product-specific best practices.",updated https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/how-to-monitor-agents-dashboard,Monitor agents in the dashboard,Monitor agents with the Agent Monitoring Dashboard - Microsoft Foundry,Configure Foundry Agent Monitoring Dashboard and metrics,"Learn how to monitor operational metrics, token usage, latency, and evaluation results for AI agents in Microsoft Foundry by using the Agent Monitoring Dashboard and Application Insights.","Important Items marked (preview) in this article are currently in public preview. 
This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Use the Agent Monitoring Dashboard in Microsoft Foundry to track operational metrics and evaluation results for your agents. This dashboard helps",2026-04-10T08:00:00.000Z,how-to,configuration,0.7,True,"Covers using the Agent Monitoring Dashboard and Application Insights to track metrics like token usage and latency. Such pages usually include specific metric names, dimensions, and dashboard/configuration options unique to Foundry observability, which are configuration-focused expert details.",unchanged https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/human-evaluation,Human evaluation for agents,Human Evaluation for Microsoft Foundry Agents - Microsoft Foundry,,"Learn how to set up human evaluation for your Microsoft Foundry agents, create templates, and analyze results to improve agent performance.","Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. In this article, you’ll learn how to set up human evaluation for your Foundry agent. 
As an agent builder, you can create evaluation question temp",2026-02-27T23:08:00.000Z,how-to,,0.4,False,Human evaluation setup and templates; likely workflow instructions without detailed config tables or error mappings.,unchanged -https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/optimization-dashboard,Monitoring dashboard insights with Foundry agent,Monitoring dashboard insights in Microsoft Foundry with Ask AI - Microsoft Foundry,,Discover how to use Ask AI in Microsoft Foundry to interpret your monitoring dashboard and gain actionable insights for better decision-making.,"Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. After your agent is in production, you can view metrics in two places: the monitoring dashboard for model-level and agent-level metrics (Build>Mo",2026-04-14T22:13:00.000Z,how-to,,0.3,False,"Monitoring dashboard with Ask AI appears to focus on interpreting metrics and insights; summary does not indicate detailed configuration tables, limits, or error-code-based troubleshooting.",updated +https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/optimization-dashboard,Monitoring dashboard insights with Foundry agent,Monitoring dashboard insights in Microsoft Foundry with Ask AI - Microsoft Foundry,,Discover how to use Ask AI in Microsoft Foundry to interpret your monitoring dashboard and gain actionable insights for better decision-making.,"Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. 
For more information, see Supplemental Terms of Use for Microsoft Azure Previews. After your agent is in production, you can view metrics in two places: the monitoring dashboard for model-level and agent-level metrics (Build>Mo",2026-04-14T22:13:00.000Z,how-to,,0.3,False,"Monitoring dashboard with Ask AI appears to focus on interpreting metrics and insights; summary does not indicate detailed configuration tables, limits, or error-code-based troubleshooting.",unchanged https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/optimization-model-upgrade,Optimization model upgrade,Upgrade or switch models with Ask AI - Microsoft Foundry,Use Ask AI to upgrade or switch Foundry models,"Learn how to use Ask AI in the Microsoft Foundry portal to detect deprecated models, evaluate alternatives, upgrade deployments, and update agents to newer model versions.","Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. 
Use Ask AI — the built-in chat assistant in the Microsoft Foundry portal — to detect deprecated or outdated models, compare alternatives, and upg",2026-02-27T23:08:00.000Z,how-to,decision-making,0.65,True,"Guides detection of deprecated models, comparison of alternatives, and upgrade decisions; contains product-specific migration and selection guidance.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/prompt-optimizer,Optimize agent prompts,Optimize agent prompts with Prompt Optimizer (preview) - Microsoft Foundry,Improve Foundry agent prompts with Prompt Optimizer,Learn how to use Prompt Optimizer in Microsoft Foundry to automatically improve your agent's system instructions using AI-driven prompt engineering best practices.,"Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Use Prompt Optimizer in Microsoft Foundry to automatically improve your agent's system instructions. 
Prompt Optimizer applies prompt-engineering b",2026-04-14T22:13:00.000Z,how-to,best-practices,0.65,True,"Described as applying prompt-engineering best practices specifically for Foundry agents; likely includes concrete, product-specific recommendations and patterns for system instructions rather than generic theory, fitting best-practices.",updated +https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/prompt-optimizer,Optimize agent prompts,Optimize agent prompts with Prompt Optimizer (preview) - Microsoft Foundry,Improve Foundry agent prompts with Prompt Optimizer,Learn how to use Prompt Optimizer in Microsoft Foundry to automatically improve your agent's system instructions using AI-driven prompt engineering best practices.,"Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Use Prompt Optimizer in Microsoft Foundry to automatically improve your agent's system instructions. Prompt Optimizer applies prompt-engineering b",2026-04-14T22:13:00.000Z,how-to,best-practices,0.65,True,"Described as applying prompt-engineering best practices specifically for Foundry agents; likely includes concrete, product-specific recommendations and patterns for system instructions rather than generic theory, fitting best-practices.",unchanged https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/trace-agent-client-side,Add client-side tracing,Add client-side tracing to Foundry agents - Microsoft Foundry,Add OpenTelemetry client-side tracing to Foundry agents,"Instrument AI agents with OpenTelemetry client-side tracing using the Microsoft Foundry SDK. 
Export traces to Azure Monitor, the console, or any OTLP-compatible backend.","Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Microsoft Foundry automatically captures server-side traces for agents running in the portal. Client-side tracing extends that visibility into yo",2026-04-10T08:00:00.000Z,how-to,integrations,0.7,True,"Shows how to instrument Foundry agents with OpenTelemetry using the Microsoft Foundry SDK and export traces to Azure Monitor or OTLP backends. This typically includes SDK configuration parameters, exporter settings, and tracing options unique to this product, matching the integrations sub-skill.",unchanged https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/trace-agent-framework,Tracing integrations,Configure tracing for AI agent frameworks - Microsoft Foundry,Configure tracing for LangChain and other AI agent frameworks,"Debug issues and monitor AI agent performance in production by configuring OpenTelemetry tracing for LangChain, LangGraph and OpenAI Agents SDK.","Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Note Tracing is generally available for prompt agents only. Workflow, hosted, and custom agents are in preview. 
When AI agents behave unexpectedl",2026-03-27T08:00:00.000Z,how-to,configuration,0.7,True,"How-to for configuring OpenTelemetry tracing for LangChain, LangGraph, and OpenAI Agents SDK in the context of Foundry. This will include framework-specific config snippets, environment variables, and tracing setup parameters.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/trace-agent-setup,Set up tracing,Set Up Tracing for AI Agents in Microsoft Foundry - Microsoft Foundry,Configure tracing for Microsoft Foundry AI agents,Learn how to set up tracing in Microsoft Foundry to debug AI agent runs and monitor behavior by sending telemetry to Azure Monitor Application Insights with OpenTelemetry.,"Important Items marked (preview) in this article are currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Note Tracing is generally available for prompt agents only. Workflow, hosted, and custom agents are in preview. 
Use tracing to debug your AI agen",2026-04-16T22:08:00.000Z,how-to,configuration,0.7,True,"How-to for setting up tracing with Azure Monitor Application Insights and OpenTelemetry is likely to include concrete configuration parameters (connection strings, instrumentation keys, exporter settings) and product-specific setup steps, which qualify as configuration expert knowledge.",updated -https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/troubleshooting,Troubleshoot evaluation and observability issues,Troubleshoot evaluation and observability issues - Microsoft Foundry,Troubleshoot Foundry evaluation and observability issues,"Learn how to troubleshoot common issues with evaluations and observability in Microsoft Foundry, including storage account access, RBAC, and network configuration.","This article provides information to help you solve common issues you might encounter when you use evaluation and observability features in Microsoft Foundry. Issues often relate to storage account configuration, role-based access control (RBAC), or network settings for the Foundry project.",2026-04-14T22:13:00.000Z,troubleshooting,troubleshooting,0.85,True,"Explicit troubleshooting article for evaluation/observability, likely mapping storage/RBAC/network symptoms to causes and fixes, with product-specific roles, settings, and diagnostic steps that qualify as expert troubleshooting knowledge.",new +https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/trace-agent-setup,Set up tracing,Set Up Tracing for AI Agents in Microsoft Foundry - Microsoft Foundry,Configure tracing for Microsoft Foundry AI agents,Learn how to set up tracing in Microsoft Foundry to debug AI agent runs and monitor behavior by sending telemetry to Azure Monitor Application Insights with OpenTelemetry.,"Important Items marked (preview) in this article are currently in public preview. 
This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews. Note Tracing is generally available for prompt agents only. Workflow, hosted, and custom agents are in preview. Use tracing to debug your AI agen",2026-04-16T22:08:00.000Z,how-to,configuration,0.7,True,"How-to for setting up tracing with Azure Monitor Application Insights and OpenTelemetry is likely to include concrete configuration parameters (connection strings, instrumentation keys, exporter settings) and product-specific setup steps, which qualify as configuration expert knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/troubleshooting,Troubleshoot evaluation and observability issues,Troubleshoot evaluation and observability issues - Microsoft Foundry,Troubleshoot Foundry evaluation and observability issues,"Learn how to troubleshoot common issues with evaluations and observability in Microsoft Foundry, including storage account access, RBAC, and network configuration.","This article provides information to help you solve common issues you might encounter when you use evaluation and observability features in Microsoft Foundry. 
Issues often relate to storage account configuration, role-based access control (RBAC), or network settings for the Foundry project.",2026-04-14T22:13:00.000Z,troubleshooting,troubleshooting,0.85,True,"Explicit troubleshooting article for evaluation/observability, likely mapping storage/RBAC/network symptoms to causes and fixes, with product-specific roles, settings, and diagnostic steps that qualify as expert troubleshooting knowledge.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/api-version-lifecycle,API lifecycle,Azure OpenAI in Microsoft Foundry Models v1 API - Microsoft Foundry,Use Azure OpenAI v1 API in Foundry Models,"Learn how to use the Azure OpenAI v1 API, which simplifies authentication, removes api-version parameters, and supports cross-provider model calls.","This article shows you how to use the v1 Azure OpenAI API. The v1 API simplifies authentication, removes the need for dated api-version parameters, and supports cross-provider model calls. Note New API response objects might be added to the API response at any time. We recommend you only parse the response objects you require.",2026-03-09T23:03:00.000Z,how-to,configuration,0.7,True,"Explains v1 API usage, auth, and cross-provider calls; likely includes request structure and configuration details specific to this API version.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/audio-completions-quickstart,Audio generation,Quickstart - Get started with Azure OpenAI audio generation - Microsoft Foundry,Call Azure OpenAI audio models via API,Get started with audio generation using Azure OpenAI.,"Audio-enabled models introduce the audio modality into the existing /chat/completions API. The audio model expands the potential for AI applications in text and voice-based interactions and audio analysis. Modalities supported in gpt-4o-audio-preview and gpt-4o-mini-audio-preview models include: text, audio, and text + audio. 
Here's a table of the supported modalities with example use cases: By using audio generation capabilities, you can achieve more dynamic and interactive AI applications. Models th",2026-02-27T23:08:00.000Z,how-to,integrations,0.7,True,"Quickstart for audio generation will include concrete REST/SDK request schemas, required/optional parameters (like model names, modalities, formats), and example payloads specific to Azure OpenAI audio-completions, which are product-specific integration details beyond generic LLM knowledge.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/authoring-reference-preview,2025-04-01-preview - Authoring,Azure OpenAI in Microsoft Foundry Models REST API authoring preview reference - Microsoft Foundry,Authoring operations for Foundry OpenAI REST API,"Learn how to use Azure OpenAI's latest authoring preview REST API. In this article, you learn about authorization options, how to structure a request and receive a response.",This article provides details on the inference REST API endpoints for Azure OpenAI.,2026-02-27T23:08:00.000Z,reference,integrations,0.78,True,Authoring preview REST reference with concrete endpoints and parameters for managing resources. Contains product-specific API shapes and fields that qualify as expert integration knowledge.,unchanged @@ -219,21 +228,23 @@ https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/content-streamin https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/default-safety-policies,Default safety policies,Default Guardrail policies for Azure OpenAI - Microsoft Foundry,Understand default guardrail safety policies in Foundry,Learn about the default Guardrail policies that Azure OpenAI uses to flag content and ensure responsible use of the service.,"Azure OpenAI in Microsoft Foundry Models includes default safety policies applied to all models (excluding Azure OpenAI Whisper). 
These configurations provide you with a responsible experience by default, including content filtering models, blocklists, prompt transformation, content credentials, and other features. Default safety aims to mitigate risks in different categories such as hate and fairness, sexual, violence, self-harm, protected material content, and user prompt injection attacks. To l",2026-02-27T23:08:00.000Z,concept-article,security,0.62,True,"Describes default safety policies, including specific filters (blocklists, prompt transformation, content credentials). Likely enumerates concrete policy names and behaviors that are product-specific security defaults.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/fine-tuning-considerations,When to use fine-tuning,Microsoft Foundry fine-tuning considerations - Microsoft Foundry,,Learn more about what you should take into consideration before fine-tuning with Microsoft Foundry.,"Fine-tuning is the process of taking a pretrained language model and adapting it to perform a specific task or improve its performance on a particular dataset. This involves training the model on a smaller, task-specific dataset while adjusting the model's weights slightly. Fine-tuning leverages the knowledge the model acquired during its initial training on a large, diverse dataset, allowing it to specialize without starting from scratch. 
This approach is often more efficient than training a ne",2026-03-11T08:00:00.000Z,concept-article,,0.3,False,"Conceptual discussion of fine-tuning considerations; summary does not indicate concrete config values, limits, or product-specific numeric thresholds.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/gpt-4-v-prompt-engineering,Image prompt engineering techniques,Image prompt engineering techniques - Microsoft Foundry,Apply prompt engineering techniques for vision-enabled GPT models,Learn how to better engineer image prompts for vision-enabled chat models.,"To unlock the full potential of vision-enabled chat models, it's essential to tailor the prompts to your specific needs. Here are some guidelines to enhance the accuracy and efficiency of your prompts. Note These prompt engineering techniques apply to vision-enabled models including GPT-4 Turbo with Vision, GPT-4o, and GPT-4o-mini. To deploy a vision-enabled model, see Deploy models.",2026-02-27T23:08:00.000Z,concept-article,best-practices,0.7,True,Provides concrete prompt patterns and DO/DON'T guidance specific to GPT-4o and related vision models—product-specific best practices beyond generic prompt advice.,unchanged
The following models are no longer available for deployment.,2026-02-27T23:08:00.000Z,concept-article,decision-making,0.6,True,"Catalog of retired models; informs decisions about legacy migrations and compatibility, containing data not inferable from general knowledge.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/model-retirements,Azure OpenAI model retirement,Azure OpenAI in Microsoft Foundry Model Retirements - Microsoft Foundry,Manage Azure OpenAI model deprecations in Foundry,Learn about model deprecations and retirements in Azure OpenAI.,"Azure OpenAI models are continually refreshed with newer and more capable models. As part of this process, we deprecate and retire older models. This article provides information about models that are currently available, deprecated, and retired.",2026-03-20T06:04:00.000Z,concept-article,decision-making,0.6,True,"Provides information about available, deprecated, and retired Azure OpenAI models. While partly reference, it directly informs which models you can or should use and when to migrate, supporting model selection and migration decisions.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/model-retirement-schedule,Model retirement schedule for Foundry Models,Model retirement schedule - Microsoft Foundry,Plan migrations using Foundry model retirement schedule,Retirement dates and replacement models for all models available through Microsoft Foundry.,"This article lists the retirement schedule for Foundry Models — their current lifecycle stage, retirement date, and suggested replacement. Use it to plan migrations before a model is deprecated or retired. For details on what each lifecycle stage means and how notifications work, see Microsoft Foundry Models lifecycle and support policy.",2026-04-24T17:22:00.000Z,concept-article,decision-making,0.7,True,A retirement schedule with specific dates and suggested replacement models is time-sensitive expert knowledge. 
It directly supports migration and model selection decisions by mapping current models to replacements and timelines.,new +https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/model-retirements,Foundry Models lifecycle and support policy,Foundry Models lifecycle and support policy - Microsoft Foundry,,"Learn about the Microsoft Foundry Models lifecycle and support policy, including model deprecations and retirements. Explore how these changes affect your deployments.","Microsoft Foundry Models move through a predictable lifecycle—from preview to general availability (GA) to eventual retirement—giving you time to evaluate replacements and migrate workloads. This article explains each lifecycle stage, the overlap commitments Microsoft makes when a model retires, and how you're notified. For specific retirement dates, see Model retirement schedule.",2026-04-24T17:22:00.000Z,concept-article,,0.3,False,"Lifecycle and support policy is primarily conceptual (preview → GA → retirement, notification behavior). The summary doesn’t indicate concrete numeric limits, configuration parameters, or decision matrices with thresholds.",new https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/model-router,Overview,Model router for Microsoft Foundry concepts - Microsoft Foundry,,Learn about the model router feature in Azure OpenAI in Microsoft Foundry Models.,"Model router is a trained language model that intelligently routes your prompts in real time to the most suitable large language model (LLM). You deploy model router like any other Foundry model. Thus, it delivers high performance while saving on costs, reducing latencies, and increasing responsiveness, while maintaining comparable quality, all packaged as a single model deployment. 
Note You do not need to separately deploy the supported LLMs for use with model router, with the exception of the ",2026-03-21T06:03:00.000Z,concept-article,,0.3,False,"Conceptual description of the model router; summary does not indicate numeric limits, configuration tables, error codes, or decision matrices.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/priority-processing,Priority processing,Enable priority processing for Microsoft Foundry Models - Microsoft Foundry,Enable and choose priority processing for Foundry models,Learn how to enable priority processing for Microsoft Foundry models to achieve low latency and high availability for time-sensitive workloads.,"Priority processing provides low-latency performance with the flexibility of pay-as-you-go. In this article, you enable priority processing on a model deployment, verify which service tier processed your requests, and monitor associated costs.",2026-03-24T22:15:00.000Z,how-to,decision-making,0.6,True,"Describes enabling priority processing, associated costs, and latency/availability behavior; likely includes tiered behavior and cost/performance trade-offs that guide when to use priority processing.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/model-router-how-it-works,How model router works,How model router works in Microsoft Foundry - Microsoft Foundry,,"Learn how model router analyzes prompts, scores candidate models, and routes requests to optimize cost, quality, and latency in Microsoft Foundry.","Model router is a purpose-built, trained machine-learning model that analyzes each prompt in real time and routes it to the most suitable large language model (LLM). It isn't a set of rules, a keyword matcher, or an LLM itself. It's a compact classifier designed to predict which model performs best for a given prompt at minimal latency. This article explains the architecture, routing logic, and decision factors that power model router. 
For supported models and version information, see themodel r",2026-04-24T06:04:00.000Z,concept-article,,0.3,False,"Explains architecture and routing logic conceptually (how the router scores and routes). The summary doesn’t indicate numeric thresholds, config parameters, or decision matrices with concrete values.",new +https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/priority-processing,Priority processing,Enable priority processing for Microsoft Foundry Models - Microsoft Foundry,Configure priority processing for Microsoft Foundry model deployments,Learn how to enable priority processing for Microsoft Foundry models to achieve low latency and high availability for time-sensitive workloads.,"Priority processing provides low-latency performance with the flexibility of pay-as-you-go. In this article, you enable priority processing on a model deployment, verify which service tier processed your requests, and monitor associated costs.",2026-04-23T17:31:00.000Z,how-to,configuration,0.65,True,"Describes enabling priority processing on deployments and verifying which tier processed requests. This typically involves specific deployment settings/flags and possibly tier identifiers, which are product-specific configuration details.",updated https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/prompt-engineering,Prompt engineering techniques,Prompt engineering techniques - Microsoft Foundry,,Learn how to use prompt engineering to optimize your work with Azure OpenAI.,"These techniques aren't recommended for reasoning models like gpt-5 and o-series models. Prompt construction can be difficult. In practice, the prompt acts to help the model complete the desired task, but it's more of an art than a science, often requiring experience and intuition to craft a successful prompt. The goal of this article is to help get you started with this learning process. This article attempts to capture general concepts and patterns that apply to all GPT models. 
However, it's i",2026-02-27T23:08:00.000Z,concept-article,,0.3,False,"Prompt engineering techniques are general LLM usage guidance, not Foundry-specific configuration, limits, or APIs.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/prompt-transformation,Image prompt transformation,Azure OpenAI prompt transformation concepts - Microsoft Foundry,,"Learn about the prompt transformation feature in Azure OpenAI image generation models, how it works, and why it's necessary.","Important The DALL-E image generation model dall-e-3 was retired on March 4, 2026, and is no longer available for new deployments. Existing deployments are non-functional. Use gpt-image-1 or gpt-image-1.5 for image generation instead. See the image generation how-to guide for updated instructions.",2026-03-19T22:16:00.000Z,concept-article,,0.3,False,"Conceptual explanation of prompt transformation for image generation; summary shows no numeric limits, config tables, or concrete error/diagnostic details.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/provisioned-throughput,Provisioned throughput offering (PTU),What Is Provisioned Throughput for Foundry Models? - Microsoft Foundry,Plan provisioned throughput architecture for Foundry models,Learn how provisioned throughput enables efficient deployment of Azure OpenAI and Foundry Models with stable latency and allocated capacity. Get started today.,The Microsoft Foundry provisioned throughput offering is a model deployment type that allows you to specify the amount of throughput you require in a model deployment. Foundry then allocates the necessary model processing capacity and ensures it's ready for you. Use the provisioned throughput you requested across a diverse portfolio of models that are sold directly by Azure. 
These models include Azure OpenAI models and newly introduced flagship model families like Azure DeepSeek within Foundry Mo,2026-02-27T23:08:00.000Z,concept-article,architecture-patterns,0.6,True,"Concept article on provisioned throughput; typically includes when to use provisioned vs standard, capacity sizing, and trade-offs; this is a product-specific deployment/architecture pattern.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/red-teaming,Red teaming large language models (LLMs),Planning red teaming for large language models (LLMs) and their applications - Microsoft Foundry,,Learn about how red teaming and adversarial testing are an essential practice in the responsible development of systems and features using large language models (LLMs),This guide offers some potential strategies for planning how to set up and manage red teaming for responsible AI (RAI) risks throughout the large language model (LLM) product life cycle.,2026-02-27T23:08:00.000Z,concept-article,,0.35,False,Red teaming planning guide is methodological and conceptual; it does not focus on product-specific configuration parameters or numeric thresholds.,unchanged +https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/retired-models,Retired Foundry Models,Retired Microsoft Foundry Models - Microsoft Foundry,Select alternatives for retired Foundry models,Retired Foundry Models: Find out which Microsoft Foundry Models are retired and explore alternatives.,Microsoft Foundry offers a variety of models for different use cases. The following models are retired and no longer available for use or for new deployments.,2026-04-24T17:22:00.000Z,concept-article,decision-making,0.65,True,"Lists specific models that are retired and their alternatives. 
This is expert, time-sensitive catalog information that guides decisions about which replacement model to choose when a model is no longer available.",new https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/safety-system-message-templates,Safety system message templates,Safety system message templates - Microsoft Foundry,Use safety system message templates for Azure OpenAI apps,Use these safety system message templates as a starting point to reduce harmful and ungrounded outputs in your Azure OpenAI apps.,"This article contains recommended safety system messages for your generative AI systems to help reduce the propensity of harm in various concern areas. Before you begin evaluating and integrating your safety system messages, visit theSafety system message conceptual guideto get started. Note Using a safety system message is one of many techniques you can use to mitigate risks in AI systems. It’s different from theAzure AI Content Safetyservice.",2026-02-27T23:08:00.000Z,how-to,security,0.7,True,Provides concrete safety system message templates for different harm areas; these are product-specific prompt templates not generally known.,unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/system-message,Safety system messages,Safety system messages - Microsoft Foundry,Author safety-focused system messages for Azure OpenAI,"Learn how safety system messages (system prompts) guide Azure OpenAI model behavior, improve quality, and reduce risks in Microsoft Foundry.","Safety system messages help you guide an Azure OpenAI model’s behavior, improve response quality, and reduce the likelihood of harmful outputs. They work best as one layer in a broader safety strategy. Note This article uses ""system message"" interchangeably with ""metaprompt"" and ""system prompt."" Here, we use ""system message"" to align with common terminology. 
This article also uses ""component"" to mean a distinct part of a system message, such as instructions, context, tone, safety guidelines, or ",2026-02-27T23:08:00.000Z,concept-article,security,0.6,True,Covers safety system messages to reduce harmful outputs; likely includes structured templates and components specific to Azure OpenAI safety guidance.,unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/video-generation,Video generation,Sora 2 video generation overview (preview) - Microsoft Foundry,,"Learn about Sora 2, an AI model for generating realistic and imaginative video scenes from text instructions, including safety, limitations, and supported features.","Sora 2 is an AI model from OpenAI that creates realistic and imaginative video scenes from text instructions and/or input images or video. The model can generate a wide range of video content, including realistic scenes, animations, and special effects. It supports several video resolutions and durations.",2026-03-18T08:00:00.000Z,concept-article,,0.2,False,"High-level overview of Sora 2 capabilities, safety, and limitations without specific quotas, configs, or decision matrices.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/batch,Batch processing,How to use global batch processing with Azure OpenAI in Microsoft Foundry Models - Microsoft Foundry,Use Azure OpenAI global batch processing efficiently,Learn how to use global batch with Azure OpenAI,"The Azure OpenAI Batch API is designed to handle large-scale and high-volume processing tasks efficiently. Process asynchronous groups of requests with separate quota, with 24-hour target turnaround, at 50% less cost than global standard. With batch processing, rather than send one request at a time you send a large number of requests in a single file. Global batch requests have a separate enqueued token quota avoiding any disruption of your online workloads. 
Key use cases include: Large-Scale Da",2026-02-27T23:08:00.000Z,how-to,limits-quotas,0.7,True,"Mentions separate quota, 24-hour target turnaround, and 50% cost reduction; the full article likely includes batch-specific quota units, file size limits, and processing constraints—numeric limits and time targets.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/chatgpt,Chat completions API,Work with chat completion models - Microsoft Foundry,Call chat completion models with Azure OpenAI in Foundry,Learn about the options for how to use models with the chat completions API,"Chat models are language models that are optimized for conversational interfaces. The models behave differently than the older completion API models. Previous models were text-in and text-out, which means they accepted a prompt string and returned a completion to append to the prompt. However, the latest models are conversation-in and message-out. The models expect input formatted in a specific chat-like transcript format. They return a completion that represents a model-written message in the c",2026-03-09T22:18:00.000Z,how-to,configuration,0.8,True,"How-to for chat completions; typically includes request/response schemas, parameter names (temperature, max_tokens, etc.), and usage patterns specific to this API.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/codex,Codex,Codex with Azure OpenAI in Microsoft Foundry Models - Microsoft Foundry,Integrate Codex CLI and VS Code with Azure OpenAI,Learn how to use the Codex CLI and the Codex extension for Visual Studio Code with Azure OpenAI in Microsoft Foundry Models.,"OpenAI’s Codex CLI is the same coding agent that powers ChatGPT’s Codex. You can run this coding agent entirely on Azure infrastructure, while keeping your data inside your compliance boundary with the added advantages of enterprise-grade security, private networking, role-based access control, and predictable cost management. 
Codex is more than a chat with your code agent – it's an asynchronous coding agent that can be triggered from your terminal, VS Code, or from a GitHub Actions runner. Codex ",2026-02-27T23:08:00.000Z,how-to,integrations,0.75,True,"Shows how to configure Codex CLI and VS Code extension to use Azure OpenAI; includes configuration fields, environment variables, and endpoint settings—integration-specific details.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/dall-e,Image generation,How to Use Image Generation Models from OpenAI - Microsoft Foundry,Generate and edit images with Azure OpenAI image models,Learn how to generate and edit images using Azure OpenAI image generation models. Discover configuration options and start creating images today.,"OpenAI's image generation models create images from user-provided text prompts and optional images. This article explains how to use these models, configure options, and benefit from advanced image generation capabilities in Azure. You can do image generation via image generation API or responses API. Or you can experiment with image generation in the Foundry portal. Use the tabs at the start of this page to select your preferred API approach and model.",2026-02-27T23:08:00.000Z,how-to,configuration,0.8,True,"Image generation how-to; includes model names, resolution parameters, quality settings, and API options—detailed configuration for image models.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/dall-e,Image generation,How to Use Image Generation Models from OpenAI - Microsoft Foundry,Use Azure OpenAI image generation models in Foundry,Learn how to generate and edit images using Azure OpenAI image generation models. Discover configuration options and start creating images today.,"Important The DALL-E image generation model dall-e-3 was retired on March 4, 2026, and is no longer available for new deployments. Existing deployments are non-functional. 
Use a gpt-image-series model for image generation instead. See the image generation how-to guide for updated instructions. OpenAI's image generation models create images from user-provided text prompts and optional images. This article explains how to use these models, configure options, and benefit from advanced image generation c",2026-04-17T08:00:00.000Z,how-to,integrations,0.65,True,"The article explains how to use OpenAI image generation models (DALL-E / gpt-image-series) with configuration options. Such a how-to typically includes request payload fields (size, quality, style, image count, etc.) and model-specific parameters for the Azure OpenAI/Foundry API. These concrete API options and parameter names are integration details, so it best fits the integrations category rather than a generic overview.",updated https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/deep-research,Deep research,Deep research with the Responses API - Microsoft Foundry,Run deep research with o3-deep-research via Responses API,Learn how to use Azure OpenAI deep research,"The o3-deep-research model is designed for advanced research tasks. It can browse, analyze, and synthesize information from hundreds of sources to produce a comprehensive, citation-rich report. This model uses multi-step reasoning, web search, and remote Model Context Protocol (MCP) servers to gather and process data. It can also run code for complex analysis. Use deep research when you need: To start, call the Responses API with the model set to your o3-deep-research deployment name. 
Include at l",2026-02-27T23:08:00.000Z,how-to,configuration,0.75,True,"Shows how to call o3-deep-research with Responses API, including required parameters, tool usage (web search, MCP, code execution) and configuration specifics.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/embeddings,Work with embeddings in Azure OpenAI,How to generate embeddings with Azure OpenAI in Microsoft Foundry Models - Microsoft Foundry,Generate and use embeddings with Azure OpenAI in Foundry,Learn how to generate embeddings with Azure OpenAI,"An embedding is a special format of data representation that can be easily utilized by machine learning models and algorithms. The embedding is an information dense representation of the semantic meaning of a piece of text. Each embedding is a vector of floating point numbers, such that the distance between two embeddings in the vector space is correlated with semantic similarity between two inputs in the original format. For example, if two texts are similar, then their vector representations s",2026-02-27T23:08:00.000Z,how-to,configuration,0.8,True,"Embeddings how-to; includes model names, vector dimensions, and API parameters—concrete configuration and numeric characteristics of embeddings.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/fine-tuning,Fine-tune models,Customize a model with fine-tuning - Microsoft Foundry,Fine-tune Foundry models via SDK and REST,"Learn how to fine-tune and customize Foundry models by using Python, REST APIs, or the Microsoft Foundry portal. Improve model performance with LoRA adaptation and custom datasets.","Learn how to fine-tune models in Microsoft Foundry for your datasets and use cases. Fine-tuning enables: In contrast to few-shot learning, fine-tuning improves the model by training on more examples than what fits in a prompt. Because weights adapt to your task, you include fewer examples or instructions. 
Including less reduces tokens per call and potentially lowers cost and latency. We use low-rank adaptation (LoRA) to fine-tune models in a way that reduces their complexity without significantl",2026-02-27T23:08:00.000Z,how-to,integrations,0.7,True,"How-to for fine-tuning will include REST paths, request schemas, job parameters (epochs, batch size, LoRA options), and portal configuration fields specific to Foundry fine-tuning APIs.",unchanged @@ -244,21 +255,21 @@ https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/fine-tuning-functi
Use the tools parameter instead.",2026-02-27T23:08:00.000Z,how-to,integrations,0.75,True,"Describes fine-tuning with tool-calling examples and notes deprecation of function_call/functions in favor of tools, implying specific request schema and parameter usage unique to Azure OpenAI.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/fine-tuning-safety-evaluation,Safety evaluation,Safety evaluation for fine-tuning (preview) - Microsoft Foundry,Apply safety evaluation to fine-tuned models,Learn how the safety evaluation works for Microsoft Foundry fine-tuning.,"The advanced capabilities of fine-tuned models come with increased responsible AI challenges related to harmful content, manipulation, human-like behavior, privacy issues, and more. Learn more about risks, capabilities, and limitations in the Overview of Responsible AI practices and Transparency Note. 
To help mitigate the risks associated with advanced fine-tuned models, we have implemented additional evaluation steps to help detect and prevent harmful content in the training and outputs of fine-tu",2026-02-27T23:08:00.000Z,how-to,security,0.7,True,"Safety evaluation article will detail evaluation stages, thresholds, and possibly configuration flags for safety checks on training data and outputs, which are product-specific safety controls.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/fine-tuning-vision,Vision fine-tuning,Vision fine-tuning - Microsoft Foundry,Fine-tune GPT-4 vision models with images,"Learn how to fine-tune Azure OpenAI GPT-4o and GPT-4.1 models with image inputs, including dataset requirements, image formats, and best practices.","Learn how to fine-tune Azure OpenAI models with image data to customize visual understanding for your use case. Vision fine-tuning lets you include image inputs in your training examples, following the same chat completions format used for text fine-tuning. Images can be provided either as publicly accessible URLs or data URIs containing base64 encoded images.",2026-02-27T23:08:00.000Z,how-to,best-practices,0.7,True,"Vision fine-tuning doc will specify supported image formats, size/ratio constraints, dataset structuring, and product-specific recommendations for image inputs, which are concrete best practices and constraints.",unchanged
When the model determines thatodel a function should be called, it responds with a JSON object including the arguments for the function. The models formulate API calls and structure data outputs, all based on the functions you specify. It's important to note that while the models can generate these calls, it's up to you to execute them, ensuring",2026-02-27T23:08:00.000Z,how-to,configuration,0.85,True,"Function calling requires defining functions schema, parameters, and handling tool calls; article will include JSON schema formats and API parameters unique to this feature.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/function-calling,Function calling,How to use function calling with Azure OpenAI in Microsoft Foundry Models - Microsoft Foundry,Implement function calling with Azure OpenAI in Foundry,Learn how to use function calling with OpenAI models.,"If one or more functions are included in your request, the model determines if any of the functions should be called based on the context of the prompt. When the model determines thatodel a function should be called, it responds with a JSON object including the arguments for the function. The models formulate API calls and structure data outputs, all based on the functions you specify. It's important to note that while the models can generate these calls, it's up to you to execute them, ensuring",2026-04-25T03:35:00.000Z,how-to,integrations,0.68,True,"A 'how to use function calling' article for Azure OpenAI in Foundry is likely to include product-specific request/response schemas, parameter names, and JSON structures for function definitions and tool calls that are unique to this service’s API surface. 
These are concrete integration details (API parameters, expected formats) rather than just conceptual explanation, fitting the integrations category.",updated https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/gpt-with-vision,Vision-enabled chats,How to use vision-enabled chat models - Microsoft Foundry,Call vision-enabled chat models with Azure OpenAI in Foundry,"Learn how to use vision-enabled chat models in Azure OpenAI, including how to call the Chat Completion API and process images.","Vision-enabled chat models are large multimodal models (LMM) developed by OpenAI that can analyze images and provide textual responses to questions about them. They incorporate both natural language processing and visual understanding. The current vision-enabled models are the o-series reasoning models, GPT-5 series, GPT-4.1 series, GPT-4.5, GPT-4o series. The vision-enabled models can answer general questions about what's present in the images you upload. Tip To use vision-enabled models, you ca",2026-02-27T23:08:00.000Z,how-to,configuration,0.8,True,"How-to for sending images with chat completions; includes content-type requirements, image encoding, model IDs, and parameter usage—detailed configuration for multimodal calls.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/json-mode,JSON mode,How to use JSON mode with Azure OpenAI in Microsoft Foundry Models - Microsoft Foundry,Enable and tune JSON mode for Azure OpenAI chat completions,Learn how to improve your chat completions with Azure OpenAI JSON mode,"JSON mode allows you to set the model's response format to return a valid JSON object as part of a chat completion. While generating valid JSON was possible previously, there could be issues with response consistency that would lead to invalid JSON objects being generated. JSON mode guarantees valid JSON output, but it doesn't guarantee the output matches a specific schema. If you need schema guarantees, use Structured Outputs. 
Note While JSON mode is still supported, when possible we recommend ",2026-03-09T22:18:00.000Z,how-to,configuration,0.8,True,Describes how to force JSON output; includes specific request parameters/flags and behavior differences—API configuration details.,unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/latency,Performance & latency,Azure OpenAI in Microsoft Foundry Models performance & latency - Microsoft Foundry,Optimize Azure OpenAI latency and throughput in Foundry,Learn about performance and latency with Azure OpenAI,This article provides you with background around how latency and throughput works with Azure OpenAI and how to optimize your environment to improve performance.,2026-02-27T23:08:00.000Z,how-to,best-practices,0.7,True,"Performance article likely includes concrete configuration recommendations (e.g., concurrency, token settings, batching) and product-specific tuning advice, which are actionable best practices.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/model-router,Get started with model router,How to use model router for Microsoft Foundry - Microsoft Foundry,Call Foundry model router via Chat Completions API,Learn how to use the model router in Azure OpenAI to select the best model for your task.,"Model router for Microsoft Foundry is a deployable AI chat model that selects the best large language model (LLM) to respond to a prompt in real time. It uses different preexisting models to deliver high performance and save on compute costs, all in one model deployment. To learn more about how model router works, its advantages, and limitations, see the Model router concepts guide. Use model router through the Chat Completions API like you'd use a single base model such as GPT-5. 
Follow the same",2026-03-31T22:17:00.000Z,how-to,integrations,0.65,True,"How-to for using model router through the Chat Completions API; likely includes request/response schemas, tool parameters, and product-specific API usage patterns beyond generic LLM knowledge.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/model-router,Get started with model router,How to use model router for Microsoft Foundry - Microsoft Foundry,,Learn how to use the model router in Azure OpenAI to select the best model for your task.,"Model router for Microsoft Foundry is a deployable AI chat model that selects the best large language model (LLM) to respond to a prompt in real time. It uses different preexisting models to deliver high performance and save on compute costs, all in one model deployment. To learn more about how model router works, its advantages, and limitations, see the Model router concepts guide. To understand the architecture and routing logic, see How model router works. Use model router through the Chat Comp",2026-04-24T06:04:00.000Z,how-to,,0.3,False,"A how-to for using model router via chat completions is likely a usage/tutorial page without detailed config tables, limits, or error mappings. The summary doesn’t show product-specific configuration matrices or quotas.",updated https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/predicted-outputs,Predicted outputs,How to use predicted outputs with Azure OpenAI in Microsoft Foundry Models - Microsoft Foundry,Use predicted outputs to reduce Azure OpenAI latency,Learn how to improve your model response latency with predicted outputs,"Predicted outputs can improve model response latency for chat completions calls where minimal changes are needed to a larger body of text. If you're asking the model to provide a response where a large portion of the expected response is already known, predicted outputs can significantly reduce the latency of this request. 
This capability is particularly well-suited for coding scenarios, including autocomplete, error detection, and real-time editing, where speed and responsiveness are critical f",2026-03-13T22:11:00.000Z,how-to,configuration,0.7,True,Feature-specific how-to; likely includes parameters to enable predicted outputs and patterns for constructing requests—product-specific configuration and usage pattern.,unchanged
-https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/prompt-caching,Prompt caching,Prompt caching with Azure OpenAI in Microsoft Foundry Models - Microsoft Foundry,Configure prompt caching for Azure OpenAI in Foundry,Learn how to use prompt caching with Azure OpenAI,"Prompt caching allows you to reduce overall request latency and cost for longer prompts that have identical content at the beginning of the prompt. ""Prompt"" in this context is referring to the input you send to the model as part of your chat completions request. Rather than reprocess the same input tokens over and over again, the service is able to retain a temporary cache of processed input token computations to improve overall performance. Prompt caching has no impact on the output content retur",2026-04-14T22:13:00.000Z,how-to,configuration,0.7,True,"Prompt caching behavior (cache keys, duration, size constraints, model support, and request parameters) is product-specific and typically documented via concrete parameters and constraints. 
This constitutes expert configuration knowledge beyond generic LLM understanding.",updated
+https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/prompt-caching,Prompt caching,Prompt caching with Azure OpenAI in Microsoft Foundry Models - Microsoft Foundry,Configure prompt caching for Azure OpenAI in Foundry,Learn how to use prompt caching with Azure OpenAI,"Prompt caching allows you to reduce overall request latency and cost for longer prompts that have identical content at the beginning of the prompt. ""Prompt"" in this context is referring to the input you send to the model as part of your chat completions request. Rather than reprocess the same input tokens over and over again, the service is able to retain a temporary cache of processed input token computations to improve overall performance. Prompt caching has no impact on the output content retur",2026-04-14T22:13:00.000Z,how-to,configuration,0.7,True,"Prompt caching behavior (cache keys, duration, size constraints, model support, and request parameters) is product-specific and typically documented via concrete parameters and constraints. This constitutes expert configuration knowledge beyond generic LLM understanding.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/provisioned-get-started,Get started with Provisioned Deployments,Get started with provisioned deployments in Microsoft Foundry - Microsoft Foundry,Create and tune provisioned deployments in Foundry,"Learn how to create and configure provisioned throughput deployments, verify quota, handle high utilization, and run benchmarks in Microsoft Foundry.","The following guide walks you through key steps in creating a provisioned deployment with your Microsoft Foundry resource. 
For more details on the concepts discussed here, see:",2026-02-27T23:08:00.000Z,how-to,configuration,0.7,True,"Get-started guide for provisioned deployments; likely includes specific deployment parameters, quota checks, utilization thresholds, and benchmark configuration—concrete configuration patterns.",unchanged
-https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/provisioned-throughput-onboarding,Understanding and calculating PTU costs,Provisioned throughput unit (PTU) costs and billing - Microsoft Foundry,Plan and estimate PTU costs in Microsoft Foundry,"Learn about provisioned throughput unit (PTU) costs, hourly billing, Azure reservations, and capacity planning in Microsoft Foundry.","Use this article to learn about costs associated with provisioned throughput units (PTUs). For an overview of the provisioned throughput offering, see What is provisioned throughput?. When you're ready to sign up for the provisioned throughput offering, see the getting started guide. Note In function calling and agent use cases, token usage can be variable. You should understand your expected Tokens Per Minute (TPM) usage in detail before migrating workloads to PTU.",2026-04-13T17:17:00.000Z,concept-article,decision-making,0.7,True,"Capacity planning and billing for provisioned throughput units typically include concrete cost figures, billing granularity, and usage thresholds for choosing PTU sizes and reservations. This is specialized decision guidance (cost vs. 
capacity, migration considerations) that goes beyond generic knowledge.",updated
+https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/provisioned-throughput-onboarding,Understanding and calculating PTU costs,Provisioned throughput unit (PTU) costs and billing - Microsoft Foundry,Plan and estimate PTU costs in Microsoft Foundry,"Learn about provisioned throughput unit (PTU) costs, hourly billing, Azure reservations, and capacity planning in Microsoft Foundry.","Use this article to learn about costs associated with provisioned throughput units (PTUs). For an overview of the provisioned throughput offering, see What is provisioned throughput?. When you're ready to sign up for the provisioned throughput offering, see the getting started guide. Note In function calling and agent use cases, token usage can be variable. You should understand your expected Tokens Per Minute (TPM) usage in detail before migrating workloads to PTU.",2026-04-13T17:17:00.000Z,concept-article,decision-making,0.7,True,"Capacity planning and billing for provisioned throughput units typically include concrete cost figures, billing granularity, and usage thresholds for choosing PTU sizes and reservations. This is specialized decision guidance (cost vs. capacity, migration considerations) that goes beyond generic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/realtime-audio,Realtime API for speech and audio,Use the GPT Realtime API for speech and audio with Azure OpenAI - Microsoft Foundry,Implement GPT Realtime audio with Azure OpenAI,Learn how to use the GPT Realtime API for speech and audio with Azure OpenAI.,"Azure OpenAI GPT Realtime API for speech and audio is part of the GPT-4o model family that supports low-latency, ""speech in, speech out"" conversational interactions. The GPT Realtime API is designed to handle real-time, low-latency conversational interactions. 
It's a great fit for use cases involving live interactions between a user and a model, such as customer support agents, voice assistants, and real-time translators. Most users of the Realtime API, including applications that use WebRTC or ",2026-03-21T06:03:00.000Z,how-to,integrations,0.7,True,"How-to for GPT Realtime API with speech/audio; likely includes concrete WebSocket/WebRTC/SIP parameters, event names, and model-specific configuration patterns that qualify as product-specific integration details.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/realtime-audio-preview-api-migration-guide,Realtime API migration from Preview to GA,Migration from Preview to GA version of Realtime API - Microsoft Foundry,Migrate from preview to GA Realtime API,Step-by-step guide for migrating from Preview (Beta) to Generally Available version of OpenAI GPT Realtime API protocol.,"The Azure OpenAI GPT Realtime API is now Generally Available (GA). This migration guide helps you update existing applications to use the GA protocol. The GA version introduces changes to SDK usage, endpoint URLs, event names, and session configuration. What's changing: SDK packages, endpoint URL format, event names, and session configuration structure. What's not changing: Core functionality, audio format support, and model capabilities. Time to migrate: Most migrations take 30-60 minutes. 
Impo",2026-02-27T23:08:00.000Z,how-to,decision-making,0.8,True,"Migration guide explicitly compares preview vs GA: changed endpoint URL formats, event names, SDK packages, and session configuration structures, providing concrete before/after mappings and upgrade decisions.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/realtime-audio-sip,Realtime API via SIP,Use the GPT Realtime API via SIP - Microsoft Foundry,Use GPT Realtime API over SIP,Learn how to use the GPT Realtime API for speech and audio via SIP.,"Azure OpenAI GPT Realtime API for speech and audio is part of the GPT-4o model family that supports low-latency, ""speech in, speech out"" conversational interactions. You can use the Realtime API via WebRTC, SIP, or WebSocket to send audio input to the model and receive audio responses in real time. Follow the instructions in this article to get started with the Realtime API via SIP. Session Initiation Protocol (SIP) is a signaling protocol used to establish, modify, and terminate real‑time commu",2026-02-27T23:08:00.000Z,how-to,integrations,0.85,True,"SIP integration requires specific SIP headers, URIs, and mapping between SIP sessions and Realtime API sessions; these are concrete configuration and protocol details unique to this product.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/realtime-audio-webrtc,Realtime API via WebRTC,Use the GPT Realtime API via WebRTC - Microsoft Foundry,Use GPT Realtime API over WebRTC,Learn how to use the GPT Realtime API for speech and audio via WebRTC.,"Azure OpenAI GPT Realtime API for speech and audio is part of the GPT-4o model family that supports low-latency, ""speech in, speech out"" conversational interactions. You can use the Realtime API via WebRTC, SIP, or WebSocket to send audio input to the model and receive audio responses in real time. Follow the instructions in this article to get started with the Realtime API via WebRTC. 
In most cases, use the WebRTC API for real-time audio streaming. The WebRTC API is a web standard that enables ",2026-03-21T06:03:00.000Z,how-to,integrations,0.75,True,"WebRTC-specific usage of GPT Realtime API; likely documents signaling parameters, SDP/media settings, and API fields unique to this integration, matching integration & coding patterns criteria.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/realtime-audio-websockets,Realtime API via WebSockets,Use the GPT Realtime API via WebSockets - Microsoft Foundry,Use GPT Realtime API over WebSockets,Learn how to use the GPT Realtime API for speech and audio via WebSockets.,"Azure OpenAI GPT Realtime API for speech and audio is part of the GPT-4o model family that supports low-latency, ""speech in, speech out"" conversational interactions. You can use the Realtime API via WebRTC, SIP, or WebSocket to send audio input to the model and receive audio responses in real time. Follow the instructions in this article to get started with the Realtime API via WebSockets. Use the Realtime API via WebSockets in server-to-server scenarios where low latency isn't a requirement. Ti",2026-04-02T22:14:00.000Z,how-to,integrations,0.75,True,"WebSocket-focused guide for GPT Realtime API; expected to include connection URLs, message schemas, event types, and configuration parameters specific to this product integration.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/reasoning,Work with Azure OpenAI reasoning models,"Azure OpenAI reasoning models - GPT-5 series, o3-mini, o1, o1-mini - Microsoft Foundry",Call Azure OpenAI reasoning models via Foundry,"Learn how to use Azure OpenAI's advanced GPT-5 series, o3-mini, o1, & o1-mini reasoning models","Azure OpenAI reasoning models are designed to tackle reasoning and problem-solving tasks with increased focus and capability. 
These models spend more time processing and understanding the user's request, making them exceptionally strong in areas like science, coding, and math compared to previous iterations. Key capabilities of reasoning models:",2026-03-20T17:16:00.000Z,how-to,integrations,0.6,True,"How-to for using GPT-5 series and o1/o3 reasoning models; likely includes model IDs, API usage patterns, and configuration options specific to these reasoning models in Foundry.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/reasoning,Work with Azure OpenAI reasoning models,"Azure OpenAI reasoning models - GPT-5 series, o3-mini, o1, o1-mini - Microsoft Foundry",,"Learn how to use Azure OpenAI's advanced GPT-5 series, o3-mini, o1, & o1-mini reasoning models","Azure OpenAI reasoning models are designed to tackle reasoning and problem-solving tasks with increased focus and capability. These models spend more time processing and understanding the user's request, making them exceptionally strong in areas like science, coding, and math compared to previous iterations. Key capabilities of reasoning models:",2026-04-24T08:00:00.000Z,how-to,,0.2,False,"Described as an overview of reasoning models and their capabilities (GPT-5 series, o3-mini, o1, o1-mini) without evidence of configuration tables, limits, or decision matrices. This is primarily conceptual capability description, not detailed configuration, limits, or troubleshooting content.",updated https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/reinforcement-fine-tuning,Reinforcement fine-tuning,Reinforcement fine-tuning - Microsoft Foundry,Use reinforcement fine-tuning with cost limits,Learn how to use reinforcement fine-tuning with reasoning models,"Reinforcement fine-tuning (RFT) is a technique for improving reasoning models by training them through a reward-based process, rather than relying only on labeled data. 
RFT helps models develop better reasoning and problem-solving skills, especially in cases where labeled examples are limited or complex behaviors are desired. Note The fine-tuning service automatically pauses RFT jobs once they hit $5,000 in total training costs (training + grading). You can deploy the most recent checkpoint or r",2026-02-27T23:08:00.000Z,how-to,limits-quotas,0.7,True,"Explicitly states RFT jobs auto-pause at $5,000 in total training costs; this is a concrete numeric limit/guardrail unique to the service, fitting limits-quotas.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/responses,Use the Azure OpenAI Responses API,Use the Azure OpenAI Responses API - Microsoft Foundry,Use Azure OpenAI Responses API with tools and streaming,"Learn how to use the Azure OpenAI Responses API to create, retrieve, and delete stateful responses with Python or REST, including streaming and tools.","Use the Azure OpenAI Responses API to generate stateful, multi-turn responses. It brings together capabilities from chat completions and the Assistants API in one unified experience. The Responses API also supports the computer-use-preview model that powers Computer use.",2026-03-17T17:21:00.000Z,how-to,configuration,0.85,True,"Responses API is a specific interface; article will contain endpoint paths, parameters, tool configuration, and stateful response handling—detailed API configuration not generally known.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/spillover-traffic-management,Provisioned spillover,Manage traffic with spillover for provisioned deployments - Microsoft Foundry,Design spillover traffic management for provisioned deployments,Learn how to manage traffic bursts in Azure OpenAI provisioned deployments using the spillover feature. Optimize performance and reduce disruptions effectively.,"This article describes how to manage traffic with spillover for provisioned deployments in Azure OpenAI. 
Spillover manages traffic fluctuations by routing overage traffic to a corresponding standard deployment. This optional capability can be set for all requests on a deployment or managed on a per-request basis, helping you reduce disruptions during traffic bursts.",2026-03-04T08:00:00.000Z,how-to,architecture-patterns,0.7,True,Describes spillover routing from provisioned to standard deployments; likely includes rules/thresholds for when spillover occurs and configuration options—this is a product-specific traffic management pattern.,unchanged @@ -278,6 +289,7 @@ https://learn.microsoft.com/en-us/azure/foundry/openai/latest,Vector stores,Azur
https://learn.microsoft.com/en-us/azure/foundry/openai/latest,v1 API,Azure OpenAI in Microsoft Foundry Models v1 REST API reference - Microsoft Foundry,Integrate with Azure OpenAI v1 REST API in Foundry,Learn how to use Azure OpenAI's v1 REST API.,API Version: v1 Server Variables:,2026-03-09T22:18:00.000Z,how-to,integrations,0.85,True,"REST API reference with version, server variables, and endpoints; core integration details like parameters and schemas.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/monitor-openai-reference,Azure OpenAI monitoring data reference,Monitoring data reference for Azure OpenAI - Microsoft Foundry,Monitor Foundry OpenAI with Azure Monitor data,This article contains important reference material you need when you monitor Azure OpenAI in Microsoft Foundry Models by using Azure Monitor.,This article contains all the monitoring reference information for this service. See Monitor Azure OpenAI for details on the data you can collect for Azure OpenAI in Microsoft Foundry Models and how to use it.,2026-02-27T23:08:00.000Z,reference,configuration,0.7,True,"Monitoring reference typically lists metric names, dimensions, log categories, and schema fields. 
These are configuration/observability parameters unique to this service and not generic knowledge.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/quotas-limits,Azure OpenAI quotas and limits,Azure OpenAI in Microsoft Foundry Models Quotas and Limits - Microsoft Foundry,Reference Azure OpenAI quotas and limits in Foundry,This article features detailed descriptions and best practices on the quotas and limits for Azure OpenAI.,This article contains a quick reference and a detailed description of the quotas and limits for Azure OpenAI.,2026-04-08T08:00:00.000Z,limits-and-quotas,limits-quotas,0.95,True,"The page is explicitly a quotas and limits reference for Azure OpenAI in Microsoft Foundry and is described as a detailed description and quick reference. Such pages typically list exact numeric limits (TPM, RPM, request sizes, rate limits) and often include tables by model or tier, which are product- and time-specific values that an LLM won't reliably know from training.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/openai/quotas-limits-gov,Azure OpenAI quotas and limits in Azure Government,Azure OpenAI in Microsoft Foundry Models Quotas and Limits in Azure Government - Microsoft Foundry,Reference quotas and limits for Azure OpenAI in Azure Government,This article features detailed descriptions and best practices on the quotas and limits for Azure OpenAI in Azure Government.,This article contains a quick reference and a detailed description of the quotas and limits for Azure OpenAI in Azure Government.,2026-04-20T22:11:00.000Z,limits-and-quotas,limits-quotas,0.9,True,"Explicitly described as a quick reference of quotas and limits for Azure OpenAI in Azure Government. 
This will contain numeric per-SKU limits and constraints that are expert, product-specific, and change over time.",new https://learn.microsoft.com/en-us/azure/foundry/openai/realtime-audio-reference,Audio events reference,Audio events reference - Microsoft Foundry,Implement realtime audio events for Foundry OpenAI,Learn how to use events with the Realtime API and Voice Live API.,"Realtime events are used to communicate between the client and server in real-time audio applications. The events are sent as JSON objects over various endpoints, such as WebSockets or WebRTC. The events are used to manage the conversation, audio buffers, and responses in real-time. You can use audio client and server events with these APIs: Unless otherwise specified, the events described in this document are applicable to both APIs.",2026-02-27T23:08:00.000Z,reference,integrations,0.76,True,"Audio events reference defines specific event types, JSON payload schemas, and fields for WebSocket/WebRTC communication. These are concrete API integration details.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/reference,2024-10-21 API reference,Azure OpenAI in Microsoft Foundry Models REST API reference - Microsoft Foundry,Call Azure OpenAI inference REST APIs in Foundry,"Learn how to use Azure OpenAI's REST API. 
In this article, you learn about authorization options, how to structure a request and receive a response.,This article provides details on the inference REST API endpoints for Azure OpenAI.,2026-02-27T23:08:00.000Z,reference,integrations,0.8,True,"Inference REST API reference; provides authorization options, request/response structure, and endpoint details for integrations.",unchanged https://learn.microsoft.com/en-us/azure/foundry/openai/reference-preview,2025-04-01-preview - Inference,Azure OpenAI in Microsoft Foundry Models REST API preview reference - Microsoft Foundry,Use Foundry OpenAI preview inference REST API,"Learn how to use Azure OpenAI's latest preview REST API. In this article, you learn about authorization options, how to structure a request and receive a response.",This article provides details on the inference REST API endpoints for Azure OpenAI.,2026-02-27T23:08:00.000Z,reference,integrations,0.78,True,"Preview REST API reference with detailed endpoint definitions, request bodies, and response formats. These concrete API contracts are product-specific integration details.",unchanged @@ -287,10 +299,10 @@ https://learn.microsoft.com/en-us/azure/foundry/openai/tutorials/embeddings,Embe
https://learn.microsoft.com/en-us/azure/foundry/openai/whisper-quickstart,Speech to text with Whisper,Speech to text with Whisper - Microsoft Foundry,Use Azure OpenAI Whisper for speech to text,Learn how to use the Azure OpenAI Whisper model for speech to text conversion.,"In this quickstart, you transcribe speech to text using the Azure OpenAI Whisper model. The Whisper model can transcribe human speech in numerous languages and translate other languages into English. 
Tip This quickstart takes approximately 10-15 minutes to complete.",2026-02-27T23:08:00.000Z,quickstart,integrations,0.7,True,"Speech-to-text quickstart necessarily documents endpoint paths, request bodies, audio format parameters, and model identifiers for Whisper in Azure OpenAI, which are concrete integration details not derivable from generic training.",unchanged https://learn.microsoft.com/en-us/azure/foundry/quickstarts/get-started-code,Quickstart: Chat with an agent,Microsoft Foundry Quickstart - Microsoft Foundry,,Get started with Microsoft Foundry SDK building AI applications.,In this quickstart you'll get started using models and agents in Foundry. You will:,2026-03-24T06:06:00.000Z,quickstart,,0.3,False,"SDK getting-started quickstart; focuses on basic usage of models and agents rather than detailed configuration tables, limits, or error mappings.",unchanged https://learn.microsoft.com/en-us/azure/foundry/reference/foundry-known-issues,Known issues,Known issues - Microsoft Foundry,Troubleshoot Microsoft Foundry known issues and workarounds,"Find known issues, workarounds, and solutions for Microsoft Foundry, including Speech, Translator, and portal services. Review before submitting a support request.",This article lists known issues and workarounds for Foundry. Review these issues before you submit a support request.,2026-03-30T22:09:00.000Z,troubleshooting-known-issue,troubleshooting,0.8,True,"A 'Known issues' page for Foundry, Speech, Translator, and portal services will list specific symptoms and issues with corresponding workarounds or solutions. 
This is classic troubleshooting content (symptom → cause → workaround/solution) that is highly product-specific and time-sensitive, and not reliably captured in model pretraining.",unchanged
-https://learn.microsoft.com/en-us/azure/foundry/reference/region-support,Feature availability by region,Feature availability across cloud regions - Microsoft Foundry,,This article lists Microsoft Foundry feature availability across cloud regions.,"Microsoft Foundry brings together various Azure AI capabilities that were previously only available as standalone Azure services. While Microsoft strives to make all features available in all regions where Microsoft Foundry is supported at the same time, feature availability might vary by region. In this article, you learn what Foundry features are available across cloud regions. This article gives you: This article doesn't include a single real-time matrix for every model and feature combination",2026-04-16T22:08:00.000Z,concept-article,,0.2,False,"Region availability matrices are likely present but are high-level feature/region support lists without configuration parameters, limits, or decision matrices; primarily navigation/reference rather than expert configuration, limits, or troubleshooting guidance.",updated
+https://learn.microsoft.com/en-us/azure/foundry/reference/region-support,Feature availability by region,Feature availability across cloud regions - Microsoft Foundry,,This article lists Microsoft Foundry feature availability across cloud regions.,"Microsoft Foundry brings together various Azure AI capabilities that were previously only available as standalone Azure services. While Microsoft strives to make all features available in all regions where Microsoft Foundry is supported at the same time, feature availability might vary by region. In this article, you learn what Foundry features are available across cloud regions. 
This article gives you: This article doesn't include a single real-time matrix for every model and feature combination",2026-04-16T22:08:00.000Z,concept-article,,0.2,False,"Region availability matrices are likely present but are high-level feature/region support lists without configuration parameters, limits, or decision matrices; primarily navigation/reference rather than expert configuration, limits, or troubleshooting guidance.",unchanged https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/agents/data-privacy-security,"Data, privacy, and security for agents","Data, privacy, and security for Azure AI Agent Service - Microsoft Foundry",Data privacy and security for Foundry Agent Service,"This document details issues for data, privacy, and security for Azure AI Agent Service","Important Non-English translations are provided for convenience only. Please consult the EN-US version of this document for the definitive version. Azure AI Agent Service is a fully managed service designed to empower developers to securely build, deploy, and scale high-quality, and extensible AI agents without needing to manage the underlying compute and storage resources. Azure AI Agent Service integrates models, tools and technology and enables you to extend agents with knowledge from connected so",2026-02-27T23:08:00.000Z,concept-article,security,0.66,True,"Details data handling, privacy, and security specifics for Agent Service, likely including storage locations, encryption, and access patterns that are product-specific security knowledge.",unchanged https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/agents/transparency-note,Transparency note,Transparency Note for Azure Agent Service - Microsoft Foundry,,Transparency Note for Azure Agents Service,Important Non-English translations are provided for convenience only. 
Please consult the EN-US version of this document for the definitive version.,2026-02-27T23:08:00.000Z,concept-article,,0.2,False,"Transparency note for Agent Service is descriptive of system behavior and data use, not a technical configuration or troubleshooting guide.",unchanged
-https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/claude-models/data-privacy,"Data, privacy, and security for Claude models in Microsoft Foundry","Data, privacy, and security for use of Anthropic Claude models in Microsoft Foundry (preview) - Microsoft Foundry",Understand data privacy and security for Anthropic Claude in Foundry,"This document details issues for data, privacy, and security for Anthropic Claude models in Microsoft Foundry in Public Preview.","This article describes how the data that you provide is processed, used, and stored when you use Anthropic Claude models from the Microsoft Marketplace in Microsoft Foundry. Claude in Foundry processes data differently from models sold directly by Azure. When you transact for Claude in Foundry, you will agree to Anthropic's terms of use and Anthropic (not Microsoft) is the processor of the data. As detailed in the following sections, to the extent Microsoft plays a role in processing data in con",2026-02-27T23:08:00.000Z,how-to,security,0.72,True,"Explains how data is processed, used, and stored when using Claude models via Marketplace, including roles of Anthropic vs Microsoft. 
This is product- and vendor-specific data handling and compliance behavior that an LLM would not infer generically.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/claude-models/data-privacy,"Data, privacy, and security for Claude models in Microsoft Foundry","Data, privacy, and security for use of Anthropic Claude models in Microsoft Foundry (preview) - Microsoft Foundry",Understand data handling and privacy for Claude in Foundry,"This document details issues for data, privacy, and security for Anthropic Claude models in Microsoft Foundry in Public Preview.","This article describes how the data that you provide is processed, used, and stored when you use Anthropic Claude models from the Microsoft Marketplace in Microsoft Foundry. Claude in Foundry processes data differently from models sold directly by Azure. When you transact for Claude in Foundry, you will agree to Anthropic's terms of use and Anthropic (not Microsoft) is the processor of the data. As detailed in the following sections, to the extent Microsoft plays a role in processing data in con",2026-04-20T17:16:00.000Z,how-to,security,0.7,True,"The page provides product-specific details on how Anthropic Claude models in Microsoft Foundry handle, process, and store customer data, including which party is the data processor and applicable terms. These are concrete, service-specific security and privacy behaviors that an LLM is unlikely to know from training. While it’s more policy- and responsibility-focused than RBAC/config parameters, it squarely concerns security/data protection specifics for this integration.",updated https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/openai/customer-copyright-commitment,Customer Copyright Commitment,Customer Copyright Commitment Required Mitigations - Microsoft Foundry,,Customer Copyright Commitment Required Mitigations for Azure OpenAI in Foundry Models,"Important Non-English translations are provided for convenience only. 
Please consult the EN-US version of this document for the definitive version. Note The requirements described below apply only to customers using Azure OpenAI in Microsoft Foundry Models (""Azure OpenAI"") and other Covered Products with configurable Metaprompts or other safety systems (""Configurable GAI Services""). They do not apply to customers using other Covered Products including Copilots with safety systems that are fixed. Th",2026-03-20T17:16:00.000Z,concept-article,,0.3,False,"Customer Copyright Commitment mitigations page is primarily policy/requirements text; it does not expose technical configuration values, limits, or error-code mappings that would qualify as expert product knowledge under the defined categories.",unchanged https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/openai/data-privacy,"Data, privacy, and security for model catalog","Data, privacy, and security for Azure Direct Models in Microsoft Foundry - Microsoft Foundry","Understand data, privacy, and security for Azure Direct Models","This document details issues for data, privacy, and security for Azure Direct Models","Important Non-English translations are provided for convenience only. Please consult the EN-US version of this document for the definitive version. This article provides details regarding how data provided by you to Azure Direct Models in Microsoft Foundry are processed, used, and stored. Azure Direct Model means an AI model designated and deployed as an “Azure Direct Model” in Foundry, and includes Azure OpenAI models. 
Azure Direct Models store and process data to provide the service and to monito",2026-02-27T23:08:00.000Z,concept-article,security,0.85,True,"Details how Azure Direct Models process, use, and store data; includes product-specific security and privacy behavior and compliance implications.",unchanged https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/openai/limited-access,Limited access,Limited access to Azure OpenAI in Microsoft Foundry Models - Microsoft Foundry,Understand limited access policy for Azure OpenAI,This document details the limited access policy for Azure OpenAI,"Important Non-English translations are provided for convenience only. Please consult the EN-US version of this document for the definitive version. As part of Microsoft's commitment to responsible AI, we have designed and operate Azure Direct Models (as defined in the Product Terms) with the intention of respecting the rights of individuals and society and fostering transparent human-computer interaction. For this reason, certain Azure Direct Models (or versions of them) are designated as Limited Ac",2026-02-27T23:08:00.000Z,concept-article,limits-quotas,0.7,True,"Limited access policy typically defines concrete eligibility criteria, usage restrictions, and possibly numeric thresholds or conditions for certain models, which are expert limit/eligibility details.",unchanged @@ -298,7 +310,7 @@ https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/openai/overview,O https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/openai/transparency-note,Transparency note,Transparency Note for Azure OpenAI in Microsoft Foundry Models - Microsoft Foundry,,Transparency Note for Azure OpenAI,Important Non-English translations are provided for convenience only. 
Please consult the EN-US version of this document for the definitive version.,2026-02-27T23:08:00.000Z,concept-article,,0.2,False,"Transparency note is largely policy and behavior description, not a technical configuration or troubleshooting reference.",unchanged https://learn.microsoft.com/en-us/azure/foundry/responsible-use-of-ai-overview,Responsible AI overview,Responsible AI for Microsoft Foundry - Microsoft Foundry,,Learn how to use Foundry Tools and features responsibly with Microsoft Foundry.,"This article provides an overview of the resources for building and deploying trustworthy AI agents. This includes end-to-end security, observability, and governance with controls and checkpoints at all stages of the agent lifecycle. Our recommended essential development steps are grounded in the Microsoft Responsible AI Standard, which sets policy requirements that our own engineering teams follow. Much of the content of the Standard follows a pattern, asking teams to Discover, Protect, and Gove",2026-02-27T23:08:00.000Z,overview,,0.25,False,"Responsible AI overview is broad guidance and policy framing, not product-specific configuration, limits, or API details.",unchanged https://learn.microsoft.com/en-us/azure/foundry/tutorials/developer-journey-idea-to-prototype,Tutorial: Idea to prototype,Tutorial: Idea to prototype - Build and evaluate an enterprise agent - Microsoft Foundry,,"Prototype an enterprise agent: build a single agent with SharePoint grounding and Model Context Protocol (MCP) tools, run batch evaluation, extend to multi-agent, and deploy to Microsoft Foundry.","This tutorial covers the first stage of the Microsoft Foundry developer journey: from an initial idea to a working prototype. You build a modern workplace assistant that combines internal company knowledge with external technical guidance by using the Microsoft Foundry SDK. 
Business scenario: Create an AI assistant that helps employees by combining: Tutorial outcome: By the end you have a running Modern Workplace Assistant that can answer policy, technical, and combined implementation questions; a",2026-03-31T08:00:00.000Z,tutorial,,0.3,False,"End-to-end tutorial (idea to prototype) describing a scenario and workflow; no indication of numeric limits, decision matrices, or deep configuration references.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/tutorials/quickstart-create-foundry-resources,Quickstart: Create resources,Quickstart: Set up Microsoft Foundry resources - Microsoft Foundry,,"Learn how to create a Microsoft Foundry project, deploy a model, and grant access to team members so they can build AI applications.","In this quickstart, you create a Microsoft Foundry project and deploy a model. If you're managing a team, you also grant access to team members. After you complete these steps, you or your team can start building AI applications using the deployed model. Tip This quickstart shows you how to create resources to build an agent with a basic setup. For more advanced scenarios that use your own resources, see Set up your environment for agent development.",2026-04-13T08:00:00.000Z,quickstart,,0.2,False,"Quickstart/tutorial for creating Foundry resources; primarily step-by-step setup without detailed configuration tables, limits, or troubleshooting mappings.",updated +https://learn.microsoft.com/en-us/azure/foundry/tutorials/quickstart-create-foundry-resources,Quickstart: Create resources,Quickstart: Set up Microsoft Foundry resources - Microsoft Foundry,,"Learn how to create a Microsoft Foundry project, deploy a model, and grant access to team members so they can build AI applications.","In this quickstart, you create a Microsoft Foundry project and deploy a model. If you're managing a team, you also grant access to team members. 
After you complete these steps, you or your team can start building AI applications using the deployed model. Tip This quickstart shows you how to create resources to build an agent with a basic setup. For more advanced scenarios that use your own resources, see Set up your environment for agent development.",2026-04-13T08:00:00.000Z,quickstart,,0.2,False,"Quickstart/tutorial for creating Foundry resources; primarily step-by-step setup without detailed configuration tables, limits, or troubleshooting mappings.",unchanged https://learn.microsoft.com/en-us/azure/foundry/tutorials/screen-reader,Use Microsoft Foundry with a screen reader,Use a screen reader with Microsoft Foundry - Microsoft Foundry,,This quickstart guides you in how to get oriented and navigate Microsoft Foundry with a screen reader.,"This article is for people who use screen readers such as Microsoft's Narrator, JAWS, NVDA, or Apple's VoiceOver. In this article, you learn the basic structure of Microsoft Foundry and how to navigate efficiently.",2026-03-24T22:15:00.000Z,how-to,,0.1,False,"Accessibility/navigation guidance for using a screen reader with the Foundry portal. This is UX and orientation content, not product configuration, limits, troubleshooting, or other expert operational details.",unchanged -https://learn.microsoft.com/en-us/azure/foundry/what-is-foundry,What is Microsoft Foundry?,What is Microsoft Foundry? - Microsoft Foundry,,"Microsoft Foundry is a trusted platform that empowers developers to drive innovation and shape the future with AI in a safe, secure, and responsible way.","Microsoft Foundry is a unified Azure platform-as-a-service offering for enterprise AI operations, model builders, and application development. This foundation combines production-grade infrastructure with friendly interfaces, enabling developers to focus on building applications rather than managing infrastructure. 
Microsoft Foundry unifies agents, models, and tools under a single management grouping with built-in enterprise-readiness capabilities including tracing, monitoring, evaluations, and c",2026-04-13T08:00:00.000Z,overview,,0.1,False,"High-level product overview of Microsoft Foundry; marketing and conceptual description without concrete limits, configuration parameters, error codes, or decision matrices.",updated -https://learn.microsoft.com/en-us/azure/foundry/whats-new-foundry,What's new in Microsoft Foundry,Microsoft Foundry docs: What's new for April 2026 - Microsoft Foundry,,Discover key changes and updates in Microsoft Foundry documentation for April 2026.,Welcome! This article highlights key changes and updates in Microsoft Foundry documentation for April 2026.,2026-04-17T22:08:00.000Z,whats-new,,0.0,False,"What's new documentation index for April 2026; primarily a changelog/overview without technical limits, configuration tables, or troubleshooting details.",updated +https://learn.microsoft.com/en-us/azure/foundry/what-is-foundry,What is Microsoft Foundry?,What is Microsoft Foundry? - Microsoft Foundry,,"Microsoft Foundry is a trusted platform that empowers developers to drive innovation and shape the future with AI in a safe, secure, and responsible way.","Microsoft Foundry is a unified Azure platform-as-a-service offering for enterprise AI operations, model builders, and application development. This foundation combines production-grade infrastructure with friendly interfaces, enabling developers to focus on building applications rather than managing infrastructure. 
Microsoft Foundry unifies agents, models, and tools under a single management grouping with built-in enterprise-readiness capabilities including tracing, monitoring, evaluations, and c",2026-04-13T08:00:00.000Z,overview,,0.1,False,"High-level product overview of Microsoft Foundry; marketing and conceptual description without concrete limits, configuration parameters, error codes, or decision matrices.",unchanged +https://learn.microsoft.com/en-us/azure/foundry/whats-new-foundry,What's new in Microsoft Foundry,Microsoft Foundry docs: What's new for April 2026 - Microsoft Foundry,,Discover key changes and updates in Microsoft Foundry documentation for April 2026.,Welcome! This article highlights key changes and updates in Microsoft Foundry documentation for April 2026.,2026-04-17T22:08:00.000Z,whats-new,,0.0,False,"What's new documentation index for April 2026; primarily a changelog/overview without technical limits, configuration tables, or troubleshooting details.",unchanged diff --git a/products/microsoft-foundry/report.md b/products/microsoft-foundry/report.md index dd91f7c0..735a4fcf 100644 --- a/products/microsoft-foundry/report.md +++ b/products/microsoft-foundry/report.md @@ -1,41 +1,41 @@ --- -generated_at: '2026-04-19' +generated_at: '2026-04-26' category_descriptions: - security: 'Security, identity, networking, and compliance for Foundry: auth/RBAC, - keys & encryption, private networking, guardrails, safety policies, content safety, - and data privacy for models and agents' - configuration: Configuring Foundry agents, models, tools, storage, monitoring, security, - and Azure OpenAI features (search, memory, tracing, fine-tuning, prompt controls, - and external integrations). - limits-quotas: Limits, quotas, regions, and availability for Foundry and Azure OpenAI - models, agents, evals, vector/file search, batch, fine-tuning, and partner models. 
- integrations: Patterns and code for integrating Foundry agents and models with tools, - APIs, LangChain/LangGraph, Azure OpenAI, realtime/audio, search, safety, tracing, - and enterprise systems. - architecture-patterns: 'Architectural patterns for Foundry agents: standard setup, - RAG/indexing, HA/DR, regional recovery, provisioned throughput, spillover traffic, - and LLM routing optimization.' - best-practices: Best practices for configuring tools, prompts, evaluation, safety, - latency, and fine-tuning (incl. vision models) to build high-quality, efficient - Azure AI/Foundry agents - deployment: 'Deploying agents and models: infra setup, hosting, publishing to Azure/M365/Teams, - CI/CD, workflows, custom/fine-tuned/Fireworks models, and managing deployment - lifecycle.' - decision-making: Guides for choosing models, deployments, costs, and tools, plus - migration and upgrade paths (Azure OpenAI, GitHub Models, Assistants API) and - web/Bing grounding decisions. + security: 'Security, identity, and compliance for Foundry: auth/RBAC, agent identities, + private networking, keys/encryption, guardrails/safety, MCP and Agent2Agent security, + and data privacy controls.' + configuration: 'Configuring and operating Foundry agents and models: hosting, endpoints, + tools, storage, security/guardrails, monitoring/tracing, Azure OpenAI usage, and + environment/infra setup.' + limits-quotas: Quotas, rate limits, regions, and availability for Foundry models/agents + (incl. Azure OpenAI), plus how to configure limits, routing, batch/eval usage, + and cost controls. + integrations: 'Integrating Foundry agents and OpenAI with tools, models, and services: + APIs, function calling, MCP, LangChain/LangGraph, search, speech, image, realtime + audio, tracing, safety, and fine-tuning.' 
+ architecture-patterns: 'Designing resilient, high-availability Foundry agents: isolated + setups, RAG/indexing patterns, DR and regional recovery, and scaling/provisioned + throughput with spillover traffic management.' + best-practices: Best practices for prompts, system messages, tools, evaluation, + safety, latency/throughput, and fine-tuning (incl. vision) to improve Foundry/Azure + AI agent quality and performance + decision-making: Guidance on choosing models, SDKs, deployment types, and grounding + tools, plus planning migrations, upgrades, costs, and versioning for Foundry-based + workloads + deployment: Deploying agents and models to Foundry (portal, CLI, Bicep, containers), + plus running evaluations in CI (GitHub/Azure DevOps) and hosting custom/fine-tuned/Fireworks + models troubleshooting: Diagnosing evaluation/observability problems in Foundry (metrics, logging, tracing) and resolving known platform issues with documented workarounds. skill_description: Expert knowledge for Microsoft Foundry (aka Azure AI Foundry) development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, - and deployment. Use when building Foundry agents with Azure OpenAI (RAG, evals, - vector/file search, fine-tuning, or realtime/audio), and other Microsoft Foundry - related development tasks. Not for Microsoft Foundry Classic (use microsoft-foundry-classic), - Microsoft Foundry Local (use microsoft-foundry-local), Microsoft Foundry Tools (use - microsoft-foundry-tools). -use_when: Use when building Foundry agents with Azure OpenAI (RAG, evals, vector/file - search, fine-tuning, or realtime/audio), and other Microsoft Foundry related development + and deployment. Use when building Foundry agents with Azure OpenAI, MCP tools, RAG/search, + private networking, or CI-based evals, and other Microsoft Foundry related development + tasks. 
Not for Microsoft Foundry Classic (use microsoft-foundry-classic), Microsoft + Foundry Local (use microsoft-foundry-local), Microsoft Foundry Tools (use microsoft-foundry-tools). +use_when: Use when building Foundry agents with Azure OpenAI, MCP tools, RAG/search, + private networking, or CI-based evals, and other Microsoft Foundry related development tasks. confusable_not_for: Not for Microsoft Foundry Classic (use microsoft-foundry-classic), Microsoft Foundry Local (use microsoft-foundry-local), Microsoft Foundry Tools (use @@ -45,96 +45,111 @@ confusable_not_for: Not for Microsoft Foundry Classic (use microsoft-foundry-cla ## Summary -- **Total Pages**: 275 -- **Fetched**: 275 +- **Total Pages**: 287 +- **Fetched**: 287 - **Fetch Failed**: 0 -- **Classified**: 202 -- **Unclassified**: 73 +- **Classified**: 205 +- **Unclassified**: 82 ### Incremental Update -- **New Pages**: 5 -- **Updated Pages**: 31 -- **Unchanged**: 239 -- **Deleted Pages**: 6 +- **New Pages**: 19 +- **Updated Pages**: 32 +- **Unchanged**: 236 +- **Deleted Pages**: 7 - **Compared With**: `/home/vsts/work/1/s/Agent-Skills/products/microsoft-foundry/microsoft-foundry.csv` ## Classification Statistics | Type | Count | Percentage | |------|-------|------------| -| architecture-patterns | 8 | 2.9% | -| best-practices | 8 | 2.9% | -| configuration | 43 | 15.6% | -| decision-making | 27 | 9.8% | -| deployment | 13 | 4.7% | -| integrations | 61 | 22.2% | -| limits-quotas | 11 | 4.0% | -| security | 29 | 10.5% | +| architecture-patterns | 8 | 2.8% | +| best-practices | 8 | 2.8% | +| configuration | 49 | 17.1% | +| decision-making | 28 | 9.8% | +| deployment | 8 | 2.8% | +| integrations | 58 | 20.2% | +| limits-quotas | 13 | 4.5% | +| security | 31 | 10.8% | | troubleshooting | 2 | 0.7% | -| *(Unclassified)* | 73 | 26.5% | +| *(Unclassified)* | 82 | 28.6% | ## Changes ### New Pages -- [Govern agent infrastructure as a Microsoft Entra 
administrator](https://learn.microsoft.com/en-us/azure/foundry/control-plane/govern-agent-infrastructure-entra-admin) -- [Microsoft AI (MAI) models](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/use-foundry-models-mai) -- [Claude models](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/use-foundry-models-claude) -- [FLUX models](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/use-foundry-models-flux) -- [Troubleshoot evaluation and observability issues](https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/troubleshooting) +- [Claude Desktop](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/configure-claude-desktop) +- [Agent manifests](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/agent-catalog) +- [Quickstart - Create an agent from a manifest](https://learn.microsoft.com/en-us/azure/foundry/agents/quickstarts/quickstart-agent-catalog) +- [Manage hosted agent sessions](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/manage-hosted-sessions) +- [Migrate hosted agents to the refreshed public preview](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/migrate-hosted-agent-preview) +- [Agent applications](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/agent-applications) +- [Configure an agent](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/configure-agent) +- [Best practices](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/tool-best-practice) +- [Toolbox (preview)](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/toolbox) +- [Skills (preview)](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/skills) +- [Migrate from Agent Applications to the new agent model](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/migrate-agent-applications) +- [Hosted agent permissions 
reference](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/hosted-agent-permissions) +- [Foundry Models lifecycle and support policy](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/model-retirements) +- [Model retirement schedule for Foundry Models](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/model-retirement-schedule) +- [Retired Foundry Models](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/retired-models) +- [How model router works](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/model-router-how-it-works) +- [Azure OpenAI quotas and limits in Azure Government](https://learn.microsoft.com/en-us/azure/foundry/openai/quotas-limits-gov) +- [Deployment overview](https://learn.microsoft.com/en-us/azure/foundry/concepts/deployments-overview) +- [Bring your own models with AI gateways](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/ai-gateway) ### Updated Pages -- [Manage agents at scale](https://learn.microsoft.com/en-us/azure/foundry/control-plane/how-to-manage-agents) - - Updated: 2026-02-27T23:08:00.000Z → 2026-04-13T08:00:00.000Z -- [Enforce token limits](https://learn.microsoft.com/en-us/azure/foundry/control-plane/how-to-enforce-limits-models) - - Updated: 2026-02-27T23:08:00.000Z → 2026-04-13T22:11:00.000Z -- [What is Microsoft Foundry?](https://learn.microsoft.com/en-us/azure/foundry/what-is-foundry) - - Updated: 2026-03-13T22:11:00.000Z → 2026-04-13T08:00:00.000Z -- [Quickstart: Create resources](https://learn.microsoft.com/en-us/azure/foundry/tutorials/quickstart-create-foundry-resources) - - Updated: 2026-03-26T22:23:00.000Z → 2026-04-13T08:00:00.000Z +- [Function calling](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/function-calling) + - Updated: 2026-02-27T23:08:00.000Z → 2026-04-25T03:35:00.000Z +- [Work with Azure OpenAI reasoning models](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/reasoning) + - Updated: 
2026-03-20T17:16:00.000Z → 2026-04-24T08:00:00.000Z +- [Image generation](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/dall-e) + - Updated: 2026-02-27T23:08:00.000Z → 2026-04-17T08:00:00.000Z - [What is the Foundry Agent service](https://learn.microsoft.com/en-us/azure/foundry/agents/overview) - - Updated: 2026-03-25T22:12:00.000Z → 2026-04-16T16:35:00.000Z -- [Agent identity](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/agent-identity) - - Updated: 2026-04-07T22:13:00.000Z → 2026-04-13T08:00:00.000Z -- [Build your own MCP server](https://learn.microsoft.com/en-us/azure/foundry/mcp/build-your-own-mcp-server) - - Updated: 2026-04-02T08:00:00.000Z → 2026-04-16T16:35:00.000Z -- [FAQ](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/foundry-iq-faq) - - Updated: 2026-03-14T05:04:00Z → 2026-04-14T22:13:00Z -- [Set up tracing](https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/trace-agent-setup) - - Updated: 2026-03-27T08:00:00.000Z → 2026-04-16T22:08:00.000Z -- [Optimize agent prompts](https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/prompt-optimizer) - - Updated: 2026-03-13T08:00:00.000Z → 2026-04-14T22:13:00.000Z -- [FAQ](https://learn.microsoft.com/en-us/azure/foundry/agents/faq) - - Updated: 2026-03-14T05:04:00Z → 2026-04-14T22:13:00Z -- [Foundry Models sold directly by Azure](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/concepts/models-sold-directly-by-azure) - - Updated: 2026-03-14T05:04:00.000Z → 2026-04-17T08:00:00.000Z -- [Foundry Models sold from partners and community](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/concepts/models-from-partners) - - Updated: 2026-04-06T08:00:00.000Z → 2026-04-18T06:07:00.000Z -- [Feature availability by region](https://learn.microsoft.com/en-us/azure/foundry/reference/region-support) - - Updated: 2026-04-03T08:00:00.000Z → 2026-04-16T22:08:00.000Z -- [Understanding and calculating PTU 
costs](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/provisioned-throughput-onboarding) - - Updated: 2026-04-09T08:00:00.000Z → 2026-04-13T17:17:00.000Z -- [Prompt caching](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/prompt-caching) + - Updated: 2026-04-16T16:35:00.000Z → 2026-04-22T17:23:00.000Z +- [Quickstart - Deploy a hosted agent](https://learn.microsoft.com/en-us/azure/foundry/agents/quickstarts/quickstart-hosted-agent) + - Updated: 2026-04-02T22:14:00.000Z → 2026-04-23T17:31:00.000Z +- [Agent development lifecycle](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/development-lifecycle) + - Updated: 2026-02-27T23:08:00.000Z → 2026-04-23T22:15:00.000Z +- [What is a hosted agent?](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/hosted-agents) + - Updated: 2026-03-05T08:00:00.000Z → 2026-04-22T17:23:00.000Z +- [Deploy a hosted agent](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/deploy-hosted-agent) + - Updated: 2026-03-04T08:00:00.000Z → 2026-04-22T17:23:00.000Z +- [Manage hosted agent lifecycle](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/manage-hosted-agent) + - Updated: 2026-03-04T08:00:00.000Z → 2026-04-22T17:23:00.000Z +- [Tools overview](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/tool-catalog) + - Updated: 2026-04-09T08:00:00.000Z → 2026-04-23T08:00:00.000Z +- [Connect to MCP server](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/model-context-protocol) + - Updated: 2026-04-06T17:06:00.000Z → 2026-04-23T22:15:00.000Z +- [Publish to Microsoft 365 Copilot and Teams](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/publish-copilot) + - Updated: 2026-03-10T17:09:00.000Z → 2026-04-22T17:23:00.000Z +- [Agent 365](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/agent-365) + - Updated: 2026-02-27T23:08:00.000Z → 2026-04-22T17:23:00.000Z +- [Evaluate agent 
quality](https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/evaluate-agent) + - Updated: 2026-03-12T22:17:00.000Z → 2026-04-22T17:23:00.000Z +- [Foundry Models overview](https://learn.microsoft.com/en-us/azure/foundry/concepts/foundry-models-overview) + - Updated: 2026-03-05T06:04:00.000Z → 2026-04-20T08:00:00.000Z +- [Get started with model router](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/model-router) + - Updated: 2026-03-31T22:17:00.000Z → 2026-04-24T06:04:00.000Z +- [Priority processing](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/priority-processing) + - Updated: 2026-03-24T22:15:00.000Z → 2026-04-23T17:31:00.000Z +- [Data, privacy, and security for Claude models in Microsoft Foundry](https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/claude-models/data-privacy) + - Updated: 2026-02-27T23:08:00.000Z → 2026-04-20T17:16:00.000Z +- [Create and manage projects](https://learn.microsoft.com/en-us/azure/foundry/how-to/create-projects) + - Updated: 2026-04-08T08:00:00.000Z → 2026-04-20T22:11:00.000Z +- [Set up your agent resources](https://learn.microsoft.com/en-us/azure/foundry/agents/environment-setup) - Updated: 2026-02-27T23:08:00.000Z → 2026-04-14T22:13:00.000Z -- [Create resources using Bicep template](https://learn.microsoft.com/en-us/azure/foundry/how-to/create-resource-template) - - Updated: 2026-03-31T22:17:00.000Z → 2026-04-15T08:00:00.000Z -- [Role-based access control](https://learn.microsoft.com/en-us/azure/foundry/concepts/rbac-foundry) - - Updated: 2026-02-27T23:08:00.000Z → 2026-04-13T08:00:00.000Z -- [Service architecture](https://learn.microsoft.com/en-us/azure/foundry/concepts/architecture) - - Updated: 2026-02-27T23:08:00.000Z → 2026-04-16T22:08:00.000Z -- [What's new in Microsoft Foundry](https://learn.microsoft.com/en-us/azure/foundry/whats-new-foundry) - - Updated: 2026-04-02T17:11:00.000Z → 2026-04-17T22:08:00.000Z -- *...and 11 more* +- *...and 12 more* ### Deleted Pages 
-- ~~Govern agents as a global administrator~~ (https://learn.microsoft.com/en-us/azure/foundry/control-plane/govern-agent-infrastructure-entra-admin) -- ~~Claude models in Foundry~~ (https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/use-foundry-models-claude) -- ~~FLUX models in Foundry~~ (https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/use-foundry-models-flux) -- ~~Microsoft AI (MAI) model in Foundry~~ (https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/use-foundry-models-mai) -- ~~REST API~~ (https://learn.microsoft.com/en-us/azure/foundry/reference/foundry-project) -- ~~REST API (preview)~~ (https://learn.microsoft.com/en-us/azure/foundry/reference/foundry-project-rest-preview) +- ~~Tool best practices~~ (https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/tool-best-practice) +- ~~Use an AI Gateway~~ (https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/ai-gateway) +- ~~Publish and share agents~~ (https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/publish-agent) +- ~~Invoke your Agent Application using the Responses API protocol~~ (https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/publish-responses) +- ~~Model lifecycle and retirement for Foundry Models~~ (https://learn.microsoft.com/en-us/azure/foundry/concepts/model-lifecycle-retirement) +- ~~Legacy models~~ (https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/legacy-models) +- ~~Azure OpenAI model retirement~~ (https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/model-retirements) ## Classified Pages @@ -143,6 +158,7 @@ confusable_not_for: Not for Microsoft Foundry Classic (use microsoft-foundry-cla | [Azure OpenAI quotas and limits](https://learn.microsoft.com/en-us/azure/foundry/openai/quotas-limits) | limits-quotas | 0.95 | The page is explicitly a quotas and limits reference for Azure OpenAI in Microsoft Foundry and is described as a detailed description and quick reference. 
Such pages typically list exact numeric limits (TPM, RPM, request sizes, rate limits) and often include tables by model or tier, which are product- and time-specific values that an LLM won't reliably know from training. |
| [Foundry Model quotas and limits](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/quotas-limits) | limits-quotas | 0.95 | Explicit quotas/limits article; almost certainly contains tables of per-model token/request limits, rate limits, and timeouts with exact numeric values and units. |
| [Quotas, limits, and region support](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/limits-quotas-regions) | limits-quotas | 0.95 | Explicitly a limits article listing default limits for artifacts, file sizes, vector stores, messages, tools, and regions. This will contain numeric limits, size constraints, and possibly tier-specific tables that LLMs won’t know from training. |
+| [Azure OpenAI quotas and limits in Azure Government](https://learn.microsoft.com/en-us/azure/foundry/openai/quotas-limits-gov) | limits-quotas | 0.90 | Explicitly described as a quick reference of quotas and limits for Azure OpenAI in Azure Government. This will contain numeric per-SKU limits and constraints that are expert, product-specific, and change over time. |
| [GPT-5 vs GPT-4.1](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/model-choice-guide) | decision-making | 0.90 | Direct comparison of GPT-5 vs GPT-4.1 with reasoning depth, latency, cost, and scenario guidance; clearly supports model selection with quantified trade-offs. |
| [Available tools and sample prompts](https://learn.microsoft.com/en-us/azure/foundry/mcp/available-tools) | integrations | 0.85 | Reference for 38 MCP tools with descriptions, key inputs, permissions, and example prompts. This is detailed tool/parameter-level integration knowledge unique to Foundry MCP Server. |
| [Best practices and security guidance](https://learn.microsoft.com/en-us/azure/foundry/mcp/security-best-practices) | security | 0.85 | Provides security guidance for Foundry MCP Server including identity, RBAC, Conditional Access, network isolation, and data residency. This is product-specific security configuration and governance guidance with concrete patterns. |
@@ -153,13 +169,12 @@ confusable_not_for: Not for Microsoft Foundry Classic (use microsoft-foundry-cla
| [Evals (preview)](https://learn.microsoft.com/en-us/azure/foundry/openai/latest) | integrations | 0.85 | Duplicate of the v1 REST API reference; same expert integration content. |
| [Files](https://learn.microsoft.com/en-us/azure/foundry/openai/latest) | integrations | 0.85 | Duplicate of the v1 REST API reference; includes server variables and endpoints. |
| [Fine-tuning](https://learn.microsoft.com/en-us/azure/foundry/openai/latest) | integrations | 0.85 | Duplicate of the v1 REST API reference; core integration details. |
-| [Function calling](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/function-calling) | configuration | 0.85 | Function calling requires defining functions schema, parameters, and handling tool calls; article will include JSON schema formats and API parameters unique to this feature. |
+| [Hosted agent permissions reference](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/hosted-agent-permissions) | security | 0.85 | A permissions reference for hosted agents will list specific RBAC roles, scopes, and required permissions across control and data planes, which is detailed, product-specific security configuration. |
| [Models](https://learn.microsoft.com/en-us/azure/foundry/openai/latest) | integrations | 0.85 | Duplicate of the v1 REST API reference; same parameter and schema information. |
| [OpenAPI tool](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/openapi) | integrations | 0.85 | Explicitly about integrating OpenAPI 3.0/3.1 tools with Foundry agents using specific auth methods (API key, managed identity, anonymous). This will include configuration parameters, auth settings, and tool wiring details unique to this product. |
| [Realtime API via SIP](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/realtime-audio-sip) | integrations | 0.85 | SIP integration requires specific SIP headers, URIs, and mapping between SIP sessions and Realtime API sessions; these are concrete configuration and protocol details unique to this product. |
| [Responses](https://learn.microsoft.com/en-us/azure/foundry/openai/latest) | integrations | 0.85 | Duplicate of the v1 REST API reference; integration-focused expert knowledge. |
| [Structured outputs](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/structured-outputs) | configuration | 0.85 | Structured outputs feature uses specific parameters and JSON Schema integration; article will detail parameter names, schema structure, and constraints—configuration-level expert knowledge. |
-| [Tool best practices](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/tool-best-practice) | best-practices | 0.85 | Explicitly a best-practices article for configuring tools, tool_choice, secure usage, and troubleshooting tool-calling; contains product-specific DOs/DON’Ts and likely concrete config patterns. |
| [Troubleshoot evaluation and observability issues](https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/troubleshooting) | troubleshooting | 0.85 | Explicit troubleshooting article for evaluation/observability, likely mapping storage/RBAC/network symptoms to causes and fixes, with product-specific roles, settings, and diagnostic steps that qualify as expert troubleshooting knowledge. |
| [Use the Azure OpenAI Responses API](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/responses) | configuration | 0.85 | Responses API is a specific interface; article will contain endpoint paths, parameters, tool configuration, and stateful response handling—detailed API configuration not generally known. |
| [Vector stores](https://learn.microsoft.com/en-us/azure/foundry/openai/latest) | integrations | 0.85 | Duplicate of the v1 REST API reference; includes detailed REST contract. |
@@ -169,22 +184,22 @@ confusable_not_for: Not for Microsoft Foundry Classic (use microsoft-foundry-cla
| [2024-10-21 API reference](https://learn.microsoft.com/en-us/azure/foundry/openai/reference) | integrations | 0.80 | Inference REST API reference; provides authorization options, request/response structure, and endpoint details for integrations. |
| [Azure AI Search](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/ai-search) | integrations | 0.80 | How-to for wiring Azure AI Search as a tool with Foundry agents, including configuration of indexes and tool parameters; clearly an integration pattern between services. |
| [Azure Functions](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/azure-functions) | integrations | 0.80 | Queue-based integration pattern between Foundry agents and Azure Functions using Azure Queue Storage and AzureFunctionsTool; includes configuration and message flow specifics. |
+| [Best practices](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/tool-best-practice) | best-practices | 0.80 | Explicitly labeled best practices for configuring tools, controlling tool_choice, securing tool usage, and troubleshooting tool-calling; contains product-specific DOs/DON’Ts and likely concrete configuration examples. |
| [Chat completions API](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/chatgpt) | configuration | 0.80 | How-to for chat completions; typically includes request/response schemas, parameter names (temperature, max_tokens, etc.), and usage patterns specific to this API. |
| [Claude Code](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/configure-claude-code) | integrations | 0.80 | Explains how to point Claude Code tools at Foundry; includes auth configuration, endpoints, and CI/CD integration parameters—product-specific integration patterns. |
| [Configure customer-managed keys](https://learn.microsoft.com/en-us/azure/foundry/concepts/encryption-keys-portal) | security | 0.80 | Describes CMK encryption, Key Vault integration, and which data stores are covered. Typically includes key vault configuration, key identifiers, and scope of encryption specific to Foundry, which are product-specific security settings. |
-| [Connect to MCP server](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/model-context-protocol) | integrations | 0.80 | Focuses on connecting agents to MCP server endpoints using the MCP tool; this is a product-specific integration pattern with external MCP-compatible tools and data sources, likely including endpoint and tool configuration details. |
+| [Connect to MCP server](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/model-context-protocol) | integrations | 0.80 | How-to for connecting agents to MCP servers using the MCP tool; expected to document MCP-specific configuration fields, endpoints, and parameters, which are product-specific integration details. |
| [Disable preview features](https://learn.microsoft.com/en-us/azure/foundry/how-to/disable-preview-features) | security | 0.80 | Explains using Azure tags to hide portal surfaces and custom RBAC roles to block operations. This implies specific tag keys/values and role definitions/permissions that are concrete security and governance configurations. |
| [Evaluation in Azure DevOps](https://learn.microsoft.com/en-us/azure/foundry/how-to/evaluation-azure-devops) | deployment | 0.80 | Azure DevOps extension for offline evaluation will include task parameters, configuration options, and pipeline integration details specific to Foundry, representing deployment-focused CI/CD configuration knowledge. |
| [Evaluation in GitHub Actions](https://learn.microsoft.com/en-us/azure/foundry/how-to/evaluation-github-action) | deployment | 0.80 | Describes a specific GitHub Action for Foundry agents, likely including action inputs, configuration fields, and CI/CD usage constraints, which are product-specific deployment patterns for integrating evaluations into pipelines. |
| [Fine-tune using Azure Developer CLI](https://learn.microsoft.com/en-us/azure/foundry/fine-tuning/fine-tune-cli) | integrations | 0.80 | CLI extension article will list azd commands, flags, and parameter names for initializing, submitting, and managing fine-tuning jobs, which are concrete integration/config details. |
| [Function calling](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/function-calling) | integrations | 0.80 | Defines how to declare tools/functions with names, parameters, and descriptions and handle tool outputs; includes a specific 10-minute run expiry limit (expert detail) and SDK/REST parameters, fitting integrations with some limits-quotas aspect. |
-| [Image generation](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/dall-e) | configuration | 0.80 | Image generation how-to; includes model names, resolution parameters, quality settings, and API options—detailed configuration for image models. |
| [JSON mode](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/json-mode) | configuration | 0.80 | Describes how to force JSON output; includes specific request parameters/flags and behavior differences—API configuration details. |
| [Known issues](https://learn.microsoft.com/en-us/azure/foundry/reference/foundry-known-issues) | troubleshooting | 0.80 | A 'Known issues' page for Foundry, Speech, Translator, and portal services will list specific symptoms and issues with corresponding workarounds or solutions. This is classic troubleshooting content (symptom → cause → workaround/solution) that is highly product-specific and time-sensitive, and not reliably captured in model pretraining. |
| [MCP authentication](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/mcp-authentication) | security | 0.80 | Covers setting up key-based, Entra, or OAuth authentication for MCP servers used by agents; this is detailed, product-specific security and auth configuration guidance. |
| [Manage compliance and security](https://learn.microsoft.com/en-us/azure/foundry/control-plane/how-to-manage-compliance-security) | security | 0.80 | Describes using guardrail policies, Defender for Cloud, and Purview DSPM with Foundry; will include specific security settings, integration steps, and scopes, which are product-specific security configurations. |
| [Model leaderboards](https://learn.microsoft.com/en-us/azure/foundry/concepts/model-benchmarks) | decision-making | 0.80 | Model leaderboards provide benchmark metrics (quality, safety, cost, performance) to compare models. This is explicitly for choosing between models using quantified trade-offs and comparison tables, fitting decision-making. |
-| [Rate limits, region support, and enterprise features](https://learn.microsoft.com/en-us/azure/foundry/concepts/evaluation-regions-limits-virtual-network) | limits-quotas | 0.80 | Explicitly about region availability and rate limits for evaluation; such pages typically list numeric TPS/TPM limits and region tables, matching limits-quotas criteria. |
+| [Rate limits, region support, and enterprise features](https://learn.microsoft.com/en-us/azure/foundry/concepts/evaluation-regions-limits-virtual-network) | limits-quotas | 0.80 | Explicitly about region availability and rate limits for evaluations; likely includes concrete per-region limits and possibly tables of quotas and constraints that are product-specific. |
| [Realtime API migration from Preview to GA](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/realtime-audio-preview-api-migration-guide) | decision-making | 0.80 | Migration guide explicitly compares preview vs GA: changed endpoint URL formats, event names, SDK packages, and session configuration structures, providing concrete before/after mappings and upgrade decisions. |
| [SharePoint (preview)](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/sharepoint) | integrations | 0.80 | Describes using the SharePoint tool with the Foundry agent API, including identity passthrough and grounding. This integration is product-specific and likely documents concrete parameters (site/folder identifiers, auth settings, tool configuration) and constraints for the SharePoint tool, fitting the integrations & coding patterns category. |
| [Trace with Foundry Observability](https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/langchain-traces) | integrations | 0.80 | Documents AzureAIOpenTelemetryTracer, its configuration, and how to emit traces; includes specific parameter names and telemetry settings. |
@@ -203,12 +218,12 @@ confusable_not_for: Not for Microsoft Foundry Classic (use microsoft-foundry-cla
| [Connect to your own storage for Speech/Language](https://learn.microsoft.com/en-us/azure/foundry/how-to/bring-your-own-azure-storage-speech-language-services) | configuration | 0.76 | Explains configuring BYOS for Speech and Language via the userOwnedStorage binding at creation time. The presence of a specific binding name and creation-time configuration implies detailed parameter-level configuration knowledge. |
| [Network security perimeter](https://learn.microsoft.com/en-us/azure/foundry/how-to/add-foundry-to-network-security-perimeter) | security | 0.76 | Shows how to associate Foundry with an NSP and references Foundry-specific pointers. Likely includes resource type identifiers, perimeter association steps, and access rule patterns specific to Foundry, which are security configuration details. |
| [Upgrade from Azure OpenAI Service](https://learn.microsoft.com/en-us/azure/foundry/how-to/upgrade-azure-openai) | decision-making | 0.76 | Explains upgrade path, what carries over (endpoints, state, security), and trade-offs between resource types, guiding users through a migration decision and process. |
-| [Virtual networks](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/virtual-networks) | security | 0.76 | Describes private networking setup with BYO virtual network, private endpoints, DNS zones, and deny-by-default rules. This typically includes specific network configuration parameters, resource types, and required settings unique to Foundry Agent Service, which are security/networking configuration details. |
| [Agent2Agent (A2A) (preview)](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/agent-to-agent) | integrations | 0.75 | Describes configuring connections and authentication to remote Agent2Agent endpoints and using SDK integration; product-specific integration details and patterns. |
| [Agent2Agent (A2A) authentication (preview)](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/agent-to-agent-authentication) | security | 0.75 | Explains authentication methods for A2A connections and how to choose between them; security-focused with product-specific auth patterns. |
| [Browser automation (preview)](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/browser-automation) | integrations | 0.75 | Explains configuring and using the Browser Automation tool (Playwright sessions) with warnings and preview constraints; contains product-specific tool configuration and behavior. |
| [Codex](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/codex) | integrations | 0.75 | Shows how to configure Codex CLI and VS Code extension to use Azure OpenAI; includes configuration fields, environment variables, and endpoint settings—integration-specific details. |
| [Compare models](https://learn.microsoft.com/en-us/azure/foundry/how-to/benchmark-model-in-catalog) | decision-making | 0.75 | How-to for using leaderboards and side-by-side comparison; includes concrete comparison criteria (quality, safety, cost, throughput) for decision-making. |
+| [Configure an agent](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/configure-agent) | configuration | 0.75 | Shows how to select active versions, enable protocols, set authorization schemes, and configure stable endpoints; these are concrete configuration options and settings. |
| [Custom code interpreter (preview)](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/custom-code-interpreter) | configuration | 0.75 | Details configuring custom Python packages and compute resources via Azure Container Apps Dynamic Sessions for a custom code interpreter; focuses on runtime configuration parameters and ranges. |
| [Deep research](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/deep-research) | configuration | 0.75 | Shows how to call o3-deep-research with Responses API, including required parameters, tool usage (web search, MCP, code execution) and configuration specifics. |
| [Deploy Foundry Models using code](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/create-model-deployments) | deployment | 0.75 | Shows CLI commands and Bicep schema for deployments; includes parameter names, allowed values, and resource types—product-specific deployment configuration patterns. |
@@ -218,7 +233,6 @@ confusable_not_for: Not for Microsoft Foundry Classic (use microsoft-foundry-cla
| [Import custom models with Fireworks](https://learn.microsoft.com/en-us/azure/foundry/how-to/fireworks/import-custom-models) | deployment | 0.75 | Describes how to import, register, and deploy custom model weights using the Fireworks inference runtime in Foundry. This is a product-specific deployment workflow for custom models, including registration and runtime constraints. |
| [Realtime API via WebRTC](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/realtime-audio-webrtc) | integrations | 0.75 | WebRTC-specific usage of GPT Realtime API; likely documents signaling parameters, SDP/media settings, and API fields unique to this integration, matching integration & coding patterns criteria. |
| [Realtime API via WebSockets](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/realtime-audio-websockets) | integrations | 0.75 | WebSocket-focused guide for GPT Realtime API; expected to include connection URLs, message schemas, event types, and configuration parameters specific to this product integration. |
-| [Run red teaming scans in the cloud](https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/run-ai-red-teaming-cloud) | integrations | 0.75 | Cloud red-teaming with the Foundry SDK will require product-specific SDK usage, configuration parameters, and possibly environment settings for cloud execution, which qualify as integration & coding patterns beyond generic advice. |
| [Run red teaming scans locally](https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/run-scans-ai-red-teaming-agent) | integrations | 0.75 | Local red-teaming via Azure AI Evaluation SDK implies concrete SDK calls, parameters, and configuration patterns unique to this product, matching integrations & coding patterns criteria. |
| [Tool calling](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/fine-tuning-functions) | integrations | 0.75 | Describes fine-tuning with tool-calling examples and notes deprecation of function_call/functions in favor of tools, implying specific request schema and parameter usage unique to Azure OpenAI. |
| [Use Foundry Agent Service](https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/langchain-agents) | integrations | 0.75 | Shows how to connect LangGraph/LangChain applications to Foundry Agent Service, including multi-agent graphs, tools, and human-in-the-loop workflows. These are concrete integration and coding patterns unique to this service. |
@@ -226,26 +240,27 @@ confusable_not_for: Not for Microsoft Foundry Classic (use microsoft-foundry-cla
| [Use Foundry Models](https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/langchain-models) | integrations | 0.75 | Covers using OpenAI-compatible LangChain classes with Foundry-deployed chat and embedding models, including async calls and vector search. This is concrete SDK/API integration guidance specific to Foundry and langchain-azure-ai. |
| [Web search](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/web-search) | integrations | 0.75 | Describes the web_search tool and its use in the Responses API; expected to include tool name, parameters, and configuration details that are product-specific integration knowledge. |
| [Web search overview](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/web-overview) | decision-making | 0.75 | Explicitly about choosing between Web Search, Grounding with Bing Search, and Bing Custom Search; likely includes comparison criteria and scenario-based recommendations, which is product-specific decision guidance. |
-| [Configure private link](https://learn.microsoft.com/en-us/azure/foundry/how-to/configure-private-link) | security | 0.72 | End-to-end network isolation with Private Link for a specific service typically includes resource names, required subresources, DNS zone patterns, and possibly role/permission requirements. These are concrete, product-specific security/network configuration details that fit the security sub-skill. |
-| [Data, privacy, and security for Claude models in Microsoft Foundry](https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/claude-models/data-privacy) | security | 0.72 | Explains how data is processed, used, and stored when using Claude models via Marketplace, including roles of Anthropic vs Microsoft. This is product- and vendor-specific data handling and compliance behavior that an LLM would not infer generically. |
+| [Claude Desktop](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/configure-claude-desktop) | configuration | 0.74 | A setup guide for configuring Claude Desktop to use Foundry as an inference provider for enterprise deployments will typically include concrete configuration fields (endpoints, keys, environment variables, or MDM policy settings) and possibly example config snippets. These are product-specific configuration parameters rather than generic guidance, matching the configuration sub-skill. |
| [API lifecycle](https://learn.microsoft.com/en-us/azure/foundry/openai/api-version-lifecycle) | configuration | 0.70 | Explains v1 API usage, auth, and cross-provider calls; likely includes request structure and configuration details specific to this API version. |
| [Add client-side tracing](https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/trace-agent-client-side) | integrations | 0.70 | Shows how to instrument Foundry agents with OpenTelemetry using the Microsoft Foundry SDK and export traces to Azure Monitor or OTLP backends. This typically includes SDK configuration parameters, exporter settings, and tracing options unique to this product, matching the integrations sub-skill. |
+| [Agent applications](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/agent-applications) | configuration | 0.70 | Describes publishing agents as Azure resources with stable endpoints and configuring authentication and permissions; likely includes endpoint settings and auth configuration parameters. |
| [Apply a guardrail policy for models](https://learn.microsoft.com/en-us/azure/foundry/control-plane/quickstart-create-guardrail-policy) | security | 0.70 | Guardrail policy quickstart will define policy objects, scopes, and configuration fields that govern safety controls across subscriptions, which are product-specific security/governance settings. |
| [Audio generation](https://learn.microsoft.com/en-us/azure/foundry/openai/audio-completions-quickstart) | integrations | 0.70 | Quickstart for audio generation will include concrete REST/SDK request schemas, required/optional parameters (like model names, modalities, formats), and example payloads specific to Azure OpenAI audio-completions, which are product-specific integration details beyond generic LLM knowledge. |
| [Azure OpenAI monitoring data reference](https://learn.microsoft.com/en-us/azure/foundry/openai/monitor-openai-reference) | configuration | 0.70 | Monitoring reference typically lists metric names, dimensions, log categories, and schema fields. These are configuration/observability parameters unique to this service and not generic knowledge. |
| [Azure Speech in Foundry Tools](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/azure-ai-speech) | integrations | 0.70 | Describes connecting Azure Speech via an MCP server, including tool-specific constraints (no network-secured Foundry) and likely configuration parameters unique to this integration. |
| [Batch processing](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/batch) | limits-quotas | 0.70 | Mentions separate quota, 24-hour target turnaround, and 50% cost reduction; the full article likely includes batch-specific quota units, file size limits, and processing constraints—numeric limits and time targets. |
-| [Built-in evaluators reference](https://learn.microsoft.com/en-us/azure/foundry/concepts/built-in-evaluators) | configuration | 0.70 | Comprehensive reference will list evaluator names, input/output schemas, parameters, and allowed values, which are detailed configuration references for evaluation features. |
+| [Bring your own models with AI gateways](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/ai-gateway) | integrations | 0.70 | Covers connecting models behind enterprise AI gateways like Azure API Management and other non-Azure gateways. This implies product-specific integration patterns, endpoint configuration, and possibly parameter details for using external model endpoints with Foundry agents. |
| [Built-in policy for model deployment](https://learn.microsoft.com/en-us/azure/foundry/how-to/model-deployment-policy) | security | 0.70 | Describes built-in Azure Policy definitions that govern which models can be deployed. These policies include specific policy names, effects, and allowed/denied configurations that are product-specific governance/security controls. |
| [Capability hosts](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/capability-hosts) | configuration | 0.70 | Explains capability hosts as sub-resources at account and project scopes and how to configure them; includes product-specific configuration behavior (must recreate to update). |
| [Code interpreter](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/code-interpreter) | integrations | 0.70 | Describes configuring and using the Code Interpreter tool, including file upload/analysis flows and SDK usage; contains product-specific tool parameters and patterns. |
| [Computer Use (preview)](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/computer-use) | security | 0.70 | Covers the computer use tool with explicit security and privacy risks (prompt injection, UI automation) and likely includes permission/auth configuration; primarily security-focused expert guidance for this tool. |
| [Configure an AI gateway](https://learn.microsoft.com/en-us/azure/foundry/configuration/enable-ai-api-management-gateway-portal) | configuration | 0.70 | The article describes how to enable AI Gateway using Azure API Management for Foundry resources, including configuring tokens-per-minute limits and token quotas on model deployments. This involves product-specific configuration steps and settings (how to turn on the gateway and apply limits) rather than just conceptual discussion. While it references limits/quotas, it is primarily about how to configure the gateway and its parameters, not a catalog of numeric limits, so it best fits the configuration sub-skill. |
+| [Configure private link](https://learn.microsoft.com/en-us/azure/foundry/how-to/configure-private-link) | security | 0.70 | Explains end-to-end network isolation for Foundry using private endpoints. This is a security-focused configuration topic with product-specific steps and settings for securing communication to Foundry accounts and projects. |
| [Connect knowledge base to agents](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/foundry-iq-connect) | integrations | 0.70 | How-to for wiring Foundry Agent Service to Foundry IQ (Azure AI Search) for grounded retrieval. This typically includes product-specific configuration fields (knowledge base IDs, connection parameters, model context settings) and options unique to this integration, which are not generic LLM knowledge. |
| [Create a private tool catalog (preview)](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/private-tool-catalog) | configuration | 0.70 | How-to for creating a private tool catalog using Azure API Center; likely includes specific configuration parameters and settings for catalog registration and scope. |
| [Create and use memory](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/memory-usage) | configuration | 0.70 | How-to guide for creating and using memory in Foundry Agent Service likely includes concrete API parameters, configuration options, and usage patterns specific to this product (for example, memory store identifiers, scopes, or flags), which qualify as product-specific configuration details beyond generic LLM knowledge. |
| [Create custom policy definitions](https://learn.microsoft.com/en-us/azure/foundry/how-to/custom-policy-definition) | security | 0.70 | Explains crafting custom policies to constrain Foundry configurations. Likely includes policy rule examples, resource type names, and configuration fields unique to Foundry, which are security/compliance configuration details. |
-| [Deploy a hosted agent](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/deploy-hosted-agent) | deployment | 0.70 | How-to for deploying containerized agents via Azure Developer CLI or Python SDK; likely includes product-specific deployment configuration and constraints, fitting deployment. |
+| [Data, privacy, and security for Claude models in Microsoft Foundry](https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/claude-models/data-privacy) | security | 0.70 | The page provides product-specific details on how Anthropic Claude models in Microsoft Foundry handle, process, and store customer data, including which party is the data processor and applicable terms. These are concrete, service-specific security and privacy behaviors that an LLM is unlikely to know from training. While it’s more policy- and responsibility-focused than RBAC/config parameters, it squarely concerns security/data protection specifics for this integration. |
| [Deployment types](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/concepts/deployment-types) | decision-making | 0.70 | Compares Global Standard, Provisioned, DataZone, Batch with when-to-use guidance and trade-offs (pricing, residency, behavior); this is service-specific decision guidance. |
| [Deployment types (government)](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/concepts/deployment-types-gov) | decision-making | 0.70 | The article compares Standard vs ProvisionedManaged and DataZone vs single-region processing for Azure Government, and ties these to data residency and processing behavior. This is concrete service-selection guidance for different deployment types in a specialized cloud (Azure Government). It helps decide which deployment type to use based on data residency and processing requirements, which is product-specific decision-making content. |
| [Disaster recovery from a platform outage](https://learn.microsoft.com/en-us/azure/foundry/how-to/agent-service-platform-disaster-recovery) | architecture-patterns | 0.70 | Details warm standby, failover, and failback for regional outages. This is a product-specific DR pattern with concrete procedures and design-for-recovery recommendations, which are architectural patterns for this service. |
@@ -263,8 +278,12 @@ confusable_not_for: Not for Microsoft Foundry Classic (use microsoft-foundry-cla
| [How to configure Guardrails and Controls](https://learn.microsoft.com/en-us/azure/foundry/guardrails/how-to-create-guardrails) | configuration | 0.70 | How-to article for creating and managing guardrails via portal and REST API. Likely includes specific control names, configuration parameters, and API payload structures unique to Foundry guardrails, which qualify as product-specific configuration details. |
| [Image prompt engineering techniques](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/gpt-4-v-prompt-engineering) | best-practices | 0.70 | Provides concrete prompt patterns and DO/DON'T guidance specific to GPT-4o and related vision models—product-specific best practices beyond generic prompt advice. |
| [Limited access](https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/openai/limited-access) | limits-quotas | 0.70 | Limited access policy typically defines concrete eligibility criteria, usage restrictions, and possibly numeric thresholds or conditions for certain models, which are expert limit/eligibility details. |
+| [Manage hosted agent lifecycle](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/manage-hosted-agent) | limits-quotas | 0.70 | Management article explicitly mentions an idle timeout of 15 minutes and automatic deprovisioning; this is a concrete service limit/timeout value that qualifies as expert knowledge. |
| [Migrate agents](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/migrate) | decision-making | 0.70 | Covers migration from Assistants API/classic agents to the new Agent Service, including mapping of threads to conversations, runs to responses, and updated SDK patterns—this is product-specific migration and upgrade guidance used for decision-making. |
+| [Migrate from Agent Applications to the new agent model](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/migrate-agent-applications) | decision-making | 0.70 | Explicit migration guide comparing legacy Agent Application model vs new agent object model; likely includes comparison tables and concrete guidance on when/how to move, which is decision-focused. |
| [Migrate from Azure AI Inference SDK to OpenAI SDK](https://learn.microsoft.com/en-us/azure/foundry/how-to/model-inference-to-openai-migration) | decision-making | 0.70 | Provides mapping between SDKs, code changes, and feature differences, helping decide and execute migration with concrete guidance and trade-offs. |
+| [Migrate hosted agents to the refreshed public preview](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/migrate-hosted-agent-preview) | decision-making | 0.70 | Migration guide between preview backends; typically includes API/SDK changes, mapping of old to new models, and explicit guidance on when/how to move, which is decision and migration-focused expert knowledge. |
+| [Model retirement schedule for Foundry Models](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/model-retirement-schedule) | decision-making | 0.70 | A retirement schedule with specific dates and suggested replacement models is time-sensitive expert knowledge. It directly supports migration and model selection decisions by mapping current models to replacements and timelines. |
| [Model versions](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/concepts/model-versions) | decision-making | 0.70 | Explains versioning behavior and update policies for Azure and partner models; supports choosing appropriate upgrade strategies with product-specific rules. |
| [Model versions (government)](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/concepts/model-versions-gov) | decision-making | 0.70 | Explains model versioning behavior, update policies, and how Azure vs partner model versions are managed. This guides users in choosing upgrade policies and deployment strategies with concrete versioning behaviors and options, aligning with decision-making for version/upgrade choices. |
| [Monitor agents in the dashboard](https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/how-to-monitor-agents-dashboard) | configuration | 0.70 | Covers using the Agent Monitoring Dashboard and Application Insights to track metrics like token usage and latency. Such pages usually include specific metric names, dimensions, and dashboard/configuration options unique to Foundry observability, which are configuration-focused expert details. |
@@ -272,76 +291,75 @@ confusable_not_for: Not for Microsoft Foundry Classic (use microsoft-foundry-cla
| [Predicted outputs](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/predicted-outputs) | configuration | 0.70 | Feature-specific how-to; likely includes parameters to enable predicted outputs and patterns for constructing requests—product-specific configuration and usage pattern. |
| [Prompt caching](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/prompt-caching) | configuration | 0.70 | Prompt caching behavior (cache keys, duration, size constraints, model support, and request parameters) is product-specific and typically documented via concrete parameters and constraints. This constitutes expert configuration knowledge beyond generic LLM understanding. |
| [Provisioned spillover](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/spillover-traffic-management) | architecture-patterns | 0.70 | Describes spillover routing from provisioned to standard deployments; likely includes rules/thresholds for when spillover occurs and configuration options—this is a product-specific traffic management pattern. |
-| [Publish and share agents](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/publish-agent) | deployment | 0.70 | Covers promoting agents to managed resources with stable endpoints and configuring authentication/permissions; contains product-specific deployment and auth configuration details. |
-| [Publish to Microsoft 365 Copilot and Teams](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/publish-copilot) | deployment | 0.70 | Describes packaging and publishing agents as applications for M365 Copilot and Teams; likely includes platform-specific deployment requirements and configuration steps. |
| [Realtime API for speech and audio](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/realtime-audio) | integrations | 0.70 | How-to for GPT Realtime API with speech/audio; likely includes concrete WebSocket/WebRTC/SIP parameters, event names, and model-specific configuration patterns that qualify as product-specific integration details. |
| [Reinforcement fine-tuning](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/reinforcement-fine-tuning) | limits-quotas | 0.70 | Explicitly states RFT jobs auto-pause at $5,000 in total training costs; this is a concrete numeric limit/guardrail unique to the service, fitting limits-quotas. |
| [Role-based access control](https://learn.microsoft.com/en-us/azure/foundry/concepts/rbac-foundry) | security | 0.70 | RBAC article for Foundry is likely to list specific built-in role names, scopes, and assignment patterns unique to the product, which matches the security sub-skill criteria for expert knowledge.
| -| [Run evaluations from the SDK](https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/cloud-evaluation) | integrations | 0.70 | How-to article for running evaluations in the cloud using the Foundry SDK is likely to include SDK-specific parameters, configuration values, and workflow details for cloud execution that go beyond generic LLM knowledge, fitting integrations & coding patterns. | | [Safety evaluation](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/fine-tuning-safety-evaluation) | security | 0.70 | Safety evaluation article will detail evaluation stages, thresholds, and possibly configuration flags for safety checks on training data and outputs, which are product-specific safety controls. | | [Safety system message templates](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/safety-system-message-templates) | security | 0.70 | Provides concrete safety system message templates for different harm areas; these are product-specific prompt templates not generally known. | | [Set up tracing](https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/trace-agent-setup) | configuration | 0.70 | How-to for setting up tracing with Azure Monitor Application Insights and OpenTelemetry is likely to include concrete configuration parameters (connection strings, instrumentation keys, exporter settings) and product-specific setup steps, which qualify as configuration expert knowledge. | +| [Skills (preview)](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/skills) | configuration | 0.70 | How-to for using Skills REST API and SKILL.md implies concrete schema/fields and configuration details for skills that are product-specific and not general LLM knowledge. 
| | [Speech to text with Whisper](https://learn.microsoft.com/en-us/azure/foundry/openai/whisper-quickstart) | integrations | 0.70 | Speech-to-text quickstart necessarily documents endpoint paths, request bodies, audio format parameters, and model identifiers for Whisper in Azure OpenAI, which are concrete integration details not derivable from generic training. | | [Structured inputs](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/structured-inputs) | configuration | 0.70 | Details structured input placeholders, handlebar syntax, and how to pass values at runtime to configure instructions and tools; this is product-specific configuration behavior. | | [Text generation](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/generate-responses) | configuration | 0.70 | How-to for using Responses API with specific Foundry models; includes model IDs, parameters, and example payloads—product-specific configuration details. | | [Third party Guardrail integrations](https://learn.microsoft.com/en-us/azure/foundry/guardrails/third-party-integrations) | integrations | 0.70 | Describes wiring external safety solutions into Foundry guardrails, likely with specific integration endpoints, configuration parameters, and payload formats unique to this product. | | [Tool governance](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/governance) | security | 0.70 | Describes enforcing authentication, rate limits, IP restrictions, and routing via Azure API Management; contains security and governance configuration patterns specific to MCP tools. | +| [Toolbox (preview)](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/toolbox) | configuration | 0.70 | Describes adding MCP servers, web search, Azure AI Search, file search, and more via a single managed endpoint; likely includes toolbox configuration parameters and options unique to Foundry. 
| | [Tracing integrations](https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/trace-agent-framework) | configuration | 0.70 | How-to for configuring OpenTelemetry tracing for LangChain, LangGraph, and OpenAI Agents SDK in the context of Foundry. This will include framework-specific config snippets, environment variables, and tracing setup parameters. | | [Understanding and calculating PTU costs](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/provisioned-throughput-onboarding) | decision-making | 0.70 | Capacity planning and billing for provisioned throughput units typically include concrete cost figures, billing granularity, and usage thresholds for choosing PTU sizes and reservations. This is specialized decision guidance (cost vs. capacity, migration considerations) that goes beyond generic knowledge. | | [Upgrade GitHub Models to Foundry Models](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/quickstart-github-models) | decision-making | 0.70 | Guides when and how to move from GitHub Models to Foundry for production; likely includes feature/capability comparisons and migration steps—service selection and migration decision guidance. | | [Use Fireworks models](https://learn.microsoft.com/en-us/azure/foundry/how-to/fireworks/enable-fireworks-models) | configuration | 0.70 | Integration article for Fireworks models is expected to detail how to enable, deploy, and use these models, including configuration options, model catalog settings, and BYOM parameters specific to this integration, fitting configuration/integration knowledge; configuration is primary due to enable/usage focus. | | [Use Foundry Content Moderation](https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/langchain-middleware) | security | 0.70 | Describes using Azure AI Content Safety as middleware in LangChain agents via langchain-azure-ai, including prompt shielding and groundedness detection. 
This is product-specific security/safety configuration and middleware usage, not generic safety theory. | -| [Use an AI Gateway](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/ai-gateway) | integrations | 0.70 | A page on connecting AI gateways (Azure API Management and non-Azure gateways) to Foundry Agent Service will necessarily include endpoint configuration details, headers, authentication settings, and possibly parameter names or payload formats specific to this integration. Those are product-specific integration patterns that qualify as expert knowledge. | | [Use hosted agent workflows in the Visual Studio Code extension](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/vs-code-agents-workflow-pro-code) | deployment | 0.70 | Covers creating, running, and deploying hosted workflows via the VS Code extension; likely includes product-specific deployment steps, configuration options, and constraints for Foundry Agent Service. | +| [Virtual networks](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/virtual-networks) | security | 0.70 | Describes setting up private networking using Bicep/Terraform, deploying VNets, private endpoints, DNS zones, and deny-by-default rules. This is product-specific network security configuration and likely includes concrete resource names, settings, and patterns for secure access. | | [Vision fine-tuning](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/fine-tuning-vision) | best-practices | 0.70 | Vision fine-tuning doc will specify supported image formats, size/ratio constraints, dataset structuring, and product-specific recommendations for image inputs, which are concrete best practices and constraints. 
| | [Web search tool](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/web-search) | configuration | 0.70 | Describes enabling and using the web search tool, and mentions a support table for SDK and setup; this implies concrete configuration options and constraints specific to Foundry’s web search integration. | +| [Function calling](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/function-calling) | integrations | 0.68 | A 'how to use function calling' article for Azure OpenAI in Foundry is likely to include product-specific request/response schemas, parameter names, and JSON structures for function definitions and tool calls that are unique to this service’s API surface. These are concrete integration details (API parameters, expected formats) rather than just conceptual explanation, fitting the integrations category. | | [Manage Grounding with Bing](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/manage-grounding-with-bing) | configuration | 0.68 | Covers enabling/disabling Grounding with Bing Search and Custom Search across Foundry Agent Service and Azure AI Search. Likely includes feature flags, configuration switches, and policy or RBAC settings specific to controlling this capability. | | [Manage resources using Terraform](https://learn.microsoft.com/en-us/azure/foundry/how-to/create-resource-terraform) | configuration | 0.68 | Terraform-based resource creation articles typically include provider-specific resource types, property names, and configuration blocks (often with required/optional fields and example values) that are unique to the service and not reliably known from pretraining. This goes beyond a simple tutorial and represents concrete configuration knowledge for Foundry via AzAPI/AzureRM. 
| +| [Microsoft Foundry SDKs](https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/sdk-overview) | decision-making | 0.68 | The page provides product-specific guidance on when to use different Foundry vs Azure OpenAI endpoints and SDKs, including which resource types expose which endpoints and which auth methods work where. This is concrete selection guidance between options (Foundry resource vs Azure OpenAI resource, /openai/v1 vs other endpoints, Entra ID vs API key) rather than a generic overview, fitting the decision-making category. | | [Data, privacy, and security for agents](https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/agents/data-privacy-security) | security | 0.66 | Details data handling, privacy, and security specifics for Agent Service, likely including storage locations, encryption, and access patterns that are product-specific security knowledge. | | [Disaster recovery for agent services](https://learn.microsoft.com/en-us/azure/foundry/how-to/agent-service-disaster-recovery) | architecture-patterns | 0.66 | Overview of DR planning including what can/can’t be recovered and readiness checklist. Likely includes Foundry-specific recovery capabilities, limitations, and scenario-based guidance that inform DR architecture decisions. | -| [Set up your agent resources](https://learn.microsoft.com/en-us/azure/foundry/agents/environment-setup) | deployment | 0.66 | Environment setup for agents includes provisioning specific Azure resources and configurations required before agent deployment, which are product-specific deployment requirements. | -| [Agent 365](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/agent-365) | deployment | 0.65 | Uses a specific sample and Azure Developer CLI to create resources and publish an agent; contains product-specific deployment patterns and constraints. 
| +| [Agent 365](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/agent-365) | security | 0.65 | Describes governance, security, and lifecycle management; likely includes specific roles/permissions and approval flows unique to Agent 365, which fits product-specific security configuration. | | [Agent evaluators](https://learn.microsoft.com/en-us/azure/foundry/concepts/evaluation-evaluators/agent-evaluators) | best-practices | 0.65 | Agent evaluators (intent resolution, tool call accuracy, task adherence) are product-specific evaluation constructs with concrete, prescriptive guidance on how to apply them to Azure AI agents. This is actionable, service-unique best-practices style content rather than generic LLM theory. | | [Agent identity](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/agent-identity) | security | 0.65 | Describes agent identities as a specialized Microsoft Entra ID identity type, including RBAC, authentication for tools, and governance patterns specific to Foundry; this is product-specific security/identity configuration guidance even though the summary is conceptual. | | [Create a connection](https://learn.microsoft.com/en-us/azure/foundry/how-to/connections-add) | security | 0.65 | Connections are explicitly about authenticating to Microsoft and third-party services. Such pages typically enumerate connection types, required scopes, identity options, and possibly RBAC or permission requirements. That constitutes product-specific security/auth configuration details beyond generic concepts, fitting the security sub-skill. | | [Deploy Foundry Models in the portal](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/deploy-foundry-models) | deployment | 0.65 | Portal-based deployment how-to; likely includes deployment-type choices, region/model constraints, and possibly SKU-specific availability—deployment-specific expert details. 
| +| [Deploy a hosted agent](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/deploy-hosted-agent) | configuration | 0.65 | How-to article for deploying hosted agents using Python SDK and REST; likely includes request/response schemas, parameter names, and required fields that are product-specific configuration details. | | [Embeddings tutorial](https://learn.microsoft.com/en-us/azure/foundry/openai/tutorials/embeddings) | integrations | 0.65 | Tutorial for embeddings-based search; likely includes code with specific API parameters, index configuration, and integration patterns with a dataset—coding/integration patterns. | | [Foundry Models sold directly by Azure (government)](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/concepts/models-sold-directly-by-azure-gov) | decision-making | 0.65 | Lists Foundry models sold directly by Azure in Azure Government with capabilities, deployment types, and regions. This is used to choose which model and deployment to use in a given region, providing concrete comparison attributes (capabilities, availability) that support technology selection decisions. | -| [Get started with model router](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/model-router) | integrations | 0.65 | How-to for using model router through the Chat Completions API; likely includes request/response schemas, tool parameters, and product-specific API usage patterns beyond generic LLM knowledge. | | [Grounding with Bing tools](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/bing-tools) | decision-making | 0.65 | Explains Grounding with Bing Search and Bing Custom Search, and explicitly advises to start with the web search tool; provides comparative guidance on when to use which tool, fitting decision-making. 
| +| [Image generation](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/dall-e) | integrations | 0.65 | The article explains how to use OpenAI image generation models (DALL-E / gpt-image-series) with configuration options. Such a how-to typically includes request payload fields (size, quality, style, image count, etc.) and model-specific parameters for the Azure OpenAI/Foundry API. These concrete API options and parameter names are integration details, so it best fits the integrations category rather than a generic overview. | | [Image generation (preview)](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/image-generation) | integrations | 0.65 | Describes a specific tool within Foundry Agent Service, including how to configure agents, deploy models, and handle base64-encoded outputs. Likely includes tool-specific parameters and configuration patterns beyond generic image generation concepts. | -| [Manage hosted agent lifecycle](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/manage-hosted-agent) | deployment | 0.65 | Covers starting, stopping, updating, and deleting hosted agent deployments; operational deployment lifecycle guidance specific to the service. | +| [Manage hosted agent sessions](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/manage-hosted-sessions) | configuration | 0.65 | Covers creating, invoking, and managing sessions using REST, SDK, and CLI; likely documents session-related parameters, IDs, and options that are product-specific configuration details. | | [Migrate from Foundry (classic) portal](https://learn.microsoft.com/en-us/azure/foundry/how-to/navigate-from-classic) | decision-making | 0.65 | Migration guide mapping classic terminology, capabilities, SDKs, and navigation to new equivalents; supports decision-making on how to transition and what to use instead, which is product-specific migration guidance. 
| -| [Model lifecycle and retirement for Foundry Models](https://learn.microsoft.com/en-us/azure/foundry/concepts/model-lifecycle-retirement) | decision-making | 0.65 | Describes lifecycle stages, deprecation timelines, notifications, and migration steps. This guides decisions about when and how to migrate between models and manage upgrades, fitting decision-making around lifecycle management. | | [Optimization model upgrade](https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/optimization-model-upgrade) | decision-making | 0.65 | Guides detection of deprecated models, comparison of alternatives, and upgrade decisions; contains product-specific migration and selection guidance. | | [Optimize agent prompts](https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/prompt-optimizer) | best-practices | 0.65 | Described as applying prompt-engineering best practices specifically for Foundry agents; likely includes concrete, product-specific recommendations and patterns for system instructions rather than generic theory, fitting best-practices. | | [Optimize cost and performance](https://learn.microsoft.com/en-us/azure/foundry/control-plane/how-to-optimize-cost-performance) | decision-making | 0.65 | Cost/performance optimization using Ask AI; likely includes concrete guidance on when to switch models, how to interpret cost spikes, and scenario-based recommendations, fitting decision-making around model/tier selection. | | [Plan rollout](https://learn.microsoft.com/en-us/azure/foundry/concepts/planning) | decision-making | 0.65 | The page provides product-specific rollout planning guidance for Microsoft Foundry, including key decisions around environment setup, data isolation, integration with other Azure services, capacity management, and monitoring. This is decision-making content that helps choose approaches and configurations for organizational adoption, beyond generic conceptual guidance. 
| | [Preference fine-tuning](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/fine-tuning-direct-preference-optimization) | configuration | 0.65 | DPO how-to will include dataset format (preferred/rejected pairs), training parameters, and API fields specific to DPO jobs, which are configuration details for this product feature. | +| [Priority processing](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/priority-processing) | configuration | 0.65 | Describes enabling priority processing on deployments and verifying which tier processed requests. This typically involves specific deployment settings/flags and possibly tier identifiers, which are product-specific configuration details. | | [Prompt Shields](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/content-filter-prompt-shields) | configuration | 0.65 | Describes enabling Prompt Shields as part of guardrail controls and shows response annotation structure with fields like detected/filtered. This is product-specific configuration and response schema detail rather than just conceptual security content. | | [Register and manage custom agents](https://learn.microsoft.com/en-us/azure/foundry/control-plane/register-custom-agent) | configuration | 0.65 | Custom agent registration requires specific configuration fields (IDs, endpoints, telemetry settings) and data collection setup, which are product-specific configuration details. | +| [Retired Foundry Models](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/retired-models) | decision-making | 0.65 | Lists specific models that are retired and their alternatives. This is expert, time-sensitive catalog information that guides decisions about which replacement model to choose when a model is no longer available. 
| | [Retrieval Augmented Generation (RAG) overview](https://learn.microsoft.com/en-us/azure/foundry/concepts/retrieval-augmented-generation) | architecture-patterns | 0.65 | Explains how RAG works specifically in Foundry, including agentic retrieval vs classic RAG; likely includes product-specific architectural patterns and when to use them. | +| [Set up your agent resources](https://learn.microsoft.com/en-us/azure/foundry/agents/environment-setup) | configuration | 0.65 | Environment setup for Foundry Agent Service typically includes product-specific infrastructure and configuration requirements (resource types, settings, possibly environment variables and permissions). The summary references hosted agents needing additional permissions and RBAC configurations, indicating concrete configuration details rather than just conceptual guidance. | | [Synthetic Data Generation](https://learn.microsoft.com/en-us/azure/foundry/fine-tuning/data-generation) | configuration | 0.65 | Synthetic data feature article will describe UI/parameter options (dataset size, schema, sampling settings, limits) and preview constraints specific to Foundry’s generator, which are configuration-level details. | | [Use declarative agent workflows in the Visual Studio Code extension](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/vs-code-agents-workflow-low-code) | configuration | 0.65 | Describes YAML-based declarative workflows for Foundry Agent Service; likely includes schema fields, allowed values, and configuration options. | -| [High availability and resiliency](https://learn.microsoft.com/en-us/azure/foundry/how-to/high-availability-resiliency) | architecture-patterns | 0.64 | Guidance for HA and resiliency planning for Foundry projects and Agent Service. Likely includes Foundry-specific patterns, regional deployment recommendations, and possibly RPO/RTO targets or thresholds that inform architectural decisions. 
| | [Standard agent setup](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/standard-agent-setup) | architecture-patterns | 0.64 | Explains when to use standard agent setup vs basic, focusing on data sovereignty, isolation, and compliance. This is a product-specific architecture pattern with clear scenario-based guidance. | | [Content streaming](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/content-streaming) | decision-making | 0.63 | Explains default vs asynchronous filtering modes and their impact on latency and performance, guiding users to choose modes based on trade-offs—this is product-specific decision guidance. | | [Default safety policies](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/default-safety-policies) | security | 0.62 | Describes default safety policies, including specific filters (blocklists, prompt transformation, content credentials). Likely enumerates concrete policy names and behaviors that are product-specific security defaults. | | [Groundedness detection](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/content-filter-groundedness) | integrations | 0.62 | Describes groundedness detection requirements (embeddings, formatting) and likely includes specific API usage patterns and parameters for RAG scenarios, which are product-specific integration details. | | [Agent runtime components](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/runtime-components) | integrations | 0.60 | Explains how to wire together agents, conversations, and responses with concrete SDK/REST usage; likely includes product-specific API parameters and patterns for stateful interactions, fitting integrations & coding patterns. 
| | [Automatic model updates for Azure OpenAI Models](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/working-with-models) | configuration | 0.60 | Covers managing deployment lifecycle, updates, and retirement; likely includes API usage and configuration details specific to Azure OpenAI in Foundry. | -| [Azure OpenAI model retirement](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/model-retirements) | decision-making | 0.60 | Provides information about available, deprecated, and retired Azure OpenAI models. While partly reference, it directly informs which models you can or should use and when to migrate, supporting model selection and migration decisions. | | [Build your own MCP server](https://learn.microsoft.com/en-us/azure/foundry/mcp/build-your-own-mcp-server) | integrations | 0.60 | Covers building a custom MCP server with Azure Functions, registering it in Azure API Center, and connecting it to Foundry Agent Service; this is a product-specific integration pattern tying together multiple services with concrete configuration steps. | -| [Evaluate agent quality](https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/evaluate-agent) | best-practices | 0.60 | How-to for running evaluations with built-in evaluators and setting acceptance thresholds; contains product-specific evaluator types and usage patterns. | +| [Evaluate agent quality](https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/evaluate-agent) | best-practices | 0.60 | Covers how to run evaluations, set acceptance thresholds (example 85% task adherence), and likely includes concrete evaluator configurations and thresholds, which are product-specific best practices. | +| [High availability and resiliency](https://learn.microsoft.com/en-us/azure/foundry/how-to/high-availability-resiliency) | architecture-patterns | 0.60 | Focuses on planning high availability and disaster recovery for Foundry projects and Agent Service. 
While not explicitly listing limits, it likely includes product-specific resiliency patterns, region/zone usage, and guidance on when to use particular configurations for business continuity. | | [Integrate with your applications](https://learn.microsoft.com/en-us/azure/foundry/how-to/integrate-with-other-apps) | decision-making | 0.60 | Article explicitly helps choose an integration pattern and explains how to integrate Foundry into applications. Likely includes pattern selection guidance and criteria for when to use each integration approach, which fits decision-making for integration strategy. | -| [Invoke your Agent Application using the Responses API protocol](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/publish-responses) | integrations | 0.60 | How-to for using the Responses API protocol; likely includes protocol-specific request/response schema and parameters unique to Foundry. | -| [Legacy models](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/legacy-models) | decision-making | 0.60 | Catalog of retired models; informs decisions about legacy migrations and compatibility, containing data not inferable from general knowledge. | | [Monitor model deployments](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/monitor-models) | configuration | 0.60 | Explains how to wire Foundry Models into Azure Monitor and Log Analytics; likely includes specific metric names, log categories, and configuration settings. | -| [Priority processing](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/priority-processing) | decision-making | 0.60 | Describes enabling priority processing, associated costs, and latency/availability behavior; likely includes tiered behavior and cost/performance trade-offs that guide when to use priority processing. 
| | [Provisioned throughput offering (PTU)](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/provisioned-throughput) | architecture-patterns | 0.60 | Concept article on provisioned throughput; typically includes when to use provisioned vs standard, capacity sizing, and trade-offs; this is a product-specific deployment/architecture pattern. | +| [Quickstart - Deploy a hosted agent](https://learn.microsoft.com/en-us/azure/foundry/agents/quickstarts/quickstart-hosted-agent) | deployment | 0.60 | Quickstart for deploying containerized agents using azd or VS Code; likely includes Foundry-specific deployment commands, required resources, and constraints unique to the service. | | [Safety system messages](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/system-message) | security | 0.60 | Covers safety system messages to reduce harmful outputs; likely includes structured templates and components specific to Azure OpenAI safety guidance. | | [System message design](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/advanced-prompt-engineering) | best-practices | 0.60 | Focuses on system message design for control, safety, and consistency; likely includes concrete patterns and examples tailored to Azure OpenAI behavior. | -| [Work with Azure OpenAI reasoning models](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/reasoning) | integrations | 0.60 | How-to for using GPT-5 series and o1/o3 reasoning models; likely includes model IDs, API usage patterns, and configuration options specific to these reasoning models in Foundry. | ## Unclassified Pages @@ -352,59 +370,67 @@ confusable_not_for: Not for Microsoft Foundry Classic (use microsoft-foundry-cla | [Status dashboard](https://learn.microsoft.com/en-us/azure/foundry/foundry-status-dashboard-documentation) | 0.45 | Describes the Foundry status dashboard conceptually (monitor health, incidents, maintenance). 
Summary doesn’t indicate detailed configuration parameters, error codes, or limits; appears more like a feature overview. | | [Work in VS Code](https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/get-started-projects-vs-code) | 0.45 | VS Code extension usage for Foundry; likely step-by-step UI tutorial rather than detailed configuration or limits. | | [Analyze evaluation results](https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/cluster-analysis) | 0.40 | Cluster analysis article appears focused on interpreting evaluation results and analysis views; summary does not suggest concrete configuration parameters, limits, or troubleshooting mappings. | +| [Built-in evaluators reference](https://learn.microsoft.com/en-us/azure/foundry/concepts/built-in-evaluators) | 0.40 | Reference for built-in evaluators but summary doesn’t indicate detailed config tables, limits, or error mappings; likely describes what evaluators do rather than product-specific parameters. | | [Harm categories and severity levels](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/content-filter-severity-levels) | 0.40 | Explains harm categories and severity levels conceptually; unless it includes detailed machine-readable policy tables, it is primarily conceptual safety taxonomy. | | [Human evaluation for agents](https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/human-evaluation) | 0.40 | Human evaluation setup and templates; likely workflow instructions without detailed config tables or error mappings. | -| [Microsoft Foundry SDKs](https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/sdk-overview) | 0.40 | Overview of Foundry SDKs and endpoints, resource types, and basic authentication options. While it mentions endpoint paths and auth methods, it lacks detailed configuration tables, limits, or decision criteria that would qualify as expert knowledge under the defined categories. 
| | [Monitor fleet health and performance](https://learn.microsoft.com/en-us/azure/foundry/control-plane/monitoring-across-fleet) | 0.40 | Monitoring overview/how-to; summary suggests portal usage patterns but not specific metrics tables, thresholds, or product-unique diagnostic commands. | +| [Run evaluations from the SDK](https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/cloud-evaluation) | 0.40 | How-to guide for running cloud evaluations; summary suggests workflow/tutorial content, not detailed limits, config matrices, or error-code troubleshooting. | +| [Run evaluations from the portal](https://learn.microsoft.com/en-us/azure/foundry/how-to/evaluate-generative-ai-app) | 0.40 | Portal-based evaluation how-to; appears to be step-by-step usage rather than configuration tables, limits, or decision matrices. | +| [Run red teaming scans in the cloud](https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/run-ai-red-teaming-cloud) | 0.40 | How-to for running AI Red Teaming Agent in the cloud; summary suggests procedural SDK usage, not detailed config tables, limits, or error-code-based troubleshooting. | | [Task Adherence](https://learn.microsoft.com/en-us/azure/foundry/guardrails/task-adherence) | 0.40 | Describes Task Adherence objectives and behavior at a conceptual level (identifying misaligned tool invocations, blocking actions, escalation). Summary does not show specific configuration parameters, thresholds, or error mappings. | -| [Tools overview](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/tool-catalog) | 0.40 | Conceptual overview of tools and the tool catalog; summary does not indicate detailed parameter tables, limits, or error mappings—primarily explains what tools are and how they extend agents. 
| | [Trace agent overview](https://learn.microsoft.com/en-us/azure/foundry/observability/concepts/trace-agent-concept) | 0.40 | Conceptual article on agent tracing and observability; summary focuses on what tracing is and why, not on concrete configuration parameters, limits, or error codes. | | [Create a workflow](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/workflow) | 0.35 | Conceptual explanation of workflows and orchestration; summary doesn’t indicate detailed configuration tables or limits. | | [Explore the model playgrounds](https://learn.microsoft.com/en-us/azure/foundry/concepts/concept-playgrounds) | 0.35 | Conceptual overview of Foundry playgrounds for prototyping; summary doesn’t show detailed configuration parameters or numeric constraints. | | [General purpose evaluators](https://learn.microsoft.com/en-us/azure/foundry/concepts/evaluation-evaluators/general-purpose-evaluators) | 0.35 | Describes general-purpose evaluators like coherence and fluency conceptually; summary suggests no product-specific configs, limits, or error-resolution mappings. | | [Intervention points](https://learn.microsoft.com/en-us/azure/foundry/guardrails/intervention-points) | 0.35 | Intervention points article is conceptual about where to apply guardrails in agent workflows, not a detailed configuration or API reference. | | [Quickstart - Create a prompt agent](https://learn.microsoft.com/en-us/azure/foundry/agents/quickstarts/prompt-agent) | 0.35 | Quickstart for creating a prompt agent; likely shows basic SDK usage and portal steps without detailed configuration catalogs or limits. | -| [Quickstart - Deploy a hosted agent](https://learn.microsoft.com/en-us/azure/foundry/agents/quickstarts/quickstart-hosted-agent) | 0.35 | Quickstart for deploying a hosted agent using CLI/VS Code; focuses on basic deployment flow, not on tier matrices, constraints, or advanced deployment patterns. 
| | [Red teaming large language models (LLMs)](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/red-teaming) | 0.35 | Red teaming planning guide is methodological and conceptual; it does not focus on product-specific configuration parameters or numeric thresholds. | | [Retrieval Augmented Generation (RAG) evaluators](https://learn.microsoft.com/en-us/azure/foundry/concepts/evaluation-evaluators/rag-evaluators) | 0.35 | Conceptual explanation of RAG evaluators (relevance, groundedness, completeness); summary does not show numeric thresholds, config tables, or troubleshooting mappings. | | [What is Foundry IQ?](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/what-is-foundry-iq) | 0.35 | Conceptual overview of Foundry IQ as a managed knowledge layer; summary doesn’t indicate detailed limits, config tables, or error mappings. | -| [What is a hosted agent?](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/hosted-agents) | 0.35 | Conceptual description of hosted agents and managed hosting; summary doesn’t show detailed config parameters, limits, or error codes. | | [Abuse monitoring](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/abuse-monitoring) | 0.30 | Abuse monitoring article is a high-level description of monitoring practices and data handling, without detailed error codes, metrics, or configuration parameters. | -| [Agent development lifecycle](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/development-lifecycle) | 0.30 | Describes the agent development lifecycle conceptually; no indication of numeric thresholds, config tables, or error mappings. | | [Azure OpenAI evaluators](https://learn.microsoft.com/en-us/azure/foundry/concepts/evaluation-evaluators/azure-openai-graders) | 0.30 | Overview of Azure OpenAI graders and what they assess (label grading, string checking, similarity, custom grading). 
No detailed config tables, limits, or troubleshooting mappings; primarily conceptual feature description. |
| [Azure OpenAI supported programming languages](https://learn.microsoft.com/en-us/azure/foundry/openai/supported-languages) | 0.30 | Programming language support is usually a simple list or table without configuration parameters, quotas, or decision matrices; summary does not indicate detailed settings or limits. |
| [Claude models](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/use-foundry-models-claude) | 0.30 | Describes how to deploy and use Claude models and lists available variants, but the summary does not show specific limits, configuration parameter tables, or error mappings. Likely a usage tutorial, not expert-level configuration or decision content. |
+| [Create and manage projects](https://learn.microsoft.com/en-us/azure/foundry/how-to/create-projects) | 0.30 | Primarily a getting-started how-to for creating a Foundry project and verifying environment readiness. No indication of detailed limits, configuration parameter tables, security roles, or decision matrices; more of a procedural setup guide. |
| [Create resources using Bicep template](https://learn.microsoft.com/en-us/azure/foundry/how-to/create-resource-template) | 0.30 | Quickstart showing how to deploy a Foundry resource with a Bicep template; appears to be a step-by-step tutorial without detailed configuration parameter tables, limits, or product-specific deployment constraints. |
| [Custom evaluators](https://learn.microsoft.com/en-us/azure/foundry/concepts/evaluation-evaluators/custom-evaluators) | 0.30 | Explains that you can create custom evaluators and mentions preview status, but the summary does not show concrete configuration parameters, SDK options tables, or product-specific error/limit details.
| | [Customer Copyright Commitment](https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/openai/customer-copyright-commitment) | 0.30 | Customer Copyright Commitment mitigations page is primarily policy/requirements text; it does not expose technical configuration values, limits, or error-code mappings that would qualify as expert product knowledge under the defined categories. | +| [Deployment overview](https://learn.microsoft.com/en-us/azure/foundry/concepts/deployments-overview) | 0.30 | Deployment overview describes available deployment options and concepts. The summary doesn’t indicate detailed deployment matrices, constraints by tier, or timing requirements; it’s likely high-level guidance. | | [FAQ](https://learn.microsoft.com/en-us/azure/foundry/agents/faq) | 0.30 | General FAQ about Foundry Agent Service (setup, data handling, pricing, networking). Summary indicates high-level answers rather than detailed limits, configuration tables, or troubleshooting mappings. | | [FLUX models](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/use-foundry-models-flux) | 0.30 | Covers deploying and using FLUX image models with capabilities overview. The description suggests a feature and usage guide, not detailed limits, configuration matrices, or troubleshooting content required for expert classification. | -| [Foundry Models overview](https://learn.microsoft.com/en-us/azure/foundry/concepts/foundry-models-overview) | 0.30 | High-level overview of Foundry Models catalog and capabilities; summary doesn’t indicate numeric limits, config tables, or decision matrices. | +| [Foundry Models lifecycle and support policy](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/model-retirements) | 0.30 | Lifecycle and support policy is primarily conceptual (preview → GA → retirement, notification behavior). The summary doesn’t indicate concrete numeric limits, configuration parameters, or decision matrices with thresholds. 
| | [General availability overview](https://learn.microsoft.com/en-us/azure/foundry/concepts/general-availability) | 0.30 | GA overview is mostly release scope, supported scenarios, and migration messaging, not detailed technical configuration or limits. | +| [Get started with model router](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/model-router) | 0.30 | A how-to for using model router via chat completions is likely a usage/tutorial page without detailed config tables, limits, or error mappings. The summary doesn’t show product-specific configuration matrices or quotas. | +| [How model router works](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/model-router-how-it-works) | 0.30 | Explains architecture and routing logic conceptually (how the router scores and routes). The summary doesn’t indicate numeric thresholds, config parameters, or decision matrices with concrete values. | | [Image prompt transformation](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/prompt-transformation) | 0.30 | Conceptual explanation of prompt transformation for image generation; summary shows no numeric limits, config tables, or concrete error/diagnostic details. | | [Manage agents at scale](https://learn.microsoft.com/en-us/azure/foundry/control-plane/how-to-manage-agents) | 0.30 | Summary indicates a how-to for viewing inventory and lifecycle operations in the portal, but doesn't clearly reference configuration tables, limits, or error-code-based troubleshooting. Likely a procedural management guide rather than detailed expert reference content. | | [Microsoft AI (MAI) models](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/use-foundry-models-mai) | 0.30 | Appears to be a how-to deployment/usage guide for MAI image models. The summary does not indicate detailed configuration tables, limits, or troubleshooting mappings; likely a standard tutorial rather than expert reference content. 
| | [Monitoring dashboard insights with Foundry agent](https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/optimization-dashboard) | 0.30 | Monitoring dashboard with Ask AI appears to focus on interpreting metrics and insights; summary does not indicate detailed configuration tables, limits, or error-code-based troubleshooting. | -| [Observability overview](https://learn.microsoft.com/en-us/azure/foundry/concepts/observability) | 0.30 | Conceptual observability overview for generative AI; summary does not indicate concrete config parameters, limits, or error mappings. | | [Overview](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/model-router) | 0.30 | Conceptual description of the model router; summary does not indicate numeric limits, configuration tables, error codes, or decision matrices. | | [Prompt engineering techniques](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/prompt-engineering) | 0.30 | Prompt engineering techniques are general LLM usage guidance, not Foundry-specific configuration, limits, or APIs. | +| [Publish to Microsoft 365 Copilot and Teams](https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/publish-copilot) | 0.30 | Publishing to Microsoft 365 Copilot and Teams from the portal is likely a procedural tutorial without detailed limits, config matrices, or product-specific error mappings. | +| [Quickstart - Create an agent from a manifest](https://learn.microsoft.com/en-us/azure/foundry/agents/quickstarts/quickstart-agent-catalog) | 0.30 | Quickstart workflow for creating an agent from a manifest; primarily step-by-step tutorial without detailed configuration tables or limits. | | [Quickstart: Chat with an agent](https://learn.microsoft.com/en-us/azure/foundry/quickstarts/get-started-code) | 0.30 | SDK getting-started quickstart; focuses on basic usage of models and agents rather than detailed configuration tables, limits, or error mappings. 
| -| [Run evaluations from the portal](https://learn.microsoft.com/en-us/azure/foundry/how-to/evaluate-generative-ai-app) | 0.30 | Portal-based evaluation article is primarily about using the UI to run and view evaluations and metrics; summary does not indicate detailed configuration tables, limits, or error mappings. | | [Sensitive Data Leakage (PII)](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/content-filter-personal-information) | 0.30 | Focuses on what PII is and why detection matters. Summary does not indicate concrete configuration parameters, limits, or error codes; appears primarily conceptual/privacy overview. | | [Set up your developer environment](https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/install-cli-sdk) | 0.30 | Covers general prerequisites (language runtimes, Azure CLI, VS Code, Git) for Foundry development. Appears as a setup tutorial without detailed configuration option tables, limits, or troubleshooting content. | +| [Tools overview](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/tool-catalog) | 0.30 | Overview of tools and tool catalog; summary is conceptual and does not indicate detailed parameter tables, limits, or troubleshooting content. | | [Transparency Note for safety evaluations](https://learn.microsoft.com/en-us/azure/foundry/concepts/safety-evaluations-transparency-note) | 0.30 | Transparency note on risk and safety evaluations; typically conceptual (purpose, capabilities, limitations) without detailed configuration or numeric thresholds. | | [Tutorial: Idea to prototype](https://learn.microsoft.com/en-us/azure/foundry/tutorials/developer-journey-idea-to-prototype) | 0.30 | End-to-end tutorial (idea to prototype) describing a scenario and workflow; no indication of numeric limits, decision matrices, or deep configuration references. 
| -| [View evaluation results in the portal](https://learn.microsoft.com/en-us/azure/foundry/how-to/evaluate-results) | 0.30 | Covers viewing and interpreting evaluation results; no evidence of limits, config parameters, or troubleshooting codes. | +| [View evaluation results in the portal](https://learn.microsoft.com/en-us/azure/foundry/how-to/evaluate-results) | 0.30 | Describes viewing and interpreting evaluation results; likely UI walkthrough and conceptual metric interpretation without product-specific limits or configs. | +| [What is a hosted agent?](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/hosted-agents) | 0.30 | Conceptual explanation of hosted agents and platform benefits; summary does not show specific configuration parameters, limits, or troubleshooting mappings. | | [What is memory?](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/what-is-memory) | 0.30 | Conceptual explanation of what memory is in Foundry Agent Service and how it works at a high level; no detailed limits, configuration tables, error codes, or product-specific numeric thresholds. | | [What is the Foundry control plane?](https://learn.microsoft.com/en-us/azure/foundry/control-plane/overview) | 0.30 | High-level overview of Control Plane capabilities; summary does not indicate concrete roles, limits, or configuration tables. | | [When to use fine-tuning](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/fine-tuning-considerations) | 0.30 | Conceptual discussion of fine-tuning considerations; summary does not indicate concrete config values, limits, or product-specific numeric thresholds. | | [Overview](https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/openai/overview) | 0.25 | Overview of responsible AI practices for Azure OpenAI is conceptual and policy-oriented, not a technical reference with parameters or thresholds. 
| | [Responsible AI overview](https://learn.microsoft.com/en-us/azure/foundry/responsible-use-of-ai-overview) | 0.25 | Responsible AI overview is broad guidance and policy framing, not product-specific configuration, limits, or API details. | -| [AI red teaming agent overview](https://learn.microsoft.com/en-us/azure/foundry/concepts/ai-red-teaming-agent) | 0.20 | Explicitly described as a conceptual overview of AI Red Teaming Agent; overviews are out of scope. | -| [Create and manage projects](https://learn.microsoft.com/en-us/azure/foundry/how-to/create-projects) | 0.20 | The page is about creating a Foundry project via the portal and confirming the environment. From the summary it appears to be a procedural UI walkthrough/overview without detailed configuration tables, limits, or product-specific parameters; it doesn’t match any expert-knowledge sub-skill criteria. | +| [AI red teaming agent overview](https://learn.microsoft.com/en-us/azure/foundry/concepts/ai-red-teaming-agent) | 0.20 | Conceptual overview of AI Red Teaming Agent; focuses on what it is and why, not on specific configuration parameters, limits, or troubleshooting mappings. | +| [Agent development lifecycle](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/development-lifecycle) | 0.20 | Describes the agent development lifecycle conceptually (creation, tracing, evaluation, publishing); no concrete config values, limits, or decision matrices indicated. | +| [Agent manifests](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/agent-catalog) | 0.20 | Conceptual description and disclaimers about agent manifests; lacks product-specific parameters, limits, or troubleshooting details. | | [FAQ](https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/foundry-iq-faq) | 0.20 | FAQ about Foundry IQ and knowledge bases; description suggests conceptual Q&A without specific limits, configuration tables, error-code mappings, or other product-specific expert details. 
| | [Feature availability by region](https://learn.microsoft.com/en-us/azure/foundry/reference/region-support) | 0.20 | Region availability matrices are likely present but are high-level feature/region support lists without configuration parameters, limits, or decision matrices; primarily navigation/reference rather than expert configuration, limits, or troubleshooting guidance. | +| [Foundry Models overview](https://learn.microsoft.com/en-us/azure/foundry/concepts/foundry-models-overview) | 0.20 | Models overview is a high-level discovery/marketing-style page describing what Foundry Models is; unlikely to contain detailed limits, config tables, or decision matrices. | | [Guardrails and Controls Overview](https://learn.microsoft.com/en-us/azure/foundry/guardrails/guardrails-overview) | 0.20 | High-level overview of guardrails, risks, and intervention points without concrete configuration parameters, limits, or error mappings. Primarily conceptual, not expert configuration or troubleshooting detail. | +| [Observability overview](https://learn.microsoft.com/en-us/azure/foundry/concepts/observability) | 0.20 | Conceptual overview of observability and evaluation in generative AI; no concrete limits, configs, or product-specific parameters. | | [Plan and manage costs](https://learn.microsoft.com/en-us/azure/foundry/concepts/manage-costs) | 0.20 | The description indicates a generic cost management article: estimating expenses, tracking spending, and setting alerts using Microsoft Cost Management. It sounds like conceptual and procedural guidance without product-specific numeric limits, thresholds, or configuration tables. No clear evidence of expert-only details such as specific quotas, SKUs, or unique configuration parameters. 
| | [Quickstart: Create resources](https://learn.microsoft.com/en-us/azure/foundry/tutorials/quickstart-create-foundry-resources) | 0.20 | Quickstart/tutorial for creating Foundry resources; primarily step-by-step setup without detailed configuration tables, limits, or troubleshooting mappings. | | [Risk and safety evaluators](https://learn.microsoft.com/en-us/azure/foundry/concepts/evaluation-evaluators/risk-safety-evaluators) | 0.20 | High-level description of risk and safety evaluators and types of risks assessed. Lacks specific configuration parameters, role names, error codes, or decision matrices with thresholds. | @@ -414,9 +440,10 @@ confusable_not_for: Not for Microsoft Foundry Classic (use microsoft-foundry-cla | [Transparency note](https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/agents/transparency-note) | 0.20 | Transparency note for Agent Service is descriptive of system behavior and data use, not a technical configuration or troubleshooting guide. | | [Transparency note](https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/openai/transparency-note) | 0.20 | Transparency note is largely policy and behavior description, not a technical configuration or troubleshooting reference. | | [Video generation](https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/video-generation) | 0.20 | High-level overview of Sora 2 capabilities, safety, and limitations without specific quotas, configs, or decision matrices. | +| [What is the Foundry Agent service](https://learn.microsoft.com/en-us/azure/foundry/agents/overview) | 0.20 | High-level overview of Foundry Agent Service capabilities and agent types; no concrete limits, configs, error codes, or decision matrices. 
| +| [Work with Azure OpenAI reasoning models](https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/reasoning) | 0.20 | Described as an overview of reasoning models and their capabilities (GPT-5 series, o3-mini, o1, o1-mini) without evidence of configuration tables, limits, or decision matrices. This is primarily conceptual capability description, not detailed configuration, limits, or troubleshooting content. | | [Ask AI](https://learn.microsoft.com/en-us/azure/foundry/concepts/ask-ai) | 0.10 | High-level guidance on using Ask AI in the Foundry portal; no detailed configuration parameters, limits, or product-specific error/decision matrices. | | [Use Microsoft Foundry with a screen reader](https://learn.microsoft.com/en-us/azure/foundry/tutorials/screen-reader) | 0.10 | Accessibility/navigation guidance for using a screen reader with the Foundry portal. This is UX and orientation content, not product configuration, limits, troubleshooting, or other expert operational details. | | [What is Microsoft Foundry?](https://learn.microsoft.com/en-us/azure/foundry/what-is-foundry) | 0.10 | High-level product overview of Microsoft Foundry; marketing and conceptual description without concrete limits, configuration parameters, error codes, or decision matrices. | -| [What is the Foundry Agent service](https://learn.microsoft.com/en-us/azure/foundry/agents/overview) | 0.10 | Conceptual overview of Foundry Agent Service capabilities and agent types; no specific quotas, config matrices, or error-resolution content. | | [What's new in Model Router](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/whats-new-model-router) | 0.10 | What's new / release notes style page; typically lists features and dates, not detailed limits, configs, or troubleshooting mappings. 
| | [What's new in Microsoft Foundry](https://learn.microsoft.com/en-us/azure/foundry/whats-new-foundry) | - | What's new documentation index for April 2026; primarily a changelog/overview without technical limits, configuration tables, or troubleshooting details. | diff --git a/products/waf-generate-summary.md b/products/waf-generate-summary.md index 6473a906..b0b4ef15 100644 --- a/products/waf-generate-summary.md +++ b/products/waf-generate-summary.md @@ -1,7 +1,7 @@ # Generation Summary -**Generated**: 2026-04-19 02:02:06 -**Total Duration**: 0m 18s +**Generated**: 2026-04-26 02:02:26 +**Total Duration**: 0m 19s ## Product Crawl Summary @@ -9,16 +9,16 @@ Quick overview for reviewers. See individual product reports for details. | # | Product | Pages | Classified | New | Updated | Deleted | Status | |---|---------|-------|------------|-----|---------|---------|--------| -| 1 | Azure Well Architected | 238 | 208 | 0 | 4 | 0 | OK | +| 1 | Azure Well Architected | 238 | 208 | 2 | 4 | 2 | OK | ### Totals - **Products Processed**: 1 success, 0 failed - **Total Pages**: 238 - **Total Classified**: 208 -- **Total New Pages**: 0 +- **Total New Pages**: 2 - **Total Updated Pages**: 4 -- **Total Deleted Pages**: 0 +- **Total Deleted Pages**: 2 ### Classification by Type (All Products) diff --git a/skills/azure-aks-edge-essentials/SKILL.md b/skills/azure-aks-edge-essentials/SKILL.md index a5ff515e..1e0a0824 100644 --- a/skills/azure-aks-edge-essentials/SKILL.md +++ b/skills/azure-aks-edge-essentials/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-aks-edge-essentials -description: Expert knowledge for Azure Kubernetes Service Edge Essentials development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. 
Use when managing AKS Edge/Arc clusters, Arc connectivity, IoT/OPC/ONVIF edge workloads, TPM/Key Vault, or AI model deploys, and other Azure Kubernetes Service Edge Essentials related development tasks. Not for Azure Kubernetes Service (AKS) (use azure-kubernetes-service), Azure IoT Edge (use azure-iot-edge), Azure Stack Edge (use azure-stack-edge), Azure Container Apps (use azure-container-apps). +description: Expert knowledge for Azure Kubernetes Service Edge Essentials development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when managing AKS Edge/Arc clusters, Arc connectivity, IoT/OPC/ONVIF workloads, TPM/AI deployments, or Key Vault secrets, and other Azure Kubernetes Service Edge Essentials related development tasks. Not for Azure Kubernetes Service (AKS) (use azure-kubernetes-service), Azure IoT Edge (use azure-iot-edge), Azure Stack Edge (use azure-stack-edge), Azure Container Apps (use azure-container-apps). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Kubernetes Service Edge Essentials Skill @@ -24,21 +24,22 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| -| Troubleshooting | L37-L86 | Diagnosing and fixing AKS Edge/Arc cluster issues: installs, upgrades, networking, storage, security, logs, certificates, known issues, and using tools/PowerShell for deep troubleshooting. 
| -| Best Practices | L87-L94 | Best practices for AKS Edge/Arc: applying Azure Policy, recovering clusters after management VM loss, and safely upgrading Kubernetes/workload clusters via PowerShell or Admin Center | -| Decision Making | L95-L107 | Guidance on choosing AKS Edge/Arc vs cloud/on-prem, supported versions/add-ons, monitoring, pricing/licensing, support, and planning migrations or retirement of older AKS/Windows Server setups | -| Architecture & Design Patterns | L108-L114 | Designing AKS on Windows Server for Azure Local: high availability on two-node setups, SDN VNet architectures, and deployment patterns for AKS Arc target clusters. | -| Limits & Quotas | L115-L126 | Hardware, storage, IP, and VM requirements plus scale limits, quotas, and support policies for AKS Edge/Arc and AKS on Azure Local across Windows, VMware, and release versions | -| Security | L127-L160 | Securing AKS Edge/Arc/hybrid clusters: auth (Entra ID, AD, gMSA, workload identity), RBAC, SSH hardening, cert and key management, and encrypting/securing secrets and container workloads. | -| Configuration | L161-L240 | Configuring AKS Edge/Arc/hybrid clusters: networking, storage, load balancers, autoscaling, Arc connectivity, Windows/Linux node settings, monitoring, upgrades, and offline/host setup. | -| Integrations & Coding Patterns | L241-L299 | Integrating AKS Edge/Arc/hybrid with Azure and on-prem services: REST/CLI/PowerShell management, storage/backup, CSI, networking, IoT/OPC/ONVIF, TPM, AI model deploy, and Key Vault secrets. | -| Deployment | L300-L335 | Deploying, upgrading, and managing AKS Edge/AKS hybrid/AKS Arc clusters and node pools (Linux/Windows/GPU), including installs, updates, removals, offline/disconnected ops, and system requirements | +| Troubleshooting | L37-L87 | Diagnosing and fixing AKS Edge/Arc cluster issues: installs, upgrades, networking, storage, security, certificates, logs, known issues, and VMware/Windows Server–specific problems. 
| +| Best Practices | L88-L95 | Best practices for AKS Edge/Arc: applying Azure Policy, recovering clusters after management VM loss, and safely upgrading Kubernetes/workload clusters via PowerShell or Admin Center | +| Decision Making | L96-L108 | Guidance on choosing AKS Edge/Arc vs cloud/on-prem, supported versions/add-ons, monitoring, pricing/licensing, support, and planning migrations or retirement of older AKS/Windows Server setups | +| Architecture & Design Patterns | L109-L115 | Designing AKS on Windows Server for Azure Local: high availability on two-node setups, SDN VNet architectures, and deployment patterns for AKS Arc target clusters. | +| Limits & Quotas | L116-L127 | Hardware, storage, IP, and scale limits for AKS Edge/Arc on Azure Local/VMware/Windows, plus support policies and release changes to plan capacity and compatibility. | +| Security | L128-L160 | Securing AKS Edge/Arc/hybrid clusters: auth (Entra ID, AD, gMSA, workload identity), RBAC, SSH hardening, cert and key management, and encrypting/securing secrets and container workloads. | +| Configuration | L161-L241 | Configuring AKS Edge/Arc/hybrid clusters: networking, storage, load balancers, autoscaling, Arc connectivity, monitoring, Windows/Linux nodes, offline/online updates, and PowerShell-based setup. | +| Integrations & Coding Patterns | L242-L300 | Integrating AKS Edge/Arc/hybrid with Azure and on-prem services: REST/CLI/PowerShell management, storage/backup, CSI, networking, IoT/OPC/ONVIF, TPM, AI model deploy, and Key Vault secrets. 
| +| Deployment | L301-L336 | Deploying, upgrading, and managing AKS Edge/AKS hybrid/AKS Arc clusters and node pools (Linux/Windows/GPU), including installs, updates, removals, offline/disconnected ops, and system requirements. | ### Troubleshooting | Topic | URL | |-------|-----| | Run AKS Arc diagnostic checker for cluster failures | https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-arc-diagnostic-checker | | Resolve known issues in AKS enabled by Azure Arc | https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-arc-known-issues | +| Validate and troubleshoot AKS Edge secret encryption | https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-secret-encryption | | Collect and use AKS Edge Essentials logs for troubleshooting | https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-resources-logs | | Troubleshoot common AKS Edge Essentials issues | https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-troubleshoot-overview | | Retrieve kubelet logs from AKS Arc nodes | https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-get-kubelet-logs | @@ -129,7 +130,6 @@ This skill requires **network access** to fetch documentation content: |-------|-----| | Configure AD single sign-on to AKS Arc API server | https://learn.microsoft.com/en-us/azure/aks/aksarc/ad-sso | | Use Key Manager to rotate AKS Edge service account keys | https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-key-manager | -| Configure KMS-based secret encryption for AKS Edge Essentials | https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-secret-encryption | | Configure workload identity on AKS Edge Essentials | https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-workload-identity | | Set up Azure RBAC authorization for AKS Arc clusters | https://learn.microsoft.com/en-us/azure/aks/aksarc/azure-rbac-aks-hybrid | | Use Azure RBAC to control kubeconfig access in AKS Arc | https://learn.microsoft.com/en-us/azure/aks/aksarc/azure-rbac-local | @@ 
-171,7 +171,7 @@ This skill requires **network access** to fetch documentation content: | Advanced AKS Edge Essentials configuration and scripts | https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-more-configs | | Configure multiple NICs for AKS Edge Essentials Linux nodes | https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-multi-nic | | Configure AKS Edge Essentials for offline installation | https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-offline-install | -| Prepare Windows host machines for AKS Edge Essentials | https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-setup-machine | +| Prepare machines and configure nodes for AKS Edge | https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-setup-machine | | Configure nested virtualization for AKS Edge Essentials | https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-setup-nested-environment | | Uninstall AKS Edge Essentials from host machines | https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-uninstall | | Update AKS Edge Essentials clusters online | https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-edge-howto-update | @@ -182,6 +182,7 @@ This skill requires **network access** to fetch documentation content: | Configure Arc-enabled logical networks for AKS Arc | https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-networks | | Configure Kubernetes extension for AKS Arc on VMware | https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-vmware-install-kubernetes-extension | | Retrieve admin kubeconfig for AKS Arc on VMware clusters | https://learn.microsoft.com/en-us/azure/aks/aksarc/aks-vmware-retrieve-kubeconfig | +| Configure Azure Arc gateway for AKS on Azure Local | https://learn.microsoft.com/en-us/azure/aks/aksarc/arc-gateway-aks-arc | | Configure cluster autoscaler for AKS Arc | https://learn.microsoft.com/en-us/azure/aks/aksarc/auto-scale-aks-arc | | Configure Kubernetes cluster labels in AKS Arc 
| https://learn.microsoft.com/en-us/azure/aks/aksarc/cluster-labels | | Configure container networking for AKS on Windows Server applications | https://learn.microsoft.com/en-us/azure/aks/aksarc/concepts-container-networking | diff --git a/skills/azure-api-management/SKILL.md b/skills/azure-api-management/SKILL.md index 31245442..1d40c818 100644 --- a/skills/azure-api-management/SKILL.md +++ b/skills/azure-api-management/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-api-management -description: Expert knowledge for Azure API Management development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when managing APIM policies, self-hosted gateways, VNet/custom domains, OAuth/Entra auth, or multi-region setups, and other Azure API Management related development tasks. Not for Azure Application Gateway (use azure-application-gateway), Azure Front Door (use azure-front-door), Azure Api Center (use azure-api-center), Azure App Service (use azure-app-service). +description: Expert knowledge for Azure API Management development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when designing APIM policies, VNet/self-hosted gateways, Entra/OAuth auth, multi-region setups, or Front Door/App GW routes, and other Azure API Management related development tasks. Not for Azure Application Gateway (use azure-application-gateway), Azure Front Door (use azure-front-door), Azure Web Application Firewall (use azure-web-application-firewall), Azure Logic Apps (use azure-logic-apps). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. 
metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure API Management Skill @@ -29,10 +29,10 @@ This skill requires **network access** to fetch documentation content: | Decision Making | L57-L77 | Guidance for choosing APIM tiers/networking, scaling and cost planning, DevOps/CI/CD, migrations (ARM, portals, workspaces, Amazon API Gateway), and monetization/analytics transitions. | | Architecture & Design Patterns | L78-L84 | Patterns for placing API Management behind App Gateway/WAF, Azure Front Door, or AKS, including routing, security, and high‑availability reference architectures. | | Limits & Quotas | L85-L103 | Rate, quota, and validation limits in API Management: throttling, per-key quotas, OpenAI/LLM token control, protocol format limits, WebSocket/self-hosted gateway caps, and request/response validation. | -| Security | L104-L141 | Securing API Management and gateways: authN/Z (OAuth2, Entra ID, B2C, JWT, certs, mTLS), RBAC, managed identity, TLS/ciphers, DDoS/Defender, and secure developer portal access. | -| Configuration | L142-L243 | Configuring Azure API Management behavior: policies, caching, networking/VNet, domains, monitoring/metrics, workspaces, self-hosted gateways, credentials, and developer portal integration. | -| Integrations & Coding Patterns | L244-L275 | Patterns and samples for integrating API Management with AI/LLM backends, OAuth/Graph, logging/monitoring, events, Service Bus, Service Fabric, MCP, and importing/exporting APIs. | -| Deployment | L276-L296 | Deploying and scaling APIM: multi-region, VNet and zone setups, self-hosted gateways (AKS/K8s/Docker/Arc), backup/restore, migration, automation, and portal deployment. | +| Security | L104-L142 | Securing Azure API Management: authN/authZ (Entra ID, B2C, OAuth2, JWT, mTLS, basic), certificates, RBAC, managed identity, self-hosted gateway security, DDoS/Defender, and compliance policies. 
| +| Configuration | L143-L242 | Configuring Azure API Management behavior: policies, caching, networking/VNet, domains, monitoring/metrics, workspaces, self-hosted gateways, credentials, and developer portal integration. | +| Integrations & Coding Patterns | L243-L275 | Patterns and samples for integrating API Management with external APIs, LLMs, backends, logging/monitoring tools, events, OAuth providers, and exporting/importing API definitions. | +| Deployment | L276-L296 | Deploying and scaling APIM: multi-region, VNet and zone setups, backups/migration, autoscale, automation, certificates, and self-hosted gateways (AKS, ACA, Docker, Kubernetes, Arc). | ### Troubleshooting | Topic | URL | @@ -121,6 +121,7 @@ This skill requires **network access** to fetch documentation content: | Use authentication-certificate policy for client certificate auth in API Management | https://learn.microsoft.com/en-us/azure/api-management/authentication-certificate-policy | | Configure authentication-managed-identity policy in API Management | https://learn.microsoft.com/en-us/azure/api-management/authentication-managed-identity-policy | | Migrate API Management identity providers from ADAL to MSAL | https://learn.microsoft.com/en-us/azure/api-management/breaking-changes/identity-provider-adal-retirement-sep-2025 | +| Configure credential providers in API Management | https://learn.microsoft.com/en-us/azure/api-management/credentials-configure-common-providers | | Set up basic username/password auth for API Management developer portal | https://learn.microsoft.com/en-us/azure/api-management/developer-portal-basic-authentication | | Configure CORS for API Management developer portal test console | https://learn.microsoft.com/en-us/azure/api-management/enable-cors-developer-portal | | Retrieve authorization context with get-authorization-context policy | https://learn.microsoft.com/en-us/azure/api-management/get-authorization-context-policy | @@ -175,7 +176,6 @@ This skill requires 
**network access** to fetch documentation content: | Configure API Management automatic service update settings | https://learn.microsoft.com/en-us/azure/api-management/configure-service-update-settings | | Configure CORS behavior with cors policy in Azure API Management | https://learn.microsoft.com/en-us/azure/api-management/cors-policy | | Configure cosmosdb-data-source policy for GraphQL resolvers | https://learn.microsoft.com/en-us/azure/api-management/cosmosdb-data-source-policy | -| Configure credential providers in API Management | https://learn.microsoft.com/en-us/azure/api-management/credentials-configure-common-providers | | Understand credential manager OAuth 2.0 management and runtime flows | https://learn.microsoft.com/en-us/azure/api-management/credentials-process-flow | | Enable cross-domain access with cross-domain policy in API Management | https://learn.microsoft.com/en-us/azure/api-management/cross-domain-policy | | Extend API Management developer portal with custom functionality | https://learn.microsoft.com/en-us/azure/api-management/developer-portal-extend-custom-functionality | @@ -222,7 +222,6 @@ This skill requires **network access** to fetch documentation content: | Configure send-request policy with timeout settings | https://learn.microsoft.com/en-us/azure/api-management/send-request-policy | | Configure send-service-bus-message policy for Azure Service Bus | https://learn.microsoft.com/en-us/azure/api-management/send-service-bus-message-policy | | Configure Dapr set-backend-service policy in API Management | https://learn.microsoft.com/en-us/azure/api-management/set-backend-service-dapr-policy | -| Configure set-backend-service policy and backend entities | https://learn.microsoft.com/en-us/azure/api-management/set-backend-service-policy | | Configure set-body policy for API Management requests | https://learn.microsoft.com/en-us/azure/api-management/set-body-policy | | Configure and edit Azure API Management policy definitions | 
https://learn.microsoft.com/en-us/azure/api-management/set-edit-policies | | Configure set-header policy in Azure API Management | https://learn.microsoft.com/en-us/azure/api-management/set-header-policy | @@ -256,7 +255,7 @@ This skill requires **network access** to fetch documentation content: | Import Microsoft Foundry AI endpoints into API Management | https://learn.microsoft.com/en-us/azure/api-management/azure-ai-foundry-api | | Import Azure OpenAI model APIs as REST in API Management | https://learn.microsoft.com/en-us/azure/api-management/azure-openai-api-from-specification | | Configure GraphQL field resolvers in Azure API Management | https://learn.microsoft.com/en-us/azure/api-management/configure-graphql-resolver | -| Create managed Microsoft Graph connections via API Management credential manager | https://learn.microsoft.com/en-us/azure/api-management/credentials-how-to-azure-ad | +| Create managed Microsoft Graph connection from API Management | https://learn.microsoft.com/en-us/azure/api-management/credentials-how-to-azure-ad | | Configure GitHub OAuth connections in API Management | https://learn.microsoft.com/en-us/azure/api-management/credentials-how-to-github | | Configure user-delegated OAuth connections in API Management | https://learn.microsoft.com/en-us/azure/api-management/credentials-how-to-user-delegated | | Export Azure API Management APIs to Postman collections | https://learn.microsoft.com/en-us/azure/api-management/export-api-postman | @@ -272,6 +271,7 @@ This skill requires **network access** to fetch documentation content: | Integrate Google Gemini OpenAI-compatible APIs with API Management | https://learn.microsoft.com/en-us/azure/api-management/openai-compatible-google-gemini-api | | Import OpenAI-compatible LLM APIs into API Management | https://learn.microsoft.com/en-us/azure/api-management/openai-compatible-llm-api | | Enable Dapr integration for API Management self-hosted gateway | 
https://learn.microsoft.com/en-us/azure/api-management/self-hosted-gateway-enable-dapr | +| Use set-backend-service policy in Azure API Management | https://learn.microsoft.com/en-us/azure/api-management/set-backend-service-policy | ### Deployment | Topic | URL | diff --git a/skills/azure-app-service/SKILL.md b/skills/azure-app-service/SKILL.md index 8ba280be..f1649e07 100644 --- a/skills/azure-app-service/SKILL.md +++ b/skills/azure-app-service/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-app-service -description: Expert knowledge for Azure App Service development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when choosing App Service plans/tiers, configuring TLS/domains, deploying via slots/CI-CD, or securing with Entra/managed identity, and other Azure App Service related development tasks. Not for Azure Functions (use azure-functions), Azure Spring Apps (use azure-spring-apps), Azure Static Web Apps (use azure-static-web-apps), Azure Kubernetes Service (AKS) (use azure-kubernetes-service). +description: Expert knowledge for Azure App Service development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when choosing App Service plans/ASE, configuring VNet/inbound/outbound, auth/TLS, slots/CI-CD, or custom containers, and other Azure App Service related development tasks. Not for Azure Functions (use azure-functions), Azure Container Apps (use azure-container-apps), Azure Spring Apps (use azure-spring-apps), Azure Static Web Apps (use azure-static-web-apps). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. 
metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure App Service Skill @@ -25,14 +25,14 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| | Troubleshooting | L37-L43 | Diagnosing and troubleshooting App Service apps using built-in diagnostics and Azure Monitor, plus fixing common WordPress-on-App-Service configuration and runtime issues. | -| Best Practices | L44-L54 | Best practices for deploying, securing, routing, and maintaining App Service apps, including handling IP/TLS changes, Traffic Manager, and minimizing downtime during maintenance/restarts | +| Best Practices | L44-L54 | Best practices for deploying and securing App Service apps, handling inbound/outbound and TLS IP changes, and using Traffic Manager for resilient, geo-distributed endpoints. | | Decision Making | L55-L75 | Guidance on choosing App Service tiers, plans, auth and networking, plus planning cost, TLS, domains, and migrations (Windows↔Linux, .NET, VNet, Docker Compose, Arc). | | Architecture & Design Patterns | L76-L80 | Architectural guidance for App Service: ASE geo-distribution, outbound traffic via NAT Gateway, and recommended Azure services/patterns for building scalable, secure apps. | | Limits & Quotas | L81-L85 | App Service resource limits (CPU, memory, connections), quota types, how they’re measured/monitored, and how to use metrics to detect and avoid hitting plan or app quotas. | -| Security | L86-L133 | Securing App Service apps: auth (Entra, social, OIDC, MCP), TLS/certs, IP/VNet/firewall, managed identities/Graph/SQL/Storage access, and end‑to‑end network and data protection. | -| Configuration | L134-L189 | Configuring App Service apps: runtime and language settings, containers, networking/VNet, domains/certs, storage, security/auth, monitoring, backups, and environment variables. 
| -| Integrations & Coding Patterns | L190-L203 | Patterns for integrating App Service with TLS/SSL, Application Gateway, Azure OpenAI chatbots, Key Vault via MSI, managed identity DB access, and WebJobs event-driven bindings. | -| Deployment | L204-L229 | Deploying and scaling App Service apps: CI/CD (GitHub Actions, Azure Pipelines, CLI/PowerShell), ZIP/FTP/Git deploy, custom containers, slots, ASE/Arc, scaling, DNS and credentials. | +| Security | L86-L139 | Securing App Service apps: auth (Entra, social, OIDC, MCP), managed identities/Graph/Storage/SQL, TLS/certs, IP/VNet/firewall, Key Vault secrets, and end-to-end network isolation. | +| Configuration | L140-L193 | Configuring App Service apps: app settings, networking/VNet, storage, containers/sidecars, auth, certificates/domains, language runtimes, health/monitoring, backup/restore, and ASE setup. | +| Integrations & Coding Patterns | L194-L204 | Patterns for integrating App Service apps with APM, TLS/SSL certs, Application Gateway, MCP, Azure OpenAI chatbots (Node/Flask), and event-driven jobs via WebJobs bindings. | +| Deployment | L205-L230 | Deploying and scaling App Service apps: CI/CD (GitHub Actions, Azure Pipelines, CLI/PowerShell), ZIP/FTP/Git deploy, custom containers, slots, ASE/Arc, scaling, DNS and credentials. 
| ### Troubleshooting | Topic | URL | @@ -50,7 +50,7 @@ This skill requires **network access** to fetch documentation content: | Prepare App Service apps for outbound IP address changes | https://learn.microsoft.com/en-us/azure/app-service/ip-address-change-outbound | | Handle TLS/SSL IP address changes for App Service bindings | https://learn.microsoft.com/en-us/azure/app-service/ip-address-change-ssl | | Apply security best practices to Azure App Service deployments | https://learn.microsoft.com/en-us/azure/app-service/overview-security | -| Apply Traffic Manager best practices with Azure App Service | https://learn.microsoft.com/en-us/azure/app-service/web-sites-traffic-manager | +| Configure Azure Traffic Manager with App Service endpoints | https://learn.microsoft.com/en-us/azure/app-service/web-sites-traffic-manager | ### Decision Making | Topic | URL | @@ -90,20 +90,21 @@ This skill requires **network access** to fetch documentation content: | Handle App Service Managed Certificate changes and validation | https://learn.microsoft.com/en-us/azure/app-service/app-service-managed-certificate-changes-july-2025 | | Configure TLS mutual authentication in Azure App Service | https://learn.microsoft.com/en-us/azure/app-service/app-service-web-configure-tls-mutual-auth | | Secure App Service OpenAPI tools for Foundry with Entra auth | https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-ai-foundry-openapi-tool | +| Manage App Service authentication API versions | https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-api-version | | Customize sign-in and sign-out behavior in App Service auth | https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-customize-sign-in-out | | Configure MCP server authorization in Azure App Service | https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-mcp | | Secure MCP servers on App Service with Entra authentication | 
https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-mcp-server-vscode | | Manage OAuth tokens with App Service authentication | https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-oauth-tokens | | Configure Microsoft Entra authentication for App Service | https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-aad | -| Configure Sign in with Apple for App Service authentication | https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-apple | -| Configure Facebook authentication for Azure App Service | https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-facebook | -| Configure GitHub authentication for Azure App Service | https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-github | +| Configure Sign in with Apple for App Service | https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-apple | +| Configure Facebook as App Service identity provider | https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-facebook | +| Configure GitHub authentication for App Service | https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-github | | Configure Google authentication for Azure App Service | https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-google | | Configure custom OpenID Connect providers for App Service | https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-openid-connect | -| Configure X (Twitter) authentication for Azure App Service | https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-twitter | +| Configure X (Twitter) authentication for App Service | https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-provider-twitter | | Access user identities with App Service 
authentication | https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-user-identities | | Disable basic auth for App Service deployments securely | https://learn.microsoft.com/en-us/azure/app-service/configure-basic-auth-disable | -| Encrypt App Service app source at rest with CMK | https://learn.microsoft.com/en-us/azure/app-service/configure-encrypt-at-rest-using-cmk | +| Encrypt Azure App Service app content with customer-managed keys | https://learn.microsoft.com/en-us/azure/app-service/configure-encrypt-at-rest-using-cmk | | Configure security for Java apps on Azure App Service | https://learn.microsoft.com/en-us/azure/app-service/configure-language-java-security | | Configure TLS/SSL bindings for App Service custom domains | https://learn.microsoft.com/en-us/azure/app-service/configure-ssl-bindings | | Configure TLS/SSL certificates for Azure App Service | https://learn.microsoft.com/en-us/azure/app-service/configure-ssl-certificate | @@ -114,20 +115,25 @@ This skill requires **network access** to fetch documentation content: | Configure and use managed identities in Azure App Service | https://learn.microsoft.com/en-us/azure/app-service/overview-managed-identity | | Configure TLS/SSL security for Azure App Service | https://learn.microsoft.com/en-us/azure/app-service/overview-tls | | Prevent dangling subdomain takeovers in Azure App Service | https://learn.microsoft.com/en-us/azure/app-service/reference-dangling-subdomain-prevention | -| Configure managed identity access to Microsoft Graph from App Service | https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-access-microsoft-graph-as-app | +| Use managed identity to call Microsoft Graph from App Service | https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-access-microsoft-graph-as-app | | Grant delegated Microsoft Graph access for App Service users | 
https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-access-microsoft-graph-as-user | | Secure App Service access to Azure Storage with managed identities | https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-access-storage | | Enable authentication for Azure App Service web apps | https://learn.microsoft.com/en-us/azure/app-service/scenario-secure-app-authentication-app-service | | Use Azure Policy compliance controls for App Service | https://learn.microsoft.com/en-us/azure/app-service/security-controls-policy | | Configure minimum TLS versions for Azure App Service and Functions | https://learn.microsoft.com/en-us/azure/app-service/tls-minimum-version | | Secure App Service apps end-to-end with built-in auth | https://learn.microsoft.com/en-us/azure/app-service/tutorial-auth-aad | -| Access Microsoft Graph as app using managed identity | https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-microsoft-graph-as-app-javascript | -| Access Microsoft Graph as user from App Service | https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-microsoft-graph-as-user-javascript | +| Configure managed identity access to Microsoft Graph from App Service (JavaScript) | https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-microsoft-graph-as-app-javascript | +| Grant delegated Microsoft Graph access for App Service users (JavaScript) | https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-microsoft-graph-as-user-javascript | | Connect App Service to SQL on behalf of signed-in user | https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-sql-database-as-user-dotnet |
https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-app-graph-javascript | +| Use managed identities to access Azure Storage from App Service | https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-access-storage-javascript | +| Configure end-to-end user auth to Azure services | https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-app-app-graph-javascript | | Secure database access from App Service with managed identity | https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-azure-database | +| Secure .NET App Service secrets with Key Vault | https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault | +| Secure JavaScript App Service secrets with Key Vault | https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault-javascript | +| Secure PHP App Service secrets with Key Vault | https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault-php | +| Secure Python App Service secrets with Key Vault | https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault-python | | Secure SQL access with managed identity in App Service | https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-sql-database | +| Isolate Azure App Service traffic with VNet integration | https://learn.microsoft.com/en-us/azure/app-service/tutorial-networking-isolate-vnet | | Secure App Service apps with custom domains and certificates | https://learn.microsoft.com/en-us/azure/app-service/tutorial-secure-domain-certificate | | Secure N-tier Azure App Service with private networking | https://learn.microsoft.com/en-us/azure/app-service/tutorial-secure-ntier-app | @@ -137,7 +143,6 @@ This skill requires **network access** to fetch documentation content: | Configure Azure App Service App Configuration references | https://learn.microsoft.com/en-us/azure/app-service/app-service-configuration-references | | Configure Hybrid 
Connections for Azure App Service apps | https://learn.microsoft.com/en-us/azure/app-service/app-service-hybrid-connections | | Configure Key Vault references in App Service settings | https://learn.microsoft.com/en-us/azure/app-service/app-service-key-vault-references | -| Manage App Service authentication API and runtime versions | https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-api-version | | Configure App Service authentication using file-based settings | https://learn.microsoft.com/en-us/azure/app-service/configure-authentication-file-based | | Configure common settings for Azure App Service apps | https://learn.microsoft.com/en-us/azure/app-service/configure-common | | Mount Azure Storage file shares in App Service | https://learn.microsoft.com/en-us/azure/app-service/configure-connect-to-azure-storage | @@ -149,7 +154,6 @@ This skill requires **network access** to fetch documentation content: | Configure Aspire applications on Azure App Service | https://learn.microsoft.com/en-us/azure/app-service/configure-language-dotnet-aspire | | Configure ASP.NET apps on Azure App Service | https://learn.microsoft.com/en-us/azure/app-service/configure-language-dotnet-framework | | Configure ASP.NET Core apps on Azure App Service | https://learn.microsoft.com/en-us/azure/app-service/configure-language-dotnetcore | -| Configure APM for Java apps on Azure App Service | https://learn.microsoft.com/en-us/azure/app-service/configure-language-java-apm | | Configure Java data sources on Azure App Service | https://learn.microsoft.com/en-us/azure/app-service/configure-language-java-data-sources | | Deploy and configure Java apps on Azure App Service | https://learn.microsoft.com/en-us/azure/app-service/configure-language-java-deploy-run | | Configure Node.js applications on Azure App Service | https://learn.microsoft.com/en-us/azure/app-service/configure-language-nodejs | @@ -190,15 +194,12 @@ This skill requires **network access** to fetch 
documentation content: ### Integrations & Coding Patterns | Topic | URL | |-------|-----| +| Integrate Java apps with APM tools on App Service | https://learn.microsoft.com/en-us/azure/app-service/configure-language-java-apm | | Use App Service TLS/SSL certificates in application code | https://learn.microsoft.com/en-us/azure/app-service/configure-ssl-certificate-in-code | | Integrate App Service Environment with Azure Application Gateway | https://learn.microsoft.com/en-us/azure/app-service/environment/integrate-with-application-gateway | | Integrate Azure App Service as an MCP server | https://learn.microsoft.com/en-us/azure/app-service/scenario-ai-model-context-protocol-server | | Integrate Node.js Express chatbot with Azure OpenAI via App Service | https://learn.microsoft.com/en-us/azure/app-service/tutorial-ai-openai-chatbot-node | | Integrate Python Flask chatbot with Azure OpenAI via App Service | https://learn.microsoft.com/en-us/azure/app-service/tutorial-ai-openai-chatbot-python | -| Use Key Vault with App Service .NET via MSI | https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault | -| Use Key Vault with App Service JavaScript via MSI | https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault-javascript | -| Use Key Vault with App Service PHP via MSI | https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault-php | -| Use Key Vault with App Service Python via MSI | https://learn.microsoft.com/en-us/azure/app-service/tutorial-connect-msi-key-vault-python | | Implement event-driven jobs with Azure WebJobs SDK bindings | https://learn.microsoft.com/en-us/azure/app-service/webjobs-sdk-how-to | ### Deployment diff --git a/skills/azure-application-gateway/SKILL.md b/skills/azure-application-gateway/SKILL.md index fe08bdf8..002b90a0 100644 --- a/skills/azure-application-gateway/SKILL.md +++ b/skills/azure-application-gateway/SKILL.md @@ -1,9 +1,9 @@ --- name: 
azure-application-gateway -description: Expert knowledge for Azure Application Gateway development including troubleshooting, best practices, decision making, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring listeners/routing, WAF/TLS, AGIC/AKS integration, autoscale/zone redundancy, or v1→v2 migration, and other Azure Application Gateway related development tasks. Not for Azure Front Door (use azure-front-door), Azure Load Balancer (use azure-load-balancer), Azure Traffic Manager (use azure-traffic-manager), Azure Firewall (use azure-firewall). +description: Expert knowledge for Azure Application Gateway development including troubleshooting, best practices, decision making, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring listeners/routing, WAF/TLS, autoscale/zone redundancy, AGIC with AKS, or App Gateway for Containers, and other Azure Application Gateway related development tasks. Not for Azure Load Balancer (use azure-load-balancer), Azure Front Door (use azure-front-door), Azure Traffic Manager (use azure-traffic-manager), Azure Virtual Network (use azure-virtual-network). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-12" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Application Gateway Skill @@ -24,19 +24,20 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| -| Troubleshooting | L36-L41 | Diagnosing and fixing Application Gateway runtime issues: backend health, 502s, certificates/Key Vault, listeners, session affinity, mTLS, redirects, AKS/ALB/containers, and HTTP response codes. 
| -| Best Practices | L42-L46 | Guidance on designing Application Gateway for very high traffic: sizing, autoscaling, performance tuning, capacity planning, and configuration patterns to handle large loads reliably. | -| Decision Making | L47-L56 | Guidance on choosing networking and pricing for Application Gateway, and planning migrations (AGIC to containers, v1 retirement, classic VMs to ARM) | -| Limits & Quotas | L57-L61 | Autoscaling and zone redundancy settings, gateway capacity and configuration limits, and guidance for migrating from Application Gateway v1 to v2. | -| Security | L62-L103 | TLS/SSL, certificates, mTLS, WAF, DDoS, HSTS, and secure access patterns for Application Gateway and App Gateway for Containers, including Key Vault, cert-manager, and protocol/cipher policies | -| Configuration | L104-L168 | Configuring and monitoring Azure Application Gateway and Application Gateway for Containers: listeners, routing, probes, health, headers/URL rewrite, session affinity, diagnostics, and AKS/AGIC integration. | -| Integrations & Coding Patterns | L169-L176 | Patterns for integrating App Gateway for Containers with monitoring, security, and scaling: Prometheus/Grafana, Istio, Sentinel/Defender, and autoscaling AKS pods via gateway metrics. | -| Deployment | L177-L191 | Guides for deploying and migrating Application Gateway (v1→v2, IPv6, mTLS), configuring autoscale, and setting up/upgrading AGIC with AKS using portal, ARM, PowerShell, and Helm. | +| Troubleshooting | L36-L42 | Diagnosing and fixing Application Gateway for Containers issues using ALB Controller backend health, metrics, and common error/behavior troubleshooting steps. | +| Best Practices | L43-L47 | Guidance on designing Application Gateway for very high traffic: sizing, autoscaling, performance tuning, capacity planning, and configuration patterns to handle large loads reliably. 
| +| Decision Making | L48-L58 | Guidance on choosing networking and load-balancing for App Gateway for Containers, estimating pricing, and planning migrations (AGIC, v1 retirement, classic VMs to ARM). | +| Limits & Quotas | L59-L63 | Autoscaling and zone redundancy settings, gateway capacity and configuration limits, and guidance for migrating from Application Gateway v1 to v2. | +| Security | L64-L105 | TLS/SSL, certificates, mTLS, WAF, DDoS, HSTS, and secure access patterns for Application Gateway and App Gateway for Containers, including Key Vault, cert-manager, and protocol/cipher policies | +| Configuration | L106-L170 | Configuring Application Gateway and Application Gateway for Containers: listeners, routing, probes, health, headers/URL rewrite, WebSockets/gRPC, networking, monitoring, and AKS/AGIC integration. | +| Integrations & Coding Patterns | L171-L178 | Integrating App Gateway for Containers with Prometheus/Grafana, Istio, Sentinel/Defender, and using its metrics to autoscale AKS pods and build observability/security pipelines. | +| Deployment | L179-L193 | Guides for deploying and migrating Application Gateway (v1→v2, IPv6, mTLS), configuring autoscale, and setting up/upgrading AGIC with AKS using portal, ARM, PowerShell, and Helm. 
| ### Troubleshooting | Topic | URL | |-------|-----| | Use ALB Controller backend health and metrics for troubleshooting | https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/alb-controller-backend-health-metrics | +| Troubleshoot common issues in Application Gateway for Containers | https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/faq | | Troubleshoot common issues in Application Gateway for Containers | https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/troubleshooting-guide | ### Best Practices @@ -48,7 +49,8 @@ This skill requires **network access** to fetch documentation content: | Topic | URL | |-------|-----| | Choose container networking for Application Gateway for Containers | https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/container-networking | -| Plan migration from AGIC to Application Gateway for Containers | https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/migrate-from-agic-to-agc | +| Choose load balancing strategies in Application Gateway for Containers | https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/load-balancing-strategies | +| Decide and plan migration from AGIC to Application Gateway for Containers | https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/migrate-from-agic-to-agc | | Estimate and understand pricing for Application Gateway for Containers | https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/understanding-pricing | | Plan migration for Azure Application Gateway V1 retirement | https://learn.microsoft.com/en-us/azure/application-gateway/retirement-faq | | Understand billing and pricing for Azure Application Gateway SKUs | https://learn.microsoft.com/en-us/azure/application-gateway/understanding-pricing | @@ -124,7 +126,7 @@ This skill requires **network access** to fetch documentation content: | Configure Application Gateway with Azure 
App Service backend | https://learn.microsoft.com/en-us/azure/application-gateway/configure-web-app | | Create custom error pages in Azure Application Gateway | https://learn.microsoft.com/en-us/azure/application-gateway/custom-error | | Configure ALB Controller Helm chart for Application Gateway for Containers | https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/alb-controller-helm-chart | -| Use Kubernetes API specification for Application Gateway for Containers | https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/api-specification-kubernetes | +| Use Application Gateway for Containers Kubernetes API spec | https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/api-specification-kubernetes | | Configure components of Application Gateway for Containers | https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/application-gateway-for-containers-components | | Use Azure Monitor metrics with Application Gateway for Containers | https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/application-gateway-for-containers-metrics | | Configure custom health probes for Application Gateway for Containers | https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/custom-health-probe | @@ -170,7 +172,7 @@ This skill requires **network access** to fetch documentation content: | Topic | URL | |-------|-----| | Integrate App Gateway for Containers with Prometheus and Grafana | https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/prometheus-grafana | -| Integrate Application Gateway for Containers with Istio service mesh | https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/service-mesh-integration | +| Integrate Istio service mesh with Application Gateway for Containers | https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/service-mesh-integration | | Integrate Application Gateway for 
Containers logs with Microsoft Sentinel and Defender | https://learn.microsoft.com/en-us/azure/application-gateway/for-containers/siem-integration-with-sentinel | | Autoscale AKS pods using Application Gateway metrics | https://learn.microsoft.com/en-us/azure/application-gateway/ingress-controller-autoscale-pods | diff --git a/skills/azure-arc/SKILL.md b/skills/azure-arc/SKILL.md index a5ea2c6f..554ea0aa 100644 --- a/skills/azure-arc/SKILL.md +++ b/skills/azure-arc/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-arc -description: Expert knowledge for Azure Arc development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when managing Arc-enabled Kubernetes, servers/VMs, SQL MI, Edge RAG, resource bridge, or Arc data services, and other Azure Arc related development tasks. Not for Azure Kubernetes Service (AKS) (use azure-kubernetes-service), Azure Virtual Machines (use azure-virtual-machines), Azure Stack Edge (use azure-stack-edge), Azure VMware Solution (use azure-vmware-solution). +description: Expert knowledge for Azure Arc development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when managing Arc-enabled Kubernetes, servers/VMs, SQL MI, Edge RAG, resource bridge, or Arc data services, and other Azure Arc related development tasks. Not for Azure Kubernetes Service (AKS) (use azure-kubernetes-service), Azure Stack Edge (use azure-stack-edge), Azure Virtual Machines (use azure-virtual-machines), Azure Virtual Network (use azure-virtual-network). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. 
metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Arc Skill @@ -30,7 +30,7 @@ This skill requires **network access** to fetch documentation content: | Architecture & Design Patterns | L103-L111 | Patterns for Arc data/compute design: container storage data flow, Arc Edge Volumes, HA/DR for Arc SQL MI and failover groups, and advanced Edge RAG data parsing. | | Limits & Quotas | L112-L127 | Limits, quotas, versions, and requirements for Arc-enabled Kubernetes, Edge RAG, Arc data services, resource bridge, and billing/ESU behavior for connected machines and Windows Server. | | Security | L128-L186 | Securing Azure Arc: identity, RBAC, AD/Entra auth, keytabs, TDE, certificates, network/Private Link, policies, and hardening for Kubernetes, servers, SQL MI, Edge RAG, SCVMM, and vSphere. | -| Configuration | L187-L286 | Configuring Azure Arc infrastructure and data services: storage, networking, GitOps, monitoring, security, Edge RAG, agents/extensions, and lifecycle tasks for Arc-enabled Kubernetes, servers, and data. | +| Configuration | L187-L286 | Configuring Azure Arc infrastructure and services: Kubernetes, data services, Edge RAG, storage, networking, GitOps, monitoring, VM/agent settings, extensions, and portal/CLI management. | | Integrations & Coding Patterns | L287-L310 | Programmatic and automation patterns for Azure Arc: CLI/PowerShell/ARM/SDK usage, onboarding at scale, VM extensions, monitoring/security integration, and infrastructure-as-code workflows. | | Deployment | L311-L342 | Deploying and upgrading Azure Arc components: data controllers, Edge RAG, resource bridge, agents (Kubernetes, VMs, SCVMM, vSphere), plus prerequisites, support matrices, and clean removal steps. 
| @@ -242,7 +242,7 @@ This skill requires **network access** to fetch documentation content: | Use Azure portal Kubernetes resource view for Arc-enabled clusters | https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/kubernetes-resource-view | | Use version-managed extensions on Arc-enabled Kubernetes | https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/managed-extensions | | Monitor Flux v2 GitOps status on Arc and AKS | https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/monitor-gitops-flux-2 | -| Configure Azure Key Vault Secret Store extension on Arc Kubernetes | https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/secret-store-extension | +| Configure Azure Key Vault Secret Store extension for Arc Kubernetes | https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/secret-store-extension | | Configure Azure Key Vault Secret Store Extension on Arc Kubernetes | https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/secret-store-extension-reference | | Configure AKV Secrets Provider extension on Arc Kubernetes | https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/tutorial-akv-secrets-provider | | Apply Flux v2 configurations at scale with Azure Policy | https://learn.microsoft.com/en-us/azure/azure-arc/kubernetes/use-azure-policy-flux-2 | diff --git a/skills/azure-architecture/SKILL.md b/skills/azure-architecture/SKILL.md index 24daf35a..f14b58b6 100644 --- a/skills/azure-architecture/SKILL.md +++ b/skills/azure-architecture/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-architecture -description: Expert guidance for designing Azure solutions using Azure Architecture. Covers reference architectures, solution ideas, design patterns, technology choices, architecture styles, best practices, anti-patterns, example workloads, and migration guides. Use when designing AKS or AVD solutions, hybrid/Arc setups, multiregion DR, SAP/IoT platforms, or GenAI/RAG workloads, and other Azure Architecture related development tasks. 
+description: Expert guidance for designing Azure solutions using Azure Architecture. Covers reference architectures, solution ideas, design patterns, technology choices, architecture styles, best practices, anti-patterns, example workloads, and migration guides. Use when designing AKS, data/AI pipelines, Zero Trust networking, hybrid/Arc setups, or multiregion apps on Azure, and other Azure Architecture related development tasks. compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Architecture Skill @@ -24,15 +24,15 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| -| Reference Architectures | L37-L92 | End-to-end Azure solution blueprints for mission-critical, hybrid, and multiregion workloads: networking, security, data/ML, AKS, AVD, integration, and DR-ready reference architectures. | -| Solution Ideas | L93-L124 | End-to-end solution patterns for AI, analytics, security, DevSecOps, SAP, and data platforms on Azure, including reference architectures and implementation guidance. | -| Design Patterns | L125-L178 | Design and integration patterns for building resilient, scalable, secure Azure apps: messaging, gateways, multitenancy, caching, transactions, auth, monitoring, and legacy modernization. | -| Technology Choices | L179-L213 | Guides for choosing the right Azure/Fabric services (AI, data, storage, compute, networking, containers, messaging) by comparing options for specific workload and architecture needs. | -| Architecture Styles | L214-L224 | Azure app architecture patterns: when and how to use Big Compute, Big Data, event-driven, microservices, N-tier, and Web-Queue-Worker styles, with design guidance and tradeoffs. 
| -| Best Practices | L225-L279 | Best-practice patterns for Azure apps: API design, autoscale, caching, DR/HA, networking/IPv4-6, AKS, IoT, SAP, Event Hubs, App Service, DevSecOps, and GenAI/RAG operations. | +| Reference Architectures | L37-L91 | End-to-end Azure solution blueprints: mission-critical, hybrid, and multiregion architectures for data, networking, security, AKS, AVD, MLOps, integration, and global-scale app delivery. | +| Solution Ideas | L92-L123 | End-to-end solution patterns for AI, analytics, IoT, security, SAP, and data platforms on Azure, including architecture, services to use, and reference implementations. | +| Design Patterns | L124-L177 | Patterns for designing resilient, scalable Azure apps: messaging, gateways, multitenancy, data partitioning, caching, workflows, auth, monitoring, and integration with legacy systems. | +| Technology Choices | L178-L212 | Guides for choosing the right Azure/Fabric services (compute, storage, data, AI/ML, analytics, networking, messaging) for specific workloads and architectural requirements. | +| Architecture Styles | L213-L224 | Guidance on choosing and designing Azure app architectures (microservices, N‑tier, event-driven, web‑queue‑worker, big data, big compute) and when/how to apply each style. | +| Best Practices | L225-L279 | Best-practice patterns for Azure apps: API design, autoscaling, caching, DR, networking, AKS ops, IoT, SAP, Event Hubs/Functions, multitenant/RAG, and secure, resilient cloud architectures. | | Anti-patterns | L280-L294 | Diagnosing and fixing common Azure performance and scalability anti-patterns (busy DB/front end, chatty I/O, no caching, retry storms, noisy neighbors, sync I/O, monolithic persistence). | -| Example Workloads | L295-L371 | End-to-end reference architectures and patterns for real-world Azure workloads, covering data/AI, mainframe migration, networking, security, hybrid/Arc, AKS, Fabric, VDI, and app modernization. 
| -| Migration Guides | L372-L406 | Guides for migrating from AWS, GCP, on-prem Oracle, Kafka, and Kubernetes/EKS to Azure, including service mapping, architecture, governance, security, cost, and app modernization patterns. | +| Example Workloads | L295-L372 | End-to-end reference architectures for real-world Azure workloads: data/AI pipelines, mainframe migration, AKS and networking, security/Zero Trust, hybrid/Arc, VDI, and enterprise app deployments. | +| Migration Guides | L373-L407 | Guides for migrating from AWS, GCP, on-prem Oracle, Kafka, and Kubernetes/EKS to Azure, including service mapping, architecture, governance, security, cost, and app modernization patterns. | ### Reference Architectures | Topic | URL | @@ -47,7 +47,6 @@ This skill requires **network access** to fetch documentation content: | Deploy MongoDB Atlas securely on Azure | https://learn.microsoft.com/en-us/azure/architecture/databases/architecture/mongodb-atlas-baseline | | Move archive data from mainframes to Azure storage | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/move-archive-data-mainframes | | Secure private AKS clusters with Azure Firewall and hub-spoke | https://learn.microsoft.com/en-us/azure/architecture/guide/aks/aks-firewall | -| Deploy Terraform-based Azure Sandbox environments | https://learn.microsoft.com/en-us/azure/architecture/guide/azure-sandbox/azure-sandbox | | Architect mission-critical global content delivery on Azure | https://learn.microsoft.com/en-us/azure/architecture/guide/networking/global-web-applications/mission-critical-content-delivery | | Implement mission-critical global HTTP ingress on Azure | https://learn.microsoft.com/en-us/azure/architecture/guide/networking/global-web-applications/mission-critical-global-http-ingress | | Design global routing redundancy for mission-critical web apps | https://learn.microsoft.com/en-us/azure/architecture/guide/networking/global-web-applications/overview | @@ -80,7 +79,7 @@ This 
skill requires **network access** to fetch documentation content: | Deploy a Databricks-based stream processing pipeline | https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/data/stream-processing-databricks | | Implement stream processing with Azure Stream Analytics | https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/data/stream-processing-stream-analytics | | Implement secure hybrid DMZ network in Azure | https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/dmz/secure-vnet-dmz | -| Deploy basic enterprise integration with Logic Apps and APIM | https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/enterprise-integration/basic-enterprise-integration | +| Deploy basic enterprise integration with Logic Apps and API Management | https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/enterprise-integration/basic-enterprise-integration | | Connect on-premises networks to Azure with ExpressRoute and VPN failover | https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/hybrid-networking/expressroute-vpn-failover | | Create AD DS resource forest in Azure for hybrid trust | https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/identity/adds-forest | | Extend AD FS to Azure for federated authentication | https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/identity/adfs | @@ -102,7 +101,7 @@ This skill requires **network access** to fetch documentation content: | Orchestrate MLOps pipelines with Azure Databricks | https://learn.microsoft.com/en-us/azure/architecture/ai-ml/idea/orchestrate-machine-learning-azure-databricks | | Design conversation analytics with Foundry Tools | https://learn.microsoft.com/en-us/azure/architecture/ai-ml/idea/unlock-insights-from-conversational-data | | Use Cosmos DB change feed for minimal-cost archival | 
https://learn.microsoft.com/en-us/azure/architecture/databases/idea/minimal-storage-change-feed-replicate-data | -| Load test Event Hubs and IoT Hub with custom JMeter plugins | https://learn.microsoft.com/en-us/azure/architecture/guide/testing/load-testing/load-testing-with-custom-plugins | +| Simulate device behaviors with Azure Load Testing and custom plugins | https://learn.microsoft.com/en-us/azure/architecture/guide/testing/load-testing/load-testing-with-custom-plugins | | Build real-time analytics using Azure Service Bus and Microsoft Fabric | https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/analytics-service-bus | | Design a modern analytics architecture with Azure Databricks | https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/azure-databricks-modern-analytics-architecture | | Build first security layer with core Azure security services | https://learn.microsoft.com/en-us/azure/architecture/solution-ideas/articles/azure-security-build-first-layer-defense | @@ -140,7 +139,7 @@ This skill requires **network access** to fetch documentation content: | Implement the Choreography pattern for workflows | https://learn.microsoft.com/en-us/azure/architecture/patterns/choreography | | Use the Circuit Breaker pattern for resilient calls | https://learn.microsoft.com/en-us/azure/architecture/patterns/circuit-breaker | | Apply the Claim-Check pattern for large messages | https://learn.microsoft.com/en-us/azure/architecture/patterns/claim-check | -| Use the Compensating Transaction pattern for rollback | https://learn.microsoft.com/en-us/azure/architecture/patterns/compensating-transaction | +| Apply the Compensating Transaction pattern in distributed workflows | https://learn.microsoft.com/en-us/azure/architecture/patterns/compensating-transaction | | Use the Competing Consumers pattern for scaling | https://learn.microsoft.com/en-us/azure/architecture/patterns/competing-consumers | | Consolidate compute with the 
Resource Consolidation pattern | https://learn.microsoft.com/en-us/azure/architecture/patterns/compute-resource-consolidation | | Apply the CQRS pattern to separate reads and writes | https://learn.microsoft.com/en-us/azure/architecture/patterns/cqrs | @@ -190,7 +189,8 @@ This skill requires **network access** to fetch documentation content: | Select Azure batch processing technologies for big data | https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/batch-processing | | Select big data storage technologies in Azure | https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/data-storage | | Choose an analytical data store in Microsoft Fabric | https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/fabric-analytical-data-stores | -| Choose Azure natural language processing technologies | https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/natural-language-processing | +| Select the right Microsoft Fabric deployment pattern | https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/fabric-deployment-patterns | +| Select Azure services for natural language processing workloads | https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/natural-language-processing | | Choose Azure data pipeline orchestration services | https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/pipeline-orchestration-data-movement | | Choose an Azure search data store technology | https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/search-options | | Compare Azure real-time stream processing services | https://learn.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/stream-processing | @@ -202,12 +202,11 @@ This skill requires **network access** to fetch documentation content: | Choose Azure hybrid hosting and connectivity options | 
https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/hybrid-considerations | | Select Azure load balancing services for applications | https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/load-balancing-overview | | Choose Azure asynchronous messaging services | https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/messaging | -| Compare Java application hosting options on Azure | https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/service-for-java-comparison | | Select the right Azure storage service | https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/storage-options | | Navigate Azure technology choice decision guides | https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/technology-choices-overview | | Select an Azure service for vector search | https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/vector-search | -| Choose Azure compute options for microservices | https://learn.microsoft.com/en-us/azure/architecture/microservices/design/compute-options | -| Choose compute platform options for microservices | https://learn.microsoft.com/en-us/azure/architecture/microservices/design/compute-options | +| Choose Azure compute platform for microservices workloads | https://learn.microsoft.com/en-us/azure/architecture/microservices/design/compute-options | +| Choose Azure compute platform for microservices workloads | https://learn.microsoft.com/en-us/azure/architecture/microservices/design/compute-options | | Evaluate Kubernetes edge compute options on Azure | https://learn.microsoft.com/en-us/azure/architecture/operator-guides/aks/choose-kubernetes-edge-compute-option | | Choose connectivity options for on-premises to Azure VNets | https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/hybrid-networking/ | @@ -221,6 +220,7 @@ This skill requires **network access** to 
fetch documentation content: | Adopt microservices architecture style on Azure | https://learn.microsoft.com/en-us/azure/architecture/guide/architecture-styles/microservices | | Implement N-tier architecture style on Azure | https://learn.microsoft.com/en-us/azure/architecture/guide/architecture-styles/n-tier | | Use Web-Queue-Worker architecture style on Azure | https://learn.microsoft.com/en-us/azure/architecture/guide/architecture-styles/web-queue-worker | +| Design and implement microservices architecture on Azure | https://learn.microsoft.com/en-us/azure/architecture/microservices/design/ | ### Best Practices | Topic | URL | @@ -267,11 +267,11 @@ This skill requires **network access** to fetch documentation content: | Use top-down triage practices for AKS operations | https://learn.microsoft.com/en-us/azure/architecture/operator-guides/aks/aks-triage-practices | | Apply day-2 operations practices for AKS clusters | https://learn.microsoft.com/en-us/azure/architecture/operator-guides/aks/day-2-operations-guide | | Troubleshoot networking issues in AKS clusters | https://learn.microsoft.com/en-us/azure/architecture/operator-guides/aks/troubleshoot-network-aks | -| Architect and optimize Event Hubs with Azure Functions | https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/event-hubs-functions | +| Architect Event Hubs integrations with Azure Functions | https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/event-hubs-functions | | Monitor Event Hubs and Azure Functions with Application Insights | https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/observability | -| Optimize performance and scale for Event Hubs-triggered Functions | https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/performance-scale | -| Design resilient Event Hubs and Azure Functions solutions | 
https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/resilient-design | -| Secure Azure Functions integrated with Event Hubs | https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/security | +| Optimize performance and scale for Event Hubs Functions | https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/performance-scale | +| Design resilient Event Hubs-triggered Azure Functions | https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/resilient-design | +| Secure Azure Functions that consume Event Hubs | https://learn.microsoft.com/en-us/azure/architecture/serverless/event-hubs-functions/security | | Protect APIs with Application Gateway and API Management | https://learn.microsoft.com/en-us/azure/architecture/web-apps/api-management/architectures/protect-apis | | Design multi-region Azure App Service for disaster recovery | https://learn.microsoft.com/en-us/azure/architecture/web-apps/guides/multi-region-app-service/multi-region-app-service | | Securely access App Service apps from on-premises networks | https://learn.microsoft.com/en-us/azure/architecture/web-apps/guides/networking/access-multitenant-web-app-from-on-premises | @@ -314,7 +314,7 @@ This skill requires **network access** to fetch documentation content: | Implement SCI-based sustainability scoring for Azure apps | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/apps/measure-azure-app-sustainability-sci-score | | Apply SRE principles to build scalable Azure API platforms | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/apps/scalable-apps-performance-modeling-site-reliability | | Implement multiregion BCDR for Azure Virtual Desktop | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/azure-virtual-desktop/azure-virtual-desktop-multi-region-bcdr | -| Automate certificate lifecycle management with nonintegrated CAs on 
Azure | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/certificate-lifecycle/ | +| Implement automated certificate lifecycle management on Azure | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/certificate-lifecycle/ | | Build a unified Azure data warehouse and analytics pipeline | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/data/data-warehouse | | Implement Esri ArcGIS Pro on Azure Virtual Desktop | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/data/esri-arcgis-azure-virtual-desktop | | Implement a greenfield Microsoft Fabric lakehouse platform | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/data/greenfield-lakehouse-fabric | @@ -331,7 +331,7 @@ This skill requires **network access** to fetch documentation content: | Implement Zero Trust network for web apps with Azure Firewall | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/gateway/application-gateway-before-azure-firewall | | Secure virtual networks with Azure Firewall and Application Gateway | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/gateway/firewall-application-gateway | | Operate AKS clusters using GitOps with Flux and Argo CD | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/gitops-aks/gitops-blueprint-aks | -| Apply AKS baseline architecture on Azure Local | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/hybrid/aks-baseline | +| Implement AKS baseline architecture on Azure Local | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/hybrid/aks-baseline | | Build GitOps pipelines for AKS on Azure Local with Arc | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/hybrid/aks-hybrid-azure-local | | Provide on-premises access to Azure Files with AD DS | 
https://learn.microsoft.com/en-us/azure/architecture/example-scenario/hybrid/azure-files-on-premises-authentication | | Secure hybrid messaging client access with MFA | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/hybrid/secure-hybrid-messaging-client | @@ -353,6 +353,7 @@ This skill requires **network access** to fetch documentation content: | Rehost IMS DC and IMS DB on Azure with Raincode IMSql | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/mainframe/rehost-ims-raincode-imsql | | Implement Siemens Teamcenter PLM baseline on Azure | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/manufacturing/teamcenter-baseline | | Build real-time monitoring and observability for media telemetry | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/monitoring/monitoring-observable-systems-media | +| Implement compliant Azure VM images with DevOps | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/security/virtual-machine-compliance | | Replatform Kubernetes microservices to Azure Container Apps | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/serverless/microservices-with-container-apps | | Build Dapr-based microservices on Azure Container Apps | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/serverless/microservices-with-container-apps-dapr | | Plan WSUS deployment to update isolated Azure Windows VMs | https://learn.microsoft.com/en-us/azure/architecture/example-scenario/wsus/ | diff --git a/skills/azure-artifacts/SKILL.md b/skills/azure-artifacts/SKILL.md index 0cc01981..04b5087a 100644 --- a/skills/azure-artifacts/SKILL.md +++ b/skills/azure-artifacts/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-artifacts -description: Expert knowledge for Azure Artifacts development including best practices, decision making, limits & quotas, security, configuration, integrations & coding patterns, and deployment. 
Use when managing feeds, upstream sources, package publishing/restore, GitHub Actions CI/CD, or npm/NuGet config, and other Azure Artifacts related development tasks. Not for Azure DevOps (use azure-devops), Azure Pipelines (use azure-pipelines), Azure Repos (use azure-repos), Azure Boards (use azure-boards). +description: Expert knowledge for Azure Artifacts development including best practices, decision making, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when managing feeds, upstream sources, views/promotion, retention, GitHub Actions CI/CD, or package tool auth, and other Azure Artifacts related development tasks. Not for Azure DevOps (use azure-devops), Azure Pipelines (use azure-pipelines), Azure Repos (use azure-repos), Azure Test Plans (use azure-test-plans). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-12" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Artifacts Skill @@ -29,7 +29,7 @@ This skill requires **network access** to fetch documentation content: | Limits & Quotas | L49-L56 | Storage quotas, free allocation, and per-package size/count limits in Azure Artifacts, plus how to monitor, manage, and publish packages within those limits. | | Security | L57-L63 | Securing Azure Artifacts feeds: configuring permissions, protecting upstream sources from malicious packages, and using npm audit to find and fix vulnerabilities. | | Configuration | L64-L74 | Configuring Azure Artifacts feeds: views/promotion, retention/deletion, upstream sources, npm/.npmrc and scopes, and .artifactignore for optimizing pipeline artifacts. | -| Integrations & Coding Patterns | L75-L117 | How to connect build tools and CLIs (Cargo, Maven, Gradle, npm, NuGet, Python, PowerShell, Universal) to Azure Artifacts feeds, publish/restore packages, and use upstream sources. 
| +| Integrations & Coding Patterns | L75-L117 | How to connect build tools (NuGet, npm, Maven, Gradle, Cargo, Python, PowerShell) to Azure Artifacts, publish/restore packages, use upstream sources, and debug with symbols. | | Deployment | L118-L121 | Using GitHub Actions to build and push packages (NuGet, npm, etc.) to Azure Artifacts feeds, including workflow setup, authentication, and CI/CD integration. | ### Best Practices @@ -83,7 +83,7 @@ This skill requires **network access** to fetch documentation content: | Publish and restore Maven packages with Azure Artifacts | https://learn.microsoft.com/en-us/azure/devops/artifacts/get-started-maven?view=azure-devops | | Publish and consume npm packages using Azure Artifacts | https://learn.microsoft.com/en-us/azure/devops/artifacts/get-started-npm?view=azure-devops | | Publish and download NuGet packages with Azure Artifacts | https://learn.microsoft.com/en-us/azure/devops/artifacts/get-started-nuget?view=azure-devops | -| Use Google Maven Repository as Azure Artifacts upstream | https://learn.microsoft.com/en-us/azure/devops/artifacts/maven/google-maven?view=azure-devops | +| Consume Google Maven packages via Azure Artifacts upstream | https://learn.microsoft.com/en-us/azure/devops/artifacts/maven/google-maven?view=azure-devops | | Add Gradle Plugins repository as Azure Artifacts upstream | https://learn.microsoft.com/en-us/azure/devops/artifacts/maven/gradle-plugins?view=azure-devops | | Configure Maven to restore packages from Azure Artifacts | https://learn.microsoft.com/en-us/azure/devops/artifacts/maven/install?view=azure-devops | | Configure JitPack as an Azure Artifacts upstream source | https://learn.microsoft.com/en-us/azure/devops/artifacts/maven/jitpack-upstream?view=azure-devops | diff --git a/skills/azure-automation/SKILL.md b/skills/azure-automation/SKILL.md index 418eea73..e40fb4d3 100644 --- a/skills/azure-automation/SKILL.md +++ b/skills/azure-automation/SKILL.md @@ -1,9 +1,9 @@ --- name: 
azure-automation -description: Expert knowledge for Azure Automation development including troubleshooting, best practices, decision making, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when building runbooks, DSC/State Configuration, Hybrid Runbook Workers, webhooks/identities auth, or Azure/AWS/SQL integrations, and other Azure Automation related development tasks. Not for Azure Functions (use azure-functions), Azure Logic Apps (use azure-logic-apps), Azure Scheduler (use azure-scheduler), Azure Update Manager (use azure-update-manager). +description: Expert knowledge for Azure Automation development including troubleshooting, best practices, decision making, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when building runbooks, Hybrid Runbook Workers, DSC/State Configuration, webhooks/integrations, or managed identity auth, and other Azure Automation related development tasks. Not for Azure Functions (use azure-functions), Azure Logic Apps (use azure-logic-apps), Azure Scheduler (use azure-scheduler), Azure DevOps (use azure-devops). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Automation Skill @@ -24,14 +24,14 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| -| Troubleshooting | L36-L46 | Diagnosing and fixing Azure Automation issues: DSC/State Configuration, Hybrid Runbook Workers (agent/extension), managed identities, runbook failures, shared resources, and collecting support diagnostics. 
| -| Best Practices | L47-L56 | Best practices for structuring, chaining, and managing runbooks, handling errors and output streams, ensuring resilient execution, and avoiding context-switching issues in Azure Automation. | -| Decision Making | L57-L65 | Guidance on choosing Azure Automation runbook types and planning migrations (Orchestrator, Log Analytics agent, Hybrid workers, Run As accounts, AzureRM→Az, and agent-to-extension changes). | -| Limits & Quotas | L66-L72 | Limits, quotas, and version/support details for Azure Automation: DSC extension changes, Automation resource limits, subscription quotas, and Change Tracking/Inventory support with AMA. | -| Security | L73-L93 | Securing Automation accounts: identities, RBAC, auth methods, encryption, certificates/credentials, private endpoints, and policy/compliance for secure runbooks and access. | -| Configuration | L94-L134 | Configuring and managing Azure Automation runbooks, DSC/State Configuration, Hybrid Runbook Workers, alerts, schedules, source control, policies, packages, and deployment/runtime settings. | -| Integrations & Coding Patterns | L135-L147 | Integrating Automation runbooks with Azure/AWS/Office 365/SQL, authenticating via identities/webhooks, deploying ARM, sending logs to Monitor, and emailing via SendGrid | -| Deployment | L148-L157 | Guides for deploying resilient Automation accounts, setting up DR and continuous deployment, and installing/migrating Windows/Linux Hybrid Runbook Workers and agents. | +| Troubleshooting | L36-L46 | Diagnosing and fixing Azure Automation issues: runbook failures, DSC problems, Hybrid Runbook Worker (agent/extension), managed identity/auth errors, shared resources, and collecting support diagnostics. | +| Best Practices | L47-L57 | Designing robust runbooks: modular parent-child patterns, Hybrid Runbook Workers, error handling, output/log streams, context switching issues, and managing/migrating PowerShell modules. 
| +| Decision Making | L58-L66 | Guidance on choosing runbook types and planning migrations: Orchestrator to Automation, Log Analytics to AMA, agent to extension workers, and Run As accounts to managed identities. | +| Limits & Quotas | L67-L73 | Azure Automation limits per subscription/account, how to view/manage quotas, and AMA-based Change Tracking & Inventory support/scale constraints. | +| Security | L74-L93 | Securing Automation accounts: identities, RBAC, auth methods, encryption, certificates/credentials, Private Link, Azure Policy, and Terraform-based secure provisioning. | +| Configuration | L94-L135 | Configuring Azure Automation runbooks and State Configuration: DSC compilation and remediation, Hybrid Runbook Workers, schedules, alerts, policies, networking, packages, source control, and deployment. | +| Integrations & Coding Patterns | L136-L148 | Patterns for integrating runbooks with AWS, ARM templates, webhooks, email (SendGrid), Azure Monitor, Office 365, Azure SQL, and using the graphical runbook SDK and managed identity. | +| Deployment | L149-L157 | Guides for deploying resilient Automation accounts, setting up DR, and installing/configuring Hybrid Runbook Workers (Windows/Linux, agent- and extension-based) and DSC/Chocolatey CD. 
| ### Troubleshooting | Topic | URL | @@ -40,19 +40,20 @@ This skill requires **network access** to fetch documentation content: | Troubleshoot Azure Automation State Configuration problems | https://learn.microsoft.com/en-us/azure/automation/troubleshoot/desired-state-configuration | | Troubleshoot extension-based Hybrid Runbook Worker issues | https://learn.microsoft.com/en-us/azure/automation/troubleshoot/extension-based-hybrid-runbook-worker | | Troubleshoot agent-based Hybrid Runbook Worker issues | https://learn.microsoft.com/en-us/azure/automation/troubleshoot/hybrid-runbook-worker | -| Troubleshoot managed identity issues in Azure Automation | https://learn.microsoft.com/en-us/azure/automation/troubleshoot/managed-identity | -| Troubleshoot Azure Automation runbook execution issues | https://learn.microsoft.com/en-us/azure/automation/troubleshoot/runbooks | +| Troubleshoot managed identity issues in Azure Automation accounts | https://learn.microsoft.com/en-us/azure/automation/troubleshoot/managed-identity | +| Diagnose and fix Azure Automation runbook execution issues | https://learn.microsoft.com/en-us/azure/automation/troubleshoot/runbooks | | Troubleshoot Azure Automation shared resource problems | https://learn.microsoft.com/en-us/azure/automation/troubleshoot/shared-resources | ### Best Practices | Topic | URL | |-------|-----| | Design modular parent-child runbooks in Azure Automation | https://learn.microsoft.com/en-us/azure/automation/automation-child-runbooks | -| Design resilient Azure Automation runbook execution behavior | https://learn.microsoft.com/en-us/azure/automation/automation-runbook-execution | +| Run and manage runbooks on Hybrid Runbook Workers | https://learn.microsoft.com/en-us/azure/automation/automation-hrw-run-runbooks | | Implement error handling in Azure Automation graphical runbooks | https://learn.microsoft.com/en-us/azure/automation/automation-runbook-graphical-error-handling | | Configure output and message streams in 
Azure Automation runbooks | https://learn.microsoft.com/en-us/azure/automation/automation-runbook-output-and-messages | -| Avoid Azure Automation runbook issues from context switching | https://learn.microsoft.com/en-us/azure/automation/context-switching | +| Avoid context switching issues in Azure Automation runbooks | https://learn.microsoft.com/en-us/azure/automation/context-switching | | Manage Azure Automation runbooks with recommended design patterns | https://learn.microsoft.com/en-us/azure/automation/manage-runbooks | +| Manage and migrate PowerShell modules in Azure Automation | https://learn.microsoft.com/en-us/azure/automation/shared-resources/modules | ### Decision Making | Topic | URL | @@ -60,8 +61,8 @@ This skill requires **network access** to fetch documentation content: | Migrate System Center Orchestrator runbooks to Azure Automation | https://learn.microsoft.com/en-us/azure/automation/automation-orchestrator-migration | | Choose appropriate Azure Automation runbook types | https://learn.microsoft.com/en-us/azure/automation/automation-runbook-types | | Migrate Change Tracking from Log Analytics agent to AMA | https://learn.microsoft.com/en-us/azure/automation/change-tracking/guidance-migration-log-analytics-monitoring-agent | +| Migrate agent-based Hybrid Runbook Workers to extension-based workers | https://learn.microsoft.com/en-us/azure/automation/migrate-existing-agent-based-hybrid-worker-to-extension-based-workers | | Plan and execute migration from Run As to managed identities | https://learn.microsoft.com/en-us/azure/automation/migrate-run-as-accounts-managed-identity | -| Plan migration from AzureRM to Az modules in Azure Automation | https://learn.microsoft.com/en-us/azure/automation/shared-resources/modules | ### Limits & Quotas | Topic | URL | @@ -81,10 +82,9 @@ This skill requires **network access** to fetch documentation content: | Configure authentication methods for Azure Automation accounts | 
https://learn.microsoft.com/en-us/azure/automation/automation-security-overview | | Configure Microsoft Entra ID authentication for Azure Automation | https://learn.microsoft.com/en-us/azure/automation/automation-use-azure-ad | | Disable local authentication for Azure Automation securely | https://learn.microsoft.com/en-us/azure/automation/disable-local-authentication | -| Disable system-assigned managed identity on Automation accounts | https://learn.microsoft.com/en-us/azure/automation/disable-managed-identity-for-automation | +| Disable system-assigned managed identity in Azure Automation | https://learn.microsoft.com/en-us/azure/automation/disable-managed-identity-for-automation | | Enable system-assigned managed identity for Azure Automation | https://learn.microsoft.com/en-us/azure/automation/enable-managed-identity-for-automation | | Secure Azure Automation access with Private Link and private endpoints | https://learn.microsoft.com/en-us/azure/automation/how-to/private-link-security | -| Use managed identity in Azure Automation PowerShell runbooks | https://learn.microsoft.com/en-us/azure/automation/learn/powershell-runbook-managed-identity | | Provision Automation account and Reader role via Terraform | https://learn.microsoft.com/en-us/azure/automation/quickstarts/create-azure-automation-account-terraform | | Enable managed identities for Azure Automation accounts | https://learn.microsoft.com/en-us/azure/automation/quickstarts/enable-managed-identity | | Apply Azure Policy compliance controls to Automation | https://learn.microsoft.com/en-us/azure/automation/security-controls-policy | @@ -97,7 +97,7 @@ This skill requires **network access** to fetch documentation content: | Configure metric alerts for Azure Automation runbooks | https://learn.microsoft.com/en-us/azure/automation/automation-alert-metric | | Configure and use connection assets in Azure Automation | https://learn.microsoft.com/en-us/azure/automation/automation-connections | | Trigger Azure 
Automation runbooks from Azure Monitor alerts | https://learn.microsoft.com/en-us/azure/automation/automation-create-alert-triggered-runbook | -| Compile DSC configurations in Azure Automation | https://learn.microsoft.com/en-us/azure/automation/automation-dsc-compile | +| Compile DSC configurations in Azure Automation State Configuration | https://learn.microsoft.com/en-us/azure/automation/automation-dsc-compile | | Configure DSC data at scale in Azure Automation | https://learn.microsoft.com/en-us/azure/automation/automation-dsc-config-data-at-scale | | Generate DSC configurations from existing servers | https://learn.microsoft.com/en-us/azure/automation/automation-dsc-config-from-server | | Configure STIG-based DSC data in Azure Automation | https://learn.microsoft.com/en-us/azure/automation/automation-dsc-configuration-based-on-stig | @@ -105,7 +105,7 @@ This skill requires **network access** to fetch documentation content: | Send State Configuration data to Azure Monitor Logs | https://learn.microsoft.com/en-us/azure/automation/automation-dsc-diagnostics | | Use Azure DSC extension version history for configuration | https://learn.microsoft.com/en-us/azure/automation/automation-dsc-extension-history | | Perform common Azure Automation State Configuration tasks | https://learn.microsoft.com/en-us/azure/automation/automation-dsc-getting-started | -| Onboard machines to Azure Automation State Configuration | https://learn.microsoft.com/en-us/azure/automation/automation-dsc-onboarding | +| Enable and onboard machines to Azure Automation State Configuration | https://learn.microsoft.com/en-us/azure/automation/automation-dsc-onboarding | | Remediate noncompliant servers with State Configuration | https://learn.microsoft.com/en-us/azure/automation/automation-dsc-remediate | | Use the Azure Automation textual editor for PowerShell runbooks | https://learn.microsoft.com/en-us/azure/automation/automation-edit-textual-runbook | | Author and configure graphical runbooks 
in Azure Automation | https://learn.microsoft.com/en-us/azure/automation/automation-graphical-authoring-intro | @@ -113,7 +113,7 @@ This skill requires **network access** to fetch documentation content: | Configure network requirements for Azure Automation components | https://learn.microsoft.com/en-us/azure/automation/automation-network-configuration | | Author and manage Automation runbooks using VS Code | https://learn.microsoft.com/en-us/azure/automation/automation-runbook-authoring | | Create watcher tasks to track file updates in Automation | https://learn.microsoft.com/en-us/azure/automation/automation-scenario-using-watcher-task | -| Update and manage Azure PowerShell modules in Automation accounts | https://learn.microsoft.com/en-us/azure/automation/automation-update-azure-modules | +| Update built-in Azure PowerShell modules in Automation accounts | https://learn.microsoft.com/en-us/azure/automation/automation-update-azure-modules | | Compose DSC configurations using composite resources | https://learn.microsoft.com/en-us/azure/automation/compose-configurationwithcompositeresources | | Enforce Hybrid Runbook Worker job execution via policy | https://learn.microsoft.com/en-us/azure/automation/enforce-job-execution-hybrid-worker | | Configure Azure Automation regional DNS records for firewalled networks | https://learn.microsoft.com/en-us/azure/automation/how-to/automation-region-dns-records | @@ -129,6 +129,7 @@ This skill requires **network access** to fetch documentation content: | Configure Azure Automation source control integration | https://learn.microsoft.com/en-us/azure/automation/source-control-integration | | Choose methods to start Azure Automation runbooks | https://learn.microsoft.com/en-us/azure/automation/start-runbooks | | Remove DSC configuration and unregister Automation node | https://learn.microsoft.com/en-us/azure/automation/state-configuration/remove-node-and-configuration-package | +| Configure machines to desired state with Azure 
Automation | https://learn.microsoft.com/en-us/azure/automation/tutorial-configure-servers-desired-state | | Enable Change Tracking and Inventory at scale via Machines pane | https://learn.microsoft.com/en-us/azure/azure-change-tracking-inventory/enable-change-tracking-at-scale-machines-blade | | Enable Change Tracking and Inventory at scale with Azure Policy | https://learn.microsoft.com/en-us/azure/azure-change-tracking-inventory/enable-change-tracking-at-scale-policy | @@ -140,7 +141,7 @@ This skill requires **network access** to fetch documentation content: | Forward Azure Automation job logs to Azure Monitor | https://learn.microsoft.com/en-us/azure/automation/automation-manage-send-joblogs-log-analytics | | Provision AWS virtual machines using Azure Automation runbooks | https://learn.microsoft.com/en-us/azure/automation/automation-scenario-aws-deployment | | Send email from Azure Automation runbook using SendGrid | https://learn.microsoft.com/en-us/azure/automation/automation-send-email | -| Trigger Azure Automation runbooks via webhooks from external services | https://learn.microsoft.com/en-us/azure/automation/automation-webhooks | +| Trigger Azure Automation runbooks using webhooks | https://learn.microsoft.com/en-us/azure/automation/automation-webhooks | | Use the Azure Automation graphical runbook SDK | https://learn.microsoft.com/en-us/azure/automation/graphical-runbook-sdk | | Manage Office 365 services with Azure Automation | https://learn.microsoft.com/en-us/azure/automation/manage-office-365 | | Manage Azure SQL databases using Automation managed identity | https://learn.microsoft.com/en-us/azure/automation/manage-sql-server-in-automation | @@ -153,5 +154,4 @@ This skill requires **network access** to fetch documentation content: | Set up continuous deployment with DSC and Chocolatey | https://learn.microsoft.com/en-us/azure/automation/automation-dsc-cd-chocolatey | | Deploy Linux Hybrid Runbook Worker agent | 
https://learn.microsoft.com/en-us/azure/automation/automation-linux-hrw-install | | Deploy agent-based Windows Hybrid Runbook Workers in Azure Automation | https://learn.microsoft.com/en-us/azure/automation/automation-windows-hrw-install | -| Deploy extension-based Hybrid Runbook Workers for Windows and Linux | https://learn.microsoft.com/en-us/azure/automation/extension-based-hybrid-runbook-worker-install | -| Migrate Azure Automation hybrid workers to extension-based | https://learn.microsoft.com/en-us/azure/automation/migrate-existing-agent-based-hybrid-worker-to-extension-based-workers | \ No newline at end of file +| Deploy extension-based Hybrid Runbook Workers on Windows and Linux | https://learn.microsoft.com/en-us/azure/automation/extension-based-hybrid-runbook-worker-install | \ No newline at end of file diff --git a/skills/azure-backup/SKILL.md b/skills/azure-backup/SKILL.md index ccf24e0e..1a49255c 100644 --- a/skills/azure-backup/SKILL.md +++ b/skills/azure-backup/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-backup -description: Expert knowledge for Azure Backup development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when backing up Azure VMs, SQL/SAP HANA/PostgreSQL, Files/Blobs/Disks, configuring vaults/policies, or using CLI/PowerShell/REST, and other Azure Backup related development tasks. Not for Azure Site Recovery (use azure-site-recovery), Azure Virtual Machines (use azure-virtual-machines), Azure Blob Storage (use azure-blob-storage), Azure Files (use azure-files). +description: Expert knowledge for Azure Backup development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. 
Use when protecting Azure VMs, SQL/SAP HANA/PostgreSQL, Files/Blobs/Disks, AKS, or MABS/DPM via CLI/PowerShell/REST, and other Azure Backup related development tasks. Not for Azure Site Recovery (use azure-site-recovery). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Backup Skill @@ -24,15 +24,15 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| -| Troubleshooting | L37-L67 | Diagnosing and fixing Azure Backup errors across VMs, databases (SQL, PostgreSQL, MySQL, SAP), files, disks, AKS, MARS/MABS/DPM, vault/agent issues, and slow or failed backups/restores. | -| Best Practices | L68-L78 | Best practices for Azure Backup Server, DPM, SQL (incl. Always On), and Azure VMs, covering safe configuration, TRIM handling, and product-specific backup and recovery guidance. | -| Decision Making | L79-L85 | Guidance on estimating Azure Backup costs, checking supported VM image SKUs for backup policies, and deciding how to migrate classic backup alerts to Azure Monitor alerts. | -| Architecture & Design Patterns | L86-L90 | Azure Backup’s architecture for protecting SAP HANA: components, data flow, backup/restore process, scalability, security, and integration with Azure storage and recovery services. | -| Limits & Quotas | L91-L119 | Backup support matrices, regional availability, limits, retention, and performance/behavior details for Azure Backup across VMs, databases, disks, files, blobs, SAP, and monitoring/reporting. | -| Security | L120-L160 | Security, encryption, access control, soft delete, immutability, private endpoints, MUA/Resource Guard, and Azure Policy for protecting and restoring Azure Backup data. 
| -| Configuration | L161-L238 | Configuring Azure Backup and restore: vaults, policies, agents, scripts/APIs, monitoring/alerts, and workload-specific setup for VMs, AKS, Files, Blobs, disks, SQL, SAP HANA, PostgreSQL, and AD. | -| Integrations & Coding Patterns | L239-L294 | End-to-end scripting patterns for configuring, running, monitoring, and restoring Azure Backup across VMs, SQL, PostgreSQL, SAP HANA, Files, Blobs, Disks, and on-premises using CLI, PowerShell, REST, and Logic Apps. | -| Deployment | L295-L301 | MABS v3/v4 deployment details: supported workload/protection matrices and how to automate unattended/silent installation of Azure Backup Server v4. | +| Troubleshooting | L37-L67 | Diagnosing and fixing Azure Backup failures and errors across VMs, disks, databases (SQL, PostgreSQL, MySQL, SAP), files, AKS, MABS/DPM, agents, vaults, and restore/recovery issues. | +| Best Practices | L68-L79 | Best practices for configuring and optimizing Azure Backup/MABS/DPM for Hyper-V VMs, SQL (incl. Always On), Exchange, TRIM-aware backups, and safe restore/recovery across vaults. | +| Decision Making | L80-L86 | Guidance on estimating Azure Backup costs, checking supported VM image SKUs for backup policies, and deciding how to migrate classic backup alerts to Azure Monitor alerts. | +| Architecture & Design Patterns | L87-L91 | Azure Backup’s architecture for protecting SAP HANA: components, data flow, backup/restore process, scalability, security, and integration with Azure storage and recovery services. | +| Limits & Quotas | L92-L121 | Backup support matrices, regional support, limits, retention, performance, and behavioral constraints for Azure Backup across VMs, databases, disks, files, blobs, AKS, SAP, and monitoring. | +| Security | L122-L162 | Security, encryption, access control, soft delete, immutability, private endpoints, MUA/Resource Guard, and Azure Policy for protecting and restoring Azure Backup data. 
| +| Configuration | L163-L237 | Configuring Azure Backup and restore: vaults, policies, agents, scripts/APIs, monitoring/alerts, and workload-specific setup for VMs, AKS, Files, Blobs, disks, SQL, SAP HANA, PostgreSQL, and AD. | +| Integrations & Coding Patterns | L238-L293 | End-to-end scripting patterns for configuring, running, monitoring, and restoring Azure Backup across VMs, SQL, PostgreSQL, SAP HANA, Files, Blobs, Disks, and on-premises using CLI, PowerShell, REST, and Logic Apps. | +| Deployment | L294-L300 | MABS v3/v4 deployment details: supported workload/protection matrices and how to automate unattended/silent installation of Azure Backup Server v4. | ### Troubleshooting | Topic | URL | @@ -56,7 +56,7 @@ This skill requires **network access** to fetch documentation content: | Resolve Azure Backup agent and extension failures | https://learn.microsoft.com/en-us/azure/backup/backup-azure-troubleshoot-vm-backup-fails-snapshot-timeout | | Troubleshoot Azure VM file-level recovery issues | https://learn.microsoft.com/en-us/azure/backup/backup-azure-vm-file-recovery-troubleshoot | | Fix Azure VM backup and restore errors | https://learn.microsoft.com/en-us/azure/backup/backup-azure-vms-troubleshoot | -| Resolve known issues in Microsoft Azure Backup Server v3 | https://learn.microsoft.com/en-us/azure/backup/backup-mabs-release-notes-v3 | +| Troubleshoot known issues in MABS v3 | https://learn.microsoft.com/en-us/azure/backup/backup-mabs-release-notes-v3 | | Troubleshoot SQL Server backups on Azure VMs using Azure Backup | https://learn.microsoft.com/en-us/azure/backup/backup-sql-server-azure-troubleshoot | | Troubleshoot Backup vault management errors | https://learn.microsoft.com/en-us/azure/backup/backup-vault-troubleshoot | | Fix backup and restore failures in Azure Disk Backup | https://learn.microsoft.com/en-us/azure/backup/disk-backup-troubleshoot | @@ -68,6 +68,7 @@ This skill requires **network access** to fetch documentation content: ### Best 
Practices | Topic | URL | |-------|-----| +| Back up and restore Hyper-V VMs with MABS | https://learn.microsoft.com/en-us/azure/backup/back-up-hyper-v-virtual-machines-mabs | | Recover Azure Backup Server data from any vault-registered server | https://learn.microsoft.com/en-us/azure/backup/backup-azure-alternate-dpm-server | | Configure DPM to back up Exchange to Azure safely | https://learn.microsoft.com/en-us/azure/backup/backup-azure-backup-exchange-server | | Back up SQL Server to Azure via DPM with TRIM handling | https://learn.microsoft.com/en-us/azure/backup/backup-azure-backup-sql | @@ -100,13 +101,14 @@ This skill requires **network access** to fetch documentation content: | Overview and retention limits for Azure PostgreSQL backups | https://learn.microsoft.com/en-us/azure/backup/backup-azure-database-postgresql-overview | | Azure Backup behavior for PostgreSQL servers | https://learn.microsoft.com/en-us/azure/backup/backup-azure-database-postgresql-server-faq | | Support matrix for PostgreSQL single server backup | https://learn.microsoft.com/en-us/azure/backup/backup-azure-database-postgresql-support-matrix | +| MARS agent backup limits and behavioral constraints | https://learn.microsoft.com/en-us/azure/backup/backup-azure-file-folder-backup-faq | | Support matrix for MySQL Flexible Server long-term backup | https://learn.microsoft.com/en-us/azure/backup/backup-azure-mysql-flexible-server-support-matrix | | Azure VM Backup FAQ limits and behaviors | https://learn.microsoft.com/en-us/azure/backup/backup-azure-vm-backup-faq | | Check Backup center workload support and limitations | https://learn.microsoft.com/en-us/azure/backup/backup-center-support-matrix | | Instant Restore performance limits for Azure VM backups | https://learn.microsoft.com/en-us/azure/backup/backup-instant-restore-capability | -| Review global Azure Backup support settings and limits | https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix | -| Support matrix and 
limitations for Azure VM backups | https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix-iaas | -| Review MABS and DPM backup support and limits | https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix-mabs-dpm | +| Review Azure Backup support settings and limits | https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix | +| Review Azure VM backup support limits and matrix | https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix-iaas | +| Check MABS and DPM Azure Backup support limits | https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix-mabs-dpm | | Check MARS agent backup support and limitations | https://learn.microsoft.com/en-us/azure/backup/backup-support-matrix-mars-agent | | Review Azure Blob backup support matrix and limits | https://learn.microsoft.com/en-us/azure/backup/blob-backup-support-matrix | | Check Azure Disk Backup support and limitations | https://learn.microsoft.com/en-us/azure/backup/disk-backup-support-matrix | @@ -171,7 +173,6 @@ This skill requires **network access** to fetch documentation content: | Configure AKS backup extension and trusted access | https://learn.microsoft.com/en-us/azure/backup/azure-kubernetes-service-cluster-manage-backups | | Restore AKS clusters and volumes using Azure Backup | https://learn.microsoft.com/en-us/azure/backup/azure-kubernetes-service-cluster-restore | | Configure Azure Backup vault diagnostics via Azure Policy | https://learn.microsoft.com/en-us/azure/backup/azure-policy-configure-diagnostics | -| Restore Azure VMs from Recovery Services vaults using portal | https://learn.microsoft.com/en-us/azure/backup/backup-azure-arm-restore-vms | | Auto-enable VM backups using Azure Policy | https://learn.microsoft.com/en-us/azure/backup/backup-azure-auto-enable-backup | | Configure MARS offline seeding with Azure Import/Export | https://learn.microsoft.com/en-us/azure/backup/backup-azure-backup-import-export | | Configure DPM and MABS 
offline seeding with Import/Export | https://learn.microsoft.com/en-us/azure/backup/backup-azure-backup-server-import-export | @@ -193,7 +194,6 @@ This skill requires **network access** to fetch documentation content: | Configure built-in monitoring for Azure Backup workloads | https://learn.microsoft.com/en-us/azure/backup/backup-azure-monitoring-built-in-monitor | | Configure Azure Monitor Logs and custom alerts for Azure Backup | https://learn.microsoft.com/en-us/azure/backup/backup-azure-monitoring-use-azuremonitor | | Use resource-specific diagnostic data model for Azure Backup | https://learn.microsoft.com/en-us/azure/backup/backup-azure-reports-data-model | -| Restore Windows Server system state from Azure Backup | https://learn.microsoft.com/en-us/azure/backup/backup-azure-restore-system-state | | Configure agentless multidisk crash-consistent backups for Azure VMs | https://learn.microsoft.com/en-us/azure/backup/backup-azure-vms-agentless-multi-disk-crash-consistent | | Configure agentless crash-consistent backups for Azure VMs | https://learn.microsoft.com/en-us/azure/backup/backup-azure-vms-agentless-multi-disk-crash-consistent-overview | | Configure Enhanced policy for Azure VM backups | https://learn.microsoft.com/en-us/azure/backup/backup-azure-vms-enhanced-policy | @@ -210,7 +210,6 @@ This skill requires **network access** to fetch documentation content: | Audit and enforce Managed Disks backup with Azure Policy | https://learn.microsoft.com/en-us/azure/backup/backup-managed-disks-policy | | Query Azure Backup logs using system functions | https://learn.microsoft.com/en-us/azure/backup/backup-reports-system-functions | | Use ARM and Bicep templates for Azure Backup | https://learn.microsoft.com/en-us/azure/backup/backup-rm-template-samples | -| Configure Windows backups using the MARS agent | https://learn.microsoft.com/en-us/azure/backup/backup-windows-with-mars-agent | | Configure operational and vaulted backups for Azure Blobs | 
https://learn.microsoft.com/en-us/azure/backup/blob-backup-configure-manage | | Configure Azure Backup reporting with Log Analytics and workbooks | https://learn.microsoft.com/en-us/azure/backup/configure-reports | | Create and delete Azure Backup vaults for newer workloads | https://learn.microsoft.com/en-us/azure/backup/create-manage-backup-vault | diff --git a/skills/azure-batch/SKILL.md b/skills/azure-batch/SKILL.md index 05ba712a..8342044c 100644 --- a/skills/azure-batch/SKILL.md +++ b/skills/azure-batch/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-batch -description: Expert knowledge for Azure Batch development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring Batch pools/tasks, autoscale, Spot VMs, CI/CD job deployment, or render/MPI workloads, and other Azure Batch related development tasks. Not for Azure HDInsight (use azure-hdinsight), Azure Databricks (use azure-databricks), Azure Kubernetes Service (AKS) (use azure-kubernetes-service), Azure Virtual Machines (use azure-virtual-machines). +description: Expert knowledge for Azure Batch development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring Batch pools/tasks, autoscale, private networking, CI/CD deployment, or SDK-based job orchestration, and other Azure Batch related development tasks. Not for Azure HDInsight (use azure-hdinsight), Azure Databricks (use azure-databricks), Azure Machine Learning (use azure-machine-learning), Azure Virtual Machines (use azure-virtual-machines). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. 
metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Batch Skill @@ -26,11 +26,11 @@ This skill requires **network access** to fetch documentation content: |----------|-------|-------------| | Troubleshooting | L37-L43 | Diagnosing, interpreting, and fixing Azure Batch job, task, pool, and node errors, including error codes, failure patterns, and recommended recovery/handling strategies. | | Best Practices | L44-L57 | Performance, scaling, scheduling, security, and data/output best practices for designing, monitoring, and optimizing large or specialized Azure Batch workloads (MPI, rendering, high task counts). | -| Decision Making | L58-L68 | Guidance on choosing VM sizes/images, using Spot VMs, managing Batch costs, and deciding/migrating between custom image options, Compute Gallery, and simplified node communication. | +| Decision Making | L58-L68 | Guidance on choosing VM sizes/images, using Spot VMs, managing Batch costs, and planning/migrating custom image pools and node communication features. | | Architecture & Design Patterns | L69-L74 | Architectures and best practices for bursting on-prem render farms to Azure Batch, including storage layout, data movement patterns, and performance-optimized rendering workflows. | | Limits & Quotas | L75-L79 | Batch account limits (cores, pools, nodes, jobs), default and regional quotas, how to view current usage, request quota increases, and plan deployments within these constraints | | Security | L80-L98 | Securing Batch accounts and pools: auth with Entra ID/managed identities, keys and CMK encryption, RBAC and policy, private endpoints/network perimeters, Key Vault access, and certificate/key rotation. 
| -| Configuration | L99-L137 | Configuring Batch pools, tasks, networking, containers, autoscale, OS/images, filesystems, and setting up monitoring, diagnostics events, alerts, and metrics/logs | +| Configuration | L99-L137 | Configuring Batch pools, tasks, networking, containers, autoscale, monitoring/diagnostics, and storage mounts, plus reference material for events, metrics, and management via SDK. | | Integrations & Coding Patterns | L138-L148 | Using Azure Batch programmatically and via CLI/PowerShell: SDK patterns (JavaScript, .NET, Linux workloads), storing task output in Storage, and adding telemetry with Application Insights. | | Deployment | L149-L153 | Deploying Azure Batch workloads using Azure Pipelines and CLI templates, including end-to-end job setup, automation, and integration into CI/CD workflows. | @@ -62,7 +62,7 @@ This skill requires **network access** to fetch documentation content: | Choose and migrate custom image options for Batch pools | https://learn.microsoft.com/en-us/azure/batch/batch-custom-images | | Select compute-intensive and GPU VM sizes for Batch | https://learn.microsoft.com/en-us/azure/batch/batch-pool-compute-intensive-sizes | | Choose Azure Batch VM sizes and images | https://learn.microsoft.com/en-us/azure/batch/batch-pool-vm-sizes | -| Migrate Batch pools to simplified node communication | https://learn.microsoft.com/en-us/azure/batch/batch-pools-to-simplified-compute-node-communication-model-migration-guide | +| Plan and execute migration to Azure Batch simplified node communication | https://learn.microsoft.com/en-us/azure/batch/batch-pools-to-simplified-compute-node-communication-model-migration-guide | | Decide when to run Azure Batch on Spot VMs | https://learn.microsoft.com/en-us/azure/batch/batch-spot-vms | | Plan and manage Azure Batch costs effectively | https://learn.microsoft.com/en-us/azure/batch/plan-to-manage-costs | @@ -131,8 +131,8 @@ This skill requires **network access** to fetch documentation content: | 
Configure remote access endpoints for Azure Batch pools | https://learn.microsoft.com/en-us/azure/batch/pool-endpoint-configuration | | Mount Azure Files shares on Azure Batch compute nodes | https://learn.microsoft.com/en-us/azure/batch/pool-file-shares | | Create and use Azure Batch resource files | https://learn.microsoft.com/en-us/azure/batch/resource-files | -| Enable simplified compute node communication in Azure Batch | https://learn.microsoft.com/en-us/azure/batch/simplified-compute-node-communication | -| Create simplified communication Batch pools without public IPs | https://learn.microsoft.com/en-us/azure/batch/simplified-node-communication-pool-no-public-ip | +| Configure simplified compute node communication in Azure Batch | https://learn.microsoft.com/en-us/azure/batch/simplified-compute-node-communication | +| Configure Azure Batch pools without public IP addresses | https://learn.microsoft.com/en-us/azure/batch/simplified-node-communication-pool-no-public-ip | | Mount virtual file systems on Azure Batch pool nodes | https://learn.microsoft.com/en-us/azure/batch/virtual-file-mount | ### Integrations & Coding Patterns diff --git a/skills/azure-blob-storage/SKILL.md b/skills/azure-blob-storage/SKILL.md index a41540bd..45e1d42e 100644 --- a/skills/azure-blob-storage/SKILL.md +++ b/skills/azure-blob-storage/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-blob-storage -description: Expert knowledge for Azure Blob Storage development including troubleshooting, best practices, decision making, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when using Data Lake features, NFS/SFTP/BlobFuse, static website hosting, SAS/RBAC auth, or SDK-based blob operations, and other Azure Blob Storage related development tasks. Not for Azure Files (use azure-files), Azure Table Storage (use azure-table-storage), Azure Queue Storage (use azure-queue-storage), Azure NetApp Files (use azure-netapp-files). 
+description: Expert knowledge for Azure Blob Storage development including troubleshooting, best practices, decision making, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when using Data Lake/NFS/SFTP, static website hosting/CDN, SAS/RBAC auth, lifecycle/immutability, or SDK/BlobFuse APIs, and other Azure Blob Storage related development tasks. Not for Azure Files (use azure-files), Azure Queue Storage (use azure-queue-storage), Azure Table Storage (use azure-table-storage), Azure NetApp Files (use azure-netapp-files). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Blob Storage Skill @@ -31,7 +31,7 @@ This skill requires **network access** to fetch documentation content: | Security | L127-L184 | Configuring secure access to Blob Storage: RBAC/ABAC, Entra ID auth, SAS tokens, SFTP, ACLs, encryption (CSE, CPK, scopes), private networking, and preventing anonymous access. | | Configuration | L185-L251 | Configuring and operating Blob Storage: monitoring, lifecycle, immutability, soft delete, snapshots/restore, NFS/SFTP/BlobFuse, Data Lake, static sites, and third‑party backup/migration tools. | | Integrations & Coding Patterns | L252-L387 | SDK, CLI, and tooling patterns for integrating with Blob/Data Lake: connect from various languages, mount/file-system access, copy/migrate data, manage containers/blobs, leases, tiers, tags, and events. | -| Deployment | L388-L401 | Guides for deploying and configuring Blob Storage: static website hosting (CDN, GitHub Actions, Terraform), feature support, Data Lake enablement, and hybrid/migration tools (Data Box, WANdisco, Nasuni, Tiger Bridge). 
| +| Deployment | L388-L402 | Guides for deploying and integrating Blob Storage: static website hosting (CDN, GitHub Actions, Terraform), feature enablement, and data migration from HDFS/Hadoop and third‑party storage tools. | ### Troubleshooting | Topic | URL | @@ -397,5 +397,6 @@ This skill requires **network access** to fetch documentation content: | Check Azure Blob Storage feature support by account type | https://learn.microsoft.com/en-us/azure/storage/blobs/storage-feature-support-in-storage-accounts | | Deploy static website on Azure Storage with Terraform | https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-static-website-terraform | | Enable Data Lake capabilities on existing Blob accounts | https://learn.microsoft.com/en-us/azure/storage/blobs/upgrade-to-data-lake-storage-gen2-how-to | +| Deploy and operate NetApp Data Migrator to Azure NetApp Files | https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/data-management/netapp-data-migrator-guide | | Deploy and configure Nasuni with Azure Blob Storage | https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/primary-secondary-storage/nasuni-deployment-guide | | Deploy Tiger Bridge hybrid data with Azure Blob | https://learn.microsoft.com/en-us/azure/storage/solution-integration/validated-partners/primary-secondary-storage/tiger-bridge-deployment-guide | \ No newline at end of file diff --git a/skills/azure-boards/SKILL.md b/skills/azure-boards/SKILL.md index 333065c9..7c076adc 100644 --- a/skills/azure-boards/SKILL.md +++ b/skills/azure-boards/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-boards -description: Expert knowledge for Azure Boards development including troubleshooting, best practices, decision making, limits & quotas, security, configuration, and integrations & coding patterns. 
Use when managing work items, boards/backlogs, WIQL queries, Excel/Office sync, or GitHub/Teams integrations, and other Azure Boards related development tasks. Not for Azure DevOps (use azure-devops), Azure Test Plans (use azure-test-plans), Azure Pipelines (use azure-pipelines), Azure Repos (use azure-repos). +description: Expert knowledge for Azure Boards development including troubleshooting, best practices, decision making, limits & quotas, security, configuration, and integrations & coding patterns. Use when managing work items, backlogs/boards, WIQL queries, GitHub/Office integration, or Azure Boards security, and other Azure Boards related development tasks. Not for Azure DevOps (use azure-devops), Azure Test Plans (use azure-test-plans), Azure Pipelines (use azure-pipelines), Azure Repos (use azure-repos). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Boards Skill @@ -24,7 +24,7 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| -| Troubleshooting | L35-L43 | Diagnosing and fixing Azure Boards + Excel/Office integration issues (sync, add-in, connection, mapping) and resolving backlog nesting/reordering errors. | +| Troubleshooting | L35-L43 | Diagnosing and fixing Azure Boards issues with Office integration, backlog hierarchy/reordering problems, and work item query errors or unexpected results. | | Best Practices | L44-L51 | Best practices for using Azure Boards for Agile: product management, Kanban, Scrum/sprints, and scaling Agile across teams and projects. | | Decision Making | L52-L58 | Guidance on choosing Azure Boards processes, tools, and integrations, plus planning cross-team dependencies and migrations to get the right setup for your organization. 
| | Limits & Quotas | L59-L64 | Managing Azure Boards limits for test artifacts and work item attachments, including size/quantity constraints and how to restore deleted test-related items. | @@ -35,9 +35,9 @@ This skill requires **network access** to fetch documentation content: ### Troubleshooting | Topic | URL | |-------|-----| -| Resolve common Excel and Azure Boards integration questions | https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/office/faqs?view=azure-devops | | Troubleshoot Azure DevOps Office integration issues | https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/office/tfs-office-integration-issues?view=azure-devops | -| Fix Azure Boards backlog nesting and reorder errors | https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/resolve-backlog-reorder-issues?view=azure-devops | +| Troubleshoot Azure Boards backlog nesting and reordering | https://learn.microsoft.com/en-us/azure/devops/boards/backlogs/resolve-backlog-reorder-issues?view=azure-devops | +| Troubleshoot Azure Boards work item query issues | https://learn.microsoft.com/en-us/azure/devops/boards/queries/query-faqs?view=azure-devops | ### Best Practices | Topic | URL | diff --git a/skills/azure-chaos-studio/SKILL.md b/skills/azure-chaos-studio/SKILL.md index 401d257a..25f0af05 100644 --- a/skills/azure-chaos-studio/SKILL.md +++ b/skills/azure-chaos-studio/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-chaos-studio -description: Expert knowledge for Chaos Studio development including troubleshooting, limits & quotas, security, configuration, and integrations & coding patterns. 
Use when defining ARM/Bicep experiments, deploying Chaos Agents, using CLI/REST, or integrating with Azure Monitor, and other Chaos Studio related development tasks. Not for Azure Monitor (use azure-monitor), Azure Resiliency (use azure-resiliency), Azure Reliability (use azure-reliability), Azure Site Recovery (use azure-site-recovery). +description: Expert knowledge for Chaos Studio development including troubleshooting, limits & quotas, security, configuration, and integrations & coding patterns. Use when securing Chaos Studio targets, deploying agents via ARM/Policy, configuring experiments, or using CLI/REST APIs, and other Chaos Studio related development tasks. Not for Azure Monitor (use azure-monitor), Azure Resiliency (use azure-resiliency), Azure Reliability (use azure-reliability), Azure Site Recovery (use azure-site-recovery). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-03-16" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Chaos Studio Skill @@ -27,7 +27,7 @@ This skill requires **network access** to fetch documentation content: | Troubleshooting | L33-L40 | Diagnosing and fixing Chaos Studio and Chaos Agent issues, including installation/health problems, VM agent status checks, known errors, and common experiment or connectivity failures. | | Limits & Quotas | L41-L48 | Chaos Studio limits: agent OS/fault compatibility, known issues, regional/HA behavior, and throttling, quotas, and usage constraints for experiments | | Security | L49-L63 | Securing Chaos Studio: identities, roles, permissions, CMK encryption, network/IP controls, Private Link, VNet injection, AKS auth, and safely controlling experiment targets/capabilities. 
| -| Configuration | L64-L76 | Configuring Chaos Studio: ARM/Bicep experiment definitions, deploying agents/targets, parameters, Azure Monitor/Workbook integration, OS/tool compatibility, and onboarding via Azure Policy | +| Configuration | L64-L76 | Configuring Chaos Studio: deploy agents/targets via ARM/Bicep/Policy, define experiments and parameters, set up monitoring with Azure Monitor/Workbooks, and verify OS/tool compatibility. | | Integrations & Coding Patterns | L77-L82 | Using CLI/REST to create and manage Chaos Studio experiments, plus patterns for sending Chaos Agent telemetry to Application Insights and integrating experiments into automated workflows | ### Troubleshooting @@ -70,7 +70,7 @@ This skill requires **network access** to fetch documentation content: | Configure Azure Workbook to measure Chaos Studio faults | https://learn.microsoft.com/en-us/azure/chaos-studio/chaos-studio-fault-metrics-and-dashboard | | Configure Azure Monitor integration for Chaos Studio | https://learn.microsoft.com/en-us/azure/chaos-studio/chaos-studio-set-up-azure-monitor | | Check OS and tool compatibility for Chaos Studio | https://learn.microsoft.com/en-us/azure/chaos-studio/chaos-studio-versions | -| Use Azure Policy to onboard resources to Chaos Studio | https://learn.microsoft.com/en-us/azure/chaos-studio/sample-policy-targets | +| Use Azure Policy to add Chaos Studio targets | https://learn.microsoft.com/en-us/azure/chaos-studio/sample-policy-targets | | Define Chaos Studio experiments using ARM templates | https://learn.microsoft.com/en-us/azure/chaos-studio/sample-template-experiment | | Deploy Chaos Studio targets and capabilities via ARM | https://learn.microsoft.com/en-us/azure/chaos-studio/sample-template-targets | diff --git a/skills/azure-cloud-adoption-framework/SKILL.md b/skills/azure-cloud-adoption-framework/SKILL.md index 27c5f4f3..eed98a9b 100644 --- a/skills/azure-cloud-adoption-framework/SKILL.md +++ b/skills/azure-cloud-adoption-framework/SKILL.md 
@@ -1,9 +1,9 @@ --- name: azure-cloud-adoption-framework -description: Expert guidance for planning and executing cloud adoption using Azure Cloud Adoption Framework. Covers strategy, planning, readiness & landing zones, adoption patterns, governance, security, operations & management, organization & teams, and adoption scenarios. Use when designing Azure landing zones, AVS/VMware, SAP/Oracle, AKS, or AI/analytics workloads, and other Azure Cloud Adoption Framework related development tasks. +description: Expert guidance for planning and executing cloud adoption using Azure Cloud Adoption Framework. Covers strategy, planning, readiness & landing zones, adoption patterns, governance, security, operations & management, organization & teams, and adoption scenarios. Use when designing Azure landing zones, AVS/VMware, SAP/Oracle, AKS, or analytics/data mesh deployments, and other Azure Cloud Adoption Framework related development tasks. compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Cloud Adoption Framework Skill @@ -26,7 +26,7 @@ This skill requires **network access** to fetch documentation content: |----------|-------|-------------| | Strategy | L37-L55 | Strategic cloud planning: defining business goals, exec strategy, cost, resiliency, security, sustainability, and workload-specific plans (AI agents, data, VDI, VMware, SAP, Oracle, hybrid/multicloud). | | Planning | L56-L84 | Planning cloud adoption, migration waves, modernization roadmaps, cost/skills readiness, and detailed plans for workloads (AI, data, Oracle, SAP, AVS, AVD, analytics) on Azure. 
| -| Readiness & Landing Zones | L85-L216 | Designing and operating Azure landing zones: network topology, identity, subscriptions, governance, automation/DevOps, multitenancy, and workload‑specific patterns (SAP, AVS, analytics, Oracle, etc.). | +| Readiness & Landing Zones | L85-L216 | Designing and operating Azure landing zones: network topology, identity, subscriptions, governance, automation/DevOps, and workload-specific setups (AVS, SAP, analytics, Oracle, App Service, Container Apps). | | Adoption Patterns | L217-L253 | Patterns and step-by-step guides for planning, migrating, modernizing, and operating workloads on Azure (apps, data, AI agents, AVD, SAP, Oracle, VMware) using CAF best practices. | | Governance | L254-L294 | Cloud and data governance for Azure: policies, guardrails, cost control, tagging, compliance, responsible AI, landing zones, Arc, analytics, SAP, AKS, and API/App Service governance. | | Security | L295-L331 | Security design and governance for Azure landing zones, including Zero Trust, IAM, encryption, DevOps, AKS, analytics, SAP/Oracle, Arc, and ongoing security operations. 
| @@ -122,10 +122,10 @@ This skill requires **network access** to fetch documentation content: | Test Azure landing zone deployments and policies | https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/enterprise-scale/testing-approach | | Transition existing Azure environments to landing zones | https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/enterprise-scale/transition | | Understand Azure landing zones as environment foundation | https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/ | -| Use duplicate management groups to audit landing zone alignment | https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/align-approach-duplicate-brownfield-audit-only | +| Duplicate landing zone management group in audit-only mode | https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/align-approach-duplicate-brownfield-audit-only | | Transition existing management groups to landing zone hierarchy | https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/align-scenario-multiple-management-groups | | Align regional dev/test/prod structures to landing zones | https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/align-scenario-regional-org | -| Migrate a single-subscription Azure environment to landing zones | https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/align-scenario-single-subscription | +| Migrate single-subscription Azure environments to landing zone | https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/align-scenario-single-subscription | | Define and configure Microsoft Entra tenants for Azure | https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/design-area/azure-ad-define | | Understand CSP service agreements in Azure landing zones | 
https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/design-area/azure-billing-cloud-solution-provider | | Design Enterprise Agreement enrollment for Azure landing zones | https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/landing-zone/design-area/azure-billing-enterprise-agreement | diff --git a/skills/azure-cognitive-search/SKILL.md b/skills/azure-cognitive-search/SKILL.md index 9cdda030..2c6513d2 100644 --- a/skills/azure-cognitive-search/SKILL.md +++ b/skills/azure-cognitive-search/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-cognitive-search -description: Expert knowledge for Azure AI Search development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when designing indexes/skillsets, vector/semantic search, RAG queries, indexers, or secure multi-tenant setups, and other Azure AI Search related development tasks. Not for Azure Cosmos DB (use azure-cosmos-db), Azure Data Explorer (use azure-data-explorer), Azure Synapse Analytics (use azure-synapse-analytics), Azure Data Factory (use azure-data-factory). +description: Expert knowledge for Azure AI Search development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when designing indexes, skillsets, vector/semantic search, indexers, or RAG/agentic retrieval with Azure AI Search, and other Azure AI Search related development tasks. Not for Azure Cosmos DB (use azure-cosmos-db), Azure SQL Database (use azure-sql-database), Azure Table Storage (use azure-table-storage), Azure Open Datasets (use azure-open-datasets). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. 
metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure AI Search Skill @@ -24,20 +24,20 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| -| Troubleshooting | L37-L48 | Diagnosing and fixing Azure AI Search indexer and skillset issues, including debug sessions, OData filter errors, private link problems, and storage/metrics discrepancies. | +| Troubleshooting | L37-L48 | Diagnosing and fixing Azure AI Search indexer and skillset issues (errors, warnings, no-error failures), OData filter problems, private link issues, and storage/metrics discrepancies. | | Best Practices | L49-L69 | Design, scaling, and performance tuning of Azure AI Search indexing/querying, including enrichment pipelines, chunking/vectorization, data modeling, concurrency-safe updates, and vector optimization. | -| Decision Making | L70-L82 | Planning Azure AI Search capacity, SKUs, and costs, and upgrading/migrating REST APIs, skills, SDKs, and service tiers for agentic retrieval and .NET apps | -| Architecture & Design Patterns | L83-L89 | Architectural guidance for Azure AI Search: RAG patterns, knowledge store design, multitenancy and tenant isolation, and multi-region/high-availability deployment designs. | -| Limits & Quotas | L90-L99 | Limits, quotas, and behaviors for Azure AI Search: service and index caps by tier, vector/index size limits, indexer scheduling windows, enrichment quotas, and related FAQs. | -| Security | L100-L138 | Securing Azure AI Search: RBAC/keyless auth, keys, CMK encryption, private endpoints, firewalls, indexer identity/network access, ACL/Purview-based document security, and Azure Policy compliance. | -| Configuration | L139-L230 | Configuring Azure AI Search: data sources, indexes, analyzers, skillsets, enrichment, vector/semantic settings, knowledge bases, monitoring, and relevance/answer tuning. 
| -| Integrations & Coding Patterns | L231-L289 | Integrating Azure AI Search with data sources, SDKs, vectorizers, and custom skills; building knowledge stores; and crafting queries/filters (OData, Lucene, semantic, vector, agentic). | -| Deployment | L290-L297 | Deploying and moving Azure AI Search services with ARM/Bicep/Terraform, plus guidance on cross-region moves and checking regional feature and SKU availability. | +| Decision Making | L70-L83 | Guidance on Azure AI Search capacity, SKUs, regions, cost planning, and upgrading/migrating REST APIs, skills, SDKs, and service resources. | +| Architecture & Design Patterns | L84-L90 | Architectural guidance for Azure AI Search: RAG patterns, knowledge store design, multitenancy and tenant isolation, and multi-region/high-availability deployment designs. | +| Limits & Quotas | L91-L100 | Limits, quotas, and behaviors for Azure AI Search: service and index caps by tier, vector/index size limits, indexer scheduling windows, enrichment quotas, and related FAQs. | +| Security | L101-L138 | Securing Azure AI Search: RBAC/keyless auth, keys, CMK encryption, private endpoints, firewalls, indexer identity/network access, ACL/Purview-based document security, and Azure Policy compliance. | +| Configuration | L139-L231 | Configuring Azure AI Search: data sources, index schemas, analyzers, enrichment skillsets, vector/semantic settings, knowledge bases, monitoring, and relevance/ranking behavior | +| Integrations & Coding Patterns | L232-L290 | Integrating Azure AI Search with data sources, SDKs, vectorizers, and custom skills; building queries (Lucene/OData), enrichments, knowledge stores, and agentic/semantic retrieval. | +| Deployment | L291-L298 | Deploying and moving Azure AI Search services with ARM/Bicep/Terraform, plus guidance on cross-region moves and checking regional feature and SKU availability. 
| ### Troubleshooting | Topic | URL | |-------|-----| -| Resolve common Azure AI Search indexer errors and warnings | https://learn.microsoft.com/en-us/azure/search/cognitive-search-common-errors-warnings | +| Troubleshoot Azure AI Search indexer errors and warnings | https://learn.microsoft.com/en-us/azure/search/cognitive-search-common-errors-warnings | | Understand Debug Sessions for skillset troubleshooting | https://learn.microsoft.com/en-us/azure/search/cognitive-search-debug-session | | Debug and troubleshoot Azure AI Search skillsets | https://learn.microsoft.com/en-us/azure/search/cognitive-search-how-to-debug-skillset | | Tutorial: Practice debugging Azure AI Search skillsets | https://learn.microsoft.com/en-us/azure/search/cognitive-search-tutorial-debug-sessions | @@ -77,6 +77,7 @@ This skill requires **network access** to fetch documentation content: | Choose and use Azure AI Search management SDKs | https://learn.microsoft.com/en-us/azure/search/search-dotnet-mgmt-sdk-migration | | Upgrade Azure AI Search .NET apps to SDK v11 | https://learn.microsoft.com/en-us/azure/search/search-dotnet-sdk-migration-version-11 | | Upgrade Azure AI Search service capacity in portal | https://learn.microsoft.com/en-us/azure/search/search-how-to-upgrade | +| Choose alternative region for Azure AI Search capacity | https://learn.microsoft.com/en-us/azure/search/search-region-capacity | | Plan and manage Azure AI Search costs | https://learn.microsoft.com/en-us/azure/search/search-sku-manage-costs | | Choose the right Azure AI Search SKU and tier | https://learn.microsoft.com/en-us/azure/search/search-sku-tier | @@ -115,7 +116,6 @@ This skill requires **network access** to fetch documentation content: | Configure ADLS Gen2 indexer to ingest ACL and RBAC metadata | https://learn.microsoft.com/en-us/azure/search/search-indexer-access-control-lists-and-role-based-access | | Connect Azure AI Search indexers to SQL Managed Instance privately | 
https://learn.microsoft.com/en-us/azure/search/search-indexer-how-to-access-private-sql | | Configure Azure AI Search indexer access through IP firewalls | https://learn.microsoft.com/en-us/azure/search/search-indexer-howto-access-ip-restricted | -| Configure shared private link indexer connections in Azure AI Search | https://learn.microsoft.com/en-us/azure/search/search-indexer-howto-access-private | | Configure trusted service exception for Azure AI Search indexers | https://learn.microsoft.com/en-us/azure/search/search-indexer-howto-access-trusted-service-exception | | Secure indexer access to network-protected resources in Azure AI Search | https://learn.microsoft.com/en-us/azure/search/search-indexer-securing-resources | | Configure indexers to ingest Microsoft Purview sensitivity labels | https://learn.microsoft.com/en-us/azure/search/search-indexer-sensitivity-labels | @@ -161,7 +161,6 @@ This skill requires **network access** to fetch documentation content: | Configure Azure Content Understanding skill for document chunking | https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-content-understanding | | Configure Custom Entity Lookup skill parameters | https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-custom-entity-lookup | | Control Document Extraction behavior in skillsets | https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-document-extraction | -| Configure Document Layout skill for Azure AI Search enrichment | https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-document-intelligence-layout | | Configure Entity Linking v3 skill in Azure AI Search | https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-entity-linking-v3 | | Configure Entity Recognition skill v2 in skillsets | https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-entity-recognition | | Configure Entity Recognition v3 skill in Azure AI Search | 
https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-entity-recognition-v3 | @@ -202,9 +201,11 @@ This skill requires **network access** to fetch documentation content: | Configure JSON blob parsing for Azure AI Search indexers | https://learn.microsoft.com/en-us/azure/search/search-how-to-index-azure-blob-json | | Configure Markdown parsing for Azure Blob indexers | https://learn.microsoft.com/en-us/azure/search/search-how-to-index-azure-blob-markdown | | Configure one-to-many blob indexing with parsing modes | https://learn.microsoft.com/en-us/azure/search/search-how-to-index-azure-blob-one-to-many | +| Configure SharePoint Online indexer for Azure AI Search | https://learn.microsoft.com/en-us/azure/search/search-how-to-index-sharepoint-online | | Load and refresh data into Azure AI Search indexes | https://learn.microsoft.com/en-us/azure/search/search-how-to-load-search-index | | Configure and manage Azure AI Search indexer runs | https://learn.microsoft.com/en-us/azure/search/search-howto-run-reset-indexers | | Configure field mappings for Azure AI Search indexers | https://learn.microsoft.com/en-us/azure/search/search-indexer-field-mappings | +| Configure shared private link connections for Azure AI Search indexers | https://learn.microsoft.com/en-us/azure/search/search-indexer-howto-access-private | | Configure SharePoint ACL ingestion with Azure AI Search indexers | https://learn.microsoft.com/en-us/azure/search/search-indexer-sharepoint-access-control-lists | | Manage Azure AI Search with Azure CLI | https://learn.microsoft.com/en-us/azure/search/search-manage-azure-cli | | Manage Azure AI Search with PowerShell scripts | https://learn.microsoft.com/en-us/azure/search/search-manage-powershell | @@ -237,6 +238,7 @@ This skill requires **network access** to fetch documentation content: | Implement Azure AI Search custom skill with Bing Entity Search | https://learn.microsoft.com/en-us/azure/search/cognitive-search-create-custom-skill-example 
| | Implement custom skill interface for Azure AI Search enrichment | https://learn.microsoft.com/en-us/azure/search/cognitive-search-custom-skill-interface | | Implement Custom Web API skill for enrichment | https://learn.microsoft.com/en-us/azure/search/cognitive-search-custom-skill-web-api | +| Use Document Layout skill in Azure AI Search enrichments | https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-document-intelligence-layout | | Connect Azure AI Search knowledge stores to Power BI | https://learn.microsoft.com/en-us/azure/search/knowledge-store-connect-power-bi | | Create Azure AI Search knowledge stores via REST APIs | https://learn.microsoft.com/en-us/azure/search/knowledge-store-create-rest | | Implement complex projection shapes for knowledge stores | https://learn.microsoft.com/en-us/azure/search/knowledge-store-projection-example-long | @@ -259,7 +261,6 @@ This skill requires **network access** to fetch documentation content: | Use Azure Logic Apps workflows for automated indexing | https://learn.microsoft.com/en-us/azure/search/search-how-to-index-logic-apps | | Index Azure Database for MySQL with Azure AI Search (preview) | https://learn.microsoft.com/en-us/azure/search/search-how-to-index-mysql | | Configure OneLake files indexer for Azure AI Search | https://learn.microsoft.com/en-us/azure/search/search-how-to-index-onelake-files | -| Configure SharePoint Online indexer for Azure AI Search | https://learn.microsoft.com/en-us/azure/search/search-how-to-index-sharepoint-online | | Configure Azure SQL indexer for Azure AI Search | https://learn.microsoft.com/en-us/azure/search/search-how-to-index-sql-database | | Use integrated vectorization with Azure AI Search REST APIs | https://learn.microsoft.com/en-us/azure/search/search-how-to-integrated-vectorization | | Index Markdown blobs with Azure AI Search REST APIs | https://learn.microsoft.com/en-us/azure/search/search-markdown-data-tutorial | diff --git 
a/skills/azure-container-apps/SKILL.md b/skills/azure-container-apps/SKILL.md index 0fe59136..61ff7207 100644 --- a/skills/azure-container-apps/SKILL.md +++ b/skills/azure-container-apps/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-container-apps -description: Expert knowledge for Azure Container Apps development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring ingress/scale, securing with Entra/OIDC, wiring Dapr/Spring, or deploying via GitHub Actions, and other Azure Container Apps related development tasks. Not for Azure Kubernetes Service (AKS) (use azure-kubernetes-service), Azure Container Instances (use azure-container-instances), Azure App Service (use azure-app-service), Azure Functions (use azure-functions). +description: Expert knowledge for Azure Container Apps development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring ingress/scale, Dapr, managed identity, CI/CD pipelines, or Java microservices on Azure Container Apps, and other Azure Container Apps related development tasks. Not for Azure Kubernetes Service (AKS) (use azure-kubernetes-service), Azure App Service (use azure-app-service), Azure Functions (use azure-functions), Azure Spring Apps (use azure-spring-apps). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. 
metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Container Apps Skill @@ -29,10 +29,10 @@ This skill requires **network access** to fetch documentation content: | Decision Making | L59-L78 | Guidance on choosing Container Apps plans, compute, GPUs, and hosting options, plus cost modeling and migration paths from Functions, Heroku, Java/Spring/Tomcat, and other Azure services. | | Architecture & Design Patterns | L79-L85 | Architectures and patterns for Java microservices on Azure Container Apps, including Eureka HA clusters, AI-enabled PetClinic, and end-to-end microservice design best practices. | | Limits & Quotas | L86-L91 | Quota and limit rules for Container Apps (CPU/memory, environments, revisions, scale) and how to request increases when you hit platform or subscription limits. | -| Security | L92-L127 | Securing Container Apps: auth (Entra, social, OIDC, mTLS, certs), secrets/MI, network controls (VNet, private endpoints, firewall, WAF), TLS/DNS, and security best practices/policies. | -| Configuration | L128-L170 | Configuring Container Apps runtime: networking, ingress, scaling, revisions, Dapr, Java features, logging/monitoring, storage mounts, workload profiles, and routing/traffic controls. | -| Integrations & Coding Patterns | L171-L193 | Patterns for connecting Container Apps to each other and to services (Dapr, Spring, Front Door, Service Connector) plus dynamic/shell/code-interpreter sessions and event-driven jobs. | -| Deployment | L194-L202 | Deploying and automating Container Apps: CI/CD with GitHub Actions/Azure Pipelines, Docker Compose migration, self-hosted runners, and Arc-enabled Kubernetes integration. | +| Security | L92-L128 | Auth, network, and data protection for Container Apps: identity providers, mTLS/client certs, CORS, domains/TLS, secrets, managed identity, private endpoints, firewalls, and security best practices. 
| +| Configuration | L129-L172 | Configuring Container Apps runtime: networking, ingress, scaling, revisions, Dapr, Java features, logging/monitoring, storage mounts, workload profiles, and maintenance/alerts. | +| Integrations & Coding Patterns | L173-L195 | Patterns for connecting Container Apps to each other and to services (Dapr, Spring, Front Door, Service Connector) plus dynamic/shell/code-interpreter sessions and event-driven jobs. | +| Deployment | L196-L205 | Guides for deploying and automating Container Apps: CI/CD with GitHub Actions/Azure Pipelines, Docker Compose, Arc-enabled Kubernetes, cross-region moves, and self-hosted CI runners. | ### Troubleshooting | Topic | URL | @@ -107,6 +107,7 @@ This skill requires **network access** to fetch documentation content: | Secure Dapr component connections to Azure services | https://learn.microsoft.com/en-us/azure/container-apps/dapr-component-connect-services | | Configure custom environment DNS suffix and TLS in Container Apps | https://learn.microsoft.com/en-us/azure/container-apps/environment-custom-dns-suffix | | Configure NSGs, UDRs, and firewalls for Container Apps VNets | https://learn.microsoft.com/en-us/azure/container-apps/firewall-integration | +| Secure secrets and host keys for Functions in Container Apps | https://learn.microsoft.com/en-us/azure/container-apps/functions-secrets-tutorial | | Secure Container Apps with private endpoints | https://learn.microsoft.com/en-us/azure/container-apps/how-to-use-private-endpoint | | Configure IP-based ingress restrictions for Container Apps | https://learn.microsoft.com/en-us/azure/container-apps/ip-restrictions | | Import and manage Container Apps certificates from Key Vault | https://learn.microsoft.com/en-us/azure/container-apps/key-vault-certificates-manage | @@ -160,6 +161,7 @@ This skill requires **network access** to fetch documentation content: | Configure custom domains with rule-based routing | 
https://learn.microsoft.com/en-us/azure/container-apps/rule-based-routing-custom-domain | | Configure autoscaling rules in Azure Container Apps | https://learn.microsoft.com/en-us/azure/container-apps/scale-app | | Configure service discovery resiliency policies | https://learn.microsoft.com/en-us/azure/container-apps/service-discovery-resiliency | +| Configure custom container sessions in Azure Container Apps | https://learn.microsoft.com/en-us/azure/container-apps/sessions-custom-container | | Enable session affinity (sticky sessions) in Container Apps | https://learn.microsoft.com/en-us/azure/container-apps/sticky-sessions | | Configure storage mounts for Azure Container Apps | https://learn.microsoft.com/en-us/azure/container-apps/storage-mounts | | Create Azure Files volume mounts in Container Apps | https://learn.microsoft.com/en-us/azure/container-apps/storage-mounts-azure-files | @@ -199,4 +201,5 @@ This skill requires **network access** to fetch documentation content: | Deploy Docker Compose agents to Azure Container Apps | https://learn.microsoft.com/en-us/azure/container-apps/compose-agent | | Automate Container Apps revisions with GitHub Actions | https://learn.microsoft.com/en-us/azure/container-apps/github-actions | | Generate Container Apps GitHub Actions via Azure CLI | https://learn.microsoft.com/en-us/azure/container-apps/github-actions-cli | +| Relocate Azure Container Apps across regions | https://learn.microsoft.com/en-us/azure/container-apps/relocate-region | | Run self-hosted CI/CD runners with Container Apps jobs | https://learn.microsoft.com/en-us/azure/container-apps/tutorial-ci-cd-runners-jobs | \ No newline at end of file diff --git a/skills/azure-container-registry/SKILL.md b/skills/azure-container-registry/SKILL.md index da825e10..1b47ef03 100644 --- a/skills/azure-container-registry/SKILL.md +++ b/skills/azure-container-registry/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-container-registry -description: Expert knowledge for Azure 
Container Registry development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when using ACR Tasks, geo-replication, Private Link, connected registries, or image signing with Notation, and other Azure Container Registry related development tasks. Not for Azure Container Apps (use azure-container-apps), Azure Container Instances (use azure-container-instances), Azure Kubernetes Service (AKS) (use azure-kubernetes-service), Azure Red Hat OpenShift (use azure-redhat-openshift). +description: Expert knowledge for Azure Container Registry development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when using ACR Tasks, geo-replication, Private Link, connected registries, or image signing/verification, and other Azure Container Registry related development tasks. Not for Azure Container Apps (use azure-container-apps), Azure Container Instances (use azure-container-instances), Azure Kubernetes Service (AKS) (use azure-kubernetes-service), Azure Red Hat OpenShift (use azure-redhat-openshift). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Container Registry Skill @@ -31,8 +31,8 @@ This skill requires **network access** to fetch documentation content: | Limits & Quotas | L74-L79 | Details on ACR SKUs (Basic/Standard/Premium) including throughput and feature limits, and how image storage capacity, layers, and repositories are calculated and constrained. 
| | Security | L80-L119 | Securing Azure Container Registry: auth methods, RBAC/ABAC and tokens, network isolation (firewall, Private Link, IP/service tags), data exfiltration controls, image signing/verification, and policy/compliance. | | Configuration | L120-L137 | Configuring ACR behavior: caching, purge/retention/soft delete, delete locks, tasks (YAML, timers, multi-step, patching, agent pools), webhooks, wildcard cache rules, and monitoring. | -| Integrations & Coding Patterns | L138-L153 | How to integrate ACR with ACI, AKS, Helm, ORAS, Buildpacks, ACR Transfer, GitHub Actions, Notation, Key Vault, and webhooks for image access, builds, signing, and automation | -| Deployment | L154-L159 | Using ARM templates to automate ACR quick tasks and data transfer, and deploying/managing ACR connected registries via Azure Arc extension | +| Integrations & Coding Patterns | L138-L154 | Patterns and code samples for integrating ACR with AKS/ACI, GitHub Actions, ORAS, Helm, Notation signing, webhooks, artifact caching, transfers, and image import/build workflows. 
| +| Deployment | L155-L160 | Using ARM templates to automate ACR quick tasks and data transfer, and deploying/managing ACR connected registries via Azure Arc extension | ### Troubleshooting | Topic | URL | @@ -138,6 +138,7 @@ This skill requires **network access** to fetch documentation content: ### Integrations & Coding Patterns | Topic | URL | |-------|-----| +| Configure ACR-to-ACR artifact caching with managed identity | https://learn.microsoft.com/en-us/azure/container-registry/artifact-cache-acr-to-acr-cli | | Grant Azure Container Instances access to ACR with service principals | https://learn.microsoft.com/en-us/azure/container-registry/container-registry-auth-aci | | Create Kubernetes pull secrets for Azure Container Registry access | https://learn.microsoft.com/en-us/azure/container-registry/container-registry-auth-kubernetes | | Host and manage Helm chart repositories in ACR | https://learn.microsoft.com/en-us/azure/container-registry/container-registry-helm-repos | diff --git a/skills/azure-cosmos-db/SKILL.md b/skills/azure-cosmos-db/SKILL.md index f68e988b..31c0cd07 100644 --- a/skills/azure-cosmos-db/SKILL.md +++ b/skills/azure-cosmos-db/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-cosmos-db -description: Expert knowledge for Azure Cosmos DB development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when using Cosmos DB NoSQL/Mongo/Cassandra APIs, change feed, vector search, global distribution, or HTAP/analytics workloads, and other Azure Cosmos DB related development tasks. Not for Azure Table Storage (use azure-table-storage), Azure SQL Database (use azure-sql-database), Azure Database for MySQL (use azure-database-mysql), Azure Database for PostgreSQL (use azure-database-postgresql). 
+description: Expert knowledge for Azure Cosmos DB development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when using Cosmos DB NoSQL/Mongo/Cassandra/Postgres APIs, change feed, vector search, global distribution, or HTAP, and other Azure Cosmos DB related development tasks. Not for Azure Table Storage (use azure-table-storage), Azure SQL Database (use azure-sql-database), Azure Data Explorer (use azure-data-explorer), Azure Synapse Analytics (use azure-synapse-analytics). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Cosmos DB Skill @@ -25,13 +25,13 @@ This skill requires **network access** to fetch documentation content: | Category | Location | Description | |----------|----------|-------------| | Troubleshooting | L37-L91 | Diagnosing and fixing Cosmos DB issues: SDK errors, timeouts, 4xx/5xx codes, performance/RU analysis, metrics/log queries, CMK/backup problems, and API-specific (Mongo/Cassandra/Gremlin/Postgres) troubleshooting. | -| Best Practices | L92-L151 | Performance, scaling, cost, and resiliency best practices for Cosmos DB (all APIs/SDKs), including partitioning, indexing, throughput, benchmarking, DR, and tuning PostgreSQL/Cassandra workloads | +| Best Practices | L92-L151 | Performance, scaling, partitioning, indexing, cost optimization, SDK tuning, and HA/DR best practices for Cosmos DB (NoSQL, MongoDB, Cassandra, PostgreSQL) and legacy DocumentDB. | | Decision Making | L152-L206 | Guides for choosing Cosmos DB options (consistency, throughput, backup, analytics, vector search), estimating RU/costs, and planning/migrating workloads across APIs (Cassandra, MongoDB, PostgreSQL, DynamoDB). 
| | Architecture & Design Patterns | L207-L249 | Architectural patterns for Cosmos DB and PostgreSQL: multitenancy, sharding, HA/DR, change feed, HTAP, real-time analytics, and AI/LLM agents, memory, vectors, and semantic caching. | -| Limits & Quotas | L250-L294 | Limits, quotas, and behaviors for Cosmos DB and DocumentDB: RU/throughput, autoscale, burst, backups, partitions, indexing, APIs (Core, Cassandra, Mongo, Table, Gremlin), and PostgreSQL cluster sizing. | -| Security | L295-L361 | Securing Cosmos DB and related services: identity/RBAC, keys and encryption, network isolation (VNet, Private Link, firewalls), TLS, auditing, policies, and data‑level protections. | -| Configuration | L362-L485 | Configuring Cosmos DB and related services: throughput, indexing, TTL, backup/restore, global distribution, monitoring, emulators, SDK tuning, and deployment via Bicep/ARM/Terraform across all APIs. | -| Integrations & Coding Patterns | [integrations.md](integrations.md) | SDK patterns, bulk ops, change feed, vector search, and integrations (Kafka, Spark, Functions, BI, AI agents) for Cosmos DB APIs (NoSQL, Mongo, Cassandra, PostgreSQL, Gremlin, DocumentDB). | +| Limits & Quotas | L250-L294 | Limits, quotas, and behaviors for Cosmos DB and DocumentDB: RUs, autoscale, burst, backup/PITR, partitions, indexing, APIs (Mongo/Cassandra/Table/Gremlin), fleets, free tier, and PostgreSQL cluster sizing. | +| Security | L295-L362 | Securing Cosmos DB and related services: identity/RBAC, keys and encryption, network isolation (VNet, Private Link, firewalls), TLS, auditing, policies, and workload-specific security (NoSQL, Mongo, PostgreSQL, Cassandra, DocumentDB). | +| Configuration | L363-L485 | Configuring Cosmos DB and related services: throughput, indexing, TTL, backup/restore, global distribution, monitoring, emulators, SDK tuning, and deployment via Bicep/ARM/Terraform across all APIs. 
| +| Integrations & Coding Patterns | [integrations.md](integrations.md) | SDK patterns, bulk ops, change feed, vector search, and integration guides for Cosmos DB APIs (NoSQL, Mongo, Cassandra, PostgreSQL, Gremlin, DocumentDB) plus Kafka, Spark, BI, and migration tools. | | Deployment | [deployment.md](deployment.md) | Deploying and managing Cosmos DB and Azure DocumentDB: ARM/Bicep/Terraform templates, CI/CD, scaling, backup/restore, upgrades, maintenance, and start/stop operations for various APIs. | ### Troubleshooting @@ -97,7 +97,7 @@ This skill requires **network access** to fetch documentation content: | Best practices for Azure Cosmos DB .NET SDK v3 | https://learn.microsoft.com/en-us/azure/cosmos-db/best-practice-dotnet | | Best practices for Azure Cosmos DB Java SDK v4 | https://learn.microsoft.com/en-us/azure/cosmos-db/best-practice-java | | Optimize Azure Cosmos DB Python SDK performance | https://learn.microsoft.com/en-us/azure/cosmos-db/best-practice-python | -| Apply performance best practices for Cosmos DB JavaScript SDK | https://learn.microsoft.com/en-us/azure/cosmos-db/best-practices-javascript | +| Optimize Azure Cosmos DB JavaScript SDK usage | https://learn.microsoft.com/en-us/azure/cosmos-db/best-practices-javascript | | Adapt Apache Cassandra applications to Cosmos DB Cassandra API | https://learn.microsoft.com/en-us/azure/cosmos-db/cassandra/adoption | | Apply recommended Cosmos DB Cassandra driver extension settings | https://learn.microsoft.com/en-us/azure/cosmos-db/cassandra/driver-extensions | | Implement lightweight transactions in Cosmos DB Cassandra API | https://learn.microsoft.com/en-us/azure/cosmos-db/cassandra/lightweight-transactions | @@ -264,7 +264,6 @@ This skill requires **network access** to fetch documentation content: | Runtime limits for Cosmos DB Gremlin engine | https://learn.microsoft.com/en-us/azure/cosmos-db/gremlin/limits | | Use hierarchical partition keys to bypass 20-GB limit | 
https://learn.microsoft.com/en-us/azure/cosmos-db/hierarchical-partition-keys-unlimited-scale | | Alert when Cosmos DB logical partitions near 20 GB limit | https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-alert-on-logical-partition-key-storage-size | -| Manage Cosmos DB accounts and understand control plane limits | https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-manage-database-account | | Understand limits and behavior of Cosmos DB integrated cache | https://learn.microsoft.com/en-us/azure/cosmos-db/integrated-cache-faq | | Request unit charges for key-value operations in Cosmos DB | https://learn.microsoft.com/en-us/azure/cosmos-db/key-value-store-cost | | Migrate nonpartitioned Cosmos DB containers to partitioned | https://learn.microsoft.com/en-us/azure/cosmos-db/migrate-containers-partitioned-to-nonpartitioned | @@ -279,6 +278,7 @@ This skill requires **network access** to fetch documentation content: | FAQ for Azure Cosmos DB for Table | https://learn.microsoft.com/en-us/azure/cosmos-db/table/faq | | Configure and manage Azure Cosmos DB throughput buckets | https://learn.microsoft.com/en-us/azure/cosmos-db/throughput-buckets | | FAQ on Cosmos DB throughput bucket limits and behavior | https://learn.microsoft.com/en-us/azure/cosmos-db/throughput-buckets-faq | +| Estimate and interpret Azure Cosmos DB RU consumption | https://learn.microsoft.com/en-us/azure/cosmos-db/understand-request-unit-consumption | | Configure and use change streams in Azure DocumentDB | https://learn.microsoft.com/en-us/azure/documentdb/change-streams | | Review MongoDB feature compatibility limits in DocumentDB | https://learn.microsoft.com/en-us/azure/documentdb/compatibility-features | | Check MQL compatibility across MongoDB versions in DocumentDB | https://learn.microsoft.com/en-us/azure/documentdb/compatibility-query-language | @@ -301,7 +301,7 @@ This skill requires **network access** to fetch documentation content: | Configure RBAC permissions for Cosmos DB 
continuous backup restore | https://learn.microsoft.com/en-us/azure/cosmos-db/continuous-backup-restore-permissions | | Configure Cosmos DB to meet data residency requirements | https://learn.microsoft.com/en-us/azure/cosmos-db/data-residency | | Use Microsoft Defender threat protection for Cosmos DB | https://learn.microsoft.com/en-us/azure/cosmos-db/defender-for-cosmos-db | -| Configure Dynamic Data Masking in Cosmos DB | https://learn.microsoft.com/en-us/azure/cosmos-db/dynamic-data-masking | +| Configure Dynamic Data Masking in Azure Cosmos DB | https://learn.microsoft.com/en-us/azure/cosmos-db/dynamic-data-masking | | Secure Azure Cosmos DB for Gremlin accounts | https://learn.microsoft.com/en-us/azure/cosmos-db/gremlin/security | | Add and assign Cosmos DB RBAC user roles | https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-add-assign-user-roles | | Use Always Encrypted client-side encryption in Cosmos DB | https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-always-encrypted | @@ -336,6 +336,7 @@ This skill requires **network access** to fetch documentation content: | Reference for Cosmos DB data plane RBAC roles | https://learn.microsoft.com/en-us/azure/cosmos-db/reference-data-plane-security | | Reference for Cosmos DB data plane RBAC roles | https://learn.microsoft.com/en-us/azure/cosmos-db/reference-data-plane-security | | Protect Cosmos DB resources with Azure locks | https://learn.microsoft.com/en-us/azure/cosmos-db/resource-locks | +| Secure Azure Cosmos DB for NoSQL accounts | https://learn.microsoft.com/en-us/azure/cosmos-db/security | | Review Cosmos DB Azure Policy compliance controls | https://learn.microsoft.com/en-us/azure/cosmos-db/security-controls-policy | | Enforce minimum TLS version for Cosmos DB | https://learn.microsoft.com/en-us/azure/cosmos-db/self-serve-minimum-tls-enforcement | | Store Cosmos DB credentials securely in Azure Key Vault | https://learn.microsoft.com/en-us/azure/cosmos-db/store-credentials-key-vault | @@ 
-376,7 +377,6 @@ This skill requires **network access** to fetch documentation content: | Configure Azure Monitor alerts for Cosmos DB resources | https://learn.microsoft.com/en-us/azure/cosmos-db/create-alerts | | Use keyboard shortcuts in Cosmos DB Data Explorer | https://learn.microsoft.com/en-us/azure/cosmos-db/data-explorer-shortcuts | | Configure and use Azure Cosmos DB local emulator | https://learn.microsoft.com/en-us/azure/cosmos-db/emulator | -| Run Azure Cosmos DB Linux-based emulator container | https://learn.microsoft.com/en-us/azure/cosmos-db/emulator-linux | | Control Cosmos DB Windows emulator via CLI and PowerShell | https://learn.microsoft.com/en-us/azure/cosmos-db/emulator-windows-arguments | | Retrieve request unit charges for Cosmos DB queries | https://learn.microsoft.com/en-us/azure/cosmos-db/find-request-unit-charge | | Reference schema for Azure Cosmos DB Fleet Analytics tables | https://learn.microsoft.com/en-us/azure/cosmos-db/fleet-analytics-schema-reference | diff --git a/skills/azure-cosmos-db/integrations.md b/skills/azure-cosmos-db/integrations.md index a5a59a1e..8433ebbe 100644 --- a/skills/azure-cosmos-db/integrations.md +++ b/skills/azure-cosmos-db/integrations.md @@ -203,7 +203,7 @@ | Persist aggregation output with $out in DocumentDB | https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$out | | Transform documents with $replaceWith in DocumentDB | https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$replacewith | | Randomly sample documents with $sample in DocumentDB | https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$sample | -| Set or update fields with $set in DocumentDB | https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$set | +| Use the $set update operator in Azure DocumentDB | https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$set | | Implement pagination with $skip in DocumentDB pipelines | 
https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/$skip | | Convert expression types with $convert in DocumentDB | https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/type-expression/$convert | | Check numeric types using $isNumber in DocumentDB | https://learn.microsoft.com/en-us/azure/documentdb/operators/aggregation/type-expression/$isnumber | @@ -229,7 +229,6 @@ | Apply $bitAnd bitwise operations in Azure DocumentDB | https://learn.microsoft.com/en-us/azure/documentdb/operators/bitwise/$bitand | | Use $bitNot bitwise operator in Azure DocumentDB | https://learn.microsoft.com/en-us/azure/documentdb/operators/bitwise/$bitnot | | Use $bitOr bitwise operator in Azure DocumentDB | https://learn.microsoft.com/en-us/azure/documentdb/operators/bitwise/$bitor | -| Compare values using $cmp in Azure DocumentDB | https://learn.microsoft.com/en-us/azure/documentdb/operators/comparison-query/$cmp | | Filter equal values with $eq in Azure DocumentDB | https://learn.microsoft.com/en-us/azure/documentdb/operators/comparison-query/$eq | | Query greater-than values with $gt in Azure DocumentDB | https://learn.microsoft.com/en-us/azure/documentdb/operators/comparison-query/$gt | | Query minimum thresholds with $gte in Azure DocumentDB | https://learn.microsoft.com/en-us/azure/documentdb/operators/comparison-query/$gte | diff --git a/skills/azure-cost-management/SKILL.md b/skills/azure-cost-management/SKILL.md index b8bc48c5..9cb1b9ea 100644 --- a/skills/azure-cost-management/SKILL.md +++ b/skills/azure-cost-management/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-cost-management -description: Expert knowledge for Azure Cost Management development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. 
Use when managing budgets/alerts, exports, Cost Management APIs, reservations/savings plans, or Hybrid Benefit, and other Azure Cost Management related development tasks. Not for Azure Advisor (use azure-advisor), Azure Monitor (use azure-monitor), Azure Quotas (use azure-quotas), Azure Portal (use azure-portal). +description: Expert knowledge for Azure Cost Management development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when managing budgets/alerts, tags/exports, Cost Management APIs, reservations/savings plans, or EA→MCA billing, and other Azure Cost Management related development tasks. Not for Azure Advisor (use azure-advisor), Azure Monitor (use azure-monitor), Azure Quotas (use azure-quotas), Azure Impact Reporting (use azure-impact-reporting). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Cost Management Skill @@ -26,13 +26,13 @@ This skill requires **network access** to fetch documentation content: |----------|-------|-------------| | Troubleshooting | L37-L66 | Diagnosing and fixing Azure billing, subscription, and reservation issues—errors, disabled subscriptions, payment failures, access problems, and low/zero utilization or recommendation anomalies. | | Best Practices | L67-L77 | Best practices for analyzing and optimizing Azure costs, using Advisor and savings plans, onboarding under MCA, managing Azure Hybrid Benefit, and setting up org-wide cost management processes | -| Decision Making | L78-L149 | Deciding how to allocate and analyze Azure costs, choose subscriptions and billing models, and optimize spend with reservations, savings plans, Hybrid Benefit, and limited-time discounts. 
| -| Architecture & Design Patterns | L150-L154 | How Azure bills centrally assigned SQL Server licenses hourly, including license assignment, cost calculation, and optimization considerations | -| Limits & Quotas | L155-L170 | Azure cost limits, quotas, and timing: free tier usage, credits, spending limits, data transfer fees, savings plans, SQL licensing, subscription limits, and billing account dormancy. | -| Security | L171-L192 | Securing Azure billing and cost data: RBAC roles, admin access, directory transfers, fraud prevention, and permissions for subscriptions, reservations, and savings plans. | -| Configuration | L193-L234 | Configuring Azure Cost Management: budgets, alerts, tags, exports, cost analysis views, reservations/savings plans, billing ownership/transfers, payment methods, and EA/MCA/partner billing settings. | -| Integrations & Coding Patterns | L235-L251 | APIs, scripts, and Power BI for cost/billing analysis, plus programmatic creation and management of EA/MCA/MPA subscriptions and reservations across tenants. | -| Deployment | L252-L255 | Configuring automated, large-scale exports of Azure cost and usage data to storage (like Azure Storage), including setup, scheduling, and management for ongoing cost analysis. | +| Decision Making | L78-L151 | Guidance for choosing cost views, billing models, reservations, savings plans, and allocation strategies, plus EA→MCA transitions and product-specific discount/chargeback decisions. | +| Architecture & Design Patterns | L152-L156 | How Azure bills centrally assigned SQL Server licenses hourly, including license assignment, cost calculation, and optimization considerations | +| Limits & Quotas | L157-L171 | Azure cost limits, quotas, and timing: free tier usage, credits, spending limits, data transfer fees, savings plans, SQL licensing, subscription limits, and billing account dormancy. 
| +| Security | L172-L193 | Securing Azure billing and cost data: RBAC roles, admin access, directory transfers, fraud prevention, and permissions for subscriptions, reservations, and savings plans. | +| Configuration | L194-L231 | Configuring Azure Cost Management: budgets, alerts, tags, exports, cost analysis views, reservations/savings plans, billing ownership/transfers, payment methods, and EA/MCA/partner billing settings. | +| Integrations & Coding Patterns | L232-L248 | APIs, scripts, and Power BI for cost/billing analysis, plus programmatic creation and management of EA/MCA/MPA subscriptions and reservations across tenants. | +| Deployment | L249-L252 | Configuring automated, large-scale exports of Azure cost and usage data to storage (like Azure Storage), including setup, scheduling, and management for ongoing cost analysis. | ### Troubleshooting | Topic | URL | @@ -82,7 +82,9 @@ This skill requires **network access** to fetch documentation content: | Plan and implement Azure cost allocation strategies | https://learn.microsoft.com/en-us/azure/cost-management-billing/costs/cost-allocation-introduction | | Choose and use built-in Cost Analysis views | https://learn.microsoft.com/en-us/azure/cost-management-billing/costs/cost-analysis-built-in-views | | Plan migration from EA to MCA Cost Management APIs | https://learn.microsoft.com/en-us/azure/cost-management-billing/costs/migrate-cost-management-api | -| Use EA VM reservations to optimize Azure costs | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-portal-vm-reservations | +| Manage Azure Marketplace usage under EA enrollments | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-azure-marketplace | +| Understand how Azure EA agreements affect billing | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-portal-agreements | +| Use Azure EA VM reservations to optimize costs | 
https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-portal-vm-reservations | | Interpret Azure EA pricing and usage calculations | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-pricing-overview | | Transition Enterprise Agreement billing to Microsoft Customer Agreement | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/mca-setup-account | | Request billing ownership transfer under MPA as CSP | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/mpa-request-ownership | @@ -160,7 +162,6 @@ This skill requires **network access** to fetch documentation content: | Monitor Azure free service usage against quotas | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/check-free-service-usage | | Understand Azure free account credits and duration limits | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/create-free-services | | Understand Azure data transfer fee rules in Europe | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/data-transfer-fees | -| Understand timing of direct EA invoice document availability | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/direct-ea-billing-invoice-documents | | Handle Azure region optimization policy constraints | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/region-optimization | | Manage Azure spending limit and credit-based quotas | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/spending-limit | | Identify Azure savings plan software cost exclusions | https://learn.microsoft.com/en-us/azure/cost-management-billing/savings-plan/software-costs-not-included | @@ -181,7 +182,7 @@ This skill requires **network access** to fetch documentation content: | Assign Azure billing access roles securely | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/manage-billing-access | | Understand PSD2 SCA 
requirements for Azure purchases | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/open-banking-strong-customer-authentication | | Protect Azure tenants and subscriptions from fraud and abuse | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/protect-tenants-subscriptions | -| Understand and assign Azure Enterprise Agreement admin roles | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/understand-ea-roles | +| Manage Azure EA administrative roles and permissions | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/understand-ea-roles | | Use MCA billing roles for access control | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/understand-mca-roles | | Manage tenants and secure billing access under MCA | https://learn.microsoft.com/en-us/azure/cost-management-billing/microsoft-customer-agreement/manage-tenants | | View Azure reservations as a Cloud Solution Provider | https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/how-to-view-csp-reservations | @@ -206,10 +207,6 @@ This skill requires **network access** to fetch documentation content: | Transfer Azure plan subscriptions between Microsoft partners | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/azure-plan-subscription-transfer-partners | | Transfer billing ownership of MOSP Azure subscriptions | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/billing-subscription-transfer | | Cancel and permanently delete Azure subscriptions safely | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/cancel-azure-subscription | -| Initiate and manage change of channel partner for EA | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/change-of-channel-partner | -| Perform common EA billing administration tasks in Azure portal | 
https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/direct-ea-administration | -| Manage indirect EA billing as a partner in Azure portal | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-billing-administration-partners | -| Transfer Azure Enterprise enrollment accounts and subscriptions | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/ea-transfers | | Configure Azure Marketplace and private offer purchase policies | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/enable-marketplace-purchases | | Configure Partner Admin Link (PAL) for Azure customer management | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/link-partner-id | | Configure Partner Admin Link for Power Platform with Azure credentials | https://learn.microsoft.com/en-us/azure/cost-management-billing/manage/link-partner-id-power-apps-accounts | diff --git a/skills/azure-data-explorer/SKILL.md b/skills/azure-data-explorer/SKILL.md index fa6a804a..5133c83d 100644 --- a/skills/azure-data-explorer/SKILL.md +++ b/skills/azure-data-explorer/SKILL.md @@ -3,7 +3,7 @@ name: azure-data-explorer description: Expert knowledge for Azure Data Explorer development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring ADX auth/RBAC and private endpoints, streaming/queued ingestion, follower DBs, or Power BI integration, and other Azure Data Explorer related development tasks. Not for Azure Synapse Analytics (use azure-synapse-analytics), Azure Stream Analytics (use azure-stream-analytics), Azure HDInsight (use azure-hdinsight), Azure Databricks (use azure-databricks). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. 
metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Data Explorer Skill diff --git a/skills/azure-data-factory/SKILL.md b/skills/azure-data-factory/SKILL.md index 29ccaa28..9deb437a 100644 --- a/skills/azure-data-factory/SKILL.md +++ b/skills/azure-data-factory/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-data-factory -description: Expert knowledge for Azure Data Factory development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when designing ADF pipelines, Mapping Data Flows, SHIR/SSIS IR, secure VNet/Private Link setups, or CI/CD deployments, and other Azure Data Factory related development tasks. Not for Azure Synapse Analytics (use azure-synapse-analytics), Azure Databricks (use azure-databricks), Azure Stream Analytics (use azure-stream-analytics), Azure Data Explorer (use azure-data-explorer). +description: Expert knowledge for Azure Data Factory development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when building ADF pipelines, mapping data flows, SHIR/SSIS IR, VNet/Private Link, or CI/CD deployments, and other Azure Data Factory related development tasks. Not for Azure Synapse Analytics (use azure-synapse-analytics), Azure Databricks (use azure-databricks), Azure Stream Analytics (use azure-stream-analytics), Azure Data Explorer (use azure-data-explorer). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. 
metadata: - generated_at: "2026-04-12" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Data Factory Skill @@ -31,7 +31,7 @@ This skill requires **network access** to fetch documentation content: | Limits & Quotas | L146-L150 | Info on ADF connector lifecycle stages and timelines, plus how reservation discounts work for Mapping Data Flows and how they affect cost and capacity planning | | Security | L151-L181 | Securing Data Factory with identity, encryption, Key Vault, and Azure Policy, plus network controls like VNets, Private Link, firewalls, private endpoints, and secure runtimes (Azure-SSIS, self-hosted). | | Configuration | L182-L295 | Configuring ADF artifacts and runtime: pipelines, activities, data flows, triggers, copy behavior, formats, IRs (Azure/self-hosted/SSIS), monitoring, logging, parameters, and DevOps setup. | -| Integrations & Coding Patterns | L296-L482 | Connector how-tos, patterns, and activities for copying/transforming data between ADF and many services (databases, SaaS apps, files, Fabric, Databricks, SSIS, ML, Synapse) using data flows and SDKs. | +| Integrations & Coding Patterns | L296-L482 | Connector how-tos, patterns, and activities for integrating ADF with dozens of data sources, running external compute (Databricks, Synapse, SSIS, ML), and building mapping data flow transformations. | | Deployment | L483-L496 | CI/CD and deployment for ADF: ARM/linked templates, Azure DevOps pipelines, hotfix flows, pre/post scripts, IR automation, SSIS job migration, and networked IR migration. 
| ### Troubleshooting @@ -363,7 +363,7 @@ This skill requires **network access** to fetch documentation content: | Configure Azure Data Factory OData connector | https://learn.microsoft.com/en-us/azure/data-factory/connector-odata | | Use ODBC connector in Azure Data Factory | https://learn.microsoft.com/en-us/azure/data-factory/connector-odbc | | Copy and transform Microsoft 365 data with Data Factory | https://learn.microsoft.com/en-us/azure/data-factory/connector-office-365 | -| Configure Azure Data Factory Oracle connector | https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle | +| Integrate Azure Data Factory with Oracle databases | https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle | | Connect Azure Data Factory to Oracle Cloud Storage | https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle-cloud-storage | | Connect Azure Data Factory to Oracle Eloqua | https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle-eloqua | | Connect Azure Data Factory to Oracle Responsys | https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle-responsys | diff --git a/skills/azure-database-mysql/SKILL.md b/skills/azure-database-mysql/SKILL.md index 9de9943b..cf0d2cb2 100644 --- a/skills/azure-database-mysql/SKILL.md +++ b/skills/azure-database-mysql/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-database-mysql -description: Expert knowledge for Azure Database for MySQL development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when planning MySQL Flexible Server HA/BCDR, CI/CD deployments, VNet/Private Link, read replicas, or AKS connectivity, and other Azure Database for MySQL related development tasks. 
Not for Azure Database for MariaDB (use azure-database-mariadb), Azure Database for PostgreSQL (use azure-database-postgresql), Azure SQL Database (use azure-sql-database), Azure SQL Managed Instance (use azure-sql-managed-instance). +description: Expert knowledge for Azure Database for MySQL development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when deploying MySQL Flexible Server, configuring VNet/Private Link, HA/read replicas, backups/restore, or AKS connectivity, and other Azure Database for MySQL related development tasks. Not for Azure Database for MariaDB (use azure-database-mariadb), Azure Database for PostgreSQL (use azure-database-postgresql), Azure SQL Database (use azure-sql-database), SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Database for MySQL Skill @@ -26,7 +26,7 @@ This skill requires **network access** to fetch documentation content: |----------|-------|-------------| | Troubleshooting | L37-L53 | Diagnosing and fixing MySQL Flexible Server issues: connectivity, performance (CPU/memory/queries), errors, corruption, capacity, replication lag, CLI problems, and using logs/self-heal tools | | Best Practices | L54-L71 | Best practices for monitoring, performance tuning, troubleshooting, safe operations, BCDR, and end-to-end migration/optimization for Azure Database for MySQL Flexible Server | -| Decision Making | L72-L89 | Guidance on MySQL Flexible Server planning: version lifecycle, HA/BCDR, performance features, sizing and pricing, and assessing, choosing methods for, and executing migrations to Azure. 
| +| Decision Making | L72-L89 | Guidance on MySQL version planning, HA/BCDR choices, sizing and tiers, performance features, and end-to-end migration/upgrade strategies to Azure Database for MySQL. | | Architecture & Design Patterns | L90-L97 | Patterns for connecting AKS to MySQL Flexible Server, designing backup/restore, data-in/out replication, high availability (zone-redundant), and read-replica-based scaling. | | Limits & Quotas | L98-L106 | Limits, quotas, and performance caps for MySQL Flexible Server: compute/storage/IOPS limits, quota increase process, restore retention limits, and stop/start duration constraints. | | Security | L107-L133 | Securing Azure Database for MySQL Flexible Server: network isolation (Private Link, firewalls), TLS and cert rotation, encryption, Entra auth, users, and audit logging before/after migration. | @@ -72,7 +72,7 @@ This skill requires **network access** to fetch documentation content: ### Decision Making | Topic | URL | |-------|-----| -| Plan for Azure Database for MySQL version lifecycle | https://learn.microsoft.com/en-us/azure/mysql/concepts-version-policy | +| Plan MySQL version lifecycle on Azure Database | https://learn.microsoft.com/en-us/azure/mysql/concepts-version-policy | | Choose and purchase MySQL Flexible Server reserved capacity | https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concept-reserved-pricing | | Use accelerated logs for high-performance MySQL workloads | https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concepts-accelerated-logs | | Plan business continuity for MySQL Flexible Server | https://learn.microsoft.com/en-us/azure/mysql/flexible-server/concepts-business-continuity | diff --git a/skills/azure-database-postgresql/SKILL.md b/skills/azure-database-postgresql/SKILL.md index b9f37fb9..6975ca5e 100644 --- a/skills/azure-database-postgresql/SKILL.md +++ b/skills/azure-database-postgresql/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-database-postgresql -description: Expert 
knowledge for Azure Database for PostgreSQL development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when using pgvector search, Azure AI/OpenAI embeddings, VNet/private endpoints, geo-replication, or CI/CD deploys, and other Azure Database for PostgreSQL related development tasks. Not for Azure SQL Database (use azure-sql-database), Azure SQL Managed Instance (use azure-sql-managed-instance), SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines), Azure Cosmos DB (use azure-cosmos-db). +description: Expert knowledge for Azure Database for PostgreSQL development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when using pgvector, Azure AI/OpenAI embeddings, VNet/TLS security, sharding/elastic scale, or PITR/backup, and other Azure Database for PostgreSQL related development tasks. Not for Azure SQL Database (use azure-sql-database), Azure SQL Managed Instance (use azure-sql-managed-instance), SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines), Azure Cosmos DB (use azure-cosmos-db). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Database for PostgreSQL Skill @@ -30,9 +30,9 @@ This skill requires **network access** to fetch documentation content: | Architecture & Design Patterns | L95-L105 | Patterns for using Azure PostgreSQL (often with OpenAI) to build recommendation/semantic search apps, microservices, multitenancy, real-time dashboards, and sharded/elastic data architectures. 
| | Limits & Quotas | L106-L125 | Backup/restore behavior, storage and performance limits, quotas, client/replica caps, elastic cluster limits, and known migration/conversion constraints for Azure Database for PostgreSQL. | | Security | L126-L157 | Securing Azure Database for PostgreSQL: network/VNet and firewall, TLS/SSL, identities and auth (Entra, SCRAM, managed identities), encryption, auditing, Defender, and security policies. | -| Configuration | L158-L246 | Configuring Azure Database for PostgreSQL servers: parameters, extensions, logging, monitoring, HA, networking, maintenance, scaling, vector search, and migration-related settings. | -| Integrations & Coding Patterns | L247-L276 | Integrating Azure PostgreSQL with AI/ML and apps: Azure AI/OpenAI embeddings, LangChain, Foundry/MCP, SDKs (C#, Java, Python, Go, PHP), VS Code, Storage, Data Factory, and migration tools. | -| Deployment | L277-L287 | Deploying and upgrading Azure PostgreSQL (CI/CD, Bicep, AKS/Django, Web Apps), managing networking (VNet, private endpoints), maintenance behavior, and point-in-time restore. | +| Configuration | L158-L245 | Configuring Azure Database for PostgreSQL servers: parameters, extensions, logging, monitoring, HA, networking, maintenance, scaling, vector search, and migration-related settings. | +| Integrations & Coding Patterns | L246-L275 | Integrating Azure PostgreSQL with AI/ML and apps: Azure AI/OpenAI embeddings, LangChain, Foundry/MCP, SDKs (C#, Java, Python, Go, PHP), VS Code, Storage, Data Factory, and migration tools. | +| Deployment | L276-L286 | CI/CD deployment to Azure PostgreSQL, infrastructure-as-code (Bicep), app deployment patterns (AKS/Django, Web Apps + VNet), upgrades, network migrations, and point-in-time restore. 
| ### Troubleshooting | Topic | URL | @@ -158,7 +158,6 @@ This skill requires **network access** to fetch documentation content: ### Configuration | Topic | URL | |-------|-----| -| Configure and plan scheduled maintenance for Azure PostgreSQL | https://learn.microsoft.com/en-us/azure/postgresql/configure-maintain/concepts-maintenance | | Apply server configuration concepts for Azure PostgreSQL flexible server | https://learn.microsoft.com/en-us/azure/postgresql/configure-maintain/concepts-servers | | Schedule maintenance windows for Azure PostgreSQL flexible server | https://learn.microsoft.com/en-us/azure/postgresql/configure-maintain/how-to-configure-scheduled-maintenance | | Deploy PostgreSQL elastic clusters using ARM templates | https://learn.microsoft.com/en-us/azure/postgresql/elastic-clusters/quickstart-create-elastic-cluster-arm-template | @@ -283,5 +282,5 @@ This skill requires **network access** to fetch documentation content: | Deploy Azure PostgreSQL flexible server using Bicep | https://learn.microsoft.com/en-us/azure/postgresql/developer/create-server-bicep | | Deploy Django on AKS with Azure PostgreSQL backend | https://learn.microsoft.com/en-us/azure/postgresql/developer/django-aks-database | | Deploy Azure Web App and PostgreSQL in same VNet | https://learn.microsoft.com/en-us/azure/postgresql/developer/webapp-server-vnet | -| Understand PostgreSQL maintenance release rollout behavior | https://learn.microsoft.com/en-us/azure/postgresql/release-notes-maintenance/release-notes-maintenance-index | +| Migrate Azure PostgreSQL from VNet injection to private endpoints | https://learn.microsoft.com/en-us/azure/postgresql/network/how-to-migrate-vnet-private-endpoint-capable-server | | Restore PostgreSQL flexible server to a point in time | https://learn.microsoft.com/en-us/azure/postgresql/samples/sample-point-in-time-restore | \ No newline at end of file diff --git a/skills/azure-databricks/SKILL.md b/skills/azure-databricks/SKILL.md index 
b1d61142..d20d649f 100644 --- a/skills/azure-databricks/SKILL.md +++ b/skills/azure-databricks/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-databricks -description: Expert knowledge for Azure Databricks development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when using Unity Catalog, Lakehouse/Lakebase, Lakeflow pipelines, Vector Search/RAG, or model serving, and other Azure Databricks related development tasks. Not for Azure Synapse Analytics (use azure-synapse-analytics), Azure HDInsight (use azure-hdinsight), Azure Machine Learning (use azure-machine-learning), Azure Data Factory (use azure-data-factory). +description: Expert knowledge for Azure Databricks development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when using Unity Catalog, Lakehouse/Lakebase, Lakeflow pipelines, Vector Search, or AI/agent & model serving, and other Azure Databricks related development tasks. Not for Azure Synapse Analytics (use azure-synapse-analytics), Azure HDInsight (use azure-hdinsight), Azure Machine Learning (use azure-machine-learning), Azure Data Explorer (use azure-data-explorer). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. 
metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Databricks Skill @@ -24,15 +24,15 @@ This skill requires **network access** to fetch documentation content: | Category | Location | Description | |----------|----------|-------------| -| Troubleshooting | L37-L138 | Diagnosing and fixing Databricks errors, job and compute failures, connector/ingestion issues, SQL error codes, and performance/debugging problems across Spark, AI, Lakeflow, and tooling. | -| Best Practices | L139-L313 | End-to-end Databricks best practices for performance, cost, governance, streaming, ML/LLM/RAG, BI, Lakeflow, Vector Search, and operational reliability across Azure Databricks workloads. | -| Decision Making | L314-L401 | Guides for choosing Azure Databricks architectures, compute, runtimes, ML/LLM options, and detailed migration paths (Unity Catalog, Delta, SQL, Connect, MLflow, serverless, Lakebase, and networking). | -| Architecture & Design Patterns | L402-L444 | Architectural blueprints and patterns for Databricks: lakehouse, networking, storage, HA/DR, governance, performance, ML/MLOps, RAG/agents, Lakebase, streaming, and external data access. | -| Limits & Quotas | [limits-quotas.md](limits-quotas.md) | Limits, quotas, and constraints for Databricks compute, AI/BI, connectors, Lakeflow, Lakebase, model serving, tokens, data types, and Unity Catalog resources, plus related workarounds. | -| Security | [security.md](security.md) | Identity, access control, encryption, networking, compliance, and secure integrations for Azure Databricks, Unity Catalog, Lakeflow, Lakebase, apps, and external data sources. | -| Configuration | [configuration.md](configuration.md) | Configuring and administering Azure Databricks: accounts, workspaces, security, networking, compute, storage, jobs, ML/serving, Lakehouse/Unity Catalog, Lakeflow, apps, and system-table–based monitoring. 
| -| Integrations & Coding Patterns | [integrations.md](integrations.md) | Patterns and APIs for integrating Databricks with external data systems, tools, and AI/ML frameworks, plus detailed PySpark/SQL function references and Lakehouse Federation/streaming examples. | -| Deployment | [deployment.md](deployment.md) | Deploying and operating Databricks apps, agents, models, jobs, and infrastructure using CI/CD, IaC, bundles, serving, Terraform, Git, and region/release planning. | +| Troubleshooting | L37-L135 | Diagnosing and fixing Azure Databricks issues: cluster startup/termination, Spark and SQL errors, connectors/Lakeflow ingestion, jobs/pipelines, CLI/VS Code, Feature Store, model serving, and AI agents. | +| Best Practices | L136-L307 | Best-practice guidance for configuring, optimizing, securing, governing, and operating Azure Databricks and Lakehouse workloads across BI, streaming, ML/LLM, Lakeflow, Vector Search, and cost management. | +| Decision Making | L308-L398 | Guides for choosing and migrating Databricks runtimes, compute, storage, ML/LLM options, and Unity Catalog, plus cost, sizing, and architecture decisions for production lakehouse workloads. | +| Architecture & Design Patterns | L399-L441 | Architectural blueprints and patterns for Databricks: lakehouse design, networking, storage, HA/DR, governance, performance, cost, streaming, ML/MLOps, RAG, and AI/agent systems. | +| Limits & Quotas | [limits-quotas.md](limits-quotas.md) | Limits, quotas, and constraints for Azure Databricks compute, AI/BI, connectors, Lakehouse/Lakebase, SQL types, streaming, Git/Repos, model serving, and Unity Catalog resources. | +| Security | [security.md](security.md) | Identity, access control, encryption, networking, compliance, and secure integrations for Azure Databricks, Unity Catalog, Lakebase, Delta Sharing, Apps, and ingestion workloads. 
| +| Configuration | [configuration.md](configuration.md) | Configuring and managing Azure Databricks: accounts, workspaces, security, networking, compute, storage, jobs, streaming, Unity Catalog, ML/serving, Marketplace, Lakeflow, and admin tooling. | +| Integrations & Coding Patterns | [integrations.md](integrations.md) | Patterns and APIs for integrating Databricks with external data systems, BI/AI tools, agents, streaming, ML/LLM serving, Lakehouse Federation, and low-level Spark/SQL/PySpark coding primitives. | +| Deployment | [deployment.md](deployment.md) | Deploying and managing Azure Databricks workspaces, CI/CD, IaC, jobs, model/feature serving, AI agents, bundles, and related tooling (CLI, Terraform, GitHub Actions, Azure DevOps). | ### Troubleshooting | Topic | URL | @@ -42,10 +42,8 @@ This skill requires **network access** to fetch documentation content: | Resolve Databricks classic compute termination error codes | https://learn.microsoft.com/en-us/azure/databricks/compute/troubleshooting/cluster-error-codes | | Debug Spark applications using Databricks Spark UI | https://learn.microsoft.com/en-us/azure/databricks/compute/troubleshooting/debugging-spark-ui | | Troubleshoot Apache Kafka usage on Databricks | https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/kafka/faq | -| Audit and monitor Delta Sharing activity with Databricks logs | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/audit-logs | -| Troubleshoot common Delta Sharing access errors | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/troubleshooting | -| Troubleshoot common Databricks CLI errors and issues | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/troubleshooting | -| Use Databricks app details for monitoring and troubleshooting | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/view-app-details | +| Troubleshoot common Azure Databricks Delta Sharing errors | 
https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/troubleshooting | +| Troubleshoot Azure Databricks CLI issues | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/troubleshooting | | Troubleshoot Databricks Connect for Python issues | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/troubleshooting | | Troubleshoot Databricks Connect for Scala problems | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/scala/troubleshooting | | Troubleshoot common Databricks Terraform provider errors | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/terraform/troubleshoot | @@ -90,15 +88,14 @@ This skill requires **network access** to fetch documentation content: | Troubleshoot HubSpot connector ingestion problems | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/hubspot-troubleshoot | | Troubleshoot Jira Lakeflow ingestion errors | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/jira-troubleshoot | | Troubleshoot Meta Ads Lakeflow ingestion issues | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/meta-ads-troubleshoot | -| Diagnose and fix MySQL Lakeflow Connect ingestion errors | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-troubleshoot | -| Resolve common PostgreSQL Lakeflow Connect connector issues | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-faq | -| Troubleshoot PostgreSQL ingestion with Lakeflow Connect | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-troubleshoot | +| Diagnose and fix Lakeflow MySQL ingestion errors | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-troubleshoot | +| Resolve common PostgreSQL Lakeflow ingestion issues | 
https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-troubleshoot | | Troubleshoot Lakeflow Connect query-based connector issues | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/query-based-troubleshoot | | Troubleshoot Salesforce Lakeflow ingestion problems | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/salesforce-troubleshoot | | Diagnose and fix Databricks ServiceNow connector issues | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/servicenow-troubleshoot | | Diagnose and fix Lakeflow SharePoint connector issues | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-troubleshoot | | Answer common SQL Server Lakeflow Connect connector questions | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-faq | -| Troubleshoot SQL Server ingestion with Lakeflow Connect | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-troubleshoot | +| Troubleshoot Databricks Lakeflow SQL Server ingestion | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-troubleshoot | | Troubleshoot TikTok Ads connector in Lakeflow | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/tiktok-ads-troubleshoot | | Troubleshoot Workday HCM connector in Lakeflow | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-hcm-troubleshoot | | Diagnose and fix Databricks Workday connector issues | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-reports-troubleshoot | @@ -139,9 +136,10 @@ This skill requires **network access** to fetch documentation content: ### Best Practices | Topic | URL | |-------|-----| -| Use Databricks default compute policy families effectively | 
https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/policy-families | -| Apply identity best practices and migrate to federation | https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/best-practices | -| Apply best practices for Azure Databricks serverless workspaces | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/serverless-workspaces-best-practices | +| Tag Azure Databricks resources for cost attribution | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/usage-detail-tags | +| Apply default Databricks compute policy families | https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/policy-families | +| Apply identity best practices and migrate to federation in Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/best-practices | +| Apply best practices for Databricks serverless workspaces | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/serverless-workspaces-best-practices | | Migrate Databricks library installs from init scripts | https://learn.microsoft.com/en-us/azure/databricks/archive/compute/libraries-init-scripts | | Apply best practices for Databricks compute policies | https://learn.microsoft.com/en-us/azure/databricks/archive/compute/policies-best-practices | | Use DBIO for transactional writes to cloud storage in Databricks | https://learn.microsoft.com/en-us/azure/databricks/archive/legacy/dbio-commit | @@ -164,24 +162,24 @@ This skill requires **network access** to fetch documentation content: | Control large interactive queries with Query Watchdog | https://learn.microsoft.com/en-us/azure/databricks/compute/troubleshooting/query-watchdog | | Apply observability best practices for Databricks jobs and pipelines | https://learn.microsoft.com/en-us/azure/databricks/data-engineering/observability-best-practices | | Write efficient UDFs for Unity Catalog ABAC policies | 
https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/abac/udf-best-practices | -| Apply Unity Catalog data governance best practices | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/best-practices | +| Apply Unity Catalog governance best practices in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/best-practices | | Work with legacy Hive metastore database objects | https://learn.microsoft.com/en-us/azure/databricks/database-objects/hive-metastore | | Follow DBFS root storage recommendations in Databricks | https://learn.microsoft.com/en-us/azure/databricks/dbfs/dbfs-root | | Migrate from DBFS mounts to Unity Catalog external locations | https://learn.microsoft.com/en-us/azure/databricks/dbfs/mounts | | Apply DBFS and Unity Catalog usage best practices | https://learn.microsoft.com/en-us/azure/databricks/dbfs/unity-catalog | | Optimize Delta Sharing to reduce cloud egress costs | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/manage-egress | | Apply Delta Lake best practices on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/delta/best-practices | -| Optimize Databricks tables with liquid clustering | https://learn.microsoft.com/en-us/azure/databricks/delta/clustering | +| Optimize Delta tables with liquid clustering on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/delta/clustering | | Tune Azure Databricks data skipping with stats and Z-order | https://learn.microsoft.com/en-us/azure/databricks/delta/data-skipping | -| Use deletion vectors to accelerate Delta table modifications on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/delta/deletion-vectors | +| Use deletion vectors to accelerate Delta table updates | https://learn.microsoft.com/en-us/azure/databricks/delta/deletion-vectors | | Optimize Delta table file layout on Databricks | 
https://learn.microsoft.com/en-us/azure/databricks/delta/optimize | | Handle Delta Lake limitations on Amazon S3 | https://learn.microsoft.com/en-us/azure/databricks/delta/s3-limitations | -| Apply selective overwrite patterns in Delta Lake | https://learn.microsoft.com/en-us/azure/databricks/delta/selective-overwrite | +| Apply selective overwrite patterns with Delta Lake | https://learn.microsoft.com/en-us/azure/databricks/delta/selective-overwrite | | Control Delta table data file size on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/delta/tune-file-size | -| Vacuum Delta tables safely and efficiently on Databricks | https://learn.microsoft.com/en-us/azure/databricks/delta/vacuum | +| Vacuum Delta tables safely and efficiently in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/delta/vacuum | | Optimize VARIANT data performance with shredding on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/delta/variant-shredding | | Apply MLOps Stack best practices with bundles | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/mlops-stacks | -| Apply Databricks-recommended CI/CD workflows and patterns | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/best-practices | +| Apply CI/CD best practices on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/best-practices | | View Databricks cluster policy families via CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/policy-families-commands | | Apply security and performance best practices for Databricks apps | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/best-practices | | Test Databricks Connect for Python code with pytest | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/testing | @@ -197,13 +195,11 @@ This skill requires **network access** to fetch documentation content: | Configure 
Genie Code custom instructions effectively | https://learn.microsoft.com/en-us/azure/databricks/genie-code/instructions | | Apply practical tips to improve Genie Code responses | https://learn.microsoft.com/en-us/azure/databricks/genie-code/tips | | Apply best practices for curating Genie spaces | https://learn.microsoft.com/en-us/azure/databricks/genie/best-practices | +| Design effective Genie knowledge stores for accurate answers | https://learn.microsoft.com/en-us/azure/databricks/genie/knowledge-store | | Migrate existing Auto Loader streams to file events | https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/migrating-to-file-events | -| Apply common Auto Loader data ingestion patterns | https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/patterns | | Configure Databricks Auto Loader for production | https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/production | -| Configure Auto Loader with Unity Catalog for secure ingestion | https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/unity-catalog | | Apply common COPY INTO data loading patterns | https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/examples | -| Use the _metadata file column in Databricks | https://learn.microsoft.com/en-us/azure/databricks/ingestion/file-metadata-column | -| Apply common patterns for Lakeflow managed ingestion pipelines | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/common-patterns | +| Apply common Lakeflow Connect ingestion pipeline patterns | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/common-patterns | | Fully refresh Lakeflow Connect managed ingestion target tables | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/full-refresh | | Query system.billing.usage to monitor 
Lakeflow ingestion costs | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/monitor-costs | | Perform ongoing maintenance for Lakeflow pipelines | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/pipeline-maintenance | @@ -211,7 +207,7 @@ This skill requires **network access** to fetch documentation content: | Enable incremental ingestion for Salesforce formula fields | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/salesforce-formula-fields | | Use Databricks init scripts for cluster customization | https://learn.microsoft.com/en-us/azure/databricks/init-scripts/ | | Reference external files safely in Databricks init scripts | https://learn.microsoft.com/en-us/azure/databricks/init-scripts/referencing-files | -| Configure compute for Lakeflow Jobs with recommended patterns | https://learn.microsoft.com/en-us/azure/databricks/jobs/compute | +| Configure Azure Databricks compute for Lakeflow Jobs | https://learn.microsoft.com/en-us/azure/databricks/jobs/compute | | Build metadata-driven For each jobs with control tables | https://learn.microsoft.com/en-us/azure/databricks/jobs/how-to/foreach-sql-lookup-tutorial | | Apply best practices for configuring classic Lakeflow Jobs | https://learn.microsoft.com/en-us/azure/databricks/jobs/run-classic-jobs | | Apply cost optimization best practices on Databricks | https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/cost-optimization/best-practices | @@ -227,8 +223,7 @@ This skill requires **network access** to fetch documentation content: | Apply development best practices to Lakeflow pipelines | https://learn.microsoft.com/en-us/azure/databricks/ldp/develop | | Manage Python dependencies safely in Databricks pipelines | https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/external-dependencies | | Implement advanced expectation patterns at scale | 
https://learn.microsoft.com/en-us/azure/databricks/ldp/expectation-patterns | -| Reduce Lakeflow pipeline initialization latency | https://learn.microsoft.com/en-us/azure/databricks/ldp/fix-high-init | -| Backfill historical data with Lakeflow pipelines | https://learn.microsoft.com/en-us/azure/databricks/ldp/flows-backfill | +| Reduce initialization latency in Databricks pipelines | https://learn.microsoft.com/en-us/azure/databricks/ldp/fix-high-init | | Run full refresh operations for Databricks streaming tables safely | https://learn.microsoft.com/en-us/azure/databricks/ldp/full-refresh-st | | Optimize stateful stream processing with watermarks | https://learn.microsoft.com/en-us/azure/databricks/ldp/stateful-processing | | Design CDC and snapshot patterns in Databricks | https://learn.microsoft.com/en-us/azure/databricks/ldp/what-is-change-data-capture | @@ -236,7 +231,7 @@ This skill requires **network access** to fetch documentation content: | Apply data loading best practices on Databricks AI Runtime | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/dataloading | | Track experiments and monitor GPU health on AI Runtime | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/tracking-observability | | Apply Hyperopt best practices and troubleshooting on Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl-hyperparam-tuning/hyperopt-best-practices | -| Implement point-in-time correct feature joins for time series | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/time-series | +| Implement point-in-time correct feature joins | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/time-series | | Benchmark Databricks LLM endpoints for latency and throughput | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/prov-throughput-run-benchmark | | Load and prepare data for ML on 
Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/load-data/ | | Implement LLMOps workflows on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/mlops/llmops | @@ -255,7 +250,7 @@ This skill requires **network access** to fetch documentation content: | Evaluate and compare MLflow prompt versions effectively | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/prompt-version-mgmt/prompt-registry/evaluate-prompts | | Use manual MLflow tracing for production GenAI apps | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/app-instrumentation/manual-tracing/ | | Analyze GenAI traces for errors and performance | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/analyze-traces | -| Apply software engineering practices to Databricks notebooks | https://learn.microsoft.com/en-us/azure/databricks/notebooks/best-practices | +| Apply engineering best practices to Azure Databricks notebooks | https://learn.microsoft.com/en-us/azure/databricks/notebooks/best-practices | | Apply Genie Code effectively in Databricks notebooks | https://learn.microsoft.com/en-us/azure/databricks/notebooks/code-assistant | | Run Databricks notebooks safely and efficiently | https://learn.microsoft.com/en-us/azure/databricks/notebooks/run-notebook | | Test and schedule Databricks notebook code | https://learn.microsoft.com/en-us/azure/databricks/notebooks/test-notebooks | @@ -283,7 +278,6 @@ This skill requires **network access** to fetch documentation content: | Encrypt inter-node traffic in Databricks clusters | https://learn.microsoft.com/en-us/azure/databricks/security/keys/encrypt-otw | | Transform complex and nested data types in Databricks | https://learn.microsoft.com/en-us/azure/databricks/semi-structured/complex-types | | Use higher-order functions on arrays in Databricks SQL | 
https://learn.microsoft.com/en-us/azure/databricks/semi-structured/higher-order-functions | -| Query semi-structured data using VARIANT in Databricks | https://learn.microsoft.com/en-us/azure/databricks/semi-structured/variant | | Differences between VARIANT and JSON strings in Databricks | https://learn.microsoft.com/en-us/azure/databricks/semi-structured/variant-json-diff | | Work with OBJECT type and VARIANT schemas in Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/object-type | | Use TIMESTAMP_NTZ type and Delta support in Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/timestamp-ntz-type | @@ -296,9 +290,9 @@ This skill requires **network access** to fetch documentation content: | Production best practices for Databricks Structured Streaming | https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/production | | Optimize and monitor Databricks real-time streaming performance | https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/performance | | Optimize stateless Structured Streaming queries on Databricks | https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/stateless-streaming | -| Monitor Azure Databricks Structured Streaming queries | https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/stream-monitoring | +| Monitor Databricks Structured Streaming queries effectively | https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/stream-monitoring | | Combine Unity Catalog with Structured Streaming workloads | https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/unity-catalog | -| Apply watermarks for efficient stateful streaming | https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/watermarks | +| Apply watermarks for Databricks streaming state control | 
https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/watermarks | | Use Automatic Feature Enablement for Unity Catalog tables | https://learn.microsoft.com/en-us/azure/databricks/tables/automatic-feature-enablement | | Analyze and optimize Delta table storage size on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/tables/size | | Design data models optimized for Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/transform/data-modeling | @@ -314,15 +308,18 @@ This skill requires **network access** to fetch documentation content: ### Decision Making | Topic | URL | |-------|-----| -| Manage and change Azure Databricks subscription plans | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/account | -| Plan migration from Databricks Standard to Premium tier | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/standard-tier | -| Decide when and how to use serverless workspaces | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/serverless-workspaces | +| Manage and change Azure Databricks subscriptions | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/account | +| Create and monitor Azure Databricks spending budgets | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/budgets | +| Plan for Databricks Standard tier end of life | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/standard-tier | +| Use and configure the Personal Compute policy | https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/personal-compute | +| Evaluate and use Databricks serverless workspaces | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/serverless-workspaces | | Decide and migrate from dbx to Databricks bundles | https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/dbx/dbx-migrate | | Migrate optimized LLM endpoints to provisioned throughput | 
https://learn.microsoft.com/en-us/azure/databricks/archive/machine-learning/migrate-provisioned-throughput | | Decide when to use Databricks Light runtime | https://learn.microsoft.com/en-us/azure/databricks/archive/runtime/light | | Plan migration of Databricks workloads to Spark 3.x | https://learn.microsoft.com/en-us/azure/databricks/archive/spark-3.x-migration/ | | Select and manage the default Unity Catalog catalog | https://learn.microsoft.com/en-us/azure/databricks/catalogs/default | | Choose appropriate Azure Databricks compute types | https://learn.microsoft.com/en-us/azure/databricks/compute/choose-compute | +| Select compatible instance types for flexible nodes | https://learn.microsoft.com/en-us/azure/databricks/compute/flexible-node-type-instances | | Decide when and how to use GPU Databricks compute | https://learn.microsoft.com/en-us/azure/databricks/compute/gpu | | Decide when and how to use Azure Databricks pools | https://learn.microsoft.com/en-us/azure/databricks/compute/pool-index | | Plan migration from classic to serverless Databricks compute | https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/migration | @@ -337,7 +334,7 @@ This skill requires **network access** to fetch documentation content: | Manage Databricks account budget policies via CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/account-budget-policy-commands | | Configure Databricks account budgets using CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/account-budgets-commands | | Manage Databricks account usage dashboards via CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/account-usage-dashboards-commands | -| Select and configure compute size for Databricks Apps | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/compute-size | +| Choose compute size for Azure Databricks Apps | 
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/compute-size | | Plan migration from legacy Databricks Connect runtimes | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect-legacy | | Migrate from older to new Databricks Connect for Python | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/migrate | | Migrate from legacy to new Scala Databricks Connect | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/scala/migrate | @@ -361,12 +358,12 @@ This skill requires **network access** to fetch documentation content: | Choose a programming language for Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/languages/overview | | Migrate legacy and third-party online tables to Lakebase | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/migrate-from-online-tables | | Upgrade workspace feature tables to Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/uc/upgrade-feature-table-to-uc | -| Choose Databricks-hosted foundation models and endpoints | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/supported-models | +| Choose Databricks-hosted foundation models in Foundation Model APIs | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/supported-models | | Migrate MLflow model versions to Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/manage-model-lifecycle/migrate-models | | Decide and migrate to Unity Catalog model management | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/manage-model-lifecycle/migrate-to-uc | | Upgrade Databricks ML workflows to Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/manage-model-lifecycle/upgrade-workflows | | Choose Databricks options for batch model 
inference | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-inference/ | -| Select supported foundation models on Mosaic AI | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/foundation-model-overview | +| Select supported foundation models on Mosaic AI Model Serving | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/foundation-model-overview | | Migrate from legacy MLflow to Mosaic AI Model Serving | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/migrate-model-serving | | Decide when to use Spark vs. Ray on Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ray/spark-ray-overview | | Plan migration of data applications to Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/migration/ | @@ -383,17 +380,17 @@ This skill requires **network access** to fetch documentation content: | Choose and configure incremental refresh for Databricks materialized views | https://learn.microsoft.com/en-us/azure/databricks/optimizations/incremental-refresh | | Choose pandas options on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/pandas/ | | Plan and use Hive metastore federation with Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/query-federation/hms-federation-concepts | -| Migrate Databricks HTTP routing to serverless compute | https://learn.microsoft.com/en-us/azure/databricks/query-federation/http-migration | | Migrate legacy Databricks query federation to Lakehouse Federation | https://learn.microsoft.com/en-us/azure/databricks/query-federation/migrate | | Plan and execute migration to Databricks Runtime 11.x | https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/11.x-migration | | Migrate workloads to Databricks Runtime 12.x safely | https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/12.x-migration | | Migrate workloads 
to Databricks Runtime 13.x safely | https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/13.x-migration | | Migrate workloads to Databricks Runtime 14.x safely | https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/14.x-migration | | Plan around Databricks Runtime support lifecycles | https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/databricks-runtime-ver | -| Understand serverless DBU billing by Azure Databricks SKU | https://learn.microsoft.com/en-us/azure/databricks/resources/pricing | +| Choose and configure Databricks Designated Services geos | https://learn.microsoft.com/en-us/azure/databricks/resources/designated-services | +| Choose Azure Databricks serverless SKUs and DBU multipliers | https://learn.microsoft.com/en-us/azure/databricks/resources/pricing | | Evaluate Databricks serverless networking data transfer costs | https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/cost-management | | Decide between Spark Connect and Spark Classic | https://learn.microsoft.com/en-us/azure/databricks/spark/connect-vs-classic | -| Decide between SparkR and sparklyr on Databricks | https://learn.microsoft.com/en-us/azure/databricks/sparkr/sparkr-vs-sparklyr | +| Choose between SparkR and sparklyr on Databricks | https://learn.microsoft.com/en-us/azure/databricks/sparkr/sparkr-vs-sparklyr | | Migrate to the latest Databricks SQL REST API | https://learn.microsoft.com/en-us/azure/databricks/sql/dbsql-api-latest | | Use SYNC to upgrade Hive tables to Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-sync | | Choose Structured Streaming output modes on Databricks | https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/output-mode | @@ -402,9 +399,9 @@ ### Architecture & Design Patterns | Topic | URL | |-------|-----| -| Plan 
disaster recovery architecture for Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/disaster-recovery | +| Design disaster recovery patterns for Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/disaster-recovery | | Design and use materialization for Databricks metric views | https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/materialization | -| Implement fan-in and fan-out patterns in Lakeflow pipelines | https://learn.microsoft.com/en-us/azure/databricks/data-engineering/fan-in-fan-out | +| Implement fan-in and fan-out in Lakeflow pipelines | https://learn.microsoft.com/en-us/azure/databricks/data-engineering/fan-in-fan-out | | Choose patterns for external access to Databricks data | https://learn.microsoft.com/en-us/azure/databricks/external-access/ | | Build an IDP pipeline with Databricks AI Functions | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/idp-pipeline-tutorial | | Build multi-agent orchestrator apps on Databricks | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/multi-agent-apps | @@ -414,6 +411,7 @@ This skill requires **network access** to fetch documentation content: | Apply agent system design patterns on Databricks | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/guide/agent-system-design-patterns | | Design measurement infrastructure for RAG quality on Databricks | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/tutorials/ai-cookbook/evaluate-enable-measurement | | Design and tune Databricks RAG inference chains | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/tutorials/ai-cookbook/fundamentals-inference-chain-rag | +| Apply Auto Loader data ingestion patterns in Databricks | https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/patterns | | Design cost optimization architecture for Databricks lakehouse 
| https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/cost-optimization/ | | Apply data and AI governance architecture on Databricks | https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/data-governance/ | | Design Delta Lake and medallion data architecture on Databricks | https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/deployment-guide/delta-lake | @@ -427,14 +425,13 @@ This skill requires **network access** to fetch documentation content: | Apply Azure Databricks lakehouse reference architectures | https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/reference | | Design reliability architecture for Databricks lakehouse | https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/reliability/ | | Apply the data lakehouse pattern on Databricks | https://learn.microsoft.com/en-us/azure/databricks/lakehouse/ | -| Design online feature workflows with Databricks and third-party stores | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/online-workflows | +| Architect online feature serving with Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/online-feature-store | | Choose Databricks ML model deployment patterns | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/mlops/deployment-patterns | | Implement MLOps workflows on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/mlops/mlops-workflow | | Choose and train deep-learning recommenders in Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/train-recommender-models | | Use Lakebase branches for database development workflows | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/branches | | Design for high availability with Lakebase computes | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/high-availability | | Scale reads 
with Lakebase read replicas | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/read-replicas | -| Serve lakehouse data via Lakebase synced tables | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/sync-tables | | Connect Databricks Serverless Private Git to on-prem Git | https://learn.microsoft.com/en-us/azure/databricks/repos/connect-on-prem-git-server | | Set up Databricks Serverless Private Git with Private Link | https://learn.microsoft.com/en-us/azure/databricks/repos/serverless-private-git | | Choose patterns for modeling semi-structured data on Databricks | https://learn.microsoft.com/en-us/azure/databricks/semi-structured/ | diff --git a/skills/azure-databricks/configuration.md b/skills/azure-databricks/configuration.md index c44aba93..dcdc6176 100644 --- a/skills/azure-databricks/configuration.md +++ b/skills/azure-databricks/configuration.md @@ -8,63 +8,66 @@ | Topic | URL | |-------|-----| | Configure Azure Databricks account-level settings | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/ | -| Interpret Azure Databricks diagnostic audit logs | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/audit-logs | -| Configure and monitor Azure Databricks account budgets | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/budgets | -| Enable and configure custom URL for Databricks account | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/custom-url | -| Configure disabling legacy features in new Databricks workspaces | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/legacy-features | -| Enable and configure verbose audit logs in Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/verbose-logs | -| Configure and enforce Personal Compute policy in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/personal-compute | -| Define Azure 
Databricks compute policy attributes | https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/policy-definition | +| Configure Azure Databricks diagnostic log delivery | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/audit-log-delivery | +| Reference Azure Databricks diagnostic log services and events | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/audit-logs | +| Configure a custom URL for Databricks account access | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/custom-url | +| Disable legacy features in new Databricks workspaces | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/legacy-features | +| Import and use Azure Databricks usage dashboards | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/usage | +| Enable and manage verbose audit logs in Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/verbose-logs | +| Configure automatic cluster updates in Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/automatic-cluster-update | +| Create and manage Databricks compute policies | https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/policies | +| Define Databricks compute policies with JSON attributes | https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/policy-definition | | Enable and manage the Azure Databricks web terminal | https://learn.microsoft.com/en-us/azure/databricks/admin/clusters/web-terminal | -| Configure SQL warehouse admin settings and access controls | https://learn.microsoft.com/en-us/azure/databricks/admin/sql/ | +| Configure SQL warehouse admin settings in Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/sql/ | +| Configure legacy data access for Databricks SQL warehouses | https://learn.microsoft.com/en-us/azure/databricks/admin/sql/data-access-configuration | | Set up and manage serverless 
SQL warehouses | https://learn.microsoft.com/en-us/azure/databricks/admin/sql/serverless | -| Use Azure Databricks system tables for monitoring | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/ | -| Monitor Genie Code usage with assistant events system table | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/assistant | -| Use the Databricks audit log system table schema | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/audit-logs | -| Query billable usage system table in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/billing | -| Track clean room activity with events system table | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/clean-rooms | -| Reference Databricks compute system tables for monitoring | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/compute | -| Use data classification system table for sensitive data detection | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/data-classification | -| Use data quality monitoring results system table in Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/data-quality-monitoring | -| Use Databricks jobs system tables for monitoring | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/jobs | -| Use Databricks system tables to monitor job costs | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/jobs-cost | -| Use Databricks lineage system tables for data tracking | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/lineage | -| Analyze Databricks Marketplace data with system tables | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/marketplace | -| Use Delta Sharing materialization history system table | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/materialization | -| Query MLflow experiment 
metadata via Databricks system tables | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/mlflow | -| Analyze Databricks model serving costs with system tables | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/model-serving-cost | -| Analyze network access denials with Databricks system tables | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/network | -| Use predictive optimization system table in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/predictive-optimization | +| Use Databricks system tables for operational monitoring | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/ | +| Monitor Genie Code usage with assistant events table | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/assistant | +| Use audit log system table for account activity | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/audit-logs | +| Query billable usage system table in Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/billing | +| Track clean room events via system table | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/clean-rooms | +| Use compute system tables to monitor Databricks clusters | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/compute | +| Use data classification results system table | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/data-classification | +| Query data quality monitoring results system table | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/data-quality-monitoring | +| Use Lakeflow jobs system tables to track Databricks jobs | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/jobs | +| Analyze Lakeflow job costs and performance with system tables | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/jobs-cost | +| 
Query table and column lineage system tables | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/lineage | +| Use Marketplace system tables for provider analytics | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/marketplace | +| Analyze Delta Sharing materialization history table | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/materialization | +| Work with MLflow experiment system tables in Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/mlflow | +| Track Mosaic AI Model Serving costs with system tables | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/model-serving-cost | +| Analyze network access events via system table | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/network | +| Inspect predictive optimization operation history table | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/predictive-optimization | | Use Databricks pricing system table for SKU price history | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/pricing | -| Query Azure Databricks query history system table | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/query-history | -| Analyze Databricks serverless compute costs via system tables | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/serverless-billing | -| Use warehouse events system table for SQL warehouse monitoring | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/warehouse-events | -| Analyze SQL warehouses via Databricks warehouses system table | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/warehouses | -| Monitor Azure Databricks workspaces with system table reference | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/workspaces | -| Monitor Zerobus Ingest activity using Databricks system tables | 
https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/zerobus-ingest | -| Configure Databricks serverless usage policies for tagging | https://learn.microsoft.com/en-us/azure/databricks/admin/usage/budget-policies | -| Use Databricks billable usage system table for costs | https://learn.microsoft.com/en-us/azure/databricks/admin/usage/system-tables | -| Configure automatic identity sync from Entra ID to Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/automatic-identity-management | -| Configure SCIM-based user and group provisioning to Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/scim/ | -| Manage legacy workspace-local groups in Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/workspace-local-groups | -| Configure workspace appearance settings in Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/appearance | -| Configure serverless workspace base environments in Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/base-environment | +| Query query-history system table in Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/query-history | +| Monitor serverless compute costs via billing table | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/serverless-billing | +| Monitor SQL warehouse events with system tables | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/warehouse-events | +| Analyze SQL warehouses using system.compute.warehouses table | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/warehouses | +| Inspect workspace metadata using system table | https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/workspaces | +| Query Zerobus Ingest system table schema and fields | 
https://learn.microsoft.com/en-us/azure/databricks/admin/system-tables/zerobus-ingest | +| Configure serverless usage policies for cost attribution | https://learn.microsoft.com/en-us/azure/databricks/admin/usage/budget-policies | +| Monitor default storage costs using billable usage table | https://learn.microsoft.com/en-us/azure/databricks/admin/usage/default-storage | +| Query billable usage system table for Databricks costs | https://learn.microsoft.com/en-us/azure/databricks/admin/usage/system-tables | +| Manage workspace base environments for serverless notebooks | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/base-environment | | Manage DBFS visual file browser access in Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/dbfs-browser | | Set default access mode for Databricks jobs compute | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/default-access-mode | | Configure default Python package repositories in Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/default-python-packages | -| Auto-enable deletion vectors for new Delta tables | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/deletion-vectors | +| Configure default deletion vector behavior for Delta tables | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/deletion-vectors | | Disable the upload data UI in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/disable-upload-data-ui | -| Configure email notification settings for Databricks workspaces | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/email | -| Manage Azure Databricks preview feature settings | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/manage-previews | +| Configure workspace email notification settings in Databricks | 
https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/email | | Configure storage location for Databricks notebook results | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/notebook-results | | Manage user access to Databricks notebook features | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/notebooks | -| Configure Azure Databricks notification destinations via webhooks | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/notification-destinations | +| Configure notification destinations and webhooks in Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/notification-destinations | | Purge Azure Databricks workspace storage securely | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/storage | | Change Azure Databricks workspace storage redundancy settings | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/workspace-storage-redundancy | | Configure Slack notifications for AI/BI dashboard subscriptions | https://learn.microsoft.com/en-us/azure/databricks/ai-bi/admin/slack-subscriptions | | Configure Microsoft Teams notifications for AI/BI dashboards | https://learn.microsoft.com/en-us/azure/databricks/ai-bi/admin/teams-subscriptions | | Configure workspace themes for AI/BI dashboards | https://learn.microsoft.com/en-us/azure/databricks/ai-bi/admin/themes | -| Configure AI Gateway endpoints in the beta experience | https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/configure-endpoints-beta | +| Configure Mosaic AI Gateway on Databricks model serving endpoints | https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/configure-ai-gateway-endpoints | +| Configure Mosaic AI Gateway endpoints in Databricks | https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/configure-endpoints-beta | +| Enable and use Mosaic AI Gateway inference tables for served models | 
https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/inference-tables | +| Configure Mosaic AI Gateway inference tables for endpoint monitoring | https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/inference-tables-beta | +| Configure and query Mosaic AI Gateway usage tracking tables | https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/usage-tracking-beta | | Understand Databricks cluster UI changes and access modes | https://learn.microsoft.com/en-us/azure/databricks/archive/compute/cluster-ui-preview | | Configure legacy Azure Databricks cluster settings | https://learn.microsoft.com/en-us/azure/databricks/archive/compute/configure | | Install and configure legacy Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/ | @@ -96,23 +99,21 @@ | Configure legacy Delta Lake storage credentials | https://learn.microsoft.com/en-us/azure/databricks/archive/storage/delta-storage-credentials | | Configure agent (semantic) metadata for Databricks metric views | https://learn.microsoft.com/en-us/azure/databricks/business-semantics/agent-metadata | | Create and edit Unity Catalog metric views | https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/create-edit | -| Reference YAML configuration for Databricks metric views | https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/yaml-reference | +| Use YAML syntax to define Azure Databricks metric views | https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/yaml-reference | | Use Catalog Explorer Entity Relationship Diagram in Databricks | https://learn.microsoft.com/en-us/azure/databricks/catalog-explorer/entity-relationship-diagram | | Create Unity Catalog catalogs in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/catalogs/create-catalog | | Manage Unity Catalog catalogs in Azure Databricks | 
https://learn.microsoft.com/en-us/azure/databricks/catalogs/manage-catalog | | Run notebooks securely in clean rooms | https://learn.microsoft.com/en-us/azure/databricks/clean-rooms/clean-room-notebook | -| Create Azure Databricks clean rooms for secure collaboration | https://learn.microsoft.com/en-us/azure/databricks/clean-rooms/create-clean-room | | Manage and monitor Azure Databricks clean rooms | https://learn.microsoft.com/en-us/azure/databricks/clean-rooms/manage-clean-room | | Create and use Clean Rooms output tables | https://learn.microsoft.com/en-us/azure/databricks/clean-rooms/output-tables | | Add and manage comments on Unity Catalog assets | https://learn.microsoft.com/en-us/azure/databricks/comments/ | -| Use AI-generated comments for Unity Catalog objects | https://learn.microsoft.com/en-us/azure/databricks/comments/ai-comments | -| Use Databricks compute configuration settings effectively | https://learn.microsoft.com/en-us/azure/databricks/compute/configure | +| Configure Azure Databricks compute settings and options | https://learn.microsoft.com/en-us/azure/databricks/compute/configure | | Configure custom Docker containers for Databricks compute | https://learn.microsoft.com/en-us/azure/databricks/compute/custom-containers | -| Reference compatible instance groups for flexible nodes | https://learn.microsoft.com/en-us/azure/databricks/compute/flexible-node-type-instances | | Configure Databricks instance pools in the UI | https://learn.microsoft.com/en-us/azure/databricks/compute/pools | | Configure environments and policies for Databricks serverless notebooks | https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/dependencies | | Configure and manage Databricks SQL warehouses | https://learn.microsoft.com/en-us/azure/databricks/compute/sql-warehouse/create | | Monitor Databricks SQL warehouses in the UI | https://learn.microsoft.com/en-us/azure/databricks/compute/sql-warehouse/monitor/ | +| Configure connections for Lakeflow 
managed ingestion sources | https://learn.microsoft.com/en-us/azure/databricks/connect/managed-ingestion | | Configure Kafka connector options in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/kafka/options | | Configure Unity Catalog external locations for cloud storage | https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/external-locations | | Connect DBFS root as a Unity Catalog external location | https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/external-locations-dbfs-root | @@ -123,8 +124,9 @@ | Query Databricks audit logs to monitor dashboard usage | https://learn.microsoft.com/en-us/azure/databricks/dashboards/monitor-usage | | Apply certified and deprecated tags in Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/certify-deprecate-data | | Create and link Unity Catalog metastores in Databricks | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/create-metastore | +| Configure automatic data classification in Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-classification | | Use supported Databricks data classification tags | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-classification-tags | -| View and analyze data lineage with Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-lineage | +| View and configure Unity Catalog data lineage | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-lineage | | Create and configure data profiles in Databricks UI | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/data-profiling/create-monitor-ui | | Define and use custom metrics in Databricks data profiling | 
https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/data-profiling/custom-metrics | | Query Databricks data quality monitoring expenses | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/data-profiling/expense | @@ -135,18 +137,15 @@ | Ingest external data lineage into Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/external-lineage | | Configure legacy Hive metastore alongside Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/hive-metastore | | Update Databricks jobs after upgrading to Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/jobs-update | -| Manage Unity Catalog metastore lifecycle and behavior | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-metastore | +| Configure and manage Unity Catalog metastores | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-metastore | | Upgrade Hive metastore tables and views to Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/migrate | | Use UCX utilities to automate Unity Catalog workspace upgrade | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/ucx | | Understand Databricks DBFS behavior and deprecation | https://learn.microsoft.com/en-us/azure/databricks/dbfs/ | | Disable DBFS root and mounts in existing workspaces | https://learn.microsoft.com/en-us/azure/databricks/dbfs/disable-dbfs-root-mounts | | Create and manage Delta Sharing recipients in Databricks | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/create-recipient | -| Create and manage Unity Catalog Delta Sharing shares | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/create-share | | Access Delta Sharing data as a 
recipient in Databricks | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/recipient | | Configure SAP BDC connector for Delta Sharing | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/sap-bdc/create-connection | | Configure Delta Sharing for Azure Databricks providers | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/set-up | -| Configure Databricks-to-Databricks Delta Sharing for providers | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/share-data-databricks | -| Configure open Delta Sharing for external data recipients | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/share-data-open | | Configure catalog commits for Unity Catalog Delta tables | https://learn.microsoft.com/en-us/azure/databricks/delta/catalog-commits | | Enable and use Delta Lake Checkpoint V2 in Databricks | https://learn.microsoft.com/en-us/azure/databricks/delta/checkpoint-v2 | | Clone Delta, Parquet, and Iceberg tables on Databricks | https://learn.microsoft.com/en-us/azure/databricks/delta/clone | @@ -155,7 +154,7 @@ | Configure and use Delta Lake change data feed on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/delta/delta-change-data-feed | | Drop Delta table features and downgrade protocol versions | https://learn.microsoft.com/en-us/azure/databricks/delta/drop-feature | | Drop and replace Delta and Unity Catalog tables | https://learn.microsoft.com/en-us/azure/databricks/delta/drop-table | -| Define and use Delta Lake generated columns on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/delta/generated-columns | +| Define and manage Delta Lake generated columns on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/delta/generated-columns | | Work with Delta table history and time travel on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/delta/history | | Enable and use row tracking for Delta and Iceberg tables | 
https://learn.microsoft.com/en-us/azure/databricks/delta/row-tracking | | Inspect Azure Databricks Delta table metadata with DESCRIBE DETAIL | https://learn.microsoft.com/en-us/azure/databricks/delta/table-details | @@ -205,7 +204,6 @@ | Manage clean room asset revisions via CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/clean-room-asset-revisions-commands | | Configure clean room auto-approval rules via CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/clean-room-auto-approval-rules-commands | | Configure Databricks CLI shell autocompletion | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/completion-commands | -| Manage Unity Catalog data quality via Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/data-quality-commands | | Query Databricks SQL data sources using CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/data-sources-commands | | Administer Databricks database instances with CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/database-commands | | Assign and manage Unity Catalog entity tags via CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/entity-tag-assignments-commands | @@ -216,6 +214,7 @@ | View cluster policy compliance using Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/policy-compliance-for-clusters-commands | | Check job policy compliance with Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/policy-compliance-for-jobs-commands | | Configure Databricks workspace settings via CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/settings-commands | +| Configure Databricks workspace settings with CLI v2 | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/workspace-settings-v2-commands | 
| Configure Databricks Apps with app.yaml runtime settings | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/app-runtime |
| Configure app-to-app resources for Databricks Apps | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/apps-resource |
| Configure Databricks Apps templates, auth, and routing | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/configuration |
@@ -229,10 +228,10 @@
| Add and configure Lakeflow Jobs as Databricks app resources | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/lakeflow |
| Configure MLflow experiment resources for Databricks Apps | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/mlflow |
| Add model serving endpoint resources to Databricks Apps | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/model-serving |
-| Configure OpenTelemetry-based telemetry for Databricks Apps | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/observability |
+| Enable and configure telemetry for Databricks Apps | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/observability |
| Configure Databricks app resources and secret management | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/resources |
| Configure SQL warehouse resources for Databricks Apps | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/sql-warehouse |
-| Understand environment and dependencies for Databricks apps | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/system-env |
+| Understand Databricks Apps system binaries and env vars | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/system-env |
| Configure vector search index resources in Databricks Apps | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/vector-search |
| Configure compute connections for Databricks Connect | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/cluster-config |
| Configure and use Databricks Connect for Python | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/ |
@@ -241,8 +240,6 @@
| Configure Databricks projects in VS Code extension | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/configure |
| Configure settings for the Databricks VS Code extension | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/settings |
| Explore Unity Catalog volumes and storage paths | https://learn.microsoft.com/en-us/azure/databricks/discover/files |
-| Enable Compatibility Mode for external table reads | https://learn.microsoft.com/en-us/azure/databricks/external-access/compatibility-mode |
-| Configure Iceberg REST catalog access to Databricks tables | https://learn.microsoft.com/en-us/azure/databricks/external-access/iceberg |
| Use Databricks file utilities and APIs | https://learn.microsoft.com/en-us/azure/databricks/files/ |
| Understand Databricks default working directory behavior | https://learn.microsoft.com/en-us/azure/databricks/files/cwd-dbr-14 |
| Use Unity Catalog volumes for file storage | https://learn.microsoft.com/en-us/azure/databricks/files/volumes |
@@ -257,26 +254,23 @@
| Replace deprecated Databricks agent feedback model | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/feedback-model |
| Log and register Databricks Model Serving AI agents | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/log-agent |
| Migrate from deprecated Databricks agent inference tables | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/request-assessment-logs |
-| Host custom MCP servers as Databricks apps | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/custom-mcp |
-| Configure external MCP servers with Databricks proxies | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/external-mcp |
| Configure _meta parameters for Databricks managed MCP servers | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/managed-mcp-meta-param |
| Configure Databricks external endpoints for OpenAI | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/tutorials/external-models-tutorial |
| Create and configure custom Genie Code agent skills | https://learn.microsoft.com/en-us/azure/databricks/genie-code/skills |
+| Configure and manage Genie spaces in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/genie/set-up |
| Configure and use Apache Iceberg v3 features in Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/iceberg/iceberg-v3 |
-| Load data via Unity Catalog external locations in Databricks | https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/add-data-external-locations |
| Configure Auto Loader directory listing mode for ingestion | https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/directory-listing-mode |
| Use managed file events with Auto Loader | https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/file-events-explained |
| Configure Databricks Auto Loader in file notification mode | https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/file-notification-mode |
| Configure Auto Loader cloudFiles options in Databricks | https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/options |
| Configure schema inference and evolution in Auto Loader | https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/schema |
| Enable automatic type widening in Auto Loader | https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/auto-loader/type-widening |
-| Configure COPY INTO with Unity Catalog volumes | https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/unity-catalog |
| Configure incremental ingestion from ADLS with Auto Loader | https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/onboard-data |
| Create or modify Delta tables via file upload UI | https://learn.microsoft.com/en-us/azure/databricks/ingestion/create-or-modify-table |
+| Use the _metadata file metadata column in Databricks | https://learn.microsoft.com/en-us/azure/databricks/ingestion/file-metadata-column |
| Configure column selection for Lakeflow Connect ingestion | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/column-selection |
| Use Confluence connector reference for schemas and metadata | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/confluence-reference |
| Use Dynamics 365 connector configuration reference | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/d365-reference |
-| Configure Dynamics 365 and storage for Databricks ingestion | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/d365-source-setup |
| Monitor Lakeflow ingestion gateways with event logs | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/gateway-event-logs |
| Use Google Ads connector table schemas and data types | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-ads-reference |
| Configure OAuth 2.0 for Google Ads Databricks ingestion | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/google-ads-source-setup |
@@ -324,10 +318,9 @@
| Configure environment variables in Databricks init scripts | https://learn.microsoft.com/en-us/azure/databricks/init-scripts/environment-variables |
| Configure global init scripts across Databricks workspace | https://learn.microsoft.com/en-us/azure/databricks/init-scripts/global |
| Configure scheduled data refreshes in Google Sheets connector | https://learn.microsoft.com/en-us/azure/databricks/integrations/google-sheets/schedule-refresh |
-| Configure Databricks JDBC driver connections (v3+) | https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/configure |
-| Configure Databricks JDBC driver connection properties | https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/properties |
+| Configure Databricks JDBC driver connection settings | https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/configure |
+| Use supported connection properties for Databricks JDBC | https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/properties |
| Configure legacy Simba-based Databricks JDBC driver | https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/ |
-| Configure capability settings for legacy Databricks JDBC driver | https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/capability |
| Set compute options for the legacy Databricks JDBC driver | https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/compute |
| Configure connections using the legacy Databricks JDBC driver | https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/configure |
| Create Azure Databricks connections in Microsoft Power Platform | https://learn.microsoft.com/en-us/azure/databricks/integrations/msft-power-platform-setup |
@@ -349,7 +342,6 @@
| Configure JAR tasks for Azure Databricks jobs | https://learn.microsoft.com/en-us/azure/databricks/jobs/jar |
| Create Databricks-compatible JARs for Lakeflow Jobs | https://learn.microsoft.com/en-us/azure/databricks/jobs/jar-create |
| Configure and use job parameters in Databricks | https://learn.microsoft.com/en-us/azure/databricks/jobs/job-parameters |
-| Monitor and inspect Lakeflow Jobs runs in Databricks | https://learn.microsoft.com/en-us/azure/databricks/jobs/monitor |
| Configure notebook tasks in Azure Databricks jobs | https://learn.microsoft.com/en-us/azure/databricks/jobs/notebook |
| Configure job and task notifications in Databricks | https://learn.microsoft.com/en-us/azure/databricks/jobs/notifications |
| Parameterize Azure Databricks jobs and tasks | https://learn.microsoft.com/en-us/azure/databricks/jobs/parameters |
@@ -374,26 +366,19 @@
| Generate Lakeflow pipelines from metadata with dlt-meta | https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/dlt-meta |
| Configure pipeline event hooks for custom monitoring | https://learn.microsoft.com/en-us/azure/databricks/ldp/event-hooks |
| Configure pipeline expectations for data quality | https://learn.microsoft.com/en-us/azure/databricks/ldp/expectations |
-| Example configurations of flows in Lakeflow Spark Declarative Pipelines | https://learn.microsoft.com/en-us/azure/databricks/ldp/flow-examples |
-| Configure and use flows in Lakeflow pipelines | https://learn.microsoft.com/en-us/azure/databricks/ldp/flows |
| Configure JSON schema inference and evolution with from_json | https://learn.microsoft.com/en-us/azure/databricks/ldp/from-json-schema-evolution |
| Configure pipelines with legacy Hive metastore | https://learn.microsoft.com/en-us/azure/databricks/ldp/hive-metastore |
| Configure and use Lakeflow sink API with flows | https://learn.microsoft.com/en-us/azure/databricks/ldp/ldp-sinks |
| Understand and migrate from legacy LIVE schema | https://learn.microsoft.com/en-us/azure/databricks/ldp/live-schema |
-| Load data into Lakeflow pipelines from supported sources | https://learn.microsoft.com/en-us/azure/databricks/ldp/load |
| Migrate Databricks pipelines to the default publishing mode | https://learn.microsoft.com/en-us/azure/databricks/ldp/migrate-to-dpm |
| Use Lakeflow pipeline event log schema for auditing | https://learn.microsoft.com/en-us/azure/databricks/ldp/monitor-event-log-schema |
| Query and use the Lakeflow pipeline event log | https://learn.microsoft.com/en-us/azure/databricks/ldp/monitor-event-logs |
-| Use Lakeflow Pipelines Editor for ETL development | https://learn.microsoft.com/en-us/azure/databricks/ldp/multi-file-editor |
| Parameterize Lakeflow Spark Declarative Pipelines | https://learn.microsoft.com/en-us/azure/databricks/ldp/parameters |
-| Choose and configure triggered vs continuous pipeline modes | https://learn.microsoft.com/en-us/azure/databricks/ldp/pipeline-mode |
| Configure Lakeflow pipeline JSON and table properties | https://learn.microsoft.com/en-us/azure/databricks/ldp/properties |
| Configure serverless Lakeflow Spark pipelines in Databricks | https://learn.microsoft.com/en-us/azure/databricks/ldp/serverless |
| Configure custom sinks in Lakeflow pipelines | https://learn.microsoft.com/en-us/azure/databricks/ldp/sinks |
| Configure source-controlled Lakeflow pipelines with Git | https://learn.microsoft.com/en-us/azure/databricks/ldp/source-controlled |
| Set default catalog and schema for Lakeflow pipelines | https://learn.microsoft.com/en-us/azure/databricks/ldp/target-schema |
-| Declare and manage data transformations in pipelines | https://learn.microsoft.com/en-us/azure/databricks/ldp/transform |
-| Configure Unity Catalog usage in Lakeflow pipelines | https://learn.microsoft.com/en-us/azure/databricks/ldp/unity-catalog |
| Use ALTER SQL statements with pipeline datasets | https://learn.microsoft.com/en-us/azure/databricks/ldp/using-alter-sql |
| Manage compute-scoped libraries on Databricks clusters | https://learn.microsoft.com/en-us/azure/databricks/libraries/cluster-libraries |
| Configure notebook-scoped Python libraries in Databricks | https://learn.microsoft.com/en-us/azure/databricks/libraries/notebooks-python-libraries |
@@ -402,13 +387,15 @@
| Install Databricks libraries from Unity Catalog volumes | https://learn.microsoft.com/en-us/azure/databricks/libraries/volume-libraries |
| Install libraries from workspace files in Databricks | https://learn.microsoft.com/en-us/azure/databricks/libraries/workspace-files-libraries |
| Connect notebooks and jobs to Databricks AI Runtime | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/connecting |
-| Configure Python environments for Databricks AI Runtime | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/environment |
+| Set up Python environments for Databricks AI Runtime | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/environment |
| Configure AutoML classification data preparation settings | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl/classification-data-prep |
| Configure AutoML forecasting data preparation settings | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl/forecasting-data-prep |
| Configure AutoML regression data preparation settings | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl/regression-data-prep |
| Configure Databricks Runtime for Machine Learning compute | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/databricks-runtime-ml |
| Configure Model Serving with automatic feature lookup | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/automatic-feature-lookup |
+| Reference for Databricks declarative feature APIs | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/declarative-api-reference |
| Configure and use Databricks Feature Serving endpoints | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/feature-function-serving |
+| Materialize Databricks declarative features to tables | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/materialized-features |
| Configure Databricks Feature Engineering Python client | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/python-api |
| Create and manage feature tables in Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/uc/feature-tables-uc |
| Manage Workspace Feature Store feature tables | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/workspace-feature-store/feature-tables |
@@ -418,7 +405,7 @@
| Manage MLflow models in legacy Workspace Model Registry | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/manage-model-lifecycle/workspace-model-registry |
| Create and configure custom model serving endpoints | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/create-manage-serving-endpoints |
| Configure telemetry persistence for Databricks custom model serving | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/custom-model-serving-uc-logs |
-| Configure custom models and compute for Databricks serving | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/custom-models |
+| Configure custom model serving on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/custom-models |
| Enable inference tables on Databricks endpoints via API | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/enable-model-serving-inference-tables |
| Use inference tables to monitor and debug Databricks endpoints | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/inference-tables |
| Configure and manage Azure Databricks model serving endpoints | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/manage-serving-endpoints |
@@ -453,9 +440,9 @@
| Configure production trace linking to MLflow app versions | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/prompt-version-mgmt/version-tracking/link-production-traces-to-app-versions |
| Track external GenAI app versions with MLflow LoggedModel | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/prompt-version-mgmt/version-tracking/track-application-versions-with-mlflow |
| Configure MLflow LoggedModel for GenAI version tracking | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/prompt-version-mgmt/version-tracking/version-concepts |
+| Migrate MLflow traces to new Unity Catalog format | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/migrate-uc-trace-table-prefix |
| Access and view MLflow traces in Databricks UI | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/ui-traces |
| Configure Langfuse to export traces to MLflow | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/third-party/langfuse |
-| Set OpenTelemetry span attributes for Databricks MLflow | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/third-party/otel-span-attributes |
| Configure MLflow trace storage in Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/trace-unity-catalog |
| Use and migrate legacy ${param} notebook widgets | https://learn.microsoft.com/en-us/azure/databricks/notebooks/legacy-widgets |
| Configure Databricks notebook file formats and commits | https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-format |
@@ -466,16 +453,13 @@
| Configure and install Postgres extensions in Lakebase | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/extensions |
| Create and manage Lakebase branches | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-branches |
| Enable and configure Lakebase high availability | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-high-availability |
-| Create and configure Lakebase Postgres projects | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-projects |
| Manage Lakebase with Declarative Automation Bundles | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-with-bundles |
| Use Lakebase metrics dashboard for monitoring | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/metrics |
| Use point-in-time branches in Lakebase | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/point-in-time-branching |
-| Register Lakebase Postgres databases in Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/register-uc |
| Configure Lakebase scale-to-zero idle suspension behavior | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/scale-to-zero |
| Create and restore Lakebase database snapshots | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/snapshots |
| Configure automatic updates for Lakebase computes | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/updates |
-| Use pandas API on Spark in Databricks | https://learn.microsoft.com/en-us/azure/databricks/pandas/pandas-on-spark |
-| Enable BI compatibility mode for Databricks metric views | https://learn.microsoft.com/en-us/azure/databricks/partners/bi/bi-metric-view |
+| Enable and use BI compatibility mode for metric views | https://learn.microsoft.com/en-us/azure/databricks/partners/bi/bi-metric-view |
| Create and use Power BI connections in Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi-uc-connect |
| Set up Azure Databricks integration with Matillion Data Productivity Cloud | https://learn.microsoft.com/en-us/azure/databricks/partners/prep/matillion |
| Configure DataFrame persistence and storage levels | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframe/persist |
@@ -506,15 +490,15 @@
| Configure OneLake catalog federation with Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/query-federation/onelake |
| Enable Snowflake catalog federation in Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-catalog-federation |
| Configure Databricks Lakehouse Federation for Teradata | https://learn.microsoft.com/en-us/azure/databricks/query-federation/teradata |
-| Read Excel files with Databricks built-in format | https://learn.microsoft.com/en-us/azure/databricks/query/formats/excel |
| Configure Databricks image data source for Spark | https://learn.microsoft.com/en-us/azure/databricks/query/formats/image |
-| Choose appropriate Databricks serverless environment versions | https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/ |
-| Reference serverless environment version 5 system details | https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/five |
-| Reference serverless environment version 4 system details | https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/four |
+| Reference available Databricks serverless environment versions | https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/ |
+| Review system environment details for serverless environment v5 | https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/five |
+| Inspect configuration for Serverless GPU environment version 5 | https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/five-gpu |
+| Check system environment details for serverless environment v4 | https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/four |
| Use serverless GPU environment version 4 for AI workloads | https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/four-gpu |
| Reference serverless environment version 1 system details | https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/one |
| Reference serverless environment version 3 system details | https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/three |
-| Understand deprecated serverless GPU environment version 3 | https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/three-gpu |
+| Review deprecated Serverless GPU environment version 3 settings | https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/three-gpu |
| Reference serverless environment version 2 system details | https://learn.microsoft.com/en-us/azure/databricks/release-notes/serverless/environment-version/two |
| Use Azure Databricks Git folders for version control | https://learn.microsoft.com/en-us/azure/databricks/repos/ |
| Enable or disable Databricks Git folders via API | https://learn.microsoft.com/en-us/azure/databricks/repos/enable-disable-repos-with-api |
@@ -522,22 +506,23 @@
| Understand Databricks Git folders concepts and providers | https://learn.microsoft.com/en-us/azure/databricks/repos/git-folders-concepts |
| Create and manage Azure Databricks Git folders | https://learn.microsoft.com/en-us/azure/databricks/repos/git-operations-with-repos |
| Configure Git server proxy for private Git with Databricks | https://learn.microsoft.com/en-us/azure/databricks/repos/git-proxy |
-| Configure IPs and domains for Azure Databricks networking | https://learn.microsoft.com/en-us/azure/databricks/resources/ip-domain-region |
+| Configure network rules using Databricks IPs and domains | https://learn.microsoft.com/en-us/azure/databricks/resources/ip-domain-region |
| Verify supported browsers for Azure Databricks UI access | https://learn.microsoft.com/en-us/azure/databricks/resources/supported-browsers |
| Create schemas in Unity Catalog and Hive metastore | https://learn.microsoft.com/en-us/azure/databricks/schemas/create-schema |
| Configure Databricks connectivity to on-premises networks | https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/on-prem-network |
| Configure user-defined routes for Databricks VNets | https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/udr |
-| Update Azure Databricks workspace VNet settings | https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/update-workspaces |
+| Update Azure Databricks workspace VNet and network settings | https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/update-workspaces |
| Deploy Azure Databricks with VNet injection | https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/vnet-inject |
-| Configure inbound Private Link for Databricks services | https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/service-direct-privatelink |
| Manage Databricks serverless private endpoint rules | https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/manage-private-endpoint-rules |
| Set up Private Link from serverless to VNets | https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/pl-to-internal-network |
| Configure Private Link from Databricks serverless compute | https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/serverless-private-link |
| Use ARM template to enable Databricks storage firewall | https://learn.microsoft.com/en-us/azure/databricks/security/network/storage/firewall-support-arm-template |
+| Query and transform VARIANT semi-structured data in Databricks | https://learn.microsoft.com/en-us/azure/databricks/semi-structured/variant |
| Set and manage Spark configuration properties on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/spark/conf |
| Manage R dependencies with renv on Databricks | https://learn.microsoft.com/en-us/azure/databricks/sparkr/renv |
| Use BOOLEAN data type in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/boolean-type |
| Define STRUCT types and fields in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/data-types/struct-type |
+| Use OPTIMIZE to tune Delta Lake table layout | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-optimize |
| Use REORG TABLE to optimize Delta layout | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/delta-reorg-table |
| Use (deprecated) ai_generate_text for LLM text in Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ai_generate_text |
| Attach explicit collations with Databricks SQL collate | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/collate |
@@ -589,7 +574,6 @@
| Use JSON path expressions in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-json-path-expression |
| Name resolution rules in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-name-resolution |
| Naming rules and limits for Databricks SQL objects | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-names |
-| Configure and use parameter markers in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-parameter-marker |
| Understand Databricks SQL configuration parameter hierarchy | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-parameters |
| Define and manage table partitions in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-partition |
| Use reserved words and schemas in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-reserved-words |
@@ -610,12 +594,11 @@
| Modify governed tags with ALTER GOVERNED TAG in Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-governed-tag |
| ALTER EXTERNAL LOCATION properties in Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-location |
| Use ALTER MATERIALIZED VIEW in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-materialized-view |
-| ALTER PROVIDER name and ownership in Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-provider |
| ALTER RECIPIENT name and ownership in Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-recipient |
| Configure schemas using ALTER SCHEMA in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-schema |
| Set managed locations for foreign schemas in Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-schema-set-managed-location |
| ALTER STREAMING TABLE metadata in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-streaming-table |
-| Alter table schema and properties in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-table |
+| Alter Databricks SQL tables with ALTER TABLE | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-table |
| Add table constraints with ALTER TABLE ADD CONSTRAINT | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-table-add-constraint |
| Drop table constraints with ALTER TABLE DROP CONSTRAINT | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-table-drop-constraint |
| Alter table columns and fields in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-table-manage-column |
@@ -628,6 +611,7 @@
| Use CREATE DATABASE (alias for CREATE SCHEMA) in Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-database |
| Create governed tags with Databricks CREATE GOVERNED TAG | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-governed-tag |
| Configure external locations with CREATE EXTERNAL LOCATION | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-location |
+| Use CREATE MATERIALIZED VIEW in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-materialized-view |
| Configure REFRESH POLICY for materialized views | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-materialized-view-refresh-policy |
| Create Delta Sharing recipients in Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-recipient |
| Use CREATE SERVER alias for connections | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-server |
@@ -635,6 +619,7 @@
| All CREATE TABLE variants in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-table |
| Create Hive-format tables in Databricks Runtime | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-table-hiveformat |
| Create tables LIKE existing tables in Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-table-like |
+| Define tables with CREATE TABLE in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-table-using |
| Drop Unity Catalog catalogs in Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-drop-catalog |
| Drop connections in Databricks Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-drop-connection |
| Drop credentials in Databricks Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-drop-credential |
@@ -672,19 +657,19 @@
| Configure Structured Streaming batch size with admission controls | https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/batch-size |
| Configure Structured Streaming checkpoints on Databricks | https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/checkpoints |
| Read and query Structured Streaming state data on Databricks | https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/read-state |
-| Set up and configure real-time mode in Databricks | https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/setup |
+| Configure real-time mode for Databricks Structured Streaming | https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/setup |
| Configure RocksDB state store for Databricks Structured Streaming | https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/rocksdb-state-store |
| Configure Structured Streaming trigger intervals on Databricks | https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/triggers |
-| Define constraints on Databricks Delta tables | https://learn.microsoft.com/en-us/azure/databricks/tables/constraints |
+| Configure SQL constraints on Delta tables in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/tables/constraints |
| Convert foreign tables to Unity Catalog managed tables | https://learn.microsoft.com/en-us/azure/databricks/tables/convert-foreign-managed |
| Configure partition discovery for Unity Catalog external tables | https://learn.microsoft.com/en-us/azure/databricks/tables/external-partition-discovery |
| Query and register Unity Catalog foreign tables | https://learn.microsoft.com/en-us/azure/databricks/tables/foreign |
| Create and manage Unity Catalog managed tables | https://learn.microsoft.com/en-us/azure/databricks/tables/managed |
| Understand schema enforcement on Databricks tables | https://learn.microsoft.com/en-us/azure/databricks/tables/schema-enforcement |
-| Use multi-statement transactions on Unity Catalog tables | https://learn.microsoft.com/en-us/azure/databricks/transactions/ |
+| Configure and run multi-statement transactions on Unity Catalog tables | https://learn.microsoft.com/en-us/azure/databricks/transactions/ |
| Implement batch Python UDFs in Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/udf/python-batch-udf |
| Configure Python user-defined table functions on Databricks | https://learn.microsoft.com/en-us/azure/databricks/udf/python-udtf |
-| Register Python UDTFs in Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/udf/udtf-unity-catalog |
+| Register and configure Python UDTFs in Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/udf/udtf-unity-catalog |
| Configure SQL and Python UDFs in Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/udf/unity-catalog |
| Configure and create Mosaic AI Vector Search endpoints and indexes | https://learn.microsoft.com/en-us/azure/databricks/vector-search/create-vector-search |
| Create and manage Unity Catalog views in Databricks | https://learn.microsoft.com/en-us/azure/databricks/views/create-views |
@@ -694,7 +679,6 @@
| Format numeric values in Databricks visualizations | https://learn.microsoft.com/en-us/azure/databricks/visualizations/format-numeric-types |
| Configure heatmap chart options in Databricks | https://learn.microsoft.com/en-us/azure/databricks/visualizations/heatmap |
| Configure histogram visualization options in Databricks | https://learn.microsoft.com/en-us/azure/databricks/visualizations/histogram |
-| Use and migrate Databricks legacy visualizations | https://learn.microsoft.com/en-us/azure/databricks/visualizations/legacy-visualizations |
| Configure map visualizations in Databricks | https://learn.microsoft.com/en-us/azure/databricks/visualizations/maps |
| Customize Databricks table visualization options | https://learn.microsoft.com/en-us/azure/databricks/visualizations/tables |
| Apply path rules and access controls in volumes | https://learn.microsoft.com/en-us/azure/databricks/volumes/paths |
diff --git a/skills/azure-databricks/deployment.md b/skills/azure-databricks/deployment.md
index 2f33cb4b..7daf1904 100644
--- a/skills/azure-databricks/deployment.md
+++ b/skills/azure-databricks/deployment.md
@@ -7,6 +7,10 @@
### Deployment
| Topic | URL |
|-------|-----|
+| Deploy an Azure Databricks workspace with Azure CLI | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/azure-cli |
+| Deploy Databricks workspaces using the Azure portal | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/create-workspace |
+| Delete Azure Databricks workspaces and managed resources | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/delete-workspace |
+| Deploy an Azure Databricks workspace using PowerShell | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/powershell |
| Deploy Databricks stacks using the legacy Stack CLI | https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/stack-cli |
| Deploy MLflow models with legacy Databricks Model Serving | https://learn.microsoft.com/en-us/azure/databricks/archive/legacy-model-serving/model-serving |
| Export and import Databricks dashboards across workspaces | https://learn.microsoft.com/en-us/azure/databricks/dashboards/automate/import-export |
@@ -16,9 +20,8 @@
| Set up Azure DevOps CI/CD pipelines for Databricks | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/azure-devops |
| Use Databricks GitHub Actions for CI/CD pipelines | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/github |
| Prepare workspace and local environment for Databricks Apps | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/configure-env |
-| Deploy Databricks apps via UI and CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/deploy |
+| Deploy Azure Databricks Apps via UI or CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/deploy |
| Use Declarative Automation Bundles with Databricks VS Code | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/bundles |
-| Deploy custom AI agents on Databricks Apps with Agent Framework | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/author-agent |
| Author and deploy Databricks agents on Model Serving | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/author-agent-model-serving |
| Build and deploy a chat UI for Databricks agents | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/chat-app |
| Deploy Databricks AI agents with Model Serving | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/deploy-agent |
@@ -26,6 +29,7 @@
| Plan Infrastructure as Code strategy for Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/lakehouse-architecture/deployment-guide/iac |
| Deploy Databricks batch inference pipelines with AI Functions | 
https://learn.microsoft.com/en-us/azure/databricks/large-language-models/batch-inference-pipelines | | Deploy and query a Databricks feature serving endpoint with SDK | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/feature-serving-tutorial | +| Serve Databricks declarative features via endpoints | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/serve-declarative-features | | Deploy provisioned throughput Foundation Model APIs on Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/deploy-prov-throughput-foundation-model-apis | | Integrate Databricks ML workflows into CI/CD pipelines | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/mlops/ci-cd-for-ml | | Use serverless optimized deployments for model serving | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/serverless-optimized-deployments | @@ -35,9 +39,10 @@ | Enable MLflow Tracing for agents deployed outside Databricks | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/prod-tracing-external | | Create and manage scheduled Databricks notebook jobs | https://learn.microsoft.com/en-us/azure/databricks/notebooks/schedule-notebook-jobs | | Provision Lakebase resources with Terraform | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/automate-with-terraform | -| Manage Databricks Git folders with Terraform | https://learn.microsoft.com/en-us/azure/databricks/repos/automate-with-terraform | +| Migrate Databricks HTTP routing to serverless | https://learn.microsoft.com/en-us/azure/databricks/query-federation/http-migration | | Integrate Databricks Git folders into CI/CD workflows | https://learn.microsoft.com/en-us/azure/databricks/repos/ci-cd | | Check Azure Databricks feature availability by region | https://learn.microsoft.com/en-us/azure/databricks/resources/feature-region-support | | Understand Azure 
Databricks platform release windows | https://learn.microsoft.com/en-us/azure/databricks/resources/platform-release | | Use Databricks-hosted RStudio Server runtimes | https://learn.microsoft.com/en-us/azure/databricks/sparkr/hosted-rstudio-server | +| Deploy OSS embedding model for Databricks Vector Search | https://learn.microsoft.com/en-us/azure/databricks/vector-search/embedding-with-oss-models | | Migrate legacy line charts to new Databricks chart types | https://learn.microsoft.com/en-us/azure/databricks/visualizations/legacy-charts | diff --git a/skills/azure-databricks/integrations.md b/skills/azure-databricks/integrations.md index c19673ed..f583a5b9 100644 --- a/skills/azure-databricks/integrations.md +++ b/skills/azure-databricks/integrations.md @@ -7,8 +7,9 @@ ### Integrations & Coding Patterns | Topic | URL | |-------|-----| -| Query Databricks billable usage for storage cost tracking | https://learn.microsoft.com/en-us/azure/databricks/admin/usage/default-storage | | Manage AI/BI assets using Databricks REST APIs | https://learn.microsoft.com/en-us/azure/databricks/ai-bi/admin/use-apis | +| Integrate coding agents with Unity AI Gateway for governance | https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/coding-agent-integration-beta | +| Query Unity AI Gateway endpoints via unified APIs | https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/query-endpoints-beta | | Use legacy ABS-AQS streaming connector in Databricks | https://learn.microsoft.com/en-us/azure/databricks/archive/azure/aqs | | Read and write data between Databricks and Azure Cosmos DB | https://learn.microsoft.com/en-us/azure/databricks/archive/azure/cosmosdb | | Stream data from Databricks to Azure Synapse with Structured Streaming | https://learn.microsoft.com/en-us/azure/databricks/archive/azure/stream-synapse | @@ -73,10 +74,12 @@ | Configure legacy WASB connector for Azure Blob Storage | https://learn.microsoft.com/en-us/azure/databricks/archive/storage/wasb-blob | 
| Implement advanced metric view calculations in Databricks | https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/advanced-techniques | | Use LOD expressions in Databricks metric views | https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/level-of-detail | +| Query Azure Databricks metric views from downstream tools | https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/query | +| Migrate to token-based Databricks cluster events pagination | https://learn.microsoft.com/en-us/azure/databricks/compute/events-api-updates | | Configure Azure Databricks connections to external data | https://learn.microsoft.com/en-us/azure/databricks/connect/ | | Configure JDBC Unity Catalog connections to external databases | https://learn.microsoft.com/en-us/azure/databricks/connect/jdbc-connection | | Connect Azure Databricks streaming to Apache Kafka | https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/kafka/ | -| Integrate Google Pub/Sub with Azure Databricks streaming | https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/pub-sub | +| Subscribe to Google Pub/Sub with Databricks Structured Streaming | https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/pub-sub | | Configure Apache Pulsar streaming source on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/pulsar | | Use Unity Catalog service credentials to call cloud APIs | https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-services/use-service-credentials | | Embed Azure Databricks dashboards in external apps | https://learn.microsoft.com/en-us/azure/databricks/dashboards/share/embedding/ | @@ -84,24 +87,23 @@ | Use Databricks dashboard REST APIs for management | https://learn.microsoft.com/en-us/azure/databricks/dashboards/tutorials/dashboard-crud-api | | Manage AI/BI dashboards via Workspace and Lakeview APIs | 
https://learn.microsoft.com/en-us/azure/databricks/dashboards/tutorials/workspace-dashboard-api | | Create data profiles using Databricks quality_monitors API | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/data-profiling/create-monitor-api | -| Read Databricks-to-Databricks Delta Sharing data | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/read-data-databricks | -| Consume Delta Sharing open shares with bearer tokens | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/read-data-open | | Integrate SAP BDC with Azure Databricks via Delta Sharing | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/sap-bdc/ | | Configure Delta Sharing access for SAP BDC recipients | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/sap-bdc/share-to-sap | | Use Python M2M OIDC federation to access Delta Sharing | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/sharing-over-oidc-m2m | | Use U2M OIDC federation clients with Delta Sharing | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/sharing-over-oidc-u2m | | Use MERGE to upsert into Delta tables on Databricks | https://learn.microsoft.com/en-us/azure/databricks/delta/merge | +| Enable Databricks workload identity federation with GitHub Actions | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/provider-github | | Use Databricks CLI bundle commands for deployments | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/bundle-commands | | Run Databricks CLI from Azure Cloud Shell | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/databricks-cli-from-azure-cloud-shell | | Download Databricks billable usage logs via CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/account-billable-usage-commands | | Manage Databricks account resources via CLI | 
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/account-commands | | Use Databricks CLI alerts command group | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/alerts-commands | +| Use Databricks CLI alerts-v2 commands for SQL alerts | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/alerts-v2-commands | | Call Databricks REST APIs with CLI api command | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/api-commands | -| Manage Databricks Apps using CLI commands | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/apps-commands | +| Manage Databricks apps with CLI apps commands | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/apps-commands | | Manage Unity Catalog artifact allowlists via CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/artifact-allowlists-commands | -| Configure Databricks CLI authentication with auth commands | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/auth-commands | | Manage Unity Catalog catalogs with Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/catalogs-commands | -| Manage clean room assets via Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/clean-room-assets-commands | +| Manage Databricks clean room assets via CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/clean-room-assets-commands | | Control clean room task runs with Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/clean-room-task-runs-commands | | Administer clean rooms using Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/clean-rooms-commands | | Manage cluster policies with Databricks CLI | 
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/cluster-policies-commands | @@ -113,16 +115,22 @@ | Handle consumer personalization requests via Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/consumer-personalization-requests-commands | | Use Databricks CLI consumer-providers Marketplace commands | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/consumer-providers-commands | | Retrieve current user information via Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/current-user-commands | -| Manage MLflow experiments using Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/experiments-commands | +| Control Unity Catalog data classification with CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/data-classification-commands | +| Use Databricks CLI data-quality commands for Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/data-quality-commands | +| Manage Databricks environments with CLI environments commands | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/environments-commands | +| Manage MLflow experiments using Databricks CLI experiments commands | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/experiments-commands | | Manage Databricks feature store via CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/feature-engineering-commands | | Use Databricks CLI fs commands for DBFS and volumes | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/fs-commands | | Manage Unity Catalog user-defined functions via CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/functions-commands | | Use Databricks CLI genie commands for Genie spaces | 
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/genie-commands | | Configure Git credentials for Databricks via CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/git-credentials-commands | +| Manage Databricks groups with CLI groups-v2 commands | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/groups-v2-commands | | Manage Databricks instance pools using CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/instance-pools-commands | | Create and manage Databricks jobs via CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/jobs-commands | +| Manage Knowledge Assistants using Databricks CLI commands | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/knowledge-assistants-commands | | Use Databricks CLI labs commands for experimental apps | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/labs-commands | | Manage Lakeview dashboards with Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/lakeview-commands | +| Embed Lakeview dashboards via Databricks CLI lakeview-embedded | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/lakeview-embedded-commands | | Manage Databricks libraries via CLI commands | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/libraries-commands | | Administer Unity Catalog metastores using Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/metastores-commands | | Use CLI model-registry commands for workspace models | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/model-registry-commands | @@ -130,7 +138,8 @@ | Configure notification destinations via Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/notification-destinations-commands | | Create and manage 
online tables using CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/online-tables-commands | | Control Databricks pipelines with CLI commands | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/pipelines-commands | -| Manage Lakebase Postgres resources with Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/postgres-commands | +| Manage Unity Catalog ABAC policies with Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/policies-commands | +| Manage Lakebase Postgres resources using Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/postgres-commands | | Manage marketplace exchange filters with Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/provider-exchange-filters-commands | | Administer Databricks marketplace exchanges via CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/provider-exchanges-commands | | Manage Databricks marketplace files using CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/provider-files-commands | @@ -139,8 +148,9 @@ | Manage provider analytics dashboards via Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/provider-provider-analytics-dashboards-commands | | Administer Databricks marketplace providers using CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/provider-providers-commands | | Manage Delta Sharing providers with Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/providers-commands | -| Connect to Databricks Postgres instances with psql CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/psql-command | -| Configure Databricks quality monitors via CLI | 
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/quality-monitors-commands | +| Connect to Lakebase Postgres with Databricks CLI psql | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/psql-command | +| Use Databricks CLI quality-monitor-v2 commands for data quality | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/quality-monitor-v2-commands | +| Use Databricks CLI quality-monitors command group | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/quality-monitors-commands | | Create and manage Databricks SQL queries with CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/queries-commands | | Use deprecated queries-legacy commands in CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/queries-legacy-commands | | Access Databricks SQL query history using CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/query-history-commands | @@ -149,25 +159,28 @@ | Administer Unity Catalog registered models via CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/registered-models-commands | | Manage Databricks Git repos (folders) using CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/repos-commands | | Manage Unity Catalog schemas with Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/schemas-commands | -| Manage Databricks model serving endpoints with CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/serving-endpoints-commands | -| Manage Unity Catalog shares using Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/shares-commands | +| Manage service principal secrets via Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/service-principal-secrets-proxy-commands | +| Manage 
Databricks service principals with CLI v2 | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/service-principals-v2-commands | +| Control Databricks serving endpoints using CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/serving-endpoints-commands | +| Manage Unity Catalog shares with Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/shares-commands | | Use Databricks CLI ssh commands for remote development | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/ssh-commands | | Use Databricks CLI storage-credentials commands | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/storage-credentials-commands | | Sync local files with Databricks CLI sync command | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/sync-commands | | Manage system schemas with Databricks CLI commands | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/system-schemas-commands | | Manage table constraints via Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/table-constraints-commands | -| Manage Unity Catalog tables with Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/tables-commands | +| Administer Unity Catalog tables via Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/tables-commands | | Administer governed tag policies with Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/tag-policies-commands | | Generate temporary path credentials with Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/temporary-path-credentials-commands | | Generate temporary table credentials with Databricks CLI | 
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/temporary-table-credentials-commands | | Administer token management using Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/token-management-commands | | Create and revoke tokens with Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/tokens-commands | | Manage Databricks users with CLI users commands | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/users-commands | -| Manage vector search endpoints via Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/vector-search-endpoints-commands | +| Manage Databricks users with CLI users-v2 commands | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/users-v2-commands | +| Operate vector search endpoints using Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/vector-search-endpoints-commands | | Manage vector search indexes with Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/vector-search-indexes-commands | | Check Databricks CLI version with version command | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/version-command | | Manage Unity Catalog volumes using Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/volumes-commands | -| Manage SQL warehouses with Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/warehouses-commands | +| Manage Databricks SQL warehouses via CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/warehouses-commands | | Configure Unity Catalog workspace bindings with CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/workspace-bindings-commands | | Manage workspace files and folders via Databricks CLI 
| https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/workspace-commands | | Update advanced workspace settings via Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/workspace-conf-commands | @@ -202,11 +215,13 @@ | Automate Unity Catalog deployment with Databricks Terraform | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/terraform/automate-uc | | Provision Databricks clusters, notebooks, and jobs with Terraform | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/terraform/cluster-notebook-job | | Manage Azure Databricks workspace resources using Terraform | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/terraform/workspace-management | +| Use Compatibility Mode for external Delta and Iceberg clients | https://learn.microsoft.com/en-us/azure/databricks/external-access/compatibility-mode | +| Configure Iceberg REST catalog access to Unity Catalog tables | https://learn.microsoft.com/en-us/azure/databricks/external-access/iceberg | | Use Unity REST API from external Delta clients | https://learn.microsoft.com/en-us/azure/databricks/external-access/unity-rest | | Unzip and read compressed files in Databricks | https://learn.microsoft.com/en-us/azure/databricks/files/unzip-files | | Parse documents using Databricks Document Parsing | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/document-parsing | | Create information extraction agents with Databricks | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/key-info-extraction | -| Orchestrate Databricks multi-agent systems with Supervisor | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/multi-agent-supervisor | +| Use Databricks Supervisor API to run agent loops | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-bricks/supervisor-api | | Implement Databricks AI agent tools with MCP and Unity Catalog | 
https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/agent-tool | | Integrate Anthropic SDK with Databricks Unity Catalog tools | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/anthropic-uc-integration | | Use the Databricks Python code interpreter tool in AI agents | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/code-interpreter-tools | @@ -218,28 +233,30 @@ | Query Databricks agents via REST and SDK clients | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/query-agent | | Integrate Databricks AI agents with Slack via HTTP connections | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/slack-agent | | Connect Databricks AI agents to structured data sources | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/structured-retrieval-tools | +| Build Databricks Apps agents with Supervisor API orchestration | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/supervisor-api-app | | Integrate Databricks AI agents with Microsoft Teams via OAuth OBO | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/teams-agent | | Use Unity Catalog tools with third-party gen AI frameworks | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/unity-catalog-tool-integration | | Connect Databricks AI agents to unstructured data with Vector Search | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/unstructured-retrieval-tools | -| Use Databricks managed MCP servers to connect agents to data | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/managed-mcp | | Connect Genie Code to external MCP servers | https://learn.microsoft.com/en-us/azure/databricks/genie-code/mcp | -| Integrate Genie natural language querying via API | 
https://learn.microsoft.com/en-us/azure/databricks/genie/conversation-api | +| Integrate Genie via the conversation API into applications | https://learn.microsoft.com/en-us/azure/databricks/genie/conversation-api | +| Load ADLS data into Unity Catalog tables with COPY INTO | https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/unity-catalog | | Incrementally clone Parquet and Iceberg to Delta Lake | https://learn.microsoft.com/en-us/azure/databricks/ingestion/data-migration/clone-parquet | | Convert Parquet and Iceberg tables to Delta Lake | https://learn.microsoft.com/en-us/azure/databricks/ingestion/data-migration/convert-to-delta | | Ingest Google Drive files into Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/ingestion/google-drive | | Ingest OpenTelemetry data via Zerobus OTLP | https://learn.microsoft.com/en-us/azure/databricks/ingestion/opentelemetry/ | | Query OpenTelemetry traces, logs, and metrics in Databricks | https://learn.microsoft.com/en-us/azure/databricks/ingestion/opentelemetry/queries | -| Ingest data from SFTP using Lakeflow Connect | https://learn.microsoft.com/en-us/azure/databricks/ingestion/sftp | -| Ingest SharePoint files into Delta tables | https://learn.microsoft.com/en-us/azure/databricks/ingestion/sharepoint | +| Ingest SFTP server data using Lakeflow Connect Auto Loader | https://learn.microsoft.com/en-us/azure/databricks/ingestion/sftp | +| Ingest SharePoint files into Delta tables with Databricks | https://learn.microsoft.com/en-us/azure/databricks/ingestion/sharepoint | | Ingest data with Zerobus Ingest connector | https://learn.microsoft.com/en-us/azure/databricks/ingestion/zerobus-ingest | | Configure OAuth SSO from Tableau Server to Databricks | https://learn.microsoft.com/en-us/azure/databricks/integrations/configure-oauth-tableau | | Use Databricks Connector to access data from Google Sheets | 
https://learn.microsoft.com/en-us/azure/databricks/integrations/google-sheets/ | | Set up Databricks Connector for Google Sheets | https://learn.microsoft.com/en-us/azure/databricks/integrations/google-sheets/connect | -| Query Databricks data from Google Sheets with SQL | https://learn.microsoft.com/en-us/azure/databricks/integrations/google-sheets/query-data | +| Query Azure Databricks data using Google Sheets connector | https://learn.microsoft.com/en-us/azure/databricks/integrations/google-sheets/query-data | | Use GraphFrames with Scala on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/integrations/graphframes/user-guide-scala | | Use Databricks JDBC metadata for metric views | https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/metadata | | Java API reference for the Databricks JDBC driver | https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/reference | | Manage Unity Catalog volume files via Databricks JDBC | https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/volumes | +| Configure advanced capability settings for Databricks JDBC driver | https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/capability | | Manage Unity Catalog volume files via legacy JDBC driver | https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc/volumes | | Connect Lovable no-code apps to Databricks via OAuth | https://learn.microsoft.com/en-us/azure/databricks/integrations/lovable | | Add Databricks Genie MCP server to Microsoft Foundry | https://learn.microsoft.com/en-us/azure/databricks/integrations/microsoft-foundry | @@ -256,11 +273,10 @@ | Use Azure Databricks AI Functions from SQL and Python | https://learn.microsoft.com/en-us/azure/databricks/large-language-models/ai-functions | | Call ai_query to access AI models in Databricks | https://learn.microsoft.com/en-us/azure/databricks/large-language-models/ai-query | | Use LangChain integrations with Databricks LLMs | 
https://learn.microsoft.com/en-us/azure/databricks/large-language-models/langchain | -| Clone Hive metastore pipelines to Unity Catalog via REST | https://learn.microsoft.com/en-us/azure/databricks/ldp/clone-hms-to-uc | +| Clone Hive metastore pipelines to Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/ldp/clone-hms-to-uc | | Replicate external RDBMS tables with AUTO CDC | https://learn.microsoft.com/en-us/azure/databricks/ldp/database-replication | -| Create and manage Databricks SQL materialized views | https://learn.microsoft.com/en-us/azure/databricks/ldp/dbsql/materialized | | Define Lakeflow pipeline datasets with Python decorators | https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/definition-function | -| Use append_flow decorator for pipeline append-only flows | https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-python-ref-append-flow | +| Use append_flow decorator in Databricks pipelines | https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-python-ref-append-flow | | Use create_auto_cdc_flow for pipeline CDC processing | https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-python-ref-apply-changes | | Use create_auto_cdc_from_snapshot_flow for snapshot CDC | https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-python-ref-apply-changes-from-snapshot | | Apply data quality expectations in Lakeflow Python pipelines | https://learn.microsoft.com/en-us/azure/databricks/ldp/developer/ldp-python-ref-expectations | @@ -285,7 +301,7 @@ | Import Python modules from Git or workspace into pipelines | https://learn.microsoft.com/en-us/azure/databricks/ldp/import-workspace-files | | Develop and debug Databricks pipelines using legacy notebooks | https://learn.microsoft.com/en-us/azure/databricks/ldp/notebook-devex | | Run Lakeflow pipelines from Jobs, Airflow, or Data Factory | https://learn.microsoft.com/en-us/azure/databricks/ldp/workflows | -| Use Serverless GPU 
Python API for multi-GPU training | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/distributed-training | +| Use serverless GPU Python API for multi-GPU training | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/distributed-training | | Get started with H100 serverless GPU using serverless_gpu API | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-api-h100-starter | | Train CNN image classifier on Databricks AI Runtime | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-cnn-mnist | | Distributed fine-tuning of gpt-oss-20b on Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-distributed-gpt-oss-20b | @@ -297,7 +313,7 @@ | Fine-tune Olmo3 7B with Axolotl on Databricks serverless GPU | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-olmo3-7b-lora-axolotl | | Distributed two-tower recommender training with Lightning on Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-recommender-system-lightning | | Train RetinaNet object detection model on Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-retinanet-image-detection-model-training | -| Fine-tune Llama 3.2 1B with LoRA and DeepSpeed | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-sft-trl-deepspeed-llama-1b | +| Fine-tune Llama 3.2 1B with LoRA on Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-sft-trl-deepspeed-llama-1b | | Time series forecasting with GluonTS on Databricks GPU | 
https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-time-series-gluonts-101 | | Train XGBoost regression model on Databricks GPU | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ai-runtime/examples/tutorials/sgc-xgboost | | Use Hyperopt with distributed training on Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl-hyperparam-tuning/hyperopt-distributed-ml | @@ -309,29 +325,32 @@ | Integrate Databricks Feature Store with AutoML experiments | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl/feature-store-integration | | Use AutoML Python API for forecasting on Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl/forecasting-train-api | | Use AutoML Python API for regression on Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/automl/regression-train-api | +| Use Databricks declarative feature engineering APIs | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/declarative-apis | | Implement on-demand feature computation with UDFs | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/on-demand-features | +| Use Databricks features in online ML workflows | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/online-workflows | | Publish Databricks feature tables to third-party online stores | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/publish-features | | Integrate third-party online stores with Feature Store | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/third-party-online-stores | -| Train models and run batch inference with Databricks feature tables | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/train-models-with-feature-store | -| Use Databricks Foundation Model REST 
APIs | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/api-reference | +| Train ML models with Databricks feature tables | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/train-models-with-feature-store | +| Train models using Databricks declarative features | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/train-with-declarative-features | +| Use Databricks Foundation Model REST APIs for model integration | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/api-reference | | Use Mosaic Streaming to load Spark data into PyTorch | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/load-data/streaming | | Save and load TFRecord data with Spark on Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/load-data/tfrecords-save-load | | Create and call Databricks foundation model endpoints | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/create-foundation-model-endpoints | | Deploy custom Python logic with Databricks Model Serving | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/deploy-custom-python-code | -| Implement function calling on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/function-calling | +| Implement function calling with Databricks foundation model endpoints | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/function-calling | | Call provider-native OpenAI, Anthropic, and Gemini APIs on Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/provider-native-apis | | Query Databricks Anthropic models with Messages API | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-anthropic-messages | | Write and send chat model queries on 
Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-chat-models | | Query embedding models via Databricks endpoints | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-embedding-models | | Integrate Google Gemini API with Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-gemini-api | -| Use OpenAI Responses API with Databricks endpoints | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-openai-responses | -| Query reasoning models using Foundation Model API | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-reason-models | +| Query Databricks foundation models using the OpenAI Responses API | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-openai-responses | +| Query reasoning foundation models using Foundation Model API | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-reason-models | | Query route-optimized Databricks serving endpoints | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-route-optimization | -| Query vision foundation models via Databricks endpoints | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-vision-models | +| Query vision foundation models via Mosaic AI Model Serving | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/query-vision-models | | Send scoring requests to Databricks custom model endpoints | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/score-custom-model-endpoints | -| Send query requests to Databricks foundation models | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/score-foundation-models | +| Send query requests to Databricks and external foundation models 
| https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/score-foundation-models | | Use structured outputs with Databricks LLM endpoints | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/structured-outputs | -| Use web search grounding with Databricks models | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/web-search | +| Use web search with Databricks foundation models and MCP | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/web-search | | Featurize data for transfer learning with pandas UDFs on Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/preprocess-data/transfer-learning-tensorflow | | Combine Spark and Ray in one Databricks environment | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ray/connect-spark-ray | | Integrate MLflow tracking with Ray on Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/ray/ray-mlflow | @@ -394,8 +413,10 @@ | Trace txtai embeddings and LLM workflows with MLflow | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/integrations/txtai | | Use MLflow MCP server to manage traces | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/mlflow-mcp | | Access MLflow trace metadata and spans via SDK | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/access-trace-data | -| Query MLflow GenAI traces programmatically via SDK | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/query-via-sdk | +| Query Unity Catalog MLflow traces with Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/query-dbsql | +| Search and analyze MLflow traces via SDK | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/query-via-sdk | 
| Example queries using mlflow.search_traces() | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/observe-with-traces/search-traces-examples | +| Set OpenTelemetry span attributes for Databricks MLflow | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tracing/third-party/otel-span-attributes | | Create custom MLflow scorers for RAG evaluation | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tutorials/examples/custom-scorers | | Optimize chained prompts with MLflow multi-prompt workflows | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tutorials/examples/multi-prompt-optimization | | Implement MLflow prompt optimization with GEPA and GPT-OSS | https://learn.microsoft.com/en-us/azure/databricks/mlflow3/genai/tutorials/examples/prompt-optimization-quickstart | @@ -403,34 +424,32 @@ | Use Databricks notebook features for code development | https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebooks-code | | Share and modularize code across Databricks notebooks | https://learn.microsoft.com/en-us/azure/databricks/notebooks/share-code | | Connect to and query Lakebase database instances | https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/query/ | -| Query Lakebase instances from Databricks notebooks | https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/query/notebook | +| Query Lakebase database instances from notebooks | https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/query/notebook | | Connect external SQL clients to Lakebase instances | https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/query/psql | | Register Lakebase instances with Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/register-uc | | Integrate Lakebase with Unity Catalog and synced data | https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/sync-data/ | | Use Lakebase Autoscaling APIs, CLI, and SDKs | 
https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/api-usage | -| Manage Lakebase with the Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/cli | | Connect to Lakebase using DBeaver | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/connect-dbeaver | | Connect and monitor Lakebase with pgAdmin | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/connect-pgadmin | | Monitor Lakebase Postgres performance with PgHero | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/connect-pghero | | Connect to Lakebase Postgres using psql | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/connect-psql | | Use Lakebase Data API for RESTful Postgres access | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/data-api | -| Integrate Lakebase Postgres with Databricks Apps | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/databricks-apps | -| Connect external apps to Lakebase via SDK | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/external-apps-connect | +| Use Lakebase Postgres as backend for Databricks Apps | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/databricks-apps | +| Connect external apps to Lakebase via SDK and OAuth | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/external-apps-connect | | Connect external apps to Lakebase via REST API | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/external-apps-manual-api | -| Integrate external monitoring tools with Lakebase Postgres | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/external-monitoring-tools | | Back Databricks Online Feature Stores with Lakebase | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/feature-store | | Use frameworks to connect to Lakebase | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/framework-examples | | Backup and restore Lakebase with 
pg_dump/pg_restore | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/pg-dump-restore | | Use Postgres clients with Lakebase databases | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/postgres-clients | | Query Lakebase from Lakehouse SQL editor | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/query-sql-editor | | Store AI agent state in Lakebase Postgres | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/state-management | -| Apply pandas function APIs to PySpark DataFrames | https://learn.microsoft.com/en-us/azure/databricks/pandas/pandas-function-apis | +| Connect custom Databricks app to Lakebase Autoscaling | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/tutorial-databricks-apps-autoscaling | +| Use pandas function APIs with PySpark DataFrames | https://learn.microsoft.com/en-us/azure/databricks/pandas/pandas-function-apis | | Convert between PySpark and pandas DataFrames with Arrow | https://learn.microsoft.com/en-us/azure/databricks/pandas/pyspark-pandas-conversion | | Connect Databricks to ingestion partners via Partner Connect | https://learn.microsoft.com/en-us/azure/databricks/partner-connect/ingestion | | Connect Databricks to ML partners via Partner Connect | https://learn.microsoft.com/en-us/azure/databricks/partner-connect/ml | | Connect Databricks to data prep partners via Partner Connect | https://learn.microsoft.com/en-us/azure/databricks/partner-connect/prep | | Walkthrough: Connect Fivetran to Databricks via Partner Connect | https://learn.microsoft.com/en-us/azure/databricks/partner-connect/walkthrough-fivetran | -| Read Unity Catalog data from Microsoft Fabric | https://learn.microsoft.com/en-us/azure/databricks/partners/bi/fabric | | Connect Databricks SQL warehouses to Hex | https://learn.microsoft.com/en-us/azure/databricks/partners/bi/hex | | Connect Looker to Azure Databricks clusters and SQL | 
https://learn.microsoft.com/en-us/azure/databricks/partners/bi/looker | | Connect Looker Studio to Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/partners/bi/looker-studio | @@ -677,8 +696,8 @@ | Write DataFrames to databases via DataFrameWriter.jdbc | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/jdbc | | Write JSON output with DataFrameWriter.json in Databricks | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/json | | Control overwrite behavior with DataFrameWriter.mode | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/mode | -| Set single output option with DataFrameWriter.option | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/option | -| Set multiple output options with DataFrameWriter.options | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/options | +| Use DataFrameWriter.option for Databricks data sources | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/option | +| Configure multiple DataFrameWriter.options in Databricks | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/options | | Write ORC files with DataFrameWriter.orc in Databricks | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/orc | | Write Parquet output with DataFrameWriter.parquet | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/parquet | | Partition output data with DataFrameWriter.partitionBy | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriter/partitionby | @@ -692,8 +711,8 @@ | Cluster data using DataFrameWriterV2.clusterBy | 
https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/clusterby | | Create new tables with DataFrameWriterV2.create | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/create | | Use DataFrameWriterV2.createOrReplace in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/createorreplace | -| Set write options with DataFrameWriterV2.option | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/option | -| Configure multiple write options with options() | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/options | +| Use DataFrameWriterV2.option for Databricks writes | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/option | +| Configure DataFrameWriterV2 options in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/options | | Use DataFrameWriterV2.overwrite for conditional updates | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/overwrite | | Use overwritePartitions with DataFrameWriterV2 | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/overwritepartitions | | Partition tables with DataFrameWriterV2.partitionedBy | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/dataframewriterv2/partitionedby | @@ -1264,7 +1283,7 @@ | Decompress Zstandard data with zstd_decompress | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/zstd_decompress | | Configure Databricks Lakehouse Federation for BigQuery | https://learn.microsoft.com/en-us/azure/databricks/query-federation/bigquery | | Federate Databricks queries across workspaces | 
https://learn.microsoft.com/en-us/azure/databricks/query-federation/databricks | -| Create Unity Catalog HTTP connections for external APIs | https://learn.microsoft.com/en-us/azure/databricks/query-federation/http | +| Create and use HTTP connections in Databricks | https://learn.microsoft.com/en-us/azure/databricks/query-federation/http | | Set up Lakehouse Federation for MySQL | https://learn.microsoft.com/en-us/azure/databricks/query-federation/mysql | | Configure Databricks Lakehouse Federation for Oracle | https://learn.microsoft.com/en-us/azure/databricks/query-federation/oracle | | Set up Lakehouse Federation for PostgreSQL | https://learn.microsoft.com/en-us/azure/databricks/query-federation/postgresql | @@ -1275,7 +1294,7 @@ | Federate Databricks queries to Snowflake using OAuth | https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake | | Federate Databricks queries to Snowflake with basic auth | https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-basic-auth | | Federate Databricks queries to Snowflake with Entra ID | https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-entra | -| Federate Databricks queries to Snowflake with OAuth tokens | https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-oauth-access-token | +| Use OAuth tokens for Snowflake federation in Databricks | https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-oauth-access-token | | Federate Databricks queries to Snowflake with Okta OAuth | https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-okta | | Configure Databricks federated queries to Snowflake with PEM | https://learn.microsoft.com/en-us/azure/databricks/query-federation/snowflake-pem | | Configure Databricks Lakehouse Federation for SQL Server | https://learn.microsoft.com/en-us/azure/databricks/query-federation/sql-server | @@ -1284,11 +1303,15 @@ | Read binary files into Spark 
DataFrames on Databricks | https://learn.microsoft.com/en-us/azure/databricks/query/formats/binary | | Read CSV files using Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/query/formats/csv | | Read Delta Sharing tables with Spark DataFrames | https://learn.microsoft.com/en-us/azure/databricks/query/formats/deltasharing | +| Read Excel files with built-in Databricks Spark format | https://learn.microsoft.com/en-us/azure/databricks/query/formats/excel | +| Process JSON files with Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/query/formats/json | | Load MLflow experiment run data in Databricks | https://learn.microsoft.com/en-us/azure/databricks/query/formats/mlflow-experiment | +| Read and write ORC files in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/query/formats/orc | | Read Parquet files using Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/query/formats/parquet | | Process text files with Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/query/formats/text | | Read and write XML files in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/query/formats/xml | | Use MLflow REST APIs on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/reference/mlflow-api | +| Automate Azure Databricks Git folders with Terraform | https://learn.microsoft.com/en-us/azure/databricks/repos/automate-with-terraform | | Use Databricks secrets for JDBC credentials | https://learn.microsoft.com/en-us/azure/databricks/security/secrets/example-secret-workflow | | Query JSON string columns with Databricks SQL operators | https://learn.microsoft.com/en-us/azure/databricks/semi-structured/json | | Use SparkR, sparklyr, and dplyr DataFrames on Databricks | https://learn.microsoft.com/en-us/azure/databricks/sparkr/dataframes-tables | @@ -1345,6 +1368,7 @@ | Parse CSV strings with Databricks SQL from_csv | 
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/from_csv | | Parse JSON strings with Databricks SQL from_json | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/from_json | | Parse XML to structs with Databricks SQL from_xml | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/from_xml | +| Use get_json_object in Azure Databricks SQL queries | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/get_json_object | | Read individual bits using Databricks SQL getbit | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/getbit | | Use h3_boundaryasgeojson in Databricks SQL queries | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_boundaryasgeojson | | Use h3_boundaryaswkb in Databricks SQL queries | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_boundaryaswkb | @@ -1377,6 +1401,16 @@ | Get parent H3 cell at resolution with h3_toparent | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_toparent | | Safely cover geography with H3 BIGINT using h3_try_coverash3 | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_try_coverash3 | | Safely cover geography with H3 strings using h3_try_coverash3string | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/h3_try_coverash3string | +| Use ip_as_binary in Databricks SQL queries | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_as_binary | +| Use ip_as_string in Databricks SQL queries | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_as_string | +| Use ip_cidr in Databricks SQL for CIDR handling | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_cidr | +| Use ip_cidr_contains in Databricks SQL filters | 
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_cidr_contains | +| Use ip_host in Databricks SQL for IP normalization | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_host | +| Use ip_network in Databricks SQL for network extraction | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_network | +| Use ip_network_first alias in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_network_first | +| Use ip_network_last in Databricks SQL for last IP | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_network_last | +| Use ip_prefix_length in Databricks SQL for CIDR masks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_prefix_length | +| Use ip_version in Databricks SQL to detect IP type | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/ip_version | | Invoke Java methods from Databricks SQL with java_method | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/java_method | | Use kll_merge_agg_bigint for KLL sketches | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_merge_agg_bigint | | Use kll_merge_agg_double for KLL sketches | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/kll_merge_agg_double | @@ -1427,7 +1461,7 @@ | Raise custom SQL errors in Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/raise_error | | Generate random numbers with rand in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/rand | | Generate random numbers with random in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/random | -| Read files with read_files table function in Databricks | 
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/read_files |
+| Use read_files table-valued function in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/read_files |
 | Query Kafka with read_kafka in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/read_kafka |
 | Stream from Amazon Kinesis using read_kinesis TVF | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/read_kinesis |
 | Read Google Pub/Sub streams with read_pubsub in Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/read_pubsub |
@@ -1545,6 +1579,10 @@
 | Transform map keys with transform_keys function | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/transform_keys |
 | Compute averages safely with try_avg aggregate | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_avg |
 | Safely access array or map elements with try_element_at | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_element_at |
+| Use try_ip_as_binary in Databricks SQL queries | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_ip_as_binary |
+| Use try_ip_as_string in Databricks SQL queries | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_ip_as_string |
+| Use try_ip_cidr for CIDR handling in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_ip_cidr |
+| Use try_ip_host for IP parsing in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_ip_host |
 | Parse timestamps safely with try_parse_timestamp | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_parse_timestamp |
 | Retrieve Databricks secrets with try_secret function | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_secret |
 | Convert formatted strings to DECIMAL with try_to_number | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/try_to_number |
@@ -1597,16 +1635,19 @@
 | Use zstd_compress in Databricks SQL queries | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/zstd_compress |
 | Decompress data with zstd_decompress in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/functions/zstd_decompress |
 | Invoke built-in and user-defined functions in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-function-invocation |
+| Use built-in SQL functions in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-functions-builtin |
 | Implement user-defined aggregate functions in Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-functions-udf-aggregate |
 | Integrate Hive UDFs, UDAFs, UDTFs with Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-functions-udf-hive |
 | Create and register scalar UDFs in Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-functions-udf-scalar |
 | Use H3 geospatial SQL functions in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-h3-geospatial-functions |
 | Alphabetical reference of H3 functions in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-h3-geospatial-functions-alpha |
 | Example analytics using H3 functions on Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-h3-geospatial-functions-examples |
+| Work with IP address SQL functions in Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-ip-functions |
 | Write and use lambda functions in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-lambda-functions |
 | Use IDENTIFIER clause for safe Databricks SQL parameterization | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-names-identifier-clause |
 | Understand NULL semantics in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-null-semantics |
-| Write SQL scripts with Databricks SQL/PSM syntax | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-scripting |
+| Use parameter markers in Databricks SQL statements | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-parameter-marker |
+| Author SQL/PSM procedural scripts in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-scripting |
 | Use ST geospatial functions in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-st-geospatial-functions |
 | Alphabetical reference of ST functions in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-st-geospatial-functions-alpha |
 | Use CALL to invoke Databricks SQL procedures | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-call |
@@ -1645,22 +1686,21 @@
 | List users with SHOW USERS in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-show-users |
 | List views with SHOW VIEWS in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-show-views |
 | Add comments and hints in Databricks SQL statements | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-comment |
+| Use ALTER PROVIDER in Databricks SQL for Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-provider |
 | Create catalogs with Databricks SQL CREATE CATALOG | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-catalog |
 | Define foreign connections with CREATE CONNECTION | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-connection |
 | Create external functions with Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-function |
-| Define materialized views in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-materialized-view |
 | Create and manage procedures in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-procedure |
 | Create schemas with Databricks SQL CREATE SCHEMA | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-schema |
 | Create SQL and Python functions in Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-sql-function |
 | Create streaming tables in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-streaming-table |
 | Use FLOW AUTO CDC with CREATE STREAMING TABLE | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-streaming-table-auto-cdc |
 | Define constraints in CREATE TABLE and MATERIALIZED VIEW | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-table-constraint |
-| Create managed, temporary, and external tables in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-table-using |
 | Create views and metric views in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-view |
 | Create Unity Catalog volumes with CREATE VOLUME | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-volume |
 | Use DECLARE VARIABLE for session variables in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-declare-variable |
 | Drop user-defined functions with DROP FUNCTION in Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-drop-function |
-| Use INSERT syntax in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-dml-insert-into |
+| Use INSERT syntax in Azure Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-dml-insert-into |
 | Author SQL pipelines in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-pipeline |
 | Compose queries in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-query |
 | Write SELECT subqueries in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-qry-select |
@@ -1687,7 +1727,7 @@
 | Use named parameter markers in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/user/queries/query-parameters |
 | Create and use query snippets in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/user/queries/query-snippets |
 | Implement custom stateful streaming with transformWithState | https://learn.microsoft.com/en-us/azure/databricks/stateful-applications/ |
-| Implement example custom stateful streaming apps | https://learn.microsoft.com/en-us/azure/databricks/stateful-applications/examples |
+| Implement custom stateful streaming with transformWithState | https://learn.microsoft.com/en-us/azure/databricks/stateful-applications/examples |
 | Use legacy arbitrary stateful operators on Databricks | https://learn.microsoft.com/en-us/azure/databricks/stateful-applications/legacy |
 | Handle schema evolution in transformWithState state store | https://learn.microsoft.com/en-us/azure/databricks/stateful-applications/schema-evolution |
 | Use Avro and Schema Registry with Kafka streaming on Databricks | https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/avro-dataframe |
@@ -1696,10 +1736,14 @@
 | Serialize and deserialize protocol buffers in Databricks streaming | https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/protocol-buffers |
 | Use real-time mode with Kafka and custom sinks | https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/examples |
 | Create Scala user-defined aggregate functions on Databricks | https://learn.microsoft.com/en-us/azure/databricks/udf/aggregate-scala |
-| Create and use pandas UDFs on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/udf/pandas |
+| Create and optimize pandas UDFs in Databricks | https://learn.microsoft.com/en-us/azure/databricks/udf/pandas |
 | Implement Python scalar UDFs in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/udf/python |
 | Implement Scala scalar UDFs in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/udf/scala |
 | Access task context inside Databricks UDFs | https://learn.microsoft.com/en-us/azure/databricks/udf/udf-task-context |
 | Integrate custom embedding models with Vector Search | https://learn.microsoft.com/en-us/azure/databricks/vector-search/custom-embedding-model |
 | Query Mosaic AI Vector Search indexes with filters and reranking | https://learn.microsoft.com/en-us/azure/databricks/vector-search/query-vector-search |
+| Use Vector Search with OpenAI embeddings | https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-external-embedding-model-example |
+| Use Vector Search with GTE foundation embeddings | https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-foundation-embedding-model-gte-example |
+| Use the Databricks Vector Search Python SDK | https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-python-sdk-example |
+| Run Databricks vector search example notebooks | https://learn.microsoft.com/en-us/azure/databricks/vector-search/vs-example-notebooks |
 | Manage files in Unity Catalog volumes across tools | https://learn.microsoft.com/en-us/azure/databricks/volumes/volume-files |
diff --git a/skills/azure-databricks/limits-quotas.md b/skills/azure-databricks/limits-quotas.md
index 2be8f324..e557f3b1 100644
--- a/skills/azure-databricks/limits-quotas.md
+++ b/skills/azure-databricks/limits-quotas.md
@@ -8,7 +8,7 @@
 | Topic | URL |
 |-------|-----|
 | Review Azure Databricks serverless compute quotas | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/serverless-quotas |
-| Configure AI Gateway endpoint rate limits in Databricks | https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/rate-limits-beta |
+| Set and manage Unity AI Gateway endpoint rate limits | https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/rate-limits-beta |
 | Clone legacy Databricks dashboards to AI/BI | https://learn.microsoft.com/en-us/azure/databricks/archive/legacy/clone-legacy-to-aibi |
 | Use and migrate from legacy Databricks online tables | https://learn.microsoft.com/en-us/azure/databricks/archive/machine-learning/feature-store/online-tables |
 | Check Databricks metric view feature runtime requirements | https://learn.microsoft.com/en-us/azure/databricks/business-semantics/metric-views/feature-availability |
@@ -32,11 +32,9 @@
 | Understand Meta Ads ingestion connector limits | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/meta-ads-limits |
 | Review MySQL Lakeflow connector limitations and considerations | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-limits |
 | Understand NetSuite Lakeflow connector limitations | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/netsuite-limits |
-| Understand PostgreSQL Lakeflow Connect connector limitations | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-limits |
 | Review Salesforce Lakeflow connector limitations | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/salesforce-limits |
 | Understand ServiceNow Lakeflow connector limitations | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/servicenow-limits |
 | Review Microsoft SharePoint connector ingestion limits | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-limits |
-| Review SQL Server Lakeflow Connect connector limitations | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-limits |
 | Review TikTok Ads Lakeflow connector limitations | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/tiktok-ads-limits |
 | Understand Workday HCM Lakeflow connector limitations | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-hcm-limits |
 | Review limitations for Workday Reports ingestion | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-reports-limits |
@@ -44,11 +42,10 @@
 | Zerobus Ingest connector limits and constraints | https://learn.microsoft.com/en-us/azure/databricks/ingestion/zerobus-limits |
 | Understand Databricks JDBC driver parameter limits | https://learn.microsoft.com/en-us/azure/databricks/integrations/jdbc-oss/example |
 | Handle large parameter arrays in For each tasks | https://learn.microsoft.com/en-us/azure/databricks/jobs/for-each-lookup-example |
-| Understand Lakeflow Spark Declarative Pipeline limits and quotas | https://learn.microsoft.com/en-us/azure/databricks/ldp/limitations |
+| Lakeflow Spark pipeline limits in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/ldp/limitations |
 | Understand Databricks Runtime ML library maintenance and support | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/databricks-runtime-ml-maintenance |
-| Use Databricks Online Feature Stores for low-latency serving | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/online-feature-store |
 | Overview and limits of Databricks Foundation Model APIs | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/ |
-| Foundation Model APIs rate limits and quotas | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/limits |
+| Understand Databricks Foundation Model APIs limits and quotas | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/limits |
 | Understand model units for provisioned throughput capacity | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/model-units |
 | Review Azure Databricks model serving limits and regions | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/model-serving-limits |
 | Understand and debug Databricks model serving timeouts | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/model-serving-timeouts |
@@ -57,7 +54,7 @@
 | Notebook size and feature limits in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-limitations |
 | Understand Lakebase and standard Postgres differences | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/compatibility |
 | Lakebase Autoscaling feature and compatibility limitations | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/limitations |
-| Manage Lakebase databases and branch limits | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-databases |
+| Understand Lakebase Postgres database count limits | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-databases |
 | Wait for streaming query termination with awaitTermination | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/awaittermination |
 | Use StreamingQuery.id for unique query identification | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/id |
 | Name streaming queries with StreamingQuery.name | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/classes/streamingquery/name |
@@ -83,4 +80,5 @@
 | DROP VOLUME retention behavior in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-drop-volume |
 | Recover dropped Unity Catalog tables with UNDROP | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-undrop-table |
 | Understand real-time mode limitations in Structured Streaming | https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/limitations |
+| Reference supported features and limits for real-time mode | https://learn.microsoft.com/en-us/azure/databricks/structured-streaming/real-time/reference |
 | Scale Mosaic AI Vector Search endpoints to high QPS | https://learn.microsoft.com/en-us/azure/databricks/vector-search/high-qps |
diff --git a/skills/azure-databricks/security.md b/skills/azure-databricks/security.md
index 01244f90..f29d7794 100644
--- a/skills/azure-databricks/security.md
+++ b/skills/azure-databricks/security.md
@@ -8,24 +8,25 @@
 | Topic | URL |
 |-------|-----|
 | Monitor and revoke Azure Databricks personal access tokens | https://learn.microsoft.com/en-us/azure/databricks/admin/access-control/tokens |
-| Configure Azure Databricks diagnostic audit log delivery securely | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/audit-log-delivery |
-| Configure admin protection for no-isolation shared clusters | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/no-isolation-shared |
-| Understand governed tags for policy enforcement in Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/governed-tags/ |
-| Manage permissions and access control on governed tags | https://learn.microsoft.com/en-us/azure/databricks/admin/governed-tags/manage-permissions |
-| Configure SQL warehouse data access properties in Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/sql/data-access-configuration |
+| Secure no isolation shared clusters with admin protection | https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/no-isolation-shared |
+| Create and manage governed tags in Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/admin/governed-tags/manage-governed-tags |
+| Manage permissions for governed tags in Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/governed-tags/manage-permissions |
 | Manage users, groups, and service principals in Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/ |
-| Create and manage Azure Databricks groups for access control | https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/manage-groups |
-| Manage Azure Databricks service principals and access | https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/manage-service-principals |
-| Configure Microsoft Entra SCIM provisioning to Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/scim/aad |
-| Use Azure Databricks service principals for secure automation | https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/service-principals |
-| Add, update, and remove Databricks users securely | https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/users |
+| Configure automatic identity sync from Entra ID to Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/automatic-identity-management |
+| Understand and use groups for Databricks access control | https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/groups |
+| Create and manage groups in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/manage-groups |
+| Manage service principals in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/manage-service-principals |
+| Configure SCIM-based user and group provisioning to Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/scim/ |
+| Set up SCIM provisioning from Microsoft Entra ID to Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/scim/aad |
+| Use service principals for secure automation in Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/service-principals |
+| Manage Azure Databricks workspace users | https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/users |
+| Manage legacy workspace-local groups in Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/workspace-local-groups |
 | Enforce user isolation cluster types in Databricks workspaces | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/enforce-user-isolation |
-| Configure RestrictWorkspaceAdmins for Databricks workspaces | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/restrict-workspace-admins |
-| Control Azure Databricks personnel access to workspaces | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/workspace-access |
+| Restrict workspace admin permissions in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace-settings/restrict-workspace-admins |
+| Configure Azure Databricks personnel access to workspaces | https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/workspace-access |
 | Configure AI/BI access controls in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/ai-bi/admin/ |
 | Configure embedding controls for dashboards and Genie | https://learn.microsoft.com/en-us/azure/databricks/ai-bi/admin/embed |
 | Manage consumer access and permissions in Databricks One | https://learn.microsoft.com/en-us/azure/databricks/ai-bi/consumers/ |
-| Govern LLM endpoints with AI Gateway permissions and limits | https://learn.microsoft.com/en-us/azure/databricks/ai-gateway/overview-beta |
 | Enable Microsoft Entra conditional access for Databricks | https://learn.microsoft.com/en-us/azure/databricks/archive/azure-admin/conditional-access |
 | Configure legacy credential passthrough security in Databricks | https://learn.microsoft.com/en-us/azure/databricks/archive/credential-passthrough/ |
 | Secure ADLS access with Entra ID passthrough in Databricks | https://learn.microsoft.com/en-us/azure/databricks/archive/credential-passthrough/adls-passthrough |
@@ -40,12 +41,11 @@
 | Configure dedicated group access for Databricks compute | https://learn.microsoft.com/en-us/azure/databricks/compute/group-access |
 | Understand Databricks Lakeguard user isolation model | https://learn.microsoft.com/en-us/azure/databricks/compute/lakeguard |
 | Use fine-grained access control on dedicated compute | https://learn.microsoft.com/en-us/azure/databricks/compute/single-user-fgac |
-| Create Lakeflow Connect connections for managed ingestion | https://learn.microsoft.com/en-us/azure/databricks/connect/managed-ingestion |
 | Configure Kafka authentication on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/connect/streaming/kafka/authentication |
 | Secure Unity Catalog connections to external systems | https://learn.microsoft.com/en-us/azure/databricks/connect/uc-connections |
 | Govern external cloud service access with Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-services/ |
 | Manage Unity Catalog service credentials and permissions | https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-services/manage-service-credentials |
-| Create Unity Catalog service credentials for cloud services | https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-services/service-credentials |
+| Create Unity Catalog service credentials for external services | https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-services/service-credentials |
 | Configure Unity Catalog access to cloud object storage | https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/ |
 | Use Azure managed identities for Unity Catalog storage | https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/azure-managed-identities |
 | Administer Unity Catalog external locations and permissions | https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/manage-external-locations |
@@ -53,7 +53,7 @@
 | Create storage credentials for Azure Data Lake Storage | https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/storage-credentials |
 | Create storage credentials for Cloudflare R2 in Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/storage-credentials-r2 |
 | Create read-only AWS S3 storage credentials in Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/storage-credentials-s3 |
-| Securely embed Databricks dashboards for external users | https://learn.microsoft.com/en-us/azure/databricks/dashboards/share/embedding/external-embed |
+| Securely configure external dashboard embedding in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/dashboards/share/embedding/external-embed |
 | Manage Azure Databricks dashboard permissions via API | https://learn.microsoft.com/en-us/azure/databricks/dashboards/tutorials/manage-permissions |
 | Configure Hive metastore table ACLs in Databricks | https://learn.microsoft.com/en-us/azure/databricks/data-governance/table-acls/ |
 | Control ANY FILE securable access in Databricks | https://learn.microsoft.com/en-us/azure/databricks/data-governance/table-acls/any-file |
@@ -66,12 +66,12 @@
 | Learn Unity Catalog permissions model and inheritance | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/access-control/permissions-concepts |
 | Reference Unity Catalog privileges and applicable objects | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/access-control/privileges-reference |
 | Access and interpret Databricks anomaly detection results | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/data-quality-monitoring/anomaly-detection/results |
-| Secure data with Unity Catalog row filters and column masks on Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/filters-and-masks/ |
+| Secure data with row filters and column masks in Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/filters-and-masks/ |
 | Manually apply Unity Catalog row filters and column masks using mapping tables | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/filters-and-masks/manually-apply |
 | Manage Unity Catalog privileges and data access | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/ |
 | Configure and manage Unity Catalog access requests | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/access-request-destinations |
 | Understand admin privileges for Unity Catalog management | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/admin-privileges |
-| Configure Unity Catalog allowlist for Databricks compute | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/allowlist |
+| Manage Unity Catalog allowlist for libraries and scripts | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/allowlist |
 | Manage and transfer Unity Catalog object ownership | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/ownership |
 | Upgrade Unity Catalog to privilege inheritance model | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/upgrade-privilege-model |
 | Reference Unity Catalog securable objects and permissions | https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/securable-objects |
@@ -79,10 +79,11 @@
 | Understand trust and safety for Databricks AI features | https://learn.microsoft.com/en-us/azure/databricks/databricks-ai/databricks-ai-trust |
 | Configure partner-powered AI features in Databricks | https://learn.microsoft.com/en-us/azure/databricks/databricks-ai/partner-powered |
 | Secure Delta Sharing open access with IP allow lists | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/access-list |
+| Monitor Delta Sharing activity with audit logs | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/audit-logs |
 | Configure OIDC federation for Delta Sharing open access | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/create-recipient-oidc-fed |
-| Secure Delta Sharing access for external users with bearer tokens | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/create-recipient-token |
+| Configure Delta Sharing bearer-token recipients for open sharing | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/create-recipient-token |
 | Manage provider-side access control for Delta Sharing | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/grant-access |
-| Manage Delta Sharing provider objects in Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/manage-provider |
+| Access Delta Sharing open-shared data using bearer tokens | https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/read-data-open |
 | Configure authorization for Azure Databricks APIs and CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/ |
 | Manually obtain Microsoft Entra tokens for Databricks REST APIs | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/aad-token-manual |
 | Configure Azure DevOps pipelines to authenticate to Databricks | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/auth-with-azure-devops |
@@ -93,13 +94,12 @@
 | Authenticate to Azure Databricks with Azure PowerShell | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/azure-powershell-login |
 | Authenticate Azure Databricks using Microsoft Entra service principals | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/azure-sp |
 | Authenticate to Databricks using federated IdP tokens | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-federation-exchange |
-| Configure OAuth token federation policy for Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-federation-policy |
+| Configure OAuth federation policy for Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-federation-policy |
 | Enable Databricks workload identity federation in CI/CD | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-federation-provider |
 | Authorize Databricks service principals with OAuth M2M | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-m2m |
-| Authorize Databricks user access with OAuth | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-u2m |
+| Authorize Azure Databricks user access with OAuth | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-u2m |
 | Configure Databricks workload identity federation for Azure DevOps | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/provider-azure-devops |
 | Configure Databricks workload identity federation for CircleCI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/provider-circleci |
-| Configure Databricks workload identity federation for GitHub Actions | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/provider-github |
 | Configure Databricks workload identity federation for GitLab CI/CD | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/provider-gitlab |
 | Enable Databricks workload identity federation for Terraform Cloud and others | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/provider-other |
 | Use Databricks service principals for CI/CD automation | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/service-principals |
@@ -123,6 +123,7 @@
 | Manage Databricks service principals using CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/account-service-principals-commands |
 | Manage Databricks account users via CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/account-users-commands |
 | Manage Databricks workspace assignments for principals | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/account-workspace-assignment-commands |
+| Configure Databricks CLI authentication with auth commands | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/auth-commands |
 | Configure secure authentication for Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/configure-commands |
 | Manage Unity Catalog credentials with Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/credentials-commands |
 | Manage Unity Catalog grants using Databricks CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/grants-commands |
@@ -134,10 +135,11 @@
 | Handle Unity Catalog access requests with rfa CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/rfa-commands |
 | Manage Databricks secrets and scopes via CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/secrets-commands |
 | Administer Databricks service principals using CLI | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/service-principals-commands |
+| Manage Databricks workspace IAM via CLI v2 | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/reference/workspace-iam-v2-commands |
 | Configure authentication and authorization for Databricks Apps | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/auth |
 | Use token-based auth for Databricks app APIs | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/connect-local |
 | Add Unity Catalog connection resources to Databricks Apps | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/connections |
-| Monitor Databricks app logs and audit events | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/monitor |
+| Configure logging and audit monitoring for Databricks Apps | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/monitor |
 | Configure networking and access controls for Databricks Apps | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/networking |
 | Manage Databricks app permissions and access control | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/permissions |
 | Use Databricks secrets as secure resources in apps | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/secrets |
@@ -147,9 +149,12 @@
 | Configure OAuth authorization for Databricks VS Code extension | https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext/authentication |
 | Configure secure external access to Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/external-access/admin |
 | Configure Unity Catalog credential vending for external engines | https://learn.microsoft.com/en-us/azure/databricks/external-access/credential-vending |
-| Configure authentication for Databricks App-based AI agents | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/agent-authentication |
+| Configure authentication for Databricks Apps AI agents | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/agent-authentication |
 | Configure authentication for Databricks AI agents on Model Serving | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/agent-framework/agent-authentication-model-serving |
 | Authenticate external clients to Databricks MCP servers | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/connect-external-services |
+| Host custom MCP servers on Databricks Apps with controlled access | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/custom-mcp |
+| Secure external MCP servers with Databricks-managed proxies | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/external-mcp |
+| Use Databricks managed MCP servers with Unity Catalog security | https://learn.microsoft.com/en-us/azure/databricks/generative-ai/mcp/managed-mcp |
 | Configure secure ADLS access for Databricks ingestion | https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/configure-data-access |
 | Generate temporary ADLS credentials for Databricks ingestion | https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/generate-temporary-credentials |
 | Use temporary credentials with COPY INTO securely | https://learn.microsoft.com/en-us/azure/databricks/ingestion/cloud-object-storage/copy-into/temporary-credentials |
@@ -161,11 +166,13 @@
 | Assign required MySQL privileges for Lakeflow ingestion | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/mysql-privileges |
 | Configure PostgreSQL user privileges for Databricks ingestion | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/postgresql-privileges |
 | Configure Run as identity for Lakeflow ingestion pipelines | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/run-as |
+| Use SharePoint connector with enhanced security workspaces | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-pipeline |
 | Configure OAuth M2M auth for SharePoint ingestion | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-source-setup-m2m |
 | Set up manual token refresh for SharePoint ingestion | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-source-setup-refresh-token |
 | Configure OAuth U2M auth for SharePoint ingestion | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sharepoint-source-setup-u2m |
 | Set SQL Server user permissions for Databricks ingestion | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/sql-server-privileges |
 |
Configure TikTok Ads auth for Databricks ingestion | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/tiktok-ads-source-setup | +| Configure TLS certificate validation for Lakeflow Connect databases | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/tls-server-certificate-validation | | Configure Workday HCM authentication for Databricks | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/workday-hcm-setup | | Configure OAuth security for Zendesk Support connector | https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/zendesk-support-source-setup | | Configure OAuth sign-on from BI partner tools to Databricks | https://learn.microsoft.com/en-us/azure/databricks/integrations/configuration | @@ -187,7 +194,7 @@ | Install Databricks libraries from external package repositories | https://learn.microsoft.com/en-us/azure/databricks/libraries/package-repositories | | Configure authentication for third-party online feature stores | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/fs-authentication | | Configure access control for Databricks feature tables | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/feature-store/workspace-feature-store/access-control | -| Compliance and security profiles for Foundation Model APIs | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/compliance | +| Apply compliance and security profiles to Databricks Foundation Model APIs | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/compliance | | Apply OpenAI high-risk use mitigations on Databricks | https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/open-ai-mitigation-requirements | | Secure external resource access from Databricks endpoints | 
https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-serving/store-env-variable-model-serving | | Onboard as a Databricks Marketplace provider securely | https://learn.microsoft.com/en-us/azure/databricks/marketplace/become-provider | @@ -197,19 +204,19 @@ | Grant and manage Lakebase instance access in UI | https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/manage-privileges | | Create and manage PostgreSQL roles for Lakebase | https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/pg-roles | | Use pre-created roles and permissions in Lakebase | https://learn.microsoft.com/en-us/azure/databricks/oltp/instances/roles | -| Configure authentication for Lakebase Postgres connections | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/authentication | +| Configure Lakebase authentication and token rotation | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/authentication | | Use customer-managed keys to encrypt Lakebase data | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/customer-managed-keys | | Configure data protection for Lakebase Postgres | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/data-protection | | Programmatically manage Lakebase project permissions | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/grant-permissions-programmatically | | Grant Lakebase project and database access | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/grant-user-access-tutorial | | Manage Lakebase project permissions and access | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-project-permissions | | Manage Postgres roles in Lakebase projects | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-roles | -| Manage Lakebase Postgres roles and permissions | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-roles-permissions | +| Grant and manage Lakebase database 
permissions | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/manage-roles-permissions | | Create and manage Lakebase Postgres roles | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/postgres-roles | | Configure Private Link for Lakebase Autoscaling traffic | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/private-link | | Configure protected branches in Lakebase Postgres | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/protected-branches | -| Configure Postgres roles and permissions in Lakebase | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/roles-permissions | -| Secure Databricks app access to Lakebase Autoscaling | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/tutorial-databricks-apps-autoscaling | +| Configure Lakebase Postgres roles and permissions | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/roles-permissions | +| Transfer Lakebase Postgres object ownership via group roles | https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/transfer-object-ownership | | Configure Azure Databricks service principals for Power BI M2M OAuth | https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi-m2m | | Decrypt data with aes_decrypt in Databricks PySpark | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/aes_decrypt | | Encrypt data with aes_encrypt in Databricks PySpark | https://learn.microsoft.com/en-us/azure/databricks/pyspark/reference/functions/aes_encrypt | @@ -223,15 +230,14 @@ | Authorize Databricks service principals for Git folders | https://learn.microsoft.com/en-us/azure/databricks/repos/automate-with-sp | | Configure secure Git integration for Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/repos/repos-setup | | Manage Databricks data residency with Azure Geographies | https://learn.microsoft.com/en-us/azure/databricks/resources/databricks-geos | -| 
Manage data residency with Databricks Designated Services | https://learn.microsoft.com/en-us/azure/databricks/resources/designated-services | | Configure domain-based firewall rules for Databricks | https://learn.microsoft.com/en-us/azure/databricks/resources/firewall-rules | | Manage Unity Catalog schemas and required permissions | https://learn.microsoft.com/en-us/azure/databricks/schemas/manage-schema | -| Configure Azure Databricks workspace object ACLs | https://learn.microsoft.com/en-us/azure/databricks/security/auth/access-control/ | +| Manage access control lists for Databricks workspace objects | https://learn.microsoft.com/en-us/azure/databricks/security/auth/access-control/ | | Assign roles and ACLs for Databricks service principals | https://learn.microsoft.com/en-us/azure/databricks/security/auth/access-control/service-principal-acl | | Configure Azure Databricks personal access token permissions | https://learn.microsoft.com/en-us/azure/databricks/security/auth/api-access-permissions | -| Change default Databricks workspace access to consumer | https://learn.microsoft.com/en-us/azure/databricks/security/auth/change-default-workspace-access | +| Configure default consumer workspace access via group cloning | https://learn.microsoft.com/en-us/azure/databricks/security/auth/change-default-workspace-access | | Understand default workspace permissions in Databricks | https://learn.microsoft.com/en-us/azure/databricks/security/auth/default-permissions | -| Manage Databricks user and group entitlements | https://learn.microsoft.com/en-us/azure/databricks/security/auth/entitlements | +| Manage Azure Databricks user and group entitlements | https://learn.microsoft.com/en-us/azure/databricks/security/auth/entitlements | | Configure just-in-time user provisioning in Databricks | https://learn.microsoft.com/en-us/azure/databricks/security/auth/jit | | Configure customer-managed keys for Unity Catalog | 
https://learn.microsoft.com/en-us/azure/databricks/security/keys/cmek-unity-catalog | | Configure CMK encryption for Azure managed disks | https://learn.microsoft.com/en-us/azure/databricks/security/keys/cmk-managed-disks-azure/ | @@ -253,18 +259,19 @@ | Manage Databricks SQL query and result encryption | https://learn.microsoft.com/en-us/azure/databricks/security/keys/sql-encryption | | Configure secure networking for Azure Databricks workspaces | https://learn.microsoft.com/en-us/azure/databricks/security/network/ | | Secure classic compute plane networking in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/ | -| Configure classic compute Private Link for Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/private-link-standard | +| Configure classic compute plane Private Link for Databricks | https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/private-link-standard | | Enable secure cluster connectivity (no public IP) in Databricks | https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/secure-cluster-connectivity | | Configure VNet service endpoint policies for Databricks | https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/service-endpoints | | Configure VNet peering for Azure Databricks workspaces | https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/vnet-peering | | Manage context-based network policies for Databricks workspaces | https://learn.microsoft.com/en-us/azure/databricks/security/network/context-based-policies | | Secure user access to Azure Databricks with front-end controls | https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/ | | Configure context-based ingress control for Databricks | https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/context-based-ingress | -| Configure inbound Private Link to Databricks 
workspaces | https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/front-end-private-connect | +| Configure inbound Private Link for Azure Databricks workspaces | https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/front-end-private-connect | | Configure Azure Databricks IP access lists | https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/ip-access-list | | Configure account console IP access lists | https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/ip-access-list-account | | Manage workspace IP access lists with CLI and API | https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/ip-access-list-workspace | | Create and manage context-based ingress policies in Databricks | https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/manage-ingress-policies | +| Set up inbound Private Link for Databricks performance services | https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/service-direct-privatelink | | Configure serverless egress network policies in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/manage-network-policies | | Use serverless egress control policies in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/network-policies | | Configure legacy serverless compute firewall access | https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/serverless-firewall | @@ -274,7 +281,7 @@ | Configure Canada Protected B controls in Databricks | https://learn.microsoft.com/en-us/azure/databricks/security/privacy/cccs-medium-protected-b | | Configure enhanced security and compliance in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/security/privacy/enhanced-security-compliance | | Set up enhanced security 
monitoring in Azure Databricks | https://learn.microsoft.com/en-us/azure/databricks/security/privacy/enhanced-security-monitoring | -| Implement GDPR and CCPA-compliant deletions with Delta | https://learn.microsoft.com/en-us/azure/databricks/security/privacy/gdpr-delta | +| Prepare Delta Lake data for GDPR and CCPA erasure compliance | https://learn.microsoft.com/en-us/azure/databricks/security/privacy/gdpr-delta | | Understand HIPAA compliance controls in Databricks | https://learn.microsoft.com/en-us/azure/databricks/security/privacy/hipaa | | Use HITRUST compliance controls in Databricks | https://learn.microsoft.com/en-us/azure/databricks/security/privacy/hitrust | | Apply IRAP compliance controls in Databricks | https://learn.microsoft.com/en-us/azure/databricks/security/privacy/irap | @@ -325,12 +332,13 @@ | List accessible Unity Catalog credentials with SHOW CREDENTIALS | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-show-credentials | | List Unity Catalog policies with SHOW POLICIES | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-aux-show-policies | | Manage data shares with ALTER SHARE in Unity Catalog | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-share | -| Configure column masks for fine-grained data access | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-column-mask | +| Define and manage column masks in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-column-mask | | Define row-level security policies in Databricks SQL | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-create-policy | | Drop row and column policies in Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-drop-policy | | Use REFRESH FOREIGN for Unity Catalog 
objects | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-refresh-foreign | | Apply row filters for secure data access in Databricks | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-row-filter | | Set Unity Catalog tags with required privileges | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-set-tag | | Remove Unity Catalog tags with proper permissions | https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-unset-tag | +| Call Databricks Vector Search with OAuth tokens | https://learn.microsoft.com/en-us/azure/databricks/vector-search/vector-search-oauth-token | | Implement dynamic views for fine-grained access control | https://learn.microsoft.com/en-us/azure/databricks/views/dynamic | | Unity Catalog volume privileges and task permissions | https://learn.microsoft.com/en-us/azure/databricks/volumes/privileges | diff --git a/skills/azure-defender-for-cloud/SKILL.md b/skills/azure-defender-for-cloud/SKILL.md index cdc89f83..4badce7e 100644 --- a/skills/azure-defender-for-cloud/SKILL.md +++ b/skills/azure-defender-for-cloud/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-defender-for-cloud -description: Expert knowledge for Azure Defender For Cloud development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when securing Azure VMs, AKS/containers, SQL/storage, multi-cloud connectors, or Defender for DevOps APIs, and other Azure Defender For Cloud related development tasks. Not for Azure Defender For Iot (use azure-defender-for-iot), Azure DDos Protection (use azure-ddos-protection), Azure External Attack Surface Management (use azure-external-attack-surface-management), Azure Security (use azure-security). 
+description: Expert knowledge for Azure Defender For Cloud development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when securing Azure VMs, AKS/containers, SQL/storage, multicloud connectors, or Defender for Servers plans, and other Azure Defender For Cloud related development tasks. Not for Azure Defender For Iot (use azure-defender-for-iot), Azure Security (use azure-security), Azure Sentinel (use azure-sentinel), Azure Firewall Manager (use azure-firewall-manager). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Defender For Cloud Skill @@ -24,15 +24,15 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| -| Troubleshooting | L37-L65 | Diagnosing and fixing Defender for Cloud issues: alert validation/interpretation, connector/onboarding problems (AWS/GCP/APIs/SQL/K8s), deployment gaps, malware/vuln handling, and incident references. | -| Best Practices | L66-L85 | Best practices for investigating and remediating vulnerabilities, misconfigurations, secrets, and API/endpoint/Kubernetes risks across Defender for Cloud, AKS, registries, and CI/CD. | -| Decision Making | L86-L104 | Guidance for choosing and planning Defender for Cloud plans, pricing, portals, agents, regional availability, and migration/transition paths for servers, containers, storage, and GCP. | -| Architecture & Design Patterns | L105-L114 | Architectural guidance for Defender for Servers/Containers: agentless scanning, malware/vuln detection on VMs/Kubernetes, data collection, residency, workspaces, and large-scale deployment. 
| -| Limits & Quotas | L115-L124 | Limits, quotas, and prerequisites for Defender for Cloud features: free trials, data ingestion, APIs, DevOps, portal preview, alert export limits, and data collection extension changes. | -| Security | L125-L199 | Security alerts, permissions, and hardening for Defender for Cloud: configuring protections, interpreting alerts/recommendations, RBAC/CIEM, data handling, and securing SQL, storage, containers, VMs, APIs, and Kubernetes. | -| Configuration | L200-L267 | Configuring Defender for Cloud features: onboarding, policies, exemptions, scanning (agentless, containers, SQL, storage, DevOps), alerts, exports, private links, and data security posture. | -| Integrations & Coding Patterns | L268-L297 | Integrating Defender for Cloud with CI/CD, SIEM, EDR, ITSM, APIs, and multi-cloud logs, plus querying/exporting security data and automating tickets and container/AKS scans | -| Deployment | L298-L325 | Deploying and managing Defender for Cloud plans and agents (Containers, SQL, Storage) across AKS/EKS/GKE and servers using portal, CLI, IaC, policies, APIs, and reviewing support/regions. | +| Troubleshooting | L37-L66 | Diagnosing and fixing Defender for Cloud issues: alert validation/response, connector and SQL/Kubernetes/container problems, GCP/AWS onboarding, and malware/vulnerability scan errors. | +| Best Practices | L67-L85 | Best practices for investigating and remediating vulnerabilities, misconfigurations, secrets, and API/endpoint/Kubernetes risks across Defender for Cloud, AKS, registries, and CI/CD. | +| Decision Making | L86-L104 | Guidance for choosing Defender for Cloud plans, portals, pricing and cost allocation, regional availability, migration paths, and deployment options for servers, containers, storage, and GCP. 
| +| Architecture & Design Patterns | L105-L113 | Architectural guidance for Defender for Servers/Containers: agentless scanning, malware/vuln detection on VMs/Kubernetes, data collection, residency, workspaces, and large-scale deployment. | +| Limits & Quotas | L114-L123 | Limits, quotas, and prerequisites for Defender for Cloud features: free trials, data ingestion, APIs, DevOps, portal preview, alert export limits, and data collection extension changes. | +| Security | L124-L201 | Security alerts, permissions, and hardening for Defender for Cloud: configuring protections, interpreting alerts/recommendations, RBAC/CIEM, data handling, and securing VMs, containers, SQL, storage, APIs, and Kubernetes. | +| Configuration | L202-L267 | How to enable and tune Defender for Cloud features: configuring scans, alerts, exports, exemptions, DevOps integrations, data security posture, and plan settings across Azure and multicloud. | +| Integrations & Coding Patterns | L268-L298 | Integrating Defender for Cloud with CI/CD, SIEM, EDR, ticketing, cloud logs, partners, and APIs, plus exporting/querying security data via ARG, REST, CLI, and PowerShell. | +| Deployment | L299-L326 | Deploying and managing Defender for Cloud protections (Containers, SQL, Storage, Servers) at scale via portal, CLI/IaC/Policy/REST, plus migration, removal, and support matrices. 
| ### Troubleshooting | Topic | URL | @@ -48,6 +48,7 @@ This skill requires **network access** to fetch documentation content: | Reference deprecated Defender for Cloud alert IDs | https://learn.microsoft.com/en-us/azure/defender-for-cloud/deprecated-alerts | | Remediate Defender for Cloud endpoint detection gaps | https://learn.microsoft.com/en-us/azure/defender-for-cloud/endpoint-detection-response-solution-recommendations | | Resolve common issues in Endor Labs integration | https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-endor-labs | +| Resolve common Microsoft Defender for Cloud issues | https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-general | | Use Defender for Cloud incident reference catalog | https://learn.microsoft.com/en-us/azure/defender-for-cloud/incidents-reference | | Handle malware alerts on Kubernetes nodes in Defender | https://learn.microsoft.com/en-us/azure/defender-for-cloud/kubernetes-nodes-malware | | Investigate and fix Defender Kubernetes node vulnerabilities | https://learn.microsoft.com/en-us/azure/defender-for-cloud/kubernetes-nodes-va | @@ -75,7 +76,6 @@ This skill requires **network access** to fetch documentation content: | Remediate endpoint detection and response gaps in Defender for Cloud | https://learn.microsoft.com/en-us/azure/defender-for-cloud/endpoint-detection-response-solution-recommendations | | Apply Defender networking recommendations for Azure | https://learn.microsoft.com/en-us/azure/defender-for-cloud/protect-network-resources | | Remediate cloud deployment secrets in Defender | https://learn.microsoft.com/en-us/azure/defender-for-cloud/remediate-cloud-deployment-secrets | -| Remediate machine secrets findings in Defender | https://learn.microsoft.com/en-us/azure/defender-for-cloud/remediate-server-secrets | | Remediate machine vulnerability findings in Defender for Servers | https://learn.microsoft.com/en-us/azure/defender-for-cloud/remediate-vulnerability-findings-vm | | Review 
security annotations on pull requests in GitHub and Azure DevOps | https://learn.microsoft.com/en-us/azure/defender-for-cloud/review-pull-request-annotations | | Prioritize and fix vulnerabilities in AKS containers | https://learn.microsoft.com/en-us/azure/defender-for-cloud/view-and-remediate-vulnerabilities-containers | @@ -93,7 +93,7 @@ This skill requires **network access** to fetch documentation content: | Estimate Defender for Cloud costs with calculator | https://learn.microsoft.com/en-us/azure/defender-for-cloud/cost-calculator | | Choose Defender for Containers deployment options | https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-containers-deployment-overview | | Decide between Defender for Storage classic and new plan | https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-classic | -| Migrate from Defender for Storage classic to new plan | https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-classic-migrate | +| Decide and migrate from Defender for Storage classic to new plan | https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-classic-migrate | | Use BYOL vulnerability assessment with Defender for Cloud | https://learn.microsoft.com/en-us/azure/defender-for-cloud/deploy-vulnerability-assessment-byol-vm | | Choose the right Defender for Servers plan | https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-servers-select-plan | | Plan for Defender for Cloud Log Analytics agent retirement | https://learn.microsoft.com/en-us/azure/defender-for-cloud/prepare-deprecation-log-analytics-mma-agent | @@ -110,7 +110,6 @@ This skill requires **network access** to fetch documentation content: | Design a Defender for Servers deployment architecture | https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-servers | | Understand Defender for Servers data collection design | 
https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-servers-agents | | Plan Defender for Servers data residency and workspaces | https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-servers-data-workspace | -| Scale Microsoft Defender for Servers across environments | https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-servers-scale | ### Limits & Quotas | Topic | URL | @@ -160,11 +159,13 @@ This skill requires **network access** to fetch documentation content: | Interpret Defender for Storage threats and alerts | https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-threats-alerts | | Configure disable rules for container vulnerability findings | https://learn.microsoft.com/en-us/azure/defender-for-cloud/disable-vulnerability-findings-containers-secure-score | | Enable Defender for open-source databases on AWS | https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-defender-for-databases-aws | -| Enable Defender for open-source databases on Azure | https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-defender-for-databases-azure | +| Configure sensitive data threat detection for Defender for Storage | https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-defender-for-storage-data-sensitivity | | Enable CIEM in Microsoft Defender for Cloud | https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-permissions-management | | Enable and configure gated deployment for Kubernetes clusters | https://learn.microsoft.com/en-us/azure/defender-for-cloud/enablement-guide-runtime-gated | | Understand and assign Defender for Cloud permissions | https://learn.microsoft.com/en-us/azure/defender-for-cloud/faq-permissions | | Configure governance rules to enforce Defender remediation | https://learn.microsoft.com/en-us/azure/defender-for-cloud/governance-rules | +| Use Defender for Cloud attack path analysis | 
https://learn.microsoft.com/en-us/azure/defender-for-cloud/how-to-manage-attack-path | +| Query security risks with cloud security explorer | https://learn.microsoft.com/en-us/azure/defender-for-cloud/how-to-manage-cloud-security-explorer | | Use Purview data sensitivity in Defender alerts | https://learn.microsoft.com/en-us/azure/defender-for-cloud/information-protection | | Apply Defender for Cloud Kubernetes data plane hardening | https://learn.microsoft.com/en-us/azure/defender-for-cloud/kubernetes-workload-protections | | Configure on-upload malware scanning for Azure Storage | https://learn.microsoft.com/en-us/azure/defender-for-cloud/on-upload-malware-scanning | @@ -173,7 +174,7 @@ This skill requires **network access** to fetch documentation content: | Configure roles and permissions for Defender for Servers | https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-servers-roles | | Manage Defender for Cloud user and personal data | https://learn.microsoft.com/en-us/azure/defender-for-cloud/privacy | | Use Defender for Cloud AI security recommendations | https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-ai | -| Apply Defender for Cloud API security recommendations | https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-api | +| Use Defender for Cloud API security recommendations | https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-api | | Use Defender for Cloud security recommendations for Azure App Service | https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-app-services | | Apply Defender for Cloud compute security recommendations | https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-compute | | Apply Defender for Cloud container security recommendations | https://learn.microsoft.com/en-us/azure/defender-for-cloud/recommendations-reference-container | @@ -196,6 +197,7 @@ 
This skill requires **network access** to fetch documentation content: | Changelog for SQL vulnerability assessment rules | https://learn.microsoft.com/en-us/azure/defender-for-cloud/sql-azure-vulnerability-assessment-rules-changelog | | Prerequisites and permissions for Defender for Storage | https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-storage | | Manage tenant-wide permissions in Defender for Cloud | https://learn.microsoft.com/en-us/azure/defender-for-cloud/tenant-wide-permissions-management | +| Configure JIT access and application control for Defender for Servers | https://learn.microsoft.com/en-us/azure/defender-for-cloud/tutorial-protect-resources | ### Configuration | Topic | URL | @@ -237,7 +239,7 @@ This skill requires **network access** to fetch documentation content: | Disable specific VM vulnerability findings in Defender for Cloud | https://learn.microsoft.com/en-us/azure/defender-for-cloud/disable-vulnerability-findings | | Configure exemptions and disable container VA findings | https://learn.microsoft.com/en-us/azure/defender-for-cloud/disable-vulnerability-findings-containers | | Configure agentless scanning for virtual machines | https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-agentless-scanning-vms | -| Enable and configure sensitive data threat detection for Storage | https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-defender-for-storage-data-sensitivity | +| Enable Defender for open-source Azure databases | https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-defender-for-databases-azure | | Enable just-in-time access for Azure virtual machines | https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-just-in-time-access | | Enable DevOps pull request security annotations | https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-pull-request-annotations | | Configure Defender Vulnerability Management for containers | 
https://learn.microsoft.com/en-us/azure/defender-for-cloud/enable-vulnerability-assessment | @@ -252,9 +254,7 @@ This skill requires **network access** to fetch documentation content: | Configure Microsoft Security DevOps GitHub Action | https://learn.microsoft.com/en-us/azure/defender-for-cloud/github-action | | Configure IaC misconfiguration scanning with Microsoft Security DevOps | https://learn.microsoft.com/en-us/azure/defender-for-cloud/iac-vulnerabilities | | Configure and manage MCSB security standard | https://learn.microsoft.com/en-us/azure/defender-for-cloud/manage-mcsb | -| Enable Defender for Cloud on management groups via policy | https://learn.microsoft.com/en-us/azure/defender-for-cloud/onboard-management-group | | Use built-in Azure Policy definitions for Defender for Cloud | https://learn.microsoft.com/en-us/azure/defender-for-cloud/policy-reference | -| Onboard Defender for Cloud using PowerShell | https://learn.microsoft.com/en-us/azure/defender-for-cloud/powershell-onboarding | | PowerShell script to enable SQL VA express configuration | https://learn.microsoft.com/en-us/azure/defender-for-cloud/powershell-sample-vulnerability-assessment-azure-sql | | PowerShell script to set SQL VA baselines | https://learn.microsoft.com/en-us/azure/defender-for-cloud/powershell-sample-vulnerability-assessment-baselines | | Query SBOM data in Defender for Cloud using Cloud Security Explorer | https://learn.microsoft.com/en-us/azure/defender-for-cloud/query-software-bill-of-materials | @@ -292,6 +292,7 @@ This skill requires **network access** to fetch documentation content: | Connect Bright Security DAST with Defender | https://learn.microsoft.com/en-us/azure/defender-for-cloud/onboarding-guide-bright | | Integrate StackHawk testing with Defender for Cloud | https://learn.microsoft.com/en-us/azure/defender-for-cloud/onboarding-guide-stackhawk | | Use legacy security solution integrations with Defender for Cloud | 
https://learn.microsoft.com/en-us/azure/defender-for-cloud/partner-integration | +| Automate Defender for Cloud onboarding with PowerShell | https://learn.microsoft.com/en-us/azure/defender-for-cloud/powershell-onboarding | | Run Azure Resource Graph queries for Defender for Cloud | https://learn.microsoft.com/en-us/azure/defender-for-cloud/resource-graph-samples | | Use Defender VM subassessments for container vulnerabilities | https://learn.microsoft.com/en-us/azure/defender-for-cloud/subassessment-rest-api | @@ -318,8 +319,8 @@ This skill requires **network access** to fetch documentation content: | Deploy gated deployment agent via Infrastructure as Code | https://learn.microsoft.com/en-us/azure/defender-for-cloud/gated-deployment-infrastructure-as-code | | Identify SQL Servers still protected by Microsoft Monitoring Agent | https://learn.microsoft.com/en-us/azure/defender-for-cloud/identify-sql-servers-protected-by-monitor-agent | | Migrate File Integrity Monitoring to Defender for Endpoint | https://learn.microsoft.com/en-us/azure/defender-for-cloud/migrate-file-integrity-monitoring | -| Check Defender for Cloud interoperability across Azure services and environments | https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-cloud | -| Support matrix for Defender for Containers features | https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-containers | +| Scale Microsoft Defender for Servers across environments | https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-servers-scale | +| Check Defender for Cloud interoperability and support matrix | https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-cloud | +| Container support matrix for Defender for Cloud | https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-containers | | Review support matrix and requirements for Defender for Servers | 
https://learn.microsoft.com/en-us/azure/defender-for-cloud/support-matrix-defender-for-servers | -| Deploy Microsoft Defender for Storage on Azure | https://learn.microsoft.com/en-us/azure/defender-for-cloud/tutorial-enable-storage-plan | | Verify Defender for SQL Servers on Machines protection status | https://learn.microsoft.com/en-us/azure/defender-for-cloud/verify-machine-protection | \ No newline at end of file diff --git a/skills/azure-defender-for-iot/SKILL.md b/skills/azure-defender-for-iot/SKILL.md index ca1f7b76..dd5438c6 100644 --- a/skills/azure-defender-for-iot/SKILL.md +++ b/skills/azure-defender-for-iot/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-defender-for-iot -description: Expert knowledge for Azure Defender For Iot development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when deploying OT sensors, configuring micro agents, setting up traffic mirroring, or integrating with Sentinel/SIEM, and other Azure Defender For Iot related development tasks. Not for Azure Defender For Cloud (use azure-defender-for-cloud), Azure Security (use azure-security), Azure External Attack Surface Management (use azure-external-attack-surface-management), Azure Sentinel (use azure-sentinel). +description: Expert knowledge for Azure Defender For Iot development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when deploying OT sensors, configuring traffic mirroring, integrating with Sentinel/SIEM, or managing IoT micro agents, and other Azure Defender For Iot related development tasks. Not for Azure Defender For Cloud (use azure-defender-for-cloud), Azure Security (use azure-security), Azure IoT (use azure-iot), Azure IoT Hub (use azure-iot-hub). compatibility: Requires network access. 
Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-03-16" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Defender For Iot Skill @@ -31,7 +31,7 @@ This skill requires **network access** to fetch documentation content: | Limits & Quotas | L76-L84 | Info on OT trial setup, supported/retiring features, appliance catalog and requirements, and Defender for IoT data retention and storage limits. | | Security | L85-L103 | Securing Defender for IoT OT environments: auth, RBAC/roles, SSO, certificates, Zero Trust, alert workflows/response, and auditing user and programming activity. | | Configuration | L104-L135 | Configuring Defender for IoT agents/sensors: micro agent twins, dependencies, alerts, OT sensor settings, traffic mirroring, connectivity, monitoring methods, and threat intel updates. | -| Integrations & Coding Patterns | L136-L163 | Integrating Defender for IoT with SIEMs, firewalls, ServiceNow, Sentinel, OT sensors, and micro agents, plus using APIs, playbooks, and workbooks to automate alerts and manage inventory/vulnerabilities. | +| Integrations & Coding Patterns | L136-L163 | Integrating Defender for IoT/OT with SIEMs, firewalls, ServiceNow, Sentinel, REST APIs, and micro agents, plus automating alerts, inventories, vulnerabilities, and visualizations. | | Deployment | L164-L187 | Planning and deploying Defender for IoT OT sensors: hardware/VM options, appliance-specific guides, traffic mirroring, onboarding, activation, and moving IoT security resources across regions. 
| ### Troubleshooting @@ -147,7 +147,7 @@ This skill requires **network access** to fetch documentation content: | Send Defender for IoT alerts to LogRhythm | https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/integrations/logrhythm | | Send Defender for IoT alerts to RSA NetWitness | https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/integrations/netwitness | | Connect on-premises Defender for IoT to Sentinel (legacy) | https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/integrations/on-premises-sentinel | -| Stream Defender for IoT cloud alerts to external SIEMs | https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/integrations/send-cloud-data-to-partners | +| Integrate Defender for IoT alerts with partner SIEMs | https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/integrations/send-cloud-data-to-partners | | Configure legacy ServiceNow integration for Defender for IoT | https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/integrations/service-now-legacy | | Use Sentinel solution to detect IoT threats | https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/iot-advanced-threat-monitoring | | Connect Defender for IoT with Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/defender-for-iot/organizations/iot-solution | diff --git a/skills/azure-deployment-environments/SKILL.md b/skills/azure-deployment-environments/SKILL.md index 27ffff37..5df758ae 100644 --- a/skills/azure-deployment-environments/SKILL.md +++ b/skills/azure-deployment-environments/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-deployment-environments -description: Expert knowledge for Azure Deployment Environments development including troubleshooting, best practices, limits & quotas, security, configuration, integrations & coding patterns, and deployment. 
Use when designing ADE catalogs, environment.yaml schemas, custom images, RBAC/roles, or CI/CD image pipelines, and other Azure Deployment Environments related development tasks. Not for Azure DevTest Labs (use azure-devtest-labs), Azure Dev Box (use azure-dev-box), Azure Integration Environments (use azure-integration-environments), Azure Managed Applications (use azure-managed-applications). +description: Expert knowledge for Azure Deployment Environments development including troubleshooting, best practices, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when defining environment.yaml, ADE catalogs, RBAC/managed identities, custom images, or CI/CD pipelines, and other Azure Deployment Environments related development tasks. Not for Azure DevTest Labs (use azure-devtest-labs), Azure Dev Box (use azure-dev-box), Azure Integration Environments (use azure-integration-environments), Azure Managed Applications (use azure-managed-applications). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-02-28" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Deployment Environments Skill @@ -25,12 +25,12 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| | Troubleshooting | L35-L39 | Diagnosing and resolving Azure Deployment Environments custom image deployment failures, including common error codes, validation issues, and configuration or image compatibility problems. | -| Best Practices | L40-L44 | Guidance on structuring ADE catalogs: organizing templates, folders, and repos for reusable, maintainable, and scalable deployment environment definitions. 
| -| Limits & Quotas | L45-L49 | How to view current Azure Deployment Environments quotas/capacity, understand default limits, and request increases for org, project, and environment resource usage. | -| Security | L50-L57 | RBAC and identity for ADE: planning Azure roles/scopes, using Azure CLI auth for REST, configuring managed identities, and assigning built‑in ADE roles and access. | -| Configuration | L58-L65 | Defining and configuring ADE environment.yaml schemas, environment definitions, and custom container images, plus required CLI environment variables for building and running those images. | -| Integrations & Coding Patterns | L66-L70 | Using the ADE CLI to build, publish, and manage custom environment images, automate image pipelines, and integrate ADE image workflows into CI/CD and DevOps processes | -| Deployment | L71-L75 | How to integrate Azure Deployment Environments with CI/CD tools like Azure Pipelines and GitHub Actions, including configuring pipelines to create, update, and delete ADE environments. | +| Best Practices | L40-L45 | Guidance on structuring ADE template catalogs and implementing resiliency patterns (high availability, fault tolerance, recovery) in Azure Deployment Environments. | +| Limits & Quotas | L46-L50 | How to view current Azure Deployment Environments quotas/capacity, understand default limits, and request increases for org, project, and environment resource usage. | +| Security | L51-L58 | RBAC and role planning, secure API auth, configuring managed identities, and assigning ADE built-in roles/scopes to control access to deployments and resources. 
| +| Configuration | L59-L66 | Defining environment.yaml schemas, configuring ADE environment definitions, and building/using custom container images (including required CLI env vars) for Azure Deployment Environments | +| Integrations & Coding Patterns | L67-L71 | Using the ADE CLI to build, publish, and manage custom environment images, automate image pipelines, and integrate ADE image workflows into CI/CD and DevOps processes | +| Deployment | L72-L76 | How to integrate Azure Deployment Environments with CI/CD tools like Azure Pipelines and GitHub Actions, including configuring pipelines to create, update, and delete ADE environments. | ### Troubleshooting | Topic | URL | @@ -41,6 +41,7 @@ This skill requires **network access** to fetch documentation content: | Topic | URL | |-------|-----| | Apply catalog structure best practices in ADE | https://learn.microsoft.com/en-us/azure/deployment-environments/best-practice-catalog-structure | +| Apply resiliency best practices in Azure Deployment Environments | https://learn.microsoft.com/en-us/azure/deployment-environments/concept-reliability-deployment-environments | ### Limits & Quotas | Topic | URL | @@ -51,14 +52,14 @@ This skill requires **network access** to fetch documentation content: | Topic | URL | |-------|-----| | Plan Azure RBAC roles for Deployment Environments | https://learn.microsoft.com/en-us/azure/deployment-environments/concept-deployment-environments-role-based-access-control | -| Authenticate to ADE REST APIs using Azure CLI | https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-authenticate | +| Authenticate to Azure Deployment Environments REST APIs securely | https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-authenticate | | Configure managed identities for ADE deployments | https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-configure-managed-identity | | Assign ADE built-in roles and access scopes | 
https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-manage-deployment-environments-access | ### Configuration | Topic | URL | |-------|-----| -| Configure environment.yaml schema for ADE definitions | https://learn.microsoft.com/en-us/azure/deployment-environments/concept-environment-yaml | +| Configure environment.yaml schema for Azure Deployment Environments | https://learn.microsoft.com/en-us/azure/deployment-environments/concept-environment-yaml | | Configure ADE environment definitions and container images | https://learn.microsoft.com/en-us/azure/deployment-environments/configure-environment-definition | | Configure custom container images in ADE extensibility | https://learn.microsoft.com/en-us/azure/deployment-environments/how-to-configure-extensibility-model-custom-image | | Reference ADE CLI environment variables for custom images | https://learn.microsoft.com/en-us/azure/deployment-environments/reference-deployment-environment-variables | diff --git a/skills/azure-devops/SKILL.md b/skills/azure-devops/SKILL.md index e234926d..6eab7b95 100644 --- a/skills/azure-devops/SKILL.md +++ b/skills/azure-devops/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-devops -description: Expert knowledge for Azure DevOps development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when managing Boards/work items, pipelines, repos, Analytics/OData/Power BI, or Azure DevOps Server deployments, and other Azure DevOps related development tasks. Not for Azure Boards (use azure-boards), Azure Pipelines (use azure-pipelines), Azure Repos (use azure-repos), Azure Test Plans (use azure-test-plans). +description: Expert knowledge for Azure DevOps development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. 
Use when managing org/projects, repos, pipelines, agents/pools, work items/boards, or Analytics/Power BI, and other Azure DevOps related development tasks. Not for Azure Boards (use azure-boards), Azure Pipelines (use azure-pipelines), Azure Repos (use azure-repos), Azure Test Plans (use azure-test-plans). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure DevOps Skill @@ -24,30 +24,28 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| -| Troubleshooting | L37-L54 | Diagnosing and fixing Azure DevOps issues: Managed DevOps Pools, performance, email notifications, connectivity/allowlists, permissions, dashboards/Analytics, wikis restore, and upgrade failures. | -| Best Practices | L55-L69 | Guidance on optimizing Azure DevOps performance, analytics, and reporting: cost-efficient pools, fast OData queries, Power BI reports, dashboards, and data cleanup/maintenance. | -| Decision Making | L70-L85 | Guidance on Azure DevOps architectural choices: org/project/team structure, work tracking and wikis, analytics/reporting, cost and topology of agents and Azure DevOps Server deployments. | -| Architecture & Design Patterns | L86-L97 | Architectural guidance for Azure DevOps/Server: pool architecture, reliability/DR, SQL/database dependencies, and design patterns for simple to complex multi-server topologies and analytics modeling. | -| Limits & Quotas | L98-L112 | Limits, quotas, and behaviors for Azure DevOps orgs, projects, naming, access levels, work tracking, dashboards, wikis, pipelines, and Analytics data availability/latency. 
| -| Security | L113-L173 | Managing Azure DevOps security: identities, auth, permissions, access levels, groups, auditing, project/repo/pipeline rights, server service accounts, SSL, and download integrity. | -| Configuration | L174-L252 | Configuring Azure DevOps/Server: Managed DevOps Pools, notifications, Boards/work items, Analytics/OData/Power BI, dashboards, search, backups, networking, email/SMTP, and server admin settings. | -| Integrations & Coding Patterns | L253-L298 | Integrating Azure DevOps with tools (VS, SIEM, notifications, clients) and building Analytics/OData- and Power BI–based reports for work items, pipelines, and test/requirements metrics. | -| Deployment | L299-L330 | Installing, configuring, scaling, moving, backing up, restoring, and upgrading Azure DevOps Server/TFS deployments, including SQL, SharePoint, domains, and project collections | +| Troubleshooting | L37-L52 | Troubleshooting Azure DevOps connectivity, performance, permissions, notifications, wikis, analytics, Managed DevOps Pools, VS sign-in/setup, and server/project collection upgrades. | +| Best Practices | L53-L67 | Guidance on optimizing Azure DevOps performance, analytics, and reporting: cost-efficient pools, fast OData queries, Power BI reports, dashboards, and data cleanup/maintenance. | +| Decision Making | L68-L83 | Guidance on Azure DevOps architectural choices: org/project/team structure, work tracking and wikis, analytics/reporting, cost and topology of agents and Azure DevOps Server deployments. | +| Architecture & Design Patterns | L84-L95 | Architectural guidance for Azure DevOps/Server: pool architecture, reliability/DR, SQL/database dependencies, and design patterns for simple to complex multi-server topologies and analytics modeling. 
| +| Limits & Quotas | L96-L111 | Limits, quotas, and constraints for Azure DevOps orgs, projects, naming, work tracking, wikis, dashboards, pipelines, analytics, and managed pools, plus related retention and notification behaviors | +| Security | L112-L171 | Managing Azure DevOps security: identities, auth, permissions, access levels, groups, auditing, project/repo/pipeline rights, server service accounts, SSL, and download integrity. | +| Configuration | L172-L250 | Configuring Azure DevOps/Server: pools, agents, networking, notifications, work items, dashboards, Analytics, backups, SQL, services, and server/admin settings. | +| Integrations & Coding Patterns | L251-L295 | Integrating Azure DevOps with tools (VS, SIEM, notifications, clients) and building Analytics/OData- and Power BI–based reports for work items, pipelines, and test/requirements metrics. | +| Deployment | L296-L327 | Installing, configuring, scaling, moving, backing up, restoring, and upgrading Azure DevOps Server/TFS deployments, including SQL, SharePoint, domains, and project collections | ### Troubleshooting | Topic | URL | |-------|-----| | Use diagnostic logs for Managed DevOps Pools troubleshooting | https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/diagnostics?view=azure-devops | -| Troubleshoot common Managed DevOps Pools issues | https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/troubleshooting?view=azure-devops | +| Diagnose and fix Azure Managed DevOps Pools issues | https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/troubleshooting?view=azure-devops | +| Resolve Visual Studio setup and sign-in issues with Azure DevOps | https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-set-up-vs?view=azure-devops | | Investigate Azure DevOps usage and performance issues | https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/usage-monitoring?view=azure-devops | -| Investigate delayed Azure DevOps notification 
emails | https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/troubleshoot-delayed-email?view=azure-devops | -| Troubleshoot missing Azure DevOps notification emails | https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/troubleshoot-not-getting-email?view=azure-devops | -| Troubleshoot unexpected Azure DevOps notification emails | https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/troubleshoot-unexpected-email?view=azure-devops | +| Diagnose and fix missing Azure DevOps notification emails | https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/troubleshoot-not-getting-email?view=azure-devops | | Use subscription logging to debug Azure DevOps notifications | https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/use-subscription-logging?view=azure-devops | | Allowlist IPs and URLs for Azure DevOps connectivity | https://learn.microsoft.com/en-us/azure/devops/organizations/security/allow-list-ip-url?view=azure-devops | -| Troubleshoot Azure DevOps access and permission issues | https://learn.microsoft.com/en-us/azure/devops/organizations/security/troubleshoot-permissions?view=azure-devops | +| Diagnose and fix Azure DevOps permission issues | https://learn.microsoft.com/en-us/azure/devops/organizations/security/troubleshoot-permissions?view=azure-devops | | Restore deleted Azure DevOps wikis using REST API and recycle bin | https://learn.microsoft.com/en-us/azure/devops/project/wiki/restore-deleted-wiki?view=azure-devops | -| Troubleshoot Azure DevOps dashboards and charts issues | https://learn.microsoft.com/en-us/azure/devops/report/dashboards/faqs?view=azure-devops | | Troubleshoot Azure DevOps Analytics views for Power BI | https://learn.microsoft.com/en-us/azure/devops/report/powerbi/troubleshooting-views?view=azure-devops | | Azure DevOps Server administration FAQ and support guidance | 
https://learn.microsoft.com/en-us/azure/devops/server/faq?view=azure-devops-server | | Troubleshoot Azure DevOps project collection upgrade failures | https://learn.microsoft.com/en-us/azure/devops/server/troubleshooting/collection-upgrade-failure?view=azure-devops-server | @@ -98,6 +96,7 @@ This skill requires **network access** to fetch documentation content: ### Limits & Quotas | Topic | URL | |-------|-----| +| Understand quotas and options for Managed DevOps Pools | https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/faq?view=azure-devops | | Change Azure DevOps organization image within size limits | https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/change-organization-image?view=azure-devops | | Delete Azure DevOps organization and data retention | https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/delete-your-organization?view=azure-devops | | Recover deleted Azure DevOps organizations within retention limits | https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/recover-your-organization?view=azure-devops | @@ -118,8 +117,8 @@ This skill requires **network access** to fetch documentation content: | Configure privacy policy URL for Azure DevOps public projects | https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/add-privacy-policy-url?view=azure-devops | | Configure Azure DevOps application access and security policies | https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/change-application-access-policies?view=azure-devops | | Change Azure DevOps organization ownership and permissions | https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/change-organization-ownership?view=azure-devops | -| Manage Azure DevOps access via Microsoft Entra ID | https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-azure-access?view=azure-devops | -| Manage Azure DevOps users, permissions, and tokens securely | 
https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-user-and-permissions-management?view=azure-devops | +| Configure Azure DevOps access via Microsoft Entra ID | https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-azure-access?view=azure-devops | +| Manage Azure DevOps users, permissions, and access | https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-user-and-permissions-management?view=azure-devops | | Enable and use Azure DevOps audit logging securely | https://learn.microsoft.com/en-us/azure/devops/organizations/audit/azure-devops-auditing?view=azure-devops | | Configure Azure DevOps project public visibility and access | https://learn.microsoft.com/en-us/azure/devops/organizations/projects/make-project-public?view=azure-devops | | Understand Azure DevOps permissions and security groups | https://learn.microsoft.com/en-us/azure/devops/organizations/security/about-permissions?view=azure-devops | @@ -138,7 +137,6 @@ This skill requires **network access** to fetch documentation content: | Download and interpret pipeline release permissions report | https://learn.microsoft.com/en-us/azure/devops/organizations/security/download-permissions-report-release?view=azure-devops | | Download Azure DevOps repository permissions report | https://learn.microsoft.com/en-us/azure/devops/organizations/security/download-permissions-report?view=azure-devops | | Export Azure DevOps users and access levels | https://learn.microsoft.com/en-us/azure/devops/organizations/security/export-users-audit-log?view=azure-devops | -| Use Stakeholder access permissions in Azure DevOps | https://learn.microsoft.com/en-us/azure/devops/organizations/security/get-started-stakeholder?view=azure-devops | | Identify Azure DevOps Administrators via Microsoft Entra | https://learn.microsoft.com/en-us/azure/devops/organizations/security/look-up-azure-devops-administrator?view=azure-devops | | Find and manage Azure DevOps project 
administrators | https://learn.microsoft.com/en-us/azure/devops/organizations/security/look-up-project-administrators?view=azure-devops | | Identify Project Collection Administrators in Azure DevOps | https://learn.microsoft.com/en-us/azure/devops/organizations/security/look-up-project-collection-administrators?view=azure-devops | @@ -182,7 +180,7 @@ This skill requires **network access** to fetch documentation content: | Configure additional storage for Managed DevOps Pools agents | https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/configure-storage?view=azure-devops | | Configure demands and capabilities in Managed DevOps Pools | https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/demands?view=azure-devops | | Configure prerequisites for Managed DevOps Pools | https://learn.microsoft.com/en-us/azure/devops/managed-devops-pools/prerequisites?view=azure-devops | -| Configure remote Azure DevOps MCP Server endpoint access | https://learn.microsoft.com/en-us/azure/devops/mcp-server/remote-mcp-server?view=azure-devops | +| Configure remote Azure DevOps MCP Server endpoint | https://learn.microsoft.com/en-us/azure/devops/mcp-server/remote-mcp-server?view=azure-devops | | Configure organization and user time zones in Azure DevOps | https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/change-time-zone?view=azure-devops | | Reference list of Azure DevOps auditing events | https://learn.microsoft.com/en-us/azure/devops/organizations/audit/auditing-events?view=azure-devops | | Determine Azure DevOps notification email recipients | https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/concepts-email-recipients?view=azure-devops | @@ -253,7 +251,6 @@ This skill requires **network access** to fetch documentation content: ### Integrations & Coding Patterns | Topic | URL | |-------|-----| -| Set up Visual Studio integration with Azure DevOps | 
https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-set-up-vs?view=azure-devops | | Stream Azure DevOps audit logs to external SIEM | https://learn.microsoft.com/en-us/azure/devops/organizations/audit/auditing-streaming?view=azure-devops | | Integrate Azure DevOps notifications with third-party services | https://learn.microsoft.com/en-us/azure/devops/organizations/notifications/integrate-third-party-services?view=azure-devops | | Connect various clients to Azure DevOps projects | https://learn.microsoft.com/en-us/azure/devops/organizations/projects/connect-to-projects?view=azure-devops | diff --git a/skills/azure-education-hub/SKILL.md b/skills/azure-education-hub/SKILL.md index 90a60565..6cc43bf4 100644 --- a/skills/azure-education-hub/SKILL.md +++ b/skills/azure-education-hub/SKILL.md @@ -1,6 +1,6 @@ --- name: azure-education-hub -description: Expert knowledge for Azure Education Hub development including troubleshooting, and limits & quotas. Use when managing Azure for Students credits, yearly quotas, renewals, or Dev Tools for Teaching sign-in issues, and other Azure Education Hub related development tasks. +description: Expert knowledge for Azure Education Hub development including troubleshooting and limits & quotas. Use when managing Azure for Students credits, yearly quotas, renewals, or Dev Tools for Teaching sign-in issues, and other Azure Education Hub related development tasks. Not for Azure Lab Services (use azure-lab-services), Azure DevTest Labs (use azure-devtest-labs), Azure Virtual Desktop (use azure-virtual-desktop). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
metadata: generated_at: "2026-02-28" diff --git a/skills/azure-elastic-san/SKILL.md b/skills/azure-elastic-san/SKILL.md index c4cd23bc..c5afbdb1 100644 --- a/skills/azure-elastic-san/SKILL.md +++ b/skills/azure-elastic-san/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-elastic-san -description: Expert knowledge for Azure Elastic SAN development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, and integrations & coding patterns. Use when scripting Elastic SAN volumes, tuning AVS datastores, using CMK encryption, sizing for IOPS, or running clustered SQL, and other Azure Elastic SAN related development tasks. Not for Azure Blob Storage (use azure-blob-storage), Azure Files (use azure-files), Azure NetApp Files (use azure-netapp-files), Azure Managed Lustre (use azure-managed-lustre). +description: Expert knowledge for Azure Elastic SAN development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, and integrations & coding patterns. Use when creating iSCSI volumes, AVS datastores, snapshots, CMK encryption, or AKS-integrated Elastic SAN storage, and other Azure Elastic SAN related development tasks. Not for Azure Blob Storage (use azure-blob-storage), Azure NetApp Files (use azure-netapp-files), Azure Managed Lustre (use azure-managed-lustre), Azure Files (use azure-files). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. 
metadata: - generated_at: "2026-03-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Elastic SAN Skill @@ -25,13 +25,13 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| | Troubleshooting | L36-L40 | Diagnosing and resolving common Azure Elastic SAN issues, including provisioning failures, connectivity/IO errors, performance problems, and typical error codes/logs. | -| Best Practices | L41-L47 | Performance tuning for Elastic SAN volumes and AVS datastores, plus how to design, configure, and use snapshots for backup and recovery best practices. | +| Best Practices | L41-L47 | Tuning Elastic SAN performance and configuration (including AVS datastores) and using volume snapshots for backup and recovery best practices. | | Decision Making | L48-L52 | Guidance on sizing and configuring Elastic SAN (performance, capacity, architecture) and deciding how to integrate it with AKS workloads and storage patterns. | | Architecture & Design Patterns | L53-L57 | Patterns for running clustered apps (SQL, Failover Cluster, etc.) on Azure Elastic SAN, including shared volume setup, fencing, failover behavior, and high-availability design. | -| Limits & Quotas | L58-L63 | Performance and scale limits for Elastic SAN: max volumes, capacity, IOPS/throughput per volume/volume group/SAN, and how VM size and workload affect achievable performance. | +| Limits & Quotas | L58-L63 | Details on Elastic SAN capacity, IOPS, throughput, and VM performance limits, including how these quotas scale, constrain workloads, and impact design choices. | | Security | L64-L73 | Encrypting Elastic SAN with customer-managed keys and configuring secure access via private endpoints, service endpoints, and other network security options for volumes and volume groups. 
| -| Configuration | L74-L81 | Configuring, deploying, resizing, deleting, and monitoring Azure Elastic SAN resources and volumes, including safe capacity changes and using built-in metrics effectively. | -| Integrations & Coding Patterns | L82-L87 | Using PowerShell to batch-create Elastic SAN volumes and configuring Linux and Windows clients to connect, mount, and use those iSCSI-based volumes. | +| Configuration | L74-L82 | Configuring, resizing, deleting, and monitoring Azure Elastic SAN resources/volumes, plus managing iSCSI IQN naming authority and safe operational changes. | +| Integrations & Coding Patterns | L83-L88 | Using PowerShell to batch-create Elastic SAN volumes and configuring Linux and Windows clients to connect, mount, and use those iSCSI-based volumes. | ### Troubleshooting | Topic | URL | @@ -41,7 +41,7 @@ This skill requires **network access** to fetch documentation content: ### Best Practices | Topic | URL | |-------|-----| -| Apply Azure Elastic SAN performance best practices | https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-best-practices | +| Optimize Azure Elastic SAN configuration and performance | https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-best-practices | | Optimize Elastic SAN datastore performance on AVS | https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-performance-on-azure-vmware-solutions | | Use snapshots to back up Azure Elastic SAN volumes | https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-snapshots | @@ -59,7 +59,7 @@ This skill requires **network access** to fetch documentation content: | Topic | URL | |-------|-----| | Understand Azure Elastic SAN and VM performance limits | https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-performance | -| Azure Elastic SAN scalability, IOPS, and throughput limits | https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-scale-targets | +| Elastic SAN 
capacity, IOPS, and throughput limits | https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-scale-targets | ### Security | Topic | URL | @@ -78,6 +78,7 @@ This skill requires **network access** to fetch documentation content: | Delete Azure Elastic SAN resources correctly | https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-delete | | Resize Azure Elastic SAN resources and volumes safely | https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-expand | | Use Azure Elastic SAN monitoring metrics effectively | https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-metrics | +| Transition Elastic SAN IQN naming authority | https://learn.microsoft.com/en-us/azure/storage/elastic-san/elastic-san-transition-iqn-naming-authority | ### Integrations & Coding Patterns | Topic | URL | diff --git a/skills/azure-expressroute/SKILL.md b/skills/azure-expressroute/SKILL.md index 507d7e9a..355bc1fe 100644 --- a/skills/azure-expressroute/SKILL.md +++ b/skills/azure-expressroute/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-expressroute -description: Expert knowledge for Azure ExpressRoute development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when designing ExpressRoute circuits/gateways, BGP routing, Global Reach, FastPath, or S2S VPN coexistence, and other Azure ExpressRoute related development tasks. Not for Azure Internet Peering (use azure-internet-peering), Azure Peering Service (use azure-peering-service), Azure Virtual WAN (use azure-virtual-wan), Azure VPN Gateway (use azure-vpn-gateway). +description: Expert knowledge for Azure ExpressRoute development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. 
Use when designing ExpressRoute circuits/gateways, BGP routing, Global Reach, encryption (IPsec/MACsec), or automation, and other Azure ExpressRoute related development tasks. Not for Azure Internet Peering (use azure-internet-peering), Azure Virtual WAN (use azure-virtual-wan), Azure VPN Gateway (use azure-vpn-gateway), Azure Virtual Network (use azure-virtual-network). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-12" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure ExpressRoute Skill @@ -30,7 +30,7 @@ This skill requires **network access** to fetch documentation content: | Architecture & Design Patterns | L62-L72 | Designing resilient, highly available ExpressRoute topologies, multi-circuit routing, coexistence with S2S VPN, DR/backup patterns, and using Microsoft peering for PSTN services. | | Limits & Quotas | L73-L80 | ExpressRoute bandwidth, route, and gateway limits, FastPath constraints, rate limiting on provider circuits, and how to monitor advertised routes to stay within quotas | | Security | L81-L90 | Encryption (IPsec, MACsec), NAT rules, RBAC roles, and security best practices for protecting ExpressRoute circuits and traffic | -| Configuration | L91-L127 | How to configure and manage ExpressRoute circuits, peerings, VNets, gateways, routing/BGP, NAT, IPv6, monitoring, resiliency, and Global Reach using portal, PowerShell, and CLI | +| Configuration | L91-L127 | Configuring and managing ExpressRoute circuits, peerings, gateways, routing, NAT, BFD, IPv6, Global Reach, monitoring, resiliency, and linking VNets using portal, PowerShell, and CLI. | | Integrations & Coding Patterns | L128-L134 | Automating ExpressRoute circuit creation/management with PowerShell or Azure CLI, and configuring a site-to-site VPN that runs over ExpressRoute Microsoft peering. 
| | Deployment | L135-L141 | Guides for deploying and migrating ExpressRoute circuits/gateways, understanding Direct SKUs, testing multi-site resiliency, and automating setup with ARM templates, PowerShell, and Terraform. | @@ -96,7 +96,7 @@ This skill requires **network access** to fetch documentation content: | Configure BFD over Azure ExpressRoute peering | https://learn.microsoft.com/en-us/azure/expressroute/expressroute-bfd | | Configure NAT on Cisco and Juniper for ExpressRoute | https://learn.microsoft.com/en-us/azure/expressroute/expressroute-config-samples-nat | | Router interface and BGP configuration samples for ExpressRoute | https://learn.microsoft.com/en-us/azure/expressroute/expressroute-config-samples-routing | -| Create and manage ExpressRoute virtual network gateways | https://learn.microsoft.com/en-us/azure/expressroute/expressroute-howto-add-gateway-portal-resource-manager | +| Configure Azure ExpressRoute virtual network gateway in portal | https://learn.microsoft.com/en-us/azure/expressroute/expressroute-howto-add-gateway-portal-resource-manager | | Manage ExpressRoute virtual network gateways with PowerShell | https://learn.microsoft.com/en-us/azure/expressroute/expressroute-howto-add-gateway-resource-manager | | Add IPv6 support to ExpressRoute private peering | https://learn.microsoft.com/en-us/azure/expressroute/expressroute-howto-add-ipv6 | | Configure coexisting ExpressRoute and S2S VPN connections (classic) | https://learn.microsoft.com/en-us/azure/expressroute/expressroute-howto-coexist-classic | diff --git a/skills/azure-external-attack-surface-management/SKILL.md b/skills/azure-external-attack-surface-management/SKILL.md deleted file mode 100644 index 00bb7fbf..00000000 --- a/skills/azure-external-attack-surface-management/SKILL.md +++ /dev/null @@ -1,53 +0,0 @@ ---- -name: azure-external-attack-surface-management -description: Expert knowledge for Azure External Attack Surface Management development including limits & quotas, 
configuration, and integrations & coding patterns. Use when querying EASM assets, setting policy rules, exporting to Log Analytics or Data Explorer, or estimating billing, and other Azure External Attack Surface Management related development tasks. Not for Azure Defender For Cloud (use azure-defender-for-cloud), Azure Security (use azure-security), Azure Sentinel (use azure-sentinel), Azure Firewall (use azure-firewall). -compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. -metadata: - generated_at: "2026-03-16" - generator: "docs2skills/1.0.0" ---- -# Azure External Attack Surface Management Skill - -This skill provides expert guidance for Azure External Attack Surface Management. Covers limits & quotas, configuration, and integrations & coding patterns. It combines local quick-reference content with remote documentation fetching capabilities. - -## How to Use This Skill - -> **IMPORTANT for Agent**: Use the **Category Index** below to locate relevant sections. For categories with line ranges (e.g., `L35-L120`), use `read_file` with the specified lines. For categories with file links (e.g., `[security.md](security.md)`), use `read_file` on the linked reference file - -> **IMPORTANT for Agent**: If `metadata.generated_at` is more than 3 months old, suggest the user pull the latest version from the repository. If `mcp_microsoftdocs` tools are not available, suggest the user install it: [Installation Guide](https://github.com/MicrosoftDocs/mcp/blob/main/README.md) - -This skill requires **network access** to fetch documentation content: -- **Preferred**: Use `mcp_microsoftdocs:microsoft_docs_fetch` with query string `from=learn-agent-skill`. Returns Markdown. -- **Fallback**: Use `fetch_webpage` with query string `from=learn-agent-skill&accept=text/markdown`. Returns Markdown. 
- -## Category Index - -| Category | Lines | Description | -|----------|-------|-------------| -| Limits & Quotas | L31-L35 | Explains how Defender EASM billing works, what counts as a billable asset, and how asset counts affect costs and quotas. | -| Configuration | L36-L49 | Filtering and querying EASM inventory by asset type (domains, hosts, IPs/blocks, ASNs, pages, contacts, SSL certs) and configuring policy engine automation rules. | -| Integrations & Coding Patterns | L50-L53 | Configuring Defender EASM to export discovery and asset data into Log Analytics and Azure Data Explorer, including connection setup and data usage for analysis. | - -### Limits & Quotas -| Topic | URL | -|-------|-----| -| Understand Defender EASM billing and billable asset counts | https://learn.microsoft.com/en-us/azure/external-attack-surface-management/understanding-billable-assets | - -### Configuration -| Topic | URL | -|-------|-----| -| Use ASN asset filters in Defender EASM inventory | https://learn.microsoft.com/en-us/azure/external-attack-surface-management/asn-asset-filters | -| Use contact asset filters in Defender EASM inventory | https://learn.microsoft.com/en-us/azure/external-attack-surface-management/contact-asset-filters | -| Apply domain asset filters in Defender EASM inventory | https://learn.microsoft.com/en-us/azure/external-attack-surface-management/domain-asset-filters | -| Apply host asset filters in Defender EASM inventory | https://learn.microsoft.com/en-us/azure/external-attack-surface-management/host-asset-filters | -| Use Defender EASM inventory filters and saved queries | https://learn.microsoft.com/en-us/azure/external-attack-surface-management/inventory-filters | -| Use IP address asset filters in Defender EASM | https://learn.microsoft.com/en-us/azure/external-attack-surface-management/ip-address-asset-filters | -| Use IP block asset filters in Defender EASM | 
https://learn.microsoft.com/en-us/azure/external-attack-surface-management/ip-block-asset-filters | -| Apply page asset filters in Defender EASM inventory | https://learn.microsoft.com/en-us/azure/external-attack-surface-management/page-asset-filters | -| Configure Defender EASM policy engine automation rules | https://learn.microsoft.com/en-us/azure/external-attack-surface-management/policy-engine | -| Use SSL certificate asset filters in Defender EASM | https://learn.microsoft.com/en-us/azure/external-attack-surface-management/ssl-certificate-asset-filters | - -### Integrations & Coding Patterns -| Topic | URL | -|-------|-----| -| Configure Defender EASM data connections to Log Analytics and ADX | https://learn.microsoft.com/en-us/azure/external-attack-surface-management/data-connections | \ No newline at end of file diff --git a/skills/azure-files/SKILL.md b/skills/azure-files/SKILL.md index bdd6718b..b0b2a4a6 100644 --- a/skills/azure-files/SKILL.md +++ b/skills/azure-files/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-files -description: Expert knowledge for Azure Files development including best practices, decision making, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring Azure Files/File Sync, SMB/NFS access, DFS/VDI setups, RAG over files, or data migrations, and other Azure Files related development tasks. Not for Azure Blob Storage (use azure-blob-storage), Azure NetApp Files (use azure-netapp-files), Azure Managed Lustre (use azure-managed-lustre), Azure Virtual Machines (use azure-virtual-machines). +description: Expert knowledge for Azure Files development including best practices, decision making, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring Azure Files/File Sync networking, SMB/NFS access, DR/failover, performance limits, or RAG app integration, and other Azure Files related development tasks. 
Not for Azure Blob Storage (use azure-blob-storage), Azure NetApp Files (use azure-netapp-files), Azure Table Storage (use azure-table-storage), Azure Queue Storage (use azure-queue-storage). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Files Skill @@ -24,13 +24,13 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| -| Best Practices | L35-L50 | Disaster recovery, lifecycle, and performance best practices for Azure Files and Azure File Sync, including failover planning, server/drive replacement, large directory handling, and VDI/FSLogix usage. | +| Best Practices | L35-L50 | Best practices for Azure Files and Azure File Sync: DR/failover planning, server/drive replacement, safe deprovision/recovery, and performance tuning for SMB/NFS and VDI/FSLogix workloads. | | Decision Making | L51-L71 | Guidance for planning Azure Files deployments: choosing tiers, redundancy, billing/cost models, reservations, access patterns, and migration/architecture options for SMB/NFS and File Sync. | | Limits & Quotas | L72-L79 | Azure Files/File Sync limits: capacity, IOPS/throughput, scalability targets, API throttling behavior, redundancy/region support, and FAQ on performance-related constraints. | -| Security | L80-L107 | Securing Azure Files with identity-based auth (AD DS, Entra ID, Kerberos), NTFS/share permissions, TLS/SMB/NFS hardening, and network/firewall/proxy configuration for secure access. | -| Configuration | L108-L134 | Configuring Azure Files and Azure File Sync: networking/VPN and private endpoints, monitoring/alerts, cloud tiering, DFS integration, redundancy, soft delete, and secure access for apps and RAG.
| -| Integrations & Coding Patterns | L135-L156 | Patterns and code samples for building RAG apps over Azure Files using Haystack, LangChain, LlamaIndex with Pinecone/Qdrant/Weaviate, plus .NET, Java, and Python integration guides. | -| Deployment | L157-L168 | Guides for migrating and syncing data to Azure Files/Azure File Sync from NAS, Linux, GlusterFS, SMB/NFS shares, and moving File Sync resources safely across scopes. | +| Security | L80-L106 | Securing Azure Files with identity-based auth (AD DS/Entra/Kerberos), NTFS/share permissions, SMB/NFS security, TLS, firewalls/proxies, and network perimeter configuration. | +| Configuration | L107-L132 | Configuring Azure Files and Azure File Sync: networking/VPN and private endpoints, monitoring and alerts, cloud tiering, DFS integration, redundancy, soft delete, and file copy tools. | +| Integrations & Coding Patterns | L133-L155 | Patterns and code to build RAG apps with Azure Files: authenticating and loading files, and integrating Haystack/LangChain/LlamaIndex with Pinecone, Qdrant, Weaviate, plus .NET/Java/Python SDK usage. | +| Deployment | L156-L167 | Guides for migrating and syncing data to Azure Files/Azure File Sync from NAS, Linux, GlusterFS, SMB/NFS shares, and moving File Sync resources safely across scopes. 
| ### Best Practices | Topic | URL | @@ -86,7 +86,7 @@ This skill requires **network access** to fetch documentation content: | Enable OAuth-based REST access to Azure file shares | https://learn.microsoft.com/en-us/azure/storage/files/authorize-oauth-rest | | Change identity source for Azure Files SMB authentication | https://learn.microsoft.com/en-us/azure/storage/files/change-identity-source | | Enable TLS encryption in transit for NFS Azure Files | https://learn.microsoft.com/en-us/azure/storage/files/encryption-in-transit-for-nfs-shares | -| Use managed identities to access Azure SMB file shares | https://learn.microsoft.com/en-us/azure/storage/files/files-managed-identities | +| Configure managed identity access to Azure file shares | https://learn.microsoft.com/en-us/azure/storage/files/files-managed-identities | | Configure network security perimeter for Azure Files | https://learn.microsoft.com/en-us/azure/storage/files/files-network-security-perimeter | | Secure and configure NFS file shares in Azure Files | https://learn.microsoft.com/en-us/azure/storage/files/files-nfs-protocol | | Disable insecure SMB1 on Linux for Azure Files | https://learn.microsoft.com/en-us/azure/storage/files/files-remove-smb1-linux | @@ -103,7 +103,6 @@ This skill requires **network access** to fetch documentation content: | Configure Kerberos auth for Linux Azure Files clients | https://learn.microsoft.com/en-us/azure/storage/files/storage-files-identity-auth-linux-kerberos-enable | | Configure NTFS ACL permissions for Azure file shares | https://learn.microsoft.com/en-us/azure/storage/files/storage-files-identity-configure-file-level-permissions | | Configure Azure Files with multiple AD DS forests | https://learn.microsoft.com/en-us/azure/storage/files/storage-files-identity-multiple-forests | -| Configure secure networking for Azure file shares | https://learn.microsoft.com/en-us/azure/storage/files/storage-files-networking-overview | ### Configuration | Topic | URL | @@ 
-118,14 +117,13 @@ This skill requires **network access** to fetch documentation content: | Configure networking for Azure File Sync caching servers | https://learn.microsoft.com/en-us/azure/storage/file-sync/file-sync-networking-overview | | Reference metrics and logs for monitoring Azure File Sync | https://learn.microsoft.com/en-us/azure/storage/file-sync/monitor-file-sync-reference | | Analyze Azure Files performance metrics with Azure Monitor | https://learn.microsoft.com/en-us/azure/storage/files/analyze-files-metrics | -| Authenticate and access Azure Files for RAG ingestion | https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/setup | | Change redundancy configuration for Azure Files accounts | https://learn.microsoft.com/en-us/azure/storage/files/files-change-redundancy-configuration | | Integrate DFS Namespaces with Azure file shares | https://learn.microsoft.com/en-us/azure/storage/files/files-manage-namespaces | | Create Azure Monitor alerts for Azure Files health | https://learn.microsoft.com/en-us/azure/storage/files/files-monitoring-alerts | | Copy files between Azure file shares with tools | https://learn.microsoft.com/en-us/azure/storage/files/migrate-files-between-shares | | Configure Linux point-to-site VPN for Azure Files access | https://learn.microsoft.com/en-us/azure/storage/files/storage-files-configure-p2s-vpn-linux | | Configure Windows P2S VPN access to Azure Files | https://learn.microsoft.com/en-us/azure/storage/files/storage-files-configure-p2s-vpn-windows | -| Configure site-to-site VPN for Azure Files access | https://learn.microsoft.com/en-us/azure/storage/files/storage-files-configure-s2s-vpn | +| Configure site-to-site VPN connectivity for Azure Files | https://learn.microsoft.com/en-us/azure/storage/files/storage-files-configure-s2s-vpn | | Configure Azure Monitor metrics and logs for Azure Files | 
https://learn.microsoft.com/en-us/azure/storage/files/storage-files-monitoring | | Use Azure Files monitoring metrics and logs | https://learn.microsoft.com/en-us/azure/storage/files/storage-files-monitoring-reference | | Configure DNS forwarding to Azure Files private endpoints | https://learn.microsoft.com/en-us/azure/storage/files/storage-files-networking-dns | @@ -138,15 +136,16 @@ This skill requires **network access** to fetch documentation content: | Integrate Haystack RAG pipelines with Azure Files | https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/orchestrations/haystack | | Integrate LangChain RAG pipelines with Azure Files | https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/orchestrations/langchain | | Integrate LlamaIndex RAG pipelines with Azure Files | https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/orchestrations/llamaindex | -| Build Haystack + Pinecone RAG over Azure Files | https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/haystack-pinecone/tutorial-haystack-pinecone | +| Authenticate and download Azure Files data for RAG | https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/setup | +| Integrate Azure Files with Haystack and Pinecone RAG | https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/haystack-pinecone/tutorial-haystack-pinecone | | Implement Haystack–Qdrant RAG with Azure Files | 
https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/haystack-qdrant/tutorial-haystack-qdrant | | Implement Haystack–Weaviate RAG with Azure Files | https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/haystack-weaviate/tutorial-haystack-weaviate | -| Build LangChain + Pinecone RAG over Azure Files | https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/langchain-pinecone/tutorial-langchain-pinecone | -| Build LangChain + Qdrant RAG over Azure Files | https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/langchain-qdrant/tutorial-langchain-qdrant | -| Build LangChain + Weaviate RAG over Azure Files | https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/langchain-weaviate/tutorial-langchain-weaviate | -| Build LlamaIndex + Pinecone RAG over Azure Files | https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/llamaindex-pinecone/tutorial-llamaindex-pinecone | -| Build LlamaIndex + Qdrant RAG over Azure Files | https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/llamaindex-qdrant/tutorial-llamaindex-qdrant | -| Build LlamaIndex + Weaviate RAG over Azure Files | https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/llamaindex-weaviate/tutorial-llamaindex-weaviate | +| Integrate Azure Files with LangChain and Pinecone RAG | 
https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/langchain-pinecone/tutorial-langchain-pinecone | +| Integrate Azure Files with LangChain and Qdrant RAG | https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/langchain-qdrant/tutorial-langchain-qdrant | +| Integrate Azure Files with LangChain and Weaviate RAG | https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/langchain-weaviate/tutorial-langchain-weaviate | +| Integrate Azure Files with LlamaIndex and Pinecone RAG | https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/llamaindex-pinecone/tutorial-llamaindex-pinecone | +| Integrate Azure Files with LlamaIndex and Qdrant RAG | https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/llamaindex-qdrant/tutorial-llamaindex-qdrant | +| Integrate Azure Files with LlamaIndex and Weaviate RAG | https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/tutorials/llamaindex-weaviate/tutorial-llamaindex-weaviate | | Use Pinecone vector database with Azure Files RAG | https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/vector-databases/pinecone | | Use Qdrant vector database with Azure Files RAG | https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/vector-databases/qdrant | | Use Weaviate vector database with Azure Files RAG | 
https://learn.microsoft.com/en-us/azure/storage/files/artificial-intelligence/retrieval-augmented-generation/open-source-frameworks/vector-databases/weaviate | diff --git a/skills/azure-firewall/SKILL.md b/skills/azure-firewall/SKILL.md index f731784f..6e2646d7 100644 --- a/skills/azure-firewall/SKILL.md +++ b/skills/azure-firewall/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-firewall -description: Expert knowledge for Azure Firewall development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring Azure Firewall SKUs, policies/rules, TLS inspection, hub-spoke DNAT, or SFTP to Storage, and other Azure Firewall related development tasks. Not for Azure Application Gateway (use azure-application-gateway), Azure Front Door (use azure-front-door), Azure Web Application Firewall (use azure-web-application-firewall), Azure DDos Protection (use azure-ddos-protection). +description: Expert knowledge for Azure Firewall development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when selecting Firewall SKUs, designing hub-spoke/forced tunneling, configuring DNAT/SNAT/app rules, TLS inspection, or DNS proxy, and other Azure Firewall related development tasks. Not for Azure Application Gateway (use azure-application-gateway), Azure Front Door (use azure-front-door), Azure Web Application Firewall (use azure-web-application-firewall), Azure DDoS Protection (use azure-ddos-protection). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
 metadata:
-  generated_at: "2026-04-19"
+  generated_at: "2026-04-26"
   generator: "docs2skills/1.0.0"
 ---
 # Azure Firewall Skill
@@ -28,7 +28,7 @@ This skill requires **network access** to fetch documentation content:
 | Best Practices | L43-L50 | Best practices for Azure Firewall DNS proxy/caching, performance tuning, rule optimization with Policy Analytics, and hardening/security configuration guidance. |
 | Decision Making | L51-L59 | Guidance on choosing Azure Firewall SKUs (Basic/Standard/Premium), comparing features and performance, and planning or changing deployments based on throughput and requirements. |
 | Architecture & Design Patterns | L60-L72 | Designing Azure Firewall network architectures: hub-and-spoke, forced tunneling, load balancer integration, hybrid/AVD/M365 protection, and DNAT for overlapping/private IP networks. |
-| Limits & Quotas | L73-L82 | Azure Firewall capacity, IP/port/session limits, SNAT scaling with NAT Gateway, prescaling ranges, and TCP idle timeout behaviors and configuration. |
+| Limits & Quotas | L73-L82 | Azure Firewall capacity, IP/port/session limits, SNAT scaling with NAT Gateway, prescaling options, and TCP idle timeout behaviors and configuration. |
 | Security | L83-L96 | Azure Firewall security setup: compliance, RBAC/permissions, Azure Policy, TLS inspection and CA chains, threat intel, DNAT, AKS and hybrid network protection, and portal deployment. |
 | Configuration | L97-L118 | Configuring Azure Firewall policies, rules (DNAT/SNAT/app), IP Groups, DNS/proxy/FTP, maintenance windows, monitoring/logging, and advanced Premium/PowerShell management. |
 | Integrations & Coding Patterns | L119-L123 | Configuring Azure Firewall to securely access Azure Storage via SFTP, including required rules, network paths, and integration patterns for SFTP traffic. |
@@ -78,7 +78,7 @@
 | Scale Azure Firewall SNAT ports with NAT Gateway | https://learn.microsoft.com/en-us/azure/firewall/integrate-with-nat-gateway |
 | Integrate Azure Firewall with NAT Gateway V2 for SNAT scaling | https://learn.microsoft.com/en-us/azure/firewall/integrate-with-nat-gateway-v2 |
 | Configure Azure Firewall prescaling capacity ranges | https://learn.microsoft.com/en-us/azure/firewall/prescaling |
-| Manage Azure Firewall TCP session idle timeouts | https://learn.microsoft.com/en-us/azure/firewall/tcp-session-behavior |
+| Configure Azure Firewall TCP session idle timeouts | https://learn.microsoft.com/en-us/azure/firewall/tcp-session-behavior |

 ### Security
 | Topic | URL |
diff --git a/skills/azure-front-door/SKILL.md b/skills/azure-front-door/SKILL.md
index ea08af81..8c93399d 100644
--- a/skills/azure-front-door/SKILL.md
+++ b/skills/azure-front-door/SKILL.md
@@ -3,7 +3,7 @@ name: azure-front-door
 description: Expert knowledge for Azure Front Door development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring Front Door routing/caching, rules engine, WAF/TLS, Private Link origins, or classic-to-Std/Prm migrations, and other Azure Front Door related development tasks. Not for Azure Application Gateway (use azure-application-gateway), Azure Traffic Manager (use azure-traffic-manager), Azure Load Balancer (use azure-load-balancer), Azure Web Application Firewall (use azure-web-application-firewall).
 compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
 metadata:
-  generated_at: "2026-04-19"
+  generated_at: "2026-04-26"
   generator: "docs2skills/1.0.0"
 ---
 # Azure Front Door Skill
diff --git a/skills/azure-functions/SKILL.md b/skills/azure-functions/SKILL.md
index 5aa9be12..944eb317 100644
--- a/skills/azure-functions/SKILL.md
+++ b/skills/azure-functions/SKILL.md
@@ -1,9 +1,9 @@
 ---
 name: azure-functions
-description: Expert knowledge for Azure Functions development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when building HTTP/queue-triggered apps, Durable Functions, Linux/container hosting, API Mgmt/Logic Apps, or Flex plans, and other Azure Functions related development tasks. Not for Azure App Service (use azure-app-service), Azure Logic Apps (use azure-logic-apps), Azure Container Apps (use azure-container-apps), Azure Kubernetes Service (AKS) (use azure-kubernetes-service).
+description: Expert knowledge for Azure Functions development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when building HTTP/queue/event-triggered Functions, Durable workflows, containerized Functions, or API Management/Logic Apps integrations, and other Azure Functions related development tasks. Not for Azure App Service (use azure-app-service), Azure Logic Apps (use azure-logic-apps), Azure Container Apps (use azure-container-apps), Azure Kubernetes Service (AKS) (use azure-kubernetes-service).
 compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
 metadata:
-  generated_at: "2026-04-19"
+  generated_at: "2026-04-26"
   generator: "docs2skills/1.0.0"
 ---
 # Azure Functions Skill
@@ -26,13 +26,13 @@ This skill requires **network access** to fetch documentation content:
 |----------|-------|-------------|
 | Troubleshooting | L37-L59 | Diagnosing and fixing Durable Functions/Task SDK issues, AZFD/AZFW error codes, storage and config problems, and runtime/deployment errors for Node.js, Python, and VM start/stop functions. |
 | Best Practices | L60-L79 | Patterns and guidance for robust, performant Azure Functions and Durable Functions: orchestration/entity design, versioning, error handling, DI, HTTP/connection usage, scaling, and language-specific best practices. |
-| Decision Making | L80-L101 | Guidance on choosing Functions hosting/runtime models, estimating costs, and planning migrations (plans, runtimes, languages, AWS Lambda, Express.js, Service Bus) for optimal architecture. |
+| Decision Making | L80-L101 | Guidance on choosing Functions hosting/runtime models, comparing costs and plans, and planning migrations (between runtimes, models, platforms like AWS Lambda and Express.js). |
 | Architecture & Design Patterns | L102-L107 | Running Functions in Linux containers, Durable Functions design with Azure Storage, and hosting Functions on Azure Container Apps for scalable, container-based architectures. |
 | Limits & Quotas | L108-L116 | Details on Functions hosting limits: legacy and Flex Consumption plans, scaling behavior, concurrency and target-based scaling settings, and supported languages/versions. |
 | Security | L117-L134 | Securing Functions apps: encryption at rest, secure storage, keys/secrets, managed identity, private endpoints/VNet, Web PubSub, networking/access controls, and secure MCP hosting. |
 | Configuration | L135-L169 | Configuring Azure Functions apps: bindings, triggers, app/host settings, runtime versions, plans, networking, monitoring/telemetry, and local/Core Tools setup. |
 | Integrations & Coding Patterns | L170-L273 | Patterns and how-tos for wiring Functions to external systems (DBs, messaging, AI/OpenAI, Dapr, MCP, storage, HTTP) using triggers/bindings, plus integration with API Mgmt, Logic Apps, and on-prem. |
-| Deployment | L274-L303 | Deploying and hosting Azure Functions: provisioning plans (Consumption/Flex/Kubernetes), containers, CI/CD (GitHub/Azure Pipelines), slots, zero‑downtime, and migration/deployment automation. |
+| Deployment | L274-L303 | Deploying and hosting Azure Functions: provisioning plans (Consumption/Flex), templates (Bicep/ARM/Terraform), containers/Kubernetes, CI/CD (GitHub/Azure Pipelines), slots, and migration tasks. |

 ### Troubleshooting
 | Topic | URL |
@@ -291,7 +291,7 @@ This skill requires **network access** to fetch documentation content:
 | Set up Azure Pipelines CI/CD for Azure Functions | https://learn.microsoft.com/en-us/azure/azure-functions/functions-how-to-azure-devops |
 | Run Azure Functions in custom Linux containers on Container Apps | https://learn.microsoft.com/en-us/azure/azure-functions/functions-how-to-custom-container |
 | Configure GitHub Actions CI/CD for Azure Functions | https://learn.microsoft.com/en-us/azure/azure-functions/functions-how-to-github-actions |
-| Automate Azure Functions deployment with Bicep or ARM | https://learn.microsoft.com/en-us/azure/azure-functions/functions-infrastructure-as-code |
+| Deploy Azure Functions with Bicep or ARM templates | https://learn.microsoft.com/en-us/azure/azure-functions/functions-infrastructure-as-code |
 | Host Azure Functions on Kubernetes with KEDA | https://learn.microsoft.com/en-us/azure/azure-functions/functions-kubernetes-keda |
 | Configure zone-redundant Azure Functions apps | https://learn.microsoft.com/en-us/azure/azure-functions/functions-zone-redundancy |
 | Migrate Azure Cosmos DB Functions extension from v3 to v4 | https://learn.microsoft.com/en-us/azure/azure-functions/migrate-cosmos-db-version-3-version-4 |
diff --git a/skills/azure-health-data-services/SKILL.md b/skills/azure-health-data-services/SKILL.md
index 9c51aa7b..da8b4a75 100644
--- a/skills/azure-health-data-services/SKILL.md
+++ b/skills/azure-health-data-services/SKILL.md
@@ -3,7 +3,7 @@ name: azure-health-data-services
 description: Expert knowledge for Azure Health Data Services development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when using FHIR, DICOM, MedTech, de-identification APIs, Synapse/ADF pipelines, or SMART on FHIR apps, and other Azure Health Data Services related development tasks. Not for Azure Health Bot (use azure-health-bot), Azure API Management (use azure-api-management), Azure Data Factory (use azure-data-factory), Azure Cosmos DB (use azure-cosmos-db).
 compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
 metadata:
-  generated_at: "2026-04-19"
+  generated_at: "2026-04-26"
   generator: "docs2skills/1.0.0"
 ---
 # Azure Health Data Services Skill
diff --git a/skills/azure-iot-edge/SKILL.md b/skills/azure-iot-edge/SKILL.md
index f1ea18ef..25c98455 100644
--- a/skills/azure-iot-edge/SKILL.md
+++ b/skills/azure-iot-edge/SKILL.md
@@ -1,9 +1,9 @@
 ---
 name: azure-iot-edge
-description: Expert knowledge for Azure IoT Edge development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when provisioning IoT Edge/EFLOW, deploying modules via manifests/CI-CD, using DPS/X.509, or designing gateway topologies, and other Azure IoT Edge related development tasks. Not for Azure IoT Hub (use azure-iot-hub), Azure IoT Central (use azure-iot-central), Azure IoT Operations (use azure-iot-operations), Azure Stack Edge (use azure-stack-edge).
+description: Expert knowledge for Azure IoT Edge development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when provisioning IoT Edge/EFLOW, deploying modules via manifests/CI-CD, using DPS/X.509, gateways, or nested edge, and other Azure IoT Edge related development tasks. Not for Azure IoT Hub (use azure-iot-hub), Azure IoT Central (use azure-iot-central), Azure IoT Operations (use azure-iot-operations), Azure Stack Edge (use azure-stack-edge).
 compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
 metadata:
-  generated_at: "2026-04-19"
+  generated_at: "2026-04-26"
   generator: "docs2skills/1.0.0"
 ---
 # Azure IoT Edge Skill
@@ -28,7 +28,7 @@ This skill requires **network access** to fetch documentation content:
 | Best Practices | L48-L53 | Monitoring module twins for health/alerts and production-readiness guidance for IoT Edge solutions (deployment hardening, reliability, security, and operational best practices). |
 | Decision Making | L54-L60 | Guidance on choosing IoT Edge/EFLOW platforms, provisioning methods, networking setups, and nested virtualization options for different deployment scenarios. |
 | Architecture & Design Patterns | L61-L66 | Gateway design patterns for connecting downstream devices and patterns for handling offline/intermittent connectivity, local processing, and sync behavior in Azure IoT Edge setups. |
-| Limits & Quotas | L67-L71 | Azure IoT Edge service and resource limits: max modules, routes, deployments, message sizes, throttling, and other scalability and quota constraints for edge solutions. |
+| Limits & Quotas | L67-L71 | IoT Edge-specific limits and quotas: max modules, routes, deployments, message sizes, resource constraints, and other scalability or capacity restrictions for edge solutions. |
 | Security | L72-L84 | Securing IoT Edge with certificates, X.509 provisioning, confidential computing, downstream device auth, EST server setup, and network protection via Private Link/endpoints. |
 | Configuration | L85-L118 | Configuring IoT Edge devices, networking, gateways, provisioning (DPS, X.509, TPM, symmetric keys), EFLOW/VM settings, storage, proxies, and built-in/custom metrics and alerts. |
 | Integrations & Coding Patterns | L119-L126 | Remote management and logging via direct methods, building and packaging custom IoT Edge modules, and managing IoT Edge on Windows with EFLOW PowerShell functions |
diff --git a/skills/azure-iot-hub/SKILL.md b/skills/azure-iot-hub/SKILL.md
index 0539cd63..73fb25f8 100644
--- a/skills/azure-iot-hub/SKILL.md
+++ b/skills/azure-iot-hub/SKILL.md
@@ -1,9 +1,9 @@
 ---
 name: azure-iot-hub
-description: Expert knowledge for Azure IoT Hub development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when provisioning devices via DPS, managing twins/jobs/routing, using device streams, or integrating Device Update, and other Azure IoT Hub related development tasks. Not for Azure IoT (use azure-iot), Azure IoT Central (use azure-iot-central), Azure IoT Edge (use azure-iot-edge), Azure Defender For Iot (use azure-defender-for-iot).
+description: Expert knowledge for Azure IoT Hub development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring IoT Hub/DPS, device twins/routing, MQTT/AMQP clients, Cosmos DB event storage, or Device Update, and other Azure IoT Hub related development tasks. Not for Azure IoT (use azure-iot), Azure IoT Central (use azure-iot-central), Azure IoT Edge (use azure-iot-edge), Azure Defender for IoT (use azure-defender-for-iot).
 compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
 metadata:
-  generated_at: "2026-04-19"
+  generated_at: "2026-04-26"
   generator: "docs2skills/1.0.0"
 ---
 # Azure IoT Hub Skill
@@ -28,11 +28,11 @@ This skill requires **network access** to fetch documentation content:
 | Best Practices | L54-L63 | Best practices for secure, scalable IoT Hub/DPS deployments: device provisioning at scale, OEM security, cert renewal/ADR, resilient reconnection, auto device config, and IoT Hub hardening. |
 | Decision Making | L64-L76 | Guidance for choosing IoT Hub vs alternatives, tiers/scale, pricing, routing, comms patterns (C2D/D2C), monitoring methods, and when to use or disable disaster recovery. |
 | Architecture & Design Patterns | L77-L83 | Design patterns for DPS lifecycle/HA/DR, VNet connectivity, secure device streams, and reliably persisting ordered IoT Hub events with Cosmos DB. |
-| Limits & Quotas | L84-L89 | Details on IoT Hub and Device Update service limits, quotas, throttling behavior, and how many devices/operations you can scale to before hitting constraints. |
-| Security | L90-L128 | Securing IoT Hub, DPS, and Device Update: auth (Entra ID, RBAC, SAS, X.509), certificates and revocation, TLS/ciphers, keys, network/IP controls, private endpoints, and Azure Policy compliance. |
-| Configuration | L129-L170 | Configuring IoT Hub and DPS: enroll devices, manage certificates and ADR, set twins, jobs, routing, endpoints, file upload, message enrichments, queries, and monitoring/metrics/logs. |
-| Integrations & Coding Patterns | L171-L195 | Patterns and code samples for connecting devices/DPS to IoT Hub (MQTT/HTTPS/AMQP), managing identities/twins/files, direct methods, C2D messages, bulk ops, and message formats. |
-| Deployment | L196-L207 | Deploying and updating IoT Hubs and devices: region/SKU migration, failover, ARM/Bicep deployments, Device Update (image/package, proxy, OS support), and scheduling jobs via CLI. |
+| Limits & Quotas | L84-L90 | Details on IoT Hub, DPS, and Device Update service limits, quotas, and throttling behaviors, including scale constraints and how operations are restricted under load. |
+| Security | L91-L129 | Securing IoT Hub, DPS, and Device Update: auth (Entra ID, RBAC, SAS, X.509), certificates and revocation, TLS/ciphers, keys, network/IP controls, private endpoints, and Azure Policy compliance. |
+| Configuration | L130-L171 | Configuring IoT Hub and DPS: enroll devices, manage certificates and ADR, set twins, jobs, routing, endpoints, file upload, message enrichments, queries, and monitoring/metrics/logs. |
+| Integrations & Coding Patterns | L172-L196 | Patterns and code samples for connecting devices/DPS to IoT Hub (MQTT/HTTPS/AMQP), managing identities/twins/files, direct methods, C2D messages, bulk ops, and message formats. |
+| Deployment | L197-L208 | Deploying and updating IoT Hubs and devices: region/SKU migration, failover, ARM/Bicep deployments, Device Update (image/package, proxy, OS support), and scheduling jobs via CLI. |

 ### Troubleshooting
 | Topic | URL |
@@ -84,6 +84,7 @@ This skill requires **network access** to fetch documentation content:
 ### Limits & Quotas
 | Topic | URL |
 |-------|-----|
+| Azure IoT DPS FAQ with limits and behaviors | https://learn.microsoft.com/en-us/azure/iot-dps/dps-faq |
 | Review Azure Device Update service limits | https://learn.microsoft.com/en-us/azure/iot-hub-device-update/device-update-limits |
 | Azure IoT Hub quotas, limits, and throttling behavior | https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-quotas-throttling |

diff --git a/skills/azure-iot-operations/SKILL.md b/skills/azure-iot-operations/SKILL.md
index 86b8e42c..51a924b6 100644
--- a/skills/azure-iot-operations/SKILL.md
+++ b/skills/azure-iot-operations/SKILL.md
@@ -1,9 +1,9 @@
 ---
 name: azure-iot-operations
-description: Expert knowledge for Azure IoT Operations development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when building IoT data flows/graphs, MQTT broker setups, WASM/ONNX workloads, Akri/REST connectors, or OPC UA/MQTT security, and other Azure IoT Operations related development tasks. Not for Azure IoT (use azure-iot), Azure IoT Hub (use azure-iot-hub), Azure IoT Edge (use azure-iot-edge), Azure IoT Central (use azure-iot-central).
+description: Expert knowledge for Azure IoT Operations development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring MQTT broker/data flows, OPC UA assets, Akri/REST/WASM connectors, ONNX transforms, or Prometheus/Grafana, and other Azure IoT Operations related development tasks. Not for Azure IoT Hub (use azure-iot-hub), Azure IoT Edge (use azure-iot-edge), Azure IoT Central (use azure-iot-central), Azure Digital Twins (use azure-digital-twins).
 compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
 metadata:
-  generated_at: "2026-04-19"
+  generated_at: "2026-04-26"
   generator: "docs2skills/1.0.0"
 ---
 # Azure IoT Operations Skill
@@ -30,9 +30,9 @@ This skill requires **network access** to fetch documentation content:
 | Architecture & Design Patterns | L57-L61 | Designing IoT data processing pipelines with data flow graphs and applying Azure IoT Operations in layered/segmented industrial network topologies and architectures. |
 | Limits & Quotas | L62-L66 | Details on MQTT broker feature support, protocol limits, and control capabilities in Azure IoT Operations, including which MQTT functions and controls are available or restricted. |
 | Security | L67-L84 | Securing Azure IoT Operations: TLS/cert management, OPC UA trust, MQTT authz/authn, private networking, secrets/Key Vault, RBAC roles, and image validation. |
-| Configuration | L85-L127 | Configuring Azure IoT Operations data flows, endpoints, routing, transforms, persistence, MQTT broker settings, device/asset models, and observability/metrics for monitoring and tuning. |
-| Integrations & Coding Patterns | L128-L146 | Patterns and code for integrating devices and cameras, building Akri/REST/WASM connectors and transforms, using state store, ONNX, schemas, and expression/mapping languages in IoT data flows |
-| Deployment | L147-L155 | Deploying, cloning, upgrading, and securing Azure IoT Operations in production (incl. private networks), plus deploying observability (Prometheus/Grafana) and WASM/graph workloads. |
+| Configuration | L85-L126 | Configuring Azure IoT Operations data flows, endpoints, schemas, MQTT broker, assets/devices, persistence, scaling, and observability/metrics to tune, monitor, and manage the system. |
+| Integrations & Coding Patterns | L127-L145 | Patterns and code for integrating devices and cameras, building Akri/REST/WASM connectors and transforms, using state store, ONNX, schemas, and expression/mapping languages in IoT data flows |
+| Deployment | L146-L154 | Deploying, cloning, upgrading, and securing Azure IoT Operations in production (incl. private networks), plus deploying observability (Prometheus/Grafana) and WASM/graph workloads. |

 ### Troubleshooting
 | Topic | URL |
@@ -110,7 +110,6 @@ This skill requires **network access** to fetch documentation content:
 | Configure container registry endpoints for IoT Operations | https://learn.microsoft.com/en-us/azure/iot-operations/develop-edge-apps/howto-configure-registry-endpoint |
 | Configure WebAssembly graph definitions for IoT Operations | https://learn.microsoft.com/en-us/azure/iot-operations/develop-edge-apps/howto-configure-wasm-graph-definitions |
 | Use MQTT broker state store for data persistence | https://learn.microsoft.com/en-us/azure/iot-operations/develop-edge-apps/overview-state-store |
-| Define Azure IoT Operations assets and devices in Device Registry | https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/concept-assets-devices |
 | Configure OPC UA assets and devices in Azure IoT Operations | https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/howto-configure-opc-ua |
 | Manage Azure IoT Operations resources in the operations experience UI | https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/howto-use-operations-experience |
 | Configure SSE connector assets and devices in Azure IoT Operations | https://learn.microsoft.com/en-us/azure/iot-operations/discover-manage-assets/howto-use-sse-connector |
diff --git a/skills/azure-iot/SKILL.md b/skills/azure-iot/SKILL.md
index 302e6db9..c3759308 100644
--- a/skills/azure-iot/SKILL.md
+++ b/skills/azure-iot/SKILL.md
@@ -1,14 +1,14 @@
 ---
 name: azure-iot
-description: Expert knowledge for Azure IoT development including architecture & design patterns, and integrations & coding patterns. Use when using MQTT, IoT Plug and Play, DPS/IoT Hub, SAP ERP integration, or industrial IoT reference architectures, and other Azure IoT related development tasks. Not for Azure IoT Hub (use azure-iot-hub), Azure IoT Edge (use azure-iot-edge), Azure IoT Central (use azure-iot-central), Azure Defender For Iot (use azure-defender-for-iot).
+description: Expert knowledge for Azure IoT development including decision making, architecture & design patterns, and integrations & coding patterns. Use when using MQTT or IoT Plug and Play, DPS/IoT Hub, SAP ERP integration, device registries, or schema registries, and other Azure IoT related development tasks. Not for Azure IoT Hub (use azure-iot-hub), Azure IoT Edge (use azure-iot-edge), Azure IoT Central (use azure-iot-central), Azure Defender for IoT (use azure-defender-for-iot).
 compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
 metadata:
-  generated_at: "2026-04-19"
+  generated_at: "2026-04-26"
   generator: "docs2skills/1.0.0"
 ---
 # Azure IoT Skill

-This skill provides expert guidance for Azure IoT. Covers architecture & design patterns, and integrations & coding patterns. It combines local quick-reference content with remote documentation fetching capabilities.
+This skill provides expert guidance for Azure IoT. Covers decision making, architecture & design patterns, and integrations & coding patterns. It combines local quick-reference content with remote documentation fetching capabilities.
 ## How to Use This Skill

@@ -24,8 +24,15 @@ This skill requires **network access** to fetch documentation content:

 | Category | Lines | Description |
 |----------|-------|-------------|
-| Architecture & Design Patterns | L30-L35 | Reference architectures and patterns for industrial IoT on Azure, including dataspace-based designs, component choices, and end-to-end implementation guidance for industrial scenarios. |
-| Integrations & Coding Patterns | L36-L39 | Patterns and code for integrating devices via MQTT and IoT Plug and Play, building device/service apps, formatting payloads, using DPS/IoT Hub, and connecting SAP ERP to Azure IoT. |
+| Decision Making | L31-L36 | Guidance on designing Azure Device Registry namespaces and schema registries, including structure, organization, and planning for IoT device data and metadata. |
+| Architecture & Design Patterns | L37-L42 | Reference architectures and patterns for industrial IoT on Azure, including dataspace-based designs, component choices, and end-to-end implementation guidance for industrial scenarios. |
+| Integrations & Coding Patterns | L43-L46 | Patterns and code for integrating devices via MQTT and IoT Plug and Play, building device/service apps, formatting payloads, using DPS/IoT Hub, and connecting SAP ERP to Azure IoT. |
+
+### Decision Making
+| Topic | URL |
+|-------|-----|
+| Design and choose Azure Device Registry namespaces | https://learn.microsoft.com/en-us/azure/iot/iot-device-registry-namespace-guidance |
+| Plan Azure Device Registry schema registries for IoT | https://learn.microsoft.com/en-us/azure/iot/iot-device-registry-schema-registry-guidance |

 ### Architecture & Design Patterns
 | Topic | URL |
diff --git a/skills/azure-key-vault/SKILL.md b/skills/azure-key-vault/SKILL.md
index 4a46f760..1263ddfc 100644
--- a/skills/azure-key-vault/SKILL.md
+++ b/skills/azure-key-vault/SKILL.md
@@ -1,9 +1,9 @@
 ---
 name: azure-key-vault
-description: Expert knowledge for Azure Key Vault development including troubleshooting, best practices, decision making, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when using Key Vault/Managed HSM APIs, RBAC vs access policies, Private Link, key rotation, or IaC deployment, and other Azure Key Vault related development tasks. Not for Azure Dedicated HSM (use azure-dedicated-hsm), Azure Cloud Hsm (use azure-cloud-hsm), Azure Payment Hsm (use azure-payment-hsm), Azure Information Protection (use azure-information-protection).
+description: Expert knowledge for Azure Key Vault development including troubleshooting, best practices, decision making, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when using KV/Managed HSM SDKs, Entra auth/RBAC, Private Link, key rotation/backup, or IaC deployments, and other Azure Key Vault related development tasks. Not for Azure Dedicated HSM (use azure-dedicated-hsm), Azure Cloud HSM (use azure-cloud-hsm), Azure Payment HSM (use azure-payment-hsm), Azure Information Protection (use azure-information-protection).
 compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
 metadata:
-  generated_at: "2026-04-19"
+  generated_at: "2026-04-26"
   generator: "docs2skills/1.0.0"
 ---
 # Azure Key Vault Skill
@@ -25,10 +25,10 @@ This skill requires **network access** to fetch documentation content:
 | Category | Lines | Description |
 |----------|-------|-------------|
 | Troubleshooting | L36-L44 | Diagnosing and fixing Key Vault errors: REST/API error codes, access policy failures, Private Link misconfig, and Azure Policy enforcement issues. |
-| Best Practices | L45-L55 | Guidance on BYOK/HSM key import, key/secret security best practices, disaster recovery for Managed HSM, and automating single/dual-credential secret rotation in Key Vault. |
+| Best Practices | L45-L55 | Best practices for BYOK/HSM key generation and transfer, secure key management, disaster recovery for Managed HSM, and automating single/dual-credential secret rotation in Key Vault. |
 | Decision Making | L56-L62 | Guidance on planning key and HSM capacity, scaling, and migrating cryptographic workloads or Key Vault access control from access policies to RBAC |
 | Limits & Quotas | L63-L73 | Key Vault/Managed HSM limits: throttling, quotas, size constraints, logging latency, soft-delete behavior, and network/IP firewall configuration. |
-| Security | L74-L99 | Securing Azure Key Vault and Managed HSM: auth, RBAC vs access policies, firewalls, Private Link, soft-delete, backups, and security best practices for keys, secrets, and certificates. |
+| Security | L74-L99 | Securing Key Vault and Managed HSM: auth with Entra ID, RBAC vs access policies, network/firewall/Private Link, soft delete, backup/restore, and hardening best practices for keys, secrets, and certs. |
 | Configuration | L100-L123 | Configuring Key Vault and Managed HSM: monitoring, alerts, logging, policies, key types/rotation, secure key release, replication, and special secret formats (e.g., multiline). |
 | Integrations & Coding Patterns | L124-L151 | Using Key Vault from code and services: JS/Go/.NET/Python client patterns for keys/secrets/certs, rotation and backup, plus integrations with Event Grid, Logic Apps, Databricks, DigiCert, and TLS offload. |
 | Deployment | L152-L155 | How to deploy and provision Azure Key Vault and Managed HSM (vaults, keys, secrets) using ARM templates, Bicep, Terraform, Azure CLI, and PowerShell |
@@ -47,7 +47,7 @@
 |-------|-----|
 | Plan and execute BYOK HSM key transfers to Key Vault | https://learn.microsoft.com/en-us/azure/key-vault/keys/hsm-protected-keys |
 | Implement BYOK HSM-protected keys for Azure Key Vault | https://learn.microsoft.com/en-us/azure/key-vault/keys/hsm-protected-keys-byok |
-| Apply security best practices for Azure Key Vault keys | https://learn.microsoft.com/en-us/azure/key-vault/keys/secure-keys |
+| Apply secure key management practices in Azure Key Vault | https://learn.microsoft.com/en-us/azure/key-vault/keys/secure-keys |
 | Execute disaster recovery for Azure Managed HSM disruptions | https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/disaster-recovery-guide |
 | Generate and import BYOK HSM keys into Azure Managed HSM | https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/hsm-protected-keys-byok |
 | Automate single-credential secret rotation with Key Vault | https://learn.microsoft.com/en-us/azure/key-vault/secrets/tutorial-rotation |
diff --git a/skills/azure-kubernetes-service/SKILL.md b/skills/azure-kubernetes-service/SKILL.md
index d8776fa0..3c060b2f 100644
--- a/skills/azure-kubernetes-service/SKILL.md
+++ b/skills/azure-kubernetes-service/SKILL.md
@@ -1,9 +1,9 @@
 ---
 name: azure-kubernetes-service
-description: Expert knowledge for Azure Kubernetes Service (AKS) development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when designing AKS clusters with Fleet, Istio, Dapr, GPUs, GitOps/CI-CD, or AI/ML and Wasm workloads, and other Azure Kubernetes Service (AKS) related development tasks. Not for Azure Container Apps (use azure-container-apps), Azure Container Instances (use azure-container-instances), Azure Red Hat OpenShift (use azure-redhat-openshift), Azure Virtual Machine Scale Sets (use azure-vm-scalesets).
+description: Expert knowledge for Azure Kubernetes Service (AKS) development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring AKS networking/storage, Istio or service mesh, Fleet multi-cluster, GPUs/AI workloads, or KEDA autoscaling, and other Azure Kubernetes Service (AKS) related development tasks. Not for Azure Container Apps (use azure-container-apps), Azure Container Instances (use azure-container-instances), Azure Red Hat OpenShift (use azure-redhat-openshift), Azure Virtual Machine Scale Sets (use azure-vm-scalesets).
 compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
 metadata:
-  generated_at: "2026-04-19"
+  generated_at: "2026-04-26"
   generator: "docs2skills/1.0.0"
 ---
 # Azure Kubernetes Service (AKS) Skill
@@ -24,32 +24,35 @@
 | Category | Lines | Description |
 |----------|-------|-------------|
-| Troubleshooting | L37-L57 | Diagnosing and fixing AKS cluster issues: networking, DNS, kubelet, GPUs, security/vulns, upgrades, Windows nodes, Fleet CRDs, and agent/log-based troubleshooting. |
-| Best Practices | L58-L102 | AKS reliability, performance, security, and cost best practices: cluster design, upgrades, networking, storage, scheduling, GPU/ML, PCI, and workload resiliency patterns. |
-| Decision Making | L103-L149 | Guidance for planning, choosing, and migrating AKS architectures: VM/node pools, networking, upgrades, pricing, cost optimization, PCI, multi‑cloud comparisons, and Fleet Manager workflows. |
-| Architecture & Design Patterns | L150-L174 | Designing AKS architectures: HA/DR patterns, multi-region and PCI designs, upgrade/rollout strategies, node pool/network isolation, virtual nodes, and Fleet-based multi-cluster management. |
-| Limits & Quotas | L175-L197 | AKS limits, quotas, SLAs, and lifecycle: node/pod/identity caps, CNI/NAT/load balancer scaling, Istio performance, version support/LTS, Fleet limits, and platform support constraints. |
-| Security | L198-L281 | Securing AKS clusters: identity and access (Entra, RBAC, workload identity), network and node hardening, encryption, policies, PCI compliance, and secure integrations with other Azure services. |
-| Configuration | L282-L434 | Configuring AKS clusters and fleets: networking, ingress/egress, storage, GPUs, autoscaling, cost/monitoring, security, add-ons (Istio, Dapr, ACNS), and infrastructure for common data/AI workloads. |
-| Integrations & Coding Patterns | L435-L459 | Patterns and how-tos for integrating AKS with AI agents, KAITO, storage, security (Key Vault/CSI), GPUs, monitoring, GitOps/Fleet, and external Azure/OSS services and tools. |
-| Deployment | L460-L514 | Deploying and upgrading AKS clusters and apps, including CI/CD, service meshes, autoscaling, storage/migration, AI/ML and Wasm workloads, and production-ready infrastructure setup. |
+| Troubleshooting | L37-L60 | Diagnosing and fixing AKS cluster issues: networking, DNS, kubelet, GPUs, security/KMS, upgrades, Windows containers, Fleet, and troubleshooting tools like Insights, events, and logs.
| +| Best Practices | L61-L105 | AKS operational best practices: reliability, security, cost, performance, scheduling, storage/backup, networking, upgrades, GPU/ML, PCI, and workload-specific resiliency patterns. | +| Decision Making | L106-L155 | Guidance for AKS design and migration decisions: cluster/network/VM sizing, pricing and cost optimization, security/PCI, OS and node pools, upgrades, and comparisons with other platforms. | +| Architecture & Design Patterns | L156-L179 | Designing AKS architectures: HA/DR patterns, multi-region and PCI designs, upgrade/rollout strategies, node pool/network isolation, virtual nodes, and Fleet-based multi-cluster management. | +| Limits & Quotas | L180-L204 | AKS platform limits, SLAs, quotas, version lifecycle, performance/capacity (Istio, NAT, load balancers), identity scaling, networking/subnet limits, and support policies. | +| Security | L205-L285 | Securing AKS clusters: identity/RBAC, Entra auth, network policies, disk/secret encryption, node hardening, PCI compliance, Istio/mTLS, image integrity, and Azure Policy-based controls. | +| Configuration | L286-L437 | Configuring AKS clusters, networking, storage, security, autoscaling, service mesh, databases, costs, and multi-cluster/Fleet features, plus extensions and advanced networking (ACNS, CNI, ingress). | +| Integrations & Coding Patterns | L438-L462 | Patterns and how-tos for integrating AKS with AI agents/KAITO, storage, secrets, GPUs, monitoring, KEDA, Fleet, Istio/OSM, and external services like ACR, MongoDB, and GitHub Actions. | +| Deployment | L463-L515 | Deploying and upgrading AKS clusters and add-ons, setting up CI/CD and IaC (ARM/Bicep/Terraform), and running apps/workloads (AI, Ray, Dapr, service mesh, Wasm, OpenFaaS, KEDA).
| ### Troubleshooting | Topic | URL | |-------|-----| | Diagnose and fix common Agentic CLI for AKS issues | https://learn.microsoft.com/en-us/azure/aks/agentic-cli-for-aks-troubleshoot | +| Use AI assistant to troubleshoot AKS Desktop workloads | https://learn.microsoft.com/en-us/azure/aks/aks-desktop-deploy-ai-assistant | +| Use AKS Desktop Insights to troubleshoot Kubernetes apps | https://learn.microsoft.com/en-us/azure/aks/aks-desktop-deploy-troubleshooting | | Support and troubleshooting options for Azure Kubernetes Service | https://learn.microsoft.com/en-us/azure/aks/aks-support-help | | Diagnose AKS network issues using Advanced Container Networking Services | https://learn.microsoft.com/en-us/azure/aks/container-network-observability-guide | | Troubleshoot CoreDNS issues in AKS clusters | https://learn.microsoft.com/en-us/azure/aks/coredns-troubleshoot | | Use Kubernetes events to troubleshoot AKS clusters | https://learn.microsoft.com/en-us/azure/aks/events | | Diagnose GPU node health with NPD in AKS | https://learn.microsoft.com/en-us/azure/aks/gpu-health-monitoring | +| Monitor and troubleshoot AKS KMS etcd encryption | https://learn.microsoft.com/en-us/azure/aks/kms-observability | | Collect and view kubelet logs from AKS nodes | https://learn.microsoft.com/en-us/azure/aks/kubelet-logs | | Diagnose and fix common Open Service Mesh add-on issues on AKS | https://learn.microsoft.com/en-us/azure/aks/open-service-mesh-troubleshoot | -| AKS security bulletins and vulnerability troubleshooting | https://learn.microsoft.com/en-us/azure/aks/security-bulletins/overview | +| Use AKS security bulletins for vulnerability impact and mitigation | https://learn.microsoft.com/en-us/azure/aks/security-bulletins/overview | | Troubleshoot Container Network Insights Agent issues on AKS | https://learn.microsoft.com/en-us/azure/aks/troubleshoot-container-network-insights-agent | | Troubleshoot SNAT port exhaustion for AKS load balancers | 
https://learn.microsoft.com/en-us/azure/aks/troubleshoot-source-network-address-translation | | Troubleshoot UDP packet drops in AKS clusters | https://learn.microsoft.com/en-us/azure/aks/troubleshoot-udp-packet-drops | -| Resolve common AKS cluster upgrade issues (FAQ) | https://learn.microsoft.com/en-us/azure/aks/upgrade-aks-faq | +| Resolve common Azure AKS upgrade issues and questions | https://learn.microsoft.com/en-us/azure/aks/upgrade-aks-faq | | Windows Server containers on AKS FAQ and issues | https://learn.microsoft.com/en-us/azure/aks/windows-faq | | Identify and migrate preview Azure Kubernetes Fleet CRDs to supported versions | https://learn.microsoft.com/en-us/azure/kubernetes-fleet/howto-migrate-preview-to-ga-fleets | | Interpret ClusterResourcePlacement and ResourcePlacement status | https://learn.microsoft.com/en-us/azure/kubernetes-fleet/howto-understand-placement | @@ -81,7 +84,6 @@ This skill requires **network access** to fetch documentation content: | Implement cluster isolation strategies in Azure Kubernetes Service | https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-cluster-isolation | | Apply AKS cluster security and upgrade best practices | https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-cluster-security | | Implement container image security best practices in AKS | https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-container-image-management | -| Best practices for AKS authentication and authorization | https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-identity | | Apply AKS network configuration best practices | https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-network | | Apply basic scheduler best practices in AKS | https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-scheduler | | Apply AKS storage and backup operator best practices | https://learn.microsoft.com/en-us/azure/aks/operator-best-practices-storage | @@ -95,6 +97,7 @@ 
This skill requires **network access** to fetch documentation content: | Upgrade AKS node pools and control plane safely | https://learn.microsoft.com/en-us/azure/aks/upgrade-node-pools | | Follow best practices for AKS node OS version upgrades | https://learn.microsoft.com/en-us/azure/aks/upgrade-os-version | | Validate Valkey resiliency during AKS node pool upgrades | https://learn.microsoft.com/en-us/azure/aks/upgrade-valkey-aks-nodepool | +| Manage and configure AKS system node pools | https://learn.microsoft.com/en-us/azure/aks/use-system-pools | | Validate and operate PostgreSQL HA deployments on AKS | https://learn.microsoft.com/en-us/azure/aks/validate-postgresql-ha | | Load test and validate Valkey cluster resiliency on AKS | https://learn.microsoft.com/en-us/azure/aks/validate-valkey-cluster | | Apply best practices for Windows containers on AKS | https://learn.microsoft.com/en-us/azure/aks/windows-best-practices | @@ -110,6 +113,7 @@ This skill requires **network access** to fetch documentation content: | Choose small vs large language models on AKS | https://learn.microsoft.com/en-us/azure/aks/concepts-ai-ml-language-models | | Plan IP address space for Azure AKS clusters | https://learn.microsoft.com/en-us/azure/aks/concepts-network-ip-address-planning | | Choose and understand legacy CNI options in Azure Kubernetes Service | https://learn.microsoft.com/en-us/azure/aks/concepts-network-legacy-cni | +| Understand AKS Confidential Containers preview capabilities and alternatives | https://learn.microsoft.com/en-us/azure/aks/confidential-containers-overview | | Use Azure Advisor recommendations for AKS cost | https://learn.microsoft.com/en-us/azure/aks/cost-advisors | | Migrate from Dapr OSS to Dapr extension on AKS | https://learn.microsoft.com/en-us/azure/aks/dapr-migration | | Compare AWS and Azure platforms for EDW workloads | https://learn.microsoft.com/en-us/azure/aks/eks-edw-understand | @@ -139,9 +143,11 @@ This skill requires **network access** 
to fetch documentation content: | Use Arm64 node pools in AKS for cost efficiency | https://learn.microsoft.com/en-us/azure/aks/use-arm64-vms | | Decide and use Azure Linux node pools on AKS | https://learn.microsoft.com/en-us/azure/aks/use-azure-linux | | Use capacity reservation groups with AKS node pools | https://learn.microsoft.com/en-us/azure/aks/use-capacity-reservation-groups | +| Migrate AKS clusters from KMS v1 to KMS v2 | https://learn.microsoft.com/en-us/azure/aks/use-kms-v2 | | Choose and use AKS Virtual Machines node pools | https://learn.microsoft.com/en-us/azure/aks/virtual-machines-node-pools | | Use and migrate from Windows Server Annual Channel on AKS | https://learn.microsoft.com/en-us/azure/aks/windows-annual-channel | | Plan Windows vs Linux container workloads on AKS | https://learn.microsoft.com/en-us/azure/aks/windows-vs-linux-containers | +| Choose migration approach to Entra Workload ID for AKS | https://learn.microsoft.com/en-us/azure/aks/workload-identity-migrate-from-pod-identity | | Choose the right Azure Kubernetes Fleet Manager configuration | https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-choosing-fleet | | Select Azure Kubernetes Fleet Manager member cluster types | https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-member-cluster-types | | Migrate Kubernetes update workflows from Terragrunt/Terraform to Azure Kubernetes Fleet Manager | https://learn.microsoft.com/en-us/azure/kubernetes-fleet/howto-migrate-updates-from-terraform | @@ -165,7 +171,6 @@ This skill requires **network access** to fetch documentation content: | Use proximity placement groups to reduce AKS latency | https://learn.microsoft.com/en-us/azure/aks/reduce-latency-ppg | | Multi-region AKS deployment models and trade-offs | https://learn.microsoft.com/en-us/azure/aks/reliability-multi-region-deployment-models | | Use stateful workload upgrade patterns on AKS | 
https://learn.microsoft.com/en-us/azure/aks/stateful-workload-upgrades | -| Design and use system vs user node pools in AKS | https://learn.microsoft.com/en-us/azure/aks/use-system-pools | | Scale AKS workloads with virtual nodes and ACI | https://learn.microsoft.com/en-us/azure/aks/virtual-nodes | | Manage workload drift with Fleet applyStrategy | https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-placement-drift | | Control Fleet takeover of existing workloads | https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-placement-takeover | @@ -180,16 +185,18 @@ This skill requires **network access** to fetch documentation content: | Create private AKS Automatic cluster with readiness SLA | https://learn.microsoft.com/en-us/azure/aks/automatic/quick-automatic-private-custom-network | | Understand AKS preview API lifecycle and deprecation timing | https://learn.microsoft.com/en-us/azure/aks/concepts-preview-api-life-cycle | | Configure static block allocation for Azure CNI Pod Subnet | https://learn.microsoft.com/en-us/azure/aks/configure-azure-cni-static-block-allocation | +| Configure custom certificate authorities on AKS nodes | https://learn.microsoft.com/en-us/azure/aks/custom-certificate-authority | | AKS FAQ with node, pod, and quota limits | https://learn.microsoft.com/en-us/azure/aks/faq | -| Understand AKS identity binding limits and scale | https://learn.microsoft.com/en-us/azure/aks/identity-bindings-concepts | +| Scale AKS workload identity with identity bindings | https://learn.microsoft.com/en-us/azure/aks/identity-bindings-concepts | | Compare latency impact across AKS Istio add-on versions | https://learn.microsoft.com/en-us/azure/aks/istio-latency | | Understand Istio add-on performance, capacity, and scaling limits on AKS | https://learn.microsoft.com/en-us/azure/aks/istio-scale | | Use long-term support channels for AKS Kubernetes versions | https://learn.microsoft.com/en-us/azure/aks/long-term-support | -| Understand AKS 
NAT gateway outbound flow limits | https://learn.microsoft.com/en-us/azure/aks/nat-gateway | +| Configure AKS NAT gateway outbound flow limits | https://learn.microsoft.com/en-us/azure/aks/nat-gateway | | Understand AKS node resource reservations and capacity | https://learn.microsoft.com/en-us/azure/aks/node-resource-reservations | | AKS resource limits, quotas, SKUs, and region availability | https://learn.microsoft.com/en-us/azure/aks/quotas-skus-regions | | AKS support policies and platform limitations | https://learn.microsoft.com/en-us/azure/aks/support-policies | | AKS supported Kubernetes versions and lifecycle rules | https://learn.microsoft.com/en-us/azure/aks/supported-kubernetes-versions | +| Use Entra pod-managed identities in AKS | https://learn.microsoft.com/en-us/azure/aks/use-azure-ad-pod-identity | | Scale AKS with multiple Standard Load Balancers | https://learn.microsoft.com/en-us/azure/aks/use-multiple-standard-load-balancer | | Kubernetes version support policy for Azure Kubernetes Fleet Manager | https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-fleet-kubernetes-version-support | | Use Fleet placement snapshots and history limits | https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-placement-snapshots | @@ -198,15 +205,15 @@ This skill requires **network access** to fetch documentation content: ### Security | Topic | URL | |-------|-----| -| Use Conditional Access for AKS control plane and nodes | https://learn.microsoft.com/en-us/azure/aks/access-control-managed-azure-ad | +| Configure Conditional Access for AKS clusters and nodes | https://learn.microsoft.com/en-us/azure/aks/access-control-managed-azure-ad | | Access private AKS clusters using command invoke and Run command | https://learn.microsoft.com/en-us/azure/aks/access-private-cluster | | Configure service account and workload identity for Agentic CLI | 
https://learn.microsoft.com/en-us/azure/aks/agentic-cli-for-aks-service-account-workload-identity-setup | -| Set up RBAC permissions for AKS desktop users | https://learn.microsoft.com/en-us/azure/aks/aks-desktop-permissions | +| Configure RBAC permissions for AKS Desktop roles | https://learn.microsoft.com/en-us/azure/aks/aks-desktop-permissions | +| Reference AKS service permissions and required roles | https://learn.microsoft.com/en-us/azure/aks/aks-service-permissions | | Secure AKS API server with authorized IP ranges | https://learn.microsoft.com/en-us/azure/aks/api-server-authorized-ip-ranges | | Use service tags for AKS API authorized IP ranges | https://learn.microsoft.com/en-us/azure/aks/api-server-service-tags | | Configure TLS for AKS Gateway API with Key Vault | https://learn.microsoft.com/en-us/azure/aks/app-routing-gateway-api-tls | -| Use Entra groups with Kubernetes RBAC in AKS | https://learn.microsoft.com/en-us/azure/aks/azure-ad-rbac | -| Configure customer-managed disk encryption keys for AKS | https://learn.microsoft.com/en-us/azure/aks/azure-disk-customer-managed-keys | +| Encrypt AKS managed disks with customer-managed keys | https://learn.microsoft.com/en-us/azure/aks/azure-disk-customer-managed-keys | | Manage AKS certificate rotation and autorotation | https://learn.microsoft.com/en-us/azure/aks/certificate-rotation | | Harden AKS Azure Linux 2.0 nodes to CIS benchmark | https://learn.microsoft.com/en-us/azure/aks/cis-azure-linux | | Harden AKS Azure Linux 3.0 nodes to CIS benchmark | https://learn.microsoft.com/en-us/azure/aks/cis-azure-linux-v3 | @@ -214,26 +221,26 @@ This skill requires **network access** to fetch documentation content: | Align AKS Ubuntu node image with CIS benchmark | https://learn.microsoft.com/en-us/azure/aks/cis-ubuntu | | Align AKS Windows node image with CIS benchmark | https://learn.microsoft.com/en-us/azure/aks/cis-windows | | Configure secure ACR authentication for AKS clusters | 
https://learn.microsoft.com/en-us/azure/aks/cluster-container-registry-integration | -| Control kubeconfig access using Azure RBAC for AKS | https://learn.microsoft.com/en-us/azure/aks/control-kubeconfig-access | -| Secure AKS Key Vault access with CSI identity options | https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-identity-access | -| Configure custom certificate authorities on AKS nodes | https://learn.microsoft.com/en-us/azure/aks/custom-certificate-authority | -| Enable AKS-managed Microsoft Entra integration | https://learn.microsoft.com/en-us/azure/aks/enable-authentication-microsoft-entra-id | -| Enable FIPS 140-2 compliant node pools in AKS | https://learn.microsoft.com/en-us/azure/aks/enable-fips-nodes | +| Restrict kubeconfig access in AKS using Azure RBAC | https://learn.microsoft.com/en-us/azure/aks/control-kubeconfig-access | +| Choose identity models for AKS Secrets Store CSI to access Key Vault | https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-identity-access | +| Create FIPS 140-2 compliant AKS node pools | https://learn.microsoft.com/en-us/azure/aks/enable-fips-nodes | | Enable host-based encryption for AKS node VMs | https://learn.microsoft.com/en-us/azure/aks/enable-host-encryption | -| Configure external identity providers for AKS structured authentication | https://learn.microsoft.com/en-us/azure/aks/external-identity-provider-authentication-configure | -| Understand AKS structured authentication with external IdPs | https://learn.microsoft.com/en-us/azure/aks/external-identity-provider-authentication-overview | -| Set up identity bindings for AKS workload identity | https://learn.microsoft.com/en-us/azure/aks/identity-bindings | -| Validate signed container images with AKS Image Integrity | https://learn.microsoft.com/en-us/azure/aks/image-integrity | +| Authorize AKS Kubernetes API with Entra ID RBAC | https://learn.microsoft.com/en-us/azure/aks/entra-id-authorization | +| Enable Entra ID authentication for AKS 
control plane | https://learn.microsoft.com/en-us/azure/aks/entra-id-control-plane-authentication | +| Configure external JWT identity providers for AKS | https://learn.microsoft.com/en-us/azure/aks/external-identity-provider-authentication-configure | +| Configure WireGuard encryption on AKS with ACNS | https://learn.microsoft.com/en-us/azure/aks/how-to-apply-wireguard | +| Set up AKS identity bindings for workload identity | https://learn.microsoft.com/en-us/azure/aks/identity-bindings | +| Enforce signed container image validation with AKS Image Integrity | https://learn.microsoft.com/en-us/azure/aks/image-integrity | | Restrict AKS pod access to IMDS endpoint | https://learn.microsoft.com/en-us/azure/aks/imds-restriction | | Enable Istio CNI for secure Istio add-on workloads on AKS | https://learn.microsoft.com/en-us/azure/aks/istio-cni | | Plug external CA certificates into AKS Istio add-on | https://learn.microsoft.com/en-us/azure/aks/istio-plugin-ca | | Configure HTTPS and mTLS secure ingress gateways for AKS Istio add-on | https://learn.microsoft.com/en-us/azure/aks/istio-secure-gateway | | Secure KEDA-based autoscaling on AKS with workload identity | https://learn.microsoft.com/en-us/azure/aks/keda-workload-identity | -| Enable KMS data encryption for AKS secrets | https://learn.microsoft.com/en-us/azure/aks/kms-data-encryption | -| Authenticate to AKS using kubelogin and Entra ID | https://learn.microsoft.com/en-us/azure/aks/kubelogin-authentication | -| Use Microsoft Entra service principals with AKS | https://learn.microsoft.com/en-us/azure/aks/kubernetes-service-principal | -| Configure Azure RBAC for Kubernetes authorization in AKS | https://learn.microsoft.com/en-us/azure/aks/manage-azure-rbac | -| Manage AKS local accounts with Entra integration | https://learn.microsoft.com/en-us/azure/aks/manage-local-accounts-managed-azure-ad | +| Enable KMS-based secret encryption for AKS clusters | https://learn.microsoft.com/en-us/azure/aks/kms-data-encryption 
| +| Use kubelogin for Entra-based AKS authentication | https://learn.microsoft.com/en-us/azure/aks/kubelogin-authentication | +| Use Entra groups with Kubernetes RBAC in AKS | https://learn.microsoft.com/en-us/azure/aks/kubernetes-rbac-entra-id | +| Configure Entra service principals for AKS clusters | https://learn.microsoft.com/en-us/azure/aks/kubernetes-service-principal | +| Manage AKS local admin accounts with Entra integration | https://learn.microsoft.com/en-us/azure/aks/local-accounts | | Configure and manage SSH access to AKS nodes | https://learn.microsoft.com/en-us/azure/aks/manage-ssh-node-access | | Securely connect to AKS cluster nodes for maintenance | https://learn.microsoft.com/en-us/azure/aks/node-access | | Configure networking and RBAC for NAP-enabled AKS clusters | https://learn.microsoft.com/en-us/azure/aks/node-auto-provisioning-networking | @@ -253,31 +260,28 @@ This skill requires **network access** to fetch documentation content: | Use built-in Azure Policy definitions for AKS | https://learn.microsoft.com/en-us/azure/aks/policy-reference | | Use pre-created kubelet managed identity in AKS | https://learn.microsoft.com/en-us/azure/aks/pre-created-kubelet-managed-identity | | Configure network access paths to private AKS clusters | https://learn.microsoft.com/en-us/azure/aks/private-cluster-connect | -| Configure PIM-based just-in-time access to AKS | https://learn.microsoft.com/en-us/azure/aks/privileged-identity-management | +| Set up PIM-based just-in-time access for AKS | https://learn.microsoft.com/en-us/azure/aks/privileged-identity-management | | RDP and SSH access to AKS Windows nodes | https://learn.microsoft.com/en-us/azure/aks/rdp | -| Harden AKS containers with namespaces, AppArmor, seccomp | https://learn.microsoft.com/en-us/azure/aks/secure-container-access | +| Harden AKS containers with user namespaces, AppArmor, seccomp | https://learn.microsoft.com/en-us/azure/aks/secure-container-access | | Use Azure Policy regulatory 
controls for AKS | https://learn.microsoft.com/en-us/azure/aks/security-controls-policy | | Configure system-assigned managed identity for AKS clusters | https://learn.microsoft.com/en-us/azure/aks/system-assigned-managed-identity | -| Configure Trusted Access for secure AKS API access | https://learn.microsoft.com/en-us/azure/aks/trusted-access-feature | -| Rotate AKS service principal and Entra credentials | https://learn.microsoft.com/en-us/azure/aks/update-credentials | -| Update Key Vault mode for AKS KMS etcd | https://learn.microsoft.com/en-us/azure/aks/update-kms-key-vault | -| Use Entra pod-managed identities in AKS | https://learn.microsoft.com/en-us/azure/aks/use-azure-ad-pod-identity | -| Secure AKS clusters using Azure Policy add-on | https://learn.microsoft.com/en-us/azure/aks/use-azure-policy | -| Run AKS workloads on Confidential VMs | https://learn.microsoft.com/en-us/azure/aks/use-cvm | -| Enable GMSA for Windows pods on AKS | https://learn.microsoft.com/en-us/azure/aks/use-group-managed-service-accounts | +| Configure Trusted Access from Azure services to AKS | https://learn.microsoft.com/en-us/azure/aks/trusted-access-feature | +| Rotate AKS service principal and Entra app credentials | https://learn.microsoft.com/en-us/azure/aks/update-credentials | +| Change Key Vault connectivity mode for AKS KMS | https://learn.microsoft.com/en-us/azure/aks/update-kms-key-vault | +| Secure AKS clusters with Azure Policy definitions | https://learn.microsoft.com/en-us/azure/aks/use-azure-policy | +| Enable GMSA for Windows nodes in AKS | https://learn.microsoft.com/en-us/azure/aks/use-group-managed-service-accounts | | Configure legacy KMS etcd encryption in AKS | https://learn.microsoft.com/en-us/azure/aks/use-kms-etcd-encryption | | Configure AKS network policies to secure pod traffic | https://learn.microsoft.com/en-us/azure/aks/use-network-policies | | Configure OIDC issuer and provider for AKS clusters | 
https://learn.microsoft.com/en-us/azure/aks/use-oidc-issuer | | Deploy and use pod sandboxing in AKS | https://learn.microsoft.com/en-us/azure/aks/use-pod-sandboxing | +| Configure Pod Security Admission policies in AKS | https://learn.microsoft.com/en-us/azure/aks/use-psa | | Enable Trusted Launch security for AKS nodes | https://learn.microsoft.com/en-us/azure/aks/use-trusted-launch | | Enable user-assigned managed identity for AKS clusters | https://learn.microsoft.com/en-us/azure/aks/user-assigned-managed-identity | | Configure cross-tenant workload identity for AKS | https://learn.microsoft.com/en-us/azure/aks/workload-identity-cross-tenant | | Configure AKS cluster with Entra Workload ID | https://learn.microsoft.com/en-us/azure/aks/workload-identity-deploy-cluster | -| Migrate AKS pods from pod identity to Workload ID | https://learn.microsoft.com/en-us/azure/aks/workload-identity-migrate-from-pod-identity | | Use Microsoft Entra Workload ID with AKS workloads | https://learn.microsoft.com/en-us/azure/aks/workload-identity-overview | | Configure RBAC roles for Azure Kubernetes Fleet Manager | https://learn.microsoft.com/en-us/azure/kubernetes-fleet/concepts-rbac | | Use managed identities securely with Azure Kubernetes Fleet Manager | https://learn.microsoft.com/en-us/azure/kubernetes-fleet/use-managed-identity | -| Configure legacy Entra integration for AKS via CLI | https://learn.microsoft.com/en-us/previous-versions/azure/aks/azure-ad-integration-cli | ### Configuration | Topic | URL | @@ -288,6 +292,7 @@ This skill requires **network access** to fetch documentation content: | Configure storage and secrets to deploy Airflow on AKS with Helm | https://learn.microsoft.com/en-us/azure/aks/airflow-deploy | | Configure AKS Communication Manager for maintenance notifications | https://learn.microsoft.com/en-us/azure/aks/aks-communication-manager | | Manage AKS component versioning and patching strategy | 
https://learn.microsoft.com/en-us/azure/aks/aks-component-versioning |
+| Configure AKS clusters for AKS Desktop compatibility | https://learn.microsoft.com/en-us/azure/aks/aks-desktop-install-cluster-setup |
 | Configure AKS-managed GPU node pools | https://learn.microsoft.com/en-us/azure/aks/aks-managed-gpu-nodes |
 | Set up custom domains and SSL for AKS application routing | https://learn.microsoft.com/en-us/azure/aks/app-routing-dns-ssl |
 | Configure multiple NGINX ingress controllers and annotations in AKS | https://learn.microsoft.com/en-us/azure/aks/app-routing-nginx-configuration |
@@ -296,7 +301,6 @@ This skill requires **network access** to fetch documentation content:
 | Configure AKS workloads with App Configuration provider | https://learn.microsoft.com/en-us/azure/aks/azure-app-configuration-quickstart |
 | Tune Azure App Configuration extension settings for AKS | https://learn.microsoft.com/en-us/azure/aks/azure-app-configuration-settings |
 | Configure Azure CNI Overlay networking in AKS | https://learn.microsoft.com/en-us/azure/aks/azure-cni-overlay |
-| Expand pod CIDR space for Azure CNI Overlay in AKS | https://learn.microsoft.com/en-us/azure/aks/azure-cni-overlay-pod-expand |
 | Configure Azure CNI Powered by Cilium in AKS | https://learn.microsoft.com/en-us/azure/aks/azure-cni-powered-by-cilium |
 | Configure Azure NetApp Files for AKS pods | https://learn.microsoft.com/en-us/azure/aks/azure-netapp-files |
 | Provision Azure NetApp Files dual-protocol volumes | https://learn.microsoft.com/en-us/azure/aks/azure-netapp-files-dual-protocol |
@@ -329,7 +333,7 @@ This skill requires **network access** to fetch documentation content:
 | Create infrastructure for HA PostgreSQL on AKS with CloudNativePG | https://learn.microsoft.com/en-us/azure/aks/create-postgresql-ha |
 | Create Azure infrastructure for Valkey clusters on AKS | https://learn.microsoft.com/en-us/azure/aks/create-valkey-infrastructure |
 | Configure Azure Disk persistent volumes on AKS | https://learn.microsoft.com/en-us/azure/aks/create-volume-azure-disk |
-| Configure Azure Key Vault CSI provider options on AKS | https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-configuration-options |
+| Configure Azure Key Vault provider options for Secrets Store CSI on AKS | https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-configuration-options |
 | Configure AKS node OS and kubelet settings | https://learn.microsoft.com/en-us/azure/aks/custom-node-configuration |
 | Customize resource configuration for AKS managed add-ons | https://learn.microsoft.com/en-us/azure/aks/customize-resource-configuration |
 | Configure Dapr extension settings on AKS and Arc | https://learn.microsoft.com/en-us/azure/aks/dapr-settings |
@@ -344,13 +348,12 @@ This skill requires **network access** to fetch documentation content:
 | Create infrastructure for highly available GitHub Actions on AKS with Azure Files | https://learn.microsoft.com/en-us/azure/aks/github-actions-azure-files-create-infrastructure |
 | Configure FQDN filtering policies with ACNS on AKS | https://learn.microsoft.com/en-us/azure/aks/how-to-apply-fqdn-filtering-policies |
 | Configure L7 network policies with ACNS on AKS | https://learn.microsoft.com/en-us/azure/aks/how-to-apply-l7-policies |
-| Configure WireGuard encryption for ACNS on AKS clusters | https://learn.microsoft.com/en-us/azure/aks/how-to-apply-wireguard |
 | Configure and use Container Network Insights Agent on AKS | https://learn.microsoft.com/en-us/azure/aks/how-to-configure-container-network-insights-agent |
 | Set up container network flow logs with Advanced Container Networking | https://learn.microsoft.com/en-us/azure/aks/how-to-configure-container-network-logs |
 | Configure container network metrics filtering in AKS with Cilium | https://learn.microsoft.com/en-us/azure/aks/how-to-configure-container-network-metrics-filtering |
-| Enable eBPF host routing with ACNS on AKS | https://learn.microsoft.com/en-us/azure/aks/how-to-enable-ebpf-host-routing |
+| Enable eBPF host routing on AKS with ACNS | https://learn.microsoft.com/en-us/azure/aks/how-to-enable-ebpf-host-routing |
 | Configure HTTP proxy settings for AKS nodes | https://learn.microsoft.com/en-us/azure/aks/http-proxy |
-| Configure Image Cleaner to remove stale AKS images | https://learn.microsoft.com/en-us/azure/aks/image-cleaner |
+| Configure and run Image Cleaner to remove stale AKS images | https://learn.microsoft.com/en-us/azure/aks/image-cleaner |
 | Create and use internal load balancers in AKS | https://learn.microsoft.com/en-us/azure/aks/internal-lb |
 | Deploy and configure egress gateways for AKS Istio add-on | https://learn.microsoft.com/en-us/azure/aks/istio-deploy-egress |
 | Configure external and internal ingress gateways for AKS Istio add-on | https://learn.microsoft.com/en-us/azure/aks/istio-deploy-ingress |
@@ -361,7 +364,6 @@ This skill requires **network access** to fetch documentation content:
 | Configure monitoring and networking for Kafka on AKS using Strimzi | https://learn.microsoft.com/en-us/azure/aks/kafka-configure |
 | Deploy Strimzi and a Kafka cluster on AKS | https://learn.microsoft.com/en-us/azure/aks/kafka-deploy |
 | Prepare Azure infrastructure for Kafka on AKS with Strimzi | https://learn.microsoft.com/en-us/azure/aks/kafka-infrastructure |
-| Monitor AKS legacy KMS etcd encryption metrics | https://learn.microsoft.com/en-us/azure/aks/kms-observability |
 | Manage AKS Kubernetes resources through Azure portal UI | https://learn.microsoft.com/en-us/azure/aks/kubernetes-portal |
 | Install and configure Kueue for batch on AKS | https://learn.microsoft.com/en-us/azure/aks/kueue-overview |
 | Limit AKS egress traffic using Azure Firewall | https://learn.microsoft.com/en-us/azure/aks/limit-egress-traffic |
@@ -395,7 +397,9 @@ This skill requires **network access** to fetch documentation content:
 | Upgrade Windows OS versions for AKS workloads | https://learn.microsoft.com/en-us/azure/aks/upgrade-windows-os |
 | Configure Advanced Container Networking Services on AKS | https://learn.microsoft.com/en-us/azure/aks/use-advanced-container-networking-services |
 | Configure AMD GPU node pools on AKS | https://learn.microsoft.com/en-us/azure/aks/use-amd-gpus |
+| Run AKS node pools on Azure Dedicated Hosts | https://learn.microsoft.com/en-us/azure/aks/use-azure-dedicated-hosts |
 | Configure custom CNI plugins for Azure Kubernetes Service | https://learn.microsoft.com/en-us/azure/aks/use-byo-cni |
+| Create and configure AKS node pools with Confidential VMs | https://learn.microsoft.com/en-us/azure/aks/use-cvm |
 | Use eTags for concurrency control in AKS APIs | https://learn.microsoft.com/en-us/azure/aks/use-etags |
 | Use Kubernetes node pool labels effectively in AKS | https://learn.microsoft.com/en-us/azure/aks/use-labels |
 | Configure Metrics Server VPA on AKS clusters | https://learn.microsoft.com/en-us/azure/aks/use-metrics-server-vertical-pod-autoscaler |
@@ -405,7 +409,6 @@ This skill requires **network access** to fetch documentation content:
 | Configure NVIDIA GPU node pools on AKS | https://learn.microsoft.com/en-us/azure/aks/use-nvidia-gpu |
 | Configure Premium SSD v2 disks for AKS workloads | https://learn.microsoft.com/en-us/azure/aks/use-premium-v2-disks |
 | Configure Pod Security Admission in AKS clusters | https://learn.microsoft.com/en-us/azure/aks/use-psa |
-| Configure Pod Security Admission in AKS clusters | https://learn.microsoft.com/en-us/azure/aks/use-psa |
 | Configure and apply Azure tags in AKS | https://learn.microsoft.com/en-us/azure/aks/use-tags |
 | Enable and configure Ultra Disks on AKS | https://learn.microsoft.com/en-us/azure/aks/use-ultra-disks |
 | Deploy and manage Vertical Pod Autoscaler on AKS | https://learn.microsoft.com/en-us/azure/aks/use-vertical-pod-autoscaler |
@@ -443,7 +446,7 @@ This skill requires **network access** to fetch documentation content:
 | Integrate Azure HPC Cache with AKS workloads | https://learn.microsoft.com/en-us/azure/aks/azure-hpc-cache |
 | Mount Azure Blob storage in AKS using CSI | https://learn.microsoft.com/en-us/azure/aks/create-volume-azure-blob-storage |
 | Use Azure Files CSI volumes with AKS pods | https://learn.microsoft.com/en-us/azure/aks/create-volume-azure-files |
-| Integrate AKS with Azure Key Vault via CSI driver | https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-driver |
+| Use Azure Key Vault provider for Secrets Store CSI on AKS | https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-driver |
 | Configure Secrets Store CSI with NGINX TLS on AKS | https://learn.microsoft.com/en-us/azure/aks/csi-secrets-store-nginx-tls |
 | Refactor EDW application code to use Azure services | https://learn.microsoft.com/en-us/azure/aks/eks-edw-refactor |
 | Integrate AKS Istio add-on metrics with Azure Managed Prometheus | https://learn.microsoft.com/en-us/azure/aks/istio-metrics-managed-prometheus |
@@ -473,7 +476,7 @@ This skill requires **network access** to fetch documentation content:
 | Programmatically deploy Azure Kubernetes applications with Azure CLI | https://learn.microsoft.com/en-us/azure/aks/deploy-application-az-cli |
 | Deploy Azure Kubernetes applications using ARM templates | https://learn.microsoft.com/en-us/azure/aks/deploy-application-template |
 | Deploy production AKS clusters with Terraform AVM | https://learn.microsoft.com/en-us/azure/aks/deploy-cluster-terraform-verified-module |
-| Deploy AKS clusters with Confidential Containers | https://learn.microsoft.com/en-us/azure/aks/deploy-confidential-containers-default-policy |
+| Deploy AKS clusters with Confidential Containers and default policy | https://learn.microsoft.com/en-us/azure/aks/deploy-confidential-containers-default-policy |
 | Deploy Kubernetes applications from Azure Marketplace to AKS | https://learn.microsoft.com/en-us/azure/aks/deploy-marketplace |
 | Deploy a Ray cluster on AKS with KubeRay | https://learn.microsoft.com/en-us/azure/aks/deploy-ray |
 | Deploy Ray with BlobFuse storage on AKS | https://learn.microsoft.com/en-us/azure/aks/deploy-ray-tuning |
@@ -508,7 +511,5 @@ This skill requires **network access** to fetch documentation content:
 | Plan and manage AKS cluster and component upgrades | https://learn.microsoft.com/en-us/azure/aks/upgrade-cluster-components |
 | Understand AKS rolling upgrade process for clusters | https://learn.microsoft.com/en-us/azure/aks/upgrade-conceptual |
 | Automate AKS node upgrades using GitHub Actions | https://learn.microsoft.com/en-us/azure/aks/upgrade-github-actions |
-| Use Azure Dedicated Hosts for AKS nodes | https://learn.microsoft.com/en-us/azure/aks/use-azure-dedicated-hosts |
 | Run Flyte data and ML pipelines on AKS | https://learn.microsoft.com/en-us/azure/aks/use-flyte |
-| Migrate AKS clusters from KMS v1 to v2 | https://learn.microsoft.com/en-us/azure/aks/use-kms-v2 |
 | Deploy wasmCloud on AKS for distributed Wasm apps | https://learn.microsoft.com/en-us/azure/aks/wasmcloud |
\ No newline at end of file
diff --git a/skills/azure-language-service/SKILL.md b/skills/azure-language-service/SKILL.md
index c3bbbed2..c536cc52 100644
--- a/skills/azure-language-service/SKILL.md
+++ b/skills/azure-language-service/SKILL.md
@@ -1,9 +1,9 @@
 ---
 name: azure-language-service
-description: Expert knowledge for Azure AI Language development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when building CLU intents, custom NER, text classification, CQA, sentiment/summarization, or health text solutions, and other Azure AI Language related development tasks. Not for Azure AI Search (use azure-cognitive-search), Azure AI Document Intelligence (use azure-document-intelligence), Azure AI Speech (use azure-speech), Azure Translator (use azure-translator).
+description: Expert knowledge for Azure AI Language development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when building CLU apps, custom NER/PII, sentiment/summarization, CQA/QnA, or containerized Language workloads, and other Azure AI Language related development tasks. Not for Azure AI Search (use azure-cognitive-search), Azure AI Speech (use azure-speech), Azure Translator (use azure-translator), Azure AI Bot Service (use azure-bot-service).
 compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
 metadata:
-  generated_at: "2026-04-12"
+  generated_at: "2026-04-26"
   generator: "docs2skills/1.0.0"
 ---
 # Azure AI Language Skill
@@ -25,14 +25,14 @@ This skill requires **network access** to fetch documentation content:
 | Category | Lines | Description |
 |----------|-------|-------------|
 | Troubleshooting | L37-L42 | Diagnosing and fixing common errors, low-accuracy results, and configuration issues in custom text classification and custom question answering projects in Azure AI Language. |
-| Best Practices | L43-L60 | Best practices for designing, labeling, and evaluating CLU, custom NER, text classification, and CQA projects, including multilingual handling, emojis, schemas, and autolabeling. |
+| Best Practices | L43-L60 | Best practices for designing, labeling, and evaluating CLU, custom NER, text classification, and custom question answering projects, including multilingual handling, emojis, and data/schema prep. |
 | Decision Making | L61-L70 | Guidance on choosing regions and resources, lifecycle policies, and migration paths from LUIS, QnA Maker, Text Analytics, and Language Studio to Azure Language and Microsoft Foundry |
 | Architecture & Design Patterns | L71-L77 | Architectural guidance for CLU and custom text classification: choosing CLU vs orchestration workflows, and designing regional backup, redundancy, and failover strategies. |
-| Limits & Quotas | L78-L95 | Limits, quotas, and language/region support for Azure AI Language features (CLU, NER, classification, PII, CQA), including data size, rate, throughput, and container request limits. |
+| Limits & Quotas | L78-L95 | Limits, quotas, and language/region support for Azure AI Language features (CLU, NER, classification, QnA, PII, containers), including data size, rate, and throughput constraints. |
 | Security | L96-L105 | Security for Azure AI Language: encryption at rest, customer-managed keys, RBAC, managed identities, SAS tokens, and network isolation/Private Link for CQA resources. |
-| Configuration | L106-L133 | Configuring Azure AI Language projects and containers: CLU, custom NER, text classification, CQA, sentiment, summarization, health, data formats, resources, and runtime settings. |
-| Integrations & Coding Patterns | L134-L165 | How to call Azure Language/CLU/Health/Summarization/CQA APIs and SDKs, wire them into bots, Power Automate, and Foundry, and correctly handle async, parameters, and outputs |
-| Deployment | L166-L175 | How to deploy and run Azure AI Language models (custom classification, NER, QnA, key phrases, language detection) across regions, containers, AKS, and migrate projects/resources. |
+| Configuration | L106-L134 | Configuring Azure AI Language projects and containers: resources, data formats, training, evaluation, entities/NER, PII, CQA behavior, diagnostics, and on-prem Docker setups. |
+| Integrations & Coding Patterns | L135-L168 | Coding examples and API guides for calling Azure Language features (CLU, NER, PII, sentiment, summarization, health, CQA) via SDK/REST and integrating with bots, storage, and Power Automate. |
+| Deployment | L169-L178 | How to deploy and run Azure AI Language models (custom classification, NER, QnA, key phrases, language detection) across regions, containers, AKS, and migrate projects/resources. |
 ### Troubleshooting
 | Topic | URL |
 |-------|-----|
@@ -56,7 +56,7 @@ This skill requires **network access** to fetch documentation content:
 | Label data effectively for custom text classification | https://learn.microsoft.com/en-us/azure/ai-services/language-service/custom-text-classification/how-to/tag-data |
 | Implement best practices for CQA project quality | https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/concepts/best-practices |
 | Apply project authoring best practices in CQA | https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/best-practices |
-| Apply document format guidelines for CQA imports | https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/reference/document-format-guidelines |
+| Format documents for custom question answering imports | https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/reference/document-format-guidelines |
 ### Decision Making
 | Topic | URL |
@@ -91,7 +91,7 @@ This skill requires **network access** to fetch documentation content:
 | Review language support for Named Entity Recognition | https://learn.microsoft.com/en-us/azure/ai-services/language-service/named-entity-recognition/language-support |
 | Review orchestration workflow data and throughput limits | https://learn.microsoft.com/en-us/azure/ai-services/language-service/orchestration-workflow/service-limits |
 | Apply PII container per-call character and document limits | https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/use-containers |
-| Service limits and boundaries for CQA projects | https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/concepts/limits |
+| Custom question answering service limits and quotas | https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/concepts/limits |
 ### Security
 | Topic | URL |
@@ -119,6 +119,7 @@ This skill requires **network access** to fetch documentation content:
 | View and interpret evaluation metrics for text classification models | https://learn.microsoft.com/en-us/azure/ai-services/language-service/custom-text-classification/how-to/view-model-evaluation |
 | Map NER entity types and tags across API versions | https://learn.microsoft.com/en-us/azure/ai-services/language-service/named-entity-recognition/concepts/ga-preview-mapping |
 | Configure NER skill parameters and inference options | https://learn.microsoft.com/en-us/azure/ai-services/language-service/named-entity-recognition/how-to/skill-parameters |
+| Check language support for Azure PII detection | https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/language-support |
 | Understand and configure confidence scores in CQA | https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/concepts/confidence-score |
 | Enable diagnostics and run analytics for CQA projects | https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/analytics |
 | Customize default answer behavior in CQA projects | https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/change-default-answer |
@@ -148,8 +149,10 @@ This skill requires **network access** to fetch documentation content:
 | Implement language detection using SDKs and REST | https://learn.microsoft.com/en-us/azure/ai-services/language-service/language-detection/quickstart |
 | Call the NER API to extract named entities | https://learn.microsoft.com/en-us/azure/ai-services/language-service/named-entity-recognition/how-to-call |
 | Use the NER client library to extract entities | https://learn.microsoft.com/en-us/azure/ai-services/language-service/named-entity-recognition/quickstart |
-| Use native document support with Language APIs | https://learn.microsoft.com/en-us/azure/ai-services/language-service/native-document-support/overview |
-| Use the CQA Authoring API for automated management | https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/authoring |
+| Use conversation PII API for chats and transcripts | https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/redact-conversation-pii |
+| Integrate document-based PII redaction with Blob Storage | https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/redact-document-pii |
+| Call Azure Text PII redaction APIs from code | https://learn.microsoft.com/en-us/azure/ai-services/language-service/personally-identifiable-information/how-to/redact-text-pii |
+| Automate CQA projects with Authoring REST API | https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/authoring |
 | Call the prebuilt CQA API for ad-hoc answering | https://learn.microsoft.com/en-us/azure/ai-services/language-service/question-answering/how-to/prebuilt |
 | Call Sentiment and Opinion Mining APIs | https://learn.microsoft.com/en-us/azure/ai-services/language-service/sentiment-opinion-mining/how-to/call-api |
 | Call Sentiment Analysis via SDK and REST | https://learn.microsoft.com/en-us/azure/ai-services/language-service/sentiment-opinion-mining/quickstart |
diff --git a/skills/azure-local/SKILL.md b/skills/azure-local/SKILL.md
index 43687de9..7c2209c5 100644
--- a/skills/azure-local/SKILL.md
+++ b/skills/azure-local/SKILL.md
@@ -1,9 +1,9 @@
 ---
 name: azure-local
-description: Expert knowledge for Azure Local development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when planning Azure Local racks/SDN, configuring disconnected clusters, securing RBAC/PKI, or migrating VMs/AKS, and other Azure Local related development tasks. Not for Microsoft Foundry Local (use microsoft-foundry-local), Azure Stack Edge (use azure-stack-edge), Azure Kubernetes Service Edge Essentials (use azure-aks-edge-essentials), Azure IoT Edge (use azure-iot-edge).
+description: Expert knowledge for Azure Local development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when planning Azure Local racks/SDN, AKS/VM workloads, Arc integration, SAN/Fibre Channel, or disconnected sites, and other Azure Local related development tasks. Not for Microsoft Foundry Local (use microsoft-foundry-local), Azure Stack Edge (use azure-stack-edge), Azure Arc (use azure-arc).
 compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
 metadata:
-  generated_at: "2026-04-19"
+  generated_at: "2026-04-26"
   generator: "docs2skills/1.0.0"
 ---
 # Azure Local Skill
@@ -24,264 +24,305 @@ This skill requires **network access** to fetch documentation content:
 | Category | Lines | Description |
 |----------|-------|-------------|
-| Troubleshooting | L37-L67 | Diagnosing and fixing Azure Local issues: deployment/upgrade failures, SDN and networking, VM/Arc/AKS problems, registration, disconnected ops, and collecting/using logs and support tools. |
-| Best Practices | L68-L77 | Best practices for Azure Local networking, rack-aware deployment prep, drift detection, alerting, Arc VM operations, and safe, reliable update management. |
-| Decision Making | L78-L93 | Planning Azure Local deployments: billing/pricing, VM types, connectivity and networking (incl. disconnected), migration, multi‑rack scale, deployment types, and upgrade paths. |
-| Architecture & Design Patterns | L94-L120 | Network, storage, and resiliency design for Azure Local: SDN topologies, rack-aware clustering, SAN use, room-to-room links, load balancing, and VM/DR architecture patterns. |
-| Limits & Quotas | L121-L127 | Cluster rack/layout limits, SLB HA port scale constraints, and VMware-to–Azure Local migration system requirements and supported configurations. |
-| Security | L128-L167 | Securing Azure Local: identity/RBAC, AD/PKI, certificates and rotation, firewall/NSGs, BitLocker, Trusted launch/attestation, Defender/SIEM, and security defaults for connected/disconnected setups. |
-| Configuration | L168-L252 | Configuring Azure Local infrastructure: networking, hardware/GPUs, registration/proxy, monitoring/Health, backups, disconnected ops, migrations, multi-rack, and VM/network policies. |
-| Integrations & Coding Patterns | L253-L264 | VM networking, load balancing, image/VM creation, and migration/replication patterns for Azure Local (Hyper-V, Azure Migrate, Site Recovery, CLI/PowerShell, SSH, public IPs). |
-| Deployment | L265-L287 | Deploying, scaling, upgrading, and repairing Azure Local clusters (including rack-aware and disconnected), plus ARM/portal deployment, Arc gateway, ACR, SQL workloads, and post-deploy tasks |
+| Troubleshooting | L37-L64 | Diagnosing and fixing Azure Local issues: provisioning, SDN/connectivity, Arc VMs, registration, upgrades/updates, multi‑rack, and collecting logs/diagnostics for known errors. |
+| Best Practices | L65-L75 | Guidance on networking and hardware prep, SDN upgrades, supported VM operations (single/multi‑rack), preserving IPs on VM moves, drift detection, and managing Azure Local updates. |
+| Decision Making | L76-L91 | Guidance on choosing Azure Local vs alternatives, VM types, networking, storage, licensing, pricing, migration, deployment patterns/scale, and upgrade planning. |
+| Architecture & Design Patterns | L92-L119 | Designing Azure Local network and SDN architectures: rack-aware, multisite/DR, disconnected, vTPM, and detailed 2–4 node/switchless/Fibre Channel reference network patterns. |
+| Limits & Quotas | L120-L128 | Hardware, host, and physical network prerequisites for Azure Local disaggregated deployments, plus system requirements for migrating workloads from Hyper-V and VMware. |
+| Security | L129-L175 | Security, compliance, and identity for Azure Local: mapping to standards (FedRAMP, HIPAA, PCI, ISO), RBAC/identity, certificates/PKI, NSGs/firewalls, encryption, Defender, logging, and security updates. |
+| Configuration | L176-L279 | Configuring Azure Local infrastructure: networking, storage, GPUs, SDN, private endpoints, monitoring, updates, and managing VMs (single/multi-rack, connected or disconnected) via Azure/Arc tools. |
+| Integrations & Coding Patterns | L280-L291 | VM/AKS integration patterns for Azure Local: SAN storage, Arc-enabled VMs, SSH/RDP access, remote diagnostics, VM migration, managed disk download, and multi-rack image creation. |
+| Deployment | L292-L328 | End-to-end Azure Local deployment, clustering, SDN, disconnected setups, workload provisioning, and update/upgrade procedures, including rack-aware and disaggregated scenarios. |
 ### Troubleshooting
 | Topic | URL |
 |-------|-----|
-| Validate rack aware cluster readiness with LLDP tool | https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-readiness-check?view=azloc-2603 |
-| Troubleshoot simplified machine provisioning in Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/deploy/troubleshoot-simplified-machine-provisioning?view=azloc-2603 |
-| Resolve known issues in Azure Local releases | https://learn.microsoft.com/en-us/azure/azure-local/known-issues?view=azloc-2603 |
-| Collect diagnostic logs for Azure Local Arc-enabled VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/collect-log-files-arc-enabled-vms?view=azloc-2603 |
-| Use appliance fallback log collection for Azure Local disconnected VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-fallback?view=azloc-2603 |
-| Known issues and workarounds for disconnected operations | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-known-issues?view=azloc-2603 |
-| Collect on-demand logs via PowerShell for Azure Local disconnected operations | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-on-demand-logs?view=azloc-2603 |
-| Interpret and resolve Azure Local Health Service faults | https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-faults?view=azloc-2603 |
-| Use AKS Arc Support Tool to remediate Azure Local infrastructure | https://learn.microsoft.com/en-us/azure/azure-local/manage/remediate-support-tool-infrastructure?view=azloc-2603 |
-| Use Azure Local Remote Support Arc extension for diagnostics | https://learn.microsoft.com/en-us/azure/azure-local/manage/remote-support-arc-extension?view=azloc-2603 |
-| Collect SDN logs on Azure Local for troubleshooting | https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-log-collection?view=azloc-2603 |
-| Troubleshoot Azure Local SDN deployment, connectivity, and NSG issues | https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-troubleshooting?view=azloc-2603 |
-| Use Azure Local Support Diagnostic Tool for troubleshooting | https://learn.microsoft.com/en-us/azure/azure-local/manage/support-tools?view=azloc-2603 |
-| Troubleshoot Azure Local Arc-enabled virtual machines | https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-arc-enabled-vms?view=azloc-2603 |
-| Collect SDN traces and logs on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-common-sdn-issues?view=azloc-2603 |
-| Troubleshoot Azure Local registration failures in Configurator app | https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-deployment-configurator-app?view=azloc-2603 |
-| Troubleshoot Azure Local 23H2 deployment validation issues | https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-deployment?view=azloc-2603 |
-| Troubleshoot SDN deployment via Windows Admin Center | https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-sdn-deployment?view=azloc-2603 |
-| Troubleshoot Azure Local SDN Software Load Balancer | https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-software-load-balancer?view=azloc-2603 |
-| Upgrade SDN infrastructure and fix upgrade issues | https://learn.microsoft.com/en-us/azure/azure-local/manage/upgrade-sdn?view=azloc-2603 |
-| Troubleshoot Azure Local VM migration with Azure Migrate | https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-faq?view=azloc-2603 |
-| Troubleshoot Azure Local VM migrations with Azure Migrate | https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-troubleshoot?view=azloc-2603 |
-| Use Azure CLI serial console for Azure Local multi-rack VM recovery | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-serial-console?view=azloc-2603 |
-| Known issues and workarounds for Azure Local 23xx releases | https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/known-issues-23?view=azloc-2603 |
-| Known issues and workarounds for Azure Local 24xx releases | https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/known-issues-24?view=azloc-2603 |
-| Troubleshoot Azure Local 23H2 solution updates | https://learn.microsoft.com/en-us/azure/azure-local/update/update-troubleshooting-23h2?view=azloc-2603 |
-| Diagnose and fix Azure Local 23H2 upgrade issues | https://learn.microsoft.com/en-us/azure/azure-local/upgrade/troubleshoot-upgrade-to-23h2?view=azloc-2603 |
+| Troubleshoot simplified machine provisioning for Azure Local (preview) | https://learn.microsoft.com/en-us/azure/azure-local/deploy/troubleshoot-simplified-machine-provisioning?view=azloc-2604 |
+| Resolve known issues in Azure Local releases | https://learn.microsoft.com/en-us/azure/azure-local/known-issues?view=azloc-2604 |
+| Collect diagnostic logs for Azure Local Arc VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/collect-log-files-arc-enabled-vms?view=azloc-2604 |
+| Use appliance fallback log collection for Azure Local Arc-enabled VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-fallback?view=azloc-2604 |
+| Resolve known issues in Azure Local disconnected operations | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-known-issues?view=azloc-2604 |
+| Collect on-demand logs via PowerShell for Azure Local disconnected operations | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-on-demand-logs?view=azloc-2604 |
+| Interpret and resolve Azure Local Health Service faults | https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-faults?view=azloc-2604 |
+| Use Remediation Support Tool to fix Azure Local infrastructure issues | https://learn.microsoft.com/en-us/azure/azure-local/manage/remediate-support-tool-infrastructure?view=azloc-2604 |
+| Collect SDN logs for Azure Local troubleshooting | https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-log-collection?view=azloc-2604 |
+| Troubleshoot Azure Local SDN deployment and connectivity issues | https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-troubleshooting?view=azloc-2604 |
+| Use Azure Local Support Diagnostic Tool for data collection and issue resolution | https://learn.microsoft.com/en-us/azure/azure-local/manage/support-tools?view=azloc-2604 |
+| Troubleshoot Azure Local Arc-enabled virtual machines | https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-arc-enabled-vms?view=azloc-2604 |
+| Collect traces and logs for common SDN issues | https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-common-sdn-issues?view=azloc-2604 |
+| Troubleshoot Azure Local registration issues using Configurator app | https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-deployment-configurator-app?view=azloc-2604 |
+| Troubleshoot Azure Local 23H2 deployment validation failures via Azure portal | https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-deployment?view=azloc-2604 |
+| Troubleshoot SDN deployment via Windows Admin Center | https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-sdn-deployment?view=azloc-2604 |
+| Troubleshoot Software Load Balancer data path issues | https://learn.microsoft.com/en-us/azure/azure-local/manage/troubleshoot-software-load-balancer?view=azloc-2604 |
+| Troubleshoot Azure Local VM migrations with Azure Migrate | https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-troubleshoot?view=azloc-2604 |
+| Use Azure CLI serial console for Azure Local multi-rack VM recovery | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-serial-console?view=azloc-2604 |
+| Troubleshoot Azure Local multi-rack Arc-enabled VM issues | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-troubleshoot-arc-enabled-vms?view=azloc-2604 |
+| Known issues and workarounds for Azure Local 23xx | https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/known-issues-23?view=azloc-2604 |
+| Known issues and workarounds for Azure Local 24xx | https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/known-issues-24?view=azloc-2604 |
+| Troubleshoot Azure Local solution update failures | https://learn.microsoft.com/en-us/azure/azure-local/update/update-troubleshooting-23h2?view=azloc-2604 |
+| Troubleshoot Azure Local upgrade failures and issues | https://learn.microsoft.com/en-us/azure/azure-local/upgrade/troubleshoot-upgrade-to-23h2?view=azloc-2604 |
 ### Best Practices
 | Topic | URL |
 |-------|-----|
-| Apply Network ATC best practices for Azure Local networking | https://learn.microsoft.com/en-us/azure/azure-local/concepts/network-atc-overview?view=azloc-2603 |
-| Prepare network and hardware for rack aware cluster deployment | https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-deploy-prep?view=azloc-2603 |
-| Apply drift detection to maintain Azure Local configuration health | https://learn.microsoft.com/en-us/azure/azure-local/manage/drift-detection?view=azloc-2603 |
-| Enable recommended Azure Local alert rules for baseline monitoring | https://learn.microsoft.com/en-us/azure/azure-local/manage/set-up-recommended-alert-rules?view=azloc-2603 |
-| Use supported operations for Azure Local Arc VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-operations?view=azloc-2603 |
-| Best practices for Azure Local update management | https://learn.microsoft.com/en-us/azure/azure-local/update/update-best-practices?view=azloc-2603 |
+| Prepare network and hardware for rack aware cluster deployment | https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-deploy-prep?view=azloc-2604 |
+| Use drift detection to maintain Azure Local configuration consistency | https://learn.microsoft.com/en-us/azure/azure-local/manage/drift-detection?view=azloc-2604 |
+| Upgrade SDN infrastructure managed by on-premises tools | https://learn.microsoft.com/en-us/azure/azure-local/manage/upgrade-sdn?view=azloc-2604 |
+| Use supported operations for Azure Local Arc VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-operations?view=azloc-2604 |
+| Preserve static IP addresses during Azure Local VM migration | https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-maintain-ip-addresses?view=azloc-2604 |
+| Use supported operations for Azure Local multi-rack Arc VMs | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-virtual-machine-operations?view=azloc-2604 |
+| Best practices for managing Azure Local updates | https://learn.microsoft.com/en-us/azure/azure-local/update/update-best-practices?view=azloc-2604 |
 ### Decision Making
 | Topic | URL |
 |-------|-----|
-| Understand billing and payment model for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/concepts/billing?view=azloc-2603 |
-| Choose Azure Local VM types and management capabilities | https://learn.microsoft.com/en-us/azure/azure-local/concepts/compare-vm-management-capabilities?view=azloc-2603 |
-| Choose between Azure Local and Windows Server | https://learn.microsoft.com/en-us/azure/azure-local/concepts/compare-windows-server?view=azloc-2603 |
-| Plan Azure Local connectivity using private endpoints | https://learn.microsoft.com/en-us/azure/azure-local/deploy/about-private-endpoints?view=azloc-2603 |
-| Use Extended Security Updates on Azure Local VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-benefits-esu?view=azloc-2603 |
-| Understand billing and capacity pricing for disconnected Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-billing?view=azloc-2603 |
-| 
Plan networking for Azure Local disconnected operations | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-network?view=azloc-2603 | -| Plan and use disconnected operations for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-overview?view=azloc-2603 | -| Choose migration options for VMs to Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/migrate/migration-options-overview?view=azloc-2603 | -| Evaluate multi-rack deployments for large Azure Local datacenters | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-overview?view=azloc-2603 | -| Select Azure Local deployment type and scale | https://learn.microsoft.com/en-us/azure/azure-local/scalability-deployments?view=azloc-2603 | -| Understand upgrade options from Azure Stack HCI 22H2 | https://learn.microsoft.com/en-us/azure/azure-local/upgrade/about-upgrades-23h2?view=azloc-2603 | +| Use Azure Hybrid Benefit for Azure Local licensing | https://learn.microsoft.com/en-us/azure/azure-local/concepts/azure-hybrid-benefit-disaggregated?view=azloc-2604 | +| Choose Azure Local VM type and management model | https://learn.microsoft.com/en-us/azure/azure-local/concepts/compare-vm-management-capabilities?view=azloc-2604 | +| Decide between Azure Local and Windows Server | https://learn.microsoft.com/en-us/azure/azure-local/concepts/compare-windows-server?view=azloc-2604 | +| Plan and configure external SAN storage support for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/concepts/external-storage-support?view=azloc-2604 | +| Choose and manage SDN enabled by Azure Arc on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/concepts/sdn-overview?view=azloc-2604 | +| Decide how to use Private Endpoints with Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/deploy/about-private-endpoints?view=azloc-2604 | +| Understand billing and capacity-based pricing for 
Azure Local disconnected operations | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-billing?view=azloc-2604 | +| Choose VM migration options to Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/migrate/migration-options-overview?view=azloc-2604 | +| Select a network reference pattern for Azure Local disaggregated | https://learn.microsoft.com/en-us/azure/azure-local/plan/choose-network-pattern-disaggregated?view=azloc-2604 | +| Choose Azure Local deployment network reference pattern | https://learn.microsoft.com/en-us/azure/azure-local/plan/choose-network-pattern?view=azloc-2604 | +| Select Azure Local deployment type and scale | https://learn.microsoft.com/en-us/azure/azure-local/scalability-deployments?view=azloc-2604 | +| Plan upgrade from Azure Stack HCI 22H2 to Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/upgrade/about-upgrades-23h2?view=azloc-2604 | ### Architecture & Design Patterns | Topic | URL | |-------|-----| -| Design Azure Local deployments with external SAN storage | https://learn.microsoft.com/en-us/azure/azure-local/concepts/external-storage-support?view=azloc-2603 | -| Plan Network Controller deployment topology on Azure Local 23H2 | https://learn.microsoft.com/en-us/azure/azure-local/concepts/plan-network-controller-deployment?view=azloc-2603 | -| Plan SDN infrastructure for Azure Local 23H2 | https://learn.microsoft.com/en-us/azure/azure-local/concepts/plan-software-defined-networking-infrastructure-23h2?view=azloc-2603 | -| Understand Azure Local rack aware clustering architecture | https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-overview?view=azloc-2603 | -| Reference network architecture for Azure Local rack aware clusters | https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-reference-architecture?view=azloc-2603 | -| Design room-to-room connectivity for rack aware clusters | 
https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-room-to-room-connectivity?view=azloc-2603 | -| Plan SDN Multisite topology and disaster recovery on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/concepts/sdn-multisite-overview?view=azloc-2603 | -| Choose SDN management methods enabled by Azure Arc on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/concepts/sdn-overview?view=azloc-2603 | -| Design resilient infrastructure for Azure Local deployments | https://learn.microsoft.com/en-us/azure/azure-local/manage/disaster-recovery-infrastructure-resiliency?view=azloc-2603 | -| Plan disaster recovery architecture for Azure Local VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/disaster-recovery-overview?view=azloc-2603 | -| Design resilient virtual machines on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/disaster-recovery-vm-resiliency?view=azloc-2603 | -| Load balance multiple logical networks in Azure Local SDN | https://learn.microsoft.com/en-us/azure/azure-local/manage/load-balance-multiple-networks?view=azloc-2603 | -| Choose Azure Local deployment network pattern | https://learn.microsoft.com/en-us/azure/azure-local/plan/choose-network-pattern?view=azloc-2603 | -| Plan Azure Local four-node switchless dual-link network | https://learn.microsoft.com/en-us/azure/azure-local/plan/four-node-switchless-two-switches-two-links?view=azloc-2603 | -| Understand Azure Local network reference patterns | https://learn.microsoft.com/en-us/azure/azure-local/plan/network-patterns-overview?view=azloc-2603 | -| Apply SDN considerations to Azure Local patterns | https://learn.microsoft.com/en-us/azure/azure-local/plan/network-patterns-sdn-considerations?view=azloc-2603 | -| Plan single-server storage network pattern for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/plan/single-server-deployment?view=azloc-2603 | -| Design Azure Local three-node 
switchless single-link network | https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-switchless-two-switches-single-link?view=azloc-2603 | -| Design Azure Local three-node switchless dual-link network | https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-switchless-two-switches-two-links?view=azloc-2603 | -| Plan two-node switched converged Azure Local network | https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switched-converged?view=azloc-2603 | -| Plan two-node switched non-converged Azure Local network | https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switched-non-converged?view=azloc-2603 | -| Plan Azure Local two-node switchless single-switch network | https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switchless-single-switch?view=azloc-2603 | -| Plan Azure Local two-node switchless dual-switch network | https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switchless-two-switches?view=azloc-2603 | +| Plan SDN infrastructure and topology for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/concepts/plan-software-defined-networking-infrastructure-23h2?view=azloc-2604 | +| Use reference architecture for Azure Local rack aware clusters | https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-reference-architecture?view=azloc-2604 | +| Design room-to-room connectivity for rack aware clusters | https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-room-to-room-connectivity?view=azloc-2604 | +| Design SDN Multisite topology and disaster recovery with Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/concepts/sdn-multisite-overview?view=azloc-2604 | +| Design resilient Azure Local infrastructure for DR | https://learn.microsoft.com/en-us/azure/azure-local/manage/disaster-recovery-infrastructure-resiliency?view=azloc-2604 | +| Plan disaster recovery strategy for Azure Local VMs | 
https://learn.microsoft.com/en-us/azure/azure-local/manage/disaster-recovery-overview?view=azloc-2604 | +| Design network architecture for Azure Local disconnected operations | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-network?view=azloc-2604 | +| Load balance multiple logical networks in SDN | https://learn.microsoft.com/en-us/azure/azure-local/manage/load-balance-multiple-networks?view=azloc-2604 | +| Deploy and manage SDN Multisite for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-sdn-multisite?view=azloc-2604 | +| Understand automatic vTPM state transfer for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/trusted-launch-automatic-state-transfer?view=azloc-2604 | +| Plan Azure Local cloud deployment network topology | https://learn.microsoft.com/en-us/azure/azure-local/plan/cloud-deployment-network-considerations?view=azloc-2604 | +| Plan Fiber Channel disaggregated pattern without backup network | https://learn.microsoft.com/en-us/azure/azure-local/plan/fiber-channel-no-backup-disaggregated-pattern?view=azloc-2604 | +| Plan Fiber Channel disaggregated pattern with backup network | https://learn.microsoft.com/en-us/azure/azure-local/plan/fiber-channel-with-backup-disaggregated-pattern?view=azloc-2604 | +| Plan four-node switchless dual-link Azure Local network | https://learn.microsoft.com/en-us/azure/azure-local/plan/four-node-switchless-two-switches-two-links?view=azloc-2604 | +| Understand network reference patterns for Azure Local disaggregated | https://learn.microsoft.com/en-us/azure/azure-local/plan/network-patterns-overview-disaggregated?view=azloc-2604 | +| Understand Azure Local network reference pattern options | https://learn.microsoft.com/en-us/azure/azure-local/plan/network-patterns-overview?view=azloc-2604 | +| Apply SDN design considerations to Azure Local patterns | 
https://learn.microsoft.com/en-us/azure/azure-local/plan/network-patterns-sdn-considerations?view=azloc-2604 | +| Plan single-server storage network pattern for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/plan/single-server-deployment?view=azloc-2604 | +| Plan three-node switchless single-link Azure Local network | https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-switchless-two-switches-single-link?view=azloc-2604 | +| Plan three-node switchless dual-link Azure Local network | https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-switchless-two-switches-two-links?view=azloc-2604 | +| Plan two-node switched converged Azure Local network | https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switched-converged?view=azloc-2604 | +| Plan two-node switched non-converged Azure Local network | https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switched-non-converged?view=azloc-2604 | +| Plan two-node switchless single-switch Azure Local network | https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switchless-single-switch?view=azloc-2604 | +| Plan two-node switchless dual-switch Azure Local network | https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-switchless-two-switches?view=azloc-2604 | ### Limits & Quotas | Topic | URL | |-------|-----| -| Rack aware cluster requirements and supported configurations | https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-requirements?view=azloc-2603 | -| Configure SLB high availability ports and understand limits | https://learn.microsoft.com/en-us/azure/azure-local/manage/configure-software-load-balancer?view=azloc-2603 | -| System requirements for VMware migration to Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-vmware-requirements?view=azloc-2603 | +| Host network requirements for Azure Local disaggregated | 
https://learn.microsoft.com/en-us/azure/azure-local/concepts/host-network-requirements-disaggregated?view=azloc-2604 | +| Physical network requirements for Azure Local disaggregated | https://learn.microsoft.com/en-us/azure/azure-local/concepts/physical-network-requirements-disaggregated?view=azloc-2604 | +| System requirements for Azure Local disaggregated deployments | https://learn.microsoft.com/en-us/azure/azure-local/concepts/system-requirements-disaggregated?view=azloc-2604 | +| System requirements for Hyper-V migration to Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-hyperv-requirements?view=azloc-2604 | +| System requirements for VMware migration to Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-vmware-requirements?view=azloc-2604 | ### Security | Topic | URL | |-------|-----| -| Configure firewall rules and endpoints for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/concepts/firewall-requirements?view=azloc-2603 | -| Configure Azure verification attestation for Azure Local VMs | https://learn.microsoft.com/en-us/azure/azure-local/deploy/azure-verification?view=azloc-2603 | -| Assign Azure Arc registration and deployment permissions for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-arc-register-server-permissions?view=azloc-2603 | -| Use local identity and Key Vault for Azure Local deployment | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-local-identity-with-key-vault?view=azloc-2603 | -| Prepare Active Directory for Azure Local deployment | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-prep-active-directory?view=azloc-2603 | -| Assign built-in RBAC roles for Azure Local VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/assign-vm-rbac-roles?view=azloc-2603 | -| Configure managed identity for enhanced Azure Local management | 
https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-enhanced-management-managed-identity?view=azloc-2603 | -| Use tags with SDN network security groups in WAC | https://learn.microsoft.com/en-us/azure/azure-local/manage/configure-network-security-groups-with-tags?view=azloc-2603 | -| Create NSGs, rules, and default network access policies for Azure Local VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/create-network-security-groups?view=azloc-2603 | -| Plan identity architecture for Azure Local disconnected operations | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-identity?view=azloc-2603 | -| Configure PKI and certificates for Azure Local disconnected operations | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-pki?view=azloc-2603 | -| Apply security controls for Azure Local disconnected VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-security?view=azloc-2603 | -| Use Kerberos SPN authentication with Network Controller | https://learn.microsoft.com/en-us/azure/azure-local/manage/kerberos-with-spn?view=azloc-2603 | -| Manage BitLocker encryption and recovery keys on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-bitlocker?view=azloc-2603 | -| Enable default VM network access policies on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-default-network-access-policies-virtual-machines-23h2?view=azloc-2603 | -| Manage NSGs and security rules for Azure Local Arc-enabled VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-network-security-groups?view=azloc-2603 | -| Rotate deployment user password and internal secrets on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-secrets-rotation?view=azloc-2603 | -| Manage Azure Local 23H2 security default settings | 
https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-secure-baseline?view=azloc-2603 | -| Manage Secure Boot certificate updates in Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-secure-boot-updates?view=azloc-2603 | -| Manage security settings after upgrading to Azure Local from Azure Stack HCI | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-security-post-upgrade?view=azloc-2603 | -| Secure Azure Local with Microsoft Defender for Cloud | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-security-with-defender-for-cloud?view=azloc-2603 | -| Configure syslog forwarding from Azure Local to SIEM | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-syslog-forwarding?view=azloc-2603 | -| Configure Application Control policies on Azure Local 23H2 | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-wdac?view=azloc-2603 | -| Configure security for Azure Local Network Controller traffic | https://learn.microsoft.com/en-us/azure/azure-local/manage/nc-security?view=azloc-2603 | -| Manage certificates for Azure Local SDN Network Controller | https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-manage-certs?view=azloc-2603 | -| Enable guest attestation for Azure Local Trusted launch VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/trusted-launch-guest-attestation?view=azloc-2603 | -| Back up and restore Trusted launch guest state keys | https://learn.microsoft.com/en-us/azure/azure-local/manage/trusted-launch-vm-import-key?view=azloc-2603 | -| Renew Azure Local Network Controller certificates | https://learn.microsoft.com/en-us/azure/azure-local/manage/update-network-controller-certificates?view=azloc-2603 | -| Renew SDN infrastructure and SLB MUX certificates | https://learn.microsoft.com/en-us/azure/azure-local/manage/update-sdn-infrastructure-certificates?view=azloc-2603 | -| Configure SDN network security groups with PowerShell | 
https://learn.microsoft.com/en-us/azure/azure-local/manage/use-datacenter-firewall-powershell?view=azloc-2603 | -| Configure SDN network security groups in Windows Admin Center | https://learn.microsoft.com/en-us/azure/azure-local/manage/use-datacenter-firewall-windows-admin-center?view=azloc-2603 | -| Use built-in RBAC roles for Azure Local multi-rack VMs | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-assign-vm-rbac-roles?view=azloc-2603 | -| Configure network security groups for Azure Local multi-rack VMs | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-network-security-groups?view=azloc-2603 | -| Configure custom Active Directory permissions and DNS for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/plan/configure-custom-settings-active-directory?view=azloc-2603 | -| Security update reference for Azure Local 23xx releases | https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/security-update-23?view=azloc-2603 | -| Security update reference for Azure Local 24xx releases | https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/security-update-24?view=azloc-2603 | +| Align Azure Local deployments with FedRAMP | https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-fedramp-guidance?view=azloc-2604 | +| Plan HIPAA-compliant solutions with Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-hipaa-guidance?view=azloc-2604 | +| Use Azure Local to support ISO 27001 controls | https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-iso27001-guidance?view=azloc-2604 | +| Use Azure Local to support PCI DSS compliance | https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-pci-dss-guidance?view=azloc-2604 | +| Map Azure Local to security standards and certifications | 
https://learn.microsoft.com/en-us/azure/azure-local/assurance/azure-stack-security-standards?view=azloc-2604 | +| Configure firewall rules and endpoints for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/concepts/firewall-requirements?view=azloc-2604 | +| Use Azure verification for VMs on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/deploy/azure-verification?view=azloc-2604 | +| Assign Azure Arc permissions to register Azure Local machines | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-arc-register-server-permissions?view=azloc-2604 | +| Deploy Azure Local using local identity with Azure Key Vault | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-local-identity-with-key-vault?view=azloc-2604 | +| Prepare Active Directory environment for Azure Local deployment | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-prep-active-directory?view=azloc-2604 | +| Assign RBAC roles for Azure Local Arc VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/assign-vm-rbac-roles?view=azloc-2604 | +| Configure Extended Security Updates on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-benefits-esu?view=azloc-2604 | +| Configure enhanced Azure management using managed identity for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-enhanced-management-managed-identity?view=azloc-2604 | +| Use tags with SDN network security groups in WAC | https://learn.microsoft.com/en-us/azure/azure-local/manage/configure-network-security-groups-with-tags?view=azloc-2604 | +| Create NSGs, rules, and default access policies for Azure Local VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/create-network-security-groups?view=azloc-2604 | +| Plan identity architecture for Azure Local disconnected environments | 
https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-identity?view=azloc-2604 | +| Configure PKI and certificates for Azure Local disconnected operations | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-pki?view=azloc-2604 | +| Apply security controls for Azure Local disconnected VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-security?view=azloc-2604 | +| Use Kerberos SPN authentication with Network Controller | https://learn.microsoft.com/en-us/azure/azure-local/manage/kerberos-with-spn?view=azloc-2604 | +| Manage BitLocker encryption on Azure Local 23H2 | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-bitlocker?view=azloc-2604 | +| Enable default network access policies for Azure Local VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-default-network-access-policies-virtual-machines-23h2?view=azloc-2604 | +| Manage NSGs and security rules for Azure Local Arc-enabled VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-network-security-groups?view=azloc-2604 | +| Rotate deployment user password on Azure Local 23H2 | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-secrets-rotation?view=azloc-2604 | +| Manage default security settings on Azure Local 23H2 | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-secure-baseline?view=azloc-2604 | +| Manage Secure Boot certificate updates on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-secure-boot-updates?view=azloc-2604 | +| Manage security after upgrading Azure Local from 22H2 | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-security-post-upgrade?view=azloc-2604 | +| Secure Azure Local with Microsoft Defender for Cloud | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-security-with-defender-for-cloud?view=azloc-2604 | +| Configure syslog forwarding from 
Azure Local to SIEM | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-syslog-forwarding?view=azloc-2604 | +| Configure Application Control on Azure Local 23H2 | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-wdac?view=azloc-2604 | +| Configure Network Controller security channels and protocols | https://learn.microsoft.com/en-us/azure/azure-local/manage/nc-security?view=azloc-2604 | +| Manage certificates for Azure Local SDN communications | https://learn.microsoft.com/en-us/azure/azure-local/manage/sdn-manage-certs?view=azloc-2604 | +| Enable guest attestation for Azure Local Trusted launch VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/trusted-launch-guest-attestation?view=azloc-2604 | +| Back up and restore Trusted launch guest state keys | https://learn.microsoft.com/en-us/azure/azure-local/manage/trusted-launch-vm-import-key?view=azloc-2604 | +| Renew Network Controller certificates in Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/update-network-controller-certificates?view=azloc-2604 | +| Renew SDN infrastructure and SLB MUX certificates | https://learn.microsoft.com/en-us/azure/azure-local/manage/update-sdn-infrastructure-certificates?view=azloc-2604 | +| Configure SDN network security groups with PowerShell | https://learn.microsoft.com/en-us/azure/azure-local/manage/use-datacenter-firewall-powershell?view=azloc-2604 | +| Configure SDN network security groups with Windows Admin Center | https://learn.microsoft.com/en-us/azure/azure-local/manage/use-datacenter-firewall-windows-admin-center?view=azloc-2604 | +| Assign built-in RBAC roles for Azure Local multi-rack VMs | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-assign-vm-rbac-roles?view=azloc-2604 | +| Create network security groups for Azure Local multi-rack VMs | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-network-security-groups?view=azloc-2604 | +| Manage 
NSGs and security rules on Azure Local multi-rack | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-manage-network-security-groups?view=azloc-2604 | +| Configure custom Active Directory permissions and DNS for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/plan/configure-custom-settings-active-directory?view=azloc-2604 | +| Security update catalog for Azure Local 23xx releases | https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/security-update-23?view=azloc-2604 | +| Security update catalog for Azure Local 24xx releases | https://learn.microsoft.com/en-us/azure/azure-local/previous-releases/security-update-24?view=azloc-2604 | ### Configuration | Topic | URL | |-------|-----| -| Configure host networking for Azure Local clusters | https://learn.microsoft.com/en-us/azure/azure-local/concepts/host-network-requirements?view=azloc-2603 | -| Configure physical network requirements for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/concepts/physical-network-requirements?view=azloc-2603 | -| Configure AKS node distribution in rack aware clusters | https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-aks-nodes?view=azloc-2603 | -| Provision Azure Local VMs in local availability zones | https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-provision-vm-local-availability-zone?view=azloc-2603 | -| Configure system requirements for Azure Local 23H2 | https://learn.microsoft.com/en-us/azure/azure-local/concepts/system-requirements-23h2?view=azloc-2603 | -| Configure low-capacity Azure Local deployment hardware | https://learn.microsoft.com/en-us/azure/azure-local/concepts/system-requirements-small-23h2?view=azloc-2603 | -| Configure private endpoints for Azure Local without proxy or gateway | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-no-proxy-no-gateway?view=azloc-2603 | -| Configure private endpoints 
for Azure Local with Arc gateway | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-no-proxy-with-gateway?view=azloc-2603 | -| Configure private endpoints for Azure Local with proxy only | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-with-proxy-no-gateway?view=azloc-2603 | -| Configure private endpoints for Azure Local with proxy and gateway | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-with-proxy-with-gateway?view=azloc-2603 | -| Review security, software, hardware, and network prerequisites for Azure Local deployment | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-prerequisites?view=azloc-2603 | -| Configure Azure Local registration via Arc gateway and proxy | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-with-azure-arc-gateway?view=azloc-2603 | -| Configure Azure Local Arc registration without gateway | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-without-azure-arc-gateway?view=azloc-2603 | -| Enable SDN integration on Azure Local using PowerShell action plans | https://learn.microsoft.com/en-us/azure/azure-local/deploy/enable-sdn-integration?view=azloc-2603 | -| Configure and manage Azure Arc extensions on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/arc-extension-management?view=azloc-2603 | -| Attach and configure GPUs for Linux VMs on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/attach-gpu-to-linux-vm?view=azloc-2603 | -| Configure prerequisites for Azure Local Arc-enabled VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-arc-vm-management-prerequisites?view=azloc-2603 | -| Collect and upload Azure Local diagnostic logs to Microsoft | https://learn.microsoft.com/en-us/azure/azure-local/manage/collect-logs?view=azloc-2603 | -| Configure proxy settings for Azure Local 23H2 environments | 
https://learn.microsoft.com/en-us/azure/azure-local/manage/configure-proxy-settings-23h2?view=azloc-2603 | -| Configure logical networks for Azure Local Arc VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/create-logical-networks?view=azloc-2603 | -| Manage Azure Local VMs in disconnected mode with Azure Arc | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-arc-vm?view=azloc-2603 | -| Configure backup parameters for disconnected Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-back-up-restore?view=azloc-2603 | -| Configure Azure CLI for Azure Local disconnected operations | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-cli?view=azloc-2603 | -| Integrate monitoring solutions with Azure Local disconnected operations | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-monitoring?view=azloc-2603 | -| Configure Azure Policy in disconnected Azure Local environments | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-policy?view=azloc-2603 | -| Configure Azure PowerShell for disconnected Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-powershell?view=azloc-2603 | -| Enable nested virtualization on Azure Local hosts | https://learn.microsoft.com/en-us/azure/azure-local/manage/enable-nested-virtualization?view=azloc-2603 | -| Enable and manage remote support for Azure Local OS | https://learn.microsoft.com/en-us/azure/azure-local/manage/get-remote-support?view=azloc-2603 | -| Configure GPU Discrete Device Assignment on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/gpu-manage-via-device?view=azloc-2603 | -| Configure GPU partitioning (GPU-P) for Azure Local VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/gpu-manage-via-partitioning?view=azloc-2603 | -| Configure GPUs for Azure Local 
hyperconverged hosts | https://learn.microsoft.com/en-us/azure/azure-local/manage/gpu-preparation?view=azloc-2603 | -| Integrate Azure Local health service with Azure Monitor alerts | https://learn.microsoft.com/en-us/azure/azure-local/manage/health-alerts-via-azure-monitor-alerts?view=azloc-2603 | -| Track automated Health Service actions in Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-actions?view=azloc-2603 | -| Retrieve Azure Local cluster performance history via Health Service | https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-cluster-performance-history?view=azloc-2603 | -| Use Health Service to monitor Azure Local clusters | https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-overview?view=azloc-2603 | -| Tune Azure Local Health Service settings and thresholds | https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-settings?view=azloc-2603 | -| Monitor Azure Local at scale using portal dashboards | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-at-scale-dashboard?view=azloc-2603 | -| Manage logical network configuration for Azure Local VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-logical-networks?view=azloc-2603 | -| Deploy and manage SDN Multisite for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-sdn-multisite?view=azloc-2603 | -| Configure storage thin provisioning on Azure Local 23H2 | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-thin-provisioning-23h2?view=azloc-2603 | -| Configure Azure Monitor Metrics for Azure Local clusters | https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-cluster-with-metrics?view=azloc-2603 | -| Monitor Azure Local features with Insights workbooks | https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-features?view=azloc-2603 | -| Configure Insights to monitor multiple Azure Local systems | 
https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-multi-23h2?view=azloc-2603 | -| Enable Azure Local Insights at scale using Azure Policy | https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-multi-azure-policies?view=azloc-2603 | -| Configure Insights to monitor a single Azure Local system | https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-single-23h2?view=azloc-2603 | -| Enable ReFS deduplication and compression on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/refs-deduplication-and-compression?view=azloc-2603 | -| Configure metric alerts for Azure Local resources | https://learn.microsoft.com/en-us/azure/azure-local/manage/setup-metric-alerts?view=azloc-2603 | -| Set up log alerts for Azure Local using Insights queries | https://learn.microsoft.com/en-us/azure/azure-local/manage/setup-system-alerts?view=azloc-2603 | -| Unregister and re-register Azure Local machines using PowerShell | https://learn.microsoft.com/en-us/azure/azure-local/manage/unregister-register-machine?view=azloc-2603 | -| Use Azure Local Environment Checker for readiness validation | https://learn.microsoft.com/en-us/azure/azure-local/manage/use-environment-checker?view=azloc-2603 | -| Configure Windows Server VM activation on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-activate?view=azloc-2603 | -| Configure VM affinity and anti-affinity rules on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-affinity?view=azloc-2603 | -| Configure virtual machine load balancing on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-load-balancing?view=azloc-2603 | -| Enable guest management on Azure Local migrated VMs | https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-enable-guest-management?view=azloc-2603 | -| Complete prerequisites for Hyper-V migration to Azure Local | 
https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-hyperv-prerequisites?view=azloc-2603 | -| System requirements for Hyper-V migration to Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-hyperv-requirements?view=azloc-2603 | -| Preserve static IP addresses during Azure Local VM migration | https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-maintain-ip-addresses?view=azloc-2603 | -| Configure diagnostic settings to monitor Azure Local migrations | https://learn.microsoft.com/en-us/azure/azure-local/migrate/monitor-migration?view=azloc-2603 | -| Install Azure CLI extensions for Azure Local multi-rack | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-cli-extensions?view=azloc-2603 | -| Configure Layer 3 isolation domains for Azure Local multi-rack | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-configure-layer-3-isolation-domain?view=azloc-2603 | -| Create Azure Arc-enabled VMs on Azure Local multi-rack | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-arc-virtual-machines?view=azloc-2603 | -| Create logical networks for Azure Local multi-rack VMs | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-logical-networks?view=azloc-2603 | -| Create network interfaces for Azure Local multi-rack VMs | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-network-interfaces?view=azloc-2603 | -| Create public load balancers on Azure Local multi-rack virtual networks | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-public-load-balancer-virtual-networks?view=azloc-2603 | -| Configure virtual networks for Azure Local multi-rack deployments | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-virtual-networks?view=azloc-2603 | -| Create and restore data disk snapshots on Azure Local multi-rack | 
https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-disk-snapshot?view=azloc-2603 | -| Manage disks and NIC resources for Azure Local multi-rack VMs | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-manage-arc-virtual-machine-resources?view=azloc-2603 | -| Manage Azure Arc-enabled VMs on Azure Local multi-rack | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-manage-arc-virtual-machines?view=azloc-2603 | -| Configure Azure Monitor metrics for Azure Local multi-rack | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-monitor-cluster-with-metrics?view=azloc-2603 | -| Prerequisites for Azure Local multi-rack deployments | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-prerequisites?view=azloc-2603 | -| Manage VM extensions on Azure Local multi-rack VMs | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-virtual-machine-manage-extension?view=azloc-2603 | -| VM requirements for Azure Local multi-rack deployments | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-vm-management-prerequisites?view=azloc-2603 | -| Review single-server Azure Local network components | https://learn.microsoft.com/en-us/azure/azure-local/plan/single-server-components?view=azloc-2603 | -| Configure IP addressing for single-server Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/plan/single-server-ip-requirements?view=azloc-2603 | -| Review three-node Azure Local network components | https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-components?view=azloc-2603 | -| Configure IP addressing for three-node Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-ip-requirements?view=azloc-2603 | -| Review two-node Azure Local network components | https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-components?view=azloc-2603 | -| Configure IP addressing for 
two-node Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-ip-requirements?view=azloc-2603 | -| Import Azure Local update packages in low-connectivity sites | https://learn.microsoft.com/en-us/azure/azure-local/update/import-discover-updates-offline-23h2?view=azloc-2603 | -| Apply Azure Local solution updates via PowerShell | https://learn.microsoft.com/en-us/azure/azure-local/update/update-via-powershell-23h2?view=azloc-2603 | -| Configure Network ATC on existing Azure Local clusters | https://learn.microsoft.com/en-us/azure/azure-local/upgrade/install-enable-network-atc?view=azloc-2603 | +| Configure host networking for Azure Local clusters | https://learn.microsoft.com/en-us/azure/azure-local/concepts/host-network-requirements?view=azloc-2604 | +| Meet physical network requirements for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/concepts/physical-network-requirements?view=azloc-2604 | +| Distribute AKS nodes across Azure Local rack aware zones | https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-aks-nodes?view=azloc-2604 | +| Provision Azure Local VMs in local availability zones | https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-provision-vm-local-availability-zone?view=azloc-2604 | +| Use supported SAN solutions with Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/concepts/san-requirements?view=azloc-2604 | +| Apply system requirements for Azure Local 23H2 | https://learn.microsoft.com/en-us/azure/azure-local/concepts/system-requirements-23h2?view=azloc-2604 | +| Configure low-capacity Azure Local hardware requirements | https://learn.microsoft.com/en-us/azure/azure-local/concepts/system-requirements-small-23h2?view=azloc-2604 | +| Configure Private Endpoints for Azure Local without proxy or gateway | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-no-proxy-no-gateway?view=azloc-2604 | +| Configure 
Private Endpoints with Arc gateway for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-no-proxy-with-gateway?view=azloc-2604 | +| Configure Private Endpoints with proxy for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-with-proxy-no-gateway?view=azloc-2604 | +| Configure Azure Local private endpoints with proxy and Arc gateway | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-private-endpoints-with-proxy-with-gateway?view=azloc-2604 | +| Configure Azure Arc gateway for Azure Local deployments | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-azure-arc-gateway-overview?view=azloc-2604 | +| Register Azure Local with Azure Arc gateway and proxy | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-with-azure-arc-gateway?view=azloc-2604 | +| Register Azure Local with Azure Arc without gateway and configure proxy | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-without-azure-arc-gateway?view=azloc-2604 | +| Enable SDN integration on Azure Local using PowerShell action plan | https://learn.microsoft.com/en-us/azure/azure-local/deploy/enable-sdn-integration?view=azloc-2604 | +| Use LLDP validator to check rack aware cluster readiness | https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-readiness-check?view=azloc-2604 | +| Add physical NICs to existing Network ATC intents on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/add-network-adapters-to-network-intents?view=azloc-2604 | +| Configure and manage Azure Arc extensions on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/arc-extension-management?view=azloc-2604 | +| Assign SDN public IP addresses to Azure Local VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/assign-public-ip-to-vm?view=azloc-2604 | +| Attach and use GPUs on Linux VMs in Azure Local 
| https://learn.microsoft.com/en-us/azure/azure-local/manage/attach-gpu-to-linux-vm?view=azloc-2604 | +| Satisfy prerequisites for Azure Local Arc VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-arc-vm-management-prerequisites?view=azloc-2604 | +| Collect and upload Azure Local diagnostic logs via portal and PowerShell | https://learn.microsoft.com/en-us/azure/azure-local/manage/collect-logs?view=azloc-2604 | +| Configure proxy settings for Azure Local 23H2 | https://learn.microsoft.com/en-us/azure/azure-local/manage/configure-proxy-settings-23h2?view=azloc-2604 | +| Configure SLB high availability ports on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/configure-software-load-balancer?view=azloc-2604 | +| Configure logical networks for Azure Local Arc VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/create-logical-networks?view=azloc-2604 | +| Configure network interfaces for Azure Local VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/create-network-interfaces?view=azloc-2604 | +| Configure storage paths for Azure Local VM images | https://learn.microsoft.com/en-us/azure/azure-local/manage/create-storage-path?view=azloc-2604 | +| Configure and run backups for Azure Local disconnected environments | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-back-up-restore?view=azloc-2604 | +| Configure Azure CLI for Azure Local disconnected environments | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-cli?view=azloc-2604 | +| Integrate monitoring solutions with Azure Local disconnected operations | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-monitoring?view=azloc-2604 | +| Configure Azure Policy in disconnected Azure Local environments | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-policy?view=azloc-2604 | +| Set up Azure PowerShell for Azure Local 
disconnected operations | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-powershell?view=azloc-2604 | +| Register Azure Local disconnected operations for compliance | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-registration?view=azloc-2604 | +| Configure and execute restores for Azure Local disconnected environments | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-restore?view=azloc-2604 | +| Enable nested virtualization on Azure Local clusters | https://learn.microsoft.com/en-us/azure/azure-local/manage/enable-nested-virtualization?view=azloc-2604 | +| Manage SDN gateway connections in Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/gateway-connections?view=azloc-2604 | +| Configure GPU Discrete Device Assignment on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/gpu-manage-via-device?view=azloc-2604 | +| Configure GPU partitioning (GPU-P) for Azure Local VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/gpu-manage-via-partitioning?view=azloc-2604 | +| Prepare GPUs for Azure Local VMs and AKS workloads | https://learn.microsoft.com/en-us/azure/azure-local/manage/gpu-preparation?view=azloc-2604 | +| Use Azure Monitor alerts to act on Azure Local health issues | https://learn.microsoft.com/en-us/azure/azure-local/manage/health-alerts-via-azure-monitor-alerts?view=azloc-2604 | +| Track and understand Health Service automated actions on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-actions?view=azloc-2604 | +| Retrieve cluster performance history with Health Service on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-cluster-performance-history?view=azloc-2604 | +| Monitor Azure Local clusters using Health Service | https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-overview?view=azloc-2604 | 
+| Modify Azure Local Health Service settings and behavior | https://learn.microsoft.com/en-us/azure/azure-local/manage/health-service-settings?view=azloc-2604 | +| Configure and manage Software Load Balancer policies | https://learn.microsoft.com/en-us/azure/azure-local/manage/load-balancers?view=azloc-2604 | +| Manage disks and NIC resources for Azure Local VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-arc-virtual-machine-resources?view=azloc-2604 | +| Manage logical network configuration for Azure Local VMs | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-logical-networks?view=azloc-2604 | +| Configure storage thin provisioning on Azure Local 23H2 with PowerShell | https://learn.microsoft.com/en-us/azure/azure-local/manage/manage-thin-provisioning-23h2?view=azloc-2604 | +| Monitor Azure Local with Azure Monitor Metrics and performance dashboards | https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-cluster-with-metrics?view=azloc-2604 | +| Monitor Azure Local features like ReFS deduplication using Insights | https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-features?view=azloc-2604 | +| Configure Insights to monitor multiple Azure Local systems | https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-multi-23h2?view=azloc-2604 | +| Enable Azure Local Insights at scale using Azure Policy | https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-multi-azure-policies?view=azloc-2604 | +| Configure Insights to monitor a single Azure Local 23H2 system | https://learn.microsoft.com/en-us/azure/azure-local/manage/monitor-single-23h2?view=azloc-2604 | +| Enable and tune ReFS deduplication and compression on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/refs-deduplication-and-compression?view=azloc-2604 | +| Replace failed NICs in Network ATC intents on Azure Local | 
https://learn.microsoft.com/en-us/azure/azure-local/manage/replace-network-adapter-to-network-intents?view=azloc-2604 | +| Enable recommended metric alert rules for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/set-up-recommended-alert-rules?view=azloc-2604 | +| Configure metric alerts for Azure Local resources | https://learn.microsoft.com/en-us/azure/azure-local/manage/setup-metric-alerts?view=azloc-2604 | +| Set up log alerts for Azure Local using Insights and sample queries | https://learn.microsoft.com/en-us/azure/azure-local/manage/setup-system-alerts?view=azloc-2604 | +| Manage tenant logical networks in Azure Local SDN | https://learn.microsoft.com/en-us/azure/azure-local/manage/tenant-logical-networks?view=azloc-2604 | +| Manage tenant virtual networks with Hyper-V Network Virtualization | https://learn.microsoft.com/en-us/azure/azure-local/manage/tenant-virtual-networks?view=azloc-2604 | +| Unregister and re-register Azure Local machines via PowerShell | https://learn.microsoft.com/en-us/azure/azure-local/manage/unregister-register-machine?view=azloc-2604 | +| Update SDN infrastructure components on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/update-sdn?view=azloc-2604 | +| Run Azure Local Environment Checker for deployment readiness | https://learn.microsoft.com/en-us/azure/azure-local/manage/use-environment-checker?view=azloc-2604 | +| Configure and manage VM extensions on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-manage-extension?view=azloc-2604 | +| Configure Windows Server VM activation on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-activate?view=azloc-2604 | +| Configure VM affinity and anti-affinity rules on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-affinity?view=azloc-2604 | +| Configure virtual machine load balancing on Azure Local | 
https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-load-balancing?view=azloc-2604 | +| Manage Azure Local virtual machines using PowerShell | https://learn.microsoft.com/en-us/azure/azure-local/manage/vm-powershell?view=azloc-2604 | +| Create and manage Azure Local VMs with Windows Admin Center | https://learn.microsoft.com/en-us/azure/azure-local/manage/vm?view=azloc-2604 | +| Configure guest management for Azure Local migrated VMs | https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-enable-guest-management?view=azloc-2604 | +| Configure diagnostic settings to monitor Azure Local migrations | https://learn.microsoft.com/en-us/azure/azure-local/migrate/monitor-migration?view=azloc-2604 | +| Install Azure CLI extensions for Azure Local multi-rack | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-cli-extensions?view=azloc-2604 | +| Configure Layer 3 isolation domains for Azure Local multi-rack | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-configure-layer-3-isolation-domain?view=azloc-2604 | +| Connect to Azure Local multi-rack VMs via SSH and RDP over SSH | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-connect-arc-vm-using-ssh?view=azloc-2604 | +| Create Azure Arc-enabled VMs on Azure Local multi-rack | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-arc-virtual-machines?view=azloc-2604 | +| Configure internal load balancers on Azure Local multi-rack | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-internal-load-balancer-virtual-networks?view=azloc-2604 | +| Create load balancers on logical networks in Azure Local multi-rack | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-load-balancer-logical-network?view=azloc-2604 | +| Create logical networks for Azure Local multi-rack VMs | 
https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-logical-networks?view=azloc-2604 | +| Configure network interfaces for Azure Local multi-rack VMs | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-network-interfaces?view=azloc-2604 | +| Create public IP resources on Azure Local multi-rack deployments | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-public-ip?view=azloc-2604 | +| Configure public load balancers on Azure Local multi-rack VNets | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-public-load-balancer-virtual-networks?view=azloc-2604 | +| Create virtual networks for Azure Local multi-rack deployments | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-virtual-networks?view=azloc-2604 | +| Create and restore data disk snapshots on Azure Local multi-rack | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-disk-snapshot?view=azloc-2604 | +| Manage disks and NIC resources for Azure Local multi-rack VMs | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-manage-arc-virtual-machine-resources?view=azloc-2604 | +| Manage lifecycle of Azure Local multi-rack Arc-enabled VMs | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-manage-arc-virtual-machines?view=azloc-2604 | +| Manage logical networks for Azure Local multi-rack Arc VMs | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-manage-logical-networks?view=azloc-2604 | +| Monitor Azure Local multi-rack with Azure Monitor Metrics | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-monitor-cluster-with-metrics?view=azloc-2604 | +| Meet prerequisites for Azure Local multi-rack deployments | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-prerequisites?view=azloc-2604 | +| Install and manage VM extensions on 
Azure Local multi-rack VMs | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-virtual-machine-manage-extension?view=azloc-2604 | +| Manage VM images on Azure Local multi-rack deployments | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-virtual-machine-manage-image?view=azloc-2604 | +| Satisfy VM prerequisites for Azure Local multi-rack | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-vm-management-prerequisites?view=azloc-2604 | +| Review single-server network components for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/plan/single-server-components?view=azloc-2604 | +| Apply IP address requirements for single-server Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/plan/single-server-ip-requirements?view=azloc-2604 | +| Review three-node network components for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-components?view=azloc-2604 | +| Apply IP address requirements for three-node Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/plan/three-node-ip-requirements?view=azloc-2604 | +| Review two-node network components for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-components?view=azloc-2604 | +| Apply IP address requirements for two-node Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/plan/two-node-ip-requirements?view=azloc-2604 | +| Import Azure Local update packages in offline environments | https://learn.microsoft.com/en-us/azure/azure-local/update/import-discover-updates-offline-23h2?view=azloc-2604 | +| Manage Solution Builder Extension updates on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/update/solution-builder-extension?view=azloc-2604 | +| Configure Azure Local update settings and behavior | https://learn.microsoft.com/en-us/azure/azure-local/update/update-settings?view=azloc-2604 | ### Integrations & Coding 
Patterns | Topic | URL | |-------|-----| -| Create Azure Local VMs from Azure Compute Gallery images | https://learn.microsoft.com/en-us/azure/azure-local/manage/virtual-machine-image-azure-compute-gallery?view=azloc-2603 | -| Discover and replicate Hyper-V VMs with Azure Migrate to Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-hyperv-replicate?view=azloc-2603 | -| Migrate Azure Local VMs with Azure Migrate PowerShell | https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-via-powershell?view=azloc-2603 | -| Connect to Azure Local multi-rack Windows VMs via SSH | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-connect-arc-vm-using-ssh?view=azloc-2603 | -| Create internal load balancer via Azure CLI for Azure Local multi-rack | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-internal-load-balancer-virtual-networks?view=azloc-2603 | -| Create load balancers on logical networks with Azure CLI | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-load-balancer-logical-network?view=azloc-2603 | -| Create public IP resources on Azure Local multi-rack | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-create-public-ip?view=azloc-2603 | -| Create Azure Local multi-rack VM images from Azure Storage | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-virtual-machine-image-storage-account?view=azloc-2603 | +| Enable external SAN storage integration with Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/deploy/enable-external-storage?view=azloc-2604 | +| Connect to Azure Local VMs using SSH and RDP | https://learn.microsoft.com/en-us/azure/azure-local/manage/connect-arc-vm-using-ssh?view=azloc-2604 | +| Create Azure Local VMs enabled by Azure Arc | https://learn.microsoft.com/en-us/azure/azure-local/manage/create-arc-virtual-machines?view=azloc-2604 | +| Use Azure Local Remote 
Support Arc extension for assisted diagnostics | https://learn.microsoft.com/en-us/azure/azure-local/manage/remote-support-arc-extension?view=azloc-2604 | +| Use custom AKS storage classes to consume external SAN on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/use-external-storage-for-containerized-workloads?view=azloc-2604 | +| Migrate VMs to Azure Local with Azure Migrate PowerShell | https://learn.microsoft.com/en-us/azure/azure-local/migrate/migrate-via-powershell?view=azloc-2604 | +| Download Azure managed disks to Azure Local multi-rack | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-manage-data-disks?view=azloc-2604 | +| Create Azure Local multi-rack VM images from Azure Storage | https://learn.microsoft.com/en-us/azure/azure-local/multi-rack/multi-rack-virtual-machine-image-storage-account?view=azloc-2604 | ### Deployment | Topic | URL | |-------|-----| -| Add or repair nodes in Azure Local rack aware clusters | https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-add-server?view=azloc-2603 | -| Deploy and manage Azure Arc gateway for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-azure-arc-gateway-overview?view=azloc-2603 | -| Deploy Azure Local using Azure Resource Manager templates | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-azure-resource-manager-template?view=azloc-2603 | -| Deploy Azure Local with local identity and Key Vault via ARM template | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-local-identity-with-key-vault-template?view=azloc-2603 | -| Deploy a virtualized Azure Local instance | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-virtual?view=azloc-2603 | -| Deploy Azure Local rack aware clusters via Azure portal | https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-deploy-portal?view=azloc-2603 | -| Deploy rack aware clusters 
using ARM templates at scale | https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-deployment-via-template?view=azloc-2603 | -| Perform post-deployment tasks for rack aware clusters | https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-post-deployment?view=azloc-2603 | -| Deploy SQL Server workloads on Azure Local 23H2 | https://learn.microsoft.com/en-us/azure/azure-local/deploy/sql-server-23h2?view=azloc-2603 | -| Scale Azure Local capacity by adding cluster nodes | https://learn.microsoft.com/en-us/azure/azure-local/manage/add-server?view=azloc-2603 | -| Acquire and set up disconnected operations appliance for Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-acquire?view=azloc-2603 | -| Deploy Azure Container Registry on Azure Local disconnected operations | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-azure-container-registry?view=azloc-2603 | -| Deploy disconnected Azure Local instances in datacenters | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-deploy?view=azloc-2603 | -| Prepare Azure Local nodes for disconnected deployments | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-prepare?view=azloc-2603 | -| Register Azure Local disconnected operations for compliance | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-registration?view=azloc-2603 | -| Repair Azure Local 23H2 cluster nodes safely | https://learn.microsoft.com/en-us/azure/azure-local/manage/repair-server?view=azloc-2603 | -| Suspend and resume Azure Local machines for maintenance | https://learn.microsoft.com/en-us/azure/azure-local/manage/suspend-resume-cluster-maintenance?view=azloc-2603 | -| Plan supported Azure Local release upgrade paths | https://learn.microsoft.com/en-us/azure/azure-local/release-information-23h2?view=azloc-2603 | -| Run post-upgrade 
tasks for Azure Local via PowerShell | https://learn.microsoft.com/en-us/azure/azure-local/upgrade/post-upgrade-steps?view=azloc-2603 | -| Upgrade Azure Stack HCI OS 22H2 to 23H2 or 24H2 via PowerShell | https://learn.microsoft.com/en-us/azure/azure-local/upgrade/upgrade-22h2-to-23h2-powershell?view=azloc-2603 | \ No newline at end of file +| Plan Network Controller VM deployment on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/concepts/plan-network-controller-deployment?view=azloc-2604 | +| Add or repair nodes in Azure Local rack aware clusters | https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-add-server?view=azloc-2604 | +| Understand Azure Local rack aware clustering capabilities | https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-overview?view=azloc-2604 | +| Review requirements and supported configs for rack aware clusters | https://learn.microsoft.com/en-us/azure/azure-local/concepts/rack-aware-cluster-requirements?view=azloc-2604 | +| Deploy Azure Local disaggregated clusters via Azure portal | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-via-portal-disaggregated?view=azloc-2604 | +| Deploy an Azure Local instance from the Azure portal | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deploy-via-portal?view=azloc-2604 | +| Deploy Azure Local disaggregated using ARM templates | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-azure-resource-manager-template-disaggregated?view=azloc-2604 | +| Deploy Azure Local 23H2 using ARM templates at scale | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-azure-resource-manager-template?view=azloc-2604 | +| Install Azure Local OS for disaggregated deployments with SConfig | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-install-os-disaggregated?view=azloc-2604 | +| Install Azure Stack HCI 23H2 OS using SConfig | 
https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-install-os?view=azloc-2604 | +| Deploy Azure Local with local identity and Key Vault via ARM template | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-local-identity-with-key-vault-template?view=azloc-2604 | +| Verify security, hardware, and network prerequisites for Azure Local deployment | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-prerequisites?view=azloc-2604 | +| Deploy virtualized Azure Local 23H2/24H2 systems | https://learn.microsoft.com/en-us/azure/azure-local/deploy/deployment-virtual?view=azloc-2604 | +| Download Azure Local 23H2 OS from Azure portal | https://learn.microsoft.com/en-us/azure/azure-local/deploy/download-23h2-software?view=azloc-2604 | +| Deploy Azure Local rack aware clusters via Azure portal | https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-deploy-portal?view=azloc-2604 | +| Deploy rack aware clusters using ARM templates | https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-deployment-via-template?view=azloc-2604 | +| Complete post-deployment tasks for rack aware clusters | https://learn.microsoft.com/en-us/azure/azure-local/deploy/rack-aware-cluster-post-deployment?view=azloc-2604 | +| Deploy SDN infrastructure with SDN Express scripts | https://learn.microsoft.com/en-us/azure/azure-local/deploy/sdn-express-23h2?view=azloc-2604 | +| Deploy SDN via Windows Admin Center on Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/deploy/sdn-wizard-23h2?view=azloc-2604 | +| Provision Azure Local machines with simplified machine provisioning | https://learn.microsoft.com/en-us/azure/azure-local/deploy/simplified-machine-provisioning?view=azloc-2604 | +| Deploy SQL Server workloads on Azure Local 23H2 | https://learn.microsoft.com/en-us/azure/azure-local/deploy/sql-server-23h2?view=azloc-2604 | +| Protect Azure Local Hyper-V VMs with Site Recovery | 
https://learn.microsoft.com/en-us/azure/azure-local/manage/azure-site-recovery?view=azloc-2604 | +| Acquire and provision Azure Local disconnected operations appliance | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-acquire?view=azloc-2604 | +| Deploy and manage Azure Container Registry on Azure Local disconnected | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-azure-container-registry?view=azloc-2604 | +| Deploy Azure Local management cluster for disconnected operations | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-deploy?view=azloc-2604 | +| Prepare Azure Local nodes for disconnected deployment | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-prepare?view=azloc-2604 | +| Update Azure Local disconnected operations appliance safely | https://learn.microsoft.com/en-us/azure/azure-local/manage/disconnected-operations-update?view=azloc-2604 | +| Repair nodes in Azure Local 23H2 clusters | https://learn.microsoft.com/en-us/azure/azure-local/manage/repair-server?view=azloc-2604 | +| Upgrade SDN gateway VMs with minimal disruption | https://learn.microsoft.com/en-us/azure/azure-local/manage/upgrade-sdn-gateways?view=azloc-2604 | +| Plan supported Azure Local release upgrade paths | https://learn.microsoft.com/en-us/azure/azure-local/release-information-23h2?view=azloc-2604 | +| Use Azure Update Manager to update Azure Local | https://learn.microsoft.com/en-us/azure/azure-local/update/azure-update-manager-23h2?view=azloc-2604 | +| Import and discover Azure Local updates in limited connectivity | https://learn.microsoft.com/en-us/azure/azure-local/update/import-discover-updates-offline-23h2?view=azloc-2604 | +| Apply Azure Local solution updates via PowerShell | https://learn.microsoft.com/en-us/azure/azure-local/update/update-via-powershell-23h2?view=azloc-2604 | +| Upgrade Azure Stack HCI OS 22H2 to 23H2 or 24H2 via 
PowerShell | https://learn.microsoft.com/en-us/azure/azure-local/upgrade/upgrade-22h2-to-23h2-powershell?view=azloc-2604 | \ No newline at end of file diff --git a/skills/azure-machine-learning/SKILL.md b/skills/azure-machine-learning/SKILL.md index f33ecf17..6a9c00d5 100644 --- a/skills/azure-machine-learning/SKILL.md +++ b/skills/azure-machine-learning/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-machine-learning -description: Expert knowledge for Azure Machine Learning development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when using Azure ML workspaces, AutoML, Prompt Flow, online/batch endpoints, or the Python SDK/CLI, and other Azure Machine Learning related development tasks. Not for Azure Databricks (use azure-databricks), Azure Synapse Analytics (use azure-synapse-analytics), Azure Data Science Virtual Machines (use azure-data-science-vm), Azure HDInsight (use azure-hdinsight). +description: Expert knowledge for Azure Machine Learning development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when using Azure ML jobs/endpoints, AutoML, Prompt Flow, feature store, or CI/CD MLOps pipelines, and other Azure Machine Learning related development tasks. Not for Azure Databricks (use azure-databricks), Azure Synapse Analytics (use azure-synapse-analytics), Azure Data Science Virtual Machines (use azure-data-science-vm), Azure HDInsight (use azure-hdinsight). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. 
metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Machine Learning Skill @@ -26,13 +26,13 @@ This skill requires **network access** to fetch documentation content: |----------|-------|-------------| | Troubleshooting | L37-L71 | Diagnosing and fixing Azure ML issues: pipelines, AutoML, endpoints (online/batch), networking (VNet/private/Kubernetes), environments/images, prompt flow, feature store, and known bugs. | | Best Practices | L72-L95 | Best practices for Azure ML training, AutoML, and LLM/prompt flow: cost and compute optimization, data prep, monitoring, deployment scripts, performance tuning, and ethical data use. | -| Decision Making | L96-L124 | Guidance on Azure ML design choices: algorithm selection, training methods, networking and DR, cost optimization, and detailed migration/upgrade paths from SDK v1 to v2 and between services. | -| Architecture & Design Patterns | L125-L131 | Designing Azure ML inference architectures: choosing endpoint types, planning real-time online endpoints, and structuring data movement and multistep pipeline components. | -| Limits & Quotas | L132-L140 | Info on Azure ML regional/sovereign availability, VM SKUs, and service limits, plus how to view, plan, and manage quotas and capacity for model deployments and endpoints. | -| Security | L141-L194 | Securing Azure ML workspaces, data, and endpoints with encryption, identity/RBAC, Key Vault, policies, and network isolation (VNets, private access, exfiltration prevention). | -| Configuration | L195-L467 | Configuring Azure ML: designer components, AutoML, compute, networking, data, monitoring, registries, prompt flow, and full CLI/SDK/YAML setup for training, deployment, and ops. | -| Integrations & Coding Patterns | L468-L511 | Integrating Azure ML with data sources, Spark/Databricks/Synapse, MLflow, REST/HTTP, prompt flow, and batch/online endpoints, plus patterns for logging, storage, and deployment. 
| -| Deployment | L512-L553 | Deploying and operationalizing models and prompt flows: online/batch endpoints, AKS/ACI, CI/CD, MLOps/GenAIOps, blue‑green rollouts, pipelines, and cross-workspace consumption. | +| Decision Making | L96-L125 | Guidance on Azure ML design choices and migrations: selecting algorithms, training and networking options, cost/failover strategies, and upgrading/migrating from v1, ACI, Prompt Flow to v2/Agent Framework. | +| Architecture & Design Patterns | L126-L132 | Designing Azure ML inference architectures: choosing endpoint types, planning real-time online endpoints, and structuring data movement and multistep pipeline components. | +| Limits & Quotas | L133-L141 | Info on Azure ML regional/sovereign availability, VM SKUs, and service limits, plus how to view, plan, and manage quotas and capacity for model deployments and endpoints. | +| Security | L142-L194 | Securing Azure ML workspaces, data, and endpoints with encryption, identity/RBAC, Key Vault, policies, and network isolation (VNets, private access, exfiltration prevention). | +| Configuration | L195-L455 | Configuring Azure ML: designer components, AutoML, YAML/CLI jobs, compute, networking, data, monitoring, registries, environments, and deployment/runtime settings. | +| Integrations & Coding Patterns | L456-L511 | Integrating Azure ML with data sources, Spark/Databricks/Synapse, MLflow, REST APIs, batch/online endpoints, Prompt Flow, and external tools to run, track, and deploy ML workflows. | +| Deployment | L512-L554 | Deploying and operating models and prompt flows: online/batch endpoints, AKS/ACI, CI/CD and MLOps/GenAIOps pipelines, blue‑green rollouts, cross-workspace use, and pipeline-based batch scoring. 
| ### Troubleshooting | Topic | URL | @@ -121,6 +121,7 @@ This skill requires **network access** to fetch documentation content: | Evaluate compute management changes from AML v1 to v2 | https://learn.microsoft.com/en-us/azure/machine-learning/migrate-to-v2-resource-compute?view=azureml-api-2 | | Migrate datastore management from AML v1 to v2 | https://learn.microsoft.com/en-us/azure/machine-learning/migrate-to-v2-resource-datastore?view=azureml-api-2 | | Decide how to upgrade Azure ML workspaces to SDK v2 | https://learn.microsoft.com/en-us/azure/machine-learning/migrate-to-v2-resource-workspace?view=azureml-api-2 | +| Plan migration from Prompt Flow to Agent Framework | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/migrate-prompt-flow-to-agent-framework?view=azureml-api-2 | ### Architecture & Design Patterns | Topic | URL | @@ -186,7 +187,6 @@ This skill requires **network access** to fetch documentation content: | Securely use private Python packages in Azure ML | https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-private-python-packages?view=azureml-api-1 | | Securely use Key Vault secrets in Azure ML runs | https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-secrets-in-runs?view=azureml-api-2 | | Apply built-in Azure Policy definitions for AML | https://learn.microsoft.com/en-us/azure/machine-learning/policy-reference?view=azureml-api-2 | -| Manage API and data source credentials with prompt flow connections | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/concept-connections?view=azureml-api-2 | | Secure prompt flow with virtual network isolation in Azure ML | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-secure-prompt-flow?view=azureml-api-2 | | Apply Azure Policy regulatory controls to Azure ML | https://learn.microsoft.com/en-us/azure/machine-learning/security-controls-policy?view=azureml-api-2 | | Secure Azure ML workspace with custom VNet | 
https://learn.microsoft.com/en-us/azure/machine-learning/tutorial-create-secure-workspace-vnet?view=azureml-api-2 | @@ -381,23 +381,11 @@ This skill requires **network access** to fetch documentation content: | Configure VS Code remote sessions to Azure ML compute | https://learn.microsoft.com/en-us/azure/machine-learning/how-to-work-in-vs-code-remote?view=azureml-api-2 | | Configure serverless Spark compute for Azure ML notebooks | https://learn.microsoft.com/en-us/azure/machine-learning/interactive-data-wrangling-with-apache-spark-azure-ml?view=azureml-api-2 | | Reference Azure Machine Learning monitoring metrics and logs | https://learn.microsoft.com/en-us/azure/machine-learning/monitor-azure-machine-learning-reference?view=azureml-api-2 | -| Configure custom base images for prompt flow sessions | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-customize-session-base-image?view=azureml-api-2 | +| Customize compute session base images for Prompt Flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-customize-session-base-image?view=azureml-api-2 | | Configure and consume streaming responses from prompt flow endpoints | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-enable-streaming-mode?view=azureml-api-2 | | Enable tracing and user feedback collection for prompt flow deployments | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-enable-trace-feedback-for-deployment?view=azureml-api-2 | | Configure and manage prompt flow compute sessions in Azure ML | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-manage-compute-session?view=azureml-api-2 | | Configure monitoring for Azure ML generative AI apps | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-monitor-generative-ai-applications?view=azureml-api-2 | -| Configure Azure OpenAI GPT-4 Turbo with Vision tool | 
https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/azure-open-ai-gpt-4v-tool?view=azureml-api-2 | -| Configure Content Safety text tool in prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/content-safety-text-tool?view=azureml-api-2 | -| Configure embedding tool for OpenAI in prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/embedding-tool?view=azureml-api-2 | -| Configure Index Lookup tool for RAG in prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/index-lookup-tool?view=azureml-api-2 | -| Configure LLM tool in Azure ML prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/llm-tool?view=azureml-api-2 | -| Use Open Model LLM tool in Azure ML prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/open-model-llm-tool?view=azureml-api-2 | -| Configure OpenAI GPT-4V tool in Azure ML prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/openai-gpt-4v-tool?view=azureml-api-2 | -| Configure and manage tools in Azure ML prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/overview?view=azureml-api-2 | -| Use and configure prompt templates in prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/prompt-tool?view=azureml-api-2 | -| Create and configure Python tools in prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/python-tool?view=azureml-api-2 | -| Configure Rerank tool for RAG in prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/rerank-tool?view=azureml-api-2 | -| Configure SerpAPI search tool in Azure ML prompt flow | 
https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/serp-api-tool?view=azureml-api-2 | | Configure Automated ML forecasting jobs via YAML | https://learn.microsoft.com/en-us/azure/machine-learning/reference-automated-ml-forecasting?view=azureml-api-2 | | Author AutoML image classification jobs in YAML | https://learn.microsoft.com/en-us/azure/machine-learning/reference-automl-images-cli-classification?view=azureml-api-2 | | Define AutoML image instance segmentation YAML jobs | https://learn.microsoft.com/en-us/azure/machine-learning/reference-automl-images-cli-instance-segmentation?view=azureml-api-2 | @@ -502,10 +490,22 @@ This skill requires **network access** to fetch documentation content: | Integrate Azure Databricks MLflow tracking with Azure ML | https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-mlflow-azure-databricks?view=azureml-api-2 | | Configure MLflow tracking from Azure Synapse to Azure ML | https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-mlflow-azure-synapse?view=azureml-api-2 | | Integrate Azure Synapse Spark in Azure ML pipelines | https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-synapsesparkstep?view=azureml-api-1 | -| Create and use custom tool packages in Azure ML prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-custom-tool-package-creation-and-usage?view=azureml-api-2 | -| Evaluate Semantic Kernel plugins and planners with prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-evaluate-semantic-kernel?view=azureml-api-2 | -| Integrate LangChain workflows with Azure ML prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-integrate-with-langchain?view=azureml-api-2 | +| Create and use custom tool packages in prompt flow | 
https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-custom-tool-package-creation-and-usage?view=azureml-api-2 | +| Evaluate Semantic Kernel plugins using Prompt Flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-evaluate-semantic-kernel?view=azureml-api-2 | +| Integrate LangChain workflows into Prompt Flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-integrate-with-langchain?view=azureml-api-2 | +| Rebuild Prompt Flow workflows using Agent Framework | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-migrate-prompt-flow-to-agent-framework?view=azureml-api-2 | | Incorporate image inputs into Azure ML prompt flows | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-process-image?view=azureml-api-2 | +| Configure Azure OpenAI GPT-4 Turbo with Vision tool in prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/azure-open-ai-gpt-4v-tool?view=azureml-api-2 | +| Use Content Safety text tool in prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/content-safety-text-tool?view=azureml-api-2 | +| Configure embedding tool for OpenAI models in prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/embedding-tool?view=azureml-api-2 | +| Configure Index Lookup tool for RAG in prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/index-lookup-tool?view=azureml-api-2 | +| Configure and use LLM tool in prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/llm-tool?view=azureml-api-2 | +| Use Open Model LLM tool with foundational models in prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/open-model-llm-tool?view=azureml-api-2 | +| Use OpenAI GPT-4V vision tool in prompt 
flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/openai-gpt-4v-tool?view=azureml-api-2 | +| Use prompt templates with the prompt tool in prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/prompt-tool?view=azureml-api-2 | +| Build and configure Python tools in prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/python-tool?view=azureml-api-2 | +| Use rerank tool to improve RAG search in prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/rerank-tool?view=azureml-api-2 | +| Integrate SerpAPI search results with prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/tools-reference/serp-api-tool?view=azureml-api-2 | | Quickstart: Configure Spark jobs in Azure ML | https://learn.microsoft.com/en-us/azure/machine-learning/quickstart-spark-jobs?view=azureml-api-2 | | Map Azure ML v1 logging APIs to MLflow tracking | https://learn.microsoft.com/en-us/azure/machine-learning/reference-migrate-sdk-v1-mlflow-tracking?view=azureml-api-2 | @@ -547,6 +547,7 @@ This skill requires **network access** to fetch documentation content: | Operationalize scoring pipelines on AML batch endpoints | https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-batch-scoring-pipeline?view=azureml-api-2 | | Operationalize training pipelines on AML batch endpoints | https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-batch-training-pipeline?view=azureml-api-2 | | Build RAG pipelines with Azure ML and prompt flow | https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-pipelines-prompt-flow?view=azureml-api-2 | +| Deploy and operate Agent Framework workflows on Azure | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-deploy-migrated-agent-framework-workflow?view=azureml-api-2 | | Deploy prompt flows to managed or Kubernetes 
online endpoints with CLI | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-deploy-to-code?view=azureml-api-2 | | Implement GenAIOps with prompt flow and Azure DevOps pipelines | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-end-to-end-azure-devops-with-prompt-flow?view=azureml-api-2 | | Implement GenAIOps with prompt flow and GitHub pipelines | https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-end-to-end-llmops-with-prompt-flow?view=azureml-api-2 | diff --git a/skills/azure-managed-grafana/SKILL.md b/skills/azure-managed-grafana/SKILL.md index 66b0aeb3..b86c388c 100644 --- a/skills/azure-managed-grafana/SKILL.md +++ b/skills/azure-managed-grafana/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-managed-grafana -description: Expert knowledge for Azure Managed Grafana development including troubleshooting, decision making, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when integrating Azure Managed Grafana with Azure Monitor/AKS, Entra auth, plugins, email alerts, or zone-redundant setups, and other Azure Managed Grafana related development tasks. Not for Azure Monitor (use azure-monitor), Azure Synapse Analytics (use azure-synapse-analytics), Azure Data Explorer (use azure-data-explorer). +description: Expert knowledge for Azure Managed Grafana development including troubleshooting, decision making, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when connecting Azure Monitor/Prometheus, configuring Entra auth/RBAC, zone-redundant workspaces, alerts, or image/report rendering, and other Azure Managed Grafana related development tasks. Not for Azure Monitor (use azure-monitor). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. 
metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Managed Grafana Skill @@ -27,10 +27,10 @@ This skill requires **network access** to fetch documentation content: | Troubleshooting | L35-L40 | Diagnosing and resolving common Azure Managed Grafana issues, including access, configuration, and private endpoint connectivity and DNS problems. | | Decision Making | L41-L48 | Guidance on choosing and managing Grafana Enterprise plans, migrating from self‑hosted/other dashboards, and planning upgrades (Grafana 11→12) in Azure Managed Grafana. | | Limits & Quotas | L49-L54 | Using image/report rendering features in Azure Managed Grafana and understanding its service limits, quotas, and operational constraints | -| Security | L55-L68 | Securing Managed Grafana: auth/permissions, Entra/Team Sync, roles, service accounts/tokens, private access/endpoints, outbound IPs, data encryption, and security best practices. | -| Configuration | L69-L78 | Configuring Azure Managed Grafana workspaces: access control, instance settings, plugins, metrics/diagnostics via Azure Monitor, and SMTP email alert setup. | -| Integrations & Coding Patterns | L79-L91 | Integrating Managed Grafana with Azure AI/Agent Framework, Prometheus, AKS, Azure Monitor, and Data Explorer, plus configuring data sources, alerts, and private/managed connections. | -| Deployment | L92-L95 | Designing highly available Azure Managed Grafana workspaces, including reliability features, SLAs, and enabling zone-redundant deployments for resiliency. | +| Security | L55-L68 | Securing Managed Grafana: auth and RBAC, Entra/Team Sync, service accounts/tokens, private endpoints and outbound IPs, data encryption, Azure Monitor access, and security best practices. | +| Configuration | L69-L77 | Configuring Managed Grafana workspaces: instance settings, plugins, metrics via Azure Monitor, diagnostic logging, and SMTP email alert setup. 
| +| Integrations & Coding Patterns | L78-L92 | Patterns and code to integrate Managed Grafana with Azure Monitor, Prometheus, App Insights, AKS, Azure AI Foundry, MCP, and Data Explorer, plus managing data source plugins and alerts. | +| Deployment | L93-L96 | Designing highly available Azure Managed Grafana workspaces, including reliability features, SLAs, and enabling zone-redundant deployments for resiliency. | ### Troubleshooting | Topic | URL | @@ -59,7 +59,7 @@ This skill requires **network access** to fetch documentation content: | Configure authentication and permissions for Managed Grafana | https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-authentication-permissions | | Connect Managed Grafana to data sources via private endpoints | https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-connect-to-data-source-privately | | Set up deterministic outbound IPs for Managed Grafana | https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-deterministic-ip | -| Manage user and identity roles in Managed Grafana | https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-manage-access-permissions-users-identities | +| Manage Azure Managed Grafana roles and access | https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-manage-access-permissions-users-identities | | Grant Azure Monitor access to Managed Grafana | https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-permissions | | Use service accounts and tokens in Managed Grafana | https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-service-accounts | | Configure private access and endpoints for Managed Grafana | https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-set-up-private-access | @@ -69,7 +69,6 @@ This skill requires **network access** to fetch documentation content: ### Configuration | Topic | URL | |-------|-----| -| Configure Azure Managed Grafana AMG-MCP server access | 
https://learn.microsoft.com/en-us/azure/managed-grafana/grafana-mcp-server | | Configure Azure Managed Grafana instance settings | https://learn.microsoft.com/en-us/azure/managed-grafana/grafana-settings | | Manage Grafana plugins in Azure Managed Grafana | https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-manage-plugins | | Monitor Managed Grafana metrics with Azure Monitor charts | https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-monitor-managed-grafana-metrics | @@ -81,6 +80,8 @@ This skill requires **network access** to fetch documentation content: |-------|-----| | Monitor Agent Framework workflows in Grafana | https://learn.microsoft.com/en-us/azure/managed-grafana/agent-framework-workflow-dashboard | | Build Azure AI Foundry monitoring dashboards in Grafana | https://learn.microsoft.com/en-us/azure/managed-grafana/azure-ai-foundry-dashboard | +| Use Azure Managed Grafana MCP server programmatically | https://learn.microsoft.com/en-us/azure/managed-grafana/grafana-mcp-server | +| Ingest telemetry into App Insights for Grafana | https://learn.microsoft.com/en-us/azure/managed-grafana/grafana-opentelemetry-app-insights | | Configure bundled Prometheus integration in Managed Grafana | https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-bundled-prometheus | | Integrate Azure Managed Grafana MCP with Azure AI Foundry agents | https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-configure-mcp-for-ai-foundry | | Add Azure Data Explorer as a Grafana data source | https://learn.microsoft.com/en-us/azure/managed-grafana/how-to-connect-azure-data-explorer | diff --git a/skills/azure-managed-lustre/SKILL.md b/skills/azure-managed-lustre/SKILL.md index 71c4d57c..780968f8 100644 --- a/skills/azure-managed-lustre/SKILL.md +++ b/skills/azure-managed-lustre/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-managed-lustre -description: Expert knowledge for Azure Managed Lustre development including troubleshooting, best practices, 
architecture & design patterns, limits & quotas, security, configuration, and integrations & coding patterns. Use when mounting AMLFS, integrating with Blob auto-import/export, using AKS CSI, setting CMK/root squash, or tuning performance, and other Azure Managed Lustre related development tasks. Not for Azure HPC Cache (use azure-hpc-cache), Azure NetApp Files (use azure-netapp-files), Azure Blob Storage (use azure-blob-storage), Azure Elastic SAN (use azure-elastic-san). +description: Expert knowledge for Azure Managed Lustre development including troubleshooting, best practices, architecture & design patterns, limits & quotas, security, configuration, and integrations & coding patterns. Use when deploying AMLFS with Blob auto-import/export, Terraform, AKS CSI, quotas, or HA/DR failover, and other Azure Managed Lustre related development tasks. Not for Azure HPC Cache (use azure-hpc-cache), Azure NetApp Files (use azure-netapp-files), Azure Blob Storage (use azure-blob-storage), Azure Elastic SAN (use azure-elastic-san). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-05" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Managed Lustre Skill @@ -24,7 +24,7 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| -| Troubleshooting | L35-L40 | Diagnosing and resolving Azure Managed Lustre deployment failures and performance issues, including cluster provisioning errors, throughput/latency problems, and tuning guidance. | +| Troubleshooting | L35-L40 | Diagnosing and resolving Azure Managed Lustre issues, including cluster deployment failures, configuration problems, and performance bottlenecks or slow I/O. 
| | Best Practices | L41-L46 | Guidance on tuning Azure Managed Lustre performance via optimal file/directory layout, client striping, and network setup (NICs, throughput, latency, and scaling). | | Architecture & Design Patterns | L47-L51 | Designing Azure Managed Lustre for high availability, regional redundancy, disaster recovery, and failover strategies across regions or zones | | Limits & Quotas | L52-L56 | Configuring and managing user, group, and project storage quotas in Azure Managed Lustre, including setup steps, commands, and best practices for capacity control. | diff --git a/skills/azure-microsoft-discovery/SKILL.md b/skills/azure-microsoft-discovery/SKILL.md new file mode 100644 index 00000000..295aa536 --- /dev/null +++ b/skills/azure-microsoft-discovery/SKILL.md @@ -0,0 +1,117 @@ +--- +name: azure-microsoft-discovery +description: Expert knowledge for Azure Microsoft Discovery development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when building Discovery workspaces, supercomputer agents, Bookshelf/CogLoop logs, REST control, or v1→v2 migration, and other Azure Microsoft Discovery related development tasks. +compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. +metadata: + generated_at: "2026-04-26" + generator: "docs2skills/1.0.0" +--- +# Azure Microsoft Discovery Skill + +This skill provides expert guidance for Azure Microsoft Discovery. Covers troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. It combines local quick-reference content with remote documentation fetching capabilities. + +## How to Use This Skill + +> **IMPORTANT for Agent**: Use the **Category Index** below to locate relevant sections. 
For categories with line ranges (e.g., `L35-L120`), use `read_file` with the specified lines. For categories with file links (e.g., `[security.md](security.md)`), use `read_file` on the linked reference file. + +> **IMPORTANT for Agent**: If `metadata.generated_at` is more than 3 months old, suggest the user pull the latest version from the repository. If `mcp_microsoftdocs` tools are not available, suggest the user install them: [Installation Guide](https://github.com/MicrosoftDocs/mcp/blob/main/README.md) + +This skill requires **network access** to fetch documentation content: +- **Preferred**: Use `mcp_microsoftdocs:microsoft_docs_fetch` with query string `from=learn-agent-skill`. Returns Markdown. +- **Fallback**: Use `fetch_webpage` with query string `from=learn-agent-skill&accept=text/markdown`. Returns Markdown. + +## Category Index + +| Category | Lines | Description | +|----------|-------|-------------| +| Troubleshooting | L37-L41 | Diagnosing and fixing Discovery task execution problems, including common failures, debugging steps, logs, and configuration issues. | +| Best Practices | L42-L48 | Guidance on structuring Discovery projects, running trustworthy investigations, and applying responsible AI and trust patterns in Microsoft Discovery research workflows. | +| Decision Making | L49-L58 | Guidance on choosing agent types and models, planning compute and billing, and migrating/transitioning Discovery resources and configurations between v1 and v2. | +| Architecture & Design Patterns | L59-L63 | Advanced investigation workflows in Discovery Engine: multi-step queries, pattern-based searches, correlation strategies, and best practices for complex data exploration. 
| +| Limits & Quotas | L64-L70 | File size/type limits, investigation file handling, required Azure quotas/reservations, and naming rules for Microsoft Discovery resources and deployments. | +| Security | L71-L82 | Securing Discovery resources: encryption at rest, customer-managed keys, managed identities, RBAC, private endpoints, network rules, and enabling/exporting audit logs. | +| Configuration | L83-L103 | Configuring Discovery workspaces, storage, tools, agents, and supercomputers, plus accessing and querying logs for Bookshelf, CogLoop, and other Discovery resources. | +| Integrations & Coding Patterns | L104-L110 | Integrating external tools/models into Discovery workflows, managing Discovery supercomputers via REST, and writing action scripts for action-based tools. | +| Deployment | L111-L117 | Deploying and hardening Discovery in Azure: Bicep-based infra setup, v2 resource recreation, container image publishing to ACR, and secure networked stack deployment. | + +### Troubleshooting +| Topic | URL | +|-------|-----| +| Troubleshoot and debug Discovery task execution issues | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-debug-task-execution | + +### Best Practices +| Topic | URL | +|-------|-----| +| Organize Microsoft Discovery projects and investigations effectively | https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-projects-investigations | +| Apply responsible AI practices in Microsoft Discovery research | https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-responsible-ai | +| Apply trust and basic investigation patterns in Discovery | https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-trust-basic-investigation-patterns | + +### Decision Making +| Topic | URL | +|-------|-----| +| Choose between prompt and workflow agents in Discovery | https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-discovery-agent-types | +| Understand Microsoft Discovery billing model and 
metering | https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-discovery-billing | +| Plan Microsoft Discovery v1 to v2 resource transition | https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-v1-to-v2-transition-guide | +| Export Microsoft Discovery v1 configurations for migration | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-collect-v1-configurations | +| Plan compute and functional requirements for Discovery tools | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-plan-tool-requirements | +| Select optimal OpenAI models for Microsoft Discovery agents | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-select-models-for-agents | + +### Architecture & Design Patterns +| Topic | URL | +|-------|-----| +| Use advanced investigation patterns with Discovery Engine | https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-advanced-investigation-patterns | + +### Limits & Quotas +| Topic | URL | +|-------|-----| +| Understand file handling and limitations in Discovery investigations | https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-files-storage-assets | +| Plan Azure quotas and reservations for Microsoft Discovery | https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-quota-reservation | +| Apply Microsoft Discovery resource naming rules | https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-resource-naming | + +### Security +| Topic | URL | +|-------|-----| +| Manage data encryption at rest in Microsoft Discovery | https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-data-encryption-at-rest | +| Use managed identities with Microsoft Discovery resources | https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-managed-identities | +| Configure network security for Microsoft Discovery workspaces | https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-network-security | +| 
Configure RBAC role assignments for Microsoft Discovery | https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-role-assignments | +| Configure user-assigned managed identities for Discovery | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-configure-managed-identity | +| Configure private endpoints and network security for Discovery | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-configure-network-security | +| Enable and export audit logs for Microsoft Discovery | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-enable-audit-logging | +| Configure customer-managed keys for Discovery resources | https://learn.microsoft.com/en-us/azure/microsoft-discovery/howto-data-encryption-at-rest | + +### Configuration +| Topic | URL | +|-------|-----| +| Configure Azure Container Registry for Microsoft Discovery tools | https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-azure-container-registry | +| Register Microsoft Discovery resource provider in Azure | https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-resource-provider-registration | +| Configure Azure Blob Storage for Microsoft Discovery data | https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-storage-account | +| Access Discovery resource application logs in Managed Resource Groups | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-access-resource-logs | +| Create prompt and workflow agents in Microsoft Discovery | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-agent-creation | +| Author tool definition YAML for Discovery tools | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-create-tool-definition | +| Configure data handling and storage assets for Discovery agents | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-data-handling-with-tools-agents | +| Configure Bookshelf resources and index Discovery knowledgebases | 
https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-index-bookshelf-knowledgebase | +| Configure storage containers and assets in Discovery workspaces | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-manage-storage-containers | +| Create and manage Microsoft Discovery supercomputers and nodepools | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-manage-supercomputers | +| Configure and manage Microsoft Discovery workspaces | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-manage-workspaces | +| Query bookshelf indexing logs from Discovery supercomputers | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-query-bookshelf-indexing-logs | +| Query bookshelf knowledgebase query logs in Discovery | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-query-bookshelf-logs | +| Query CogLoop orchestration logs for Discovery investigations | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-query-cognitive-loop-logs | +| Query supercomputer platform and tool logs in Discovery | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-query-supercomputer-logs | +| Query Discovery workspace logs with Kusto and correlation IDs | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-query-workspace-logs | +| View and filter Azure activity logs for Discovery resources | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-view-activity-logs | + +### Integrations & Coding Patterns +| Topic | URL | +|-------|-----| +| Integrate tools and models into Microsoft Discovery workflows | https://learn.microsoft.com/en-us/azure/microsoft-discovery/concept-tools-model-integration | +| Manage Microsoft Discovery supercomputers via REST API | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-manage-supercomputers-rest-api | +| Implement action scripts for Discovery action-based tools | 
https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-write-tool-action-scripts | + +### Deployment +| Topic | URL | +|-------|-----| +| Deploy a fully network-hardened Microsoft Discovery stack | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-deploy-network-hardened-stack | +| Publish Discovery tool container images to Azure Container Registry | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-publish-tool-to-acr | +| Recreate Microsoft Discovery resources on v2 infrastructure | https://learn.microsoft.com/en-us/azure/microsoft-discovery/how-to-recreate-v2-resources | +| Deploy Microsoft Discovery infrastructure using Bicep | https://learn.microsoft.com/en-us/azure/microsoft-discovery/quickstart-infrastructure-bicep | \ No newline at end of file diff --git a/skills/azure-migrate/SKILL.md b/skills/azure-migrate/SKILL.md index b3004551..e75d9d14 100644 --- a/skills/azure-migrate/SKILL.md +++ b/skills/azure-migrate/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-migrate -description: Expert knowledge for Azure Migrate development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when using AppCAT/CAST/Copilot, Site Recovery APIs, Azure Migrate appliances, Arc discovery, or Resource Mover, and other Azure Migrate related development tasks. Not for Azure Database Migration service (use azure-database-migration), Azure Site Recovery (use azure-site-recovery), Azure Virtual Machines (use azure-virtual-machines), SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines). +description: Expert knowledge for Azure Migrate development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. 
Use when using AppCAT/CAST, Site Recovery REST APIs, Azure Migrate appliance, Arc-based discovery, or Resource Mover, and other Azure Migrate related development tasks. Not for Azure Database Migration service (use azure-database-migration), Azure Site Recovery (use azure-site-recovery), Azure Virtual Machines (use azure-virtual-machines), SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Migrate Skill @@ -30,7 +30,7 @@ This skill requires **network access** to fetch documentation content: | Architecture & Design Patterns | L106-L111 | Architecture and data flow for Azure Migrate using Arc-based discovery and Hyper-V, including components, connectivity, prerequisites, and how discovery/inventory works. | | Limits & Quotas | L112-L125 | Azure Migrate region support, appliance prerequisites/capacity, and support matrices/limits for VMware, Hyper-V, and physical server discovery and migration. | | Security | L126-L141 | Securing Azure Migrate: least-privilege roles/accounts, Private Link discovery/migration, encrypted VM moves, Trusted Launch, RBAC, and Entra ID app registration. | -| Configuration | L142-L170 | Configuring Azure Migrate projects, appliances, assessments, dependencies, Arc/agents, networking, and Resource Mover settings for server, SQL, .NET/Java, and PostgreSQL migrations | +| Configuration | L142-L170 | Configuring Azure Migrate appliances, assessments, dependencies, Arc/agents, networking (Private Link), and Resource Mover settings for discovering, assessing, and preparing servers/DBs for migration. 
| | Integrations & Coding Patterns | L171-L178 | Code-level integration patterns: using AppCAT CLI, CAST Highlight, GitHub Copilot insights, and Site Recovery REST APIs to assess and automate VMware-to-Azure app migrations. | | Deployment | L179-L186 | Planning and executing migration waves, setting up Azure DevOps pipelines for containers, and checking support/matrix for cross-region moves of VMs and Azure SQL with Resource Mover | @@ -159,7 +159,7 @@ This skill requires **network access** to fetch documentation content: | Enable extra data collection on Arc-enabled servers | https://learn.microsoft.com/en-us/azure/migrate/how-to-enable-additional-data-collection-for-arc-servers?view=migrate | | Manage Arc resource synchronization in Azure Migrate projects | https://learn.microsoft.com/en-us/azure/migrate/how-to-manage-arc-resource-sync?view=migrate | | Configure Azure Migrate appliance for physical servers | https://learn.microsoft.com/en-us/azure/migrate/how-to-set-up-appliance-physical?view=migrate | -| Use Azure Migrate over Private Link with private endpoints | https://learn.microsoft.com/en-us/azure/migrate/how-to-use-azure-migrate-with-private-endpoints?view=migrate | +| Configure Azure Migrate with Private Endpoints and Private Link | https://learn.microsoft.com/en-us/azure/migrate/how-to-use-azure-migrate-with-private-endpoints?view=migrate | | Reference for Azure Migrate Collector VM extension settings | https://learn.microsoft.com/en-us/azure/migrate/migrate-virtual-machine-extension-reference?view=migrate | | Use built-in Azure Policy definitions for Azure Migrate | https://learn.microsoft.com/en-us/azure/migrate/policy-reference?view=migrate | | Configure PostgreSQL assessment properties in Azure Migrate | https://learn.microsoft.com/en-us/azure/migrate/postgresql-assessment-properties?view=migrate | diff --git a/skills/azure-monitor/SKILL.md b/skills/azure-monitor/SKILL.md index f0465ac4..fb5dd9c9 100644 --- a/skills/azure-monitor/SKILL.md +++ 
b/skills/azure-monitor/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-monitor -description: Expert knowledge for Azure Monitor development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring Log Analytics, Application Insights, DCRs/agents, Prometheus/Grafana, or Azure Monitor alerts, and other Azure Monitor related development tasks. Not for Azure Network Watcher (use azure-network-watcher), Azure Service Health (use azure-service-health), Azure Defender For Cloud (use azure-defender-for-cloud), Azure Security (use azure-security). +description: Expert knowledge for Azure Monitor development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring Log Analytics, Application Insights, DCRs/agents, alerts/autoscale, or Prometheus/Grafana monitoring, and other Azure Monitor related development tasks. Not for Azure AI Metrics Advisor (use azure-metrics-advisor), Azure Network Watcher (use azure-network-watcher), Azure Service Health (use azure-service-health), Azure Defender For Cloud (use azure-defender-for-cloud). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Monitor Skill @@ -24,13 +24,13 @@ This skill requires **network access** to fetch documentation content: | Category | Location | Description | |----------|----------|-------------| -| Troubleshooting | L37-L86 | Diagnosing and fixing Azure Monitor issues: agent/extension health, data collection and ingestion failures, alerts/ITSM, Application Insights, containers/Prometheus, workbooks, and VM performance. 
| -| Best Practices | L87-L123 | Best practices for configuring, scaling, querying, and optimizing Azure Monitor (logs, metrics, alerts, autoscale, AKS/VMs, Prometheus, multicloud) for performance, reliability, and cost. | -| Decision Making | L124-L161 | Guidance for choosing Azure Monitor options and planning migrations (agents, alerts, metrics, logs, SCOM, Prometheus, Splunk), plus cost, billing, and visualization decisions. | -| Architecture & Design Patterns | L162-L169 | Designing Azure Monitor architectures: enterprise-wide layouts, Private Link network patterns, choosing single vs multiple workspaces, and using workspace replication for resilience. | -| Limits & Quotas | L170-L237 | Limits, performance, and scaling behavior for Azure Monitor logs, metrics, agents, autoscale, Prometheus, Container Insights, Workbooks, and per‑resource metric definitions. | -| Security | L238-L293 | Securing Azure Monitor: auth (Entra, RBAC, keys), network (NSP, firewalls, Private Link, TLS), ITSM/webhooks, Container/Prometheus/Grafana access, and security/audit log schemas and analysis. | -| Configuration | [configuration.md](configuration.md) | Configuring Azure Monitor end to end: agents, DCRs, pipelines, networking, alerts, autoscale, Application Insights, Kubernetes/Prometheus, Private Link, logs/metrics schemas, and resource‑specific metrics/logs. | +| Troubleshooting | L37-L87 | Diagnosing and fixing Azure Monitor data collection, agent, alert, ingestion, and workbook issues across VMs, containers, pipelines, Application Insights, ITSM, and Copilot observability. | +| Best Practices | L88-L125 | Best practices for Azure Monitor alerts, autoscale, AKS/VM monitoring, logs/metrics performance and cost, telemetry, Prometheus/PromQL, multicloud, and operational excellence. 
| +| Decision Making | L126-L164 | Guidance for choosing Azure Monitor features, costs, and tools, and step‑by‑step migrations from legacy agents, alerts, APIs, SCOM, Splunk, Prometheus, and VM monitoring to modern Azure Monitor options. | +| Architecture & Design Patterns | L165-L172 | Designing Azure Monitor architectures: enterprise-wide layouts, Private Link network patterns, choosing single vs multiple workspaces, and using workspace replication for resilience. | +| Limits & Quotas | L173-L241 | Limits, quotas, performance, and scaling behavior for Azure Monitor logs, metrics, agents, autoscale, Prometheus, Container Insights, Workbooks, and resource‑specific metric coverage. | +| Security | L242-L297 | Securing Azure Monitor and related services: auth (Entra, RBAC, keys), network/TLS/NSP/Private Link, secure webhooks, ITSM, workbooks, and detailed security/audit log schemas and analysis. | +| Configuration | [configuration.md](configuration.md) | Configuring Azure Monitor end to end: agents, DCRs, pipelines, alerts, autoscale, workbooks, Private Link, pricing, and detailed schemas for logs, metrics, and tables across Azure and partner services. 
| | Integrations & Coding Patterns | [integrations.md](integrations.md) | Integrating Azure Monitor with apps, alerts, ITSM, Prometheus/Grafana, REST/CLI, and using KQL patterns to query, export, and analyze logs/metrics across many Azure and third‑party services | | Deployment | [deployment.md](deployment.md) | Deploying and migrating Azure Monitor agents/resources at scale (VMs, Arc, diagnostics, alerts, Profiler, workspaces, Grafana) using Policy, ARM, CLI, and PowerShell | @@ -61,9 +61,10 @@ This skill requires **network access** to fetch documentation content: | Troubleshoot telemetry issues using Application Insights SDK stats | https://learn.microsoft.com/en-us/azure/azure-monitor/app/sdk-stats | | Troubleshoot Azure Monitor autoscale behavior and actions | https://learn.microsoft.com/en-us/azure/azure-monitor/autoscale/autoscale-troubleshoot | | Use Live Data in Container insights for real-time AKS troubleshooting | https://learn.microsoft.com/en-us/azure/azure-monitor/containers/container-insights-livedata-overview | -| Troubleshoot Container Insights container log collection issues | https://learn.microsoft.com/en-us/azure/azure-monitor/containers/container-insights-troubleshoot | +| Diagnose and fix Azure Monitor container log collection | https://learn.microsoft.com/en-us/azure/azure-monitor/containers/container-insights-troubleshoot | | Troubleshoot Prometheus metrics collection in Azure Monitor | https://learn.microsoft.com/en-us/azure/azure-monitor/containers/prometheus-metrics-troubleshoot | | Monitor and troubleshoot DCR-based data collection in Azure Monitor | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-monitor | +| Troubleshoot Azure Monitor pipeline deployment and data issues | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-troubleshoot | | Resolve Azure Monitor Log Analytics API error codes | https://learn.microsoft.com/en-us/azure/azure-monitor/logs/api/errors | | 
Troubleshoot stopped data collection in Azure Monitor Logs | https://learn.microsoft.com/en-us/azure/azure-monitor/logs/data-collection-troubleshoot | | Diagnose and alert on Log Analytics workspace health | https://learn.microsoft.com/en-us/azure/azure-monitor/logs/log-analytics-workspace-health | @@ -99,6 +100,7 @@ This skill requires **network access** to fetch documentation content: | Design cost-effective alerting for AKS in Azure Monitor | https://learn.microsoft.com/en-us/azure/azure-monitor/containers/cost-effective-alerting | | Apply Azure Monitor best practices for Kubernetes layers | https://learn.microsoft.com/en-us/azure/azure-monitor/containers/monitor-kubernetes | | Apply best practices for Azure Monitor data collection rules | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-rule-best-practices | +| Apply sizing best practices for Azure Monitor pipeline throughput | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-sizing | | Optimize Azure Monitor costs with configuration | https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/best-practices-cost | | Implement multicloud monitoring for AWS and GCP with Azure Monitor | https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/best-practices-multicloud | | Configure Azure Monitor for operational excellence | https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/best-practices-operation | @@ -134,6 +136,7 @@ This skill requires **network access** to fetch documentation content: | Transition from Container Monitoring Solution to Container Insights | https://learn.microsoft.com/en-us/azure/azure-monitor/containers/container-insights-transition-solution | | Choose Azure Monitor OTLP ingestion option for AKS and VMs | https://learn.microsoft.com/en-us/azure/azure-monitor/containers/opentelemetry-summary | | Choose between Azure Monitor metrics export and data plane API | 
https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-plane-versus-metrics-export | +| Decide when and how to use Azure Monitor pipeline | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-overview | | Decide how to migrate SCOM monitoring to Azure Monitor | https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/azure-monitor-operations-manager | | Estimate Azure Monitor costs with pricing calculator | https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/cost-estimate | | Map Azure Monitor charges to billing meter names | https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/cost-meters | @@ -143,7 +146,7 @@ This skill requires **network access** to fetch documentation content: | Plan and optimize Azure Monitor Logs costs and pricing options | https://learn.microsoft.com/en-us/azure/azure-monitor/logs/cost-logs | | Use Auxiliary table plan for low-cost Azure Monitor log retention | https://learn.microsoft.com/en-us/azure/azure-monitor/logs/create-custom-table-auxiliary | | Migrate from HTTP Data Collector API to Logs Ingestion API | https://learn.microsoft.com/en-us/azure/azure-monitor/logs/custom-logs-migrate | -| Plan and use Azure Monitor Logs dedicated clusters | https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-dedicated-clusters | +| Decide and configure Azure Monitor dedicated clusters | https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-dedicated-clusters | | Choose Azure Monitor Logs table plans by usage | https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-table-plans | | Plan migration from Splunk to Azure Monitor Logs | https://learn.microsoft.com/en-us/azure/azure-monitor/logs/migrate-splunk-to-azure-monitor-logs | | Plan migration from self-hosted Prometheus to Azure Monitor managed Prometheus | https://learn.microsoft.com/en-us/azure/azure-monitor/metrics/prometheus-migrate | @@ -182,6 +185,7 @@ This skill requires **network 
access** to fetch documentation content: | Use region mappings for Container Insights and Log Analytics | https://learn.microsoft.com/en-us/azure/azure-monitor/containers/container-insights-region-mapping | | Configure autoscaling limits for Azure Managed Prometheus addon pods | https://learn.microsoft.com/en-us/azure/azure-monitor/containers/prometheus-metrics-scrape-autoscaling | | Plan Prometheus scraping performance and scale in Azure Monitor | https://learn.microsoft.com/en-us/azure/azure-monitor/containers/prometheus-metrics-scrape-scale | +| Review Azure Monitor pipeline extension versions and changes | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-extension-versions | | Azure Monitor platform limits and quotas reference | https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/service-limits | | Understand caching behavior in Logs Query API | https://learn.microsoft.com/en-us/azure/azure-monitor/logs/api/cache | | Run cross-workspace queries via Logs API | https://learn.microsoft.com/en-us/azure/azure-monitor/logs/api/cross-workspace-queries | @@ -245,8 +249,8 @@ This skill requires **network access** to fetch documentation content: | Configure IP address handling in Application Insights | https://learn.microsoft.com/en-us/azure/azure-monitor/app/ip-collection | | Migrate Container Insights from legacy to managed identity authentication | https://learn.microsoft.com/en-us/azure/azure-monitor/containers/container-insights-authentication | | Configure secure access to Live Data in Container insights | https://learn.microsoft.com/en-us/azure/azure-monitor/containers/container-insights-livedata-setup | -| Configure TLS and mTLS for Azure Monitor pipeline | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-tls | -| Use automated TLS certificate management for Azure Monitor pipeline | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-tls-automated | +| Choose TLS 
and mTLS options for Azure Monitor pipeline | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-tls | +| Configure automated TLS certificate management for Azure Monitor pipeline | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-tls-automated | | Configure customer-managed TLS certificates for Azure Monitor pipeline | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-tls-custom | | Configure network and firewall access to Azure Monitor | https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/azure-monitor-network-access | | Securely configure and deploy Azure Monitor | https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/best-practices-security | diff --git a/skills/azure-monitor/configuration.md b/skills/azure-monitor/configuration.md index 9f034a73..bb25d86a 100644 --- a/skills/azure-monitor/configuration.md +++ b/skills/azure-monitor/configuration.md @@ -17,7 +17,7 @@ | Configure Azure Diagnostics extension to stream data to Event Hubs | https://learn.microsoft.com/en-us/azure/azure-monitor/agents/diagnostics-extension-stream-event-hubs | | Azure Diagnostics extension configuration schema version history | https://learn.microsoft.com/en-us/azure/azure-monitor/agents/diagnostics-extension-versions | | Install and configure Azure Diagnostics extension for Windows VMs | https://learn.microsoft.com/en-us/azure/azure-monitor/agents/diagnostics-extension-windows-install | -| Configure Log Analytics gateway for Azure Monitor connectivity | https://learn.microsoft.com/en-us/azure/azure-monitor/agents/gateway | +| Configure Log Analytics gateway for Azure Monitor | https://learn.microsoft.com/en-us/azure/azure-monitor/agents/gateway | | Enable and use the Azure Monitor common alert schema | https://learn.microsoft.com/en-us/azure/azure-monitor/alerts/alerts-common-schema | | Create activity log and health alert rules in Azure Monitor | 
https://learn.microsoft.com/en-us/azure/azure-monitor/alerts/alerts-create-activity-log-alert-rule | | Configure Azure Monitor log search alert rules | https://learn.microsoft.com/en-us/azure/azure-monitor/alerts/alerts-create-log-alert-rule | @@ -91,15 +91,14 @@ | Understand Azure Monitor data collection rule JSON schema | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-rule-structure | | View and inspect data collection rule definitions in Azure Monitor | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-rule-view | | Configure Azure Monitor data transformations in DCRs | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-transformations | -| Create and test Azure Monitor transformation queries | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/data-collection-transformations-create | | Configure DCR-based metrics export in Azure Monitor | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/metrics-export-create | | Author DCR JSON for Azure Monitor metrics export | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/metrics-export-structure | -| Configure Azure Monitor pipeline in your environment | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure | -| Configure Azure Monitor pipeline using CLI or ARM templates | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure-cli | -| Configure Azure Monitor pipeline client connections | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure-clients | -| Configure Azure Monitor pipeline using Azure portal | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure-portal | -| Expose Azure Monitor pipeline via Traefik gateway | 
https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-kubernetes-gateway | -| Configure pod placement for Azure Monitor pipeline | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-pod-placement | +| Prepare and configure clusters for Azure Monitor pipeline | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure | +| Configure Azure Monitor pipeline with CLI and ARM | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure-cli | +| Configure Azure Monitor pipeline and dataflows in portal | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-configure-portal | +| Deploy and manage Kubernetes gateway for Azure Monitor pipeline | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-kubernetes-gateway | +| Configure pod placement for Azure Monitor pipeline on Kubernetes | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-pod-placement | +| Configure Azure Monitor pipeline log transformations | https://learn.microsoft.com/en-us/azure/azure-monitor/data-collection/pipeline-transformations | | Index of Azure Monitor REST API operation groups | https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/azure-monitor-rest-api-index | | Configure data sources and collection methods in Azure Monitor | https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/data-sources | | Reference monitoring data types for Azure Monitor itself | https://learn.microsoft.com/en-us/azure/azure-monitor/fundamentals/monitor-azure-monitor-reference | diff --git a/skills/azure-nat-gateway/SKILL.md b/skills/azure-nat-gateway/SKILL.md index ba2d9053..b7f98df6 100644 --- a/skills/azure-nat-gateway/SKILL.md +++ b/skills/azure-nat-gateway/SKILL.md @@ -1,14 +1,14 @@ --- name: azure-nat-gateway -description: Expert knowledge for Azure NAT Gateway development including troubleshooting, 
best practices, decision making, architecture & design patterns, limits & quotas, configuration, and deployment. Use when managing SNAT ports, outbound IPs, flow logs, hub-spoke egress, or Azure Firewall integration, and other Azure NAT Gateway related development tasks. Not for Azure Virtual Network (use azure-virtual-network), Azure Virtual Network Manager (use azure-virtual-network-manager), Azure Virtual WAN (use azure-virtual-wan), Azure VPN Gateway (use azure-vpn-gateway). +description: Expert knowledge for Azure NAT Gateway development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, configuration, integrations & coding patterns, and deployment. Use when configuring SNAT ports, scaling outbound IPs, reading NAT flow logs, or deploying NAT Gateway via IaC, and other Azure NAT Gateway related development tasks. Not for Azure Virtual Network (use azure-virtual-network), Azure Load Balancer (use azure-load-balancer), Azure Virtual WAN (use azure-virtual-wan), Azure VPN Gateway (use azure-vpn-gateway). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure NAT Gateway Skill -This skill provides expert guidance for Azure NAT Gateway. Covers troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, configuration, and deployment. It combines local quick-reference content with remote documentation fetching capabilities. +This skill provides expert guidance for Azure NAT Gateway. Covers troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, configuration, integrations & coding patterns, and deployment. It combines local quick-reference content with remote documentation fetching capabilities. 
## How to Use This Skill @@ -24,13 +24,14 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| -| Troubleshooting | L35-L39 | Diagnosing and fixing NAT Gateway issues: reading flow logs, resolving misconfigurations, connectivity failures with Azure services, and outbound internet connection problems. | -| Best Practices | L40-L44 | Guidance on reducing SNAT port exhaustion and optimizing outbound connectivity patterns when using Azure NAT Gateway. | -| Decision Making | L45-L51 | Guidance on choosing NAT Gateway SKUs, migrating existing NAT Gateways to StandardV2, and moving outbound internet access from other methods to NAT Gateway. | -| Architecture & Design Patterns | L52-L60 | Design patterns for placing NAT Gateway in VNets, hub-spoke, with NVAs, and with internal/public load balancers, plus scaling outbound traffic and combining with Azure Firewall. | -| Limits & Quotas | L61-L65 | NAT Gateway limits, SNAT port quotas, connection scaling behavior, per-resource caps, and FAQs on throughput, IPs, and troubleshooting limit-related issues. | -| Configuration | L66-L74 | Configuring NAT Gateway (Standard and StandardV2), managing IPs/resources, setting up flow logs, and configuring monitoring, metrics, and alerts for gateway traffic. | -| Deployment | L75-L80 | How to deploy and redeploy NAT Gateway (ARM/Bicep), migrate or move outbound traffic from VMs/public IPs, and transition existing outbound access to Azure NAT Gateway. | +| Troubleshooting | L36-L40 | Diagnosing and fixing NAT Gateway issues: reading flow logs, resolving misconfigurations, connectivity failures with Azure services, and outbound internet connection problems. | +| Best Practices | L41-L45 | Guidance on reducing SNAT port exhaustion and optimizing outbound connectivity patterns when using Azure NAT Gateway. 
| +| Decision Making | L46-L52 | Guidance on choosing NAT Gateway SKUs, migrating existing NAT Gateways to StandardV2, and moving outbound internet access from other methods to NAT Gateway. | +| Architecture & Design Patterns | L53-L61 | Design patterns for placing NAT Gateway in VNets, hub-spoke, with NVAs, and with internal/public load balancers, plus scaling outbound traffic and combining with Azure Firewall. | +| Limits & Quotas | L62-L66 | NAT Gateway limits, SNAT port quotas, connection scaling behavior, per-resource caps, and FAQs on throughput, IPs, and troubleshooting limit-related issues. | +| Configuration | L67-L75 | Configuring NAT Gateway (Standard and StandardV2), managing IPs/resources, setting up flow logs, and configuring monitoring, metrics, and alerts for gateway traffic. | +| Integrations & Coding Patterns | L76-L80 | Guidance and templates for deploying Azure NAT Gateway V2 using infrastructure-as-code (ARM/Bicep/Terraform), including configuration patterns and automation. | +| Deployment | L81-L85 | How to deploy and redeploy NAT Gateway (ARM/Bicep), migrate or move outbound traffic from VMs/public IPs, and transition existing outbound access to Azure NAT Gateway. 
| ### Troubleshooting | Topic | URL | @@ -72,9 +73,13 @@ This skill requires **network access** to fetch documentation content: | Configure Azure NAT Gateway resource components | https://learn.microsoft.com/en-us/azure/nat-gateway/nat-gateway-resource | | Configure metrics and alerts for Azure NAT Gateway | https://learn.microsoft.com/en-us/azure/nat-gateway/nat-metrics | +### Integrations & Coding Patterns +| Topic | URL | +|-------|-----| +| Deploy Azure NAT Gateway V2 with IaC templates | https://learn.microsoft.com/en-us/azure/nat-gateway/quickstart-create-nat-gateway-v2-templates | + ### Deployment | Topic | URL | |-------|-----| -| Deploy Standard V2 NAT Gateway with ARM/Bicep | https://learn.microsoft.com/en-us/azure/nat-gateway/quickstart-create-nat-gateway-v2-templates | | Redeploy NAT Gateway after cross-region resource move | https://learn.microsoft.com/en-us/azure/nat-gateway/region-move-nat-gateway | | Move VM public IP outbound traffic to NAT Gateway | https://learn.microsoft.com/en-us/azure/nat-gateway/tutorial-migrate-ilip-nat | \ No newline at end of file diff --git a/skills/azure-netapp-files/SKILL.md b/skills/azure-netapp-files/SKILL.md index 3db1d975..7a09912b 100644 --- a/skills/azure-netapp-files/SKILL.md +++ b/skills/azure-netapp-files/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-netapp-files -description: Expert knowledge for Azure NetApp Files development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when deploying ANF for SAP HANA/Oracle, AzAcSnap, ZRS, AVS, S3/OneLake object REST API, or Databricks, and other Azure NetApp Files related development tasks. Not for Azure Files (use azure-files), Azure Blob Storage (use azure-blob-storage), Azure Elastic SAN (use azure-elastic-san), Azure Managed Lustre (use azure-managed-lustre). 
+description: Expert knowledge for Azure NetApp Files development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when deploying SAP HANA/Oracle on ANF, using AzAcSnap, REST/PowerShell APIs, S3/object access, or AVS/Databricks, and other Azure NetApp Files related development tasks. Not for Azure Files (use azure-files), Azure Blob Storage (use azure-blob-storage), Azure Elastic SAN (use azure-elastic-san), Azure Managed Lustre (use azure-managed-lustre). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure NetApp Files Skill @@ -29,8 +29,8 @@ This skill requires **network access** to fetch documentation content: | Decision Making | L74-L89 | Guidance on sizing, performance tiers, volume types, data protection/backup/replication, SMB CA, cool access trade-offs, and cost optimization/TCO for Azure NetApp Files workloads | | Architecture & Design Patterns | L90-L101 | Architectural guidance and patterns for designing, deploying, and optimizing Azure NetApp Files workloads (SAP, Oracle, AVS, hybrid/cache, ZRS, AD integration, and network/topology). | | Limits & Quotas | L102-L126 | Limits, quotas, and performance caps for Azure NetApp Files: volume size/perf limits, user/group/inode quotas, file/path/charset constraints, regional capacity, and workload benchmarks. | -| Security | L127-L162 | Security configuration for Azure NetApp Files: encryption (CMK/HSM, cross-tenant), Kerberos/LDAP/AD, NFS/SMB permissions and ACLs, ransomware protection, and secure control/data plane and API. 
| -| Configuration | L163-L196 | Configuring Azure NetApp Files accounts, pools, volumes, networking, security, quotas, logging, backups, replication, and AzAcSnap for SAP/Oracle and enterprise workloads | +| Security | L127-L163 | Security, encryption, and access control for Azure NetApp Files: keys/CMK, Kerberos, LDAP/AD, ACLs/permissions (NFS/SMB/S3), ransomware protection, and control/data plane hardening. | +| Configuration | L164-L196 | Configuring Azure NetApp Files accounts, pools, volumes, networking, security, quotas, logging, backups, replication, and AzAcSnap for SAP/Oracle and enterprise workloads | | Integrations & Coding Patterns | L197-L211 | Using azacsnap with Azure NetApp Files, REST API and PowerShell operations, and integrating ANF with SAP HANA/Oracle AVGs, S3 clients, Databricks, and OneLake via object REST API. | | Deployment | L212-L224 | Deploying and configuring Azure NetApp Files for SAP HANA and Oracle (AVGs, HSR, DR, backups), migrating ONTAP data, managing AzAcSnap, zones, and requesting regional access | @@ -130,12 +130,12 @@ This skill requires **network access** to fetch documentation content: | Use Azure Policy definitions with Azure NetApp Files | https://learn.microsoft.com/en-us/azure/azure-netapp-files/azure-policy-definitions | | Configure NFSv4.1 access control lists in Azure NetApp Files | https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-access-control-lists | | Configure customer-managed keys for Azure NetApp Files encryption | https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-customer-managed-keys | -| Use HSM-backed customer-managed keys for Azure NetApp Files | https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-customer-managed-keys-hardware | +| Set up HSM-backed customer-managed keys for Azure NetApp Files | https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-customer-managed-keys-hardware | | Configure NFSv4.1 Kerberos encryption for Azure 
NetApp Files | https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-kerberos-encryption | | Configure AD DS LDAP over TLS for NetApp Files | https://learn.microsoft.com/en-us/azure/azure-netapp-files/configure-ldap-over-tls | | Configure Azure NetApp Files control plane security | https://learn.microsoft.com/en-us/azure/azure-netapp-files/control-plane-security | | Create and manage AD connections for NetApp Files | https://learn.microsoft.com/en-us/azure/azure-netapp-files/create-active-directory-connections | -| Configure cross-tenant CMK encryption for Azure NetApp Files | https://learn.microsoft.com/en-us/azure/azure-netapp-files/customer-managed-keys-cross-tenant | +| Configure cross-tenant customer-managed keys for Azure NetApp Files | https://learn.microsoft.com/en-us/azure/azure-netapp-files/customer-managed-keys-cross-tenant | | Configure Azure NetApp Files data plane security | https://learn.microsoft.com/en-us/azure/azure-netapp-files/data-plane-security | | Disable NFS showmount for Azure NetApp Files security | https://learn.microsoft.com/en-us/azure/azure-netapp-files/disable-showmount | | Choose dual-protocol security styles and permissions in Azure NetApp Files | https://learn.microsoft.com/en-us/azure/azure-netapp-files/dual-protocol-permission-behaviors | @@ -155,9 +155,10 @@ This skill requires **network access** to fetch documentation content: | Configure NAS share permissions for Azure NetApp Files | https://learn.microsoft.com/en-us/azure/azure-netapp-files/network-attached-storage-permissions | | Manage NFS group memberships and supplemental groups for Azure NetApp Files | https://learn.microsoft.com/en-us/azure/azure-netapp-files/network-file-system-group-memberships | | Use NFSv4.x ACLs for access control in Azure NetApp Files | https://learn.microsoft.com/en-us/azure/azure-netapp-files/nfs-access-control-lists | -| Configure Azure NetApp Files object REST API securely | 
https://learn.microsoft.com/en-us/azure/azure-netapp-files/object-rest-api-access-configure | +| Configure S3 object REST API access for Azure NetApp Files | https://learn.microsoft.com/en-us/azure/azure-netapp-files/object-rest-api-access-configure | | Understand Kerberos security and performance impact on Azure NetApp Files NFSv4.1 | https://learn.microsoft.com/en-us/azure/azure-netapp-files/performance-impact-kerberos | | Configure advanced ransomware protection for Azure NetApp Files | https://learn.microsoft.com/en-us/azure/azure-netapp-files/ransomware-configure | +| Meet requirements for Azure NetApp Files ransomware protection | https://learn.microsoft.com/en-us/azure/azure-netapp-files/ransomware-protection-requirements | | Configure data encryption at rest and in transit for Azure NetApp Files | https://learn.microsoft.com/en-us/azure/azure-netapp-files/understand-data-encryption | ### Configuration @@ -191,7 +192,6 @@ This skill requires **network access** to fetch documentation content: | Create SMB volume on Elastic zone-redundant storage | https://learn.microsoft.com/en-us/azure/azure-netapp-files/elastic-volume-server-message-block | | Configure file access logging for Azure NetApp Files volumes | https://learn.microsoft.com/en-us/azure/azure-netapp-files/manage-file-access-logs | | Manage manual QoS capacity pools in NetApp Files | https://learn.microsoft.com/en-us/azure/azure-netapp-files/manage-manual-qos-capacity-pool | -| Satisfy requirements for ANF ransomware protection | https://learn.microsoft.com/en-us/azure/azure-netapp-files/ransomware-protection-requirements | | Configure DFS Namespaces and root consolidation with Azure NetApp Files | https://learn.microsoft.com/en-us/azure/azure-netapp-files/use-dfs-n-and-dfs-root-consolidation-with-azure-netapp-files | ### Integrations & Coding Patterns diff --git a/skills/azure-network-watcher/SKILL.md b/skills/azure-network-watcher/SKILL.md index ab324ac6..8171a6f5 100644 --- 
a/skills/azure-network-watcher/SKILL.md +++ b/skills/azure-network-watcher/SKILL.md @@ -3,7 +3,7 @@ name: azure-network-watcher description: Expert knowledge for Azure Network Watcher development including troubleshooting, decision making, limits & quotas, security, configuration, and integrations & coding patterns. Use when configuring Connection Monitor, NSG/VNet flow logs, packet capture, Traffic Analytics, or Sentinel integrations, and other Azure Network Watcher related development tasks. Not for Azure Monitor (use azure-monitor), Azure Networking (use azure-networking), Azure Virtual Network (use azure-virtual-network), Azure Virtual Network Manager (use azure-virtual-network-manager). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-12" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Network Watcher Skill @@ -29,7 +29,7 @@ This skill requires **network access** to fetch documentation content: | Limits & Quotas | L51-L55 | How to configure and run Network Watcher packet capture, storage and filtering options, and the key limits/quotas (size, duration, concurrency) that apply to captures | | Security | L56-L62 | Configuring secure access to Network Watcher with RBAC, using Traffic Analytics for Zero Trust segmentation, and protecting VNet flow logs with managed identities. | | Configuration | L63-L78 | Configuring and governing Network Watcher logging: AMA for Connection Monitor, NSG/VNet flow logs setup, schemas, filtering, templates (Bicep/ARM), and Azure Policy enforcement. | -| Integrations & Coding Patterns | L79-L85 | Using Network Watcher data in tools and code: parsing NSG flow logs with PowerShell, visualizing in Power BI, triggering packet capture from Functions, querying Traffic Analytics with KQL, and integrating with Sentinel. 
| +| Integrations & Coding Patterns | L79-L84 | Using Network Watcher data in tools and code: parsing NSG flow logs with PowerShell, visualizing in Power BI, triggering packet capture from Functions, querying Traffic Analytics with KQL, and integrating with Sentinel. | ### Troubleshooting | Topic | URL | @@ -81,5 +81,4 @@ This skill requires **network access** to fetch documentation content: |-------|-----| | Parse and read Azure flow logs with PowerShell | https://learn.microsoft.com/en-us/azure/network-watcher/flow-logs-read | | Trigger Network Watcher packet captures from Azure Functions alerts | https://learn.microsoft.com/en-us/azure/network-watcher/packet-capture-alert-triggered | -| Analyze Traffic Analytics data with KQL queries | https://learn.microsoft.com/en-us/azure/network-watcher/traffic-analytics-queries | -| Integrate Azure Traffic Analytics with Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/network-watcher/traffic-analytics-sentinel | \ No newline at end of file +| Analyze Traffic Analytics data with KQL queries | https://learn.microsoft.com/en-us/azure/network-watcher/traffic-analytics-queries | \ No newline at end of file diff --git a/skills/azure-networking/SKILL.md b/skills/azure-networking/SKILL.md index c60811c2..a954c871 100644 --- a/skills/azure-networking/SKILL.md +++ b/skills/azure-networking/SKILL.md @@ -1,14 +1,14 @@ --- name: azure-networking -description: Expert knowledge for Azure Networking development including troubleshooting, best practices, decision making, architecture & design patterns, security, and integrations & coding patterns. Use when designing hub-spoke VNets, Azure Firewall/NSG rules, WAF (App GW/Front Door), Accelerated Networking, or Resource Graph queries, and other Azure Networking related development tasks. 
Not for Azure Virtual Network (use azure-virtual-network), Azure Virtual Network Manager (use azure-virtual-network-manager), Azure Virtual WAN (use azure-virtual-wan), Azure Network Watcher (use azure-network-watcher). +description: Expert knowledge for Azure Networking development including troubleshooting, decision making, architecture & design patterns, security, and integrations & coding patterns. Use when designing hub-spoke VNets, Azure Firewall/NSG rules, App Gateway/Front Door WAF, or Resource Graph queries, and other Azure Networking related development tasks. Not for Azure Virtual Network (use azure-virtual-network), Azure Virtual Network Manager (use azure-virtual-network-manager), Azure Virtual WAN (use azure-virtual-wan), Azure Network Watcher (use azure-network-watcher). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-05" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Networking Skill -This skill provides expert guidance for Azure Networking. Covers troubleshooting, best practices, decision making, architecture & design patterns, security, and integrations & coding patterns. It combines local quick-reference content with remote documentation fetching capabilities. +This skill provides expert guidance for Azure Networking. Covers troubleshooting, decision making, architecture & design patterns, security, and integrations & coding patterns. It combines local quick-reference content with remote documentation fetching capabilities. ## How to Use This Skill @@ -24,51 +24,33 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| -| Troubleshooting | L34-L38 | Diagnosing and resolving Microsoft.Network resource provisioning failures in Azure, including common error patterns, causes, and step-by-step remediation guidance. 
| -| Best Practices | L39-L43 | Guidance on boosting Azure NVA and VM network throughput/latency using Accelerated Connections, including configuration, tuning, and performance best practices. | -| Decision Making | L44-L51 | Guidance on choosing Azure network architectures: using region latency data, selecting secure topologies and app delivery options, and planning networking for remote and hybrid work scenarios. | -| Architecture & Design Patterns | L52-L58 | Routing and traffic flow design in Azure: analyzing control vs data plane paths, and building secure hub-spoke network architectures for web apps. | -| Security | L59-L70 | Zero Trust security for Azure networking: policies, NSGs, Azure Firewall, DDoS, App Gateway/Front Door WAF hardening, and securing virtual networks for web apps. | -| Integrations & Coding Patterns | L71-L74 | Using Azure Resource Graph to query, filter, and analyze Azure networking resources at scale (e.g., VNets, NSGs, public IPs) for inventory, compliance, and reporting. | +| Troubleshooting | L33-L37 | Diagnosing and resolving Microsoft.Network resource provisioning failures in Azure, including common error patterns, causes, and step-by-step remediation guidance. | +| Decision Making | L38-L42 | Guidance on choosing Azure network architectures: using region latency data, selecting secure topologies and app delivery options, and planning networking for remote and hybrid work scenarios. | +| Architecture & Design Patterns | L43-L47 | Routing and traffic flow design in Azure: analyzing control vs data plane paths, and building secure hub-spoke network architectures for web apps. | +| Security | L48-L52 | Zero Trust security for Azure networking: policies, NSGs, Azure Firewall, DDoS, App Gateway/Front Door WAF hardening, and securing virtual networks for web apps. 
| +| Integrations & Coding Patterns | L53-L56 | Querying and analyzing Azure networking resources with Azure Resource Graph, including sample queries, filters, and best practices for large-scale inventory and compliance. | ### Troubleshooting | Topic | URL | |-------|-----| | Troubleshoot Microsoft.Network failed provisioning states | https://learn.microsoft.com/en-us/azure/networking/troubleshoot-failed-state | -### Best Practices -| Topic | URL | -|-------|-----| -| Optimize NVA and VM performance with Accelerated Connections | https://learn.microsoft.com/en-us/azure/networking/nva-accelerated-connections | - ### Decision Making | Topic | URL | |-------|-----| | Use Azure region latency stats for architecture planning | https://learn.microsoft.com/en-us/azure/networking/azure-network-latency | -| Choose secure Azure application delivery options | https://learn.microsoft.com/en-us/azure/networking/secure-application-delivery | -| Select a secure Azure network topology | https://learn.microsoft.com/en-us/azure/networking/secure-network-topology | -| Plan Azure networking for remote work scenarios | https://learn.microsoft.com/en-us/azure/networking/working-remotely-support | ### Architecture & Design Patterns | Topic | URL | |-------|-----| -| Analyze control plane routing interoperability in Azure | https://learn.microsoft.com/en-us/azure/networking/connectivity-interoperability-control-plane | -| Analyze data plane paths across Azure networks | https://learn.microsoft.com/en-us/azure/networking/connectivity-interoperability-data-plane | | Design a secure hub-spoke network for Azure web apps | https://learn.microsoft.com/en-us/azure/networking/cross-service-scenarios/design-secure-hub-spoke-network | ### Security | Topic | URL | |-------|-----| -| Deploy a Zero Trust virtual network for web apps | https://learn.microsoft.com/en-us/azure/networking/create-zero-trust-network-web-apps | -| Use built-in Azure Policy definitions for networking | 
https://learn.microsoft.com/en-us/azure/networking/policy-reference | | Apply Azure Policy compliance controls to networking | https://learn.microsoft.com/en-us/azure/networking/security-controls-policy | -| Secure Application Gateway WAF with Zero Trust guidance | https://learn.microsoft.com/en-us/azure/networking/security/zero-trust-application-gateway-waf | -| Harden Azure Firewall with Zero Trust settings | https://learn.microsoft.com/en-us/azure/networking/security/zero-trust-azure-firewall | -| Implement Zero Trust for Azure DDoS Protection | https://learn.microsoft.com/en-us/azure/networking/security/zero-trust-ddos-protection | -| Secure Front Door WAF at the edge with Zero Trust | https://learn.microsoft.com/en-us/azure/networking/security/zero-trust-front-door-waf | -| Apply Zero Trust recommendations to Azure network security | https://learn.microsoft.com/en-us/azure/networking/security/zero-trust-network-security | ### Integrations & Coding Patterns | Topic | URL | |-------|-----| -| Run Azure Resource Graph queries for networking resources | https://learn.microsoft.com/en-us/azure/networking/fundamentals/resource-graph-samples | \ No newline at end of file +| Use Azure Resource Graph queries for networking resources | https://learn.microsoft.com/en-us/azure/networking/resource-graph-samples | \ No newline at end of file diff --git a/skills/azure-operator-nexus/SKILL.md b/skills/azure-operator-nexus/SKILL.md index 721a76c7..0209d8cc 100644 --- a/skills/azure-operator-nexus/SKILL.md +++ b/skills/azure-operator-nexus/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-operator-nexus -description: Expert knowledge for Azure Operator Nexus development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. 
Use when configuring Nexus clusters/fabric, BGP/ACL/QoS, NPB TAP rules, near-edge storage, or Nexus AKS ETCD ops, and other Azure Operator Nexus related development tasks. Not for Azure Network Function Manager (use azure-network-function-manager), Azure Networking (use azure-networking), Azure Virtual Network Manager (use azure-virtual-network-manager), Azure Virtual WAN (use azure-virtual-wan). +description: Expert knowledge for Azure Operator Nexus development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when managing Nexus clusters, fabric upgrades, NPB TAP rules, near-edge storage, or bare-metal lifecycle ops, and other Azure Operator Nexus related development tasks. Not for Azure Network Function Manager (use azure-network-function-manager), Azure Networking (use azure-networking), Azure Virtual Network Manager (use azure-virtual-network-manager), Azure Virtual WAN (use azure-virtual-wan). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Operator Nexus Skill @@ -29,10 +29,10 @@ This skill requires **network access** to fetch documentation content: | Decision Making | L89-L96 | Guidance on choosing Nexus SKUs, VM sizes, storage software versions, and planning where to place Nexus Kubernetes resources in your deployment. | | Architecture & Design Patterns | L97-L101 | Designing near-edge storage for Azure Operator Nexus: architecture choices, data locality, performance, capacity planning, redundancy, and integration with Nexus network/compute. 
| | Limits & Quotas | L102-L114 | Limits, capacity planning, supported versions, and operational behaviors for Nexus clusters: quotas, storage sizing, node restarts, isolation domain requirements, upgrades, and log retention. | -| Security | L115-L153 | Securing Nexus: identity/RBAC, ACLs, SSH/serial access, break-glass, key/secret rotation, managed identities, Defender/Policy, TAP/ExpressRoute auth, and secure VM/cluster access. | -| Configuration | L154-L216 | Configuring and updating Nexus clusters and network fabric: JSON templates, isolation domains, BGP/route policies, ACLs, QoS, maintenance, monitoring, and Kubernetes/node settings. | -| Integrations & Coding Patterns | L217-L221 | Configuring Network Packet Broker TAP rules in Azure Operator Nexus, including rule creation, traffic mirroring, filtering, and integration patterns for network observability. | -| Deployment | L222-L230 | Guides for upgrading and replacing Nexus fabric devices/terminal servers, running pre-validation and template-based fabric upgrades, and building/maintaining VM images for Operator Nexus. | +| Security | L115-L154 | Securing Nexus: identity/RBAC, ACLs, SSH and break-glass access, key/cert/secret rotation, managed identities, Defender/Policy, and secure VM/cluster/network management. | +| Configuration | L155-L218 | Configuring and operating Azure Operator Nexus: cluster templates/parameters, Kubernetes/node settings, network fabric (BGP, VRFs, QoS, ACLs, route policies), isolation domains, maintenance, and monitoring. | +| Integrations & Coding Patterns | L219-L223 | Configuring Network Packet Broker TAP rules in Azure Operator Nexus, including rule creation, traffic mirroring, filtering, and integration patterns for network observability. | +| Deployment | L224-L232 | Guides for upgrading and replacing Nexus fabric devices/terminal servers, running pre-validation and template-based fabric upgrades, and building/maintaining VM images for Operator Nexus. 
| ### Troubleshooting | Topic | URL | @@ -122,6 +122,7 @@ This skill requires **network access** to fetch documentation content: | Apply ACLs to Azure Operator Nexus resources | https://learn.microsoft.com/en-us/azure/operator-nexus/howto-apply-access-control-list | | Manage emergency SSH access to BMCs with bmckeyset in Operator Nexus | https://learn.microsoft.com/en-us/azure/operator-nexus/howto-baremetal-bmc-ssh | | Manage emergency SSH access to bare metal machines with baremetalmachinekeyset | https://learn.microsoft.com/en-us/azure/operator-nexus/howto-baremetal-bmm-ssh | +| Rotate and resync certificates in Azure Operator Nexus | https://learn.microsoft.com/en-us/azure/operator-nexus/howto-certificate-rotation | | Use managed identities and user resources in Operator Nexus clusters | https://learn.microsoft.com/en-us/azure/operator-nexus/howto-cluster-managed-identity-user-provided-resources | | Configure SSH ACLs on Nexus management VPN NNI | https://learn.microsoft.com/en-us/azure/operator-nexus/howto-configure-acls-for-ssh-management-on-access-vpn | | Configure Network TAP rules with UAMI-based access in Operator Nexus | https://learn.microsoft.com/en-us/azure/operator-nexus/howto-configure-network-tap-rules-with-user-assigned-managed-identity | @@ -154,10 +155,10 @@ This skill requires **network access** to fetch documentation content: ### Configuration | Topic | URL | |-------|-----| -| Use cluster.jsonc template for Operator Nexus cluster configuration | https://learn.microsoft.com/en-us/azure/operator-nexus/cluster-jsonc-example | -| Configure cluster.parameters.jsonc for multi-rack Operator Nexus clusters | https://learn.microsoft.com/en-us/azure/operator-nexus/cluster-parameters-jsonc-example | -| Use clusterManager.jsonc template settings for Operator Nexus | https://learn.microsoft.com/en-us/azure/operator-nexus/clustermanager-jsonc-example | -| Configure clusterManager.parameters.jsonc for Operator Nexus | 
https://learn.microsoft.com/en-us/azure/operator-nexus/clustermanager-parameters-jsonc-example | +| Author cluster.jsonc templates for Nexus clusters | https://learn.microsoft.com/en-us/azure/operator-nexus/cluster-jsonc-example | +| Define cluster.parameters.jsonc for multi-rack Nexus clusters | https://learn.microsoft.com/en-us/azure/operator-nexus/cluster-parameters-jsonc-example | +| Use clusterManager.jsonc template for Nexus deployment | https://learn.microsoft.com/en-us/azure/operator-nexus/clustermanager-jsonc-example | +| Configure clusterManager.parameters.jsonc for Nexus clusters | https://learn.microsoft.com/en-us/azure/operator-nexus/clustermanager-parameters-jsonc-example | | Apply A/B staged configuration updates in Nexus Fabric | https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-ab-staged-commit-configuration-update-commit-workflow | | Define and apply access control lists in Nexus Network Fabric | https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-access-control-lists | | Use commit workflow v2 for Nexus Network Fabric changes | https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-commit-workflow-v2 | @@ -166,6 +167,7 @@ This skill requires **network access** to fetch documentation content: | Monitor and detect configuration drift in Nexus Network Fabric | https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-network-fabric-configuration-monitoring | | Modify Nexus Fabric devices using read-write commands | https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-network-fabric-read-write-commands | | Batch and commit Nexus Network Fabric configuration updates | https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-network-fabric-resource-update-commit | +| Model terminal servers as Azure Operator Nexus resources | https://learn.microsoft.com/en-us/azure/operator-nexus/concepts-terminal-server-as-resource | | Customize CoreDNS and node-local-dns in Nexus | 
https://learn.microsoft.com/en-us/azure/operator-nexus/how-to-customize-kubernetes-cluster-dns | | Create and manage IP prefixes and rules in Operator Nexus | https://learn.microsoft.com/en-us/azure/operator-nexus/how-to-ip-prefixes | | Create and manage route policies in Operator Nexus Network Fabric | https://learn.microsoft.com/en-us/azure/operator-nexus/how-to-route-policy | diff --git a/skills/azure-partner-solutions/SKILL.md b/skills/azure-partner-solutions/SKILL.md index 10f1788c..0e82f7da 100644 --- a/skills/azure-partner-solutions/SKILL.md +++ b/skills/azure-partner-solutions/SKILL.md @@ -3,7 +3,7 @@ name: azure-partner-solutions description: Expert knowledge for Azure Partner Solutions development including troubleshooting, decision making, architecture & design patterns, security, configuration, and integrations & coding patterns. Use when using Service Connector, Confluent Cloud, MongoDB Atlas, Dynatrace APM, or Palo Alto Cloud NGFW on Azure, and other Azure Partner Solutions related development tasks. Not for Azure Industry (use azure-industry), Azure Managed Applications (use azure-managed-applications), Azure Lighthouse (use azure-lighthouse), Azure Networking (use azure-networking). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-05" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Partner Solutions Skill diff --git a/skills/azure-pipelines/SKILL.md b/skills/azure-pipelines/SKILL.md index eade10ac..171b3a9a 100644 --- a/skills/azure-pipelines/SKILL.md +++ b/skills/azure-pipelines/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-pipelines -description: Expert knowledge for Azure Pipelines development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. 
Use when configuring YAML pipelines, self-hosted agents, service connections, Key Vault secrets, or Web App/Kubernetes deploys, and other Azure Pipelines related development tasks. Not for Azure DevOps (use azure-devops), Azure Boards (use azure-boards), Azure Repos (use azure-repos), Azure Test Plans (use azure-test-plans). +description: Expert knowledge for Azure Pipelines development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when authoring YAML pipelines, configuring agents/triggers, deploying to App Service/AKS/VMs, or integrating Git/GitHub, and other Azure Pipelines related development tasks. Not for Azure DevOps (use azure-devops), Azure Boards (use azure-boards), Azure Repos (use azure-repos), Azure Test Plans (use azure-test-plans). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Pipelines Skill @@ -24,26 +24,26 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| -| Troubleshooting | L37-L47 | Diagnosing and fixing Azure Pipelines issues: service connection/auth problems, web app deploy failures, triggers, stuck jobs, and using logs to debug run failures. | +| Troubleshooting | L37-L47 | Diagnosing and fixing Azure Pipelines issues: service connection/auth problems, deployment failures, trigger/queue/start issues, and using logs to debug run failures. | | Best Practices | L48-L58 | Best practices for faster, reliable pipelines: caching, cross-platform scripts, handling flaky tests, parallel test execution (incl. VSTest), Test Impact Analysis, and UI test configuration. 
| | Decision Making | L59-L65 | Guides for migrating CI/CD pipelines to Azure Pipelines, including from Jenkins/Travis and from classic UI pipelines to YAML, with patterns, pitfalls, and safe migration steps. | | Architecture & Design Patterns | L66-L73 | Guidance on end-to-end CI/CD and DevOps architectures for Azure: baseline pipeline patterns, Web App deployment design, and IaaS/VM-focused DevTest and production pipelines. | | Limits & Quotas | L74-L82 | Limits, quotas, and capacity for Azure Pipelines: hosted agent caps, image deprecation, parallel jobs, agent pool concurrency, large package handling, and retention policy configuration. | -| Security | L83-L135 | Securing Azure Pipelines: agent auth, service connections, secrets/Key Vault, permissions and approvals, secure variables/files, repo access, signing, and integrating security scans/policies. | -| Configuration | L136-L462 | Configuring Azure Pipelines: agents, YAML/classic triggers, stages/jobs/steps, variables, environments, artifacts, test/analytics, and detailed setup for built-in tasks and deployment strategies. | -| Integrations & Coding Patterns | L463-L492 | Patterns and tasks for integrating Azure Pipelines with languages, tools, secrets, Git/GitHub, ServiceNow, Slack, Key Vault, and external APIs, plus caching and test automation. | +| Security | L83-L134 | Securing Azure Pipelines end-to-end: auth for agents and service connections, secrets/Key Vault, permissions/approvals, secure variables, and integrating code/security scanning. | +| Configuration | L135-L462 | Configuring Azure Pipelines YAML/classic pipelines: agents, triggers, stages/jobs/steps, variables, environments, artifacts, test/analytics, and detailed task input references. | +| Integrations & Coding Patterns | L463-L492 | Patterns and tasks for integrating Azure Pipelines with languages, tools, Git/GitHub, Key Vault, ServiceNow, Slack, REST/Functions, and managing variables, caching, and test automation. 
| | Deployment | L493-L584 | Agent setup and deployment guides for Azure Pipelines: installing/hosting agents, configuring CI/CD to VMs, App Service, containers, Kubernetes, databases, and publishing/consuming artifacts. | ### Troubleshooting | Topic | URL | |-------|-----| | Troubleshoot Azure Resource Manager service connections | https://learn.microsoft.com/en-us/azure/devops/pipelines/release/azure-rm-endpoint?view=azure-devops | -| Troubleshoot ARM workload identity service connections | https://learn.microsoft.com/en-us/azure/devops/pipelines/release/troubleshoot-workload-identity?view=azure-devops | +| Troubleshoot Azure workload identity service connections | https://learn.microsoft.com/en-us/azure/devops/pipelines/release/troubleshoot-workload-identity?view=azure-devops | | Review Azure Pipelines logs for diagnostics | https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/review-logs?view=azure-devops | | Troubleshoot Azure Web App deployment tasks in pipelines | https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshoot-azure-web-app-deploy?view=azure-devops | -| Fix Azure Pipelines jobs that never start | https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshoot-start?view=azure-devops | -| Troubleshoot Azure Pipelines trigger issues | https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshoot-triggers?view=azure-devops | -| Troubleshoot Azure Pipelines run failures using logs | https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshooting?view=azure-devops | +| Resolve Azure Pipelines queued-but-not-starting problems | https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshoot-start?view=azure-devops | +| Troubleshoot Azure Pipelines trigger and start issues | https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshoot-triggers?view=azure-devops | +| Diagnose and fix Azure 
Pipelines run failures | https://learn.microsoft.com/en-us/azure/devops/pipelines/troubleshooting/troubleshooting?view=azure-devops | ### Best Practices | Topic | URL | @@ -83,20 +83,19 @@ This skill requires **network access** to fetch documentation content: ### Security | Topic | URL | |-------|-----| -| Choose authentication options for self-hosted agents | https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/agent-authentication-options?view=azure-devops | +| Choose and configure auth methods for self-hosted agents | https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/agent-authentication-options?view=azure-devops | | Run Azure Pipelines agent with self-signed certificate | https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/certificate?view=azure-devops-server | | Register Azure Pipelines agent using device code flow | https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/device-code-flow-agent-registration?view=azure-devops | | Register Azure Pipelines agent using PAT authentication | https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/personal-access-token-agent-registration?view=azure-devops | -| Register Azure Pipelines agent with service principal | https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/service-principal-agent-registration?view=azure-devops | +| Register Azure Pipelines agents with a service principal | https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/service-principal-agent-registration?view=azure-devops | | Securely sign mobile apps in Azure Pipelines | https://learn.microsoft.com/en-us/azure/devops/pipelines/apps/mobile/app-signing?view=azure-devops | | Configure Docker Content Trust signing in Pipelines | https://learn.microsoft.com/en-us/azure/devops/pipelines/ecosystems/containers/content-trust?view=azure-devops | | Secure Azure DevOps pipelines with Entra workload identities | 
https://learn.microsoft.com/en-us/azure/devops/pipelines/library/add-devops-entra-service-connection?view=azure-devops | | Assign administrators for protected pipeline resources | https://learn.microsoft.com/en-us/azure/devops/pipelines/library/add-resource-protection?view=azure-devops | -| Handle special ARM service connection authentication cases | https://learn.microsoft.com/en-us/azure/devops/pipelines/library/azure-resource-manager-alternate-approaches?view=azure-devops | -| Configure Azure Resource Manager service connections | https://learn.microsoft.com/en-us/azure/devops/pipelines/library/connect-to-azure?view=azure-devops | +| Handle special-case Azure Resource Manager service connections | https://learn.microsoft.com/en-us/azure/devops/pipelines/library/azure-resource-manager-alternate-approaches?view=azure-devops | +| Configure Azure Resource Manager service connections in pipelines | https://learn.microsoft.com/en-us/azure/devops/pipelines/library/connect-to-azure?view=azure-devops | | Link Azure Pipelines variable groups to Key Vault | https://learn.microsoft.com/en-us/azure/devops/pipelines/library/link-variable-groups-to-key-vaults?view=azure-devops | | Manage secure files and access in Azure Pipelines | https://learn.microsoft.com/en-us/azure/devops/pipelines/library/secure-files?view=azure-devops | -| Configure and manage Azure Pipelines service connections | https://learn.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops | | Manage Azure Pipelines variable groups and access | https://learn.microsoft.com/en-us/azure/devops/pipelines/library/variable-groups?view=azure-devops | | Manage Azure Pipelines permissions and security groups | https://learn.microsoft.com/en-us/azure/devops/pipelines/policies/permissions?view=azure-devops | | Manage Azure Pipelines permissions and security groups | https://learn.microsoft.com/en-us/azure/devops/pipelines/policies/permissions?view=azure-devops | @@ -115,7 +114,7 @@ This 
skill requires **network access** to fetch documentation content: | Use secret variables securely in Azure Pipelines | https://learn.microsoft.com/en-us/azure/devops/pipelines/process/set-secret-variables?view=azure-devops | | Use Azure Key Vault secrets in Azure Pipelines | https://learn.microsoft.com/en-us/azure/devops/pipelines/release/azure-key-vault?view=azure-devops | | Create ARM service connection using client secret | https://learn.microsoft.com/en-us/azure/devops/pipelines/release/configure-app-secret?view=azure-devops | -| Manually configure ARM workload identity connections | https://learn.microsoft.com/en-us/azure/devops/pipelines/release/configure-workload-identity?view=azure-devops | +| Manually configure workload identity ARM service connections | https://learn.microsoft.com/en-us/azure/devops/pipelines/release/configure-workload-identity?view=azure-devops | | Plan an approach for securing YAML pipelines | https://learn.microsoft.com/en-us/azure/devops/pipelines/security/approach?view=azure-devops | | Securely handle variables and parameters in pipelines | https://learn.microsoft.com/en-us/azure/devops/pipelines/security/inputs?view=azure-devops | | Secure agents, projects, and containers in pipelines | https://learn.microsoft.com/en-us/azure/devops/pipelines/security/misc?view=azure-devops | @@ -123,7 +122,7 @@ This skill requires **network access** to fetch documentation content: | Automate Azure Pipelines security with REST and PowerShell | https://learn.microsoft.com/en-us/azure/devops/pipelines/security/project-security-script?view=azure-devops | | Configure pipeline resource security and approvals | https://learn.microsoft.com/en-us/azure/devops/pipelines/security/resources?view=azure-devops | | Protect and manage secrets in Azure Pipelines | https://learn.microsoft.com/en-us/azure/devops/pipelines/security/secrets?view=azure-devops | -| Secure repository access from Azure Pipelines | 
https://learn.microsoft.com/en-us/azure/devops/pipelines/security/secure-access-to-repos?view=azure-devops | +| Secure Azure Pipelines access to source repositories | https://learn.microsoft.com/en-us/azure/devops/pipelines/security/secure-access-to-repos?view=azure-devops | | Use YAML templates to improve pipeline security | https://learn.microsoft.com/en-us/azure/devops/pipelines/security/templates?view=azure-devops | | Configure CodeQL analysis task for vulnerabilities | https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/advanced-security-codeql-analyze-v1?view=azure-pipelines | | Configure CodeQL initialization for Advanced Security | https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/advanced-security-codeql-init-v1?view=azure-pipelines | @@ -165,7 +164,7 @@ This skill requires **network access** to fetch documentation content: | Use variables in classic Azure release pipelines | https://learn.microsoft.com/en-us/azure/devops/pipelines/release/variables?view=azure-devops | | Monitor pipelines with Azure DevOps dashboard widgets | https://learn.microsoft.com/en-us/azure/devops/pipelines/reports/pipeline-widgets?view=azure-devops | | Use Azure Pipelines analytics and reports | https://learn.microsoft.com/en-us/azure/devops/pipelines/reports/pipelinereport?view=azure-devops | -| Configure multi-repository checkout in Azure Pipelines | https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/multi-repo-checkout?view=azure-devops | +| Configure multi-repo checkout in Azure Pipelines | https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/multi-repo-checkout?view=azure-devops | | Configure advanced Git repository options in Azure Pipelines | https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/pipeline-options-for-git?view=azure-devops | | Configure built-in Azure Pipelines task parameters | https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/?view=azure-pipelines | | Configure 
AndroidBuild@1 task for Gradle builds | https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/android-build-v1?view=azure-pipelines | @@ -231,6 +230,7 @@ This skill requires **network access** to fetch documentation content: | Configure deprecated Azure Web PowerShell deployment task | https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/azure-web-powershell-deployment-v1?view=azure-pipelines | | Configure Bash@3 task for cross-platform scripts | https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/bash-v3?view=azure-pipelines | | Configure BatchScript@1 task for Windows commands | https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/batch-script-v1?view=azure-pipelines | +| Configure Azure Pipelines BicepDeploy@0 task inputs | https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/bicep-deploy-v0?view=azure-pipelines | | Configure CacheBeta@0 task (deprecated caching) | https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/cache-beta-v0?view=azure-pipelines | | Configure CacheBeta@1 task for pipeline caching | https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/cache-beta-v1?view=azure-pipelines | | Configure Cache@2 task to cache pipeline files | https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/cache-v2?view=azure-pipelines | diff --git a/skills/azure-planetary-computer-pro/SKILL.md b/skills/azure-planetary-computer-pro/SKILL.md index d1e1219d..29aa73b5 100644 --- a/skills/azure-planetary-computer-pro/SKILL.md +++ b/skills/azure-planetary-computer-pro/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-planetary-computer-pro -description: Expert knowledge for Microsoft Planetary Computer Pro development including troubleshooting, decision making, limits & quotas, security, configuration, and integrations & coding patterns. 
Use when managing STAC collections, GeoCatalog ingestion, SAS tokens, Explorer visualization, or QGIS/ArcGIS integration, and other Microsoft Planetary Computer Pro related development tasks. Not for Azure Open Datasets (use azure-open-datasets), Azure Maps (use azure-maps), Azure Data Explorer (use azure-data-explorer), Azure Synapse Analytics (use azure-synapse-analytics). +description: Expert knowledge for Microsoft Planetary Computer Pro development including troubleshooting, decision making, limits & quotas, security, configuration, and integrations & coding patterns. Use when managing STAC collections/items, GeoCatalog ingestion, SAS tokens, tiles/mosaics, or QGIS/ArcGIS integration, and other Microsoft Planetary Computer Pro related development tasks. Not for Azure Open Datasets (use azure-open-datasets), Azure Maps (use azure-maps), Azure Data Explorer (use azure-data-explorer), Azure Synapse Analytics (use azure-synapse-analytics). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-05" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Microsoft Planetary Computer Pro Skill @@ -28,8 +28,8 @@ This skill requires **network access** to fetch documentation content: | Decision Making | L40-L44 | Guidance on selecting how to access Planetary Computer Pro data, including connection options, integrations with tools/services, and choosing the best method for your workflow. | | Limits & Quotas | L45-L49 | Supported file formats, data types, and size/usage limits for datasets and computations in Planetary Computer Pro, including quotas that affect how you process and store data. 
| | Security | L50-L60 | Authenticating apps and services to Planetary Computer Pro, configuring Entra ID, RBAC, managed identities, cross-tenant access, and SAS-based authorization for GeoCatalog access and data ingestion | -| Configuration | L61-L74 | Configuring Planetary Computer Pro collections: ingestion sources, mosaics, tiles, render/colormap settings, Explorer visualization, queryable filters, and US Gov cloud endpoints. | -| Integrations & Coding Patterns | L75-L88 | Patterns and APIs for creating/managing STAC collections/items, bulk ingesting data, generating SAS tokens, and integrating Planetary Computer Pro with web apps, QGIS, ArcGIS, and other tools | +| Configuration | L61-L75 | Configuring Planetary Computer Pro data access and visualization: collections, tiles, mosaics, render/colormap settings, queryables, ingestion sources, APIM proxy, and US Gov cloud endpoints. | +| Integrations & Coding Patterns | L76-L89 | Patterns and APIs for creating/managing STAC collections/items, bulk ingesting data, generating SAS tokens, and integrating Planetary Computer Pro with web apps, QGIS, ArcGIS, and other tools. | ### Troubleshooting | Topic | URL | @@ -63,6 +63,7 @@ This skill requires **network access** to fetch documentation content: |-------|-----| | Configure Planetary Computer Pro collections for Explorer visualization | https://learn.microsoft.com/en-us/azure/planetary-computer/collection-configuration-concept | | Configure collection visualization settings in Planetary Computer Pro portal | https://learn.microsoft.com/en-us/azure/planetary-computer/configure-collection-web-interface | +| Configure APIM proxy for Planetary Computer GeoCatalog | https://learn.microsoft.com/en-us/azure/planetary-computer/create-api-proxy-geocatalog | | Apply sample render configurations for Planetary Computer Pro data visualization | https://learn.microsoft.com/en-us/azure/planetary-computer/data-visualization-samples | | Configure ingestion sources for Planetary
Computer Pro GeoCatalogs | https://learn.microsoft.com/en-us/azure/planetary-computer/ingestion-source | | Configure mosaic options for Planetary Computer Pro collections | https://learn.microsoft.com/en-us/azure/planetary-computer/mosaic-configurations-for-collections | diff --git a/skills/azure-policy/SKILL.md b/skills/azure-policy/SKILL.md index 3e7f4df3..2f979869 100644 --- a/skills/azure-policy/SKILL.md +++ b/skills/azure-policy/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-policy -description: Expert knowledge for Azure Policy development including troubleshooting, best practices, decision making, security, configuration, integrations & coding patterns, and deployment. Use when authoring Machine Configuration packages, deploying via ARM/Bicep/Terraform, enforcing security baselines, migrating from DSC, or querying compliance with Resource Graph, and other Azure Policy related development tasks. Not for Azure Blueprints (use azure-blueprints), Azure Role-based access control (use azure-rbac), Azure Resource Manager (use azure-resource-manager), Azure Security (use azure-security). +description: Expert knowledge for Azure Policy development including troubleshooting, best practices, decision making, security, configuration, integrations & coding patterns, and deployment. Use when authoring guest config packages, deploying via ARM/Bicep/Terraform, mapping to CIS/NIST, or querying compliance with Resource Graph, and other Azure Policy related development tasks. Not for Azure Blueprints (use azure-blueprints), Azure Resource Manager (use azure-resource-manager), Azure Role-based access control (use azure-rbac), Azure Security (use azure-security). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. 
metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Policy Skill @@ -27,10 +27,10 @@ This skill requires **network access** to fetch documentation content: | Troubleshooting | L35-L41 | Diagnosing and fixing Azure Policy non-compliance, common policy evaluation/deployment errors, and Machine Configuration deployment and remediation issues. | | Best Practices | L42-L57 | Designing effective Azure Policy definitions: effects, logical/value operators, arrays, tags, initiatives, parameters, and testing/behavior of Machine/Guest Configuration. | | Decision Making | L58-L64 | Guidance for planning migrations from Azure Automation DSC, DSC extension, and Automanage Best Practices to Azure Policy/Machine Configuration, including mapping features and migration steps. | -| Security | L65-L126 | Using Azure Policy and Machine Configuration for security baselines and mapping/enforcing compliance with standards (CIS, NIST, ISO, PCI, FedRAMP, HIPAA, SOC 2, regional regs). | -| Configuration | L127-L141 | Authoring, assigning, storing, and securing Machine Configuration (guest configuration) packages and policies, plus prerequisites, networking, remediation, and compliance result analysis. | -| Integrations & Coding Patterns | L142-L147 | Using Azure Resource Graph to query Azure Policy compliance data and guest configuration state across resources for reporting, auditing, and large-scale policy analysis | -| Deployment | L148-L157 | How to deploy and assign Machine Configuration packages via ARM/Bicep/Terraform/REST, publish packages to storage, and use safe deployment practices with Azure Policy. | +| Security | L65-L128 | Using Azure Policy and Machine Configuration for security/compliance: baselines, guest config, package signing, exemptions, and mappings to CIS, NIST, ISO, PCI, FedRAMP, HIPAA, CMMC, and regional standards. 
| +| Configuration | L129-L143 | Authoring, assigning, storing, and securing Machine Configuration (guest configuration) packages and policies, plus prerequisites, networking, remediation, and compliance result analysis. | +| Integrations & Coding Patterns | L144-L149 | Using Azure Resource Graph to query Azure Policy compliance data and guest configuration state across resources for reporting, auditing, and large-scale policy analysis. | +| Deployment | L150-L159 | How to deploy and assign Machine Configuration packages via ARM/Bicep/Terraform/REST, publish packages to storage, and use safe deployment practices with Azure Policy. | ### Troubleshooting | Topic | URL | @@ -71,7 +71,9 @@ This skill requires **network access** to fetch documentation content: | Sign Machine Configuration packages and enforce signed content | https://learn.microsoft.com/en-us/azure/governance/machine-configuration/how-to/develop-custom-package/6-sign-package | | Define and use Azure Policy exemption structure | https://learn.microsoft.com/en-us/azure/governance/policy/concepts/exemption-structure | | Map Azure Policy to Australian ISM PROTECTED controls | https://learn.microsoft.com/en-us/azure/governance/policy/samples/australia-ism | -| Apply Microsoft cloud security benchmark via Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/azure-security-benchmark | +| Align with Microsoft cloud security benchmark via Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/azure-security-benchmark | +| Use Azure built-in policy initiatives for compliance | https://learn.microsoft.com/en-us/azure/governance/policy/samples/built-in-initiatives | +| Use Azure built-in policy definitions for governance | https://learn.microsoft.com/en-us/azure/governance/policy/samples/built-in-policies | | Use Azure Policy for Canada Federal PBMM compliance | https://learn.microsoft.com/en-us/azure/governance/policy/samples/canada-federal-pbmm | | Map CIS Azure
1.1.0 controls to Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/cis-azure-1-1-0 |
| Map CIS Azure 1.3.0 controls to Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/cis-azure-1-3-0 |
@@ -85,19 +87,19 @@ This skill requires **network access** to fetch documentation content:
| Use CIS benchmarks for SUSE Linux via Machine Configuration | https://learn.microsoft.com/en-us/azure/governance/policy/samples/cis-linux/suse-ado |
| Use CIS benchmarks for Ubuntu via Machine Configuration | https://learn.microsoft.com/en-us/azure/governance/policy/samples/cis-linux/ubuntu-ado |
| Align Azure Policy with CMMC Level 3 controls | https://learn.microsoft.com/en-us/azure/governance/policy/samples/cmmc-l3 |
-| Map Azure Policy to FedRAMP High requirements | https://learn.microsoft.com/en-us/azure/governance/policy/samples/fedramp-high |
-| Map Azure Policy to FedRAMP Moderate requirements | https://learn.microsoft.com/en-us/azure/governance/policy/samples/fedramp-moderate |
-| Map Microsoft cloud security benchmark to Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-azure-security-benchmark |
+| Map FedRAMP High controls to Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/fedramp-high |
+| Map FedRAMP Moderate controls to Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/fedramp-moderate |
+| Use Azure Policy for Microsoft cloud security benchmark in Azure Government | https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-azure-security-benchmark |
| Map CIS Azure 1.1.0 (Gov) controls to Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-cis-azure-1-1-0 |
| Map CIS Azure 1.3.0 controls to Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-cis-azure-1-3-0 |
| Implement CMMC Level 3 controls with Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-cmmc-l3 |
-| Align Azure Government with FedRAMP High via Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-fedramp-high |
-| Align Azure Government with FedRAMP Moderate via Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-fedramp-moderate |
+| Map FedRAMP High controls to Azure Policy in Azure Government | https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-fedramp-high |
+| Map FedRAMP Moderate controls to Azure Policy in Azure Government | https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-fedramp-moderate |
| Implement IRS 1075 2016 controls with Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-irs-1075-sept2016 |
| Use Azure Policy for ISO 27001:2013 compliance | https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-iso-27001 |
-| Use Azure Policy for NIST SP 800-171 R2 | https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-nist-sp-800-171-r2 |
-| Implement NIST SP 800-53 R4 with Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-nist-sp-800-53-r4 |
-| Implement NIST SP 800-53 R5 with Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-nist-sp-800-53-r5 |
+| Map NIST SP 800-171 R2 controls to Azure Policy in Azure Government | https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-nist-sp-800-171-r2 |
+| Implement NIST SP 800-53 R4 with Azure Policy in Azure Government | https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-nist-sp-800-53-r4 |
+| Implement NIST SP 800-53 R5 with Azure Policy in Azure Government | https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-nist-sp-800-53-r5 |
| Align SOC 2 controls with Azure Policy in Azure Government | https://learn.microsoft.com/en-us/azure/governance/policy/samples/gov-soc-2 |
| Apply CIS Linux security baselines via Machine Configuration | https://learn.microsoft.com/en-us/azure/governance/policy/samples/guest-configuration-baseline-cis-linux |
| Apply Docker security baseline via guest configuration | https://learn.microsoft.com/en-us/azure/governance/policy/samples/guest-configuration-baseline-docker |
@@ -107,20 +109,20 @@ This skill requires **network access** to fetch documentation content:
| Implement HIPAA HITRUST controls using Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/hipaa-hitrust |
| Use Azure Policy for IRS 1075 (2016) compliance | https://learn.microsoft.com/en-us/azure/governance/policy/samples/irs-1075-sept2016 |
| Align Azure Policy with ISO 27001:2013 controls | https://learn.microsoft.com/en-us/azure/governance/policy/samples/iso-27001 |
-| Use Azure Policy for Sovereignty Baseline Confidential compliance | https://learn.microsoft.com/en-us/azure/governance/policy/samples/mcfs-baseline-confidential |
-| Use Azure Policy for Sovereignty Baseline Global compliance | https://learn.microsoft.com/en-us/azure/governance/policy/samples/mcfs-baseline-global |
-| Use Azure Policy to meet NIST SP 800-171 R2 | https://learn.microsoft.com/en-us/azure/governance/policy/samples/nist-sp-800-171-r2 |
-| Implement NIST SP 800-53 Rev. 4 with Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/nist-sp-800-53-r4 |
-| Implement NIST SP 800-53 Rev. 5 with Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/nist-sp-800-53-r5 |
-| Map Azure Policy to NL BIO Cloud Theme controls | https://learn.microsoft.com/en-us/azure/governance/policy/samples/nl-bio-cloud-theme |
+| Use Sovereignty Baseline Confidential policies for compliance | https://learn.microsoft.com/en-us/azure/governance/policy/samples/mcfs-baseline-confidential |
+| Use Sovereignty Baseline Global policies for compliance | https://learn.microsoft.com/en-us/azure/governance/policy/samples/mcfs-baseline-global |
+| Map NIST SP 800-171 R2 controls to Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/nist-sp-800-171-r2 |
+| Map NIST SP 800-53 Rev. 4 controls to Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/nist-sp-800-53-r4 |
+| Map NIST SP 800-53 Rev. 5 controls to Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/nist-sp-800-53-r5 |
+| Map NL BIO Cloud Theme controls to Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/nl-bio-cloud-theme |
| Implement PCI DSS 3.2.1 controls with Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/pci-dss-3-2-1 |
| Implement PCI DSS v4.0 controls with Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/pci-dss-4-0 |
| Use Azure Policy for RBI IT Framework for Banks | https://learn.microsoft.com/en-us/azure/governance/policy/samples/rbi-itf-banks-2016 |
| Use Azure Policy for RBI IT Framework for NBFC | https://learn.microsoft.com/en-us/azure/governance/policy/samples/rbi-itf-nbfc-2017 |
-| Map Azure Policy to RMIT Malaysia compliance controls | https://learn.microsoft.com/en-us/azure/governance/policy/samples/rmit-malaysia |
+| Map RMIT Malaysia controls to Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/rmit-malaysia |
| Map SOC 2 controls to Azure Policy initiatives | https://learn.microsoft.com/en-us/azure/governance/policy/samples/soc-2 |
| Use Azure Policy for Spain ENS regulatory compliance | https://learn.microsoft.com/en-us/azure/governance/policy/samples/spain-ens |
-| Map Azure Policy to SWIFT CSP-CSCF v2021 controls | https://learn.microsoft.com/en-us/azure/governance/policy/samples/swift-csp-cscf-2021 |
+| Map SWIFT CSP-CSCF 2021 controls to Azure Policy | https://learn.microsoft.com/en-us/azure/governance/policy/samples/swift-csp-cscf-2021 |
| Map Azure Policy to SWIFT CSP-CSCF v2022 controls | https://learn.microsoft.com/en-us/azure/governance/policy/samples/swift-csp-cscf-2022 |
| Use Azure Policy for UK OFFICIAL and NHS compliance | https://learn.microsoft.com/en-us/azure/governance/policy/samples/ukofficial-uknhs |
diff --git a/skills/azure-private-link/SKILL.md b/skills/azure-private-link/SKILL.md
index fec94be5..8840bdb2 100644
--- a/skills/azure-private-link/SKILL.md
+++ b/skills/azure-private-link/SKILL.md
@@ -3,7 +3,7 @@ name: azure-private-link
description: Expert knowledge for Azure Private Link development including best practices, decision making, architecture & design patterns, limits & quotas, security, and configuration. Use when configuring Private Endpoints, DNS and Private Resolver, NSP migrations, Azure Firewall traffic control, or High Scale limits, and other Azure Private Link related development tasks. Not for Azure Virtual Network (use azure-virtual-network), Azure VPN Gateway (use azure-vpn-gateway), Azure ExpressRoute (use azure-expressroute), Azure Virtual WAN (use azure-virtual-wan).
compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
metadata:
- generated_at: "2026-04-19"
+ generated_at: "2026-04-26"
generator: "docs2skills/1.0.0"
---
# Azure Private Link Skill
diff --git a/skills/azure-quantum/SKILL.md b/skills/azure-quantum/SKILL.md
index 902a31d0..f173a891 100644
--- a/skills/azure-quantum/SKILL.md
+++ b/skills/azure-quantum/SKILL.md
@@ -1,9 +1,9 @@
---
name: azure-quantum
-description: Expert knowledge for Azure Quantum development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when using Azure Quantum workspaces, QDK/Q#, QIR/OpenQASM circuits, IonQ/PASQAL/Quantinuum/Rigetti targets, or hybrid jobs, and other Azure Quantum related development tasks. Not for Azure HPC Cache (use azure-hpc-cache), Azure Batch (use azure-batch), Azure Databricks (use azure-databricks), Azure Machine Learning (use azure-machine-learning).
+description: Expert knowledge for Azure Quantum development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when using QDK/VS Code, IonQ/PASQAL/Quantinuum/Rigetti targets, QIR/OpenQASM/Qiskit circuits, or hybrid jobs, and other Azure Quantum related development tasks. Not for Azure HPC Cache (use azure-hpc-cache), Azure Batch (use azure-batch), Azure Databricks (use azure-databricks), Azure Machine Learning (use azure-machine-learning).
compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
metadata:
- generated_at: "2026-04-19"
+ generated_at: "2026-04-26"
generator: "docs2skills/1.0.0"
---
# Azure Quantum Skill
@@ -24,22 +24,21 @@ This skill requires **network access** to fetch documentation content:
| Category | Lines | Description |
|----------|-------|-------------|
-| Troubleshooting | L37-L45 | Troubleshooting Azure Quantum provider issues: diagnosing job failures and understanding support/escalation policies and limits for IonQ, PASQAL, Quantinuum, and Rigetti. |
-| Best Practices | L46-L52 | Best practices for using QDK in VS Code with Copilot, optimizing large Q# programs via resource estimation, and systematically testing and debugging quantum code. |
-| Decision Making | L53-L61 | Guidance on Azure Quantum costs, provider pricing and regions, workspace migration, choosing Q# dev tools, and planning quantum-safe cryptography with the resource estimator. |
-| Architecture & Design Patterns | L62-L66 | Guidance on designing hybrid quantum-classical workflows in Azure Quantum, including architecture options, orchestration patterns, and when to offload tasks to quantum hardware. |
-| Limits & Quotas | L67-L73 | Managing Azure Quantum quotas, job/session limits, timeouts, and Rigetti-specific hardware constraints and target capabilities. |
+| Troubleshooting | L37-L44 | Troubleshooting Azure Quantum provider issues: diagnosing job failures and understanding support/escalation policies and limits for IonQ, PASQAL, Quantinuum, and Rigetti. |
+| Best Practices | L45-L51 | Best practices for using QDK in VS Code with Copilot, optimizing large Q# programs via resource estimation, and systematically testing and debugging quantum code. |
+| Decision Making | L52-L60 | Guidance on Azure Quantum costs, provider pricing and regions, workspace migration, choosing Q# dev tools, and planning quantum-safe cryptography with the resource estimator. |
+| Architecture & Design Patterns | L61-L65 | Guidance on designing hybrid quantum-classical workflows in Azure Quantum, including architecture options, orchestration patterns, and when to offload tasks to quantum hardware. |
+| Limits & Quotas | L66-L73 | Usage limits, quotas, session timeouts, and hardware capacity specs for Azure Quantum targets (e.g., Pasqal, Rigetti), plus how to monitor and manage these constraints. |
| Security | L74-L84 | Managing secure access to Azure Quantum workspaces: RBAC and access control, bulk user assignment, ARM locks, managed identities, service principals, and secure handling of access keys. |
-| Configuration | L85-L101 | Configuring Azure Quantum workspaces, QDK tools, simulators, noise models, and hardware targets (IonQ, PASQAL, Quantinuum, Rigetti), plus tuning and batching resource estimator runs. |
-| Integrations & Coding Patterns | L102-L111 | Using the Azure Quantum QDK and SDKs to connect workspaces and submit/run circuits and programs (QIR, OpenQASM, Pulser, Qiskit, Cirq) and hybrid jobs with Adaptive RI |
-| Deployment | L112-L116 | Deploying Azure Quantum workspaces with Bicep and running/submitting Q# quantum programs from VS Code to Azure Quantum backends |
+| Configuration | L85-L100 | Configuring Azure Quantum environments: workspaces, simulators, hardware targets (IonQ/Quantinuum/neutral atoms), VS Code/QDK setup, and tuning/optimizing resource estimator runs. |
+| Integrations & Coding Patterns | L101-L110 | Using the Azure Quantum QDK and SDKs to connect workspaces and submit/run circuits and programs (QIR, OpenQASM, Pulser, Qiskit, Cirq) and hybrid jobs with Adaptive RI |
+| Deployment | L111-L115 | Deploying Azure Quantum workspaces with Bicep and running/submitting Q# quantum programs from VS Code to Azure Quantum backends |
### Troubleshooting
| Topic | URL |
|-------|-----|
| Diagnose and fix common Azure Quantum issues | https://learn.microsoft.com/en-us/azure/quantum/azure-quantum-common-issues |
| Support and escalation policy for IonQ on Azure Quantum | https://learn.microsoft.com/en-us/azure/quantum/provider-support-ionq |
-| Support policy for PASQAL on Azure Quantum | https://learn.microsoft.com/en-us/azure/quantum/provider-support-pasqal |
| Support policy for Quantinuum on Azure Quantum | https://learn.microsoft.com/en-us/azure/quantum/provider-support-quantinuum |
| Support policy for Rigetti on Azure Quantum | https://learn.microsoft.com/en-us/azure/quantum/provider-support-rigetti |
@@ -69,6 +68,7 @@ This skill requires **network access** to fetch documentation content:
|-------|-----|
| Review and manage Azure Quantum usage quotas | https://learn.microsoft.com/en-us/azure/quantum/azure-quantum-quotas |
| Manage Azure Quantum sessions and avoid timeouts | https://learn.microsoft.com/en-us/azure/quantum/how-to-work-with-sessions |
+| Pasqal targets specifications and capacity limits in Azure Quantum | https://learn.microsoft.com/en-us/azure/quantum/provider-pasqal |
| Rigetti provider targets and hardware limits in Azure Quantum | https://learn.microsoft.com/en-us/azure/quantum/provider-rigetti |
### Security
@@ -85,14 +85,13 @@ This skill requires **network access** to fetch documentation content:
### Configuration
| Topic | URL |
|-------|-----|
+| Configure and use Azure Quantum backend simulators | https://learn.microsoft.com/en-us/azure/quantum/backend-simulators |
| Configure Azure Quantum workspaces with Azure CLI | https://learn.microsoft.com/en-us/azure/quantum/how-to-manage-quantum-workspaces-with-the-azure-cli |
| Run Microsoft Quantum resource estimator locally and online | https://learn.microsoft.com/en-us/azure/quantum/how-to-submit-re-jobs |
-| Install and use the QDK molecule visualizer in Jupyter | https://learn.microsoft.com/en-us/azure/quantum/how-to-use-molecule-visualizer |
| Set up QDK VS Code extension and environment | https://learn.microsoft.com/en-us/azure/quantum/install-overview-qdk |
| Install and run neutral atom device simulators in QDK | https://learn.microsoft.com/en-us/azure/quantum/install-qdk-neutral-atom-simulators |
| Configure target parameters for the Quantum resource estimator | https://learn.microsoft.com/en-us/azure/quantum/overview-resources-estimator |
| Configure and use IonQ targets in Azure Quantum | https://learn.microsoft.com/en-us/azure/quantum/provider-ionq |
-| Configure PASQAL simulators and QPUs in Azure Quantum | https://learn.microsoft.com/en-us/azure/quantum/provider-pasqal |
| Configure Quantinuum quantum targets in Azure Quantum | https://learn.microsoft.com/en-us/azure/quantum/provider-quantinuum |
| Configure noise models for neutral atom simulations in QDK | https://learn.microsoft.com/en-us/azure/quantum/qdk-simulator-noise-models |
| Batch and compare multiple resource estimator configurations | https://learn.microsoft.com/en-us/azure/quantum/resource-estimator-batching |
diff --git a/skills/azure-redhat-openshift/SKILL.md b/skills/azure-redhat-openshift/SKILL.md
index d1315993..e2d2695e 100644
--- a/skills/azure-redhat-openshift/SKILL.md
+++ b/skills/azure-redhat-openshift/SKILL.md
@@ -3,7 +3,7 @@ name: azure-redhat-openshift
description: Expert knowledge for Azure Red Hat OpenShift development including troubleshooting, best practices, decision making, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when creating ARO clusters, configuring networking/storage, securing with Entra/NSGs, using GPUs/Key Vault, or upgrading, and other Azure Red Hat OpenShift related development tasks. Not for Azure Kubernetes Service (AKS) (use azure-kubernetes-service), Azure Container Apps (use azure-container-apps), Azure Container Instances (use azure-container-instances), Azure VMware Solution (use azure-vmware-solution).
compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
metadata:
- generated_at: "2026-04-05"
+ generated_at: "2026-04-26"
generator: "docs2skills/1.0.0"
---
# Azure Red Hat OpenShift Skill
@@ -27,11 +27,11 @@ This skill requires **network access** to fetch documentation content:
| Troubleshooting | L36-L43 | Fixing common ARO cluster issues, restoring cluster access, and manually updating or troubleshooting cluster certificates and connectivity via CLI |
| Best Practices | L44-L51 | Guidance on sizing and deploying ARO clusters and infra nodes, optimizing OpenShift Virtualization VMs, and understanding ARO 4 support limits and policies |
| Decision Making | L52-L56 | Defines the shared responsibility model for Azure Red Hat OpenShift, detailing which operational tasks are handled by Microsoft, Red Hat, and the customer. |
-| Limits & Quotas | L57-L62 | Scaling and capacity limits for ARO clusters, including configuring multiple load balancer IPs and understanding ARO versioning, support lifecycle, and upgrade constraints. |
-| Security | L63-L79 | Identity, auth, and network security for ARO: Entra/managed identities, workload identity, NSGs/egress control, disk encryption, FIPS, Front Door protection, Lockbox, and credential rotation. |
-| Configuration | L80-L98 | Configuring ARO clusters: networking (proxy, DNS, egress, MTU), storage (Azure Files, Prometheus), registry/pull secrets, node/subnet layout, Spot VMs, tagging, and health alerts. |
-| Integrations & Coding Patterns | L99-L107 | Running ARO with external services: virtualization, NVIDIA GPUs, Azure NetApp Files, Prometheus→Azure Monitor, ACR auth, and Azure Key Vault secret integration. |
-| Deployment | L108-L119 | Deploying and operating ARO clusters and apps: cluster creation (private/ARM/Bicep), upgrades, networking migration, backups/restores, and app runtimes (JBoss, WebSphere, S2I, serverless). |
+| Limits & Quotas | L57-L61 | Scaling and capacity limits for ARO clusters, including configuring multiple load balancer IPs and understanding ARO versioning, support lifecycle, and upgrade constraints. |
+| Security | L62-L78 | Identity, auth, and network security for ARO: Entra/managed identities, workload identity, NSGs/egress control, disk encryption, FIPS, Front Door protection, Lockbox, and credential rotation. |
+| Configuration | L79-L97 | Configuring ARO clusters: networking (proxy, DNS, egress, MTU), storage (Azure Files, Prometheus), registry/pull secrets, node/subnet layout, Spot VMs, tagging, and health alerts. |
+| Integrations & Coding Patterns | L98-L106 | Running ARO with external services: virtualization, NVIDIA GPUs, Azure NetApp Files, Prometheus→Azure Monitor, ACR auth, and Azure Key Vault secret integration. |
+| Deployment | L107-L118 | Deploying and operating ARO clusters and apps: cluster creation (private/ARM/Bicep), upgrades, networking migration, backups/restores, and app runtimes (JBoss, WebSphere, S2I, serverless). |
### Troubleshooting
| Topic | URL |
@@ -58,7 +58,6 @@ This skill requires **network access** to fetch documentation content:
| Topic | URL |
|-------|-----|
| Configure multiple load balancer IPs to scale ARO clusters | https://learn.microsoft.com/en-us/azure/openshift/howto-multiple-ips |
-| Understand Azure Red Hat OpenShift support lifecycle and versions | https://learn.microsoft.com/en-us/azure/openshift/support-lifecycle |
### Security
| Topic | URL |
diff --git a/skills/azure-reliability/SKILL.md b/skills/azure-reliability/SKILL.md
index e8f677f7..6069c79e 100644
--- a/skills/azure-reliability/SKILL.md
+++ b/skills/azure-reliability/SKILL.md
@@ -1,9 +1,9 @@
---
name: azure-reliability
-description: Expert knowledge for Azure Reliability development including best practices, decision making, architecture & design patterns, limits & quotas, and deployment. Use when designing zone-redundant AKS, MySQL Flexible Server, Azure Functions, Queue Storage, or multi-region DR setups, and other Azure Reliability related development tasks. Not for Azure Monitor (use azure-monitor), Azure Resiliency (use azure-resiliency), Azure Service Health (use azure-service-health), Azure Sre Agent (use azure-sre-agent).
+description: Expert knowledge for Azure Reliability development including best practices, decision making, architecture & design patterns, limits & quotas, and deployment. Use when designing zone/multi-region apps, MySQL Flexible Server HA, resilient Functions, DR failover, or Queue size limits, and other Azure Reliability related development tasks. Not for Azure Resiliency (use azure-resiliency), Azure Monitor (use azure-monitor), Azure Service Health (use azure-service-health), Azure Site Recovery (use azure-site-recovery).
compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
metadata:
- generated_at: "2026-04-19"
+ generated_at: "2026-04-26"
generator: "docs2skills/1.0.0"
---
# Azure Reliability Skill
@@ -24,11 +24,11 @@ This skill requires **network access** to fetch documentation content:
| Category | Lines | Description |
|----------|-------|-------------|
-| Best Practices | L33-L71 | Patterns and guidance to design, configure, and harden Azure services (AKS, DBs, networking, messaging, backup, DR) for high availability, fault tolerance, and disaster recovery. |
-| Decision Making | L72-L76 | Guidance on using availability zones, nonregional services, and resilient Azure Functions architectures to design highly available, fault-tolerant Azure solutions. |
-| Architecture & Design Patterns | L77-L83 | Designing Azure apps for high availability using zones and multi-region patterns, including planning zone-resilient workloads, hardening zonal deployments, and building in nonpaired regions. |
-| Limits & Quotas | L84-L88 | Details on Azure Queue Storage message size limits, including max message size, behavior when limits are exceeded, and best practices for handling large payloads. |
-| Deployment | L89-L92 | Guidance on deploying Azure services and MySQL Flexible Server with availability zones, including configuring zone-redundant high availability and migration to zone-resilient setups. |
+| Best Practices | L33-L74 | Patterns and guidance to design, configure, and harden Azure services (compute, data, networking, messaging) for high availability, failover, and disaster recovery. |
+| Decision Making | L75-L79 | Guidance on using availability zones, nonregional services, and resilient Azure Functions architectures to design highly available, fault-tolerant Azure solutions. |
+| Architecture & Design Patterns | L80-L86 | Designing Azure apps for high availability using zones and multi-region patterns, including planning zone-resilient workloads, hardening zonal deployments, and building in nonpaired regions. |
+| Limits & Quotas | L87-L91 | Details on Azure Queue Storage message size limits, including max message size, behavior when limits are exceeded, and best practices for handling large payloads. |
+| Deployment | L92-L95 | Guidance on deploying Azure services and MySQL Flexible Server with availability zones, including configuring zone-redundant high availability and migration to zone-resilient setups. |
### Best Practices
| Topic | URL |
@@ -36,6 +36,7 @@ This skill requires **network access** to fetch documentation content:
| Design resilient clusters in Azure Kubernetes Service | https://learn.microsoft.com/en-us/azure/reliability/reliability-aks |
| Configure reliability for Azure API Center | https://learn.microsoft.com/en-us/azure/reliability/reliability-api-center |
| Build resilient configurations with Azure App Configuration | https://learn.microsoft.com/en-us/azure/reliability/reliability-app-configuration |
+| Build resilient configurations with Azure App Configuration | https://learn.microsoft.com/en-us/azure/reliability/reliability-app-configuration |
| Harden Azure App Service Environment reliability | https://learn.microsoft.com/en-us/azure/reliability/reliability-app-service-environment |
| Architect highly available Azure Application Gateway v2 | https://learn.microsoft.com/en-us/azure/reliability/reliability-application-gateway-v2 |
| Design resilient backup strategies with Azure Backup | https://learn.microsoft.com/en-us/azure/reliability/reliability-backup |
@@ -48,6 +49,7 @@ This skill requires **network access** to fetch documentation content:
| Design resilient Azure Database for MySQL deployments | https://learn.microsoft.com/en-us/azure/reliability/reliability-database-mysql |
| Implement high availability for Azure Database for PostgreSQL | https://learn.microsoft.com/en-us/azure/reliability/reliability-database-postgresql |
| Implement resilient architectures in Azure Databricks | https://learn.microsoft.com/en-us/azure/reliability/reliability-databricks |
+| Harden application reliability with Azure DDoS Protection | https://learn.microsoft.com/en-us/azure/reliability/reliability-ddos-protection |
| Ensure reliability for Azure Device Registry metadata | https://learn.microsoft.com/en-us/azure/reliability/reliability-device-registry |
| Design high availability for Azure DocumentDB | https://learn.microsoft.com/en-us/azure/reliability/reliability-documentdb |
| Build resilient architectures with Azure Event Grid | https://learn.microsoft.com/en-us/azure/reliability/reliability-event-grid |
@@ -60,6 +62,7 @@ This skill requires **network access** to fetch documentation content:
| Design resilient architectures with Azure Load Balancer | https://learn.microsoft.com/en-us/azure/reliability/reliability-load-balancer |
| Design resilient workflows with Azure Logic Apps | https://learn.microsoft.com/en-us/azure/reliability/reliability-logic-apps |
| Improve reliability of Azure Managed Grafana workspaces | https://learn.microsoft.com/en-us/azure/reliability/reliability-managed-grafana |
+| Ensure resilient Azure Key Vault Managed HSM deployments | https://learn.microsoft.com/en-us/azure/reliability/reliability-managed-hsm |
| Increase reliability of Azure Managed Redis caches | https://learn.microsoft.com/en-us/azure/reliability/reliability-managed-redis |
| Implement resilient logging with Azure Monitor Logs | https://learn.microsoft.com/en-us/azure/reliability/reliability-monitor-logs |
| Improve reliability of Azure Notification Hubs | https://learn.microsoft.com/en-us/azure/reliability/reliability-notification-hubs |
diff --git a/skills/azure-repos/SKILL.md b/skills/azure-repos/SKILL.md
index 73312ff9..74e8a03f 100644
--- a/skills/azure-repos/SKILL.md
+++ b/skills/azure-repos/SKILL.md
@@ -3,7 +3,7 @@ name: azure-repos
description: Expert knowledge for Azure Repos development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, and integrations & coding patterns. Use when managing Azure Repos Git/TFVC, branch/PR policies, repo permissions, CodeQL/secret scans, or TFVC migrations, and other Azure Repos related development tasks. Not for Azure DevOps (use azure-devops), Azure Pipelines (use azure-pipelines), Azure Boards (use azure-boards), Azure Artifacts (use azure-artifacts).
compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
metadata:
- generated_at: "2026-04-19"
+ generated_at: "2026-04-26"
generator: "docs2skills/1.0.0"
---
# Azure Repos Skill
diff --git a/skills/azure-resource-manager/SKILL.md b/skills/azure-resource-manager/SKILL.md
index cbfaee74..a654f0bb 100644
--- a/skills/azure-resource-manager/SKILL.md
+++ b/skills/azure-resource-manager/SKILL.md
@@ -3,7 +3,7 @@ name: azure-resource-manager
description: Expert knowledge for Azure Resource Manager development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when authoring Bicep/ARM templates, CI/CD deployments, template specs, deployment stacks, or ARM REST/CLI automations, and other Azure Resource Manager related development tasks. Not for Azure Blueprints (use azure-blueprints), Azure Policy (use azure-policy), Azure Portal (use azure-portal), Azure Resource Graph (use azure-resource-graph).
compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
metadata:
- generated_at: "2026-04-19"
+ generated_at: "2026-04-26"
generator: "docs2skills/1.0.0"
---
# Azure Resource Manager Skill
diff --git a/skills/azure-sap/SKILL.md b/skills/azure-sap/SKILL.md
index 2d153f23..2997ee26 100644
--- a/skills/azure-sap/SKILL.md
+++ b/skills/azure-sap/SKILL.md
@@ -1,9 +1,9 @@
---
name: azure-sap
-description: Expert knowledge for SAP HANA on Azure Large Instances development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when designing HANA Large Instances with SDAF/ACSS, Azure Monitor for SAP, Key Vault/RBAC, HA/DR clusters, or RISE, and other SAP HANA on Azure Large Instances related development tasks. Not for Azure Large Instances (use azure-large-instances), Azure Virtual Machines (use azure-virtual-machines), Azure Baremetal Infrastructure (use azure-baremetal-infrastructure), SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines).
+description: Expert knowledge for SAP HANA on Azure Large Instances development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when deploying SAP HANA on Azure Large Instances, S/4HANA, NetWeaver, LaMa, Azure Monitor for SAP, or RISE integration, and other SAP HANA on Azure Large Instances related development tasks. Not for Azure Large Instances (use azure-large-instances), Azure Virtual Machines (use azure-virtual-machines), SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines), Azure Baremetal Infrastructure (use azure-baremetal-infrastructure).
compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
metadata:
- generated_at: "2026-04-19"
+ generated_at: "2026-04-26"
generator: "docs2skills/1.0.0"
---
# SAP HANA on Azure Large Instances Skill
@@ -26,13 +26,13 @@ This skill requires **network access** to fetch documentation content:
|----------|-------|-------------|
| Troubleshooting | L37-L48 | Diagnosing and fixing SAP on Azure issues: deployment automation, BPS data extraction (ADF/Fabric), SAP Insights/AMS, Azure Monitor for SAP, VM scale sets, and VM extensions. |
| Best Practices | L49-L60 | Best practices for testing, validating, and optimizing SAP on Azure: HA/DR design, automated test frameworks, Quality Insights checks, Azure Files/DFS-N tuning, and VM scale set deployment. |
-| Decision Making | L61-L77 | Guidance on planning SAP on Azure: choosing infra, VM/storage options, DR, data models and tiering, SAP version support, data extraction, and app/network architecture decisions. |
-| Architecture & Design Patterns | L78-L105 | Architecting SAP on Azure: HA/DR patterns, DB choices (HANA, SQL, Oracle, Db2, ASE, MaxDB), RISE integration, zoning/latency, networking, security, and multi-region/VM designs. |
-| Limits & Quotas | L106-L110 | SAP on Azure limits: supported platforms/features for SAP testing automation, Azure Monitor for SAP quotas/behavior, and sizing/HA deployment constraints using Azure Files SMB. |
-| Security | L111-L126 | Security, identity, and access design for SAP on Azure: Key Vault secrets, RBAC, TLS, private endpoints, encrypted storage, secure DB providers, and Entra ID/RISE integration. |
-| Configuration | L127-L200 | Configuring and automating SAP on Azure: Terraform/SDAF setup, Azure Center registration and lifecycle, monitoring (Azure Monitor), HA/DR and storage layouts, and OS/DB-specific HA cluster designs. |
-| Integrations & Coding Patterns | L201-L214 | Patterns and automation for integrating SAP HANA/Azure LI with Azure Monitor, ADF, RISE, Salesforce, Exchange, Universal Print, VIS APIs, and SAP Principal Propagation. |
-| Deployment | L215-L243 | Deploying and automating SAP on Azure: control plane/workload zones, SDAF scripts/pipelines, ACSS-based S/4HANA/NetWeaver/BO/B1 installs, HA/DR, and planning checklists. |
+| Decision Making | L61-L76 | Guidance on planning SAP on Azure: choosing infra, VM/storage options, DR, data models and tiering, SAP version support, data extraction, and app/network architecture decisions. |
+| Architecture & Design Patterns | L77-L104 | Architecting SAP on Azure: HA/DR patterns, DB choices (HANA, SQL, Oracle, Db2, ASE, MaxDB), RISE integration, zoning/latency, networking, security, and multi-region/VM designs. |
+| Limits & Quotas | L105-L109 | SAP on Azure limits: supported platforms/features for SAP testing automation, Azure Monitor for SAP quotas/behavior, and sizing/HA deployment constraints using Azure Files SMB. |
+| Security | L110-L124 | Security, identity, and access design for SAP on Azure: Key Vault secrets, RBAC, TLS, private endpoints, encrypted storage, secure DB providers, and Entra ID/RISE integration. |
+| Configuration | L125-L199 | Configuring SAP on Azure: automation (Terraform/SDAF), networking, storage, HA/DR clusters, monitoring (Azure Monitor/Center), and VM extensions for HANA, NetWeaver, Db2, and LaMa. |
+| Integrations & Coding Patterns | L200-L214 | Patterns and scripts to integrate SAP on Azure with monitoring, automation, VIS APIs/CLI/PowerShell, data pipelines, email/printing, identity, and external services like Salesforce and RISE |
+| Deployment | L215-L240 | Deploying and tearing down SAP landscapes on Azure: automation scripts, control planes, DevOps pipelines, infrastructure patterns, HA setups, and app-specific deployments (S/4HANA, B1, BI, NetWeaver). |
### Troubleshooting
| Topic | URL |
@@ -61,7 +61,6 @@ This skill requires **network access** to fetch documentation content:
### Decision Making
| Topic | URL |
|-------|-----|
-| Choose new vs existing infrastructure for SAP automation | https://learn.microsoft.com/en-us/azure/sap/automation/new-vs-existing |
| Plan SAP Deployment Automation Framework usage on Azure | https://learn.microsoft.com/en-us/azure/sap/automation/plan-deployment |
| Use Business Process Solutions templates for analytics and AI agents | https://learn.microsoft.com/en-us/azure/sap/business-process-solutions/business-templates |
| Select and understand data models in Business Process Solutions | https://learn.microsoft.com/en-us/azure/sap/business-process-solutions/data-models-business-process-solutions |
@@ -113,7 +112,6 @@ This skill requires **network access** to fetch documentation content:
|-------|-----|
| Set SAP deployment SPN secrets in Azure Key Vault | https://learn.microsoft.com/en-us/azure/sap/automation/bash/set-secrets |
| Configure Azure RBAC for Azure Center for SAP solutions | https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/manage-with-azure-rbac |
-| Configure Azure networking and security for S/4HANA | https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/prepare-network |
| Configure TLS 1.2 security for Azure Monitor for SAP | https://learn.microsoft.com/en-us/azure/sap/monitor/enable-tls-azure-monitor-sap-solutions |
| Enable Trusted Access and private endpoints for Azure Monitor for SAP solutions | https://learn.microsoft.com/en-us/azure/sap/monitor/enable-trusted-access |
| Create secure IBM Db2 provider for Azure Monitor for SAP | https://learn.microsoft.com/en-us/azure/sap/monitor/provider-ibm-db2 |
@@ -138,15 +136,16 @@ This skill requires **network access** to fetch documentation content:
| Configure SDAF control plane web application | https://learn.microsoft.com/en-us/azure/sap/automation/configure-webapp |
| Extend SAP Deployment 
Automation Framework configuration | https://learn.microsoft.com/en-us/azure/sap/automation/extensibility | | Customize Azure resource naming in SAP automation | https://learn.microsoft.com/en-us/azure/sap/automation/naming-module | +| Reference shell scripts for SAP deployment automation | https://learn.microsoft.com/en-us/azure/sap/automation/reference-bash | | Configure Insights and templates in Business Process Solutions | https://learn.microsoft.com/en-us/azure/sap/business-process-solutions/configure-insights | | Configure SAP source systems with Open Mirroring | https://learn.microsoft.com/en-us/azure/sap/business-process-solutions/configure-source-system-with-open-mirroring | | Prepare SAP installation media for Azure Center for SAP | https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/get-sap-installation-media | | Monitor SAP systems with Azure Center for SAP | https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/monitor-portal | -| Start and stop SAP systems via Azure CLI in Azure Center | https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quick-stop-start-sap-cli | -| Start and stop SAP systems via Azure PowerShell in Azure Center | https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quick-stop-start-sap-powershell | +| Prepare Azure virtual network for S/4HANA deployment | https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/prepare-network | +| Create HA SAP infrastructure with custom Azure resource names | https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-create-high-availability-namecustom | | Register existing SAP systems in Azure Center using Azure CLI | https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-register-system-cli | | Register existing SAP systems in Azure Center using PowerShell | https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-register-system-powershell | -| Register existing SAP systems in 
Azure Center via portal | https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/register-existing-system | +| Register existing SAP systems in Azure Center for SAP solutions | https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/register-existing-system | | Control SAP system lifecycle via Azure VIS | https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/start-stop-sap-systems | | Reference of logs and metrics for Azure Monitor for SAP solutions | https://learn.microsoft.com/en-us/azure/sap/monitor/data-reference | | Configure HA Pacemaker provider for Azure Monitor for SAP | https://learn.microsoft.com/en-us/azure/sap/monitor/provider-ha-pacemaker-cluster | @@ -169,7 +168,7 @@ This skill requires **network access** to fetch documentation content: | Configure HA SAP NetWeaver on RHEL with Azure NetApp Files | https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-netapp-files | | Configure HA SAP NetWeaver on RHEL with Azure Files NFS | https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-nfs-azure-files | | Configure Pacemaker clusters on RHEL for Azure SAP | https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-pacemaker | -| Configure SAP PAS and AAS on RHEL ASCS/SCS HA VMs | https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-with-dialog-instance | +| Configure SAP PAS/AAS on RHEL HA VMs in Azure | https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-with-dialog-instance | | Configure SAP HANA ASCS/ERS high availability on RHEL VMs | https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-rhel-with-hana-ascs-ers-dialog-instance | | Configure HA SAP NetWeaver on Azure VMs with SLES | https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-suse | | Configure HA SAP NetWeaver on SLES with Azure NetApp Files | 
https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-suse-netapp-files | @@ -203,9 +202,10 @@ This skill requires **network access** to fetch documentation content: |-------|-----| | Integrate Azure Monitor for SAP with automation | https://learn.microsoft.com/en-us/azure/sap/automation/integration-azure-monitor-sap | | Run SDAF Ansible playbooks to install SAP | https://learn.microsoft.com/en-us/azure/sap/automation/run-ansible | -| Download SAP media using automation framework playbooks | https://learn.microsoft.com/en-us/azure/sap/automation/software | | Configure Salesforce as a source system for Business Process Solutions | https://learn.microsoft.com/en-us/azure/sap/business-process-solutions/configure-salesforce-source-system | | Integrate SAP source systems with Business Process Solutions via Azure Data Factory | https://learn.microsoft.com/en-us/azure/sap/business-process-solutions/configure-source-system-with-data-factory | +| Control SAP systems via Azure CLI VIS commands | https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quick-stop-start-sap-cli | +| Control SAP systems via Azure PowerShell VIS commands | https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quick-stop-start-sap-powershell | | Control SAP and VM lifecycle via VIS REST APIs | https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/stop-start-sap-and-underlying-vm | | Integrate SAP ABAP outbound email with Exchange Online | https://learn.microsoft.com/en-us/azure/sap/workloads/exchange-online-integration-sap-email-outbound | | Configure SAP Principal Propagation for live OData with Power Query | https://learn.microsoft.com/en-us/azure/sap/workloads/expose-sap-odata-to-power-query | @@ -225,13 +225,11 @@ This skill requires **network access** to fetch documentation content: | Configure Azure DevOps for SAP automation pipelines | https://learn.microsoft.com/en-us/azure/sap/automation/configure-devops | | Deploy control plane for 
SAP automation framework | https://learn.microsoft.com/en-us/azure/sap/automation/deploy-control-plane | | Deploy SAP infrastructure with SDAF and Azure DevOps | https://learn.microsoft.com/en-us/azure/sap/automation/devops-tutorial | -| Use Bash scripts to deploy SAP automation framework | https://learn.microsoft.com/en-us/azure/sap/automation/reference-bash | | Check SAP Deployment Automation support matrix | https://learn.microsoft.com/en-us/azure/sap/automation/supportability | | Deploy Business Process Solutions workload item in Fabric | https://learn.microsoft.com/en-us/azure/sap/business-process-solutions/deploy-workload-item | | Run data extraction and processing pipelines in Business Process Solutions | https://learn.microsoft.com/en-us/azure/sap/business-process-solutions/run-extraction-data-processing | -| Deploy S/4HANA infrastructure with Azure Center for SAP | https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/deploy-s4hana | +| Deploy S/4HANA infrastructure with Azure Center for SAP solutions | https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/deploy-s4hana | | Install SAP software on ACSS-managed systems | https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/install-software | -| Create distributed HA SAP system with ACSS using Azure CLI | https://learn.microsoft.com/en-us/azure/sap/center-sap-solutions/quickstart-create-high-availability-namecustom | | Deploy SAP Business One on Azure Virtual Machines | https://learn.microsoft.com/en-us/azure/sap/workloads/business-one-azure | | Deploy SAP BusinessObjects BI on Azure for Linux | https://learn.microsoft.com/en-us/azure/sap/workloads/businessobjects-deployment-guide-linux | | Deploy SAP BusinessObjects BI on Azure for Windows | https://learn.microsoft.com/en-us/azure/sap/workloads/businessobjects-deployment-guide-windows | @@ -239,5 +237,4 @@ This skill requires **network access** to fetch documentation content: | Deploy SAP NetWeaver on Azure Linux VMs | 
https://learn.microsoft.com/en-us/azure/sap/workloads/deployment-guide | | Prepare and deploy SAP HANA on Azure VMs | https://learn.microsoft.com/en-us/azure/sap/workloads/hana-get-started | | Deploy HA SAP NetWeaver on Azure NetApp Files SMB | https://learn.microsoft.com/en-us/azure/sap/workloads/high-availability-guide-windows-netapp-files-smb | -| Onboard SAP Edge Integration Cell to Azure AKS/Arc | https://learn.microsoft.com/en-us/azure/sap/workloads/sap-edge-integration-cell-with-azure | | Install SAP NetWeaver HA on WSFC with file share | https://learn.microsoft.com/en-us/azure/sap/workloads/sap-high-availability-installation-wsfc-file-share | \ No newline at end of file diff --git a/skills/azure-security/SKILL.md b/skills/azure-security/SKILL.md index 41f57fd0..72082652 100644 --- a/skills/azure-security/SKILL.md +++ b/skills/azure-security/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-security -description: Expert knowledge for Azure Security development including troubleshooting, best practices, decision making, security, configuration, integrations & coding patterns, and deployment. Use when securing AKS workloads, SBOMs, Notation image signing, Key Vault/HSM keys, or Customer Lockbox access, and other Azure Security related development tasks. Not for Azure Defender For Cloud (use azure-defender-for-cloud), Azure Firewall (use azure-firewall), Azure DDos Protection (use azure-ddos-protection), Azure Web Application Firewall (use azure-web-application-firewall). +description: Expert knowledge for Azure Security development including troubleshooting, best practices, decision making, security, configuration, integrations & coding patterns, and deployment. Use when securing AKS workloads, signing container images/SBOMs, configuring firewalls/antimalware, or choosing Key Vault/HSM, and other Azure Security related development tasks. 
Not for Azure Defender For Cloud (use azure-defender-for-cloud), Azure Sentinel (use azure-sentinel), Azure Firewall (use azure-firewall), Azure Web Application Firewall (use azure-web-application-firewall). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-12" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Security Skill @@ -25,7 +25,7 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| | Troubleshooting | L35-L39 | Diagnosing and resolving common Azure Customer Lockbox issues, including access request problems, approval/denial errors, and configuration or permission-related failures. | -| Best Practices | L40-L58 | Security hardening checklists and patterns for Azure (IaaS/PaaS), covering identity, network, data encryption, secrets, DNS, and app/database protection best practices | +| Best Practices | L40-L58 | Security hardening checklists and patterns for Azure IaaS/PaaS: identity, network, data encryption, secrets, SQL/Storage/Service Fabric, Marketplace images, and subdomain takeover prevention. | | Decision Making | L59-L63 | Guidance on choosing Azure key management options (Key Vault, managed HSM, app-managed keys), including security, compliance, performance, and integration trade-offs. | | Security | L64-L94 | Securing Azure workloads: threat modeling mitigations, AKS image signing, crypto and data protection, ransomware defense, incident response, and Azure-specific security/operational best practices. 
| | Configuration | L95-L102 | Configuring Azure security features like antimalware, firewalls, container vulnerability tools, security logging/auditing, and upcoming managed TLS/DCV changes | @@ -41,7 +41,7 @@ This skill requires **network access** to fetch documentation content: | Topic | URL | |-------|-----| | Harden Azure Marketplace images before publishing | https://learn.microsoft.com/en-us/azure/security/fundamentals/azure-marketplace-images | -| Implement Azure data security and encryption best practices | https://learn.microsoft.com/en-us/azure/security/fundamentals/data-encryption-best-practices | +| Apply Azure data security and encryption best practices | https://learn.microsoft.com/en-us/azure/security/fundamentals/data-encryption-best-practices | | Use Azure SQL database security checklist | https://learn.microsoft.com/en-us/azure/security/fundamentals/database-security-checklist | | Apply security best practices to Azure IaaS workloads | https://learn.microsoft.com/en-us/azure/security/fundamentals/iaas | | Apply Microsoft Entra identity security best practices | https://learn.microsoft.com/en-us/azure/security/fundamentals/identity-management-best-practices | diff --git a/skills/azure-sentinel/SKILL.md b/skills/azure-sentinel/SKILL.md index 6fd0b1f3..b6444cfd 100644 --- a/skills/azure-sentinel/SKILL.md +++ b/skills/azure-sentinel/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-sentinel -description: Expert knowledge for Azure Sentinel development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring Sentinel data connectors, analytics rules, SOAR playbooks, UEBA/ASIM content, or multi-workspace setups, and other Azure Sentinel related development tasks. 
Not for Azure Defender For Cloud (use azure-defender-for-cloud), Azure Defender For Iot (use azure-defender-for-iot), Azure Monitor (use azure-monitor), Azure Security (use azure-security). +description: Expert knowledge for Azure Sentinel development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring Sentinel connectors, analytics rules, UEBA/Fusion, ASIM/DCR pipelines, or Logic Apps playbooks, and other Azure Sentinel related development tasks. Not for Azure Defender For Cloud (use azure-defender-for-cloud), Azure Monitor (use azure-monitor), Azure Security (use azure-security), Azure Network Watcher (use azure-network-watcher). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Sentinel Skill @@ -24,337 +24,331 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| -| Troubleshooting | L37-L48 | Diagnosing and fixing Microsoft Sentinel ingestion, connector, KQL/data lake, analytics rule (auto-disable), MCP tools, and SAP/AWS/Blob/CEF/Syslog integration issues. | -| Best Practices | L49-L75 | Best practices for SOC operations in Microsoft Sentinel: rule tuning, automation/playbooks, incident tasks/metrics, watchlists, data collection, solution lifecycle, and monitoring/health. | -| Decision Making | L76-L112 | Planning Sentinel deployments, pricing, and data tiers; choosing connectors and content; and migrating detections, automation, and historical data from MMA, ArcSight, QRadar, and Splunk. 
| -| Architecture & Design Patterns | L113-L126 | Architecting Sentinel deployments: multi-workspace/tenant patterns, MSSP setups, SOAR automation, BCDR/resiliency, cross-workspace data/incident ops, SAP, ML models, and Jupyter-based hunting. | -| Limits & Quotas | L127-L138 | Limits, quotas, pricing, and retention tiers for Sentinel data, search jobs, watchlists, MCP servers, ASIM, and workspace removal impacts | -| Security | L139-L154 | Security configuration for Microsoft Sentinel: RBAC and roles, row-level/resource-context access, playbook auth/restrictions, encryption keys, audit logs, SAP roles/params, and network/attack protections. | -| Configuration | L155-L283 | Configuring Microsoft Sentinel data connectors, retention, analytics/automation rules, ASIM schemas, UEBA, SAP/Power Platform content, data lake, MCP tools, and health/auditing settings. | -| Integrations & Coding Patterns | L284-L336 | APIs, connectors, and code patterns for ingesting data, threat intel, and incidents into Sentinel, querying the data lake/graphs, and integrating with playbooks, MCP, Teams, Power BI, and Logic Apps | -| Deployment | L337-L360 | Deploying and managing Microsoft Sentinel solutions and content (CI/CD, ARM, content hub, marketplace) and specialized connectors/agents for SAP, Power Platform, Dynamics, Azure Stack Hub, and hunting notebooks. | +| Troubleshooting | L37-L48 | Diagnosing and fixing Microsoft Sentinel data ingestion, connector, KQL, analytics rule, and MCP/SAP solution issues across AWS S3, Azure Storage, Syslog/CEF, and the data lake. | +| Best Practices | L49-L67 | Best practices for Sentinel workspace/data design, cost tuning, rule noise reduction, SOC operations, identity/SAP threat detection, ASIM use, and safe watchlist management. 
| +| Decision Making | L68-L109 | Planning and decision guides for Sentinel deployment, pricing and data tiers, connector and analytics choices, and migrating from legacy SIEMs (Splunk/QRadar/ArcSight) and MMA/SAP to Sentinel. | +| Architecture & Design Patterns | L110-L122 | Designing Microsoft Sentinel architectures: workspace/tenant layouts, HA/DR, MSSP multi-tenant models, SAP-specific patterns, and cross-workspace/tenant integration strategies. | +| Limits & Quotas | L123-L134 | Limits, quotas, pricing, and retention tiers for Sentinel data, workspaces, watchlists, search jobs, and ASIM normalization, plus impacts of workspace removal. | +| Security | L135-L152 | Securing Microsoft Sentinel: RBAC and row-level access, playbook auth and restrictions, encryption keys, network perimeters, and SAP/Power Platform/Dynamics security content and parameters | +| Configuration | L153-L286 | Configuring Microsoft Sentinel data ingestion, connectors, schemas, analytics/automation rules, UEBA/Fusion, data lake, MCP tools, and health/auditing to tailor and operate the SIEM platform. | +| Integrations & Coding Patterns | L287-L329 | Programmatic and low-code ways to integrate data, threat intel, incidents, graphs, and analytics with Microsoft Sentinel using APIs, Logic Apps, DCRs, ASIM, notebooks, and external tools. | +| Deployment | L330-L354 | Deploying and managing Microsoft Sentinel at scale, including prerequisites, multi-tenant/workspace setup, CI/CD content deployment, Content hub solutions, and SAP/Power Platform/Dynamics connectors. 
| ### Troubleshooting | Topic | URL | |-------|-----| -| Troubleshoot Microsoft Sentinel AWS S3 connector problems | https://learn.microsoft.com/en-us/azure/sentinel/aws-s3-troubleshoot | -| Troubleshoot Microsoft Sentinel Azure Storage Blob connector | https://learn.microsoft.com/en-us/azure/sentinel/azure-storage-blob-connector-troubleshoot | -| Troubleshoot Sentinel CEF and Syslog AMA ingestion issues | https://learn.microsoft.com/en-us/azure/sentinel/cef-syslog-ama-troubleshooting | +| Troubleshoot AWS S3 log ingestion connector issues in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/aws-s3-troubleshoot | +| Troubleshoot Microsoft Sentinel Azure Storage Blob connector issues | https://learn.microsoft.com/en-us/azure/sentinel/azure-storage-blob-connector-troubleshoot | +| Troubleshoot Syslog and CEF AMA connectors in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/cef-syslog-ama-troubleshooting | | Troubleshoot KQL queries and jobs in Sentinel data lake | https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-troubleshoot | | Best practices and troubleshooting for Sentinel MCP tools | https://learn.microsoft.com/en-us/azure/sentinel/datalake/troubleshoot-sentinel-mcp | | Troubleshoot Sentinel SAP data connector agent | https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-deploy-troubleshoot | -| Troubleshoot Sentinel analytics rules and AUTO DISABLED | https://learn.microsoft.com/en-us/azure/sentinel/troubleshoot-analytics-rules | -| Troubleshoot Microsoft Sentinel solution ingestion issues | https://learn.microsoft.com/en-us/azure/sentinel/troubleshoot-sentinel-solutions | +| Troubleshoot Sentinel analytics rule issues | https://learn.microsoft.com/en-us/azure/sentinel/troubleshoot-analytics-rules | +| Troubleshoot Microsoft Sentinel solution ingestion and analytics | https://learn.microsoft.com/en-us/azure/sentinel/troubleshoot-sentinel-solutions | ### Best Practices | Topic | URL | |-------|-----| -| Audit and track 
Sentinel incident task changes | https://learn.microsoft.com/en-us/azure/sentinel/audit-track-tasks | -| Implement Sentinel automation rules for SOAR operations | https://learn.microsoft.com/en-us/azure/sentinel/automate-incident-handling-with-automation-rules | -| Automate Sentinel response to compromised users with playbooks | https://learn.microsoft.com/en-us/azure/sentinel/automation/tutorial-respond-threats-playbook | -| Apply operational best practices for Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/best-practices | -| Apply data collection best practices in Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/best-practices-data | -| Apply fine-tuning recommendations to Sentinel rules | https://learn.microsoft.com/en-us/azure/sentinel/detection-tuning | -| Use ASIM-based essential domain solutions in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/domain-based-essential-solutions | -| Reduce false positives in Microsoft Sentinel analytics | https://learn.microsoft.com/en-us/azure/sentinel/false-positives | -| Standardize Sentinel incident handling with tasks | https://learn.microsoft.com/en-us/azure/sentinel/incident-tasks | +| Apply recommended Microsoft Sentinel playbook templates | https://learn.microsoft.com/en-us/azure/sentinel/automation/playbook-recommendations | +| Apply best practices for managing Sentinel workspaces | https://learn.microsoft.com/en-us/azure/sentinel/best-practices | +| Apply data collection best practices in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/best-practices-data | +| Reduce and optimize Microsoft Sentinel costs | https://learn.microsoft.com/en-us/azure/sentinel/billing-reduce-costs | +| Use Sentinel identity attack graph to find lateral paths | https://learn.microsoft.com/en-us/azure/sentinel/datalake/identity-attack-graph | +| Tune Sentinel analytics rules to reduce noise | https://learn.microsoft.com/en-us/azure/sentinel/detection-tuning | | Handle 
data ingestion delay in Sentinel rules | https://learn.microsoft.com/en-us/azure/sentinel/ingestion-delay | -| Use Sentinel incident metrics to manage SOC performance | https://learn.microsoft.com/en-us/azure/sentinel/manage-soc-with-incident-metrics | -| Update SOC and analyst processes for Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/migration-security-operations-center-processes | -| Monitor health and integrity of Microsoft Sentinel analytics rules | https://learn.microsoft.com/en-us/azure/sentinel/monitor-analytics-rule-integrity | -| Monitor and optimize Sentinel scheduled analytics rule execution | https://learn.microsoft.com/en-us/azure/sentinel/monitor-optimize-analytics-rule-execution | -| Protect MSSP intellectual property in Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/mssp-protect-intellectual-property | -| Apply operational recommendations for Microsoft Sentinel SOCs | https://learn.microsoft.com/en-us/azure/sentinel/ops-guide | +| Convert Sentinel content to use ASIM normalization | https://learn.microsoft.com/en-us/azure/sentinel/normalization-modify-content | +| Follow operational recommendations for Microsoft Sentinel SOCs | https://learn.microsoft.com/en-us/azure/sentinel/ops-guide | | Configure Sentinel SAP detections and threat protection | https://learn.microsoft.com/en-us/azure/sentinel/sap/deployment-solution-configuration | -| Monitor Zero Trust TIC 3.0 with Sentinel solution | https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solution | -| Manage lifecycle of deprecated Sentinel solutions | https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solution-deprecation | -| Apply quality guidelines to Microsoft Sentinel solutions | https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solution-quality-guidance | -| Use Sentinel watchlists to enrich and correlate events | https://learn.microsoft.com/en-us/azure/sentinel/watchlists | +| Check SAP security controls with Sentinel 
workbook | https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-audit-controls-workbook | +| Use SAP Security Audit log workbook in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-audit-log-workbook | +| Apply Sentinel SOC optimization recommendations | https://learn.microsoft.com/en-us/azure/sentinel/soc-optimization/soc-optimization-access | +| Use Sentinel watchlists to enrich and correlate data | https://learn.microsoft.com/en-us/azure/sentinel/watchlists | | Maintain and edit Microsoft Sentinel watchlists safely | https://learn.microsoft.com/en-us/azure/sentinel/watchlists-manage | -| Use Sentinel incident tasks in analyst workflows | https://learn.microsoft.com/en-us/azure/sentinel/work-with-tasks | ### Decision Making | Topic | URL | |-------|-----| -| Plan and execute migration from MMA to AMA for Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/ama-migrate | -| Decide and migrate Sentinel alert-trigger playbooks to automation rules | https://learn.microsoft.com/en-us/azure/sentinel/automation/migrate-playbooks-to-automation-rules | -| Choose when to use Microsoft Sentinel data lake tier | https://learn.microsoft.com/en-us/azure/sentinel/basic-logs-use-cases | -| Plan and estimate Microsoft Sentinel pricing and billing | https://learn.microsoft.com/en-us/azure/sentinel/billing | -| Analyze and optimize Microsoft Sentinel cost and billing | https://learn.microsoft.com/en-us/azure/sentinel/billing-monitor-costs | -| Use Microsoft Sentinel prepurchase plans to save costs | https://learn.microsoft.com/en-us/azure/sentinel/billing-pre-purchase-plan | -| Reduce Microsoft Sentinel costs with product features | https://learn.microsoft.com/en-us/azure/sentinel/billing-reduce-costs | -| Choose and configure Sentinel connectors for Cisco ASA/FTD | https://learn.microsoft.com/en-us/azure/sentinel/cisco-ftd-firewall | +| Plan and execute Sentinel migration from MMA to AMA | https://learn.microsoft.com/en-us/azure/sentinel/ama-migrate | 
+| Migrate Sentinel alert-trigger playbooks to automation rules | https://learn.microsoft.com/en-us/azure/sentinel/automation/migrate-playbooks-to-automation-rules |
+| Decide when to use the Sentinel data lake tier | https://learn.microsoft.com/en-us/azure/sentinel/basic-logs-use-cases |
+| Plan Microsoft Sentinel pricing, billing, and cost control | https://learn.microsoft.com/en-us/azure/sentinel/billing |
+| Monitor and manage Microsoft Sentinel costs | https://learn.microsoft.com/en-us/azure/sentinel/billing-monitor-costs |
+| Use Sentinel prepurchase plans to optimize analytics costs | https://learn.microsoft.com/en-us/azure/sentinel/billing-pre-purchase-plan |
+| Choose and configure Cisco firewall connectors for Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/cisco-ftd-firewall |
 | Compare Sentinel analytics rules vs Defender custom detections | https://learn.microsoft.com/en-us/azure/sentinel/compare-analytics-rules-custom-detections |
-| Assess Sentinel connector data type support by cloud | https://learn.microsoft.com/en-us/azure/sentinel/data-type-cloud-support |
+| Check Sentinel connector data type support by cloud | https://learn.microsoft.com/en-us/azure/sentinel/data-type-cloud-support |
+| Use KQL jobs to promote Sentinel data | https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-jobs |
 | Choose between KQL jobs, summary rules, and search jobs | https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-jobs-summary-rules-search-jobs |
-| Plan side-by-side deployment with existing SIEM | https://learn.microsoft.com/en-us/azure/sentinel/deploy-side-by-side |
+| Plan side-by-side deployment of Sentinel with existing SIEM | https://learn.microsoft.com/en-us/azure/sentinel/deploy-side-by-side |
 | Enroll Sentinel workspaces in simplified pricing tiers | https://learn.microsoft.com/en-us/azure/sentinel/enroll-simplified-pricing-tier |
-| Check Microsoft Sentinel feature availability by Azure cloud | https://learn.microsoft.com/en-us/azure/sentinel/feature-availability |
-| Plan Sentinel deployment for geography and data residency | https://learn.microsoft.com/en-us/azure/sentinel/geographical-availability-data-residency |
-| Choose data tiers and retention for Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/manage-data-overview |
-| Use Microsoft Sentinel within the Defender portal | https://learn.microsoft.com/en-us/azure/sentinel/microsoft-sentinel-defender-portal |
-| Plan and phase a migration to Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/migration |
-| Migrate ArcSight SOAR automation to Sentinel rules and playbooks | https://learn.microsoft.com/en-us/azure/sentinel/migration-arcsight-automation |
+| Review Microsoft Sentinel feature availability by Azure cloud | https://learn.microsoft.com/en-us/azure/sentinel/feature-availability |
+| Plan Sentinel deployment for data residency compliance | https://learn.microsoft.com/en-us/azure/sentinel/geographical-availability-data-residency |
+| Plan Sentinel data tiers and retention for cost and operations | https://learn.microsoft.com/en-us/azure/sentinel/manage-data-overview |
+| Assess Defender XDR connector data type support across clouds | https://learn.microsoft.com/en-us/azure/sentinel/microsoft-365-defender-cloud-support |
+| Plan migration from legacy SIEMs to Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/migration |
+| Migrate ArcSight SOAR automation to Sentinel playbooks | https://learn.microsoft.com/en-us/azure/sentinel/migration-arcsight-automation |
 | Map and migrate ArcSight detection rules to Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/migration-arcsight-detection-rules |
 | Export ArcSight historical data for Sentinel migration | https://learn.microsoft.com/en-us/azure/sentinel/migration-arcsight-historical-data |
-| Choose an Azure target platform for Sentinel historical data | https://learn.microsoft.com/en-us/azure/sentinel/migration-ingestion-target-platform |
-| Select a data ingestion tool for Sentinel historical logs | https://learn.microsoft.com/en-us/azure/sentinel/migration-ingestion-tool |
-| Migrate QRadar SOAR automation to Sentinel automation rules | https://learn.microsoft.com/en-us/azure/sentinel/migration-qradar-automation |
+| Convert legacy SIEM dashboards to Sentinel workbooks | https://learn.microsoft.com/en-us/azure/sentinel/migration-convert-dashboards |
+| Choose Azure target platform for Sentinel historical data | https://learn.microsoft.com/en-us/azure/sentinel/migration-ingestion-target-platform |
+| Select data ingestion tools for Sentinel migration | https://learn.microsoft.com/en-us/azure/sentinel/migration-ingestion-tool |
+| Migrate QRadar SOAR automation to Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/migration-qradar-automation |
 | Migrate QRadar detection rules to Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/migration-qradar-detection-rules |
 | Export QRadar historical data for Sentinel migration | https://learn.microsoft.com/en-us/azure/sentinel/migration-qradar-historical-data |
-| Migrate Splunk SOAR automation to Sentinel automation rules | https://learn.microsoft.com/en-us/azure/sentinel/migration-splunk-automation |
-| Migrate Splunk detection rules to Microsoft Sentinel analytics | https://learn.microsoft.com/en-us/azure/sentinel/migration-splunk-detection-rules |
+| Update SOC and analyst processes for Sentinel migration | https://learn.microsoft.com/en-us/azure/sentinel/migration-security-operations-center-processes |
+| Migrate Splunk SOAR automation to Sentinel automation | https://learn.microsoft.com/en-us/azure/sentinel/migration-splunk-automation |
+| Migrate Splunk detection rules to Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/migration-splunk-detection-rules |
 | Export Splunk historical data for Sentinel migration | https://learn.microsoft.com/en-us/azure/sentinel/migration-splunk-historical-data |
-| Choose between Sentinel standalone and XDR alert connectors | https://learn.microsoft.com/en-us/azure/sentinel/security-alert-schema-differences |
-| Select Sentinel content hub solutions by domain | https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-catalog |
-| Use Sentinel SIEM migration experience for rule mapping | https://learn.microsoft.com/en-us/azure/sentinel/siem-migration |
-| Apply SOC optimization recommendations in Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/soc-optimization/soc-optimization-access |
+| Transition Sentinel operations from Azure to Defender portal | https://learn.microsoft.com/en-us/azure/sentinel/move-to-defender |
+| Protect MSSP intellectual property in Sentinel deployments | https://learn.microsoft.com/en-us/azure/sentinel/mssp-protect-intellectual-property |
+| Prioritize Microsoft Sentinel data connectors by value | https://learn.microsoft.com/en-us/azure/sentinel/prioritize-data-connectors |
+| Migrate from SAP agent container to agentless | https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-agent-migrate |
+| Select Microsoft Sentinel solutions from content hub catalog | https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-catalog |
+| Use Sentinel SIEM migration tool for detection mapping | https://learn.microsoft.com/en-us/azure/sentinel/siem-migration |
+| Reference Sentinel SOC optimization recommendation types | https://learn.microsoft.com/en-us/azure/sentinel/soc-optimization/soc-optimization-reference |
 
 ### Architecture & Design Patterns
 
 | Topic | URL |
 |-------|-----|
-| Design Sentinel SOAR with automation rules and playbooks | https://learn.microsoft.com/en-us/azure/sentinel/automation/automation |
-| Bring custom machine learning models into Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/bring-your-own-ml |
-| Design BCDR and resiliency architecture for Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/business-continuity-disaster-recovery |
-| Query and manage Sentinel data across workspaces and tenants | https://learn.microsoft.com/en-us/azure/sentinel/extend-sentinel-across-workspaces-tenants |
-| Investigate Sentinel incidents using large dataset search | https://learn.microsoft.com/en-us/azure/sentinel/investigate-large-datasets |
-| Work with Sentinel incidents across multiple workspaces | https://learn.microsoft.com/en-us/azure/sentinel/multiple-workspace-view |
-| Use Jupyter notebooks for Sentinel threat hunting | https://learn.microsoft.com/en-us/azure/sentinel/notebooks |
-| Design Microsoft Sentinel solution components and patterns | https://learn.microsoft.com/en-us/azure/sentinel/partner-integrations |
-| Design multi-workspace architecture for Sentinel SAP | https://learn.microsoft.com/en-us/azure/sentinel/sap/cross-workspace |
-| Use workspace manager to operate multiple Sentinel workspaces | https://learn.microsoft.com/en-us/azure/sentinel/workspace-manager |
+| Design Sentinel for high availability and disaster recovery | https://learn.microsoft.com/en-us/azure/sentinel/business-continuity-disaster-recovery |
+| Extend Sentinel analytics across workspaces and tenants | https://learn.microsoft.com/en-us/azure/sentinel/extend-sentinel-across-workspaces-tenants |
+| Onboard and manage multiple Sentinel tenants as an MSSP | https://learn.microsoft.com/en-us/azure/sentinel/multiple-tenants-service-providers |
+| Design integration patterns for Microsoft Sentinel solutions | https://learn.microsoft.com/en-us/azure/sentinel/partner-integrations |
+| Design multi-workspace and multi-tenant Sentinel layouts | https://learn.microsoft.com/en-us/azure/sentinel/prepare-multiple-workspaces |
+| Choose Microsoft Sentinel workspace architecture patterns | https://learn.microsoft.com/en-us/azure/sentinel/sample-workspace-designs |
+| Architect multi-workspace Sentinel deployments for SAP | https://learn.microsoft.com/en-us/azure/sentinel/sap/cross-workspace |
+| Integrate SAP LogServ with Sentinel SAP solution | https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-logserv-overview |
+| Design multi-workspace Sentinel deployments in Defender portal | https://learn.microsoft.com/en-us/azure/sentinel/workspaces-defender-portal |
 
 ### Limits & Quotas
 
 | Topic | URL |
 |-------|-----|
-| Service limits and quotas for Microsoft Sentinel data lake | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-service-limits |
+| Service limits and quotas for Sentinel data lake | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-service-limits |
 | Sentinel MCP server pricing, limits, and availability | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-billing |
 | Select Microsoft Sentinel log retention tiers and limits | https://learn.microsoft.com/en-us/azure/sentinel/log-plans |
-| Review ASIM known issues and limitations in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-known-issues |
-| Understand removal impact of Microsoft Sentinel workspaces | https://learn.microsoft.com/en-us/azure/sentinel/offboard-implications |
-| Run Sentinel search jobs for large datasets and archives | https://learn.microsoft.com/en-us/azure/sentinel/search-jobs |
-| Review Microsoft Sentinel service limits and quotas | https://learn.microsoft.com/en-us/azure/sentinel/sentinel-service-limits |
-| Create Sentinel watchlists and manage file size limits | https://learn.microsoft.com/en-us/azure/sentinel/watchlists-create |
+| Review ASIM normalization known issues and limitations | https://learn.microsoft.com/en-us/azure/sentinel/normalization-known-issues |
+| Understand impacts and timing of removing Sentinel workspaces | https://learn.microsoft.com/en-us/azure/sentinel/offboard-implications |
+| Run Sentinel search jobs with query timeout limits | https://learn.microsoft.com/en-us/azure/sentinel/search-jobs |
+| Reference Microsoft Sentinel service limits and quotas | https://learn.microsoft.com/en-us/azure/sentinel/sentinel-service-limits |
+| Create Microsoft Sentinel watchlists with size limits | https://learn.microsoft.com/en-us/azure/sentinel/watchlists-create |
 
 ### Security
 
 | Topic | URL |
 |-------|-----|
-| Audit Microsoft Sentinel queries and user activities | https://learn.microsoft.com/en-us/azure/sentinel/audit-sentinel-data |
 | Configure authentication for Microsoft Sentinel playbooks | https://learn.microsoft.com/en-us/azure/sentinel/automation/authenticate-playbooks-to-sentinel |
-| Define access restriction policies for Sentinel Standard playbooks | https://learn.microsoft.com/en-us/azure/sentinel/automation/define-playbook-access-restrictions |
-| Enable automated attack disruption actions on AWS identities | https://learn.microsoft.com/en-us/azure/sentinel/aws-disruption |
-| Set up customer-managed keys for Microsoft Sentinel encryption | https://learn.microsoft.com/en-us/azure/sentinel/customer-managed-keys |
-| Use audit log for Sentinel data lake and graph activities | https://learn.microsoft.com/en-us/azure/sentinel/datalake/auditing-lake-activities |
-| Enable network security for Sentinel Azure Storage connector | https://learn.microsoft.com/en-us/azure/sentinel/enable-storage-network-security |
-| Configure resource-context RBAC for Microsoft Sentinel data access | https://learn.microsoft.com/en-us/azure/sentinel/resource-context-rbac |
-| Configure Microsoft Sentinel roles and permissions | https://learn.microsoft.com/en-us/azure/sentinel/roles |
-| ABAP roles and authorizations for Sentinel SAP logs | https://learn.microsoft.com/en-us/azure/sentinel/sap/required-abap-authorizations |
+| Define access restriction policies for Sentinel playbooks | https://learn.microsoft.com/en-us/azure/sentinel/automation/define-playbook-access-restrictions |
+| Configure AWS attack disruption actions from Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/aws-disruption |
+| Security content reference for Power Platform and CE | https://learn.microsoft.com/en-us/azure/sentinel/business-applications/power-platform-solution-security-content |
+| Set up customer-managed keys for Sentinel encryption | https://learn.microsoft.com/en-us/azure/sentinel/customer-managed-keys |
+| Security content reference for Dynamics 365 F&O | https://learn.microsoft.com/en-us/azure/sentinel/dynamics-365/dynamics-365-finance-operations-security-content |
+| Enable network security perimeters for Sentinel storage connectors | https://learn.microsoft.com/en-us/azure/sentinel/enable-storage-network-security |
+| Configure resource-context RBAC for Sentinel data access | https://learn.microsoft.com/en-us/azure/sentinel/resource-context-rbac |
+| Assign Microsoft Sentinel RBAC roles and permissions | https://learn.microsoft.com/en-us/azure/sentinel/roles |
+| ABAP authorizations for Sentinel SAP log ingestion | https://learn.microsoft.com/en-us/azure/sentinel/sap/required-abap-authorizations |
+| Security content reference for Sentinel SAP BTP | https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-btp-security-content |
+| Security content reference for Sentinel SAP solutions | https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-solution-security-content |
 | SAP security parameters monitored by Sentinel analytics | https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-suspicious-configuration-security-parameters |
 | Configure row-level RBAC scoping in Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/scoping |
 
 ### Configuration
 
 | Topic | URL |
 |-------|-----|
-| Add advanced OR condition groups to Sentinel automation rules | https://learn.microsoft.com/en-us/azure/sentinel/add-advanced-conditions-to-automation-rules |
-| Use Microsoft Sentinel audit tables for monitoring | https://learn.microsoft.com/en-us/azure/sentinel/audit-table-reference |
-| Configure Microsoft Sentinel automation rules and conditions | https://learn.microsoft.com/en-us/azure/sentinel/automation-rule-reference |
-| Security content reference for Power Platform and CE | https://learn.microsoft.com/en-us/azure/sentinel/business-applications/power-platform-solution-security-content |
+| Configure advanced OR conditions in Sentinel automation rules | https://learn.microsoft.com/en-us/azure/sentinel/add-advanced-conditions-to-automation-rules |
+| Reference anomalies detected by Sentinel ML engine | https://learn.microsoft.com/en-us/azure/sentinel/anomalies-reference |
+| Audit Sentinel queries and activities using log tables | https://learn.microsoft.com/en-us/azure/sentinel/audit-sentinel-data |
+| Use SentinelAudit table fields for audit analysis | https://learn.microsoft.com/en-us/azure/sentinel/audit-table-reference |
+| Configure Microsoft Sentinel automation rules properties | https://learn.microsoft.com/en-us/azure/sentinel/automation-rule-reference |
 | Map CEF keys to Sentinel CommonSecurityLog fields | https://learn.microsoft.com/en-us/azure/sentinel/cef-name-mapping |
-| Configure Syslog and CEF connectors via Azure Monitor Agent | https://learn.microsoft.com/en-us/azure/sentinel/cef-syslog-ama-overview |
-| Configure Security Events connector for anomalous RDP detection | https://learn.microsoft.com/en-us/azure/sentinel/configure-connector-login-detection |
+| Configure Syslog and CEF AMA connectors for Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/cef-syslog-ama-overview |
+| Configure Security Events connector for anomalous RDP detection in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/configure-connector-login-detection |
+| Configure Microsoft Sentinel connectors, analytics, and automation | https://learn.microsoft.com/en-us/azure/sentinel/configure-content |
+| Install and configure Microsoft Sentinel data connectors | https://learn.microsoft.com/en-us/azure/sentinel/configure-data-connector |
 | Configure interactive and long-term Sentinel data retention | https://learn.microsoft.com/en-us/azure/sentinel/configure-data-retention-archive |
-| Configure ingestion-time data transformation and custom log ingestion | https://learn.microsoft.com/en-us/azure/sentinel/configure-data-transformation |
-| Configure Fusion multistage attack detection rules | https://learn.microsoft.com/en-us/azure/sentinel/configure-fusion-rules |
-| Configure AWS service log connector for Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-aws |
-| Prepare AWS environment to send logs to Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-aws-configure-environment |
-| Configure AWS WAF S3 connector to ingest logs to Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-aws-s3-waf |
+| Configure ingestion-time data transformation and custom log ingestion in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/configure-data-transformation |
+| Configure Fusion multistage attack rules in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/configure-fusion-rules |
+| Configure AWS service log connector to ingest data into Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-aws |
+| Prepare AWS environment to send logs to Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-aws-configure-environment |
+| Configure AWS EKS S3 connector to ingest audit logs into Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-aws-eks |
+| Configure AWS WAF S3 connector to stream logs to Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-aws-s3-waf |
 | Configure Microsoft Entra ID connector to send logs to Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-active-directory |
-| Connect Azure Virtual Desktop telemetry to Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-virtual-desktop |
-| Configure Sentinel connections to Azure and Microsoft services | https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-windows-microsoft-services |
-| Configure AMA-based syslog and CEF ingestion to Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-cef-syslog-ama |
-| Configure Custom Logs via AMA to ingest text-file logs | https://learn.microsoft.com/en-us/azure/sentinel/connect-custom-logs-ama |
-| Connect Microsoft Defender for Cloud alerts to Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-defender-for-cloud |
-| Configure AMA connector for Windows DNS log streaming | https://learn.microsoft.com/en-us/azure/sentinel/connect-dns-ama |
+| Connect Azure Virtual Desktop diagnostics and logs to Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-virtual-desktop |
+| Configure Sentinel connectors for Azure and Microsoft services | https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-windows-microsoft-services |
+| Ingest Syslog and CEF data to Sentinel using AMA | https://learn.microsoft.com/en-us/azure/sentinel/connect-cef-syslog-ama |
+| Configure Custom Logs via AMA to ingest text files into Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-custom-logs-ama |
+| Configure Defender for Cloud alerts ingestion into Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-defender-for-cloud |
+| Stream and filter Windows DNS logs to Sentinel with AMA | https://learn.microsoft.com/en-us/azure/sentinel/connect-dns-ama |
 | Configure GCP Pub/Sub connectors to ingest logs into Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-google-cloud-platform |
-| Configure Microsoft Defender XDR connector in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-microsoft-365-defender |
-| Stream Microsoft Purview Information Protection data to Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-microsoft-purview |
-| Configure API-based data connectors for Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-services-api-based |
+| Configure Microsoft Defender XDR connector to stream incidents to Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-microsoft-365-defender |
+| Configure Microsoft Purview Information Protection connector for Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-microsoft-purview |
+| Configure API-based Microsoft service connectors for Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-services-api-based |
 | Configure diagnostic settings-based connectors for Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-services-diagnostic-setting-based |
 | Configure Windows agent-based data connectors for Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-services-windows-based |
-| Create scheduled analytics rules from Sentinel templates | https://learn.microsoft.com/en-us/azure/sentinel/create-analytics-rule-from-template |
-| Create custom scheduled analytics rules in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/create-analytics-rules |
+| Create and configure scheduled Sentinel rules | https://learn.microsoft.com/en-us/azure/sentinel/create-analytics-rules |
 | Configure incident creation from alerts in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/create-incidents-from-alerts |
-| Configure Sentinel automation rules for incident response | https://learn.microsoft.com/en-us/azure/sentinel/create-manage-use-automation-rules |
+| Configure Microsoft Sentinel automation rules for response | https://learn.microsoft.com/en-us/azure/sentinel/create-manage-use-automation-rules |
 | Create and manage NRT detection rules in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/create-nrt-rules |
-| Create Sentinel incident task lists via automation rules | https://learn.microsoft.com/en-us/azure/sentinel/create-tasks-automation-rule |
-| Customize Sentinel alert names, severity, and tactics | https://learn.microsoft.com/en-us/azure/sentinel/customize-alert-details |
+| Configure automation rules to create Sentinel incident tasks | https://learn.microsoft.com/en-us/azure/sentinel/create-tasks-automation-rule |
+| Customize Sentinel alert naming and severity | https://learn.microsoft.com/en-us/azure/sentinel/customize-alert-details |
 | Customize activities on Sentinel entity timelines | https://learn.microsoft.com/en-us/azure/sentinel/customize-entity-activities |
-| Configure CCF JSON for Azure Storage Blob connector | https://learn.microsoft.com/en-us/azure/sentinel/data-connection-rules-reference-azure-storage |
-| Configure RestApiPoller connector JSON for Sentinel CCF | https://learn.microsoft.com/en-us/azure/sentinel/data-connector-connection-rules-reference |
-| Reference Sentinel-supported data source schemas | https://learn.microsoft.com/en-us/azure/sentinel/data-source-schema-reference |
-| Use asset data tables in Microsoft Sentinel data lake | https://learn.microsoft.com/en-us/azure/sentinel/datalake/asset-data-tables |
+| Configure Azure Storage Blob connector JSON in CCF | https://learn.microsoft.com/en-us/azure/sentinel/data-connection-rules-reference-azure-storage |
+| Configure GCP data connector JSON for Sentinel CCF | https://learn.microsoft.com/en-us/azure/sentinel/data-connection-rules-reference-gcp |
+| Configure RestApiPoller data connector JSON rules | https://learn.microsoft.com/en-us/azure/sentinel/data-connector-connection-rules-reference |
+| Define Codeless Connector Framework UIConfig JSON | https://learn.microsoft.com/en-us/azure/sentinel/data-connector-ui-definitions-reference |
+| Configure custom data ingestion and transformation in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/data-transformation |
+| Reference asset data table mappings in Sentinel data lake | https://learn.microsoft.com/en-us/azure/sentinel/datalake/asset-data-tables |
+| Use audit logs for Sentinel data lake and graph activities | https://learn.microsoft.com/en-us/azure/sentinel/datalake/auditing-lake-activities |
 | Configure federated data connectors in Sentinel data lake | https://learn.microsoft.com/en-us/azure/sentinel/datalake/data-federation-setup |
-| Configure and schedule KQL jobs in Sentinel data lake | https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-jobs |
-| Configure and schedule KQL jobs in Sentinel data lake | https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-jobs |
-| Manage Microsoft Sentinel data lake KQL jobs | https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-manage-jobs |
+| Create and schedule KQL jobs in Sentinel data lake | https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-jobs |
+| Manage Sentinel data lake KQL jobs in portal | https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-manage-jobs |
 | Run and manage KQL queries in Sentinel data lake | https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-queries |
-| Create and schedule Sentinel Spark notebook jobs | https://learn.microsoft.com/en-us/azure/sentinel/datalake/notebook-jobs |
-| Configure connectors and retention for Sentinel data lake tiers | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-connectors |
-| Onboard Sentinel data lake from Defender portal | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-onboard-defender |
-| Onboard tenants to Microsoft Sentinel data lake and graph | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-onboarding |
-| Configure data exploration tools in Sentinel MCP server | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-data-exploration-tool |
-| Configure and use Microsoft Sentinel MCP server tools | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-get-started |
-| Use Sentinel MCP tools with Microsoft Foundry AI agents | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-azure-ai-foundry |
-| Configure Sentinel MCP tools in Microsoft Copilot Studio | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-copilot-studio |
-| Add Sentinel MCP tools to Microsoft Security Copilot | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-security-copilot |
-| Build Sentinel workbooks using data lake as source | https://learn.microsoft.com/en-us/azure/sentinel/datalake/workbooks-for-data-lake |
-| Configure DNS over AMA connector fields and schema in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/dns-ama-fields |
-| Security content reference for Dynamics 365 F&O | https://learn.microsoft.com/en-us/azure/sentinel/dynamics-365/dynamics-365-finance-operations-security-content |
+| Schedule and manage Sentinel notebook jobs in VS Code | https://learn.microsoft.com/en-us/azure/sentinel/datalake/notebook-jobs |
+| Onboard Sentinel data lake from Microsoft Defender portal | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-onboard-defender |
+| Onboard to Microsoft Sentinel data lake and graph | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-lake-onboarding |
+| Enable Sentinel MCP connector in ChatGPT or Claude | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-chatgpt-claude-connector |
+| Create and configure custom Sentinel MCP tools | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-create-custom-tool |
+| Configure Microsoft Sentinel MCP server tools | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-get-started |
+| Configure Sentinel MCP tools in Azure AI Foundry | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-azure-ai-foundry |
+| Configure Sentinel MCP tools in Copilot Studio | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-copilot-studio |
+| Add Sentinel MCP tools to Security Copilot | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-security-copilot |
+| Use Sentinel MCP tools in Visual Studio Code | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-use-tool-visual-studio-code |
+| Configure Sentinel workbooks for data lake queries | https://learn.microsoft.com/en-us/azure/sentinel/datalake/workbooks-for-data-lake |
+| Configure DNS AMA connector fields and normalization schema | https://learn.microsoft.com/en-us/azure/sentinel/dns-ama-fields |
+| Use ASIM-based essential domain solutions in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/domain-based-essential-solutions |
 | Enable and configure Sentinel UEBA data sources | https://learn.microsoft.com/en-us/azure/sentinel/enable-entity-behavior-analytics |
 | Enable Sentinel auditing and health monitoring and query logs | https://learn.microsoft.com/en-us/azure/sentinel/enable-monitoring |
-| Use Sentinel entity types and identifiers correctly | https://learn.microsoft.com/en-us/azure/sentinel/entities-reference |
-| Configure auditing and health monitoring in Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/health-audit |
-| Query and interpret Microsoft Sentinel health tables | https://learn.microsoft.com/en-us/azure/sentinel/health-table-reference |
-| Bulk import threat indicators from files into Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/indicators-bulk-file-import |
+| Understand and configure entities in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/entities |
+| Reference Sentinel entity types and identifiers | https://learn.microsoft.com/en-us/azure/sentinel/entities-reference |
+| Reference Fusion-detected multistage attack scenarios | https://learn.microsoft.com/en-us/azure/sentinel/fusion-scenario-reference |
+| Understand Sentinel auditing and health monitoring capabilities | https://learn.microsoft.com/en-us/azure/sentinel/health-audit |
+| Use SentinelHealth table fields for SIEM monitoring | https://learn.microsoft.com/en-us/azure/sentinel/health-table-reference |
 | Manage Sentinel analytics rule template versions | https://learn.microsoft.com/en-us/azure/sentinel/manage-analytics-rule-templates |
-| Configure and manage installed Microsoft Sentinel platform solutions | https://learn.microsoft.com/en-us/azure/sentinel/manage-platform-solutions |
-| Configure table retention and tier settings for Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/manage-table-tiers-retention |
-| Map analytics rule fields to Sentinel entities | https://learn.microsoft.com/en-us/azure/sentinel/map-data-fields-to-entities |
+| Configure and manage installed Sentinel platform solutions | https://learn.microsoft.com/en-us/azure/sentinel/manage-platform-solutions |
+| Use Sentinel incident metrics to manage SOC performance | https://learn.microsoft.com/en-us/azure/sentinel/manage-soc-with-incident-metrics |
+| Configure Sentinel table retention and tier settings | https://learn.microsoft.com/en-us/azure/sentinel/manage-table-tiers-retention |
+| Map data fields to Sentinel entities | https://learn.microsoft.com/en-us/azure/sentinel/map-data-fields-to-entities |
 | Use Purview Information Protection connector record types in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/microsoft-purview-record-types-activities |
+| Use Microsoft Sentinel within the Defender portal | https://learn.microsoft.com/en-us/azure/sentinel/microsoft-sentinel-defender-portal |
+| Monitor health and integrity of Sentinel analytics rules | https://learn.microsoft.com/en-us/azure/sentinel/monitor-analytics-rule-integrity |
 | Monitor Sentinel automation rules and playbook health | https://learn.microsoft.com/en-us/azure/sentinel/monitor-automation-health |
-| Monitor Microsoft Sentinel data connector health and ingestion | https://learn.microsoft.com/en-us/azure/sentinel/monitor-data-connector-health |
-| Monitor SAP–Sentinel connection health and alerts | https://learn.microsoft.com/en-us/azure/sentinel/monitor-sap-system-health |
-| Configure multi-tenant management for Microsoft Sentinel MSSPs | https://learn.microsoft.com/en-us/azure/sentinel/multiple-tenants-service-providers |
+| Monitor Sentinel data connector health with SentinelHealth | https://learn.microsoft.com/en-us/azure/sentinel/monitor-data-connector-health |
+| Monitor and rerun Sentinel scheduled analytics rules | https://learn.microsoft.com/en-us/azure/sentinel/monitor-optimize-analytics-rule-execution |
+| Monitor health of Sentinel–SAP connectivity | https://learn.microsoft.com/en-us/azure/sentinel/monitor-sap-system-health |
+| View and manage Sentinel incidents across multiple workspaces | https://learn.microsoft.com/en-us/azure/sentinel/multiple-workspace-view |
 | Configure near-real-time analytics rules in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/near-real-time-rules |
 | Manage workspace-deployed ASIM parsers in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-about-workspace-parsers |
-| Apply ASIM common schema fields in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-common-fields |
-| Develop and deploy custom ASIM parsers for Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-develop-parsers |
+| Reference common ASIM schema fields in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-common-fields |
+| Use ASIM-based normalized security content in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-content |
 | Implement ASIM Application Entity schema in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-entity-application |
-| Implement ASIM Device Entity schema in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-entity-device |
-| Implement ASIM User Entity schema in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-entity-user |
-| Manage and customize ASIM parsers in Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-manage-parsers |
-| Convert Sentinel content to use ASIM normalized data | https://learn.microsoft.com/en-us/azure/sentinel/normalization-modify-content |
+| Implement ASIM Device Entity schema in Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-entity-device |
+| Implement ASIM User Entity schema in Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-entity-user |
+| Manage and customize ASIM parsers in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-manage-parsers |
+| Reference list of ASIM parsers for Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-parsers-list |
 | Use ASIM Alert Events normalization schema in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-alert |
+| Implement ASIM Asset Entity normalization schema in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-asset |
 | Use ASIM Audit Events normalization schema in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-audit |
 | Use ASIM Authentication normalization schema in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-authentication |
-| Use ASIM DHCP normalization schema in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-dhcp |
-| Use ASIM DNS normalization schema in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-dns |
+| Use ASIM DHCP normalization schema in Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-dhcp |
+| Use ASIM DNS normalization schema in Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-dns |
 | Use ASIM File Event normalization schema in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-file-event |
 | Use ASIM Network Session normalization schema in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-network |
-| Use ASIM Process Event normalization schema in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-process-event |
-| Use ASIM Registry Event normalization schema in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-registry-event |
-| Use Sentinel user management normalization schema | https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-user-management |
+| Use ASIM Process Event schema in Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-process-event |
+| Use ASIM Registry Event schema in Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-registry-event |
+| Apply user management normalization schema in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-user-management |
 | Use legacy Sentinel network normalization schema v0.1 | https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-v1 |
-| Use ASIM Web Session normalization schema in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-web |
-| Configure Sentinel notebooks and MSTICPy basics | https://learn.microsoft.com/en-us/azure/sentinel/notebook-get-started |
-| Apply advanced MSTICPy and notebook settings in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/notebooks-msticpy-advanced |
-| Remove Microsoft Sentinel from a Log 
Analytics workspace | https://learn.microsoft.com/en-us/azure/sentinel/offboard | -| Integrate Microsoft Purview solution with Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/purview-solution | -| Restore archived Sentinel logs for high-performance queries | https://learn.microsoft.com/en-us/azure/sentinel/restore | +| Implement ASIM Web Session normalization schema | https://learn.microsoft.com/en-us/azure/sentinel/normalization-schema-web | +| Configure advanced MSTICPy and notebook settings for Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/notebooks-msticpy-advanced | | Configure SAP HANA audit log collection in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/sap/collect-sap-hana-audit-logs | | Prepare SAP systems for Sentinel SAP connector | https://learn.microsoft.com/en-us/azure/sentinel/sap/preparing-sap | -| Review prerequisites for Sentinel SAP solution deployment | https://learn.microsoft.com/en-us/azure/sentinel/sap/prerequisites-for-deploying-sap-continuous-threat-monitoring | | Kickstart script parameters for SAP connector deployment | https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-kickstart | -| Legacy systemconfig.ini settings for Sentinel SAP agent | https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-systemconfig | -| systemconfig.json settings for Sentinel SAP agent | https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-systemconfig-json | -| Update script parameters for Sentinel SAP connector | https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-update | -| Use SAP Security Audit Controls workbook in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-audit-controls-workbook | -| Use SAP Security Audit log workbook in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-audit-log-workbook | -| Security content reference for Sentinel SAP BTP solution | 
https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-btp-security-content | -| Function reference for Sentinel SAP solution workspace | https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-solution-function-reference | -| Log and table schema reference for Sentinel SAP solution | https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-solution-log-reference | -| Reference for Sentinel SAP security content and rules | https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-solution-security-content | -| Stop SAP log collection and disable Sentinel connector | https://learn.microsoft.com/en-us/azure/sentinel/sap/stop-collection | +| Legacy systemconfig.ini reference for Sentinel SAP agent | https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-systemconfig | +| systemconfig.json reference for Sentinel SAP agent | https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-systemconfig-json | +| Update script parameter reference for SAP connector | https://learn.microsoft.com/en-us/azure/sentinel/sap/reference-update | +| Stop SAP data collection in Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/sap/stop-collection | | Configure scheduled analytics rules in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/scheduled-rules-overview | -| Use Microsoft Sentinel security alert schema | https://learn.microsoft.com/en-us/azure/sentinel/security-alert-schema | -| Map Sentinel tables to their data connectors | https://learn.microsoft.com/en-us/azure/sentinel/sentinel-tables-connectors-reference | -| Use customizable anomaly detection in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/soc-ml-anomalies | -| Prepare prerequisites for Microsoft Sentinel SIEM solutions | https://learn.microsoft.com/en-us/azure/sentinel/solution-setup-essentials | -| Configure and use summary rules to aggregate Sentinel data | https://learn.microsoft.com/en-us/azure/sentinel/summary-rules | +| Use Microsoft Sentinel security alert 
schema fields | https://learn.microsoft.com/en-us/azure/sentinel/security-alert-schema | +| Compare Sentinel alert schemas for XDR and standalone connectors | https://learn.microsoft.com/en-us/azure/sentinel/security-alert-schema-differences | +| Deploy and use Sentinel Zero Trust (TIC 3.0) solution | https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solution | +| Manage deprecated Microsoft Sentinel solutions lifecycle | https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solution-deprecation | +| Remove and restore Microsoft Sentinel solutions and content | https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-delete | +| Set up Azure Storage Blob connector for Sentinel logs | https://learn.microsoft.com/en-us/azure/sentinel/setup-azure-storage-connector | +| Configure and use Sentinel summary rules for aggregation | https://learn.microsoft.com/en-us/azure/sentinel/summary-rules | | Surface custom event details in Sentinel alerts | https://learn.microsoft.com/en-us/azure/sentinel/surface-custom-details-in-alerts | -| Configure threat intelligence integrations in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/threat-intelligence-integration | -| Configure filter and split transformations in Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/transformation-filter-split | +| Configure filter and split transformations for Sentinel ingestion | https://learn.microsoft.com/en-us/azure/sentinel/transformation-filter-split | | Reference for Sentinel UEBA entity enrichments | https://learn.microsoft.com/en-us/azure/sentinel/ueba-reference | -| Configure unified connectors to integrate with Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/unified-connector-integration | -| Apply built-in Sentinel watchlist template schemas | https://learn.microsoft.com/en-us/azure/sentinel/watchlist-schemas | -| Select Windows security event sets for Microsoft Sentinel | 
https://learn.microsoft.com/en-us/azure/sentinel/windows-security-event-id-reference | -| Create and tune anomaly analytics rules in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/work-with-anomaly-rules | -| Configure multiple Microsoft Sentinel workspaces in Defender portal | https://learn.microsoft.com/en-us/azure/sentinel/workspaces-defender-portal | +| Configure specific devices for CEF via AMA Sentinel connector | https://learn.microsoft.com/en-us/azure/sentinel/unified-connector-cef-device | +| Configure Sentinel custom log ingestion for specific applications | https://learn.microsoft.com/en-us/azure/sentinel/unified-connector-custom-device | +| Configure unified connectors to integrate with Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/unified-connector-integration | +| Configure appliances and devices for Syslog via AMA | https://learn.microsoft.com/en-us/azure/sentinel/unified-connector-syslog-device | +| Use schemas for built-in Sentinel watchlist templates | https://learn.microsoft.com/en-us/azure/sentinel/watchlist-schemas | +| Select Windows security event sets for Sentinel ingestion | https://learn.microsoft.com/en-us/azure/sentinel/windows-security-event-id-reference | +| Configure anomaly detection analytics rules in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/work-with-anomaly-rules | +| Configure and use Sentinel workspace manager for multi-workspace operations | https://learn.microsoft.com/en-us/azure/sentinel/workspace-manager | ### Integrations & Coding Patterns | Topic | URL | |-------|-----| -| Create Sentinel Data Collection Rules via API examples | https://learn.microsoft.com/en-us/azure/sentinel/api-dcr-reference | -| Use Sentinel Logic Apps triggers and actions in playbooks | https://learn.microsoft.com/en-us/azure/sentinel/automation/playbook-triggers-actions | -| Integrate Sentinel incidents with Microsoft Teams collaboration | 
https://learn.microsoft.com/en-us/azure/sentinel/collaborate-in-microsoft-teams | -| Ingest AWS EKS audit logs into Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-aws-eks | -| Build Azure Functions-based connectors to ingest data into Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-functions-template | -| Use Logstash with DCR-based API to stream logs to Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-logstash-data-connection-rules | -| Enable Defender Threat Intelligence data connector in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-mdti-data-connector | -| Connect STIX/TAXII threat intel feeds to Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-threat-intelligence-taxii | -| Connect threat intelligence platform to Sentinel (legacy connector) | https://learn.microsoft.com/en-us/azure/sentinel/connect-threat-intelligence-tip | -| Connect TIP to Sentinel using Threat Intel upload API | https://learn.microsoft.com/en-us/azure/sentinel/connect-threat-intelligence-upload-api | -| Create codeless connectors for Microsoft Sentinel with CCF | https://learn.microsoft.com/en-us/azure/sentinel/create-codeless-connector | -| Build push-based codeless connectors for Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/create-push-codeless-connector | -| Configure GCP data connectors with Sentinel CCF | https://learn.microsoft.com/en-us/azure/sentinel/data-connection-rules-reference-gcp | -| Define connector UIConfig JSON for Sentinel CCF | https://learn.microsoft.com/en-us/azure/sentinel/data-connector-ui-definitions-reference | -| Build and manage custom security graphs with Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/datalake/create-custom-graphs | -| Use GQL syntax to query Sentinel custom graphs | https://learn.microsoft.com/en-us/azure/sentinel/datalake/gql-reference-for-sentinel-custom-graph | +| Use Sentinel API examples 
to create Data Collection Rules | https://learn.microsoft.com/en-us/azure/sentinel/api-dcr-reference | +| Use Sentinel playbook triggers and actions via Logic Apps | https://learn.microsoft.com/en-us/azure/sentinel/automation/playbook-triggers-actions | +| Integrate Microsoft Sentinel incidents with Microsoft Teams | https://learn.microsoft.com/en-us/azure/sentinel/collaborate-in-microsoft-teams | +| Integrate external data sources via Azure Functions with Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-functions-template | +| Integrate Logstash with Sentinel using DCR-based output plugin | https://learn.microsoft.com/en-us/azure/sentinel/connect-logstash-data-connection-rules | +| Enable Defender Threat Intelligence data connector | https://learn.microsoft.com/en-us/azure/sentinel/connect-mdti-data-connector | +| Connect TAXII threat intel feeds to Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-threat-intelligence-taxii | +| Connect threat intelligence platform to Sentinel (legacy) | https://learn.microsoft.com/en-us/azure/sentinel/connect-threat-intelligence-tip | +| Connect TIP to Sentinel using Upload API | https://learn.microsoft.com/en-us/azure/sentinel/connect-threat-intelligence-upload-api | +| Create codeless data connectors using Sentinel CCF | https://learn.microsoft.com/en-us/azure/sentinel/create-codeless-connector | +| Build custom Sentinel connectors with AI agent in VS Code | https://learn.microsoft.com/en-us/azure/sentinel/create-custom-connector-builder-agent | +| Configure push-based codeless connectors for Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/create-push-codeless-connector | +| Query Sentinel graphs with GQL syntax reference | https://learn.microsoft.com/en-us/azure/sentinel/datalake/gql-reference-for-sentinel-custom-graph | | Call Sentinel custom graph REST APIs programmatically | https://learn.microsoft.com/en-us/azure/sentinel/datalake/graph-rest-api | -| Run Sentinel data 
lake KQL queries via REST APIs | https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-queries-api | -| Notebook code examples for querying Sentinel data lake | https://learn.microsoft.com/en-us/azure/sentinel/datalake/notebook-examples | -| Use Jupyter notebooks with Sentinel data lake in VS Code | https://learn.microsoft.com/en-us/azure/sentinel/datalake/notebooks | -| Use Sentinel graph provider API in Spark notebooks | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-graph-provider-reference | -| Leverage Sentinel MCP agent creation tool collection | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-agent-creation-tool | -| Enable and use Microsoft Sentinel MCP connector with ChatGPT or Claude | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-chatgpt-claude-connector | -| Create custom Sentinel MCP tools from KQL queries | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-create-custom-tool | -| Integrate Sentinel MCP tools into Azure Logic Apps | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-logic-apps | -| Use Sentinel MCP triage tools for incident hunting | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-triage-tool | -| Use SentinelProvider class to access Sentinel data lake | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-provider-class-reference | -| Enrich Sentinel entities with geolocation REST API | https://learn.microsoft.com/en-us/azure/sentinel/geolocation-data-api | -| Manage Microsoft Sentinel hunting queries via REST API | https://learn.microsoft.com/en-us/azure/sentinel/hunting-with-rest-api | -| Author custom hunting KQL queries in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/hunts-custom-queries | +| Call Sentinel data lake KQL APIs via REST | https://learn.microsoft.com/en-us/azure/sentinel/datalake/kql-queries-api | +| Sample notebook code for querying Sentinel data lake 
| https://learn.microsoft.com/en-us/azure/sentinel/datalake/notebook-examples | +| Use sentinel_graph API to build security graphs | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-graph-provider-reference | +| Use Sentinel MCP agent creation tool collection | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-agent-creation-tool | +| Use Sentinel MCP data exploration tools for queries | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-data-exploration-tool | +| Integrate Sentinel MCP tools with Azure Logic Apps | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-logic-apps | +| Use Sentinel MCP triage tools for incidents | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-mcp-triage-tool | +| Use MicrosoftSentinelProvider class with Spark notebooks | https://learn.microsoft.com/en-us/azure/sentinel/datalake/sentinel-provider-class-reference | +| Enrich Sentinel entities with geolocation via REST API | https://learn.microsoft.com/en-us/azure/sentinel/geolocation-data-api | +| Manage Sentinel hunting queries via Log Analytics REST API | https://learn.microsoft.com/en-us/azure/sentinel/hunting-with-rest-api | +| Bulk import threat intel files into Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/indicators-bulk-file-import | | Ingest Defender for Cloud incidents via Defender XDR | https://learn.microsoft.com/en-us/azure/sentinel/ingest-defender-for-cloud-incidents | -| Integrate Microsoft Defender XDR with Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/microsoft-365-defender-sentinel-integration | -| Use ASIM helper functions for normalized data in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/normalization-functions | -| Build Power BI reports from Sentinel log data | https://learn.microsoft.com/en-us/azure/sentinel/powerbi | -| Trigger Sentinel playbooks from entities during hunts | 
https://learn.microsoft.com/en-us/azure/sentinel/respond-threats-during-investigation | -| Create analytics rules for Microsoft Sentinel solutions | https://learn.microsoft.com/en-us/azure/sentinel/sentinel-analytic-rules-creation | -| Create hunting queries for Microsoft Sentinel solutions | https://learn.microsoft.com/en-us/azure/sentinel/sentinel-hunting-rules-creation | -| Build and publish Microsoft Sentinel SIEM solutions | https://learn.microsoft.com/en-us/azure/sentinel/sentinel-integration-guide | -| Create and publish playbooks for Microsoft Sentinel solutions | https://learn.microsoft.com/en-us/azure/sentinel/sentinel-playbook-creation | -| Create summary rules and tables for Sentinel solutions | https://learn.microsoft.com/en-us/azure/sentinel/sentinel-summary-rules-creation | -| Create and publish workbooks for Microsoft Sentinel solutions | https://learn.microsoft.com/en-us/azure/sentinel/sentinel-workbook-creation | -| Configure Azure Storage Blob connector for Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/setup-azure-storage-connector | -| Call Microsoft Sentinel SOC optimization recommendations API | https://learn.microsoft.com/en-us/azure/sentinel/soc-optimization/soc-optimization-api | -| Import threat intelligence using Sentinel STIX upload API | https://learn.microsoft.com/en-us/azure/sentinel/stix-objects-api | -| Enrich Sentinel incidents with IP reputation automation | https://learn.microsoft.com/en-us/azure/sentinel/tutorial-enrich-ip-information | -| Extract non-native Sentinel entities using playbook actions | https://learn.microsoft.com/en-us/azure/sentinel/tutorial-extract-incident-entities | -| Use legacy Sentinel upload indicators API | https://learn.microsoft.com/en-us/azure/sentinel/upload-indicators-api | -| Use Sentinel watchlists in KQL queries and rules | https://learn.microsoft.com/en-us/azure/sentinel/watchlists-queries | -| Query STIX indicator and object tables in Sentinel | 
https://learn.microsoft.com/en-us/azure/sentinel/work-with-stix-objects-indicators | +| Integrate Microsoft Defender XDR with Sentinel incidents | https://learn.microsoft.com/en-us/azure/sentinel/microsoft-365-defender-sentinel-integration | +| Use ASIM KQL parsers in Sentinel queries | https://learn.microsoft.com/en-us/azure/sentinel/normalization-about-parsers | +| Develop and deploy custom ASIM parsers | https://learn.microsoft.com/en-us/azure/sentinel/normalization-develop-parsers | +| Use ASIM helper functions for normalized Sentinel data | https://learn.microsoft.com/en-us/azure/sentinel/normalization-functions | +| Build Power BI reports from Sentinel data | https://learn.microsoft.com/en-us/azure/sentinel/powerbi | +| Integrate Microsoft Purview insights and logs with Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/purview-solution | +| Function reference for Sentinel SAP solution | https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-solution-function-reference | +| Log and table reference for Sentinel SAP solution | https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-solution-log-reference | +| Use Sentinel SOC optimization recommendations API programmatically | https://learn.microsoft.com/en-us/azure/sentinel/soc-optimization/soc-optimization-api | +| Import threat intelligence STIX objects via Sentinel upload API | https://learn.microsoft.com/en-us/azure/sentinel/stix-objects-api | +| Integrate threat intelligence feeds with Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/threat-intelligence-integration | +| Use legacy Sentinel upload indicators API for STIX IOCs | https://learn.microsoft.com/en-us/azure/sentinel/upload-indicators-api | +| Query STIX objects and indicators in Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/work-with-stix-objects-indicators | ### Deployment | Topic | URL | |-------|-----| -| Deploy Sentinel solution for Power Platform and CE | 
https://learn.microsoft.com/en-us/azure/sentinel/business-applications/deploy-power-platform-solution | -| Create repository connections to deploy Sentinel content | https://learn.microsoft.com/en-us/azure/sentinel/ci-cd | -| Use repositories and CI/CD for Microsoft Sentinel content | https://learn.microsoft.com/en-us/azure/sentinel/ci-cd-custom-content | -| Customize CI/CD repository deployments for Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/ci-cd-custom-deploy | -| Onboard Azure Stack Hub VMs to Microsoft Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-stack | -| Deploy Sentinel solution for Dynamics 365 Finance and Operations | https://learn.microsoft.com/en-us/azure/sentinel/dynamics-365/deploy-dynamics-365-finance-operations-solution | +| Connect Power Platform and Dynamics CE to Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/business-applications/deploy-power-platform-solution | +| Create repository connections to deploy Sentinel custom content | https://learn.microsoft.com/en-us/azure/sentinel/ci-cd | +| Manage Sentinel custom content with CI/CD repositories | https://learn.microsoft.com/en-us/azure/sentinel/ci-cd-custom-content | +| Customize CI/CD repository deployments for Sentinel content | https://learn.microsoft.com/en-us/azure/sentinel/ci-cd-custom-deploy | +| Onboard Azure Stack Hub VMs to Microsoft Sentinel monitoring | https://learn.microsoft.com/en-us/azure/sentinel/connect-azure-stack | +| Plan and execute a Microsoft Sentinel deployment | https://learn.microsoft.com/en-us/azure/sentinel/deploy-overview | +| Connect Dynamics 365 Finance and Operations to Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/dynamics-365/deploy-dynamics-365-finance-operations-solution | +| Enable Microsoft Sentinel SIEM and core features | https://learn.microsoft.com/en-us/azure/sentinel/enable-sentinel-features-content | | Import and export Sentinel analytics rules via ARM | 
https://learn.microsoft.com/en-us/azure/sentinel/import-export-analytics-rules | -| Manage Sentinel automation rules as code with ARM templates | https://learn.microsoft.com/en-us/azure/sentinel/import-export-automation-rules | -| Check Sentinel Defender XDR data support by cloud | https://learn.microsoft.com/en-us/azure/sentinel/microsoft-365-defender-cloud-support | -| Run Sentinel hunting notebooks in Azure ML workspaces | https://learn.microsoft.com/en-us/azure/sentinel/notebooks-hunt | -| Package and publish Microsoft Sentinel platform solutions | https://learn.microsoft.com/en-us/azure/sentinel/package-platform-solution | -| Publish Microsoft Sentinel SIEM solutions to marketplace | https://learn.microsoft.com/en-us/azure/sentinel/publish-sentinel-solutions | -| Deploy SAP connector container via command line | https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-command-line | -| Deploy SAP data connector container to Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-data-connector-agent-container | -| Deploy Sentinel solution for SAP BTP systems | https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-sap-btp-solution | +| Export and import Sentinel automation rules as ARM templates | https://learn.microsoft.com/en-us/azure/sentinel/import-export-automation-rules | +| Package and deploy Microsoft Sentinel platform solutions | https://learn.microsoft.com/en-us/azure/sentinel/package-platform-solution | +| Verify prerequisites for Microsoft Sentinel deployment | https://learn.microsoft.com/en-us/azure/sentinel/prerequisites | +| Deploy SAP data connector agent via CLI | https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-command-line | +| Deploy containerized SAP data connector to Sentinel | https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-data-connector-agent-container | +| Deploy Sentinel solution for SAP BTP | https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-sap-btp-solution | | Install 
Microsoft Sentinel solution for SAP applications | https://learn.microsoft.com/en-us/azure/sentinel/sap/deploy-sap-security-content | -| Migrate Sentinel SAP container agent to agentless connector | https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-agent-migrate | +| Understand deployment process for Sentinel SAP solution | https://learn.microsoft.com/en-us/azure/sentinel/sap/deployment-overview | +| Verify prerequisites for deploying Sentinel SAP monitoring | https://learn.microsoft.com/en-us/azure/sentinel/sap/prerequisites-for-deploying-sap-continuous-threat-monitoring | | Expert deployment options for Sentinel SAP connector | https://learn.microsoft.com/en-us/azure/sentinel/sap/sap-solution-deploy-alternate | | Update Sentinel SAP data connector agent safely | https://learn.microsoft.com/en-us/azure/sentinel/sap/update-sap-data-connector | -| Discover and deploy Sentinel content hub solutions | https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-deploy | -| Track Microsoft Sentinel solution status after publishing | https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-post-publish-tracking | \ No newline at end of file +| Discover and deploy Sentinel solutions from Content hub | https://learn.microsoft.com/en-us/azure/sentinel/sentinel-solutions-deploy | +| Deploy Microsoft Sentinel across multiple workspaces and tenants | https://learn.microsoft.com/en-us/azure/sentinel/use-multiple-workspaces | \ No newline at end of file diff --git a/skills/azure-service-bus/SKILL.md b/skills/azure-service-bus/SKILL.md index 490c15ad..0f3bda6a 100644 --- a/skills/azure-service-bus/SKILL.md +++ b/skills/azure-service-bus/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-service-bus -description: Expert knowledge for Azure Service Bus development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. 
Use when using queues/topics, sessions, filters, geo-replication/DR, or Premium scaling and throughput limits, and other Azure Service Bus related development tasks. Not for Azure Event Hubs (use azure-event-hubs), Azure Event Grid (use azure-event-grid), Azure Relay (use azure-relay), Azure Queue Storage (use azure-queue-storage). +description: Expert knowledge for Azure Service Bus development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when working with queues/topics, sessions, filters, dead-lettering, geo-recovery, or Premium scaling, and for other Azure Service Bus-related development tasks. Not for Azure Event Hubs (use azure-event-hubs), Azure Relay (use azure-relay), Azure Queue Storage (use azure-queue-storage), Azure Web PubSub (use azure-web-pubsub). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-12" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Service Bus Skill @@ -29,8 +29,8 @@ This skill requires **network access** to fetch documentation content: | Decision Making | L62-L72 | Guidance on choosing Service Bus vs other messaging services/tiers, configuring autoforwarding, geo-disaster recovery/replication, and migrating from Standard to Premium. | | Architecture & Design Patterns | L73-L81 | Patterns for designing resilient, federated, multi-namespace Service Bus systems, including partitioning, replication, and using NServiceBus for message-driven architectures. | | Limits & Quotas | L82-L87 | Service Bus message, entity, and namespace quotas (size, connections, throughput) and how throttling works, including limits, behaviors under load, and mitigation strategies. 
| -| Security | L88-L110 | Securing Service Bus: identity-based auth, SAS, keys and encryption, TLS, network isolation (VNet, Private Link, firewall), Azure Policy, and compliance configuration. | -| Configuration | L111-L134 | Configuring Service Bus entities, filters, sessions, partitioning, monitoring, and management via portal, PowerShell, ARM, and local emulator, plus message browsing, counts, and replication. | +| Security | L88-L110 | Securing Service Bus: Entra ID/managed identity auth, SAS and auth rules, CMK encryption, TLS settings, network isolation (VNet, Private Link, firewalls), and Azure Policy/compliance. | +| Configuration | L111-L134 | Configuring and managing Service Bus entities: forwarding, dead-lettering, sessions, partitioning, monitoring/metrics, SQL filters/actions, PowerShell/ARM management, and local emulation. | | Integrations & Coding Patterns | L135-L150 | Patterns and code for integrating Service Bus with JMS, AMQP, RabbitMQ, Event Grid/Logic Apps/Functions, subscription filters, and batch message operations/migration scenarios | | Deployment | L151-L160 | Deploying and scaling Service Bus: autoscaling Premium messaging units and creating/moving namespaces, queues, topics, subscriptions, and rules using ARM templates or Bicep. 
| @@ -92,17 +92,17 @@ This skill requires **network access** to fetch documentation content: | Enable confidential computing for Service Bus Premium | https://learn.microsoft.com/en-us/azure/service-bus-messaging/confidential-computing | | Configure customer-managed keys for Service Bus encryption | https://learn.microsoft.com/en-us/azure/service-bus-messaging/configure-customer-managed-key | | Disable SAS local authentication for Azure Service Bus | https://learn.microsoft.com/en-us/azure/service-bus-messaging/disable-local-authentication | -| Configure network security for Azure Service Bus | https://learn.microsoft.com/en-us/azure/service-bus-messaging/network-security | -| Configure network security perimeter for Azure Service Bus | https://learn.microsoft.com/en-us/azure/service-bus-messaging/network-security-perimeter | -| Apply Azure Policy definitions to Service Bus | https://learn.microsoft.com/en-us/azure/service-bus-messaging/policy-reference | +| Configure network security for Azure Service Bus namespaces | https://learn.microsoft.com/en-us/azure/service-bus-messaging/network-security | +| Associate Azure Service Bus with a network security perimeter | https://learn.microsoft.com/en-us/azure/service-bus-messaging/network-security-perimeter | +| Use built-in Azure Policy definitions for Service Bus | https://learn.microsoft.com/en-us/azure/service-bus-messaging/policy-reference | | Integrate Service Bus with Azure Private Link | https://learn.microsoft.com/en-us/azure/service-bus-messaging/private-link-service | | Apply regulatory compliance policies to Service Bus | https://learn.microsoft.com/en-us/azure/service-bus-messaging/security-controls-policy | -| Configure authentication and authorization for Service Bus | https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-authentication-and-authorization | +| Configure authentication and authorization for Azure Service Bus | 
https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-authentication-and-authorization | | Configure IP firewall rules for Azure Service Bus | https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-ip-filtering | -| Authenticate to Azure Service Bus with managed identities | https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-managed-service-identity | +| Authenticate to Azure Service Bus using managed identities | https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-managed-service-identity | | Migrate Service Bus apps to passwordless Entra ID auth | https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-migrate-azure-credentials | | Create Service Bus authorization rules with ARM templates | https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-resource-manager-namespace-auth-rule | -| Secure Service Bus with Shared Access Signatures | https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-sas | +| Use shared access signatures with Azure Service Bus | https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-sas | | Configure Service Bus virtual network service endpoints | https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-service-endpoints | | Audit Service Bus TLS minimum version compliance with Azure Policy | https://learn.microsoft.com/en-us/azure/service-bus-messaging/transport-layer-security-audit-minimum-version | | Configure minimum TLS version for a Service Bus namespace | https://learn.microsoft.com/en-us/azure/service-bus-messaging/transport-layer-security-configure-minimum-version | @@ -114,9 +114,9 @@ This skill requires **network access** to fetch documentation content: | Map classic Service Bus management APIs to ARM | https://learn.microsoft.com/en-us/azure/service-bus-messaging/deprecate-service-bus-management | | Configure auto-forwarding for Service Bus queues and 
subscriptions | https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-auto-forward | | Enable dead-lettering for Service Bus queues and subscriptions | https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-dead-letter | -| Configure duplicate detection for Service Bus entities | https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-duplicate-detection | +| Configure duplicate message detection in Azure Service Bus | https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-duplicate-detection | | Enable and configure Service Bus message sessions | https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-message-sessions | -| Enable partitioning in Basic and Standard Service Bus | https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-partitions-basic-standard | +| Enable partitioning for Azure Service Bus entities | https://learn.microsoft.com/en-us/azure/service-bus-messaging/enable-partitions-basic-standard | | Suspend and reactivate Azure Service Bus entities | https://learn.microsoft.com/en-us/azure/service-bus-messaging/entity-suspend | | Use Service Bus Explorer in Azure portal for data operations | https://learn.microsoft.com/en-us/azure/service-bus-messaging/explorer | | Use Azure Service Bus message browsing and peek operations | https://learn.microsoft.com/en-us/azure/service-bus-messaging/message-browsing | diff --git a/skills/azure-service-health/SKILL.md b/skills/azure-service-health/SKILL.md index 81616ec1..b10a864b 100644 --- a/skills/azure-service-health/SKILL.md +++ b/skills/azure-service-health/SKILL.md @@ -3,7 +3,7 @@ name: azure-service-health description: Expert knowledge for Azure Service Health development including troubleshooting, limits & quotas, security, configuration, and integrations & coding patterns. 
Use when integrating Service Health via APIs/webhooks, configuring alerts, querying Resource Graph, or diagnosing VM health, and other Azure Service Health related development tasks. Not for Azure Monitor (use azure-monitor), Azure Reliability (use azure-reliability), Azure Resiliency (use azure-resiliency), Azure Quotas (use azure-quotas). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-12" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Service Health Skill diff --git a/skills/azure-site-recovery/SKILL.md b/skills/azure-site-recovery/SKILL.md index ff480e25..66d68c42 100644 --- a/skills/azure-site-recovery/SKILL.md +++ b/skills/azure-site-recovery/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-site-recovery -description: Expert knowledge for Azure Site Recovery development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when planning ASR for VMware/Hyper‑V, automating vaults via ARM/Bicep/Terraform, or designing DR for SQL/SAP/AD workloads, and other Azure Site Recovery related development tasks. Not for Azure Backup (use azure-backup), Azure Migrate (use azure-migrate), Azure Virtual Machines (use azure-virtual-machines), Azure Virtual Machine Scale Sets (use azure-vm-scalesets). +description: Expert knowledge for Azure Site Recovery development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when planning ASR for VMware/Hyper‑V, configuring replication/failover, automating via ARM/Bicep, or securing appliances, and other Azure Site Recovery related development tasks. 
Not for Azure Backup (use azure-backup), Azure Migrate (use azure-migrate), Azure Virtual Machines (use azure-virtual-machines), Azure Virtual Machine Scale Sets (use azure-vm-scalesets). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Site Recovery Skill @@ -28,9 +28,9 @@ This skill requires **network access** to fetch documentation content: | Best Practices | L66-L71 | Guidance on tuning Azure Site Recovery performance: analyzing high data churn on VMs, and monitoring/troubleshooting process server health, capacity, and throughput. | | Decision Making | L72-L89 | Planning and sizing Azure Site Recovery: choosing tools vs Azure Migrate, VMware/Hyper-V DR capacity and cost estimation, managed disk pricing, failover/failback options, and classic-to-modern migration. | | Architecture & Design Patterns | L90-L99 | Designing Azure Site Recovery architectures for specific workloads (AD/DNS, SAP, Dynamics AX, SharePoint, IIS, SQL, VMware, file servers) and multi-tier app DR patterns. | -| Limits & Quotas | L100-L114 | Site Recovery scale, capacity, and support limits: VM/Hyper‑V/VMware matrices, high‑churn limits, shared disks, Mobility service usage, planner limits, and safe use with Azure Backup. | +| Limits & Quotas | L100-L114 | Support limits, capacity planning, and compatibility matrices for Azure Site Recovery (VMware/Hyper-V/Azure VMs), including churn limits, shared disks, Mobility service usage, and safe use with Azure Backup. | | Security | L115-L125 | Securing Azure Site Recovery: NSGs, TLS, encryption changes, RBAC, managed identities, and hardening replication appliances/VMware replication traffic.
| -| Configuration | L126-L187 | Configuring Azure Site Recovery for VMs and clusters (Azure, VMware, Hyper‑V, physical): replication setup, networking, encryption, appliances, policies, monitoring, failover and failback settings. | +| Configuration | L126-L187 | Configuring Azure Site Recovery for VMs/VMware/Hyper-V/physical: setup, networking, encryption, policies, appliances, monitoring, and managing replication/failover/failback settings. | | Integrations & Coding Patterns | L188-L200 | Scripts and templates for automating ASR: PowerShell for Hyper‑V/shared disks, ExpressRoute/Traffic Manager integration, and Bicep/ARM/Terraform to deploy Recovery Services vaults. | | Deployment | L201-L204 | Guidance for moving VMware disaster recovery setups from classic Azure Site Recovery to the modernized architecture, including migration steps, prerequisites, and configuration changes. | @@ -103,7 +103,7 @@ This skill requires **network access** to fetch documentation content: | Check Azure Site Recovery VM support matrix | https://learn.microsoft.com/en-us/azure/site-recovery/azure-to-azure-support-matrix | | Use high churn support limits for Azure VM DR | https://learn.microsoft.com/en-us/azure/site-recovery/concepts-azure-to-azure-high-churn-support | | Verify Hyper-V to Azure Site Recovery support matrix | https://learn.microsoft.com/en-us/azure/site-recovery/hyper-v-azure-support-matrix | -| Check replication appliance requirements for VMware DR | https://learn.microsoft.com/en-us/azure/site-recovery/replication-appliance-support-matrix | +| Check support matrix for Site Recovery replication appliance | https://learn.microsoft.com/en-us/azure/site-recovery/replication-appliance-support-matrix | | Understand shared disk support for Site Recovery | https://learn.microsoft.com/en-us/azure/site-recovery/shared-disk-support-matrix | | Use Azure Site Recovery with Azure Backup safely | 
https://learn.microsoft.com/en-us/azure/site-recovery/site-recovery-backup-interoperability | | Review version history and limitations of Site Recovery Deployment Planner | https://learn.microsoft.com/en-us/azure/site-recovery/site-recovery-deployment-planner-history | @@ -154,7 +154,7 @@ This skill requires **network access** to fetch documentation content: | Delete a Recovery Services vault configured for Site Recovery | https://learn.microsoft.com/en-us/azure/site-recovery/delete-vault | | Enable Extended Zones VM disaster recovery during VM creation | https://learn.microsoft.com/en-us/azure/site-recovery/disaster-recovery-for-edge-zone-via-vm-flow-tutorial | | Configure disaster recovery for VMs on Azure Extended Zones via vault flow | https://learn.microsoft.com/en-us/azure/site-recovery/disaster-recovery-for-edge-zone-vm-tutorial | -| Configure Site Recovery replication with private endpoints | https://learn.microsoft.com/en-us/azure/site-recovery/hybrid-how-to-enable-replication-private-endpoints | +| Configure Azure Site Recovery replication with private endpoints | https://learn.microsoft.com/en-us/azure/site-recovery/hybrid-how-to-enable-replication-private-endpoints | | Configure Hyper-V disaster recovery without VMM using Site Recovery | https://learn.microsoft.com/en-us/azure/site-recovery/hyper-v-azure-tutorial | | Configure on-premises Hyper-V infrastructure for Site Recovery | https://learn.microsoft.com/en-us/azure/site-recovery/hyper-v-prepare-on-premises-tutorial | | Set up Hyper-V with VMM disaster recovery to Azure | https://learn.microsoft.com/en-us/azure/site-recovery/hyper-v-vmm-azure-tutorial | diff --git a/skills/azure-sql-database/SKILL.md b/skills/azure-sql-database/SKILL.md index 3c6dcf1f..581b2d45 100644 --- a/skills/azure-sql-database/SKILL.md +++ b/skills/azure-sql-database/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-sql-database -description: Expert knowledge for Azure SQL Database development including troubleshooting, best practices, 
decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when choosing vCore/DTU/Hyperscale/serverless, configuring geo‑replication, elastic pools, Data Sync, or HA/DR, and other Azure SQL Database related development tasks. Not for Azure SQL Managed Instance (use azure-sql-managed-instance), SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines), Azure Cosmos DB (use azure-cosmos-db), Azure Database for PostgreSQL (use azure-database-postgresql). +description: Expert knowledge for Azure SQL Database development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when choosing vCore/DTU/Hyperscale, configuring geo‑replication, elastic pools, Data Sync, or security (TDE/Always Encrypted), and other Azure SQL Database related development tasks. Not for Azure Database for MariaDB (use azure-database-mariadb), Azure Database for MySQL (use azure-database-mysql), Azure Database for PostgreSQL (use azure-database-postgresql), Azure SQL Managed Instance (use azure-sql-managed-instance). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure SQL Database Skill @@ -29,7 +29,7 @@ This skill requires **network access** to fetch documentation content: | Decision Making | L77-L100 | Guidance for choosing Azure SQL deployment, pricing, and HA/DR options: vCore vs DTU, Hyperscale, serverless, reservations, licensing, migration paths, and cost/DR/automation planning. 
| | Architecture & Design Patterns | L101-L117 | Architectural patterns for Azure SQL: geo-replication, HA/DR, backups, connectivity, sharding/elastic scale-out, multi-tenant SaaS models, and rolling upgrade/failover designs. | | Limits & Quotas | L118-L131 | Limits, quotas, and resource caps for Azure SQL (free tiers, DTU/vCore, elastic pools, Hyperscale), plus how to request quota increases and watcher/operational constraints. | -| Security | L132-L199 | Securing Azure SQL: authentication (Entra, MFA, managed identity), network/firewall, encryption (TDE, Always Encrypted), auditing/Defender, data masking/classification, and compliance best practices. | +| Security | L132-L199 | Configuring Azure SQL security: authentication (Entra, MI, MFA), network/firewall, auditing, encryption (TDE, Always Encrypted), threat protection, compliance, and best‑practice hardening. | | Configuration | L200-L265 | Configuring Azure SQL databases: monitoring, backups, geo-replication/failover, security/immutability, scaling and elastic pools, Data Sync, maintenance windows, and CLI/PowerShell setup. | | Integrations & Coding Patterns | L266-L294 | Connecting apps and tools to Azure SQL (EF Core, .NET, Node.js, Python), sharding/elastic patterns, and automating management, replication, sync, and streaming via PowerShell/Spark/Stream Analytics | | Deployment | L295-L310 | Deploying and scaling Azure SQL databases/MI: automation (GitHub, ARM, Bicep, Terraform), Hyperscale/elastic pools, regional moves, feature availability, and dev environment setup. 
| @@ -191,8 +191,8 @@ This skill requires **network access** to fetch documentation content: | Use database-level TDE with customer-managed keys in Azure SQL | https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-database-level-overview?view=azuresql | | Configure TDE with user-assigned managed identity in Azure SQL | https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-identity?view=azuresql | | Rotate Azure SQL TDE protector using PowerShell and CLI | https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-key-rotation?view=azuresql | -| Set up customer-managed TDE with Azure Key Vault | https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-overview?view=azuresql | -| Respond to compromised Azure SQL TDE protector key | https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-remove-tde-protector?view=azuresql | +| Configure customer-managed TDE with Azure Key Vault | https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-overview?view=azuresql | +| Rotate and remove BYOK TDE protector for Azure SQL | https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-remove-tde-protector?view=azuresql | | Configure transparent data encryption for Azure SQL databases | https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-tde-overview?view=azuresql | | Secure Azure SQL with virtual network service endpoints and rules | https://learn.microsoft.com/en-us/azure/azure-sql/database/vnet-service-endpoint-rule-overview?view=azuresql | | Prepare for Azure SQL TLS root certificate rotation | https://learn.microsoft.com/en-us/azure/azure-sql/updates/ssl-root-certificate-expiring?view=azuresql | diff --git a/skills/azure-sql-managed-instance/SKILL.md b/skills/azure-sql-managed-instance/SKILL.md 
index 9c7bb988..06de1e16 100644 --- a/skills/azure-sql-managed-instance/SKILL.md +++ b/skills/azure-sql-managed-instance/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-sql-managed-instance -description: Expert knowledge for Azure SQL Managed Instance development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when using MI link, HA/DR and geo-replication, Entra/Kerberos auth, Key Vault TDE, or XEvents/Intelligent Insights, and other Azure SQL Managed Instance related development tasks. Not for Azure SQL Database (use azure-sql-database), SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines), Azure Database for MariaDB (use azure-database-mariadb), Azure Database for MySQL (use azure-database-mysql). +description: Expert knowledge for Azure SQL Managed Instance development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when choosing MI vs Azure SQL, configuring networking/HA, tuning performance, securing access, or integrating MI Link, and other Azure SQL Managed Instance related development tasks. Not for Azure SQL Database (use azure-sql-database), SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines), Azure Database for MySQL (use azure-database-mysql), Azure Database for PostgreSQL (use azure-database-postgresql). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. 
metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure SQL Managed Instance Skill @@ -29,7 +29,7 @@ This skill requires **network access** to fetch documentation content: | Decision Making | L77-L90 | Guidance for choosing Azure SQL Managed Instance vs other Azure SQL options, tiers, pools, networking, HA/DR options, ML differences, and planning Db2/Oracle migrations. | | Architecture & Design Patterns | L91-L95 | Connectivity architecture, networking models, and connection options for Azure SQL Database, including gateways, endpoints, firewalls, and integration with VNets and private access. | | Limits & Quotas | L96-L105 | Limits, quotas, and performance caps for Azure SQL MI: DTUs, free-tier and memory limits, resource ceilings, monitoring behavior, and how to request quota increases. | -| Security | L106-L160 | Configuring Azure SQL Managed Instance security: Entra auth and logins, Windows/Kerberos, managed identities, TDE and Key Vault, TLS, auditing, threat protection, networking, and policy-based controls. | +| Security | L106-L160 | Configuring auth, encryption, and network protection for SQL Managed Instance: Entra/Windows logins, TDE & Key Vault, TLS, auditing, threat protection, Private Link, and security best practices. | | Configuration | L161-L212 | Configuring and monitoring SQL Managed Instance: networking, backups/restore, maintenance windows, alerts, metrics/logs, Intelligent Insights, XEvents, auditing, and deployment options. | | Integrations & Coding Patterns | L213-L236 | Connecting apps and tools to SQL Managed Instance (.NET, Java, Python, etc.), automation, data import, DTC, XEvents, MI Link, backups, tracing, and Spark integration. | | Deployment | L237-L257 | Deploying, scaling, moving, and pausing Azure SQL Managed Instance; networking setup, regional availability, DR/replication, migrations, and BACPAC import/export. 
| @@ -134,7 +134,7 @@ This skill requires **network access** to fetch documentation content: | Enable Azure SQL TDE with Key Vault BYOK | https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-configure?view=azuresql | | Configure cross-tenant customer-managed keys for Azure SQL TDE | https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-cross-tenant?view=azuresql | | Use user-assigned managed identities for TDE customer-managed keys | https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-identity?view=azuresql | -| Set up customer-managed TDE with Azure Key Vault | https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-overview?view=azuresql | +| Configure customer-managed TDE with Azure Key Vault | https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-overview?view=azuresql | | Enable and manage transparent data encryption in Azure SQL | https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-tde-overview?view=azuresql | | Secure SQL Managed Instance with Microsoft Entra logins | https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/aad-security-configure-tutorial?view=azuresql | | Configure SQL Server Audit on Azure SQL Managed Instance | https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/auditing?view=azuresql | diff --git a/skills/azure-sql-virtual-machines/SKILL.md b/skills/azure-sql-virtual-machines/SKILL.md index 020876a6..f13abaa7 100644 --- a/skills/azure-sql-virtual-machines/SKILL.md +++ b/skills/azure-sql-virtual-machines/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-sql-virtual-machines -description: Expert knowledge for SQL Server on Azure Virtual Machines development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations 
& coding patterns, and deployment. Use when choosing Azure SQL vs SQL VMs, configuring Always On/FCI, tuning storage/VM sizing, or securing with Key Vault/Entra, and other SQL Server on Azure Virtual Machines related development tasks. Not for Azure SQL Database (use azure-sql-database), Azure SQL Managed Instance (use azure-sql-managed-instance), Azure Virtual Machines (use azure-virtual-machines), Azure Database Migration service (use azure-database-migration). +description: Expert knowledge for SQL Server on Azure Virtual Machines development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when choosing Azure SQL vs SQL VMs, configuring Always On/FCI, tuning storage/VM sizing, or securing with Key Vault/Entra, and other SQL Server on Azure Virtual Machines related development tasks. Not for Azure SQL Database (use azure-sql-database), Azure SQL Managed Instance (use azure-sql-managed-instance), Azure Virtual Machines (use azure-virtual-machines), Azure Data Science Virtual Machines (use azure-data-science-vm). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. 
metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # SQL Server on Azure Virtual Machines Skill @@ -25,7 +25,7 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| | Troubleshooting | L37-L51 | Diagnosing and fixing Azure SQL and SQL Server on VM issues: capacity, connectivity, performance, geo-replication, memory, log full, I/O throttling, and IaaS Agent extension errors | -| Best Practices | L52-L65 | Best practices for SQL Server on Azure VMs: performance tuning, storage/VM sizing, tempdb/ephemeral disks, HADR/FCI/DNN configuration, backups, and planned maintenance readiness. | +| Best Practices | L52-L65 | Best practices for SQL Server on Azure VMs: performance tuning, storage/VM sizing, tempdb, backups/restore, HADR/FCI/DNN configuration, and maintenance planning. | | Decision Making | L66-L80 | Guidance for choosing Azure SQL vs SQL VMs, comparing pricing/licensing, selecting HADR options, and planning migrations (Db2, Oracle, SQL 2014) and regional feature availability. | | Architecture & Design Patterns | L81-L89 | High-level designs and patterns for SQL Server on Azure VMs: connectivity, Always On availability groups, failover cluster instances, and Windows Server Failover Clustering setup. | | Limits & Quotas | L90-L95 | Info on Azure SQL capacity limits, DTU benchmark behavior, regional feature availability, and how to request quota increases for databases and managed instances | diff --git a/skills/azure-sre-agent/SKILL.md b/skills/azure-sre-agent/SKILL.md index 9e657d8a..c734628b 100644 --- a/skills/azure-sre-agent/SKILL.md +++ b/skills/azure-sre-agent/SKILL.md @@ -1,14 +1,14 @@ --- name: azure-sre-agent -description: Expert knowledge for Azure Sre Agent development including troubleshooting, decision making, limits & quotas, security, configuration, integrations & coding patterns, and deployment. 
Use when wiring SRE Agent to DevOps/Teams/ServiceNow, configuring tools and RBAC, querying KQL logs, or deploying as a Teams bot, and other Azure Sre Agent related development tasks. Not for Azure Monitor (use azure-monitor), Azure Reliability (use azure-reliability), Azure Resiliency (use azure-resiliency), Azure Service Health (use azure-service-health). +description: Expert knowledge for Azure Sre Agent development including troubleshooting, decision making, limits & quotas, security, configuration, and integrations & coding patterns. Use when wiring SRE Agent to DevOps/GitHub, PagerDuty/ServiceNow, configuring hooks/tools, RBAC, or KQL-based diagnostics, and other Azure Sre Agent related development tasks. Not for Azure Monitor (use azure-monitor), Azure Reliability (use azure-reliability), Azure Resiliency (use azure-resiliency), Azure Service Health (use azure-service-health). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-12" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Sre Agent Skill -This skill provides expert guidance for Azure Sre Agent. Covers troubleshooting, decision making, limits & quotas, security, configuration, integrations & coding patterns, and deployment. It combines local quick-reference content with remote documentation fetching capabilities. +This skill provides expert guidance for Azure Sre Agent. Covers troubleshooting, decision making, limits & quotas, security, configuration, and integrations & coding patterns. It combines local quick-reference content with remote documentation fetching capabilities. 
## How to Use This Skill @@ -24,13 +24,12 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| -| Troubleshooting | L35-L40 | Diagnosing Azure SRE Agent deployment/operation issues and using KQL to query its telemetry, logs, and actions for troubleshooting. | -| Decision Making | L41-L47 | Guidance on when to trigger deep investigations, how SRE Agent pricing works and what drives cost, and which Azure regions you can or should deploy the agent in | -| Limits & Quotas | L48-L52 | Tracking Azure SRE Agent usage, AI Unit consumption, limits, and understanding how pricing, billing units, and quotas affect your workloads. | -| Security | L53-L61 | Managing SRE Agent identities, authentication methods, and configuring RBAC/role permissions (including managed identity access to Azure DevOps repos) for secure resource access. | -| Configuration | L62-L73 | Configuring SRE Agent behavior: hooks and governance, custom tools/skills, specialized subagents, network/firewall settings, and enabling/using the Python/shell Code Interpreter via UI or REST API | -| Integrations & Coding Patterns | L74-L97 | Integrating Azure SRE Agent with DevOps, GitHub, Teams, ServiceNow, PagerDuty, Kusto, observability tools, and custom HTTP/Python/MCP tools for automation and incident workflows | -| Deployment | L98-L101 | How to deploy and configure the Azure SRE Agent as a Microsoft Teams bot, including setup steps, required permissions, and integration details. | +| Troubleshooting | L34-L39 | Diagnosing Azure SRE Agent deployment/operation issues and using KQL to query its telemetry, logs, and actions for troubleshooting. | +| Decision Making | L40-L46 | Guidance on when to trigger deep investigations, how SRE Agent pricing works and what drives cost, and which Azure regions you can deploy SRE Agent in. 
| +| Limits & Quotas | L47-L51 | Tracking Azure SRE Agent usage, AI Unit consumption, limits, and understanding how pricing, billing units, and quotas affect your workloads. | +| Security | L52-L61 | Identities, auth, and RBAC for SRE Agent: managed identities, ADO repo access, roles/permissions, and subscription/resource visibility configuration. | +| Configuration | L62-L75 | Configuring SRE Agent behavior: hooks and governance, code interpreter (Python/shell), custom tools/skills, subagents, tool overrides, incident plans, and network/firewall settings. | +| Integrations & Coding Patterns | L76-L95 | Integrating SRE Agent with DevOps, GitHub, alerts, incidents, observability tools, and custom Python/Kusto tools, including cross-tenant and PagerDuty/ServiceNow connections. | ### Troubleshooting | Topic | URL | @@ -57,6 +56,7 @@ This skill requires **network access** to fetch documentation content: | Configure managed identity access to ADO repos in SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/connect-ado-repo-managed-identity | | Manage Azure SRE Agent permissions and resource access | https://learn.microsoft.com/en-us/azure/sre-agent/manage-permissions | | Configure Azure SRE Agent permissions and RBAC access | https://learn.microsoft.com/en-us/azure/sre-agent/permissions | +| Understand subscription visibility and RBAC for SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/subscription-permission-visibility | | Configure Azure SRE Agent roles and permissions | https://learn.microsoft.com/en-us/azure/sre-agent/user-roles | ### Configuration @@ -67,6 +67,8 @@ This skill requires **network access** to fetch documentation content: | Create and manage governance hooks in Azure SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/create-manage-hooks-ui | | Create custom skills with tools and files in Azure SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/create-skill | | Configure specialized subagents in Azure SRE 
Agent | https://learn.microsoft.com/en-us/azure/sre-agent/create-subagent | +| Configure default and custom agent tool overrides | https://learn.microsoft.com/en-us/azure/sre-agent/default-agent-override-info | +| Configure incident response plans for Azure SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/incident-response-plans | | Configure network and firewall requirements for SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/network-requirements | | Configure agent hooks via REST API in Azure SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/tutorial-agent-hooks | | Enable and use Code Interpreter in Azure SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/use-code-interpreter | @@ -76,26 +78,18 @@ This skill requires **network access** to fetch documentation content: |-------|-----| | Connect Azure DevOps to Azure SRE Agent for code and work items | https://learn.microsoft.com/en-us/azure/sre-agent/ado-connector | | Connect Azure DevOps wikis as knowledge sources for SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/azure-devops-wiki-knowledge | -| Use Azure SRE Agent from Microsoft Teams | https://learn.microsoft.com/en-us/azure/sre-agent/chat-from-your-tools | +| Integrate Azure Monitor alerts with Azure SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/azure-monitor-alerts | | Connect Azure DevOps wiki as SRE Agent knowledge source | https://learn.microsoft.com/en-us/azure/sre-agent/connect-devops-wiki | +| Connect runbooks and knowledge sources to SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/connect-knowledge | | Configure PagerDuty integration for Azure SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/connect-pagerduty | -| Connect Azure SRE Agent to ServiceNow for incident management | https://learn.microsoft.com/en-us/azure/sre-agent/connect-servicenow | -| Create and use HTTP triggers for Azure SRE Agent | 
https://learn.microsoft.com/en-us/azure/sre-agent/create-http-trigger | +| Use connectors to extend Azure SRE Agent integrations | https://learn.microsoft.com/en-us/azure/sre-agent/connectors | | Build and deploy a Python SLA calculator tool in SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/create-python-tool | | Configure cross-tenant Azure DevOps access for SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/cross-account-ado-oauth | | Integrate external observability tools with Azure SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/diagnose-observability | | Integrate GitHub with Azure SRE Agent via OAuth or PAT | https://learn.microsoft.com/en-us/azure/sre-agent/github-connector | -| Install and configure marketplace plugins for Azure SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/install-plugin-from-marketplace | +| Connect incident platforms to Azure SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/incident-platforms | | Define Kusto tools to run deterministic KQL in SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/kusto-tools | -| Use MCP connectors and tools with Azure SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/mcp-connectors | | Integrate PagerDuty incidents with Azure SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/pagerduty-incidents | | Create and configure Python tools for Azure SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/python-code-execution | -| Connect ServiceNow as an incident platform for SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/servicenow-incidents | -| Connect PagerDuty incidents to Azure SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/set-up-pagerduty-indexing | -| Configure Microsoft Teams connector for Azure SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/set-up-teams-connector | -| Configure ServiceNow incident indexing for Azure SRE Agent | 
https://learn.microsoft.com/en-us/azure/sre-agent/setup-servicenow-indexing | - -### Deployment -| Topic | URL | -|-------|-----| -| Deploy Azure SRE Agent as a Microsoft Teams bot | https://learn.microsoft.com/en-us/azure/sre-agent/teams-bot | \ No newline at end of file +| Integrate ServiceNow incidents with Azure SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/servicenow-incidents | +| Connect PagerDuty incidents to Azure SRE Agent | https://learn.microsoft.com/en-us/azure/sre-agent/set-up-pagerduty-indexing | \ No newline at end of file diff --git a/skills/azure-synapse-analytics/SKILL.md b/skills/azure-synapse-analytics/SKILL.md index 461678e8..618c8d1c 100644 --- a/skills/azure-synapse-analytics/SKILL.md +++ b/skills/azure-synapse-analytics/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-synapse-analytics -description: Expert knowledge for Azure Synapse Analytics development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when designing Synapse workspaces, Spark pools, dedicated/serverless SQL, Delta Lake, or Synapse Link workloads, and other Azure Synapse Analytics related development tasks. Not for Azure Data Factory (use azure-data-factory), Azure Data Explorer (use azure-data-explorer), Azure Databricks (use azure-databricks), Azure HDInsight (use azure-hdinsight). +description: Expert knowledge for Azure Synapse Analytics development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when working with Synapse workspaces, Spark pools, dedicated/serverless SQL pools, Synapse Link, or Delta Lake, and other Azure Synapse Analytics related development tasks. 
Not for Azure Data Factory (use azure-data-factory), Azure Data Explorer (use azure-data-explorer), Azure Databricks (use azure-databricks), Azure HDInsight (use azure-hdinsight). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-12" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Synapse Analytics Skill @@ -31,8 +31,8 @@ This skill requires **network access** to fetch documentation content: | Limits & Quotas | L163-L172 | Synapse SQL pool limits: maintenance windows, memory/concurrency by performance level, capacity caps, temp table behavior, serverless Delta Lake v1 querying, and Synapse Link feature limits/issues. | | Security | L173-L231 | Securing Synapse workspaces end-to-end: auth and RBAC, network isolation, private endpoints, encryption, data exfiltration, policies, and secure access to storage, SQL, Spark, and migration scenarios. | | Configuration | L232-L280 | Configuring Synapse workspaces, Spark pools, and SQL pools: environments, scaling, libraries, monitoring/metrics, backups/restore, workload management, and integrations (Purview, AML, SynapseML). | -| Integrations & Coding Patterns | L281-L317 | Patterns and code to integrate Synapse (Spark, serverless, dedicated SQL) with ADLS, Cosmos DB, Azure SQL, AML, monitoring (Log Analytics, Prometheus), and external tools via connectors, REST, and T-SQL. | -| Deployment | L318-L331 | Deploying and managing Synapse workspaces and dedicated SQL pools: ARM/Bicep templates, CI/CD, source control, restore points, maintenance windows, and automated compute management. | +| Integrations & Coding Patterns | L281-L318 | Integrating Synapse with Spark, SQL pools, ML, Cosmos DB, ADLS, Kusto, Stream Analytics, and tools (Fivetran, Striim), plus logging/monitoring, packaging, and data movement patterns. 
| +| Deployment | L319-L332 | Deploying and managing Synapse workspaces and dedicated SQL pools: ARM/Bicep templates, CI/CD, source control, restore points, maintenance windows, and automated compute management. | ### Troubleshooting | Topic | URL | @@ -292,6 +292,7 @@ This skill requires **network access** to fetch documentation content: | Use Spark CDM connector to read/write Common Data Model | https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/data-sources/apache-spark-cdm-connector | | Use Kusto connector with Synapse serverless Spark | https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/data-sources/apache-spark-kusto-connector | | Use Synapse Spark connector for SQL databases | https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/data-sources/apache-spark-sql-connector | +| Configure Synapse Spark logs to Azure Storage with cert auth | https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/how-to-use-certificate-with-service-principal-for-log-storage | | Develop and submit Spark apps from IntelliJ to Synapse | https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/intellij-tool-synapse | | Use MSSparkUtils utilities in Synapse Spark notebooks | https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/microsoft-spark-utilities | | Mount external storage using Synapse Spark file APIs | https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/synapse-file-mount-api | diff --git a/skills/azure-test-plans/SKILL.md b/skills/azure-test-plans/SKILL.md index 6c93c5fd..ab4c667c 100644 --- a/skills/azure-test-plans/SKILL.md +++ b/skills/azure-test-plans/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-test-plans -description: Expert knowledge for Azure Test Plans development including limits & quotas, security, and integrations & coding patterns. 
Use when configuring custom test run fields, managing test access, or automating suites via tcm.exe and test configs, and other Azure Test Plans related development tasks. Not for Azure DevOps (use azure-devops), Azure Boards (use azure-boards), Azure Pipelines (use azure-pipelines), Azure App Testing (use azure-app-testing). +description: Expert knowledge for Azure Test Plans development including limits & quotas, security, and integrations & coding patterns. Use when defining test run fields, handling test case/suite limits, setting test access, or automating via tcm.exe, and other Azure Test Plans related development tasks. Not for Azure DevOps (use azure-devops), Azure Pipelines (use azure-pipelines), Azure Boards (use azure-boards), Azure App Testing (use azure-app-testing). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Test Plans Skill @@ -24,14 +24,15 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| -| Limits & Quotas | L31-L35 | Configuring and using custom fields in Azure DevOps test runs, including how to define, manage, and apply them for better test reporting and tracking. | -| Security | L36-L40 | Managing permissions, access levels, and security roles for users and groups in Azure Test Plans manual testing features. | -| Integrations & Coding Patterns | L41-L44 | Using tcm.exe CLI to manage Azure Test Plans: create and run test suites, import/export test cases, manage test configurations, and automate test management tasks | +| Limits & Quotas | L31-L36 | Test run customization with custom fields, plus limits, quotas, and data retention rules for Azure Test Plans (e.g., max test cases, suites, runs, and history duration). 
| +| Security | L37-L41 | Managing permissions, access levels, and security roles for users and groups in Azure Test Plans manual testing features. | +| Integrations & Coding Patterns | L42-L45 | Using tcm.exe CLI to manage Azure Test Plans: create and run test suites, import/export test cases, manage test configurations, and automate test management tasks | ### Limits & Quotas | Topic | URL | |-------|-----| | Use custom fields for Azure DevOps test runs | https://learn.microsoft.com/en-us/azure/devops/test/custom-fields?view=azure-devops | +| Understand Azure Test Plans limits and retention | https://learn.microsoft.com/en-us/azure/devops/test/reference-qa?view=azure-devops | ### Security | Topic | URL | diff --git a/skills/azure-virtual-desktop/SKILL.md b/skills/azure-virtual-desktop/SKILL.md index 0c10684a..8813ed75 100644 --- a/skills/azure-virtual-desktop/SKILL.md +++ b/skills/azure-virtual-desktop/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-virtual-desktop -description: Expert knowledge for Azure Virtual Desktop development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when working with FSLogix profiles, MSIX App Attach, autoscale/Start VM on Connect, Teams optimization, or AVD SSO, and other Azure Virtual Desktop related development tasks. Not for Azure Virtual Machines (use azure-virtual-machines), Azure Dev Box (use azure-dev-box), Azure VMware Solution (use azure-vmware-solution), Azure Data Science Virtual Machines (use azure-data-science-vm). +description: Expert knowledge for Azure Virtual Desktop development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. 
Use when working with AVD host pools, FSLogix profiles, autoscale/Start VM on Connect, MSIX App Attach, or Teams offload, and other Azure Virtual Desktop related development tasks. Not for Azure Virtual Machines (use azure-virtual-machines), Azure Dev Box (use azure-dev-box), Azure Virtual Machine Scale Sets (use azure-vm-scalesets), SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Virtual Desktop Skill @@ -25,14 +25,14 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| | Troubleshooting | L37-L52 | Diagnosing and fixing AVD issues: agent updates, MSIX App Attach, autoscale, FQDN/connectivity, connection quality, graphics, session host health, Teams, and Log Analytics-based troubleshooting | -| Best Practices | L53-L64 | Operational best practices for AVD: autoscale, Start VM on Connect, Windows multi-session tuning, validation host pools, proxy/RDP Multipath guidance, and resolving Azure Advisor recommendations. | -| Decision Making | L65-L80 | Planning and cost/licensing decisions for AVD: deployment models, autoscale, host pool and tool choices, storage/FSLogix, data locations, ESU, Local/Extended Zones, and Insights cost estimation | -| Architecture & Design Patterns | L81-L89 | Design patterns for AVD app delivery, stateless hosts, DR, FSLogix profile containers, and automated scaling with Automation/Logic Apps. | -| Limits & Quotas | L90-L95 | Guidance on RDP bandwidth requirements and optimizing Microsoft Teams (audio/video, collaboration features) performance and configuration in Azure Virtual Desktop. 
| -| Security | L96-L114 | Securing Azure Virtual Desktop: SSO (Entra ID/AD FS), MFA/Conditional Access, RBAC/roles, managed identities, external identities, session protections (watermarking, screen capture, clipboard), and auditing. | -| Configuration | L115-L175 | Configuring AVD environments: images, autoscale, networking, RDP/redirection, Teams/media, licensing, updates, monitoring, and client/RemoteApp settings for optimized session hosts. | -| Integrations & Coding Patterns | L176-L183 | Managing AVD via CLI/PowerShell, integrating partner App Attach delivery, enabling WebRTC multimedia redirection, and launching resources using custom URI schemes. | -| Deployment | L184-L192 | Guides for deploying and migrating AVD: adding session hosts, moving from classic to current AVD, changing regions, using regional host pools, and deploying Windows clients via Intune/ConfigMgr. | +| Best Practices | L53-L63 | Operational best practices for AVD: autoscale, Start VM on Connect, Windows multi-session tuning, validation host pools, proxy/RDP Multipath guidance, and resolving Azure Advisor recommendations. | +| Decision Making | L64-L79 | Planning and cost/licensing decisions for AVD: deployment models, autoscale, host pool and tool choices, storage/FSLogix, data locations, ESU, Local/Extended Zones, and Insights cost estimation | +| Architecture & Design Patterns | L80-L88 | Design patterns for AVD app delivery, stateless hosts, DR, FSLogix profile containers, and automated scaling with Automation/Logic Apps. | +| Limits & Quotas | L89-L94 | Guidance on RDP bandwidth requirements and optimizing Microsoft Teams (audio/video, collaboration features) performance and configuration in Azure Virtual Desktop. | +| Security | L95-L113 | Securing AVD access and sessions: SSO (Entra ID/AD FS), MFA/Conditional Access, RBAC/delegated admin, managed identities, external identities, clipboard/screen/watermark controls, WebAuthn, Kerberos, and Purview. 
| +| Configuration | L114-L174 | Configuring AVD environments: images, autoscale, networking, RDP/redirection, Teams/media, licensing, updates, monitoring, and client/RemoteApp settings for optimized session hosts. | +| Integrations & Coding Patterns | L175-L182 | Managing AVD via CLI/PowerShell, integrating partner App Attach delivery, enabling WebRTC multimedia redirection, and launching resources using custom URI schemes. | +| Deployment | L183-L191 | Guides for deploying and migrating AVD: adding session hosts, moving from classic to current AVD, changing regions, using regional host pools, and deploying Windows clients via Intune/ConfigMgr. | ### Troubleshooting | Topic | URL | @@ -58,7 +58,6 @@ This skill requires **network access** to fetch documentation content: | Configure Azure Virtual Desktop validation host pools | https://learn.microsoft.com/en-us/azure/virtual-desktop/configure-validation-environment | | Azure Virtual Desktop FAQ and operational best practices | https://learn.microsoft.com/en-us/azure/virtual-desktop/faq | | Apply proxy server guidelines for Azure Virtual Desktop | https://learn.microsoft.com/en-us/azure/virtual-desktop/proxy-server-support | -| Use RDP Multipath to improve Azure Virtual Desktop reliability | https://learn.microsoft.com/en-us/azure/virtual-desktop/rdp-multipath | | Start VM on Connect FAQ and usage best practices | https://learn.microsoft.com/en-us/azure/virtual-desktop/start-virtual-machine-connect-faq | | FAQ and best practices for Windows multi-session on AVD | https://learn.microsoft.com/en-us/azure/virtual-desktop/windows-multisession-faq | diff --git a/skills/azure-virtual-machines/SKILL.md b/skills/azure-virtual-machines/SKILL.md index 226b8a99..11c8e42f 100644 --- a/skills/azure-virtual-machines/SKILL.md +++ b/skills/azure-virtual-machines/SKILL.md @@ -1,9 +1,9 @@ --- name: azure-virtual-machines -description: Expert knowledge for Azure Virtual Machines development including troubleshooting, best practices, 
decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when choosing VM/managed disk SKUs, configuring scale sets, Trusted Launch, ADE/CMK encryption, or HPC/GPU workloads, and other Azure Virtual Machines related development tasks. Not for Azure Data Science Virtual Machines (use azure-data-science-vm), Azure Virtual Machine Scale Sets (use azure-vm-scalesets), SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines), Azure Cloud Services (use azure-cloud-services). +description: Expert knowledge for Azure Virtual Machines development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when choosing VM/SSD/GPU SKUs, using scale sets, Trusted Launch, ADE/CMK, or automating via CLI/ARM/Bicep, and other Azure Virtual Machines related development tasks. Not for Azure Data Science Virtual Machines (use azure-data-science-vm), SQL Server on Azure Virtual Machines (use azure-sql-virtual-machines), Azure Virtual Machine Scale Sets (use azure-vm-scalesets), Azure Kubernetes Service (AKS) (use azure-kubernetes-service). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Azure Virtual Machines Skill @@ -26,10 +26,10 @@ This skill requires **network access** to fetch documentation content: |----------|----------|-------------| | Troubleshooting | L37-L61 | Diagnosing and fixing Azure VM issues: hibernation, disk encryption, extensions, NSG blocking, Spot/scale set errors, Image Builder, kernel/packages, Trusted Launch, and gallery images. 
| | Best Practices | L62-L88 | Performance, scaling, HA, and cost-optimization best practices for Azure VMs, including HPC/InfiniBand tuning, disks/snapshots, OS-specific tweaks, and Image Builder/boot-time optimization. | -| Decision Making | L89-L156 | Guidance for choosing VM/disk types, costs, licensing, DNS/images, and backup/DR, plus detailed migration and retirement paths for VM series, disks, OSes, GPU/HPC, Oracle, and WebLogic workloads | +| Decision Making | L89-L156 | Guidance for choosing VM/disk types, costs, reservations, and backup/DR, plus migration and retirement planning (VM sizes, GPU/HPC, OS images, Oracle/WebLogic, AWS/other platforms). | | Architecture & Design Patterns | L157-L174 | Design patterns for VM-based architectures: multi-region and fleet strategies, NUMA/topology tuning for HPC SKUs, low-latency placement, and Oracle/OpenShift deployment and DR designs. | -| Limits & Quotas | L175-L379 | VM size specs, disk and storage performance limits, quotas, lifecycle/deprecation rules, and operational behaviors to plan capacity, performance, and compliance for Azure VMs. | -| Security | L380-L458 | Securing Azure VMs and disks: encryption (ADE, CMK, double/host), Key Vault, TLS, Trusted Launch, OS Guard, MSP/metadata hardening, RBAC/Policy, and secure image/gallery sharing. | +| Limits & Quotas | L175-L378 | VM size specs, disk/storage performance limits, GPU/HPC capacities, lifecycle/retirement, and regional vCPU/host quotas for planning, scaling, and sizing Azure VMs. | +| Security | L379-L457 | Securing Azure VMs and disks: encryption (ADE, CMK, double/host), Key Vault, TLS, Trusted Launch, OS Guard, MSP/metadata hardening, RBAC/Policy, and secure image/gallery sharing. | | Configuration | [configuration.md](configuration.md) | Configuring Azure VMs and scale sets: images, disks, networking, extensions, monitoring, maintenance, HPC/GPU, OS agents, SSH/WinRM, Oracle workloads, and automation via CLI/ARM/Bicep. 
| | Integrations & Coding Patterns | [integrations.md](integrations.md) | Managing and automating Azure VMs with CLI/PowerShell/REST: backups, snapshots, disk encryption, maintenance/availability monitoring, Key Vault, networking, and Oracle DB integration. | | Deployment | [deployment.md](deployment.md) | Deploying and migrating Azure VMs/AKS nodes: disk type moves, regional/zonal moves, in-place OS upgrades, blue-green/rolling deployments, and DevOps-based image and snapshot workflows. | @@ -96,6 +96,7 @@ This skill requires **network access** to fetch documentation content: | Select constrained vCPU VM sizes for licensing | https://learn.microsoft.com/en-us/azure/virtual-machines/constrained-vcpu | | Monitor and control Azure VM spending with Cost Management | https://learn.microsoft.com/en-us/azure/virtual-machines/cost-optimization-monitor-costs | | Plan and estimate Azure VM costs using Cost Management | https://learn.microsoft.com/en-us/azure/virtual-machines/cost-optimization-plan-to-manage-costs | +| Understand impact of deprecated Azure Marketplace images | https://learn.microsoft.com/en-us/azure/virtual-machines/deprecated-images | | Plan migration from Azure Disk Encryption to encryption at host | https://learn.microsoft.com/en-us/azure/virtual-machines/disk-encryption-migrate | | Choose options to improve Azure disk performance | https://learn.microsoft.com/en-us/azure/virtual-machines/disks-performance-options | | Select redundancy options for Azure managed disks | https://learn.microsoft.com/en-us/azure/virtual-machines/disks-redundancy | @@ -105,14 +106,13 @@ This skill requires **network access** to fetch documentation content: | Estimate Azure VM costs using portal cost card | https://learn.microsoft.com/en-us/azure/virtual-machines/estimated-vm-create-cost-card | | Decide between Azure Generation 1 and 2 VMs | https://learn.microsoft.com/en-us/azure/virtual-machines/generation-2 | | Choose DNS name resolution options for Linux Azure VMs | 
https://learn.microsoft.com/en-us/azure/virtual-machines/linux/azure-dns | -| Choose endorsed Linux distributions and image sources | https://learn.microsoft.com/en-us/azure/virtual-machines/linux/endorsed-distros | | Plan and create custom Linux images for Azure | https://learn.microsoft.com/en-us/azure/virtual-machines/linux/imaging | | Plan migration from retiring Dedicated Host SKUs | https://learn.microsoft.com/en-us/azure/virtual-machines/migration/dedicated-host-migration-guide | | Migrate workloads from retiring Azure Dedicated Host SKUs | https://learn.microsoft.com/en-us/azure/virtual-machines/migration/dedicated-host-migration-guide | | Plan migration from AWS EC2 to Azure Virtual Machines | https://learn.microsoft.com/en-us/azure/virtual-machines/migration/migrate-from-elastic-compute-cloud-architecture | | Migrate legacy managed images to Azure Compute Gallery | https://learn.microsoft.com/en-us/azure/virtual-machines/migration/migration-managed-image-to-compute-gallery | -| Choose replacement sizes for retired Azure VMs | https://learn.microsoft.com/en-us/azure/virtual-machines/migration/sizes/d-ds-dv2-dsv2-ls-series-migration-guide | -| Migrate NC and ND GPU compute workloads to newer sizes | https://learn.microsoft.com/en-us/azure/virtual-machines/migration/sizes/n-series-migration | +| Choose replacement sizes for retired Azure VM series | https://learn.microsoft.com/en-us/azure/virtual-machines/migration/sizes/d-ds-dv2-dsv2-ls-series-migration-guide | +| Plan migration for Azure N-series GPU workloads | https://learn.microsoft.com/en-us/azure/virtual-machines/migration/sizes/n-series-migration | | Migrate workloads from legacy NV GPU VMs | https://learn.microsoft.com/en-us/azure/virtual-machines/migration/sizes/nv-series-migration-guide | | Migrate workloads from retired NV-series GPU VMs | https://learn.microsoft.com/en-us/azure/virtual-machines/migration/sizes/nv-series-migration-guide | | Migrate from NC24rs_v3 before retirement | 
https://learn.microsoft.com/en-us/azure/virtual-machines/ncv3-nc24rs-retirement | @@ -133,7 +133,7 @@ This skill requires **network access** to fetch documentation content: | Plan migration for ND-series GPU VM retirement | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/nd-series-retirement | | Plan migration from Azure NP-series VMs before retirement | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/np-series-retirement | | Understand Azure NV series retirement timeline | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/nv-series-retirement | -| Review retired Azure VM series and replacements | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/retired-sizes-list | +| Plan migrations from retired Azure VM size series | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/retirement/retired-sizes-list | | Decide when and how to use Azure Spot VMs | https://learn.microsoft.com/en-us/azure/virtual-machines/spot-vms | | Plan migration from Azure unmanaged disks retirement | https://learn.microsoft.com/en-us/azure/virtual-machines/unmanaged-disks-deprecation | | Use Azure VM restore points for granular recovery | https://learn.microsoft.com/en-us/azure/virtual-machines/virtual-machines-create-restore-points | @@ -180,11 +180,10 @@ This skill requires **network access** to fetch documentation content: | Understand Azure VM compute throttling limits | https://learn.microsoft.com/en-us/azure/virtual-machines/compute-throttling-limits | | Check support matrix and limits for Azure VM restore points | https://learn.microsoft.com/en-us/azure/virtual-machines/concepts-restore-points | | Compute optimized Dedicated Host SKU capacities and packing | https://learn.microsoft.com/en-us/azure/virtual-machines/dedicated-host-compute-optimized-skus | -| General purpose Dedicated Host SKU capacities and packing | 
https://learn.microsoft.com/en-us/azure/virtual-machines/dedicated-host-general-purpose-skus | +| General purpose Azure Dedicated Host SKU specifications | https://learn.microsoft.com/en-us/azure/virtual-machines/dedicated-host-general-purpose-skus | | GPU optimized Dedicated Host SKU capacities and packing | https://learn.microsoft.com/en-us/azure/virtual-machines/dedicated-host-gpu-optimized-skus | | Memory optimized Dedicated Host SKU capacities and packing | https://learn.microsoft.com/en-us/azure/virtual-machines/dedicated-host-memory-optimized-skus | | Storage optimized Dedicated Host SKU capacities and packing | https://learn.microsoft.com/en-us/azure/virtual-machines/dedicated-host-storage-optimized-skus | -| Understand deprecation rules for Azure VM images | https://learn.microsoft.com/en-us/azure/virtual-machines/deprecated-images | | Use Azure VM managed disk bursting performance limits | https://learn.microsoft.com/en-us/azure/virtual-machines/disk-bursting | | Understand performance tiers for Azure Managed Disks | https://learn.microsoft.com/en-us/azure/virtual-machines/disks-change-performance | | Configure and deploy Azure Premium SSD v2 disks | https://learn.microsoft.com/en-us/azure/virtual-machines/disks-deploy-premium-v2 | @@ -224,8 +223,8 @@ This skill requires **network access** to fetch documentation content: | Reference specs for Azure FXmdsv2 memory-intensive VM sizes | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/compute-optimized/fxmdsv2-series | | Reference specs for Azure FXmsv2 memory-intensive VM sizes | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/compute-optimized/fxmsv2-series | | NMads MA35d video transcoding VM specs and limits | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/fpga-accelerated/nm-ads-ma35d-series | -| Check NP family VM sizes and specs | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/fpga-accelerated/np-family | -| Review NP-series VM size 
specifications | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/fpga-accelerated/np-series |
+| Reference NP-series FPGA VM sizes and limits | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/fpga-accelerated/np-family |
+| Review NP-series FPGA VM specifications and retirement | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/fpga-accelerated/np-series |
 | Use Azure Av2 VM sizes and specifications | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/general-purpose/av2-series |
 | Select Azure Basv2 burstable VM sizes | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/general-purpose/basv2-series |
 | Choose Azure Bpsv2 Arm-based VM sizes | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/general-purpose/bpsv2-series |
@@ -302,7 +301,7 @@ This skill requires **network access** to fetch documentation content:
 | Reference NDv2 GPU VM size specifications | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/ndv2-series |
 | NG GPU VM family sizes and specs | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/ng-family |
 | NGads V620 GPU VM size specs and limits | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/ngadsv620-series |
-| NV GPU VM family sizes and capabilities | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/nv-family |
+| Reference NV-series GPU VM sizes and specs | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/nv-family |
 | NVads A10 v5 GPU VM specs and limits | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/nvadsa10v5-series |
 | Reference NVads V710 v5 Azure GPU VM size limits | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/nvadsv710-v5-series |
 | NVv4 GPU VM size specifications and limits | https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/gpu-accelerated/nvv4-series |
diff --git a/skills/azure-virtual-network/SKILL.md b/skills/azure-virtual-network/SKILL.md
index 2fcb4ade..b5cb043e 100644
--- a/skills/azure-virtual-network/SKILL.md
+++ b/skills/azure-virtual-network/SKILL.md
@@ -3,7 +3,7 @@ name: azure-virtual-network
 description: Expert knowledge for Azure Virtual Network development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, and configuration. Use when designing VNets, NSGs, service endpoints, VNet peering, VPN gateways, or Azure Firewall/NAT gateways, and other Azure Virtual Network related development tasks. Not for Azure Networking (use azure-networking), Azure Virtual Network Manager (use azure-virtual-network-manager), Azure Virtual WAN (use azure-virtual-wan), Azure VPN Gateway (use azure-vpn-gateway).
 compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
 metadata:
-  generated_at: "2026-04-12"
+  generated_at: "2026-04-26"
   generator: "docs2skills/1.0.0"
 ---
 # Azure Virtual Network Skill
diff --git a/skills/azure-vm-scalesets/SKILL.md b/skills/azure-vm-scalesets/SKILL.md
index 6c962ba4..ffb67b41 100644
--- a/skills/azure-vm-scalesets/SKILL.md
+++ b/skills/azure-vm-scalesets/SKILL.md
@@ -1,9 +1,9 @@
 ---
 name: azure-vm-scalesets
-description: Expert knowledge for Azure Virtual Machine Scale Sets development including troubleshooting, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring VMSS autoscale, upgrade modes, zones/PPGs, Spot/standby pools, or disk encryption with Key Vault, and other Azure Virtual Machine Scale Sets related development tasks. Not for Azure Virtual Machines (use azure-virtual-machines), Azure Kubernetes Service (AKS) (use azure-kubernetes-service), Azure App Service (use azure-app-service), Azure Service Fabric (use azure-service-fabric).
+description: Expert knowledge for Azure Virtual Machine Scale Sets development including troubleshooting, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring autoscale rules, upgrade modes, Spot/standby pools, disk encryption, or ARM/CLI/PowerShell deployments, and other Azure Virtual Machine Scale Sets related development tasks. Not for Azure Virtual Machines (use azure-virtual-machines), Azure Kubernetes Service (AKS) (use azure-kubernetes-service), Azure Service Fabric (use azure-service-fabric), Azure App Service (use azure-app-service).
 compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
 metadata:
-  generated_at: "2026-04-05"
+  generated_at: "2026-04-26"
   generator: "docs2skills/1.0.0"
 ---
 # Azure Virtual Machine Scale Sets Skill
@@ -29,7 +29,7 @@ This skill requires **network access** to fetch documentation content:
 | Architecture & Design Patterns | L56-L64 | Designing resilient VM scale sets: zones, fault domains, zone balancing modes, proximity placement groups, and standby pools to optimize availability, latency, and scale-out behavior. |
 | Limits & Quotas | L65-L73 | VMSS capacity, instance and placement group limits, standby pool constraints, maintenance notification behavior, and FAQs on supported scale, features, and quotas |
 | Security | L74-L82 | Encrypting VM scale set disks (CLI, PowerShell, ARM), configuring Key Vault and extension sequencing for Azure Disk Encryption, and setting security policies/RBAC for VMSS. |
-| Configuration | L83-L130 | Configuring VM Scale Sets: scaling rules, upgrades, networking, disks, images, health/repair, standby pools, instance mix, protection, and automation via CLI, PowerShell, templates, and portal |
+| Configuration | L83-L130 | Configuring VM Scale Sets: scaling rules, upgrades, health/repairs, disks, networking, instance mix, standby pools, protection, and automation via portal, CLI, PowerShell, and ARM templates. |
 | Integrations & Coding Patterns | L131-L140 | Using CLI/PowerShell/DSC/custom script to deploy apps, configure, and manage VM Scale Sets, plus integrating standby pools with Log Analytics for monitoring and automation. |
 | Deployment | L141-L149 | Creating and deploying VM scale sets with gallery/custom images, ARM templates, app deployment steps, and configuring instances across availability zones. |
diff --git a/skills/azure-vmware-solution/SKILL.md b/skills/azure-vmware-solution/SKILL.md
index ed35d6fc..f628af6a 100644
--- a/skills/azure-vmware-solution/SKILL.md
+++ b/skills/azure-vmware-solution/SKILL.md
@@ -1,9 +1,9 @@
 ---
 name: azure-vmware-solution
-description: Expert knowledge for Azure VMware Solution development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when designing AVS networking/storage, HCX migrations, Citrix/Horizon VDI, JetStream DR, or NetApp Files integrations, and other Azure VMware Solution related development tasks. Not for Azure Virtual Machines (use azure-virtual-machines), Azure Virtual Machine Scale Sets (use azure-vm-scalesets), Azure Stack Edge (use azure-stack-edge), Azure Baremetal Infrastructure (use azure-baremetal-infrastructure).
+description: Expert knowledge for Azure VMware Solution development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring AVS networking/vSAN, HCX migration/DR, Citrix/Horizon desktops, Defender for Cloud, or Cloud Director, and other Azure VMware Solution related development tasks. Not for Azure Virtual Machines (use azure-virtual-machines), Azure Virtual Machine Scale Sets (use azure-vm-scalesets), Azure Stack Edge (use azure-stack-edge), Azure Migrate (use azure-migrate).
 compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
 metadata:
-  generated_at: "2026-04-19"
+  generated_at: "2026-04-26"
   generator: "docs2skills/1.0.0"
 ---
 # Azure VMware Solution Skill
@@ -24,21 +24,21 @@ This skill requires **network access** to fetch documentation content:
 | Category | Lines | Description |
 |----------|-------|-------------|
-| Troubleshooting | L37-L43 | Diagnosing and fixing Azure VMware Solution issues: deployment failures, known platform errors, and disaster recovery problems (replication, failover, and recovery). |
+| Troubleshooting | L37-L43 | Diagnosing and fixing AVS issues: known errors and workarounds, deployment failures, and disaster recovery problems for Azure VMware Solution VMs. |
 | Best Practices | L44-L51 | Best practices for AVS private cloud maintenance, security hardening, and tuning NSX scale/performance specifically for HCX migration scenarios. |
 | Decision Making | L52-L65 | Guidance on choosing AVS connectivity, migration tools, backup and storage options, SQL/Windows licensing, cross-region moves, reserved instances, and VMware Cloud Foundation subscriptions. |
 | Architecture & Design Patterns | L66-L76 | Network, storage, and desktop virtualization design for AVS: hub-spoke integration, vSAN/stretched clusters, Citrix/Horizon, GitHub Enterprise Server, and Cloud Director tenant networking. |
 | Limits & Quotas | L77-L88 | Host, cluster, capacity, and quota limits for AVS private clouds, plus vSAN ESA support, Gen2 routing limits, and required network ports and planning steps. |
 | Security | L89-L103 | Securing AVS: identity/access control, Defender for Cloud, vSAN CMK encryption, external LDAP/vCenter identity, Trusted Launch/vTPM, ESU, role assignments, app protection, and credential rotation. |
-| Configuration | L104-L142 | Configuring AVS environments: networking, storage, backup/DR, monitoring, DNS, SNAT, vSAN, Cloud Director, Arc, and integrations like NetApp, Elastic SAN, Pure, and Aria Operations. |
-| Integrations & Coding Patterns | L143-L154 | Networking, migration, and integration patterns for AVS: VPN/ExpressRoute, HCX migrations, Traffic Manager, monitoring/logging, backup, and using services like NetApp Files with AVS VMs |
+| Configuration | L104-L140 | Configuring AVS environments: networking, storage, backup/DR, monitoring, DNS, SNAT, vSAN, Cloud Director, Arc, and integrations like NetApp, Elastic SAN, Pure, and Aria Operations. |
+| Integrations & Coding Patterns | L141-L154 | Patterns for connecting AVS to Azure services: storage (Elastic SAN, ANF), networking/VPN, HCX migration/DR, backups, monitoring/logging, and automation via HCX Run Commands. |
 | Deployment | L155-L160 | Guides for deploying AVS workloads: Citrix Virtual Apps/Desktops, JetStream DR for AVS/on-prem, and integrating VMware Cloud Director service with Azure VMware Solution. |
 ### Troubleshooting
 | Topic | URL |
 |-------|-----|
 | Resolve Azure VMware Solution known issues and workarounds | https://learn.microsoft.com/en-us/azure/azure-vmware/azure-vmware-solution-known-issues |
-| Resolve disaster recovery issues in Azure VMware | https://learn.microsoft.com/en-us/azure/azure-vmware/ecosystem-disaster-recovery-vms |
+| Resolve disaster recovery issues for AVS virtual machines | https://learn.microsoft.com/en-us/azure/azure-vmware/ecosystem-disaster-recovery-vms |
 | Troubleshoot Azure VMware Solution deployment failures | https://learn.microsoft.com/en-us/azure/azure-vmware/fix-deployment-failures |
 ### Best Practices
@@ -111,7 +111,6 @@ This skill requires **network access** to fetch documentation content:
 | Configure Cloud Backup policies for AVS datastores and VMs | https://learn.microsoft.com/en-us/azure/azure-vmware/backup-azure-netapp-files-datastores-vms |
 | Back up AVS VMware virtual machines with Azure Backup Server | https://learn.microsoft.com/en-us/azure/azure-vmware/backup-azure-vmware-solution-virtual-machines |
 | Configure alerts and metrics monitoring for Azure VMware Solution | https://learn.microsoft.com/en-us/azure/azure-vmware/configure-alerts-for-azure-vmware-solution |
-| Use Azure Elastic SAN as AVS iSCSI datastores | https://learn.microsoft.com/en-us/azure/azure-vmware/configure-azure-elastic-san |
 | Create Azure Monitor resource health alerts for AVS | https://learn.microsoft.com/en-us/azure/azure-vmware/configure-azure-monitor-for-resource-health-for-azure-vmware-solution |
 | Use Azure Native Pure Storage Cloud with AVS | https://learn.microsoft.com/en-us/azure/azure-vmware/configure-azure-native-pure-storage-cloud |
 | Configure Azure VMware Solution performance and health metrics | https://learn.microsoft.com/en-us/azure/azure-vmware/configure-azure-vmware-solution-metrics |
@@ -128,7 +127,6 @@ This skill requires **network access** to fetch documentation content:
 | Configure Windows Server Failover Cluster on AVS vSAN | https://learn.microsoft.com/en-us/azure/azure-vmware/configure-windows-server-failover-cluster |
 | Create and manage AVS placement policies for VMs | https://learn.microsoft.com/en-us/azure/azure-vmware/create-placement-policy |
 | Deploy VMware Cloud Director Availability on AVS | https://learn.microsoft.com/en-us/azure/azure-vmware/deploy-vmware-cloud-director-availability-in-azure-vmware-solution |
-| Deploy VMware Site Recovery Manager for AVS disaster recovery | https://learn.microsoft.com/en-us/azure/azure-vmware/disaster-recovery-using-vmware-site-recovery-manager |
 | Configure Managed SNAT for Azure VMware workloads | https://learn.microsoft.com/en-us/azure/azure-vmware/enable-managed-snat-for-workloads |
 | Configure default routes and disable internet for Azure VMware Solution | https://learn.microsoft.com/en-us/azure/azure-vmware/enable-public-ip-nsx-edge |
 | Configure Azure Hybrid Benefit Unlimited Virtualization for SQL on AVS | https://learn.microsoft.com/en-us/azure/azure-vmware/enable-sql-azure-hybrid-benefit |
@@ -143,7 +141,9 @@ This skill requires **network access** to fetch documentation content:
 ### Integrations & Coding Patterns
 | Topic | URL |
 |-------|-----|
+| Configure Azure Elastic SAN as AVS iSCSI datastore | https://learn.microsoft.com/en-us/azure/azure-vmware/configure-azure-elastic-san |
 | Configure site-to-site VPN with Azure Virtual WAN for Azure VMware Solution | https://learn.microsoft.com/en-us/azure/azure-vmware/configure-port-mirroring-azure-vmware-solution |
+| Deploy VMware Site Recovery Manager for AVS DR | https://learn.microsoft.com/en-us/azure/azure-vmware/disaster-recovery-using-vmware-site-recovery-manager |
 | Enable HCX migrations over public internet for AVS | https://learn.microsoft.com/en-us/azure/azure-vmware/enable-hcx-access-over-internet |
 | Install Cloud Backup plug-in for Azure VMware VMs | https://learn.microsoft.com/en-us/azure/azure-vmware/install-cloud-backup-virtual-machines |
 | Integrate Azure Native Monitoring and Protection with AVS VMs | https://learn.microsoft.com/en-us/azure/azure-vmware/integrate-azure-native-services |
diff --git a/skills/azure-vpn-gateway/SKILL.md b/skills/azure-vpn-gateway/SKILL.md
index 07a34018..bf17aeb5 100644
--- a/skills/azure-vpn-gateway/SKILL.md
+++ b/skills/azure-vpn-gateway/SKILL.md
@@ -1,9 +1,9 @@
 ---
 name: azure-vpn-gateway
-description: Expert knowledge for Azure VPN Gateway development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring S2S/P2S tunnels, IPsec/IKE/BGP, Entra/RADIUS auth, active-active gateways, or S2S over ExpressRoute, and other Azure VPN Gateway related development tasks. Not for Azure ExpressRoute (use azure-expressroute), Azure Virtual WAN (use azure-virtual-wan), Azure Virtual Network (use azure-virtual-network), Azure Virtual Network Manager (use azure-virtual-network-manager).
+description: Expert knowledge for Azure VPN Gateway development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when configuring S2S/P2S tunnels, BGP routing, IPsec/IKE policies, Entra/RADIUS auth, or cross-cloud VPNs, and other Azure VPN Gateway related development tasks. Not for Azure Virtual Network (use azure-virtual-network), Azure Virtual WAN (use azure-virtual-wan), Azure ExpressRoute (use azure-expressroute), Azure Application Gateway (use azure-application-gateway).
 compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
 metadata:
-  generated_at: "2026-04-19"
+  generated_at: "2026-04-26"
   generator: "docs2skills/1.0.0"
 ---
 # Azure VPN Gateway Skill
@@ -26,13 +26,13 @@ This skill requires **network access** to fetch documentation content:
 |----------|-------|-------------|
 | Troubleshooting | L37-L44 | Diagnosing and fixing Azure VPN Gateway issues: S2S/P2S connection failures, certificate/auth errors, macOS IKEv2, throughput, health checks, resets, and packet-capture/log-based debugging |
 | Best Practices | L45-L49 | Guidance on using network virtual appliances (NVAs) in Azure as VPN endpoints for remote access, including design, routing, security, and integration with Azure VPN Gateway. |
-| Decision Making | L50-L58 | Guidance on choosing VPN Gateway SKUs, understanding SKU mappings, and planning/migrating VPN setups (P2S SSTP→IKEv2/OpenVPN, Classic→ARM, and remote work P2S strategies). |
-| Architecture & Design Patterns | L59-L65 | Design patterns and guidance for choosing VPN Gateway topologies, configuring active-active gateways, and building highly available, resilient site-to-site connectivity. |
-| Limits & Quotas | L66-L71 | VPN Gateway client version history, SKU comparisons, and FAQs about gateway limits, scale, performance, and connection behavior |
-| Security | L72-L94 | Securing Azure VPN Gateway: IPsec/IKE policies, forced tunneling, cert/RADIUS auth, Entra ID & MFA for P2S, client config (Win/macOS/Linux), access control, roles, and crypto best practices. |
-| Configuration | L95-L151 | Configuring Azure VPN Gateway and clients: P2S/S2S setup, auth (Entra, cert, RADIUS), BGP, IPsec/NAT, routing, monitoring, maintenance, and client configs for Windows, macOS, Linux, iOS. |
-| Integrations & Coding Patterns | L152-L159 | Configuring Azure VPN Gateway with on-prem devices and services: NPS/RADIUS VSAs for P2S, S2S over ExpressRoute, Cisco ASA samples, and BGP VPN connectivity with AWS. |
-| Deployment | L160-L173 | Deploying and migrating Azure VPN Gateways: create/upgrade gateways and SKUs, switch active/active modes, set up S2S VPNs, and manage client profiles and IP migrations via PowerShell/CLI. |
+| Decision Making | L50-L59 | Guidance on choosing VPN Gateway SKUs, planning SKU/IP migrations, mapping old to new SKUs, and migrating/architecting P2S and Classic-to-ARM VPN gateways for remote access. |
+| Architecture & Design Patterns | L60-L66 | Design patterns and guidance for choosing VPN Gateway topologies, configuring active-active gateways, and building highly available, resilient site-to-site connectivity. |
+| Limits & Quotas | L67-L72 | VPN Gateway client version history, SKU comparisons, and FAQs about gateway limits, scale, performance, and connection behavior |
+| Security | L73-L95 | Securing Azure VPN Gateway: IPsec/IKE policies, forced tunneling, cert/RADIUS auth, Entra ID & MFA for P2S, client config (Win/macOS/Linux), access control, roles, and crypto best practices. |
+| Configuration | L96-L152 | Configuring Azure VPN Gateway and clients: P2S/S2S setup, auth (Entra, cert, RADIUS), BGP, IPsec/NAT, routing, monitoring, maintenance, and client configs for Windows, macOS, Linux, iOS. |
+| Integrations & Coding Patterns | L153-L160 | Configuring Azure VPN Gateway with on-prem devices and services: NPS/RADIUS VSAs for P2S, S2S over ExpressRoute, Cisco ASA samples, and BGP VPN connectivity with AWS. |
+| Deployment | L161-L173 | Deploying and migrating Azure VPN Gateways: create/upgrade gateways and SKUs, switch active/active modes, set up S2S VPNs, and manage client profiles and IP migrations via PowerShell/CLI. |
 ### Troubleshooting
 | Topic | URL |
@@ -51,6 +51,7 @@ This skill requires **network access** to fetch documentation content:
 | Topic | URL |
 |-------|-----|
 | Select appropriate Azure VPN Gateway SKU | https://learn.microsoft.com/en-us/azure/vpn-gateway/about-gateway-skus |
+| Decide and plan VPN Gateway public IP SKU migration | https://learn.microsoft.com/en-us/azure/vpn-gateway/basic-public-ip-migrate-about |
 | Understand Azure VPN Gateway SKU consolidation and mappings | https://learn.microsoft.com/en-us/azure/vpn-gateway/gateway-sku-consolidation |
 | Migrate P2S connections from SSTP to IKEv2/OpenVPN | https://learn.microsoft.com/en-us/azure/vpn-gateway/ikev2-openvpn-from-sstp |
 | Migrate VPN Gateways from Classic to Resource Manager | https://learn.microsoft.com/en-us/azure/vpn-gateway/vpn-gateway-classic-resource-manager-migration |
@@ -160,7 +161,6 @@ This skill requires **network access** to fetch documentation content:
 ### Deployment
 | Topic | URL |
 |-------|-----|
-| Plan migration from Basic to Standard public IP for VPN Gateway | https://learn.microsoft.com/en-us/azure/vpn-gateway/basic-public-ip-migrate-about |
 | Execute Basic-to-Standard public IP migration for VPN Gateway | https://learn.microsoft.com/en-us/azure/vpn-gateway/basic-public-ip-migrate-howto |
 | Create a Basic SKU VPN Gateway via PowerShell | https://learn.microsoft.com/en-us/azure/vpn-gateway/create-gateway-basic-sku-powershell |
 | Deploy a VPN Gateway using PowerShell | https://learn.microsoft.com/en-us/azure/vpn-gateway/create-gateway-powershell |
diff --git a/skills/azure-well-architected/SKILL.md b/skills/azure-well-architected/SKILL.md
index 5ed5c3c1..537cf5ca 100644
--- a/skills/azure-well-architected/SKILL.md
+++ b/skills/azure-well-architected/SKILL.md
@@ -1,9 +1,9 @@
 ---
 name: azure-well-architected
-description: Expert guidance for designing, assessing, and optimizing Azure workloads using Azure Well Architected. Covers design review checklists, recommendations, design principles, tradeoffs, service guides, workload patterns, and assessment questions. Use when designing AI, SAP, SaaS, HPC, or mission‑critical workloads, or optimizing cost, security, reliability, and ops, and other Azure Well Architected related development tasks.
+description: Expert guidance for designing, assessing, and optimizing Azure workloads using Azure Well Architected. Covers design review checklists, recommendations, design principles, tradeoffs, service guides, workload patterns, and assessment questions. Use when designing AI, SAP, SaaS, HPC, or mission‑critical workloads with WAF assessments, guides, checklists, and tradeoffs, and other Azure Well Architected related development tasks.
 compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
 metadata:
-  generated_at: "2026-04-19"
+  generated_at: "2026-04-26"
   generator: "docs2skills/1.0.0"
 ---
 # Azure Well Architected Skill
@@ -24,8 +24,8 @@ This skill requires **network access** to fetch documentation content:
 | Category | Lines | Description |
 |----------|-------|-------------|
-| Design Review Checklists | L35-L43 | Checklists to review Azure workloads for cost, operations, performance, reliability, and security, with concrete questions and best practices for each architecture pillar. |
-| Recommendations | L44-L115 | Guidance and best practices to design, operate, and improve Azure workloads across cost, operations, performance, reliability, security, and sustainability (FinOps, DevOps, DR, CI/CD, IAM, monitoring). |
+| Design Review Checklists | L35-L43 | Checklists to review Azure workloads for cost, operations, performance, reliability, and security best practices, with concrete questions and criteria for each architecture. |
+| Recommendations | L44-L115 | Guidance on cost optimization, reliability, performance, security, operations, DevOps, and sustainability best practices for architecting and running Azure workloads. |
 | Design Principles | L116-L133 | Design principles and patterns for cost, reliability, security, performance, ops, and sustainability across Azure workloads like AI, SAP, Oracle, AVS, AVD, SaaS, HPC, and mission‑critical systems. |
 | Tradeoffs | L134-L143 | Guidance on balancing cost, reliability, performance, security, and operations in Azure designs, including region/AZ choices and cross-pillar tradeoff analysis for architecture decisions |
 | Service Guides | L144-L194 | End-to-end design, security, networking, operations, monitoring, and optimization guidance for specific Azure services, aligned to Well-Architected Framework best practices. |
@@ -37,8 +37,8 @@ This skill requires **network access** to fetch documentation content:
 |-------|-----|
 | Use Cost Optimization design review checklist for Azure | https://learn.microsoft.com/en-us/azure/well-architected/cost-optimization/checklist |
 | Use operational excellence design review checklist | https://learn.microsoft.com/en-us/azure/well-architected/operational-excellence/checklist |
-| Use the Performance Efficiency design review checklist | https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/checklist |
-| Use reliability design review checklist for Azure | https://learn.microsoft.com/en-us/azure/well-architected/reliability/checklist |
+| Use Performance Efficiency design review checklist | https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/checklist |
+| Use reliability design checklist for Azure workloads | https://learn.microsoft.com/en-us/azure/well-architected/reliability/checklist |
 | Use the Security design review checklist for workloads | https://learn.microsoft.com/en-us/azure/well-architected/security/checklist |
 ### Recommendations
@@ -77,8 +77,8 @@ This skill requires **network access** to fetch documentation content:
 | Standardize development tools and processes for Azure teams | https://learn.microsoft.com/en-us/azure/well-architected/operational-excellence/tools-processes |
 | Design CI/CD-based workload development supply chains | https://learn.microsoft.com/en-us/azure/well-architected/operational-excellence/workload-supply-chain |
 | Implement capacity planning for Azure performance efficiency | https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/capacity-planning |
-| Collect and use performance data for Azure workloads | https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/collect-performance-data |
 | Continuously optimize performance efficiency in Azure | https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/continuous-performance-optimize |
+| Implement monitoring strategies for workload performance | https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/monitoring |
 | Optimize application code and infrastructure performance on Azure | https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/optimize-code-infrastructure |
 | Optimize data performance in Azure workloads | https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/optimize-data-performance |
 | Optimize operational tasks to protect workload performance | https://learn.microsoft.com/en-us/azure/well-architected/performance-efficiency/optimize-operational-tasks |
@@ -92,7 +92,7 @@ This skill requires **network access** to fetch documentation content:
 | Perform failure mode analysis for Azure reliability | https://learn.microsoft.com/en-us/azure/well-architected/reliability/failure-mode-analysis |
 | Identify and prioritize workload flows for reliability | https://learn.microsoft.com/en-us/azure/well-architected/reliability/identify-flows |
 | Define availability and recovery targets for reliability | https://learn.microsoft.com/en-us/azure/well-architected/reliability/metrics |
-| Design reliable monitoring and alerting for workloads | https://learn.microsoft.com/en-us/azure/well-architected/reliability/monitoring-alerting-strategy |
+| Implement monitoring strategies for workload reliability | https://learn.microsoft.com/en-us/azure/well-architected/reliability/monitoring |
 | Design redundancy strategies to meet reliability targets | https://learn.microsoft.com/en-us/azure/well-architected/reliability/redundancy |
 | Design reliable scaling strategies for Azure workloads | https://learn.microsoft.com/en-us/azure/well-architected/reliability/scaling |
 | Build self-healing and self-preservation into workloads | https://learn.microsoft.com/en-us/azure/well-architected/reliability/self-preservation |
diff --git a/skills/microsoft-foundry-classic/SKILL.md b/skills/microsoft-foundry-classic/SKILL.md
index cfc966d0..60205ec2 100644
--- a/skills/microsoft-foundry-classic/SKILL.md
+++ b/skills/microsoft-foundry-classic/SKILL.md
@@ -1,9 +1,9 @@
 ---
 name: microsoft-foundry-classic
-description: Expert knowledge for Microsoft Foundry Classic (aka Azure AI Foundry classic) development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when building Foundry agents, prompt flows, RAG/search, Azure OpenAI deployments, or multi-agent architectures, and other Microsoft Foundry Classic related development tasks. Not for Microsoft Foundry (use microsoft-foundry), Microsoft Foundry Local (use microsoft-foundry-local), Microsoft Foundry Tools (use microsoft-foundry-tools).
+description: Expert knowledge for Microsoft Foundry Classic (aka Azure AI Foundry classic) development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when building Foundry classic agents with Azure OpenAI, RAG, Prompt Flow, MCP tools, or realtime audio workloads, and other Microsoft Foundry Classic related development tasks. Not for Microsoft Foundry (use microsoft-foundry), Microsoft Foundry Local (use microsoft-foundry-local), Microsoft Foundry Tools (use microsoft-foundry-tools).
 compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation.
 metadata:
-  generated_at: "2026-04-19"
+  generated_at: "2026-04-26"
   generator: "docs2skills/1.0.0"
 ---
 # Microsoft Foundry Classic Skill
@@ -24,31 +24,33 @@ This skill requires **network access** to fetch documentation content:
 | Category | Lines | Description |
 |----------|-------|-------------|
-| Troubleshooting | L37-L47 | Diagnosing and fixing Foundry issues: compute/usage in prompt flow, deployments/monitoring, private endpoints, Azure OpenAI fine-tuning, risks & safety alerts, and known bugs/workarounds. |
-| Best Practices | L48-L61 | Best practices for prompts, safety, fine-tuning (incl. vision), latency/cost optimization, DeepSeek-R1 use, On Your Data, and evaluating/improving Foundry/Azure OpenAI chat apps |
-| Decision Making | L62-L88 | Guides for choosing models, regions, deployments, costs, PTU, streaming, SDKs, and upgrade/migration paths to design, scale, and optimize Foundry and Azure OpenAI solutions. |
-| Architecture & Design Patterns | L89-L98 | Architectural guidance for Foundry: multi-agent design, service/resource topology, high availability, and full BCDR (backup, recovery, DR, and resiliency) for Agent Service and Azure OpenAI workloads |
-| Limits & Quotas | L99-L115 | Quotas, rate limits, and capacity for Foundry Agents/Models and Azure OpenAI (deployments, regions, VNets, batch, dynamic quota), plus model availability/retirement and how to request increases. |
-| Security | L116-L163 | Security, privacy, and compliance for Foundry and Azure OpenAI: auth/RBAC, networking (VNet/Private Link), keys and Key Vault, guardrails/content filters, Azure Policy, and data handling. |
-| Configuration | L164-L221 | Configuring, monitoring, and evaluating Foundry classic agents, models, tools, and infrastructure, including BYO resources, RAG, safety, tracing, and Azure OpenAI monitoring. |
-| Integrations & Coding Patterns | L222-L342 | Patterns and code to integrate Foundry and Azure OpenAI with tools, data, and frameworks: search/RAG, files, SharePoint, Bing, MCP/OpenAPI, Functions/Logic Apps, SDKs, fine-tuning, and realtime audio. |
-| Deployment | L343-L360 | Deploying Foundry hubs, projects, and models (managed/serverless), setting up agents and prompt flows, and automating deployments/evaluations via CLI, Bicep, Terraform, VS Code, and CI/CD. |
+| Troubleshooting | L37-L47 | Diagnosing and fixing Foundry classic issues: compute/Prompt Flow limits, deployments/monitoring, private endpoints, Azure OpenAI fine-tuning, risks & safety alerts, and known portal problems. |
+| Best Practices | L48-L63 | Best practices for Foundry/Azure OpenAI: system and safety prompts, DeepSeek-R1, eval flows, fine-tuning (incl. vision), latency/perf, On Your Data, HA design, and chat app evaluation. |
+| Decision Making | L64-L93 | Guidance on choosing models, regions, deployment types, costs, PTU, lifecycle/migration, and when to fine-tune or upgrade Azure OpenAI/Foundry for your scenario |
+| Architecture & Design Patterns | L94-L101 | Architectural guidance for Foundry: multi-agent design, service/resource topology, high availability, and full BCDR (backup, recovery, DR, and resiliency) for Agent Service and Azure OpenAI workloads |
+| Limits & Quotas | L102-L116 | Quotas, rate limits, and regional support for Foundry Agents, Models, Azure OpenAI (incl. batch, dynamic quota, fine-tuning), plus how to view, manage, and request quota increases |
+| Security | L117-L164 | Security, privacy, and compliance for Foundry: auth/RBAC, keys and Key Vault, network isolation/Private Link, Azure Policy guardrails, content safety/filters, and data handling for models and tools. |
+| Configuration | L165-L216 | Configuring, monitoring, and evaluating Foundry classic agents, models, networking, storage, and compute, including Azure OpenAI, RAG, safety, tracing, and continuous evaluation setups. |
+| Integrations & Coding Patterns | L217-L348 | Patterns and code for integrating Foundry and Azure OpenAI with tools, data sources, and frameworks (RAG, search, files, web, functions, MCP, Prompt Flow, LangChain, evaluations, and realtime audio). |
+| Deployment | L349-L367 | Deploying Foundry hubs, models, agents, and prompt flows using CLI, Bicep, Terraform, VS Code, and CI/CD, including serverless/managed options and regional feature availability. |
 ### Troubleshooting
 | Topic | URL |
 |-------|-----|
 | Resolve common issues in Microsoft Foundry (classic) | https://learn.microsoft.com/en-us/azure/foundry-classic/faq |
-| Troubleshoot compute and usage issues in prompt flow | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-troubleshoot |
+| Troubleshoot Prompt Flow compute and usage issues | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-troubleshoot |
 | Troubleshoot Foundry deployments and monitoring issues | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/troubleshoot-deploy-and-monitor |
 | Troubleshoot Foundry private endpoint connection errors | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/troubleshoot-secure-connection-project |
 | Troubleshoot Azure OpenAI fine-tuning issues in Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/fine-tuning-troubleshoot |
 | Monitor and troubleshoot Risks & Safety in Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/risks-safety-monitor |
-| Resolve known issues and workarounds in Microsoft Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/reference/foundry-known-issues |
+| Resolve known issues in Microsoft Foundry classic portal | https://learn.microsoft.com/en-us/azure/foundry-classic/reference/foundry-known-issues |
 ### Best Practices
 | Topic | URL |
 |-------|-----|
 | Deploy and use DeepSeek-R1 reasoning model in Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/tutorials/get-started-deepseek-r1 |
+| Develop custom evaluation flows in Prompt Flow classic | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-develop-evaluation |
+| Design high availability for Foundry Agent Service | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/high-availability-resiliency |
 | Design effective system messages for Azure OpenAI | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/advanced-prompt-engineering |
 | Apply safety system message templates in Azure OpenAI | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/safety-system-message-templates |
 | Author safety-focused system messages in Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/system-message |
@@ -65,7 +67,7 @@ This skill requires **network access** to fetch documentation content:
 | Select Azure OpenAI models and regions for Foundry Agents | https://learn.microsoft.com/en-us/azure/foundry-classic/agents/concepts/model-region-support |
 | Decide when and how to fine-tune models in Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/fine-tuning-overview |
 | Use Foundry model benchmarks and leaderboards | https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/model-benchmarks |
-| Plan for Foundry model deprecation and retirement | https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/model-lifecycle-retirement |
+| Handle deprecation and retirement of managed compute models | https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/model-retirement-managed-compute |
 | Plan Microsoft Foundry classic rollout at scale | https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/planning |
 | Choose the right Azure resource type for Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/resource-types |
 | Choose Microsoft Foundry deployment types and residency | https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/concepts/deployment-types |
@@ -74,13 +76,16 @@ This skill requires **network access** to fetch documentation content:
 | Decide between GPT-5 and GPT-4.1 in Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/how-to/model-choice-guide |
 | Compare models with Foundry leaderboards | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/benchmark-model-in-catalog |
 | Plan and manage costs for Microsoft Foundry hubs | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/costs-plan-manage |
-| Choose Microsoft Foundry SDKs and endpoints | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/sdk-overview |
 | Migrate from hub-based to new Foundry projects | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/migrate-project |
+| Plan migration from Prompt Flow to Agent Framework | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-migration-overview |
 | Decide when and how to upgrade Azure OpenAI to Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/upgrade-azure-openai |
 | Choose content streaming and filtering modes in Azure OpenAI | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/content-streaming |
-| Enable and evaluate Foundry priority processing tiers | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/priority-processing |
+| Use Foundry model retirement schedule for migrations | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/model-retirement-schedule |
+| Plan around Foundry model lifecycle and support | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/model-retirements |
+| Choose and enable priority processing for Foundry models | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/priority-processing |
 | Evaluate 2024 Azure OpenAI provisioned throughput updates | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/provisioned-migration |
 | Understand provisioned throughput for Foundry models | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/provisioned-throughput |
+| Identify retired Foundry models and alternatives |
https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/retired-models | | Plan using your data with Azure OpenAI deployments | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/use-your-data | | Plan PTU costs and capacity for Microsoft Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/provisioned-throughput-onboarding | | Migrate from preview to GA Realtime API protocol | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/realtime-audio-preview-api-migration-guide | @@ -93,19 +98,15 @@ This skill requires **network access** to fetch documentation content: | Plan disaster recovery for Foundry Agent Service | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/agent-service-disaster-recovery | | Recover Foundry Agent Service from resource and data loss | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/agent-service-operator-disaster-recovery | | Recover Foundry Agent Service from platform outages | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/agent-service-platform-disaster-recovery | -| Design high availability and resiliency for Foundry projects | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/high-availability-resiliency | -| Design BCDR architecture for Azure OpenAI workloads | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/business-continuity-disaster-recovery | ### Limits & Quotas | Topic | URL | |-------|-----| | Review quotas and limits for Foundry Agent Service | https://learn.microsoft.com/en-us/azure/foundry-classic/agents/quotas-limits | -| Evaluate Foundry classic with regional limits and VNet | https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/evaluation-regions-limits-virtual-network | +| Rate limits and regional support for Foundry evaluations | https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/evaluation-regions-limits-virtual-network | | Reference 
quotas and limits for Foundry Models | https://learn.microsoft.com/en-us/azure/foundry-classic/foundry-models/quotas-limits | | Manage and request quotas for Microsoft Foundry hub resources | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/hub-quota | | Manage and increase model deployment quotas in Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/quota | -| Review retired Azure OpenAI model availability | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/legacy-models | -| Azure OpenAI model availability and retirement schedule | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/model-retirements | | Azure OpenAI FAQ with usage limits and policies | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/faq | | Use Azure OpenAI global batch processing | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/batch | | Use dynamic quota for Azure OpenAI deployments | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/dynamic-quota | @@ -118,7 +119,6 @@ This skill requires **network access** to fetch documentation content: |-------|-----| | Configure Browser Automation tool securely for Foundry agents | https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/tools-classic/browser-automation | | Securely use Foundry Computer Use tool with agents | https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/tools-classic/computer-use | -| Set up private networking and VNet integration for Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/virtual-networks | | Configure authentication and authorization in Microsoft Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/authentication-authorization-foundry | | Configure customer-managed keys for Microsoft Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/encryption-keys-portal | | Configure 
customer-managed keys for Foundry hub projects | https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/hub-encryption-keys-portal | @@ -144,7 +144,7 @@ This skill requires **network access** to fetch documentation content: | Configure Private Link for Microsoft Foundry hubs | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/hub-configure-private-link | | Set up managed virtual network for Foundry projects | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/managed-virtual-network | | Use built-in Azure Policy to govern Foundry model deployments | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/model-deployment-policy | -| Configure the Content Safety tool in prompt flows | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/content-safety-tool | +| Apply Content Safety tool in Prompt Flow classic | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/content-safety-tool | | Secure configuration for Foundry playground chat on your data | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/secure-data-playground | | Set up Azure Key Vault connection for Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/set-up-key-vault-connection | | Use Content Credentials for Azure OpenAI images | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/content-credentials | @@ -157,6 +157,7 @@ This skill requires **network access** to fetch documentation content: | Add Azure OpenAI to a network security perimeter | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/network-security-perimeter | | Apply Azure RBAC roles to Azure OpenAI resources | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/role-based-access-control | | Configure custom block lists for Azure OpenAI content filtering | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/use-blocklists | +| 
Understand data handling and security for Claude in Microsoft Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/responsible-ai/claude-models/data-privacy | | Apply copyright commitment mitigations for Azure OpenAI in Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/responsible-ai/openai/customer-copyright-commitment | | Understand data handling and privacy for Azure Direct Models | https://learn.microsoft.com/en-us/azure/foundry-classic/responsible-ai/openai/data-privacy | | Understand limited access policy for Azure OpenAI in Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/responsible-ai/openai/limited-access | @@ -169,6 +170,7 @@ This skill requires **network access** to fetch documentation content: | Configure environment and infrastructure for Foundry Agent Service | https://learn.microsoft.com/en-us/azure/foundry-classic/agents/environment-setup | | Monitor Foundry Agent Service with Azure Monitor and KQL | https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/metrics | | Configure Foundry Agent Service to use your own Azure resources | https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/use-your-own-resources | +| Configure private networking for Foundry Agent Service | https://learn.microsoft.com/en-us/azure/foundry-classic/agents/how-to/virtual-networks | | Use Azure Monitor metrics and logs for Foundry Agent Service | https://learn.microsoft.com/en-us/azure/foundry-classic/agents/reference/monitor-service | | Configure and use built-in Foundry evaluators | https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/built-in-evaluators | | Configure agent evaluators for Azure AI agents | https://learn.microsoft.com/en-us/azure/foundry-classic/concepts/evaluation-evaluators/agent-evaluators | @@ -200,13 +202,6 @@ This skill requires **network access** to fetch documentation content: | Create and use vector indexes for RAG in Foundry | 
https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/index-add | | Configure continuous monitoring for Foundry AI applications | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/monitor-applications | | Monitor quality and token usage for prompt flow apps | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/monitor-quality-safety | -| Use GPT-4 Turbo with Vision tool in prompt flows | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/azure-open-ai-gpt-4v-tool | -| Configure the Embedding tool in Foundry prompt flows | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/embedding-tool | -| Configure the Index Lookup tool for RAG flows | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/index-lookup-tool | -| Configure the LLM tool in Foundry prompt flows | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/llm-tool | -| Configure the Prompt tool in Foundry prompt flows | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/prompt-tool | -| Configure the Python tool in Foundry prompt flows | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/python-tool | -| Configure the Rerank tool to improve search results | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/rerank-tool | | Enable and interpret Prompt Shields in Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/concepts/content-filter-prompt-shields | | Configure Azure Blob Storage for OpenAI Batch I/O | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/batch-blob-storage | | Configure and run Azure OpenAI model evaluations | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/evaluations | @@ -263,15 +258,26 @@ This skill requires **network access** to fetch documentation content: | Integrate LlamaIndex with 
Microsoft Foundry models | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/llama-index | | Run AI Red Teaming Agent in the cloud with Foundry SDK | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/run-ai-red-teaming-cloud | | Run AI Red Teaming Agent locally with Azure AI SDK | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/run-scans-ai-red-teaming-agent | +| Select and use Microsoft Foundry SDKs and endpoints | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/sdk-overview | | Use Semantic Kernel with Foundry model catalog | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/semantic-kernel | | View OpenAI SDK traces with OpenTelemetry in Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/trace-application | | Configure MCP server tools for Foundry agents in VS Code | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/develop/vs-code-agents-mcp | +| Build Prompt Flow workflows in Foundry classic | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-develop | +| Process images with Prompt Flow in Foundry classic | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-process-image | | Deploy and use CXRReportGen healthcare AI model | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/healthcare-ai/deploy-cxrreportgen | | Deploy and invoke MedImageInsight embeddings model | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/healthcare-ai/deploy-medimageinsight | | Use MedImageParse medical image segmentation models | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/healthcare-ai/deploy-medimageparse | +| Rebuild Prompt Flow workflows with Agent Framework | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/how-to-migrate-prompt-flow-to-agent-framework | | Create and manage hub-scoped connections in Foundry | 
https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/hub-connections-add | | Migrate Azure AI Inference SDK apps to OpenAI SDK | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/model-inference-to-openai-migration | -| Use the Serp API tool within Foundry prompt flows | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/serp-api-tool | +| Configure GPT-4 Turbo with Vision tool in Prompt Flow | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/azure-open-ai-gpt-4v-tool | +| Use the Embedding tool in Prompt Flow classic | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/embedding-tool | +| Use the Index Lookup tool in Prompt Flow classic | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/index-lookup-tool | +| Use the LLM tool in Prompt Flow classic | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/llm-tool | +| Configure the Prompt tool in Prompt Flow classic | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/prompt-tool | +| Use the Python tool in Prompt Flow classic | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/python-tool | +| Configure the Rerank tool for RAG in Prompt Flow | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/rerank-tool | +| Integrate Serp API tool with Prompt Flow classic | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/prompt-flow-tools/serp-api-tool | | Use image-to-text models from Foundry catalog | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/use-image-models | | Use Azure OpenAI v1 API in Foundry Models | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/api-version-lifecycle | | Get started with Azure OpenAI audio generation | 
https://learn.microsoft.com/en-us/azure/foundry-classic/openai/audio-completions-quickstart | @@ -356,5 +362,6 @@ This skill requires **network access** to fetch documentation content: | Run Foundry evaluations in GitHub Actions pipelines | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/evaluation-github-action | | Deploy fine-tuned models via serverless API in Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/fine-tune-serverless | | Deploy prompt flows as managed online endpoints | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/flow-deploy | +| Deploy migrated Agent Framework workflows to Azure | https://learn.microsoft.com/en-us/azure/foundry-classic/how-to/how-to-deploy-migrated-agent-framework-workflow | | Deploy fine-tuned Azure OpenAI models in Foundry | https://learn.microsoft.com/en-us/azure/foundry-classic/openai/how-to/fine-tuning-deploy | | Check Microsoft Foundry feature availability by region | https://learn.microsoft.com/en-us/azure/foundry-classic/reference/region-support | \ No newline at end of file diff --git a/skills/microsoft-foundry-tools/SKILL.md b/skills/microsoft-foundry-tools/SKILL.md index fbaa18ba..2617f28e 100644 --- a/skills/microsoft-foundry-tools/SKILL.md +++ b/skills/microsoft-foundry-tools/SKILL.md @@ -1,14 +1,14 @@ --- name: microsoft-foundry-tools -description: Expert knowledge for Microsoft Foundry Tools (aka Azure AI services, Azure Cognitive Services) development including best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, and integrations & coding patterns. Use when using Content Moderator, Content Understanding analyzers, document extraction, routing, or secure Foundry setup, and other Microsoft Foundry Tools related development tasks. Not for Microsoft Foundry (use microsoft-foundry), Microsoft Foundry Classic (use microsoft-foundry-classic), Microsoft Foundry Local (use microsoft-foundry-local). 
+description: Expert knowledge for Microsoft Foundry Tools (aka Azure AI services, Azure Cognitive Services) development including best practices, decision making, limits & quotas, security, configuration, and integrations & coding patterns. Use when using Content Moderator, Content Understanding analyzers, Foundry Standard/Pro tasks, or securing Foundry access, and other Microsoft Foundry Tools related development tasks. Not for Microsoft Foundry (use microsoft-foundry), Microsoft Foundry Classic (use microsoft-foundry-classic), Microsoft Foundry Local (use microsoft-foundry-local). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. metadata: - generated_at: "2026-04-19" + generated_at: "2026-04-26" generator: "docs2skills/1.0.0" --- # Microsoft Foundry Tools Skill -This skill provides expert guidance for Microsoft Foundry Tools. Covers best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, and integrations & coding patterns. It combines local quick-reference content with remote documentation fetching capabilities. +This skill provides expert guidance for Microsoft Foundry Tools. Covers best practices, decision making, limits & quotas, security, configuration, and integrations & coding patterns. It combines local quick-reference content with remote documentation fetching capabilities. ## How to Use This Skill @@ -24,13 +24,12 @@ This skill requires **network access** to fetch documentation content: | Category | Lines | Description | |----------|-------|-------------| -| Best Practices | L35-L41 | Guidance on improving Content Understanding accuracy, grounding and confidence in document extraction, and migrating from preview to GA Content Understanding APIs. | -| Decision Making | L42-L48 | Guides for choosing the right Azure AI/Foundry tool for document processing and estimating Content Understanding costs and pricing plans. 
| -| Architecture & Design Patterns | L49-L53 | Designing and configuring how Content Understanding analyzers are mapped to specific model deployments, including routing strategies and deployment architecture patterns. | -| Limits & Quotas | L54-L61 | Quotas, limits, and language support for Azure Content Moderator and Content Understanding, including image/list caps, API usage constraints, and .NET sample considerations. | -| Security | L62-L66 | Securing Foundry: auth methods, Entra-only access, keys/Key Vault, CMK encryption, DLP, VNet rules, API key rotation, Azure Policy and regulatory compliance configuration | -| Configuration | L67-L77 | Configuring Foundry environments and resources: credentials, subdomains, ARM provisioning, logging, and detailed setup for Content Understanding analyzers, layouts, images, faces, and routing. | -| Integrations & Coding Patterns | L78-L92 | Using Content Moderator and Content Understanding via REST/.NET: text/image/video moderation, term lists, multimodal analysis, and consuming Markdown/structured outputs | +| Best Practices | L34-L40 | Guidance on improving Content Understanding accuracy, grounding and confidence in document extraction, and migrating from preview to GA Content Understanding APIs. | +| Decision Making | L41-L48 | Guides for choosing document processing tools, deployment options, Foundry vs Content Understanding Studio, and estimating/pricing Content Understanding workloads. | +| Limits & Quotas | L49-L56 | Quotas and limits for Content Moderator and Content Understanding: image/list caps, usage constraints, and supported languages, plus how to stay within these limits in .NET samples. 
| +| Security | L57-L61 | Securing Foundry: auth methods, Entra-only access, keys/Key Vault, CMK encryption, DLP, VNet rules, API key rotation, Azure Policy and regulatory compliance configuration. | +| Configuration | L62-L71 | Configuring Content Understanding analyzers and layout/face analysis, customizing prebuilt and custom analyzers, and setting up Standard/Pro task behavior in Foundry classic. | +| Integrations & Coding Patterns | L72-L86 | Using Content Moderator and Content Understanding via REST/.NET: text/image/video moderation, term lists, multimodal analysis, and consuming Markdown/structured outputs. | ### Best Practices | Topic | URL | @@ -43,21 +42,17 @@ This skill requires **network access** to fetch documentation content: | Topic | URL | |-------|-----| | Choose Azure AI tool for document processing | https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/choosing-right-ai-tool | +| Choose model deployment options for analyzers | https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/models-deployments | | Choose between Foundry and Content Understanding Studio features | https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/foundry-vs-content-understanding-studio | | Estimate and plan Content Understanding pricing | https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/pricing-explainer | -### Architecture & Design Patterns -| Topic | URL | -|-------|-----| -| Map Content Understanding analyzers to model deployments | https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/models-deployments | - ### Limits & Quotas | Topic | URL | |-------|-----| | Use Content Moderator image lists within quota limits | https://learn.microsoft.com/en-us/azure/ai-services/content-moderator/image-lists-quickstart-dotnet | | Use supported languages in Content Moderator API | 
https://learn.microsoft.com/en-us/azure/ai-services/content-moderator/language-support | | Apply Content Moderator .NET samples with list limits | https://learn.microsoft.com/en-us/azure/ai-services/content-moderator/samples-dotnet | -| Review Azure Content Understanding service quotas and limits | https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/service-limits | +| Review Content Understanding service quotas and limits | https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/service-limits | ### Security | Topic | URL | @@ -67,11 +62,10 @@ This skill requires **network access** to fetch documentation content: ### Configuration | Topic | URL | |-------|-----| -| Configure Content Understanding analyzers and parameters | https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/analyzer-reference | +| Configure and reference Content Understanding analyzers | https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/analyzer-reference | | Use and customize Content Understanding prebuilt analyzers | https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/concepts/prebuilt-analyzers | | Configure document layout analysis with Content Understanding | https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/document/elements | | Configure face detection and recognition in Content Understanding | https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/face/overview | -| Configure classification and routing in Content Understanding Studio | https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/how-to/classification-content-understanding-studio | | Configure Standard and Pro tasks in Foundry classic | https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/how-to/content-understanding-foundry-classic | | Build and refine custom analyzers in Content Understanding Studio | 
https://learn.microsoft.com/en-us/azure/ai-services/content-understanding/how-to/customize-analyzer-content-understanding-studio | diff --git a/skills/microsoft-foundry/SKILL.md b/skills/microsoft-foundry/SKILL.md index 00f4a1bd..8b321bd4 100644 --- a/skills/microsoft-foundry/SKILL.md +++ b/skills/microsoft-foundry/SKILL.md @@ -1,9 +1,9 @@ --- name: microsoft-foundry -description: Expert knowledge for Microsoft Foundry (aka Azure AI Foundry) development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when building Foundry agents with Azure OpenAI (RAG, evals, vector/file search, fine-tuning, or realtime/audio), and other Microsoft Foundry related development tasks. Not for Microsoft Foundry Classic (use microsoft-foundry-classic), Microsoft Foundry Local (use microsoft-foundry-local), Microsoft Foundry Tools (use microsoft-foundry-tools). +description: Expert knowledge for Microsoft Foundry (aka Azure AI Foundry) development including troubleshooting, best practices, decision making, architecture & design patterns, limits & quotas, security, configuration, integrations & coding patterns, and deployment. Use when building Foundry agents with Azure OpenAI, MCP tools, RAG/search, private networking, or CI-based evals, and other Microsoft Foundry related development tasks. Not for Microsoft Foundry Classic (use microsoft-foundry-classic), Microsoft Foundry Local (use microsoft-foundry-local), Microsoft Foundry Tools (use microsoft-foundry-tools). compatibility: Requires network access. Uses mcp_microsoftdocs:microsoft_docs_fetch or fetch_webpage to retrieve documentation. 
 metadata:
-  generated_at: "2026-04-19"
+  generated_at: "2026-04-26"
   generator: "docs2skills/1.0.0"
 ---
 # Microsoft Foundry Skill
@@ -25,14 +25,14 @@ This skill requires **network access** to fetch documentation content:
 | Category | Lines | Description |
 |----------|-------|-------------|
 | Troubleshooting | L37-L42 | Diagnosing evaluation/observability problems in Foundry (metrics, logging, tracing) and resolving known platform issues with documented workarounds. |
-| Best Practices | L43-L54 | Best practices for configuring tools, prompts, evaluation, safety, latency, and fine-tuning (incl. vision models) to build high-quality, efficient Azure AI/Foundry agents |
-| Decision Making | L55-L85 | Guides for choosing models, deployments, costs, and tools, plus migration and upgrade paths (Azure OpenAI, GitHub Models, Assistants API) and web/Bing grounding decisions. |
-| Architecture & Design Patterns | L86-L97 | Architectural patterns for Foundry agents: standard setup, RAG/indexing, HA/DR, regional recovery, provisioned throughput, spillover traffic, and LLM routing optimization. |
-| Limits & Quotas | L98-L112 | Limits, quotas, regions, and availability for Foundry and Azure OpenAI models, agents, evals, vector/file search, batch, fine-tuning, and partner models. |
-| Security | L113-L145 | Security, identity, networking, and compliance for Foundry: auth/RBAC, keys & encryption, private networking, guardrails, safety policies, content safety, and data privacy for models and agents |
-| Configuration | L146-L192 | Configuring Foundry agents, models, tools, storage, monitoring, security, and Azure OpenAI features (search, memory, tracing, fine-tuning, prompt controls, and external integrations). |
-| Integrations & Coding Patterns | L193-L257 | Patterns and code for integrating Foundry agents and models with tools, APIs, LangChain/LangGraph, Azure OpenAI, realtime/audio, search, safety, tracing, and enterprise systems. |
-| Deployment | L258-L273 | Deploying agents and models: infra setup, hosting, publishing to Azure/M365/Teams, CI/CD, workflows, custom/fine-tuned/Fireworks models, and managing deployment lifecycle. |
+| Best Practices | L43-L54 | Best practices for prompts, system messages, tools, evaluation, safety, latency/throughput, and fine-tuning (incl. vision) to improve Foundry/Azure AI agent quality and performance |
+| Decision Making | L55-L86 | Guidance on choosing models, SDKs, deployment types, and grounding tools, plus planning migrations, upgrades, costs, and versioning for Foundry-based workloads |
+| Architecture & Design Patterns | L87-L98 | Designing resilient, high-availability Foundry agents: isolated setups, RAG/indexing patterns, DR and regional recovery, and scaling/provisioned throughput with spillover traffic management. |
+| Limits & Quotas | L99-L115 | Quotas, rate limits, regions, and availability for Foundry models/agents (incl. Azure OpenAI), plus how to configure limits, routing, batch/eval usage, and cost controls. |
+| Security | L116-L150 | Security, identity, and compliance for Foundry: auth/RBAC, agent identities, private networking, keys/encryption, guardrails/safety, MCP and Agent2Agent security, and data privacy controls. |
+| Configuration | L151-L203 | Configuring and operating Foundry agents and models: hosting, endpoints, tools, storage, security/guardrails, monitoring/tracing, Azure OpenAI usage, and environment/infra setup. |
+| Integrations & Coding Patterns | L204-L265 | Integrating Foundry agents and OpenAI with tools, models, and services: APIs, function calling, MCP, LangChain/LangGraph, search, speech, image, realtime audio, tracing, safety, and fine-tuning. |
+| Deployment | L266-L276 | Deploying agents and models to Foundry (portal, CLI, Bicep, containers), plus running evaluations in CI (GitHub/Azure DevOps) and hosting custom/fine-tuned/Fireworks models |
 ### Troubleshooting
 | Topic | URL |
 |-------|-----|
@@ -43,7 +43,7 @@ This skill requires **network access** to fetch documentation content:
 ### Best Practices
 | Topic | URL |
 |-------|-----|
-| Apply tool configuration best practices for agents | https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/tool-best-practice |
+| Apply tool usage best practices in Foundry agents | https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/tool-best-practice |
 | Evaluate Azure AI agents with task-specific metrics | https://learn.microsoft.com/en-us/azure/foundry/concepts/evaluation-evaluators/agent-evaluators |
 | Evaluate Foundry agents with built-in quality and safety tests | https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/evaluate-agent |
 | Improve Foundry agent prompts with Prompt Optimizer | https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/prompt-optimizer |
@@ -56,10 +56,11 @@ This skill requires **network access** to fetch documentation content:
 | Topic | URL |
 |-------|-----|
 | Migrate from Assistants API to Foundry Agent Service | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/migrate |
+| Decide and migrate to the new Foundry agent model | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/migrate-agent-applications |
+| Migrate hosted agents to refreshed Foundry preview | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/migrate-hosted-agent-preview |
 | Use Bing grounding tools with agents | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/bing-tools |
 | Choose web grounding tools for Foundry agents | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/web-overview |
 | Use Foundry model benchmarks and leaderboards for selection | https://learn.microsoft.com/en-us/azure/foundry/concepts/model-benchmarks |
-| Plan for Foundry model deprecation and retirement | https://learn.microsoft.com/en-us/azure/foundry/concepts/model-lifecycle-retirement |
 | Plan Microsoft Foundry rollout and environment strategy | https://learn.microsoft.com/en-us/azure/foundry/concepts/planning |
 | Optimize Foundry model cost and performance | https://learn.microsoft.com/en-us/azure/foundry/control-plane/how-to-optimize-cost-performance |
 | Choose Foundry deployment types and data residency options | https://learn.microsoft.com/en-us/azure/foundry/foundry-models/concepts/deployment-types |
@@ -70,15 +71,15 @@ This skill requires **network access** to fetch documentation content:
 | Decide between GPT-5 and GPT-4.1 for your use case | https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/model-choice-guide |
 | Upgrade workloads from GitHub Models to Foundry Models | https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/quickstart-github-models |
 | Use Foundry model leaderboard to compare and choose models | https://learn.microsoft.com/en-us/azure/foundry/how-to/benchmark-model-in-catalog |
+| Choose Microsoft Foundry SDKs and endpoints | https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/sdk-overview |
 | Choose integration patterns for Microsoft Foundry APIs | https://learn.microsoft.com/en-us/azure/foundry/how-to/integrate-with-other-apps |
 | Migrate from Azure AI Inference SDK to OpenAI SDK | https://learn.microsoft.com/en-us/azure/foundry/how-to/model-inference-to-openai-migration |
 | Plan migration from classic Foundry portal | https://learn.microsoft.com/en-us/azure/foundry/how-to/navigate-from-classic |
 | Decide and execute upgrade from Azure OpenAI to Foundry | https://learn.microsoft.com/en-us/azure/foundry/how-to/upgrade-azure-openai |
 | Use Ask AI to upgrade or switch Foundry models | https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/optimization-model-upgrade |
 | Choose content streaming and filtering modes in Foundry | https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/content-streaming |
-| Review retired Azure OpenAI models in Foundry | https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/legacy-models |
-| Manage Azure OpenAI model deprecations in Foundry | https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/model-retirements |
-| Enable and choose priority processing for Foundry models | https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/priority-processing |
+| Plan migrations using Foundry model retirement schedule | https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/model-retirement-schedule |
+| Select alternatives for retired Foundry models | https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/retired-models |
 | Estimate and manage fine-tuning costs in Foundry | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/fine-tuning-cost-management |
 | Plan and estimate PTU costs in Microsoft Foundry | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/provisioned-throughput-onboarding |
 | Migrate from preview to GA Realtime API | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/realtime-audio-preview-api-migration-guide |
@@ -91,7 +92,7 @@ This skill requires **network access** to fetch documentation content:
 | Plan disaster recovery for Foundry Agent Service in standard mode | https://learn.microsoft.com/en-us/azure/foundry/how-to/agent-service-disaster-recovery |
 | Recover Foundry Agent Service from resource and data loss | https://learn.microsoft.com/en-us/azure/foundry/how-to/agent-service-operator-disaster-recovery |
 | Recover Foundry Agent Service from regional platform outages | https://learn.microsoft.com/en-us/azure/foundry/how-to/agent-service-platform-disaster-recovery |
-| Plan high availability and resiliency for Foundry projects and agents | https://learn.microsoft.com/en-us/azure/foundry/how-to/high-availability-resiliency |
+| Design high availability and resiliency for Foundry | https://learn.microsoft.com/en-us/azure/foundry/how-to/high-availability-resiliency |
 | Plan provisioned throughput architecture for Foundry models | https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/provisioned-throughput |
 | Design spillover traffic management for provisioned deployments | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/spillover-traffic-management |
 
@@ -100,7 +101,8 @@ This skill requires **network access** to fetch documentation content:
 |-------|-----|
 | Quotas, limits, and regions for Foundry Agent Service | https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/limits-quotas-regions |
 | Use vector stores and file search limits in agents | https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/vector-stores |
-| Evaluation rate limits and region support in Foundry | https://learn.microsoft.com/en-us/azure/foundry/concepts/evaluation-regions-limits-virtual-network |
+| Manage hosted agents and traffic routing in Foundry | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/manage-hosted-agent |
+| Check Foundry evaluation regions and rate limits | https://learn.microsoft.com/en-us/azure/foundry/concepts/evaluation-regions-limits-virtual-network |
 | Configure token rate limits and quotas for Foundry models | https://learn.microsoft.com/en-us/azure/foundry/control-plane/how-to-enforce-limits-models |
 | Capabilities and availability of partner Foundry models | https://learn.microsoft.com/en-us/azure/foundry/foundry-models/concepts/models-from-partners |
 | Model catalog details for Azure-sold Foundry models | https://learn.microsoft.com/en-us/azure/foundry/foundry-models/concepts/models-sold-directly-by-azure |
@@ -108,6 +110,7 @@ This skill requires **network access** to fetch documentation content:
 | Use Azure OpenAI global batch processing efficiently | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/batch |
 | Use reinforcement fine-tuning with cost limits | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/reinforcement-fine-tuning |
 | Reference Azure OpenAI quotas and limits in Foundry | https://learn.microsoft.com/en-us/azure/foundry/openai/quotas-limits |
+| Reference quotas and limits for Azure OpenAI in Azure Government | https://learn.microsoft.com/en-us/azure/foundry/openai/quotas-limits-gov |
 | Understand limited access policy for Azure OpenAI | https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/openai/limited-access |
 
 ### Security
@@ -115,10 +118,12 @@ This skill requires **network access** to fetch documentation content:
 |-------|-----|
 | Configure and govern agent identities in Microsoft Foundry | https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/agent-identity |
 | Configure authentication methods for Agent2Agent tools | https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/agent-to-agent-authentication |
+| Configure hosted agent permissions and RBAC in Foundry | https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/hosted-agent-permissions |
+| Govern and secure Foundry agents with Microsoft Agent 365 | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/agent-365 |
 | Configure authentication for MCP servers in Foundry | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/mcp-authentication |
 | Configure computer use tool securely for agents | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/computer-use |
 | Govern MCP tools with an AI gateway in Foundry | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/governance |
-| Set up private networking for Foundry Agent Service | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/virtual-networks |
+| Configure private networking for Foundry Agent Service | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/virtual-networks |
 | Configure authentication and authorization for Microsoft Foundry | https://learn.microsoft.com/en-us/azure/foundry/concepts/authentication-authorization-foundry |
 | Configure customer-managed keys for Microsoft Foundry encryption | https://learn.microsoft.com/en-us/azure/foundry/concepts/encryption-keys-portal |
 | Configure RBAC roles and scopes for Microsoft Foundry | https://learn.microsoft.com/en-us/azure/foundry/concepts/rbac-foundry |
@@ -127,7 +132,7 @@ This skill requires **network access** to fetch documentation content:
 | Create guardrail policies for model deployments | https://learn.microsoft.com/en-us/azure/foundry/control-plane/quickstart-create-guardrail-policy |
 | Configure keyless Entra ID authentication for Foundry Models | https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/configure-entra-id |
 | Add Microsoft Foundry resources to a network security perimeter | https://learn.microsoft.com/en-us/azure/foundry/how-to/add-foundry-to-network-security-perimeter |
-| Configure Private Link network isolation for Microsoft Foundry | https://learn.microsoft.com/en-us/azure/foundry/how-to/configure-private-link |
+| Configure private link network isolation for Foundry | https://learn.microsoft.com/en-us/azure/foundry/how-to/configure-private-link |
 | Configure and manage Microsoft Foundry connections | https://learn.microsoft.com/en-us/azure/foundry/how-to/connections-add |
 | Create custom Azure Policies to govern Microsoft Foundry | https://learn.microsoft.com/en-us/azure/foundry/how-to/custom-policy-definition |
 | Apply Azure AI Content Safety in LangChain agents | https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/langchain-middleware |
@@ -140,27 +145,34 @@ This skill requires **network access** to fetch documentation content:
 | Author safety-focused system messages for Azure OpenAI | https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/system-message |
 | Apply safety evaluation to fine-tuned models | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/fine-tuning-safety-evaluation |
 | Data privacy and security for Foundry Agent Service | https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/agents/data-privacy-security |
-| Understand data privacy and security for Anthropic Claude in Foundry | https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/claude-models/data-privacy |
+| Understand data handling and privacy for Claude in Foundry | https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/claude-models/data-privacy |
 | Understand data, privacy, and security for Azure Direct Models | https://learn.microsoft.com/en-us/azure/foundry/responsible-ai/openai/data-privacy |
 
 ### Configuration
 | Topic | URL |
 |-------|-----|
 | Configure capability hosts for Foundry Agent Service | https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/capability-hosts |
+| Configure environment infrastructure for Foundry Agent Service | https://learn.microsoft.com/en-us/azure/foundry/agents/environment-setup |
+| Publish Foundry agents as agent applications | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/agent-applications |
+| Configure and share Microsoft Foundry agent endpoints | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/configure-agent |
+| Configure and deploy hosted agents via SDK or REST | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/deploy-hosted-agent |
 | Manage and disable Grounding with Bing in Foundry | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/manage-grounding-with-bing |
+| Create and manage hosted agent sessions via APIs | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/manage-hosted-sessions |
 | Configure and manage memory in Foundry agents | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/memory-usage |
 | Configure a private tool catalog with Azure API Center | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/private-tool-catalog |
 | Configure structured inputs for Foundry agents | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/structured-inputs |
 | Configure custom MCP-based code interpreter runtime | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/custom-code-interpreter |
 | Configure file search tool for Foundry agents | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/file-search |
+| Define and manage SKILL.md skills for Foundry agents | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/skills |
+| Configure intent-based toolboxes for hosted agents | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/toolbox |
 | Configure and use web search tool in Foundry | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/web-search |
 | Configure Foundry Agent Service to use existing Azure resources | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/use-your-own-resources |
 | Configure declarative agent workflows in VS Code | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/vs-code-agents-workflow-low-code |
-| Reference for all Foundry built-in evaluators | https://learn.microsoft.com/en-us/azure/foundry/concepts/built-in-evaluators |
 | Configure AI Gateway token controls in Foundry | https://learn.microsoft.com/en-us/azure/foundry/configuration/enable-ai-api-management-gateway-portal |
 | Register custom agents with Foundry Control Plane | https://learn.microsoft.com/en-us/azure/foundry/control-plane/register-custom-agent |
 | Configure synthetic data generation in Foundry | https://learn.microsoft.com/en-us/azure/foundry/fine-tuning/data-generation |
 | Use Foundry Models endpoints and authentication correctly | https://learn.microsoft.com/en-us/azure/foundry/foundry-models/concepts/endpoints |
+| Configure Claude Desktop to use Microsoft Foundry | https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/configure-claude-desktop |
 | Generate text with Foundry Models using the Responses API | https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/generate-responses |
 | Configure Azure Monitor for Foundry model deployments | https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/monitor-models |
 | Configure guardrails and controls in Microsoft Foundry | https://learn.microsoft.com/en-us/azure/foundry/guardrails/how-to-create-guardrails |
@@ -174,12 +186,11 @@ This skill requires **network access** to fetch documentation content:
 | Configure tracing for Microsoft Foundry AI agents | https://learn.microsoft.com/en-us/azure/foundry/observability/how-to/trace-agent-setup |
 | Use Azure OpenAI v1 API in Foundry Models | https://learn.microsoft.com/en-us/azure/foundry/openai/api-version-lifecycle |
 | Configure Prompt Shields for Foundry model security | https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/content-filter-prompt-shields |
+| Configure priority processing for Microsoft Foundry model deployments | https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/priority-processing |
 | Call chat completion models with Azure OpenAI in Foundry | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/chatgpt |
-| Generate and edit images with Azure OpenAI image models | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/dall-e |
 | Run deep research with o3-deep-research via Responses API | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/deep-research |
 | Generate and use embeddings with Azure OpenAI in Foundry | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/embeddings |
 | Configure DPO fine-tuning for Azure OpenAI | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/fine-tuning-direct-preference-optimization |
-| Configure and use function calling with Azure OpenAI | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/function-calling |
 | Call vision-enabled chat models with Azure OpenAI in Foundry | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/gpt-with-vision |
 | Enable and tune JSON mode for Azure OpenAI chat completions | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/json-mode |
 | Use predicted outputs to reduce Azure OpenAI latency | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/predicted-outputs |
@@ -194,9 +205,8 @@ This skill requires **network access** to fetch documentation content:
 | Topic | URL |
 |-------|-----|
 | Use agents, conversations, and responses in Foundry | https://learn.microsoft.com/en-us/azure/foundry/agents/concepts/runtime-components |
-| Integrate enterprise AI gateways with Foundry Agent Service | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/ai-gateway |
+| Integrate BYO model gateways with Foundry Agent Service | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/ai-gateway |
 | Connect Foundry agents to Foundry IQ knowledge bases | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/foundry-iq-connect |
-| Invoke Foundry Agent Applications via Responses API protocol | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/publish-responses |
 | Connect Foundry agents to external A2A endpoints | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/agent-to-agent |
 | Connect Azure AI Search indexes to agents | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/ai-search |
 | Integrate Azure Speech MCP tool with Foundry agents | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/azure-ai-speech |
@@ -206,20 +216,18 @@ This skill requires **network access** to fetch documentation content:
 | Connect Microsoft Fabric data agent to Foundry | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/fabric |
 | Use function calling tools with Foundry agents | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/function-calling |
 | Use image generation tool in Foundry Agent Service | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/image-generation |
-| Connect Foundry agents to MCP servers | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/model-context-protocol |
+| Integrate Foundry agents with MCP server endpoints | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/model-context-protocol |
 | Connect OpenAPI tools to Microsoft Foundry agents | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/openapi |
 | Integrate Foundry agents with SharePoint content | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/tools/sharepoint |
 | Run fine-tuning jobs with azd extension | https://learn.microsoft.com/en-us/azure/foundry/fine-tuning/fine-tune-cli |
 | Configure Claude Code CLI and VS Code with Foundry | https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/configure-claude-code |
 | Deploy and call DeepSeek-R1 in Foundry Models | https://learn.microsoft.com/en-us/azure/foundry/foundry-models/tutorials/get-started-deepseek-r1 |
 | Integrate third-party safety tools with Foundry | https://learn.microsoft.com/en-us/azure/foundry/guardrails/third-party-integrations |
-| Run cloud evaluations with Microsoft Foundry SDK | https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/cloud-evaluation |
 | Integrate LangChain and LangGraph with Foundry | https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/langchain |
 | Build LangGraph agents with Foundry Agent Service | https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/langchain-agents |
 | Add Foundry long-term memory to LangChain apps | https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/langchain-memory |
 | Use LangChain models with Microsoft Foundry | https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/langchain-models |
 | Trace LangChain apps with Foundry and Azure Monitor | https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/langchain-traces |
-| Run AI Red Teaming Agent scans in cloud | https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/run-ai-red-teaming-cloud |
 | Run AI Red Teaming Agent scans locally | https://learn.microsoft.com/en-us/azure/foundry/how-to/develop/run-scans-ai-red-teaming-agent |
 | Set up an Azure Key Vault connection for Microsoft Foundry | https://learn.microsoft.com/en-us/azure/foundry/how-to/set-up-key-vault-connection |
 | Use Foundry MCP Server tools and prompts | https://learn.microsoft.com/en-us/azure/foundry/mcp/available-tools |
@@ -229,14 +237,14 @@ This skill requires **network access** to fetch documentation content:
 | Authoring operations for Foundry OpenAI REST API | https://learn.microsoft.com/en-us/azure/foundry/openai/authoring-reference-preview |
 | Use groundedness detection with Foundry OpenAI | https://learn.microsoft.com/en-us/azure/foundry/openai/concepts/content-filter-groundedness |
 | Integrate Codex CLI and VS Code with Azure OpenAI | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/codex |
+| Use Azure OpenAI image generation models in Foundry | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/dall-e |
 | Fine-tune Foundry models via SDK and REST | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/fine-tuning |
 | Fine-tune tool calling behavior in Azure OpenAI | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/fine-tuning-functions |
-| Call Foundry model router via Chat Completions API | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/model-router |
+| Implement function calling with Azure OpenAI in Foundry | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/function-calling |
 | Implement GPT Realtime audio with Azure OpenAI | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/realtime-audio |
 | Use GPT Realtime API over SIP | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/realtime-audio-sip |
 | Use GPT Realtime API over WebRTC | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/realtime-audio-webrtc |
 | Use GPT Realtime API over WebSockets | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/realtime-audio-websockets |
-| Call Azure OpenAI reasoning models via Foundry | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/reasoning |
 | Use web_search tool with Azure OpenAI Responses API | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/web-search |
 | Set up and secure Azure OpenAI webhooks in Foundry | https://learn.microsoft.com/en-us/azure/foundry/openai/how-to/webhooks |
 | Integrate with Azure OpenAI v1 REST API in Foundry | https://learn.microsoft.com/en-us/azure/foundry/openai/latest |
@@ -258,13 +266,8 @@ This skill requires **network access** to fetch documentation content:
 ### Deployment
 | Topic | URL |
 |-------|-----|
-| Set up infrastructure for Foundry Agent Service | https://learn.microsoft.com/en-us/azure/foundry/agents/environment-setup |
-| Deploy Foundry agents as digital workers in Agent 365 | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/agent-365 |
-| Deploy custom hosted agents to Foundry Agent Service | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/deploy-hosted-agent |
-| Manage lifecycle of hosted agent deployments | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/manage-hosted-agent |
-| Publish Foundry agents as managed Azure resources | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/publish-agent |
-| Publish Foundry agents to Microsoft 365 Copilot and Teams | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/publish-copilot |
 | Create and deploy hosted Foundry agent workflows in VS Code | https://learn.microsoft.com/en-us/azure/foundry/agents/how-to/vs-code-agents-workflow-pro-code |
+| Deploy a containerized hosted agent to Foundry | https://learn.microsoft.com/en-us/azure/foundry/agents/quickstarts/quickstart-hosted-agent |
 | Deploy Foundry Models using Azure CLI and Bicep templates | https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/create-model-deployments |
 | Deploy Foundry Models via Foundry portal for inference | https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/deploy-foundry-models |
 | Run Foundry evaluations in Azure DevOps pipelines | https://learn.microsoft.com/en-us/azure/foundry/how-to/evaluation-azure-devops |